Popular search engines like Google and Bing are making it easy to surface nonconsensual deepfake pornography by placing it at the top of search results, NBC News reported Thursday.
These controversial deepfakes superimpose faces of real women, often celebrities, onto the bodies of adult entertainers to make them appear to be engaging in real sex. Thanks in part to advances in generative AI, there is now a burgeoning black market for deepfake porn that could be discovered through a Google search, NBC News previously reported.
NBC News uncovered the problem by turning off safe search, then combining the names of 36 female celebrities with obvious search terms like “deepfakes,” “deepfake porn,” and “fake nudes.” Bing generated links to deepfake videos in top results 35 times, while Google did so 34 times. Bing also surfaced “fake nude photos of former teen Disney Channel female actors” using images in which the actors appear to be underage.