When Gabi Belle learned there was a naked photo of her circulating on the internet, her body turned cold. The YouTube influencer had never posed for the image, which showed her standing in a field without clothes. She knew it must be fake.
“I felt yucky and violated,” Belle said in an interview. “Those private parts are not meant for the world to see because I have not consented to that. So it’s really strange that someone would make images of me.”
Artificial intelligence is fueling an unprecedented boom this year in fake pornographic images and videos. It’s enabled by a rise in cheap and easy-to-use AI tools that can “undress” people in photographs — analyzing what their naked bodies would look like and superimposing that rendering onto an image — or seamlessly swap a face into a pornographic video.
On the top 10 websites that host AI-generated porn photos, fake nudes have ballooned by more than 290 percent since 2018, according to Genevieve Oh, an industry analyst. These sites feature celebrities and political figures such as New York Rep. Alexandria Ocasio-Cortez alongside ordinary teenage girls, whose likenesses have been seized by bad actors to incite shame, extort money or live out private fantasies.
Victims have little recourse. There’s no federal law governing deepfake porn, and only a handful of states have enacted regulations. President Biden’s AI executive order issued Monday recommends, but does not require, that companies label AI-generated photos, videos and audio to indicate computer-generated work.
Meanwhile, legal scholars warn that AI fake images may not fall under copyright protections for personal likenesses, because they draw from data sets populated by millions of images. “This is clearly a very serious problem,” said Tiffany Li, a law professor at the University of San Francisco.
The advent of AI images comes at a particular risk for women and teens, many of whom aren’t prepared for such visibility. A 2019 study by Sensity AI, a company that monitors deepfakes, found 96 percent of deepfake images are pornography, and 99 percent of those photos target women.
“It’s now very much targeting girls,” said Sophie Maddocks, a researcher and digital rights advocate at the University of Pennsylvania. “Young girls and women who aren’t in the public eye.”
‘Look, Mom. What have they done to me?’
On Sept. 17, Miriam Al Adib Mendiri was returning to her home in southern Spain from a trip when she found her 14-year-old daughter distraught. Her daughter showed her a nude picture of herself.
“Look, Mom. What have they done to me?” Al Adib Mendiri recalled her daughter saying.
She’d never posed nude. But a group of local boys had grabbed clothed photos from the social media profiles of several girls in their town and used an AI “nudifier” app to create the naked pictures, according to police.
The application is one of many AI tools that use real images to create naked photos, which have flooded the web in recent months. By analyzing millions of images, AI software can better predict how a body will look naked and fluidly overlay a face onto a pornographic video, said Gang Wang, an expert in AI at the University of Illinois at Urbana-Champaign.
Though many AI image-generators block users from creating pornographic material, open source software, such as Stable Diffusion, makes its code public, letting amateur developers adapt the technology — often for nefarious purposes. (Stability AI, the maker of Stable Diffusion, did not return a request for comment.)
Once these apps are public, many use referral programs that encourage users to share AI-generated photos on social media in exchange for cash, Oh said.
When Oh examined the top 10 websites that host fake porn images, she found more than 415,000 such images had been uploaded this year, garnering nearly 90 million views.
AI-generated porn videos have also exploded across the web. After scouring the 40 most popular websites for faked videos, Oh found more than 143,000 videos had been added in 2023 — a figure that surpasses all new videos from 2016 to 2022. The fake videos have received more than 4.2 billion views, Oh found.
The Federal Bureau of Investigation warned in June of an uptick in sexual extortion by scammers demanding payment or photos in exchange for not distributing sexual images. While it’s unclear what percentage of these images are AI-generated, the practice is expanding. As of September, over 26,800 people had been victims of “sextortion” campaigns, a 149 percent rise from 2019, the FBI told The Post.
‘You’re not safe as a woman’
In May, a poster on a popular pornography forum started a thread called “I can fake your crush.” The idea was simple: “Send me whoever you want to see nude and I can fake them” using AI, the moderator wrote.
Within hours, photos of women came flooding in. “Can u do this girl? not a celeb or influencer,” one poster asked. “My co-worker and my neighbor?” another one added.
Minutes after a request, a naked version of the image would appear on the thread. “Thkx a lot bro, it’s perfect,” one user wrote.
Celebrities are a popular target for fake porn creators aiming to capitalize on search interest in nude photos of famous actors. But websites featuring famous people can lead to a surge in other kinds of nudes. The sites often include “amateur” content from unknown individuals and host ads that market AI porn-making tools.
Google has policies in place to prevent nonconsensual sexual images from appearing in search results, but its protections for deepfake images are not as robust. Deepfake porn and the tools to make it show up prominently on the company’s search engine, even without specifically searching for AI-generated content. Oh documented more than a dozen examples in screenshots, which were independently confirmed by The Post.
Ned Adriance, a spokesman for Google, said in a statement the company is “actively working to bring more protections to search” and that the company lets users request the removal of involuntary fake porn.
Google is in the process of “building more expansive safeguards” that would not require victims to individually request that content be taken down, he said.
Li, of the University of San Francisco, said it can be hard to penalize creators of this content. Section 230 of the Communications Decency Act shields social media companies from liability for the content posted on their sites, leaving little burden for websites to police images.
Victims can request that companies remove photos and videos of their likeness. But because AI draws from a plethora of images in a data set to create a faked photo, it’s harder for a victim to claim the content is derived solely from their likeness, Li said.
“Maybe you can still say: ‘It’s a copyright violation, it’s clear they took my original copyrighted photo and then just added a little bit to it,’” Li said. “But for deepfakes … it’s not that clear … what the original photos were.”
In the absence of federal laws, at least nine states — including California, Texas and Virginia — have passed legislation targeting deepfakes. But these laws vary in scope: In some states, victims can press criminal charges, while others allow only civil lawsuits — though it can be difficult to ascertain whom to sue.
The push to regulate AI-generated images and videos is often intended to prevent mass distribution, addressing concerns about election interference, said Sam Gregory, executive director of the tech human rights advocacy organization Witness.
But these rules do little for deepfake porn, where images shared in small groups can wreak havoc on a person’s life, Gregory added.
Belle, the YouTube influencer, is still unsure how many deepfake photos of her are public and said stronger rules are needed to address her experience.
“You’re not safe as a woman,” she said.