How AI fake nudes are ruining teens’ lives

When Gabby Bell learned that a nude photo of her was circulating on the internet, her body went cold. The YouTube influencer had never posed for the image, which showed her standing in a field without clothes. She knew it had to be fake.

But when Bell, 26, messaged a colleague asking for help removing the image, he told her there were nearly 100 fake photos spread across the web, most of them on websites known for hosting AI-generated pornography. The images were taken down in July, Bell said, but new ones depicting her in graphic sexual situations have already surfaced.

“I felt sick and violated,” Bell said in an interview. “Those private parts aren’t meant to be seen by the world, because I didn’t consent to that. So it’s really strange that someone would make images of me.”

Artificial intelligence is fueling an unprecedented boom this year in fake pornographic photos and videos. It has been enabled by the emergence of cheap, easy-to-use AI tools that can “undress” people in photographs, analyzing the shape of their naked bodies and superimposing it onto an image, or seamlessly swap a person’s face into a pornographic video.

On the top 10 websites hosting AI-generated porn, the number of fake nudes has ballooned more than 290 percent since 2018, according to industry analyst Genevieve Oh. These sites feature celebrities and political figures such as Rep. Alexandria Ocasio-Cortez of New York alongside ordinary teenage girls, whose photos have been hijacked by bad actors to incite shame, extort money, or act out private fantasies.

Victims have little recourse. There is no federal law governing deepfake porn, and only a handful of states have enacted regulations. President Biden’s executive order on artificial intelligence, issued Monday, recommends, but does not require, that companies label AI-generated images, video and audio to indicate that they are computer-generated.

Meanwhile, legal scholars warn that AI-generated fake images may not fall under copyright protections for personal likenesses, because they are derived from datasets containing millions of images. “This is clearly a very serious problem,” said Tiffany Lee, a law professor at the University of San Francisco.

The emergence of AI imagery poses a particular risk to women and teenagers, many of whom are unprepared for such exposure. A 2019 study by Sensity AI, a company that monitors deepfakes, found that 96 percent of deepfake images are pornographic, and that 99 percent of those images target women.

“It’s now overwhelmingly targeting girls,” said Sophie Maddox, a researcher and digital rights advocate at the University of Pennsylvania. “Young girls and women who aren’t in the public eye.”

“Look, mom. What did they do to me?”

On September 17, Miriam Al Adib Mendiri was returning from a trip to her home in southern Spain when she found her 14-year-old daughter distraught. Her daughter showed her a nude photo of herself.

“Look, mom. What did they do to me?” Al Adib Mendiri recalled her daughter saying.

She had never posed nude. But a group of local boys had taken clothed photos from the social media profiles of several girls in their town and used an AI “nudifier” app to create the naked images, according to police.

The app is one of many AI tools that use real photos to create nude images, and which have flooded the web in recent months. By analyzing millions of images, the AI software can better predict how a body will look naked and can seamlessly overlay a face into a pornographic video, said Gang Wang, an expert in artificial intelligence at the University of Illinois at Urbana-Champaign.

Although many AI-based image generators prevent users from creating pornography, open source software, such as Stable Diffusion, makes its code public, allowing amateur developers to adapt the technology — often for nefarious purposes. (Stability AI, the maker of Stable Diffusion, did not respond to a request for comment.)

Once these apps become public, many use referral programs that encourage users to share the AI-generated images on social media in exchange for money, Oh said.

When Oh examined the top 10 websites hosting fake porn, she found that more than 415,000 images had been uploaded this year, garnering nearly 90 million views.

AI-generated porn videos have also spread across the web. After surveying the 40 most popular sites for fake videos, Oh found that more than 143,000 videos had been added in 2023, more than in all the years from 2016 through 2022 combined. Those fake videos have received more than 4.2 billion views, Oh found.

The FBI warned in June of a rise in sextortion, in which scammers demand money or photos in exchange for not distributing sexual images. While it’s unclear what share of those images are AI-generated, the practice is expanding. As of September, more than 26,800 people had been victims of “sextortion” campaigns, a 149 percent increase from 2019, the FBI told The Washington Post.

“You are not safe as a woman”

In May, a poster on a popular porn forum started a thread titled “I can fake your crush.” The idea was simple: “Send me whoever you want to see nude and I can fake it” using artificial intelligence, the poster wrote.

Within hours, photos of women poured in. “Not a celebrity or influencer,” one poster wrote. “My co-worker and my neighbor?” another asked.

Minutes after each request, a nude version of the image would appear in the thread. “Thank you so much, bro. It’s perfect,” one user wrote.

Celebrities are a common target for fake-porn creators hoping to capitalize on search interest in nude photos of famous actors. But websites featuring famous people can also lead to a surge in other types of fake nudes: the sites often include “amateur” content from unknown individuals and host ads marketing AI-powered porn-making tools.

Google has policies to prevent nonconsensual sexual images from appearing in search results, but its protections for deepfakes are not as robust. Deepfake porn and the tools to make it show up prominently in the company’s search results, even when users do not specifically search for AI-generated content. Oh documented more than a dozen examples in screenshots, which The Post independently confirmed.

Google spokesman Ned Adrians said in a statement that the company is “actively working to provide more search protections” and that users can request the removal of involuntary fake pornography.

He said Google is “building more comprehensive safeguards” that won’t require victims to individually request removal of content.

Lee, of the University of San Francisco, said it can be difficult to punish the creators of such content. Section 230 of the Communications Decency Act shields social media companies from liability for content posted on their sites, placing little burden on websites to police images.

Victims can ask companies to remove photos and videos that resemble them. But because the AI draws from a large number of images in a dataset to create a fake image, it is difficult for a victim to claim that the content is derived solely from their likeness, Lee said.

“You might still say, ‘It’s copyright infringement,’ and they obviously took my original copyrighted photo and then added a little bit more to it,” Lee said. “But for deepfakes… it’s not clear… what the original images are.”

In the absence of a federal law, at least nine states — including California, Texas and Virginia — have passed legislation targeting deepfakes. But the laws vary in scope: in some states, victims can press criminal charges, while others allow only civil lawsuits, though it can be difficult to determine whom to sue.

Sam Gregory, executive director of tech advocacy group Witness, said the push to regulate AI-generated images and videos is often aimed at preventing mass distribution and addressing concerns about election interference.

Gregory added that those rules do little to address deepfake porn, where images shared in small groups can wreak havoc on a person’s life.

Bell, the YouTube influencer, remains unsure how many fake photos of her have been made public, and said stronger rules are needed to address her experience.

“You are not safe as a woman,” she said.
