Websites that digitally remove people’s clothes have grown more popular as AI-powered image generators have improved. On one of these sites, two feeds of what appear to be images uploaded by people who want them “nudified” offer a disturbing glimpse into how these tools are used.
The feeds reveal the site’s intended victims. WIRED saw images of women, some of whom were clearly children. Other images showed adults, with captions indicating they were female friends or strangers. The site does not display any fake nude images that may have been generated to visitors who aren’t logged in.
Users who want to generate and save deepfake nude images are asked to log in to the site using a cryptocurrency wallet. Pricing isn’t currently listed, but in a 2022 video posted by an affiliated YouTube channel, the site offered credits for creating deepfake nude images starting at $5. WIRED learned of the website from a Reddit post about the NFT marketplace OpenSea that linked to the YouTube page. After WIRED contacted YouTube, the platform said it terminated the channel; Reddit told WIRED that the user had been banned.
WIRED is not identifying the website, which is still live, to protect the women and girls who remain on its feeds. The site’s IP address, which went live in February 2022, belongs to the internet security and infrastructure company Cloudflare. When asked about its involvement, company spokesperson Jackie Dutton noted the difference between providing a site’s IP address, as Cloudflare does, and hosting its content, which it does not.
WIRED notified the National Center for Missing & Exploited Children, which helps report cases of child exploitation to law enforcement, about the site’s existence.
AI developers like OpenAI and Stability AI say their image generators are intended for creative and commercial uses and have guardrails to prevent harmful content. But open source AI image-making technology is now fairly powerful, and creating pornography is one of its most popular use cases. As image generation has become easier and more accessible, the problem of nonconsensual nude deepfake images, which most often target women, has grown more widespread and severe. In what appears to be the first case of its kind, two Florida teenagers were arrested earlier this month for allegedly creating and sharing AI-generated nude images of their middle school classmates without consent.
The deepnude site reflects a grim reality, says Mary Anne Franks, a professor at George Washington University Law School who has studied the problem of nonconsensual explicit imagery: There are far more incidents involving AI-generated nude images of women and minors created without consent than the public is aware of. The few cases that have become public came to light only because the images were shared within a community and someone heard about them and raised the alarm.
There will be “all kinds of sites like this that are impossible to chase down,” Franks says, and most victims won’t realize what has happened until someone flags it for them.
Nonconsensual Images
Two separate pages of the website, which WIRED reviewed, feature feeds of images that appear to have been uploaded by users. One is labeled “Home” and the other “Explore.” Some of the pictures clearly showed girls under the age of 18.
One image showed a young woman with a flower in her hair posing in front of a tree. Another shows a girl in what appears to be a middle or high school classroom. The image, seemingly taken discreetly by a classmate, is captioned “PORN.”
One of the images on the site showed a boy taking a picture with two girls, who both smile and pose for the shot, in what appears to be a school. The boy’s face was entirely obscured by a Snapchat lens that had been enlarged to the point where it covered his features.
Captions on the apparently uploaded images indicate they show friends, classmates, and romantic partners. “My gf,” reads one caption accompanying a photo of a young woman taking a selfie in a mirror.
Many of the photos showed influencers who are popular on TikTok, Instagram, and other social media platforms. Other images appeared to be screenshots of Instagram posts in which people shared photos from their daily lives. One showed a young woman smiling while eating a dessert topped with a candle.
A number of images appeared to show people who were complete strangers to the person taking the photo. One image, taken from behind, showed a woman or girl who was not posing for a photo but simply standing near what appears to be a tourist attraction.
Some of the images in the feeds WIRED viewed were cropped to remove the faces of the women and girls, showing only their chests or crotches.
Huge Audience
Over the course of eight days of monitoring the site, WIRED saw five new images of women posted to the Home feed and three to the Explore page. According to stats listed on the site, most of these images accumulated hundreds of “views.” It’s unclear whether all images submitted to the site make it to the Home or Explore feed, or how views are tallied. Every post on the Home feed has at least a few thousand views.
Photos of celebrities and people with large Instagram followings top the website’s list of “Most Viewed” images. The most-viewed people of all time on the site are actor Jenna Ortega, with more than 66,000 views; singer-songwriter Taylor Swift, with more than 27,000 views; and an influencer and DJ from Malaysia, with more than 26,000 views.
Swift and Ortega have both been targeted with deepfake nudes before. The circulation of fake nude images of Swift on X in January sparked renewed debate about the harms of deepfakes and the need for stronger legal protections for victims. This month, NBC reported that Meta had hosted ads for a deepnude app for seven months. The ads touted the app’s ability to “undress” people, using a picture of Jenna Ortega taken when she was 16 years old.
In the US, no federal law targets the distribution of fake, nonconsensual nude images, though a handful of states have passed their own laws. But AI-generated nude images of minors fall into the same category as other child sexual abuse material, or CSAM, says Jennifer Newman, executive director of the NCMEC’s Exploited Children’s Division.
“If it is indistinguishable from an image of a live victim, of a real child, then that is child sexual abuse material,” Newman says. “And we will treat it as such as we’re processing our reports, as we’re getting these reports out to law enforcement.”
In 2023, Newman says, NCMEC received about 4,700 reports that “somehow connect to generative AI technology.”
“Pathetic Bros”
Users who want to create and save deepfake nude images on the website are required to log in using a Coinbase, Metamask, or WalletConnect wallet. Coinbase spokesperson McKenna Otterstedt said the company was launching an internal investigation into the site’s connection to Coinbase. WalletConnect and ConsenSys, which owns Metamask, did not respond to requests for comment.
In November 2022, the deepnude site’s YouTube channel posted a video claiming users could “buy credit” with Visa or Mastercard. Neither payment processor responded to WIRED’s requests for comment.
In 2022, the site listed 30 NFTs on OpenSea, the NFT marketplace, featuring unedited (not deepfaked) pictures of different Instagram and TikTok influencers, all of them women. After buying an NFT with the ether cryptocurrency ($280 worth at today’s exchange rate), owners would get access to the website, which, according to a web archive, was in its early stages at the time. “Privacy is the ultimate priority” for its users, the NFT listings said.
The NFTs were categorized with tags describing the women’s perceived features. The categories included Boob Size, Country (with most of the women listed as being from Malaysia or Taiwan), and Traits, with tags including “cute,” “innocent,” and “motherly.”
None of the NFTs listed by the account ever sold. OpenSea deleted the listings and the account within 90 minutes of WIRED contacting the company. The women depicted in the NFTs did not respond to requests for comment.
It’s unclear who, or how many people, created or own the deepnude website. The now-deactivated OpenSea account had a profile image that looked exactly like the third Google Image result for “nerd.” According to the account bio, the creator’s guiding principle is to “reveal the shitty thing in this world” and then distribute it to” all douche and pathetic bros.”
An X account linked from the OpenSea account used the same bio, along with a link to a now-dead blog about “Whitehat, Blackhat Hacking” and “Scamming and Money Making.” The account’s owner appears to have been one of three contributors to the blog, where he went by the moniker 69 Fucker.
On Reddit, the website was promoted by a user whose profile picture showed a man of East Asian descent who appeared to be under 50. An archive of the website from March 2022, however, claims the site “was created by 9 horny skill-full people.” Most of the profile pictures appeared to be stock photos, and the job titles were all facetious; three of them were Horny Director, Scary Stalker, and Booty Director.
Requests for comment sent to the website’s email address went unanswered.