A WIRED analysis found 16 of the biggest so-called undress and "nudify" websites using the sign-in systems from Google, Apple, Discord, X, Patreon, and Line. This approach lets people quickly create accounts on the deepfake websites, giving the sites a patina of credibility, before they pay for credits and generate images.
Bots and websites that create nonconsensual intimate images of women and girls have existed for years, but their number has grown with the development of generative AI. This kind of "undress" abuse is alarmingly widespread, with teenage boys reportedly creating images of their classmates. Critics say tech companies have been slow to address the scale of the problem, with the websites appearing prominently in search results, paid ads promoting them on social media, and apps showing up in app stores.
"This is a continuation of a trend that normalizes sexual violence against women and girls by Big Tech," says Adam Dodge, a lawyer and founder of EndTAB (Ending Technology-Enabled Abuse). "Sign-in APIs are tools of convenience. We should never be making sexual violence an act of convenience," he says. Instead of building barriers around access to these apps, he argues, the companies are providing a drawbridge.
The sign-in tools reviewed by WIRED, which are deployed via APIs and common authentication methods, let people use existing accounts to sign up for the deepfake websites. Google's login system appeared on 16 websites, Discord's on 13, and Apple's on six. X's button was on three websites, and Patreon's and messaging service Line's each appeared on the same two websites.
WIRED is not naming the websites, since they enable abuse. Several are part of wider networks and owned by the same people or companies. The login systems have been used despite the tech companies broadly having rules that bar developers from using their services in ways that would cause harm, harass people, or invade their privacy.
After being contacted by WIRED, spokespeople for Discord and Apple said they had removed the developer accounts connected to the websites. Google said it will take action against developers when it finds its terms have been violated. Patreon said it prohibits accounts that allow explicit imagery to be created, and Line confirmed it is investigating but said it could not comment on specific websites. X did not respond to a request for comment about the way its systems are being used.
Shortly after Jud Hoffman, Discord's vice president of trust and safety, told WIRED it had suspended the websites' access to its APIs for breaking its developer policy, one of the undress websites posted in a Telegram channel that authorization was "temporarily unavailable" and claimed it was trying to restore access. That undress service did not respond to WIRED's request for comment about its operations.
Rapid Expansion
The number of nonconsensual intimate videos and images being produced has grown exponentially since deepfake technology first emerged toward the end of 2017. While videos are harder to produce, the creation of images using "undress" or "nudify" websites and apps has become commonplace.
"We have to be very clear that this is not innovation, this is sexual abuse," says David Chiu, San Francisco's city attorney, who recently filed a lawsuit against undress and nudify websites and their creators. According to Chiu, the 16 websites his lawsuit focuses on received about 200 million visits in the first six months of this year alone. "These websites are engaged in horrific exploitation of women and girls around the globe. These images are used to bully, humiliate, and threaten women and girls," Chiu alleges.
The websites are constantly evolving and frequently post about new features they are building, with one claiming its AI can customize how women's bodies look and allow "uploads from Instagram." Some of the websites have formed a collective to create their own cryptocurrency, which could be used to pay for images, and run affiliate schemes to encourage people to share the sites.
A person identifying themselves as Alexander August, the CEO of one of the websites, told WIRED they "understand and acknowledge the concerns regarding the potential misuse of our technology." The person claims the website has implemented various safety measures to prevent the creation of images of minors. "We are committed to taking social responsibility and are open to collaborating with official bodies to enhance transparency, safety, and reliability in our services," they wrote in an email.
The tech company logins are often presented when someone tries to sign up for a website or clicks on buttons to try to generate images. It is unclear how many people have used the login methods, and most of the websites also let people create accounts with just an email address. However, of the websites reviewed, the majority had implemented the sign-in APIs of more than one technology company, with Sign-In With Google being the most widely used. When this option is clicked, prompts from the Google system say the website will get people's name, email address, language preferences, and profile picture.
Google's sign-in system also reveals some information about the developer accounts connected to a website. For example, four of the websites are linked to one Gmail account, and another six are linked to a second. "In order to use Sign in with Google, developers must agree to our Terms of Service, which prohibits the promotion of sexually explicit content as well as behavior or content that defames or harasses others," says a Google spokesperson, adding that "appropriate action" will be taken if those terms are broken.
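For context, the data sharing the Google prompt describes corresponds to the standard OpenID Connect scopes a site requests when it adds Sign-In With Google. Below is a minimal sketch, with a hypothetical client ID and redirect URI, of the authorization URL a website constructs; the `openid email profile` scopes are what surface the name, email address, language preference, and profile picture mentioned above.

```python
from urllib.parse import urlencode

# Google's OAuth 2.0 / OpenID Connect authorization endpoint.
AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"

def google_signin_url(client_id: str, redirect_uri: str, state: str) -> str:
    """Build the URL a site redirects users to for Sign-In With Google.

    The "openid email profile" scopes cause the resulting ID token to
    carry the user's name, email, locale, and profile picture claims.
    The client_id and redirect_uri arguments here are placeholders, not
    values tied to any real site.
    """
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",          # authorization-code flow
        "scope": "openid email profile",  # identity data shared with the site
        "state": state,                   # anti-CSRF token chosen by the site
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

url = google_signin_url("example-client-id", "https://example.com/callback", "xyz123")
print(url)
```

Because the same developer account (and its OAuth client IDs) can back multiple websites, this registration data is also what lets an observer link several sites to a single Gmail account, as described above.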
Other tech companies whose sign-in systems were being used also said they had banned accounts after being contacted by WIRED.
Hoffman from Discord said the company removed access for the websites WIRED flagged. Apple spokesperson Shane Bauer says it has terminated multiple developers' licenses with Apple, and that Sign In With Apple will no longer work on their websites. Patreon prohibits accounts that allow or fund access to external tools that can produce adult material or explicit imagery, says Adiya Taylor, corporate communications lead at Patreon, adding that it will take action on any works or accounts it discovers to be in violation of its community guidelines.
Many of the websites displayed the logos of Mastercard and Visa, suggesting they can be used to pay for the sites' services. A Mastercard spokesperson says "purchases of nonconsensual deepfake content are not allowed on our network" and that it takes action when it detects or is made aware of any instances. Visa did not respond to WIRED's request for comment.
On numerous occasions, tech companies and payment providers have taken action against AI services that let people create nonconsensual images or videos only after their activities were covered in media reports, rather than proactively, notes Clare McGlynn, a professor of law at Durham University with expertise in the legal regulation of pornography and sexual violence and abuse online.
What is worrying about the sign-in systems, McGlynn says, is that they are the most basic of security and moderation measures, and even these are missing or unenforced. "It is evident that they simply do not care, despite their rhetoric," McGlynn says. Otherwise, she adds, they would have taken these simplest of measures to limit access.