A creature slips and tumbles down a sewer in a bustling New York. As a storm of toxic filth envelops him, his body begins to transform: limbs mutating, rows of terrible teeth emerging, his coiled, snakelike form slithering ominously across the screen.
A robotic voice sings, "Beware the creature in the night, a spirit of darkness with no end in sight," as the terrifying creature creeps up behind a screaming child before crunching it between its teeth.
It's a different story, however, when you click through to the video's creator. "Welcome to Go Cat, a fun and exciting YouTube channel for kids!" reads the channel's description, which lists 24,500 subscribers and more than 7 million views. "Every episode is filled with imagination, vibrant animation, and an amazing tale of transformation about to unfold. Whether it's a funny mishap or a creepy twist, each video brings a fresh new tale of transformation for children to enjoy!"
Go Cat's ostensibly kid-friendly content is graphic, surreal, bordering on body horror. It bears an eerie resemblance to what became known as Elsagate in 2017, when hundreds of thousands of videos appeared showing children's characters like Elsa from Frozen, Spider-Man, and Peppa Pig in dangerous, sexualized, and violent situations. By gaming the platform's systems, these videos managed to appear on YouTube's dedicated Kids app, preying on children's curiosity to rack up thousands of clicks for ad money. In its efforts to fix the issue, YouTube removed ads from over 2 million videos, deleted more than 150,000, and terminated 270 accounts. Although subsequent investigations by WIRED found that similar channels, some of which featured sexual and lurid depictions of Minecraft avatars, continued to appear on YouTube's Topics page, Elsagate's reach had been considerably diminished.
Then came AI. Creating these bizarre and macabre videos has become not just easy but lucrative, thanks to the ease of entering (and circumventing) generative AI prompts and a rise in tutorials on how to monetize children's content. Go Cat was one of many channels that surfaced when WIRED searched for terms as innocuous as "minions," "Thomas the Tank Engine," and "cute cats." Some trade in Elsagate staples like sexualized, lingerie-clad versions of Elsa and Anna, but superheroes are another big hitter, as are animated animals and kittens.
In response to WIRED's request for comment, YouTube says it "terminated two flagged channels for violating our Terms of Service" and has suspended monetization on three other channels.
"A number of videos have also been removed for violating our Child Safety policy," a YouTube spokesperson says. "As always, all content uploaded to YouTube is subject to our Community Guidelines and quality principles for kids, regardless of how it's generated."
Asked what measures are in place to stop banned users from simply creating new channels, YouTube said that doing so would violate its Terms of Service and that these rules are strictly enforced "using a combination of both people and technology."
WIRED can confirm that some of the flagged channels, including two cat-centric channels with abuse themes, were indeed taken down last week. But other linked channels reposting the same videos remain on the platform, and Go Cat is still live, its channel description untouched.
Unable to find an email address linked to Go Cat, WIRED reached out to other channels for comment but did not receive a response.
Of this second-wave Elsagate, the explosion of AI-animated cat videos outstrips every other genre in both scale and content. These videos frequently take the form of fables, in which kittens are starved, forced to do unpleasant chores, and audibly beaten by their parents with baseball bats or frying pans, under titles like "Kitten abused by its own mother." The kittens are then taken to the hospital and revived, before the parent arrives, apologetic for their actions, as melancholic music or a meowing cover of Billie Eilish's "What Was I Made For?" plays in the background. With near-identical channel names like "Cute cat AI" and "Cute cat of Ni," experts say these videos are a clear attempt to mislead young audiences, and an equally obvious attempt to lazily and sloppily monetize cheap content at a scale only generative AI makes possible.
"We are deeply concerned about the proliferation of AI-generated content that appears to be targeted at children and contains deeply inappropriate material," says Robbie Torney, senior director of AI programs at Common Sense Media. The nonprofit, which rates and reviews media to provide accurate recommendations for families, was shown several of the channels discovered during this investigation. It found common themes across the videos: "characters in extreme distress or peril," "mutilation, medical procedures, and cruel experiments," and "depictions of child abuse and torture."
These channels now typically appear on YouTube's main app rather than on YouTube Kids, likely a result of the stricter rules YouTube introduced in 2019 to comply with the US Children's Online Privacy Protection Act, but their intended audience is only thinly disguised. Sounds of babies laughing and babbling are mixed into the music and set against bright, Cocomelon-esque backdrops; the well-known children's cartoon even appears in the background of some videos. While Go Cat directly markets its content to children, others make no claims about their audience at all, or state that they are "not for kids" in the description. Metadata for several channels shows videos tagged with keywords such as #funnycat, #familyfun, and #disneyanimatedmovies. The hashtag #animalrescue, meanwhile, appears alongside more educational content as well as on videos of polar bears and reindeer infested with parasites.
In 2017, Elsagate content typically featured traditional animation or even costumed actors (both of which still appear in this new wave), but generative AI now allows disturbing, brain-rot-style videos to be produced far more quickly and by anyone, regardless of skill.
"This trend is particularly concerning because of the scale and speed at which AI can generate this content," Torney says. "Unlike traditional content creation, AI-generated videos can be produced in large volumes with little supervision." Without human oversight in the creation process, inappropriate and potentially harmful material can easily reach children. The comparative speed of AI also means that when one channel is flagged and removed by YouTube, another with identical reposts springs up days later.
Tracy Pizzo Frey, senior AI adviser for Common Sense Media, recently testified at a California State Assembly hearing in support of a bill that aims to protect children from the risks of AI. Among other measures, the bill would require AI systems to be classified on a scale from "Prohibited Risk" to "Low Risk" and ban children from using controversial AI companions such as Replika. As AI-generated kids' content continues to outnumber its traditionally animated counterparts, the scale of this issue is only likely to grow.
WIRED has shared with YouTube more than 70 similar content-farm channels discovered during this investigation. Most involve AI-generated images of cats alongside themes of gore, sex, and child abuse, and their subscriber counts range from the thousands to the millions. Whether the views on these videos come primarily from humans or are simply further confirmation of the dead internet theory is debatable, though the hundreds of automated comments beneath them suggest the latter.
After reviewing the channels, YouTube said that it requires all creators to label AI-generated content as such, including content aimed at kids and families, and that it has established standards for what it calls quality content.
"We want younger viewers to not just have a safer experience but also an enriching one," a YouTube spokesperson says. "To help guide creators in creating quality kids and family content, and to reduce the amount of content that is low quality, regardless of how it was created, we partnered with experts to create a set of quality principles for kids and family content." YouTube claims that since implementing these principles, which dictate how content is monetized, ranked in recommendations, and displayed on YouTube Kids, views of "high quality" content have increased by 45 percent on the YouTube Kids app.
Still, regardless of their audience, and as YouTube's moderators scramble to remove them, Elsagate's successors remain on YouTube's main platform, finding new ways to bend the rules at every turn. Nor is the issue unique to YouTube. According to 404 Media, similar videos have recently been posted to TikTok, where Runway's AI video generator has been used to overlay Minions onto real footage of mass shootings and suicides, creating "minion gore" videos. TikTok told 404 Media that posting offensive content, as well as gory, gruesome, disturbing, or extremely violent content, is against its guidelines, and that it is taking measures against harmful AI-generated content that violates them.
"We recognize that short-form video platforms are working to address content moderation challenges, but the nature of AI-generated videos presents unique difficulties that may require new solutions," Torney tells WIRED.
"The rapid evolution of AI technology requires that all stakeholders work together to make sure that kids' exposure to online video content is safe and positive," he adds.