A cat slips and tumbles down a sewer in a bustling New York. As a torrent of radioactive sludge envelops him, his body begins to transform: limbs mutate, rows of terrible teeth emerge, and his coiled, snakelike form slithers ominously across the screen.
A robotic-sounding voice sings, "Beware the creature in the night, a spirit of darkness, no end in sight," as the terrifying creature sneaks up behind a screaming child before brutally crushing it between its teeth.
Click through to the video's creator, though, and it's a different story. "Welcome to Go Cat, a fun and exciting YouTube channel for kids!" The channel's about page lists 24,500 subscribers and more than 7 million views. "Every episode is filled with imagination, vibrant animation, and an amazing tale of transformation that is about to unfold. Whether it's a funny mishap or a creepy twist, each video brings a fresh new tale of transformation for children to enjoy!"
Go Cat's ostensibly kid-friendly content is dark, surreal, and bordering on body horror. Its videos are eerily similar to what became known as Elsagate in 2017, when hundreds of thousands of videos surfaced showing children's characters like Elsa from Frozen, Spider-Man, and Peppa Pig in violent, sexual, and abusive situations. By gaming the platform's systems, these videos were able to appear on YouTube's dedicated Kids app, preying on children's curiosity to rack up thousands of clicks for ad revenue. In its efforts to fix the issue, YouTube removed ads from over 2 million videos, deleted more than 150,000, and terminated 270 accounts. Elsagate's reach was considerably diminished, though subsequent investigations by WIRED found that similar channels, some of which featured sexualized and lurid depictions of Minecraft avatars, continued to appear on YouTube's Topic page.
Then came AI. Creating these absurd and grisly videos has become both easy and profitable, thanks to the ability to bypass generative AI safeguards and an influx of tutorials on how to monetize children's content. Go Cat is just one of many channels that came up when WIRED searched terms like "minions," "Thomas the Tank Engine," and "cute cats." Some feature Elsagate staples like sexualized, lingerie-clad versions of Elsa and Anna, but Minions are another big hitter, as are animated animals and kittens.
In response to WIRED's request for comment, YouTube says it "terminated two flagged channels for violating our Terms of Service" and has suspended monetization on three other channels.
"A number of videos have also been removed for violating our child safety policies," a YouTube spokesperson says. "As always, all content uploaded to YouTube is subject to our Community Guidelines and quality principles for kids, regardless of how it's generated."
When asked what policies are in place to stop banned users from simply creating new channels, YouTube said that doing so violates its Terms of Service and that these rules are strictly enforced "using a combination of both people and technology."
WIRED can confirm that some of the flagged channels, including two cat-centric channels with abuse themes, were indeed shut down last month. But other linked channels with reposts of the same videos remain on the platform. Go Cat remains live, its channel description untouched.
WIRED reached out to several of these channels for comment, including via an email address associated with Go Cat. We did not receive a reply.
In this next wave of Elsagate, the flood of AI-animated cat videos outstrips every other type in both scale and content. These videos frequently take the form of fables, with titles like "Kitten abused by its own mother," in which animals are starved, forced to do unpleasant chores, and violently beaten by their families with baseball bats or frying pans. The animals are then taken to the hospital and revived, before the parent arrives, apologetic for their actions, as melancholic music or a meowing cover of Billie Eilish's "What Was I Made For?" plays in the background. With near-identical channel names like "Cute cat AI" and "Cute cat of Ni," experts say, they are a clear attempt to confuse young audiences, and an equally obvious attempt to lazily monetize cheap, sloppily made content at a scale only generative AI allows.
Common Sense Media is "deeply concerned about the proliferation of AI-generated content that appears to be directed at children and contains utterly inappropriate material," says Robbie Torney, senior director of AI programs at the nonprofit, which rates and reviews media to provide accurate recommendations for families. Shown several such channels discovered during this investigation, the organization found common themes of "characters in extreme distress or peril," "mutilation, medical procedures, and cruel experiments," and "depictions of child abuse and torture."
These channels now typically appear on YouTube's main app rather than on YouTube Kids, likely a result of the stricter rules YouTube introduced in 2019 to comply with the US Children's Online Privacy Protection Act, but their ulterior motives are only thinly masked. Sounds of babies' laughter and babbling are mixed in with the music, set against bright, Cocomelon-esque backdrops; the well-known kids' cartoon itself even appears in the background of some videos. While Go Cat markets its content directly to children, other channels claim their videos are not aimed at kids at all, or state "not for kids" in the description. Yet the metadata for several channels shows videos tagged with keywords such as #funnycat, #familyfun, and #disneyanimatedmovies. The hashtag #animalrescue appears alongside ostensibly more educational content, tagged to images of polar bears and reindeer infested with parasites.
In 2017, Elsagate content typically featured traditional animation or even costumed actors (both of which still appear in this new wave), but the development of generative AI means anyone can now create disturbing, brain-rot-style videos far more quickly and with no prior training.
"This trend is particularly concerning because of the scale and speed at which AI can generate this content," Torney says. Unlike traditional content creation, AI-generated videos can be produced in large volumes with little human oversight, making it easy for inappropriate and potentially harmful material to reach children. The speed of AI also means that when one channel is flagged and removed by YouTube, another with identical reposts springs up days later.
At a recent hearing in the California State Assembly, Tracy Pizzo Frey, senior AI adviser for Common Sense Media, testified in favor of a bill that seeks to protect children from AI risks. Among other measures, it would require AI systems to be classified on a scale from "Prohibited Risk" to "Low Risk" and ban children from using controversial AI companions such as Replika. As AI-generated kids' content continues to outnumber its traditionally animated counterparts, the scale of this issue is only likely to grow.
WIRED has shared with YouTube more than 70 similar content-farm channels discovered during this investigation. Most involve AI-generated images of cats alongside themes of gore, sex, and child abuse, and their subscriber counts range from the thousands to the millions. Whether those views come primarily from humans or are further confirmation of the dead internet theory is debatable, though the hundreds of automated comments across these videos suggest it may be the latter.
When reviewing the channels, YouTube said that it requires all creators to label AI-generated content as such, including content aimed at children and families, and that it has established standards for what it calls quality content.
"We want younger viewers to not just have a safer experience but also an enriching one," a YouTube spokesperson says. To help creators make quality kids and family content and to reduce the amount of low-quality content, regardless of how it is made, the company says it worked with experts to develop a set of quality principles for kids and family content. Since YouTube began applying these principles, which influence how content is monetized, ranked in recommendations, and surfaced on YouTube Kids, the company claims viewership of "high quality" content on the YouTube Kids app has increased by 45 percent.
Still, whatever their intended audience, and even as YouTube's moderators scramble to remove them, Elsagate's successors remain on YouTube's main platform, finding new ways to bend the rules at every turn. Nor is the issue unique to YouTube. According to 404 Media, similar videos have recently been posted on TikTok, where Runway's AI video generator has been used to overlay Minions onto real footage of mass shootings and suicides to create "minion gore" videos. TikTok told 404 Media that its rules prohibit hateful content as well as gory, gruesome, disturbing, or extremely violent content, and that it is taking steps to curb harmful AI-generated content that goes against its guidelines.
"We recognize that short-form video platforms are working to address content moderation challenges, but the nature of AI-generated videos presents unique difficulties that may require new solutions," Torney tells WIRED.
"The rapid evolution of AI technology requires that all stakeholders work together to make sure that kids' exposure to online video content is safe and positive," he says.