Elsagate
⚠️ Article status notice: This article has been marked as incomplete
This article needs additional work to meet the wiki's Content Guidelines and be in line with our Mission Statement for comprehensive coverage of consumer protection issues.
This notice will be removed once sufficient documentation has been added to establish the systemic nature of these issues. Once you believe the article is ready to have its notice removed, visit the Discord and post to the #appeals channel.
Elsagate refers to a type of incident in which malicious content creators are left unrestrained to publish illicit content targeted at minors. On video platforms such as YouTube and TikTok, millions of minors have been affected by the negligence of the companies that maintain them.
Background
Elsagate content is mature material covered by a thin façade of what audiences would initially expect to be child-friendly content. It takes advantage of a multitude of media, such as Flash-style animation, stop-motion or claymation, live-action footage, or, most recently, AI generation. Most videos game the recommendation algorithm with junk keywords and buzzwords in their titles, such as "education" or "nursery rhymes", and capture the attention of children or unaware parents with bright thumbnails and recognizable characters.
Version 1 (2016-2017)
Elsagate content began appearing at an unknown date; it was first widely documented by outlets such as The Guardian in 2016,[1] when the content was rather tame and dialogue-free, merely exploiting characters popular among children and owned by large companies, such as Spider-Man or Elsa. The following year, the content became far more questionable, depicting sexual behavior and scatological humor.[2][3]
By early 2017, the number of channels publishing this content had skyrocketed, and reports on the content proliferated at a similar pace. The BBC,[4] CTV News,[5] and Yahoo! Lifestyle[6] were among the many publications covering the issue, and the Reddit community r/ElsaGate formed to document the excessive publishing of this content.
Google's response
The initial response from Google's moderation teams was limited: only a few videos were demonetized or taken down. When users on the platform covered the incident, it was the YouTubers reporting the content, rather than the offending channels, who faced punishment.[7] Only after facing mass coverage from major news outlets of the mature depictions of child-favorite characters,[8] potential investigations by the FBI,[9] and threats from advertisers to pull funding did YouTube put forth measures to effectively stop the spread of this content.[10]
These measures have included the following:
- Disabling comments on videos marked 'for kids'
- Disabling "targeted" advertising on marked videos
- Loss of monetization for some marked videos
Version 2 - 'Aphmaugate' (2020-Now)
At the start of the 2020s, the content resurfaced in users' feeds, especially as the height of lockdown encouraged an increase in screen time across all demographics. Compared to the previous iteration, the content instead infringed on smaller indie intellectual properties and game franchises.[11][12] It was considerably more violent and mature in nature, depicting murder, abuse, and drug use.[12]
This period also saw content created explicitly for mature audiences being automatically marked for kids against the wishes of its creators,[13][14] which harms the monetization of their content[15][16] and contributes to the harm done to children.[17]
Version 3 - 'AIgate' (2023-Now)
This section is incomplete. This notice can be deleted once all the placeholder text has been replaced.
An offshoot of the previous version emerged with the growth of generative AI, as content could now be produced at a far faster pace than in previous incidents. This content continued the trend of disguising mature material behind seemingly kid-friendly characters.
Consumer response
The public's initial response to the first version of the incident was overwhelmingly negative, with users refusing to use YouTube Kids due to its lack of adequate parental controls over the content shown to children.[citation needed] Version 3 has brought especially significant backlash due to the explicit nature of the videos being published and shown to children using the platform.[citation needed]
References
- ↑ Dredge, Stuart (Jun 29, 2016). "YouTube's latest hit: neon superheroes, giant ducks and plenty of lycra". The Guardian. Archived from the original on Nov 23, 2017. Retrieved Jun 8, 2025.
- ↑ Popper, Ben (Feb 20, 2017). "Adults dressed as superheroes is YouTube's new, strange, and massively popular genre". The Verge. Archived from the original on Nov 13, 2017. Retrieved May 8, 2025.
- ↑ Deal, Rachel (Feb 23, 2017). "The Ballad Of Elsa And Spiderman". The Awl. Archived from the original on Dec 1, 2017. Retrieved May 8, 2025.
- ↑ Subedar, Anisa; Yates, Will (Mar 27, 2017). "The disturbing YouTube videos that are tricking children". BBC. Archived from the original on Jun 26, 2019. Retrieved May 8, 2025.
- ↑ "Fake toons: Kids falling prey to adult parodies of popular children's shows". CTV News. Mar 28, 2017. Archived from the original on Mar 27, 2024. Retrieved Jun 8, 2025.
- ↑ Lee, Natasha (Jul 9, 2017). "Disturbing videos masked as kid's cartoons on YouTube". Yahoo Lifestyle. Retrieved Jun 8, 2025.
- ↑ Yeung, Alysha (Oct 6, 2024). "Aphmaugate: The Return of Elsagate". Medium. Retrieved Jun 8, 2025.
- ↑ Whigham, Nick (Nov 29, 2017). "Parents warned to look out for disturbing 'Elsagate' videos". New York Post. Retrieved Jun 6, 2025.
- ↑ Warzel, Charlie (Nov 22, 2017). "YouTube Has A Massive Child Exploitation Problem". BuzzFeed. Archived from the original on Nov 22, 2017. Retrieved Jun 6, 2025.
- ↑ Brandom, Russell (Dec 8, 2017). "Inside Elsagate, the conspiracy-fueled war on creepy YouTube kids videos". The Verge. Archived from the original on Apr 13, 2018. Retrieved Jun 8, 2025.
- ↑ D'Anastasio, Cecilia (Mar 30, 2021). "Blood, Poop, and Violence: YouTube Has a Creepy Minecraft Problem". Wired. Retrieved Jun 8, 2025.
- ↑ Hajjaji, Danya (Apr 12, 2022). "Violent YouTube Cartoons Exploit Children's Favorite Horror Characters". Newsweek. Retrieved Jun 8, 2025.
- ↑ Amadeo, Ron (Jul 5, 2022). "YouTube flags horror video as "for kids," won't let creator change rating". Ars Technica. Retrieved Jun 8, 2025.
- ↑ u/davidverner (Nov 5, 2021). "Is this actually a thing, forcing videos to YouTube Kids because it's about a cartoon series?". Reddit. Retrieved Jun 8, 2025.
- ↑ nuckles87 (Jul 13, 2023). "YouTube goes after Balena Productions' Sonic Animations, Marking Them "Made For Kids"". Sonic Stadium. Retrieved Jun 8, 2025.
- ↑ u/Muska327 (May 9, 2022). "Looks like YouTube has it out for Sonic content now". Reddit. Retrieved Jun 8, 2025.
- ↑ Parker, Tom (Jan 9, 2020). "YouTube COPPA changes result in videos with violence, gore, and strong language being labeled "made for kids"". Reclaim The Net. Retrieved Jun 8, 2025.