Videos of children showing their exposed buttocks, underwear and genitals are racking up millions of views on YouTube – with the site displaying advertising from major cosmetics and car brands alongside the content.
Comments beneath scores of videos appear to show paedophiles sharing timestamps for parts of the videos where exposed genitals can be seen, or when a child does the splits or lifts up their top to show their nipples. Some of the children in the videos, most of whom are girls, appear to be as young as five. Many of the videos have hundreds of thousands, if not millions, of views, with hundreds of comments.
The videos are also being monetised by YouTube, including pre-roll adverts from Alfa Romeo, Fiat, Fortnite, Grammarly, L’Oreal, Maybelline, Metro: Exodus, Peloton and SingleMuslims.com. Banner advertising for Google and the World Business Forum also appeared alongside some of the videos. As well as providing YouTube with our research, we contacted the advertisers to alert them to the issue.
Videos of little girls playing Twister, doing gymnastics, playing in the pool and eating ice lollies are all routinely descended upon by hordes of semi-anonymous commenters, sharing time codes for crotch shots, directing other people to similar videos of children and exchanging phone numbers along with a promise to swap more videos via WhatsApp or Kik. On some videos, confused children who have uploaded videos of themselves playing in the garden respond to comments asking how old they are. On one video, a young girl appears to ask another commenter why one of the videos had made him “grow”. The video shows the child and her friend doing yoga and is accompanied by pre-roll advertising from L’Oreal. The video has almost two million views.
“We’re absolutely horrified and have reached out to YouTube to rectify this immediately,” a Grammarly spokesperson said. “We have a strict policy against advertising alongside harmful or offensive content. We would never knowingly associate ourselves with channels like this.”
A spokesperson for Fortnite publisher Epic Games said it had paused all pre-roll advertising on YouTube. “Through our advertising agency, we have reached out to YouTube to determine actions they’ll take to eliminate this type of content from their service,” the spokesperson added. A World Business Forum spokesperson said it found it “repulsive that paedophiles are using YouTube for their criminal activities”. A Peloton spokesperson said it was working with its media buying agency to investigate why its adverts were being displayed against such videos.
YouTube says that it’s 99 per cent effective at ensuring that adverts only appear on appropriate content, and that it takes seriously every instance of ads appearing where they shouldn’t.
But with a blank YouTube account and a couple of quick searches, YouTube’s recommendation system surfaces hundreds of videos that are seemingly popular with paedophiles. Worse still, YouTube doesn’t just recommend more videos of children innocently playing; its algorithm specifically suggests videos that are seemingly popular with other paedophiles, most of which have hundreds of thousands of views and dozens of disturbing comments. Many include pre-roll advertising.
In one monetised video with 410,300 views, a prepubescent girl performs a dance routine in a dingy flat, flashing the camera halfway through in a definitely illegal, distressingly exploitative bare crotch shot that is shared in the comments with a timestamp. We’ve seen it accompanied by adverts for Fiat and Shen Yun.
Although some prominent YouTube channels have been taken down over child abuse revelations, we were still able to find a number dedicated to “pre-teen models” and groups of young girls bathing, doing stretches and talking through their morning routines. Other YouTube profiles sharing these videos are anonymous, minimally filled-out profiles that exist only to share videos of young children.
View counts run into the tens and hundreds of thousands, with some videos clocking up millions. As with normal vlog channels, it’s common to see one video with a massive view count, while everything else on the channel attracts just a few hundred.
The video with millions of views will almost always involve young girls swimming, dancing or doing yoga. In the comments, people tell them how “nice” and “beautiful” they are, while making requests for more videos with better lighting or different outfits. Many of these accounts also have their own playlists or uploaded collections of videos featuring children that have been scraped from elsewhere on YouTube. Many of the comments are too upsetting to reproduce.
Twister, along with yoga challenges and gymnastics, is one of the most popular themes among paedophiles on YouTube. And once you’ve browsed through just a handful of these videos, YouTube’s recommendations are packed with nothing but videos whose comments are full of paedophiles sharing timestamps of little girls doing the splits. It’s yet another example of YouTube’s algorithm doing exactly what it’s designed to do: showing viewers what it thinks they’ll want. In this case, it’s actively enabling the production and distribution of paedophilic content.
YouTube has ostensibly been battling content of this kind for years, but its efforts remain ineffective. A recent set of videos published by YouTuber Matt Watson, and confirmed by multiple Reddit users, highlights how poor YouTube’s attempts to keep paedophiles off its platform have been, describing an algorithmic “wormhole” that can be reproduced in mere seconds with a couple of YouTube searches and video views.
Even search suggestions help funnel people down that path – start typing “girl yoga” and the autocomplete options include “young” and “hot”. Enter “twister girl” and autocomplete suggests “little girl twister in skirt”.
Since November 2017, it has been YouTube policy to disable comments on videos where users are saying “inappropriate” things. But even with comments disabled, those videos – typically of children simply going about their lives – are still recommended by an algorithm that decided, within a few videos, that our main interest was little girls flashing their crotches. Many of the videos we found did not have comments disabled and had racked up millions of views.
On a clip of a very young girl with spinal muscular atrophy lying on a swimming pool inflatable, users exchanged phone numbers with the promise of sharing more videos via WhatsApp. Similar conversations appear in the comments on a video of a pre-teen in a summer dress playing with a hoverboard.
The majority of the most troubling conversations are in Spanish, Russian and Portuguese, suggesting that YouTube’s automated systems for detecting predatory comments work less well in languages other than English. But many of the comments we saw were also in English.
In a statement, a YouTube spokesperson said the company enforced its policies “aggressively” and reported any content, including comments, that endangers minors to relevant authorities. The spokesperson added that such material is also removed from YouTube, with associated accounts deleted. “When we find content that is in violation of our policies, we immediately stop serving ads or remove it altogether,” they added.
YouTube has a long history of content regulation problems, particularly when it comes to children. In 2017, the streaming service shut down the Toy Freaks channel, which showed simulated violence against children, and in 2018 police arrested the creator of the FamilyOFive video channels on suspicion of child sex abuse.
Reports of paedophiles commenting on kids’ channels first appeared in 2017, and YouTube promised to fix the problem. In the same year, major advertisers pulled their ads after they appeared next to extremist content, including rape apologia, anti-Semitism and hate preachers.
To report troubling content on YouTube, go to youtube.com/reportabuse or select the Report option below any video. If you’re concerned about a child’s safety, contact the Child Exploitation and Online Protection Centre at ceop.police.uk.
Source: https://www.wired.co.uk/article/youtube-pedophile-videos-advertising