YouTube's Creepy Kid Problem Was Worse Than We Thought

YouTube says that it's removed ads from some two million videos and over 50,000 channels that featured disturbing content aimed at kids. Some of that content actually exploited the children who appeared in the videos. And while we've long known that YouTube struggles to keep bad stuff off its platform, the fact that tens of thousands of channels were involved in doing bad things to children feels chilling.

Image: YouTube / Pexels / Gizmodo

The public outcry over YouTube's creepy kid videos started a few weeks ago. Several reports highlighted seemingly kid-friendly videos depicting Disney characters in bikinis flirting with other popular children's characters, along with other generally inappropriate themes. The issue was compounded by disturbing videos of cartoon characters dealing with themes such as torture and suicide popping up in the YouTube Kids app. There was also a rash of videos that showed kids being tied up, apparently hurt, or otherwise placed in exploitative situations.

YouTube quickly addressed the issue by announcing plans to age-restrict and demonetise these kinds of videos. The company went beyond that and told Vice News that it "terminated more than 270 accounts and removed over 150,000 videos" as well as "turned off comments on over 625,000 videos targeted by child predators". Additionally, YouTube says it "removed ads from nearly 2 million videos and over 50,000 channels masquerading as family-friendly content".

News of YouTube's action against millions of disturbing videos came on the heels of a separate but related controversy. Over the weekend, a number of people reported that typing "how to have" into the YouTube search bar prompted autofill suggestions that included phrases such as "how to have s*x with kids" and "how to have s*x in school". Those searches led to videos with titles such as "Inappropriate Games You Played as a Kid" featuring a provocative thumbnail of young people kissing and "School Is Hard and So Is Your Maths Teacher" with an image of a crying girl being touched by an older man. Those videos have 23 million and 117 million views, respectively, so it isn't hard to imagine why they showed up at the top of the search results.

YouTube says it has removed these vile suggested searches, which is good. But the persistence of these kinds of problems raises larger questions about YouTube's capacity for moderating creepy content. I'm not talking about "spookin in the wrong neighbourhood" or any other fun, weird but also slightly dark videos. I'm talking about the stuff that targets kids and appeals to bad people, such as paedophiles.

The problem with all of these videos is how borderline they appear to be - even to humans. And yet, YouTube primarily depends on algorithms and filters to keep bad content off its platform. Videos and channels are removed by human moderators, but only after they're flagged by users. In the meantime, you have disturbing autofill suggestions such as "how to have s*x in school" showing up for everyone, as well as the countless questionable videos to which these searches lead. Removing all of these suggestions and videos seems like an impossible task, especially since over 400 hours of content are uploaded to YouTube every minute.

The thing is, algorithms are inherently imperfect. Computers have a hard time identifying the uncanny valley that separates an innocuous video from one that's entirely inappropriate. Sure, videos that violate YouTube's terms of service - stuff that's copyrighted, gruesome, illegal, full of nudity, or that exploits children - can get flagged and removed. YouTube also now uses algorithmic filters to catch some of these videos before they're published. The system isn't perfect, but YouTube seems committed to it.

It's inevitably hard to point fingers. Is it a tech company's fault that humans are awful and abusive and exploitative? Of course not. YouTube does shoulder a tremendous burden when it comes to deciding how to let the right videos in and keep the bad ones out. Few are surprised when the platform fails to catch every creepy video. But the creepy videos are still a problem. Whether that says more about YouTube's limitations or our own perversions remains to be seen.



Comments

    In YouTube's (Google's) defence, the autocomplete search suggestions are based (at least partly) on your previous searches. If you're getting weird/dodgy search suggestions, you need to look no further than your own searches.

      Exactly.

      I always laugh when I see Facebook posts on news websites' Facebook pages complaining about the adult advertisements they are seeing, with the poster not realizing the ads are based on their internet usage.

      Pretty sure a lot of the Google search suggestions are actually compiled from a list of all searches by all users. They index common search phrases and keywords, and as you type it matches against that. Some of the suggestions are specifically related to your previous searches, but definitely not all.

      For example, I typed "How" into Google and it brought up "How old is prince harry", and there's no way I've ever searched for anything related to Prince Harry. I'd imagine it got onto the list because Prince Harry is "trending" now since the marriage announcement. It also brought up "how to mine bitcoin", which is probably related to my search history since I was looking into that a few weeks ago.

      Back on the article, I think there is also a problem in that some content is never intended for kids but they find it anyway. Like the Disney characters doing weird things, that's probably a Rule 34 kinda deal. Should it be banned/blocked/removed if it was never intended for kids?

      Here's a similar real-life example: my mate's young son (about 9) was trying to find some Teddy Bear song for a school play (I assume "Teddy Bears' Picnic"), but instead he found the song from the movie Ted, "Fuck you thunder". As you'd imagine, he's been singing it non-stop and has gotten in trouble. So should anything related to the movie Ted be removed?

    I cleaned out all my... ahem... I mean my kid's searches several times. Starts with a clean slate. I know it's cleaned out because you get Logan Paul suggestions on the main page straight away. With that search history completely erased and reset, we searched for a Peppa episode he wanted to see and BAM, the first suggestion is that one where she gets tortured at the dentist, plus a million Ryan ToysReview videos.
