YouTube Is Demonetising Low-Quality Kids’ Content, a Thing They Didn’t Do Before for Some Reason

Photo: LIONEL BONAVENTURE / Contributor, Getty Images

Continuing its crackdown on subpar content designed explicitly for young people, YouTube is warning creators that it will begin demonetising channels that consistently upload low-quality kids’ videos next month.

YouTube has long maintained standards around what constitutes high- and low-quality kids’ content on the platform. Indicators of “age-appropriate, enriching, engaging, and inspiring content” include videos that promote being a good person, learning, creativity, and diversity. Low-quality content, meanwhile, comprises videos that are overly promotional in nature, deceptively educational, promote bad behaviours, or use children’s characters in bizarre or questionable ways. But the new policies, announced on Monday, mark the first time that the platform has threatened to boot creators from the YouTube Partner Program (YPP) if they don’t uphold those standards.

“Our ultimate goal is to foster a safe and enriching environment for families while rewarding trusted creators who are making high-quality kids and family content,” James Beser, director of product management for YouTube’s Kids and Family division, wrote in a Monday blog post announcing the changes.

The new policies are part of a larger suite of changes aimed at keeping kids safe that YouTube has instituted in recent months. Back in February, the platform rolled out “supervised experiences,” a new set of filters that parents could opt into in order to better control what content their children are allowed to watch on the platform. The new content settings include three tiers: ‘Explore’, which is designed for young children who are ready to move on from YouTube Kids and browse content that YouTube has deemed suitable for viewers ages 9+; ‘Explore More’, which allows for content considered generally suitable for viewers ages 13+; and ‘Most of YouTube’, which allows viewers to access almost all videos on the platform that aren’t explicitly age-restricted.

While YouTube has floundered for years with its approach to handling ethically dubious content that deals with themes like climate denialism and other forms of political misinformation, its failures in moderating content for children have been a particular thorn in the platform’s side. Creepy, bizarre, and inappropriate videos have long proliferated in the kid-specific corner of the platform, creating an obvious problem for parents who want to be able to hand off their iPads without having to worry about videos with titles like, “BURIED ALIVE Outdoor Playground Finger Family Song Nursery Rhymes Animation Education Learning Video,” showing up in their kids’ feeds.

The increased scrutiny comes as other platforms are also under fire for their lacklustre efforts to moderate the experiences of the minors who frequent them. Most recently, a group of Democratic lawmakers beseeched Facebook CEO Mark Zuckerberg to halt plans to launch an “Instagram for Kids,” referencing internal research that found that the platform has played a role in fuelling negative mental health patterns, including suicidal ideation, among its teenage user base.