YouTube’s Plan To Stop Recommending Conspiracy Theory Videos Is Actually Working

The YouTube algorithm is responsible for many a video-viewing rabbit hole—one that critics have pointed to as an effective digital megaphone for spreading wonky conspiracy theories. In January 2019, however, YouTube announced it would begin cracking down on “borderline content” in its recommendations. Well, a new study suggests that YouTube’s efforts have actually borne some fruit. According to the study, conspiracy theories are now 40 per cent less likely to pop up in your recommendations than they were before YouTube cracked down.

The gist of the study is that Berkeley researchers examined 8 million recommendations over a 15-month period. To judge how effective YouTube’s efforts were, the researchers trained an algorithm to determine whether a video contained conspiracy theories based on its description, comments, and transcript. The results were mixed.
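The paper’s exact model isn’t described here, but a minimal sketch of that style of classifier, assuming a simple TF-IDF plus logistic regression pipeline over the combined text fields (the researchers’ actual architecture and training data may well differ), could look like this:

```python
# A minimal sketch of a text classifier for flagging conspiratorial videos,
# assuming (hypothetically) a TF-IDF + logistic regression pipeline. The
# study's real model and training data are not specified in this article.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: each item stands in for a video's
# description, comments, and transcript concatenated into one text blob.
texts = [
    "the government staged the attacks wake up sheeple",    # conspiratorial
    "official review of the new phone camera and battery",  # not conspiratorial
    "the earth is flat nasa is hiding the truth",            # conspiratorial
    "how to bake sourdough bread at home step by step",      # not conspiratorial
]
labels = [1, 0, 1, 0]  # 1 = conspiracy-theory content, 0 = everything else

# TF-IDF turns each blob into word-frequency features; logistic regression
# then scores how conspiratorial a new video's text looks.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Score an unseen video the same way: combine its text fields and predict.
new_video = "proof that the moon landing was filmed in a studio"
print(model.predict_proba([new_video])[0][1])  # probability of "conspiracy"
```

In practice the researchers would have trained on a far larger set of hand-labelled videos; the four examples above are stand-ins to keep the sketch runnable.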

On the plus side, researchers found that YouTube had been successful at axing videos peddling theories that the U.S. government perpetrated the 9/11 terrorist attacks and that the Earth is flat—two topics that YouTube identified as targets when it initially announced its plans to tackle conspiracy theories. The study also found that in the period from June to December 2019, the percentage of conspiracy theory recommendations dropped first by 50 per cent, and then by 70 per cent, an all-time low.

However, the researchers also found that those numbers, while consistent with YouTube’s own figures, didn’t necessarily account for the popularity of the source video. When adjusting for that, they found conspiratorial recommendations have risen from their lowest point in May 2019 and are now only 40 per cent less common than before. Also, while YouTube was successful at curbing some conspiracy theories, others are still quite rampant—including those about aliens building pyramids and, more concerningly, climate change denial. The researchers told the New York Times that this indicates YouTube has made a choice as to what types of misinformation it will shut down—ones that get a lot of negative media attention, like Sandy Hook conspiracies—versus the ones it will allow.
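To see why that adjustment matters, consider a toy example, assuming (hypothetically) that each recommendation is weighted by the view count of the video it was shown alongside; the study’s precise weighting scheme isn’t spelled out in this article:

```python
# A toy illustration of why weighting by source-video popularity matters,
# assuming (hypothetically) that each recommendation inherits the view count
# of the video it was shown next to. The study's exact weighting scheme is
# not described in this article.
recs = [
    # (is_conspiracy, views_of_source_video)
    (True,  5_000_000),  # conspiracy rec shown next to a very popular video
    (False,   100_000),
    (False,    50_000),
    (False,    25_000),
]

# Raw rate: 1 in 4 recommendations is conspiratorial.
raw_rate = sum(c for c, _ in recs) / len(recs)

# Weighted rate: the conspiratorial rec sits on the most-watched video,
# so far more viewers actually encounter it.
total_views = sum(v for _, v in recs)
weighted_rate = sum(v for c, v in recs if c) / total_views

print(f"raw: {raw_rate:.0%}, popularity-weighted: {weighted_rate:.0%}")
# raw: 25%, popularity-weighted: 97%
```

A low raw percentage can therefore understate how often real viewers see conspiratorial recommendations, which is the gap the researchers’ adjustment is meant to close.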

Another problem here is that while the study does show a marked decrease in conspiratorial recommendations, it doesn’t shed much light on how that decrease affects radicalisation. Furthermore, the study only examined recommendations served without logging into a YouTube account, which doesn’t reflect how most people browse the platform. Without cooperation from YouTube, it’s hard to accurately replicate personalised recommendations at scale, meaning any study claiming to definitively judge whether YouTube has an impact on radicalising people is inherently flawed.

YouTube has nearly 2 billion monthly active users, an increasing number of whom use the platform as their primary news source. Measures like curbing recommended conspiracy videos and giving users more direct control over what the algorithm shows them are steps in the right direction, but there’s still work left to be done.

