Facebook Has a Superspreader Problem

Image: Graeme Jennings, Getty Images

You could make the case that stemming misinformation on social media is about as feasible as beheading a hydra, even if the people running said platforms were inclined towards monster-slaying (which many decidedly aren’t). A convincing conspiracy video can become national news within a matter of hours in the midst of a health crisis that scientists are still working to fully understand. But it may be easier to root out these threats than Facebook would have you believe, according to the leftist online activist network Avaaz. In a new report, they mapped a network of misinformation websites and complementary superspreader Facebook pages and estimated that, together, they generated 3.8 billion views over the past year, with a major spike during coronavirus’s first wave in the US.

To be clear, the misinformation recorded in the study is not all necessarily directly related to health misinformation — rather, it tracked total views from websites and pages that routinely share health misinformation. “In some cases, [benign] posts from these websites are more dangerous than misinformation,” Luca Nicotra, senior campaigner and researcher with Avaaz, told Gizmodo. “It can even be a strategy: you’re drawn in by a nice image of meditation, and then you realise that website you like is anti-vaccination, and you think, maybe I should think twice about this.” Facebook knows this strategy well from the 2016 election.

Looking at activity from the year leading up to late May 2020, Avaaz identified 82 websites that NewsGuard — a fake news watchdog — has flagged for regularly spreading falsehoods including health misinformation, with such names as “Realfarmacy.com,” “sonsoflibertymedia.com,” and “nowtheendbegins.com.” They then used Facebook’s content discovery platform CrowdTangle to identify 42 Facebook pages that 1) generated at least 100,000 interactions on posts linking to the sites and 2) posted at least three pieces of misinformation that had been debunked by third-party fact-checkers. Avaaz estimates that the superspreaders drove about 32% of views to these links, when accounting for profile pages and private groups. (This is a rough estimate; Facebook doesn’t actually share the number of views on posts, so Avaaz calculated views from the average ratio of engagement metrics on videos from the pages.)
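The back-of-the-envelope logic behind that estimate can be sketched roughly like this (a hypothetical illustration only — the function names and numbers are invented, not Avaaz’s actual methodology code): videos are the one post type where Facebook publicly reports view counts, so you can compute a views-per-interaction ratio from a page’s videos and apply it to the interaction counts on its other posts.

```python
def estimate_views(post_interactions, video_views, video_interactions):
    """Estimate views for non-video posts by scaling their interaction
    counts by the views-per-interaction ratio observed on the same
    pages' videos (where Facebook does report view counts)."""
    ratio = sum(video_views) / sum(video_interactions)
    return [round(i * ratio) for i in post_interactions]

# Hypothetical numbers: a page's videos averaged 20 views per interaction,
# so a link post with 1,000 interactions is credited ~20,000 views.
estimates = estimate_views(
    post_interactions=[1000, 2500],
    video_views=[100_000],
    video_interactions=[5000],
)
print(estimates)  # → [20000, 50000]
```

The obvious caveat, which is why Avaaz calls it a rough estimate, is that link posts and video posts may not convert interactions to views at the same rate.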

The Bill Gates conspiracy is a prime example of the health misinformation that took off like wildfire on Facebook. Avaaz noted that a single article about Bill Gates’s “Vaccine Agenda” — which contained nine debunked or unsupported claims (including of a global polio “explosion”) — got around 3.7 million estimated views on Facebook and was viewed an estimated 4.7 million times in cloned or translated versions. Tellingly, they found it was quoted in or linked to by 29 of these problematic websites and 15 of the superspreader pages. The study found similar patterns on Facebook around other conspiracies that cast doubt on the efficacy of quarantining, pushed snake oil cures, and alleged negative health impacts of 5G cellular networks.

But if Facebook has been keeping tabs on who is responsible for sharing vaccine misinformation — and it has said it would for promoted posts — these pages should be on the shortlist. The top ten pages mostly have over one million followers, and, Avaaz notes, they’ve been sharing health-related bunk for a long time. “Many of these networks, made up of both websites and Facebook pages, have spread vaccination and health misinformation on the social media platform for years,” Avaaz writes. “However, some did not appear to have had any focus on health until Feb. 2020 when they started covering the COVID-19 pandemic.”

Meanwhile, Avaaz estimated that the top 10 misinformation pages generated around four times the views of posts from the top 10 reputable health sites, including the World Health Organization and the Centers for Disease Control — despite the fact that Facebook provided free pop-ups for the WHO and the CDC, added automatic labels to coronavirus posts with credible links, set up a covid-19 information centre for debunks, claimed it directed two billion people to health resources, and promised to stifle health-related misinformation in the News Feed. Facebook has claimed this last initiative would reduce views of this dangerous nonsense by up to 80 per cent.

“This suggests that just when citizens needed credible health information the most, and while Facebook was trying to proactively raise the profile of authoritative health institutions on the platform, its algorithm was potentially undermining these efforts and helping to boost content from health misinformation spreading websites at a staggering rate,” the study reads.

Even if Facebook quashed the superspreaders, Avaaz’s research represents another compelling bit of evidence that the company simply can’t handle the gargantuan task of fact-checking its own platform. In an email to Gizmodo, Facebook claimed that between April and June, their “global network of fact-checkers” allowed them to deploy warning labels on 98 million pieces of covid-19 misinformation, and they removed 7 million pieces of content that “could lead to imminent harm.” And yet, Avaaz found that just 16 per cent of the health misinformation in their study carried a warning label.

If any social network has the resources to successfully cut the spread of conspiracy theories and junk science off at the knees, it’s Facebook. And if a group like Avaaz — which, it goes without saying, has considerably fewer staff and far less funding — can suss out the problem, what excuse does Facebook have?