In a rare preemptive move to crack down on misinformation before it goes viral, Meta, the tech giant formerly known as Facebook, is partnering with the Centers for Disease Control and Prevention and the World Health Organization to take down harmful content related to the coronavirus vaccine and its effects on children. The announcement coincided with the U.S. Food and Drug Administration’s authorization of the first covid-19 vaccine for children between the ages of 5 and 11.
In the coming weeks, Facebook users will start seeing in-feed reminders that the vaccine has been approved for kids in the U.S., along with information about where it’s available. Meta is rolling out both English and Spanish versions of the reminders.
It’s also expanding its anti-vaccine misinformation policies to remove false claims specifically relating to the vaccine and children. This includes misinformation about the vaccine’s availability, efficacy, and vetting in the scientific community, such as claims that the covid-19 vaccine can kill or seriously harm children. Posts claiming that any treatment other than a covid-19 vaccine can inoculate kids against the virus will also be removed.
“This is not a single update, but part of an ongoing effort in partnership with health authorities like the CDC and WHO and others, both in the U.S. and globally,” Meta’s head of health, Kang-Xing Jin, wrote in a blog post announcing the partnership Friday. “We will continue to clarify our policies and add new claims about the COVID-19 vaccine for children that we remove from our apps.”
The U.S. FDA’s authorization covers the Pfizer/BioNTech mRNA vaccine, which is estimated to be 90.7% effective at preventing covid-19 in children ages 5 to 11. The child version of the vaccine, which comes in a smaller dose than the version available to adults, will be offered as a two-dose shot given over a three-week period in the U.S. In trials, no serious adverse events connected to the vaccine were reported.
Over the years, Meta has frequently come under fire for failing to curb the rapid spread of misinformation on its platform, a problem that came into stark relief with the covid-19 pandemic. Right-wing conspiracy theorists, anti-vaxxers, and pseudoscience zealots had always had a foothold on Facebook, and it wasn’t long before they began flooding feeds with bullshit about covid-19 and the efficacy of wearing masks, later adding propaganda about the coronavirus vaccine to the list.
Amid mounting pressure from critics, Facebook has implemented several sweeping new policies to take down or limit the reach of harmful content. But many argue its response has been too little, too late. President Joe Biden accused the company of “killing people” by letting false vaccine claims go unchecked on its platform. And Facebook still isn’t finished performing damage control: The hashtag #VaccinesKill, which would seem like an obvious contender for Facebook’s ban hammer, remained functional as recently as July, when Facebook finally blocked it.
Since the start of the pandemic, Meta has taken down a total of 20 million pieces of content and 3,000 accounts, pages, and groups from Facebook and Instagram related to covid-19 and vaccine misinformation, the company said Friday.