For years, anti-vaccine conspiracy theories found a welcome home on Facebook. The social network treated the issue as a matter of free speech and the anti-vaxxer groups it hosted as no more dangerous than flat-Earthers. But now that getting accurate vaccine information to the public is a matter of existential importance, Facebook says it’s launching an unprecedented campaign to remove false claims on the subject entirely.
As folks in the U.S. head into flu season, Facebook has decided that now’s the time to formally issue a “ban” on ads that discourage its users from getting vaccinated.
In a blog post on Monday, Facebook wrote a lot of words touting its efforts to deliver accurate vaccine information to the couple billion people who use its products. Tucked within all of that self-congratulation was a short note about implementing stricter policies to combat vaccine disinfo. Kang-Xing Jin, Facebook’s head of health, writes:
In addition to sharing reliable information, we are expanding our efforts to remove false claims on Facebook and Instagram about COVID-19, COVID-19 vaccines and vaccines in general during the pandemic. Today, following consultations with leading health organisations, including the WHO, we’re expanding the list of false claims we will remove to include additional debunked claims about COVID-19 and vaccines. Learn more about how we’re combating COVID-19 and vaccine misinformation.
This was not a sudden move, but it’s likely to open the platform up to new levels of moderation chaos and angry users. In 2019, Facebook vowed to downrank pages and groups that “spread misinformation about vaccinations in News Feed and Search” and said it would reject ads that spread misinformation about vaccinations. It also said it would remove targeted ad categories like “vaccine controversies,” reminding everyone that yes, Facebook had a special category for this stuff.
As the anti-vaxxer trend grew into something resembling a social movement, the covid-19 pandemic overwhelmed the globe, and it became clear that this issue was about more than a resurgence of measles. In December, Facebook said it would begin removing posts that push false information about the covid-19 vaccines, specifically. Today’s move goes all the way. The company claims it will remove all vaccine-related misinformation that falls within the criteria established by Facebook in coordination with the “World Health Organisation (WHO), government health authorities, and stakeholders from across the spectrum of people who use our service.”
The list of prohibited content includes straightforward items like claims that “vaccines cause autism” or “vaccines cause the disease against which they are meant to protect.” Those points should be fairly easy to enforce, but critics are already raising concerns about some of the trickier rules. Journalist and sociologist Zeynep Tufekci pointed out on Twitter that several rules could lead to legitimate research being flagged as false by Facebook while our knowledge of covid-19 and related vaccines continues to evolve.
Even if all the rules were thoroughly crafted to target only the content that Facebook doesn’t want around, we have too many examples demonstrating that the social network is terrible at enforcing its own policies, and its automated removal systems too often fail. Just today, the BBC reported on the case of a photographer in England who has had his work rejected by Facebook’s ad algorithm on at least seven different occasions. Examples of rejected photos included a fireworks display that was blocked for “promoting weapons” and a shot of an ordinary cow in a gloomy field that was labelled “overtly sexual.”
Do we absolutely need Facebook as a vital space for scientists to share preliminary vaccine information? That seems debatable. Do we need the freedom to share overtly sexual photos of cows? Absolutely. As with all things related to moderation and censorship, be careful what you wish for.