Facebook Now Says It Will Remove Misinformation That Inspires Real-World Violence

After a rough week of criticism over Facebook CEO Mark Zuckerberg’s shoddy explanation for why he won’t ban conspiracy site Infowars — including a very awkward tangent into apparently believing Holocaust deniers are not “intentionally getting it wrong” — the social media giant has announced it will begin removing misinformation that provokes real-world violence.

Per The New York Times, the new policy is “largely a response to episodes in Sri Lanka, Myanmar and India” where rumours spread rapidly on Facebook, leading to targeted attacks on ethnic minorities.

The paper writes that Facebook staff admit they bear “responsibility” to curb that kind of content from circulating on the site:

“We have identified that there is a type of misinformation that is shared in certain countries that can incite underlying tensions and lead to physical harm offline,” said Tessa Lyons, a Facebook product manager. “We have a broader responsibility to not just reduce that type of content but remove it.”

In another statement to CNBC, a Facebook spokesperson characterised the policy as a crackdown on a specific type of content they have deemed worthy of removal, while defending their laissez-faire approach to other dubious posts:

“Reducing the distribution of misinformation — rather than removing it outright — strikes the right balance between free expression and a safe and authentic community,” a Facebook spokesperson said in a statement to CNBC. “There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down. We will begin implementing the policy during the coming months.”

According to CNBC, Facebook says that under the new policy it will partner with local civil-society groups to identify, and remove, text and image content posted with the purpose of “contributing to or exacerbating violence or physical harm”.

The CNBC report also notes that the process will involve Facebook’s internal image recognition technologies, presumably a similar system to the one it uses to automatically purge revenge porn from the site.
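For context, that kind of matching is typically done with perceptual hashes, which survive resizing and re-encoding. Below is a minimal sketch in Python using the open-source imagehash library; the flagged-hash database, threshold and function names are illustrative assumptions, since Facebook’s actual pipeline is not public.

```python
# A minimal sketch of perceptual-hash image matching, the general technique
# behind systems like the one described above. Assumes the open-source
# `imagehash` and `Pillow` libraries; Facebook's real pipeline is not public.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of images already flagged
# for removal by partner organisations.
FLAGGED_HASHES = {imagehash.phash(Image.open("flagged_example.jpg"))}

MAX_DISTANCE = 8  # Hamming-distance threshold; tune for precision vs. recall.

def matches_flagged(path: str) -> bool:
    """Return True if the image is a near-duplicate of a flagged image."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two imagehash objects yields the Hamming distance in bits,
    # so a small distance means a visually similar image (resized, re-encoded).
    return any(candidate - flagged <= MAX_DISTANCE for flagged in FLAGGED_HASHES)
```

Because the hash captures visual structure rather than exact bytes, this approach catches the slightly altered re-uploads that defeat naive checksum matching.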

That’s an improvement on the current situation. For example, in Sri Lanka and Myanmar, the company has faced harsh criticism from local NGOs for alleged inaction as incitement and propaganda circulated widely on the site.

In both countries, reports indicate that despite having large userbases, Facebook failed to hire enough moderation staff. Partnering with local organisations could help the site become less of an absentee landlord.

However, this is likely far from a slam dunk. For one, Facebook’s standards for what qualifies as inappropriate content are habitually lax, and there will be a vast amount of such content to sift through. The company often relies on automated methods that are easily worked around, or that simply end up backfiring (as was the case with the “disputed” flags it put on dubious articles).

In this case, it’s easy to imagine this ending up being an unending game of Whac-A-Mole in which they only commit the resources to stare at one hole.

As the NYT wrote, there are two other solutions the site is moving forward with: Downranking posts flagged as false by its third-party fact checkers, and adding “information boxes under demonstrably false news stories, suggesting other sources of information for people to read”.

While either method will likely have some impact, Facebook’s fact checkers have repeatedly expressed concerns that the site’s system is too constrained to be effective.

Additionally, the NYT reported Facebook has no plans to roll out the new rules to its subsidiaries, photo- and video-sharing app Instagram and encrypted chat service WhatsApp, the latter of which has been linked to several deadly hoaxes:

The new rules do not apply to Facebook’s other big social media properties, Instagram and WhatsApp, where false news has also circulated. In India, for example, false rumours spread through WhatsApp about child kidnappers have led to mob violence.

Policing WhatsApp may be far more difficult, if not outright impossible: Facebook ostensibly cannot see the content of messages without watering down the service’s end-to-end encryption, leaving the company between a rock and a hard place. (As Indian daily Economic Times wrote last year, authorities there still consider WhatsApp group administrators liable for the content of chats.)
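To illustrate the bind, here is a toy sketch using symmetric Fernet encryption from Python’s cryptography package. WhatsApp actually uses the Signal protocol, which is far more sophisticated, but the server-side blindness is the same in principle: the key lives only on the endpoints, so the relay only ever handles opaque ciphertext.

```python
# Toy illustration of the end-to-end encryption bind described above.
# Assumes the `cryptography` package; WhatsApp really uses the Signal
# protocol, but the point — the server can't read content — is the same.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # held only by the two chat endpoints

ciphertext = Fernet(key).encrypt(b"forwarded rumour text")

# This opaque blob is all the relay server ever sees; without the key,
# no content-based misinformation filter can be applied to it.
print(ciphertext)

# Only the receiving endpoint, which holds the key, can recover the text.
print(Fernet(key).decrypt(ciphertext).decode())
```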

Then there’s the matter of Facebook’s stated commitment to free speech, which is nice in theory but vague enough in practice that it seems to function primarily as a shield against criticism.

Related to this is the site’s habitual wariness of offending conservative and far-right groups eager to cry censorship, a wariness traceable in part to a 2016 Gizmodo post alleging bias in its now-defunct trending news section.

For example, take Infowars, which spread conspiracy theories about a DC-area pizza restaurant until a gunman showed up. As The Washington Post noted, it is hard to reconcile how Facebook’s “beefed-up approach” to misinformation can coexist with some of its main purveyors being allowed to remain on the site.

These problems are innumerable. As former Gawker editor-in-chief Max Read recently wrote, they are also perhaps unsolvable short of a radical restructuring of the company, given that Facebook’s scale now approaches a form of “sovereign power without accountability” (or indeed, without a coherent vision of what it is supposed to be).

“There is not always a really clear line,” Facebook product manager Tessa Lyons told the NYT. “All of this is challenging — that is why we are iterating. That is why we are taking serious feedback.”

Gizmodo reached out to Facebook for comment but had not heard back at the time of writing.

[New York Times]

