Facebook’s Oversight Board, the supposedly independent entity the company created to quell some of the heat over its moderation policies, announced its first five decisions on Thursday. The oversight panel, which consists of 20 academics, lawyers, and human rights activists, overruled Facebook moderators on four of their decisions and dinged the company for having vague rules which it enforces on an arbitrary basis.
The Oversight Board has the power to overrule content decisions made by Facebook and its subsidiary Instagram, and to compel the sites to restore posts it decides weren’t in violation of their policies. (Facebook recently punted to the board the question of whether Donald Trump should be allowed back on the site after years of using it to spread lies before he incited a riot at the Capitol this month.) The Oversight Board forms five-member panels to investigate each case and present a decision for majority approval. The board’s decisions on specific posts are binding, but any broader recommendations it issues are entirely Facebook’s prerogative to act on or ignore.
The cases in question included: a user in Myanmar who shared two famous photos of a Syrian toddler of Kurdish origin who drowned trying to reach Europe in 2015; a Brazilian user who posted a breast cancer awareness image containing nipples; a quote inaccurately attributed to Nazi propaganda minister Joseph Goebbels; a French-language video boosting debunked coronavirus treatment hydroxychloroquine that was viewed 50,000 times; and a post using a slur against Azerbaijanis.
According to the Oversight Board, the post from Myanmar compared the outcry over the Syrian refugee crisis in Europe to the reaction against human rights abuses perpetrated by the Chinese government against Uighur Muslims, “concludes that recent events in France reduce the user’s sympathies for the depicted child, and seems to imply the child may have grown up to be an extremist.”
Facebook’s moderators found the specific wording in question translated to “[there is] something wrong with Muslims psychologically” in English, violating company guidelines against hate speech. The Oversight Board said that decision failed to consider the post’s full context, and it consulted an outside translation team, which suggested a more accurate rendering might be the more specific “those male Muslims have something wrong in their mindset.” Context experts consulted by the board also suggested that while members of the Rohingya Muslim minority community have faced a genocidal, military-backed campaign of ethnic cleansing in Myanmar in recent years, accusations of mental health issues were not overall a large part of anti-Muslim rhetoric there. Facebook has specifically been cited by United Nations investigators as recklessly complicit in that genocide by enabling Myanmar military officials to spread anti-Rohingya propaganda with virtually no pushback.
While the post about Muslims “might be seen as pejorative, read in context, it did not amount to hate speech,” Stanford Law School professor and Oversight Board member Michael McConnell said during a Thursday morning conference call with reporters.
The Oversight Board stated that Facebook had attempted to stop it from issuing a judgement on the Instagram post from Brazil that included photos of nipples, because the company had already admitted it removed the post in error. The board said the post should be restored under Facebook’s rules allowing content promoting breast cancer awareness, but it also criticised the company for relying on buggy automated systems that flagged the post in the first place, saying that users can’t always appeal the bots’ decisions. It wrote: “Automated content moderation without necessary safeguards is not a proportionate way for Facebook to address violating forms of adult nudity.”
“One of the things this particular case showed… is that they didn’t have a human moderator to look at a case,” retired Danish politician and board member Helle Thorning-Schmidt told reporters, adding it was “very clear that was part of the problem” and human moderators wouldn’t have taken it down. The Oversight Board’s recommendations included that users be informed when automated systems had flagged their posts and that they be specifically told which rule their post had violated.
The post inaccurately quoting Goebbels, the board found, was criticising the Nazi regime rather than endorsing it. Facebook confirmed to the board that Goebbels is on its list of dangerous individuals and organisations; the board recommended that the list, or at least specific examples from it, be made public.
The Oversight Board also told Facebook to restore the post about hydroxychloroquine, which alleged a scandal at the Agence Nationale de Sécurité du Médicament over its refusal to authorise “[Didier] Raoult’s cure” while authorising remdesivir, an antiviral also found to be useless in the fight against coronavirus. The board found the content was intended to criticise government policy; it also wrote the drugs “are not available without a prescription in France and the content does not encourage people to buy or take drugs without a prescription.” The post thus fell short of Facebook’s rules against medical misinformation resulting in imminent harm, according to the Oversight Board, and its deletion “did not comply with international human rights standards on limiting freedom of expression.”
The Oversight Board found that the Russian-language post smearing Azerbaijanis as people without a history was a clear rule violation, siding with Facebook that it contained a racial slur:
The post used the term “тазики” (“taziks”) to describe Azerbaijanis. While this can be translated literally from Russian as “wash bowl,” it can also be understood as wordplay on the Russian word “азики” (“aziks”), a derogatory term for Azerbaijanis which features on Facebook’s internal list of slur terms. Independent linguistic analysis commissioned on behalf of the Board confirms Facebook’s understanding of “тазики” as a dehumanising slur attacking national origin.
Taken as a whole, the Oversight Board decisions suggest that the board is seeking to expansively interpret its charter, with a focus on greater transparency from Facebook over what exactly its rules are and how it makes decisions about reported posts. Of course, Facebook has a very long history of breaking promises and saying it’s working to redress its shortcomings while doing the bare minimum. The company could easily choose to issue a few statements claiming it’s working to change the system, then consign the recommendations to the memory hole. In other words, the board has a very long way to go before it can prove it’s an exercise in anything but “corporate whitewashing”: a convenient body Facebook can point to whenever it wants to distance itself from taking responsibility for what circulates on its platforms.
One of the board’s decisions already hasn’t gone down so well. U.S. civil liberties group Muslim Advocates accused the Oversight Board of enabling hate speech and compounding ongoing human rights abuses by overruling Facebook on the anti-Muslim post from Myanmar. Spokesperson Eric Naing told the Guardian: “It is clear that the oversight board is here to launder responsibility for Zuckerberg and Sheryl Sandberg. Instead of taking meaningful action to curb dangerous hate speech on the platform, Facebook punted responsibility to a third party board that used laughable technicalities to protect anti-Muslim hate content that contributes to genocide.”
The decision allowing the hydroxychloroquine post to remain on the site will also prove contentious, as content flush with medical misinformation but stopping just short of blatantly encouraging quack treatments has spread far and wide on Facebook, and its approach to combating it has been rife with inconsistency. Facebook also reportedly soft-pedalled its approach to conservatives with large followings who pushed disinformation — including antivax conspiracy theories — to avoid angering Republican politicians before the 2020 elections. Incorrect, misleading, and just plain hoax claims about medicine have been highlighted by researchers as having potentially major consequences for public health, especially during the coronavirus pandemic.
“Users do require more clarity and precision from the community standards,” Thomas Hughes, the director of Oversight Board Administration, said during the call.
Michael McConnell told reporters on the call that Trump’s team has not yet reached out to the Oversight Board to appeal the indefinite lockout of his account. He added that the board had begun deliberations on the issue, but those remained in the “extremely early” phase. The board has several months to decide whether Trump should be allowed back onto the site and regain the privilege of sending out misinformed diatribes to his more than 33 million former followers on the main site and nearly 25 million on Instagram.