Facebook, which the United Nations’ top human rights commissioner accused earlier this year of a “slow and ineffective” response to evidence it was fueling state genocide against the Rohingya Muslim minority in Myanmar, admitted in a blog post on Monday that its own “independent human rights impact assessment” has more or less confirmed that it really screwed that one up.
For many Facebook users in Myanmar, the site is their primary (or even lone) portal to the internet. Numerous media reports have confirmed that the country’s military used Facebook as, in the words of the New York Times, a “tool for ethnic cleansing… [military officials were the] prime operatives behind a systematic campaign on Facebook that stretched back half a decade and that targeted the country’s mostly Muslim Rohingya minority group.”
The social media giant wrote that the independent report, produced by the San Francisco-based nonprofit Business for Social Responsibility (BSR), “concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence.” While Facebook noted that it “can and should do more”, the company also wrote that the BSR report emphasised that it was now trying harder not to enable genocide (and that it could not be held solely responsible for it, which no one has alleged):
Over the course of this year, we have invested heavily in people, technology and partnerships to examine and address the abuse of Facebook in Myanmar, and BSR’s report acknowledges that we are now taking the right corrective actions.
BSR’s report also examines the complex social and political context in Myanmar, which includes a population that has fast come online, a legal framework that does not reflect universal human rights principles, and cultural, religious, and ethnic tension. In this environment, the BSR report explains, Facebook alone cannot bring about the broad changes needed to address the human rights situation in Myanmar.
Facebook acknowledged that it faces difficulties monitoring users in Myanmar, because it “is currently the only country in the world with a significant online presence that hasn’t standardised on Unicode — the international text encoding standard.” It said it has onboarded 99 native Myanmar language speakers to improve “development and enforcement of our policies” and took action on roughly 64,000 pieces of content in Myanmar in the past few months:
In the third quarter of 2018, we saw continued improvement: we took action on approximately 64,000 pieces of content in Myanmar for violating our hate speech policies, of which we proactively identified 63% — up from 13% in the last quarter of 2017 and 52% in the second quarter of this year.
Hours after the release of the UN report in August, Facebook banned several leaders of the country’s military, including Senior General Min Aung Hlaing, as well as several organisations in Myanmar. In October, it also banned a number of other pages found to be involved in spreading propaganda and misinformation in support of the ethnic cleansing.
But reporting from Wired showed that Facebook’s culpability in the violent campaign dates to at least 2013, when foreign correspondent Aela Callan alerted it to the number of hateful pages cropping up:
Aela Callan, a foreign correspondent on a fellowship from Stanford University, met with Elliot Schrage, vice president of global communications for Facebook, in November 2013 to discuss hate speech and fake user pages that were pervasive in Myanmar. Callan returned to the company’s Menlo Park, California, headquarters in early March 2014, after follow-up meetings, with an official from a Myanmar tech civil society organisation to again raise the issues with the company and show Facebook “how serious it [hate speech and disinformation] was”, Callan says.
But Facebook’s sprawling bureaucracy and its excitement over the potential of the Myanmar market appeared to override concerns about the proliferation of hate speech. At the time, the company had just one Burmese speaker based in Dublin, Ireland, to review Burmese language content flagged as problematic, Callan was told.
A separate investigation by Reuters found that in early 2015, there were still “only two people at Facebook who could speak Burmese reviewing problematic posts”.
While Facebook says it will do better in the future, it’s a bit late for the nearly 700,000 Rohingya Muslims who, by UNICEF’s estimate, had fled Myanmar by April 2018, or the roughly 25,000 the UN estimated had been killed. Elsewhere in the world, the list of Facebook wrongdoing keeps growing: Last week, the company apologised for promoting an ad category designed to appeal to users interested in the “white genocide conspiracy theory”.