Facebook Says Covid-19 Shutdowns Hurt Its Ability to Fight Suicide, Self-Injury, Child Exploitation Content

Facebook said Tuesday that it can’t moderate its own site or its subsidiary Instagram as effectively as it normally would for certain categories of rule violations during the novel coronavirus pandemic, and that almost nobody got the chance to appeal its moderators’ decisions in the second quarter of 2020.

Per the latest version of its Community Standards Enforcement Report, which covers April to June 2020, Facebook took action on just 911,000 pieces of content that violated its policies on suicide and self-injury in Q2, down from 1.7 million in Q1 and 5 million in Q4 2019. While enforcement against content in violation of Facebook policies on child nudity and sexual exploitation rose from 8.6 million to 9.5 million, it was way down on Instagram, where the number fell from around 1 million to just over 479,000. Enforcement of rules prohibiting suicide and self-injury content on Instagram also plummeted from 1.3 million actions in Q1 to 275,000 actions in Q2. Instagram increased enforcement against graphic and violent content, but on Facebook that category fell from 25.4 million actions in Q1 to 15.1 million actions in Q2.

Facebook Vice President of Integrity Guy Rosen wrote in a blog post that the lower number of actions taken was the direct result of the coronavirus, as enforcing rules in those categories requires increased human oversight. The company’s long-suffering force of content moderators, many of whom are contractors, can’t do their job well or at all from home, according to Rosen:

With fewer content reviewers, we took action on fewer pieces of content on both Facebook and Instagram for suicide and self-injury, and child nudity and sexual exploitation on Instagram. Despite these decreases, we prioritised and took action on the most harmful content within these categories. Our focus remains on finding and removing this content while increasing reviewer capacity as quickly and as safely as possible.

The report didn’t offer estimates of how prevalent violent and graphic content or adult nudity and sexual activity were on Facebook or Instagram, with Rosen claiming that the company “prioritised removing harmful content over measuring certain efforts.”

The Facebook appeals process, by which users can challenge a moderation decision, has also flatlined to near-zero levels in every category. The company previously announced in July that with moderators out of the office, it would give users who want to appeal “the option to tell us that they disagree with our decision and we’ll monitor that feedback to improve our accuracy, but we likely won’t review content a second time.”

Facebook took action on a far larger number of posts for violating rules against hate speech in Q2 (22.5 million, up from 9.6 million in Q1). It wrote in the report that automated machine learning tools now find 94.5 per cent of the hate speech the company ends up taking down, something it attributed to support for more languages (English, Spanish, and Burmese). Enforcement against organised hate group content fell (4.7 million to 4 million) while that against terrorism content rose (6.3 million to 8.7 million).

Curiously, the amount of content that was later restored without an appeal after being removed under the anti-organised hate and terrorism rules skyrocketed in Q2; Facebook restored 135,000 posts in the first category and 533,000 in the second. It doesn’t appear that Facebook processed a single appeal on either category in Q2, suggesting the company’s human moderators have their eyes turned elsewhere. Facebook does not release the internal data which may show how prevalent hate speech or organised hate groups are on the site.

Keep in mind that this is all according to Facebook, which has recently faced accusations it turns a blind eye to rule violations that are politically inconvenient as well as an employee walkout and advertiser boycott pressuring the company to do more about hate speech and misinformation. By definition, the report only shows the prohibited content that Facebook is already aware of. Independent assessments of the company’s handling of issues like hate speech haven’t always reflected Facebook’s insistence that progress is being made.

A civil rights audit released in July 2020 found that the company had failed to build a civil rights infrastructure and made “vexing and heartbreaking” choices that have actively caused “significant setbacks for civil rights.” A United Nations report in 2019 assessed Facebook’s reaction to accusations of complicity in the genocide of the Rohingya people in Myanmar as slow and subpar, in particular calling out the company for not doing enough to remove racist content on the site quickly or prevent it from being uploaded in the first place. (It’s possible that some of the surge in hate speech enforcement on Facebook is due to the introduction of more tools to detect such content in Burmese, the majority language of Myanmar.)

It remains broadly unclear just how well Facebook’s AI tools are doing their job. Independent researchers, including Seattle University associate professor Caitlin Carlson, have questioned the company’s enforcement figures, and outside reports have found the site awash in coronavirus misinformation this year.

Many of Facebook’s moderators have begun to return to work. According to VentureBeat, Facebook said it is working out how its metrics can be audited “most effectively” and is calling for an external, independent audit of Community Standards Enforcement Report data, which it expects to begin in 2021.

