In spite of coronavirus cases continuing to climb around the world, Facebook’s legions of contracted content moderators are still required to work out of offices “to maintain Facebook’s profits during the pandemic.” This is according to an open letter published on the company’s internal Workplace communication software, signed by more than 200 content moderators today.
“Workers have asked Facebook leadership, and the leadership of your outsourcing firms like Accenture and CPL, to take urgent steps to protect us and value our work,” the letter reads. “You refused. We are publishing this letter because we are left with no choice.”
Back in March, Facebook CEO Mark Zuckerberg told reporters that the bulk of this workforce would be allowed to work from home until the “public health response has been sufficient.” Apparently, that bar was cleared in mid-October, when Facebook told content moderation teams that they were required to work from their offices again. The Intercept later reported that one contractor working out of an Accenture-owned Facebook facility in Austin, Texas became symptomatic just two days after returning to work, and got a positive test result some days later. Foxglove, the law firm representing these contractors, added in a statement to the New York Times that additional contractors based out of Ireland, Germany, and Poland have tested positive for covid-19.
Naturally, the letter makes some requests for repairing the obviously strained relationship between these contractors and management.
First, they ask that any content moderator who is in a high-risk group, or who lives with someone who is, be allowed to work from home indefinitely, and that, regardless of health status, “work that can be done from home should continue to be done from home.” Right now, the only moderators granted this privilege are those who bring in a doctor’s note proving that they’re at high risk. Even then, the letter claims, this option isn’t offered in some workplaces.
In the call where Zuckerberg originally granted remote work for his company’s contractors, he mentioned that some of the more sensitive — or potentially illegal — subjects like child abuse would be best dealt with in-office for security purposes. The letter agrees that while criminal content should be handled onsite, there’s no reason that the rest of the content moderation team should be roped into doing the same.
The letter also asks that moderators who handle these sorts of high-risk posts receive hazard pay at 1.5 times their usual wage, and that all content moderators be offered “real” health and psychiatric care for the work that they do. Moderators, they argue, deserve “at least” as much mental and physical support as Facebook’s salaried staff.
Perhaps the boldest of the letter’s demands asks Facebook to stop outsourcing its moderation altogether. “There is, if anything, more clamor than ever for aggressive content moderation at Facebook. This requires our work,” the petitioners write. “Facebook should bring the content moderation workforce in house, giving us the same rights and benefits as full Facebook staff.”
As leverage, the contractors point out that Facebook’s prior attempts to quietly moderate its platform with AI-based solutions were abject failures. “Important speech got swept into the maw of the Facebook filter — and risky content, like self-harm, stayed up,” write the moderators, whom this AI solution was not equipped to displace. “Facebook’s algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there.”
It’s worth mentioning that these contractors hardly had it easy in the pre-pandemic era. Reports have described the long hours, low paychecks, and psychological trauma involved in what the letter rightfully calls one of Facebook’s most “brutal” jobs. As the letter explains, the workers monitoring Facebook for child abuse specifically had their workload upped during the pandemic, but were given “no additional support” to help them cope with the heavier load.
Facebook currently boasts a moderation team of around 35,000 people, globally. It’s unclear how many of those Foxglove intends to represent or in what capacity.
We’ve reached out to Facebook and Foxglove for comment and will update if we hear back.