Facebook Sued By Former Content Moderator Over ‘Debilitating PTSD’

Facebook likes to outsource its most egregious issues, and weeding out the platform’s gruesome and graphic content is no exception. On Friday, a former Facebook content moderator sued the social network, alleging that she suffers from psychological trauma as a result of the job.

Selena Scola was a content moderator contracted by Facebook from around June 2017 through March of this year. Scola filed a class-action lawsuit against the company, as first reported by Motherboard, claiming that it fails to provide its contractors, who view a flood of deeply troubling content every day, with the necessary training, safety measures or medical support.

The complaint also states that Scola developed and still suffers from “debilitating PTSD” from her time as a contractor for Facebook.

“Ms Scola’s PTSD symptoms may be triggered when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled,” the lawsuit states. “Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator.”

The class-action lawsuit was filed on behalf of all California residents who have worked as content moderators for Facebook in the last three years. The class will likely cover a large number of people: Facebook has contracted thousands of moderators and plans to double its safety and security team, which includes contractors, to 20,000 by the end of this year.

The lawsuit states that Scola seeks a “Facebook-funded medical monitoring program” that would “include a trust fund to pay for medical monitoring and treatment” whenever necessary for her and any contractors who join the class action, as well as injunctive relief and attorneys’ fees.

“Facebook needs to mitigate the harm to content moderators today and also take care of the people that have already been traumatized,” Steve Williams of the Joseph Saveri Law Firm said in a press release.

This is hardly the first account of a content moderator developing severe mental health issues due to the horrendous nature of the job.

Facebook is one of the largest platforms in the world and is consequently inundated with an overwhelming amount of rule-violating content, but many other social platforms are just as guilty of offloading the task of sifting through it to ill-supported contractors. Supporting the mental health needs of the people doing the dirty work is not asking a lot from companies worth billions.

Facebook’s corporate communications director Bertie Thomson told Gizmodo that the company is “currently reviewing this claim”. Here is the company’s statement in full:

We are currently reviewing this claim. We recognise that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources. Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling – available at the location where the plaintiff worked – and other wellness resources like relaxation areas at many of our larger facilities.

[Motherboard]

