Content Moderator Exposed to Child Assault and Animal Torture Sues YouTube

Long after report after report indicated that emotional trauma is an occupational hazard of content moderation, YouTube is still hurling low-wage workers into a daily emotional onslaught, allegedly with limited mental health safeguards. In a class action suit filed yesterday, a former moderator claims that a year-and-a-half stint as one such content janitor caused her to develop symptoms of PTSD. The anonymous litigant also claims YouTube failed to follow the workplace safety standards it had helped to draft and undertook an “aggressive campaign” to stifle whistleblowers.

The suit seeks damages for, among other claims, exposing employees to an abnormally dangerous activity.

What the moderator allegedly witnessed is disturbing even to imagine, let alone watch over and over again. She claims she was exposed to “thousands of graphic and objectionable videos,” including:

a smashed open skull with people eating from it; a woman who was kidnapped and beheaded by a cartel; a person’s head being run over by a tank; a man eating the head off a rat; a fox being skinned alive; a man falling to his death off a roof that included audio of the impact of his body hitting the ground; school shootings included dead bodies of children; a politician shooting himself; backyard abortions; child abuse; and child sexual assault.

Images of brain eating, beheading, and child sexual assault were part of a training session, she says.

While these sorts of things are unpleasant to see and, in aggregate, psychologically damaging, the ex-moderator’s suit includes a small reminder that (voluntary) recommendations to increase “employee resilience” do exist. The Technology Coalition, which created these recommendations, is composed of some of the biggest companies in the tech space, including Google itself. And yet, she alleges, these standards — which include such meager safeguards as exposing moderators to “no more than four consecutive hours” of disturbing media or keeping them from seeing child pornography during the hour (yes, just the one) before they end their shift — were not implemented.

According to the suit, few if any of these protections trickled down to contractors, who were frequently reminded by supervisors of expectations to view “between 100 and 300 pieces of content per day with an error rate of two to five per cent.” The suit also claims that moderators stay on only about a year, and that as a result YouTube’s content moderation workforce is “chronically understaffed.”

Even the three-week job training course, where the aforementioned cannibalism and beheadings were allegedly shown, offered scant information about “wellness or resiliency,” according to the suit. As a result, the moderator says she developed a rash of psychological issues:

She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind. She cannot be in crowded places, including concerts and events, because she fears mass shootings. She has severe and debilitating panic attacks. She has lost many friends because of her anxiety around people. She has trouble interacting and being around kids and is now scared to have children.

Rather than invest in these workers’ wellbeing, the suit alleges, YouTube silenced them. Before potential moderators were shown the horrendous footage they could expect to see on an average day, the complaint claims, they were told to sign NDAs. Following a Verge exposé, YouTube allegedly made content moderators sign documents acknowledging that they understand the job can cause PTSD and that “this job is not for everyone.” YouTube also allegedly scrubbed any mention of trauma from its internal message boards.

The suit also argues that YouTube could follow Microsoft’s lead by implementing filters recommended by the National Center for Missing and Exploited Children; Microsoft, it notes, blurs images, changes them to black-and-white, reduces them to thumbnail size, and blurs audio. According to the complaint, content moderators requested on YouTube’s internal messaging board in 2017 that the company blur images and add warning labels to posts flagged as “ultra-graphic,” but Suzanne French, Head of Global Vendor Operations, declined the request, saying the change was not high on the company’s priorities.

Earlier this year, the law firm now representing YouTube’s content moderators won a $US52 ($73) million settlement for Facebook moderators. Because YouTube sources moderators from third-party contractors, the attorneys aren’t sure how many moderators work on behalf of YouTube, but they estimate the number is in the thousands.

Gizmodo has reached out to YouTube and the moderator’s attorneys and will update the post if we hear back.
