Many members of Facebook’s legion of moderators — contractors hired through companies like Accenture, CPL, Hays, and Voxpro — are leaving to work for competitor TikTok, according to an analysis by CNBC.
TikTok has been staffing up its safety team, and according to CNBC, over two dozen former Facebook content moderators now list positions at TikTok on their LinkedIn profiles, suggesting more may be making the transition as well. The company’s director of government relations and public policy in Europe, Theo Bertram, told British officials in September 2020 that TikTok’s trust and safety units now employ more than 1,000 people. Facebook, which as of earlier this year had contracts to employ around 15,000 moderators in the U.S., is a natural recruiting target. Much of TikTok’s safety team works in Dublin, where CNBC reported the company recently announced plans to expand its staffing from 900 to 1,100.
Chris Grey, a former Facebook moderator for CPL, told the network that working there was a “terrible job” and TikTok “looks better,” adding that the pay (around $US15.39 to $US19.24 an hour, depending on shift) wasn’t great for having to deal with around 100 pieces of hateful, racist, or disturbing content a day. Grey is currently leading a lawsuit against Facebook in Ireland over continual exposure to extreme content that allegedly caused widespread psychological distress and trauma in himself and other workers; the lawsuit claims Facebook failed to provide adequate counseling or health resources or to create a safe workplace.
Facebook content moderators have said they face nightmarish conditions at work. The company settled a separate lawsuit brought by moderators in the U.S. who developed PTSD for $US52 million earlier this year. The Irish lawsuit carries much higher legal risk, according to Motherboard, thanks to stricter labour laws in Ireland and the plaintiffs’ intent to compel a sea change in Facebook’s handling of content moderation by forcing the disclosure of its internal data on toxic, violent, illegal, or otherwise objectionable content.
“It could be people being unloaded from a truck somewhere in the Middle East and lined up by a trench and machine gunned or it might be Dave and Dorine have broken up and they’re having a bit of a spat and making claims about who’s a junkie and who is a slut,” Grey told CNBC, adding that Facebook users often wield reporting functions “as a weapon against each other.” TikTok moderators work in-house, and Grey said he was hopeful the company would learn from Facebook’s mistakes and work to create safer conditions for moderators.
“If there’s one company that knows how to ruthlessly poach staff from rivals it’s ByteDance,” social media analyst Matthew Brennan told CNBC. “They won’t think twice about swooping in to take advantage of Facebook’s difficulties. All’s fair in love, war and business.”
That’s not to say that TikTok doesn’t have its own issues. Earlier this year, the Intercept reported that TikTok had instructed moderators to suppress content uploaded by people who appeared to be “ugly,” impoverished, or disabled in order to maintain an “aspirational” image, as well as to censor a wide range of content ideologically offensive to Chinese authorities. TikTok, which is separate from Douyin, ByteDance’s similar app for the Chinese market, has distanced itself from its Chinese ownership and claimed those directives are no longer in effect.
TikTok didn’t return a request for comment from CNBC, but Facebook told the network in a statement: “Our content reviewers play an important role in keeping our platform safe for billions of users. That’s why we ensure our partners provide competitive pay and benefits, high-quality work environments, and the training, coaching, guidance and support necessary to successfully review content.”