Isn't Facebook great? (It's not.) But isn't it nice and clean and kid-friendly? That's true for a very specific reason: the social media giant outsources the gnarly task of finding and deleting inappropriate content. In the November issue of Wired, Adrian Chen offers a peek into the darkest corners of the industry. It's only slightly horrifying.
It's not just Facebook, of course. Pretty much any social media site you can think of uses some sort of moderation to keep abusive content off its pages. Chen specifically visited the offices of a company in the Philippines that handles moderation for Whisper, the not-so-anonymous secret-sharing app. There, contract workers who likely make less in a day than you do in an hour spend their shifts looking at images of everything from child abuse and bestiality to brutal violence. The work takes a toll on content moderators — of whom there are an estimated 100,000 worldwide.
The moderator interviewed in the piece left his job not long after Chen's visit, and apparently the average tenure for content moderators is between three and six months. It's sort of incredible that some workers last even that long. One of Chen's sources earned roughly $US300 a month.