Nearly all the major technology services we've come to rely on have been negligent in enforcing their own user protection guidelines. Repeated calls to act on the Terms of Service these companies outlined for themselves have gone without any meaningful response, arguably emboldening the worst elements taking root on them -- with years of simmering hatred brought to a boil this weekend in Charlottesville.
Back in January, Facebook CEO Mark Zuckerberg said he was "quite proud of the impact that we were able to have on civic discourse", doubling down on his stance that the rise of misinformation, the spread of outright propaganda, and the rapid erosion of trust in the fourth estate were anyone's problems but his. Yet a whitepaper from the world's largest social media platform -- where an estimated 66 per cent of the site's American users get their news -- casually mentions that Facebook is also fertile soil for "subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people".
In February, Gizmodo reported that many Discord users were facing abuse and were given no clear recourse. Among the issues endemic to the voice- and text-chat platform was the anonymous and unsolicited dissemination of child pornography, a problem which seems to have only gotten worse.
Over 25 million users have flocked to Discord, a text and voice platform for gamers, since its launch in May of 2015. Despite raising at least $30 million in venture capital funding, the company has only five "customer experience" personnel and no moderators on its staff. From what I've seen, users who wish to harass others, raid servers or bombard chats with child pornography suffer no lasting repercussions for doing so. That seemingly any server can become the victim of organised attacks reflects the strained and failing infrastructure of moderation -- on Discord, and in virtually any community on the internet.