Nearly all the major technology services we've come to rely on have been negligent in enforcing their own user protection guidelines. Repeated calls to act on the Terms of Service these companies outlined for themselves have gone without any meaningful response, arguably emboldening the worst elements taking root on them — with years of simmering hatred brought to a boil this weekend in Charlottesville.
In the wake of the mayhem last weekend, which resulted in the death of an innocent woman, tech companies seem to be realising the inadequacy of the facile defences they've offered critics. Rather than simply throwing up their hands and arguing that they aren't responsible for what happens on their platforms, companies are finally taking measures to remove some of the most vile, dangerous elements they harbour.
Policies that could have prevented sites like The Daily Stormer from using these services have, in most cases, existed for many years — and even now, the half-hearted purge of a few specific aggressors involved in the Unite the Right rally looks a lot like window-dressing designed to placate public opinion. These are companies that won't — or can't — meaningfully crack down on extremism in broad terms.
So which companies have finally chosen to act this week?
To its credit, Airbnb is one of the few companies that took action before all hell broke loose in Virginia this weekend, opting to ban users it had reason to believe were booking rooms to attend the Unite the Right rally. Though earlier versions of Airbnb's ToS forbade users to "post, upload, publish, submit or transmit any Content that [...] promotes discrimination, bigotry, racism, hatred, harassment or harm against any individual or group," its most current iteration contains the following passage:
you will not and will not assist or enable others to [...] discriminate against or harass anyone on the basis of race, national origin, religion, gender, gender identity, physical or mental disability, medical condition, marital status, age or sexual orientation, or otherwise engage in any abusive or disruptive behaviour
The company, in a statement to Gizmodo, also characterised the lodgings booked for Unite the Right as a breach of its Community Commitment, and said it intends to use preemptive bans in the future to make travel to similar rallies harder.
Cloudflare, the distributed DNS and internet security services company, presents an unusual circumstance. Its Terms of Service include a typical blank-check policy that allows the company to act whenever it sees fit:
You agree that Cloudflare may, under certain circumstances and without prior notice, immediately terminate your Cloudflare account [...] Cause for such termination shall include [...] any use of the Service deemed at Cloudflare's sole discretion to be prohibited.
In an internal email obtained by Gizmodo, company CEO Matthew Prince said he acted against neo-Nazi blog The Daily Stormer because "the people behind the Daily Stormer are arseholes and I'd had enough."
This opens a can of worms: while legally insulated by the blank-check clause in his company's ToS, Matthew Prince could conceivably discontinue services like DDoS protection for any website whose leadership he deems repugnant. In that email Prince calls the move arbitrary, says he thinks it's dangerous, and insists it won't set a precedent — though it's just as easy to speculate that Cloudflare's decision was informed by mounting pressure to exile The Daily Stormer, and that the company doesn't want to be held responsible for vetting its customers more broadly.
Cloudflare appears to still provide its services to the homepage of the Traditionalist Worker Party, a group which the Southern Poverty Law Center describes as "a white nationalist group that... blames Jews for many of the world's problems [and] is intimately allied with neo-Nazi and other hardline racist organisations" and to both 4chan and 8chan, the "politically incorrect" boards of which are infamous for their association with the alt-right and more overt forms of fascism.
Discord — the gaming chat client-cum-AIM for scumbags — has had issues with moderating content before, and has been slow to rectify them. (After Gizmodo reported that the platform had become a Wild West for the dissemination of unsolicited child pornography, company leadership took nearly two months to roll out protections preventing people from receiving illegal content from strangers.)
Despite being one of the most laissez-faire platforms, Discord's terms actually ban a broad range of activity:
[Y]ou agree not to use the Service in order to [...] defame, libel, ridicule, mock, stalk, threaten, harass, intimidate or abuse anyone
Presumably this wording ("anyone") covers people both on and off Discord. Yet white supremacist, ethno-nationalist, and "alt-right" servers flourished — most trafficking in racially and religiously motivated hatred and threats. Some of the most vile servers insist on "vetting" users, often demanding pictures of their hands (to prove they're white) and voice interviews (to prove they're sympathetic to white supremacy or adjacent ideologies).
On Monday, the platform finally decided to begin banning some of its worst users, as well as some of the worst servers, including those dedicated to Richard Spencer and Baked Alaska (both were headlining speakers at Unite the Right). Discord's spokespeople have declined to provide specifics on which servers are being banned, or how many. Currently, remaining white nationalist servers are regrouping and taking measures to avoid future detection.
"You will not post content that: is hate speech, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence," Facebook's ToS commands — but of course, the world's largest social network has always struggled to moderate its sprawling platform, with revenge porn and livestreamed murders and suicides being an upsettingly common occurrence.
In one of his robotic posts that appeal to no one by trying to appeal to everyone, Mark Zuckerberg reaffirmed his position that "there is no place for hate in our community" without specifying how he intends to clean things up. Following last weekend, at least 11 pages appear to have been removed, according to Quartz, including the Facebook page of sobbing neo-Nazi Christopher Cantwell and the Instagram account of ex-Rebel Media blogger Lauren Southern.
The prohibition against posting "hateful" content appeared in Facebook's ToS at least as early as 2005, yet Richard Spencer's National Policy Institute still has an active page on the platform. Then again, it's difficult to expect anything from a company that, until April, denied its role in massively amplifying misinformation and propaganda in the 2016 election.
With two decades under its belt, GoDaddy is the world's largest domain registrar. Like Cloudflare, it has been in the news for severing ties with The Daily Stormer — a site which it defended as recently as early July. Per its current ToS:
You will not use this Site or the Services in a manner (as determined by GoDaddy in its sole and absolute discretion) that [...] Promotes, encourages or engages in terrorism, violence against people, animals, or property
In fact, this is considerably less broad than the oldest version of the registrar's terms available on Archive.org's Wayback Machine, which instead informed customers that
You will not use this Site or the Services found at this Site in a manner (as determined by Go Daddy in its sole and absolute discretion) that [...] Promotes, encourages or engages in defamatory, harassing, abusive or otherwise objectionable behaviour
To the best of our knowledge, the more recent wording was in place when The Daily Stormer first registered with GoDaddy, and the site's registration was never in jeopardy until the company tweeted its decision to disassociate on Sunday night.
GoDaddy continues to host — among other extremist sites — Richard Spencer's altright.com.
Like Airbnb, GoFundMe (and a few other crowdfunding platforms) began shutting down alt-right personalities' accounts ahead of — though sometimes in connection with — Unite the Right. Baked Alaska appears to have been banned on July 28. Kyle Chapman AKA Based Stickman — an alt-right personality best known for snapping a pole over a counter-protester's head and for founding the Fraternal Order of Alt Knights, a militant subset of the Proud Boys — was kicked around May 4. As far back as January 30, Brittany Pettibone — a sympathizer on the periphery of the "alt-right" — had a fund for her podcast removed.
The platform's non-exhaustive list of prohibited content includes:
the promotion of hate, violence, harassment, discrimination, terrorism, or intolerance of any kind relating to race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender or gender identity, or serious disabilities or diseases
Since at least February 7, 2015, site policies have — in different language — disallowed "hurtful or hateful language; violent or hateful material; materials including bigotry, racism, sexism, or profanity." Prior to the recent wave of cancelled accounts, GoFundMe had been used to raise money on behalf of disgraced Breitbart editor/struggling novelist Milo Yiannopoulos, among others.
Multiple funds for the benefit of the driver who mowed down pedestrian counter-protesters this weekend have already been removed.
Following GoDaddy's decision to part ways with The Daily Stormer, publisher Andrew Anglin re-registered the site with Google, which, within an hour or so, told Gizmodo it was "cancelling Daily Stormer's registration with Google Domains for violating our terms of service."
Google, being a massive company with access to excellent lawyers, words a portion of its registrar arm's ToS as follows:
We shall have the right, at our sole discretion and without liability to you or any of your Contacts, to suspend or cancel your domain name and to reveal Registrant and Contact Whois Information [...] to avoid financial loss or legal liability [...] if we believe that you or one of your Contacts is using the Whois Privacy Service to conceal involvement with activities that are: illegal, illicit, misleading, objectionable, harmful, hateful, defamatory, derogatory or bigoted based on racial, ethnic, sexual preference, age, disability or political grounds or that may otherwise cause injury, damage or harm of any kind to any person or entity
Few if any other extremist white nationalist sites seem to have attempted to register domains through Google, likely due to paranoia around potential data collection and/or cooperation with law enforcement. (This fear partly fuels the abortive push by alt-righters to boycott all Google products following the firing of anti-diversity memo scribe James Damore.) However, the wider Alphabet corporate umbrella also includes YouTube.
Fertile soil for recruitment and propaganda, the world's largest video platform has let vlogger personalities proliferate with ease — and largely with impunity. Even as its most-subscribed creator, Felix Kjellberg (AKA PewDiePie), took today to reflect on his complicity in normalising modern Nazism through poorly executed jokes, the company has made no public effort to ban any of the dozens of channels popular within the movement, despite its community guidelines clearly barring hate speech, which it defines as:
content that promotes violence or hatred against individuals or groups based on certain attributes, such as: race or ethnic origin, religion, disability, gender, age, veteran status, sexual orientation/gender identity
Similar to GoFundMe, Patreon — the crowdfunding platform distinguished by providing creators with a recurring monthly allowance — has dipped a toe into cutting off monetisation avenues for far-right figures, including TV KWA in June and, more notably, Lauren Southern in July for what CEO Jack Conte called "manifest observable behaviour": her assistance in a Generation Identity effort to disrupt a Doctors Without Borders search-and-rescue mission in the Mediterranean Sea. (Southern denies direct involvement.)
The platform reserves the right to "terminate your account" if you "break the law or encourage others to break the law," or "harass or bully others, or promote violence or hatred towards others," among other clauses. Like GoDaddy's, the earliest available version of Patreon's ToS is even stricter, and even more specific.
Patreon appears to have allowed a page for Unite the Right organiser Jason Kessler to remain on its platform until at least Saturday morning.
The ubiquitous payment processor — which first made current Trump advisor Peter Thiel and Trump-council abandoner Elon Musk fantastically wealthy and influential in Silicon Valley — has long regulated what users can and can't do on its service. This includes obvious infractions like fraud, and less obvious ones like exchanging money for nude photos or videos between consenting adults, or, as a source alleged to Gizmodo, buying certain unscheduled chemical substances like phenibut.
Starting in May, as Buzzfeed reported, alt-right and anti-immigrant groups — from the aforementioned Kyle Chapman and Generation Identity to sympathizer and pickup artist Roosh V — had their accounts limited or removed entirely.
Makes sense, as the service's Acceptable Use Policy states that:
You may not use the PayPal service for activities that [...] relate to transactions involving [...] the promotion of hate, violence, racial intolerance or the financial exploitation of a crime
That policy has not changed since at least July 2015, but was not enforced to a degree that caused visible blowback until just a few months ago. It's unclear how many extremist groups or figures remain active on PayPal, and due to a piece of legislation tucked into the Housing and Economic Recovery Act, payments received this way may not even need to be reported to the IRS.
Reddit's enormous influence is only matched by its bewildering inability to enforce its user guidelines, or even decide on what such guidelines should be. Incremental removal of extremists started this year with the banning of r/altright, followed on Tuesday by banning r/Physical_Removal, a community which frequently glorified Augusto Pinochet's murder of political dissidents. "We are very clear in our site terms of service that posting content that incites violence will get users banned from Reddit," a spokesperson told Gizmodo.
Those terms of service also prohibit content that "threatens, harasses, or bullies or encourages others to do so," and arguably gathering in a specific community built around deeply racist, sexist, or anti-Semitic views would violate that. Still, repugnant subreddits like r/WhiteRights, r/PussyPass, r/NatSoc, and r/the_donald persist, to the detriment of users, staff, and the already-tarnished reputation of the website.
Of all the measures taken by companies to remove white nationalist content, Spotify's seems the most like a PR grab. The music repository's ToS forbids activity that's "offensive, abusive, defamatory, pornographic, threatening, or obscene," or "is intended to or does harass or bully other users," but considering there's no meaningful community element, its decision to remove 27 Southern Poverty Law Center-identified "hate bands" is welcome but not hugely impactful.
And even in this somewhat hollow gesture, Spotify comes out a laggard: Apple banned twice as many acts associated with white nationalism, three years ago.
Familiar to anyone who has ever listened to a podcast, idiot-proof site-building service Squarespace has, in its Acceptable Use Policy, prohibited users from advocating "bigotry or hatred against any person or group based on their race, ethnicity, nationality, religion, gender, gender identity, sexual preference, age or disability," and a broader clause against "'Hate sites' or content that could be reasonably considered as slanderous or libelous" has been included in the ToS going back to 2004, when the company was founded.
Currently, it still hosts Richard Spencer's Radix Journal and National Policy Institute sites, as well as the homepage of Identity Evropa.
Thirteen years of clear content guidelines, an April Vocativ article specifically about the white supremacist sites using the company's web-building tools, and a change.org petition signed by nearly 56,000 people asking Squarespace to enforce its own policies have all been ineffective. The Outline reported late yesterday that Squarespace intends to "remove a group of sites from our platform," but didn't specify which.
Once called a "honeypot for arseholes" by a former employee, Twitter has uniformly been lousy at keeping the very worst people from using its platform for the very worst reasons. Following Charlottesville, three accounts associated with The Daily Stormer — @dailystormers, @dailystormer and @rudhum — are no more.
Though its rules prohibit broad categories of user behaviour like "violent threats," "harassment," and "hateful content," Twitter pages associated with key players from Unite the Right remain unscathed, including organiser Jason Kessler, who was physically run out of his own press conference by justifiably angry Charlottesville residents.
To the best of our knowledge, Uber has banned exactly one user involved in Unite the Right — James Allsup — after (literal) fellow traveller Baked Alaska tweeted a video of a driver allegedly kicking the pair out of her car. Though at this point it strains belief to think Uber would have the moral high ground on anything, the ride-sharing platform disallows "use of inappropriate and abusive language or gestures" and "any behaviour involving violence, sexual misconduct, harassment, discrimination, or illegal activity." It's unclear what took place before the video was filmed, or why Baked Alaska managed to avoid a similar ban.
Update: A letter sent to drivers and staff states the company will "act swiftly and decisively [...] against discrimination of any kind," and continue to ban users found to be violating those policies.
Photographs from Charlottesville show James Fields Jr, the driver who struck a crowd of counter-protesters in cold blood, holding the insignia of and surrounded by other members of the white supremacist group Vanguard America. Their website — bloodandsoil.org — has been taken down by WordPress.
The closest WordPress comes to a policy on hate speech concerns the posting of "direct and realistic threats of violence," and the company has yet to make a statement regarding its decision.