Moving into 2021 and beyond, conservatives angry about cancel culture, censorship, shadowbans, or the attention of the FBI have a rich array of social destinations to choose from. We’ve prepped a travel guide for the curious observer thinking of checking out any of these conspicuous and lesser-known internet hellholes — whether to keep an eye on what the far right is up to, or simply to learn exactly why you shouldn’t be going to these places.
Donald Trump and the Republican media ecosystem spent the last few years building an elaborate fantasy world for his supporters. They insisted, at every turn, that any unflattering portrayal of his unpopular administration was the product of a liberal media establishment staffed by socialist journalists and amplified by Silicon Valley tech companies angling to take him down.
A wide array of alternative social media sites cropped up to cater to right-wingers convinced that Facebook and Twitter were censoring them, despite all evidence indicating otherwise. They also cater to far-right groups ranging from fascists and white supremacists to QAnon truthers whom mainstream sites actually had been, with varying levels of commitment or success, trying to rid themselves of.
The riot in D.C. on January 6, when a mob of pro-Trump supporters stormed the U.S. Capitol trying to overturn the results of the election, resulted in a wave of platform bans targeting the perpetrators and Trump himself. This fuelled a sense of urgency among conservatives that their days on Facebook, Twitter, YouTube, and other sites were numbered. So here are some of the sites, platforms, and apps where they might set up shop in 2021, whether as a forever home or just a pit stop on a never-ending ride out into the fringe.
Trump dedicated what counted, for him, as considerable time, effort, and energy to indoctrinating supporters with the idea that tech companies are hunting down and eliminating conservative accounts like it’s The Most Dangerous Game. Parler, which is sort of like if Facebook and Twitter were around in 1939 and allied with the Axis, was the primary beneficiary of this conspiracy theory — at least until its role in the Capitol fiasco saw it stabbed in the back by Amazon, Google, and Apple, which collectively trashed the app by killing its hosting contract and app store access in January.
Parler launched in 2018. But in the days after the November 2020 elections, Parler leapt to top spots on the App Store and Play Store, surging to over 10 million users in a very short period of time. That’s in large part because conservative media personalities with huge audiences, including pundit Dan Bongino, numerous Fox News hosts such as Maria Bartiromo, former Trump campaign official Brad Parscale, former Turning Point USA comms director and Hitler endorser Candace Owens, radio host Mark Levin, and a number of GOP members of Congress had been urging their followers to #WalkAway and set up shop there.
This is the Parler I remember. pic.twitter.com/GptC352CWW
— PatriotTakes (@patriottakes) February 16, 2021
Parler managed to maintain the outward appearance of being one of the most mainstream of the alternative sites on this list — an extremely low standard — as it was flooded with conservative celebrities and hadn’t yet been implicated in any horrifying acts of violence. Rank-and-file Republicans may have been attracted to Parler by its promise of a near-moderation-free environment beyond the influence of effete tech titans. But so were neo-fascist street-brawling groups like the Proud Boys, racists and anti-Semites, grifters, people posing as senators to sell CBD oil, porn spammers, campaigns begging for money, and disinformation purveyors (some from Macedonia), who thanks to those same policies were all able to rub shoulders with the normies in the endless feedback loop they’d always dreamed of. Now-former CEO John Matze said in an interview that “community jury” groups handled most moderation, which sort of helps explain why the moderation sucked.
If this sounds like absolute hell, that is probably a positive reflection on your mental health. Well before the January 6 riot at the Capitol, where much of the crowd consisted of Parler users live-streaming their crimes, it was clear that was exactly where the site was headed.
At least several users of the far-right social network Parler appear to be among the horde of rioters that managed to penetrate deep inside the U.S. Capitol building and into areas normally restricted to the public, according to GPS metadata linked to videos posted to the platform the day of…
“Parler is a mix of hard-right extremists, right-wing influencers, and mainstream conservatives who feel they’ve been personally abused by Silicon Valley,” Cassie Miller, a Southern Poverty Law Center senior research analyst, told Gizmodo in December. “It acts largely as a pro-Trump echo chamber and amplifier for misinformation. It will likely contribute to an even greater fracturing of our information system, which we know has immense consequences for elections and the larger political process. For example, the notion that the country is inevitably heading toward civil war is pretty pervasive on the platform.”
Miller told Gizmodo that the Proud Boys, which had been staging brawls in the streets of D.C. for months, used Facebook for recruitment until they were pushed off in 2018. She added Parler had “largely solved that problem for them, and it now acts as their main platform for propaganda and recruitment.” A half-dozen Proud Boys have since been arrested for their alleged role in instigating and carrying out the riot.
There were a number of reasons to be sceptical that Parler’s success would last through 2021. Few, if any, of its celebrity proponents actually deleted their accounts on Facebook, Instagram, Twitter, YouTube, or what have you, because they were not actually being censored there. Parler’s target demographic included droves of trolls, arseholes, racists, and other unpleasant people whose activities online tended to be centred around trying to piss off liberals, leftists, and minority groups, almost none of whom were actually on Parler to hold their attention. The site hadn’t demonstrated that it was anything more than a fad driven by feverish rhetoric from conservative media that would drop off as soon as they moved on to some other bogeyman.
For a blessed few weeks, Parler’s blacklisting by Amazon, Apple, and Google seemed like it might mean the app wouldn’t come back anytime soon, or possibly ever. The social media service spent most of that time helplessly petitioning the courts to intervene and restore its service, and for weeks the only sign of actual business operations was a “Technical Difficulties” page that listed letters of support from such luminaries as Sean Hannity. CEO John Matze got fired in some kind of power struggle over moderation policies.
Unfortunately, Parler is back, baby, with a new web host that seems to believe something will turn out differently this time. New safety measures the company announced on Feb. 15 included a “privacy-preserving” algorithm to identify threats or incitement to violence, a “trolling filter” to hide potentially bigoted posts, and a ban on attempts to use the site to commit a crime. Seeing as that’s pretty much the bulk of Parler, one wonders how stringently the new restrictions can possibly be enforced.
“The fact that Parler’s interruption in service was only temporary tells us something about where tech is going,” Miller told Gizmodo this week. “We are going to continue to see a growing number of platforms that are looking to cater specifically to right-wing and extremist users, as well as infrastructure to support them. This is going to have a major impact on the information landscape and is something we’ll increasingly have to take into consideration as we try to tackle problems like disinformation and political polarisation.”
Parler was so desperate to have Trump sign up that it reportedly tried to negotiate an equity deal with the Trump Organization while he was still in office, something that could be viewed as an, uh, bribe. Trump had reportedly been toying with joining the site, possibly under the moniker — we shit you not — “Person X.” He’s also reportedly had so little idea what to do without his Facebook and Twitter access that he’s spending a lot of his time suggesting tweets to those aides around him who remain unbanned.
This leaves open the possibility that Trump could still decide to make Parler his own little post-presidential posting palace. Suffice it to say that would be nice for him.
MeWe was created by Mark Weinstein, a tech entrepreneur behind such previous greatest hits as the short-lived SuperFriends.com and SuperFamily.com, early social networks that lasted just a few years, from 1998 into the early 2000s. It bills itself as a privacy-focused, subscription-based “anti-Facebook.” Its primary selling point to conservatives, however, is its promise that it has “absolutely no political agenda and no one can pay us to target you with theirs.”
MeWe has millions of users, who are subject to a fairly long list of rules. But in practice, a Rolling Stone report in 2019 found, its primary draw appears to be users fleeing either bans on Facebook or paranoia that one is forthcoming. Its policy of not intervening against dishonest, hoax, or factually incorrect content had made it a landing spot for anti-Semites, mass shooting deniers, and other conspiracy theorists who are apparently largely free to run wild because of the site’s narrow definition of hateful speech.
Other groups that have migrated to MeWe include anti-vaxxers who feel suppressed by Facebook. In 2020, according to Business Insider, it became one of the staging areas for right-wingers organising anti-lockdown protests during the novel coronavirus pandemic, who created numerous groups and flooded feeds with recruitment messages.
Weinstein suggested to Rolling Stone that because MeWe does not allow advertisers to promote or boost content, that effectively eliminates any concern about groups boosting hoaxes and propaganda because “I have to go find those groups and I have to join them. They can’t find me.” He later penned a Medium post demanding the retraction of the Rolling Stone article, stating the site’s terms of service clearly state “haters, bullies, lawbreakers, and people promoting threats and violence are not welcome.”
As Mashable noted, MeWe also appears to be inflating the perception of how busy it is by creating dummy profiles for everyone from Donald Trump to the New York Times and then auto-populating them with content posted by those individuals or organisations on other sites.
(The site was originally named Sgrouples — like “scruples,” Weinstein said in an October interview — but, as with Parler, the name didn’t stick because users kept mispronouncing it.)
“MeWe — ugh,” Elon University professor and online extremism expert Megan Squire told Gizmodo. “MeWe reminds me of what would happen if MySpace and the ‘blink’ HTML tag had a baby. Users who try MeWe after being on Facebook complain that it is horribly designed, very ugly, hard to use, and feels frantic with chat messages popping up everywhere. Probably the most notable groups that moved to MeWe in 2020 were the Boogaloo-style groups that had been removed from Facebook and other platforms.” (Boogaloo refers to loosely affiliated groups of internet denizens who figure the country is probably headed towards a second civil war — including far-right militia orgs particularly eager for it to hurry up and start already.) Squire added that those groups and others had “struggled” to build audiences on MeWe.
“Their exodus looked very similar to when the Proud Boys did the same thing back in 2018, when they were first banned from Facebook,” Squire added. “Once on MeWe, both groups struggled to re-build the numbers they’d seen on Facebook, and many members of these groups left for other platforms.”
Jared Holt, visiting research fellow at The Atlantic Council’s Digital Forensic Research Lab, told Gizmodo he didn’t think MeWe had what it takes to compete for the hearts and minds of right-wingers.
“I use MeWe for research because it currently homes the remnants of a fair amount of banned Facebook groups and pages that belonged to militia, QAnon, and ‘Boogaloo’ movement figures,” Holt wrote. “The site gives its users a lot of control over privacy, which likely contributes to its appeal for some of those groups. Each MeWe group has a wall that users can post to — like Facebook — but MeWe groups also have a simultaneous group chat function. Those group chats are often chaotic and can be steered in some very strange directions depending on who is active in the conversation at any given moment in time.”
“Though some extremist groups are camping out on MeWe, I don’t see this platform capturing the attention of broader right-wing internet users in a way like Parler has,” Holt added. “Because of its privacy design, the platform can be a bit hard to grasp for users who don’t already know of specific people or types of groups they want to find. It has some territory carved out among awfully specific parts of the right-wing internet, but it’s hard for me to imagine this will become the next big conservative stomping ground.”
To give MeWe some credit, however, its default avatars — smiling cartoons of bread — are pretty cute.
Gab was founded in 2016 by the thoroughly unpleasant pro-Trump figure Andrew Torba, who was banned from seed money accelerator Y Combinator that same year “for speaking in a threatening, harassing way toward other YC founders,” according to YC via BuzzFeed. (Torba’s outbursts allegedly include telling YC founders to “fuck off” and “take your morally superior, elitist, virtue signalling bullshit and shove it.”) Since then, it’s become one of the primary dumping grounds for explicitly fascist and white supremacist posters who got tired of creating yet another Twitter alt.
The site likes to market itself, unconvincingly, as one of the last refuges of free speech on the internet in the face of Big Tech censorship, rather than a congregation of various sociopaths. Following a series of neo-Nazi terror attacks in Charlottesville, Virginia, and Pittsburgh, Pennsylvania — the latter of which was committed by a Gab user — the site was forced off the App Store, Play Store, cloud host Joyent, payment processors PayPal and Stripe, domain registrar GoDaddy, and various other services. In 2020, its alternative registrar, Epik, was banned by PayPal for running a suspicious “alternative currency.”
Suffice to say that Gab has a far more toxic reputation than, say, Parler. Mashable reported this year that analysts at a Florida police fusion centre had warned participating agencies that its new encrypted chat service, Gab Chat, was likely to become a “viable alternative” for “White Racially Motivated Violent Extremists” leaving Discord, a gaming-focused chat app that had a reputation for being overrun with Nazis during its years of explosive growth.
Gab remains a “prominent organising space for far-right extremists,” Michael Hayden, a senior investigative reporter at the Southern Poverty Law Center, told Gizmodo. “While interest in Gab has declined since the site became so closely associated with the terror attack at Tree of Life synagogue in Pittsburgh in 2018, [Torba] has made a big push to bring in QAnon adherents who have been suspended elsewhere.”
The site provides “the type of infrastructure hateful, terroristic people need to organise mayhem,” Hayden added.
Torba has been telling anyone who will listen that Gab usership has surged as aggrieved right-wingers look for a post-Parler home, specifically claiming that, as of early January, it had 3.4 million users signed up. None of these figures are to be trusted, Hayden said, noting that an engineer for web host Sibyl System Ltd. had told the SPLC in 2019 that Gab’s quoted figure of 800,000 users at the time was not backed up by its usage statistics. Instead, the engineer said Gab’s usership was “a few thousand or a few tens of thousands.”
“It’s extremely difficult to get an accurate accounting of Gab’s real user numbers due to the degree to which the site is inflated with what look very much like inactive if not openly fake accounts,” Hayden told Gizmodo.
8kun originally launched in November 2019 as a rebrand of 8chan, an image board that was itself founded as a “free-speech” alternative to internet troll-hub 4chan. 8chan was knocked off the web after it was deplatformed by numerous internet companies and hit with DDoS attacks after its /pol/ board, a hub for right-wing extremists flooded with hate speech, was implicated in several mass shootings by white supremacist terrorists in Christchurch, New Zealand; Poway, California; and El Paso, Texas. The perpetrators of those attacks, where a cumulative 75 people died and 66 others were injured, had all posted manifestos to 8chan before the attacks.
Its owner, Philippines-based pig farmer Jim Watkins, was forced to testify before Congress and gave no indication he planned to change a thing.
8kun is also where “Q,” the unknown individual or individuals who started the QAnon movement, has continued the hoax after 8chan went offline. Watkins and his son, (ostensibly) former 8kun admin Ron Watkins, heavily promoted QAnon and are widely suspected to either be Q or know their identity.
Q hasn’t posted since Dec. 8, 2020 — though 8kun also served as one of the several venues where Trump supporters rallied each other ahead of the January 6 riots. Trump’s loss, subsequent humiliation in the courts, and failure to stop the Biden inauguration haven’t exactly been great for the conspiracy theory’s brand. The younger Watkins has tried to rebrand himself as an election security expert just in time to score interviews with pro-Trump media boosting ridiculous theories of voter fraud.
Congress is somehow debating whether this insurrection was planned before January 6th, or if it was planned by Trump supporters.
Here's the kind of conversation that was happening on 8kun on 1/5.
"As many Patriots as can be… We will storm the government buildings, kill cops." pic.twitter.com/bKEuJTJaPQ
— Ben Collins (@oneunderscore__) February 23, 2021
I saw this post on 8kun's /qresearch/ warning QAnon dead enders about what they might expect going forward. pic.twitter.com/GS1O8gpDSJ
— Travis View (@travis_view) January 12, 2021
8kun is completely delisted from Google, making it somewhat harder to find for the kind of normies with limited navigational understanding of the internet who flocked to sites like Parler, and it’s been sporadically knocked offline by attackers. While Q posted there, most QAnon aficionados actually followed them through a labyrinth of QAnon promoters, aggregation sites, and screenshots on other social media. All of that means its gravitational draw has been somewhat blunted. (A rogue administrator also deleted its entire /qresearch/ board with no backups available last month, though it was later restored.)
“The 8kun imageboard continues to be driven mostly by Q followers hoping for the anonymous poster’s return,” Julian Feeld, a researcher on conspiracy theories and co-host of the QAnon Anonymous podcast, told Gizmodo. “On the ‘Q Research’ board the usual cauldron of conspiracy theories stirs — ‘anons’ are tracking media reports of famous illnesses, deaths, and suicides to see if ‘the storm’ might still secretly be on track. It feels like they’re trying to stay positive as the days tick on, which is nothing new for them.”
Feeld added that 8kun’s replacement for /pol/, /pnd/, was just as openly extreme but appeared to be slowly fizzling out.
“Meanwhile the ‘Politics, News, Debate’ board is increasingly less active and currently serves as a hub for neo-Nazi propaganda,” Feeld wrote. “So far Jim Watkins has managed to keep the site functioning despite the many public outcries and activists’ efforts to keep it offline.”
Both Watkinses have been suggested as potential targets in the lawsuits being brought by Dominion Voting Systems, a company that is currently suing Trump campaign lawyers Sidney Powell and Rudy Giuliani as well as MyPillow CEO Mike Lindell for billions after they spread hoaxes claiming the company fraudulently flipped the 2020 elections. While that might be too much to hope for, 8kun doesn’t exactly seem to be on the rebound.
DLive is a video site that found an audience with right-wingers banned or demonetised on other sites like YouTube who weren’t keen on the prospect of moving to places like BitChute that explicitly cater to the far-right but offer a limited audience and unwelcome associations. Unlike BitChute, DLive briefly attracted some mainstream talent — video game personality Felix “PewDiePie” Kjellberg, one of the most-viewed streamers on the planet, signed a live-streaming exclusivity deal with the site in April 2019 before going back to YouTube exclusively in May 2020.
DLive, like the other sites on this list, has very lax rules. But it also has distinguishing features: It has an internal economy based on tokens called “lemons,” each worth a fraction of a cent, that runs on a blockchain — the decentralised ledger technology that underpins bitcoin and other cryptocurrencies. Lemons can be purchased with or cashed out for real money and accrued by engaging in activities on the site, effectively gamifying it. DLive is also popular with gamers as a Twitch alternative, giving it access to a more youthful audience.
These factors made DLive an attractive option for extremists to continue making money. Elon University’s Squire recently published research with the SPLC showing some 56 extremist accounts had made a total of $US465,572.43 ($587,971) between April 16 and late October of last year.
“I don’t think there is any real advantage that DLive has compared to any other niche live-streaming site that facilitates donations,” Squire told Gizmodo. “There is nothing particularly ‘fashy’ about the site other than an apparently hands-off management style and a tolerance for hate speech and proximity to younger demographic game streamers. … The biggest advantage DLive has going for it is traditional network effects: like other social media platforms, the more people who use the service, the more valuable it gets.”
“Contrast this to Telegram’s file sharing/encryption/stickers or 8kun’s anonymity or Keybase’s file-sharing/encryption, for instance — these are technical features that drive adoption by extremists,” Squire added. “DLive is just a seemingly-normal platform that is also friendly to white supremacist streamers; it allows them to appear normal as they make money after they’ve been removed from the more mainstream sites.”
Squire’s research showed that over the time period in question, DLive generated $US62,250 ($78,616) for Owen Benjamin, a comedian known for racist and anti-Semitic “jokes”; $US61,650 ($77,858) for white nationalist “Groyper” chud Nick Fuentes; and $US51,500 ($65,039) for Patrick Casey, who used to be a leader of the now-defunct white supremacist group Identity Evropa and its similarly disbanded offshoot, the American Identity Movement. Others making thousands on the site included a prominent Gamergater, a white supremacist media brand, and a pseudonymous contributor to far-right publications. According to an August 2020 Time article, data from Social Blade showed eight out of the 10 highest-earning accounts on DLive were “far-right commentators, white-nationalist extremists or conspiracy theorists.”
But DLive had its own recent day of reckoning after it was highlighted in numerous news reports as playing a role in the Capitol riots — Fuentes, for example, used the site to float the idea of murdering members of Congress and later streamed on DLive from outside the building. Fuentes and Tim “Baked Alaska” Gionet, another far-righter to find a soapbox on DLive, were subsequently banned. Some alt-right streamers on DLive, such as Casey, have taken to telling their audiences that their days using it are numbered.
However, a report by Wired early this month indicated that Casey and other streamers on DLive continued to monetise with Streamlabs and StreamElements, third-party integrations that allow viewers to donate directly to creators (and allow streamers to bypass bans on major payment processors like PayPal). StreamElements told the magazine that it had removed Casey’s account after it reached out for comment, but Wired found that “dozens of Streamlabs and StreamElements accounts attached to white supremacist, far-right, or conspiracy theorist content are still live.”
The “only real actions” DLive has taken, Squire told Gizmodo, were the bans in January, a prohibition on streaming from D.C. implemented late last month, and the demonetisation of accounts with an “X” tag, which is required for political streamers.
“Different streamers have been trying to game the system, for example by taking the X down so they can make money during the stream and then putting it back up and removing their videos,” Squire added. “It’s very tedious. Others are trying to pretend that they are just video game streamers.”
Conservatives are convinced that YouTube, despite playing host to a sprawling network of right-wing commentators and pundits and possibly doing the least of any major social network to fight GOP-friendly misinformation, is secretly conspiring against them. Enter Rumble, which is like YouTube if it was designed by me using WordPress.
Rumble has been around since 2013 and managed to rack up a number of partnerships with companies including MTV, Xbox, Yahoo, and MSN. Per Tech Times, it has a rather confusing array of monetisation options: two require signing ownership rights over to Rumble, while a non-exclusive option caps each video’s earnings at $US500 ($631). Rumble appears to generate a significant amount of its revenue by licensing viral videos, as well as its video player technology. In other words, this is sort of a weird place for conservatives to end up.
Still, Rumble intentionally courted right-wingers as a growth strategy that seems to have paid off — it told the New York Times it had exploded from 60.5 million video views in October to a projected 75 million to 90 million in November. Rumble particularly benefited from the Capitol riots; Axios reported that downloads of its app doubled by the next week.
As of Tuesday afternoon, its “battle leaderboard” was headed by content from Bongino, Donald Trump Jr., far-right filmmaker Dinesh D’Souza, pro-Trump web personalities Diamond & Silk, and radio host Mark Levin. The most-viewed video from the previous week was a video of Trump Jr. arguing the left was “trying to cancel” Senator Ted Cruz for fleeing Texas while freezing weather knocked out electricity statewide, lying that Cruz had no ability to do anything about the situation.
Other top-viewed videos have included a video by “ElectionNightFacts” droning on dead-eyed about allegedly suspicious election results, a crying restaurant owner saying that her business is failing while Los Angeles authorities allowed movie sets to remain open, and numerous re-uploads of Trump speeches.
Of the 50 most-viewed videos of the last week, all but five videos (four videos in French from a Quebec-focused site and an aggregated news roundup) were viral fodder for right-wingers. Much of it was either reuploads of videos that could be found elsewhere, such as clips of Bongino’s show, videos from Trump Jr., or just clips taken from networks like CNN or C-SPAN coupled with angry or exaggerated captions.
Slate noted that in addition to a slew of content spreading conspiracy theories that the “deep state” had stolen the election from Trump, QAnon content and videos pushing a nonexistent link between vaccines and autism were gaining a large audience through Rumble. A search of the site shows that while many conservatives on Rumble were criticising QAnon, videos promoting or covering the conspiracy theory were still widely posted.
CEO Chris Pavlovski told the Washington Post that while the site has rules against obscene content and certain categories of content like videos showing how to make weapons, he views his approach to moderation as akin to bigger tech companies’ a decade ago.
“We don’t get involved in political debates or opinions. We’re an open platform,” Pavlovski said. “We don’t get involved in scientific opinions; we don’t have the expertise to do that and we don’t want to do that.”
The Post reported that Rumble was heavily reliant on traffic from Parler, with Pavlovski telling the paper more of its traffic clicked over from there than Facebook or Twitter. That may leave Rumble in a tough spot, though according to BuzzFeed, Bongino took an equity deal with Rumble to promote it to his followers on Facebook.
Encrypted messaging service Telegram has long been a safe space for various fascists, racists, and quacks, serving as one of their last havens after they were squeezed out of competitors like the chat-server app Discord. Telegram has a far more laissez-faire approach to content moderation and was host to hundreds of white supremacist groups with thousands of members by mid-2020; it also serves as a central hub for fascist groups like the Proud Boys, as well as a remaining outlet for far-right activists like failed congressional candidate Laura Loomer and distant memory Milo Yiannopoulos to reliably stay in contact with supporters.
Of course, Telegram isn’t just used by extremists. It and Signal, another encrypted chat app, have become wildly popular and are used by everyone from random suburbanites to political dissidents. The governments of Russia and Iran took protest movements’ use of Telegram seriously enough to attempt to shut it down (Russia’s attempt backfired big time, with major collateral damage to unrelated web apps, while Iranians simply dodged the restrictions with VPNs). A Belarusian news organisation based out of Poland, Nexta, has been using Telegram to coordinate protests against dictator Alexander Lukashenko.
Moderation is inherently more complicated on Telegram: it’s privacy-focused, mixes public and private messaging functions, uses various types of encryption, and content flows by in real time. Telegram has shown limited interest in moderating its social networking dimension, and it’s based out of Dubai, insulating it somewhat from the political debates raging around U.S.-based sites. All of these factors have contributed to its popularity with extremists.
“Telegram is the largest safe haven for the most extreme parts of the far-right,” Miller told Gizmodo. “While white power accelerationists were, until relatively recently, largely confined to small, highly vetted forums that had a limited reach, they can now reach far larger audiences on Telegram. There is a large network on Telegram that exists solely to encourage members of the white power movement to commit acts of violence.”
“We’re seeing the white power movement as a whole shift away from formalised groups in favour of small, clandestine terror cells, and Telegram is playing a major role in facilitating that reorganisation,” Miller added.
In 2020, however, Telegram began banning some of the most extreme groups on the site, including a neo-Nazi hub called Terrorwave Refined with thousands of followers, a militant group tied to foreign recruiting for a white supremacist movement fighting in eastern Ukraine, and a Satanist group obsessed with rape. But it’s not clear that Telegram is putting up much more than a token effort in response to media pressure. Terrorwave easily slipped back onto the service under another name. In November 2020, Vice News reported that Telegram didn’t delete a dual English/Russian language channel dedicated to the “scientific purposes” of distributing bomb-making instructions until after it published an article on the topic. While it banned dozens of far-right channels following the Capitol riot, many others continue to operate.
“Telegram’s attempts to ban white supremacist content had little effect on the extremist communities already established on the platform,” Miller told Gizmodo. “Most banned channels simply created backups, and had already used the platform’s export feature to preserve their content. The bans forced extremists to become slightly more agile but, beyond that, had little impact. Telegram continues to be a safe haven for extremists, allowing users to participate in the radical right without ever joining a defined group. More than any other platform, it’s helping to facilitate a shift toward a leaderless resistance model of far-right organising.”
Thinkspot, the site founded by Jordan Peterson, the Canadian clinical psychologist and surrogate dad to a cult-like fanbase of disaffected libertarians and anti-feminists, barely registers a mention on this list. While Peterson founded the site in 2019 in response to a series of Patreon bans on fringe conservatives and commentators sympathetic to the “alt-right,” it’s not a hub of extremism, just pseudointellectual conservative drivel. It is more or less a vanity site designed to facilitate giving Peterson money under the cover story of enabling intellectual discourse banned elsewhere on the web, and it appears to have been largely abandoned after he dropped out of the public eye in 2019 amid a months-long medical crisis.
Peterson announced his return in October but has only mentioned the site on his Twitter feed five times since February 2020. His posts in the past few months have largely been reposts of podcast episodes or YouTube videos with only a few dozen “likes” and the same captions that appear on other sites. On Monday, only a handful of the featured posts seen upon logging into Thinkspot were listed as having more than a hundred views, with the one highest on the “Top Posts” leaderboard having 850 views and eight comments.
“What’s that?” you might ask. “I thought all of these conservatives were fleeing Facebook?”
Well, maybe in the same sense that an angry teenager storms out of the room. Facebook remains the undisputed nerve centre of the right-wing digital ecosystem. Right-wing media companies enjoy massive Facebook empires with staggering user engagement, the site has long been used to coordinate conservative propaganda campaigns, and Facebook executives have long bent over backward not to make policy changes that might piss off Republican politicians.
Just this weekend, BuzzFeed reported that executives including Mark Zuckerberg and the policy team headed by former GOP lobbyist Joel Kaplan had intervened to shield conservative pundits from Facebook’s own moderation team and shut down news feed changes that might anger pundits like Ben Shapiro. Facebook is built on juicing engagement on emotionally stimulating content, which aligns naturally with the rhetorical style of the right, the business incentives of reactionary pundits, and the explosive growth of conspiracy movements like QAnon and “Stop the Steal.”
Facebook is now trying to rid itself of certain kinds of content that have proven particularly PR-hostile, like hate groups, the boogaloo movement, and QAnon, and right-wing extremists have indeed sped up their pattern of migrating to platforms where they are more easily ignored or shielded from scrutiny. It’s also trying to clean up messes of its own making, like its pivot to boosting private groups, which sparked a wave of toxic “civic” groups.
Nothing about the basic pattern has changed, though. Facebook amplifies some type of reactionary mind gruel, ignores that specific strain until its exponential growth blows up in the company’s face, and then promises a quick fix while ignoring some other looming disaster. There’s no reason to expect that will change in the near future, or that conservatives won’t take advantage of it again, and again, and again. Welcome home.