A Facebook spokesperson told Business Insider that the group, titled “Official Q/QAnon,” was removed on Thursday for “repeatedly posting content that violated our policies.” Those rule violations included bullying and harassment, hate speech, and misinformation that could lead to serious harm. The group had over 200,000 members, making it the second-largest QAnon group on Facebook — where the conspiracy theory has spread like wildfire, largely through hundreds of similar groups still live on the site.
QAnon, which has exploded in popularity in recent years, is simultaneously hopelessly complicated and incredibly simple. Adherents believe that a secret, widespread cabal of power brokers and politicians (including virtually every prominent Democrat) is engaged in human trafficking, pedophilia, ritual abuse of children, and other atrocities. These constitute what they refer to as the “deep state.” That’s a flexible concept and label of inconvenience that encompasses everything from the very real labyrinth of the nation’s foreign policy, intelligence, and security agencies — sometimes referred to as the Blob — to a thinly-veiled rehash of anti-Semitic tropes and New World Order fever dreams.
QAnon is rooted in prior conspiracy theories reaching as far back as the JFK assassination, as well as more contemporary nonsense like Pizzagate, but its unifying focus is a person (or persons) who posts pseudonymously to the image boards 4chan, 8chan, and 8kun under the moniker Q. Q claims to have hard evidence of the deep state cabal, which Trump is secretly battling and plans to wipe out in one strategically brilliant fell swoop — and purports to have gained knowledge of these goings-on via a high-level intelligence or military security clearance. Despite this, he, she, or they mostly post in vague allusions and cryptic riddles. In the self-assured technobabble of the QAnon community, this is justified as a form of deprogramming in which Q is leading them to seek the real answers rather than stating them outright; non-believers might instead see this as a form of narrative arse-covering which allows QAnon devotees to choose their own adventure.
Supporters of the theory have sometimes committed criminal or violent acts, including a man who drove an armoured truck loaded with ammunition onto Hoover Dam in 2018, causing a standoff with police. An FBI intelligence bulletin in 2019 listed QAnon as one of several “anti-government, identity-based, and fringe political conspiracy theories” that could “very likely motivate some domestic extremists” towards criminal activity. Per the Washington Post, QAnon supporters have been arrested in at least nine other incidents, including “two murders, a kidnapping, [and] vandalism of a church.” QAnon devotees are also notorious for online harassment campaigns of random individuals they’ve somehow determined are key figures in their imagined nexus of evil.
The White House, which regularly traffics in its own conspiracy theories (often pulling from similar themes), sees QAnon supporters as a die-hard fan base, and has signalled its support by echoing their rhetoric and sharing content from QAnon accounts. The Trump administration has also posted re-election ads featuring Q iconography. At least 64 QAnon supporters have run or are running for Congress, including dozens in this election cycle.
“We have a current president who uses conspiracy rhetoric arguably more than any other president in modern history,” University of Delaware political psychology researcher and conspiracy theory expert Joanne Miller told the Post. “These people feel emboldened. They feel like their issues are getting addressed — and that is they hate the establishment and want to blow it up. Trump built this coalition with these folks, and they feel like they’re a part of it and this is their time.”
The size of the movement in general is hard to estimate. Polling by the Post has shown most people in the US are either unaware of it or hold very negative opinions of it, while Pew Research Centre polling in March found about 76 per cent of respondents have never heard of it at all. The Pew poll found that respondents who get their news primarily from Reddit, Twitter, and YouTube were far more likely than the general public to know of it, though Facebook users were in line with the national norm.
Facebook is, nonetheless, a major hub for QAnon supporters to interpret Q posts and promote new conspiracy theories. (In many cases, those groups seem to serve as a proxy social outlet for right-wingers who are isolated from their loved ones, possibly because of their beliefs.) According to research by Marc-André Argentino, a Ph.D candidate specializing in online extremism at Concordia University, membership in pro-QAnon Facebook groups has exploded in recent months, fuelled by the novel coronavirus pandemic and a reactionary backlash to widespread protests over police brutality. This is partially caused by a domino effect where the size of Facebook groups translates directly into more followers and engagement, Argentino wrote.
4/ Lets now compare post volume in QAnon groups, which the entire collection has 2.51M posts. Between October 2017 & March 2020 there were 941.1K posts (7410/wk). Since March 2020 there have been 1.57M (66030/wk). pic.twitter.com/qIWL5BkCDz
— Marc-André Argentino (@_MAArgentino) August 3, 2020
7/ In the last 22 weeks QAnon interactions in Facebook groups have ⬆️102%. What is truly staggering is that the weekly interaction rate has ⬆️ 976%. What this means is that more people are interacting with QAnon content in these groups
— Marc-André Argentino (@_MAArgentino) August 3, 2020
In other words, Facebook is still allowing these conspiracy hubs to thrive, just so long as they don’t become too much of a headache — which they inevitably do. The site did delete five QAnon-related groups in May, which had amassed around 133,000 unique members between them, citing “coordinated inauthentic behaviour.” Thursday’s action more directly cited content posted to the group, but will probably result in many of the users simply migrating to other extant QAnon groups.
In July, Twitter announced it would begin cracking down on QAnon-related content, but only when it involved harassment or threats of offline harm. Twitter told Gizmodo it expected the changes to eventually affect up to 150,000 accounts. Reddit, another site where QAnon metastasised, banned major QAnon subreddits nearly two years ago.