For years now, the QAnon conspiracy theory has existed on Facebook’s platforms. But that is set to change. On Wednesday morning the company said QAnon Facebook Pages, Groups and Instagram accounts will now be removed from the platform.
In an unattributed company blog post, Facebook reflected on its previous efforts to battle the conspiracy theory. Back in August the company said it was only removing QAnon content that promoted violence. And, apparently, there was quite a lot of that!
“In the first month, we removed over 1,500 Pages and Groups for QAnon containing discussions of potential violence and over 6,500 Pages and Groups tied to more than 300 Militarized Social Movements. But we believe these efforts need to be strengthened when addressing QAnon,” the statement reads.
Facebook went on to announce new measures that expand upon the above.
“Starting today, we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content,” the blog post says.
“We are starting to enforce this updated policy today and are removing content accordingly, but this work will take time and need to continue in the coming days and weeks. Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports.”
Why this is happening now
So why such a giant shift only weeks after its first effort to remove QAnon content was announced? Well, there are two reasons, according to Facebook.
The first is that beyond violence, QAnon believers were creating “real world harm” through other means. The post gives the example of QAnon believers flooding emergency phone lines during the U.S. wildfires with false accusations against BLM protestors.
The second reason provided is that QAnon believers are constantly evading bans by using codes and changing their messaging. It is easier, according to Facebook, to amputate the limb than to try to cut around the infection.
How will this Facebook ban affect QAnon?
Facebook is perhaps the greatest vector for the QAnon conspiracy theory. While it started on imageboards, it really took off when it made its way off those niche websites and onto mainstream social media platforms.
So it’s likely that a large number of people who are active participants in the conspiracy theory will lose some access to their QAnon pipeline.
Of course, many of them will now seek out other ways of getting it — whether that be Twitter, YouTube, encrypted chat platforms like Telegram or less-known social networks. But still, deplatforming can be successful.
That being said, how the policy will be applied is still unclear. How does Facebook define what an account “representing QAnon” is? How many times do you get to post QAnon content before you’re banned?
But on the other hand, this decision comes years after QAnon was first widely reported to be a major problem. QAnon believers have allegedly killed people. They’re a massive source of misinformation. They harass people en masse. And that’s even before you consider the people who’ve lost spouses, family and friends to the radicalising forces of a Facebook-fuelled conspiracy theory.
While social media-powered conspiracy theories are still relatively new, the impact of QAnon has been clear for a long time now. Facebook’s move is welcome, but it’s too late for the people who have already been impacted.