The path to the million-member-strong community that is Reddit’s r/conspiracy is paved with politics, toxicity, and tech culture, a sprawling study has found. From an eight-year sample spanning 2007 to 2015, a team of Australian researchers compared 15,370 r/conspiracy posters to an equal number of users who’d started out posting in similar forums on Reddit, to discover what drives people to conclude that the royal family went on human hunting parties.
Researchers found that conspiracists enjoy the simple pleasures of the average person who lives online: they poke around the porn, tech culture, and music categories. They interact with what the researchers describe as “internet culture” more than most. They also visit fringe places, like a category the researchers labelled “drugs and bitcoin,” for UFO ideation and out-there theories. And they go to the Dark Side, aka “toxic Reddit,” home to such sexist, racist cesspools as r/KotakuInAction and the since-banned forums r/WhiteRights and r/fatpeoplehate.
And they are very, very opinionated on politics, posting more than twice as often in political forums as the control group.
If you’re on any of those subreddits, you’ve probably heard from them a lot. Over eight years, r/conspiracy users posted a median of 743 comments totaling 20,599 words (about as long as a master’s thesis), versus their counterparts, who posted a median of 8,082 words across 300 comments. Subjects of interest: crime, stealing, law, deception, and terrorism. Conspiracy stuff.
But the study found that they tended to fly under the radar on the conventional forums: the tone of their commentary on politics and porn didn’t significantly differ from that of other politics and porn enthusiasts. “The results suggest that many of the clear differences observed between the two groups in the overall language use analysis were likely to have been because of where the r/conspiracy users were posting rather than what they were posting,” the researchers wrote. The team offered the following example comparing two different users:
r/conspiracy user: “Do you really deny that a politician might make decisions, after winning the race, that would help people who funded their campaign (or even to hurt people [who] funded their opponents)? I’m not saying that only rich people win elections, I’m saying that money can corrupt political decision-making.”
matched control user: “The Tea Party movement took off when Glenn Beck began endorsing them. I am sure that MSNBC would cover a thoughtful left-wing counter-movement. What we would really need is enough push to make the movement credible, and then have some attractive faces in the media to promote it.”
It would be difficult to say for certain which of those posts came from a conspiracy theory forum.
But contrary to popular belief, the linguistic analysis found that they’re not necessarily angrier; their comments were rated as “less hostile” than those of their non-conspiratorial peers (though the researchers also point out that Reddit isn’t exactly a great baseline for normal levels of hostility). They do seem lonelier: they talked about friends far less, and r/conspiracy is probably not a great place to make them. Just for reference, if you feel like reaching out, a separate study has suggested that empathy is less effective than rational argument and ridicule.
To be fair, at least one post currently on r/conspiracy has entered the mainstream public discourse and absolutely demands journalistic attention. #Shartgate is real.