
  • Written by Colin Klein, Associate Professor of Philosophy, Australian National University

Why do people believe conspiracy theories? Is it because of who they are, what they’ve encountered, or a combination of both?

The answer is important. Belief in conspiracy theories helps fuel climate change denial, anti-vaccination stances, racism, and distrust of the media and science.

In a paper published today, we shed light on the online world of conspiracy theorists by studying a large set of user comments.

Our key findings are that people who eventually engage with conspiracy forums differ from those who don’t in both where and what they post. The patterns of difference suggest they actively seek out sympathetic communities, rather than passively stumbling into problematic beliefs.

Read more: A short history of vaccine objection, vaccine cults and conspiracy theories

We looked at eight years of comments posted on the popular website Reddit, a platform hosting millions of individual forums called subreddits.

Our aim was to find out the main differences between users who post in r/conspiracy (a subreddit dedicated to conspiracy theories) and other Reddit users.

Using a technique called sentiment analysis, we examined what users said, and where they said it, during the months before their first post in r/conspiracy.
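The paper describes the study's own pipeline in detail; as a rough illustration of what comment-level sentiment scoring looks like, the minimal sketch below uses NLTK's VADER analyser. The toolkit choice, field names and example comments are assumptions for illustration, not the study's actual code.

```python
# Illustrative only: score the sentiment of Reddit comments with NLTK's
# VADER analyser. The toolkit and toy data are assumptions, not the
# study's own pipeline.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download
analyser = SentimentIntensityAnalyzer()

comments = [
    {"subreddit": "politics", "body": "They covered this up again, unbelievable."},
    {"subreddit": "aww", "body": "What a lovely photo, thanks for sharing!"},
]

for comment in comments:
    scores = analyser.polarity_scores(comment["body"])
    # 'compound' runs from -1 (very negative) to +1 (very positive)
    print(comment["subreddit"], round(scores["compound"], 3))
```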

We compared these posts to those of other users who started posting on Reddit at the same time, and in the same subreddits, but without going on to post in r/conspiracy.
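As a simplified picture of that matching step, the sketch below pairs each r/conspiracy poster with a control user whose first Reddit post fell in the same month and the same subreddit. The data structures and matching criteria are assumptions, not the study's exact procedure.

```python
# Simplified sketch of matched-pair selection. Each record holds the month
# and subreddit of a user's first post; the criteria and data are assumed.
from collections import defaultdict

conspiracy_users = [
    {"name": "user_a", "first_month": "2015-03", "first_subreddit": "politics"},
]
other_users = [
    {"name": "user_x", "first_month": "2015-03", "first_subreddit": "politics"},
    {"name": "user_y", "first_month": "2016-07", "first_subreddit": "gaming"},
]

# Index candidate controls by (month, subreddit) of their first post.
candidates = defaultdict(list)
for user in other_users:
    candidates[(user["first_month"], user["first_subreddit"])].append(user)

pairs = []
for user in conspiracy_users:
    pool = candidates[(user["first_month"], user["first_subreddit"])]
    if pool:
        pairs.append((user["name"], pool.pop()["name"]))  # use each control once

print(pairs)  # [('user_a', 'user_x')]
```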

We then constructed a network of the subreddits through which r/conspiracy posters travelled. In doing so, we were able to discover how and why they reached their destination.
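One way to picture that construction: treat each user's time-ordered posting history as a path, and count how often users move from one subreddit to the next. The sketch below does this with the networkx library; the data format and library choice are illustrative assumptions, not the study's own code.

```python
# Sketch: build a weighted, directed graph of subreddit-to-subreddit moves
# from time-ordered posting histories. networkx and the toy data are assumed.
import networkx as nx

histories = {
    "user_a": ["funny", "politics", "conspiracy"],
    "user_b": ["gaming", "politics", "conspiracy"],
}

G = nx.DiGraph()
for subreddits in histories.values():
    for src, dst in zip(subreddits, subreddits[1:]):
        if G.has_edge(src, dst):
            G[src][dst]["weight"] += 1
        else:
            G.add_edge(src, dst, weight=1)

# Heavily weighted edges mark common pathways towards r/conspiracy.
for src, dst, data in sorted(G.edges(data=True), key=lambda e: -e[2]["weight"]):
    print(f"{src} -> {dst}: {data['weight']}")
```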

Seeking the like-minded

Our research suggests there is evidence for the “self-selection” of conspiracy theorists. This means users appear to be seeking communities of people who share their views.

Users followed clear pathways to eventually reach r/conspiracy.

For example, these users were over-represented in subreddits focused on politics, drugs and internet culture, and engaged with such topics more often than their matched pairs.

We were also surprised by the diversity of pathways taken to get to r/conspiracy. The users were not as concentrated on one side of the political spectrum as people might expect. Nor did we find more anxiety in their posts, compared with other users.

Our previous research also indicated online conspiracy theorists are more diverse and ordinary than most people assume.

Where do the beliefs come from?

To dig deeper, we examined the interactions between where and what r/conspiracy users posted.

In political subreddits, the language used by r/conspiracy posters and their matched pairs was quite similar. However, in Reddit’s very popular general-purpose subreddits, the linguistic differences between the two groups were striking.

So far, psychologists, sociologists, and philosophers have struggled to find anything distinct about conspiracy believers or their environments.

Social media can play a role in spreading conspiracy theories, but it mostly entrenches beliefs among those who already have them. Thus it can be challenging to measure and understand how conspiracy beliefs arise.

Read more: The internet fuels conspiracy theories – but not in the way you might imagine

Traditional survey and interview approaches don’t always give reliable responses. This is because conspiracy theorists often frame their lives in narratives of conversion and awakening, which can obscure the more complex origins of their beliefs.

Furthermore, as philosopher David Coady pointed out, some conspiracy theories turn out to be true. Insiders do sometimes uncover evidence of malfeasance and cover-ups, as recent debates over the need for whistleblower protections in Australia reflect.

Echo chambers worsen the problem

Philosophical research on online radicalisation has focused on the passive effects of technologies such as recommendation algorithms, and on their role in creating online echo chambers.

Our research instead suggests individuals seem to have a more active role in finding like-minded communities, before their interactions in such communities reinforce their beliefs.

These “person-situation interactions” are clearly important and under-theorised.

As the psychologist David C. Funder puts it:

Individuals do not just passively find themselves in the situations of their lives; they often actively seek and choose them. Thus, while a certain kind of bar may tend to generate a situation that creates fights around closing time, only a certain kind of person will choose to go to that kind of bar in the first place.

We suspect a similar process leads users to conspiracy forums.

A complex web of interactions

Our data indicates that conspiracy beliefs, like most beliefs, are not adopted in a vacuum. They are actively mulled over, discussed, and sought out by agents in a social (and increasingly online) world.

And when forums like 8chan and Stormfront are pushed offline, users often look for other ways to communicate.

These complex interactions are growing in number, and technology can amplify their effects.

YouTube radicalisation, for example, is likely driven by interactions between algorithms and self-selected communities.

When it comes to conspiracy beliefs, more work needs to be done to understand the interplay between a person’s social environment and their information-seeking behaviour.

And this becomes even more pressing as we learn more about the risks that come with conspiracy theorising.

Read more: Why conspiracy theories aren’t harmless fun

Read more: http://theconversation.com/dont-just-blame-echo-chambers-conspiracy-theorists-actively-seek-out-their-online-communities-127119