Fighting Conspiracy Theories Online at Scale


This 2019 project, conducted in the US and the UK, sought to understand which conspiracy theories are harmful and which are benign, with an eye towards finding ways to combat disinformation and extremism. This case study demonstrates how ethnographic methods led to insights on what “triggered” conspiracy belief, the social and emotional roles conspiracy theories played in believers’ lives, and how conspiracy belief was often a reflection of a person’s general sense of societal alienation. We discovered that the extreme version of any conspiracy theory could be harmful. The findings of this project changed how the client—and by extension engineers behind major tech platforms—understood harmful conspiracy-related content, and led to a refinement of the algorithms that define the discoverability of such content. The aim of the project was to scale and amplify, through algorithmic interventions, the work of individual debunkers.

Keywords: Conspiracy theories, fieldwork, engineers

Article citation: 2020 EPIC Proceedings pp 265–278, ISSN 1559-8918, https://www.epicpeople.org/epic


INTRODUCTION

In 2019, Jigsaw, a technology incubator within Google, and ReD Associates, a strategy consultancy, undertook ethnographic research on conspiracy theorists across the United States and the United Kingdom. The project set out with the initial mandate to understand which conspiracy theories are harmful and which are benign, with an eye towards finding ways to combat disinformation and extremism online. Although a small cadre of self-motivated conspiracy theory “debunkers” generates content online, their efforts are insufficient to tackle the proliferation of conspiracy misinformation, some of which motivates serious violence. (The August 2019 El Paso shooting, in which 23 people were killed in a Walmart, was fueled in part by belief in the “white genocide” conspiracy theory.)

In its constant aim to navigate between the Scylla of undue censorship and the Charybdis of permitting harmful speech, Jigsaw (and more broadly, Google) stood to benefit from being able to surgically separate harmful conspiracy content from the harmless, so that only the harmful could be penalized. More generally, in Google’s quest to better understand niches within a user base of over two billion, an ethnography of conspiracy theorists stood to render rich portraits of dimly understood and often reflexively vilified Internet users for those who broadly shape some of the Internet’s most popular services.

This case study demonstrates how ethnographic methods led to insights on what “triggered” conspiracy belief, the social and emotional roles conspiracy theories played in believers’ lives, and how conspiracy belief was often a reflection of a person’s general sense of societal alienation.

We ultimately revised our initial assumption that some conspiracy theories were more harmful than others because they could incite acts of violence, for two reasons. First, we found that any conspiracy theory, if followed to an extreme, could become harmful. Second, we came to believe that the more useful distinction to emerge from our ethnography was not between types of theories, but between types of theorists. At a certain point in our study, we therefore pivoted and sought to identify the types of conspiracy theorists most likely to respond to at-scale technological deterrence strategies.

By focusing on two specific ethnographic encounters, we demonstrate why it is more important to distinguish between types of theorists than between types of conspiracy theories. Our conclusion is that “extreme” theorists cannot be reached by debunking content, because they will not consider factual argumentation at all; their beliefs are visceral and emotionally driven. Debunking content is instead best deployed to people whose conspiracy belief is milder, at a stage where it has not yet become embedded and visceral. In-person ethnography was essential in arriving at this understanding, since the personas and behaviors revealed by our in-person visits often overturned those suggested by a conspiracist’s digital presence.

The findings of this project changed how the client—and by extension engineers behind major tech platforms—understood harmful conspiracy-related content and how to scale efforts to curtail extremism fueled by conspiracy theories.

In this paper, we begin by providing background on debunking as a strategy for dissuading people from conspiracy theories, and by explaining why we took an ethnographic approach. Second, we share two cases from our fieldwork to illustrate our main argument: that the most strategically feasible way of combatting conspiracy theories requires segmenting different types of theorists. Lastly, we discuss how our findings changed the way Jigsaw approached users who consume conspiracy theories online. This case study stands as an example of how ethnographic research on what happens offline helps explain, contradict, and influence what happens online, a perspective necessary for developing and designing online products and for understanding the users themselves.

BACKGROUND

Belief in conspiracy theories, particularly in the U.S., is not new, but the Internet has made it possible to spread fringe beliefs rapidly, widely, and efficiently (Merlan 2019). Conspiracy theories continue to spread online at an alarming rate. Believers in extreme versions of conspiracy theories are sometimes moved to action. For instance, in 2016, a gunman stormed a pizza parlor in Washington, DC, convinced—because of conspiracy theories circulating online—that it was the site of a child sex trafficking ring. Understanding the line between a playful conspiracy theory and one that motivates people to harmful action is crucial. So, too, is understanding what can be done to help debunk conspiracy theories in a way that is persuasive, so that people who start to fall down conspiracy rabbit holes can climb back out.

Jigsaw had been studying misinformation, but conspiracy theories caught their attention as a poorly understood form of misinformation that was closely and repeatedly linked to real-world violence. They began focusing on how conspiracy theories could be so powerful that they motivate real-world action, and on how to deter conspiracists. A popular approach to countering conspiracy theories is debunking. This typically entails engaging others one-on-one in great depth, or sometimes via broadcast, to counter very specific, and often highly technical, arguments. Jigsaw looked for efforts to scale debunking and came across the work of Mick West, an expert conspiracy theory debunker.

Mick West is a successful video game programmer (noted for his role in the popular Tony Hawk skateboarding series) who retired early and became a full-time debunker. He’s the author of a book titled Escaping the Rabbit Hole: How to Debunk Conspiracy Theories Using Facts, Logic, and Respect / A Guide to Helping Friends, Family and Loved Ones. We also consulted the foundational scholarly literature on conspiracy theories, including The Paranoid Style in American Politics by Richard Hofstadter, and Conspiracy Theories by Cass Sunstein and Adrian Vermeule, whose notion of the conspiracy theorists’ “crippled epistemology” we employed in our analysis.

It was Mick West’s book, though, that was our touchstone; in it, he outlines a process of taking seriously the points offered by the believer and offering counterinformation. Rather than being dismissive, he brings a deep sense of empathy, a wealth of knowledge, and tremendous patience and care to each interaction.

Mick West has been debunking conspiracy theories for years, and his deeply empathic approach is an inspiration. He has made dozens of videos on YouTube and runs a forum called Metabunk, where he hosts debates on theories as varied as 9/11, chemtrails, and the notion that the moon landing was a hoax. His book contains several great success stories of people who had been deep down the rabbit hole but were gradually coaxed out.

Still, Mick West is only one man. And even though there are other conspiracy debunkers online, the problem is simply too big for a handful of hobbyist debunkers to make a real dent in it.

A large portion of society believes in a conspiracy theory to some degree. In a 2014 article, “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion,” political scientists Eric Oliver and Thomas Wood found that in any given year, about half of the American public endorses at least one highly dubious conspiracy theory. No matter how popular Mick West’s websites become, his painstakingly personalized approach to debunking simply cannot scale to meet the size of the challenge.

This raises the question: how does one “scale” debunking? And is it even possible, or at least achievable in ways whose costs do not outweigh the benefits?

STUDYING CONSPIRACY THEORISTS

Academic research on conspiracy theories, at least in the Western, English-speaking context, has largely focused on the psychology of individuals who believe in conspiracy theories and why they believe in them (Kluger 2017; Preston 2019; Roose 2019; van Prooijen and van Vugt 2018). Psychological accounts of why certain individuals are motivated to uphold conspiracy theories highlight universal traits that make one receptive to this type of content, or treat belief in conspiracy theories as reflective of other existing psychological dispositions. For instance, a variety of cognitive differences, such as schizotypy, paranoia, and delusional ideation, have been found to increase susceptibility to conspiratorial thinking (Dagnall, Drinkwater, Parker, Denovan, and Parton 2015); for such individuals, conspiracy theories have real explanatory power. Believing in conspiracy theories can also fulfill emotional goals, offering a way to feel good about the world or a sense of control and order amid feelings of powerlessness (Hart 2018; Kluger 2017; Imhoff and Lamberty 2016; Grzesiak-Feldman 2013; Whitson and Galinsky 2008). Social exclusion, too, may lead people toward conspiratorial beliefs because those beliefs provide social meaning and value (Graeupner and Coman 2017), and psychologists have argued that people from low-status groups (with less education and wealth) are more likely to believe in conspiracy theories (Douglas et al. 2019; Freeman and Bentall 2017). Particularly relevant to this study is how one conspiracy theory acts as a gateway to others: once a person accepts one conspiracy theory, he or she is more likely to be receptive to other conspiracy theories (Brotherton, French, and Pickering 2013; Jolley and Douglas 2014; van Prooijen and Douglas 2018).

Beyond psychology, scholars of media studies have examined the role social media has played in spreading conspiracy theories and in forming new types of communities online (Jolley and Douglas 2014; Stempel, Hargrove, and Stempel III 2007; van Prooijen and Jostmann 2013). Although conspiracy theories posted online are in principle accessible to anyone, researchers have found that conspiracy theory content tends to stay within specific communities that are already receptive to it or are actively seeking it out (Douglas et al. 2019). One study found that conspiracy theories about the Zika outbreak spread online not through a central authority but through a series of decentralized networks (Wood 2018), suggesting that people share and consider conspiracy theories outside of, or separate from, “official” stamps of approval or authority figures.

Our research builds upon existing studies in three ways. First, we focused on the role that context, meaning large-scale social, economic, and political factors, plays in shaping conspiratorial worldviews. This is distinct from psychological perspectives primarily focused on the types of cognitive profiles that make one susceptible to conspiracy theories. We probed further into how the environments people were living in were connected to the formation of conspiracy worldviews. For instance, conspiracy theories positing that a small group of elites controls the global economy helped people make sense of their lack of social mobility. Second, in line with research on the instrumental nature of conspiracy theories, our research focused on the generative nature of adopting a conspiratorial worldview. Beyond fulfilling the need for control and power, we explored the social aspects of engaging in a conspiratorial worldview, including making friends, having a sense of purpose, and feeling excitement when theorizing with others. Third, we investigated the relationship between what people do online and what they do offline. We traveled to meet theorists in person to gain a wider view of their everyday lives in their homes and workplaces. We sought to understand how people go from consuming conspiracy theory content to acting upon it. We define “acting upon it” as everything from forwarding a website to others, to liking a post, to meeting other believers in the local library, to openly considering (even if only on theoretical grounds) committing acts of harm upon the imagined conspirators.

The goal of this study was to learn whether a relationship exists between types of conspiracy theories and potential for harm. Although there is a correlation between extremism and belief in conspiracy theories, believing in conspiracy theories does not automatically lead to extreme or violent behavior (Bartlett and Miller 2010). We conceptualize harm broadly, to include consequences for believers’ personal relationships (e.g., estranged parents), health (e.g., refusing cancer treatment), and social status (e.g., being outed as a white nationalist). We also define harm in terms of actions taken against others whom believers blame for perpetuating or benefiting from conspiracies, such as immigrants. In addition, we considered, but did not directly investigate, the harm that belief in conspiracy theories does to faith in governments and institutions (Coaston 2018).

METHODOLOGY

To better understand how to stem the tide of false and potentially harmful conspiracy theories, Jigsaw had to better understand how people came to hold a conspiratorial worldview, what conspiracy thinking does for them and their lives, and what people do, if anything at all, with their beliefs. We wanted to understand how conspiracy theories fit into their overall life and what role conspiracy theories played in motivating other actions, offline. While it is an important area of study to understand the psychological factors that explain how a person even comes to believe in a conspiracy theory, we were more focused on tracing life histories and identifying any patterns between belief and action, specifically with an eye toward harms that are linked to believing in conspiracies. Rather than investigating levels of education and intelligence or cognitive deficits, we wanted to understand contextual, circumstantial, and personal factors that led someone down the rabbit hole, as well as what factors kept them from falling deeper. What role did the people around them, or life events, play in upholding or backing away from theories?

To answer these questions, our team of five researchers conducted in-person, in-depth interviews with 42 conspiracy theorists across the US and UK, as well as expert interviews with academics and journalists investigating conspiracy theories. In accordance with our initial hypothesis that some conspiracies were harmful and others innocuous, we recruited respondents across three different conspiracies: two theories we believed could be tied to real world harm and a third “control group” theory we believed was likely to be harmless. In the “believed harmful” camp were theorists who believed in “false flag” events (the notion that, for instance, mass shootings have been staged—which has been linked to harassment), as well as believers in “white genocide” (the notion that immigration trends indicate a deliberate plot to eliminate whites—which has been linked to mass shootings). In our “believed harmless” camp were believers in various science-related conspiracies (e.g. chemtrails, flat earth).

We used websites like 4chan and Twitter as starting points for recruitment and observations. By searching for the term “white genocide,” for instance, or a related term called “the Kalergi Plan,” we were able to follow, converse with, and ultimately recruit participants over Twitter. We also used surveys with questions designed to screen for conspiracy belief, and we drew from our personal networks as well. Since many conspiracy theorists are of course skeptical of strangers, going through intermediaries was often helpful.
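As a minimal, purely illustrative sketch of this kind of keyword-based screening (the post data, seed terms, and helper names below are invented for this example; in practice we searched and conversed on the platforms directly rather than scripting collection):

```python
# Hypothetical illustration of keyword-based screening for recruitment.
# The posts, terms, and field names are invented for this sketch; the
# actual recruitment was done by searching and engaging on the
# platforms manually.

SEED_TERMS = {"white genocide", "kalergi plan"}

def matches_seed_terms(text: str) -> bool:
    """Return True if a post mentions any seed term (case-insensitive)."""
    lowered = text.lower()
    return any(term in lowered for term in SEED_TERMS)

def candidate_authors(posts: list[dict]) -> set[str]:
    """Collect unique authors whose posts mention a seed term."""
    return {p["author"] for p in posts if matches_seed_terms(p["text"])}

# Example usage with invented data:
posts = [
    {"author": "@user_a", "text": "Look up the Kalergi Plan, it explains everything."},
    {"author": "@user_b", "text": "Great skateboarding video!"},
]
print(candidate_authors(posts))  # {'@user_a'}
```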

Prior to conducting research, the team familiarized themselves with media coverage and scholarship on conspiracy theories and interviewed experts. We actively cultivated an open mind and took particular care in choosing what language to use and how to present ourselves in ways that would not cost us credibility (e.g., showing our familiarity with theories online rather than boasting about academic credentials). We consciously avoided the phrase “conspiracy theories” because of its negative connotations, speaking instead of “alternative narratives,” “research,” and “truth.” Our approach was to present ourselves honestly as former academics and journalists; we offered our sincere interest in understanding and listening to their points of view, and asked them to guide us through the websites, videos, and channels they used as sources of information. We also offered to meet in public places and did not record or take pictures unless permission was granted.

It should be noted that, in most respects, the people we met did not strike us as fundamentally different from the types of people we usually meet in other studies. Our research participants represented a variety of fields, including teaching, technology, construction, and healthcare. Building rapport with them was similar to any other interaction we have in the field, though with more awareness around language, more active empathy, and care not to draw suspicion with recording devices.

Ultimately, we made in-person visits, each lasting several hours, in or around people’s homes. (One of our researchers also explored the lighter side of conspiracy culture by attending the “Storm Area 51” event held in October.) The insights gathered through fieldwork were analyzed alongside books grappling with the history and sociology of conspiracy theories, including Kill All Normies, Fantasyland, and Republic of Lies: American Conspiracy Theorists and Their Surprising Rise to Power.

OUR FINDINGS: DISTINGUISH BETWEEN THE THEORISTS, NOT THE THEORIES

As discussed, our initial hypothesis was that certain theories are more extreme or harmful than others. In other words, we assumed a person’s likelihood of committing harm was related to the type of conspiracy they believed. We assumed that pseudo-scientific conspiracy theories like flat earth or chemtrails—the belief that the government is spraying mind-controlling chemicals from planes—were relatively innocuous, while racially tinged theories like white genocide were perhaps dangerous by definition.

But what we found surprised us. We learned that it was less important to distinguish between theories, and more important to distinguish between theorists. What matters is how much of a person’s life is taken over by a conspiratorial worldview. If everything is part of the conspiracy, a person can no longer trust anything or anyone. An extreme conspiratorial worldview frames the elite “they” as powerful and as the enemy. It is thus not surprising that some studies have found belief in conspiracy theories to be a predictor of having committed a crime, or of saying one would commit a crime (McNamara 2019; Herrema 2019). In our own study, believers in extreme versions of conspiracy theories justified killing a conspirator, if one could be identified, on the grounds that doing so would save others. It was not a particular theory that drove people to action but how deeply a person lived within an extreme conspiratorial worldview.

All conspiracy theories, we came to learn, have the potential to be harmful (more on that in a moment). And among conspiracy theorists, we found a very wide spectrum in how hardened a person’s conspiracy belief is. Knowing this becomes essential when you hope to “debunk at scale.”
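The study itself proposed no specific mechanism, but as a purely hypothetical sketch of why this spectrum matters for an at-scale system, consider a routing rule that surfaces debunking content only to less-entrenched believers. Every signal, threshold, and name below is invented for illustration:

```python
# Purely hypothetical sketch: the signals, thresholds, and class names
# are invented for illustration; the study proposed no specific
# algorithm. The point is only that an at-scale intervention would key
# on how hardened a user's belief is, not on which theory it concerns.

from dataclasses import dataclass

@dataclass
class UserSignals:
    share_of_feed_conspiratorial: float  # 0.0-1.0, hypothetical signal
    engages_with_counterevidence: bool   # hypothetical signal

def should_surface_debunking(user: UserSignals) -> bool:
    """Debunking content is only likely to land with milder believers,
    whose worldview has not yet become embedded and visceral."""
    deeply_entrenched = (
        user.share_of_feed_conspiratorial > 0.8
        and not user.engages_with_counterevidence
    )
    return not deeply_entrenched

# Example: a mild believer vs. a hardened one.
print(should_surface_debunking(UserSignals(0.3, True)))    # True
print(should_surface_debunking(UserSignals(0.95, False)))  # False
```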

To explain why we arrived at this view, we will first discuss a trip the authors made to Montana.
