Fighting Conspiracy Theories Online at Scale


DEEP DOWN THE RABBIT HOLE: THE HARDENED THEORIST

In November 2019, we flew to a remote town in Montana to meet some friends of friends who believe the earth is flat. We met a couple who bonded over conspiracy theories and attend weekly meetups where they discuss and “test” conspiracy theories (for instance, by pointing telescopes at the horizon to try to determine the shape of the earth).

The woman we met with, whom we will call “Jennifer,” grew up with hippie parents who moved the family deep off the grid. Homeschooled through her youth, by her mid-30s Jennifer was living with her parents in a remote corner of Montana, without internet access. It was only through her gig house-sitting that she was able to get online at all—which was how she met “Carl,” her first romantic partner, on an environmentally-themed dating website.

Soon, Carl began sending Jennifer thumb drives full of conspiracy material that she could consume on her home computer. Jennifer mainlined hours and hours of videos; down Carl’s rabbit hole she went, and by the time we met her, her idiosyncratic beliefs were legion. She believed the Earth was flat and that there was a nefarious agenda to mislead us about its true shape. She believed that Hitler was a great man and that the Holocaust didn’t happen. She believed the government, or the cabal controlling it, sprayed mind-weakening chemicals from airplanes. There was hardly a conspiracy theory we had encountered that Jennifer didn’t believe in. Her whole worldview had been reprogrammed, doubly so now that Carl had moved to Montana to be with her. She, Carl, and others in their remote area began hosting a weekly conspiracy meetup that doubled as something of a self-help group.

What we came to sense, meeting with Jennifer and people like her, was that there was no such thing as an innocuous conspiracy theory per se. Any theory could become dangerous or extreme, depending on what other theories it was caught up in. Jennifer’s flat earth belief was intimately tied up with the idea that a cabal of people—likely Jews, she had come to feel—were lying to her about the nature of the world. Given the right opportunity, she said, she would attack a representative of this cabal—in fact, she said, she would consider such an act a form of “self-defense” due to the “scale of the atrocity” this cabal was perpetrating on humankind. No longer did flat earth belief appear to us to be inherently innocuous. Belief in a flat earth, as discussed earlier, is a reflection of how extreme a conspiratorial worldview is. To believe in a flat earth, one must discount whole fields of expertise, such as physics and geography. To believe the flat-earth cover-up could be pulled off, one must also account for all of the instruments and narratives that propagate the round-earth story, such as books taught in school and “captured” professors of astronomy.

We learned something else from meeting hardened theorists like Jennifer: that for people who have reached this stage of conspiracy belief, debunking isn’t the right strategy at all. The notion of debunking presumes a sort of rational, civil debate, where each side shares facts in a sporting fashion, and some victor emerges. But for people like Jennifer, the very notion of what was a “fact” had become subverted. Any mainstream source of information was now reflexively dismissed as lies; if The New York Times (and its happens-to-be-Jewish ownership) toed the party line on the earth being round, could it really be trusted?

Furthermore, coming away from our meeting with Jennifer, we felt that a fundamentally cognitive-and-rational intervention like debunking was likely to fail against someone for whom conspiracy belief appeared to serve an emotional role. Conspiracy belief helped such people make emotional sense of a world where they felt marginalized, disenfranchised, and alone.

The intervention needed to pull such a person away from conspiracy belief is likely multi-faceted, not to mention more intimate, personal, and personalized than social media would currently be able to offer at scale. Jennifer may well pull herself out from her rabbit hole someday, but it will likely take a perfect storm of personal and even societal factors before she is ready to do so.

The question of whether debunking can be scaled is moot for hardened conspiracy theorists like Jennifer: even if it could be scaled, it wouldn’t work on her.

AT THE TOP OF THE RABBIT HOLE: THE BUDDING CONSPIRACY THEORIST

This isn’t to say, though, that debunking has no purpose. Because we did also encounter success stories in the field, showing that with the right interventions, tech platforms can help prevent the spread of misinformation at scale. The key, we came to feel, was to make sure debunking was deployed at the right—early—moment in a budding conspiracist’s journey.

For an example of that, let’s talk about the case of Lois.

When we met her, Lois, who lives outside of San Diego, believed in so-called “chemtrails.” When airplanes fly at high altitude, their exhaust causes condensation in the air—these familiar streaks in the sky are called contrails. But proponents of the “chemtrails” conspiracy theory believe that in many cases, the lines in the sky aren’t just water condensation but rather a nefarious chemical, likely sprayed by the government. In some variants of the theory, it’s all an experiment in climate control. In other variants, the chemicals are poisons that conspirators use to subtly undermine the will of the population (along with fluoride in the water).

When our researcher met Lois at an Italian restaurant in a San Diego strip mall, she explained why she believed in chemtrails. “I’ve seen them!” she said. She explained that back in 2015, her brother, a rancher, had pointed them out to her. Her brother said the government must be spraying poisons to “control the masses.” That struck Lois (a retired and college-educated marketing professional) as a little far-fetched, but she went home and started doing internet searches related to chemtrails. She went to NASA’s website, but couldn’t find anything debunking it. Instead, she eventually landed on a video of a number of pilots and other self-proclaimed experts speaking out at a conference against supposed chemtrails. Persuaded by this parade of seeming experts, she shared the video on Facebook. (For a sense of the theory’s reach on the platform: at one point a chemtrails-themed Facebook Group had over 100,000 members.) Lois even wrote to her senator about chemtrails, but never received a response. By the time we met Lois in the fall of 2019, she was less focused on chemtrails, which had principally been her brother’s concern—and she hoped an investigative journalist would someday expose the truth.

What Lois didn’t know was that in the intervening years, major tech platforms like Google had identified the chemtrails conspiracy theory and had begun to implement policies that had the eventual effect of leading to more fact-based and authoritative content rising to the top of the page of chemtrails searches. As of this writing in late 2020, for instance, if you conduct a YouTube search for “chemtrails,” the first videos that come up are debunking videos rather than conspiracy videos. YouTube has also inserted a box at the top of the search linking to the Encyclopedia Britannica entry for “contrail”; this encyclopedia entry also debunks the chemtrails theory.

This recent change in Google policy allowed for an experiment. Our researcher asked Lois to go home and, over the subsequent week, to re-open her investigation into chemtrails. At the end of the week, we called up Lois. The difference was remarkable. She said, “I found some new articles that debunked it. I’d have to say I’m now leaning towards not thinking chemtrails are real. I don’t think they’re spraying chemicals.” Is there even such a thing as “chemtrails,” as distinct from normal airplane contrails, we asked? “I’m leaning 80-90% no,” Lois concluded.

What Lois’s story demonstrates is that when technology platforms surface the right kinds of debunking content, it can have an effect on people who haven’t yet become deeply attached to a conspiracy theory. In other words, through our ethnography we determined that it does seem possible for tech platforms to do what the debunker Mick West does, at scale.

What distinguished Lois from Jennifer is the relationship between what happens offline and online. A person who is heavily engaged in one sphere but not the other can still decrease, or step back from, a conspiratorial worldview. In Jennifer’s case, her offline and online behaviors are melded together and reinforce each other. Her social engagements with other believers in the library revolve around online content that they dissect together as a group. Her relationship with her boyfriend is founded upon their shared belief in conspiracy theories. Her increasing isolation from her family and from mainstream sources such as Google stems from her belief in conspiracies: her parents can no longer relate to her, and Google cannot be trusted. Lois, on the other hand, entertained the chemtrails theory, but the rest of her life offline is not related to or motivated by a conspiratorial worldview. In fact, her online research was short-lived and kept private between herself and her brother. Lois is not part of a crusade, dedicated to finding a truth in which chemtrails are linked to other conspiracy theories.

This relationship between what happens on- and offline was only discoverable because of our ethnographic engagement with research participants. By visiting participants where they live, we could observe economic changes to their towns and appreciate why conspiracy theories might explain why some companies are so powerful and rich while their main streets are shuttered. We could meet their families, see what it means to live off the grid, and witness the parts of their lives that were not tied to conspiracies: their jobs, market investments, and church involvement. By observing life both online and offline, we could also see what distinguished a budding or “light” conspiracist from a hardened, deeply entrenched and enmeshed one.

We had also demonstrated that treating some theories as harmful, and others as not, ultimately wasn’t the most fruitful way to look at conspiracy theories. More fruitful was to attend to the differences among conspiracy theorists themselves. Those who are newer to a conspiracy theory are the ones most likely to be reachable with facts, and Alphabet’s efforts to counter misinformation will therefore have the greatest impact the further upstream they intervene. In other words, it’s important to catch people at the top of the rabbit hole, before they really fall down it.

CHANGING VIEWS ON CONSPIRACY THEORISTS

We delivered our findings in a set of presentations for stakeholders across Jigsaw, Google, and YouTube in December of 2019. For many of these stakeholders, it was the first time they had encountered in-depth qualitative data about the lived experience of conspiracy theorists. Of course, many employees at Alphabet are highly specialized computer scientists and businesspeople; to bring an ethnographic perspective humanizing this segment of their user base was eye-opening. The study gave teams at Alphabet new language and perspective into how conspiracies work that they would not have had from other approaches.

Not long after the completion of this study, COVID-19 hit. Those who had been briefed on our research into conspiracy theories and their relationship to harm soon had to make difficult and rapid decisions about what sorts of COVID-19 content would and wouldn’t be allowed on Google’s platforms.

By April, YouTube had pulled thousands of conspiracy and misinformation videos related to coronavirus from the platform. It began surfacing an informational panel that linked to national health agencies’ websites—like the CDC in the U.S. It also began aggressively enforcing medical misinformation policies around false COVID-19 cures, and it expanded that policy to bar promoting actions that go against recommendations from national health authorities. This expanded policy led YouTube to swiftly remove conspiratorial posts by Brazilian President Jair Bolsonaro, who had downplayed the virus. This decision was lauded by the business press.

Decision-making at an organization as large as Alphabet is diffuse, and it would be impossible to attribute these decisions to our ethnographic study alone. What we can say with confidence is that our study was a highly relevant and valued input that educated top decision makers about the harm of conspiracies at a crucial moment, as Alphabet faced a flood of COVID-19 conspiracies.

We can be more precise and confident about our impact at Jigsaw itself. Ethnographic research has been a core research stream at Jigsaw since its inception, but now its value is established and appreciated by the organization’s top leadership, who participated in some of the conspiracy theory ethnography themselves. In June and July of 2020, ReD Associates and Jigsaw teamed up again, revisiting about half of our former conspiracy theorists, as well as a cohort of new ones, to learn what conspiracy beliefs they held about the COVID-19 pandemic. This time, we brought senior stakeholders not only from Jigsaw but also from Google/YouTube’s own policy teams into the “field” (redefined as Zoom calls). These senior Google stakeholders told us they hoped that actually meeting conspiracy theorists would humanize and make more visceral their understanding of the population their policies would affect; “I hope to feel I understand these people better than I would just by reading an article,” one said. After their participation in fieldwork, these Alphabet policymakers confirmed to us that the interviews had achieved just that: humanizing an otherwise mysterious community.

APPLICABILITY TO OTHER STUDIES

Based on our research experience, we have identified several lessons that others may wish to apply in their own work.

First, given the sensitive nature of our topic, we adopted a multi-pronged approach to recruitment. We learned after engaging with recruitment agencies that it was too off-putting to recruit explicitly for conspiracy theorists and white nationalists. Instead, we identified a variety of proxies that could help us identify potential research participants. For example, one question in our recruitment screener asked about participants’ news sources, listing a mix of mainstream and conspiracy-specific publications. Once we identified people who consumed conspiracy theory content, we held an initial conversation to gauge their familiarity with the types of conspiracies we were recruiting for and how frequently they engaged with them in their day-to-day lives.

We also conducted our own recruiting. We engaged with people on social media who used hashtags associated with conspiracy theories (e.g. #wwg1wga for QAnon supporters), and approached them for interviews once we had connected. Because some believers in conspiracy theories are suspicious of others, we also relied upon our social networks to connect with friends of friends, or sometimes friends of friends of friends. These warm introductions were a shortcut to trust among the distrustful that would otherwise not have been possible on our timeframe.

Finally, throughout the project researchers worked closely with the client, especially during fieldwork. This is a common practice, but we draw attention to it because traveling to remote locations and conducting interviews together meant that the client already had an understanding of the everyday lives of conspiracy theorists. It was helpful, too, to come to the realization together that our initial hypothesis was wrong. We no longer believed that one theory was inherently more harmful than another. Rather, every type of conspiracy had the potential to become extreme, and once an extreme version was adopted, it often indicated an entire way of seeing the world. We did not have to spend time convincing the client why our fieldwork had overturned this initial assumption and instead could focus on telling a story that humanized conspiracy theorists while expanding our sense of what made them harmful.

We’re optimistic that this more empathetic, holistic understanding of conspiracy theorists will be vital to decision making as these platforms wrestle with refinements to their policies for handling disinformation, misinformation, and conspiracy theories on some of the world’s largest tech properties.

Rebekah Park was a Senior Manager at ReD Associates when this research was conducted. She holds a PhD in Anthropology from UCLA and currently works at Gemic. She also serves as a board member of the Association of Legal and Political Anthropology of the American Anthropological Association.

David Zax is a Senior Consultant at ReD Associates, where he has focused on technology clients for the past three years. Prior to ReD, David was a technology journalist contributing to Fast Company, Wired, The New York Times, and other publications.

Beth Goldberg is a Research Program Manager at Jigsaw, where she oversees research on violent extremism. Jigsaw is a unit within Google that builds technology to tackle global security challenges.
