What makes someone a conspiracy theorist?

One thing that many people have noted about Donald Trump is that he seems particularly vulnerable to conspiracy theories. It is seldom made clear in these discussions, however, exactly what a “conspiracy theory” is, or what particular mental habits make people vulnerable to them. I thought, therefore, that it might be an opportune time to republish a small excerpt from my Enlightenment 2.0 that attempts to explain this. Basically, a conspiracy theorist is someone who falls victim to confirmation bias. He or she sees a pattern in the world, develops an account of the pattern, but then fails systematically to consider, much less investigate, any evidence that would contradict this account. Instead, he or she simply observes more and more instances of the pattern, treating each one (fallaciously) as more “evidence” of the account.

Conspiracy theorists serve as an excellent example of the phenomenon that Keith Stanovich refers to as dysrationalia. Conspiracy theorists are often people of above-average intelligence — this is what allows them to detect the patterns and to develop the theories that account for them. What they lack is the rational “override” that would lead them to systematically consider potentially contradictory evidence. (Some have speculated, for instance, that the reason so many conspiracy theorists are engineers is that they have a technical education that gives them the capacity to understand and produce sophisticated theories, and yet they lack the discipline of the scientific method, with its focus on falsification.)

For greater detail on the phenomenon of confirmation bias, I highly recommend Raymond Nickerson, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises,” Review of General Psychology (1998).

And now, the excerpt from Enlightenment 2.0 (slightly edited for length):

________________________________

One of the downsides of being a philosopher, I’ve discovered, is that you get a fair amount of unsolicited mail from people in cults, who are hoping to discuss their “philosophy” with you. These days my inbox is cluttered with messages from Falun Dafa (or Falun Gong) – which is a fairly harmless organization, except that they made the mistake of spooking the Chinese government (just by being organized), and now suffer genuine repression. This has probably had the effect of making them more popular than they otherwise would have been. The core ideology is an amalgamation of traditional Buddhist meditative practices with an elaborate alien mind control theory. According to founder Li Hongzhi, the development of science and technology is an alien plot, designed to foment war and destruction among humans so that the aliens can seize our bodies, replace people with aliens, and so on. The details are, quite literally, crazy. But as with all cults, the question is how so many thousands (or in this case, millions) of non-crazy people could come to believe it.

One of the astonishing things about Falun Dafa is just how standard it is, in every aspect. It’s an absolutely generic cult, instantly recognizable as what it is to anyone from anywhere in the world. It’s not “crazy with Chinese characteristics,” it’s just plain old crazy, like the common cold of mental infections. Reading through their literature, you get the sense that someone has been following a universal instruction manual on how to build a cult. How do you attract followers? Option number one: tales of miraculous healing! Sure enough, the other day I received a Falun Dafa email that included this testimonial:

I have witnessed and heard of many stories about the wonders of reading “Falun Dafa is good.” Ms. Liu, a villager in Tangshan City, is sixty-nine years old. Last year she was diagnosed with colon cancer. The doctor told her, “You are elderly and your blood pressure is too low. Quit seeking treatment. Go home and eat whatever you like.” Her third youngest sister practices Falun Gong. She visited her and taught her to practice Dafa by reciting “Falun Dafa is good.” Ten days later, Ms. Liu completely recovered and the late stage colon cancer disappeared. What is the most miraculous thing was that she looked like a young girl and became so beautiful. That’s Falun Dafa’s Divine Power. Master Li is absolutely superior to Jesus Christ! Falun Dafa saves people. Falun Dafa solves everything.

If this sort of thing sounds familiar, that’s because it is. It turns out that there is something like a universal instruction manual for getting people to acquire false beliefs. As Pascal Boyer has shown, some fanciful stories are easier to get people to believe than others. Human reasoning is subject to a number of very specific biases, and to the extent that a belief system is able to exploit these biases, it may be able to successfully reproduce itself. This is why the belief systems of cults, and the “arguments” used to support them, have a certain depressing familiarity. They all occupy the same ecological niche. Like a virus that is able to avoid detection by the immune system, some irrational beliefs are able to withstand whatever scrutiny most people are able to bring to bear upon them, because they exploit certain characteristic flaws in our reasoning…

Consider the story of Ms. Liu from Tangshan City. There’s a lot going on in this email, but the most prominent feature of the story is that it exploits what psychologists refer to as “confirmation bias.” When considering an hypothesis, people have a tendency to look only for positive evidence that is consistent with the hypothesis, while failing to look for evidence that would be inconsistent with it. They fail to “think the negative.” In order to get from Ms. Liu’s recovery to the divine power of Falun Dafa, there are two pieces of information that are missing. First, what would have happened if she hadn’t recited “Falun Dafa is good”? For example, what are the chances that the cancer diagnosis was simply mistaken? Or even if she had cancer, what are the chances of spontaneous remission, or even just of not dying for a long time? Second, how many people suffering from terminal cancer who recite “Falun Dafa is good” fail to recover? Many people don’t ask these questions, simply because it doesn’t occur to them, or because they don’t realize that, without this information, the story provides absolutely no support for the hypothesis. Failure to appreciate this is actually one of the hallmarks of “superstitious” thinking…
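To make the point concrete, here is a toy sketch (in Python, with all of the numbers invented purely for illustration) in which recovery is assumed to happen at some baseline rate whether or not the patient recites anything. Counting only the people who recited and then recovered still produces a steady stream of “miracle” testimonials:

    import random

    # Toy numbers, assumed for illustration only: recovery here is
    # independent of whether the patient recites the phrase.
    random.seed(0)
    N = 100_000
    P_RECOVERY = 0.05   # assumed chance of misdiagnosis or spontaneous remission
    P_RECITES = 0.5     # assumed fraction of patients who recite "Falun Dafa is good"

    recite = {"recovered": 0, "total": 0}
    silent = {"recovered": 0, "total": 0}

    for _ in range(N):
        group = recite if random.random() < P_RECITES else silent
        group["total"] += 1
        group["recovered"] += random.random() < P_RECOVERY  # same odds either way

    # Thousands of testimonials, even though recitation does nothing:
    print("recovered after reciting:", recite["recovered"])
    print("recovery rate, reciters: ", recite["recovered"] / recite["total"])
    print("recovery rate, others:   ", silent["recovered"] / silent["total"])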

There is a temptation, when reading about these cognitive biases, to think of them as the sort of thing that explains why everyone else is stupid. Diagnosing them in oneself is a lot more difficult. Daniel Kahneman describes this in terms of the “futility of teaching psychology.” Students read about all sorts of experiments describing all sorts of pernicious errors and biases, then simply “exempt themselves” from the obvious conclusion that they themselves are just as biased. In part this is because of the mistaken belief that bias can be detected through introspection (and so just knowing about a bias would be enough to provide immunity from its effects). In part it is due to another well-known cognitive bias, which is that most people vastly overestimate their own abilities in every area of life. Freedom from bias is just another one of those areas. Psychologists refer to it as the “bias blind spot.” It’s easy to lament the superstitiousness of Chinese peasants. With ourselves, on the other hand, it’s a different story.

Take me for example. I am a university professor, teaching in the department of philosophy. In order to get there, I both studied and taught logic, argumentation analysis and probability theory. I’m also by temperament a bit of a rationalist. Kids started calling me “Mr. Spock” in grade three. From there I went on to chess club, computer programming, and finally, logic and philosophy. I’ve also read extensively in the psychological literature on cognitive biases – enough, for example, to know what “confirmation bias” is, and how it typically manifests itself. And yet one day, not that many years ago, I found myself falling into the most elementary trap, in a test used to detect confirmation bias.

The test was designed by Peter Wason, who had a peculiar genius for inventing problems that would cause people’s reasoning abilities to go haywire. It goes something like this:

Experimenter: “I am going to present you with a series of three numbers, and you have to try to guess the rule that generated them. Before deciding, you may ask me three questions. Your questions must have the following form. You give me a set of three numbers, and I will tell you whether or not they are an instance of the rule.”

Me: “Okay fun, sounds a bit like Mastermind. I used to be really good at that.”

Experimenter: “Your first set of numbers is: 2, 4, 6.”

Me: “Wow, that’s easy. This must be the warm-up phase. Okay, how about: 6, 8, 10?”

Experimenter: “Yes, that is an instance of the rule.”

Me: “Um, okay. How about 22, 24, 26?”

Experimenter: “Yes, that is an instance of the rule.”

Me: “Okay, this is stupid. I don’t even need the last guess. Let’s try 100, 102, 104?”

Experimenter: “Yes, that is an instance of the rule.”

Me: “Alright, the rule is ‘Three even numbers in order’.”

Experimenter: “No, that is not the rule.”

Me: “Come again?”

Experimenter: “That is not the rule.”

Me: “Then what is the rule?”

Experimenter: “The rule is ‘Any three numbers in ascending order’.”

Me: “You’re telling me that 3, 5, 7 satisfies the rule?”

Experimenter: “Yes.”

Me: “2, 67, 428 satisfies the rule?”

Experimenter: “Yes.”

Me: “Oh my God, I’m an idiot.”

This is how I learned to take confirmation bias seriously. When asked to formulate an hypothesis and evaluate it, I quickly jumped on the first pattern that I saw, and then proceeded to look exclusively for results that confirmed my theory, without trying to falsify it. Once I got it into my head that they were even numbers in order, it never occurred to me to suggest a sequence of odd numbers, with the aim of eliciting a “no” response. I completely failed to “think the negative.” In fact, I can clearly remember pausing just before asking the third question, not being sure what to ask, thinking that I had exhausted all useful questions. I was even a bit puzzled by the fact that I had been given three questions to ask when obviously I only needed one or two…

What is particularly striking about this test of confirmation bias is that it reveals more than just wishful thinking. It’s one thing to know that people look only for supporting evidence when it comes to pet theories, or things that they want to believe. But in this case, I had nothing whatsoever invested in the initial hypothesis. There is simply a huge blind spot in our reasoning. The answer “yes” to “6, 8, 10” does in fact support the hypothesis that the rule is “three even numbers in sequence.” The problem is that it also supports a very large number of other hypotheses, and before one can establish that “three even numbers in sequence” is correct, one must rule out these other hypotheses. And yet I failed – failed egregiously, outrageously – to do it. This is exactly the same problem that leads people to think that the miraculous recovery of Ms. Liu generates support for belief in the divine power of Falun Dafa.
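A small sketch can make the logical point vivid. The two rules below, and the probe triples, are my own reconstruction rather than Wason’s actual materials; the point is simply that every “confirming” question I asked gets a “yes” from both the true rule and my premature hypothesis, so it cannot distinguish between them, whereas the one question I never thought to ask can:

    def true_rule(t):
        """The experimenter's actual rule: any three numbers in ascending order."""
        a, b, c = t
        return a < b < c

    def my_hypothesis(t):
        """My premature guess: even numbers ascending by two."""
        a, b, c = t
        return a % 2 == 0 and b == a + 2 and c == b + 2

    # The questions I actually asked: both rules say "yes", so the answers
    # cannot tell the hypotheses apart.
    for probe in [(6, 8, 10), (22, 24, 26), (100, 102, 104)]:
        print(probe, "experimenter:", true_rule(probe), "my guess:", my_hypothesis(probe))

    # The question I never thought to ask: the answers diverge, exposing the error.
    probe = (3, 5, 7)
    print(probe, "experimenter:", true_rule(probe), "my guess:", my_hypothesis(probe))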

It is easy to see why our intuitive judgments might be out of whack in this regard. One of the things that rapid cognition does extremely well is pattern recognition. What sets the grandmaster chess player, or the experienced doctor, or the seasoned firefighter apart is the ability to instantly detect patterns that no one else can see. The human brain is unfortunately quite hyperactive in this regard, and tends also to see patterns where in fact none exist. It’s called apophenia when it reaches pathological proportions, but the underlying tendency is ubiquitous. Most people, for instance, when shown a sequence of random coin tosses, or dice rolls, will insist that it is not random – since it contains too many “streaks.” Apple experienced a deluge of complaints about the “shuffle” feature on its iPod, from customers who insisted that the random sequence was biased against certain artists. People in general lack an intuitive feel for what randomness looks like, and so are constantly amazed by coincidences that are, as a matter of fact, not particularly improbable.

Some psychologists have suggested that this is because, in a natural environment, genuine randomness is somewhat uncommon, and so there is no particular benefit to being able to detect it. A more plausible account is simply that the evolutionary cost of false positives (thinking that there is a pattern, when in fact there is none) is much lower than the evolutionary cost of false negatives (thinking that there is no pattern, when in fact there is one), and so natural selection has favored a pattern-recognition system with very high sensitivity and low specificity. In an environment where there is some chance of being eaten by a predator, it pays to treat every rustle in the grass as a potential threat. The cost of getting “spooked” and running away from nothing is much lower than the cost of ignoring some telltale signs and getting torn limb from limb. Thus we tend to get all excited when we see a pattern. The thought “what if I’m wrong?” is not one that occurs to us naturally. It is a cognitive override imposed on us by reason.
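As a small illustration of the point about streaks, here is a quick simulation (with arbitrarily chosen parameters) of how long the longest run of identical outcomes tends to be in a sequence of fair coin flips. The streaks that people read as “patterns” are exactly what genuine randomness produces:

    import random

    random.seed(1)

    def longest_run(flips):
        """Length of the longest run of consecutive identical outcomes."""
        best = run = 1
        for prev, cur in zip(flips, flips[1:]):
            run = run + 1 if cur == prev else 1
            best = max(best, run)
        return best

    TRIALS = 10_000
    FLIPS = 100
    runs = [longest_run([random.choice("HT") for _ in range(FLIPS)]) for _ in range(TRIALS)]

    # In 100 fair flips, a streak of six or more of the same face is typical,
    # which is exactly what intuition refuses to accept.
    print("average longest streak:", sum(runs) / TRIALS)
    print("share of sequences with a streak of 6+:", sum(r >= 6 for r in runs) / TRIALS)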

Comments


  1. To promote its values the Enlightenment adopted a legalistic (or positivistic) concept of truth: ‘true’ is what can be proven. It did not take long to realize that control over evidence becomes control over truth. Foucault put it rather succinctly saying ‘truth is an effect of power’. Disbelieving the official truth is always possible but you should be ready to hear that you are a victim of some ‘conspiracy theory’.

    • You should also be willing to consider the possibility that you might actually be the victim of a conspiracy theory. Foucault’s argument could be used in support of climate change deniers and anti-vaxxers too, right?