Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or strengthens one’s prior personal beliefs or hypotheses. It is a type of cognitive bias. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. The pattern is to form a theory (often based on emotion) supported by insufficient data, and then to restrict critical thinking and ongoing analysis, which is, of course, irrational. Instead, you look for data that fits your theory. Commonly, confirmation bias unfolds in six steps (a toy code sketch after the list shows how the loop plays out):
- Form a theory (or ‘have an opinion’)
- Find ‘data’ that supports that opinion
- Work hard to collect more and more data that really ‘confirms’ your theory (i.e., what you believe)
- Identify the kinds of data most compelling to the people you most want to convince of your theory, then collect, memorize, repackage, and refine that data so it fits your theory more neatly
- Become more emotionally attached to your theory (i.e., your opinion) because you’re now surer than ever that ‘you’re right’
- Continue to discount and discredit new or better data, because accepting it would mean reconstructing your belief system, apologizing to people, admitting you were wrong, etc.
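To make the loop concrete, here is a toy simulation in Python. It is a minimal sketch, and every name and number in it (the `biased_update` function, the 0.10/0.01 weights, the 50/50 evidence stream) is an illustrative assumption rather than a published model: an observer who eagerly absorbs confirming evidence but explains away disconfirming evidence drifts toward certainty even though the evidence is perfectly mixed.

```python
# Toy sketch of the confirmation-bias loop described above.
# All weights and numbers are illustrative assumptions.
import random

random.seed(42)

def biased_update(confidence: float, evidence_supports: bool,
                  accept_weight: float = 0.10,
                  discount_weight: float = 0.01) -> float:
    """Move quickly toward certainty on confirming evidence,
    but barely budge on disconfirming evidence."""
    if evidence_supports:
        confidence += accept_weight * (1.0 - confidence)  # embrace it
    else:
        confidence -= discount_weight * confidence        # explain it away
    return confidence

confidence = 0.6  # step 1: a theory formed on modest initial support
for _ in range(50):
    supports = random.random() < 0.5  # the world is 50/50 for and against
    confidence = biased_update(confidence, supports)

print(f"confidence after 50 mixed observations: {confidence:.2f}")
# With symmetric weights the belief would hover near its starting point;
# with asymmetric weights it ratchets upward (steps 2-6 of the list).
```

The asymmetry between the two weights is the whole trick: nothing about the evidence favors the theory, yet the belief still hardens.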
People tend to interpret ambiguous evidence as supporting their existing position. Biased search, interpretation and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).
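A similar toy sketch can illustrate attitude polarization. In the hedged model below (again, every number is an assumption made for illustration, not a result from the literature cited here), two observers update on exactly the same ambiguous evidence stream, but each resolves ambiguity in the direction they already lean, so they end up further apart than they began.

```python
# Toy sketch of attitude polarization: identical evidence, diverging beliefs.
def polarized_update(belief: float, evidence: float, step: float = 0.05) -> float:
    # Ambiguous evidence (near zero) is read as favoring one's own side.
    if abs(evidence) < 0.3:
        evidence = 0.3 if belief > 0 else -0.3
    return max(-1.0, min(1.0, belief + step * evidence))

shared_evidence = [0.1, -0.2, 0.05, -0.1, 0.15, -0.05] * 10  # all ambiguous
pro, con = 0.2, -0.2  # mildly opposed starting positions
for e in shared_evidence:
    pro = polarized_update(pro, e)
    con = polarized_update(con, e)

print(f"after identical evidence: pro={pro:+.2f}, con={con:+.2f}")
# Both observers saw the same data, yet the gap between them grew.
```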
A series of psychological experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. In certain situations, this tendency can bias people’s conclusions. Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another explanation is that people show confirmation bias because they are weighing up the costs of being wrong, rather than investigating in a neutral, scientific way. However, even scientists and intelligent people can be prone to confirmation bias.
Confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Poor decisions due to these biases have been found in political, organizational and scientific contexts. For example, confirmation bias produces systematic errors in research based on inductive reasoning.
A study by Michael J. Wood and Karen M. Douglas of the University of Kent (UK) suggests that the negative stereotype of the conspiracy theorist – a hostile fanatic wedded to the truth of his own fringe theory – accurately describes the people who defend the official account of 9/11, not those who dispute it. Additionally, the study found that so-called conspiracists discuss historical context (such as viewing the JFK assassination as a precedent for 9/11) more than anti-conspiracists. It also found that the so-called conspiracists do not like to be called “conspiracists” or “conspiracy theorists.”
Both of these findings are amplified in the book Conspiracy Theory in America by political scientist Lance deHaven-Smith, published in 2013 by the University of Texas Press. Professor deHaven-Smith explains why people don’t like being called “conspiracy theorists”: the term was invented and put into wide circulation by the CIA to smear and defame people questioning the JFK assassination! “The CIA’s campaign to popularize the term ‘conspiracy theory’ and make conspiracy belief a target of ridicule and hostility must be credited, unfortunately, with being one of the most successful propaganda initiatives of all time.”
In other words, people who use the terms “conspiracy theory” and “conspiracy theorist” as insults are doing so as the result of a well-documented, undisputed, historically real conspiracy by the CIA to cover up the JFK assassination. That campaign, by the way, was completely illegal, and the CIA officers involved were criminals; the CIA is barred from all domestic activities, yet routinely breaks the law to conduct domestic operations ranging from propaganda to assassinations.
Psychologist Laurie Manwell of the University of Guelph agrees that the CIA-designed “conspiracy theory” label impedes cognitive function. She points out, in an article published in American Behavioral Scientist (2010), that anti-conspiracy people are unable to think clearly about such apparent state crimes against democracy as 9/11 due to their inability to process information that conflicts with pre-existing belief.
In the same issue of ABS, University of Buffalo professor Steven Hoffman adds that anti-conspiracy people are typically prey to strong “confirmation bias” – that is, they seek out information that confirms their pre-existing beliefs, while using irrational mechanisms (such as the “conspiracy theory” label) to avoid conflicting information.
The extreme irrationality of those who attack “conspiracy theories” has been ably exposed by Communications professors Ginna Husting and Martin Orr of Boise State University. In a 2007 peer-reviewed article entitled “Dangerous Machinery: ‘Conspiracy Theorist’ as a Transpersonal Strategy of Exclusion,” they wrote:
“If I call you a conspiracy theorist, it matters little whether you have actually claimed that a conspiracy exists or whether you have simply raised an issue that I would rather avoid… By labeling you, I strategically exclude you from the sphere where public speech, debate, and conflict occur.”
How To Resist Confirmation Bias
How can you keep from committing confirmation bias? Constantly re-evaluate what you believe you know, insist on the highest-quality data, and embrace the possibility that we’re all wrong more frequently than we’re right.
The pattern, crudely put, would look something like this:
- Consider all data equally and know how to separate good data from bad, fact from opinion, misrepresented facts from properly contextualized facts, etc.
- Form a theory based on said data/data sources
- Be open to new data, ideas, constructs, and perspectives and revise your theories in response as necessary
- Moving forward, use data to inform your theory-forming instead of using your theories to inform your data-seeking, as the sketch after this list shows.
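As a contrast to the biased loop sketched earlier, here is a minimal sketch of symmetric updating. It assumes a simple binary-hypothesis Bayes rule, and the likelihood values (0.7/0.3) are illustrative: confirming and disconfirming evidence are weighed by the same rule, so balanced evidence leaves the belief roughly where it started.

```python
# Minimal sketch of symmetric (unbiased) belief updating via Bayes' rule.
# The likelihood values are illustrative assumptions.
def bayes_update(prior: float, evidence_supports: bool,
                 p_if_true: float = 0.7, p_if_false: float = 0.3) -> float:
    """Posterior P(theory | evidence), applied with the same rule
    whether the evidence confirms or disconfirms the theory."""
    if evidence_supports:
        like_true, like_false = p_if_true, p_if_false
    else:
        like_true, like_false = 1 - p_if_true, 1 - p_if_false
    numerator = like_true * prior
    return numerator / (numerator + like_false * (1 - prior))

belief = 0.5  # start neutral; let the data, not the preference, move it
for supports in [True, False, True, True, False, False]:
    belief = bayes_update(belief, supports)

print(f"belief after balanced evidence: {belief:.2f}")  # lands back at 0.50
```

Because the same likelihoods govern both directions, three confirmations and three disconfirmations cancel exactly, which is what ‘data informing the theory’ looks like in miniature.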
Confirmation bias can, in fact, go so far as to change your morals altogether, turning, for example, someone who would never steal into someone who thinks stealing is actually okay.
To illustrate this, in their book Mistakes Were Made (But Not by Me), Tavris and Aronson created a beautiful metaphor: the pyramid of choice. Imagine two people with the same morals who are each given the chance to steal $500 from the cash register at work. Before making their choice, they stand at the top of a pyramid. They can see all the possible paths that lead down, all options and all consequences of their actions. One decides to steal, the other decides not to. Once they start descending their different paths, they both lose their bird’s-eye view and can only see the narrow path they’ve chosen for themselves. Because of self-justification and confirmation bias, each of them becomes ever surer that their path was the right one to take. When they reach the bottom, they end up at opposite ends of the pyramid, with completely different views of morality: one thinks it’s okay to steal, the other has become even more certain that stealing should never be done.
So what can you do to stop this self-reinforcing cycle of not admitting mistakes, making up excuses and then confirming those excuses? Simple: Start admitting them. Yes, I know, it’s hard. But here’s a good reason why you should do it anyway: Because your Asian friends, who are better at math than you, do too. What? 😀
In a study comparing US schools with Chinese and Japanese schools, researchers found that US students were embarrassed to make mistakes, to the point that they would never tackle difficult math problems in front of the class. In China and Japan, the kid who did worst had to go up to the board and redo the exercise until he got it right, with support from the class! Asian cultures see mistakes for what they are: part of life. Instead of burying their heads in the sand, they proactively admit and deal with them. Don’t make mistakes part of your identity: you aren’t stupid; you just used the wrong approach. Focus on criticizing your and other people’s behavior, not who you or they are, and you’ll develop the growth mindset you need to deal with mistakes the right way.
Confirmation bias is a killer of critical thinking. And the opposite approach is exhausting. Constantly considering a broad set of evidence and data (historical patterns, existing trends, widespread indicators, alternative explanations, etc.), narrowing it down to the higher-quality data, forming a ‘fluid’ conclusion, and then consistently revisiting that conclusion as ‘new’ data becomes available, data that wasn’t handpicked to support a theory but is fresh, valid, credible, and relevant, takes a lot of cognitive energy, thinking strategy, and sheer determination, not to mention humility. That is why it’s not as common as it could be.
A silent factor here is the challenge of cognitive dissonance: holding two competing beliefs at once. This is uncomfortable, so when considering ‘what’s real,’ we tend to cling to one belief and pitch the other, then scour the data to prove that we are right. This can result in a kind of self-fulfilling prophecy, where you essentially will your biases and insecurities into existence.
While so-called ‘conspiracy theorists’ are typically taking on dissonance in an effort to discover truth, they can also be susceptible to confirmation bias. It is not uncommon for some to believe every tragic event is a false flag and to seek out only confirming evidence for that belief. It is important to set aside any bias about an event beforehand, look at the facts and where they lead, and test any theory of a possible pattern by checking whether the connections it predicts actually exist. It can be an arduous process, while laziness typically leads to confirmation bias.