13 Confirmation Bias

Thinking processes are far from perfect, and people experience cognitive errors every day. One common error is confirmation bias: the tendency to believe that you are right and to disregard things that conflict with your ideas (Kahneman, 2011). To give an example: maybe you have a friend who believes that you are always late when you plan social outings together. This may be true, or it could be the result of confirmation bias. Suppose one time you were very late to a movie that you two had planned to see together; your friend was likely not too happy, because a movie is really not something you should be late to. After this, your friend starts to suspect that you are not very punctual. Then, about a week later, you arrive several minutes late to a mutually planned lunch. This next occurrence confirms your friend's earlier thought that you are often late. The idea then starts to take root in your friend's mind, and he/she starts to believe that you are generally late. In the following months, you are occasionally late to social events but for the most part relatively punctual. You are then surprised when, one day, you overhear your friend complaining to someone else that you are always late. This conviction in your friend's mind could be the result of confirmation bias: once your friend developed the idea that you are a late person, he/she became more likely to remember subsequent instances of your tardiness, since these support the idea that you're late a lot, and less likely to remember the times you were actually on time, since those occurrences contradict the idea already rooted in your friend's mind.

Confirmation bias strengthens with time: the longer you believe something, the more time there is to collect supporting evidence, and the less willing you are to let go of the idea, because people grow attached to their ideas (Johnson, 1987). Furthermore, not only does more time amplify the effects of confirmation bias; higher status can also increase the probability of confirmation bias occurring (Ridley, 2012). For example, if the CEO of a large company believes that a certain new program will increase employee efficiency, the CEO is likely to accept that the program works on far fewer pieces of evidence than a lower-level manager who comes up with the same idea would need. This makes sense because the CEO's awareness of his/her own seniority makes the CEO more confident in his/her intelligence, and therefore inclined to seek less confirming evidence and even less potentially contradicting evidence.

Additionally, confirmation bias contributes to exaggerated probabilities: when you accumulate evidence in your mind that supports your hypothesis, you can come to think something is much more probable than it actually is (Kahneman, 2011). For example, if you get picked last for teams in gym class one day, you will probably remember it and might even develop some insecurities. If it happens a second time, these instances will really start to stick. From there, you could develop the idea that you always get picked last for teams in gym class. While this may not actually be true, it can seem accurate in your mind, because you remember more instances of getting picked last than of not being picked last, since those occurrences confirm your existing hypothesis; confirmation bias then makes you overestimate the probability of getting picked last.
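To make the overestimation concrete, here is a minimal simulation of biased recall. All of the numbers (the true rate of being picked last and the two recall rates) are invented illustrative values, not figures from Kahneman (2011); the point is only that remembering confirming days more reliably than ordinary days inflates the perceived frequency.

```python
import random

random.seed(0)

TRUE_PROB = 0.2           # actual chance of being picked last on a given day
RECALL_CONFIRMING = 0.9   # chance of remembering a "picked last" day
RECALL_ORDINARY = 0.3     # chance of remembering an unremarkable day

# Simulate 200 gym classes: True means "picked last" that day.
days = [random.random() < TRUE_PROB for _ in range(200)]

# Biased memory: confirming days are retained far more reliably.
remembered = [day for day in days
              if random.random() < (RECALL_CONFIRMING if day else RECALL_ORDINARY)]

actual = sum(days) / len(days)
perceived = sum(remembered) / len(remembered)
print(f"actual frequency of being picked last:  {actual:.2f}")
print(f"frequency as reconstructed from memory: {perceived:.2f}")
```

With these made-up recall rates, roughly 40 percent of the remembered days are "picked last" days even though only about 20 percent of the actual days were, which is exactly the overestimation described above.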

The Complication

Many people place a great deal of trust in scientific findings because they believe that scientists are intelligent and therefore follow procedures that ensure the validity of their results. However, the public should be more aware that scientists are merely human and just as capable of falling prey to faulty thinking as anyone else. Many people assume that when scientists conduct research and present findings, the findings are correct. But to be truly informed, the public should not accept a study's findings until they have examined the research and evaluated the quality of the research process.

Since scientists may also experience confirmation bias, they may tend to look for and present evidence that supports their hypothesis and neglect, or simply ignore, evidence that could conflict with it. This is harmful because the research becomes one-sided: to research a question thoroughly and establish that a hypothesis is true, you need to seek out both evidence that claims to support it and evidence that claims to contradict it. Scientists, like most people, tend to think that they are right, so when they come up with a hypothesis, they naturally look for ways to prove it. It is equally important, however, to check whether there is evidence showing that the hypothesis is incorrect or should be altered in some way.

Part of the experience of confirmation bias may be fear: people worry that if they look for evidence to the contrary, they may find too much of it and disprove their own hypothesis. This should not be a concern, especially for scientists. If the hypothesis is correct, the contradicting evidence they find should turn out to be inadequate or faulty in some way. If the hypothesis is incorrect, they should find a significant amount of evidence that says so, and that is still helpful: it is better to discover that a hypothesis is wrong during the research phase than to continue fruitlessly searching for adequate evidence to prove it. Worse yet, a scientist might gloss over the inadequacy of the evidence, and the implications for the validity of the hypothesis, and simply publish findings built on an inaccurate hypothesis and inadequate support. This last scenario may seem extreme, but such instances likely make up the majority of cases of confirmation bias, because the scientist may not even be aware that he/she is failing to provide adequate evidence. The scientific debate about climate change offers an example: one article claims that scientists who believe in climate change are "right" based on the evidence they provide, since they mostly present evidence that supports the hypothesis that climate change exists, and the reverse is true for the scientists who do not believe in climate change; they, too, are "right" based on their evidence (Bell, 2012).

Examples of Confirmation Bias

One example of confirmation bias comes from a study by the psychologist Peter Wason, in which a teacher knew a "mystery rule" and children were given a few examples of values that followed it. The children then had to discover the rule by guessing values, and the teacher would tell them whether or not those values followed the rule. The students were then allowed to propose a possible rule, and the teacher would tell them if their hypothesis was correct. The study showed confirmation bias because once the children had identified a possible rule, they kept guessing values that would confirm their hypothesis and failed to suggest values that could disprove it. They also tended to guess fairly complicated rules; since this was a sort of game, they expected the mystery rule to be complex. This turned out to be very far off, because the article's first mystery rule was simply that the numbers appeared in increasing order (Johnson, 1987).
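A small sketch of the task in Python makes the trap visible. The hidden rule below is the one from the article (numbers in increasing order); the "doubling" hypothesis is an invented stand-in for the kind of complicated rule the children tended to guess.

```python
def mystery_rule(a, b, c):
    """The hidden rule from the article: the numbers appear in increasing order."""
    return a < b < c

def doubling_hypothesis(a, b, c):
    """An invented stand-in for the complicated rules the children guessed."""
    return b == 2 * a and c == 2 * b

# Positive tests: triples the hypothesis predicts will pass. The teacher
# says yes every time, so the guesser feels confirmed, but learns nothing.
for triple in [(2, 4, 8), (3, 6, 12), (5, 10, 20)]:
    print(triple, "follows rule:", mystery_rule(*triple))  # True each time

# A negative test: a triple the doubling hypothesis predicts should FAIL.
# The teacher says it follows the rule, which disproves the hypothesis.
triple = (1, 2, 3)
print(triple, "follows rule:", mystery_rule(*triple),
      "| hypothesis predicts:", doubling_hypothesis(*triple))  # True vs. False
```

Every confirming guess the children made was consistent with both the true rule and their own wrong rule, so only a guess their hypothesis predicted would fail could have revealed the difference.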

Another example involves nurses administering medication to patients. In their day-to-day routine, nurses give out many medications, many of which look similar in appearance and/or packaging. This can be problematic: because nurses are so used to retrieving these drugs, they are more likely to fall prey to confirmation bias and administer the incorrect medication. In general, a nurse knows that he/she needs to give a certain type of drug to a certain patient. The nurse probably associates that specific drug with a certain category of drugs, a certain label, or a type of packaging. With these preconceived notions in mind, the nurse goes to pick out the drug and accidentally picks the wrong one. This can happen because, with an idea already in mind, the nurse was simply looking for information that corroborated it. So if he/she was looking for a drug that usually comes in a white box with red labeling, he/she will be looking for exactly that and may overlook the slight differences between drugs with similar packaging (Davis, 1994). This is an especially relevant example of confirmation bias because mistakes of this kind can be fatal, given the variability of medications and the precision required for effective treatment.

Suggestions for Combating Confirmation Bias

People in general, not just scientists, shouldn't be afraid to look for opposing arguments and evidence. These can make your own argument stronger, as long as you structure your ideas well. For example, you could bring up an opposing argument and point out exactly where its logic fails or where its evidence is inadequate, then immediately show how your own idea's logic holds or how its evidence is adequate. The comparison undermines the opposing position while strengthening support for your own.

From the problem-solving example (involving a teacher, children, and a mystery rule), one suggestion is that people should seek "positive" AND "negative" information (Johnson, 1987). Positive information is data that will confirm a hypothesis, while negative information is evidence that will disprove it. People naturally seek out positive information, but obtaining negative information is also beneficial: once you know the current hypothesis is incorrect, you can adapt it accordingly, or even restart your thinking from a completely new perspective.
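The value of negative information can be shown by continuing the toy Wason sketch from the previous section. A purely confirming probe eliminates nothing, while a probe on which the candidate hypotheses disagree actually prunes the wrong ones. The candidate rules here are invented for illustration.

```python
# Candidate rules a problem solver might entertain (invented for illustration).
hypotheses = {
    "each number doubles": lambda a, b, c: b == 2 * a and c == 2 * b,
    "all numbers even":    lambda a, b, c: a % 2 == 0 and b % 2 == 0 and c % 2 == 0,
    "increasing order":    lambda a, b, c: a < b < c,
}

def teacher(a, b, c):
    # The hidden rule, as in the earlier sketch.
    return a < b < c

# Keep only hypotheses whose prediction matches the teacher's answer.
for probe in [(2, 4, 8), (1, 2, 3), (6, 4, 2)]:
    answer = teacher(*probe)
    hypotheses = {name: h for name, h in hypotheses.items() if h(*probe) == answer}
    print(probe, "->", answer, "| still viable:", sorted(hypotheses))
```

Notice that the first, confirming probe leaves every candidate standing; only the later probes, which the wrong hypotheses predict differently, narrow the field.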

From the article "Giving Debiasing Away…", we learn more about the relevance of combating confirmation bias. This source argues that people who are more aware of biases and who consciously aim to reduce them are less likely to believe in extremist ideas, since they are better informed and therefore in a better position to judge ideas and concepts (Lilienfeld, Ammirati, & Landfield, 2009). This source differs a bit from the previously referenced ones because it addresses the effects of changing the way you think, rather than simply describing examples of the bias occurring. By discussing the effects of combating confirmation bias in a humanitarian light, it encourages people to think and behave with less influence from biases, offering the possibility that the world benefits greatly from conscientious people who take the time to reduce their cognitive errors.

Since cognitive errors are innate, they may seem difficult to correct; however, there are ways that people can work to overcome their negative influence. To start, people should try to be aware of their own ideas and concepts and truly examine the basis upon which they have been founded. In general, you should be able to come up with evidence both for and against your beliefs. If you can't think of anything that contradicts your thoughts, you should actively seek out evidence that does, so you have a more well-rounded perspective from which to make accurate assessments. You may be suffering from confirmation bias, but exposing yourself to contradicting evidence and ideas can help you challenge your beliefs and develop more informed ones.

References

Bell, L. (2012, August 14). Confirmation Bias: Why Both Sides Of The Global Warming Debate Are Nearly Always Right. Retrieved October 6, 2015.

Davis, N. M. (1994). Med Errors: Combating Confirmation Bias. The American Journal of Nursing, 94(7), 17. http://doi.org/10.2307/3464683

Johnson, J. E. (1987). Do You Think You Might Be Wrong? Confirmation Bias in Problem Solving. The Arithmetic Teacher, 34(9), 13–16. Retrieved from http://www.jstor.org/stable/41194223

Kahneman, D. (2011). Thinking, Fast and Slow (pp. 80-81, 324, 333). New York, New York: Farrar, Straus and Giroux.

Lilienfeld, S. O., Ammirati, R., & Landfield, K. (2009). Giving Debiasing Away: Can Psychological Research on Correcting Cognitive Errors Promote Human Welfare? Perspectives on Psychological Science, 4(4), 390–398. Retrieved from http://www.jstor.org/stable/40645706

Ridley, M. (2012, July 20). When Bad Theories Happen to Good Scientists. Retrieved October 29, 2015.