The American Psychological Association (APA) defines ‘confirmation bias’ as a tendency to gather evidence that confirms preexisting expectations, typically by emphasizing or pursuing supporting evidence while dismissing or failing to seek contradictory evidence. It is a type of cognitive bias. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs.
People also tend to interpret ambiguous evidence as supporting their existing position. Biased search, interpretation and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).
As Catherine A. Sanderson points out in her book, confirmation bias also helps to form and reconfirm the stereotypes we hold about people: “We also ignore information that disputes our expectations. We are more likely to remember (and repeat) stereotype-consistent information and to forget or ignore stereotype-inconsistent information, which is one way stereotypes are maintained even in the face of disconfirming evidence. If you learn that your new Canadian friend hates hockey and loves sailing, and that your new Mexican friend hates spicy foods and loves rap music, you are less likely to remember this new stereotype-inconsistent information.”
According to an article by Kendra Cherry, a series of psychological experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. “Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. In certain situations, this tendency can bias people’s conclusions,” noted Cherry. She also wrote that “Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another explanation is that people show confirmation bias because they are weighing up the costs of being wrong, rather than investigating in a neutral, scientific way. However, even scientists and intelligent people can be prone to confirmation bias.”
According to Cherry, confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Poor decisions due to these biases have been found in political, organizational and scientific contexts. For example, confirmation bias produces systematic errors in research based on inductive reasoning.
Meanwhile, Pascal Molenberghs stated in an article that once people have made up their minds about a party or a person, it is very hard to change their view. “In fact, people actively seek information that confirms their beliefs and will often ignore contradictory information, in a process known as confirmation bias,” he argued. Molenberghs and his colleagues found further scientific evidence for confirmation bias in a recent neuroimaging study: brain areas involved in processing information were more active when people observed positive messages from in-group political leaders and negative messages from out-group political leaders. This suggests people like to hear messages that confirm what they already believe, such as that their own group is ‘good’ and the other group is ‘bad’.
Furthermore, Zaruhi Hakobyan and Christos Koulovatianos write in a joint article that, among the factors explaining populism’s rise, much research has focused on the internet and social media as core culprits. As they put it: “Internet and social media have decreased the cost of forming new networks and of exchanging information. Populists tend to spend much energy on networking and on spreading information that is not fact-based or expert-reviewed. Naturally, much of current research has focused on fake news. It is hoped that by understanding the determinants of fake news and by developing ways of combating them, problems of populism, of neglecting expert opinion, of fanaticism, etc. may be mitigated.”
However, Hakobyan and Koulovatianos have also argued that combating fake news may not be sufficient to counter the rising populist tendency of neglecting expert opinion. Merely combining the internet’s ease of forming networks with two fundamental features of most people (entrenched biases in attitudes towards many aspects of life, and a strong preference for being liked by their peers) can generate populist dynamics over time through a vicious circle. “Even without fake news, biases (including confirmation bias) lead to more homophily and, over time, more homophily leads to actions that put more weight on biases and less weight on expert opinion,” they note.
The central message of the study by Hakobyan and Koulovatianos is that societies might need to invest more intensely in ways of mitigating people’s fundamental biases. “This might be possible to be achieved through educational reforms and educational approaches that train citizens in developing a fact-based attitude towards knowledge and new information, trust for science and respect for expert views. Understanding the determinants of biases and ways of making people aware of biases may be a new focus of future research that aims at mitigating populism in society,” according to them.