False belief can result from a variety of causes, including unwarranted trust, confirmation bias and motivated skepticism. I’d rank them in that order, from least to most dangerous.
Suppose a particular belief of ours resulted from interaction with a trusted authority figure (e.g., a parent or teacher). If we never independently investigated the factual basis of their claim, we might be more open to being persuaded away from the belief. Most likely we would acknowledge that the belief is only weakly evidence-based. Belief based on trust can be better than belief based on a coin flip, if the person we relied on has, say, above-average intelligence and/or has demonstrated general reliability in the past.
Consider the following two examples:
Example 1. You believe that global warming was caused by human actions. You have not read any arguments for or against the claim. Your belief is based entirely on your discovery that nearly every scientist with expertise in this area has come to just that conclusion.
Example 2. You believe that Jesus was born of a virgin, died for our sins and rose from the dead. You base that belief entirely on the fact that your parents said it’s true. You have never read the Bible or looked into arguments for or against the claim. However, your parents have been reliable in the past.
Both are examples of faith-based belief. It is necessary to hold some beliefs on faith, as we don’t have the time or even the ability to thoroughly investigate every claim. However, I would consider the evidence in example 1 to be stronger than in example 2 (a community of scientists has a better track record than parents do, in my opinion).
The point, though, is that in either example, if we are presented with actual disconfirming evidence, we might be persuaded. That’s because we are not operating under the assumption that we have already researched the topic.
Belief based entirely on trust or faith tends to be held by people who are not particularly motivated truth-seekers. Thus, in my opinion, they could be persuaded by actual evidence.
Let’s now consider another type of person. This person started with a belief that was faith-based, but then went out and looked for evidence. They used whatever tools were at their disposal to find confirmatory evidence. And of course, it’s never hard to find evidence in favor of your beliefs if you look hard enough with an uncritical eye. As a result, the person went from having a belief that they knew was faith-based, to having one that they believe is based on real evidence.
In my opinion, this person’s beliefs are going to be more strongly held than those of people who did not bother to look for evidence. Thus, disconfirming evidence might be less likely to persuade them. There is hope, though, because they have never bothered looking into evidence in favor of alternative theories.
With motivated skepticism, the person takes it one step further. Not only do they look for confirmatory evidence, and suffer from confirmation bias, but they also search for flaws in the arguments made in favor of alternative theories. Thus, they think they are being good rationalists by looking at both sides of an issue. However, they apply a completely different level of skepticism to confirming evidence than to disconfirming evidence.
Suppose, for example, that I believe that deregulation of the financial markets was the primary cause of the current financial crisis. Perhaps I do not know a lot about finance. My belief might be based on political loyalties, instinct, what I heard on my preferred news programs or what my friends were saying. Whatever the reason, I strongly believe it’s true. I decide to be a good citizen and do a little research. Whenever I find an article that supports my belief, I soak up the information. It feels good. When I find an article that argues against my belief, I excitedly read through it, searching for flaws. By the time I am done with my ‘research,’ I have found evidence that favors my beliefs, and flaws in the arguments against my beliefs. Thus, I am more convinced than ever that I am right.
The danger here is the feeling the person has that they have objectively looked at the evidence on both sides. In their mind, their work is done. Will disconfirming evidence persuade them? It seems unlikely, as they have moved past the investigative phase on that topic.
Example — political beliefs
Consider the following study on political beliefs (link). First, the authors described how people should update their beliefs:
Assuming one has established an initial belief (attitude or hypothesis), normative models of human decision-making imply or posit a two-step updating process, beginning with the collection of belief-relevant evidence, followed by the integration of new information with the prior to produce an updated judgment. Critically important in such normative models is the requirement that the collection and integration of new information be kept independent of one’s prior judgment (see Evans & Over, 1996).
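In plainer symbols (my gloss, not the authors’ notation), the normative rule they are describing is essentially Bayes’ theorem: the posterior should depend only on the prior and on the likelihood of the evidence, and those likelihoods must not depend on which conclusion we would prefer:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \lnot H)\,P(\lnot H)}$$

Asymmetrical skepticism breaks exactly that independence requirement: the likelihoods we assign end up depending on whether the evidence supports our prior.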
The authors carried out an experiment in which participants were exposed to a balanced set of pro and con arguments on two topics, affirmative action and gun control. Interestingly, they found that participants who were politically sophisticated and/or had strong prior beliefs were more likely to view the evidence in a biased way. This led to more polarization, despite the fact that they were exposed to a balanced set of arguments. As the authors stated:
Asymmetrical skepticism – as would be reflected in the type of thoughts that come to mind as we read pro and con arguments – deposits in mind all the evidence needed to justify and bolster our priors with a clear conscience (Ditto, Scepansky, Munro, Apanovitch & Lockhart, 1998).
If you have strong prior beliefs, ‘research’ can just end up being an exercise in bolstering your case. You’re like an attorney, searching for evidence in favor of your client and trying to find flaws in the evidence against your client. The danger, though, is that you think of yourself as a judge who is listening to both sides.
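To make the mechanism concrete, here is a minimal simulation sketch (my construction; the study did not model its participants this way). The agent updates like a Bayesian, but assigns more evidential weight to confirming articles than to disconfirming ones. Fed a perfectly balanced stream of arguments, its belief still drifts toward certainty:

```python
# A hypothetical motivated-skeptic agent: Bayesian updating with
# asymmetric evidential weights. The weights below are illustrative
# assumptions, not estimates from the study.

def update(prior, likelihood_ratio):
    """One Bayesian update: posterior odds = prior odds * likelihood ratio."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

FACE_VALUE = 2.0   # strength granted to a confirming article
DISCOUNTED = 1.2   # strength granted to a disconfirming one ("I found flaws")

belief = 0.6  # modest prior in the favored hypothesis

for i in range(20):  # 10 pro and 10 con articles: a balanced set
    if i % 2 == 0:
        belief = update(belief, FACE_VALUE)      # pro: soak it up
    else:
        belief = update(belief, 1 / DISCOUNTED)  # con: heavily discounted

print(f"belief after balanced evidence: {belief:.3f}")  # ~0.996
```

With symmetric weights the belief would end where it started; the asymmetry alone is enough to produce the polarization the study observed.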