debiasing - The Skeptic's Dictionary

"For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs. Considering how little we know, the confidence we have in our beliefs is preposterous—and it is also essential." --Daniel Kahneman

Debiasing refers to methods, strategies, and techniques used to overcome biases in thinking. Many thinking biases are rooted in our evolutionary history; some are rooted in cultural traditions; and some are due to a variety of personal and social factors.

Biased thinking doesn't always lead to bad judgments or decisions. A bias is a tendency. If the foundation beneath that tendency is built of solid knowledge, recognized expertise, and years of relevant experience, then something like the availability bias might lead to quick judgments or decisions but not necessarily to bad ones. As Daniel Kahneman (2011, p. 11) notes:

Expert intuition strikes us as magical, but it is not. Indeed, each of us performs feats of intuitive expertise many times each day. Most of us are pitch-perfect in detecting anger in the first word of a telephone call, recognize as we enter a room that we were the subject of the conversation, and quickly react to subtle signs that the driver of the car in the next lane is dangerous....The psychology of accurate intuition involves no magic. Perhaps the best short statement of it is by the great Herbert Simon, who studied chess masters and showed that after thousands of hours of practice they come to see the pieces on the board differently from the rest of us. You can feel Simon’s impatience with the mythologizing of expert intuition when he writes: “The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”

Most of us, however, are not knowledgeable about most subjects, are not recognized experts in anything, and lack experience in most fields. For most of us, most of the time, biased thinking will lead to less than optimal judgments and actions. Fortunately for our self-esteem, we are so good at rationalizing our beliefs and actions that we rarely appreciate just how inept we are at recognizing our biases. Our capacity for rationalizing, however, is not such a stroke of good fortune if what we seek is the truth or, at least, the most reasonable beliefs based on the available evidence. Cognitive biases, for most of us most of the time, are cognitive illusions. Can we learn to overcome cognitive illusions, or are we doomed to die with our biases on? The evidence, according to one of the leading experts in the study of cognitive biases and illusions, is "not encouraging" (Kahneman 2011). It is hard to accept that our minds work in ways we have little control over and that we know much less about ourselves than we think we do. Much of our thinking is automatic and can't be turned off at will. Kahneman notes: "Constantly questioning our own thinking would be impossibly tedious...much too slow and inefficient....The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high."

One bias we should all be concerned with is the bias many of us have that favors intuition over scientific data. We might excuse an individual like Boris Johnson, who believes his experience of a very cold winter should count as strong evidence against the scientific consensus that the planet is warming. But we should not excuse experts who fail to recognize that they are making decisions in a low-validity situation, one in which outcomes cannot reliably be predicted, yet follow their intuition anyway without further investigation.

In one of his wonderful collections of essays (The Youngest Science: Notes of a Medicine-Watcher), the late physician Lewis Thomas tells of a highly successful doctor (a senior citizen back when Dr. Thomas’ father was an intern) in New York’s Roosevelt Hospital who was trained before the medical profession understood how disease spreads.

The elder doctor was renowned for his remarkable ability to diagnose typhoid fever, a common disease at that time and place. His method was to closely examine the tongues of patients. His ward rounds, the younger Dr. Thomas recounts, “were essentially tongue rounds.” Each patient would stick out his tongue for the doctor to palpate. Pondering its texture and irregularities, he would diagnose the disease “in its earliest stages over and over again” and turn out, “a week or so later, to have been right, to everyone’s amazement.”

The essayist wryly concludes: “He was a more productive carrier, using only his hands, than Typhoid Mary.”*

Despite the obstacles that face all of us who attempt to overcome cognitive biases and illusions, the first step seems obvious: recognize and study the main biases and illusions, so that we are aware of them and of the kinds of situations in which they are likely to take over. The second step seems obvious too: devise a plan for dealing with individual biases. At the very least, this second step requires a commitment to hard work and reflection before making a judgment (cf. Stanovich 2010). The third step may not seem obvious, especially to those who are overconfident in their ability to make important decisions quickly without much reflection: recognize which kinds of situations are inherently unpredictable and which allow time for getting feedback. It should go without saying that it is not wise to follow your intuition when the situation is predictably unpredictable, and that it is wise to get feedback whenever possible before making important decisions.

Ideally, I suppose, debiasing would be one of the primary objectives of education. What could be more important for a society than a populace that knows how to avoid some of the main hindrances to making good decisions and judgments? Yet most high school and college students will graduate without even knowing what these biases are, much less having been taught how to diminish their power over us.

There is every reason to be pessimistic about the chances of successfully overcoming many cognitive biases and illusions. Nobody has made a more detailed and careful study of these biases and their effects than Daniel Kahneman, yet here is what he had to say at the end of his book Thinking, Fast and Slow:

Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely ... And I have made much more progress in recognizing the errors of others than my own.

Even so, there may be hope for us after all. For example, there may be little we can do to overcome the optimistic bias in individuals, but there is some evidence that we can reduce this bias in organizations. The optimistic bias is an expression used by Kahneman to describe the idea that "most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be." Furthermore, most of us have an unrealistic view of our ability to predict the future: we think we're much better at it than we really are. We might advise individuals, when considering their goals, to ask themselves: what could go wrong? and what would happen if I fail? A consideration of possible failure need not lead one to dismantle one's dreams. In fact, it could help prevent the very failures one considers, by planning ahead to avoid them. For organizations, Gary Klein has this advice:

....when the organization has almost come to an important decision but has not formally committed itself, Klein proposes gathering for a brief session a group of individuals who are knowledgeable about the decision. The premise of the session is a short speech: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.” (Kahneman 2011, p. 264)

Klein called his proposal a premortem. Kahneman comments:

The premortem has two main advantages: it overcomes the groupthink that affects many teams once a decision appears to have been made, and it unleashes the imagination of knowledgeable individuals in a much-needed direction....The suppression of doubt contributes to overconfidence in a group where only supporters of the decision have a voice. The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier.

Another area where debiasing has shown some success is with the planning fallacy, a term coined by Kahneman and Amos Tversky "to describe plans and forecasts that are unrealistically close to best-case scenarios," forecasts that "could be improved by consulting the statistics of similar cases" (Kahneman 2011, p. 250). To mitigate this tendency to be unrealistic about how long a task will take, one should research similar tasks that have already been completed and find out both how long their proposers thought they would take and how long they actually took. According to Kahneman: "Using ... distributional information from other ventures similar to that being forecasted is called taking an 'outside view' and is the cure to the planning fallacy." This advice has been put into practice often enough that it has its own name: reference class forecasting. The process has been used in the UK, the Netherlands, Denmark, and Switzerland. Should we be optimistic? I don’t know. On the other hand, there is not much to report regarding successful methods of debiasing individuals.
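To make the arithmetic of the outside view concrete, here is a minimal sketch in Python. The project figures and the 14-month inside-view estimate are invented for illustration, not drawn from Kahneman or the forecasting literature; the sketch simply adjusts an inside-view estimate by the distribution of overrun ratios observed in a hypothetical reference class of completed, similar projects.

# A minimal, hypothetical sketch of the "outside view": adjust a new project's
# inside-view estimate using the estimate-vs-actual record of similar past projects.
# All numbers below are invented for illustration.

from statistics import median, quantiles

# Reference class: (estimated months, actual months) for completed, similar projects.
reference_class = [
    (12, 18),
    (10, 16),
    (24, 40),
    (8, 11),
    (15, 27),
]

# How badly did past estimates undershoot? A ratio above 1 means an overrun.
overrun_ratios = [actual / estimate for estimate, actual in reference_class]

inside_view_estimate = 14  # months we *think* the new project will take

# Outside-view forecast: scale our estimate by the typical historical overrun,
# with a rough range taken from the quartiles of the historical ratios.
typical = inside_view_estimate * median(overrun_ratios)
q1, _, q3 = quantiles(overrun_ratios, n=4)
low, high = inside_view_estimate * q1, inside_view_estimate * q3

print(f"Inside view: {inside_view_estimate} months")
print(f"Outside view: about {typical:.0f} months (roughly {low:.0f}-{high:.0f})")

Actual reference class forecasting rests on a large, carefully chosen class of comparable projects and its full cost and schedule distribution rather than five made-up data points, but the arithmetic is the same: let the record of similar cases, not the coherence of your own plan, set the forecast.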

further reading

The Debunking Handbook by John Cook and Stephan Lewandowsky

Diss Information: Is There a Way to Stop Popular Falsehoods from Morphing into "Facts"? by Carrie Arnold, Scientific American

Debiasing by Richard P. Larrick. 2004. Ch. 16 in Blackwell Handbook of Judgment and Decision Making, edited by Derek J. Koehler and Nigel Harvey.

Debiasing - by Lee Merkhofer

Arkes, H. R. 1991. Costs and benefits of judgment errors: Implications for debiasing. Psychological Bulletin. 110, 486–98.

Fischhoff, B. 1982. "Debiasing." In D. Kahneman, P. Slovic, and A. Tversky (eds.), Judgment Under Uncertainty: Heuristics and Biases (pp. 422–44).

Kahneman, Daniel. 2011. Thinking, Fast and Slow. Farrar, Straus and Giroux.

Klein, Gary. 1998. Sources of Power: How People Make Decisions. MIT Press.

Stanovich, Keith. 2010. Rationality and the Reflective Mind. Oxford University Press, USA.

web

Strategic Decisions: When Can You Trust Your Gut? McKinsey Quarterly. Nobel laureate Daniel Kahneman and psychologist Gary Klein debate the power and perils of intuition for senior executives:

"Under what conditions are the intuitions of professionals worthy of trust?" What's your answer? When can executives trust their gut?

Gary Klein: It depends on what you mean by "trust." If you mean, "My gut feeling is telling me this; therefore I can act on it and I don't have to worry," we say you should never trust your gut. You need to take your gut feeling as an important data point, but then you have to consciously and deliberately evaluate it, to see if it makes sense in this context. You need strategies that help rule things out. That's the opposite of saying, "This is what my gut is telling me; let me gather information to confirm it."

Daniel Kahneman: There are some conditions where you have to trust your intuition. When you are under time pressure for a decision, you need to follow intuition. My general view, though, would be that you should not take your intuitions at face value. Overconfidence is a powerful source of illusions, primarily determined by the quality and coherence of the story that you can construct, not by its validity. If people can construct a simple and coherent story, they will feel confident regardless of how well grounded it is in reality.

The Quarterly: Is intuition more reliable under certain conditions?

Gary Klein: We identified two. First, there needs to be a certain structure to a situation, a certain predictability that allows you to have a basis for the intuition. If a situation is very, very turbulent, we say it has low validity, and there's no basis for intuition. For example, you shouldn't trust the judgments of stockbrokers picking individual stocks. The second factor is whether decision makers have a chance to get feedback on their judgments, so that they can strengthen them and gain expertise. If those criteria aren't met, then intuitions aren't going to be trustworthy.

Most corporate decisions aren't going to meet the test of high validity. But they're going to be way above the low-validity situations that we worry about. Many business intuitions are going to be valuable; they are telling you something useful, and you want to take advantage of them.