Over at the academic blog Overcoming Bias, Arnold Kling makes a good point:
Before the Iraq invasion, President Bush did not say, "I think that there is a 60 percent chance that Saddam has an active WMD program."
Al Gore does not say, "I think there is a 2 percent chance that if we do nothing there will be an environmental catastrophe that will end life as we know it."
Instead, they speak in the language of certainty. I assume that as political leaders they know a lot better than I do how to speak to the general population. So I infer that, relative to me, the public has a bias toward certainty.
Another piece of evidence of that is an anecdote I cite about my (former) doctor. Several years ago, he said I needed a test. I did some research and some Bayes' Theorem calculations, and I faxed him a note saying that I did not think the test was worth it. He became irate.
I think that one reason that our health care system works the way it does is that it does not occur to anyone to say, "OK, I can live with that level of uncertainty." Instead, we must have the MRI, or the CT scan, or whatever. Even, as in my case, when the patient is willing to live with uncertainty, the doctor has a problem with it.
Another way that bias toward certainty shows up is in the way we handle disagreement. People don't say that there were differences within the intelligence community about the probability distribution for Saddam having WMD. They say that Bush manipulated the intelligence. And they are right, in the sense that he tried to make it sound certain.
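Kling's aside about running a Bayes' Theorem calculation on his doctor's test is worth making concrete, since it shows how a patient can quantify whether a test is likely to change anything. Here is a minimal sketch in Python; the base rate, sensitivity, and specificity are hypothetical placeholders, not the numbers from his actual case.

```python
# A minimal sketch of the kind of Bayes' Theorem calculation Kling describes.
# All numbers below are hypothetical placeholders, not his actual figures.

def posterior_given_positive(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' Theorem."""
    p_pos_given_disease = sensitivity
    p_pos_given_healthy = 1 - specificity
    p_pos = prior * p_pos_given_disease + (1 - prior) * p_pos_given_healthy
    return prior * p_pos_given_disease / p_pos

prior = 0.01          # hypothetical base rate of the condition
sensitivity = 0.90    # P(test positive | disease present)
specificity = 0.95    # P(test negative | disease absent)

post = posterior_given_positive(prior, sensitivity, specificity)
print(f"P(disease | positive test) = {post:.1%}")
# With these numbers the posterior is only about 15%, so even a positive
# result leaves substantial uncertainty -- the test may not change the decision.
```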
Why are people so eager for certainty? I think part of the answer is revealed in an interesting Science paper by Colin Camerer. His experiment revolved around a decision-making game known as the Ellsberg paradox. Camerer imaged the brains of people while they placed bets on whether the next card drawn from a deck of twenty cards would be red or black. At first, the players were told how many red cards and black cards were in the deck, so that they could calculate the probability of the next card being a certain color. The next gamble was trickier: subjects were only told the total number of cards in the deck. They had no idea how many red or black cards the deck contained.
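To make the setup concrete, here is a toy simulation of the two gambles. The payoff rule and the uniform prior over deck compositions are my own illustrative assumptions, not Camerer's exact protocol.

```python
import random

# Toy simulation of an Ellsberg-style card task: bet on "red" from a known
# deck versus an ambiguous deck whose red/black split is unknown.

N_CARDS = 20
TRIALS = 100_000

def draw_risky():
    """Known deck: exactly 10 red and 10 black cards."""
    deck = ["red"] * 10 + ["black"] * 10
    return random.choice(deck)

def draw_ambiguous():
    """Ambiguous deck: the split is unknown, modeled here as a uniform
    prior over every possible composition from 0 to 20 red cards."""
    n_red = random.randint(0, N_CARDS)
    deck = ["red"] * n_red + ["black"] * (N_CARDS - n_red)
    return random.choice(deck)

risky_wins = sum(draw_risky() == "red" for _ in range(TRIALS))
ambiguous_wins = sum(draw_ambiguous() == "red" for _ in range(TRIALS))

print(f"P(win | known deck)     ~= {risky_wins / TRIALS:.3f}")
print(f"P(win | ambiguous deck) ~= {ambiguous_wins / TRIALS:.3f}")
# Both come out near 0.5: under a symmetric prior the two bets are equivalent.
```

Under a symmetric prior the two bets are mathematically equivalent, which is exactly what makes people's aversion to the second deck a puzzle for standard decision theory.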
The first gamble corresponds to the theoretical ideal of economics: investors face a set of known risks, and are able to make a decision based upon a few simple mathematical calculations. We know what we don't know, and can easily compensate for our uncertainty. As expected, this wager led to the "rational" parts of the brain becoming active, as subjects computed the odds.

Unfortunately, this isn't how the real world works. In reality, our gambles are clouded by ignorance and ambiguity; we know something about what might happen, but not very much. (For example, it's now clear just how little we actually knew about Iraq pre-invasion.) When Camerer ran this more realistic gambling game, the subjects' brains reacted very differently. With less information to go on, the players exhibited substantially more activity in the amygdala and in the orbitofrontal cortex, which is believed to modulate activity in the amygdala. In other words, we fill in the gaps of our knowledge with fear. This fear creates our bias for certainty, since we always try to minimize our feelings of fear. As a result, we pretend that we have better intelligence about Iraqi WMD than we actually do; we selectively interpret the facts until the uncertainty is removed.
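One standard way decision theorists formalize this "filling in the gaps with fear" is ambiguity aversion in the maxmin expected utility sense (Gilboa and Schmeidler): the agent values an ambiguous bet by its worst-case odds rather than an even-odds estimate. The sketch below is an illustration of that idea with hypothetical payoffs, not a model taken from Camerer's paper.

```python
# Illustration of maxmin-style ambiguity aversion: an ambiguous bet is
# valued at its worst-case probability instead of an even-odds estimate.
# The payoff and the candidate probability set are illustrative assumptions.

PAYOFF = 100  # dollars if the bet on red pays off

def risky_value(p_red=0.5):
    """Known deck: a single, known probability of winning."""
    return p_red * PAYOFF

def ambiguous_value(p_red_candidates=(0.0, 0.25, 0.5, 0.75, 1.0)):
    """Ambiguous deck: evaluate the bet under every composition the agent
    considers possible, then keep only the worst case."""
    return min(p * PAYOFF for p in p_red_candidates)

print(f"Value of betting red on the known deck:     {risky_value():.0f}")
print(f"Value of betting red on the ambiguous deck: {ambiguous_value():.0f}")
# The worst case drags the ambiguous bet's value toward zero, so the agent
# prefers the known deck -- and, by the same logic, prefers feigned certainty.
```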
Camerer also tested patients with lesioned orbitofrontal cortices. (These patients are unable to generate and detect emotions.) Sure enough, because these patients couldn't feel fear, their brains treated both decks equally. Their amygdalas weren't excited by ambiguity, and didn't lead them astray. Because of their debilitating brain injury, these patients behaved perfectly rationally. They exhibited no bias for certainty.
Obviously, it's difficult to reduce something as amorphous as "uncertainty" to a few isolated brain regions. But I think Camerer is right to argue that his "data suggests a general neural circuit responding to degrees of uncertainty, contrary to decision theory."
If we could educate our leaders about this bias for certainty, perhaps they would be less confident in their assertions, especially when it comes to matters of war. Another possibility is to outsource our war decisions to somebody with a damaged orbitofrontal cortex, although that is probably a terrible idea.