Dan Gardner is the New York Times best-selling author of Risk, Future Babble, Superforecasting (co-authored with Philip E. Tetlock), and How Big Things Get Done (co-authored with Bent Flyvbjerg). His books have been published in 26 countries and 20 languages. Prior to becoming an author, Gardner was an award-winning investigative journalist.

Think You Can Explain Tucson? Think Again.

In an ordinary place full of ordinary people, violence explodes. The world's electronic eyes turn and we watch, horrified. Why did it happen? Who or what is to blame? The loudest and most certain seize the microphones and op-ed pages. Fingers point. A flood of anger and recrimination washes over the blood.

Of course, this is a summary of the Tucson shooting and its aftermath, but it's also a summary of the Oklahoma City bombing and the Columbine massacre. In fact, it's a summary of what happens almost every time a genuinely shocking incident explodes into mass consciousness.

There's a reason for that. The human mind does not like disconnected stimuli, perceptions, and thoughts. It demands order. Things must fit together. The universe must make sense. For the most part, this compulsion for mental order serves people well. There wouldn't be almost seven billion of us otherwise. But it can also cause big problems.

One of the biggest stems from the simple fact that, when we wake up in the morning, our brains are not blank slates. They are stuffed with perceptions, thoughts, and beliefs accumulated and evolved over a lifetime. So what happens when we stumble across new information? The brain doesn't assess the information impartially. It can't. It has to maintain order, and that means the new information must fit with the old. So it is profoundly biased when it processes new information.

Information that squares with existing cognitions is embraced with delight. See! How marvellous! More proof the brain's tidy understanding of the world is correct. Information that doesn't fit threatens the mental order, and so it's as welcome in the brain as an invading virus. We don't naturally think to go looking for it -- even though that's precisely what we should do to test our beliefs -- and, if we happen to stumble across it or have it thrust under our noses, we struggle to avoid accepting it for what it is. We are hyper-critical. We rationalize. We dodge and weave.
We do whatever it takes to come up with a reason -- an excuse -- to call it meaningless and unpersuasive. Then we forget it, for memory, too, is deeply biased.

So one day, you turn on the television. There are live pictures of a massacre. It's shocking, disturbing. It has your full attention. Why did it happen? What does it mean? Whether you consciously ask yourself these questions or not, you are already hard at work coming up with an explanation. As research by neuroscientist Michael Gazzaniga vividly illustrated, the brain generates explanatory stories almost as automatically and effortlessly as it controls breathing. Of course, your brain doesn't spin any old story. The story it tells is drawn from your perceptions and beliefs, so it is a perfect fit with your mental universe.

"Mission accomplished, Sarah Palin," tweeted the influential liberal blogger Markos Moulitsas shortly after Congresswoman Gabrielle Giffords and 18 others were shot in Tucson. At the time, almost nothing was known about the man who pulled the trigger. But Moulitsas had long decried the violent rhetoric of right-wingers, and he knew Sarah Palin had produced a map with gunsights on targeted Democratic politicians, including Gabrielle Giffords. That was enough for him. He had an explanation. Everything made sense.

Lots of other liberals were convinced, too. Even when it became clear that the alleged shooter was a very sick man, that the people who knew him feared he might turn violent, that he had no interest in talk radio or the news and didn't follow conventional politics -- even then they stuck with the story that made everything orderly and comprehensible. Evidence to the contrary was simply waved off, belittled, or ignored.

Conservatives protested. Don't jump to conclusions! Stick to the evidence! But if a Republican congresswoman had been shot under similar circumstances during the Bush administration, we can be fairly sure they would have blamed left-wing rhetoric.
In 1995, Republican Newt Gingrich actually managed to see in the sensational case of Susan Smith -- the young mother who drowned her two children -- proof that the United States had been corrupted by "the counter-culture and Lyndon Johnson's Great Society."

Fundamentally, this isn't about liberals and conservatives. It's not even about politics. After Columbine, explanations sprouted like weeds. Some blamed drugs. Others said it was Goth music. Video games. Permissive parenting. One of the most popular explanations was bullying. Much later, Michael Moore's Bowling for Columbine blamed it on guns and a culture of fear. As journalist Dave Cullen demonstrated in his brilliant book Columbine, all these explanatory stories were based on misinformation and half-truths. All ignored contrary evidence. All fit neatly with the existing perceptions and beliefs of those who told them.

The very fact that our explanations mesh with what we already believe is what gives them their power. We feel in our bones that they're true. So they must be. Yet, routinely, they're not.

So how do we avoid this cognitive trap? Good old skepticism. Is all the evidence in? How reliable is it? Are there other possible explanations? Of course we're all natural skeptics when we hear explanations that don't square with what we believe (as Republicans showed last week). The danger lies in the explanations that do fit. That's when skepticism is hard. And essential.

"I may be wrong." That simple thought, taken seriously, is the best and only defence against believing what just ain't so.