Dan Gardner is the New York Times best-selling author of Risk, Future Babble, Superforecasting (co-authored with Philip E. Tetlock), and How Big Things Get Done (co-authored with Bent Flyvbjerg). His books have been published in 26 countries and 20 languages. Prior to becoming an author, Gardner was an award-winning investigative journalist.

Agreeing to disagree

Francis Bacon observed that people tend to ignore evidence contrary to their beliefs. Centuries later, with a world of knowledge at our fingertips, this is more true than ever.

On any given day of the week, readers send e-mails in response to what they read in this space. Some write nice things. Others are not so happy. But in almost every case, my correspondents feel it important to tell me that they do, or do not, agree with me.

So let me say collectively to those of you who write: I don't care.

It's not that I don't care about what readers think, mind you. I do. One of the pleasures of my job is hearing from people who take the time to read what I write and return the favour. (I read all e-mail, incidentally, though I don't always respond. When one types with two fingers, one must limit one's typing time.)

What I do not care about is agreement. Whether you share my opinion or loathe it matters not in the slightest to me.

People usually find this perplexing. Aren't you an opinion writer? Isn't the whole point to convince people you are right?

Actually, no. Not even close. Different opinion writers may have different opinions -- that's the nature of the beast -- but for me the point of publishing points of view on important social or political issues is something else entirely.

"The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it," Sir Francis Bacon wisely observed almost 400 years ago. "And though there be a greater weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusion may remain inviolate."

The tendency Bacon identified is what modern psychologists have dubbed "confirmation bias." Once we have an opinion of any sort, we seek to confirm it. It is a universal human tendency. People promiscuously embrace information that supports their views, while ignoring, belittling or severely scrutinizing anything that disputes them.

One of the many studies revealing this bias at work was conducted by Stanford University psychologists Charles Lord, Lee Ross and Mark Lepper. It was 1979. Capital punishment was a hot topic in the United States. So Lord, Ross and Lepper assembled 48 Stanford students. Half believed the death penalty deterred crime; half believed it did not.

The researchers presented the students with brief summaries of studies about the deterrent value of capital punishment. One study concluded that it does deter crime; a second concluded it does not. The researchers also had the students read detailed explanations of the methods used in each study, along with criticisms of the studies and rebuttals of those criticisms. All this information was very carefully designed to be balanced so that, objectively, the two studies were of equal weight.

At each stage, students were asked about the persuasiveness of the information. At the end of the experiment, the students' beliefs about capital punishment were tested once again.

"Logically," the psychologists wrote, "one might expect mixed evidence to produce some moderation in the views expressed by opposing factions. At worst, one might expect such inconclusive evidence to be ignored." But logic had nothing to do with the results.
Students on both sides of the issue judged the study that supported their belief to be strong and compelling, while dismissing the study that contradicted their belief as flawed and unconvincing. And so their opinions hardened. Those who believed capital punishment deters crime left the experiment believing more strongly that they were right; so did those who believed the opposite.

It was an almost crystalline demonstration of the observation Sir Francis Bacon made so long ago. People really are biased.

And that bias runs deep. Researchers using MRI scans have discovered that people confronted with information that supports or contradicts their political views actually process the two types of information with different parts of their brains. They also found that the brain's emotional circuitry plays a major part in the processing -- which explains why we find contradiction unpleasant, while confirmation is like slipping into a warm bath.

This has enormous implications for our Information Age. It is a truism that information, and access to it, is rapidly multiplying. It is almost as widely believed that this is a good thing. But is it?

It would be if people dealt with information rationally -- if we processed information impartially, that is, and let it shape our beliefs. But that isn't our natural tendency. Our beliefs shape the information, not the other way around. And so the proliferation of information -- thanks mainly to the Internet and the explosion of computing power -- may simply give people an ever-greater ability to rationalize and entrench their views. It's prejudice powered by Moore's Law.

Look at the blogosphere. With few exceptions, political blogs are written by people of fixed views for others who share those views. Contradiction is not welcome in these warm baths of confirmation, and so readers are exposed only to information that reinforces their views -- which makes those views stronger and leads many to the extreme conclusion that only fools and scoundrels could disagree. Psychologists will never devise a better demonstration of confirmation bias and its pernicious effects on rational discourse.

Of course, as Bacon insisted, we are not slaves to our biases. With effort, we can be more rational. The first step is recognizing the problem. We are biased. I am. You are. Everybody is. The next is to get out of the warm bath: look for information that contradicts your views and give it real consideration -- while remaining aware that the brain doing the considering is biased against it.

This, I believe, is the point of publishing opinion writers in newspapers. It's not to tell people what to believe. It's to offer information readers may not know, and perspectives they may not have considered, so they can rationally question their own beliefs. Whether readers ultimately agree or disagree with the writer is irrelevant.

Now and then, an e-mail lands in my inbox from a reader who says they don't necessarily agree with something I wrote, but it really made them think.

That's it. That's the whole point. I do love those e-mails.