The morning after a man opened fire at a Parti Québécois rally, killing one person and injuring another, we knew the police had arrested a man at the scene. We knew the man had worn a ski mask and a bathrobe. We knew he was 62 years old and lived in Quebec, but was not a Montrealer. And there were reports that he had ranted "the Anglos are waking up," among other angry political statements.
So we knew next to nothing. We didn't even know the man's name. And yet many people immediately turned to Twitter, blogs, and newspaper websites to express what they were sure of.
This wasn't about politics, they said. The alleged shooter is insane. Just look at the way he was dressed. This was a meaningless tragedy.
After all, it's obvious that anyone who would kill in the name of politics is insane.
Right away, let's get that last claim off the table: Human history is positively stuffed with otherwise ordinary people who murdered, tortured, and committed all sorts of other horrible acts in the name of politics. Unless we are using "insane" as a secular synonym for "evil" - a common source of confusion - it is simply false to say that someone who kills in the name of politics must be, ipso facto, insane. Horrible, despicable, and repugnant. But not insane.
But what about the rest of what people claimed? Is the shooter insane? Was it a meaningless act that has nothing to do with politics? It's quite possible. But at the time, we had almost no information about the alleged shooter or his motives.
So how did people make the huge jump from "this may be true" to "this is certain"?
Very simply, they confused believing with knowing. It's one of the most common traps in human cognition. Indeed, it's so common, you might even say it's not a trap at all. It's human nature.
In a series of famous experiments, neuroscientist Michael Gazzaniga worked with "split-brain" patients - meaning the connection between the left and right hemispheres of their brain had been severed (usually as a treatment for severe epilepsy). These people function remarkably well, in general, but the fact that their brain's two hemispheres are not connected, and cannot communicate, has some strange consequences. Gazzaniga made use of one.
Gazzaniga showed a written instruction - such as "go open the window on the other side of the room" - only in the patient's left visual field, which is processed by the right hemisphere. The left hemisphere saw nothing. As a result, the right hemisphere got the instruction while the left remained in the dark.
As directed, the patient stood and started to walk across the room. Then Gazzaniga interrupted. "Why are you walking across the room?" he asked. The left hemisphere handles explanations, but it didn't know what the explanation was.
Test subjects should have answered "I don't know." They didn't. Instead, the left hemisphere invented a plausible explanation. "I'm going for a soda."
But the left hemisphere didn't present "I'm going for a soda" as a mere hypothesis, a hunch, a reasonable guess. No, it was a fact. The left hemisphere was sure it knew the answer. So the person would respond, "I'm going for a soda." And he was sure that was the truth.
Admittedly, Gazzaniga's experiment is a little freaky. But what he observed isn't abnormal. Not in the least.
The human brain is a compulsive explainer. It's constantly churning out hypotheses based on current observations, prior experiences, and existing beliefs. It does that automatically, effortlessly. And quickly.
We don't experience these explanations as interesting but unproven hunches worthy of further investigation. They simply feel true. "Yes, I'm going for a soda. I'm certain of it."
That's the "feeling of knowing," as neuroscientist Robert Burton called it. It's very compelling. But as Burton explained in his book On Being Certain, the feeling of knowing arises "out of primary brain mechanisms that, like love or anger, function independently of rationality or reason. Feeling correct or certain isn't a deliberate conclusion or conscious choice. It is a mental sensation that happens to us."
As a result, if we're not very careful, we don't treat our brain's guesses as only that. They feel true. So we treat them as The Truth.
That actually makes sense from an evolutionary perspective. If our ancestors saw a shadow moving in the long grass and thought "it may be a lion, but that's only a plausible hypothesis pending consideration of lion prevalence statistics and migratory habits ..." they probably wouldn't have lived long enough to become our ancestors. But we live in a very different environment than the one in which our brains evolved, and today, false certainty can often get us into more trouble than doubting our beliefs.
"It's not easy, of course, but some-how we must incorporate what neuroscience is telling us about the limits of knowing into our everyday lives," Burton writes. "We must accept that how we think isn't entirely within our control. Perhaps the easiest solution would be to substitute the word 'believe' for 'know.' A physician faced with an unsubstantiated gut feeling might say, 'I believe there's an effect despite the lack of evidence,' not, 'I'm sure there's an effect.' "
I believe - using Burton's terms - that the tragedy Tuesday night was primarily or entirely a manifestation of mental illness and is otherwise meaningless. I also hope that it is, because the alternative is frightening.
But believing is not knowing. Neither is hoping.
To know, we must recognize that distinction. Then we must wait for evidence and think - consciously, slowly, and carefully.