How Facts Backfire
Researchers discover a surprising threat to reality: our brains
8/21/2010
It is uncommon for “FACTS” to actually matter to anyone who seeks comfort or conformity. Wow. Weird and often-weak species, these humans.
(By Joe Keohane, July 11, 2010)
It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted,” Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble pamphlets to debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better people. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.
In the end, truth will out. Won’t it?
Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
In the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon—known as “backfire”—is “a natural defense mechanism to avoid that cognitive dissonance.”
These findings open a long-running argument about the ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.
This effect is only heightened by the information glut, which offers—alongside an unprecedented amount of good information—endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.
On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest opinions. A striking recent example was a study done in 2000 by James Kuklinski of the University of Illinois at Urbana-Champaign, an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare—the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct—but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong anti-welfare bias.)
Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem”. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”
What’s going on? How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek comfort, conformity, and consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.
New research, published in the journal Political Behavior last month, suggests that once those facts—or “facts”—are internalized, they are very difficult to budge.
It’s unclear what is driving the behavior—it could range from simple defensiveness, to people working harder to defend their initial beliefs—but as Nyhan dryly put it, “It’s hard to be optimistic about the effectiveness of fact-checking.”
It would be reassuring to think that political scientists and psychologists have come up with a way to counter this problem, but that would be getting ahead of ourselves. The persistence of political misperceptions remains a young field of inquiry. “It’s very much up in the air,” says Nyhan.
But researchers are working on it. One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not been. In other words, if you feel good about yourself, you’ll listen—and if you feel insecure or threatened, you won’t. This would also explain why demagogues [leaders who seek support by appealing to the desires and prejudices of the masses rather than to facts or rational argument] benefit from keeping people agitated and fearful. The more threatened people feel [by the real facts], the less likely they are to listen, and the more easily controlled they are.
There are also some cases where directness works. Kuklinski’s welfare study suggested that people will actually update their beliefs if you hit them “between the eyes” with bluntly presented, objective facts that contradict their preconceived ideas. He asked one group of participants what percentage of its budget they believed the federal government spent on welfare, and what percentage they believed the government should spend. Another group was given the same questions, but the second group was immediately told the correct percentage the government spends on welfare (1 percent). They were then asked, with that in mind, what the government should spend. Regardless of how wrong they had been before receiving the information, the second group indeed adjusted their answer to reflect the correct fact.
Kuklinski’s study, however, involved people getting information directly from researchers in a highly interactive way. When Nyhan attempted to deliver the correction in a more real-world fashion, via a news article, it backfired. Even if people do accept the new information, it might not stick over the long term, or it may just have no effect on their opinions. In 2007 John Sides of George Washington University and Jack Citrin of the University of California at Berkeley studied whether providing misled people with correct information about the proportion of immigrants in the US population would affect their views on immigration. It did not.
And if you harbor the notion that the solution is “more education” and a higher level of political sophistication overall, well, that’s a start, but not the solution. A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong. Taber and Lodge found this alarming, because engaged, sophisticated thinkers are “the very folks on whom democratic theory relies most heavily.”
In an ideal world, citizens would be able to maintain constant vigilance, monitoring both the information they receive and the way their brains are processing it. But keeping atop the news takes time and effort. And relentless self-questioning, as centuries of philosophers have shown, can be exhausting. Our brains are designed to create cognitive shortcuts—inference, intuition, and so forth—to avoid precisely that sort of discomfort while coping with the rush of information we receive on a daily basis. Without those shortcuts, few things would ever get done. Unfortunately, with them, we’re easily suckered by falsehoods.
Nyhan ultimately recommends a supply-side approach. Instead of focusing on citizens and consumers of misinformation, he suggests looking at the sources. [“Consider the source” and their motives.] If you increase the “reputational costs” of peddling bad info, he suggests, you might discourage people from doing it so often. “So if you go on ‘Meet the Press’ and you get hammered for saying something misleading,” he says, “you’d think twice before you go and do it again.”
(Although, perhaps, if the perpetrator of falsehood lost his job and his house, and went to jail for misrepresenting the truth, it would be an even more effective deterrent to libel and misinformation. When deceit and misinformation are cost-free to their sources [in this lifetime], you can be sure they will continue as weapons of often hidden motives.)
Unfortunately, this shame-based solution may be as implausible as it is sensible. Fast-talking pundits have ascended to the realm of highly lucrative popularity, while professional fact-checking operations languish in the dungeons of wonkery. Getting a quack to feel shame? That isn’t easy.
(Joe Keohane is a writer in New York; The New York Times Company)
Begin forwarded message:
I couldn’t help connecting this as I read the above article. Nothing new under the sun—and—Be not unaware…—s
“I note what you say about guiding your patient’s reading and taking care that he sees a good deal of his materialist friend. But are you not being a trifle naïf? It sounds as if you supposed that argument was the way to keep him out of the Enemy’s clutches. That might have been so if he had lived a few centuries earlier. At that time the humans still knew pretty well when a thing was proved and when it was not; and if it was proved they really believed it. They still connected thinking with doing and were prepared to alter their way of life as the result of a chain of reasoning. But what with the weekly press and other such weapons, we have largely altered that. Your man has been accustomed, ever since he was a boy, to having a dozen incompatible philosophies dancing about together inside his head. He doesn’t think of doctrines as primarily “true” or “false,” but as “academic” or “practical,” “outworn” or “contemporary,” “conventional” or “ruthless.” Jargon, not argument, is your best ally in keeping him from the Church.”
“Thanks to processes which we set at work in them centuries ago, they find it all but impossible to believe in the unfamiliar while the familiar is before their eyes. Keep pressing home on him the ordinariness of things. Above all, do not attempt to use science (I mean the real sciences) as a defense against Christianity. They will positively encourage him to think about realities he can’t touch and see. There have been sad cases among the modern physicists. If he must dabble in science, keep him on economics and sociology; don’t let him get away from that invaluable “real life.” But the best of all is to let him read no science but to give him a grand general idea that he knows it all and that everything he happens to have picked up in casual talk and reading is “the results of modern investigation.” Do remember you are there to fuddle him…”
(The Screwtape Letters, C. S. Lewis)