The Wrong Stuff

What we don't know about how to correct misinformation

Pushing back against political misinformation has lately become a growth industry. The Obama administration is trying to counter false claims that proposed health care reforms will lead to government-sponsored euthanasia, both via appeals from the president and on a new Web site. Meanwhile, the British government, a sort of innocent bystander to the debate, is quietly setting the record straight about its own form of universal health care. And, as Michael Calderone reported in Politico, MSNBC recently devoted a lot of time to the unhinged “birther” theories about the president’s provenance, in order to mock or debunk them.

So will any of these efforts be successful? Not likely. Once factually inaccurate ideas take hold in people’s minds, there are no reliable strategies to dislodge them—especially from the minds of those for whom the misinformation is most ideologically convenient. That’s the upshot of the work of Brendan Nyhan, a political scientist and blogger. Nyhan has been wrestling with the question of how to correct misperceptions for years—he helped run the now-defunct Spinsanity, a sort of precursor to current Web sites like Factcheck.org and the St. Petersburg Times’s PolitiFact—but his recent research with his colleague Jason Reifler raises the question of whether this battle can be won.

In one experiment (PDF), Nyhan and Reifler asked college students to read faux newspaper articles in which then-President George W. Bush said or implied things that were untrue—either that Saddam Hussein possessed weapons of mass destruction just before the invasion of Iraq, or that the tax cuts in his first term had increased federal revenues. The articles given to some of the students also contained detailed corrective material—a lengthy paragraph detailing government reports on the absence of WMD, or documenting the decline in tax revenues.

The result? The corrections were often successful in reducing misperceptions among readers who weren’t predisposed to believe the false statements. But they didn’t affect those people who had a motive to be mistaken—and in some cases, such as conservatives who believed that WMD were present, the corrections actually backfired, making the subjects more likely to believe the false information.

This sort of cognitive truculence isn’t limited to conservatives. Nyhan and Reifler conducted a similar experiment with a mock article falsely claiming that Bush had “banned” stem cell research—an untruth that liberals were nonetheless likely to believe. They found similar results, with liberals now the group resistant to correction. (That result showed no evidence of a backfire effect, however.)

Nyhan and Reifler’s work builds on other recent research showing that myths are hard to dispel, and that people believe what they want to believe. “Very often people are cognitive misers, trying to get by without thinking too deeply,” said Yaacov Schul, a professor of psychology at The Hebrew University of Jerusalem, whose work has been cited by Nyhan and Reifler. And beyond political biases, there are cognitive constraints in play. A reader who encounters a sentence like “John Doe said he did not commit adultery” immediately creates a mental association between John Doe and adultery and attaches the qualifier “not.” But often, Schul said, “with time, the qualifier disappears… and the [connection] remains intact.”

Efforts to refute misinformation are most effective when a false claim can be countered with a clear-cut alternative narrative—something that creates a mental image “as vivid, as strong” as what you’re trying to negate, said Schul’s colleague, Ruth Mayo. “The problem,” she said, is “that for most misinformation there isn’t any” such alternative—in the case of the example above, “you don’t have any way in your mind to represent ‘not adultery.’” This concept seems relevant to the current debate. What’s the opposite, for example, of a government death panel that wants to kill your grandma?

The result of all this is that, as Yale political scientist John Bullock puts it, the media is “going to have a very hard time trying to persuade Republicans that a particular attack on Obama is false, no matter how good a job they do, and vice versa.” And there’s another wrinkle to the problem, Bullock said. Research suggests that even if a specific factual misunderstanding can be refuted, the judgment it facilitated is likely to remain. In other words, even if the idea that the government wants to create “death panels” is discredited, the suspicion of health care reform that the belief fostered will linger.

That’s discouraging news for Obama’s health reform ambitions—but it’s also pretty disheartening for the press, which counts as one of its core responsibilities the communication of accurate information about the world, with the expectation that that information will inform public debate. The media itself, of course, is often culpable in spreading misinformation. But if people, at a certain point, aren’t listening, are even good journalists—the skeptical, truth-telling sort, who challenge authority, check the claims they report, and speak truth to power—really communicating?

Nyhan, for one, is not quite ready to despair, and he continues to urge the media to do a better job of policing misinformation. On his blog, he criticizes newspapers for offering weak objections to political inaccuracies—he was critical, for example, of a recent New York Times primer on the health care debate that said only that the euthanasia claims “appear to be unfounded.” There’s no evidence that a more forceful fact-check would be more effective in refuting false claims, Nyhan said, but “I hope that they would [do so anyway]—it might provoke a stronger response.” (He added, though, that that response might diminish the press outlet’s credibility in the eyes of some readers.)

An even better press strategy, he believes, is “naming and shaming”—calling out the people who help falsehoods advance, and cutting them off from media access. Such an approach might not change minds on a particular issue, Nyhan said, but it would “increas[e] the reputational costs” of spreading lies, and thus create a climate in which truthfulness and accuracy were more prized.

This approach may bear fruit, but it has a few potential weaknesses. One is that major media outlets are not the gatekeepers they once were. Another is that the strategy presumes a level of coordination that doesn’t exist within newsrooms, let alone across the industry. When an ABC News reporter, for example, sets out to shame former New York Lt. Gov. Betsy McCaughey for her role in advancing the euthanasia myth, there’s no reliable way to ensure that others follow suit—or even, for that matter, see the story. And a third is that some people who foster misinformation—for example, sitting Senators who are at the center of ongoing negotiations on health care reform—can’t really be shunned.

So where does all this leave the individual reporter, working on a specific story for a general audience, who wants to debunk a false statement made by a subject? “The best chance,” Schul said, “is to tell a good story—you want to create a causal chain that links the new information to evidence the perceiver already knows so that it can modify the old interpretation [with] the one you wish to implant.” Nyhan suggested another idea: find someone who’s ideologically similar to your target but willing to repudiate the claim, as conservative Republican Sen. Johnny Isakson recently did on the euthanasia story. But such white knights are not always available, and when it comes to the other instances, “I’m at a bit of a loss,” Nyhan said. “We don’t have micro-level evidence about how to frame these stories.”

Of course, all this doesn’t mean that news organizations don’t have a responsibility to ensure that their own content is accurate, and it doesn’t mean that they should throw in the towel when it comes to correcting others. But it does mean that what we do know suggests the orthodox journalistic approach to correcting misperceptions is ineffective, and that we should be looking for a better way to accomplish the task. And if there are any strategies that might help, everyone who produces and consumes serious journalism has an interest in uncovering them. After all, the ability to convey a basic fact is not just about the outcome of any particular policy debate. As Nyhan put it, “It’s a larger question about what’s the actual effect of journalism on readers.”


Greg Marx is an associate editor at CJR. Follow him on Twitter @gregamarx.