Pushing back against political misinformation has lately become a growth industry. The Obama administration is trying to counter false claims that proposed health care reforms will lead to government-sponsored euthanasia, both via appeals from the president and on a new Web site. Meanwhile, the British government, a sort of innocent bystander to the debate, is quietly setting the record straight about its own form of universal health care. And, as Michael Calderone reported in Politico, MSNBC recently devoted a lot of time to the unhinged “birther” theories about the president’s provenance, in order to mock or debunk them.

So will any of these efforts be successful? Not likely. Once factually inaccurate ideas take hold in people’s minds, there are no reliable strategies to dislodge them—especially from the minds of those for whom the misinformation is most ideologically convenient. That’s the upshot of the work of Brendan Nyhan, a political scientist and blogger. Nyhan has been wrestling with how to correct misperceptions for years—he helped run the now-defunct Spinsanity, a sort of precursor to current Web sites like Factcheck.org and the St. Petersburg Times’s PolitiFact—but his recent research with his colleague Jason Reifler raises the question of whether this battle can be won.

In one experiment (PDF), Nyhan and Reifler asked college students to read faux newspaper articles in which then-President George W. Bush said or implied things that were untrue—either that Saddam Hussein possessed weapons of mass destruction just before the invasion of Iraq, or that the tax cuts in his first term had increased federal revenues. The articles given to some of the students also contained detailed corrective material—a lengthy paragraph detailing government reports on the absence of WMD, or documenting the decline in tax revenues.

The result? The corrections were often successful in reducing misperceptions among readers who weren’t predisposed to believe the false statements. But they didn’t affect those people who had a motive to be mistaken—and in some cases, such as conservatives who believed that WMD were present, the corrections actually backfired, making the subjects more likely to believe the false information.

This sort of cognitive truculence isn’t limited to conservatives. Nyhan and Reifler conducted a similar experiment with a mock article falsely claiming that Bush had “banned” stem cell research—an untruth that liberals were nonetheless likely to believe. They found similar results, with liberals now the group resistant to correction. (That result showed no evidence of a backfire effect, however.)

Nyhan and Reifler’s work builds on other recent research showing that myths are hard to dispel, and that people believe what they want to believe. “Very often people are cognitive misers, trying to get by without thinking too deeply,” said Yaacov Schul, a professor of psychology at The Hebrew University of Jerusalem, whose work has been cited by Nyhan and Reifler. And beyond political biases, there are cognitive constraints in play. A reader who encounters a sentence like “John Doe said he did not commit adultery” immediately creates a mental association between John Doe and adultery and attaches the qualifier “not.” But often, Schul said, “with time, the qualifier disappears… and the [connection] remains intact.”

Efforts to refute misinformation are most effective when a false claim can be countered with a clear-cut alternative narrative—something that creates a mental image “as vivid, as strong” as what you’re trying to negate, said Schul’s colleague, Ruth Mayo. “The problem,” she said, is “that for most misinformation there isn’t any” such alternative—in the case of the example above, “you don’t have any way in your mind to represent ‘not adultery.’” This concept seems relevant to the current debate. What’s the opposite, for example, of a government death panel that wants to kill your grandma?

Greg Marx is a CJR staff writer. Follow him on Twitter @gregamarx.