Which of these headlines strikes you as the more persuasive:

“I am not a Muslim, Obama says.”

“I am a Christian, Obama says.”

The first headline is a direct and unequivocal denial of a piece of misinformation that’s had a frustratingly long life. It’s Obama directly addressing the falsehood.

The second option takes a different approach, affirming Obama’s actual religion rather than denying the false one. He’s asserting, not correcting.

Which one is better at convincing people of Obama’s religion? According to recent research into political misinformation, it’s likely the latter.

The study was led by Brendan Nyhan and Jason Reifler, two leading researchers examining political misinformation and the ways in which it can and can’t be refuted, among other topics. Their 2009 paper, “The Effects of Semantics and Social Desirability in Correcting the Obama Muslim Myth,” found that affirmative statements appeared to be more effective than denials at convincing people to abandon or question their incorrect views about President Obama’s religion.

I found their work courtesy of an exhaustive post on You Are Not So Smart, a blog about “self delusion and irrational thinking” by journalist David McRaney.

McRaney spends several thousand words explaining the “backfire effect,” which he nicely summarized in one sentence: “When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.”

As I detailed in a recent column, the backfire effect makes it difficult for the press to effectively debunk misinformation. We present facts and evidence, and it often does nothing to change people’s minds. In fact, it can make people dig in even more. Humans also engage in motivated reasoning, a tendency to let emotions “set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.”

These two important cognitive effects can have a significant impact on society and debates in the public sphere. They also end up negating some of the debunking and reporting work done by the press. My recent attempts to understand the backfire effect and motivated reasoning have transformed into a search for ways to combat these entrenched human phenomena.

I sought out Reifler, an assistant professor of political science at Georgia State University, to learn more about his and his colleagues’ findings regarding affirmative statements and their effect on the Obama Muslim myth. I asked him if there are other ways of presenting information that can debunk lies.

“I’m sure that there are but I don’t know what they are,” he told me, ever the cautious researcher.

Nevertheless, he did offer some encouragement.

“I think we’re moving in that direction,” he said.

Part of the process of discovering what works is to rule out what doesn’t. I listed some of these failed approaches in my previous column, and Nyhan and Reifler provide more evidence in a 2010 paper, “When Corrections Fail: The Persistence of Political Misperceptions,” published in Political Behavior. (Note that their definition of a correction is different from the one used in the press.) In their study, respondents read a mock news article “containing a statement from a political figure that reinforces a widespread misperception.” Some of the articles also included a paragraph of text that refuted (or “corrected”) the misperception and statement.

One article, for example, led with President George W. Bush talking about Iraq and the possibility it “would pass weapons or materials or information to terrorist networks.” It then transitioned to a paragraph citing a CIA report that found Iraq did not, in fact, possess illicit weapons at the time of the U.S.-led invasion. Would these corrective paragraphs influence respondents who believed Iraq had WMDs?

As the researchers write, the corrective sections “frequently fail to reduce misperceptions among the targeted ideological group.”

Then there’s that familiar term: “We also document several instances of a ‘backfire effect’ in which corrections actually increase misperceptions among the group in question.”

So a single, credible refutation within a news article probably isn’t enough to convince people to change their views. But other research suggests that a steady flow of these kinds of corrections could help combat misinformation. The theory is that the more frequently someone is exposed to information that contradicts their incorrect beliefs, the more likely they are to change their views.

“It’s possible there is something to be said for persistence,” Reifler said. “At some point the cost of always being wrong or always getting information that runs counter to what you believe is likely to outweigh the cost of having to change your mind about something. We need to figure out what is the magic breaking or tipping point, or what leads people to get to that tipping point. I think we’re just scratching the surface.”

Craig Silverman is the editor of RegretTheError.com and the author of Regret The Error: How Media Mistakes Pollute the Press and Imperil Free Speech. He is also the editorial director of OpenFile.ca and a columnist for the Toronto Star.