We were actually surprised to find just a small handful of stories that looked at the policy discussion around so-called death panels. The death panels claim may be false in and of itself, but it comes from a larger, very important question: How are we going to provide end-of-life counseling in a way that gives doctors an incentive to do a good job of it, knowing they’re going to get reimbursed? That’s kind of the heart of the policy issue there. That’s what morphed into this claim about government bureaucrats deciding who will get care and who will not. We found fewer than five stories, if I remember correctly, in this whole sample of hundreds of stories, that actually talked in real depth about end-of-life counseling and its complexities and challenges. When the death panels claim came up, it quite often came up in a political context, as part of a political debate, part of day-to-day coverage of politics more than of policy.

Did you notice any patterns where particular news outlets did the debunking frequently and strongly?

The standout there, and it’s not surprising, is the St. Petersburg Times, because it’s the home of PolitiFact. By far it was the leader among newspapers, consistently running long stories that were really PolitiFact stories.

Do you think that outlets like PolitiFact and Factcheck are actually breaking through, though? It seems that if the misinformation is being believed and becoming prevalent, then those outlets playing referee are having trouble reaching a wider audience.

If you look at just our data, you will be surprised and dismayed to find that it was really a small fraction of stories that said, “According to PolitiFact” or “According to Factcheck,” the claim is false. It was really just a handful of stories, which surprised us. But it’s impossible to say how much those outlets are shaping the background information environment that reporters are working in. How much are reporters going to Factcheck, for example, reading what it has to say, and not necessarily building that into their stories in an explicit way? That is very hard to know without interviewing a lot of journalists.

Are they breaking through to the larger public? Again, that’s hard to know, because we can’t just rely on the mainstream news coverage that we looked at in this study to tell us whether or not they’re breaking through. I am sure PolitiFact and Factcheck have their own metrics for figuring out how much traffic they’re getting, but I’m not familiar with those. The larger problem is that it’s increasingly hard to talk about “breaking through” and persuading the public as a whole because of this greater polarization, politically and in terms of media choice. People now have the means to tune in to the news that’s most congenial to their views.

In the report you make a distinction between two journalistic tools, what you term “procedural objectivity” and “substantive objectivity.” What are these and how are they different?

Those are kind of overly academic terms; they’re not terms that I think journalists would use in a news meeting. We were casting around for terminology, because everybody—journalists, scholars—throws around the term objectivity, and it’s a very complicated thing.

Procedural objectivity is following the formulas that promise that, at the end of the day, you will have a balanced, or at least not a biased, story. The he-said-she-said formula. Substantive objectivity is a much harder and riskier thing to do. If all of the evidence points to something being false, then, no bones about it, that’s how you treat it. Don’t go through the motions of saying, “Well, but of course people believe something else on the other side.” That’s sticking your neck out, I think, for mainstream reporters.

Joel Meares is a former CJR assistant editor.