What should we make of the latest tally showing that Republicans fare worse with factcheckers than Democrats do? Last week the Center for Media and Public Affairs, a nonpartisan research group based at George Mason University, reported that, so far during Obama’s second term, GOP statements were three times as likely as claims from Democrats to earn “False” and “Pants on Fire!” verdicts from PolitiFact’s Truth-O-Meter—and only half as likely to be rated “True.” The lead of a brief write-up by Alex Seitz-Wald at Salon.com seemed to take the results at face value:
Many politicians stretch the truth or obfuscate to some degree or another — but does one party do it more than the other? According to a new study from the Center for Media and Public Affairs at George Mason University the answer is an unequivocal yes.
Or, maybe, not so unequivocal: As conservative media watchdog NewsBusters was quick to point out (and as Seitz-Wald acknowledges), the results can also be read as evidence of selection bias at PolitiFact. The press release from the CMPA hints at this interpretation; it notes that the GOP fared worse even in May, despite “controversies over Obama administration statements regarding Benghazi, the IRS, and the Associated Press.” A quote from the group’s president, Robert Lichter, sounds the note again: “While Republicans see a credibility gap in the Obama administration, PolitiFact rates Republicans as the less credible party.”
PolitiFact itself, meanwhile, did its best to stay out of the fray. A brief letter from founder Bill Adair noted simply that the factchecking outlet rates individual statements and doesn’t claim to gauge which party lies more. “We are journalists, not social scientists,” Adair wrote. “We select statements to fact-check based on our news judgment—whether a statement is timely, provocative, whether it’s been repeated and whether readers would wonder if it is true.”
This story has a familiar ring by now. In 2009, political scientist John Sides tallied a few dozen Truth-O-Meter verdicts on claims about healthcare reform and found that Republican statements earned the two worst ratings almost three times as often as Democratic ones. He noted the potential for selection bias but concluded, “the data accord with what casual observation would suggest: opponents of health care reform have been more dishonest than supporters.” In 2011 another political scientist, Eric Ostermeier, found the same three-to-one ratio after counting up more than 500 PolitiFact rulings over 13 months. He drew the opposite conclusion: “it appears the sport of choice is game hunting—and the game is elephants.”
Whatever the reason, a similar pattern seems to hold at The Washington Post’s Fact Checker blog, where by his own counts Glenn Kessler hands out more Pinocchios, on average, to Republican statements. The differences tend to be slight—e.g., a 2.5-Pinocchio average for the GOP versus 2.1 for Democrats in the first half of 2012—and Kessler attributes them to electoral dynamics rather than to any difference between the parties. But an analysis of more than 300 Fact Checker rulings through the end of 2011, by Chris Mooney, found a telling detail: Republicans received nearly three times as many four-Pinocchio rulings. Even controlling for the number of statements checked, they earned the site’s worst rating at twice the rate of Democrats.
These tallies cover different periods and weren’t compiled according to a single methodology. Still, the broad pattern is striking: Republican statements evaluated by factcheckers are consistently two to three times as likely to earn their harshest ratings.
So—for the proverbial engaged citizen (or journalist, or political scientist) who’s looking for clues about the nature of our political discourse, is there any meaning in that pattern? Obviously, the issue of selection bias can’t be ignored, since factcheckers don’t pick statements at random. Does that mean, as Sides wrote last week (seeming to depart from his earlier view), that the data simply don’t “say all that much about the truthfulness of political parties”? Or even, as Jonathan Bernstein added in the Post, that while we should be grateful for the research factcheckers assemble, we should throw out their conclusions altogether?

Republicans make many more false statements the way Republicans have far more personal-relationship or personal financial scandals. Oh, wait a second . . .
Have the fact-checkers taken a look at Eric Holder's testimony before Congress in the James Rosen affair? Rather more serious than the risk-free (for MSMers) bashing of Michele Bachmann.
#1 Posted by Mark Richard, CJR on Tue 4 Jun 2013 at 05:07 PM
More accurately, "They pick statements that seem important, or interesting, or outlandish" to Democrats. "They have a bias toward things that stand out" to Democrats.
We all know that newsrooms are a political monoculture, and while I'm confident that the fact-checkers think they're trying very hard to approach the news objectively, they're as much guided by their own views as any of us. As long as newsrooms maintain themselves as institutions without intellectual diversity, no one outside the tent is going to accept their claims of neutrality.
#2 Posted by Tom T., CJR on Thu 6 Jun 2013 at 07:42 PM
Media bias is so hard to see unless you compare stories between sources. I found an iOS app called Spectrum News Prism that lays out news sources along an axis indicating liberal/centrist/conservative biases. The feeds are fixed, so you can't add sources, but it is an interesting angle on browsing the news and comparing coverage. I searched for "immigration" and the word choice and content placement definitely reveal who leans what way! My serendipitously discovered 2 cents.
#3 Posted by Bobby, CJR on Tue 25 Jun 2013 at 03:07 PM