Political factchecking has grown up over the past three presidential elections. Launched in 2003, FactCheck.org examined candidates’ claims during the 2004 campaign and was even incorrectly cited by Vice President Dick Cheney during a debate. The then-nascent PolitiFact.com analyzed more than 750 political statements throughout the 2008 cycle, garnering a Pulitzer Prize for its efforts. The genre soon grew even more popular amid expansions by PolitiFact and others, and by the 2012 campaign, PolitiFact was topping one million pageviews on some days.
At the same time, there’s been plenty of debate about the influence of all this work. Amid a torrent of bad press during the summer of 2012, an aide to Mitt Romney famously said, “We’re not going to let our campaign be dictated by factcheckers.” It was a blunt admission—one that would likely turn the Truth-O-Meter green. But in response to that and other challenges to the form, PolitiFact’s founder, Bill Adair, wrote that influence on campaigns’ behavior “is a silly measurement of our work. Our mission is to inform readers, not change the behavior of politicians.”
So it’s natural to ask: Does factchecking actually succeed on that front?
That question, among others, was taken up by a group of academics in several papers commissioned by the American Press Institute and released Wednesday. One study compared the effectiveness of various factchecking formats, and another analyzed the diffusion of the genre throughout the industry. The third study gauged whether Americans view factchecking favorably, and whether exposure to it sharpens political knowledge.
In that experiment, researchers randomly assigned groups to read either three PolitiFact stories or three non-political news releases in each of three waves leading up to the 2014 midterm elections. Later, the participants were asked to evaluate the accuracy of fact-based political statements. The results were encouraging: While correct answers were relatively rare across the board, people who had read the factchecks were much more likely to give correct responses (25 percent compared to 16 percent). The gap widened to 11 percentage points among the more politically knowledgeable. And, though past research has suggested that voters tend to resist corrective information that clashes with their ideologies, in this study the gains actually increased when the factual information was inconsistent with respondents’ political beliefs.
It certainly adds meat to Adair’s argument, though the study’s additional findings point to the challenges that remain for the genre. Though most people have favorable views of factchecking, Republicans are far less supportive of factcheck outfits than their Democratic counterparts, and that disparity was also larger among those with high levels of political knowledge. What’s more, interest in factchecking and the likelihood that someone will actually look for it online skews heavily toward the politically sophisticated. Factchecking’s reach is limited as a result, as roughly half of survey respondents said they were unfamiliar with the genre. One takeaway: “Citizens may not always seek out fact-checks, but they can learn a surprising amount from the format if given the chance,” as study co-author Brendan Nyhan wrote Wednesday in The New York Times. (Nyhan is a former CJR contributor, and the API studies were supported by the Democracy Fund, which is also a major supporter of CJR.)
Audiences do have an increasing number of those chances. The number of factchecking stories that appeared in 173 prominent newspapers nationwide increased by more than 50 percent between the 2004 and 2008 elections, according to a separate paper by Nyhan, Jason Reifler, and Lucas Graves. The amount of coverage ballooned by 300 percent between 2008 and 2012.
The research also found, however, that this recent growth of factchecking isn’t happening organically. It has been driven largely by PolitiFact, which has franchised its model to 11 state-level news outlets. But those satellite operations appear to have done little to compel in-state competitors to ramp up their own coverage. The growth of factchecking might accelerate, the authors wrote, if more journalists start “promoting the genre as a high-status practice.”
Of course, whether that’s an end in and of itself remains an open question. Factchecking-based gains in political knowledge have yet to be tested against those gleaned from other types of journalism. Further, it remains unclear whether that increased knowledge actually changes public perception of elected officials. Among other conclusions, the third study released Wednesday found that while factchecks dampened respondents’ approval of non-political public figures, the same didn’t hold true for politicians. “It was the party affiliation of the candidate that had a strong and consistent influence on people’s feelings,” the authors wrote.
In this sense, the success of factchecking depends on how you define its goal: to inform the public, to hold powerbrokers in line, or to change real-world outcomes. But the same can be said of any type of political reporting. “And it’s ridiculous to think,” PolitiFact’s Adair continued in 2012, “that our new form of accountability journalism would suddenly rewrite the traditions of American politics and end decades of lying by candidates and elected officials.” The Romney campaign understood that, and it said so out loud. It remains to be seen if any 2016 candidate similarly tells the truth.

David Uberti is a writer in New York. He was previously a media reporter for Gizmodo Media Group and a staff writer for CJR. Follow him on Twitter @DavidUberti.