Like many champions of a fledgling idea, Trevor Butterworth has a big vision. He talks about revolutions–a revolution in big data, a revolution in exposing bad science, a revolution in journalism.
“This is the counterpoint to all the dismal news about the news media,” Butterworth says. “This is really exciting. This is journalism ascending to a higher level of understanding of the way the world works, and we want to help as many journalists as possible get there.”
What Butterworth is talking about is the increasingly quantitative nature of the stories journalists find themselves telling, and the potential of numbers to hold powerful interests accountable. But the tool he’s using to address it is, for now at least, actually quite small: six statisticians, all volunteers, who make up an advisory board designed to help journalists struggling to sort through reams of data or understand the statistical evidence presented by a research paper.
As the editor of STATS.org, Butterworth has long facilitated an informal advice-giving process for journalists in need of numerical guidance. But it’s only in the last month that the official advisory board became active, after a collaboration with the British charity Sense About Science and the American Statistical Association allowed the site to expand its reach.
That may be particularly crucial for journalists covering the industry-heavy field of medicine, where so many researchers are backed by corporations with mammoth financial interests and findings are often distorted before they reach the editor’s desk. In 2012, Cardiff University researchers hoping to trace the source of distorted medical claims found that the PR machine was the original source of misinformation far more often than was the news media. When stories came from an accurate press release, fewer than 20 percent of them contained exaggerated or false information. But when the press release fudged the results, between 58 and 86 percent of news stories did, too.
The copycat rate reveals the problem with mathematically untrained writers–i.e., most of us–covering quantitative research in an increasingly competitive and slick research world: In the sciences, at least, journalists are losing the power to truly press their sources. When a senator makes claims about campaign financing, a political reporter can call bullshit because they both speak the same language. But when a science journalist on deadline hears that a researcher has cured cancer, and the paper–if she has time to read it–is filled with subtly distorted graphs, she often has to take the researcher’s word for it.
“There are a fair number of journalists who do health reporting, medicine reporting, who don’t know how to challenge these things,” says Jeanne Lenzer, a freelance medical investigative journalist and one of the first users of the advisory board. “I think an organization like STATS is really necessary.”
Lenzer contacted the advisory board when she noticed that a prominent medical researcher appeared to be making public claims that weren’t supported by the data in his research. After a review of his work, one of the board members concluded that the study’s sample size was too small to support any claims of a trend–precisely the sort of claims the researcher had been making.
The conversation Lenzer had with the board member helped her place the researcher’s deceptive statements in a larger context of fraudulent behavior, leading to a more deeply investigative piece that’s still ongoing–which is precisely the role Butterworth is hoping the board can have in journalism.
“It provides real value in the news media, in the news business,” Butterworth says. “This is a way of going beyond rewriting a press release and getting a few quotes and slapping on an interesting hed that you hope will get a click. This is a way of cutting through that worn-out formula and saying, ‘Is this really true?’”