Universities are revamping their curriculums to meet that challenge. Nicholas Lemann, the dean of Columbia Journalism School, which publishes CJR, said that 5 million Americans call themselves journalists of one sort or another, and it’s his mission “to produce the 5 percent that will be paid.” In an effort to do that, in 2010, he helped launch a dual-degree program in journalism and computer science with Columbia’s school of engineering.

“What can you put in somebody’s head that would enable them to enter a field with no barrier to entry, where there’s no professional credentials, and establish themselves as someone who has economic value?” Lemann asked. “Computational journalism is an incredibly appealing answer to that question. We need to go from turning out a graduate with a commodity skill set to turning out someone with a high value skill set.”

Eager to foster progress in the field, academic sponsors enabled the second Computation + Journalism Symposium to forgo the industry sponsorships that provided some of the funding in 2008. Financial backing came instead from the National Science Foundation and a group of media-innovation labs at Georgia Tech, Columbia, Stanford, Northwestern, and Duke.

These research centers are becoming the core of journalism’s R&D community, but one thing that hasn’t changed since 2008 is the news industry’s dismal record of transferring technology. University labs and startup companies have developed all kinds of tools to improve news production, but very little of that work has made its way into newsrooms or media products.

Sam Gassel, who works for Turner Broadcasting and was instrumental in creating CNN.com, told me that open-ended relationships between researchers and newsrooms, without concrete deliverables, weren’t a viable starting point. Emily Bell, the director of Columbia’s Tow Center for Digital Journalism (who sits on CJR’s board of overseers), had a more dire warning:

I don’t think you’ll be able to survive or have a sustainable future as a news organization unless you have an efficient way of technology transfer. It’s still going to be an industry of journalists and storytellers but it’s a technology-driven industry.

The divide between computing research and journalism is closing, but some of the same issues that came up in 2008 surfaced again in 2013. Chief among them is what to do about capital-T “Truth.” It’s an issue that arises whenever even an enormous volume of systematic computational analysis delivers not certainty but only probability.

“If you say to somebody in an article there’s a 40 percent chance that this person is corrupt, you get sued,” said James Hamilton, an economist and public policy professor at Duke. But even when the odds are low, such statistical insights can provide useful tips on which leads to pursue and where to dig for a story, and Hamilton suggested that one could launch a news service offering data-driven tips: “that would be an intermediate product.”

Such a service would be an example of the technology transfer that research centers and newsrooms so badly need. Systems built to help journalists won’t ever be perfect, but they don’t need to be, as long as human innovation and expertise remain part of the equation. One day, it’ll be a beautiful marriage.


Brad Stenger was an organizer of the 2013 Computation + Journalism Symposium with Irfan Essa, Mark Hansen, Ian Bogost, and Nick Diakopoulos. Stenger is a journalist and researcher living in New York City.