Julian Assange is upset with The New York Times for talking with the White House about WikiLeaks’s trove of Afghanistan documents prior to publication. Really, though, he should bite his tongue. The Times’s decision to check with the White House was of great service to WikiLeaks, because it was one of several steps that helped remove any doubt about the authenticity of the Afghanistan documents.

People continue to debate the ethics, legality, and motivations of what WikiLeaks has done. But few, if any, are questioning the origin and accuracy of the documents. People seem to agree they are, in fact, secret military documents. Assange expertly removed questions of accuracy and verification from the conversation by placing the burden for those elements on the shoulders of The New York Times, The Guardian, and Der Spiegel. The Times, in turn, placed some of that burden on the White House.

David Carr of the Times labeled the “War Logs” operation “asymmetrical journalism.” But perhaps asymmetrical journalism is only possible—or best enabled—when accompanied by distributed verification, which is the best way to engineer trust in today’s information environment.

Just as the Internet itself is built with a distributed architecture, the most powerful way to deploy verification in a networked world is via a distributed process that uses multiple nodes, each of which has a certain level of reliability. In this example, the nodes were the Times, the Guardian, Der Spiegel, the White House, and WikiLeaks itself, among other, smaller players.
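To make that idea concrete, here is a minimal back-of-the-envelope sketch of how independent checks compound. The reliability figures are invented purely for illustration, not estimates of any outlet; the only assumption is that each node examines the material in its own way, so their errors are unlikely to overlap.

```python
# Toy model of distributed verification: each node independently checks the
# documents, and each has some probability of catching a forgery.
# The reliability figures below are invented purely for illustration.

node_reliability = {
    "New York Times": 0.90,
    "The Guardian": 0.90,
    "Der Spiegel": 0.90,
    "White House": 0.95,  # e.g., a non-denial when asked to comment
}

def chance_forgery_slips_through(reliabilities):
    """Probability that a fake passes every independent check."""
    p = 1.0
    for r in reliabilities:
        p *= 1.0 - r
    return p

print(chance_forgery_slips_through([0.90]))                     # one node alone: about 0.1
print(chance_forgery_slips_through(node_reliability.values()))  # all four nodes: about 5e-05
```

The design point is independence: a single outlet vouching for the material can be fooled in one way, but several nodes checking it by different methods have to all be fooled at once.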

If WikiLeaks had released the documents on its own, the initial debate and coverage would have focused on whether the material was real, thus delaying any discussion about the contents. By enabling a process of distributed verification, Assange was able to ensure the conversation about the documents moved immediately to the information they contained, not whether they were authentic.

“WikiLeaks was soaking, drowning in data,” Clay Shirky, the author of Cognitive Surplus, told Carr. “What they needed was someone who could tell a story. They needed someone who could bring accuracy and political context to what was being revealed.”

From the moment they were handed the data, the three news organizations faced an unprecedented verification challenge. There was no conceivable way for them to read, let alone verify, all of the documents prior to publication. And no one person had the ability to answer questions or fully interpret the data.

I suspect this kind of challenge—dealing with an abundance of data that offer multiple narratives and potential interpretations—will become more commonplace for news organizations, thanks to the movement towards open data and the ability to digitize and archive huge quantities of information. (Let’s just hope the resulting abundance of data comes in structured form.) The discipline of verification will have to evolve to meet this new challenge. Based on this example, the future of verification seems to be more open, distributed and collaborative processes that mix human intelligence and expertise with machine-based analysis and assistance. It’s a mouthful, but it’s also what seems to have happened here.
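As a rough illustration of the machine-assisted half of that mix, the first pass over a large structured leak is often simple programmatic filtering, so that human expertise is spent on the records that matter. The field names and topic keywords below are hypothetical, a sketch of the general technique rather than a description of any outlet’s actual tooling.

```python
import csv
from collections import defaultdict

# Hypothetical topic areas a reporting team might brainstorm, each mapped to
# search terms. The keywords and the CSV layout are invented for illustration.
TOPICS = {
    "civilian casualties": ["civilian", "casualty", "collateral"],
    "detainee handling": ["detainee", "prisoner", "interrogation"],
    "special operations": ["task force", "special operations"],
}

def bucket_records(path, text_field="summary"):
    """Group structured log records by the topic keywords their text matches."""
    buckets = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as fh:
        for record in csv.DictReader(fh):
            text = (record.get(text_field) or "").lower()
            for topic, keywords in TOPICS.items():
                if any(keyword in text for keyword in keywords):
                    buckets[topic].append(record)
    return buckets

# Usage: buckets = bucket_records("war_logs.csv") yields a much smaller pile
# of records per topic, which reporters then read and verify by hand.
```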

Some details about the verification process used for this story have begun to trickle out. Clint Hendler’s fascinating account for CJR about how this arrangement and the resulting editorial packages came together provided a glimpse of Der Spiegel’s approach to fact checking:

… reporters from the three outlets sat down and divvied up some tasks. Der Spiegel offered to check the logs against incident reports submitted by the German army to their parliament—partly as story research, partly to check their authenticity—and to share their findings. Davies, Goetz, Leigh, and Schmitt brainstormed about fifteen topic areas for which The New York Times’s computer assisted reporting team would try to find relevant logs to be shared with the group. Der Spiegel and The Guardian did their own searching, and also shared fruitful results, search terms, and methods.

That paragraph alone highlights traditional fact checking, data analysis, and a very rare form of collaboration between media outlets. I contacted Der Spiegel to see how the world’s largest fact checking organization handled the verification process, but was told by the head of its research and fact checking department that they weren’t ready to talk about this yet. I also provided questions to the Times, but didn’t receive responses. I hope to follow up in a future column with additional information.

For now, though, the Times detailed some of its approach to verification in “A Note to Readers”:

The Times spent about a month mining the data for disclosures and patterns, verifying and cross-checking with other information sources, and preparing the articles that are published today …

Craig Silverman is the editor of RegretTheError.com and the author of Regret The Error: How Media Mistakes Pollute the Press and Imperil Free Speech. He is also the editorial director of OpenFile.ca and a columnist for the Toronto Star.