Julian Assange is upset with The New York Times for talking with the White House about WikiLeaks’s trove of Afghanistan documents prior to publication. Really, though, he should bite his tongue. The Times’s decision to check with the White House was of great service to WikiLeaks, because it was one of several processes that served to remove any doubts about the authenticity of the Afghanistan documents.
People continue to debate the ethics, legality, and motivations of what WikiLeaks has done. But few, if any, are questioning the origin and accuracy of the documents. People seem to agree they are, in fact, secret military documents. Assange expertly removed accuracy and verification from the conversation by placing the burden for these elements on the shoulders of The New York Times, The Guardian and Der Spiegel. The Times, in turn, placed some of that burden on the White House.
David Carr of the Times labeled the “War Logs” operation “asymmetrical journalism.” But perhaps asymmetrical journalism is only possible—or best enabled—when accompanied by distributed verification, which is the best way to engineer trust in today’s information environment.
Just as the Internet itself is built with a distributed architecture, the most powerful way to deploy verification in a networked world is via a distributed process that uses multiple nodes, each of which has a certain level of reliability. In this example, the nodes were the Times, Guardian, Der Spiegel, the White House, and WikiLeaks itself, among other, smaller players.
If WikiLeaks had released the documents on its own, the initial debate and coverage would have focused on whether the material was real, thus delaying any discussion about the contents. By enabling a process of distributed verification, Assange was able to ensure the conversation about the documents moved immediately to the information they contained, not whether they were authentic.
“WikiLeaks was soaking, drowning in data,” Clay Shirky, the author of Cognitive Surplus, told Carr. “What they needed was someone who could tell a story. They needed someone who could bring accuracy and political context to what was being revealed.”
Going back to the moment when they were handed the data, this was an unprecedented verification challenge for the three news organizations. There was no conceivable way for them to read, let alone verify, all of the documents prior to publication. And no one person had the ability to answer questions or fully interpret the data.
I suspect this kind of challenge—dealing with an abundance of data that offer multiple narratives and potential interpretations—will become more commonplace for news organizations, thanks to the movement towards open data and the ability to digitize and archive huge quantities of information. (Let’s just hope the resulting abundance of data comes in structured form.) The discipline of verification will have to evolve to meet this new challenge. Based on this example, the future of verification seems to be more open, distributed and collaborative processes that mix human intelligence and expertise with machine-based analysis and assistance. It’s a mouthful, but it’s also what seems to have happened here.
Some details about the verification process used for this story have begun to trickle out. Clint Hendler’s fascinating account for CJR about how this arrangement and the resulting editorial packages came together provided a glimpse of Der Spiegel’s approach to fact checking:
… reporters from the three outlets sat down and divvied up some tasks. Der Spiegel offered to check the logs against incident reports submitted by the German army to their parliament—partly as story research, partly to check their authenticity—and to share their findings. Davies, Goetz, Leigh, and Schmitt brainstormed about fifteen topic areas for which The New York Times’s computer assisted reporting team would try to find relevant logs to be shared with the group. Der Spiegel and The Guardian did their own searching, and also shared fruitful results, search terms, and methods.
That paragraph alone highlights traditional fact checking, data analysis, and a very rare form of collaboration between media outlets. I contacted Der Spiegel to see how the world’s largest fact-checking organization handled the verification process, but was told by the head of its research and fact-checking department that they weren’t ready to talk about this yet. I also provided questions to the Times, but didn’t receive responses. I hope to follow up in a future column with additional information.
For now, though, the Times detailed some of its approach to verification in A Note to Readers:
The Times spent about a month mining the data for disclosures and patterns, verifying and cross-checking with other information sources, and preparing the articles that are published today …
To establish confidence in the information, The Times checked a number of the reports against incidents that had been publicly reported or witnessed by our own journalists. Government officials did not dispute that the information was authentic.
Note the last line. Assange can gripe about the White House’s involvement, but the truth is the Times in effect forced the administration to confirm the authenticity of the documents. That’s another big verification win for WikiLeaks. It added a key node to the verification network.
As it turns out, the Times wasn’t alone in approaching the administration. Politico’s Ben Smith offered details about how the three news organizations dealt with the White House:
White House officials I talked to feel the Times was conscientious … The administration was considerably less impressed with the Guardian’s outreach efforts — an administration official described their attempts to verify the reports through the White House and Pentagon as minimal.
Der Spiegel reporters did a little better, requesting comments on a few of the reports, the person added.
There’s one final layer in this verification scheme that’s particularly remarkable. At the same time these news organizations were engaged in a process that served to add credibility to the documents and to WikiLeaks itself, Assange set his group up to act as an arbiter of the accuracy of coverage delivered by these same news organizations. This point was made by NYU journalism professor Jay Rosen, and summed up nicely by Mathew Ingram at GigaOm:
… even after it provided the documents to the media outlets, WikiLeaks still maintained the ultimate control over them — including the ability to publish all 90,000 of them at the same time that the stories based on them appeared in the NYT, Guardian and Der Spiegel. This, Rosen says, provided an almost unprecedented check on the traditional media, since any gaps or omissions from their stories would become obvious. Typically, sources cut exclusive deals with a single outlet, and that entity has the final say over what appears — but WikiLeaks has altered that traditional balance of power.
In the end, there’s a certain brilliance in the way Assange shifted the burden of verification and analysis away from WikiLeaks, while at the same time ensuring he was able to call out mistakes made by the very news organizations that supplied the all-important credibility to his data.
Correction of the Week
“In Monday’s editions of the New York Post, we published a story that confessed wife-killer Johnny Concepcion underwent a liver-transplant operation at New York-Presbyterian Hospital.
“The hospital yesterday issued a statement that no such operation took place. The Post relied on two NYPD sources for its report, and it is now evident they were misinformed. We apologize to our readers for the error.
“Prior to publishing the story, The Post sought official response from New York-Presbyterian. The Post was denied information by the hospital, which stated it could not discuss individual cases because it would be in breach of the Health Information Privacy Act (HIPA).
“Curiously, the hospital now sees itself free to publicly discuss Concepcion’s case.” – New York Post