Although the Archives routinely conducts a review of classified documents to evaluate whether they are eligible for release, its current backlog runs to some 417 million pages, mostly dating from the 1940s through the 1970s. It will only grow, given the natural tendency of government bureaucrats to believe that if their work is important, it must be confidential (or secret, or top secret, or on up the line into “code-word clearances” whose existence is itself classified). Even at its best, the declassification process is cumbersome: representatives of various agencies are entitled to weigh in, and they can block the release of material that raised no concern in the agency where it originated. Hence the gaping holes in some volumes of the official Foreign Relations of the United States series issued by the State Department.

But it is the influx of new material in electronic form that has officials at the Archives reeling. According to Jason Baron, the Archives’ director of litigation, some 32 million e-mails were transferred from the Clinton administration, and the final count from the presidency of George W. Bush is expected to be about 250 million.

“At the present rate of e-mail creation,” says Baron, the Archives “expects to receive over one billion e-mails over the course of the next decade as permanently accessioned records of the government.” If all of those had to be reviewed for potential release under FOIA, he estimates, it would take a hundred people, working ten-hour days 365 days a year, fifty-four years to complete the task. Even the recent creation of a National Declassification Center within the Archives has not inspired optimism about solving the disastrous problem of classification run amok.
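The arithmetic behind Baron's estimate can be checked directly. A minimal back-of-the-envelope sketch (the totals are Baron's; the per-hour review rate is derived here, not stated in the source):

```python
# Back-of-the-envelope check of Baron's FOIA-review estimate.
emails = 1_000_000_000        # one billion accessioned e-mails
reviewers = 100               # people assigned to the task
hours_per_day = 10
days_per_year = 365
years = 54

person_hours = reviewers * hours_per_day * days_per_year * years
rate = emails / person_hours  # e-mails each reviewer clears per hour

print(f"{person_hours:,} person-hours; about {rate:.0f} e-mails per hour")
# → 19,710,000 person-hours; about 51 e-mails per hour
```

In other words, Baron's figures assume each reviewer could clear roughly fifty e-mails an hour, every hour, for fifty-four years, which underlines just how implausible brute-force human review has become.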

With one intelligence agency alone creating a petabyte (a million gigabytes, or the equivalent of 49 million cubic feet of paper) of new classified records every eighteen months, the US Public Interest Declassification Board has recently been entertaining potential schemes for “mass declassification.” The board, of which the author is a member, is an obscure panel created at the inspiration of the late Senator Daniel Patrick Moynihan, the New York Democrat, to advise the president on these matters. At a public board hearing last September, Jeff Jonas, chief scientist of IBM Entity Analytics, based in Las Vegas, asserted that the volume of classified documents may well be “beyond human, brute force review,” and he appealed for the introduction of “some form of machine triage” into the declassification process. Jonas promoted the concept of “context accumulation,” whereby computers would review classified documents and would, over time, gain increasing sophistication, with decreasing amounts of human input, about which documents truly need to be protected.
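The "machine triage" idea can be illustrated with a toy sketch: a scoring function clears or flags the documents it is confident about, routing only the uncertain middle to human reviewers. Everything here, the scorer, the flagged terms, and the thresholds, is a hypothetical illustration, not IBM's or the board's actual method:

```python
# Toy illustration of machine triage for declassification review.
# A crude sensitivity score sorts documents into three bins; only
# the uncertain middle bin goes to human reviewers.

def sensitivity_score(doc: str) -> float:
    """Stand-in for a trained model: fraction of words on a watch list."""
    flagged = {"codeword", "source", "intercept"}
    words = doc.lower().split()
    return sum(w in flagged for w in words) / max(len(words), 1)

def triage(docs, release_below=0.05, withhold_above=0.4):
    release, withhold, human_review = [], [], []
    for doc in docs:
        s = sensitivity_score(doc)
        if s < release_below:
            release.append(doc)        # machine-cleared for release
        elif s > withhold_above:
            withhold.append(doc)       # machine-flagged as sensitive
        else:
            human_review.append(doc)   # uncertain: needs a person
    return release, withhold, human_review

docs = [
    "routine embassy cable about trade talks",
    "intercept from a protected source codeword material",
    "memo mentioning a source in passing",
]
released, withheld, queued = triage(docs)
```

In Jonas's "context accumulation" framing, the scorer would improve as it processed more documents, so the human-review bin would shrink over time rather than stay fixed.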

The threshold challenge would be to persuade federal agencies, especially the many involved in intelligence work, to trust such a bold new process. The hope is that once it took effect, the pace of classification and the number of classified documents would eventually decline, and potentially compromising leaks would become less likely and, some would argue, less necessary.

Informed Consent

Meanwhile, until that new era dawns, the WikiLeaks case provides everyone an additional opportunity to live with the old one. On the substance of the diplomatic cables that were distributed, it was difficult to claim damage to American national security. It may be awkward, say, for Saudi Arabia and certain other Middle Eastern states to have it known through the leaked cables that they are every bit as worried about Iran as are Israel and the United States, if not more so, but that is hardly a threat to anyone’s well-being. And for the Chinese to be identified as complaining that North Korea was behaving like a “spoiled child” is not terribly surprising.

“The members of the Foreign Service owe a great debt to Julian Assange,” observed Charles Peters in the Washington Monthly. “He got their cables read.” And Fareed Zakaria, writing in Time, said the published cables were “actually quite reassuring about the way Washington—or at least the State Department—works.” Indeed, leaked cables revealing foreign-service officers’ assessment of the precarious hold on power of the corrupt regime in Tunisia, just weeks before it fell, looked positively prescient.

In the end, it may have been the unpredictability and loss of control in the WikiLeaks case that most rattled the bureaucracy. Although the government lost both its civil and criminal cases involving the Pentagon Papers, it is remarkable in retrospect to what extent it managed to control the process at the time. In both New York and Washington, federal courts halted publication of the documents for almost two weeks—in effect granting a prior restraint on the free press—while government lawyers attempted to prove that grievous harm to national security was at stake.

Sanford J. Ungar, author of The Papers & The Papers: An Account of the Legal and Political Battle Over the Pentagon Papers, which won a George Polk Award in 1972, is president of Goucher College in Baltimore and a member of the US Public Interest Declassification Board.