
The Leaky Leviathan

2015/06/21

David Pozen’s Harvard Law Review paper [1], “The Leaky Leviathan: Why the Government Condemns and Condones Unlawful Disclosures of Information,” explores leaks, plants, and their combinations (“pleaks”) of confidential government information. Such disclosures are an art form for U.S. politicians (and, I’m sure, for those of other nations). This paper should be required reading for any student of constitutional law or participant in government. For technologists such as readers of this blog, it opens the wrestling match around the Snowden leaks and the battle over the NSA’s previously secret activities to intercept foreign and domestic Internet and telephone communications.

Pozen’s paper analyzes how government actually works. It is easy to make something “secret” in one form or another, but government has created informal ways to incrementally relax secrecy. We’ve all heard newscasters attribute a story to an “unnamed source.” When the story supports the government’s position in a measured way, it is a plant. When the disclosure causes the government too much pain and the unnamed source was not controlled by the government, it is a leak, and top government executives squeal like little piglets at the wrong done. In reality, Pozen writes with tongue in cheek, plants need to be “nourished” by leaks: if all leaks were suppressed, plants would lose their believability and become ineffective as a government tool. He points out that, historically, whistle-blowers and sources of leaks are rarely thrown in jail. They are, however, often shunned, losing valuable access to government executives.

Rahul Sagar’s “Creaky Leviathan” [2] is sometimes cited as a rebuttal, but I found it hugely supportive:

Let me be clear. Leaky Leviathan contains no hasty judgments. Pozen is admirably careful and measured in his praise of permissive enforcement; his broader objective is to show that the varied causes and consequences associated with leaks mean that the prevailing “system of information control defies simple normative assessment.” This conclusion I fully endorse. But the question remains: even if it advances our comprehension, does “middle-range theory” improve our ability to judge the value or worth of the prevailing “system of information control”? For the reasons outlined above, I think the answer must be no, since the overall consequences of permissive enforcement are obscure and hard to ascertain. As far as this “disorderly” system is concerned, the most we can do from a normative perspective, I think, is become clearer about whether particular outcomes it produces are more or less acceptable.

Whistle-blower protection is a dicey topic when it comes to national security, where huge losses of life and treasure are at risk. Counterbalancing that risk is the sheer size of the national security infrastructure: the NSA, the Pentagon and all of the DoD, the CIA, the NSC, and of course the Executive Branch (and some of Congress) represent the government side, and major contractors such as Boeing, L3, Halliburton, McDonnell Douglas, and Raytheon are also in the mix. A system of secrecy often makes it difficult to generate healthy debate; moreover, these institutions are “wired” to worry about and exaggerate threats. Secrecy makes it difficult even for insiders to gain enough information to offer opposing views. Senator McCarthy, J. Edgar Hoover, Nixon, and others required outside forces to neutralize their negative actions. However real the communist threat was, McCarthy and Hoover violated the rights of many U.S. citizens. There were, in the end, no weapons of mass destruction in Iraq, and we went to war over that fabrication. The Maginot Line in France was an utter failure. Examples abound. The whistle-blowing that has been legitimized in the private sector is not well legitimized in the government sector. Current law limits disclosure to Congress and does not cover civilian contractors (thus Snowden was not protected; Manning was, somewhat, but still received 35 years, possibly due more to the choice of WikiLeaks as an outlet). The Leaky Leviathan screams out for a legal structure that fairly protects national security whistle-blowing. Benkler’s paper [3] on a public accountability defense takes a solid crack at this.

Benkler starts with an excellent in-depth review of how we got from 9/11 to the mess that Manning and Snowden disclosed. His “Public Accountability Defense” starts with the observation that most leaks are not prosecuted, because if they were, the non-prosecuted leaks would appear to be sanctioned and would lose their credibility to shape public opinion. He focuses on “accountability leaks”: those that expose substantial illegality, gross incompetence, or error in important matters of national security. These are rare. One set occurred at the confluence of the Vietnam and Cold Wars with the anti-war and civil rights movements. A second deals with extreme post-9/11 tactics and strategies. Such leaks have played a significant role in undermining threats that the national security establishment has posed to the “constitutional order of the United States” – in short, a very big deal. For many of us technologists, the issues we might consider leaking would rack our consciences and destroy our moral compasses, and we would be going far out of our element in resorting to a leak when mechanisms inside the national security system have failed. For example, the CIA’s program of torture, rendition, and secret off-shore prisons somehow leaked without an Ellsberg, a Manning, or a Snowden. (The technology that enabled Ellsberg to release the Pentagon Papers was a massive, highly automated Xerox machine.) Benkler observes, “The greater the incongruity between what the national security system has developed and what public opinion is willing to accept, the greater the national security establishment’s need to prevent the public from becoming informed. The prosecutorial deviation from past practices is best explained as an expression of the mounting urgency felt inside the national security system to prevent public exposure. The defense I propose is intended to reverse that prosecutorial deviation.” What is needed is a defense, or at least a sentencing-mitigation platform, that requires only a belief that the disclosure would expose a substantial “violation of law or systemic error, incompetence, or malfeasance.” The defense rests on the leaker serving a public good. It is not individual-rights based; it is belief based and does not depend on the disclosed conduct later being proven illegal.

This is not a naive proposal, and many subtleties are discussed: to whom the leak is made, how and when it is made, what is redacted, how public the leak mechanism is, and so on.

Benkler reviews several historical leaks, going back to 1942, when Morton Seligman leaked decoded Navy messages to a reporter. Publication would have disclosed that the Japanese codes had been broken, harming the war effort; since no government wrongdoing was exposed, the accountability defense would not apply. The long historical review ends with a discussion of the Manning and Snowden cases. Manning’s 35-year sentence is obscenely excessive even though the criteria for an accountability defense are mixed; one would hope that, in the presence of an accountability defense, at least a more reasonable sentence would have been handed down. A detailed analysis of the Snowden case is given; with the single exception of the leaks on the NSA’s Tailored Access Operations (TAO), which target specific computers, the defense applies. One interesting issue is that the legal defense should be structured so that the prosecution cannot “cherry pick” the weakest disclosure and prosecute that, ignoring the public value of the other disclosures.

The institution through which the leaks are made should also be protected by the defense. In some sense, “free press” protections do this, but the point should be clarified for national defense cases.

Finally, “punishment by process” is discussed. The government can ruin someone in many ways: huge legal expenses, long drawn-out trials, loss of access and jobs, and so on. While protection from punishment by process is desirable, how to provide it still needs to be worked out. I would think technologists fear this the most.

I strongly recommend these three thought-provoking articles.

[1] David Pozen, “The Leaky Leviathan: Why the Government Condemns and Condones Unlawful Disclosures of Information,” Harvard Law Review, Vol. 127, No. 2, December 2013. http://cdn.harvardlawreview.org/wp-content/uploads/pdfs/vol127_pozen.pdf

[2] Rahul Sagar, “Creaky Leviathan: A Comment on David Pozen’s Leaky Leviathan,” Harvard Law Review Forum, Vol. 127, No. 2, December 2013. http://harvardlawreview.org/2013/12/creaky-leviathan-a-comment-on-david-pozens-leaky-leviathan/ [A mostly favorable review of [1].]

[3] Yochai Benkler, “A Public Accountability Defense for National Security Leakers and Whistleblowers,” Harvard Law & Policy Review, Vol. 8, No. 2, July 2014. http://benkler.org/Benkler_Whistleblowerdefense_Prepub.pdf [A well-reasoned and historically justified proposal for the legal structure of a whistle-blower defense, of particular interest to technologists.]


Comments on NIST Interagency Report 7977, Draft 2

2015/03/24

To: crypto-review@nist.gov

Cc: crypto@nist.gov

Re: NIST Interagency Report NISTIR 7977 “NIST Cryptographic Standards and Guidelines Development Process” January 2015 Draft.

Date: March 24, 2015

The January 2015 draft is a big improvement. In addition, the Summary of Public Comments on NISTIR 7977 (February 2014 (first) draft) was quite helpful for comparing how the current draft addresses comments on the first draft. The authors of both new documents are to be commended for their work.

General review comments:

A. Review comments on the first draft of NISTIR 7977 reflect a distrust of the NSA and, by extension, of NIST processes. Suppose China sends NIST an email that says, “Hey, we’ve been studying elliptic curve E over field Z/p with base point Q. It has some neat properties, and China recommends that NIST adopt E for its elliptic curve encryption.” Should NIST consider China’s suggestion? A good test of the processes in NISTIR 7977 would be to answer “yes,” even though we suspect that China knows a backdoor to encryption that uses E. Receiving such a suggestion from the NSA should be treated little differently. Even though NIST is required to consult with the NSA, most independent security consultants, post-Snowden, would not trust a curve suggested by the NSA any more than one suggested by China. I certainly would not. Therefore we look for processes in NISTIR 7977 that politely treat suggestions for tools and algorithms from the NSA with great suspicion. NIST should draw expertise and knowledge from the NSA, but should not blindly accept its tools and algorithms. NIST processes for the analysis and acceptance of any suggestion, be it from the NSA, from China, or from any other source, should be equally stringent. In particular, cryptographic technology should not be standardized without significant consensus from a wide swath of independent cryptographers. The current draft of NISTIR 7977 does not place enough emphasis on public analysis and consensus for acceptance.
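To make the hypothetical concrete, below is a minimal sketch in Python of the local sanity checks anyone can run on a proposed curve and base point. This is not NIST’s actual validation procedure, and all names and numbers are illustrative. Crucially, such checks can only catch gross errors; they cannot detect a trapdoor of the kind suspected in Dual_EC_DRBG, where the problem was a hidden relationship between two published points. Only transparent, verifiably random parameter generation addresses that.

    # Minimal sanity checks on a proposed short-Weierstrass curve
    #     y^2 = x^3 + a*x + b  (mod p)
    # with proposed base point Q = (qx, qy). Illustrative sketch only,
    # not NIST's validation procedure.

    def is_nonsingular(p: int, a: int, b: int) -> bool:
        # The curve is nonsingular iff its discriminant,
        # -16*(4*a^3 + 27*b^2), is nonzero mod p.
        return (4 * a**3 + 27 * b**2) % p != 0

    def is_on_curve(p: int, a: int, b: int, qx: int, qy: int) -> bool:
        # The base point Q must satisfy the curve equation mod p.
        return (qy * qy - (qx**3 + a * qx + b)) % p == 0

    # Toy example: p = 97, a = 2, b = 3, Q = (3, 6).
    assert is_nonsingular(97, 2, 3)
    assert is_on_curve(97, 2, 3, 3, 6)

Real acceptance criteria would go much further: the group order must be a large prime, the curve must resist the known attacks, and, above all, the provenance of the parameters must be publicly auditable. None of that can be read off the numbers alone, which is why process transparency matters.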

B. Of course, NIST could standardize, say, an encryption method, even one from a private and independent source, and the NSA, China’s PLA Unit 61398, the UK’s GCHQ, or another nation-state cryptography group could know how to crack it while remaining silent during and after the standardization process. One would hope that NIST, through its own efforts and its financial support of academic and other cryptography research, would facilitate the discovery of the weakness and would quickly retire the standard. Such post-standardization life-cycle efforts by NIST also need to be part of NISTIR 7977 [line 727].

C. If NIST were to publish a proof that a tool or algorithm received from China, the NSA, or another source in fact had no back doors and was totally secure, then a consensus on standardization might well be easily achieved. If I believed such a proof, I might recommend the proven technology to my clients, but I would probably wait until a large number of non-government cryptographers also believed the proof and were recommending the technology. NISTIR 7977 says, roughly, that NIST will “pursue” the finding and use of such proofs [line 55]. I would be happy if NIST worked with the NSA and other agencies on such proofs, and I recommend that such efforts be part of NISTIR 7977.

D. NIST publishes security papers at a prodigious rate – so fast that reviews are deemed inadequate. In light of post-Snowden caution around NIST processes, people naturally ask whether these poorly reviewed papers can be trusted. It will not help if NIST says, “It’s OK, the NSA has reviewed it…” Not only does the current draft of NISTIR 7977 fail to convince that future NIST papers will receive good independent reviews; there is no indication that past NIST papers will retroactively receive good reviews. This is a very sad state of affairs, but it is fixable.

Some more specific review comments:

  1. Clarity of the NIST mission: to develop strong cryptographic standards and guidelines for meeting U.S. federal agency non-national security and commerce needs. This mission should be parsed as meeting (1) U.S. federal agency non-national security needs and (2) commerce needs. My point is that the needs of commerce should be treated by NIST as equal to the needs of any federal agency [line 202, Balance]. For example, federal agencies may well be happy with NSA algorithms, but general commerce may not be.
  2. I do not agree with the NIST response (page 7 of the Summary) to the Technical Merit comments that NIST should give priority to non-national security federal information systems. NIST should always make commerce needs equally important. Such a priority statement does not seem to appear explicitly in NISTIR 7977, but there are several statements about NIST being legally required to give such a preference when so ordered by a government entity.
  3. NIST’s statement that it will “never knowingly misrepresent or conceal security properties” [Summary page 3; line 215, Integrity] reminds me of Barry Bonds’s statement that he never knowingly took steroids, when his hat/head size at the end of his career was three sizes larger than when he was a rookie. I would prefer a more proactive statement such as “NIST will make every reasonable effort to ensure that military, intelligence, and law enforcement agencies do not, by their suggestions, review comments, or contributions, compromise any security tool or algorithm recommended by NIST.” For NIST standards to regain the confidence of the security and general commerce communities, NIST processes should convincingly ensure, through NIST’s public actions, that its tools and algorithms do not compromise the privacy or integrity of any commercial or private message protected by NIST standards.
  4. FISMA requires NIST to consult with certain federal agencies, including the NSA, to avoid duplication of effort and to maintain synergy in federal information protection efforts, but NIST can never again blindly accept a recommendation from any public or private agency. What is important is that NIST actively regain and maintain its process integrity via the new NISTIR 7977. The current draft falls short.
  5. NIST should consider resolving the conundrum between needing frequent NIST output and needing adequate public review of that output by creating additional paid outside review boards and conformance-testing bodies. One such review board should be established to review the entire Cryptographic Technology Group annually. [lines 316 and 326]
  6. Minutes of the monthly NIST/NSA meetings should be published. [line 377]
  7. Independent review boards should have the power to reject a proposed standard, for example if NIST cannot convince the board that the NSA or another agency has not compromised the standard. [page 4; lines 47, 403, and 464]
  8. The NISTIR 7977 Development Process itself should undergo regular review and updates at, say, an annual frequency.

Requested Comments [line 125]:

Each question posed by NIST is quoted below, followed by my comment.

Do the expanded and revised principles state appropriate drivers and conditions for NIST’s efforts related to cryptographic standards and guidelines?

Yes, but if the word “appropriate” were replaced by “adequate,” then no. Neither the integrity of NIST processes in the face of NSA influence nor the issue of adequate review is satisfactorily addressed. [A, C, D]

Do the revised processes for engaging the cryptographic community provide the necessary inclusivity, transparency and balance to develop strong, trustworthy standards? Are they worded clearly and appropriately? Are there other processes that NIST should consider?

No. “Trustworthy” standards need public confidence that the NSA or another agency has neither added nor knows of backdoors or other weaknesses in its contributions. Wording is not the issue. Different new processes are necessary to separate NIST from the influence of the NSA and related agencies. Post-standardization research efforts should be funded as part of every standard’s life cycle. [B]

Do these processes include appropriate mechanisms to ensure that proposed standards and guidelines are reviewed thoroughly and that the views of interested parties are provided to and considered by NIST? Are there other mechanisms NIST should consider?

No. Cf. A, C, 2, 3, 4, and 7 above. Regarding 7, if NIST will not vest veto power in independent reviewers, such experts will tend not to participate. Lack of review resources also seems to be a problem. Cf. 5 above.

Are there other channels or mechanisms that NIST should consider in order to communicate most effectively with its stakeholders?

Yes: more paid outside reviewers, including an annual review of the Cryptographic Technology Group. Cf. D and 5 above.

Respectfully submitted,

Gayn B. Winters, Ph.D.

Technology Consultant