In these heady days of the Internet, other forms of global communication, and multinational corporations, the need for privacy in electronic communications is greater than ever. Without it, consumers will not make credit card purchases, and companies and individuals will be extremely reluctant to disseminate confidential information to their worldwide offices and to their clients, lest such information fall prey to hacking competitors and criminals.
Encryption is valuable not only for ensuring privacy; it also facilitates "authentication," in that it creates non-forgeable "digital" signatures on electronic documents and provides a fool-proof way of detecting whether anybody has attempted to alter a communication while in transit.1 Thus, in many ways, "paperless" electronic transactions are, at least potentially, both more efficient and safer for the consumer and the seller of goods and services than more standard transactions. Indeed, it would not be an overstatement to say that ensuring electronic privacy is essential to maximize the development of our global cyber-economy.
The art and science of cryptography is almost as old as civilization itself, tracing its roots to ancient Egypt and the time of Julius Caesar, who sent encrypted messages to his field generals in battle by replacing each letter with the letter three places later in the Latin alphabet.2 Cryptography has proven particularly valuable during times of war, enabling our country, for example, to crack the German "Enigma" ciphers (the source of "Ultra" intelligence) and the Japanese "Purple" cipher during World War II, thereby substantially shortening the war and saving thousands of lives.3 In addition to military applications, cryptography plays a vital role within the intelligence community, helping us stay one step ahead of international terrorists and the like.
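Caesar's substitution scheme is simple enough to reconstruct in a few lines. The Python sketch below is offered purely as an illustration (it uses the modern 26-letter alphabet rather than Caesar's Latin one, and is not drawn from the cited sources): shifting each letter three places forward encrypts a message, and shifting three places back decrypts it.

```python
def caesar(text, shift=3):
    """Replace each letter with the letter `shift` places later, wrapping past 'Z'."""
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return ''.join(out)

# Encrypt with a shift of three; decrypt by shifting back.
secret = caesar("ATTACK AT DAWN")        # "DWWDFN DW GDZQ"
original = caesar(secret, shift=-3)      # "ATTACK AT DAWN"
```

Trivial as it is, the example captures the essential idea of every cipher discussed in this article: a reversible transformation whose secret (here, the shift of three) must be known to undo it.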
While computers have played an important role in the area of code-breaking, they have likewise played an important role in the area of code-making. Through the encryption process, readable data (known as "plaintext") is run through a computer program that applies a mathematical algorithm, transforming it into unreadable gibberish (known as "ciphertext"). Decryption is the process whereby the ciphertext is translated back to plaintext by someone possessing the appropriate code or "key."
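The plaintext/ciphertext/key relationship can be illustrated with a deliberately simple symmetric scheme. The toy XOR cipher below is a sketch only (it is not secure, and it is not any product discussed in this article); its purpose is to show that the same key which scrambles the data is required to unscramble it.

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data against the repeating key.

    Applying the function a second time with the same key
    undoes the transformation and restores the original bytes.
    """
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

plaintext = b"Wire $500 to the usual account"
key = b"hypothetical-key"                  # shared secret, known to both parties
ciphertext = xor_cipher(plaintext, key)    # unreadable gibberish without the key
recovered = xor_cipher(ciphertext, key)    # decryption: same operation, same key
```

Real products replace the XOR step with far more complex algorithms, but the structure is the same: without the key, the ciphertext is useless; with it, recovery is immediate.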
Generally speaking, the strength of a particular cryptographic system is gauged by the length of its key and the complexity of its algorithm. Key length is measured in bits, and the number of possible keys doubles with each added bit. So, for example, according to Netscape’s chief scientist, it would take a "trillion trillion years" to break a system using 128-bit encryption, but only a few hours to break a 40-bit system.4 As this statement implies, there are encryption products already in existence whose codes are so complex that they are virtually impossible to break without the proper key, which is oftentimes in the sole possession of the recipient of the information.
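The "doubles with each added bit" point is simple arithmetic: an n-bit key admits 2^n possible keys, so a brute-force attacker's search space doubles with every additional bit. A quick check, using the key lengths discussed in the text:

```python
# Number of possible keys for each key length discussed in the article.
keys_40 = 2 ** 40      # about 1.1 trillion possible keys
keys_56 = 2 ** 56
keys_128 = 2 ** 128

# Adding a single bit doubles the attacker's search space...
assert 2 ** 41 == 2 * keys_40

# ...so a 128-bit key space is 2**88 times larger than a 40-bit one --
# roughly 3 x 10**26 -- which is why the estimated cracking times
# ("a few hours" versus "a trillion trillion years") differ so starkly.
ratio = keys_128 // keys_40
```

The comparison also explains why the 40-bit export ceiling, discussed below, was seen by industry as crippling: the gap between an exportable product and a domestic one was not incremental but astronomical.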
As one might expect, the international market for encryption hardware and software is huge, and getting bigger, its demand being limited only by the demand for computers and cellular telephones. How rapidly this market will, and should, develop is a matter that is being debated in the courts and within the halls of Congress, because of concerns expressed by the law enforcement community that it may not be able to keep up with the technology and the resulting consequences if this technology were to fall into the wrong hands.
Testifying earlier this year before the Senate Select Committee on Intelligence, FBI Director Louis Freeh bluntly stated that, "[l]aw enforcement remains in unanimous agreement that the widespread use of robust non-recovery encryption will devastate our ability to fight crime and terrorism."5 For example, encryption to prevent our intelligence community from collecting data was detected in the Aldrich Ames spy case, and Ramzi Yousef, the convicted mastermind of the World Trade Center bombing and other despicable acts, used encryption products to protect his computer files that related to terrorist activities.6 Encryption has also been used by child pornographers to transmit obscene images over the Internet, and by major drug traffickers, violent gangs, and domestic anti-government groups seeking to stifle government investigators.7
Primarily for this reason, ever since its "Clipper Chip" initiative in 1993, the Clinton Administration’s policy and proposals have all involved the concept of "escrowed" encryption, sometimes referred to as a "key escrow" or "back door" system. An escrowed encryption system would be one in which a "key" to the system is kept "in escrow" by a designated, government-approved agency or third party who can be served with a request, subpoena, or court order (different variations on this theme have been proposed)8 to turn the key over to law enforcement officials without notifying the user. As one might expect, each of these proposals has met with a negative reaction from the computer industry and from civil libertarians.
In addition to escrowed encryption proposals, the other response by the Clinton Administration has been an attempt to forge a compromise by permitting unregulated and unlimited domestic use and distribution of encryption technology, despite objections from the FBI, but severely regulating and limiting the exportation of encryption products. Prior to 1996, the exportation of encryption products was governed by the Arms Export Control Act (AECA)9 and the International Traffic in Arms Regulations (ITAR).10 Both AECA and ITAR, which were administered by the State Department, provided for the regulation of items contained on the U.S. Munitions List, which included encryption products with key-lengths of greater than 40 bits.11
In late 1996, the Clinton Administration transferred authority over the export of non-military encryption to the Commerce Department, which issued its own set of regulations. These regulations provided for exceptions to export restrictions for certain encryption products, including non-recovery encryption software up to a 56-bit key length, so long as the manufacturer submits a plan for developing, manufacturing, and marketing encryption products containing recovery features.12
The Clinton Administration and the law enforcement community face a wide array of formidable opponents. In addition to groups such as the American Civil Liberties Union, the Electronic Frontier Foundation, the Center for Democracy and Technology, and the Electronic Privacy Information Center, a coalition of over 100 business and associations, including Intel, Microsoft, Sun Microsystems, and the Business Software Alliance, recently formed Americans for Computer Privacy (ACP), whose sole goal is to promote pro-encryption legislation.13 These groups generally fear the possibility of "Orwellian snooping" by the government, and fervently believe that encryption restrictions violate fundamental rights to privacy, as well as the First, Fourth, and Fifth Amendments. Suffice it to say that many of these groups are well-financed and highly motivated.
These critics of the government’s position on encryption also argue that any system that is "dumbed-down" to permit immediate access by government investigators would also be more susceptible to hackers and saboteurs. In other words, such a system, designed to prevent crime, would, paradoxically, leave law abiding citizens and companies more susceptible to computer-savvy criminals who desire to steal and misuse sensitive information. If, as has been acknowledged by the Department of Defense, two 17-year-old hackers can penetrate the Pentagon’s computer system,14 how difficult would it be for someone to get access to our personal financial information, medical records, trade secrets, other proprietary information, and the like -- to take or alter as he sees fit -- if companies possessing such data lack sophisticated encryption products? In fact, a variety of so-called "sniffer" programs already exist which enable the sniffer to monitor traffic on the Internet and to copy particular patterns of characters and numbers, usually credit card numbers, for later use.
Critics also argue that maintaining export controls will do nothing more than doom the competitive positions of U.S. firms in terms of supplying the worldwide demand for these products, thereby denying them access to a multi-billion dollar market. In this regard, the computer industry is quick to note that it is one of the few remaining industries in this country that maintains a positive trade balance and some measure of dominance in world markets, which it believes would be seriously jeopardized if companies either cannot deliver high-quality encryption or suffer inordinate delays and high costs associated with obtaining export licenses. Why, critics ask, would a purchaser of encryption software, who, after all, has obviously placed a high premium on security, buy an American product with a key-recovery system when that purchaser can buy an equally-sophisticated product from a company in another country whose government has not required it to turn over the keys? And who will buy a 40-bit or 56-bit system, which can be cracked in a few hours or days by a determined college kid with a PC, from a U.S. manufacturer when one can buy a virtually-uncrackable (at least as of today) 128-bit system from a foreign company?
In the face of such criticism, the Clinton Administration has recently shown signs of relaxing encryption export regulations. For instance, in March, Hewlett-Packard was granted approval by the Department of Commerce to export an encryption package for corporate computer networks that accommodate, but do not require, key escrow applications.15 Last December, Cylink Corp. was granted a license to export strong encryption without a key recovery to members of the European central bank network, and in February, the Commerce Department expanded its definition of "financial institutions" permitted to export strong encryption hardware to include credit card companies and securities firms.16
Despite the recent easing of export restrictions, the debate about encryption shows no signs of abating. There are currently pending before Congress no fewer than five bills dealing with encryption technology, some of which impose additional restrictions and some of which eliminate those restrictions that currently exist.
In the House, Reps. Bob Goodlatte (R-Va.) and Zoe Lofgren (D-Calif.) have proposed the Security and Freedom through Encryption (SAFE) Act.17 As originally proposed, SAFE would prohibit mandatory key escrow and ease export controls. However, SAFE has been subjected to numerous revisions that offend civil libertarians, such as the addition of key-recovery provisions and a provision making it a crime (punishable by up to five years in prison) to use encrypted communications in the commission of a felony.
In the Senate, John McCain (R-Ariz.) and Bob Kerrey (D-Neb.) have introduced the Secure Public Networks Act of 1997,18 which authorizes the export of encryption products without key recovery of up to 56-bit strength to certain buyers within qualified countries. The bill would allow the president to increase the encryption strength of exportable products by executive order and further provides that the president "shall take such action as necessary to increase the encryption strength for encryption products which may be exported if similar products are determined by the President to be widely available for export from other Nations."19 In the absence of an executive order, the bill prohibits the exportation of encryption products with more than 56 bits unless they are "based on a qualified system of key recovery." The bill would also prohibit the government and any state from requiring the escrow of an encryption key for communications between parties within the United States, but does criminalize the use of encryption for any purpose other than those permitted by the Act.
Conrad Burns (R-Mont.) has introduced the Promotion of Commerce On-line in the Digital Era (Pro-CODE) Act of 1997.20 Pro-CODE would essentially eliminate export controls of encryption technology products, by permitting the export of encryption technologies if products of similar strength are available anywhere else in the world and by prohibiting the imposition of mandatory key-recovery programs. The bill would also prohibit both the federal government and state governments from regulating the interstate sale of encryption devices.
Patrick Leahy (D-Vt.) has introduced the Encrypted Communications Privacy Act of 1997,21 which, like Senator Burns’s bill, would eliminate export controls on encryption devices and technology. However, it also offers protection to any United States citizen or entity who uses encryption of any strength in any state or foreign country, and criminalizes the use of encryption when used in furtherance of a crime.
Most recently, John Ashcroft (R-Mo.) and Senator Leahy introduced the Encryption Protects the Rights of Individuals from Violation and Abuse in Cyberspace (E-PRIVACY) Act,22 which would allow companies to export advanced encryption products, after a one-time review of mass-market encryption products and after it is verified that comparable technology is already available in foreign markets; however, exports to certain countries, such as Iraq, Iran, and Libya, would still be banned. The bill would also bar any attempt to assert domestic controls, including a mandatory key-recovery system. The bill attempts to accommodate law enforcement interests by making it a criminal offense to use encryption to hide incriminating evidence, and by establishing a National Electronic Technologies (NET) Center to help law enforcement personnel stay abreast of the latest technologies.
Not surprisingly, the debate has not been limited to Congress. Critics of the Administration’s encryption policies have challenged them in the courts as well, with mixed success.
Daniel Bernstein, a Ph.D. candidate in mathematics, developed an encryption algorithm entitled Snuffle, and published its source code on the Internet. When the State Department advised Bernstein that his encryption program, and the paper discussing it, were "defense articles" subject to AECA and ITAR, thereby requiring the grant of a license prior to export, Bernstein filed a lawsuit in federal court seeking declaratory and injunctive relief, claiming that the AECA and the implementing regulations violated his right to free speech under the First Amendment.
In two separate opinions,23 Judge Marilyn Patel agreed with Bernstein,24 and rejected the government’s argument that his computer program was not speech for purposes of the First Amendment, but rather was unprotected conduct, because its purpose was functional (i.e., to encrypt electronic transmissions) rather than communicative or expressive. Judge Patel also held that AECA and ITAR imposed an unconstitutional prior restraint on Prof. Bernstein’s right to free speech, and found "national security, without more, too amorphous a rationale to abrogate the protections of the First Amendment."25
In a third opinion,26 Judge Patel concluded that the transfer of authority from the State Department to the Commerce Department did not remedy the problem, and held that the Commerce Department’s regulations also constituted an unconstitutional prior restraint on Professor Bernstein’s right to publish his encryption software over the Internet. Although the issue has been briefed and argued, as of this writing, the Ninth Circuit has not issued its opinion in the Bernstein case.
The government met with greater success in Karn v. Department of State.27 In that case, Philip Karn, Jr. wanted to export, for commercial purposes, a book on cryptography containing a source code for certain encryption software, and a diskette containing the same information. The State Department decided that the disk, but not the book, would be classified as a defense article requiring an export license, and Karn filed an action in federal court.
While assuming that Mr. Karn’s source code was speech for purposes of the First Amendment, Judge Charles Richey found the regulations to be content-neutral and stressed the foreign policy and national security rationale behind the export regulations. In so doing, the court stated that Mr. Karn had "needlessly invoked" the federal courts because he has "not been able to persuade the Congress and the Executive Branch that the technology at issue does not endanger national security."28 In short, the court stated that it would "not substitute its policy judgments for that of the President, especially in the area of national security."29 The Karn case has been remanded by the D.C. Circuit to determine what impact, if any, the transfer of authority over non-military encryption exports from the State Department to the Commerce Department should have.30 As of this writing, no additional opinions have been issued in the case.
The government has also prevailed, thus far, in Junger v. Daley.31 In that case, Peter Junger, a law professor at Case Western Reserve University School of Law, filed an action in federal court seeking to invalidate certain provisions of ITAR which pertain to the exportation of encryption products and technology. In addition to the constitutional issues asserted by Karn and Bernstein, Professor Junger also sought a ruling on what, precisely, constitutes an "export" under ITAR. Junger professed to fear that he could be found to have violated ITAR by disclosing cryptographic software and ideas to foreign students (who might subsequently return to their homelands) during the course of their law school studies, and by publishing cryptographic information on the Internet where it can easily be downloaded outside the country. While this may seem far-fetched to some, it is worth noting that the founder of Pretty Good Privacy, Inc., Philip Zimmermann, published an early version of his encryption program on the Internet in 1991, and was the target of a criminal investigation by the U.S. Customs Service. The investigation was terminated in 1996, without an indictment being returned.32
On July 2, 1998, Judge James Gwin granted the government’s motion for summary judgment in the Junger case, concluding that a computer program is inherently functional in that "it is designed to enable a computer to do a designated task," and not sufficiently expressive to warrant First Amendment protection. The court also concluded that the regulations in question seek to restrict the distribution of encryption software, not ideas on encryption.33 Junger has promised to appeal the judge’s ruling.
The problem for the law enforcement community is that, regardless of what happens in the courts or in Congress, technology may soon render much of this debate moot, if it has not already done so. Last year, the Organization for Economic Cooperation and Development, a think tank of developed nations, rejected U.S. pleas to endorse mandatory key escrow,34 and even our allies are not united behind this proposed solution.
Literally hundreds of powerful encryption products are already available to companies, individuals, and criminals alike on the international market. These products are relatively inexpensive, and, unlike drugs or weapons of mass destruction, they are easy to transport and store. In fact, sophisticated encryption software can already be found on the Internet and can be downloaded by anyone with a computer. Soon only the cheapest and dumbest of criminals will be stopped by the government’s policies and restrictions. Further, criminals don’t, by and large, apply for export licenses, and they are equally unlikely to turn over the "keys" to some neutral third party for possible use by law enforcement agents trying to thwart them.
When it comes to encryption technology and products, the genie is out and cannot be put back in the bottle, regardless of the hopes and wishes of the law enforcement community. This is not the first time, though, that the law enforcement community has faced challenges from emerging technologies. Law enforcement officers have managed to overcome the data processing difficulties posed by fax machines, communication networks, and the like. In short, the law enforcement and intelligence community is ultimately going to have to rely, as it has done many times before, on being smarter, faster, and technologically superior if it is going to stay ahead of the curve and continue to be effective at cracking the crook’s code.
*Mr. Malcolm & Mr. Schroeder are partners at the law firm of Malcolm & Schroeder, L.L.P., which specializes in white collar criminal defense, False Claims Act litigation, commercial litigation, and medical malpractice litigation. Prior to the formation of their firm, Mr. Malcolm and Mr. Schroeder were Assistant United States Attorneys in Atlanta assigned to the fraud and public corruption section, and also served with the Office of Independent Counsel as Associate Independent Counsel in Washington, D.C. Mr. Malcolm is the Chairman-Elect of the Criminal Law Practice Group, and Mr. Schroeder is the Co-Chairman of the White Collar Crime Subcommittee.
1. Singhal, A., The Piracy of Privacy? A Fourth Amendment Analysis of Key Escrow Cryptography, 7 Stan. L. & Pol’y Rev. 189, 190-91 (1996).
2. Stender, J.T., Too Many Secrets: Challenges to the Control of Strong Crypto and the National Security Perspective, 30 Case W. Res. J. Int’l L. 287, 288, 299 (1998)(hereinafter referred to as "Stender"); Koffsky, Mark, Comment, Choppy Waters in the Surveillance Data Stream: The Clipper Scheme and the Particularity Clause, 9 High Tech. L.J. 131, 133 (1994).
3. Stender at 300.
4. "Congress Holds the Key to Encryption Regulation," The National Law Journal, pg.B9 (April 20, 1998).
5. "Sides Talk Compromise, But Encryption Policy Lags; Deadlock May Be Harder to Break Than Codes Themselves," New York Law Journal, pg.S3 (April 13, 1998).
6. See Security and Freedom Through Encryption (SAFE) Act: Hearing before the Subcomm. on Courts and Intellectual Property of the Comm. on the Judiciary House of Representatives, 105th Cong. 36-37 (1997)(statement of Robert S. Litt, Deputy Assistant Attorney General, Criminal Division).
7. See The Promotion of Commerce Online in the Digital Era Act: Hearing before the Comm. on Commerce, Science and Transp., 104th Cong. 14 (1996)(statement of Louis J. Freeh, Director, FBI).
8. At the present time, the interception in general of non-voice electronic communications is governed by the Electronic Communications Privacy Act of 1986, Pub.L.No. 99-508, 100 Stat. 1848, which extends most of the protections found in Title III of the Omnibus Crime Control and Safe Streets Act of 1968 to e-mail, other computer-to-computer transmissions, and the like.
9. See 22 U.S.C. § 2778 (1994).
10. See 22 C.F.R. § 120 (1997).
11. Stewart Baker, Government Regulation of Encryption Technology: Frequently Asked Questions, 452 PLI/Pat 287, 293 (Sept. 1996).
12. See 61 Fed. Reg. 68572 (Dec. 30, 1996).
13. "Sides Talk Compromise, But Encryption Policy Lags; Deadlock May Be Harder to Break Than Codes Themselves," New York Law Journal, pg.S3 (April 13, 1998).
14. See "The Radical Center: Cybarians at the Gate," The Recorder, Pg.5 (March 11, 1998).
15. "Talking In Code," The Recorder, pg.6 (April 1, 1998).
16. "Congress Holds the Key to Encryption Regulation," The National Law Journal, pg.B9 (April 20, 1998).
17. House Bill 695, 105th Cong. (1997).
18. Senate Bill 909, 105th Cong. (1997).
19. Critics note that by prohibiting the export of encryption devices when such products are not readily available overseas, this could have the undesirable effect of stifling the incentives to develop new products in this country.
20. Senate Bill 377, 105th Cong. (1997).
21. Senate Bill 376, 105th Cong. (1997).
22. Senate Bill 2067, 105th Cong. (1998).
23. Bernstein v. United States Dep’t of State, 922 F.Supp. 1426 (N.D. Cal. 1996); Bernstein v. United States Dep’t of State, 945 F.Supp. 1279 (N.D. Cal. 1996).
24. Bernstein, 922 F.Supp. at 1436.
25. Bernstein, 945 F.Supp. at 1288.
26. Bernstein, 974 F.Supp. 1288 (N.D. Cal. 1997).
27. 925 F.Supp. 1 (D.D.C. 1996).
28. Id. at 2-3.
29. Id. at 13.
30. Karn v. United States Dep’t of State, 107 F.3d 923 (D.C. Cir. 1997).
31. Junger v. Daley, No. 1:96-CV-1723 (N.D. Ohio).
32. "Battle Over Encryption Export Flares," The National Law Journal, pg.A1 (September 29, 1997).
33. Junger v. Daley, 1998 U.S. Dist. LEXIS 10225 (N.D. Ohio July 2, 1998).
34. Organization for Economic Cooperation and Development, Recommendation of the Council Concerning Guidelines for Cryptography Policy (March 27, 1997).