This paper reviews encryption policy and market trends
and the driving forces behind them. The focus is on the use of encryption for
confidentiality protection, as this has been the area of greatest controversy.
Emphasis is also on U.S. policy, although major developments outside the
U.S. are briefly summarized.
Driving Forces
The driving forces behind encryption policy and technology
arise from two opposing functions: code making and code breaking.
Code Making
The term “code making” is used here loosely to refer
to the use as well as development of encryption products. Code making
serves several purposes, including:
- Protecting proprietary information from corporate
and economic espionage. This includes protecting communications from
eavesdroppers and protecting stored data (documents, e-mail messages,
databases, etc.) from insiders and outsiders who gain unauthorized access.
- Protecting individual privacy, including private
communications and personal records.
- Protecting military and diplomatic secrets from
foreign espionage, and information relating to criminal and terrorist
investigations from those being investigated.
- Preventing crimes which might be facilitated by
eavesdropping. For example, after intercepting a password to a system,
an intruder might log into the system and perform a fraudulent financial
transaction, delete files, or plant a virus. Or, after intercepting
a credit card number, the perpetrator might make illegal purchases against
the cardholder’s account.
- Selling encryption products and services.
- Pursuing the intellectual aspects of code making
and advancing the state of the field.
The stakeholders include corporations as users and
vendors, government agencies, academics, hobbyists, and other organizations
and individuals as users. The underlying goals are both economic and social.
They include information security, economic strength at the corporate
and national level, national security, public safety, crime prevention,
privacy, and academic freedom.
Although needs vary, users generally want strong, robust
encryption that is easy to use and maintain. They want encryption to be
integrated into their application and networking environments. They want
products they can trust, and they want communications products to interoperate
globally so that their international communications are protected from
foreign governments and competitors. The encryption, however, must be
cost effective. Users are unwilling to pay more for encryption, whether
in direct expenditures or in overhead costs, than the perceived threat
warrants.
Manufacturers want to be able to build products at
the lowest possible cost, unencumbered by government regulations. They
seek cost-effective methods for building encryption into their products
and policies that permit sales of their products in as broad a market
as possible. Academics, researchers, and hobbyists want to study encryption
without constraints on what they can do, what they can publish, and whom
they can teach. They wish to contribute to the knowledge base on cryptography.
Code Breaking
The term “code breaking” is also used here loosely,
in this case to mean acquiring access to the plaintext of encrypted data
by some means other than the normal decryption process used by the intended
recipient(s) of the data. Code breaking is achieved either by obtaining
the decryption key through a special key recovery service or by finding
the key through cryptanalysis (e.g., brute force search). It can be employed
by the owner of encrypted data when the decryption key has been lost or
damaged, or by an adversary or some other person who was never intended
to have access. The objectives of code breaking are complementary to those
of code making and include:
- Protecting corporate information from loss
in case the decryption keys are lost or damaged. From a corporate perspective,
losing access to valuable information can be just as serious as losing
control over who has access to it. Corporate interest in code breaking
applies primarily to stored data, but there is some interest in being
able to tap into communications when an employee is under investigation
for wrongful acts.
- Protecting personal records from loss of keys.
- Acquiring the military and diplomatic secrets
of foreign governments, particularly rogue governments.
- Conducting lawful communications intercepts
(wiretaps) and searches of computer files in criminal and terrorist
investigations, including investigations of corporate espionage, fraud,
and other economic crimes, many of which are now transnational. These
crimes can harm individual companies, or worse, the economic stability
of nations. Evidence obtained through wiretaps and searches is among
the most valuable because it captures the subject's own words. In some
cases, intercepted communications provide intelligence in advance of
a criminal or terrorist act so that the act can be averted.
- Selling code breaking products and services
to the owners of data and governments.
- Pursuing the intellectual aspects of code
breaking, including participation in large-scale demonstration projects.
- Testing whether one's own codes are strong. It is
not possible to develop good products without a thorough understanding
of code breaking.
As with code making, the stakeholders include corporations
as users and vendors, government agencies, academics, hobbyists, and other
organizations and individuals as users. The underlying goals are also
similar: information security, economic strength at the corporate and
national level, national security, public safety, crime prevention and
investigation, privacy, and academic freedom. Although code breaking is
normally considered antithetical to privacy, in some situations it is
not, for example when it uncovers a plan to kidnap, molest, or
take hostage innocent persons -- acts which completely destroy the privacy
of their victims.
The above shows that national interests, including
those of corporations, government agencies, and individual citizens, are
served by both code making and code breaking efforts. At the same time,
these interests are threatened by the code making and code breaking activities
of adversaries. Hence, encryption policy must deal with opposing capabilities
and objectives. This is what makes it so difficult. Although the dilemma
is often characterized as one of governments vs. corporations and citizens,
or of national security and law enforcement against security, privacy,
and economic competitiveness as illustrated in Figure 1, the actual dilemma
is considerably more complex. It is how to effectively serve national,
corporate, and individual interests in both code making and code breaking.
Figure 2 illustrates.
Many countries, including the United States, have historically
approached encryption policy by regulating exports of encryption technology
[1] but not its import and use (some countries, including France, Israel,
China, and Russia, have also regulated these functions). This made sense
given that most code breaking efforts were performed by governments against
foreign governments. Encryption was seldom used domestically, so there
was little need for governments or corporations to break domestic codes.
However, the growth of telecommunications and electronic commerce has
changed all that. Use of encryption, both internationally and domestically,
is skyrocketing. There are now strong reasons for corporations and governments
to break domestic codes in limited circumstances, and for manufacturers
to sell strong encryption products internationally in support of global
electronic commerce. These changes demand a new approach to encryption
policy.
Market Trends
Global Proliferation
Encryption is spreading worldwide. As of December 1996,
Trusted Information Systems of Glenwood, Maryland, identified 1393 encryption
products worldwide produced and distributed by 862 companies in at least
68 countries [2]. Of these, 823 (59%) are produced in the U.S. The remaining
570 (41%) are produced in 28 different countries. Implementations are
in hardware, software, firmware, or some combination. Hardware products
include smart cards and PCMCIA cards, which are used for user authentication,
encryption, and digital signatures. Almost half (44%) of the products
implement the Data Encryption Standard (DES). Although the number of product
instances in the field is unknown, at their annual conference in January
1997, RSA Data Security, Inc. reported that they expected the number of
RSA crypto engines to surpass 100 million in the first quarter of 1997.
Software encryption is also spreading through the Internet
and other computer networks. It can be freely downloaded by anyone from
international sites which do not control exports. One web site contains
links for all the algorithms in Bruce Schneier's book, Applied Cryptography
[3]. The widespread use of Pretty Good Privacy (PGP), an e-mail and file
encryption program developed in the United States, is due in part to its
worldwide availability on the Internet (despite U.S. export controls).
While encryption software currently accounts for only
about 1-3% of the total software market, the market is beginning to expand
exponentially with the development of electronic commerce, public networks,
and distributed processing [4]. Also, whereas the majority (75%) of general-purpose
software products available on foreign markets are of U.S. origin, this
is generally not the case with encryption software, where the markets
tend to be more national (see Table 1).
The use of encryption is expected to rise rapidly.
Based on a survey of 1600 U.S. business users, the U.S. Chamber of Commerce
Telecommunications Task Force estimated that 17% of companies used encryption
for confidentiality in 1995. They projected an annual growth rate of 29%,
which would bring this figure to 60% by the year 2000. The 1996 Ernst
& Young and Information Week annual security survey of 1300
information security managers found that 26% used file encryption, 17%
telecommunications encryption, and 6% public-key cryptography [5].
The use of encryption on the World Wide Web is still
quite low. A report released in December 1996 showed that of the 648,613
publicly visible web sites, only 10% offered SSL encryption (see next
section) to protect web communications and only 5% of those offered third-party
certificates for strong authentication [6].
As encryption is coming into greater use, law enforcement
agencies are encountering it more often in criminal and terrorist investigations.
The Computer Analysis and Response Team at FBI Headquarters reported that
the number of cases they handled involving encryption increased from 2%
of 350 cases in 1994 to 5-6% of 500 cases in 1996, a roughly fourfold
increase in the number of such cases over the two-year period.
Application and Network Integration
Encryption is being integrated into software applications,
including word processors, database systems, and spreadsheets. It is also
forming an important building block in the development of network protocols,
which operate at various layers in the protocol stack [7]. Examples are:
- IPsec, the security specifications for the Internet
Protocol (IP), which handles packet delivery and routing across the
Internet. The specifications are optional in IP version 4, but mandatory
in version 6 (IPv6). IP-layer encryption can be used to build virtual
private networks which flow over public links in a process called tunneling
and to support secure Internet applications that run on top of TCP/IP.
- Secure Sockets Layer (SSL), a session-layer
protocol used on the World Wide Web to protect credit card numbers and
other sensitive data transmitted between a user's browser and an Internet
web server through the Hypertext Transfer Protocol (HTTP); and Secure
HTTP (S-HTTP), which integrates encryption into the HTTP protocol.
- Secure Electronic Transactions (SET), a protocol
for secure electronic payments which protects payment information among
users, merchants, and banks.
- Secure/Multipurpose Internet Mail Extensions
(S/MIME), Message Security Protocol (MSP), Privacy Enhanced Mail (PEM),
MIME Object Security Services (MOSS), and MIME/PGP, which are protocols
for secure messaging.
These protocols are used to build secure network applications
for electronic commerce, home banking, electronic mail, distributed computation
and databases, and virtual private Intranets. The effect is to make encryption
ready at hand and easy to use. It can be automatic or at the push of a
button.
Integration of encryption has been facilitated by the
development of Cryptographic Application Programming Interfaces (CAPIs),
which have made it possible to build applications and systems which are
independent of any particular method or implementation. Examples include:
- Platform-Independent Cryptography API (PICA),
which builds on the Public Key Cryptography Standards (PKCS).
- Open Group Generic Crypto Services API (GCS-API).
- Microsoft CryptoAPI.
- RSA Labs Cryptographic Token Interface (Cryptoki,
a.k.a. PKCS #11).
- Intel Computer Data Security Architecture
(CDSA) Cryptographic Services Manager.
- IBM Common Cryptographic Architecture (CCA).
- Spyrus Extensions for Algorithm Agility (SPEX/AA).
These interfaces support a variety of hardware and
software cryptographic engines which implement the low-level cryptographic
algorithms and hash functions (e.g., DES, RC4, SKIPJACK, RSA, Digital Signature
Algorithm (DSA), Diffie-Hellman key exchange, Secure Hash Algorithm-1
(SHA-1), and MD5). CAPIs are used to build higher-level APIs which provide
confidentiality, authentication, integrity, non-repudiation, certificate
management, directory services, key recovery, audit, and so forth in support
of applications. Examples include
- Internet Generic Security Service API (GSS-API).
- Microsoft Security Support Provider Interface.
- Intel CDSA Common Security Services Manager
API.
- TIS/MOSS API.
The International Cryptography Experiment (ICE) is
using CAPIs to demonstrate the development of flexible, cost-effective,
and exportable computer software applications [8].
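The decoupling a CAPI provides can be sketched in a few lines. The interface and toy engine below are purely illustrative (the keyed keystream stands in for a real engine such as DES or RC4); the point is that application code touches only the abstract interface, so engines can be swapped without changing it.

```python
import hashlib
from abc import ABC, abstractmethod

class CryptoProvider(ABC):
    """Minimal CAPI-style interface: applications call encrypt/decrypt
    without knowing which engine implements them."""
    @abstractmethod
    def encrypt(self, key: bytes, plaintext: bytes) -> bytes: ...
    @abstractmethod
    def decrypt(self, key: bytes, ciphertext: bytes) -> bytes: ...

class ToyStreamProvider(CryptoProvider):
    """Illustrative engine only: XORs data with a SHA-256-derived keystream.
    A real provider would wrap DES, RC4, etc. behind the same calls."""
    def _keystream(self, key: bytes, n: int) -> bytes:
        out = b""
        counter = 0
        while len(out) < n:
            out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return out[:n]
    def encrypt(self, key, plaintext):
        ks = self._keystream(key, len(plaintext))
        return bytes(p ^ k for p, k in zip(plaintext, ks))
    def decrypt(self, key, ciphertext):
        return self.encrypt(key, ciphertext)  # XOR stream is self-inverse

def send_secure(provider: CryptoProvider, key: bytes, msg: bytes) -> bytes:
    # Application code depends only on the interface, not the engine.
    return provider.encrypt(key, msg)

provider = ToyStreamProvider()
ct = send_secure(provider, b"k1", b"attack at dawn")
assert provider.decrypt(b"k1", ct) == b"attack at dawn"
```

Swapping in a different `CryptoProvider` subclass leaves `send_secure` untouched, which is exactly the portability that CAPIs aim to give applications.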
Multiple Methods and Interoperability
CAPIs and other security-related APIs have made it
relatively easy to build products that support multiple methods of encryption.
It is not uncommon to find support for at least a half dozen different
algorithms and modes of operation, some of which may be proprietary. For
example, a domestic product might offer a choice of 56-bit DES, 168-bit
triple-DES, 40-bit and 128-bit RC4, and so forth. A product might also
support multiple public-key certificate formats (e.g., X.509, Secure DNS
resource records, and hashed public keys) and multiple protocols and infrastructures
for managing and distributing certificates. Finally, it might support
methods with built-in key archive and recovery capabilities as well as
methods that do not provide key recovery.
Interoperability between products is achieved not by
universal adoption of any single method, but rather by protocols that
negotiate to find the strongest method they have in common or by mechanisms
that let the user pick among several options. This is similar to the way
in which modem protocols negotiate transmission speed or that word processors,
spreadsheet programs, and graphics tools handle multiple data formats.
One consequence of supporting multiple methods is that domestic versions
of products can interoperate with their exportable counterparts using
exportable methods (e.g., algorithms with 40-bit keys). Also, through
open standards, the domestic products of one country can be designed to
interoperate with those of another. Thus, global interoperability is possible
with both exportable and non-exportable encryption products. In some cases,
the cryptographic strength provided by an exportable product can be brought
to the level of a domestic product through a foreign-made security plug-in.
For example, foreign users of Netscape's or Microsoft's 40-bit web browser
can install a 128-bit plug-in (SafePassage) that acts as a proxy between
their 40-bit browser and a 128-bit web server [6].
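The negotiation described above can be sketched as follows. The cipher-suite names and strength rankings are hypothetical stand-ins, not taken from any particular protocol:

```python
# Hypothetical cipher names ranked by key length, for illustration only.
STRENGTH = {"RC4-40": 40, "DES-56": 56, "RC4-128": 128, "3DES-168": 168}

def negotiate(client_offers, server_supports):
    """Pick the strongest method both parties support, the way SSL-style
    handshakes do; returns None if there is no common method."""
    common = [c for c in client_offers if c in server_supports]
    return max(common, key=lambda c: STRENGTH[c]) if common else None

# A domestic client meeting an export-grade server falls back to the
# strongest exportable method they share:
choice = negotiate(["3DES-168", "RC4-128", "DES-56", "RC4-40"],
                   {"DES-56", "RC4-40"})
assert choice == "DES-56"
```

This is why a domestic product and its exportable counterpart can still interoperate: the handshake simply settles on the exportable methods they have in common.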
Key Length
Domestic versions of products often use key lengths
far in excess of what is needed to prevent compromise. For example, the
domestic version of Netscape's Navigator 3.0 offers 128-bit RC4 and 168-bit
Triple-DES. Breaking such keys by brute force is totally infeasible and
could remain so forever.
One reason for the long keys is that advances in computer
technology continually reduce the security afforded by any given key length
[9]. Users want key sizes that will remain secure for the lifetime of
the data and the systems they are using. Another reason is that it is
relatively easy to design and use algorithms with long keys. For example,
RC4 and RC5 take a variable-length key and Triple-DES is constructed out
of standard DES by using three keys and triple encryption. In many application
contexts, the performance degradation from using longer keys is not consequential.
Perhaps the most important factor, however, has been public perception.
When the DES was adopted in 1977, two well-known cryptographers,
Whitfield Diffie and Martin Hellman, argued that 56-bit keys were not
secure [10]. They estimated that they could build a search machine that
would crack keys at a cost of $100 each by 1994. In 1993, Michael Wiener
described a special-purpose architecture for cracking DES keys. He estimated
that a machine with 57,600 search chips and costing $1 million could break
a key in 3.5 hours [11]. Neither of these machines was built, but Wiener's
design in particular was put forth as proof that DES was crackable.
Concern about key length was heightened when a 40-bit
key was cracked by a French student, Damien Doligez, in 8 days using 120
workstations and a few supercomputers [12]. Even though a 56-bit key would
take 65 thousand times longer to break and a 64-bit key 17 million times
longer, the perception was that much longer keys were needed for adequate
protection.
In 1996, a group of seven cryptographers issued a report
recommending that keys be at least 75-90 bits to protect against a well-funded
adversary [13]. The cryptographers estimated that a 40-bit key could be
cracked in 12 minutes and a 56-bit key in 18 months using a $10,000 machine
consisting of 25 Field Programmable Gate Array (FPGA) chips. Each chip
would cost $200 and test 30 million keys per second. For $10 million,
a machine with 25,000 FPGA chips could crack a 56-bit DES key in 13 hours;
one with 250,000 Application-Specific Integrated Circuit (ASIC) chips costing
$10 each could do it in 6 minutes. By comparison, the National Security Agency
estimated it would take 10 minutes to crack a 40-bit key and 1 year and
87.5 days to crack a 56-bit key on a Cray T3D supercomputer with 1024
nodes and costing $30 million. Table 2 shows the estimates for the FPGA
and ASIC architectures and for the Cray (row 3). The first row corresponds
to the actual attack carried out by the French student.
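The estimates above follow from simple arithmetic: a brute-force search tries half the keyspace on average, so the expected time is 2^(k-1) divided by the search rate. A quick check against the report's $10,000 FPGA machine (25 chips at 30 million keys per second each):

```python
def avg_crack_time_seconds(key_bits: int, keys_per_second: float) -> float:
    """Expected brute-force time: half the keyspace is searched on average."""
    return 2 ** (key_bits - 1) / keys_per_second

# The $10,000 machine from the report: 25 FPGA chips, 30 million keys/s each.
rate = 25 * 30e6

minutes_40 = avg_crack_time_seconds(40, rate) / 60
years_56 = avg_crack_time_seconds(56, rate) / (86400 * 365)
print(f"40-bit key: {minutes_40:.0f} minutes")   # ~12 minutes
print(f"56-bit key: {years_56:.1f} years")       # ~1.5 years (18 months)
```

The results match the report's 12-minute and 18-month figures, and they make the effect of key length vivid: each additional bit doubles the expected work.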
At their January 1997 conference, RSA Data Security
announced a set of challenge ciphers with prizes for the first person
breaking each cipher [14]. These included $1,000 for breaking a 40-bit
RC5 key, $5,000 for breaking a 48-bit RC5 key, and $10,000 for breaking
a 56-bit RC5 or DES key. The challenges extend to 128-bit RC5 keys in
increments of 8 bits each. The 40-bit prize was won shortly thereafter
by Ian Goldberg, a student at Berkeley, who cracked it in 3.5 hours using
a network of 250 computers that tested 100 billion keys per hour. The
48-bit prize was won a few weeks later by Germano Caronni, a student at
the Swiss Federal Institute of Technology. Caronni harnessed the power
of over 3,500 computers on the Internet to achieve a peak search rate
of 1.5 trillion keys per hour. The key was found after 312 hours (13 days).
Because DES is nearing the end of its useful lifetime,
the Department of Commerce is in the process of finding a successor. In
January 1997, they requested comments on proposed draft minimum acceptability
requirements and evaluation criteria [15].
Key length is also a factor with public-key algorithms.
The driving force for longer keys here is not only faster hardware, but
also much faster algorithms for factoring. Whereas only 30- to 60-digit
numbers could be factored in 1980, a 129-digit RSA key was factored in
1994 and a 130-digit key in 1996. Both numbers were factored by harnessing
compute cycles from Internet users. The 130-digit number was actually
factored with about 10 times fewer operations than the 129-digit
number by using a much faster method. RSA Laboratories recommends that
keys be at least 230 digits (or more than 768 bits) [16]. Elliptic curve
implementations of public-key algorithms, which are believed to provide
comparable security (and faster execution) with fewer bits, will allow
for shorter keys as they become available.
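The digit and bit figures can be related by a back-of-envelope conversion, since each decimal digit carries log2(10), or about 3.32, bits; the 230-digit recommendation works out to roughly the 768 bits quoted:

```python
import math

def digits_to_bits(d: float) -> float:
    # Each decimal digit carries log2(10) ~ 3.32 bits of information.
    return d * math.log2(10)

print(round(digits_to_bits(230)))   # ~764 bits, roughly the quoted 768
print(round(digits_to_bits(129)))   # the factored 129-digit key: ~429 bits
print(round(768 / math.log2(10)))   # 768 bits back to digits: ~231
```

This is only a size conversion, of course; the security of an RSA key depends on the difficulty of factoring numbers of that size, not on the count of bits alone.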
Although key length is significant to the strength
of an algorithm, weaknesses in key management protocols or implementation
can allow keys to be cracked that would be impossible to determine by
brute force. For example, shortly after the French student cracked the
40-bit key in 8 days, Ian Goldberg and David Wagner found that the keys
generated for Netscape could be hacked in less than a minute because they
were not sufficiently random [17]. Paul Kocher showed that under suitable
conditions, a key could be cracked by observing the time it took to decrypt
or sign messages with that key [18]. Richard Lipton, Rich DeMillo, and
Dan Boneh at Bellcore showed that public-key cryptosystems implemented
on smart cards and other tamperproof hardware tokens were potentially
vulnerable to hardware fault attacks if the attacker could induce certain
types of errors on the card and observe their effect [19]. Eli Biham and
Adi Shamir showed that the strategy could also work against single-key
systems such as DES and Triple-DES [20]. Thus, while key length is a factor
in security, it is by no means the only one.
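The Netscape flaw illustrates a general point: a key drawn from a deterministic generator with a guessable seed is recoverable no matter how long it is. A minimal sketch, using Python's generators as stand-ins (the actual Netscape seed construction differed, but was similarly guessable):

```python
import random
import secrets

# Weak: a deterministic PRNG (Mersenne Twister) seeded from guessable
# state. In the Netscape case the seed was built from the time of day
# and process IDs, leaving far too few possibilities to resist search.
random.seed(12345)              # stand-in for a guessable seed
weak_key = random.getrandbits(128)

random.seed(12345)              # an attacker who guesses the seed...
assert random.getrandbits(128) == weak_key   # ...regenerates the key exactly

# Strong: the operating system's cryptographic generator.
strong_key = secrets.randbits(128)
```

A 128-bit key generated the weak way offers only as much security as the seed space, which is why Goldberg and Wagner could find Netscape's keys in under a minute.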
Key Recovery
Manufacturers of encryption products are building
key recovery capabilities into products, particularly those used to encrypt
stored data, to protect users and their organizations from lost or damaged
keys [21]. Several different approaches are used, but all involve archiving
individual or master keys with officers of the organization or with a
trusted third party. The archived keys are not used to encrypt or decrypt
data, but only to unlock the data encryption keys under exigent circumstances.
They may be entrusted to a single person (or agency) or split between
two or more parties. In one approach, the data encryption key K is encrypted
under a public key owned by the organization and then stored in the message
or file header. In another, the private key establishment keys of users
(that is, the keys used to distribute or negotiate data encryption keys)
are archived. Whenever a message is sent to a user, the data encryption
key K is passed in the header encrypted under the user's public key establishment
key, so that it can be recovered with the archived private key. Both of
these approaches can accommodate lawful access by law enforcement
officials as well as by the owners of the data.
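The splitting option mentioned above can be illustrated with a simple XOR scheme, a two-party special case of secret sharing. This sketch is illustrative only, not a description of any particular product:

```python
import os

def split_key(k: bytes):
    """Split an archived key K between two custodians: share1 is uniformly
    random, share2 = K XOR share1, so neither share alone reveals anything
    about K."""
    share1 = os.urandom(len(k))
    share2 = bytes(a ^ b for a, b in zip(k, share1))
    return share1, share2

def recover_key(share1: bytes, share2: bytes) -> bytes:
    # Emergency access requires both custodians to cooperate.
    return bytes(a ^ b for a, b in zip(share1, share2))

k = os.urandom(16)              # a data-encryption key to be archived
s1, s2 = split_key(k)
assert recover_key(s1, s2) == k
```

Splitting between custodians addresses the obvious objection to key archives: no single officer or agency can unilaterally read the data.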
There is less user demand for key recovery with systems
used only for transient communications and not stored data, for example,
systems used to encrypt voice communications or to encrypt the transmission
of a credit card number on the Internet. The reason is that there is no risk
of losing information. However, some companies, for example Shell Group
enterprises, have established corporate-wide key recovery mechanisms for
all encrypted data. The advantage to key recovery in this context is that
it enables criminal investigations of employees. For example, an employee
could use the company network to transmit proprietary documents to a competitor
or to engage in fraud.
There are several national and international efforts
to develop and use key recovery systems. The Clinton Administration plans
to develop a federal key management infrastructure with key recovery services.
In July 1996, the Administration announced the formation of a Technical
Advisory Committee to Develop a Federal Information Processing Standard
for the Federal Key Management Infrastructure (TACDFIPSFMKI), which began
meeting in December. The Administration also initiated an Emergency Access
Demonstration Project, with 10 pilots selected to test approaches to key
recovery in federal systems.
The European Commission has been preparing a proposal
to establish a European-wide network of Trusted Third Parties (ETS) that
would be accredited to offer services that support digital signatures,
notarization, confidentiality, and data integrity. The trust centers,
which would operate under the control of member nations, would hold keys
that would enable them to assist the owners of data with emergency decryption
or supply keys to their national authorities on production of a legal
warrant. The proposal is currently undergoing further consideration within
the Commission before it can be brought before the Council of the European
Union for adoption. Eight studies and pilot projects are planned for 1997.
Canada is building its public-key infrastructure using
the Nortel Entrust product line for its underlying security architecture.
Entrust supports optional key archive and recovery through the certificate
authorities. The certificate authority for an organization, which may
be internal to the organization, holds the private keys of users when
recovery is desired.
The Open Group (formerly X/Open and OSF) is pursuing
standards for a public-key infrastructure. It is working with law enforcement
and other government agencies, as well as with the international business
community, to build an infrastructure that would support key archive and
recovery.
Because not all encryption systems have built-in key
recovery mechanisms, there is also a market for recovering keys (and ultimately
the plaintext) by other means, for example, brute-force attacks against
short keys or attacks that exploit weaknesses in design or implementation.
Many systems contain flaws, for example, in key management, that allow
them to be cracked despite using long keys. In some cases, the key may
be stored on a disk encrypted with a password that can be cracked. AccessData
Corp., a company in Orem, Utah, provides software and services to help
law enforcement agencies and companies recover data that has been locked
out by encryption. In an interview with the Computer Security Institute,
Eric Thompson, founder of AccessData, reported that they had a recovery
rate of about 80-85% with large-scale commercial commodity software applications
[23]. Thompson also noted that Aldrich Ames, the CIA officer convicted of
spying for Moscow, had used off-the-shelf software that could be broken.
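The password weakness behind such recovery rates can be sketched as a dictionary attack. The key derivation and verifier below are hypothetical, chosen only to show why a guessable password undermines an otherwise long key:

```python
import hashlib

def key_from_password(pw: str) -> bytes:
    # Hypothetical product that derives the file key straight from a password.
    return hashlib.sha256(pw.encode()).digest()

def crack(verifier: bytes, wordlist):
    """Try candidate passwords against a stored key-check value. This is the
    kind of shortcut that makes a 256-bit key moot when the password is weak:
    the effective search space is the dictionary, not the keyspace."""
    for word in wordlist:
        if hashlib.sha256(key_from_password(word)).digest() == verifier:
            return word
    return None

# The product stores a hash of the key so it can verify the password.
verifier = hashlib.sha256(key_from_password("letmein")).digest()
assert crack(verifier, ["password", "123456", "letmein"]) == "letmein"
```

Once the password falls, the key and the plaintext fall with it, which is why recovery services succeed against products whose algorithms are, on paper, unbreakable.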
United States Policy
Clinton Administration Initiatives
Beginning with the Clipper chip in 1993 [24], the Clinton
Administration has embraced an encryption policy based on key recovery,
initially called "key escrow." This policy includes development of federal
standards for key recovery and adoption of key recovery systems within
the federal government as outlined in the preceding section. It also includes
liberalization of export controls for products that provide key recovery.
The objective has been to promote the use of encryption in a way that
effectively balances national goals for information security, economic
strength, national security, public safety, crime prevention and investigation,
privacy, and freedom, and to do so through export controls and government
use of key recovery rather than mandatory controls on the use of encryption.
Key recovery is seen as a way of addressing the fundamental dilemma of
encryption. It allows the use of robust algorithms with long keys, but
at the same time accommodates code breaking under very tightly controlled
conditions, in particular, by the owners of encrypted data and by government
officials with a court order or other lawful authorization.
When the Clipper chip was announced, products which
used the RC2 and RC4 encryption algorithms with 40-bit keys were readily
exported through general licensing arrangements. Products with longer
keys, however, were subject to much tighter restrictions. 56-bit DES,
for example, could not be exported except under special circumstances.
Given the perceived weakness of 40-bit keys, industry was lobbying hard
for longer keys to meet the demands of foreign customers.
The Clipper chip, which was the Administration's initial
offering, allowed export of 80-bit keys in an NSA-designed microchip which
implemented the SKIPJACK encryption algorithm and a built-in key recovery
mechanism. However, it was sharply criticized for several reasons: the
classified SKIPJACK algorithm was not open to public scrutiny, it required
special purpose hardware, the government held the keys, it did not provide
user data recovery, and it did not accommodate industry-developed encryption
methods. In response to these criticisms, in August 1995 the Administration
announced that it would also allow for exports of 64-bit software encryption
when combined with an acceptable key recovery system [25]. The algorithms
could be public or proprietary, and the keys could be held by non-government
entities. This proposal, however, fell short of industry demands for unlimited
key lengths and immediate export relief.
On October 1, 1996, the Administration announced that
vendors could export DES and other 56-bit algorithms provided they had
a plan for implementing key recovery and for building the supporting infrastructure
internationally, with commitments to explicit benchmarks and milestones
[26]. In some cases, organizations could operate their own internal key
recovery services. Temporary licenses would be granted for six-month periods
up to two years, with renewals contingent on meeting milestones. After
two years, 56-bit products without key recovery would no longer be exportable.
However, beginning immediately, products with acceptable key recovery
systems would be readily exportable regardless of algorithm, key length,
or hardware or software implementation. In addition, encryption products
would no longer be classified as munitions under the International Traffic
in Arms Regulations (ITAR) [27]. Jurisdiction for commercial export licenses
would be transferred from the Department of State to the Department of
Commerce.
On November 15, President Clinton signed an Executive
Order transferring certain encryption products from the United States
Munitions List administered by the Department of State to the Commerce
Control List administered by the Department of Commerce [28]. He also
appointed Ambassador David L. Aaron, the United States Permanent Representative
to the Organisation for Economic Co-operation and Development, as Special
Envoy for Cryptography. As Special Envoy, Ambassador Aaron is to promote
international cooperation, coordinate U.S. contacts with foreign officials,
and provide a focal point on bilateral and multilateral encryption issues.
On December 30, 1996, the Commerce Department issued an interim rule amending
the Export Administration Regulations (EAR) in accordance with the Executive
Order and the policy announced in October [29]. The interim regulations
went into effect immediately, with a comment period for proposing revisions.
Following the October announcement, eleven major information
technology firms, led by IBM and including Apple, Atalla, Digital Equipment
Corp., Groupe Bull, Hewlett-Packard, NCR, Sun, Trusted Information Systems,
and UPS, announced the formation of an alliance to define an industry-led
standard for flexible cryptographic key recovery [30]. By the end of January
1997, forty-eight companies had joined the alliance. The Computer Systems
Policy Project (CSPP), a coalition of the chief executive officers of
the twelve leading U.S. computer systems companies, issued a statement
acknowledging the progress that had been made in removing export restrictions
on cryptography and supporting the Administration's decision to encourage
the development of voluntary, industry-led key recovery techniques [31].
Hitachi Ltd. and Fujitsu Ltd. announced a plan to jointly develop key
recovery technology under the new policy [32].
By the end of January 1997, three companies had received
general licenses to export strong encryption under the new regulations:
Trusted Information Systems, Digital Equipment Corporation, and Cylink.
TIS received licenses to export their Gauntlet Internet firewall with
both DES and Triple-DES in a global virtual private network mode and to
export their Microsoft CryptoAPI-compliant Cryptographic Service Providers
[33].
In May 1997, the Commerce Department announced that
it will allow export of non-recoverable encryption with unlimited key
length for products that are specifically designed for financial transactions,
including home banking [34]. They will also allow exports, for two years,
of non-recoverable general-purpose commercial products of unlimited key
length when used for interbank and similar financial transactions, once
the manufacturers file a commitment to develop recoverable products. The
reason key recovery is not required with financial transactions is that
financial institutions are legally required and have demonstrated a consistent
ability to provide access to transaction information in response to authorized
law enforcement requests.
The Department of Commerce has also announced the formation
of a President’s Export Council Subcommittee on Encryption. The subcommittee
is to advise the Secretary on matters pertinent to the implementation
of an encryption policy that supports the growth of commerce while protecting
the public safety and national security. The subcommittee is to consist
of approximately 25 members representing the exporting community and government
agencies responsible for implementing encryption policy.
The Clinton Administration has drafted a bill intended
to promote the establishment of a key management infrastructure (KMI)
with key recovery services. The bill is based on the premise that in order
to fully support electronic commerce, encryption products must interface
with a KMI which issues and manages certificates for users’ public keys.
The bill would create a program under the Secretary of Commerce for registering
certificate authorities and key recovery agents wishing to participate
in the KMI enabled by the act. Certificate authorities registered under
the act would be permitted to issue certificates for public encryption
keys only if the corresponding decryption keys were stored with a registered
key recovery agent (private signature keys would not be stored). Participation
in the registered KMI would be voluntary. Certificate authorities could
operate without registration, and encryption products could interface
with infrastructures supported by unregistered CA’s. Users would be free
to acquire certificates from unregistered CA’s without depositing their
keys.
The bill specifies the conditions under which recovery
information can be released to government agencies or other authorized
parties, and criminalizes various acts relating to the abuse of keys or
the KMI. The bill also establishes liability protections for key recovery
agents acting in good faith. Certificate authorities and key recovery
agents registered under the act will be required to meet minimum standards
for security and performance. Thus, users of the KMI should have strong
assurances that their keys are adequately safeguarded and that public
keys acquired from the KMI can be trusted. The bill would also impose a fine
or up to five years of imprisonment on persons who knowingly encrypt information
in furtherance of a criminal offense when no key recovery system allows
government access to the plaintext.
Congressional Bills to Liberalize Export Controls
Three bills were introduced in the 2nd session of the
104th Congress (1996) to liberalize export controls on encryption, two
in the Senate and one in the House of Representatives. Although none of
the bills was brought to the floor for a vote, all three were reintroduced
in February 1997. The current bills are as follows:
- S. 376, the “Encrypted Communications Privacy Act
of 1997,” introduced by Senator Leahy with Senators Burns, Murray, and
Wyden as co-sponsors.
- S. 377, the “Promotion of Commerce On-Line in the
Digital Era (Pro-CODE) Act of 1997,” introduced by a bi-partisan group
of seventeen senators led by Senators Burns and Leahy.
- H.R. 695, the “Security and Freedom Through Encryption
(SAFE) Act of 1997,” introduced by Representative Goodlatte with fifty-five
co-sponsors. It was passed by the House Judiciary Committee on May 14.
These bills would all lift export controls on encryption
software independent of whether the products provide key recovery. They
have been strongly supported by many people in the private sector on the
grounds that export controls harm the competitiveness of U.S. industry
in the global market and make it more difficult for consumers and businesses
to get products with strong encryption. Their 1996 predecessors, S. 1587
and H.R. 3011, would also have made unlawful the use of encryption to obstruct justice.
It is extremely difficult to measure the economic impact
of export controls on U.S. business. The CSPP estimated that as much as
$30-60 billion in revenues could be at stake by the year 2000 [35]. However,
the National Research Council committee on cryptography policy concluded
that "The dollar cost of limiting the availability of cryptography abroad
is hard to estimate with any kind of confidence, since even the definition
of what counts as a cost is quite fuzzy. At the same time, a floor of
a few million dollars per year for the market affected by export controls
on encryption seems plausible, and all indications are that this figure
will only grow in the future." [36].
The NRC study agreed that export controls should be
relaxed, but suggested a more cautious approach. Their recommendations
included allowing ready export of DES, allowing exports of products with
longer keys to a list of approved companies that would be willing to provide
access to decrypted information upon legal authorization, and streamlining
the export process [37].
Challenges to the Constitutionality of Export Controls
There have been three lawsuits challenging the constitutionality
of export controls on encryption software. The first was filed on behalf
of Philip Karn in February 1994 after the State Department denied his
request to export a computer disk containing the source code for the encryption
algorithms in Bruce Schneier's book Applied Cryptography. Karn
claimed that export restrictions on the disk violated his First Amendment
right to free speech. He also claimed that because the book was exportable,
treating the disk differently from the book violated his Fifth Amendment
right to substantive due process. The suit was filed against the State
Department in the United States District Court for the District of Columbia.
In March 1996, Judge Charles Richey filed an opinion [38] stating that
the plaintiff "raises administrative law and meritless constitutional
claims because he and others have not been able to persuade the Congress
and the Executive Branch that the technology at issue does not endanger
the national security." The Court granted the defendant's motion to dismiss
the plaintiff's claims. Karn appealed the decision, but in January 1997,
the DC Court of Appeals sent the case back to the District Court for reconsideration
under the new Commerce Department encryption regulations.
In February 1995, the Electronic Frontier Foundation
filed a lawsuit against the State Department on behalf of Daniel Bernstein,
a graduate student at the University of California, Berkeley. The suit,
which was filed in the Northern District of California, claims that export
controls on software are an "impermissible prior restraint on speech,
in violation of the First Amendment." Bernstein had been denied a commodity
jurisdiction request to export the source code for an algorithm he had
developed called Snuffle. The Department of Justice filed a motion to
dismiss, arguing that export controls on software source code were not
based on the content of the code but rather its functionality. In December
1996 Judge Marilyn Patel ruled that the ITAR licensing scheme acted as
an unconstitutional prior restraint in violation of the First Amendment
[39]. It is not clear how the ruling affects the new regulatory regime.
A third lawsuit was filed on behalf of Peter Junger,
a law professor at Case Western Reserve Law School in Cleveland, Ohio
[40]. Junger claims that export controls impose unconstitutional restraints
on anyone who wants to speak or write publicly about encryption programs,
and that the controls prevent him from admitting foreign students to his
course or from publishing his course materials and articles with cryptographic
software. But in fact the government does not restrict academic courses
in cryptography or the admission of foreign students to these courses.
Professors can give lectures, publish papers, speak at conferences, and
make software available to their students without licenses. Licenses are
needed only to make that software available internationally in electronic
form (e.g., by posting it on an FTP or web site on the Internet).
I personally question the claim that export licenses
impose an impermissible prior restraint on speech. Export controls on
encryption software are concerned with its operational behavior -- with
the fact that encryption software loaded onto a computer is an encryption
device. They are not targeted at speech or ideas about the software [41].
International Policy
The following summarizes recent developments.
OECD
In recognition of the need for an internationally
coordinated approach to encryption policy to foster the development of
a secure global information infrastructure, the Organization for Economic
Cooperation and Development (OECD) has recently issued guidelines for cryptography
policy [42]. The guidelines represent a consensus about specific policy
and regulatory issues. While not binding on OECD's 29 member countries,
they are intended to be taken into account in formulating policies at
the national and international level.
The guidelines were prepared by a Group of Experts
on Cryptography Policy under a parent Group of Experts on Security, Privacy,
and Intellectual Property Protection in the GII. The committee received
input from various sectors, with the Business-Industry Advisory Council
(BIAC) to the OECD participating in the drafting process.
The guidelines expound on eight basic principles for
cryptography policy:
- trust in cryptographic methods
- choice of cryptographic methods
- market driven development of cryptographic methods
- standards for cryptographic methods
- protection of privacy and personal data
- lawful access
- liability protection
- international cooperation
The principle of lawful access states: "National cryptography
policies may allow lawful access to plaintext, or cryptographic keys,
of encrypted data. These policies must respect the other principles contained
in the guidelines to the greatest extent possible."
France
France has waived its licensing requirement on the
use of encryption when keys are escrowed with government-approved key
holders, effectively trading licenses on the use of encryption for licenses
governing the operation of key archive and recovery services [43]. To
get a license, an organization providing key archive services would have
to do business in France and have stock honored by the French government.
The service providers would have to be of French nationality. Under the
new law, licenses are still needed for all imports and exports of encryption
products.
United Kingdom
The British government considers it essential that
security, intelligence, and law enforcement agencies preserve their ability
to conduct effective legal interception of communications, while at the
same time ensuring the privacy of individuals. Accordingly, they have
issued a draft proposal to license trusted third parties (TTPs) providing
encryption services to the general public [44]. The TTPs would hold and
release the encryption keys of their clients; appropriate safeguards would
be established to protect against abuse and misuse of keys. The licensing
regime would seek to ensure that TTPs meet criteria for liability coverage,
quality assurance, and key recovery. It would allow for relaxed export
controls on encryption products that work with licensed TTPs. It would
be illegal for an unlicensed entity to offer encryption services to the
public; however, the private use of encryption would not be regulated.
Japan
Japan recently tightened its export controls on encryption
by requiring that businesses obtain a license for any order exceeding
50,000 yen, or about $450. The previous limit was 1 million yen. According
to officials from the Ministry of International Trade and Industry (MITI),
the change resulted from sensitivity to what is going on in the international
community regarding encryption, and not pressure from the U.S. government.
The Japanese Justice Ministry is also seeking legislation that would authorize
court-ordered wiretaps in criminal investigations [45]. Finally, the Hitachi/Fujitsu
plan to jointly develop key recovery technology in conformance
with U.S. policy has the backing of MITI.
Conclusions
Encryption is spreading worldwide, with nearly 1400
products produced and distributed by over 800 companies in at least 68
countries. It is becoming a standard feature of applications and systems
software, facilitated in part by the development of application programming
interfaces and industry standards. Many products support a variety of
encryption methods and interoperate using the strongest methods they have
in common. Through internationally accepted open standards, products manufactured
in one country will be able to interoperate with those made in foreign
countries even if they cannot be exported to those countries.
Commercial products for domestic markets now use algorithms
with key lengths that are computationally infeasible to crack by brute
force, for example 128-bit RC4 and 168-bit Triple-DES. At the same time, code
breakers on the Internet are pooling resources to break ever longer keys,
most recently 48 bits. Although many commercial products are breakable
through flaws in design and implementation, the trend is to build products
with stronger security and to provide emergency decryption, both for the
owners of the data and for lawfully authorized government officials, through
a key recovery system.
The encryption market and government policies are driven
by several interests including information security, privacy, freedom,
crime prevention and investigation, public safety, economic strength,
and national security. The stakeholders are governments, industry, and
citizens. What makes encryption policy so hard is that all of these interests
are simultaneously served by and threatened by both code making and code
breaking. Key recovery is seen as a potential way of effectively balancing
national, corporate, and individual interests in these opposing activities.
Several governments are adopting encryption policies
that favor key recovery systems. The Clinton Administration's policy is
to leave the U.S. domestic market unregulated and to ease export controls
on products with acceptable key recovery systems. So far, three companies
have obtained licenses to export strong encryption with key recovery under
regulations established at the end of 1996. Because key recovery is valuable
to customers and provides much stronger protection than short keys, which
anyone can break, other vendors are expected to follow suit and put key
recovery capabilities into the export versions of their products rather
than using short keys. To reduce product development,
maintenance, and management costs, vendors may produce a single product
line, based on key recovery, for both domestic and international use.
However, some companies are ignoring the international market entirely.
The Administration's policy has been challenged both by Congressional
bills that would lift export controls for products with or without key
recovery and by lawsuits claiming that export controls on encryption software
are unconstitutional.
The use of encryption is expected to rise rapidly,
reaching 60% of U.S. business users by the year 2000. Because organizations
have a need to recover the keys to stored encrypted data, including files
and saved electronic mail, the use of key recovery with stored data could
become standard business practice. Companies will either operate their
own key recovery services or use trusted third parties. Self escrow will
be allowed with export versions of products sold to approved organizations.
Pilot projects in the U.S. and elsewhere are testing different approaches
to key recovery.
To mitigate potential risks, efforts are underway to
develop strong technical, procedural, and legal safeguards to protect
organizations and individuals who use key recovery services from improper
use of those services and to provide liability protection for key recovery
service providers when properly releasing keys. Efforts are also underway
to establish bilateral and multilateral key release agreements so that
a government can conduct an investigation within its jurisdiction even
when the keys needed for decryption are held outside its borders. I expect
these agreements to have safeguards that protect corporations and individuals
from espionage by foreign governments. A foreign government might be required
to submit a request for decryption assistance to the government of the
country where the keys are held so that the home government can review
the request and any plaintext before it is released to the foreign government.
Whether governments will be able to access communications
and stored records in criminal investigations will depend on three factors:
the knowledge and sophistication of criminals, the breakability of common
commercial products, and the adoption of key recovery systems. The latter
in turn will depend on whether key recovery is a standard feature of commercial
products, either as a result of market forces or government policies.
Even if key recovery becomes commonplace with stored data, it may be less
common with transient communications such as phone calls (voice and fax),
communications on the World Wide Web, and communications over virtual
private networks, where there is less user demand.
ANNEX : Tables
Country         | U.S.                 | Other
Argentina       | > 60%                | Israel, France, Swiss.
Australia       | some                 | Australia, Japan, Taiwan, France, U.K.
Austria         | 20%                  | Siemens-Nixdorf: 80%
Canada          | 42%                  | Canada: 40%, other: 18%
Czech Rep.      | small                | -
Denmark         | 10% bank, 0 other    | -
Finland         | 60-80% mass mkt      | U.K.: 20-30%, Germany: 20%
Germany         | < 20%                | Germany: most
India           | 35%                  | India: 42%, UK: 10%, Germany: 8%, Sing: 5%
Israel          | small                | Israel: most
Italy           | no security specific | Italy
Japan           | 6% software mkt      | Japan
The Netherlands | 50%                  | The Netherlands: 10%, Germany, UK, France
New Zealand     | second               | UK: largest
Norway          | largest              | Sweden, Germany, Israel
South Africa    | 71% of imports       | S. Africa, France, Israel, Germany, Swiss, Italy, UK
Switzerland     | 10%                  | Swiss: 55%, Europe: 35%
Taiwan          | 56%                  | -
United Kingdom  | 15%                  | UK: 80%
Table 1. Market Shares in Encryption Software. Source:
U.S. Department of Commerce and the National Security Agency, A Study
of the International Market for Computer Software with Encryption.
Arrows indicate whether U.S. share has been increasing or decreasing.
Budget       | Tool                                     | Time (Cost) to crack a 40-bit key | Time (Cost) to crack a 56-bit key | Recommended Length 1996-2018
tiny         | scavenged resources                      | 1 week                            | infeasible                        | 45 - 60
$400         | FPGA - 1 chip                            | 5 hrs ($.08)                      | 38 yrs ($5,000)                   | 50 - 65
$30,000,000  | Cray T3D - 1024 nodes                    | 10 min                            | 15 mo                             | -
$10,000      | FPGA - 25 chips                          | 12 min ($.08)                     | 18 mo ($5,000)                    | 55 - 70
$300,000     | FPGA - 750 chips; ASIC - 15,000 chips    | 24 sec ($.08); .18 sec ($.001)    | 19 days ($5,000); 3 hrs ($38)     | 60 - 75
$10,000,000  | FPGA - 25,000 chips; ASIC - 500,000 chips | .7 sec ($.08); .005 sec ($.001)  | 13 hrs ($5,000); 6 min ($38)      | 70 - 85
$300,000,000 | ASIC - 1,500,000 chips                   | .0002 sec ($.001)                 | 12 sec ($38)                      | 75 - 90
Table 2. Brute Force Attacks on 40-bit and 56-bit Keys.
FPGA = Field Programmable Gate Array - a $200 AT&T
ORCA chip can test 30 million 56-bit DES keys per second.
ASIC = Application-Specific Integrated Circuits - a
$10 chip can test 200 million keys per second.
Estimates for 2018 based on Moore's law: cost halved
every 18 months.
Source: Data for the row with the Cray T3D are from
the National Security Agency, 1996. The remaining data are from M. Blaze,
W. Diffie, R. Rivest, B. Schneier, T. Shimomura, E. Thompson, M. Weiner,
"Minimal Key Lengths for Symmetric Ciphers to Provide Adequate Commercial
Security," Jan. 1996.
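The figures in Table 2 follow from simple arithmetic: a brute-force attacker expects to search half the keyspace before finding the key, so the expected time is 2^(bits-1) divided by the search rate. A minimal sketch, using the 30-million-keys-per-second FPGA rate quoted above (the function name is illustrative):

```python
def expected_crack_time_seconds(key_bits: int, keys_per_second: float) -> float:
    """Expected brute-force time: on average, half the keyspace is searched."""
    return 2 ** (key_bits - 1) / keys_per_second

FPGA_RATE = 30e6  # one $200 ORCA chip tests 30 million DES keys/sec (see note above)

# One FPGA chip against a 40-bit key: roughly 5 hours, matching Table 2.
hours_40 = expected_crack_time_seconds(40, FPGA_RATE) / 3600
# One FPGA chip against a 56-bit key: roughly 38 years.
years_56 = expected_crack_time_seconds(56, FPGA_RATE) / (3600 * 24 * 365)

# Moore's law as applied in the table: attack cost halves every 18 months,
# so a key must grow about one bit per 18 months to hold its safety margin.
extra_bits = (2018 - 1996) * 12 / 18  # about 14.7, i.e. the 15-bit spread
                                      # in Table 2's recommended-length ranges

print(f"40-bit: {hours_40:.1f} hours")   # ~5.1 hours
print(f"56-bit: {years_56:.0f} years")   # ~38 years
```

Each added chip divides the expected time proportionally, which is why the 25-chip row comes in at about 12 minutes for a 40-bit key.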
References and Notes
1. Australia, Canada, Europe, Japan,
New Zealand, and the United States adopted common rules governing exports
under the Coordinating Committee for Multilateral Export Controls (COCOM).
COCOM was replaced by the New Forum in 1995. For a summary of foreign
regulations of cryptography, see James Chandler, "Identification and Analysis
of Foreign Laws and Regulations Pertaining to the Use of Commercial Encryption
Products for Voice and Data Communications," Proceedings of the International
Cryptography Institute 1995: Global Challenges, National Intellectual
Property Law Institute, September 21-22, 1995.
2. TIS Worldwide Survey of Cryptographic
Products, June 1996. http://www.tis.com/.
3. http://www.openmarket.com/techinfo/applied.htm.
4. A Study of the International Market
for Computer Software with Encryption, U.S. Department of Commerce
and the National Security Agency, Washington, DC, 1996.
5. Ernst & Young and Information Week
Security Survey, http://techweb.cmp.com/iw/602/02mtsec.htm
6. The State of Web Commerce,
O'Reilly & Associates and Netcraft, Ltd., December 1996.
7. Computer communications are implemented
through a hierarchy of network protocols called the protocol stack. The
OSI model has seven layers, which from top to bottom are: application,
presentation, session, transport, network, link, and physical. In the
Internet, the protocols are centered around TCP/IP (Transmission Control
Protocol/Internet Protocol). TCP/IP handles message transmission and delivery
and corresponds roughly to the transport and network layers.
8. More information on ICE and CAPIs
is available at http://www.tis.com/crypto/ice.html
9. Under the current rate of advancement,
an additional bit is needed every 18 months to stay even.
10. Whitfield Diffie and Martin Hellman,
"Exhaustive Cryptanalysis of the NBS Data Encryption Standard," Computer,
June 1977, pp. 74-84.
11. M. J. Wiener, "Efficient DES Key
Search," presented at the rump session of CRYPTO '93, Aug. 1993 and later
published as TR-244, School of Computer Science, Carleton Univ., May 1994.
12. Jared Sandberg, "French Hacker Cracks
Netscape Code, Shrugging Off U.S. Encryption Scheme," Wall St. J.,
Aug. 17, 1995, at B3.
13. Matt Blaze, Whitfield Diffie, Ronald
L. Rivest, Bruce Schneier, Tsutomu Shimomura, Eric Thompson, and Michael
Wiener, "Minimal Key Lengths for Symmetric Ciphers to Provide Adequate
Commercial Security," Jan. 1996.
14. Information about the challenge
ciphers and prizes awarded is on the RSA home page at http://www.rsa.com/.
15. Department of Commerce, National
Institute of Standards and Technology, "Announcing Development of a Federal Information
Processing Standard for Advanced Encryption Standard," Federal Register,
Jan. 2, 1997.
16. Cryptobytes, RSA Laboratories,
Summer 1996, p. 7.
17. Steven Levy, "Wisecrackers," Wired,
Mar. 1996, pp. 128+.
18. Paul Kocher, "Cryptanalysis of Diffie-Hellman,
RSA, DSS, and Other Systems Using Timing Attacks," Dec. 7, 1995.
19. "Now, Smart Cards Can Leak Secrets,"
http://www.bellcore.com/PRESS/ADVSRY96/medadv.html.
20. Eli Biham and Adi Shamir, Research
announcement: A new cryptanalytic attack on DES, Oct. 18, 1996.
21. Systems that provide key recovery
have been called key recovery systems, data recovery systems, key escrow
systems, and key archive systems . For a taxonomy of the features and
options in such systems and descriptions of different products and approaches,
see Dorothy E. Denning and Dennis K. Branstad, "A Taxonomy of Key Escrow
Encryption," Communications of the ACM, Vol. 39, No. 3, March 1996,
pp. 34-40. Available through the Cryptography Project at http://www.cs.georgetown.edu/~denning/crypto.
22. deleted
23. "Can your crypto be turned against
you? A CSI interview with Eric Thompson of AccessData," Computer Security
Alert, No. 167, Feb. 1997, pp. 1+.
24. The White House, Statement by the
Press Secretary, April 16, 1993.
25. National Institute of Standards
and Technology, "Commerce's NIST Announces Process for Dialogue on Key
Escrow Issues," NIST 95-24, Aug. 17, 1995.
26. The White House, Office of the Vice
President, Statement of the Vice President, Oct. 1, 1996.
27. Cryptographic systems or software
with the capability of providing secrecy or confidentiality protection
are included in Category XIII(b) of the U.S. Munitions List, CFR 121.1.
The Office of Defense Trade Controls of the Department of State has jurisdiction
over all items on the Munitions List (ML). The ML is part of the International
Traffic in Arms Regulations (ITAR).
28. The White House, Office of the Press
Secretary, Executive Order, Administration of Export Controls on Encryption
Products, Nov. 15, 1996.
29. Federal Register, Vol. 61, No. 251,
Dec. 30, 1996. Available at http://jya.com/bxa123096.txt.
30. The press release is available at
http://www.ibm.com/.
31. CSPP Position Statement, "Updating
Export Controls for Encryption and Developing Key Recovery Technologies,"
Oct. 1, 1996
32. EPLR Alert, Vol. 1, No. 4, The Bureau
of National Affairs Inc., Washington DC, Oct. 28, 1996.
33. Trusted Information Systems, Inc.,
“TIS’ Key Recovery Technology First to Enable General Purpose Export for
Very Strong Encryption,” Dec. 16, 1996. http://www.tis.com.
34. U.S. Department of Commerce News,
Bureau of Export Administration, Encryption Exports Approved for Electronic
Commerce, May 8, 1997.
35. Emerging Security Needs and U.S.
Competitiveness: Impact of Export Controls on Cryptographic Technology,
The Computer Systems Policy Project, Dec. 1995.
36. Cryptography's Role in Securing
the Information Society, Kenneth Dam and Herbert Lin, Eds., Committee
to Study National Cryptography Policy, Computer Science and Telecommunications
Board, National Research Council, National Academy Press, May 30, 1996,
p. 165.
37. Ibid. Recommendations 4.1-4.3, pp.
8-9.
38. Philip R. Karn, Jr., Plaintiff,
v. U.S. Department of State and Thomas B. McNamara, Defendants, Memorandum
Opinion of Charles R. Richey, United States District Court Judge, United
States District Court for the District of Columbia, Civil Action No. 95-01812,
Mar. 22, 1996.
39. Daniel J. Bernstein, Plaintiff,
v. United States Department of State et al., Defendants, Opinion of
U.S. District Judge Marilyn Hall Patel, United States District Court for
the Northern District of California, No. C-95-0582.
40. Press release, Plaintiff Seeks Summary
Judgment in Cleveland Case Challenging Licensing of "Exports" of Cryptographic
Information, Cleveland, OH, Oct. 1, 1996. http://samsara.law.cwru.edu/comp_law/jvc/.
41. Dorothy E. Denning, "Export Controls,
Encryption Software, and Speech," statement for the RSA Data Security
Conference, Jan. 28, 1997. At http://www.cs.georgetown.edu/~denning/crypto.
42. OECD News Release, "OECD Guidelines
for Cryptography Policy," Mar. 1997. http://www.oecd.org/dsti/iccp/crypto_e.html.
For an analysis, see Stewart Baker, Background information and a detailed
analysis of the OECD Cryptography Policy Guidelines, March 1997. http://www.steptoe.com/comment.htm
43. A translation and analysis of the
French law is available from Steptoe & Johnson at http://www.us.net/~steptoe/france.htm
44. Paper on Regulatory Intent Concerning
Use of Encryption on Public Networks, issued by the Department of Trade
and Industry, London, England, June 10, 1996. Available through the Cryptography
Project at http://www.cs.georgetown.edu/~denning/crypto.
45. "Legalizing Wiretapping," Mainichi
Shimbun, Oct. 9, 1996.