Binding cryptography: A fraud-detectible alternative to key-escrow proposals
by Eric Verheul, Bert-Jaap Koops and Henk van Tilborg, The Computer Law & Security Report (01/1997)

Introduction

Information security, and so cryptography, is essential in today's information society. A robust worldwide information security infrastructure must be set up, including a Key Management Infrastructure. However, the unconditional use of encryption by criminals poses a threat to law enforcement. Consequently, governments have two tasks. The first is stimulating the establishment of a security infrastructure that protects their citizens, but which does not facilitate criminals to shield their activities from law-enforcement agencies. The second task is coping with the use of encryption by criminals in general - outside of this infrastructure. In this article, Eric Verheul, Bert-Jaap Koops and Henk van Tilborg address the first task. The authors review several (technical) proposals and a few government initiatives, focusing on key-escrow proposals. They present a series of criteria that acceptable solutions should meet, and note that all proposals so far fail to meet many of these criteria. The establishment of a worldwide security infrastructure cannot be achieved without strong cooperation of governments. In fact, the authors argue, governments themselves should take up the challenge of establishing a security infrastructure based on public-key encryption, which does not hamper law enforcement. The authors offer a new solution to achieve this, "binding data", which improves upon current proposals. It helps the establishment of a strong security infrastructure in which (unilateral) fraud (i.e., not complying with the agreed rules) for criminal or subversive purposes is discouraged; such abuse is difficult and detectible by arbitrary parties. The proposal allows a straightforward monitoring of compliance with law-enforcement regulations, without users having to deposit keys beforehand. This security infrastructure does not solve criminal encryption use outside of this framework - it is not meant to. Criminals can use encryption anyhow; they should only be kept from gaining advantage in using the government infrastructure for this. The authors envision how a security infrastructure can be established that is flexible enough to be incorporated in any national crypto policy, on both the domestic and international use of cryptography.
But governments also face a regulatory task with respect to the information society. Both the National Information Infrastructure (NII) initiative of the US administration and the EU Bangemann Report emphasize that while private investment is needed to develop the information infrastructure, governments should focus on creating a stimulating and balanced regulatory environment. The task of providing such a stimulating and balanced regulatory environment includes the solving of several problems that currently hamper an untroubled development of the information society. For instance, interconnection, universal service, protecting intellectual property rights and privacy are issues that have to be addressed. Another major issue is information security.
2. Information Security and Cryptography

The threats to information vary widely. Many security incidents are caused through carelessness - people spill coffee over a machine, spread a home-copied virus or choose easily guessable passwords. Threats from the outside - crackers/hackers and viruses - have received much attention, but inside attacks (by employees, managers or system operators) often cause much larger damage. Because of the variety of threats, an adequate information security plan must consist of various measures to lower the overall risk. Typical measures include physical measures (such as isolating key information and vital information systems in closely guarded rooms), technological measures (firewalls, authentication mechanisms, cryptography), procedural measures (such as responsibility division or the regular changing of passwords), control and auditing.

Within an information security plan, cryptography plays an important part. It safeguards the integrity and confidentiality of stored or transported data; it can also be used for non-repudiation of the sender. Indeed, for many purposes, cryptography is the only way to effectively shield information from unauthorized access or altering. For instance, cryptography protects billions of dollars of financial transactions that are processed daily over the global financial networks; it also provides for electronic payment, both in sending credit card numbers securely and with digital cash. Encrypted services, such as pay-TV or video-on-demand, constitute a growing market. Likewise, the soaring market for mobile communications is enabled through cryptographic protection. E-mail can be encrypted to safeguard the confidentiality of privacy-related or sensitive company information; it also provides integrity, which is essential in electronic business transactions, EDI, online tax declaration, and government information publishing. Cryptography will enable new applications such as road pricing and electronic voting. A few of these applications are somewhat exotic, but many are everyday necessities. In short, cryptography is essential in today's information society.

The establishment of a good infrastructure for information security that incorporates cryptography is not only a private concern, but it is also a government task: they must provide the necessary regulatory environment and stimulate the establishment of a good information security infrastructure. This need is recognized in several policy documents, such as the 1992 OECD Guidelines for the security of information systems and the European Commission's draft Green Book on the Security of Information Systems.

3. Public Key Infrastructures

To explain the cryptographic concepts involved, consider a metaphor for conventional (symmetric) cryptography: to send a confidential document to Bob, Alice puts it in a safe, closes the safe with a combination lock, sends the safe to Bob by regular mail, and communicates the combination to him through some secure channel. This method not only safeguards the confidentiality of the document, it also assures Bob of its integrity. Indeed, as the combination opens the safe, the document must be from Alice. However, Bob can never convince anybody else (for instance, a judge) of this, as he could have opened the safe, put any document in the safe himself and closed it again. In other words, this method does not provide for non-repudiation. Apart from the problem of securely sending the combination, a disadvantage of this method is its poor scalability. For each partner Alice wants to securely communicate with, she has to create and securely send a combination. For instance, if ten partners all want to communicate with each other, forty-five combinations are needed.
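(A quick check of the quadratic growth behind this count: n partners need n(n-1)/2 pairwise secrets, as the small illustrative snippet below shows.)

```python
# Pairwise secrets needed for n mutually communicating partners: n*(n-1)/2.
for n in (2, 10, 100, 1000):
    print(n, "partners need", n * (n - 1) // 2, "pairwise combinations")
# 10 partners -> 45; 1000 partners -> 499500
```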
These disadvantages make the sole use of symmetric cryptography unsuitable as a security solution for the information society. In 1976, Diffie and Hellman introduced the concept of public-key encryption (pke). In terms of the previous metaphor, this introduces combination locks with a pair of combinations: a close-combination and an open-combination. With the close-combination, one can close the combination lock, and only with the open-combination can one open it again. A requirement of these pke-combination locks is that it must be easy to create a pair of close/open combinations, but it must be impossible to derive the open-combination from the close-combination or vice versa. Since 1976, several constructions for pke-combination locks have been published; the construction of Rivest, Shamir and Adleman (RSA) is the most famous and widely used.

Suppose Alice wants to securely send a confidential document to Bob. First, Bob generates a random pair of close/open-combinations and sends only the close-combination to Alice. Now Alice takes a pke-combination lock, puts the document in a safe, seals it with the close-combination and sends the safe to Bob by regular mail. As only Bob has access to the corresponding open-combination of the pke-combination lock, only he can open the safe and obtain the document. It follows that, although Bob must keep his open-combination secret, he can publish the close-combination widely. So, he can use it to securely communicate with as many people as he likes. This concept is used on the World Wide Web for securely sending sensitive data: browsers contain a public close-combination, the secret open-combination of which is incorporated in (secure) servers.

Apart from confidentiality, pke-combination locks can be used to let Bob determine the integrity of documents sent by Alice. To this end, Alice generates a random pair of close/open combinations and only sends the open-combination to Bob. Now, Alice takes a pke-combination lock, puts the document in a safe, seals it with the lock using her close-combination, and sends it to Bob by regular mail. On delivery, Bob can open the safe with the open-combination. Assuming that Alice's close-combination has not been compromised, she can never repudiate safes closed with her close-combination. This specific sealing by Alice is called "digital signing"; the opening by Bob is called "verifying".

In both applications of public-key encryption, one of the combinations may be (and usually is) publicly known, while the other one must remain secret: in pke jargon they are called respectively the public key and the private key. Whether the close-combination or the open-combination is publicly known depends on which of the two applications, confidentiality or integrity, is needed. In both applications, only one public key is sufficient for communication with all partners. It is this important improvement over the use of symmetric cryptography that makes public-key encryption the appropriate basis for security in the information society. In practice, e.g., in the freeware program Pretty Good Privacy (PGP), both applications are usually combined to obtain safeguards for both confidentiality and integrity. In this fashion, Alice first signs a message with her private key, and then encrypts the result with the public key of Bob. On delivery, Bob decrypts the incoming message with his own private key and then verifies the result with Alice's public key.
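The two applications can be made concrete with a small, deliberately insecure sketch: textbook RSA with tiny primes and without padding or message formatting (real systems use large keys and padding schemes), so every number and name below is purely illustrative.

```python
# Toy textbook RSA (tiny primes, no padding): encrypting with the receiver's
# public key gives confidentiality; signing with the sender's private key
# gives integrity/non-repudiation.  Illustration only - never use as-is.

def toy_rsa_keypair(p, q, e=65537):
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)                      # modular inverse (Python 3.8+)
    return (e, n), (d, n)                    # (public key, private key)

bob_pub, bob_priv = toy_rsa_keypair(61, 53)      # Bob's confidentiality pair
alice_pub, alice_priv = toy_rsa_keypair(67, 71)  # Alice's signature pair

# Confidentiality: Alice encrypts a (small) message with Bob's public key;
# only the holder of Bob's private key can recover it.
m = 42
ciphertext = pow(m, *bob_pub)                # m^e mod n
assert pow(ciphertext, *bob_priv) == m

# Integrity: Alice signs a message digest with her private key;
# anyone holding her public key can verify the signature.
digest = 1234 % alice_pub[1]
signature = pow(digest, *alice_priv)         # digest^d mod n
assert pow(signature, *alice_pub) == digest
```

The asymmetry is the whole point: which of the two exponents is published determines whether the pair serves confidentiality or integrity, exactly as with the close- and open-combinations above.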
As public-key encryption on whole messages is too time-consuming, usually a combination of symmetric and public-key cryptography (a hybrid system) is used. For confidentiality purposes, the message is (symmetrically) enciphered with a randomly generated session key, and this - short - session key is (public-key) enciphered with the public key of the receiver. This public-key-enciphered session key is sent along with the (symmetrically) enciphered message. On delivery, the receiver can use his private key to find the session key, and hence decrypt the message with this session key. By signing a short digest (hash) instead of the whole message, integrity can also be efficiently realised by a hybrid system (a sketch of this hybrid construction follows at the end of this section).

Securing communication in the information society now seems quite straightforward. Each citizen generates two public key pairs, one for confidentiality and one for integrity purposes. Publishing both public keys can be done very efficiently by using a directory server, a digital version of a telephone book. In effect, PGP makes possible the construction of these public keys. Usually they are sent to others by Internet E-mail, but directory servers for PGP public keys exist as well.

To illustrate a serious problem in this setting, suppose that Alice receives a digitally signed message from somebody calling himself Bob, a business partner of hers, asking her to send some confidential company information. She looks up Bob in the directory server, fetches his (verification) public key and verifies the signature. Now suppose the verification succeeds; how sure can she then be of the integrity of the message? Well, just as sure as she can be of the integrity of "Bob's" public key, since somebody impersonating Bob may have put this public key on the server. A similar conclusion holds for Alice's use of Bob's public key - also fetched from the server - to protect confidential information she sends to Bob: Alice can only be as sure that confidentiality is safeguarded as she can be of the integrity of this public key. So, in public-key encryption, the integrity of public keys is crucial, just as the confidentiality of keys is with conventional, symmetric cryptography.

Does this mean that Alice should only trust public keys when she has received them from Bob in person? Luckily not, as this would make the key management of public-key encryption nearly as inconvenient as that of conventional cryptography. The solution that PGP uses is a "web of trust", based on the paradigm "a friend of a friend is a friend". Here, users can bind public keys to their owner by creating certificates: the public key plus the identity of the owner, signed by a user. Users only certify public keys when they can vouch for their integrity, e.g., when they were handed over in person. Usually, PGP public keys are signed by several people. So, if the public keys of Alice and Bob are signed by a mutual friend, they can trust each other's public keys. Apart from the problem that a friend of a friend need not be a friend, people will have different notions about vouching for integrity. And how is a user going to securely communicate with somebody (e.g., a digital store) not contained in his "web of trust"? Moreover, as there are no formal responsibilities in the scheme, whom is a user going to sue if he has been cheated? Another problem is public key revocation: how is a user going to quickly revoke his public key if he has lost his private key?
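Returning to the hybrid construction described at the start of this section, the following sketch shows its structure; a SHA-256-derived XOR keystream stands in for the symmetric cipher and textbook RSA with tiny primes for the public-key part, so all parameters and names are assumptions made for illustration only (signing a digest of the message would be added analogously).

```python
# Hybrid encryption sketch: a random session key encrypts the (long) message
# symmetrically; only the short session key is public-key encrypted.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR the data with a SHA-256-derived keystream."""
    stream, counter = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

def toy_rsa_keypair(p, q, e=65537):
    n, phi = p * q, (p - 1) * (q - 1)
    return (e, n), (pow(e, -1, phi), n)

bob_pub, bob_priv = toy_rsa_keypair(61, 53)

# Sender: symmetric encryption of the message, pke encryption of the key only.
message = b"confidential company information"
session_key = secrets.randbelow(3233)             # must stay below the modulus
enc_message = keystream_xor(str(session_key).encode(), message)
enc_session_key = pow(session_key, *bob_pub)      # sent along with enc_message

# Receiver: recover the session key with the private key, then the message.
recovered_key = pow(enc_session_key, *bob_priv)
assert keystream_xor(str(recovered_key).encode(), enc_message) == message
```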
A horizontal key-certification structure, then, is too problematic to serve as a basis for an information security infrastructure. A better solution is to bring in a party trusted by
everyone working in a well-established framework. Both Alice and Bob go
to this party, a certifying authority (CA), identify themselves, e.g.,
with a passport, and hand over their public keys. The CA digitally certifies
these public keys, makes them publicly available in a directory server
and hands over its own (verification) public key, so that both Alice and
Bob can verify other public keys signed by this CA. Most of the standards
for public-key certificates give rise to a tree-like Public Key Infrastructure
(PKI) depicted below. There is one "root" which creates the
overall guidelines for the entire PKI and signs the public keys of all
the CAs directly underneath it (CA1 and CA2). All CAs are expected to
certify the public keys of users and the CAs below them in accordance
with the overall guidelines.

A (certified) public key can be considered a digital passport in the information society. This metaphor is particularly appropriate as authentication of the owner of a public key prior to certification is usually based on a conventional passport. It is generally believed that PKIs will provide an important tool for security in the information society. Several organizations (e.g., banks, multinationals, telephone companies) and governments (USA - NIST, Canada) are developing separate PKIs for secure communications for themselves and with their clients (customers and citizens). PKIs will indeed present an information security structure in the information society, provided, first, the PKIs are part of a legal framework with adequate supervision, and, second, there are not too many separate PKIs: citizens in the information society should not have to use a different private/public key pair for each branch of trade they do business with. So, PKIs should be developed for large target groups.

The development of a legal framework for PKIs is obviously a task for governments. Moreover, there are at least three reasons why the supervision of PKIs should be in the hands of governments as well. First of all, a PKI can be considered the digital equivalent (with respect to authentication) or extension (with respect to confidentiality) of the "structure of passports", which has always, and for good reasons, been in the hands of governments. It seems only natural that in future, governments themselves will distribute certified public keys for their citizens, on smart cards, as part of a passport. This, for instance, is one of the aims of the Danish Department of the Interior with the "Danish citizen card". So, governments will become the operators of large PKIs. Second, governments are in a good position to make worldwide agreements on the international recognition, standards and policies for these PKIs. Indeed, governments are in a good position to develop a worldwide PKI, which is very tempting for the private sector to hook on to. Third, governments will be the prime and most central users of PKIs: large exchanges of information between governments, businesses and citizens (e.g., publishing new legislation, online tax declaration, electronic voting) require integrity and confidentiality. Therefore, although the actual construction of PKIs could and should be done mainly by the private sector, governments (in particular the departments of Commerce and the Interior) are in the best position to lead and supervise this development. Actually, it is in this fashion that the government of the US state of Utah has recently started to set up a Utah-wide PKI (for digital signatures).
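To make the tree-like certification idea concrete, here is a minimal, insecure sketch in which the root certifies a CA's public key and that CA certifies a user's key, so that anyone trusting only the root key can verify the chain; textbook RSA with tiny primes stands in for the real signature scheme, and the names and certificate format are assumptions made purely for illustration.

```python
# Toy certificate chain: root -> CA1 -> Alice.  Illustration only.
import hashlib

def toy_rsa_keypair(p, q, e=65537):
    n, phi = p * q, (p - 1) * (q - 1)
    return (e, n), (pow(e, -1, phi), n)

def digest(data: bytes, n: int) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(priv, data: bytes) -> int:
    d, n = priv
    return pow(digest(data, n), d, n)

def verify(pub, data: bytes, signature: int) -> bool:
    e, n = pub
    return pow(signature, e, n) == digest(data, n)

def certify(issuer_priv, subject_name: str, subject_pub) -> dict:
    body = f"{subject_name}|{subject_pub}".encode()
    return {"name": subject_name, "pub": subject_pub,
            "sig": sign(issuer_priv, body)}

def check_cert(issuer_pub, cert: dict) -> bool:
    body = f"{cert['name']}|{cert['pub']}".encode()
    return verify(issuer_pub, body, cert["sig"])

root_pub, root_priv = toy_rsa_keypair(61, 53)
ca1_pub, ca1_priv = toy_rsa_keypair(67, 71)
alice_pub, _ = toy_rsa_keypair(73, 79)

ca1_cert = certify(root_priv, "CA1", ca1_pub)        # root certifies CA1
alice_cert = certify(ca1_priv, "Alice", alice_pub)   # CA1 certifies Alice

# Bob, trusting only the root's public key, verifies Alice's certificate:
assert check_cert(root_pub, ca1_cert) and check_cert(ca1_pub, alice_cert)
```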
4. Problems for Law Enforcement and National Security

The problems for law enforcement and national security surface in wiretaps and searches. Encrypted communications render wiretapping useless - and law-enforcement agencies generally regard wiretapping as an effective, often essential, tool in gaining information on criminals and their networks. Also, the encryption of stored information effectively prevents the police from gathering evidence during a search or seizure. In this article, we focus on the wiretapping problem - as far as it is related to an information security infrastructure. Apart from addressing the overall threat of cryptography to law enforcement and national security [see Koops], governments face the task of stimulating the establishment of a good information security infrastructure for "lawful" use. Thus, governments must address this issue, while taking care that this infrastructure does not facilitate criminals in escaping law-enforcement scrutiny. Many governments are looking at (voluntary or mandatory) key escrow (i.e., key deposits) (see Section 6) as a viable solution. Preventing abuse in such a government-supported security infrastructure does not mean solving the general problem of criminal encryption use outside of the infrastructure. It merely aims to prevent criminals from gaining advantage in using the infrastructure.
5. A Security Infrastructure: Criteria for Solutions

Essential criteria

c. Enough strength to resist any realistic criminal threat, relative to the sensitivity of the application and the risk involved.

d. Details of solutions should be public. Subjects in the information society who want to communicate securely have to trust the solutions they use. To accomplish this trust, it is essential that anyone is able to closely examine, or to have experts closely examine, the details of the solutions.

e. Not frustrate law enforcement. The tools in the information society that law-abiding subjects can use to secure their communication can also be used by subjects to support criminal or subversive activities. As this subversive use of cryptography is contrary to the interest of society at large, security solutions in the information society should try to prevent abuse. This criterion does not prevent criminal or subversive subjects from finding other security solutions for protecting their communication. It merely prevents them from using regular solutions for this.

f. Abuse is difficult and easily detectible. Not only should the regular use of security solutions in the information society be of no aid to criminal or subversive activities, it should also be difficult to circumvent the measures taken to achieve this. In particular, it should be difficult or even impossible to achieve such advantage unilaterally [cf. LWY, p.199]. To illustrate this point, the Clipper chip (see below) can be manipulated in a unilateral way to make law-enforcement decryption impossible [Blaze]. It should also be difficult for two colluding criminals to effectively gain advantage from the solution, i.e., having its encryption advantages (such as key management) but not its law-enforcement "disadvantages". Especially with software-only solutions, abuse by collusion is hard to prevent. So, as an extra safeguard, we suggest that solutions should have some means to make abuse at least detectible by a wide range of third parties (e.g., network and service operators). This ability of detection should preferably not give these parties access to confidential information. Finally, we remark that the measures taken for preventing abuse should be such that the benefit/effort trade-off for criminals in abusing the system is negative; abuse need not be impossible, it should only be made difficult enough.

g. Incorporation into an organizational, legal framework. In order to prosecute cheaters and to address liability issues, a security solution must be accompanied by a trustworthy organizational framework that is legally recognized.

h. Consideration for constitutional rights. Security solutions in the information society must take into account constitutional and fundamental rights, such as the freedom of speech and the privacy of correspondence and communications. In essence, this consideration should be such that the current balance between the privacy of users and the coercive measures of law-enforcement agencies remains the same in the information society. As an illustration: when law-enforcement agencies are given "master keys" of the person they are lawfully wiretapping, then the person's past and future privacy might also be jeopardized. Therefore, it must remain possible for the courts to enforce the time limits of a warrant (time-boundedness).

Desirable criteria

k. Be easily available. Given the importance of information security, solutions should be easily available to everyone, and thus not be too expensive.
For security solutions of high quality and performance, it is best to put some components in (tamper-proof) hardware. Other security solutions, especially those used for basic security, should also be completely deliverable in (usually cheaper) software.

l. Offer optional safeguards ("safety belts") for users. Security solutions, especially those to maintain confidentiality, can also work against users and companies. If, for instance, keys are lost, deliberately destroyed or withheld (e.g., by disgruntled employees), this can effectively imply loss of information. Security solutions should have optional, flexible safeguards to prevent this.
6. Proposed Data Recovery Solutions

For precision's sake, we will distinguish between various sorts of TTPs: we shall reserve the term TTPs for those parties who perform "everyday" functions for (public-key) cryptography: certification, time-stamping, key distribution and revocation, and the like. TTPs who hold (escrowed) master keys in deposit we will call Key Escrow Agencies (KEAs), and TTPs who help law-enforcement agencies in decrypting tapped messages we shall call Trusted Recovery Parties (TRPs). TRPs can provide law-enforcement access by operating as a key-escrow agency, but also as a session-key escrow agency by serving as a virtual addressee (see below).
The US Digital Signature Standard (DSS)

As such, this proposal scores badly on criterion a and, possibly, on criteria e and f.

European Trusted Services

As such, the proposal meets criteria a and i. Criteria j and k are likely to be met, but the proposal leaves this open. As a solution for establishing an information security infrastructure, the proposal scores well, but as to discouraging abuse, it is less adequate. A major problem of the proposal is the way data recovery is handled. The proposal seems to imply that private keys are escrowed with KEAs. Apart from raising privacy issues (criterion h), the proposal is too vague to assess the ways in which it intends to discourage abuse. Also, as the proposal does not (yet) entail a harmonisation of national rules nor impose a specific solution, international cooperation within the EU (KEAs yielding (session) keys to foreign judiciary) will be questionable, let alone worldwide cooperation (criterion b).
Escrowed Encryption Standard (Clipper)

Trusted Information Systems [TIS] has proposed a software variant of Clipper using publicly known (public-key) encryption techniques. In [LWY], a technique is proposed that would make it possible for a KEA to release chips' master keys with an expiry date, safeguarding time-boundedness (criterion h).
Partial Data Recovery

By the time-boundedness condition (criterion h), the KEA should not hand over private keys of participants but rather effect decryption itself. A potential problem with this scheme (cf. [Frank]), as with any scheme that relies on law-enforcement decryption through accessing a private key, is that it works only one-way: the addressee's private key is needed for the decryption, not the sender's. This means that if someone suspected of criminal activities is being wiretapped and sends a message to a "good guy", law enforcement needs the cooperation of the KEA to decrypt the message with the good guy's private key. For this, the KEA must be convinced the message came from the suspect in the period covered by the warrant. Therefore, the system would require mandatory time-stamping and signing of messages, or network operators would have to provide KEAs with evidence of the time and origin of messages. (This is not enough to overcome the problem of a policeman conspiring with a criminal to recover a highly sensitive message; see below for a solution to the "tempted policeman".)

Another problem is, of course, when the addressee lives abroad. The concept is not easy to use if one of the communicating parties and its KEA are outside the jurisdiction of the law-enforcement agency conducting a wiretap. For various reasons, international cooperation of the KEA with the law-enforcement agency might be difficult and time-consuming. For instance, the conditions for interception of communications differ per country. Also, fears of economic espionage will make countries wary of handing over keys to foreign governments; an international treaty of countries agreeing to mutually hand over keys to foreign law-enforcement agencies is unlikely to be agreed upon soon. Another reason why the general PKI key-escrow concept is less acceptable to users is that they have to trust the KEA unconditionally: if it is corrupted, then so are the user's private keys. In short, this proposal scores badly on criterion b and possibly on i. Assuming the KEAs will not hand over private keys, the proposal does protect constitutional rights as much as under existing wiretap laws (criterion h).

Fair Cryptography

To overcome the problem of KEA corruption, one might - as in the Escrowed Encryption Standard - let the user split his private key into two or more pieces which he hands over to different key-escrow agencies of his choice: only with all shares can the user's private key be reconstructed. Although more acceptable to users, this concept is easily abusable by criminal or subversive subjects. Indeed, receiving a single share of the key, a KEA has no way of telling that it is indeed part of the private key, or that all shares have been deposited with a KEA: a user might have sent all KEAs some useless data. Silvio Micali has proposed a solution to this problem, which he calls fair public-key cryptography [Mica]. He proposes a splitting method in which the pieces of the key have the additional property that they can be individually verified by the KEA to be correct, without reconstructing the private key. Although fulfilling criterion i and helping in meeting criteria f and h, this proposal does not address international cooperation (criterion b).
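The kind of individually verifiable splitting Micali has in mind can be illustrated, for a discrete-log private key, with the following sketch; this is not Micali's actual construction, and the tiny group parameters and share count are assumptions made purely for illustration.

```python
# Verifiable additive splitting of a discrete-log private key x (y = g^x):
# each KEA can check its own share against public commitments, and anyone
# can check that all committed shares together account for the deposited
# public key, without reconstructing x.  Toy parameters; illustration only.
import secrets

p, q, g = 2039, 1019, 4            # toy subgroup of prime order q in Z_p*

x = secrets.randbelow(q - 1) + 1   # user's private key
y = pow(g, x, p)                   # deposited (certified) public key

n_shares = 3
shares = [secrets.randbelow(q) for _ in range(n_shares - 1)]
shares.append((x - sum(shares)) % q)            # additive split of x mod q
commitments = [pow(g, s, p) for s in shares]    # public commitments g^share

# Each KEA i, holding only shares[i] plus the public commitments, verifies
# that its piece is genuine:
for i in range(n_shares):
    assert pow(g, shares[i], p) == commitments[i]

# Anyone can verify that the committed shares add up to the private key
# behind y, since g^(sum of shares) must equal y:
product = 1
for c in commitments:
    product = product * c % p
assert product == y
```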
The Royal Holloway Concept

In the Royal Holloway proposal [Holl], the key-escrow agencies of different countries pairwise share secret keys; the key shared by the American and British agencies is denoted K(A,B). Now suppose citizens Alice in the USA and Bob in Britain (associated with KEA-A and KEA-B respectively) want to have a confidential communication. Then, Alice and Bob are each given - in a secure fashion - private/public key pairs by their respective KEAs. The construction of the private/public key pairs of Alice and Bob happens in a special way: they are constructed solely from their identity and the mutual secret key K(A,B). Therefore, the American KEA-A can also construct the public/private key pair of British Bob (and vice versa) without having to communicate with its foreign counterpart. So, if there is a warrant for legal interception of the communication between Alice and Bob, the intercepting party can retrieve information related to the private keys of both Alice and Bob from the associated key-escrow agency within its jurisdiction and thus decrypt international communications. This property makes the concept potentially usable worldwide.

A problem with the Royal Holloway concept is that users will need a separate private/public key pair for each foreign KEA with whose users they are communicating. Moreover, the Royal Holloway concept uses a rather rigid public-key encryption scheme that makes possible the secure exchange of only one shared key per public key: for a new shared key, either the sender or the receiver has to obtain a new public key from his KEA. Therefore, this concept scores rather badly on criteria a, j and possibly on i.
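A toy sketch of this general idea (not the actual Royal Holloway protocol of [Holl]) might look as follows; the group parameters are tiny and the hash-based derivation function is an assumption made here purely for illustration.

```python
# Deriving a user's key pair deterministically from the inter-KEA secret
# K(A,B) and the user's identity, so that either agency can reconstruct it
# on its own.  Toy parameters; illustration only.
import hashlib

p, q, g = 2039, 1019, 4                     # toy subgroup of prime order q

K_AB = b"secret shared by KEA-A and KEA-B"  # hypothetical shared secret

def derive_keypair(shared_secret: bytes, identity: str):
    x = int.from_bytes(
        hashlib.sha256(shared_secret + identity.encode()).digest(), "big") % q
    return x, pow(g, x, p)                  # (private key, public key)

# KEA-B issues Bob's key pair; KEA-A can recompute it without any contact.
bob_keys_from_B = derive_keypair(K_AB, "Bob")
bob_keys_from_A = derive_keypair(K_AB, "Bob")
assert bob_keys_from_A == bob_keys_from_B
```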
Trusted Information Systems' Commercial Key Escrow (TIS-CKE)

In TIS-CKE, an encrypted message consists of three components: the (symmetrically) encrypted message itself, the session key encrypted with the public key of the addressee, and the session key encrypted with the public key of a TRP ("virtually addressed" to the TRP). By sending along useless data instead of a session key encrypted with the public key of the TRP, unilateral abuse is easily possible and will only be detected in case of a lawful wiretap. This is prevented in TIS-CKE (or actually in its successor RecoverKey International) by having the decryption software of the addressee first validate whether the third component indeed contains the session key encrypted with the public key of the TRP; if it does not, the software refuses to decrypt. However, abuse by collusion of sender and receiver - through manipulation of this validation in the software - is still (easily) possible and will only be detected in case of a lawful wiretap (contrary to criterion f). No solution for obtaining foreign access to session keys is given; therefore, law-enforcement agencies could only decrypt outgoing messages - unless the sender includes keys for both a domestic and a foreign TRP. However, the proposal does not include the option of virtually addressing keys to more than one TRP. Thus, the proposal scores less well on criteria b and e. Because of the fixed 64-bit key length, the proposal does not score well on criterion c. Also, abuse by collusion is (easily) possible (contrary to criterion f). In the available information, the property of time-boundedness is not explicitly addressed. In principle, users have a wide choice of TRPs, although they have to trust them completely, since key splitting is not supported. So, the proposal scores reasonably on criterion i. Finally, the proposal scores well on criteria d, j, k and l.

Translucent cryptography

One can argue that this concept strengthens the properties of any TIS-CKE type proposal with respect to constitutional rights (criterion h).

7. The Binding Alternative

The Technical Perspective

Our point of departure is a concept in the style of TIS-CKE, in which an encrypted message consists of three components:

1. the message, (symmetrically) encrypted with a session key;
2. the session key encrypted (using pke) with the public key(s) of the addressee(s);
3. the session key encrypted (using pke) with the public key of a TRP.

As explained above, a drawback of this concept is that unilateral abuse is easily possible by sending nonsense in the third component. This can be prevented by having the decryption software of the addressee first validate whether the third component indeed contains the session key encrypted with the public key of the TRP; if it does not, the software could refuse to decrypt. However, abuse by collusion of sender and receiver - through manipulation of this validation in the software - is still easily possible. So, the solution is almost entirely unenforceable. Therefore, we propose a binding alternative, which adds a fourth component to the encrypted message:

4. binding data.

The idea is that any third party, e.g., a network or service provider, who has access to components 2, 3 and 4 (but not to any additional (secret) information) can - either online or offline:

a) determine that the session keys in components 2 and 3 coincide;
b) not determine any information on the actual session key.

In this way, fraud is easily detectible: a sender who virtually addresses to the TRP (component 3) a session key different from the one he actually uses on the message (or just nonsense) will be discovered by anyone checking the binding data. If such checking happens regularly, fraud can be properly discouraged and, if desirable, fined. Constructing binding data is feasible. An outline of the construction of binding data for an important public-key encryption system (ElGamal) can be found at [W3]. This outline will be elaborated in a separate article [VT].
The construction is based on the technique of zero-knowledge proofs. We expect that this construction can be improved and that many other public-key encryption systems can be equipped with binding data. We present this as a challenge to the cryptographic research community. Use of the ElGamal scheme is particularly interesting as, on 29 April 1997, ElGamal will become unencumbered by patents in the USA [Schn, p.479]. Moreover, the next version of PGP (3.0) will include ElGamal.

For non-technical readers, we hope a simile may give some understanding of how the binding data work. Imagine an envelope (which serves as a safe, as in the analogy in Section 3) that is sealed with a combination lock which can be locked with the public close-combination of the receiver. Alice uses this system to securely send a session key to Bob. She puts a letter (the session key) in an envelope and seals this with a combination lock which only Bob can open. Then, she puts a letter (which should be the same session key) in an envelope and seals it with the close-combination of a TRP. She sends both envelopes as a message to Bob. Now, a monitor should be able to check whether the two letters are identical, without seeing the letters himself. For this, the system requires special envelopes, the back of which is a (coloured) filter selectively transparent to a certain colour. The letter is printed in many different colours, in such a way that each letter is built up of hundreds of differently-coloured pixels; this is illustrated in the figure (different characters represent different colours). The sender now computes the binding data, which is the colour that the filter should be selectively transparent to; this computation depends on many data, such as Alice's and Bob's identity and the date and time. In effect, Alice cannot predict beforehand the colour of the filter at the back of the envelope, and so she cannot manipulate the message. Now, the monitor (e.g., a postman) can first compute and check the colour of the filter and then check whether the letters in the envelopes match by simply checking the pixel patterns appearing through the filter (the appearing "q" in the figure). If these match, the letters must be identical - after all, the colour was chosen arbitrarily. Moreover, the monitor does not get any information on the letter's contents; although in the simile he might gain a slight amount of information, in our technical proposal complete confidentiality is guaranteed.

The binding concept that we envision supports the virtual addressing of session keys to several TRPs (or none, for that matter), for instance, one to a TRP in the country of the sender and one in the country of the addressee. The solution therefore offers the same advantage for worldwide usability as the Royal Holloway concept. We also remark that the concept supports the use of controllable key splitting in the sense of Micali as well, even in two ways. First, the private TRP key can be split into several parts and be deposited with several TRPs. It appears that some systems (e.g., ElGamal) can very conveniently support the splitting and reconstruction of private TRP keys by users themselves (details will appear in [VT]). Second, a sender can split the session key and address all the shares separately to the addressee and virtually to various TRPs using the binding concept. Moreover, the number of shares and the TRPs can - in principle - be chosen freely by each user.
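The flavour of such a construction can be shown with a small sketch; this is not the authors' actual Binding ElGamal construction of [VT], but a standard zero-knowledge proof (made non-interactive with a hash) that two ElGamal ciphertexts hide the same session key, using tiny, insecure group parameters chosen purely so the example runs.

```python
# Toy "binding data" for ElGamal: the sender encrypts the same session key k
# to the addressee and to a TRP, and attaches a proof that both ciphertexts
# hide the same k.  A monitor can check the proof from public data only and
# learns nothing about k.  Illustration only.
import hashlib
import secrets

p, q, g = 2039, 1019, 4                 # toy subgroup of prime order q in Z_p*

def keygen():
    x = secrets.randbelow(q - 1) + 1
    return x, pow(g, x, p)              # (private key, public key)

def encrypt(y, k):
    """ElGamal-encrypt group element k under public key y; also return r."""
    r = secrets.randbelow(q - 1) + 1
    return r, (pow(g, r, p), k * pow(y, r, p) % p)

def hash_to_zq(*items):
    h = hashlib.sha256("|".join(map(str, items)).encode()).digest()
    return int.from_bytes(h, "big") % q

def bind(y_bob, y_trp, r, s, c, d):
    """Binding data: proof that c (to Bob) and d (to TRP) hide the same k."""
    A, B = c[0], d[0]
    C = c[1] * pow(d[1], p - 2, p) % p          # c2/d2 = y_bob^r * y_trp^-s
    a, b = secrets.randbelow(q), secrets.randbelow(q)
    t1, t2 = pow(g, a, p), pow(g, b, p)
    t3 = pow(y_bob, a, p) * pow(pow(y_trp, p - 2, p), b, p) % p
    e = hash_to_zq(g, y_bob, y_trp, A, B, C, t1, t2, t3)
    return (t1, t2, t3, (a + e * r) % q, (b + e * s) % q)

def check_binding(y_bob, y_trp, c, d, proof):
    """Monitor's check: accepts iff c and d encrypt the same session key."""
    A, B = c[0], d[0]
    C = c[1] * pow(d[1], p - 2, p) % p
    t1, t2, t3, u, v = proof
    e = hash_to_zq(g, y_bob, y_trp, A, B, C, t1, t2, t3)
    ok1 = pow(g, u, p) == t1 * pow(A, e, p) % p
    ok2 = pow(g, v, p) == t2 * pow(B, e, p) % p
    lhs = pow(y_bob, u, p) * pow(pow(y_trp, p - 2, p), v, p) % p
    return ok1 and ok2 and lhs == t3 * pow(C, e, p) % p

# Alice sends session key k to Bob, virtually addressed to the TRP as well.
x_bob, y_bob = keygen()
x_trp, y_trp = keygen()
k = pow(g, secrets.randbelow(q - 1) + 1, p)       # session key as group element
r, c = encrypt(y_bob, k)
s, d = encrypt(y_trp, k)
proof = bind(y_bob, y_trp, r, s, c, d)
assert check_binding(y_bob, y_trp, c, d, proof)   # honest sender passes

# A mismatched third component does not verify (and, by soundness, no valid
# proof for it exists); with these toy parameters a false pass has ~1/q odds.
_, d_bad = encrypt(y_trp, pow(g, 7, p))           # different key for the TRP
print(check_binding(y_bob, y_trp, c, d_bad, proof))   # almost always False
```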
Finally, we remark that the time-boundedness condition (criterion h) can be fulfilled by additionally demanding that encrypted information (or all components) be time-stamped and signed by the sender; a condition that can be publicly verified by any third party as well.

An additional feature could prevent the threat of the "tempted policeman". This tempted policeman might conspire with a criminal and have the criminal resend (or "receive") an intercepted, highly confidential business message. The TRP, thinking the message originated from the (wiretapped) criminal, would assist the policeman in decrypting it. In the binding scheme, this can be prevented by additionally requiring senders to virtually address the session key to themselves as well, thereby binding the message to both sender and addressee. The TRP should check this component before assisting law enforcement, and monitors could check on compliance. Incidentally, this feature can also solve a similar problem in TIS-CKE and in the US Key Management Infrastructure initiative. In the latter, it also overcomes the problem of international communications: the KEA holds the private key of the sender and can therefore retrieve the session key. Thus, binding cryptography can also benefit other proposals.

The Organizational Perspective

Being the developers of this PKI, governments can determine the standards the PKI will support and the ways the PKI may be used. With respect to confidentiality, the infrastructure should support only the use of public-key encryption systems that incorporate binding data; acceptance of new systems must be agreed within the governmental cooperation. The PKI has the following four players:

- Users, i.e., governments, businesses, and citizens;
- TTPs, i.e., parties performing everyday functions such as certification;
- TRPs;
- Monitors, e.g., network and service providers who check binding data.
The organizational and legal framework for the PKI, to be agreed upon internationally, should settle at least the following issues:

1. General legislation on PKIs: the use of and responsibilities concerning digital signatures, certifying authorities, etc.

2. Whether virtually addressing keys to domestic or foreign TRPs for domestic or international encrypted communications is obligatory for participation in the PKI, and if so, to what extent the splitting variant of fair cryptography may be used.

3. The conditions under which TTPs and TRPs may be set up, the conditions under which the roles of TTPs and TRPs may be combined, and the conditions under which TRPs have to cooperate with law-enforcement agencies.

4. Who plays the role of monitor, and what the sanctions for abuse are, e.g., blocking throughput, administrative fining and/or reporting to the judiciary. This role can be performed by network operators such as PTTs, but also by Internet Service Providers, who increasingly provide value-added services. Fulfilling this role could be a legal requirement for becoming a network or service provider. Moreover, the monitors could be monitored themselves by random checks by governments. Recall that for the monitor role, only access to the communications is necessary, not any additional (secret) information. The monitoring could likely be automated to a large extent.

5. The prohibition of the use, manufacture or trade of software/hardware that uses the PKI (in particular, the public-key certificates issued by the PKI TTPs) but does not comply with the binding rules of the PKI. This discourages, for example, a version of PGP that makes use of the PKI without complying with the binding rules. A complying version of PGP, which can easily be made, would be readily accepted.

6. The formulation of additional rules on cryptography that does not use the PKI and its binding policy. There could be a ban on (other) types of encryption, or on specific (illegal) use of it. We feel, however, that one should not ban certain types or uses of cryptography, as this could infringe upon constitutional rights and would likely disproportionately harm the beneficent use of cryptography.

A user who wants to use the PKI for securing his communication must comply with the binding regulations of both his own country and the country he is communicating with (provided, of course, that that country is a party to the international agreement).
Evaluation of the Binding Alternative

8. Conclusion
References

[Denn] D.E. Denning, Descriptions of Key Escrow Systems, http://www.cosc.georgetown.edu/~denning/crypto/Appendix.html.

[DSS] Proposed Federal Information Processing Standard for Digital Signature Standard (DSS), Federal Register, v. 56, n. 169, 30 Aug 1991, pp. 42980-42982.

[ETS] Proposal for a Council Decision in the field of security of information systems, concerning the establishment of a Europe-wide network of Trusted Third Party Services (ETS), draft, November 1995.

[Frank] Y. Frankel, M. Yung, 'Escrow Encryption Systems Visited: Attacks, Analysis and Designs', Advances in Cryptology - Crypto '95 Proceedings, Springer-Verlag, 1995, pp. 230-235.

[Holl] N. Jefferies, C. Mitchell, M. Walker, 'A Proposed Architecture for Trusted Third Party Services', Cryptography: Policy and Algorithms, Proceedings of the conference, Springer-Verlag (LNCS 1029), 1996, pp. 98-104.

[IWG] US Interagency Working Group on Cryptographic Policy, Enabling Privacy, Commerce, Security and Public Safety in the Global Information Infrastructure, 17 May 1996, see http://www.cdt.org/crypto/clipper_III.

[Koops] B.J. Koops, 'A survey of cryptography laws and regulations', Computer Law and Security Report, November/December 1996, pp. 349-355.

[LWY] A.K. Lenstra, P. Winkler, Y. Yacobi, 'A key escrow system with warrant bounds', Advances in Cryptology - Crypto '95 Proceedings, Springer-Verlag, 1995, pp. 197-207.

[Mica] S. Micali, 'Fair Public-key Cryptosystems', Advances in Cryptology - CRYPTO '92 Proceedings, Springer-Verlag, 1993, pp. 113-138.

[Schn] B. Schneier, Applied Cryptography: Protocols, Algorithms and Source Code in C, Second Edition, John Wiley and Sons, 1996.

[TIS] D.M. Balenson et al. (TIS Inc.), 'A New Approach to Software Escrow Encryption', in: L.J. Hoffman (ed.), Building in Big Brother, Springer-Verlag, New York, 1996, pp. 180-207. See also http://www.tis.com/.

[VT] E.R. Verheul, H.C.A. van Tilborg, 'Binding ElGamal: A fraud-detectable alternative to key-escrow solutions', submitted to Eurocrypt '97. A summary is available online.

[W3] http://cwis.kub.nl/~frw/people/koops/binding.htm