1997, Lecture Notes in Computer Science
The UK government is fielding an architecture for secure electronic mail based on the NSA's Message Security Protocol, with a key escrow scheme inspired by Diffie-Hellman. Attempts have been made to have this protocol adopted by other governments and in various domestic applications. The declared policy goal is to entrench commercial key escrow while simultaneously creating a large enough market that software houses will support the protocol as a standard feature rather than charging extra for it. We describe this protocol and show that, like the 'Clipper' proposal of a few years ago, it has a number of problems. It provides the worst of both secret and public key systems, without delivering the advantages of either; it does not support non-repudiation; and there are serious problems with the replacement of compromised keys, the protection of security labels, and the support of complex or dynamic administrative structures.

Footnote 1: A UK official who chairs the EU's Senior Officials' Group-Information Security (SOGIS) has since admitted that 'law enforcement' in this context actually refers to national intelligence [10].
Introduction
Over the last two years, the British government's crypto policy has changed completely. Whereas in 1994 the Prime Minister assured the House of Commons that no further restrictions on encryption were envisaged, we now find the government proposing to introduce a licensing scheme for 'trusted third parties', and licenses will only be granted to operators that escrow their customers' confidentiality keys to the government's satisfaction [11, ?].
In March 1996, a document describing the cryptographic protocols to be used in government electronic mail systems was issued by CESG, the department of GCHQ concerned with the protection of government information; it has since been made available on the worldwide web [4]. According to this document, policy goals include 'attempting to facilitate future inter-operability with commercial users, maximising the use of commercial technology in a controlled manner, while allowing access to keys for data recovery or law enforcement purposes if required' (footnote 1).
A document on encryption in the National Health Service, issued in April, had already recommended that medical traffic should be encrypted, and keys should be managed, using mechanisms compatible with the future 'National Public Key Infrastructure' [26]; part of the claimed advantages for the health service were that the same mechanisms would be used to protect electronically filed tax returns and applications from industry for government grants. Furthermore, attempts are being made to persuade other European countries to standardise on this protocol suite.
So the soundness and efficiency of the GCHQ protocol proposals could be extremely important. If an unsound protocol were to be adopted across Europe, then this could adversely affect not just the secrecy of national classified data, the safety and privacy of medical systems, and the confidentiality of tax returns and government grant applications. It could also affect a wide range of commercial systems too, and make Europe significantly more vulnerable to information warfare. If the protocols were sound but inefficient, then they might not be widely adopted; or if they were, the costs imposed on the economy could place European products and services at a competitive disadvantage.
In this paper, we present an initial analysis of the security and efficiency of the GCHQ protocol.
The GCHQ Protocol
The precursor of the government protocol was first published by Jefferies, Mitchell and Walker at a conference in July 1995 [13]. A flaw was pointed out there 2 and a revised version was published in the final proceedings of that conference; this version also appeared at the Public Key Infrastructure Invitational Workshop at MITRE, Virginia, USA, in September 1995 and at PKS '96 in Zürich on 1st October 1996 [14]. The final, government approved, version of the protocol fixes some minor problems and adds some new features.
The document [4] is not complete in itself, as the protocol is presented as a series of extensions to the NSA's Message Security Protocol [18]. In the next section we will attempt for the first time to present the whole system in a complete and concise way, suitable for analysis by the cryptologic and computer security communities. We will then discuss some of its more obvious faults.
The GCHQ architecture assumes administrative domains 'corresponding approximately to individual departments', although there may be smaller domains where a department is scattered over a large geographical area. Each will have a 'Certificate Management Authority', under the control of the departmental security officer, which will be responsible for registering users and supplying them with keys. Key management will initially be under the control of GCHQ but might, in time, be devolved.
The basic idea is that if Alice wants to send email to Bob, she must go to her certificate management authority, whom we will call TA, and obtain from him secret information that enables her to calculate a key for communicating with Bob. She also receives a certificate of this secret information, and sends this to Bob along with the encrypted message. On receipt of the message Bob contacts his certificate management authority TB and obtains the secret information that he needs to decrypt the message. Thus two individuals can communicate only if both their departmental security officers decide to permit this.
The communication flow can be visualised in the following diagram:

  1. A  -> TA : request key material for sending to B
  2. TA -> A  : secret send key, Bob's public receive key, certificates
  3. A  -> B  : encrypted message plus certificates
  4. B  -> TB : request key material for receiving from A
  5. TB -> B  : secret receive key

where TA and TB share a prearranged key.
We will now describe the content of these messages. The protocol is a derivative of Diffie Hellman [5] and the basic idea is that, in order to communicate with Bob, Alice must obtain a 'public receive key' for him from TA and operate on this with a 'secret send key' that TA also issues her, along with a certificate on the corresponding 'public send key'. At the other end, Bob will obtain a 'secret receive key' for her from TB and will use this to operate on her 'public send key' whose certificate he will check.
The secret receive keys are known to both users' authorities, and are calculated from their names using a shared secret master key. Each pair of domains TX, TY has a 'top level interoperability key', which we will call K TXY for managing communication. The relevant key here is K TAB which is shared between TA and TB. The mechanisms used to establish these keys are not described.
We will simplify the GCHQ notation by following [3] and writing {X} Y for the block X encrypted under the key Y using a conventional block cipher. Then the long term seed key that governs Bob's reception of traffic from all users in the domain of TA is:

  rseed AB = {B} K TAB
A secret receive key of the day is then derived by using this seed key to encrypt a datestamp:

  SRK B,D = {D} rseed AB
and Bob's public key of the day, for receiving messages from users in the domain TA, is

  PRK B,D = g A ^ SRK B,D (mod N A)
where the 'base' g A and the modulus N A are those of TA's domain (the document does not specify whether N A should be prime or composite, or the properties that the group generated by g A should possess).
Finally, TA certifies Bob's public key of the day as

  Cert(B, D, PRK B,D)
Only receive keys are generated using secrets shared between authorities. Send keys are unilaterally generated by the sender's authority from an internal master key, which we will call K TA for TA, and the user's name. Thus Alice's seed key for sending messages is sseed A = {A} K TA ; her secret send key of the day is derived as SSK A,D = {D} sseed A ; and her public send key is PSK A,D = g A ^ SSK A,D (mod N A). TA sends her the secret send key, plus a certificate Cert(A, D, PSK A,D ) on her public send key. Send seed keys may be refreshed on demand.
Now Alice can finally generate a shared key of the day with Bob as

  KAB D = PRK B,D ^ SSK A,D = g A ^ (SRK B,D . SSK A,D) (mod N A)
This key is not used directly to encipher data, but as a 'token key' to encipher a token containing a session key. Thus, when sending the same message to more than one person, it need only be encrypted once, and its session key can be sent in a number of tokens to its authorised recipients.
Anyway, Alice can now send Bob an encrypted version of the message M. According to the GCHQ protocol specification, certificates are sent with the object 'to simplify processing', so the packet that she sends to Bob (in message 3 of the diagram above) is actually

  Cert(A, D, PSK A,D ), Cert(B, D, PRK B,D ), {KS} KAB D , {M} KS

where KS is the session key conveyed in the token.
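The key derivation above can be sketched in Python. The block cipher, the key and name values, and the datestamp format are all illustrative assumptions here, since the GCHQ document leaves them unspecified; a hash stands in for the block cipher, and a deliberately tiny modulus is used so the sketch runs quickly:

```python
import hashlib

def E(key: bytes, block: bytes) -> int:
    """Stand-in for {X} Y: a keyed one-way function. The real protocol
    uses an unspecified conventional block cipher."""
    return int.from_bytes(hashlib.sha256(key + block).digest(), "big")

# Toy domain parameters for TA's domain (illustrative only; a real
# deployment would use a much larger modulus).
N_A = 0xFFFFFFFFFFFFFFC5   # the largest 64-bit prime
g_A = 5

K_TAB = b"interoperability-key-TA-TB"   # shared by TA and TB
K_TA  = b"internal-master-key-TA"       # known to TA alone
D     = b"1997-01-15"                   # the datestamp

# Receive side: derived by either authority from the shared K_TAB.
rseed_AB = E(K_TAB, b"Bob").to_bytes(32, "big")   # long-term seed key
SRK = E(rseed_AB, D) % (N_A - 1)   # secret receive key of the day
PRK = pow(g_A, SRK, N_A)           # public receive key of the day

# Send side: derived by TA alone from its internal master key.
sseed_A = E(K_TA, b"Alice").to_bytes(32, "big")
SSK = E(sseed_A, D) % (N_A - 1)    # secret send key of the day
PSK = pow(g_A, SSK, N_A)           # public send key of the day

# Alice raises Bob's public receive key to her secret send key; Bob
# raises Alice's public send key to his secret receive key. Both land
# on g_A^(SRK * SSK) mod N_A, the shared key of the day.
KAB_alice = pow(PRK, SSK, N_A)
KAB_bob   = pow(PSK, SRK, N_A)
assert KAB_alice == KAB_bob
```

The sketch makes the structural point concrete: both ends can only reach the shared key via material handed out by their authorities, who can recompute every quantity themselves.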
This protocol is rather complex. But what does it actually achieve?
Problem 1 - why not just use Kerberos?
The obvious question to ask about the GCHQ protocol is why public key techniques are used at all. After all, if TA and TB share a secret key, and Alice and Bob have to interact with them to obtain a session key, then one might just as well use the kind of protocol invented by Needham and Schroeder [19] and since deployed in products like Kerberos [20]. Where Alice shares the key K A with TA and Bob shares K B with TB, a suitable protocol might look like:

  1. A  -> TA : A, B
  2. TA -> A  : {A, B, K AB} K A , {A, B, K AB} K TAB
  3. A  -> B  : {A, B, K AB} K TAB , {M} K AB
  4. B  -> TB : {A, B, K AB} K TAB
  5. TB -> B  : {A, B, K AB} K B
This protocol uses significantly less computing than the GCHQ offering, and no more messages. It can be implemented in cheap commercial off-the-shelf tokens such as smartcards, and with only minor modification of the widely available code for Kerberos. This would bring the further advantage that the implications of 'Kerberising' existing applications have been widely studied and are fairly well understood in a number of sectors (see, e.g. [12]). On the other hand, the integration of a completely new suite of authentication and encryption software would mean redoing this work. Given that the great majority of actual attacks on cryptosystems exploit blunders at the level of implementation detail [1], this will mean less secure systems.
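The symmetric alternative can be sketched end to end in Python. The cipher here is a toy XOR keystream standing in for a real authenticated block cipher, and all key values and names are hypothetical:

```python
import hashlib, json, os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher for illustration only: XOR with a SHA-256
    derived keystream. XOR is its own inverse, so this both encrypts
    and decrypts. A real system would use an authenticated cipher."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# Long-term keys (values hypothetical): Alice/TA, Bob/TB, and the
# inter-authority key corresponding to K TAB in the GCHQ scheme.
K_A, K_B, K_TAB = b"alice-TA-key", b"bob-TB-key", b"TA-TB-key"

# 1. A  -> TA : A, B
# 2. TA -> A  : {A, B, K_AB} K_A , {A, B, K_AB} K_TAB
K_AB = os.urandom(16)
wrapped = json.dumps(["A", "B", K_AB.hex()]).encode()
ticket = keystream_xor(K_TAB, wrapped)
for_alice = keystream_xor(K_A, wrapped)

# 3. A  -> B  : {A, B, K_AB} K_TAB , {M} K_AB
K_AB_alice = bytes.fromhex(json.loads(keystream_xor(K_A, for_alice))[2])
M = b"the message"
packet = (ticket, keystream_xor(K_AB_alice, M))

# 4. B  -> TB : {A, B, K_AB} K_TAB   (Bob forwards the ticket to TB)
# 5. TB -> B  : {A, B, K_AB} K_B     (TB re-wraps it under Bob's key)
inner = json.loads(keystream_xor(K_TAB, packet[0]))
for_bob = keystream_xor(K_B, json.dumps(inner).encode())

# Bob recovers the session key and reads the message; note that no
# modular exponentiation was needed anywhere.
K_AB_bob = bytes.fromhex(json.loads(keystream_xor(K_B, for_bob))[2])
assert keystream_xor(K_AB_bob, packet[1]) == M
```

The point of the sketch is the cost comparison: every step is a symmetric operation, cheap enough for smartcards, whereas the GCHQ protocol requires modular exponentiations for the same five-message flow.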
The GCHQ response to this criticism is [15]:
This is not so much an attack on the recommendations as an objection to the Trusted Third Party concept and the need for key recovery. The recommendations offer a realistic architectural solution to a complex problem and, as with any system, will require professional implementation.
This completely misses the point. Given that the UK government has decided to adopt key escrow in its own internal systems, exactly the same functionality could have been provided by a simple adaptation of Kerberos at much less cost and risk. The only extra feature that appears to be provided by the GCHQ protocol is that users who receive mail from only a small number of other departments, and who operate under security rules that permit seed keys to persist for substantial periods of time, may save some communications with their TTPs by storing receive seed keys locally. This leads us to consider the issue of scalability.
Problem 2 - where are the keys administered?
How well the GCHQ protocol (or for that matter Kerberos) will scale will depend on how many key management authorities there are. With a large number of them -say, one per business enterprise -the problem of inter-enterprise key management would dominate and the above protocol would have solved nothing.
The British government may be aware of this problem, as they propose to minimise the number of authorities. Under the legislation currently proposed, large companies would be permitted to manage their own keys -the rationale being that having significant assets they would be responsive to warrants -while small to medium enterprises and individuals would have to use the services of licensed TTPs -organisations such as banks that would undertake the dual role of certificate management authority and escrow agent.
We do not believe that this will work. One of us has experience of a bank with 25,000 employees, managed through seven regional personnel offices, trying to administer mainframe passwords at a central site. With thirty staff and much message passing to and from the regions, the task was just about feasible, but compelling a million small businesses to conduct a transaction with the 'Trusted Third Party' every time a staff member was hired, fired or moved, would do little for national economic competitiveness.
Medicine is another application to consider, as the issue of encryption and signature of medical records is the subject of debate in a number of European and other countries. There is relevant experience from New Zealand, where a proposal to have doctors' keys managed by officials in the local district hospitals turned out to be impractical. It is now proposed that keys there should be managed at the practice level [9]. In the UK, with some 12,000 general practices, hospitals and community care facilities, centralised key management is even less likely to be workable.
The GCHQ response to this criticism is [15]:
It has also been suggested that a TTP network could become large and that some users would have to keep a large number of public keys. This problem is overcome in the Royal Holloway architecture since any user can obtain all the necessary key material from its local TTP. This is inherently more scalable than other approaches.

This again misses the point. If the UK health service, with 12,000 providers, has 12,000 TTPs, then the inter-TTP communications would be the bottleneck.
There is also the issue of trust. In the UK, the medical profession perceived the recommendation in [26] that key management should be centralised in a government body as an attempt to undermine the independence of the institutions currently responsible for professional registration -the General Medical Council (for doctors), the UK Central Council (for nurses), and so on. Retaining these organisations as top level CAs is essential for creating professional trust without which a security system would deliver little value.
But with the GCHQ protocol, this would appear to mean that a doctor who wished to send an encrypted email to a nurse working in the same practice would have to send a message to the GMC to get a key to encrypt the message, and the nurse would have to contact the UKCC to get a key to decrypt it. This is clearly ludicrous.
In short, the GCHQ protocol may work for a strictly hierarchical organisation, as government is sometimes thought to be (though if that were the case, a Kerberos-like system would almost certainly work better). But it is not flexible enough to accommodate real world applications such as small business and professional practice. This raises the question of whether it will even work in government. We suspect it would work at best badly - and impose a structural rigidity which could frustrate attempts to make government more efficient and accountable.
The GCHQ response to this criticism is [15]:
The frameworks for confidentiality and authentication have been designed to cater for a wide range of environments. A hierarchy is defined only for the authentication framework and this is necessary because good security requires tight control.
This claim is inconsistent with the protocol document according to which 'As the Certificate Management Authority is responsible for generating the confidentiality keys, it should also take on the role of a certification authority in order to authenticate them'. Thus the confidentiality and authentication hierarchies are clearly intended to be identical.
Rossnagel made the point that trust structures in the electronic world should mirror those in existing practice [23]; a point which all security engineers should consider carefully.
Problem 3 - should signing keys be escrowed?
The next problem is the plan to set up an escrowed trust structure of confidentiality keys first, and then bootstrap authentication keys from this [4] [26].
The GCHQ protocol defines a structure called a token to transfer private keys in an encrypted form. What is also required is a mechanism to convey public signature verification keys to the authority for certification, as well as a means to revoke signature keys (which should be independent of the 'key of the day' system that provides implicit revocation of encryption keys). Such mechanisms are not provided.
Similar considerations apply to MACs. The original US MSP has a mode of operation which provides confidentiality and integrity but not non-repudiation. In this mode, the message is not signed, and instead the confidentiality key (or a key derived from it) is used to generate a MAC on the message. As the GCHQ protocol is specified by citing the US MSP specification and explaining the differences, it would appear that this mode will also be a part of it; but when combined with the GCHQ key management, the effect is that an escrowed confidentiality key is used to authenticate the message.
Even if confidentiality keys are eventually required by law to be escrowed, the keys used for authentication must be treated differently, and there is a risk that programmers and managers responsible for implementing the GCHQ protocol might overlook this distinction and produce a flawed system. So it is worth explaining explicitly.
The stated purpose of key escrow is to enable law enforcement and other government employees to monitor the contents of encrypted traffic (and, in some escrow schemes, to facilitate data recovery if users lose or forget their keys). Its stated purpose does not include allowing government employees to create forged legal documents (such as contracts or purchase orders). It would be highly undesirable if people with access to the escrow system were able to use this access to forge other people's digital signatures. The scope for insider fraud and conspiracy to pervert the course of justice would be immense.
Any police officer will appreciate that if he can get copies of my bank statements, then perhaps he can use them in evidence against me; but if he can tracelessly forge my cheques, then there is no evidence at all any more. So if there is any possibility that a digital signature might be needed as evidence, then the private key used to create it must not be escrowed.
In fact, we would go further than this: keys which are used only for authentication (and not non-repudiation) should not be escrowed either. For example, suppose that some piece of equipment (e.g. a power station, or a telephone exchange) is controlled remotely, and digital signatures or MACs are used to authenticate the control messages. Even if these messages are not retained for the purposes of evidence, it is clearly important to distinguish between authorising a law enforcement officer to monitor what is going on and authorising him to operate the equipment. If authentication keys are escrowed, then the ability to monitor and the ability to create seemingly authentic control messages become inseparable: this is almost certainly a bad thing. Returning to the medical context, it is unlikely that either doctors or patients would be happy with a system that allowed the police to forge prescriptions, or the intelligence services to assume control of life support equipment. We doubt that a well informed minister would wish to expose himself and his officers in such a way.
In such applications, we need an infrastructure of signature keys that is as trustworthy as we can make it. Bootstrapping the trust structure from a system of escrowed confidentiality keys is unacceptable.
The GCHQ response to this criticism is [15]:
This confuses the authentication and confidentiality frameworks. There is no intention to bootstrap signature keys required for non-repudiation purposes within the authentication framework.
The protocol document states (2.2.1) that 'to provide a non-repudiation service users would generate their own secret and public authentication key pairs, then pass the public part to a certification authority'. But no mechanism for this is provided; in the rest of the document, it is assumed that all secret keys are generated by the certification authority, and both the secret and public parts passed to the user. Given GCHQ's response, we conclude that their protocol is not intended to provide a non-repudiation service at all.
Furthermore, both authentication and confidentiality key material is under the control of the Departmental Security Officer. This leads to an interesting 'plausible deniability' property. If there is a failure of security, and an embarrassing message is leaked, then it is always possible to claim that the message was forged (perhaps by the very security officer whose negligence permitted the leak in the first place).
For these reasons, if non-governmental use of the GCHQ protocol is contemplated -or compelled by legislation -then signing keys should be managed by some other means (and not escrowed). We also recommend that normal policy should prohibit the sending of MAC-only messages; if a MAC-only message is received, the purported sender should be asked to resend a properly signed version (there are some special purpose uses in which the MAC-only mode is useful, but we won't describe them here).
Problem 4 - clear security labels
In the original NSA Message Security Protocol, the label describing the security classification of the contents of an encrypted message is also encrypted. The GCHQ version adds an extension which contains the label in clear (we will refer to this as the 'cleartext' security label, while the actual classification is the 'plaintext' security label).
There is a problem with doing this. An attacker can often derive valuable information from the cleartext label, taken together with the identity of the sender and recipient and the message volume. Indeed, with some labels, the attacker learns all she wants to know from the label itself, and cryptanalysis of the message body is unnecessary. This is why the US does not use cleartext security labels.
The GCHQ response to this criticism is [15]:
CESG's modifications have been made after careful consideration of government requirements and in consultation with departments; they are sensible responses to these requirements.
We understand that these 'requirements' concern the national rules concerning the forms of protection which are deemed appropriate for various types of information.
Under the UK rules, it is possible for a combination of physical and cryptographic mechanisms taken together to be deemed adequate, whereas either mechanism on its own is deemed inadequate. For example, a message classified SECRET can be enciphered with RAMBUTAN and then transmitted over a link which lies entirely within the UK. The protection provided by RAMBUTAN is, however, deemed insufficient if the same message is transmitted across the Atlantic.
So British enciphered messages need to be divided into two or more types: those that require various forms of additional physical protection, and those that don't. The message transfer system needs to be able to tell which messages are which, so that it can use physically protected communications lines for some messages but not for others. The easiest way to achieve this is to mark the ciphertext with the classification of the plaintext.

However, if an opponent can get past the physical protection (which is often quite easy), then she can carry out the attacks described above. It would clearly be desirable for the UK to follow the American lead and encrypt all security labels.
It may be argued that the rules are so entrenched that this is infeasible. A technical alternative is to reduce the cleartext security label to a single bit indicating only the handling requirements. In this way, routers have the information they need, and attackers get no more information than this (which they could arguably derive in any case by observing the route that the message takes). Using a completely incompatible (and information-losing) syntax for cleartext labels would also prevent lazy or careless implementers using them as plaintext labels. Such robustness would have been prudent design practice.
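The single-bit alternative can be made concrete with a small sketch. The classification names and the mapping below are purely illustrative assumptions, not CESG's actual rules:

```python
# Hypothetical mapping from plaintext classification to the single
# handling bit proposed above. Which classifications need a physically
# protected route is an assumption made up for this example.
NEEDS_PROTECTED_ROUTE = {
    "RESTRICTED": 0,
    "CONFIDENTIAL": 0,
    "SECRET": 1,
    "TOP SECRET": 1,
}

def cleartext_label(classification: str) -> int:
    """All a router learns is whether extra physical protection is
    required; the plaintext classification itself stays encrypted."""
    return NEEDS_PROTECTED_ROUTE[classification]

assert cleartext_label("SECRET") == 1
assert cleartext_label("RESTRICTED") == 0
```

Because the mapping is many-to-one, an eavesdropper who reads the bit learns no more than she could infer from watching which route the message takes.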
In fact, the GCHQ protocol does not protect the integrity of the cleartext security label either, and so the attacker can manipulate it. If it is ever used to determine the sensitivity of the decrypted plaintext, then the recipient could be tricked into believing that the message had a different classification, which might lead to its compromise.
Problem 5 - identity based keys
The GCHQ protocol gives users seed keys, from which keys of the day are derived by encrypting a datestamp. But it is quite likely that some seed keys will be compromised (e.g. by Trojan horses previously inserted by attackers, via theft of computers, or if smart cards holding them are lost). In that case, the user's certificates can be revoked, but the user cannot be issued with a new seed key, as it is a deterministic function of her name. All the CA can do is reissue the same (compromised) key.
To recover from this situation, either the user has to change her name, or the CA has to change the interoperability key and reissue new seed keys for every user in the domain. Both of these alternatives are unacceptable, and this is a serious flaw in the GCHQ protocol. It might be remedied by making the seed key also depend on an initial timestamp (which would also have to be added at several other places in the protocol).
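The flaw and the suggested remedy can be shown schematically. The hash-based derivation below is an illustrative stand-in for the actual CESG construction; the field names are assumptions.

```python
# Illustrative sketch only: a hash-based key derivation standing in
# for the actual CESG construction.
import hashlib

def derive(*parts: bytes) -> bytes:
    """Length-prefixed hash of the concatenated parts."""
    h = hashlib.sha256()
    for p in parts:
        h.update(len(p).to_bytes(2, "big") + p)
    return h.digest()

interop_key = b"domain-interoperability-key"  # assumed placeholder

# As fielded (schematically): the seed key depends only on the name,
# so re-issuing a key after compromise yields the identical key.
def seed_key_v1(name: bytes) -> bytes:
    return derive(interop_key, name)

# Suggested remedy: mix in an issue timestamp, so a fresh seed key
# can be issued without renaming the user or rekeying the domain.
def seed_key_v2(name: bytes, issued: bytes) -> bytes:
    return derive(interop_key, name, issued)
```

With the timestamp included, revocation plus reissue becomes possible at the cost of carrying the issue date in certificates and at the other points in the protocol where the seed key is reconstructed.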
Problem 6 - scope of master key compromise
The compromise of the interoperability key between two domains would be catastrophic, as all traffic between users in those domains could now be read. In our experience, the likelihood of master key compromise is persistently underestimated. We know of cases in both the banking and satellite TV industries where organisations have had to reissue millions of customer cards as a result of a key compromise that they had considered impossible and for which they therefore had no disaster recovery plan. Introducing such a vulnerability on purpose is imprudent.
The GCHQ response to this criticism is [15]:
CESG is fully aware of the need adequately to secure such high level exchanges and there are a number of ways this could be done.
Indeed, and comparison with other escrow systems such as Clipper shows that it is possible to provide some degree of protection against accidental disclosure and rogue insiders, by using two escrow agents in different departments rather than the single crypto custodians proposed by GCHQ. Clipper is not perfect in this regard, but it at least shows that it is possible to do better. At the very least, it would be prudent to change the interoperability keys frequently; this would remove the need for seed keys (and thus strengthen the argument for using Kerberos instead).
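The dual-control arrangement used by Clipper-style schemes can be sketched in a few lines: split the interoperability key into two XOR shares held by separate departments, so that neither escrow agent alone learns anything about the key. This is a generic secret-splitting illustration, not a description of any fielded escrow system.

```python
# Sketch of dual-control escrow via XOR secret splitting.
# Neither share alone reveals any information about the key.
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares for separate escrow agents."""
    share1 = secrets.token_bytes(len(key))            # uniformly random
    share2 = bytes(a ^ b for a, b in zip(key, share1))
    return share1, share2

def recombine(share1: bytes, share2: bytes) -> bytes:
    """Both agents must cooperate to recover the key."""
    return bytes(a ^ b for a, b in zip(share1, share2))

key = secrets.token_bytes(16)
s1, s2 = split_key(key)
```

Because each share is uniformly random on its own, a single rogue custodian, or the compromise of a single department, discloses nothing.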
Of course, if corrupt law enforcement officers are allowed to abuse the system indefinitely, then no cryptographic or dual control protocol can put things right. Any discussion of insider attacks must assume that there exist procedures for dealing with misbehaving insiders, and indeed for detecting misbehaviour in the first place. This can be done with non-escrowed key management protocols (see for example [16]) but appears more difficult when escrow is a requirement.
Problem 7 - MOAC
The GCHQ protocol defines an extension which provides a "simple message origin authentication check". This is a digital signature computed on the contents of the message, and nothing else. By way of contrast, the original US MSP provided message origin authentication by computing a digital signature on a hash of the message and some additional control information. This additional control information can contain the data type of the message (e.g. whether it is an interpersonal text message or an EDI transaction).
The GCHQ proposal is an extension, rather than a replacement. That is, messages will contain two forms of digital signature: the old US form and the new UK form. As a result, this extension has not made the protocol simpler; it has made it more complex.
In nearly all circumstances, it would be best to use the original US form of signature rather than the new UK one. The US form is very nearly as quick to compute, and it protects against some attacks to which the UK version is vulnerable. (It is possible for a bit string to have two different interpretations, depending on which data type the receiver believes it to be. The UK signature does not protect the content type, so an attacker could change this field and trick the receiver into misinterpreting the message.)
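The type-confusion weakness can be demonstrated schematically. In the sketch below an HMAC stands in for a real digital signature, and the type-tag encoding is an assumption; the point is that a signature over the bare content still verifies after the attacker relabels the message, whereas binding the data type into the signed material makes the swap detectable.

```python
# Sketch: signing content alone (UK-style MOAC, schematically) versus
# binding the data type into the signed material (US-style).
# An HMAC stands in for a real signature scheme.
import hmac
import hashlib

signing_key = b"sender-signature-key"  # placeholder

def sign(data: bytes) -> bytes:
    return hmac.new(signing_key, data, hashlib.sha256).digest()

content = b"PAY 100"
moac_sig = sign(content)                          # content only
us_sig = sign(b"EDI-transaction|" + content)      # type bound in
```

If the attacker relabels the message from an EDI transaction to interpersonal text, the content-only signature verifies unchanged, while the US-style signature fails against the new type.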
One situation in which there might be a use for this UK extension is in implementing gateways between the GCHQ protocol and other security protocols. For example, it would be possible for a gateway to convert between Internet Privacy Enhanced Mail and UK MSP by replacing the PEM header with an MSP header and copying the PEM signature into the UK signature extension field. The important point to note about this is that such a gateway does not need access to any cryptographic keys, as it does not need to re-compute the signature. By way of contrast, a gateway between US MSP and PEM would need access to the sender's signature key: this is a very bad idea for obvious reasons.
Problem 8 - choice of encryption algorithm
GCHQ wants people to use an unpublished block cipher with 64 bit block and key size called Red Pike. According to a report on the algorithm prepared in an attempt to sell it to the Health Service, it is similar to RC5 [22] but with a different key schedule. It will apparently be the standard for government traffic marked up to 'Restricted', and it is claimed that systems containing it may be less subject to export controls: health service officials have claimed that US companies operating in the UK may be allowed by the US government to use Red Pike in products in which the use of DES would be discountenanced by the US State Department.
More significantly, Red Pike will shortly be fielded in mass market software, and will thus inevitably be reverse engineered and published, as RC2 and RC4 were. So it is hard to understand why the UK government refuses to publish it, or why anyone should trust it, at least until it has been exposed to the attention of the cryptanalytic community for a number of years. If GCHQ scientists have found a weakness in RC5 and a fix for it -or even a change that speeds it up without weakening it -then surely the best way to gain acceptance for such an innovation would be to publish it.
The GCHQ response to this criticism is [15]:
Another common misconception is that the CESG Red Pike algorithm is being recommended for use in the public arena. No confidentiality algorithm is mandated in the recommendations; for HMG use, however, approved algorithms will be required; Red Pike was designed for a broad range of HMG applications.
Vigorous efforts are still being made to promote the use of Red Pike in the health service, and as noted above, it is supposed to be used in a wide range of citizens' interactions with government such as filing tax returns and grant applications. Thus the accuracy of the above response is a matter of how one interprets the phrase 'public arena'.
Conclusion
The GCHQ protocol is very poorly engineered.
1. The key management scheme gives us all the disadvantages of public key crypto (high computational complexity, long key management messages, difficulty of implementation on cheap devices such as smartcards), and all the disadvantages of secret key crypto (single point of failure, little forward security, little evidential force, difficulty of 'plug and play' with shrink-wrapped software). It does not provide any of the advantages that one could get from either of these technologies; and its complexity is likely to lead to the subtle and unexpected implementation bugs which are the cause of most real world security failures.
2. It is designed for tightly hierarchical organisations, and cannot economically cope with the more complex trust structures in modern commerce, industry and professional practice. Its main effect in government may be to perpetuate rigid hierarchies and frustrate the efficiency improvements that modern management techniques might make possible.
3. It goes about establishing trust in the wrong way. To plan to bootstrap signature keys from a 'national public key infrastructure' of escrowed confidentiality keys shows a cavalier disregard of the realities of evidence and of safety-critical systems.
4. There are a number of serious technical problems with the modifications that have been made to the US Message Security Protocol, which underlies the UK government's offering. Quite independently of the key management scheme and trust hierarchy that are eventually adopted, these modifications are unsound and should not be used.
The above four conclusions appeared in an earlier draft of this paper. The GCHQ response to that draft, which we have cited here, has not persuaded us to change a single word of the text.
We call on the cryptologic and computer security communities to subject this protocol to further study. If adopted as widely as the British government clearly hopes it to be, it would be a single point of failure for a large number of applications on which the security, health, privacy and economic wellbeing of Europe's citizens would come to depend.