
Comparison Of Secure Email Technologies
X.509 / PKI, PGP, and IBE

This work is now expanded in:
Gerck, E., "Secure Email Technologies X.509 / PKI, PGP, IBE and Zmail", in Corporate Email Management, Chapter 12, S. J. Krishna and E. Raju (eds.), Hyderabad, India: ICFAI University Press, 2007, pp. 171-196.
Read the reprint (PDF)


Last major revision: December 22, 2005, 13:28 PST.
Copy at http://email-security.net/papers/pki-pgp-ibe.htm
Please see the Blog for discussion.

E. Gerck, Ph.D.
Copyright © 2005 by E. Gerck, first published online on December 6, 2005.
All rights reserved, free copying and citation allowed with source and author reference.

Abstract

    This work presents a list of desirable secure email features and a list of attacks or problems, together with a corresponding score card for the secure email technologies X.509 / PKI, PGP, and IBE. Concise definitions of each feature, problem, and attack regarding the security and privacy of email communication are also included, with a view both to improving these technologies and to developing specifications for new technology beyond current limitations. Usability, as an aggregation of properties, is considered the Most Important Feature of a secure email system. The IBE technology receives the highest Usability Score but the lowest Security Score. Both the Usability and the Security Scores may change according to the particular use environment.
DISCLAIMER
REFERENCES

Introduction
Usability
Security
Auditing
1. DESIRABLE FEATURES REFERENCE SHEET
2. PROBLEMS / ATTACKS REFERENCE SHEET
3. COMPARISON CHART OF SECURE EMAIL TECHNOLOGIES
Conclusions

Introduction

Email is just NOT a secure method of communication. Email has a number of longstanding security shortcomings that are increasingly being exploited on a mass scale. There are several reasons for this state of affairs [1].

Email is used between individuals and also, in even larger volume, for organizational communication. The latter includes additional needs, such as sharing responsibility between the organization and the person, formal work flow with structured documents, document release and retention policies, cross-organization end-points, and third-party verification services. In the 21st century, email can no longer be the simple text message it was originally designed to be.

Notwithstanding its limitations, email communication is still very attractive because of its low cost and very large user base worldwide, with more than 500 million mailboxes:

There are 100 million personal mailboxes worldwide.
(Radicati Group, November 2003)

There are 412 million corporate mailboxes worldwide; each one processes about 110 messages daily.

(Radicati Group, July 2003)

Corporations, the largest market sector, have a clear need for email security. For very compelling business reasons including legal requirements, corporations need to send and receive private and secure messages between specific endpoints [2] -- and even be able to control them at the end-points.

There are many possible requirements for privacy and security for email, with varying degrees of assurances, cost, and benefits. For example, what works best within a large business enterprise may not be best suited for use with consumers or members of the public. Nonetheless, it is usually recognized that a secure email system should at least provide the Basic Features given below.

       Basic Features of Secure Email:

  • message confidentiality (only the dialogue parties are privy to the message),
  • message integrity (the message was not tampered with), and
  • authentication (the dialogue parties have verified identities and / or credentials).

According to this classification, digitally signed email, for example, is not secure email. Even though it authenticates the sender and provides message integrity, it does not provide message confidentiality and does not authenticate the recipient. Encrypted email, when just message confidentiality is provided, is also not secure.
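The three Basic Features can be made concrete with a minimal sketch. The toy example below uses only symmetric primitives from the Python standard library: a shared secret stands in for the verified identities of the dialogue parties, an HMAC provides message integrity and (shared-key) authentication, and a hash-derived keystream stands in for a real cipher. This is an illustration of the three properties, not a production design; real secure email uses public-key cryptography and a vetted cipher such as AES.

```python
import hashlib
import hmac

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy confidentiality only: XOR with a keystream derived from the key.
    # A real system would use a standard cipher (e.g., AES) instead.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def protect(shared_secret: bytes, message: bytes):
    # Confidentiality: encrypt; integrity + authentication: MAC the ciphertext.
    ciphertext = xor_stream(shared_secret, message)
    tag = hmac.new(shared_secret, ciphertext, hashlib.sha256).digest()
    return ciphertext, tag

def unprotect(shared_secret: bytes, ciphertext: bytes, tag: bytes) -> bytes:
    # Reject tampered or forged messages before decrypting anything.
    expected = hmac.new(shared_secret, ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity / authentication check failed")
    return xor_stream(shared_secret, ciphertext)

secret = b"shared-between-the-two-parties"
ct, tag = protect(secret, b"meet at noon")
assert unprotect(secret, ct, tag) == b"meet at noon"
```

Note how digitally signed email maps onto only the MAC half of this sketch, and plain encrypted email onto only the keystream half; a secure email system needs all three checks to pass.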

X.509 / PKI (Public-Key Infrastructure), PGP (Pretty Good Privacy) and IBE (Identity-Based Encryption) are three technologies that promise privacy and security for email. There are several secure email products in the market using these technologies. [3]

However, email encryption is not a mainstream application today -- not even for corporations. What's missing is the Most Important Feature of all: Usability -- including ease of use and ease of deployment.

Providing ease of use in email security is such a difficult task that it has not been accomplished yet, even after some 15 years of development (X.509 was released in 1988 and PGP in 1991). In a usability test of PGP 5.0, even when the participants were given 90 minutes to sign and encrypt a message, the majority were unable to do so successfully [WhTg1].

Still, even without usability considerations, providing email security is a difficult task. Secure email solutions are limited by the security technologies they use, as reviewed by the author in [Ge1] for X.509 / PKI and PGP. Email uses a store-and-forward messaging system from sender to recipient that is hard to control or even trace [4]. Secure Sockets Layer (SSL), an Internet security technology that works well for securing credit card transactions, cannot secure email communications [5].

This paper compares the technologies X.509 / PKI, PGP, and IBE for secure email. This paper also provides a view to both improve these email security technologies and develop the specifications for new technology beyond current limitations.

This work begins with the specifications of Usability and Security. To create a metric for evaluating secure email systems that use different technologies, desirable features are listed in Section 1 while shortcomings (problems or attacks) are listed in Section 2. For clarity, these Sections also present concise definitions for each feature and problem or attack considered, in the context of secure email.

Based on these considerations, developed to be technologically neutral, Section 3 of this work uses the metric to present a score card for the secure email technologies X.509 / PKI, PGP, and IBE. In Conclusions, the score card is applied to rate the usability and security offered by the secure email technologies.


Usability

The Most Important Feature of a secure email system is Usability.

In practice, users would rather use an insecure email system that is easy to use than a secure email system where even the help text seems intimidating. The secure email system has to be easy enough to use when compared with simple, familiar, regular email systems -- not when compared with other secure email systems. If security is too difficult or annoying, users may give up on it altogether.

Ease of use is considered here to be a self-evident need in all email security systems. See for example, the paper by Alma Whitten and J. D. Tygar, "Why Johnny Can't Encrypt: A Usability Evaluation of PGP 5.0" [WhTg1].

In "Why Johnny Can't Encrypt" (op. cit.), the authors report a number of user interface design flaws that may contribute to security failures, together with a user test demonstrating that, even when the participants were given 90 minutes to sign and encrypt a message, the majority were unable to do so successfully.

While Whitten and Tygar make some good points, this paper takes the stance that what's needed for improved usability in secure email is, first of all, technology improvement.

Technology defines an upper bound for usability.

Usability may seem to be, inescapably, a user interface design problem (see, for example, [WhTg1]). However, when the role of user interface design is seen as that of providing a language for communication between the user and the system, it becomes clear that the language cannot be more expressive than the system's technology allows it to be -- even if the user has unlimited ingenuity and learning capacity. In short, if the technology does not allow something to exist, it cannot be expressed.

Moreover, because communication cannot be 100% efficient, the system's technology provides an upper bound on usability: user interface design, no matter how clever and sophisticated, can at best only reduce usability when compared with what the technology is potentially able to offer. Additionally, improving the graphical user interface and the help dialogue in email security products seems to have reached a point of diminishing returns after almost 15 years of development.

Therefore, rather than calling for yet another improved graphical user interface and more help text to guide the user through all the steps required to send and receive secure email, with expected meager if not negative returns, what's needed is a real Reduction and Simplification of those steps at the technology level. What's needed is technology with less complexity, fewer steps, less need for help text, and less need for the user to learn anything.

Usability is an aggregation of properties.

Usability (including ease of use and deployment) is generally unpredictable when looking into specific, isolated features. But usability is not entirely subjective, either. In practice, usability emerges from simple, effective rules that allow complex patterns to be expressed.

In summary: Usability is technologically supported and user interface design will, at best, reduce the usability inherent in the technology. To support usability, the technology should have simple, effective rules allowing complex patterns to be expressed as desired, rather than rules that require complexity from the start.

We note that because simplicity is also a basic principle for increased security, usability and security are not in conflict with each other -- contrary to common opinion, there should be no fundamental need to balance security with usability.

Usability is, thus, viewed here not as the result of any isolated property, or as a purely subjective evaluation, but as the result of an aggregation of properties. Usability can be technologically provided by inclusion of usability features as well as by exclusion of usability problems or attacks, with Reduction and Simplification of all rules.

Most of the entries in Sections 1 and 2 are usability related; the most usability-relevant entries are marked in bold rows. Usability is provided as an aggregation of supported features and excluded problems or attacks, for each technology. The definition and evaluation of the Usability Score are presented in Section 3.


Security

An email message needs to be protected end-to-end, so that no one can eavesdrop, tamper, fake, spoof, or even automatically scan and index information from it while it moves from sender to recipient. In addition, an email may also need to be protected at the end-point -- with control features such as expiration ("self-destruct") and usage rights management.

The objective of a secure email system is, thus, more than just secure transport, as done with an SSL secure web page. It is about control at the end-points, too. If no one has to "do something" to sign or decrypt a message (i.e., perform operations that use private keys at each end-point), then no one is in control. One can have server control, client control, and human control -- all are useful for secure email.

For example, for machine-to-machine secure communication one can have fully automatic email messages between two servers, signed and decrypted without anyone having to do anything -- and this is secure because the servers are in control.

Could we then have fully automatic, machine-to-machine secure email for human communication? No, because this is not a secure scenario for human correspondence. The servers / clients could fake a digital signature or decrypt a message without knowledge of the author or recipient. Humans need to be in sole control of their private-keys for secure human correspondence, even if the computational tasks are executed (as they must be) by a machine.

Control of private keys includes control of digital signature and decryption keys. The need for signing keys to be in the sole control of the signer is well known in terms of meeting legal evidence requirements, including use and revocation of the signing key (1.2). The same principle applies to decryption keys.

Decryption control is the dual of signature control: the latter provides assurance to the recipient that the message was sent only by the specified sender, while the former provides assurance to the sender that the message can be read only by the specified recipient.

A secure email system needs, thus, to provide human control of signing and decryption. Human control of encryption and signature verification (i.e., operations that use public keys at each end-point) is usually not relevant for security and can be provided automatically to improve Usability.

Sections 1 and 2 present the desired security features and the undesired problems or attacks, for each technology. The definition and evaluation of the Security Score are presented in Section 3.


Auditing

Auditing is usually understood as an independent review and examination of records and activities to assess the adequacy of system controls, to ensure compliance with established policies and operational procedures, and to recommend necessary changes in controls, policies, or procedures.

Auditing can be used to establish and verify trust [Ge2]. The decision to trust a record (e.g., the source of a communication, or the name on a certificate) must be based on factors outside the assertion of trustworthiness that the record's system makes for itself.

Accordingly, an auditing system should be as independent as possible from what is audited and should be provided together with a secure email system (for a secure system without auditing is not secure).

The requirements of the auditing system itself, standard in the art, are not discussed in this paper. However, a secure email system should provide mechanisms for auditing. For example, features F17 ("Verified Timestamp") and F20 ("Visible Message Fingerprint") in Section 1 can be useful as auditing inputs that can be verified independently of the email system.


1. DESIRABLE FEATURES REFERENCE SHEET

The table below focuses on desirable features, where the entries in bold correspond to features that can considerably improve Usability by Reduction and Simplification -- see also the bold entries in Section 2. The context of usage is likely to affect which features are most suitable. Desirable features are considered positive points in the metric.

F1 -- Encryption (message confidentiality): Scrambles data using an algorithm and a key. Should use well-known algorithms, with verifiable standards-based implementation, and keys with adequate length (e.g., 128-bit).

F2 -- Decryption: Uses an algorithm and a key to de-scramble data.

F3 -- Message Integrity: Verifies that the message was not tampered with.

F4 -- Key Expiration: Prevents a key from being used after its lifetime expires.

F5 -- Key Revocation: Prevents a key from being used after notification.

F6 -- Base 64 Encoding: Encodes signed and encrypted data (8-bit) into gibberish Base 64 text (6-bit) that is standards-compliant and suitable for email transport.

F7 -- Compact Encoding: Encoded signed or encrypted data and headers are compact.

F8 -- Identity Certificate: A data file that strongly (i.e., in a way that cannot be forged) binds a key and an identity, usually including how and when the identity was verified, the certificate lifetime, revocation information, key usage, and issuer information.

F9 -- Private Key Not At Server: Server does not have the private key. (1.1)

F10 -- Meets Digital Signature Requirement: Legal requirement for the private key to be in the sole control of the signer. (1.2)

F11 -- Authenticates Sender and Recipient: Verifies that sender and recipient use valid credentials. (1.3)

F12 -- Message Expiration: Prevents email from being read after its lifetime expires.

F13 -- Message Release: Prevents email from being read before its release time.

F14 -- Message Recall: Prevents email from being read after recall notification.

F15 -- Secure Web Form Processing: Encrypts, decrypts, and processes email data as a web form. (1.4)

F16 -- Return Receipt: Informs the sender by whom, where, how, and when an email was decrypted. (1.5)

F17 -- Verified Timestamp: Email date and time are defined by a verified time source. (1.6)

F18 -- Attachment Encoding (easy decryption): Decryption from attachment without copy-and-paste. E.g., S/MIME (X.509 or IBE), PGP/MIME.

F19 -- Direct Encoding (easy & simple decryption): Decryption without attachment or copy-and-paste. E.g., PGP inlined.

F20 -- Visible Message Fingerprint: Provides a short, human-usable message fingerprint for third-party and audit message authentication.

F21 -- Key Self-Revocation & Reset: User can securely self-revoke a private key and self-reset to a new key, without other human intervention, at any time. (1.7)

F22 -- Key Self-Recovery: User can securely self-recover a private key, without other human intervention, at any time, based on a master key pre-defined by the user. The master key must be private, in the sole control of the user and, preferably, memorable. (1.7)


(1.1) Applies to both signature and decryption private keys. If the server has a user's private keys, the server could create, change, sign, or read a message without the user's cooperation, knowledge or consent. With Feature F9, the recipient is assured that the sender must have "done something" in order to sign it (signature control), while the sender is assured that the recipient actually received the message because the recipient must have "done something" in order to decrypt it (decryption control). Signing or decryption may be done by the server but if and only if the user provides the private key (has "done something"). Decryption control is the dual of signature control, the latter provides assurance to the recipient that the message was sent only by the specified sender while the former provides assurance to the sender that the message can be read only by the specified recipient. NOTE: Absence of F9 does not imply key escrow (P1) because the private key may exist at the server and yet other parties cannot access it.

(1.2) Beyond Feature F9, digital signatures are considered also in terms of meeting legal evidence requirements, including use and revocation of signing key. E.g., "...the usual legal definition of an electronic signature, which imposes a requirement that the signer maintain the means of signature creation under her sole control." [6].

(1.3) Authentication of both sender and recipient, which is a Basic Feature of secure email (see Introduction), is necessary for end-to-end secure email communication. Without sender authentication, email messages can be easily spoofed. Without recipient authentication, email messages can be received by anyone.

(1.4) Provides for secure structured messaging and processing, e.g., for document workflow automation, without using an SSL web site. For example, data can be collected securely using a web form (XML or HTML) that is sent to the recipient and back by secure email; the recipient is not able to edit the web form, just use it to input the requested data in the required format. The secure email client or server can include a plug-in for the XML or HTML specification (which tends to be standardized); alternatively, if the XML or HTML is not standardized yet (e.g., a new database format or style sheet), the secure email message itself can include it.

(1.5) To allay privacy concerns, the recipient should be informed beforehand that the Return Receipt will be sent back to the sender if the recipient decrypts the message. If the recipient wishes to decline to provide the receipt, the recipient should not attempt to decrypt the message. This is the same rule that postal mail follows. The receipt is useful for both sender and recipient, in addition to serving as evidence for the sender; for example, if the sender knows that the recipient read (decrypted) the email, the sender does not have to send another email or make a call.

(1.6) Today, the sender can easily fake the date fields in an email. This not only provides scam opportunities but also prevents email from being used in applications that require a verified release time (email cannot be read before) or expiration time (email cannot be read after). Even though not all email requires such control, this is a valuable feature for 21st-century email needs -- such as directly supporting the sender's document release and retention policies (which may be different from the recipient's).

(1.7) Applies to any private key, including decryption and signature private keys. There should be no other human intervention necessary, besides the user herself, and no one else should be able to do it, except the user.
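Two of the features above are easy to make concrete with a short sketch: F6 (Base 64 Encoding, which turns 8-bit signed or encrypted data into 6-bit printable text safe for email transport, at a cost of about 33% size overhead -- the motivation for F7, Compact Encoding) and F20 (a visible message fingerprint). The group size and digest truncation below are illustrative choices, not values specified in this paper.

```python
import base64
import hashlib

# F6: Base 64 encoding -- 8-bit binary becomes 6-bit printable text that
# survives email transport. Overhead: 4 output bytes per 3 input bytes.
payload = bytes(range(256))            # arbitrary 8-bit "encrypted" data
encoded = base64.b64encode(payload)
assert base64.b64decode(encoded) == payload
assert len(encoded) == 4 * ((len(payload) + 2) // 3)   # ~33% larger

# F20: a visible message fingerprint -- a short, grouped digest that a
# human can read over the phone or check against an independent audit log.
def visible_fingerprint(message: bytes, groups: int = 4) -> str:
    digest = hashlib.sha256(message).hexdigest().upper()
    # Keep only the first few hex groups so the result stays human-usable.
    return "-".join(digest[4 * i: 4 * i + 4] for i in range(groups))

print(visible_fingerprint(b"meet at noon"))   # four 4-hex-digit groups
```

A truncated fingerprint trades collision resistance for readability; the full digest would still be available to the auditing system when stronger verification is needed.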


2. PROBLEMS / ATTACKS REFERENCE SHEET

Problems and attacks are considered equally, in first-order approximation, as negative points in the metric. A justification for considering problems and attacks equally is that both reduce security. A secure email system that has problems will likely be used badly (insecurely) or not at all -- which opens room for attacks.

The entries in bold correspond to Usability problems. Absence of these problems can considerably improve Usability, by Reduction and Simplification. See also the bold entries in Section 1, for features that improve Usability. The context of usage is likely to affect what problems or attacks are most important.

P1 -- Private Key Escrowed: Private key can be provided without cooperation or knowledge of the key owner. (2.1)

P2 -- Break Private Key Protection: Data protection of the private key (at client or server) can be broken with much higher probability than a brute-force attack (e.g., dictionary attack).

P3 -- Break Policy Protection: Data protection of the policy and privileges of users (at client or server) can be broken with much higher probability than a brute-force attack (e.g., dictionary attack).

P4 -- Weak Authentication Accepted: Accepts username / password authentication, which has a vulnerable password file (at client or server) and can be broken remotely or locally with much higher probability than a brute-force attack (e.g., dictionary attack).

P5 -- Server Spoofing: A server that mimics a legitimate site to lure users into disclosing confidential information. (2.2)

P6 -- Unverified Sender's Email Address: The "From:" header of the email (possibly other headers) is set to a reputable email address, to lure the recipient to read and act on the email. E.g., using the email address of a friend, the user's employer, a bank, or a government agency.

P7 -- Phishing (Email Fraud): The email appears to be from a well-known entity but is not. Simply clicking a link may subject the user to background installation of key-logging software or viruses.

P8 -- "Lunchtime" Attack: The attacker (e.g., secretary, technician, or customer) can sneak into the person's office for a few minutes, while the person is away for lunch, and use the person's computer. (2.3)

P9 -- Key Management: Key issuance, key certification, key revocation, and key distribution.

P10 -- Key Revocation Delay: Revocation is not immediate. There may be a significant delay between sending the notification to revoke and the actual posting of the revocation. [7]

P11 -- Lack of Centralized Key Revocation: Lack of a single location to send revocation notices, resulting in different revocation status potentially being posted for the same key at the same time.

P12 -- Open Message Headers: Email message headers contain cleartext information (a list of names, email addresses, and the Subject) that cannot be protected. (2.4)

P13 -- Must Pre-Enroll Recipients: Recipient must register before the message can be sent.

P14 -- Must Register To Read: Recipient must register before the message can be read.

P15 -- Must Send Own Certificate: Sender must make her own key certificate available to the recipient before the recipient can send a message.

P16 -- Requires Common Root Of Trust: Sender and recipient must a priori trust a common root for security services. (2.5)

(2.1) Applies to either signature or decryption private keys. The server has the user's private key (i.e., F9 does not apply) and can provide it to other parties. Key escrow is not inherent to X.509 / PKI or PGP technology. Key escrow is inherent to IBE technology, with the PKG (Private Key Generator). An IBE server is able, therefore, to decrypt or sign any message for any user of the IBE system. Requiring the user to check with a PKG before reading a message makes the use of multiple PKGs much more difficult, unless they can be convinced to work together, a hard problem for competing businesses. Constant checking with a single PKG also makes traffic analysis much easier. Even if the attacker cannot decrypt the message which was sent, if the attacker can monitor the central PKG (with a single administrative order, a rootkit, or a man-in-the-middle attack), everyone's private keys can be obtained. [8]

(2.2) Can be done even with SSL (Secure Sockets Layer) using 128-bit encryption and two-factor authentication (e.g., SecurID does not authenticate the server).

(2.3) The user is not trusted to assiduously perform any security action that could prevent this attack (such as engaging a screen lock, removing a hardware token, or locking the door). Additional security systems (such as a screen lock that is engaged automatically when the user leaves the room), which are not part of the secure email technology in evaluation, are also not considered.

(2.4) If email message headers are recorded and stored, everyone's communication patterns can be easily seen and sensitive subjects sorted out.


(2.5) Trust is understood as reliance; more precisely, trust is qualified reliance on information, based on factors independent of that information [Ge2]. A common root of trust is required if at least one of those factors (upon which trust is based) is common to both parties (sender and recipient). A common root of trust is required for X.509 / PKI and IBE but not necessarily in PGP:

- In X.509 / PKI, the sender and recipient must a priori trust each other's CA for issuing and revocation information of their respective subscribers, even when cross-certificates and bridge CAs are used (i.e., in addition to the requirement of a trusted path between the certificates). For example, the sender and the recipient must trust the recipient's CA NOT to have a large, unannounced delay between receiving a certificate revocation request and posting a certificate revocation notice that the sender can verify before sending a message using the certificate.

- In IBE, the sender must a priori know the system parameters of and trust the key server (the PKG, Private Key Generator) used by the recipient. If the trust is broken in any of these cases, system security breaks as well. For example, in IBE, if the PKG is a rogue key server, the private key may be provided to other parties in addition to the recipient (breaking the sender's message confidentiality), without the recipient's cooperation or knowledge, and in spite of best efforts by the recipient to safeguard the private key.

- In PGP, however, even though the sender and recipient must a priori trust each other's key signers (the web of trust), in practice, PGP users verify keys out of band (e.g., by phone call) with each other, not through the web of trust -- eliminating the need for a Common Root Of Trust because each key can be self-signed.
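The dictionary attacks behind P2-P4 can be sketched in a few lines. Assuming the private key (or password file) is protected only by an unsalted, single-round hash of a user-chosen password -- a deliberately weak scheme used here for illustration, with a made-up wordlist and password -- the attacker hashes candidate words instead of searching the full key space:

```python
import hashlib

def protect_with_password(password: str) -> str:
    # Weak protection: unsalted, single-round hash of the passphrase.
    # Real systems should use salted, slow KDFs (e.g., PBKDF2) -- and even
    # those only raise the cost, they do not save a guessable password.
    return hashlib.sha256(password.encode()).hexdigest()

def dictionary_attack(stored_hash: str, wordlist):
    # Far cheaper than brute force: only len(wordlist) hashes are computed,
    # versus 2**128 trials against an adequate key (see F1).
    for candidate in wordlist:
        if protect_with_password(candidate) == stored_hash:
            return candidate
    return None

stored = protect_with_password("letmein")      # user picked a common word
wordlist = ["password", "123456", "qwerty", "letmein", "dragon"]
assert dictionary_attack(stored, wordlist) == "letmein"
```

This is why P4 (Weak Authentication Accepted) is counted as a shortcoming even when strong authentication is also available: the weakest accepted method sets the attack cost.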


3. COMPARISON CHART OF SECURE EMAIL TECHNOLOGIES (PRODUCTS)

This Section compares the technologies X.509 / PKI, PGP, and IBE, currently used with secure email products.

The methodology used, rather than comparing one secure email product with another, first establishes a technologically-neutral metric and then applies it to create a score card. The scores are based on market products that use each corresponding technology, measuring that technology's capability to support secure email.  Desirable features are taken as positive points (noted as +) while shortcomings (problems and attacks considered equally, in first-order approximation) are taken as negative points (noted as x) in the metric.

Thus, for each main market product using the technologies X.509 / PKI, PGP, and IBE, one asks two questions:

(1) what are the product's capabilities in terms of a list of desired features (the positive points), and

(2) what are the product's shortcomings in terms of a list of attacks and problems (the negative points),

and assigns each score (positive or negative) to the technology used by that product. The lists presented in Sections 1 and 2 are used as the metric. At the end, scanning enough products, we have a list of all capabilities and all shortcomings that affect each technology, as a score card.

Usability: The entries in bold correspond to potential improvements in Usability by Reduction and Simplification, with inclusion for features and exclusion for problems / attacks. Usability is an aggregation of those properties.

Usability and Security Scores: Using the score card below, they are defined as follows:
- the Usability Score is the total number of bold rows with (+), minus the total number of bold rows with (x);
- the Security Score is the total number of rows with (+), minus the total number of rows with (x).

Both the Usability and the Security Scores depend on which features and which problems or attacks are considered in a particular use environment. For example, users in an enterprise environment are usually subject to fewer problems and attacks than users in a wide-open consumer environment. On the other hand, users in an enterprise environment might require more features than users in a consumer environment. The same technology (or product) may receive different Usability and Security Scores for each environment.
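The score definitions can be written out directly. Assuming the score card is represented as a list of (bold, mark) pairs -- where bold flags a usability-relevant row, mark is "+" or "x", and blank cells are simply omitted -- the two scores are plain sums. The miniature card below is hypothetical, not this paper's actual score card.

```python
def scores(card):
    # card: list of (is_bold_row, mark) pairs, mark "+" (feature supported)
    # or "x" (problem / attack applies); blank cells are omitted entirely.
    usability = (sum(1 for bold, m in card if bold and m == "+")
                 - sum(1 for bold, m in card if bold and m == "x"))
    security = (sum(1 for _, m in card if m == "+")
                - sum(1 for _, m in card if m == "x"))
    return usability, security

# Hypothetical miniature score card: three features, two problems.
card = [(True, "+"), (False, "+"), (True, "x"), (False, "x"), (True, "+")]
print(scores(card))   # (1, 1): usability 2 - 1, security 3 - 2
```

Changing the environment simply means building the card from a different subset of rows, which is why the same technology can score differently in enterprise and consumer settings.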

Readers' comments are welcome to correct any omissions or errors in the results.
Legend: (+) feature supported (positive point); (x) problem / attack applies (negative point); (3.n) see note below; blank: feature not provided (Features) or problem / attack not applicable (Problems / Attacks).

Ref | TECHNOLOGY                                 | X.509 / PKI | PGP | IBE

Features:
F1  | Encryption (message confidentiality)       |  +  |  +  |  +
F2  | Decryption                                 |  +  |  +  |  +
F3  | Message Integrity                          |  +  |  +  |  +
F4  | Key Expiration                             |  +  |  +  |  +
F5  | Key Revocation                             |  +  | 3.2 |
F6  | Base 64 Encoding                           |  +  |  +  |  +
F7  | Compact Encoding                           |     |  +  |  +
F8  | Identity Certificate                       |  +  |  +  |
F9  | Private Key Not At Server (1.1)            |  +  |  +  |
F10 | Meets Digital Signature Requirement (1.2)  |  +  | 3.3 |
F11 | Authenticates Sender and Recipient (1.3)   |  +  |  +  |
F12 | Message Expiration                         |  +  |     | 3.1
F13 | Message Release                            |     |     | 3.1
F14 | Message Recall                             |     |     |
F15 | Secure Web Form Processing (1.4)           |     |     |
F16 | Return Receipt (1.5)                       |     |     |
F17 | Verified Timestamp (1.6)                   |     |  +  |
F18 | Attachment Encoding (easy decryption)      |  +  |  +  |  +
F19 | Direct Encoding (easy & simple decryption) |     |  +  | 3.1
F20 | Visible Message Fingerprint                |     |     |
F21 | Key Self-Revocation & Reset (1.7)          |  +  |  +  |
F22 | Key Self-Recovery (1.7)                    |  +  |  +  |  +

Problems / Attacks:
P1  | Private Key Escrowed At Server (2.1)       |     | 3.4 |  x
P2  | Break Private Key Protection               |  x  |  x  |  x
P3  | Break Policy Protection                    |  x  |  x  |  x
P4  | Weak Authentication Accepted               |     | 3.5 | 3.5
P5  | Server Spoofing (2.2)                      |  x  |  x  |  x
P6  | Unverified Sender's Email Address          |     |     |  x
P7  | Phishing (Email Fraud)                     |     |     |  x
P8  | "Lunchtime" Attack (2.3)                   |  x  |  x  |  x
P9  | Key Management                             |  x  |  x  | 3.6
P10 | Key Revocation Delay [7]                   |  x  | 3.8 | 3.7
P11 | Lack of Centralized Key Revocation         |     |  x  |  x
P12 | Open Message Headers (2.4)                 |  x  |  x  |  x
P13 | Must Pre-Enroll Recipients                 |  x  |  x  |
P14 | Must Register To Read                      |  x  |  x  |  x
P15 | Must Send Own Certificate                  |  x  |  x  |
P16 | Requires Common Root Of Trust (2.5)        |  x  | 3.9 |  x

(3.1) Possible feature using IBE (Voltage), not currently offered.

(3.2) Technically available; possible feature with PGP Universal and Hushmail.

(3.3) Does not post revocation information.

(3.4) Problem / attack for PGP Universal and Hushmail.

(3.5) When a system accepts both weak and strong authentication to grant user access, the other party does not know which one was used. With X.509 / PKI, it is possible, based on factors such as certificate class, for the other party to verify the security policy that was used to grant user access.

(3.6) Requires issuance and distribution of expiration parameter if key expiration (F4) is added, with potential problems / attacks.

(3.7) Because IBE has no key revocation, a compromised private key cannot be terminated when desired. The revocation delay equals the time remaining until key expiration, which may be undefined -- implying a very large or even undefined delay for key revocation.

(3.8) In PGP, certificate revocation is done by the authenticators themselves, in a happenstance pattern and with an unspecified delay. There is no guarantee that, or when, the revocation information is up to date. In fact, many PGP public keys are simply abandoned in public repositories after the user forgets the passphrase for the private key. See E. Gerck, "Overview of Certification Systems: X.509, CA, PGP and SKIP", Black Hat Conference, 1999, in http://nma.com/papers/certover.pdf.

(3.9) In practice, PGP users verify keys out of band (e.g., by phone call) with each other, not through the web of trust -- eliminating the need for a Common Root Of Trust; each key can be self-signed. However, if the sender and recipient use the web of trust, they must a priori trust each other's key signers -- which requires a Common Root Of Trust.
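The out-of-band verification described in (3.9) typically relies on comparing short key fingerprints by voice. As a minimal sketch (not any product's actual fingerprint format), a fingerprint can be computed as a hash of the public key bytes and grouped for easy reading over the phone:

```python
import hashlib

def key_fingerprint(public_key_bytes: bytes) -> str:
    """Hash the public key bytes and format the result in short
    groups that are easy to read aloud and compare by phone."""
    digest = hashlib.sha256(public_key_bytes).hexdigest().upper()
    # Keep the first 40 hex characters, in 4-character blocks.
    return " ".join(digest[i:i + 4] for i in range(0, 40, 4))

# Both parties run this on the key copy they each hold; if the spoken
# fingerprints match, the key was not substituted in transit.
print(key_fingerprint(b"...example public key bytes..."))
```

The same comparison works whether the key is self-signed or signed through the web of trust, which is why it can substitute for a Common Root Of Trust.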

(Ref) See each Reference Sheet for Reference description.
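A side note on features F6 and F7 in the chart: Base 64 encoding makes binary ciphertext safe for email transport, but it maps every 3 input bytes to 4 output characters, inflating the message by roughly one third -- which is why a more compact encoding is listed as a separate feature. A quick stdlib sketch of the overhead:

```python
import base64

payload = bytes(range(256)) * 8          # 2048 bytes of arbitrary binary data
encoded = base64.b64encode(payload)      # transport-safe ASCII form

# 3 bytes in -> 4 characters out: about 33% size overhead.
print(len(payload), len(encoded), len(encoded) / len(payload))
```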

Conclusions

A series of conceptual, legal and implementation features and problems / attacks have been discussed and presented together with a score card for the email security technologies X.509 / PKI, PGP and IBE, with their pros and cons regarding security and privacy of email communication. Usability, as an aggregation of properties, is considered the Most Important Feature of a secure email system. This work also provides a view on what to improve regarding the email security technologies X.509 / PKI, PGP and IBE, in order to develop the specifications for new technology beyond current limitations.

Using the score card given in Section (3), the Usability Score and the Security Score were defined and calculated for each technology. Both the Usability and the Security Scores may change according to the particular use environment.

The IBE technology receives the highest Usability Score. However, the IBE technology receives the lowest Security Score.

Notwithstanding this result, the paper finds that because simplicity is a basic principle for both Usability and Security, usability and security are not in conflict with each other -- contrary to common opinion, there should be no fundamental need to balance security with usability.

Because any feature or problem / attack can be considered optional, readers can pick and choose among them to build their own subset of the score card, valid for their needs, budget, usage and other factors. One can also rate individual products, including products using different technologies, by comparing their score cards. For product evaluation, the check mark can be changed to a product-specific grade.
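The paper's exact score definitions are given in Section 3 and are not reproduced here; purely as an illustration of such reader-customized tallying (assuming, for the sketch only, that each "+" adds to a usability tally and each "x" counts against security), a subset of the score card could be evaluated as follows. The marks below are hypothetical, abbreviated examples, not the chart's actual rows:

```python
# Hypothetical, abbreviated score card: "+" = feature present,
# "x" = problem applies, "" = blank cell.
features = {
    "X.509/PKI": ["+", "+", "+", "+", ""],
    "PGP":       ["+", "+", "+", "", ""],
    "IBE":       ["+", "+", "+", "+", "+"],
}
problems = {
    "X.509/PKI": ["x", "", "", "x"],
    "PGP":       ["x", "x", "", "x"],
    "IBE":       ["x", "x", "x", "x"],
}

def usability_score(marks):
    """Count features the technology offers."""
    return sum(m == "+" for m in marks)

def security_score(marks, total):
    """Count problems / attacks the technology avoids."""
    return total - sum(m == "x" for m in marks)

for tech in features:
    print(tech,
          usability_score(features[tech]),
          security_score(problems[tech], len(problems[tech])))
```

For product evaluation, the boolean marks can simply be replaced by product-specific numeric grades and summed the same way.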

The author acknowledges important contributions in private and public comments from readers. The public comments are available, for the most part, in the email-security.net Blog and also in messages in other Blogs and listservers.


REFERENCES

[Ge1] E. Gerck, "Overview of Certification Systems: X.509, CA, PGP and SKIP", Black Hat Conference, 1999, in http://mcwg.org/mcg-mirror/cert.htm.

[Ge2] E. Gerck, "Trust as Qualified Reliance on Information, Part I", The COOK Report on Internet, Volume X, No. 10, January 2002, ISSN 1071 - 6327, in http://nma.com/papers/it-trust-part1.pdf. See also Gerck, E., Trust Points (cited section), in Digital Certificates: Applied Internet Security authored by Feghhi J, Feghhi J, Williams P., New York, Addison-Wesley, 1998, 194-195.

[WhTg1] Alma Whitten and J. D. Tygar, "Why Johnny Can't Encrypt: A Usability Evaluation of PGP 5.0", in http://www.gaudior.net/alma/johnny.pdf

[1] Email communication is much like anonymous postcards, answered by anonymous recipients. Anyone can send an email in your name or using your email address. Every email or attachment you send over a computer network is copied (and perhaps even backed up) on many different computers, without your explicit knowledge or consent. However, email messages, open for anyone to read -- and even write in them -- are expected to carry secure and private messages between specific endpoints. The Internet, as an open network with open participation, has also increased the need for email security far beyond the threat model considerations envisioned when email was first specified, more than a generation ago, for a closed Internet with vetted participants (ARPANET). Today, the Internet has more than one billion users [Morgan Stanley Research, 2005]. A second billion users is expected to follow in the next ten years, bringing a dramatic change in worldwide security and usability needs.

[2] While email privacy and security encompass many possible needs, in general any email that can be read or used without authorization presents a liability. For example, a corporation's reasoning behind a contract negotiation might be harmful if revealed even after the contract is signed. In practice, however, users are neither careful enough in selecting email content to prevent any harmful disclosure when they send email nor assiduous in protecting all email messages they receive. In the event that regular email has to be used in a business or legal communication, content should be limited considering that disclosure to third parties cannot be avoided, and confidentiality disclaimers should be used to state that fact. The Missouri Bar Disciplinary Counsel, for example, requires all Missouri attorneys to notify all recipients of e-mail that:

(1) e-mail communication is not a secure method of communication; (2) any e-mail that is sent between you and this law firm may be copied and held by various computers it passes through as it is transmitted; (3) persons not participating in our communication may intercept our communications by improperly accessing your computer or this law firm's computers -- or even some computer unconnected to either of us that the e-mail may have passed through.

[3] RSA, Entrust, Postini (SSL), Google (SSL), Cryptzone, RPost, Microsoft Outlook, PGP, HushMail, MessageGuard, and Voltage. Product names are trademarks of the respective owners.

[4] Email is not a point-to-point message between one client and one server (as a web browser viewing a web page at a server). Email is a store-and-forward message from one client to another client, with possibly several independent servers, routers, caches, buffers, content analyzers, human agents, traffic analyzers, monitors and storage devices in-between the two clients, all acting at different times and with possibly long delays. Store-and-forward supports availability, reliability and anonymity for email.  Anonymity or pseudonymity, for example, cannot be achieved if there is a direct connection from the sender to the recipient because it can be traced. For strong anonymity or pseudonymity, email messages can use anonymizing remailers with random latency store and forward.

[5] SSL provides for encryption between two communicating points, such as a client application at a desktop PC and an application at a secure Internet server, usually authenticated at the server end only. Data can be transmitted using SSL over the Internet in point-to-point, two-way secure communication, encrypted at the sending point and decrypted at the receiving point. However, SSL, by itself, cannot protect an email message from client to client [4], and thus cannot prevent spam, spoofing, phishing and pharming either -- all of which affect email security.
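The client-to-client gap described in [5] can be pictured with a toy simulation (the XOR "cipher" below is illustrative only, NOT a real cipher, and all key names are hypothetical): with transport (SSL-style) protection alone, each store-and-forward server decrypts its hop and handles the plaintext; with message (end-to-end) encryption, only the recipient's key opens the content.

```python
import hashlib

def toy_encrypt(key: bytes, msg: bytes) -> bytes:
    """Toy XOR keystream cipher -- for illustration only, NOT secure."""
    stream = hashlib.sha256(key).digest() * (len(msg) // 32 + 1)
    return bytes(m ^ s for m, s in zip(msg, stream))

toy_decrypt = toy_encrypt  # XOR with the same keystream is its own inverse

msg = b"contract terms: confidential"

# Transport security only: the link is encrypted, but the relay server
# decrypts its hop and sees the plaintext while the message is stored.
hop1 = toy_encrypt(b"client-server link key", msg)
at_server = toy_decrypt(b"client-server link key", hop1)
print(at_server)  # the relay reads the plaintext here

# Message (end-to-end) security: encrypted under the recipient's key
# before sending; every relay only ever handles ciphertext.
e2e = toy_encrypt(b"recipient key", msg)
print(toy_decrypt(b"recipient key", e2e))  # only the recipient recovers it
```

The simulation makes the store-and-forward point of [4] concrete: SSL protects each link, but the message itself is exposed at every intermediate stop unless it is encrypted client to client.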

[6] Ross Anderson, University of Cambridge Computer Laboratory, UK.

[7] For X.509 / PKI the delay between a certificate being revoked and the actual posting of the revocation can be quite small, through the use of OCSP or other real-time discovery technologies. However, the delay between the CA receiving from the subscriber a notice to revoke and the actual posting of the revocation can be quite large and is even, usually, left unspecified by the CA (e.g., to reduce liability). See Ed Gerck, "Certificate Revocation Revisited, Internet X.509 Public Key Infrastructure". IETF PKIX Working Group, draft-gerck-pkix-revocation-00.txt, in http://tools.ietf.org/draft/draft-gerck-pkix-revocation/draft-gerck-pkix-revocation-00.txt

[8] Key escrow is a backdoor that allows a message to be decrypted without the author's or recipient's cooperation or knowledge; if signing keys are included, anyone's digital signature can be forged. Businesses, to avoid the key escrow backdoor liability and yet prevent problems if the proverbial bus hits an employee, may use message escrow (e.g., a secure email copy that can be decrypted by selected persons in administration) for sensitive communications. For policy or law enforcement purposes, communication information (such as routing, IP numbers, email addresses, file size, time, and frequency of use) can also be used.
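The message-escrow arrangement in [8] amounts to encrypting the content key to more than one authorized reader. A minimal sketch of the idea (the XOR "cipher" is illustrative only, NOT a real key-wrap scheme, and all key names are hypothetical): a random content key encrypts the message once, and a wrapped copy of that key is made for the recipient and for a designated escrow officer. No signing keys are escrowed, so signatures cannot be forged.

```python
import hashlib, os

def toy_xor(key: bytes, data: bytes) -> bytes:
    """Toy XOR keystream -- for illustration only, NOT secure."""
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(d ^ s for d, s in zip(data, stream))

message = b"sensitive negotiation notes"
content_key = os.urandom(32)               # one key encrypts the message body
ciphertext = toy_xor(content_key, message)

# Wrap (encrypt) the content key once per authorized reader: the intended
# recipient plus the designated escrow officer.
wrapped = {
    "recipient": toy_xor(b"recipient long-term key", content_key),
    "escrow":    toy_xor(b"escrow officer long-term key", content_key),
}

# The escrow officer recovers the body without the recipient's cooperation:
key_copy = toy_xor(b"escrow officer long-term key", wrapped["escrow"])
print(toy_xor(key_copy, ciphertext))
```

Because only encryption keys are wrapped this way, the business retains access to sensitive content without creating the signature-forgery backdoor of full key escrow.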

Contact Information

Ed Gerck, Ph.D.

DISCLAIMER

This paper does not intend to cover all the details of the technologies reported, or all the variants thereof. Its coverage is limited to provide support and references to the work in progress on new email security technology and to unify references, concepts and terminology. No political or country-oriented criticism is to be construed from this work, which respects all the apparently divergent efforts found today on the subjects treated. Products, individuals or organizations are cited as part of the fact-finding work needed for this paper and their citation constitutes neither a favorable nor an unfavorable recommendation or endorsement.


Copyright © 2005 by E. Gerck, first published online on December 6, 2005.
All rights reserved, free copying and citation allowed with source and author reference.