Security flaws introduced by "other readers" in CMR

Adam Back aba at dcs.ex.ac.uk
Fri Oct 17 16:04:53 PDT 1997




High quality anonymous post, which I'll comment on below.  I have my
suspicions as to who this person is; I won't comment on who, but I do
appreciate his thoughtful comments.

Anonymous writes:
> Let's look a little closer at the kinds of access and notice available in
> the PGP system compared to alternatives.
> 
> As several people have pointed out, no encryption system can provide
> protection against what is done with the message after it is decrypted.
> The receiver could share it with whoever he wants to, or whoever he is
> forced to.
> 
> Consider the following scenarios:
> 
>  - An encrypted message is sent to someone at his workplace using current
>    systems like PGP 2.6.
> 
>  - An encrypted message is sent to someone at his workplace, who is using the
>    message access features of PGP 5.5.  The corporation key is marked as
>    "optional", and the message is not encrypted to that extra key.
> 
>  - An encrypted message is sent as before, but this time the message is
>    encrypted to the corporate key.

You missed one scenario, which corporates are probably going to like
better than you imagine.  The reason they will like it better than you
imagine is that you believe in privacy; they do not.  Your belief in
privacy is making you unrealistically optimistic about companies'
respect for privacy.

The missed scenario is:

>  - An encrypted message is sent to someone at his workplace, who is using the
>    message access features of PGP 5.5.  The corporation key is marked as
>    "non-optional", and the message is encrypted to that extra key.

If you are non-technical (which most people are), you will be forced
to use this, because the SMTP policy enforcer will bounce the mail if
you do not.
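
To make the enforcement concrete, here is a minimal sketch of the
accept/bounce decision such an SMTP policy enforcer might make.  The
key IDs, and the assumption that the recipient key IDs have already
been read out of the message's encrypted session-key packets (they
are visible without decrypting), are mine for illustration only, not
PGP Inc's implementation.

    # Accept/bounce decision based on which keys a PGP message is
    # encrypted to; the recipient key IDs can be read from the
    # message without decrypting it.  Key IDs here are hypothetical.
    CORPORATE_KEY_ID = "0x12345678ABCDEF99"

    def enforce_cmr_policy(recipient_keyids):
        if CORPORATE_KEY_ID in recipient_keyids:
            return "accept"
        return "bounce: policy requires encryption to the corporate key"

    # Mail encrypted only to the employee's personal key gets bounced:
    print(enforce_cmr_policy(["0xAABBCCDD11223344"]))
    # Mail that also includes the corporate CMR key is let through:
    print(enforce_cmr_policy(["0xAABBCCDD11223344", CORPORATE_KEY_ID]))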

Your expectation of privacy is respected, because you expect none.

However, if the system is used for purposes other than those
anticipated, such as a GAK enforcement mechanism in
tinpotdictatorsville, it is not just expectation-of-privacy concerns
that should worry you.  It is civil rights issues.

You have a dual concern: you are trying to protect against big brother
and against little brother.

The way your concerns about little brother have been translated into
the protocol, resulting in the CMR mechanism, has added nothing at all
to protection against little brother; protection against little
brother is largely a statement-of-intent issue.

However you do raise a fresh point in this debate below.

>  - An encrypted message is sent to someone at his workplace, who is using
>    a different form of corporate access which relies on key escrow
>    or automatic acquisition of cleartext.
> 
> We agree that none of these systems can provide complete protection
> against third party access.  In any of them, the receiver could be forced
> to turn over the encrypted message.  But there is still a difference.
> 
> In the first two scenarios, the expectation of privacy is the same.
> You are encrypting to a personal key at a business address, but you
> expect that you would know about it if the data were being provided to
> a third party.  There is no widespread installed base of software which
> provides secret access after the message is decrypted.

This is clever reasoning.  I have not seen anyone raise this objection
previously.

However, you neglect something which is already very widely deployed:
unencrypted storage.  Hardly anyone uses disk encryption software, so
on the overwhelming majority of installed systems someone else in the
company can already get at the decrypted mail: the plaintext on disk
is an easier target than CDR-recoverable data in my system.

Your second distinction then rests on the claim that the email is
still encrypted in the mail folder, and that it carries no recovery
information.

True enough, it doesn't.

However, what you are really saying is that the system has been
purposefully designed to respect employees' right to receive private
email.  That design decision is implemented at a level above the
protocol level, and it is independent of the use of CMR or CDR.

You can easily implement private email with CDR.  You can provide two
keys: one for private use and one for company use.  Or you can mark
the email as private or not.  Or you can give the user the ability to
control independently for each email, as sketched below:

- is it archived (yes/no)
- is the archive recoverable (yes/no)
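
To illustrate, here is a minimal sketch of how the receiving client
could act on those two per-email choices under a CDR-style design.
The key names and the encrypt_to helper are hypothetical stand-ins,
not a real PGP API.

    # Per-message storage control under a CDR-style design: the
    # recipient decides whether a decrypted message is archived at
    # all, and whether the archived copy is recoverable by the
    # company.  Only stored data is ever encrypted to the company
    # key; nothing extra travels on the wire.
    PERSONAL_STORAGE_KEY = "personal-storage-key"
    COMPANY_RECOVERY_KEY = "company-recovery-key"

    def store_message(plaintext, archive, recoverable, encrypt_to):
        if not archive:
            return None                  # message is simply not kept
        keys = [PERSONAL_STORAGE_KEY]
        if recoverable:
            keys.append(COMPANY_RECOVERY_KEY)
        return encrypt_to(plaintext, keys)

    # Example with a dummy encryptor standing in for whatever local
    # storage encryption the client uses:
    stored = store_message("quarterly figures", archive=True,
                           recoverable=True,
                           encrypt_to=lambda pt, ks: (pt, ks))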

I have been arguing for these features to be put in also, which
demonstrates that I have just as much concern for privacy as you do.

> In fact, the second scenario arguably provides slightly more privacy
> than the first, which is what we have now.  The reason is that in the
> second scenario, the company provides an optional mechanism for business
> mail to be recoverable, but also provides an explicit, sender-controlled
> escape clause.  The company is granting an explicit expectation of privacy
> by allowing the corporate key recipient to be removed.  With usage of
> older versions of PGP, there was really no way for the company to even
> inform the sender about whether it was going to be reading the mail.

I like the flag stating intent about plaintext handling.  It is
independent of the protocol level, and can be used with either system.

> In the third scenario, where you encrypt to the corporate key, there is
> no expectation of privacy.  All parties, the sender, the receiver, and
> the company, know that the data is being made available to the business.
> There is not much privacy here; what you have is business security.  This
> mode would be used for business documents and business communications.

This is dangerous.  Companies could lose lots of money over this when
they are sued and a discovery process is used to recover all business
communications.  Lawyers will be poring over those communications.
Most of them will have been written with under a minute's thought.
The lawyer will be able to find lots of legal nits to pick.  He will
build a case, and your company will lose a fortune.
 
This is why you should give the user control over what is archived.
He can decide to archive the things he considers important to keep
for company records.  The rest can be lost.  It is also why you should
do something somebody else suggested to me in email: have signatures
which expire.

> In the fourth scenario, which some people are offering as a superior
> approach, the expectation of privacy is much less clear.  Unlike in the
> current PGP approach, where the design of the corporate access system
> is oriented around notifying the sender and receiver, a key escrow or
> plaintext recovery system inherently provides no such notice.

The expectation of privacy is based on trust.  It cannot be enforced,
as you acknowledged above.  A statement-of-intent expectation of
privacy can be encoded in the key, independent of the data recovery
method.
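
As an illustration only (the attribute name and values are my
invention, not an OpenPGP packet format), such a statement of intent
could ride along with the recipient's public key and be shown to the
sender before encrypting, whether recovery is then done by CMR or CDR:

    # Hypothetical "plaintext handling" statement of intent attached
    # to a public key record; the sender's client displays it before
    # encrypting.  Nothing here depends on CMR vs CDR.
    key_record = {
        "user_id": "employee@example.com",
        "plaintext_handling": "company-may-read",   # or "private"
    }

    def notify_sender(key_record):
        policy = key_record.get("plaintext_handling", "unknown")
        if policy != "private":
            print("Warning: recipient's key declares:", policy)

    notify_sender(key_record)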

> This is different from the status quo.  We are talking about a commercial
> product which provides automatic corporate access to communications.
> Widespread installation and use of this alternate version of PGP
> inherently changes the assumptions and expectations that you will have
> when you encrypt to someone at work.  

You can add as many privacy features as you want on top of CDR.
Whether your implementation changes the status quo positively or
negatively is your decision; you have full control over which features
you provide.

> Given the company's desire for message access, 

I thought that PGP stated that companies wanted to be able to recover
stored archived email for employee performance benefits.

> this approach implies that you must conservatively assume that most
> encryption to a business address will be read.

No.  You can add the exact same framework for statement of intent,
and for control of what is and is not archived.  This is a completely
orthogonal issue.

> The PGP system is designed to avoid this.  It provides another method
> for corporate access, a visible and explicit method.  

Visibility and explicitness are achievable with either system.

> It is intended as an alternative to key escrow and other systems
> which recover data after the recipient gets it.  PGP's third party
> recipient feature is designed to prevent the creation of an
> installed base of escrowed keys and after the fact third party
> access software.

Which would you rather have:

- government recovery of those parts of the data which users
voluntarily select to be stored in ways which can be recovered (after
government raid of offices)

- government keyword scanning of all email in transit, without needing
to raid offices?

PGP Inc's CMR design is flawed because you have focused on privacy
concerns which can equally well be addressed with the CDR design, and
you have done so at the expense of building a system which offers
little resistance to being adopted by government to build a
communications snooping, keyword scanning system.

> Even if the encryption system provides a way for the user to mark his
> keys as being escrowed or otherwise providing third party access,
> the system is not self-correcting in the way the PGP system is.
> It would be much easier for a business owner to pervert a secretive
> access system than one like PGP.  

I didn't propose it as a secretive access system; I proposed it as a
recovery system.  Of course, it can be perverted into one in the same
way that CMR can be; both systems can be perverted.  I argued that the
software should be designed to make such perversion as hard as
possible, though bypassing will never be truly hard, because software
modifications can achieve it.

My proposal is to allow users to selectively archive what they want,
and to selectively allow corporate recovery of what they want.

The CMR system is not self-correcting.  The CMR system means that
companies can snoop all of the email (except where the sender has
taken steps to hack around the CMR key).  With my proposal, in
addition, the recipient can delete things he would rather not keep
around.

> The company could install the access features but not tell its
> employees that their keys were being escrowed, or that their mail
> client was secretly storing a copy of all plaintexts.  This kind of
> change would be easy to make with the after-the-fact access systems
> some people are proposing.  Yes, theoretically this can be done with
> any encryption system (as all agree) but the point is to avoid the
> widespread distribution of software which invites this kind of
> abuse.  The PGP system is designed to satisfy business needs without
> leading us into this dangerous situation.

CMR invites even worse kinds of abuse.  It invites wide-scale
government abuse: keyword scanning of email.

CMR is even worse because every last email can be recovered.

> A world in which third party encryption by the sender is the normal
> and widespread method for corporate access is one with increased
> privacy and more visibility of access.

Disagree on both counts, as explained variously above.

> The alternatives which have been proposed are prone to abuse and
> provide less visibility.

They are less prone to abuse as I have stated them above.  But the
proneness to abuse depends on the functionality you demand of them,
in so far as that affects the design; ultimately I think they can
both be hacked with equivalent ease in either direction (more privacy,
less privacy).

> They lead us towards a world in which there is no expectation of
> privacy even for encrypted messages, 

Disagree.  Expectations of privacy are orthogonal issues.

> by providing an installed base of snooping software.  

This charge applies to CMR software.  Third-party snooping of
communications is infinitely more dangerous than snooping of stored
information.

In a company environment the company has control.  It can install a
keyboard sniffer when you are out of the office.  It can replace your
software.  This applies to both systems.  With CMR the company can
read all email without the receiver having any choice.  With CDR the
receiver can choose to archive or not to archive.  He can choose to
allow recovery or not.

Next, let us talk about the government dangers.

With CMR, if this technology is adopted by governments, and the
OpenPGP standard is allowed to include CMR technology, we have keyword
scanning.  The CMR technology will allow the government to know when
an unskilled person is cheating.

With CDR, if the government demands escrow of all storage keys, the
government has no way of knowing whether you are cheating without
breaking into your building or house.  If you consider that the
government will break in anyway, you don't have much to lose by not
using the key the government has escrowed.

> The PGP approach is explicitly designed to keep us out of that
> world, by removing the need for that kind of software.

It is trying to do something useful, and the people who are proposing
it have good privacy-preserving intentions.

I think it is failing to meet those intentions well, because privacy
preservation, statement of intent, and non-hackability by companies
are all issues orthogonal to the choice between CDR and CMR.

CDR offers significant extra protection against government perversion
of the software authors' intent.

data recovery good, communications recovery bad.

Adam
-- 
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`






