Challenge to David Wagner on TCPA

AARG! Anonymous remailer at aarg.net
Wed Jul 31 23:45:35 PDT 2002


Peter Trei writes:
> AARG!, our anonymous Pangloss, is strictly correct - Wagner should have
> said "could" rather than "would".

So TCPA and Palladium "could" restrict which software you could run.
They aren't designed to do so, but the design could be changed and
restrictions added.

But you could make the same charge about any software!  The Mac OS could
be changed to restrict what software you can run.  Does that mean that
we should all stop using Macs, and attack them for something that they
are not doing and haven't said they would do?

The point is, we should look critically at proposals like TCPA and
Palladium, but our criticisms should be based in fact and not fantasy.
Saying that they could do something or they might do something is a much
weaker argument than saying that they will have certain bad effects.
The point of the current discussion is to improve the quality of the
criticism which has been directed at these proposals.  Raising a bunch
of red herrings is not only a shameful and dishonest way to conduct the
dispute; it could also backfire if people come to realize that the system
does not actually behave as the critics have claimed.

Peter Fairbrother made a similar point:

> The wise general will plan his defences according to his opponent's
> capabilities, not according to his opponent's avowed intentions.

Fine, but note that at least TCPA as currently designed does not have this
specific capability of keeping some software from booting and running.
Granted, the system could be changed to allow only certain kinds of
software to boot, just as similar changes could be made to any OS or
boot loader in existence.

Back to Peter Trei (and again, Peter Fairbrother echoed his concern):

> However, TCPA and Palladium fall into a class of technologies with a
> tremendous potential for abuse. Since the trust model is directed against
> the computer's owner (he can't sign code as trusted, or reliably control 
> which signing keys are trusted), he has ceded ultimate control of what 
> he can and can't do with his computer to another. 

Under TCPA, he can do everything with his computer that he can do today,
even with TCPA turned on.  What he can't do is use the
new TCPA features, like attestation or sealed storage, in such a way as
to violate the security design of those systems (assuming of course that
the design is sound and well implemented).  This is no more a matter of
turning over control of his computer than is using an X.509 certificate
issued by a CA to prove his identity.  He can't violate the security of
the X.509 cert.  He isn't forced to use it, but if he does, he can't
forge a different identity.  This is analogous to how TCPA's attestation
feature works.  He doesn't have to use it, but if he wants to prove what
software he booted, he does not have the ability to forge the data and
lie about it.
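
Just to make concrete what "can't forge the data" means here, consider
how the TCPA Platform Configuration Registers (PCRs) accumulate boot
measurements.  The following is a minimal sketch in Python, assuming the
SHA-1 hash-chained PCR design described in the TCPA spec; the function
name and the example component names are mine, not the spec's:

    import hashlib

    def measure_and_extend(pcr: bytes, component: bytes) -> bytes:
        # Hash the component to a 20-byte digest, then fold it into the
        # PCR with a one-way "extend": new_pcr = SHA1(old_pcr || digest).
        digest = hashlib.sha1(component).digest()
        return hashlib.sha1(pcr + digest).digest()

    pcr = bytes(20)  # PCRs start at all zeros when the platform resets
    for component in [b"BIOS", b"boot loader", b"OS kernel"]:
        pcr = measure_and_extend(pcr, component)

    print(pcr.hex())

Because each step is a one-way hash of everything that came before, the
final PCR value commits to the entire boot sequence.  To "lie" about
what booted, the owner would have to find a different sequence of
measurements that hashes to the same value, i.e. break the hash
function.  Nothing in this stops him from booting whatever he likes;
it only stops him from claiming he booted something else.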

> Sure, TCPA can be switched off - until that switch is disabled. It 
> could potentially be permanently disabled by a BIOS update, a 
> security patch, a commercial program which carries signed 
> disabling code as a Trojan, or over the net through a backdoor or 
> vulnerability in any networked software. Or by Congress 
> which could make running a TCPA capable machine with TCPA 
> turned off illegal.

This is why the original "Challenge" asked for specific features in the
TCPA spec which could provide this claimed functionality.  Even if TCPA
is somehow kept turned on, it will not stop any software from booting.

Now, you might say that they can then further change the TCPA so that
it *does* stop uncertified software from booting.  Sure, they could.
But you know what?  They could do that without the TCPA hardware.
They could ship a BIOS with a built-in certificate so that only signed
OSes could boot.  That's not what TCPA does, and it's nothing like how
TCPA works.
A system like this would be a very restricted machine and you might
justifiably complain if the manufacturer tried to make you buy one.
But why criticize TCPA for this very different functionality, which
doesn't use the TCPA hardware, the TCPA design, or the TCPA API?
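
To make the distinction concrete, here is a rough sketch contrasting
the two behaviors, again in Python.  The digest allow-list below stands
in for the hypothetical BIOS's certificate check, and all of the names
are illustrative; neither function is quoting any real BIOS or the
TCPA API:

    import hashlib

    def extend(pcr: bytes, digest: bytes) -> bytes:
        return hashlib.sha1(pcr + digest).digest()

    def measured_boot(pcr: bytes, os_image: bytes) -> tuple[bytes, bool]:
        # TCPA model: record a measurement of the OS, then boot it anyway.
        pcr = extend(pcr, hashlib.sha1(os_image).digest())
        return pcr, True                    # True: the OS runs

    def enforced_boot(os_image: bytes, approved: set) -> bool:
        # Hypothetical certifying BIOS: run only what is on the list.
        return hashlib.sha1(os_image).digest() in approved

    home_built_os = b"my own kernel"
    print(measured_boot(bytes(20), home_built_os)[1])    # True: still boots
    print(enforced_boot(home_built_os, approved=set()))  # False: locked out

The first model records what ran; the second refuses to run anything
the vendor didn't bless.  Only the second takes the choice of software
away from the owner, and it needs no TCPA hardware at all.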

> With TCPA, I now have to trust that a powerful third party, over
> which I have no control, and which does not necessarily have
> my interests at heart, will not abuse its power. I don't
> want to have to do that.

How could this be true, when there are no features in the TCPA design
to allow this powerful third party to restrict your use of your computer
in any way?

(By the way, does anyone know why these messages are appearing on
cypherpunks but not on the cryptography at wasabisystems.com mailing list,
when the responses to them show up in both places?  Does the moderator of
the cryptography list object to anonymous messages?  Or does he think the
quality of them is so bad that they don't deserve to appear?  Or perhaps
it is a technical problem, that the anonymous email can't be delivered
to his address?  If someone replies to this message, please include this
final paragraph in the quoted portion of your reply, so that the moderator
will perhaps be prompted to explain what is going wrong.  Thanks.)




