RE: Challenge to David Wagner on TCPA
Peter Trei writes:
I'm going to respond to AARGH!, our new Sternlight, by asking two questions.
1. Why can't I control what signing keys the Fritz chip trusts?
If the point of TCPA is to make it so *I* can trust *my* computer to run the software *I* have approved, and to refuse to run something which a virus or Trojan has modified (and this, btw, is the stated intention of TCPA), then why the hell don't I have full control over the keys? If I did, the thing might actually work to my benefit.
The beneficiary of TCPA when I don't have ultimate root control is not I. It is someone else. That is not an acceptable situation.
You might be surprised to learn that under the TCPA, it is not necessary for the TPM (the so-called "Fritz" chip) to trust *any* signing keys!

The TCPA basically provides two kinds of functionality. First, it can attest to the software which was booted and loaded. It does this by taking hashes of the software before transferring control to it, and storing those hashes in its internal secure registers. At a later time it can output those hashes, signed by its internal signature key (generated on-chip, with the private key never leaving the chip). The system also holds a cert issued on this internal key (which is called the Endorsement key), and this cert is issued by the TPM manufacturer (also called the TPME). But this functionality does not require storing the TPME key, just the cert it issued.

Second, the TCPA provides for secure storage via a "sealing" function. The way this works, a key is generated and used to encrypt a data blob. Buried in the blob can be a hash of the software which was running at the time of the encryption (the same data which can be reported via the attestation function). Then, when the data is decrypted and "unsealed", the hash is compared to that which is in the TPM registers now. This can make it so that data which is encrypted when software system X boots can only be decrypted when that same software boots. Again, this functionality does not require trusting anyone's keys.

Now, there is an optional function which does use the manufacturer's key, but it is intended only to be used rarely. That is for when you need to transfer your sealed data from one machine to another (either because you have bought a new machine, or because your old one crashed). In this case you go through a complicated procedure that includes encrypting some data to the TPME key (the TPM manufacturer's key) and sending it to the manufacturer, who massages the data such that it can be loaded into the new machine's TPM chip.
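The attestation mechanism just described, where each piece of software is hashed into a secure register before control is handed to it, can be sketched in miniature. This is a toy model, not the TCPA spec: the `extend` function and the stage names are illustrative, and SHA-256 stands in for the SHA-1/160-bit registers the actual TPM design of this era used.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """Extend a register: new value = H(old value || H(measurement)).

    A toy version of the TPM "extend" pattern, using SHA-256 for
    illustration; the register can only be folded forward, never set.
    """
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

# Boot chain: each stage is measured into the register before it runs.
pcr = bytes(32)  # registers start at all zeros on reset
for stage in [b"bootloader image", b"kernel image", b"init binary"]:
    pcr = extend(pcr, stage)

# The final value depends on every stage and on their order, so it
# identifies the booted software stack as a whole.
```

Because the register is extend-only, software that runs later cannot erase the record of what ran before it; a different boot chain necessarily yields a different final value.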
So this function does require pre-loading a manufacturer key into the TPM, but first, it is optional, and second, it frankly appears to be so cumbersome that it is questionable whether manufacturers will want to get involved with it. OTOH it is apparently the only way to recover if your system crashes. This may indicate that TCPA is not feasible, because there is too much risk of losing locked data in a machine crash, and the recovery procedure is too cumbersome. That would be a valid basis on which to criticize TCPA, but it doesn't change the fact that many of the other claims which have been made about it are not correct.

In answer to your question, then: for most purposes, there is no signing key that your TPM chip trusts, so the issue is moot. I suggest that you go ask the people who misled you about TCPA what their ulterior motives were, since you seem predisposed to ask such questions.
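The sealing function described above can likewise be sketched. Everything here is a hypothetical stand-in: the XOR keystream is a toy cipher, and in a real TPM the register comparison is enforced inside the chip rather than in application code as it is below.

```python
import hashlib

def _stream(key: bytes, n: int) -> bytes:
    # Toy keystream (SHA-256 in counter mode). Illustration only.
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def seal(secret: bytes, storage_key: bytes, pcr: bytes) -> dict:
    # Bind the ciphertext to the software state recorded at seal time.
    ct = bytes(a ^ b for a, b in zip(secret, _stream(storage_key, len(secret))))
    return {"pcr_at_seal": pcr, "ciphertext": ct}

def unseal(blob: dict, storage_key: bytes, current_pcr: bytes) -> bytes:
    # Refuse to decrypt unless the currently booted software matches.
    if blob["pcr_at_seal"] != current_pcr:
        raise PermissionError("PCR mismatch: different software is running")
    ct = blob["ciphertext"]
    return bytes(a ^ b for a, b in zip(ct, _stream(storage_key, len(ct))))
```

Data sealed while "software system X" is booted unseals only when the current register value matches; booting anything else leaves the blob inaccessible, which is exactly why the migration procedure above matters.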
2. It's really curious that Mr. AARGH! has shown up simultaneously on the lists and on sci.crypt, with the single brief of supporting TCPA.
While I totally support his or her right to post anonymously, I can only speculate that anonymity is being used to disguise some vested interest in supporting TCPA. In other words, I infer that Mr. AARGH! is a TCPA insider, who is embarrassed to reveal himself in public.
So my question is: What is your reason for shielding your identity? You do so at the cost of people assuming the worst about your motives.
The point of being anonymous is that there is no persistent identity to attribute motives to! Of course I have departed somewhat from this rule in the recent discussion, using a single exit remailer and maintaining continuity of persona over a series of messages. But feel free to make whatever assumptions you like about my motives. All I ask is that you respond to my facts.
Peter Trei
PS: Speculating about the most tyrannical uses to which a technology can be put has generally proved a winning proposition.
Of course, speculation is entirely appropriate - when labeled as such! But David Wagner gave the impression that he was talking about facts when he said:

"The world is moving toward closed digital rights management systems where you may need approval to run programs," says David Wagner, an assistant professor of computer science at the University of California at Berkeley. "Both Palladium and TCPA incorporate features that would restrict what applications you could run."

Do you think he was speculating? Or do you agree that if he makes such statements, he should base them on fact? TCPA appears to have no mechanism for the user to need approval in order to run programs. That is how the facts look to me, and if anyone can find out otherwise, I would appreciate knowing. Maybe someone could ask David Wagner what he based the above claim on?
On Thu, 1 Aug 2002, AARG!Anonymous wrote:
By your own account, the system is designed to give root, in a manner claimed to be incorrigible by me but certainly inconvenient to me, to others. May I ask whether your definition of TCPA includes the legal and economic infrastructures whose purpose is to require that at some date in the future all computers sold be TCPA-enabled? If so, what is the claimed advantage to me here? I can certainly today let you ssh in to my box and run stuff in a sandbox, and I can certainly decide not to observe you running your stuff. TCPA in any form offers no advantages whatsoever except to the Englobulators.

oo--JS.
On 2002-08-01, AARG!Anonymous uttered to ptrei@rsasecurity.com,...:
It does this by taking hashes of the software before transferring control to it, and storing those hashes in its internal secure registers.
So, is there some sort of guarantee that the transfer of control won't be stopped by a check against a cryptographic signature within the executable itself, in the future? That sort of thing would be trivial to enforce via licencing terms, after all, and would allow for the introduction of a strictly limited set of operating systems to which control would be transferred. I'm having a lot of trouble seeing the benefit in TCPA without such extra measures, given that open source software would likely evolve which circumvented any protection offered by the more open-ended architecture you now describe. Such a development would simply mean that Peter's concern would be transferred a level up, without losing its relevance. I'd also contend that this extra level of diversion is precisely what TCPA, with its purported policy of "no trusted keys", aims at.
Then, when the data is decrypted and "unsealed", the hash is compared to that which is in the TPM registers now. This can make it so that data which is encrypted when software system X boots can only be decrypted when that same software boots.
Again, such values would be RE'd and reported by any sane open source OS to the circuitry, giving access to whatever data there is. If this is prevented, one can bootstrap an absolutely secure platform where whatever the content provider says is the Law, including one where every piece of runnable OS software actually enforces the kind of control over permissible signatures Peter is so worried about. Where's the guarantee that this won't happen, one day?
In answer to your question, then, for most purposes, there is no signing key that your TPM chip trusts, so the issue is moot.
At the hardware level, yes. At the software one, it probably won't be, even in the presence of the above considerations. After you install your next Windows version, you will be tightly locked in with whatever M$ throws at you in their DLL's, and as I pointed out, there's absolutely no guarantee that Linux et al. won't be shut out by extra features in the future. In the end what we get is an architecture which may not embody Peter's concerns right now, but which is built from the ground up to bring them into being later.

More generally, as long as we have computers which allow data to be addressed as code and vice versa, the ability to control use of data will necessarily entail the ability to control use of code. So, either we will get systems where circumventing copyright controls is trivial, or ones where you cannot compile your own code. All the rest is just meaningless syntax. In that light I bet you can guess why people are worried about TCPA and its ilk.

--
Sampo Syreeni, aka decoy - mailto:decoy@iki.fi, tel:+358-50-5756111
student/math+cs/helsinki university, http://www.iki.fi/~decoy/front
openpgp: 050985C2/025E D175 ABE5 027C 9494 EEB0 E090 8BA9 0509 85C2

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo@wasabisystems.com
On 2 Aug 2002 at 3:31, Sampo Syreeni wrote:
More generally, as long as we have computers which allow data to be addressed as code and vice versa, the ability to control use of data will necessarily entail ability to control use of code. So, either we will get systems where circumventing copyright controls is trivial or ones where you cannot compile your own code. All the rest is just meaningless syntax.
The announced purpose of TCPA/Palladium is to introduce some intermediate cases. For example, you could compile your own code, and then encrypt it so that it can only run on a specific target computer. As someone who sells code, I would think this would be a great idea, were it not for the excesses we have been seeing from the IP lobbyists.

--digsig James A. Donald
6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
iB5WVaGfx+zq5Dani1KQGdZIU5Kl21LDrc7w4e1m
2PoKhj2EuUKqjKlZ/RN3VXdP0TFKxmpO/rR69KupZ
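The intermediate case described here, code encrypted so it runs only on one target machine, can be sketched as follows. Everything in this sketch is a hypothetical stand-in, not anything from the TCPA spec: the machine identifier, the key derivation, and the XOR keystream are toys, and on real hardware the per-machine secret would live inside the TPM rather than being derivable in software.

```python
import hashlib

def _stream(key: bytes, n: int) -> bytes:
    # Toy keystream (SHA-256 in counter mode). Illustration only.
    out, i = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        i += 1
    return out[:n]

def machine_secret(machine_id: bytes) -> bytes:
    # Stand-in for a secret held inside one machine's TPM. On real
    # hardware this would never leave the chip; here it is derived
    # from a hypothetical machine identifier purely for illustration.
    return hashlib.sha256(b"tpm-secret:" + machine_id).digest()

def lock_to_machine(program: bytes, machine_id: bytes) -> bytes:
    # The vendor encrypts the program under the target machine's key.
    key = machine_secret(machine_id)
    return bytes(a ^ b for a, b in zip(program, _stream(key, len(program))))

def run_on_machine(blob: bytes, machine_id: bytes) -> bytes:
    # Only the machine holding the matching secret recovers the code.
    key = machine_secret(machine_id)
    return bytes(a ^ b for a, b in zip(blob, _stream(key, len(blob))))
```

Decrypting the blob on any other machine yields garbage rather than the program, which is the whole point of the intermediate case: the author keeps compiling freely, but the artifact is bound to one buyer's hardware.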
participants (4)
- AARG! Anonymous
- James A. Donald
- Jay Sulzberger
- Sampo Syreeni