Adam Back writes:
To address privacy with, for example, Brands' digital credentials: the underlying cryptography may be harder to understand, or at least less familiar, but I don't think using a toolkit based on Brands' digital credentials would be significantly harder than using an identity- or attribute-based PKI toolkit. The same goes for Chaum's credentials or other approaches.
Sure, but how many pages would it take in the spec to describe the protocol? Especially given their turgid technical-writer prose? Brands took a whole book to describe his credentials thoroughly. In any case, I agree that something like this would be an excellent enhancement to the technology. IMO it is very much in the spirit of TCPA. I suspect they would be very open to this suggestion.
Also, I notice you imply that patents are a problem. However, the TCPA itself has patents and will of course charge for the hardware. It doesn't seem that patents would present a problem for this application, where non-zero-reproduction-cost hardware is involved.
They don't say much about patents or intellectual property licensing in the documents I have found on their site. It's not clear to me that the so-called Palladium patents actually cover TCPA. You'd have to look at them in detail.
Anonymous writes:
And nobody's got the root key to my computer. You make this claim in many places in the document. What exactly is this "root key" in TCPA terms? The endorsement key? Its private part is generated on-chip and never leaves the chip!
The "root key" to your computing environment is the private key of the CA that signs the software updates.
What software updates, exactly? Spec reference?
You'll recall that in the secure boot-strapping process, if you choose to boot in TCPA mode and there are deviations or updates, these are fetched and accepted only if certified by the layer owner.
No, I don't recall seeing this in the spec. Hopefully as you have a chance to study it you can point out this part. I may well have missed a portion. If so then I agree that this is a potentially serious problem.
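For what it's worth, what I do see in the boot part of the spec is a measurement model, not a fetch-and-certify one: each layer hashes the next into a Platform Configuration Register before handing control to it, and a PCR can only ever be extended. Here is a minimal sketch of that idea in Python; the component names are purely illustrative, and only the SHA-1 chaining reflects the spec:

    import hashlib

    def pcr_extend(pcr_value: bytes, measurement: bytes) -> bytes:
        # TCPA-style extend: new PCR = SHA-1(old PCR || measurement).
        # A PCR can only be extended, never written directly, so it ends
        # up summarizing everything measured into it since reset.
        return hashlib.sha1(pcr_value + measurement).digest()

    # Measured boot, roughly: each layer measures the next before running it.
    pcr = bytes(20)                                   # PCRs start at all zeros
    for component in (b"BIOS image", b"boot loader", b"OS kernel"):
        pcr = pcr_extend(pcr, hashlib.sha1(component).digest())

    # Nothing in this process downloads updates or refuses to run anything;
    # the result is just a record that a challenger, or sealed data, can be
    # checked against later.

So the measurements record what booted; they don't stop anything from booting.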
You said somewhere in this thread that the user must approve software and firmware updates. However:
- the user will not see the source code for the updates
- the user is not in a position to evaluate the update
- there will be lots of updates (daily, weekly -- look at Microsoft's security bug-fix rate), to the extent that the user will blindly click OK.
I was talking about the optional TPM_FieldUpgrade function described on page 251 of the spec. It is apparently intended for bug fixes and such. I doubt that there will be that many bug fixes, or that users will install them that often. And if an upgrade does obvious bad things like the various despotic features you fear, keeping you from booting Linux or whatever, people can avoid installing it. I don't see this as a mechanism for someone to take over the world. It's true what you say about the user having to trust the manufacturer about the upgrade - but he has to trust the manufacturer anyway that the chip works right. Whatever monitoring process may be in place to further that trust can also protect the upgrade as well.
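To make concrete the kind of check involved, here is a rough sketch in Python of accepting such an upgrade: the owner has to authorize it, and the blob has to verify against the manufacturer's public key. The function and parameter names are mine, not the spec's; the actual TPM_FieldUpgrade interface is vendor-specific, so treat this as an illustration of the trust relationship rather than the real API:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def accept_field_upgrade(upgrade_blob: bytes, signature: bytes,
                             manufacturer_pubkey_pem: bytes,
                             owner_authorized: bool) -> bool:
        # The owner must say yes; an upgrade can't be pushed at the chip silently.
        if not owner_authorized:
            return False
        pubkey = serialization.load_pem_public_key(manufacturer_pubkey_pem)
        try:
            # Only an upgrade signed by the manufacturer's key verifies;
            # anyone else's "upgrade" is rejected outright.
            pubkey.verify(signature, upgrade_blob,
                          padding.PKCS1v15(), hashes.SHA1())
        except InvalidSignature:
            return False
        return True

The trust question then reduces to the one you already face: do you believe the manufacturer's key signs only honest firmware, just as you already believe the chip it shipped works as documented.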
- there is nothing the user can do to determine whether the update he gets is the same one other users of the OS get, or one with a keyboard sniffer inserted at the FBI's or NSA's request; the agencies may even hold copies of the software update root keys and insert one themselves.
The problem is the centralised control. The user must at a minimum be able to choose his own software update certification agents. We need transparency, distributed control, and openness, so that people can use third-party auditors they trust, and have reason to trust, to audit and endorse updates.
Well, he can choose who he buys the TPM chip from, I suppose. But upgrades are basically new firmware for the TPM chip, so they will probably always come from the manufacturer. Why exactly is this so much more of a threat than, say, flash BIOS upgrades? The BIOS has a lot more power over your machine than the TPM does.
I don't agree with your characterization that TCPA enforces policies against the owner's interests. He has to voluntarily agree to everything, from turning on TCPA, to booting a TCPA-compliant program, to running an application which some third party will trust, to accepting data from that third party under agreed-upon conditions. If at any step he doesn't feel that what he is doing is in his interests, he can stop and do something else.
He has no choice, due to architectural design decisions probably motivated by the TCPA consortium members' economic interest in retaining monopoly control. The control is ceded at the root of the secure boot-strapping framework.
Everything I have read indicates that he can boot other operating systems. And the spec seems to bear that out. I don't see anywhere in there that it would stop people from booting whatever software they want.
After that, all user choice is gone; all he can do is turn TCPA off. I'm assuming that over time increasing amounts of the OS and applications will simply not function without being in TCPA mode, so turning TCPA off will increasingly become a non-choice.
The only way that TCPA will become as popular as you fear is if it really solves problems for people. Otherwise nobody will pay the extra $25 to put it in their machine.
Specifically, you will be able to "choose" not to use the only tools that can read the formats used by 90+% of the world, tools which are available only for the Windows platform, and only when running in TCPA mode, so that the platform can enforce copyright tracking, IP ownership tracking, or other policy features implemented with the document format.
That's largely the case already. That's why so many people choose Windows, to be compatible with what everyone else is doing. You seem to judge things by the outcome: Windows being more popular and powerful is bad. I judge by process: letting people make their own decisions is good, even if it leads to an outcome I personally don't like.
The danger, once we get to this scenario, is that, as I described above, TCPA itself becomes "a generic extensible policy enforcement architecture which can be configured to robustly enforce policies against the interests of the machine owner." This could be used for all kinds of malware policies which would run in the secure code compartments, for example:
- Clipper / US key-escrow implementation as a TCPA policy module
Where would that fit in the spec? The spec is intentionally not about policy; there is no such thing as a "TCPA policy module". How would TCPA stop people from running their own strong cryptography? You are extrapolating way, way beyond anything that is in this spec. This is just imagination and paranoia. Be concrete. What changes would have to be made to TCPA to get the effects of a mandatory Clipper chip? Would they be made in secret, or would some government have to pass a law before it happened? Would the changes happen in one country or all countries? Paint me a scenario that has some kind of connection to reality. Otherwise this sounds like South Park logic:
1. Get TCPA widely used
2. ...
3. Take over the world
- Big Brother-ish policies for regimes interested in censoring and imposing policies on users, such as China, Iraq, etc.
Again, what specific TCPA features will they exploit to accomplish this?
While it may be technically possible to boot in non-TCPA mode, or to boot an open-source OS without the malware, most users will not have the technical ability to know when they are at risk and when they are not, or what to do to avoid having the government policies enforced upon them.
That's already the case. Face it: if government decided to enforce mandatory key escrow, most users would not object and would be unable to help themselves if they did, whether TCPA existed or not.
All kinds of dubious laws in Western countries could start to see stricter enforcement: erosion of fair-use rights, data retention laws, and so on. I really think full enforcement of current and soon-to-come laws will make things very unpleasant and greatly erode individual rights.
That's possible, and if we lived in a dictatorship I would be more concerned. But if new technologies make laws more enforceable, and people are uncomfortable with the loss of freedom, they will vote to relax restrictions. And as I have pointed out, it is possible that TCPA could allow for other applications that would actually magnify freedom. The thrust of the proposal is to improve the ability for applications to keep their secrets, both locally and on the net.
As far as the concern about changes, I think the smart thing to do is to fight the bad and promote the good. Definitely we should oppose any proposal to make TCPA non-voluntary, to force people to boot a certain OS, to limit what they can do on their computers. But presently none of those features are in TCPA. Rather than saying TCPA is bad because someone could make all these hypothetical changes, it makes more sense to judge TCPA on its own, as a system that emphasizes user choice.
A number of the features which are "not in TCPA" are obvious design motivators, for example online content DRM. Others are obvious things that Microsoft has been aggressively trying to do for years. I'm not sure why you suppose they will stop pursuing tricks such as format-compatibility lock-out, hidden and changing APIs, and every other trick in the book.
I'm just looking at the TCPA spec and trying to evaluate it. I don't see all the bad things that people have said are in there. Instead I see a lot of effort to provide security while still protecting user control and privacy. It's true that some developers may use the new power of TCPA for bad purposes, while other people will use it for good. I say, let us focus our criticism and not waste time trashing a proposal because of things that are not in it.
Similarly, I'm not sure why you presume governments will have no interest in exerting control over the platform. Governments are certainly not technology leaders, but they sure were persistent in trying to pursue the whole crypto-tech export legislation, Clipper/key-escrow, and so on. Also, this platform will be used worldwide. There are more repressive regimes with much more intrusive plans.
I just don't see that TCPA is of that much use to them, given that they already have essentially unlimited power. Ultimately, in the West, governments are the responsibility of the populace. If the Chinese government were to do a TCPA-like system, I doubt that it would look much like this one.
Involuntary TCPA is bad, voluntary is good. So we should not fight TCPA, we should fight proposals to make it involuntary.
Initial claims of "voluntary" are the standard trick for reducing resistance to deployment. Look at the history of various technologies relating to privacy and security, or just politics in general. First it's voluntary, then it becomes voluntary in name only (you can choose not to, but the "choice" is marginalized to the point of becoming almost meaningless), and finally the last step is to not even pretend it's voluntary.
Even if so, that's no excuse for trying to stop people from making their own decisions about what to do with their resources. You shouldn't stop people from using a technology because you are afraid that someone else may come along and make it mandatory.
I think it's interesting to explore what can be fixed about TCPA. Its putative voluntary/mandatory status is one aspect, true, but more interesting ones are ensuring it is open, has distributed, user-configurable control, open access to APIs, and non-discriminatory licensing with no policy strings attached.
Absolutely! I fully agree with these sentiments. I think as you study the spec in detail you will see that it does a very good job by these standards. But ultimately you can't let users take control of their TPM chip and force it to lie to other people, without losing the whole point of the system. Doing that would be like insisting on a PKI where every user could make arbitrary modifications to certs issued by other people. Sure, in some sense it may increase freedom, but it's at the cost of making the whole infrastructure worthless. The thing that makes your certified key useful is the raw fact that you can't change it. By the same token, the thing that makes TCPA useful is the fact that you can't get at sealed data or get the system to lie. By voluntarily giving up this ability locally, you gain tremendous power in interacting with other people.
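To make that last point concrete, here is a toy model in Python of why the chip must not lie. None of this is the spec's API; it just shows the property that matters: data sealed while one software stack was measured cannot be unsealed once a different stack has been measured, and that is exactly the property that evaporates if the owner can set the PCRs to whatever he likes:

    import hashlib

    class ToyTPM:
        # Purely illustrative model of sealed storage, not the spec's interface.
        def __init__(self):
            self.pcr = bytes(20)      # measurement register, starts at zero
            self._vault = {}          # handle -> (data, PCR required to unseal)

        def extend(self, measurement: bytes):
            # The register can only be extended, never rolled back.
            self.pcr = hashlib.sha1(self.pcr + measurement).digest()

        def seal(self, handle: str, data: bytes):
            # Bind the data to the software environment measured so far.
            self._vault[handle] = (data, self.pcr)

        def unseal(self, handle: str) -> bytes:
            data, required_pcr = self._vault[handle]
            if self.pcr != required_pcr:
                raise PermissionError("environment changed since sealing")
            return data

    tpm = ToyTPM()
    tpm.extend(hashlib.sha1(b"trusted OS").digest())
    tpm.seal("key", b"content decryption key")              # sealed under the trusted OS
    tpm.extend(hashlib.sha1(b"patched, lying OS").digest()) # a different stack is measured
    try:
        tpm.unseal("key")
    except PermissionError as e:
        print("unseal refused:", e)

If the owner could force the PCRs back to the "trusted OS" value at will, the unseal would always succeed, the remote party's guarantee would be gone, and the system would be exactly as worthless as the modifiable-certificate PKI in the analogy above.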