Thanks for the clarifications of the differences between TCPA and Palladium. The lack of Palladium docs, and the fact that the TCPA docs describe OS-level features, led to the inference that Palladium, unless otherwise stated, did what TCPA proposed.

Peter Biddle writes:
Adam Back wrote:
I think some of the current disagreements and not very strongly technology grounded responses to anonymous are due to the lack of any concise and informative papers describing TCPA and palladium.
I agree, and from my perspective this is a problem. We have a great deal of information we need to get out there.
* Documentation

I'm feeling frustrated at being unable to properly analyse Palladium due to the lack of documentation. Surely microsoft must have _some_ internal documentation that could be released. Two second-hand blog articles by third parties don't quite cut it! The documents claim privacy advocacy group consultation -- could the information shared with those groups be published? It's quite difficult to reason about the implications and limits of a novel architecture with incomplete information: you need grounded facts down to the technical details to work out the implications from first principles and to compare and evaluate the proponents' claims.

* privacy CA
The suggestions for TCPA responses that William Arbaugh raises seem quite good (c). 1 and 2 are already true for Pd, I believe that 3 is true but I would need to talk with him about what he means here to confirm it, 4 is covered in Eric Norlin's blog (d), and 5 is something we should do.
Insufficient data to comment on the degree of openness provided by palladium for 1 (allow the owner to load a trusted root cert) and 2 (allow the TPM to be completely disabled). For 3, the TCPA version of the "privacy CA" is broken: it is implemented as "trust me" by a server the user does not, and should not have to, trust. Does Palladium do something different?

Later in the same message you said:
The privacy model in Pd is different from TCPA. I could go on for a long time about it, but the key difference is that the public key is only revealed to named third parties which a user trusts. You are right in thinking that you need to trust them, but you don't have to show anyone your key if you don't trust them, so you (the user) are always in control of this.
It sounds as if Palladium suffers from the same broken privacy problem as TCPA then. Saying you have a choice about whether to use a service doesn't alter the fact that you are linkable and identifiable to some extent -- the extent depending on the exact permutation of attribute, identity and endorsement certificates you have used. This vulnerability is unnecessary: you can have things certified without having to trust anyone with your identity or linkability; a toy sketch of one such approach follows.
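To make "certify without trusting anyone" concrete, here is a toy, stdlib-only Python sketch of one well-known technique that could serve this purpose, Chaum-style RSA blind signatures: the CA signs a blinded value, so the credential it issues cannot be linked back to the request. This is illustrative only (tiny textbook-RSA parameters) and is not a description of anything in TCPA or Palladium.

    # Toy Chaum-style RSA blind signature: a CA certifies a value without
    # ever seeing it, so the issued credential is unlinkable to the request.
    # Tiny textbook-RSA parameters for illustration only -- not secure.
    import hashlib, math, secrets

    # CA key pair (toy sizes; a real CA would use >= 2048-bit RSA)
    p, q = 10007, 10009
    n = p * q
    e = 65537
    d = pow(e, -1, (p - 1) * (q - 1))      # CA's private signing exponent

    def h(m: bytes) -> int:                # stand-in for a full-domain hash
        return int.from_bytes(hashlib.sha256(m).digest(), "big") % n

    # User side: blind the message before sending it to the CA
    msg = b"pseudonymous attestation key #1"
    r = 0
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n)
    blinded = (h(msg) * pow(r, e, n)) % n  # the CA only ever sees this

    # CA side: signs the blinded value -- it never learns msg, so it cannot
    # link the credential to any later use of it
    blind_sig = pow(blinded, d, n)

    # User side: unblind to obtain an ordinary RSA signature on h(msg)
    sig = (blind_sig * pow(r, -1, n)) % n
    assert pow(sig, e, n) == h(msg)        # verifies like any RSA signature
    print("credential issued; CA never saw what it certified")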
* architecture functionality

From the limited information available my understanding is that the main features of Palladium, together with a hardware component called the SCP (= ?), can do the following things:

1. no secure-bootstrapping -- unlike TCPA this is not implemented

2. software-attestation -- Palladium uses the SCP to hash (perhaps) the TOR, Trusted Agents, and application software, and then uses TCPA-like endorsed hardware keys in the SCP to remotely attest to these hashes

3. hardware-assisted compartmentalization -- the CPU can run another layer of privileged software with the ability to prevent supervisor mode (and user mode) from reading chosen user-mode process memory areas. The ubermode code is called the TOR. Supervisor mode can install any TOR into the ubermode, but the TOR can be remotely attested(?)

4. sealing -- TCPA-style sealing (a rough sketch of how 2 and 4 might look to software appears below)

On 2, software-attestation: from the information so far, it's unclear what is hashed and what is attested.

On 3, hardware-assisted compartmentalization:

- Presumably Palladium enabled applications would refuse to run unless a Palladium/Microsoft certified TOR is running?
- This limits the meaningfulness of claims of openness in loading your own TOR. More "you can turn x off but nothing will work if you do".
- Or perhaps it is up to the individual Palladium application which TOR it trusts?
- But wouldn't this lead to a break: if a microsoft-written Palladium enabled application would talk to any TOR, the user can load a TOR which is under his own control to by-pass the compartmentalized memory restrictions, regaining root. But if he can do this, he can break Palladium enforced DRM.
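To make the attestation discussion less abstract, here is a rough, purely illustrative Python simulation of what TCPA/Palladium-style measurement and attestation might look like to software. The ToySCP class, its measurement log and its quote() call are all assumptions made for illustration; nothing here is taken from Microsoft documentation, and a real SCP would sign quotes with an endorsed public key rather than an HMAC.

    # Illustrative simulation of remote attestation (assumed semantics, not a
    # real SCP/Palladium API).  The "SCP" is a software stand-in that holds an
    # endorsement key and signs statements about the software it measured.
    import hashlib, hmac, os, secrets

    class ToySCP:
        def __init__(self):
            self._endorsement_key = secrets.token_bytes(32)  # never exported
            self.measurements = []                           # like TCPA PCRs

        def measure(self, code: bytes):
            # record a hash of each loaded component (TOR, agents, apps?)
            self.measurements.append(hashlib.sha256(code).hexdigest())

        def quote(self, challenge: bytes) -> bytes:
            # attest to the measurement log; a real SCP would use a public-key
            # signature with an endorsed/certified key rather than an HMAC
            blob = challenge + b"|" + "|".join(self.measurements).encode()
            return hmac.new(self._endorsement_key, blob, hashlib.sha256).digest()

    scp = ToySCP()
    scp.measure(b"...TOR image bytes...")       # exactly what gets hashed and
    scp.measure(b"...trusted agent bytes...")   # attested is one of the open questions
    nonce = os.urandom(16)
    print("quote sent to the remote verifier:", scp.quote(nonce).hex())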
Also about hardware-assisted compartmentalization, earlier I said:

Optionally the software source can be published but that is not necessary, and if it's not you won't be able to reverse-engineer it as it can be encrypted for the CPU
you responded:
Confusion. The memory isn't encrypted, nor are the apps nor the TOR when they are on the hard drive. Encrypting the apps wouldn't make them more secure, so they aren't encrypted.
If I understand correctly, the architecture makes it possible to write a TOR which supports encrypted applications -- applications encrypted for the machine's key which is stored in the SCP. I thought this scheme would be in the current design because, as far as I can see, it would in fact be necessary for strong copy protection of software (software licensed only for a given machine), which I presumed microsoft has an interest in. It would be a bit like a "sealed" application; a sketch of what such a sealed application might look like follows.
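A minimal sketch of that idea, under my assumptions about how sealing might be exposed (ToySealingSCP, unseal_key and the keystream cipher are all hypothetical stand-ins, not a documented Palladium interface): the application image is encrypted under a key the SCP derives from a per-machine secret, and the SCP only releases that key when the measured TOR matches the policy the vendor sealed to.

    # Sketch of an application "sealed" to a machine key held in the SCP.
    # ToySealingSCP and unseal_key are hypothetical; the XOR keystream is a
    # toy placeholder for real authenticated encryption.
    import hashlib

    class ToySealingSCP:
        def __init__(self, machine_secret: bytes):
            self._machine_secret = machine_secret  # never leaves the chip
            self.current_tor_hash = None           # set by measurement at boot

        def unseal_key(self, policy_tor_hash: str) -> bytes:
            # release the sealing key only if the running TOR matches the policy
            if self.current_tor_hash != policy_tor_hash:
                raise PermissionError("running TOR does not match sealing policy")
            return hashlib.sha256(self._machine_secret + policy_tor_hash.encode()).digest()

    def keystream_xor(key: bytes, data: bytes) -> bytes:  # toy cipher, not secure
        out, ctr = bytearray(), 0
        while len(out) < len(data):
            out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
            ctr += 1
        return bytes(a ^ b for a, b in zip(data, out))

    # Vendor side: encrypt the licensed app for this machine + TOR policy.
    # (In a real design the vendor would encrypt to the SCP's public key.)
    scp = ToySealingSCP(machine_secret=b"per-machine secret set at manufacture")
    tor_hash = hashlib.sha256(b"...certified TOR image...").hexdigest()
    scp.current_tor_hash = tor_hash                    # what actually booted
    app_plaintext = b"...application licensed to this one machine..."
    sealed_app = keystream_xor(scp.unseal_key(tor_hash), app_plaintext)

    # Machine side: only the matching machine running the matching TOR can
    # recover the application image.
    assert keystream_xor(scp.unseal_key(tor_hash), sealed_app) == app_plaintext

On this sketch, recovering the app on a non-matching machine reduces to extracting the machine secret from the SCP, i.e. hardware hacking.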
* on claim "palladium doesn't prevent anything":

I believe that there isn't a single thing you can do with your PC today which is prevented on a Palladium PC. I am open to being challenged on this, so please let me know what you think you won't be able to do on a Pd PC that you can do today.
Well, as you can still run the same software, that is a tautology. The correct question is: can Palladium enabled software prevent you from doing things you could do with non-Palladium enabled software? Palladium is in fact designed explicitly to prevent the user doing things! There are whole classes of things Palladium enabled software, using Trusted Agents, a default TOR and SCP features, can prevent the user from doing:

- it prevents the owner modifying application code running on his machine which uses the remote-attestation functions to talk to remote servers

- it could be used to robustly prevent the user auditing what information flows into and out of his machine (the user can't obtain the keys negotiated by the SCP and a remote site, and can't grab the keys from the application because it's a trusted agent running in a TOR mediated code compartment) -- a rough simulation of this appears at the end of this section

- it could be used to make file formats which are impossible for third parties to be compatible with (Ross Anderson came up with this example in [4])

- it could be used to securely hide undocumented APIs

- it could be used to securely implement software copy-protection by encrypting for endorsed SCP-stored machine keys

- it could be used to prevent reverse-engineering of applications

- it could be used for DRM to enforce play-once, to revoke fair use rights, or to enforce any other arbitrary policies (on formats only available to Palladium enabled applications)

- it could be used to implement key-escrow of SCP-stored keys, to put government or corporate backdoors in sealed data and in comms with remote servers encrypted using SCP-negotiated keys; wasn't there a statement somewhere that CAPI users would more immediately get benefit from Palladium SCP functions? Not sure how the mix of non-palladium CAPI applications and palladium enabled SCP and/or CAPI using applications works out for the key-escrow-implementing TOR scenario.

Now, as I mentioned in an earlier post, you could claim: oh, that's ok because the user has choice, he can still boot with his own custom TOR. In the short term that argument would work. In the long term we run the risk that:

a) many users will be so baffled by the technology that they won't know when they are at risk and when they are not

b) new non-backwards-compatible file formats and feature creep, together with potential "palladium only format" enforcement, could start to make it very inconvenient to use non-standard TORs or non-palladium applications

On top of that there are next-gen issues, and on-going legal issues. For example, what happens when this scenario plays out:

1. recent-release digital content is published early (same time as cinema, for the right price say)

2. hardware hackers do a break-once-run-anywhere by ripping all the content on their hacked machine and start distributing it on kazaa (2.5 peta-bytes of ripped content and growing weekly)

3. RIAA/MPAA goes back and lobbies for further controls, DMCA extensions

4. new legal (DMCA and RIAA/MPAA) pressures, and competition from Sony, are placed on microsoft

It's difficult to see that far out what is going to happen, but blase presumptions that it's unrealistic to expect key-escrow or certified-TOR-only, and assuming that Lucky Green's Document Revocation Lists won't get rolled out with DMCA-style laws against interfering with them -- that ignores a lot of history.

Also mentioned in a previous post: just because it's law doesn't mean it should be enforced. People are currently afforded the ability to ignore the masses of extreme and ridiculous IP law as individuals.
When a large chunk of it gets implemented in DRM and narcware, the once free-wheeling internet information exchanges will become marginalized, or underground-only affairs -- the world will be stifled by people's own computers acting as the policeman inside -- your own computer deputized by the US government, the DMCA et al.
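To illustrate the earlier bullet about SCP-negotiated keys staying out of the owner's reach, here is a rough, purely hypothetical simulation: a trusted agent completes a key exchange inside its compartment and the host OS only ever handles opaque ciphertext. The CompartmentedAgent class, the toy Diffie-Hellman group and the toy cipher are all assumptions for illustration; Python obviously cannot enforce the isolation -- in Palladium that enforcement would come from the TOR and hardware, which is exactly the auditing problem.

    # Hypothetical simulation of a trusted agent whose session key never leaves
    # its compartment; the owner's OS sees only ciphertext on the wire.
    # Toy Diffie-Hellman group and toy "encryption", for illustration only.
    import hashlib, secrets

    P = 2**127 - 1            # toy group modulus (not a real-world choice)
    G = 5

    class CompartmentedAgent:
        # by assumption, this state is unreadable even to ring-0 on the owner's box
        def __init__(self):
            self._x = secrets.randbelow(P - 2) + 1   # private DH exponent
            self.public = pow(G, self._x, P)
            self._session_key = None

        def finish_handshake(self, server_public: int):
            shared = pow(server_public, self._x, P)
            self._session_key = hashlib.sha256(str(shared).encode()).digest()

        def send(self, plaintext: bytes) -> bytes:
            pad = hashlib.sha256(self._session_key + b"msg0").digest()
            return bytes(a ^ b for a, b in zip(plaintext, pad))  # toy cipher

    # Remote server's half of the handshake
    server_x = secrets.randbelow(P - 2) + 1
    server_public = pow(G, server_x, P)

    agent = CompartmentedAgent()
    agent.finish_handshake(server_public)
    wire = agent.send(b"usage report")   # all the machine owner ever gets to see
    print("host-visible traffic:", wire.hex())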
So from what I've read so far, I think people's gut reactions are right -- that it's an aggressive and ambitious power grab by the evil empire -- the 3 cartels / monopolies surrounding PC hardware, operating systems and content distribution. The operating system near-monopoly will doubtless find creative ways to use and expand the increased control: to control application interoperability (with the sealing function), and to control, with hardware assistance, access to undocumented APIs (no more reverse engineering, or using the APIs even if you do / could reverse engineer them).
I know that we aren't using undocumented API's
I think you must have misinterpreted what I said: I wasn't talking about Palladium APIs. I was talking about the fact that microsoft has historically used undocumented APIs as a business tactic, and I presume this is still an ongoing strategy, and the Palladium hardware architecture could be used to build a TOR which forced competitors attempting to discover undocumented APIs, in order to fairly compete, to resort to hardware hacking -- unless and until we get the software-copyright analog of the DMCA's reverse-engineering restrictions, which would make even that illegal.
and that we will strive for the highest degree of interoperability and user control possible. Pd represents massive de-centralization of trust, not the centralization of it.
I'm not sure what you mean by "de-centralized trust". Perhaps that the TOR is publicly audited? Perhaps that there are multiple vendors endorsing SCP implementations? I think Palladium clearly has possibilities to magnify centralized control. It is in microsoft's economic interests, and historically their modus operandi, to aggressively try to create and exploit control points to extract monopolistic rents and/or suppress competition. And of course other companies have used the same strategies, but the current limits placed on such practices by the ability to reverse engineer for compatibility may come to be eroded by TCPA/Palladium.
I think that time is going to have to tell on this one. I know that this isn't true. You think that it is. I doubt that my saying it isn't true is going to change your mind; I know that the technology won't do much of what you are saying it does do, but I also know that some of these things boil down to suspicion around intent, and only time will show if my intent is aligned with my stated goals.
It's nothing personal, it's just that intent is open to evolutionary change, legislative attack, and pressures outside of your personal control or microsoft's -- economic and legal incentives will arise which make the platform deviate from its current stated intent. Intent as stated in 2002 guarantees nothing. Someone sent me this hugely appropriate quote in email relating to this point:

| "You do not examine legislation in the light of the benefits it will
| convey if properly administered, but in the light of the wrongs it
| would do and the harms it would cause if improperly administered."
|     - Lyndon Johnson

except here we are not talking about legislation directly, but about a hardware platform with the tools to create different control points, and about the legal and business pressures that act upon the use and misuse of that platform and of those created control points.
I think that Pd represents an enhancement to personal freedoms and user control over their machines. I hope that over time I will be able to explain Pd sufficiently well so that you have all the facts you need to understand how and why I say this.
I can only think that your definition of user control is probably in terms of abstract "security" rather than a distrusting viewpoint that wants to be able to audit and modify application behavior. I presume you mean "freedom" in the sense of freedom from the ills that it is claimed Palladium enabled applications could remedy. I think of freedom as the ability for self-determination: to completely control and audit all aspects of the software running on my system. Without these freedoms, applications and vendors can conspire against the machine owner. I suspect the majority of people feel similarly. With current information it seems that the Palladium platform is damaging to freedom and individual liberty, and a future risk to free society. I am of course completely open to being proven wrong, and look forward to seeing more detailed Palladium specs so that this can be tested, and to seeing the community given the opportunity to point out potentially more open and less control-point-prone modifications.

Adam

[4] "Security in Open versus Closed Systems (The Dance of Boltzmann, Coase and Moore)", Ross Anderson (Sections 4 and 5 only; the rest is unrelated), http://www.cl.cam.ac.uk/ftp/users/rja14/toulouse.pdf