Here are some alternative applications for TCPA/Palladium technology which could actually promote privacy and freedom. A few caveats, though: they do depend on a somewhat idealized view of the architecture. It may be that real hardware/software implementations are not sufficiently secure for some of these purposes, but as systems become better integrated and more technologically sound, this objection may go away. And these applications do assume that the architecture is implemented without secret backdoors or other intentional flaws, which might be guaranteed through an open design process and manufacturing inspections.

Despite these limitations, hopefully these ideas will show that TCPA and Palladium have many more uses than the heavy-handed and control-oriented ones which have been discussed so far.

To recap, there are basically two technologies involved. One is "secure attestation". This allows a machine to learn, securely, a hash of the software running on a remote machine. It is used in these examples to know that a trusted client program is running on the remote machine. The other is "secure storage". This allows programs to encrypt data in such a way that no other program can decrypt it.

In addition, we assume that programs are able to run "unmolested"; that is, that other software, and even the user, cannot peek into the program's memory to manipulate it or learn its secrets. Palladium has a feature called "trusted space" which is supposed to be special memory that is immune from being compromised. We also assume that all data sent between computers is encrypted using something like SSL, with the secret keys held securely by the client software (hence unavailable to anyone else, including the users).

The effect of these technologies is that a number of computers across the net, all running the same client software, can form their own closed virtual world. They can exchange and store data of any form, and no one can get access to it unless the client software permits it. That means that the user, eavesdroppers, and authorities are unable to learn the secrets protected by software which uses these TCPA features. (Note: in the sequel I will just write TCPA when I mean TCPA/Palladium.)

Now for a simple example of what can be done: a distributed poker game. Of course there are a number of crypto protocols for playing poker on the net, but they are quite complicated. Even though they've been around for almost 20 years, I've never seen game software which uses them. With TCPA we can do it trivially.

Each person runs the same client software, a fact which can be verified using secure attestation. The dealer's software randomizes a deck and passes out the cards to each player. The cards are just strings like "ace of spades", or perhaps simple numerical equivalents - nothing fancy. Of course, the dealer's software learns in this way what cards every player has. But the dealer himself (i.e. the human player) doesn't see any of that; he sees only his own hand. The software keeps the information secret from the user.

As each person makes his play, his software sends simple messages telling what cards he is exposing or discarding, and so on. At the end, each person's software sends messages showing what his hand is, according to the rules of poker.

This is a trivial program. You could do it in one or two pages of code. And yet, given the TCPA assumptions, it is just as secure as a complex, cryptographically protected version that would take ten times as much code.
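[To make the triviality concrete, here is a minimal sketch of the dealer's side in Python. attest_peer is a stand-in for the secure attestation check; it and the other names are illustrative assumptions, not any real TCPA API, and the shuffled deck is presumed to live only in the client's protected "trusted space".]

import random

SUITS = ["spades", "hearts", "diamonds", "clubs"]
RANKS = ["ace", "2", "3", "4", "5", "6", "7", "8", "9", "10",
         "jack", "queen", "king"]

def attest_peer(peer):
    # Placeholder: a real client would compare the peer's attested
    # software hash against the hash of this same trusted client.
    return True

def deal(players, hand_size=5):
    # The shuffled deck and all hands exist only inside protected
    # memory; each human player is shown only his own hand.
    deck = [f"{rank} of {suit}" for suit in SUITS for rank in RANKS]
    random.shuffle(deck)
    return {p: [deck.pop() for _ in range(hand_size)] for p in players}

players = ["alice", "bob", "carol"]
if all(attest_peer(p) for p in players):   # refuse to play with rogue clients
    hands = deal(players)
    print(hands["alice"])                  # only alice's client shows this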
Of course, without TCPA such a program would never work. Someone would write a cheating client which would tell them what everyone else's cards were when they were the dealer. There would be no way for people to trust each other not to do this. But TCPA lets people prove to each other that they are running the legitimate client. So this is a simple example of how the secure attestation features of TCPA/Palladium can allow a kind of software which would never work today: software where people trust each other.

Let's look at another example, a P2P system with anonymity. Again, there are many cryptographic systems in the literature for anonymous communication, but they tend to be complicated and inefficient. With TCPA we only need to set up a simple flooding broadcast network. Let each peer connect to a few other peers. To prevent traffic analysis, keep each node-to-node link at a constant traffic level using dummy padding. (Recall that each link is encrypted using SSL.)

When someone sends data, it gets sent everywhere via a simple routing strategy. The software then makes the received message available to the local user, if he is the recipient. Possibly the source of the message is carried along with it, to help with routing; but this information is never leaked outside the secure communications part of the software, and never shown to any user.

That's all there is to it. Just send messages with flood broadcasts, but keep the source locked inside the secure part. Messages can be sent and received, and neither participants nor outsiders can tell what the source of any message is.

As with the earlier example, such a system would never work without TCPA. Rogue software could easily determine which direction messages were coming from, and the anonymity provided would be extremely limited at best. But by eliminating rogues using secure attestation, and keeping the sensitive data safe from molestation, we can achieve with a very simple system what otherwise takes tremendous complexity.

Here's one more example, which I think is quite amazing: untraceable digital cash with full anonymity, without blinding or even any cryptography at all! (Excepting of course the standard TCPA pieces like SSL and secure storage and attestation.)

The idea is, again, trivial. Making a withdrawal, the client sends the user's password and account ID to the bank (this information is kept in secure storage). The bank approves, and the client increments the local "wallet" by that amount (also kept in secure storage). To make a payment, use the anonymous network for transport, and just send a message telling how much is being paid! The recipient increments his wallet by that amount and the sender decrements his. Deposit works analogously to withdrawal.

Again, that's all there is to it. Nothing could be simpler. Yet it provides for secure (assuming TCPA is secure), anonymous, untraceable payments. The secure attestation is crucial, of course, to make sure that people are running legitimate clients; otherwise cheating would be rampant. And the secure storage is equally crucial; otherwise any software could increment the sum stored in the wallet, and everyone would accept and believe those payments.

I understand, of course, that this specific example is not very practical unless we have an extremely secure version of TCPA. If anyone who can break the security can give themselves unlimited money, the security has to be essentially perfect.
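[For concreteness, here is a deliberately naive Python sketch of this wallet logic. secure_load and secure_store stand in for TCPA secure storage, the bank's approval and the anonymous transport are assumed away, and all of the names are illustrative assumptions rather than any real API.]

_STORE = {}  # stands in for TCPA secure storage; opaque to the user

def secure_load(name, default=0):
    # Placeholder for reading a sealed value from secure storage.
    return _STORE.get(name, default)

def secure_store(name, value):
    # Placeholder for writing a sealed value to secure storage.
    _STORE[name] = value

def withdraw(amount):
    # The real client would send the user's password and account ID to
    # the bank over SSL; the bank's approval is assumed here.
    secure_store("wallet", secure_load("wallet") + amount)

def pay(amount):
    # Returns the payment message that would be flooded over the
    # anonymous network: no coins, no blinding, just an amount.
    balance = secure_load("wallet")
    if balance < amount:
        raise ValueError("insufficient funds")
    secure_store("wallet", balance - amount)
    return {"paid": amount}

def receive(msg):
    secure_store("wallet", secure_load("wallet") + msg["paid"])

withdraw(100)
receive(pay(25))              # payer and payee share one store in this demo,
print(secure_load("wallet"))  # so the balance ends back at 100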
So this is more of a proof of concept than a realistic proposal. But eventually, with TCPA technology integrated into a tamper-proof, nanotech CPU with molecular sensors and built-in self-destructs, possibly this might be good enough. Or you could augment this solution with some crypto, similar to the "wallets with observers" proposals from Chaum and from Brands. Note that we can make the client open source, allowing anyone to verify that it has no back doors or potential for cheating, which lets all users trust that it is not going to hurt them (a problem that takes great complexity to solve with the observer protocols). But still, the bare simplicity of the system should make clear how powerful something like TCPA can be for this kind of application.

I could go on and on, but the basic idea is always the same, and hopefully once people see the pattern they will come up with their own ideas. Being able to write software that trusts other computers allows for an entirely new approach to security software design. TCPA can enhance freedom and privacy by closing off possibilities for surveillance and interference. The same technology that protects Sony's music content in a DRM application can protect the data exchanged by a P2P system. As Seth Schoen of the EFF paraphrases Microsoft, "So the protection of privacy was the same technical problem as the protection of copyright, because in each case bits owned by one party were being entrusted to another party and there was an attempt to enforce a policy." (http://vitanuova.loyalty.org/2002-07-05.html, 3rd bullet point)

In fact, TCPA and Palladium have tremendous potential for enhancing and protecting privacy, if people will just look at them with an open mind.
On Sat, 3 Aug 2002, AARG!Anonymous wrote: < ... />
Now for a simple example of what can be done: a distributed poker game. Of course there are a number of crypto protocols for playing poker on the net, but they are quite complicated. Even though they've been around for almost 20 years, I've never seen game software which uses them. With TCPA we can do it trivially.
< ... /> No. Have you included the cost of giving every computer on Earth to the Englobulators?

If you wish, we can write an implementation of the wonderful protocols for distributed safer card drawing, and we can play our games of poker. And we may run our poker room on the hardware and software we have today; no need for DRM. Indeed, today millions use today's untrammeled hardware and, this is incredible, Microsoft OSes to conduct their personal banking. If "the market" considers that present systems suffice for this, well, I do not think that we need surrender our computers to the Englobulators to save three man-months of programmer time.

ad next moves in the eristic tree:

You: Marginals vs. total time-space integrated costs/benefits!

I: Happy to demonstrate estimates of totals come out for my side.

oo--JS.
AARG!Anonymous writes:
I could go on and on, but the basic idea is always the same, and hopefully once people see the pattern they will come up with their own ideas. Being able to write software that trusts other computers allows for an entirely new approach to security software design. TCPA can enhance freedom and privacy by closing off possibilities for surveillance and interference. The same technology that protects Sony's music content in a DRM application can protect the data exchanged by a P2P system. As Seth Schoen of the EFF paraphrases Microsoft, "So the protection of privacy was the same technical problem as the protection of copyright, because in each case bits owned by one party were being entrusted to another party and there was an attempt to enforce a policy." (http://vitanuova.loyalty.org/2002-07-05.html, 3rd bullet point)
I would just like to point out that the view that "the protection of privacy [is] the same technical problem as the protection of copyright" is Microsoft's and not mine. I don't agree that these problems are the same.

An old WinHEC presentation by Microsoft's Peter Biddle says that computer security, copyright enforcement, and privacy are the same problem. I've argued with Peter about that claim before, and I'm going to keep arguing about it.

For one thing, facts are not copyrightable -- copyright law in the U.S. has an "idea/expression dichotomy", which, while it might be ultimately incoherent, suggests that copyright is not violated when factual information is reproduced or retransmitted without permission. So, for example, giving a detailed summary of the plot of a novel or a movie -- even revealing what happens in the ending! -- is not an infringement of copyright. It's also not something a DRM system can control.

But privacy is frequently violated when "mere" facts are redistributed. It often doesn't matter that no bits, bytes, words, or sentences were copied verbatim. In some cases (sexual orientation, medical history, criminal history, religious or political belief, substance abuse), the actual informational content of a "privacy-sensitive" assertion is extremely tiny, and would probably not be enough to be "copyrightable subject matter". Sentences like "X is gay", "Y has had an abortion", "Z has AIDS", etc., are not even copyrightable, but their dissemination in certain contexts will have tremendous privacy implications.

"Technical enforcement of policies for the use of a file within a computer system" is a pretty poor proxy for privacy.

This is not to say that trusted computing systems don't have interesting advantages (and disadvantages) for privacy.

--
Seth David Schoen <schoen@loyalty.org> | Reading is a right, not a feature!
http://www.loyalty.org/~schoen/        |                 -- Kathryn Myronuk
http://vitanuova.loyalty.org/          |
On Tue, 6 Aug 2002, Seth David Schoen wrote: < ... />
This is not to say that trusted computing systems don't have interesting advantages (and disadvantages) for privacy.
-- Seth David Schoen <schoen@loyalty.org> | Reading is a right, not a feature!
I think that giving root of your machine to an entity you do not trust is not reasonable, even if it is claimed that the control so given is a partial and compartmentalized control. It is even more unreasonable in case the entity has repeatedly declared:

1. their deep and abiding distrust of you

2. their minimal demand to have root on all the world's general purpose computers forever

3. their intent to obtain 2 by government mandate.

If we wish to improve security and privacy, then let us improve ssh and GNUPG so that they can actually be installed and used by more people. It is better to think about and to work on our own systems than to waste time and money and effort on discovering the endless "flaws" and "inadequacies" and "dangers" and the endless amusing Panglossian "advantages" of TCPA/Palladium.

TCPA/Palladium has several faces, but one of the most important faces is "deception, division, and diversion". It is not a good idea to work on improving the designs of our openly declared enemies. Nor is it good to spend much time examining tiny irrelevant details of TCPA/Palladium. Every such discussion I have seen starts by making the crudest errors in formal logic. Here is one important such error: "See this tiny part of the system does not, in and of itself in isolation, 'give root' to the Englobulators, hence TCPA/Palladium is partway OK.".

oo--JS.
-- On 6 Aug 2002 at 16:12, Jay Sulzberger wrote:
If we wish to improve security and privacy, then let us improve ssh and GNUPG so that they can actually be installed and used by more people. It is better to think about and to work on our own systems than to waste time and money and effort on discovering the endless "flaws" and "inadequacies" and "dangers" and the endless amusing Panglossian "advantages" of TCPA/Palladium.
Not everyone is equally evil, and even when they are equally evil, not everyone is as immediate a threat. Roosevelt allied himself with Stalin; Reagan found himself fighting the same enemy puppet regime as Pol Pot was fighting.

Hollywood is not TCPA, though there seem to be disturbing connections, and Palladium is not TCPA either. Hollywood wants to turn computer users by law into passive consumers of content generated by large corporations. Microsoft, despite all of its sins, has very different and less evil objectives.

TCPA looks to me suspiciously like a stalking horse for the Hollywood program. As yet, I do not know what the case is with Palladium.

    --digsig
         James A. Donald
     6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
     Zna/iIvm7+exkPJmH+Ywo/J1MS/WQtJX45T0vGSI
     2doVQThla81OopVfWO1DW+1Ps9ao+2zjzU2p6mQ7I
On Tue, 6 Aug 2002, Jay Sulzberger wrote:
"See this tiny part of the system does not, in and of itself in isolation, 'give root' to the Englobulators, hence TCPA/Palladium is partway OK.".
It is important for us to divide and conquer the "Englobulators". Clearly there is a division between TCPA and Palladium already, and we should use that division to ensure the failure of englobulation. I'm not so sure it will be easy, but it seems doable.

Patience, persistence, truth,
Dr. mike
Neither of us really had the time to clearly articulate things last time, so I am glad you brought it up. My perspective is primarily an architectural one, and it boils down to this: platform security shouldn't choose favorites.

I don't want there to be any second-class data citizens, as the determination of who is a "first class" citizen and who isn't seems arbitrary and unfair, especially if you happen to be second class. The technology should be egalitarian and should be capable of treating all data the same. If a user wants data to be secure, or an application wants its execution to be secure, they should be able to ask for and get the highest level of security that the platform can offer.

You point out that legal and societal policy likes to lump some kinds of data together and then protect those lumps in certain ways from certain things. Policy may also leave the same data open for some kinds of usage and/or exploitation in some circumstances. This is a fine and wonderful thing from a policy perspective. This kind of rich policy is only possible on a PC if that machine is capable of exerting the highest degrees of security for every object seeking it. You can't water the security up; you can only water it down.

I don't think that the platform security functions should have to decide that some data looks like copyrighted information and so must be treated in one way, while other data looks like national secrets and so should be treated differently. The platform shouldn't be able to make that choice on its own. The platform needs someone else (e.g. the user) to tell it what policies to enforce. (Of course, the policy engine required to automatically enforce policy judgement on arbitrary data would be impossible to manage. It would vary from country to country, and most importantly (from my architectural perspective) it's impossible to implement because the only SW with access to all data must be explicitly non-judgemental about what good or bad policy is.)

More in-line:

----- Original Message -----
From: "Seth David Schoen" <schoen@loyalty.org>
To: <cypherpunks@lne.com>; <cryptography@wasabisystems.com>; <mail2news@anon.lcs.mit.edu>
Sent: Tuesday, August 06, 2002 12:11 PM
Subject: Re: Privacy-enhancing uses for TCPA
AARG!Anonymous writes:
I could go on and on, but the basic idea is always the same, and hopefully once people see the pattern they will come up with their own ideas. Being able to write software that trusts other computers allows for an entirely new approach to security software design. TCPA can enhance freedom and privacy by closing off possibilities for surveillance and interference. The same technology that protects Sony's music content in a DRM application can protect the data exchanged by a P2P system. As Seth Schoen of the EFF paraphrases Microsoft, "So the protection of privacy was the same technical problem as the protection of copyright, because in each case bits owned by one party were being entrusted to another party and there was an attempt to enforce a policy." (http://vitanuova.loyalty.org/2002-07-05.html, 3rd bullet point)
I would just like to point out that the view that "the protection of privacy [is] the same technical problem as the protection of copyright" is Microsoft's and not mine. I don't agree that these problems are the same.
You say above that you don't agree that the problems are the same, but you don't specify in what domain - policy, technical, legal, all of the above, something else? The examples you give below are not technical examples - I think that they are policy examples. What about from the technical perspective?
An old WinHEC presentation by Microsoft's Peter Biddle says that computer security, copyright enforcement, and privacy are the same problem. I've argued with Peter about that claim before, and I'm going to keep arguing about it.
The term I use is "a blob is a blob"...
For one thing, facts are not copyrightable -- copyright law in the U.S. has an "idea/expression dichotomy", which, while it might be ultimately incoherent, suggests that copyright is not violated when factual information is reproduced or retransmitted without permission. So, for example, giving a detailed summary of the plot of a novel or a movie -- even revealing what happens in the ending! -- is not an infringement of copyright. It's also not something a DRM system can control.
Isn't copyright a legal protection, and not a technical one? The efficacy of copyright has certainly benefited greatly from the limitations of the mediums it generally protects (e.g. books are hard and expensive to copy; ideas, quotes, reviews and satires are allowed and also (not coincidentally) don't suffer from the physical limitations imposed by the medium), and so those limitations can look like technical protections, but really they aren't.

I agree that copyrighted material is subject to different policy from other kinds of information. What I disagree on is that the TOR should arbitrarily enforce a different policy for it because it thinks that it is copyrighted. The platform should enforce policy based on an external (user, application, service, whatever) policy assertion around a given piece of data.

Note that data can enter into Pd completely encrypted and unable to be viewed by anything but a user-written app and the TOR. At that point the policy is that the app, and thus the user, decides what can be done with the data. The TOR simply enforces the protections. No one but the app and the TOR can see the data to attempt to exert policy.
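[A minimal sketch of this "external policy assertion" model, with hypothetical names (SealedBlob, Policy, TOR.unseal); it is meant only to show the shape of the idea: the TOR enforces a caller-supplied policy without ever judging what the data is.]

from dataclasses import dataclass

@dataclass
class Policy:
    allowed_apps: set                  # attested app hashes permitted to unseal
    user_consent_required: bool = True

@dataclass
class SealedBlob:
    data: bytes                        # opaque to the TOR: "a blob is a blob"
    policy: Policy

class TOR:
    def unseal(self, blob, app_hash, user_consented):
        # The TOR never inspects blob.data; it only enforces the
        # externally asserted policy attached to it.
        if app_hash not in blob.policy.allowed_apps:
            raise PermissionError("app not permitted by policy")
        if blob.policy.user_consent_required and not user_consented:
            raise PermissionError("user consent required")
        return blob.data

blob = SealedBlob(b"secret", Policy(allowed_apps={"app-hash-1"}))
print(TOR().unseal(blob, "app-hash-1", user_consented=True))  # b'secret'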
But privacy is frequently violated when "mere" facts are redistributed.
I swear that *I* was arguing this very point last time, and you were saying something else! Hmmm. Maybe we agree or something.
It often doesn't matter that no bits, bytes, words, or sentences were copied verbatim. In some cases (sexual orientation, medical history, criminal history, religious or political belief, substance abuse), the actual informational content of a "privacy-sensitive" assertion is extremely tiny, and would probably not be enough to be "copyrightable subject matter". Sentences like "X is gay", "Y has had an abortion", "Z has AIDS", etc., are not even copyrightable, but their dissemination in certain contexts will have tremendous privacy implications.
The platform should treat this kind of data with the highest degree of security and integrity available, and the level of security available should support local policy like "no SW can have access to this data without my explicit consent". The fact that the data is small makes it particularly sensitive, as it is so highly portable, so there must be law to allow the legal assertion of policy independently from the technical exertion of policy, and there has to be some rationalization between the two approaches. While bandwidth limits the re-distribution of many kinds of content, it doesn't with this kind of info. (And of course bandwidth limitations aren't really technical protections and are subject to the vagaries of increased bandwidth. Not a good security model.)

Not only should the platform be able to exert the highest degrees of control over this information on behalf of a user, it should also allow the user to make smart choices about who gets the info and what the policy is around the usage of this info remotely. This must be in a context where lying is both extremely difficult and onerous.

Common sense dictates that the unlawful usage of some kinds of data is far more damaging (to society, individuals, groups, companies) than other kinds of data, and that some kinds of unlawful uses are worse than others, but common sense is not something that can be exercised by a computer program. This will need to be figured out by society, and then the policy can be exerted accordingly.
"Technical enforcement of policies for the use of a file within a computer system" is a pretty poor proxy for privacy.
This is not to say that trusted computing systems don't have interesting advantages (and disadvantages) for privacy.
I am not sure I understand the dichotomy; technical enforcement of user-defined policies around access to, and usage of, their local data would seem to be the right place to start in securing privacy. (Some annoying cliche about cleaning your own room first is nipping at the dark recesses of my brain; I can't seem to place it.) When you have control over privacy-sensitive information on your own machine, you should be able to use similar mechanisms to achieve similar protections on other machines which are capable of exerting the same policy. You should also have an infrastructure which makes that policy portable and renewable. This is, of course, another technical/architectural argument.

The actual policy around data like "X is gay" must come from society, but control of the information itself originates with the user X, and thus the control on the data that represents this information must start in user X's platform. The platform should be capable of exerting the entire spectrum of possible controls.

Peter
++++
--
Seth David Schoen <schoen@loyalty.org> | Reading is a right, not a feature!
http://www.loyalty.org/~schoen/        |                 -- Kathryn Myronuk
http://vitanuova.loyalty.org/          |
On Tue, Aug 06, 2002 at 07:08:25PM -0700, Peter N. Biddle wrote:

| Neither of us really had the time to clearly articulate things last
| time, so I am glad you brought it up. My perspective is primarily an
| architectural one, and it boils down to this: platform security
| shouldn't choose favorites.

I think most of us will agree to that. But you are choosing favorites: you're asserting certain ideas about society and how it ought to be structured, and asserting that a system should do certain things. Some de-contextualized quotes are below.

| enforce policy judgement on arbitrary data would be impossible to
| manage. It would vary from country to country, and most importantly
| (from my

Why do countries get to impose their laws on my data? Which countries get to do so? And are you still in France? ;)

| Not only should the platform be able to exert the highest degrees of
| control over this information on behalf of a user, it should also
| allow the user to make smart choices about who gets the info and what
| the policy is around the usage of this info remotely. This must be in
| a context where lying is both extremely difficult and onerous.

Why? Lying is a really good way to protect your privacy.

| Common sense dictates that the unlawful usage of some kinds of data is
| far more damaging (to society, individuals, groups, companies) than
| other kinds of data, and that some kinds of unlawful uses are worse
| than others, but common sense is not something that can be exercised
| by a computer program. This will need to be figured out by society,
| and then the policy can be exerted accordingly.

Again, we disagree.

| I am not sure I understand the dichotomy; technical enforcement of
| user-defined policies around access to, and usage of, their local data
| would seem to be the right place to start in securing privacy. (Some
| annoying cliche about cleaning your own room first is nipping at the
| dark recesses of my brain; I can't seem to place it.) When you have
| control over privacy-sensitive information on your own machine, you
| should be able to use similar mechanisms to achieve similar
| protections on other machines which are capable of exerting the same
| policy. You should also have an infrastructure which makes that policy
| portable and renewable.

This doesn't work, since, as Ross Anderson points out, the knowledge that you're HIV positive is owned by lots of different people at different times, and once one of them reads it on screen, they can reproduce it, irrevocably, outside the policy which you've tried to impose.

So, you've made some choices about how the system can be used; you've chosen ways to protect privacy which reflect your view of how privacy should be protected. Similarly copyright. That's your right; however, I and many others are deeply concerned that those choices are going to get embedded and imposed on the rest of us.

Hey, you know what? They may even be good choices. But I don't care. Fundamentally, they restrict my freedom to do dumb things, to be unreasonable, to dissent. And that worries the hell out of me.

Adam

--
"It is seldom that liberty of any kind is lost all at once." -Hume
At 1:55 PM -0400 8/3/02, AARG!Anonymous wrote:
Here's one more example, which I think is quite amazing: untraceable digital cash with full anonymity, without blinding or even any cryptography at all! (Excepting of course the standard TCPA pieces like SSL and secure storage and attestation.)
The idea is, again, trivial. Making a withdrawal, the client sends the user's password and account ID to the bank (this information is kept in secure storage). The bank approves, and the client increments the local "wallet" by that amount (also kept in secure storage). To make a payment, use the anonymous network for transport, and just send a message telling how much is being paid! The recipient increments his wallet by that amount and the sender decrements his. Deposit works analogously to withdrawal.
Note that if the user can modify the wallet, a "fat, dumb, and happy" implementation may be vulnerable to the following attacks.

Attack 1:
(1) Withdraw $0.01 from the bank.
(2) Change a random bit in the encrypted wallet. (Picking the bit to change will be easier if the storage format is known.)
(3) Fire up the application and see how much money you have.

Attack 2:
(1) Withdraw many $$$ from the bank.
(2) Copy the wallet.
(3) Deposit the $$$ back in the bank.
(4) Restore the wallet using the copy.

While there are certainly ways to notice modifications to the wallet, and to prevent the replay attack, they result in considerable additional complexity for what was a very simple implementation. (A sketch of one such defense follows below.)

Cheers - Bill

-------------------------------------------------------------------------
Bill Frantz           | The principal effect of | Periwinkle -- Consulting
(408)356-8506         | DMCA/CBDTPA is to       | 16345 Englewood Ave.
frantz@pwpconsult.com | prevent fair use.       | Los Gatos, CA 95032, USA
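[For illustration, here is a sketch of the kind of extra machinery Frantz's attacks force on the design, under the thread's idealized TCPA assumptions: an integrity tag over the wallet state catches the bit-flip of attack 1, and binding the state to a monotonic counter that cannot be rolled back catches the copy-and-restore replay of attack 2. SEAL_KEY and the hardware counter are stand-ins for secrets and state the user cannot reach; all of the names are illustrative assumptions.]

import hashlib
import hmac
import json

SEAL_KEY = b"key-held-only-inside-secure-storage"  # stand-in for a sealed key

def seal_wallet(balance, counter):
    # MAC over the balance *and* a counter: a flipped bit fails
    # verification (attack 1), and a restored old copy carries a
    # stale counter (attack 2).
    body = json.dumps({"balance": balance, "counter": counter})
    tag = hmac.new(SEAL_KEY, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}

def open_wallet(sealed, hw_counter):
    tag = hmac.new(SEAL_KEY, sealed["body"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, sealed["tag"]):
        raise ValueError("wallet corrupted or tampered with")   # attack 1
    state = json.loads(sealed["body"])
    if state["counter"] != hw_counter:
        raise ValueError("stale wallet copy replayed")          # attack 2
    return state["balance"]

sealed = seal_wallet(balance=100, counter=7)
print(open_wallet(sealed, hw_counter=7))  # 100; raises on tampering or replay

[Even this sketch leaves real problems open (key storage, counter management, atomic updates), which is exactly Frantz's point about the complexity such defenses add.]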
participants (8)

- AARG! Anonymous
- Adam Shostack
- Bill Frantz
- James A. Donald
- Jay Sulzberger
- Mike Rosing
- Peter N. Biddle
- Seth David Schoen