On Tue, Aug 06, 2002 at 07:08:25PM -0700, Peter N. Biddle wrote:
| Neither of us really had the time to clearly articulate things last
| time, so I am glad you brought it up. My perspective is primarily an
| architectural one, and it boils down to this:
|
| Platform security shouldn't choose favorites.

I think most of us will agree to that. But you are choosing
favorites: You're asserting certain ideas about society and how it
ought to be structured, and asserting that a system should do certain
things. Some de-contextualized quotes are below.

| enforce policy judgement on arbitrary data would be impossible to
| manage. It would vary from country to country, and most importantly
| (from my

Why do countries get to impose their laws on my data? Which countries
get to do so? And are you still in France? ;)

| Not only should the platform be able to exert the highest degrees of
| control over this information on behalf of a user, it should also
| allow the user to make smart choices about who gets the info and what
| the policy is around the usage of this info remotely. This must be in
| a context where lying is both extremely difficult and onerous.

Why? Lying is a really good way to protect your privacy.

| Common sense dictates that the unlawful usage of some kinds of data
| is far more damaging (to society, individuals, groups, companies)
| than other kinds of data, and that some kinds of unlawful uses are
| worse than others, but common sense is not something that can be
| exercised by a computer program. This will need to be figured out by
| society and then the policy can be exerted accordingly.

Again, we disagree.

| I am not sure I understand the dichotomy; technical enforcement of
| user-defined policies around access to, and usage of, their local
| data would seem to be the right place to start in securing privacy.
| (Some annoying cliche about cleaning your own room first is nipping
| at the dark recesses of my brain; I can't seem to place it.)
| When you have control over privacy-sensitive information on your own
| machine, you should be able to use similar mechanisms to achieve
| similar protections on other machines which are capable of exerting
| the same policy. You should also have an infrastructure which makes
| that policy portable and renewable.

This doesn't work, since, as Ross Anderson points out, the knowledge
that you're HIV positive is owned by lots of different people at
different times, and once one of them reads it on screen, they can
reproduce it, irrevocably, outside the policy which you've tried to
impose.

So, you've made some choices about how the system can be used; you've
chosen ways to protect privacy which reflect your view of how privacy
should be protected. Similarly with copyright. That's your right;
however, I, and many others, are deeply concerned that those choices
are going to get embedded and imposed on the rest of us. Hey, you
know what? They may even be good choices. But I don't care.
Fundamentally, they restrict my freedom to do dumb things, to be
unreasonable, to dissent. And that worries the hell out of me.

Adam

-- 
"It is seldom that liberty of any kind is lost all at once." -Hume