Lucky Green writes regarding Ross Anderson's paper at: http://www.ftp.cl.cam.ac.uk/ftp/users/rja14/toulouse.pdf
I must confess that after reading the paper I am quite relieved to finally have solid confirmation that at least one person outside the authors and proponents of the bill has realized that the Hollings bill, while failing to mention TCPA anywhere in its text, was written with the specific technology provided by the TCPA in mind, for the purpose of mandating the inclusion of this technology in all future general-purpose computing platforms, now that the technology has been tested, is ready to ship, and the BIOS vendors are on side.
It's an interesting claim, but there is only one small problem. Neither Ross Anderson nor Lucky Green offers any evidence that the TCPA (http://www.trustedcomputing.org) is being designed for the support of digital rights management (DRM) applications. In fact, if you look at the documents on the TCPA web site you see much discussion of applications such as platform-based ecommerce (so that even if a user's keys get stolen they can't be used on another PC), securing corporate networks (assuring that each workstation is running an IT-approved configuration), detecting viruses, and enhancing the security of VPNs. DRM is not mentioned.

Is the claim by Ross and Lucky that the TCPA is a fraud, secretly designed for the purpose of supporting DRM while using the applications above merely as a cover to hide its true purposes? If so, shouldn't we expect to see the media content companies as supporters of this effort? But the membership list at http://www.trustedcomputing.org/tcpaasp4/members.asp shows none of the usual suspects. Disney's not there. Sony's not there. No Viacom, no AOL/Time/Warner, no News Corp. The members are all technology companies, including crypto companies like RSA, Verisign and nCipher.

Contrast this, for example, with the Broadcast Protection Discussion Group, whose ongoing efforts are being monitored by the EFF at http://www.eff.org/IP/Video/HDTV/. There you do find the big media companies. That effort is plainly aimed at protecting information and supporting DRM, so it makes sense that the companies most interested in those goals are involved. But with the TCPA, the players are completely different. And unlike with the BPDG, the rationale being offered is not based on DRM but on improving the trustworthiness of software for many applications.

Ross and Lucky should justify their claims to the community in general and to the members of the TCPA in particular. If you're going to make accusations, you are obliged to offer evidence.
Is the TCPA really, as they claim, a secretive effort to get DRM hardware into consumer PCs? Or is it, as the documents on the web site claim, a general effort to improve the security in systems and to provide new capabilities for improving the trustworthiness of computing platforms?

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo@wasabisystems.com
Anonymous writes:
Lucky Green writes regarding Ross Anderson's paper at: Ross and Lucky should justify their claims to the community in general and to the members of the TCPA in particular. If you're going to make accusations, you are obliged to offer evidence. Is the TCPA really, as they claim, a secretive effort to get DRM hardware into consumer PCs? Or is it, as the documents on the web site claim, a general effort to improve the security in systems and to provide new capabilities for improving the trustworthiness of computing platforms?
Anonymous raises a valid question. To hand Anonymous additional rope, I will even assure the reader that when questioned directly, the members of the TCPA will insist that their efforts in the context of TCPA are concerned with increasing platform security in general and are not targeted at providing a DRM solution.

Unfortunately, and I apologize for having to disappoint the reader, I do not feel at liberty to provide the proof Anonymous is requesting myself, though perhaps Ross might. (I have no first-hand knowledge of what Ross may or may not be able to provide.)

I however encourage readers familiar with the state of the art in PC platform security to read the TCPA specifications, read the TCPA's membership list, read the Hollings bill, and then ask themselves if they are aware of, or can locate somebody who is aware of, any other technical solution that enjoys a similar level of PC platform industry support, is anywhere near as close to widespread production as TPMs, and is sufficiently integrated into the platform to be able to form the platform basis for meeting the requirements of the Hollings bill.

Would Anonymous perhaps like to take this question?

--Lucky Green
On Sun, 23 Jun 2002, Lucky Green wrote:
Anonymous writes:
Lucky Green writes regarding Ross Anderson's paper at: Ross and Lucky should justify their claims to the community in general and to the members of the TCPA in particular. If you're going to make accusations, you are obliged to offer evidence. Is the TCPA really, as they claim, a secretive effort to get DRM hardware into consumer PCs? Or is it, as the documents on the web site claim, a general effort to improve the security in systems and to provide new capabilities for improving the trustworthiness of computing platforms?
Anonymous raises a valid question. To hand Anonymous additional rope, I will even assure the reader that when questioned directly, the members of the TCPA will insist that their efforts in the context of TCPA are concerned with increasing platform security in general and are not targeted at providing a DRM solution.
Unfortunately, and I apologize for having to disappoint the reader, I do not feel at liberty to provide the proof Anonymous is requesting myself, though perhaps Ross might. (I have no first-hand knowledge of what Ross may or may not be able to provide.)
That makes the claim a mite weak, at least from my perspective.
I however encourage readers familiar with the state of the art in PC platform security to read the TCPA specifications, read the TCPA's membership list, read the Hollings bill, and then ask themselves if they are aware of, or can locate somebody who is aware of, any other technical solution that enjoys a similar level of PC platform industry support, is anywhere as near to wide-spread production as TPM's, and is of sufficient integration into the platform to be able to form the platform basis for meeting the requirements of the Hollings bill.
Is the Hollings bill you refer to S.2048? In S.2048 they want to "plug the analog hole". It's far worse, both economically and "big brother"-wise. How are they going to deal with all the processors now running that don't have this "Fritz" chip? Deny them access to data? They won't win over a whole lot of votes that way, pissing off every grandfather in the country.

Patience, persistence, truth,
Dr. mike
It seems clear that if DRM is an application, then DRM applications would benefit from the "increased trust", and architecturally that such "trust" would be needed to enforce/ensure some or all of the requirements of the Hollings bill.

hawk

Lucky Green wrote:
.... other technical solution that enjoys a similar level of PC platform industry support, is anywhere as near to wide-spread production as TPM's, and is of sufficient integration into the platform to be able to form the platform basis for meeting the requirements of the Hollings bill.
Would Anonymous perhaps like to take this question?
I, for one, can vouch for the fact that TCPA could absolutely be applied to a DRM application. In a previous life I actually designed a DRM system (the company has since gone under). In our research and development in '96-98, we decided that you need at least some trusted hardware at the client to perform any DRM, but that if you _did_ have some _minimal_ trusted hardware, it would provide a large hook to a fairly secure DRM system.

Check the archives of, IIRC, coderpunks... I started a thread entitled "The Black Box Problem". The issue is that in a DRM system you (the content provider) want to verify the operation of the client, even though the client is not under your control. We developed an online interactive protocol with a sandbox environment to protect content, but it would certainly be possible for someone to crack it. Our threat model was that we didn't want people to be able to use a hacked client against our distribution system.

We discovered that if we had some trusted hardware that had a few key functions (I don't recall the few key functions offhand, but it was more than just encrypt and decrypt) we could increase the effectiveness of the DRM system astoundingly. We thought about using cryptodongles, but the Black Box Problem still applies. The trusted hardware must be a core piece of the client machine for this to work.

Like everything else in the technical world, TCPA is a tool. It is neither good nor bad; that distinction comes in how we humans apply the technology.

-derek

"Lucky Green" <shamrock@cypherpunks.to> writes:
Anonymous writes:
Lucky Green writes regarding Ross Anderson's paper at: Ross and Lucky should justify their claims to the community in general and to the members of the TCPA in particular. If you're going to make accusations, you are obliged to offer evidence. Is the TCPA really, as they claim, a secretive effort to get DRM hardware into consumer PCs? Or is it, as the documents on the web site claim, a general effort to improve the security in systems and to provide new capabilities for improving the trustworthiness of computing platforms?
Anonymous raises a valid question. To hand Anonymous additional rope, I will even assure the reader that when questioned directly, the members of the TCPA will insist that their efforts in the context of TCPA are concerned with increasing platform security in general and are not targeted at providing a DRM solution.
Unfortunately, and I apologize for having to disappoint the reader, I do not feel at liberty to provide the proof Anonymous is requesting myself, though perhaps Ross might. (I have no first-hand knowledge of what Ross may or may not be able to provide.)
I however encourage readers familiar with the state of the art in PC platform security to read the TCPA specifications, read the TCPA's membership list, read the Hollings bill, and then ask themselves if they are aware of, or can locate somebody who is aware of, any other technical solution that enjoys a similar level of PC platform industry support, is anywhere as near to wide-spread production as TPM's, and is of sufficient integration into the platform to be able to form the platform basis for meeting the requirements of the Hollings bill.
Would Anonymous perhaps like to take this question?
--Lucky Green
-- Derek Atkins Computer and Internet Security Consultant derek@ihtfp.com www.ihtfp.com
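Derek's "few key functions" in a core piece of the client machine resemble what later came to be called remote attestation: the trusted component proves to the content provider which software it is running. The sketch below is purely illustrative and is not the TCPA protocol; the class and function names, and the use of a pre-shared key with HMAC (real TPMs use asymmetric signatures), are my own assumptions.

```python
import hashlib
import hmac
import os

# Hypothetical trusted component: holds a device key the host OS cannot
# read, and a measurement (hash) of the client software it loaded.
class TrustedModule:
    def __init__(self, device_key: bytes, software_image: bytes):
        self._key = device_key
        self._measurement = hashlib.sha256(software_image).digest()

    def quote(self, nonce: bytes) -> tuple:
        # MAC the measurement together with the verifier's nonce so the
        # response cannot be replayed from an earlier session.
        tag = hmac.new(self._key, nonce + self._measurement,
                       hashlib.sha256).digest()
        return self._measurement, tag

# Content-provider (verifier) side: shares the device key out of band and
# knows the hash of the approved client software.
def verify_client(device_key: bytes, approved_image: bytes,
                  measurement: bytes, tag: bytes, nonce: bytes) -> bool:
    expected = hmac.new(device_key, nonce + measurement,
                        hashlib.sha256).digest()
    return (hmac.compare_digest(tag, expected)
            and measurement == hashlib.sha256(approved_image).digest())

# Demo: an approved client verifies; a modified ("hacked") client does not.
key = os.urandom(32)
good = b"approved client v1"
nonce = os.urandom(16)
m, t = TrustedModule(key, good).quote(nonce)
assert verify_client(key, good, m, t, nonce)
m2, t2 = TrustedModule(key, b"hacked client").quote(nonce)
assert not verify_client(key, good, m2, t2, nonce)
```

This is exactly the Black Box Problem in miniature: the check is only as trustworthy as the hardware holding `device_key`, which is why a software-only client can always be hacked around.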
Over the last six months, I'd discovered that Carl Ellison (Intel), Joan Feigenbaum (Yale) and I agreed on at least one thing: that the problem statements for "privacy" and for "digital rights management" were identical, viz., "controlled release of information that is yours, at a distance in space or time", and that as such our choices for the future of digital rights management and privacy are "both or neither", at least insofar as technology, rather than cultural norms & law, drives.

Last week at USENIX 2002 I tried this out on Larry Lessig, as his keynote had been a takeoff from his recent _The Future of Ideas_ book. His response was confirming: "Of course they are the same!" He went on to describe that when Mark Stefik (Xerox PARC) had submitted his patent on DRM in the early '90s, it had roughly said "wrap data such that if you try to abuse it, it will self-destruct." Sometime in the late '90s a Canadian inventor had attempted to patent a privacy technology with the rough description "wrap data such that if you try to abuse it, it will self-destruct." The USPTO denied the patent request on the grounds that it duplicated an application that had already been granted.

Speaking personally, if asked "DRM & privacy, both or neither?" then I will take "both" -- YMMV.

--dan
Speaking personally, if asked "DRM & privacy, both or neither?" then I will take "both" -- YMMV.
This bullshit is getting deeper and thicker. The (dis)ability to replay received information at will has next to nothing to do with the ability to stop unwanted parties from obtaining secret information.

Let me rephrase this for stupids: DRM is about enforcing an NDA between me and someone who made information available to me. DRM is about preventing me from transmitting information which has become a part of my experience. DRM is about who owns my memories.

One *is* the sum of information obtained from the outside world. Information becomes (a small) part of you. This is why people share songs - they identify with something there and want to communicate it. THAT'S WHY THEY LIKE IT IN THE FIRST PLACE. The ultimate DRM is the city government stopping you from describing streets, the leased apartment's owner stopping you from reminiscing about sex you had there, the school suing you for passing on the knowledge you learned. Put a newborn in a sensory deprivation tank and twenty years later observe someone who fully obeys "rights". There is no moderate answer to this. The only possible answer is FUCK YOU.

Privacy is about stopping the unwanted from knowing my private bits, bits I share with my chosen circle of associates and friends. And guess what - I am not a friend or associate of entertainment publishers. I am not a member.

Sale of information is always a sale to the group one belongs to. After a few iterations it quickly expands to the whole connected world. So publishers can choose to (a) become pipes more convenient and faster than information working its way through degrees of separation, or (b) go out of business. In the meantime a lot of money and maybe some blood will be spent trying to accommodate sheer greed. There is no middle road.

Keep your fingers off my memories or I'll pulverize yours.
Dan Geer wrote:
Over the last six months, I'd discovered that Carl Ellison (Intel), Joan Feigenbaum (Yale) and I agreed on at least one thing: that the problem statements for "privacy" and for "digital rights management" were identical,
...
... YMMV.
Uhhh, my mileage varies rather considerably. Perhaps we are using wildly divergent notions of "privacy" -- or wildly divergent notions of "identical".

DRM has to do mainly with protecting certain rights to _published_ material. Private material is not "identical" with published material -- it is more opposite than identical. Private material is, according to the usual definitions, in the hands of persons who have a common interest in keeping the information private and restricted. Published material, in contrast, is in the hands of persons who have no interest in keeping it private, and indeed commonly have an interest in defeating whatever restrictions are in place.

We have thousands of years of experience with military crypto, where the parties at both ends of the conversation are highly motivated to restrict the flow of private information. The current state of this technology is very robust.

Ending about 20 years ago, we had a 500-year era in which it was not practical for anyone except an established publisher to infringe copyrights in a big way. During this era, Rights Management had essentially nothing to do with crypto; it mainly had to do with the economics of printing presses and radio transmitters, supplemented by copyright laws that were more-or-less enforceable. This era was killed by analog means (widespread photocopy machines) and the corpse was pulverized by digital means (widespread computers and networking).

I repeat: the main features of our experience with Privacy Management are disjoint from the main features of our experience with Publishers' Rights Management. They are about as different as different can be. The record is replete with spectacular failures attributable to non-understanding of the difference.
On Tue, 25 Jun 2002, John S. Denker wrote:
Date: Tue, 25 Jun 2002 22:21:36 -0400
From: John S. Denker <jsd@monmouth.com>
To: Dan Geer <geer@TheWorld.com>, cryptography@wasabisystems.com, cypherpunks@lne.com, Ross.Anderson@cl.cam.ac.uk
Subject: Re: privacy <> digital rights management
Dan Geer wrote:
Over the last six months, I'd discovered that Carl Ellison (Intel), Joan Feigenbaum (Yale) and I agreed on at least one thing: that the problem statements for "privacy" and for "digital rights management" were identical,
...
... YMMV.
Uhhh, my mileage varies rather considerably. Perhaps we are using wildly divergent notions of "privacy" -- or wildly divergent notions of "identical".
DRM has to do mainly with protecting certain rights to _published_ material. Private material is not "identical" with published material -- it is more opposite than identical.
The spectrum from 2 people knowing something to 2 billion knowing something is pretty smooth and continuous. Both DRM and privacy have to do with controlling material after you have released it to someone who might wish to pass it on further against your wishes. There is little *technical* difference between your doctor's records being passed on to assorted insurance companies, your boss, and/or tabloid newspapers and the latest Disney movie being passed on from a country where it has been released to people/theaters in a country where it has not been released.
Private material is, according to the usual definitions, in the hands of persons who have a common interest in keeping the information private and restricted.
The only case where all holders of information always have a common interest is where the number of holders is one.
Published material, in contrast, is in the hands of persons who have no interest in keeping it private, and indeed commonly have an interest in defeating whatever restrictions are in place.
"Privacy", according to the usual definitions, involves controlling the spread of information by persons authorized to have it. Contrast this with secrecy, which primarily has to do with stopping the spread of information through the actions of those not authorized to have it.
We have thousands of years of experience with military crypto, where the parties at both ends of the conversation are highly motivated to restrict the flow of private information. The current state of this technology is very robust.
That's secrecy technology, not privacy technology.
Ending about 20 years ago we had a 500-year era where it was not practical for anyone except an established publisher to infringe copyrights in a big way. During this era, Rights Management had essentially nothing to do with crypto; it mainly had to do with the economics of printing presses and radio transmitters, supplemented by copyright laws that were more-or-less enforceable. This era was killed by analog means (widespread photocopy machines) and the corpse was pulverized by digital means (widespread computers and networking).
Sure, you can't have either privacy or DRM with plain paper texts or plaintext digital data on untrusted hardware. That's pretty obvious. A xerographic copier works just as well on a "private" handwritten letter as it does on a mass-produced printed page. And if you want to argue that total privacy and DRM are unobtainable because anyone knowing something in their mind can transmit it in plain text, sure. But that does not mean that, at least in principle, it is impossible to achieve "technical privacy" through crypto and trusted hardware, where the information cannot be improperly passed on by an authorized holder other than via their mind.
I repeat: The main features of our experience with Privacy Management are disjoint from the main features of our experience with Publishers' Rights Management. They are about as different as different can be. The record is replete with spectacular failures attributable to non-understanding of the difference.
You are confusing privacy with secrecy, and are confusing accidental/historic differences between privacy and DRM with their essential technical identity.

Donald
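Eastlake's claim that privacy and DRM share one technical core can be made concrete: in both cases, data leaves the originator only inside a sealed blob whose usage policy the recipient's trusted environment enforces, and only the label on the data ("medical record" vs. "movie") differs. The sketch below is a toy illustration under my own assumptions: the function names and policy format are invented, and the SHA-256 counter-mode stream cipher stands in for real authenticated encryption.

```python
import hashlib
import hmac
import json
import os

# Toy keystream: SHA-256 in counter mode. For illustration only --
# not a real-world cipher.
def _keystream(key: bytes, n: int) -> bytes:
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def seal(key: bytes, data: bytes, policy: dict) -> dict:
    """Bind a usage policy to encrypted data."""
    pol = json.dumps(policy, sort_keys=True).encode()
    ct = bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))
    # MAC covers policy + ciphertext: tampering with either is detected.
    tag = hmac.new(key, pol + ct, hashlib.sha256).hexdigest()
    return {"policy": policy, "ct": ct.hex(), "tag": tag}

def unseal(key: bytes, blob: dict, requested_use: str) -> bytes:
    """Release plaintext only for a use the sealed-in policy allows."""
    pol = json.dumps(blob["policy"], sort_keys=True).encode()
    ct = bytes.fromhex(blob["ct"])
    tag = hmac.new(key, pol + ct, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, blob["tag"]):
        raise ValueError("blob or policy tampered with")
    if requested_use not in blob["policy"]["allowed_uses"]:
        raise PermissionError("policy forbids " + requested_use)
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, len(ct))))

# The same mechanism serves both debated goals: swap the data and it is
# "privacy"; swap it again and it is "DRM".
key = os.urandom(32)
blob = seal(key, b"medical record", {"allowed_uses": ["view"]})
assert unseal(key, blob, "view") == b"medical record"
try:
    unseal(key, blob, "forward")
except PermissionError:
    pass  # refused, exactly as a DRM player refuses "copy"
```

Of course, `unseal` only enforces anything if it runs inside hardware the data's originator trusts, which is Denker's counterpoint in a nutshell.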
Donald Eastlake 3rd wrote:
On Tue, 25 Jun 2002, John S. Denker wrote:
Dan Geer wrote:
Over the last six months, I'd discovered that Carl Ellison (Intel), Joan Feigenbaum (Yale) and I agreed on at least one thing: that the problem statements for "privacy" and for "digital rights management" were identical,
...
... YMMV.
Uhhh, my mileage varies rather considerably. Perhaps we are using wildly divergent notions of "privacy" -- or wildly divergent notions of "identical".
DRM has to do mainly with protecting certain rights to _published_ material. Private material is not "identical" with published material -- it is more opposite than identical.
The spectrum from 2 people knowing something to 2 billion knowing something is pretty smooth and continuous. Both DRM and privacy have to do with controlling material after you have released it to someone who might wish to pass it on further against your wishes.
No they don't! Privacy has to do with two (or more) parties wishing to collaborate to prevent third parties from eavesdropping. DRM has to do with one party attempting to control everyone else's ability to reproduce. If these are related to each other at all, they are what mathematicians like to call duals. IMO.

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html http://www.thebunker.net/

"There is no limit to what a man can do or how far he can go if he doesn't mind who gets the credit." - Robert Woodruff
I wrote:
Perhaps we are using wildly divergent notions of "privacy"
Donald Eastlake 3rd wrote:
You are confusing privacy with secrecy
That's not a helpful remark. My first contribution to this thread called attention to the possibility of wildly divergent notions of "privacy". Also please note that according to the US Office of Technology Assessment, such terms do not possess "a single clear definition, and theorists argue variously ... the same, completely distinct, or in some cases overlapping". Please let's avoid adversarial wrangling over terminology. If there is an important conceptual distinction, please explain the concepts using unambiguous multi-word descriptions so that we may have a collegial discussion.
The spectrum from 2 people knowing something to 2 billion knowing something is pretty smooth and continuous.
That is quite true, but quite irrelevant to the point I was making. Pick an intermediate number, say 100 people. Distributing knowledge to a group of 100 people who share a vested interest in not divulging it outside the group is starkly different from distributing it to 100 people who have nothing to lose and something to gain by divulging it. Rights Management isn't even directly connected to knowledge. Suppose I know by heart the lyrics and music to _The Producers_ --- that doesn't mean I'm free to rent a hall and put on a performance.
Both DRM and privacy have to do with controlling material after you have released it to someone who might wish to pass it on further against your wishes. There is little *tehcnical* difference between your doctors records being passed on to assorted insurance companies, your boss, and/or tabloid newspapers and the latest Disney movies being passed on from a country where it has been released to people/theaters in a country where it has not been released.
That's partly true (although overstated). In any case it supports my point that fixating on the *technical* issues misses some crucial aspects of the problem.
The only case where all holders of information always have a common interest is where the number of holder is one.
Colorful language is no substitute for a logical argument. Exaggerated remarks ("... ALWAYS have ...") tend to drive the discussion away from reasonable paths. In the real world, there is a great deal of information held by N people where (N>>1) and (N<<infinity).
On Wed, 26 Jun 2002, Donald Eastlake 3rd wrote:
"Privacy", according to the usual definitions, involve controlling the spread of information by persons autorized to have it. Contrast with secrecy which primarily has to do with stopping the spread of information through the actions of those not authorized to have it.
We have thousands of years of experience with military crypto, where the parties at both ends of the conversation are highly motivated to restrict the flow of private information. The current state of this technology is very robust.
That's secrecy technology, not privacy technology.
I have seen "private" and "secret" defined in exactly the opposite fashion as regards keys: a "private" key is private because you never ever share it with anyone, whereas a "secret" (symmetric) key is a secret because you've told someone else and you expect them to not share it (in the sense of "can you keep a secret?"). Clearly there's not a common understanding of these simple words. Seems to me that Dan's mini-rant was referring to "privacy" in the sense you define it above (controlling spread of info already held by others). - RL "Bob"
On Wed, Jun 26, 2002 at 09:51:58AM -0400, Donald Eastlake 3rd wrote:
| "Privacy", according to the usual definitions, involve controlling the
| spread of information by persons autorized to have it. Contrast with
| secrecy which primarily has to do with stopping the spread of
| information through the actions of those not authorized to have it.

It sounds to me like you mean "data protection," not "privacy." By data protection, I mean the ability of the state to tell you not to use information about certain people in certain ways. See, for example, the EU Data Protection Directive.

I find it's really useful to not use the word privacy in debates about privacy; it simply means too many things to too many people. Bob Blakely once defined privacy as "the ability to lie about yourself and get away with it," which is an interesting definition. Other good ones include untraceability, the inability to trace from a message to a person; unlinkability, the inability to link two instances of "there's a person here" to the same person; and unobservability, the ability to not be observed doing something (think curtains, my current favorite privacy technology).

| > We have thousands of years of experience with military crypto, where
| > the parties at both ends of the conversation are highly motivated to
| > restrict the flow of private information. The current state of this
| > technology is very robust.
|
| That's secrecy technology, not privacy technology.

I'm not getting into this one. :)

--
"It is seldom that liberty of any kind is lost all at once." -Hume
On 6/25/02 4:15 AM, "Dan Geer" <geer@TheWorld.com> wrote:
Over the last six months, I'd discovered that Carl Ellison (Intel), Joan Feigenbaum (Yale) and I agreed on at least one thing: that the problem statements for "privacy" and for "digital rights management" were identical, viz., "controlled release of information is yours at a distance in space or time" and that as such our choices for the future of digital rights management and privacy are "both or neither" at least insofar as technology, rather than cultural norms & law, drive.
I think it even goes further than that. I was giving one of my DMCA-vs-Security talks while l'affaire Sklyarov was roiling, and noted that while that was going on, the US was being testy with China over alleged espionage by US nationals while in China.

At a high level, each of infringement and espionage can be described as: Alice gives Bob some information. Bob is careless with it, disclosing it to someone that Alice would rather not see it. Alice has a non-linear response.

You can call it infringement or you can call it espionage, but at the bottom of it, Alice believes that a private communication has been inappropriately disclosed. She thinks her privacy has been compromised and she's stomping angry about it.

At the risk of creating a derivative work: you say pr-eye-vacy, I say pr-ih-vacy. Infringement, espionage, let's call the whole thing off.

Jon
On Tue, 25 Jun 2002, Dan Geer wrote:
the problem statements for "privacy" and for "digital rights management" were identical
Hmm, so:

privacy : DRM :: wiretapping : fair use

- RL "Bob"
participants (13)

- Adam Shostack
- Ben Laurie
- Dan Geer
- Derek Atkins
- Donald Eastlake 3rd
- Harry Hawk
- John S. Denker
- Jon Callas
- Lucky Green
- Mike Rosing
- Morlock Elloi
- Nomen Nescio
- RL 'Bob' Morgan