Most of us agree that this technology is Nasty Copy Protection pushed by Nasty Software Hoarders opposed to Honest Open Source, and by Nasty Music Hoarders who want us to Pay Per View for music, videos, e-books, and other products that we've bought, using technical workarounds to restrict activities that would normally be covered by First Sale and Fair Use and only subject to the limited protections of copyright, and they richly deserve to Die Like DIVX (remember DIVX?*) and get rejected by the market like Lotus Copy Protection.

(*I'm told DIVX's cracked format has been recycled as a convenient tool for Napsterizing videos...)

Much worse, these Mindshare Marketing Thugs are in league with the sleazy DMCA-abusers who got a law written badly enough that it not only directly confiscates the previous rights of information consumers but goes far beyond that to criminalize people engaged in the legitimate activity of seeing what it is they bought and using it in interesting ways. The technical side is bad enough, but left to itself, either Darwin would get them or they'd find a market that's willing to be couch-potato consumers we can sneer at, either of which is OK, while the legal side is outright evil.

But what happens if we look at this from a cypherpunks viewpoint? Cypherpunks write code. Nasty MusicHoarderPunks can too. The right way to protect information isn't to write laws, which are ineffective against crackers (whether government or free-lance), usually contain loopholes for cops to abuse, and can be changed if the government wants to - it's to write code and algorithms and hardware designs that actually protect the information. That's what these guys are doing, and it's what we WANT them to do, though we'd rather have them operate a gift economy, the way the folk music profession did before it commercialized, and the way the church music and hacking professions do. (I'm not counting the use of the DMCA to criminalize working around bad software - that's still evil.)

How do you build tools to protect information, at a level of granularity that someone who'd cracked root on a Unix box, or bought or cracked User on a Windows box, can't break into? You use crypto to encrypt data, with public-key algorithms to do appropriate parts in public, use objects that maintain their own data and keys, and maybe you build capability-based operating systems, or partition functions into separate devices like smartcards or intelligent peripherals to keep the private parts more isolated.

If you want to build a For Your Ears Only secure telephone, it's much easier if you can ship an encrypted data stream that only the recipient's headphone can decrypt. And if you want a digital signature system that can't easily be forged by an FBI spook who shoulder-surfs your passphrase, or a digital payment system that can't easily be ripped off by some online store clerk, it's easier if you can use some hardware object in the process (there's a rough sketch of that idea below).

To a large extent, the threat models are critical to your security - but if being overprotected doesn't interfere with regular use, and doesn't interfere with the other protection you're building, it's not a Bad Thing. Of course, it cuts two ways - if you're not a Good Guy building hardware protection against virus crackers, but an NSA Spook building cracking tools to work around the software protections, it's nice to get down and dirty in the hardware and hire Chip-R-Us to include an undocumented Export Chip Private Key instruction in addition to the Export Chip Public Key instruction...
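Here's a toy sketch of the "private key lives in a separate object" idea, in Python using the `cryptography` package's Ed25519 support. The SigningToken class, the PIN handling, and the payment message are all made up for illustration - a real smartcard does the equivalent in hardware behind a PKCS#11 or ISO 7816 interface - but it shows why a shoulder-surfed passphrase alone doesn't get an attacker a signature: the key never leaves the token.

```python
# Minimal sketch: a signing "token" that never reveals its private key.
# Assumes the 'cryptography' package; SigningToken is hypothetical - a real
# smartcard would do this in hardware, not in host-side Python.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature


class SigningToken:
    """Holds the private key internally; the host only ever sees signatures."""

    def __init__(self, pin: str):
        self._pin = pin
        self._key = Ed25519PrivateKey.generate()   # never leaves the token
        self.public_key = self._key.public_key()   # freely exportable

    def sign(self, pin: str, message: bytes) -> bytes:
        # A shoulder-surfed PIN alone is useless without the physical token.
        if pin != self._pin:
            raise PermissionError("bad PIN")
        return self._key.sign(message)


# Host side: even a rooted box only handles messages and signatures.
token = SigningToken(pin="1234")
payment_order = b"pay example-store 5 dollars, order #42"
sig = token.sign("1234", payment_order)

# Anyone (the store, the bank) can verify with the public key.
try:
    token.public_key.verify(sig, payment_order)
    print("signature verifies")
except InvalidSignature:
    print("forged or altered")
```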
Music Hoarders have a somewhat harder problem, in that they want to copy-protect information while providing near-identical copies to large numbers of people, while you're more likely to want to provide your personal transaction information or private messages only to a small number of recipients - but you may still want some kind of watermarking to identify who sold your "private" information to somebody you didn't authorize.

As long as the watermarking isn't seriously obnoxious, the fact that different listeners hear slightly different versions isn't that bad - listeners at a concert also hear different versions depending on whether they're in the front row, the nosebleed seats, or the Phil Zone, as well as how hard they've been dancing, how bouncily the people in front are dancing, whether Jerry forgot some of the words or had a magical guitar night that reminds them of a previous concert, and how, umm, chemically enhanced they are :-)
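For a crude picture of what per-copy watermarking means, here's a toy sketch (plain Python, no libraries) that hides a buyer ID in the least significant bits of 16-bit PCM samples. Real audio watermarking is spread-spectrum and psychoacoustically shaped so it survives re-encoding and filtering, which this toy certainly doesn't - it's only meant to show that every customer can get an imperceptibly different copy that points back at them if it leaks.

```python
# Toy per-copy watermark: hide a 32-bit buyer ID in the least significant
# bits of 16-bit PCM samples.  Purely illustrative - real watermarking is
# spread-spectrum / psychoacoustic and survives re-encoding; this doesn't.

def embed_watermark(samples: list, buyer_id: int, width: int = 32) -> list:
    """Return a copy of `samples` with buyer_id's bits in the first `width` LSBs."""
    marked = list(samples)
    for i in range(width):
        bit = (buyer_id >> i) & 1
        marked[i] = (marked[i] & ~1) | bit
    return marked

def extract_watermark(samples: list, width: int = 32) -> int:
    """Recover the buyer ID from the first `width` samples."""
    buyer_id = 0
    for i in range(width):
        buyer_id |= (samples[i] & 1) << i
    return buyer_id

# Each customer gets an imperceptibly different copy of the "same" track.
master = [1000, -2000, 3000, 4000] * 16          # fake PCM data
alice_copy = embed_watermark(master, buyer_id=0xA11CE)
bob_copy   = embed_watermark(master, buyer_id=0xB0B)

# If a copy leaks, the low bits point back at whoever "sold" it.
print(hex(extract_watermark(alice_copy)))   # 0xa11ce
print(hex(extract_watermark(bob_copy)))     # 0xb0b
```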
Somebody allegedly wrote to RAH:

> Don't forget Intel and IBM are charter members of both these scuzzy
> outfits. And somebody please tell me what good an encrypted hard drive
> is gonna be when the key material has to pass through an untrusted PC
> running a see-through OS such as Windows? If one is actually trying to
> save the data _from_ the PC operator, not _for_ him/her, one needs
> TCPA-like hardening. At least Intel and IBM must realize this.
Intel and IBM know that Windows isn't going to protect their data - if they want it protected, they'll have to work around it, using techniques like CPUs, speakers, and disk drives that share public keys and only pass encrypted data through the OS.
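As a rough illustration of that kind of device-to-device pipe - a sketch only, using the Python `cryptography` package, with made-up DiskDrive and Speaker classes standing in for real firmware - the drive encrypts each block to the speaker's public key, so the OS in the middle only ever relays opaque bytes:

```python
# Sketch of "only ciphertext passes through the OS": the disk drive
# encrypts each block to the speaker's public key, so a compromised
# Windows box in the middle only ever shuffles opaque bytes around.
# Assumes the 'cryptography' package; the device classes are hypothetical.
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey, X25519PublicKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes, serialization


def derive_key(shared_secret: bytes) -> bytes:
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"drive-to-speaker").derive(shared_secret)


class Speaker:
    """Decrypts only inside the 'hardware'; plaintext never touches the OS."""
    def __init__(self):
        self._key = X25519PrivateKey.generate()
        self.public_bytes = self._key.public_key().public_bytes(
            serialization.Encoding.Raw, serialization.PublicFormat.Raw)

    def play(self, ephemeral_pub: bytes, nonce: bytes, ciphertext: bytes):
        shared = self._key.exchange(X25519PublicKey.from_public_bytes(ephemeral_pub))
        pcm = AESGCM(derive_key(shared)).decrypt(nonce, ciphertext, None)
        print("speaker got", len(pcm), "bytes of audio")


class DiskDrive:
    """Encrypts each block to the speaker's public key before handing it to the OS."""
    def send_block(self, speaker_pub: bytes, pcm: bytes):
        eph = X25519PrivateKey.generate()
        shared = eph.exchange(X25519PublicKey.from_public_bytes(speaker_pub))
        nonce = os.urandom(12)
        ct = AESGCM(derive_key(shared)).encrypt(nonce, pcm, None)
        eph_pub = eph.public_key().public_bytes(
            serialization.Encoding.Raw, serialization.PublicFormat.Raw)
        return eph_pub, nonce, ct          # all the OS ever sees


speaker, drive = Speaker(), DiskDrive()
blob = drive.send_block(speaker.public_bytes, b"\x00\x01" * 2048)  # fake PCM block
# The untrusted OS just passes the opaque tuple from one device to the other.
speaker.play(*blob)
```

			Thanks! 
			Bill
Bill Stewart, bill.stewart@pobox.com
PGP Fingerprint D454 E202 CBC8 40BF 3C85 B884 0ABE 4639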