Re: Executing Encrypted Code

At 9:18 PM 12/19/1996, Ben Byer wrote:
At the last meeting references were made to processors which only execute encrypted code. Decryption occurs on chip.
If each chip has a unique public/secret key pair, and executes authenticated code only, there are some interesting implications.
Let's see... What about this scenario:
Alice gets a contraband copy of PGP 4.0 off the Internet. Since the public-key algorithm is publicized so that people can encrypt software to a chip, PGP 4.0 has the ability to encode/decode/generate keys for the chip. Alice generates a public key/private key pair 0x12345678, in software. Alice goes to www.microsoft.com and orders Office '99 online, and tells Microsoft "Hi, my name is Alice, my credit card number is 31426436136778 and my PGPentium's public key is 0x12345678."
Microsoft unwittingly sends Alice a copy encrypted to 0x12345678, for which she holds the private key. Alice decrypts Office '99 and re-encrypts it with the public key of her PGPentium, as well as the keys of all her friends.
The software vendor would be wise to check that the public key was legal. It would be a simple matter for the manufacturer to publicize all public keys that had been installed on chips.
Does the authentication defeat this?
I'm sort of waving my hands around when I say "authentication". One approach is for the manufacturer to authenticate software submitted by approved vendors. The vendors are then tasked with encrypting it for the correct processor.
Our computers would only run software from Microsoft? Scary.
There are all sorts of nifty deals that could be made. Microsoft could commission a special run of the processors which only run Microsoft-approved software. Machines using these processors could be given away or sold at a steep discount.

You could also timestamp the software so that it only runs for a given length of time. This will encourage people to upgrade regularly. ;-)

The processors could also support metering. The processor could support some sort of API for the software to tell it how many computrons had been used, and stop it from running after they run out. This means that light users or evaluators of software pay relatively low prices while heavy users pay high prices. This is a great deal for all concerned. Right now software vendors try to do this with clever deals, but it's crude at best.

Peter Hendrickson
ph@netcom.com
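[The vendor-side check Hendrickson proposes above can be sketched as follows. This is a hypothetical illustration, not a real chip API: the key values, the list, and the function names are all invented for the example. The point is simply that Alice's software-generated key 0x12345678 never appears on the manufacturer's published list of keys actually installed in chips, so her order is refused.]

```python
# Hypothetical sketch: before encrypting an order to a claimed chip key,
# the vendor verifies the key against the manufacturer's published list
# of keys actually burned into shipped chips. All names and key values
# here are illustrative.

# Public keys the manufacturer has actually installed on shipped chips.
MANUFACTURER_KEY_LIST = {"0xAAAA1111", "0xBBBB2222", "0xCCCC3333"}

def fulfill_order(claimed_chip_key, software):
    """Encrypt `software` to the chip key only if the key is genuine."""
    if claimed_chip_key not in MANUFACTURER_KEY_LIST:
        return None  # Alice's software-generated key fails here
    # Stand-in for real public-key encryption to the chip.
    return b"ENC[" + claimed_chip_key.encode() + b"]" + software

# Alice's software-generated key is not on the published list:
assert fulfill_order("0x12345678", b"office99") is None
# A genuine chip key is accepted and the order is encrypted to it:
assert fulfill_order("0xAAAA1111", b"office99") is not None
```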

At 7:34 PM -0800 12/19/96, Peter Hendrickson wrote:
You could also timestamp the software so that it only runs for a given length of time. This will encourage people to upgrade regularly. ;-)
Or to reset their clocks. Which is what many of us do when software is about to "expire."

(The issue of enforcing "digital time delays" is an interesting one. Usually this necessitates some variant of "beacons," presumably on the Net, as the local clock can of course not be trusted or counted upon to be accurate. I wrote a couple of articles on this several years ago...I'll see if I can find them if there's interest.)

--Tim May

Just say "No" to "Big Brother Inside"
We got computers, we're tapping phone lines, I know that that ain't allowed.
---------:---------:---------:---------:---------:---------:---------:----
Timothy C. May              | Crypto Anarchy: encryption, digital money,
tcmay@got.net 408-728-0152  | anonymous networks, digital pseudonyms, zero
W.A.S.T.E.: Corralitos, CA  | knowledge, reputations, information markets,
Higher Power: 2^1398269     | black markets, collapse of governments.
"National borders aren't even speed bumps on the information superhighway."
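[May's "beacon" idea can be sketched in miniature. Everything here is an illustrative assumption: a real beacon would use public-key signatures, not a shared MAC key, and the names are invented. The sketch shows only the core logic: the chip accepts a time value only when the beacon's signature verifies, so resetting the local clock buys nothing.]

```python
# Minimal sketch of beacon-based expiry: the local clock is untrusted, so
# the chip checks expiry against a signed timestamp from a network beacon.
# HMAC stands in for a real signature; names are illustrative.
import hmac, hashlib

BEACON_KEY = b"beacon-secret"  # stands in for the beacon's signing key

def beacon_statement(now):
    """What a trusted time beacon would publish: (time, signature)."""
    return now, hmac.new(BEACON_KEY, str(now).encode(), hashlib.sha256).digest()

def may_run(expiry, stmt):
    """Chip-side check: accept the beacon's time only if the MAC verifies."""
    now, sig = stmt
    good = hmac.new(BEACON_KEY, str(now).encode(), hashlib.sha256).digest()
    return hmac.compare_digest(sig, good) and now < expiry

assert may_run(1000, beacon_statement(900)) is True    # before expiry: runs
assert may_run(1000, beacon_statement(1100)) is False  # expired: refuses
# Forging a time without the beacon's key fails verification:
assert may_run(1000, (900, b"forged")) is False
```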

-----BEGIN PGP SIGNED MESSAGE----- On Thu, 19 Dec 1996, Timothy C. May wrote:
At 7:34 PM -0800 12/19/96, Peter Hendrickson wrote:
You could also timestamp the software so that it only runs for a given length of time. This will encourage people to upgrade regularly. ;-)
Or to reset their clocks. Which is what many of us do when software is about to "expire."
(The issue of enforcing "digital time delays" is an interesting one. Usually this necessitates some variant of "beacons," presumably on the Net, as the local clock can of course not be trusted or counted upon to be accurate. I wrote a couple of articles on this several years ago...I'll see if I can find them if there's interest.)
--Tim May
Just off the top of my head, the chips could come connected to a battery to maintain an internal clock and be configured to stop functioning if it is ever disconnected. Since the life expectancy of one generation of a CPU is so short now, limiting the life of a chip to that of a battery is not much of a problem. Also, if these are given away as was suggested, the fact that a dead battery would kill your computer is no big deal.

<<<< NOTE CHANGE IN WHO'S BEING QUOTED >>>>

On Thu, 19 Dec 1996, Peter Hendrickson wrote:
... stuff deleted ...
The manufacturer of the encrypted-code processor would protect its instruction set using intellectual property law. Given the high price of a fab, it is entirely feasible to stop anybody from building a new architecture which can execute the code about as fast as the encrypted-code processor.
It seems to me that this is where this scheme would be broken. Have intellectual property laws been (successfully) used in this way? And even if so, would they be enforced in all the countries where the chips might be fabricated?
Peter Hendrickson ph@netcom.com
- --------------------
Scott V. McGuire <svmcguir@syr.edu>
PGP key available at http://web.syr.edu/~svmcguir
Key fingerprint = 86 B1 10 3F 4E 48 75 0E 96 9B 1E 52 8B B1 26 05

-----BEGIN PGP SIGNATURE-----
Version: 2.6.3i
Charset: noconv

iQCVAwUBMro1I97xoXfnt4lpAQHD9gQAo0rwSzXmo8Qu46auFGhcp6RaWDDwxHtS
SZNoy2L3VVVECgNb+wuHSdHlPCdocK/sWzncmg4DSipa81r4cUK/8hIbvEJp+rRz
qS6vs2VpxEMaTLUA+RS82Bc/c99b3AjGtjf55uYdgVIbGfH4Tnqc1yvzDcP03G//
mVVQTga4lHA=
=gXr8
-----END PGP SIGNATURE-----
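[The "manufacturer authenticates approved vendors' software" approach discussed earlier in the thread can also be sketched. This is an illustrative assumption, not any real scheme: HMAC with a shared key stands in for a proper manufacturer signature, and the names are invented. The chip runs code only when the manufacturer's signature on it verifies.]

```python
# Rough sketch: the manufacturer signs software submitted by approved
# vendors, and the chip refuses to execute anything unsigned. HMAC is a
# stand-in for a real signature scheme; names are illustrative.
import hmac, hashlib

MFR_KEY = b"manufacturer-signing-key"  # known only to the manufacturer

def mfr_sign(code):
    """Manufacturer signs software from an approved vendor."""
    return hmac.new(MFR_KEY, code, hashlib.sha256).digest()

def chip_accepts(code, sig):
    """On-chip check: execute only manufacturer-signed code."""
    expected = hmac.new(MFR_KEY, code, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)

code = b"office99"
sig = mfr_sign(code)
assert chip_accepts(code, sig) is True          # approved software runs
assert chip_accepts(b"tampered", sig) is False  # modified code is refused
```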

-----BEGIN PGP SIGNED MESSAGE-----
At 9:18 PM 12/19/1996, Ben Byer wrote:
At the last meeting references were made to processors which only execute encrypted code. Decryption occurs on chip.
If each chip has a unique public/secret key pair, and executes authenticated code only, there are some interesting implications.
Let's see... What about this scenario:
Alice gets a contraband copy of PGP 4.0 off the Internet. Since the public-key algorithm is publicized so that people can encrypt software to a chip, PGP 4.0 has the ability to encode/decode/generate keys for the chip. Alice generates a public key/private key pair 0x12345678, in software. Alice goes to www.microsoft.com and orders Office '99 online, and tells Microsoft "Hi, my name is Alice, my credit card number is 31426436136778 and my PGPentium's public key is 0x12345678."
Microsoft unwittingly sends Alice a copy encrypted to 0x12345678, for which she holds the private key. Alice decrypts Office '99 and re-encrypts it with the public key of her PGPentium, as well as the keys of all her friends.
The software vendor would be wise to check that the public key was legal. It would be a simple matter for the manufacturer to publicize all public keys that had been installed on chips.
The manufacturer is going to publish a list of ALL of the public keys? We're talking one key per chip, right? Isn't that an AWFUL lot of keys, like, in the millions range? Also... with a few million possible keys like this, all you need to do is either guess or factor just one of them.
Does the authentication defeat this?
I'm sort of waving my hands around when I say "authentication".
One approach is for the manufacturer to authenticate software submitted by approved vendors. The vendors are then tasked with encrypting it for the correct processor.
I'm not sure the "approved" bit would go over too well... one idea would be to license the compiler writers, who would build the encryption into compilers. It's still not horribly great, but better.
Our computers would only run software from Microsoft? Scary.
There are all sorts of nifty deals that could be made. Microsoft could commission a special run of the processors which only run Microsoft approved software. Machines using these processors could be given away or sold at a steep discount.
Right; the only reason I could see people using this would be for economical reasons.
You could also timestamp the software so that it only runs for a given length of time. This will encourage people to upgrade regularly. ;-) The processors could also support metering.
Right; once the user loses control of what he's running, then you can pretty much do anything you want as far as metering goes.

ObGAK question: Would this be exportable? I mean, you could be encrypting god knows WHAT into those .exe's... Key escrow? How would they get the key?!? I can see the headlines, "Key Escrow Database Leaked to Pirate Firm"... :)

- --
Ben Byer
root@bushing.plastic.crosslink.net
I am not a bushing

-----BEGIN PGP SIGNATURE-----
Version: 2.6.2

iQB1AwUBMroTE7D5/Q37XXHFAQG6sgL8DnusDI/jqV3sn9U5ru2hhJPFxP1dZVpZ
ohmJYteQdraD5/YfmvYNHFfslULB47Spx6ZTpT+xw512iMWJfyW5sN6NtejL6+CM
2BoX0SaRGxZrfVeRFAZAXMVx3/ak1LDk
=HZOI
-----END PGP SIGNATURE-----
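[The metering API floated in the thread, where software reports consumed "computrons" and the processor halts it when the purchased budget runs out, can be sketched as a toy. The class, method names, and units are all invented for illustration; no real chip exposes this interface.]

```python
# Toy sketch of processor-enforced metering: software reports usage via
# charge(); once the purchased budget is exhausted, charge() returns
# False, meaning the processor stops running the program. Names and
# units are illustrative.

class MeteredProcessor:
    def __init__(self, purchased_computrons):
        self.remaining = purchased_computrons

    def charge(self, computrons):
        """Software reports usage; False means 'stop running'."""
        if computrons > self.remaining:
            self.remaining = 0
            return False
        self.remaining -= computrons
        return True

cpu = MeteredProcessor(purchased_computrons=100)
assert cpu.charge(60) is True   # light use proceeds
assert cpu.charge(30) is True
assert cpu.charge(30) is False  # budget exhausted; execution halts
```

Light users pay for only what they burn; heavy users exhaust the budget and buy more, which is exactly the price discrimination Hendrickson describes.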
participants (4)
- Ben Byer
- ph@netcom.com
- Scott V. McGuire
- Timothy C. May