Re: Privacy Software
-----BEGIN PGP SIGNED MESSAGE-----

Adam Back wrote:
Monty Cantsin writes:
The PGP source code is not the worst I've ever seen, but it's kind of odd.
I had a go at doing something with it (I'll let you know when I get it to work) -- I had the damnedest job figuring out what was going on. The problem I had understanding it was all of the nested functions called through vectors of functions and handler functions. Makes it hard to inspect what will happen without running the code under a debugger -- lots of control flow is decided at run time.
So in other words, even though we have the source code we don't really have confidence that we can tell what it is doing.

Even though the source code is available, I don't think it has been studied all that carefully. For example, hardly anybody knew that the PGP 5.0 source had CAK features lurking in it. Or, remember that bug with the random number generator? As I recall (i.e., feel free to correct me), it was in Colin Plumb's code and he found it himself. This would imply that it got by whoever went over the code when it was released. Not reassuring.

C is a big part of the problem. Also, PGP was originally designed to operate on some fairly slow machines, and they tried very hard to optimize the hell out of it. Now, however, cycles are a lot cheaper. I think we should give up speed for clarity. Slow code that we can really trust is better.

And, the PGP code does a bunch of careful things which complicate it, like burning the stack all the time. In my judgement, this is misplaced. If you don't trust the platform you are using, you can't fix it in your encryption code.
We should consider a rewrite, which gives us the added benefit that it will be completely unencumbered.
Sounds maybe worth doing.
It's at least worth talking about. Hashing out the design is a good exercise and it can inspire other people when they are designing their own crypto systems.
It also gives us the opportunity to write it in a language other than C, one which truly supports encapsulation. C code is hard to verify with great confidence because it is possible to obfuscate it and introduce security holes. This means that C requires one to trust the authors to a greater extent than is desirable.
Some C programmers do have fun writing obfuscated chicken scratchings, but you _can_ write C code optimised for readability.
What's really bad is that any part of the code can betray the rest. Let's say you had somebody who was really very good who wanted to weaken things. All he or she has to do is have a little obfuscated code someplace that undermines the majority of nicely designed and implemented C code. A truly encapsulated language is a lot easier to verify because if none of the magic functions exist in the code (easy to check), you can be confident that when you have verified one piece of code that the other pieces can't interfere.
What language did you have in mind? modula-3? iso-pascal/borland pascal?
Hadn't thought of Modula-3, but that is an excellent idea. Java would be a candidate, although performance is an issue. There's been some good buzz around about Ada. It might perform well and it's here to stay. Is Modula-3 being used in any serious way by anybody? If it is, let's consider it. (I think there is a Modula-3 -> C compiler out there someplace, which means it's totally portable.) A good language choice would probably save a lot of time.
The whole issue of compatibility is an interesting one. Would it be a good idea to have a cryptographic system which was completely incompatible with PGP, given the Big Brother risk?
Theoretically perhaps, however I'm not sure a system will get very far unless it can automatically interoperate with pgp2.x and 5.x.
It would be nice to just shed the PGP baggage for the moment and think about how we would do a system from scratch for our own uses. Ultimately, this system could reside in an application which also speaks PGP. Or, it could just be its own thing.
You could build something which did the right thing automatically based on public key types:
hypothetical cpunks mail system (HCMS)
We should use the acronym CMR. (Cypherpunk Message Remodel? Suggestions invited.)
HCMS -> pgp2.x: use RSA/MD5/IDEA and pgp2.x message formats
HCMS -> pgp5.x: use DSA/EG/3DES/SHA1 and pgp5.x message formats
HCMS -> HCMS: use whatever goodies you want, stego, mixmaster, etc.
CMR nee HCMS should be an independent system. The mail system you are using should decide whether to use it or some other system depending on the correspondent. While it may seem crazy to toss compatibility, it has some advantages. For instance, the only people who will use it are the hardcore types. I like the idea of an exclusive crypto system that only cool people who are fairly with it use.

What do you think about mandating a key size? I like the idea that use of the protocol communicates "I am very serious about my privacy." It's also slightly cripple resistant. (And, were I to consider mass market concerns, which I am not, I would say that it makes it easier for the user to make the right choice.)

How about finding a way to eliminate indicators? It's nice not to leak any information at all. It also makes super-encryption stronger. If each encrypted layer looks like noise, the total key size is the sum of the bits of each layer. One way to eliminate indicators is through the prior agreement of the meaning of random bits. In effect, you can say "when you see 0x234B4D234F, it means: outside layer RSA/IDEA, middle layer 3DES/SHA1, and inner layer the Captain Crunch decoder ring." This information can be exchanged at the time of key authentication, securely, by telling your correspondent how you are encrypting everything. Later, you can send more random bits when the tank starts getting low.

Super-encryption is something we should do more of. Let's say we believe, conservatively, that any particular cipher we are using has a 10% chance of falling in a particular year. Over ten years, that's roughly a 65% chance of compromise (only about a 35% chance the cipher survives the decade). But if three different ciphers are used, there is only about a 28% chance that all three fall within the decade. Super-encryption has been unpopular because it takes too long. However, if you are using a modern OS, even if it takes five minutes running in the background it's just not a big deal.
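Those odds can be checked with a quick back-of-the-envelope calculation. This is only a sketch: the 10%-per-year break probability, and independence between years and between ciphers, are the post's assumptions.

```python
# Assumption (from the post): each cipher independently has a 10%
# chance of being broken in any given year.
p_year = 0.10
years = 10

# Chance a single cipher survives the decade, and the complement:
# the chance it falls at some point during it.
p_survive = (1 - p_year) ** years      # ~0.35
p_fall = 1 - p_survive                 # ~0.65

# With three independently keyed superencryption layers, the message
# is only readable if ALL three ciphers fall during the decade.
p_all_three = p_fall ** 3              # ~0.28

print(p_survive, p_fall, p_all_three)
```

So the often-quoted "35%" is really the survival probability of one cipher; a single cipher has about a 65% chance of compromise over the decade, while three stacked ciphers bring that down to roughly 28%.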
It doesn't seem unreasonable to me to break out authentication from the communication protocol. Strictly speaking they are not the same thing. Both programs can be called by the same message composition program. This naturally leads to using separate communications and authentication keys.

I am not fond of PGP's signature design. The "BEGIN PGP SIGNATURE" stuff clutters things up. How about a one line signature with a previous character count? Note that this wraps beautifully without modifying the message itself. It's unobtrusive.

ASCII armor also sets off alarms with me, but I'm not sure exactly how to solve this problem. It seems right to define protocols in terms of full 8 bit bytes. Perhaps just uuencoding encrypted messages is the way to go? For authentication, it would probably be okay to define an ASCII mode.

I think key management is good to think of as a separate application. The right way to do this isn't clear, yet. The Web of Trust is a neat idea, but I think the concept is flawed because you are unlikely to tell something sensitive to somebody you wouldn't (or couldn't) authenticate yourself. Ultimately it relies on things like driver's licenses issued by Big Brother, which is terrible.
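For illustration, the one-line signature idea might look something like the sketch below. Everything here is invented for the example: the "SIG <count> <tag>" format is hypothetical, and HMAC-SHA256 stands in for the real public-key signature a deployed system would use.

```python
import base64
import hashlib
import hmac

# Hypothetical shared key, standing in for a real signing key.
KEY = b"illustration-only-key"

def sign_message(text: str) -> str:
    """Append a single trailing line: SIG <char-count> <base64-mac>.

    The character count records exactly how much preceding text the
    signature covers, so no BEGIN/END delimiters are needed and the
    message body itself is untouched.
    """
    mac = hmac.new(KEY, text.encode(), hashlib.sha256).digest()
    tag = base64.b64encode(mac).decode()
    return f"{text}\nSIG {len(text)} {tag}\n"

def verify_message(signed: str) -> bool:
    body, sig_line, _ = signed.rsplit("\n", 2)
    _, count, tag = sig_line.split(" ")
    if int(count) != len(body):
        return False
    expected = hmac.new(KEY, body.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(base64.b64decode(tag), expected)
```

The count also makes the signature line self-describing: a verifier can find the covered span even if headers or trailing text were added around the message in transit.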
Something I've never liked about PGP is their approach to encrypting to multiple keys. For one thing, the PGP crowd seems overly conservative with bit expenditure, which is silly because bits are cheap. This means that creating entirely separate messages is completely economical.
This is more secure I agree. The real kicker with this problem is people who turn on encrypt to self -- I don't want messages with encrypt to self (an extra door into the message) on them in my mailbox, nor coming over the wire headed to me.
You can see the reason for multiple recipients though -- it's to ease integration into mailers -- you can process the message and give it back to the mailer, and then it can still send the message To: x, y; Cc: z.
But this is just bad design if you care about security in a serious way. Even assuming you have to reveal who you are sending the message to, why reveal that three messages are related to each other? This is yet another mass market issue. If we were to do it the right way, all the mail header fields would be contained inside the encryption envelope.
Doing away with multiple crypto recipient risk is more an issue of MUA integration than rabid conservation of bits.
I'll buy that.
So, perhaps a protocol which does not support anything more than one encryption key per message would be a good idea.
You don't have to use it. I tend not to. I never use encrypt to self, and I get annoyed with people who send me messages encrypt to themselves, and very rarely do I use Cc on encrypted messages.
If the protocol doesn't accept multiple keys for a message it is slightly CAK/GAK resistant. And, it's a nice statement of intent. It's also nice to have tools which help you to behave securely without carefully thinking about it when you are using them.
And, I wonder if compression doesn't actually weaken security? Let's say I forward a known message with some commentary. Since the compression tables will be known, it seems like the increased size of the message could provide some interesting information about the preceding commentary. All by itself, this probably doesn't matter, but combined with other information it might result in a breach. In any event, that which is ambiguous should be eliminated.
I don't understand this comment.
Let's say it is known with high probability that I am forwarding a particular message with some unknown prefacing comments. It seems to me that (possibly) some information is revealed by the size of the compressed encrypted text. That is, the known message will compress to a different size depending on your prefacing comments. For example, if I use lots of words that are in the known message, it will compress very well. The result will be significantly smaller than if I use words that never appear in the known message. This might not be a problem, but I don't like "mights". Also, I like things which are really orthogonal. There's no particular reason for the compression to be part of an encryption program, if you use it at all.
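The effect is easy to demonstrate with a toy experiment. zlib stands in here for PGP's compressor, and the sample texts are made up; the point is only that the compressed (hence ciphertext) length depends on how much vocabulary the preface shares with the known message.

```python
import zlib

# A "known" forwarded message (repeated so it dominates the stream).
known = ("The quick brown fox jumps over the lazy dog. " * 20).encode()

# Preface reusing phrases from the known message...
overlap = b"Re: the quick brown fox jumps over the lazy dog message: "
# ...versus a similar-length preface with fresh vocabulary.
fresh = b"Zebras vex; jackdaws quip; my sphinx whomps gargantuan fjords: "

# Compress preface + known message, as a mailer compressing
# commentary followed by a forwarded text would.
size_overlap = len(zlib.compress(overlap + known))
size_fresh = len(zlib.compress(fresh + known))

# The overlapping preface lets the known text back-reference it,
# so the total compresses smaller -- length leaks preface content.
print(size_overlap, size_fresh)
```

Run it and the overlapping-vocabulary case comes out measurably smaller, which is exactly the side channel described above: an observer who knows the forwarded text learns something about the commentary from ciphertext length alone.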
One thing that some people don't realise is that the plaintext gets mixed into the random pool as an additional source of entropy. In automated environments (lots of MUAs which set +batchmode), this is all the entropy you'll get -- except for the original key presses to generate the key, and a small bit of entropy from the system clock.
It's fairly good normally because the way entropy is mixed in is a one way function of the randseed.bin based on IDEA. To make use of this an attacker would need a copy of your randseed.bin before you sent the target message, and to have suspicions of what the message is. Even after a few known messages it would still likely be possible to attack, because the entropy added by the clock is partly predictable from the message headers Date: field, and because it isn't that much entropy each time.
The randseed.bin file has always bothered me. What we really want is some good sources of entropy in which we have tremendous confidence. Also, if you signed the message the Date: field won't be needed. This is bad design, in my opinion. A signature and a time stamp are two different things. For instance, I would prefer not to time stamp my messages as it reveals the state of my clock and the transit time of the messages I send.
It would also be nice if the messages were padded to predetermined sizes, say 10K, 20K, 40K, etc. (Once compression is eliminated this is less of an issue.)
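A minimal sketch of that padding scheme follows. The bucket sizes take the 10K/20K/40K suggestion above; the 4-byte length prefix is an assumption added here so the receiver can strip the padding.

```python
import os

# Fixed message sizes: ciphertext length then reveals only a coarse
# bucket, not the exact plaintext length.
BUCKETS = [10_240, 20_480, 40_960, 81_920]

def pad_to_bucket(message: bytes) -> bytes:
    # 4-byte big-endian length prefix so padding can be removed later.
    framed = len(message).to_bytes(4, "big") + message
    for size in BUCKETS:
        if len(framed) <= size:
            # Fill to the bucket boundary with random bytes.
            return framed + os.urandom(size - len(framed))
    raise ValueError("message too large for the largest bucket")

def unpad(padded: bytes) -> bytes:
    length = int.from_bytes(padded[:4], "big")
    return padded[4 : 4 + length]
```

In a real protocol the padding would sit inside the encryption envelope, so the random fill is indistinguishable from the rest of the ciphertext.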
It would be nice to have a system where you send 100k and receive 100k per day regardless. Say in 10 10k packets which get poured into a mixmaster node.
Yes, this would be good. Although, I would think of this as being on a higher level than the encryption program itself. Probably we want an entire communications management program which tracks everything you've sent and analyzes your activity for weaknesses.
How about a one time pad mode? One time pads are more practical than widely believed. Many things we talk about we *do* want to keep quiet for the rest of our natural lives.
Right.
The problem with this is that you need random numbers. How do you generate them?
You only need to generate keys once in a while. This means that you can go to some trouble when doing it. For instance, you might have some hardware which is not routinely connected to your machine.
If you use PGP's random pool, one suspects that if IDEA becomes attackable at some point in the future the random pool will start to look more like a predictable PRNG to the attacker.
I wonder how good linux's /dev/urandom would be if MD5 becomes even more suspect.
Well, neither of these would be good for a one time pad, of course. It would be neat to have, say, three sources of hardware randomness and then XOR the result with the above pseudo random output. It's hard to recognize non-randomness, so some overkill is good practice. BTW, it would be nice if PGP or CMR/HCMS would let you supply bits of entropy by hand. It takes awhile to generate 128 bits by hand, but for some things it would be worth it.
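The XOR combiner is simple enough to sketch. Here os.urandom stands in for the three hypothetical hardware sources and a hash of pool state stands in for the PRNG output; the property being relied on is that XORing independent sources is at least as unpredictable as the best single source.

```python
import hashlib
import os

def xor_bytes(*streams: bytes) -> bytes:
    """XOR several equal-length byte strings together."""
    assert len({len(s) for s in streams}) == 1, "equal lengths required"
    out = bytearray(len(streams[0]))
    for s in streams:
        for i, b in enumerate(s):
            out[i] ^= b
    return bytes(out)

# Stand-ins: three "hardware" sources plus a software PRNG output.
hw1, hw2, hw3 = (os.urandom(16) for _ in range(3))
prng = hashlib.sha256(b"pseudo-random pool state").digest()[:16]

# If any one input is truly random and independent of the others,
# the combined key is too -- the overkill is cheap insurance.
key = xor_bytes(hw1, hw2, hw3, prng)
```

The independence caveat matters: XORing correlated sources buys nothing, which is why the sources should be physically distinct.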
It's clear that going the corporate route has to be handled with some care. Given the political implications, investors face certain risks.
Also, many people seem to switch into a different mode as soon as they have a company. Anything which they perceive as increasing their profits becomes good. PGP, Inc. has gone this way, we've seen First Virtual do some unsavory things,
What did FV do? I know they don't use encryption, and Nathaniel Borenstein wrote a few hype-hype articles about the _gasp_ newly discovered security danger of "key board sniffers".
The hype article was what I had in mind. I doubt very much that Borenstein would ever have been involved with such a thing outside a corporation. It seems to have a bad effect on some people. The problem is, you don't always know where that bad effect stops once you've noticed it.
and even good old C2 has made a few people uncomfortable.
The only slightly negative thing C2 did was to make a dubious decision in handling Mr Nemesis's fabricated claims about Stronghold flaws. C2 still rocks though, right?
Comme ci, comme ça. The libel suit business was uncool. Also, there was a press release relating to one of the cracks which was somewhat misleading, I thought. C2 has shown some symptoms of institutional paranoia, which is bad. Sameer's comments along the lines of "this isn't some cypherpunk hobby" are also not reassuring. All of these things arise from being in a corporation. The problem is that I'm not tapping their phones. I don't really know what they are thinking or doing. When something happens that is troubling, it's hard to restore trust. I don't give C2 five stars.

That said, C2 has done some very good things. People seem to like Stronghold a lot. It really is a secure product. They publish their source code. I do like what they have to say in many of their press releases and they are getting certain ideas into the mainstream. (The Forbes article, for instance.) Sameer's one line comment "sell code" had a lot of influence on me, too.
It doesn't have to be this way, of course. Look at Comsec Partners. We don't see any "conversation recovery", lying press releases, or any other nonsense from them, just a beautiful product.
Quite so. C2 hasn't got "web traffic recovery" either :-)
No kidding. Yes, C2 is miles above PGP, Inc. in my book. But, Comsec gets the five stars. They have never done anything or said anything that gave me less than complete confidence.
What I like about selling software is that you could actually make a good living by doing the right thing. And, after all, if you've spent six months writing something, why shouldn't the users kick in a little money instead of freeloading? I would like to see more crypto users in the habit of paying for tools and in the habit of starting security companies.
New payment models will need to come in. How can you extract money from a cryptoanarchist? Copyright? Patent? Hah, hah.
The important thing is establishment of the custom. Most cryptoanarchists with class will pay. The way to do this is to make it clear from day one that it is not free software. If you want to run it, you should buy it. (The problem with shareware is that people get used to "borrowing" it.) This should work well. Cryptoanarchists, in my experience, have pretty high integrity. Some people will pirate, true, but that's life. Ironically, people who honestly discuss their skepticism of intellectual property laws are probably less likely to pirate.
If we get a real eternity service going (not the poor imitation perl script up now where all traffic goes through a server unless you install it locally) software copyright could become a thing of the past :-)
Trust might be one way to go, rely on good will. Or paying for technical support. Or mixmaster, or DC net packet delivery postage charges.
These are good ideas.

You know, it would be cool to define a software architecture for remailer software. Different remailers have different properties and different strategies. They may continue to differentiate. When you decide to trust a particular remailer, it could give you a module of code with a standard external interface which prepares messages for that particular remailer. Putting your idea another way, the code is "free" but the remailers cost money.

Monty Cantsin
Editor in Chief
Smile Magazine
http://www.neoism.org/squares/smile_index.html
http://www.neoism.org/squares/cantsin_10.htm

-----BEGIN PGP SIGNATURE-----
Version: 2.6.2

iQEVAwUBNF1+s5aWtjSmRH/5AQFvHQf+PFXqriw8FoMMcsflEf1Jo55Myl94Luj8
UxSQ9YCq6+dedih7j/FJ3RLL0LtCTa7Sn6E/kGmgqzwcAcfzcjavpl7joNdHU9Yo
pS81mXeZ3kw34Nj1hbaX2j6Shx4+46uNUskfXThi6bz3lh1OM2hy7xF63blB2m1I
MVt786tah76UfveYlWR/MXK+ZN2DwUvejiPX+bP7vNqaDsXRyrW5mX1vVUuKf6po
5PRZ5wSMKQF/ULm7tOb/g8otoj2w84iir8kckLJiiSs4Xm3hk6+QbDVIhz31aPD3
D2PXIDEXz+x+Vtr5xPIurvpfEo5GpgFdMv5Ox1y796yHr6jjJ8BJjQ==
=rjFe
-----END PGP SIGNATURE-----
At 2:32 PM -0800 11/4/97, Adam Back wrote:
What's wrong with the randseed.bin and the public and private key rings is that they should all be encrypted with a key derived from your passphrase.
Think about it for a minute. randseed.bin is a place to store entropy. Entropy is about uncertainty. If I do a reversible transform (e.g. encrypt) to randseed.bin, I still recover the entropy without reversing (e.g. decrypting) the transform.

-------------------------------------------------------------------------
Bill Frantz        | Internal surveillance     | Periwinkle -- Consulting
(408)356-8506      | helped make the USSR the  | 16345 Englewood Ave.
frantz@netcom.com  | nation it is today.       | Los Gatos, CA 95032, USA
Bill Frantz <frantz@netcom.com> writes:
At 2:32 PM -0800 11/4/97, Adam Back wrote:
What's wrong with the randseed.bin and the public and private key rings is that they should all be encrypted with a key derived from your passphrase.
Think about it for a minute. randseed.bin is a place to store entropy. Entropy is about uncertainty. If I do a reversible transform (e.g. encrypt) to randseed.bin, I still recover the entropy without reversing (e.g. decrypting) the transform.
You might get some entropy from it -- but you won't get my PRNG state! An attacker is welcome to the entropy, but may find it cheaper to generate his own entropy than to copy some of mine.

There are certain attacks which become possible when an attacker can snarf a copy of your randseed.bin, eg. the attacker can predict session keys if he can guess your plaintext, and you are using an environment which does not allow pgp2.x to sample your keystrokes (eg integrated mail scripts).

randseed.bin is more sensitive than people treat it. pgp2.x encrypts private keys because people could use them to decrypt traffic, but it does not encrypt the randseed.bin, which could in some circumstances also allow traffic to be decrypted.

An ergonomic disadvantage of encrypting randseed.bin is that you would need to enter the passphrase to decrypt it before being able to encrypt messages. (You could make that optional -- and just use it in encrypted form when you couldn't be bothered; the entropy shows through :-)

Encrypted public and private key rings are a separate good, because they obscure who you are talking to and what your nyms are. premail does this for you.

Adam

--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
My advice to Monty Cantsin or whomever actually wrote this: write shorter pieces! The quote-and-comment style, especially for very long pieces, usually results in people skipping huge sections, or the whole thing. I only scanned this and stopped when I saw the word "cryptoanarchist." At 4:41 AM -0700 11/3/97, Mix wrote: (quoting someone else)
New payment models will need to come in. How can you extract money from a cryptoanarchist? Copyright? Patent? Hah, hah.
The important thing is establishment of the custom. Most cryptoanarchists with class will pay. The way to do this is to make it clear from day one that it is not free software. If you want to run it, you should buy it. (The problem with shareware is that people get used to "borrowing" it.)
_This_ cryptoanarchist will almost _never_ pay for that which is free. If someone gives me something, no strings attached, and then says, "Oh, the "suggested donation" is $10," I tell them that they should have charged me that in the first place. (I'm obviously not fond of leftie events which are advertised as free, but which require a mandatory voluntary "suggested donation.")

More to the point, I use various freebies I get off the Net. Some of them have obscure schemes for sending payments to the alleged authors. Too much hassle. And if it's _free_, why pay anything? Charityware is not a viable business model.

--Tim May

The Feds have shown their hand: they want a ban on domestic cryptography
---------:---------:---------:---------:---------:---------:---------:----
Timothy C. May              | Crypto Anarchy: encryption, digital money,
ComSec 3DES: 408-728-0152   | anonymous networks, digital pseudonyms, zero
W.A.S.T.E.: Corralitos, CA  | knowledge, reputations, information markets,
Higher Power: 2^2,976,221   | black markets, collapse of governments.
"National borders aren't even speed bumps on the information superhighway."
Tim May <tcmay@got.net> writes:
Monty writes:
The important thing is establishment of the custom. Most cryptoanarchists with class will pay. The way to do this is to make it clear from day one that it is not free software.
_This_ cryptoanarchist will almost _never_ pay for that which is free.
It depends what you call free. If someone puts a copy of eudora pro 5.67 or whatever other commercial software up on the eternity service, is that "free"? It is free to the extent that it is unlikely that you will be caught. It is even less likely that you will be caught if you are a nym using remailers mostly. The difference with an anonymous cryptoanarchist run software house is that you know that they probably aren't paying taxes on their sales. So you have additional assurance that they can't touch you if you use their software -- they'd have to blow their cover unless they just made a donation to AP or something :-)

People copy commercial software all the time. I suspect only some small fraction of personal use software is ever paid for. Companies are less likely to do this because they are larger and more worth going after, and because the SPA software police have caused some companies problems already.

The shareware, beggarware, or "charityware" approach of giving it away for free, and then demanding or begging payment, is partly an acknowledgement that there is no way to stop you redistributing it even if they demanded payment up front. Plus the advantage that you get free distribution; if people like it, it'll be all over the place in no time. Beggarware has worked for a few people, I think; perhaps id Software's DOOM was one example. Yes, most people didn't pay, but the free advertising more than compensated, I suspect.

We tend to frown on protocols which rely on "please give me money", or "and then we call the cops". Payment protocols should be clean.

One alternative which Intel is brewing up for us is clipper CPUs which do things against their owners' interests, like encrypted instruction schemes... nasty stuff. This would then be used to ensure your copy of the software won't run on other people's machines. That could be enabling technology for all sorts of snooping.
One idea which I did like was the open bidding for a product to be developed: people interested in the feature or product pay how much it is worth to them up front, and developers bid to take the job on. After it's done the software is freeware. Supporting ideas discussed before on this topic are that first to complete gets paid, or lowest bidder presents a completion bond, and that there are independent arbitrators checking quality.

There was someone who posted to the list that he was going to set this up. He even purchased the domain name if I recall. What happened, did anything come of it?

Adam

--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
Monty writes:
What language did you have in mind? modula-3? iso-pascal/borland pascal?
Hadn't thought of Modula-3, but that is an excellent idea. Java would be a candidate, although performance is an issue.
Java isn't so bad. Just in time compilers are being shipped with some browsers (Netscape on some platforms). Sun now has a native bigint library (good for you know what, and exportable because they haven't included the few lines required to implement public key encryption with it).

Modula-3 is also fairly clean... garbage collection, bounds checked arrays, etc. A `safe' language. There is indeed an m3 -> c translator. I think GNU were working on an m3 module for their retargetable gcc compiler suite (to go with Ada, C, Fortran, Pascal). I did hear talk of a Java gcc front end also.
While it may seem crazy to toss compatibility, it has some advantages. For instance, the only people who will use it are the hardcore types. I like the idea of an exclusive crypto system that only cool people who are fairly with it use.
I think there would be advantages to always tunneling the encrypted messages inside PGP, that way no snoops know you're using it, until it's too late.
How about finding a way to eliminate indicators? It's nice not to leak any information at all. It also makes super-encryption stronger. If each encrypted layer looks like noise, the total key size is the sum of the bits of each layer.
That's a reasonable idea if you're going to use things like IDEA( k1, 3DES( k2, data ) ).
If the protocol doesn't accept multiple keys for a message it is slightly CAK/GAK resistant. And, it's a nice statement of intent. It's also nice to have tools which help you to behave securely without carefully thinking about it when you are using them.
Forward secrecy is what you want for GAK resistance -- can't get much more GAK hostile than burning your keys seconds after message receipt.
The randseed.bin file has always bothered me. What we really want is some good sources of entropy in which we have tremendous confidence.
What's wrong with the randseed.bin and the public and private key rings is that they should all be encrypted with a key derived from your passphrase.
If you use PGP's random pool, one suspects that if IDEA becomes attackable at some point in the future the random pool will start to look more like a predictable PRNG to the attacker.
I wonder how good linux's /dev/urandom would be if MD5 becomes even more suspect.
Well, neither of these would be good for a one time pad, of course.
Linux's /dev/random might not be bad. There is some real entropy in key strokes and mouse movements, and they are quite conservative about entropy estimation. You can easily make it more conservative -- XOR together a load of it to derive a smaller key.
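The XOR-folding Adam describes might look like the sketch below: draw more bytes than you need from the entropy source and fold them down, so each output byte depends on several sampled bytes. os.urandom stands in here for /dev/random.

```python
import os

def fold(sample: bytes, out_len: int) -> bytes:
    """XOR-fold a large entropy sample down to out_len bytes.

    If the source's per-byte entropy was overestimated, folding several
    sampled bytes into each output byte makes the estimate conservative.
    """
    assert len(sample) % out_len == 0, "sample must be a multiple of out_len"
    out = bytearray(out_len)
    for i, b in enumerate(sample):
        out[i % out_len] ^= b
    return bytes(out)

# e.g. fold 128 sampled bytes down to a 16-byte (128-bit) key,
# so each key byte is the XOR of 8 sampled bytes.
key = fold(os.urandom(128), 16)
```

The 8-to-1 ratio here is arbitrary; the more you fold, the less you need to trust the source's own entropy accounting.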
It would be neat to have, say, three sources of hardware randomness and then XOR the result with the above pseudo random output.
That'd be fine. Personally, not being a hardware type, I am suspicious of hardware RNGs... I can't tell when they are going wrong... failure modes can be dangerous (whoops, lead fell off the geiger tube), etc. Software and computers are easier to understand. Provided you XOR the lot together you should be fine though.
You know, it would be cool to define a software architecture for remailer software. Different remailers have different properties and different strategies. They may continue to differentiate.
One thing that is sorely needed in my opinion is a simple way to integrate with existing mailers. I.e. to have some code which acts as an interface between various extensible email based functions (remailers, signatures, DC nets, keyservers) and MUAs. Then someone gets to fight with the plugin API once, and after that we can automatically add features which work with lots of MUAs. The future is in extensibility. Java is pretty good for this.

Adam

--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/

print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`
On Mon, 3 Nov 1997, Mix wrote:
Adam Back wrote:
Monty Cantsin writes:
The PGP source code is not the worst I've ever seen, but it's kind of odd.
I had a go at doing something with it (I'll let you know when I get it to work) -- I had the damnedest job figuring out what was going on. The problem I had understanding it was all of the nested functions called through vectors of functions and handler functions. Makes it hard to inspect what will happen without running the code under a debugger -- lots of control flow is decided at run time.
So in other words, even though we have the source code we don't really have confidence that we can tell what it is doing.
There were only a few obscure points. Most of it can be determined from hex dumps of the files produced. I had public key management before I saw the source code. Since I have a version that does not use any PGP source but interoperates fully with PGP 5.x, I think I can tell precisely what it is doing.
Even though the source code is available, I don't think it has been studied all that carefully. For example, hardly anybody knew that the PGP 5.0 source had CAK features lurking in it. Or, remember that bug with the random number generator? As I recall (i.e., feel free to correct me), it was in Colin Plumb's code and he found it himself. This would imply that it got by whoever went over the code when it was released. Not reassuring.
There is that section for CAK in a future-expansion note in Vol. 1 of the source. There is probably a note in the CP archives I posted eons ago, long before the 5.5 controversy spawned all the traffic about CAK. It was near the new passphrase hash, which was one thing I could not figure out without the source.
C is a big part of the problem. Also, PGP was originally designed to operate on some fairly slow machines and they tried very hard to optimize the hell out of it. Now, however, cycles are a lot cheaper. I think we should give up speed for clarity. Slow code that we can really trust is better.
C is not part of the problem; people who aren't good at abstraction are. It would look even more horrid in Modula-2 (3?). If you have something go through three layers instead of one, it will be more complicated. For instance, I did something more like:

void (*hashstart[])(void) = { md5start, sha1start, ripemd160start };

instead of the layering.

--- reply to tzeruch - at - ceddec - dot - com ---
participants (5)
-
Adam Back
-
Bill Frantz
-
Mix
-
nospam-seesignature@ceddec.com
-
Tim May