Just recently, I bought the current issue of Wired. I've not looked at the magazine much since some time in its second or third year of publication, and of course now I remember why.

Anyhow, as I was chuckling at the endless stream of articles and advertisements telling us how wonderful it will be when we can finally enjoy ballroom dancing as we fly about between cities and countries in great big dirigibles, I noticed a single paragraph in one article which brings to mind a question. On Page 181, Bruce Sterling wrote, "I think the intermediating people running the means of production need to be exterminated -- a lot of them are basically war criminals. I want to empower designers. I want them walking across the landscape like a colossus."

I'd like to know if Mr. Sterling has eaten Mr. May's brain. I note that I haven't seen him posting here of late.

Regards, Steve -- Antidisintermediationalist and Posi-Trak advocate at large.
At 11:23 PM +0000 12/24/00, Steve Thompson wrote:
Just recently, I bought the current issue of Wired. I've not looked at the magazine much since some time in its second or third year of publication, and of course now I remember why.
Anyhow, as I was chuckling at the endless stream of articles and advertisements telling us how wonderful it will be when we can finally enjoy ballroom dancing as we fly about between cities and countries in great big dirigibles, I noticed a single paragraph in one article which brings to mind a question. On Page 181, Bruce Sterling wrote,
"I think the intermediating people running the means of production need to be exterminated -- a lot of them are basically war criminals. I want to empower designers. I want them walking across the landscape like a colossus."
I'd like to know if Mr. Sterling has eaten Mr. May's brain. I note that I haven't seen him posting here of late.
I haven't been posting here a lot for various reasons.

First, the quality of the responses has not been good. It seems repartee and tired Nazi vs. Stalinist debate are the norm, with Choatian physics and Choatian history filling in the gaps.

Second, and perhaps related to the first point, a lot of folks have retreated to the safety of filtered lists, where Lewis and Perry can screen messages for them. (Though I have noticed that a lot of _political_ messages get cross-posted from Perrypunks, where they are not supposed to exist, over to Cypherpunks. So much for the safe haven of having list monitors limiting "off-topic" discussion.)

Third, "been there, done that." Most of the topics surfacing now are not new. Most were beaten to death by 1993. In fact, most of the tangentially-crypto stuff is actually _less_ interesting than the stuff in 1993 was. See the topics in 1992-1994 and compare them to the topics in 1998-2000.

Fourth, as with my new .sig, the election has caused me to "move on," at least until the direction of things is determined.

As for what Bruce says in the above quote: nothing different from what he's been saying for decades. He speaks of liquidating middlemen, I speak of liquidating tens of millions of welfare varmints, useless eaters, and politicians. And for this they call him a visionary and me a Nazi. Go figure.

--Tim May -- Timothy C. May tcmay@got.net Corralitos, California Political: Co-founder Cypherpunks/crypto anarchy/Cyphernomicon Technical: physics/soft errors/Smalltalk/Squeak/agents/games/Go Personal: b.1951/UCSB/Intel '74-'86/retired/investor/motorcycles/guns
Tim expounds:
I haven't been posting here a lot for various reasons.
First, the quality of the responses has not been good. It seems repartee and tired Nazi vs. Stalinist debate is the norm, with Choatian physics and Choatian history filling in the gaps.
It's been a slow politics and cryptography year. The list is full of spam, and vandals keep subscribing it to other mailing lists. Perhaps next year will be better. I'm almost beginning to feel that Cryptology has achieved the status of a "Mature Science."
Second, and perhaps related to the first point, a lot of folks have retreated to the safety of filtered lists, where Lewis and Perry can screen messages for them.
I'm currently amusing myself on DetweilerPunks. Also known as Theory-Edge, moderated by Vladimir Z. Nuri. http://www.egroups.com/group/theory-edge, if anyone wants to visit.
Fourth, as with my new .sig, the election has caused me to "move on," at least until the direction of things is determined.
Yes, a tasteful .sig designed not to cause public alarm, until the Shrub Administration's interpretation of our Constitution is clarified. I suspect we are entering an era in which even vague hints concerning a sticky end for tyrants can get one arrested.
He speaks of liquidating middlemen, I speak of liquidating tens of millions of welfare varmints, useless eaters, and politicians.
And for this they call him a visionary and me a Nazi. Go figure.
You need to moderate your views on non-producing eaters in the same way you moderated your .sig file. A new Tim for a new decade. So, when's the next Jim Bell trial? Anyone know? -- Eric Michael Cordian 0+ O:.T:.O:. Mathematical Munitions Division "Do What Thou Wilt Shall Be The Whole Of The Law"
On Sun, 24 Dec 2000, Eric Cordian wrote:
Perhaps next year will be better. I'm almost beginning to feel that Cryptology has achieved the status of a "Mature Science."
It's my impression that mature sciences don't have the same kind of foundational or engineering problems cryptography does. We still see surprises about what a "definition of security" should be, even in the public-key setting where people have investigated such things for nearly 20 years. Plus even when we figure that out, we'll still have to deal with the fact that the models used in theoretical crypto don't deal with some of the attacks possible in real life -- timing and power analysis come to mind. As does the van Someren and Shamir trick for finding keys because they look "too random." To say nothing of the nasty fact that passphrases, and therefore keys based on them, aren't random at all. Which does not play nice with models which assume keys are picked randomly. It may be true that this year was a lull in "interesting" cryptographic research (I don't know if that's quite true), but it doesn't seem to be because too many problems are solved. Rather, there are lots of open problems left which no one seems to know how to solve... -David
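The van Someren and Shamir trick mentioned above is easy to sketch: key material stands out in a disk or memory image precisely because it looks more random than the code and text around it. A toy illustration (my own construction, not their actual algorithm, with window size and threshold picked for demonstration): slide a window across an image and flag regions whose byte entropy is anomalously high.

```python
import math
import os
from collections import Counter

def byte_entropy(window: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0..8)."""
    counts = Counter(window)
    n = len(window)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def find_keylike_regions(image: bytes, window: int = 64, threshold: float = 5.0):
    """Flag aligned windows whose entropy is well above typical text/code."""
    hits = []
    for off in range(0, len(image) - window + 1, window):
        h = byte_entropy(image[off:off + window])
        if h > threshold:
            hits.append((off, h))
    return hits

# A low-entropy "program image" with 64 random key bytes buried at offset 128.
text = b"All work and no play makes Jack a dull boy. " * 8
image = text[:128] + os.urandom(64) + text[128:]

assert [off for off, h in find_keylike_regions(image)] == [128]
```

Real scanners have to be subtler (compressed data also looks random), but the principle is just this: keys which are random enough to be secure are random enough to be conspicuous.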
At 09:55 PM 12/25/00 -0500, dmolnar wrote:
On Sun, 24 Dec 2000, Eric Cordian wrote:
Perhaps next year will be better. I'm almost beginning to feel that Cryptology has achieved the status of a "Mature Science."
It's my impression that mature sciences don't have the same kind of foundational or engineering problems cryptography does.
Infosec is essentially a branch of 'safety systems' (see Leveson et al.), where you also look at all possible consequences of failure. The same problem -- the strength of the system is the strength of the weakest element -- dominates both arts.
At 9:50 PM -0500 12/25/00, dmolnar wrote:
On Sun, 24 Dec 2000, Eric Cordian wrote:
Perhaps next year will be better. I'm almost beginning to feel that Cryptology has achieved the status of a "Mature Science."
It's my impression that mature sciences don't have the same kind of foundational or engineering problems cryptography does. We still see surprises about what a "definition of security" should be, even in the public-key setting where people have investigated such things for nearly 20 years. Plus even when we figure that out, we'll still have to deal with the fact that the models used in theoretical crypto don't deal with some of the attacks possible in real life -- timing and power analysis come to mind. As does the van Someren and Shamir trick for finding keys because they look "too random."
Parts of cryptology are in math, e.g., number theory. And parts are in economics. And parts are even in human psychology.

Some of the foundations are, of course, "mature"...and not very exciting. The core of mathematical crypto is hardly frontier mathematics. (Yeah, I suppose Dave and Eric and a few others could make a case that there's some connection with the proof of Fermat's Last Theorem, stuff about elliptic functions, etc. But we all know that such connections are tenuous. Most of crypto still is built around good old number theory, basically what has been known for dozens of years, even centuries. Euler would not have had a problem understanding RSA.)

The "far out" stuff of reputations, multi-player games, digital money, etc., is much less grounded in theory. More interdisciplinary, more "fuzzy," more prone to hand-waving. Doesn't mean this isn't the interesting area, just means it's not as "foundational" as math areas are. Reductionists who seek the rigor of a pure science often end up throwing out what's interesting.

As many of us have noted over the years, and as Austin Hill recently noted vis-a-vis the ZKS technologies, the status of these things is roughly where mathematical ciphers ("pure crypto") were in, say, 1970. Some interest, some popularizations, some secret work at NSA and related places, but no serious academic coverage. By academic coverage I mean researchers studying weaknesses in various kinds of data havens, digital currencies, reputation systems, etc., in the same way that the "Crypto Conference" folks looked at various ciphers. (And specific digital currency systems, for example.)

Crypto systems, using a mix of crypto tools, are only slowly taking off. In fact, the focus keeps moving back to simple encryption, depressingly enough! Someday, more complex systems will actually be deployed.

An interesting way to look at such systems is to think back to many examples of engineered systems. Steel buildings, for example.
The "basic science" of steel, its strength and properties, was basically well-understood a century ago. A bit of later science happened, through understanding of things like martensitic transitions and dislocations, etc. But most of the foundational science was laid a long time ago. And yet buildings collapsed, engineers figured out new ways to bolt together beams, and taller and taller buildings were erected. Crypto systems will be a lot like that.

(And, as I have been saying for close to 10 years, the insurance industry will be a driver of new approaches. Newer safes were bought not because store and bank owners were "educated" about security (the precise analogy to security today), but because insurance premiums were lessened with better safes. Discounted present value, DPV, speaks louder than all of the moralizing and lecturing.)
It may be true that this year was a lull in "interesting" cryptographic research (I don't know if that's quite true), but it doesn't seem to be because too many problems are solved. Rather, there are lots of open problems left which no one seems to know how to solve...
I go further: the academic community is largely uninterested in, or unmotivated by, or unable to get funding for, the "Cypherpunkish" areas. Possibly this is because most fields are not interdisciplinary, so a researcher is more likely to study a pure math approach than to mix in economic/market issues. (E.g., our "Hayekian" sensibilities make a lot of sense to nearly every smart person who gets exposed to them, but such approaches smack of voodoo economics, to coin a phrase, to many pure researchers. I cite this as just one facet of the issue. And, by the way, the Hayekian approach fits right in with "building skyscrapers," though not for the writing of papers about dislocation propagation in high-tensile steels.)

In other words, it's time to get crypto out of the math and computer science departments and put it in the engineering departments where it belongs.

--Tim May -- Timothy C. May tcmay@got.net Corralitos, California Political: Co-founder Cypherpunks/crypto anarchy/Cyphernomicon Technical: physics/soft errors/Smalltalk/Squeak/agents/games/Go Personal: b.1951/UCSB/Intel '74-'86/retired/investor/motorcycles/guns
On Mon, 25 Dec 2000, Tim May wrote:
Some of the foundations are, of course, "mature"...and not very exciting. The core of mathematical crypto is hardly frontier mathematics. (Yeah, I suppose Dave and Eric and a few others could make a case that there's some connection with the proof of Fermat's Last Theorem, stuff about elliptic functions, etc. But we all know
I don't think I'd go that far. As far as I'm concerned, elliptic curves are just another group to do Diffie-Hellman & friends in. What I'd call the "core" of mathematical crypto is the work that Goldreich, Goldwasser, Micali, et al. have been doing over the past fifteen years -- trying to rough out just what kind of assumptions are necessary and sufficient to give us the kind of cryptography we want. That being said, almost none of it works without those pesky one-way functions, or trapdoor one-way functions, and we have too few examples of either.
that such connections are tenuous. Most of crypto still is built around good old number theory, basically what has been known for dozens of years, even centuries. Euler would not have had a problem understanding RSA.)
That's true, and in some sense it's a good thing - we have some confidence that these problems are hard because "Euler worked on them." (On the other hand, Euler didn't have the ability to experiment that today's mathematicians do.) In another sense, it's a bad thing, because the number of one-way functions we have is so small. To say nothing of trapdoor one-way functions...
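The point that RSA is Euler-era number theory fits in a few lines. A toy sketch with tiny illustrative primes (unpadded "textbook" RSA like this is, of course, not secure in practice):

```python
# Everything here except the application would have been familiar to Euler.
p, q = 61, 53                 # toy primes; real keys use primes of ~1024 bits
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent: e*d = 1 (mod phi)

m = 42                        # a message, encoded as a number < n
c = pow(m, e, n)              # encrypt
assert pow(c, d, n) == m      # decrypt works by Euler's theorem: m^(e*d) = m (mod n)
```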
The "far out" stuff of reputations, multi-player games, digital money, etc., is much less grounded in theory. More interdisciplinary, more "fuzzy," more prone to hand-waving. Doesn't mean this isn't the interesting area, just means it's not as "foundational" as math areas are. Reductionists who seek the rigor of a pure science often end up throwing out what's interesting.
So I have noticed. (And something I have to caution myself against every day.)
By academic coverage I mean researchers studying weaknesses in various kinds of data havens, digital currencies, reputation systems, etc., in the same way that the "Crypto Conference" folks looked at various ciphers. (And specific digital currency systems, for example.)
Reminds me of the reaction I got when I asked some friends about doing a term project on mix-nets. "So, has there been any recent academic work on this?" There's some hope. There was a workshop on "Design Issues in Anonymity and Unobservability" this past summer which brought people together to talk about these issues. The Info Hiding Workshops are still going strong. With luck, this year's IHW may have a paper on reputations in it... This year's ACM CCS conference had two papers of special interest. The "Hordes" paper, _A protocol for anonymous communication over the Internet_ by Clay Shields and Brian Neil Levine, gives a definition of anonymity which seems convincing. Then the paper by Franklin and Durfee on "Distribution Chain Security" discusses the problems of dealing with contracts in a distribution chain. They have to balance the rights of buyers, sellers, and various middlemen - and develop some cute cryptographic tricks to do it. Obfuscated contracts, zero-knowledge proofs, and special "contract certifiers" make an appearance. It wouldn't surprise me if this ended up having application beyond the content distribution network scenario they propose.
Crypto systems, using a mix of crypto tools, are only slowly taking off. In fact, the focus keeps moving back to simple encryption, depressingly enough!
Depressingly enough, we keep finding that the focus *needs* to move back to simple encryption. Birgit Pfitzmann published a paper in the 1980s on "How To Break the Direct-RSA Implementation of MIXes." Today, nearly fifteen years later, we still don't know "really" what we need from an encryption system for MIXes; David Hopwood has some good thoughts, but we're not done yet.

On the other hand, we can set this against the fact that we have a bunch of remailers, and they seem to work. They may be unreliable, but no one seems to have used padding flaws to break a remailer, as far as we know.
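The simplest member of the family of problems Pfitzmann studied is easy to demonstrate. (This toy shows deterministic encryption failing in a mix, which is the most basic failure mode; her actual attack goes further, exploiting RSA's multiplicative structure.) If the mix's encryption is unpadded and deterministic, an observer can re-encrypt guessed plaintexts and match them against the input batch:

```python
# A tiny RSA "mix" that makes the classic mistake: deterministic encryption.
p, q, e = 1000003, 1000033, 65537      # toy primes; real mixes used far larger keys
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))

def encrypt(m: int) -> int:
    return pow(m, e, n)                # no random padding: same m, same c, always

# Senders hand encrypted messages to the mix, which shuffles and decrypts them.
batch = [encrypt(m) for m in (101, 202, 303)]

# A passive observer who can enumerate plausible plaintexts simply re-encrypts
# every guess and matches against the observed batch; the shuffle is useless.
guesses = {encrypt(m): m for m in range(1, 1000)}
recovered = [guesses[c] for c in batch if c in guesses]
assert recovered == [101, 202, 303]    # every "anonymous" message identified
```

Which is why whatever "we need from an encryption system for MIXes" turns out to be, randomized encryption is the bare minimum.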
(And, as I have been saying for close to 10 years, the insurance industry will be a driver of new approaches. Newer safes were bought not because store and bank owners were "educated" about security (the precise analogy to security today), but because insurance premiums were lessened with better safes. Discounted present value, DPV, speaks louder than all of the moralizing and lecturing.)
This may have to wait until liability issues in general for software are straightened out, won't it? More than that, if the "tragedy of the commons" really happens for Gnutella and Napster and friends, then people will look for ways to avert it. Maybe it won't happen ("The Cornucopia of the Commons"), but if it does, reputation systems might see some sudden interest.
In other words, it's time to get crypto out of the math and computer science departments and put it in the engineering departments where it belongs.
Actually, to read this message, it sounds more like it should be part of the economics department! There are people working on that. Joan Feigenbaum came to speak at Harvard last spring on her recent work on fair pricing for multicast trees; this was a case of finding the best algorithm in the face of an "adversary" model specified by economic considerations. Noam Nisan has a list of some academic research groups also working on this at http://www.cs.huji.ac.il/~noam/econcsgrps.html I suppose the next step is to apply this to cypherpunkish considerations... -David
At 2:42 AM -0500 12/26/00, dmolnar wrote:
On Mon, 25 Dec 2000, Tim May wrote:
Some of the foundations are, of course, "mature"...and not very exciting. The core of mathematical crypto is hardly frontier mathematics. (Yeah, I suppose Dave and Eric and a few others could make a case that there's some connection with the proof of Fermat's Last Theorem, stuff about elliptic functions, etc. But we all know
I don't think I'd go that far. As far as I'm concerned, elliptic curves are just another group to do Diffie-Hellman & friends in. What I'd call the "core" of mathematical crypto is the work that Goldreich, Goldwasser, Micali, et al. have been doing over the past fifteen years -- trying to rough out just what kind of assumptions are necessary and sufficient to give us the kind of cryptography we want.
Has there really been much progress in the last ten years? I remember the flurry of ground-breaking work in the mid-80s, and it was much in the air at the first "Crypto Conference" I attended in 1988 (also the last such conference I attended, for various reasons).

Something I expected to have happened long ago was the encapsulation of these protocols into building blocks: libraries/classes/reusable objects that could be bolted together by those building systems. ("Let's take EncryptStream and throw in EscrowService and then add ObliviousTransfer...")

This is partly what I mean by "devolving back to basic ciphers." It seems that when all is said and done, the only real "core module" we have is basic encryption. And even that is not treated as a module (yeah, I know that RSA is only recently unencumbered by patents). Some stuff with signatures, too, but basically very similar.

In short, the world doesn't look very different than it did in 1990. The Web is different, but not in how users send messages and files back and forth.
Depressingly enough, we keep finding that the focus *needs* to move back to simple encryption. Birgit Pfitzmann published a paper in the 1980s on "How To Break the Direct-RSA Implementation of MIXes." Today, nearly fifteen years later, we still don't know "really" what we need from an encryption system for MIXes; David Hopwood has some good thoughts, but we're not done yet.
On the other hand, we can set this against the fact that we have a bunch of remailers, and they seem to work. They may be unreliable, but no one seems to have used padding flaws to break a remailer, as far as we know.
Yes, and those remailers are not much different than what we specc'ed out at the very first Cypherpunks meeting. That they work as well as they do relates to the economics point.

A digression: One of the conventional models for a cryptographic attack is that an attacker gets to take a problem back to his lab and torture it to death, i.e., throw as much computer power against a cipher as he wishes. This is a reasonable model for ciphers. However, mix-nets and such need to have some economic considerations. It costs money and effort to subvert certain nodes and alter message padding, times of sending, etc. An attack on a mix-net is not the same as taking the whole net back into NSA basements and proceeding to diddle it to death. Chaum, Pfitzmann, et al. of course refer to n-out-of-m sorts of collaborations, but left unsaid is the cost of such collaborations. A start, but missing a lot.

That such a simple implementation of Chaum's mix-net (it had to be simple, as I was the one who specc'ed out most of the features a remailer network needed to have, and Eric Hughes implemented some of them in Perl, then Hal Finney added PGP a few weeks later) has not had a known major attack is a tribute to the difficulty of actually subverting enough nodes in a mix-net. (Nodes in different countries, nodes operated more-or-less on automatic pilot, nodes which mail to _themselves_, nodes which are "middleman only," etc.)

Crypto does encompass the idea of a "work factor," of course. Usually expressed as MIPS-years or somesuch. This needs to be extended in some rough way to include the costs of soliciting cooperation or collusion, etc. Without such inputs, how could a heterogeneous mix of remailers be analyzed?
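A first cut at those missing cost inputs takes nothing but arithmetic. Assuming (my assumptions, purely for illustration) that hops are chosen independently and every node costs about the same to subvert, the collusion "work factor" looks like this:

```python
import math

def trace_probability(controlled: int, total: int, path_len: int) -> float:
    """Chance that an adversary owning `controlled` of `total` nodes sees
    every hop of one message, if hops are chosen independently at random."""
    return (controlled / total) ** path_len

def subversion_budget(cost_per_node: int, total: int, path_len: int,
                      target_prob: float):
    """Roughly how many nodes (and how much money) buys a given trace odds."""
    frac = target_prob ** (1.0 / path_len)
    nodes = math.ceil(frac * total)
    return nodes, nodes * cost_per_node

# 20 remailers, 5-hop paths: owning half the net traces ~3% of messages...
assert trace_probability(10, 20, 5) == 0.03125
# ...and tracing half of all messages at $10k per subverted node costs $180k.
assert subversion_budget(10_000, 20, 5, 0.5) == (18, 180_000)
```

Crude, but it already shows the defender's cheapest move: adding hops drives the adversary's required share of the network toward 100%, and spreading nodes across jurisdictions drives the per-node cost up.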
(And, as I have been saying for close to 10 years, the insurance industry will be a driver of new approaches. Newer safes were bought not because store and bank owners were "educated" about security (the precise analogy to security today), but because insurance premiums were lessened with better safes. Discounted present value, DPV, speaks louder than all of the moralizing and lecturing.)
This may have to wait until liability issues in general for software are straightened out, won't it?
It could happen absent any action on the legal front. Pure anarcho-capitalism, a la the Law Merchant (law of the seas, with no nation having legal jurisdiction). Lloyds of London was underwriting shipping before there was much concern about international legal dispute resolution. Computer security and information theft is not the same thing as ships going down, so the evolution will be different. But, no, I don't think such systems will have to wait until liability issues are resolved.
In other words, it's time to get crypto out of the math and computer science departments and put it in the engineering departments where it belongs.
Actually, to read this message, it sounds more like it should be part of the economics department! There are people working on that. Joan Feigenbaum came to speak at Harvard last spring on her recent work on fair pricing for multicast trees; this was a case of finding the best algorithm in the face of an "adversary" model specified by economic considerations.
Indeed, I cited economics in a major way. Hal Varian at Berkeley is also interested in such issues, and a former Cypherpunks list member, Robin Hanson, got his Ph.D. at Caltech on these kinds of issues. (Robin is the main "idea futures" guy.) One key issue is that not a lot of econ folks are good at crypto-type protocols, and vice versa. Different departments, different standards for advancement and academic fame. But I already alluded to this, so no need to expand on this here. Multi-agent systems, evolutionary game theory, and combinatorial game theory are some of the other areas I think are critical. Ecologies of agents interacting with each other via various protocols for identity, values of goods traded, pricing, auctions, escrows, etc. --Tim May -- Timothy C. May tcmay@got.net Corralitos, California Political: Co-founder Cypherpunks/crypto anarchy/Cyphernomicon Technical: physics/soft errors/Smalltalk/Squeak/agents/games/Go Personal: b.1951/UCSB/Intel '74-'86/retired/investor/motorcycles/guns
On Tue, Dec 26, 2000 at 10:38:36AM -0800, Tim May wrote:
| >I don't think I'd go that far. As far as I'm concerned, elliptic curves
| >are just another group to do Diffie-Hellman & friends in. What I'd call
| >the "core" of mathematical crypto is the work that Goldreich, Goldwasser,
| >Micali, et al. have been doing over the past fifteen years -- trying to
| >rough out just what kind of assumptions are necessary and sufficient to
| >give us the kind of cryptography we want.
|
| Has there really been much progress in the last ten years? I remember
| the flurry of ground-breaking work in the mid-80s, and it was much in
| the air at the first "Crypto Conference" I attended in 1988 (also the
| last such conference I attended, for various reasons).

Depends on your definition of progress. I think that the work that esp. Goldreich has been doing in the foundations of cryptography (i.e., http://www.toc.lcs.mit.edu/~oded/tfoc.html) is very exciting stuff, because it pushes us towards a solid grounding for systems, and away from the need for one of a dozen or so really solid cryptanalysts to look at each system published.

Is this progress in the space of librarization, standardization, or economics of security? No. But we need stronger foundations in both security and crypto in order to justify the investments in them. When a company can spend really large sums of money for only small assurance that its systems are more secure, it's a hard decision to justify. (Not that there aren't justifications; they're just non-obvious.) When those investments are buttressed by an understanding that the features will work as planned, they'll be easier to make.

Speaking for myself, Adam -- "It is seldom that liberty of any kind is lost all at once." -Hume
On Tue, 26 Dec 2000, Tim May wrote:
Has there really been much progress in the last ten years? I remember the flurry of ground-breaking work in the mid-80s, and it was much in the air at the first "Crypto Conference" I attended in 1988 (also the last such conference I attended, for various reasons).
Some things come to mind:

1) Efficient realizations of provably secure cryptosystems.

- Optimal Asymmetric Encryption Padding (OAEP), 1994. Bellare and Rogaway show a way to do secure RSA encryption without falling victim to any of the nasty padding attacks found by Coppersmith, Hastad, and others.

- Probabilistic Signature Scheme (PSS), 1996. How to properly pad your signatures. Again Bellare and Rogaway.

Both OAEP and PSS rely on the so-called "random oracle assumption," which needs its own message to explain. Suffice it to say that if you buy that assumption, you end up with schemes which are 99% as efficient as the "naive" way of doing things.

Of course, there *were* padding methods around before OAEP and PSS. Recent events have shown that some of them weren't a good idea - in particular the padding in the old PKCS #1 v1.5 standard has come in for a beating. More recently, people have come up with methods which don't need the "random oracle assumption" (Cramer and Shoup most notably), but they're still not as efficient as OAEP and PSS.

This is a big deal because it opens the way for "practice-oriented provable security" -- all that beautiful theory people were excited about in 1988 makes its way into real products.

2) Commercialization and availability of SSL/TLS

Ten years ago, we didn't have a good engineering-level spec for a protocol to encrypt streams. Now, three or four iterations of SSL later, we do. Plus the www.openssl.org effort makes this easy to use - some random undergrad can knock together an SSL-enabled application in a short time. Now that we're *finally* going to get some good documentation in the form of Eric Rescorla's book, maybe more people will take advantage of OpenSSL...

In addition, the failures of SSL 1.0 and 2.0 (should have) taught us about what Not To Do when designing protocols. Like "protect your protocol negotiation"...
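The OAEP construction in point 1 is small enough to sketch. A simplified version (my simplification: fixed sizes, with SHA-256 standing in for the mask generation functions; the real PKCS #1 v2 scheme uses MGF1 and careful length bookkeeping): mask the message with G(r), mask the randomness r with H of the masked message, and hand the whole block to RSA.

```python
import hashlib
import os

K = 32  # bytes: size of r and of the message block (SHA-256 output size)

def G(r: bytes) -> bytes:   # expands r into a mask for the message half
    return hashlib.sha256(b"G" + r).digest()

def H(x: bytes) -> bytes:   # masks r using the already-masked message half
    return hashlib.sha256(b"H" + x).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def oaep_pad(m: bytes) -> bytes:
    assert len(m) == K
    r = os.urandom(K)           # fresh randomness every time
    X = xor(m, G(r))            # message half, masked
    Y = xor(r, H(X))            # randomness half, masked
    return X + Y                # RSA then exponentiates this whole block

def oaep_unpad(block: bytes) -> bytes:
    X, Y = block[:K], block[K:]
    r = xor(Y, H(X))            # recover the randomness first...
    return xor(X, G(r))         # ...then unmask the message

m = b"0123456789abcdef0123456789abcdef"
assert oaep_unpad(oaep_pad(m)) == m
assert oaep_pad(m) != oaep_pad(m)   # same message, different padded blocks
```

The last line is the whole point: identical plaintexts no longer produce identical RSA inputs, which is what kills the guess-and-re-encrypt attacks.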
3) Differential Cryptanalysis, Linear Cryptanalysis, Block Cipher Design

Not particularly cypherpunkish or protocol-ish, but needs mentioning. Academic cryptographers rediscover NSA techniques and start us on the path to an AES competition. Look also at the analysis of Skipjack and how it was determined that 16 rounds were "just enough" -- we could not have done that in 1988 or 1990, I think. Although I should really leave that determination to someone with more experience in that area than I have.

4) Practical and Theoretical advances in Digital Cash

Today, I can go pick up the lucre library (either version) or Magic Money and produce a "Pretty Good Digital Cash" enabled application. There's still no reason on earth why I should do so, alas, but I can. Also now the Mojo Nation stuff may take off. To this you may add the various smart card and digital cash experiments and projects which went on.

Stefan Brands' techniques take the old "credential without identity" idea and make it practical and flexible. Plus his thesis and advocacy for an alternative to a hierarchical PKI will be useful in heading off the advent of "digital driver's licenses."

None of these are as fundamental or as ground-breaking as creating the first real definition of security for a cryptosystem. But they follow up on those first efforts and make it easier to do what we want with cryptography.
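The primitive under those "Pretty Good Digital Cash" libraries is Chaum's blind RSA signature, which also fits in a few lines. A toy sketch (tiny illustrative parameters, hashing details simplified; lucre itself, as I understand it, uses a Diffie-Hellman-based variant to dodge the blinding patent):

```python
import hashlib
import math
import secrets

# The bank's RSA key (toy sizes; real coins need full-length moduli).
p, q, e = 1000003, 1000033, 65537
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))

def H(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# The withdrawer blinds a coin's serial number with a random factor r...
coin = b"serial-number-31337"
r = secrets.randbelow(n - 2) + 2
while math.gcd(r, n) != 1:
    r = secrets.randbelow(n - 2) + 2
blinded = (H(coin) * pow(r, e, n)) % n

# ...the bank signs what it cannot read (and debits the withdrawer)...
blind_sig = pow(blinded, d, n)

# ...and the withdrawer divides r back out, leaving an ordinary RSA
# signature on a coin the bank has never seen. Spending is unlinkable.
sig = (blind_sig * pow(r, -1, n)) % n
assert pow(sig, e, n) == H(coin)
```

The algebra is one line: (H(c) * r^e)^d = H(c)^d * r mod n, so dividing by r leaves exactly the signature H(c)^d.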
Something I expected to have happened long ago was the encapsulation of these protocols into building blocks: libraries/classes/reusable objects that could be bolted together by those building systems. ("Let's take EncryptStream and throw in EscrowService and then add ObliviousTransfer...")
There have been two major problems, as far as I can see, which have held this up. First, the theory has not always been cooperative. Second, despite the myriad of amazing special protocols which have appeared over the years, we have only a few core protocols which people seem to want badly enough to justify moving from the CRYPTO paper to the engineering stage. You can point the finger at patents, too, if you like - but they may have helped as well as hurt (I'll explain why in a bit).

When I say "the theory has not always been cooperative," I mean such things as the fact that zero-knowledge proofs are not zero-knowledge under parallel composition or in a concurrent setting. That is, you "can't" take a ZKP out of a textbook and stick it in a situation where thousands of clients are interacting with a server all at once. At best you lose a security guarantee, at worst you get killed.

Composing two different kinds of protocols can have similar problems; one example of this would be an "anonymous" digital cash scheme which uses zero-knowledge proofs to prove coins are "well-formed"...and the withdrawer's identity is included in the ZKP. (Pfitzmann and Waidner, "How to Break Another 'Provably Secure' Payment Scheme"...and for fairness I should mention that there is a rebuttal.)

Then there's the fact that while we have a general result which tells us that "any function can be computed in secure multiparty computation," the constructions used to establish this result are impractical - it takes more work to find a particular protocol to do what you actually *want*, and there's no guarantee that it will be acceptably efficient. This is changing; people are finding more efficient ways to do "CryptoComputing" (Donald Beaver for one) and we have some cute new toys like ring-homomorphic commitments (Cramer and Damgard 1998, used in the "Distribution Chain Security" paper I cited previously), but we don't seem to be at the point yet where it's straightforward.
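To show why homomorphic commitments count as "cute new toys": in the older Pedersen scheme (my choice of illustration, not the Cramer-Damgard construction mentioned above), multiplying two commitments yields a commitment to the *sum* of the hidden values, which is the hook that lets protocols do arithmetic on numbers nobody has revealed.

```python
import secrets

# Toy Pedersen parameters: p = 2q + 1, with g and h generating the order-q
# subgroup of squares mod p. (Toy sizes; real use needs a ~2048-bit p and
# an h whose discrete log relative to g nobody knows.)
p, q, g, h = 2039, 1019, 4, 9

def commit(value: int):
    """Commit to `value`; returns the commitment and the opening randomness."""
    r = secrets.randbelow(q)
    return pow(g, value, p) * pow(h, r, p) % p, r

def open_check(c: int, value: int, r: int) -> bool:
    return c == pow(g, value, p) * pow(h, r, p) % p

c1, r1 = commit(30)
c2, r2 = commit(12)
assert open_check(c1, 30, r1)

# The homomorphism: the product of the commitments opens to the sum of the
# committed values, using the sum of the randomness.
assert open_check(c1 * c2 % p, 30 + 12, (r1 + r2) % q)
```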
*Each* new efficient protocol still yields a new paper. (That may be overstating things a bit; I have no idea how many protocol papers are rejected from conferences.) Ideally we'd have a situation akin to where complexity theory is with NP-completeness. You can't get new NP-completeness results published any more because the techniques have become standard things taught in introductory undergrad courses. If and when we get to that point, it'll be easier to just build the protocol yourself instead of trying to look it up somewhere.

I mentioned the problem of "only a few protocols that people actually want" in a separate e-mail. Face it, as cool as the Dining Cryptographers in the Disco is, who's going to implement it? And who besides cypherpunks would want to use it?

This is why the commercial research labs are important - they seem to actually implement and occasionally release their protocols. Bell Labs released PAK and PAK-X clients; I know that some of their other work has at least been prototyped. This wouldn't have happened without the incentive of patents to fund that research and development. Unfortunately, the flip side is that none of us are ever going to get to use that material, or at least not without selling our souls in a licensing deal. :-(
This is partly what I mean by "devolving back to basic ciphers." It seems that when all is said is done, the only real "core module" we have is basic encryption. And even that is not treated as a module (yeah, I know that RSA is only recently unencumbered by patents).
Some stuff with signatures, too, but basically very similar.
In short, the world doesn't look very different than it did in 1990. The Web is different, but not in how users send messages and files back and forth.
I agree -- but remember that the Web, and the "e-commerce" craze that came with it, has only been in widespread play since 1995 or 1996. That means we've only had 5 or 6 years in which nearly any idea with "crypto" associated with it could get funded.
A digression: One of the conventional models for a cryptographic attack is that an attacker gets to take a problem back to his lab and torture it to death, i.e., throw as much computer power against a cipher as he wishes. This is a reasonable model for ciphers.
There's also another point - it's supposed to be an "easier" model to deal with than the one you outline below. The famous question of Goldwasser and Micali is "Should a cryptosystem which leaks the last bit of the plaintext be considered 'secure'?" Their answer is twofold. First, "no." Second, *it doesn't matter*, since we can 1) state a better definition of security which captures the notion that "no information" leaks from the ciphertext, 2) give a cryptosystem which seems to meet this definition, and 3) actually give a *proof* (assuming a polynomial-time-limited adversary and the hardness of some problem) that it does meet this definition.

The thing to notice, however, is that if you can show that the last bit of the plaintext is _all_ the cryptosystem leaks, AND you can prove that your application doesn't need the last bit of the plaintext, then you don't need to go through these steps. The system is secure - for you. But figuring out what exactly you need is tough. The idea is that by considering a maximally powerful adversary, you can avoid having to analyse every protocol to see if it needs the last bit to be secure or not. The point Greg Broiles raises later is that as powerful as this adversary is, it is not "powerful enough" - for instance, it doesn't torture people.

Part of that seems to be a split between the cryptographic "science" of protocol design and the "engineering" of protocol implementation. Without passing judgement on whether this split is a Good Thing or not, I'll just note that the engineering is in its infancy compared to even ordinary software engineering.
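The Goldwasser-Micali construction itself is small enough to sketch. Here is a toy version in Python (the primes are demo-sized placeholders; real use needs large primes, and the scheme encrypts one bit at a time and is wildly wasteful). The point is that encryption is randomized, so under the quadratic residuosity assumption the ciphertext leaks nothing - not even the last bit:

```python
# Toy Goldwasser-Micali probabilistic encryption -- demo parameters only.
import math
import random

p, q = 499, 547          # both are 3 (mod 4), so -1 is a non-residue mod each
N = p * q
x = N - 1                # public "pseudosquare": Jacobi symbol +1, not a square mod N

def encrypt(bit):
    # fresh randomness each call => the same bit encrypts differently every time
    while True:
        y = random.randrange(2, N)
        if math.gcd(y, N) == 1:
            break
    return (y * y * pow(x, bit, N)) % N

def is_qr(c, prime):
    # Euler's criterion: c is a square mod prime iff c^((prime-1)/2) == 1
    return pow(c, (prime - 1) // 2, prime) == 1

def decrypt(c):
    # private key = the factorization of N; c is a square mod N
    # iff it is a square mod both p and q
    return 0 if (is_qr(c % p, p) and is_qr(c % q, q)) else 1

bits = [1, 0, 1, 1, 0]
assert [decrypt(encrypt(b)) for b in bits] == bits
```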
However, mix-nets and such need to have some economic considerations. It costs money and effort to subvert certain nodes and alter message padding, times of sending, etc. An attack on a mix-net is not the same as taking the whole net back into NSA basements and proceeding to diddle it to death.
Chaum, Pfitzmann, et al. of course refer to n-out-of-m sorts of collaborations, but left unsaid is the cost of such collaborations. A start, but missing a lot.
Yes - even today, most people seem to consider models with a threshold of failure. If the adversary has less than n/2 of the nodes, you're safe. If it has more, you're toast. I can't think of a model which allows gradual failure or probabilistic failure off the top of my head.
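The all-or-nothing threshold model being described is exactly what Shamir secret sharing formalizes: below the threshold the adversary learns nothing, at the threshold everything, with no notion of gradual or probabilistic failure in between. A minimal sketch (the field size and secret are arbitrary demo choices):

```python
# Shamir (k, n) threshold secret sharing over a prime field -- demo values.
import random

PRIME = 2**61 - 1   # a Mersenne prime, big enough for a demo secret

def share(secret, k, n):
    # random degree-(k-1) polynomial with constant term = secret
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(points):
    # Lagrange interpolation at x = 0 recovers the constant term
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # modular inverse of den via Fermat's little theorem
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = share(123456789, 3, 5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of 5 shares suffice
assert reconstruct(shares[2:]) == 123456789
```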
(Nodes in different countries, nodes operated more-or-less on automatic pilot, nodes which mail to _themselves_, nodes which are "middleman only," etc.)
Note that each of these speaks to different problems facing a remailer network. Nodes in different countries give jurisdictional arbitrage, and so work against a geographically local adversary. The automatic pilot works against an adversary which is observing users and trying to see which are operators. Self-mailing nodes are tied up with traffic analysis. The middleman-only nodes seem to be tied up with the abuse problem...and so on. All of these are separate attacks to be pulled out and inspected, and then those techniques can be evaluated...
Crypto does encompass the idea of a "work factor," of course. Usually expressed as MIPS-years or somesuch. This needs to be extended in some rough way to include the costs of soliciting cooperation or collusion, etc. Without such inputs, how could a heterogeneous mix of remailers be analyzed?
By assuming the worst about all of them - you turn it into a homogeneous mix by assuming the worst. Pow - no more heterogeneous problem, but with the drawbacks we all know...

Pulling back from that is difficult. First, it may be hard to model, especially for those of us without an economics background. Second, it runs counter to the way "things are done" (but you knew that).

An aside -- from what I've seen, mostly the symmetric cipher people deal with "work factor" in MIPS-years. When you move to the public-key and protocol side of things, it's back to the "polynomial-time adversary." This didn't change until Bellare and Rogaway started asking about "exact security," in which they wanted to know the exact probability that an adversary could recover information in a given amount of time if it could invert RSA or some such.
This may have to wait until liability issues in general for software are straightened out, won't it?
It could happen absent any action on the legal front. Pure anarcho-capitalism, a la the Law Merchant (law of the seas, with no nation having legal jurisdiction). Lloyds of London was underwriting shipping before there was much concern about international legal dispute resolution. Computer security and information theft is not the same thing as ships going down, so the evolution will be different.
Oh, OK. I was thinking about it in terms of the software vendor being insured against lawsuit in case its software failed. You seem to be referring to insurance issued to a business against its transaction failing. I hadn't considered that - it is indeed plausible.
Indeed, I cited economics in a major way. Hal Varian at Berkeley is also interested in such issues, and a former Cypherpunks list member, Robin Hanson, got his Ph.D. at Caltech on these kinds of issues. (Robin is the main "idea futures" guy.)
Indeed, now that I look at the page with the "list of research groups in crypto & economics", Varian is on it. There's also L. Jean Camp here at the Kennedy School - although she's more interested in the intersection of politics and cryptography, I think.
One key issue is that not a lot of econ folks are good at crypto-type protocols, and vice versa. Different departments, different standards for advancement and academic fame.
I don't know about anyone else, but it takes me a while to learn, and an undergraduate education is only so long. I do have a number of friends in economics (sometimes it seems like all my friends from high school are in econ...) but we don't talk about this. It's hard for me to know how to frame these questions to them. Plus I still have the naive hope that the right protocols will end up provable in the end... -David
At 02:42 AM 12/26/00 -0500, dmolnar wrote:
More than that, if the "tragedy of the commons" really happens for Gnutella and Napster and friends, then people will look for ways to avert it. Maybe it won't happen ("The Cornucopia of the Commons"), but if it does, reputation systems might see some sudden interest.
Napster itself suffers from tragedy of the inadequate business model, since it relies on centralized servers with no visible means of support (other than "with 20 million users we should be able to get revenue _somewhere_") and a potential for exponential growth in their legal costs if they get any revenue.

They do have a problem related to tragedy of the commons, which is a need for servers that are bigger than the biggest individual servers they currently support, and a technology that doesn't scale as well as they'd like. Some parts of it scale extremely well, though, and the next level of bottlenecks is still good enough for pirating music, with users sharing music in communities of a few hundred thousand, if not good enough for six billion users. I suspect the next layer of scalability could be handled adequately by some good engineering, though perhaps it needs Real Computer Science, but without a good funding model it's not likely to get done.

The current model does seem to port well to the Open-Servers-Not-Run-By-Napster model - volunteers can run medium-sized servers because the first level of scalability design was well done, and as with Napster-run servers, it's close enough for pirate music, though it doesn't let you find everything on the distributed net.

Less Napster-like systems with decentralized servers have to address scaling problems as well. Some of them tie their metadata and their transmission methods together closely; some split them apart better. Gnutella sounds like it's in trouble - too much needs to be online, and the original designs can't handle a large number of requests if there are people with slow connections on the net. It's kind of like a tragedy of the commons where the commons is small and everybody has to walk their sheep in single file, so the slowest or dumbest sheep become a bottleneck for everyone else.
Freenet paid more attention to scaling in its design - it's easy to retrieve stuff if you know where it is, or to find stuff if it's relatively near you, and it can cope with not being able to find everything - On the other hand, it may be harder to find the stuff you want.
On Mon, 25 Dec 2000, Tim May wrote:
In other words, it's time to get crypto out of the math and computer science departments and put it in the engineering departments where it belongs.
Some of this may be computer science, some is engineering, some is just counting stuff :-) Some problems, like scalability or understanding don't-use-the-same-key-twice attacks on RC4, are Science the first time you learn them, but they're just engineering after a while, the way understanding the relationship of the tensile strength of material to its molecular structure is science, but designing a bridge so that it doesn't overstress any of its beams is engineering, and taking occasional samples of bolts and destructively testing them to make sure they've got the tensile strength they're supposed to is engineering or maybe just business practice (depending on whether you're doing it to make sure your bridge will perform the way you want or to make sure your suppliers aren't ripping you off.)

Thanks!
Bill
Bill Stewart, bill.stewart@pobox.com
PGP Fingerprint D454 E202 CBC8 40BF 3C85 B884 0ABE 4639
Tim May wrote:
In other words, it's time to get crypto out of the math and computer science departments and put it in the engineering departments where it belongs.
Tim's complained for a while that the cypherpunks meetings and discussions have declined in quality, partly because we've tended to rehash old material rather than doing new and interesting work, and partly because we've tended to have fewer talks on new stuff people are doing and more on some commercial business (maybe or maybe not run by cypherpunks) doing their product or non-technical talks by EFF lawyer types. While I'm not disagreeing with him here, I think a lot of this is _precisely_ related to the movement of crypto out of math and CS areas and into engineering.

Mojo Nation, for example, is partly interesting because it's not just Yet Another Encrypted Music Sharing Product - it's mixing the crypto with economic models in ways that are intellectually complex, even if they're somewhat at the hand-waving level rather than highly precise.

At 02:42 AM 12/26/00 -0500, dmolnar wrote:
There's some hope. There was a workshop on "Design Issues in Anonymity and Unobservability" this past summer which brought people together to talk about these issues. The Info Hiding Workshops are still going strong. With luck, this year's IHW may have a paper on reputations in it...
Cool. Are the proceedings on line anywhere? (Or is it only for people who know the secret keys...)
On the other hand, we can oppose this to the fact that we have a bunch of remailers, and they seem to work. They may be unreliable, but no one seems to have used padding flaws to break a remailer, as far as we know.
Arrgh! Dave, just because nobody's known to have broken them doesn't mean that nobody's succeeded in breaking them (without us knowing they've succeeded), or that anybody's put serious effort into an attack.

The basic remailer network is known to be breakable by anybody doing a thorough eavesdropping attack, because you can learn a lot from message sizes. Mixmasters are much safer, because message sizes are constant (though message counts aren't), but it's not clear whether they're good enough, given a good attack. Pipenets are probably secure enough against most attacks, but they're annoying economically - not surprising that Zero Knowledge's initial service didn't fully implement them.

The reason remailers have been Good Enough so far is that, as far as we know, nobody's had the motivation to do a proactive eavesdropping attack on them, or a proactive deployment of untrustworthy remailers. The attacks have either been after-the-fact attempts to get information that wasn't logged (they're strong enough for that, if run by trustable people on uncracked machines), or proactive attempts to close the remailers (many of those attacks have been successful). Small numbers of remailers (there are typically about 20) aren't good enough to resist shutdown-forcing attacks.

The cool thing about Zero Knowledge was that they had a business model they thought could get large numbers of service providers to support, which increases the security against loss of individual remailers as well as reducing the likelihood of an individual remailer shutting down.

Thanks!
Bill
Bill Stewart, bill.stewart@pobox.com
PGP Fingerprint D454 E202 CBC8 40BF 3C85 B884 0ABE 4639
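The message-size attack can be made concrete with a toy example (every size and name below is invented): a passive eavesdropper who sees sizes on both sides of a remailer can re-link inputs to outputs whenever sizes vary, which is exactly what Mixmaster's constant-size packets prevent:

```python
# Toy version of the message-size attack on a plain (non-Mixmaster) remailer.
incoming = {"alice": 14302, "bob": 8871, "carol": 23094}  # sender -> bytes in
OVERHEAD = 310       # hypothetical header bytes stripped before forwarding

# The eavesdropper sees only sizes on the far side, in mixed order:
outgoing = [13992, 22784, 8561]

# Without padding, size alone re-links each output to its sender:
linked = {size - OVERHEAD: who for who, size in incoming.items()}
assert [linked[s] for s in outgoing] == ["alice", "carol", "bob"]

# Mixmaster's defense: pad every packet to one fixed size before sending,
# so sizes carry no linking information (message *counts* still might).
PACKET_SIZE = 32768
padded_out = [PACKET_SIZE for _ in incoming]
assert len(set(padded_out)) == 1   # all outputs now look identical
```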
On Wed, 27 Dec 2000, Bill Stewart wrote:
fewer talks on new stuff people are doing and more on some commercial business (maybe or maybe not run by cypherpunks) doing their product or non-technical talks by EFF lawyer types.
I'm in the middle of composing a reply to Tim's message (which is getting bigger every time I sit down to finish it, ominously enough). One of the points that has popped into my mind so far is that while we've had academic crypto research since the 80s, thanks to Rivest, Shamir, Adleman, Diffie, Hellman, and others willing to defy the NSA, we have _not_ had a similar tradition of commercial cryptography - or at least, not a tradition of companies obtaining money for cryptographic *protocols* as opposed to ciphers.

It seems to me that it took a long while for people to even recognize that there was more to cryptography than secrecy. Maybe it happened quickly in academia, but it doesn't seem to have filtered out quickly (and then there's still the chilling effect from export controls). This is one of the reasons why the early Cypherpunk work is so damn important -- it showed the amazing, powerful things you can do given cryptography and a little cleverness, and it did so to a (comparatively) wide audience!

Even after "everyone" knows that you can do, say, cryptographic voting, there's still the question of "who's going to pay for it?" That question seems to have found a partial answer with the Internet/Web/"e-commerce" frenzy. The thing is, that is *new*, only 4 or 5 years old. Before, you could go out and say "I want to go commercialize neat protocol X," and good luck to you...today, you might get funding. Until you get that funding, you can't start the engineering work that's required to take a protocol from the "cool CRYPTO paper" stage to the "real world product."

Before Tim jumps on me: yes, I know there were early electronic markets, and yes, electronic trading was around before the Web. Yes, these could have been viable markets for digital cash, fair exchange protocols, whatever. Even electronic voting could and did get started earlier (though not using cryptographic techniques, AFAIK). I do not dispute this!
It simply seems to me that the climate today has the possibility of demand for such protocols (and more) on a wider scale than previously.
of crypto out of math and CS areas and into engineering. Mojo Nation, for example, is partly interesting because it's not just Yet Another Encrypted Music Sharing Product - it's mixing the crypto with economic models in ways that are intellectually complex, even if they're somewhat at the hand-waving level rather than highly precise.
Maybe it will force smart people to move the mix from the hand-waving level to something highly precise. Insh'allah.
Cool. Are the proceedings on line anywhere? (Or is it only for people who know the secret keys...)
The 2nd and 3rd are, via Springer-Verlag LINK service. Tables of contents are free; you should be able to recover the papers from their authors' home pages (use Google!). If you can't find something, e-mail me.

Page for past proceedings: http://chacs.nrl.navy.mil/IHW2001/past-workshops.html
Page for IHW 2001: http://chacs.nrl.navy.mil/IHW2001/

Unfortunately, the TOC for the first IHW is not online, nor do the papers seem to be available. You can extract the papers from Petitcolas' bibliography at http://www.cl.cam.ac.uk/users/fapp2/steganography/bibliography/index.html and may be able to get some of the papers that way. I note a previous message from Hal Finney which has some links as well: http://www.inet-one.com/cypherpunks/dir.1997.05.15-1997.05.21/msg00298.html (I haven't tried them.)

I should state up front that the workshops are a little heavy on watermarking papers, which may not be of too much interest to cypherpunks. The papers on breaking watermarks, on the other hand, may be of more interest. :-)
On the other hand, we can oppose this to the fact that we have a bunch of remailers, and they seem to work. They may be unreliable, but no one seems to have used padding flaws to break a remailer, as far as we know.
Arrgh! Dave, just because nobody's known to have broken them doesn't mean that nobody's succeeded in breaking them (without us knowing they've succeeded),
[snip a well-deserved beating] Well, this is what I get for trying to moderate myself. Everything you say is correct - of course. I actually agree with you! I mentioned this because I wanted to avoid playing the part of a "theoretical Cassandra," which is something I do too often. (In fact, if I'm not mistaken, that's part of what Tim's response about different adversary models attempts to speak to - the fact that traditional cryptographic models assume a maximally powerful adversary, while we might want a finer grained hierarchy of adversaries and their effects...) -David
At 3:56 AM -0500 12/28/00, dmolnar wrote:
I'm in the middle of composing a reply to Tim's message (which is getting bigger every time I sit down to finish it, ominously enough).
Sounds good to me!
One of the points that has popped into my mind so far is that while we've had academic crypto research since the 80s, thanks to Rivest, Shamir, Adleman, Diffie, Hellman, and others willing to defy the NSA, we have _not_ had a similar tradition of commercial cryptography - or at least, not a tradition of companies obtaining money for cryptographic *protocols* as opposed to ciphers.
Probably the most basic motivation Eric Hughes and I had for calling together a bunch of Bay Area folks in '92 was because, in a 3-day series of talks we'd had earlier in the spring, we concluded that a lot of academic crypto was ripe for conversion into "building blocks." (Building blocks, protocols, modules, libraries...) Well, we were half-right.
It seems to me that it took a long while for people to even recognize that there was more to cryptography than secrecy. Maybe it happened quickly in academia, but it doesn't seem to have filtered out quickly (and then there's still the chilling effect from export controls). This is one of the reasons why the early Cypherpunk work is so damn important -- it showed the amazing, powerful things you can do given cryptography and a little cleverness, and it did so to a (comparatively) wide audience!
Thanks. It was an amazing time. It was clear that "uncoerced transactions" would be possible by combining "untraceable communications" (mixes, remailers, pseudonyms) and "untraceable payments" (pure Chaumian digicash). And that all manner of related things would come from this. Frankly, the early work on Magic Money (by Pr0ductCypher) _could_ have been extended to give a Pretty Good Digital Cash, at least for experimental markets, but it wasn't. And as David notes, the commercial sector was focused on fairly mundane straight crypto.
... Before Tim jumps on me, yes, I know there were early electronic markets, and yes, electronic trading was around before the Web. Yes, these could have been viable markets for digital cash, fair exchange protocols, whatever. Even electronic voting could and did get started earlier (though not using cryptographic techniques AFAIK) I do not dispute this! It simply seems to me that the climate today has the possibility of demand for such protocols (and more) on a wider scale than previously.
I won't jump on you. Those early electronic markets, like Phil Salin's "AmIX" (American Information Exchange) were failures. AmIX desperately needed the Web, or at least free connect time. (We pioneers were paying $12 an hour, or somesuch, IIRC, to dial in to Palo Alto. This was circa 1990.) The Extropians list even ran "reputation markets" as a viable experiment, circa 1993-94. Some guy in Utah, IIRC, implemented it in Perl. (Precursors to Firefly and suchlike.) But it took the Web to create a proper substrate.
of crypto out of math and CS areas and into engineering. Mojo Nation, for example, is partly interesting because it's not just Yet Another Encrypted Music Sharing Product - it's mixing the crypto with economic models in ways that are intellectually complex, even if they're somewhat at the hand-waving level rather than highly precise.
Maybe it will force smart people to move the mix from the hand-waving level to something highly precise. Insh'allah.
I hear the focus of Mojo Nation is shifting from "better living through piracy," to something more mundane involving deals to deliver video content. If so, much of the motivation to be absolutely robust will go away. Sad, if true. (Mojo folks feel free to jump in to set me straight...)
On the other hand, we can oppose this to the fact that we have a bunch of remailers, and they seem to work. They may be unreliable, but no one seems to have used padding flaws to break a remailer, as far as we know.
Arrgh! Dave, just because nobody's known to have broken them doesn't mean that nobody's succeeded in breaking them (without us knowing they've succeeded),
[snip a well-deserved beating]
I think Bill was a bit harsh. There are some _economic_ issues involved, as usual. So long as the "value of what is being sent through remailers" is LESS THAN "the cost of subverting remailers," they will tend not to be subverted.

There is an interesting trade-off in three dimensions between "value of material" and "cost to send it" and "bandwidth/latency." A remailer network is pretty good at sending small packets (e-mails) through N hops, where N can be quite large, so long as a latency of ~ hours is acceptable, which it usually is. And at very low cost. However, sending Web page queries and responses through is another matter. ZKS believes that "untraceable surfing" is an important business model...and for this sort of app they need PipeNet-like bandwidth. And so on. I wish e-mail allowed us to draw pictures.

IMO, any analysis of breaking mixes should be heavily centered around economic analysis. This is not as heretical as it sounds. Game theory of both main flavors--matrix game theory of the von Neumann/Morgenstern/Nash type and combinatorial game theory of the Conway/Berlekamp/Guy type--often involves payoffs, costs, and other economic issues. IMO, there is no reason crypto cannot easily co-opt such approaches. At the most trivial level, work factor is a fundamentally economic issue. For mix-nets and other Cypherpunkish things, economic analysis is everything.
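A back-of-envelope sketch of the kind of analysis being asked for (every cost figure below is invented for illustration): put brute-force work factor and collusion cost in the same units, then compare each against the value of the traffic:

```python
# Back-of-envelope work-factor arithmetic; all cost figures hypothetical.
SECONDS_PER_YEAR = 365 * 24 * 3600

def brute_force_years(key_bits, keys_per_second):
    # expected tries to find a key by exhaustive search = 2^(bits-1)
    return 2 ** (key_bits - 1) / keys_per_second / SECONDS_PER_YEAR

print(f"56-bit key at 1e9 keys/s:  {brute_force_years(56, 1e9):.1f} machine-years")
print(f"128-bit key at 1e9 keys/s: {brute_force_years(128, 1e9):.1e} machine-years")

# The proposed extension: price the *social* attacks in the same currency.
def cheapest_attack(compute_cost, bribe_per_node, nodes_needed):
    # adversary picks whichever is cheaper: computation or collusion
    return min(compute_cost, bribe_per_node * nodes_needed)

value_of_traffic = 10_000                  # hypothetical value to the adversary
attack = cheapest_attack(10**9, 5_000, 6)  # bribery wins here: 30,000
print(attack > value_of_traffic)           # True => not worth attacking
```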
Well, this is what I get for trying to moderate myself. Everything you say is correct - of course. I actually agree with you! I mentioned this because I wanted to avoid playing the part of a "theoretical Cassandra," which is something I do too often. (In fact, if I'm not mistaken, that's part of what Tim's response about different adversary models attempts to speak to - the fact that traditional cryptographic models assume a maximally powerful adversary, while we might want a finer grained hierarchy of adversaries and their effects...)
Yes, as noted above. Pure crypto is often treated as a pure math exercise, akin to finding "existence" proofs of the sort we see for standard problems (travelling salesman, Hamiltonian cycle, etc.). But crypto is really more of an N-party game, with Alice and Bob (and maybe others) making moves and countermoves. (This is one reason many such games are in an important sense "harder" than being merely NP-complete.)

The moves and countermoves, and the hidden knowledge (*), are similar to the evolutionary process of building and attacking castles and other fortifications. Siege engines, better walls, traps, moats, economic isolation, etc.

(* A standard assumption--it probably has a name that I have forgotten--is that the attacker of a cipher has complete knowledge except for the key. That is, he can take the cipher back to his lab and attack it with everything he's got except for the key itself. This is sort of the Basic Modern Assumption. Security through obscurity is deprecated (because, practically, it falls long before the other attacks). However, even in crypto we find things like "tamper-responding systems," which alter the equation: there is now a cost in attacking such a system, as the adversary _knows_ the attack is occurring and may take steps in response. Again, N-party games.)

Pardon the rambling above. I expect Dave and Bill and some others know where this is going. Really, this is a call for a "new paradigm" in crypto. More later.

--Tim May
--
Timothy C. May tcmay@got.net Corralitos, California
Political: Co-founder Cypherpunks/crypto anarchy/Cyphernomicon
Technical: physics/soft errors/Smalltalk/Squeak/agents/games/Go
Personal: b.1951/UCSB/Intel '74-'86/retired/investor/motorcycles/guns
On Thu, 28 Dec 2000, Tim May wrote:
At 3:56 AM -0500 12/28/00, dmolnar wrote:
I'm in the middle of composing a reply to Tim's message (which is getting bigger every time I sit down to finish it, ominously enough).
Sounds good to me!
One of the points that has popped into my mind so far is that while we've had academic crypto research since the 80s, thanks to Rivest, Shamir, Adleman, Diffie, Hellman, and others willing to defy the NSA, we have _not_ had a similar tradition of commercial cryptography - or at least, not a tradition of companies obtaining money for cryptographic *protocols* as opposed to ciphers.
Not enough energy by half has been focused on protocols. I think there's probably a good set of programs to be written here.

Basically, I'm thinking in terms of the old Unix philosophy -- "A good program does exactly one thing, and does it well." If somebody designs a good set of command-line programs, which produce output usable by each other so that they can be piped together in useful ways on a Unix command line, then protocols should be easy to implement as shell scripts.

But a proper building block would have to be scriptable from the word "go." You'd have to fix it so that anything it could do, at all, it could do "in a straight run" - a command line, a command file, whatever. And you'd have to do it so your keys didn't wind up in unencrypted batch files. Maybe a reference to keys' locations in an encrypted file system would be what went on the command line.

Such energy as has been focused on protocols has been at the level of applications -- basically fixing them in source code so the users can't as easily pick them apart and stick them back together again differently.

Hmmm. More later. Some ideas are percolating through my head but they're not very well developed.

Bear
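The pipeable-building-block idea can be sketched in a few lines. The tool name, key path, and example pipeline below are all hypothetical; the sketch just shows the shape: a filter that reads stdin, writes stdout, and takes its key from a file path (so the key never sits in a batch file), using Python's stdlib HMAC for the authentication step:

```python
#!/usr/bin/env python3
# Hypothetical "does one thing well" filter: authenticate or verify a
# byte stream passing through a pipeline, e.g.
#   tar cz somedir | mactool.py sign /keys/k | nc archive.example 9000
# The key is read from a file path, never from the command line.
import hashlib
import hmac
import sys

def sign(key: bytes, data: bytes) -> bytes:
    # prepend a hex MAC tag on its own line, then the payload
    tag = hmac.new(key, data, hashlib.sha256).hexdigest().encode()
    return tag + b"\n" + data

def verify(key: bytes, blob: bytes):
    # return the payload if the tag checks out, else None
    tag, _, data = blob.partition(b"\n")
    expected = hmac.new(key, data, hashlib.sha256).hexdigest().encode()
    return data if hmac.compare_digest(tag, expected) else None

if __name__ == "__main__" and len(sys.argv) == 3:
    key = open(sys.argv[2], "rb").read()
    data = sys.stdin.buffer.read()
    if sys.argv[1] == "sign":
        sys.stdout.buffer.write(sign(key, data))
    else:
        out = verify(key, data)
        if out is None:
            sys.exit(1)          # pipeline fails loudly on a bad tag
        sys.stdout.buffer.write(out)
```

Because each stage is just bytes in, bytes out, two such tools compose on a command line the way Ray describes, and the only state on disk is the key file.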
On Thu, 28 Dec 2000, Ray Dillinger wrote:
Not enough energy by half has been focused on protocols. I think there's probably a good set of programs to be written here.
While we're at it, check out this web page http://www.cs.nyu.edu/cs/dept_info/course_home_pages/fall96/G22.3033.02/ Avi Rubin and Matt Franklin's course on crypto protocols.
Basically, I'm thinking in terms of the old unix philosophy -- "A good program does exactly one thing, and does it well.". If somebody designs a good set of command-line programs, which produce output usable by each other so that they can be piped together in useful ways on a unix command line, then protocols should be easy to implement as shell scripts. But a proper
First you need to identify a set of common building blocks! I thought about this briefly a year or two ago, then realized that many protocols for, say, "digital cash" do not actually share many components with each other. Sure, all of them may have a public-key cryptosystem, but the exact requirements are different...and sometimes a protocol needs specific properties of a cryptosystem in order to work.

My programming languages professor recently pointed me to a paper describing a library for doing smart contracts and options in Haskell. (I'll post the reference later; I'm having trouble finding it.) They put together a library of combinators which together could be used to write real contracts. They even have a semantics for this beast. It seems that the reason they could do that is that the contracts they're looking at decompose nicely into distinct parts. It's not clear to me how to do that for crypto protocols.

Maybe a place to start would be to go through a bunch of papers on crypto protocols and analyse all the "Alice sends to Bob" messages. See what commonalities pop out.

-David
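To make the combinator idea concrete, here is a toy Python imitation (not the actual Haskell library from the paper): a handful of primitives compose into bigger contracts, and a tiny evaluator gives them a deliberately trivial "semantics":

```python
# Toy contract combinators, loosely in the spirit of composable contracts.
from dataclasses import dataclass

@dataclass
class One:                 # receive one unit of a currency, now
    currency: str

@dataclass
class Scale:               # multiply a sub-contract's payoff
    factor: float
    contract: object

@dataclass
class And:                 # hold both sub-contracts
    left: object
    right: object

@dataclass
class Give:                # swap the parties' obligations
    contract: object

def value(c):
    # toy semantics: net payoff, counting every currency unit as 1.0
    if isinstance(c, One):
        return 1.0
    if isinstance(c, Scale):
        return c.factor * value(c.contract)
    if isinstance(c, And):
        return value(c.left) + value(c.right)
    if isinstance(c, Give):
        return -value(c.contract)
    raise TypeError(c)

# "Pay 100 USD, receive 120 EUR" composed from the primitives:
swap = And(Give(Scale(100, One("USD"))), Scale(120, One("EUR")))
assert value(swap) == 20.0
```

Whether crypto protocols decompose this cleanly is exactly the open question David raises; "Alice sends to Bob" messages may not factor into so few primitives.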
On Thu, Dec 28, 2000 at 12:01:20PM -0500, Tim May wrote:
Probably the most basic motivation Eric Hughes and I had for calling together a bunch of Bay Area folks in '92 was because, in a 3-day series of talks we'd had earlier in the spring, we concluded that a lot of academic crypto was ripe for conversion into "building blocks."
(Building blocks, protocols, modules, libraries...)
Well, we were half-right.
Tim, I've seen you mention this issue several times, but don't remember if I replied to it before. Being a library writer, I think the main reason there aren't a lot of higher-level building blocks in crypto libraries is that in order to use higher-level crypto, you have to understand it at a fairly low level and know how and why it works in terms of number theory and ciphers and hash functions. This is in contrast to lower-level building blocks like ciphers, where you don't really need to understand a cipher at the bit-twiddling level in order to use it.

So you can't really treat higher-level crypto as black boxes. It's also hard to design interfaces to them so they plug in together nicely for all the different purposes you might want to use them for. Finally, if you do understand how they work and have a good low-level crypto library, they're typically not hard to implement, especially if you just want to implement them for a specific purpose and not as highly reusable components. So I think there are several good reasons why we don't have a high-level crypto library.
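Chaum's RSA blind signature illustrates the point well: the protocol is only a few lines once you have modular arithmetic, but you can't use it as a black box without seeing why the blinding factor cancels. A toy sketch (tiny demo key, no padding, utterly insecure; `pow` with a negative exponent needs Python 3.8+):

```python
# Sketch of an RSA blind signature. Unblinding works only because
# (m * r^e)^d = m^d * r (mod N) -- exactly the number theory a user
# of this "building block" has to understand.
import math
import random

p, q = 2003, 2011                   # demo primes
N = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))   # signer's private exponent

m = 424242                          # the "coin" the user wants signed

# 1. User blinds m with a random invertible r:
while True:
    r = random.randrange(2, N)
    if math.gcd(r, N) == 1:
        break
blinded = (m * pow(r, e, N)) % N

# 2. Signer signs the blinded value -- and never sees m:
blind_sig = pow(blinded, d, N)

# 3. User strips the blinding factor:
sig = (blind_sig * pow(r, -1, N)) % N

# Anyone can verify the resulting signature on m itself:
assert pow(sig, e, N) == m % N
```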
On Thu, 28 Dec 2000, Tim May wrote:
I hear the focus of Mojo Nation is shifting from "better living through piracy," to something more mundane involving deals to deliver video content. If so, much of the motivation to be absolutely robust will go away. Sad, if true.
So maybe it takes away the incentive for the original Mojo folks. So? That may actually be a good thing, if it gets the technology spread far and wide so that other people can produce an absolutely robust Mojo++ which rides on top of Mojo. Plus it raises the profile of these kinds of services. Today's teenager reading about Mojo on slashdot (or wherever) is going to be tomorrow's data haven architect...
I think Bill was a bit harsh. There are some _economic_ issues involved, as usual. So long as the "value of what is being sent through remailers" is LESS THAN "the cost of subverting remailers," they will tend not to be subverted.
Yes, BUT I think one of the reasons why a maximally powerful adversary model is so appealing is that it sidesteps the question of evaluating the "value of what is being sent through remailers." If you can prove security against a maximally powerful adversary, then you don't have to answer that question - no matter how much winning is worth to the adversary, it won't win.

If you instead take the economic tack, then you have to start worrying about what the adversary wants -- and as Terry Ritter often points out on sci.crypt, you don't know much about your adversary. Plus, putting a "value" on what is sent through remailers seems to require that you be sensitive to the way the system is used after it's designed.

This is *not* to discourage an economic analysis, but to point out a potential benefit to the "modern" approach. It wouldn't be much of a benefit, EXCEPT that in encryption and digital signatures, we have actually been able to achieve security against maximal adversaries (or at least probabilistic polytime ones, assuming some problems are hard).
But crypto is really more of an N-party game, with Alice and Bob (and maybe others) making moves and countermoves. (This is one reason many such games are in an important sense "harder" than being merely NP-complete.)
Hmm. I know of some results on two-player games which show that playing them "optimally" is PSPACE-complete. The two I can think of, however - Hex and Go - are perfect-information games. I'm not sure how hiding information changes things. Maybe one way to cast crypto as a game would be to consider protocol verification. "Here's a state machine. Here's Alice's state. Here's Bob's state. Can an eavesdropper learn their shared key if he has the following moves...?"
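The "moves" framing above can be sketched as a reachability search: start from what the eavesdropper has observed, close that set under his derivation moves, and ask whether the secret ever becomes derivable. A minimal toy in Python (the protocol terms and rules here are hypothetical placeholders, nothing like a real verifier):

```python
# Minimal sketch of casting secrecy as reachability: close the
# eavesdropper's knowledge under his "moves" (derivation rules) and
# check whether the plaintext becomes derivable.

def close_knowledge(known, rules):
    """Apply derivation rules until a fixed point is reached."""
    known = set(known)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known

# Eve sees one ciphertext on the wire; she can decrypt only with the key.
observed = {"enc(key, msg)"}
moves = [
    (frozenset({"enc(key, msg)", "key"}), "msg"),   # decryption move
]

# With only the ciphertext, the message stays out of reach...
assert "msg" not in close_knowledge(observed, moves)
# ...but if a protocol flaw ever leaks "key", reachability flips.
assert "msg" in close_knowledge(observed | {"key"}, moves)
```

Real verifiers search over interleavings of Alice's and Bob's state machines as well, which is where the state-space blowup (and the complexity-theoretic hardness) comes from.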
(* A standard assumption--it probably has a name that I have forgotten--is that the attacker of a cipher has complete knowledge except for the key. That is, he can take the cipher back to his lab
Kerckhoffs's principle, I think. -David
On Fri, Dec 29, 2000 at 02:18:10AM -0500, dmolnar wrote:
Yes, BUT I think one of the reasons why a maximally powerful adversary model is so appealing, however, is that it sidesteps the question of evaluating "value of what is being sent through remailers."
The other reason it's appealing is that it lets academics consider their work finished when they've constructed a logical argument or proof, without considering implementation details, because the "maximally powerful adversary model" in practice seems to include a maximally competent implementer of the design under consideration. That's great for academics, but in the real world, software and hardware systems have defects - frequently, defects with security implications.
If you can prove security against a maximally powerful adversary, then you don't have to answer that question - no matter how much it's worth to the adversary, it won't win.
Yes, for attacks against the strong part of the system - but that's not what sensible attackers go after. The "maximal adversary" imagined is apparently a very gentle and polite one, who can only operate on network wires, but won't consider physical penetration or torture, and in some models won't even subvert the security of the machines hosting the system.
This is *not* to discourage an economic analysis, but to point out a potential benefit to the "modern" approach. It wouldn't be much of a benefit, EXCEPT that in encryption and digital signatures, we have actually been able to achieve security against maximal adversaries (or at least probabilistic polytime ones assuming some problems are hard).
But - several, if not many times - the security we've achieved has been broken, because of implementation errors on the part of creators, installers, or users. Consider the computing power assembled for the DES or RC5 cracks, instead applied to dictionary attacks versus a PGP keyring, or SSH keyfile. How long until the average user's passphrase is recovered? -- Greg Broiles gbroiles@netbox.com PO Box 897 Oakland CA 94604
On Fri, 29 Dec 2000, Greg Broiles wrote:
But - several, if not many times - the security we've achieved has been broken, because of implementation errors on the part of creators, installers, or users.
That's right - and it reflects the fact that cryptographic engineering (as opposed to "cryptographic science") is still in its infancy. This is the downside of the current approach, which focuses on getting the protocol right first, and only later considers the "real world." Bruce Schneier had another way of putting it - something along the lines of "The math is perfect, the hardware is so-so, the software is a mess, and the people are awful!" (not an exact quote, but I remember it from one of his DEF CON speeches).

That being said, there is some benefit to considering the protocols in an ideal, polite model - because in the past we haven't even been able to get security in *that* model. So in some sense this is a case of "publishing what we can prove." It's only comparatively recently that we've had protocols which we can prove secure, even in weak models -- the first real definitions of security, from Yao, Goldwasser, and Micali, and probably others, didn't come until the early to mid 1980s. Truly practical cryptosystems which meet these definitions of security didn't arrive until the 1990s. (Some would argue that they still aren't here - Bellare and Rogaway's Optimal Asymmetric Encryption Padding (OAEP) satisfies a strong definition of security, but only if you buy the "random oracle assumption.")

Now, on the "science" side, we can and should extend the model to deal with more of the real world. You might find the recent paper I posted a link to by Canetti interesting - he sets out to deal with an asynchronous network with active adversaries. I didn't see torture included yet, but maybe next version. Birgit Pfitzmann and Michael Waidner are considering something called "reactive systems" which may also yield results. http://citeseer.nj.nec.com/297161.html

On the engineering side -- well, there's a long way to go. Ross Anderson has a new book coming out which may help a little bit. http://www.cl.cam.ac.uk/~rja14/book.html

The fact remains that I don't think we have enough experience implementing protocols beyond encryption and signatures. At least not on a wide scale. Take digital cash and voting protocols as an example. Digital cash has been implemented and re-implemented several times. It's even had a "live" test or two. But how many people have managed to buy something tangible with it? And how does that compare to the amount cleared by credit cards?

Electronic voting seems to be on the upswing - at least with votehere.com and the recent election debacle hanging over our heads. Still, who has implemented, tested, and deployed a truly large-scale voting system based on cryptographic protocols? The one which comes to mind is the MIT system built on the FOO protocol - and while that *works* (modulo operator error), that's only a few thousand undergrads.

It's at times like this that I wish I knew more about formal verification of protocols...
Consider the computing power assembled for the DES or RC5 cracks, instead applied to dictionary attacks versus a PGP keyring, or SSH keyfile. How long until the average user's passphrase is recovered?
If the passphrase is in the dictionary, nearly no time at all. Some take this to mean that now we should write passphrases down, and use the opportunity to pick long random ones unlikely to be in any dictionary... -David
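A back-of-envelope sketch of the arithmetic behind this advice: a single dictionary word falls almost instantly at plausible guess rates, while a few independently random words multiply the keyspace exponentially. The wordlist size and guess rate below are illustrative assumptions, not measurements:

```python
# Rough comparison of attack times: one dictionary word versus five
# independently random words. Numbers are illustrative assumptions.
import secrets

WORDLIST_SIZE = 7776            # a Diceware-style list (6^5 words)
GUESSES_PER_SEC = 1e6           # assumed cracking rate

# One dictionary word: the whole list is exhausted in milliseconds.
one_word_secs = WORDLIST_SIZE / GUESSES_PER_SEC

# Five random words: the keyspace is WORDLIST_SIZE^5.
five_word_secs = WORDLIST_SIZE ** 5 / GUESSES_PER_SEC
five_word_years = five_word_secs / (3600 * 24 * 365)

# Generating such a passphrase from a (toy, hypothetical) word list:
toy_words = ["correct", "horse", "battery", "staple", "mojo", "remailer"]
passphrase = " ".join(secrets.choice(toy_words) for _ in range(5))

assert one_word_secs < 0.01          # gone in under ten milliseconds
assert five_word_years > 100_000     # centuries upon centuries
```

The catch, of course, is that the words must actually be chosen at random; five words the user picked himself are closer to one dictionary entry than to five.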
On Wed, 27 Dec 2000, Bill Stewart wrote:
There's some hope. There was a workshop on "Design Issues in Anonymity and Unobservability" this past summer which brought people together to talk about these issues. The Info Hiding Workshops are still going strong. With luck, this year's IHW may have a paper on reputations in it...
Cool. Are the proceedings on line anywhere? (Or is it only for people who know the secret keys...)
Uh, it just occurs to me that I may have misread you. The "Design Issues in Anonymity and Unobservability" proceedings are currently being turned into Springer-Verlag LNCS 2009, so they aren't online as a whole yet (indeed, we just submitted our final final draft two weeks ago). You can find a list of papers at http://www.icsi.berkeley.edu/~hannes/wsprogram.html ; our paper is at http://www.freehaven.net/doc/berk/freehaven-berk.ps , and searching for authors' home pages or e-mail may reveal other papers. -David
On Sun, 24 Dec 2000, Tim May wrote:
First, the quality of the responses has not been good. It seems repartee and tired Nazi vs. Stalinist debate is the norm, with Choatian physics and Choatian history filling in the gaps.
What he means is he gets tired of having his bluff called.

 ____________________________________________________________________

       Before a larger group can see the virtue of an idea, a
       smaller group must first understand it.

                  "Stranger Suns"
                  George Zebrowski

   The Armadillo Group       ,::////;::-.          James Choate
   Austin, Tx               /:'///// ``::>/|/      ravage@ssz.com
   www.ssz.com             .',  ||||    `/( e\     512-451-7087
                        -====~~mm-'`-```-mm --'-
 --------------------------------------------------------------------
Quoting Tim May (tcmay@got.net):
I haven't been posting here a lot for various reasons.
That's your business. I just thought the meme sighting was funny.
First, the quality of the responses has not been good. It seems repartee and tired Nazi vs. Stalinist debate is the norm, with Choatian physics and Choatian history filling in the gaps.
I don't quite know what to say to that. Perhaps collective ennui (or senility) has set in among most USENET denizens. Odd, that. Personally, I would have thought that name-calling would have lost its novelty after a few years.
Second, and perhaps related to the first point, a lot of folks have retreated to the safety of filtered lists, where Lewis and Perry can screen messages for them. (Though I have noticed that a lot of _political_ messages get cross-posted from Perrypunks, where they are not supposed to exist, over to Cypherpunks. So much for the safe haven of having list.monitors limiting "off-topic" discussion.)
They should know better. I recall some of the uproar about the Cypherpunks moderation experiment.
Third, "been there, done that." Most of the topics surfacing now are not new topics. Most topics were beaten to death by 1993. In fact, most of the tangentially-crypto stuff is actually _less_ interesting than the stuff in 1993 was. See the topics in 1992-1994 and compare them to the topics in 1998-2000.
You're quite correct. While I haven't had the time to actually read more than a few messages from those days, it's obvious that the then-fresh speculations have become today's old news. Now, those with the skills and resources are busy planning and implementing that which they see will be profitable within their planning horizons. Few would be so short-sighted as to talk about their work when it might hurt their income or revenue potential. So, we wait for product releases, legislative initiatives, etc.
Fourth, as with my new .sig, the election has caused me to "move on," at least until the direction of things is determined.
Sure. It looks like most people (myself included) are waiting to see what happens so they can amend their plans accordingly. What else is there to do?
As for what Bruce says in the above quote, nothing different from what he's been saying for decades.
He speaks of liquidating middlemen, I speak of liquidating tens of millions of welfare varmints, useless eaters, and politicians.
Despite the fact that I disagree strongly with wholesale `liquidation', I at least respect your candour -- and while I'd love to argue this point with you, that, too, has been done to death. However, I've at least got an incentive to stay off the welfare rolls.
And for this they call him a visionary and me a Nazi. Go figure.
As they say, these things happen. If I have anything useful to contribute to future discussions, I'll do so. If not, I'll just go back to lurking while I work on my own projects, education and career, such as it is. Regards, Steve -- Antidisintermediationalist and Posi-Trak advocate at large.
participants (11):
- Adam Shostack
- Bill Stewart
- David Honig
- dmolnar
- Eric Cordian
- Greg Broiles
- Jim Choate
- Ray Dillinger
- Steve Thompson
- Tim May
- Wei Dai