Factoring - State of the Art and Predictions
((Comments are appreciated. -Bruce))

Factoring large numbers is hard. Unfortunately for algorithm designers, it is getting easier. Even worse, it is getting easier faster than mathematicians expected. In 1976 Richard Guy wrote: "I shall be surprised if anyone regularly factors numbers of size 10^80 without special form during the present century." In 1977 Ron Rivest said that factoring a 125-digit number would take 40 quadrillion years. In 1994 a 129-digit number was factored. If there is any lesson in all this, it is that making predictions is foolish.

Table 1 shows factoring records over the past dozen years. The fastest factoring algorithm during that time was the quadratic sieve.

Table 1: Factoring Using the Quadratic Sieve

        # of decimal       how many times harder to
year    digits factored    factor a 512-bit number

1983          71               > 20 million
1985          80               >  2 million
1988          90                  250,000
1989         100                   30,000
1993         120                      500
1994         129                      100

These numbers are pretty frightening. Today it is not uncommon to see 512-bit numbers used in operational systems. Factoring them, and thereby completely compromising their security, is well within the range of possibility: a weekend-long worm on the Internet could do it.

Computing power is generally measured in mips-years: a one-million-instruction-per-second computer running for one year, or about 3*10^13 instructions. By convention, a 1 mips machine is equivalent to the DEC VAX 11/780. Hence, a mips-year is a VAX 11/780 running for a year, or the equivalent.

The 1983 factorization of a 71-digit number required 0.1 mips-years; the 1994 factorization of a 129-digit number required 5000. This dramatic increase in available computing power resulted largely from the introduction of distributed computing, using the idle time on a network of workstations. The 1983 factorization used 9.5 CPU hours on a single Cray X-MP; the 1994 factorization used the idle time on 1600 computers around the world for about 8 months. Modern factoring methods lend themselves to this kind of distributed implementation.

The picture gets even worse. A new factoring algorithm has taken over from the quadratic sieve: the general number field sieve. In 1989 mathematicians would have told you that the general number field sieve would never be practical. In 1992 they would have told you that it was practical, but only faster than the quadratic sieve for numbers greater than 130-150 digits or so. Today it is known to be faster than the quadratic sieve for numbers well below 116 digits. The general number field sieve can factor a 512-bit number over 10 times faster than the quadratic sieve. The algorithm would require less than a year to run on an 1800-node Intel Paragon. Table 2 gives the number of mips-years required to factor numbers of different sizes, given current implementations of the general number field sieve.

Table 2: Factoring Using the General Number Field Sieve

# of bits    mips-years required to factor

   512                30,000
   768                2*10^8
  1024                3*10^11
  1280                1*10^14
  1536                3*10^16
  2048                3*10^20

And the general number field sieve is still getting faster. Mathematicians keep coming up with new tricks, new optimizations, new techniques. There's no reason to think this trend won't continue. A related algorithm, the special number field sieve, can already factor numbers of a certain specialized form--numbers not generally used for cryptography--much faster than the general number field sieve can factor general numbers of the same size.
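[A rough sanity check on Table 2: the standard heuristic running time for the general number field sieve is L_n[1/3] = exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3)). The short Python sketch below is an editorial illustration, not part of Bruce's post; it calibrates that formula to the table's 512-bit entry, and the choice of constant and calibration point are assumptions.]

    import math

    def gnfs_work(bits):
        # Heuristic GNFS cost L_n[1/3, c] = exp(c (ln n)^(1/3) (ln ln n)^(2/3)),
        # with the usual constant c = (64/9)^(1/3), about 1.923.
        ln_n = bits * math.log(2)          # ln(2^bits)
        c = (64.0 / 9.0) ** (1.0 / 3.0)
        return math.exp(c * ln_n ** (1.0 / 3.0) * math.log(ln_n) ** (2.0 / 3.0))

    # Pin the unknown constant factor to Table 2's 512-bit entry.
    scale = 30000.0 / gnfs_work(512)

    for bits in (512, 768, 1024, 1280, 1536, 2048):
        print("%4d bits: about %.0e mips-years" % (bits, scale * gnfs_work(bits)))

[The output reproduces the rest of the table to within small factors, which is about all a formula of this kind can promise.]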
It is not unreasonable to assume that the general number field sieve can be optimized to run as fast as the special number field sieve; it is possible that the NSA already knows how to do this. Table 3 gives the number of mips-years required for the special number field sieve to factor numbers of different lengths.

Table 3: Factoring Using the Special Number Field Sieve

# of bits    mips-years required to factor

   512                 < 200
   768                100,000
  1024                3*10^7
  1280                3*10^9
  1536                2*10^11
  2048                4*10^14

At a European Institute for System Security workshop in 1992, the participants agreed that a 1024-bit modulus should be sufficient for long-term secrets through 2002. However, they warned: "Although the participants of this workshop feel best qualified in their respective areas, this statement [with respect to lasting security] should be taken with caution." This is good advice. The wise cryptographer is ultra-conservative when choosing public-key key lengths.

Determining how long a key you need requires you to look at both the intended security and lifetime of the key, and the current state of the art in factoring. Today you need a 1024-bit number to get the level of security you got from a 512-bit number in the early 1980s. If you want your keys to remain secure for 20 years, 1024 bits is likely too short.

Even if your particular secrets aren't worth the effort required to factor your modulus, you may be at risk. Imagine an automatic banking system that uses RSA for security. Mallory can stand up in court and say: "Did you read in the newspaper in 1994 that RSA-129 was broken, and that 512-bit numbers can be factored by any organization willing to spend a few million dollars and wait a few months? My bank uses 512-bit numbers for security, and by the way, I didn't make these seven withdrawals." Even if Mallory is lying, the judge will probably put the onus on the bank to prove it.

Earlier I called making predictions foolish. Now I am about to make some. Table 4 gives my recommendations for public-key lengths, depending on how long you require the key to be secure. There are three key lengths for each year: one secure against an individual, one secure against a major corporation, and the third secure against a major government.

Here are some assumptions from the mathematicians who factored RSA-129:

    We believe that we could acquire 100 thousand machines without
    superhuman or unethical efforts. That is, we would not set free an
    Internet worm or virus to find resources for us. Many organizations
    have several thousand machines each on the net. Making use of their
    facilities would require skillful diplomacy, but should not be
    impossible. Assuming the 5 mips average power, and one year elapsed
    time, it is not too unreasonable to embark on a project which would
    require half a million mips years.

The project to factor the 129-digit number harnessed an estimated 0.03% of the total computing power of the Internet, and they didn't even try very hard. It isn't unreasonable to assume that a well-publicized project can harness 0.1% of the world's computing power for a year.

Assume a dedicated cryptanalyst can get his hands on 10,000 mips-years, a large corporation can get 10^7 mips-years, and a large government can get 10^9 mips-years. Also assume that computing power will increase by a factor of 10 every five years. And finally, assume that advances in factoring mathematics allow us to factor general numbers at the speeds of the special number field sieve. Table 4 recommends different key lengths for security during different years.
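[The arithmetic behind Table 4 can be sketched in a few lines. The Python fragment below is an editorial reconstruction under exactly the assumptions just stated -- attacker budgets of 10^4, 10^7, and 10^9 mips-years in 1995, a tenfold increase every five years, and the special number field sieve costs of Table 3 -- plus an invented safety margin. It tracks most of Bruce's table but not every cell, since his rows clearly also embody judgment calls.]

    snfs_cost = {512: 200, 768: 1e5, 1024: 3e7, 1280: 3e9, 1536: 2e11, 2048: 4e14}
    budgets_1995 = {"individual": 1e4, "corporation": 1e7, "government": 1e9}

    def recommended_bits(year, budget_1995, margin=5):
        # Attacker's mips-years in the given year: 10x growth every 5 years.
        power = budget_1995 * 10 ** ((year - 1995) / 5.0)
        # Smallest tabulated length whose cost exceeds that power by the margin.
        for bits in sorted(snfs_cost):
            if snfs_cost[bits] > power * margin:
                return bits
        return max(snfs_cost)

    for year in (1995, 2000, 2005, 2010, 2015):
        print(year, [(who, recommended_bits(year, b))
                     for who, b in sorted(budgets_1995.items())])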
Table 4: Recommended public-key key lengths (in bits)

Year    vs. I    vs. C    vs. G

1995     768     1280     1536
2000    1024     1280     1536
2005    1280     1536     2048
2010    1280     1536     2048
2015    1536     2048     2048

Remember to take the value of the key into account. Public keys are often used to secure things of great value for a long time: the bank's master key for a digital cash system, the key the government uses to certify its passports, a notary public's digital signature key. It probably isn't worth the effort to spend months of computing time to break an individual's private key, but if you can print your own money with a broken key, the idea becomes more attractive. A 1024-bit key is long enough to sign something that will be verified within the week, or month, or even a few years. But you don't want to stand up in court twenty years from now with a digitally signed document, and have the opposition demonstrate how to forge documents with the same signature.

Making predictions beyond the near future is even more foolish. Who knows what kind of advances in computing, networking, and mathematics are going to happen by 2020? However, if you look at the broad picture, in every decade we can factor numbers twice as long as in the previous decade. This leads to Table 5.

Table 5: Long-range factoring predictions

Year    Key length (in bits)

1995           1024
2005           2048
2015           4096
2025           8192
2035         16,384
2045         32,768

Not everyone will agree with my recommendations. The NSA has mandated 512-bit to 1024-bit keys for their Digital Signature Standard--far less than I recommend for long-term security. PGP has a maximum RSA key length of 1280 bits. Lenstra, the world's most successful factorer, refuses to make predictions past ten years. And Table 6 gives Ron Rivest's key-length recommendations, originally made in 1990, which I consider much too optimistic. While his analysis looks fine on paper, recent history illustrates that surprises regularly happen. It makes sense to choose your keys to be resilient against future surprises.

Table 6: Rivest's Optimistic Key-Length Recommendations (In Bits)

Year     Low     Avg     High

1990     398     515     1289
1995     405     542     1399
2000     422     572     1512
2005     439     602     1628
2010     455     631     1754
2015     472     661     1884
2020     489     677     2017

Low estimates assume a budget of $25,000, the quadratic sieve algorithm, and a technology advance of 20% per year. Average estimates assume a budget of $25 million, the general number field sieve algorithm, and a technology advance of 33% per year. High estimates assume a budget of $25 billion, a general quadratic sieve algorithm running at the speed of the special number field sieve, and a technology advance of 45% per year.

There is always the possibility that an advance in factoring will surprise me as well, but I think that unlikely. But why trust me? I just proved my own foolishness by making predictions.
This touches on something I was thinking about the other day: most of the cryptosystems we use seem to be based on the assumption that factoring large numbers is a Hard Problem. Isn't this putting all our eggs in one basket? Are there other Hard Problems cryptosystems can be based on? In the ludicrous case, suppose Eve is visited by aliens and given a black box that would instantly factor a number regardless of its size... how much of current cryptography would this device invalidate? I'm no crypto expert, so I don't know... but surely there are other hard problems in the universe that we can base crypto on. --Zachary
schneier@chinet.chinet.com says:
Making predictions beyond the near future is even more foolish. Who knows what kind of advances in computing, networking, and mathematics are going to happen by 2020? However, if you look at the broad picture, in every decade we can factor numbers twice as long as in the previous decade. This leads to Table 5.
I'm not sure I agree with this assumption. From current knowledge, it seems that factoring is still exponential -- we've just made some progress on the algorithms. That can't continue forever, though. Suppose, per your most optimistic estimate, that the algorithms eventually stabilize (which would itself require some further advances): factoring would then remain exponential, and adding a constant number of bits to a number would multiply the difficulty of factoring it by a roughly constant factor. Since computing speeds are also rising exponentially, but not superexponentially, the number of bits we can factor would grow linearly with time -- that is, each decade would see numbers another 60-80 digits longer factored. This would mean that every doubling in key length would give us more than just a constant increase in the safety factor.

Perry
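[To make the shape of Perry's argument concrete, here is a toy Python model; it is an editorial illustration with invented parameters, calibrated only to the 1983 and 1994 data points in Bruce's post and to his assumed tenfold power increase every five years. Under these numbers the increment comes out closer to 25 digits per decade than 60-80; the exact figure depends entirely on the assumed rates, but the linear growth is the point.]

    import math

    # Model cost(d) = A * exp(c * d), fitted to two data points:
    # 71 digits ~ 0.1 mips-years (1983), 129 digits ~ 5000 mips-years (1994).
    c = math.log(5000 / 0.1) / (129 - 71)     # about 0.19 per decimal digit
    A = 5000 / math.exp(c * 129)

    def max_digits(year):
        # Power available to one project, growing 10x every 5 years from 1994.
        power = 5000 * 10 ** ((year - 1994) / 5.0)
        return math.log(power / A) / c        # largest d with cost(d) <= power

    for year in (1994, 2004, 2014, 2024):
        print(year, int(round(max_digits(year))), "digits")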
schneier@chinet.chinet.com writes:
((Comments are appreciated. -Bruce))
Ok. [Nice historical presentation of factoring snipped]
A new factoring algorithm has taken over from the quadratic sieve: the general number field sieve. In 1989 mathematicians would have told you that the general number field sieve would never be practical.
In 1992 they would have told you that it was practical, but only faster than the quadratic sieve for numbers greater than 130-150 digits or so. Today it is known to be faster than the quadratic sieve for numbers well below 116 digits.
The general number field sieve can factor a 512-bit number over 10 times faster than the quadratic sieve.
The GNFS situation is a little more complicated than that. Today's factoring algorithms work by finding distinct square roots of the same quadratic residue modulo the number to be factored. Each such discovery yields an approximately 50% chance of factoring the number using Euclid's GCD algorithm. Since searching directly for a congruence of squares would be grossly inefficient, one sieves for relations involving arbitrary powers of numbers from a set called the "factor base", and after collecting an overdetermined number of such relations, finds their null space modulo two in a huge matrix operation. This yields a relation whose powers are all even, from which a congruence of squares may be constructed in an obvious manner. Most popular factoring methods, including NFS, GNFS, and numerous flavors of QS, use this general scheme, and differ only in the numbers to which they are applicable and in the methods they use to fish for relations in more densely populated mathematical waters.

GNFS uses a particularly cute trick, which is to express the number being factored as a polynomial with small coefficients, evaluated at a small argument. One can then construct a homomorphism from a ring of algebraic integers into Z/nZ. This permits sieving to be conducted in a particularly efficient fashion. Finding such a polynomial, unfortunately, is a far from straightforward task. The current state of the art is to start with a guess and flog it to death on a workstation for several days, attempting some sort of stepwise refinement. Although the problem is mathematically rich, no systematic method currently exists to pick the "best" polynomial, and the problem of doing so may be of a difficulty comparable to factoring itself. The speed with which GNFS runs, and the degree to which it outperforms QS, is extremely sensitive to the polynomial chosen, so the blanket statement that GNFS outperforms QS by a factor of 10 on 512-bit numbers is, in my opinion, a bit of an oversimplification.

GNFS is one of the most complicated computer algorithms ever constructed, sieving and factoring simultaneously in both a ring of algebraic integers and in Z/nZ. The algorithm has been known to experience "cycle explosions" in which unexpectedly large amounts of raw data are produced from relatively small numbers. It is certainly not something that can be regularly run in "production mode", and it requires a skilled operator (currently its creator) to help it coast smoothly through its various stages. I don't think GNFS is going to be available in shrink-wrapped form for quite some time. :)
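[The congruence-of-squares step Mike describes is easy to demonstrate on a toy number. The Python sketch below is an editorial illustration, not anyone's real sieve: the brute-force search stands in for the sieving and matrix work he outlines, and is exactly the "grossly inefficient" direct approach, feasible only for tiny n.]

    from math import gcd, isqrt

    def congruence_of_squares(n):
        # Find x, y with x^2 = y^2 (mod n) but x != +/-y (mod n); then
        # gcd(x - y, n) is a nontrivial factor of n.
        for x in range(2, n):
            r = x * x % n
            y = isqrt(r)
            if y * y == r and y != x and (x + y) % n != 0:
                f = gcd(x - y, n)
                if 1 < f < n:
                    return x, y, f

    n = 8051                                   # = 83 * 97
    x, y, f = congruence_of_squares(n)
    print("%d^2 = %d^2 (mod %d); gcd(%d - %d, %d) = %d" % (x, y, n, x, y, n, f))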
A related algorithm, the special number field sieve, can already factor numbers of a certain specialized form--numbers not generally used for cryptography--much faster than the general number field sieve can factor general numbers of the same size.
NFS and GNFS are essentially the same algorithm. NFS is simply a special case where a particularly simple polynomial is known, Z[a] is a unique factorization domain, and some other nice algebraic properties are present. In the case of a general integer, and a more complex polynomial, some things get messier.
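[For the curious, the polynomial construction under discussion can be illustrated in a few lines. The "base-m" method below is the standard textbook construction for general numbers; the Python code is an editorial sketch, and the degree and example number are arbitrary choices.]

    def base_m_polynomial(n, d=4):
        # Pick m with m^(d+1) > n and write n in base m; the digits a_i give
        # f(x) = a_d*x^d + ... + a_1*x + a_0 with f(m) = n, coefficients < m.
        m = int(round(n ** (1.0 / (d + 1)))) + 1
        coeffs, t = [], n
        for _ in range(d + 1):
            coeffs.append(t % m)
            t //= m
        return m, coeffs

    n = 2**64 + 1
    m, coeffs = base_m_polynomial(n)
    print("general: m = %d, coefficients = %s" % (m, coeffs))
    # For this *special* number a far nicer polynomial, x^4 + 1 at x = 2^16,
    # is known in advance -- that head start is essentially the NFS/GNFS
    # distinction described above.
    print("special: (2**16)**4 + 1 == n:", (2**16)**4 + 1 == n)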
It is not unreasonable to assume that the general number field sieve can be optimized to run this fast; it is possible that the NSA already knows how to do this.
I think this is unlikely. The difference in speed is due to the fact that NFS only factors specially chosen simple numbers, and GNFS factors anything. That is not something that is likely to be optimized away. Also, I think we make far too much of the magical ability of the NSA to do things. At the present point in time, most of the cryptomathematical expertise in the world is external to the NSA. The NSA didn't invent GNFS, or for that matter, public key cryptography.
Making predictions beyond the near future is even more foolish. Who knows what kind of advances in computing, networking, and mathematics are going to happen by 2020? However, if you look at the broad picture, in every decade we can factor numbers twice as long as in the previous decade.
GNFS probably represents the final step in the evolution of the "combination of congruences" factoring methods. Further refinements would probably be such complicated algorithms as to be impractical to program. Additional improvements in our ability to break RSA will probably come via some new factoring scheme that we are presently unaware of, or via a method of computing the inverse of the encryption permutation used by RSA which does not require explicit formation of the factors of the modulus.
Table 5: Long-range factoring predictions
Year    Key length (in bits)

1995           1024
2005           2048
2015           4096
2025           8192
2035         16,384
2045         32,768
I think factoring technology may reach its "Omega Point" long before 2045. Twenty years from now, we might be able to factor anything. I think predictions past ten years are pure speculation.
There is always the possibility that an advance in factoring will surprise me as well, but I think that unlikely.
I expect to be surprised by an advance in factoring momentarily. You are far too pessimistic. :)

--
Mike Duvos         $ PGP 2.6 Public Key available $
mpd@netcom.com     $ via Finger.                  $
Mike Duvos says:
Also, I think we make far too much of the magical ability of the NSA to do things. At the present point in time, most of the cryptomathematical expertise in the world is external to the NSA. The NSA didn't invent GNFS, or for that matter, public key cryptography.
I'm on both sides of this issue. On the one hand, the people in the open crypto community now equal, or soon will substantially exceed, in number the people in the black community, and the people in the open community have certain advantages in the way they do their work. On the other hand, the people in the black community have the advantage that they can read anything the open community produces but not vice versa; they have at least a 15-year edge in knowledge about the design of conventional systems; and who knows (we certainly have no idea) how much of an edge in the modern cryptographic arena. We don't know for sure whether the NSA knew about Public Key before the open community did. Certainly they knew of differential cryptanalysis and similar techniques, and they must know quite a lot that we don't.

The black community also has lots of day-to-day experience that we don't have, and they understand both the threat model and the practical side of things a lot better than we do.

Overall, I'd say that in the long run the open community is going to catch up regardless of what the NSA likes. That does not mean, however, that this is going to happen particularly soon, or that they don't still know decades more than we do.

Perry
A few weeks back Matt Blaze posted a top ten list of problems we face. I'll add two to that list. First is our inability to accurately assess the strength of various government agencies. We tend to make very pessimistic assumptions, which tends to be safe, but having real data on which to base our assumptions would be better.

The other problem we face is that people like Matt write solid essays on various things, and no one responds. People who write essays, post solid mathematical results, etc., bemoan this pretty regularly. Fortunately, this problem is easier to address. Try to spend more time on the posts that people took longer on. It's usually obvious which those are. The reason to spend more time on solid posts is that someone took the time to write well on something. If they get solid feedback, they'll do more solid writing, and the quality of discourse goes up.

Adam

Perry writes:
| The black community also has lots of day-to-day experience that we
| don't have, and they understand both the threat model and the
| practical side of things a lot better than we do.
|
| Overall, I'd say that in the long run the open community is going to
| catch up regardless of what the NSA likes. That does not mean,
| however, that this is going to happen particularly soon, or that they
| don't still know decades more than we do.

--
"It is seldom that liberty of any kind is lost all at once." -Hume
On Mon, 13 Feb 1995, Adam Shostack wrote:
A few weeks back Matt Blaze posted a top ten list of problems we face. I'll add two to that list. First is our inability to accurately assess the strength of various government agencies. We tend to make very pessimistic assumptions, which tends to be safe, but having real data on which to base our assumptions would be better.
What you're talking about here is a cypherpunks intelligence capability. If you think we are thought of as subversive and distasteful now, just wait to see what happens if anyone on the list outs the kind of information you're talking about regarding, e.g., the NSA or the Justice Department. Were this a private, closely held group instead of a public mailing list, you might have a different story.
The other problem we face is that people like Matt write solid essays on various things, and no one responds. People who write essays, post solid mathematical results, etc., bemoan this pretty regularly. Fortunately, this problem is easier to address. Try to spend more time on the posts that people took longer on. It's usually obvious which those are. The reason to spend more time on solid posts is that someone took the time to write well on something. If they get solid feedback, they'll do more solid writing, and the quality of discourse goes up.
There is a piece of Internet lore that says the more valuable and insightful a given article is, the less response it gets. I hope this is right, as most of mine tend to be ignored.
073BB885A786F666    nemo repente fuit turpissimus - potestas scientiae in usu est
6E6D4506F6EDBC17    quaere verum ad infinitum, loquitur sub rosa - wichtig!
Adam Shostack writes:
The other problem we face is that people like Matt write solid essays on various things, and no one responds. People who write essays, post solid mathematical results, etc., bemoan this pretty regularly.
I agree that this is a problem, but perhaps not so much as we might think. For one thing, communication sometimes develops in private. For instance, I exchanged some mail with Matt about Caller ID after his Top 10 Problems list. More importantly, it takes much more time, and in some cases expertise, to compose a good response to a long discourse than to reply to a short opinion piece or news report. Relatively few people have the time and ability to formulate a significant extension or rebuttal to a major work. This is natural and inevitable. It's probably unrealistic to expect a much greater frequency of such messages.

Black Unicorn writes:
# There is a piece of Internet lore that says the more valuable and
# insightful a given article is, the less response it gets.
# I hope this is right, as most of mine tend to be ignored.

Lately, I've had the feeling that majordomo@toad echoes my epistles only back to me. None of the longer pieces I've written has elicited so much as a flame from Eric, Perry, or even James in a while. As they say, "opinions are like assholes", and it's easy to argue about them. Netiquette strenuously discourages people from simply agreeing. This can be carried too far.

I've encountered an insidious hazard of high-volume lists (such as this one) that probably snares other people too. It's altogether too easy to sit at one's mailer and merely react to whatever comes along. Obviously, if everyone did this all the time, nothing of substance would ever be accomplished. It's therapeutic, IMHO, to step back regularly, refocus on one's long-term goals w.r.t. the group, and push new initiatives.

-L. Futplex McCarthy
L. McCarthy wrote:
Lately, I've had the feeling that majordomo@toad echoes my epistles only back to me. None of the longer pieces I've written has elicited so much as a flame from Eric, Perry, or even James in a while.
Should I feel left out by not being mentioned in this set? Or relieved? In any case, I agree that most responses are mostly reactive. Though in defense of the Cypherpunks list, not nearly so reactive as are many groups. Lots of lists and groups are dominated by in-jokes, non sequiturs, and other ephemera. At least this group quite often gets into meaty issues.
I've encountered an insidious hazard of high-volume lists (such as this one) that probably snares other people too. It's altogether too easy to sit at one's mailer and merely react to whatever comes along. Obviously, if everyone did this all the time, nothing of substance would ever be accomplished. It's therapeutic, IMHO, to step back regularly, refocus on one's long-term goals w.r.t. the group, and push new initiatives.
Like a lot of you, I try to do this regularly. If people are interested, they'll follow up. If not, they won't. Think of it as evolution in action.

It so happens that the latest theme I've been thinking about is ready to spring on you folks. If you respond, so be it. That theme is this: Is cyberspace, or the Net/Web/Etc., sufficiently rich or complex to meet our needs?

By "rich" or "complex" I mean in terms of "places to go," of "degrees of freedom." For example, the multiplicity of routing paths for messages, via remailers explicitly and via the underlying routing options the Internet itself offers implicitly, gives certain major advantages that a centralized system vulnerable to "choke points" would not have. (The Internet gurus will likely jump in at this point and blather about how this isn't so, how they could shut down the Internet in several minutes with just their Leatherman tool and a few O'Reilly books, but my point is not that it isn't _possible_, but that the direction in which the Net has moved is generally one that makes shut-down harder than more centralized alternatives.)

By our "needs" I mean roughly the Cypherpunks goals of privacy, free choice, cybernetic free association, virtual communities, anarcho-capitalism, etc. (Quibblers can dispute any of these, but clearly most active posters on the list advocate some vector made up of many of these diverse elements.)

So, what am I getting at? Consider how the abstractions of the World Wide Web, URLs, HTML, HTTP, and Web browsers have *increased the size of cyberspace* rather dramatically in just the past two years. More places to visit, more interconnectedness, more difficulties in controlling access to stuff, etc. Home pages containing banned material are proliferating (a la the Homolka-Teale trial in Canada, the Scientology material, and so on--this is not the place for me to recap this). Sure, ftp sites used to do this pretty well; in fact, I'm counting ftp sites in this "evolution" toward greater complexity (in the richness sense).

(Actually, cyberspace is partly getting "bigger" and partly "increasing in dimensionality." Dimensionality of a space can be related to how many neighbors one has... think of the two nearest neighbors one has in a 1-D space, the 4 (or 8, if diagonals are counted) neighbors in a 2-D space, the 6 in a 3-D space, and so on. Arguably, if one has "100 close neighbors" in a space, it is roughly a 50-dimensional space. An equivalent formulation is in terms of the radius of the n-sphere that everyone fits into. For example, the "six degrees of separation," the 6 "handshakes" that separate nearly any two people in America, suggests that American society is in some important sense roughly a 15-17 dimensional space, because in some sense all 250 million Americans "fit into" a hypersphere of radius 3 (diameter 6) when the dimensionality is around 17. (Or slightly lower, as the slight corrections to V = r^n have to be included, which I'm not bothering with.) What "increased connectivity" does is to increase dimensionality, about as one would expect from our usual metaphors about "a multidimensional society" and "the world is shrinking"... indeed it is shrinking, even as the absolute volume increases.)

What Cypherpunks should be pushing for, in my view, is this increased dimensionality. More places to stick things, more places to escape central control, and more degrees of freedom (which has a nice dual meaning I once used as the working title for a novel I was working on).
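[Tim's back-of-envelope figures land about where he says, for what it's worth. A short Python version of his two estimates, an editorial aside using exactly his stated assumptions and ignoring the volume corrections he waves off:]

    import math

    # Lattice heuristic: a point in a d-dimensional grid has 2*d nearest
    # neighbors, so "100 close neighbors" suggests d = 100 / 2 = 50.
    print("neighbor estimate:", 100 / 2, "dimensions")

    # Six degrees of separation: if volume scales like r^d and 250 million
    # people fit in a hypersphere of radius 3, then N ~ 3^d, so
    # d ~ ln(N) / ln(3), about 17.6.
    print("handshake estimate: %.1f dimensions" % (math.log(250e6) / math.log(3)))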
Is cyberspace already rich enough (= high enough dimensionality) that central control cannot be reestablished (to the extent it ever existed)? Many of us think it is probably already past this point, that the "point of no return" has been reached. After all, the Soviets couldn't stop samizdat, the Chinese couldn't stop fax machines, and the Americans can't stop drug use, so what hope is there in controlling modems, crypto, cellular phones, satellites, Web links, steganography, terabytes of data flowing unobstructed across borders, and so on? Just to "stop the Net" would disrupt the entire financial system, which not even Clinton or the next (Republican) President would be tempted to do... they might as well launch a nuclear war as try to shut down this "anarchic" (= high dimensionality) system.

But can we do more? One of my own wishes is to see hundreds (nay, thousands!) of remailers, as these act as "teleportation booths" which can dramatically increase connectivity. (They increase connectivity in a different way than just straight connections can... they "stitch together" otherwise visibly-connected regions with unobservable connections, a desirable thing.)

What else?

* Lots more remailers. Run them out of accounts, not just "remailer machines." Accounts allow trivial proliferation of more remailers.

* Web access remailers. Like the "anonymous anonymous ftp," why not explore combining Web systems with remailers? (Not so great for browsing, of course, but there should be some interesting possibilities.)

* More offshore sites, members, etc. This increases connectivity and increases the "regulatory arbitrage" we so often talk about.

* Local corporate computer nets are "extra rooms in cyberspace," and thus are harder to search. The equivalents of "rat lines" (in which drugs are kept in one apartment and retrieved through a hole in the wall, thus delaying/foiling searches and kick-in-the-door raids... think of how technology makes all this so much easier).

* Digital cash is of course of central importance. It glues commerce together, but also greases it (a dual metaphor, not a mixed one). In terms of the "richness" I'm talking about, it incentivizes the colonization of cyberspace, the expansion of this space, and the general richness.

* Alternative Nets, like FIDONet, are often lost in the discussion of "the Net," but perhaps we should take much greater interest in these alternatives. They make a crackdown harder, they lessen the dangers of a single-point attack, and they provide "genetic diversity" for building future Nets. (I'm not saying Cypherpunks have the time, expertise, or incentive to work on this, but just reminding folks that the Internet is not the end-all and be-all...)

* More users, more education, more articles... all increase dimensionality, by expanding the space (e.g., key software on more machines, accessible by more people, more home pages, etc.).

And so on. Increase the richness of cyberspace. More places, more avenues, more rooms, more more. Make sure there's a "there" there.

Well, I've written too much, and as folks have noted, long posts get fewer responses than do short ones, especially flamish ones. Personally, I think there are fewer long essays and analyses for the same reason there are fewer large predators than grass-munching herbivores.

--Tim May

--
..........................................................................
Timothy C. May         | Crypto Anarchy: encryption, digital money,
tcmay@netcom.com       | anonymous networks, digital pseudonyms, zero
                       | knowledge, reputations, information markets,
W.A.S.T.E.: Aptos, CA  | black markets, collapse of governments.
Higher Power: 2^859433 | Public Key: PGP and MailSafe available.
Cypherpunks list: majordomo@toad.com with body message of only:
subscribe cypherpunks. FAQ available at ftp.netcom.com in pub/tc/tcmay
Black Unicorn writes:
If you think we are thought of as subversive and distasteful now, just wait to see what happens if anyone on the list outs the kind of information you're talking about regarding, e.g., the NSA or the Justice Department.
Were this a private, closely held group instead of a public mailing list, you might have a different story.
IMHO, people generally don't share that sort of information here because they don't possess it, not because they fear potential repercussions. Plenty of list subscribers seem able to shed their inhibitions quite easily when it comes to criticizing organizations and revealing information. Full utilization of the most advanced available remailer features offers at least the appearance of sufficient anonymity for most people, as far as I can tell. [Perhaps we could run a short survey: please respond if you could dish dirt on a government security agency, but do not plan to do so on this list ;]

BTW, with an eye to avoiding rehashes of old flamewars, this does *not* constitute an invitation to scour the subscriber list for *.gov and *.mil and announce the results. Consult the archives if in doubt on this point.

-L. Futplex McCarthy
BTW, with an eye to avoiding rehashes of old flamewars, this does *not* constitute an invitation to scour the subscriber list for *.gov and *.mil and announce the results. Consult the archives if in doubt on this point.
Where are these archives? I haven't seen them. Thanks.
"Perry E. Metzger" <perry@imsi.com> writes: PM+The black community also has lots of day-to-day experience that we +don't have, and they understand both the threat model and the +practical side of things a lot better than we do. Perry, I don't see too much reason to suppose that this is true. The CIA has had almost everything wrong since 1960, and it routinely becomes clear a few weeks after the fact that smart bombs, air breathing missiles, SCUD hunts and the like are an almost total waste of good silicon. When the Russians first landed stuff on the Moon it was semi-amateurs in England, not the vaunted NASA folks, who were tuned in. My guess would be that the folks at NSA spend most of their time worrying about getting a better parking spot, the same as everybody else in Washington. PM+Overall, I'd say that in the long run the open community is going to +catch up regardless of what the NSA likes. That does not mean, +however, that this is going to happen particularly soon, or that they +don't still know decades more than we do. I think you're right about the superiority of the open community, but it may not be out there in the future. It may be accompli. -dlj. david.lloyd-jones@canrem.com * 1st 1.11 #3818 * Gingrich, n. abbrev. : "Giving to the rich".
My knowledge of number theory is negligible, but I've got one obvious comment: use algorithms where factoring is not a weakness. Isn't LUC safe from factoring?