schneier@chinet.chinet.com says:
Making predictions beyond the near future is even more foolish. Who knows what kind of advances in computing, networking, and mathematics are going to happen by 2020? However, if you look at the broad picture, in every decade we can factor numbers twice as long as in the previous decade. This leads to Table 5.
I'm not sure I agree with this assumption. From current knowledge, it seems that factoring is still exponential -- we've just made some progress on the algorithms. That progress can't continue forever, though. Even on your most optimistic estimate (which would itself require some further advances), if the algorithms then remain stable, factoring stays exponential, and adding a constant number of bits to a number multiplies the complexity by a constant factor. Since computing speeds are also rising exponentially, but not superexponentially, the number of bits we could factor would grow only linearly with time -- that is, each decade would see numbers roughly another 60-80 digits longer being factored. This would mean that every doubling in key length buys us more than just a constant increase in the safety factor.

Perry
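A quick back-of-the-envelope sketch of the growth rates in Perry's argument, in Python. All of the constants here (the exponent scale K, the compute doubling period, the attacker's starting budget) are made-up illustrative values, not measurements; the point is only that an exponential factoring cost combined with exponentially growing compute yields linear growth in the factorable bit length, while doubling the key length multiplies the work by an exponentially large factor.

    # Toy model of the argument above; every constant is an assumed,
    # illustrative value, not a measurement.
    import math

    K = 20.0              # assumed: work to factor an n-bit number ~ 2**(n / K) operations
    DOUBLING_YEARS = 1.5  # assumed doubling period for computing speed
    BASE_OPS = 2.0 ** 40  # assumed operations an attacker can afford in year 0

    def ops_available(years):
        # Attacker's budget grows exponentially with time (Moore's-law style).
        return BASE_OPS * 2.0 ** (years / DOUBLING_YEARS)

    def factoring_work(bits):
        # Assumed purely exponential cost of factoring a bits-long modulus.
        return 2.0 ** (bits / K)

    def largest_factorable(years):
        # Largest bit length whose (exponential) cost fits the available budget.
        return K * math.log2(ops_available(years))

    for decade in range(5):
        years = 10 * decade
        print(f"year {years:2d}: about {largest_factorable(years):4.0f} bits factorable")

    # Linear growth: each decade adds the same K * 10 / DOUBLING_YEARS bits,
    # while doubling the key length from 1024 to 2048 bits multiplies the
    # attacker's work by 2**(1024 / K) -- far more than a constant factor.
    ratio = factoring_work(2048) / factoring_work(1024)
    print(f"work ratio for 2048 vs 1024 bits: 2**{math.log2(ratio):.0f}")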