On Sun, Jun 17, 2007 at 05:03:27AM -0400, Tyler Durden wrote:
> 11 months in 2007? How long will that take 18 months from now if we assume
> pure microprocessor advance a la Moore's law? Is there also a slow but
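Back-of-the-envelope, and purely under the assumption that the attack is
compute-bound and that usable throughput doubles every 18 months, the quoted
11-month figure scales like this (a minimal sketch, not a measurement):

    # Rough sketch: scale an 11-month brute-force run by assumed
    # Moore's-law throughput growth (doubling every 18 months).
    runtime_now = 11.0        # months, the figure quoted above
    doubling_period = 18.0    # months per throughput doubling (assumption)
    elapsed = 18.0            # months from now
    runtime_later = runtime_now / 2 ** (elapsed / doubling_period)
    print(runtime_later)      # -> 5.5 months

i.e. roughly half the time, if nothing but raw compute matters.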
Moore's law is about integration density, though; I'm not sure brute-force L1 code/data also doubles in the 12-18 month range. With multicores, it now probably does. It would be interesting to see how systems requiring very large precomputed lookup tables (GBytes and TBytes) for cryptanalytic attacks fare here.

--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A 7779 75B0 2443 8B29 F6BE