On Sat, Sep 03, 2016 at 05:29:07PM -0700, Riad S. Wahby wrote:
John <jnn@synfin.org> wrote:
The reason I asked: while updating a few certs at the office recently, I nuked an older F5 LTM device by installing a 4096-bit key/cert pair. The load on the appliance (Linux-based) shot up from less than 1 to about 30, and it became so excruciatingly slow that it was nearly impossible to back the change out (the web GUI and SSH were both nearly unresponsive).
I think this could be some crypto/SSL issue; consider asking them.
Multiplying two n-bit numbers naively costs O(n^2) operations (one can do better with Karatsuba and related tricks), and modular exponentiation costs O(n) multiplications, so O(n^3) in total. So assuming the device is using something like Montgomery multiplication with schoolbook multiplication underneath, doubling the key size from 2048 to 4096 bits should give roughly 2^3 = 8x increased load. 30x sounds like there *could* be some other issue with the implementation, but a significant slowdown is not unexpected.
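To make that back-of-the-envelope estimate concrete (a sketch of the cost model only, not tied to any particular implementation; the function name is made up): doubling n makes each schoolbook multiplication 4x more expensive and doubles the number of multiplications, for 8x overall; with Karatsuba (multiplication exponent log2(3) ≈ 1.585) the predicted factor drops to about 6x.

```python
def expected_slowdown(old_bits: int, new_bits: int, mult_exponent: float = 2.0) -> float:
    """Predicted modexp slowdown when the key grows from old_bits to new_bits.

    Cost model: ~n multiplications of n-bit numbers, each costing about
    n**mult_exponent (2.0 = schoolbook, ~1.585 = Karatsuba).
    """
    ratio = new_bits / old_bits
    return ratio ** (mult_exponent + 1)

print(expected_slowdown(2048, 4096))         # schoolbook: 8.0
print(expected_slowdown(2048, 4096, 1.585))  # Karatsuba: ~6.0
```

Either way, the observed 30x jump is well above what the asymptotics alone predict.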
This applies to a straightforward implementation. openssl protects against timing attacks, which slows things down further, but unless there is a bug, the load shouldn't jump that much. A 2010 thread discusses sign/verify timings with 100K-bit RSA keys; pycrypt (python + some libraries) is significantly faster than openssl. The keys/certs are somewhere on the interwebs: http://openssl-dev.openssl.narkive.com/Y60Bau1J/inconsistent-timings-for-rsa...

=====
inconsistent timings for rsa sign/verify with 100K bit rsa keys

        sign    verify
key1    5min    <1sec
key2    48min   21min

(tested on patched openssl 1.0.0a)
=====

Besides quantum computers, another threat is a private factoring algorithm sufficiently faster than the public ones: say, one more square root in the subexponential stuff, or significantly lower constants.
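Incidentally, the large sign-vs-verify gap for key1 above is what you'd expect: verification uses the short public exponent (typically 65537, only 17 bits), while signing uses a private exponent roughly as long as the modulus, so verify does a handful of squarings where sign does on the order of n. A toy sketch with deliberately small made-up primes (for illustrating the exponent-size asymmetry only, never a real key):

```python
# Toy RSA with small, made-up primes -- illustrates why verify is much
# cheaper than sign, NOT usable as real cryptography.
p, q = 1000003, 1000033          # small primes (hypothetical key material)
n = p * q
e = 65537                        # short public exponent: verify is cheap
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)              # long private exponent (Python 3.8+ modular inverse)

m = 123456
sig = pow(m, d, n)               # "sign": long exponent -> the slow operation
assert pow(sig, e, n) == m       # "verify": 17-bit exponent -> the fast one
print(d.bit_length(), "bit private exponent vs", e.bit_length(), "bit public")
```

That said, asymmetry alone doesn't explain the key1/key2 inconsistency in the quoted thread, which is presumably why it was reported as a possible bug.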