[Man there's a lot of names from the old days on this list. Good to hear from you Lance :-]

I think the take-away from this issue is CAs should issue certificates only on keys used for signing. Say it's DSA, or ECDSA, which is a damn good choice because it is not even directly possible to encrypt with it (*), and the key usage will be marked sign-only, so there is no argument about its purpose. Then we disable any non-forward-secret ciphersuites (and forward-secret ciphersuites are, not coincidentally, the only ciphersuites that work with a signing-only server key). Then the only plausible reason to demand the signing key is to perform a MITM, not to access "encrypted data". Firstly MITM is more work, and secondly they'd at that point just as well play nicely and subpoena the operator to hand over some info inside the SSL stream if there's anything useful in there. In some countries there are explicit legal protections for signature-only keys. At best the subpoena could ask the operator to record the session keys via the SSL web server, however that feature is not present as far as I know.

I also think the weak point with lavabit was probably the in-mail and out-mail, as with silentcircle, and I presume that's the reason silent circle disabled email (though they could have secured internal sc-sc mail using eg the same end2end secure messaging architecture they use for messaging).

A further weak point of lavabit, as I understand it, is that it was actually sending the password to the server!! So the user private key was in the server RAM temporarily. Which is complete misdesign and makes you start to question Snowden's crypto tradecraft, which up to that point was looking pretty damn strong from the news reports.

Anyway, signature-only keys and forward-secrecy FTW already.

About software updates, I think we've reached the point of needing multiple independent public-interest code review bodies with signing authority together with the software vendor.
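For concreteness, restricting a server to forward-secret suites is a one-liner in most TLS stacks. A minimal sketch with Python's ssl module (the exact cipher string is an assumption; tune it for your OpenSSL build):

```python
import ssl

# Sketch: a TLS server context that only offers ephemeral-ECDH key
# exchange. With this, the server's long-term ECDSA key is used for
# signing only, so handing it over cannot decrypt past traffic.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.set_ciphers("ECDHE+AESGCM")

# Every negotiable suite is now either TLS 1.3 (always forward secret)
# or an ECDHE suite.
for c in ctx.get_ciphers():
    print(c["name"])
```

A client connecting to such a server simply cannot negotiate static-RSA key transport, so there is no "hand over the key and read old captures" option left.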
The other thing with open source is it can be forked if the main vendor goes wrong or is coerced. You see this kind of reasoning with the bitcoin foundation etc, as it's probably the highest open software assurance level on the planet, protecting > $1bn in bearer bitcoin value :)

The only possible exception to the coerced code change might be the hushmail thing, though I am kind of fuzzy about what exactly did happen. There were two versions, one like lavabit (server has key temporarily) and one real end2end as I recall, and one version of the story is that it was the non-end2end one whose user info got subpoenaed.

Adam

(*) Yes yes, I know you could abuse a DSA public key for another discrete-log encryption algorithm; however, reusing an asymmetric key for two different algorithms is considered risky practice, in case there is a way to use one as an oracle to attack the other.

On Thu, Oct 03, 2013 at 11:57:22AM -0700, coderman wrote:
On Thu, Oct 3, 2013 at 9:30 AM, Lance Cottrell <loki@obscura.com> wrote:
When architecting a system, it is critical that the operator of the system should not have access to the keys at all... ...

Rule #1: don't store clear text.
Rule #2: don't store decryption keys.
Rule #3: don't do decryption on the server.
Rule #4: treat all communications with people not implementing security on THEIR computers as insecure.
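Rules #2 and #3 imply deriving keys on the CLIENT, so the password never reaches server RAM (the lavabit misdesign upthread). A hypothetical sketch; the function name, KDF parameters, and key split are all illustrative assumptions, not anyone's actual protocol:

```python
import hashlib

# Sketch: one slow client-side KDF pass, split into two independent
# subkeys. Only the auth verifier ever crosses the wire, so the server
# can check a login without ever being able to decrypt the mailbox.
def derive_keys(password: bytes, salt: bytes):
    master = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000, dklen=64)
    decryption_key = master[:32]   # unwraps the mail key; never leaves the client
    auth_verifier = master[32:]    # the only value sent to the server for login
    return decryption_key, auth_verifier
```

Because the two halves of the KDF output are independent, a server that stores (or is compelled to hand over) the auth verifier learns nothing about the decryption key.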
some have suggested a rule #5: don't distribute updates automatically to your users and don't implement security critical functions in code that is delivered to the client via the server.
i have yet to see a definitive case of a US company forced to include a backdoor in their software or forced to use their software update channel to deliver a CALEA/intercept friendly version of code to the targeted customer. to date all of these requests appear to be off the record rather than enforced via judicial motion.
this is a shame, since out of date software itself poses significant risk, and is best resolved via automatic updates from the vendor.
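Adam's "multiple independent code review bodies with signing authority" idea upthread is one way to square this: keep automatic updates, but ship one only when at least k of n trusted parties vouch for it, so no single coerced vendor can push a backdoored build. A toy sketch; real deployments would use asymmetric signatures (Ed25519/ECDSA), and the HMAC tags here stand in purely to keep the sketch self-contained:

```python
import hashlib
import hmac

# Sketch: threshold acceptance of a software update. `tags` maps party
# name -> MAC over the update's hash; `keys` maps party name -> that
# party's verification key. Accept only if `threshold` parties check out.
def update_accepted(update: bytes, tags: dict, keys: dict, threshold: int) -> bool:
    digest = hashlib.sha256(update).digest()
    valid = sum(
        1
        for party, key in keys.items()
        if party in tags
        and hmac.compare_digest(hmac.new(key, digest, "sha256").digest(), tags[party])
    )
    return valid >= threshold
```

With threshold 2-of-3 (vendor plus two review bodies), a coerced vendor alone cannot get a targeted CALEA build past the updater, and a forked project can swap in new review bodies without changing the client logic.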