On Thu, Oct 3, 2013 at 9:30 AM, Lance Cottrell <loki@obscura.com> wrote:
When architecting a system, it is critical that the operator of the system should not have access to the keys at all. ...
Rule #1: don't store clear text.
Rule #2: don't store decryption keys.
Rule #3: don't do decryption on the server.
Rule #4: treat all communications with people not implementing security on THEIR computers as insecure.
some have suggested a rule #5: don't distribute updates automatically to your users, and don't implement security-critical functions in code that is delivered to the client via the server. i have yet to see a definitive case of a US company forced to include a backdoor in their software, or forced to use their software update channel to deliver a CALEA/intercept-friendly version of the code to a targeted customer. to date all of these requests appear to be off the record rather than enforced via judicial motion. this is a shame, since out-of-date software itself poses significant risk, and is best addressed via automatic updates from the vendor.
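one partial mitigation for the update-channel risk above is verifying each downloaded artifact against a digest published through an independent channel (the reproducible-builds / pinned-hash approach), so the vendor's delivery path alone cannot target one user with modified code. a minimal sketch; the function name and the out-of-band publication step are assumptions, not anything from this thread:

```python
import hashlib

def verify_update(update_blob: bytes, pinned_sha256_hex: str) -> bool:
    """Compare the downloaded update against a SHA-256 digest obtained
    out-of-band (e.g. published by multiple independent builders).
    Only install if the digests match."""
    return hashlib.sha256(update_blob).hexdigest() == pinned_sha256_hex
```

this only moves trust to whoever publishes the digest, of course; it doesn't help if the vendor's signed-and-published build is itself backdoored, which is the reproducible-builds argument for independent rebuilders.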
Email security for systems designed to work with outsiders who don't use the tool is particularly problematic. The operator can use public keys to encrypt traffic as it arrives, but can easily be compelled to reveal the arriving clear-text messages before encryption.
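the encrypt-on-arrival design being described can be sketched as below. this is a toy using textbook RSA with tiny hardcoded primes (no padding, insecure key size, names invented for illustration) purely to show where the private key lives under rules #1-#3; the point stands regardless of the real cipher used:

```python
# Toy sketch of encrypt-on-arrival. The server holds only the user's
# PUBLIC key; the matching private key never leaves the user's machine.
# Textbook RSA, p=61 q=53 => n=3233, e=17, d=2753 -- illustrative ONLY.

N, E = 3233, 17   # public key: this is all the server stores
D = 2753          # private key: stays with the user

def server_encrypt_on_arrival(plaintext: bytes) -> list[int]:
    """Rules #1-#3: the server encrypts each arriving message with the
    public key and stores only ciphertext it cannot decrypt."""
    return [pow(b, E, N) for b in plaintext]

def user_decrypt(ciphertext: list[int]) -> bytes:
    """Decryption happens only on the user's own machine."""
    return bytes(pow(c, D, N) for c in ciphertext)

stored = server_encrypt_on_arrival(b"hello")
assert user_decrypt(stored) == b"hello"
```

note this illustrates exactly the weakness in the paragraph above: the server necessarily sees the plaintext for the instant before it calls the encrypt step, so a compelled operator can be made to copy messages at that point.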
i'll avoid repeating my "email is for public communication" rant ;)
Was it the SSL certificate for SMTP TLS that was being requested? ... That is hardly highly secured content. The HTTPS sessions might reasonably be considered more sensitive and secure.
my reading of this sequence of motions is that at least five different keys were requested, which seems to imply _all_ SSL/TLS keys, including those for HTTPS sessions. e.g. they can request "pen register" information for web traffic! (we're a long way from just the dialed digits days...)