Re: [cryptography] Question About Best Practices for Personal File Encryption
On Aug 15, 2014 11:06 PM, "Mark Thomas" <mark00thomas@gmail.com> wrote:
> I have a question for the group, if I may ask it here and in this manner (?).
>
> What are you guys using to encrypt individual files and folders, or even
> entire drives like a USB drive?
>
> I am thinking that:
>
> 1. Any commercial product could be compromised and not completely secure.
> Like Apple’s FileVault2, which Apple has a key to.

The comment about Apple is simply false. Apple does not have a key to
FileVault2 unless you escrow your key with them. I know this because a dear
friend recently passed, and his family was not able to gain access to his
encrypted drives through Apple. That said, FileVault2 is susceptible to
offline dictionary attacks on the password, or if you can get access while
the drive is online, there are attacks on the Keychain.
> 2. It is probably open source.

What makes you think open source will save you? All the eyeballs looking at
the code? That was shown to be a false sense of security when Heartbleed was
announced.
> 3. It is probably implemented with the command line.
>
> Am I on the right track? If so, does anyone know of a helpful guide to get
> started with OpenSSL on the command line besides the man pages?
> Regards,
> Mark
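To the question about getting started with OpenSSL on the command line: here
is a minimal sketch of password-based file encryption with the "openssl enc"
subcommand. The file names are placeholders and exact options vary by OpenSSL
version, so check your local man pages before relying on it:

  # Encrypt one file with AES-256-CBC; you will be prompted for a passphrase.
  openssl enc -aes-256-cbc -salt -in notes.txt -out notes.txt.enc

  # Decrypt it again.
  openssl enc -d -aes-256-cbc -in notes.txt.enc -out notes.txt

  # For a folder, tar it first and encrypt the archive read from stdin.
  tar czf - Documents/ | openssl enc -aes-256-cbc -salt -out documents.tgz.enc

Note that older OpenSSL releases derive the key from the passphrase with a
very fast key-derivation step, so a weak passphrase is exposed to exactly the
kind of offline dictionary attack mentioned above; newer releases add -pbkdf2
and -iter options to strengthen that step.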
On Sat, Aug 16, 2014 at 04:21:53PM -0500, Christopher Nielsen wrote:
> The comment about Apple is simply false. Apple does not have a key to
> FileVault2 unless you escrow your key with them. I know this because a dear
> friend recently passed, and his family was not able to gain access to his
> encrypted drives through Apple.
You may be right or you may not, but I certainly have to think that if there
is a backdoor password to FileVault2, it is quite likely that Apple would not
choose to disclose that fact to just some random user who had lost files due
to a forgotten password.

One imagines that unless Apple wants to declare their security breakable -
and presumably bear the burden of every law enforcement agency, divorce
attorney, corporate trial lawyer, and government intelligence operation
around the world, along with millions of users with various grades of good
and bad stories about why they need Apple to break into FileVault2
partitions, demanding help (often for much less than it costs Apple to
provide it and to handle the legal costs of validating the reasons for, and
the authority of, the requester) - they would not wish to share the fact that
there is a deliberate backdoor mechanism, or even a known bug, that would
allow them to break in.

And that of course raises the question of whether such a publicly
acknowledged backdoor could ever be kept secret and reserved for Apple alone,
as it would become an instant target for every hacker, spy, and corporate
espionage type to reverse engineer... or steal from inside Apple.

On the other hand, given the right appeals to patriotism and national
security, along with blackmail-type arm twisting from certain governments,
I'd not be sure they would not provide help, or have not been forced to
design things so they can. Only a few folks at Apple probably know the real
truth about this... one way or the other.

--
Dave Emery N1PRE/AE, die@dieconsulting.com  DIE Consulting, Weston, Mass 02493

"An empty zombie mind with a forlorn barely readable weatherbeaten 'For Rent'
sign still vainly flapping outside on the weed encrusted pole - in celebration
of what could have been, but wasn't and is not to be now either."
On Sat, Aug 16, 2014, at 11:21 PM, Christopher Nielsen wrote:
> > 2. It is probably open source.
>
> What makes you think open source will save you? All the eyeballs looking at
> the code? That was shown to be a false sense of security when Heartbleed was
> announced.
Can we please stop perpetuating the idea that open source is the less secure
option? Linus said "given enough eyeballs, all bugs are shallow"; he didn't
say "all bugs are non-existent".

Given an open source program, it can be held accountable by anyone. If there
is a bug, it can be patched. If there is a deliberate backdoor, it can be
pointed to as an example of why to completely abandon the program and mark
the developer as tainted forever.

Given a proprietary program, it is accountable only to the supplier and you
have no other option. If there is a bug, all you can do is hope for a patch.
If there is a deliberate backdoor, all you can do is hope that someone spots
it if the program is ever reverse engineered.

In other words:

- Open source: "trust, but verify"
- Proprietary: "trust, and have faith in the supplier"

Given the current Snowden climate, you would be naive to choose a proprietary
option. Prove me wrong.

Alfie

--
Alfie John
alfiej@fastmail.fm
On Sun, Aug 17, 2014 at 10:56:33PM +0200, Alfie John wrote:
> Given an open source program, it can be held accountable by anyone. If there
> is a bug, it can be patched. If there is a deliberate backdoor, it can be
> pointed to as an example of why to completely abandon the program and mark
> the developer as tainted forever.
I'm a significant proponent of open source, and the benefits you enumerate here are definitely true. Open source can be helpful in reviewing code, in grokking developer intent, in providing a hash-chain guarantee of code lineage, in providing change history and justification when reviewing new releases of a previously audited program, and in fostering positive engineering practices. However --
> Given a proprietary program, it is accountable only to the supplier and you
> have no other option. If there is a bug, all you can do is hope for a patch.
> If there is a deliberate backdoor, all you can do is hope that someone spots
> it if the program is ever reverse engineered.
Your "proprietary program" strawman is full of holes. The intellectual labor
of decompiling a program delivered as a binary is not especially large
compared to the labor required to do a thorough systematic review. Given IDA
Pro and a non-obfuscated Win32 or Linux app, people I trust say the
decompilation process is on the order of 10%-20% of the total effort of a
review.

Binary patches are not great by any means, but they are definitely a feasible
method of deploying fixes, and this method works and is well tested in the
real world. Some kinds of deployments basically require binary patching, no
matter what the underlying source management technology. (The Linux Ksplice
project provides one prominent example.)

Backdoors are an enormous problem for both open source and binary-distribution
codebases, and claiming that open source will save you from backdoors ignores
the reality of the situation. Just to start:

http://underhanded.xcott.com/
http://www.wired.com/2013/04/underhanded-c-contest/
http://graphics.stanford.edu/~danielrh/vote/vote.html
http://codegolf.stackexchange.com/questions/tagged/underhanded?sort=votes&pageSize=50

"Building Reliable Voting Machine Software", Ka-Ping Yee,
http://zesty.ca/voting/ (page 148 of http://zesty.ca/pubs/yee-phd.pdf)
provides a sobering assessment of the difficulty of finding intentionally
inserted bugs in open source software.

-andy
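P.S. To make the underhanded point concrete, here is a small hypothetical C
fragment (not taken from any real project or contest entry) showing how
ordinary a deliberate hole can look to a reviewer:

  #include <stdio.h>
  #include <string.h>

  /* The length handed to memcmp() is controlled by the caller, so an empty
   * supplied password (length 0) compares equal to anything: memcmp over
   * zero bytes returns 0. On a casual read this looks like a normal check. */
  static int check_password(const char *stored, const char *supplied,
                            size_t supplied_len)
  {
      return memcmp(stored, supplied, supplied_len) == 0;
  }

  int main(void)
  {
      const char *stored = "correct horse battery staple";
      printf("wrong pw accepted: %d\n", check_password(stored, "guess", 5));
      printf("empty pw accepted: %d\n", check_password(stored, "", 0));
      return 0;
  }

Bugs like this survive review precisely because they read as ordinary code;
having the source is necessary for finding them, but it is nowhere near
sufficient.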