CDR: RE: Paranoid Encryption Standard (was Re: Rijndael & Hitachi)
Are you guys still talking about the feasibility of a cipher that implements each AES candidate in turn with the same key? I don't really get this idea. Provided you were actually using the same key with each stage of the encryption, then your system is only going to be as secure as the key of the first algorithm. In fact, it seems that if the key is compromised at any one point, then the entire system is shot, given that you know the algorithm (Kerckhoffs's principle IIRC). Maybe I am misunderstanding.

ok,
Rush Carskadden

-----Original Message-----
From: Arnold G. Reinhold [mailto:reinhold@WORLD.STD.COM]
Sent: Friday, October 27, 2000 12:29 PM
To: Damien Miller
Cc: John Kelsey; Bram Cohen; cryptography@c2.net; cypherpunks@cyberpass.net
Subject: Re: Paranoid Encryption Standard (was Re: Rijndael & Hitachi)

At 4:16 PM +1100 10/27/2000, Damien Miller wrote:
On Thu, 26 Oct 2000, Arnold G. Reinhold wrote:
[It is a] simple way to combine the AES finalists and take advantage of all the testing that each has already undergone. And, IMHO, it is an interesting theoretical question as well. Even if the answer is "yes," I am not advocating that it be used in most common applications, e.g., network security, because there are so many greater risks to be dealt with. But it might make sense in some narrow, high-value applications.
What threat model do you propose that would require this?
o Your opponent has the cryptologic capabilities of a major world power
o The content has very high value (a multi-billion-dollar deal, could bring down a government, could start a war)
o Long-term protection is required (30+ years)
o You are in a position to properly secure the terminals at both ends
o Efficiency is not a concern

For example, a chief of state's personal diary, an opposition leader's communications, best and final bids on large projects, etc.
I can't think of anything that isn't contrived and couldn't be served by using 3DES.
In a way I see this question as how one should manage the transition from 3DES to AES. Does one keep using 3DES until the big day and then switch to AES? Or does a blended solution make sense in some cases?

While I think there may be a use for something like a Paranoid Encryption Standard in very unusual situations, I don't wish to waste more of people's time arguing with those who say there's no need for it at all. I don't have any compelling evidence; it's pure speculation. I am really more interested in the theoretical "why not?" question, i.e. is there any real downside to combining ciphers in this way, besides efficiency? Conventional wisdom seems to be more cautious than I think is justified, and I am trying to show that.

Arnold Reinhold
At 1:00 PM -0500 10/27/2000, Carskadden, Rush wrote:
Are you guys still talking about the feasibility of a cipher that implements each AES candidate in turn with the same key? I don't really get this idea. Provided you were actually using the same key with each stage of the encryption, then your system is only going to be as secure as the key of the first algorithm. In fact, it seems that if the key is compromised at any one point, then the entire system is shot, given that you know the algorithm (Kerckhoffs's principle IIRC). Maybe I am misunderstanding.
That is the theoretical question that I am asking. What you say appears to be the conventional wisdom, and I am claiming that it is wrong. As long as there is some way to make sure that none of the ciphers in a chain is the inverse of another, or close to an inverse in some sense, then I claim that, as long as one of the ciphers is strong, there is no way to extract any information about the keys from the other ciphers, even if they are all designed to reveal that information.

As a practical matter, you may as well derive the subkeys from the master key using a one-way hash, but I am questioning the theoretical justification for doing that. Massey and Maurer base their paper on oracles that give you the key for all component ciphers but one. I am saying such oracles cannot exist if one of the ciphers is strong and "inverses" of the strong cipher are excluded.

Arnold Reinhold
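Concretely, the hash-based subkey derivation described above might look something like this minimal Python sketch (the choice of SHA-256, the label scheme, and the use of the five finalist names are illustrative assumptions, not anything specified in this thread):

    import hashlib

    def derive_subkey(master_key: bytes, cipher_name: str) -> bytes:
        # Hash the master key together with a per-stage label so that
        # each cipher in the chain gets its own 256-bit subkey; learning
        # one subkey reveals nothing about the others unless the hash
        # can be inverted.
        return hashlib.sha256(master_key + b"|" + cipher_name.encode()).digest()

    master = b"example master key material"
    subkeys = {name: derive_subkey(master, name)
               for name in ("MARS", "RC6", "Rijndael", "Serpent", "Twofish")}

Derived this way, the per-stage keys are computationally independent even though they come from a single master key, which is exactly the practical hedge whose theoretical necessity is being questioned.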
-----BEGIN PGP SIGNED MESSAGE-----

At 04:20 PM 10/27/00 -0400, Arnold G. Reinhold wrote:
At 1:00 PM -0500 10/27/2000, Carskadden, Rush wrote:
Are you guys still talking about the feasibility of a cipher that implements each AES candidate in turn with the same key? I don't really get this idea. Provided you were actually using the same key with each stage of the encryption, then your system is only going to be as secure as the key of the first algorithm. In fact, it seems that if the key is compromised at any one point, then the entire system is shot, given that you know the algorithm (Kerckhoffs's principle IIRC). Maybe I am misunderstanding.
Actually, it doesn't work this way. Nearly all block ciphers are product ciphers, which means you take some relatively short key, derive ``subkeys'' (often in some simple linear way) for each round, and apply the rounds in sequence to get a strong cipher. Each round is weak enough to be broken, usually pretty trivially. So the non-independence of keys for successive rounds isn't generally enough to make the cipher weak.
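A toy illustration of that product-cipher structure, as a Python sketch (a few Feistel rounds on 16-bit blocks with a deliberately simple key schedule; this is a teaching toy, not a real cipher):

    def toy_feistel_encrypt(block16: int, key16: int, rounds: int = 4) -> int:
        # Round subkeys are derived from the short master key by shifts
        # and XORs (a simple, essentially linear schedule), and each
        # round alone is trivially breakable; whatever strength exists
        # comes from composing the rounds in sequence.
        left, right = block16 >> 8, block16 & 0xFF
        for r in range(rounds):
            subkey = ((key16 >> r) ^ (key16 >> (r + 5)) ^ r) & 0xFF
            left, right = right, left ^ ((right * 167 + subkey) & 0xFF)
        return (left << 8) | right

    print(hex(toy_feistel_encrypt(0x1234, 0xBEEF)))

The round subkeys here are strongly related to each other, just as in real product ciphers, which is the point: closely related stage keys are not automatically fatal.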
That is the theoretical question that I am asking. What you say appears to be the conventional wisdom, and I am claiming that it is wrong.
The conventional wisdom is not that it's bad, but that it *could* be bad. There are good reasons to expect that E = Twofish, F = Rijndael with the same key will be no weaker than the stronger of Twofish or Rijndael. But you can at least imagine choices of E and F so pathological that the result is weaker. (I know you understand this, but I think it's useful to state exactly what we mean with stuff like this, especially on a list.)
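For reference, the cascade in question is plain functional composition under a shared key, C = F_K(E_K(P)). A minimal Python sketch (the two stand-in "ciphers" are toy byte permutations, assumed purely for illustration):

    def cipher_e(p: int, k: int) -> int:
        # Stand-in for the first cipher: a keyed permutation of bytes.
        return (p ^ k) & 0xFF

    def cipher_f(p: int, k: int) -> int:
        # Stand-in for the second cipher: another keyed byte
        # permutation (multiplication by 3 is invertible mod 256).
        return (3 * p + k) & 0xFF

    def cascade_encrypt(p: int, k: int) -> int:
        # The construction under discussion: apply E, then F, both
        # keyed with the same key K.
        return cipher_f(cipher_e(p, k), k)

The pathological case is visible even at this scale: if cipher_f were chosen to be the inverse of cipher_e under the shared key, the cascade would collapse to the identity map.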
As long as there is some way to make sure that none of the ciphers in a chain is the inverse of another, or close to an inverse in some sense, then I claim that, as long as one of the ciphers is strong, there is no way to extract any information about the keys from the other ciphers, even if they are all designed to reveal that information.
Hmmm. I'm trying to think of what condition you need to specify for this. You need something stronger than ``not the inverse,'' but I am not sure what. If you say ``not too close to the inverse,'' we need to figure out how to measure distance. For example, one way of measuring the distance between two permutations E and F is to count the inputs x for which E(x) == F(x). In that case, you could require E^{-1}(x) (the inverse of E) and F(x) to be at least a certain distance apart. But you have to be careful with that: what do you do when E is Twofish without the whitening, and F is the inverse of Twofish without the whitening and missing the last two rounds? Composing them cancels nearly everything, yet F() and E^{-1}() won't be close at all in the sense of our distance metric.
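To make that candidate metric concrete, here is a small Python sketch over 8-element permutations (the permutations themselves are arbitrary examples):

    def agreement(E, F, domain):
        # Number of inputs x on which the two permutations agree; one
        # candidate distance is then len(domain) minus this count.
        return sum(1 for x in domain if E[x] == F[x])

    # Two arbitrary permutations of {0..7}, plus the inverse of the first.
    E = [3, 0, 7, 1, 5, 2, 6, 4]
    E_inv = [0] * 8
    for x, y in enumerate(E):
        E_inv[y] = x
    F = [1, 4, 0, 6, 3, 7, 2, 5]

    print(agreement(E_inv, F, range(8)))      # 1: F is "far from" E's inverse
    print(agreement(E_inv, E_inv, range(8)))  # 8: identical permutations agree everywhere

As the stripped-down Twofish example shows, a large distance from E^{-1} under this metric is not sufficient for safety: F can agree with E^{-1} pointwise almost nowhere and still compose with E to cancel nearly all of its rounds.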
As a practical matter, you may as well derive the subkeys from the master key using a one-way hash, but I am questioning the theoretical justification for doing that. Massey and Maurer base their paper on oracles that give you the key for all component ciphers but one. I am saying such oracles cannot exist if one of the ciphers is strong and "inverses" of the strong cipher are excluded.

Arnold Reinhold

I'll comment more on this from another note of yours. I think you're probably right, but that we need to figure out how to really nail that argument down, which means specifying exactly what's meant by ``close to an inverse,'' or whatever.
--John Kelsey, Counterpane Internet Security, kelsey@counterpane.com
PGP Fingerprint: 5D91 6F57 2646 83F9 6D7F 9C87 886D 88AF

-----BEGIN PGP SIGNATURE-----
Version: PGPfreeware 6.5.1 Int. for non-commercial use <http://www.pgpinternational.com>
Comment: foo

iQCVAwUBOft6CiZv+/Ry/LrBAQH7PwQAtqG0DsDfKxQllVQcWljBK8xjQePZeV4k
Z40lMrsVHghwhsfyWqepISENrWJkepbIQ7ECc3TvYkVX4Yh13fZA/xnfoYX++sQx
AfnVozmp2an/I0rKdxCB47UGmTqVsiOIVHIHlT18H4UAzOft9/Nu8FPEIwllp8yH
aoHSwqS+4EE=
=5AUg
-----END PGP SIGNATURE-----
At 9:16 PM -0400 10/28/2000, John Kelsey wrote:
I'll comment more on this from another note of yours. I think you're probably right, but that we need to figure out how to really nail that argument down, which means specifying exactly what's meant by ``close to an inverse,'' or whatever.
I have some ideas on this, based on the earlier note, but I think I should take some time and write them up more formally.

Arnold Reinhold
participants (3):

- Arnold G. Reinhold
- Carskadden, Rush
- John Kelsey