Re: why compression doesn't perfectly even out entropy
Is it possible to find a percentage of the key space to eliminate that will optimize security assuming that the attacker will try the easy stuff first (and is it possible to quantify "easy stuff")?
If you eliminate all repeating byte sequences, such as 00 00 or 7F 7F, you will reduce your possible entropy by 0.07058% (to 7.99435 bits per byte), and eliminate the (astronomically remote) possibility of Hamlet or some other English text popping out of your RNG/PRNG. As long as your key is long enough to withstand this slight entropy reduction, you are still OK.
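[Editor's note: Jon's figures check out under the reading that no byte may equal its immediate predecessor: the first byte has 256 choices and every later byte 255, so the entropy rate tends to log2(255) bits per byte. A quick sketch of the arithmetic, in Python:]

```python
import math

# If no byte may equal its immediate predecessor, the first byte has
# 256 choices and every later byte 255, so the entropy rate tends to
# log2(255) bits per byte.
bits_per_byte = math.log2(255)

# Fractional loss relative to a fully uniform 8-bit byte.
loss_pct = (8 - bits_per_byte) / 8 * 100

print(f"{bits_per_byte:.5f} bits/byte")  # 7.99435
print(f"{loss_pct:.5f}% reduction")      # 0.07058
```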
JonWienke@aol.com writes:
Is it possible to find a percentage of the key space to eliminate that will optimize security assuming that the attacker will try the easy stuff first (and is it possible to quantify "easy stuff")?
If you eliminate all repeating byte sequences, such as 00 00 or 7F 7F, you will reduce your possible entropy by 0.07058% (to 7.99435 bits per byte), and eliminate the (astronomically remote) possibility of Hamlet or some other English text popping out of your RNG/PRNG. As long as your key is long enough to withstand this slight entropy reduction, you are still OK.
Before making pronouncements like "you are still OK," you ought to learn a bit more about cryptanalysis. It's tiny little statistical toeholds like that which permit breaks. I don't know for sure, but my intuition says that there may very well be instances in which a couple of little nicks like that into the entropy of a key are sufficient to radically lower the time to crack something. Since there are far better techniques available for assuring the quality of a random stream (hash distillation, for instance), Jon's suggested techniques should be regarded as unnecessary and dangerous.

PUBLIC SERVICE ANNOUNCEMENT: For the benefit of everyone reading, I've become increasingly convinced that Jon really doesn't understand the topic he's working on well enough to be trusted, and he doesn't have the sense to know that he doesn't understand it well enough. I know enough to know that I'm extremely ignorant; he's ignorant enough to think he knows more than he does. I don't mean to insult Jon -- I'm sure that in his own field, whatever it is, he's a smart enough guy, and he seems like a nice enough fellow -- but cryptography is a dangerous business, and bad technique KILLS, literally. Until Mr. Wienke loses his bad case of hubris, I would suggest not taking his technical suggestions.

Perry
In article <199604181352.JAA08215@jekyll.piermont.com>, Perry E. Metzger <perry@piermont.com> wrote:
If you eliminate all repeating byte sequences, such as 00 00 or 7F 7F, you will reduce your possible entropy by 0.07058% (to 7.99435 bits per byte), and eliminate the (astronomically remote) possibility of Hamlet or some other English text popping out of your RNG/PRNG. [...]
Before making pronouncements like "You are still OK" you ought to learn a bit more about cryptanalysis. [...]
Then I propose the following scheme. (I've proposed it before.)

My entropy cruncher takes in random noise from a number of diverse sources (some possibly of dubious quality). I take *all* the noise and run it through a hash function to distill entropy. Now I need some method of estimating when I have enough entropy in the random noise I'm crunching.

First rule: be conservative. One can never have too much entropy in the input to the hash function. Therefore, I suggest making a *copy* of the input noise stream, running it through Jon Wienke's "this shouldn't happen" filter, and feeding the result to some entropy estimator. When the entropy estimator says "I've got 1000 bits of entropy", I stop crunching.

This is conservative design, folks. Using Wienke's filter in this manner can't be any weaker than not using it at all. (Agreed?) Now you can go argue about whether the extra design complexity is worth it, if you like. <shrug>

P.S. To forestall confusion, let me be explicit about what I'm *not* proposing: I *don't* want you to apply Wienke's filter to the input or output of the hash function. Applying Wienke's filter to the random noise stream itself, to the input of the hash function, or to the output of the hash function is clearly a bad idea. (The mathematician says "clearly", knowing full well that, unfortunately, some small part of the audience probably doesn't get it... <sigh>)

This is what the "POTP" snake oil folks were proposing -- they had some "quality control" process that they applied to the one-time pads they generated. I think they said they regularly eliminated 70% of the pads as "defective". This was supposed to be encouraging :-) If you don't understand why the POTP "quality control" process was laughable, let someone else design the entropy cruncher!!!
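[Editor's note: under one reading of Wagner's scheme, a minimal sketch might look like the following. Every concrete choice here is an assumption, not part of his proposal: the filter is taken to discard any chunk containing an adjacent repeated byte, the estimator is a crude order-0 frequency count (at best an upper bound, as Perry notes downthread), and SHA-256 stands in for "a hash function".]

```python
import hashlib
import math
import os
from collections import Counter

def repeat_filter(chunk: bytes) -> bytes:
    # Hypothetical stand-in for Wienke's "this shouldn't happen" filter:
    # discard any chunk containing an adjacent repeated byte (00 00, 7F 7F, ...).
    if any(a == b for a, b in zip(chunk, chunk[1:])):
        return b""
    return chunk

def estimate_entropy_bits(data: bytes) -> float:
    # Crude order-0 frequency estimate of total entropy in `data`.
    # At best an upper bound on the true entropy of the source.
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    per_byte = -sum(c / n * math.log2(c / n) for c in counts.values())
    return per_byte * n

def distill(noise_source, target_bits=1000):
    # Hash *all* the raw noise; the filtered copy feeds only the estimator,
    # never the hash input or output.  Note: a source that never passes the
    # filter would loop forever here; a real design needs a failure path.
    h = hashlib.sha256()
    filtered_copy = b""
    while estimate_entropy_bits(filtered_copy) < target_bits:
        chunk = noise_source()
        h.update(chunk)                        # every raw byte goes in
        filtered_copy += repeat_filter(chunk)  # copy feeds the estimator
    return h.digest()

# Demo with the OS RNG standing in for "diverse noise sources", 32-byte chunks.
key = distill(lambda: os.urandom(32))
print(len(key))  # 32 (SHA-256 digest)
```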
David Wagner writes:
Therefore, I suggest making a *copy* of the input noise stream, running it through Jon Wienke's "this shouldn't happen" filter, and feeding the result to some entropy estimator. When the entropy estimator says "I've got 1000 bits of entropy", I stop crunching.
This is conservative design, folks. Using Wienke's filter in this manner can't be any weaker than not using it at all. (agreed?)
Unfortunately, I think his filter puts too high a bound on the entropy -- put it this way: I think it only gives you an upper bound. Furthermore, he's using this technique because he's using spinners as RNGs, of which I have a substantial fear. You are correct that this mechanism is no worse than not using it at all, but it doesn't substitute for doing a thorough systems analysis to try to figure out how much entropy there actually is in your source. Thus, to summarize: yes, I agree with your strict statement that using the filter this way is not weaker than not using it at all, but I'm not sure it is worthwhile in this case, because it isn't sufficient.
Applying Wienke's filter to the random noise stream itself, to the input of the hash function, or to the output of the hash function is clearly a bad idea.
Agreed.
(The mathematician says "clearly", knowing full well that, unfortunately, some small part of the audience probably doesn't get it... <sigh>)
Sad but true. Perry
participants (3):
- daw@cs.berkeley.edu
- JonWienke@aol.com
- Perry E. Metzger