Randomness and RE>OTPs

Scott Collins scott_collins at genmagic.genmagic.com
Wed Jan 27 10:40:15 PST 1993


>Does anybody remember a good recipe for converting a biased RNG into an
>unbiased one?  I can't think of one off the top of my head, and that's
>what Thug's friend seems to need.  This has been discussed at length in
>the literature.

1. If you want randomness, introducing order is bad.  As Eric Hughes
pointed out, trimming runs reduces the entropy of the sequence.  You
want to increase the entropy, i.e., maximize the surprise.  One good
way to increase the entropy per bit is to compress the 'random'
sequence.  The output of a good compressor has greater entropy per bit
than its input.  If the input is already random, no harm is done
(again, with a GOOD compressor... otherwise it is easy to accidentally
introduce order).  If the input has some subtle bias or regularity, the
compressor will get rid of it (at the cost of reducing the total length
of the sequence).  Good compressors are much better than human beings
at detecting regularity and eliminating it.
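A quick way to see the effect (a sketch, using Python's zlib as a
stand-in for a "good" compressor; note that zlib adds a small header
and framing, which is itself a bit of order):

```python
import math
import random
import zlib
from collections import Counter

def entropy_per_byte(data: bytes) -> float:
    """Empirical Shannon entropy of the byte distribution, in bits/byte."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(1)
# A heavily biased 'random' source: each bit is 0 with probability 0.9.
bits = [0 if random.random() < 0.9 else 1 for _ in range(8 * 20000)]
# Pack the bits into bytes.
biased = bytes(
    sum(b << i for i, b in enumerate(bits[j:j + 8]))
    for j in range(0, len(bits), 8)
)

squeezed = zlib.compress(biased, 9)

print(f"biased:     {len(biased)} bytes, "
      f"{entropy_per_byte(biased):.2f} bits/byte")
print(f"compressed: {len(squeezed)} bytes, "
      f"{entropy_per_byte(squeezed):.2f} bits/byte")
```

The compressed stream is shorter, but each byte of it carries far more
surprise: the bias has been traded away for volume, as described above.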

2. Of course (as Thug stated), you are also compressing the plaintext
before you encrypt it.  It is best to do this with an adaptive scheme
and an arithmetic encoder, so that a) the entropy of the plaintext is
maximized, and b) accidentally decrypting something correctly in the
middle of the stream is useless, because an adaptive decoder's state
depends on everything that came before.
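Point b) can be illustrated with any adaptive compressor (a sketch
using Python's zlib standing in for the adaptive scheme; deflate is
adaptive in the sense that the meaning of each code depends on the
data that preceded it, and zlib's framing compounds the problem):

```python
import zlib

message = b"attack at dawn -- " * 50
packed = zlib.compress(message)

# The whole stream decodes fine from the beginning.
assert zlib.decompress(packed) == message

# But a fragment recovered from the middle of the stream is useless:
# the decoder cannot rebuild the state that the earlier data
# established (and here the stream framing is gone as well).
fragment = packed[len(packed) // 2:]
try:
    zlib.decompress(fragment)
    recovered = True
except zlib.error:
    recovered = False
print("middle fragment decodable:", recovered)
```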

My recommendation for a good binary scheme is DMC (Dynamic Markov
Compression) feeding into almost any binary arithmetic encoder (e.g.
the Q-coder, et al.).  I would use this to compress both the plaintext
stream before encryption and a 'suspect' random-number stream.
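DMC itself takes a fair amount of code.  As a toy stand-in for the
model-plus-binary-arithmetic-encoder pipeline, here is a minimal
adaptive binary arithmetic coder (a sketch, not the Q-coder: it uses a
trivial adaptive order-0 model and exact rational arithmetic, which is
easy to verify but far too slow for real use; a real coder would use
fixed-precision renormalization):

```python
from fractions import Fraction
from math import ceil
import random

def encode(bits):
    """Adaptive order-0 binary arithmetic encoder (exact rationals)."""
    low, width = Fraction(0), Fraction(1)
    c = [1, 1]                       # adaptive 0/1 counts, Laplace start
    for b in bits:
        p0 = Fraction(c[0], c[0] + c[1])
        if b == 0:
            width *= p0
        else:
            low += width * p0
            width *= 1 - p0
        c[b] += 1
    # Pick a k-bit binary fraction inside [low, low + width).
    k = width.denominator.bit_length() - width.numerator.bit_length() + 2
    n = ceil(low * 2**k)
    return [(n >> (k - 1 - i)) & 1 for i in range(k)]

def decode(code, nbits):
    """Inverse of encode(); nbits is the original message length."""
    k = len(code)
    v = Fraction(sum(b << (k - 1 - i) for i, b in enumerate(code)), 2**k)
    low, width = Fraction(0), Fraction(1)
    c = [1, 1]                       # the model re-adapts identically
    out = []
    for _ in range(nbits):
        p0 = Fraction(c[0], c[0] + c[1])
        mid = low + width * p0
        if v < mid:
            b = 0
            width *= p0
        else:
            b = 1
            low = mid
            width *= 1 - p0
        out.append(b)
        c[b] += 1
    return out

random.seed(2)
msg = [1 if random.random() < 0.05 else 0 for _ in range(200)]
code = encode(msg)
assert decode(code, len(msg)) == msg     # round-trips exactly
print(f"{len(msg)} biased bits -> {len(code)} code bits")
```

Because the model adapts as it goes, the same code bits mean nothing
without the state built up from the start of the stream, which is
exactly the property wanted in point 2b above.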

If there is interest, I will post a bibliography of papers and books
relating to this.

Scott Collins (Scott_Collins at genmagic.com)

More information about the cypherpunks-legacy mailing list