Interesting thread going on at
dsfjdssdfsd@ietf.org. Forwarded for our
collective interest and amusement.
---------- Forwarded message ----------
Date: Thu, 23 Jan 2014 23:38:07 +0100
From: Krisztián Pintér <pinterkr@gmail.com>
To: Michael Hammer <michael.hammer@yaanatech.com>
Cc: "dsfjdssdfsd@ietf.org" <dsfjdssdfsd@ietf.org>, "ietf@hosed.org" <ietf@hosed.org>
Subject: Re: [dsfjdssdfsd] Any plans for drafts or discussions on here?
Michael Hammer (at Thursday, January 23, 2014, 9:49:32 PM):
>> This may get off-topic, but are there good software tools for testing
>> entropy that could help applications determine if the underlying
>> system is giving them good input?
>Disclaimer: I'm no expert, this is just what I have gathered. (I'm quite
>interested in randomness.)
>Short answer: no.
>Long answer: in some situations, yes. If you are handed a bunch of
>data, all you can do is try different techniques to put an upper
>limit on the entropy. For example, you can calculate the Shannon
>entropy assuming independent bits. Then you can hypothesize some
>interdependence, and see if you can compress the data. You can apply
>different lossless compression methods. The best compression you
>find puts an upper limit on the entropy, but never a lower limit.
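Both upper-bound techniques mentioned above can be sketched in a few lines. This is a minimal illustration using only the Python standard library; zlib stands in for "different lossless compression methods", and the function names are my own:

```python
import math
import os
import zlib

def shannon_entropy_bits(data: bytes) -> float:
    """Per-byte Shannon entropy, assuming independent bytes.
    This is an upper-bound estimate, not a measure of true entropy."""
    n = len(data)
    counts = {}
    for b in data:
        counts[b] = counts.get(b, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_bound_bits(data: bytes) -> float:
    """Bits per input byte after zlib compression: another upper bound.
    Better compression lowers the bound further, but incompressible
    data never establishes a lower limit."""
    return 8 * len(zlib.compress(data, 9)) / len(data)

patterned = b"abab" * 4096          # highly structured input
random_bytes = os.urandom(16384)    # OS-provided randomness

print(shannon_entropy_bits(patterned))      # low: only two symbols appear
print(compression_bound_bits(patterned))    # far lower still
print(shannon_entropy_bits(random_bytes))   # close to 8 bits/byte
print(compression_bound_bits(random_bytes)) # around 8: no structure found
```

Note the asymmetry: the structured input is exposed immediately, while a near-8 result for the OS bytes only means that these particular tests failed to find structure.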
Consider this: suppose I handed you the digits of pi from the millionth digit to the two-millionth, and asked you to determine whether they are 'random'. By many tests, you'd conclude that they are random, or at least 'normal':
http://en.wikipedia.org/wiki/Normal_numbers
But in reality they are highly non-random, precisely because they are a million sequential digits of pi. You wouldn't know that, though, if you didn't know that.
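The point is easy to reproduce. The sketch below (my own illustration, not from the thread) computes digits of pi with the stdlib `decimal` module via Machin's formula; their frequencies pass a naive uniformity check even though the sequence is completely deterministic. The function name and the 1000-digit sample size are arbitrary choices:

```python
from collections import Counter
from decimal import Decimal, getcontext

def pi_digits(n: int) -> str:
    """First n decimal digits of pi after the '3.', via Machin's
    formula: pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    getcontext().prec = n + 10  # working precision with guard digits

    def arctan_inv(x: int) -> Decimal:
        # Taylor series: arctan(1/x) = sum (-1)^k / ((2k+1) * x^(2k+1))
        eps = Decimal(10) ** -(n + 10)
        power = Decimal(1) / x
        total = Decimal(0)
        k = 0
        while power > eps:
            term = power / (2 * k + 1)
            total += -term if k % 2 else term
            power /= x * x
            k += 1
        return total

    pi = 16 * arctan_inv(5) - 4 * arctan_inv(239)
    return str(pi)[2:2 + n]  # drop the leading "3."

digits = pi_digits(1000)
# Each digit 0-9 appears roughly 100 times out of 1000: it looks
# uniform, yet every digit is completely determined.
print(Counter(digits))
```

A frequency count is of course a much weaker test than a real statistical battery, but the stronger tests reach the same (wrong) conclusion, which is exactly the point.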
Jim Bell