[dsfjdssdfsd] Any plans for drafts or discussions on here? (fwd)

J.A. Terranson measl@mfn.org
Thu Jan 23 23:49:52 PST 2014


Interesting thread going on at dsfjdssdfsd@ietf.org. Forwarded for our 
collective interest and amusement.

//Alif

-- 
Those who make peaceful change impossible,
make violent revolution inevitable.

An American Spring is coming:
   one way or another.




---------- Forwarded message ----------
Date: Thu, 23 Jan 2014 23:38:07 +0100
From: Krisztián Pintér <pinterkr@gmail.com>
To: Michael Hammer <michael.hammer@yaanatech.com>
Cc: "dsfjdssdfsd@ietf.org" <dsfjdssdfsd@ietf.org>,
    "ietf@hosed.org" <ietf@hosed.org>
Subject: Re: [dsfjdssdfsd] Any plans for drafts or discussions on here?


Michael Hammer (at Thursday, January 23, 2014, 9:49:32 PM):
> This may get off-topic, but are there good software tools for testing
> entropy, 
> that could help applications determine if the underlying system is giving
> them good input?

disclaimer: i'm no expert, this is just what i've gathered. (i am,
however, quite interested in randomness.)

short answer: no

long answer: in some situations, yes. if you are handed a bunch of
data, all you can do is try different techniques to put an upper
limit on the entropy. for example, you can calculate the shannon
entropy assuming independent bits. then you can hypothesize some
interdependence and see whether you can compress the data, trying
different lossless compression methods. the best compression you
find puts an upper limit on the entropy, but never a lower limit.
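as a rough illustration of that (my own sketch, not from the original
mail; the function names and the choice of python's zlib as the
compressor are just one possible way to do it):

# two crude upper bounds on the entropy of a byte string:
# per-byte shannon entropy assuming i.i.d. bytes, and the size a
# lossless compressor achieves. both are only upper limits.
import math
import zlib
from collections import Counter

def shannon_upper_bound(data: bytes) -> float:
    """Shannon entropy in bits per byte, assuming bytes are independent."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_upper_bound(data: bytes) -> float:
    """Upper bound in bits per byte from zlib's compressed size."""
    compressed = zlib.compress(data, 9)
    return 8 * len(compressed) / len(data)

if __name__ == "__main__":
    sample = bytes(range(256)) * 64           # highly structured data
    print(shannon_upper_bound(sample))        # ~8.0: the i.i.d. model sees no structure
    print(compression_upper_bound(sample))    # far lower: the compressor found the pattern

note how the two bounds disagree on structured input: the weaker model
(independent bytes) reports full entropy, the compressor does not. and
even the compressor only ever gives you an upper limit.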

you can only do better if you have an idea about the process that
created the data. for example, you might assume that it is mostly
thermal noise, and that thermal noise has some known frequency
distribution, energy spectrum, and so on. within this assumption, you
can determine the entropy content by measurement. but at this point
you are exposed to two kinds of error: 1, your assumption might be
wrong, and 2, your physical model might overestimate the
unpredictability of the given system. example of the former: the
signal might be largely controllable by external EM interference, so
you end up measuring not noise but attacker-controlled data. example
of the latter: a smartass scientist might come up with a better
physical model for thermal noise, making it more predictable than
your model assumed.
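a minimal sketch of the model-based approach (the gaussian-noise model,
the ADC step size and the numbers are my assumptions, not from the
mail): if you model the source as zero-mean gaussian thermal noise with
a measured standard deviation, read through an ADC with a known step
size, you can compute how much entropy each sample carries *under that
model*. if the model is wrong, so is the number.

import math

def gaussian_bin_prob(lo, hi, sigma):
    """Probability that a zero-mean gaussian sample falls in [lo, hi)."""
    phi = lambda x: 0.5 * (1 + math.erf(x / (sigma * math.sqrt(2))))
    return phi(hi) - phi(lo)

def model_entropy_per_sample(sigma, step, n_bins=4096):
    """Shannon and min-entropy (bits/sample) of gaussian noise
    quantized with the given ADC step, under the assumed model."""
    probs = []
    for i in range(-n_bins // 2, n_bins // 2):
        p = gaussian_bin_prob(i * step, (i + 1) * step, sigma)
        if p > 0:
            probs.append(p)
    shannon = -sum(p * math.log2(p) for p in probs)
    min_entropy = -math.log2(max(probs))
    return shannon, min_entropy

# e.g. measured noise sigma of 3 ADC steps:
print(model_entropy_per_sample(sigma=3.0, step=1.0))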

it is also important to note that entropy is observer dependent: what
we actually care about is the entropy as seen by the attacker. but it
is not straightforward to assess what is and is not visible to an
attacker, and observation methods improve over time.
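to make the observer-dependence concrete (the scenario is mine, not
from the mail): a uniformly random byte carries 8 bits of min-entropy
for an attacker who sees nothing, but only 4 bits for one who can
somehow observe its high nibble, say through a side channel.

import math

blind_attacker = math.log2(256)    # 256 equally likely values -> 8.0 bits
nibble_observer = math.log2(16)    # 16 values remain possible  -> 4.0 bits
print(blind_attacker, nibble_observer)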

_______________________________________________
dsfjdssdfsd mailing list
dsfjdssdfsd@ietf.org
https://www.ietf.org/mailman/listinfo/dsfjdssdfsd


