Re: Entropy vs Random Bits
At 11:51 PM 9/20/95, David Van Wie wrote:
> This is odd. The term entropy describes an aspect of thermodynamic
> equilibrium in physical systems. Although sometimes used as a
> synonym for "random," that definition is vernacular, not technical.
> In fact, there is no meaningful relationship between "entropy" and
> random data of the type described in the postings related to seed
> values. In the presence of a perfectly suitable and precise
> mathematical term (i.e. random),
Your use of the word random is incorrect: the throw of a die is random, but contains only about 2.6 bits of entropy. The Windows VDT counter is very far from random, but contains roughly sixteen bits of entropy.
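A quick check of those figures (a minimal sketch in Python; the sixteen-bit figure assumes a counter roughly uniform over 2^16 states, which is an illustrative assumption, not a measurement):

    import math

    # A fair die has six equally likely faces, so one throw carries
    # lg(6) bits of entropy -- about 2.585, the "2.6 bits" above.
    print(math.log2(6))      # 2.584962500721156

    # A counter spread uniformly over 2**16 states would carry exactly
    # 16 bits; "roughly sixteen" assumes something near that, not a
    # measurement of the actual VDT counter.
    print(math.log2(2**16))  # 16.0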
> why invent new terms? Why use them to mean at least two different
> things?
This is an old term of the art, a term of information theory: we use the same word because entropy in information theory has the same measure as entropy in thermodynamics. In both cases the entropy of an ensemble of possible states, measured in bits, is the sum of -P(i) * lg[P(i)] over all the possible states i.

 ---------------------------------------------------------------------
 We have the right to defend ourselves | http://www.jim.com/jamesd/
 and our property, because of the kind |
 of animals that we are.  True law     | James A. Donald
 derives from this right, not from the |
 arbitrary power of the state.         | jamesd@echeque.com
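The sum above as a minimal sketch in Python (the two distributions are chosen only for illustration):

    import math

    def entropy_bits(probabilities):
        # Shannon entropy in bits: H = -sum over i of P(i) * lg[P(i)].
        # Zero-probability states contribute nothing to the sum.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy_bits([1.0 / 6] * 6))  # fair die: ~2.585 bits
    print(entropy_bits([0.9, 0.1]))     # biased coin: ~0.469 bits, versus
                                        # the full 1 bit of a fair coin

A heavily skewed distribution drives the sum toward zero, which is why "far from random" and "contains entropy" are not contradictory.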