> It's one of those cases where the conventional meaning of a word
> ("disorder" in this case) ...
When dealing with entropy as defined by Shannon, it is best to forget
thermodynamic entropy entirely:
"The measure of amount of information is called entropy. If we want to
understand this entropy of communication theory, it is best first to clear
our minds of any ideas associated with the entropy of physics. Once we
understand entropy as it is used in communication theory thoroughly, there
is no harm in trying to relate it to the entropy of physics, but the
literature indicates that some workers have never recovered from the
confusion engendered by an early admixture of ideas concerning the entropies
of physics and communication theory."
-- An Introduction to Information Theory by John R. Pierce
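
For concreteness, the entropy Pierce is talking about is just Shannon's
H = -sum(p_i * log2(p_i)) over a discrete probability distribution, measured
in bits. A minimal sketch in Python (the function name and the example
distributions are mine, not Pierce's):

import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469

No temperature, no molecules -- just probabilities of messages, which is the
point of keeping the two notions apart.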
/Roger