Free Republic
News/Activism

To: PatrickHenry; Doctor Stochastic; betty boop; tortoise; StJacques; marron; cornelis
Concerning using the phrase "negative entropy" instead of information, y'all might wish to consider Schneider's objection to the term:

A Glossary for Molecular Information Theory

negentropy: Poor Terminology! The term negentropy was defined by Brillouin (L. Brillouin, Science and Information Theory, second edition, Academic Press, Inc., New York, 1962, page 116) as `negative entropy', N = -S. Supposedly living creatures feed on `negentropy' from the sun. However, it is impossible for entropy to be negative, so `negentropy' is always a negative quantity. The easiest way to see this is to consider the statistical-mechanics (Boltzmann) form of the entropy equation:


S = -k_B Σ_{i=1..Ω} P_i ln(P_i)

where k_B is Boltzmann's constant, Ω is the number of microstates of the system, and P_i is the probability of microstate i. Unless one wishes to consider imaginary probabilities (!) it can be proven that S is positive or zero. Rather than saying `negentropy' or `negative entropy', it is clearer to note that when a system dissipates energy to its surroundings, its entropy decreases. So it is better to refer to -ΔS (a negative change in entropy).

Recommendation: replace this concept with `decrease in entropy'.

Examples:

In "Maxwell's demon: Slamming the door" (Nature 417: 903) John Maddox says "Maxwell's demon ... must be a device for creating negative entropy". The Demon is required to create decreases in entropy, not the impossible `negentropy'. (Note: On 2002 July 6 Nature rejected a correspondence letter to point out this error.)

I also visited the Wikipedia definition of Information Entropy and found it weak in the same area that evidently disturbs Schneider. Namely, in physical systems we have thermodynamic entropy, and in information theory we have information entropy; but in "information theory in molecular biology" there is a connection between the two which can lead to confusion: where there is a decrease of entropy (a gain of information content) in a molecular machine, there is a corresponding gain of thermodynamic entropy (dissipation of energy) in the local surroundings.
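To put a number on that bookkeeping, the standard quantitative link is the Landauer/Szilard bound of k_B T ln(2) joules of heat dissipated per bit of information gained. A minimal sketch, with a hypothetical machine and made-up numbers:

    import math

    K_B = 1.380649e-23  # Boltzmann's constant, J/K

    def min_heat_dissipated(bits_gained, temperature_k):
        """Landauer bound: minimum heat (J) dumped to the surroundings."""
        return bits_gained * K_B * temperature_k * math.log(2)

    # Hypothetical molecular machine gaining 4 bits per operation at 310 K.
    q = min_heat_dissipated(4, 310.0)
    print(q)            # ~1.19e-20 J of heat to the local surroundings
    print(q / 310.0)    # the matching thermodynamic entropy gain, in J/K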

In his webpage on uncertainty, entropy and information, Schneider clears up any confusion between information entropy and thermodynamic entropy as they apply to molecular biology.

Would y'all entertain the alternative phrasing Schneider suggests, i.e., "decrease in entropy", to mean a gain of information content in molecular biology?

447 posted on 01/04/2005 9:21:48 AM PST by Alamo-Girl (Please donate monthly to Free Republic!)


To: PatrickHenry; Doctor Stochastic; betty boop; tortoise; StJacques; marron; cornelis
What would y'all think about fabricating some words of our own to meet our purposes for this discussion? For instance,

Biocomm – Shannon information: successful communication, the reduction of uncertainty, as applied to molecular biology.

Biocast – The channel capacity and noise [Shannon] available to a biological system to facilitate a gain of information content, i.e., the broadcast or incoming message.

Biolearn – A gain of information content in a molecular machine, measured in Shannon bits: a decrease in information entropy, with a corresponding increase in thermodynamic entropy in the local surroundings (see the sketch after this list).

Bioknow – The accrued information content in a biological organism, e.g., DNA.

Biolanguage – The encoding/decoding in Biocomm, i.e., semiosis.

Biothought – The property of complexity (Kolmogorov, self-organizing, physical, irreducible, functional, specified – or whatever we decide) which exists in the Biocomm.

Biostructure – The biological system in which Biocomm exists, i.e., autonomy.
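To illustrate Biolearn concretely, here is a toy sketch in the spirit of Schneider's Rsequence measure: at each aligned position of a set of binding sites, the uncertainty falls from the 2-bit prior (four equiprobable bases) to the observed entropy H, and the per-position gains sum to the information content of the site. The alignment is invented and Schneider's small-sample correction is omitted:

    import math
    from collections import Counter

    def position_info(column):
        """Bits gained at one aligned position: 2 - H(column)."""
        n = len(column)
        h = -sum((c / n) * math.log2(c / n) for c in Counter(column).values())
        return 2.0 - h  # 2 bits = prior uncertainty over A, C, G, T

    # Hypothetical aligned binding sites (made up for illustration).
    sites = ["TATAAT", "TATGAT", "TACAAT", "TATAAT"]
    total = sum(position_info(col) for col in zip(*sites))
    print(round(total, 2), "bits of information content across the site")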


448 posted on 01/04/2005 10:01:04 AM PST by Alamo-Girl (Please donate monthly to Free Republic!)

To: Alamo-Girl
Namely, in physical systems we have thermodynamic entropy, and in information theory we have information entropy; but in "information theory in molecular biology" there is a connection between the two which can lead to confusion: where there is a decrease of entropy (a gain of information content) in a molecular machine, there is a corresponding gain of thermodynamic entropy (dissipation of energy) in the local surroundings.

All these definitions of entropy are identical; they are just distilled rules from an obscure generalization applied at different levels of the system. The transaction-theoretic versions (e.g., as would apply to thermodynamics) are pretty esoteric. Algorithmic information theory unifies all the various notions of "entropy" into a single concept that is probably far more confusing than either the static or transactional version in isolation (though very elegant in its own way).
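Kolmogorov complexity itself is uncomputable, but compressed length gives a rough upper bound, which is enough to see the flavor of the unification: a highly ordered string has low algorithmic entropy and squeezes down to almost nothing, while a random string is essentially incompressible. A toy sketch, not a serious estimator:

    import os
    import zlib

    def algorithmic_entropy_bound(data):
        """Compressed length in bytes: a crude upper bound on K(data)."""
        return len(zlib.compress(data, 9))

    ordered = b"ACGT" * 1000   # highly regular: low algorithmic entropy
    noise = os.urandom(4000)   # random: essentially incompressible
    print(algorithmic_entropy_bound(ordered))  # a few dozen bytes
    print(algorithmic_entropy_bound(noise))    # close to 4000 bytes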

One can see, as in your example, that they are all related. The conceptually hard part is figuring out the general mathematical description of a system that will express something that looks like thermodynamics; ask yourself why a system would express this behavior in the hypothetical. The answer to that question has a good-sized lightbulb attached to it.

456 posted on 01/04/2005 1:23:55 PM PST by tortoise (All these moments lost in time, like tears in the rain.)
