Posted on 11/30/2004 6:21:11 PM PST by betty boop
Great post on the perils of using a loaded term.
When Shannon coined the terms information, bit, and entropy back in 1948, he defined them very clearly and very mathematically. One cannot claim to be speaking to his model while slipping in some common-usage definition of the terms and still hope to assert a credible argument.
The word complexity is even more explosive, because there are hard, mathematically precise definitions of various types of complexity. We can't throw the term around here and hope to communicate if we don't first stop and define what we mean.
If I had my way, I wouldn't use "information," "complexity" or any other word that has an alternate meaning in English. The possibility of an inadvertent slip into some less precise meaning, either on the part of the writer or the reader, is just too great. Perhaps we could, for our own discussions, consider adopting a symbol that has no other meaning.
Instead of the loaded word "information," just use "negative entropy" (with apologies to von Neumann and Shannon).
"Negative entropy" seems fine for the purpose. It has a clear meaning which is relevant to the phenomenon under consideration, and it's doubtful that anyone will read more into the expression than is intended.
Some people do like to sew a coat onto a button.
Negative entropy (informational, not thermodynamic) is just a measure of how many messages can be carried by a system. "Information" (in the sense of Prisoner Number Six) is arbitrary. Messages may have any "meaning," although the number of these meanings is limited by the entropy.
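The point that entropy bounds the number of messages can be made concrete: a source with Shannon entropy H bits can distinguish at most 2^H equally likely messages. A minimal sketch (the distributions here are just illustrative choices):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2 p), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform source over 8 symbols carries log2(8) = 3 bits per symbol,
# i.e. it can distinguish 8 equally likely messages.
uniform = [1/8] * 8
print(shannon_entropy(uniform))  # → 3.0

# A skewed source over 5 symbols carries less than log2(5) bits:
# fewer distinguishable messages on average than the uniform case.
skewed = [0.7, 0.1, 0.1, 0.05, 0.05]
print(shannon_entropy(skewed) < math.log2(5))  # → True
```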
negentropy: Poor Terminology! The term negentropy was defined by Brillouin (L. Brillouin, Science and Information Theory, 2nd ed., Academic Press, Inc., New York, 1962, p. 116) as `negative entropy', N = -S. Supposedly living creatures feed on `negentropy' from the sun. However, it is impossible for entropy to be negative, so `negentropy' is always a negative quantity. The easiest way to see this is to consider the statistical-mechanics (Boltzmann) form of the entropy equation:

S = -kb SUM_i Pi ln(Pi)

where kb is Boltzmann's constant, Omega is the number of microstates of the system (the sum runs over i = 1 to Omega), and Pi is the probability of microstate i. Unless one wishes to consider imaginary probabilities (!), it can be proven that S is positive or zero. Rather than saying `negentropy' or `negative entropy', it is clearer to note that when a system dissipates energy to its surroundings, its entropy decreases. So it is better to refer to -delta S (a negative change in entropy).
Recommendation: replace this concept with `decrease in entropy'.
Examples:
In "Maxwell's demon: Slamming the door" (Nature 417: 903) John Maddox says "Maxwell's demon ... must be a device for creating negative entropy". The Demon is required to create decreases in entropy, not the impossible `negentropy'. (Note: On 2002 July 6 Nature rejected a correspondence letter to point out this error.)
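The claim above that S is positive or zero is easy to check numerically: each term -Pi ln(Pi) is nonnegative for Pi in (0, 1]. A minimal sketch, with kb set to 1 for convenience (the random distributions are just illustrative):

```python
import math
import random

def gibbs_entropy(probs, k=1.0):
    """S = -k * sum(p_i * ln p_i); each term is >= 0 since ln p <= 0 for p in (0, 1]."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

random.seed(0)
for _ in range(1000):
    # Build a random probability distribution over 10 microstates.
    weights = [random.random() for _ in range(10)]
    total = sum(weights)
    probs = [w / total for w in weights]
    assert gibbs_entropy(probs) >= 0  # never negative for real probabilities
print("S >= 0 held for all 1000 random distributions")
```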
In his webpage on uncertainty, entropy and information Schneider clears up any confusion between information entropy and thermodynamic entropy as they apply to molecular biology.
Would y'all entertain the alternative phrasing Schneider suggests, i.e., "decrease in entropy," to mean gain of information content in molecular biology?
Biocast: Channel capacity and noise [Shannon] available to a biological system to facilitate a gain of information content, i.e. the broadcast or incoming message.
Biolearn: Gain of information content in a molecular machine, Shannon bits; decrease in information entropy, increase in thermodynamic entropy in local surroundings.
Bioknow: Accrued information content in a biological organism, e.g. DNA.
Biolanguage: The encoding/decoding in Biocomm, i.e. semiosis.
Biothought: The property of complexity (Kolmogorov, self-organizing, physical, irreducible, functional, specified, or whatever we decide) which exists in the Biocomm.
Biostructure: The biological system in which Biocomm exists, i.e. autonomy.
The term, however, only applies to the receiver in the communication and the Shannon-Weaver model includes other elements: source, encoder, message, channel and decoder.
More specifically, in the Shannon-Weaver model the term information puts the emphasis on the first word: decrease in entropy. IOW, Shannon information is an action, not a condition. The condition (the words taken together: decreased entropy) is more akin to the "information content" gained by the receiver, which is also called the decoded message.
The message which is sent (from source via encoding in Shannon) is "information content" which (in molecular biology) was previously gained by a decrease in entropy and retained (i.e. DNA).
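For reference, the five elements of the model named above can be labeled as stages in a toy pipeline. Everything here (the message text, the trivial ASCII encoding, the noiseless channel) is an illustrative assumption, not anything from Shannon's paper:

```python
# Toy labeling of the Shannon-Weaver elements: source -> encoder -> channel -> decoder -> receiver.

def source():
    return "ATTACK AT DAWN"        # the message originated by the source

def encoder(message):
    return message.encode("ascii")  # message -> signal (bytes on the wire)

def channel(signal):
    return signal                   # noiseless channel for simplicity

def decoder(signal):
    return signal.decode("ascii")   # signal -> received message

# The receiver's gained "information content" is the decoded message.
received = decoder(channel(encoder(source())))
print(received)  # → ATTACK AT DAWN
```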
So - although I'm all for less confusing or potentially misleading terminology - the one phrase doesn't fill the entire glossary we need to communicate. We still need something for "message", etc.
"Message capacity" is descriptive and more accurately captures the concept. It's a bit bulky, though, and "Messcap" sounds like an article of clothing.
Think about the term "exchange" (which is used in physics), or maybe "transfer," instead of communication. Think about "status" or "condition" instead of information content. I dunno. The whole field reeks of sloppy, and thus potentially misleading, terminology. Gives me a brain-ache.
Another possibility is to just use "Anzeigekapazität" or "Anzkap" (keeping with the German tendency to abbreviate) rather than "information." The French could put it on the front of a camera as l'Anzkap.
DNA should not be treated as a "message" in a molecular system without making some assumptions that I do not think apply here, nor is it an element that can be legitimately treated as independent of the rest of the molecular machinery. The "message" that materializes is completely dependent on the context of the rest of the system/machinery, and the lack of functional independence for practical matters in real molecular systems makes the application of Shannon-Weaver doubtful. A simple example of this is that there are single machine code sequences for computers that are valid (and different) programs on wildly different machine architectures. The entire system determines the algorithm, not just some inextricable element that is arbitrarily determined to be "the program" or message.
Because of this, DNA only preserves state, not the algorithmic information of the system. Hypothetical transplantation of the DNA to another cellular system does NOT preserve the algorithmic information of the previous cellular system. Now, sufficiently similar systems may tend to converge on similar states (in biology, ones that don't will die), but that is really an accident of the configuration space and not at all required. My DNA is a bit of state that will usually converge on something stable in cells that are configured like mine, but it does not carry a message; there are plenty of contexts where my exact DNA will produce stable systems that are nothing remotely like me. Under the proper cellular contexts, my DNA would produce a living organism that looks like a slime mold. But for a lot of practical reasons, nature tends to put my DNA in system contexts that look similar to mine and so there are no slime molds with my DNA running around in the wild (that I know of anyway).
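The machine-code analogy can be sketched directly: the same byte sequence yields entirely different results depending on which "system" interprets it, so the bytes alone do not fix the meaning. A toy illustration (both interpreters are invented for the example):

```python
# The same raw bytes, handed to two different "systems".
data = bytes([72, 105, 33, 0])

# System A's context: the bytes are null-terminated ASCII text.
def system_a(b):
    return b.rstrip(b"\x00").decode("ascii")

# System B's context: the same bytes are a little-endian 32-bit integer.
def system_b(b):
    return int.from_bytes(b, "little")

print(system_a(data))  # → Hi!
print(system_b(data))  # → 2189640
```

The "message" is not in `data`; it materializes only in the context of the interpreting system, which is the point being made about DNA above.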
All these definitions of entropy are identical, they are just distilled rules from an obscure generalization applied at different levels of the system. The transaction theoretic versions (e.g. as would apply to thermodynamics) are pretty esoteric. Algorithmic information theory unifies all the various notions of "entropy" into a single concept that is probably far more confusing than either the static or transactional version in isolation (though very elegant in its own way).
One can see, as in your example, that they are all related. The conceptually hard part is figuring out what the general mathematical description is of a system that will express something that looks like thermodynamics -- ask yourself why a system would express this behavior in the hypothetical. The answer to that question has a good-sized lightbulb attached to it.
I just wanted to thank you, M-PI, for your post. Karol Josef Wojtyla -- His Holiness John Paul II -- is probably one of the greatest intellectuals ever to sit on the Throne of St. Peter. And with this statement he shrewdly puts his finger on an ineluctable truth: A man will see what his "philosophy" allows him to see. As I argue without ceasing, "world-view" drives everything else, epistemologically speaking. Why, people have even been known to construct theories, and design experiments, with a view to validating their own worldview, thus to falsify the other man's, rather than just following the clues one finds along the trail and tracking them to wherever they lead. Thus the Pope is right to speak of "several theories of evolution," depending on the essential motivation of such conceptualizations, be it "materialist, reductionist, spiritualist," or some other basis.
And this is a living truth: "Conflicts between the truths of science and the truths of faith ... are only apparent, never real, for both science and faith, the natural world accessible to reason, and the 'world' of revelation accessible to faith, have the same author: God."
Thank you so much, Matchett-PI, for putting this "on the record" of this thread. I'm grateful to you.
Dear Alamo-Girl, thank you so much for this invaluable clarification. And also for earlier pointing out that "negative entropy" is a no-go term. Apparently, entropy is always positive, in that its "natural habit," so to speak, is to increase and spread. To say it can have a "negative" value would seem to take us clear out of the second law of thermodynamics. And I think PatrickHenry is right to point out that thermodynamic entropy most closely bears on our problem of trying to figure out the nature and source of biological "information."
Also thank you so very much for your elegant exposition of Kolmogorov complexity. That seems to be the path that beckons here. (IMHO FWIW). I was particularly intrigued by your suggestion that memory has a function within this scheme.
I've been sort of lurking about lately, not piping up too much. (Plus there's been some scuffling about; hope we've gotten over it.) But I've been googling away, and have come across an interesting hypothesis regarding the role of entropy in living systems (no, not AG's, somebody else's; I don't think I'm at liberty to discuss what AG is up to these days).
Anyhoot, if the relevant point comes up, I'll be sure to weigh in. Just a sketch: the universe evolves as "a population of one"; what we typically think of as "Darwinist theory" is helpless before this postulate. Nothing within the Darwinist purview can come remotely close to explaining such a conception.
There's more, of course; but I don't want to get ahead of the discussion -- so ably, graciously, and capably conducted, in your kind and astoundingly knowledgeable hands.
But I must somewhat guiltily confess there's been another reason I haven't been around much lately: I got three of Patrick O'Brian's Jack Aubrey novels for Christmas. I am completely, totally enchanted. Read two -- Master and Commander and Post Captain -- over the long weekend, and am about halfway through the third, H.M.S. Surprise. O'Brian is a magnificent prose artist telling an historical tale about well-documented British naval actions of the Napoleonic Wars. But his greatest glory is his ability to evoke flesh-and-blood, vividly living, loving (or hating, or indifferent) characters: his Jack Aubrey and Stephen Maturin evoke the classical model of the deep male-bonded friendship of Achilles and Patroclus. O'Brian's work is simply brilliant, astonishing, lapidary, coruscating!!! Needless to say: I am so sucked in!!!
[Seventeen more to go!!! :^) I can't wait!!!]
God bless you, dear sister -- and all the rest of us reading these lines, too.
I agree. In his book entitled "Faith and Reason - Searching for a Rational Faith", Ronald H. Nash hits on that subject big time! Lots of Alvin Plantinga quotes.
"Thank you so much, Matchett-PI, for putting this "on the record" of this thread. I'm grateful to you."
You are most welcome!
I'll have to check out the Nash. Thanks so much, Matchett-PI!