Free Republic
Browse · Search
News/Activism
Topics · Post Article


On Plato, the Early Church, and Modern Science: An Eclectic Meditation
November 30, 2004 | Jean F. Drew

Posted on 11/30/2004 6:21:11 PM PST by betty boop



To: Alamo-Girl

Great post on the perils of using a loaded term.


441 posted on 01/03/2005 11:54:45 AM PST by PatrickHenry (The List-O-Links for evolution threads is at my freeper homepage.)

To: PatrickHenry; StJacques; betty boop; tortoise; Doctor Stochastic; marron; cornelis
Thank you for your reply!

Great post on the perils of using a loaded term.

LOLOL! That is why getting our terms defined first is the most important step.

When Shannon defined the terms information, bit, and entropy back in 1948, he defined them very clearly and very mathematically. One cannot be speaking to his model, slip in some common-usage definition of the terms, and hope to assert a credible argument.

The word complexity is even more explosive because there are hard, mathematically precise definitions of various types of complexity. We can't be throwing the term around here and hope to be communicating if we don't first stop and define what we mean.
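For the record, Shannon's 1948 definition really is purely mathematical. A minimal Python sketch of the entropy of a discrete distribution, measured in bits (the function name is mine, not Shannon's notation):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits (Shannon, 1948).
    Terms with p == 0 contribute nothing, by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty per toss:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less:
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

This is exactly the kind of definition that leaves no room for a common-usage meaning to slip in: the quantity is fixed by the probabilities alone.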

442 posted on 01/03/2005 8:58:42 PM PST by Alamo-Girl

To: Alamo-Girl; StJacques; betty boop; tortoise; Doctor Stochastic
The word complexity is even more explosive ...

If I had my way, I wouldn't use "information," "complexity" or any other word that has an alternate meaning in English. The possibility of an inadvertent slip into some less precise meaning, either on the part of the writer or the reader, is just too great. Perhaps we could, for our own discussions, consider adopting a symbol that has no other meaning.

443 posted on 01/04/2005 6:39:22 AM PST by PatrickHenry (The List-O-Links for evolution threads is at my freeper homepage.)

To: PatrickHenry

Instead of the loaded word "information," just use "negative entropy" (with apologies to von Neumann and Shannon).


444 posted on 01/04/2005 6:40:59 AM PST by Doctor Stochastic (Vegetabilisch = chaotisch is der Charakter der Modernen. - Friedrich Schlegel)

To: Doctor Stochastic

"Negative entropy" seems fine for the purpose. It has a clear meaning which is relevant to the phenomenon under consideration, and it's doubtful that anyone will read more into the expression than is intended.


445 posted on 01/04/2005 6:48:52 AM PST by PatrickHenry (The List-O-Links for evolution threads is at my freeper homepage.)

To: PatrickHenry
...it's doubtful that anyone will read more into the expression than is intended.

Some people do like to sew a coat onto a button.

Negative entropy (informational, not thermodynamic) is just a measure of how many messages can be carried by a system. "Information" (in the sense of Prisoner Number Six) is arbitrary. Messages may have any "meaning," although the number of these meanings is limited by the entropy.

446 posted on 01/04/2005 7:28:52 AM PST by Doctor Stochastic (Vegetabilisch = chaotisch is der Charakter der Modernen. - Friedrich Schlegel)

To: PatrickHenry; Doctor Stochastic; betty boop; tortoise; StJacques; marron; cornelis
Concerning using the phrase "negative entropy" instead of information, y'all might wish to consider Schneider's objection to the term:

A Glossary for Molecular Information Theory

negentropy: Poor Terminology! The term negentropy was defined by Brillouin (L. Brillouin, Science and Information Theory, second edition, Academic Press, Inc., New York, 1962, page 116) as `negative entropy', N = -S. Supposedly living creatures feed on `negentropy' from the sun. However it is impossible for entropy to be negative, so `negentropy' is always a negative quantity. The easiest way to see this is to consider the statistical-mechanics (Boltzmann) form of the entropy equation:


S = -k_B * sum over i from 1 to Ω of ( P_i ln P_i )

where k_B is Boltzmann's constant, Ω is the number of microstates of the system, and P_i is the probability of microstate i. Unless one wishes to consider imaginary probabilities (!) it can be proven that S is positive or zero. Rather than saying `negentropy' or `negative entropy', it is more clear to note that when a system dissipates energy to its surroundings, its entropy decreases. So it is better to refer to -ΔS (a negative change in entropy).

Recommendation: replace this concept with 'decrease in entropy'.

Examples:

In "Maxwell's demon: Slamming the door" (Nature 417: 903) John Maddox says "Maxwell's demon ... must be a device for creating negative entropy". The Demon is required to create decreases in entropy, not the impossible `negentropy'. (Note: On 2002 July 6 Nature rejected a correspondence letter to point out this error.)
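Schneider's claim that S can never go negative is easy to check numerically. A small Python sketch of the Gibbs/Boltzmann form (my own illustration, not Schneider's code; k is left at 1 so the check is unit-free):

```python
import math
import random

def entropy(probs, k=1.0):
    """Gibbs/Boltzmann form: S = -k * sum(P_i * ln(P_i)).
    Each term -p*ln(p) is >= 0 for 0 < p <= 1, so S >= 0 always."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Try many random distributions over 6 microstates: S never goes
# negative, matching Schneider's point that "negentropy" cannot
# exist as a state of a system.
random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in range(6)]
    total = sum(w)
    assert entropy([x / total for x in w]) >= 0.0
print("S stayed non-negative for all 1000 trials")
```

Only the *change* -ΔS can be negative, which is exactly why "decrease in entropy" is the cleaner phrase.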

I also visited the Wikipedia definition of Information Entropy and found it weak in the same area that evidently disturbs Schneider. Namely, in physical systems we have thermodynamic entropy and in information theory we have information entropy, but in "information theory in molecular biology" there is a connection between the two which can lead to confusion: where there is a decrease of entropy (a gain of information content) in a molecular machine, there is a corresponding gain of thermodynamic entropy (dissipation of energy) in the local surroundings.

In his webpage on uncertainty, entropy and information Schneider clears up any confusion between information entropy and thermodynamic entropy as they apply to molecular biology.

Would y'all entertain the alternative phrasing Schneider suggests, i.e "decrease in entropy" to mean gain of information content in molecular biology?

447 posted on 01/04/2005 9:21:48 AM PST by Alamo-Girl (Please donate monthly to Free Republic!)

To: PatrickHenry; Doctor Stochastic; betty boop; tortoise; StJacques; marron; cornelis
What would y'all think about fabricating some words of our own to meet our purposes for this discussion? For instance,

Biocomm – Shannon information, successful communication, reduction of uncertainty as applied to molecular biology

Biocast – Channel capacity and noise [Shannon] available to a biological system to facilitate a gain of information content, i.e. the broadcast or incoming message.

Biolearn – Gain of information content in a molecular machine, Shannon bits, decrease in information entropy, increase in thermodynamic entropy in local surroundings.

Bioknow – Accrued information content in a biological organism, e.g. DNA

Biolanguage – The encoding/decoding in Biocomm, i.e. semiosis.

Biothought – The property of complexity (Kolmogorov, self-organizing, physical, irreducible, functional, specified – or whatever we decide) which exists in the Biocomm.

Biostructure – The biological system in which Biocomm exists, i.e. autonomy
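Since "Biocast" leans on Shannon's channel capacity in the presence of noise, it may help to pin that quantity down. For the textbook binary symmetric channel with bit-error rate p, Shannon's capacity is C = 1 - H(p). A minimal Python sketch (function names are mine, not from the thread):

```python
import math

def h2(p):
    """Binary entropy function H(p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Shannon capacity of a binary symmetric channel with
    bit-error rate p: C = 1 - H(p) bits per channel use."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # 1.0  (noiseless: a full bit gets through)
print(bsc_capacity(0.5))   # 0.0  (pure noise: nothing gets through)
```

The point for the glossary: capacity bounds how much "Biolearn" a "Biocast" can deliver, regardless of what the messages mean.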


448 posted on 01/04/2005 10:01:04 AM PST by Alamo-Girl (Please donate monthly to Free Republic!)

To: Alamo-Girl
I can go with "decrease in entropy." That is what's observed. It's certainly better than some of the other terms that have been tossed around in the past: information, communication, message, code, meaning, broadcast, etc. Whatever does not suggest, either to the writer having a momentary lapse, or to the overly suggestible reader, that someone, or something, is inherently involved in the process is fine with me. Mind now, I'm not playing the materialism game by ruling anything out. I just don't want to load up the vocabulary one way or the other.
449 posted on 01/04/2005 10:51:32 AM PST by PatrickHenry (The List-O-Links for evolution threads is at my freeper homepage.)

To: PatrickHenry; betty boop; Doctor Stochastic; tortoise; marron; cornelis; StJacques
Thank you so much for your reply and for accepting Schneider's rephrasing of the term!

The term, however, only applies to the receiver in the communication and the Shannon-Weaver model includes other elements: source, encoder, message, channel and decoder.

More specifically, in the Shannon-Weaver model the term information puts the emphasis on the first word of the phrase: decrease in entropy. IOW, Shannon information is an action, not a condition. The condition (the words taken together: decreased entropy) is more akin to the "information content" gained by the receiver, which is also called the decoded message.

The message which is sent (from source via encoding in Shannon) is "information content" which (in molecular biology) was previously gained by a decrease in entropy and retained (i.e. DNA).

So - although I'm all for less confusing or potentially misleading terminology - the one phrase doesn't fill the entire glossary we need to communicate. We still need something for "message", etc.

450 posted on 01/04/2005 11:12:53 AM PST by Alamo-Girl (Please donate monthly to Free Republic!)

To: Alamo-Girl

"Message capacity" is descriptive and more accurately captures the concept, though it's a bit bulky. And "Messcap" sounds like an article of clothing.


451 posted on 01/04/2005 11:22:01 AM PST by Doctor Stochastic (Vegetabilisch = chaotisch is der Charakter der Modernen. - Friedrich Schlegel)

To: Alamo-Girl

Think about the term "exchange" (which is used in physics) or maybe "transfer" instead of communication. Think about "status" or "condition" instead of information content. I dunno. The whole field reeks of sloppy, and thus potentially misleading, terminology. Gives me a brain-ache.


452 posted on 01/04/2005 11:31:48 AM PST by PatrickHenry (The List-O-Links for evolution threads is at my freeper homepage.)

To: Alamo-Girl

Another possibility is to just use "Anzeigekapazität" or "Anzkap" (keeping with the German tendency to abbreviate) rather than "information." The French could put it on the front of a camera as l'Anzkap.


453 posted on 01/04/2005 11:42:17 AM PST by Doctor Stochastic (Vegetabilisch = chaotisch is der Charakter der Modernen. - Friedrich Schlegel)

Comment #454 Removed by Moderator

To: Alamo-Girl
I really have not had time to respond to this, but I am going to take a quick stab now before you all go too far afield. The Shannon-Weaver model is an idealization of a more general theory that has utility for some types of engineering, in the same way that most types of engineering use idealized Newtonian mechanics rather than theoretically correct physics, since the errors in the model fall below the noise floor for engineering purposes if one assumes certain system parameters. Molecular biology and molecular machinery are well outside the parametric space in which the Shannon-Weaver assumptions give good results. In the Shannon-Weaver model, you list six elements. In the generalized theory, not only are these not independent of each other; there is no distinction between these elements. Cutting a single system into quasi-independent elements is probably the single most common engineering idealization, but one has to understand the limits of that idealization. A very common example of this is the distinction between "program" and "data" that is pervasive in computer science, a widely accepted distinction which has no theoretical basis and only exists for engineering convenience (and which will get you into trouble in some algorithm spaces).

DNA should not be treated as a "message" in a molecular system without making some assumptions that I do not think apply here, nor is it an element that can be legitimately treated as independent of the rest of the molecular machinery. The "message" that materializes is completely dependent on the context of the rest of the system/machinery, and the lack of functional independence for practical matters in real molecular systems makes the application of Shannon-Weaver doubtful. A simple example of this is that there are single machine code sequences for computers that are valid (and different) programs on wildly different machine architectures. The entire system determines the algorithm, not just some inextricable element that is arbitrarily determined to be "the program" or message.

Because of this, DNA only preserves state, not the algorithmic information of the system. Hypothetical transplantation of the DNA to another cellular system does NOT preserve the algorithmic information of the previous cellular system. Now, sufficiently similar systems may tend to converge on similar states (in biology, ones that don't will die), but that is really an accident of the configuration space and not at all required. My DNA is a bit of state that will usually converge on something stable in cells that are configured like mine, but it does not carry a message; there are plenty of contexts where my exact DNA will produce stable systems that are nothing remotely like me. Under the proper cellular contexts, my DNA would produce a living organism that looks like a slime mold. But for a lot of practical reasons, nature tends to put my DNA in system contexts that look similar to mine and so there are no slime molds with my DNA running around in the wild (that I know of anyway).
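tortoise's point that the very same byte sequence can be a different "program" under different machinery can be illustrated in a few lines. Here the same eight bytes decode to entirely different integers depending on the byte-order convention the decoder assumes; this is a toy Python illustration of context-dependence, not a biological model:

```python
import struct

# One fixed byte string (the "state"), two different decoding
# contexts (the "machinery"). The bytes alone determine nothing;
# the surrounding convention determines what "message" appears.
payload = b"\x00\x00\x00\x01\x00\x00\x00\x02"

as_big_endian = struct.unpack(">2i", payload)     # (1, 2)
as_little_endian = struct.unpack("<2i", payload)  # (16777216, 33554432)

print(as_big_endian)
print(as_little_endian)
```

Same "DNA," two wildly different "organisms": the algorithmic content lives in the system as a whole, not in the byte string by itself.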

455 posted on 01/04/2005 12:47:10 PM PST by tortoise (All these moments lost in time, like tears in the rain.)

To: Alamo-Girl
Namely, in physical systems we have thermodynamic entropy and in information theory we have information entropy, but in "information theory in molecular biology" there is a connection between the two which can lead to confusion: where there is a decrease of entropy (a gain of information content) in a molecular machine, there is a corresponding gain of thermodynamic entropy (dissipation of energy) in the local surroundings.

All these definitions of entropy are identical, they are just distilled rules from an obscure generalization applied at different levels of the system. The transaction theoretic versions (e.g. as would apply to thermodynamics) are pretty esoteric. Algorithmic information theory unifies all the various notions of "entropy" into a single concept that is probably far more confusing than either the static or transactional version in isolation (though very elegant in its own way).

One can see, as in your example, that they are all related. The conceptually hard part is figuring out what the general mathematical description is of a system that will express something that looks like thermodynamics -- ask yourself why a system would express this behavior in the hypothetical. The answer to that question has a good-sized lightbulb attached to it.

456 posted on 01/04/2005 1:23:55 PM PST by tortoise (All these moments lost in time, like tears in the rain.)

To: Matchett-PI; Alamo-Girl; marron; PatrickHenry; Doctor Stochastic; tortoise; stripes1776; stremba
"....to tell the truth, rather than the theory of evolution, we should speak of several theories of evolution. On the one hand, this plurality has to do with the different explanations advanced for the mechanism of evolution, and on the other, with the various philosophies on which it is based. Hence the existence of materialist, reductionist and spiritualist interpretations. What is to be decided here is the true role of philosophy and, beyond it, of theology."

I just wanted to thank you, M-PI, for your post. Karol Josef Wojtyla -- His Holiness John Paul II -- is probably one of the greatest intellectuals ever to sit on the Throne of St. Peter. And with this statement he shrewdly puts his finger on an ineluctable truth: A man will see what his "philosophy" allows him to see. As I argue without ceasing, "world-view" drives everything else, epistemologically speaking. Why, people have even been known to construct theories, and design experiments, with a view to validating their own worldview, and thus falsifying the other man's, rather than just following the clues one finds along the trail and tracking them to wherever they lead. Thus the Pope is right to speak of "several theories of evolution," depending on the essential motivation of such conceptualizations, be it "materialist, reductionist, spiritualist," or some other basis.

And this is a living truth: "Conflicts between the truths of science and the truths of faith ... are only apparent, never real, for both science and faith, the natural world accessible to reason, and the 'world' of revelation accessible to faith, have the same author: God."

Thank you so much, Matchett-PI, for putting this "on the record" of this thread. I'm grateful to you.

457 posted on 01/04/2005 4:51:35 PM PST by betty boop

To: Alamo-Girl; marron; PatrickHenry; StJacques; Doctor Stochastic; tortoise; stremba; cornelis; ...
Shannon information is an action not a condition.

Dear Alamo-Girl, thank you so much for this invaluable clarification. And also for earlier pointing out that "negative entropy" is a no-go term. Apparently, entropy is never negative, in that its "natural habit," so to speak, is to increase and spread. To say it can have a "negative" value would seem to take us clear out of the second law of thermodynamics. And I think PatrickHenry is right to point out that thermodynamic entropy most closely bears on our problem of trying to figure out the nature and source of biological "information."

Also thank you so very much for your elegant exposition of Kolmogorov complexity. That seems to be the path that beckons here. (IMHO, FWIW.) I was particularly intrigued by your suggestion that memory has a function within this scheme.
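Since Kolmogorov complexity is uncomputable in general, a common computable stand-in is compressed length, which gives an upper bound (up to a constant). A small Python sketch of the idea; the choice of zlib is mine and purely illustrative:

```python
import random
import zlib

# Compressed length as a crude, computable upper bound on
# Kolmogorov complexity: regular strings admit short descriptions,
# pseudo-random strings do not.
random.seed(42)
ordered = b"AB" * 500                                   # 1000 bytes, highly regular
noisy = bytes(random.getrandbits(8) for _ in range(1000))  # 1000 bytes, no pattern

size_ordered = len(zlib.compress(ordered, 9))
size_noisy = len(zlib.compress(noisy, 9))

# The regular string compresses far better than the noisy one.
print(size_ordered < size_noisy)   # True
```

This is only a proxy, but it makes the intuition concrete: high Kolmogorov complexity means no description much shorter than the string itself.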

I've been sort of lurking about lately, not piping up too much. (Plus there's been some scuffling about; hope we've gotten over it.) But I've been googling away, and have come across an interesting hypothesis regarding the role of entropy in living systems (no, not AG's, somebody else's; I don't think I'm at liberty to discuss what AG is up to these days).

Anyhoot, if the relevant point comes up, I'll be sure to weigh in. Just a sketch: the universe evolves as "a population of one"; what we typically think of as "Darwinist theory" is helpless before this postulate. Nothing within the Darwinist purview can come remotely close to explaining such a conception.

There's more, of course; but i don't want to get ahead of the discussion -- so ably, graciously, and capably conducted, in your kind and astoundingly knowledgeable hands.

But I must somewhat guiltily confess there's been another reason I haven't been around much lately: I got three of Patrick O'Brian's Jack Aubrey novels for Christmas. I am completely, totally enchanted. Read two -- Master and Commander and Post Captain -- over the long weekend, and am about half-way through the third, H.M.S. Surprise. O'Brian is a magnificent prose artist telling an historical tale about well-documented British naval actions of the Napoleonic Wars. But his greatest glory is his ability to evoke flesh-and-blood, vividly living, loving (or hating, or indifferent) characters: His Jack Aubrey and Stephen Maturin evoke the classical model of the deep male-bonded friendship of Achilles and Patroclus. O'Brian's work is simply brilliant, astonishing, lapidary, coruscating!!! Needless to say: I am so sucked in!!!

[Seventeen more to go!!! :^) I can't wait!!!]

God bless you, dear sister -- and all the rest of us reading these lines, too.

458 posted on 01/04/2005 6:16:18 PM PST by betty boop

To: betty boop
"As I argue without ceasing, "world-view" drives everything else, epistemologically speaking. ..."

I agree. In his book entitled "Faith and Reason - Searching for a Rational Faith", Ronald H. Nash hits on that subject big time! Lots of Alvin Plantinga quotes.

"Thank you so much, Matchett-PI, for putting this "on the record" of this thread. I'm grateful to you.:

You are most welcome!

459 posted on 01/04/2005 7:37:40 PM PST by Matchett-PI (Today's DemocRATS are either religious moral relativists, libertines or anarchists.)

To: Matchett-PI

I'll have to check out the Nash. Thanks so much, Matchett-PI!


460 posted on 01/04/2005 7:51:59 PM PST by betty boop


