To: betty boop
Especially since what entropy is usually taken to mean is the amount of thermal energy not available to perform useful work.

I think this is where you are missing the big picture.

1,772 posted on 02/04/2005 9:23:52 PM PST by WildTurkey (When will CBS Retract and Apologize?)


To: WildTurkey; betty boop; tortoise
I think this is where you are missing the big picture.

Which big picture are you referring to? I thought we were addressing whether the appeal to the 2nd Law of Thermodynamics by those arguing against evolution is valid.

Concerning that issue, the phrasing is correct for thermodynamic entropy:

The Second Law of Thermodynamics

The American Heritage Dictionary gives as the first definition of entropy, "For a closed system, the quantitative measure of the amount of thermal energy not available to do work." So it's a negative kind of quantity, the opposite of available energy.

As the above article mentions, the 2nd Law of Thermodynamics is usually stated: "Entropy in a closed system can never decrease."
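
To make the "never decrease" claim concrete, here is a toy illustration of my own (the numbers are invented): when heat flows from a hot reservoir to a cold one, the hot side loses entropy and the cold side gains more than was lost, so the total can only go up.

# Toy 2nd Law illustration: heat flowing hot -> cold between an isolated
# pair of reservoirs. All values are made up for the example.
Q = 100.0       # joules transferred
T_hot = 400.0   # kelvin
T_cold = 300.0  # kelvin

dS_hot = -Q / T_hot    # hot reservoir loses entropy
dS_cold = Q / T_cold   # cold reservoir gains entropy

print(dS_hot + dS_cold)  # +0.0833... J/K -- total entropy never decreases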

When speaking of biological systems on the forum, the term "entropy" is a stumbling block because there are several types, and many people are only familiar with the thermodynamic kind.

The above article discusses thermodynamic entropy and logical entropy and mentions others. Thermodynamic entropy is what the 2nd Law addresses. But in biological systems, thermodynamic entropy is directly related to logical entropy, and without considering that relationship it may seem as though biological life violates the 2nd Law of Thermodynamics and that, ergo, evolution is impossible.

Concerning information theory and molecular biology, there are three types of entropy involved (a short sketch comparing the first two follows the list):

Thermodynamic Entropy: Maxwell-Boltzmann-Gibbs entropy

Logical Entropy: Shannon probability-distribution entropy

Algorithmic Entropy: Kolmogorov-Solomonoff-Chaitin sequence/algorithmic complexity.
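
To make the parallel between the first two concrete, here is a minimal sketch of my own (not from the articles above). Shannon's logical entropy and the Gibbs form of thermodynamic entropy are the same sum over a probability distribution; they differ only in the logarithm base and the Boltzmann-constant prefactor.

import math

def shannon_entropy(p):
    # Logical (Shannon) entropy in bits: H = -sum(p_i * log2(p_i))
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_entropy(p, k_B=1.380649e-23):
    # Thermodynamic (Gibbs) entropy in J/K: S = -k_B * sum(p_i * ln(p_i))
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.25, 0.25, 0.25, 0.25]  # e.g., four equally likely DNA bases
print(shannon_entropy(p))     # 2.0 bits
print(gibbs_entropy(p))       # ~1.91e-23 J/K -- same form, different units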

The reduction of Shannon entropy (from the before state to the after state) in molecular machines dissipates energy into the local surroundings (thermodynamic entropy): Uncertainty, Entropy and Information. The Shannon model is used by Schneider (NIH Cancer Research) and others in active research.
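
As a rough sketch of that uncertainty-reduction idea (my own illustration; the base frequencies are invented, not Schneider's data): before binding, a DNA position is maximally uncertain (2 bits over 4 bases); after binding, the skewed base frequencies at the site reduce that uncertainty, and the difference is the information gained.

import math

def H(p):
    # Shannon uncertainty in bits
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# One position in a hypothetical binding site (frequencies invented):
before = [0.25, 0.25, 0.25, 0.25]  # genomic background: A, C, G, T
after = [0.70, 0.10, 0.10, 0.10]   # observed frequencies at the site

R = H(before) - H(after)  # information gained, in bits
print(R)  # ~0.64 bits; this drop in logical entropy is paid for by
          # dissipating energy (thermodynamic entropy) into the surroundings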

Because the term "entropy" is confusing in this context, we usually call it "uncertainty." As you can see from the above link, the difference reduces to the unit of measure: bits vs. joules per kelvin. See also The Evolution of Carnot's Principle.
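
The conversion factor itself is just the Boltzmann constant times ln 2, as a quick check shows:

import math

k_B = 1.380649e-23           # Boltzmann constant, J/K
per_bit = k_B * math.log(2)  # thermodynamic "exchange rate" for one bit
print(per_bit)               # ~9.57e-24 J/K per bit

bits = 2.0
print(bits * per_bit)        # the same 2 bits expressed in J/K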

AFAIK, Adami is the only notable investigator in the field who is also attempting to bring in algorithmic entropy. We happen to have an expert in algorithmic information theory on the forum, Freeper tortoise. He briefly reviewed Adami's approach to Information Theory and Molecular Biology and was not particularly impressed with his math [post 582, calling it 'immature'].
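
For those curious about the third kind: true algorithmic (Kolmogorov) entropy is uncomputable, but it is commonly approximated from above by the length of a compressed encoding. A minimal sketch of my own (an illustration of the general idea, not Adami's method):

import os
import zlib

def algorithmic_entropy_upper_bound(s):
    # Compressed length upper-bounds the algorithmic complexity of s.
    return len(zlib.compress(s, 9))

repetitive = b"ACGT" * 250       # highly regular 1000-byte sequence
random_bytes = os.urandom(1000)  # incompressible 1000-byte sequence

print(algorithmic_entropy_upper_bound(repetitive))    # small: low complexity
print(algorithmic_entropy_upper_bound(random_bytes))  # ~1000+: high complexity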

Seems to me it would only complicate this particular discussion to raise a third kind of entropy, when we can put the issue to bed with the first two.

If we don't raise logical entropy, and instead leave thermodynamic entropy to fend for itself in the face of biological systems, then the gain of information content (including autonomy and semiosis) in biological systems, functional complexity, "energy storage," and the like will continue to be raised as evidence of biological systems violating the 2nd Law of Thermodynamics.

1,773 posted on 02/04/2005 10:48:40 PM PST by Alamo-Girl
