Free Republic
Browse · Search
News/Activism
Topics · Post Article

To: WildTurkey; Alamo-Girl; marron; Right Wing Professor; logos; cornelis; ckilmer; StJacques; ...
The story is often told that in the late 1940s, John von Neumann, a pioneer of the computer age, advised communication theorist Claude E. Shannon to start using the term entropy when discussing information because "no one knows what entropy really is, so in a debate you will always have the advantage."

Your reaction to the alleged exchange above is instructive, WildTurkey. Where you see evidence of a cabal forming for the purpose of perpetrating a fraud on science, all I see is an inside joke shared by two great professional mathematicians. Still, the phrase “entropy…nobody understands it anyway” has a certain resonance.

So maybe we should just try to understand it? And especially it seems we need to understand how it works in living systems. Living systems are information-driven systems, and entropy is a key facilitator of this process. It turns out information and entropy are directly correlated terms: The more complex the (self-organizing) living system, the greater its need for information and, thus, the greater its need to accumulate high entropy.

To put this into perspective, Paul Davies [The Fifth Miracle, 1998] writes:

The laws of physics … are algorithmically very simple; they contain relatively little information. Consequently they cannot on their own be responsible for creating informational macromolecules … life cannot be “written into” the laws of physics…. Life works its magic not by bowing to the directionality of chemistry, but by circumventing what is chemically and thermodynamically “natural.” Of course, organisms must comply with the laws of physics and chemistry, but these laws are only incidental to biology.

Thus we have an apparent paradox: Living systems must simultaneously “circumvent” and “comply with” the laws of physics and chemistry. But the paradox dissolves when we see that it is by “paying their entropy debt” that living systems can do this. And pay it they must, for the balance equation of the second law — ΔS = 0 — requires it (which might be translated, for any given system of whatever type, as “the net change in entropy equals zero”; the Δ denotes a change, or difference, in the quantity that follows it).

Rod Swenson explains the balance equation this way: Where any form of energy (e.g., mechanical, chemical, electrical, or energy in the form of heat) is out of equilibrium with its surroundings, “a potential exists that the world acts spontaneously to minimize.”

In other words, you can’t just look at a “thermodynamic object” as if it were somehow discrete, isolatable from its environment. The point of the second law is to predict the behavior of a system precisely in the context of its physical environment, an environment that ultimately extends to the entire universe. For the second law of thermodynamics, like the first, is a universal law. Yet unlike the first, the second law is not time-reversible. The “arrow of time” moves inexorably in only one direction, toward the future. Along the way (so to speak), it is the nature of entropy to increase inexorably: ΔS > 0. Speaking globally, were entropy to “max out,” the result would be a universe in thermal equilibrium, a world in a state of maximum disorder in which nothing above the particle level would exist. Applied to the case of an individual living system, the result would be “heat death”: It would cease to live.
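A toy sketch may make the one-way “arrow” vivid. The following is my own illustration, not anything from the posts above: an Ehrenfest-style urn model in which particles hop at random between the two halves of a box, while we track S/k_B = ln(number of microstates) for the current macrostate. Starting from the perfectly ordered state (all particles on the left, S = 0), the entropy climbs toward its maximum and stays there.

```python
import math
import random

def log_multiplicity(n_left, n_total):
    """S/k_B = ln C(N, n_left): log of the number of microstates
    (ways of choosing which particles sit in the left half)."""
    return (math.lgamma(n_total + 1)
            - math.lgamma(n_left + 1)
            - math.lgamma(n_total - n_left + 1))

def relax(n_total=1000, steps=20000, seed=0):
    """Start with every particle on the left (a highly ordered,
    zero-entropy macrostate) and let randomly chosen particles
    hop between the halves, recording entropy along the way."""
    rng = random.Random(seed)
    left = n_total                      # all particles start on the left
    history = [log_multiplicity(left, n_total)]
    for _ in range(steps):
        # pick a particle uniformly at random; it hops to the other half
        if rng.random() < left / n_total:
            left -= 1
        else:
            left += 1
        history.append(log_multiplicity(left, n_total))
    return history

hist = relax()
print(f"S/k_B start: {hist[0]:.1f}, end: {hist[-1]:.1f}, "
      f"max possible: {log_multiplicity(500, 1000):.1f}")
```

The microscopic rule is time-symmetric, yet the macroscopic entropy rises essentially monotonically, which is the statistical content of ΔS > 0 for an isolated system.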

It has been said that the first law of thermodynamics — the law of energy conservation — unifies all real-world processes, and expresses the underlying symmetry principle of the natural world. As such, it is the law of “that which does not change.” The second law, on the other hand, is the law of “that which changes.” Hold that thought for now (we’ll return to it shortly), and let’s look at the thermodynamic behavior of a simple self-organizing system and see what we can figure out.

I propose we look at Bénard cells. You’ll recall earlier I said that the Boltzmann hypothesis of the second law has been broadly understood as a law of disorder. In effect, as Swenson notes, he reduced the second law “to the stochastic collisions of mechanical particles, or a law of probability.” Apparently, Boltzmann reasoned that in a world of mechanically colliding particles, disordered states would be the most probable. Swenson writes of Boltzmann’s view, “There are so many more possible disordered states than ordered ones that a system will almost always be found either in the state of maximum disorder, the macrostate with the greatest number of accessible microstates such as a gas in a box at equilibrium, or moving towards it. A dynamically ordered state, one with molecules moving ‘at the same speed and in the same direction,’ said Boltzmann, ‘is the most improbable case conceivable … an infinitely improbable configuration of energy’.”
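Boltzmann’s counting argument is easy to check directly. As a small illustration of my own (not from the thread), take 100 particles, each equally likely to be in the left or right half of a box, and compare the probability of one perfectly ordered macrostate against the band of near-balanced, “disordered” macrostates:

```python
from math import comb

N = 100                         # particles, each in left or right half
total = 2 ** N                  # all microstates, each equally probable

# one perfectly "ordered" macrostate: every particle on the left
p_ordered = comb(N, N) / total  # = 1 / 2**100

# "disordered" macrostates: between 40 and 60 particles on the left
p_disordered = sum(comb(N, k) for k in range(40, 61)) / total

print(f"P(all left)    = {p_ordered:.3e}")
print(f"P(40..60 left) = {p_disordered:.4f}")
```

With only 100 particles the ordered state is already less probable than one in 10^30, while the disordered band captures over 95% of all microstates; with a mole of particles the disproportion becomes the “infinitely improbable” of Boltzmann’s phrase.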

Yet a living system is a highly complex, self-organizing system — meaning that, at minimum, it requires molecules to “move together and in the same direction.” Thus the “Boltzmann regime” does not appear to be applicable to such cases. At bottom, a living system is not a mechanistic one. And the “end-directedness” of a living system — to organize, maintain, and conserve life — runs exactly counter to the “end-directedness” of the second law: maximum disorder, extinction of potentials, “heat death.”

The Bénard cell presents an instructive case. For it demonstrates a state in which gangs of molecules move together and in the same direction — that is, it is an organized, some say self-organized, system, provided it has an energy (heat) source above a certain critical threshold. Clearly, a Bénard cell does not behave like Boltzmann’s “gas in a box.”

Let me try to describe Henri Bénard’s experiment. It consists of a circular dish holding a viscous liquid between a uniform heat source below and the cooler ambient air above. The difference between the temperature “below” and the temperature “above” constitutes a potential, a thermodynamic force whose magnitude is determined by the gap between the “source” (the heat source below) and the “sink” (the ambient air). Once the heating temperature reaches a certain minimum threshold, this potential becomes sufficient to drive flows within the system that take the form of Bénard cells. We observe the development of an ordered flow that carries hot fluid up from the bottom of the dish through the center to the top surface, where it is cooled by the air; the fluid then moves down the sides, pulls in more potential as it travels across the bottom, rises through the center again, and the cycle repeats. If the heating temperature of the source falls below the minimum threshold, this activity ceases, and “the Boltzmann regime” takes over.

If all this sounds really complicated, well, you might say that any cook who has ever made a gravy, or a sauce béchamel, has observed this experiment in its gross aspect. It’s called convection.
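The “minimum threshold” in the experiment has a standard quantitative form: convection sets in when the dimensionless Rayleigh number, Ra = g·β·ΔT·d³/(ν·α), exceeds a critical value (about 1708 for rigid top and bottom boundaries). The sketch below uses illustrative property values roughly appropriate for water; the layer depth and temperature differences are made-up numbers chosen only to show the threshold behavior.

```python
def rayleigh_number(delta_T, depth, g=9.81, beta=2.1e-4,
                    nu=1.0e-6, alpha=1.4e-7):
    """Ra = g * beta * dT * d^3 / (nu * alpha).
    beta: thermal expansion coefficient (1/K), nu: kinematic
    viscosity (m^2/s), alpha: thermal diffusivity (m^2/s);
    defaults are rough values for water near room temperature."""
    return g * beta * delta_T * depth**3 / (nu * alpha)

RA_CRITICAL = 1708  # critical value for rigid-rigid boundaries

for dT in (0.01, 0.1, 1.0):  # temperature difference across a 5 mm layer
    ra = rayleigh_number(dT, depth=5e-3)
    state = "convection (Benard cells)" if ra > RA_CRITICAL else "conduction only"
    print(f"dT = {dT:5.2f} K  ->  Ra = {ra:10.1f}  ({state})")
```

Below the critical Rayleigh number the heat simply conducts through the motionless fluid (“the Boltzmann regime”); above it, the ordered cellular flow spontaneously appears.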

The take-away from Bénard’s experiment is that any ordered flow must function to increase the rate of entropy production of the system-plus-environment, pulling in sufficient resources and then dissipating them, thus satisfying the balance equation of the second law. As Heraclitus might put it, that which persists does so as the result of ceaseless change.

Schroedinger makes this explicit; for he says that living systems must produce entropy (minimize potentials) at a sufficient rate to compensate for their own internal ordering (which can be measured as distance from equilibrium) — which is what preserves the system in its particular form — thus to satisfy the second law’s balance equation.

To put this matter very, very crudely, for a thing to “be what it is,” it has to lose all potential for being “anything else.” Entropy is that which dissipates the unneeded potentials. Thus, the more complex and ordered a system is, the more entropy it requires. And this is the reason why people say that living systems “must pay their entropy debt.”
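The “entropy debt” bookkeeping can be written down in a few lines. This is a crude sketch of my own with hypothetical numbers, not measurements: a system may lower its own entropy only if it exports at least that much entropy to its surroundings, here modeled as heat dumped into an environment at fixed temperature.

```python
def entropy_books(dS_system, q_out, T_env):
    """Crude bookkeeping for the 'entropy debt': a system may lower
    its own entropy (dS_system < 0, in J/K) only if it exports at
    least as much entropy to the environment, here as heat q_out (J)
    released into surroundings at temperature T_env (K)."""
    dS_env = q_out / T_env          # entropy gained by the surroundings
    dS_total = dS_system + dS_env   # must be >= 0 by the second law
    return dS_env, dS_total

# hypothetical numbers: an organism lowers its internal entropy by
# 10 J/K while radiating 6000 J of heat into 300 K surroundings
dS_env, dS_total = entropy_books(dS_system=-10.0, q_out=6000.0, T_env=300.0)
print(f"dS_env = {dS_env:.1f} J/K, dS_total = {dS_total:+.1f} J/K")
assert dS_total >= 0, "the second law's balance must not go negative"
```

The surroundings gain 20 J/K, more than covering the system’s 10 J/K of internal ordering, so the total change is positive and the debt is paid.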

In living systems, entropy — the dissipation of potentials — generally takes the form of heat radiated out to the external environment.

So, what does any of this have to do with “Shannon entropy”? Shannon entropy is a quantity isolated in Shannon’s theory of information. I understand the term to play a role directly analogous to that of thermodynamic entropy. We might say it refers to potential information that is not selected upon the successful communication of a message, “success” being defined as “the reduction of uncertainty in the receiver” that moves the living system from a before state to the after state best serving biological interests. And analogously to the case of thermodynamic entropy, the more “decisions” the living system makes (which are what reduce uncertainty in the system), the more Shannon entropy there has to be. All the “paths” not taken are “dissipated”; that is, they have no force.
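For readers who want the definition itself: Shannon entropy is H = −Σ pᵢ log₂ pᵢ, measured in bits, and the “reduction of uncertainty in the receiver” is just the before-minus-after difference in H. A minimal example with a made-up four-message receiver:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) in bits; outcomes with zero
    probability contribute nothing to the sum."""
    return -sum(p * log2(p) for p in probs if p > 0)

# hypothetical receiver: four equally likely messages before reception
before = [0.25, 0.25, 0.25, 0.25]
# after a successful message, uncertainty collapses onto one outcome
after = [1.0, 0.0, 0.0, 0.0]

h_before = shannon_entropy(before)   # 2 bits of uncertainty
h_after = shannon_entropy(after)     # 0 bits remain
print(f"information gained: {h_before - h_after:.2f} bits")
```

The three messages not selected are the “paths not taken”: before reception they contributed to the receiver’s uncertainty; afterward they carry no weight at all.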

At least that’s what the situation looks like to me. Call it a “hypothesis,” and then anyone who wants to falsify it can take a stab at it. I’m keenly interested in entertaining other views of the matter.

Sorry to be so long in replying, but I'm simply buried in work these days, and am so behind in answering my correspondence that I am on the brink of despair. Thanks for writing, WildTurkey.

1,792 posted on 02/06/2005 10:38:39 AM PST by betty boop


To: betty boop
thus, the greater its need to accumulate high entropy.

A biological form is low entropy and you do not "accumulate high entropy".

1,793 posted on 02/06/2005 10:44:03 AM PST by WildTurkey (When will CBS Retract and Apologize?)

To: betty boop
And pay it they must, for the balance equation of the second law — DS = 0

ΔS = 0 only in reversible processes. All natural processes are irreversible. I think you need a class in thermodynamics.
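The textbook example behind this objection: in the free expansion of an ideal gas into double the volume, no heat flows and no work is done, yet the entropy change (computed along a reversible isothermal path between the same end states) is strictly positive. A one-screen check:

```python
from math import log

R = 8.314          # J/(mol*K), ideal gas constant
n = 1.0            # mol of ideal gas

# Free expansion into double the volume: q = 0, w = 0, irreversible.
# Entropy change is evaluated along a reversible isothermal path
# connecting the same initial and final states: dS = n R ln(V2/V1).
dS = n * R * log(2)
print(f"dS = {dS:.3f} J/K  (> 0: the process is irreversible)")
```

Since ΔS > 0 for an isolated system undergoing any real (irreversible) process, ΔS = 0 holds only in the idealized reversible limit.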

1,795 posted on 02/06/2005 10:47:34 AM PST by WildTurkey (When will CBS Retract and Apologize?)

To: betty boop
It has been said that the first law of thermodynamics — the law of energy conservation — unifies all real-world processes, and expresses the underlying symmetry principle of the natural world

I think I see your problem.

1,797 posted on 02/06/2005 10:53:56 AM PST by WildTurkey (When will CBS Retract and Apologize?)

To: betty boop

When you hand pearls to swine, the swine bite your hand. And they call it "science". Of course, what swine call "science" is not science at all, but just being an animal.


1,801 posted on 02/06/2005 11:42:09 AM PST by bvw

To: betty boop
Schroedinger makes this explicit; for he says that living systems must produce entropy

[Schroedinger] This seems to suggest that the higher temperature of the warm-blooded animal includes the advantage of enabling it to get rid of its entropy at a quicker rate, so that it can afford a more intense life process. I am not sure how much truth there is in this argument (for which I am responsible, not Simon). One may hold against it, that on the other hand many warm-blooders are protected against the rapid loss of heat by coats of fur or feathers. So the parallelism between body temperature and 'intensity of life', which I believe to exist, may have to be accounted for more directly by van't Hoff's law, mentioned on p. 65: the higher temperature itself speeds up the chemical reactions involved in living. (That it actually does, has been confirmed experimentally in species which take the temperature of the surroundings.)

1,805 posted on 02/06/2005 12:10:23 PM PST by WildTurkey (When will CBS Retract and Apologize?)

To: betty boop
So maybe we should just try to understand it?

[Schroedinger] To the physicist - but only to him - I could hope to make my view clearer ...

1,806 posted on 02/06/2005 12:12:00 PM PST by WildTurkey (When will CBS Retract and Apologize?)

To: betty boop
Where any form of energy (e.g., mechanical, chemical, electrical, or energy in the form of heat) is out of equilibrium with its surroundings, “a potential exists that the world acts spontaneously to minimize.”

Of course you are referencing the outmoded caloric theory of heat.

1,832 posted on 02/06/2005 2:26:47 PM PST by WildTurkey (When will CBS Retract and Apologize?)

To: betty boop; js1138
Thank you so much for your excellent posts and great insight!

Still, the phrase “entropy…nobody understands it anyway” has a certain resonance.

So very true! I have tried repeatedly to assert the evolution of the term "entropy" but with little success.

For Lurkers who are interested in the subject, here's a great historical overview of how "entropy" developed: The Second Law of Thermodynamics

In biological systems, there are three different types of entropy involved:

Thermodynamic Entropy: Maxwell-Boltzmann-Gibbs entropy

Logical Entropy: Shannon probability-distribution entropy

Algorithmic Entropy: Kolmogorov-Solomonoff-Chaitin sequence/algorithmic complexity.

If one only looks at thermodynamic entropy (which is clearly the tendency around here) - then one is arriving at a conclusion without all the facts.
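The three entropies in the list above can even be probed on the same data. As a rough illustration of my own, the compressed length of a string is a practical (upper-bound) stand-in for its algorithmic/Kolmogorov entropy, since the true quantity is uncomputable: a regular sequence compresses to almost nothing, while random noise is essentially incompressible.

```python
import zlib
import random

def compressed_bits(data: bytes) -> int:
    """zlib-compressed length in bits: a computable upper bound
    (not the true value) on the string's algorithmic complexity."""
    return 8 * len(zlib.compress(data, 9))

rng = random.Random(0)
n = 10_000
ordered = b"AB" * (n // 2)                                # highly regular
disordered = bytes(rng.randrange(256) for _ in range(n))  # noise

print("ordered:   ", compressed_bits(ordered), "bits")
print("disordered:", compressed_bits(disordered), "bits")
```

Both strings are the same length (the same raw Shannon capacity of 80,000 bits), yet their algorithmic entropies differ by orders of magnitude, which is exactly why the three measures cannot be used interchangeably.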

In molecular machinery, the uncertainty of the receiver before a message is received and decoded is the Shannon entropy in the before state. When the incoming message (along with noise) has been received and decoded, the uncertainty in the molecular machinery decreases to the after state, also a Shannon entropy.

The difference between the two (the before-state entropy less the after-state entropy) is the gain of information in bits. In Shannon's framework a bit is a unit of uncertainty, not a binary digit, so the gain need not be a whole number. Each bit gained by this reduction of uncertainty (state change) has a corresponding release of energy into the local surroundings. That pays the 2nd Law tab of thermodynamic entropy.
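One standard way to make that per-bit energy tab quantitative (not named in the post, but the usual reference point) is Landauer's bound: resolving or erasing one bit dissipates at least k_B·T·ln 2 of heat into the surroundings. The numbers below are purely hypothetical, with a made-up eight-message decoder at roughly body temperature.

```python
from math import log, log2

K_B = 1.380649e-23   # J/K, Boltzmann constant
T = 310.0            # K, roughly body temperature

def entropy_bits(probs):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

# hypothetical decoder: eight candidate messages before decoding,
# two still viable afterward
h_before = entropy_bits([1/8] * 8)    # 3 bits of uncertainty
h_after = entropy_bits([1/2] * 2)     # 1 bit remains
bits_gained = h_before - h_after      # 2 bits of uncertainty removed

# Landauer's bound: each bit resolved costs >= k_B * T * ln 2 of heat
min_heat = bits_gained * K_B * T * log(2)
print(f"{bits_gained:.1f} bits gained; minimum heat to surroundings "
      f"= {min_heat:.2e} J")
```

The resulting energies are tiny per decision, but they are strictly positive, which is the sense in which every informational "state change" pays a real thermodynamic tab.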

voila - two different entropies, one organism.

The third kind of entropy - algorithmic entropy - is not yet well defined. Adami is a primary investigator in this particular quest. In the end, that form of entropy may supplant Shannon entropy. And because it is wrapped up in the principles of Kolmogorov complexity, Solomonoff induction and Chaitin randomness - it may reach to some other questions as well. But it is a work-in-progress and thus not relevant to this discussion other than to acknowledge that it is "out there".

IOW, Darwinist evolutionary theory seems to account beautifully for selections made according to the external, environmental pressures, but is entirely silent about the internal, biologically- or organismically-driven ones. And for that reason I continue to suspect that the theory is somehow incomplete as a comprehensive theory of biological life.

So very true!!!

Wolfram, in pursuing the von Neumann challenge of cellular automata (self-organizing complexity), believes that evolution has happened despite natural selection. Rocha's approach is clearly autonomous biological self-organizing complexity.

And then we have Gehring, Weiss and others who are recognizing (owing to the concurrent evolution of eyes across phyla, including between vertebrates and invertebrates) that a common ancestor that had no eyes would have to have possessed master control genetic mechanisms which were not (as) subject to wholesale mutations.

All of this has developed long after Darwin and points to a direction of evolution rather than a happenstance of evolution. That does not speak to whether or not the direction was "designed" but rather that the "randomness" pillar of evolution is in dire peril.

Personally, I expect the "random mutations + natural selection > species" formulation to be replaced with "autonomous biological self-organizing complexity".

1,834 posted on 02/06/2005 3:25:16 PM PST by Alamo-Girl



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson