To: Alamo-Girl
Actually, your links do have some fine tidbits: "Entropy in this latter sense has come to be used in the growing fields of information science, computer science, communications theory, etc. The story is often told that in the late 1940s, John von Neumann, a pioneer of the computer age, advised communication-theorist Claude E. Shannon to start using the term entropy when discussing information because 'no one knows what entropy really is, so in a debate you will always have the advantage.'"
1,778 posted on 02/05/2005 10:14:02 AM PST by WildTurkey (When will CBS Retract and Apologize?)


To: WildTurkey
Perhaps this will help: Wikipedia: Information Entropy.

It explains how Shannon entropy is related to Boltzmann entropy.

1,779 posted on 02/05/2005 10:36:39 AM PST by Alamo-Girl

To: WildTurkey; Alamo-Girl; marron; Right Wing Professor; logos; cornelis; ckilmer; StJacques; ...
The story is often told that in the late 1940s, John von Neumann, a pioneer of the computer age, advised communication-theorist Claude E. Shannon to start using the term entropy when discussing information because "no one knows what entropy really is, so in a debate you will always have the advantage."

Your reaction to the alleged exchange above is instructive, WildTurkey. Where you see evidence of a cabal forming for the purpose of perpetrating a fraud on science, all I see is an inside joke shared by two great professional mathematicians. Still, the phrase “entropy…nobody understands it anyway” has a certain resonance.

So maybe we should just try to understand it? In particular, it seems we need to understand how it works in living systems. Living systems are information-driven systems, and entropy is a key facilitator of this process. It turns out information and entropy are directly correlated terms: The more complex the (self-organizing) living system, the greater its need for information and, thus, the greater its need to produce entropy.

To put this into perspective, Paul Davies [The Fifth Miracle, 1998] writes:

The laws of physics … are algorithmically very simple; they contain relatively little information. Consequently they cannot on their own be responsible for creating informational macromolecules … life cannot be “written into” the laws of physics…. Life works its magic not by bowing to the directionality of chemistry, but by circumventing what is chemically and thermodynamically “natural.” Of course, organisms must comply with the laws of physics and chemistry, but these laws are only incidental to biology.

Thus we have an apparent paradox: Living systems must simultaneously “circumvent” and “comply with” the laws of physics and chemistry. But the paradox dissolves when we see that it is by “paying their entropy debt” that living systems can do this. And pay it they must, for the balance equation of the second law — ΔS = 0 — requires it (which might be translated, for any given system of whatever type, as “the change in entropy equals zero”).
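To make the bookkeeping concrete, here is the standard textbook accounting (my own gloss, not a quotation from Swenson): the second law constrains the total entropy of system plus surroundings,

ΔS_total = ΔS_system + ΔS_surroundings ≥ 0,

with equality (the idealized “balance” case) holding only in the reversible limit. A living system that builds or maintains internal order has ΔS_system < 0, so the inequality forces it to export at least a compensating amount of entropy to its surroundings: ΔS_surroundings ≥ |ΔS_system|. That export is the “entropy debt” being paid.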

Rod Swenson explains the balance equation this way: Where any form of energy (e.g., mechanical, chemical, electrical, or energy in the form of heat) is out of equilibrium with its surroundings, “a potential exists that the world acts spontaneously to minimize.”

In other words, you can’t just look at a “thermodynamic object” as if it were somehow discrete, isolatable from its environment. The point of the second law is to predict the behavior of a system precisely in the context of its physical environment, an environment that ultimately extends to the entire universe. For the second law of thermodynamics, like the first, is a universal law. Yet unlike the first, the second law is not time-reversible. The “arrow of time” moves inexorably in only one direction, towards the future. Along the way (so to speak), it is the nature of entropy to inexorably increase, ΔS > 0. Speaking globally, were entropy to “max out,” the result would be a universe in thermal equilibrium, a world in a state of maximum disorder in which nothing above the particle level would exist. Applied to the case of an individual living system, the result would be “heat death”: It would cease to live.
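A worked textbook example of how a potential is spontaneously “minimized” while entropy increases (the numbers are mine, chosen for arithmetic convenience): let Q = 1000 J of heat flow from a hot reservoir at T_hot = 400 K to a cold one at T_cold = 300 K. Then

ΔS = Q/T_cold − Q/T_hot = 1000/300 − 1000/400 ≈ 3.33 − 2.50 = +0.83 J/K > 0.

The hot side loses entropy, the cold side gains more than the hot side loses, and the temperature difference (the potential) is consumed in the process.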

It has been said that the first law of thermodynamics — the law of energy conservation — unifies all real-world processes, and expresses the underlying symmetry principle of the natural world. As such, it is the law of “that which does not change.” The second law, on the other hand, is the law of “that which changes.” Hold that thought for now (we’ll return to it shortly), and let’s look at the thermodynamic behavior of a simple self-organizing system and see what we can figure out.

I propose we look at Bénard cells. You’ll recall earlier I said that the Boltzmann hypothesis of the second law has been broadly understood as a law of disorder. In effect, as Swenson notes, Boltzmann reduced the second law “to the stochastic collisions of mechanical particles, or a law of probability.” Apparently, Boltzmann reasoned that in a world of mechanically colliding particles, disordered states would be the most probable. Swenson writes of Boltzmann’s view, “There are so many more possible disordered states than ordered ones that a system will almost always be found either in the state of maximum disorder, the macrostate with the greatest number of accessible microstates such as a gas in a box at equilibrium, or moving towards it. A dynamically ordered state, one with molecules moving ‘at the same speed and in the same direction,’ said Boltzmann, ‘is the most improbable case conceivable … an infinitely improbable configuration of energy’.”
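A toy calculation shows just how lopsided that counting is. The little sketch below (my own illustration, not Swenson’s or Boltzmann’s) puts N = 100 molecules in a box and compares the maximally ordered macrostate (all molecules in the left half) against the balanced one:

from math import comb

# Each of N molecules sits independently in the left or right half of
# the box, so there are 2^N equally likely microstates in all.
N = 100                      # a toy "gas"; real samples have ~10^23

total = 2 ** N
balanced = comb(N, N // 2)   # microstates of the 50/50 macrostate
all_left = 1                 # the single all-in-the-left microstate

print(f"P(50/50 split) = {balanced / total:.3e}")   # ~8.0e-02
print(f"P(all on left) = {all_left / total:.3e}")   # ~7.9e-31

Even at a mere hundred molecules, the ordered macrostate is suppressed by some twenty-nine orders of magnitude relative to the balanced one; at Avogadro-scale numbers the suppression is beyond astronomical. That is Boltzmann’s “infinitely improbable configuration.”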

Yet a living system is a highly complex, self-organizing system — meaning that, at minimum, it requires molecules to “move together and in the same direction.” Thus the “Boltzmann regime” does not appear to be applicable to such cases. At bottom, a living system is not a mechanistic one. And the “end-directedness” of a living system — to organize, maintain, and conserve life — runs exactly counter to the “end-directedness” of the second law: maximum disorder, extinction of potentials, “heat death.”

The Bénard cell presents an instructive case. For it demonstrates a state in which gangs of molecules move together and in the same direction; that is, it is an organized (some say self-organizing) system, provided it has an energy (heat) source above a certain critical threshold. Clearly, a Bénard cell does not behave like Boltzmann’s “gas in a box.”

Let me try to describe Henri Bénard’s experiment. It consists of a circular dish holding a viscous liquid between a uniform heat source below and the cooler ambient air above. The difference between the temperature “below” and the temperature “above” constitutes a potential, thermodynamic force F, whose magnitude is determined by the difference between the two temperatures; i.e., between the “source” (the heat source below) and the “sink” (the ambient air). Once the heating temperature reaches a certain minimum threshold, this potential becomes sufficient to drive flows within the system that take the form of Bénard cells. We observe the development of an ordered flow that carries hot fluid up from the bottom of the dish through the center to the top surface, where it is cooled by the air; the fluid then moves down the sides, pulls in more potential as it travels across the bottom, rises through the center again, and the cycle repeats. If the heating temperature of the source falls below the minimum threshold, this activity ceases, and “the Boltzmann regime” takes over.
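The “certain minimum threshold” can actually be made quantitative. In the standard treatment of what is now called Rayleigh–Bénard convection, the cells appear once a dimensionless quantity, the Rayleigh number, exceeds a critical value of about 1708 (for rigid top and bottom boundaries). A little sketch, with made-up illustrative fluid parameters (none of these numbers come from Bénard’s own experiment):

# Standard onset criterion for Rayleigh-Benard convection:
# Ra = g * alpha * dT * d^3 / (nu * kappa); ordered cells form once
# Ra exceeds a critical value (~1708 for rigid-rigid boundaries).

g = 9.81          # gravitational acceleration, m/s^2
alpha = 2.1e-4    # thermal expansion coefficient, 1/K (illustrative)
nu = 1.0e-6       # kinematic viscosity, m^2/s (illustrative)
kappa = 1.4e-7    # thermal diffusivity, m^2/s (illustrative)
d = 0.005         # depth of the fluid layer, m
RA_CRITICAL = 1708.0

def rayleigh(delta_T):
    """Rayleigh number for a temperature difference delta_T (kelvin)."""
    return g * alpha * delta_T * d**3 / (nu * kappa)

for dT in (0.1, 1.0, 5.0):
    ra = rayleigh(dT)
    regime = "ordered cells" if ra > RA_CRITICAL else "the Boltzmann regime"
    print(f"dT = {dT:3.1f} K -> Ra = {ra:8.1f} ({regime})")

Below the threshold, heat moves by conduction alone (molecular collisions); above it, the ordered macroscopic flow abruptly appears.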

If all this sounds really complicated, well, you might say that any cook who has ever made a gravy, or a sauce béchamel, has observed this experiment in its gross aspect. It’s called: convection, the rolling motion you see in the pan just before it boils.

The take-away from Bénard’s experiment is that any ordered flow must function to increase the rate of entropy production of the system-plus-environment, pulling in sufficient resources and then dissipating them, thus satisfying the balance equation of the second law. As Heraclitus might put it, that which persists does so as the result of ceaseless change.

Schrödinger makes this explicit: he says that living systems must produce entropy (minimize potentials) at a rate sufficient to compensate for their own internal ordering (which can be measured as distance from equilibrium), thus preserving the system in its particular form while satisfying the second law’s balance equation.
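Formally (this is Prigogine’s later decomposition rather than Schrödinger’s own wording), the entropy change of an open system splits into an internal-production term and an exchange term:

dS/dt = dS_i/dt + dS_e/dt, with dS_i/dt ≥ 0 always.

A living system holding itself in an ordered steady state has dS/dt ≈ 0, which forces dS_e/dt = −dS_i/dt < 0: the system must continuously export entropy to its surroundings at the same rate it produces entropy internally. That is the “sufficient rate” Schrödinger has in mind.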

To put this matter very, very crudely, for a thing to “be what it is,” it has to lose all potential for being “anything else.” Entropy is that which dissipates the unneeded potentials. Thus, the more complex and ordered a system is, the more entropy it requires. And this is the reason why people say that living systems “must pay their entropy debt.”

In living systems, entropy — the dissipation of potentials — generally takes the form of heat radiated out to the external environment.

So, what does any of this have to do with “Shannon entropy”? Shannon entropy is a term for a quantity isolated in Shannon’s theory of information. I understand the term to play a role directly analogous to that of thermodynamic entropy. We might say it refers to the potential information that is not selected upon the successful communication of a message, “success” being defined as “the reduction of uncertainty in the receiver” that moves the living system from a before state to the after state best serving its biological interests. And analogously to the case of thermodynamic entropy, the more “decisions” the living system makes (which are what reduce uncertainty in the system), the more Shannon entropy there has to be. All the “paths” not taken are “dissipated”; that is, they have no force.
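For concreteness, Shannon’s quantity for a discrete distribution is H = −Σ p_i log2(p_i), measured in bits. Here is a toy sketch of the “reduction of uncertainty” reading (the four-alternative message scenario is my own invented illustration, not Shannon’s):

from math import log2

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Receiver's uncertainty over four equally likely alternatives: 2 bits.
before = [0.25, 0.25, 0.25, 0.25]
# After a (partially informative) message, one alternative dominates.
after = [0.85, 0.05, 0.05, 0.05]

print(f"H before message: {shannon_entropy(before):.3f} bits")  # 2.000
print(f"H after message:  {shannon_entropy(after):.3f} bits")   # 0.848
print(f"Uncertainty reduced: {shannon_entropy(before) - shannon_entropy(after):.3f} bits")

The message “dissipates” about 1.15 bits of uncertainty; the discarded alternatives are the paths not taken.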

At least that’s what the situation looks like to me. Call it a “hypothesis,” and then anyone who wants to falsify it can take a stab at it. I’m keenly interested in entertaining other views of the matter.

Sorry to be so long in replying, but I'm simply buried in work these days, and am so behind in answering my correspondence that I am on the brink of despair. Thanks for writing, WildTurkey.

1,792 posted on 02/06/2005 10:38:39 AM PST by betty boop
