Posted on 05/04/2005 10:48:30 AM PDT by betty boop
Autocatakinetics, Evolution, and the Law of Maximum Entropy Production
By Rod Swenson
An Excerpt:
Ecological science addresses the relations of living things to their environments, and the study of human ecology the particular case of humans. There is an opposing tradition built into the foundations of modern science of separating living things, and, in particular, humans from their environments. Beginning with Descartes' dualistic world view, this tradition found its way into biology by way of Kant, and evolutionary theory through Darwin, and manifests itself in two main postulates of incommensurability, the incommensurability between psychology and physics (the first postulate of incommensurability), and between biology and physics (the second postulate of incommensurability).
The idea of the incommensurability between living things and their environments gained what seemed strong scientific backing with Boltzmann's view of the second law of thermodynamics as a law of disorder according to which the transformation of disorder to order was said to be infinitely improbable. If this were true, and until very recently it has been taken to be so, then the whole of life and its evolution becomes one improbable event after another. The laws of physics, on this view, predict a world that should be becoming more disordered, while terrestrial evolution is characterized by active order production. The world, on this view, seemed to consist of two incommensurable, or opposing rivers, the river of physics which flowed down to disorder, and the river of biology, psychology, and culture, which flowed up, working, it seemed, to produce as much order as possible.
As a consequence of Boltzmann's view of the second law, evolutionary theorists, right up to present times, have held onto the belief that organic evolution was a negation of physical evolution, and that biology and culture work somehow to defy the laws of physics (Dennett, 1995). With its definition of evolution as an exclusively biological process, Darwinism separates both biology and culture from their universal, or ecological, contexts, and advertises the Cartesian postulates of incommensurability at its core, postulates that are inimical to the idea of ecological science. An ecological science, by definition, assumes contextualization or embeddedness, and as its first line of business wants to know what the nature of it is. This requires a universal, or general theory of evolution which can uncover and explicate the relationship of the two otherwise incommensurable rivers, and put the active ordering of biological, and cultural systems, of terrestrial evolution as a time-asymmetric process, back into the world.
The law of maximum entropy production, when coupled with the balance equation of the second law, and the general facts of autocatakinetics [see below], provides the nomological basis for such a theory, and shows why, rather than living in a world where order production is infinitely improbable, we live in and are products of a world, in effect, that can be expected to produce as much order as it can. It shows how the two otherwise incommensurable rivers, physics on the one hand, and biology, psychology, and culture on the other, are part of the same universal process and how the fecundity principle, and the intentional dynamics it entails, are special cases of an active, end-directed world opportunistically filling dynamical dimensions of space-time as a consequence of universal law. The epistemic dimension, the urgency towards existence in Leibniz's terms, characterizing the intentional dynamics of living things and expressed in the fecundity principle, and the process of evolution writ large as a single planetary process, is thus not only commensurable with first, or universal, principles, but a direct manifestation of them.
The view presented here thus provides a principled basis for putting living things, including humans, back in the world, and recognizing living things and their environments as single irreducible systems. It provides the basis for contextualizing the deep and difficult questions concerning the place of humans as both productions and producers of an active and dynamic process of terrestrial evolution, which as a consequence of the present globalization of culture is changing the face of the planet at a rate which seems to be without precedent over geological time. Of course, answers to questions such as these always lead to more questions, but such is the nature of the epistemic process we call life.
IMHO, that is similar to where Swenson wants to take us - to step outside of existing evolution theory and grasp the bigger picture.
Wolfram, in his "new kind of science", applies the concept (actually, cellular automata, which are very close) to the entire universe as a closed system. That too may be a good fit. The "self" in that case is the universe, which is organizing itself to increasing complexity. The universe however is also self-destructive (physical entropy), so the complexification has its limits.
Wolfram also looks at the theory applied to evolution and therein he sees a problem.
Roughly paraphrased, he asserts that self-organizing complexity occurs despite natural selection. That is why I've asserted long ago on this forum that if the Wolframs and Rochas are successful, the formulation of "RM + NS > Species" will be replaced with "autonomous biological self-organizing complexity".
There are other, perhaps better, formulations for complexity which might upset this apple cart - functional complexity, Kolmogorov complexity, physical complexity, metatransition, etc. But no matter where the investigation leads, it'll be very interesting to watch!
Or possibly it could be said that the evolution of the Universe can only take place, and be what it is, by being in the business of maximum entropy production, by depending on it.
Now when we deal with the concept of "entropy," we first of all must come to grips with the idea that we are grappling with "a greased hog": in the sense that "the goalpost keeps moving."
Entropy refers to a dynamic process in nature concerning the disposition of energy in a system, given its relations to its environment (ultimately to the Universe itself). High entropy means a very large number of possibilities at the level of microstates.
But given the first law of thermodynamics, energy is neither created nor destroyed: It is a given, fixed quantity. This tells us that not all of the possibilities made available by the second law could be executed "at once." If that could be done, not only would there be no evolution of nature or of any of its forms, but it would practically mean that the Universe could not have been born in the first place. [Assuming it was "born" in the first place, as most human traditions still hold to this day.]
To put this all into "baby talk": Life needs high entropy; for high entropy is what gives Life its possibilities. But in order for actual, realized possibilities to exist, all other (unselected) possibilities must be left latent, unmanifested at any given point in time.
High entropy potentially enables a virtually infinite set of possible outcomes at any given point in time. "Selection" reduces the set to an actual outcome (i.e., the low-entropy solution in a given time-step). Existential life is based on such outcomes. The rest of the set is held in abeyance for the moment (so to speak). Particular members -- some or all -- of the set of potentials may yet manifest in the future. But they are not "here" yet....
Anyhoot, "selection" refers to some kind of informed process. That is, it operates on information that can be utilized by living things toward the achievement of biological purposes/persistence.
This is an absolutely enormous field, the surface of which has only just begun to be scratched. Just to indicate some of the "covered problems," here's an excerpt from Avshalom Elitzur's C'est la Vie: A Physicist's Definition
* * * * * * *
When we open a lock with a key rather than breaking it, we unwittingly use a profound principle associated with Maxwell's demon paradox. Consider a closed box, full with gas in equilibrium and divided by a partition into two halves. A tiny demon within the box directs the gas molecules' motions by opening and closing a microscopic door in the partition. Eventually, hot gas forms on one half [of the partition] and cold gas on the other. Since the energy required for this operation is negligible, entropy seems to have been reduced without the energy investment that would normally increase entropy elsewhere, in defiance of the Second Law.
The paradox was resolved once it was pointed out that the demon needs information in order to perform its task. The acquisition of this information has its cost in energy, which increases entropy more than the entropy reduced by the demon's sorting.
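The energetic accounting behind this resolution is usually stated today as Landauer's principle: erasing one bit of the demon's record dissipates at least k_B·T·ln 2 of heat. A minimal sketch in Python; the function name and the one-bit-per-molecule scenario are my illustrative assumptions, not from Elitzur's text:

```python
import math

# Landauer's principle: erasing one bit of information dissipates at least
# k_B * T * ln(2) of energy as heat. The demon's record-keeping, not the
# frictionless door, is what pays the Second Law's bill.
k_B = 1.380649e-23   # Boltzmann constant, J/K

def min_erasure_cost(n_bits, T):
    """Minimum heat dissipated (J) to erase n_bits of memory at temperature T (K)."""
    return n_bits * k_B * T * math.log(2)

# Illustrative: a demon that has recorded one bit per molecule for a
# mole-sized sample of gas at room temperature.
print(min_erasure_cost(6.022e23, 300.0))
```

Even this lower bound, around 1.7 kJ for a mole of recorded bits, already swamps the "negligible" work of operating the door.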
From this principle, assigning energetic price to [the acquisition of] information, a complementary principle follows: The use of information can save energy. For example, a person opening a lock with a key utilizes the information embedded in the key, which makes the enormous force needed for [physically] breaking the lock unnecessary.
Biology abounds with such uses of information for saving energy. The tiger, for example, exerts enormous mechanical force to kill its prey. In contrast, the cobra can kill the same prey by merely spitting into its eyes. What is striking in the latter case is the apparent disproportion between the negligible force exerted by the predator and the fatal result suffered by the prey. The secret lies in the snake's choice of the appropriate neurotoxin (in this case, cobrotoxin) that so precisely matches the acetylcholine receptors at the ends of the prey's muscles. Similar precision is manifested by the choice of the vulnerable point in the prey's body (once the venom has penetrated the eye, it is the prey's own vascular system that carries it from there over the entire body!). In other words, the cobra makes a spectacular use of information about its prey's physiology and neurochemistry, thereby saving the energy that the tiger has to invest for the same purpose. The force exerted by the venom is literally infinitesimal (of molecular scale), but it is exerted with great precision. We can therefore formulate the benefit of information thus: with the aid of information, it is possible to perform a given work with less energy, applied at the right place and at the right time.
* * * * * * *
The article at the link is most provocative. Maybe the discussion of his ideas here will be of help to all the folks engaging such problems in one way or another.
Thanks for your great post, Ronzo!
May I propose a tiny qualification to this statement? Entropy always refers to transformations of energy -- typically energy's penchant for "spreading out" whenever not otherwise restricted (e.g., by boundary conditions or insufficient activation energies). It does not refer to the state of objects after such transformations take place (i.e., the disposition of objects on messy desks or in households for lack of "tidying up." It should be fairly obvious that such "messes" are the result of human failure to keep things neat, not the result of thermodynamic energetic processes).
The trickiest thing about entropy -- and I have been struggling with this concept in recent times -- is that it is a dynamic, real-time process that refers to the distribution of energy between systems (inorganic and organic) and their environment. Statistical analysis can "predict." But actual measurement is a retrospective activity: At the time we measure, the subject system has already "moved on." We can measure its passage; but by the time we do so, we capture only a "freeze-frame" of an evolution that is already in the past....
A good analogy is a hurricane passing through -- a self-organizing (that is, autocatakinetic) system that draws on both internal and external (environmental) energetic resources in order to exist. The phenomenon of the hurricane as an energetic system is an entirely different matter from the actual physical condition of the debris that it leaves in its wake.
It really is a mind-boggling problem, when the "physical evidence" you have (e.g., debris left in the wake of a past hurricane) that a phenomenon has taken place can tell you nothing about the phenomenon proper (i.e., the hurricane itself), in terms of its actual "disposition of energies." It seems that two categorical orders are involved here.
I need to find better examples to explicate this issue. Will certainly be looking for them.
Then again, maybe I'm "all wet" to begin with! :^)
Thanks for listening to my "rant," A-G!
I'm stumbling over your analysis though because I see the First Law of Thermodynamics being concerned with the transformation of energy whereas the Second Law is concerned with physical entropy.
Then again, I might be just too tired to think (LOL!) I'll literally sleep on it tonight and perhaps I'll be more clear headed tomorrow.
My take on the first law is that it is THE universal conservation principle: energy (and by implication its Lorentz-transformable complement, matter) can be neither created nor destroyed. All the "second-tier" physical laws that I can think of pay homage to this ("least action") principle.
The second law of thermodynamics is dependent on the first, and yet governs it in a certain sense, as the universal variational principle -- without which there could be no innovation, no evolution, no new things happening in the Universe, no universal expansion in space and time.... It governs in the sense that the first law would have "nothing to do" without it. The second law absolutely needs the first to "do its thing"; and the first would be meaningless as a law of nature without the second (for it would be "frozen in time," so to speak). Conservation is manifestly a different thing from action. Speculating here: We are not speaking of different categories, just different modalities....
It seems the first and second laws are themselves "complementarities" of the highest order....
As the basis for this supposition, I go back to the insights of Heraclitus and (more recently) to Leibniz, who both recognized that for something to persist as what it is it must undergo a ceaseless process of change -- the sort of thing made apparent with the recognition that the human body replaces all its cells every seven years or so, and yet its form is undisturbed by this process and persists as it is all the same.
But frankly, I'm struggling with these ideas, too, A-G. Maybe if we put our heads together we can "figure this 'stuff' out." :^) With the help of our friends....
Thank you so much for sharing your thoughts with me on this fascinating (and difficult) question, dear Alamo-Girl!
I do not see that this follows at all. By simply adding sufficient redundancy, replication error can be reduced to any desired level of insignificance. In any case, I don't see how even perfectly exact replication is contrary to the second law, so long as the free energy is expended in the process.
It's kind of amusing to think of the molecules of me (or anyone), right this instant, in the same fashion. Sure, they're organized now, but in 1,000,000 years what kind of order will they have? No perceptible order I'd bet. Entropy-wise I just don't see that much difference between a snowflake's "self" and my own.
The point is that one cannot get zero error; just arbitrarily small. From an information theory point of view, one still has the various bounds (Griesmer, for example) on how much redundancy is needed to correct errors.
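The "arbitrarily small but never zero" behavior can be seen with the simplest possible redundancy scheme, a repetition code decoded by majority vote. This is a sketch of my own, not one of the bounds cited above (Griesmer's bound concerns block-code lengths, not this toy example):

```python
from math import comb

def majority_error(p, n):
    """Probability that majority vote over n independent copies decodes the
    wrong bit, when each copy is flipped with probability p (n odd)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Residual error shrinks rapidly as redundancy grows, but stays positive.
for n in (1, 3, 5, 9):
    print(n, majority_error(0.01, n))
```

With a 1% per-copy flip rate, nine copies already push the residual error below one in a million, yet no finite n drives it to zero, which is exactly the point.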
I probably get all astir with the second law of thermodynamics because it has been taken well beyond strict thermodynamics into philosophy, information theory and so on by lumping all of entropy - including uncertainty, disarray, and the like - into the law.
There is something to be learned from the effort, but I assert it is wrong to presume a compliance of such "unphysicals" to the physical laws.
The history of the concept of thermodynamics is helpful but not definitive in keeping a proper ordering of the knowledge base.
As an example, many appeal to the second law to substantiate a sense of an 'arrow of time' or timeline.
The First Law of thermodynamics is an exact consequence of the laws of mechanics - classical or quantum. The Fluctuation Theorem shows that the Second Law of Thermodynamics is also an exact consequence of the laws of mechanics except that it is only valid in the large system or long time limit.
Wikipedia: Fluctuation Theorem
This theorem (FT) gives a mathematical expression for the probability ratio that time-averaged irreversible entropy production takes on a value, A, to the opposite value, −A, in systems away from equilibrium. In other words, for a finite non-equilibrium system in a finite time, the FT gives the probability that entropy will flow in a direction opposite to that dictated by the second law of thermodynamics.
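The ratio the FT quotes is an exponential, P(−A)/P(+A) = exp(−A·t), so "anti-second-law" fluctuations die off exponentially with averaging time (and system size). A quick numeric illustration, with units and values chosen purely for demonstration:

```python
import math

def violation_probability_ratio(A, t):
    """Fluctuation theorem ratio P(entropy production = -A) / P(= +A)
    = exp(-A * t): how much less likely a 'backwards' fluctuation of
    magnitude A is, over averaging time t (dimensionless units here)."""
    return math.exp(-A * t)

# The longer you average, the more hopeless a second-law violation becomes.
for t in (0.1, 1.0, 10.0):
    print(t, violation_probability_ratio(1.0, t))
```

This is why the second law is exact only "in the large system or long time limit," as noted above: for small systems over short times the ratio is not negligible.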
Thus, from my point of view, what the laws of thermodynamics can tell us with regard to the fecundity principle has its limits - i.e. classical physics, microscope to telescope.
I definitely share the tendency! Much of science today, and all of Darwinism, is still on Newton's "classical model."... For instance, did you know that NASA's "official position" on evolution today is classical Darwinism (i.e., RM+NS)? I only found out about this recently, and was quite surprised.
Entropy is not a process. It is a state variable. The entropy of a system depends only on the temperature, pressure, (and other things under some situations) of the system, not on any process involved in creating the system.
Thermodynamically speaking, entropy has the same status as temperature, energy, pressure, etc.
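One concrete way to see entropy as a state variable is the Sackur-Tetrode equation for a monatomic ideal gas: the entropy comes out as a function of N, V, and T alone, with no reference to any process. A sketch, assuming ideal-gas behavior; the helium numbers are illustrative:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s

def sackur_tetrode(N, V, T, m):
    """Entropy (J/K) of a monatomic ideal gas. Only the state variables
    N, V, T (plus particle mass m) appear -- no history, no process.
    Any path that ends at this state yields the same entropy."""
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal de Broglie wavelength
    return N * k_B * (math.log(V / (N * lam**3)) + 2.5)

# One mole of helium at roughly standard conditions (illustrative numbers)
S = sackur_tetrode(6.022e23, 0.0224, 273.15, 6.646e-27)
print(S)  # roughly 124 J/K
```

Whether the gas was compressed, heated, or mixed into this state makes no difference to the number that comes out, which is exactly what "state variable" means.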
You're correct here; but lots of nearly exact replications imply a few inexact. Error is never zero; just arbitrarily small but positive (as you noted.)
It's the same problem as having a perfect crystal. The (Gibbs or Helmholtz) free energy is minimized; neither is the total energy minimized nor the entropy maximized.
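The point that free energy, not energy or entropy alone, determines what happens can be seen with textbook round numbers for water freezing (an ordering, entropy-lowering process that is nonetheless spontaneous below 0 °C). The figures below are illustrative approximations, not precise data:

```python
def gibbs_change(dH, T, dS):
    """Gibbs free energy change: dG = dH - T*dS. A process is spontaneous
    at constant T and P when dG < 0, even if it lowers entropy (dS < 0)."""
    return dH - T * dS

# Freezing of water (round numbers): dH ~ -6000 J/mol, dS ~ -22 J/(mol*K).
# Below 0 C the enthalpy term wins (dG < 0, ice forms); above, it loses.
for T in (263.0, 283.0):
    print(T, gibbs_change(-6000.0, T, -22.0))
```

So a crystal can form, locally lowering entropy, so long as the heat released raises entropy in the surroundings by more, which is what the negative dG is bookkeeping.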
Of a truth, it would probably do more to advance science and reduce costs if NASA went head-to-head with China on a vital technology contest like it did way back when against Russia to get a man on the moon.
Is it not true that "state variables" occur as a moving succession of events? If so, that succession looks a lot to me like a process. But then I am sure I am not as technically proficient in the understanding and use of scientific terminology as you are, Doc.
BTW, if entropy cannot be understood as a succession of events describing the moving state of a system, how would you describe it to a layman like me?
Thanks so much for writing, Doc.
Now there's a really interesting idea, Alamo-Girl! Put the "competition" back into science...what an idea!
On the other hand, I suppose we could just wait for American science to be overtaken altogether by the "international rat pack" of next-generation physicists now collaboratively forming -- from Israel, Central Europe, India, China, and Russia -- who will probably wind up eating our lunch for us if present trends hold. (So to speak.) IMO FWIW.
I confess that I do not share your difficulty with the concept, in some measure, perhaps, because I view it mathematically.
Others have discussed entropy as a philosophical concept, and Dr. Stochastic referred to it as a thermodynamic state variable (which it is). However, I prefer to view it from its original derivation in Statistical Mechanics. I always found Thermodynamics hard but Stat Mech easy.
You are indeed correct, that the first law of thermodynamics states that energy is conserved. Therefore, the total heat and work are conserved quantities.
The second law, again you are correct, addresses the distribution of energy.
From its original, mathematical derivation, entropy is indeed a measure of how energy is distributed. It is (up to Boltzmann's constant) the natural log of the function omega. Omega is called the density of states function, and gives the total number of quantum states available to the system.
It is also called the partition function, because it defines, essentially, how the energy is partitioned among the various quantum states. In that sense it is usually considered as a distribution function. Indeed, in gas and plasma physics, the derivation is reversed. Assuming a maximum entropy state (i.e. equilibrium), and given a temperature and pressure, you can then calculate the partition of energy among independent, free particles (which therefore do not have discrete quantum states per se). The result is the Maxwellian distribution, which is the three-dimensional velocity distribution, or the Boltzmann distribution, which is the one-dimensional energy distribution. These distributions are equivalent to the partition function in that they give the distribution of energy among the population of particles. It is a fundamental concept in physics. Systems that are defined as "thermal" are those for which the particle distribution (in energy or velocity) follows the Maxwellian. It is the basis for the thermonuclear bomb and for the definition of temperature.
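The Maxwellian mentioned above can be written down and sanity-checked numerically. A sketch; the choice of nitrogen-molecule mass, temperature, and integration step are my illustrative assumptions:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def maxwell_speed_pdf(v, m, T):
    """Maxwell-Boltzmann speed distribution f(v) for particles of mass m (kg)
    at temperature T (K); as a probability density it integrates to 1
    over v in [0, infinity)."""
    a = m / (2 * k_B * T)
    return 4 * math.pi * (a / math.pi) ** 1.5 * v * v * math.exp(-a * v * v)

# Check normalization numerically for N2 molecules at room temperature:
# sum f(v) dv over 0..4000 m/s should come out very close to 1.
m, T, dv = 4.65e-26, 300.0, 2.0
total = sum(maxwell_speed_pdf(i * dv, m, T) * dv for i in range(2000))
print(round(total, 3))  # close to 1.0
```

That the density integrates to 1 regardless of m and T is the "distribution of energy among the population" property: only the shape, not the total, shifts with temperature.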
That would be like saying a person's name is a process. A name may change over time (as may weight or volume) but mass isn't a process; it's a variable.
The succession of changes would be a process, as you mention. You mustn't confuse the object with changes in the object.
Heat is a variable; heating is a process. It's important to get these things straight.