Free Republic
Browse · Search
News/Activism
Topics · Post Article

To: b_sharp
When I learned thermodynamics in an engineering class, I was simultaneously learning statistical physics in a physics class, and more -- binomial distributions and statistics in a mathematics class. The three overlapped!

The three laws of thermodynamics were taught in both the engineering and physics classes. If you first approach thermodynamics along the classical engineering path -- rediscovering how to bore out cannons efficiently -- I can see why you'd have the (false) complaint of conflation that you do. On that path there's no sense of how the heats -- enthalpic, latent, and transferred -- arise from the more general fundamentals of the mathematics and physical models involved.

Just to note that, like you, I was always bothered by the laws of thermo's disrespect for local phenomena -- how they impose perfect gases, perfect diffusion, perfect mixing, so as to make use of the classical thermodynamic distributions -- Maxwell-Boltzmann, etc.

So I try to work from the old simple-physics mindset: make simple analogies -- such as the fifty pennies -- and use them to provide a mental model for considering aspects of the problem.

Here the fifty unsequenced pennies are being used to show how the laws of *information* thermodynamics develop, and what entropy and order mean. I could *presto* change it into a heat problem by saying that if heads, the penny is in energy state E1, and if tails, E2. There is a complete parallelism to classical thermo in that regard.
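The penny analogy is easy to put numbers on. A minimal sketch (Python, purely illustrative; the E1/E2 grouping follows the post): for 50 pennies, the number of microstates with exactly k heads is the binomial coefficient C(50, k), and a Boltzmann-style entropy S = ln W is largest at the even 25/25 split -- the "disordered" macrostate.

```python
from math import comb, log

N = 50  # fifty pennies, each heads (energy state E1) or tails (E2)

# Number of microstates W(k) with exactly k heads, and entropy S = ln W.
# The all-heads "ordered" macrostate (k=0 tails) has W=1, so S=0;
# the even split has by far the most microstates.
for k in (0, 10, 25):
    W = comb(N, k)
    print(f"k={k:2d} heads: W = {W}  S = ln W = {log(W):.2f}")
```

The same count works whether you read heads/tails as coin faces or as occupation of the two energy states, which is the parallelism the post describes.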

795 posted on 12/30/2005 6:07:42 AM PST by bvw
[ Post Reply | Private Reply | To 741 | View Replies ]


To: bvw
Your math is beyond mine, I suspect; I have a simple undergrad CS degree.

What bothers me about this article is the author's misapplication of the statistical-mechanics definition of entropy to sidestep the biochemical use of energy to produce work. His understanding of evolution and biology is quite poor, as evidenced by his use of a slightly modified tornado-in-a-junkyard analogy.

If I may, I will use an analogy to explain my problem with Sewell's article. If a human takes a handful of superballs and tosses them into a room, the superballs will eventually settle into some arrangement (random or otherwise) on the floor. The probability of those balls arranging into a recognizable order (pattern) is infinitesimally small, because any arrangement of balls represents one possible microstate of the macrostate (a room containing a specific number of balls), and there are many more non-ordered microstates than ordered ones. This is pretty much as Sewell explained it, but easier to visualize.

However, this is a rather poor analogy, because the "order" we observe in the room is based on our human concept of order: does the arrangement create a figure, letter, geometric pattern, or something else we recognize? Statistical mechanics is not concerned with this kind of order but with the number of microstates possible given a specific macrostate, where disorder is a description of our uncertainty about the positions of the molecules (balls). The higher the uncertainty, the higher the disorder. This uncertainty can be quantified; the human concept of order cannot. That is the difference between human disorder and SM disorder.
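That uncertainty really can be quantified. A small sketch with assumed numbers (100 floor positions and 10 balls are my own illustrative choices, not from the thread): any one recognizable pattern is a single microstate, so its probability is 1/W, while the SM "disorder" is the log of the total microstate count.

```python
from math import comb, log

cells = 100  # assumed grid of floor positions (illustrative)
balls = 10   # number of superballs tossed

# Total microstates: ways to place 10 indistinguishable balls in 100 cells.
W_total = comb(cells, balls)

# A particular recognizable pattern (a letter, a circle, ...) picks out
# just one of those microstates, hence its tiny probability.
p_one_pattern = 1 / W_total
print(f"W = {W_total}, P(one particular pattern) = {p_one_pattern:.3e}")

# SM disorder as quantified uncertainty over microstates: S = ln W.
print(f"S = ln W = {log(W_total):.2f}")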

What is missing in all this is the dispersion of energy captured by the classical thermodynamics definition of entropy. On that definition, entropy measures the amount of energy dispersed from a thermodynamic system to its environment.
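The classical (Clausius) definition, dS = dQ_rev / T, makes the dispersal point concrete. A sketch with assumed numbers (1000 J, 500 K, 300 K are my own illustrative values): heat leaving a hot body and dispersing into a cooler environment raises total entropy, because the environment's gain at low T outweighs the hot body's loss at high T.

```python
# Clausius entropy: dS = dQ_rev / T. Illustrative numbers only:
# 1000 J dispersed from a body at 500 K into surroundings at 300 K.
Q = 1000.0      # joules dispersed
T_hot = 500.0   # kelvin, source
T_cold = 300.0  # kelvin, environment

dS_hot = -Q / T_hot    # hot body loses entropy
dS_env = +Q / T_cold   # cooler environment gains more
dS_total = dS_hot + dS_env
print(f"dS_total = {dS_total:.2f} J/K")  # positive: dispersal raises entropy
```

The sign of dS_total is the whole content of the second law here: energy dispersing "downhill" in temperature always yields a net entropy increase.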

This analogy ignores many things: the potential energy imparted to the handful of balls by the contraction of the tosser's arm muscles; the entropy increase as the balls shed heat gained through air friction; the kinetic energy imparted to the floor as the balls strike it; the entropy increase as some of that kinetic energy is dissipated into the environment; the increase of entropy as the human muscles release energy in the form of heat; the entropy increase in the form of waste material from the cells; the entropy increase when sugars are formed in the body; and so on, all the way back to the dissipation of energy (entropy) from the Sun.

Sewell's entire article was an attempt to show that biological organisms cannot develop on their own without violating the 2LoT. What he forgets is that this dissipated energy can be used for work by other systems. Thermodynamic systems can't be considered in isolation; the environment that contains a system can be treated as a system in its own turn.
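The system-plus-environment bookkeeping can be sketched in a few lines (the numbers are my own placeholders, not measurements): a local entropy decrease -- an organism building order -- is perfectly legal as long as the entropy exported to the environment is larger, so the total still goes up.

```python
# Open-system entropy bookkeeping (illustrative numbers only).
dS_system = -2.0       # J/K: local ordering, e.g. a growing organism
dS_environment = +5.0  # J/K: heat and waste dispersed to surroundings

dS_total = dS_system + dS_environment
assert dS_total >= 0   # the 2LoT constrains system + environment together
print(f"dS_total = {dS_total} J/K")  # positive despite local ordering
```

The second law says nothing about the sign of dS_system alone, which is exactly the loophole Sewell's argument skips over.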

There is no violation of the 2LoT when energy from the Sun (an increase in entropy) heats up water and causes evaporation (more entropy). There is no violation of the 2LoT when plants receive energy from the Sun, which overcomes the activation energy for forming the more complex molecules (more entropy) of carbohydrates, storing the majority of that energy in those molecules. The 2LoT is not violated when a human consumes the carbohydrates and breaks them into simpler carbohydrate molecules (more entropy created), which in turn are used to flex muscles, repair cells, create new cells, and even enlarge the genome and create new genes, all while dissipating some energy at each stage (increasing entropy) and storing the rest. Eventually, when biological organisms die, energy intake ceases and the rest of the stored energy dissipates, increasing entropy.

851 posted on 12/30/2005 10:15:10 AM PST by b_sharp (Science adjusts theories to fit evidence, creationism distorts evidence to fit the Bible.)
[ Post Reply | Private Reply | To 795 | View Replies ]


