To: Doctor Stochastic
No. That's not what I said. I said that the configuration C(t+1) has the same entropy whether it is obtained from C(t) by a random or a non-random process.
If it is a closed system and a random process, it is unlikely to have the same entropy: it will likely have more entropy, and it will never have less.
Sewell adds that if it is an open system: "According to these equations, the thermal order in an open system can decrease in two different ways -- it can be converted to disorder, or it can be exported through the boundary. It can increase in only one way: by importation through the boundary."
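The closed-system claim can be illustrated with a toy two-box (Ehrenfest urn) simulation. This is my own minimal sketch, not Sewell's formalism: N particles, each step one randomly chosen particle hops to the other box, and the macrostate entropy is the Shannon entropy of the occupation fraction.

```python
import math
import random

def macrostate_entropy(k, n):
    """Shannon entropy (bits) of a two-box macrostate with k of n particles in box A."""
    p = k / n
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

random.seed(0)
N = 1000
k = N  # all particles start in box A: the lowest-entropy macrostate
start = macrostate_entropy(k, N)
for _ in range(10_000):
    # move one randomly chosen particle to the other box
    if random.random() < k / N:
        k -= 1
    else:
        k += 1
end = macrostate_entropy(k, N)
print(start, end)
```

Starting from zero entropy, the random process drives the entropy up toward its maximum of 1 bit and keeps it there, up to small fluctuations; it essentially never returns to the low-entropy start.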
To: FreedomProtector
"If it is a closed system and a random process, it is unlikely to have the same entropy: it will likely have more entropy, and it will never have less."

No. Entropy is a state function. Random, non-random, deterministic, chaotic, quantum, spooky, etc. -- all processes lead to the same entropy for the same state. This is what Sewell seems to be missing in his appendix.
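The state-function point can be sketched in a few lines. This is my own illustration (the symbol-count entropy below stands in for any state function): two different processes, one random and one deterministic, that end in the same configuration necessarily yield the same entropy, because the entropy is computed from the configuration alone, with no reference to its history.

```python
import math
import random
from collections import Counter

def config_entropy(config):
    """Shannon entropy (bits) of the symbol distribution of a configuration.
    Depends only on the configuration itself, not on how it was produced."""
    n = len(config)
    return -sum((c / n) * math.log2(c / n) for c in Counter(config).values())

state = list("AABB")

# Reach the same configuration by two different processes:
random.seed(1)
by_random = sorted(random.sample(state, len(state)))  # random shuffle, then sort
by_deterministic = sorted(state)                      # purely deterministic

assert by_random == by_deterministic  # same final state
print(config_entropy(by_random) == config_entropy(by_deterministic))  # True
```

The equality is trivial, and that is exactly the point: once the state is fixed, the entropy is fixed, whatever process produced it.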
59 posted on 10/24/2006 3:15:14 PM PDT by Doctor Stochastic
(Vegetative = chaotic is the character of the moderns. - Friedrich Schlegel)