Concerning that issue, the phrasing is correct for thermodynamic entropy:
The American Heritage Dictionary gives as the first definition of entropy, "For a closed system, the quantitative measure of the amount of thermal energy not available to do work." So it's a negative kind of quantity, the opposite of available energy.
When speaking of biological systems on the forum, the term "entropy" is a stumbling block because there are several types and many people are only familiar with the thermodynamic kind.
The above article discusses thermodynamic entropy and logical entropy and mentions others. Thermodynamic entropy is related to the 2nd law. But in biological systems, thermodynamic entropy is directly related to logical entropy. And without considering that relationship, it may seem like biological life violates the 2nd Law of Thermodynamics ergo evolution is impossible.
Concerning information theory and molecular biology, there are three types of entropy involved:
Thermodynamic Entropy: Clausius/Boltzmann statistical-mechanical entropy.
Logical Entropy: Shannon probability-distribution entropy.
Algorithmic Entropy: Kolmogorov-Solomonoff-Chaitin sequence/algorithmic complexity.
Because the term "entropy" is confusing in this context, we usually call it "uncertainty". As you can see from the above link, the difference reduces to the unit of measure: bits vs. joules per kelvin. See also The Evolution of Carnot's Principle.
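To make the "bits vs. joules per kelvin" point concrete, here is a minimal sketch (my own illustration, not from the linked article) of Shannon's logical entropy computed over a symbol sequence, with Boltzmann's constant doing the unit conversion. The function name and the toy DNA string are just examples I made up:

```python
import math
from collections import Counter

def shannon_entropy_bits(seq):
    """Shannon (logical) entropy of a symbol sequence, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Boltzmann's constant is the bridge between the two units:
# S (J/K) = k_B * ln(2) * H (bits)
K_B = 1.380649e-23  # joules per kelvin

# Uniform distribution over the 4 nucleotides -> 2.0 bits per symbol.
H = shannon_entropy_bits("ACGTACGTACGTACGT")  # -> 2.0
S = H * K_B * math.log(2)  # the same uncertainty expressed in J/K per symbol
```

The formula is identical in both cases; only the logarithm base (and hence the unit) differs, which is exactly why the thermodynamic and logical quantities can be related at all.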
AFAIK, Adami is the only notable investigator in the field who is attempting to also bring in algorithmic entropy. We happen to have an expert in algorithmic information theory on the forum, Freeper tortoise. He briefly reviewed Adami's approach in Information Theory and Molecular Biology and was not particularly impressed with his math, calling it "immature" (post 582).
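For readers unfamiliar with the third type: Kolmogorov complexity is uncomputable, but a standard classroom trick (not Adami's method) is to use a real compressor's output length as a crude upper bound on a sequence's algorithmic entropy. A highly ordered sequence compresses well; an incompressible one does not:

```python
import random
import zlib

def compressed_length(data: bytes) -> int:
    """zlib-compressed length: a rough upper bound on Kolmogorov
    (algorithmic) complexity, up to the fixed size of the decompressor."""
    return len(zlib.compress(data, 9))

repetitive = b"ACGT" * 250          # 1000 bytes, highly ordered: low algorithmic entropy
random.seed(0)
incompressible = bytes(random.randrange(256) for _ in range(1000))  # 1000 noisy bytes

# The ordered sequence compresses to a tiny fraction of the random one's size.
assert compressed_length(repetitive) < compressed_length(incompressible)
```

This also shows why the three entropies can diverge: the repetitive and random strings can have similar Shannon statistics at the single-byte level while differing enormously in algorithmic complexity.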
Seems to me it would only complicate this particular discussion to raise a third kind of entropy, when we can put the issue to bed with the first two.
If we don't raise logical entropy, and leave thermodynamic entropy to fend for itself in the face of biological systems, then the gain of information content (incl. autonomy and semiosis) in biological systems, functional complexity, "energy storage" and the like will continue to be raised as evidence of biological systems violating the 2nd Law of Thermodynamics.
Now I see your complete problem. If you want to discuss science, you first need to learn science, and NOT from philosophers who get their scientific definitions from a common dictionary.
Thanks for the 2nd Law link. Now I know where 2beathomemom got her garbage.
But you have clouded the issue and also supported their side by using false science. Unless you are prepared to use REAL science, I don't believe discussing entropy with you will be productive.
Here is why creationists have picked up on "entropy". Basically they are trying to play upon the ignorance of the population.
"In all of physics, there is perhaps no topic more underrated and misunderstood than entropy. The behavior of large collections of particles, such as the universe, a grain of sand, or a tuna salad sandwich, is dictated by two universal laws: one involving energy, the other involving entropy. And yet, while energy is described in great detail throughout any introductory physics textbook, entropy is relegated to about two or three pages, and is usually badly described."