negentropy: Poor Terminology! The term negentropy was defined by Brillouin (L. Brillouin, Science and Information Theory, second edition, Academic Press, Inc., New York, 1962, page 116) as `negative entropy', N = -S. Supposedly living creatures feed on `negentropy' from the sun. However, entropy cannot be negative, so `negentropy' N = -S is always negative or zero. The easiest way to see this is to consider the statistical-mechanics (Boltzmann) form of the entropy equation:

   S = - kB sum_{i=1}^{Omega} Pi ln(Pi)

where kB is Boltzmann's constant, Omega is the number of microstates of the system and Pi is the probability of microstate i. Since every Pi lies between 0 and 1, ln(Pi) <= 0, so each term -Pi ln(Pi) is non-negative; unless one wishes to consider imaginary probabilities (!) it can be proven that S is positive or zero. Rather than saying `negentropy' or `negative entropy', it is clearer to note that when a system dissipates energy to its surroundings, its entropy decreases. So it is better to refer to -delta S (a negative change in entropy).
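The non-negativity claim is easy to check numerically. Here is a minimal sketch (the function name and example distributions are illustrative, not from the original text):

```python
import math

def boltzmann_entropy(probs, k_B=1.380649e-23):
    """S = -k_B * sum(P_i * ln(P_i)) over microstate probabilities.

    Terms with P_i == 0 contribute 0 (the limit of p*ln(p) as p -> 0).
    """
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# Every term -P_i ln(P_i) is >= 0 when 0 <= P_i <= 1, so S >= 0:
uniform = [0.25] * 4        # four equally likely microstates -> S > 0
certain = [1.0, 0.0, 0.0]   # one microstate is certain      -> S = 0
print(boltzmann_entropy(uniform) >= 0.0)
print(boltzmann_entropy(certain) == 0.0)
```

No choice of real probabilities can make S negative, which is why a positive stock of `negentropy' is impossible.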
Recommendation: replace this concept with `decrease in entropy'.
Examples:
In "Maxwell's demon: Slamming the door" (Nature 417: 903) John Maddox says "Maxwell's demon ... must be a device for creating negative entropy". The Demon is required to create decreases in entropy, not the impossible `negentropy'. (Note: On 2002 July 6 Nature rejected a correspondence letter pointing out this error.)
In his webpage on uncertainty, entropy and information, Schneider clears up the confusion between information entropy and thermodynamic entropy as they apply to molecular biology.
Would y'all entertain the alternative phrasing Schneider suggests, i.e. "decrease in entropy", to mean gain of information content in molecular biology?
Biocast: Channel capacity and noise [Shannon] available to a biological system to facilitate a gain of information content, i.e. the broadcast or incoming message.
Biolearn: Gain of information content in a molecular machine, in Shannon bits; a decrease in information entropy, with an increase in thermodynamic entropy in the local surroundings.
Bioknow: Accrued information content in a biological organism, e.g. DNA.
Biolanguage: The encoding/decoding in Biocomm, i.e. semiosis.
Biothought: The property of complexity (Kolmogorov, self-organizing, physical, irreducible, functional, specified or whatever we decide) which exists in the Biocomm.
Biostructure: The biological system in which Biocomm exists, i.e. autonomy.
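The Biolearn notion, gain of information content as a decrease in information entropy, can be sketched in the style of Schneider's sequence-logo calculations. The before/after base frequencies below are invented for illustration; only the formula R = H_before - H_after comes from the discussion above:

```python
import math

def shannon_bits(probs):
    """Shannon uncertainty H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before selection, a DNA position could be any of the 4 bases:
h_before = shannon_bits([0.25, 0.25, 0.25, 0.25])  # 2 bits of uncertainty

# After selection, suppose one base dominates (hypothetical frequencies):
h_after = shannon_bits([0.91, 0.03, 0.03, 0.03])

# Information gained = decrease in information entropy:
r_gain = h_before - h_after
print(f"information gain: {r_gain:.2f} bits")
```

Note the bookkeeping: the Shannon (information) entropy of the sequence decreases, while the second law is satisfied by a compensating increase in thermodynamic entropy in the local surroundings.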
All these definitions of entropy are identical; they are just distilled rules from an obscure generalization applied at different levels of the system. The transaction-theoretic versions (e.g. as would apply to thermodynamics) are pretty esoteric. Algorithmic information theory unifies all the various notions of "entropy" into a single concept that is probably far more confusing than either the static or transactional version in isolation (though very elegant in its own way).
One can see, as in your example, that they are all related. The conceptually hard part is figuring out the general mathematical description of a system that will express something that looks like thermodynamics -- ask yourself why a system would express this behavior in the hypothetical. The answer to that question has a good-sized lightbulb attached to it.