One gathers you take great pleasure in abstraction, RightWhale! But particularity -- living organisms participating in/with/through their environments, and even human individuals doing the same (how shocking!) -- is not abstract. It is real.
To quote Christoph Adami, "Entropy can never be a measure of complexity. Measuring correlations within a sequence, like Kolmogorov and Chaitin (and Lempel-Ziv, and many others) is not going to reveal how that sequence is correlated to the environment within which it is to be interpreted. Information is entropy 'shared with the world,' and the amount of information a sequence shares with its world represents its complexity."
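Adami's distinction can be made concrete. The quantity he dismisses is the entropy (or compressibility) of the sequence alone; the quantity he endorses is the mutual information between sequence and environment, I(X;E) = H(X) + H(E) - H(X,E). A toy sketch (my own illustration, not Adami's code -- the "environment" here is just a second symbol stream):

```python
from collections import Counter
from math import log2

def entropy(seq):
    """Shannon entropy (bits/symbol) of a sequence's symbol distribution."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Two sequences with identical internal statistics (same entropy)...
env      = "ABABABAB"
matched  = "ABABABAB"   # perfectly correlated with the "environment"
shuffled = "AABBABBA"   # same symbol frequencies, but uncorrelated

print(entropy(matched), entropy(shuffled))   # identical: 1.0 bit each
print(mutual_information(matched, env))      # 1.0 bit shared with the world
print(mutual_information(shuffled, env))     # 0.0 bits shared
```

Any sequence-internal measure (Shannon, Kolmogorov, Lempel-Ziv) gives the same answer for both sequences; only the mutual information with the environment separates them -- which is Adami's point.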
And probably also its "symmetry."
It seems to me that living beings are not "closed" thermodynamic systems at all. And neither are any of their "constituting parts," down to the level of cells, DNA, etc. Why would you want to treat them as such? They are conscious information processors/users to some degree. And the one species that preeminently thinks, that is capable of self-reflection, is also considered the most complex. I refer to Homo sapiens sapiens, of course.
Living organisms are open systems.... In other words, they possess degrees of freedom; they choose and process information; they display emergent, self-organizing and self-regulatory abilities; thus they are neither closed systems, nor fully pre-determined and pre-specified "machines" -- with all their "engineering" done from the "outside." Thus I strongly doubt it is appropriate to apply closed-system analogies to them. FWIW
Complexity is the opposite of order, but synonymous with organization.
I've seen entropy, in this case Shannon information entropy, used to determine whether or not the given time scales (among other constraints) are sufficient to allow for autonomous self-organizing complexity of life from non-life (abiogenesis). Hubert Yockey's determination is that they are not sufficient. One counterargument is that he should have used Kolmogorov complexity/Solomonoff induction.
However, many of the articles I've read prefer Shannon entropy for the biological question proposed by Yockey. The second link above is heavy on details and extremely biased against Dembski's additional definition of complexity.
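For what the Kolmogorov-flavored counterargument amounts to in practice: true Kolmogorov complexity is uncomputable, so it's usually approximated with a real compressor (Lempel-Ziv-style, as in zlib). A toy sketch of that proxy -- my own illustration, with made-up sequences, nothing from Yockey's actual data:

```python
import random
import zlib

def compression_ratio(s: str) -> float:
    """Compressed size / raw size: a crude, computable stand-in for
    Kolmogorov complexity (which itself is uncomputable)."""
    raw = s.encode("ascii")
    return len(zlib.compress(raw, 9)) / len(raw)

# A highly ordered sequence compresses well (low complexity estimate);
# a random-looking one resists compression (high complexity estimate).
random.seed(0)
ordered = "AB" * 500                                      # pure repetition
noisy = "".join(random.choice("ACGT") for _ in range(1000))  # pseudo-random

print(compression_ratio(ordered))  # small ratio: mostly redundancy
print(compression_ratio(noisy))    # much larger ratio: little redundancy
```

Note that this measure, like Shannon entropy, is still computed from the sequence alone -- by the Adami quote above, neither tells you how the sequence is correlated with its environment.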
BTW, the first link is particularly handy for keeping all the terms defined wrt complexity, etc. Here's more on the last item in the definition list, functional complexity wrt biological systems.