The Second Law of Thermodynamics is usually stated this way: the entropy of a closed system can never decrease.
There are three different kinds of entropy that invariably get confused when we speak of the 2nd Law of Thermodynamics and biological systems. They are:
Classical thermodynamic (Clausius-Boltzmann) entropy
Shannon probability-distribution entropy
Kolmogorov-Solomonoff-Chaitin sequence/algorithmic complexity
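The difference between Shannon uncertainty and Kolmogorov algorithmic complexity can be sketched numerically: two strings with (nearly) identical symbol statistics, and hence nearly identical Shannon entropy, can differ enormously in algorithmic complexity. True Kolmogorov complexity is uncomputable, so the sketch below uses zlib-compressed length as a crude upper-bound proxy; the strings, seed, and proxy are illustrative choices, not from the text above.

```python
import math
import random
import zlib
from collections import Counter

def shannon_bits_per_symbol(s: str) -> float:
    """Shannon entropy H = -sum p_i * log2(p_i) of the symbol frequencies."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compressed_len(s: str) -> int:
    """zlib-compressed size in bytes: a rough upper bound on algorithmic complexity."""
    return len(zlib.compress(s.encode()))

periodic = "AB" * 500  # highly ordered: tiny algorithmic complexity
random.seed(0)
noisy = "".join(random.choice("AB") for _ in range(1000))  # essentially incompressible

# Symbol frequencies are (nearly) 50/50 in both strings, so the Shannon
# entropies are nearly equal, yet the compressed sizes differ sharply.
print(shannon_bits_per_symbol(periodic), shannon_bits_per_symbol(noisy))
print(compressed_len(periodic), compressed_len(noisy))
```

The point of the sketch: Shannon entropy sees only the frequency distribution, while algorithmic complexity sees the structure of the particular sequence.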
For clarity, Shannon probability-distribution entropy can be called uncertainty: H = -Σ pᵢ log₂ pᵢ, measured in bits.
The only difference between uncertainty and entropy for the microstates of a macromolecule is the unit of measure: bits versus joules per kelvin, respectively.
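The unit conversion just described can be made explicit: an uncertainty of H bits over microstates corresponds to a thermodynamic entropy of S = k_B · ln(2) · H in joules per kelvin, where k_B is the Boltzmann constant. A minimal sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact since the 2019 SI redefinition)

def entropy_joules_per_kelvin(h_bits: float) -> float:
    """Convert an uncertainty in bits into thermodynamic entropy units via S = k_B * ln(2) * H."""
    return K_B * math.log(2) * h_bits

# One bit of uncertainty about a microstate:
print(entropy_joules_per_kelvin(1.0))  # -> ~9.57e-24 J/K
```

The factor k_B · ln(2) is what the text means by "the units of measure" being the only difference between the two quantities.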
Thus, life does not violate the physical laws, but it is unique in successful communication: the gain of information content and, therefore, of functional complexity (or whatever brand of complexity you like).
This is the difference between life and non-life. It is also the difference between life and death. When the successful communications cease, the organism is dead. Where there is no successful communication, there is no life.
Disclaimer: all of this is referring to that which occurs in nature, not artificial intelligence, robots, etc.
A few more items relating to the history of entropy and the arrow of time:
The modern concept of the atom was first proposed by the British chemist and physicist John Dalton in 1808 and was based on his studies that showed that chemical elements enter into combinations based on fixed ratios of their weights. The existence of molecules as the smallest particles of a substance that can exist in the free (that is, gaseous) state and have the properties of any larger amount of the substance was first hypothesized by the Italian physicist and chemist Amedeo Avogadro in 1811, but did not find general acceptance until about 50 years later, when it also formed the basis of the kinetic theory of gases (see Avogadro's Law). Developed by Maxwell, the Austrian physicist Ludwig Boltzmann, and other physicists, kinetic theory applied the laws of mechanics and probability to the behavior of individual molecules, and drew statistical inferences about the properties of the gas as a whole.
A typical but important problem solved in this manner was the determination of the range of speeds of molecules in the gas, and from this the average kinetic energy of the molecules. The kinetic energy of a body, as a simple consequence of Newton's second law, is (1/2)mv², where m is the mass of the body and v its velocity. One of the achievements of kinetic theory was to show that temperature, the macroscopic thermodynamic property describing the system as a whole, was directly related to the average kinetic energy of the molecules. Another was the identification of the entropy of a system with the logarithm of the statistical probability of the energy distribution. This led to the demonstration that the state of thermodynamic equilibrium corresponding to that of highest probability is also the state of maximum entropy. Following the success in the case of gases, kinetic theory and statistical mechanics were subsequently applied to other systems, a process that is still continuing.
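The temperature/kinetic-energy relation described above can be checked numerically: for an ideal gas, the average of (1/2)mv² over the Maxwell-Boltzmann distribution equals (3/2)k_B·T. The sketch below samples each velocity component from a Gaussian with variance k_B·T/m and compares; the gas (helium), temperature, seed, and sample size are illustrative assumptions.

```python
import math
import random

K_B = 1.380649e-23   # Boltzmann constant, J/K
m = 6.6464731e-27    # mass of a helium atom, kg (illustrative choice of gas)
T = 300.0            # temperature, K (illustrative)

random.seed(42)
sigma = math.sqrt(K_B * T / m)  # std dev of each Maxwell-Boltzmann velocity component

n = 100_000
mean_ke = sum(
    0.5 * m * (random.gauss(0, sigma) ** 2
               + random.gauss(0, sigma) ** 2
               + random.gauss(0, sigma) ** 2)
    for _ in range(n)
) / n

# The sampled mean kinetic energy agrees with (3/2) k_B T to within sampling error.
print(mean_ke, 1.5 * K_B * T)
```

This is exactly the kind of statistical inference about the whole gas, drawn from the mechanics of individual molecules, that the passage credits to Maxwell and Boltzmann.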
Kinetic Theory of Gases: A Brief Review
Reversibility and Entropy (arrow of time)
Better just admit you were wrong.