I opened up my copy of Li & Vitanyi (the de facto bible of algorithmic information theory) to look up a reference for a paper I'm writing, and discovered that the entire last chapter is dedicated to mathematically unifying the various notions of entropy used in physics, chemistry, etc. with the definition used in algorithmic information theory; it also maps other scientific concepts onto their formal AIT analogs. It is a little out of date (1997), and there is some newer computational theory work it does not include, but it pretty much settles the question of how "entropy" as used in science and engineering maps onto the broader mathematical definitions.
It did not even occur to me to see if Li & Vitanyi addressed this topic; I assumed the book was pure math and I hardly ever open it any more. Anyway, if you want a properly rigorous overview of this topic, you'll find it in Chapter 8 of that book.
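To give a flavor of the unification (I'm quoting this from memory, so check Chapter 8 for the exact statement and constants): for a computable probability distribution P, the expected Kolmogorov complexity of an outcome matches the Shannon entropy of P up to an additive term that depends only on describing P itself, something like

    H(P) \le \sum_x P(x)\,K(x) \le H(P) + K(P) + O(1)

In words: once you pay the one-time cost of describing the distribution, coding each outcome by its shortest program is essentially optimal in the Shannon sense. That is the hinge on which the physical notions of entropy get attached to the algorithmic one.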
I also wanted to bring to y'all's attention Adami's article on Information Theory and Molecular Biology. His approach is not substantively different from Schneider's, but he identifies the Shannon-information measure with what he calls "physical complexity" and makes the distinction between Shannon entropy, thermodynamic entropy, and Shannon information very clear.
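To make the entropy-vs-information distinction concrete, here is a toy sketch in the spirit of Schneider's per-position information measure for binding sites (my own illustration, not code from either paper; the site strings are made up, and I'm omitting Schneider's small-sample correction):

    import math
    from collections import Counter

    # Hypothetical toy alignment: each row is one observed DNA binding site.
    sites = ["TATAAT",
             "TATGAT",
             "TAAAAT",
             "TATACT"]

    def shannon_entropy(column):
        """Shannon entropy H (in bits) of the symbol frequencies in one column."""
        n = len(column)
        return -sum((c / n) * math.log2(c / n) for c in Counter(column).values())

    H_MAX = math.log2(4)  # 2 bits: the entropy of a uniformly random DNA base
    for i, col in enumerate(zip(*sites)):
        h = shannon_entropy(col)
        # The *information* at a position is the reduction in uncertainty,
        # H_max - H(observed), not the entropy itself.
        print(f"position {i}: H = {h:.2f} bits, info = {H_MAX - h:.2f} bits")

The point the toy example makes is the same one Adami stresses: entropy measures the uncertainty that remains, information measures the uncertainty the data removed, and conflating the two is where much of the confusion in the literature comes from.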