Not true. Entropy in physics is nothing more than a special case of entropy in mathematics.
In fact, the last chapter of Li and Vitanyi (the de facto reference text for algorithmic information theory) is dedicated to showing how entropy and complexity in the physical sciences are derivable from the general mathematical version.
The relationship is not obvious in older Shannon information theory, which dealt primarily with zero-order information, but the broader field of modern algorithmic information theory (from Solomonoff onward) maps complex physical systems into the mathematics quite neatly.
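To make the "zero-order" limitation concrete, here is a small illustrative sketch (my own example, not from the texts mentioned above): zero-order Shannon entropy looks only at symbol frequencies, so a perfectly regular string and a shuffled string with the same symbol counts score identically, while a compressor, which gives a crude upper bound on algorithmic (Kolmogorov) complexity, easily tells them apart.

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Zero-order (symbol-frequency) entropy in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compressed_size(s: str) -> int:
    """Crude stand-in for algorithmic complexity: zlib output length."""
    return len(zlib.compress(s.encode()))

ordered = "ab" * 500                                       # highly structured
random.seed(0)
shuffled = "".join(random.sample(ordered, len(ordered)))   # same symbol counts

# Zero-order entropy sees only frequencies, so both strings look identical:
print(shannon_entropy(ordered), shannon_entropy(shuffled))  # 1.0 1.0

# A compressor exploits the regularity that frequencies alone cannot capture:
print(compressed_size(ordered) < compressed_size(shuffled))  # True
```

The point is only that frequency-based entropy and algorithmic complexity measure different things at the level of individual objects, which is part of why the mapping between them took the later, more general theory to make precise.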
Ah, thanks, this has always been a confusion of mine. Unfortunately I am not very skilled at maths at all. One question - is it safe to equate entropy with disorder?