I indicated that there's a distinction between noise and interference because it matters for anything more detailed than the most trivial model. Noise and interference are distinguishable; they have different characteristics and effects. Any effective treatment of the received signal to eliminate errors depends on recognizing whether errors are due to noise or to interference.
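To make that concrete, here is a minimal sketch (my own illustration, not anything from Gitt or Williams): additive white Gaussian noise scatters bit errors at random positions, while a narrowband interferer produces errors that recur with the interferer's period.

```python
import numpy as np

rng = np.random.default_rng(0)

# BPSK: bit 1 -> +1, bit 0 -> -1, hard decision at zero
bits = rng.integers(0, 2, 10_000)
tx = 2.0 * bits - 1.0

# Case 1: additive white Gaussian noise (errors land at random positions)
rx_noise = tx + rng.normal(0.0, 0.8, tx.size)
err_noise = (rx_noise > 0).astype(int) != bits

# Case 2: a narrowband sinusoidal interferer (errors cluster periodically)
t = np.arange(tx.size)
rx_interf = tx + 1.2 * np.sin(2 * np.pi * t / 50.0)
err_interf = (rx_interf > 0).astype(int) != bits

print("noise error rate:       ", err_noise.mean())
print("interference error rate:", err_interf.mean())
# The rates are the same order of magnitude, but the structure differs:
# the noise errors are memoryless, the interference errors recur
# roughly every 50 samples.
```

The raw error rates can be comparable; it is the pattern that tells you which remedy applies (coding for random errors, filtering for a structured interferer).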
This thread has not moved into the specifics of information theory, nor should it, since the issues being raised go to the more general question of whether Gitt (and Williams) are in the right ballpark with reference to Shannon.
For instance, Gitt claims that Shannon's theory must be expanded to include the meaning of the message, which he calls information; yet the meaning of the message is irrelevant to the model. The model is universal, and the message is a chance variable.
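One quick way to see it: Shannon entropy depends only on the probabilities of the symbols, never on what the symbols say. A minimal sketch of the standard definition, H = -sum(p * log2 p):

```python
from math import log2

def entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2(p))
    return -sum(p * log2(p) for p in probs if p > 0)

# Two sources with identical statistics but very different "meanings"
meaningful = {"HELP": 0.5, "FIRE": 0.25, "FLOOD": 0.25}
gibberish  = {"XQZT": 0.5, "JWWJ": 0.25, "QQXZ": 0.25}

print(entropy(meaningful.values()))  # 1.5 bits
print(entropy(gibberish.values()))   # 1.5 bits: meaning never enters the model
```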
Also, he doesn't seem to grasp that information (successful communication) is the reduction of uncertainty (Shannon entropy) in the receiver (or molecular machine) as it goes from a before state to an after state. It is the action, not the message.
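In Schneider's framing, that before/after difference is exactly what gets measured. A sketch of the arithmetic (the after-state frequencies are illustrative numbers, not real binding-site data):

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Before: the molecular machine is maximally uncertain which of the
# four bases occupies a position.
h_before = entropy([0.25, 0.25, 0.25, 0.25])   # 2.0 bits

# After: observed base frequencies at that position.
h_after = entropy([0.91, 0.03, 0.03, 0.03])

print(f"information gained: {h_before - h_after:.2f} bits")
```

The number that matters is the difference between the two states, not anything about the message's content.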
The first step to expanding on another's theory ought to be knowing what that theory "is." As I said before, if a person wants to complain about farmers, he shouldn't speak with his mouth full.
That said, I am glad oldmanreedy brought up Kolmogorov, Chaitin and Solomonoff. (Though it makes me miss tortoise all the more.)
If a person wants to opine about information (successful communication), he must deal with Shannon. More specifically, if he wants to apply Shannon's theory to molecular biology, he must deal with Yockey and Schneider et al.
But if a person wants to opine about the meaning per se of the message being communicated (complexity, semiosis, etc.), he must deal with Kolmogorov, Chaitin, Solomonoff, Wolfram, Eigen, von Neumann, Turing, et al. (see the sketch below).
Or with the philosophers and theologians, whichever level he is addressing.
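To give a taste of the Kolmogorov-Chaitin-Solomonoff question: it asks not how uncertain the receiver is about which message was selected, but how short the shortest description of the message itself is. Kolmogorov complexity is uncomputable, so this sketch uses compressed length as the standard crude upper bound:

```python
import os
import zlib

def description_upper_bound(data: bytes) -> int:
    # Kolmogorov complexity is uncomputable; compressed length is a
    # crude upper bound on it.
    return len(zlib.compress(data, 9))

patterned = b"AB" * 500       # 1000 bytes, but a very short description exists
random_   = os.urandom(1000)  # 1000 bytes, incompressible with high probability

print(description_upper_bound(patterned))  # small
print(description_upper_bound(random_))    # close to, or above, 1000
```

Both strings are the same length; only the descriptive-complexity view tells them apart, and that is the level at which questions about the message itself live.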