The theory is mathematics, plain and simple. The meaning of the message has no bearing on the communication of it. That is where the Shannon theory ends.
Meaning in the biological message belongs to complex systems theory, another subject altogether, bringing in issues such as self-organizing complexity, cellular automata, algorithmic complexity, Kolmogorov complexity, etc. Ditto for autonomy and semiosis.
The Shannon theory is a powerful argument in the intelligent design debate - indeed, in many theological and philosophical debates as well.
If the correspondent ignores it, minimizes it or mixes other issues into it, he is hurting his own argument.
Because the mathematical theory is universal, it is portable across many disciplines, and it is well established.
It is like a Caterpillar in these debates; why would anyone want to use it like a little red wagon?
When considering a book B, a computer program C, or the human genome (the totality of genes), we first discuss the following questions:
How many letters, numbers, and words make up the entire text?
How many single letters does the employed alphabet contain (e.g., a, b, c, ..., z, or G, C, A, T)?
How frequently do certain letters and words occur?
To answer these questions, it is immaterial whether we are dealing with actual meaningful text, with pure nonsense, or with random sequences of symbols or words. Such investigations are not concerned with the contents, but only with statistical aspects. These topics all belong to the first and lowest level of information, namely the level of statistics.
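As a minimal sketch of this statistical level (the sample string and names below are invented for illustration, not taken from the text), such a tally can be carried out mechanically, for instance in Python:

    from collections import Counter

    text = "GATTACAGATTACA"   # could equally be a book excerpt or pure nonsense
    counts = Counter(text)    # how often each individual symbol occurs
    total = sum(counts.values())

    for symbol, n in counts.most_common():
        # relative frequency of each symbol, with no regard to meaning
        print(f"{symbol}: {n} occurrences, frequency {n / total:.3f}")

The same tally applies unchanged to a meaningful sentence, a nonsense string, or a stretch of genome, which is precisely the point of the statistical level.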
As explained fully in appendix A1, Shannon's theory of information is suitable for describing the statistical aspects of information, e.g., those quantitative properties of languages which depend on frequencies. Nothing can be said about whether a given sequence of symbols is meaningful. The question of grammatical correctness is also completely excluded at this level. Conclusions:
Definition 1: According to Shannon's theory, any random sequence of symbols is regarded as information, without regard to its origin or whether it is meaningful or not.
Definition 2: The statistical information content of a sequence of symbols is a quantitative concept, measured in bits (binary digits).
According to Shannon's definition, the information content of a single message (which could be one symbol, one sign, one syllable, or a single word) is a measure of the probability of its being received correctly: the less probable a message, the greater its information content, measured as the negative binary logarithm of that probability. Since probabilities range from 0 to 1, this measure is never negative. The information content of a number of messages (signs, for example) is found by adding their individual information contents, as required by the condition of summability. An important property of information according to Shannon is:
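As a hedged illustration of Definition 2 (the uniform symbol probabilities below are assumed purely for the example; the general formula is equation 6 in appendix A1), the information content in bits can be computed as follows:

    import math

    def information_content(p):
        # Shannon information content, in bits, of a message with probability p
        return -math.log2(p)

    # Four equally probable symbols (e.g., G, C, A, T): each carries 2 bits.
    print(information_content(0.25))                      # 2.0 bits

    # For a sequence of symbols, the individual contents add up (summability).
    sequence_probs = [0.25, 0.25, 0.25, 0.25]             # e.g., the sequence "GATC"
    print(sum(information_content(p) for p in sequence_probs))   # 8.0 bits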
Theorem 4: A message which has been subject to interference or noise, in general comprises more information than an error-free message.
This theorem follows from the larger number of possible alternatives in a distorted message, and Shannon states that the information content of a message increases with the number of symbols (see equation 6 in appendix A1). It is obvious that the actual information content cannot be described in such terms at all, as the following example should make clear: when somebody uses many words to say practically nothing, that message is accorded a large information content because of the large number of letters used. If somebody else, who is really knowledgeable, concisely expresses the essentials, his message has a much lower information content.
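To make the mismatch concrete, here is a small sketch (the two sentences and the assumption of a uniform 27-symbol alphabet of letters plus space are invented for illustration):

    import math

    # Assume, for illustration only, 27 equally probable symbols (a-z plus space),
    # so every symbol carries log2(27), about 4.75 bits, regardless of what it says.
    bits_per_symbol = math.log2(27)

    verbose = "well um you know what i mean it is sort of like that thing really"
    concise = "energy is conserved"

    # Shannon's measure rewards sheer length, not meaning:
    print(len(verbose) * bits_per_symbol)   # roughly 309 bits
    print(len(concise) * bits_per_symbol)   # roughly 90 bits

On this measure the rambling sentence carries more than three times the information of the concise one, which is exactly the point recorded in Theorem 5 below.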
Figure 12: The five aspects of information. A complete characterization of the information concept requires all five aspects (statistics, syntax, semantics, pragmatics, and apobetics), which are essential for both the sender and the recipient. Information originates as a language; it is first formulated, and then transmitted or stored. An agreed-upon alphabet comprising individual symbols (code) is used to compose words. Then the (meaningful) words are arranged in sentences according to the rules of the relevant grammar (syntax), to convey the intended meaning (semantics). It is obvious that the information concept also includes the expected/implemented action (pragmatics), and the intended/achieved purpose (apobetics).
Some quotations concerning this aspect of information are as follows: French President Charles De Gaulle (1890-1970) said, "The Ten Commandments are so concise and plainly intelligible because they were compiled without first having a commission of inquiry." Another philosopher said, "There are about 35 million laws on earth to validate the Ten Commandments." A certain representative in the American Congress concluded, "The Lord's Prayer consists of 56 words, and the Ten Commandments contain 297 words. The Declaration of Independence contains 300 words, but the recently published ordinance about the price of coal comprises no fewer than 26,911 words."
Theorem 5: Shannon's definition of information exclusively concerns the statistical properties of sequences of symbols; meaning is completely ignored.
It follows that this concept of information is unsuitable for evaluating the information content of meaningful sequences of symbols. We now realize that an appreciable extension of Shannon's information theory is required to meaningfully evaluate information and information processing in both living and inanimate systems. The concept of information and the five levels required for a complete description are illustrated in Figure 12. This diagram can be regarded as a nonverbal description of information. In the following greatly extended description and definition, where real information is concerned, Shannon's theory is only useful for describing the statistical level (see chapter 5).
[[The theory is mathematics, plain and simple. The meaning of the message has no bearing on the communication of it.]]
This is the crux of the whole matter, and something I think you nailed pretty well in the previous threads on life's irreducible complexities.
[[If the correspondent ignores it, minimizes it or mixes other issues into it, he is hurting his own argument.]]
I'm going to agree with this. I think William's response to Shannon theory kind of brushed off one of the most important issues of information: info is useless without the means to communicate it, and the two are of no use without each other. While one can exist alone, it is meaningless; both must exist and work in unison to be of any use at all.