To: Alamo-Girl; betty boop; CottShop; AndrewC
Yes, I realize that now. Nor am I in any way disparaging the practical benefits of Shannon's theory. I'm merely pointing out that, according to Dr. Gitt, and apparently Williams as well, the statistics of a message are actually the lowest level of information. From the article:

4.1 The Lowest Level of Information: Statistics

When considering a book B, a computer program C, or the human genome (the totality of genes), we first discuss the following questions:

–How many letters, numbers, and words make up the entire text?
–How many single letters does the employed alphabet contain (e.g., a, b, c . . . z, or G, C, A, T)?
–How frequently do certain letters and words occur?

To answer these questions, it is immaterial whether we are dealing with actual meaningful text, with pure nonsense, or with random sequences of symbols or words. Such investigations are not concerned with the contents, but only with statistical aspects. These topics all belong to the first and lowest level of information, namely the level of statistics.
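
To make this level concrete, here is a minimal Python sketch (the sample strings are invented for illustration) that answers the three questions above for any sequence of symbols. A meaningful sentence and a scrambled copy of it produce identical statistics:

    # Statistics level: count letters, alphabet size, and frequencies.
    # The same questions apply whether the text is meaningful or nonsense.
    from collections import Counter

    def statistics_level(text):
        letters = [c for c in text.lower() if c.isalpha()]
        freq = Counter(letters)
        return {
            "total_letters": len(letters),
            "total_words": len(text.split()),
            "alphabet_size": len(freq),          # distinct symbols used
            "letter_frequencies": freq.most_common(3),
        }

    meaningful = "In the beginning was the word"
    nonsense = "nI eht gninnigeb saw eht drow"  # same letters, scrambled

    # Both calls print identical statistics: this level is blind to meaning.
    print(statistics_level(meaningful))
    print(statistics_level(nonsense))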

As explained fully in appendix A1, Shannon’s theory of information is suitable for describing the statistical aspects of information, e.g., those quantitative properties of languages which depend on frequencies. Nothing can be said about the meaningfulness or not of any given sequence of symbols. The question of grammatical correctness is also completely excluded at this level. Conclusions:

Definition 1: According to Shannon’s theory, any random sequence of symbols is regarded as information, without regard to its origin or whether it is meaningful or not.
Definition 2: The statistical information content of a sequence of symbols is a quantitative concept, measured in bits (binary digits).

According to Shannon’s definition, the information content of a single message (which could be one symbol, one sign, one syllable, or a single word) is a measure of the probability of its being received correctly. Probabilities range from 0 to 1, so that this measure is always positive. The information content of a number of messages (signs for example) is found by adding the individual probabilities as required by the condition of summability. An important property of information according to Shannon is:
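
Concretely, in Shannon's formulation a symbol occurring with probability p carries -log2(p) bits, and it is these per-symbol values (not the probabilities themselves) that add under the summability condition. A minimal Python sketch, assuming independent symbols and invented probabilities:

    import math

    def self_information(p):
        # Shannon information content of one symbol of probability p, in bits.
        return -math.log2(p)

    # Invented probabilities for a four-letter alphabet (e.g., G, C, A, T).
    probs = {"G": 0.25, "C": 0.25, "A": 0.25, "T": 0.25}

    # Content of a whole message = sum of per-symbol contents (summability).
    # Each value is positive because 0 < p <= 1.
    message = "GATTACA"
    bits = sum(self_information(probs[s]) for s in message)
    print(f"{bits:.1f} bits")  # 7 symbols x 2 bits each = 14.0 bits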

Theorem 4: A message which has been subject to interference or “noise,” in general comprises more information than an error-free message.

This theorem follows from the larger number of possible alternatives in a distorted message, and Shannon states that the information content of a message increases with the number of symbols (see equation 6 in appendix A1). It is obvious that the actual information content cannot at all be described in such terms, as should be clear from the following example: When somebody uses many words to say practically nothing, this message is accorded a large information content because of the large number of letters used. If somebody else, who is really knowledgeable, concisely expresses the essentials, his message has a much lower information content.
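
A toy illustration of this paradox (both messages are invented): under a simple model in which every letter carries log2(26), about 4.7 bits, the rambling message scores a higher Shannon information content than the concise one, even though it says less:

    import math

    # Toy model: 26 equiprobable letters, each carrying log2(26) bits.
    BITS_PER_LETTER = math.log2(26)  # about 4.7 bits

    def shannon_bits(text):
        return BITS_PER_LETTER * sum(1 for c in text.lower() if c.isalpha())

    verbose = ("it is perhaps conceivably the case that in some sense "
               "something might possibly happen at some point")
    concise = "rain likely tomorrow"

    # The verbose message scores more "information" by the statistical measure.
    print(f"verbose: {shannon_bits(verbose):.0f} bits")
    print(f"concise: {shannon_bits(concise):.0f} bits")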

Figure 12: The five aspects of information. A complete characterization of the information concept requires all five aspects—statistics, syntax, semantics, pragmatics, and apobetics—which are essential for both the sender and the recipient. Information originates as a language; it is first formulated, and then transmitted or stored. An agreed-upon alphabet comprising individual symbols (code) is used to compose words. Then the (meaningful) words are arranged in sentences according to the rules of the relevant grammar (syntax), to convey the intended meaning (semantics). It is obvious that the information concept also includes the expected/implemented action (pragmatics) and the intended/achieved purpose (apobetics).

Some quotations concerning this aspect of information are as follows. French president Charles de Gaulle (1890–1970): “The Ten Commandments are so concise and plainly intelligible because they were compiled without first having a commission of inquiry.” Another philosopher said, “There are about 35 million laws on earth to validate the Ten Commandments.” A certain representative in the American Congress concluded, “The Lord’s Prayer consists of 56 words, and the Ten Commandments contain 297 words. The Declaration of Independence contains 300 words, but the recently published ordinance about the price of coal comprises no fewer than 26,911 words.”

Theorem 5: Shannon’s definition of information exclusively concerns the statistical properties of sequences of symbols; meaning is completely ignored.

It follows that this concept of information is unsuitable for evaluating the information content of meaningful sequences of symbols. We now realize that an appreciable extension of Shannon’s information theory is required to significantly evaluate information and information processing in both living and inanimate systems. The concept of information and the five levels required for a complete description are illustrated in Figure 12. This diagram can be regarded as a nonverbal description of information. In the following greatly extended description and definition, where real information is concerned, Shannon’s theory is only useful for describing the statistical level (see chapter 5).

50 posted on 04/02/2009 9:35:19 PM PDT by GodGunsGuts


To: GodGunsGuts; Alamo-Girl; CottShop
What, precisely, do Drs. Williams and Gitt mean by "the statistics of the message"?

Do they believe that there is such a thing as "information" absent a sender and a receiver?

Truly these are fascinating issues, GGG!

73 posted on 04/03/2009 9:07:36 AM PDT by betty boop (All truthful knowledge begins and ends in experience. — Albert Einstein)

To: GodGunsGuts; betty boop; CottShop; AndrewC
Both Dr. Gitt and Alex Williams have provided important insights into the meaning of biological messages. I was particularly impressed with Williams's inverse causality - that the molecular machinery receives repair and maintenance messages which are temporally non-local, i.e., messages addressing a need that hasn't happened yet and could not be anticipated. And their points should be raised far and wide.

But their issues go to the complexification, origin, and meaning of the biological message itself - not the communication of it.

My criticism goes to their hand-waving of Shannon's Mathematical Theory of Communication, which is the door opener - the strength of their argument to a secular world.

Shannon's model is mathematics; it is universal, not subject to extension or qualification. From Gitt's excerpt (emphasis mine):

It follows that this concept of information is unsuitable for evaluating the information content of meaningful sequences of symbols. We now realize that an appreciable extension of Shannon’s information theory is required to significantly evaluate information and information processing in both living and inanimate systems. The concept of information and the five levels required for a complete description are illustrated in Figure 12. This diagram can be regarded as a nonverbal description of information. In the following greatly extended description and definition, where real information is concerned, Shannon’s theory is only useful for describing the statistical level (see chapter 5).

To qualify Shannon's theory to include meaning (information content) in any application of it is to attack its universality per se.
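
One way to see the universality point (a sketch with an invented sample sentence): Shannon entropy is a function of the symbol-frequency distribution alone, so a meaningful sentence and a random permutation of its letters receive exactly the same value. Meaning never enters the formula:

    import math
    import random
    from collections import Counter

    def entropy_bits_per_symbol(text):
        # Shannon entropy H = -sum(p * log2(p)) over observed frequencies.
        freq = Counter(text)
        n = len(text)
        return -sum((c / n) * math.log2(c / n) for c in freq.values())

    meaningful = "to qualify shannon is to attack universality"
    scrambled = "".join(random.sample(meaningful, len(meaningful)))

    # Identical entropy: the measure sees only the frequency distribution.
    print(entropy_bits_per_symbol(meaningful))
    print(entropy_bits_per_symbol(scrambled))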

It is roughly the equivalent of saying that Euclidean geometry must be qualified to accommodate the content of a plane, as if the strength of Euclidean geometry rested in its application rather than its universality.

This is an unnecessary overreach that diminishes their own arguments; it makes them look weak - like a young physicist taking the podium to declare, "Einstein's theory must be extended, and I'm going to tell you how things really are."

As another example, Newtonian physics stands on its own, as does Relativity, as does Quantum Mechanics. To obtain the most complete view of the physical world, one must entertain all three. And a theory of everything would reconcile all of them.

They are complementarities (to use one of betty boop's favorite terms), not extensions; the one does not diminish the other.

And that is the way Gitt and Williams should have approached it.

79 posted on 04/03/2009 9:27:40 AM PDT by Alamo-Girl
