Posted on 04/02/2009 7:05:41 PM PDT by GodGunsGuts
Information in Living Organisms
Theorem 28: There is no known law of nature, no known process, and no known sequence of events which can cause information to originate by itself in matter...
(for remainder, click link below)
(Excerpt) Read more at answersingenesis.org ...
Touché.
I love an evolutionist with a sense of humor...
Hmmm. I didn’t realize that this was such a hot issue for you. I never really knew what Alex Williams was referring to until I started reading Werner Gitt’s “In the Beginning was Information.” His thesis does not exclude Shannon’s mathematical theory of information. Rather, it merely stipulates that Shannon’s theory occupies the lowest level of information, namely statistics (as opposed to the highest level, meaning).
No. A theorem is a statement that is proven from a set of given axioms in a logical system. The proof is constructed by starting with the axioms of the system and using the permitted logical rules to derive “true” statements. Think of how you derived the mean value theorem in elementary calculus, or established the equality of alternate interior angles in geometry.
Now, back to your post, it’s not a theorem if it has no proof, as above. Without proof, it’s a conjecture. Unless you are assuming it to be axiomatic, which wouldn’t surprise me.
To improve, one must *know* or *reward*. But small, random changes (as in DNA, via transposition errors or knockouts) do not by themselves produce changes in the macroscopic organism that may or may not influence progeny.
This is why slow, gradual change has been discarded as a theory in favor of the somewhat more plausible punctuated equilibrium.
I like the classic example some people throw out for the usefulness of random processes: a million monkeys typing for a million years. Except there is one problem. Suppose you get nearly to the end of the complete works of Shakespeare, with all but the last word spelled correctly and one letter remaining.
What stops the first monkey from changing his letter? Nothing, because no monkey “knows” he needs to keep the letter he got lucky with. A minimal Python sketch of the point follows (with a short hypothetical phrase standing in for Shakespeare and a 27-character alphabet): pure chance has no memory of near-misses, while the converging version only works because *something* locks the letters that are already right.
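```python
import random
import string

ALPHABET = string.ascii_lowercase + " "
TARGET = "to be or not to be"  # hypothetical stand-in for Shakespeare

def random_trial():
    """One monkey retypes the whole phrase from scratch: no memory of past luck."""
    return "".join(random.choice(ALPHABET) for _ in TARGET)

def matches(attempt):
    return sum(a == t for a, t in zip(attempt, TARGET))

# Pure chance: every trial is independent, so a near-miss is worth nothing.
best = 0
for _ in range(100_000):
    best = max(best, matches(random_trial()))
print(f"best match out of {len(TARGET)} characters after 100,000 trials: {best}")

# The 'locking' version: something must remember which letters are right.
attempt = [random.choice(ALPHABET) for _ in TARGET]
passes = 0
while "".join(attempt) != TARGET:
    for i, t in enumerate(TARGET):
        if attempt[i] != t:                  # only unlocked positions change
            attempt[i] = random.choice(ALPHABET)
    passes += 1
print(f"with locking, the phrase converges in about {passes} passes")
```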
This shows the classic problem with statistical theory with regard to randomness: ALL variables are INDEPENDENT. To hold some variables static, even for a moment in time, implies *something* is holding onto that state for a reason....
There is a recent TED talk (google TED Talks) about the hundreds of people wrongly convicted on DNA evidence because of a lack of understanding about which variables are independent vs. dependent. Essentially, the experts were quoting odds of “one in a billion” that the match was incorrect when, in fact, it was more like “one in a thousand.” People were convicted unjustly because reasonable doubt was destroyed.
All due to a lack of understanding of just the basic what-ifs in simple combinations of genetic markers. A rough numerical sketch of that independence error (with entirely made-up marker frequencies) shows how large the gap can be:
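```python
# Hypothetical illustration of the independence error (all numbers invented).
# Suppose a DNA profile uses 6 markers, each matching 1 in 30 people by chance.
per_marker = 1 / 30

# If the markers were truly independent, the joint match probability is the product:
independent = per_marker ** 6
print(f"assuming independence: 1 in {1/independent:,.0f}")   # ~ 1 in 729 million

# But if the markers are correlated (say, shared within a subpopulation, so the
# effective number of independent markers is closer to 2), the real figure is:
correlated = per_marker ** 2
print(f"with strong correlation: 1 in {1/correlated:,.0f}")  # ~ 1 in 900
```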
In the strictest sense of theorem you are right. In the more colloquial use which you point out, it is a "theorem". But unless you can name a law, process, or sequence of events which does provide a means for "information" to arise spontaneously, and can rigorously prove it, it seems entirely reasonable to accept the "theorem".
I said nothing about randomness, just natural. That’s all.
The theory is mathematics, plain and simple. Meaning of the message has no bearing on the communication of it. That is where the Shannon theory ends.
Meaning in the biological message goes to complex systems theory, another subject altogether, bringing in issues such as self-organizing complexity, cellular automata, algorithmic complexity, Kolmogorov complexity, etc. Ditto for autonomy and semiosis.
The Shannon theory is a powerful argument in the intelligent design debate - indeed, in many theological and philosophical debates as well.
If the correspondent ignores it, minimizes it or mixes other issues into it, he is hurting his own argument.
Because the mathematical theory is universal as it is, it is portable between many disciplines. It is well established.
It is like a Caterpillar in these debates; why would anyone want to use it like a little red wagon?
My comment and challenge to the use of the term was meant to shed some light on the paucity of rigor and absence of logical process in the original document.
Apparently, the word theorem has a slightly different definition with respect to science:
“There are also “theorems” in science, particularly physics, and in engineering, but they often have statements and proofs in which physical assumptions and intuition play an important role; the physical axioms on which such “theorems” are based are themselves falsifiable.”
http://en.wikipedia.org/wiki/Theorem#Theorems_in_logic
When considering a book B, a computer program C, or the human genome (the totality of genes), we first discuss the following questions:
How many letters, numbers, and words make up the entire text?
How many single letters does the employed alphabet contain (e.g., a, b, c, ..., z, or G, C, A, T)?
How frequently do certain letters and words occur?
To answer these questions, it is immaterial whether we are dealing with actual meaningful text, with pure nonsense, or with random sequences of symbols or words. Such investigations are not concerned with the contents, but only with statistical aspects. These topics all belong to the first and lowest level of information, namely the level of statistics.
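These purely statistical questions can be answered mechanically, whatever the text means (or fails to mean). A short Python sketch, using an arbitrary sample string:

```python
from collections import Counter

text = "GATTACA GATTACA"  # any sequence of symbols will do; meaning is irrelevant here

letters = Counter(c for c in text if c != " ")
words = Counter(text.split())

print("alphabet size:", len(letters))        # how many distinct symbols are used
print("letter frequencies:", dict(letters))  # how often each letter occurs
print("word frequencies:", dict(words))      # how often each word occurs
```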
As explained fully in appendix A1, Shannon's theory of information is suitable for describing the statistical aspects of information, e.g., those quantitative properties of languages which depend on frequencies. Nothing can be said about whether or not any given sequence of symbols is meaningful. The question of grammatical correctness is also completely excluded at this level. Conclusions:
Definition 1: According to Shannon's theory, any random sequence of symbols is regarded as information, without regard to its origin or whether it is meaningful or not.
Definition 2: The statistical information content of a sequence of symbols is a quantitative concept, measured in bits (binary digits).
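As a sketch of Definition 2: the textbook Shannon measure assigns each symbol -log2(p) bits, where p is the symbol's probability, and sums over the message. (Gitt's appendix A1 is not reproduced in this excerpt, so estimating the probabilities from the message's own frequencies below is an assumption.)

```python
import math
from collections import Counter

def information_content_bits(message):
    """Shannon information of a message, using the symbol frequencies in the
    message itself as probability estimates: I = sum over symbols of -log2(p)."""
    counts = Counter(message)
    total = len(message)
    return sum(-math.log2(counts[s] / total) for s in message)

print(information_content_bits("aaaa"))  # 0.0 bits: a certain symbol carries nothing
print(information_content_bits("abcd"))  # 8.0 bits: four equiprobable symbols, 2 bits each
```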
According to Shannon's definition, the information content of a single message (which could be one symbol, one sign, one syllable, or a single word) is a measure of the probability of its being received correctly. Probabilities range from 0 to 1, so that this measure is always positive. The information content of a number of messages (signs for example) is found by adding the individual probabilities as required by the condition of summability. An important property of information according to Shannon is:
Theorem 4: A message which has been subject to interference or noise, in general comprises more information than an error-free message.
This theorem follows from the larger number of possible alternatives in a distorted message, and Shannon states that the information content of a message increases with the number of symbols (see equation 6 in appendix A1). It is obvious that the actual information content cannot at all be described in such terms, as should be clear from the following example: When somebody uses many words to say practically nothing, this message is accorded a large information content because of the large number of letters used. If somebody else, who is really knowledgeable, concisely expresses the essentials, his message has a much lower information content.
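A small sketch of why the purely statistical measure rises with noise, assuming random corruption of a repetitive message (the corruption rate and sample text are invented for illustration):

```python
import math
import random
from collections import Counter

def entropy_bits_per_symbol(message):
    """Shannon entropy per symbol, estimated from the message's own frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

clean = "the cat sat on the mat " * 40
noisy = "".join(
    random.choice("abcdefghijklmnopqrstuvwxyz ") if random.random() < 0.3 else ch
    for ch in clean
)

print(f"clean: {entropy_bits_per_symbol(clean):.2f} bits/symbol")
print(f"noisy: {entropy_bits_per_symbol(noisy):.2f} bits/symbol (higher, yet less meaningful)")
```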
Figure 12: The five aspects of information. A complete characterization of the information concept requires all five aspects (statistics, syntax, semantics, pragmatics, and apobetics), which are essential for both the sender and the recipient. Information originates as a language; it is first formulated, and then transmitted or stored. An agreed-upon alphabet comprising individual symbols (code) is used to compose words. Then the (meaningful) words are arranged in sentences according to the rules of the relevant grammar (syntax), to convey the intended meaning (semantics). It is obvious that the information concept also includes the expected/implemented action (pragmatics), and the intended/achieved purpose (apobetics).
Some quotations concerning this aspect of information are as follows. French President Charles de Gaulle (1890–1970): “The Ten Commandments are so concise and plainly intelligible because they were compiled without first having a commission of inquiry.” Another philosopher said, “There are about 35 million laws on earth to validate the Ten Commandments.” A certain representative in the American Congress concluded, “The Lord's Prayer consists of 56 words, and the Ten Commandments contain 297 words. The Declaration of Independence contains 300 words, but the recently published ordinance about the price of coal comprises no fewer than 26,911 words.”
Theorem 5: Shannon's definition of information exclusively concerns the statistical properties of sequences of symbols; meaning is completely ignored.
It follows that this concept of information is unsuitable for evaluating the information content of meaningful sequences of symbols. We now realize that an appreciable extension of Shannon's information theory is required to significantly evaluate information and information processing in both living and inanimate systems. The concept of information and the five levels required for a complete description are illustrated in Figure 12. This diagram can be regarded as a nonverbal description of information. In the following greatly extended description and definition, where real information is concerned, Shannon's theory is only useful for describing the statistical level (see chapter 5).
I believe the target audience of the book is the semi-technical populace, not a purely technical one. And it is not written as a text book. People do have opinions. You seem to be chafed that it was not written as a proof. Okay, so ignore it.
Personally, I find the book fascinating. And as I probe a little deeper into the meaning of the word theorem, it is becoming clear that there are multiple definitions, depending on the discipline involved. And given the Wikipedia definition of a theorem re: science, it seems to me that Dr. Gitt's use of the word is appropriate to the field of knowledge he is pursuing.
"There are also theorems in science, particularly physics, and in engineering, but they often have statements and proofs in which physical assumptions and intuition play an important role; the physical axioms on which such theorems are based are themselves falsifiable."
This is wrong. Where did it come from? You gave a link to Wikipedia, but the quote is not contained there. The following quote is given there, and it's correct.
"Theorems in mathematics and theories in science are fundamentally different in their epistemology. A scientific theory cannot be proven; its key attribute is that it is falsifiable, that is, it makes predictions about the natural world that are testable by experiments."
Notice that the words theory and theorem are two different words.
It’s there, you need to read a little further down.
That describes bits pretty well.
It's the applicability of any theorem to modeling reality that requires evidence. That evidence is never a part of determining whether or not some statement is a theorem.
Note that the "no hair theorem" is still a theorem, but it was shown not to be an accurate model of reality and thus is only an element of mathematics, not science (physics). The no-hair theorem says, in summary, that black holes mask their contents.
If you are so adamant then you should easily be able to prove it untrue. You can’t.
Oh, and it only took 32 posts for someone to attack the author's “worthiness”.
GGG, count it as a victory.
“Also, there’s no assumption that things operate the same way when they’re not observed. It’s a requirement that they do so. Otherwise A ≠ A!”
Not going to get into the theology, and I generally agree with your assessment of the site, but your above statement is factually incorrect. I’m guessing you haven’t studied QM much. The reality is that it is provable (and has been proven) that at the QM level (and, in some cases, if set up properly, at the macro level), A(observed) does NOT equal A(unobserved). The double-slit experiment is the classical (pun intended) experiment that shows this. It’s very counter-intuitive stuff. Also google the double-slit experiment using starlight for a great example of the distemporal nature of this QM effect. PS: It has been a while since I took my graduate-level QM classes and I don’t use this stuff much anymore, so apologies for any inaccuracies in terminology.
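A toy numerical illustration of that point, using an idealized two-slit model (not a real QM simulation; all parameters are invented): with both paths unobserved the complex amplitudes add and interfere, while with which-path observation the intensities add and the fringes vanish.

```python
import numpy as np

# Toy two-slit model: two point sources separated by d, screen at distance L.
wavelength, d, L = 1.0, 5.0, 100.0
x = np.linspace(-30, 30, 7)              # a few screen positions

r1 = np.sqrt(L**2 + (x - d / 2) ** 2)    # path length from slit 1
r2 = np.sqrt(L**2 + (x + d / 2) ** 2)    # path length from slit 2
k = 2 * np.pi / wavelength

a1 = np.exp(1j * k * r1) / r1            # complex amplitude via each slit
a2 = np.exp(1j * k * r2) / r2

unobserved = np.abs(a1 + a2) ** 2              # amplitudes add: interference fringes
observed = np.abs(a1) ** 2 + np.abs(a2) ** 2   # which-path known: no fringes

for xi, u, o in zip(x, unobserved, observed):
    print(f"x={xi:6.1f}  unobserved={u:.5f}  observed={o:.5f}")
```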