Free Republic

In the Beginning Was Information: Information in Living Organisms (Ch 6)
AiG ^ | April 2, 2009 | Dr. Werner Gitt

Posted on 04/02/2009 7:05:41 PM PDT by GodGunsGuts



To: oldmanreedy; GodGunsGuts
NO BUSINESS PARTICIPATING IN DISCUSSIONS ABOUT INFORMATION THEORY.

I'm sorry, but who died and made you king? As to your link, Albert Gore wrote a book. Does that make him believable, or his book a requirement for people to discuss the subject of climate? I've looked at the first few pages of the book you linked to and I have not found a definition of "information" in it so far. But I have found this in the book...

Kolmogorov, Chaitin, and Solomonoff put forth the idea that the complexity of a string of data can be defined by the length of the shortest binary computer program for computing the string.

What exactly is data, and a computer program? I suppose the least objectionable statement would have been:

... idea that a characteristic of a sequence of symbols, a member of a certain set, can be defined by a characteristic of another sequence of symbols, a member of a set (which could possibly be the same set, braving the halting problem), having the value of "shortest".

That assumes we agree on what symbols, sets, members, value, and shortest mean.

221 posted on 04/05/2009 12:31:03 PM PDT by AndrewC
[ Post Reply | Private Reply | To 219 | View Replies]

To: AndrewC
"I'm sorry, but who died and made you king?"

You know what died and made me king? What died is the basic damn honesty to keep your mouth shut when you know nothing about a subject. So as long as you (and by you I don't just mean you specifically AndrewC, but everyone on this shameful thread) are publicly ignorant and I am knowledgeable, I am king.

"As to your link, Albert Gore wrote a book. Does that make him believable and a requirement for people to discuss the subject of climate?"

Laughable fallacy of equivocation. Al Gore is a freakin politician who wrote a freakin propaganda piece for the consumption of the gullible masses. I linked to a textbook, written by an actual practicing specialist in the field, intended for use in rigorous college-level math courses.

"I've looked at the first few pages of the book you linked to and I have not found a definition of "information" in it so far."

hahahahahahah, this is the whole dirty little secret that keeps me alternately laughing at you and spraying bile at you. Information theory is, in very loose terms, an approach to statistics that focuses on measuring the "surprise" and "uncertainty" associated with the outcomes of a random variable. There are multiple ways of doing this, and so the term "information" can be attached to several different constructions. Self-information, mutual information, entropy, joint entropy, conditional entropy, etc. None of these really correspond to what creationists are feebly attempting to talk about; the one that comes closest might be Kolmogorov complexity. Which leads me to...
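To make those constructions a bit more concrete, here is a minimal sketch in Python (illustrative only; the probabilities are made up) of two of them, self-information and Shannon entropy:

import math

def self_information(p):
    # surprise of a single outcome with probability p, in bits
    return -math.log2(p)

def entropy(probs):
    # Shannon entropy H(X) = -sum p*log2(p): the average surprise, in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(self_information(0.5))   # a fair coin flip carries 1 bit of surprise
print(entropy([0.5, 0.5]))     # H = 1 bit for the fair coin
print(entropy([0.9, 0.1]))     # ~0.47 bits: a heavily biased coin is less "uncertain"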

"Kolmogorov, Chaitin, and Solomonoff put for the idea that the complexity of a string of data can be defined by the length of the shortest binary computer program for computing the string.

What exactly is data, and a computer program? I suppose the least objectionable statement would have been...[nonsense omitted by editor]"

Son, you realize you read an introduction, right? That there is in fact an entire chapter devoted to Kolmogorov complexity, and that's where you should look for rigorous definitions? Specifically the section entitled "Kolmogorov Complexity: Definitions"? For what it's worth, I believe strings of data are usually just functions from the natural numbers into a set (the set is the "alphabet"), and computer programs can be given rigorous definitions using Turing Machines. In fact, I believe that the rigorous discussion of Kolmogorov complexity is usually conducted in the context of Turing Machines.
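As a loose illustration of that idea (and only an illustration: Kolmogorov complexity itself is uncomputable, and a general-purpose compressor gives at best a crude upper bound on description length), compare how much zlib can shrink a patterned string versus a noisy one of the same length:

import random
import zlib

patterned = b"ab" * 500                                      # 1000 bytes generated by a very short rule
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(1000))    # 1000 bytes with no obvious short description

print(len(zlib.compress(patterned)))   # small: the compressor effectively "finds" the short rule
print(len(zlib.compress(noisy)))       # close to 1000: nothing much to exploit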

DISCLAIMER: I know very little about Kolmogorov complexity, theoretical computer science, and related matters, please consult actual experts or their actual expert books for real education on this subject.

222 posted on 04/05/2009 2:22:39 PM PDT by oldmanreedy
[ Post Reply | Private Reply | To 221 | View Replies]

To: oldmanreedy
What died is the basic damn honesty to keep your mouth shut when you know nothing about a subject. So as long as you (and by you I don't just mean you specifically AndrewC, but everyone on this shameful thread) are publicly ignorant and I am knowledgeable, I am king.

Sorry buddy, but you could be Shannon, Kolmogorov, or Chaitin, and that still would not make you the gatekeeper on participation in discussions. You do not have to listen, heed, or even respect anybody's argument, but that does not give you the right to attempt to prevent their participation.

Al Gore is a freakin politician who wrote a freakin propaganda piece for the consumption of the gullible masses.

Is it not a book which purports to be fact and not fiction? There is no equivocation there unless you use book in the sense that Tony Soprano would use the word. Book on it.

hahahahahahah, this is the whole dirty little secret that keeps me alternately laughing at you and spraying bile at you.

Well, at least you are truthful about the bile.

None of these really correspond to what creationists are feebly attempting to talk about; the one that comes closest might be Kolmogorov complexity. Which leads me to...

So at the end you admit that the discussion doesn't seem to be about what you explicitly say it is. Now if you look at my post 85 you'll see a link to Shannon's paper. You'll find it handy for checking whether I have ever used the term "Information theory". You'll also note that I often put "information" in quotes. I do that because no one has really defined what we are talking about. Having said those things, I have read the Shannon paper, and I do know something about the subject when talking about his view of communication. And as I pointed out, meaning is not important to his analysis, but obviously his paper is a message with meaning.
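That point about meaning can be made concrete with a toy sketch (my own example, not from Shannon's paper): per-symbol Shannon entropy is computed from symbol frequencies alone, so an English word, its anagram, and a meaningless scramble of the same letters all measure exactly the same.

import math
from collections import Counter

def per_symbol_entropy(msg):
    # entropy of the empirical symbol distribution of msg, in bits per symbol
    counts = Counter(msg)
    n = len(msg)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(per_symbol_entropy("listen"))   # a meaningful English word
print(per_symbol_entropy("silent"))   # its anagram: same letters, same entropy
print(per_symbol_entropy("nlties"))   # a meaningless scramble: still the same value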

Son you realize you read an introduction right?

No shit, Sherlock. You might use your magnifying glass to note that I expressly used the words "so far". Those words have meaning despite the irrelevance of that meaning to Shannon. And I am not your son, being the age of 60 and knowing my parents, which is something I fear you might have difficulty with, seeing that you do not know your children.

and computer programs can be given rigorous definitions using Turing Machines. In fact, I believe that the rigorous discussion of Kolmogorov complexity is usually conducted in the context of Turing Machines.

And I know about Turing machines, having conceptually programmed them while learning PL/1 in 1968. So I have some familiarity with state machines.
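For readers who haven't met one, here is a minimal Turing-machine-style sketch in Python (purely illustrative; the states, alphabet, and transitions are all made up for the example). It flips every bit on its tape and then halts:

def run(tape):
    tape = list(tape)
    state, head = "flip", 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"   # "_" stands for the blank symbol
        if state == "flip":
            if symbol == "0":
                tape[head] = "1"; head += 1                # write 1, move right
            elif symbol == "1":
                tape[head] = "0"; head += 1                # write 0, move right
            else:
                state = "halt"                             # hit a blank: nothing left to flip
    return "".join(tape)

print(run("010011"))   # -> "101100"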

Finally, who died and made you king?

223 posted on 04/05/2009 5:59:53 PM PDT by AndrewC
[ Post Reply | Private Reply | To 222 | View Replies]

To: spunkets; oldmanreedy; GodGunsGuts; AndrewC; CottShop; betty boop
Any effective treatment of the received signal to eliminate errors depends on recognizing whether errors are due to noise, or interference.

... or distortion.

This thread has not moved into the specifics of information theory, nor should it, since the issues being raised go to the more general question of whether or not Gitt (and Williams) are in the right ballpark with reference to Shannon.

For instance, Gitt claims that Shannon's theory must be expanded to include the meaning of the message, which he calls information, and yet the meaning of the message is irrelevant to the model. The model is universal; the message is a chance variable.

Also he doesn't seem to grasp that information (successful communication) is the reduction of uncertainty (Shannon entropy) in the receiver (or molecular machine) as it goes from a before state to an after state. It is the action, not the message.

The first step to expanding on another's theory ought to be knowing what that theory "is." As I said before, if a person wants to complain about farmers, he shouldn't speak with his mouth full.

That said, I am glad oldmanreedy brought up Kolmogorov, Chaitin and Solomonoff. (Though it makes me miss tortoise all the more.)

If a person wants to opine about information (successful communication) he must deal with Shannon. More specifically, if he wants to apply Shannon's theory to molecular biology he must deal with Yockey and Schneider et al.

But if a person wants to opine per se about the meaning of the message being communicated (complexity, semiosis, etc.) he must deal with Kolmogorov, Chaitin, Solomonoff, Wolfram, Eigen, von Neumann, Turing, et al.

Or the philosophers and theologians - whichever level he is addressing.

224 posted on 04/05/2009 10:30:26 PM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 218 | View Replies]

To: Alamo-Girl; betty boop; CottShop; AndrewC

It seems to me that “reduction of uncertainty” is not the best way to define information. First, it implies that someone or something is waiting for or otherwise anticipating a message. But if nobody is waiting for a message, how does that reduce uncertainty? Second, “reduction of uncertainty” does not strike me as an adequate description of what is really happening when successful communication takes place. For instance, how is uncertainty quantified? And even if it can be quantified, it’s not really telling you what’s going on IMHO. Take for example a pile of lumber. Could not the lumber be described as being in a state of uncertainty? Simply describing a reduction of the lumber’s uncertainty does not describe a house, or a porch, or a fence. It seems to me the definition of information needs to include much more than a simple reduction of uncertainty.


225 posted on 04/06/2009 8:33:27 AM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 193 | View Replies]

To: GodGunsGuts; betty boop; CottShop; AndrewC
The thing is, neither you nor Gitt nor Williams should come up with a new definition for the word "information" while at the same time appealing to Shannon's theory.

That is what I mean by saying people should not complain about farmers while their mouth is full.

The title of Shannon's paper is A Mathematical Theory of Communication.

It is not about the message, it is about the communication of the message no matter what the message "is."

Under Shannon, information (successful communication) is the reduction of uncertainty (Shannon entropy) in the receiver (or molecular machine) in going from a before state to an after state. It is the action, the communication. Not the message, much less the "meaning" of the message.
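A toy sketch of that before/after picture (my own numbers, loosely in the spirit of Schneider's R = Hbefore - Hafter, not taken from his papers): the receiver starts uncertain among four equally likely symbols and, after receiving a noisy message, is nearly certain of one of them.

import math

def entropy(probs):
    # Shannon entropy in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

h_before = entropy([0.25, 0.25, 0.25, 0.25])    # 2 bits of uncertainty before the message
h_after  = entropy([0.97, 0.01, 0.01, 0.01])    # residual uncertainty after the message
print(h_before - h_after)                       # ~1.76 bits: the uncertainty removed by the communication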

It doesn't matter to me if Gitt and Williams want to come up with a new vocabulary, but they have no business trying to do so while standing on Shannon's shoulders. It destroys their credibility.

Schneider uses the term "uncertainty" because it is more understandable than the term "Shannon entropy." The Shannon entropy formula has basically the same form as the one in thermodynamics, Boltzmann's H-theorem, and is thus called "entropy," but the similarity ends there.

Shannon entropy is reduced by successful communication. In thermodynamics, entropy increases.

That is a powerful line of argument in the Intelligent Design debate.

Likewise, your lumber is indeed in a state of uncertainty. But it cannot act as a receiver of a message. If it could, as something occurring in nature, then we would say it is alive.

And you can talk all day long to a bucket of water but it cannot act as a receiver, if it did then voila there would be hard evidence of abiogenesis.

Instead, the fact that there had to exist a capable receiver before the first biological message was sent stands as a powerful argument in favor of biogenesis.

Likewise, the very point that the message (DNA) survives the physical death of the biological organism, while the organism is alive only as long as it continues to physically communicate (in Shannon's sense), stands as a very powerful argument in the Intelligent Design debate.

And certainly the content of the biological message (complexity, semiosis, etc.) is a wonderful Intelligent Design argument. But that argument should never be made while criticizing Shannon as not having gone far enough.

226 posted on 04/06/2009 9:54:20 AM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 225 | View Replies]

To: Alamo-Girl

==The thing is, neither you nor Gitt nor Williams should come up with a new definition for the word “information” while at the same time appealing to Shannon’s theory.

I don’t think what I’m asking for would overturn Shannon. It would merely build upon him, and shore up some areas that, at least to my mind, he left ambiguous. I just don’t think a reduction of uncertainty is adequate to describe what information is, or the successful transmission of the same. It is certainly a component part of a successful communication, but it does not come close to describing the factors involved IMHO. I really can’t speak to Gitt’s “In the Beginning was Information” just yet (I’m still stuck on whether his theorems are actually theorems), other than to say that I think he is asking the right kinds of questions. Perhaps all the confusion surrounding Shannon’s information theory is because it is too broadly named, making it sound like it is dealing with the totality of information, when in reality it is only dealing with the statistical aspects of the same.


227 posted on 04/06/2009 10:21:08 AM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 226 | View Replies]

To: Alamo-Girl

PS I think Dr. Gitt has actually come up with a definition for information, but he hasn’t yet got there in the chapters of the book AiG has posted so far.

PPS I realize that Shannon’s theory is a “Mathematical Theory of Communication.” All I’m saying is that it is not nearly enough by itself to describe the totality of what communication involves.

All the best—GGG


228 posted on 04/06/2009 10:26:21 AM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 226 | View Replies]

To: GodGunsGuts
Our Constitution is a wonderful thing. I may not agree with what you say, but I will wholeheartedly defend your right to say it.

May God ever bless you, dear brother in Christ!

229 posted on 04/06/2009 10:42:57 AM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 228 | View Replies]

To: Alamo-Girl

Thank you, AG :o)

BTW, I found a PDF of Dr. Gitt’s book, thus allowing me to skip forward to his definition of information (I wish he had started with the definition, but alas, he spends his entire book building up to it). He also goes into more depth with respect to Dr. Shannon’s mathematical information theory in the appendix. But I will save it until Chapter 7 of his book appears on the AiG website.

All the best—GGG


230 posted on 04/06/2009 11:31:39 AM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 229 | View Replies]



Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.
