
In the Beginning Was Information: Information in Living Organisms (Ch 6)
AiG ^ | April 2, 2009 | Dr. Werner Gitt

Posted on 04/02/2009 7:05:41 PM PDT by GodGunsGuts

Information in Living Organisms

Theorem 28: There is no known law of nature, no known process, and no known sequence of events which can cause information to originate by itself in matter...

(for remainder, click link below)

(Excerpt) Read more at answersingenesis.org ...


TOPICS: Constitution/Conservatism; Culture/Society; News/Current Events; Philosophy
KEYWORDS: aminoacids; code; creation; dna; evolution; genetic; genome; goodgodimnutz; information; intelligentdesign; proteins
To: Alamo-Girl; betty boop; CottShop; AndrewC

PS I’m still failing to see what is bothering you about Dr. Gitt et al wanting to take Information Theory into what constitutes the meaning of the message. Is there some other theory that depends on Dr. Shannon’s information theory that is being threatened by Dr. Gitt’s work?


121 posted on 04/03/2009 11:48:19 AM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 117 | View Replies]

To: Alamo-Girl; betty boop; CottShop; AndrewC

==Indeed, he does it again to Manfred Eigen declaring his work useless because it does not consider what he wants it to consider.

Dr. Gitt was simply quoting other authorities to demonstrate where he thinks Dr. Shannon’s theory comes up short. He did, btw, credit Dr. Shannon with the following:

“Claude E. Shannon was the first researcher who tried to define information mathematically. The theory based on his findings had the advantages that different methods of communication could be compared and that their performance could be evaluated. In addition, the introduction of the bit as a unit of information made it possible to describe the storage requirements of information quantitatively.”

But then Dr. Gitt goes on to explain the disadvantages of Dr. Shannon’s theory:

“The main disadvantage of Shannon’s definition of information is that the actual contents and impact of messages were not investigated. Shannon’s theory of information, which describes information from a statistical viewpoint only, is discussed fully in the appendix (chapter A1).”

In other words, in Gitt’s opinion, Shannon’s theory is a start, but there are many aspects of information that his theory does not explain, which is precisely what Gitt attempts to do in his book. I view such attempts as a positive development.


122 posted on 04/03/2009 12:00:42 PM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 117 | View Replies]

To: GodGunsGuts; betty boop; AndrewC; CottShop; r9etb
Dr. Gitt finds Shannon theory lacking because it doesn't address what he wants to address:

It follows that this concept of information is unsuitable for evaluating the information content of meaningful sequences of symbols. We now realize that an appreciable extension of Shannon’s information theory is required to significantly evaluate information and information processing in both living and inanimate systems. The concept of information and the five levels required for a complete description are illustrated in Figure 12. This diagram can be regarded as a nonverbal description of information. In the following greatly extended description and definition, where real information is concerned, Shannon’s theory is only useful for describing the statistical level (see chapter 5).

Shannon's Mathematical Theory of Communication is not hierarchical, as betty boop has pointed out. The meaning of the message has no bearing on the model - it is universal.
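
To make that concrete, here is a quick sketch (my own illustration in Python, not anything taken from Shannon's paper or from Dr. Gitt's book): the statistical measure is computed from symbol frequencies alone, so scrambling a sentence into nonsense leaves the number completely unchanged.

    # Shannon's measure depends only on symbol statistics, never on meaning.
    import random
    from collections import Counter
    from math import log2

    def shannon_entropy(text: str) -> float:
        """Entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
        counts = Counter(text)
        n = len(text)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    meaningful = "TO BE OR NOT TO BE"
    # The same symbols in random order: the meaning is destroyed, the statistics are not.
    scrambled = "".join(random.sample(meaningful, len(meaningful)))

    print(shannon_entropy(meaningful))  # roughly 2.6 bits per symbol
    print(shannon_entropy(scrambled))   # exactly the same value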

Likewise, pi is universal - the size and composition of the circle have no bearing on the formula. Complaining that they should would discredit anything else the speaker might have to say about circles.

Information is not the message, it is the successful communication of it, i.e. the receiver becomes (action) "informed."

I realize I probably sound like a purist - and perhaps I am - but as they say "if you want to complain about a farmer, don't speak with your mouth full."

123 posted on 04/03/2009 12:01:29 PM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 120 | View Replies]

To: GodGunsGuts; r9etb; Alamo-Girl
PS I’m still failing to see what is bothering you about Dr. Gitt et al wanting to take Information Theory into what constitutes the meaning of the message.

Oh he can take "information theory" anywhere he wants to for his purposes; the point is, the Shannon model of information theory is completely blind to the content of messages. So why would he choose this model to make any point whatever about the content or meaning of messages?

124 posted on 04/03/2009 12:06:02 PM PDT by betty boop (All truthful knowledge begins and ends in experience. — Albert Einstein)
[ Post Reply | Private Reply | To 121 | View Replies]

To: Alamo-Girl; betty boop; CottShop; AndrewC

==I realize I probably sound like a purist - and perhaps I am - but as they say “if you want to complain about a farmer, don’t speak with your mouth full.”

LOL...Good one!

I think Dr. Gitt is making a distinction between information and what information is carried on (the medium), whereas Dr. Shannon treats the material medium as measurable information. On this point, I think they may disagree. I will try to read Gitt more carefully this weekend to see if I can sort it out.

All the best—GGG


125 posted on 04/03/2009 12:08:36 PM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 123 | View Replies]

To: betty boop; Alamo-Girl; CottShop; AndrewC

I think he’s trying to say that the medium is not the information itself. However, the medium does carry the information, which collectively constitutes the message. As such, there is definite overlap, but it appears to me now that there is a bit of disagreement over what constitutes information as well. As I mentioned to Alamo-Girl, I am going to try and find the time to read Gitt’s book (i.e. that part of it that is available on the internet) more closely over the weekend to see if I can sort this thing out.

God bless you and yours!—GGG


126 posted on 04/03/2009 12:14:33 PM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 124 | View Replies]

To: r9etb
For a fellow who is clearly trying to present a "scientific" discussion, that is fatal to his argument -- whatever his motivations may be.

Okay. Nothing wrong with what you say from the viewpoint of someone seeking a rigorous argument. But even Feynman was not always rigorous when he was teaching, which is something teachers do.

127 posted on 04/03/2009 12:15:30 PM PDT by AndrewC
[ Post Reply | Private Reply | To 118 | View Replies]

To: GodGunsGuts; betty boop; CottShop; AndrewC; r9etb
But then Dr. Gitt goes on to explain the disadvantages of Dr. Shannon’s theory:

“The main disadvantage of Shannon’s definition of information is that the actual contents and impact of messages were not investigated. Shannon’s theory of information, which describes information from a statistical viewpoint only, is discussed fully in the appendix (chapter A1).”

In other words, in Gitt’s opinion, Shannon’s theory is a start, but there are many aspects of information that his theory does not explain, which is precisely what Gitt attempts to do in his book. I view such attempts as a positive development.

By his statement, Dr. Gitt shows that he doesn't know or refuses to accept what information "is."

Information (successful communication) is the reduction of uncertainty (Shannon entropy) in the receiver (or molecular machine) as it goes from a before state to an after state.

Hamlet is not information. It is a message. Einstein's theory of general relativity is not information. It is a message. The files in your file cabinet are messages, the data on your hard drive, the mail in your mailboxes, etc.

DNA is not information. It is a message. It is dead as a doornail.

Take a live bird and a dead bird to the top of a tower and throw them both off. They both have the same DNA, the same message. But the one that flew away had information. The other went *splat.*
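
If it helps, here is a tiny sketch of that before/after framing in numbers (my own illustration in Python, not drawn from Shannon or Gitt): the information received is the uncertainty before reception minus whatever uncertainty remains afterward.

    from math import log2

    def entropy(probabilities):
        """Shannon entropy H = -sum(p * log2 p), skipping zero-probability outcomes."""
        return -sum(p * log2(p) for p in probabilities if p > 0)

    # Before: the receiver considers 8 possible messages equally likely.
    before = [1 / 8] * 8                        # H = 3 bits of uncertainty

    # After a clean reception: the receiver is certain which message was sent.
    after_clean = [1.0, 0, 0, 0, 0, 0, 0, 0]    # H = 0 bits

    # After a garbled reception: two candidates remain equally likely.
    after_noisy = [0.5, 0.5, 0, 0, 0, 0, 0, 0]  # H = 1 bit

    print(entropy(before) - entropy(after_clean))  # 3.0 bits of information received
    print(entropy(before) - entropy(after_noisy))  # 2.0 bits; noise ate the rest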

128 posted on 04/03/2009 12:23:25 PM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 122 | View Replies]

To: GodGunsGuts
I am going to try and find the time to read Gitt’s book (i.e. that part of it that is available on the internet) more closely over the weekend to see if I can sort this thing out.

Sounds great to me! I look forward to your comments.

129 posted on 04/03/2009 12:26:39 PM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 126 | View Replies]

To: betty boop; Alamo-Girl
"Shannon theory has no input whatsoever at the level of the message itself."

It is important that the message actually be a message, because a distinction is made between messages and noise.
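
A small illustration of why the distinction matters (my own Python sketch, not taken from any of the texts under discussion): over a binary symmetric channel with uniform input, each received bit carries 1 - H(p) bits about the transmitted bit, where p is the flip probability. As p approaches 1/2 that figure falls to zero, and the output is indistinguishable from pure noise.

    from math import log2

    def binary_entropy(p: float) -> float:
        """H(p) = -p*log2(p) - (1-p)*log2(1-p), the entropy of a coin with bias p."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def bits_per_received_bit(p: float) -> float:
        """Information each output bit carries about the input bit (uniform input, flip prob p)."""
        return 1.0 - binary_entropy(p)

    for p in (0.0, 0.01, 0.1, 0.25, 0.5):
        print(f"flip probability {p:4.2f} -> {bits_per_received_bit(p):.3f} bits per bit")
    # At 0.50 the answer is 0.000: the received symbols are pure noise about the input.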

130 posted on 04/03/2009 12:27:25 PM PDT by spunkets
[ Post Reply | Private Reply | To 101 | View Replies]

To: GodGunsGuts; Alamo-Girl; r9etb; CottShop
But then Dr. Gitt goes on to explain the disadvantages of Dr. Shannon’s theory:

“The main disadvantage of Shannon’s definition of information is that the actual contents and impact of messages were not investigated. Shannon’s theory of information, ... describes information from a statistical viewpoint only....”

Dr. Shannon was not concerned about "the actual contents and impact of messages." He was only interested in the "mechanics" of successful communication of any message, not what it "means," let alone how it shapes behavior. So why is Dr. Gitt "picking on him" for failing to do something that was completely removed from Shannon's own concern and intent?

In other words, in Gitt’s opinion, Shannon’s theory is a start, but there are many aspects of information that his theory does not explain, which is precisely what Gitt attempts to do in his book. I view such attempts as a positive development.

Of course there are "many aspects of information that his theory does not explain." But Shannon never sought to explain all aspects of information, only the more limited problem of how it is successfully communicated.

As Alamo-Girl has already pointed out, Dr. Gitt unfairly makes a strawman out of Dr. Shannon. Shannon and Gitt are/were not even working on the same problem.

[Dr. Shannon passed away in 2001, in Medford, Massachusetts — so sadly, of complications of Alzheimer's disease. May God rest his soul.]

131 posted on 04/03/2009 12:28:45 PM PDT by betty boop (All truthful knowledge begins and ends in experience. — Albert Einstein)
[ Post Reply | Private Reply | To 122 | View Replies]

To: Blood of Tyrants
"If you are so adamant then you should easily be able to prove it untrue. You can’t."

After you've learned what a theorem is, you can work on your logic. A negative can not be proved.

132 posted on 04/03/2009 12:33:49 PM PDT by spunkets
[ Post Reply | Private Reply | To 58 | View Replies]

To: spunkets; betty boop
It is important that the message actually be a message, because a distinction is made between messages and noise.

Indeed.

For instance, as applied to molecular biology, viruses can be seen as noise.

And "noise" does not necessarily mean harmful - noise could be a broadcast (or bleeding) as compared to an autonomous communication.


133 posted on 04/03/2009 12:34:13 PM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 130 | View Replies]

To: betty boop
[Dr. Shannon passed away in 2001, in Medford, Massachusetts — so sadly, of complications of Alzheimer's disease. May God rest his soul.]

Amen.

Shannon and Gitt are/were not even working on the same problem.

Precisely so.

Thank you so very much for sharing your insights, dearest sister in Christ!

134 posted on 04/03/2009 12:36:39 PM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 131 | View Replies]

To: AndrewC
But even Feynman was not always rigorous when he was teaching, which is something teachers do.

OTOH, I don't think Dr. Feynman would introduce "theorems" into his teaching without making sure that they were a) actually theorems, and b) offered some sense of the arguments by which those theorems would be proved.

Mr. Gitt fails to do that.

135 posted on 04/03/2009 12:49:36 PM PDT by r9etb
[ Post Reply | Private Reply | To 127 | View Replies]

To: betty boop
[Dr. Shannon passed away in 2001, in Medford, Massachusetts — so sadly, of complications of Alzheimer's disease. May God rest his soul.]

Would it be ghoulish of me to point out the horrific irony of such a man, dying such a death?

136 posted on 04/03/2009 12:51:48 PM PDT by r9etb
[ Post Reply | Private Reply | To 131 | View Replies]

To: spunkets
After you've learned what a theorem is, you can work on your logic. A negative can not be proved.

Perhaps your own logic bears some inspection as well.

I challenge you to prove your assertion that "a negative can not be proved."

137 posted on 04/03/2009 12:53:03 PM PDT by r9etb
[ Post Reply | Private Reply | To 132 | View Replies]

To: spunkets

Spunkets, I believe that you are correct. A theorem has not been posited. But take this word out of his challenge, and he makes a solid point. Read Phil Johnson’s “Defeating Darwinism by Opening Minds”; it includes a thorough explanation of the mystery of information. Blessings, Bob


138 posted on 04/03/2009 1:00:03 PM PDT by alstewartfan
[ Post Reply | Private Reply | To 16 | View Replies]

To: r9etb
a) actually theorems, and b) offered some sense of the arguments by which those theorems would be proved.

For a), I think you might be happier if he used the word "hypothesis". For b), I think the following meets the criteria of "some sense of the arguments..." (a small numeric sketch follows the quoted passage):

We now discuss the question of devising a suitable coding system. For instance, how many different letters are required and how long should the words be for optimal performance? If a certain coding system has been adopted, it should be strictly adhered to (theorem 8, par 4.2), since it must be in tune with extremely complex translation and implementation processes. The table in Figure 19 comprises only the most interesting 25 fields, but it can be extended indefinitely downward and to the right. Each field represents a specific method of encoding, for example, if n = 3 and L = 4, we have a ternary code with 3 different letters. In that case, a word for identifying an amino acid would have a length of L = 4, meaning that quartets of 4 letters represent one word. If we now want to select the best code, the following requirements should be met:

—The storage space in a cell must be a minimum so that the code should economize on the required material. The more letters required for each amino acid, the more material is required, as well as more storage space.

—The copying mechanism described above requires n to be an even number. The replication of each of the two strands of DNA into complementary strands thus needs an alphabet having an even number of letters. For the purpose of limiting copying errors during the very many replication events, some redundance must be provided for (see appendix A 1.4).

—The longer the employed alphabet, the more complex the implementing mechanisms have to be. It would also require more material for storage, and the incidence of copying errors would increase.
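
To put a few numbers behind the quoted requirements (my own back-of-the-envelope sketch in Python, not Dr. Gitt's): an alphabet of n letters with codewords of length L allows n**L distinct words, and naming the 20 standard amino acids plus at least one stop signal needs n**L >= 21. The quoted passage also favors an even n (for complementary copying) and small values of n and L (material economy and fewer copying errors).

    def viable_codes(min_words=21, max_alphabet=6, max_word_len=6):
        """Yield (alphabet size n, word length L, n**L) for the shortest workable L per alphabet."""
        for n in range(2, max_alphabet + 1):
            for L in range(1, max_word_len + 1):
                if n ** L >= min_words:
                    yield n, L, n ** L
                    break  # longer words with the same alphabet only cost more storage

    for n, L, words in viable_codes():
        note = "even alphabet" if n % 2 == 0 else "odd alphabet"
        print(f"n={n}, L={L}: {words:2d} codewords ({note})")

Among the even-alphabet candidates this prints, n = 4 with L = 3 gives 64 codewords, which is the quaternary triplet code DNA actually uses, with the surplus over 21 available as redundancy against copying errors.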


139 posted on 04/03/2009 1:03:39 PM PDT by AndrewC
[ Post Reply | Private Reply | To 135 | View Replies]

To: spunkets
A negative can not be proved.

???


[Image: Figure 17]


140 posted on 04/03/2009 1:08:42 PM PDT by TChris (There is no freedom without the possibility of failure.)
[ Post Reply | Private Reply | To 132 | View Replies]



