
On Plato, the Early Church, and Modern Science: An Eclectic Meditation
November 30, 2004 | Jean F. Drew

Posted on 11/30/2004 6:21:11 PM PST by betty boop



To: StJacques; betty boop; Doctor Stochastic; tortoise; PatrickHenry
Thank you so much for your reply!

Like Rocha, Kauffman appears to see a connection between the way(s) information is transferred and the development of "autonomy" in a biological system, i.e. an "organism."

Indeed, but Kauffman is relying on Maxwell’s demon for the source of his information (“A coevolving community of non-equilibrium Maxwell Demons is a union of matter-energy-information into an organization that proliferates and constructs hierarchical complexity.”). And there appears to be a very good reason in physics why Maxwell’s demon is dead.

He may be close, because certainly the state changes are necessary to bring about autonomy - but the demon is not looking like a good source for information.

Now, I repeat, if RNA has been catalyzed in mineral clays (Ferris) then we have a scenario for the origins of at least some degree of information, since by definition RNA is composed of nucleotide sequences and those sequences in and of themselves are information. But that degree of information -- and I am just guessing here, though I doubt you will disagree -- cannot be of such a "complex" nature that it explains how information originates and functions within an "autonomous agent," a term that is still far removed from RNA that was produced by mere chemical reaction, such as occurred within the experiment Ferris conducted. With that in mind I think the real question that must be asked is not "what is the origin of information in biological systems?" but rather "what is the origin of complex information?" or perhaps even "what is the process of complexification by which information rises to a level necessary for the functioning of autonomous agents?"

Truly, I hate to be picky because we are in fact on the same page here. But “information” in the sense you have used it above (as do ever so many people in all kinds of literature) is tantamount to the message rather than the communication of it (Shannon). The Shannon definition of information is a “reduction of uncertainty in the receiver”. A Mathematical Theory of Communication
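To make that definition concrete for Lurkers, here is a minimal Python sketch (my own toy numbers, not anything from Shannon's paper) of information as a reduction of uncertainty in the receiver:

    import math

    def entropy(probs):
        # Shannon uncertainty H = -sum(p * log2(p)), measured in bits.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Before the message: the receiver holds four equally likely alternatives.
    before = entropy([0.25, 0.25, 0.25, 0.25])   # 2.0 bits of uncertainty

    # After the message: the alternatives have been narrowed to two.
    after = entropy([0.5, 0.5])                  # 1.0 bit of uncertainty

    # The information conveyed is the reduction of uncertainty.
    print(before - after)                        # 1.0 bit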

Indeed, as we look at it "altogether", it looks to us as if we are speaking of complexity. And indeed we are. Once the communication is “bootstrapped” then some form of complexification must ensue – e.g. self-organizing complexity – to account for the rise of the myriad forms of complexity: physical complexity, functional complexity, etc.

To attempt to clarify it a bit more, I’ll borrow the following excerpt from a linked website: Review of Yockey’s book

DNA as a message

In his book, Yockey uses communication theory to study the DNA-RNA-protein system in living organisms. He uses the theory of communication systems not only as a metaphor, but also as a theory to describe, explain, and predict phenomena in molecular biology. Here we have a communication system (telephone or CD player)

in the engineer's world:

Message in source code
  --> Encoder / Transmitter
  --> channel (channel code)   <-- Noise
  --> Decoder / Receiver
  --> Message in destination code

in the biological world:

       
Genetic message in DNA (including tRNA)
  --> transcription into mRNA
  --> channel (mRNA code)   <-- genetic noise: mutations
  --> translation into protein   <-- noise in genetic code: tRNA
  --> Genetic message in protein code

  tRNA --> independent channel (cytoplasma?) --> translation

(the independent channel is not in Yockey's book)

The information in DNA is transmitted to the information in proteins. DNA is encoded information. Proteins are decoded information. tRNA is the decoder or translator. Noise in the engineering system equals mutation in the biological system. Indeed both systems look much the same. On an abstract level, they are the same.

Continuing now with my comments…

The reviewer claims that there is no encoding process in the biological world. I believe Rocha would disagree with him. The reviewer claims that the biological world only decodes, that the genetic code is the decoder device. If it is not encoded, then why would there be any decoding...
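For Lurkers, a small Python sketch of what that "decoder device" does (my own illustration -- the codon table is truncated to three entries, and the sequences are arbitrary):

    # A tiny slice of the genetic code: mRNA codons --> amino acids.
    # (The real table has 64 codons; this excerpt is for illustration only.)
    CODON_TABLE = {"AUG": "Met", "UUU": "Phe", "GGC": "Gly"}

    def translate(mrna):
        # Decode the mRNA message codon by codon (the role tRNA plays).
        return [CODON_TABLE.get(mrna[i:i+3], "???")
                for i in range(0, len(mrna) - 2, 3)]

    print(translate("AUGUUUGGC"))   # ['Met', 'Phe', 'Gly'] -- decoded message

    # Noise in the channel: a point mutation changes one codon (UUU --> UCU).
    print(translate("AUGUCUGGC"))   # ['Met', '???', 'Gly'] -- UCU is Ser in the
                                    # real code, but this truncated table cannot
                                    # decode it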

But going back to the question of what we are looking for: it is a type of “complexity” in that we are seeking to find the source of the communication itself – in the above charts, the arrows which connect the boxes. The other part that we are seeking is the source of the semiosis – the language, the syntax – in the encoding and decoding boxes. Or, if one insists that no encoding has taken or is taking place, then the semiosis in the decoding box.

The message which is being transmitted in the graphic is the DNA. It is often called “information” but that is not the kind of information we are looking for – we are looking for what causes the reduction of uncertainty in the receiver – the Shannon information, successful communication. The DNA itself – like the chemicals themselves – is as good dead as alive. IOW, once that successful communication ends, the biological system is dead.

So if we want to agree to a term for this thing we are looking for – I’d like to suggest a change to your phrase ”what is the origin of complex information?” to ”what is the origin of successful communication at the molecular level?”.

Or give the complexity a moniker like they have with physical complexity v functional complexity v irreducible complexity v Kolmogorov complexity v self-organizing complexity. We could call it "StJacques' Project complexity" (or SJPC for short).


341 posted on 12/17/2004 2:00:40 PM PST by Alamo-Girl
[ Post Reply | Private Reply | To 340 | View Replies]

To: Alamo-Girl
". . . Truly, I hate to be picky because we are in fact on the same page here. But “information” in the sense you have used it above (as do ever so many people in all kinds of literature) is tantamount to the message rather than the communication of it (Shannon). The Shannon definition of information is a “reduction of uncertainty in the receiver”. . . ."

No, I have to disagree. There can be no definition of the "information" that is not expressible in a biochemical formula; otherwise we are discussing something outside of the physical world. The RNA strand -- we may substitute "molecules" for "strand" here -- is the messenger, the "RNA transcriptions," which may include "RNA edits," are the message, and the RNA sequences are the information.

If you see the information as something different, then please say exactly what you think it is and express it in biochemical terms.
342 posted on 12/17/2004 3:10:20 PM PST by StJacques
[ Post Reply | Private Reply | To 341 | View Replies]

To: Alamo-Girl; betty boop; PatrickHenry; Doctor Stochastic; tortoise; cornelis; RadioAstronomer
Let me help put my previous response in context here, to give you an alternative means of responding.

You could alternatively give a biochemical equivalent to a "reduction of uncertainty" in the receiver. This is where the applicability of mathematical theory to microbiology, in this case Shannon Information Theory, must be judged based upon its "translatability" into the language of genetics and biochemistry.

For those of you I have pinged, see my previous post -- #342 -- for the context.
343 posted on 12/17/2004 3:17:41 PM PST by StJacques
[ Post Reply | Private Reply | To 341 | View Replies]

To: tortoise; Doctor Stochastic
This is a special ping for tortoise, though you may be able to jump in here as well Doctor Stochastic.

What is the notational expression for information and/or uncertainty in Shannon Information Theory? I know that "reduced uncertainty" is a matter of "state" and that the Theory of Successful Communication's premise is that communication is "successful" if the information received is identical, or close to identical, to that which was sent, once "noise" in the communication channel is accounted for.

The reason why I ask is that I am wondering if there are coefficients and/or integrable constraints that separate the message and the information contained within that permit the maintenance of "state" within the process of communication, which may be translatable to biochemistry.

Just thought I'd ask.
344 posted on 12/17/2004 3:29:04 PM PST by StJacques
[ Post Reply | Private Reply | To 343 | View Replies]

To: StJacques; betty boop; PatrickHenry; Doctor Stochastic; tortoise; cornelis; RadioAstronomer
For you - and anyone else - trying to express this in some biochemical fashion:

Schneider: Uncertainty, Entropy, and Information


345 posted on 12/17/2004 3:54:55 PM PST by Alamo-Girl
[ Post Reply | Private Reply | To 344 | View Replies]

To: Alamo-Girl; betty boop; PatrickHenry; Doctor Stochastic; tortoise; cornelis; RadioAstronomer
I would like to propose a two-step process of using biological research sources to identify what the "information" is. Go back up the page and look at the diagram and notice that "transcription" is the process that encodes the "message."

Step 1: With respect to identifying what the "message" is I have a biochemical definition of "Transcription" here that may be of help:

"The synthesis of RNA using a DNA template. The process whereby RNA is synthesized from a DNA template."

According to this definition the "synthesized RNA" would be the "message transcribed."
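To see that definition in operation, here is a minimal Python sketch (my own illustration, with an arbitrary template strand -- and simplified, since the real transcription machinery is far more involved):

    # Transcription: each DNA template base pairs with an RNA base
    # (A-U, T-A, G-C, C-G).
    PAIRING = {"A": "U", "T": "A", "G": "C", "C": "G"}

    def transcribe(dna_template):
        # Synthesize the mRNA "message" complementary to the template.
        return "".join(PAIRING[base] for base in dna_template)

    print(transcribe("TACAAAGCC"))   # 'AUGUUUCGG' -- the message transcribed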

Step 2: I would like to post a quote from a page on DNA Replication & RNA Transcription

". . . Central dogma: information is encoded in DNA. To express this information, RNA is transcribed with same coding, then translated into amino acid sequence which folds to form active proteins. . . ."

If the "synthesized RNA" (from Step 1 above) is the [transcribed] "message" that is translated [by the receiver] into "amino acid sequences" (from Step 2) then those sequences, which are the components of the message, must be the "information."

I still contend that the amino acid sequences are the information and I now add that the transcribed RNA is the message. And I believe the diagram you posted shows this, because if the information received is meant to be identical or nearly identical to the information sent and it is clearly stated that the "Genetic Message in Protein Code" is what is received, then the amino acid sequences that constitute the "protein code" must be the "information."

And from my examination of the Schneider paper I believe that the probability notation P_i represents the probability by which uncertainty is expressed as a numerical value, which I believe relates more to "state" than it does to "information" if it is to be applied to microbiology. And though I am not a molecular biologist or biochemist, I see problems with applying that notation scheme.

By the way, we might want to get back to the Schneider paper later. There has been some very recent work by a Brazilian physicist named Tsallis that has called Boltzmann-Gibbs entropy into question when used in molecular biology. I ran into this over a week ago and stopped to read some of it when we were in the midst of another sub-topic, but Tsallis's work seems to be gaining credence. And I mention this because Schneider uses Boltzmann-Gibbs entropy in his calculations. No time for that sidetrack now.
346 posted on 12/17/2004 5:08:12 PM PST by StJacques
[ Post Reply | Private Reply | To 345 | View Replies]

To: StJacques
Shannon's definition of information is mostly concerned with computing how many different messages a given probability distribution can convey. The basic formula for "information" or "entropy" is:

H = -Sum_i p_i lg(p_i), where the sum is taken over all symbols (indexed by i) and lg is the base-two logarithm. For example, with two symbols of equal probability, H = 2 × (-(1/2) × lg(1/2)) = lg(2), or just 1 bit. Thus one can convey the presence or absence of one message. Shannon information measures how many messages a distribution can represent, but not the "meaning" of such information. For example, just before D-Day, the playing of a quotation from the French poet Verlaine’s "Chanson d’Automne" signaled the Resistance that the invasion was imminent, which allowed them to destroy railroad bridges, etc., just before the landings. Only one message was of interest.
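For anyone who wants to experiment, the formula transcribes directly into Python (toy distributions of my own choosing):

    import math

    def H(probs):
        # H = -sum over i of p_i * lg(p_i), in bits (lg = base-two log).
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(H([0.5, 0.5]))    # 1.0  -- two equally likely symbols: one bit
    print(H([0.9, 0.1]))    # ~0.47 -- a biased source represents fewer messages
    print(H([0.25] * 4))    # 2.0  -- four equally likely symbols: two bits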

347 posted on 12/17/2004 9:20:33 PM PST by Doctor Stochastic (Vegetabilisch = chaotisch is der Charakter der Modernen. - Friedrich Schlegel)
[ Post Reply | Private Reply | To 344 | View Replies]

To: Doctor Stochastic; tortoise; Alamo-Girl; PatrickHenry; betty boop; cornelis; RadioAstronomer
Thank you for the explanation Doctor Stochastic.

I was just doing some reading on Shannon before coming back in here. Somehow I think the Yockey work causes some problems, because it looks to me as though Yockey begins with a message and ends with a message. This is the Yockey diagram just posted:


       
[Yockey's diagram, as reconstructed in post 341 above: Genetic message in DNA --> transcription into mRNA --> channel (mRNA code) --> translation into protein --> Genetic message in protein code, with mutations and tRNA as noise sources.]


And this is a diagram I have found used twice to explain the Shannon-Weaver model, without any application to biology:

[Shannon-Weaver model diagram: information source --> transmitter --> signal --> channel (noise source enters here) --> received signal --> receiver --> destination]
Now there is some confusion between the two in that "messages" in the upper image appear to be where "source" and "destination" are in the lower image. So I think there is some work to be done on Yockey's conceptualization -- which I assume the upper image is taken from -- before it can fit.

I have been trying to make sense of Alamo-Girl's insistence that Shannon's definition of information is "a reduction of uncertainty in the receiver" and, based upon my reading, I think this is rather the description of "Successful Communication." The quantitative definition of "information" I found, as stated by Weaver, was "a measure of one's freedom of choice in selecting a message." And since this second definition -- "a measure" -- can be represented quantitatively as a number, whereas the first -- "a reduction" -- comes closer to an operand, I think the second must form the basis for a definition of "information" in biological systems if it is to be represented quantitatively, as Shannon's model implies it must. And it appears to me that, using this construct, the "information" is the range of choices available in transcription, what Kauffman referred to as the "Adjacent Possible." But I want to think on this a bit. I've been doing a lot of that in this thread.
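Here is a quick sketch of the quantitative side of Weaver's definition (my own arithmetic, assuming the choices are equally likely, in which case the "measure of freedom of choice" reduces to the base-two logarithm of the number of available messages):

    import math

    # Weaver: information as "a measure of one's freedom of choice in
    # selecting a message." With N equally likely messages: log2(N) bits.
    def freedom_of_choice(n_messages):
        return math.log2(n_messages)

    print(freedom_of_choice(2))    # 1.0 bit   -- a choice between two messages
    print(freedom_of_choice(64))   # 6.0 bits  -- e.g. the 64 possible codons
    print(freedom_of_choice(20))   # ~4.3 bits -- the 20 amino acids received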

And I think I remember the poem, sent one line at a time, and the example is a good one to illustrate "noise" in the channel that surrounds information.

Les sanglots longs des violons de l’automne
Blessent mon cœur d’une langueur monotone

("The long sobs of autumn's violins wound my heart with a monotonous languor.")


One more thing, Alamo-Girl, I think I have been thinking in too practical a manner in my previous series of posts without stopping to delve into the theoretical underpinnings of Shannon communication theory when advancing definitions, so I hope you will show me some patience. I still want the definition to be "translatable" in some fashion to the language of molecular biology, but we're not there yet.
348 posted on 12/17/2004 10:41:34 PM PST by StJacques
[ Post Reply | Private Reply | To 347 | View Replies]

To: StJacques; Doctor Stochastic; betty boop; tortoise; PatrickHenry
Thank you for your replies, StJacques! And thank you for the definition, Doctor Stochastic!

I will stay in the background until you’ve had a chance to consider Shannon’s Mathematical Theory of Communication and how it is being used in molecular biology.

One little heads up with regard to the "message". It is very common in normal discourse to associate information with meaning. But in Shannon’s definition, information does not address meaning (the message). Here’s a thread on the Chowder Society where that very issue was hotly contested.

For Lurkers, here is a definition of the Central Dogma of Molecular Biology from Wikipedia:

The central dogma of molecular biology (sometimes Crick's central dogma after Francis Crick who coined the term and discovered some of the principles) states that the flow of genetic information is "DNA to RNA to protein". With a few notable exceptions, all biological cells conform to this rule.

In layspeak, what this means is that, contrary to what Lamarck thought, the characteristics and habits of one generation are not passed on to the next. Whereas the environment can cause changes to the outward character of an organism (phenotype), there is no mechanism whereby those changes can directly alter the organism’s genes (genotype).

The concept does not actually originate in biology, but in mathematics. From the same review website that had the charts we are discussing:

Finally, one example what information theory can do: we can learn from information theory that the famous 'Central Dogma' is not a first principle of molecular biology at all. The Central Dogma is a property of any code in which the source alphabet is larger than the destination alphabet. "Although many people feel that the Central Dogma belongs only to biology, we must, nevertheless, render unto biology that which is biology's and to mathematics that which belongs to mathematics!". These delightful insights illustrate the originality of Hubert Yockey.
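The mathematical point can be seen in miniature with a toy slice of the genetic code (my own illustration): when the source alphabet (64 codons) is larger than the destination alphabet (20 amino acids plus stop), the code is many-to-one, so it can be decoded but never inverted:

    # Four entries from the genetic code, which maps 64 codons onto only
    # 20 amino acids (plus stop signals) -- a many-to-one code.
    code = {"UUU": "Phe", "UUC": "Phe", "GGA": "Gly", "GGC": "Gly"}

    print(code["UUU"])   # 'Phe' -- decoding is well-defined

    # But the inverse is one-to-many: which codon produced 'Phe'?
    print([c for c, aa in code.items() if aa == "Phe"])
    # ['UUU', 'UUC'] -- the protein cannot determine the DNA;
    # that is the Central Dogma in information-theoretic dress.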


349 posted on 12/17/2004 11:13:28 PM PST by Alamo-Girl
[ Post Reply | Private Reply | To 348 | View Replies]

To: Alamo-Girl; StJacques; Doctor Stochastic; cornelis; PatrickHenry; marron; tortoise
Kauffman’s hypotheses are quite engaging, but so far I haven’t read where he has specifically addressed how information itself (communication) and the required symbolization emerge within the autonomous agents – which are the two issues central to Rocha, Yockey, et al. But he does a great job laying out the environment which would be required for such to emerge autonomously.... There is another huge stumbling block to his hypotheses, Maxwell’s demon.

Hello A-G!!! Yes, Kauffman's hypotheses are indeed engaging and intriguing! And yes, that Maxwell's Demon -- which is essentially what his "Autonomous Agent" logically reduces to -- is a huge stumbling block. But oh how "charming" I find that demon to be! For as Kauffman notes, "no physical theory ... comfortably unites matter, energy and information in a single dynamical framework." Yet this is precisely what Maxwell's Demon does. Still, as you note, Kauffman has not "specifically addressed how information itself (communication) and the required symbolization emerge within the autonomous agents." Nevertheless, Kauffman's "coevolutionary, autocatalyzing autonomous agent" formally opens up this question to further investigation, and I think that is wonderful.

I also was captivated by his concept of the "adjacent possible" into which non-ergodic (I generalize this to mean living) systems "might tend to flow towards maximum complexity"; and also his extremely provocative suggestions regarding a "span" between the quantum and classical worlds. Kauffman writes, "space (or space and mass-energy) might conceivably be comprised of autocatalytic autonomous Planck scale agents coevolving with one another." A statement like that simply takes your breath away!

Kauffman himself states that he is not "doing science" here, but rather "protoscience." Meaning: he seeks to open up the conceptual space in which science can (hopefully) fruitfully proceed in the development of its work. I certainly agree that he has succeeded in doing just that in this article.

Still, I doubt that his hypothesis can confirm abiogenesis, in the sense that it appears abiogenesis is not truly an autonomous process. A community of coevolving autonomous agents ever flowing into "the adjacent possible" seems to require that said agents make decisions that conduce to the "possible." Meaning that not all decisions necessarily would be "possible" ones. In which case they will come to naught (Possibly the agent might perish as a result of a "bad" or "wrong" decision). And this observation points to a deeper-level principle at work that is not addressed in Kauffman's hypothesis.

I also have the Schneider article, which I hope to have a chance to read soon; also Peter Corning's on "Thermoeconomics" (ditto). This afternoon, however, I have to help my better half stack five cords of firewood. :^) (O lucky me!) But I hope to be back this evening.

Thank you so much for your wonderful essay, and for pointing me to Kauffman! He really is superb.

350 posted on 12/18/2004 9:55:51 AM PST by betty boop
[ Post Reply | Private Reply | To 339 | View Replies]

To: StJacques; Doctor Stochastic; cornelis; PatrickHenry; marron; tortoise
There can be no definition of the "information" that is not expressible in a biochemical formula, otherwise we are discussing something outside of the physical world.

Indeed, StJacques. But that's the entire point: We are discussing something here that is irreducible to biochemical formulaic terms because it is in essence something "extra" or "meta" to the physical world, yet without which the physical world would not have the form it has. You cannot rule out such by fiat. FWIW.

This has been the most fascinating discussion! Thank you so much!

351 posted on 12/18/2004 10:04:29 AM PST by betty boop
[ Post Reply | Private Reply | To 342 | View Replies]

To: betty boop; Alamo-Girl
I'm just stopping by for a moment this afternoon before coming back in later tonight. I've caught up and read your replies and I'll have some responses later this evening -- I hope -- but we've got friends coming over for a little Christmas celebration and I won't be able to get back here for at least another four hours.

A quick side note here Alamo-Girl. I have spent two-plus hours reading Shannon's original paper and some additional web pages on the theory, and I plan to do a little more before I try to say something concrete. But I can at least give you a "heads up" that I am trying to conceive of a way to define "biological information" in quantitative terms, because that is how it is used in Shannon Information Theory. And I may stop to create a web image of my own to show what I think the process should look like.

I'll be back later. And I really am enjoying this wide-ranging discussion to which both of you have contributed so much of your time and energy in a genuinely good-willed spirit of exchange. I'm trying to live up to the same standard in return.
352 posted on 12/18/2004 3:08:20 PM PST by StJacques
[ Post Reply | Private Reply | To 351 | View Replies]

To: StJacques; Alamo-Girl
I am trying to conceive of a way to define "biological information" in quantitative terms, because that is how it is used in Shannon Information Theory. And I may stop to create a web image of my own to show what I think the process should look like.

Truly I'm looking forward to your description, StJacques! It is such a busy time of year. I'll be patient. Thank you for your excellent contributions thus far.

Meanwhile, enjoy your company this evening! 'Tis the season for that.

353 posted on 12/18/2004 3:27:00 PM PST by betty boop
[ Post Reply | Private Reply | To 352 | View Replies]

To: betty boop
Thank you oh so very much for all your excellent posts!

Kauffman writes, "space (or space and mass-energy) might conceivably be comprised of autocatalytic autonomous Planck scale agents coevolving with one another." A statement like that simply takes your breath away!

I absolutely agree with you! Sadly, we have put string theory on the back burner for this particular discussion, but keeping Kauffman's speculation in mind, you might want to take another quick read of Geometry and String Theory - particularly around the last page.

Evidently this physicist would be working somewhere around Cumrun Vafa (Harvard) - so I'm not surprised that he surmises that geometry may actually be the source of strings! This is very close to both Max Tegmark's Level IV universe and to Kauffman's speculation of a space/time agency at work in the becoming of the whole universe.

That geometry - and not Maxwell's demon - would IMHO be a much better candidate source for information in the universe. It would also go a long way toward making sense of all existents being describable by mathematical structures, and of the unreasonable effectiveness of math.

It also would point (like the beginning of time) to an uncaused cause, i.e. God. So the metaphysical naturalists wouldn't likely care much for it. LOL!

354 posted on 12/18/2004 4:22:11 PM PST by Alamo-Girl
[ Post Reply | Private Reply | To 350 | View Replies]

To: StJacques
Thank you so much for the heads up! I look forward to your reply!

Like you, I have company today and will be out of town tomorrow - so I guess we will be playing post tag for a few days. LOL!

355 posted on 12/18/2004 4:23:53 PM PST by Alamo-Girl
[ Post Reply | Private Reply | To 352 | View Replies]

To: Alamo-Girl; betty boop; StJacques; PatrickHenry; Doctor Stochastic

No posts since the 18th.

I miss your posts.

Or, is it time to put my dictionary back?

Merry Christmas to all of you.

PS: please come back.


356 posted on 12/21/2004 1:02:26 PM PST by Baraonda (Demographic is destiny. Don't hire 3rd world illegal aliens nor support businesses that hire them.)
[ Post Reply | Private Reply | To 355 | View Replies]

To: Baraonda
LOLOL! I can almost hear the crickets too!

But it is for a good purpose. StJacques is doing research and thinking everything through to prepare a response for us. His posts are excellent and very much worth the wait.

Merry Christmas, Baraonda! Hugs!

357 posted on 12/21/2004 8:22:41 PM PST by Alamo-Girl
[ Post Reply | Private Reply | To 356 | View Replies]

To: Baraonda; Alamo-Girl; StJacques
LOL Baraonda! Yep, I hear the crickets chirping too!

It's a busy season. But there's more coming on this thread. As A-G noted, StJacques is researching a most intriguing question. Plus I'm cooking up some new stuff, too. Stay tuned!

Thanks for writing! Merry Christmas!!!

358 posted on 12/22/2004 6:28:58 AM PST by betty boop
[ Post Reply | Private Reply | To 356 | View Replies]

To: betty boop; Alamo-Girl; StJacques
Christmas greetings from PatrickHenry: Christmas 1776.
359 posted on 12/24/2004 10:28:34 AM PST by PatrickHenry (The List-O-Links for evolution threads is at my freeper homepage.)
[ Post Reply | Private Reply | To 358 | View Replies]

To: PatrickHenry

Merry Christmas, dear Patrick!


360 posted on 12/25/2004 9:11:17 AM PST by betty boop
[ Post Reply | Private Reply | To 359 | View Replies]

