Free Republic

To: gore3000; Doctor Stochastic; betty boop; tortoise
I understood Doctor Stochastic’s statement to mean something like this:

Whatever the ASCII length of one (or all) of Shakespeare's works, create a program which randomly generates bit strings of that length. Eventually the program will generate a matching bit string.

That is no doubt true, but IMHO it does not necessarily tell us much about the genetic code. First of all, the amount of time in the geological record is finite, so one question is the probability of yielding a particular bit string given a certain rate of production and window of opportunity.
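To put a rough number on that first question, here is a minimal sketch of my own (the 320-bit target and the 10^18 trial count are hypothetical, chosen only to show the scale):

import math

# Chance that at least one of k independent, uniform random n-bit
# strings equals one specific target string: p = 1 - (1 - 2**-n)**k.
def hit_probability(n_bits, k_trials):
    # expm1/log1p keep the arithmetic stable when the per-trial odds are tiny
    return -math.expm1(k_trials * math.log1p(-(2.0 ** -n_bits)))

# A single 40-character ASCII line is 320 bits; even a quintillion
# random tries leave the hit probability around 4.7e-79.
print(hit_probability(320, 10 ** 18))

Even at an absurd rate of production, the window closes long before a modest target string becomes likely.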

Secondly, the theory of evolution suggests that the content of the bit string becomes more complex over time, a stair-stepping effect. So the window-of-opportunity questions must be construed more narrowly, step by step.

Thirdly, and most importantly, the steps of increasing complexity are functional - they have information content. That also must be explained.

That’s why I’m raising the question of how we can evaluate the genetic code of living creatures. From what I have read, the issue comes down to entropy vs. complexity. But as with the multiverse models, the question of motive is also in the mix (emphasis mine):

At this website the reader is advised to use the difference to argue against creationists:

Gregory Chaitin, Ray Solomonoff, and Andrei Kolmogorov developed a different view of information from that of Shannon. Rather than considering the statistical ensemble of messages from an information source, algorithmic information theory looks at individual sequences of symbols. Here, information H(X) is defined as the minimum size of a program necessary to generate the sequence X.

This article highlights some of the main concepts of Algorithmic Information Theory. Knowledge of this subject can be useful when arguing with Creationists whose use of concepts from both Classical and Algorithmic Information Theory tends to be sloppy. This article is only a survey of the major ideas. For the mathematical proofs, see the cited material.
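To make the quoted definition of H(X) concrete, here is a small illustration of my own (not from the article); the ordered string and its "program" are toy examples:

# H(X): the length of the shortest program that outputs the string X.
ordered_x = "01" * 500_000             # a million highly ordered characters

# A generating "program" for ordered_x, written out as Python source:
program = 'print("01" * 500_000)'
print(len(program), "characters of program vs.", len(ordered_x), "of output")

# A typical random string of the same length has no such shortcut:
# nothing much shorter than quoting it verbatim will reproduce it,
# so its H(X) is close to the length of the string itself.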

At the Origin of Life prize website we are shown the meaning of entropy (emphasis mine):

By "entropy" as it relates to information theory, the Foundation adopts Hubert P. Yockey's distinction between Maxwell-Boltzmann-Gibbs entropy, Shannon probability-distribution entropy, and Kolmogorov-Solomonoff-Chaitin sequence/algorithmic complexity. (See Information Theory and Molecular Biology, Cambridge University Press, 1992, sections 2.2.2 and 2.4.1 - 2.4.6). (See also, Yockey, H.P., (1974) "An application of information theory to the Central Dogma and the sequence hypothesis." Journal of Theoretical Biology, 46, 369-406, and Yockey, H.P.(1981) Self Organization, Origin of Life Scenarios, and Information Theory, J. Theor. Biology, 91, 13-31, and Yockey, H.P. (2000) Origin of life on earth and Shannon's theory of communication, Comput Chem, 24, 1, pp 105-123) Yockey argues that there is no "balancing act" between algorithmic informational entropy and Maxwell-Boltzmann-Gibbs-type entropy. The two are not on the same see-saw. The two probability spaces are not isomorphic. Information theory lacks the integral of motion present in thermodynamics and statistical mechanics. In addition, there is no code linking the two "alphabets" of stochastic ensembles. Kolmogorov-Solomonoff-Chaitin complexity does not reside in the domain of stochastic ensembles of statistical mechanics. They have no relation despite endless confusion and attempts in the literature to merge the two.

"Highly ordered" is paradoxically opposite from "complex" in algorithmic-based information theory. The emergent property of "instructions," "organization," and the "message" of "messenger biomolecules" is simply not addressed in Maxwell-Boltzmann-Gibbs equations of heat equilibration and energy flux between compartments. Surprisingly, the essence of genetic "prescriptive information" and "instructions" is not addressed by current "information theory" either. Shannon information theory concerns itself primarily with data transmission, reception, and noise-reduction processing without regard for the essence of the "message" itself.

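That "highly ordered is opposite from complex" point can be shown numerically. A minimal sketch of mine (not the Foundation's), using compressed size as a computable stand-in for algorithmic complexity:

import math
import zlib
from collections import Counter

def shannon_bits_per_symbol(s):
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in Counter(s).values())

s = b"AB" * 5000                       # highly ordered, 10,000 bytes

# Shannon statistics see only the 50/50 symbol mix: 1.0 bit per symbol.
print(shannon_bits_per_symbol(s))

# Algorithmically the string is nearly trivial: it compresses to a few
# dozen bytes, where 10,000 random bytes would not compress at all.
print(len(zlib.compress(s, 9)), "bytes after compression, of", len(s))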
The Foundation questions whether "order," physical "complexity," or "shared entropy" are synonymous with "prescriptive information," "instructions," or "organization." Christoph Adami emphasizes that information is always "about something, and cannot be defined without reference to what it is information about." It is "correlation entropy" that is "shared" or "mutual." Thus, says Adami, "Entropy can never be a measure of complexity. Measuring correlations within a sequence, like Kolmogorov and Chaitin (and Lempel-Ziv, and many others) is not going to reveal how that sequence is correlated to the environment within which it is to be interpreted. Information is entropy "shared with the world," and the amount of information a sequence shares with its world represents its complexity." (Personal communication; see also PNAS, April 25, 2000, 97, #9, 4463-4468).
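Adami's "shared" or "mutual" entropy is the standard quantity I(X;Y) = H(X) + H(Y) - H(X,Y). A small sketch of mine, with made-up sequences standing in for organism and environment:

import math
from collections import Counter

def H(seq):
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

x = "ACGTACGTACGT" * 100     # a made-up sequence
y = x                        # an "environment" perfectly correlated with x
z = "AAAACCCCGGGG" * 100     # same letter statistics, independent of x

print(H(x) + H(y) - H(list(zip(x, y))))   # = H(x) = 2.0 bits: fully shared
print(H(x) + H(z) - H(list(zip(x, z))))   # ~ 0 bits: nothing shared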

Differences of perspective among information theorists are often definitional. "Complexity" and "shared entropy" (shared uncertainty between sender and receiver) have unfortunately often been used synonymously with "prescriptive information (instruction)." But are they the same? Mere complexity and shared entropy seem to lack the specification and orchestrational functionality inherent in the genetic "instruction" system of translation.

The confusion between algorithmic instruction and Maxwell-Boltzmann-Gibbs entropy may have been introduced through the thought experiment imagining Maxwell's Demon - a being exercising intelligent choice over the opening and closing of a trap door between compartments. Statistical mechanics has no empirical justification for the introduction of purposeful control over the trap door.

Solar energy itself has never been observed to produce prescriptive information (instruction/organization). Photons are used by existing instructional mechanisms which capture, transduce, store, and utilize energy for work. Fiber optics is used by human intelligence to transmit meaningful prescriptive information (instruction) and message. But raw energy itself must not be confused with functional prescriptive information/instructions. The latter is a form of algorithmic programming. Successions of certain decision-node switch settings determine whether a genetic "program" will "work" to accomplish its task.


567 posted on 06/27/2003 7:30:39 AM PDT by Alamo-Girl


To: Alamo-Girl
Whatever the ASCII length of one (or all) of Shakespeare's works, create a program which randomly generates bit strings of that length.

No, I made a much stronger claim. The string that I constructed, 11011100101110111... (concatenating the integers in binary), will contain the works of Shakespeare somewhere within the string. It's constructive.
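A minimal sketch of the construction (the search target below is arbitrary):

from itertools import count, islice

# Concatenate 1, 10, 11, 100, 101, ... written in binary. Every finite
# bit pattern occurs: '1' + pattern, read as a binary numeral, must
# eventually appear as one of the concatenated integers.
def champernowne_bits():
    for n in count(1):
        yield from bin(n)[2:]          # bin() yields '0b...'; drop the prefix

print("".join(islice(champernowne_bits(), 17)))   # 11011100101110111

target = "0100001"                     # an arbitrary pattern to locate
window = ""
for i, bit in enumerate(champernowne_bits()):
    window = (window + bit)[-len(target):]
    if window == target:
        print("found, ending at bit", i)
        break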

568 posted on 06/27/2003 7:50:48 AM PDT by Doctor Stochastic (Vegetative = chaotic is the character of the moderns. - Friedrich Schlegel)

To: Alamo-Girl
Measuring correlations within a sequence, like Kolmogorov and Chaitin (and Lempel-Ziv, and many others) is not going to reveal how that sequence is correlated to the environment within which it is to be interpreted.

This is so interesting, A-G. It seems that people making descriptions so often forget that their descriptions are of something that will continue to "be there" regardless of whether it is described or not. The description is not autonomous; neither is it the equal of what it describes. I'd say, logically it never can be. Even in instances where it is completely "accurate," it is an approximation of, or a mental reduction of, something greater than itself, the existence of which does not depend on our thought process. But any description is only as good as how well and faithfully it correlates with the "actual object" it describes. FWIW.

578 posted on 06/27/2003 10:09:24 AM PDT by betty boop (Nothing is outside of us, but we forget this at every sound. -- Nietzsche)

To: Alamo-Girl
Knowledge of this subject can be useful when arguing with Creationists whose use of concepts from both Classical and Algorithmic Information Theory tends to be sloppy.

Seems to me that this person wants to win not by honestly setting forth the facts but by trying to baffle opponents with unfamiliar terms. It does not seem very productive to me; it is more an attempt to win at any cost, and the cost in this case is the truth.

In a previous post I asked a question which seems to me relevant but which I have not seen answered here or in the articles I have read on the subject: comparisons of the Kolmogorov complexity of specific strings such as Shakespeare, the DNA code, etc. It seems to me that the strongest test of a theory comes when one tries to apply it to real-life problems, and I have not seen the specific applications of this theory anywhere. I know that you have read much on this; have you seen any such examples?
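To be concrete, the kind of comparison I have in mind might look like this rough sketch, which uses compressed size as a computable stand-in for Kolmogorov complexity (the true quantity is uncomputable); the inputs are placeholders, not real data:

import zlib

# Compressed bits per input character: a computable upper-bound
# estimate, since true Kolmogorov complexity cannot be computed.
def bits_per_char(s):
    return 8 * len(zlib.compress(s, 9)) / len(s)

text = b"To be, or not to be, that is the question. " * 200
dna = b"ACGTTGCAGGTA" * 800            # placeholder, not a real genome
print(bits_per_char(text), bits_per_char(dna))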

591 posted on 06/27/2003 7:21:37 PM PDT by gore3000 (Intelligent people do not believe in evolution.)
