To: elfman2; betty boop
Thank you so much for your reply and your research! I enjoyed visiting the links!

The messages from Talk-Origins are dated 1992 and are in reference to an article by Yockey dated 1977.

However, it was in 1992 that Yockey’s book Information Theory and Molecular Biology was printed. Yockey is working on a second edition of that book, which will be out shortly.

In other words, the criticisms you found are considerably out of date. Before we discuss this further, you might want to read the 1996 criticisms of his 1992 book – and Yockey’s response – in the Chowder Society.

Yockey’s critics today narrow down to the methods he used to determine information content – e.g., whether to use Shannon entropy, Kolmogorov complexity/Solomonoff induction, etc.

Because Shannon entropy measures the uncertainty of a communication whereas information measures its success – and the biological mechanism involves such communication – I have no problem with Shannon entropy. But either way, the information opportunity issues remain – which is the point of the Origin of Life prize (updated 11/2003):

By "entropy" as it relates to information theory, the Foundation adopts Hubert P. Yockey's distinction between Maxwell-Boltzmann-Gibbs entropy, Shannon probability-distribution entropy, and Kolmogorov-Solomonoff-Chaitin sequence/algorithmic complexity. (See Information Theory and Molecular Biology, Cambridge University Press, 1992, sections 2.2.2 and 2.4.1 - 2.4.6). (See also, Yockey, H.P., (1974) "An application of information theory to the Central Dogma and the sequence hypothesis." Journal of Theoretical Biology, 46, 369-406, and Yockey, H.P.(1981) Self Organization, Origin of Life Scenarios, and Information Theory, J. Theor. Biology, 91, 13-31, and Yockey, H.P. (2000) Origin of life on earth and Shannon's theory of communication, Comput Chem, 24, 1, pp 105-123) Yockey argues that there is no "balancing act" between algorithmic informational entropy and Maxwell-Boltzmann-Gibbs-type entropy. The two are not on the same see-saw. The two probability spaces are not isomorphic. Information theory lacks the integral of motion present in thermodynamics and statistical mechanics. In addition, there is no code linking the two "alphabets" of stochastic ensembles. Kolmogorov-Solomonoff-Chaitin complexity does not reside in the domain of stochastic ensembles of statistical mechanics. They have no relation despite endless confusion and attempts in the literature to merge the two.

"Highly ordered" is paradoxically opposite from "complex" in algorithmic-based information theory. The emergent property of "instructions," "organization," and the "message" of "messenger biomolecules" is simply not addressed in Maxwell-Boltzmann-Gibbs equations of heat equilibration and energy flux between compartments. Surprisingly, the essence of genetic "prescriptive information" and "instructions" is not addressed by current "information theory" either. Shannon information theory concerns itself primarily with data transmission, reception, and noise-reduction processing without regard for the essence of the "message" itself.

The Foundation questions whether "order," physical "complexity," or "shared entropy" are synonymous with "prescriptive information," "instructions," or "organization." Christoph Adami emphasizes that information is always "about something, and cannot be defined without reference to what it is information about." It is "correlation entropy" that is "shared" or "mutual." Thus, says Adami, "Entropy can never be a measure of complexity. Measuring correlations within a sequence, like Kolmogorov and Chaitin (and Lempel-Ziv, and many others) is not going to reveal how that sequence is correlated to the environment within which it is to be interpreted. Information is entropy 'shared with the world,' and the amount of information a sequence shares with its world represents its complexity." (Personal communication; see also Proc. Natl. Acad. Sci. USA 97(9), 4463-4468, April 25, 2000.)
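Adami's "shared entropy" can be illustrated with mutual information, I(X;Y) = H(X) + H(Y) - H(X,Y): the entropy a sequence shares with its environment. The sketch below is my own toy, with hypothetical sequences (not Adami's):

    import math
    import random
    from collections import Counter

    def entropy(symbols):
        counts = Counter(symbols)
        n = len(symbols)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def mutual_information(xs, ys):
        # I(X;Y) = H(X) + H(Y) - H(X,Y): entropy "shared with the world".
        return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

    environment = "ACGT" * 25
    tracking = environment                                   # perfectly tracks its world
    shuffled = random.sample(environment, len(environment))  # same counts, correlation destroyed

    print(mutual_information(environment, tracking))  # 2.0 bits: fully "about" its world
    print(mutual_information(environment, shuffled))  # small (not exactly 0 in a finite sample)

Note that `tracking` and `shuffled` have identical per-symbol entropy; only the mutual information distinguishes the sequence that is actually about its environment.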

Differences of perspective among information theorists are often definitional. "Complexity" and "shared entropy" (shared uncertainty between sender and receiver) have unfortunately often been used synonymously with "prescriptive information" (instruction). But are they? Mere complexity and shared entropy seem to lack the specification and orchestrational functionality inherent in the genetic "instruction" system of translation. The confusion between algorithmic instruction and Maxwell-Boltzmann-Gibbs entropy may have been introduced through the thought experiment imagining Maxwell's Demon – a being exercising intelligent choice over the opening and closing of a trap door between compartments. Statistical mechanics has no empirical justification for the introduction of purposeful control over the trap door.

Solar energy itself has never been observed to produce prescriptive information (instruction/organization). Photons are used by existing instructional mechanisms which capture, transduce, store, and utilize energy for work. Fiber optics is used by human intelligence to transmit meaningful prescriptive information (instruction) and messages. But raw energy itself must not be confused with functional prescriptive information/instructions. The latter is a form of algorithmic programming. Successions of certain decision-node switch settings determine whether a genetic "program" will "work" to accomplish its task.
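The "decision-node switch settings" picture can be put in miniature: a program that works only under one configuration of n binary switches is hit by blind sampling with probability 2^-n. The settings below are entirely hypothetical, just to show the combinatorics:

    import random

    REQUIRED = (1, 0, 1, 1, 0, 0, 1, 0)  # hypothetical "functional" switch settings

    def program_works(switches):
        # The "program" accomplishes its task only if every decision node is set correctly.
        return tuple(switches) == REQUIRED

    trial = tuple(random.randint(0, 1) for _ in REQUIRED)
    print(program_works(trial))  # True with probability 2**-8 = 1/256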

Other questions to be considered in submissions, from that link (the 9th is excerpted below):

2. Is life autonomous?
3. Does life display Negentropy?
4. Specified aperiodic complexity
5. Algorithmic instruction
6. The source of genetic information in nature
7. Genetic code
8. Scaffolding models
9. Biochemical correlation

a. The hypothetical mechanism must demonstrate correspondence with "the real world" of biochemistry.

b. The submission must provide adequate empirical support strongly suggesting that such a hypothetical scenario can take place naturally in a prebiotic environment. Simulation of abiogenesis must be independent of the factor of human intelligence that is so often subconsciously incorporated into computer hardware/software experimental design and simulation.

c. Thermodynamic realities must be clearly addressed, including specific discussion of any supposed pockets of momentary exception to the Second Law of increasing Maxwell-Boltzmann-Gibbs entropy. The Foundation's view is that Prigogine's dissipative structures, and life itself, operate within the constraints of the 2nd Law.

Maxwell-Boltzmann-Gibbs entropy must not be confused with statistical Shannon entropy or Kolmogorov-Chaitin-Solomonoff-Yockey "complexity." The latter two are nonphysical, abstract, mathematical constructs. All physical matrices of prescriptive information retention, however, are subject to the plight of Maxwell-Boltzmann-Gibbs entropy: they manifest a tendency toward deterioration in both closed and open systems. Repair mechanisms for these messenger biomolecules therefore require all the more instructional integrity at the start. Prescriptive information would have been necessary in any primordial life form's genome to correct for continuous noise corruption of its functional protogenes. Deterioration of an existing recipe in a physical matrix is far more probable than the spontaneous writing of new, conceptually complex metabolic algorithms (a toy sketch of this decay bias follows the list below). Building-block synthesis, for instance, would have required something like a reductive citric acid cycle, and there are no simple algorithms for integrating such a multistep, highly directional pathway.

d. Empirical support does not have to be original research, but can be gleaned from existing scientific literature. Previously published empirical support must be described in detail and well referenced within the applicant's published research paper, explaining exactly how those controlled observations demonstrate empirical correlation with the applicant's theory.

10. "Design" anthropomorphisms
11. Appeals to unknown laws
12. Infinity issues
13. Computerization
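Returning to 9c above, here is the toy decay sketch promised there: for a mostly correct "recipe," a random point mutation is far more likely to break a matching site than to fix a broken one, so an unrepaired message drifts away from function. The sequences are my own hypothetical values, not the Foundation's:

    import random

    ALPHABET = "ACGT"
    target = "ACGT" * 10                                 # the "functional" recipe, 40 symbols
    recipe = list(target)
    for i in random.sample(range(len(recipe)), 4):       # start with 4 corrupted sites
        recipe[i] = random.choice(ALPHABET.replace(recipe[i], ""))

    def matching_sites(seq):
        return sum(a == b for a, b in zip(seq, target))

    before = matching_sites(recipe)                      # exactly 36 of 40 correct
    pos = random.randrange(len(recipe))
    recipe[pos] = random.choice(ALPHABET)                # one random point mutation
    print(matching_sites(recipe) - before)               # usually <= 0: most sites can only degrade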

685 posted on 11/25/2003 9:50:56 AM PST by Alamo-Girl


To: Alamo-Girl
"Yockey’s critics today narrow down to the methods he used to determine information content"

Alamo-Girl, it’s nice to be debating with a polite partner.

Yes, I mentioned those dates, but Yockey’s cytochrome c improbability claim, which you referenced in support of disproving abiogenesis, is still the point refuted in the discussion I linked to.

I don’t doubt that there are “other” criticisms of Yockey that would be interesting to investigate, but I don’t recognize them as addressing the two listed rebuttals in my last post regarding the improbability of cytochrome c, at least not directly.

Ancillary debates over Shannon entropy, Kolmogorov complexity, aperiodic complexity, autonomy, negentropy, scaffolding, and "design" anthropomorphisms are not a journey I can afford to travel now, but if you can summarize why they’re fundamental to a defense against the refutation in my last post, I’ll try to find the time to investigate.

688 posted on 11/25/2003 11:12:26 AM PST by elfman2
