Free Republic

To: Alamo-Girl
Sorry, I was busy until late last night sanding drywall, getting our guest room ready for the holidays.

I did a quick Google Groups search to see what people were posting about the first author on the list, Yockey. It looks like his work is irrelevant to current biogenesis concerns for the same reasons as Hoyle's. Specifically:

1) Calculating odds in reverse: “that because cytochrome c is what we have now, the reactions had to select it out of all the possible combinations. As the reactions occurred, cytochrome c became ‘what we have now’.”

2) Ignoring the dynamics of chemical reactions. Chemical reactions have a great deal of specificity -- enzymes are designed to make one specific reaction happen. As the cytochrome c sequence developed, the molecular development was likely constrained by molecular orientation, drastically reducing the number of available combinations.
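
To put rough numbers on both points (my own toy arithmetic, not Yockey's model, and every parameter here is an invented assumption): treating a protein as one exact target gives fantastically small odds, but letting each site tolerate a handful of functionally equivalent residues changes the exponent dramatically.

    import math

    # Toy comparison (illustrative assumptions, not Yockey's calculation):
    # odds of assembling one exact protein sequence at random vs. odds of
    # hitting any member of a family of functionally equivalent sequences.
    LENGTH = 100    # assumed residue count, roughly cytochrome c sized
    ALPHABET = 20   # standard amino acids

    # One exact target sequence
    log10_exact = -LENGTH * math.log10(ALPHABET)

    # Suppose, purely for illustration, 5 residues work at each site
    TOLERATED = 5
    log10_family = -LENGTH * math.log10(ALPHABET / TOLERATED)

    print(f"one exact sequence:  ~10^{log10_exact:.0f}")   # ~10^-130
    print(f"any tolerant member: ~10^{log10_family:.0f}")  # ~10^-60

That is seventy orders of magnitude from one generous-but-arbitrary assumption, and it still ignores the reaction dynamics in point 2.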

As one of those 1992 critics put it:

“His final conclusion is that no, the random sloshing of amino acids together is a very unlikely explanation for the origin of cytochrome c, despite previous suggestions to that effect. This was way back in 1977, of course, so people were still trying to figure that one out. Today, this is standard knowledge, and nobody bothers with such models.”

And another:

“Autocatalytic _networks_ have been proposed, in which case you get to include the combinatorial crossmatching of short molecules that are ‘too short’ in the sense your author uses. This means he threw out a factor of more than {10,000!}^20, more than enough to make the probability of life _in his scenario_ approach unity.

Why the heck can't you refute the _current_ model? Why post brain-dead straw men and refute them?”
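
For scale, that {10,000!}^20 factor is easy to estimate in log space (the factor is the critic's; the arithmetic sketch is mine):

    import math

    # Size of (10000!)^20, using lgamma(n + 1) = ln(n!) to stay in logs
    log10_fact = math.lgamma(10_000 + 1) / math.log(10)   # log10(10000!)
    print(f"10000!      ~ 10^{log10_fact:.0f}")           # about 10^35659
    print(f"(10000!)^20 ~ 10^{20 * log10_fact:.0f}")      # about 10^713000

Any finite improbability these models quote is negligible next to an omitted factor of that size, which is the critic's point about the probability approaching unity.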

Alamo-Girl, I saw a previous post here indicating that you were interested in an objective investigation of this. (That can be an ambitious objective.) But I got the impression from your last post to me that you don’t place the same significance on the misapplication of work by Hoyle (and now Yockey) as I do. You said, “The other authors do not make the omissions you mention because they are not making an approximation.” What’s identified here is not just an “omission” when it’s applied to a criticism of biogenesis; it’s a mischaracterization.

There are dynamic relationships between molecules. In the more extreme examples of this dynamic, independent events operate on different parts of the molecules; a Google search for “biogenesis and macromolecules” returns about 4,000 hits on the subject.
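
A minimal sketch of why that matters for the odds (all numbers invented for illustration): if segments of a molecule can form independently and then join, the expected search effort adds across segments instead of multiplying.

    import math

    # Hedged toy model: each assembly step succeeds with probability q.
    q = 0.25
    N = 40        # total monomers in the finished molecule
    K = 4         # suppose it forms as 4 independent 10-unit segments

    trials_whole = 1 / q**N                 # one lucky 40-step run
    trials_parts = K * (1 / q**(N // K))    # four lucky 10-step runs

    print(f"all at once:       ~10^{math.log10(trials_whole):.0f} attempts")
    print(f"independent parts: ~10^{math.log10(trials_parts):.1f} attempts")
    # ~10^24 attempts collapses to ~10^6.6 under the same chemistry

That is roughly the kind of gap the critics above are pointing at between single-shot odds and network models.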

Using conclusions from work that ignores molecular dynamics is misleading in the context of evaluating the probability of biogenesis. It would be like me putting up a web page to refute Creationism and arguing against the claims of an unrecognized pseudo-Christian sect. I hope that’s clear.

As an aside, I also read that Yockey relies on the older warm-pond presumption rather than the more modern premise of deep-sea hot springs and their associated biofauna.

I took a look at the other two authors you recommended. I see that they haven’t generated the same controversy as the use of work by Hoyle and Yockey.

682 posted on 11/25/2003 5:50:20 AM PST by elfman2


To: elfman2; betty boop
Thank you so much for your reply and your research! I enjoyed visiting the links!

The messages from Talk-Origins are dated 1992 and are in reference to an article by Yockey dated 1977.

However, it was in 1992 that Yockey’s book Information Theory and Molecular Biology was published. Yockey is working on a second edition of that book, which will be out shortly.

In other words, the criticisms you found are considerably out of date. Before we discuss this further, you might want to read the 1996 critiques of his 1992 book, and Yockey’s response in the Chowder Society.

Yockey’s critics today narrow down to the methods he used to determine information content, e.g. whether to use Shannon entropy or Kolmogorov complexity/Solomonoff induction, etc.
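
For anyone following along, the Shannon side of that choice is simple to state concretely; a minimal sketch (my own toy code, nothing from Yockey's book):

    from collections import Counter
    import math

    def shannon_entropy(seq):
        """Shannon entropy in bits per symbol: sum of p * log2(1/p)."""
        n = len(seq)
        return sum((c / n) * math.log2(n / c) for c in Counter(seq).values())

    print(shannon_entropy("ACGTACGTACGTACGT"))  # 2.0 bits: maximal uncertainty
    print(shannon_entropy("AAAAAAAAAAAAAAAA"))  # 0.0 bits: no uncertainty

Kolmogorov complexity, by contrast, is uncomputable in general and in practice gets approximated, e.g. by compression (see the compression sketch further down).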

Because Shannon entropy is the uncertainty of communication whereas information is its success – and the biological mechanism involves such communication – I have no problem with Shannon entropy. But either way, the information opportunity issues remain – which is the point of the Origin of Life prize (updated 11/2003):

By "entropy" as it relates to information theory, the Foundation adopts Hubert P. Yockey's distinction between Maxwell-Boltzmann-Gibbs entropy, Shannon probability-distribution entropy, and Kolmogorov-Solomonoff-Chaitin sequence/algorithmic complexity. (See Information Theory and Molecular Biology, Cambridge University Press, 1992, sections 2.2.2 and 2.4.1 - 2.4.6). (See also, Yockey, H.P., (1974) "An application of information theory to the Central Dogma and the sequence hypothesis." Journal of Theoretical Biology, 46, 369-406, and Yockey, H.P.(1981) Self Organization, Origin of Life Scenarios, and Information Theory, J. Theor. Biology, 91, 13-31, and Yockey, H.P. (2000) Origin of life on earth and Shannon's theory of communication, Comput Chem, 24, 1, pp 105-123) Yockey argues that there is no "balancing act" between algorithmic informational entropy and Maxwell-Boltzmann-Gibbs-type entropy. The two are not on the same see-saw. The two probability spaces are not isomorphic. Information theory lacks the integral of motion present in thermodynamics and statistical mechanics. In addition, there is no code linking the two "alphabets" of stochastic ensembles. Kolmogorov-Solomonoff-Chaitin complexity does not reside in the domain of stochastic ensembles of statistical mechanics. They have no relation despite endless confusion and attempts in the literature to merge the two.

"Highly ordered" is paradoxically opposite from "complex" in algorithmic-based information theory. The emergent property of "instructions," "organization," and the "message" of "messenger biomolecules" is simply not addressed in Maxwell-Boltzmann-Gibbs equations of heat equilibration and energy flux between compartments. Surprisingly, the essence of genetic "prescriptive information" and "instructions" is not addressed by current "information theory" either. Shannon information theory concerns itself primarily with data transmission, reception, and noise-reduction processing without regard for the essence of the "message" itself.

The Foundation questions whether "order," physical "complexity," or "shared entropy" are synonymous with "prescriptive information," "instructions," or "organization." Christoph Adami emphasizes that information is always "about something, and cannot be defined without reference to what it is information about." It is "correlation entropy" that is "shared" or "mutual." Thus, says Adami, "Entropy can never be a measure of complexity. Measuring correlations within a sequence, like Kolmogorov and Chaitin (and Lempel-Ziv, and many others) is not going to reveal how that sequence is correlated to the environment within which it is to be interpreted. Information is entropy "shared with the world," and the amount of information a sequence shares with its world represents its complexity." (Personal communication; see also PNAS, April 25, 2000, 97, #9, 4463-4468).

Differences of perspective among information theorists are often definitional. "Complexity" and "shared entropy" (shared uncertainty between sender and receiver) has unfortunately often been used synonymously with "prescriptive information (instruction)." But is it? Mere complexity and shared entropy seem to lack the specification and orchestrational functionality inherent in the genetic "instruction" system of translation. The confusion between algorithmic instruction and Maxwell-Boltzmann-Gibbs entropy may have been introduced through the thought experiment imagining Maxwell's Demon - a being exercising intelligent choice over the opening and closing of a trap door between compartments. Statistical mechanics has no empirical justification for the introduction of purposeful control over the trap door.

Solar energy itself has never been observed to produce prescriptive information (instruction/organization). Photons are used by existing instructional mechanisms which capture, transduce, store, and utilize energy for work. Fiber optics is used by human intelligence to transmit meaningful prescriptive information (instruction) and message. But raw energy itself must not be confused with functional prescriptive information/instructions. The latter is a form of algorithmic programming. Successions of certain decision-node switch settings determine whether a genetic "program" will "work" to accomplish its task.
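
The “highly ordered” versus “complex” point above is easy to demonstrate with a compression proxy for algorithmic complexity (a standard Lempel-Ziv-style stand-in, and my own illustration rather than the Foundation's):

    import random, zlib

    random.seed(0)
    ordered = "ACGT" * 250                                           # highly ordered
    scrambled = "".join(random.choice("ACGT") for _ in range(1000))  # random

    for name, s in (("ordered", ordered), ("random", scrambled)):
        print(f"{name:8s} raw={len(s)} compressed={len(zlib.compress(s.encode()))}")
    # The ordered string compresses to a few dozen bytes; the random one
    # compresses far less. Neither measure says anything about whether the
    # sequence *means* anything, which is Adami's point about information
    # being "about something."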

Other questions to be considered in submissions, from that link (the 9th is excerpted):

2. Is life autonomous?
3. Does life display Negentropy?
4. Specified aperiodic complexity
5. Algorithmic instruction
6. The source of genetic information in nature
7. Genetic code
8. Scaffolding models
9. Biochemical correlation

a. The hypothetical mechanism must demonstrate correspondence with "the real world" of biochemistry.

b. The submission must provide adequate empirical support strongly suggesting that such a hypothetical scenario can take place naturally in a prebiotic environment. Simulation of abiogenesis must be independent of the factor of human intelligence that is so often subconsciously incorporated into computer hardware/software experimental design and simulation.

c. Thermodynamic realities must be clearly addressed, including specific discussion of any supposed pockets of momentary exception to the Second Law of increasing Maxwell-Boltzmann-Gibbs entropy. The Foundation's view is that Prigogine's dissipative structures, and life itself, operate within the constraints of the 2nd Law.

Maxwell-Boltzmann-Gibbs entropy must not be confused with statistical Shannon entropy or Kolmogorov-Chaitin-Solomonoff-Yockey "complexity." The latter two are nonphysical, abstract, mathematical constructs. All physical matrices of prescriptive information retention, however, are subject to the plight of Maxwell-Boltzmann-Gibbs entropy. They manifest a tendency toward deterioration in both closed and open systems. Repair mechanisms for these messenger biomolecules, therefore, require all the more starting instructional integrity. Prescriptive information would have been necessary in any primordial life form's genome to correct for continuous noise corruption of its functional protogenes. Deterioration of existing recipe in a physical matrix is far more probable than the spontaneous writing of new conceptually complex metabolic algorithms. Building-block synthesis, for instance, would have required something like a reductive citric acid cycle. There are no simple algorithms for integrating such a multistep, highly-directional pathway.

d. Empirical support does not have to be original research, but can be gleaned from existing scientific literature. Previously published empirical support must be described in detail and well referenced within the applicant's published research paper, explaining exactly how those controlled observations demonstrate empirical correlation with the applicant's theory.

10. "Design" anthropomorphisms
11. Appeals to unknown laws
12. Infinity issues
13. Computerization
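
On point 9c above, the claim that deterioration is far more probable than improvement is easy to see in a toy simulation (my illustration with invented numbers, not the Foundation's):

    import random

    # On a string already ~90% matched to a "functional" target, count how
    # often a single random point mutation breaks a match vs. repairs one.
    random.seed(1)
    ALPHABET = "ACGT"
    N, TRIALS = 200, 100_000
    target = [random.choice(ALPHABET) for _ in range(N)]
    genome = target[:]
    for i in random.sample(range(N), N // 10):   # corrupt ~10% of sites
        genome[i] = random.choice(ALPHABET)

    worse = better = 0
    for _ in range(TRIALS):
        i = random.randrange(N)
        new = random.choice(ALPHABET)
        worse += (genome[i] == target[i]) and (new != target[i])
        better += (genome[i] != target[i]) and (new == target[i])
    print(f"degrading: {worse}, repairing: {better}")  # roughly 30:1 or more

The asymmetry only grows as the starting string gets closer to fully functional.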

685 posted on 11/25/2003 9:50:56 AM PST by Alamo-Girl
