Furthermore, Yockey's work dovetails nicely with the research on autonomous biological self-organizing complexity. The indication is that the evolutionary process is not happenstance, that evolution is not a directionless walk after all.
That doesn't mean evolution didn't happen, but - IMHO - it may go a long way toward explaining the non-mutability of regulatory control genes (which leads to "eyeness" developing seemingly independently across phyla), the absence of new phyla since the Cambrian explosion, the relatively rapid pace of evolution given finite opportunity, and other such anomalies.
As to the applicability of Shannon entropy to biological systems:
Abstract: In the last decade, two tools, one drawn from information theory and the other from artificial neural networks, have proven particularly useful in many different areas of sequence analysis. The work presented herein indicates that these two approaches can be joined in a general fashion to produce a very powerful search engine that is capable of locating members of a given nucleic acid sequence family in either local or global sequence searches. This program can, in turn, be queried for its definition of the motif under investigation, ranking each base in context for its contribution to membership in the motif family. In principle, the method used can be applied to any binding motif, including both DNA and RNA sequence families, given sufficient family size.
Introduction: Gatlin (1) first recognized that the Shannon expression for string entropy might prove useful in sequence analysis. This function is a statistical average for the distribution of possible characters at a particular position in a message. [Although it shares the form of the Gibbs-Boltzmann entropy function, it is independently derived and non-isomorphic with that function (2, 3).] Gatlin insightfully proposed that this function, originally developed to assay the fidelity with which strings could be transmitted in noisy communication channels, was also appropriate for the analogous transmission of string information represented in the central dogma of genetics. Schneider et al. (4) subsequently developed a redundancy index (RI), based on this function, to profile a given family of DNA-binding-site sequences. This index measured the reduction in Shannon entropy, relative to the background DNA, represented in the strings of the sequences belonging to a particular motif.
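The redundancy index described above can be illustrated with a short sketch. This is not the authors' program; it is a minimal, illustrative calculation (function names and the example sites are my own) of the per-position Shannon entropy of a set of aligned binding sites, and the corresponding reduction in entropy relative to an equiprobable four-base background (2 bits per position):

```python
import math
from collections import Counter

def position_entropies(sites):
    """Per-position Shannon entropy, in bits, for a set of aligned DNA sites."""
    length = len(sites[0])
    entropies = []
    for i in range(length):
        counts = Counter(seq[i] for seq in sites)
        total = sum(counts.values())
        h = -sum((c / total) * math.log2(c / total) for c in counts.values())
        entropies.append(h)
    return entropies

def entropy_reduction(sites):
    """Reduction in entropy at each position relative to an equiprobable
    background of 2 bits per base (the quantity behind a redundancy index)."""
    return [2.0 - h for h in position_entropies(sites)]

# Hypothetical aligned binding sites, for illustration only
sites = ["TATAAT", "TATGAT", "TACAAT", "TATAAT"]
print(entropy_reduction(sites))
```

A perfectly conserved position (all four sites agree) gives the full 2-bit reduction; a position where the bases are evenly spread gives none. Real applications would weight the background by the genome's actual base composition rather than assuming equiprobable bases.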
With all due respect, Right Wing Professor, I'm not about to ignore Yockey simply because you disagree with him.
Would you kindly show me where I have used Yockey references to attack evolution?!
The reference was not to anything you wrote, but to recent posts by betty boop, notably this one.
Furthermore, Yockey's work dovetails nicely with the research on autonomous biological self-organizing complexity. The indication is that the evolutionary process is not happenstance, that evolution is not a directionless walk after all.
There is no such indication.
On page 313 of Yockey's book he says: "...The Shannon entropy and the Maxwell-Boltzmann-Gibbs entropy... have nothing to do with each other."
You can repost this as often as you want; it's still wrong. The Shannon entropy is simply the combinatorial entropy of a sequence; it forms part of the total entropy. The remaining part of the total entropy is the entropy of a randomly sequenced piece of DNA of the same base composition as the defined sequence.
As for the position of information theory in the biological sciences, this appears to be a reasonable evaluation.