For Lurkers: In the discussion last night, Doctor Stochastic said that a single-instruction RAM computer (one whose sole instruction is hard-wired) could emulate a Universal Turing Machine:
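One well-known single-instruction architecture is subleq ("subtract and branch if less than or equal to zero"), which is Turing-complete; the machine Doctor Stochastic had in mind may differ, but a minimal subleq interpreter sketch (all names here are mine) looks like:

```python
def subleq(mem, pc=0):
    """Run a subleq program: each instruction is three cells (a, b, c).
    Do mem[b] -= mem[a]; if the result is <= 0, jump to c, else fall
    through to the next instruction. A negative c halts the machine."""
    while pc >= 0:
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Tiny example: clear cell 6 by subtracting it from itself, then halt.
# Instruction at 0: subleq 6, 6, -1  (mem[6] -= mem[6] -> 0; jump to -1 = halt)
prog = [6, 6, -1, 0, 0, 0, 42]
result = subleq(prog)
# result[6] is now 0
```

Everything else (loops, copies, arithmetic) is built by chaining this one instruction, which is what makes the universality claim plausible.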
For those following our discussion, here are some useful links and definitions:
The Kolmogorov complexity of a bit string is the length of the shortest Turing machine program that produces that string as output.
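True Kolmogorov complexity is uncomputable, but the length of a compressed encoding gives a computable upper bound, which makes the idea concrete; a sketch using Python's zlib:

```python
import random
import zlib

def compressed_length(bits: str) -> int:
    """Upper-bound proxy for Kolmogorov complexity: the length in bytes
    of the zlib-compressed string. Not the true K(x), which is uncomputable."""
    return len(zlib.compress(bits.encode()))

regular = "01" * 500  # highly patterned: a short program ("print '01' 500 times")
random.seed(0)
noisy = "".join(random.choice("01") for _ in range(1000))  # no obvious pattern

# The patterned string compresses to far fewer bytes than the random one,
# mirroring the intuition that it has lower Kolmogorov complexity.
assert compressed_length(regular) < compressed_length(noisy)
```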
A Turing Machine is an idealized computer consisting of an infinite tape and a read-write "head" which moves back and forth on the tape, reading and writing according to a rule set that refers to i) what it sees on the tape and ii) an internal "memory" state.
A Universal Turing Machine is a Turing machine with a rule set which allows it to imitate any other Turing machine (if the rule set and the input of the machine to be emulated are presented on the tape).
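The definitions above can be made concrete with a small simulator; in this sketch (the function names and the sample rule set are my own) a rule set maps (state, symbol) to (new symbol, head move, new state), and the example machine flips every bit on its tape:

```python
def run_tm(rules, tape, state="start", head=0, max_steps=1000):
    """Simulate a Turing machine.
    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left) or +1 (right). The machine stops in state 'halt'."""
    tape = dict(enumerate(tape))  # sparse dict stands in for the infinite tape
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")  # '_' is the blank symbol
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += move
    return "".join(tape[i] for i in sorted(tape))

# Rule set: flip 0 <-> 1 while moving right; halt on the first blank.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}
out = run_tm(flip, "0110")
# out == "1001_"
```

A Universal Turing Machine is then just a (much larger) rule set whose tape holds an encoding of some other machine's rules plus its input.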
Entropy, for a closed system, is the quantitative measure of the thermal energy not available to do work. It is the complement of available energy and is often used to state the second law of thermodynamics: entropy in a closed system can never decrease.
Entropy is also used to mean disorganization or disorder, i.e. a measure of disorder or randomness in a closed system. (Boltzmann) This is the meaning of the term in information theory.
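In information theory this becomes Shannon entropy, H = -Σ pᵢ log₂ pᵢ bits per symbol, where pᵢ is the frequency of each symbol; a short sketch:

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))
    over the empirical symbol frequencies p of the string."""
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * log2(c / n) for c in counts.values())

# An even 0/1 mix carries the maximum 1 bit per symbol;
# a constant string carries none.
assert shannon_entropy("0101") == 1.0
assert shannon_entropy("0000") == 0.0
```

Note that H depends only on symbol frequencies, which is exactly why (as Yockey observes below) it cannot distinguish a meaningful sequence from a random one with the same composition.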
Both Feynman and Shannon (information theory) recognize that there is a difference: in information theory, arriving at the number of possible arrangements depends on an arbitrary parceling of the system, whereas in thermodynamics it is objective. Shannon: "If we change coordinates, the entropy will in general change."
Yockey agrees that the different kinds of entropy do not correlate, and notes that Shannon entropy does not distinguish between viable DNA sequences and happenstance DNA sequences of the same length. He uses Shannon entropy in his book, Information Theory and Molecular Biology, in which he debunks the notion of abiogenesis.
In this panspermia discussion the conclusion drawn is that things never organize themselves. That, of course, runs counter to the entire point of autonomous self-organizing complexity, which appears to be supported by the Hox and Pax genes, virtually identical across phyla (e.g. "eyeness"). The issue is how this autonomous self-organizing complexity could arise from non-life (abiogenesis).
In the same article, the author sees quantum entropy as a possible solution in the distant future.
But, IMHO, that bridge is already being crossed:
Entropic Nonextensivity: a possible measure of complexity
Physics of Computation and the Quantum
I don't have time right at this moment, but maybe later today.