Indeed, A-G, we do live in most exciting times!
I've just finished reading a wonderful book by Dean L. Overman, A Case Against Accident and Self-Organization (Lanham, MD: Rowman and Littlefield Publishers, Inc., 1997). I enjoyed it immensely, and I'd bet you'd like it, too. He quotes extensively from Yockey, Penrose, Hawking, and many others. He points out that "order" and "complexity" are not synonyms, and elucidates the critical difference between them. By complexity, he means information content -- the minimum number of instructions necessary to specify and maintain a structure. So you can see that information theory is front and center in Overman's analysis.
He writes, "Highly complex structures require many instructions. A structure may be highly ordered, such as a crystal, but contain very few instructions." Order displays pattern, sequence. Indeed, very simple instruction sets (and even chaos) have been observed to produce regular patterns. But highly complex structures -- such as DNA -- are nonperiodic, seemingly random sequences. DNA is "complex" in the way a crystal is not: its complexity lies in the astronomically vast amount of information -- instructions -- it can encode to specify its structure and realize its function.
Overman is also very keyed into issues in particle astrophysics. He wrote:
"Because the formation of life requires the formation of a universe compossible with life, the case against accident as an explanation for life is satisfied completely by an examination of the probabilities involved in the fine tuning of particle astrophysics without regard to issues raised by molecular biology. When one couples the probabilities in physics against an accidental universe compossible with life with the molecular biological and pre-biological possibilities against the formation of the first form of life from inert matter, the compounded calculation wipes the idea of accident entirely out of court."
The statement comes in the book's conclusion; the case for it seems to have been thoroughly argued and documented throughout.
Of course, there are things that cannot be known with certainty. In most cases, we have to be satisfied with the standard of "beyond a reasonable doubt." I think Overman makes a persuasive case against life arising by accident; but I'll be checking his thesis against future developments, new evidence, new discoveries....
Just "thinking out loud" through some of Overman's ideas here, A-G. Thanks for letting me rant! You've got to read this book!
It seems whenever we start speaking of complexity, order, randomness, chaos, probability, and entropy, the conversation tends to get caught up in a quagmire of definitions.
I certainly agree with the author's measure of complexity, which roughly corresponds to the Chaitin/Kolmogorov view that the complexity of a string can be measured as the size of the smallest program that will produce it.
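The Chaitin/Kolmogorov measure is uncomputable in general, but a compressor gives a practical upper bound on it: the better a string compresses, the shorter a "program" (the compressed data plus the decompressor) suffices to reproduce it. Here's a toy sketch in Python using zlib -- the strings and thresholds are my own illustration, not anything from Overman's book:

```python
import random
import zlib

def compressed_size(s: str) -> int:
    """Length in bytes of the zlib-compressed string: a rough upper
    bound on its Kolmogorov complexity (which is uncomputable)."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

# Highly ordered string: a short 'program' (repeat "AB" 500 times) describes it.
ordered = "AB" * 500

# Irregular string over a 4-letter alphabet (seeded so the run is repeatable).
rng = random.Random(42)
irregular = "".join(rng.choice("ACGT") for _ in range(1000))

print(compressed_size(ordered))    # small: the pattern compresses away
print(compressed_size(irregular))  # much larger: no short description is found
```

Both strings are 1000 characters long, yet the ordered one shrinks to a few dozen bytes while the irregular one stays near its entropy limit -- which is exactly the order/complexity distinction Overman is drawing.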
Sadly, there is a tendency around here to dismiss the importance of Shannon entropy to biological information content. Instructions flow through biological systems like communications between devices, and thus Shannon is very relevant in my view. Shannon entropy is roughly the uncertainty in that flow; the successful flow is the information.
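To make "uncertainty of the flow" concrete, Shannon entropy can be computed directly from symbol frequencies. A minimal sketch (the sample strings are mine, chosen to show the two extremes):

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Average bits of uncertainty per symbol, from symbol frequencies."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy("AAAAAAAA"))  # 0.0 -- one symbol, no uncertainty
print(shannon_entropy("ACGTACGT"))  # 2.0 -- four equally likely symbols
```

A channel carrying the first string tells the receiver nothing; one carrying the second delivers two bits per symbol -- the "successful flow" in Shannon's sense.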
Likewise, when we speak of order there seems to be a tendency here to observe that when the universe dissolves in the end, it will have achieved both maximum entropy and greatest order. That is an interesting observation, but it doesn't really tell us much about order in biological systems.
Randomness raises the same kind of issue. For instance, both pi and Chaitin's Omega will generate a string which, if you sampled a chunk of it at a respectable distance into the expansion, would appear to be random. In the case of pi, that impression would be false. In the case of Omega, after a certain number of positions, that impression would be true.
Or would it? Since in both cases the number itself is derived from an algorithm, it is not truly random. I believe this is Wolfram's counter-point, i.e., that all randomness is only pseudo-randomness.
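The pseudo-randomness point is easy to demonstrate with any software "random" number generator: the output is entirely fixed by the algorithm and its seed, so the same seed replays the identical "random" stream. A small sketch (my own example, using Python's built-in Mersenne Twister generator):

```python
import random

def pseudo_stream(seed: int, n: int) -> list[int]:
    """n 'random' decimal digits -- fully determined by the seed."""
    rng = random.Random(seed)
    return [rng.randrange(10) for _ in range(n)]

a = pseudo_stream(2024, 20)
b = pseudo_stream(2024, 20)
print(a == b)  # True -- the stream is an algorithmic derivation, not randomness
```

The digits pass casual inspection as random, yet, like pi's expansion, they are the deterministic output of a rule.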
Indeed, on closer inspection (especially in alternative bases), most candidate number-generating algorithms show a high degree of auto-correlation.
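Auto-correlation is one of the standard ways such hidden structure gets exposed: compare a sequence against a shifted copy of itself and see whether they co-vary. A toy illustration (the sequences are my own, not drawn from pi or Omega):

```python
import random

def autocorrelation(xs: list[float], lag: int) -> float:
    """Sample autocorrelation of xs at the given lag (1.0 = perfect repeat)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    cov = sum((xs[i] - mean) * (xs[i + lag] - mean) for i in range(n - lag))
    return cov / var

# A sequence with a hidden period of 4: a short chunk may look irregular,
# but the lag-4 autocorrelation exposes the structure immediately.
periodic = [1, 7, 3, 9] * 250

# A seeded pseudo-random digit stream for comparison.
rng = random.Random(7)
digits = [rng.randrange(10) for _ in range(1000)]

print(autocorrelation(periodic, 4))  # near 1.0: strong hidden structure
print(autocorrelation(digits, 4))    # near 0.0: no correlation at that lag
```

A sequence can look random locally and still light up under a test like this -- which is the sense in which "appearing random" and "being random" come apart.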
The order and complexity of biological information content is frankly stunning. But if the greater the entropy, the higher the order, then the less the opportunity for complexity. On its own, then, complexity can only form under lower entropy and higher chaos. But is that rational? IOW, for a metaphysical naturalist explanation to prevail, things must have gone from chaos to complexity to order to entropy, then to more chaos, more complexity, more order, more entropy, and so on.
In sum, if the initial conditions are not random -- indeed, if there is no randomness apart from pseudo-randomness -- then the metaphysical naturalist theory of origins fails.
The counter to actual randomness around here has been Brownian motion, but that (like pi and Omega) is an effect and not a cause, i.e., the consequence of ongoing bombardment by atoms and molecules.
The only defense the metaphysical naturalists have to this is the plenitude argument: everything that can exist does exist, in some parallel universe.
Even the die-hards who hold on to the hope of plenitude are nevertheless stuck with a beginning, and for that they have no defense!