Teleology, as you know, deals with "ends," purposes, and goals. Evidently nature is purposive; evolution is purposive. How do we get from accident to purpose?
It seems evident (to me anyway) that biology cannot be reduced simply to physics ("matter in its motions" as described by the physicochemical laws, given initial and boundary conditions), leading to "random" mutations whose fitness value will be "rewarded" or "punished" by the environment -- since the genetic, algorithmic, and symbolic information content of living organisms is much greater than the information content of the physical laws.
Chaitin (1985) pointed out that the laws of physics have very low information content, since their algorithmic complexity can be characterized by a computer program fewer than a thousand characters in length. In 2004, in a private communication to a colleague, Chaitin wrote: "My paper on physics was never published, only as an IBM report. In it I took Newton's laws, Maxwell's laws, the Schrödinger equation, and Einstein's field equations for curved spacetime near a black hole, and solved them numerically, giving motion-picture solutions. The programs, which were written in APL2, an obsolete programming language roughly at the level of Mathematica, were all about half a page long, which is amazingly simple."
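To see what Chaitin meant, here is a toy sketch in Python (not his actual APL2 program, which I don't have) of Newton's law of gravitation integrated numerically for a two-body orbit. The point is that the physical law itself occupies only a few lines; everything else is bookkeeping:

```python
# Toy illustration of Chaitin's point: the dynamical law itself is tiny.
# A test body orbits a central mass with GM = 1, integrated by the
# leapfrog (velocity Verlet) scheme. The inverse-square law is one line.

def simulate_orbit(steps=10000, dt=0.001):
    """Integrate a unit-mass test body around a central mass (GM = 1)."""
    x, y = 1.0, 0.0        # initial position: radius 1
    vx, vy = 0.0, 1.0      # initial speed 1 -> circular orbit

    def accel(x, y):
        r3 = (x * x + y * y) ** 1.5
        return -x / r3, -y / r3   # Newton's inverse-square law

    ax, ay = accel(x, y)
    for _ in range(steps):
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
        x += dt * vx; y += dt * vy
        ax, ay = accel(x, y)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
    return x, y

x, y = simulate_orbit()
r = (x * x + y * y) ** 0.5   # radius stays close to 1.0: stable orbit
```

A half-page program like this, compressed, is on the order of a few hundred bytes -- consistent with the ~10^3-bit estimate quoted below for the algorithmic complexity of the physical laws.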
How does the complexity of living organisms increase if its main driver is the physicochemical laws, estimated to have an algorithmic complexity of only ~10^3 bits? Certainly, the observed flow of environmental information is enormous, and tellingly, it is morphological information. But what is the source of the enormous environmental information flow?
Did the Big Bang's initial conditions have an algorithmic complexity greater than the algorithmic complexity of physical laws themselves? If not, then how did environmental and biological information increase in the evolution of the Universe?
Now Ashby's Law (Ashby, 1962) states that "the variety of outputs of any deterministic physical system cannot be greater than the variety of inputs; the information of output cannot exceed the information already present in the input." In accordance, Kahre's Law of Diminishing Information reads: "Compared to direct reception, an intermediary can only decrease the amount of information" (Kahre, 2002, 14). Moreover, it is a widely held view nowadays that the chain of physical causes forms a closed circle. The hypothesis of the causal closure of the physical (Cameron, 2000, 244) maintains (roughly) that for any event E that has a cause we can cite a physical cause, P, for its happening, and that citing P explains why E happened. Therefore, not only Ashby's and Kahre's laws but also the causal closure thesis is in conflict with the complexity measures found in physics and in biology. Now if the algorithmic complexity of one human brain is already around I_1 ~ 10^15-10^17 bits, the information paradox consists in the fact that the information content of physics is I(physics) ~ 10^3 bits while that of the whole living kingdom is ... I(biology) ~ 10^15-10^17 bits. Taking into account also that physics is hopelessly far from being able to cope with the task of governing even one human person's biological activity, ~2*10^21 bits per second, it becomes clear that at present the algorithmic complexity of modern cosmological models is much less than the above-obtained complexity measures characterizing life.

Just tell me, js1138, how did nature become not only purposive, but informed such that it can be purposive? What is the information source, if (as Ashby, Kahre, and Cameron seem to suggest) it cannot be explained on the basis of an evolution strictly according to the physicochemical laws?
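Ashby's inequality is easy to demonstrate concretely. A minimal sketch, assuming "variety" simply means the number of distinct states (the system function below is an arbitrary rule I made up for illustration):

```python
# A deterministic map cannot increase "variety": applying any fixed
# function to a set of inputs yields at most as many distinct outputs
# as there were distinct inputs.

def variety(states):
    """Variety = number of distinct states."""
    return len(set(states))

inputs = [0, 1, 2, 3, 4, 5, 6, 7]

# Any deterministic system, however complicated, is just a function.
def system(x):
    return (x * x) % 5   # an arbitrary deterministic rule

outputs = [system(x) for x in inputs]

print(variety(inputs), variety(outputs))  # prints "8 3"
assert variety(outputs) <= variety(inputs)  # Ashby's inequality
```

The outputs here collapse to the three values {0, 1, 4}: variety can shrink under a deterministic map, but it can never grow -- which is exactly why Grandpierre sees a paradox in biology's apparent information gain.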
-- A. Grandpierre, "Complexity Measures and Life's Algorithmic Complexity," 2005.
To me, this is the great question....