Free Republic
Smoky Backroom

To: betty boop
...given a simple set of initial conditions, the iteration of even very simple rules over indefinitely long time periods can "spontaneously" generate systematic behavior that appears to be quite random.

Do you mean complex? I think what you are getting at is that the rules appear to us to be random when they are, in fact, quite simple. There is no way to model the system from a top-down, macroscopic point of view, but only in a reductionist way, by teasing out the simple rules. However, the initial conditions can be random.
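
A concrete illustration of this point (a minimal Python sketch; the logistic map and the 0.5 threshold are my own illustrative choices, not anything from the thread): iterating the one-line rule x -> 4x(1 - x) and reading off coarse-grained bits produces output that looks random from the top down, even though the generating rule is trivially simple.

# Minimal sketch: a very simple deterministic rule whose
# coarse-grained output looks random from a macroscopic view.
def logistic(x):
    # One application of the rule x -> 4x(1 - x) (chaotic regime).
    return 4.0 * x * (1.0 - x)

x = 0.123  # the initial condition; it could just as well be random
bits = []
for _ in range(64):
    x = logistic(x)
    bits.append(1 if x > 0.5 else 0)  # macroscopic, coarse-grained reading

print("".join(map(str, bits)))  # looks like coin flips on casual inspection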

4,334 posted on 01/10/2003 7:43:21 AM PST by Nebullis
[ Post Reply | Private Reply | To 4224 | View Replies ]


To: Nebullis
Good morning, Nebullis! I meant to ping you to #4338....
4,340 posted on 01/10/2003 7:57:32 AM PST by betty boop
[ Post Reply | Private Reply | To 4334 | View Replies ]

To: Nebullis; betty boop
Please excuse my interruption, but your post to betty boop touches on the very reason why I've been saying the randomness pillar of the theory of evolution is in trouble. You said:

There is no way to model the system from a top-down, macroscopic point of view, but only in a reductionist way, by teasing out the simple rules. However, the initial conditions can be random.

Here's why I'm concerned: from the Chaitin papers [ps]:

We now turn to Kolmogorov’s and Chaitin’s proposed definition of randomness or patternlessness. Let us consider once more the scientist confronted by experimental data, a long binary sequence. This time he is not interested in predicting future observations, but only in determining if there is a pattern in his observations, if there is a simple theory that explains them. If he found a way of compressing his observations into a short computer program which makes the computer calculate them, he would say that the sequence follows a law, that it has pattern. But if there is no short program, then the sequence has no pattern—it is random. That is to say, the complexity C(S) of a finite binary sequence S is the size of the smallest program which makes the computer calculate it. Those binary sequences S of a given length n for which C(S) is greatest are the most complex binary sequences of length n, the random or patternless ones. This is a general formulation of the definition…

In other words, to sustain the pillar one would have to presume that random information content can be algorithmic (which would make it, by definition, not random).
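
C(S) as Chaitin defines it is uncomputable, but a general-purpose compressor gives a computable upper bound on it, which is enough to make the pattern/patternless distinction concrete. A minimal Python sketch (zlib standing in, loosely, for "smallest program", which it is not in the strict sense; the sequence lengths are arbitrary):

import random
import zlib

def approx_complexity(s: bytes) -> int:
    # Compressed length is a computable upper-bound proxy for C(S);
    # C(S) itself is uncomputable.
    return len(zlib.compress(s))

patterned = bytes(i % 2 for i in range(10_000))  # 0,1,0,1,... has a short description
random.seed(0)
patternless = bytes(random.getrandbits(8) for _ in range(10_000))

print(approx_complexity(patterned))    # tiny: a short program regenerates it
print(approx_complexity(patternless))  # near 10,000: no pattern to exploit

On this reading, a sequence that admits a short generating program is lawful rather than random, which is exactly the tension described above.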

4,341 posted on 01/10/2003 8:12:35 AM PST by Alamo-Girl
[ Post Reply | Private Reply | To 4334 | View Replies ]

To: Nebullis
However, the initial conditions can be random.

On my reading, to say that the prime mover has random initial conditions would be to say that the laws of the universe are random. Clearly, this cannot be the case. Certainly it wasn't the case with the illustration I gave -- the Rule 110 CA. In that case, Stephen Wolfram stands in the role of the prime mover, specifying the initial condition and the simple rules that, over their evolution, generate both order and apparent randomness -- that is, complexity. Just looking at the later iterations of the evolution, it is impossible to tell what the initial simple rules were.
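
For what it's worth, the complete specification of that CA fits in a few lines. A minimal Python sketch (the row width, step count, and single-cell seed are illustrative choices; Wolfram's figures use the same single-cell seed on an effectively unbounded row, approximated here with wraparound):

# Rule 110: the new cell value is bit (left, self, right) of the number 110.
RULE = 110
WIDTH, STEPS = 64, 32

def step(cells):
    # Read each 3-cell neighborhood as a 3-bit index into the rule number.
    return [
        (RULE >> (cells[(i - 1) % WIDTH] << 2
                  | cells[i] << 1
                  | cells[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]

row = [0] * WIDTH
row[WIDTH // 2] = 1  # the simple initial condition: a single live cell
for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    row = step(row)

Printing the rows shows the familiar mix of periodic structure and apparently random texture, and nothing in a late row announces that rule number 110 produced it.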

4,342 posted on 01/10/2003 8:18:00 AM PST by betty boop
[ Post Reply | Private Reply | To 4334 | View Replies ]
