To: StJacques
"Randomness" means a number of things, and is mostly a convention for "my predictive model is too simple to have utility". In any finite context, it does NOT mean non-deterministic, only means that the relative entropy between the predictor and the actual process has the same complexity as the predictor output itself. Strong pseudo-random number generators work this way, processes with very little complexity but no low-order patterns that can be tractably perceived via induction as a practical matter in our universe (true geometric complexity in time and space puts a damper on that -- O(2^n) is a killer). They will appear "random" by every mathematical measure of their output, even though we know they are not by definition.
This leads to a somewhat interesting situation for physicists in that no apparently "random" process can ever be definitively asserted to be non-deterministic, as even simple deterministic processes are capable of having this apparent property. When you get right down to it, "random" tells you almost nothing about the nature of whatever process you are describing with it. But being able to assert determinism is useful for a few theoretical purposes even if you never figure out how to look inside the box.
It really starts to get interesting when you start considering the fundamental theoretical nature of bias (both intrinsic and apparent) in probability distributions.
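One concrete handle on bias (an illustration I'm supplying, not something the post spells out) is von Neumann's extractor: given independent bits with an unknown intrinsic bias, pairing them up and keeping only the unequal pairs yields an unbiased stream, because for any bias p the kept outcomes (0,1) and (1,0) both occur with probability p*(1-p).

```python
import random

def biased_bits(p, n, seed=42):
    """Deterministically seeded stream of independent bits with P(1) = p."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

def von_neumann_extract(bits):
    """Pair consecutive bits; emit the first bit of each unequal pair,
    discard equal pairs. Output is unbiased for any input bias p."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

raw = biased_bits(0.8, 100000)
fair = von_neumann_extract(raw)
print(sum(raw) / len(raw))    # near 0.8 -- the intrinsic bias
print(sum(fair) / len(fair))  # near 0.5 -- bias removed, at a cost in length
```

The cost is throughput: only a 2*p*(1-p) fraction of pairs survive, so removing apparent bias trades away output rate.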
293 posted on 12/15/2004 9:11:29 PM PST by tortoise (All these moments lost in time, like tears in the rain.)
To: tortoise
This is a response to your #293, which you were evidently writing when I was replying to your previous post.
I love your first sentence:
"'Randomness' means a number of things, and is mostly a convention for 'my predictive model is too simple to have utility.' . . ."
I think this gets to the truth in a big hurry. Chaitin said something similar, but I'll hold off on that now.
Some of what you have in what I'll refer to as the "middle" of your post I will probably want to return to after I go over Doctor Stochastic's earlier discussion, especially your comment that "the relative entropy between the predictor and the actual process has the same complexity as the predictor output itself." The use of "entropy" as equivalent to "randomness" can become confusing once other definitions of entropy, e.g. the thermodynamic one, become part of the discussion as well.
I also loved your final comment about bias and probability distributions, though I tend to doubt that "frequencies" will form part of the discussion. What a shame!