To: tortoise
This is a response to your #293, which you were evidently writing when I was replying to your previous post.
I love your first sentence:
"'Randomness' means a number of things, and is mostly a convention for 'my predictive model is too simple to have utility.' . . ."
I think this gets to the truth in a big hurry. Chaitin said something similar, but I'll hold off on that now.
Some of what you wrote in what I'll call the "middle" of your post I will probably want to return to after I go over Doctor Stochastic's earlier discussion, especially your comment that "the relative entropy between the predictor and the actual process has the same complexity as the predictor output itself." I want to hold off because using "entropy" as equivalent to "randomness" can become confusing once other definitions of entropy, e.g. thermodynamic entropy, enter the discussion as well.
I also loved your final comment about bias and probability distributions, though I tend to doubt that "frequencies" will form part of the discussion. What a shame!
To: StJacques
"I think this gets to the truth in a big hurry. Chaitin said something similar, but I'll hold off on that now."
IIRC, there was a general mathematical description and proof of this published circa 1992 by Merhav and his gang. It has since been reformulated in a half-dozen different ways by a number of folks. This is widely considered to be a very elegant mathematical basis for the concept of "free will".
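The idea that "randomness" just means "my predictive model is too simple to have utility" can be illustrated with a toy sketch. This is only a crude stand-in for the real theory: here zlib's compressed length plays the role of a (very weak) Kolmogorov-complexity estimate, and the data and names are mine, not anything from Merhav's papers.

```python
import random
import zlib

def complexity(bits: str) -> int:
    """Compressed length in bytes: a crude stand-in for Kolmogorov complexity."""
    return len(zlib.compress(bits.encode()))

random.seed(0)
periodic = "01" * 500  # generated by a short rule, so it compresses well
noisy = "".join(random.choice("01") for _ in range(1000))  # no short rule visible to zlib

# A sequence "looks random" to a model exactly when the model finds no
# exploitable structure in it; zlib is the (too-simple) model here.
print(complexity(periodic), complexity(noisy))
```

The point of the sketch: calling the second string "random" is a statement about zlib's limitations as a predictor, not an intrinsic property of the bits, which were in fact produced by a small deterministic program (a seeded PRNG).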
303 posted on 12/15/2004 10:31:28 PM PST by tortoise (All these moments lost in time, like tears in the rain.)