To: UndauntedR
"are surprisingly strong... if made correctly"
if made correctly...if designed correctly...that was the point.
"The search space doesn't have to be bounded."
That is true.
"First of all, note that all known life is contained (approximately) in this search space. All evolution is doing is changing parameters... there's no "information" involved."
Of course DNA is not an example of information theory. There is no information involved. (Pardon the sarcasm)
To: FreedomProtector
if made correctly...if designed correctly...that was the point.
You seem to have a grasp of evolutionary algorithms... unless you mined that information... I'm confounded how you can understand evolutionary algorithms but not understand the natural process the idea came from.
Of course DNA is not an example of information theory. There is no information involved. (Pardon the sarcasm)
You said so yourself: "Evolutionary algorithms don't produce anything new, just find different parameters... x[t+1] = s(v(x[t]))".
ALL you need to do is let x be a set of points (a "population") in the unbounded genetic search space, v() be genetic variation (reproduction, mutation, etc.), and s() be natural selection. Working in that space, you would not be able to tell me which genotype contains more information... they're all simply points in genetic space. Only the fitness function (which, as you know, is implicitly built into the s() function and, in this case, is completely environment-dependent) can give you a sense of "good" and "bad" adaptation - the gradient (slope) of the fitness function (in a billion-dimensional space, remember). No "information"... no "new"... just a simple evolutionary algorithm in an unbounded genetic search space with really, really complicated v() and s() functions (which are dependent on time and position within the search space).
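In case it helps, here is a minimal toy sketch of that recurrence. The genotypes, the mutation scheme, and the fitness function (distance to a hypothetical "environment" target) are all illustrative assumptions, not claims about biology - the point is only that v() and s() together drive the population up the fitness gradient with no extra machinery:

```python
import random

# Toy instance of x[t+1] = s(v(x[t])): the population x is a set of
# genotypes (real-valued vectors), v() applies variation (mutation),
# and s() applies selection via a fitness function.

TARGET = [0.3, -1.2, 2.5]  # hypothetical stand-in for the environment

def fitness(genotype):
    # Higher is better: negative squared distance to the target.
    return -sum((g - t) ** 2 for g, t in zip(genotype, TARGET))

def v(population, rate=0.1):
    # Variation: each parent produces one mutated offspring.
    offspring = [[g + random.gauss(0, rate) for g in p] for p in population]
    return population + offspring

def s(population, size):
    # Selection: keep only the `size` fittest genotypes.
    return sorted(population, key=fitness, reverse=True)[:size]

def evolve(generations=200, size=20):
    # Random initial population of 3-dimensional genotypes.
    x = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(size)]
    for _ in range(generations):
        x = s(v(x), size)  # x[t+1] = s(v(x[t]))
    return x

best = max(evolve(), key=fitness)
```

No genotype here "contains" more information than any other; only the fitness function, which lives inside s(), ranks them.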
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson