First, he points out in the article I linked earlier in the thread that mutation is the equivalent of noise, which always, without exception, degrades the information. It is like tape hiss on a cassette tape. I am a professional recording engineer and a musician, so I know what he is talking about. So do you, if you have ever listened to a cassette tape. He discusses one possible exception in digital recording, dither:
"And again, once the noise is there, it is absolutely impossible to get it back out. And Ive never met any engineer who ever said the signal could be better after you added noise to it. The only exception to this is something called dither which does add noise to the signal before [me: or after] its recorded, but that is done to neutralize distortions in the recording equipment. Its dither in digital recording, and bias in analog recording. But it does not increase the information; it degrades the signal, albeit in a useful way.The challenge is:So Im hunting for a flaw in this theory. Can anyone show that noise increases the useful information in a signal?"
The challenge is: show me an example where random mutation actually increases information.

Second, with reference to codes, he points out in other articles that in every case where the origin of a code is known, it is always, without exception, the product of a mind. The challenge is:
Provide one example of a code, defined as "a channel with an input alphabet A and an output alphabet B" that is not the product of a mind.
Cordially,