A question for your question:
Is there a random process?
Because the theory of evolution is based on two processes: mutation (random) and natural selection (non-random).
You start with a group of organisms. Some number of them have random mutations. Most of those mutations are harmful, so those individuals die-- they don't reproduce. Only a tiny percentage of mutations are beneficial, but the individuals with those mutations survive longer and reproduce more. So the next generation has more individuals with the favorable mutation than the prior generation.
Some percentage of the next generation has mutations. Again, most of the mutations are harmful, but those individuals don't reproduce; the few with favorable mutations do reproduce, and in disproportionate numbers.
Each generation thus keeps the beneficial results, and only the beneficial results, from the previous generation's random mutations; and each set of favorable mutations builds on the prior successes.
That's how it works in this example. It's not a perfect evolutionary analogy, because our example here is working towards a specific goal - a particular sentence - whereas evolution via natural selection doesn't really have a goal in mind.
If order is derived by chance from nothing, then mustn't we assume that each try is completely unique and in no way connected with any other attempt? Isn't this the very meaning of randomness?
Let's walk through it. First, we need an environment. And to experience some sort of evolutionary process, our environment has to have selective pressures - that is, some traits will be more helpful for survival, and some will be less helpful, and some will be downright dangerous for creatures that have them. Imagine a dysfunctional creature that drowns every time it rains, and you'll see what I mean.
So, for this little thought experiment, we want an environment consisting of chains of letters, each 41 letters long. And we further want an environment where chains that are more like the final product have an advantage over chains that aren't. And the chains that aren't much like the final product will be at a disadvantage, and will die and go away.
So, we start with a random string of letters created by spinning the big genetics wheel. Now, as this is a random process, the odds that we'll get the final product right at the start are pretty damn long, as this article rushes to assure us. But the odds are that we'll get a string of letters out that has at least one or two letters in the right place.
Now we have a chain that has a slight resemblance to the final product. These few letters in the right place are an adaptive trait - they are preferentially replicated in the next generation. What that means is that those letters are (almost) automatically replicated in the next generation - after all, if they weren't, the offspring would die, right?
So, come the next generation, we have a chain where a few letters are already in place, and since that's an adaptive trait, those letters get passed on to the offspring - the next chain. And then we spin the big genetics wheel yet again, but not for all letters - some letters are passed on from the parents. So we spin and generate random letters in place of the non-adaptive letters. And we find that one or two of the new letters are in the right place, in addition to the one or two that we had from the last generation.
Keep this up, and after a few generations, you'll have the final sentence. And it won't take trillions and trillions of years, either. If you programmed a computer to do it for you, you'd have the final product in probably less than 60 generations, and almost certainly less than 100.
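The spin-the-wheel process above can be sketched in a few lines of Python. This is a minimal illustration, not anyone's actual program: the target sentence, the alphabet, and the generation cap are placeholders I've chosen (including a 28-letter target rather than 41, which only shifts the generation count a bit).

```python
import random
import string

ALPHABET = string.ascii_uppercase + " "   # the "big genetics wheel"
TARGET = "METHINKS IT IS LIKE A WEASEL"   # placeholder final product

def spin_wheel():
    """One spin of the big genetics wheel: a single random letter."""
    return random.choice(ALPHABET)

def evolve(target=TARGET, max_generations=1000):
    # Generation 0: a completely random chain.
    chain = [spin_wheel() for _ in target]
    for generation in range(1, max_generations + 1):
        if chain == list(target):
            return generation
        # Letters already in the right place are the adaptive trait:
        # they get passed on. Only the non-adaptive letters are re-spun.
        chain = [c if c == t else spin_wheel() for c, t in zip(chain, target)]
    return None

generations = evolve()
```

Run it a few times: it typically finishes in on the order of a hundred generations, not trillions of years. The key point is that correct letters, once found, are never thrown away.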
It is a random process, but some random products are more successful than others. That's what I'm talking about, and that's why this article is dead wrong. Period.
How can a random process accrue 'data' to achieve some eventual state when said state is supposed to be an unknown?
Well, that's where the "million monkeys" analogy breaks down ;)
There's no selective pressure in monkeys typing randomly, so there's no reason for them to eventually produce "Hamlet." If we imagine a selective pressure - e.g., we reward monkeys that can produce things a little bit like "Hamlet", and shoot the monkeys that type gibberish, we'd have a selective pressure. And then we up the bar a little bit by rewarding the few monkeys that can produce something somewhat like "Hamlet," and shooting the monkeys that only produce stuff a little bit like "Hamlet." And then we up the bar again by rewarding monkeys that produce stuff that's a lot like "Hamlet" and shooting all the lesser monkeys.
Keep that up for a while, and you'll get "Hamlet" out of a monkey soon enough ;)
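The escalating reward-and-shoot scheme above is just selection on a fitness score. Here's a hedged sketch of it, using a short placeholder line instead of all of "Hamlet"; the mutation rate, population size, and function names are my own inventions for illustration.

```python
import random
import string

ALPHABET = string.ascii_uppercase + " "
TARGET = "TO BE OR NOT TO BE"   # a short stand-in for "Hamlet"

def fitness(text):
    """How "Hamlet"-like a string is: count of letters in the right place."""
    return sum(a == b for a, b in zip(text, TARGET))

def breed(parent, mutation_rate=0.05):
    """Copy the parent, with occasional random typos (mutations)."""
    return "".join(random.choice(ALPHABET) if random.random() < mutation_rate else c
                   for c in parent)

def monkey_selection(population_size=100, max_generations=20000):
    # Generation 0: pure gibberish.
    best = "".join(random.choice(ALPHABET) for _ in TARGET)
    for generation in range(1, max_generations + 1):
        if best == TARGET:
            return generation
        # Every monkey types a variation on the current best attempt;
        # only the most "Hamlet"-like one survives the cull.
        offspring = [breed(best) for _ in range(population_size)]
        best = max(offspring, key=fitness)
    return None

generations = monkey_selection()
```

Notice there's no goal-seeking inside any individual monkey; raising the bar each generation is what the `fitness` comparison does.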
The "random walk" is one example. Your position after every step depends on the preceding steps, yet each step is random. There are many such statistical processes in nature.
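A one-dimensional random walk makes the point concrete. Each individual step is random, yet the walker's position at any moment carries the full history of every step that came before; the step count and coin-flip steps below are just one simple choice of walk.

```python
import random

def random_walk(n_steps):
    """A 1-D random walk: each step is a coin flip, but the position
    accumulates the history of all preceding steps."""
    position = 0
    path = [position]
    for _ in range(n_steps):
        position += random.choice([-1, 1])   # the random part
        path.append(position)                # depends on every prior step
    return path

path = random_walk(10)
```

So "random" and "connected to prior attempts" aren't contradictory at all: each step here is independent, but the outcome at each point is not.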