Free Republic

Why do you debate about evolution?
me ^ | 2-5-2002 | me

Posted on 02/05/2002 8:18:30 AM PST by JediGirl

For those of us who are constantly checking up on the crevo threads, why do you debate the merits (or perceived lack thereof) of evolution?


TOPICS: Miscellaneous; Your Opinion/Questions
KEYWORDS: crevolist
To: Diamond
What a great post. You must be the most 'cordial' person on this forum.
461 posted on 02/09/2002 1:57:23 PM PST by Nebullis
[ Post Reply | Private Reply | To 450 | View Replies]

To: Nebullis
What are these barriers?

Plant and animal breeders have been limited by these barriers for thousands of years. Ernst Mayr called the limit on the amount of genetic variability available in a species "genetic homeostasis".

(This must be my lucky day - thank you for the comment about cordiality. Information involving both a sender and a receiver requires that I infer one of at least two possible intents of an apparent compliment, so I'll go with the odds and pick your sincerity, which has been observed in abundance in other postings:-)

Cordially

462 posted on 02/10/2002 7:02:16 AM PST by Diamond
[ Post Reply | Private Reply | To 460 | View Replies]

To: Diamond
Plant and animal breeders have been limited by these barriers for thousands of years. Ernst Mayr called the limit on the amount of genetic variability available in a species "genetic homeostasis".

Genetic homeostasis is not a barrier to speciation. (Lerner coined the term, btw.) It means that populations have a tendency toward a mean fitness. That mean can still change via genetic drift, for example.

Artificial selection by plant and animal breeders for specific traits within species shows a continuous change over time. A recent post Best in Breed may interest you. Supposed limits in breeding of yesteryear are easily surpassed today.

463 posted on 02/10/2002 2:55:32 PM PST by Nebullis
[ Post Reply | Private Reply | To 462 | View Replies]

To: Diamond
I notice you haven't answered the question I put to you in my last message here.

In your own words, what is the difference between "coded" information and any other kind of information? What experiment would you propose to distinguish between the two?

There are a few other problems in your reply to me, and if there's still activity on this thread, I'll post some responses.

464 posted on 02/11/2002 6:35:08 AM PST by Karl_Lembke
[ Post Reply | Private Reply | To 418 | View Replies]

To: Karl_Lembke
The issue is addressed at length in an article I linked in my reply, but here is a very brief snippet from Part 3:

Part 3:  The concept of information:
a Shannon approach

Shannon’s theory of information, while useful in the context of telecommunications, does not seem to help anyone much in the evolution/creation debate. The purpose in spending the effort at all here is to clarify why a richer concept, discussed in Part 4, becomes necessary.

Building on ideas Shannon developed in 1948, Dawkins expresses (here in summarized form) the key ingredients of his view of information.

‘Redundancy was a second technical term introduced by Shannon, as the inverse of information. … Redundancy is any part of a message that is not informative, either because the recipient already knows it (is not surprised by it) or because it duplicates other parts of the message. …

‘Note that Shannon’s definition of the quantity of information is independent of whether it is true. The measure he came up with was ingenious and intuitively satisfying. Let’s estimate, he suggested, the receiver’s ignorance or uncertainty before receiving the message, and then compare it with the receiver’s remaining ignorance after receiving the message. The quantity of ignorance-reduction is the information content. Shannon’s unit of information is the bit, short for ‘binary digit’. One bit is defined as the amount of information needed to halve the receiver’s prior uncertainty, however great that prior uncertainty was …’63

‘In practice, you first have to find a way of measuring the prior uncertainty — that which is reduced by the information when it comes. For particular kinds of simple message, this is easily done in terms of probabilities. …

‘In a message that is totally free of redundancy, after there’s been an error there is no means of reconstructing what was intended. Computer codes often incorporate deliberately redundant ‘parity bits’ to aid in error detection. DNA, too, has various error-correcting procedures which depend upon redundancy. …

‘DNA carries information in a very computer-like way, and we can measure the genome’s capacity in bits too, if we wish. DNA doesn’t use a binary code, but a quaternary one. Whereas the unit of information in the computer is a 1 or a 0, the unit in DNA can be T, A, C or G. …

‘Whenever prior uncertainty of recipient can be expressed as a number of equiprobable alternatives N, the information content of a message which narrows those alternatives down to one is log2N (the power to which 2 must be raised in order to yield the number of alternatives N). …

‘When the prior uncertainty is some mixture of alternatives that are not equiprobable, Shannon’s formula becomes a slightly more elaborate weighted average, but it is essentially similar.’64

‘The true information content is what’s left when the redundancy has been compressed out of the message’20
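Shannon's two measures quoted above (log2 N for equiprobable alternatives, and the "slightly more elaborate weighted average" for unequal probabilities) can be sketched in a few lines. This is a generic illustration in Python, not code from the article:

```python
import math

def info_equiprobable(n_alternatives):
    # Information (in bits) of a message that narrows N equally
    # likely alternatives down to one: log2(N).
    return math.log2(n_alternatives)

def info_weighted(probabilities):
    # Shannon's weighted average for alternatives that are not
    # equiprobable: -sum(p * log2(p)) over the alternatives.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# One bit halves the receiver's prior uncertainty:
print(info_equiprobable(2))   # 1.0
# 16 equiprobable alternatives take 4 bits to resolve:
print(info_equiprobable(16))  # 4.0
# Unequal probabilities carry less than the equiprobable maximum:
print(info_weighted([0.9, 0.1]))  # about 0.47, versus 1.0 for a fair coin
```

By this measure a quaternary DNA symbol (T, A, C, or G) resolves log2 4 = 2 bits, which is the passage's point that genome capacity can be counted in bits if we wish.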

Testing the Relevance of this Definition of Information

Experiment # 1. Tell me about the information content of the following message:

1011100100100011

Experiment # 2. In the following pairs which has the most information:

11010010101 or 1001101

E=mc2 or the big brown dog which

Experiment # 3. Tell me about the information content of the following message:

Be6

We see we are in trouble. Every attempt at an answer seems to start with, ‘It depends’. Let’s give some thought to these three experiments.

In experiment # 1, the bits could represent characters in a computer’s extended ASCII code or a whole sentence in a secret agent’s code look-up book. What did I actually intend by this bit sequence? The first 4 bits represent 1 out of 16 books (24; we’ll let 0000 represent book 0, the first one) I have agreed to in advance with my secret agent in Bolivia; the next 8 bits represent a page number, between 1 and 256 (28; 00000000 represents the first page); the final 4 bits represent a sentence on that page. In case you are interested, book number 15 was Sun Tzu’s, ‘The Art of War’ page 146 sentence 3 and the intended message was: ‘Expendable agents are those of our own spies who are deliberately given fabricated information.’
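The decoding convention described in experiment #1 (4 bits for the book, 8 for the page, 4 for the sentence) can be expressed directly. This sketch, with function and variable names of our own choosing, just slices out the raw zero-based field values:

```python
def decode_spy_message(bits):
    # Split a 16-bit message into the three fields described above:
    # 4-bit book index, 8-bit page index, 4-bit sentence index,
    # each read as an unsigned binary number (0000... = first item).
    assert len(bits) == 16 and set(bits) <= {"0", "1"}
    book = int(bits[0:4], 2)        # 1 of 16 pre-agreed books
    page = int(bits[4:12], 2)       # 1 of 256 pages
    sentence = int(bits[12:16], 2)  # 1 of 16 sentences on that page
    return book, page, sentence

book, page, sentence = decode_spy_message("1011100100100011")
```

The point of the experiment survives the sketch: these raw field values mean nothing without the pre-agreed convention mapping them to particular books, pages, and sentences.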

In experiment # 2 we cannot select between the pairs without knowing what the intended meaning was or how this was coded. The short code, E=mc2, provides a huge amount of information and causes a great amount of surprise. The information content cannot be captured by Dawkins’ concept of information.

The three letters in experiment # 3 might have several meanings. Mine was ‘Bishop to e6’ (= ‘Bishop to [White’s] King 6’, in the obsolete descriptive notation). Its information value depends on the particular settings of the other chess pieces. Shannon might have argued that the number of potential moves available reflects the space of possibilities, and the densest communication of the move chosen can be represented through bits of 0 and 1. This is not very helpful nor does it reflect the usual, and my, meaning of information in this context. In this case the total information can be what I now surmise, after receiving the message, about the intentions of my chess opponent and this depends on the context. Some possible conclusions might be:

  1. ‘What a dummy. That move makes no sense, he doesn’t have a clue about what I am up to. He’s dead meat.’

  2. ‘Aha, that closes off the last escape route for my king. Given the layout of the whole chess board, his strategy seems to be a direct attack on the big guy.’

  3. ‘Oh no, he just opened my queen to attack by his rook and simultaneously attacked my unprotected knight. Time to bump the table over.’

Many information theorists have concluded that defining information in terms of binary bits across a communication channel is useful only in limited contexts. This includes Professor Gitt (see Part 4), who went on to develop a detailed theory involving sender and receiver pairs, which indeed allows us to identify how the three experiments outlined above can be handled. These ideas will be discussed later.

Let's consider an example Dawkins offers, based on Shannon's theory:... [snip]

Cordially,

465 posted on 02/11/2002 10:13:16 AM PST by Diamond
[ Post Reply | Private Reply | To 464 | View Replies]

To: Diamond
OK, since you're still looking in here, here's a reply...
Chemistry operates by deterministic rules, which depend only on the electrical properties of the electrons in the atoms that make up matter.

You cannot therefore explain or make sense of the fact of your own consciousness or thinking or free will, because any explanation is nothing more than the deterministic movement of electrons in the atoms that make up matter, which are entirely coercive in that they operate by nothing but physical force. Angelo dealt with this problem in much more detail earlier in the thread.

Can you prove that there's anything going on besides naturalistic processes operating under natural law? Unless you show that there's something actually there, there's no reason for science to incorporate it in its thinking.


"Michael Denton, in "Evolution, a Theory in Crisis", notes:

The basic outline of the traditional evolutionary scenario is well known. ... The existence of a prebiotic soup is crucial to the whole scheme. Without an abiotic accumulation of the building blocks of the cell no life could ever evolve.

Firstly, we have Denton beating his head against a picture that is at least fifty years old, and does not appear to be the current thinking. In a way, it's a bit like an astronomer arguing that the Ptolemaic crystalline spheres don't exist, and so the planets move the way they do on account of supernatural forces.

Nowadays, rather than a "pre-biotic soup", research is focusing on the properties of surfaces as hosts for the chemical reactions that led to life.

Nevertheless, Denton's reasoning is very interesting, and betrays his antipathy to the notion of naturalistic explanations of life. He reasons that, because we don't find precisely the evidence he thinks we should find for one particular naturalistic explanation, no naturalistic explanation is valid.

This is a bit like saying that because AIDS isn't caused by a bacterium, it's caused by supernatural forces.

Secondly, it's very common for anti-evolutionists to retreat into abiogenesis, and point out that we don't know everything about how life came to be. While this is an interesting question, it's only peripherally related to evolution. Once the first living thing exists, all the hardware needed for evolution is in place, and small changes can accumulate in lines of descent from that first living thing. Whether evolution can give rise to a variety of organisms does not depend on where that first living thing came from.

It could have arisen from prebiotic chemical reactions, or it could have been designed by any of an infinite set of possible designers (God, Allah, Zeus, Buddha, Wakonda, God and Zeus, God and Buddha, God and Wakonda, Zeus and Wakonda, etc.). It doesn't matter.


I'll give you all the molecules of life. All the RNA. All the DNA. All the amino acids. All the enzymes. All the membranes. All the cell structures. A hermetically sealed can of sardines has all of the components necessary for life that you mentioned and more. Imagine that the can of sterile sardines is as big as the earth. Wait for life to spontaneously appear. All the billions of sterile sardine cans that have ever existed constitute billions upon billions of experimental observations that life does not arise spontaneously, even given all the necessary components.

One problem with the canned sardines system is that it lacks an energy source capable of rearranging certain chemical bonds. You need something like an electric spark, or high-energy photons. Furthermore, the molecules in a can of sardines are already incorporated into structures, and no one keeps canned sardines around long enough for these to degrade into elementary pieces.

Why don't you look up the research that's being done, and see what sorts of chemicals are being studied, and what kinds of results people are getting?

What you are really claiming is that non-living molecules and non-living matter are capable of producing results that can only be ascribed to technical intelligence (thought) or to life. You ascribe to non-living matter the properties of intelligence or of life itself. You are essentially saying that non-living matter is creative.

What you're claiming is that these processes require thought. Creationist discussions of information theory are logically flawed, because they assume what they set out to prove. Inevitably, they declare that some sort of intelligent design must have taken place in some process because that process requires intelligent design.

A quantum state is not nothing, but a set of conditions with laws of its own, which still requires an explanation as to its origin.

Okay, what is the origin of quantum fluctuations? How do we tell?

466 posted on 02/11/2002 8:41:20 PM PST by Karl_Lembke
[ Post Reply | Private Reply | To 418 | View Replies]

To: Diamond
OK, you are indeed confusing information with meaning. Meaning can be, and very often is, determined after the fact. Very few, if any, things have any intrinsic meaning.
467 posted on 02/11/2002 8:43:22 PM PST by Karl_Lembke
[ Post Reply | Private Reply | To 465 | View Replies]

To: Diamond
I've skimmed the article you referenced, and I have a few questions.

1. An electrically charged object can be seen as generating a signal. Electric charges generate a flood of virtual photons, which transmit forces between those objects and other charged objects. These forces tend to repel or attract the other objects.

In a way, these photons can be seen as carrying a message, saying "come closer" or "move farther away". Is this a valid signal? Is it a coded signal? Please explain why or why not. (I'm trying to determine how this notion of information is applied, and to see if you -- or anyone, for that matter -- really understands it.)

2. Suppose a string of amino acids is assembled at random. We know from organic chemistry that such a random protein will very likely catalyze some chemical reaction. Is there information in this random chain of amino acids? How do we measure or interpret this information? Where did it come from?

3. Presumably, the difference between living and nonliving matter depends on the information content, otherwise you wouldn't have brought up information and abiogenesis in the same message.

When the first living thing came to be, where did the critical information come from? How was it delivered to the system? Can we see any instances of any similar processes today?

3-a. When a living thing dies, where does its information go? Does it simply vanish into nowhere? If information can vanish into nowhere, can it emerge from nowhere?

3-b. Is information conserved? When a living thing reproduces itself, does this create additional information? Where does this information come from?

From where I sit, the notions of information you are relying on appear to be very ill-defined. Indeed, they seem to amount to hand-waving, serving no function other than to sound impressive while saying "it just happened that way".

I'll be interested in seeing how you apply information theory to answer these questions, and I'd appreciate it if you can show your work.

........Karl

468 posted on 02/11/2002 8:56:34 PM PST by Karl_Lembke
[ Post Reply | Private Reply | To 465 | View Replies]

To: Karl_Lembke
Can you prove that there's anything going on besides naturalistic processes operating under natural law? Unless you show that there's something actually there, there's no reason for science to incorporate it in its thinking.

There's a lot to be unpacked here. First, proof itself is a non-naturalistic property, meaning that it is not a scientific, physical characteristic of physics or chemistry. It has no physical weight or volume. It doesn't have a positive or negative charge, etc. If science deals only with the physical universe of cause and effect, governed by natural laws in a metaphysically closed system, then by definition science cannot address itself directly to the existence of all sorts of non-physical things like proofs, propositions, numbers, mental states, thoughts, and beliefs, simply because they are not physical things. The crux of the matter is the tacit proposition that only what can be known by science exists or is true; yet that proposition is not itself a statement of science but a philosophical statement about science, because it is not based upon or proven by scientific investigation. It is a belief about science that cannot be empirically verified. A request for "proof" is therefore outside your own metaphysical restriction of naturalism. It's a statement of belief about science that fails its own test.

Once the first living thing exists, all the hardware needed for evolution is in place, and small changes can accumulate in lines of descent from that first living thing. Whether evolution can give rise to a variety of organisms does not depend on where that first living thing came from.

Whether all the hardware needed for evolution is in place once the first living thing exists is the very thing under contention. I would say that whether evolution can give rise to a variety of organisms depends not so much on where the first living thing came from as on its properties and characteristics, because everything that has lived or lives is supposedly descended from it.

what is the origin of quantum fluctuations? How do we tell?

Good question. I could easily ask, what is the purely naturalistic origin of quantum fluctuations? It may turn out to be beyond the ability of science to ascertain the answer. I don't know. But I'm not opposed to trying to find out.

I will try to respond to some of your other points when I get a chance. This work thing keeps getting in my way.

Cordially,

469 posted on 02/12/2002 7:21:49 AM PST by Diamond
[ Post Reply | Private Reply | To 466 | View Replies]

To: Diamond
Once the first living thing exists, all the hardware needed for evolution is in place, and small changes can accumulate in lines of descent from that first living thing. Whether evolution can give rise to a variety of organisms does not depend on where that first living thing came from.

Whether all the hardware needed for evolution is in place once the first living thing exists is the very thing under contention. I would say that whether evolution can give rise to a variety of organisms depends not so much on where the first living thing came from as on its properties and characteristics, because everything that has lived or lives is supposedly descended from it.

Living things reproduce. That's part of the definition of life.

No living thing makes perfect copies of itself. Mutations always creep in. That's variation.

Some variations are harmful, some are neutral, and some are beneficial. All are variations. As we proceed generation after generation, variations will pile up. There is no avoiding it.

Alternatively, demonstrate a barrier which prevents variation from accumulating beyond a certain level. In particular, given the genomes of different life forms (information which is becoming increasingly available), demonstrate the existence of a "no man's land" between any two living things.

Such a "no man's land" would be a genome intermediate between creature X and creature Y, which was incompatible with life. Such a "no man's land" would also have to be athwart the only mutational pathway between X and Y, and a large enough gap that the chance of jumping that gap is vanishingly small.

Thus, a hypothetical DNA sequence in a cat, ...CATCATCAT..., codes for some protein. In another creature, it's ...GATTATCAT....

Showing that the proposed intermediate ...GATCATCAT... yields a nonfunctional protein will not help, if it turns out that ...CATTATCAT... yields a functional one.
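Whether a "no man's land" lies athwart every route can be checked mechanically once the sequences are in hand: enumerate the orderings of the single-base differences and inspect each intermediate. A minimal sketch, using the hypothetical sequences above:

```python
from itertools import permutations

def point_mutation_paths(start, end):
    # Yield every ordering of the single-base substitutions that
    # turn `start` into `end`, listing each intermediate sequence.
    assert len(start) == len(end)
    diffs = [i for i in range(len(start)) if start[i] != end[i]]
    for order in permutations(diffs):
        seq, path = list(start), [start]
        for i in order:
            seq[i] = end[i]
            path.append("".join(seq))
        yield path

for path in point_mutation_paths("CATCATCAT", "GATTATCAT"):
    print(" -> ".join(path))
# CATCATCAT -> GATCATCAT -> GATTATCAT
# CATCATCAT -> CATTATCAT -> GATTATCAT
```

With only two differing sites there are just two routes, which is why showing one dead intermediate (GATCATCAT) proves nothing if the other route (via CATTATCAT) stays open.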

No one has shown that any such barriers to speciation, or differentiation at higher taxonomic levels, exist.

The first person to do so will probably win the Nobel prize.

470 posted on 02/12/2002 11:43:47 AM PST by Karl_Lembke
[ Post Reply | Private Reply | To 469 | View Replies]

To: Karl_Lembke
1. An electrically charged object can be seen as generating a signal. Electric charges generate a flood of virtual photons, which transmit forces between those objects and other charged objects. These forces tend to repel or attract the other objects.

In a way, these photons can be seen as carrying a message, saying "come closer" or "move farther away". Is this a valid signal? Is it a coded signal? Please explain why or why not. (I'm trying to determine how this notion of information is applied, and to see if you -- or anyone, for that matter -- really understands it.)

I've been thinking about your questions for a couple of days, believe it or not. Rather than re-inventing the wheel, allow me to refer you to what William Dembski has already written about some of the issues you have raised here in a very detailed way, including what constitutes information, how it is measured (21 paragraphs), and how it relates to the study of evolutionary biology (25 paragraphs). He says that information in a very general sense can be defined as the actualization of one possibility to the exclusion of others. Information can be measured in terms of its complexity. Smaller probabilities signify more information, not less. A higher level of complexity would be the actualization of circumscribed possibilities corresponding to patterns. A still higher level of information would be when the patterns are specified, that is, independently given in advance (i.e., not simply read off after the fact). I would say (leaving aside the existence and nature of the photons themselves for the moment) that they constitute an elemental level of information between a non-intelligent sender and receiver, because they actualize one possibility to the exclusion of others. I would not think that this level of information is coded, because "coded" to me implies the existence of a convention governing the exchange of information between sender and receiver.

2. Suppose a string of amino acids is assembled at random. We know from organic chemistry that such a random protein will very likely catalyze some chemical reaction. Is there information in this random chain of amino acids? How do we measure or interpret this information? Where did it come from?

3. Presumably, the difference between living and nonliving matter depends on the information content, otherwise you wouldn't have brought up information and abiogenesis in the same message.

When the first living thing came to be, where did the critical information come from? How was it delivered to the system? Can we see any instances of any similar processes today?

I think to ask where information comes from is to ask what the fundamental nature of the universe is. It may be like asking where numbers come from. That is a very, very big question. Again, Dembski has this to say (in part): ..."The abiotic infusion of exogenous information is the great mystery confronting modern evolutionary biology. It is Manfred Eigen's mystery with which we began this paper. Why is it a mystery? Not because the abiotic infusion of exogenous information is inherently spooky or unscientific, but rather because evolutionary biology has failed to grasp the centrality of information to its task. The task of evolutionary biology is to explain the origin and development of life. The key feature of life is the presence of complex specified information--CSI. Caught up in the Darwinian mechanism of selection and inheritance with modification, evolutionary biology has failed to appreciate the informational hurdles organisms need to jump in the course of natural history. To jump those hurdles, organisms require information. What's more, a significant part of that information is exogenous and must originally have been infused abiotically.

"In this section I want briefly to consider what evolutionary biology would look like if information were taken as its central and unifying concept. First off, let's be clear that the Darwinian mechanism of selection and inheritance with modification will continue to occupy a significant place in evolutionary theory. Nevertheless, its complete and utter dominance in evolutionary theory--that selection and inheritance with modification together account for the full diversity of life--this inflated view of the Darwinian mechanism will have to be relinquished. As a mechanism for conserving, adapting, and honing already existing biological structures, the Darwinian mechanism is ideally suited. But as a mechanism for innovating irreducibly complex biological structures, it utterly lacks the informational resources. As for symbiotic infusion, its role within an information-theoretic framework must always remain quite limited, for even though it can account for how organisms trade already existing biological information, it can never get at the root question of how that biological information came to exist in the first place.

"Not surprisingly, therefore, the key task an information-theoretic approach to evolutionary biology faces is to make sense of abiotically infused CSI. Abiotically infused CSI is information exogenous to an organism, but which nonetheless gets transmitted to and assimilated by the organism. Two obvious questions now arise: (1) What is the mode of transmission of abiotically infused CSI into the organism? and (2) Where is this information prior to being transmitted? If this information is clearly represented in some empirically accessible non-biological physical system, and if there is a clear informational pathway from this system to the organism, and if this informational pathway can be shown suitable for transmitting this information to the organism so that the organism properly assimilates it, only then will these two questions receive an empirically adequate naturalistic answer. But note that this naturalistic answer, far from eliminating the information question, simply pushes it one step further back, for how did the CSI that was abiotically infused into an organism first get into a non-organism? Because of the Law of Conservation of Information, whenever we inquire into the source of some information, we never resolve the information problem, but only intensify it. This is not to say that such inquiries are unilluminating (contra Dawkins, 1987, pp. 11-13; and Dennett, 1995, p. 153, who think that the only valid explanations in evolutionary biology are reductive, explaining the more complex in terms of the simpler). We learn an important fact about a pencil when we learn a certain pencil-making machine made it. Nonetheless, the information in the pencil-making machine exceeds the information in the pencil. The Law of Conservation of Information guarantees that as we trace informational pathways backwards, we have more information to explain than we started with..."

3-a. When a living thing dies, where does its information go? Does it simply vanish into nowhere? If information can vanish into nowhere, can it emerge from nowhere?

3-b. Is information conserved? When a living thing reproduces itself, does this create additional information? Where does this information come from?

Dembski argues that information is conserved: "...(1) Chance generates contingency, but not complex specified information. (2) Functions (e.g., algorithms and natural laws) generate neither contingency, nor information, much less complex specified information... This result, that neither chance nor functions nor some combination of the two can generate CSI (complex specified information), I call the Law of Conservation of Information, or LCI for short. Though formulated at a high level of mathematical abstraction, LCI has many profound implications for science. Among its immediate corollaries are the following: (1) The CSI within a system closed to outside information always remains constant or decreases. (2) If CSI increases within a system, then CSI was added exogenously. (3) CSI cannot be generated spontaneously, originate endogenously, or organize itself. (4) To explain the CSI within a system is to appeal to a system whose CSI is equal or greater in complexity still (in particular, reductive explanations of CSI are never adequate)..."

From where I sit, the notions of information you are relying on appear to be very ill-defined. Indeed, they seem to amount to hand-waving, serving no function other than to sound impressive while saying "it just happened that way".

Dembski's information-theoretic work is regarded as mathematically rigorous and fairly respectable, even by those who disagree with his conclusions.

I'll be interested in seeing how you apply information theory to answer these questions, and I'd appreciate it if you can show your work.

Well thank you for the compliment. I'm guessing that my reply here will have disappointed you, but if I were able to answer some of these questions I would win the Nobel Prize. I don't think you really want to see my work because I'm a paralegal, and it's very, very, very boring stuff. That's why I hang around here. You people make me think much more.

Cordially,

471 posted on 02/14/2002 8:05:05 AM PST by Diamond
[ Post Reply | Private Reply | To 468 | View Replies]

To: Diamond
OK, here it is. It's a bit long, I'm afraid...

Indeed, it seems I'm going to have to break it into pieces. <sigh>


Rather than re-inventing the wheel, allow me to refer you to what William Dembski has already written...

Don't mind if I do.

The distinction between specified and unspecified information may now be defined as follows: the actualization of a possibility (i.e., information) is specified if independently of the possibility's actualization, the possibility is identifiable via a pattern. If not, then the information is unspecified. Note that ... specified information cannot become unspecified, though unspecified information may become specified information. ... For instance, a cryptographic transmission whose cryptosystem we have yet to break will constitute unspecified information. Yet as soon as we break the cryptosystem, the cryptographic transmission becomes specified information.

Um. Dembski has as much as stated that information can be both specified and unspecified at the same time. The encrypted transmission referred to above is presumably "specified" as far as the transmitter and intended recipient (by convention in discussions of cryptographic protocols, "Alice" and "Bob") are concerned. The fact that an eavesdropper ("Eve") is unable to decipher the message does not make it any less "specified" for Alice and Bob. And presumably, the fact that Alice and Bob know what the message says does not make it "specified" for Eve.

We also have signals that are alleged to be specified, but may or may not be. You may have heard of the book, The Bible Code. The thesis of this book is that messages are hidden in the text of the Tanach (The Hebrew Bible), and these messages may be discovered by reading every Nth letter. Different sets of messages are found by using different values of N. A large number of messages have been "discovered" by this method, and these messages seem to refer to recent and current events of significance.

The problem is, although every individual "message" is highly improbable, the set of possible matches to any given string is quite large. The result is something we might call "The Rorschach Effect", or maybe "The Nostradamus Effect". Given any sufficiently ambiguous signal, a "match" can usually be found, especially if you're not too picky about how close a match you get.

Indeed, in critiques of The Bible Code, one reviewer applied the same test to a classic novel. (I think it may have been Moby Dick.) He found similarly significant "messages". Conclusion: complex specified information can appear spontaneously without the intervention of any complex specified design.
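The every-Nth-letter hunt is easy to reproduce. Here is a minimal sketch (the function name and sample text are my own, purely illustrative) showing how scanning over many skip distances and start positions manufactures "hidden" words out of ordinary prose:

```python
def els_matches(text, word, max_skip=50):
    """Count every-Nth-letter ("equidistant letter sequence")
    occurrences of word in text, over skips 1..max_skip and
    all starting positions."""
    letters = [c for c in text.lower() if c.isalpha()]
    hits = 0
    for skip in range(1, max_skip + 1):
        for start in range(len(letters)):
            candidate = "".join(letters[start::skip][:len(word)])
            if candidate == word:
                hits += 1
    return hits

# The search space (starts x skips), not the text, supplies the
# matches -- the Rorschach/Nostradamus point made above.
sample = "call me ishmael some years ago never mind how long " * 40
print(els_matches(sample, "sage"))
```

The larger the text and the more skips you allow, the more "messages" appear, with no designer anywhere in sight.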

Continued...

472 posted on 02/20/2002 10:02:11 PM PST by Karl_Lembke
[ Post Reply | Private Reply | To 471 | View Replies]

To: Diamond
Continued...
...The key conceptual difficulty here is to characterize the independence condition that obtains between patterns and information.

Indeed.

Is the origin of life specified?

That is a very good question. In a way, it is. The question is, how specified is it? And how complex?

Current research in abiogenesis is focusing on, among other things, RNA. RNA has been shown to form spontaneously, given the right conditions. It has even been shown to polymerize under the right conditions. Some of these polymers have been shown to catalyze various chemical reactions, including the polymerization of RNA.

Once we have a system in place that makes copies of RNA, a form of evolution can take place.

We may never know the exact pathway that life took. If life is as improbable as Dembski and others think it is, only a handful of pathways would have been viable, and we might know with high confidence how it came to be. I suspect, though, there will turn out to be a multitude of different pathways that were available and could have been followed. To make up some numbers, a 100-stage process with two alternatives at only ten of those stages yields 1024 different pathways. Unless every pathway leads to a different final result, we won't be able to tell which one was followed.
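The made-up numbers can be multiplied out directly; this small sketch (the helper name is mine) counts routes by taking the product of the alternatives at each stage:

```python
from math import prod

def pathway_count(choices_per_stage):
    """Distinct end-to-end routes through a multi-stage process,
    given the number of alternatives available at each stage."""
    return prod(choices_per_stage)

# 100 stages, two alternatives at ten of them, no choice elsewhere:
stages = [2] * 10 + [1] * 90
print(pathway_count(stages))  # -> 1024
```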

The Law of Conservation of Information...

On reading through this section, I see that in fact information, or at least CSI, is not conserved. In the third paragraph of this section, we find that the information from one source, upon being filtered through a function, can never be greater (according to Dembski) than the information that was originally present in that source. It can be less. Indeed, if a message is destroyed, then it would seem to follow that the information is also destroyed. This is not how conservation laws work.

Secondly, information can be created, simply by copying it. A message may have N bits, and a copy of that message would have N bits as well. Two copies of the message have 2N bits, although the Shannon entropy would be only slightly larger than N. (A file compression program would code the new message as "two copies of ...".) Once a second (third, fourth, etc) copy of a message exists, it can then be changed slightly, depending on environmental conditions. For example, I have had occasion to write batch files which begin as lots of copies of "move file A to file 2001A in directory D". I change the "A" in the second copy to "B", to "C" in the third copy, and so on. Although intelligently designed in this case, it shows how copies of information, after being slightly altered, can become additional information.

The question of whether the new information in copies subjected to random changes is useful is another question entirely. The answer seems to be that sometimes it is. Dawkins discusses a case where mammalian hemoglobin appears to have arisen from just such a duplication-and-modification event. The modified hemoglobin is at least as efficient as the original ancestral form, and a synergistic interaction between the two types makes the entire system more efficient. If, in fact, this is the result of duplication-and-modification, then it is an example of information created with no apparent creator.

In his application of his variant of information theory to evolution, he makes a serious blunder in paragraphs 9 and 10. He counts bits of new information by counting the number of offspring in any given critter, and taking the log2 of that number. Thus, a critter with an average litter size of 4 increases the information content of the species by two bits. A critter with one offspring increases the information content of the species by zero bits.

This is wrong. The information content of (to use Dembski's term) an actualized possibility is not based on the number of instances of that actualization, but on the number of a priori possibilities there were to begin with. For example, in human reproduction, 46 chromosomes are combined from the male and female gametes in the child zygote. Each chromosome had a 50% chance of being selected from the parent's chromosome pair. Thus, the chance of any particular set of chromosomes being pieced together in the final zygote is one in 2^46, or one in just over seventy trillion. By this argument, each human child increases the information in the species by 46 bits.
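The assortment arithmetic is quick to check (a sketch: 23 chromosome pairs per parent, so 46 independent two-way choices in all):

```python
from math import log2

# Each of the 46 chromosomes in the zygote is one of two homologues,
# chosen independently, so a particular combination is one of 2**46.
possibilities = 2 ** 46
print(possibilities)         # 70368744177664 -- just over seventy trillion
print(log2(possibilities))   # 46.0 bits per child, on this argument
```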

But wait, there's more!

Continued...

473 posted on 02/20/2002 10:04:04 PM PST by Karl_Lembke
[ Post Reply | Private Reply | To 471 | View Replies]

To: Diamond
Continued...

Each gene has one chance in 10,000 of mutating. There are between 30,000 and 70,000 genes in the human genome. (They're arguing over counting methods right now.) A mutation consists of, at least, a base change, from the current nucleotide base to one of three other bases. There are three billion base pairs in the human genome. 2% of that is in expressed genes. That means that 60 million base pairs are subject to being replaced with one of three choices. That's 180 million possible substitutions, for 27.4 bits of additional information. I'll leave insertions, deletions, duplications and crossing-over events for someone with a stronger math-ochistic streak than I have. (By the way, if I assume that all 3 billion bases have some meaning in the genetic code, we get to 33 bits per mutation.)

OOPS! I'm sorry, at 30,000 genes, that's an average of three mutations. Triple the 27.4, or the 33, whichever you prefer.

Anyway, we're now looking at over a hundred bits of new information with the birth of each child.
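Running that arithmetic (a sketch using the figures quoted above: 60 million expressed base pairs, three alternative bases each, about three mutations per birth, plus the 46 assortment bits):

```python
from math import log2

expressed_bases = 60_000_000
substitutions = 3 * expressed_bases           # 180 million possibilities
bits_per_mutation = log2(substitutions)       # ~27.4 bits
bits_if_all_meaningful = log2(3 * 3_000_000_000)  # ~33.1 bits

per_birth = 46 + 3 * bits_per_mutation        # assortment + ~3 mutations
print(round(bits_per_mutation, 1))            # 27.4
print(round(bits_if_all_meaningful, 1))       # 33.1
print(round(per_birth))                       # 128 -- over a hundred bits
```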

Another blunder Dembski makes is that he neglects parallelism. He takes the case of a bacterium which divides every 20 minutes. This, he states, creates one bit of additional information. (Again, so much for "conservation of information".) He then divides 20 minutes into a billion years to get 26 trillion bits of information. Neglecting the fact that he's neglected sources of information, let's consider:

One bacterium divides, producing two daughter bacteria, and one bit of information. These daughters divide 20 minutes later, and each produces one bit. Total bits now = 3. At the end of a day, 4.7 × 10^21 bits of information have, in theory, been added. (This would produce about 10,000 cubic meters of bacteria. Not an unreasonable number -- yet.)
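The parallel accounting can be sketched in a couple of lines (one bit per division, the divisions forming a full binary tree):

```python
# After d doubling generations, 2**d - 1 divisions have occurred
# (every internal node of a full binary tree), hence 2**d - 1 bits
# on the one-bit-per-division accounting.
generations_per_day = 24 * 3        # one division every 20 minutes
bits_in_a_day = 2 ** generations_per_day - 1
print(f"{bits_in_a_day:.1e}")       # ~4.7e+21
```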

Using the numbers I calculate for the amount of information that can result from a birth, I note that the human race consists of about 6 × 10^9 individuals. Worldwide, each human gives rise to about 1.1 descendants. Thus, in this generation, we can expect the production of 6.6 × 10^9 × 100 bits, give or take. This is 6.6 × 10^11 bits per generation. A lot of that information can be filtered out, and still produce a measurable increase in useful information.

And in fact, all the information introduced into the genome is filtered, by a filter Dembski seems to prefer not to credit -- natural selection.

Natural selection is a filter which passes information which fits a certain specification more closely than other available information. The specification used is whatever criteria the environment cares to provide. If the environment includes a cold temperature, information which produces an ability to generate heat, or conserve heat, is going to be passed on as "useful". If the environment includes visible light, information which codes for the construction of some sort of receiver and translator will be "useful", and will be passed on.
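As a toy illustration of selection-as-filter (entirely made up: bit-strings stand in for genomes, and the count of 1s stands in for fit to some environmental criterion), variants are generated blindly, yet every survivor matches the "specification" the environment imposed:

```python
import random

random.seed(0)  # reproducible toy run

def fitness(variant):
    """Hypothetical environmental criterion: more 1-bits, better fit
    (standing in for, say, an ability to conserve heat)."""
    return variant.count("1")

# Generate variants with no regard whatsoever to the criterion...
population = ["".join(random.choice("01") for _ in range(8))
              for _ in range(1000)]

# ...then let the "environment" pass only those that meet it.
survivors = [v for v in population if fitness(v) >= 6]
print(len(survivors), "of", len(population), "variants pass the filter")
```

No variant was designed to fit, but the retained set fits the criterion by construction of the filter.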

Dembski has developed a nice vocabulary, and has thrashed through quite a bit of number theory, but he has not proven that any amount of intelligence is required for evolution to occur.


Well, I seem to have run out of stuff to say about information for the moment, and it's getting late. I'm going to upload this and go to bed.

G'night!

.............Karl

474 posted on 02/20/2002 10:05:06 PM PST by Karl_Lembke
[ Post Reply | Private Reply | To 471 | View Replies]

To: Karl_Lembke
...Dembski has just as much as stated that information can be both specified and unspecified at the same time. The encrypted transmission referred to above is presumably "specified" as far as the transmitter and intended recipient (by convention in discussions of cryptographic protocols, "Alice" and "Bob") are concerned. The fact that a would-be eavesdropper ("Eve") is unable to decipher the message does not make it any less "specified" for Alice and Bob. And presumably, the fact that Alice and Bob know what the message says does not make it "specified" for Eve.

This could be a semantic problem of trying to refer to relative and objective perspective at the same time. If it is a more substantial problem, perhaps something like an observer affecting the outcome, then the concept needs a sharper definition.

We also have signals that are alleged to be specified, but may or may not be. You may have heard of the book, The Bible Code. The thesis of this book is that messages are hidden in the text of the Tanach (The Hebrew Bible), and these messages may be discovered by reading every Nth letter. Different sets of messages are found by using different values of N. A large number of messages have been "discovered" by this method, and these messages seem to refer to recent and current events of significance.

The problem is, although every individual "message" is highly improbable, the set of possible matches to any given string is quite large. The result is something we might call "The Rorschach Effect", or maybe "The Nostradamus Effect". Given any sufficiently ambiguous signal, a "match" can usually be found, especially if you're not too picky about how close a match you get.

Indeed, in critiques of The Bible Code, one reviewer applied the same test to a classic novel. (I think it may have been Moby Dick.) He found similarly significant "messages". Conclusion: complex specified information can appear spontaneously without the intervention of any complex specified design.

The key is that the specifications have to be given in advance, so to speak. The Bible Code is like someone throwing a dart at a blank wall and then drawing the bullseye and the target around the dart. The question is, is the above really complex specified information, or is it a fabrication that gives the appearance of being complex, specified information? In the case of The Bible Code, you have correctly noted that the problem is that the set of possible matches to any given string is quite large. While I agree with your reasoning, I disagree with the conclusion that because something can be made to give an initial appearance of complex, specified information, CSI can therefore appear spontaneously without the intervention of any complex specified design.

Cordially

475 posted on 02/21/2002 6:38:07 AM PST by Diamond
[ Post Reply | Private Reply | To 472 | View Replies]

To: Diamond
The key is that the specifications have to be given in advance, so to speak. The Bible Code is like someone throwing a dart at a blank wall and then drawing the bullseye and the target around the dart. The question is, is the above really complex specified information, or is it a fabrication that gives the appearance of being complex, specified information? In the case of The Bible Code, you have correctly noted that the problem is that the set of possible matches to any given string is quite large. While I agree with your reasoning, I disagree with the conclusion that because something can be made to give an initial appearance of complex, specified information, CSI can therefore appear spontaneously without the intervention of any complex specified design.

Since you agree with my reasoning, and disagree with my conclusion, you must disagree with one or more of my premises. (Either that or you're not using logic.) Could you identify which premises you disagree with?

...........Karl

476 posted on 02/21/2002 10:06:58 AM PST by Karl_Lembke
[ Post Reply | Private Reply | To 475 | View Replies]

To: Diamond
First try came up "page cannot be displayed"...
The key is that the specifications have to be given in advance, so to speak. The Bible Code is like someone throwing a dart at a blank wall and then drawing the bullseye and the target around the dart. The question is, is the above really complex specified information, or is it a fabrication that gives the appearance of being complex, specified information? In the case of The Bible Code, you have correctly noted that the problem is that the set of possible matches to any given string is quite large. While I agree with your reasoning, I disagree with the conclusion that because something can be made to give an initial appearance of complex, specified information, CSI can therefore appear spontaneously without the intervention of any complex specified design.

Since you agree with my reasoning, and disagree with my conclusion, you must disagree with one or more of my premises. (Either that or you're not using logic.) Could you identify which premises you disagree with?

...........Karl

477 posted on 02/21/2002 10:07:31 AM PST by Karl_Lembke
[ Post Reply | Private Reply | To 475 | View Replies]

To: Karl_Lembke
I'm sorry for the slow responses. It's been hectic. I want to later respond to some of your other points as well.

All I'm saying is that from the premise that CSI can be fabricated, it does not necessarily follow that CSI itself can appear spontaneously without the intervention of any intelligent agent. The Bible Code patterns, rudimentary as they are, are still fabrications after the fact (done by an intelligent agent by design, btw), like me drawing the target around the dart that I just threw at the wall. The 'messages' in the Bible Code are not real messages at all, just as my phenomenal accuracy at dart-throwing is not real, but simply a fabrication after the fact. So the conclusion that CSI itself can arise spontaneously and autonomously does not follow from the premise.

Cordially,

478 posted on 02/22/2002 4:23:52 AM PST by Diamond
[ Post Reply | Private Reply | To 477 | View Replies]

To: JediGirl
To fight intimidations against the truth posing as theory fronts. Here at last we can voice against fascism freely.
479 posted on 02/22/2002 4:26:45 AM PST by lavaroise
[ Post Reply | Private Reply | To 1 | View Replies]

To: Karl_Lembke
Current research in abiogenesis is focusing on, among other things, RNA. RNA has been shown to form spontaneously, given the right conditions. It has even been shown to polymerize under the right conditions. Some of these polymers have been shown to catalyze various chemical reactions, including the polymerization of RNA.

Could you tell me to what research you are referring? I have been laboring under the impression that even with high levels of investigator interference to simulate 'the right conditions', and even given, somehow, a presently unknown prebiotic synthesis of cytosine, and even if building blocks could have formed polymers, there would have been no tendency to form the high-information polymers required for life as opposed to random ones, and any polymers would have readily hydrolysed. As late as 1999 Shapiro stated that "The evidence that is available at the present time does not support the idea that RNA, or an alternative replicator that uses the current set of RNA bases, was present at the start of life."

Cordially,

480 posted on 02/22/2002 5:58:05 AM PST by Diamond
[ Post Reply | Private Reply | To 473 | View Replies]



Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.


FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson