Posted on 01/20/2005 12:54:58 PM PST by Jay777
ANN ARBOR, MI - The small town of Dover, Pennsylvania today became the first school district in the nation to officially inform students of the theory of Intelligent Design as an alternative to Darwin's theory of Evolution. In what has been called a measured step, ninth-grade biology students in the Dover Area School District were read a four-paragraph statement Tuesday morning explaining that Darwin's theory "is not a fact and continues to be tested." The statement continued: "Intelligent Design is an explanation of the origin of life that differs from Darwin's view. Since the late 1950s, advances in biochemistry and microbiology, information that Darwin did not have in the 1850s, have revealed that the machine-like complexity of living cells (the fundamental unit of life), possessing the ability to store, edit, transmit, and use information to regulate biological systems, suggests the theory of intelligent design as the best explanation for the origin of life and living cells."
Richard Thompson, President and Chief Counsel of the Thomas More Law Center, a national public interest law firm representing the school district against an ACLU lawsuit, commented, "Biology students in this small town received perhaps the most balanced science education regarding Darwin's theory of evolution of any public school students in the nation. This is not a case of science versus religion, but science versus science, with credible scientists now determining that, based upon scientific data, the theory of evolution cannot explain the complexity of living cells."
"It is ironic that the ACLU, after having worked so hard to prevent the suppression of Darwin's theory in the Scopes trial, is now doing everything it can to suppress any effort to challenge it," continued Thompson.
(Excerpt) Read more at thomasmore.org ...
Both of the posts cover considerably more ground than my contention about the "fallacy of quantizing the continuum" so I'll leave that to y'all to whittle away.
Before I make my "full response", I want to make sure we're talking about the same thing, and I'm not sure we are.
Reviewing the discussion to date, I get the impression that we may be talking past each other, so let's make sure that we're both on the same page when it comes to what the term "quantizing the continuum" actually *means*. We can discuss whether it's valid/invalid/fallacy/silly/bad-news-for-evolution/whatever after we agree on our definitions.
When Tortoise used the phrase "quantizing the continuum", I hope we already both understand what is meant by a continuum -- it is a function or entity which has values that vary smoothly from point to point. It's a "slope" of values, not a "stairstep".
"Quantizing", however, might be a source of misunderstanding. It is *not* another word for "quantifying". It is a verb form of "quantum". Think "quantum-ifying". A quantum in the general sense is a "package" of a fixed size, or which comes in discrete fixed sizes (like a product which comes only in standard 2-pound, 5-pound, or 20-pound bags). "Quantum physics" is named what it is because of how (at small scales) energy is found to be emitted only in "chunks" of specific sizes, and there are no "fractional-sized" energy emissions.
So when tortoise coined the phrase, "quantizing the continuum", he wasn't talking about "quantifying [measuring] the continuum", he was talking about "quantizing [quantum-ifying] the continuum" -- i.e. artificially breaking a continuum up into discrete "chunks", when the true nature of a continuum (by definition) is to be a *smooth* transition from one end to the other.
Again, the classic example (conceptually) is to take a smooth transition (i.e. "continuum") which goes from white through shades of gray into black, and to treat it as if the only significant portions (or the only portions to even *exist*) are the "black" part and the "white" part.
Here for example is a photo rendered with a continuum of black-to-white:
Here's the same photo rendered with the same continuum, *quantized* to "only-black-OR-white":
I think you'll agree that a lot of detail is lost in the translation...
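A quick numeric sketch (my own illustration, not from the original posts, using an assumed 0-255 gray scale) of how much is thrown away when a smooth ramp is forced into just two buckets:

```python
# Sketch of "quantizing the continuum": take a smooth 0..255 grayscale
# ramp and force every value into two buckets, "black" (0) or "white" (255).
# The threshold of 128 is an arbitrary choice for illustration.

def quantize_to_black_or_white(value, threshold=128):
    """Map a gray level (0-255) to pure black or pure white."""
    return 255 if value >= threshold else 0

ramp = list(range(256))                      # the smooth "continuum"
quantized = [quantize_to_black_or_white(v) for v in ramp]

print(len(set(ramp)))        # 256 distinct gray levels in the continuum
print(len(set(quantized)))   # 2 -- only two levels survive
# Average per-pixel error introduced by the two-bucket scheme:
print(sum(abs(a - b) for a, b in zip(ramp, quantized)) / len(ramp))  # 63.5
```

On average every gray level gets misrepresented by 63.5 levels, which is the numerical version of the "lost detail" in the two photos.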
So tortoise's term "quantizing the continuum" referred to cases where a continuum of something (consisting of many different gradations of a quality) was being shoe-horned into being treated (or being re-measured) as if it consisted of only a few "stairsteps" of fixed "buckets". And the worst sort of "shoehorning" is when a smooth gradation gets crammed into only two "black or white", or "all or nothing" catch-all categories.
Does this match your interpretation of his term "quantizing the continuum"? Or were you interpreting his term in some other way?
And again, I'm making no comment at this time about whether such a procedural "bucketizing" leads to a fallacy or not, I'm just making sure we're all on the same page when it comes to the term being discussed.
And in tortoise's specific example in that discussion, his point was that important conceptual "detail" may be lost if one attempts to sort all existing objects into only the two categories, 1. "Living" and 2. "Non-living", since this may cause one to overlook a sizeable "gray area" in between, consisting of various kinds of "not fully alive but not fully non-living either". If, he says, the scale from "fully non-living" at one end to "fully alive" at the other actually has gradations of "10% living", "67% living", and so on between them (for one example, things which reproduce themselves yet don't metabolize...), then looking to draw a "line" between "all alive" on the right and "all dead" on the left might miss some key details about what we know as "life" and whether it could arise *gradually*, or had to come about all at once, *bang*.
In short: Is the "either alive or dead" paradigm missing the boat by breaking up the reality into too few too-broad categories?
Yes, your understanding of "quantizing" and "continuum" is the same as mine.
The objection I raise is that "quantizing the continuum" is a property of the evidence and not a "fallacy". The fossil record is a quantization; so are B mesons, etc.
We had a solid, elegant mathematical definition of life v. non-life/death [Shannon]. The "fallacy" was raised as an objection to that definition.
In a darkness-to-light example, all the grey scales in the world will not cancel the definition of "black" and "white". Further, the grey scales should not be an objection to investigating how a scene got from "black" to "white" when they are, in fact, the whole point of such an investigation.
Sorry, but I've got to go now ... more later.
Quickie response: I think the fallacy is in the mis-perception of the evidence. That is, failing to deal with it as part of a continuum (in those cases where it is part of a continuum).
The pieces, when put together, reveal a picture. In the context of evolution, if these pieces were fossils, the analogy of the way we fit the pieces together is the anatomical structures of the fossils and their ages. We end up with the well-known tree of life, showing common descent with variation.
Now it's possible that someone could come along and claim that this isn't the only possible picture we could make with those pieces, and that the picture we're showing is merely the result of imposing our prejudices on the pieces.
That might be true, but only if it were possible to arrange the pieces in some other way (for example, if the pieces were all the same shape, so that any number of mosaic designs could be produced). But that's not what we're working with. We might challenge our skeptic to try his hand at re-arranging the pieces, but no, he won't do that.
We could also point out that DNA evidence shows a close, pre-existing relationship of the pieces that we've fitted together, thus confirming the picture; and that re-arranging the pieces would be inconsistent with such evidence. But somehow, notwithstanding any other way to arrange the pieces, the skeptic will always insist that the picture is the result of prejudice.
IOW, this particular quantization of a continuum was not a fallacy but a requirement. Such quantizations are also necessary for evolution theory, high energy particle physics, etc. In other applications, financial modeling, artificial intelligence, etc. - such quantizations may skew the results. So the fallacy is not in quantizing but in applying it wrongfully.
Aye, you got it.
The problem usually becomes apparent at the edges of the chunks. No matter where you split the continuum from white to black into "white" or "black", the colors on either side of the splitting line will look extremely similar to each other to the point of barely being distinguishable, and bear more resemblance to each other than to the color at the far end of their chunk that nominally defines them.
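This edge effect is easy to show concretely. With an assumed threshold at gray level 128 (my own choice for illustration), the two levels straddling the splitting line differ by a single step, yet land in opposite categories:

```python
# Illustration of the "edge of the chunk" problem: gray levels on either
# side of the splitting line resemble each other far more than they
# resemble the pure color that nominally defines their own chunk.

THRESHOLD = 128  # an arbitrary place to split "white" from "black"

just_below, just_above = THRESHOLD - 1, THRESHOLD   # 127 and 128

def label(v):
    return "white" if v >= THRESHOLD else "black"

# They differ by a single gray level...
print(abs(just_above - just_below))          # 1
# ...yet the quantization puts them in opposite categories:
print(label(just_below), label(just_above))  # black white
# And 127 is 127 levels away from the pure black (0) that
# nominally defines its own "black" chunk:
print(abs(just_below - 0))                   # 127
```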
I think this fallacy runs deeper than so far discussed. I can build a discrete system with similar properties.
Sometimes even the earliest work (Aristotle) or the best (Linnaeus) may not be good enough later. We have concepts such as species, genus, family, etc., but the boundaries between them may not be as clear as previously thought. Back in the 1950s, I remember people drawing up relationships based on genotypic similarity rather than phenotypic. All the biologists that I knew thought this was a better way to do things. They were looking for relationships between entities rather than just classifications.
Here's an example of the impossibility of drawing (some) sharp boundaries between sets. A plausible criterion may be impossible to meet.
Take a set of entities each of which has 3 properties from a set of 7 (like diatonic chords?). There are 35 possible entities.
abc abd abe abf abg acd ace acf acg ade adf adg aef aeg afg
bcd bce bcf bcg bde bdf bdg bef beg bfg
cde cdf cdg cef ceg cfg
def deg dfg
efg
Now, for example, assume that having two properties in common allows the entities to interbreed. Thus (abc abd abe abf abg) can all interbreed, but abc and ade cannot. This is an example of a complex ring species.
The concept "can interbreed with" (equivalent here to "has two properties in common") doesn't seem quite right for "species" in this case. Also, "can't interbreed with" doesn't make a really good boundary either.
Likewise, a succession of single-property changes (each step keeping the two-property breeding capacity) can move an entity from abc through abe abf abd abg acg adg aeg afg bfg cfg dfg to efg, which is rather far away "genetically."
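That chain can be checked mechanically. Here is a short sketch (my own, simply restating the rules above in code) that enumerates the 35 entities and verifies the abc-to-efg path:

```python
from itertools import combinations

# The 3-of-7-properties example: each entity is a 3-element subset of
# {a..g}, and "can interbreed" means sharing at least two properties.
entities = {frozenset(c) for c in combinations("abcdefg", 3)}
print(len(entities))   # 35

def can_interbreed(x, y):
    """Two entities can interbreed if they share >= 2 properties."""
    return len(set(x) & set(y)) >= 2

# The single-property-change path quoted above, from abc to efg:
path = ["abc", "abe", "abf", "abd", "abg", "acg",
        "adg", "aeg", "afg", "bfg", "cfg", "dfg", "efg"]

# Every adjacent pair along the path can interbreed...
print(all(can_interbreed(a, b) for a, b in zip(path, path[1:])))  # True
# ...yet the endpoints share no properties at all:
print(can_interbreed("abc", "efg"))                               # False
```

So "can interbreed with" links abc to efg through intermediates even though the endpoints themselves cannot interbreed, which is exactly why it fails as a species boundary.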
More properties lead to similar results but with more complicated possibilities. All this happens with discrete items.
This type of object could be synchronic, but it is not a tree structure. A large number of these would make the idea of a "tree of life" invalid. The same thing happens in the simpler "ring species" of plants which can occur around mountains. (I heard of these about 50 years ago.) The plants form a circle (a,b,c,d,e,f) where "f" and "a" are adjacent. Variety "a" can interbreed with "b" or "f" but not with (c,d,e). Of course, were "c" and "d" to die off, there would be two non-interbreeding groups.
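The ring case above can be sketched just as simply. This is my own minimal model, not from the original post: six varieties on a circle, each able to interbreed only with its two neighbors.

```python
# Minimal model of the six-variety ring species: a..f arranged in a
# circle, where each variety interbreeds only with its two neighbors.

ring = "abcdef"

def neighbors(v):
    """The two varieties adjacent to v on the ring."""
    i = ring.index(v)
    return {ring[(i - 1) % len(ring)], ring[(i + 1) % len(ring)]}

print(sorted(neighbors("a")))     # ['b', 'f'] -- a breeds with b and f
print("c" in neighbors("a"))      # False -- but not with c, d, or e

# If c and d die off, the ring breaks into two non-interbreeding groups:
survivors = set("abef")
print(neighbors("b") & survivors)  # {'a'} -- b's only remaining partner
print(neighbors("e") & survivors)  # {'f'} -- so {a,b} and {e,f} are cut off
```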
Even when the "continuum" actually consists of a large number of small "steps", and is thus in reality "quantized" (as in the individual generations of a long evolutionary sequence), it's still a continuum in the long view.
Remember that there's an issue of scale here. What is "grainy" under a microscope may be extremely "smooth" when viewed as a whole.
Even my grayscale "continuum of gray" photo example actually used 256 discrete values of luminosity. But it still provides for "smooth" contours in the photo, and my points about it still stand.
So let's not get bogged down losing the forest for the trees (which is rather an apt metaphor here).
The point, in a nutshell, is that some processes, data, phenomena, etc. etc. manifest as "smooth" gradients on the "big picture" scale (whether or not they may be discrete under the "microscope" is beside the point), and to (mis)model them as just a few (or in the worst case, only two) discrete "states" which "jump" from one category to another is quite simply a fallacy in every sense of the word:
It's a mental trap to too coarsely conceptually quantize something which is in reality a smoother transition. I can't state it any more succinctly than that.

fal-la-cy (fal'uh see) n., pl. -cies. 1. a deceptive, misleading, or false notion, belief, etc.; misconception. 2. a misleading or unsound argument. 3. deceptive, misleading, or false nature; erroneousness. 4. any of various types of erroneous reasoning that render arguments logically unsound.
And your example of fossils being "quanta" of evolution is irrelevant to that. Yes, evolution is "quantized" into generations, and yes, life is "quantized" into individuals. There's no fallacy in that, nor does that contradict the fact that evolution proceeds by "smooth" transitions when viewed across hundreds of generations or more, and makes a continuum of living forms.
[In short: Is the "either alive or dead" paradigm missing the boat by breaking up the reality into too few too-broad categories?]
We were investigating abiogenesis!
Yes. Exactly.
The grey-scaling would have been the domain of the investigation.
If by that you mean that you were examining the nature of the transition between completely "nonliving" and "life as we know it today", *and* were aware that the transitions in between would likely be a "gray area" of things which were "not fully nonliving but not fully living as we now know it", then fine -- but is that actually the case? Because your next statement gives me cause for concern:
We had a solid, elegant mathematical definition of life v. non-life/death [Shannon]. The "fallacy" was raised as an objection to that definition.
And rightly so, if your "solid, elegant mathematical definition of life" gave a "binary" result -- i.e., "if it meets this definition it is 'living', if it doesn't then it is 'non-living'"... That would be quantizing the range of possibilities of "life" into living/nonliving, yes/no, black/white.
If you were trying to devise a "yes/no" test for "life", then you were indeed "quantizing" what may well be a "continuum", without first establishing that it *is* a binary condition (is/isn't) as opposed to a continuum ("degrees of life").
In a darkness to light example, all the grey-scales in the world will not cancel the definition of "black" and "white".
I never said that they would. The point, however, is that a "is this white or not" test would be grossly misleading, because a "no" result would imply "black" to the person applying the test, even though a "very almost white but just not *quite*" level of gray would also qualify as "not white" -- even though most people would consider it more "white" than "not-white".
In short, recognizing black and white should not become a mental trap against seeing the gray.
Further, the grey scales should not be an objection to investigating how a scene got from "black" to "white" when they are in fact, the whole point of such an investigation.
Correct, but that wasn't Tortoise's concern. He was concerned that a binary "living or not" test would be unable to distinguish the details of such a transition, and would instead engender a mindset (or worse, may be the result of a mindset) which doesn't recognize the gray parts.
I tend to agree -- even today, apart from abiogenesis, I think it's a big mistake to try to define "life" in a way that draws any kind of sharp line between "living" and "nonliving". I think the reality is more complex than that, there is no such clean dividing line.
You have proposed some "Shannon information" test, but I haven't seen the details (if you could point me to a post which lays it all out, I'd appreciate it). But that approach in general seems doomed to failure to me, since many things we definitely do not consider alive *also* exchange Shannon information (computers, simple natural objects such as crystals, etc.), plus many things we consider alive often exchange no Shannon information whatsoever for long periods of time (e.g. quiescent anthrax spores).
To answer your last question first because it requires no discussion, here are primary links where you can read up on information theory and molecular biology and Shannon's mathematical theory of communications:
Returning to the "quantizing the continuum" discussion....
In order to know we had a successful theory we needed a starting point and an ending point - what is non-life and what is life. The theory itself would address all the grey scales in between.
That is where the fallacy of quantizing the continuum killed the investigation - and, as far as I'm concerned, all such investigations. Thus I now consider all theories of abiogenesis trash - there can be no such theory if science refuses to accept a clear definition of life, non-life and death.
The corollary is this: to whatever extent the correspondents apply the fallacy to abiogenesis, it must also be applied to all other theories, including evolution. And as you know, the entire theory of evolution is a construct of a continuum of life based on a quantization of another continuum, the geologic record.
It is a poison pill - not to Intelligent Design but to Evolution theory as well as abiogenesis, where it is most obvious.
Works for me. I have no need for strict and unambiguous delineation of such things. Discarding them will make the discussion more rigorous and better grounded in physical reality.
Perhaps it will cause the discussion of abiogenesis to be more rigorous among those who wish to tackle a theory without boundaries, as you suggest. However, I predict more Heat than Light will be the result.
Without boundaries, nothing is. Except relentlessly subjective opinion. So natch, you're gonna get a whole lot of "heat," and zilch "light." What tortoise proposes is a fool's game, not to put too fine a point on it. JMHO FWIW.