Posted on 11/22/2002 9:09:10 PM PST by forsnax5
NSF awards grants to discover the relationships of 1.75 million species
One of the most profound ideas to emerge in modern science is Charles Darwin's concept that all of life, from the smallest microorganism to the largest vertebrate, is connected through genetic relatedness in a vast genealogy. This "Tree of Life" summarizes all we know about biological diversity and underpins much of modern biology, yet many of its branches remain poorly known and unresolved.
To help scientists discover what Darwin described as the tree's "ever-branching and beautiful ramifications," the National Science Foundation (NSF) has awarded $17 million in "Assembling the Tree of Life" grants to researchers at more than 25 institutions. Their studies range from investigations of entire genomes to assemble the bacterial branches, to the origins of land plants from algae, to understanding the most diverse group of terrestrial predators, the spiders, to the diversity of fungi and parasitic roundworms, to the relationships of birds and dinosaurs.
"Despite the enormity of the task," said Quentin Wheeler, director of NSF's division of environmental biology, which funded the awards, "now is the time to reconstruct the tree of life. The conceptual, computational and technological tools are available to rapidly resolve most, if not all, major branches of the tree of life. At the same time, progress in many research areas from genomics to evolution and development is currently encumbered by the lack of a rigorous historical framework to guide research."
Scientists estimate that the 1.75 million known species are only 10 percent of the total species on Earth, and that many of those species will disappear in the decades ahead. Learning about these species and their evolutionary history is epic in its scope, spanning all the life forms of an entire planet over its several-billion-year history, said Wheeler.
Why is assembling the tree of life so important? The tree is a picture of historical relationships that explains all similarities and differences among plants, animals and microorganisms. Because it explains biological diversity, the Tree of Life has proven useful in many fields, such as choosing experimental systems for biological research, determining which genes are common to many kinds of organisms and which are unique, tracking the origin and spread of emerging diseases and their vectors, bio-prospecting for pharmaceutical and agrochemical products, developing databases for genetic information, and evaluating risk factors for species conservation and ecosystem restoration.
The Assembling the Tree of Life grants provide support for large multi-investigator, multi-institutional, international teams of scientists who can combine expertise and data sources, from paleontology to morphology, developmental biology, and molecular biology, said Wheeler. The awards will also involve developing software for improved visualization and analysis of extremely large data sets, and outreach and education programs in comparative phylogenetic biology and paleontology, emphasizing new training activities, informal science education, and Internet resources and dissemination.
-NSF-
For a list of the Assembling the Tree of Life grants, see: http://www.nsf.gov/bio/pubs/awards/atol_02.htm
It opens up the possibility of addressing scientifically rather than ideologically the central issue so hotly contested by fundamentalists on both sides of the Creationist-Darwinist debate: Is there any guiding intelligence at work in the origin of species displaying exquisite adaptations that range from lambda prophage repression and the Krebs cycle through the mitotic apparatus and the eye to the immune system, mimicry, and social organization?
certainly does not indicate this ---
Shapiro rejects creationist, non-naturalistic explanations for the origins of life
considering your definition of creationist.
These two cases are not comparable. Not only is "deterministic" not equivalent to "high probability," but the word "high" is never defined. There is no algorithm here, only some weasel words.
You have a problem with reading comprehension. He did not say they were equal. This is a method of selecting what may be intelligently designed, call it what you will.
(b) If regularity as an explanation fails, one should then see if chance is an acceptable explanation... -dembski-
How is "chance" different from "probability"? Again Dembski weasels. Cases b and a are identical according to the putative Dembski algorithm. Again there is nothing quantitative here.
What he means is simply that there is a smaller, but still reasonable, chance that it might have occurred at random. Again, this is a method for selecting what may be intelligently designed: he is defining successive criteria for making the determination and excluding along the way those things which cannot be said with certainty to be designed.
(c) Only once chance has been excluded is design assumed to be the cause... -dembski-
How was chance excluded? No method is given. Is "vanishingly small" equal to 1.0-"high"? Again only weasel words.
By the steps above, again it is a series of steps, not a single one.
A seemingly random pattern may be discovered later to contain information... -dembski-
The converse happens too. A seemingly structured pattern may be discovered later to be randomly generated. In fact, most people do see "patterns" in randomly generated data.
Very doubtful that once a pattern has been found it will later be found to be random. Any examples? Dembski at least gives examples of what he means. And again, all he is saying there is that for now, it may not be deemed to have been intelligently designed but upon further scientific observation it may be found to have been designed. In fact this is what has been happening in science the last 150 years. What was thought happenstance has been shown to be the result of very specific systems and organization. In other words, the more we learn the more designed things look.
Dembski seems to have divided things into high, intermediate, and vanishingly small probabilities. He gives no criteria for distinguishing these from each other, nor criteria to distinguish randomly generated phenomena from deterministic ones. In short, there is no algorithm, no method, no numbers, and thus no usefulness.
No, what Dembski is doing is methodically excluding doubtful items. It is a process for determining how to find things which have been intelligently designed by using scientific knowledge to make the determination. As he states, the goal is to eliminate random chance. If something could not have occurred by chance then it must have been designed.
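To make the dispute concrete, here is a minimal sketch (in Python) of what the three-step filter would look like as an actual procedure. The two cutoff values are my own placeholders, not Dembski's; he supplies no such numbers, which is precisely the objection being argued above:

    # A minimal sketch of the three-step "explanatory filter" as a procedure.
    # The cutoffs below are invented placeholders; Dembski defines neither,
    # which is the point in dispute in this thread.

    HIGH = 0.99                  # assumed cutoff for "regularity" (not given)
    VANISHINGLY_SMALL = 1e-150   # assumed cutoff for excluding "chance" (not given)

    def explanatory_filter(p):
        """Classify an outcome by its estimated probability under natural causes."""
        if p >= HIGH:
            return "regularity"   # step (a): law-like, effectively deterministic
        if p > VANISHINGLY_SMALL:
            return "chance"       # step (b): plausibly random
        return "design"           # step (c): chance excluded, design inferred

Note that everything interesting is hidden in the argument p: the sketch runs, but only if someone else has already computed the probability of the outcome under all natural causes, which is where both sides of this exchange locate the difficulty.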
What significance does an emerging interface between biology and information science hold for thinking about evolution? It opens up the possibility of addressing scientifically rather than ideologically the central issue so hotly contested by fundamentalists on both sides of the Creationist-Darwinist debate: Is there any guiding intelligence at work in the origin of species displaying exquisite adaptations that range from lambda prophage repression and the Krebs cycle through the mitotic apparatus and the eye to the immune system, mimicry, and social organization? Borrowing concepts from information science, new schools of evolutionists can begin to rephrase virtually intractable global questions in terms amenable to computer modelling and experimentation. We can speculate what some of these more manageable questions might be: How can molecular control circuits be combined to direct the expression of novel traits? Do genomes display characteristic system architectures that allow us to predict phenotypic consequences when we rearrange DNA sequence components? Do signal transduction networks contribute functional information as they regulate the action of natural genetic engineering hardware?
In which of those "manageable questions" do you find non-naturalistic explanations lurking?
There is no mind-reading involved when the normal meanings of words are used.
Is there any guiding intelligence at work in the origin of species... ?
It means that the question exists and that he does not reject an answer, which you say he does. Which naturalistic explanation that you cited has the potential guiding intelligence contained therein?
This one:
Signal transduction is not limited to multicellular development. We are learning that virtually every aspect of cellular function is influenced by chemical messages detected, transmitted, and interpreted by molecular relays. To a remarkable extent, therefore, contemporary biology has become a science of sensitivity, inter- and intra-cellular communication, and control. Given the enormous complexity of living cells and the need to coordinate literally millions of biochemical events, it would be surprising if powerful cellular capacities for information processing did not manifest themselves. In an important way, then, biology has returned to questions debated during the mechanism-vitalism controversy earlier this century. This time around, however, the discussion is informed by two new factors. One is that the techniques of molecular and cell biology allow us to examine the detailed operation of the hardware responsible for cellular responsiveness and decision-making. The second is the existence of computers and information networks, physical entities endowed with computational and decision-making capabilities. Their existence means that discussing the potential for similar activities by living organisms is neither vague nor mystical.
Unlike ID theory, which posits the existence of some external intelligence operating on passive organisms, it is tolerably obvious that Shapiro is talking about the ability of living organisms to shape themselves in response to external events. You are conflating Shapiro's notion of "intelligence", which he uses to indicate internal information-processing abilities, with an external operator. They are not one and the same, the fallacy of equivocation notwithstanding - one (Shapiro's) is entirely naturalistic, and the other is not.
Then what is your beef with Shapiro?
LOL - who says I have a beef with him? I'm merely suggesting that his notion of "intelligence" is an entirely naturalistic one. You disagree?
Far be it from me to cast stones - I am cringing at some of my own typos here ;)
Yes. I believe that though he might himself consider and expect the answers to be entirely naturalistic, his use of and references to creationists indicate to me that he has not rejected the participation of creationist viewpoints in answering the question. One of Shapiro's stronger viewpoints is his apparent utter rejection of the gradualist, stochastic evolution of organisms. From that premise it is difficult to understand how a computing machine could come into being through a gradualist, stochastic mechanism.
The algorithms, methods and numbers may first arrive from theoretical physics, mathematics, or information theory, without those disciplines having any particular intention to impact the theory of biological evolution.
Max Tegmark - Is "the theory of everything" merely the ultimate ensemble theory?
Jürgen Schmidhuber - Algorithmic Theories of Everything
Ian Stewart - Theories of Everything
Stephen Wolfram - A New Kind of Science
This is the crucial error. How does Dembski propose to know if an outcome can be explained by as yet unknown natural laws? I really don't see how that can be overcome. Since obviously only known natural laws can be applied, what will happen is that the Filter will dump outcomes into the Design bucket only to pull them back out as our knowledge increases. That doesn't strike me as very useful.
There are many other problems too. For example, it may in fact not be computable whether a given outcome is or is not a consequence of even known natural laws. This could be due to merely practical considerations (e.g., the combinatorial search space is far too large) or theoretical ones (e.g., the outcome is a Gödel sentence of a natural-law formalism).
Then consider that outcomes arise not only from natural laws but from boundary conditions as well. In the presence of chaos we might conceivably need to know the initial/boundary conditions of the universe to infinite precision. How will Dembski produce this knowledge?
Short-term stock movements, if one assumes an (even marginally) efficient market.
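As a toy illustration (a sketch, not a market model, with arbitrary parameters): a pure-chance random walk routinely produces runs and apparent "trends" that observers read as meaningful patterns, which is the trap described above.

    # Illustrative only: pure chance generates price "movements" that look
    # like trends. Start price, volatility, and seed are arbitrary choices.
    import random

    random.seed(1)                     # arbitrary seed, for reproducibility
    price = 100.0
    for day in range(1, 11):
        price += random.gauss(0, 1.0)  # each daily move is drawn from pure chance
        print(f"day {day:2d}: {price:7.2f}")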
This "filter" cannot be applied in any but the most trivial cases, because very complex outcomes can be generated by simple, iterative rules. You can, for example, generate the digits of pi with a simple algorithm, but if given a long string of digits, you cannot determine what algorithm generated them.
The point is, given the existing complexity of life, you cannot prove that it is not the result of a simple iterative process with a naturalistic explanation. Complexity of outcome does not require complexity of means.
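To make the pi example concrete, here is a short sketch using Gibbons's unbounded spigot algorithm. The point is that a few lines of deterministic code emit a digit stream that, viewed on its own, gives no hint of the rule that produced it:

    # Gibbons's unbounded spigot algorithm: a simple iterative rule whose
    # output (the decimal digits of pi) looks patternless in isolation.

    def pi_digits():
        """Yield the decimal digits of pi one at a time."""
        q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
        while True:
            if 4*q + r - t < n*t:
                yield n
                q, r, n = 10*q, 10*(r - n*t), (10*(3*q + r)) // t - 10*n
            else:
                q, r, t, k, n, l = (q*k, (2*q + r)*l, t*l, k + 1,
                                    (q*(7*k + 2) + r*l) // (t*l), l + 2)

    digits = pi_digits()
    print(''.join(str(next(digits)) for _ in range(20)))  # 31415926535897932384

Given only the printed string, nothing distinguishes it from noise or from the output of any other generator; recovering "the" algorithm from the digits is exactly the inverse problem the filter would have to solve.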
Yes, he's more like Grant in the final weeks before Appomattox.
What? Do you or Patrick "Maish Rennick" Henry plan on surrendering to him?