
Evolution through the Back Door
Various | 6/15/2003 | Alamo-Girl

Posted on 06/15/2003 10:36:08 AM PDT by Alamo-Girl

Evolution through the back door - musings of an Alamo-Girl

What Mathematics brings to the Table

I do very much love the epistemological zeal that mathematicians bring to the "evolution biology" table. For one thing, to a mathematician the "absence of evidence IS evidence of absence."

For another, mathematicians and physicists accept axioms in a way evolutionary biologists do not, such as taking life itself as an axiom. According to Sir Karl Popper, given two competing theories, an experiment should decide one true and the other false. But in wave-particle duality, one experiment proves the electron is a wave while another proves it is a particle. Mathematicians and physicists treat these as undecidabilities, whereas evolutionary biologists offer "just-so" stories.

Evolutionary biologists speak of function and complexity over time. Mathematicians speak of functional complexity, randomness and probabilities over time.

Evolutionary biologists speak of chemistry and genetics. Mathematicians speak of symbolization, self-organizing complexity and syntactic autonomy.

Or to put it more succinctly, the evolutionary biologist describes but the mathematician/physicist explains.

This article will attempt a new approach - starting from the mathematics/physics angle - to explore the much-argued subject of biological evolution. As always, the first step is to define terms and scope, which in this case is "all that there is."

The most beautiful and deepest experience a man can have is the sense of the mysterious. It is the underlying principle of religion as well as all serious endeavour in art and science. He who never had this experience seems to me, if not dead, then at least blind. To sense that behind anything that can be experienced there is a something that our mind cannot grasp and whose beauty and sublimity reaches us only indirectly and as a feeble reflection, this is religiousness. In this sense I am religious. To me it suffices to wonder at these secrets and to attempt humbly to grasp with my mind a mere image of the lofty structure of all that there is.

Einstein's speech 'My Credo' to the German League of Human Rights, Berlin, autumn 1932, Einstein: A Life in Science, Michael White and John Gribbin, page 262

 

What is "all that there is?"

This is no small philosophical question. Many of us agree that "reality" must exist before it is discovered, observed or measured. However, that definition becomes qualified by whatever a person already believes. For example:

To a metaphysical naturalist, "reality" is all that exists in nature

To an autonomist, "reality" is all that is, the way it is

To an objectivist, "reality" is that which exists

To a mystic, "reality" may include thought as a substantive force

To Plato, "reality" includes constructs such as redness, chairness, numbers, geometry and pi

To Aristotle, these constructs are not part of "reality" but merely language

To some physicists, "reality" is the illusion of quantum mechanics

To Christians, "reality" is God's will, unknowable in its fullness

 

Reality and Physics:

Reality is an illusion, albeit a very persistent one. - Einstein

In physics, "realism" refers to the idea that a particle has properties that exist even before they are measured. Measurement, however, presents a problem, and with it a challenge to our sense of "reality" in quantum mechanics.

Measurement Problem - Stanford Encyclopedia of Philosophy

From the inception of Quantum Mechanics (QM) the concept of measurement has proved a source of difficulty. The Einstein-Bohr debates, out of which both the Einstein-Podolsky-Rosen paradox and Schrödinger's cat paradox developed, centered upon this difficulty. The problem of measurement in quantum mechanics arises out of the fact that several principles of the theory appear to be in conflict. In particular, the dynamic principles of quantum mechanics seem to be in conflict with the postulate of collapse. David Albert puts the problem nicely when he says:

The dynamics and the postulate of collapse are flatly in contradiction with one another ... the postulate of collapse seems to be right about what happens when we make measurements, and the dynamics seems to be bizarrely wrong about what happens when we make measurements, and yet the dynamics seems to be right about what happens whenever we aren't making measurements. (Albert 1992, 79)

"Reality" also suffers from our difficulty in accounting for the mass observed in the universe. Both Fermilab and CERN are looking for the Higgs boson/field predicted by the Standard Model. But even if it is found, roughly 70% of the energy density of the universe is attributed to something else, dark energy - which, if it exists, ought to be detectable in local space, i.e. in the laboratory. So far, it is not.

DOE Office of Science, High Energy Physics Program

As we reach the centennial of Einstein's theory of relativity, we may find ourselves at the threshold of discoveries equally profound. What gives matter its mass? Have we at last identified nature's ultimate constituents, or will we find a mirror set of "supersymmetric" particles? What is the dimensionality of space - are there extra dimensions hidden from us, and why? What is the role of the neutrino in the universe? What are the mysterious "dark energy" and "dark matter" that seem to make up 95% of the universe? Why is the remaining 5% of the universe made only of matter, with hardly any antimatter? At the end of this decade, our views of matter and the universe may be changed forever.

The major tools at hand to explore these fundamental questions are the high energy accelerator facilities built and operated by the Office of Science, and used by the talented and imaginative scientists that it supports at its laboratories and universities. With these tools, the HEP program will pursue the following challenges:

Higgs and the Origin of Mass.

The Large Electron-Positron Collider at CERN was shut down in 2000, leaving behind a tantalizing hint of the long theorized, but as yet unseen Higgs boson. The Higgs is believed to be the source of mass for all elementary particles, and its discovery would mark a profound advance in science. The Large Hadron Collider (LHC) now being constructed at CERN, will be a strong contender to find the Higgs, and American physicists will participate in that research. However, the LHC cannot begin an intense physics program before the spring of 2007, leaving the Tevatron collider at the Office of Science's Fermi National Laboratory with a window of opportunity to make this important discovery. This collider and its two detectors completed major upgrades and began operating in March 2001. With protons and antiprotons colliding at energies of one trillion electron volts (1 TeV), the Tevatron will be the world's highest energy physics research facility until the LHC is operational.

Beyond the Standard Model.

Theoretical explorations beyond the Standard Model suggest that a new class of "supersymmetric" particles may be discovered, or that extra dimensions may exist beyond the familiar four dimensions of space/time. The upgraded Tevatron may be able to test these theories. Confirming either theory would represent a major advance in human knowledge.

Matter and the Universe.

Scientists using the B-Factory and its BaBar detector at the Stanford Linear Accelerator Center (SLAC) have an opportunity to explain the vast preponderance of matter over antimatter in the universe. Electrons and positrons colliding at energies of several billion electron volts allow the study of a small asymmetry in the way B mesons decay into other particles. The asymmetry is known as Charge-Parity (CP) violation and was first discovered in 1964. CP violation is believed to be at least partly responsible for the survival of more matter than antimatter after the Big Bang origin of the universe.

The Role of Neutrinos.

The neutrino is a particle that plays a key role in the interactions of elementary particles and in astrophysical processes. Neutrinos are created and detected in one of three "flavors": electron, muon, or tau. The current theory of elementary particles, called the Standard Model, requires that neutrinos be massless, but experimental results now provide compelling evidence that they do have mass. If they have mass, a neutrino created in one flavor would "oscillate" among different flavors as it travels. A new Fermilab experiment called MiniBooNE will begin taking data in 2002 to test whether muon neutrinos oscillate to electron neutrinos, as indicated by an earlier experiment. Another new detector called MINOS is being assembled in a Minnesota mine and a beam of neutrinos for it is being built at Fermilab. With this long baseline experiment (450 miles), physicists will make precise measurements of neutrino mass. Results from MiniBooNE and MINOS will help scientists understand the role of this unique particle in particle interactions and in the evolution of the universe.

Dark Energy and Matter.

At the Office of Science's Lawrence Berkeley National Laboratory, studies of supernovae have indicated that the universe is expanding at an accelerating rate, due to "dark energy," estimated to comprise 70% of the critical density of the universe. "Dark matter," which emits no radiation, makes up another 25% of the critical density, with normal matter contributing only 5%. Explaining these mysterious forms of matter and energy is another high priority of the HEP program.

FermiLab

CERN

A significant anomaly exists with regard to space/time. Non-locality is at issue in the violations of Bell's inequalities at a distance. And there is no bridge from the quantum to the classical, i.e. Schrödinger's cat.

Bell's Inequalities violated at distance - Physics News 399, October 26, 1998

Splitting a single photon of well-defined energy into a pair of photons with initially undefined energies, and sending each photon through a fiber-optic network to detectors 10 km apart, researchers in Switzerland ... showed that determining the energy for one photon by measuring it had instantaneously determined the energy of its neighbor 10 km away
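The quoted result can be made concrete with the CHSH form of Bell's inequality: any local-realist ("hidden variable") account caps the correlation statistic S at |S| <= 2, while quantum mechanics predicts values up to 2*sqrt(2) for entangled pairs. A minimal sketch, using standard textbook analyzer angles for a spin-singlet pair (illustrative settings, not the actual settings of the Geneva experiment):

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for a spin-singlet pair when
    # the two analyzers are set at angles a and b: E(a, b) = -cos(a - b)
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi / 2              # Alice's two analyzer settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two analyzer settings

# CHSH combination; local realism requires |S| <= 2
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))       # 2*sqrt(2), about 2.828
print(abs(S) > 2)   # True: the local-realist bound is violated
```

Any measured |S| above 2, as in the Geneva fiber-optic experiment, rules out the entire class of local-realist models at once rather than deciding between two particular theories.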

Uncertainty Principle

Classical physics was already on loose footing with the problems of wave/particle duality, but it was caught completely off guard by the discovery of the uncertainty principle.

The uncertainty principle, also called the Heisenberg uncertainty principle or indeterminacy principle, articulated in 1927 by the German physicist Werner Heisenberg, states that the position and the velocity of an object cannot both be measured exactly at the same time, even in theory. The very concepts of exact position and exact velocity together, in fact, have no meaning in nature.

Ordinary experience provides no clue of this principle. It is easy to measure both the position and the velocity of, say, an automobile, because the uncertainties implied by this principle for ordinary objects are too small to be observed. The complete rule stipulates that the product of the uncertainties in position and velocity is equal to or greater than a tiny physical quantity, the constant h (Planck's constant, about 10^-34 joule-second). Only for the exceedingly small masses of atoms and subatomic particles does the product of the uncertainties become significant.
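The "too small to be observed" claim in the passage above is easy to check numerically. Using the modern form of the relation, dx * dp >= hbar/2 with dp = m * dv, a short sketch (the masses and position uncertainties are illustrative assumptions, not values from the source):

```python
HBAR = 1.054571817e-34  # reduced Planck constant, in joule-seconds

def min_velocity_uncertainty(mass_kg, position_uncertainty_m):
    # Heisenberg relation dx * dp >= hbar/2, with dp = m * dv,
    # gives the smallest possible velocity uncertainty dv.
    return HBAR / (2 * mass_kg * position_uncertainty_m)

# A 1000 kg car located to within a micrometer:
car = min_velocity_uncertainty(1000.0, 1e-6)
# roughly 5e-32 m/s: some thirty orders of magnitude below measurability

# An electron (9.109e-31 kg) located to within an atomic radius (1e-10 m):
electron = min_velocity_uncertainty(9.109e-31, 1e-10)
# roughly 6e5 m/s: comparable to orbital speeds inside an atom
```

This is why the principle is invisible for automobiles but dominant at atomic scales: the same formula, applied to masses differing by thirty-odd orders of magnitude, yields bounds that differ just as drastically.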

Any attempt to measure precisely the velocity of a subatomic particle, such as an electron, will knock it about in an unpredictable way, so that a simultaneous measurement of its position has no validity. This result has nothing to do with inadequacies in the measuring instruments, the technique, or the observer; it arises out of the intimate connection in nature between particles and waves in the realm of subatomic dimensions....

For centuries, scientists have gotten used to the idea that something like strong objectivity is the foundation of knowledge. So much so that we have come to believe that it is an essential part of the scientific method and that without this most solid kind of objectivity science would be pointless and arbitrary. However, the Copenhagen interpretation of quantum physics (see below) denies that there is any such thing as a true and unambiguous reality at the bottom of everything. Reality is what you measure it to be, and no more. No matter how uncomfortable science is with this viewpoint, quantum physics is extremely accurate and is the foundation of modern physics (perhaps then an objective view of reality is not essential to the conduct of physics). And concepts, such as cause and effect, survive only as a consequence of the collective behavior of large quantum systems....

In 1935 Schrödinger, who was responsible for formulating much of the wave mechanics in quantum physics, published an essay describing the conceptual problems in quantum mechanics. A brief paragraph in this essay described the now-famous cat paradox.

 

Quantum Field Theory

Reality in Math - Plato v. Aristotle

The debate over whether constructs are real has raged since the time of Plato and Aristotle. It was argued by Einstein and Gödel and is being argued today by Stephen Hawking and Roger Penrose. These are two different worldviews which cannot be reconciled.

By extension, the debate goes to the issue of when to stop looking. For instance, Hawking is content when an experiment confirms the theory, but Penrose wants the theory to also make sense.

I am a Platonist - more like Penrose than Hawking. For instance, I perceive that geometry exists in reality and the mathematician comes along and discovers it, e.g. pi, Schwarzschild Geometry, Riemannian Geometry and so on.

It is important to know and/or pick a side because it has a lot to do with how this information (and other science information) will be understood. Here are the two sides:

Parallel Universes - Max Tegmark

According to the Aristotelian paradigm, physical reality is fundamental and mathematical language is merely a useful approximation. According to the Platonic paradigm, the mathematical structure is the true reality and observers perceive it imperfectly. In other words, the two paradigms disagree on which is more basic, the frog perspective of the observer or the bird perspective of the physical laws. The Aristotelian paradigm prefers the frog perspective, whereas the Platonic paradigm prefers the bird perspective....

A mathematical structure is an abstract, immutable entity existing outside of space and time. If history were a movie, the structure would correspond not to a single frame of it but to the entire videotape. Consider, for example, a world made up of pointlike particles moving around in three-dimensional space. In four-dimensional spacetime--the bird perspective--these particle trajectories resemble a tangle of spaghetti. If the frog sees a particle moving with constant velocity, the bird sees a straight strand of uncooked spaghetti. If the frog sees a pair of orbiting particles, the bird sees two spaghetti strands intertwined like a double helix. To the frog, the world is described by Newton's laws of motion and gravitation. To the bird, it is described by the geometry of the pasta--a mathematical structure. The frog itself is merely a thick bundle of pasta, whose highly complex intertwining corresponds to a cluster of particles that store and process information. Our universe is far more complicated than this example, and scientists do not yet know to what, if any, mathematical structure it corresponds.

The Platonic paradigm raises the question of why the universe is the way it is. To an Aristotelian, this is a meaningless question: the universe just is. But a Platonist cannot help but wonder why it could not have been different. If the universe is inherently mathematical, then why was only one of the many mathematical structures singled out to describe a universe? A fundamental asymmetry appears to be built into the very heart of reality.

As a way out of this conundrum, I have suggested that complete mathematical symmetry holds: that all mathematical structures exist physically as well. Every mathematical structure corresponds to a parallel universe. The elements of this multiverse do not reside in the same space but exist outside of space and time. Most of them are probably devoid of observers. This hypothesis can be viewed as a form of radical Platonism, asserting that the mathematical structures in Plato's realm of ideas or the "mindscape" of mathematician Rudy Rucker of San Jose State University exist in a physical sense. It is akin to what cosmologist John D. Barrow of the University of Cambridge refers to as "pi in the sky," what the late Harvard University philosopher Robert Nozick called the principle of fecundity and what the late Princeton philosopher David K. Lewis called modal realism. Level IV brings closure to the hierarchy of multiverses, because any self-consistent fundamental physical theory can be phrased as some kind of mathematical structure.

 

What is Mathematics?

The view [Platonism] as pointed out earlier is this: Mathematics exists. It transcends the human creative process, and is out there to be discovered. Pi as the ratio of the circumference of a circle to its diameter is just as true and real here on Earth as it is on the other side of the galaxy. Hence the book's title Pi in the Sky. This is why it is thought that mathematics is the universal language of intelligent creatures everywhere....

Barrow goes on to discuss Platonic views in detail. The most interesting idea is what Platonist mathematics has to say about Artificial Intelligence (it does not think it is really possible). The final conclusion of Platonism is one of near mysticism. Barrow writes:

We began with a scientific image of the world that was held by many in opposition to a religious view built upon unverifiable beliefs and intuitions about the ultimate nature of things. But we have found that at the roots of the scientific image of the world lies a mathematical foundation that is itself ultimately religious. All our surest statements about the nature of the world are mathematical statements, yet we do not know what mathematics "is" ... and so we find that we have adapted a religion strikingly similar to many traditional faiths. Change "mathematics" to "God" and little else might seem to change. The problem of human contact with some spiritual realm, of timelessness, of our inability to capture all with language and symbol -- all have their counterparts in the quest for the nature of Platonic mathematics. (pg. 296-297)

Ultimately, Platonism is just as problematic as Formalism, Inventionism and Intuitionism, because of its reliance on the existence of an immaterial world. That math should have a mystical nature is a curiosity we are naturally attracted to, but ultimately it does not really matter. A Platonist can think of the mathematical world as an actual reality or as a product of our collective imaginations. If it is a reality, then our ability to negotiate Platonic realms is limited to what we can know; if it is a product of our collective imaginations, then mathematics is back to being an invention of sorts. True or not, our knowledge of mathematics is still limited by our brains.

Do there exist mathematical theorems that our brains could never comprehend? If so, then Platonic mathematical realms may exist; if not, then math is a human invention. We may as well ask, "Is there a God?" The answer for or against does not change our relationship to mathematics. Mathematics is something that we as humans can understand as far as we need.

 

Naturalized Platonism or Platonized Naturalism (pdf)

Platonized Naturalism is the view that a more traditional kind of Platonism is consistent with naturalism. Traditional Platonism is the realist ontology that recognizes abstract objects, i.e., objects that are nonspatiotemporal and outside the causal order. The more traditional kind of Platonism that we defend, however, is distinguished by general comprehension principles that assert the existence of abstract objects. We shall argue that such comprehension principles are synthetic and are known a priori. Nevertheless, we claim they are consistent with naturalist standards of ontology, knowledge, and reference. Since we believe that Naturalized Platonism has gone wrong most clearly in the case of mathematics, we shall demonstrate our claims with respect to a comprehension principle that governs the domain in which mathematical objects, among other abstracts, will be located. This is the comprehension principle for abstract individuals, and in what follows, we show that our knowledge of mathematical truths is linked to our knowledge of this principle. Though we shall concentrate the argument of our paper on this particular principle, we believe that similar arguments apply to corresponding comprehension principles for properties, relations and propositions.

 

Beyond the Doubting of a Shadow - Roger Penrose

9.2 Moreover, in the particular Gödelian arguments that are needed for Part 1 of Shadows, there is no need to consider as "unassailable", any mathematical proposition other than a P-sentence (or perhaps the negation of such a sentence). Even in the very weakest form of Platonism, the truth or falsity of P-sentences is an absolute matter. I should be surprised if even Moravec's robot could make much of a case for alternative attitudes with regard to P-sentences (though it is true that some strong intuitionists have troubles with unproved P-sentences). There is no problem of the type that Feferman is referring to, when he brings up the matter of whether, for example, Paul Cohen is or is not a Platonist. The issues that might raise doubts in the minds of people like Cohen - or Gödel, or Feferman, or myself, for that matter - have to do with questions as to the absolute nature of the truth of mathematical assertions which refer to large infinite sets. Such sets may be nebulously defined or have some other questionable aspect in relation to them. It is not very important to any of the arguments that are given in Shadows whether very large infinite sets of this nature actually exist or whether they do not or whether or not it is a conventional matter whether they exist or not. Feferman seems to be suggesting that the type of Platonism that I claimed for Cohen (or Gödel) would require that for no such set could its existence be a conventional matter. I am certainly not claiming that - at least my own form of Platonism does not demand that I need necessarily go to such extremes. (Incidentally, I was speaking to someone recently, who knows Cohen, and he told me that he would certainly describe him as a Platonist. I am not sure where that, in itself, would leave us; but it is my direct personal impression that the considerable majority of working mathematicians are at least "weak" Platonists - which is quite enough. 
I should also refer Feferman to the informal survey of mathematicians reported on by Davis and Hersh in their book The Mathematical Experience, 1982, which confirms this impression.)

And from a post by Freeper tortoise, an Artificial Intelligence expert:

They [symbols] are very real in the same sense that all information is. Symbols (in the information-theoretic abstract) describe everything describable, and the more thoughtful people realize that physical objects are actually complex collections of symbols at their essence. Of course, one could then argue that energy is the ultimate substrate in which symbols manifest, a pervasive field of unknown origin that we only see from differentials in the energy field (as manifested symbols).

 

In the beginning...

One of the most profound discoveries of science is that there was a beginning, which of course is the first phrase in the Bible.

In the beginning God created the heaven and the earth. - Genesis 1:1

The discovery of a beginning is evidently a great stumbling-block to metaphysical naturalism (atheism) - since a beginning requires a cause, which obviously is God.

It appears that I am not the only one to conclude that the multi-verse theories were proposed to counter the obvious theological importance of that discovery. But that strategy is too clever by half, since even a multi-verse must have a beginning.

Interview with Nicolo Dallaporta, a father of modern cosmology

To get away from this evidence, cosmological scenarios are offered that in one way or another repropose a form of the old principle of plenitude ("everything that can exist, does exist"). The existence is thus postulated of an infinity of chances, among which "our case" becomes an obvious favorable case (today the most popular form is that of multi-universes). What is your view on this?

It is very possible, but it is not physics. It is a metaphysics in which recourse is made to a chance that is so enormously limitless that everything that is possible is real. But in this way it becomes a confrontation between metaphysics in which chance collides with purpose. This latter, however, seems much easier to believe! Physics up to now has been based on measurable "data." Beyond this it is a passage of metaphysics. At this point I compare it with another metaphysics. Those who sustain these viewpoints (like Stephen Hawking, for instance) should realize that this goes beyond physics; otherwise it is exaggerated. Physics, pushed beyond what it can measure, becomes ideology.

Robert Jastrow's book God and the Astronomers underscores the significance of a beginning:

Interview with Jastrow

JASTROW: Oh yes, the metaphor there was that we know now that the universe had a beginning, and that all things that exist in this universe-life, planets, stars-can be traced back to that beginning, and it's a curiously theological result to come out of science. The image that I had in my mind as I wrote about this was a group of scientists and astronomers who are climbing up a range of mountain peaks and they come to the highest peak and the very top, and there they meet a band of theologians who have been sitting for centuries waiting for them.

Einstein offered, and later withdrew as a kludge, a cosmological constant that allowed the universe to be considered unchanging and steady, i.e. having no beginning. The line of defense against God continues today with the multi-verse model: we should not be amazed at physical existence, the argument goes, because everything that can possibly exist, does.

Nevertheless, most multi-verse theories still require a cause and thus only push the beginning further back. The Tegmark article summarizes the multi-verse theories.

What is astonishing is the transparency of motive, which is clearly visible in the following excerpts!

Why is there life? - Martin Rees

The Universe is unlikely. Very unlikely. Deeply, shockingly unlikely.

"It's quite fantastic," says Martin Rees, Britain's Astronomer Royal, waving a hand through the steam rising from his salmon-and-potato casserole...

In his newest book, Just Six Numbers, Rees argues that six numbers underlie the fundamental physical properties of the universe, and that each is the precise value needed to permit life to flourish. In laying out this premise, he joins a long, intellectually daring line of cosmologists and astrophysicists (not to mention philosophers, theologians, and logicians) stretching all the way back to Galileo, who presume to ask: Why are we here? As Rees puts it, "These six numbers constitute a recipe for the universe." He adds that if any one of the numbers were different "even to the tiniest degree, there would be no stars, no complex elements, no life." ...

Faced with such overwhelming improbability, cosmologists have offered up several possible explanations. The simplest is the so-called brute fact argument. "A person can just say: 'That's the way the numbers are. If they were not that way, we would not be here to wonder about it,' " says Rees. "Many scientists are satisfied with that." Typical of this breed is Theodore Drange, a professor of philosophy at the University of West Virginia, who claims it is nonsensical to get worked up about the idea that our life-friendly universe is "one of a kind." As Drange puts it, "Whatever combination of physical constants may exist, it would be one of a kind."

Rees objects, drawing from an analogy given by philosopher John Leslie. "Suppose you are in front of a firing squad, and they all miss. You could say, 'Well, if they hadn't all missed, I wouldn't be here to worry about it.' But it is still something surprising, something that can't be easily explained. I think there is something there that needs explaining."

Meanwhile, the numbers' uncanny precision has driven some scientists, humbled, into the arms of the theologians. "The exquisite order displayed by our scientific understanding of the physical world calls for the divine," contends Vera Kistiakowsky, a physicist at the Massachusetts Institute of Technology. But Rees offers yet another explanation, one that smacks of neither resignation nor theology. Drawing on recent cosmology- especially the research of Stanford University physicist Andrei Linde and his own theories about the nature of the six numbers- Rees proposes that our universe is a tiny, isolated corner of what he terms the multiverse.

The idea is that a possibly infinite array of separate big bangs erupted from a primordial dense-matter state. As extravagant as the notion seems, it has nonetheless attracted a wide following among cosmologists. Rees stands today as its champion. "The analogy here is of a ready-made clothes shop," says Rees, peeling his dessert, a banana. "If there is a large stock of clothing, you're not surprised to find a suit that fits. If there are many universes, each governed by a differing set of numbers, there will be one where there is a particular set of numbers suitable to life. We are in that one."

Imaginary Time - Stephen Hawking

In this lecture, I would like to discuss whether time itself has a beginning, and whether it will have an end. All the evidence seems to indicate, that the universe has not existed forever, but that it had a beginning, about 15 billion years ago. This is probably the most remarkable discovery of modern cosmology. Yet it is now taken for granted. We are not yet certain whether the universe will have an end...

The time scale of the universe is very long compared to that for human life. It was therefore not surprising that until recently, the universe was thought to be essentially static, and unchanging in time...

Since events before the Big Bang have no observational consequences, one may as well cut them out of the theory, and say that time began at the Big Bang. Events before the Big Bang, are simply not defined, because there's no way one could measure what happened at them. This kind of beginning to the universe, and of time itself, is very different to the beginnings that had been considered earlier. These had to be imposed on the universe by some external agency. There is no dynamical reason why the motion of bodies in the solar system can not be extrapolated back in time, far beyond four thousand and four BC, the date for the creation of the universe, according to the book of Genesis. Thus it would require the direct intervention of God, if the universe began at that date. By contrast, the Big Bang is a beginning that is required by the dynamical laws that govern the universe. It is therefore intrinsic to the universe, and is not imposed on it from outside.
Although the laws of science seemed to predict the universe had a beginning, they also seemed to predict that they could not determine how the universe would have begun. This was obviously very unsatisfactory. So there were a number of attempts to get round the conclusion, that there was a singularity of infinite density in the past. ....

If space and imaginary time are indeed like the surface of the Earth, there wouldn't be any singularities in the imaginary time direction, at which the laws of physics would break down. And there wouldn't be any boundaries, to the imaginary time space-time, just as there aren't any boundaries to the surface of the Earth. This absence of boundaries means that the laws of physics would determine the state of the universe uniquely, in imaginary time. But if one knows the state of the universe in imaginary time, one can calculate the state of the universe in real time. One would still expect some sort of Big Bang singularity in real time. So real time would still have a beginning. But one wouldn't have to appeal to something outside the universe, to determine how the universe began. Instead, the way the universe started out at the Big Bang would be determined by the state of the universe in imaginary time. Thus, the universe would be a completely self-contained system. It would not be determined by anything outside the physical universe, that we observe.

To return to the basic argument between Plato and Aristotle - Hawking (Aristotle worldview) says that quantum mechanics is all we can ever know, and that if one uses imaginary time then one can have a beginning for the universe that is not in real time. No need to look further, end of story.

From the Platonist side, mathematicians/physicists like Penrose would say "not so fast!" (figuratively speaking). IOW, we ought not shut down the (inconvenient) research - there is always a cause and a beginning; if we cannot know any more from quantum mechanics, then let's look for a new kind of physics.

I recognize the Hawking-Aristotle worldview as "the end justifies the means" style of science that has so dominated evolution theory. Nevertheless, even in imaginary time there is a beginning and there will be an end.

Speaking of time...

Time does not mean what we commonly assume that it means. That is true from all points of view but is particularly revealing in the Level IV parallel universe description in Tegmark's article.

Time is geometric. Therefore, as long as we look at that which is physical, there is always a beginning.

The only way to get perspective on time is to get outside of it mentally, i.e. to understand the mathematical constructs of dimensions - or more directly, the spiritual realm and God - i.e. that which is non-temporal, non-spatial and non-corporeal.

Did God have a beginning?

Some people assert that God must have created Himself.

Others (I am one) say that God the Creator exists outside of space and time and thus there is no beginning for God, i.e. the creation is not something in which the Creator exists. There is no "before" the big bang or any multi-verse or dimensional parallel in ekpyrotic cosmology.

This is another area wherein each person must work out his own understanding. Mine is somewhat unusual but is based on the Word, Jewish tradition and science.

One of the words used to describe God at creation is Ayn Sof, which roughly translated from Hebrew means infinite and nothing. The scientific term for such a state at the beginning of this universe is singularity - in which there are no physical laws, no space, no time, no particles, no geometry, no energy, nothing - and yet everything. It has a parallel in math as well: the number zero - division by it is undefined, anything multiplied by it becomes it, and it sits between all positive and negative numbers. Infinite and nothing at the same time.
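The arithmetic peculiarities of zero mentioned above can be checked directly. Here is a minimal sketch in Python (my own illustration, not from any of the sources cited here):

```python
# Division by zero is undefined, so Python raises an error.
try:
    1 / 0
except ZeroDivisionError:
    division_defined = False

# Anything multiplied by zero collapses to zero - even a huge number.
products = [n * 0 for n in (-7, 3, 10**100)]

# Zero sits between every negative and every positive number.
between = all(a < 0 < b for a, b in [(-1, 1), (-0.5, 2.5)])

print(division_defined, products, between)
# prints: False [0, 0, 0] True
```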

I pondered on this state at length and deduced that God must have wanted to reveal Himself and thus there was a beginning.

Then I pondered how God would go about revealing Himself. I deduced He would create beings who could think to whom He would reveal Himself and would commune. I further deduced how He would go about communicating Himself to these beings, i.e. that He is good and truth and so forth.

These attributes would have no meaning in any language unless they were set in contrast to what they are not. (How would you know if you are happy if you have never been sad?) Thus, I pondered that He would create good and evil, love and hate, et al so that a language could be formed, the Word.

I then pondered how He would communicate His will to the thinking beings so they would know Him. I also pondered that, for the words to have meaning, He would give them numerous manifestations of all these contrasts - space/time, geometry, particles, energy, matter, creatures.

One of the ideas of the Jewish Kabbalah that rings true to my spirit is that the Scriptures are another name for God, i.e. they reveal who He is. So I see all of creation - spiritual and material - and the Word as God revealing Himself.

Enter Satan who, beautiful and thinking being that he is, decided he ought to be exalted. He became "aware" of his beauty and self and thus was at odds with God's will for him.

Likewise, Adam and Eve became "aware" of themselves and sought to be more by gaining the knowledge of good and evil. So likewise, they were at odds with God's will for them and were banished to mortality (the frog view.)

When it is all said and done I see us restored to what was intended at the beginning, we will be the thinking beings to whom God reveals Himself and with whom He communes. His will is what matters over all else. The Lord's Prayer reveals as much, the meaning of life and the purpose of our existence:

Our Father which art in heaven,
Hallowed be thy name.
Thy kingdom come.
Thy will be done in earth, as [it is] in heaven.
Give us this day our daily bread.
And forgive us our debts, as we forgive our debtors.
And lead us not into temptation, but deliver us from evil:
For thine is the kingdom, and the power, and the glory, for ever. Amen.

IMHO, every believer ought to meditate deeply, every day, on the Lord's Prayer - phrase by phrase and word by word. Our place is sandwiched between God's purpose and His dominion.

What about reality and the non-temporal, non-spatial and non-corporeal?

There are various aspects of the non-temporal, non-spatial and non-corporeal gaining attention from a variety of disciplines. These include such things as consciousness, healing power of prayer, near death experiences, retrocognition, precognition, clairvoyance, telepathy.

As you can imagine, the opinions vary wildly. But here are a few links to read-up and help formulate your own conclusions:

PSYCHE: an interdisciplinary journal for cognitive science, philosophy, psychology, physics, neuroscience, and artificial intelligence

Healing Power of Prayer

Near Death Experiences

Edgar Cayce

The consciousness debate...

Of course, those working on strong Artificial Intelligence expect to achieve a self-aware device, i.e. consciousness by mechanism. The presumption is that the conscious mind (thought) is a physical phenomenon of the brain:

Summary of that view

Modern neurophysiology, though, leaves no room for the soul. A neurophysiologist can change our perceptions, our opinions, our motivations and memories by removing or stimulating tiny but well-defined fragments of the brain, or by administering small amounts of a hormone or neurotransmitter to the right place. What seemed a font of life is now part logical engine, part chemical soup, and all vulnerable to outside physical influences. Specific neurological deficits can make us feel that our family members are impostors, that a leg does not belong to us, that others are plotting against us, even that we are ourselves dead, all deeply personal feelings yet driven by ordinary interactions of neurons. Certain drugs or stimulation of parts of the temporal lobe can even elicit religious experiences.

Physical penetration into the depths of the self on this scale allows no free will -- neurons are affected only by other neurons, not by will or effort. The only remaining alternatives are a deterministic mechanism or an element of randomness. Determinism obviously would rule out free will. But the workings of the axons, dendrites and synapses are only determined to a first approximation. Unfortunately the indeterminacy of random errors does not help, for free will is defined as goal-directed, not random. In the neurophysiological context, randomness and chaos offer an escape from predetermination, but fall short of restoring free will.

Until recently considerations of free will have been the purview of a branch of modern philosophy, the philosophy of mind. Wegner makes short work of the philosophers, for without empirical progress there is nothing more to go on than yet another speculation or introspection. The introspection of free will, though irresistibly powerful, is not science. And science is just a systematic way of looking closely at the world and at ourselves.

Naturally, I strongly disagree. A successful model does not an original make. To some it makes no difference whether it is the Mona Lisa, a fine reproduction of it or a photographic copy. But neither the reproduction nor the photo is the Mona Lisa.

The device, a thing contrived, is not a human, an ensemble of anatomical parts, physiological functions and consciousness. As far as I know, strong A.I. is not trying to create such an ensemble.

Some believe strong A.I. may eventually give us a device which has its own qualia, e.g. a preference for da Vinci over Picasso. But even that does not make the device a human.

Qualia

Feelings and experiences vary widely. For example, I run my fingers over sandpaper, smell a skunk, feel a sharp pain in my finger, seem to see bright purple, become extremely angry. In each of these cases, I am the subject of a mental state with a very distinctive subjective character. There is something it is like for me to undergo each state, some phenomenology that it has. Philosophers often use the term 'qualia' (singular 'quale') to refer to the introspectively accessible, phenomenal aspects of our mental lives. In this standard, broad sense of the term, it is difficult to deny that there are qualia. Disagreement typically centers on which mental states have qualia, whether qualia are intrinsic qualities of their bearers, and how qualia relate to the physical world both inside and outside the head. The status of qualia is hotly debated in philosophy largely because it is central to a proper understanding of the nature of consciousness. Qualia are at the very heart of the mind-body problem....

Moreover, any viable hypothesis such as "thought is a physical phenomenon of the brain" must have more than evidence - it must be falsifiable. That is the concept Sir Karl Popper brought to the table, and this debate is a good example why.

For instance, my "hypothesis" is that the brain is like a transmitter/receiver for the soul which is non-temporal, non-corporeal and non-spatial. All the evidence that goes to prove the "thought is a physical phenomenon" hypothesis proves mine as well. Tinker with the physical brain, certain emotions, thoughts, functions and words cease, others arise.

The trick is that neither view can be falsified; instead, we have to look further to the physical laws and moreover, define "all that there is" to see what is possible.

The one view requires a universe or multi-verse which is more narrowly construed because it requires a corporeal, temporal or spatial explanation for consciousness. Mine is within the mathematical constructs of higher dimensional dynamics as described in Level IV of the Tegmark article.

Moreover, the above article strikes me as an evangelical flyer for metaphysical naturalism (atheism) as he makes sweeping statements that the reader should take as "gospel" truth simply because he says so. And in so doing, he sweeps away free will and the soul, and therefore also philosophy and theology as irrelevant. That is the reasonable end result of any "thinking is a physical phenomenon" hypothesis.

His hypothesis reminds me of what might happen if an intelligent cave-dweller got his hands on a radio. Since he knows nothing of radio waves, they don't exist in his world. He tinkers with the radio and discovers he can control whether it speaks or not or what it says. Aha, says he, these parts in the radio only touch these other parts in the radio and it does this and that by my tinkering - ipso facto there is no soul, no free will and hence, theology and philosophy are irrelevant.

Obviously quantum mechanics and higher dimensional dynamics are outside the scope of A.I. and the biology of the brain - much like abiogenesis is outside the scope of evolution biology. That works quite well until either discipline makes a statement touching on the out-of-scope subject. It doesn't work to make such a statement and then dismiss all rebuttals to it as "out of bounds."

Therefore I assert that quantum mechanics, quantum field theory and higher dimensional dynamics overarch all physicality and are relevant to the study of any physical phenomenon, including the brain and consciousness. So naturally, I believe my hypothesis is better grounded.

To sum it up, what we comprehend is confined to three spatial coordinates plus time. This is the "natural" boundary of both vision and the brain - but we deduce there may well be other dimensions and have formulated laboratory tests to affirm or falsify that deduction.

The Curse of Dimensionality (pdf)

I am not sure how many ways our physical bodies appear in the eyes of God or any being self-aware in a higher dimension. But I very much appreciate the frog/bird metaphor Tegmark used to illustrate that our comprehension is limited to three spatial dimensions plus time. Tegmark explains that the frog would appear as a bundle of cooked pasta from the bird's view. That may be true.

The above link on the curse of dimensionality goes into the issue in greater detail.
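The shrinking reach of our low-dimensional intuition can also be demonstrated numerically. The sketch below (my own illustration, using the standard volume formula for a d-dimensional ball) computes how much of a unit cube its inscribed ball occupies as the dimension grows - the fraction collapses toward zero, which is the "curse" in a nutshell:

```python
import math

def sphere_fraction(d):
    """Fraction of the unit d-cube occupied by its inscribed ball (radius 1/2).

    Uses the standard formula: volume of a d-ball of radius r is
    pi^(d/2) * r^d / Gamma(d/2 + 1); the cube's volume is 1.
    """
    return math.pi ** (d / 2) / math.gamma(d / 2 + 1) * (0.5 ** d)

for d in (2, 3, 10, 20):
    print(d, sphere_fraction(d))
# d=2 gives pi/4 (about 0.785); d=3 gives pi/6 (about 0.524);
# by d=20 the ball fills less than a ten-millionth of the cube.
```

In high dimensions almost all of the cube's volume hides in its corners, beyond the reach of the central ball - one concrete face of the curse of dimensionality the linked paper discusses.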

I do believe our being - our spirit - is whole in a higher dimension, but that our physical bodies are intentionally made in such a way that we cannot easily perceive it - thus we are not aware of the severe damage we do by untoward thoughts in this mortal life (even without acting on them!) I believe this is the subtext for all the relevant warnings in the Sermon on the Mount and elsewhere in Scripture.

Ye have heard that it was said by them of old time, Thou shalt not kill; and whosoever shall kill shall be in danger of the judgment:

But I say unto you, That whosoever is angry with his brother without a cause shall be in danger of the judgment: and whosoever shall say to his brother, Raca, shall be in danger of the council: but whosoever shall say, Thou fool, shall be in danger of hell fire. - Matthew 5:21-22

For these reasons, I see physical death as the shedding of the mortality (dimensional) blindfold - and why I conclude that the physical brain is a transmitter/receiver for the spirit and not a container for consciousness.

Predestination v. Free Will

Simply put, as shown in the article linked above - following the "thought is a physical phenomenon of the brain" hypothesis within a metaphysically naturalist worldview leads to the conclusion that all is predestined, thus there is no free will.

But predestination v free will is not a doctrinal dilemma for me. Fulfilled prophecy is evidence of predestination. The Word authenticates predestination in Romans 8:30:

Moreover whom he did predestinate, them he also called: and whom he called, them he also justified: and whom he justified them he also glorified.

Likewise, we have free will, the ability to choose. Our spiritual realm ancestors Adam and Eve chose self will over God's will and thus became mortals grounded to a finite timeline in the physical realm - "pastaness" in the frog/bird metaphor. But God is eternal, i.e. unchanging.

Here is why I do not have a problem with predestination v free will.

To begin, I see that "all that there is" - all spiritual realms, physical realms (including dimensions, multi-verses and all geometries) - are God's revealing Himself to creatures He is concurrently creating to commune with eternally.

As one cannot know health having never known sickness - likewise, courage appears by contrast to fear, love to hate, good to evil, obedience to disobedience, etc. The "properties" of God are being shown to us in contrast to what He is not. When His kingdom comes, all that is Him emerges and that which is not Him is culled.

But if the process did not exist, we would have no way to know Him.

Only God exists, i.e. has life in Himself; He says we can use the nickname I am to refer to Him. He is outside of space/time, before "the beginning" and after "the end" from our point of view. Thus, being well beyond all form of geometry, He is not constrained to or by any timeline.

That is why He speaks of what is the future to us, as if it were already past. That is why, when He pronounces judgment, it is already done. If this were not so, then Christ's sacrifice would be unnecessary.

That Christ became the propitiation for our sin is evidence to me that my concept of eternity v. time-like paths is so. The emphasis on Christ as the Lamb of God in Revelation goes to that understanding as well.

And all that dwell upon the earth shall worship him [the beast], whose names are not written in the book of life of the Lamb slain from the foundation of the world. - Revelation 13:8

So, yes, we have free will in the same sense that we have physical reality and physical laws in the Level IV multi-verse. Life goes on in blissful disregard of wave/particle duality, multi-verses, dimensionality - and the import of time. He of course has always known what we will choose in our individual mortal timelines because He is above and beyond our geometry.

In the frog/bird metaphor, He sees the entire movie - but we, being the frog observers, see it one frame at a time. Conversely, our pasta-like geometry is malleable over all time by Him, but not by us.

As He says in the Word, His sheep hear His voice, He knows us and we follow Him. John 10:14-29 Further, those passages make it clear that He already knows His sheep.

This understanding of free will does not bother me in the least, because the meaning and purpose of our existence is not to be the captains of our ships and the masters of our destiny - but rather, to know Him and thereby, to prepare us as family members for all eternity. From Revelation 4:11:

Thou art worthy, O Lord, to receive glory and honour and power: for thou hast created all things, and for thy pleasure they are and were created.

I do realize this analysis would be extremely distressing to anyone - believer or not - who wishes to be the master of his own destiny. Anyway, that's my two cents...

Which brings us to evolution...

We see requisite mechanisms at the most elementary levels, building blocks lying hither and thither; we can theorize how species arose - but that does not speak to opportunity or viability even at the most primitive levels, much less the functional complexity that we actually see.

And that's only looking at the biology - the enigma is magnified when one also examines what would be necessary to evolve consciousness under the metaphysical naturalist worldview.

Naturally, I support the Intelligent Design view. However, except for the following links which are offered so others can get the information from the "movement" directly - I will concentrate on remarks by scientists who are not affiliated with the Intelligent Design movement.

Discovery Institute, Center for Science and Culture

Access Research Network

Dembski Website

The problem of functional complexity

Interview with Marcel-Paul Schützenberger

Until his death, the mathematician and doctor of medicine Marcel-Paul Schützenberger (1920-1996) was Professor of the Faculty of Sciences at the University of Paris and a member of the Academy of Sciences...

Q: What do you mean by functional complexity?

S: It is impossible to grasp the phenomenon of life without that concept, the two words each expressing a crucial and essential idea. The laboratory biologists' normal and unforced vernacular is almost always couched in functional terms: the function of an eye, the function of an enzyme, or a ribosome, or the fruit fly's antennae -- their function; the concept by which such language is animated is one perfectly adapted to reality. Physiologists see this better than anyone else. Within their world, everything is a matter of function, the various systems that they study -- circulatory, digestive, excretory, and the like -- all characterized in simple, ineliminable functional terms. At the level of molecular biology, functionality may seem to pose certain conceptual problems, perhaps because the very notion of an organ has disappeared when biological relationships are specified in biochemical terms; but appearances are misleading, certain functions remaining even in the absence of an organ or organ systems. Complexity is also a crucial concept. Even among unicellular organisms, the mechanisms involved in the separation and fusion of chromosomes during mitosis and meiosis are processes of unbelievable complexity and subtlety. Organisms present themselves to us as a complex ensemble of functional interrelationships. If one is going to explain their evolution, one must at the same time explain their functionality and their complexity.

Q: What is it that makes functional complexity so difficult to comprehend?

S: The evolution of living creatures appears to require an essential ingredient, a specific form of organization. Whatever it is, it lies beyond anything that our present knowledge of physics or chemistry might suggest; it is a property upon which formal logic sheds absolutely no light. Whether gradualists or saltationists, Darwinians have too simple a conception of biology, rather like a locksmith improbably convinced that his handful of keys will open any lock. Darwinians, for example, tend to think of the gene rather as if it were the expression of a simple command: do this, get that done, drop that side chain. Walter Gehring's work on the regulatory genes controlling the development of the insect eye reflects this conception. The relevant genes may well function this way, but the story on this level is surely incomplete, and Darwinian theory is not apt to fill in the pieces.

Q: You claim that biologists think of a gene as a command. Could you be more specific?

S: Schematically, a gene is like a unit of information. It has simple binary properties. When active, it is an elementary information-theoretic unit, the cascade of gene instructions resembling the cascade involved in specifying a recipe. Now let us return to the example of the eye. Darwinists imagine that it requires what? A thousand or two thousand genes to assemble an eye, the specification of the organ thus requiring one or two thousand units of information? This is absurd! Suppose that a European firm proposes to manufacture an entirely new household appliance in a Southeast Asian factory. And suppose that for commercial reasons, the firm does not wish to communicate to the factory any details of the appliance's function -- how it works, what purposes it will serve. With only a few thousand bits of information, the factory is not going to proceed very far or very fast. A few thousand bits of information, after all, yields only a single paragraph of text. The appliance in question is bound to be vastly simpler than the eye; charged with its manufacture, the factory will yet need to know the significance of the operations to which they have committed themselves in engaging their machinery. This can be achieved only if they already have some sense of the object's nature before they undertake to manufacture it. A considerable body of knowledge, held in common between the European firm and its Asian factory, is necessary before manufacturing instructions may be executed.

Q: Would you argue that the genome does not contain the requisite information for explaining organisms?

S: Not according to the understanding of the genome we now possess. The biological properties invoked by biologists are in this respect quite insufficient; while biologists may understand that a gene triggers the production of a particular protein, that knowledge -- that kind of knowledge -- does not allow them to comprehend how one or two thousand genes suffice to direct the course of embryonic development.

Q: You are going to be accused of preformationism...

S: And of many other crimes. My position is nevertheless strictly a rational one. I've formulated a problem that appears significant to me: how is it that with so few elementary instructions, the materials of life can fabricate objects that are so marvelously complicated and efficient? This property with which they are endowed -- just what is its nature? Nothing within our actual knowledge of physics and chemistry allows us intellectually to grasp it. If one starts from an evolutionary point of view, it must be acknowledged that in one manner or another, the earliest fish contained the capacity, and the appropriate neural wiring, to bring into existence organs which they did not possess or even need, but which would be the common property of their successors when they left the water for the firm ground, or for the air.


Can the part become greater than the whole?

For evolution theory to work there must be autonomy, self-organizing complexity and symbolization - such that functional complexity can arise.

"Self-organizing complexity" is a relatively new area of study and the meaning varies considerably whether the subject is economics, biological systems, information theory, consciousness, etc. What is meant by "self" in each usage becomes significant; and since evolution theory precludes intelligent design, I added "autonomy" to the requirement of "self-organized complexity."

In sum, the "jury is still out" on the subject in all disciplines - but all the latest can be found here:

Los Alamos archives

International Society for Complexity, Information and Design

What could have started biological evolution?

I suspect we are seeing the influence of information theory on the field of evolution biology. Scientists like Yockey, Pattee and Rocha are pointing to the requirement - within evolution biology theory - for the organism to have autonomous self-organizing complexity, including symbolization. My observation is that the lack of any of these in the genetic code leaves the mechanism without an explanation for the rise of functional complexity.

Moreover, even if all of these were discovered - it would nevertheless require a bootstrap on the front end to initiate the process. And the existence of such a bootstrap, if algorithmic, would point directly to intelligent design. IMHO, the randomness pillar of evolution theory is in deep peril due to these contributions from mathematics.

With all the talk of primordial soups, one might think this issue has been solved. It has not. There is even a nice prize for the first one to come up with a good solution:

Origin of Life Prize

"The Origin-of-Life Prize" ® (hereafter called "the Prize") consists of $1.35 Million (USD) paid directly to the winner(s). The Prize will be awarded for proposing a highly plausible mechanism for the spontaneous rise of genetic instructions in nature sufficient to give rise to life. To win, the explanation must be consistent with empirical biochemical and thermodynamic concepts as further delineated herein, and be published in a well-respected, peer-reviewed science journal(s).

Hubert P. Yockey

All dialectical materialist origin of life scenarios require in extremis a primeval soup. There is no path from this mythical soup to the generation of a genome and a genetic code. John von Neumann showed that fact in his Theory of Self-Reproducing Automata U of Ill. Press 1966. One must begin with a genetic message of a rather large information content. Manfred Eigen and his disciples argue that all it takes is one self-catalytic molecule to generate a genome. This self-catalytic molecule must have a very small information content. By that token, there must be very few of them [Section 2.4.1] As they self-reproduce and evolve the descendants get lost in the enormous number of possible sequences in which the specific messages of biology are buried. From the Shannon-McMillan theorem I have shown that a small protein, cytochrome c is only 2 x 10^-44 of the possible sequences. It takes religious faith to believe that would happen. Of course the minimum information content of the simplest organism is much larger than the information content of cytochrome c.
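Yockey's figure can be put in perspective with some rough arithmetic. The sketch below assumes cytochrome c is about 104 residues drawn from the 20 standard amino acids (my own assumed numbers, not from the quote) and takes his 2 x 10^-44 fraction at face value:

```python
from math import log10

residues = 104      # approximate length of cytochrome c (an assumption)
alphabet = 20       # the standard amino acids

# Total sequence space: 20^104, roughly 10^135 possible sequences.
total_sequences = alphabet ** residues

# Yockey's quoted figure: functional sequences are ~2e-44 of that space.
functional_fraction = 2e-44
functional_log10 = log10(total_sequences) + log10(functional_fraction)

print(f"log10(total sequences)      ~ {log10(total_sequences):.1f}")
print(f"log10(functional sequences) ~ {functional_log10:.1f}")
```

Even with something like 10^91 functional sequences, they are swamped by the ~10^135 possibilities - which is the point of Yockey's argument.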

H. H. Pattee

But there is another type of subjective feeling about understanding life that motivated Pearson's question, the same, I think, that motivated Lucretius' and von Neumann's questions. It is a feeling of paradox, the same feeling that motivated Bohr, Wigner, Polanyi, the skeptics, and somewhat ironically, the founders of what is now reductionist molecular biology, like Delbrück. They all believed that life follows laws, but from their concept of law, they could not understand why life was so strikingly different from non-life. So I find another way of asking this type of question: What exactly does our view of universal dynamical laws abstract away from life, so that the striking distinctions between the living and the lifeless become obscure and apparently paradoxical?

My first answer is that dynamical language abstracts away the subject side of the epistemic cut. The necessary separation of laws and initial conditions is an explicit principle in physics and has become the basis (and bias) of objectivity in all the sciences. The ideal of physics is to eliminate the subjective observer completely. It turned out that at the quantum level this is a fundamental impossibility, but that has not changed the ideal. Physics largely ignores the exceptional effects of individual (subjective) constraints and boundary conditions and focusses on the general dynamics of laws. This is because constraints are assumed to be reducible to laws (although we know they are not reducible across epistemic cuts) and also because the mathematics of complex constraints is often unmanageable. Philosophers have presented innumerable undecidable metaphysical models about the mind-brain cut, and physicists have presented more precise but still undecidable mathematical models about quantum measurement. But at the primeval level, where it all began, the genotype-phenotype cut is now taken for granted as ordinary chemistry.

My second answer is that if you abstract away the details of how subject and object interact, the "very peculiar range" of sizes and behaviors of the allosteric polymers that connect subject and object, the memory controlled construction of polypeptides, the folding into highly specific enzymes and other functional macromolecules, the many-to-many map of sequences to structures, the self-assembly, and the many conformation dependent controls - in other words, if you ignore the actual physics involved in these molecules that bridge the epistemic cut, then it seems unlikely that you will ever be able to distinguish living organisms by the dynamic laws of "inorganic corpuscles" or from any number of coarse-grained artificial simulations and simulacra of life. Is it not plausible that life was first distinguished from non-living matter, not by some modification of physics, some intricate nonlinear dynamics, or some universal laws of complexity, but by local and unique heteropolymer constraints that exhibit detailed behavior unlike the behavior of any other known forms of matter in the universe?

Luis Rocha

The idea that life may have originated from a pure RNA world has been around for a while. In this scenario the first life forms relied on RNA molecules as both symbolic carriers of genetic information, and functional, catalytic molecules. The neutralist hypothesis for the function of RNA editing assumes such a RNA world origin of life. It posits that RNA editing could offer a process by which the dual role of RNA molecules as information carriers and catalysts could more easily co-exist. The key problem for the RNA world origin of life hypothesis is precisely the separation between these two functions of RNA. On the one hand RNA molecules should be stable (non-reactive) to carry information, and on the other hand they should be reactive to perform their catalytic function. RNA editing could be seen as a means to fragment genetic information into several non-reactive molecules, that are later, through RNA editing processes, integrated into reactive molecules. This way, the understanding of this process of mediation between the role of RNA molecules as information carriers and catalytic molecules based on RNA editing, can also offer many clues to the problem of origin of a semiotic code from a dynamic (catalytic) substrate.

Given many random distributions of the reactivity of an RNA sequence space, we could study how easily reactive sequences can be constructed by RNA editing of non-reactive molecules. A study of this process is forthcoming.
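Rocha's forthcoming study is not reproduced here, but the kind of experiment he describes can be sketched as a toy Monte Carlo simulation. Everything below is an illustrative assumption: the sequence length, the 5% chance that a random sequence happens to be "reactive", and the single-substitution model of editing.

```python
import random

random.seed(42)

BASES = "ACGU"
SEQ_LEN = 8      # toy length; real RNA molecules are far longer
N_TRIALS = 1000

def is_reactive(seq, table):
    """Toy stand-in: reactivity is an arbitrary, randomly assigned
    property of a sequence, fixed on first lookup."""
    return table.setdefault(seq, random.random() < 0.05)

def random_seq():
    return "".join(random.choice(BASES) for _ in range(SEQ_LEN))

def edit(seq):
    """One 'editing' event: substitute a single random position."""
    i = random.randrange(SEQ_LEN)
    return seq[:i] + random.choice(BASES) + seq[i + 1:]

table = {}
hits = 0
for _ in range(N_TRIALS):
    s = random_seq()
    # Count cases where a non-reactive (information-carrying) sequence
    # is edited into a reactive (catalytic) one.
    if not is_reactive(s, table) and is_reactive(edit(s), table):
        hits += 1

frequency = hits / N_TRIALS
```

The interesting empirical question, which this toy cannot answer, is what the real reactivity distribution over RNA sequence space looks like.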

 

And leaving the soup bowl doesn't make math/physics problems with biological evolution disappear. Indeed, the long view of biological history is perplexing:

Stephen Wolfram

On the basis of traditional biological thinking one would tend to assume that whatever complexity one saw must in the end be carefully crafted to satisfy some elaborate set of constraints. But what I believe instead is that the vast majority of the complexity we see in biological systems actually has its origin in the purely abstract fact that among randomly chosen programs many give rise to complex behavior....

So how can one tell if this is really the case?

One circumstantial piece of evidence is that one already sees considerable complexity even in very early fossil organisms. Over the course of the past billion or so years, more and more organs and other devices have appeared. But the most obvious outward signs of complexity, manifest for example in textures and other morphological features, seem to have already been present even from very early times.

And indeed there is every indication that the level of complexity of individual parts of organisms has not changed much in at least several hundred million years. So this suggests that somehow the complexity we see must arise from some straightforward and general mechanism and not, for example, from a mechanism that relies on elaborate refinement through a long process of biological evolution....

Gerald Schroeder

Life is in essence a symbiotic combination of proteins (and other structures, but here I'll discuss only the proteins). The history of life teaches us that not all combinations of proteins are viable. At the Cambrian explosion of animal life, 530 million years ago, some 50 phyla (basic body plans) appeared suddenly in the fossil record. Only 30 to 34 survived. The rest perished. Since then no new phyla have evolved. It is no wonder that Scientific American asked whether the mechanism of evolution has changed in a way that prohibits all other body phyla. It is not that the mechanism of evolution has changed. It is our understanding of how evolution functions that must change, change to fit the data presented by the fossil record. To use the word of Harvard professor Stephen Jay Gould, it appears that the flow of life is "channeled" along these 34 basic directions...

Among the structures that appeared in the Cambrian were limbs, claws, eyes with optically perfect lenses, intestines. These exploded into being with no underlying hint in the fossil record that they were coming. Below them in the rock strata (i.e., older than them) are fossils of one-celled bacteria, algae, protozoans, and clumps known as the essentially structureless Ediacaran fossils of uncertain identity. How such complexities could form suddenly by random processes is an unanswered question. It is no wonder that Darwin himself, at seven locations in The Origin of Species, urged the reader to ignore the fossil record if he or she wanted to believe his theory. Abrupt morphological changes are contrary to Darwin's oft repeated statement that nature does not make jumps. Darwin based his theory on animal husbandry rather than fossils. If in a few generations of selective breeding a farmer could produce a robust sheep from a skinny one, then, Darwin reasoned, in a few million or billion generations a sponge might evolve into an ape. The fossil record did not then nor does it now support this theory...

With the advent of molecular biology's ability to discern the structure of proteins and genes, statistical comparison of the similarity of these structures among animals has become possible. The gene that controls the development of the eye is the same in all mammals. That is not surprising. The fossil record implies a common branch for all mammals. But what is surprising, even astounding, is the similarity of the mammal gene to the gene that controls the development of eyes in mollusks and the visual systems in worms. The same can be said for the gene that controls the expression of limbs in insects and in humans. In fact, so similar is this gene that pieces of the mammalian gene, when spliced into a fruit fly, will cause a wing to appear on the fly. This would make sense if life's development were described as a tree. But the bush of life means that just above the level of one-celled life, insects and mammals and worms and mollusks separated.

The eye gene has 130 sites. That means there are 20 to the power of 130 possible combinations of amino acids along those 130 sites. Somehow nature has selected the same combination of amino acids for all visual systems in all animals. That fidelity could not have happened by chance. It must have been pre-programmed in lower forms of life. But those lower forms of life, one-celled, did not have eyes. These data have confounded the classic theory of random, independent evolution producing these convergent structures. So totally unsuspected by classical theories of evolution is this similarity that the most prestigious peer-reviewed scientific journal in the United States, Science, reported: "The hypothesis that the eye of the cephalopod [mollusk] has evolved by convergence with vertebrate [human] eye is challenged by our recent findings of the Pax-6 [gene] ... The concept that the eyes of invertebrates have evolved completely independently from the vertebrate eye has to be reexamined."
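Schroeder's arithmetic itself is easy to check; whether the full sequence space is the right denominator is the contested question, but the size of the space is not:

```python
import math

SITES = 130      # amino-acid positions quoted for the eye gene's product
ALPHABET = 20    # standard amino acids

combinations = ALPHABET ** SITES           # exact integer in Python
magnitude = SITES * math.log10(ALPHABET)   # ~169.1 orders of magnitude

# For scale: the observable universe is commonly estimated to contain
# on the order of 1e80 atoms, so this sequence space exceeds that count
# by nearly 90 orders of magnitude.
excess = magnitude - 80
```

Note that the snippet only verifies the count of possible sequences; it says nothing about how many of those sequences would be functional.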

The significance of this statement must not be lost. We are being asked to reexamine the idea that evolution is a free agent. The convergence, the similarity of these genes, is so great that it could not, it did not, happen by chance random reactions.

 

Ah, the epistemological zeal of the mathematicians and physicists! None of the above scientists are hostile to evolution theory, but neither do they accept pedigree as proof; they require explanations, not descriptions.

All over the internet and in public forums hither and yon, the battle rages between Biological Evolution and the Intelligent Design Movement. When the evidence is descriptive, the format of the debate reminds one of a courtroom. But when the subject turns to mathematics, the format of the debate narrows to such issues as "irreducible complexity."

I aver that it doesn't matter who wins this particular contest. Even if the "movement" were crushed tomorrow, the mathematicians and physicists are already in the fields of molecular biology and evolutionary biology. And there are far too many Platonists (weak and strong, naturalized and not) to sustain any "just so" stories.



TOPICS: Culture/Society; Editorial; Miscellaneous; Philosophy
KEYWORDS: archaeology; godsgravesglyphs; history
To: jayef
Have you read Deutsch?

No. What does he have to say? Got some links to save time? (A search on "deutsch" is rather hopeless.)

241 posted on 06/17/2003 3:26:48 PM PDT by PatrickHenry (Felix, qui potuit rerum cognoscere causas.)
[ Post Reply | Private Reply | To 236 | View Replies]

To: betty boop
...people expect perfect randomness, where randomness in actuality is constrained in some fashion?

The constraints just mean limitations of the area over which an event is unpredictable ("perfectly random"). Let me give you an example. Aflatoxins cause DNA mutations by binding to parts of guanine residues resulting in a G to T transversion. The toxin is very biased in this affinity, because it only binds to G and not to other nucleotides. At the same time, the toxin has no affinity for one available G over another G. It is constrained by chemistry, not by a directed or goal-oriented process, and, as such, it is still unpredictable within those constraints.

242 posted on 06/17/2003 3:41:48 PM PDT by Nebullis
[ Post Reply | Private Reply | To 227 | View Replies]
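Nebullis's aflatoxin example, biased toward G but blind among the available G's, can be illustrated in a few lines; the DNA stretch below is invented for illustration:

```python
import random

random.seed(0)

# A toy DNA stretch; the toxin in the example above binds only guanine (G).
sequence = "ATGGCGTAGGATC"
g_sites = [i for i, base in enumerate(sequence) if base == "G"]

# The choice is constrained (only G positions are possible targets) yet
# unpredictable within that constraint (any available G is equally likely).
target = random.choice(g_sites)

# Apply the G -> T transversion at the chosen site.
mutated = sequence[:target] + "T" + sequence[target + 1:]
```

The bias (G only) and the residual randomness (which G) are both visible in the two lines that pick the target.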

To: Nebullis; betty boop
Extant phenomena in biology arrived there via extremely biased pathways and with the help of many external variables.

Yes, tell us about those miraculous biases that come along the way of mindless evolution, Mr. Ad Hominem.... We're all ears. Tell us how mindless nature makes such wild leaps over and over again, despite the inability to demonstrate them scientifically.

BTW, what arm of the pseudo-science chair do you lean on?

243 posted on 06/17/2003 4:51:16 PM PDT by unspun ("Do everything in love.")
[ Post Reply | Private Reply | To 158 | View Replies]

To: Nebullis; betty boop
The constraints just mean limitations of the area over which an event is unpredictable ("perfectly random"). Let me give you an example. Aflatoxins cause DNA mutations by binding to parts of guanine residues resulting in a G to T transversion. The toxin is very biased in this affinity, because it only binds to G and not to other nucleotides. At the same time, the toxin has no affinity for one available G over another G. It is constrained by chemistry, not by a directed or goal-oriented process, and, as such, it is still unpredictable within those constraints.

And this proves what?

244 posted on 06/17/2003 4:53:26 PM PDT by unspun ("Do everything in love.")
[ Post Reply | Private Reply | To 242 | View Replies]

To: unspun
And this proves what?

It's not intended to prove anything.

245 posted on 06/17/2003 5:02:56 PM PDT by Nebullis
[ Post Reply | Private Reply | To 244 | View Replies]

To: Nebullis
It's not intended to prove anything.

An apt non-intention, thank you.

246 posted on 06/17/2003 5:06:56 PM PDT by unspun ("Do everything in love.")
[ Post Reply | Private Reply | To 245 | View Replies]

To: Michael121
To measure the date we use carbon dating.

We also use many, many other methods, all based on wildly different methodologies, measurements, and premises. And yet, for the most part results of all the differently determined dating methods agree with each other. (And when they disagree, there are well-understood reasons why.) How do you explain this if you believe they are unreliable?

Furthermore, most items of evolutionary interest are *not* dated via "carbon dating", because Carbon-14 dating can only be used for items up to about 50,000 years old. Most items of evolutionary interest have ages measured in millions of years, and other methods are used. Carbon dating is primarily of use for items within the range of human history and early pre-history.
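For reference, the roughly 50,000-year ceiling mentioned above follows directly from carbon-14's half-life of about 5,730 years; a quick sketch of the uncalibrated age calculation:

```python
import math

HALF_LIFE = 5730.0   # carbon-14 half-life in years

def radiocarbon_age(fraction_remaining):
    """Uncalibrated age from the fraction of the original C-14 still
    present in a sample: t = t_half * ln(1/f) / ln(2)."""
    return HALF_LIFE * math.log(1.0 / fraction_remaining) / math.log(2.0)

age_half = radiocarbon_age(0.5)   # one half-life: 5730 years

# After ~50,000 years less than 0.3% of the original C-14 remains,
# which is why the method fades out at about that age.
fraction_at_50k = 0.5 ** (50000.0 / HALF_LIFE)
```

Beyond that point the surviving C-14 is too scarce to measure against background, so other isotopes with longer half-lives take over.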

Yet can you answer with certainty the amount of carbon at any given time? As in, were the levels constant? The answer is no.

Very wrong. The answer is yes. There are many, many samples of known age (e.g. tree rings, arctic ice layers, lake bottom layers, etc.) which can be used, multiply and independently, to determine how much carbon-14 was in the atmosphere in any given year, and thus to calibrate Carbon-14 dating methods.

For a quick article on one such study, see http://more.abcnews.go.com/sections/science/dailynews/carbon0220.html

A much more technical treatment: Atmospheric Radiocarbon Calibration to 45,000 yr B.P.: Late Glacial Fluctuations and Cosmogenic Isotope Production

Such studies produce calibration results such as the following:

If the amount of Carbon-14 in the atmosphere had been exactly constant throughout time (and no one expects that it has been), then the results would fall on the straight diagonal line. Instead, the wiggly line indicates how much the actual amount of C-14 in the atmosphere deviated from the "base" amount, and from this we can know how much C-14 was actually present in any given year in the past 50,000 years.

Note that the above graph includes C-14 data from *two* completely independent sources (Lake Suigetsu varves, and ocean corals), and yet the results overlap beautifully, confirming each other. There is similar match from C-14 studies based on tree-ring data and other sources.

From this, we can build a Carbon-14 dating calibration or "correction" curve which can be used to confidently produce an accurate date from a given Carbon-14 measurement. These calibration curves look like this:

There are many databases available which are used to compile massive amounts of data to ensure the proper calibration of carbon-dating. For just one example, Marine Reservoir Correction Database.
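To make the calibration idea concrete, here is a minimal sketch of how such a table is used: a raw radiocarbon age is converted to a calendar age by interpolating between calibration points. The (radiocarbon age, calendar age) pairs below are invented placeholders, not values from any published curve; real curves are distributed as much denser tables.

```python
from bisect import bisect_left

# Hypothetical (radiocarbon age, calendar age) pairs, both in years BP.
curve = [(0, 0), (5000, 5700), (10000, 11400), (20000, 23800), (40000, 44000)]

def calibrate(raw_age):
    """Linear interpolation between the two bracketing calibration points;
    clamps to the ends of the table outside its range."""
    xs = [x for x, _ in curve]
    i = bisect_left(xs, raw_age)
    if i == 0:
        return curve[0][1]
    if i == len(curve):
        return curve[-1][1]
    (x0, y0), (x1, y1) = curve[i - 1], curve[i]
    return y0 + (y1 - y0) * (raw_age - x0) / (x1 - x0)

cal_age = calibrate(7500)   # halfway between the 5000 and 10000 entries
```

Published calibration software also propagates the measurement error through the curve; this sketch handles only the central value.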

Other methods are used to cross-check and calibrate other dating methods to ensure accuracy.

So we can get approximate dates, but relative to how close in terms of the universe?

Quite close.

If we can take a leaf from a tree and date it as being 10,000 years old, yet it was just removed and is still green,

"If"... Feel free to document that this is actually possible.

, then how can we rely on this?

Because the various dating methods give consistent results which have been repeatedly determined to be reliable and accurate.

Science goes out of its way to try to disprove the existence of God.

No, it really doesn't.

More so than to try and prove evolution.

You are extremely mistaken. There are multiple heavy monthly journals which consist of nothing but studies of evolution [one sample]. I can't think of a single article published in any peer-reviewed science journal which even attempted to "disprove the existence of God" (although you might find a few in the Philosophy department).

To deny a "supreme being" without proving "his" non-existence conclusively is a fundamental error.

So... Since Shiva and Zeus and Odin haven't had their "non-existence conclusively proven", is it a "fundamental error" to deny them as well?

To prove "evolution" without complete proof, such as all missing "links", is also an error.

You misunderstand how science works. Science does not deal in proofs.

However, by your own argument, if it's a "fundamental error" to "deny" something without "proving its non-existence conclusively", then aren't you making a "fundamental error" if you deny evolution without conclusively proving its falseness?

I think your thesis needs a bit more work. Meaning that the evidence, as more is gathered, seems in direct contradiction to itself.

Feel free to present your alleged examples.

Leave anti-religious agendas out of science and deal with the facts found. But sadly it seems most science attacks religion

Speaking of agendas...

The Bible, if nothing else, has proven things archaeologically, including the existence of Peter: his house, and his name carved upon a stone, in the fishing village he was from. When you use the evidence found in the pages of the Bible and things start to add up, truth upon truth, it is hard to deny the "whole" when the sum of its parts turns out to be real.

History and archeology teach us that the Civil War really happened, there really was a general named Sherman who burned Atlanta, there really were battles at certain places and times, etc., and that there was a woman named Margaret Mitchell. Does this make *all* of "Gone With The Wind" necessarily true?

Science has yet to find life on another planet. Yet they keep trying in this.

Because the only way to find out, *either way*, is to keep looking.

So they seem to have their own faith.

Yes, they have faith in the value of making efforts to keep learning more about the universe we live in.

But have yet to prove it to the world.

Nothing can be "proven" (there's that word again) without information. Science believes in gathering as much information as possible, so that when conclusions are made, they are based on real information, and not philosophical or religious dogmas or "sound-good-isms". And in gathering further information so that past conclusions can be further reality-checked. Science and the scientific method is, in a nutshell, all about doing frequent reality-checks of beliefs.

247 posted on 06/17/2003 6:32:49 PM PDT by Ichneumon
[ Post Reply | Private Reply | To 72 | View Replies]

To: tortoise
2. If there was a chemical necessity to any particular scheme, we would be seeing that certain possible combinations do not occur. Instead we see all 64 possible combinations of the three-base code appearing in living things.-me-

What you are saying is not strictly true, but the point is minor enough that it doesn't really change things one way or the other.

If it is not accurate, let me know how it is inaccurate; don't leave me guessing. I like to be as accurate as possible.
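As a side note, the count of 64 in the quoted claim is just the arithmetic of a three-letter code over a four-letter alphabet, which takes a few lines to enumerate:

```python
from itertools import product

BASES = "ACGU"   # RNA alphabet; the DNA version uses T in place of U

# Every possible three-base codon: 4**3 = 64 of them.
codons = ["".join(c) for c in product(BASES, repeat=3)]
n_codons = len(codons)

# Three of the 64 are the standard stop codons mentioned later in the thread,
# leaving 61 that code for amino acids.
STOP = {"UAA", "UAG", "UGA"}
coding = [c for c in codons if c not in STOP]
```

The biological observation in the quote, that all 64 are actually used by living things, is of course an empirical claim the enumeration cannot settle.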

More importantly, DNA doesn't "do" anything, merely providing a template.

That is a terrible analogy for what DNA does. Sure, DNA needs the rest of the cell to accomplish its work and even the entire organism, but to call it a template is like calling a program a bit of nonsense in a computer. Like a program without which a computer is just a piece of junk, without DNA a body would be just food for scavengers. Like a program, DNA is information, essential information for the human body, just as a program is essential information to make a computer work.

Building proteins off that template is an extremely biased system

If you mean by the above that it takes a lot of fiddling to get the protein to come out correct, you would be right. However, it is DNA itself that sets out how it is to be fiddled with: stop codons, homeoboxes, specific RNAs for specific genes. It sets up a control system to tell how much protein is to be produced and when, and much more. So you are totally degrading the tremendous job which DNA does in the organism.

we expend a fair portion of our supercomputing power today figuring out what protein conformations are probable under certain circumstances and which aren't.

If you are trying to make a better protein than nature's, or to modify it in any way, you certainly will need a lot of work to accomplish it, which, like the rest of your post, pretty much verifies what Alamo-Girl's sources have been saying: that it is virtually impossible to create a single functional gene at random. Such a miracle occurring once would be possible though extremely unlikely. However, to propose that such a miracle could have occurred not just once but millions and millions of times with the numerous species we have on earth is certainly impossible.

248 posted on 06/17/2003 6:43:54 PM PDT by gore3000 (Intelligent people do not believe in evolution.)
[ Post Reply | Private Reply | To 164 | View Replies]

To: Nebullis; Alamo-Girl
What sort of evidence is there for "pre programmed adapation capability"?

The Hox genes, all of them:

What do humans have in common with worms, flies and rodents? If you said, "not much," you're right. But not as "right" as you might think. During the early 1980s, scientists discovered that most of the genes in fruit flies that control the identity of different body parts -- a head, wing, or other structure -- are remarkably identical. The genes contain short sequences of deoxyribonucleic acid (DNA), which is found in every living cell and forms the "blueprint" for all organisms. Surprisingly, researchers discovered that the DNA sequence they had found in flies, called the homeobox, was common to genes that direct development of body structure in virtually all animals, including worms, flies, birds, mice and humans. "Homeo" is derived from the Greek word for similar; "box" refers to the clearly defined sequence, as though in a box.

Since the homeobox sequence stayed very similar during millions of years of evolution in many species, scientists suspected it must be important to life. They soon learned that the part of the protein it encodes can bind to DNA in a way that turns other genes on and off.

Even more surprising, scientists found that many genes containing the homeobox sequence, called hox genes, are lined up in clusters along chromosomes -- large strands of genetic material -- in an order that parallels the body part they control. On a fly chromosome, hox genes closest to one end control formation of the head, while the next ones in line control the upper body. At the other end of the cluster are genes controlling abdomen formation. When all these genes work correctly, the proteins they produce act together to ensure that each organism's body parts are made in correct locations. Hox genes also control development of parts of the central nervous system, including different regions of the brain.

Researchers concluded that hox genes are "master regulators" for the organization of the body. When the function of one of these genes is changed due to a genetic mutation or other factor, the wrong body part will develop in a given place. A fly, for example might grow a leg in the middle of its head.

A brief note: all the genes which evolutionists call 'pathways' are from multi-cellular creatures which arose during the Cambrian explosion. The Hox genes are obviously a necessary requirement for multi-cellular organisms. That such a universal set of genes could have arisen in so short a time to become the basis of just about all multi-cellular animals, and that they could serve as building blocks for future species' functions, shows pretty well that they could not have arisen either at random or due to 'selection' but could only have arisen by design.

249 posted on 06/17/2003 7:29:33 PM PDT by gore3000 (Intelligent people do not believe in evolution.)
[ Post Reply | Private Reply | To 186 | View Replies]

To: Ichneumon; Nebullis; Doctor Stochastic; js1138; Helms; dark_lord; tortoise; PatrickHenry; ...
You misunderstand how science works. Science does not deal in proofs.

Let's review what science actually is and what the scientific method actually does. (When one does, one realizes how far from science are conjectures of macroevolution, how far from scientifically validated are any descriptive hypotheses of how evolution is supposed to work, and how far from scientific theory is any patchwork model of the process of evolution.)

I'll present the information in roughly an order of very summarized to more detailed, so you can best decide where to stop.

1/4. from:  http://www.soci.niu.edu/~phildept/Dye/method.html 

Socratic Method and Scientific Method

Socratic Method

1. Wonder. Pose a question (of the "What is X?" form).
2. Hypothesis. Suggest a plausible answer (a definition or definiens) from which some conceptually testable hypothetical propositions can be deduced.
3. Elenchus: "testing," "refutation," or "cross-examination." Perform a thought experiment by imagining a case which conforms to the definiens but clearly fails to exemplify the definiendum, or vice versa. Such cases, if successful, are called counterexamples. If a counterexample is generated, return to step 2; otherwise go to step 4.
4. Accept the hypothesis as provisionally true. Return to step 3 if you can conceive any other case which may show the answer to be defective.
5. Act accordingly.

Scientific Method

1. Wonder. Pose a question.
2. Hypothesis. Suggest a plausible answer (a theory) from which some empirically testable hypothetical propositions can be deduced.
3. Testing. Construct and perform an experiment which makes it possible to observe whether the consequences specified in one or more of those hypothetical propositions actually follow when the conditions specified in the same proposition(s) pertain. If the experiment fails, return to step 2; otherwise go to step 4.
4. Accept the hypothesis as provisionally true. Return to step 3 if there are other predictable consequences of the theory which have not been experimentally confirmed.
5. Act accordingly.
Copyright © 1996, James Dye

Last Updated 8 January, 1996
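The five steps in the comparison above amount to a generate-and-test loop. Here is a deliberately bare sketch; the candidate "hypotheses" and the "test" are placeholders standing in for real scientific work:

```python
def scientific_method(hypotheses, test):
    """Step through candidate hypotheses (step 2), testing each (step 3),
    and accept the first one that survives (step 4)."""
    for h in hypotheses:
        if test(h):
            return h    # provisionally true; act accordingly (step 5)
    return None         # every candidate refuted; back to wondering (step 1)

# Toy usage: which exponent n makes 2**n equal 32?
accepted = scientific_method(hypotheses=range(10), test=lambda n: 2 ** n == 32)
```

The sketch deliberately omits what the table makes explicit in step 4: even an accepted hypothesis goes back to testing whenever a new prediction can be checked.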

2/4. from: http://www.ldolphin.org/SciMeth2.html

Steps in the Scientific Method

by Lambert Dolphin
Email: lambert@ldolphin.org
Web Pages: http://ldolphin.org/
May 1992.


3/4. from:  http://teacher.nsrl.rochester.edu/phy_labs/AppendixE/AppendixE.html

APPENDIX E: Introduction to the Scientific Method


Introduction to the Scientific Method

The scientific method is the process by which scientists, collectively and over time, endeavor to construct an accurate (that is, reliable, consistent and non-arbitrary) representation of the world.

Recognizing that personal and cultural beliefs influence both our perceptions and our interpretations of natural phenomena, we aim through the use of standard procedures and criteria to minimize those influences when developing a theory. As a famous scientist once said, "Smart people (like smart lawyers) can come up with very good explanations for mistaken points of view." In summary, the scientific method attempts to minimize the influence of bias or prejudice in the experimenter when testing an hypothesis or a theory.

I. The scientific method has four steps

1. Observation and description of a phenomenon or group of phenomena.

2. Formulation of an hypothesis to explain the phenomena. In physics, the hypothesis often takes the form of a causal mechanism or a mathematical relation.

3. Use of the hypothesis to predict the existence of other phenomena, or to predict quantitatively the results of new observations.

4. Performance of experimental tests of the predictions by several independent experimenters and properly performed experiments.

If the experiments bear out the hypothesis it may come to be regarded as a theory or law of nature (more on the concepts of hypothesis, model, theory and law below). If the experiments do not bear out the hypothesis, it must be rejected or modified. What is key in the description of the scientific method just given is the predictive power (the ability to get more out of the theory than you put in; see Barrow, 1991) of the hypothesis or theory, as tested by experiment. It is often said in science that theories can never be proved, only disproved. There is always the possibility that a new observation or a new experiment will conflict with a long-standing theory.

II. Testing hypotheses

As just stated, experimental tests may lead either to the confirmation of the hypothesis, or to the ruling out of the hypothesis. The scientific method requires that an hypothesis be ruled out or modified if its predictions are clearly and repeatedly incompatible with experimental tests. Further, no matter how elegant a theory is, its predictions must agree with experimental results if we are to believe that it is a valid description of nature. In physics, as in every experimental science, "experiment is supreme" and experimental verification of hypothetical predictions is absolutely necessary. Experiments may test the theory directly (for example, the observation of a new particle) or may test for consequences derived from the theory using mathematics and logic (the rate of a radioactive decay process requiring the existence of the new particle). Note that the necessity of experiment also implies that a theory must be testable. Theories which cannot be tested, because, for instance, they have no observable ramifications (such as, a particle whose characteristics make it unobservable), do not qualify as scientific theories.

If the predictions of a long-standing theory are found to be in disagreement with new experimental results, the theory may be discarded as a description of reality, but it may continue to be applicable within a limited range of measurable parameters. For example, the laws of classical mechanics (Newton's Laws) are valid only when the velocities of interest are much smaller than the speed of light (that is, in algebraic form, when v/c << 1). Since this is the domain of a large portion of human experience, the laws of classical mechanics are widely, usefully and correctly applied in a large range of technological and scientific problems. Yet in nature we observe a domain in which v/c is not small. The motions of objects in this domain, as well as motion in the "classical" domain, are accurately described through the equations of Einstein's theory of relativity. We believe, due to experimental tests, that relativistic theory provides a more general, and therefore more accurate, description of the principles governing our universe, than the earlier "classical" theory. Further, we find that the relativistic equations reduce to the classical equations in the limit v/c << 1. Similarly, classical physics is valid only at distances much larger than atomic scales (x >> 10^-8 m). A description which is valid at all length scales is given by the equations of quantum mechanics.
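The claim that the relativistic equations reduce to the classical ones when v/c << 1 can be checked numerically. Kinetic energy is expressed per unit rest energy below, so no masses are needed:

```python
import math

def gamma(beta):
    """Lorentz factor for velocity ratio beta = v/c."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

# Relativistic kinetic energy per unit rest energy is (gamma - 1);
# the classical counterpart is beta**2 / 2. They agree when v/c << 1.
beta_slow, beta_fast = 1e-3, 0.9

rel_slow, cls_slow = gamma(beta_slow) - 1.0, beta_slow ** 2 / 2.0
rel_fast, cls_fast = gamma(beta_fast) - 1.0, beta_fast ** 2 / 2.0

disagreement_slow = abs(rel_slow - cls_slow) / rel_slow   # tiny
disagreement_fast = abs(rel_fast - cls_fast) / rel_fast   # large
```

At one thousandth of light speed the two formulas differ by less than a part per million; at 0.9c the classical formula is off by more than half.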

We are all familiar with theories which had to be discarded in the face of experimental evidence. In the field of astronomy, the earth-centered description of the planetary orbits was overthrown by the Copernican system, in which the sun was placed at the center of a series of concentric, circular planetary orbits. Later, this theory was modified, as measurements of the planets' motions were found to be compatible with elliptical, not circular, orbits, and still later planetary motion was found to be derivable from Newton's laws.

Errors in experiments have several sources. First, there is error intrinsic to instruments of measurement. Because this type of error has equal probability of producing a measurement higher or lower numerically than the "true" value, it is called random error. Second, there is non-random or systematic error, due to factors which bias the result in one direction. No measurement, and therefore no experiment, can be perfectly precise. At the same time, in science we have standard ways of estimating and in some cases reducing errors. Thus it is important to determine the accuracy of a particular measurement and, when stating quantitative results, to quote the measurement error. A measurement without a quoted error is meaningless. The comparison between experiment and theory is made within the context of experimental errors. Scientists ask, how many standard deviations are the results from the theoretical prediction? Have all sources of systematic and random errors been properly estimated? This is discussed in more detail in the appendix on Error Analysis and in Statistics Lab 1.
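The "how many standard deviations" question at the end of the paragraph is a one-line computation; the numbers below are invented for illustration:

```python
def sigmas_from_prediction(measured, predicted, error):
    """How many standard deviations a measurement lies from a theoretical
    prediction, the question the passage says scientists ask of their data."""
    return abs(measured - predicted) / error

# A measurement of 9.9 +/- 0.3 against a prediction of 9.8 is unremarkable;
# the same central value with a claimed error of 0.01 would be a ten-sigma
# discrepancy, i.e. either new physics or an underestimated error bar.
z_loose = sigmas_from_prediction(9.9, 9.8, 0.3)
z_tight = sigmas_from_prediction(9.9, 9.8, 0.01)
```

The comparison shows why, as the passage says, a measurement without a quoted error is meaningless: the same central value is either consistent or wildly inconsistent depending on the error bar.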

III. Common Mistakes in Applying the Scientific Method

As stated earlier, the scientific method attempts to minimize the influence of the scientist's bias on the outcome of an experiment. That is, when testing an hypothesis or a theory, the scientist may have a preference for one outcome or another, and it is important that this preference not bias the results or their interpretation. The most fundamental error is to mistake the hypothesis for an explanation of a phenomenon, without performing experimental tests. Sometimes "common sense" and "logic" tempt us into believing that no test is needed. There are numerous examples of this, dating from the Greek philosophers to the present day.



Another common mistake is to ignore or rule out data which do not support the hypothesis. Ideally, the experimenter is open to the possibility that the hypothesis is correct or incorrect. Sometimes, however, a scientist may have a strong belief that the hypothesis is true (or false), or feels internal or external pressure to get a specific result. In that case, there may be a psychological tendency to find "something wrong", such as systematic effects, with data which do not support the scientist's expectations, while data which do agree with those expectations may not be checked as carefully. The lesson is that all data must be handled in the same way.

Another common mistake arises from the failure to estimate quantitatively systematic errors (and all errors). There are many examples of discoveries which were missed by experimenters whose data contained a new phenomenon, but who explained it away as a systematic background. Conversely, there are many examples of alleged "new discoveries" which later proved to be due to systematic errors not accounted for by the "discoverers."

In a field where there is active experimentation and open communication among members of the scientific community, the biases of individuals or groups may cancel out, because experimental tests are repeated by different scientists who may have different biases. In addition, different types of experimental setups have different sources of systematic errors. Over a period spanning a variety of experimental tests (usually at least several years), a consensus develops in the community as to which experimental results have stood the test of time.

IV. Hypotheses, Models, Theories and Laws

In physics and other science disciplines, the words "hypothesis," "model," "theory" and "law" have different connotations in relation to the stage of acceptance or knowledge about a group of phenomena.

An hypothesis is a limited statement regarding cause and effect in specific situations; it also refers to our state of knowledge before experimental work has been performed and perhaps even before new phenomena have been predicted. To take an example from daily life, suppose you discover that your car will not start. You may say, "My car does not start because the battery is low." This is your first hypothesis. You may then check whether the lights were left on, or if the engine makes a particular sound when you turn the ignition key. You might actually check the voltage across the terminals of the battery. If you discover that the battery is not low, you might attempt another hypothesis ("The starter is broken"; "This is really not my car.")

The word model is reserved for situations when it is known that the hypothesis has at least limited validity. An often-cited example of this is the Bohr model of the atom, in which, in an analogy to the solar system, the electrons are described as moving in circular orbits around the nucleus. This is not an accurate depiction of what an atom "looks like," but the model succeeds in mathematically representing the energies (but not the correct angular momenta) of the quantum states of the electron in the simplest case, the hydrogen atom. Another example is Hooke's Law (which should be called Hooke's principle, or Hooke's model), which states that the force exerted by a mass attached to a spring is proportional to the amount the spring is stretched. We know that this principle is only valid for small amounts of stretching. The "law" fails when the spring is stretched beyond its elastic limit (it can break). This principle, however, leads to the prediction of simple harmonic motion, and, as a model of the behavior of a spring, has been versatile in an extremely broad range of applications.
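The idea of a model with a limited range of validity can be sketched as follows (the spring constant and elastic limit are made-up values, chosen only for illustration):

```python
def spring_force(x, k=50.0, elastic_limit=0.10):
    """Restoring force (newtons) of a spring stretched by x metres.

    Hooke's law F = -k*x is a model: it holds only for small stretches.
    Beyond the elastic limit the linear model no longer applies.
    """
    if abs(x) > elastic_limit:
        raise ValueError("beyond elastic limit: Hooke's law no longer applies")
    return -k * x

print(spring_force(0.05))  # within the model's range of validity
```

Inside its range, the model makes quantitative predictions (for example, simple harmonic motion); outside it, the model simply does not apply, which is exactly the distinction the paragraph above draws between a model and a universal law.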

A scientific theory or law represents an hypothesis, or a group of related hypotheses, which has been confirmed through repeated experimental tests. Theories in physics are often formulated in terms of a few concepts and equations, which are identified with "laws of nature," suggesting their universal applicability. Accepted scientific theories and laws become part of our understanding of the universe and the basis for exploring less well-understood areas of knowledge. Theories are not easily discarded; new discoveries are first assumed to fit into the existing theoretical framework. It is only when, after repeated experimental tests, the new phenomenon cannot be accommodated that scientists seriously question the theory and attempt to modify it. The validity that we attach to scientific theories as representing realities of the physical world is to be contrasted with the facile invalidation implied by the expression, "It's only a theory." For example, it is unlikely that a person will step off a tall building on the assumption that they will not fall, because "Gravity is only a theory."

Changes in scientific thought and theories occur, of course, sometimes revolutionizing our view of the world (Kuhn, 1962). Again, the key force for change is the scientific method, and its emphasis on experiment.

V. Are there circumstances in which the Scientific Method is not applicable?

While the scientific method is necessary in developing scientific knowledge, it is also useful in everyday problem-solving. What do you do when your telephone doesn't work? Is the problem in the hand set, the cabling inside your house, the hookup outside, or in the workings of the phone company? The process you might go through to solve this problem could involve scientific thinking, and the results might contradict your initial expectations.

Like any good scientist, you may question the range of situations (outside of science) in which the scientific method may be applied. From what has been stated above, we determine that the scientific method works best in situations where one can isolate the phenomenon of interest, by eliminating or accounting for extraneous factors, and where one can repeatedly test the system under study after making limited, controlled changes in it.

There are, of course, circumstances when one cannot isolate the phenomena or when one cannot repeat the measurement over and over again. In such cases the results may depend in part on the history of a situation. This often occurs in social interactions between people. For example, when a lawyer makes arguments in front of a jury in court, she or he cannot try other approaches by repeating the trial over and over again in front of the same jury. In a new trial, the jury composition will be different. Even the same jury hearing a new set of arguments cannot be expected to forget what they heard before.

VI. Conclusion

The scientific method is intricately associated with science, the process of human inquiry that pervades the modern era on many levels. While the method appears simple and logical in description, there is perhaps no more complex question than that of knowing how we come to know things. In this introduction, we have emphasized that the scientific method distinguishes science from other forms of explanation because of its requirement of systematic experimentation. We have also tried to point out some of the criteria and practices developed by scientists to reduce the influence of individual or social bias on scientific findings. Further investigations of the scientific method and other aspects of scientific practice may be found in the references listed below.

VII. References

1. Wilson, E. Bright. An Introduction to Scientific Research (McGraw-Hill, 1952).

2. Kuhn, Thomas. The Structure of Scientific Revolutions (Univ. of Chicago Press, 1962).

3. Barrow, John. Theories of Everything (Oxford Univ. Press, 1991).


Send comments, questions and/or suggestions via email to wolfs@nsrl.rochester.edu.


4/4.  from:  http://phyun5.ucr.edu/~wudka/Physics7/Notes_www/node5.html

The scientific method

Science is best defined as a careful, disciplined, logical search for knowledge about any and all aspects of the universe, obtained by examination of the best available evidence and always subject to correction and improvement upon discovery of better evidence. What's left is magic. And it doesn't work. -- James Randi


It took a long while to determine how the world is best investigated. One way is simply to talk about it (for example, Aristotle, the Greek philosopher, stated that males and females have different numbers of teeth, without bothering to check; he then provided long arguments as to why this is the way things ought to be). This method is unreliable: arguments alone cannot determine whether a statement is correct; that requires proof.

A better approach is to do experiments and perform careful observations. The results of this approach are universal in the sense that they can be reproduced by any skeptic. It is from these ideas that the scientific method was developed. Most of science is based on this procedure for studying Nature.



 

Jose Wudka
9/24/1998

250 posted on 06/17/2003 7:31:48 PM PDT by unspun ("Do everything in love.")
[ Post Reply | Private Reply | To 247 | View Replies]

To: gore3000
If you are trying to make a better protein than nature, or to modify it in any way, you certainly will need a lot of work to accomplish it, which, like the rest of your post, pretty much verifies what Alamo-Girl's sources have been saying - that it is virtually impossible to create a single functional gene at random.

That isn't really a valid analysis -- apples and oranges. You completely missed what I said, so I'll try again. "Difficult to compute" and "improbable" are completely orthogonal to each other. I can trivially compute molecular interactions that are virtually impossible in a real molecular system (as shown by the computation). Computing a particular conformation has the same cost no matter how probable or improbable its actual occurrence is.

What we are trying to do (and straining our computational abilities as we do it) is compute the probabilities of an entire molecular phase space. Not just what happens in a specific instance, but what could happen under what conditions and the probability pertaining thereto. The easy case is taking a particular starting point and seeing what the outcome is. Even worse, we often attempt to do an inverse computation, i.e. given a certain protein outcome, what are the possible starting points that would give that result. Quite frankly, our computers are pretty taxed doing the forward computation, and doing the inverse computation is largely beyond our computational abilities. It is not a symmetric computation, in the same way factoring large composites is vastly more difficult than multiplying the primes that make up the composite.

Biochemistry is computationally probable for the most part, and we can compute specific results with relative ease. Computing the inverse case so that we can manipulate protein systems at will is nigh intractable. Nature just follows the probable pathways. Computing what probable pathways can get you to a specific endpoint is extraordinarily difficult no matter how "common" and probable the protein interaction is. The extremely difficult inverse computation is important because it allows us to thoroughly explore biochemistry (both the probable and improbable), and despite the computational expense, it is often cheaper to do the modeling on computers than actually testing and sifting the astronomical number of permutations in a lab.

In short: difficult to compute is utterly unrelated to probability. We aren't just analyzing what happens from a specific known starting point, we are reverse engineering the entire phase space of possible starting points and possible end points. A vastly different problem, that.
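The factoring analogy in the post above is easy to demonstrate: the forward computation is a single multiplication, while the naive inverse takes on the order of a million trial divisions (the primes here are chosen arbitrarily for illustration):

```python
# Forward computation: multiplying two primes is one operation.
p, q = 1000003, 1000033
n = p * q

def factor(n):
    """Inverse computation: naive trial division.

    The work grows with the size of the smallest factor, so recovering
    p and q from n costs vastly more than producing n from p and q.
    """
    steps = 0
    d = 2
    while d * d <= n:
        steps += 1
        if n % d == 0:
            return d, n // d, steps
        d += 1
    return n, 1, steps

a, b, steps = factor(n)
print(a, b, steps)  # recovering the factors takes ~a million divisions
```

The asymmetry only grows with the size of the factors, which is the sense in which the inverse computation is "not symmetric" with the forward one.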

251 posted on 06/17/2003 7:31:58 PM PDT by tortoise (Dance, little monkey! Dance!)
[ Post Reply | Private Reply | To 248 | View Replies]

To: js1138
This is one of those profound differences between things that are "designed" and things arising through evolution. Living things have an enormous economy in their blueprints.

Indeed there is a tremendous economy in living things. For example, when the genome project was done, scientists were surprised that there were only some 30,000-odd genes in humans, because they had already identified some 100,000 different proteins used in the human organism. The reason is that genes can be made to produce more than one protein by using very sophisticated code reuse. Some genes can make more than 50-60 proteins! Code reuse is definitely a sign of intelligence. It takes hard thinking to figure out how to take code from here and there to make it do something else you need done. This cannot be done by dumb luck.
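The combinatorics behind one gene yielding many proteins can be illustrated with a toy splice-variant count (the exon names are hypothetical). With just six exons, the ordered subsets of two or more exons already number 57 — in the 50-60 range quoted above:

```python
from itertools import combinations

# A single gene's exons can be spliced together in many combinations,
# which is how one gene can encode dozens of protein variants.
exons = ["E1", "E2", "E3", "E4", "E5", "E6"]

# Count every splice variant that keeps at least two exons, in order.
# combinations() preserves order, as splicing preserves exon order.
variants = [c for r in range(2, len(exons) + 1)
            for c in combinations(exons, r)]

print(len(variants))  # 57 variants from only six exons
```

This sketch only counts combinations; it takes no position on whether the reuse arises from design or from other mechanisms, which is the point being debated in the posts that follow.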

252 posted on 06/17/2003 7:46:47 PM PDT by gore3000 (Intelligent people do not believe in evolution.)
[ Post Reply | Private Reply | To 203 | View Replies]

To: Alamo-Girl; Nebullis
Thank you for allowing me to eavesdrop on this conversation about that 'maxim' of Yockey's. I root for it being applicable only where evidence should be demonstrably compelled to reveal its little head, but doesn't.

(I confess I didn't read where it got to be "too many notes" though.)

--holy roamin' empirer (though not a scientist)
253 posted on 06/17/2003 7:48:55 PM PDT by unspun ("Do everything in love.")
[ Post Reply | Private Reply | To 150 | View Replies]

To: Nebullis
On top of all that, we already know, with massive amounts of supporting evidence, that natural selection that acts on variation exists. Any change, however it is induced, is subject to selection.

Evolutionists often speak as if natural selection changes the odds of something occurring. It does not. Selection only works after the event occurs, so it has no influence on the occurrence of a particular event. If the chances of an event occurring are 1 in 10^60 without selection, they are 1 in 10^60 with selection. Selection does not create anything; it does not work before the fact, which is what would be needed for it to change the odds. What it does do is leave a trail of death which makes the finding of the correct change virtually impossible. That is why selection is an agent of stasis, not of evolution.

254 posted on 06/17/2003 7:59:25 PM PDT by gore3000 (Intelligent people do not believe in evolution.)
[ Post Reply | Private Reply | To 204 | View Replies]

To: gore3000
Code reuse is definitely a sign of intelligence.

Or efficient expression, which happens all the time in nature absent intelligence because it is favored by thermodynamics.

This relates back to the point that "entropy" has a much deeper meaning than the simplistic (and often incorrect) thermodynamic definition that is the extent of most people's understanding of the word.

255 posted on 06/17/2003 8:31:17 PM PDT by tortoise (Dance, little monkey! Dance!)
[ Post Reply | Private Reply | To 252 | View Replies]

To: tortoise
I can trivially compute molecular interactions that are virtually impossible in a real molecular system (as shown by the computation).

Since the discussion is about biological evolution, not computing per se, you are agreeing with my statement (and Alamo-Girl's - and Yockey's!) about the virtual impossibility of creating functional genes.

Biochemistry is computationally probable for the most part, and we can compute specific results with relative ease.

No. You have just gone through a long exegesis on how computing faculties are strained trying to find a simple change. Unlike computers, which work fast and do not die if they do not find the answer, organisms do not reproduce at megabytes per second. They also die if they get the wrong answer.

Further the computers have been given intelligent directions which cut down the number of tries required to get success. This is not the case in nature.

To change a single DNA base correctly at random will take numerous tries. The claim that there are 'pathways' which cut down the chances is not correct, because there is no chemical reason for the sequence of DNA. What the 'pathways' do is exclude out of hand a tremendous number of possible changes; they do not cut down in any way the random tries it takes to achieve those changes. You are indulging in the usual evolutionist fallacy of the future predicting the past. When put this way it is obvious nonsense. When put as 'pathways determine the outcome,' it does not sound as silly, but it is the same logic - that what will be successful in the future is the cause of the events in the past.

In short: difficult to compute is utterly unrelated to probability. We aren't just analyzing what happens from a specific known starting point, we are reverse engineering the entire phase space of possible starting points and possible end points.

You are again giving support to my statement above. What you are speaking of is the reverse of how things actually happen. The future does not determine the past (except in the Terminator movies!).

256 posted on 06/17/2003 8:38:13 PM PDT by gore3000 (Intelligent people do not believe in evolution.)
[ Post Reply | Private Reply | To 251 | View Replies]

To: tortoise
Code reuse is definitely a sign of intelligence.-me-

Or efficient expression, which happens all the time in nature absent intelligence because it is favored by thermodynamics.

Just because something happens in nature, does not mean that it is due to evolution, in fact this is what the whole evolution/creation debate is about - how did things in nature come about.

Now gene expression is determined by DNA. You surely are not claiming that the laws of thermodynamics get into our genome and change our DNA so that it will be in conformance with it do you??????????

257 posted on 06/17/2003 8:45:25 PM PDT by gore3000 (Intelligent people do not believe in evolution.)
[ Post Reply | Private Reply | To 255 | View Replies]

To: gore3000
Selection does not create anything, it does not work before the fact which is what would be needed for it to change the odds.

You are sort of right, but you completely missed the point nonetheless. Evolutionary theory prescribes some large number of small steps between two points. At step(n), selection(n) does not alter the odds of step(n) occurring. In this you are correct. The point you miss is that selection(n) constrains the possible phase space for step(n+1), thereby altering the probabilities for all step(n+k) where k>0 (a recursive feedback loop that reduces the number of possible outcomes at each step, increasing the odds of any one of those outcomes happening).

This is why the aggregate probabilities are not a multiplicative function of simple combinatorics. You cannot assert the probabilities at each step until the selection function has been applied to the previous step, which actually limits the number of possibilities at each step. You have to use the aggregate probabilities of each step post-selection from the previous step, which makes each subsequent step far more probable than if you assumed the phase space was unconstrained (which is what you do).
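The feedback loop described above can be illustrated with a toy simulation in the style of the well-known "weasel" program (the target string and alphabet are hypothetical, chosen only for illustration): a single-shot match has odds of (1/26)^8, while selection that locks in each partial match constrains the search space after every step and typically converges in a few dozen tries.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible
TARGET = "METHINKS"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

# Single-shot: every character must be right at once -- odds (1/26)^8.
single_shot_odds = (1 / 26) ** len(TARGET)

def cumulative_steps(target):
    """Steps needed when selection retains each matching character.

    After each step the selection function shrinks the phase space:
    only the still-unmatched positions are varied on the next step.
    """
    current = [random.choice(ALPHABET) for _ in target]
    steps = 0
    while "".join(current) != target:
        steps += 1
        for i, ch in enumerate(current):
            if ch != target[i]:  # selection keeps matches, varies the rest
                current[i] = random.choice(ALPHABET)
    return steps

steps = cumulative_steps(TARGET)
print(f"single-shot odds: {single_shot_odds:.3e}; cumulative steps: {steps}")
```

This is a sketch of the statistical point only — that post-selection probabilities are not multiplicative over an unconstrained phase space — not a model of any actual biochemistry.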

Time for dinner...

258 posted on 06/17/2003 8:49:28 PM PDT by tortoise (Dance, little monkey! Dance!)
[ Post Reply | Private Reply | To 254 | View Replies]

To: gore3000
Thank you so much for the information on the Hox gene!

It looks like the trend may be that many of the regulator genes appear from the earliest, e.g. like pre-programmed adaptation ability.

Are the Hox genes conserved across phyla like the eyeless gene, i.e. between human and mouse, 100% identical, and between human and Drosophila, 94%? This is evidently the astonishing observation; IOW, it puts more emphasis on pre-programmed adaptation capability and less on random mutation branching away from the common ancestor(s).

259 posted on 06/17/2003 8:58:12 PM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 249 | View Replies]

To: gore3000
Just because something happens in nature, does not mean that it is due to evolution

I never said that. I said that code reuse is not a sign of intelligence ipso facto, contrary to what you asserted. Evolution versus design is a false dichotomy. There are other plausible mechanisms that can create speciation (such as complexification in automata systems). System dynamics offers a number of possibilities; so-called "evolution" is just one possible mechanism described, and one fixated on because some famous dude wrote a book on it many years ago. In fact, it rather annoys me that people remain stubbornly ignorant of the fact that evolution is only one of a myriad of plausible explanations for speciation that come out of systems theory.

If I say "does not imply design", it does not equal "implies evolution". It could be any one of a number of equally plausible mechanisms. As I said, evolution versus design is a false dichotomy, mostly due to ignorance and an unhealthy fixation on evolution that exists for historical reasons. No one has, for example, even attempted to refute molecular automata theories, which are actually increasingly popular in many biological circles. I'm not sure what the creationists would do if all the evolutionists switched teams to automata theory.

260 posted on 06/17/2003 9:00:52 PM PDT by tortoise (Dance, little monkey! Dance!)
[ Post Reply | Private Reply | To 257 | View Replies]

