To: tortoise
Thank you oh so very, very much for your reply!!!

My dear, you are describing an intrinsic capacity of all finite state machines.

Yes, I am – though I was only looking at inception. But as you said at post 5086:

And quite frankly, structural constraints make it highly improbable that a non-zero order machine would evolve at that level of abstraction, though it would probably effectively kill the possibility of biological evolution if our genome did function as a high-order FSM. The problem is in your usage of "algorithm".

Thank you so much for the great explanation of Kolmogorov complexity! That is not, however, what I mean when I speak of an algorithm. I'm using the description right out of Penrose's The Emperor's New Mind, where he gives Euclid's algorithm as an example, which I restate in Basic:

Input "First number: "; A
Input "Second number: "; B
Again:
C = A - (Int(A / B) * B)    ' C is the remainder of A divided by B
Print A; " divided by "; B; " gives a remainder of "; C
If C <> 0 Then
    A = B
    B = C
    Goto Again
End If
Print "Euclid algorithm complete! The GCD is "; B
The above algorithm, a step-by-step set of instructions, includes a conditional, symbols, recursive logic, and process. This is the kind of information content being discovered in the genetic code.

For the mathematical measure of any system, what you are calling "algorithms" is information, higher order information to be precise.

Indeed, looking only at the genetic code, it would have the appearance of information content. But there is more to it than that, from all that I have read! The information content is more like a database consisting of self-organized instructions, along with the remembered results of what appears to be the operation of the finite state machine.
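Just to illustrate what I mean (and this is only a cartoon of my own, not anyone's model of the genome!), here is the skeleton of a finite state machine in the same Basic style: a transition rule acting as the instructions, plus a record of the results of each step:

' A cartoon finite state machine with two states (0 and 1) and
' two symbols ("A" and "B"): flip state on "B", stay put on "A",
' and keep a record of the results of each step.
TAPE$ = "ABBA"
STATE = 0
For I = 1 To Len(TAPE$)
    C$ = Mid$(TAPE$, I, 1)
    If C$ = "B" Then STATE = 1 - STATE        ' the transition rule
    Print "read "; C$; " -> state "; STATE    ' the memory of results
Next I

Run on "ABBA" it ends in state 0, having logged every step along the way.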

Because "algorithms" are defined in terms of proper information theoretic "information", they are bound by the same restrictions. Among these is that information is intrinsically context-free. This means that no arbitrary piece of information (or "algorithm" if you prefer) is more important or meaningful than any other. In any case where there is no known context, like the case we are talking about here, it is not possible to recognize "design".

If the genetic code were the only thing being looked at, this would be true.

And here is something else that will cook your noodle even more: It is not possible to perfectly define the "algorithm" of the universe within the universe. The nasty self-modeling inequality of computational information theory doesn't allow it; the best we can do is find a modest approximation to the real solution (the AIT analog of Gödel's Incompleteness theorem).
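For the lurkers: as I understand it, that is Chaitin's version of incompleteness. For any consistent formal theory T there is a constant L (depending on T) such that T cannot prove "the Kolmogorov complexity of s exceeds L" for any particular string s, even though all but finitely many strings really do have complexity above L. Roughly speaking, a system can never certify much more complexity than it itself contains.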

I assert that one cannot perfectly define the big bang or inflationary theory either, but a close approximation is acceptable. Closeness notwithstanding, perhaps under laboratory conditions we would be able to synthesize the FSM I described.

Worse, we can't even rationally assert the probability that it is [designed], because it is not possible for us to have the context to make that assertion.

Ah, but science does not leave the subject alone. IMHO, its first reaction when backed into the metaphysical corner is to offer the anthropic principle or plenitude:

Interview with Nicolo Dallaporta, one of the fathers of modern cosmology

To get away from this evidence, cosmological scenarios are offered that in one way or another revive a form of the old principle of plenitude ("everything that can exist, does exist"). The existence is thus postulated of an infinity of chances, among which "our case" becomes an obvious favorable case (today the most popular form is that of multi-universes). What is your view on this?

It is very possible, but it is not physics. It is a metaphysics in which recourse is made to a chance so enormously limitless that everything that is possible is real. But in this way it becomes a confrontation between metaphysical positions, in which chance collides with purpose. The latter, however, seems much easier to believe! Physics up to now has been based on measurable "data." Beyond this it is a passage into metaphysics. At this point I compare it with another metaphysics. Those who hold these viewpoints (like Stephen Hawking, for instance) should realize that this goes beyond physics; otherwise it is an exaggeration. Physics, pushed beyond what it can measure, becomes ideology.

Space.com

There's a reason some theorists want other universes to exist: They believe it's the only way to explain why our own universe, whose physical laws are just right to allow life, happens to exist. According to the so-called anthropic principle, there are perhaps an infinite number of universes, each with its own set of physical laws. And one of them happens to be ours. That's much easier to believe, say the anthropic advocates, than a single universe "fine-tuned" for our existence.

SpaceDaily.com

Moreover, the Sun's circular orbit about the galactic center is just right; through a combination of factors it manages to keep out of the way of the Galaxy's dangerous spiral arms. Our Solar System is also far enough away from the galactic center to not have to worry about disruptive gravitational forces or too much radiation.

When all of these factors occur together, they create a region of space that Gonzalez calls a "Galactic Habitable Zone." Gonzalez believes every form of life on our planet - from the simplest bacteria to the most complex animal - owes its existence to the balance of these unique conditions.

Because of this, states Gonzalez, "I believe both simple life and complex life are very rare, but complex life, like us, is probably unique in the observable Universe."

Ian’s Cosmic

Carbon Resonance.

A carbon-12 nucleus is made from the near-simultaneous collision of three of these helium-4 nuclei [within stars]. Actually, what happens is that two helium-4 nuclei merge to make beryllium-8, but beryllium-8 is so unstable that it lasts only 10^-17 of a second, and so a third alpha particle (which is what a helium nucleus is) must collide and fuse with the beryllium nucleus within that time. Not only is this triple encounter a relatively unlikely event, but any such unstable beryllium nuclei ought to be smashed apart in the process. Therefore, it should be expected that carbon itself (and consequently all heavier elements) would be rare in the Universe.

However, the efficiencies of nuclear reactions vary as a function of energy, and at certain critical levels a reaction rate can increase sharply - this is called resonance. It just so happens that there is a resonance in the three-helium reaction at the precise thermal energy corresponding to the core of a star...

Carbon can in turn capture a further helium-4 nucleus to become oxygen-16; so if there were another resonance at work here, all the carbon would be quickly processed into oxygen, making carbon very rare again. In fact, it turns out that there is an excited state of oxygen-16 that almost allows a resonant reaction, but it is too low by just 1%. It is shifted just far enough away from the critical energy to leave enough life-giving quantities of carbon untouched.
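To spell out the chain being described: helium-4 + helium-4 ↔ beryllium-8, then beryllium-8 + helium-4 → carbon-12 in an excited state (the 7.65 MeV "Hoyle state," which Fred Hoyle predicted had to exist before it was ever measured), which then settles down to ordinary carbon-12.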

Strong Nuclear Force.

If the strong force had actually been just 13% stronger, all of the free protons would have combined into helium-2 at an early stage of the Big Bang and decayed almost immediately into deuterons. Then pairs of deuterons would readily fuse to become helium-4, leaving no hydrogen in the Universe, and so no water and no hydrocarbons… An increase in the strong force of just 9% would have made the dineutron possible. On the other hand, a decrease of about 31% would be sufficient to make the deuteron unstable, and so remove an essential step in the chain of nucleosynthesis: the Universe would contain nothing but hydrogen, and again life would be impossible.

Supernovae.

... In blasting apart a supernova, [the neutrino's] precise interactivity (or lack of it) is such that it should have enough time to reach the stellar envelope before dumping its energy and momentum, but not so much time that it should escape. This property is partly a function of the weak force, in a complex relationship which must be just as we observe it, to one part in a thousand. If the star's matter were not so effectively redistributed, it would simply collect about the dead star or fall back. It would not be available for new stars to make planets capable of bearing life...

Gravity.

…Suppose gravity were stronger, by a factor of 10^10. This seems quite a lot, but it would still be the weakest force, just 10^-28 of the strength of electromagnetism. The result would be that not as many atoms would be needed in a star to crush its core to make a nuclear furnace. Stars in this high-gravity universe would have the mass of a small planet in our Universe, and would be about 2 km in diameter. They would have far less nuclear fuel as a result, and would use it all up in about one year... Make gravity substantially weaker, on the other hand, and the gas clouds of hydrogen and helium left after the Big Bang would never manage to collapse in an expanding universe, once again leaving no opportunity for life to emerge.

Water.

These and other odd features of water are a consequence of the hydrogen bond - the attraction of the electron-rich oxygen atoms of water molecules for the electron-starved hydrogen atoms of other water molecules. This in turn is a function of the precise properties of the oxygen and hydrogen atoms, which also determines the H-O-H bond angle of 104.5 degrees - only slightly less than the ideal tetrahedral angle of 109.5 degrees. It is (incidentally) the hydrogen bond which holds together the two strands of DNA… It is also the hydrogen bond which is responsible for the crystalline structure of ice, which is in the form of an open lattice: this makes ice less dense than the liquid. As a result, ice floats. If ice was denser than its liquid form (as is the case with most other substances) then it would collect at the bottom of lakes and oceans, and eventually build up until the world was frozen solid. As it is, it forms a thin insulating sheet which prevents evaporation and keeps the waters below warm.

Proton-Neutron Mass Difference.

The difference in mass between a proton and a neutron is only a little greater than the mass of the relatively tiny electron (which has about 1/1836 the mass of a proton). Calculations of relative particle abundances following the first second of the Big Bang, using Boltzmann's statistical theorem, show that neutrons should make up about 10% of the total particle content of the Universe. This is sensitive to the proton:neutron mass ratio, which is (coincidentally) almost 1. A slight deviation from this mass ratio could have led to a neutron abundance of zero, or of 100%, the latter being most catastrophic for the prospects of any life appearing. Even if there were 50% neutrons, all of them would have combined with the remaining protons early in the Big Bang, leading to a Universe with no hydrogen, no stable long-lived stars, and no water. And no life.
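Here is a little Basic sketch of where a number like that comes from. The 1.293 MeV mass difference is measured; the freeze-out temperature of about 0.8 MeV is just my own illustrative assumption:

' Equilibrium neutron share of nucleons at weak-force freeze-out,
' from the Boltzmann factor n/p = Exp(-(mass difference)/temperature)
DELTAM = 1.293              ' neutron-proton mass difference in MeV
T = 0.8                     ' assumed freeze-out temperature in MeV
RATIO = Exp(-DELTAM / T)    ' neutron-to-proton number ratio
FRAC = RATIO / (1 + RATIO)  ' neutrons as a fraction of all nucleons
Print "n/p ratio at freeze-out: "; RATIO
Print "neutron fraction: "; FRAC

That prints a neutron fraction of about 0.17; free-neutron decay in the minutes before nucleosynthesis brings it down toward the roughly 10% figure quoted above.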

Antimatter and the Photon/Proton Ratio.

Why is there matter in the universe, but no appreciable quantities of antimatter? In the colossal energies of the first millionth of a second of the Big Bang, particles and their anti-particles would have been created and destroyed in pairs, equally. Once the temperature fell sufficiently, photons could no longer be readily converted into particle-antiparticle pairs, and so the existing pairs annihilated each other. The present ratio of photons to protons, 'S', is 10^9, which suggests that only one proton (and one electron) per billion escaped annihilation.
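Put another way: if for roughly every billion antiprotons there had been a billion-and-one protons, the billion pairs would have annihilated into photons, leaving on the order of 10^9 photons for each surviving proton, which is just the ratio S we observe.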

As I mentioned to PatrickHenry, one of the things you gotta love about science is that it will eventually accept the evidence and theories, though there may be a lot of kicking and screaming along the way.

5,445 posted on 01/18/2003 8:50:40 PM PST by Alamo-Girl


To: Alamo-Girl
I'm sorry for the correction... but many algorithms can be represented as Turing machines, including the ones you discuss... it's a thesis in computer science (the Church-Turing thesis) that all "computable" problems can, if you're masochistic enough to break them down into their essences, be solved with lambda calculus and Turing machines (which are equivalent in power). A Turing machine does, hypothetically, store and re-use information...
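To make the tape idea concrete, here's a toy machine in the same Basic style as your Euclid example (the increment rule and the tape layout are purely my own illustration, not anyone's model of anything):

' Toy Turing machine: binary increment. Tape symbols are 0 and 1,
' with -1 standing in for the blank symbol.
Dim TAPE(10)
TAPE(1) = -1                 ' blank guard on the left
TAPE(2) = 1: TAPE(3) = 0: TAPE(4) = 1: TAPE(5) = 1    ' input 1011 (= 11)
TAPE(6) = -1                 ' blank guard on the right
HEAD = 5                     ' head starts on the rightmost bit
' Single "carry" state: turn trailing 1s into 0s, moving left
Do While TAPE(HEAD) = 1
    TAPE(HEAD) = 0           ' write 0, carry continues
    HEAD = HEAD - 1          ' move the head left one cell
Loop
TAPE(HEAD) = 1               ' write the carried 1, then halt
For I = 2 To 5: Print TAPE(I);: Next I    ' prints 1 1 0 0 (= 12)

The transition table is trivial here, but the tape is doing exactly what you describe: storing intermediate results and re-using them.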

However, that does not negate your excellent argument about how principles in math/computer science/logic can apply to natural laws. And, certainly, it can be an argument in favor of intelligent design. :-)
5,447 posted on 01/18/2003 9:25:46 PM PST by Nataku X
