Posted on 08/08/2002 9:06:23 AM PDT by Momaw Nadon
SYDNEY (Reuters) - A team of Australian scientists has proposed that the speed of light may not be a constant, a revolutionary idea that could unseat one of the most cherished laws of modern physics -- Einstein's theory of relativity.
The team, led by theoretical physicist Paul Davies of Sydney's Macquarie University, say it is possible that the speed of light has slowed over billions of years.
If so, physicists will have to rethink many of their basic ideas about the laws of the universe.
"That means giving up the theory of relativity and E=mc squared and all that sort of stuff," Davies told Reuters.
"But of course it doesn't mean we just throw the books in the bin, because it's in the nature of scientific revolution that the old theories become incorporated in the new ones."
Davies and astrophysicists Tamara Davis and Charles Lineweaver from the University of New South Wales published the proposal in the August 8 edition of the scientific journal Nature.
The suggestion that the speed of light can change is based on data collected by UNSW astronomer John Webb, who posed a conundrum when he found that light from a distant quasar, a star-like object, had absorbed the wrong type of photons from interstellar clouds on its 12 billion year journey to earth.
Davies said fundamentally Webb's observations meant that the structure of atoms emitting quasar light was slightly but ever so significantly different to the structure of atoms in humans.
The discrepancy could only be explained if either the electron charge, or the speed of light, had changed.
IN TROUBLE EITHER WAY
"But two of the cherished laws of the universe are the law that electron charge shall not change and that the speed of light shall not change, so whichever way you look at it we're in trouble," Davies said.
To establish which of the two constants might not be that constant after all, Davies' team resorted to the study of black holes, mysterious astronomical bodies that suck in stars and other galactic features.
They also applied another dogma of physics, the second law of thermodynamics, which Davies summarizes as "you can't get something for nothing."
After considering that a change in the electron charge over time would violate the sacrosanct second law of thermodynamics, they concluded that the only option was to challenge the constancy of the speed of light.
More study of quasar light is needed in order to validate Webb's observations, and to back up the proposal that light speed may vary, a theory Davies stresses represents only the first chink in the armor of the theory of relativity.
In the meantime, the implications are as unclear as the unexplored depths of the universe themselves.
"When one of the cornerstones of physics collapses, it's not obvious what you hang onto and what you discard," Davies said.
"If what we're seeing is the beginnings of a paradigm shift in physics like what happened 100 years ago with the theory of relativity and quantum theory, it is very hard to know what sort of reasoning to bring to bear."
It could be that the possible change in light speed will only matter in the study of the large scale structure of the universe, its origins and evolution.
For example, varying light speed could explain why two distant and causally unconnected parts of the universe can be so similar even if, according to conventional thought, there has not been enough time for light or other forces to pass between them.
It may only matter when scientists are studying effects over billions of years or billions of light years.
Or there may be startling implications that could change not only the way cosmologists view the universe but also its potential for human exploitation.
"For example there's a cherished law that says nothing can go faster than light and that follows from the theory of relativity," Davies said. The accepted speed of light is 300,000 km (186,300 miles) per second.
"Maybe it's possible to get around that restriction, in which case it would enthrall Star Trek fans because at the moment even at the speed of light it would take 100,000 years to cross the galaxy. It's a bit of a bore really and if the speed of light limit could go, then who knows? All bets are off," Davies said.
Assuming a fixed speed of light, wavelength and frequency are related:
frequency = c / (wavelength)
Now, it's true that changing c, say by slowing it down, will change the frequency without changing the wavelength. But that leaves your light with a modified energy. The energy of a photon is
Energy = (Planck's constant) * c / (wavelength)
or
Energy = (Planck's constant) * frequency
Everything I said about the problems Setterfield gets into still applies, although he plays with Planck's constant (no longer a constant in CDK), mass, the gravitational force, and several other "constants." It's all in vain, so far as I can tell.
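To put rough numbers on that, here is a minimal Python sketch (the 500 nm wavelength is just an illustrative value, not anything from the posts above):

# Photon frequency and energy for a given wavelength, using today's constants.
c = 2.998e8          # speed of light, m/s
h = 6.626e-34        # Planck's constant, J*s
wavelength = 500e-9  # green light, m (illustrative value)

frequency = c / wavelength      # about 6.0e14 Hz
energy = h * c / wavelength     # same as h * frequency, about 4.0e-19 J

# Halving c with the wavelength held fixed halves both the frequency
# and the per-photon energy.
c_slow = c / 2
print(frequency, energy)
print(c_slow / wavelength, h * c_slow / wavelength)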
It would appear, nonetheless, that changing the speed of light could have an adverse impact on optics, in that the relative index of refraction would be changed. However, according to Snell's law the relative index is simply the ratio of the velocity of the incident waveform to the velocity of the waveform transmitted through the different medium (part of the incident waveform being reflected), so a uniform change in the speed of light would leave it unchanged.
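A minimal Python sketch of that point, assuming an illustrative index of 1.5 and a 30-degree incidence angle (both made-up values): scaling every wave speed by the same factor leaves the refraction angle untouched.

import math

# Snell's law written with wave speeds: sin(t1)/sin(t2) = v1/v2.
# Scaling both speeds by the same factor (a slower c) cancels out.
def refraction_angle(theta1_deg, v1, v2):
    return math.degrees(math.asin(math.sin(math.radians(theta1_deg)) * v2 / v1))

c = 2.998e8
v_glass = c / 1.5                                  # illustrative index of 1.5
print(refraction_angle(30.0, c, v_glass))          # about 19.5 degrees with today's c
print(refraction_angle(30.0, c / 2, v_glass / 2))  # everything slowed: same angle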
You're still on the wrong track. It isn't the index of refraction, it's the energy of the photons and the games CDK has to play to pretend that everything still comes out the same. It turns out that the per-photon changes are a wash, but the sun is cooking its fuel off like mad. That means that there are vastly more photons. Why doesn't Adam cook? They get red-shifted--this is standard CDK doctrine; it means they're very long-wave--because there's far, far less mass in the solar atoms fusing to produce them. However, if they're red-shifted, then Adam is blind. Etc. etc. as I explained earlier.
Even radio astronomers don't deal with frequencies; they deal with wavelengths, e.g., 21 cm. Of course their electromagnetic oscillators might need to function at some frequency, but what empirical evidence is there that these frequencies are constant?
I find it interesting that you argue against Setterfield's idea just as fervently as the Creationists do. I believe this is because Setterfield's ideas are contrary to both camps' presuppositions.
Adam doesn't cook because there's less energy per photon (the photons have less apparent mass). I don't think you've looked at either Setterfield's postulations or Montgomery's statistical analysis, because either you can't comprehend the mathematics or they are anathema to your preconceptions. That doesn't strike me as very scientific. The most astounding scientific discoveries were made when experimenters made observations that went against the very grain of common knowledge, and obtained results that they did not want to see.
So Adam is blind.
Will you explain how Setterfield destroys the following principles of physics:
Law of reflection : (theta)i = (theta)r
Law of refraction : n = c/v
Snell's law : n1 sin (theta)1 = n2 sin (theta)2
Critical angle : sin (theta)c = n2/n1, n1 > n2 (internal reflection)
Phase difference : Delta (theta) = ((2 pi)/lambda) * (path difference)
Young's double slit : y = (n * lambda * L)/d, n = 0, 1, 2...
Single slit : y = (m * lambda * L)/w, m = 1, 2, 3...
Diffraction grating : d sin (theta) = n (lambda), n=0,1,2...
Bragg's law (X-ray diffraction) : 2d sin (theta) = n lambda, n= 1,2,3...
Brewster's angle : tan (theta)p = n
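For what it's worth, a minimal Python check of a few of the relations above with made-up values (the indices, crystal spacing, and X-ray wavelength are assumptions for illustration); the refraction-related ones depend only on ratios of indices or speeds:

import math

n1, n2 = 1.5, 1.0          # illustrative indices (glass into air)

critical = math.degrees(math.asin(n2 / n1))   # total internal reflection, about 41.8 deg
brewster = math.degrees(math.atan(n2 / n1))   # Brewster's angle, about 33.7 deg

# Bragg's law: 2 * d * sin(theta) = n * lambda  (first order, n = 1)
d = 2.8e-10                # illustrative crystal plane spacing, m
lam = 1.5e-10              # illustrative X-ray wavelength, m
bragg = math.degrees(math.asin(1 * lam / (2 * d)))   # about 15.5 deg

print(critical, brewster, bragg)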
VadeRetro, your task is quite monumental indeed. And I do understand Occam's Razor. What needs to be disproved with empirical evidence is that Setterfield is full of hot air. What will not float the balloon is your quoting pamphlets that say it can't be done. Why don't you just prove it, mathematically....
The optical adjustments needed for an eye are truly amazing. The refractive components that act to focus images on the retina are the cornea (n approx. 1.351), the aqueous humor (n approx. 1.337), the crystalline lens (n approx. 1.437), and the vitreous body (n approx. 1.337). The principal bending of incident light rays occurs at the cornea, whose radius of curvature is less than that of the eyeball. Nerve signals cause the ciliary muscle to change the radius of curvature, and thus the focal length, of the crystalline lens so that the images of objects at different distances are sharply focused on the retina. When the normal eye views distant objects, the ciliary muscle is relaxed and the crystalline lens has its thinnest shape (approx. 20 diopters). When viewing near objects, the lens becomes thicker, and the radius of curvature and focal distance of the eyeball are decreased. The accommodation of the radial curvature and focal length is attributed to evolution. Fossils are explicitly clear that evolution is true. Anybody that is contrary to this is a Creationist and is absolutely whacked.
"At any given moment there is an orthodoxy, a body of ideas of whish it is assumed that all right-thinking people will accept without question. It is not exactly forbidden to say this, that, or the other, but it is "not done" to say it...Anyone who challenges the prevailing orthodoxy finds himelf silenced with surprising effectiveness. A genuinely unfashioable opinion is almost never given a fair hearing, either in the popular press or in the high-brow periodicals."
--George Orwell, 1945, Introduction to Animal Farm.
You shouldn't worry about it VadeRetro, since you'll only meet up with a figment of your imagination, right? Its not real, and what ever happens is just hysteria...
Good luck VadeRetro, you'll need it...
I thank my God always on your behalf, for the grace of God which is given you by Jesus Christ.
That in every thing ye are enriched by him, in all utterance, and in all knowledge.
Even as the testimony of Christ was confirmed in you.
So that ye come behind in no gift; waiting for the coming of our Lord Jesus Christ:
Who shall also confirm you unto the end, that ye may be blameless in the day of our Lord Jesus Christ.
I Cor 1:3-8. I leave you with this: 1 John 1:23, 2 John 1:5-6, 1 Pt 1:22, 1 Pt 4:8, Gal 5:13-14, 1 John 4:7-11; 19-21, Phil 1:9, Phil 2:3-8, John 3:16-18. I'm not going on, because VadeRetro is not listening...
That's interesting. Let's look at Tom Van Flandern. He has a Ph.D. from Yale in astronomy (specializing in celestial mechanics) and for 20 years (1963 through 1983) was Research Astronomer and Chief of the Celestial Mechanics Branch of the U.S. Naval Observatory in Washington, D.C. He released the results of tests that showed that the rate of the ticking of the atomic clock was measurably slowing down with respect to the dynamical clock. (Mr. Van Flandern was subsequently terminated from his position.)
I'd question why.
It's funny to think that I'd think you are on the wrong track. That's just me though...
It's not the lenses. It's the ability of a photon of a given wavelength to resolve an object. Now maybe Setterfield plays with that ratio, which IIRC is about 2/3. (A photon of a given wavelength can "see" an object about 2/3 of its wavelength or greater, but anything shorter is utterly invisible.)
Consider the door on your microwave. It has a metal screen to trap the harmful levels of microwave energy inside. Nevertheless, you can see inside through a mesh of holes. Why don't the same holes leak microwaves?
Those "micro"-waves are far too long to be able to resolve the holes. At "micro"-wavelengths, the metal screen looks like a solid sheet. However, the relatively tiny visible light wavelengths can easily resolve much tinier details than those holes.
For the very long-wave radio submarines employ, a wire antenna a mile or more long must be uncoiled. (The bandwidth is very low (no voice--teletype only), but the advantage is that LF propagates through water so you don't have to surface. Also, no one without a mile-long antenna will be listening in on you.) That's the kind of sensor Adam would need in his eyes to see the very red-shifted solar output that I think Setterfield has cornered himself into predicting.
At least, I can't get a good answer on why that's not so.
Try to understand that everyone who measures c now gets the same answer, within the resolving power of the equipment. If c is constant today, then frequency and wavelength have a perfect relationship. Every wavelength has its frequency. Every frequency has its wavelength.
Things shift a little when you enter a medium other than a vacuum and light slows down, but the waveform also bunches up. Ignoring scattering and absorption losses, the per-photon energy E = hc/wavelength is preserved because c, the overall speed of the waveform, slows down but the wavelength shortens in precise proportion.
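Worked through with one concrete number (a minimal Python sketch; the index of 1.33, roughly water, and the 500 nm wavelength are assumed illustrative values):

c = 2.998e8
h = 6.626e-34
n = 1.33                              # illustrative index (roughly water)

wavelength_vacuum = 500e-9
freq = c / wavelength_vacuum          # set by the source, about 6.0e14 Hz

v_medium = c / n                            # the waveform slows down...
wavelength_medium = wavelength_vacuum / n   # ...and bunches up by the same factor

print(v_medium / wavelength_medium, freq)   # same frequency either way
print(h * freq)                             # per-photon energy unchanged, about 4.0e-19 J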
It's an either-or (or maybe both). Setterfield redshifts the photons to keep Adam from cooking. That's far less energy per photon but he has the sun cooking off like mad for a while there so there are far more of them and the energy flux is the same.
But Adam's eyes aren't a mile wide. How does he see the long waves? If, as I heard in one instance, the light isn't all that redshifted, why doesn't the furious reaction rate in the sun 6000 years ago cook Adam? (Setterfield explains the apparent radiometric ages of the sun and earth by the difference in reaction rates then and now.)
I've seen nothing but tap-dancing.
I'm listening but I'm sure as heck not looking up all that stuff. Thank you for making clear the source of your objections to evolution (and geology and cosmology and probably nuclear chemistry).
I'd question why.
I wouldn't assume anything without credible information. Most people aren't going to assume that, if an atomic clock and a twenty-dollar wind-up Benrus drift apart, the atomic clock is at fault.
One thing I'm not doing is quoting pamphlets. Most people have attacked Setterfield's abuse of historical light-measurement statistics and the incredibly ad hoc nature of his selection of what varies and how much in step with changes in c. These critics, most of what's out there, assume that Setterfield does achieve some kind of transparency across values of c. (Or assume that it doesn't matter, since his methods are so bogus.)
So I've tried to model it out for myself, including corresponding with Lambert Dolphin and Helen Fryman. I can't see how he achieves his claims that Adam is OK in the Garden of Eden 6 days after creation with c 11 million times the modern value, the sun and earth reacting like mad, Adam's body mass some tiny fraction of what it would be now and the force of gravity greatly increased.
Setterfield's main refuge is that he's varied so many things at once it's impossible to model how anything works. That's about as useful as teats on a boar hog, even if it's thus unfalsifiable. Unfalsifiability, even if achieved, also makes the claims advanced for CDK by various parties unverifiable.
Personally, I cannot see how the claims hold up and don't get good answers to my objections.
The questions are "How does he uphold them?" and "Does his theory do anything at all even the slightest bit better than the less-complicated understanding we have now?"
Perhaps after consulting pamphlets in my possession, and correlating them with physics texts and my knowledge of things contained therein, I might be able to understand your aversion to the issues raised by Setterfield.
I think CDK got into trouble by assuming that the same thing--call it a thickening of the vacuum--that slows down light also adds mass to the universe and slows down nuclear reactions. (My characterizations of CDK come mainly from Lambert Dolphin's "Implications of a Non-Constant Velocity of Light", as it's the most readable such paper around.) It looked good because if you vary reaction rates you can explain why the sun and earth look old and why radiometric dating gives answers that read old. You see, the sun was cooking off much faster 6000 years ago and so were the nuclear ores of the earth. Assumptions that don't take these far higher reaction rates into account will of course generate the ages we see for our solar system (about 4.5 billion years).
But the problem is that Setterfield has to put life, including humans, back into that crazy time with the high light speed and the high nuclear reaction rates. If you speed up a photon, it gains energy in proportion. You can say it's because the frequency goes up (E=hf) or because the speed of light itself is higher (E=hc/lambda) but it's more energy. So Setterfield in an ad hoc manner simply lowers Planck's Constant "h" and the differences cancel.
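In rough terms, a minimal Python sketch of that cancellation as I read it (the clean factor-of-k scalings are a simplification for illustration, not Setterfield's exact functions):

k = 1.1e7                  # claimed early value of c relative to today's
h_now, c_now = 6.626e-34, 2.998e8
wavelength = 500e-9        # a visible-light photon, m (illustrative)

c_then = c_now * k         # light 11 million times faster...
h_then = h_now / k         # ...with Planck's constant lowered in step

E_now = h_now * c_now / wavelength
E_then = h_then * c_then / wavelength
print(E_now, E_then)       # identical: the per-photon differences cancel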
Except for those crazy reaction rates. We have to make a 6000 year old sun and earth look like they've been cooking for 4.5 billion years, and we have to front-load most of the work because the decline when it kicks in is inverse exponential. Setterfield starts with things moving along about 11 million times faster than now.
So the individual photons from the sun, when we left them, were blue-shifted 11 million times by their higher speed, but the excess energy was cancelled by the finely-tuned changes in Planck's constant. Their wavelength is the same as now so that any human eye might receive them.
But there are 11 million times too many of them. Imagine the sun putting out 11 million times more photons than now, same wavelength. We'd all fry instantly. So as we have it right now it still doesn't work.
That's where Setterfield invokes another ad hoc change. It seems the universe had less mass. That means that the photons are being generated by hydrogen nuclei (protons) lighter than they are now, and thus have less energy, in just the proportion needed to cancel the effect of having so many more of them being given off. They're red-shifted. From the Dolphin link:
Barry now assumes that energy flux from our sun or from distant stars is constant over time. (Energy flux is due to atomic processes and is the amount of energy radiated from the surface of a star per square centimeter per second). Setterfield also now proposes that when the velocity of light was (say) ten times higher than now, then 10 times as many photons per second (in dynamical time) were emitted from each square centimeter of surface. Each photon would however carry only one tenth as much energy, conserving the total energy flux. Setterfield says, "This approach has a dramatic effect. When light-speed c was 10 times higher, a star would emit 10 photons in one second compared with one now. This ten-photon stream then comprised part of a light beam of photons spaced 1/10th of a second apart. In transit, that light beam progressively slowed until it arrived at the earth with today's c value. This speed is only 1/10th of its original speed, so that the 10 photons arrive at one second intervals. The source appears to emit photons at today's rate of 1 per second. However, the photon's wavelength is red-shifted, since the energy per photon was lower when it was emitted."
The above doesn't work for me. 11 million times more photons per second from the sun, but they're red-shifted enough that the energy is the same as now. That's a lot of red-shifting. Adam's eyes can't be long enough to resolve those photons. If they're not that red-shifted after all, he cooks. If they are, he's blind.
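To put rough numbers on that objection, a minimal Python sketch of the bookkeeping in the quote above (the 500 nm starting wavelength and the use of today's constants to convert energy to wavelength are assumptions for illustration):

k = 1.1e7                        # claimed early value of c relative to today's

photons_per_second = 1.0 * k     # k times as many photons per (dynamical) second
energy_per_photon = 1.0 / k      # each with 1/k the energy, per the quoted scheme
print(photons_per_second * energy_per_photon)   # 1.0 -- flux matches today's, as claimed

# Read with today's constants, 1/k the energy means a wavelength k times longer.
wavelength_now = 500e-9                   # visible green light, m (illustrative)
wavelength_then = wavelength_now * k      # about 5.5 m -- radio waves, not visible light
print(wavelength_then)

# Eyes a couple of centimeters across can't resolve meters-long waves, so either
# the photons aren't really red-shifted that much (and the flux cooks Adam),
# or they are (and Adam is blind).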
I spell out other problems in the vanity paper I linked above but the photon/energy/wavelength juggling act is the main red flag. I never got a good answer from those guys even when I emailed them about it.
Thanks VannRox, interesting topic.
Whoops. Thanks go to Momaw Nadon for this interesting topic. And I'm not pingin' either of 'em.