Inconstant Speed of Light May Debunk Einstein
Reuters via Yahoo! ^ | Wed Aug 7, 2:07 PM ET | By Michael Christie

Posted on 08/08/2002 9:06:23 AM PDT by Momaw Nadon

SYDNEY (Reuters) - A team of Australian scientists has proposed that the speed of light may not be a constant, a revolutionary idea that could unseat one of the most cherished laws of modern physics -- Einstein's theory of relativity.

The team, led by theoretical physicist Paul Davies of Sydney's Macquarie University, say it is possible that the speed of light has slowed over billions of years.

If so, physicists will have to rethink many of their basic ideas about the laws of the universe.

"That means giving up the theory of relativity and E=mc squared and all that sort of stuff," Davies told Reuters.

"But of course it doesn't mean we just throw the books in the bin, because it's in the nature of scientific revolution that the old theories become incorporated in the new ones."

Davies and astrophysicists Tamara Davis and Charles Lineweaver of the University of New South Wales published the proposal in the August 8 edition of the scientific journal Nature.

The suggestion that the speed of light can change is based on data collected by UNSW astronomer John Webb, who posed a conundrum when he found that light from a distant quasar, a star-like object, had been absorbed at the wrong wavelengths by interstellar clouds on its 12-billion-year journey to Earth.

Davies said that, fundamentally, Webb's observations meant the structure of the atoms emitting the quasar light was slightly but significantly different from the structure of atoms in humans.

The discrepancy could only be explained if either the electron charge, or the speed of light, had changed.

IN TROUBLE EITHER WAY

"But two of the cherished laws of the universe are the law that electron charge shall not change and that the speed of light shall not change, so whichever way you look at it we're in trouble," Davies said.

To establish which of the two constants might not be that constant after all, Davies' team resorted to the study of black holes, mysterious astronomical bodies that suck in stars and other galactic features.

They also applied another dogma of physics, the second law of thermodynamics, which Davies summarizes as "you can't get something for nothing."

After considering that a change in the electron charge over time would violate the sacrosanct second law of thermodynamics, they concluded that the only option was to challenge the constancy of the speed of light.

More study of quasar light is needed in order to validate Webb's observations, and to back up the proposal that light speed may vary, a theory Davies stresses represents only the first chink in the armor of the theory of relativity.

In the meantime, the implications are as unclear as the unexplored depths of the universe themselves.

"When one of the cornerstones of physics collapses, it's not obvious what you hang onto and what you discard," Davies said.

"If what we're seeing is the beginnings of a paradigm shift in physics like what happened 100 years ago with the theory of relativity and quantum theory, it is very hard to know what sort of reasoning to bring to bear."

It could be that the possible change in light speed will only matter in the study of the large scale structure of the universe, its origins and evolution.

For example, varying light speed could explain why two distant and causally unconnected parts of the universe can be so similar even if, according to conventional thought, there has not been enough time for light or other forces to pass between them.

It may only matter when scientists are studying effects over billions of years or billions of light years.

Or there may be startling implications that could change not only the way cosmologists view the universe but also its potential for human exploitation.

"For example there's a cherished law that says nothing can go faster than light and that follows from the theory of relativity," Davies said. The accepted speed of light is 300,000 km (186,300 miles) per second.

"Maybe it's possible to get around that restriction, in which case it would enthrall Star Trek fans because at the moment even at the speed of light it would take 100,000 years to cross the galaxy. It's a bit of a bore really and if the speed of light limit could go, then who knows? All bets are off," Davies said.


TOPICS: Culture/Society; Front Page News; Miscellaneous; News/Current Events; Philosophy; Technical; Unclassified
KEYWORDS: einstein; light; physics; relativity; speed; universe
To: balrog666
Why request other people's colors when you could request your own?
181 posted on 08/09/2002 10:06:17 AM PDT by medved
[ Post Reply | Private Reply | To 175 | View Replies]

To: medved
Why request other people's colors when you could request your own?

It's just that I'm already used to skipping anything in blue. But any color other than black is fine with me.

182 posted on 08/09/2002 10:21:36 AM PDT by balrog666
[ Post Reply | Private Reply | To 181 | View Replies]

To: medved
"The notions of a "big bang" and an expanding universe are total BS, based on nothing more than a fundamental misinterpretation of redshift data. Those ideas have been coercively disproven.

Sorry, but I have no choice but to discount any proofs that are constructed under duress. They tend to be somewhat biased, if you know what I mean. By the way, who exactly was doing the coercing?

183 posted on 08/09/2002 10:23:21 AM PDT by BlackRazor
[ Post Reply | Private Reply | To 179 | View Replies]

To: BlackRazor
The term 'coercive' is used by mathematicians and physicists to mean an airtight proof or demonstration of something. The man doing the 'coercing' in this case, at least originally, was Halton Arp.
184 posted on 08/09/2002 10:29:28 AM PDT by medved
[ Post Reply | Private Reply | To 183 | View Replies]

To: medved
"The term 'coercive' is used by mathematicians and physicists to mean an airtight proof or demonstration of something. The man doing the 'coercing' in this csae, at least originally, was Halton Arp.

Ah... Apologies then for the flippant post, and for my own ignorance. And thank you for expanding my vocabulary!

185 posted on 08/09/2002 10:37:12 AM PDT by BlackRazor
[ Post Reply | Private Reply | To 184 | View Replies]

To: medved
The term 'coercive' is used by mathematicians and physicists to mean an airtight proof or demonstration of something.

Actually, we don't use the term that way.

186 posted on 08/09/2002 11:58:47 AM PDT by Doctor Stochastic
[ Post Reply | Private Reply | To 184 | View Replies]

To: Doctor Stochastic
A non-coercive placemarker.
187 posted on 08/09/2002 12:36:33 PM PDT by Junior
[ Post Reply | Private Reply | To 186 | View Replies]

To: Doctor Stochastic
Actually, we don't use the term that way.

I've never heard of it either. I figured it was new.

188 posted on 08/09/2002 12:36:51 PM PDT by balrog666
[ Post Reply | Private Reply | To 186 | View Replies]

To: balrog666
I think the number of users is approximately 1**720.
189 posted on 08/09/2002 12:41:05 PM PDT by Doctor Stochastic
[ Post Reply | Private Reply | To 188 | View Replies]

To: Physicist
An index of refraction doesn't just slow the speed of light; it slows it in a frequency-dependent way. If that dispersion ...

The three dimensions of lens design:

Indexes of refraction of the lens materials
Dispersions of the glasses
Curvatures of the lens surfaces and air spaces.

These variables are used together to attempt to reduce blurriness by bringing all rays of all colors from an object point to the same focal point. Then you also have axial rays and off-axis and non-parallel rays to deal with. When you have a good solution you can then proceed to ruin everything by making it a zoom lens.
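
To make the frequency dependence concrete, here is a minimal sketch using Cauchy's empirical approximation n(lambda) = A + B/lambda^2; the A and B values are commonly quoted figures for BK7 crown glass and are illustrative assumptions, not design data.

```python
# Illustrative sketch: frequency-dependent slowing of light in glass,
# via Cauchy's empirical approximation n(lambda) = A + B / lambda^2.
# A and B below are commonly quoted values for BK7 crown glass;
# treat them as assumptions for illustration, not design data.
C_KM_S = 299_792.458

def refractive_index(wavelength_um, A=1.5046, B=0.00420):
    """Shorter (bluer) wavelengths see a higher index, hence more bending."""
    return A + B / wavelength_um ** 2

for wavelength_um in (0.486, 0.589, 0.656):  # blue, yellow, red lines (microns)
    n = refractive_index(wavelength_um)
    print(f"{wavelength_um} um: n = {n:.4f}, phase speed = {C_KM_S / n:,.0f} km/s")
```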

190 posted on 08/09/2002 12:53:54 PM PDT by RightWhale
[ Post Reply | Private Reply | To 180 | View Replies]

To: equus
If the physics of the early universe was so different from that operating in our neighborhood today, could the physics of the universe at any time in the future, and, for that matter, the physics of other parts of the universe today (like, say, black holes), also be fundamentally different?
191 posted on 08/09/2002 12:59:36 PM PDT by aristeides
[ Post Reply | Private Reply | To 164 | View Replies]

To: Doctor Stochastic
Actually, we don't use the term that way.

I've heard it used that way by serious mathematicians. I don't really know anything about your credentials.

192 posted on 08/09/2002 2:14:31 PM PDT by medved
[ Post Reply | Private Reply | To 186 | View Replies]

To: VadeRetro
Yeah, that's all fine and dandy. You tell me where Dolphin and Montgomery are incorrect in their review of Setterfield's statistics in Galilean Electrodynamics. It's so interesting that you can dismiss it, and yet there has been no credible, credentialed refutation of their evaluation to date.

Moreover, I suggest that you review Montgomery's own investigation of the existing data on the subject.

Now it is quite clear to anyone with rudimentary physics knowledge that the product of wavelength and frequency is c. If c is decreasing, then either or both of the multiplicands must be changing as well. However, if one uses data obtained from Cepheid variables, no change in wavelength can be observed. From this one can infer that the frequency alone must be changing. I make that assertion because the amplitude of the waveform is immaterial (with regard to the speed of light, i.e., c) to the established classical physics of electromagnetic radiation as we know it today.
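
A minimal sketch of that inference, assuming a fixed observed wavelength and a purely hypothetical 1% change in c:

```python
# Illustrative only: with c = f * lambda and the wavelength held fixed
# (as the Cepheid data are said to show), any change in c must appear
# entirely in the frequency. The 1% figure is made up for illustration.
C_NOW = 299_792_458.0      # m/s
WAVELENGTH = 550e-9        # m, an arbitrary fixed visible wavelength

f_now = C_NOW / WAVELENGTH
c_past = 1.01 * C_NOW      # hypothetical: c one percent higher in the past
f_past = c_past / WAVELENGTH

print(f_past / f_now)      # -> 1.01: frequency scales one-for-one with c
```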

Present evidence where this is not happening (the frequency is changing), and at the same time explain the quantized red-shift phenomenon.

By the way, C.L. Strong must be an actor or singer, quoted on CNN to make such a statement in Scientific American, don't you think? /tongue-in-cheek

193 posted on 08/09/2002 3:17:05 PM PDT by raygun
[ Post Reply | Private Reply | To 156 | View Replies]

To: Doctor Stochastic
No doubt by a few orders of magnitude (not just factors), right?
194 posted on 08/09/2002 3:31:27 PM PDT by raygun
[ Post Reply | Private Reply | To 171 | View Replies]

To: medved
I've heard it used that way by serious mathematicians.

Then you've heard it wrong or translated it wrong.

195 posted on 08/09/2002 3:36:11 PM PDT by balrog666
[ Post Reply | Private Reply | To 192 | View Replies]

To: VadeRetro
There's one more burr under my saddle about your response. Obviously foremost is the issue of the same person doing the same experiment to determine c at a later date, and the value is lower than before. Not much but lower.

If one plots a best-fit curve of the data (using whatever regression model one desires), along with upper and lower limits of the margin for error, all three curves are decreasing. The funnel between the error limits narrows over time, but nevertheless all three curves are decreasing.
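
As a sketch of the kind of fit being described, here is a weighted trend line through made-up historical c values with shrinking error bars; the data are invented purely to show the technique, not to settle the dispute.

```python
# A minimal sketch (with made-up data) of the fit described above:
# a weighted least-squares trend line through historical c measurements.
import numpy as np

years = np.array([1750.0, 1800.0, 1850.0, 1900.0, 1950.0])
c_km_s = np.array([304_000.0, 302_000.0, 300_500.0, 299_900.0, 299_792.5])  # invented
sigma = np.array([5000.0, 2000.0, 500.0, 100.0, 1.0])  # shrinking error bars

# Weighted linear fit: the more precise (recent) points dominate.
slope, intercept = np.polyfit(years, c_km_s, deg=1, w=1.0 / sigma)
print(f"fitted slope: {slope:.3f} km/s per year")

# Whether an apparent decline is real or an artifact of shrinking
# error bars is exactly the point in dispute in this thread.
```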

Furthermore, the problem with current methods of light-speed measurement (mainly laser) is that both wavelength [W] and frequency [F] are measured to give c, as the equation reads [c = FW]. If one is intimately familiar with Setterfield's conjecture, one should be aware that, within a quantum interval, wavelengths are invariant with any change in c. This means that it is the frequency of light that varies lock-step with c. Unfortunately, atomic frequencies also vary lock-step with c, so that when laser frequencies are measured with atomic clocks no difference will be found.

The way out of this is to use some experimental method where this problem can be avoided. Ron Samec has suggested that the Roemer method could be used. This method uses eclipse times of Jupiter's inner satellite Io. Indeed, it has been investigated by Eugene Chaffin. Although many things can be said about his investigation (and they may be appropriate at a later date), there are a couple of outstanding problems that confront all investigators using that method. Chaffin pointed out that perturbations by Saturn, and resonance between Io, Europa, and Ganymede, are definitely affecting the result, and a large number of parameters therefore need investigation. Even after that has been done, there remains inherent within the observations themselves a standard deviation ranging from about 30 to 40 seconds. This means the results will have an intrinsic error of up to 24,000 km/s. Upon reflection, all that can be said is that this method is too inaccurate to give anything more than a ball-park figure for c, which Roemer, to his credit, did despite the opposition. It therefore seems unwise to dismiss the cDK proposition on the basis of one of the least precise methods of c measurement, as the notice Ron brought to our attention proposes. This leaves a variety of other methods to investigate.
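
As a back-of-envelope check of that 24,000 km/s figure (my own reconstruction, not Chaffin's calculation): the Roemer signal is the roughly 499-second light-travel time across one astronomical unit, and a fractional timing error propagates directly into a fractional error in the inferred c.

```python
# Back-of-envelope reconstruction of the ~24,000 km/s figure, not
# Chaffin's actual calculation. The Roemer signal is the ~499 s
# light-travel time across 1 AU; a fractional timing error maps
# directly onto a fractional error in the inferred c.
C_KM_S = 299_792.458
LIGHT_TIME_1AU_S = 499.0   # seconds for light to cross one astronomical unit

for sigma_t in (30.0, 40.0):
    dc = C_KM_S * (sigma_t / LIGHT_TIME_1AU_S)
    print(f"{sigma_t:.0f} s scatter -> ~{dc:,.0f} km/s uncertainty in c")
# 30 s -> ~18,000 km/s; 40 s -> ~24,000 km/s, matching the figure above.
```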

However, that is not the only way of determining what is happening to c. There are a number of other physical constants which are c-dependent and which overcome the problem with the use of atomic clocks. One of these is the quantized Hall resistance, now called the von Klitzing constant. Another might be the gyromagnetic ratio. A further method is to compare dynamical intervals (for example, using lunar radar or laser ranging) with atomic intervals.
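
For reference, the von Klitzing constant mentioned above is R_K = h/e^2; here is a minimal check using the modern (exact, 2019 SI) values of h and e. Whether and how it would track a varying c is part of the conjecture under discussion, not standard physics.

```python
# Minimal check of the von Klitzing constant R_K = h / e^2, using the
# exact 2019 SI values. Its use as a probe of a varying c is the
# conjecture under discussion, not standard physics.
H = 6.62607015e-34   # Planck constant, J*s
E = 1.602176634e-19  # elementary charge, C

print(f"R_K = {H / E**2:,.3f} ohms")  # -> ~25,812.807 ohms
```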

196 posted on 08/09/2002 3:48:26 PM PDT by raygun
[ Post Reply | Private Reply | To 156 | View Replies]

To: medved
Argumentum ad hominem. Stick to logic and reason; personal considerations are immaterial in forensics.

If your opponent has an argument, counter it; if your opponent is using observation in lieu of argument, that is rhetoric and needs no rebuttal.

The first party in a debate to resort to ad hominem is, by default, signaling capitulation at best and total defeat at worst.

Credentials matter not in the least, unless the person can be argued into a corner and needs to get out of the kitchen because the big boys are playing hardball. Get my drift? And no, before you even inquire, rest assured that I'm not a forensics captain either, and that I could very well be wrong (I'll admit it when I am).

197 posted on 08/09/2002 4:30:24 PM PDT by raygun
[ Post Reply | Private Reply | To 192 | View Replies]

To: Junior
I'm finally up to speed on this thread placemarker.
198 posted on 08/09/2002 4:38:55 PM PDT by PatrickHenry
[ Post Reply | Private Reply | To 187 | View Replies]

To: raygun
Here are the actual data, a "master set" and a selected "best measurements" set. My question is "Who came up with those error bars for each method?" (Some of the earliest ones look suspiciously small to me.) Supposedly, those tables started all this.

Now it is quite clear to anyone with rudimentary physics knowledge that the product of wavelength and frequency is c. If c is decreasing, then either or both of the multiplicands must be changing as well. However, if one uses data obtained from Cepheid variables, no change in wavelength can be observed. From this one can infer that the frequency alone must be changing. I make that assertion because the amplitude of the waveform is immaterial (with regard to the speed of light, i.e., c) to the established classical physics of electromagnetic radiation as we know it today.

You guess wrong. Setterfield has far more than frequency changing with the changes in c. The number of photons being emitted from the sun declines, but the photons shorten in wavelength, becoming more energetic. The inverse relationship between frequency and wavelength is retained, but Planck's "constant" changes. The idea is to keep the amount of energy being emitted from the sun and the earth a constant. It's a mess. I looked hard at it once, as described here.

199 posted on 08/09/2002 4:47:44 PM PDT by VadeRetro
[ Post Reply | Private Reply | To 193 | View Replies]

To: raygun
Obviously foremost is the issue of the same person doing the same experiment to determine c at a later date, and the value is lower than before. Not much but lower.

The design of the experiments and the accuracy of the equipment have changed drastically since Roemer's first observations.

If one plots a best-fit curve of the data (using whatever regression model one desires), along with upper and lower limits of the margin for error, all three curves are decreasing. The funnel between the error limits narrows over time, but nevertheless all three curves are decreasing.

Not true, which they tacitly acknowledged when they went to an oscillating model.

Furthermore, the problem with current methods of light-speed measurement (mainly laser) is that both wavelength [W] and frequency [F] are measured to give c, as the equation reads [c = FW].

I don't think so. As I understand it, the propagation time of a signal through some long light path is measured, directly or indirectly. Perhaps you are confused because some techniques (Michelson and Morley's, e.g.) detect tiny time lags via interference fringes in out-of-phase waves.
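
A minimal sketch of the direct time-of-flight idea, with a made-up path length and transit time purely for illustration:

```python
# Illustrative time-of-flight measurement: the path length and transit
# time below are invented numbers, not data from any real experiment.
PATH_LENGTH_M = 30_000.0     # hypothetical 30 km round-trip light path
TRANSIT_TIME_S = 1.0007e-4   # hypothetical measured transit time, seconds

c_measured = PATH_LENGTH_M / TRANSIT_TIME_S
print(f"c = {c_measured:,.0f} m/s")  # -> ~299,790,000 m/s
```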

Unfortunately, atomic frequencies also vary lock-step with c, so that when laser frequencies are measured with atomic clocks no difference will be found.

I assume Setterfield was trying to make his theory as unfalsifiable as possible when he went for that point. But it kills him on having the early earth (say Day 6 of Creation Week) habitable for humans with the sun and earth cooking off like crazy. He tries to dodge that problem, but I don't think he's ever succeeded.

The way out of this is to use some experimental method where this problem can be avoided. Ron Samec has suggested that the Roemer method could be used. This method uses eclipse times of Jupiter's inner satellite Io. Indeed, it has been investigated by Eugene Chaffin. Although many things can be said about his investigation (and they may be appropriate at a later date), there are a couple of outstanding problems that confront all investigators using that method. Chaffin pointed out that perturbations by Saturn, and resonance between Io, Europa, and Ganymede, are definitely affecting the result, and a large number of parameters therefore need investigation.

When I first read about this I practically fell on the floor laughing. Roemer's original high reading for c is one of Setterfield's props for his theory, but now the technique is too error-prone to be worth replicating.

All the early measurements were necessarily low-tech. And all the really accurate clocks are atomic. Making your theory unfalsifiable isn't really such a good strategy.

200 posted on 08/09/2002 5:05:53 PM PDT by VadeRetro
[ Post Reply | Private Reply | To 196 | View Replies]



