Posted on 08/25/2010 8:59:18 AM PDT by decimon
When researchers found an unusual linkage between solar flares and the inner life of radioactive elements on Earth, it touched off a scientific detective investigation that could end up protecting the lives of space-walking astronauts and maybe rewriting some of the assumptions of physics.
Checking data collected at Brookhaven National Laboratory on Long Island and the Federal Physical and Technical Institute in Germany, they came across something even more surprising: long-term observation of the decay rate of silicon-32 and radium-226 seemed to show a small seasonal variation. The decay rate was ever so slightly faster in winter than in summer.
On Dec 13, 2006, the sun itself provided a crucial clue, when a solar flare sent a stream of particles and radiation toward Earth. Purdue nuclear engineer Jere Jenkins, while measuring the decay rate of manganese-54, a short-lived isotope used in medical diagnostics, noticed that the rate dropped slightly during the flare, a decrease that started about a day and a half before the flare.
If this apparent relationship between flares and decay rates proves true, it could lead to a method of predicting solar flares prior to their occurrence, which could help prevent damage to satellites and electric grids, as well as save the lives of astronauts in space.
(Excerpt) Read more at news.stanford.edu ...
Fascinating!
If it truly was some arcane weak nuclear interaction affected by neutrinos (an idea I grasped at in my earlier post) I don’t think there would be a seasonal effect.
The difference between winter and summer is a difference in effective flux on an angled surface. But neutrinos can zip straight through the earth’s crust and not interact with anything.
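For scale, there is a seasonal driver that doesn’t depend on surface angle at all: the Earth–Sun distance varies over the year (perihelion in early January, aphelion in early July), and any solar flux scales with the inverse square of that distance. A quick check, using the standard orbital values:

```python
# Seasonal modulation of any solar flux from the inverse-square law alone.
# Standard orbital values: Earth is ~0.983 AU from the Sun at perihelion
# (early January) and ~1.017 AU at aphelion (early July).
PERIHELION_AU = 0.983
APHELION_AU = 1.017

# Flux goes as 1/r^2, so January (northern winter) vs. July:
flux_ratio = (APHELION_AU / PERIHELION_AU) ** 2
print(f"January/July flux ratio: {flux_ratio:.3f}")          # ~1.070
print(f"Peak-to-peak swing: {(flux_ratio - 1) * 100:.1f}%")  # ~7.0%
```

A ~7% distance-driven swing applies to neutrinos just as much as to anything else: unlike the surface-angle effect, it matters even for particles that pass straight through the crust.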
And it can’t be something like W and Z boson flux from the Sun, because - apart from all the range and lifetime arguments - these massive particles could not be moving at anything like c.
Which makes me think it’s got to be electromagnetic: some manifestation of electroweak theory. But this doesn’t make sense either - radioactive decay is not affected by magnetic fields (nor by heat, gravity, etc.). That’s why we can trust atomic clocks.
Well I’m stumped.
Maybe it’s gravitational waves.
Atomic clocks don't tick based upon the decay of an isotope. Rather, they resonate at a measurable frequency when excited. Atoms are unharmed in the process.
Perhaps something that precedes a flare - neutrino collisions - could be the cause, but when you consider the rarity of those, one is driven to think of either multiple/complex interactions at the quantum level (neutrino vs. decaying isotope) or an entirely different force.
Oh yes that’s right, my bad.
IIRC the second is calibrated against a hyperfine level of Cesium, but that doesn’t mean there’s radioactive Cesium inside an atomic clock, radiating steadily away.
But you know, I have lazily thought that for years. Useful thread!
Yes, that would be the reason (release of neutrinos or some other fast particle before the flare).
I shouldn’t really have mentioned neutrinos, though. Yes, they are fast enough, and because they are so hard to detect it makes sense to suspect their involvement.
But neutrinos really do have a lousy collision cross-section with most matter. Unless Manganese turns out to be the periodic table’s champion neutrino collector, and we can also trace a role for neutrinos in radioactive decay.
Shades of "The Endochronic Properties of Resublimated Thiotimoline." (See Asimov).
If decay rates are not constant and can be changed by external influences then the dating methods based upon them are also suspect over very long periods of time.
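The sensitivity is easy to estimate: for first-order decay, t = ln(N0/N)/λ, so a fractional shift in the decay constant produces the same fractional shift in every inferred age. A sketch (the 0.1% perturbation is purely illustrative, not a measured effect):

```python
import math

def age(n0_over_n, decay_const):
    """Age from the first-order decay law N = N0 * exp(-lambda * t)."""
    return math.log(n0_over_n) / decay_const

# Carbon-14: half-life ~5730 years -> lambda = ln(2)/5730 per year.
LAMBDA_C14 = math.log(2) / 5730.0

ratio = 4.0  # sample retains 1/4 of its original C-14 (two half-lives)
t_nominal = age(ratio, LAMBDA_C14)
t_shifted = age(ratio, LAMBDA_C14 * 1.001)  # hypothetical 0.1% faster decay

print(f"Nominal age:    {t_nominal:,.0f} yr")   # 11,460 yr
print(f"Shifted age:    {t_shifted:,.0f} yr")
print(f"Relative error: {(t_nominal - t_shifted) / t_nominal:.2%}")  # 0.10%
```

So a persistent 0.1% shift in the decay constant skews all ages by the same 0.1%; a periodic wobble that averages out over long exposures would matter far less.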
Here’s my theory.
Changes in the Sun’s magnetic field affect the flux of high energy cosmic rays on the earth. This is a known effect.
Maybe a greater rate of cosmic ray impacts smash up Manganese nuclei in a manner that just happens to look like increased decay, without involving some exotic change to the cross-section of the weak force.
I started thinking of it, and thought...what about Cs-133? That is used as the international standard for time by measuring a component of its decay.
What if that was affected? What kinds of effects might that have on very, very precise calculations and experiments?
Interesting stuff.
I’ll take up a collection for those African kids if you want to make a generous donation.
“Hi Folks, I’m from Democratic Underground. What are you guys talking about? It sounds like dugh syhuhhhhhduh duyh iuyh diuyhhhhh to me”
Why not; thiotimoline signals impending dissolution in water.
Question for further study: if the decay rate drops, does the sun flare to bring it back to the normal rate, i.e. does the decrease cause the flare?
More serious question: do prolonged periods of increased or decreased solar activity affect the decay rates of elements used for dating purposes; and if so, is it by a significant amount? Would it average out over time, or would it be biased in one direction?
Will this finding increase or decrease Crevo argumentation? ;-)
See Post 24 above. Atomic clocks don’t measure decay, since the decay of isotopes is essentially random in the short term. Rather, atomic clocks excite the atoms, which resonate at a constant, measurable frequency.
Not to worry.
The calibration of the second, and the operation of these clocks, is based on a fine splitting of energy levels of the 6s electron of Cesium. These are electronic energy levels, conceptually similar to the levels of a Hydrogen atom.
These levels have nothing to do with radioactive decay. The only reason anyone would associate them with decay is that the definition of the second cites interaction with nuclear spin as the cause of the hyperfine splitting.
That word ‘nuclear’ - that’s why for the longest time I thought atomic clocks involved a timed decay, but they don’t.
Although they do get a little shaken up.
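For the record, the SI second is defined as 9,192,631,770 periods of the radiation from that Cs-133 hyperfine transition, and the energies involved sit many orders of magnitude below nuclear decay energies. A quick look at the numbers:

```python
# SI definition: 1 second = 9,192,631,770 cycles of the Cs-133
# ground-state hyperfine transition (a microwave frequency, ~9.2 GHz).
CS_HYPERFINE_HZ = 9_192_631_770

# Corresponding wavelength, c / f -- a ~3 cm microwave:
C_M_PER_S = 299_792_458
wavelength_cm = C_M_PER_S / CS_HYPERFINE_HZ * 100
print(f"Transition wavelength: {wavelength_cm:.2f} cm")  # 3.26 cm

# Photon energy in eV -- tiny compared with MeV-scale decay energies:
H_EV_S = 4.135667696e-15  # Planck constant in eV*s (CODATA value)
energy_ev = H_EV_S * CS_HYPERFINE_HZ
print(f"Photon energy: {energy_ev:.2e} eV")  # ~3.80e-05 eV
```

Roughly 38 micro-eV microwave photons versus MeV-scale decay products: that ten-orders-of-magnitude gulf is why an atomic clock can tick imperturbably while nuclei do whatever they do.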
Close, but actually it’s toooooooo deeeeeeeeeep for da Undergrind, er ground, grind, whatever!
Increase. That is immutable.
Further to my theory about Cosmic Ray bombardment: 90% of CR particles are protons, 9% are alpha particles and 1% are electrons, all of them at very high energies.
It seems quite likely that some of these alpha and beta particles could whack into a radioactive isotope and cause an apparent increase in the decay rate.
Or indeed the raw CR flux could be mistaken for decay particles if the observation weren’t set up to shield against them.
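That contamination scenario is easy to quantify: if the detector counts true decays plus an unshielded cosmic-ray background, any modulation of the background masquerades as a modulation of the decay rate. A toy model (every rate below is a made-up illustrative number, not measured data):

```python
# Toy model: observed counts = true decays + cosmic-ray background.
# All rates here are illustrative assumptions, not measured values.
true_decay_rate = 1000.0  # counts/s from the isotope
cr_background   = 20.0    # counts/s of unshielded cosmic-ray hits

def apparent_rate(cr_modulation):
    """Observed rate when the CR background is scaled by (1 + modulation)."""
    return true_decay_rate + cr_background * (1 + cr_modulation)

baseline = apparent_rate(0.0)
flare    = apparent_rate(-0.15)  # e.g. a 15% Forbush-style CR decrease

print(f"Apparent decay-rate change: {(flare - baseline) / baseline:.3%}")
# -> about -0.294%
```

The 15% drop is in the ballpark of a real Forbush decrease (the dip in cosmic-ray flux after a coronal mass ejection), and the resulting ~0.3% apparent shift is in the same order of magnitude as the small effects the article describes - which is why the shielding question matters.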
We sometimes forget that we’re living on a rock bathed in a sleet of hard cosmic rays.