It’s been said, “When a theorist publishes his results, no one believes them but the theorist. When an experimenter publishes a result, everybody believes it except the experimenter.” (’cuz he knows what can go wrong, you see.)
The data, taken at face value, suggest a seasonal variation in the measured decay rates. One of the suggested explanations was a seasonal variation in the fine structure constant, which governs all electromagnetic interactions, including atomic spectral lines. A variation approaching 0.1% in its value would wreak havoc on all kinds of routine measurements, not to mention cause capricious variation in the familiar properties of matter, which in reality are exquisitely and reliably constant.
What about spectral lines in the Sun’s atmosphere? If alpha really varied with the seasons (that is, with our distance from the Sun), the lines formed right there in the solar atmosphere ought to show it. And what would alpha even be when you’re ON the Sun? It’s ridiculous!
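To put a rough number on that (my own back-of-the-envelope, not anything from the paper): hydrogen-like transition energies scale as

    E_n = -α² m_e c² / (2n²),   so   δE/E ≈ 2 δα/α.

A 0.1% wobble in alpha would shift every atomic line by roughly 0.2%, orders of magnitude beyond anything solar or laboratory spectroscopy would tolerate. Those lines sit exactly where they should, season after season.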
On a discussion group, I saw a comment that this result would be worth several Nobel prizes, if correct. This understates the case considerably. It would be the end of modern physics as we know it, tantamount to “yarn world” in The Hitchhiker’s Guide to the Galaxy ... anything goes!
If you torture the data enough, it will confess.
IMHO, most likely explanation? Measurement error. Some systematic variation in the equipment that is a function of ambient temperature might suffice. Even the old standard meter bar had a slight seasonal variation in length, despite exquisitely careful temperature control.
How did they measure the radiation? Geiger tubes? Do the tubes have a minuscule dimensional change with atmospheric pressure? Seasonally lower barometric readings could cause a very slight change in each tube's capture volume.
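Just to show how little it takes (a toy simulation of my own; the isotope, half-life, and 0.1% efficiency swing are made-up numbers, nothing to do with the actual experiment’s hardware):

import numpy as np

# Toy model: the true decay rate is constant, but the detector's efficiency
# drifts by a tiny fraction with the seasons (temperature, pressure, whatever).
half_life_days = 138.0              # hypothetical isotope, half-life of a few months
lam = np.log(2) / half_life_days    # true decay constant (per day), never changes
N0 = 1e9                            # initial number of atoms, arbitrary
eps = 0.001                         # 0.1% seasonal swing in counting efficiency

t = np.arange(0, 3 * 365)                                 # three years of daily readings
true_activity = lam * N0 * np.exp(-lam * t)               # what the sample actually emits
efficiency = 1.0 + eps * np.sin(2 * np.pi * t / 365.25)   # instrument drift, not physics
measured = true_activity * efficiency                     # what the counter reports

# Fit a pure exponential (a straight line in log space) and look at the residuals:
# they show a clean annual oscillation even though the decay law never varied.
slope, intercept = np.polyfit(t, np.log(measured), 1)
residuals = np.log(measured) - (slope * t + intercept)
print("peak-to-peak residual:", residuals.max() - residuals.min())   # about 2*eps

Point being: you don’t need new physics, just a part-per-thousand of seasonal sensitivity somewhere between the source and the data file.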
Back to that comment about this being worth several Nobel prizes, if correct.
Perhaps. Depending on the explanation, of course.
Still, it would be way kewl if everything we knew about physics were wrong!