Watts Up With That?
Mon, 13 May 2013 14:48 CDT
Back in August 2010, WUWT ran an article wherein it was claimed that variations in the sun changed the rate of radioactive decay. This, of course, flew in the face of years and years of experimental evidence, starting with the Curies, that the rate of radioactive decay is constant, unaffected by pressure, temperature, or anything else.

However, this claim that the sun could change radioactive decay rates was soon challenged by a follow-up article at WUWT, and then a second follow-up, both of which threw cold water on the idea.
Figure 1. Mass of the universe, by type
So I was interested to stumble across an announcement issued by Purdue University in August 2012, which strongly confirmed the reality of the phenomenon. Purdue has applied for a patent for the use of this effect as a means of providing advance warning of solar flares.
I found this most interesting, however, not because it affords a chance to have warning of another Carrington Event, although that would be great in itself. Instead, I found it interesting for a curious reason involving the mechanism whereby the sun is able to affect the rate of radioactive decay.
The thing I really like about the mechanism, about the way the sun is able to influence the rate of radioactive decay, is that we don't have any idea what it is or how it works.