Follow-up on the solar neutrinos and radioactive decay story

Getting out of the solar core, neutrinos are speed demons, photons are slugs. h/t to Leif Svalgaard for the graphical annotation inspiration. Solar core image from NASA.

Via slashdot:

A couple of days ago, WUWT carried a story that was rather shocking: some physicists published claims that they had detected a variation in earthly radioactive decay rates, which would be big news by itself, but the shocker is that they attributed it to solar neutrinos.

The findings attracted immediate attention because they seemed to violate two well-established facts of physics:

1. Radioactive decay is a constant

2. Neutrinos very rarely interact with matter and are hard to detect when they do.

For example: trillions of neutrinos are zipping through your body right now. So why would they interact with radioactive elements in a more detectable way?
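
For reference, "radioactive decay is a constant" means that each nucleus has a fixed decay probability per unit time, λ, so a sample of N0 nuclei follows

N(t) = N0 · e^(−λt),   with half-life t½ = ln(2)/λ.

The claim under discussion is that λ itself wobbles, by a fraction of a percent, with an annual period.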

Discover Magazine’s blog 80beats followed up on the initial story buzzing around the web this week and interviewed several physicists who work on neutrinos. The neutrino-affecting-radioactive decay theory is being questioned.

Excerpts:

“My gut reaction is one of skepticism,” Sullivan told DISCOVER. The idea isn’t impossible, he says, but you can’t accept a solution as radical as the new study’s with just the small data set the researchers have. “Data is data. That’s the final arbiter. But the more one has to bend [well-established physics], the evidence has to be that much more scrutinized.”

Among the reasons Sullivan cited for his skepticism after reading the papers:

  • Many of the tiny variations that the study authors saw in radioactive decay rates came from labs like Brookhaven National Lab—the researchers didn’t take the readings themselves. And, Sullivan says, some are multiple decades old. In their paper, Fischbach’s team takes care to try to rule out variations in the equipment or environmental conditions that could have caused the weird changes they saw in decay rates. But, Sullivan says, “they’re people 30 years later [studying] equipment they weren’t running. I don’t think they rule it out.”
  • The Purdue-Stanford team cites an example of a 2006 solar flare, saying that they saw a dip in decay rates in a manganese isotope before the occurrence that lasted until after it was gone. Sullivan, however, says he isn’t convinced this is experimentally significant, and anyway it doesn’t make sense: Solar neutrinos emanate from the interior of the sun—not the surface, where flares emerge. Moreover, he says, other solar events like x-ray flares didn’t have the same effect.
  • If it were true, the idea would represent a huge jump in neutrino physics. At the Super-Kamiokande detector, Sullivan says only about 10 neutrinos per day appeared to interact with the 20 kilotons of water. Sullivan says the Purdue-Stanford team is proposing that neutrinos are powerfully interacting with matter in a way that has never before been observed. “They’re looking for something with a very much larger effect than the force of neutrinos, but that doesn’t show up any other way,” he says.

Fischbach and Jenkins, who have published a series of journal articles supporting their theory on neutrinos and radioactive decay, emailed DISCOVER to respond to these criticisms of their work. Regarding the first one, the researchers defended the integrity of the data even though they didn’t take it themselves, saying the experiments “were carried out by two well-known and experienced groups. We have published an analysis of these experiments, in Nuclear Instruments and Methods … showing that the potential impact of known environmental effects is much too small to explain the annual variations.”

The full story here.

Enneagram
August 27, 2010 6:09 am

That model of the Sun seems obsolete:
http://www.holoscience.com/news.php?article=ah63dzac
About the Sun's influence on chemical reactions:
http://www.rexresearch.com/piccardi/piccardi4.pdf

Leon Brozyna
August 27, 2010 6:24 am

Ahhh … some more real science.
These folks seem to have stirred up quite a hornets' nest of controversy. And from the full article comes this gem of true science from a couple of skeptics:

Both Adelberger and Sullivan agreed that the Purdue-Stanford findings pave the way to some interesting—and more carefully controlled—research to verify or falsify the idea. But for now, neither is a believer.

Either way, the outcome should be most interesting.

Glen Shevlin
August 27, 2010 6:27 am

It is entirely possible that they have found something revolutionary, and it is equally possible that they have found an anomaly in the data due to something outside their control. The process now is to have the data and methodology examined for flaws by anybody with an interest in it. If the data and results stand up, we have new science and probably a trip to Sweden. At present there is no definitive answer, but they have found a whole bunch of new questions… which, when you look at it, is the entire purpose of science.
Game on: dueling theses at 10 paces and high noon.

Editor
August 27, 2010 6:33 am

OK… It’s not April 1… But this has to be a joke.
This sounds too much like the plot to 2012.

The Total Idiot
August 27, 2010 6:54 am

I wonder if they’ve considered the effect of long-wave gravitics on fusion/nuclear reactions. It’s difficult to know if it would have an effect, given that there would be no real way to detect the presence of a long-wave gravity concentration or trough (say, a wave literally years across). Even a small change (0.1%) would be undetectable by other means, though it could potentially affect radioactive decay through tiny increases in effective neutron density.
While spacetime density is a misnomer, ‘waves’ in spacetime itself should be a corollary of the relativity equations, and being in the frame of reference would make them difficult to measure.
The quantum nature of such events would be interesting to theorize. Does the quantum probability of neutron interaction increase or stay the same in such a state, when operating as a quantum wave event? Until detection (by decay or other means) it should remain independent of the framework until observation, at which point quantum collapse occurs. Given the nature of the spatial distances, there should be a tiny change in the amount of the weak and strong nuclear force… but at the same time the framework is also compressed or expanded.
It opens up questions of quantum distances versus macrospatial distances, whether the framework is relative until observation, how much energy such a wave would possess… and what might be capable of creating one. The only way to observe such a wave would be to observe its effects at the macro level, through the interaction of particles, and even then it would produce only infinitesimal changes in observable phenomena. It would be an interesting mathematics concept, but likely without use for anything else, save for discussions of philosophy amongst graduate physics and mathematics majors.
But then… I admit ignorance on much of the subject.

Louis Hissink
August 27, 2010 7:08 am

Radioactive decay is NOT a constant. Russian work published in 1989 describes experiments including decay of Pu-239 over long periods of time (Schnoll et al.). The crucial fact is that magnetic and gravitational fields have been shown not to affect radioactive decay, but no one has subjected it to variations in the electric field. It is variations in the electric field which affect the radioactive decay rates observed.

John Whitman
August 27, 2010 7:10 am

Leif,
Thank you for continuing to stimulate us on this fascinating story line by your tip/sources location of this topic.
Four things that immediately struck me with the WUWT “Solar flares are teleconnected to earthly radioactive decay” post a couple of days ago were:
1) before we go too far with this, we need verification of the reported 33-day variations in isotopic decay constants by an alternate technique, and replication of their experiments
2) why lock so quickly onto neutrinos as the potential cause? Maybe because this would have the least impact on the fundamentals of modern physics?
3) why lock initially onto the sun's rotation as the originator of the mechanism (whatever it is: neutrinos, gravitons or "X")? Pulsing (with some feedbacks) is also known in nature to create recurring patterns, not just rotation.
4) the potential for much longer-period variations in the isotopic decay constants also exists, so this should be part of the critique of the original findings of 33-day variations
But, thank you for being circumspect on the findings. It seems what is needed is to not get the cart too far ahead of the horse, and to verify the initial findings critically and broadly.
John

Fred Lightfoot
August 27, 2010 7:18 am

This definitely requires a Willis Logic approach!
Go Willis.

anna v
August 27, 2010 7:41 am

Moderator, I think my previous post was snagged; it had two long links.
Glen Shevlin says:
August 27, 2010 at 6:27 am
It is entirely possible that they have found something revolutionary, and it is equally possible that they have found an anomaly in the data due to something outside their control. The process now is to have the data and methodology examined for flaws by anybody with an interest in it. If the data and results stand up, we have new science and probably a trip to Sweden.
In scientific disciplines where experiments can be done, the method is to repeat the experiment, not to analyze the same data to death. This experiment can be done; it might take years, but it can be done, and should be done. It is impossible to gauge whether such small effects are instrumental artifacts when the data are being used by people other than the data gatherers themselves.

hunter
August 27, 2010 7:46 am

And of course, if you recall, the mechanism that destroyed the Earth in the (silly) movie 2012 was a change in the way neutrinos behaved……

kwinterkorn
August 27, 2010 7:46 am

Whatever the substance of the arguments, you gotta love what Sullivan says: “…Data is data. That is the final arbiter…..”
Seems that is the essential mission statement of WUWT as a blog: without quality data there is no quality science. AGW is not based on quality data, and it is a travesty.

George E. Smith
August 27, 2010 8:10 am

“”” Louis Hissink says:
August 27, 2010 at 7:08 am
Radioactive decay is NOT a constant. Russian work published in 1989 describes experiments including decay of Pu-239 over long periods of time (Schnoll et al.). The crucial fact is that magnetic and gravitational fields have been shown not to affect radioactive decay, but no one has subjected it to variations in the electric field. It is variations in the electric field which affect the radioactive decay rates observed. “””
Well the problem with ascribing things to electric fields; such as the medical effects of power lines for example, is that you can calculate the energy density in an electric field; and in the case of the power line chemistry; somebody who did such a calculation said the energy density at the organic molecular size level was 27 orders of magnitude too low to influence any chemical bonds.
So now you zoom on in to the dimensions of the atomic nucleus; and any conceivable electric field we have observed around the laboratory would hardly seem to be noticeable to a nucleus.
So I don’t think any Classical Physics could be involved; presuming that the data check out and the decay variation is proved to be correct; whatever the presently unknown cause.
Thanks to Leif for pointing out the tortured path that mere photons have to endure to escape the optical inhomogeneity of the sun; while neutrinos evidently simply ask: “What sun ? I don’t see no stinkin’ sun !”

Enneagram
August 27, 2010 8:17 am

The Total Idiot says:
August 27, 2010 at 6:54 am
Evidently there is a need for a much simpler explanation. A unified field theory is needed… or perhaps it is out there in the books and we reject it because of prejudice.
There are general laws we choose to ignore.

JDN
August 27, 2010 8:19 am

I hope this proposed phenomenon gets a good scientific workout. Are data & methods being released? Is there a cogent plan to determine how widespread this phenomenon is? Every scientist knows this is the way to do things, and it would make a nice contrast to the warmist science that WUWT opposes.

Kevin Kilty
August 27, 2010 8:24 am

Radioactive decay has statistical properties that are stationary, probably, and we can speak of a decay constant that is well defined for a large population of radioactive species, but it isn't really constant. For each atom it is either 1 or 0, decay or not, which seems difficult to describe as constant.
Statistical variation is a source of all sorts of "signals" that really aren't there; possibly cold fusion of the Fleischmann/Pons variety fits this idea.
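
A minimal Python sketch of that statistical point (the rate and interval are assumed round numbers): with an expected N counts per measurement interval, pure Poisson noise fluctuates by about 1/sqrt(N), which sets the floor below which periodic "signals" cannot be distinguished from chance.

import numpy as np

rng = np.random.default_rng(0)
rate = 1e6                             # expected counts per interval (assumed)
counts = rng.poisson(rate, 365 * 4)    # four years of daily readings, no real signal
print(counts.std() / counts.mean())    # ~1e-3, i.e. ~0.1%
# At a million counts per reading, the Poisson floor is already ~0.1%,
# the same order as the claimed annual modulation.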

Steve Fitzpatrick
August 27, 2010 8:31 am

Well, neutrinos are produced by radioactive decay. Could a radioactive decay process be reversed by adding a neutrino of the right type at the right time? Feynman diagrams posit reversibility in time, so maybe it is possible.
Still, the data need to be looked at very closely for spurious effects.

Jim Stegman
August 27, 2010 8:39 am

This reminds me of a variation on Murphy’s Law: “Variables won’t and Constants aren’t”.

Douglas Dc
August 27, 2010 8:45 am

This is a true science controversy, like a chess game. The best strategy, tactics, and theory wins.

jorgekafkazar
August 27, 2010 8:48 am

I’m still waiting for the response from the Melvin Dumar Cold Fusion Laboratory at the University of Utah.

Jeremy
August 27, 2010 8:50 am

But, Sullivan says, “they’re people 30 years later [studying] equipment they weren’t running. I don’t think they rule it out.”
I agree with that; however, it also provides enough of a mystery for someone to find funding to replicate/verify their findings with a real experiment. It shouldn't be difficult: we've already got neutrino detectors spread around the world, and it wouldn't be difficult to create a radioactive-decay monitoring experiment on the surface above such a detector.
The Purdue-Stanford team cites an example of a 2006 solar flare, saying that they saw a dip in decay rates in a manganese isotope before the occurrence that lasted until after it was gone.
This also didn’t sit well with me. Neutrinos are supposed to come straight from the solar core uninterrupted. Solar flares are surface phenomena of charged matter interacting with magnetic fields in highly unstable states. The idea that they interact in some fashion seems a stretch.
If it were true, the idea would represent a huge jump in neutrino physics.
Also true, and also exciting.

Anthea Collins
August 27, 2010 8:51 am

Forgive my ignorance, but is the change in decay rate of an element really significant? Surely it would be so minute as to be only worthy of study for the sake of study. (I stand by trembling with my hard hat pulled right down!)

beng
August 27, 2010 8:55 am

Photons are easily created (emitted) and destroyed (absorbed). Any given photon doesn’t survive from the core to the outside; it survives only a tiny distance in the dense inner sun before being absorbed. Of course, the particle absorbing it is excited to a higher energy level & will quickly emit a new photon. The 200k yr period is the time required for the photon “energy” to escape.
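
A back-of-the-envelope random-walk version of that escape time, in Python (the mean free path is an assumed round number; the real value varies enormously with depth in the sun):

# Random-walk escape time: t ~ R^2 / (mfp * c)
R = 7.0e8      # solar radius, m
mfp = 1.0e-3   # assumed photon mean free path, m (order of magnitude only)
c = 3.0e8      # speed of light, m/s
year = 3.15e7  # seconds
print(R**2 / (mfp * c) / year)   # ~5e4 years; a 0.1 mm path gives ~5e5
print(R / c)                     # ~2.3 s: a neutrino crosses the same distance almost instantly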

Mac
August 27, 2010 8:57 am

I would highly recommend “The Heretic’s Guide to Modern Physics”
Link here
http://www.wbabin.net/ppreww.htm
A series of articles from the 1980s by a professional practising physicist, W. A. Scott Murray, rather than an academic physicist, printed in the esteemed UK magazine Wireless World.
These articles are a must read, especially the comments about neutrinos.
“I think everybody would agree that atomic nuclei are quantized (type one), in that every nucleus is constructed out of a definite number of discrete particles, protons and neutrons, that can be recognised in the free state by their consistent properties and behaviour. But according to the new ideas the mechanics of everything small is also quantized (type two), and because the atomic nucleus is very much smaller than the complete atom, a fortiori should the mechanical energy and momentum within the nucleus be quantized. Yet the beta radiation, which is associated with the radioactive decay of one neutron into a proton inside the nucleus, apparently is not quantized. It was an article of the new faith that it should be quantized . . . “Therefore”, said the quantum theorists, “the conservation of energy must have failed (Niels Bohr); or, alternatively, the experimental evidence of the beta decay must be wrong”. Wolfgang Pauli saved the day, by postulating the existence of a completely unexpected neutrino or “small neutral particle” which had about the same mass as an electron but no electric charge. Such a particle, he suggested, would not show up in any ordinary particle counter or photograph. So: if one neutrino were to be emitted along with every radioactive beta electron, nobody would ever be able to detect the fact; but the invisible neutrino would carry away energy too, so that it and the beta electron, between them, could possess the quantized line spectrum of energy that the theory demanded although the visible beta electron did not. (The failure to quantize the sharing of this energy between the neutrino and the beta electron in fixed proportions was not explained.)
Now if you feel this to be a somewhat implausible, ad hoc suggestion, designed to make the experimental facts agree with the theory and not far removed from a confidence trick, be sure I share your suspicions. The question before us is: Do we believe in neutrinos? We would not be alone if we didn’t. Neutrinos are essential to the modern quantum theory, however, and their existence is assumed as a matter of course when describing nuclear reactions, yet not even their owners seem to be very sure about them. When first invented by Pauli they had about the same mass as an electron (so as to share the missing energy equitably, on average); then suddenly it was proved that they could have no rest mass, but must be like some kind of non-radiant, indetectable photon. However, to make up for that they must be spinning – “but not mechanically, of course, since there is no structure there to spin”. More recently it has been declared that they probably do have rest mass but very, very little (actual amount unspecified), and that there must be at least four different kinds of them. It does not add up to a very convincing story.
From the theorists’ viewpoint the delightful thing about neutrinos is that they are virtually indetectable. Being so light, and electrically neutral, it is said that most of them fly right through the planet Earth, touching neither nucleus nor electron and leaving no trace of their passage. (There is another logical inconsistency here too, but we needn’t labour every one!) Very occasionally a particle counter registers inside a 12ft-thick steel box near the target area of the big CERN accelerator at Geneva, and this effect, like some others, is attributed to a neutrino collision because “it couldn’t be anything else”. Then one day the astrophysicists discovered that, according to current theory, the Sun should be pouring out neutrinos at a calculable, fabulous rate; and accordingly an enormous neutrino detector was built in the United States especially to look for them, deep below ground in a diamond mine where unidentified particles would be unlikely to be mistaken for neutrinos and confuse the results. That experiment was reported in 1976. It detected fewer than one-tenth of the neutrinos of solar origin that it was expected to detect, and maybe none; there is no assurance that the very few nuclear reactions that it did detect were actually due to neutrinos. The astrophysicists have been sent away to do all their sums again.
But why should the poor astrophysicists take the blame for this negative result? What if Pauli’s adventurous speculation should have been wrong, and his postulated neutrino never existed after all? To the theorists such a thought really is unthinkable: for if, after weighing the evidence, we were to determine that on balance of probabilities we did not believe in neutrinos, then we would be suggesting that the atomic nucleus might not be “quantized” (into discrete energy levels, type two). And that thought in its turn would strike at the roots of every modern theory about the physics of elementary particles.”

August 27, 2010 9:00 am

Enneagram says:
August 27, 2010 at 6:09 am
That model of the Sun seems obsolete
This is what our best data derived from helioseismology and neutrino-flux measurements show. Nothing obsolete here.

Scott
August 27, 2010 9:03 am

It definitely looks like it's too early to make a call on this one right now. But my hope is that the paper is correct, because I've always considered #1 above an assumption of physics and not a fact. If it indeed turns out not to be a fact, I'm fairly certain that most of what we think about the universe suddenly comes into question, because it relies so much on the assumption that so many things are "constant".
Note that I'm not saying that these assumptions are bad, but to date they're the best we've had to go on, and I love that at least someone (reputable) is willing to question them.
-Scott

MJB
August 27, 2010 9:03 am

The technical side of this is fascinating, but I was also struck by the quote from Dr. Sullivan:
“Data is data. That’s the final arbiter. But the more one has to bend [well-established physics], the evidence has to be that much more scrutinized.”
For me, the last sentence highlights a basic human instinct: confirmation bias. Something we see in climate science, archaeology, ecology, etc. all too often. The advantage that physicists have is a very robust version of “well established” (although one could argue this about string theory and some other modern advances that are hard to test with experimentation). It is only because such care has been taken in building up the established picture of physics that this system is valid. When “well established” is unproven theory, or worse dogma, the warm fuzzy we get when a new observation matches our belief is false comfort.
Archaeology offers some fascinating perspectives on confirmation bias. I would suggest the search for the “missing link” in the 19th and early 20th centuries has many similarities to the evolution of AGW “science”. Piltdown mann 🙂 for example.

MarkB
August 27, 2010 9:10 am

Dollars to donuts this will turn out to be nothing. Most ground-breaking, revolutionary studies that overturn everything we know turn out to be wrong.

William
August 27, 2010 9:11 am

It does not appear the change in decay rates on the earth is due to a particle emission from the sun, as there is a significant lag time as the forcing mechanism builds up. A second logical argument against a particle cause is there is no known particle that can affect both beta and alpha emission.
It appears what is causing the changes in decay ratios (based on something that can affect both alpha and beta emission and that has a lag time) is modulation to a significant solar scale field. (See link below.)
There are peculiar unexplained terrestrial observations – for example, unexplained cyclic large geomagnetic field inclination and intensity changes called archeomagnetic jerks (periodicity around 200 years), large numbers of burn marks on the surface of the earth with the same date of formation (at 12,300 BP and at around 30,000 BP), geologically simultaneous volcanic eruptions from volcanoes that are in the same region but have separate magma chambers, and so on – that can provide support for a scalar field with a normal state and with an interruption that leads to a discharge that affects the earth and the other planets in the solar system. I have been looking at the origin and evolution of stellar magnetic fields as well as the formation of large stars. There are a large number of anomalous observations that appear to be related to whatever is causing these effects.
Correlations Between Nuclear Decay Rates and Earth-Sun Distance http://arxiv.org/abs/0808.3283
Evidence for Correlations Between Nuclear Decay Rates and Earth-Sun Distance
Unexplained periodic fluctuations in the decay rates of Si-32 and Ra-226 have been reported by groups at Brookhaven National Laboratory (Si-32), and at the Physikalisch-Technische-Bundesandstalt in Germany (Ra-226). We show from an analysis of the raw data in these experiments that the observed fluctuations are strongly correlated in time, not only with each other, but also with the distance between the Earth and the Sun. Some implications of these results are also discussed, including the suggestion that discrepancies in published half-life determinations for these and other nuclides may be attributable in part to differences in solar activity during the course of the various experiments, or to seasonal variations in fundamental constants.
The preceding considerations, along with the correlations evident in Fig. 4, suggest that the time-dependence of the 32Si/36Cl ratio and the 226Ra decay rate are being modulated by an annually varying flux or field originating from the Sun, although they do not specify what this flux or field might be. The fact that the two decay processes are very different (alpha decay for 226Ra and beta decay for 32Si) would seem to preclude a common mechanism for both.
However, recent work by Barrow and Shaw [12, 13] provides an example of a type of theory in which the Sun could affect both the alpha- and beta-decay rates of terrestrial nuclei. In their theory, the Sun produces a scalar field φ which would modulate the terrestrial value of the electromagnetic fine structure constant α_EM.

Dennis Nikols, P. Geol.
August 27, 2010 9:14 am

I am not a physicist, but that does not mean I am not interested. This is fascinating work that raises questions, and those questions need to be resolved. One thing that needs to be resolved is: how constant is constant? The only constant in physics, I am aware of, that has no variation or uncertainty, is the speed of light and that is by definition. I suspect this phenomenon has more to do with definitions of precision and accuracy than anything else. One other thought: almost every natural phenomenon I can think of can be expressed, or is expressed, as a wave form. It will be interesting to watch as the physicists try to unravel what they have observed.

Ian E
August 27, 2010 9:15 am

[snip] These findings are clearly robust – and the effect may be worse than first thought. We need a UN-sponsored panel to be set up so that the general public can be properly advised and a neutrino trading scheme should be set up forthwith!

Jim
August 27, 2010 9:27 am

It seems to me this effect would have been seen before in nuclear power reactors unless it is sensitive to specific nuclei.

SSam
August 27, 2010 9:32 am

A contrary view can be found in:
“Evidence against correlations between nuclear decay rates and Earth–Sun distance”
Eric B. Norman, Edgardo Browne, Howard A. Shugart, Tenzing H. Joshi, Richard B. Firestone
http://donuts.berkeley.edu/papers/EarthSun.pdf
Evidently, the effect mentioned in the 2010 paper was originally reported in 2008, which explains the timing of this Oct 2009 article.

Grumpy Old Man
August 27, 2010 9:37 am

Mac is right. This could be a tipping point in the debate over the alleged existence of the neutrino. On a more immediate level, if the rate of radioactive decay is not a constant, what will this do to historical dates produced by the likes of radiocarbon dating?

William
August 27, 2010 9:41 am

This is a link to the paper that discusses the archeomagnetic jerks. An archeomagnetic jerk affects planetary climate by changing the inclination and intensity of the geomagnetic field. When the geomagnetic field is strongly inclined relative to the rotational axis of the planet, GCRs strike lower latitudes of the planet at higher rates and greater intensity. (If the geomagnetic field's axis and the rotational axis are aligned, the GCR particles are deflected from striking low-latitude regions of the planet.) Over time the geomagnetic field's axis tries to return to alignment with the planet's rotational axis; however, there is a delay, as the core geomagnetic field has a large time constant.
This mechanism (abrupt changes to the geomagnetic field's inclination) explains the puzzling regional differences in the climate changes that are observed to have occurred, where regions at similar latitudes but different longitudes in the same hemisphere experience different amounts of climate change during the cyclic abrupt climate events of the geological past.
Where the strikes occur on the planet and hence how the strikes affects the geomagnetic field depends on the planet’s inclination at the time of the event, the pole of the planet that is pointing towards the sun at the time of the event (which determines whether the strike is in the Northern or Southern Hemisphere), and other terrestrial factors such as the extent and location on the surface of insulating ice sheets.
http://geosci.uchicago.edu/~rtp1/BardPapers/responseCourtillotEPSL07.pdf
Response to Comment on “Are there connections between Earth’s magnetic field and climate?, Earth Planet. Sci. Lett., 253, 328–339, 2007” by Bard, E., and Delaygue, M., Earth Planet. Sci. Lett., in press, 2007
Also, we wish to recall that evidence of a correlation between archeomagnetic jerks and cooling events (in a region extending from the eastern North Atlantic to the Middle East) now covers a period of 5 millennia and involves 10 events (see f.i. Figure 1 of Gallet and Genevey, 2007). The climatic record uses a combination of results from Bond et al (2001), history of Swiss glaciers (Holzhauser et al, 2005) and historical accounts reviewed by Le Roy Ladurie (2004). Recent high-resolution paleomagnetic records (e.g. Snowball and Sandgren, 2004; St-Onge et al., 2003) and global geomagnetic field modeling (Korte and Constable, 2006) support the idea that part of the centennial-scale fluctuations in 14C production may have been influenced by previously unmodeled rapid dipole field variations. In any case, the relationship between climate, the Sun and the geomagnetic field could be more complex than previously imagined. And the previous points allow the possibility for some connection between the geomagnetic field and climate over these time scales.

Larry Geiger
August 27, 2010 9:47 am

Even if it’s not neutrinos upsetting the decay, it’s still interesting. If it’s an artifact of the measurements, that’s still interesting. That radioactive decay measurement might be affected by the seasons is interesting too, though in a different way.

Ric Locke
August 27, 2010 9:48 am

Anthony —
You and the WUWT commenters have a golden opportunity to do real science. Your focus up to now has been on GW and weather, but the interest in the Sun evident here indicates that it’s at worst related.
What’s needed to support this hypothesis is a large number of readings taken over as large an area as possible for the maximum possible length of time, with as much quality control as possible.
How much would it cost to develop a device that contained a tiny sample of some radioactive isotope, a detector (many plastics scintillate, and teeny scintillator crystals are available), and a USB interface, together with software to log readings of activity and upload them to a server at regular intervals? Regulations for the handling of active isotopes are such that it would actually be cheaper to make something that couldn’t be tampered with than otherwise and get regulatory approval. Purchasers could then register with neutrinos@home and allow concentration of the readings in a central database.
The result, if it took off the way your climate widget did, would collect a lot of data with good traceability and consistency, and based on observation there would be no shortage of volunteers to analyze the data collected.
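
A minimal Python sketch of what the logging side of such a device might look like; the serial port, server URL, and field names are all invented for illustration, assuming a counter that emits one line per detected decay over USB serial (pyserial) and a neutrinos@home-style HTTP endpoint:

import time
import requests
import serial  # pyserial

PORT = "/dev/ttyUSB0"                           # hypothetical device path
URL = "https://example.org/neutrinos/upload"    # hypothetical endpoint
INTERVAL = 600                                  # accumulate counts for 10 minutes

dev = serial.Serial(PORT, 9600, timeout=1)
while True:
    counts, start = 0, time.time()
    while time.time() - start < INTERVAL:
        if dev.readline().strip():              # one line per decay event
            counts += 1
    # ship the interval's tally, timestamped, to the central database
    requests.post(URL, json={"start": start, "seconds": INTERVAL, "counts": counts})
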
Regards,
Ric

Dave Dardinger
August 27, 2010 9:53 am

The first thing which comes to my mind as a physical reason for this finding, presuming it doesn’t just turn out to be a spurious correlation, can be illustrated by looking at photons, which someone mentioned above. Photons of a particular wavelength are absorbed with a likelihood which can be calculated with great accuracy (at least in some cases). But as photons stream by an atom they may still have effects aside from absorption. Since it’s been over 40 years since I studied advanced physics, I’ve long since forgotten the names of any such effects, but probably somebody else here knows them. In the same way, it might be possible that a second-order effect of neutrinos (or neutrino density) could cause a minor change in decay rates of some isotopes. And this might vary depending on which sort of neutrino we’re talking about. Things like neutrino mass, etc. would also enter into the picture. It’d be nice if someone who is up on the subject would chime in (unless, of course, he or she is busy writing the article which might earn a trip to Stockholm).

James F. Evans
August 27, 2010 10:10 am

If this study has validity and radioactive decay rates vary via neutrino fluctuations from the Sun, then assumptions about the age of the Earth and other geologic time measurements (such as the ages of periods like the Permian) are seriously called into question.
It seems the more Science learns about various physical dynamics & relationships, the more the assumptions which Science has taken for granted are called into question.

Lorne
August 27, 2010 10:19 am

Ian E says:
August 27, 2010 at 9:15 am
[snip] These findings are clearly robust – and the effect may be worse than first thought. We need a UN-sponsored panel to be set up so that the general public can be properly advised and a neutrino trading scheme should be set up forthwith!
lol, coffee out nose on keyboard

Michael Larkin
August 27, 2010 10:23 am

Mac says:
August 27, 2010 at 8:57 am
“I would highly recommend “The Heretic’s Guide to Modern Physics”
Link here
http://www.wbabin.net/ppreww.htm
Many thanks for this link, Mac. I have started reading the series “A heretic’s guide to modern physics” and am finding it most interesting. I recommend it to everyone here.

Tim Clark
August 27, 2010 10:32 am

James F. Evans says:
August 27, 2010 at 10:10 am
If this study has validity and radioactive decay rates vary via neutrino fluctuations from the Sun, then assumptions about the age of the Earth and other geologic time measurements (such as the ages of periods like the Permian) are seriously called into question.

What’s .2% of 4.5 billion years? Might change dating of dinosaurs by 100 yrs or so, well within the accepted deviation.

Jim G
August 27, 2010 10:34 am

Read João Magueijo’s book on FTL (Faster Than Light). There have been several theories that light speed is not, or has not always been, constant since the Big Bang, if there was one, that is. I have one of his books and it is interesting, while making no claims other than that it is a theory which, if true, would eliminate some of the problems with presently accepted theory, like inflation. That energy density/level could have changed the speed of light over time is part of the theory.

August 27, 2010 10:36 am

I thought this was the money quote:

you can’t accept a solution as radical as the new study’s with just the small data set the researchers have

But apparently one study using 12 trees is good enough reason to regulate the global economy. Why can’t a climate scientist be more like a physicist? (Apologies to Lerner and Loewe)

Merrick
August 27, 2010 10:37 am

Mac,
It makes a great story; only, the quantum theory of the nucleus has been substantiated both theoretically and experimentally in hundreds if not thousands of experiments over decades.

kfg
August 27, 2010 11:34 am

Dennis Nikols, P. Geol. says: “The only constant in physics, I am aware of, that has no variation or uncertainty, is the speed of light and that is by definition.”
The constancy of the speed of light in a vacuum is not accomplished by definition. It is derived as a natural and necessary consequence of Maxwell's electromagnetic field equations and the Scientific Axiom. As its actual value is dependent on the field properties, it is as invariant as the field properties.
The Scientific Axiom is, of course, a working assumption that cannot be proven, but it has stood up to fairly rigorous test. Put colloquially, it can be stated as: "It ain't magic."

James F. Evans
August 27, 2010 11:39 am

Tim Clark presents Evans’ comment: “If this study has validity and radioactive decay rates vary via neutrino fluctuations from the Sun, then assumptions about the age of the Earth and other geologic time measurements (such as the ages of periods like the Permian) are seriously called into question.”
And, Tim Clark responds: “What’s .2% of 4.5 billion years? Might change dating of dinosaurs by 100 yrs or so, well within the accepted deviation.”
Tim, first, a lot of assumptions go into the 4.5 billion years hypothesis, and, remember, that's all it is, a hypothesis.
Second, if you compound 0.2% into that 4.5 billion year hypothesis, it changes the numbers rather significantly: more than just "100 yrs or so".
Third, if "this study has validity and radioactive decay rates vary via neutrino fluctuations from the Sun (or some other potential physical mechanism)", do we know that the radioactive decay rate doesn't vary substantially more than just 0.2% when certain physical parameters are met or changed?
Tim, I realize that pointing out assumptions (particularly assumptions taken for granted) can be wrong makes many uncomfortable, but that is what Science is all about — if advancing scientific understanding is the goal.
On the other hand, if protecting reputations and the assumptions that go with those reputations is what Science is all about then I understand and agree with your comment completely…

wsbriggs
August 27, 2010 12:08 pm

Merrick says:
August 27, 2010 at 10:37 am
“quantum theory … substantiated both theoretically and experimentally”
Agreed, but there are also small dangling questions. The magnetic vector potential looks like a mathematical construct, but it appears to influence electron pairs in Josephson junctions. Is there an unknown equivalent at the nuclear level? Not in my QM books, but in books about particle currents…
Transmutation was once only a result of decays; now it appears it's controllable, with replicated experiments world-wide. There are lots of things we don't know about nuclear reactions, even more that we don't know that we don't know.

Ken Harvey
August 27, 2010 12:17 pm

Michael Larkin says:
August 27, 2010 at 10:23 am
“I would highly recommend “The Heretic’s Guide to Modern Physics”
Link here
http://www.wbabin.net/ppreww.htm”
Thank you (and Mac) for this. I will be reading for some time to come.

Jim
August 27, 2010 12:21 pm

It is legal to buy and possess uranium ore, and it can be bought at United Nuclear.
http://unitednuclear.com/index.php?main_page=index&cPath=2_4
I’m sure there are other sources. The low- or medium-activity ores would probably be good enough for the experiment. There are also thorium ores. Both of these have long half-lives, meaning the output will be fairly constant, barring other factors. Not sure how to put the detector in a USB device. A scintillation detector sounds good, maybe with a solar-cell chip to convert the flashes to an electronic count?

kfg
August 27, 2010 12:23 pm

James F. Evans says: “. . . do we know that the radioactive decay rate doesn’t vary substantially more than just 0.2% when certain physical parameters are met or changed?”
Boom! BADDA Boom! BIG badda boom!

Z
August 27, 2010 12:23 pm

Tim Clark says:
August 27, 2010 at 10:32 am
What’s .2% of 4.5 billion years?

Nine million years.
One of the first easy questions I’ve ever seen on this site…

Editor
August 27, 2010 12:24 pm

Let’s not get too carried away with invalidating radiometric dating methods. Major extinction events are highly correlated with massive flood basalt eruptions…
Flood Basalts and Extinctions
I’d prefer to think that these things only happen every 20 to 60 million years and not every 2 to 6 thousand years.
😉

Steve
August 27, 2010 12:30 pm

I understand the general skepticism, but I don't understand why this discovery should be considered more bizarre than the quantum behaviors that have already been verified as fact.
Change in the decay rate of radioactive isotopes has already been verified (since 1977) to occur under certain conditions, via the quantum Zeno (and anti-Zeno) effect. In other words, the rate of decay is already known to be constant only if environmental conditions are constant. It would seem that this new discovery would build on that. If the rate of neutrino "observations" of an atomic nucleus varies, the decay rate will vary.
http://en.wikipedia.org/wiki/Quantum_Zeno_effect

August 27, 2010 12:44 pm

Jim,
United Nuclear is one of my favorite sites. They have scintillation detectors [spinthariscopes]. They have radioactive isotopes. They even have parts for death rays! How cool is that?

anna v
August 27, 2010 12:46 pm

Here is what I wrote in the Discover comments:
Two points:
The statement “decay rates are constant” is a statement describing local physics. As far as general relativity goes decay rates depend on the topology of space time; the same is true of the velocity of light, another so called constant, that is only locally constant.
Any interaction with the neutrino field introduces an extra weak vertex. No energy exchange can happen without this extra vertex. The extra weak vertex introduces in the probability a 10^-12 diminution, and they are talking of 0.2% effects. It cannot be neutrinos.
Even if there is a correlation with neutrino bursts or solar flares, correlations are not causation. The whole solar system could be going through a rough patch of space time.
Measuring accurately the velocity of light over a few sun cycles is an easier experiment than measuring decay rate changes.
I would vote for data contamination from instrumental biases. Otherwise this would be a first measurement of gravitons, 😉
BTW, It is true that when neutrinos were first visualized Fermi, I think, said: “who ordered that?”. Since then, and particularly with the data gathered in electron positron collisions the past twenty years or so, the standard model with its three neutrinos is as well established as the model of the atom.

Z
August 27, 2010 12:49 pm

Anthea Collins says:
August 27, 2010 at 8:51 am
Forgive my ignorance, but is the change in decay rate of an element really significant? Surely it would be so minute as to be only worthy of study for the sake of study. (I stand by trembling with my hard hat pulled right down!)

Normally U-235 has a very minor decay rate, but hit it with a neutron of the right speed and suddenly it becomes “decay on demand”, with the associated flash, bang, and mushroom cloud, so altering the decay rate of a substance has its uses.
Similarly but in the other direction, there are decay mechanisms which “suck in” an electron to turn a proton into a neutron. In the case of a completely ionised sample, this decay mechanism completely stops (because there are no electrons), so the old canard of it being a constant was never really true.
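
Schematically, the capture mode just described is

p + e⁻ → n + νe   (for the whole atom: (A, Z) + e⁻ → (A, Z−1) + νe),

where the captured electron comes from the atom's own inner shells; strip all the electrons away (full ionisation) and the reaction has no input, so that decay channel shuts off.
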
For more prosaic uses of radioactive decay, it is used to date various things on this planet like rocks and carbon based matter for purposes of geology/history etc. Any errors will mean that these dates will be wrong.
There is also the mechanism behind how it could vary. It would mean what we know about particle physics, isn’t actually true…

Dave Springer
August 27, 2010 1:00 pm

Might want to check out this stuff I was looking at a couple of years ago.
First is a 2008 study by the same authors (Jenkins, Fischbach) with different radioisotopes and seasonal changes in decay rate aligned with change in distance from sun seen at two different facilities.
http://arxiv.org/PS_cache/arxiv/pdf/0808/0808.3283v1.pdf
Here’s more data on the Voyager RTGs (radioisotope thermoelectric generators).
1999 paper on the RTG design and details about the degradation studies of the materials.
http://arxiv.org/PS_cache/arxiv/pdf/0808/0808.3283v1.pdf
And below a 2006 paper on actual vs. predicted performance over 28 years over a distance of 1AU (at launch) from the sun to 101AU out in 2006.
http://trs-new.jpl.nasa.gov/dspace/bitstream/2014/38760/1/06-0391.pdf
Note that after 28 years the thermopile was producing 5% more power than predicted, which seems an excessively large error given the robust degradation studies of the materials.
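
For scale, a minimal Python model of the raw Pu-238 heat source alone (half-life ≈ 87.7 yr), ignoring thermocouple degradation and daughter ingrowth, both of which the real prediction did model:

half_life = 87.7                  # Pu-238 half-life, years
t = 28.0                          # years since launch
frac = 0.5 ** (t / half_life)
print(f"heat remaining after {t:.0f} yr: {frac:.1%}")   # ~80.1%
# The nominal heat decline is ~0.8%/yr, so a 5% excess over prediction
# is equivalent to roughly six years' worth of nominal decay.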

Steve
August 27, 2010 1:27 pm

Oops, my previous comment regarding verification of the quantum Zeno effect on radioactive isotopes was off by a few decades.
Experimentation regarding the rate of radioactive decay was certainly hypothesized by 1977 (much earlier, actually), but experiments back then were to verify the effect on the decay of other quantum systems, not radioactive isotopes.
Below is a link to a Nature article in 2000 that specifically discusses confirmation of the quantum Zeno effect with regard to the rate of radioactive decay.

kfg
August 27, 2010 1:27 pm

Z says: “Any errors will mean that these dates will be wrong.”
Of course the idea that there is only a certain degree of precision in the methods is known, studied and thus assumed. We are always looking for ways to increase precision; nor would this be the first time that the very accuracy of the methods has been called into question – sometimes, it turns out with justification.
Which is why we never rely on a single measurement or method in the first place; grabbing at every “yardstick” available and only having such confidence in the results as their concurrence suggests is, or is NOT, warranted.
One might even be inclined to examine how these ideas apply to the measurement of Global Mean Temperature.
“It would mean what we know about particle physics, isn’t actually true…”
Which would be . . . really frickin’ cool!

J J
August 27, 2010 1:29 pm

Just a couple of comments, for your consideration:
1) If you look at the bulk of the decay data, the data will follow the normal exponential, thus the general decay trend is as expected. However, you can see a fine structure (the fluctuations that appear to have an annual period) as the data points oscillate along the general decay line. There should be no ordered trends in a “random” data stream, right?
2) Alpha and beta decays are two entirely different processes, and even within the definition of beta decay there are multiple processes (beta-minus, beta-plus, K-capture). And if you look at the energetics of the decays, which are unique to each isotope, and the fact that beta decays are dependent on the energy available, the idea that neutrinos could kick in a few eV as they pass through a mass, and that the cumulative sum of the energy added to the system could be enough to alter the decay constant slightly, is not out of the question.
3) The Purdue group did an exhaustive analysis of the detector systems used in the two different experiments (one data set 30 years old, the other ten years old), looking at the possible known environmental effects (including gravity/time and some others), and found that none of these could have been the single culprit, and that the sum of the causes was not likely to have been large enough to produce the +/- 0.1% oscillations. It could also be pointed out that the two data sets, taken in laboratories on separate continents, with two different types of detectors (one an ion chamber measuring photons, the other a proportional counter measuring beta particles), measuring three different decay chains, actually correlated in time with each other for the two years the experiments overlapped.
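
A toy Python illustration of the fine structure described in point 1), with an assumed 0.1% annual modulation of the decay constant: the residuals against a plain exponential fit oscillate with an annual period instead of being random.

import numpy as np

t = np.arange(0, 10 * 365.25)          # daily samples over ten years
lam0 = np.log(2) / (150 * 365.25)      # nominal decay constant, 1/day (~150 yr half-life assumed)
A = 1e-3                               # assumed 0.1% modulation amplitude
lam = lam0 * (1 + A * np.sin(2 * np.pi * t / 365.25))
activity = lam * np.exp(-lam0 * t)     # measured count rate (normalised)
resid = activity / (lam0 * np.exp(-lam0 * t)) - 1
print(resid.min(), resid.max())        # ~ -0.001 .. +0.001, with an annual period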

Paul Birch
August 27, 2010 1:34 pm

Dave Springer says:
August 27, 2010 at 1:00 pm
“Note that after 28 years the thermopile was producing 5% more power than predicted which seems an excessively large error given the robust degradation studies of the materials.”
The only surprising thing here is how accurate their prediction was. Especially since there are lots of other effects they do not seem to have been able to include in the model, such as the long term effect of cosmic rays on the thermocouple materials, which could easily either increase or decrease the thermoelectric emfs, etc. Moreover, where there was any uncertainty in their figures, they could be expected to have made somewhat conservative choices, because it doesn’t matter if the thermopile outperforms its specifications, but it matters a lot if it underperforms. It is also not clear from the pdf whether they included the gradually increasing energy output of the daughter nuclides. There is no evidence of variable decay rates here.

peterhodges
August 27, 2010 1:40 pm

It is observations of the unusual like this that help us to make progress.
I for one not only doubt the neutrino link, but doubt the neutrino, period.
The Bohr model works reasonably well, but I am sure we can come up with something even better.

Dave Springer
August 27, 2010 2:11 pm

@Paul Birch
“the long term effect of cosmic rays on the thermocouple materials, which could easily either increase or decrease the thermoelectric emfs”
So those could explain either an increase or a decrease in degradation of thermocouples. Easily huh? Try going into a little more detail on how that could easily go either way.

Dave Springer
August 27, 2010 2:21 pm

@Paul Birch
Actually Paul, I just noticed I put the wrong link in to the Voyager RTG materials degradation study so you didn’t actually read it before concluding it wasn’t thorough. LOL
Here’s the correct link:
http://www.ligo.caltech.edu/docs/P/P990034-00.pdf
Try actually reading it before you start commenting on how well it was done.

August 27, 2010 2:22 pm

Some of us here have been debating the effect as an enhancement of beta decay, hypothetically by solar neutrinos. On reading one of the papers just now, I noticed that one of the radionuclides in question was Ra-226, which is an alpha emitter, not the beta-emitting Ra-228 (which I’d previously mistakenly assumed or misread it as). This makes it even harder to see how neutrinos could be responsible, since they are not involved in alpha decay. The other radionuclide, Si-32, is a beta emitter, while Mn-54, mentioned in the article, apparently decays by K-capture; both of these are Weak processes. So the suggestion is that the mechanism is affecting all three decay modes similarly, which also seems rather unlikely. Note, by the way, that K-capture decay rates can be reduced quite easily, by exciting or ionising the atom (if there aren’t any K-electrons, they can’t be captured).

George E. Smith
August 27, 2010 2:31 pm

“”” anna v says:
August 27, 2010 at 12:46 pm
Here is what I wrote in the Discover comments:
Two points:
The statement “decay rates are constant” is a statement describing local physics. As far as general relativity goes decay rates depend on the topology of space time; the same is true of the velocity of light, another so called constant, that is only locally constant.
Any interaction with the neutrino field introduces an extra weak vertex. No energy exchange can happen without this extra vertex. The extra weak vertex introduces in the probability a 10^-12 diminution, and they are talking of 0.2% effects. It cannot be neutrinos.
Even if there is a correlation with neutrino bursts or solar flares, correlations are not causation. The whole solar system could be going through a rough patch of space time.
Measuring accurately the velocity of light over a few sun cycles is an easier experiment than measuring decay rate changes.
I would vote for data contamination from instrumental biases. Otherwise this would be a first measurement of gravitons, 😉
BTW, It is true that when neutrinos were first visualized Fermi, I think, said: “who ordered that?”. Since then, and particularly with the data gathered in electron positron collisions the past twenty years or so, the standard model with its three neutrinos is as well established as the model of the atom. “””
I would agree with your last statement Anna. Moreover, I believe there is a rather elegant argument (which I do not have at my fingertips) that says there cannot be any fourth Neutrino, along with its coterie of another bunch of as yet unknown particles.
So any other explanation that would posit some other new unknown particle has to run the gauntlet of a great abhorrence for accepting any new particles; once the Higgs Boson is located.
The more I read about this “discovery” the more I feel that it seems someone discovered a dusty old box full of baseball cards and they have been going through it to see if any of them is valuable.
The initial (current) release of this story made it sound like JPL or some other modern group had just had their bells rung in some fancy new detector, and had some hitherto unknown phenomenon to explain. Better check whether the bells are really ringing first and then try to cook up some causal explanation; please let it not be CO2.

kwinterkorn
August 27, 2010 2:36 pm

Neutrino flux from the sun presumably follows the inverse square law.
Do any of our various spacecraft whizzing around the solar system (or stuck in the snad on Mars) have instruments that could be bent to the purpose of calculating local radioactive decay rates?
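
For scale: Earth's orbital eccentricity (~0.0167) already swings the Earth-Sun distance by about ±1.7% over the year, so an inverse-square flux swings by roughly twice that. A quick Python check:

e = 0.0167                        # Earth's orbital eccentricity
for name, r in [("perihelion", 1 - e), ("aphelion", 1 + e)]:
    print(name, f"{r**-2 - 1:+.1%}")   # flux relative to 1 AU
# perihelion +3.4%, aphelion -3.3%: any solar-distance effect should
# carry this annual signature, and a spacecraft far from 1 AU would
# see a much larger lever arm.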

Jim G
August 27, 2010 2:37 pm

Our nature seems to want things to be ordered and symmetrical. Perhaps our mathematical systems have that bias built into them but, unfortunately, we live in a chaotic universe where we have, so far, found that our mathematical representations of physical processes always seem to be mere approximations of what is really going on. From Newton to Einstein to dark matter and dark energy to inflation and our inability to marry general relativity to quantum physics. Dogmatism is our greatest enemy.

kwinterkorn
August 27, 2010 2:38 pm

I wish I could tipe beter. “sand”, not “snad”.

Steve
August 27, 2010 2:44 pm

Good gravy is my brain off today! I stated “link below” in my previous comment and then forgot to paste the link. OK, one last time…
Below is a link to the abstract published in Nature in 2000, “Acceleration of quantum decay processes by frequent observations”:
http://www.nature.com/nature/journal/v405/n6786/abs/405546a0.html
So, if neutrinos “observe” atomic nuclei, and the rate of neutrino observations changes appreciably, it isn’t a huge leap to assume that the rate of radioactive decay will change as a result of the quantum Zeno effect.

Paul Birch
August 27, 2010 2:54 pm

Dave Springer says:
August 27, 2010 at 2:21 pm
“Actually Paul, I just noticed I put the wrong link in to the Voyager RTG materials degradation study so you didn’t actually read it before concluding it wasn’t thorough. ”
I did in fact read the pdf you linked, which both described the procedure and gave the predicted and actual power curves. I did not conclude that it wasn’t thorough; on the contrary, I said that it was surprisingly accurate, given that there were many effects they could not have included (because the data did not exist). No one could possibly have known the precise effect of cosmic ray exposure on the thermocouple, for example. I have now also checked the study via your corrected link; my previous comments all stand.

Z
August 27, 2010 3:00 pm

kwinterkorn says:
August 27, 2010 at 2:36 pm
Neutrino flux from the sun presumably follows the inverse square law.
Do any of our various spacecraft whizzing around the solar system (or stuck in the snad on Mars) have instruments that could be bent to the purpose of calculating local radioactive decay rates?

The Voyagers (actually singular) are the best candidates. Their downside is the singular bit.
Perhaps we could go the other way, and see if close sun-orbiting spacecraft are showing anything peculiar (apart from a BBQ taste…)

Paul Birch
August 27, 2010 3:08 pm

Dave Springer says:
August 27, 2010 at 2:11 pm
@Paul Birch “the long term effect of cosmic rays on the thermocouple materials, which could easily either increase or decrease the thermoelectric emfs”
“So those could explain either an increase or a decrease in degradation of thermocouples. Easily huh? Try going into a little more detail on how that could easily go either way.”
Thermoelectric emf is a complex materials property. Different alloys, phases and dopings produce different emfs, some higher, some lower. Cosmic rays will change the properties of materials in ways that are very hard to predict; some of those changes will produce materials with higher thermoelectric emfs, some lower. For any given thermocouple combination, it would be a toss-up which way this particular effect would go. Remember, this is a change in addition to the predicted thermal diffusion effects.

Scarlet Pumpernickel
August 27, 2010 3:39 pm

Ok this steps on the toes of everyone else’s research, so let’s discredit them, and bury it just like famous scientists in the past

Dave Springer
August 27, 2010 4:12 pm

http://www.osti.gov/bridge/servlets/purl/481894-QjgVc5/webviewable/481894.pdf
CASSINI RTG ACCEPTANCE TEST RESULTS
RTG PERFORMANCE ON GALILEO AND ULYSSES
Lockheed Martin 5/23/97
Appears to be a smoking gun in figure 8. Galileo RTG power output glitched downwards at the Venus flyby and again on both Earth flybys, then bounced back.
I’d like to see someone dismiss this with something more than vague handwaving about how cosmic rays can both increase and decrease the degradation rate of the thermopile. There was no degradation here, just temporary declines in performance as gravitational assist maneuvers in the inner solar system were accomplished.

Z
August 27, 2010 4:19 pm

Scarlet Pumpernickel says:
August 27, 2010 at 3:39 pm
Ok this steps on the toes of everyone else’s research, so let’s discredit them, and bury it just like famous scientists in the past

Hey! This is the modern era.
Let’s call them names as well…

pwl
August 27, 2010 4:27 pm

So I guess that the alarmists [snip] dreams of a coming 2012 doomsday will have to be put on hold?

Dave Springer
August 27, 2010 4:33 pm

Note also in the Lockheed Martin graph of Galileo RTG output the comparison line with Voyager RTG. It took 8 years for Galileo to reach the orbit of Jupiter. Voyager had no inner-planet gravitational assists and passed the orbit of Jupiter in 18 months and continued outward. Note that both RTGs performed identically for about a year after launch; then Voyager’s performance (the RTG modules in Voyager and Galileo were identical) began to slowly drift away from Galileo’s as Voyager got much, much farther from the sun than Galileo over the same period of time. There’s yet another anomaly that speaks to this not being due to mischaracterized thermopile degradation.
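For scale, a minimal power model (the Pu-238 half-life is solid; the thermopile degradation constant is invented purely for illustration): fuel decay alone sets a floor of roughly 0.8% power loss per year, and any steeper decline has to come from the converter or from something stranger.

    import numpy as np

    HALF_LIFE = 87.74     # years, Pu-238 (well established)
    TAU_TC = 50.0         # years, hypothetical thermocouple degradation scale (invented)

    years = np.arange(0, 21, 5)
    fuel_only = 100.0 * 0.5 ** (years / HALF_LIFE)    # heat available: ~0.79%/yr decline
    with_tc = fuel_only * np.exp(-years / TAU_TC)     # plus the assumed converter degradation

    for t, f, w in zip(years, fuel_only, with_tc):
        print(f"year {t:2d}: fuel-only {f:5.1f}%  fuel+thermopile {w:5.1f}%")

Deviations between curves like these and the telemetry are exactly what is being argued about here.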

Dave Springer
August 27, 2010 4:35 pm

Paul Birch says:
August 27, 2010 at 3:08 pm
Stop handwaving Paul. You explained nothing.

Dave Springer
August 27, 2010 4:52 pm

Further on the Galileo RTG output anomalies during Venus and Earth gravity assist flybys…
The temporary decline in output power is the same for all three flybys. However, Venus has very little magnetic field compared to the earth, so we can rule out planetary magnetic fields and their ability to deflect cosmic rays as a potential cause; it also seems to rule out neutrino flux, as that would be much greater on the Venus flyby. Gravity fields would be about the same, as Earth and Venus have about the same mass.
That leaves us with the strength of the gravitational field very close to the planets, and inertial reference frames. I can’t think of anything else that would change on a flyby. There’s no glitch when Galileo encounters Jupiter, which may or may not be meaningful. It would seem that somehow either the gravitational field and/or the inertial reference frame is responsible for either the change in performance of the thermopile or the decay rate of the Pu-238.

Dave Springer
August 27, 2010 5:57 pm

http://www.fas.org/nuke/space/gphs.pdf
See figure 13 above. This is a graph of the comparative performance of all Pu-238 RTGs flown as of 2006.
Two on earth orbiting satellites (LES 8 and 9), two on super long distance probes (Voyager I and II), and two on medium distance missions to Jupiter (Galileo and Ulysses). Ulysses was a very short mission, as it looped around Jupiter to get out of the ecliptic and raced back to the inner system to get a polar view of the sun.
Note how, without exception, the closer to the sun the less power the RTG delivered.
You can see some flattening in the power decline rate on Ulysses as it looped out around Jupiter.
At first blush I might blame this on the cold sink side of the thermopile changing with distance from the sun, but the fins are so highly mirrored that incident light from the sun should have no significant effect. As well, some of the spacecraft rotate (Galileo) and some do not (Voyager), which appears to make no difference. There was some speculation that the position anomalies are related to rotating or not rotating, as the non-rotating craft could have gas vents and heat sources that were pointed in a constant direction and could thus affect velocity.

Dave Springer
August 27, 2010 6:34 pm

Paul Birch says:
August 27, 2010 at 3:08 pm
Can you give me a single example of cosmic rays improving the performance of a thermopile?
There would be a lot of money in it for you. RTGs are bloody expensive, with tens of kilograms of Pu-238 fuel in them. The US had to spin up a billion dollar Pu-238 process just to get enough to power more recent missions, especially Cassini, as it had more Pu-238 in it than several previous missions combined. The weight of the RTG was 18% of the Cassini payload. So if there’s any possible way to irradiate the RTGs into better performance give JPL a call (get an NDA first) and make yourself rich. Otherwise I think you should just continue waving your hands around on blogs making claims that have no basis in reality.

Dave Springer
August 27, 2010 7:11 pm

Paul Birch says:
July 23, 2010 at 11:36 am
An “insignificant effect” is one where you don’t even know its sign.

Okay, I’ll buy that.

Paul Birch says:
August 27, 2010 at 1:34 pm
“the long term effect of cosmic rays on the thermocouple materials, which could easily either increase or decrease the thermoelectric emfs”

So you’re saying the delta emf could be positive or negative due to cosmic rays.
I guess that makes them insignificant, huh? rofl

Jim
August 27, 2010 8:33 pm

Wait! I’ve got it!! It’s DARK PHOTONS!!!

August 27, 2010 9:18 pm

The Total Idiot,
I like the name. I hope you keep it. 😉

anticlimactic
August 27, 2010 11:38 pm

I certainly hope that radioactive decay is a particle interaction, whatever the particle is. In nuclear reactors it is thermal neutrons which trigger fission, so a particle causing decay is not absurd.
The problem is worse if it is NOT a particle interaction. An atom may be stable for millions of years and then suddenly decay. Why? Most, if not all, of science is based on cause and effect; here we have an effect and no clue as to the cause. What is the condition that arises in an atom to cause it to decay if there is no external influence? Saying ‘It just happens’ is not science. Believing that ‘It just happens’ is sufficient and should not be challenged is religion.
If the particle does originate from the Sun, then the speed of radioactive decay would increase closer to the Sun. It would be interesting to know what effect this would have on the evolution of Venus.

Paul Birch
August 28, 2010 4:15 am

Dave Springer says:
August 27, 2010 at 4:35 pm
“Stop handwaving Paul. You explained nothing.”
I explained why the prediction was as accurate as could reasonably be expected. It is apparent that the authors themselves did not expect anything better, and were unsurprised by the residual error. I explained why that error could have been in either direction. I explained how, for example, the marginal effect of cosmic rays on these thermopiles was not known, and why it could not have been known beforehand. That they would degrade over time was not in question; how much they would do so was inescapably uncertain, due to the inherent complexity of the materials phenomena.
It has become evident to me that you have a strong emotional desire to believe that the behaviour of these power generators is anomalous and inexplicable in terms of conventional science, while paradoxically relying upon the endless capacity of that science to make highly accurate predictions of the performance of very complex devices. Presumably it has been made a lynchpin of some fringe pseudo-science theory you’re a fan of.
I’ve some news for you: in this sort of engineering, glitches are the rule, not the exception. It would be more surprising if there weren’t any.

Joe Lalonde
August 28, 2010 4:18 am

I love 100% garbage!
NOT A SINGLE ROTATION!

Vince Causey
August 28, 2010 7:02 am

Anna v, I am confused when you write “The same is true of the velocity of light, another so called constant, that is only locally constant.”
Perhaps I am misunderstanding your post, but the speed of light is a universal constant, and it is the variable rate of time itself that keeps the speed of light the same for every observer. Even gravity doesn’t change the measured speed of light, but shifts the wavelength and reduces its energy (in redshift).

johnnythelowery
August 28, 2010 7:35 am

Mac says:
August 27, 2010 at 8:57 am
Good stuff Mac. V. interesting, with good explanations to keep the uninitiated in the loop. Never knew this neutrino issue existed, and I doubt Anna would agree that there is even a ‘Neutrino Issue’. Everything was looking good; then Peter Sturrock hijacks a UFO and lands it in the middle of the playing field for us all to crawl over!

anna v
August 28, 2010 8:15 am

Vince Causey says:
August 28, 2010 at 7:02 am
When the spacetime topography is changing, what is the definition of a meter and a second?
http://edu-observatory.org/physics-faq/Relativity/SpeedOfLight/speed_of_light.html
In general relativity, the appropriate generalisation is that the speed of light is constant in any freely falling reference frame (in a region small enough that tidal effects can be neglected). In this passage, Einstein is not talking about a freely falling frame, but rather about a frame at rest relative to a source of gravity. In such a frame, the speed of light can differ from c, basically because of the effect of gravity (spacetime curvature) on clocks and rulers.
Speaking hypothetically, the interaction of a graviton with a proton or a photon happens in a non-inertial frame, is all I am saying (and it is a hypothesis).
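For reference, the standard result behind that FAQ passage (textbook general relativity, not part of the hypothesis above): a radially moving light ray in Schwarzschild coordinates has coordinate speed

    \frac{dr}{dt} = c\left(1 - \frac{2GM}{rc^{2}}\right)

so a distant observer at rest in the field assigns light a speed below c, while any local, freely falling observer still measures exactly c.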

August 28, 2010 4:28 pm

I would like to wade into this subject, but not to a great depth.
I’m sure most, if not all, of the readers of this blog understand (far better than I) the physics of a LASER. The photon, which is ‘nothing’ but electromagnetic radiation, stimulates the emission of another photon having the same energy and phase. Of course, the electron emitting the photon loses the appropriate energy. The electron, by virtue of losing energy, ‘feels the effect’ of the passing photon, but does not interact with the stimulating photon.
I would postulate (assuming the observations under discussion prove to be real) that the neutrino, having a small but non-zero mass, may either be exerting a large, but very localized, gravitational field, due to its velocity, or a new type of field altogether – we really don’t have that much knowledge or data concerning masses traveling close to the speed of light! When this field passes by an atom on the verge of decaying, it disrupts the process, and delays the decay. I think this would fit into existing theories of physics without upsetting too many apple carts.

tobyglyn
August 28, 2010 7:04 pm

Vince Causey says:
August 28, 2010 at 7:02 am
“… the speed of light is a universal constant, and it is the variable rate of time itself that keeps the speed of light the same for every observer. Even gravity doesn’t change the measured speed of light, but shifts the wavelength and reduces its energy (in redshift).”
Making light of constants. 🙂
“A team of researchers from the Ecole Polytechnique Fédérale de Lausanne (EPFL) has successfully demonstrated, for the first time, that it is possible to control the speed of light – both slowing it down and speeding it up – in an optical fiber, using off-the-shelf instrumentation in normal environmental conditions. Their results, to be published in the August 22 issue of Applied Physics Letters, could have implications that range from optical computing to the fiber-optic telecommunications industry.
On the screen, a small pulse shifts back and forth – just a little bit. But this seemingly unremarkable phenomenon could have profound technological consequences. It represents the success of Luc Thévenaz and his fellow researchers in the Nanophotonics and Metrology laboratory at EPFL in controlling the speed of light in a simple optical fiber. They were able not only to slow light down by a factor of three from its well-established speed c of 300 million meters per second in a vacuum, but they’ve also accomplished the considerable feat of speeding it up – making light go faster than the speed of light.”
http://scienceblog.com/light.html

jtom
August 28, 2010 10:39 pm

“http://scienceblog.com/light.html”
Wow, whoever wrote that article has no idea at all of index of refraction, speed of light in optical fiber, or what was significant about the research in question. Makes me wonder what other misinformation is sitting out there on such blogs.
The phrase, “the speed of light is a constant,” is taken out of context. The correct wording is, “the speed of light IN A VACUUM is a constant.” The speed of light is dependent upon the medium it is traveling in, and can be derived from the medium’s index of refraction. The speed of light in typical optical fiber is approximately 2/3 that of a vacuum (i.e., space). It has also long been known that some materials have indexes of refraction whose values are such that the speed of light is faster in the materials than in a vacuum.
What is significant about the research was the degree to which the signal was slowed. Unfortunately Stimulated Brillouin Scattering (SBS) also causes a frequency shift, which may limit its usefulness.
By applying an electric field across some materials, you can make small changes in their index of refraction, thus slowing or increasing the speed of light through the material. However, you can only achieve a very small delay in a reasonably sized electro-optic device. When we needed to delay a signal to a greater degree, we increased the optical path, i.e., added a coil of optical fiber. Of course, that’s not realistic if you want a chip-based device, and that’s where SBS may have a place.
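For concreteness, the sizes involved in the coil trick (a sketch assuming a typical silica core index of n ≈ 1.468; actual fibers vary):

    C_VACUUM = 299792458.0          # m/s, exact by definition
    n_fiber = 1.468                 # typical silica core index (assumed, varies by fiber)

    v = C_VACUUM / n_fiber          # ~2.04e8 m/s, roughly 2/3 of c
    delay_per_km = 1000.0 / v       # ~4.9e-6 s of delay per kilometer of fiber
    meters_per_us = 1e-6 * v        # ~204 m of fiber buys one microsecond of delay

    print(f"v = {v:.3e} m/s, delay = {delay_per_km * 1e6:.2f} us/km, {meters_per_us:.0f} m per us")

About 200 m of coiled fiber per microsecond of delay is why chip-scale alternatives like SBS are attractive.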

Jarek Duda
August 29, 2010 4:44 am

There is also other news strongly suggesting that we should revise nuclear physics:
http://www.physorg.com/news202020721.html
So maybe nuclei are not just blurry and fluctuating, as in the QM picture, but rather have some concrete spatial structure near a (local?) energy minimum (analogously to protein folding)?
These energy minima can have different shapes – depth, width – and so different statistical dependence on energy carriers like the Sun’s neutrinos; comparing such behavior across different isotopes could become a basic tool for finally understanding the structure of nuclei …

anna v
August 29, 2010 5:53 am

Jarek Duda says:
August 29, 2010 at 4:44 am
There is also other news strongly suggesting that we should revise nuclear physics:
Revising models, as this random matrix model might need revision, is not the same as destroying the size of coupling constants. Neutrinos cannot enter any induced radiation model because they would be accompanied by a factor of ten to the minus twelve, that is 1 divided by 1000000000000. The interaction would be so minuscule that it would be undetectable, let alone give effects of 0.2%

Jarek Duda
August 29, 2010 8:01 am

You are referring to situations in which the neutrino is absorbed … what if it could only help the nucleus to get out of its energy minimum – tunneling – only inducing the decay, without losing its own energy?
Do standard detection methods exclude such scenarios? Do they say anything about their statistics?
And generally I was told that temperature and pressure influence decay times: http://www.scienceforums.net/topic/40163-can-we-be-sure-that-decay-times-are-constant/
The constancy of particle decay times looks to be only an approximation: a practical idealization of a process we don’t really understand.

August 29, 2010 11:15 am

Jarek Duda says:
August 29, 2010 at 8:01 am
“You are referring to situations in which the neutrino is absorbed … what if it could only help the nucleus to get out of its energy minimum – tunneling – only inducing the decay, without losing its own energy?
Do standard detection methods exclude such scenarios? Do they say anything about their statistics?”
Anna seems not to be giving sufficient attention to the distinction between induced and stimulated decays. An induced decay is one where the neutrino is absorbed; this is the process used in neutrino detectors. A stimulated decay is analogous to a laser; the stimulating particles are not absorbed. Both processes, under the standard theory, have the same coupling constant in their probabilities, so the cross-sections, for a single neutrino, will be of the same order of magnitude. However, stimulated decay is proportional to the square of the number of simultaneously interacting particles (the wave functions add in phase – thus by amplitude not intensity). This is how synchrotron emission and free electron lasers work. Which means that a sufficiently intense beam could increase the decay probabilities by many orders of magnitude – perhaps even to the 0.2% level reported.
The question then is whether the solar neutrinos can be considered coherently simultaneous or only sequential, and this in turn hangs on their effective size or wavelength. For the most common energies (up to 400 keV) this would be down to ~3e-12 m. Now if the neutrino flux is ~3% of the total solar flux (i.e. ~40 W/m2), this would mean ~6e14 neutrinos/m2/s (at 400 keV), or ~2e6 neutrinos/m3. Their mean separation would then be of order a centimeter, very much greater than their wavelength. Thus the beam is “sparse”, and nuclei “see” only a single neutrino at a time, unless solar neutrinos are very strongly clustered, or some weird quantum entanglement makes them act as one over greater distances (I don’t know how that could work, but I wouldn’t want to rule it out absolutely).
There is however independent evidence that the solar neutrinos do not form such an interaction-enhancing coherent beam; if they did, then the induced decays by which we detect them would also be enhanced. The solar neutrino problem would then have been the other way round: too many events detected, not too few.
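Spelling that arithmetic out (same assumptions as above: 400 keV neutrinos carrying ~3% of the solar constant, and a photon-like wavelength λ ≈ hc/E, valid since E ≫ mc² for neutrinos):

    H = 6.626e-34    # J s, Planck constant
    C = 3.0e8        # m/s, speed of light
    EV = 1.602e-19   # J per eV

    E = 400e3 * EV                          # 400 keV, a common solar neutrino energy
    wavelength = H * C / E                  # ~3.1e-12 m (E >> mc^2, so lambda ~ hc/E)

    flux_power = 0.03 * 1361.0              # W/m^2, assuming ~3% of the solar constant
    flux_rate = flux_power / E              # ~6e14 neutrinos / m^2 / s
    density = flux_rate / C                 # ~2e6 neutrinos / m^3 in flight
    separation = density ** (-1.0 / 3.0)    # ~8 mm mean spacing, far above the wavelength

    print(f"wavelength ~{wavelength:.1e} m, mean separation ~{separation:.1e} m")

Roughly nine orders of magnitude separate the wavelength from the mean spacing, which is why the beam counts as “sparse” here.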

anna v
August 29, 2010 11:42 am

Jarek Duda says:
August 29, 2010 at 8:01 am
Any interaction, absorption or scattering or what have you, of a neutrino with another particle or field has to be mediated by the weak interaction or the gravitational interaction. This introduces an extra weak vertex in the calculation of the cross section, which is what will give the factor 10^-12. Gravitation is much worse; it would introduce a factor of 10^-74 or so.
Nothing is excluded; it is just that it is very, very much too small to give a measurable effect of 0.2%.

Pascvaks
August 30, 2010 3:53 am

Very interesting indeed!
PS: Has Big Al come out and said what the truth of all this really is? Ya know, the science isn’t settled and the game ain’t over til’ The Fat Boy sings!

August 30, 2010 4:29 am

anna v says:
August 29, 2010 at 11:42 am
“Any interaction, absorption or scattering or what have you, of a neutrino with another particle or field has to be mediated by the weak interaction or the gravitational interaction .”
Only if those are the only interactions of which neutrinos are capable, which we do not and cannot know. All we know is that those are the only interactions seen within the regimes so far investigated (and the only ones our current theories envisage). Any time we enter a new and untested regime the possibility that some previously unobserved interactions will be manifest cannot be excluded. Experimental test of the theory in each new regime is essential.
Now, since stronger effects are usually discovered before weaker effects, it is on the face of it rather unlikely that any new neutrino interactions will prove so much stronger than those already known that they could explain this reported effect; nevertheless, science is full of seemingly unlikely discoveries. It could be, for example, that the neutrinos mediating the effect are a very different type of neutrino from the ordinary ones normally detected – perhaps some sort of gravitino or heavy neutrino – or some other particle altogether. This is, of course, sheer speculation, but the possibility of such weird and wonderful particles existing outside the standard model is a fairly common speculation in particle physics.

August 30, 2010 4:45 am

Note: if perchance there already does exist experimental evidence relating to this regime, which would conclusively rule out mediation by solar neutrinos (or other particles), I should be interested to learn of it. So far as I am aware, all the other evidence (apart from the previously mentioned observations of solar neutrinos) relates to significantly different situations, which are not directly comparable.

johnnythelowery
August 30, 2010 5:57 am

Third attempt to post this article about THORIUM
‘…..There is no certain bet in nuclear physics but work by Nobel laureate Carlo Rubbia at CERN (European Organization for Nuclear Research) on the use of thorium as a cheap, clean and safe alternative to uranium in reactors may be the magic bullet we have all been hoping for, though we have barely begun to crack the potential of solar power. Dr Rubbia says a tonne of the silvery metal – named after the Norse god of thunder, who also gave us Thor’s day or Thursday – produces as much energy as 200 tonnes of uranium, or 3,500,000 tonnes of coal. A mere fistful would light London for a week. Thorium eats its own hazardous waste. It can even scavenge the plutonium left by uranium reactors, acting as an eco-cleaner. “It’s the Big One,” said Kirk Sorensen, a former NASA rocket engineer and now chief nuclear technologist at Teledyne Brown Engineering. “Once you start looking more closely, it blows your mind away. You can run civilisation on thorium for hundreds of thousands of years, and it’s essentially free. You don’t have to deal with uranium cartels,” he said.
Thorium is so common that miners treat it as a nuisance, a radioactive by-product if they try to dig up rare earth metals. The US and Australia are full of the stuff. So are the granite rocks of Cornwall. You do not need much: all is potentially usable as fuel, compared to just 0.7pc for uranium. After the Manhattan Project, US physicists in the late 1940s were tempted by thorium for use in civil reactors. It has a higher neutron yield per neutron absorbed. It does not require isotope separation, a big cost saving. But by then America needed the plutonium residue from uranium to build bombs.
“They were really going after the weapons,” said Professor Egil Lillestol, a world authority on the thorium fuel-cycle at CERN. “It is almost impossible to make nuclear weapons out of thorium because it is too difficult to handle. It wouldn’t be worth trying.” It emits too many high gamma rays. You might have thought that thorium reactors were the answer to every dream, but when CERN went to the European Commission for development funds in 1999-2000, they were rebuffed.
Brussels turned to its technical experts, who happened to be French because the French dominate the EU’s nuclear industry. “They didn’t want competition because they had made a huge investment in the old technology,” he said.
Another decade was lost. It was a sad triumph of vested interests over scientific progress. “We have very little time to waste because the world is running out of fossil fuels. Renewables can’t replace them. Nuclear fusion is not going to work for a century, if ever,” he said. The Norwegian group Aker Solutions has bought Dr Rubbia’s patent for the thorium fuel-cycle, and is working on his design for a proton accelerator at its UK operation. Victoria Ashley, the project manager, said it could lead to a network of pint-sized 600MW reactors that are lodged underground, can supply small grids, and do not require a safety citadel. It will take £2bn to build the first one, and Aker needs £100mn for the next test phase. The UK has shown little appetite for what it regards as a “huge paradigm shift to a new technology”. Too much work and sunk cost has already gone into the next generation of reactors, which have another 60 years of life.
So Aker is looking for tie-ups with the US, Russia, or China. The Indians have their own projects – none yet built – dating from days when they switched to thorium because their weapons programme prompted a uranium ban. America should have fewer inhibitions than Europe in creating a leapfrog technology. The US allowed its nuclear industry to stagnate after Three Mile Island in 1979. Anti-nuclear neurosis is at last ebbing. The White House has approved $8bn in loan guarantees for new reactors, yet America has been strangely passive. Where is the superb confidence that put a man on the moon? A few US pioneers are exploring a truly radical shift to a liquid fuel based on molten-fluoride salts, an idea once pursued by US physicist Alvin Weinberg at Oak Ridge National Lab in Tennessee in the 1960s. The original documents were retrieved by Mr Sorensen. Moving away from solid fuel may overcome some of thorium’s “idiosyncrasies”. “You have to use the right machine. You don’t use diesel in a petrol car: you build a diesel engine,” said Mr Sorensen. Thorium-fluoride reactors can operate at atmospheric pressure. “The plants would be much smaller and less expensive. You wouldn’t need those huge containment domes because there’s no pressurized water in the reactor. It’s close-fitting,” he said.
Nuclear power could become routine and unthreatening. But first there is the barrier of establishment prejudice. When Hungarian scientists led by Leo Szilard tried to alert Washington in late 1939 that the Nazis were working on an atomic bomb, they were brushed off with disbelief. Albert Einstein interceded through the Belgian queen mother, eventually getting a personal envoy into the Oval Office.
Roosevelt initially fobbed him off. He listened more closely at a second meeting over breakfast the next day, then made up his mind within minutes. “This needs action,” he told his military aide. It was the birth of the Manhattan Project. As a result, the US had an atomic weapon early enough to deter Stalin from going too far in Europe.
The global energy crunch needs equal “action”. If it works, Manhattan II could restore American optimism and strategic leadership at a stroke: if not, it is a boost for US science and surely a more fruitful way to pull the US out of perma-slump than scattershot stimulus. Even better, team up with China and do it together, for all our sakes……………’
—————————————————————————————

johnnythelowery
August 30, 2010 7:10 am

I really think this sounds interesting as an alternative move away from the oil economy, even though the plants need the CO2. The article is from the Telegraph; the relevant excerpts are in my previous comment.
Wow.

George E. Smith
August 30, 2010 10:38 am

“”” jtom says:
August 28, 2010 at 10:39 pm
“http://scienceblog.com/light.html”
Wow, whoever wrote that article has no idea at all of index of refraction, speed of light in optical fiber, or what was significant about the research in question. Makes me wonder what other misinformation is sitting out there on such blogs.
The phrase, “the speed of light is a constant,” is taken out of context. The correct wording is, “the speed of light IN A VACUUM is a constant.” The speed of light is dependent upon the medium it is traveling in, and can be derived from the medium’s index of refraction. The speed of light in typical optical fiber is approximately 2/3 that of a vacuum (i.e., space). It has also long been known that some materials have indexes of refraction whose values are such that the speed of light is faster in the materials than in a vacuum. “””
Well, don’t you also have to be careful as to what velocity you are specifying? It is the “group” velocity of light in a vacuum (c) that is constant, and that also now has an exact value. The “phase” velocity is what can exceed the group velocity.
The simplest demonstration is to imagine an ocean wave approaching a sea wall head on, so that the crest of the wave hits every point on the wall at the same instant of time, probably creating a big wash along the whole wall that goes up and over the wall.
Now tilt the wave off the normal to the wall, by some small angle (say one degree). The wave still approaches the wall at exactly the same velocity, but now the contact of the wave crest with the wall runs along the wall at many times the actual velocity of the wave.
To an observer watching the high point of the wave as it appears to race along the wall, he sees the wave crest running along the wall at 57.3 times the actual velocity of the wave.
Well of course nothing is moving along the wall at that speed; the water that forms the crest a mile down the wall is not the same water that hit the wall down the other end just a fraction of a second ago. No information is propagating along the wall at that enhanced speed.
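The 57.3 is just 1/sin(1°); a quick check of the geometry (nothing assumed beyond the sketch above):

    import numpy as np

    v_wave = 1.0    # wave speed toward the wall (arbitrary units)
    # theta = tilt of the propagation direction away from the wall normal
    for theta_deg in (90.0, 30.0, 10.0, 1.0):
        sweep = v_wave / np.sin(np.radians(theta_deg))
        print(f"tilt {theta_deg:4.1f} deg: contact point sweeps the wall at {sweep:7.1f} x wave speed")

As the tilt goes to zero (a square-on wave), the sweep speed diverges, yet nothing physical travels along the wall that fast.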

CRS, Dr.P.H.
August 30, 2010 12:26 pm

Detailed story about this on Fermilab’s webzine:
http://www.symmetrymagazine.org/breaking/2010/08/23/the-strange-case-of-solar-flares-and-radioactive-elements/
Comments are always interesting; these are some really sharp folks! Theoretical physicists, etc.

jimgineer
September 4, 2010 5:55 am

If radioactive decay rates decrease, then the radiation given off by the core of a nuclear weapon would decrease. The core’s critical mass would increase, and more/denser core material would be required for warhead detonation. What if the force that can decrease radioactive decay could also increase it? How much of an increase would it require for a nuclear weapon to spontaneously reach critical mass?

Alexander Keur
December 29, 2010 2:04 pm

That would be a nice movie script indeed. 8)