Solar flares are teleconnected to earthly radioactive decay

From Stanford University News, a really wild, must-read science discovery.

h/t to Leif Svalgaard and WUWT reader “carbon-based-life-form”.

The strange case of solar flares and radioactive elements

When researchers found an unusual linkage between solar flares and the inner life of radioactive elements on Earth, it touched off a scientific detective investigation that could end up protecting the lives of space-walking astronauts and maybe rewriting some of the assumptions of physics.

BY DAN STOBER

It’s a mystery that presented itself unexpectedly: The radioactive decay of some elements sitting quietly in laboratories on Earth seemed to be influenced by activities inside the sun, 93 million miles away.

Is this possible?

Researchers from Stanford and Purdue University believe it is. But their explanation of how it happens opens the door to yet another mystery.

There is even an outside chance that this unexpected effect is brought about by a previously unknown particle emitted by the sun. “That would be truly remarkable,” said Peter Sturrock, Stanford professor emeritus of applied physics and an expert on the inner workings of the sun.

The story begins, in a sense, in classrooms around the world, where students are taught that the rate of decay of a specific radioactive material is a constant. This concept is relied upon, for example, when anthropologists use carbon-14 to date ancient artifacts and when doctors determine the proper dose of radioactivity to treat a cancer patient.
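The constancy assumption is what makes radiometric dating arithmetic so simple. As an illustrative sketch (mine, not from the article), here is the textbook carbon-14 age formula in Python:

```python
import math

# Textbook radiocarbon dating assumes the decay rate is a true constant.
HALF_LIFE_C14 = 5730.0                # years
LAMBDA = math.log(2) / HALF_LIFE_C14  # decay constant, per year

def age_from_fraction(remaining_fraction):
    """Infer a sample's age (years) from the fraction of C-14 remaining."""
    return math.log(1.0 / remaining_fraction) / LAMBDA

# A sample retaining half its original C-14 is exactly one half-life old.
print(age_from_fraction(0.5))  # ≈ 5730 years
```

If the decay "constant" wobbled even slightly, every age computed this way would inherit the wobble, which is why the claims below got people's attention.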

Random numbers

But that assumption was challenged in an unexpected way by a group of researchers from Purdue University who at the time were more interested in random numbers than nuclear decay. (Scientists use long strings of random numbers for a variety of calculations, but they are difficult to produce, since the process used to produce the numbers has an influence on the outcome.)

Ephraim Fischbach, a physics professor at Purdue, was looking into the rate of radioactive decay of several isotopes as a possible source of random numbers generated without any human input. (A lump of radioactive cesium-137, for example, may decay at a steady rate overall, but individual atoms within the lump will decay in an unpredictable, random pattern. Thus the timing of the random ticks of a Geiger counter placed near the cesium might be used to generate random numbers.)
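The idea behind that random-number scheme can be sketched in a few lines. The simulation below is my own hypothetical illustration: it stands in for a Geiger counter by drawing exponentially distributed tick intervals, then extracts one bit per pair of intervals.

```python
import random

def decay_bits(n_bits, rate=1.0, seed=42):
    """Turn (simulated) Geiger-counter tick intervals into random bits.

    Each pair of successive inter-arrival times yields one bit:
    1 if the first gap is longer than the second, else 0. By symmetry
    the result is unbiased regardless of the (unknown) decay rate."""
    rng = random.Random(seed)  # stand-in for real detector timings
    bits = []
    while len(bits) < n_bits:
        t1 = rng.expovariate(rate)  # wait for one simulated decay
        t2 = rng.expovariate(rate)  # ...and for the next
        if t1 != t2:                # skip exact ties
            bits.append(1 if t1 > t2 else 0)
    return bits

bits = decay_bits(1000)
print(sum(bits) / len(bits))  # hovers near 0.5
```

The pairwise comparison cancels the rate itself, so the bits stay fair even if the source rate drifts slowly; it was the published rate data, not the bit stream, that revealed the anomaly.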

As the researchers pored through published data on specific isotopes, they found disagreement in the measured decay rates – odd for supposed physical constants.

Checking data collected at Brookhaven National Laboratory on Long Island and at the Federal Physical and Technical Institute in Germany, they came across something even more surprising: long-term observations of the decay rates of silicon-32 and radium-226 seemed to show a small seasonal variation. The decay rate was ever so slightly faster in winter than in summer.
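A seasonal signal of this kind can be pulled out of a decay-rate time series by projecting onto an annual sine/cosine basis. The sketch below uses synthetic data and an illustrative 0.1% amplitude (the real Brookhaven/PTB analyses were considerably more elaborate):

```python
import math, random

def annual_amplitude(days, rates):
    """Estimate the amplitude of a one-year periodicity in a decay-rate
    series by projecting onto an annual cosine/sine basis (valid for
    roughly uniform sampling over whole years)."""
    n = len(rates)
    mean = sum(rates) / n
    w = 2 * math.pi / 365.25
    a = 2.0 / n * sum((r - mean) * math.cos(w * d) for d, r in zip(days, rates))
    b = 2.0 / n * sum((r - mean) * math.sin(w * d) for d, r in zip(days, rates))
    return math.hypot(a, b)

# Synthetic data: a 0.1% seasonal wobble on a unit decay rate, plus noise.
rng = random.Random(0)
days = list(range(1461))  # four years of daily measurements
rates = [1.0 + 1e-3 * math.cos(2 * math.pi * d / 365.25) + rng.gauss(0, 1e-4)
         for d in days]
print(annual_amplitude(days, rates))  # recovers ≈ 0.001
```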

Peter Sturrock, professor emeritus of applied physics - photo L.A. Cicero

Was this fluctuation real, or was it merely a glitch in the equipment used to measure the decay, induced by the change of seasons, with the accompanying changes in temperature and humidity?

“Everyone thought it must be due to experimental mistakes, because we’re all brought up to believe that decay rates are constant,” Sturrock said.

The sun speaks

On Dec 13, 2006, the sun itself provided a crucial clue, when a solar flare sent a stream of particles and radiation toward Earth. Purdue nuclear engineer Jere Jenkins, while measuring the decay rate of manganese-54, a short-lived isotope used in medical diagnostics, noticed that the rate dropped slightly during the flare, a decrease that started about a day and a half before the flare.

If this apparent relationship between flares and decay rates proves true, it could lead to a method of predicting solar flares prior to their occurrence, which could help prevent damage to satellites and electric grids, as well as save the lives of astronauts in space.

The decay-rate aberrations that Jenkins noticed occurred during the middle of the night in Indiana – meaning that something produced by the sun had traveled all the way through the Earth to reach Jenkins’ detectors. What could the flare send forth that could have such an effect?

Jenkins and Fischbach guessed that the culprits in this bit of decay-rate mischief were probably solar neutrinos, the almost weightless particles famous for flying at almost the speed of light through the physical world – humans, rocks, oceans or planets – with virtually no interaction with anything.

Then, in a series of papers published in Astroparticle Physics, Nuclear Instruments and Methods in Physics Research and Space Science Reviews, Jenkins, Fischbach and their colleagues showed that the observed variations in decay rates were highly unlikely to have come from environmental influences on the detection systems.

Reason for suspicion

Their findings strengthened the argument that the strange swings in decay rates were caused by neutrinos from the sun. The swings seemed to be in sync with the Earth’s elliptical orbit, the decay rates oscillating as the Earth came closer to the sun (where it would be exposed to more neutrinos) and then moved farther away.
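The orbital argument is easy to quantify: neutrino flux falls off as 1/r², and the Sun–Earth distance varies by about ±1.7% over the year. A quick check of the expected modulation, using standard orbital numbers (my figures, not the article's):

```python
# Neutrino flux scales as 1/r^2, so Earth's orbital eccentricity
# modulates the flux received over the year.
ECCENTRICITY = 0.0167        # Earth's orbital eccentricity
AU_KM = 1.496e8              # mean Sun-Earth distance, km

r_perihelion = AU_KM * (1 - ECCENTRICITY)  # closest approach, early January
r_aphelion = AU_KM * (1 + ECCENTRICITY)    # farthest, early July

flux_ratio = (r_aphelion / r_perihelion) ** 2
print(flux_ratio)  # ≈ 1.069: roughly 7% more flux in January than July
```

Note that perihelion falls in early January, northern-hemisphere winter, which at least lines up with the faster-in-winter pattern mentioned above.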

So there was good reason to suspect the sun, but could it be proved?

Enter Peter Sturrock, Stanford professor emeritus of applied physics and an expert on the inner workings of the sun. While on a visit to the National Solar Observatory in Arizona, Sturrock was handed copies of the scientific journal articles written by the Purdue researchers.

Sturrock knew from long experience that the intensity of the barrage of neutrinos the sun continuously sends racing toward Earth varies on a regular basis as the sun itself revolves and shows a different face, like a slower version of the revolving light on a police car. His advice to Purdue: Look for evidence that the changes in radioactive decay on Earth vary with the rotation of the sun. “That’s what I suggested. And that’s what we have done.”

A surprise

Going back to take another look at the decay data from the Brookhaven lab, the researchers found a recurring pattern of 33 days. It was a bit of a surprise, given that most solar observations show a pattern of about 28 days – the rotation rate of the surface of the sun.
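Hunting for a periodicity like this is a standard exercise: scan trial periods and keep the one with the most spectral power. The crude periodogram below, run on synthetic data with a hidden 33-day modulation, is only a sketch of the idea; the published analyses used more careful spectral methods.

```python
import math, random

def power(days, values, period):
    """Squared projection onto cos/sin at a trial period: a crude
    periodogram statistic for near-uniform sampling."""
    n = len(values)
    mean = sum(values) / n
    w = 2 * math.pi / period
    c = sum((v - mean) * math.cos(w * d) for d, v in zip(days, values))
    s = sum((v - mean) * math.sin(w * d) for d, v in zip(days, values))
    return (c * c + s * s) / n

# Synthetic decay-rate series with a hidden 33-day modulation plus noise.
rng = random.Random(1)
days = list(range(1000))
values = [1.0 + 5e-4 * math.sin(2 * math.pi * d / 33.0) + rng.gauss(0, 2e-4)
          for d in days]

trial_periods = [p / 2 for p in range(40, 101)]  # scan 20.0 .. 50.0 days
best = max(trial_periods, key=lambda p: power(days, values, p))
print(best)  # the 33-day modulation stands out
```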

The explanation? The core of the sun – where nuclear reactions produce neutrinos – apparently spins more slowly than the surface we see. “It may seem counter-intuitive, but it looks as if the core rotates more slowly than the rest of the sun,” Sturrock said.

All of the evidence points toward a conclusion that the sun is “communicating” with radioactive isotopes on Earth, said Fischbach.

But there’s one rather large question left unanswered. No one knows how neutrinos could interact with radioactive materials to change their rate of decay.

“It doesn’t make sense according to conventional ideas,” Fischbach said. Jenkins whimsically added, “What we’re suggesting is that something that doesn’t really interact with anything is changing something that can’t be changed.”

“It’s an effect that no one yet understands,” agreed Sturrock. “Theorists are starting to say, ‘What’s going on?’ But that’s what the evidence points to. It’s a challenge for the physicists and a challenge for the solar people too.”

If the mystery particle is not a neutrino, “It would have to be something we don’t know about, an unknown particle that is also emitted by the sun and has this effect, and that would be even more remarkable,” Sturrock said.

Chantal Jolagh, a science-writing intern at the Stanford News Service, contributed to this story.

PhilJourdan
August 26, 2010 1:56 pm

300 Comments! WOW! So if this is a repeat, go ahead and delete it.
But is not the “universal” time kept by basically radio-active clocks around the world? Or is it a different aspect?
I am wondering how accurate the clocks are now.

PhilJourdan
August 26, 2010 1:59 pm

tallbloke says:
August 26, 2010 at 5:43 am
Maybe we are rapidly approaching the DUH (Dogmatic Universe Horizon).

Would not that be “Dogbert Universe Horizon”? 😉

Paul Birch
August 26, 2010 2:17 pm

PhilJourdan says:
August 26, 2010 at 1:56 pm
“But is not the “universal” time kept by basically radio-active clocks around the world? Or is it a different aspect?”
Atomic clocks. Different mechanism. Caesium clocks are the most common, and 1 second is 9,192,631,770 periods of the electromagnetic radiation (ie., 9GHz radio waves) corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom. Accuracies down to ~1 part in 10^16 are achieved.

tallbloke
August 26, 2010 3:29 pm

Dave Springer says:
August 26, 2010 at 9:16 am
tallbloke says:
August 26, 2010 at 4:23 am
In fact, they are not observed at all, directly or indirectly, but inferred from a set of assumptions which include the weakest of the fundamental forces being the only one which matters at interstellar scales.
I almost wrote dark matter & energy are “inferred from” observed gravitational anomalies instead of “observed indirectly from”, so I’m not inclined to disagree with you that it was a poor choice of words.

Dave, I wasn’t taking a shot at your choice of words. I was just reinforcing your point, and adding another.

Leif Svalgaard
August 26, 2010 5:13 pm

tallbloke says:
August 26, 2010 at 3:29 pm
In fact, they are not observed at all, directly or indirectly, but inferred from a set of assumptions which include the weakest of the fundamental forces being the only one which matters at interstellar scales.
You conflate the words ‘observed’ and ‘inferred’. We have never observed gravity at the surface of the Sun [27g], but infer it from the Sun’s gravity at the Earth and the radius of the Sun.

anna v
August 26, 2010 9:17 pm

About inferring and observing:
We each of us sit inside a marvelous biologic machine that allows us to define the word “observe” when what we are doing is “infer” from an incredibly complicated system of electromagnetic and chemical inputs; we infer patterns into such a logical order to be able to discuss the observation of the gravity of the sun. The discussion itself comes through the exchange of inferences through the non chaotic complex biological system that we, each of us, are.
It is a matter of convention then, what one means by “observe”, not of absolute definition. A convention that is different between lay persons and scientists.

August 27, 2010 3:02 am

Leif Svalgaard says:
August 26, 2010 at 5:13 pm
“We have never observed gravity at the surface of the Sun [27g], but infer it from the Sun’s gravity at the Earth and the radius of the Sun.”
I might get nitpicky here and say that we have observed gravity at (or close to) the surface of the Sun, by observing the gravitational deflection of starlight passing the Sun’s limb. Of course, there are inferences involved in that calculation too.

tallbloke
August 27, 2010 4:06 am

Leif Svalgaard says:
August 26, 2010 at 5:13 pm
You conflate the words ‘observed’ and ‘inferred’.

No I don’t.
My “observation” (thank you anna v) about ‘observation and inference’ was regarding ‘dark matter’ and interstellar cosmological issues, not the relative surface gravity of the Earth and the Sun. The ‘woolliness factor’ involved in the inference of one is much removed from the other. You are conflating reasonable Newtonian engineering estimates with ad hoc hypotheses designed to save the Big Bang theory.
anna v says:
August 26, 2010 at 9:17 pm
It is a matter of convention then, what one means by “observe”, not of absolute definition. A convention that is different between lay persons and scientists.
I take a cosmologist’s statements beginning with “It is observed that” to mean
“This is what we have calculated, and it may be right, given that we haven’t totally messed up on assumptions A, B, C, D … N.”
I would advise all lay people (who on the whole seem to have more common sense than the average cosmologist anyway) to do the same.

PhilJourdan
August 27, 2010 5:38 am

Paul Birch says:
August 26, 2010 at 2:17 pm

Ok, thanks. I knew the clocks were based on some atomic aspect, just unsure what. It is reassuring to know that I can still set my watch by them. 😉
Very much appreciate the explanation.

Leif Svalgaard
August 27, 2010 7:23 am

tallbloke says:
August 27, 2010 at 4:06 am
I would advise all lay people (who on the whole seem to have more common sense than the average cosmologist anyway) to do the same.
Common sense is a poor guide in physics, where many phenomena defy ‘common sense’, e.g. in quantum mechanics, relativity, even simple things like “heavier things fall faster than light things”.
The assumptions underlying modern cosmology are that the laws of Nature are the same everywhere and that General Relativity holds. There are no indications that these assumptions are incorrect.

son of mulder
August 27, 2010 3:01 pm

” Leif Svalgaard says:
August 27, 2010 at 7:23 am
The assumptions underlying modern cosmology are that the laws of Nature are the same everywhere and that General Relativity holds. There are no indications that these assumptions are incorrect.”
Yet we can’t unify GR & Quantum. This is either because we’ve not cracked the problem yet or maybe one or both is incorrect at some fundamental level.
That would certainly still be consistent with the laws of nature being the same everywhere but not necessarily that GR is correct.
That both Quantum and GR are very successful theories for practical purposes doesn’t make them right.
Because GR & Quantum are formulated so fundamentally differently, and attempts to find more fundamental unifying mathematical structures (e.g. String Theory or Twistor Theory) have not been successful, there must come a point when it is reasonable to look for possible fundamental reasons why not. What are the mathematical assumptions that we make? To me the one that sticks out like a sore thumb is the assumption that space-time is differentiable, the whole basis of Riemannian geometry.
By analogy fine sand can behave like a fluid to some extent but we can see the limitation of the fluid model. The conflict between GR & Quantum may demonstrate such a limitation and open the possibility for the seemingly impossible.

Leif Svalgaard
August 27, 2010 6:36 pm

son of mulder says:
August 27, 2010 at 3:01 pm
Yet we can’t unify GR & Quantum. This is either because we’ve not cracked the problem yet or maybe one or both is incorrect at some fundamental level.
More likely that they have different domains of applicability.
The conflict between GR & Quantum may demonstrate such a limitation and open the possibility for the seemingly impossible.
But cannot be taken as making the seemingly impossible plausible.
There are no indications that QM makes GR inapplicable at intergalactic distances [or even just across the street].

son of mulder
August 28, 2010 12:49 am

Leif Svalgaard says:
August 27, 2010 at 6:36 pm
“More likely that they have different domains of applicability. ”
For most practical purposes yes, but that is no reason to assume they are independent at root.
Leif then says,
“But cannot be taken as making the seemingly impossible plausible.
There are no indications that QM makes GR inapplicable at intergalactic distances [or even just across the street].”
It’s not a question that one make the other inapplicable, we simply assume that say GR is applicable at intergalactic distances and when we perceive inconsistencies with expectation we invent say ‘dark matter’ to resolve the problem. Or with Quantum we introduce renormalization without a reason other than that it gives an observed answer.
Both examples that give a clear indication that we have a hole in understanding of the basic principles of both theories let alone the ability to unify them.

anna v
August 28, 2010 4:02 am

Up to a few years ago there was no way to quantize gravity, i.e. a method that one could use to expand the solutions into a perturbative series with diminishing values and prove that the series was finite and bounded. That is why Feynman diagrams with gravitons do not work. The problem is that the graviton is a spin 2 particle and that introduces infinities.
Strong interactions, because of the coupling constant of 1, also had a problem in series expansion, until QCD with asymptotic freedom was found, which allowed calculations where the Feynman diagrams are not real prescriptions for integrations but just iconic representations.
Then string theories, known theoretically from the 70’s, were discovered to allow the quantization of gravity and at the same time allowed embedding the standard SU3XSU2XU1 model. A large part of the theoretical effort in particle physics is expended in studying and trying to calculate using string theories. Some models predict mini black holes, which decay very fast thermodynamically into a large multiplicity of known particles. Had I not retired and were still working in my last experiment for the LHC, ( it takes decades to plan and build and run an experiment in high energy physics these days, as well as hundreds of people) this would have been my research interest.

Leif Svalgaard
August 28, 2010 8:00 am

son of mulder says:
August 28, 2010 at 12:49 am
when we perceive inconsistencies with expectation we invent say ‘dark matter’ to resolve the problem.
Dark matter was not ‘invented’ because of inconsistencies in GR, but based on observations, e.g. gravitational lensing. That we cannot ‘see’ the matter in light is not important for its discovery; we often deduce [i.e. ‘see’] the existence of unseen companions by their gravitational effects.
The acoustic peaks in the CMB power spectrum [ http://www.leif.org/EOS/PHTOAD000061.pdf ] depend only on freshman-type physics and not on GR. But the main argument is the many cross-checks that give a robust overall, coherent picture.
Of course, if you were to maintain that the laws of Nature vary in unknown ways with location and with time then you may claim that nothing is ever understood. And there I will not follow you.

son of mulder
August 28, 2010 3:25 pm

Leif Svalgaard says:
August 28, 2010 at 8:00 am
“Dark matter was not ‘invented’ because of inconsistencies in GR, but based on observations, e.g. gravitational lensing.”
In fact it was first proposed by Fritz Zwicky in 1933 to account for evidence of “missing mass” in the orbital velocities of galaxies in clusters, i.e. GR predictions didn’t match observation.
Another similar example is the postulation of Dark Energy.
Your comment
“Of course, if you were to maintain that the laws of Nature vary in unknown ways with location and with time then you may claim that nothing is ever understood. And there I will not follow you.”
On that I fully agree with you.
But what we have is the dilemma of whether you postulate the existence of something which we can’t/haven’t demonstrated/detected locally, or whether you modify the mathematical theory to match observations, in line with the principle that physics is the same everywhere.

August 28, 2010 7:47 pm

son of mulder says:
August 28, 2010 at 3:25 pm
whether you modify the mathematical theory to match observations, in line with the principle that physics is the same everywhere.
There has been no modification to the mathematical theory for dark matter or Big Bang. The standard model of fundamental particles is empirically derived.

Don Smith
September 12, 2010 2:28 am

Was this fluctuation real, or was it merely a glitch in the equipment used to measure the decay, induced by the change of seasons, with the accompanying changes in temperature and humidity?
“Everyone thought it must be due to experimental mistakes, because we’re all brought up to believe that decay rates are constant,” Sturrock said.
Why do they need to be constant? We’ve been brought up to believe this because the theory of evolution must be protected at all cost. Time is the holy grail for the theory, a foundation stone. True, we should make sure that the science is sound, but we should not try to bend it to suit a pet theory. The evidence should drive the theory, or dismiss it.
Have we truly put the time in to cross examine these methods of dating? Have all our dated samples been contaminated by an outside element(s)? Does it mean we’ve been using a rubber ruler all this time? What else could affect these dating methods? Just how old is that rock or fossil?
It will be interesting to see how the ‘creation versus evolution’ divide uses this data. Imagine, if this is unanimously confirmed among scientists, what will happen to all those books, articles, media presentations etc.

September 12, 2010 11:37 am

Don Smith says:
September 12, 2010 at 2:28 am
Have we truly put the time in to cross examine these methods of dating?
The effect [even if real] is so small that it will have almost no measurable effect on the dating [this is why it is so hard to prove the effect in the first place].
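That scale argument is easy to check with the age formula itself. Here is a hypothetical worked example (a 0.1% shift in the decay constant is larger than anything reported, so this is an upper-end sketch):

```python
import math

HALF_LIFE_C14 = 5730.0
LAMBDA = math.log(2) / HALF_LIFE_C14  # assumed decay constant, per year

true_age = 10000.0
fraction_left = math.exp(-LAMBDA * true_age)

# Suppose the decay constant were really 0.1% higher than the value assumed.
inferred_age = math.log(1.0 / fraction_left) / (LAMBDA * 1.001)
print(true_age - inferred_age)  # ≈ 10-year error on a 10,000-year date
```

A fractional error in the decay constant maps one-for-one into a fractional error in the inferred age, so a part-per-thousand wobble cannot turn thousands of years into anything qualitatively different.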
