Getting out of the solar core, neutrinos are speed demons, photons are slugs. h/t to Leif Svalgaard for the graphical annotation inspiration. Solar core image from NASA.
In August, WUWT carried a rather shocking story: some physicists published claims that they had detected a variation in earthly radioactive decay rates. That would be big news by itself, but the shocker is that they attributed it to solar neutrinos.
The findings attracted immediate attention because they seemed to violate two known basic facts of physics:
1. Radioactive decay rates are constant
2. Neutrinos very rarely interact with matter and are hard to detect when they do.
For example: trillions of neutrinos are zipping through your body right now. So why would they interact with radioactive elements in any detectable way?
WUWT carried a follow-up story citing doubts. Now there’s experimental confirmation that the initially reported effect was likely a fluke.
From the NIST: Research Shows Radiometric Dating Still Reliable (Again)
Recent puzzling observations of tiny variations in nuclear decay rates have led some to question the science of using decay rates to determine the relative ages of rocks and organic materials. Scientists from the National Institute of Standards and Technology (NIST), working with researchers from Purdue University, the University of Tennessee, Oak Ridge National Laboratory and Wabash College, tested the hypothesis that solar radiation might affect the rate at which radioactive elements decay and found no detectable effect.
Atoms of radioactive isotopes are unstable and decay over time by shooting off particles at a fixed rate, transmuting the material into a more stable substance. For instance, half the mass of carbon-14, an unstable isotope of carbon, will decay into nitrogen-14 over a period of 5,730 years. The unswerving regularity of this decay allows scientists to determine the age of extremely old organic materials—such as remains of Paleolithic campfires—with a fair degree of precision. The decay of uranium-238, which has a half-life of nearly 4.5 billion years, enabled geologists to determine the age of the Earth.
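To make the arithmetic concrete, here is a minimal sketch (mine, not anything from the NIST work) of how a half-life converts a remaining fraction into an age, using the carbon-14 figure above:

```python
import math

def age_from_fraction(remaining_fraction, half_life_years):
    """Solve N(t) = N0 * (1/2)**(t / T_half) for t."""
    return half_life_years * math.log2(1.0 / remaining_fraction)

# A Paleolithic charcoal sample retaining 25% of its original carbon-14
# should be two half-lives old:
print(age_from_fraction(0.25, 5730))  # 11460.0 years
```

The whole method rests on that half-life being a constant, which is exactly what the claimed decay-rate variations would undermine.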
Many scientists, including Marie and Pierre Curie, Ernest Rutherford and George de Hevesy, have attempted to influence the rate of radioactive decay by radically changing the pressure, temperature, magnetic field, acceleration, or radiation environment of the source. No experiment to date has detected any change in rates of decay.
Recently, however, researchers at Purdue University observed a small (a fraction of a percent), transitory deviation in radioactive decay at the time of a huge solar flare. Data from laboratories in New York and Germany also have shown similarly tiny deviations over the course of a year. This has led some to suggest that Earth’s distance from the sun, which varies during the year and affects the planet’s exposure to solar neutrinos, might be related to these anomalies.
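For scale, the orbital effect being invoked is easy to estimate. A rough sketch, assuming a simple 1/r² neutrino flux and Earth’s orbital eccentricity of about 0.0167 (textbook values, not figures from the papers in question):

```python
import math

ECCENTRICITY = 0.0167  # Earth's orbital eccentricity

def relative_flux(day_of_year, perihelion_day=3):
    """Flux relative to the annual mean, for a 1/r^2 law.

    Distance varies roughly as r = a * (1 - e*cos(M)) over the year,
    so a 1/r^2 flux picks up a modulation of about +/- 2e.
    """
    m = 2 * math.pi * (day_of_year - perihelion_day) / 365.25  # mean anomaly
    r_over_a = 1 - ECCENTRICITY * math.cos(m)
    return 1 / r_over_a**2

print(relative_flux(3))    # ~1.034 at perihelion (early January)
print(relative_flux(185))  # ~0.967 at aphelion (early July)
```

So the solar neutrino flux at Earth swings by several percent over the year, and the question is whether the fraction-of-a-percent decay anomalies track it.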
Researchers from NIST and Purdue tested this by comparing radioactive gold-198 in two shapes, spheres and thin foils, with the same mass and activity. Gold-198 releases neutrinos as it decays. The team reasoned that if neutrinos are affecting the decay rate, the atoms in the spheres should decay more slowly than the atoms in the foil because the neutrinos emitted by the atoms in the spheres would have a greater chance of interacting with their neighboring atoms. The maximum neutrino flux in the sample in their experiments was several times greater than the flux of neutrinos from the sun. The researchers followed the gamma-ray emission rate of each source for several weeks and found no difference between the decay rate of the spheres and the corresponding foils.
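A minimal sketch of the comparison the team describes: fit a decay curve to each gamma-count series and compare the fitted half-lives. The data below are synthetic, generated with the accepted gold-198 half-life; they are not the NIST/Purdue measurements.

```python
import numpy as np

AU198_HALF_LIFE_DAYS = 2.6947  # accepted value for gold-198

def fitted_half_life(times_days, counts):
    """Fit ln(counts) = ln(C0) - lambda*t and return ln(2)/lambda."""
    slope, _intercept = np.polyfit(times_days, np.log(counts), 1)
    return np.log(2) / -slope

# Synthetic example: two sources with identical decay constants,
# as the NIST/Purdue comparison found for spheres vs. foils.
t = np.linspace(0, 21, 200)             # three weeks of counting
lam = np.log(2) / AU198_HALF_LIFE_DAYS
sphere = 1e6 * np.exp(-lam * t)
foil = 8e5 * np.exp(-lam * t)           # different activity, same decay rate

print(fitted_half_life(t, sphere))  # ~2.6947 days
print(fitted_half_life(t, foil))    # ~2.6947 days
```

If self-generated neutrinos changed the decay rate, the two fitted half-lives would differ; the experiment found they did not.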
According to NIST scientist emeritus Richard Lindstrom, the variations observed in other experiments may have been due to environmental conditions interfering with the instruments themselves.
“There are always more unknowns in your measurements than you can think of,” Lindstrom says.
* R.M. Lindstrom, E. Fischbach, J.B. Buncher, G.L. Greene, J.H. Jenkins, D.E. Krause, J.J. Mattes and A. Yue. Study of the dependence of 198Au half-life on source geometry. Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment. doi:10.1016/j.nima.2010.06.270
It might even be amusing to compile a list of current fudge factors:
“missing heat”
neutrinos
dark energy
dark matter
“emergent properties”
misuse of chaos theory
Just for starters. This could be fun.
Steve says:
September 27, 2010 at 9:17 pm
From what anna v and Feht are saying, it would appear that neutrinos are, in a sense, the equivalent of epicycles (not really fair to epicycles, which are observable, if an artifact of perception): fudge factors to sustain the dominant operant paradigm.
Neutrinos are NOT fudge factors. They are very well defined elementary particles, as are all other elementary particles.
All elementary particles are mathematical constructs invented in order to describe measurements as economically as possible.
One cannot see neutrons either. The only reason we know there are neutrons is that they decay fast enough for us to catch some of the products in our detectors, and it was necessary to invent the neutrino rather than say that “energy and momentum conservation are not valid in the microcosm”.
The hypothesis has worked well experimentally, and the whole Standard Model, as studied over the last 20 years, hangs together with remarkable accuracy.
It may be superseded, but will not go away, maybe in the same way that epicycles are still there, if one goes to the geocentric system.
There is a touching faith in the accuracy of these half-life measurements. Nuclear data are constantly being revised: the half-life of Po-209 was revised from 102 years to 115 years earlier this year.
In this particular experiment, I would look very closely at the cross-correlations. A seasonal variation in instrument response could come from air pressure or temperature, for example. Also, the background count correlates with weather: high pressure gives greater shielding from cosmic rays.
Low pressure brings out radon gas from the ground and, if the weather is also bad (which low pressure causes, of course), correlates with workers reducing the ventilation rate of their buildings, trapping the radon daughters inside. It also correlates with rain, which brings down radon daughters from the atmosphere, increasing instrumental backgrounds.
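A toy illustration of the kind of check being suggested: correlate a daily background-count series against barometric pressure and look for the negative correlation that cosmic-ray shielding would produce. Both series here are synthetic, built to have the effect by construction; real data would be noisier.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.corrcoef(x, y)[0, 1]

# Hypothetical daily series: barometric pressure (hPa) with a seasonal
# cycle, and a background count rate that drops when pressure rises
# (more atmosphere overhead means more cosmic-ray shielding).
rng = np.random.default_rng(0)
days = np.linspace(0, 2 * np.pi, 365)
pressure = 1013 + 10 * np.sin(days) + rng.normal(0, 2, 365)
background = 50 - 0.3 * (pressure - 1013) + rng.normal(0, 1, 365)

print(pearson_r(background, pressure))  # strongly negative, by construction
```

A seasonal decay-rate “signal” that cross-correlates this well with pressure or temperature would point at the instruments, not the nucleus.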
Steve says:
September 27, 2010 at 9:20 pm
“It might even be amusing to compile a list of current fudge factors:
“missing heat”
neutrinos
dark energy
dark matter
“emergent properties”
misuse of chaos theory
Just for starters. This could be fun.”
So much fun it makes you want to cry…
“It has been known for millennia that the Earth rests upon the back of a giant turtle. Only in recent centuries has this knowledge been added to. In 1794, in one of the high valleys of the Himalayas, one of the wise was asked, “Master, what does the turtle rest upon?” The Master answered: “It is turtles all the way down, my son.”
But now that scientists have finally succeeded in mapping the universe, a turtle controversy has arisen. It turns out that level 7,484,912 is occupied not by a turtle, but by a man dressed as a turtle. It is not known how this will affect our other equations.”
Quote from Miles Mathis.
Much of science has got itself into a position where it is so attached to its basic tenets that it is having difficulty progressing. Abstract and ever more complex math and computer models have taken over from observation and experiment.
Back in 1928, physicist and Nobel Prize winner Max Born told a group of visitors to Göttingen University, “Physics, as we know it, will be over in six months.” Still no sign that we know everything yet, or are even getting close!
Tenuc says:
September 28, 2010 at 7:28 am
Right!…“Physics, as we know it, will be over in six months.”
It’s over, though some physicists keep wandering around like ghosts lost in a black hole, or entangled in an eleven-dimensional universe, trapped with thousands of strings and slipping down through a wormhole 🙂
Anna V says: “All elementary particles are mathematical constructs invented in order to describe measurements as economically as possible. ”
Yes, very much so. I think the essential difference between your posts and the postings of Alexander Feht comes two sentences later, where you say “it was necessary to invent the neutrino rather than say that ‘energy and momentum conservation are not valid in the microcosm’”.
Well, no…and please do not think I am simply nit-picking for the amusement of it. It was not “necessary” but it was very much “convenient.” Even if we posit that there is in fact, a reality that exists separately from our observation of it, science, (and physics included), does not describe reality itself, but rather describes the models which we have made of reality. Neutrinos — in and of themselves — do not exist any more than gravity, beauty or justice. If a better model of elementary particles is created (and by better, I mean with a more accurate predictive power) which does not include neutrinos, then that particular mathematical construct (pun intended) will be put aside.
Let me use gravity as a creation analogous to neutrinos. F=ma is more of a definition than a model, but once it was accepted, there was the obvious problem that apples (and pretty much everything else) violated the definition. Turn an apple loose and it accelerates toward the ground by itself with no apparent force. What to do? Either give up on F=ma or invent an imaginary force called gravity. The imaginary force was a brilliant piece of work and was used for hundreds of years. Eventually someone came up with a better model that did not include gravity, but used curved space in such a way that the apple moves toward the ground without being tugged by an imaginary “gravity” force.
Neutrinos, gravity, distance, mass, automobiles, shrubbery — they are all provisional memetic constructs used to model the equally provisional constructs of our senses and our equipment. Science seeks the most elegant balance of prediction, completeness, and simplicity from differing models.
Pardon me if I have been either too simplistic or too pedantic, but it is a wonderfully interesting subject.
The neutrino is not a particle. It is a purely theoretical construct invented to fit the theory to observed experimental deviations from that same theory.
In plain English: together with the dark energy and other science fiction, neutrino is one of many repair patches necessary to create in college students’ minds a financially convenient illusion that their professors have at their disposal a system of scientific thought that sufficiently explains reality. Ancient Egyptian priests behaved the same way while lecturing their apprentices, I am sure.
Neutrons are physical objects that can be counted, controlled, traced, generated, and absorbed in every which way.
Nobody has ever observed, counted, controlled, traced, or generated a neutrino.
The difference is manifest.
P.S. The very idea of the neutrino would not be necessary if another axiomatic dogma had not been accepted by “consensus”: namely, that the photon has no mass. If the photon had mass, however infinitesimally small, some artificial, invented elements of the standard model would become unnecessary.
But that’s another heresy that no modern college graduate would tolerate, isn’t it?
Just to be clear, these are the axioms of the paradigm being defended here and supporting evidence:
1. The dynamics of a mechanical system are independent of the manner in which observable quantities are measured. This is expressed mathematically as Local Gauge Invariance. This axiom can actually be used more to distinguish science from art than as an assumption.
2. At low energies, matter and energy exist and propagate as probability-waves but interact as particles. If that were seriously wrong, your computer would not work and you would not be reading this.
3. At low energies, the interactions between particles approach transformations under the U(1), SU(2), and SU(3) groups (you can look them up on Wolfram MathWorld), which change particle types, as well as Lorentz group transformations, which are transfers of mechanical energy. The first three of these groups are thoroughly tested, with every experiment of electrodynamics and nuclear physics supporting them. The fourth is special relativity, supported by the interplay of electric and magnetic fields (without an understanding of which we could not produce electricity, and you would not be reading this) and by the demand that they work in the same universe as mechanics.
4. Again at low energies, the presence of energy (or momentum) curves space-time in a manner consistent with the theory of general relativity. Without this, the communication-satellites that the internet runs on would have their targeting off. Again, you would likely not be reading this.
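The canonical worked example here is the GPS system (a swap for the communication satellites mentioned above, but the same point): a back-of-the-envelope sketch of the two relativistic clock corrections, using standard published values.

```python
import math

C = 2.998e8        # speed of light, m/s
GM = 3.986e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6  # Earth radius, m
R_ORBIT = 2.6561e7 # GPS orbital radius, m (~20,200 km altitude)

v = math.sqrt(GM / R_ORBIT)  # circular orbital speed, ~3.9 km/s

# Special relativity: a moving clock runs slow by about v^2 / (2 c^2).
sr_per_day = -(v**2 / (2 * C**2)) * 86400e6  # microseconds per day

# General relativity: a clock higher in the gravity well runs fast.
gr_per_day = (GM / C**2) * (1 / R_EARTH - 1 / R_ORBIT) * 86400e6

print(sr_per_day)               # ~ -7 microseconds/day
print(gr_per_day)               # ~ +46 microseconds/day
print(sr_per_day + gr_per_day)  # net ~ +38 microseconds/day
```

Left uncorrected, a 38-microsecond daily clock drift at the speed of light corresponds to position errors accumulating at kilometres per day, which is why the satellite clocks are built with the relativistic offset dialed in.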
I’ll give anybody 1000-to-1 odds the paradigm is right, and I don’t gamble.
These constructions are not only explanatory of already-existing observations; they are predictive, and their predictions have worked. Epicycles and similar constructs either are not predictive or have made outright wrong predictions. The paradigm does not imply that we know everything: after all, for most of the “laws” we describe, we know they only work at low energies.
Steve says:
September 27, 2010 at 9:20 pm
“It might even be amusing to compile a list of current fudge factors:
“missing heat”
neutrinos
dark energy
dark matter
“emergent properties”
misuse of chaos theory
We might as well add clear glass because you cannot see that either.
If your glass is so clear that nobody can see it, so light and slippery that nobody can feel or touch it, if it cannot be registered by any instruments despite the enormous effort put into excavating large, very deep underground chambers equipped with extremely sensitive and expensive sensors, and if we all know that you desperately need this glass for your paradigm to hold water, then, obviously, the statement that this glass is a figment of your imagination is much closer to the truth than your holey paradigm.
In short: obfuscation doesn’t make anything clear, Stephen. Save it for dormitory booze-ups.
Alexander Feht says:
September 28, 2010 at 4:18 pm
In plain English: together with the dark energy and other science fiction, neutrino is one of many repair patches necessary to create in college students’ minds a financially convenient illusion that their professors have at their disposal a system of scientific thought that sufficiently explains reality.
Your knowledge of physics is passé, and you underestimate graduate students.
There are neutrino beams from CERN hitting experimental setups at Gran Sasso in Italy. If that is not the definition of a particle, what is?
Again, the very idea of all elementary particles is a mathematical construct to explain our macroscopic measurements. Any particles, neutrons and protons and the explanation of the periodic table included. So?
Jason Calley says:
September 28, 2010 at 1:02 pm
Well, no…and please do not think I am simply nit-picking for the amusement of it. It was not “necessary” but it was very much “convenient.”
It was necessary to invent the neutrino to explain the decay of the neutron without throwing away basic conservation laws; Pauli, who proposed it, called it a “desperate remedy” at the time. In neutron decay, if momentum and energy conservation hold, something has to carry them away to balance the equations.
Otherwise the very fundamental conservation laws coming out of the Lagrangian formulation of mechanics and quantum mechanics would be violated, a proposition much nastier than assigning the missing attributes to an unseen particle.
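The bookkeeping behind that argument is simple to show. A quick sketch of the energy balance in neutron decay, using textbook rest masses (my numbers, not anna v’s):

```python
# Rest masses in MeV/c^2 (standard textbook values)
M_NEUTRON = 939.565
M_PROTON = 938.272
M_ELECTRON = 0.511

# Energy released in n -> p + e- + (anti)neutrino
q_value = M_NEUTRON - M_PROTON - M_ELECTRON
print(q_value)  # ~0.782 MeV

# In a two-body decay (n -> p + e- only), momentum conservation would pin
# the electron to a single energy near this Q-value. The measured electron
# spectrum is instead continuous, from ~0 up to the ~0.782 MeV endpoint,
# so a third, unseen particle must be carrying off the balance.
```

The continuous beta spectrum was the experimental fact that forced the choice: abandon conservation laws, or posit the neutrino.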
Of course there were, and will be, large changes in the paradigm for the microcosm, as you so justly explain with gravity. Nevertheless, F=ma still works in the relevant regime. That is the difference between hand-waving suppositions about how the world works and mathematical mappings of the available data on how the world works. The mappings remain (the epicycles still exist if you plot a geocentric system); they might become obsolete, cumbersome and uninteresting, but not wrong, as Stephen Amsel also notes in his September 28, 2010 at 4:33 pm comment.
Predictions from these mappings might be wrong, and actually this is one way new physics gets signaled: something predicted by the paradigm is not found. The necessary new paradigm will have to include the old one and forge ahead into new areas of prediction.
Anna V.,
Whose knowledge is passé? The five-year-old CERN experiment results that you linked were never confirmed. Muons can be produced by a lot of other interactions. All their attempts to prove that they registered a neutrino failed.
Endless arguments about neutrinos have been going on for more than 70 years now. Were there any positive evidence of the neutrino’s existence, there wouldn’t be any argument.
I don’t want this to become a fruitless squabble. Believe whatever consensus you want to believe, what’s it to me?
Good night.
If one goes to the CERN Document Server and searches for “neutrino beam”, one gets:
Published Articles, 989 records found
Preprints, 592 records found
Theses, 49 records found
Reports, 16 records found
including very recent publications.
Here is a progress report for Gran Sasso:
http://cdsweb.cern.ch/record/1237127/files/EuCARD-CON-2009-014.pdf
“After successfully resolving these issues, CNGS started the physics run in 2008 and has since accumulated 4.2E+19 protons on target. With the present statistics, the first tau-neutrino events are expected by the end of this year’s physics run.”
p.s.
http://www.physorg.com/news194544551.html
May 31, 2010
Researchers on the OPERA experiment at the INFN’s Gran Sasso laboratory in Italy today announced the first direct observation of a tau particle in a muon neutrino beam sent through the Earth from CERN, 730km away.
Jere Jenkins has responded to this thread on Tips & Notes (Jere Jenkins says: September 28, 2010 at 12:19 pm).
Many observations have actually found dark matter. Because it does not directly interact with light (which is actually also why it is so slippery), we must use other means. One such method is measuring its gravitational effect. We have seen the effects of its gravity through many, many telescopes, and at this point there really is nothing else that could explain them. With these observations, Occam’s Razor very strongly favours dark matter. The current search is for Weak-force interactions of low-kinetic-energy dark matter.
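A rough sketch of that gravitational argument, using illustrative, loosely Milky-Way-like numbers (my assumptions, not figures from any particular survey): a flat rotation curve implies enclosed mass growing linearly with radius, far beyond what the visible stars and gas provide.

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30 # kg
KPC = 3.086e19   # metres per kiloparsec

def enclosed_mass(v_ms, r_m):
    """Mass needed inside radius r for circular speed v: M = v^2 r / G."""
    return v_ms**2 * r_m / G

# Flat rotation curve: ~220 km/s still measured at 50 kpc in a galaxy
# whose visible stars and gas sit mostly inside ~15 kpc.
v = 220e3
for r_kpc in (15, 50):
    m = enclosed_mass(v, r_kpc * KPC)
    print(r_kpc, m / M_SUN)  # enclosed mass keeps growing linearly with r

# Keplerian falloff expected if the visible mass were all there is:
m_visible = enclosed_mass(v, 15 * KPC)
v_expected_50 = math.sqrt(G * m_visible / (50 * KPC))
print(v_expected_50 / 1e3)   # ~120 km/s predicted, vs ~220 km/s observed
```

The gap between the predicted Keplerian falloff and the observed flat curve is the missing mass that the dark matter hypothesis accounts for.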
As for neutrinos, we have countless observations of the effects of their bumping into things in ways that nothing else could produce. Just like the clear glass, we cannot “see” them, but we can measure them, and we have done so. Here is a report from one of those deep underground detectors that has seen countless neutrinos:
http://arxiv.org/abs/nucl-ex/0110005
Also, it is not nearly as easy to get an equivalent signal as it seems: to get deep into these detectors, you need something slippery that will not bounce off long before getting very far. Your next-best candidate for that is the neutron, but it has very different kinematics and has Strong-force interactions, so with a little effort the two can be distinguished. Beyond that, we are talking about charged particles, which are easily visible and distinguishable from neutrinos.
In short: Continuing a debate from decades ago by simply ignoring the data which settled it does not make anything clear. Save it for non-scientific topics where that does not happen.