Findings will enhance dark energy experiments at major telescopes
DOE/LAWRENCE BERKELEY NATIONAL LABORATORY

Cosmologists have found a way to double the accuracy of measuring distances to supernova explosions – one of their tried-and-true tools for studying the mysterious dark energy that is making the universe expand faster and faster. The results from the Nearby Supernova Factory (SNfactory) collaboration, led by Greg Aldering of the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), will enable scientists to study dark energy with greatly improved precision and accuracy, and provide a powerful crosscheck of the technique across vast distances and time. The findings will also be central to major upcoming cosmology experiments that will use new ground and space telescopes to test alternative explanations of dark energy.
Two papers published in The Astrophysical Journal report these findings, with Kyle Boone as lead author. Currently a postdoctoral fellow at the University of Washington, Boone is a former graduate student of Nobel Laureate Saul Perlmutter, the Berkeley Lab senior scientist and UC Berkeley professor who led one of the teams that originally discovered dark energy. Perlmutter was also a co-author on both studies.
Supernovae were used in 1998 to make the startling discovery that the expansion of the universe is speeding up, rather than slowing down as had been expected. This acceleration – attributed to the dark energy that makes up two-thirds of all the energy in the universe – has since been confirmed by a variety of independent techniques as well as with more detailed studies of supernovae.
The discovery of dark energy relied on a particular class of supernovae, Type Ia. These supernovae always explode with nearly the same intrinsic maximum brightness. Because the observed maximum brightness of a supernova is used to infer its distance, the small remaining variations in intrinsic maximum brightness limited the precision with which dark energy could be tested. Despite 20 years of improvements by many groups, supernova studies of dark energy have until now remained limited by these variations.

Quadrupling the number of supernovae
The new results announced by the SNfactory come from a multi-year study devoted entirely to increasing the precision of cosmological measurements made with supernovae. Measurement of dark energy requires comparisons of the maximum brightnesses of distant supernovae billions of light-years away with those of nearby supernovae “only” 300 million light-years away. The team studied hundreds of such nearby supernovae in exquisite detail. Each supernova was measured a number of times, at intervals of a few days. Each measurement examined the spectrum of the supernova, recording its intensity across the wavelength range of visible light. An instrument custom-made for this investigation, the SuperNova Integral Field Spectrometer, installed at the University of Hawaii 2.2-meter telescope at Maunakea, was used to measure the spectra.
“We’ve long had this idea that if the physics of the explosion of two supernovae were the same, their maximum brightnesses would be the same. Using the Nearby Supernova Factory spectra as a kind of CAT scan through the supernova explosion, we could test this idea,” said Perlmutter.
Indeed, several years ago, physicist Hannah Fakhouri, then a graduate student working with Perlmutter, made a discovery key to today’s results. Looking at a multitude of spectra taken by the SNfactory, she found that in quite a number of instances, the spectra from two different supernovae looked very nearly identical. Among the 50 or so supernovae, some were virtually identical twins. When the wiggly spectra of a pair of twins were superimposed, to the eye there was just a single track. The current analysis builds on this observation to model the behavior of supernovae in the period near the time of their maximum brightness.
The new work nearly quadruples the number of supernovae used in the analysis. This made the sample large enough to apply machine-learning techniques to identify these twins, leading to the discovery that Type Ia supernova spectra vary in only three ways. The intrinsic brightnesses of the supernovae also depend primarily on these three observed differences, making it possible to measure supernova distances to the remarkable accuracy of about 3%.
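The idea that a large sample of spectra varies in only a few ways can be illustrated with a dimensionality-reduction sketch. The snippet below is a hypothetical stand-in: it uses plain PCA on synthetic spectra built to vary in exactly three ways, whereas the SNfactory analysis used a more sophisticated machine-learned embedding of real data. All arrays and sizes here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sne, n_wavelengths, n_true = 200, 300, 3

# Synthetic spectra that, by construction, vary in only 3 ways,
# plus a small amount of measurement noise.
components = rng.normal(size=(n_true, n_wavelengths))
weights = rng.normal(size=(n_sne, n_true))
spectra = weights @ components + 0.01 * rng.normal(size=(n_sne, n_wavelengths))

# PCA via SVD of the mean-subtracted spectra.
centered = spectra - spectra.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)

# The first 3 components capture essentially all of the variance,
# mirroring the finding that Type Ia spectra vary in only three ways.
print(f"variance in first 3 components: {explained[:3].sum():.3f}")
```

With real supernovae the signal is far messier, which is why a large sample and machine-learning techniques were needed to pull out the three dominant modes of variation.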
Just as important, this new method does not suffer from the biases that have beset previous methods, seen when comparing supernovae found in different types of galaxies. Since nearby galaxies are somewhat different than distant ones, there was a serious concern that such dependence would produce false readings in the dark energy measurement. Now this concern can be greatly reduced by measuring distant supernovae with this new technique.
In describing this work, Boone noted, “Conventional measurement of supernova distances uses light curves – images taken in several colors as a supernova brightens and fades. Instead, we used a spectrum of each supernova. These are so much more detailed, and with machine-learning techniques it then became possible to discern the complex behavior that was key to measuring more accurate distances.”
The results from Boone’s papers will benefit two upcoming major experiments. The first experiment will be at the 8.4-meter Rubin Observatory, under construction in Chile, with its Legacy Survey of Space and Time, a joint project of the Department of Energy and the National Science Foundation. The second is NASA’s forthcoming Nancy Grace Roman Space Telescope. These telescopes will measure thousands of supernovae to further improve the measurement of dark energy. They will be able to compare their results with measurements made using complementary techniques.
Aldering, also a co-author on the papers, observed that “not only is this distance measurement technique more accurate, it only requires a single spectrum, taken when a supernova is brightest and thus easiest to observe – a game changer!” Having a variety of techniques is particularly valuable in this field where preconceptions have turned out to be wrong and the need for independent verification is high.
###
The SNfactory collaboration includes Berkeley Lab, the Laboratory for Nuclear Physics and High Energy at Sorbonne University, the Center for Astronomical Research of Lyon, the Institute of Physics of the 2 Infinities at the University Claude Bernard, Yale University, Germany’s Humboldt University, the Max Planck Institute for Astrophysics, China’s Tsinghua University, the Center for Particle Physics of Marseille, and Clermont Auvergne University.
This work was supported by the Department of Energy’s Office of Science, NASA’s Astrophysics Division, the Gordon and Betty Moore Foundation, the French National Institute of Nuclear and Particle Physics and the National Institute for Earth Sciences and Astronomy of the French National Centre for Scientific Research, the German Research Foundation and German Aerospace Center, the European Research Council, Tsinghua University, and the National Natural Science Foundation of China.

Additional background
In 1998, two competing groups studying supernovae, the Supernova Cosmology Project and the High-z Supernova Search team, both announced they had found evidence that, contrary to expectations, the expansion of the universe was not slowing but becoming faster and faster. Dark energy is the term used to describe the cause of the acceleration. The 2011 Nobel Prize was awarded to leaders of the two teams: Saul Perlmutter of Berkeley Lab and UC Berkeley, leader of the Supernova Cosmology Project, and to Brian Schmidt of the Australian National University and Adam Riess of Johns Hopkins University, from the High-z team.
Additional techniques for measuring dark energy include the DOE-supported Dark Energy Spectroscopic Instrument, led by Berkeley Lab, which will use spectroscopy of 30 million galaxies in a technique called baryon acoustic oscillation. The Rubin Observatory will also use another technique, weak gravitational lensing.
Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory (https://www.lbl.gov/) and its scientists have been recognized with 14 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.
-By Bob Cahn
Wonderful. What a huge advance since Hoyle. Perhaps the marvellous Berkeley could take five minutes off to determine whether CO2 affects the weather.
“Today, Berkeley Lab researchers develop sustainable energy and environmental solutions“. They already know about CO2 and climate, and are working to save us from our addiction to fossil fuels. But of course, that’s where the big money is.
This isn’t according to Hoyle? Well, then, I think the cards are stacked against the Berkeley authors…
Yet another demonstration of dark matter:
Wonder if these twin spectra supernovae are actually the same supernova with light travelling a different route around our toroidal universe.
🙂
So, what if the apparent acceleration of the universe is only dependent on our place and time relative to our observations?
So what place(s) and time(s) do you propose would yield a different observation?
Maybe the apparent acceleration is an artifact of early hyper expansion of the universe. Wouldn’t it be interesting if what was being observed is that older objects were in a younger universe in which hyper expansion had not yet decreased completely?
I had the same thought. Are they not comparing light emitted billions of years ago with light emitted “only” 300 million years ago?
This is really interesting. Much of this dark energy explanation relies on the assertion that all supernovae of this type are alike. If that were not so, brightness wouldn’t be uniform and distance could not be inferred from brightness.
I always thought this (simplistic public) explanation was a weak foundation for such an earth shattering theory. While we assume the laws of physics are everywhere the same, (with some confirmation such as spectral lines), individual events are rarely ever identical.
But according to this summary, the differences in events have been a thorn in the analysis of these ‘identical’ events. And a new technique has been developed that shows the similarities and differences in these explosions.
Not only should this result in more accurate distance measurements, it could be a tool that results in a much better understanding of these supernovae and their differences.
The fact that they are aware of differences yet say expansion is happening gives me a greater assurance that it is real.
Like the apocryphal little girl who grabbed a shovel and started digging saying there is so much manure that there has to be a pony in here somewhere, the presence of a lot of BS in scientific stories tells us that there are one or more bulls fertilizing the field.
Scientists should never be ‘hiding’ the truth about their theories and how the data fits. They may think they are keeping it simple, or that the detail would be too confusing. But they end up being seen hiding obvious truths that potentially contradict their theories.
Type 1A SN are surely influenced by their content of dark matter, which should enhance their gravitational collapse, without contributing to degeneracy pressure. The radiant yield, or brightness of a 1A will therefore depend on how much DM it contains. As DM varies in content of galaxies and is denser in the earlier (more distant) universe, how can 1A SNs serve as standard candles? And how can one assume dark energy and an accelerating universe exist? (And win Nobel prizes?). Of course it might be the case that DM does not exist. The young lady who discovered the spectral similarities may have opened the door to a major advance (Nobel Prize if so?).
Stars consist of baryonic matter, not dark matter.
Non-meteorite impact craters hypothetically attributed to collisions with dark matter objects composed of strange and charm quarks as well as up and down:
I read this article three times, then I read the abstract of the underlying paper (full paper is paywalled). I couldn’t understand why there was no mention of red shift.
I think I grasp what they are trying to say. There seems to be an assumption that supernovae with identical spectra (after adjustment for red shift?) will have identical amplitudes, so that they can use apparent brightness to estimate relative distances. Red shift indicates velocity, and supernova brightness indicates distance, and this is what allowed them to build a model showing that universe expansion is accelerating. This research will allow that model to be refined. That’s my attempt to reconstruct what it’s all about, and I could well be wrong. Cosmology isn’t easy.
The assumption is articulated in the quote: “We’ve long had this idea that if the physics of the explosion of two supernovae were the same, their maximum brightnesses would be the same …” said Perlmutter.
Still, it seems a fairly weak assumption. Unless there is some theoretical reason (far beyond the capacity of my earth-bound Newtonian brain to understand) that all supernovae MUST be equally bright.
I get the impression that the news article writer understood it even less than me.
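The standard-candle logic reconstructed in the comments above can be made concrete with the distance modulus, m − M = 5 log₁₀(d / 10 pc): if all Type Ia supernovae share the same absolute magnitude M, the observed apparent magnitude m gives the luminosity distance directly. A toy sketch, using the commonly quoted Type Ia value M ≈ −19.3 and made-up apparent magnitudes:

```python
M_TYPE_IA = -19.3  # absolute magnitude, assumed identical for every Type Ia

def distance_mpc(apparent_mag: float, absolute_mag: float = M_TYPE_IA) -> float:
    """Luminosity distance in megaparsecs from the distance modulus
    m - M = 5 * log10(d / 10 pc)."""
    d_parsec = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return d_parsec / 1e6

# A fainter (higher-magnitude) supernova is farther away; each 5-magnitude
# difference corresponds to a factor of 100 in flux, i.e. 10 in distance.
print(f"{distance_mpc(18.1):.0f} Mpc")  # a relatively nearby supernova
print(f"{distance_mpc(24.0):.0f} Mpc")  # a much more distant one
```

Comparing these inferred distances against redshifts over a range of distances is what maps out the expansion history; this is why reducing the scatter in the assumed absolute brightness matters so much.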
The article is available for free on arxiv here:
https://arxiv.org/pdf/2105.02676
I wonder if the intervening dark matter attenuates the apparent brightness of a very distant supernova?
By the way, in my spare time I’m working on an all-encompassing theory that explains dark matter and dark energy. It involves turtles.
Dark matter and dark energy are both made up to explain what they cannot explain. Time to revisit their entire theory.
“Dark matter” is the name given to various astrophysical observations taken as evidence that undetected mass must exist in the universe.
Besides Type Ia supernova distance measurements, these include galaxy rotation curves and clusters, the Bullet Cluster (a recent collision of two galaxy clusters), velocity dispersions, gravitational lensing, patterns in the cosmic microwave background radiation, structure formation after the Big Bang, baryonic oscillations in sky surveys, red-shift space distortions, and the Lyman-alpha forest (the sum of absorption lines of neutral hydrogen transitions in the spectra of distant galaxies and quasars).
Various hypotheses exist as to what dark matter might be physically. Some small portion may well be hard-to-detect baryonic matter, but that is presently estimated at most about one sixth. Some guesses have been ruled out by testing predictions, but others remain open after further investigation, such as black holes of a certain size range.
Another, minority explanation involves modifying the standard laws of general relativity to account for observations.
Axions are another hypothesis:
https://www.nationalgeographic.com/science/article/news-admx-dark-matter-detector-physics
Experiments in Seattle will test this guess as to the identity of dark matter, because, dark matter matters!
Since the supposed dark matter does not interact via the electromagnetic force, it would not be able to do that (I think). However, since gravity can bend light and shift its wavelength, there could be those effects, but not simple attenuation of intensity. That’s my guess, anyway. But only if dark matter really exists, of course.
P.S. You might enjoy some of Sabine Hossenfelder’s videos about dark matter and related topics.
She says: but wait! Dark matter is both a particle and revised gravity formulae:
I think all of the critical comments are way off the mark. We need to 1) understand how the accelerating expansion of the universe is anthropogenic, and 2) be able to explain how it’s all Trump’s fault.
Come to think of it, we don’t need to understand or explain anything. We can just assert it, and it will be so.
Dark Energy, Dark Matter, Black Holes. Scientists playing blindman’s bluff with super computers.
Cut off their funding.
How refreshing to read real science. Astronomers actually looking through telescopes at something real. Using spectroscopy to study the light of objects billions of light-years away. Not punching made up numbers in a computer and saying, “Aha.”
“Having a variety of techniques is particularly valuable in this field where preconceptions have turned out to be wrong and the need for independent verification is high.”
Now just what other investigation could this be applied to?
Interesting article, but I’m still trying to process how star distances can be measured at all, given space dust and other interstellar “fog” and “dark” matter that, IMO, make it near impossible to get accurate distance results. I guess a way around this is that all the supernovae studied are in the same tiny patch of the sky, so the space dust will be more likely to affect all these supernovae in the same way… or maybe using wavelengths not affected by space dust (if that’s even a thing). Please, someone smarter than me explain howspace dust is factored out in distance measurements.
Not only howspace dust, but whatspace and whyspace dust as well.
The wherespace dust they got a pretty good handle on I think.