Modeling sunspots during times when few are seen

(h/t to Michael Ronayne)

NCAR

Sunspots Revealed in Striking Detail by Supercomputers

BOULDER—In a breakthrough that will help scientists unlock mysteries of the Sun and its impacts on Earth, an international team of scientists led by the National Center for Atmospheric Research (NCAR) has created the first-ever comprehensive computer model of sunspots. The resulting visuals capture both scientific detail and remarkable beauty.

[Image: flower-like shape; dark center, bright petals]
The interface between a sunspot's umbra (dark center) and penumbra (lighter outer region) shows a complex structure with narrow, almost horizontal (lighter to white) filaments embedded in a background having a more vertical (darker to black) magnetic field. Farther out, extended patches of horizontal field dominate. For the first time, NCAR scientists and colleagues have modeled this complex structure in a comprehensive 3D computer simulation, giving scientists their first glimpse below the visible surface to understand the underlying physical processes.

The high-resolution simulations of sunspot pairs open the way for researchers to learn more about the vast mysterious dark patches on the Sun’s surface. Sunspots are the most striking manifestations of solar magnetism on the solar surface, and they are associated with massive ejections of charged plasma that can cause geomagnetic storms and disrupt communications and navigational systems. They also contribute to variations in overall solar output, which can affect weather on Earth and exert a subtle influence on climate patterns.

The research, by scientists at NCAR and the Max Planck Institute for Solar System Research (MPS) in Germany, is being published this week in Science Express.

“This is the first time we have a model of an entire sunspot,” says lead author Matthias Rempel, a scientist at NCAR’s High Altitude Observatory. “If you want to understand all the drivers of Earth’s atmospheric system, you have to understand how sunspots emerge and evolve. Our simulations will advance research into the inner workings of the Sun as well as connections between solar output and Earth’s atmosphere.”

Ever since outward flows from the center of sunspots were discovered 100 years ago, scientists have worked toward explaining the complex structure of sunspots, whose number peaks and wanes during the 11-year solar cycle. Sunspots encompass intense magnetic activity that is associated with solar flares and massive ejections of plasma that can buffet Earth’s atmosphere. The resulting damage to power grids, satellites, and other sensitive technological systems takes an economic toll on a rising number of industries.

Creating such detailed simulations would not have been possible even as recently as a few years ago, before the latest generation of supercomputers and a growing array of instruments to observe the Sun. Partly because of such new technology, scientists have made advances in solving the equations that describe the physics of solar processes.

The work was supported by the National Science Foundation, NCAR’s sponsor. The research team improved a computer model, developed at MPS, that built upon numerical codes for magnetized fluids that had been created at the University of Chicago.

Computer model provides a unified physical explanation

The new computer models capture pairs of sunspots with opposite polarity. In striking detail, they reveal the dark central region, or umbra, with brighter umbral dots, as well as webs of elongated narrow filaments with flows of mass streaming away from the spots in the outer penumbral regions. They also capture the convective flow and movement of energy that underlie the sunspots, and that are not directly detectable by instruments.

The models suggest that the magnetic fields within sunspots need to be inclined in certain directions in order to create such complex structures. The authors conclude that there is a unified physical explanation for the structure of sunspots in umbra and penumbra that is the consequence of convection in a magnetic field with varying properties.

The simulations can help scientists decipher the mysterious, subsurface forces in the Sun that cause sunspots. Such work may lead to an improved understanding of variations in solar output and their impacts on Earth.

Supercomputing at 76 trillion calculations per second

To create the model, the research team designed a virtual, three-dimensional domain that simulates an area on the Sun measuring about 31,000 miles by 62,000 miles and about 3,700 miles in depth – an expanse as long as eight times Earth’s diameter and as deep as Earth’s radius. The scientists then used a series of equations involving fundamental physical laws of energy transfer, fluid dynamics, magnetic induction and feedback, and other phenomena to simulate sunspot dynamics at 1.8 billion points within the virtual expanse, each spaced about 10 to 20 miles apart. For weeks, they solved the equations on NCAR’s new bluefire supercomputer, an IBM machine that can perform 76 trillion calculations per second.
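As a rough sanity check, the figures quoted above are mutually consistent. A minimal Python sketch (the grid shape and spacings below are assumptions chosen to match the stated totals, not numbers taken from the paper):

```python
# Sanity-check the simulation grid described in the press release.
# Domain: ~31,000 x 62,000 miles horizontally, ~3,700 miles deep;
# assumed spacing of ~20 miles horizontally and ~10 miles vertically
# (the article says "about 10 to 20 miles apart").
width_mi, length_mi, depth_mi = 31_000, 62_000, 3_700
h_spacing_mi, v_spacing_mi = 20, 10

nx = width_mi // h_spacing_mi    # grid points across the width
ny = length_mi // h_spacing_mi   # grid points along the length
nz = depth_mi // v_spacing_mi    # grid points through the depth

total = nx * ny * nz
print(f"{total:,} grid points")  # roughly 1.8 billion, as the article states
```

The point is only that the quoted domain size, spacing, and point count hang together; the actual simulation grid is not specified in the release.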

The work drew on increasingly detailed observations from a network of ground- and space-based instruments to verify that the model captured sunspots realistically.

The new models are far more detailed and realistic than previous simulations that failed to capture the complexities of the outer penumbral region. The researchers noted, however, that even their new model does not accurately capture the lengths of the filaments in parts of the penumbra. They can refine the model by placing the grid points even closer together, but that would require more computing power than is currently available.

“Advances in supercomputing power are enabling us to close in on some of the most fundamental processes of the Sun,” says Michael Knölker, director of NCAR’s High Altitude Observatory and a co-author of the paper. “With this breakthrough simulation, an overall comprehensive physical picture is emerging for everything that observers have associated with the appearance, formation, dynamics, and the decay of sunspots on the Sun’s surface.”

[Image: aerial]

First view of what goes on below the surface of sunspots. Lighter/brighter colors indicate stronger magnetic field strength in this subsurface cross section of two sunspots. This image has been cropped horizontally for display. (©UCAR, image courtesy Matthias Rempel, NCAR.)

See a video animation of this and other sunspot visualizations as well as still “photo” images in the Sunspots Multimedia Gallery.



120 Comments
Editor
June 21, 2009 8:14 pm

Gary Strand (19:51:23) :
JamesD, would you recommend all climate modeling efforts be ended and their funding zeroed?
I’m with Gary on this one. Models can only be as good as the science that goes into them, but building the model is important for testing the science. I WOULD argue that climate modeling has gotten too bound up with politics, and a lot of us are suspicious about the spin and the lack of both transparency and humility, but let’s not turn into a bunch of Luddites.

Gary Strand
June 21, 2009 8:15 pm

Kath, how does one create a “known test model”, “small scale experiment”, “scale or partial full scale model” of the earth’s climate system?

Hank Hancock
June 21, 2009 8:23 pm

“A persistent problem in solar magnetic modeling is that as you go to finer and finer scales, the less accurate the model becomes because of lack of computer power, while at the same time telling us that the most interesting things [and determining factors] happen at still finer scales.” – Leif Svalgaard (10:45:28)
Another challenge of modeling is the limited math precision of today’s binary computing systems. Even a good math co-processor or subsystem that can make qualitative (best-algorithm) decisions on how to perform operations on number sets is bounded by the size, the width (the abstract difference between the highest and lowest values), and the density (the relationship between the size and width) of a number set. Precision error can occur in base conversion or type casting between different types of floating-point numbers, in importing data from lower-precision input, and in executing higher-order math operators against floating-point numbers.
While all integer values (which all have a density of 1) can be represented in a binary system, there are gaps in the floating-point values of all binary math systems. Those gaps are real numbers that cannot be represented without approximation. Thus, another challenge in going to finer and finer scales is not only the limit on computing power but also the inherent limits on math precision in today’s binary computing systems.
Analog computers could solve this problem as they are capable of representing all values of real numbers in a given scale and range but are still not ready for prime [no pun intended] time. While we get all excited about faster computational speeds, what will be even more exciting is the analog computers of the future that will handle very large and very small numbers with incredible precision, perform non-linear calculations effortlessly, and all at light speed minus velocity factor.
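The gaps described above are easy to demonstrate. A small Python sketch, illustrative only, using the standard library’s `math.ulp` to measure the spacing between adjacent representable doubles:

```python
import math

# math.ulp(x) is the gap between x and the next representable float;
# the gap widens as the magnitude of x grows.
for x in (1.0, 1e8, 1e16):
    print(x, math.ulp(x))

# A familiar consequence: many decimal fractions are only approximated
# in binary floating point.
print(0.1 + 0.2 == 0.3)              # False
print(math.isclose(0.1 + 0.2, 0.3))  # True
```

Above about 2^53, the gap between consecutive doubles exceeds 1, so not even every integer survives the round trip.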

Evan Jones
Editor
June 21, 2009 8:52 pm

It’s worse than that. You can’t model something like that from the bottom up. The errors compound. It’s like trying to simulate World War II using Squad Leader rules. Can’t be done; foolish, even silly, to attempt.
Something like that has to be modeled, if it is to be modeled in the first place, from the top down. The results will be less precise–and probably far more accurate. Such a model is also much easier to modify for newly discovered factors. (Such a model could also be run using a 1980s PC. Or, lacking that, the back of a large envelope.)
And, like judging a wargame, one must start by comparing the model with the “storyboard,” that is to say, the real past. One does not even consider using the model to project until it has first duplicated the past.
STAVKA actually did a top-down simulation in 1940 of a projected German invasion. Zhukov “played” the Germans. The results were almost exactly the same as what happened historically, down to the stall at Yelnya. (This did not suit Stalin, and STAVKA threw the results out and Zhukov was in the doghouse.)

Paul R
June 21, 2009 8:58 pm

Looking into that computer generated spot makes my eyes go wonky, that’s all I have to say about that.

June 21, 2009 8:58 pm

kim (14:58:04) :
Leif, could one of those ‘more interesting things happening at finer scales’ be the tidal movements of mere millimeters?
Fraction of millimeters 🙂
The problem with these minute movements is that they occur on top of [or swamped by] random movements millions of times larger and faster. The tidal effects are not fine scale, but the largest scale possible on the Sun.

J. Peden
June 21, 2009 9:18 pm

Gary Strand:
“Or, something much easier – how would you run an experiment on the climate system?”
You make some predictions based upon your hypothesis and see if they eventuate in the real world. You make clear what empirical conditions would falsify your hypothesis. Yes, those really are real experiments, with the whole climate, Earth, and Solar System themselves as giant sources of altering conditions, results, and data. Isn’t that enough?
But you can also run smaller real world experiments on the elements of your hypothesis, such as the temperature effect of varying concentrations of CO2 within a confined space and a given light wave input, etc., or on CO2’s effect on a simulated “ocean”.
Isn’t that how general laws came to be developed and accepted? Isn’t that what Climate Science should be doing?
Oh, and don’t forget to check your measuring equipment, wink, wink.

John F. Hultquist
June 21, 2009 9:49 pm

“They also contribute to variations in overall solar output, which can affect weather on Earth and exert a subtle influence on climate patterns.”
I’d like the authors to expand on this “subtle influence” idea. Where is it? Why is it? How is it? Haven’t we all heard that the climate is swamped by CO2 and the positive feedback of clouds?

June 21, 2009 10:00 pm

To the devotees of supermodels:
I create and use models all the time in my forest biometrics work — primarily growth-and-yield models, but also many types of mensurational tool-type models.
I don’t make pretty pictures or use fabulously expensive supercomputers to create my models. Nor do I rely on models alone to make difficult management decisions. They are tools, not scripture.
The pretty pictures are useless to a large degree, but even more useless is the expense. The economy of the entire world is reeling. We have real problems galore, such as poverty, famine, epidemic disease, ignorance, war, etc. Is it too much to ask government scientists to pinch a few pennies on work that has no practical value?
I am not opposed to “pure” science such as astronomy or particle physics. But I am on a tight budget right now. We all are. Wouldn’t it be better to pursue real wealth creation and relief of human suffering at the moment, than to squander, yes squander, precious resources on pretty pictures?
We all know that most government science excludes the truly innovative, paradigm-challenging thinkers. PC science does not advance knowledge; in fact, I would argue it is a barricade to fresh scientific inquiry. That is another reason to reduce the funding to inside-the-Establishment institutional science.
I am not a Luddite. But I do find it ironic that the real Luddites, those who would shut down civilization out of a gripe with technology and indeed humanity in general, rely on supercomputer climate models to justify their dystopianism.

Kath
June 21, 2009 10:27 pm

“Gary Strand (20:15:56) :
Kath, how does one create a “known test model”, “small scale experiment”, “scale or partial full scale model” of the earth’s climate system?”
Gary, J.Peden, above, answered your question. The climate experiment is all around us. It is happening as we speak. And, as J.Peden notes, the temperature and other measurements must be accurate and not based on urban heating, bad siting or poor equipment.
One has to ask the following of the climate models:
a) Temperatures are dropping while CO2 continues to rise. Do the models correctly reflect this fact?
b) The Arctic ice just about reached “average” thickness this winter; in fact, the ice was thick enough for the Russians to drive vehicles to the North pole. http://www.russia-ic.com/news/show/8099/. Do the models accurately predict the changes in ice thickness that we have seen from the winter of 2007 to 2008-2009?
c) Do the models adequately conform to recorded historical climate data such as the Little Ice Age and the Medieval Warm Period?
(There are others that I have not listed)
The problem is this: If even one of those events is not correctly modeled, can we reasonably expect the same models to predict climate decades into the future?
One other thing: Climate change is normal. It has always happened in the past and it will always happen in the future. With or without human intervention.

kristof
June 21, 2009 10:28 pm

Wow, sometimes it is worse here than in a retirement home. ‘In the good old days, there were no computers… then it was all real science.’
So some of you are saying that computer models are useless because they can give faulty output. And let’s call a cat a cat: computer models just calculate the physical models we already have, which were simply too difficult to calculate by hand. They are not some magical new tool. They are used very extensively in many, if not all, fields of engineering and science. Does that mean their output should be trusted blindly? Duh, of course not.
That is why people check them, test them, improve them, and disregard them if they are wrong, and even then you remain aware that it is only a model, not reality itself.
That is why different groups make different models.
So yes, you have to be cautious about the output of computer models, but to disregard them as a whole is just plain old nostalgic whining.
But anything that helps avoid confronting the issue is good enough, right?
The reasoning some follow is: 1. computer models can be faulty; therefore 2. nothing that comes out of the models for our climate can be trusted. But 2 does not follow from 1.
Or some say: 1. the climate is too complex, not well enough understood, or the grid size should be smaller; therefore 2. everything the models predict is false. Again, 2 does not follow from 1.
A simple model can already set you on the way to understanding what happens, even though the system is complex. It is not because the outcome of the present models has some variance in it that its results are useless.
Some people here are discussing the pros and cons of computer models, as should be done. But please quit the whining that the computer models are bad just because you don’t like their outcome.
If the models were really so useless and unreliable, and could predict anything given bad input (garbage in, garbage out), you would think that under Bush at least one group would have come up with a model that said nothing was wrong.

June 21, 2009 10:36 pm

Katherine (14:12:45) :
Shouldn’t that be “what they think lies below the visible surface”? It’s just a model after all, not the real thing.
I was thinking exactly the same thing. It seems that modeling is taking the place of reality: whatever the computer produces becomes reality in the minds of some people. At this rate, some scientists will end up believing that the Earth warms up the Sun, or that the energy in the Earth is totally endogenous, created by carbon dioxide out of nothingness… 🙂

Neil O'Rourke
June 21, 2009 11:05 pm

“Consider that the next new bridge you may cross or skyscraper you go into was modeled on a computer. Do you still believe models are worthless?”
The stress calculations needed for this sort of modelling are (essentially) completely solved. Also, the vast majority of such modelling is some form of interpolation.
Global Climate Models take a subset of data and extrapolate.
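The interpolation/extrapolation distinction above can be made concrete with a toy example. A hedged Python sketch (the data and the straight-line fit are arbitrary illustrations, nothing to do with any actual structural code or GCM):

```python
# Fit a straight line to y = x^2 sampled on [0, 1] by ordinary least
# squares, then compare the prediction error inside the data range
# (interpolation) with the error far outside it (extrapolation).
xs = [i * 0.05 for i in range(21)]   # x in [0, 1]
ys = [x * x for x in xs]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

def predict(x):
    return intercept + slope * x

err_inside = abs(predict(0.5) - 0.5 ** 2)   # interpolation: small error
err_outside = abs(predict(5.0) - 5.0 ** 2)  # extrapolation: error balloons
print(err_inside, err_outside)
```

Inside [0, 1] the line tracks the curve to within about 0.1; at x = 5 the same fit is off by roughly 20, because nothing in the data constrains the model out there. That asymmetry, not computers per se, is the issue being raised.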

hotrod
June 21, 2009 11:37 pm

Every time I see this sort of “news item” about computer models appear, I always flash back to the same two events.
One was an incident in a convenience store years ago, right after the magic computer electronic cash registers came out that figured your change for you. I had been shopping there for years and knew the price of certain items by heart. I walked in, pulled a soda pop out of the cooler, and set it on the counter. A new employee rang it up and announced that it would be $1.25. I smiled and said it should be $0.89; she indignantly said, “no, it is $1.25,” and pointed at the electronic cash register. I turned and pointed to the sign on the glass door of the cooler that clearly advertised $0.89. Her response was that it would be $1.25, because that is what the computer said! It took a moment to convince her that the sign was evidence of a reality that the computer did not recognize.
I also flash back to a debate I had with engineers about the limits of their knowledge regarding accidents in nuclear reactors. Here in Colorado at that time, we had the only commercial high-temperature gas-cooled nuclear reactor in the U.S. (perhaps in the world – there was one other similar research reactor at Peach Bottom, PA). They were describing various maximum credible accident scenarios.
One of them was a breach of the concrete containment and the possibility of runaway heating of the graphite core (think very big charcoal grill) when it was exposed to oxygen while heated to very high temperatures (not unlike what would happen at Chernobyl years later). When the subject came up, their answer was a simple, emphatic “that is impossible.” We were trying to impress on them the limits of their knowledge and get them to at least qualify that statement with something like “that is extremely unlikely, and our calculations show it is probably impossible.”
The engineers would simply not budge from their flat declarative statement, until I commented, “Yes and the engineers at de Havilland did not think the wings would fall off their Comet airplane either”!
At that point there was a pregnant pause, and you could see the light go on as they said that yes, “to the best of their knowledge, that was impossible”!
Larry

RexAlan
June 22, 2009 12:09 am

Computers are a great tool for helping you think, just never let them do the thinking for you.

Gary Pearse
June 22, 2009 4:09 am

Gary Strand (18:37:56) :
“The commentators who dismiss computer modeling – I wonder if any are engineers, who live by computer models these days. Consider that the next new bridge you may cross or skyscraper you go into was modeled on a computer”
Your defense of models is a bit over the top. Complex bridges and buildings were designed for a long time with a pencil and a slide rule. The models, using the same applied science, simply do the calculations and drawings a heck of a lot faster (when you know the science! I wouldn’t walk across a bridge designed by a climate change modeler). Yes, computer models can be useful, but keep in mind what happened to the movies when computer simulation could make good car crashes and explosions. Moviecraft, and even telling a good story, got overpowered by this new tech. The story became written around the simulations, and a very useful tool became a toy and an end in itself. And of course, you could make the simulation do exactly what you wanted it to do. Sound familiar? I think a major contribution of modelling to science is that it has proven the cosmos is not deterministic: with models, it appears one can fashion many alternatives for a phenomenon (and pick the one you like).

Gary Pearse
June 22, 2009 4:29 am

Re models and engineering. Engineering models are relatively simple dealing with strength of materials, loads, force vectors, and effects of time. But even in these simple circumstances with factors that you can count on your fingers, the engineer usually then builds in a safety factor of 50 to several hundred percent to be sure!! Surely, climate models that are projecting a degree or two over a century with measuring tools that aren’t accurate to a degree or two and dealing with so many unknowns should have such a huge “safety factor” as to render them useless. I hope there are enough of us Luddites around to vote down the computer model’s projections of the next century.

Gary Strand
June 22, 2009 5:50 am

J. Peden:
“You make some predictions based upon your hypothesis and see if they eventuate in the real world. You make clear what empirical conditions would falsify your hypothesis. Yes, those really are real experiments, with the whole climate, Earth, and Solar System themselves as giant sources of altering conditions, results, and data. Isn’t that enough?”
If that’s your criterion, then climate models are doing quite well. They can replicate the known 20th-century climate – with the appropriate caveat on “known”. I doubt you agree, but given your metric above, perhaps you need to alter it in light of climate models’ successes.
“But you can also run smaller real world experiments on the elements of your hypothesis, such as the temperature effect of varying concentrations of CO2 within a confined space and a given light wave input, etc., or on CO2’s effect on a simulated ‘ocean’.”
What of all the other factors? Climate depends on far more than just CO2 and solar input; how would we know what’s important if we relied on just one or two forcings?
“Isn’t that how general laws came to be developed and accepted? Isn’t that what Climate Science should be doing?”
That’s what “Climate Science” has been doing – for the better part of 40 years.
“Oh, and don’t forget to check your measuring equipment, wink, wink.”
Indeed.

Tim Clark
June 22, 2009 6:00 am

The simulations can help scientists decipher the mysterious, subsurface forces in the Sun that cause sunspots. Such work may lead to an improved understanding of variations in solar output and their impacts on Earth.
Give the authors some credit. They could have stated, ” Such work will lead to understanding that minute variations in solar output cannot account for the hockey stick.

Gary Strand
June 22, 2009 6:01 am

Kath:
“The climate experiment is all around us. It is happening as we speak. And, as J.Peden notes, the temperature and other measurements must be accurate and not based on urban heating, bad siting or poor equipment.”
Of course – but those measurements are only a very small part of the picture. Did you see my comment about the lack of 1km x 1km x 10m resolution data for the ocean and atmosphere? Indeed, those (somewhat arbitrary) requirements for data are generous; for land surface and subsurface processes, a vertical resolution of centimeters would be appropriate – same applies for sea ice.
“One has to ask the following of the climate models:
a) Temperatures are dropping while CO2 continues to rise. Do the models correctly reflect this fact?”
Despite the fact that “temperatures are dropping” is a cherry-pick of the data, yes, climate models do replicate temperature drops as CO2 increases. Your hidden assumption that temperature must increase monotonically as CO2 increases ignores the knowledge we have of the many other factors that affect temperature.
“b) The Arctic ice just about reached “average” thickness this winter; in fact, the ice was thick enough for the Russians to drive vehicles to the North pole. http://www.russia-ic.com/news/show/8099/. Do the models accurately predict the changes in ice thickness that we have seen from the winter of 2007 to 2008-2009?”
I personally haven’t looked at the output from the IPCC AR4 models for this particular measure – I’d be quite surprised if at least one of the models, for one of its runs, didn’t show sea ice area and thickness matching observations. One thing – measuring sea ice thickness is tricky – very tricky.
“c) Do the models adequately conform to recorded historical climate data such as the little ice age and the medieval warm period.
(There are others that I have not listed)”
Provide accurate and correct forcing and boundary conditions for those two states, and climate models will give it a go. Of course, since what we do know of those two eras is quite poor and has huge error bars, expecting climate models to replicate them is asking quite a lot. Remember, GIGO…
“The problem is this: If even one of those events is not correctly modeled, can we reasonably expect the same models to predict climate decades into the future?”
That’s the kicker, isn’t it? Who, and how, defines “correctly modeled” since our knowledge is imperfect? Likewise, does our lack of omniscience mean we are truly completely ignorant? I don’t think so.
“One other thing: Climate change is normal. It has always happened in the past and it will always happen in the future. With or without human intervention.”
That’s true – but that doesn’t mean that we humans cannot be changing the climate ourselves. Just because humans have always died doesn’t mean someone cannot be charged with murder, when circumstances and the facts require it.

Gary Strand
June 22, 2009 6:04 am

Gary Pearse:
“Your defense of models is a bit over the top. Complex bridges and buildings were designed for a long time with a pencil and a slide rule.”
I recommend you examine the efforts of Lewis Fry Richardson.
“The models, using the same applied science simply do the calculations and drawings a heck of a lot faster (when you know the science! I wouldn’t walk across a bridge designed by a climate change modeler).”
Why not? Because climate change modelers don’t know the science (which is a bit unfair) or because climate change modelers deliberately manipulate and alter the science to achieve some pre-determined outcome (which is hugely unfair and unreasonably snarky)?

david_a
June 22, 2009 6:07 am

Gary Strand:
I’d say that some of the most ardent ‘disbelievers’ in the AGW hypothesis are precisely those people who have a lot of hands-on experience with computer models and understand them for what they are.
The great problem with the climate models appears to be that they contain few hard and fast testable outcomes, at least on shorter timescales, or if they do, then they are not made public in a clear manner. We have about 30 years of pretty decent satellite temperature series to work with and are in the process of accumulating good ocean heat content data with the deployment of the Argo network. I have yet to find a paper that says something to the effect of: here are the variances of the various energy-balance components of the system as predicted by the models, and, given them, here is a probability distribution of what the energy balance should look like as a function of time.
Instead the ‘science’ (especially the paleo reconstructions) is presented as a public relations campaign and to anyone even remotely conversant with the math and the physics it is laughable and causes one to have an enormous distrust of those ‘doing the work’.

Gary Strand
June 22, 2009 6:29 am

david_a (06:07:21) :
“I’d say that some of the most ardent ‘disbelievers’ in the AGW hypothesis are precisely those people who have a lot of hands on experience with computer models and understand them for what they are.”
Those of us in the business aren’t as enamored of the model output as you imply.
“The great problem with the climate models appears to be that they contain few hard and fast testable outcomes at least on shorter timescales, or if they do then they are not made public in a clear manner.”
Expecting climate models to replicate “shorter timescales” is a misunderstanding of climate; it depends on what length of time you mean. Climate models will never be able to predict individual weather phenomena.
As for the results not being made public, that’s directly contradicted by the existence of the CMIP3 archive.

Just Want Results...
June 22, 2009 7:12 am

Bill Illis (13:41:59) :
Thanks Bill! You’re the best commenter here at WUWT.

Steven Hill
June 22, 2009 7:25 am

Those spots on the sun, are they part of cycle 23? Trying to learn something.
thanks,
Steve