Solar Storm Dumps Gigawatts into Earth’s Upper Atmosphere
A recent flurry of eruptions on the sun did more than spark pretty auroras around the poles. NASA-funded researchers say the solar storms of March 8th through 10th dumped enough energy in Earth’s upper atmosphere to power every residence in New York City for two years.
“This was the biggest dose of heat we’ve received from a solar storm since 2005,” says Martin Mlynczak of NASA Langley Research Center. “It was a big event, and shows how solar activity can directly affect our planet.”
Mlynczak is the associate principal investigator for the SABER instrument onboard NASA’s TIMED satellite. SABER monitors infrared emissions from Earth’s upper atmosphere, in particular from carbon dioxide (CO2) and nitric oxide (NO), two substances that play a key role in the energy balance of air hundreds of km above our planet’s surface.
“Carbon dioxide and nitric oxide are natural thermostats,” explains James Russell of Hampton University, SABER’s principal investigator. “When the upper atmosphere (or ‘thermosphere’) heats up, these molecules try as hard as they can to shed that heat back into space.”
That’s what happened on March 8th when a coronal mass ejection (CME) propelled in our direction by an X5-class solar flare hit Earth’s magnetic field. (On the “Richter Scale of Solar Flares,” X-class flares are the most powerful kind.) Energetic particles rained down on the upper atmosphere, depositing their energy where they hit. The action produced spectacular auroras around the poles and significant upper atmospheric heating all around the globe.
“The thermosphere lit up like a Christmas tree,” says Russell. “It began to glow intensely at infrared wavelengths as the thermostat effect kicked in.”
For the three day period, March 8th through 10th, the thermosphere absorbed 26 billion kWh of energy. Infrared radiation from CO2 and NO, the two most efficient coolants in the thermosphere, re-radiated 95% of that total back into space.
In human terms, this is a lot of energy. According to the New York City mayor’s office, an average NY household consumes just under 4700 kWh annually. This means the geomagnetic storm dumped enough energy into the atmosphere to power every home in the Big Apple for two years.
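The arithmetic behind that claim can be sketched in a few lines. The household count used below (~2.8 million for New York City) is an assumed figure, not one given in the article:

```python
# Sanity-check the article's claim: 26 billion kWh is roughly two years of
# electricity for every NYC household at ~4,700 kWh per household per year.
storm_energy_kwh = 26e9          # total absorbed by the thermosphere (article)
household_kwh_per_year = 4700    # NYC average, per the mayor's office (article)
nyc_households = 2.8e6           # assumed household count, not from the article

household_years = storm_energy_kwh / household_kwh_per_year
years_for_all_nyc = household_years / nyc_households
print(round(years_for_all_nyc, 1))  # roughly 2 years, as the article says
```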
“Unfortunately, there’s no practical way to harness this kind of energy,” says Mlynczak. “It’s so diffuse and out of reach high above Earth’s surface. Plus, the majority of it has been sent back into space by the action of CO2 and NO.”
During the heating impulse, the thermosphere puffed up like a marshmallow held over a campfire, temporarily increasing the drag on low-orbiting satellites. This is both good and bad. On the one hand, extra drag helps clear space junk out of Earth orbit. On the other hand, it decreases the lifetime of useful satellites by bringing them closer to the day of re-entry.
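The “puffed up like a marshmallow” effect can be sketched with a simple isothermal scale-height model: hotter gas has a larger scale height, so density (and hence drag) at a fixed satellite altitude goes up. The temperatures and altitude below are illustrative assumptions, not SABER measurements:

```python
import math

# Isothermal scale height H = k*T/(m*g): heating the thermosphere increases H,
# which increases density -- and therefore satellite drag -- at fixed altitude.
k = 1.380649e-23      # Boltzmann constant, J/K
g = 8.7               # m/s^2, rough gravity in the thermosphere
m = 16 * 1.66e-27     # kg, atomic oxygen dominates up there

def density_ratio(h_m, T):
    """Density at altitude h relative to a fixed base level, isothermal case."""
    H = k * T / (m * g)
    return math.exp(-h_m / H)

# Illustrative storm-time heating (assumed temperatures, assumed 200 km span):
quiet = density_ratio(200e3, 800.0)
storm = density_ratio(200e3, 1000.0)
print(storm / quiet)  # density/drag enhancement factor at the satellite
```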
The storm is over now, but Russell and Mlynczak expect more to come.
“We’re just emerging from a deep solar minimum,” says Russell. “The solar cycle is gaining strength with a maximum expected in 2013.”
More sunspots flinging more CMEs toward Earth adds up to more opportunities for SABER to study the heating effect of solar storms.
“This is a new frontier in the sun-Earth connection,” says Mlynczak, “and the data we’re collecting are unprecedented.”
Stay tuned to Science@NASA for updates from the top of the atmosphere.
Author: Dr. Tony Phillips
h/t to WUWT reader AJB
Wait…do I have this right? If not please correct me.
1 Kw = 1000w, and 1kwh = 1000w/hr. So, 26,000,000,000 kwh over 2yrs (~17,520hrs) = 455,520,000,000,000kw, which = 455,520,000,000,000,000w.
95% re-emitted leaves us with 22,776,000,000,000,000w.
There are ~ 120,000,000,000,000 sq/m on Earth, so globally… isn’t that 189.8W/m^2 that is NOT re-emitted to space? Where does that excess 5% go? 189.8W/m^2 is huge
Even if only 1% of the absorbed energy makes it to the surface, that is, 1.898W/m^2…more than the total RF of CO2 increase since 1750 which is ~1.6W/m^2, with error boundaries of course.
Am I missing something? How can this amount of energy be ignored by climate science?
26 billion kWh dropped into the ocean would not raise its temperature 0.01K. In fact, presumably all the power plants that provide energy to New York City for two years (sarcasm intended) dump all of that energy times 3 or 4 (efficiency) into the local environment in that time and it doesn’t measurably affect the temperature.
It’s important to put things into perspective. The Earth is roughly 6.4 × 10^6 meters in radius. Its surface area is about 5 × 10^14 square meters. The depth of its atmosphere (just the part thick enough to actually breathe, sort of) is roughly 10,000 meters, although you’d probably die of hypoxia at the top of that, giving one a volume of order of 5 × 10^18 cubic meters. Into this you drop 26 billion kWh, or roughly 10^17 joules, or around 0.02 joules of energy per cubic meter of air. And then 95% of it was immediately reradiated/reflected.
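The back-of-the-envelope numbers above can be reproduced directly (a sketch of the same order-of-magnitude estimate, using the same 10 km air column):

```python
import math

# Spread 26 billion kWh through the lowest ~10 km of the atmosphere.
energy_j = 26e9 * 3.6e6            # kWh -> joules, about 1e17 J
r_earth = 6.371e6                  # Earth's radius, m
area = 4 * math.pi * r_earth**2    # ~5.1e14 m^2
volume = area * 1e4                # 10 km column, ~5e18 m^3

print(energy_j / area)    # ~180 J per square meter
print(energy_j / volume)  # ~0.02 J per cubic meter
```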
A year or so ago I thought about whether magnetic induction — basically eddy currents — from the Sun’s magnetic field could be responsible for part of the observed differential warming of the Earth during times where the Sun is magnetically active. After all, at one point in time this process was thought to have provided enough energy to have fused metallic asteroids during the early accretion process of the solar system. Although the field involved was very weak, the Earth is (as noted) rather large, so a back-of-the-envelope estimate of the total energy involved yielded a satisfyingly large number of joules — until you consider the volume being heated. Suddenly the effect is utterly ignorable.
So in spite of the emphasis, this sort of thing isn’t even as much as a drop in the bucket of Earth’s energy balance. If it were sustained for a very long time, maybe. If it has additional nonlinear effects (e.g. altering the albedo of the upper atmosphere by increasing its ionization) maybe. But as far as direct heating is concerned, the ionosphere is already enormously hot, but it is also enormously diffuse. I doubt that there was any effect at all on the actual stratosphere or troposphere.
rgb
Steve Keohane says:
March 23, 2012 at 7:13 am
I have to agree that the jury is out on just what CO2 does. Disregarding the time lag in the ice cores, and what we think we know about how it acts, it has always seemed as logical to me to argue that high CO2 causes the temperature to crash at every CO2 peak as it does to argue that it causes temperature increase.
===========================================
Steve, that would explain something that’s not explained right now…..
….why didn’t temps follow the usual peak…..flat line…..while CO2 levels kept going up
edit: oops, this right? 26,000,000,000kwh over ~ 72hrs = 1,872,000,000,000kwh total, which = 1,872,000,000,000,000w. 5% of that is 93,600,000,000,000w, over Earth’s ~120,000,000,000,000sq/m surface is 0.78W/m^2.
Makes more sense, it’s still huge
But the molecular density of the body being heated should be taken into account too, I believe; most of the mass of Earth’s atmosphere is in the troposphere, if I recall correctly. There is less substance to actually intercept the photons traveling through the upper atmosphere, while the surface/oceans/lower atmosphere are more likely to do so.
Am I missing something? How can this amount of energy be ignored by climate science?
Watt-HOURS are a measure of energy, not power. 1 W-hour = 3600 joules. The article says that the 26 billion kW-hours is the TOTAL energy delivered over some unknown amount of time. By your own argument, if it were delivered over 180 seconds (three minutes) it would be a surplus of roughly 1 watt per meter squared over that time. (That is consistent with my estimate of roughly 200 joules per square meter, or about 0.02 joules per cubic meter in the entire 10 km high air column.) But it wasn’t: it was delivered over a much, much longer time. The averaged power delivery over that time was doubtless milli- to microwatts per square meter, distributed (as noted) over a VOLUME much, much larger still.
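The units point can be made concrete: divide the energy by the actual delivery time (~72 hours) to get power, then by area to get intensity. A sketch:

```python
# kWh is energy; divide by hours to get kW, then convert and divide by area.
energy_kwh = 26e9
hours = 72.0                              # March 8th through 10th
avg_power_w = energy_kwh / hours * 1000   # kWh / h = kW; kW -> W
area_m2 = 5.1e14                          # Earth's surface area

flux = avg_power_w / area_m2
print(flux)         # ~7e-4 W/m^2: sub-milliwatt, as argued above
print(0.05 * flux)  # the 5% not re-radiated: a few times 1e-5 W/m^2
```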
The problem is that the Earth is really big. Terajoules vanish without a trace, utterly negligible. That’s why solar energy is feasible: a tiny, tiny fraction of the Earth’s surface could provide ten times the per-capita energy consumption of the world’s most affluent citizens to every person on earth IF we can convert it at reasonable efficiency with inexpensive (per watt) surfaces and store/buffer it at high energy density in inexpensive (per watt per unit volume) energy storage devices. Both are purely technical problems and almost certainly have technologically and economically feasible solutions that are being aggressively pursued.
That, in turn, is why Carbon Trading is really stupid, even if the CAGW enthusiasts are completely correct. If we do nothing, in a decade or at most two PV energy generation will be rapidly overtaking all other forms of energy generation: eliminating them if the storage problem is solved, and supplementing them with daytime power if not. And if fusion is worked out in the meantime, and/or e.g. molten salt thorium reactors (more or less nuclear-proliferation and meltdown proof), so much the better, and so much the faster.
IMO CAGW is not particularly plausible, but ultimately it doesn’t matter. Not even the CAGW enthusiasts seriously claim disaster in the next 30 years, and by then the production of CO_2 will be going down of its own accord.
rgb
edit: oops, this right? 26,000,000,000kwh over ~ 72hrs = 1,872,000,000,000kwh total, which = 1,872,000,000,000,000w. 5% of that is 93,600,000,000,000w, over Earth’s ~120,000,000,000,000sq/m surface is 0.78W/m^2.
Makes more sense, it’s still huge
No, no, no. See the above: kWh is already energy. You need to divide by the time to make power, divide by area to make intensity, and multiply by 0.05 to get some measure of “downwelling”. Which is “zero” to as many digits as you are likely going to want to write.
rgb
The atmosphere doesn’t absorb most of the downward emitted energy from CO2 (from the CME) before it reaches the surface; most of the atmosphere is IR transparent, so much of it will hit the surface, ~0.78W/m^2 if the 5% that was not re-emitted actually hits the surface (as it should).
So, how much total net energy is truly contained in the ocean-surface-atmosphere system, and at what thermal threshold? Kinetic threshold? Electric? Does it take energy to maintain a perturbation against gravity? Is convective overturning the substitution of/for gravity, or is it a perturbation against gravity? What is gravity? Does it take additional energy to re-route atmospheric circulation?
If doubling CO2 = ~3.7W/m^2 of RF increase, how can that be expected to manifest thermally if its surface warming is so minute? The surface heats up over 40C in sunshine, and it is conduction from the surface that results in most of the thermal attainment/retention, with a small portion of that being radiative (backradiation) which may heat the atmosphere between the warmer surface and cooler upper atmosphere, but does not warm the surface and barely slows its cooling, if at all. When conduction is taken out of the equation the remaining value for backradiation is exceedingly small, and adding 1.6W/m^2 of LW re-emission through the largely IR-transparent atmosphere turns out to be almost meaningless against the surface’s thermal retention via conduction/diffusion.
So if CO2 emits 95% of the energy it absorbed in the upper atmosphere back out to space
Did I mis-interpret the article? I was under the impression that, example, 10Kwh = 10Kw over 1 hr.
I do not believe you can weight the IR-transparent atmosphere at all; not only is the density at the upper levels much less, but the photons that are emitted downward should pass freely through most of the atmosphere and reach the surface, which is what warms the atmosphere. Of course GHG molecules such as H2O, CO2, etc., may absorb a small fraction of it and re-emit it.
The bigger issue with CMEs may very well be the electroscavenging. Destruction of cloud forming particles could lead to a lowering of the albedo that would generate a much greater warming effect than the numbers provided.
Oh, so 26,000,000,000 kwh (KW x H)? So if 1kw = 1000w, a 1000W bulb will use 10kwh in 10hrs?
Yes, I also would like to see an explanation of why CO2 in the upper atmosphere supposedly acts differently (95% radiated back out) than CO2 in the lower troposphere (uniform omni-directional radiation)
kk total of 26,000,000,000kw = 26,000,000,000,000w, which 5% of it is 1,300,000,000,000w, which = 0.01W/m^2. That makes much more sense.
_Jim says: March 23, 2012 at 7:04 am
……………..
Hi again Jim
Plasma (charged particles) electric currents are ‘magnetic field aligned’ currents, and considerably more complex than electric currents in a wire, I assume you have looked at the link:
http://ase.tufts.edu/cosmos/pictures/Sept09/Fig8_7.MagCloud.gif
To understand what is going there you can find more details here:
http://en.wikipedia.org/wiki/Birkeland_current
There are number of misconceptions regarding Birkeland currents, so when Dr. Svalgaard appears on the scene, if you ask, he may direct you to an authoritative article.
jack morrow says: March 23, 2012 at 6:24 am
…..
I wouldn’t think so, there is a limit to the plasma currents’ velocity. I am inclined to think it is a shockwave. The propagation of shock waves in magnetic flux ropes or tubes is a complex matter, but to be honest I don’t sufficiently understand the Brinkley–Kirkwood theory on which these things are based. You can google ‘Brinkley–Kirkwood theory’ and see how you get on.
Fred N. says:
March 23, 2012 at 10:02 am
Yes, I also would like to see an explanation of why CO2 in the upper atmosphere supposedly acts differently (95% radiated back out) than CO2 in the lower troposphere (uniform omni-directional radiation)
_________________________
I am not a physicist so I may be wrong (Please correct) but this is my take on it.
There are two factors. The first is the curve of the earth. Close to the earth, the surface can be treated as flat, so if the CO2 is reradiating in all directions, a fairly large angle will strike at least a glancing blow. It is not 180 degrees, but very close to the earth it could be approximated as approaching 180 degrees, or 1/2 the radiation. As you move further and further from the surface that angle is going to become smaller because of the earth’s curve. (Think a cone shape.)
The second and more important reason is how often reradiation from the CO2 escapes into space. In the upper atmosphere where the space between atoms/molecules is “large” the chance of the radiation escaping is much much greater. Also if the radiation is back towards earth there is a darn good chance it will be captured and reradiated back towards space. So you are talking about chance, probability and N factorial (the number of atoms hit) where the chance of it heading to space is counted up for each time the radiation encounters an atom/molecule during its trip down towards earth. (Think one of those old pinball machines where the radiation has to get past each obstacle (atom/molecule) as the paddles play Whack-a-mole, with a good chance of knocking the ball/energy back up/sideways.)
Also remember because the atmosphere is less dense headed up than it is headed down it tips the probability towards escape.
N! = N factorial: http://factorielle.free.fr/index_en.html
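The pinball picture can be sketched as a toy Monte Carlo: a photon emitted downward at some layer is re-emitted up or down with equal chance at every encounter, and we ask how often it reaches the surface before wandering out the top. The layer count and starting height are illustrative assumptions, not atmospheric data:

```python
import random

def reaches_surface(start_layer, n_layers=20, rng=random):
    """Toy 1-D model: a photon is emitted downward at `start_layer`
    (0 = top of atmosphere, n_layers = surface). Each absorbing layer
    re-emits it up or down with equal chance. Returns True if it hits
    the surface before escaping out the top into space."""
    layer = start_layer + 1          # the initial emission is downward
    while 0 < layer < n_layers:
        layer += rng.choice((+1, -1))  # isotropic re-emission, 1-D version
    return layer >= n_layers

rng = random.Random(42)
trials = 20000
frac = sum(reaches_surface(2, rng=rng) for _ in range(trials)) / trials
print(frac)  # near 3/20: most high-altitude emission never reaches the ground
```

The gambler’s-ruin result for this walk says a photon starting near the top reaches the surface with probability (start height)/(total layers), which is why emission from high, thin air overwhelmingly leaks to space.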
The calc. that sez it could power all the residences in NYC for 2 yrs. just shows how trivial the human heat contribution to the atmosphere is.
“in spite of the emphasis, this sort of thing isn’t even as much as a drop in the bucket of Earth’s energy balance. If it were sustained for a very long time, maybe.”
It is sustained for a very long time… presumably it’s constant… with major events currently being recorded every few years and not that much less energetic flows all the time as the graph with the article shows: http://science.nasa.gov/media/medialibrary/2012/03/22/both_spikes.jpg.
Over time this amount of energy must accumulate within the Earth system. What is the total amount of energy shown in the graph? (The area under the line in the graph?).
What was it up there that got “hot” and needed the assistance of CO2 to emit IR back into space to cool down? Is there any of that stuff down here that heats up, cannot emit, and needs CO2 to cool down?
Phil says:
March 23, 2012 at 10:08 am
> kk total of 26,000,000,000kw = 26,000,000,000,000w, which 5% of it is
> 1,300,000,000,000w, which = 0.01W/m^2. That makes much more sense.
You’re still completely messing up the units. (That’s okay, a lot of high school physics teachers don’t realize teaching dimensional analysis is important.)
26,000,000,000kWh over three days (3 days × 24 hours/day = 72 hours). Assuming steady “brightness”, that’s 26,000,000,000kWh ÷ 72 h = 361,000,000 kW during the 3 days the “light” was on. 5% of that is 18,000,000 kW
http://en.wikipedia.org/wiki/Earth says the surface area is 510,072,000 km², so dividing by that yields 18,000,000 kW ÷ 510,072,000 km² ≈ 0.035 kW/km², which is 0.000035 W/m² (1 kW/km² = 0.001 W/m²).
So, not a very bright light after all. I won’t do the math, but http://answers.yahoo.com/question/index?qid=20110308121849AA7xu4v claims a full Moon offers 0.025 W/m² – so the retained energy is nearly a thousand times less than we get from (I assume) the full Moon when it’s directly overhead! http://home.earthlink.net/~kitathome/LunarLight/moonlight_gallery/technique/moonbright.htm offers interesting but mostly irrelevant comments on lunar illumination.
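The same numbers with the unit conversion made explicit; the kW/km² to W/m² step is exactly where a factor of 1000 is easy to drop:

```python
# Average power over the 3-day storm, keep only the 5% not re-radiated,
# then convert kW per km^2 to W per m^2 carefully.
power_kw = 26e9 / 72              # ~3.6e8 kW averaged over 72 hours
retained_kw = 0.05 * power_kw     # the 5% the thermosphere kept, ~1.8e7 kW
area_km2 = 510_072_000            # Earth's surface area, per Wikipedia

flux_kw_per_km2 = retained_kw / area_km2     # ~0.035 kW/km^2
flux_w_per_m2 = flux_kw_per_km2 * 1e3 / 1e6  # kW -> W (x1000), km^2 -> m^2 (x1e6)
print(flux_w_per_m2)  # ~3.5e-5 W/m^2, far below the ~0.025 W/m^2 of full moonlight
```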
From the evidence (if accurate) that the atmosphere, on average, is cooled exclusively by radiation (100% of its loss, i.e. 64% of incoming solar), while on the other hand being warmed multi-modally by the following:
– latent heat flux from the surface: 23% (of incoming solar),
– directly absorbed solar by atmosphere/clouds: 19%,
– absorbed surface radiation: 15%, and
– sensible heat flux from the surface: 7%,
it can be concluded (on the face of it) that atmospheric CO2 (if anything significant) cools the atmosphere.
http://edro.files.wordpress.com/2007/11/earths-energy-budget.jpg
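The budget quoted above balances, which is easy to verify (percentages as listed, from the linked diagram):

```python
# Atmospheric heating terms as percentages of incoming solar,
# from the linked energy-budget diagram.
heating = {
    "latent heat flux from surface": 23,
    "solar absorbed by atmosphere/clouds": 19,
    "absorbed surface radiation": 15,
    "sensible heat flux from surface": 7,
}
radiative_cooling = 64  # all atmospheric loss is by radiation

assert sum(heating.values()) == radiative_cooling
print(sum(heating.values()))  # 64: the gains balance the radiative loss
```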
Has anyone noticed what’s happened to Arctic mean temp since all this kicked off?
http://ocean.dmi.dk/arctic/meant80n.uk.php
Probably unconnected (and sure, it spikes up and down like a yo-yo), but it does seem a bit odd given the equinox switch-around.
Wait there’s more: For reasons not fully understood by scientists, the weeks around the vernal equinox are prone to Northern Lights.
http://science.nasa.gov/science-news/science-at-nasa/2008/20mar_spring
Would be good to get Leif’s take on this if he drops by.
Wouldn’t the difference of net (vertical) radiation be due to the exponential that results from continually splitting the flux? I know I’m leaving a lot to be desired semantically here. But if you assume, as a very crude model, multiple layers of opaque gases that absorb unidirectionally and then re-emit isotropically, each downward ‘hit’ at a layer results in a 50% split (in the vertical component). Half continues on down. Half gets reversed.
Downward flux from a hot and very high source would be constantly converted from unidirectional to isotropic. There would always be a bias (even without density change) that would make it very hard to reach the surface. Once you factor a density profile into considerations of the probabilistic nature of the collisions at a given level, it seems like most would get out and never be seen at the surface.
Isn’t the reverse also true, i.e. it’s hard for energy in a CO2 band to get out when the energy source is the surface? And the likely corollary is that it doesn’t take a massive amount of opacity at a given altitude to cause a bias because of the nature of the path lengths and multiple conversions applied to the radiation ‘leak’.
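The layered 50/50 model above can be computed rather than argued: with n fully absorbing layers, each re-emitting half up and half down, the fraction of a downward beam from the top that reaches the surface works out to 1/(n+1). A sketch, solving the hitting-probability recurrence by simple relaxation (layer counts are illustrative):

```python
def transmission(n_layers, sweeps=20000):
    """p[i] = chance a photon at layer i eventually reaches the surface,
    given 50/50 up/down re-emission at every layer. Boundary conditions:
    p[0] = 0 (escaped to space), p[n+1] = 1 (hit the surface).
    The recurrence p[i] = 0.5*(p[i-1] + p[i+1]) is solved by relaxation."""
    p = [0.0] * (n_layers + 2)
    p[-1] = 1.0
    for _ in range(sweeps):
        for i in range(1, n_layers + 1):
            p[i] = 0.5 * (p[i - 1] + p[i + 1])
    return p[1]   # a downward beam from above enters the topmost layer

for n in (1, 4, 9):
    print(n, round(transmission(n), 6))  # matches 1/(n+1): 0.5, 0.2, 0.1
```

So even a modest stack of opaque layers strongly biases radiation from a high source back out to space, which is the bias the comment describes.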