Jim Hansen's balance problem of 0.58 watts

From NASA Goddard, Jim Hansen reports on his balance problem:

Earth’s Energy Budget Remained Out of Balance Despite Unusually Low Solar Activity

[Conceptual image of the sun]
A prolonged solar minimum left the sun's surface nearly free of sunspots and accompanying bright areas called faculae between 2005 and 2010. Total solar irradiance declined slightly as a result, but the Earth continued to absorb more energy than it emitted throughout the minimum. An animation of a full solar cycle is available here. Credit: NASA Goddard's Scientific Visualization Studio

A new NASA study underscores the fact that greenhouse gases generated by human activity — not changes in solar activity — are the primary force driving global warming.

The study offers an updated calculation of the Earth’s energy imbalance, the difference between the amount of solar energy absorbed by Earth’s surface and the amount returned to space as heat. The researchers’ calculations show that, despite unusually low solar activity between 2005 and 2010, the planet continued to absorb more energy than it returned to space.

James Hansen, director of NASA’s Goddard Institute for Space Studies (GISS) in New York City, led the research. Atmospheric Chemistry and Physics published the study last December.

Total solar irradiance, the amount of energy produced by the sun that reaches the top of each square meter of the Earth’s atmosphere, typically declines by about a tenth of a percent during cyclical lulls in solar activity caused by shifts in the sun’s magnetic field. Usually solar minimums occur about every eleven years and last a year or so, but the most recent minimum persisted more than two years longer than normal, making it the longest minimum recorded during the satellite era.

[Graph of the sun's total solar irradiance]
A graph of the sun's total solar irradiance shows that in recent years irradiance dipped to the lowest levels recorded during the satellite era. The resulting reduction in the amount of solar energy available to affect Earth's climate was about 0.25 watts per square meter, less than half of Earth's total energy imbalance. (Credit: NASA/James Hansen)

Pinpointing the magnitude of Earth’s energy imbalance is fundamental to climate science because it offers a direct measure of the state of the climate. Energy imbalance calculations also serve as the foundation for projections of future climate change. If the imbalance is positive and more energy enters the system than exits, Earth grows warmer. If the imbalance is negative, the planet grows cooler.

Hansen’s team concluded that Earth has absorbed more than half a watt more solar energy per square meter than it released over the six-year study period. The calculated value of the imbalance (0.58 watts of excess energy per square meter) is more than twice the reduction in the amount of solar energy supplied to the planet between maximum and minimum solar activity (0.25 watts per square meter).
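A minimal arithmetic sketch, using only the two figures quoted above, makes the "more than twice" comparison explicit (editorial illustration only):

```python
# Figures quoted in the article above (Hansen et al., six-year study period).
imbalance = 0.58            # W/m^2, calculated excess energy absorbed by Earth
solar_min_reduction = 0.25  # W/m^2, drop in solar energy between max and min activity

ratio = imbalance / solar_min_reduction
print(f"imbalance / solar-minimum reduction = {ratio:.2f}")  # ~2.3, i.e. "more than twice"
```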

“The fact that we still see a positive imbalance despite the prolonged solar minimum isn’t a surprise given what we’ve learned about the climate system, but it’s worth noting because this provides unequivocal evidence that the sun is not the dominant driver of global warming,” Hansen said.

According to calculations conducted by Hansen and his colleagues, the 0.58 watts per square meter imbalance implies that carbon dioxide levels need to be reduced to about 350 parts per million to restore the energy budget to equilibrium. The most recent measurements show that carbon dioxide levels are currently 392 parts per million and scientists expect that concentration to continue to rise in the future.
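The release does not show how the 350 ppm target follows from the 0.58 W/m2 figure, but a rough back-of-envelope check using the standard logarithmic CO2 forcing approximation (Myhre et al. 1998), offered here only as an editorial sketch and not necessarily the method Hansen's team used, gives a forcing of about the same size as the stated imbalance:

```python
import math

co2_now = 392.0     # ppm, concentration cited in the article
co2_target = 350.0  # ppm, level the study says would restore the energy balance

# Standard logarithmic approximation for CO2 radiative forcing (Myhre et al. 1998).
delta_forcing = 5.35 * math.log(co2_now / co2_target)
print(f"forcing of 392 ppm relative to 350 ppm: {delta_forcing:.2f} W/m^2")  # ~0.61 W/m^2
```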

Climate scientists have been refining calculations of the Earth’s energy imbalance for many years, but this newest estimate is an improvement over previous attempts because the scientists had access to better measurements of ocean temperature than researchers have had in the past.

The improved measurements came from free-floating instruments that directly monitor the temperature, pressure and salinity of the upper ocean to a depth of 2,000 meters (6,560 feet). The network of instruments, known collectively as Argo, has grown dramatically in recent years since researchers first began deploying the floats a decade ago. Today, more than 3,400 Argo floats actively take measurements and provide data to the public, mostly within 24 hours.

[Argo float and ship]
Data collected by Argo floats, such as this one, helped Hansen's team improve the calculation of Earth's energy imbalance. Credit: Argo Project Office

Hansen’s analysis of the information collected by Argo, along with other ground-based and satellite data, shows that the upper ocean has absorbed 71 percent of the excess energy and the Southern Ocean, where there are few Argo floats, has absorbed 12 percent. The abyssal zone of the ocean, between about 3,000 and 6,000 meters (9,800 and 20,000 feet) below the surface, absorbed five percent, while ice absorbed eight percent and land four percent.
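The quoted shares sum to 100 percent; converting them back to watts per square meter (assuming the shares apply to the 0.58 W/m2 figure) is straightforward:

```python
imbalance = 0.58  # W/m^2, global excess absorbed energy from the study

# Shares of the excess energy, as quoted in the article.
shares = {
    "upper ocean": 71,
    "Southern Ocean": 12,
    "abyssal ocean (3,000-6,000 m)": 5,
    "ice": 8,
    "land": 4,
}

print(f"total of quoted shares: {sum(shares.values())} %")  # 100 %
for sink, pct in shares.items():
    print(f"{sink:32s} {imbalance * pct / 100:.3f} W/m^2")
```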

The updated energy imbalance calculation has important implications for climate modeling. Its value, which is slightly lower than previous estimates, suggests that most climate models overestimate how readily heat mixes deeply into the ocean and significantly underestimate the cooling effect of small airborne particles called aerosols, which along with greenhouse gases and solar irradiance are critical factors in energy imbalance calculations.

“Climate models simulate observed changes in global temperatures quite accurately, so if the models mix heat into the deep ocean too aggressively, it follows that they underestimate the magnitude of the aerosol cooling effect,” Hansen said.

Aerosols, which can either warm or cool the atmosphere depending on their composition and how they interact with clouds, are thought to have a net cooling effect. But estimates of their overall impact on climate are quite uncertain given how difficult it is to measure the distribution of the particles on a broad scale. The new study suggests that the overall cooling effect from aerosols could be about twice as strong as current climate models suggest, largely because few models account for how the particles affect clouds.

[Map showing the global reach of the network of Argo floats. Credit: Argo Project Office]


“Unfortunately, aerosols remain poorly measured from space,” said Michael Mishchenko, a scientist also based at GISS and the project scientist for Glory, a satellite mission designed to measure aerosols in unprecedented detail that was lost after a launch failure in early 2011. “We must have a much better understanding of the global distribution of detailed aerosol properties in order to perfect calculations of Earth’s energy imbalance,” said Mishchenko.


189 Comments
Martin Lewitt
February 2, 2012 7:15 am

Dr. Hansen,
If “assessment of the imbalance requires measurement accuracy approaching 0.1 W m−2”, shouldn't at least that level of accuracy also be needed for model-based attribution of the warming to various forcings and for projection of the climate under future scenarios? I believe you have just conceded as much. I offer you a simple, heartfelt thank you. The last time I teared up about climate science was back at the initial climategate revelations, and it was for quite the opposite reasons.
I know, I know … I'm overreacting, but it is tough not to be emotional when one has had a lifelong love affair with science.

February 2, 2012 8:47 am

Henry@Leif
The link
http://st4a.stelab.nagoya-u.ac.jp/nagoya_workshop_2/pdf/1-1_Lean.pdf
is like a piece of propaganda for AGW.
Leif, you disappoint me.
You should read again my latest results (which is not propaganda)
and try to follow my thinking why I say your carbon foot print is good for earth.
http://www.letterdash.com/HenryP/more-carbon-dioxide-is-ok-ok
I hope you realize that the margin for error (variation) must actually be measured, 3 x M’s
man (various people), if there are none, then that is better,
method (various methods, if available)
machine
in fact there are at least two machines here,
the one doing the actual measurement and the sun itself, which also has variation.
Unless you come to me with actual results on how the error margin that you claim is 0.2%,
was arrived at,
I have to go with the 1.6% that was actually compiled and measured in R-351.
That means of course that the raving and ranting of the missing 0.58 watts is a bit of a joke,
is it not?

George E. Smith;
February 2, 2012 12:08 pm

If that TSI graph is real, and I just presume that it is, why did NASA recently suggest that the current TSI best value is actually 1362 W/m^2, and maybe some change? It was reported here at WUWT, and 1362 is the value I currently use to replace the old historic 1353 W/m^2 of my school days.

February 2, 2012 12:17 pm

HenryP says:
February 2, 2012 at 8:47 am
Unless you come to me with actual results on how the error margin that you claim is 0.2% ,
was arrived at, I have to go with the 1.6% that was actually compiled and measured in R-351.

R-351 is so old that it is a joke that you go with that. The error margin is actually “provided with a relative standard uncertainty (absolute accuracy) of approximately 0.01% (100 parts per million, ppm) based on SI units and with a long-term precision (relative accuracy) of 0.001%/yr (10 ppm).”
Here is how TSI is measured: http://lasp.colorado.edu/sorce/data/tsi_data.htm#quality
The solar cycle variation is of the order of 0.15%.
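For scale, converting those percentages to watts per square meter against a nominal TSI near 1361 W/m^2 (a value cited later in this thread; editorial sketch only):

```python
tsi = 1361.0  # W/m^2, nominal total solar irradiance (value cited later in the thread)

absolute_accuracy = 0.0001     # 0.01%  (100 ppm) absolute accuracy
long_term_precision = 0.00001  # 0.001%/yr (10 ppm/yr) long-term precision
cycle_variation = 0.0015       # ~0.15% solar-cycle variation

print(f"absolute accuracy   ~ {tsi * absolute_accuracy:.2f} W/m^2")       # ~0.14 W/m^2
print(f"long-term precision ~ {tsi * long_term_precision:.3f} W/m^2/yr")  # ~0.014 W/m^2/yr
print(f"cycle variation     ~ {tsi * cycle_variation:.1f} W/m^2")         # ~2.0 W/m^2
```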

George E. Smith;
February 2, 2012 12:19 pm

Further on the above, I have seen a variety of reported satellite measures of TSI, going back about three total solar (half) cycles, but unfortunately no continuous three-cycle record from any one.
And those numbers were more in the 1366-7 range, similar to the first graph above. Yes the various partial satellite records, had some offsets; but that’s a natural result of technology improvement, and I don’t put much significance to those (three) different satellite figures.
What I really would like to know, is HOW GOOD was that old 1353 number; probably a result from rockets or balloons (likely with corrections.)
Some name like Thekaekara comes to mind. Maybe Dr Leif could tell us “who dat ?”
I don’t think current climate is too closely linked to TSI.
” It’s the WATER ! “

George E. Smith;
February 2, 2012 12:35 pm

“”””” Dale says:
January 31, 2012 at 8:46 pm
According to my eyeballing, isn’t the difference between max and min solar cycles around 1.25 W/m2 and not 0.25 as specified by Hansen? “””””
Well no it isn’t. You are looking at the blue line. The red line is the actual TSI data, and it clearly has at least a 1.5 W/m^2 range.
If you want to pick a number, it would seem that TSI is 1366 W/m^2, and what is with that “proxy data” before 1978. Why not just leave that out, since it is not real data, or at least he could have predicted it; excuse me, that’s projected it back to before the first minimum, which might have shown his most recent minimum is simply “ho hum-yawn”
Perhaps the “proxy data” comes from Briffa’s Christmas tree, in Yamal.

George E. Smith;
February 2, 2012 12:42 pm

Are the Argo buoys tethered to some solid place, that has a lat/long co-ordinate or do they simply drift around the ocean; or do they drive themselves to some fixed GPS location ?
How do we know that any particular Argo buoy is always in the same block of water at whatever depth it is at; because the water is certainly not locked to any GPS location?
And remember, they are looking for extremely small Temperature variations, so even with 3400 buoys, Nyquist aliasing noise must be a concern. So how often does any buoy take some Temperature reading, or some location fix?
I agree the buoys are better than nothing; but I wonder about the reliability of the numbers, especially the “proxy data”

February 2, 2012 12:44 pm

George E. Smith; says:
February 2, 2012 at 12:19 pm
What I really would like to know, is HOW GOOD was that old 1353 number;
It is only the last ten years that we have been able to measure absolute TSI with any accuracy. The current value is 1361.5. Here is how it is done: http://lasp.colorado.edu/sorce/data/tsi_data.htm#inst_description

February 2, 2012 1:28 pm

Seems to me there are only two ways to measure the earth’s energy budget:
1. Measure energy entering vs energy leaving. TOA satellite measurements.
Fail: Instrumental accuracy.
(CERES) data is adjusted from the measured 6.4 W m−2 gap to match the modelled 0.85 ± 0.15 W m−2 gap of Hansen et al. (2005) (now, above, it is more recently a 0.58 W m−2 gap?) …. using … taDAH! … modelled data ……
(Loeb etal 2009 ‘Toward Optimal Closure of the Earth’s Top-of-Atmosphere Radiation Budget’ J. Climate, 22, 748–766)
2. Measure the temperature change of ‘the planet’ accurately over a set time period. Then show that any change is significant taking into account all positional and seasonal variations over the set time period, compared with other set periods of time.
Fail: Statistical inaccuracy. We need more replicates. Some ‘slight’ difficulty in choosing suitable ‘control’ comparison periods.
Fail: Measurement accuracy/inadequate time period.

Bill Illis
February 2, 2012 7:21 pm

The only way to measure radiation imbalance through satellites is to give the raw data to a statistical agency staffed by real mathematicians who have no incentive system in the results.
Keep Hansen away from the data.
The Argo data was like this for a period of time. Now Hansen’s co-authors are interpreting it. Of course, it has now been adjusted to CO2 = MC^2.

February 3, 2012 5:27 am

Henry@Leif
The paper you now quote says: (data quality description)
Present absolute accuracy is estimated to be 0.48 W/m^2 (350 ppm), largely determined by the agreement between all four TIM radiometers. The 4.5 W/m^2 by which the TIM reads lower than prior instruments has been resolved as being largely due to internal instrument scatter in those prior instruments causing erroneously high readings (see Kopp & Lean, GRL, 38, L01706, 2011).
The 0.48 W/m^2 works out to 0.04%, to which we must still add the 0.15% that you say is measured as the variation coming from the sun.
That brings us now to about 0.2%.
That is indeed a lot better.
However, the statement that it reads 4.5 W/m2 lower than previous instrumentation raises my eyebrows again. He quotes a very recent paper where presumably the errors of the previous equipment were explained.
Which instrumentation did Hansen use?

February 3, 2012 12:33 pm

@GlynnMhor says:
February 1, 2012 at 4:13 pm
“Now they just need to demonstrate where all that extra warming went..”
Outwards http://www1.ncdc.noaa.gov/pub/data/cmb/teleconnections/olr-5b-pg.gif

Resourceguy
February 3, 2012 1:34 pm

A Physicist says: More broadly, it is not rational for skeptics to underestimate the scientific foresight of James Hansen and his colleagues: their thirty-year predictions from 1981 in ‘Climate Impact of Increasing Atmospheric Carbon Dioxide’ are looking pretty solid right now.
I have a model called the AMO that works quite well from 1981 also.

George E. Smith;
February 3, 2012 3:23 pm

“”””” Leif Svalgaard says:
February 2, 2012 at 12:44 pm
George E. Smith; says:
February 2, 2012 at 12:19 pm “””””
Thanks Leif, I presume that those earlier earth bound scientists applied whatever corrections seemed to make sense to them at the time; but I can see it was a difficult task in those days. Given what we have learned about earth’s outer atmosphere since then, it is surprising they were able to do as well as they did.
The article you pointed to mentioned that they measured the “entire solar spectrum”. Realistically, about what wavelength range does that cover? I can see how a “cavity” sensor can capture pretty much any wavelength, but I wonder how much of it actually registers on the sensor. If the sensor is completely Temperature responsive, it would seem that ANY wavelength can cause heat. It would be the exception for any wavelength to NOT cause heat.

February 3, 2012 5:05 pm

HenryP says:
February 3, 2012 at 5:27 am
That brings us now to about 0.2%.
As I said.
However, the statement that it reads 4.5 W/m2 lower than previous instrumentation raises my eyebrows again. He quotes a very recent paper where presumably the errors of the previous equipment were explained.
Indeed, the early errors are understood as coming from light scattered back into the instrument.
George E. Smith; says:
February 3, 2012 at 3:23 pm
If the sensor, is completely Temperature responsive, it would seem that ANY wavelength can cause heat. It would be the exception for any wavelength to NOT cause heat.
ALL wavelengths are indeed measured, that is why it is called TOTAL solar irradiance.

George E. Smith;
February 3, 2012 10:41 pm

“”””” Leif Svalgaard says:
February 3, 2012 at 5:05 pm
George E. Smith; says:
February 3, 2012 at 3:23 pm
If the sensor, is completely Temperature responsive, it would seem that ANY wavelength can cause heat. It would be the exception for any wavelength to NOT cause heat.
ALL wavelengths are indeed measured, that is why it is called TOTAL solar irradiance. “””””
How poetic; who would have thought that the best way to measure the totality of everything arriving, was to simply WASTE all of it as “heat” ; surely the lowest form of energy life. I hope they designed the cavity to be silent so it didn’t make any noise by converting EM radiation into sound or some other heat “leakage”.
Years ago, HP made some sort of fancy thin film “black gold” Bolometer, that had a tiny thermal mass so it was decidedly rapid responding for a thermal gizmo. That had to be over 30 years ago, so I can't imagine what the modern bolometer technology is.

George E. Smith;
February 3, 2012 10:44 pm

And as a footnote, our reluctant friend Myrrh, still believes that visible light can’t heat anything. Just yakking on your cell phone can heat things.

February 3, 2012 11:00 pm

George E. Smith; says:
February 3, 2012 at 10:41 pm
How poetic; who would have thought that the best way to measure the totality of everything arriving, was to simply WASTE all of it as “heat”
Yes, that’s how it works: http://lasp.colorado.edu/sorce/instruments/tim/tim_concept_op.htm

February 4, 2012 1:26 am

Well, either way,
in case there are some who missed my logic,
0.58 W/m2
works out on the 1361 as 0.04%
And if the combined variation of equipment (0.04) and sun (0.15) = 0.19%
then the 0.58 W/m2 is actually very well within the variation of the whole measuring system and could not possibly be taken as a significant result to draw any conclusions at all…
Not least the blaming of CO2 for the missing 0.58 W/m2
Most recently I have discovered that it is more likely that some warming on earth occurs due to the increase in greenery.
http://wattsupwiththat.com/2012/01/31/jim-hansens-balance-problem-of-0-58-watts/#comment-881571
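The arithmetic HenryP describes, written out as a sketch (this simply reproduces the commenter's own percentages; Leif's reply below disputes whether the absolute-accuracy figure is the relevant one):

```python
tsi = 1361.0                 # W/m^2, nominal TSI used by the commenter
imbalance = 0.58             # W/m^2, Hansen's calculated imbalance
instrument_abs_acc = 0.48    # W/m^2, TIM absolute accuracy quoted upthread
solar_variation_pct = 0.15   # %, solar-cycle variation quoted by Leif

imbalance_pct = 100 * imbalance / tsi                 # ~0.043 %
instrument_pct = 100 * instrument_abs_acc / tsi       # ~0.035 %
combined_pct = instrument_pct + solar_variation_pct   # ~0.19 %, as the commenter combines them

print(f"imbalance  : {imbalance_pct:.3f} % of TSI")
print(f"instrument : {instrument_pct:.3f} % (absolute accuracy)")
print(f"combined   : {combined_pct:.2f} % (the commenter's ~0.19%)")
```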

February 4, 2012 6:20 am

HenryP says:
February 4, 2012 at 1:26 am
And if the combined variation of equipment (0.04)
You got this basically wrong. There is a difference between the absolute calibration and the relative calibration. The latter [which is what matters] has much smaller variation: 0.001%/yr (10 ppm).

pochas
February 4, 2012 8:04 am

A physicist says:
February 1, 2012 at 7:23 am
“In coming decades, Hansen and his colleagues are planning on being proven correct in these three predictions. And given the solid historical track record of Hansen’s 1981 predictions, and the check-for-yourself sensible thermodynamics of Hansen’s new predictions, rational skepticism must now focus its doubt upon those who assert “Hansen’s new predictions are wrong.”
Zadok the Priest and Nathan the Physicist anointed Hansen King.

February 4, 2012 8:07 am

Henry@Leif
The variation of equipment is 0.04% as they reported. Don't confuse that issue.
I think there are two problems that are causing some other confusion for me here:
I quote from the post:
“The calculated value of the imbalance (0.58 watts of excess energy per square meter) is more than twice as much as the reduction in the amount of solar energy supplied to the planet between maximum and minimum solar activity (0.25 watts per square meter).”
The missing 0.58 W/m2 presumably from TSI is compared with 2x the variation coming from the sun.
If the variation of the sun between max and min is only 0.25W/m2 that works out to 0.02%.
But you said the variation from the sun was 0.15%
The other problem could be: are they talking about a missing 0.58 W/m2 from the 240 going back to space from earth? It seemed to me they had recalculated the “missing” 0.58 W/m2 in terms of TSI.
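One plausible reconciliation of the two numbers being compared here, offered as an editorial sketch (the article itself does not spell it out): the roughly 2 W/m2 solar-cycle swing is a change in TSI measured at the top of the atmosphere, while the 0.25 W/m2 figure is the corresponding change in global-mean climate forcing after spherical averaging and subtracting reflected sunlight, roughly delta-TSI x (1 - albedo) / 4:

```python
def tsi_change_to_forcing(delta_tsi, albedo=0.3):
    """Convert a change in TSI (measured at the top of the atmosphere) into a
    change in global-mean climate forcing: divide by 4 for the sphere/disk
    geometry and remove the reflected fraction. The 0.3 albedo is a
    conventional round value and an editorial assumption."""
    return delta_tsi * (1 - albedo) / 4

# Range of min-to-max TSI swings mentioned in the thread (W/m^2).
for delta_tsi in (1.4, 2.0):
    print(f"dTSI = {delta_tsi:.1f} W/m^2 -> forcing change ~ "
          f"{tsi_change_to_forcing(delta_tsi):.2f} W/m^2")
# ~0.25-0.35 W/m^2, the same order as the 0.25 W/m^2 figure in the article,
# which is why dividing 0.25 by 1361 understates the solar-cycle variation.
```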

February 4, 2012 8:09 am

Sorry, that last sentence of my previous post should be:
0.58 W/m2, and not 0.58 %

February 4, 2012 10:08 am

HenryP says:
February 4, 2012 at 8:07 am
The variation of equipment is 0.04% as they reported. Don't confuse that issue.
No, the variation of equipment is not 0.04%. The equipment is stable to 0.001% per year. The uncertainty in the total is 0.04%, but that is a constant and does not vary with time.
If the variation of the sun between max and min is only 0.25W/m2 that works out to 0.02%.
But you said the variation from the sun was 0.15%

The variation between min and max is at most 2 W/m2 or 0.15%, but is cyclic so does not produce a long-term trend.

February 4, 2012 11:50 am

Henry@Leif
I quote both the relevant sections
“Data Quality Description
On-orbit instrument characterization is an on-going effort, as the TIM team regularly tracks instrument degradation and calibrates the instrument servo system on-orbit, periodically updating the data processing system with new calibration values. Only minor corrections are anticipated at this phase in the SORCE/TIM mission. To date the TIM is proving very stable with usage and solar exposure, and long-term relative uncertainties are estimated to be less than 0.014 W/m2/yr (10 ppm/yr). Present absolute accuracy is estimated to be 0.48 W/m^2 (350 ppm), largely determined by the agreement between all four TIM radiometers. The 4.5 W/m^2 by which the TIM reads lower than prior instruments has been resolved as being largely due to internal instrument scatter in those prior instruments causing erroneously high readings (see Kopp & Lean, GRL, 38, L01706, 2011).
Measurement Objectives
The primary objective of the SORCE Total Irradiance Monitor (TIM) instrument is to make precise and accurate measurements of Total Solar Irradiance (TSI), adding to previous TSI measurements in order to continue the long-term climate record. Once on-orbit instrument characterization is complete, these TSI measurements will be provided with a relative standard uncertainty (absolute accuracy) of approximately 0.01% (100 parts per million, ppm) based on SI units and with a long-term precision (relative accuracy) of 0.001%/yr (10 ppm).”
end quote
With due respect,
the data quality description is: as it stands, not what we want or hope it will be (objectives).
As it stands it is: 0.48 W/m2. (0.04%). Remember also the 4.5 W/m2 by which previous results must be reduced. (0.33%)
Once in use, they hope to re-calibrate to get better precision. If they can or have achieved that, and how they did that, is a matter for another report, which you did not yet quote to me.
Personally, I think reaching 0.01% is perhaps a bit over-optimistic.
About the variation coming from the sun: I read the same as what you said somewhere else. If the rate of change every year is more or less constant, or follows a curve, it could be tracked;
however, if you look at the first graph in this post, there is (still) an awful great variation in the 31 day running mean, even in the blue area.
Referring to that first graph in this post, I understand now that the 0.25 applies to solar forcing (i.e. outgoing from earth);
sorry about me misunderstanding that.
However, for the blue area (period analysed) it is not clear to me if, how and when the correction of 4.5 W/m2 was applied. (I believe the average for TSI should now be 1361.5 W/m2)