Guest Post by Ira Glickstein.
The graphic from RealClimate asks “How well did Hansen et al (1988) do?” They compare actual temperature measurements through 2012 (GISTEMP and HadCRUT4) with Hansen’s 1988 Scenarios “A”, “B”, and “C”. The answer (see my annotations) is “Are you kidding?”
HANSEN’S SCENARIOS
The three scenarios and their predictions are defined by Hansen 1988 as follows:
“Scenario A assumes continued exponential trace gas growth, …” Hansen’s predicted temperature increase, from 1988 to 2012, is 0.9 °C, OVER FOUR TIMES HIGHER than the actual increase of 0.22 °C.
“scenario B assumes a reduced linear growth of trace gases, …” Hansen’s predicted temperature increase, from 1988 to 2012, is 0.75 °C, OVER THREE TIMES HIGHER than the actual increase of 0.22 °C.
“scenario C assumes a rapid curtailment of trace gas emissions such that the net climate forcing ceases to increase after the year 2000.” Hansen’s predicted temperature increase, from 1988 to 2012, is 0.29 °C, ONLY 31% HIGHER than the actual increase of 0.22 °C.
So, only Scenario C, which “assumes a rapid curtailment of trace gas emissions” comes close to the truth.
THERE HAS BEEN NO ACTUAL “CURTAILMENT OF TRACE GAS EMISSIONS”
As everyone knows, the Mauna Loa measurements of atmospheric CO2 prove that there has NOT BEEN ANY CURTAILMENT of trace gas emissions. Indeed, the rapid increase of CO2 continues unabated.

What does RealClimate make of this situation?
“… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change). … The conclusion is the same as in each of the past few years; the models are on the low side of some changes, and on the high side of others, but despite short-term ups and downs, global warming continues much as predicted.”
Move along, folks, nothing to see here, everything is OK, “global warming continues much as predicted.”
CONCLUSIONS
Hansen 1988 is the keystone of the entire CAGW Enterprise, the theory that Anthropogenic (human-caused) Global Warming will lead to a near-term Climate Catastrophe. RealClimate, the leading Warmist website, should be congratulated for publishing a graphic that so clearly debunks CAGW and calls into question all the Climate models put forth by the official Climate Team (the “hockey team”).
Hansen’s 1988 models are based on a Climate Sensitivity (predicted temperature increase given a doubling of CO2) of 4.2 °C. The actual CO2 increase since 1988 lies somewhere between Hansen’s Scenario A (“continued exponential trace gas growth”) and Scenario B (“reduced linear growth of trace gases”). Given the failure of Scenarios A and B, which ran high by a factor of three or four, it would be reasonable to assume that Climate Sensitivity is closer to 1 °C than 4 °C.
As for RealClimate’s conclusion that Hansen’s simulation “out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change)”, they are WRONG. Even a “naive” prediction of no change would have been closer to the truth (low by 0.22 °C) than Hansen’s Scenario A (high by 0.68 °C) or Scenario B (high by 0.53 °C)!
It looks like Hansen has shown that whatever was causing climate forcing ceased perhaps a little before 2000. If his methodology has any sound basis, perhaps he demonstrated that CO2 was never causing the temperature increase he was tracking. He has possibly presented a pretty good argument against assigning temperature increases of the past few decades to CO2 increases.
Hah, I just spotted the “Teams” way out.
In a comment above, someone was critiquing the CO2 measuring methods at Mauna Loa; there’s the escape clause.
“We have spotted an error in the CO2 record, emissions actually dropped in 1998 and have stabilized/fallen ever since.” Therefore Scenario C is 100% accurate, honest.
Did I need sarc?
richard verney says:
March 20, 2013 at 5:49 pm
“As far as your CO2 derivative, are you seriously suggesting that as a consequence of manmade CO2 emissions the global derivative in 1997 was 0.02 but in 1998 it was 0.31.”
You appear to be missing my point entirely. “Manmade CO2 emissions” are having no discernible effect on CO2 levels at all. It is essentially all being driven by temperature. The differential equation describing the relationship is
dCO2/dt = k*(T – To)
dCO2/dt = derivative of atmospheric CO2 concentration
k = coupling constant in ppm/degC/unit-of-time
T = global temperature anomaly
To = equilibrium level for global temperature anomaly
“I think that what you are looking at when you are looking at in the CO2 derivative is a response, not a cause.”
Exactly!
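Bart’s relation is easy to simulate. Here is a minimal forward-Euler sketch of dCO2/dt = k*(T – To); the values of k and To below are illustrative placeholders chosen for the example, not fitted values from any data set:

```python
# Forward-Euler integration of Bart's relation dCO2/dt = k*(T - To).
# k and To are illustrative placeholders, NOT fitted values.
k = 0.2    # coupling constant, ppm per degC per month (assumed)
To = -0.1  # equilibrium temperature anomaly, degC (assumed)
dt = 1.0   # time step: one month

def integrate_co2(co2_start, temps):
    """Integrate atmospheric CO2 (ppm) over a monthly temperature-anomaly series."""
    co2 = co2_start
    trajectory = [co2]
    for T in temps:
        co2 += k * (T - To) * dt   # dCO2/dt = k*(T - To)
        trajectory.append(co2)
    return trajectory

# A flat anomaly of 0.4 degC held for 12 months adds k*(0.4 - To)*12 = 1.2 ppm.
traj = integrate_co2(315.0, [0.4] * 12)
print(round(traj[-1] - traj[0], 2))  # 1.2
```

The point of the model is visible even in this toy run: the CO2 *rate of change* is set by how far temperature sits above the equilibrium level, so a sustained anomaly produces a steady ramp in concentration.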
Bart says:
March 20, 2013 at 10:31 am
Ira comments:
[Bart: Have a close look at the Mauna Loa CO2 data (second graphic above) and tell me where CO2 has “leveled out”. Yes, since CO2 has its small seasonal ups and downs there are some months where it is level, but, on a smoothed yearly basis it seems to me to be a continued rapid increase that is slightly exponential in the upward direction. Ira]
The slope has leveled out. Look right there at the end where temperatures are leveling out, too. That is what I said: “But, the rate of change of CO2 has leveled out, in lockstep with the leveling out of temperatures.” Rate. Of. Change. Not the concentration itself, but its rate of change.
[Bart: What does that link or your mathematical jibber-jabber have to do with the past century? Or with the Atmospheric “Greenhouse Effect” that makes the Earth Surface at least 30 °C warmer than it would be if the Atmosphere lacked “Greenhouse gases”? I read your words four times and visited your link. Please explain your point in English. advTHANKSance. Ira]
Just because the GHE heats up the surface above what it would be without the GHGs in the atmosphere does not mean that adding more of a particular GHG will increase it further. It is the difference between a global feature of a function and local behavior. For example, the function
y = 3*x – 6*x^2 + 4*x^3
is a generally increasing function of x. But, between about 0.4 and 0.6, it is essentially flat. At 0.5, the local derivative is zero. A small change in x hardly changes the function at all.
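Bart’s flat-spot claim checks out analytically: the derivative of y = 3x - 6x^2 + 4x^3 is 3 - 12x + 12x^2, which is exactly zero at x = 0.5. A quick numerical check:

```python
def y(x):
    """Bart's example function: y = 3x - 6x^2 + 4x^3."""
    return 3*x - 6*x**2 + 4*x**3

def dy(x):
    """Analytic derivative: dy/dx = 3 - 12x + 12x^2."""
    return 3 - 12*x + 12*x**2

print(dy(0.5))                     # 0.0 -- local slope is zero at x = 0.5
print(round(y(0.6) - y(0.4), 3))   # 0.008 -- nearly flat between 0.4 and 0.6
```

So over the interval [0.4, 0.6] the function changes by less than 1% of its x = 1 value, even though it is globally increasing, which is exactly the local-versus-global distinction the comment is drawing.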
Similarly, the functional dependence of surface temperature on increasing CO2 appears to be such that, though it warms the surface above what it otherwise would be, we are at a point on the function where adding more doesn’t lead to a significant increase in temperature.
[Bart: I have to agree with Richard Verney that there is no FIRST ORDER correlation between temperature rise and the rise in CO2 emissions. Please don’t simply point to some graph. Explain what you mean so even Richard and I can understand. advTHANKSance. Ira]
Not emissions! Concentration! Emissions are hardly doing anything at all. You can essentially ignore them. It is clear that they are rapidly being sequestered or otherwise transported out of the surface system by ocean, mineral, and biological processes.
Those flows are minuscule compared to natural flows. Temperatures are driving the rate of change of CO2. It is a continuous flow problem, in which the differential rate between sources and sinks is being modulated by temperature, as I described here on this page.
Thanks for responding. I hope I have made things clearer and that someone will begin to appreciate what is so patently obvious in the data: Atmospheric CO2 concentration is not being driven significantly by human inputs. It’s the temperature modulation of a continuous transport system.
Lubos Motl has a detailed look at the Mauna Loa CO2 data.
http://motls.blogspot.ca/2013/02/mauna-loa-carbon-dioxide-fit.html
Ira-
Since the last time Hansen et al. ’88 was the topic here at WUWT, I have been studying the paper and doing some calculations and plotting. Unfortunately, my lack of internet skills has prevented me from preparing a post to submit.
The main point I want to make is that there is a fourth scenario that nobody mentions. It is shown in the very first figure of Hansen ’88. This scenario is the 100 year control run. It is called the “control run” because all the ghg’s were held constant at the 1958 values. That’s right, the control run keeps the ghg’s constant for 100 years (to 2058).
However, to accurately compare Hansen’s scenarios with measurements, it is necessary to compare the ghg’s in the scenarios with the ghg’s estimated from measurements. The tabular listing of the ghg’s is at RealClimate at:
http://www.realclimate.org/data/H88_scenarios.dat
The NOAA estimated ghg’s in graphical form by year are at:
http://www.esrl.noaa.gov/gmd/aggi/aggi_2011.fig2.png
When these are compared it can be seen that Hansen’s Scenario C is a good match for everything but CO2. Up until 2010, Scenario B was the best estimate of CO2. Fortunately, Hansen’s paper presents a method for adjusting for these differences. In the words of the paper: “The forcing for any other scenario of atmospheric trace gases can be compared to these three cases by computing the [delta t] with the formulas provided in Appendix A.”
Using the Appendix B formulas for CO2, the difference between the CO2 delta T for Scenarios B and C can be calculated. This difference can then be added to the delta T for Scenario C to obtain a scenario that is approximately correct for all ghg’s Hansen considers.
For 2010 this adjustment for CO2 adds about 0.11 deg C to the delta T for Scenario C.
I chose to compare the adjusted Scenario C to the UAH satellite measurements. However before that can be done the Hansen Scenarios must be adjusted to the same base years as the UAH measurements (1981-2010). You must use the control run to do this. Then add the delta T differences between the control run and the various Scenarios to the adjusted control run delta T for each year to obtain the Scenarios on an ’81 to ’10 basis.
When this is done the following temperature anomalies are obtained for 2010 (using a 5yr, centered average):
Scenario C adjusted for CO2 and base year = 0.63 deg. C.
UAH = 0.18 deg C.
Control run = 0.08 deg C.
The fact that the 2010 temperature anomaly based on measurements is 0.45 deg. C. lower than the adjusted Scenario C, but only 0.1 degree above the Control run shows that the model is completely useless in predicting future temperature, and that the methodology for determining the effects of ghg’s is deeply flawed.
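The bookkeeping above can be laid out as a short script. Every number in it is taken directly from this comment (the 0.63, 0.18, and 0.08 deg C anomalies for 2010); nothing here is recomputed from Hansen ’88 itself:

```python
# 2010 temperature anomalies (5-yr centered averages), as given in the comment above.
scenario_c_adjusted = 0.63  # Scenario C, adjusted for CO2 and base period (deg C)
uah = 0.18                  # UAH satellite anomaly (deg C)
control_run = 0.08          # Hansen's constant-ghg control run (deg C)

# Model overshoot versus measurement, and measurement versus the control run.
print(round(scenario_c_adjusted - uah, 2))  # 0.45 -- adjusted Scenario C minus UAH
print(round(uah - control_run, 2))          # 0.1  -- UAH minus control run
```

The asymmetry of those two gaps (0.45 vs 0.1) is the whole argument: the measurement sits far closer to the no-forcing control run than to the best-adjusted forced scenario.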
The key is to include the control run in all scenario comparisons.
Leave it to an (Old) Engineer to point out something so lost in the hustle and bustle, and to point it out so clearly.
Old Engineer says:
March 20, 2013 at 10:56 pm
Nice. The evidence is piling up – AGW is dead. The only question is how long its inertia will keep it going.
If anyone wants to know what the next 30 or so years is going to look like, appropriately offset and graft the portion of this plot from about 1945 to 1975 onto today’s temperatures. And, for CO2, expect the rate of change to decrease with the temperatures as it is already doing.
AGW is a Crime against Humanity and its perpetrators should be pursued through the courts. Whinging in the press does nothing to ameliorate the hurt AGW has imposed on countless millions by impoverishing them with gargantuan fuel bills and even starvation. Hansen et al. should be held to account. Why, it seems, is no one really mad with ANGER?
Re: Hansen.
Hansen abandoned interest in Venus, which was his first area of research, at the precise point when atmospheric conditions became measurable. That is, when whatever speculations he had advanced could be verified empirically. Or disproved.
He claims that just at the point where this was going to occur, he suddenly decided to focus, effective immediately, on the earth and AGW. Without knowing whether the fruits of years of work were vindicated.
This is utter bullshit.
People simply do not behave this way.
If someone with sufficient expertise looks at his published papers prior, and the actual measurements obtained, it will be obvious that he was wrong. At a guess, wildly, undeniably, wrong.
Thus his subsequent life: determination never to be shown wrong achieved through his control and manipulation of data.
Hansen didn’t include the effects of the sun and ocean cycles on multi-decadal time scales throughout the 20th century. He simply assumed that all late 20th century warming was being caused by human greenhouse emissions, and the failure of his forecast since then has shown his theory to be wrong.
It’s not surprising that RC would use a red herring to further their illogical argument. The need to use a “no warming” comparison is just plain silly. It completely ignores the most likely comparison of ocean oscillations. When the PDO/AMO are considered, the current warming is almost a perfect match. Their dishonesty is typical of the alarmist mind.
Looking at the “Adjusted Data”, in the RC article, I wonder how Roy Spencer feels on how his UAH data was “adjusted”.
Ira
Good graph & explanation.
May I suggest comparing Hansen’s predictions with a first order null hypothesis of continued warming from the Little Ice Age. e.g. see
Syun-Ichi Akasofu, On the recovery from the Little Ice Age
Natural Science Vol.2, No.11, 1211-1224 (2010) doi:10.4236/ns.2010.211149
Openly accessible at http://www.scirp.org/journal/NS/
On CO2, Fred H. Haynie provides fascinating analyses of CO2 versus latitudinal temperatures in The Future of Global Climate Change. e.g. in slide 10 he shows polar CO2 driven by the respective polar ice extent.
Old Engineer March 20, 2013 at 10:56 pm wrote:
BRILLIANT! Correcting Hansen Scenario C for the true atmospheric CO2 you get 0.63 °C warming, but the true warming, based on UAH (satellite sensors), is only 0.18 °C.
Therefore, Hansen Scenario C, corrected for CO2, predicts a temperature rise 0.63/0.18 = 3.5 times too high. Thus, Hansen’s assumed Climate Sensitivity of 4.2 °C is too high by a factor of 3.5. Had he assumed a Climate Sensitivity of 1.2 °C, his Scenario C would have made a better prediction.
This confirms my contention that Climate Sensitivity is closer to 1 °C than the IPCC’s 3 °C.
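The scaling above is simple proportionality and is worth checking mechanically. Note the hidden assumption: it treats the model’s warming as linear in the assumed sensitivity, which is only a rough first-order approximation, not anything Hansen ’88 guarantees:

```python
# Sensitivity rescaling argument, using the figures quoted in the comments above.
predicted = 0.63           # adjusted Scenario C warming, deg C
observed = 0.18            # UAH warming, deg C
assumed_sensitivity = 4.2  # Hansen's climate sensitivity, deg C per CO2 doubling

overshoot = predicted / observed                       # how far the model ran high
implied_sensitivity = assumed_sensitivity / overshoot  # linear-rescaling estimate

print(round(overshoot, 1))            # 3.5
print(round(implied_sensitivity, 1))  # 1.2
```

So the 1.2 °C figure is simply 4.2 divided by the 3.5x overshoot; it stands or falls with the linear-scaling assumption.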
Ira
Hansen, the 1977 Ice Age man, strikes out again……face it, he is a moron.
Ira-
It was very late (for my time zone) last night when I saw your post, so my comments were brief. I didn’t get a chance to thank you for bringing the RealClimate post to our attention.
Thanks also for so forcefully pointing out that the RC post was just plain wrong: the “naive” prediction of no change, which is exactly what Hansen’s own Control Run is (no change since 1958!), is closer to the measured temperature anomaly than any of Hansen’s scenarios.
As you pointed out, “Hansen 1988 is the keystone of the entire CAGW Enterprise.” So it needs to be hammered home again and again that the premise on which this keystone is based has been shown to be wrong.
I’m not sure I like the “climate sensitivity” way of looking at ghg effects. But since there is a long history of using climate sensitivity, I certainly agree with you that it is probably close to 1. I think others have also come to the same conclusion using different approaches.
“THERE HAS BEEN NO ACTUAL ‘CURTAILMENT OF TRACE GAS EMISSIONS’
As everyone knows, the Mauna Loa measurements of atmospheric CO2 proves that there has NOT BEEN ANY CURTAILMENT of trace gas emissions. Indeed, the rapid increase of CO2 continues unabated.”
You have made the same error that McIntyre made several years ago at CA. Look carefully at Fig 2; you’ll see that the difference between Scenarios A & B is less than 0.1 °C by now. Hansen clearly showed that the main cause of temperature increase up to now would be due to the ‘trace gases’ if they continued to increase at the then rates. This is clearly stated in the abstract and the Intro para.
[Phil: Fig 2 of what? My WUWT posting of the RealClimate posting I got the base graphic from? Yes, according to the RealClimate graphic (which is the first graphic of my posting) there is only a small difference between A and B now, and BOTH are WRONG (as compared to ACTUAL), and BOTH assume continued “trace gas emissions”, and all Hansen Scenarios (A, B, and C) assume a Climate Sensitivity of 4.2 °C. The ONLY one of Hansen’s scenarios that is close to the ACTUAL is Scenario C, which is ONLY 31% high. What is your point? What is your evidence? Do they teach clarity at Princeton? Please state your points clearly and I will try to respond. advTHANKSance. Ira]
Bart says: March 20, 2013 at 7:47 pm
//////////////////////////////////////////////////////
Bart
It is often said that CO2 is a “well mixed” gas. But the accuracy of that claim depends upon the meaning one ascribes to ‘well’. Factually, is it ‘well mixed’ or only ‘reasonably well mixed’? The latest satellite data suggests that it is not what I would classify as being ‘well mixed’ but instead it is rather lumpy (for want of a better expression). Indeed, Moustafa Chahine of the NASA Jet Propulsion Laboratory (JPL), AIRS’s principal scientist, in a press conference partly conceded as much. He said: “Contrary to prevailing wisdom, CO2 is not well mixed in the mid-troposphere,”
Of course there are large variations in the concentration of CO2 with geographical location (i.e., between the hemispheres), and this variation can become more marked depending upon the season. Even the concentration of CO2 varies quite significantly from night to day, and this daily variation is more or less marked again depending upon geographical location. There are also altitude variations, etc. These variations can exceed 20 ppm (say 5 or so percent).
Having read your comments (unfortunately only quickly, and I probably have not done them justice), it appears that we are talking at cross purposes. I am looking at the response to CO2 on a multidecadal time frame, whereas you appear to be looking at response measured upon a much smaller time frame (measured in months). I understand your point but I remain unsure of the extent to which it deals with climate sensitivity on a multidecadal basis. Are you not looking more at seasonal variations, and of course, seasons and temperatures have a close correlation?
My bottom line take is that you cannot begin to ascertain climate sensitivity until you first know and understand everything there is to know about natural variation, and in particular its bounds. Until that is known, it is not possible to separate the CO2 signal from noise. Presently, the data sets are not fit for purpose; they are low resolution and far too noisy. In addition, can they even be relied upon, given the endless adjustments, the need for and correctness of which are moot?
The only thing we really know about climate sensitivity is that natural variation is stronger; it can trump climate sensitivity. Proof: (1) there has been no warming these past 17 years, since the downward forcings associated with natural variation have equalled the upward forcing component of climate sensitivity to CO2 during this 17 year period; (2) there was cooling between say 1940 and say 1975, because the downward forcings associated with natural variation more than equalled the upward forcing component of climate sensitivity to CO2 during that period, such that the strength of natural variation forcings was able to drive temperatures down even in the face of the warming effect of CO2!
If climate sensitivity is high, then in view of (2) above, we know that natural variation is even stronger!
Of course, I appreciate that you consider that CO2 can do nothing and it may be that is the case, or may be it is the case given the saturated levels already reached today. I myself am unable to make any firm assertions mainly because of the poor quality data available, and its error bars, coupled with a lack of empirical observational experimentation on the issues raised. I feel that we are all groping around in the dark.
I don’t think the Earth’s atmosphere increases temperature by 30 C. The Moon has an average surface temperature of about -5 C and for Earth it is about 14 C. The Moon has no atmosphere. Atmospheric pressure alone increases temperature. A case in point is Venus. I am willing to bet that if the Moon had a thick atmosphere comprising only oxygen and nitrogen, with no CO2, water vapor or any other greenhouse gas, the Moon would have an average temperature of at least 10 C.
amoorhouse says:
March 20, 2013 at 8:19 am
—————————————————————————–
Well said. Nothing they said and did was close. However, they have succeeded in severely damaging the world’s economy.
I THINK the calculated sensitivity with *no positive feedback* is 0.4 °C, and that I would say is right.
Water is, as far as I can see, a massive NEGATIVE feedback system, but with multi-decadal lags due to oceanic thermal inertia, so it tends to lead to oscillatory behaviours.
“BRILLIANT! Correcting Hansen Scenario C for the true Atmospheric CO2 you get 0.63 °C warming, but the true warming, based on UAH (satelite sensors) is only 0.18 °C.”
Some funny arithmetic here. OE says the adjustment for CO2 is 0.11 °C. You say Scen C had a rise of 0.29 °C. That makes 0.4, as I understand.
The rise in GIStemp LOTI was 0.393°C. Looks pretty good to me!
You say UAH showed 0.18°C warming? Looks like 0.41°C to me.
richard verney says:
March 21, 2013 at 12:07 pm
“…whereas you appear to be looking at response measured upon a much smaller time frame (measured in months).”
Well, 55 years is 660 months. If you call that a “small” time frame, then…¯\_(ツ)_/¯
“Are you not looking more at seasonal variations, and of course, seasons and temperatures have a close correlation?”
It is 55 years in which the CO2 derivative has matched the temperature anomaly perfectly. That 55 years saw the major part of the increase in atmospheric concentration from “pre-industrial” levels.
I’m wondering if you understand that the derivative contains all the information needed to reconstruct the entire change during the time interval it is plotted? I did that here. That’s all you need to reconstruct the entire delta-CO2 in the last 55 years to high fidelity: temperature. You don’t need human inputs. Their contribution is therefore necessarily insignificant.
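For readers wondering what “reconstruct from the derivative” means mechanically: if dCO2/dt = k*(T - To), then summing that rate over the record recovers the total CO2 change. A minimal sketch with placeholder values of k and To and a made-up anomaly series (Bart’s actual fitted values and data are at his link, not here):

```python
# Reconstruct the total delta-CO2 by cumulatively summing k*(T - To),
# per Bart's claim that the derivative encodes the entire change.
# k, To, and the anomaly series are illustrative, NOT fitted values.
k, To = 0.18, -0.2                       # ppm/degC/month and degC (assumed)
temps = [0.1, 0.3, 0.2, 0.5, 0.4, 0.6]   # hypothetical monthly anomalies (degC)

delta_co2 = sum(k * (T - To) for T in temps)
print(round(delta_co2, 3))  # total reconstructed CO2 change, about 0.594 ppm here
```

This is just discrete integration: whether the reconstruction matches the real Mauna Loa record depends entirely on how well the fitted k and To describe the data, which is the substance of the dispute in this thread.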
Nick Stokes says:
March 21, 2013 at 3:59 pm
=================================================================
Nick-
Perhaps I didn’t put enough description on my values.
First, the anomaly values I gave were 5 year averages, just as Hansen did in Figure 3 of Hansen ’88.
Second, they were at year 2010 (so the average was from 2008 to 2012 inclusive).
Next, the 0.11 deg. C. was the increase in the Scenario C 2010 temperature anomaly that would result if the Scenario C 2010 atmospheric CO2 concentrations were at the same level as the Scenario B CO2 concentrations. This was calculated from the equations in Appendix B of Hansen ’88, using the method Hansen presented. Thus, the 0.11 should be added to the Scenario C 2010 temperature anomaly shown in Figure 3 of Hansen ’88.
Finally, note that to compare Hansen’s scenarios to UAH, it is necessary to adjust the base to the UAH base, which is 1981 to 2010. Hansen’s base was the average of the 100 year Control Run. To do this, the Control Run is adjusted to an ’81 to ’10 base, then the differences between the original Control Run and the three scenarios are added to the adjusted Control Run to get the adjusted scenarios. The adjusted values are slightly different from the Hansen values shown in Ira’s graph.
As for the UAH temperature anomaly, the 0.18 is the 61 month average, not the yearly value.
Ira, the Fig 2, abstract and Introduction I referred to are from Hansen (88), the paper that’s under discussion! I naively assumed that you’d actually read the paper whose results you’re critiquing. That you apparently haven’t explains some of your misunderstandings.