How well did Hansen (1988) do?

Guest Post by Ira Glickstein.

The graphic from RealClimate asks “How well did Hansen et al (1988) do?” They compare actual temperature measurements through 2012 (GISTEMP and HadCRUT4) with Hansen’s 1988 Scenarios “A”, “B”, and “C”. The answer (see my annotations) is “Are you kidding?”

[Annotated RealClimate graphic: Hansen88]

HANSEN’S SCENARIOS

The three scenarios and their predictions are defined by Hansen 1988 as follows:

“Scenario A assumes continued exponential trace gas growth, …” Hansen’s predicted temperature increase, from 1988 to 2012, is 0.9 °C, OVER FOUR TIMES the actual increase of 0.22 °C.

“scenario B assumes a reduced linear growth of trace gases, …” Hansen’s predicted temperature increase, from 1988 to 2012, is 0.75 °C, MORE THAN THREE TIMES the actual increase of 0.22 °C.

“scenario C assumes a rapid curtailment of trace gas emissions such that the net climate forcing ceases to increase after the year 2000.” Hansen’s predicted temperature increase, from 1988 to 2012, is 0.29 °C, ONLY 31% HIGHER than the actual increase of 0.22 °C.

So, only Scenario C, which “assumes a rapid curtailment of trace gas emissions” comes close to the truth.
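
For readers who want to check these ratios, here is a minimal Python sketch (not part of the original post) that recomputes them from the values read off the annotated graphic; the 0.22 °C figure and the scenario numbers are the post’s own estimates, not independent data:

    # Sketch: recompute the scenario-vs-actual ratios quoted above.
    # Values are the post's eyeballed estimates from the RealClimate graphic.
    actual = 0.22                                   # observed 1988-2012 rise, deg C
    predicted = {"A": 0.90, "B": 0.75, "C": 0.29}   # Hansen 1988 scenario rises, deg C

    for scenario, temp in predicted.items():
        ratio = temp / actual
        excess = temp - actual
        print(f"Scenario {scenario}: {ratio:.1f}x the observed rise "
              f"(over-prediction of {excess:.2f} deg C)")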

THERE HAS BEEN NO ACTUAL “CURTAILMENT OF TRACE GAS EMISSIONS”

As everyone knows, the Mauna Loa measurements of atmospheric CO2 prove that there has NOT BEEN ANY CURTAILMENT of trace gas emissions. Indeed, the rapid increase of CO2 continues unabated.

What does RealClimate make of this situation?

“… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change).  …  The conclusion is the same as in each of the past few years; the models are on the low side of some changes, and on the high side of others, but despite short-term ups and downs, global warming continues much as predicted.”

Move along, folks, nothing to see here, everything is OK, “global warming continues much as predicted.”

CONCLUSIONS

Hansen 1988 is the keystone of the entire CAGW Enterprise, the theory that Anthropogenic (human-caused) Global Warming will lead to a near-term Climate Catastrophe. RealClimate, the leading Warmist website, should be congratulated for publishing a graphic that so clearly debunks CAGW and calls into question all the Climate models put forth by the official Climate Team (the “hockey team”).

Hansen’s 1988 models are based on a Climate Sensitivity (predicted temperature increase given a doubling of CO2) of 4.2 °C. The actual CO2 increase since 1988 is somewhere between Hansen’s Scenario A (“continued exponential trace gas growth”) and Scenario B (“reduced linear growth of trace gases”), so, based on the failure of Scenarios A and B, namely their being high by a factor of three or four, it would be reasonable to assume that Climate Sensitivity is closer to 1 °C than 4 °C.
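
The rescaling implied by the paragraph above can be made concrete. This is only a sketch under a strong assumption (that the over-prediction scales linearly with the assumed sensitivity, all else held equal); it uses the post’s eyeballed numbers, not a formal attribution:

    # Rough rescaling sketch: shrink Hansen's assumed 4.2 deg C sensitivity by the
    # ratio of observed to predicted warming. Post's estimates, not an attribution.
    assumed_sensitivity = 4.2      # deg C per CO2 doubling (Hansen 1988)
    observed = 0.22                # deg C observed, 1988-2012

    for scenario, predicted in [("A", 0.90), ("B", 0.75)]:
        implied = assumed_sensitivity * observed / predicted
        print(f"Scenario {scenario}: implied sensitivity ~{implied:.1f} deg C per doubling")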

As for RealClimate’s conclusion that Hansen’s simulation “out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change)”, they are WRONG. Even a “naive” prediction of no change would have been closer to the truth (low by 0.22 °C) than Hansen’s Scenarios A (high by 0.68 °C) and B (high by 0.53 °C)!

Steve
March 20, 2013 7:30 am

I predicted the UK Wildcats would make a good showing in the Tournament this year… clearly my prediction “out-performed any reasonable naive hypothesis that people put forward” back in January.
I love outcome based predictions! LOL!

March 20, 2013 7:31 am

Climate sensitivity to CO2 is exactly 0C !….period..

Luther Wu
March 20, 2013 7:31 am

Thanks Ira, for presenting this info from Real Climate’s website, I never go there. It’s an unhealthy place to visit, what with blood pressure, and all.
They convict themselves with efforts like this, but who could convince them of it?

Mark Bofill
March 20, 2013 7:33 am

Now wait a minute there Anthony, let’s think this through for a minute.
If RealClimate is arguing that Hansen’s predictions are accurate because we’re in Scenario C, doesn’t that mean that our (non existent) mitigation efforts have been successful? We’re not emitting CO2 into the atmosphere anymore, is basically what they’re saying; we haven’t been since 2000. Great! Let’s quit with the legislative efforts to destroy the fossil fuel industry, stop the wind and solar subsidies, dismantle the IPCC, declare victory and go home.
Maybe? Somehow I have a nagging feeling that won’t work, although I can’t quite put my finger on why… /sarc

Mark Bofill
March 20, 2013 7:35 am

Mark Bofill says:
March 20, 2013 at 7:33 am
——
Gah, beg pardon, I should have addressed Ira, not Anthony.

March 20, 2013 7:38 am

Reblogged this on This Got My Attention and commented:
It’s time to hold the know-it-all predictors accountable for their exaggerations.

rilfeld
March 20, 2013 7:39 am

Comparing forecasts to the real world is patently unfair. If you were to compare the forecasts to other forecasts, you would find them spot on, indicating a consensus.
Besides, you didn’t correct the measured temperatures upward to compensate for all the energy used in our plethora of severe storms. We don’t need a weatherman to know which way the wind blows…..(apology & /Sarcoff)

mycroft
March 20, 2013 7:40 am

RealClimate and others ignore the hiatus in global temperature rise; it’s as if they stick their fingers in their ears and go BLAH, BLAH, BLAH whenever the data is put in front of them… and these people have the nerve to call themselves scientists

rilfeld
March 20, 2013 7:41 am

ER, it must be really really warm in Lexington this morning.

tallbloke
March 20, 2013 7:42 am

Good summary. The whole sensitivity debate is a red herring however. Changes in CO2 level FOLLOW changes in temperature, at ALL timescales. Cause precedes effect.
http://tallbloke.wordpress.com/2011/06/26/which-causes-which-out-of-atmospheric-temperature-and-co2-content/

johnmarshall
March 20, 2013 7:45 am

CO2 emissions will continue because nature contributes some 33 times MORE than we do. Climate will continue its natural change pattern despite this because climate change is solar driven NOT CO2 driven.
There are NO peer reviewed papers that show that CO2 drives climate. There are model runs that do but these are based on CO2 driving climate so constitute zero proof and show a circular argument when introduced. There are plenty of peer reviewed papers showing that CO2 does NOT drive either temperature or climate.
Get real.

Joe
March 20, 2013 7:46 am

Interesting that the nearest match is the “no forcing increase past 2000” which actually seems to match reality since 2000 fairly well, but had already run “high” by the millennium to create the over-estimate today.
Since CO2 hasn’t stopped increasing, surely that suggests that CO2 is not the relevant forcing and (assuming the simple radiative forcing model is basically sound in the first place) we should be looking for some other variable that was increasing up to 2000 and has flattened / fallen since?
CO2? This is not the forcing you’re looking for.

March 20, 2013 7:49 am

I wonder, if the temperature is driving CO2 and not CO2 driving the temperature, then at what temperature drop does the amount of CO2 go down?
Did anyone ever publish any data about that ?
[Great question! According to the Ice Core data over several hundred thousand years:
1) Temperature drops and stays low for hundreds of years until CO2 begins to drop. Then there is a period of hundreds of years while Temperatures stay low and CO2 drops to a minimum level. Then a long period where both Temperature and CO2 are low.
2) Then, Temperature rises and stays high for hundreds of years until CO2 begins to rise. Then there is a period of hundreds of years while Temperatures stay high and CO2 rises to a maximum level. Then a long period where both Temperature and CO2 are high.
3) Repeat above process for multiple cycles.
Of course, that was well before humans evolved on Earth and everything changed :^)
Ira]

pokerguy
March 20, 2013 7:49 am

RealClimate as ever is asking, “Who are you going to believe? Us or your own lying eyes?”

Doug Proctor
March 20, 2013 7:59 am

It has been 26 years since Hansen et al did the work noted above. The IPCC is in the process of producing the fifth report since then, involving “thousands” of “top-notch” “climate” “scientists” (each category is, of course, a piece of misinformation). In the world of science and business and general human lives, during that period we would expect the range of Scenarios to have been narrowed. But we are still faced with A through C.
This is why references to how Hansen’s predictions have fared are still relevant. The most recent forecast is still as wild and woolly as it was in ’88 – despite the new ones being curve-matched to data from 1988 to about 1997. There has been no progress. And that is because the fundamental assumptions that drive the model Scenarios have not changed.
And they can’t. The whole CAGW narrative rides on a high sensitivity of CO2 and a significant, positive feedback from water vapour. The disaster, however, is not present but projected. Any reduction in the impact of additional atmospheric CO2 is not just a short-term problem for the story – you can’t have “extreme” weather from CO2 if there has been little or no global warming from CO2 – but a long-term problem. A world warmer by 2C in 200 years (by model) is a world that adapts, not collapses.
Each day the newspapers have articles touting the terrible things happening in a warming and changing world of weather. All of them – and I mean “all” – are written with the present and future conditionals: the “may”, “could”, “might” and “should”. Whatever event we are to be alarmed about is “possibly” related to CO2 warming. Hansen and others behind the Scenarios continue to use the wide range because they know that the earlier they turn to the words “does”, “will” and “shall”, and narrow the Scenarios to give us a clearer picture of what lies ahead (under the current use of fossil fuels), the earlier we will be looking for the proof of their puddings. And discovering them to be tasteless.

Bob Rogers
March 20, 2013 7:59 am

Beautiful graphic. That really says it all without being overly complicated. Good job!

Dave in Canmore
March 20, 2013 8:00 am

Incorrectly predicting the imperative to entirely remake our economic system still “shows skill?”
No wonder these guys don’t work in the real world where results matter!

Doug Huffman
March 20, 2013 8:00 am

Ahhh, a retro-cast! Will Hansen wear it, his hair-shirt?

March 20, 2013 8:03 am

“It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.” — Feynman
Hansen 1988 was wrong.

Girma
March 20, 2013 8:03 am

Excellent

knr
March 20, 2013 8:03 am

My prediction for this week’s winning lottery numbers is 6 numbers in the range 01-50; given that is the total range possible, and this is the good part, I must be right no matter what numbers come up.
Does this make me overqualified to be a ‘climate scientist’?

March 20, 2013 8:07 am

I also enjoyed Gavin’s comment:
“Short term (15 years or less) trends in global temperature are not usefully predictable as a function of current forcings.”
I look forward to 2018, when 20 years will not be long enough.
Meanwhile, as shown above, there are no successful predictions of that length, either, but climate models don’t need to be proven, we’re just supposed to assume they’re correct until proven otherwise. Because that’s Science.
[Good points. Actually, it has been 24 years since Hansen 1988 was published, and according to the RealClimate posting, Hansen’s data was from 1984, so it has been around 28 years. By any measure, even Gavin’s, 28 years should be LONG ENOUGH! Ira]

SasjaL
March 20, 2013 8:09 am

Atmospheric CO2 at Mauna Loa Observatory – measured at height???
At ground level? Then it is a volcano activity indicator …

Tom in Florida
March 20, 2013 8:12 am

“What does RealClimate make of this situation?
… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 ”
Or as any 13 year old would say: “Yeah, well I may be wrong but not as wrong as some other people.”

Charles.U.Farley
March 20, 2013 8:13 am

Looks like Hansen shouldn’t be relied upon to pick this week’s winning lotto ticket.

March 20, 2013 8:14 am

Compare to Leona Marshall Libby’s 1979 prediction
http://joannenova.com.au/2011/05/climate-scientists-who-were-right-30-years-ago/

garymount
March 20, 2013 8:14 am

It is claimed by the climate-deranged (my new replacement word for warmists / alarmists, h/t Peter Foster) that the reduction in CFCs should be counted as a greenhouse gas curtailment that did take place, and that Scenario B should therefore be the scenario to compare to.

Editor
March 20, 2013 8:15 am

Thanks, Ira.

March 20, 2013 8:15 am

I need some help graphing it,
but I project that Hansen has a greater chance of being arrested again (a positive sloped line)
than he does having one of his projection/predictions be right (a negative sloped line).
Although, once his projection of boiling oceans occurs, all bets are off.
🙂

amoorhouse
March 20, 2013 8:19 am

Let’s face it, the original graph was pure propaganda. The point was that CO2 was likely to be added at Scenario A rates. The point of the graph was to make Scenario A look scary so that we would address CO2. Looking at Scenario B, if we got sensible about CO2 it wouldn’t make an appreciable difference as the line is almost as scary as Scenario A.
Scenario C was only included to show that if the world went back to an agrarian lifestyle and all SUVs were recycled for diversity-powered scooters we could stop the scary lines happening.
Well we didn’t stop. The SUVs are still here and the world is still rapidly industrialising and we are below scenario C.
So the propaganda didn’t work, the calculations didn’t work, the temperature increase didn’t work, the CO2/Temp relationship didn’t work, the IPCC didn’t work, carbon sequestration didn’t work, CO2 offsets didn’t work, the vilification of Humanity’s progress didn’t work, the peer review STILL doesn’t work (otherwise why are they still talking about this graph rather than quietly recycling it as diversity-pressed parchment) and finally … the economics didn’t work.
Did anything that came out of these claims actually ever work? Anything? Ever?
I didn’t make the claims and even I am embarrassed for them. What a waste of a life’s work.

Latitude
March 20, 2013 8:20 am

it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988
=============
hysterical…..so they admit to how bad this “science” is

EternalOptimist
March 20, 2013 8:21 am

If Hansen had known then what we know now, he might have made a bang-on 100% accurate prediction. He might have gotten the sensitivity right and the physics too.
But no one would have listened, no one would have cared

SasjaL
March 20, 2013 8:22 am

Ira Glickstein, PhD
 on March 20, 2013 at 8:07 am

Regarding the effect of the clouds, this is obvious for us living far north, especially during the winter period. Others may not … (like well known AGW’ers …)

pokerguy
March 20, 2013 8:28 am

@ Lutherwu “Thanks Ira, for presenting this info from Real Climate’s website, I never go there. It’s an unhealthy place to visit, what with blood pressure, and all.”
Me too, Luther. I get so angry and frustrated it’s not healthy. And I want to live long enough to see these guys fall from grace, to stretch the word “grace” all out of shape. I’m 62. Another 3-5 years ought to do it I hope.

March 20, 2013 8:34 am

“amoorhouse says: March 20, 2013 at 8:19 am Did anything that came out of these claims actually ever work? Anything? Ever?”
Yes. One was actually very, very successful.
That may be the only real hockey-stick in climate.

Scott Scarborough
March 20, 2013 8:36 am

Anthony,
Scenarios A, B, and C make assumptions about the CO2 increase in the atmosphere. It would be worth while to show how those assumptions would Plot-out on a time vs CO2 Concentration scale for us not as technically inclined.
Scenario A, I would guess, would be closest to the actual Mauna Loa plot you presented ( Would it be more curved upwards?). Would Scenario B be a straight line starting in 1988? If so what slope would it be? Would Scenario C be a horizontal straight line starting in the year 2000? If so at what level would it be?
Thanks.

richard verney
March 20, 2013 8:37 am

squid2112 says:
March 20, 2013 at 7:31 am
Climate sensitivity to CO2 is exactly 0C !….period
///////////////////////////////////////////////////////////////////////////////////////////////////////
There certainly is some observational evidence supporting that conclusion. For example:
1. If one looks at the 33 years of satellite data then there would appear to be no first order correlation between temperature rise and the rise in CO2 emissions. Unless the Super El Nino of 1998 was in some way caused by CO2, and to my knowledge, no one suggests that it was, then that one-off event should be excluded from considering whether and to what extent there is a temperature response to rising CO2 emissions. Accordingly, excluding that one-off event which brought about a step change (not a gradual linear increase), one sees that the temperature was flat between 1979 and 1997 and flat between 1999 and 2012. In other words temperatures have remained flat for 33 years and there is no first order correlation between temperature and rising CO2 levels in the satellite data. Thus the issue then becomes whether there is some second order correlation if one were to take into account the negative effects of aerosol emissions and/or variance in TSI and/or cloudiness and incoming solar insolation. The problem is that we do not have accurate data on these so any adjustments to be made are somewhat speculative. That said, it appears that global SO2 emissions are no greater today than they were in 1979 and if anything they are less. We are frequently told that variations in TSI are minimal and insignificant. That being the case neither of these two factors can be masking the warming (if any) that is otherwise brought about by the rise in CO2 emissions over the period. The satellite data suggests that at current levels of CO2 any sensitivity to CO2 is so small that it cannot be seen within the noisy satellite temperature record.
2. If one considers the global temperature anomaly around 1880-85 then according to HadCRUT4, it was about -0.4 degC. Today (about 2000 onwards) it is around +0.5degC. On its face a change of about 0.9degC. During this time CO2 has risen by about 50% from about 280ppm to about 400ppm which is approximately 1/2 a forcing. However, adjustment needs to be made to take account of volcanic activity. The Team suggest that Krakatoa depressed global temperatures by about 1.2degC. If that figure is right (ie., if it is the correct forcing for such extreme volcanic activity – which I personally doubt) then but for Krakatoa, one would have expected global temperatures in the period 1883 to 1886 to be some 1.2degC higher than the -0.4degC anomaly figure, ie., the appropriate temperature anomaly for the period say 1883/86 would have been +0.8degC. It is material that that is higher than the HadCRUT4 anomaly of today. Accordingly, once one takes account of volcanic forcings, and makes the necessary adjustment for those forcings, it would appear that temperature anomalies in around the 1883/86 period would be higher than today!! Thus notwithstanding a 1/2 forcing (ie., an increase in CO2 levels from about 280ppm to about 400ppm), there is no observable increase in global temperature anomaly!! Again, this suggests that at around the 300ppm CO2 level, climate sensitivity is around zero.
[Richard Verney: Good to “see” you here again. Regarding your statement that “climate sensitivity is around zero”, I agree. 0.25 ⁰C IS AROUND ZERO, and even my high estimate of 1 ⁰C is way closer to zero than Hansen 1988’s 4.2 ⁰C or the IPCC’s 3 ⁰C. Ira]
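
For readers who want to follow the arithmetic in point 2 of the comment above, here is richard verney’s adjustment spelled out as a small Python sketch. The numbers are his own round figures, and whether the 1.2 degC Krakatoa depression is the right value is exactly what he says he doubts:

    # richard verney's back-of-envelope adjustment (his numbers, not measured data).
    anomaly_1880s = -0.4    # HadCRUT4 anomaly he quotes for ~1880-85, deg C
    krakatoa_cooling = 1.2  # depression attributed to Krakatoa, deg C (contested)
    anomaly_today = 0.5     # anomaly he quotes for ~2000 onwards, deg C

    adjusted_1880s = anomaly_1880s + krakatoa_cooling   # = +0.8 deg C
    print(f"Volcano-adjusted 1883-86 anomaly: {adjusted_1880s:+.1f} deg C")
    print(f"Higher than today's {anomaly_today:+.1f} deg C? {adjusted_1880s > anomaly_today}")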

Eliza
March 20, 2013 8:40 am

Tall dave: But Gavin will be retired by then (~20 years no warming) and will not give a @@@@; hell, he’ll probably become an ardent denier in his old age LOL

MarkW
March 20, 2013 8:42 am

CO2 growth is close to A, but actual temperatures are less than C, which proves that Hansen was correct?
Do these guys wear special glasses that let them see things that aren’t there?

March 20, 2013 8:45 am

Ira,
I have taken the liberty of suggesting a change to one of your comments as follows. I am making the change because use of the terminology ‘greenhouse effect’ is not and never was scientifically appropriate.
Note: My changes are in bold.

Ira Glickstein, PhD on March 20, 2013 at 8:07 am

squid2112 (March 20, 2013 at 7:31 am):

@squid2112
[ . . .]
I would not go so far as to say Climate Sensitivity is exactly 0.00 ⁰C. All else being equal, I am pretty sure Climate Sensitivity is somewhere between 0.25 ⁰C and 1 ⁰C. If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” *Planetary Atmospheric Effect, of which there is a radiative gas portion*, is real. Thus, if you could hold Cosmic Rays and Multi-Decadal Oscillations and everything else constant, and then increased Atmospheric CO2, Earth Surface temperatures would increase.
[ . . .]
Thanks for your comment.
Ira

Again note that my changes are in bold.
Your article is much appreciated. Thank you.
John
[John Whitman: Thanks for your comment and your suggestion. Of course I agree that the warming effect of H2O, CO2 and other gases in the Atmosphere is very different from what goes on in an actual physical greenhouse, where the walls and roof primarily prevent loss of heat by convection rather than by outgoing long wave radiation. However, this mis-analogy has become established and, to communicate clearly with others, I choose to go along with established terminology. However, I try to say Atmospheric (to make it clear I am not talking about an actual physical greenhouse) and I put “scare” quotes around the words “Greenhouse Effect”. Ira]

Jimbo
March 20, 2013 8:49 am

Not only Hansen but the IPCC have failed. How much failure can we take? We have waited decades and there is a divergence. The theory exaggerates warming and is essentially a pile of batshit.
What if there was a rapid curtailment after 1988, Hansen would be seen as a superstar scientist. Instead, there has been none and we have hit the target. Any reasonable scientist, after raising so much alarm, would say they were wrong and go into hiding. Instead, Hansen is even angrier than ever and is prepared to get arrested over this trace gas.
Finally, we have hit below scenario C so the goal has been met, the panic is over, let’s disband CRU, IPCC and fire Hansen immediately.

March 20, 2013 8:50 am

Ira Glickstein, PhD says:
March 20, 2013 at 8:07 am
———————————————————-
Congratulations on the clear and concise presentation.
However, when you say in your comment: “If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real” you are attributing the entire Greenhouse Effect to CO2, whereas it is dominated by water vapor.
CO2 has a narrow absorption band, and it must plateau by 300 ppm concentration, since there is a CO2 sensitivity to temperature. If there were a significant temperature sensitivity to CO2 above 300 ppm, there would be a positive feedback that shouldn’t stop until essentially all the massive stores of surficial carbon returned to the atmosphere as CO2, with global temperature tens of degrees warmer than ever inferred from the entire ice-core record.
[R Taylor: You are correct that the primary “Greenhouse gas” is H2O. However, some climate scientists have suggested that, lacking CO2 (and/or other gases that do not precipitate and freeze at 0 ⁰C), the first strong Ice Age would have caused our planet to become “snowball Earth” where all the H2O would become snow and ice, which have a high albedo (reflectiveness to short wave Solar radiation). They claim that “snowball Earth”, lacking any appreciable H2O vapor in the Atmosphere, and reflecting much of the incoming Solar radiation, might remain in that frozen state. It was the unfrozen CO2, and other unfrozen trace gases in the Atmosphere, that allowed the Atmospheric “Greenhouse Effect” to be effective and pull Earth out of the “snowball”. I believe that theory is credible, and, if so, CO2 and other non-H2O gases are responsible for the Earth not being a “snowball” and allowing the H2O to return to liquid and gaseous form, with the result being at least 30 ⁰C warmer than it would be otherwise. Ira]

Stefan
March 20, 2013 8:52 am

The RC guys are funny.

Bill
March 20, 2013 8:54 am

They can’t even be straight about what he predicted. The fair way would be to show both, but instead they often say that, since methane did not rise as much as predicted, and because they now agree that the median increase for CO2 doubling is 2.8 to 3.0 C instead of 4.2, if you make those changes it is not as bad and he was “reasonably accurate”. Well, he did not use 2.8 C, they believe that in many ways CO2 is a bigger problem than methane, and they were PREDICTING that methane would rise; that was part of it as well.
So to fairly test what Hansen said in 1988 you do as Ira did. Then after that you can say, well, if we had made the prediction in 2004 or 2013 we would not have been as wrong as Hansen back in 1988. Which sounds kind of lame and explains why they choose the more misleading way.

commieBob
March 20, 2013 8:56 am

Ira Glickstein, PhD says:
March 20, 2013 at 8:07 am
… If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real.

1 – Actually, the average temperature would be about the same. It’s just that we would, literally, bake during the day and freeze at night. The surface conditions would be like those on the moon. If you apply a low pass filter (ie. dig down into the moon far enough) the temperature is a comfortable 23 deg. C. http://www.lunarpedia.org/index.php?title=Lunar_Temperature
2 – The most powerful greenhouse gas, by far, is water vapour. Attributing the whole greenhouse effect to CO2 is wrong. Note that the adiabatic lapse rate is hugely influenced by moisture and not at all influenced by CO2. http://en.wikipedia.org/wiki/Lapse_rate

DesertYote
March 20, 2013 8:56 am

The lefty mind, literally, can not contain the two concepts “Real world scenario is close to Hansen A scenario” and “Real world temps are close to Hansen C” at the same time. That this is a fact, is no accident. It is the consequence of deliberate interference in the natural development of the cognitive ability of our children and has been going on since at least the 30’s. The Marxist architect mind is constructed of independent nodes each with its own reference, set of definitions, and parsers. Only one can be active at a time. This is one of the two reasons why the trolls can not read a post or comment without completely missing what it is saying, re-interpreting it to conform to their ideological preconceptions.

richard verney
March 20, 2013 8:57 am

Ira Glickstein, PhD says:
March 20, 2013 at 8:07 am
“…If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real….”
///////////////////////////////////////////////////////////////////////////////////////////////////////////////////
If, over the course of the next 10 or so years, global temperatures do not increase, then there will be 2 obvious contenders as to the reason why there has been no increase, namely:
1. At current levels of CO2, saturation has already been reached such that sensitivity is now so close to zero that it cannot be measured; or
2. The basic physics upon which AGW is built is fundamentally flawed.
You are a brave man ruling out the second.
I strongly suspect that if the temperature hiatus continues and there is no rise in temperature anomaly by early 2020, we will be seeing many papers dealing with the basic physics and these papers will, inter alia, question whether “..the Earth Surface temperatures would be over 30 ⁰C cooler..” but for the so-called greenhouse effect.
Watch this space and bring along the popcorn.

Jimbo
March 20, 2013 8:59 am

“… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change). …

Let me do a re-write for ya.

Dad, I promised you I would get an A in biology, instead I got an E. Hey, at least I didn’t get an F.

Gavin puts the best political spin masters to shame.

Steve from Rockwood
March 20, 2013 9:08 am

If you can’t predict future climate change then you don’t really know why the climate changes. If you don’t really know why climate changes then you can’t introduce the term “climate sensitivity” and try to estimate its value. You can only say “I don’t know”.

michael hart
March 20, 2013 9:11 am

I quite like his hat. It would be a shame to see it being eaten.

mogamboguru
March 20, 2013 9:12 am

Well, regarding the lots and lots of cold, sticky, white stuff still lying around outside my home right at this year’s spring equinox, I rather accept the 1970s predictions of a coming Ice Age.
Brrrrrrrrr!

March 20, 2013 9:13 am

knr says:
March 20, 2013 at 8:03 am
My prediction for this week’s winning lottery numbers is 6 numbers in the range 01-50; given that is the total range possible, and this is the good part, I must be right no matter what numbers come up.
Does this make me overqualified to be a ‘climate scientist’?
————————————————————————————————————————
Yes, it does. If’n you want to be a REAL climate scientist you need to predict that all lottery numbers will exceed 50. When the results come in, you argue because of your large error bars that you won anyway. Of course, your only hope of collecting is if the lottery is run by NSF.

RockyRoad
March 20, 2013 9:21 am

Jimbo says:
March 20, 2013 at 8:59 am

“… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change). …
Let me do a re-write for ya.
Dad, I promised you I would get an A in biology, instead I got an E. Hey, at least I didn’t get an F.
Gavin puts the best political spin masters to shame.

You’re right, Jimbo. Nobody but an eedjit would tell people there was to be “no-change” in the climate. (I wonder if Gavin has a link or two for his so-called “no-change” climate prognosticators?)

Editor
March 20, 2013 9:23 am

the models are on the low side of some changes,
The only thing the models are low on is accuracy!

richard verney
March 20, 2013 9:25 am

Ira
Further to my post (richard verney says: March 20, 2013 at 8:57 am), I am not saying that you are wrong, merely that your assertion may be wrong. Personally, I consider the 30degC figure to be highly suspect.
Let us suppose, for the sake of argument, that the MWP existed as a global event and it was warmer than today. What does that event say about climate sensitivity?
Ditto, if the Roman and the Minoan warm periods truly existed and were warmer than today?
Ditto, the Holocene Optimum existed and was warmer than today?
I would be interested to hear your views as to climate sensitivity with respect to those 4 scenarios.
Is it not the case that proponents of cAGW have to erase such events from the global temperature record since if they existed, then it follows that climate sensitivity must indeed be small (less than 1degC, perhaps the 0.25degC figure you mention)?
[richard verney: I accept that the Roman and Minoan and Medieval Warming Periods (as well as 1934 at least in the US :^) were warmer, on average, than the past decade. Therefore, as you say, Climate Sensitivity MUST be small, perhaps as small as 0.25 ⁰C, and very unlikely to be higher than 1 ⁰C. So we are in agreement. Ira]

Laurie Bowen (being a troll . . . again)
March 20, 2013 9:25 am

ok IRA . . . you got me at “nothing can move faster than the speed of light” . . . . If that is true . . . why DOES . . E=mc2 (squared)?
(checked out your site) http://tvpclub.blogspot.com/ ABOUT MY ENTRY
it’s as far as I got~!
[Laurie Bowen: THANKS for checking out my Blog. For those who have not checked it out, my current top topic is “What is Time? Alan Alda’s 2013 Flame Challenge”. I make the claim that Time is the fourth dimension, plain and simple, and I subscribe to the theory that you and I and the whole Universe are zipping along the Time dimension at nearly the speed of light, which is as fast as anything can go. That is how I explain the fact that, when we move in any of the three Space dimensions, our speed in the Time dimension must reduce a bit such that the vector sum of our speed in Space and our speed in Time always equals exactly the speed of light. (Time Dilation and Length Contraction are well established, experimentally demonstrated truths. These effects are tiny even at the speeds of our fastest rockets, but they are real in that GPS satellites must correct for them. At speeds approaching a significant fraction of the speed of light, these effects can cause Time to slow down considerably.)
I have submitted a five and a half minute video that will be judged by 11-year-old science students. Please check it all out HERE.
As for your reference to e = mc^2, I don’t know what that has to do with the speed of light other than that Albert Einstein contributed to our understanding of the close inter-relationship between Energy and Mass as well as Space and Time. Ira]

March 20, 2013 9:26 am

Thank you Sir very much.
Alfred

Frank K.
March 20, 2013 9:26 am

For even more laughs, you should plot the absolute “global-averaged” temperature instead of the anomalies. GCMs cannot even get absolute temperatures right. And remember, energy transfer by thermal radiation varies as absolute temperature to the fourth power…

Ryan
March 20, 2013 9:28 am

I predicted that Manchester United would win this year’s Champions League. I was wrong but only by one away goal. Sadly it was scored quite early in the competition…
How about we re-draw Hansen’s graphs starting from 1995, say? How well do the curves fit then? Perhaps “fit” is the wrong word…..

Russ R.
March 20, 2013 9:32 am

Don’t confuse “emissions” with “atmospheric concentrations”.
Hansen’s 3 scenarios were built around different GHG emissions paths. Exponential growth, linear growth, and abrupt curtailment.
For each emissions path, he projected atmospheric concentrations of GHGs. These are a derivative of each scenario… not the scenario itself.
The problems with Hansen (1988) are twofold… 1) emissions have continued to rise as fast as Scenario A, but concentrations have only risen in line with Scenario B, and 2) despite concentrations that have risen in line with Scenario B, temperatures have risen less than Scenario C.
RealClimate and Skeptical Science both gloss over problem 1)… pointing out that Scenario B should be the reference point for comparison because atmospheric concentrations of all GHGs have been below Scenario A. This is goal-post shifting.
Scenario A spelled out an emissions path, and is still the relevant benchmark for how the real world has played out..
If emissions have continued to grow exponentially, but atmospheric concentrations have risen less than projected, it means the model was wrong, and either GHG absorption by the biosphere is greater than projected, or atmospheric dwell time is less than projected. Either way… Hansen’s models were biased high.

Greg House
March 20, 2013 9:33 am

Ira Glickstein, PhD says, (March 20, 2013 at 8:07 am): “All else being equal, I am pretty sure Climate Sensitivity is somewhere between 0.25 ⁰C and 1 ⁰C. If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real. Thus, if you could hold Cosmic Rays and Multi-Decadal Oscillations and everything else constant, and then increased Atmospheric CO2, Earth Surface temperatures would increase.”
=========================================================
Ira, this calculation about “over 30 ⁰C cooler” is so wrong. It is based on the notion that the Earth gets warmed as a disc and cools as a sphere. This is absurd, Ira. Earth gets warmed as a sphere too, because it rotates. The Earth without atmosphere would be very hot, because radiative cooling is very slow. Just start thinking about it.
The second thing is that back radiation (from CO2 or whatever) does not warm the source. This is impossible already on the theoretical level, because the assumption that it does leads in some cases to creating energy out of nothing, which means that the assumption is wrong. And on the experimental level it has been known since the R.W.Wood experiment (1909).
Time to wake up, Ira, and look at those things really critically.

William Astley
March 20, 2013 9:34 am

In reply to tallbloke:
tallbloke says:
March 20, 2013 at 7:42 am
Good summary. The whole sensitivity debate is a red herring however. Changes in co2 level FOLLOW changes in temperature, at ALL timescales. Cause precedes effect.
William:
I do not understand the reasoning to support this comment. Post 2000 CO2 has increased linearly and there has been no increase in planetary temperature. What is driving the linear increase in CO2? If the 20th century increase in CO2 was caused by the increase in planetary temperature wouldn’t we expect the rise in CO2 to have stopped as the rise in planetary temperature has stopped?
Comments:
1) Explaining the glacial/interglacial lag in CO2 rise is a separate problem. The specialists cannot explain the drop in atmospheric CO2 from interglacial to glacial phase or the increase from glacial to interglacial. The amount of CO2 that is absorbed by the oceans as they cool is almost offset by the reduction of CO2 due to shrinking of the biosphere (vegetation) due to the massive ice sheets. The proposed hypotheses (mechanisms) to explain the change create paradoxes when applied to other periods.
2) The question of what maintains atmospheric CO2 and why atmospheric CO2 changes on geologically long time periods is not understood. The proposed hypotheses (mechanisms) to explain the change create paradoxes when applied to other periods. The hypothesis that increased erosion removed the atmospheric CO2 for example results in catastrophically cyclically low atmospheric CO2 levels (end of plant life).
3) The current observation that there is no correlation in planetary temperature to changes CO2 and the observation on geologically long time periods that planetary temperature is not correlated to atmospheric CO2 can be explained by the CO2 greenhouse mechanism saturating due to clouds in the tropics increasing and decreasing to resist forcing change. Lindzen and Choi’s sensitivity paper supports the assertion that the planet resists forcing change by increasing or decreasing clouds in the tropics.
Atmospheric carbon dioxide levels for the last 500 million years
http://www.pnas.org/content/99/7/4167.full
Using a variety of sedimentological criteria, Frakes et al. (18) have concluded that Earth’s climate has cycled several times between warm and cool modes for roughly the last 600 My. Recent work by Veizer et al. (28), based on measurements of oxygen isotopes in calcite and aragonite shells, appears to confirm the existence of these long-period (~135 My) climatic fluctuations. Changes in CO2 levels are usually assumed to be among the dominant mechanisms driving such long-term climate change (29).
Superficially, this observation would seem to imply that pCO2 does not exert dominant control on Earth’s climate at time scales greater than about 10 My. A wealth of evidence, however, suggests that pCO2 exerts at least some control [see Crowley and Berner (30) for a recent review]. Fig. 4 cannot by itself refute this assumption. Instead, it simply shows that the ‘‘null hypothesis’’ that pCO2 and climate are unrelated cannot be rejected on the basis of this evidence alone.
http://faculty.washington.edu/battisti/589paleo2005/Papers/SigmanBoyle2000.pdf
Glacial/interglacial variations in atmospheric carbon dioxide by Daniel M. Sigman & Edward A. Boyle
The exchange of CO2 between the atmosphere and the surface ocean would reach completion over the course of six to twelve months if there were no other processes redistributing inorganic carbon in the ocean. However, the pCO2 of surface waters is continuously being reset by its interaction with the deep ocean reservoir of inorganic carbon, which is more than 25 times that of the atmosphere and surface ocean combined (Fig. 2).
As ocean temperature was lower during ice ages, it is an obvious first step to consider its effect on atmospheric CO2. The lower temperatures of the glacial ocean would have reduced the concentration of CO2 in the atmosphere by drawing more of it into the ocean. The deep ocean, which is the dominant volume of ocean water, has a mean temperature of 2 °C. Sea water begins to freeze at about -2 °C, producing buoyant ice. As a result, deep ocean water could not have been more than 4 °C colder during the last ice age, placing an upper bound on how much additional CO2 this water could have sequestered simply by cooling. The potential cooling of surface waters in polar regions such as the Antarctic is also constrained by the freezing point of sea water.
There are uncertainties in each of these effects, but it seems that most of the 80-100 p.p.m.v. CO2 change across the last glacial/interglacial transition must be explained by other processes. (My comment: other than by temperature variance, which forces CO2 out of solution in the ocean.) We must move on to the more complex aspects of the ocean carbon cycle.
What caused Glacial-Interglacial CO2 Change?
http://geosci.uchicago.edu/~archer/reprints/revgeo/rog.pdf
Abstract. Fifteen years after the discovery of major glacial/interglacial cycles in the CO2 concentration of the atmosphere, it seems that all of the simple mechanisms for lowering pCO2 have been eliminated. We use a model of ocean and sediment geochemistry, which includes new developments of iron limitation of biological production at the sea surface and anoxic diagenesis and its effect on CaCO3 preservation in the sediments, to evaluate the current proposals for explaining the glacial/interglacial pCO2 cycles within the context of the ocean carbon cycle. After equilibration with CaCO3 the model is unable to generate glacial pCO2 by increasing ocean NO3− but predicts that a doubling of ocean H4SiO4 might suffice. However, the model is unable to generate a doubling of ocean H4SiO4 by any reasonable changes in SiO2 weathering or production.
Our conclusions force us to challenge one or more of the assumptions at the foundations of chemical oceanography. We can abandon the stability of the “Redfield ratio” of nitrogen to phosphorus in living marine phytoplankton and the ultimate limitation of marine photosynthesis by phosphorus. We can challenge the idea that the pH of the deep ocean is held relatively invariant by equilibrium with CaCO3. A third possibility, which challenges physical oceanographers, is that diapycnal mixing in ocean circulation models exceeds the rate of mixing in the real ocean, diminishing the model pCO2 sensitivity to biological carbon uptake.

Ryan
March 20, 2013 9:40 am

@Ira Glickstein: “If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real. ”
Sure about that? This is based on calculations assuming Earth as a black body. Except it is known that there is not a body in the universe that actually behaves as a black body. The Earth is not a good contender – I mean, exactly at what point should you consider it as a black body? On the Earth’s surface or at the top of the troposphere or some mean point from top of atmosphere to Earth’s surface or….. where?
[Ryan: True, the Earth is NOT a PERFECT Black Body, nor is anything in the Universe. The Earth is not a perfect absorber nor a perfect emitter of electro-magnetic radiation. The Earth has an albedo (reflectiveness) significantly greater than zero. The Earth reflects around 30% of the incoming Sunlight back out into Space, so only about 70% of that Sunlight energy participates in the Atmospheric “Greenhouse Effect”. All the calculations I have seen include the effect of our albedo. Ira]
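
For reference, the standard zero-dimensional calculation behind the “~30 ⁰C (or 33 ⁰C)” figure debated throughout this thread runs roughly as follows. This is a textbook sketch, not something from the post or from RealClimate; the solar constant, albedo and mean surface temperature are round approximate values, and several commenters here dispute whether the calculation is meaningful at all:

    # Textbook zero-dimensional effective-temperature sketch (approximate values).
    S = 1361.0        # solar constant, W/m^2 (approximate)
    albedo = 0.3      # Earth's Bond albedo, roughly the 30% reflection noted above
    sigma = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

    # Absorbed sunlight averaged over the sphere = emitted thermal radiation:
    #   S * (1 - albedo) / 4 = sigma * T_eff^4
    T_eff = (S * (1 - albedo) / (4 * sigma)) ** 0.25
    T_surface = 288.0  # rough global mean surface temperature, K

    print(f"Effective radiating temperature: {T_eff:.0f} K")          # ~255 K
    print(f"Implied greenhouse warming: {T_surface - T_eff:.0f} K")   # ~33 K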

Gary
March 20, 2013 9:41 am

Ira, you’ve got to clean up that graph. Too much information crammed into a small space. It’s like you’re talking so fast it’s hard to recognize the words. Don’t change the story; just make it more elegant.

Kaboom
March 20, 2013 9:43 am

24 years of being wrong, only Ehrlich can beat that.

seanbrady
March 20, 2013 9:52 am

while my prediction that Romney would win the 2012 election was not perfect, it has shown skill in that it has out-performed the naive hypothesis of no-winner in that election.

jorgekafkazar
March 20, 2013 10:09 am

Always a pleasure to see a post by Ira. Thanks, Ira.
“… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change). …” –“Real”Climate
While Bernie Madoff was not perfect, he has shown honesty in that he stole a lot less than other people we can’t name might have.

Eliza
March 20, 2013 10:13 am

Ira I would tend to agree but I think the negative feedbacks are greater so that’s why from being a skeptic I now believe CO2 has no effect whatsoever on global temperatures. If anything an excess of CO2 would tend to lower global temperatures as a compensatory mechanism. I think both Lindzen and Spencer have shown this somewhat. There is evidence I believe from Holocene records that 3000ppm+- CO2 was related with massive ice ages?

François
March 20, 2013 10:18 am

Well, it took you a month and a half to come up with that comment on the original RealClimate post. By the way, trace gas means a lot more than just CO2 (methane, chlorofluorocarbons et al.)
[François: Actually, although you are right that the RealClimate posting I commented on has been up for several weeks, I only just saw it a few days ago. My visits to RC are infrequent, about as often as I go to the dentist, and for similar painful reasons of duty. The above WUWT posting only took me several hours to compose.
And, yes, “trace gas emissions” include more than CO2. There’s CH4 and lots of other minor contributing “Greenhouse gases”. However, is there any evidence you know of that “trace gas emissions”, broadly defined, have ceased their increase since the year 2000? That is what Hansen Scenario C assumes, and that is why Scenario C is only about 30% higher than the actual data posted on the actual RC graphic that I annotated.
Actual “trace gas emissions” are somewhere between Scenarios A and B, but the temperature change is way less than predicted for Scenarios A and B. Actual results are close to Scenario C but “trace gas emissions” are close to Scenarios A and B. So, what is your point? I’m listening. Ira]

March 20, 2013 10:29 am

The C projection does not even present a realistic picture, from using the same start point instead of, say, a 5 year rolling projection that would show temperatures being flat since around 2000.
If you don’t move the base, that projection will still show a rising trend 5 years from now
even if temperatures remain flat. Words like ‘foot’, ‘own’ and ‘shooting’ come to mind……………..
Hansen still wrong, but only by a few degrees instead of entirely.

Bart
March 20, 2013 10:31 am

FTA: “As everyone knows, the Mauna Loa measurements of atmospheric CO2 proves that there has NOT BEEN ANY CURTAILMENT of trace gas emissions. Indeed, the rapid increase of CO2 continues unabated.”
There has not been any curtailment of emissions, that much is true. But, the rate of change of CO2 has leveled out, in lockstep with the leveling out of temperatures. Our emissions do not control CO2 – Nature bats them away with barely an acknowledgement.
[Bart: Have a close look at the Mauna Loa CO2 data (second graphic above) and tell me where CO2 has “leveled out”. Yes, since CO2 has its small seasonal ups and downs there are some months where it is level, but, on a smoothed yearly basis it seems to me to be a continued rapid increase that is slightly exponential in the upward direction. Ira]
Ira Glickstein, PhD says:
March 20, 2013 at 8:07 am
“All else being equal, I am pretty sure Climate Sensitivity is somewhere between 0.25 ⁰C and 1 ⁰C.”
All else being equal being the operative phrase. All else is not equal.
“If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real.”
A globally positive function is not necessarily monotonic. Local sensitivity – the incremental change in temperature due to an incremental change in CO2 concentration at the current position, i.e., the partial derivative – does not have to be significant or even positive. The data show that there has been no significant deviation from longstanding patterns in the past century.
[Bart: What does that link or your mathematical jibber-jabber have to do with the past century? Or with the Atmospheric “Greenhouse Effect” that makes the Earth Surface at least 30 ⁰C warmer than it would be if the Atmosphere lacked “Greenhouse gases”? I read your words four times and visited your link. Please explain your point in English. advTHANKSance. Ira]
richard verney says:
March 20, 2013 at 8:37 am
“If one looks at the 33 years of satellite data then there would appear to be no first order correlation between temperature rise and the rise in CO2 emissions.”
But, there is a definite correlation between the rise in CO2 concentration and temperature.
[Bart: I have to agree with Richard Verney that there is no FIRST ORDER correlation between temperature rise and the rise in CO2 emissions. Please don’t simply point to some graph. Explain what you mean so even Richard and I can understand. advTHANKSance. Ira]

Keitho
Editor
March 20, 2013 10:40 am

So clear, so simple yet the other guys will still try and pretend Jim was right.
Thanks Ira, this is going out to all stations Keitho.

jorgekafkazar
March 20, 2013 10:43 am

“What if there was a rapid curtailment after 1988, Hansen would be seen as a superstar scientist.” –Jimbo
He’d be disappointed with that title. He’d much prefer “Messiah.”
http://fuelfix.com/files/2012/12/JamesHansen-306×204.jpg

AnonyMoose
March 20, 2013 10:44 am

Should the vertical scale on the CO2 graph start at zero?

March 20, 2013 10:46 am

In the interest of accuracy, we should drop the term “greenhouse effect.” What happens in the atmosphere in no way resembles what happens in a greenhouse, which heats itself by suppressing convection.
There is a growing body of research that calls into question the entire notion of “increased CO2 levels raising the earth’s temperature.” Physicist Ferenc Miskolczi has done some groundbreaking work in this area.

Beta Blocker
March 20, 2013 10:57 am

The current hiatus in the rise of global mean temperature represents a temporary pause, nothing more. The earth has been warming since the end of the Little Ice Age, with various time-localized plateaus and accelerations along the way; and it will continue to warm in that same pattern until at some point in the future, it doesn’t.
.
In the meantime, the debate over human-induced climate change will continue to pass through the GHG Narrative Diode.
.
The GHG Narrative Diode works this way….. any plateau which is anything less than a statistically significant drop in Global Mean Temperature occurring continuously over some long period of time — let’s figure on three to five decades — will continue to be interpreted by the climate science community as representing insufficient evidence that a human caused GHG-driven global warming trend isn’t still operative as the primary driver for climate change.
.
Said another way, a statistically significant trend in falling global mean temperature must occur continuously over some very lengthy period of time — thirty years at the minimum, but more likely fifty years — before the climate science community ever begins to question the narrative of human-caused GHG-driven global warming.

MikeN
March 20, 2013 11:01 am

Ira, looking through the details of the paper, and comparing to actual emissions, I would say Scenario B is the closest to reality.

March 20, 2013 11:02 am

[ Note: bold emphasis is by me (John Whitman) ]
Eliza on March 20, 2013 at 10:13 am said,
Ira I would tend to agree but I think the negative feedbacks are greater so that’s why from being a skeptic I now believe CO2 has no effect whatsoever on global temperatures. If anything an excess of CO2 would tend to lower global temperatures as a compensatory mechanism. I think both Lindzen and Spencer have shown this somewhat. There is evidence I believe from Holocene records that 3000ppm+- CO2 was related with massive ice ages?

– – – – – – –
Eliza,
Your comment presents a climate science thesis that I recommend is worthy of significant funding to expand the research base on it. The climate science community perhaps has finally opened enough for such views to start to be pursued. I hope some climate scientists who are enterprising and are traditional scientific skeptics will engage the task!
The paradigm has shifted from alarming / dangerous AGW by CO2 toward one of minimal observed or below attribution level effects of radiative gas CO2 on the Total Earth Atmospheric System.
Take care.
John

Sam the First
March 20, 2013 11:02 am

Yes, but when are the environment and ‘science’ journalists in the MSM going to get their heads around this stuff? And the fact that Hansen and his ‘team’ live in a fantasy world? Who is going to persuade them to do so?
Even with right and science on their side, the sceptics are still losing the wider argument. There had to be some way to get it out there….

Laurie Bowen (being a troll . . . again)
March 20, 2013 11:04 am

Funny Ad! This is the one I saw: http://www.youtube.com/watch?v=EUE5a-k1quk

lurker, passing through laughing
March 20, 2013 11:13 am

It is always fun when current prophets forget that the best prophecies are always written long after the fact.

March 20, 2013 11:18 am

Heathens! No one cares about temps NOW as they are however going to shoot off the page any moment, just you deniers wait and see. You only have to look at the news to see every kind of weather is ‘unprecedented’ and ‘proof’. In fact we don’t even need charts or facts or poor people…if we don’t like them there is always the block function, if not we can go na-na-na-na
/sarc

John Finn
March 20, 2013 11:31 am

commieBob says:
March 20, 2013 at 8:56 am
Ira Glickstein, PhD says:
March 20, 2013 at 8:07 am
2 – The most powerful greenhouse gas, by far, is water vapour. Attributing the whole greenhouse effect to CO2 is wrong. Note that the adiabatic lapse rate is hugely influenced by moisture and not at all influenced by CO2. http://en.wikipedia.org/wiki/Lapse_rate

But is it not reasonable to assume that if there were no CO2 – particularly in the higher, colder and DRIER regions of the troposphere – then it’s likely there would be less atmospheric water vapour? Remember a warmer atmosphere can hold more moisture.
This is one of the reasons I suspect that any feedback effect is probably slightly positive.

Fred from Canuckistan
March 20, 2013 11:32 am

Hansen has yet to publish his Feb 2013 data.
Maybe it is not cooperating with his preferred outcome and he needs more time to persuade the data to warm up.

jaypan
March 20, 2013 11:41 am

They are trying to sell a SPECTACULAR FAIL as success.
Nice try, guys.

John Finn
March 20, 2013 11:44 am

Bart says:
March 20, 2013 at 10:31 am
But, there is a definite correlation between the rise in CO2 concentration and temperature.

Indeed there is. From ice core data, Hans Erren found it to be about 10 ppm per 1 deg C – which is roughly what we see when we look at the rate of CO2 change during El Nino and La Nina. The atmospheric CO2 change is always positive – but it’s greater in warmer years than colder.
Using Erren’s estimate the temperature rise since ~1850 has contributed about 7 ppm to the overall CO2 increase. Others argue it might be nearer to 16 ppm. Whatever – it sure isn’t anywhere near the 100 or so ppm we’ve seen in the past 150 years.
[John Finn: THANKS for clearing that up. I agree that rising global temperatures, due to any cause, will result in a small increase in CO2 levels. Over the past 130 years or so, we’ve seen a net temperature increase of 0.5 ⁰C to 0.8 ⁰C, along with a CO2 increase of over 100 ppmv. Perhaps 10% of that 100 ppmv may be due to temperature increase, 20% tops. So, it is not a FIRST ORDER effect.
The other 80 or 90% of the rise in CO2 is due to a variety of causes, including unprecedented levels of fossil-fuel burning. I hasten to add that the amount of global warming has been overstated, the amount due to human activities (both fossil fuels and land usage that has decreased albedo) has also been overstated, and that predictions of climate catastrophe are unwarranted. Indeed, if, based on the fact that the current Sunspot cycle #24 is on the low/long side, we enter a cooling period similar to past eras of a series of low/long cycles, future generations may thank us for contributing a bit to the warming of the Earth. Ira]
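A quick back-of-envelope check of the figures John Finn and Ira quote above, as a minimal Python sketch (the 10 ppm per ⁰C outgassing estimate, the 0.5–0.8 ⁰C warming and the roughly 100 ppm rise are all taken from the comments; nothing below is new data):

outgassing_per_degC = 10.0    # ppm per deg C (Erren's ice-core estimate, as quoted by John Finn)
total_co2_rise_ppm = 100.0    # approximate rise since the late 1800s, per the comments
for warming in (0.5, 0.8):    # Ira's low and high estimates of net warming
    from_warming = outgassing_per_degC * warming
    share = 100.0 * from_warming / total_co2_rise_ppm
    print(f"{warming} degC of warming -> ~{from_warming:.0f} ppm, about {share:.0f}% of the rise")

On these numbers, warming-driven outgassing accounts for only about 5–8% of the observed CO2 rise, consistent with Ira's remark that it is not a FIRST ORDER effect.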

davidmhoffer
March 20, 2013 12:06 pm

François says:
March 20, 2013 at 10:18 am
Well, it took you a month and a half to come up with that comment on the original Realclimate post.
>>>>>>>>>>>>>>>>>>>
That’s your rebuttal? It took six weeks? If he’d done it in 2 weeks would it be more right? Would Hansen be less wrong?
Troll quality has declined significantly in the last two years. Perhaps this is a proxy for something?
[Dave: Great to “see” you again! I agree with your comments to François, but let us all calm down. Perhaps François will change his mind if he hangs around WUWT and us for an extended period of time. Ira]

Bruce Cobb
March 20, 2013 12:08 pm

Given that temps have virtually flatlined for the past 16 years (or more), it would actually be more reasonable to assume that the actual increase of 0.22 ⁰C had little, if anything, to do with CO2.

xham
March 20, 2013 12:10 pm

Where does the 0.22 C line come from and how was it calculated? It doesn’t seem to reflect anything else on the graph. Maybe I am missing something here.
[xham: I merely took the midpoint between GISTEMP and HADCrut4, as plotted on the RealClimate graphic for 1988, and drew an arrow to the same midpoint for 2012, and subtracted the (estimated) numbers. Thanks for asking. Ira]

Matt G
March 20, 2013 12:13 pm

Ira Glickstein, PhD says:
March 20, 2013 at 8:07 am
“…If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real….”
The fatal flaw in the greenhouse-effect calculation revolves around the assumption, usually left unexamined, that energy stored in the oceans doesn't contribute to it. In fact it contributes a lot, and the 1-4% water vapour and CO2 in the atmosphere cannot be credited with the 33 ⁰C difference on their own. The black-body calculation using Stefan's law to show the 33 ⁰C difference doesn't take into account a water body storing extra energy. IMO the current theory of the greenhouse effect is wrong on this point, and calculation and observational data are needed to estimate how much energy this storage contributes to the overall greenhouse effect.

March 20, 2013 12:14 pm

What did Feynman say about data matching theory? If the data doesn't match the theory, then the theory is wrong. It's that simple.
http://m.youtube.com/watch?v=EYPapE-3FRw

philjourdan
March 20, 2013 12:25 pm

New age math: 22 is greater than 53 when it suits your purpose.

Laurie Bowen
Reply to  philjourdan
March 20, 2013 12:38 pm

http://wattsupwiththat.com/2013/03/20/how-well-did-hansen-1988-do/#comment-1252989
That is exactly Right! . . . . and that is exactly why we have GOLF!

son of mulder
March 20, 2013 12:26 pm

If the warmists really believed really,really hard then the temperature would be higher. They are not believing hard enough, the crystals are not responding. Gaia feels unloved and is letting the Physics monster win.

Bart
March 20, 2013 12:29 pm

John Finn says:
March 20, 2013 at 11:44 am
“Indeed there is. From ice core data…”
Forget your trumped up ice core analysis. That is a massive, flailing rationalization.
The relationship is right here before your eyes. Temperatures drives CO2. There is no way to contradict it – the best, most modern, most direct, most reliable measurements indicate that temperature drives CO2, and CO2 has little effect on temperature in the modern era.

Matt G
March 20, 2013 12:54 pm

An important reminder: Hansen devised these scenarios after only 8 years of global warming, yet somehow in [their] delusional world 16 years is still not long enough to falsify them. You have been wrong dozens of times and you are wrong now. The planet Earth has falsified you, not the skeptics; the skeptics just have to point to the plentiful evidence supporting this conclusion. Which is why the CAGW crowd have lost the plot, with the most bizarre tactics that can only be described as not science.
p.s. It cannot be claimed to be a short-term natural pause any more, because it is now longer than any pause since the instrumental record began. The only longer periods involved long-term cooling.

Casper
March 20, 2013 1:06 pm

If it don’t fit, use a bigger hammer…

HankHenry
March 20, 2013 1:06 pm

Since so many are chiming in on Dr. Glickstein’s statement below:
“If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real.”
I’d like to chime in myself. The question I pose is: “What precisely is meant by the Earth’s Surface in this calculation?”
The textbook figure that is usually used for this calculation is around 15 ⁰C (15 actual measured degrees less -15 theoretical degrees yields 30 degrees of “Greenhouse Effect” warming). I would like to argue that the textbook number of 15 is higher than the true surface temperature of the Earth by more than 5 degrees C. The textbook number of 15 for the surface temperature of the Earth comes from a merge of air and sea surface temperature records. What is universally overlooked is a vast pool of deep ocean temperatures approaching 4 ⁰C that is also part of Earth’s surface temperature. It is a fascinating phenomenon in and of itself and is due to the fact that water is densest at 4 degrees C. In effect the Earth’s ocean is refrigerated. This refrigeration, as I call it, implies that there is a stream of energy exiting the system (purportedly in the Arctic) that is generally overlooked and unincorporated into the calculation of the degrees of the “Greenhouse Effect” at the Earth’s surface. Assuming straight-line proportionality, I would argue that climate sensitivity calculations based on 30 degrees of Greenhouse Effect are overestimated by 20 to 30 percent simply due to this oversight.
For purposes of incoming energy what amounts to the Earth’s surface is simple, it is anything struck by sunlight. When considering what the Earth’s warmed surface amounts to things are a little more complicated. Nonetheless, it should include all areas influenced by both incoming sunlight and outgoing longwave radiation. This includes the deep ocean.
[Hank Henry: Let us assume, for sake of argument, that the Atmospheric “Greenhouse Effect” is overstated, as you claim, by “20 to 30 percent”. That would change my statement to “over 20 or 25 ⁰C cooler”. So, even accepting your math, the Atmospheric “Greenhouse Effect” is still real. Ira]

Ian H
March 20, 2013 1:14 pm

W.Fox says: I wonder, if the temperature is driving CO2 and not CO2 driving the temperature, then at what temperature drop does the amount of CO2 go down?
Did anyone ever publish any data about that ?

Temperature once was the main driver of CO2. That means that when we look at the past and see temperature and CO2 rising together, we cannot assume that CO2 is the cause and the temperature change the effect, and we cannot use this to infer what changes in CO2 will do to temperature today.
However we don’t know that it is still true today that temperature alone is driving CO2. In fact we have good reason to suspect that CO2 today is also being driven by other things like the burning of fossil fuels and land use changes (e.g. desertification). If today’s higher CO2 levels are caused by multiple factors with temperature being only one of them then we cannot predict what will happen to CO2 by looking at temperature alone.
In fact trying to predict CO2 from temperature is the same kind of mistake as trying to predict temperature from CO2.

Mark Bofill
March 20, 2013 1:42 pm

davidmhoffer says:
March 20, 2013 at 12:06 pm

Troll quality has declined significantly in the last two years. Perhaps this is a proxy for something?
————
See, now that’d be a more worthwhile use of my tax dollars, in my opinion. At least it’d be good for laughs. I’d be glad to whomp up some computer models and fiddle with the data to reach whatever conclusion you’d like – er, to objectively study the question – if we could just resolve the question of where the grant money would come from.

Paul Hanlon
March 20, 2013 1:54 pm

During the last century we had two warm “pulses” and one cold, superimposed over a roughly 0.6K rise from the Little Ice Age (which was the second coldest period since we came out of the last Ice Age), each lasting very roughly 30 years. This century, we will have two cold pulses and one warm, leaving us with temperatures pretty much what they are now. This doesn’t take into consideration the unusually quiet sun, which could affect things even more to the negative side.
How’s that for a “reasonable naive hypothesis”.

TimO
March 20, 2013 2:00 pm

The thing is that you can’t force people’s wallets open to cure a problem that might, just maybe might, be a problem in several centuries to a millennium. It has to be NOW NOW NOW OR WE ALL DIE!

BruceC
March 20, 2013 2:34 pm

An off topic question for Andrew (no offence Ira).
Why are there no ads or ‘donate’ buttons on the RC site?
Great article btw Ira (thumbs up)

Bart
March 20, 2013 2:46 pm

Ian H says:
March 20, 2013 at 1:14 pm
“However we don’t know that it is still true today that temperature alone is driving CO2. “
Yes, we do. Or, those of us savvy enough to understand what this plot is telling us do.

Bart
March 20, 2013 2:53 pm

Ian H says:
March 20, 2013 at 1:14 pm
“In fact trying to predict CO2 from temperature is the same kind of mistake as trying to predict temperature from CO2.”
I absolutely can predict atmospheric CO2 levels from temperature to high fidelity. It is right here. Over the modern era, I can integrate the affine temperature relationship and get it within about 4 ppm.
I can reverse it, and take the derivative of the CO2 to get temperature, but it is the integrated variable which must be the cause to the effect.

Bart
March 20, 2013 3:01 pm

Bart says:
March 20, 2013 at 2:53 pm
“Over the modern era, I can integrate the affine temperature relationship and get it within about 4 ppm.”
I could do even better, centralizing the error, if I did an actual least squares fit to derive the affine relationship instead of just eyeballing. Furthermore, the datasets have varying fidelity. The best agreement appears to be with temperatures in the Southern Hemisphere, which suggests that this is mostly an oceanic phenomenon.
That is perfectly reasonable, since the oceans are a giant conveyor belt for CO2 with continuous flow (from hundreds of years ago) into the surface system in the tropics, and out of it at the poles. Any imbalance between those flows will either accumulate in the surface system, or drain out of it. And, that imbalance is absolutely temperature dependent, and would give rise to exactly the relationship we see of the rate of change of CO2 as an affine function of temperature.
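For anyone who wants to try the least-squares version of the fit Bart describes, here is a minimal sketch in Python, assuming you have already loaded a monthly temperature-anomaly series and a monthly Mauna-Loa-style CO2 series as NumPy arrays of equal length (the array names, the use of np.gradient, and the choice of units are mine, not Bart's):

import numpy as np

def fit_affine_co2_rate(co2_ppm, temp_anom, dt_years=1.0 / 12.0):
    """Least-squares fit of dCO2/dt = k*(T - To) from monthly series."""
    dco2_dt = np.gradient(co2_ppm, dt_years)   # numerical rate of change, ppm per year
    # dCO2/dt = k*T - k*To is linear in T, so a first-order polyfit recovers k and the intercept
    k, intercept = np.polyfit(temp_anom, dco2_dt, 1)
    To = -intercept / k
    return k, To

Whether the fitted relationship really closes the last 55 years of Mauna Loa data to within a few ppm, as Bart reports, is left for the reader to check against the actual series.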

Alex
March 20, 2013 3:17 pm

“… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change)” OK, I’m tired, but it sure looks to me like that naive hypothesis is less wrong than the applicable prediction.

March 20, 2013 3:24 pm

This might sound like a stupid question!
Are the Mauna Loa measurements of atmospheric CO2 measured by volume? If so, can someone quantify the volume for me?

rogerknights
March 20, 2013 3:33 pm

philjourdan says:
March 20, 2013 at 12:25 pm
New age math: 22 is greater than 53 when it suits your purpose.

Robert Anton Wilson’s law of the priority of politics over science:
“If A is greater than B, and B is greater than C, then A is greater than C, except where prohibited by law.”

Goldie
March 20, 2013 3:42 pm

At the risk of being labelled a real denier what evidence do we have that the Mauna Loa seemingly exponential rise in CO2 is human induced? To be honest I would have expected to see some impact from the GFC.

DesertYote
March 20, 2013 4:26 pm

e = mc^2: hint binomial expansion of sqrt( 1 – v^2 / c^2 ) from m1 = m0 * sqrt( 1 – v^2 / c^2 )
[DesertYote: OK, since I’ve been studying Time Dilation in connection with my What is Time? video, I recognize “sqrt( 1 – v^2 / c^2 )” as 1/(Lorentz Factor). v is the velocity of an object in Space and c is the speed of light.
An object moving at v appears to us, if we are “at rest”, to have its length contracted in proportion to 1/(Lorentz Factor) and its time dilated in proportion to the Lorentz Factor. Also, the mass of the object appears to increase by the Lorentz Factor, which you state as “m1 = m0 * sqrt( 1 – v^2 / c^2 )” but I think you meant to write as “m1 = m0 * 1/(sqrt( 1 – v^2 / c^2 ))”.
The reason no object with Mass can get up to the speed of light in Space is that it would take infinite energy to get it up to that speed, apparently because, as its speed approaches c, its mass approaches infinity. You can see that, when v approaches c, and thus v^2 approaches c^2, v^2 / c^2 approaches 1, and 1 – v^2 / c^2 approaches 0, causing 1/sqrt( 1 – v^2 / c^2 ) to approach infinity.
So, since f = ma, and the Energy required to increase the speed of an object is proportional to f, then Energy is proportional to Force, which is equal to m0 multiplied by 1/sqrt( 1 – v^2 / c^2 ) multiplied by a. OK, this is starting to look something like e = mc^2, but I am not quite there. Do you have a reference that will get me there? Ira]
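For what it is worth, one standard way to see where e = mc^2 comes from – which may or may not be the derivation DesertYote has in mind – is to take the relativistic energy E = m0 * c^2 / sqrt( 1 – v^2 / c^2 ) and apply the binomial expansion to the Lorentz factor for small v/c:

1/sqrt( 1 – v^2 / c^2 ) ≈ 1 + (1/2) * (v^2 / c^2)    (for v much smaller than c)
E = m0 * c^2 / sqrt( 1 – v^2 / c^2 ) ≈ m0 * c^2 + (1/2) * m0 * v^2

The second term is the familiar classical kinetic energy, so the total energy of a slowly moving body is its kinetic energy plus a fixed amount m0 * c^2 that is present even at rest. Identifying that constant as the rest energy, and writing m = m0 / sqrt( 1 – v^2 / c^2 ) for the relativistic mass, gives e = mc^2 in general.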

Hugh
March 20, 2013 4:36 pm

Ira’s excellent point: “Even a “naive” prediction of no change would have been closer to the truth (low by 0.22 ⁰C) than Hansen’s Scenarios A (high by +0.68 ⁰C) and B (high by 0.53 ⁰C)!” can be made another way:
Suppose James Hansen had a contrary twin brother, Jock, who made the prediction at the congressional hearings – just after brother James (and turning the air conditioner back on as he came in) – that actually global temperatures would DROP from 1988 levels by .30 ⁰C on CO2 output Scenario B and by .45 ⁰C on CO2 output A. Jock’s COOLING predictions, though not borne out, would nevertheless be nearer to the realised outcome than those of his headline-grabbing warmist brother.
Come on down, Jock!

richard verney
March 20, 2013 4:45 pm

Ira
I have read your comments with interest. Your articles are always thought provoking and objective.
My take on this is that, unfortunately, we do not have sufficient high-quality data from which to form any firm view. Error bars in the data sets are not sufficiently identified and acknowledged, and there has been a lack of empirical experimentation on the true effects of CO2 in real-world conditions and on precisely what DWLWIR can achieve, particularly with respect to its interaction with the oceans and the atmosphere immediately above the oceans (I refer to the atmosphere say a few metres above the ocean and the top centimetre of the ocean itself). Any proxy data needs to be considered with a high degree of caution. Unfortunately the thermometer data set is not fit for purpose; it was never intended to produce a record of global temperatures accurate to within tenths of a degree, and with siting issues, station drop-outs and endless bastardisation by way of repeated adjustments, the legitimacy of which is moot, it cannot be used as a reliable base from which to make extrapolations. Indeed, the land-based thermometer record is not even measuring the correct energy metric, and we lack proper data on clouds, albedo and solar irradiance.
As a consequence, we are very much left with a gut impression as to what is going on. My gut tells me that there is some GHE, caused mainly by water vapour, but that the GHE is very much over-hyped. I do not consider that we sufficiently understand solar irradiance, and I consider that solar irradiance is probably sufficient in itself to prevent the tropical ocean from freezing. If that is so, then global temperatures even without a GHE would probably not be less than plus a couple of degrees. Accordingly, I consider the 30 degree C figure you cite for the GHE to be probably over-hyped. I would not wish to put a figure on the GHE, but I would not be surprised if it turns out to be only about a third of the figure that you suggest.
As regards CO2, I consider it likely that it plays some limited role over land (very much less over the oceans, since water is an LWIR block), and I consider it likely that sensitivity varies with temperature, prevailing water vapour and saturation. With today’s temperatures and water vapour, I consider it probable that CO2 sensitivity is low. I would not wish to forcefully argue against your 0.25 degC figure; maybe it is a bit more, maybe a bit less. We simply do not know enough about natural variation to make a good stab at it, but I would be reluctant to argue against your figure being a good ball-park figure. My gut firmly rails against positive feedbacks; quite simply, planet Earth would not have enjoyed the relatively benign conditions that it enjoys today if feedbacks were positive, and we would not be here some 4.5 billion years after its creation blogging on this topic.
Of course, being sceptical, I cannot rule out the possibility that the entire underlying science is fundamentally flawed. If only there had been some proper empirical experimentation on some of the fundamental issues raised, we would have a better understanding of whether that might or might not be the case.
[richard verney: Well stated. Although I am quite a bit less skeptical than you on the basic science of the Atmospheric “Greenhouse Effect” I cannot object to anything you say here. The data is not accurate to support precision of anything like tenths of a degree, and the Climate System is far too complex to model or predict to that type of precision. As engineers, we understand the difference between precision and accuracy, and the fact that precision to more significant figures than is supported by the underlying accuracy is an illusion. Anyone who says they are absolutely sure about any of this stuff is almost certainly wrong :^). Thanks for this useful and collegial interchange of ideas and opinions! Ira]

Bart
March 20, 2013 4:47 pm

Ira Glickstein, PhD says:
March 20, 2013 at 4:39 pm
Greg is one of those people who believe the whole idea is simply false. And, that is simply false.
But, there are serious criticisms of the GHE. Not that it does not exist, but that it is not necessarily monotonic. The GHE can exist while being locally insensitive to additional GHG.

March 20, 2013 5:13 pm

I went over Mauna Loa’s method of measuring CO2. They process the air, remove water vapor, do a bunch of fatally flawed conversions, bounce infrared light off a box full of processed dry air and measure an electrical response, then they convert this electrical measurement into a mole fraction – a dimensionless quantity – and redefine what Air is. They shift the figures about and give lengthy BS about why, and after they have their data they further process it by calibrating it with other data (who calibrates their data? Mauna Loa?). Then, when they model the calibrated data to produce a graph that says PPM, they are being dishonest in calling it Atmospheric CO2. They give the impression that what they have measured is the volume or effective volume of CO2 in the atmosphere. It is NOT.
They say so themselves “Most people assume that we measure the concentration of CO2 in air, and in communicating with the general public we frequently use that word because it is familiar”
This process can be controlled, even inadvertently, to produce results that give the impression of an increasing mole fraction over a time scale by processing normal variability. Effectively, the processed data produced is artificial and can in no way be used to quantify what atmospheric CO2 is for the planet (and I certainly wouldn’t use it to predict future climate temperatures). What can you compare an artificial value to?
It also seems that there are assumptions built into their data of fossil fuel use, just like the CO2 based climate models.

Bill Illis
March 20, 2013 5:21 pm

This is the actual effective forcing numbers that Hansen used.
http://www.realclimate.org/data/H88_scenarios_eff.dat
If you compare that to what IPCC AR5 is using for the increase in net forcing since 1984 (the RCP 6.0 scenario, which is the most realistic trend), it is almost identical to the increase in forcing Hansen used in Scenario B. So B was an accurate description of the forcing which actually occurred (despite the spin from RealClimate or Skeptical Science).
http://www.pik-potsdam.de/~mmalte/rcps/data/RCP6_MIDYEAR_RADFORCING.DAT
http://www.pik-potsdam.de/~mmalte/rcps/data/RCP6_MIDYEAR_RADFORCING.xls
Couple of other Hansen 1988 data sources; temperature anomalies under the scenarios and the GHG concentrations.
http://www.realclimate.org/data/scen_ABC_temp.data
http://www.realclimate.org/data/H88_scenarios.dat
Scenario B temp in 2012: +1.065 C; GISTemp: +0.560 C.
Predicted increase in temps: +0.983 C; actual increase (GISTemp): +0.44 C.
Hansen also updates his own 1988 predictions here.
http://www.columbia.edu/~mhs119/Temperature/T_moreFigs/PNAS_GTCh_Fig2.gif

chris y
March 20, 2013 5:22 pm

Regarding Hansen’s ABC emission scenarios, here is an insightful comment at Climate Audit a few weeks ago-
Climate Audit comment, 03/02/2013
Kurt in Switzerland
“Ref. Scenario B being the most reasonable comparandum to actuals:
I disagree. Scenario A corresponds to 1.5% annual CO2 emissions growth. Actuals have been about 1.9% per annum since 1990, i.e., above Hansen’s Scenario A.
Hansen’s CFC growth rate for Scenario A was 3% p.a.
The additional 0.4% p.a. growth rate in CO2 would compensate for the lack of CFC growth, n’est-ce pas?
Furthermore, Hansen’s Scenario B called for the growth rate of CO2 emissions to decline from 1.5% per year {1988} to 1% per year in {1990}, 0.5% per year {2000} and 0.0% per year {2010}. Clearly this didn’t happen.
From a cursory look at Hansen et al 1988 Appendix B, the estimated forcing due to both CFC’s mentioned (F-11 and F-12) rising from 0.0 to 2.0 ppb is about 1/4 that due to CO2 doubling (315-630 ppm). The lack of this should approximately cover the underestimate of CO2 emissions growth.
I’ll consider CH4 second order (and much tougher to calculate). Given all the media screaming about methane clathrates being “freed” due to unprecedented arctic warming, one would think actual methane emissions had exceeded Hansen’s worst fears.
So Scenario A is the closest to reality, not Scenario B (it would seem to me).
Either that, or the net UPTAKE by the atmosphere as a result of said emissions was poorly estimated. But that would be a different story. Hansen’s Scenarios are defined primarily by human emissions, not by atmospheric concentration per se.
Kurt in Switzerland”

Greg House
March 20, 2013 5:36 pm

Ira Glickstein, PhD says, (March 20, 2013 at 4:39 pm): “I appreciate your concern for my being correctly informed. I have a similar concern for you. I would appreciate it if you, and others who are having a problem accepting the Atmospheric “Greenhouse Effect” as real would read my Visualizing series, here at WUWT. … This series has garnered over 2000 comments at WUWT, mostly positive. I’ve learned a lot from WUWT readers who know more than I do. However, some commenters seem to have been taken in by scientific-sounding objections to the basic science behind the Atmospheric “Greenhouse Effect”.”
============================================================
“Basic science”, Ira? Let me give you an example. 2×2=4 is basic math. 2×2=5 is not basic math; it is not science at all, it is false. I do not think there is a reason to call warming by back radiation (this is the “greenhouse effect” as presented by the IPCC) science. It is false, Ira. Of course, you are entitled to a different opinion.
As for your articles, I am already familiar with the concept of “warming by back radiation” and find it proven false on both the theoretical and experimental levels. As I said, on the experimental level it has been known since the R.W. Wood experiment; maybe you need to read his paper. It is really easy reading. Second, on the theoretical level, in a certain case of a body at a stable temperature that starts receiving back radiation according to the “greenhouse effect” concept, the result would be that more energy is radiated away than there is in the system, and this proves that the initial assumption (the “greenhouse effect”) is false.
I guess I’d better not wait till you start digging yourself and give you a link to that second point. Note that it is about a theoretical setup our beloved Willis invented to illustrate how “greenhouse effect” is supposed to work: a planet with a constant internal power supply surrounded by a sphere. Here is the link to my comment about that on another blog: http://climateofsophistry.com/2013/03/08/the-fraud-of-the-aghe-part-11-quantum-mechanics-the-sheer-stupidity-of-ghe-science-on-wuwt/#comment-828. Well, the actual post starting that thread is overheated, but please, focus just on the scientific point.
[Greg House: I have read and considered the R. W. Wood experiment material and nevertheless remain convinced of the basic truth of the Atmospheric “Greenhouse Effect” and that the radiation exchanged between the Earth Surface and the “Greenhouse Gases” in the Atmosphere has been properly accounted for.
However, as I expressed above in my agreement with Richard Verney, I do not accept the levels of precision claimed by some Climate Scientists. That is why my estimate of Climate Sensitivity has such a large range, from 0.25 ⁰C to 1 ⁰C, and even my high estimate is a factor of three less than the IPCC’s and a fourth of Hansen 1988. It is also why I think net warming since 1880 is closer to 0.5 ⁰C than the 0.8 ⁰C or higher claimed by the official Climate Team (“hockey team”).
So, I side with “our beloved Willis” and the proprietor of WUWT, and scientists such as Roger Pielke (Sr and Jr :^), and the other skeptics and lukewarmers who seem to me to have taken a detailed and rational approach to this subject area. Ira]

richard verney
March 20, 2013 5:49 pm

Bart says:
March 20, 2013 at 10:31 am
>>>>
richard verney says:
March 20, 2013 at 8:37 am
“If one looks at the 33 years of satellite data then there would appear to be no first order correlation between temperature rise and the rise in CO2 emissions.”
But, there is a definite correlation between the rise in CO2 concentration and temperature.
//////////////////////////////////////////////////////
Your referenced data proves my point. No mathematician worth his salt would fit a single straight line through the temperature anomalies between 1977 and the end of 2012. A mathematician would look at that data set and say that there is an approximate straight-line fit at around the +0.12 C level for the period 1977 to 1996 and another straight-line fit at around the +0.18 C level for the period 2000 to the end of 2012. The mathematician would note the anomaly around 1997 to 1999 where a step change takes place; that step change is not brought about by CO2, unless CO2 caused the super El Nino of 1998, and, as I said, no one suggests that CO2 levels were responsible for that event.
If you wish to consider the point I made in more detail I would suggest that you split your plot in two. First consider the period 1979 to say 1997 and then consider the period 1999 to date.
As far as your CO2 derivative goes, are you seriously suggesting that, as a consequence of manmade CO2 emissions, the global derivative in 1997 was 0.02 but in 1998 it was 0.31? What evidence is there for a manmade emission change of this order between 1997 and 1998? Please provide your data on annual manmade emissions supporting such a change in these years.
I think that what you are looking at in the CO2 derivative is a response, not a cause. The ocean heat release has out-gassed CO2. CO2 has not driven the temperature spike; rather, what you are seeing in the derivative is the CO2 response to the El Nino temperature change.

Rick Bradford
March 20, 2013 5:50 pm

Dr. Will Happer (Physicist), Princeton: “I have spent a long research career studying physics that is closely related to the greenhouse effect. Fears about man-made global warming are unwarranted and are not based on good science.”
Dr. Lee C. Gerhard (Geologist), UN IPCC expert reviewer: “I never fully accepted or denied the anthropogenic global warming (AGW) concept until the furor started after [NASA’s James] Hansen’s wild claims in the late 1980s. I went to the scientific literature to study the basis of the claim, starting at first principles. My studies then led me to believe that the claims were false, they did not correlate with recorded human history.”
Dr. Nicholas Drapela (Chemist): “My dear colleague [NASA’s James] Hansen, I believe, has finally gone off the deep end… The global warming ‘time bomb,’ ‘disastrous climate changes that spiral dynamically out of humanity’s control.’ These are the words of an apocalyptic prophet, not a rational scientist.”
Now *that’s* a model that doesn’t need updating…..

March 20, 2013 6:01 pm

Hansen’s Scenario A appears to be the best fit for CO2 emissions, but I’ve heard it argued by lukewarmers that Scenario B is preferred because Scenario A includes increasing levels of CFCs, which have instead decreased thanks to the Montreal Protocol, which applied from around the time of the prediction.
So to accept that Scenario B is the most apt means accepting that CFCs had about as much impact on the climate as CO2 did. That doesn’t seem very plausible to me, and so I think Hansen’s Scenario A is closer to his true prediction than Scenario B is.

davidmhoffer
March 20, 2013 6:10 pm

As I said, on the experimental level it has been known since the R.W.Wood experiment, maybe you need to read his paper. It is really easy reading.
>>>>>>>>>>>>>>>>>>>>
What the Wood experiment demonstrates is precisely the same thing that Mann’s hockey stick graph demonstrates. That, on both sides of the debate, there are those who justify their world view on the flimsiest of evidence and cling to it in the face of overwhelming factual evidence to the contrary.

March 20, 2013 6:38 pm

This graphic – with excerpts from the recent Marcott paper – is, it appears, pretty interesting.
When you match the main Marcott temp history graphic against graph “D”, the ice-core GHG-measured radiative forcing, things get interesting. The thin green line is 1950 (“present” for BP).
The GHG-measured radiative forcing bottomed appx 8000 years BP and was relatively flat for appx 1000 years, after which it began climbing – relatively uniformly – and has continued climbing since. About 2000 years BP the radiative forcing level flattened, similar to 8000 yr BP, until a sharp increase between 1000 and 800 years BP, after which – from 800 years BP to the end of the data – the GHG-measured radiative forcing was decreasing.
What is interesting to me is that at virtually identical times the GHG-measured radiative forcing bottomed and began climbing, AND there is a clearly defined warming pulse/peak in the Marcott data. Since that time GHG-measured radiative forcing has been increasing while the Marcott-demonstrated temps have been decreasing.
It is also interesting that the same pattern seems to exist at the 8000-7000 yr BP and 2000 to 1000 yr BP points. The GHG-measured radiative forcing flattens for appx 1000 years, during which time (7000 and appx 1200 yrs BP) there is a noted, short-lived temp spike, followed by accelerated cooling.
In the time period appx 50 to -50 yrs BP (1900-2000) – ignoring the Marcott “spike”, which it appears is not supported by the data – we see a similar warm spike beginning in the Marcott graphic. We also know this matches the temp record for the same period, with temps reaching a peak appx 1997 and leveling since.
Despite the fact that GHG-measured radiative forcing increased from 7000 BP through the end of the data presented, and that we know GHGs have continued to increase steadily over the last 100 years and more, Marcott shows overall temps have continued to fall from the peak appx 7000 BP.
We also know that, after climbing from the late 1800s through the late 1990s, temps have leveled for the last appx 17 years while GHGs continue their rise.
While this is admittedly “eyeball” science – and assumes Marcott’s charts are accurate – it certainly appears Marcott shows that GHG-measured radiative forcing is clearly not a driver of temperatures – at least not a driver of temp increases.
Or I could be missing something altogether … 😉
http://tinyurl.com/marcott-ghg

Steve
March 20, 2013 7:12 pm

It looks like Hansen has shown that whatever was causing climate forcing ceased perhaps a little before 2000. If his methodology has any sound basis, perhaps he demonstrated that CO2 was never causing the temperature increase he was tracking. He has possibly presented a pretty good argument against assigning temperature increases of the past few decades to CO2 increases.

john robertson
March 20, 2013 7:12 pm

Hah, I just spotted the “Teams” way out.
Upthread someone was critiquing the CO2 measuring methods at Mauna Loa; there’s the escape clause.
“We have spotted an error in the CO2 record; emissions actually dropped in 1998 and have stabilized/fallen ever since.” Therefore Scenario C is 100% accurate, honest.
Did I need sarc?

Bart
March 20, 2013 7:47 pm

richard verney says:
March 20, 2013 at 5:49 pm
“As far as your CO2 derivative, are you seriously suggesting that as a consequence of manmade CO2 emissions the global derivative in 1997 was 0.02 but in 1998 it was 0.31.”
You appear to be missing my point entirely. “Manmade CO2 emissions” are having no discernible effect on CO2 levels at all. It is essentially all being driven by temperature. The differential equation describing the relationship is
dCO2/dt = k*(T – To)
dCO2/dt = derivative of atmospheric CO2 concentration
k = coupling constant in ppm/degC/unit-of-time
T = global temperature anomaly
To = equilibrium level for global temperature anomaly
“I think that what you are looking at when you are looking at in the CO2 derivative is a response, not a cause.”
Exactly!
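A minimal numerical sketch of the relationship Bart writes down above, in Python, assuming you supply a monthly temperature-anomaly array together with values for k and To (the numbers below are placeholders of my own, not Bart's fitted values):

import numpy as np

def co2_from_temperature(temp_anom, co2_start, k, To, dt_years=1.0 / 12.0):
    """Integrate dCO2/dt = k*(T - To) forward in time from a starting concentration."""
    rate_ppm_per_year = k * (temp_anom - To)
    return co2_start + np.cumsum(rate_ppm_per_year) * dt_years

# Placeholder example: a flat 0.3 degC anomaly held for ten years, k = 2 ppm/yr/degC, To = 0
co2 = co2_from_temperature(np.full(120, 0.3), co2_start=350.0, k=2.0, To=0.0)
print(co2[-1])   # about 356 ppm after ten years under these made-up numbers

Integrated this way, the whole CO2 change over the interval comes from the temperature series alone, which is the point Bart is making; whether a fitted k and To actually reproduce the Mauna Loa record is the empirical question.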

Bart
March 20, 2013 8:13 pm

Bart says:
March 20, 2013 at 10:31 am
Ira comments:
[Bart: Have a close look at the Mauna Loa CO2 data (second graphic above) and tell me where CO2 has “leveled out”. Yes, since CO2 has its small seasonal ups and downs there are some months where it is level, but, on a smoothed yearly basis it seems to me to be a continued rapid increase that is slightly exponential in the upward direction. Ira]
The slope has leveled out. Look right there at the end where temperatures are leveling out, too. That is what I said: “But, the rate of change of CO2 has leveled out, in lockstep with the leveling out of temperatures.” Rate. Of. Change. Not the concentration itself, but its rate of change.
[Bart: What does that link or your mathematical jibber-jabber have to do with the past century? Or with the Atmospheric “Greenhouse Effect” that makes the Earth Surface at least 30 ⁰C warmer than it would be if the Atmosphere lacked “Greenhouse gases”? I read your words four times and visited your link. Please explain your point in English. advTHANKSance. Ira]
Just because the GHE heats up the surface above what it would be without the GHGs in the atmosphere does not mean that adding more of a particular GHG will increase it further. It is the difference between a global feature of a function and local behavior. For example, the function
y = 3*x – 6*x^2 + 4*x^3
is a generally increasing function of x. But, between about 0.4 and 0.6, it is essentially flat. At 0.5, the local derivative is zero. A small change in x doesn’t change the function hardly at all.
Similarly, the functional dependence of surface temperature on increasing CO2 appears to be such that, though it warms the surface above what it otherwise would be, we are at a point on the function where adding more doesn’t lead to a significant increase in temperature.
[Bart: I have to agree with Richard Verney that there is no FIRST ORDER correlation between temperature rise and the rise in CO2 emissions. Please don’t simply point to some graph. Explain what you mean so even Richard and I can understand. advTHANKSance. Ira]
Not emissions! Concentration! Emissions are hardly doing anything at all. You can essentially ignore them. It is clear that they are rapidly being sequestered or otherwise transported out of the surface system by ocean, mineral, and biological processes.
Those flows are minuscule compared to natural flows. Temperatures are driving the rate of change of CO2. It is a continuous-flow problem, in which the differential rate between sources and sinks is being modulated by temperature, as I described here on this page.
Thanks for responding. I hope I have made things clearer and that someone will begin to appreciate what is so patently obvious in the data: Atmospheric CO2 concentration is not being driven significantly by human inputs. It’s the temperature modulation of a continuous transport system.
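As a side note on the cubic Bart uses as an illustration above: the flat spot is easy to verify, since dy/dx = 3 – 12*x + 12*x^2 = 3*(2*x – 1)^2, which is non-negative everywhere and exactly zero at x = 0.5. The function therefore rises overall, yet a small change in x near 0.5 barely moves it – which is the local-versus-global distinction he is drawing.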

garymount
March 20, 2013 9:59 pm

Lubos Motl has a detailed look at the Mauna Loa CO2 data.
http://motls.blogspot.ca/2013/02/mauna-loa-carbon-dioxide-fit.html

Old Engineer
March 20, 2013 10:56 pm

Ira-
Since the last time Hansen et al. ’88 was the topic here at WUWT, I have been studying the paper and doing some calculations and plotting. Unfortunately, my lack of internet skills has prevented me from preparing a post to submit.
The main point I want to make is that there is a fourth scenario that nobody mentions. It is shown in the very first figure of Hansen ’88. This scenario is the 100-year control run. It is called the “control run” because all the ghg’s were held constant at their 1958 values. That’s right, the control run keeps the ghg’s constant for 100 years (to 2058).
However, to accurately compare Hansen’s scenarios with measurements, it is necessary to compare the ghg’s in the scenarios with the ghg’s estimated from measurements. The tabular listing of the ghg’s is at RealClimate at:
http://www.realclimate.org/data/H88_scenarios.dat
The NOAA estimated ghg’s in graphical form by year are at:
http://www.esrl.noaa.gov/gmd/aggi/aggi_2011.fig2.png
When these are compared it can be seen that Hansen’s Scenario C is a good match for everything but CO2. Up until 2010, Scenario B was the best estimate of CO2. Fortunately, Hansen’s paper presents a method for adjusting for these differences. In the words of the paper, “The forcing for any other scenario of atmospheric trace gases can be compared to these three cases by computing the [delta t] with the formulas provided in Appendix A.”
Using the Appendix B formulas for CO2 , the difference between the CO2 delta T for Scenarios B and C can be calculated. This difference can then be added to the delta T for Scenario C to obtain a Scenario that is approximately correct for all ghg’s Hansen considers.
For 2010 this adjustment for CO2 adds about 0.11 deg C to the delta T for Scenario C.
I chose to compare the adjusted Scenario C to the UAH satellite measurements. However before that can be done the Hansen Scenarios must be adjusted to the same base years as the UAH measurements (1981-2010). You must use the control run to do this. Then add the delta T differences between the control run and the various Scenarios to the adjusted control run delta T for each year to obtain the Scenarios on an ’81 to ’10 basis.
When this is done the following temperature anomalies are obtained for 2010 (using a 5yr, centered average):
Scenario C adjusted for CO2 and base year = 0.63 deg. C.
UAH = 0.18 deg C.
Control run = 0.08 deg C.
The fact that the 2010 temperature anomaly based on measurements is 0.45 deg. C. lower than the adjusted Scenario C, but only 0.1 degree above the Control run shows that the model is completely useless in predicting future temperature, and that the methodology for determining the effects of ghg’s is deeply flawed.
The key is to include the control run in all scenario comparisons.
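A minimal sketch in Python of the rebasing step Old Engineer describes, assuming the control run and a scenario are available as annual anomaly arrays on a common set of years (the function and array names and the use of NumPy are mine; the Appendix B CO2 adjustment itself is not reproduced here):

import numpy as np

def rebase_scenario(years, control, scenario, base_start=1981, base_end=2010):
    """Re-express a Hansen scenario relative to the control run's mean over a chosen
    base period (e.g. the 1981-2010 base used by UAH)."""
    in_base = (years >= base_start) & (years <= base_end)
    control_rebased = control - control[in_base].mean()
    # each scenario keeps its year-by-year offset from the control run,
    # applied on top of the rebased control run
    return control_rebased + (scenario - control)

def centered_5yr_mean(series, i):
    """Five-year centered average around index i, as in Fig. 3 of Hansen '88."""
    return series[i - 2:i + 3].mean()

With the adjusted Scenario C, the UAH anomaly and the rebased control run in hand, the 2010 comparison Old Engineer reports (0.63, 0.18 and 0.08 ⁰C) is then a straight read-off.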

Keitho
Editor
Reply to  Old Engineer
March 21, 2013 12:09 am

Leave it to an (Old) Engineer to point out something so lost in the hustle and bustle, and to point it out so clearly.

Bart
March 20, 2013 11:19 pm

Old Engineer says:
March 20, 2013 at 10:56 pm
Nice. The evidence is piling up – AGW is dead. The only question is how long its inertia will keep it going.
If anyone wants to know what the next 30 or so years is going to look like, appropriately offset and graft the portion of this plot from about 1945 to 1975 onto today’s temperatures. And, for CO2, expect the rate of change to decrease with the temperatures as it is already doing.

mitigatedsceptic
March 21, 2013 2:22 am

AGW is a Crime against Humanity and the perpetrators should be pursued through the courts. Whinging in the press does nothing to ameliorate the hurt AGW has imposed on countless millions by impoverishing them with gargantuan fuel bills and even starvation. Hansen et al should be held to account. Why, it seems, is no one really mad with ANGER?

jc
March 21, 2013 4:10 am

Re: Hansen.
Hansen abandoned interest in Venus, which was his first area of research, at the precise point when atmospheric conditions became measurable. That is, when whatever speculations he had advanced could be verified empirically. Or disproved.
He claims that just at the point where this was going to occur, he suddenly decided to focus, effective immediately, on the earth and AGW. Without knowing whether the fruits of years of work were vindicated.
This is utter bullshit.
People simply do not behave this way.
If someone with sufficient expertise looks at his published papers prior, and the actual measurements obtained, it will be obvious that he was wrong. At a guess, wildly, undeniably, wrong.
Thus his subsequent life: determination never to be shown wrong achieved through his control and manipulation of data.

thingodonta
March 21, 2013 4:58 am

Hansen didn’t include the effects of the sun and ocean cycles on multi-decadal time scales throughout the 20th century; he simply assumed that all late-20th-century warming was being caused by human greenhouse emissions, and the failure of his forecast since then has shown his theory to be wrong.

Richard M
March 21, 2013 5:20 am

It’s not surprising that RC would use a red herring to further their illogical argument. The need to use a “no warming” comparison is just plain silly. It completely ignores the most likely comparison of ocean oscillations. When the PDO/AMO are considered the current warming is almost a perfect match. Their dishonesty is typical of the alarmist mind.

J. Bob
March 21, 2013 8:02 am

Looking at the “Adjusted Data”, in the RC article, I wonder how Roy Spencer feels on how his UAH data was “adjusted”.

David L. Hagen
March 21, 2013 8:31 am

Ira
Good graph & explanation.
May I suggest comparing Hansen’s predictions with a first order null hypothesis of continued warming from the Little Ice Age. e.g. see
Syun-Ichi Akasofu, On the recovery from the Little Ice Age
Natural Science Vol.2, No.11, 1211-1224 (2010) doi:10.4236/ns.2010.211149
Openly accessible at http://www.scirp.org/journal/NS/
On CO2, Fred H. Haynie provides fascinating analyses of CO2 versus latitudinal temperatures in The Future of Global Climate Change. e.g. in slide 10 he shows polar CO2 driven by the respective polar ice extent.

Steve Hill from Ky
March 21, 2013 8:59 am

Hansen, the 1977 Ice Age man, strikes out again… face it, he is a moron.

Old Engineer
March 21, 2013 10:37 am

Ira-
It was very late (for my time zone) last night when I saw your post, so my comments were brief. I didn’t get a chance to thank you for bringing the RealClimate post to our attention.
Thanks also for so forcefully pointing out that the RC post was just plain wrong. That the “naive” prediction of no change, which is exactly what Hansen’s own Control Run is (no change since 1958!), is closer to the measured temperature anomaly than any of Hansen’s scenarios.
As you pointed out, “Hansen 1988 is the keystone for entire CAGW Enterprise.” So it needs to be hammered home again and again, that the premise on which this keystone is based has been shown to be wrong.
I’m not sure I like the “climate sensitivity” way of looking at ghg effects. But since there is a long history of using climate sensitivity, I certainly agree with you, it is probably close to 1. I think others have also come to same conclusion using different approaches.

Phil.
March 21, 2013 10:46 am

THERE HAS BEEN NO ACTUAL “CURTAILMENT OF TRACE GAS EMISSIONS”
As everyone knows, the Mauna Loa measurements of atmospheric CO2 proves that there has NOT BEEN ANY CURTAILMENT of trace gas emissions. Indeed, the rapid increase of CO2 continues unabated

You have made the same error that McIntyre made several years ago at CA. Look carefully at Fig 2: you’ll see that the difference between Scenario A and B is less than 0.1ºC by now. Hansen clearly showed that the main cause of temperature increase up to now would be due to the ‘trace gases’ if they continued to increase at the then-current rates. This is clearly stated in the abstract and the Intro para.
[Phil: Fig 2 of what? My WUWT posting of the RealClimate posting I got the base graphic from? Yes, according to the RealClimate graphic (which is the first graphic of my posting) there is only a small difference between A and B now, and BOTH are WRONG (as compared to ACTUAL), and BOTH assume continued “trace gas emissions”, and all Hansen Scenarios (A, B, and C) assume a Climate Sensitivity of 4.2 ⁰C. The ONLY one of Hansen’s scenarios that is close to the ACTUAL is Scenario C, which is ONLY 31% high. What is your point? What is your evidence? Do they teach clarity at Princeton? Please state your points clearly and I will try to respond. advTHANKSance. Ira]

richard verney
March 21, 2013 12:07 pm

@ Bart says: March 20, 2013 at 7:47 pm
//////////////////////////////////////////////////////
Bart
It is often said that CO2 is a “well mixed” gas. But the accuracy of that claim depends upon the meaning one ascribes to ‘well’. Factually, is it ‘well mixed’ or only ‘reasonably well mixed’? The latest satellite data suggests that it is not what I would classify as being ‘well mixed’ but instead it is rather lumpy (for want of a better expression). Indeed, Moustafa Chahine of the NASA Jet Propulsion Laboratory (JPL), AIRS’s principal scientist, in a press conference partly conceded as much. He said: “Contrary to prevailing wisdom, CO2 is not well mixed in the mid-troposphere,”
Of course there are large variations in the concentration of CO2 with geographical location (i.e., between the hemispheres), and this variation can become more marked depending upon the season. Even the concentration of CO2 varies quite significantly from night to day, and this daily variation is more or less marked depending upon geographical location. There are also altitude variations, etc. These variations can exceed 20 ppm (say 5 or so percent).
Having read your comments (unfortunately only quickly, and I probably have not done them justice), it appears that we are talking at cross purposes. I am looking at the response to CO2 on a multidecadal time frame, whereas you appear to be looking at a response measured on a much smaller time frame (measured in months). I understand your point, but I remain unsure of the extent to which it bears on climate sensitivity on a multidecadal basis. Are you not looking more at seasonal variations, and of course seasons and temperatures have a close correlation?
My bottom-line take is that you cannot begin to ascertain climate sensitivity until you first know and understand everything there is to know about natural variation, and in particular its bounds. Until that is known, it is not possible to separate the CO2 signal from the noise. Presently, all the data sets are not fit for purpose; they are low resolution and far too noisy. In addition, can they even be relied upon, given the endless adjustments, the need for and correctness of which is moot?
The only thing we really know about climate sensitivity is that natural variation is stronger; it can trump climate sensitivity. Proof: (i) there has been no warming these past 17 years, since downward forcings associated with natural variation have equalled the upward forcing component of climate sensitivity to CO2 during this 17-year period; (ii) there was cooling between, say, 1940 and 1975, because the downward forcings associated with natural variation more than equalled the upward forcing component of climate sensitivity to CO2 during that period, such that natural variation was able to drive temperatures down even in the face of the warming effect of CO2!
If climate sensitivity is high, then in view of (2) above, we know that natural variation is even stronger!
Of course, I appreciate that you consider that CO2 can do nothing and it may be that is the case, or may be it is the case given the saturated levels already reached today. I myself am unable to make any firm assertions mainly because of the poor quality data available, and its error bars, coupled with a lack of empirical observational experimentation on the issues raised. I feel that we are all groping around in the dark.

March 21, 2013 2:20 pm

I don’t think the Earth’s atmosphere increases temperature by 30 C. The Moon has an average surface temperature of about -5 C, and for Earth it is about 14 C. The Moon has no atmosphere. Atmospheric pressure alone increases temperature; a case in point is Venus. I am willing to bet that if the Moon had a thick atmosphere composed only of oxygen and nitrogen, with no CO2, water vapor or any other greenhouse gas, it would have an average temperature of at least 10 C.

David
March 21, 2013 2:45 pm

amoorhouse says:
March 20, 2013 at 8:19 am
—————————————————————————–
Well said. Nothing they said and did was close. However, they have succeeded in severely damaging the world’s economy.

Leo Smith
March 21, 2013 3:21 pm

I THINK the calculated sensitivity with *no positive feedback* is 0.4C and that I would say is right.
Water is as far as I can see a massive NEGATIVE feedback system but with multi-decadal lags due to oceanic thermal inertia so it tends to lead to oscillatory behaviours.

Nick Stokes
March 21, 2013 3:59 pm

“BRILLIANT! Correcting Hansen Scenario C for the true Atmospheric CO2 you get 0.63 °C warming, but the true warming, based on UAH (satelite sensors) is only 0.18 °C.”
Some funny arithmetic here. OE says the adjustment for CO2 is 0.11 °C. You say Scenario C had a rise of 0.29 °C. That makes 0.4, as I understand it.
The rise in GIStemp LOTI was 0.393°C. Looks pretty good to me!
You say UAH showed 0.18°C warming? Looks like 0.41°C to me.

Bart
March 21, 2013 4:54 pm

richard verney says:
March 21, 2013 at 12:07 pm
“…whereas you appear to be looking at response measured upon a much smaller time frame (measured in months).”
Well, 55 years is 660 months. If you call that a “small” time frame, then…¯\_(ツ)_/¯
“Are you not looking more at seasonal variations, and of course, seasons and temperatures have a close correlation?”
It is 55 years in which the CO2 derivative has matched the temperature anomaly perfectly. That 55 years saw the major part of the increase in atmospheric concentration from “pre-industrial” levels.
I’m wondering if you understand that the derivative contains all the information needed to reconstruct the entire change during the time interval it is plotted? I did that here. That’s all you need to reconstruct the entire delta-CO2 in the last 55 years to high fidelity: temperature. You don’t need human inputs. Their contribution is therefore necessarily insignificant.

Old Engineer
March 21, 2013 6:15 pm

Nick Stokes says:
March 21, 2013 at 3:59 pm
=================================================================
Nick-
Perhaps I didn’t put enough description on my values.
First, the anomaly values I gave were 5 years averages, just as Hansen did in Figure 3 of Hansen ’88.
Second, they were at year 2010 (so the average was from 2008 to 2012 inclusive)
Next, the 0.11 deg. C., was increase in the Scenario C 2010 temperature anomaly that would result if the Scenario C 2010 atmospheric CO2 concentrations were the same level as the Scenario B CO2 concentrations. This was calculated from the equations in Appendix B of Hansen ’88, using the method Hansen presented. Thus, the 0.11 should be added to the Scenario C 2010 temperature anomaly shown in Figure 3 of Hansen ’88.
Finally note, that to compare Hansen’s scenario’s to UAH, it is necessary to adjust the base to the UAH base, which is 1981 to 2010. Hansen’s base was the average of the 100 year Control Run. To do this, the Control Run is adjusted to an ’81 to ’10 base, then the differences from the original control run and the three scenarios are added to the adjusted Control Run to get the adjusted scenarios. The adjusted values are slightly different from the Hansen values shown in Ira’s graph.
As for the UAH temperature anomaly, the 0.18 is the 61 month average, not the yearly value.

Phil.
March 21, 2013 6:43 pm

Ira, the Fig 2, abstract and Introduction I referred to are from Hansen (88), the paper that’s under discussion! I naively assumed that you’d actually read the paper whose results you’re critiquing. That you apparently haven’t explains some of your misunderstandings.

March 21, 2013 9:00 pm

The “predictions” which various bloggers attribute to Hansen are not predictions but rather are projections. Though they are often conflated, the term “prediction” and the term “projection” have differing meanings. To conflate the two terms is to foster deceptive arguments on the methodology of the international study of global warming via the equivocation fallacy.

davidmhoffer
March 21, 2013 10:01 pm

Terry Oldberg;
Though they are often conflated, the term “prediction” and the term “projection” have differing meanings.
>>>>>>>>>>>>>
Darn right. One is based on data with error bars and a defined precision and the other is an excuse for not having either.

Reply to  davidmhoffer
March 22, 2013 8:54 am

davidmhoffer:
Thanks for taking the time to reply. There are interesting differences between models that make predictions and models that make projections. One is that a model of the former type conveys information to a policy maker about the outcomes from his or her policy decisions while a model of the latter type conveys no information. Thus, while a model of the former type is suitable for making policy, a model of the latter type is completely unsuitable. In AR4, the models that are cited by the IPCC as the basis for making policy on CO2 emissions convey no information to policy makers about the outcomes from their policy decisions; thus these models are completely unsuitable for making policy.
Though being completely unsuitable, models that make projections are what are used in making policy. That this is so is one of many ghastly consequences from the incorporation of the equivocation fallacy into arguments regarding the methodologies of climatological studies.
An “equivocation” is an argument in which a term changes meaning in the middle of the argument. By logical rule, to draw a conclusion from an equivocation is improper. To draw a conclusion from an equivocation is the equivocation fallacy. Under the circumstance that the words “prediction” and “projection” are treated as synonyms, the word pair prediction-projection has dual meanings and is said to be “polysemic.” In making arguments about the methodologies of their studies, climatologists use polysemic terms that include prediction-projection and draw improper conclusions from these arguments. One of the consequences is to move money from the pockets of non-climatologists to the pockets of climatologists. As uses of the equivocation fallacy are hard to spot, few of the non-climatologists are aware of having their pockets picked by the climatologists!

E.M.Smith
Editor
March 21, 2013 11:04 pm

Oldberg:
I’m so sick of that deceptive canard. It is used as a shield to cover blatant deception. First started by the Club Of Rome propaganda piece “Limits To Growth” by Meadows et al., as near as I can tell.
Folks tell you what they expect to happen in the future. That’s a prediction. Fortune tellers do not say “I will now project your future!”… One doesn’t say “The High Priest will now project the date of the eclipse.” In common usage, saying “this is what will happen” or “this is what I expect to happen” is a prediction.
The rest of the word play is just bafflegab to dodge responsibility for it being a FAILED prediction.
And I predict the Warmers will continue to play the Projection Game as cover for abuse, deception, and error.

Reply to  E.M.Smith
March 22, 2013 9:31 am

E.M.Smith:
The deception being practiced by participating climatologists has been studied by logicians and is called by them the “equivocation fallacy.” Please see my response to davidmhoffer for details.
By the way, skeptics as well as warmists are guilty of incorporating the equivocation fallacy into arguments about methodology. Participating skeptics unwittingly parrot uses of the equivocation fallacy by their opponents the warmists. In doing so, they reduce their argument with the warmists to one that is over the size of the equilibrium climate sensitivity (TECS). TECS does not logically exist but rather is a product of uses of the equivocation fallacy in making methodological arguments.

richard verney
March 22, 2013 1:57 am

Ira
I consider that Russ R, (see his comment : Russ R. says: March 20, 2013 at 9:32 am)
raises a very good point. Have a look at it.
I think that your article could be improved if you were to detail the CO2 assumptions (in ppm), both natural and man-made, which Hansen made in Scenario A and Scenario B. If Hansen was basing these assumptions on, say, 30 years’ worth of CO2 emissions between about 1958 and 1988, then CO2 from 1988 to date has risen at more than a linear rate. In particular, the man-made component certainly has.
I had always thought that CO2 emissions were above BAU (Scenario B) but less than Scenario A, i.e., that we were somewhere between the two scenarios, running a little above BAU. However (given what Russ R says), it may be that my understanding of this is incorrect: although man-made CO2 emissions are above those assumed in BAU (Scenario B), this increase has been offset by the naturally occurring CO2 sinks, which have grown faster than was envisaged in BAU (Scenario B).
I think if we are to consider Hansen’s ‘projections’ (some would say ‘predictions’) we need to consider specifically CO2 emissions and the assumptions that Hansen made with respect to these. Your Mauna Loa plot shows what has happened to CO2, but it does not detail precisely what rate Hansen assumed CO2 emissions would be running at in his various scenarios.
PS The point raised by Old Engineer is very interesting.

richard verney
March 22, 2013 9:12 am

Ira Glickstein, PhD says:
March 22, 2013 at 8:47 am
“…if CO2 had increased linearly (Scenario B) it would have increased 36 ppmv in 24 years and be up to 381 ppmv. Actual CO2 is 2012 was about 393, so, for CO2, the increase has been exponential which corresponds to Scenario C….”
/////////////////////////////////////////////////////////////////////////////////
Ira
Actual 2012 CO2 is 393 ppm. This is more than the linear case, which would have been 381 ppm, but is it properly classified as exponential?
You state: “… so, for CO2, the increase has been exponential which corresponds to Scenario C….” but do you mean it corresponds with Scenario A (not C), since Scenario A is the projection of an exponential growth in CO2 emissions, and you are asserting that CO2 has risen exponentially?
Of course, we are not entirely Scenario A because Scenario A assumes no volcanoes and there has been some (claimed) negative forcing due to aerosols.
We are obviously not Scenario C since that assumes no increase in CO2 post 2000. Nor are we Scenario B, which assumes some negative aerosol forcing but a linear rise in CO2 emissions, whereas we have seen more than a linear rise in CO2. Are we therefore not somewhere between the Scenario B and Scenario A projections? Should we not be comparing reality with something lying between Hansen’s Scenario B and A projections?
There is also the spanner in the works pertaining to aerosols from developing nations’ use of coal-powered generation. Personally, I consider this suspect. Is it not the position that aerosol emissions today are no greater than they were in the 70s/80s? Perhaps a plot of aerosol emissions from 1980 to date would be a useful addition to your article.
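One way to settle the linear-versus-exponential question Richard raises is to fit both forms to the annual Mauna Loa record and compare residuals. A minimal sketch, assuming hypothetical annual arrays for the inputs:

    import numpy as np

    def compare_fits(years, co2_ppm):
        t = years - years[0]
        lin = np.polyfit(t, co2_ppm, 1)                      # CO2 ~ a*t + b
        exp = np.polyfit(t, np.log(co2_ppm), 1)              # log(CO2) ~ r*t + log(c0)
        lin_sse = np.sum((co2_ppm - np.polyval(lin, t)) ** 2)
        exp_sse = np.sum((co2_ppm - np.exp(np.polyval(exp, t))) ** 2)
        return lin_sse, exp_sse                              # smaller sum of squares = better fit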

Bart
March 22, 2013 10:14 am

Well, I see nobody knows what to make of my evidence. It shocks me that something so obvious can be so hard for people to grasp. I guess doing this kind of stuff for a living for several decades skews your viewpoint to the point that things which are obvious and basic to you are just not in the realm of experience for most others.
Oh well, so it goes. But, I have made predictions, so you can all watch them unfold and realize I am right.

Reply to  Bart
March 22, 2013 1:41 pm

Bart:
It is logically troublesome that the entity which you call a “prediction” is the entity which I call a “projection” for to treat the two terms as synonyms makes of the word pair “prediction-projection” a polysemic term; a polysemic term is a term with several meanings. That this word pair is polysemic leads arguments about the methodology of global warming research to degenerate into examples of the equivocation fallacy. Thus, it is important for all of us to assign the same distinct meanings to the two terms.

Bart
March 22, 2013 1:54 pm

Terry Oldberg says:
March 22, 2013 at 1:41 pm
I think you addressed the wrong guy. I haven’t addressed anything to you in particular, and am off on a completely different topic.
But, I get your argument, a projection is not a prediction. It is a scenario of what would happen under specific conditions. However, when you determine which projection is consonant with observed conditions, then that projection can effectively be considered a prediction. So, in that regard, I see your objection as largely semantic.

Reply to  Bart
March 22, 2013 5:08 pm

Bart:
Though it has a semantic aspect, the problem that interests me is logical. In making arguments about the methodologies of their studies, climatologists habitually draw improper conclusions from equivocations, thus being guilty of the equivocation fallacy. A favored vehicle for this practice is to treat “prediction” and “projection” as synonyms. To do so is to create a polysemic term that switches meaning in the middle of their argument. When a conclusion is drawn from this argument, another example of an equivocation fallacy is born.
Opportunities for shenanigans of this kind can be eliminated by disambiguating terms in the language of methodological arguments thus eliminating the polysemic terms. When this is done, today’s climate models are revealed to have numerous pathological features. Among them is that the models convey no information to policy makers about the outcomes from their policy decisions. Thus, the models are useless for making policy. It is instances of the equivocation fallacy that make them seem useful.

Bart
March 22, 2013 1:55 pm

And, my prediction is not a projection.

Reply to  Bart
March 22, 2013 5:34 pm

Bart:
As I use the term “prediction,” it is a product of an inference to the unobserved outcome of an event in a statistical population; for the IPCC climate models there is no statistical population and thus there is no such thing as a prediction. As I use the term “projection,” it is a mathematical function that maps the time to the global average surface air temperature. Your “prediction” sounds like my “projection” and unlike my “prediction.”
The identity of the term that one assigns to the meaning which I assign to “predict,” and the identity of the term that one assigns to the meaning which I assign to “project,” are immaterial to the logic or illogic of the methodologies of climatological studies. Currently these methodologies are illogical but sound to many as though they are logical. This mistake is a consequence of the fallacy of drawing an improper conclusion from an equivocation, the so-called “equivocation fallacy.”

AndyG55
March 22, 2013 5:23 pm

Either way you look at it, the atmosphere allows the Earth’s surface to retain heat (and other) energy. Some calculations say as much as 30 °C.
The question is whether CO2 has any influence on this. Ira says about 1 °C per doubling of CO2 concentration. I suspect that the answer is more likely to be 0 °C, mainly because the atmosphere is a self-balancing mechanism controlled and regulated by pressure differences. CO2 makes no difference (well, extremely small) to the atmospheric pressure or to the rate of transfer of energy within the atmosphere. Any minor warming by so-called back-radiation from CO2 would be instantly balanced out by the atmosphere’s pressure-regulating mechanism. The atmosphere acts to COOL the surface if the surface is warmer than it should be, and would also act to cool any part of the atmosphere that gets warmer than the pressure gradient will allow.
Anyway, it will certainly be fun watching the warmists squirm over the next several years as the global temperature starts to drop. 🙂

mitigatedsceptic
Reply to  AndyG55
March 23, 2013 2:49 am

AndyG55, I agree – I find it very difficult to conceive that man, such a puny animal, occupying such a small space on this relatively gigantic planet with its vastly more gigantic atmosphere, could have any appreciable effect on it. Is this not a hangover from the kind of anthropocentrism (e.g. Man is in God’s image, Earth at the centre of the Universe, etc.) that reflects more on the arrogance of our species than it does on the state of affairs?
Yes, AndyG55, but while we enjoy the discomfort of the warmists, millions are paying the price of this cruel deception; affordable energy has been at the heart of civil refinement (let’s not call it ‘progress’) and this ghastly conspiracy is almost wholly responsible for depriving the burgeoning population of access to it. People are being driven out of work, poverty stalks the land, the lights are going to go out, the poor and elderly will starve and freeze to death, all as a result of the war against humanity being waged by self-appointed, self-important bullies in the name of the “Green Agenda”.
Man under threat, like many other animals, tends to produce more of the species. Given security from war, ample food, clothing and shelter, the reproductive rate falls and, if there really is an overpopulation problem, that problem will diminish without draconian eugenics and the like.
AGW, like WMD, was contrived by the ruthless, selfish and powerful to impose servility on the innocent and ignorant. While we gloat over the warmists’ humiliation, let us also grieve for all those paying the price of their arrogance.
What surprises me even more is that very few seem to express anger now that the deception has been exposed.

Bart
March 23, 2013 11:00 am

Terry Oldberg says:
March 22, 2013 at 5:34 pm
‘Your “prediction” sounds like my “projection” and unlike my “prediction.”’
Not so. It is based on the statistically observed behavior of what I argue is an, at least approximately, ergodic system – the time average of certain measures is, approximately, equal to the distribution expectation.
It is a prediction. It is not just one random possibility, it is the expected outcome.

Reply to  Bart
March 23, 2013 3:18 pm

Bart:
If your “prediction” is an Oldbergian prediction, underlying your model is an example of a statistical population. If there is one, kindly describe the independent events in this population. In particular, what is the starting time and ending time of each event, and what is the complete set of possible outcomes?

Bart
March 23, 2013 11:40 am

Terry Oldberg says:
March 22, 2013 at 5:34 pm
I agree with you entirely in this sense, however: What the IPCC has done is assumed a model based on a population of one, and projected that model forward to give a distribution of outcomes. The usefulness of the resulting distribution is not in predicting the future, but in validating the model, i.e., in showing whether it is likely false or indeterminate – it cannot show if it is true.
Unfortunately, the clearly desired implication is that they are predictions, and that the scarier scenarios can happen, when there really is a very tenuous, to the point of negligible, basis to expect that at all.
But, that differs from what I have done. I have looked at the time evolution of the data itself, and matched it to a statistical model which describes the observed dynamics over the timeline. Their process is deductive, based on premises and projected from a sample size of one. Mine is inductive – I start with the data, and infer the statistical model from it.
And, mine has matched the future observables, where theirs has failed, i.e., it is very likely false. For me, the turning point of the ~60 year cycle in globally averaged temperature arrived right on time in about 2005, whereas the first figure in Ira’s post here shows that they are skirting on the extreme lower boundary of the distribution of their projections.
In this reality, systems tend to behave in simple ways. Complex systems tend to regress to a simple systematic mean. Thus, for example, the ungodly complications of quantum theory regress to Newton’s simple F = ma in the large, as demonstrated by Ehrenfest’s Theorem. Complex nonlinear systems tend to behave as simple linear ones near a particular equilibrium. Such regression has been observed ubiquitously. Our entire techno-industrial society depends upon it.
Those who stay locked in ivory towers and have little contact with practical reality tend to get wrapped around the axle, and overwhelmed by complexity. They just cannot conceive that the system could really evolve so simply as the rate of change of atmospheric CO2 being proportional to temperature anomaly, or the evolution of temperature being simply a trend plus a simple cyclic phenomenon. Yet, that is precisely what the data confirm. And, humans clearly have little impact on either.

Reply to  Ira Glickstein, PhD
March 23, 2013 3:09 pm

Ira Glickstein:
I specialize in answering the question of whether methodological arguments are logical. Before examining the methodological arguments of the global warming climatologists in light of logic, it is necessary to rid these arguments of instances of the equivocation fallacy. This can be accomplished through disambiguation of the polysemic terms in the language of these arguments. A polysemic term is a term with more than one meaning.
Among the polysemic terms is the word pair prediction-projection in the circumstance that the two words in this word pair are treated as synonyms. In their study of the methodology of global warming climatology, Green and Armstrong found that most IPCC-affiliated climatologists treated the two words as synonyms at the time at which AR4 was being written.
The polysemic term prediction-projection can be disambiguated by assignment of distinct meanings to “prediction” and “projection.” This raises the issue of what these meanings shall be.
In addressing this task ( http://judithcurry.com/2011/02/15/the-principles-of-reasoning-part-iii-logic-and-climatology/ ), I formed the hypothesis that climatologists had acquired the two words from the meteorological literature. In particular, they had acquired the term “projection” from the literature of ensemble forecasting. In ensemble forecasting, a “projection” is a response function that maps the time to the values of a selected independent variable. Meteorologists seemed to adopt the definition of “prediction” that was standard in mathematical statistics. Under this definition, a prediction was an unconditional predictive inference.
As all of the evidence that I was able to acquire was consistent with this hypothesis, I went with it. By examination of uses of the two words in the literature of meteorology, I formed an impression of what meteorologists meant by each of the words. These are the meanings that I assign to the two words in this thread. While the IPCC claims there to be a circumstance in which a projection becomes a prediction, this is not mathematically possible for a “prediction” is an extrapolation to the outcome of a specified event in a statistical population but for global warming climatology there is no such population.

March 23, 2013 4:43 pm

Bart:
In a search lasting 3+ years, I’ve been unable to find so much as a single event in the statistical population underlying the IPCC climate models. If you are aware of one, please point me in the right direction.
Instead of one or more events, I find multiple examples of the equivocation fallacy that deceive many of us into thinking that: a) predictions have been made when only projections have been made b) the models convey information to policy makers on CO2 emissions when the models convey no such information c) global temperatures are controllable by man when they are uncontrollable d) the models have been validated when they have only been evaluated e) the scientific method has been followed in studies of global warming when it has not been followed.

Phil.
March 23, 2013 4:54 pm

Ira Glickstein, PhD says:
March 23, 2013 at 12:01 pm
ALL of Hansen’s 1988 predictions were very wrong, including C which, while close to the actual temperature anomalies, is not matched with any drastic societal action nor any change in the trend of trace gas levels, which was the whole point of C.

The ‘drastic societal action’ was the Montreal Protocol, as for the change in the trend of trace gas levels, see below.
Ira, I suggest you read this post by McIntyre, it will show the observed reductions in CFCs, CH4 and N2O actually fell below Hansen’s Scenario C:
http://climateaudit.org/2008/01/17/hansen-scenarios-a-and-b/

Werner Brozek
March 23, 2013 9:47 pm

Phil. says:
March 23, 2013 at 4:54 pm
Ira, I suggest you read this post by McIntyre, it will show the observed reductions in CFCs, CH4 and N2O actually fell below Hansen’s Scenario C
So with the very low rise now, it appears as if CO2 was never a major player all along so we can ignore CO2. Is that correct?

March 24, 2013 8:38 am

Werner Brozek:
Whether or not CO2 is a major factor cannot be determined until climatologists supply a missing ingredient for doing research that is “scientific.” This ingredient is the statistical population underlying their models.
Suppose this population is finally described and the duration of an event in this population is 30 years, with each event having one of two possible outcomes. One is that the spatially and temporally averaged global surface air temperature exceeds the long-run median; the other is that it does not. In this case, the recent 16-year period in which the warming has oscillated about zero yields no observed events. Thus, it provides us with no information about the outcomes of the events of the future.

Bart
March 24, 2013 11:08 am

Terry Oldberg says:
March 23, 2013 at 3:18 pm
“If there is one, kindly describe the independent events in this population.”
They are the events which drive the system forward in time. If a whitening filter can be devised which effectively yields an estimate of that population of inputs as stationary broadband noise, then the inverse of that filter provides an effective model for prediction. It is not, of course, guaranteed, but the range of systems for which such an approach has been successfully applied is vast.
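A minimal sketch of the whitening-filter idea, under the assumption that an AR(p) model is an adequate whitener (the series, order, and horizon are placeholders, not Bart’s actual model): fit the coefficients by least squares, then run the inverse (synthesis) filter forward with zero future input to get the expected evolution.

    import numpy as np

    def fit_ar(x, p):
        # least-squares AR(p): x[n] ~ sum_k a[k] * x[n-1-k]; the residual is the whitened input
        X = np.column_stack([x[p - 1 - k:len(x) - 1 - k] for k in range(p)])
        a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
        return a

    def predict_ar(x, a, steps):
        # drive the inverse filter forward assuming zero (mean) future innovations
        hist = list(x[-len(a):][::-1])        # most recent sample first
        out = []
        for _ in range(steps):
            nxt = float(np.dot(a, hist))
            out.append(nxt)
            hist = [nxt] + hist[:-1]
        return np.array(out)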

Reply to  Bart
March 24, 2013 12:07 pm

Bart:
Thanks for taking the time to reply. An event has a starting time and stopping time and a set of mutually exclusive collectively exhaustive possible outcomes. I’d like to know the starting time and stopping time for each statistically independent event in your population as well as the set of possible outcomes.

Bart
March 24, 2013 11:18 am

Terry Oldberg says:
March 23, 2013 at 3:18 pm
PS: I believe your criticisms are right on. What is being done in climate science right now is a scattershot approach which has very little likelihood of success. I would advocate a more phenomenological approach, based on how the data has actually been observed to behave, rather than trying to make the observations conform to an underlying theory which might well be (indeed, has been found to be IMHO) false. As Sherlock Holmes was fond of saying:

“It is a capital mistake to theorize before one has data. Insensibly, one begins to twist data to suit theories, instead of theories to suit facts.”

Reply to  Bart
March 24, 2013 12:28 pm

Bart:
Thanks for the support. The aspect of global warming research that strikes me as most interesting is that the methodology is unscientific, illogical and unsuitable for the intended purpose but that this state of affairs has successfully been covered up through repeated uses of a deceptive argument. Two hundred billion US$ have been spent on the misguided research and deluded governments are gearing up to spend one hundred trillion US$ or so on implementing the results. An advocate for impoverishing people in this way has received the Nobel Peace Prize. The President of the U.S. and United Nations are on board. This is a great story!

Bart
March 24, 2013 2:00 pm

Terry Oldberg says:
March 24, 2013 at 12:07 pm
“I’d like to know the starting time and stopping time for each statistically independent event in your population as well as the set of possible outcomes.”
I will see if a brief synopsis of my viewpoint will help.
We start with a description of the system from a set of linear time invariant stochastic differential equations. There are many powerful tools in existence for the identification of such systems. The best starting point is probably estimating the power spectral density (PSD).
For example, I did a PSD estimate of Sun Spot Number here. From this, I was able to see that the spectrum was dominated by two spectral peaks which modulated against one another to produce four peaks in the spectrum. So, the underlying system has the character of the Hypothetical Resonance PSD I show in the middle. When squared, the signal has the theoretical spectrum shown as the green line in the bottom plot, and that matches up pretty well with the spectrum of the squared process from actual data in blue. Although there are other small peaks evident in the data, it is apparent that the two processes identified above dominate. The others can be added at some point to achieve a better model, but these two main ones should be enough to provide a useful first-cut approximation.
I do not know what these processes are but, from a phenomenological viewpoint, I do not need to. Once identified, their source can be tracked down independently, but we can still use the identified structure to predict future evolution.
A model for the dominant processes is shown here. The input PSD of the driving processes is assumed to be wideband and flat, and this idealization works fine if the true stochastic input is simply more or less uniform in the frequency range of each of the two spectral peaks. I show here and here the outputs of the model when simulated, and they are seen to have similar character to the actual observed SSNs.
Using the square of the SSN provides a smooth observable which can be incorporated into an Extended Kalman Filter. Using backwards and forwards propagation and update of the filter, the states can be optimally smoothed and primed at the last value for prediction. Propagating the differential equations forward then provides an approximate expected value, which is therefore a prediction of the future based on all past observables, and the Kalman Filter formalism provides RMS bounds on the error in the expected value. I haven’t gone forward with the project of doing so because it is a big job, and I have many other competing interests, some by which I earn a living, but the procedure is straightforward.
Note that this is an entirely phenomenological approach, I do not need to know the actual underlying dynamics, only a reasonable equivalent representation. The observed structure of the dynamics can be expected to continue as they have in the past. So, we have a statistically non-trivial set of observations with which to verify the model.
Now, why are we justified in taking such an approach, and why should we expect it to bear fruit? First and foremost, because it has innumerable times in the past. It is not an overstatement to say that our entire tech-industrial society has been built upon this very foundation. But, it is also reasonable from a first principles point of view.
Most processes, at the most basic level, can be represented to high fidelity as the outcome of a randomly driven set of partial differential equations (PDEs). PDEs can generally be decomposed onto a functional basis, thence expanded into a multi-dimensional set of first order ordinary differential equations. Further simplifications can be achieved by focusing on those states which dictate the long term behavior, the Langevin equations. In the neighborhood of a particular equilibrium state, these equations take the form of a linear time invariant (LTI) system. And, so, we can expect that starting from determination of an LTI system which describes the observations, we can ultimately arrive at a useful predictive model.
That model then provides a standard against which to validate theories on the deeper underlying system dynamics. So, it is essentially the reverse of the procedure the climate establishment is using. As I said before, their approach is essentially deductive – they start developing theory unconstrained by the observables, then they try to force the observations to match the theory. That, as Sherlock would say, is bass-ackwards. You should start with the observations, which then constrain the form of your theoretical models, in an inductive progression.
The deductive approach is somewhat like trying to get a winning Lotto ticket by randomly choosing a number, and seeing if it matches the winning number. The inductive approach is more like observing that the winning numbers in this particular game are always prime and never repeat, and so you dramatically reduce the number of possible winning numbers from which you can choose and, eventually, since the numbers are composed of a finite number of digits, you can zero in on the winning number.
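For the first step Bart describes above (estimate the PSD, then read off the dominant peaks an LTI model must reproduce), a minimal sketch follows; the monthly sampling, segment length, and number of peaks are assumptions for illustration, not his actual analysis.

    import numpy as np
    from scipy.signal import welch, find_peaks

    def dominant_periods(x, samples_per_year=12, n_peaks=2):
        # Welch PSD estimate, then return the periods (in years) of the strongest peaks
        freqs, psd = welch(x, fs=samples_per_year, nperseg=min(len(x), 512))
        peaks, _ = find_peaks(psd)
        strongest = peaks[np.argsort(psd[peaks])[::-1][:n_peaks]]
        return 1.0 / freqs[strongest]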

Bart
March 24, 2013 2:06 pm

Bart says:
March 24, 2013 at 2:00 pm
“The deductive approach is somewhat like…”
Or, maybe, it is like trying to guess the solution of an equation by generating random numbers until you get an exact match of the equation, versus using a Newton algorithm to converge quadratically on the answer.

Phil.
March 24, 2013 2:41 pm

Werner Brozek says:
March 23, 2013 at 9:47 pm
Phil. says:
March 23, 2013 at 4:54 pm
Ira, I suggest you read this post by McIntyre, it will show the observed reductions in CFCs, CH4 and N2O actually fell below Hansen’s Scenario C
So with the very low rise now, it appears as if CO2 was never a major player all along so we can ignore CO2. Is that correct?

No, the point of Hansen 88 was that GHGs would have a major effect on climate: most of the short-term change would be due to gases other than CO2, while long term the effect of CO2 would be significant. What the actual measurements show is that Hansen’s Scenario C was most representative of reality except for CO2, which was between A and B, so you’d expect the overall result to lie between B and C. Hopefully the ongoing melting of the Arctic sea-ice won’t lead to a recurrence of the growth in CH4, but current measurements in the Arctic suggest otherwise.

March 24, 2013 3:58 pm

Phil.,
What would it take for you to admit that AGW is either non-existent, or so minuscule that it isn’t worth worrying about?
Numbers, please: how many more years of little or no global warming would it take? How much more human-emitted CO2 without runaway global warming would it take?
Or, is your mind made up to the point where nothing can possibly convince you that your “carbon” conjecture was/is wrong? <— [Like Hansen's true belief.]
Planet Earth is not agreeing with you, Phil. Who should we believe, you and Hansen? Or the planet?

DirkH
March 24, 2013 4:07 pm

Phil. says:
March 24, 2013 at 2:41 pm
“result to lie between B and C. Hopefully the ongoing melting of the Arctic sea-ice won’t lead to a recurrence of the growth in CH4 but current measurements in the Arctic suggest otherwise.”
The greenhouse effect of CH4 competes with H2O and is only measurable in dry winter weather. Are you SURE it’s a problem when it can’t even be measured in warm moist weather?

Reply to  Ira Glickstein, PhD
March 24, 2013 10:07 pm

Ira Glickstein:
Thanks for taking the time to read my article. A tricky aspect of the type of model that makes projections is that it conveys no information to policy makers about the outcomes from their policy decisions. Thus, though policy makers think the opposite, the availability of this type of model does not make global temperatures controllable through regulation of CO2 emissions.

Bart
March 25, 2013 3:03 am

Ira Glickstein, PhD says:
March 24, 2013 at 6:34 pm
“Scenario A got the exponential increase in CO2 pretty much correct…”
Good comment in general, but I must take issue with this. The increase in atmospheric CO2 concentration is not exponential. It is at best quadratic.
The derivative of an exponential is an exponential. As is readily apparent in the graph I keep bringing up, the derivative is at best linear, and in fact is decelerating along with the globally averaged temperature in the last decade or so.
[Bart, you are correct. According to Wikipedia, quadratic growth is a special case of a convex function and should not be confused with exponential growth, a better-known growth function. “Convex growth” means increasing at an increasing rate (the second derivative or second difference is positive); quadratic growth means increasing at a constantly increasing rate (the second derivative is positive and constant); and exponential growth means increasing at a rate proportional to the current value (the first derivative is proportional to the current, positive, value, and hence so is the second derivative). That is, quadratic and exponential growth are both different special cases of convex growth. Thank you for clearing this up. Ira]
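For reference, the distinction in derivative form that Bart and Ira are drawing, written out:

    \text{Exponential: } f(t) = c\,e^{rt} \;\Rightarrow\; f'(t) = r\,f(t) \quad \text{(the derivative is again exponential)}
    \text{Quadratic: } f(t) = a t^{2} + b t + c \;\Rightarrow\; f'(t) = 2at + b,\;\; f''(t) = 2a \quad \text{(the derivative is linear, the second derivative constant)}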

March 25, 2013 8:26 am

Bart:
In building a model, the deductive approach has a shortcoming: it assumes that the information needed for deductive conclusions is not missing, when in fact information is missing. The inductive approach introduces a problem not faced under the deductive approach; this problem is called “the problem of induction.” In each circumstance in which an inference is made by the model, there are several candidate inferences that could be made. How can the one candidate that is correct be discriminated from the many that are incorrect?
This problem was solved circa 1963 by the theoretical physicist Ronald Christensen. Logic could be extended from its confines in deductive logic to inductive logic by replacing the rule that every proposition has a truth value with the rule that every proposition has a probability of being true.
In the resulting “probabilistic” logic, every proposition had a unique measure. The measure of a proposition was its entropy. It followed from the existence and uniqueness of the measure of an inference that the problem of induction could be solved by a kind of optimization. Depending upon circumstances, the correct inference was the one that minimized the entropy or that maximized the entropy under constraints expressing the available information. Christensen called this rule “entropy minimax.” Logic held principles of reasoning. The principles of reasoning were entropy minimax.
In concert with colleagues who included me, Christensen tested this hypothesis in a number of real world circumstances. It held up to testing. Among the scientific theories that were produced by entropy minimax were thermodynamics, the modern theory of communications and the first successful long range weather forecasting models.
However, though Christensen published his work, few scientists or academics read these publications. (Among the few who read some of them were a couple of professors from Ira Glickstein’s systems science department at Binghamton University. One was the department chairman, George Klir)
In their ignorance, scientists continued to build models as they had done for centuries. This was by discriminating the one correct inference from among the many candidates using the intuitive rules of thumb that I call “heuristics.” However, in each instance in which a particular heuristic selected a particular inference as the one correct inference, a different heuristic selected a different inference as the one correct inference. In this way, the method of heuristics violated Aristotle’s law of contradiction and thus was illogical.
Almost all of the models that are used today in practical decision making were built by the method of heuristics. Kalman filters are built by it. Use of this method introduces variability in the quality of the models that are generated in which the quality depends upon the model builder’s luck in the selection of heuristics.
If you’d like to follow up, I recommend that you start with one of two tutorials. Judith Curry published one of them in her blog as a three part series under the title “The Principles of Reasoning” in 2011. The other, also written by me, is at http://www.knowledgetothemax.com . A bibliography is available at the same URL.
Entropy minimax builds the best possible model from given informational resources. With the availability of this idea, generalizations can be reached about the possibilities from climatological research. One of these generalizations is that informational resources are insufficient for the construction of a model that predicts global temperatures over a horizon long enough (about 30 years) for this model to be usable for making policy on CO2 emissions. In ten thousand years, we may have enough observed events for this to happen. In the interim, entropy minimax will dictate that the global temperature varies independent of the CO2 level.

Phil.
March 25, 2013 5:05 pm

D.B. Stealey says:
March 24, 2013 at 3:58 pm
Phil.,
What would it take for you to admit that AGW is either non-existent, or so minuscule that it isn’t worth worrying about?

Scientific evidence.
Numbers, please: how many more years of little or no global warming would it take? How much more human-emitted CO2 without runaway global warming would it take?
Since we’re not in a period of no warming that’s not an issue, I have never expected runaway global warming in any case.
Or, is your mind made up to the point where nothing can possibly convince you that your “carbon” conjecture was/is wrong? <— [Like Hansen's true belief.]
Planet Earth is not agreeing with you, Phil. Who should we believe, you and Hansen? Or the planet?

Believe the planet, in a couple of years time when the arctic is devoid of sea ice perhaps you’ll start to see the light?
[Phil: You have never expected RUNAWAY global warming -GOOD TO HEAR- but, but you expect the arctic to be devoid of sea ice in a COUPLE YEARS TIME ? Forgive me, but that seems inconsistent. Ira]

Reply to  Phil.
March 25, 2013 6:32 pm

Phil and D.B. Stealey :
Arguments being made in this thread have become disjointed. In another part of it, I believe I have convinced participants that we have no way of knowing the effect of raising the CO2 concentration upon global temperatures because, in organizing their study of global warming, climatologists such as James Hansen have blown their assignment. We might as well have shoveled the 200 billion US$ spent by Hansen and his colleagues down a rat hole.

Phil.
March 25, 2013 6:02 pm

Ira Glickstein, PhD says:
March 24, 2013 at 6:34 pm
Phil. March 23, 2013 at 4:54 pm:
Thanks for the link to Climate Audit, but it dates from 2008. Do you have anything more recent for CFCs, CH4 and N2O?

No but feel free to get the raw data and plot it, after all it’s you who wrote the post entitled “How well did Hansen (1988) do?”
http://cdiac.ornl.gov/oceans/new_atmCFC.html
http://www.esrl.noaa.gov/gmd/dv/iadv/graph.php?code=MLO&program=ccgg&type=ts
I respect McIntyre and accept his conclusion that Hansen 1988 way, way, way over-estimated the growth of “Greenhouse gases” other than CO2. Therefore, actual CFCs, CH4 and N2O levels fell considerably below even Scenario C assumptions, and way, way, way below the Scenario A and B assumptions.
That wasn’t his conclusion!
You continue to give me the impression that you haven’t read the paper we’re discussing, you certainly don’t understand it!
So, let us take score:
1) A close look at Fig 2 from Hansen 1988 indicates that they assumed the warming effect of CO2, alone, was about equal to the combined warming effect of CFCs, CH4 and N2O. So, if they over-estimated one (CO2) and underestimated the other (CFCs, CH4 and N2O) by about the same amount, we should expect that the temperature anomaly would flatline, which, over the past decade and a half, it did. Score one for dumb luck!
2) Scenario A got the exponential increase in CO2 pretty much correct, but they way, way, way over-estimated the increase in CFCs, CH4 and N2O, which have pretty much flatlined. That explains why Scenario A is so much higher than actual temperature anomalies.

The point of Scenario A was to illustrate what would happen if existing trends in emissions continued, thus it was referred to as ‘business as usual’. “Scenario A assumes that growth rates of trace gas emissions typical of the 1970s and 1980s will continue indefinitely”. They didn’t over-estimate the increase, by definition that was what it was!
“Scenario A…., must inevitably be on the high side of reality in view of resource constraints and environmental concerns”
“Global warming…..occurs in all three scenarios…….depending on trace gas growth”
3) Scenario B assumed a linear increase in CO2, which turned out to be pretty much correct, since the actual exponential increase is very mildly upward. But they way, way, way over-estimated the increase in CFCs, CH4 and N2O, which have pretty much flatlined. That explains why Scenario B is also much higher than actual temperature anomalies.
Scenario B assumed growth so as to maintain the rate of growth in greenhouse forcing stayed at the current rate. Hansen judged that “Scenario B is perhaps the most plausible of the three cases.”
4) Scenario C assumed a linear increase in CO2 until the year 2000, where they assumed it would flatline. That, to this day, has turned out to be wrong, because CO2 has continued its mild exponential rise. They slightly over-estimated the increase in CFCs, CH4 and N2O, which have pretty much flatlined. That explains why Scenario C is only 31% higher than actual temperature anomalies.
Scenario C was chosen to be the result of “a more drastic curtailment of emissions than has generally been imagined.”
The scenarios were chosen to “yield sensitivity experiments for a broad range of future greenhouse forcings”. The goal being to bracket possible future climates, in this they were successful as you admit. In fact the then accepted value for sensitivity was somewhat high, if the currently accepted value is used then the result falls between B & C as one would expect from the actual emissions.
Bottom line score; 1 for 4, and that 1 was dumb luck. Not too good. Not even for Scenario C, the closest to actual temperature anomalies.
Pretty much bang on for a projection 25 years into the future!
Yet, they say their model (or “simulation” as RealClimate recently called it), in their words “while not perfect … has shown skill”.
Indeed it has, try reading the paper and understanding it.
RealClimate is comfortable with that. Are you? If so, they have one foot in a bucket of scalding water, and the other foot in a bucket of ice, but, on the average, they are comfortable :^)
Yes because I understand the paper, and have read it several times, clearly you haven’t.

Phil.
March 26, 2013 9:17 am

Phil. says:
March 25, 2013 at 5:05 pm
Since we’re not in a period of no warming that’s not an issue, I have never expected runaway global warming in any case.
“Planet Earth is not agreeing with you, Phil. Who should we believe, you and Hansen? Or the planet?”
Believe the planet, in a couple of years time when the arctic is devoid of sea ice perhaps you’ll start to see the light?
[Phil: You have never expected RUNAWAY global warming -GOOD TO HEAR- but, but you expect the arctic to be devoid of sea ice in a COUPLE YEARS TIME ? Forgive me, but that seems inconsistent. Ira]

‘Runaway’ I interpret as heading towards a Venus like state, which I don’t believe will happen.
An increase of a few degrees however is another matter and is sufficient to cause problems for us. Melting of the Arctic sea ice would be one effect of such a change, I would be surprised if that isn’t substantially complete in a couple of years. Here’s a graphic of the progress so far:
http://iwantsomeproof.com/extimg/siv_september_average_polar_graph.png
Given the fragmentation of the present ice, which is mostly FYI (first-year ice), I wouldn’t be surprised to see another record low this fall.

Phil.
March 26, 2013 9:21 am

DirkH says:
March 24, 2013 at 4:07 pm
Phil. says:
March 24, 2013 at 2:41 pm
“result to lie between B and C. Hopefully the ongoing melting of the Arctic sea-ice won’t lead to a recurrence of the growth in CH4 but current measurements in the Arctic suggest otherwise.”
The greenhouse effect of CH4 competes with H2O and is only measurable in dry winter weather. Are you SURE it’s a problem when it can’t even be measured in warm moist weather?

And yet it’s routinely measured by satellites, how do you suppose that is?
At the surface it’s easily measured with an FTIR spectrometer, so I don’t think that your premise holds up.

Phil.
March 28, 2013 4:19 am

Please Ira, showing a low-resolution cartoon of the spectrum shows nothing ‘clearly’. Also, it’s for the surface; the situation is much different at altitude!

March 28, 2013 8:28 am

To argue over the magnitude of the climate sensitivity ( aka the equilibrium climate sensitivity (TECS) ) is scientifically nonsensical, for the equilibrium temperature is not an observable.

Phil.
March 28, 2013 9:42 am

Ira Glickstein, PhD says:
March 27, 2013 at 11:37 am
However, as shown clearly here, all the CH4 peaks are coincident with portions of the spectrum where H2O is saturated or nearly so. Therefore, as DirkH is saying, given the prevalence of H2O, the contribution of rising CH4 levels (if they were actually rising) to actual warming of the Earth surface would be minimal, except in dry winter weather.

Here’s a portion of the actual spectra rather than a cartoon.
http://i302.photobucket.com/albums/nn107/Sprintstar400/WaterCH4.gif
Bear in mind that at an altitude of the mid-troposphere the temperature is about 250 K, so the CH4/H2O ratio has increased quite a bit by then; we’re not just talking about dry winter weather!
Between 1300 and 1600 cm^-1 CH4 has ~36,000 absorption lines to ~4,000 water lines. Between 1200 and 1300 it’s even more dramatic, ~13,000 to ~1,000 (and this is around the peak of the CH4 spectrum!)

Phil.
March 29, 2013 3:37 am

Ira, please look at the legend of the graph: it’s transmittance, not absorbance! In the region shown CH4 has many strong lines whereas water has a few weak ones.
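A minimal numerical illustration of the transmittance-versus-absorbance point, using the standard Beer-Lambert conversion A = -log10(T); the two dip values are illustrative, roughly of the sizes discussed later in this thread.

    import numpy as np

    def absorbance(transmittance):
        # transmittance near 1.0 means weak absorption; near 0.0 means strong absorption
        return -np.log10(np.asarray(transmittance, dtype=float))

    # a water line dipping to T = 0.88 versus a CH4 line dipping to T = 0.05
    print(absorbance([0.88, 0.05]))   # approximately [0.056, 1.301]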

Phil.
March 29, 2013 9:54 am

Ira Glickstein, PhD says:
March 28, 2013 at 8:58 pm
Phil. March 28, 2013 at 9:42:
THANKS for the link to http://s302.photobucket.com/user/Sprintstar400/media/WaterCH4.gif.htm which PROVES MY POINT!

I’m afraid not!
Please notice the SCALE on the left side of the graph. Both the H2O and CH4 graphs are the same physical height, but the CH4 graph is multiplied eight times as much in height, and the bottom 7/8ths are cut off! The H2O graph goes from 0.88 to 1.00 while the CH4 graph goes all the way from 0.00 to 1.00! This is all quite misleading.
As indicated briefly above you have misread the spectrum, as indicated on the legend it is a transmissivity spectrum so the H2O spectrum shows very little absorption in that region compared with CH4!
Furthermore, please notice that the graphs you linked to extend only from Wavenumber 1200 (8.3 microns) to 1300 (7.7 microns). The LWIR spectrum of interest extends from below 7 microns up to 20 microns or more! So, those graphs are showing only a tiny fraction of the LWIR spectrum responsible for the Atmospheric “Greenhouse” Effect. The strong portion of the H2O absorption spectrum extends from 7 to 9 microns, and from 12 to 20 microns and beyond.
I presented this in response to your statement which I highlighted: “all the CH4 peaks are coincident with portions of the spectrum where H2O is saturated or nearly so.”
As I showed this is not true for the main CH4 peak between 1200 and 1300 cm^-1, what water does elsewhere is not relevant. That’s the problem of working with a low resolution cartoon!
The H2O spectrum while extensive is very sparse in terms of lines as referred to above, in the upper atmosphere there are many gaps where other gases can be effective.
PLEASE NOTICE:
1) H20: The H2O graph extends from 0.88 to 1.00, which is NEAR-SATURATION (88%) to COMPLETE SATURATION (100%). Also notice that there is 100% saturation for well over 95% of the spectrum. Yes, there are several narrow spectral lines that are not 100% saturated, but, the first three of these go down only to 96% saturation, the next one goes down to 90% saturation,and so on and the worst one goes down only to 88% saturation.
2) CH4: The CH4 graph extends all the way down to 0.00, which gives a totally misleading impression. If the H2O graphic was plotted to the same scale, it would only be about 1/8th the height! Also notice that the CH4 graph has many, many spectral lines that are less than 20% saturated and some that approach ZERO saturation.

Completely wrong because of the misreading of the spectrum.
I don’t have the base data for these graphs; however, I would love to see a graph and calculation of what percentage of absorption of LWIR a doubling of CH4 would contribute to the total absorption of H2O plus CO2. My guess is that it would be well under 1%. And, as your link to Climate Audit indicated, CH4 is not increasing very much, if at all.
Over its lifetime in the atmosphere CH4 averages over 20X greater effect per mole than CO2, so for a doubling from the current ~2 ppm the effect would be equivalent to an additional 40 ppm of CO2; also, it has a greater than logarithmic sensitivity. My reference to CH4 was: “Hopefully the ongoing melting of the Arctic sea-ice won’t lead to a recurrence of the growth in CH4 but current measurements in the Arctic suggest otherwise.” So although the flattening anticipated by Hansen’s Scenario C has occurred, present measurements in the Arctic are indicating recent releases, probably due to local warming.
I appreciate your contributions to this topic thread and hope you answer the above.
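The arithmetic behind Phil’s 40 ppm figure, using his stated ~20X per-mole factor:

    \Delta \mathrm{CO_2\text{-equivalent}} \approx 20 \times \Delta \mathrm{CH_4} \approx 20 \times 2\ \mathrm{ppm} = 40\ \mathrm{ppm}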

Bart
March 29, 2013 11:55 pm

Terry Oldberg says:
March 25, 2013 at 8:26 am
I had to travel, and was unable to respond. I am familiar with the MEM in signal processing. The methodology I advocated could be couched in those terms, involving as it does identification of the system dynamics, and following through with optimal prediction of its evolution. MEM identification methods can be employed, but the system at hand has high SNR, and it would really just be gilding the lily.

Reply to  Bart
March 30, 2013 8:30 am

Bart:
A generalization can be drawn from experience in building information-theoretically optimal models. This is that 150 observed independent events is about the minimum for construction of a predictive model. Going back to the beginning of the various global temperature time series in the year 1850, there are between 5 and 6 events of 30 years’ duration each; 30 years is the duration that is canonical in climatology and is the approximate period in which power producing facilities depreciate to nil. Five or six events is too few for construction of an information-theoretically optimal model by a factor of roughly 25 to 30. Thus, I can’t agree with you when you claim it to be easy to extract the signal from the noise. Under current circumstances, it would be impossible to extract such a signal at all.
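The event-count arithmetic behind this, taking the record as running from 1850 to roughly 2013:

    \frac{2013 - 1850}{30\ \text{yr per event}} \approx 5.4\ \text{events}, \qquad \frac{150\ \text{events needed}}{5\ \text{to}\ 6\ \text{available}} \approx 25\ \text{to}\ 30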
By the way, as a signal would come to us from the future, it would have to travel at a superluminal speed. It follows from the impossibility of such a speed under Einsteinian relativity that a signal power greater than nil is not possible. Thus, the signal-to-noise ratio is undefined. In view of this state of affairs, I prefer to replace the term “signal” by the term “message.” The message that one might hypothetically receive would be a sequence of outcomes of events in the underlying statistical population. As global warming climatology references no such population, the existence of such a message is not currently possible.

Bart
March 30, 2013 2:00 pm

Terry Oldberg says:
March 30, 2013 at 8:30 am
“It follows from the impossibility of such a speed under Einsteinian relativity that a signal power greater than nil is not possible”
Not according to absorber theory.
In any case, I don’t really want to be arguing over how many angels can dance on the head of a pin, or whether our universe is a grain of detritus lodged in the fingernail of a giant. Such conversations are best conducted when one is an undergraduate, in a relaxed atmosphere among friends sharing mind altering substances.
The method I am advocating is a converging series of inferences based upon actual measurements, rather than a scattershot series of random guesses to which one then attempts to reconcile the data. The latter is the process in which the IPCC et al. are engaged, and it is horrible science. I think we agree on that, at least, so let us leave it there until we meet again. Cheers.

Reply to  Bart
March 30, 2013 5:09 pm

Bart:
“Despite theoretical arguments against the existence of faster-than-light particles, experiments have been conducted to search for them. No compelling evidence for their existence has been found.” ( http://en.wikipedia.org/wiki/Tachyon )

Phil.
March 31, 2013 5:55 am

A brief reply, Ira; I’ll respond in more detail later. Firstly, you’re mistaken about the provenance of the spectrum you linked to: it’s actually a synthetic spectrum, not a measured one. It’s actually made from the same source as mine, spectracalc! It does not say which location in the atmosphere it is calculated for, so it’s not clear how much H2O is assumed. The spectrum is smoothed, which is why the structure is not apparent: strong lines might not show up, and the sparse nature of the H2O spectrum is lost. You are incorrect about the relationship of the CH4 and H2O spectra; look carefully, the CH4 is at the edge of the H2O spectrum, and this is even more important when you don’t look at a smoothed spectrum such as the one you presented.