How well did Hansen (1988) do?

Guest Post by Ira Glickstein.

The graphic from RealClimate asks “How well did Hansen et al (1988) do?” It compares actual temperature measurements through 2012 (GISTEMP and HadCRUT4) with Hansen’s 1988 Scenarios “A”, “B”, and “C”. The answer (see my annotations) is “Are you kidding?”

HANSEN’S SCENARIOS

The three scenarios and their predictions are defined by Hansen 1988 as follows:

“Scenario A assumes continued exponential trace gas growth, …” Hansen’s predicted temperature increase, from 1988 to 2012, is 0.9 °C, OVER FOUR TIMES HIGHER than the actual increase of 0.22 °C.

“Scenario B assumes a reduced linear growth of trace gases, …” Hansen’s predicted temperature increase, from 1988 to 2012, is 0.75 °C, OVER THREE TIMES HIGHER than the actual increase of 0.22 °C.

“Scenario C assumes a rapid curtailment of trace gas emissions such that the net climate forcing ceases to increase after the year 2000.” Hansen’s predicted temperature increase, from 1988 to 2012, is 0.29 °C, ONLY 31% HIGHER than the actual increase of 0.22 °C.

So, only Scenario C, which “assumes a rapid curtailment of trace gas emissions” comes close to the truth.
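The arithmetic behind those ratios is easy to check. Here is a quick Python sketch using only the figures quoted above (Hansen's scenario predictions and the 0.22 °C actual rise):

```python
# Compare Hansen's 1988 scenario predictions (1988-2012 warming, deg C)
# against the actual observed rise, using the numbers quoted in this post.
actual = 0.22  # observed 1988-2012 increase, deg C
scenarios = {"A": 0.90, "B": 0.75, "C": 0.29}

for name, predicted in scenarios.items():
    ratio = predicted / actual
    print(f"Scenario {name}: predicted {predicted:.2f} C, "
          f"{ratio:.1f}x the actual rise")
```

Scenario A comes out at just over 4x the actual rise, B at about 3.4x, and C at about 1.3x (i.e., roughly 31% high), matching the figures above.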

THERE HAS BEEN NO ACTUAL “CURTAILMENT OF TRACE GAS EMISSIONS”

As everyone knows, the Mauna Loa measurements of atmospheric CO2 prove that there has NOT BEEN ANY CURTAILMENT of trace gas emissions. Indeed, the rapid increase of CO2 continues unabated.

What does RealClimate make of this situation?

“… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change).  …  The conclusion is the same as in each of the past few years; the models are on the low side of some changes, and on the high side of others, but despite short-term ups and downs, global warming continues much as predicted.”

Move along, folks, nothing to see here, everything is OK, “global warming continues much as predicted.”

CONCLUSIONS

Hansen 1988 is the keystone of the entire CAGW Enterprise, the theory that Anthropogenic (human-caused) Global Warming will lead to a near-term Climate Catastrophe. RealClimate, the leading Warmist website, should be congratulated for publishing a graphic that so clearly debunks CAGW and calls into question all the Climate models put forth by the official Climate Team (the “hockey team”).

Hansen’s 1988 models are based on a Climate Sensitivity (predicted temperature increase given a doubling of CO2) of 4.2 °C. The actual CO2 increase since 1988 is somewhere between Hansen’s Scenario A (“continued exponential trace gas growth”) and Scenario B (“reduced linear growth of trace gases”), so, given that Scenarios A and B ran high by a factor of three or four, it would be reasonable to assume that Climate Sensitivity is closer to 1 °C than 4 °C.
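That back-of-envelope scaling can be made explicit. The sketch below simply divides Hansen’s assumed 4.2 °C sensitivity by the factor each scenario over-predicted the warming; this linear scaling is the post’s own rough reasoning, not a formal sensitivity estimate:

```python
# Rough implied sensitivity: scale Hansen's assumed 4.2 C per CO2 doubling
# by the factor Scenarios A and B over-predicted 1988-2012 warming.
# Linear scaling is a simplification, used here only to illustrate the argument.
hansen_sensitivity = 4.2  # deg C per CO2 doubling, Hansen 1988
actual = 0.22             # observed 1988-2012 rise, deg C
for name, predicted in {"A": 0.90, "B": 0.75}.items():
    implied = hansen_sensitivity / (predicted / actual)
    print(f"Scenario {name}: implied sensitivity ~{implied:.1f} C per doubling")
```

Both scenarios imply a sensitivity in the neighborhood of 1 °C per doubling, which is the basis for the "closer to 1 °C than 4 °C" conclusion.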

As for RealClimate’s conclusion that Hansen’s simulation “out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change)”, they are WRONG. Even a “naive” prediction of no change would have been closer to the truth (low by 0.22 °C) than Hansen’s Scenarios A (high by 0.68 °C) and B (high by 0.53 °C)!


212 thoughts on “How well did Hansen (1988) do?”

  1. I predicted the UK Wildcats would make a good showing in the Tournament this year… clearly my prediction “out-performed any reasonable naive hypothesis that people put forward” back in January.

    I love outcome based predictions! LOL!

  2. Thanks Ira, for presenting this info from Real Climate’s website, I never go there. It’s an unhealthy place to visit, what with blood pressure, and all.

    They convict themselves with efforts like this, but who could convince them of it?

  3. Now wait a minute there Anthony, let’s think this through for a minute.

    If RealClimate is arguing that Hansen’s predictions are accurate because we’re in Scenario C, doesn’t that mean that our (non-existent) mitigation efforts have been successful? We’re not emitting CO2 into the atmosphere anymore is basically what they’re saying; we haven’t been since 2000. Great! Let’s quit with the legislative efforts to destroy the fossil fuel industry, stop the wind and solar subsidies, dismantle the IPCC, declare victory and go home.

    Maybe? Somehow I have a nagging feeling that won’t work, although I can’t quite put my finger on why… /sarc

  4. Mark Bofill says:
    March 20, 2013 at 7:33 am
    ——
    Gah, beg pardon, I should have addressed Ira, not Anthony.

  5. Comparing forecasts to the real world is patently unfair. If you were to compare the forecasts to other forecasts, you would find them spot on, indicating a consensus.
    Besides, you didn’t correct the measured temperatures upward to compensate for all the energy used in our plethora of severe storms. We don’t need a weatherman to know which way the wind blows… (apology & /sarc off)

  6. Realclimate and others ignore the hiatus in global temperature rise; it’s as if they stick fingers in their ears and go BLAH, BLAH, BLAH whenever the data is put in front of them… and these people have the nerve to call themselves scientists.

  7. CO2 emissions will continue because nature contributes MORE than we do by 33 times. Climate will continue its natural change pattern despite this because climate change is solar driven NOT CO2 driven.
    There are NO peer reviewed papers that show that CO2 drives climate. There are model runs that do but these are based on CO2 driving climate so constitute zero proof and show a circular argument when introduced. There are plenty of peer reviewed papers showing that CO2 does NOT drive either temperature or climate.
    Get real.

  8. Interesting that the nearest match is the “no forcing increase past 2000” scenario, which actually seems to match reality since 2000 fairly well, but had already run “high” by the millennium to create the over-estimate today.

    Since CO2 hasn’t stopped increasing, surely that suggests that CO2 is not the relevant forcing and (assuming the simple radiative forcing model is basically sound in the first place) we should be looking for some other variable that was increasing up to 2000 and has flattened / fallen since?

    CO2? This is not the forcing you’re looking for.

  9. I wonder: if temperature is driving CO2 and not CO2 driving the temperature, then at what temperature drop does the amount of CO2 go down?

    Did anyone ever publish any data about that ?

    [Great question! According to the Ice Core data over several hundred thousand years:
    1) Temperature drops and stays low for hundreds of years until CO2 begins to drop. Then there is a period of hundreds of years while Temperatures stay low and CO2 drops to a minimum level. Then a long period where both Temperature and CO2 are low.
    2) Then, Temperature rises and stays high for hundreds of years until CO2 begins to rise. Then there is a period of hundreds of years while Temperatures stay high and CO2 rises to a maximum level. Then a long period where both Temperature and CO2 are high.
    3) Repeat above process for multiple cycles.

    Of course, that was well before humans evolved on Earth and everything changed :^)

    Ira]

  10. RealClimate as ever is asking, “Who are you going to believe? Us or your own lying eyes?”

  11. It has been 26 years since Hansen et al did the work noted above. The IPCC is in the process of producing the fifth report since then, involving “thousands” of “top-notch” “climate” “scientists” (each category is, of course, a piece of misinformation). In the world of science and business and general human lives, during that period we would expect the range of Scenarios to have been narrowed. But we are still faced with A through C.

    This is why references to Hansen’s predictions are still relevant. The most recent forecast is still as wild and woolly as it was in ’88, despite the new ones being curve-matched to data from 1988 to about 1997. There has been no progress. And that is because the fundamental assumptions that drive the model Scenarios have not changed.

    And they can’t. The whole CAGW narrative rides on a high sensitivity of CO2 and a significant, positive feedback from water vapour. The disaster, however, is not present but projected. Any reduction in the impact of additional atmospheric CO2 is not just a short-term problem for the story – you can’t have “extreme” weather from CO2 if there has been little or no global warming from CO2 – but a long-term problem. A world warmer by 2C in 200 years (by model) is a world that adapts, not collapses.

    Each day the newspapers have articles touting the terrible things happening in a warming and changing world of weather. All of them – and I mean “all” – are written with the present and future conditionals: the “may”, “could”, “might” and “should”. Whatever event we are to be alarmed about is “possibly” related to CO2 warming. Hansen and others behind the Scenarios continue to use the wide range because they know that the earlier they turn to the words “does”, “will” and “shall”, and narrow the Scenarios to give us a clearer picture of what lies ahead (under the current use of fossil fuels), the earlier we will be looking for the proof of their puddings. And discovering them to be tasteless.

  12. Incorrectly predicting the imperative to entirely remake our economic system still “shows skill?”

    No wonder these guys don’t work in the real world where results matter!

  13. “It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.” — Feynman

    Hansen 1988 was wrong.

  14. My predictions for this week’s winning lottery numbers are 6 numbers in the range 01-50. Given that is the total range possible (and this is the good part), I must be right no matter what numbers come up.

    Does this make me overqualified to be a ‘climate scientist’?

  15. squid2112 (March 20, 2013 at 7:31 am):

    I would not go so far as to say Climate Sensitivity is exactly 0.00 °C. All else being equal, I am pretty sure Climate Sensitivity is somewhere between 0.25 °C and 1 °C. If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 °C cooler, because the Atmospheric “Greenhouse Effect” is real. Thus, if you could hold Cosmic Rays and Multi-Decadal Oscillations and everything else constant, and then increased Atmospheric CO2, Earth Surface temperatures would increase.

    Calculations say a doubling of CO2 would result in about a 1 °C average temperature rise, but feedback, assuming clouds have, on net, a negative effect on temperatures, would reduce that rise, possibly to as low as 0.25 °C. (As we know, nighttime clouds have a positive feedback effect and daytime clouds have a negative feedback effect.) There are also other negative (and positive) feedback effects.

    Thanks for your comment.

    Ira
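The feedback arithmetic in the reply above can be sketched with the standard T = T0 / (1 − f) relation; note that both the ~1 °C no-feedback base and the feedback factor used here are the comment’s assumptions, not measured values:

```python
# Toy feedback calculation: a no-feedback doubling response t0 damped (or
# amplified) by a net feedback factor f, via T = t0 / (1 - f).
# The 1.0 C base and f = -3 are illustrative assumptions from the comment's range.
def with_feedback(t0, f):
    return t0 / (1 - f)

base = 1.0  # deg C per doubling with no feedback
print(with_feedback(base, -3.0))  # a strongly negative net feedback gives 0.25 C
```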

  16. I also enjoyed Gavin’s comment:

    “Short term (15 years or less) trends in global temperature are not usefully predictable as a function of current forcings.”

    I look forward to 2018, when 20 years will not be long enough.

    Meanwhile, as shown above, there are no successful predictions of that length either, but climate models don’t need to be proven; we’re just supposed to assume they’re correct until proven otherwise. Because that’s Science.

    [Good points. Actually, it has been 24 years since Hansen 1988 was published, and according to the RealClimate posting, Hansen's data was from 1984, so it has been around 28 years. By any measure, even Gavin's, 28 years should be LONG ENOUGH! Ira]

  17. Atmospheric CO2 at Mauna Loa Observatory – measured at what height?

    At ground level? Then it is a volcano activity indicator…

  18. “What does RealClimate make of this situation?
    … while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 ”

    Or as any 13-year-old would say: “Yeah, well, I may be wrong, but not as wrong as some other people.”

  19. It is claimed, by the climate-deranged (my new word replacement for warmists/alarmists, h/t Peter Foster), that CFCs should be considered a greenhouse gas reduction that took place and that therefore Scenario B should be the scenario to compare to.

  20. I need some help graphing it,

    but I project that Hansen has a greater chance of being arrested again (a positive sloped line)

    than he does having one of his projection/predictions be right (a negative sloped line).

    Although, once his projection of boiling oceans occurs, all bets are off.

    :)

  21. Let’s face it, the original graph was pure propaganda. The point was that CO2 was likely to be added at Scenario A rates. The point of the graph was to make Scenario A look scary so that we would address CO2. Looking at Scenario B, if we got sensible about CO2 it wouldn’t make an appreciable difference, as the line is almost as scary as Scenario A.

    Scenario C was only included to show that if the world went back to an agrarian lifestyle and all SUVs were recycled into diversity-powered scooters, we could stop the scary lines from happening.

    Well we didn’t stop. The SUVs are still here and the world is still rapidly industrialising and we are below scenario C.

    So the propaganda didn’t work, the calculations didn’t work, the temperature increase didn’t work, the CO2/Temp relationship didn’t work, the IPCC didn’t work, carbon sequestration didn’t work, CO2 offsets didn’t work, the vilification of Humanity’s progress didn’t work, the peer review STILL doesn’t work (otherwise why are they still talking about this graph rather than quietly recycling it as diversity-pressed parchment) and finally … the economics didn’t work.

    Did anything that came out of these claims actually ever work? Anything? Ever?

    I didn’t make the claims and even I am embarrassed for them. What a waste of a life’s work.

  22. it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988
    =============
    hysterical…..so they admit to how bad this “science” is

  23. If Hansen had known then what we know now, he might have made a bang-on 100% accurate prediction. He might have gotten the sensitivity right and the physics too.
    But no one would have listened, no one would have cared

  24. Ira Glickstein, PhD
     on March 20, 2013 at 8:07 am

    Regarding the effect of the clouds, this is obvious for us living far north, especially during the winter period. Others may not … (like well known AGW’ers …)

  25. @ Lutherwu “Thanks Ira, for presenting this info from Real Climate’s website, I never go there. It’s an unhealthy place to visit, what with blood pressure, and all.”

    Me too, Luther. I get so angry and frustrated it’s not healthy. And I want to live long enough to see these guys fall from grace, to stretch the word “grace” all out of shape. I’m 62. Another 3-5 years ought to do it I hope.

  26. Anthony,

    Scenarios A, B, and C make assumptions about the CO2 increase in the atmosphere. It would be worthwhile to show how those assumptions would plot out on a time vs. CO2 concentration scale for those of us not as technically inclined.
    Scenario A, I would guess, would be closest to the actual Mauna Loa plot you presented (would it be more curved upwards?). Would Scenario B be a straight line starting in 1988? If so, what slope would it be? Would Scenario C be a horizontal straight line starting in the year 2000? If so, at what level would it be?
    Thanks.

  27. squid2112 says:
    March 20, 2013 at 7:31 am
    Climate sensitivity to CO2 is exactly 0C !….period
    ///////////////////////////////////////////////////////////////////////////////////////////////////////

    There certainly is some observational evidence supporting that conclusion. For example:

    1. If one looks at the 33 years of satellite data, there would appear to be no first-order correlation between temperature rise and the rise in CO2 emissions. Unless the Super El Nino of 1998 was in some way caused by CO2, and to my knowledge no one suggests that it was, then that one-off event should be excluded when considering whether and to what extent there is a temperature response to rising CO2 emissions. Accordingly, excluding that one-off event, which brought about a step change (not a gradual linear increase), one sees that the temperature was flat between 1979 and 1997 and flat between 1999 and 2012. In other words, temperatures have remained flat for 33 years, and there is no first-order correlation between temperature and rising CO2 levels in the satellite data. Thus the issue becomes whether there is some second-order correlation if one were to take into account the negative effects of aerosol emissions and/or variance in TSI and/or cloudiness and incoming solar insolation. The problem is that we do not have accurate data on these, so any adjustments to be made are somewhat speculative. That said, it appears that global SO2 emissions are no greater today than they were in 1979, and if anything they are less. We are frequently told that variations in TSI are minimal and insignificant. That being the case, neither of these two factors can be masking the warming (if any) that is otherwise brought about by the rise in CO2 emissions over the period. The satellite data suggest that at current levels of CO2, any sensitivity to CO2 is so small that it cannot be seen within the noisy satellite temperature record.

    2. If one considers the global temperature anomaly around 1880-85, then according to Hadcrut4 it was about -0.4 degC. Today (about 2000 onwards) it is around +0.5 degC: on its face, a change of about 0.9 degC. During this time CO2 has risen by about 50%, from about 280 ppm to about 400 ppm, which is approximately 1/2 a forcing. However, adjustment needs to be made to take account of volcanic activity. The Team suggest that Krakatoa depressed global temperatures by about 1.2 degC. If that figure is right (i.e., if it is the correct forcing for such extreme volcanic activity, which I personally doubt), then but for Krakatoa one would have expected global temperatures in the period 1883 to 1886 to be some 1.2 degC higher than the -0.4 degC anomaly figure; i.e., the appropriate temperature anomaly for the period, say, 1883/86 would have been +0.8 degC. It is material that that is higher than the Hadcrut anomaly of today. Accordingly, once one takes account of volcanic forcings, and makes the necessary adjustment for those forcings, it would appear that temperature anomalies around the 1883/86 period would be higher than today! Thus, notwithstanding a 1/2 forcing (i.e., an increase in CO2 levels from about 280 ppm to about 400 ppm), there is no observable increase in global temperature anomaly! Again, this suggests that at around the 300 ppm CO2 level, climate sensitivity is around zero.

    [Richard Verney: Good to "see" you here again. Regarding your statement that "climate sensitivity is around zero", I agree. 0.25 °C IS AROUND ZERO, and even my high estimate of 1 °C is way closer to zero than Hansen 1988's 4.2 °C or the IPCC's 3 °C. Ira]
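As a side note on the comment’s “approximately 1/2 a forcing” figure: with the standard logarithmic CO2 forcing relation (F proportional to ln(C/C0)), a rise from 280 ppm to 400 ppm does come out to roughly half a doubling:

```python
import math

# Fraction of one CO2 doubling represented by a rise from 280 to 400 ppm,
# using the standard logarithmic forcing relation F proportional to ln(C/C0).
fraction = math.log(400 / 280) / math.log(2)
print(f"{fraction:.2f} of a doubling")  # ~0.51, i.e. roughly half a forcing
```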

  28. Tall Dave: But Gavin will be retired by then (~20 years of no warming) and will not give a @@@@; hell, he’ll probably become an ardent denier in his old age. LOL

  29. CO2 growth is close to A, but actual temperatures are less than C, which proves that Hansen was correct?
    Do these guys wear special glasses that let them see things that aren’t there?

  30. Ira,

    I have taken the liberty of suggesting a change to one of your comments as follows. I am making the change because use of the terminology ‘greenhouse effect’ is not, and never was, scientifically appropriate.

    Note: My changes are in bold.

    Ira Glickstein, PhD on March 20, 2013 at 8:07 am

    squid2112 (March 20, 2013 at 7:31 am):

    @squid2112

    [ . . .]

    I would not go so far as to say Climate Sensitivity is exactly 0.00 °C. All else being equal, I am pretty sure Climate Sensitivity is somewhere between 0.25 °C and 1 °C. If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 °C cooler, because the Atmospheric “Greenhouse Effect” Planetary Atmospheric Effect, of which there is a radiative gas portion, is real. Thus, if you could hold Cosmic Rays and Multi-Decadal Oscillations and everything else constant, and then increased Atmospheric CO2, Earth Surface temperatures would increase.

    [ . . .]

    Thanks for your comment.

    Ira

    Again note that my changes are in bold.

    Your article is much appreciated. Thank you.

    John

    [John Whitman: Thanks for your comment and your suggestion. Of course I agree that the warming effect of H2O, CO2, and other gases in the Atmosphere is very different from what goes on in an actual physical greenhouse, where the walls and roof primarily prevent loss of heat by convection rather than by outgoing long wave radiation. However, this mis-analogy has become established and, to communicate clearly with others, I choose to go along with established terminology. However, I try to say Atmospheric (to make it clear I am not talking about an actual physical greenhouse) and I put “scare” quotes around the words “Greenhouse Effect”. Ira]

  31. Not only Hansen but the IPCC have failed. How much failure can we take? We have waited decades and there is a divergence. The theory exaggerates warming and is essentially a pile of batshit.

    What if there had been a rapid curtailment after 1988? Hansen would be seen as a superstar scientist. Instead, there has been none and we have still hit the target. Any reasonable scientist, after raising so much alarm, would say they were wrong and go into hiding. Instead, Hansen is even angrier than ever and is prepared to get arrested over this trace gas.

    Finally, we have hit below scenario C so the goal has been met, the panic is over, let’s disband CRU, IPCC and fire Hansen immediately.

  32. Ira Glickstein, PhD says:
    March 20, 2013 at 8:07 am
    ———————————————————-
    Congratulations on the clear and concise presentation.

    However, when you say in your comment: “If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 °C cooler, because the Atmospheric “Greenhouse Effect” is real” you are attributing the entire Greenhouse Effect to CO2, whereas it is dominated by water vapor.

    CO2 has a narrow absorption band, and it must plateau by 300 ppm concentration, since there is a CO2 sensitivity to temperature. If there were a significant temperature sensitivity to CO2 above 300 ppm, there would be a positive feedback that shouldn’t stop until essentially all the massive stores of surficial carbon returned to the atmosphere as CO2, with global temperature tens of degrees warmer than ever inferred from the entire ice-core record.

    [R Taylor: You are correct that the primary "Greenhouse gas" is H2O. However, some climate scientists have suggested that, lacking CO2 (and/or other gases that do not precipitate and freeze at 0 °C), the first strong Ice Age would have caused our planet to become "snowball Earth" where all the H2O would become snow and ice, which have a high albedo (reflectiveness to short wave Solar radiation). They claim that "snowball Earth", lacking any appreciable H2O vapor in the Atmosphere, and reflecting much of the incoming Solar radiation, might remain in that frozen state. It was the unfrozen CO2, and other unfrozen trace gases in the Atmosphere, that allowed the Atmospheric "Greenhouse Effect" to be effective and pull Earth out of the "snowball". I believe that theory is credible, and, if so, CO2 and other non-H2O gases are responsible for the Earth not being a "snowball" and allowing the H2O to return to liquid and gaseous form, with the result being at least 30 °C warmer than it would be otherwise. Ira]

  33. They can’t even be straight about what he predicted. The fair way would be to show both, but instead they often say that since methane did not rise as much as predicted and because they now agree that the median increase for CO2 doubling is 2.8 to 3.0 C instead of 4.2 that if you make those changes it is not as bad and he was “reasonably accurate”. Well, he did not use 2.8 C and they believe that in many ways CO2 is a bigger problem than methane and they were PREDICTING that methane would rise, that was part of it as well.

    So to fairly test what Hansen said in 1988 you do as Ira did. Then after that you can say, well, if we had made the prediction in 2004 or 2013 we would not have been as wrong as Hansen back in 1988. Which sounds kind of lame and explains why they choose the more misleading way.

  34. Ira Glickstein, PhD says:
    March 20, 2013 at 8:07 am

    … If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 °C cooler, because the Atmospheric “Greenhouse Effect” is real.

    1 – Actually, the average temperature would be about the same. It’s just that we would, literally, bake during the day and freeze at night. The surface conditions would be like those on the moon. If you apply a low pass filter (ie. dig down into the moon far enough) the temperature is a comfortable 23 deg. C. http://www.lunarpedia.org/index.php?title=Lunar_Temperature
    2 – The most powerful greenhouse gas, by far, is water vapour. Attributing the whole greenhouse effect to CO2 is wrong. Note that the adiabatic lapse rate is hugely influenced by moisture and not at all influenced by CO2. http://en.wikipedia.org/wiki/Lapse_rate

  35. The lefty mind, literally, can not contain the two concepts “Real world scenario is close to Hansen A scenario” and “Real world temps are close to Hansen C” at the same time. That this is a fact, is no accident. It is the consequence of deliberate interference in the natural development of the cognitive ability of our children and has been going on since at least the 30’s. The Marxist architect mind is constructed of independent nodes each with its own reference, set of definitions, and parsers. Only one can be active at a time. This is one of the two reasons why the trolls can not read a post or comment without completely missing what it is saying, re-interpreting it to conform to their ideological preconceptions.

  36. Ira Glickstein, PhD says:
    March 20, 2013 at 8:07 am
    “…If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 °C cooler, because the Atmospheric “Greenhouse Effect” is real….”
    ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////
    If, over the course of the next 10 or so years, global temperatures do not increase, then there will be 2 obvious contenders as to the reason why there has been no increase, namely:

    1. At current levels of CO2, saturation has already been reached such that sensitivity is now so close to zero that it cannot be measured; or
    2. The basic physics upon which AGW is built is fundamentally flawed.

    You are a brave man ruling out the second.

    I strongly suspect that if the temperature hiatus continues and there is no rise in temperature anomaly by early 2020, we will be seeing many papers dealing with the basic physics, and these papers will, inter alia, question whether “..the Earth Surface temperatures would be over 30 °C cooler..” but for the so-called greenhouse effect.

    Watch this space and bring along the popcorn.

  37. “… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change). …

    Let me do a re-write for ya.

    Dad, I promised you I would get an A in biology, instead I got an E. Hey, at least I didn’t get an F.

    Gavin puts the best political spin masters to shame.

  38. If you can’t predict future climate change then you don’t really know why the climate changes. If you don’t really know why climate changes then you can’t introduce the term “climate sensitivity” and try to estimate its value. You can only say “I don’t know”.

  39. Well, regarding the lots and lots of cold, sticky, white stuff still lying around outside my home right at this year’s spring equinox, I rather accept the 1970s predictions of a coming Ice Age.

    Brrrrrrrrr!

  40. knr says:
    March 20, 2013 at 8:03 am
    My predictions for this week’s winning lottery numbers are 6 numbers in the range 01-50. Given that is the total range possible (and this is the good part), I must be right no matter what numbers come up.

    Does this make me overqualified to be a ‘climate scientist’?
    ————————————————————————————————————————
    Yes, it does. If’n you want to be a REAL climate scientist you need to predict that all lottery numbers will exceed 50. When the results come in, you argue because of your large error bars that you won anyway. Of course, your only hope of collecting is if the lottery is run by NSF.

  41. Jimbo says:
    March 20, 2013 at 8:59 am

    “… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change). …

    Let me do a re-write for ya.

    Dad, I promised you I would get an A in biology, instead I got an E. Hey, at least I didn’t get an F.

    Gavin puts the best political spin masters to shame.

    You’re right, Jimbo. Nobody but an eedjit would tell people there was to be “no-change” in the climate. (I wonder if Gavin has a link or two for his so-called “no-change” climate prognosticators?)

  42. Ira

    Further to my post (richard verney says: March 20, 2013 at 8:57 am), I am not saying that you are wrong, merely that your assertion may be wrong. Personally, I consider the 30degC figure to be highly suspect.

    Let us suppose, for the sake of argument, that the MWP existed as a global event and it was warmer than today. What does that event say about climate sensitivity?

    Ditto, if the Roman and the Minoan warm periods truly existed and were warmer than today?

    Ditto, the Holocene Optimum existed and was warmer than today?

    I would be interested to hear your views as to climate sensitivity with respect to those 4 scenarios.

    Is it not the case that proponents of cAGW have to erase such events from the global temperature record since if they existed, then it follows that climate sensitivity must indeed be small (less than 1degC, perhaps the 0.25degC figure you mention)?

    [richard verney: I accept that the Roman and Minoan and Medieval Warming Periods (as well as 1934, at least in the US :^) were warmer, on average, than the past decade. Therefore, as you say, Climate Sensitivity MUST be small, perhaps as small as 0.25 °C, and very unlikely to be higher than 1 °C. So we are in agreement. Ira]

  43. ok IRA . . . you got me at “nothing can move faster than the speed of light” . . . . If that is true . . . why DOES . . E=mc2 (squared)?

    (checked out your site) http://tvpclub.blogspot.com/ ABOUT MY ENTRY

    it’s as far as I got~!

    [Laurie Bowen: THANKS for checking out my Blog. For those who have not checked it out, my current top topic is "What is Time? Alan Alda's 2013 Flame Challenge". I make the claim that Time is the fourth dimension, plain and simple, and I subscribe to the theory that you and I and the whole Universe are zipping along the Time dimension at nearly the speed of light, which is as fast as anything can go. That is how I explain the fact that, when we move in any of the three Space dimensions, our speed in the Time dimension must reduce a bit such that the vector sum of our speed in Space and our speed in Time always equals exactly the speed of light. (Time Dilation and Length Contraction are well established, experimentally demonstrated truths. These effects are tiny even at the speeds of our fastest rockets, but they are real in that GPS satellites must correct for them. At speeds approaching a significant fraction of the speed of light, these effects can cause Time to slow down considerably.)

    I have submitted a five and a half minute video that will be judged by 11-year-old science students. Please check it all out HERE.

    As for your reference to e = mc^2, I don't know what that has to do with the speed of light other than that Albert Einstein contributed to our understanding of the close inter-relationship between Energy and Mass as well as Space and Time. Ira]

  44. For even more laughs, you should plot the absolute “global-averaged” temperature instead of the anomalies. GCMs cannot even get absolute temperatures right. And remember, energy transfer by thermal radiation varies as absolute temperature to the fourth power…
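    The fourth-power point can be made concrete with a quick sketch (the 2 K bias and the temperatures here are illustrative round numbers, not taken from any GCM):

    ```python
    # Stefan-Boltzmann: emitted flux F = sigma * T^4, so an error in absolute
    # temperature propagates to a roughly 4*dT/T fractional error in the
    # outgoing flux. Illustrative numbers only.
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def flux(T):
        """Black-body emitted flux at absolute temperature T (kelvin)."""
        return SIGMA * T**4

    T_true, T_model = 288.0, 290.0     # a hypothetical 2 K absolute bias
    dF = flux(T_model) - flux(T_true)  # resulting flux difference, W/m^2
    print(dF)                          # on the order of ten W/m^2
    ```

    A model that is a couple of degrees off in absolute temperature is thus off by several W/m² in thermal radiation, which is why comparing only anomalies can hide a large flux error.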

  45. I predicted that Manchester United would win this year’s Champions League. I was wrong, but only by one away goal. Sadly it was scored quite early in the competition…

    How about we re-draw Hansen’s graphs starting from 1995, say? How well do the curves fit then? Perhaps “fit” is the wrong word…..

  46. Don’t confuse “emissions” with “atmospheric concentrations”.

    Hansen’s 3 scenarios were built around different GHG emissions paths. Exponential growth, linear growth, and abrupt curtailment.

    For each emissions path, he projected atmospheric concentrations of GHGs. These are a derivative of each scenario… not the scenario itself.

    The problems with Hansen (1988) are twofold: 1) emissions have continued to rise as fast as Scenario A, but concentrations have only risen in line with Scenario B, and 2) despite concentrations that have risen in line with Scenario B, temperatures have risen less than Scenario C.

    RealClimate and Skeptical Science both gloss over problem 1)… pointing out that Scenario B should be the reference point for comparison because atmospheric concentrations of all GHGs have been below Scenario A. This is goal-post shifting.

    Scenario A spelled out an emissions path, and is still the relevant benchmark for how the real world has played out.

    If emissions have continued to grow exponentially, but atmospheric concentrations have risen less than projected, it means the model was wrong, and either GHG absorption by the biosphere is greater than projected, or atmospheric dwell time is less than projected. Either way… Hansen’s models were biased high.

  47. Ira Glickstein, PhD says, (March 20, 2013 at 8:07 am): “All else being equal, I am pretty sure Climate Sensitivity is somewhere between 0.25 ⁰C and 1 ⁰C. If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real. Thus, if you could hold Cosmic Rays and Multi-Decadal Oscillations and everything else constant, and then increased Atmospheric CO2, Earth Surface temperatures would increase.”
    =========================================================

    Ira, this calculation about “over 30 ⁰C cooler” is so wrong. It is based on the notion that the Earth gets warmed as a disc and cools as a sphere. This is absurd, Ira. Earth gets warmed as a sphere too, because it rotates. The Earth without atmosphere would be very hot, because radiative cooling is very slow. Just start thinking about it.

    The second thing is that back radiation (from CO2 or whatever) does not warm the source. This is impossible already on the theoretical level, because the assumption that it does leads in some cases to creating energy out of nothing, which means that the assumption is wrong. And on the experimental level it has been known since the R.W.Wood experiment (1909).

    Time to wake up, Ira, and look at those things really critically.

  48. In reply to tallbloke:

    tallbloke says:
    March 20, 2013 at 7:42 am

    Good summary. The whole sensitivity debate is a red herring however. Changes in co2 level FOLLOW changes in temperature, at ALL timescales. Cause precedes effect.

    William:
    I do not understand the reasoning to support this comment. Post-2000, CO2 has increased linearly and there has been no increase in planetary temperature. What is driving the linear increase in CO2? If the 20th century increase in CO2 was caused by the increase in planetary temperature, wouldn’t we expect the rise in CO2 to have stopped as the rise in planetary temperature has stopped?

    Comments:
    1) Explaining the glacial/interglacial lag in CO2 rise is a separate problem. The specialists cannot explain the drop in atmospheric CO2 from interglacial to glacial phase or the increase from glacial to interglacial. The amount of CO2 that is absorbed by the oceans as they cool is almost offset by the reduction of CO2 due to shrinking of the biosphere (vegetation) due to the massive ice sheets. The proposed hypotheses (mechanisms) to explain the change create paradoxes when applied to other periods.
    2) The question of what maintains atmospheric CO2 and why atmospheric CO2 changes on geologically long time periods is not understood. The proposed hypotheses (mechanisms) to explain the change create paradoxes when applied to other periods. The hypothesis that increased erosion removed the atmospheric CO2 for example results in catastrophically cyclically low atmospheric CO2 levels (end of plant life).
    3) The current observation that there is no correlation of planetary temperature with changes in CO2, and the observation on geologically long time periods that planetary temperature is not correlated with atmospheric CO2, can be explained by the CO2 greenhouse mechanism saturating due to clouds in the tropics increasing and decreasing to resist forcing change. Lindzen and Choi’s sensitivity paper supports the assertion that the planet resists forcing change by increasing or decreasing clouds in the tropics.

    Atmospheric carbon dioxide levels for the last 500 million years

    http://www.pnas.org/content/99/7/4167.full

    Using a variety of sedimentological criteria, Frakes et al. (18) have concluded that Earth’s climate has cycled several times between warm and cool modes for roughly the last 600 My. Recent work by Veizer et al. (28), based on measurements of oxygen isotopes in calcite and aragonite shells, appears to confirm the existence of these long-period (~135 My) climatic fluctuations. Changes in CO2 levels are usually assumed to be among the dominant mechanisms driving such long-term climate change (29).

    Superficially, this observation would seem to imply that pCO2 does not exert dominant control on Earth’s climate at time scales greater than about 10 My. A wealth of evidence, however, suggests that pCO2 exerts at least some control [see Crowley and Berner (30) for a recent review]. Fig. 4 cannot by itself refute this assumption. Instead, it simply shows that the ‘‘null hypothesis’’ that pCO2 and climate are unrelated cannot be rejected on the basis of this evidence alone.

    http://faculty.washington.edu/battisti/589paleo2005/Papers/SigmanBoyle2000.pdf

    Glacial/interglacial variations in atmospheric carbon dioxide by Daniel M. Sigman & Edward A. Boyle

    The exchange of CO2 between the atmosphere and the surface ocean would reach completion over the course of six to twelve months if there were no other processes redistributing inorganic carbon in the ocean. However, the pCO2 of surface waters is continuously being reset by its interaction with the deep ocean reservoir of inorganic carbon, which is more than 25 times that of the atmosphere and surface ocean combined (Fig. 2).

    As ocean temperature was lower during ice ages, it is an obvious first step to consider its effect on atmospheric CO2. The lower temperatures of the glacial ocean would have reduced the concentration of CO2 in the atmosphere by drawing more of it into the ocean. The deep ocean, which is the dominant volume of ocean water, has a mean temperature of 2 °C. Sea water begins to freeze at about −2 °C, producing buoyant ice. As a result, deep ocean water could not have been more than 4 °C colder during the last ice age, placing an upper bound on how much additional CO2 this water could have sequestered simply by cooling. The potential cooling of surface waters in polar regions such as the Antarctic is also constrained by the freezing point of sea water.

    There are uncertainties in each of these effects, but it seems that most of the 80–100 p.p.m.v. CO2 change across the last glacial/interglacial transition must be explained by other processes. (My comment: other than temperature variance, which forces CO2 out of solution in the ocean.) We must move on to the more complex aspects of the ocean carbon cycle.

    What caused Glacial-Interglacial CO2 Change?

    http://geosci.uchicago.edu/~archer/reprints/revgeo/rog.pdf

    Abstract. Fifteen years after the discovery of major glacial/interglacial cycles in the CO2 concentration of the atmosphere, it seems that all of the simple mechanisms for lowering pCO2 have been eliminated. We use a model of ocean and sediment geochemistry, which includes new developments of iron limitation of biological production at the sea surface and anoxic diagenesis and its effect on CaCO3 preservation in the sediments, to evaluate the current proposals for explaining the glacial/interglacial pCO2 cycles within the context of the ocean carbon cycle. After equilibration with CaCO3 the model is unable to generate glacial pCO2 by increasing ocean NO3− but predicts that a doubling of ocean H4SiO4 might suffice. However, the model is unable to generate a doubling of ocean H4SiO4 by any reasonable changes in SiO2 weathering or production.

    Our conclusions force us to challenge one or more of the assumptions at the foundations of chemical oceanography. We can abandon the stability of the “Redfield ratio” of nitrogen to phosphorus in living marine phytoplankton and the ultimate limitation of marine photosynthesis by phosphorus. We can challenge the idea that the pH of the deep ocean is held relatively invariant by equilibrium with CaCO3. A third possibility, which challenges physical oceanographers, is that diapycnal mixing in ocean circulation models exceeds the rate of mixing in the real ocean, diminishing the model pCO2 sensitivity to biological carbon uptake.

  49. @Ira Glickstein: “If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real. ”

    Sure about that? This is based on calculations assuming Earth as a black body. Except it is known that there is not a body in the universe that actually behaves as a black body. The Earth is not a good contender – I mean, exactly at what point should you consider it as a black body? On the Earth’s surface or at the top of the troposphere or some mean point from top of atmosphere to Earth’s surface or….. where?

    [Ryan: True, the Earth is NOT a PERFECT Black Body, nor is anything in the Universe. The Earth is not a perfect absorber nor a perfect emitter of electro-magnetic radiation. The Earth has an albedo (reflectiveness) significantly greater than zero. The Earth reflects around 30% of the incoming Sunlight back out into Space, so only about 70% of that Sunlight energy participates in the Atmospheric "Greenhouse Effect". All the calculations I have seen include the effect of our albedo. Ira]
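    For anyone who wants to check the arithmetic behind the "over 30 ⁰C cooler" figure, here is a minimal sketch of the standard effective-temperature calculation Ira describes. The solar constant and albedo are the commonly quoted round values, assumed here rather than taken from this thread:

    ```python
    # Effective radiating temperature of an Earth with no greenhouse gases:
    # absorbed sunlight = emitted thermal radiation
    #   S * (1 - albedo) / 4 = sigma * T_eff^4
    # The factor of 4 is the ratio of the Earth's surface area (sphere)
    # to its sunlit cross-section (disc).
    S = 1361.0              # solar constant, W/m^2 (assumed round value)
    ALBEDO = 0.30           # fraction of sunlight reflected (assumed)
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    T_eff = (S * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25  # ~255 K
    T_surface = 288.0       # observed global mean surface temperature, K
    greenhouse = T_surface - T_eff                    # ~33 K difference
    print(round(T_eff, 1), round(greenhouse, 1))
    ```

    The ~33 K gap between the effective and observed surface temperatures is the quantity the "Greenhouse Effect" argument in this thread is about; the albedo term is exactly the 30% reflection Ira mentions.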

  50. Ira, you’ve got to clean up that graph. Too much information crammed into a small space. It’s like you’re talking so fast it’s hard to recognize the words. Don’t change the story; just make it more elegant.

  51. while my prediction that Romney would win the 2012 election was not perfect, it has shown skill in that it has out-performed the naive hypothesis of no-winner in that election.

  52. Always a pleasure to see a post by Ira. Thank, Ira.

    “… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change). …” –“Real”Climate

    While Bernie Madoff was not perfect, he has shown honesty in that he stole a lot less than other people we can’t name might have.

  53. Ira, I would tend to agree, but I think the negative feedbacks are greater, so that’s why from being a skeptic I now believe CO2 has no effect whatsoever on global temperatures. If anything, an excess of CO2 would tend to lower global temperatures as a compensatory mechanism. I think both Lindzen and Spencer have shown this somewhat. There is evidence, I believe, from Holocene records that 3000 ppm ± CO2 was associated with massive ice ages?

  54. Well, it took you a month and a half to come up with that comment on the original RealClimate post. By the way, trace gas means a lot more than just CO2 (methane, chlorofluorocarbons, et al.)

    [François: Actually, although you are right that the RealClimate posting I commented on has been up for several weeks, I only just saw it a few days ago. My visits to RC are infrequent, about as often as I go to the dentist, and for similar painful reasons of duty. The above WUWT posting only took me several hours to compose.

    And, yes, "trace gas emissions" include more than CO2. There's CH4 and lots of other minor contributing "Greenhouse gases". However, is there any evidence you know of that "trace gas emissions", broadly defined, have ceased their increase since the year 2000? That is what Hansen Scenario C assumes, and that is why Scenario C is only about 30% higher than the actual data posted on the actual RC graphic that I annotated.

    Actual "trace gas emissions" are somewhere between Scenarios A and B, but the temperature change is way less than predicted for Scenarios A and B. Actual results are close to Scenario C but "trace gas emissions" are close to Scenarios A and B. So, what is your point? I'm listening. Ira]

  55. The Scenario C projection does not even present a realistic picture, because it uses the same start point instead of, say, a 5-year rolling projection that would show temperatures being flat since around 2000.
    If you don’t move the base, that projection will still show a rising trend 5 years from now
    even if temperatures remain flat. Words like ‘foot’, ‘own’ and ‘shooting’ come to mind……………..
    Hansen still wrong, but only by a few degrees instead of entirely.

  56. FTA: “As everyone knows, the Mauna Loa measurements of atmospheric CO2 proves that there has NOT BEEN ANY CURTAILMENT of trace gas emissions. Indeed, the rapid increase of CO2 continues unabated.”

    There has not been any curtailment of emissions, that much is true. But, the rate of change of CO2 has leveled out, in lockstep with the leveling out of temperatures. Our emissions do not control CO2 – Nature bats them away with barely an acknowledgement.

    [Bart: Have a close look at the Mauna Loa CO2 data (second graphic above) and tell me where CO2 has "leveled out". Yes, since CO2 has its small seasonal ups and downs there are some months where it is level, but, on a smoothed yearly basis it seems to me to be a continued rapid increase that is slightly exponential in the upward direction. Ira]

    Ira Glickstein, PhD says:
    March 20, 2013 at 8:07 am

    “All else being equal, I am pretty sure Climate Sensitivity is somewhere between 0.25 ⁰C and 1 ⁰C.”

    All else being equal being the operative phrase. All else is not equal.

    “If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real.”

    A globally positive function is not necessarily monotonic. Local sensitivity – the incremental change in temperature due to an incremental change in CO2 concentration at the current position, i.e., the partial derivative – does not have to be significant or even positive. The data show that there has been no significant deviation from longstanding patterns in the past century.

    [Bart: What does that link or your mathematical jibber-jabber have to do with the past century? Or with the Atmospheric "Greenhouse Effect" that makes the Earth Surface at least 30 ⁰C warmer than it would be if the Atmosphere lacked "Greenhouse gases"? I read your words four times and visited your link. Please explain your point in English. advTHANKSance. Ira]

    richard verney says:
    March 20, 2013 at 8:37 am

    “If one looks at the 33 years of satellite data then there would appear to be no first order correlation between temperature rise and the rise in CO2 emissions.”

    But, there is a definite correlation between the rise in CO2 concentration and temperature.

    [Bart: I have to agree with Richard Verney that there is no FIRST ORDER correlation between temperature rise and the rise in CO2 emissions. Please don't simply point to some graph. Explain what you mean so even Richard and I can understand. advTHANKSance. Ira]

  57. So clear, so simple yet the other guys will still try and pretend Jim was right.

    Thanks Ira, this is going out to all stations Keitho.

  58. “What if there was a rapid curtailment after 1988, Hansen would be seen as a superstar scientist.” –Jimbo

    He’d be disappointed with that title. He’d much prefer “Messiah.”

  59. In the interest of accuracy, we should drop the term “greenhouse effect.” What happens in the atmosphere in no way resembles what happens in a greenhouse, which heats itself by suppressing convection.

    There is a growing body of research that calls into question the entire notion of “increased CO2 levels raising the earth’s temperature.” Physicist Ferenc Miskolczi has done some groundbreaking work in this area.

  60. The current hiatus in the rise of global mean temperature represents a temporary pause, nothing more. The earth has been warming since the end of the Little Ice Age, with various time-localized plateaus and accelerations along the way; and it will continue to warm in that same pattern until at some point in the future, it doesn’t.
    .
    In the meantime, the debate over human-induced climate change will continue to pass through the GHG Narrative Diode.
    .
    The GHG Narrative Diode works this way….. any plateau which is anything less than a statistically significant drop in Global Mean Temperature occurring continuously over some long period of time — let’s figure on three to five decades — will continue to be interpreted by the climate science community as representing insufficient evidence that a human caused GHG-driven global warming trend isn’t still operative as the primary driver for climate change.
    .
    Said another way, a statistically significant trend in falling global mean temperature must occur continuously over some very lengthy period of time — thirty years at the minimum, but more likely fifty years — before the climate science community ever begins to question the narrative of human-caused GHG-driven global warming.

  61. Ira, looking through the details of the paper, and comparing to actual emissions, I would say Scenario B is the closest to reality.

  62. [ Note: bold emphasis is by me (John Whitman) ]

    Eliza on March 20, 2013 at 10:13 am said,

    Ira, I would tend to agree, but I think the negative feedbacks are greater, so that’s why from being a skeptic I now believe CO2 has no effect whatsoever on global temperatures. If anything, an excess of CO2 would tend to lower global temperatures as a compensatory mechanism. I think both Lindzen and Spencer have shown this somewhat. There is evidence, I believe, from Holocene records that 3000 ppm ± CO2 was associated with massive ice ages?

    – – – – – – –

    Eliza,

    Your comment presents a climate science thesis that I recommend is worthy of significant funding to expand the research base on it. The climate science community perhaps has finally opened enough for such views to start to be pursued. I hope some climate scientists who are enterprising and are traditional scientific skeptics will engage the task!

    The paradigm has shifted from alarming / dangerous AGW by CO2 toward one of minimal observed or below attribution level effects of radiative gas CO2 on the Total Earth Atmospheric System.

    Take care.

    John

  63. Yes, but when are the environment and ‘science’ journalists in the MSM going to get their heads around this stuff? And the fact that Hansen and his ‘team’ live in a fantasy world? Who is going to persuade them to do so?

    Even with right and science on their side, the sceptics are still losing the wider argument. There had to be some way to get it out there….

  64. It is always fun when current prophets forget that the best prophecies are always written long after the fact.

  65. Heathens! No one cares about temps NOW as they are however going to shoot off the page any moment, just you deniers wait and see. You only have to look at the news to see every kind of weather is ‘unprecedented’ and ‘proof’. In fact we don’t even need charts or facts or poor people…if we don’t like them there is always the block function, if not we can go na-na-na-na

    /sarc

  66. commieBob says:
    March 20, 2013 at 8:56 am

    Ira Glickstein, PhD says:
    March 20, 2013 at 8:07 am

    2 – The most powerful greenhouse gas, by far, is water vapour. Attributing the whole greenhouse effect to CO2 is wrong. Note that the adiabatic lapse rate is hugely influenced by moisture and not at all influenced by CO2. http://en.wikipedia.org/wiki/Lapse_rate

    But is it not reasonable to assume that if there were no CO2 – particularly in the higher, colder and DRIER regions of the troposphere – then it’s likely there would be less atmospheric water vapour. Remember a warmer atmosphere can hold more moisture.

    This is one of the reasons I suspect that any feedback effect is probably slightly positive.

  67. Hansen has yet to publish his Feb 2013 data.

    Maybe it is not cooperating with his preferred outcome and he needs more time to persuade the data to warm up.

  68. Bart says:
    March 20, 2013 at 10:31 am

    But, there is a definite correlation between the rise in CO2 concentration and temperature.

    Indeed there is. From ice core data, Hans Erren found it to be about 10 ppm per 1 deg C – which is roughly what we see when we look at the rate of CO2 change during El Nino and La Nina. The atmospheric CO2 change is always positive – but it’s greater in warmer years than colder.

    Using Erren’s estimate the temperature rise since ~1850 has contributed about 7 ppm to the overall CO2 increase. Others argue it might be nearer to 16 ppm. Whatever – it sure isn’t anywhere near the 100 or so ppm we’ve seen in the past 150 years.

    [John Finn: THANKS for clearing that up. I agree that rising global temperatures, due to any cause, will result in a small increase in CO2 levels. Over the past 130 years or so, we've seen a net temperature increase of 0.5 ⁰C to 0.8 ⁰C, along with a CO2 increase of over 100 ppmv. Perhaps 10% of that 100 ppmv may be due to temperature increase, 20% tops. So, it is not a FIRST ORDER effect.

    The other 80 or 90% of the rise in CO2 is due to a variety of causes, including unprecedented levels of fossil-fuel burning. I hasten to add that the amount of global warming has been overstated, the amount due to human activities (both fossil fuels and land usage that has decreased albedo) has also been overstated, and that predictions of climate catastrophe are unwarranted. Indeed, if, based on the fact that the current Sunspot cycle #24 is on the low/long side, we enter a cooling period similar to past eras of a series of low/long cycles, future generations may thank us for contributing a bit to the warming of the Earth. Ira]
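    The back-of-envelope arithmetic in the reply above can be laid out explicitly. The 10 ppm/°C figure is the Erren estimate quoted earlier in the thread; the warming and CO2-rise values are round numbers taken as assumptions:

    ```python
    # How much of the ~100 ppm CO2 rise could warming-driven outgassing
    # explain, using the ice-core sensitivity quoted in the comment above?
    SENSITIVITY = 10.0   # ppm CO2 per deg C of warming (Erren estimate)
    warming = 0.7        # deg C net warming since ~1850 (round number)
    total_rise = 100.0   # ppm observed CO2 increase (approximate)

    from_temperature = SENSITIVITY * warming  # ~7 ppm attributable to warming
    share = from_temperature / total_rise     # ~7% of the observed rise
    print(from_temperature, share)
    ```

    On these assumptions temperature accounts for only a few percent of the observed increase, which is the sense in which the effect is "not FIRST ORDER".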

  69. François says:
    March 20, 2013 at 10:18 am
    Well, it took you a month and a half to come up with that comment on the original Realclimate post.
    >>>>>>>>>>>>>>>>>>>

    That’s your rebuttal? It took six weeks? If he’d done it in 2 weeks would it be more right? Would Hansen be less wrong?

    Troll quality has declined significantly in the last two years. Perhaps this is a proxy for something?

    [Dave: Great to "see" you again! I agree with your comments to François, but let us all calm down. Perhaps François will change his mind if he hangs around WUWT and us for an extended period of time. Ira]

  70. Given that temps have virtually flatlined the past 16 years (or more), it would actually be more reasonable to assume that the actual increase of 0.22 ⁰C had little, if anything, to do with CO2.

  71. Where does the 0.22 C line come from and how was it calculated? It doesn’t seem to reflect anything else on the graph. Maybe I am missing something here.

    [xham: I merely took the midpoint between GISTEMP and HADCrut4, as plotted on the RealClimate graphic for 1988, and drew an arrow to the same midpoint for 2012, and subtracted the (estimated) numbers. Thanks for asking. Ira]

  72. Ira Glickstein, PhD says:
    March 20, 2013 at 8:07 am
    “…If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real….”

    The fatal flaw of the greenhouse effect revolves around the ignored assumption that energy stored in the oceans doesn’t contribute to it. In fact it contributes a lot, and the 1–4% water vapor and CO2 in the atmosphere cannot be claimed to account for the 33 ⁰C difference on their own. The black-body calculation using Stefan’s law to show the 33 ⁰C difference doesn’t take into account a water body storing extra energy. IMO the current theory of the greenhouse effect is wrong, and calculation and observational data are needed to estimate how much energy this contributes to the overall greenhouse effect.

  73. If the warmists really believed really,really hard then the temperature would be higher. They are not believing hard enough, the crystals are not responding. Gaia feels unloved and is letting the Physics monster win.

  74. John Finn says:
    March 20, 2013 at 11:44 am

    “Indeed there is. From ice core data…”

    Forget your trumped up ice core analysis. That is a massive, flailing rationalization.

    The relationship is right here before your eyes. Temperatures drives CO2. There is no way to contradict it – the best, most modern, most direct, most reliable measurements indicate that temperature drives CO2, and CO2 has little effect on temperature in the modern era.

  75. An important reminder that Hansen devised these scenarios after only 8 years of global warming. Yet somehow in [their] delusional world 16 years is still not long enough to falsify it. You were wrong again for the dozenth time and you are wrong now. The planet Earth has falsified you, not the skeptics. The skeptics just have to point to the plentiful supporting evidence for this conclusion. Which is why the CAGW crowd have lost the plot, with the most bizarre tactics that can only be described as not science.

    p.s. It cannot be claimed to be a short-term natural pause any more, because it is now longer than any since the instrumental record began. The only periods longer than this involved long-term cooling.

  76. Since so many are chiming in on Dr. Glickstein’s statement below:
    “If Atmospheric CO2 had zero effect, the Earth Surface temperatures would be over 30 ⁰C cooler, because the Atmospheric “Greenhouse Effect” is real.”

    I’d like to chime in myself. The question I pose is: “What precisely is meant by the Earth’s Surface in this calculation?”

    The textbook figure that is usually used for this calculation is around 15 ⁰C (15 actual measured degrees less −15 theoretical degrees yields 30 degrees of “Greenhouse Effect” warming). I would like to argue that the textbook number of 15 is higher than the true surface temperature of the Earth by more than 5 degrees C. The textbook number 15 for surface temperature of the earth comes from a merge of air and sea surface temperature records. What is universally overlooked is a vast pool of deep ocean temperatures approaching 4 ⁰C that is also part of earth’s surface temperature. It’s a fascinating phenomenon in and of itself and is due to the fact that water is densest at 4 degrees C. In effect the earth’s ocean is refrigerated. This refrigeration, as I call it, implies that there is a stream of energy exiting the system (purportedly in the Arctic) that is generally overlooked and unincorporated into the calculation of the degrees of the “Greenhouse Effect” on the Earth’s surface. Assuming straight-line proportionality, I would argue that climate sensitivity calculations based on 30 degrees of Greenhouse Effect are overestimated by 20 to 30 percent simply due to this oversight.
    For purposes of incoming energy what amounts to the Earth’s surface is simple, it is anything struck by sunlight. When considering what the Earth’s warmed surface amounts to things are a little more complicated. Nonetheless, it should include all areas influenced by both incoming sunlight and outgoing longwave radiation. This includes the deep ocean.

    [Hank Henry: Let us assume, for sake of argument, that the Atmospheric "Greenhouse Effect" is overstated, as you claim, by "20 to 30 percent". That would change my statement to "over 20 or 25 ⁰C cooler". So, even accepting your math, the Atmospheric "Greenhouse Effect" is still real. Ira]

  77. W.Fox says: I wonder, if the temperature is driving CO2 and not CO2 driving the temperature, then at what temperature drop does the amount of CO2 go down?

    Did anyone ever publish any data about that ?

    Temperature once was the main driver of CO2. That means when we look at the past and see temperature and CO2 rising together, we cannot assume that CO2 is the cause and the temperature change is the effect, and use this to infer what changes in CO2 will do to temperature today.

    However we don’t know that it is still true today that temperature alone is driving CO2. In fact we have good reason to suspect that CO2 today is also being driven by other things like the burning of fossil fuels and land use changes (e.g. desertification). If today’s higher CO2 levels are caused by multiple factors with temperature being only one of them then we cannot predict what will happen to CO2 by looking at temperature alone.

    In fact trying to predict CO2 from temperature is the same kind of mistake as trying to predict temperature from CO2.

  78. davidmhoffer says:
    March 20, 2013 at 12:06 pm

    Troll quality has declined significantly in the last two years. Perhaps this is a proxy for something?
    ————
    See, now that’d be a more worthwhile use of my tax dollars in my opinion. At least it’d be good for laughs. I’d be glad to whomp up some computer models and fiddle with data to reach whatever conclusion you’d like me to objectively study the question if we could just resolve the question of where the grant money would come from.

  79. During the last century we had two warm “pulses” and one cold, superimposed over a roughly 0.6K rise from the Little Ice Age (which was the second coldest period since we came out of the last Ice Age), each lasting very roughly 30 years. This century, we will have two cold pulses and one warm, leaving us with temperatures pretty much what they are now. This doesn’t take into consideration the unusually quiet sun, which could affect things even more to the negative side.

    How’s that for a “reasonable naive hypothesis”.

  80. The thing is that you can’t force people’s wallets open to cure a problem that might, just maybe might, be a problem in several centuries to a millennium. It has to be NOW NOW NOW OR WE ALL DIE!

  81. An off topic question for Andrew (no offence Ira).

    Why are there no ads or ‘donate’ buttons on the RC site?

    Great article btw Ira (thumbs up)

  82. Ian H says:
    March 20, 2013 at 1:14 pm

    “However we don’t know that it is still true today that temperature alone is driving CO2. “

    Yes, we do. Or, those of us savvy enough to understand what this plot is telling us do.

  83. Ian H says:
    March 20, 2013 at 1:14 pm

    “In fact trying to predict CO2 from temperature is the same kind of mistake as trying to predict temperature from CO2.”

    I absolutely can predict atmospheric CO2 levels from temperature to high fidelity. It is right here. Over the modern era, I can integrate the affine temperature relationship and get it within about 4 ppm.

    I can reverse it, and take the derivative of the CO2 to get temperature, but it is the integrated variable which must be the cause to the effect.

  84. Bart says:
    March 20, 2013 at 2:53 pm

    “Over the modern era, I can integrate the affine temperature relationship and get it within about 4 ppm.”

    I could do even better, centralizing the error, if I did an actual least squares fit to derive the affine relationship instead of just eyeballing. Furthermore, the datasets have varying fidelity. The best agreement appears to be with temperatures in the Southern Hemisphere, which suggests that this is mostly an oceanic phenomenon.

    That is perfectly reasonable, since the oceans are a giant conveyor belt for CO2 with continuous flow (from hundreds of years ago) into the surface system in the tropics, and out of it at the poles. Any imbalance between those flows will either accumulate in the surface system, or drain out of it. And, that imbalance is absolutely temperature dependent, and would give rise to exactly the relationship we see of the rate of change of CO2 as an affine function of temperature.
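
    Bart’s procedure above, fitting an affine relation between temperature anomaly and the rate of change of CO2 by least squares, can be sketched with synthetic data. The temperature series, the coupling constant k and the equilibrium anomaly To below are made-up illustrative values, not a fit to any real Mauna Loa or satellite record:

```python
import numpy as np

# Sketch of the least-squares fit Bart describes: recover an assumed
# affine relation dCO2/dt = k*(T - To) from noisy synthetic data.
rng = np.random.default_rng(0)

years = np.arange(1960, 2013)
# Synthetic temperature anomaly series (degC), hypothetical trend + noise.
T = 0.01 * (years - 1960) + 0.05 * rng.standard_normal(years.size)

k_true, To_true = 2.0, -0.1   # assumed ppm/yr per degC and equilibrium anomaly
dco2 = k_true * (T - To_true) + 0.1 * rng.standard_normal(years.size)

# Fit dCO2/dt = a*T + b, so that k = a and To = -b/a.
a, b = np.polyfit(T, dco2, 1)
k_fit, To_fit = a, -b / a

print(f"fitted k  = {k_fit:.2f} (true {k_true})")
print(f"fitted To = {To_fit:.2f} (true {To_true})")
```

    With real data one would replace the synthetic series with a satellite temperature record and the differenced Mauna Loa record; the point is only the mechanics: the slope and intercept of the straight-line fit recover k and To.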

  85. “… while this simulation was not perfect, it has shown skill in that it has out-performed any reasonable naive hypothesis that people put forward in 1988 (the most obvious being a forecast of no-change)” OK, I’m tired, but it sure looks to me like that naive hypothesis is less wrong than the applicable prediction.

  86. This might sound like a stupid question!
    Are the Mauna Loa measurements of atmospheric CO2 measured by volume? If so, can someone quantify the volume for me?

  87. philjourdan says:
    March 20, 2013 at 12:25 pm

    New age math: 22 is greater than 53 when it suits your purpose.

    Robert Anton Wilson’s law of the priority of politics over science:

    “If A is greater than B, and B is greater than C, then A is greater than C, except where prohibited by law.”

  88. At the risk of being labelled a real denier what evidence do we have that the Mauna Loa seemingly exponential rise in CO2 is human induced? To be honest I would have expected to see some impact from the GFC.

  89. e = mc^2: hint binomial expansion of sqrt( 1 – v^2 / c^2 ) from m1 = m0 * sqrt( 1 – v^2 / c^2 )

    [DesertYote: OK, since I've been studying Time Dilation in connection with my What is Time? video, I recognize "sqrt( 1 - v^2 / c^2 )" as 1/(Lorentz Factor). v is velocity of an object in Space and c is the speed of light.

    An object moving at v appears to us, if we are "at rest", to have its Length contracted by sqrt( 1 - v^2 / c^2 ), which is 1/(Lorentz Factor), and its Time dilated by the Lorentz Factor. Also, the mass of the object appears to increase by the Lorentz Factor, which you state as "m1 = m0 * sqrt( 1 - v^2 / c^2 )" but I think you meant to write as "m1 = m0 * 1/(sqrt( 1 - v^2 / c^2 ))".

    The reason no object with Mass can get up to the speed of light in Space is that it would take infinite energy to get it up to that speed, apparently because, as its speed approaches c, its mass approaches infinity. You can see that, when v approaches c, and thus v^2 approaches c^2, v^2 / c^2 approaches 1, and 1 - v^2 / c^2 approaches 0, causing 1/sqrt( 1 - v^2 / c^2 ) to approach infinity.

    So, since f = ma, and the Energy required to increase the speed of an object is proportional to f, then Energy is proportional to Force which is equal to m0 multiplied by 1/( 1 - v^2 / c^2) multiplied by a. OK, this is starting to look something like e = mc^2, but I am not quite there. Do you have a reference that will get me there? Ira]
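
    For what it is worth, the standard textbook route from the hint to e = mc^2 is the binomial expansion of the relativistic total energy (this is ordinary special relativity, not anything specific to this thread):

```latex
E = \frac{m_0 c^2}{\sqrt{1 - v^2/c^2}}
  = m_0 c^2 \left(1 - \frac{v^2}{c^2}\right)^{-1/2}
  \approx m_0 c^2 \left(1 + \frac{1}{2}\frac{v^2}{c^2}\right)
  = m_0 c^2 + \frac{1}{2} m_0 v^2
```

    The second term is the familiar Newtonian kinetic energy, so the constant first term, m_0 c^2, is interpreted as the energy the object possesses at rest: e = mc^2 with m the rest mass.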

  90. Ira’s excellent point: “Even a “naive” prediction of no change would have been closer to the truth (low by 0.22 ⁰C) than Hansen’s Scenarios A (high by +0.68 ⁰C) and B (high by 0.53 ⁰C)!” can be made another way:

    Suppose James Hansen had a contrary twin brother, Jock, who made the prediction at the congressional hearings – just after brother James (and turning the air conditioner back on as he came in) – that actually global temperatures would DROP from 1988 levels by 0.30 ⁰C on CO2 output Scenario B and by 0.45 ⁰C on CO2 output A. Jock’s COOLING predictions, though not borne out, would nevertheless be nearer to the realised outcome than those of his headline-grabbing warmist brother.

    Come on down, Jock!

  91. Greg House March 20, 2013 at 9:33 am:

    Thanks for your comment. You clearly and collegially restate arguments put forth by many sincere and generally well-informed WUWT readers who cannot accept the basic science of the Atmospheric “Greenhouse Effect” or that H2O and CO2 and other Atmospheric gases are responsible for the Earth Surface being over 30 ⁰C warmer than it would be absent that effect.

    You conclude with “Time to wake up, Ira, and look at those things really critically.”

    I appreciate your concern for my being correctly informed. I have a similar concern for you. I would appreciate it if you, and others who are having a problem accepting the Atmospheric “Greenhouse Effect” as real would read my Visualizing series, here at WUWT.

    There are five parts: Physical Analogy, Atmospheric Windows, Emission Spectra, Molecules/Photons and Light and Heat.

    This series has garnered over 70,000 page views and 3000 comments at WUWT, mostly positive. I’ve learned a lot from WUWT readers who know more than I do. However, some commenters seem to have been taken in by scientific-sounding objections to the basic science behind the Atmospheric “Greenhouse Effect”. Their objections seemed to add more heat than light to the discussion.

    My series was designed to get back to basics and perhaps transform our heated arguments into more enlightened understanding :^)

    Please have a look at why I, while I definitely am properly SKEPTICAL about the entire CAGW Enterprise, and the unscientific antics of the ALARMISTS and some WARMISTS, nevertheless accept that the Atmospheric “Greenhouse Effect” is real.

    Happy reading, then get back to me. OK?

    Ira

  92. Ira

    I have read your comments with interest. Your articles are always thought provoking and objective.

    My take on this is that unfortunately we do not have sufficient high quality data from which to form any firm view. Error bars in the data sets are not sufficiently identified and acknowledged, and there has been a lack of empirical experimentation on the true effects of CO2 in real world conditions and precisely what DWLWIR can achieve, particularly with respect to its interaction with the oceans and the atmosphere immediately above the oceans (I refer to the atmosphere say a few metres above the ocean and the top centimetre of the ocean itself). Any proxy data needs to be considered with a high degree of caution. Unfortunately the thermometer data set is not fit for purpose; it was never intended to produce a record of global temperatures accurate to within tenths of a degree, and with siting issues, station drop-outs and endless bastardisation by way of repeated adjustments the legitimacy of which is moot, it too cannot be used as a reliable base upon which to make reliable extrapolations. Indeed, the land based thermometer record is not even measuring the correct energy metric, and we lack proper data on clouds, albedo and solar irradiance.

    As a consequence, we are very much left with a gut impression as to what is going on. My gut tells me that there is some GHE caused mainly by water vapour, but the GHE is much over-hyped. I do not consider that we sufficiently understand solar irradiance, and I consider that solar irradiance is probably sufficient in itself to prevent the tropical ocean from freezing. If that is so, then global temperatures even without a GHE would probably not be less than plus a couple of degrees. Accordingly, I consider the 30 degree C figure you cite for the GHE to be probably over-hyped. I would not wish to put a figure on the GHE, but I would not be surprised if it turns out to be only about a third of the figure that you suggest.

    As regards CO2, I consider it likely that it plays some limited role over land (very much less over the oceans since water is a LWIR block) and I consider it likely that sensitivity varies with temperature, prevailing water vapour and saturation. With today’s temperatures and water vapour, I consider it probable that CO2 sensitivity is low. I would not wish to forcefully argue against your 0.25 degC figure; maybe it is a bit more, but maybe it is a bit less. We simply do not know enough about natural variation to make a good stab, but I would be reluctant to argue against your figure being a good ball park figure. My gut firmly riles against positive feedbacks; quite simply, planet Earth would not have enjoyed the relatively benign conditions that it enjoys today if feedbacks were positive, and we would not be here today, some 4.5 billion years after its creation, blogging on this topic if feedbacks were positive.

    Of course, being sceptical, I cannot rule out the possibility that the entire underlying science is fundamentally flawed. If only there had been some proper empirical experimentation on some of the fundamental issues raised, we would have a better understanding of whether that might or might not be the case.

    [richard verney: Well stated. Although I am quite a bit less skeptical than you on the basic science of the Atmospheric "Greenhouse Effect" I cannot object to anything you say here. The data is not accurate enough to support precision of anything like tenths of a degree, and the Climate System is far too complex to model or predict to that type of precision. As engineers, we understand the difference between precision and accuracy, and the fact that precision to more significant figures than is supported by the underlying accuracy is an illusion. Anyone who says they are absolutely sure about any of this stuff is almost certainly wrong :^). Thanks for this useful and collegial interchange of ideas and opinions! Ira]

  93. Ira Glickstein, PhD says:
    March 20, 2013 at 4:39 pm

    Greg is one of those people who believe the whole idea is simply false. And, that is simply false.

    But, there are serious criticisms of the GHE. Not that it does not exist, but that it is not necessarily monotonic. The GHE can exist while being locally insensitive to additional GHG.

  94. I went over Mauna Loa’s method of measuring CO2. They process the air, remove water vapor, do a bunch of fatally flawed conversions, bounce infrared light off a box full of processed dry air and measure an electrical response. Then they convert this electrical measurement into a mole fraction, a dimensionless quantity, and redefine what Air is. They shift the figures about and give lengthy BS about why, and after they have their data, they further process it by calibrating it with other data (who calibrates their data? Mauna Loa?). Then when they model the calibrated data to produce a graph that says PPM, they are dishonest and call it Atmospheric CO2. They give the impression that what they have measured is the volume or effective volume of CO2 in the atmosphere. It is NOT.

    They say so themselves “Most people assume that we measure the concentration of CO2 in air, and in communicating with the general public we frequently use that word because it is familiar”

    This process can be controlled, even inadvertently, to produce results that give the impression of an increasing mole fraction over a time scale by processing normal variability. Effectively, the processed data produced is artificial and can in no way be used to quantify what atmospheric CO2 is on the planet (and I certainly wouldn’t use it to predict future climate temperatures). What can you measure an artificial value to?

    It also seems that there are assumptions built into their data of fossil fuel use, just like the CO2 based climate models.

  95. These are the actual effective forcing numbers that Hansen used.

    http://www.realclimate.org/data/H88_scenarios_eff.dat

    If you compare that to what IPCC AR5 is using for the increase in net forcing since 1984 (RCP 6.0 scenario, which is the most realistic trend), it is almost identical to the increase in forcing Hansen used in Scenario B. So B was an accurate description of the forcing which actually occurred (despite the spin from RealClimate or Skeptical Science).

    http://www.pik-potsdam.de/~mmalte/rcps/data/RCP6_MIDYEAR_RADFORCING.DAT

    http://www.pik-potsdam.de/~mmalte/rcps/data/RCP6_MIDYEAR_RADFORCING.xls

    A couple of other Hansen 1988 data sources: the temperature anomalies under the scenarios and the GHG concentrations.

    http://www.realclimate.org/data/scen_ABC_temp.data

    http://www.realclimate.org/data/H88_scenarios.dat

    Scenario B temp in 2012: +1.065C; GISTemp: +0.560C
    Predicted increase in temps: +0.983C; actual increase (GISTemp): +0.44C

    Hansen also updates his own 1988 predictions here.
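
    As a rough cross-check of the forcing comparison above, the widely used simplified expression dF = 5.35·ln(C/C0) W/m² (the Myhre et al. 1998 form adopted by the IPCC, not Hansen’s exact 1988 Appendix B formula) can be applied to rounded Mauna Loa values:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. 1998 form)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Approximate, rounded Mauna Loa annual means (ppm): ~344 in 1984, ~394 in 2012.
delta_f = co2_forcing(394.0, 344.0)
print(f"CO2-only forcing increase 1984-2012: {delta_f:.2f} W/m^2")
```

    This gives roughly 0.7 W/m² of CO2-only forcing over the period, which readers can compare against the scenario files and RCP tables linked above.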

  96. Regarding Hansen’s ABC emission scenarios, here is an insightful comment at Climate Audit a few weeks ago-

    Climate Audit comment, 03/02/2013
    Kurt in Switzerland

    “Ref. Scenario B being the most reasonable comparandum to actuals:

    I disagree. Scenario A corresponds to 1.5% annual CO2 emissions growth. Actuals have been about 1.9% per annum since 1990, i.e., above Hansen’s Scenario A.

    Hansen’s CFC growth rate for Scenario A was 3% p.a.

    The additional 0.4% p.a. growth rate in CO2 would compensate for the lack of CFC growth rate, n’est-ce pas?

    Furthermore, Hansen’s Scenario B called for the growth rate of CO2 emissions to decline from 1.5% per year {1988} to 1% per year in {1990}, 0.5% per year {2000} and 0.0% per year {2010}. Clearly this didn’t happen.

    From a cursory look at Hansen et al 1988 Appendix B, the estimated forcing due to both CFC’s mentioned (F-11 and F-12) rising from 0.0 to 2.0 ppb is about 1/4 that due to CO2 doubling (315-630 ppm). The lack of this should approximately cover the underestimate of CO2 emissions growth.
    I’ll consider CH4 second order (and much tougher to calculate). Given all the media screaming about methane clathrates being “freed” due to unprecedented arctic warming, one would think actual methane emissions had exceeded Hansen’s worst fears.

    So Scenario A is the closest to reality, not Scenario B (it would seem to me).

    Either that, or the net UPTAKE by the atmosphere as a result of said emissions was poorly estimated. But that would be a different story. Hansen’s Scenarios are defined primarily by human emissions, not by atmospheric concentration per se.
    Kurt in Switzerland”
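
    Kurt’s “about 1/4” figure can be sanity-checked with modern simplified forcing values rather than Hansen’s 1988 Appendix B expressions. The CFC radiative efficiencies below (about 0.25 and 0.32 W/m² per ppb for F-11 and F-12) are the commonly tabulated IPCC-era values, so treat this only as a ballpark comparison:

```python
import math

# CO2 doubling (315 -> 630 ppm) with the simplified 5.35*ln(C/C0) form.
f_co2_doubling = 5.35 * math.log(630.0 / 315.0)

# F-11 and F-12 each rising 0 -> 2 ppb, at assumed radiative efficiencies
# (~0.25 and ~0.32 W/m^2 per ppb, the commonly tabulated values).
f_cfc = 0.25 * 2.0 + 0.32 * 2.0

print(f"CO2 doubling: {f_co2_doubling:.2f} W/m^2")
print(f"CFCs:         {f_cfc:.2f} W/m^2")
print(f"ratio:        {f_cfc / f_co2_doubling:.2f}")
```

    This gives a ratio of roughly 0.3 rather than exactly 1/4, which is close enough to support the order-of-magnitude argument in the quoted comment.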

  97. Ira Glickstein, PhD says, (March 20, 2013 at 4:39 pm): “I appreciate your concern for my being correctly informed. I have a similar concern for you. I would appreciate it if you, and others who are having a problem accepting the Atmospheric “Greenhouse Effect” as real would read my Visualizing series, here at WUWT. … This series has garnered over 2000 comments at WUWT, mostly positive. I’ve learned a lot from WUWT readers who know more than I do. However, some commenters seem to have been taken in by scientific-sounding objections to the basic science behind the Atmospheric “Greenhouse Effect”.”
    ============================================================

    “Basic science”, Ira? Let me give you an example. 2×2=4 is basic math. 2×2=5 is not basic math; it is not science at all, it is false. I do not think there is a reason to call warming by back radiation (this is the “greenhouse effect” as presented by the IPCC) science. It is false, Ira. Of course, you are entitled to a different opinion.

    To your articles, I am already familiar with the concept of “warming by back radiation” and find it proven false on both the theoretical and experimental level. As I said, on the experimental level it has been known since the R. W. Wood experiment; maybe you need to read his paper. It is really easy reading. Second, on the theoretical level, in a certain case of a body at a stable temperature starting to receive back radiation according to the “greenhouse effect” concept, the result would be that more energy is radiated away than there is in the system, and this proves that the initial assumption (“greenhouse effect”) is false.

    I guess I’d better not wait till you start digging yourself and give you a link to that second point. Note that it is about a theoretical setup our beloved Willis invented to illustrate how “greenhouse effect” is supposed to work: a planet with a constant internal power supply surrounded by a sphere. Here is the link to my comment about that on another blog: http://climateofsophistry.com/2013/03/08/the-fraud-of-the-aghe-part-11-quantum-mechanics-the-sheer-stupidity-of-ghe-science-on-wuwt/#comment-828. Well, the actual post starting that thread is overheated, but please, focus just on the scientific point.

    [Greg House: I have read and considered the R. W. Wood experiment material and nevertheless remain convinced of the basic truth of the Atmospheric "Greenhouse Effect" and that the radiation exchanged between the Earth Surface and the "Greenhouse Gases" in the Atmosphere has been properly accounted for.

    However, as I expressed above in my agreement with Richard Verney, I do not accept the levels of precision claimed by some Climate Scientists. That is why my estimate of Climate Sensitivity has such a large range, from 0.25 ⁰C to 1 ⁰C, and even my high estimate is a factor of three less than the IPCC and a fourth of Hansen 1988. It is also why I think net warming since 1880 is closer to 0.5 ⁰C than the 0.8 ⁰C or higher claimed by the official Climate Team ("hockey team").

    So, I side with "our beloved Willis" and the proprietor of WUWT, and scientists such as Roger Pielke (Sr and Jr :^), and the other skeptics and lukewarmers who seem to me to have taken a detailed and rational approach to this subject area. Ira]

  98. Bart says:
    March 20, 2013 at 10:31 am
    >>>>
    richard verney says:
    March 20, 2013 at 8:37 am

    “If one looks at the 33 years of satellite data then there would appear to be no first order correlation between temperature rise and the rise in CO2 emissions.”

    But, there is a definite correlation between the rise in CO2 concentration and temperature.
    //////////////////////////////////////////////////////

    Your referenced data proves my point. No mathematician worth his salt would plot a straight line through the temperature anomalies between 1977 and the end of 2012. A mathematician would look at that data set and say that there is an approximate straight line fit at around the +0.12C level for the period 1977 to 1996 and another straight line fit at around the +0.18C level for the period 2000 to the end of 2012. The mathematician would note the anomaly around 1997 to 1999 where a step change takes place; that step change is not brought about by CO2 unless CO2 caused the super El Nino of 1998, and as I said, no one suggests that CO2 levels were responsible for that event.

    If you wish to consider the point I made in more detail I would suggest that you split your plot in two. First consider the period 1979 to say 1997 and then consider the period 1999 to date.

    As far as your CO2 derivative goes, are you seriously suggesting that, as a consequence of manmade CO2 emissions, the global derivative in 1997 was 0.02 but in 1998 it was 0.31? What evidence is there for a manmade emission change of this order in the year 1997 to 1998? Please provide your data on annual manmade emissions supporting such a change in these years.

    I think that what you are looking at in the CO2 derivative is a response, not a cause. The ocean heat release has out-gassed CO2. CO2 has not driven the temperature spike; rather, what you are seeing in the derivative is the CO2 response to the El Nino temperature change.

  99. Dr. Will Happer (Physicist), Princeton: “I have spent a long research career studying physics that is closely related to the greenhouse effect. Fears about man-made global warming are unwarranted and are not based on good science.”

    Dr. Lee C. Gerhard (Geologist), UN IPCC expert reviewer: “I never fully accepted or denied the anthropogenic global warming (AGW) concept until the furor started after [NASA's James] Hansen’s wild claims in the late 1980s. I went to the scientific literature to study the basis of the claim, starting at first principles. My studies then led me to believe that the claims were false, they did not correlate with recorded human history.”

    Dr. Nicholas Drapela (Chemist): “My dear colleague [NASA’s James] Hansen, I believe, has finally gone off the deep end… The global warming ‘time bomb,’ ‘disastrous climate changes that spiral dynamically out of humanity’s control.’ These are the words of an apocalyptic prophet, not a rational scientist.”

    Now *that’s* a model that doesn’t need updating…..

  100. Hansen’s Scenario A appears to be the best fit for CO2 emissions, but I’ve heard it argued by lukewarmers that Scenario B is preferred because Scenario A includes increasing levels of CFCs, which instead have decreased thanks to the Montreal Protocol, which applied from around the time of the prediction.

    So to accept that Scenario B is the most apt means accepting that CFCs had about as much impact on the climate as CO2 did. That doesn’t seem very plausible to me, and so I think Hansen’s Scenario A is closer to his true prediction than Scenario B is.

  101. As I said, on the experimental level it has been known since the R.W.Wood experiment, maybe you need to read his paper. It is really easy reading.
    >>>>>>>>>>>>>>>>>>>>

    What the Wood experiment demonstrates is precisely the same thing that Mann’s hockey stick graph demonstrates. That, on both sides of the debate, there are those who justify their world view on the flimsiest of evidence and cling to it in the face of overwhelming factual evidence to the contrary.

  102. This graphic, with excerpts from the recent Marcott paper, is, it appears, pretty interesting.

    When you match the main Marcott temp history graphic against graph “D”, the ice core GHG measured Radiative Forcing, things get interesting. The thin green line is 1950 (“present” for BP).

    The GHG measured Radiative Forcing bottomed appx 8000 years BP and was relatively flat for appx 1000 years, after which it began climbing, relatively uniformly, and has continued climbing since. About 2000 years BP the radiative forcing level flattened, similar to 8000 yr BP, until a sharp increase between 1000 and 800 years BP, after which, from 800 years BP to the end of the data, the GHG measured radiative forcing was decreasing.

    What is interesting to me is that at virtually identical times, the GHG measured radiative forcing bottomed and began climbing, AND there is a clearly defined warming pulse/peak in the Marcott data. Since that time GHG measured radiative forcing has been increasing while the Marcott demonstrated temps have been decreasing.

    It is also interesting that the same pattern seems to exist at the 8000-7000 yr BP and 2000 to 1000 yr BP points. The GHG measured radiative forcing flattens for appx 1000 years, during which time (7000 and appx 1200 yrs BP) there is a noted, short-lived temp spike, followed by accelerated cooling.

    In the time period appx 50 to -50 yrs BP (1900-2000) – ignoring the Marcott “spike” which it appears is not supported by the data, we see a similar warm spike beginning in the Marcott graphic. We also know this matches the temp record for the same period, with temps reaching a peak appx 1997 and leveling since.

    Despite that GHG measured radiative forcing increased from 7000 BP through the end of the data presented, and that we know GHGs have continued to increase steadily the last 100 years and more, Marcott shows overall temps have continued to fall from the peak appx 7000 BP.

    We also know that after climbing from late 1800 thru late 1990’s, while GHG’s continue their rise, temps have leveled for the last appx 17 years.

    While this is admittedly “eyeball” science, and assumes Marcott’s charts are accurate, it certainly appears Marcott shows that GHG measured radiative forcing is clearly not a driver of temperatures, at least not a driver of temp increases.

    Or I could be missing something altogether … ;-)

    http://tinyurl.com/marcott-ghg

  103. It looks like Hansen has shown that whatever was causing climate forcing ceased perhaps a little before 2000. If his methodology has any sound basis, perhaps he demonstrated that CO2 was never causing the temperature increase he was tracking. He has possibly presented a pretty good argument against assigning temperature increases of the past few decades to CO2 increases.

  104. Hah, I just spotted the “Teams” way out.
    Upthread someone was critiquing CO2 measuring methods at Mauna Loa; there’s the escape clause.
    “We have spotted an error in the CO2 record, emissions actually dropped in 1998 and have stabilized/fallen ever since.” Therefore Scenario C is 100% accurate, honest.
    Did I need sarc?

  105. richard verney says:
    March 20, 2013 at 5:49 pm

    “As far as your CO2 derivative, are you seriously suggesting that as a consequence of manmade CO2 emissions the global derivative in 1997 was 0.02 but in 1998 it was 0.31.”

    You appear to be missing my point entirely. “Manmade CO2 emissions” are having no discernible effect on CO2 levels at all. It is essentially all being driven by temperature. The differential equation describing the relationship is

    dCO2/dt = k*(T – To)

    dCO2/dt = derivative of atmospheric CO2 concentration
    k = coupling constant in ppm/degC/unit-of-time
    T = global temperature anomaly
    To = equilibrium level for global temperature anomaly

    “I think that what you are looking at when you are looking at in the CO2 derivative is a response, not a cause.”

    Exactly!
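
    Bart’s differential equation can be integrated numerically to show what “integrating the temperature” means in practice. The temperature series, k and To here are idealized stand-ins, not real data or fitted values:

```python
import numpy as np

# Forward-Euler integration of the stated relation dCO2/dt = k*(T - To),
# with hypothetical parameter values, as a sketch of the mechanics only.
k, To = 2.0, -0.1            # assumed ppm/yr per degC; equilibrium anomaly
dt = 1.0                     # one-year step

years = np.arange(1960, 2013)
T = 0.01 * (years - 1960)    # idealized linear warming, degC

co2 = np.empty(years.size)
co2[0] = 315.0               # starting concentration, ppm
for i in range(1, years.size):
    co2[i] = co2[i - 1] + k * (T[i - 1] - To) * dt

print(f"Modelled CO2 in 2012: {co2[-1]:.1f} ppm")
```

    Reversing the procedure, differencing the resulting co2 array, recovers k*(T - To) exactly, which is the sense in which the derivative of CO2 tracks temperature under this model.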

  106. Bart says:
    March 20, 2013 at 10:31 am

    Ira comments:

    [Bart: Have a close look at the Mauna Loa CO2 data (second graphic above) and tell me where CO2 has "leveled out". Yes, since CO2 has its small seasonal ups and downs there are some months where it is level, but, on a smoothed yearly basis it seems to me to be a continued rapid increase that is slightly exponential in the upward direction. Ira]

    The slope has leveled out. Look right there at the end where temperatures are leveling out, too. That is what I said: “But, the rate of change of CO2 has leveled out, in lockstep with the leveling out of temperatures.” Rate. Of. Change. Not the concentration itself, but its rate of change.

    [Bart: What does that link or your mathematical jibber-jabber have to do with the past century? Or with the Atmospheric "Greenhouse Effect" that makes the Earth Surface at least 30 ⁰C warmer than it would be if the Atmosphere lacked "Greenhouse gases"? I read your words four times and visited your link. Please explain your point in English. advTHANKSance. Ira]

    Just because the GHE heats up the surface above what it would be without the GHGs in the atmosphere does not mean that adding more of a particular GHG will increase it further. It is the difference between a global feature of a function and local behavior. For example, the function

    y = 3*x – 6*x^2 + 4*x^3

    is a generally increasing function of x. But, between about 0.4 and 0.6, it is essentially flat. At 0.5, the local derivative is zero. A small change in x hardly changes the function at all.

    Similarly, the functional dependence of surface temperature on increasing CO2 appears to be such that, though it warms the surface above what it otherwise would be, we are at a point on the function where adding more doesn’t lead to a significant increase in temperature.
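
    Bart’s example can be checked directly; the snippet below just evaluates the cubic he gives and its derivative:

```python
# Evaluate the example function y = 3x - 6x^2 + 4x^3 and its derivative
# dy/dx = 3 - 12x + 12x^2, which vanishes at x = 0.5.
def y(x):
    return 3 * x - 6 * x**2 + 4 * x**3

def dy(x):
    return 3 - 12 * x + 12 * x**2

print(dy(0.5))           # local derivative at the flat spot
print(y(0.6) - y(0.4))   # change across the flat region is tiny
print(y(1.0) - y(0.0))   # yet the function rises by 1 over [0, 1]
```

    The derivative is exactly zero at x = 0.5, and the change across [0.4, 0.6] is less than one percent of the rise over [0, 1], which is the local-versus-global distinction being made.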

    [Bart: I have to agree with Richard Verney that there is no FIRST ORDER correlation between temperature rise and the rise in CO2 emissions. Please don't simply point to some graph. Explain what you mean so even Richard and I can understand. advTHANKSance. Ira]

    Not emissions! Concentration! Emissions are hardly doing anything at all. You can essentially ignore them. It is clear that they are rapidly being sequestered or otherwise transported out of the surface system by ocean, mineral, and biological processes.

    Those flows are minuscule compared to natural flows. Temperatures are driving the rate of change of CO2. It is a continuous flow problem, in which the differential rate between sources and sinks is being modulated by temperature, as I described here on this page.

    Thanks for responding. I hope I have made things clearer and that someone will begin to appreciate what is so patently obvious in the data: Atmospheric CO2 concentration is not being driven significantly by human inputs. It’s the temperature modulation of a continuous transport system.

  107. Ira-
    Since the last time Hansen et al. ’88 was the topic here at WUWT, I have been studying the paper and doing some calculations and plotting. Unfortunately, my lack of internet skills has prevented me from preparing a post to submit.

    The main point I want to make is that there is a fourth scenario that nobody mentions. It is shown in the very first figure of Hansen ’88. This scenario is the 100 year control run. It is called the “control run” because all the ghg’s were held constant at the 1958 values. That’s right, the control run keeps the ghg’s constant for 100 years (to 2058).

    However, to accurately compare Hansen’s scenarios with measurements, it is necessary to compare the ghg’s in the scenarios with the ghg’s estimated from measurements. The tabular listing of the ghg’s is at Real Climate at:

    http://www.realclimate.org/data/H88_scenarios.dat

    The NOAA estimated ghg’s in graphical form by year are at:

    http://www.esrl.noaa.gov/gmd/aggi/aggi_2011.fig2.png

    When these are compared it can be seen that Hansen’s Scenario C is a good match for everything but CO2. Up until 2010, Scenario B was the best estimate of CO2. Fortunately, Hansen’s paper presents a method for adjusting for these differences. In the words of the paper: “The forcing for any other scenario of atmospheric trace gases can be compared to these three cases by computing the [delta t] with the formulas provided in Appendix A.”

    Using the Appendix B formulas for CO2, the difference between the CO2 delta T for Scenarios B and C can be calculated. This difference can then be added to the delta T for Scenario C to obtain a Scenario that is approximately correct for all ghg’s Hansen considers.

    For 2010 this adjustment for CO2 adds about 0.11 deg C to the delta T for Scenario C.

    I chose to compare the adjusted Scenario C to the UAH satellite measurements. However, before that can be done, the Hansen scenarios must be adjusted to the same base years as the UAH measurements (1981-2010). You must use the control run to do this. Then add the delta T differences between the control run and the various scenarios to the adjusted control run delta T for each year to obtain the scenarios on an ’81 to ’10 basis.

    When this is done the following temperature anomalies are obtained for 2010 (using a 5yr, centered average):

    Scenario C adjusted for CO2 and base year = 0.63 deg. C.
    UAH = 0.18 deg C.
    Control run = 0.08 deg C.

    The fact that the 2010 temperature anomaly based on measurements is 0.45 deg. C. lower than the adjusted Scenario C, but only 0.1 degree above the Control run shows that the model is completely useless in predicting future temperature, and that the methodology for determining the effects of ghg’s is deeply flawed.

    The key is to include the control run in all scenario comparisons.
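A minimal sketch of the comparison arithmetic above (Python for illustration only; the 0.11 ⁰C CO2 adjustment and the rebasing to the UAH 1981-2010 base are taken as given from the comment, not recomputed here):

```python
# Back-of-the-envelope check of the 2010 comparison quoted above.
# All values in deg C, 5-yr centered averages, as given in the comment.
scenario_c_adjusted = 0.63  # Scenario C + 0.11 CO2 adjustment, rebased
uah = 0.18                  # UAH satellite anomaly (61-month average)
control_run = 0.08          # control run (ghg's held at 1958 values), rebased

model_excess = scenario_c_adjusted - uah  # model overshoot vs. measurements
natural_excess = uah - control_run        # measured warming above "no change"

print(f"model overshoot above UAH: {model_excess:.2f} deg C")  # 0.45
print(f"UAH above the control run: {natural_excess:.2f} deg C")  # 0.10
```

The point of the comment is the asymmetry these two numbers show: the measured anomaly sits 0.45 deg C below the adjusted scenario, but only 0.10 deg C above the no-change control run.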

    • Leave it to an (Old) Engineer to point out something so lost in the hustle and bustle, and to point it out so clearly.

  108. Old Engineer says:
    March 20, 2013 at 10:56 pm

    Nice. The evidence is piling up – AGW is dead. The only question is how long its inertia will keep it going.

    If anyone wants to know what the next 30 or so years are going to look like, appropriately offset and graft the portion of this plot from about 1945 to 1975 onto today’s temperatures. And, for CO2, expect the rate of change to decrease with the temperatures, as it is already doing.

  109. AGW is a Crime against Humanity and the perpetrators should be pursued through the courts. Whinging in the press does nothing to ameliorate the hurt AGW has imposed on countless millions by impoverishing them with gargantuan fuel bills and even starvation. Hansen et al. should be held to account. Why, it seems, is no one really mad with ANGER!

  110. Re: Hansen.

    Hansen abandoned interest in Venus, which was his first area of research, at the precise point when atmospheric conditions became measurable. That is, when whatever speculations he had advanced could be verified empirically. Or disproved.

    He claims that just at the point where this was going to occur, he suddenly decided to focus, effective immediately, on the earth and AGW. Without knowing whether the fruits of years of work were vindicated.

    This is utter bullshit.

    People simply do not behave this way.

    If someone with sufficient expertise looks at his published papers prior, and the actual measurements obtained, it will be obvious that he was wrong. At a guess, wildly, undeniably, wrong.

    Thus his subsequent life: determination never to be shown wrong achieved through his control and manipulation of data.

  111. Hansen didn’t include the effects of the sun and ocean cycles on multi-decadal time scales throughout the 20th century; he simply assumed that all late 20th century warming was being caused by human greenhouse emissions, and the failure of his forecast since then has shown his theory to be wrong.

  112. It’s not surprising that RC would use a red herring to further their illogical argument. The need to use a “no warming” comparison is just plain silly. It completely ignores the most likely comparison of ocean oscillations. When the PDO/AMO are considered the current warming is almost a perfect match. Their dishonesty is typical of the alarmist mind.

  113. Looking at the “Adjusted Data”, in the RC article, I wonder how Roy Spencer feels on how his UAH data was “adjusted”.

  114. Ira
    Good graph & explanation.
    May I suggest comparing Hansen’s predictions with a first order null hypothesis of continued warming from the Little Ice Age. e.g. see
    Syun-Ichi Akasofu, On the recovery from the Little Ice Age
    Natural Science Vol.2, No.11, 1211-1224 (2010) doi:10.4236/ns.2010.211149
    Openly accessible at http://www.scirp.org/journal/NS/

    On CO2, Fred H. Haynie provides fascinating analyses of CO2 versus latitudinal temperatures in The Future of Global Climate Change. e.g. in slide 10 he shows polar CO2 driven by the respective polar ice extent.

  115. Old Engineer March 20, 2013 at 10:56 pm wrote:

    the following temperature anomalies are obtained for 2010 (using a 5yr, centered average):

    Scenario C adjusted for CO2 and base year = 0.63 deg. C.
    UAH = 0.18 deg C.
    Control run = 0.08 deg C.

    The fact that the 2010 temperature anomaly based on measurements is 0.45 deg. C. lower than the adjusted Scenario C, but only 0.1 degree above the Control run shows that the model is completely useless in predicting future temperature, and that the methodology for determining the effects of ghg’s is deeply flawed.

    BRILLIANT! Correcting Hansen Scenario C for the true atmospheric CO2 you get 0.63 ⁰C warming, but the true warming, based on UAH (satellite sensors), is only 0.18 ⁰C.

    Therefore, Hansen Scenario C, corrected for CO2, predicts a temperature rise 0.63/0.18 = 3.5 times too high. Thus, Hansen’s assumed Climate Sensitivity of 4.2 ⁰C is too high by a factor of 3.5. Had he assumed a Climate Sensitivity of 1.2 ⁰C, his Scenario C would have made a better prediction.

    This confirms my contention that Climate Sensitivity is closer to 1 ⁰C than the IPCC’s 3 ⁰C.

    Ira
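The scaling argument in the comment above is simple proportionality; a quick sketch (Python for illustration; it assumes warming scales linearly with sensitivity, which is the simplification the comment itself makes):

```python
# If predicted warming overshoots observed warming by a factor r, then a
# sensitivity assumed at 4.2 deg C per doubling implies roughly 4.2 / r,
# under the simplifying assumption that warming scales linearly with
# sensitivity. Values are as quoted in the comment above.
predicted = 0.63            # adjusted Scenario C warming, deg C
observed = 0.18             # UAH warming, deg C
assumed_sensitivity = 4.2   # Hansen 1988 equilibrium sensitivity, deg C

overshoot = predicted / observed
implied_sensitivity = assumed_sensitivity / overshoot

print(f"overshoot factor:    {overshoot:.1f}")              # 3.5
print(f"implied sensitivity: {implied_sensitivity:.1f} deg C")  # 1.2
```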

  116. Ira-
    It was very late (for my time zone) last night when I saw your post, so my comments were brief. I didn’t get a chance to thank you for bringing the RealClimate post to our attention.

    Thanks also for so forcefully pointing out that the RC post was just plain wrong: that the “naive” prediction of no change, which is exactly what Hansen’s own Control Run is (no change since 1958!), is closer to the measured temperature anomaly than any of Hansen’s scenarios.

    As you pointed out, “Hansen 1988 is the keystone for entire CAGW Enterprise.” So it needs to be hammered home again and again, that the premise on which this keystone is based has been shown to be wrong.

    I’m not sure I like the “climate sensitivity” way of looking at ghg effects. But since there is a long history of using climate sensitivity, I certainly agree with you, it is probably close to 1. I think others have also come to same conclusion using different approaches.

  117. THERE HAS BEEN NO ACTUAL “CURTAILMENT OF TRACE GAS EMISSIONS”

    As everyone knows, the Mauna Loa measurements of atmospheric CO2 proves that there has NOT BEEN ANY CURTAILMENT of trace gas emissions. Indeed, the rapid increase of CO2 continues unabated

    You have made the same error that McIntyre made several years ago at CA. Look carefully at Fig 2: you’ll see that the difference between scenarios A & B is less than 0.1ºC by now. Hansen clearly showed that the main cause of temperature increase up to now would be due to the ‘trace gases’ if they continued to increase at the then rates. This is clearly stated in the abstract and the intro para.

    [Phil: Fig 2 of what? My WUWT posting of the RealClimate posting I got the base graphic from? Yes, according to the RealClimate graphic (which is the first graphic of my posting) there is only a small difference between A and B now, and BOTH are WRONG (as compared to ACTUAL), and BOTH assume continued "trace gas emissions", and all Hansen Scenarios (A, B, and C) assume a Climate Sensitivity of 4.2 ⁰C. The ONLY one of Hansen's scenarios that is close to the ACTUAL is Scenario C, which is ONLY 31% high. What is your point? What is your evidence? Do they teach clarity at Princeton? Please state your points clearly and I will try to respond. advTHANKSance. Ira]

  118. @ Bart says: March 20, 2013 at 7:47 pm
    //////////////////////////////////////////////////////
    Bart

    It is often said that CO2 is a “well mixed” gas. But the accuracy of that claim depends upon the meaning one ascribes to ‘well’. Factually, is it ‘well mixed’ or only ‘reasonably well mixed’? The latest satellite data suggests that it is not what I would classify as being ‘well mixed’ but instead it is rather lumpy (for want of a better expression). Indeed, Moustafa Chahine of the NASA Jet Propulsion Laboratory (JPL), AIRS’s principal scientist, in a press conference partly conceded as much. He said: “Contrary to prevailing wisdom, CO2 is not well mixed in the mid-troposphere,”

    Of course there are large variations in the concentration of CO2 by geographical location (i.e., between each hemisphere), and this variation can become more marked depending upon the seasons. Even the concentration of CO2 varies quite significantly from night to day, and this daily variation is more marked or less marked again depending upon geographical location. There are also altitude variations, etc., etc. These variations can exceed 20 ppm (say 5 or so percent).

    Having read your comments (unfortunately only quickly, and I probably have not done them justice), it appears that we are talking at cross purposes. I am looking at the response to CO2 on a multidecadal time frame, whereas you appear to be looking at the response measured upon a much smaller time frame (measured in months). I understand your point, but I remain unsure of the extent to which it deals with climate sensitivity on a multidecadal basis. Are you not looking more at seasonal variations, and of course, seasons and temperatures have a close correlation?
    My bottom line take is that you cannot begin to ascertain climate sensitivity until you first know and understand everything there is to know about natural variation, and in particular its bounds. Until that is known, it is not possible to separate the CO2 signal from noise. Presently, none of the data sets is fit for purpose; they are low resolution and far too noisy. In addition, can they even be relied upon, given the endless adjustments, the need for and correctness of which is moot?

    The only thing we really know about climate sensitivity is that natural variation is stronger; it can trump climate sensitivity. Proof: (1) there has been no warming these past 17 years, since downward forcings associated with natural variation have equalled the upward forcing component of climate sensitivity to CO2 during this 17 year period; (2) there was cooling between say 1940 and say 1975, because the downward forcings associated with natural variation more than equalled the upward forcing component of climate sensitivity to CO2 during that period, such that the strength of natural variation forcings was able to drive temperatures down even in the face of the warming effect of CO2!!

    If climate sensitivity is high, then in view of (2) above, we know that natural variation is even stronger!

    Of course, I appreciate that you consider that CO2 can do nothing and it may be that is the case, or may be it is the case given the saturated levels already reached today. I myself am unable to make any firm assertions mainly because of the poor quality data available, and its error bars, coupled with a lack of empirical observational experimentation on the issues raised. I feel that we are all groping around in the dark.

  119. I don’t think the Earth’s atmosphere increases temperature by 30 C. The Moon has an average surface temperature of about -5 C, and for Earth it is about 14 C. The Moon has no atmosphere. Atmospheric pressure alone increases temperature. A case in point is Venus. I am willing to bet that if the Moon had a thick atmosphere composed only of oxygen and nitrogen, with no CO2, water vapor or any other greenhouse gas, the Moon would have an average temperature of at least 10 C.

  120. amoorhouse says:
    March 20, 2013 at 8:19 am
    —————————————————————————–
    Well said. Nothing they said and did was close. However, they have succeeded in severely damaging the world’s economy.

  121. I THINK the calculated sensitivity with *no positive feedback* is 0.4C and that I would say is right.
    Water is as far as I can see a massive NEGATIVE feedback system but with multi-decadal lags due to oceanic thermal inertia so it tends to lead to oscillatory behaviours.

  122. “BRILLIANT! Correcting Hansen Scenario C for the true Atmospheric CO2 you get 0.63 °C warming, but the true warming, based on UAH (satelite sensors) is only 0.18 °C.”

    Some funny arithmetic here. OE says the adjustment for CO2 is 0.11 °C. You say Scenario C had a rise of 0.29 °C. That makes 0.4, as I understand it.

    The rise in GIStemp LOTI was 0.393°C. Looks pretty good to me!

    You say UAH showed 0.18°C warming? Looks like 0.41°C to me.

  123. richard verney says:
    March 21, 2013 at 12:07 pm

    “…whereas you appear to be looking at response measured upon a much smaller time frame (measured in months).”

    Well, 55 years is 660 months. If you call that a “small” time frame, then…¯\_(ツ)_/¯

    “Are you not looking more at seasonal variations, and of course, seasons and temperatures have a close correlation?”

    It is 55 years in which the CO2 derivative has matched the temperature anomaly perfectly. That 55 years saw the major part of the increase in atmospheric concentration from “pre-industrial” levels.

    I’m wondering if you understand that the derivative contains all the information needed to reconstruct the entire change during the time interval it is plotted? I did that here. That’s all you need to reconstruct the entire delta-CO2 in the last 55 years to high fidelity: temperature. You don’t need human inputs. Their contribution is therefore necessarily insignificant.
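Bart's claim is that integrating a temperature-modulated rate reproduces the CO2 change without any emissions term. A toy sketch of that kind of reconstruction follows; the temperature series is synthetic, and the rate constant k and baseline T0 are invented for illustration, not fitted values:

```python
import math

# Toy model: dC/dt = k * (T - T0). Integrate a temperature-modulated rate
# over 1958-2012 to get a cumulative delta-CO2. The temperature series is
# synthetic (trend plus wiggle); k and T0 are purely illustrative.
years = range(1958, 2013)
T = [0.01 * (y - 1958) + 0.1 * math.sin(0.5 * (y - 1958)) for y in years]

k, T0 = 3.0, -0.1           # ppm/yr per deg C, baseline anomaly (assumed)
rate = [k * (t - T0) for t in T]

delta_co2, total = [], 0.0
for r in rate:              # simple one-year Euler integration
    total += r
    delta_co2.append(total)

print(f"reconstructed rise over {len(years)} years: {delta_co2[-1]:.1f} ppm")
```

With real data the test of the claim would be whether the integrated series tracks the Mauna Loa record; this sketch only shows the mechanics of the integration, not that result.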

  124. Nick Stokes says:
    March 21, 2013 at 3:59 pm
    =================================================================
    Nick-

    Perhaps I didn’t put enough description on my values.

    First, the anomaly values I gave were 5 years averages, just as Hansen did in Figure 3 of Hansen ’88.

    Second, they were at year 2010 (so the average was from 2008 to 2012 inclusive)

    Next, the 0.11 deg. C. was the increase in the Scenario C 2010 temperature anomaly that would result if the Scenario C 2010 atmospheric CO2 concentrations were at the same level as the Scenario B CO2 concentrations. This was calculated from the equations in Appendix B of Hansen ’88, using the method Hansen presented. Thus, the 0.11 should be added to the Scenario C 2010 temperature anomaly shown in Figure 3 of Hansen ’88.

    Finally, note that to compare Hansen’s scenarios to UAH, it is necessary to adjust the base to the UAH base, which is 1981 to 2010. Hansen’s base was the average of the 100 year Control Run. To do this, the Control Run is adjusted to an ’81 to ’10 base, then the differences between the original Control Run and the three scenarios are added to the adjusted Control Run to get the adjusted scenarios. The adjusted values are slightly different from the Hansen values shown in Ira’s graph.

    As for the UAH temperature anomaly, the 0.18 is the 61 month average, not the yearly value.

  125. Ira, the Fig 2, abstract and Introduction I referred to are from Hansen (88), the paper that’s under discussion! I naively assumed that you’d actually read the paper whose results you’re critiquing. That you apparently haven’t explains some of your misunderstandings.

  126. The “predictions” which various bloggers attribute to Hansen are not predictions but rather are projections. Though they are often conflated, the term “prediction” and the term “projection” have differing meanings. To conflate the two terms is to foster deceptive arguments on the methodology of the international study of global warming via the equivocation fallacy.

  127. Terry Oldberg;
    Though they are often conflated, the term “prediction” and the term “projection” have differing meanings.
    >>>>>>>>>>>>>

    Darn right. One is based on data with error bars and a defined precision and the other is an excuse for not having either.

    • davidmhoffer:

      Thanks for taking the time to reply. There are interesting differences between models that make predictions and models that make projections. One is that a model of the former type conveys information to a policy maker about the outcomes from his or her policy decisions while a model of the latter type conveys no information. Thus, while a model of the former type is suitable for making policy, a model of the latter type is completely unsuitable. In AR4, the models that are cited by the IPCC as the basis for making policy on CO2 emissions convey no information to policy makers about the outcomes from their policy decisions; thus these models are completely unsuitable for making policy.

      Though being completely unsuitable, models that make projections are what are used in making policy. That this is so is one of many ghastly consequences from the incorporation of the equivocation fallacy into arguments regarding the methodologies of climatological studies.

      An “equivocation” is an argument in which a term changes meaning in the middle of the argument. By logical rule, to draw a conclusion from an equivocation is improper. To draw a conclusion from an equivocation is the equivocation fallacy. Under the circumstance that the words “prediction” and “projection” are treated as synonyms, the word pair prediction-projection has dual meanings and is said to be “polysemic.” In making arguments about the methodologies of their studies, climatologists use polysemic terms that include prediction-projection and draw improper conclusions from these arguments. One of the consequences is to move money from the pockets of non-climatologists to the pockets of climatologists. As uses of the equivocation fallacy are hard to spot, few of the non-climatologists are aware of having their pockets picked by the climatologists!

  128. @Terry Oldberg:

    I’m so sick of that deceptive canard. It is used as a shield to cover blatant deception. First started by the Club Of Rome propaganda piece “Limits To Growth” by Meadows et al., as near as I can tell.

    Folks tell you what they expect to happen in the future. That’s a prediction. Fortune tellers do not say “I will now project your future!”… One doesn’t say “The High Priest will now project the date of the eclipse.” In common usage, saying “this is what will happen” or “this is what I expect to happen” is a prediction.

    The rest of the word play is just bafflegab to dodge responsibility for it being a FAILED prediction.

    And I predict the Warmers will continue to play the Projection Game as cover for abuse, deception, and error.

    • E.M.Smith:
      The deception being practiced by participating climatologists has been studied by logicians and is called by them the “equivocation fallacy.” Please see my response to davidmhoffer for details.

      By the way, skeptics as well as warmists are guilty of incorporating the equivocation fallacy into arguments about methodology. Participating skeptics unwittingly parrot uses of the equivocation fallacy by their opponents the warmists. In doing so, they reduce their argument with the warmists to one that is over the size of the equilibrium climate sensitivity (TECS). TECS does not logically exist but rather is a product of uses of the equivocation fallacy in making methodological arguments.

  129. Ira

    I consider that Russ R, (see his comment : Russ R. says: March 20, 2013 at 9:32 am)
    raises a very good point. Have a look at it.

    I think that your article could be improved if you were to detail the CO2 assumptions (in ppm), both natural and manmade, which Hansen made in Scenario A and Scenario B. If Hansen was basing these assumptions on, say, the 30 years worth of CO2 emissions between say 1958 and 1988, then CO2 from 1988 to date has risen at more than a linear rate. In particular, the manmade component certainly has.

    I had always thought that CO2 emissions were above BAU (Scenario B) but less than Scenario A, i.e., that we were somewhere between the two scenarios, running a little above BAU. However (given what Russ R says), it may be that my understanding of this is incorrect, since although manmade CO2 emissions are above those assumed in BAU (Scenario B), this increase has been offset by the growth of the naturally occurring CO2 sinks, which have grown faster than was envisaged in BAU (Scenario B).

    I think if we are to consider Hansen’s ‘projections’ (some would say ‘predictions’) we need to specifically consider CO2 emissions and the assumptions that Hansen made with respect to these. Your Mauna Loa plot shows what has happened to CO2, but it does not detail precisely what Hansen assumed CO2 emissions would be running at in his various scenarios.

    PS The point raised by Old Engineer is very interesting.

  130. Phil. March 21, 2013 at 6:43 pm: Thanks for clarifying that the “Fig 2” you were referring to is from Hansen 1988, available as a .pdf: http://pubs.giss.nasa.gov/docs/1988/1988_Hansen_etal.pdf

    In your first comment (Phil. March 21, 2013 at 10:46 am) you wrote:

    [quoting me] THERE HAS BEEN NO ACTUAL “CURTAILMENT OF TRACE GAS EMISSIONS”

    As everyone knows, the Mauna Loa measurements of atmospheric CO2 proves that there has NOT BEEN ANY CURTAILMENT of trace gas emissions. Indeed, the rapid increase of CO2 continues unabated [end of quote of me]

    You have made the same error that McIntyre made several years ago at CA, look carefully at Fig 2, you’ll see that the difference between scenario A & B is less than 0.1ºC by now. Hansen clearly showed that the main cause of temperature increase up to now would due to the ‘trace gases’ if they continued to increase at the then rates. This is clearly stated in the abstract and the Intro para.

    OK, Fig. 2 shows the assumed “Greenhouse forcing for trace gas scenarios A, B, and C as described in the text.” The horizontal axis is years, from 1960 through 2050, and the vertical axis is delta T in ºC. There are three sections, with the upper section showing CO2 forcing. As I quoted from the Abstract in my graphic above, Scenario A assumes an exponential increase in CO2, B assumes a linear increase, and C assumes a linear increase until the year 2000, after which CO2 flatlines. The middle section shows CO2 + trace gases, with forcings over twice those of the upper section. The lower section shows CO2 + trace gases + aerosols, with forcings the same as the upper section for Scenario A, but lower for B and C due to assumed volcanic eruptions.

    The text in section 1 of Hansen 1988 says that CO2 “is now about 345 ppmv with current mean annual increments of about 1.5 ppmv”. Given that starting point, if CO2 had increased linearly (Scenario B) it would have increased 36 ppmv in 24 years and be up to 381 ppmv. Actual CO2 in 2012 was about 393 ppmv, so, for CO2, the increase has been exponential, which corresponds to Scenario A.
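The linear-growth arithmetic in the preceding paragraph can be checked directly (a trivial sketch; values are as quoted from Hansen 1988 and the approximate 2012 Mauna Loa figure):

```python
# Hansen 1988, section 1: CO2 "is now about 345 ppmv with current mean
# annual increments of about 1.5 ppmv". Extrapolate linearly 1988 -> 2012.
start_ppmv = 345.0
annual_increment = 1.5      # ppmv per year
years_elapsed = 24          # 1988 to 2012

linear_2012 = start_ppmv + annual_increment * years_elapsed
actual_2012 = 393.0         # approximate Mauna Loa annual mean for 2012

print(f"linear extrapolation: {linear_2012:.0f} ppmv")  # 381
print(f"actual:               {actual_2012:.0f} ppmv")  # 393
```

The 12 ppmv gap between the linear extrapolation and the actual value is the basis for saying the increase has run faster than Scenario B's linear assumption.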

    The text in Section 2 of Hansen 1988 says “equilibrium sensitivity” (what is now called “Climate Sensitivity”) to doubling of CO2 from 315 ppmv to 630 ppmv, is 4.2 ºC.

    The “trace gases” include CH4 and N2O. Scenario A assumes a linear increase, B assumes a moderate decrease, and C assumes a “more drastic curtailment of emissions than has generally been imagined”. Which of these three scenarios best describes the increase in “trace gases”? I think it is A or B, which predict much greater warming than has actually occurred. It is definitely NOT C, yet the prediction of C, while still high, is closest to the actual.

    The “aerosols” come from volcanoes, and they provide a negative forcing. Scenario A assumes no volcanoes. B and C assume some volcanoes.

    So, Phil., now that we are on the same Fig. 2, please make your point. Tell us about the “more drastic curtailment of emissions [of trace gases] than has generally been imagined”. And, what about aerosols? It is clear that Scenario C, Hansen’s best prediction, is based on counterfactuals with regard to CO2 stabilizing (it has increased exponentially) and “drastic curtailment” of other trace gases.

    Please take us through why Scenario B, which Hansen says is the most likely, got warming high by so much.

    advTHANKSance

    Ira

  131. Ira Glickstein, PhD says:
    March 22, 2013 at 8:47 am

    “…if CO2 had increased linearly (Scenario B) it would have increased 36 ppmv in 24 years and be up to 381 ppmv. Actual CO2 is 2012 was about 393, so, for CO2, the increase has been exponential which corresponds to Scenario C….”
    /////////////////////////////////////////////////////////////////////////////////

    Ira

    Actual 2012 CO2 is 393 ppm. This is more than linear, which would have been 381 ppm, but is it properly classified as exponential?

    You state: “… so, for CO2, the increase has been exponential which corresponds to Scenario C….” but do you mean it corresponds with Scenario A (not C), since Scenario A is the projection of an exponential growth in CO2 emissions, and you are asserting that CO2 has risen exponentially?

    Of course, we are not entirely in Scenario A, because Scenario A assumes no volcanoes and there has been some (claimed) negative forcing due to aerosols.

    We are obviously not Scenario C since that assumes no increase in CO2 post 2000. Nor are we Scenario B, which assumes some negative aerosol forcing but a linear rise in CO2 emissions, whereas we have seen more than a linear rise in CO2. Are we therefore not somewhere between the Scenario B and Scenario A projections? Should we not be comparing reality with something lying between Hansen’s Scenario B and A projections?

    There is also the spanner in the works pertaining to aerosols from developing nations’ use of coal powered generation. Personally, I consider this suspect. Is it not the position that aerosol emissions today are no greater than they were in the 70s/80s? Perhaps a plot of aerosol emissions from 1980 to date would be a useful addition to your article.

  132. Well, I see nobody knows what to make of my evidence. It shocks me that something so obvious can be so hard for people to grasp. I guess doing this kind of stuff for a living for several decades skews your viewpoint to the point that things which are obvious and basic to you are just not in the realm of experience for most others.

    Oh well, so it goes. But, I have made predictions, so you can all watch them unfold and realize I am right.

    • Bart:

      It is logically troublesome that the entity which you call a “prediction” is the entity which I call a “projection” for to treat the two terms as synonyms makes of the word pair “prediction-projection” a polysemic term; a polysemic term is a term with several meanings. That this word pair is polysemic leads arguments about the methodology of global warming research to degenerate into examples of the equivocation fallacy. Thus, it is important for all of us to assign the same distinct meanings to the two terms.

  133. Terry Oldberg says:
    March 22, 2013 at 1:41 pm

    I think you addressed the wrong guy. I haven’t addressed anything to you in particular, and am off on a completely different topic.

    But, I get your argument, a projection is not a prediction. It is a scenario of what would happen under specific conditions. However, when you determine which projection is consonant with observed conditions, then that projection can effectively be considered a prediction. So, in that regard, I see your objection as largely semantic.

    • Bart:

      Though it has a semantic aspect, the problem that interests me is logical. In making arguments about the methodologies of their studies, climatologists habitually draw improper conclusions from equivocations, thus being guilty of the equivocation fallacy. A favored vehicle for this practice is to treat “prediction” and “projection” as synonyms. To do so is to create a polysemic term that switches meaning in the middle of their argument. When a conclusion is drawn from this argument, another example of an equivocation fallacy is born.

      Opportunities for shenanigans of this kind can be eliminated by disambiguating terms in the language of methodological arguments thus eliminating the polysemic terms. When this is done, today’s climate models are revealed to have numerous pathological features. Among them is that the models convey no information to policy makers about the outcomes from their policy decisions. Thus, the models are useless for making policy. It is instances of the equivocation fallacy that make them seem useful.

    • Bart:

      As I use the term “prediction,” it is a product of an inference to the unobserved outcome of an event in a statistical population; for the IPCC climate models there is no statistical population and thus is no such thing as a prediction. As I use the term “projection,” it is a mathematical function that maps the time to the global average surface air temperature. Your “prediction” sounds like my “projection” and unlike my “prediction.”

      The identity of the term that one assigns to the meaning which I assign to the term “predict” and the identity of the term that one assigns to the meaning which I assign to the term “project” is immaterial to the logic or illogic of the methodologies of climatological studies. Currently these methodologies are illogical but sound to many as though they are logical. This mistake is a consequence from the fallacy of drawing an improper conclusion from an equivocation, the so-called “equivocation fallacy.”

  134. Either way you look at it, the atmosphere allows the Earth’s surface to retain heat (and other) energy. Some calculations say as much as 30 C.

    The question is whether CO2 has any influence on this. Ira says about 1 C per doubling of CO2 concentration. I suspect that the answer is more likely to be 0 C, mainly because the atmosphere is a self-balancing mechanism controlled and regulated by pressure differences. CO2 makes no difference (well, extremely small) to the atmospheric pressure or to the rate of transfer of energy within the atmosphere. Any minor warming by so-called back-radiation from CO2 would be instantly balanced out by the atmosphere’s pressure regulating mechanism. The atmosphere acts to COOL the surface if the surface is warmer than it should be, and would also act to cool any part of the atmosphere that gets warmer than the pressure gradient will allow.

    Anyways, it will certainly be fun watching the warmists squirm over the next several years as the global temperature starts to drop. :-)

    • AndyG55 I agree – I find it very difficult to conceive that man, such a puny animal, occupying such a small space on this, relatively, gigantic planet with its vastly more gigantic atmosphere, could have any appreciable effect on it. Is this not a hangover from the kind of anthropocentrism (e.g. Man is in God’s image, Earth at the centre of the Universe, etc.) that reflects more on the arrogance of our species than it does on the state of affairs.
      Yes, AndyG55, but while we enjoy the discomfort of the warmists, millions are are paying the price of this cruel deception; affordable energy has been at the heart of of civil refinement (let’s not call it ‘progress’) and this ghastly conspiracy is almost wholly responsible for depriving the burgeoning population of access to it. People are being driven out of work, poverty stalks the land, the lights are going to go out, the poor and elderly will starve and freeze to death, all as a result of the war against humanity being waged by self-appointed, self-important bullies in the name of the “Green Agenda”.
      Man under threat, like many other animals, tends to produce more of the species. Given security from war, ample food, clothing and shelter, the reproductive rate falls and, if there really is an overpopulation problem, that problem will diminish without draconian eugenics and the like.
      AGW, like WMD, was contrived by the ruthless, selfish and powerful to impose servility on the innocent and ignorant. While we gloat over the warmists’ humiliation, let us also grieve for all those paying the price of their arrogance.
      What surprises me even more is that very few seem to express anger now that the deception has been exposed.

  135. Terry Oldberg says:
    March 22, 2013 at 5:34 pm

    ‘Your “prediction” sounds like my “projection” and unlike my “prediction.”’

    Not so. It is based on the statistically observed behavior of what I argue is an, at least approximately, ergodic system – the time average of certain measures is, approximately, equal to the distribution expectation.

    It is a prediction. It is not just one random possibility, it is the expected outcome.

    • Bart:

      If your “prediction” is an Oldbergian prediction, underlying your model is an example of a statistical population. If there is one, kindly describe the independent events in this population. In particular, what is the starting time and ending time of each event and what is the complete set of possible outcomes.

  136. Terry Oldberg says:
    March 22, 2013 at 5:34 pm

    I agree with you entirely in this sense, however: What the IPCC has done is assumed a model based on a population of one, and projected that model forward to give a distribution of outcomes. The usefulness of the resulting distribution is not in predicting the future, but in validating the model, i.e., in showing whether it is likely false or indeterminate – it cannot show if it is true.

    Unfortunately, the clearly desired implication is that they are predictions, and that the scarier scenarios can happen, when there really is a very tenuous, to the point of negligible, basis to expect that at all.

    But, that differs from what I have done. I have looked at the time evolution of the data itself, and matched it to a statistical model which describes the observed dynamics over the timeline. Their process is deductive, based on premises and projected from a sample size of one. Mine is inductive – I start with the data, and infer the statistical model from it.

    And, mine has matched the future observables, where theirs has failed, i.e., it is very likely false. For me, the turning point of the ~60 year cycle in globally averaged temperature arrived right on time in about 2005, whereas the first figure in Ira’s post here shows that they are skirting on the extreme lower boundary of the distribution of their projections.

    In this reality, systems tend to behave in simple ways. Complex systems tend to regress to a simple systematic mean. Thus, for example, the ungodly complications of quantum theory regress to Newton’s simple F = ma in the large, as demonstrated by Ehrenfest’s Theorem. Complex nonlinear systems tend to behave as simple linear ones near a particular equilibrium. Such regression has been observed ubiquitously. Our entire techno-industrial society depends upon it.

    Those who stay locked in ivory towers and have little contact with practical reality tend to get wrapped around the axle, and overwhelmed by complexity. They just cannot conceive that the system could really evolve so simply as the rate of change of atmospheric CO2 being proportional to temperature anomaly, or the evolution of temperature being simply a trend plus a simple cyclic phenomenon. Yet, that is precisely what the data confirm. And, humans clearly have little impact on either.
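    Bart’s claim that the rate of change of atmospheric CO2 is proportional to the temperature anomaly can be sketched numerically. This is only a toy integration of that hypothesized relation; the temperature series, the constant k, and the baseline T0 below are all illustrative assumptions, not fitted values.

```python
import math

def temperature_anomaly(year):
    """Illustrative anomaly (deg C): linear trend plus a ~60-year cycle -- not real data."""
    return 0.005 * (year - 1900) + 0.1 * math.sin(2 * math.pi * (year - 1900) / 60.0)

def integrate_co2(start_year, end_year, c0=315.0, k=4.0, t0=-0.2):
    """Euler-integrate dC/dt = k * (T - T0); c0, k and t0 are assumed, not fitted."""
    c = c0
    levels = {start_year: c}
    for year in range(start_year, end_year):
        c += k * (temperature_anomaly(year) - t0)
        levels[year + 1] = c
    return levels

levels = integrate_co2(1958, 2013)
```

    With any anomaly series that stays above the baseline, the integrated CO2 level rises monotonically, which is the qualitative behavior Bart describes.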

  137. Terry Oldberg March 22, 2013 at 5:08 pm and March 22, 2013 at 5:34 pm:

    I am trying to get my head around the distinction you make between PROJECTION and PREDICTION. In my main Topic above I used “prediction” exclusively. In my previous writing, I do not believe I ever used “projection” in the sense you seem to be using it here as some sort of invalid prediction. The Merriam-Webster dictionary seems to be on my side:

    Definition of PROJECTION
    1 a : a systematic presentation of intersecting coordinate lines on a flat surface upon which features from a curved surface (as of the earth or the celestial sphere) may be mapped
    b : the process or technique of reproducing a spatial object upon a plane or curved surface or a line by projecting its points; also : a graph or figure so formed
    2: a transforming change
    3: the act of throwing or thrusting forward
    4: the forming of a plan : scheming
    5 a (1) : a jutting out (2) : a part that juts out
    b : a view of a building or architectural element
    6 a : the act of perceiving a mental object as spatially and sensibly objective; also : something so perceived
    b : the attribution of one’s own ideas, feelings, or attitudes to other people or to objects; especially : the externalization of blame, guilt, or responsibility as a defense against anxiety
    7: the display of motion pictures by projecting an image from them upon a screen
    8 a : the act of projecting especially to an audience
    b : control of the volume, clarity, and distinctness of a voice to gain greater audibility
    9: an estimate of future possibilities based on a current trend [my bold]

    Note that we have to go to the ninth definition to find anything like what we are talking about here.

    Indeed, when I first heard the word used by you in association with the work of climatologists, I thought of simply taking a trend line (such as CO2 levels or temperature anomalies, etc.) and projecting it to the future. Thus, if the most recent trend was a linear slope, I would project it in a linear manner, if exponentially increasing slope, I would project it with upward curvature, if decreasing slope, I would project it with downward curvature.

    In my research, I found how the IPCC defines the terms: see http://www.ipcc-data.org/ddc_definitions.html

    The term “projection” is used in two senses in the climate change literature. In general usage, a projection can be regarded as any description of the future and the pathway leading to it. However, a more specific interpretation has been attached to the term “climate projection” by the IPCC when referring to model-derived estimates of future climate….

    Forecast/Prediction: When a projection is branded “most likely” it becomes a forecast or prediction. A forecast is often obtained using deterministic models, possibly a set of these, outputs of which can enable some level of confidence to be attached to projections

    Therefore, we could call Hansen’s 1988 scenarios A, B, and C “projections” in the IPCC sense. Each of Hansen’s scenarios is an attempt to project the curve of the current temperature anomalies into the future according to three different possible sets of actions by human societies (emission of trace gases) and Nature (aerosols from volcanic eruptions). A is “business as usual” with no special action to curb emissions and no volcanic activity. B is moderate curbing of emissions and some volcanic eruptions. C is drastic curbing of emissions and some volcanic eruptions.

    Hansen 1988 says Scenario B is the “most likely” and thus, by the IPCC definition, it becomes a forecast or prediction.

    As a native speaker of (American – actually Brooklyn :^) English, I would go along with the IPCC and say that Scenario B, which Hansen 1988 called “most likely” is a prediction. Assuming he was sincere when he and his team constructed and ran the models (which assumption I accept) he was predicting two things:
    B It is most likely that his recommendations would be accepted to some extent and thus, a) Society would curb emissions of trace gases in some moderate manner, and b) as a result, the Scenario B curve of future temperature anomalies would be approximated by future measurements

    Now, although he did not say his Scenario A or Scenario C was “most likely”, I also consider them to be predictions. Namely:
    A if, in the less likely case that his recommendations were totally ignored, a) Society would NOT curb emissions of trace gases and they would continue to increase exponentially, and, b) as a result, the Scenario A curve of future temperature anomalies would be approximated by future measurements.
    C if, in the less likely case that his recommendations were totally accepted in a drastic, nearly unimaginable way, a) Society would DRASTICALLY curb emissions of trace gases and they would flatline in the year 2000, and b) as a result, the Scenario C curve of future temperature anomalies would be approximated by future measurements.

    The above definitions of prediction seem reasonable to me. A failed prediction (and ALL of Hansen’s 1988 predictions were very wrong, including C which, while close to the actual temperature anomalies, is not matched with any drastic societal action nor any change in the trend of trace gas levels, which was the whole point of C) is, nevertheless a PREDICTION.

    You (Terry Oldberg) seem to be saying that a prediction based on wrong science is a not a “prediction” at all, but only a “projection”.

    But let us all agree that whatever Hansen 1988 is, it turned out almost totally wrong!

    Ira

    [" ...as a result, the Scenario B curve of future temperature anomalies would be approximated by future measurements." Should this not be "Scenario C curve" ? Mod] [Yes, fixed, Thanks. Ira]

    • Ira Glickstein:

      I specialize in answering the question of whether methodological arguments are logical. Before examining the methodological arguments of the global warming climatologists in light of logic, it is necessary to rid these arguments of instances of the equivocation fallacy. This can be accomplished through disambiguation of the polysemic terms in the language of these arguments. A polysemic term is a term with more than one meaning.

      Among the polysemic terms is the word pair prediction-projection in the circumstance that the two words in this word pair are treated as synonyms. In their study of the methodology of global warming climatology, Green and Armstrong found that most IPCC-affiliated climatologists treated the two words as synonyms at the time at which AR4 was being written.

      The polysemic term prediction-projection can be disambiguated by assignment of distinct meanings to “prediction” and “projection.” This raises the issue of what these meanings shall be.

      In addressing this task ( http://judithcurry.com/2011/02/15/the-principles-of-reasoning-part-iii-logic-and-climatology/ ), I formed the hypothesis that climatologists had acquired the two words from the meteorological literature. In particular, they had acquired the term “projection” from the literature of ensemble forecasting. In ensemble forecasting, a “projection” is a response function that maps the time to the values of a selected independent variable. Meteorologists seemed to adopt the definition of “prediction” that was standard in mathematical statistics. Under this definition, a prediction was an unconditional predictive inference.

      As all of the evidence that I was able to acquire was consistent with this hypothesis, I went with it. By examination of uses of the two words in the literature of meteorology, I formed an impression of what meteorologists meant by each of the words. These are the meanings that I assign to the two words in this thread. While the IPCC claims there to be a circumstance in which a projection becomes a prediction, this is not mathematically possible, for a “prediction” is an extrapolation to the outcome of a specified event in a statistical population, and for global warming climatology there is no such population.

  138. Bart:

    In a search lasting 3+ years, I’ve been unable to find so much as a single event in the statistical population underlying the IPCC climate models. If you are aware of one, please point me in the right direction.

    Instead of one or more events, I find multiple examples of the equivocation fallacy that deceive many of us into thinking that: a) predictions have been made when only projections have been made b) the models convey information to policy makers on CO2 emissions when the models convey no such information c) global temperatures are controllable by man when they are uncontrollable d) the models have been validated when they have only been evaluated e) the scientific method has been followed in studies of global warming when it has not been followed.

  139. Ira Glickstein, PhD says:
    March 23, 2013 at 12:01 pm
    ALL of Hansen’s 1988 predictions were very wrong, including C which, while close to the actual temperature anomalies, is not matched with any drastic societal action nor any change in the trend of trace gas levels, which was the whole point of C)

    The ‘drastic societal action’ was the Montreal Protocol, as for the change in the trend of trace gas levels, see below.
    Ira, I suggest you read this post by McIntyre, it will show the observed reductions in CFCs, CH4 and N2O actually fell below Hansen’s Scenario C:

    http://climateaudit.org/2008/01/17/hansen-scenarios-a-and-b/

  140. Phil. says:
    March 23, 2013 at 4:54 pm

    Ira, I suggest you read this post by McIntyre, it will show the observed reductions in CFCs, CH4 and N2O actually fell below Hansen’s Scenario C

    So with the very low rise now, it appears as if CO2 was never a major player all along so we can ignore CO2. Is that correct?

  141. Werner Brozek:

    Whether or not CO2 is a major factor cannot be determined until climatologists supply a missing ingredient for doing research that is “scientific.” This ingredient is the statistical population underlying their models.

    Suppose this population is finally described and the duration of an event in this population is 30 years. Each event has one of two possible outcomes. One is that the spatially and temporally averaged global surface air temperature exceeds the long-run median. The other is that it does not exceed the median. In this case, the recent 16-year period in which the warming has oscillated about zero yields no observed events. Thus, it provides us with no information about the outcomes of the events of the future.

  142. Terry Oldberg says:
    March 23, 2013 at 3:18 pm

    “If there is one, kindly describe the independent events in this population.”

    They are the events which drive the system forward in time. If a whitening filter can be devised which effectively yields an estimate of that population of inputs as stationary broadband noise, then the inverse of that filter provides an effective model for prediction. It is not, of course, guaranteed, but the range of systems for which such an approach has been successfully applied is vast.
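    Bart’s whitening-filter recipe can be illustrated on the simplest possible case, a first-order autoregressive process. Everything here (the AR(1) model, the coefficient 0.9, the sample size) is an assumption chosen for the sketch, not anything from the climate data under discussion.

```python
import random

# Simulate a toy AR(1) process x[t] = a*x[t-1] + w[t] with white Gaussian input.
random.seed(0)
a_true = 0.9
x = [0.0]
for _ in range(5000):
    x.append(a_true * x[-1] + random.gauss(0.0, 1.0))

# Fit the lag-1 coefficient by least squares ...
num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
a_hat = num / den

# ... then apply the whitening filter e[t] = x[t] - a_hat*x[t-1] and check that
# the residuals are approximately white (lag-1 autocorrelation near zero).
resid = [x[t] - a_hat * x[t - 1] for t in range(1, len(x))]
lag1 = sum(resid[t] * resid[t - 1] for t in range(1, len(resid))) / sum(r * r for r in resid)

# The inverse of the whitening filter is then the one-step predictor:
prediction = a_hat * x[-1]
```

    Once the filter output looks like stationary white noise, inverting the filter gives exactly the kind of predictive model the comment describes.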

    • Bart:

      Thanks for taking the time to reply. An event has a starting time and stopping time and a set of mutually exclusive collectively exhaustive possible outcomes. I’d like to know the starting time and stopping time for each statistically independent event in your population as well as the set of possible outcomes.

  143. Terry Oldberg says:
    March 23, 2013 at 3:18 pm

    PS: I believe your criticisms are right on. What is being done in climate science right now is a scattershot approach which has very little likelihood of success. I would advocate a more phenomenological approach, based on how the data has actually been observed to behave, rather than trying to make the observations conform to an underlying theory which might well be (indeed, has been found to be IMHO) false. As Sherlock Holmes was fond of saying:

    “It is a capital mistake to theorize before one has data. Insensibly, one begins to twist data to suit theories, instead of theories to suit facts.”

    • Bart:

      Thanks for the support. The aspect of global warming research that strikes me as most interesting is that the methodology is unscientific, illogical and unsuitable for the intended purpose but that this state of affairs has successfully been covered up through repeated uses of a deceptive argument. Two hundred billion US$ have been spent on the misguided research and deluded governments are gearing up to spend one hundred trillion US$ or so on implementing the results. An advocate for impoverishing people in this way has received the Nobel Peace Prize. The President of the U.S. and United Nations are on board. This is a great story!

  144. Terry Oldberg says:
    March 24, 2013 at 12:07 pm

    “I’d like to know the starting time and stopping time for each statistically independent event in your population as well as the set of possible outcomes.”

    I will see if a brief synopsis of my viewpoint will help.

    We start with a description of the system from a set of linear time invariant stochastic differential equations. There are many powerful tools in existence for the identification of such systems. The best starting point is probably estimating the power spectral density (PSD).

    For example, I did a PSD estimate of Sun Spot Number here. From this, I was able to see that the spectrum was dominated by two spectral peaks which modulated against one another to produce four peaks in the spectrum. So, the underlying system has the character of the Hypothetical Resonance PSD I show in the middle. When squared, the signal has the theoretical spectrum shown as the green line in the bottom plot, and that matches up pretty well with the spectrum of the squared process from actual data in blue. Although there are other small peaks evident in the data, it is apparent that the two processes identified above dominate. The others can be added at some point to achieve a better model, but these two main ones should be enough to provide a useful first-cut approximation.
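    The first step Bart describes, estimating the PSD and reading off the dominant peaks, can be sketched with a naive periodogram. The two-tone test signal and its frequencies are invented for illustration; they are not the actual sunspot periodicities.

```python
import math

N = 512
f1, f2 = 10.0 / N, 25.0 / N              # two assumed frequencies (cycles per sample)
signal = [math.sin(2 * math.pi * f1 * n) + 0.7 * math.sin(2 * math.pi * f2 * n)
          for n in range(N)]

def periodogram(x):
    """|DFT|^2 / N for bins 1..N//2-1 (naive O(N^2) DFT, fine for a sketch)."""
    n = len(x)
    psd = []
    for k in range(1, n // 2):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        psd.append((re * re + im * im) / n)
    return psd

psd = periodogram(signal)
# psd[i] corresponds to DFT bin k = i + 1, so the two dominant peaks should
# fall at bins 10 and 25, matching the frequencies built into the signal.
top_two = sorted(range(len(psd)), key=lambda i: psd[i], reverse=True)[:2]
```

    In practice one would use an FFT-based estimator with windowing and averaging; the point here is only that the dominant peaks identify the processes to model.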

    I do not know what these processes are but, from a phenomenological viewpoint, I do not need to. Once identified, their source can be tracked down independently, but we can still use the identified structure to predict future evolution.

    A model for the dominant processes is shown here. The input PSD of the driving processes is assumed to be wideband and flat, and this idealization works fine if the true stochastic input is simply more or less uniform in the frequency range of each of the two spectral peaks. I show here and here the outputs of the model when simulated, and they are seen to have similar character to the actual observed SSNs.

    Using the square of the SSN provides a smooth observable which can be incorporated into an Extended Kalman Filter. Using backwards and forwards propagation and update of the filter, the states can be optimally smoothed and primed at the last value for prediction. Propagating the differential equations forward then provides an approximate expected value, which is therefore a prediction of the future based on all past observables, and the Kalman Filter formalism provides RMS bounds on the error in the expected value. I haven’t gone forward with the project of doing so because it is a big job, and I have many other competing interests, some of which I earn a living by, but the procedure is straightforward.
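    The predict/update cycle of the Kalman filter Bart invokes can be shown in one dimension. This is a generic scalar random-walk filter with assumed noise variances, not the Extended Kalman Filter he proposes for the squared SSN.

```python
import random

random.seed(1)
q, r = 0.01, 1.0                  # assumed process / measurement noise variances
truth, obs = [0.0], []
for _ in range(500):
    truth.append(truth[-1] + random.gauss(0.0, q ** 0.5))
    obs.append(truth[-1] + random.gauss(0.0, r ** 0.5))

xhat, p = 0.0, 1.0                # initial state estimate and its variance
for z in obs:
    p = p + q                     # predict: propagate the variance forward
    k_gain = p / (p + r)          # update: Kalman gain
    xhat = xhat + k_gain * (z - xhat)
    p = (1 - k_gain) * p

# The steady-state variance p bounds the (RMS) error of the estimate, which
# is the sense in which the formalism "provides RMS bounds" on predictions.
err_filter = abs(xhat - truth[-1])
```

    The filtered estimate tracks the hidden state far more tightly than any single raw observation, and propagating the model forward from `xhat` gives the prediction with its error bound.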

    Note that this is an entirely phenomenological approach, I do not need to know the actual underlying dynamics, only a reasonable equivalent representation. The observed structure of the dynamics can be expected to continue as they have in the past. So, we have a statistically non-trivial set of observations with which to verify the model.

    Now, why are we justified in taking such an approach, and why should we expect it to bear fruit? First and foremost, because it has innumerable times in the past. It is not an overstatement to say that our entire tech-industrial society has been built upon this very foundation. But, it is also reasonable from a first principles point of view.

    Most processes, at the most basic level, can be represented to high fidelity as the outcome of a randomly driven set of partial differential equations (PDEs). PDEs can generally be decomposed onto a functional basis, thence expanded into a multi-dimensional set of first order ordinary differential equations. Further simplifications can be achieved by focusing on those states which dictate the long term behavior, the Langevin equations. In the neighborhood of a particular equilibrium state, these equations take the form of a linear time invariant (LTI) system. And, so, we can expect that starting from determination of an LTI system which describes the observations, we can ultimately arrive at a useful predictive model.

    That model then provides a standard against which to validate theories on the deeper underlying system dynamics. So, it is essentially the reverse of the procedure the climate establishment is using. As I said before, their approach is essentially deductive – they start developing theory unconstrained by the observables, then they try to force the observations to match the theory. That, as Sherlock would say, is bass-ackwards. You should start with the observations, which then constrain the form of your theoretical models, in an inductive progression.

    The deductive approach is somewhat like trying to get a winning Lotto ticket by randomly choosing a number, and seeing if it matches the winning number. The inductive approach is more like observing that the winning numbers in this particular game are always prime and never repeat, and so you dramatically reduce the number of possible winning numbers from which you can choose and, eventually, since the numbers are composed of a finite number of digits, you can zero in on the winning number.

  145. Bart says:
    March 24, 2013 at 2:00 pm

    “The deductive approach is somewhat like…”

    Or, maybe, it is like trying to guess the solution of an equation by generating random numbers until you get an exact match of the equation, versus using a Newton algorithm to converge quadratically on the answer.
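    Bart’s analogy can be made concrete: Newton’s method converges quadratically, so a handful of iterations pin down a root to machine precision, whereas random guessing makes no systematic progress. The function f(x) = x² − 2 (root: √2) is an arbitrary illustrative choice.

```python
def newton_sqrt2(x0=1.0, steps=6):
    """Newton iteration x <- x - f(x)/f'(x) for f(x) = x^2 - 2."""
    x = x0
    for _ in range(steps):
        x = x - (x * x - 2.0) / (2.0 * x)
    return x

root = newton_sqrt2()
# Quadratic convergence roughly doubles the number of correct digits per step,
# so six iterations from x0 = 1.0 already agree with sqrt(2) to machine precision.
```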

  146. Werner Brozek says:
    March 23, 2013 at 9:47 pm
    Phil. says:
    March 23, 2013 at 4:54 pm

    Ira, I suggest you read this post by McIntyre, it will show the observed reductions in CFCs, CH4 and N2O actually fell below Hansen’s Scenario C

    So with the very low rise now, it appears as if CO2 was never a major player all along so we can ignore CO2. Is that correct?

    No, the point of Hansen 88 was that GHGs would have a major effect on climate, most of the short term change would be due to gases other than CO2, long term the effect of CO2 would be significant. What the actual measurements show was that Hansen’s Scenario C was most representative of reality except for CO2 which was between A and B so you’d expect the overall result to lie between B and C. Hopefully the ongoing melting of the Arctic sea-ice won’t lead to a recurrence of the growth in CH4 but current measurements in the Arctic suggest otherwise.

  147. Phil.,

    What would it take for you to admit that AGW is either non-existent, or so minuscule that it isn’t worth worrying about?

    Numbers, please: how many more years of little or no global warming would it take? How much more human-emitted CO2 without runaway global warming would it take?

    Or, is your mind made up to the point where nothing can possibly convince you that your “carbon” conjecture was/is wrong? <— [Like Hansen's true belief.]

    Planet Earth is not agreeing with you, Phil. Who should we believe, you and Hansen? Or the planet?

  148. Phil. says:
    March 24, 2013 at 2:41 pm
    “result to lie between B and C. Hopefully the ongoing melting of the Arctic sea-ice won’t lead to a recurrence of the growth in CH4 but current measurements in the Arctic suggest otherwise.”

    The greenhouse effect of CH4 competes with H2O and is only measurable in dry winter weather. Are you SURE it’s a problem when it can’t even be measured in warm moist weather?

  149. Terry Oldberg March 23, 2013 at 3:09 pm:

    Thanks for the link to your Topic on Judith Curry’s site. I read it through and now I understand the distinction you are making between types of models and whether their outputs are properly called projections or predictions.

    The only problem I see is that there are many areas of public policy that are so complex, and so beset by a lack of reliable data, that the best anyone can achieve is what you call a projection. Nevertheless, individuals and public officials must make decisions in the short term despite the uncertainty.

    So, what to do? Well, we should take the most conservative course and not act rashly unless it is pretty clear that that is necessary in a given case.

    In my field of system engineering, risk is defined as the probability a given bad event will occur multiplied by the cost if that event occurs. Thus, if the cost of that bad event is catastrophic, we need to act to prevent it even if the probability is low. Conversely, if the probability of that bad event occurring is high, we need to act to prevent it even if the cost if it happens is low.

    On the other hand, if the probability of a bad event happening is low, and the cost if it does occur is also low, we do not need to act to prevent it. I think that is the case with Global Warming. The probability of it amounting to more than 1 ºC per century appears to be very low, and the cost to society of even as much as 1 ºC per century is negligible, because we can adapt to it, and it may turn out to be of net benefit.
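    Ira’s system-engineering definition, risk = probability × cost, is simple enough to state in a line of code; the numbers below are purely illustrative, not taken from the post.

```python
def risk(probability, cost):
    """Expected cost of a bad event: probability of occurrence times cost if it occurs."""
    return probability * cost

# A low-probability but catastrophic event can carry the same risk as a
# high-probability but cheap one:
catastrophic = risk(0.001, 1_000_000)
frequent = risk(0.5, 2_000)
```

    Both cases compute to the same expected cost, which is why, under this definition, both low-probability/high-cost and high-probability/low-cost events can warrant preventive action.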

    Ira

    • Ira Glickstein:

      Thanks for taking the time to read my article. A tricky aspect of the type of model that makes projections is that it conveys no information to policy makers about the outcomes from their policy decisions. Thus, though policy makers think the opposite, the availability of this type of model does not make global temperatures controllable through regulation of CO2 emissions.

  150. Phil. March 23, 2013 at 4:54 pm:

    Thanks for the link to Climate Audit, but it dates from 2008. Do you have anything more recent for CFCs, CH4 and N2O?

    I respect McIntyre and accept his conclusion that Hansen 1988 way, way, way over-estimated the growth of “Greenhouse gases” other than CO2. Therefore, actual CFCs, CH4 and N2O levels fell considerably below even Scenario C assumptions, and way, way, way below the Scenario A and B assumptions.

    So, let us take score:

    1) A close look at Fig 2 from Hansen 1988 indicates that they assumed the warming effect of CO2, alone, was about equal to the combined warming effect of CFCs, CH4 and N2O. So, if they over-estimated one (CO2) and underestimated the other (CFCs, CH4 and N2O) by about the same amount, we should expect that the temperature anomaly would flatline, which, over the past decade and a half, it did. Score one for dumb luck!

    2) Scenario A got the exponential increase in CO2 pretty much correct, but they way, way, way over-estimated the increase in CFCs, CH4 and N2O, which have pretty much flatlined. That explains why Scenario A is so much higher than actual temperature anomalies.

    3) Scenario B assumed a linear increase in CO2, which turned out to be pretty much correct, since the actual exponential increase is very mildly upward. But they way, way, way over-estimated the increase in CFCs, CH4 and N2O, which have pretty much flatlined. That explains why Scenario B is also much higher than actual temperature anomalies.

    4) Scenario C assumed a linear increase in CO2 until the year 2000, when they assumed it would flatline. That, to this day, has turned out to be wrong, because CO2 has continued its mild exponential rise. They slightly over-estimated the increase in CFCs, CH4 and N2O, which have pretty much flatlined. That explains why Scenario C is only 31% higher than actual temperature anomalies.

    Bottom line score: 1 for 4, and that 1 was dumb luck. Not too good. Not even for Scenario C, the closest to actual temperature anomalies.

    Yet, they say their model (or “simulation” as RealClimate recently called it), in their words “while not perfect … has shown skill”.

    RealClimate is comfortable with that. Are you? If so, they have one foot in a bucket of scalding water, and the other foot in a bucket of ice, but, on the average, they are comfortable :^)

    Ira

  151. Ira Glickstein, PhD says:
    March 24, 2013 at 6:34 pm

    “Scenario A got the exponential increase in CO2 pretty much correct…”

    Good comment in general, but I must take issue with this. The increase in atmospheric CO2 concentration is not exponential. It is at best quadratic.

    The derivative of an exponential is an exponential. As is readily apparent in the graph I keep bringing up, the derivative is at best linear, and in fact is decelerating along with the globally averaged temperature in the last decade or so.

    [Bart, you are correct. According to Wikipedia: quadratic growth is a special case of a convex function, and should not be confused with exponential growth, a better-known growth function. "Convex growth" means increasing at an increasing rate (the second derivative or second difference is positive), quadratic growth means increasing at a constantly increasing rate (the second derivative is positive and constant), and exponential growth means increasing at a rate proportional to the current value (the first derivative is proportional to the current value, so, taking derivatives, the second derivative is proportional to the first derivative, and hence to the current value as well). That is, quadratic and exponential growth are both different special cases of convex growth. Thank you for clearing this up. Ira]
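    Bart’s diagnostic is easy to run on synthetic series: a quadratic has a constant second difference, while an exponential has a first difference proportional to its level. The two series below are made up solely to illustrate the test; they are not CO2 data.

```python
quad = [0.5 * t * t for t in range(50)]          # quadratic growth
expo = [1.05 ** t for t in range(50)]            # exponential growth (5% per step)

def second_diff(x):
    """Second differences x[t+2] - 2*x[t+1] + x[t], the discrete second derivative."""
    return [x[t + 2] - 2 * x[t + 1] + x[t] for t in range(len(x) - 2)]

# Quadratic: second difference is constant (here, exactly 1.0 every step).
d2_quad = second_diff(quad)

# Exponential: first difference divided by the level is constant (here, 0.05).
ratios = [(expo[t + 1] - expo[t]) / expo[t] for t in range(len(expo) - 1)]
```

    Applied to a real concentration record, a roughly linear first difference (constant second difference) indicates quadratic rather than exponential growth, which is the distinction Bart and the mod note are drawing.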

  152. Bart:

    In building a model, the deductive approach has a shortcoming. It assumes that the information needed for deductive conclusions is not missing, when in fact it is. The inductive approach introduces a problem not faced under the deductive approach; this problem is called “the problem of induction.” In each circumstance in which the model makes an inference, there are several candidate inferences that could be made. How can the one candidate that is correct be discriminated from the many that are incorrect?

    This problem was solved circa 1963 by the theoretical physicist Ronald Christensen. Logic could be extended beyond its confines in deductive logic into inductive logic by replacing the rule that every proposition has a truth value with the rule that every proposition has a probability of being true.

    In the resulting “probabilistic” logic, every proposition had a unique measure. The measure of a proposition was its entropy. It followed from the existence and uniqueness of the measure of an inference that the problem of induction could be solved by a kind of optimization. Depending upon circumstances, the correct inference was the one that minimized the entropy or that maximized the entropy under constraints expressing the available information. Christensen called this rule “entropy minimax.” Logic held principles of reasoning. The principles of reasoning were entropy minimax.
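    A minimal instance of the constrained entropy maximization described above, assuming the standard loaded-die formulation (the faces 1..6 and the mean constraint 4.5 are illustrative choices, not anything from Christensen’s work): among all distributions with that mean, the entropy maximizer has exponential-family form p_i ∝ exp(λ·i), and λ can be found by bisection.

```python
import math

faces = [1, 2, 3, 4, 5, 6]
target_mean = 4.5

def mean_for(lam):
    """Mean of the exponential-family distribution p_i proportional to exp(lam*i)."""
    w = [math.exp(lam * i) for i in faces]
    z = sum(w)
    return sum(i * wi for i, wi in zip(faces, w)) / z

# mean_for is increasing in lam; a mean above 3.5 requires lam > 0, so bisect:
lo, hi = 0.0, 5.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)

w = [math.exp(lam * i) for i in faces]
z = sum(w)
p = [wi / z for wi in w]   # the maximum-entropy distribution under the constraint
```

    The resulting distribution tilts probability toward the higher faces just enough to meet the constraint while staying as close to uniform (maximum entropy) as possible.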

    In concert with colleagues who included me, Christensen tested this hypothesis in a number of real world circumstances. It held up to testing. Among the scientific theories that were produced by entropy minimax were thermodynamics, the modern theory of communications and the first successful long range weather forecasting models.

    However, though Christensen published his work, few scientists or academics read these publications. (Among the few who read some of them were a couple of professors from Ira Glickstein’s systems science department at Binghamton University. One was the department chairman, George Klir.)

    In their ignorance, scientists continued to build models as they had done so for centuries. This was by discriminating the one correct inference from among the many candidates using the intuitive rules of thumb that I call “heuristics.” However, in each instance in which a particular heuristic selected a particular inference as the one correct inference, a different heuristic selected a different inference as the one correct inference. In this way, the method of heuristics violated Aristotle’s law of contradiction thus being illogical.

    Almost all of the models that are used today in practical decision making were built by the method of heuristics. Kalman filters are built by it. Use of this method introduces variability in the quality of the models that are generated in which the quality depends upon the model builder’s luck in the selection of heuristics.

    If you’d like to follow up, I recommend that you start with one of two tutorials. Judith Curry published one of them in her blog as a three part series under the title “The Principles of Reasoning” in 2011. The other, also written by me, is at http://www.knowledgetothemax.com . A bibliography is available at the same URL.

    Entropy minimax builds the best possible model from the given informational resources. With the availability of this idea, generalizations can be reached about the possibilities of climatological research. One of these generalizations is that informational resources are insufficient for the construction of a model that predicts global temperatures over a horizon long enough (about 30 years) for the model to be usable in making policy on CO2 emissions. In ten thousand years, we may have enough observed events for this to happen. In the interim, entropy minimax dictates that the global temperature varies independently of the CO2 level.

  153. Terry Oldberg March 25, 2013 at 8:26 am:

    … “the problem of induction” … was solved circa 1963 by the theoretical physicist Ronald Christensen. Logic could be extended beyond the confines of deductive logic into inductive logic by replacing the rule that every proposition has a truth value with the rule that every proposition has a probability of being true.

    In the resulting “probabilistic” logic, every proposition had a unique measure. However, though Christensen published his work, few scientists or academics read these publications. (Among the few who read some of them were a couple of professors from Ira Glickstein’s systems science department at Binghamton University. One was the department chairman, George Klir.) [my bold]

    THANKS Terry Oldberg for reminding me of my days as a part-time Masters and then PhD student in the System Science department at Binghamton University! George Klir was chairman of the department while I was a student. He pronounces his name “clear” and, what I most remember about him was that he was not “clear” to me at all. As I recall his work, he was very focussed on information theory; he characterized all kinds of uncertainty into probability, possibility, and a few other levels, and sets as fuzzy, crisp, and so on.

    A computer program captured some of Klir’s theories and allowed a complex system to be reduced by, as I recall, sequentially eliminating links that had the lowest interconnectivity and interdependency between subsystems and components.

    After Klir retired as chairman, and after I got my PhD (at age 57 in 1996), I taught System Engineering and Human Factors in the System Science Department and Artificial Intelligence in the Computer Science Department as an Adjunct at Binghamton University. That was part-time because my “real” job was as a Sr. System Engineer at Lockheed Martin (formerly IBM) in Owego, NY.

    Ah, the “good old days” (when the air was clean and sex was dirty :^) -Or so our distorted memories claim-

    Ira

  154. D.B. Stealey says:
    March 24, 2013 at 3:58 pm
    Phil.,

    What would it take for you to admit that AGW is either non-existent, or so minuscule that it isn’t worth worrying about?

    Scientific evidence.

    Numbers, please: how many more years of little or no global warming would it take? How much more human-emitted CO2 without runaway global warming would it take?

    Since we’re not in a period of no warming that’s not an issue, I have never expected runaway global warming in any case.

    Or, is your mind made up to the point where nothing can possibly convince you that your “carbon” conjecture was/is wrong? <— [Like Hansen's true belief.]

    Planet Earth is not agreeing with you, Phil. Who should we believe, you and Hansen? Or the planet?

    Believe the planet, in a couple of years time when the arctic is devoid of sea ice perhaps you’ll start to see the light?

    [Phil: You have never expected RUNAWAY global warming -GOOD TO HEAR- but, but you expect the arctic to be devoid of sea ice in a COUPLE YEARS TIME ? Forgive me, but that seems inconsistent. Ira]

    • Phil and D.B. Stealey :

      Arguments being made in this thread have become disjointed. In another part of it, I believe I have convinced participants that we have no way of knowing the effect of raising the CO2 concentration upon global temperatures because, in organizing their study of global warming, climatologists such as James Hansen have blown their assignment. We might as well have shoveled the US$200 billion spent by Hansen and his colleagues down a rat hole.

  155. Ira Glickstein, PhD says:
    March 24, 2013 at 6:34 pm
    Phil. March 23, 2013 at 4:54 pm:

    Thanks for the link to Climate Audit, but it dates from 2008. Do you have anything more recent for CFCs, CH4 and N2O?

    No, but feel free to get the raw data and plot it; after all, it’s you who wrote the post entitled “How well did Hansen (1988) do?”

    http://cdiac.ornl.gov/oceans/new_atmCFC.html

    http://www.esrl.noaa.gov/gmd/dv/iadv/graph.php?code=MLO&program=ccgg&type=ts

    I respect McIntyre and accept his conclusion that Hansen 1988 way, way, way over-estimated the growth of “Greenhouse gases” other than CO2. Therefore, actual CFCs, CH4 and N2O levels fell considerably below even Scenario C assumptions, and way, way, way below the Scenario A and B assumptions.

    That wasn’t his conclusion!

    You continue to give me the impression that you haven’t read the paper we’re discussing, you certainly don’t understand it!

    So, let us take score:

    1) A close look at Fig 2 from Hansen 1988 indicates that they assumed the warming effect of CO2, alone, was about equal to the combined warming effect of CFCs, CH4 and N2O. So, if they over-estimated one (CO2) and underestimated the other (CFCs, CH4 and N2O) by about the same amount, we should expect that the temperature anomaly would flatline, which, over the past decade and a half, it did. Score one for dumb luck!

    2) Scenario A got the exponential increase in CO2 pretty much correct, but they way, way, way over-estimated the increase in CFCs, CH4 and N2O, which have pretty much flatlined. That explains why Scenario A is so much higher than actual temperature anomalies.

    The point of Scenario A was to illustrate what would happen if existing trends in emissions continued, thus it was referred to as ‘business as usual’. “Scenario A assumes that growth rates of trace gas emissions typical of the 1970s and 1980s will continue indefinitely”. They didn’t over-estimate the increase, by definition that was what it was!
    “Scenario A…., must inevitably be on the high side of reality in view of resource constraints and environmental concerns”
    “Global warming…..occurs in all three scenarios…….depending on trace gas growth”

    3) Scenario B assumed a linear increase in CO2, which turned out to be pretty much correct, since the actual exponential increase is very mildly upward. But they way, way, way over-estimated the increase in CFCs, CH4 and N2O, which have pretty much flatlined. That explains why Scenario B is also much higher than actual temperature anomalies.
    Scenario B assumed growth such that the rate of growth in greenhouse forcing stayed at the then-current rate. Hansen judged that “Scenario B is perhaps the most plausible of the three cases.”

    4) Scenario C assumed a linear increase in CO2 until the year 2000, after which they assumed it would flatline. That, to this day, has turned out to be wrong, because CO2 has continued its mild exponential rise. They slightly over-estimated the increase in CFCs, CH4 and N2O, which have pretty much flatlined. That explains why Scenario C is only 31% higher than actual temperature anomalies.

    Scenario C was chosen to be the result of “a more drastic curtailment of emissions than has generally been imagined.”
    The scenarios were chosen to “yield sensitivity experiments for a broad range of future greenhouse forcings”. The goal being to bracket possible future climates, in this they were successful as you admit. In fact the then accepted value for sensitivity was somewhat high, if the currently accepted value is used then the result falls between B & C as one would expect from the actual emissions.

    Bottom line score; 1 for 4, and that 1 was dumb luck. Not too good. Not even for Scenario C, the closest to actual temperature anomalies.

    Pretty much bang on for 25 years in the future!

    Yet, they say their model (or “simulation” as RealClimate recently called it), in their words “while not perfect … has shown skill”.

    Indeed it has, try reading the paper and understanding it.

    RealClimate is comfortable with that. Are you? If so, they have one foot in a bucket of scalding water, and the other foot in a bucket of ice, but, on the average, they are comfortable :^)

    Yes because I understand the paper, and have read it several times, clearly you haven’t.

  156. Phil. says:
    March 25, 2013 at 5:05 pm
    Since we’re not in a period of no warming that’s not an issue, I have never expected runaway global warming in any case.

    “Planet Earth is not agreeing with you, Phil. Who should we believe, you and Hansen? Or the planet?”

    Believe the planet, in a couple of years time when the arctic is devoid of sea ice perhaps you’ll start to see the light?

    [Phil: You have never expected RUNAWAY global warming -GOOD TO HEAR- but, but you expect the arctic to be devoid of sea ice in a COUPLE YEARS TIME ? Forgive me, but that seems inconsistent. Ira]

    ‘Runaway’ I interpret as heading towards a Venus like state, which I don’t believe will happen.
    An increase of a few degrees, however, is another matter, and is sufficient to cause problems for us. Melting of the Arctic sea ice would be one effect of such a change; I would be surprised if that isn’t substantially complete in a couple of years. Here’s a graphic of the progress so far:

    Given the fragmentation of the present ice, which is mostly FYI (first-year ice), I wouldn’t be surprised to see another record low this fall.

  157. DirkH says:
    March 24, 2013 at 4:07 pm
    Phil. says:
    March 24, 2013 at 2:41 pm
    “result to lie between B and C. Hopefully the ongoing melting of the Arctic sea-ice won’t lead to a recurrence of the growth in CH4 but current measurements in the Arctic suggest otherwise.”

    The greenhouse effect of CH4 competes with H2O and is only measurable in dry winter weather. Are you SURE it’s a problem when it can’t even be measured in warm moist weather?

    And yet it’s routinely measured by satellites, how do you suppose that is?
    At the surface it’s easily measured with an FTIR spectrometer, so I don’t think that your premise holds up.

  158. Phil. says: March 26, 2013 at 9:21 am:

    DirkH says:
    March 24, 2013 at 4:07 pm
    Phil. says:
    March 24, 2013 at 2:41 pm
    [DirkH quoting Phil]… “ Hopefully the ongoing melting of the Arctic sea-ice won’t lead to a recurrence of the growth in CH4 but current measurements in the Arctic suggest otherwise.”

    [DirkH] The greenhouse effect of CH4 competes with H2O and is only measurable in dry winter weather. Are you SURE it’s a problem when it can’t even be measured in warm moist weather?

    [Phil] And yet it’s routinely measured by satellites, how do you suppose that is?
    At the surface it’s easily measured with an FTIR spectrometer, so I don’t think that your premise holds up.

    DirkH and Phil: I accept that the “Greenhouse Effect” is real see this in my series on Visualizing the “Greenhouse Effect” and that CH4 is a “Greenhouse gas”.

    However, as shown clearly here, all the CH4 peaks are coincident with portions of the spectrum where H2O is saturated or nearly so. Therefore, as DirkH is saying, given the prevalence of H2O, the contribution of rising CH4 levels (if they were actually rising) to actual warming of the Earth surface would be minimal, except in dry winter weather.

    Ira

  159. Please Ira, showing a low resolution cartoon of the spectrum shows nothing ‘clearly’. Also, that’s for the surface; the situation is much different at altitude!

  160. To argue over the magnitude of the climate sensitivity (aka the equilibrium climate sensitivity, TECS) is scientifically nonsensical, for the equilibrium temperature is not an observable.

  161. Ira Glickstein, PhD says:
    March 27, 2013 at 11:37 am

    However, as shown clearly here, all the CH4 peaks are coincident with portions of the spectrum where H2O is saturated or nearly so. Therefore, as DirkH is saying, given the prevalence of H2O, the contribution of rising CH4 levels (if they were actually rising) to actual warming of the Earth surface would be minimal, except in dry winter weather.

    Here’s a portion of the actual spectra rather than a cartoon.

    Bear in mind that at mid-troposphere altitude the temperature is about 250 K, so the CH4/H2O ratio has increased quite a bit by then; we’re not just talking about dry winter weather!
    Between 1300 and 1600 cm^-1 CH4 has ~36,000 absorption lines to ~4,000 water lines. Between 1200 and 1300 it’s even more dramatic, ~13,000 to ~1,000 (this is around the peak of the CH4 spectrum!)

  162. Phil. March 28, 2013 at 9:42:
    THANKS for the link to http://s302.photobucket.com/user/Sprintstar400/media/WaterCH4.gif.htm which PROVES MY POINT!

    Please notice the SCALE on the left side of the graph. Both the H2O and CH4 graphs are the same physical height, but the CH4 graph is multiplied eight times as much in height, and the bottom 7/8ths are cut off! The H2O graph goes from 0.88 to 1.00 while the CH4 graph goes all the way from 0.00 to 1.00! This is all quite misleading.

    Furthermore, please notice that the graphs you linked to extend only from Wavenumber 1200 (8.3 microns) to 1300 (7.7 microns). The LWIR spectrum of interest extends from below 7 microns up to 20 microns or more! So, those graphs are showing only a tiny fraction of the LWIR spectrum responsible for the Atmospheric “Greenhouse” Effect. The strong portion of the H2O absorption spectrum extends from 7 to 9 microns, and from 12 to 20 microns and beyond.
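    [The wavenumber-to-micron conversions quoted above follow from wavelength (microns) = 10,000 / wavenumber (cm^-1); a quick check, added for illustration:]

```python
# Convert a spectroscopic wavenumber (cm^-1) to a wavelength in microns:
# wavelength_um = 10,000 / wavenumber_cm.

def wavenumber_to_microns(wavenumber_cm):
    return 10_000.0 / wavenumber_cm

print(round(wavenumber_to_microns(1200), 1))  # 8.3
print(round(wavenumber_to_microns(1300), 1))  # 7.7
```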

    PLEASE NOTICE:

    1) H2O: The H2O graph extends from 0.88 to 1.00, which is NEAR-SATURATION (88%) to COMPLETE SATURATION (100%). Also notice that there is 100% saturation for well over 95% of the spectrum. Yes, there are several narrow spectral lines that are not 100% saturated, but the first three of these go down only to 96% saturation, the next one goes down to 90% saturation, and so on, and the worst one goes down only to 88% saturation.

    2) CH4: The CH4 graph extends all the way down to 0.00, which gives a totally misleading impression. If the H2O graphic was plotted to the same scale, it would only be about 1/8th the height! Also notice that the CH4 graph has many, many spectral lines that are less than 20% saturated and some that approach ZERO saturation.

    I don’t have the base data for these graphs; however, I would love to see a graph and calculation of what percentage of absorption of LWIR a doubling of CH4 would contribute to the total absorption of H2O plus CO2. My guess is that it would be well under 1%. And, as your link to Climate Audit indicated, CH4 is not increasing very much, if at all.

    I appreciate your contributions to this Topic thread and hope you answer the above.

    advTHANKSance

    Ira

  163. Ira please look at the legend of the graph, it’s transmittance not absorbance! In the region shown CH4 has many strong lines whereas water has a few weak ones.
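    [The transmittance/absorbance distinction at issue here can be made concrete with a toy calculation, added for illustration: on a transmittance axis, LOW values mark STRONG absorption, so a line dipping to 0.05 absorbs far more than one dipping to 0.88.]

```python
# Transmittance vs. absorbed fraction: a transmittance of 0.88 means 88%
# of the light passes through and only 12% is absorbed, so a low value on
# a transmittance axis marks a strong absorption line, not a weak one.

def absorbed_fraction(transmittance):
    return 1.0 - transmittance

print(round(absorbed_fraction(0.88), 2))  # 0.12 -> a weak line
print(round(absorbed_fraction(0.05), 2))  # 0.95 -> a strong line
```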

  164. Ira Glickstein, PhD says:
    March 28, 2013 at 8:58 pm
    Phil. March 28, 2013 at 9:42:
    THANKS for the link to http://s302.photobucket.com/user/Sprintstar400/media/WaterCH4.gif.htm which PROVES MY POINT!

    I’m afraid not!

    Please notice the SCALE on the left side of the graph. Both the H2O and CH4 graphs are the same physical height, but the CH4 graph is multiplied eight times as much in height, and the bottom 7/8ths are cut off! The H2O graph goes from 0.88 to 1.00 while the CH4 graph goes all the way from 0.00 to 1.00! This is all quite misleading.

    As indicated briefly above, you have misread the spectrum: as the legend shows, it is a transmissivity spectrum, so the H2O spectrum shows very little absorption in that region compared with CH4!

    Furthermore, please notice that the graphs you linked to extend only from Wavenumber 1200 (8.3 microns) to 1300 (7.7 microns). The LWIR spectrum of interest extends from below 7 microns up to 20 microns or more! So, those graphs are showing only a tiny fraction of the LWIR spectrum responsible for the Atmospheric “Greenhouse” Effect. The strong portion of the H2O absorption spectrum extends from 7 to 9 microns, and from 12 to 20 microns and beyond.

    I presented this in response to your statement which I highlighted: “all the CH4 peaks are coincident with portions of the spectrum where H2O is saturated or nearly so.”
    As I showed this is not true for the main CH4 peak between 1200 and 1300 cm^-1, what water does elsewhere is not relevant. That’s the problem of working with a low resolution cartoon!
    The H2O spectrum while extensive is very sparse in terms of lines as referred to above, in the upper atmosphere there are many gaps where other gases can be effective.

    PLEASE NOTICE:

    1) H2O: The H2O graph extends from 0.88 to 1.00, which is NEAR-SATURATION (88%) to COMPLETE SATURATION (100%). Also notice that there is 100% saturation for well over 95% of the spectrum. Yes, there are several narrow spectral lines that are not 100% saturated, but the first three of these go down only to 96% saturation, the next one goes down to 90% saturation, and so on, and the worst one goes down only to 88% saturation.

    2) CH4: The CH4 graph extends all the way down to 0.00, which gives a totally misleading impression. If the H2O graphic was plotted to the same scale, it would only be about 1/8th the height! Also notice that the CH4 graph has many, many spectral lines that are less than 20% saturated and some that approach ZERO saturation.
    Completely wrong, because of the misreading of the spectrum.

    I don’t have the base data for these graphs; however, I would love to see a graph and calculation of what percentage of absorption of LWIR a doubling of CH4 would contribute to the total absorption of H2O plus CO2. My guess is that it would be well under 1%. And, as your link to Climate Audit indicated, CH4 is not increasing very much, if at all.
    Over its lifetime in the atmosphere CH4 averages over 20X greater effect per mole than CO2, so for a doubling from the current ~2 ppm the effect would be equivalent to an additional 40 ppm of CO2; it also has a greater-than-logarithmic sensitivity. My reference to CH4 was: “Hopefully the ongoing melting of the Arctic sea-ice won’t lead to a recurrence of the growth in CH4 but current measurements in the Arctic suggest otherwise.” So although the flattening anticipated by Hansen’s Scenario C has occurred, present measurements in the Arctic are indicating recent releases, probably due to local warming.
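    [The CO2-equivalence arithmetic here can be sketched as follows; the ~20X-per-mole factor and the ~2 ppm baseline are the commenter's figures, taken as given rather than independently verified:]

```python
# If CH4 averages ~20x the per-mole effect of CO2 (figure from the comment
# above, not independently verified), then doubling CH4 from ~2 ppm adds
# ~2 ppm of CH4, roughly equivalent to an extra 40 ppm of CO2.

CH4_PER_MOLE_FACTOR = 20   # relative effect vs CO2, per the comment
ch4_increase_ppm = 2       # a doubling from the current ~2 ppm

co2_equivalent_ppm = ch4_increase_ppm * CH4_PER_MOLE_FACTOR
print(co2_equivalent_ppm)  # 40
```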

    I appreciate your contributions to this Topic thread and hope you answer the above.

  165. Terry Oldberg says:
    March 25, 2013 at 8:26 am

    I had to travel, and was unable to respond. I am familiar with the MEM in signal processing. The methodology I advocated could be couched in those terms, involving as it does identification of the system dynamics, and following through with optimal prediction of its evolution. MEM identification methods can be employed, but the system at hand has high SNR, and it would really just be gilding the lily.

    • Bart:

      A generalization can be drawn from experience in building information theoretically optimal models. This is that 150 observed independent events is about the minimum for construction of a predictive model. Going back to the beginning of the various global temperature time series in the year 1850, there are between 5 and 6 events of 30 years’ duration each; 30 years is the duration that is canonical in climatology and is the approximate period in which power producing facilities depreciate to nil. Five events is too few for construction of an information theoretically optimal model by a factor of at least 50. Thus, I can’t agree with you when you claim it to be easy to extract the signal from the noise. Under current circumstances, it would be impossible to extract such a signal at all.

      By the way, as a signal would come to us from the future, it would have to travel at a superluminal speed. It follows from the impossibility of such a speed under Einsteinian relativity that a signal power greater than nil is not possible. Thus, the signal-to-noise ratio is undefined. In view of this state of affairs, I prefer to replace the term “signal” by the term “message.” The message that one might hypothetically receive would be a sequence of outcomes of events in the underlying statistical population. As global warming climatology references no such population, the existence of such a message is not currently possible.
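      [The event-count arithmetic in this comment can be sketched as follows; the ~150-event threshold is the commenter's figure, taken as given:]

```python
# 30-year "events" available since the 1850 start of the global temperature
# series, versus the suggested minimum of ~150 independent events needed to
# build an information-theoretically optimal predictive model.

RECORD_START = 1850
THREAD_YEAR = 2013     # approximate date of this discussion
EVENT_YEARS = 30       # canonical averaging period in climatology
MIN_EVENTS = 150       # stated minimum, per the comment above

events_available = (THREAD_YEAR - RECORD_START) // EVENT_YEARS
print(events_available)  # 5 -- far short of the ~150 suggested minimum
```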

  166. Terry Oldberg says:
    March 30, 2013 at 8:30 am

    “It follows from the impossibility of such a speed under Einsteinian relativity that a signal power greater than nil is not possible”

    Not according to absorber theory.

    In any case, I don’t really want to be arguing over how many angels can dance on the head of a pin, or whether our universe is a grain of detritus lodged in the fingernail of a giant. Such conversations are best conducted when one is an undergraduate, in a relaxed atmosphere among friends sharing mind altering substances.

    The method I am advocating is a converging series of inferences based upon actual measurements, rather than a scattershot series of random guesses to which one then attempts to reconcile the data. The latter is the process in which the IPCC et al. are engaged, and it is horrible science. I think we agree on that, at least, so let us leave it there until we meet again. Cheers.

  167. Phil. March 29, 2013 at 3:37 am

    Ira please look at the legend of the graph, it’s transmittance not absorbance! In the region shown CH4 has many strong lines whereas water has a few weak ones.

    You are correct, the graphs you linked to say “Transmittance” along the Y axis.

    However, they are in conflict with the graph in http://en.wikipedia.org/wiki/File:Atmospheric_Transmission.png (which you call a “cartoon”). I agree the graphs you linked to are more precise and focussed on the narrow spectral absorption region for CH4 than the broader and coarser graph that I linked to from Wikipedia.

    However, the Wikipedia graph is clearly based on actual measurements of the Atmosphere and it covers the entire region of LWIR emissions from the Earth that are absorbed and retransmitted back towards the Surface and that make the Atmospheric “greenhouse” effect (AGE) work. Everyone is welcome to look at that graphic and will notice that, even in the narrow band where CH4 is active, its absorption peaks below 50%. In that same narrow band, H2O is either totally saturated at 100% absorption or, for part of that band, goes down only to around 50%. Thus, even in that narrow band, H2O overwhelms CH4 in effectiveness and thus in importance for AGE.

    If we look at the entire LWIR band, from about 6 to 30 microns, H2O is saturated for a major portion, and, when combined with CO2, together they are saturated for the majority of the band of interest.

    The narrow CH4 band graphs you linked to, marked “Transmittance”, do not have any information about where or what part of the actual Atmosphere they refer to. Are they for dry winter conditions in the Arctic? It seems to me we should be using average worldwide Atmospheric conditions. You do not give a source for your graphs, except for the legend “Spectracalc.com”. I went to that website and found a calculator application where the user could enter various conditions and assumptions. What did you (or whoever generated these graphs) enter as assumptions?

    Do you have any explanation for why the graphs you linked to contradict the Wikipedia graph?

    Please reply because I am interested in this issue. advTHANKSance

    Ira

  168. A brief reply, Ira; I’ll respond in more detail later. Firstly, you’re mistaken on the provenance of the spectrum you linked to: it’s actually a synthetic spectrum, not measured. It’s made from the same source as mine, Spectracalc! It does not say which location in the atmosphere it is calculated for, so it’s not clear how much H2O is assumed. The spectrum is smoothed, which is why the structure is not apparent, so strong lines might be missed and the sparse nature of the H2O spectrum is lost. You are incorrect about the relationship of the CH4 and H2O spectra; look carefully: the CH4 is at the edge of the H2O spectrum, and this is even more important when you don’t look at a smoothed spectrum such as the one you presented.

  169. Phil., yes, the chart I linked to is generated from Spectracalc as you say, but it is clearly intended to represent some version of an “average” worldwide Atmosphere, which is what is needed here to visualize how the various “greenhouse” gases contribute to AGE.

    I took the largest available version of the chart and blew it up and, using a drawing tool, compared the CH4 part of the spectrum to the corresponding H2O portion and found that, while the CH4 averaged perhaps 25% saturation over that narrow range, the H2O averaged well over 50%. (I would appreciate it if you or some other reader would do the same and report conclusions.)

    Then, considering that the CH4 band of absorbance is only about an eighth of the strong LWIR band of Earth Emission, it seems to me there is no other conclusion than that H2O dominates, that CO2 covers a portion where H2O is weakest, and that CH4 contributes next to nothing because it is narrow, weak, and happens to overlap an H2O region that is stronger than the CH4 by a factor of at least two.

    Again, please report the conditions assumed by Spectracalc for the graphs you linked to. Do you agree that the person who requested that Spectracalc plot specified an extremely dry period (e.g., winter) or location (e.g., Arctic) and therefore obtained results that, while true for that period and region, are not representative of averages for the whole Earth and all Seasons?

    Ira

    PS: I really appreciate your sticking with this Topic thread. THANKS.

  170. Phil: While waiting for your reply to my previous posting, I found this at: http://en.wikipedia.org/wiki/Greenhouse_gas It appears to be fact-based and is certainly not on the skeptic side regarding Global Warming! The bottom line for me is that methane is well below water vapor and carbon dioxide and contributes less than half as much as CO2 and a ninth of H2O.

    Impact of a given gas on the overall greenhouse effect

    Schmidt et al. (2010) analysed how individual components of the atmosphere contribute to the total greenhouse effect. They estimated that water vapor accounts for about 50% of the Earth’s greenhouse effect, with clouds contributing 25%, carbon dioxide 20%, and the minor greenhouse gases and aerosols accounting for the remaining 5%. In the study, the reference model atmosphere is for 1980 conditions….

    The contribution of each gas to the greenhouse effect is affected by the characteristics of that gas, its abundance, and any indirect effects it may cause. For example, on a kg for kg basis the direct radiative effects of methane is about 72 times stronger than carbon dioxide over a 20 year time frame but it is present in much smaller concentrations so that its total direct radiative effect is smaller, and it has a shorter atmospheric lifetime. On the other hand, in addition to its direct radiative impact methane has a large indirect radiative effect because it contributes to ozone formation. Shindell et al. (2005) argue that the contribution to climate change from methane is at least double previous estimates as a result of this effect.

    When these gases are ranked by their direct contribution to the greenhouse effect, the most important are:
    [this table does not reproduce too well, see original at linked webpage]

    Gas / Formula / Contribution (%)
    Water vapor H2O 36 – 72%
    Carbon dioxide CO2 9 – 26%
    Methane CH4 4 – 9%
    Ozone O3 3 – 7%

    … It is not possible to state that a certain gas causes an exact percentage of the greenhouse effect. This is because some of the gases absorb and emit radiation at the same frequencies as others, so that the total greenhouse effect is not simply the sum of the influence of each gas. The higher ends of the ranges quoted are for each gas alone; the lower ends account for overlaps with the other gases. [my bold] In addition, some gases such as methane are known to have large indirect effects that are still being quantified.

    So, taking the low ends, which according to the text “account for overlaps with the other gases,” we get Methane at 4%, compared to CO2 at 9% (over twice the contribution) and H2O at 36% (nine times the contribution).
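    [The ratios quoted here follow directly from the low-end figures in the table above; a quick check, added for illustration:]

```python
# Low-end greenhouse contributions from the quoted Wikipedia table
# (overlaps with other gases accounted for), in percent.
low_end = {"H2O": 36, "CO2": 9, "CH4": 4}

print(low_end["CO2"] / low_end["CH4"])   # 2.25 -> CO2 is "over twice" CH4
print(low_end["H2O"] / low_end["CH4"])   # 9.0  -> H2O is "nine times" CH4
```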

    Or, to put it another way, if I listed the “greenhouse” gases in order of importance, it would be: H2O, H2O, H2O, H2O, H2O, H2O, H2O, H2O, H2O, CO2, CO2, CO2, CO2, CH4

    Ira

Comments are closed.