Climate Models -vs- Climate Reality: diverging or just a dip?

Here’s something really interesting: two comparisons between model ensembles and three well-known global climate metrics, plotted together. The interesting part is what happens in the near present: while the climate models and the climate measurements start out in sync in 1979, they don’t stay that way as we approach the present.

Here are the trends from 1979 to “Year” for HadCRUT, NOAA and GISTemp, compared to the trend based on 16 AOGCMs driven by volcanic forcings:

Figure 1: Trends since 1979, ending in ‘Year’.

A second graph, showing 20-year trends, makes the divergence more pronounced. Lucia Liljegren of The Blackboard produced both of these plots, and she writes:

Note: I show models with volcanic forcings partly out of laziness and partly because the period shown is affected by eruptions of both Pinatubo and El Chichon.

Here are the 20-year trends as a function of end year:

Figure 2: Twenty-year trends as a function of end year.

One thing stands out clearly in both graphs: in the past few years the global climate models and the measured global climate reality have been diverging.
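To make the two constructions concrete, here is a minimal sketch of how such trends can be computed, assuming a monthly global anomaly series (the synthetic data and numbers below are illustrative stand-ins, not Lucia's code or the actual datasets):

import numpy as np

def trend_per_decade(years, anoms):
    # OLS slope of anomaly vs. decimal year, converted to deg C per decade.
    slope, _intercept = np.polyfit(years, anoms, 1)
    return slope * 10.0

# Toy monthly anomaly series for 1979-2008, standing in for HadCRUT/NOAA/GISTemp.
rng = np.random.default_rng(0)
t = 1979.0 + np.arange(12 * 30) / 12.0
anom = 0.017 * (t - 1979.0) + 0.1 * rng.standard_normal(t.size)

# Figure 1 style: trend from the fixed 1979 start to each end year.
fig1 = {end: trend_per_decade(t[t < end + 1], anom[t < end + 1])
        for end in range(1989, 2009)}

# Figure 2 style: trailing 20-year trend as a function of end year.
fig2 = {end: trend_per_decade(t[(t >= end - 19) & (t < end + 1)],
                              anom[(t >= end - 19) & (t < end + 1)])
        for end in range(1999, 2009)}

The same trend routine serves both figures; only the window selection differs, which is why short-term noise shows up more strongly in the fixed-length 20-year curves.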

Lucia goes on to say:

I want to compare how the observed trends fit into the ±95% range of “all trends for all weather in all models”. For now I’ll stick with the volcano models. I’ll do that tomorrow. With any luck, HadCrut will report, and I can show it with March data. NOAA reported today.

Blackboard commenter Chad came to the rescue, producing his own plot of the ±95% confidence intervals from the model ensembles against HadCRUT. His results show a divergence very similar to Lucia’s, starting around 2006.

http://scientificprospective.files.wordpress.com/2009/04/hadcrut_models_01.png
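In outline, the envelope test behind Chad's plot looks something like the following sketch (the trend values here are made up for illustration; the real exercise pools the trailing trends from every run of every model):

import numpy as np

# Hypothetical per-run 20-year trends (deg C/decade) from a model ensemble.
model_trends = np.array([0.08, 0.12, 0.15, 0.18, 0.19, 0.21,
                         0.22, 0.24, 0.27, 0.30, 0.33, 0.38])
obs_trend = 0.16  # illustrative observed trend from, e.g., HadCRUT

# Empirical 95% envelope of the ensemble, then a simple consistency check.
lo, hi = np.percentile(model_trends, [2.5, 97.5])
print(f"95% envelope: [{lo:.2f}, {hi:.2f}] C/decade")
print("observations inside envelope:", lo <= obs_trend <= hi)

Divergence in this picture means the observed trend drifting toward, or out of, the lower edge of that envelope as the end year advances.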

So the question becomes: is this the beginning of a new trend, or just short-term climatic noise? Only time will tell for certain. In the meantime it is interesting to watch.

180 Comments
PeteB
April 17, 2009 2:04 am

Thanks all,
I found the exchange here between David Stockwell and Stefan interesting:
http://www.realclimate.org/index.php?p=554#comment-84440
and also the material on David Stockwell’s site.
I am not sure I understand the statistics well enough, but 16 years seems too short a period (certainly for temperature, maybe not for sea level) to say that temperature is increasing faster than the IPCC estimates. To be fair, Stefan did acknowledge that in the original paper:
Given the relatively short 16-year time period considered, it will be difficult to establish the reasons for this relatively rapid warming, although there are only a few likely possibilities. The first candidate reason is intrinsic variability within the climate system.

It’s a pity somebody couldn’t have taken up Stefan’s challenge, though:
[Response: If you really think you’d come to a different conclusion with a different analysis method, I suggest you submit it to a journal, like we did. I am unconvinced, though. -stefan]

E.M.Smith
Editor
April 17, 2009 2:22 am

Squidly (17:21:09) : As a long time computer scientist myself, I cannot imagine how one could seriously consider conclusions from codes developed so haphazardly.
Then you will just love this. From the top-level control script of Step4 of GIStemp, we have these “operator instructions” embedded in an error message:

fortran_compile=$FC
if [[ $FC = "" ]]
then echo "set an environment variable FC to the fortran_compile_command like f90"
echo "or do all compilation first and comment the compilation lines"
exit
fi

Can you imagine what would happen in a real production shop if your compilation procedure is ~”or whatever, and edit the code a bit”…
From:
http://chiefio.wordpress.com/2009/03/07/gistemp-step4-the-process/

It then proceeds to check that the environment variable “FC” is set to your FORTRAN compiler. Interestingly enough, it suggests hand-editing the script to comment out the compilation steps if you want to do everything by hand. Have these people never heard of using “#DEFINE” or passed parameters to control script execution paths?
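What he is asking for might look something like this hypothetical driver (sketched in Python rather than ksh for brevity; the --skip-compile flag and the compile command are illustrative, not part of GIStemp):

import argparse
import os
import subprocess
import sys

# A passed parameter controls the execution path, instead of asking the
# operator to hand-edit the script and comment out the compilation lines.
parser = argparse.ArgumentParser()
parser.add_argument("--skip-compile", action="store_true",
                    help="reuse existing binaries instead of recompiling")
args = parser.parse_args()

if not args.skip_compile:
    fc = os.environ.get("FC")
    if not fc:
        sys.exit("set FC to your Fortran compiler, e.g. export FC=f90")
    subprocess.run([fc, "convert1.HadR2_mod4.f", "-o", "convert1"], check=True)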
Next up, we check that the input file input_files/SBBX.HadR2 exists.
(Which it doesn’t right now. STEP3 at the end says to create it or use STEP4 code to update it. It looks like the creation of the file is left as an exercise for the student.)
The second passed parameter is assigned to “mo” and manipulated: we prepend a “0” to any single digit, then look for an input file of the form input_files/oiv2mon.[year][start_month]
(which also doesn’t exist right now, so is also left as an exercise for the student. Maybe malt liquor would help… or Scotch… yeah, a nice single malt… Sorry, the mind wanders when faced with crap like this.)
If we have a third parameter, we assign it to “mo2”, though no guidance is given as to when you would want two vs. three parameters or what choices might make the anomalies work out “best”. More play time for the students…
Finally, we get to the meat of it: we compile and run the FORTRAN program convert1.HadR2_mod4.f, passing it the parameters year, monthstart and monthend that were given to the script, or using monthstart as monthend if only one month was given.
At the end we are told what to do as a manual step “if all went well”. One is left to wonder how one knows if all went well, and what the acceptance criteria might be…
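For readers who don't read ksh, the parameter handling described above boils down to roughly the following (a Python paraphrase of the steps E.M.Smith walks through, not the actual GIStemp code):

import os
import sys

# Arguments: year, start month, optional end month.
year = sys.argv[1]
mo = sys.argv[2].zfill(2)                       # prepend "0" to single digits
mo2 = sys.argv[3].zfill(2) if len(sys.argv) > 3 else mo  # default: one month

# The input files the script expects to find (and currently doesn't).
for needed in ("input_files/SBBX.HadR2", f"input_files/oiv2mon.{year}{mo}"):
    if not os.path.exists(needed):
        sys.exit(f"missing {needed}: creation left as an exercise for the student")

# At this point the real script compiles and runs the FORTRAN program.
print(f"would compile and run convert1.HadR2_mod4.f with {year} {mo} {mo2}")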

3x2
April 17, 2009 2:35 am

Frank K. (11:17:51) :
CO2 Realist (07:53:41) :
(…) Extracting exactly the part that corresponds to a single paper and documenting it so that it is clear what your conventions are (often unstated) is non-trivial. – gavin]” (…)

Given the huge sums invested in “climate science”, together with the astronomical carbon taxes being proposed, it really is a disgrace. It is quite unbelievable that they are allowed to continue with this attitude.
It might be interesting to add up the climate-science money plus carbon taxes worldwide and divide that by the number of papers produced, to put a figure on just how “non-trivial” it all works out per paper.
A complete house of cards.

E.M.Smith
Editor
April 17, 2009 2:51 am

KimW (18:12:41) : The problem is not just with climate models, but the refusal to think on what evidence implies. From an article in Science,
[…]”with the most recent lasting from 1400 to 1750.”
[…]The cause of centuries-long megadroughts is not known, but he said the added burden of climate change could make this kind of drought more devastating. Temperatures in this region are expected to rise by 5 to 10 degrees F (2.77 to 5.55 degrees C) this century,”

Little Ice age, roughly 1350 to 1850. And they don’t see this connection to the ’sweet spot’ in the depths of the middle? They are worried about warming, when clearly the drought strongly correlates with an epic cold period?
I am ‘gobsmacked’ !!.
Me too! I think you got it ’spot on’ in your opening: “refusal to think on what evidence implies.”
(It must be getting late… three tries to get one short comment straight?…)

E.M.Smith
Editor
April 17, 2009 3:02 am

Per GIStemp:
CO2 Realist (22:32:59) :
Frank K. (21:50:45) says: Try to figure out what this code is really doing versus the description given in the “documentation”…
I’ll give it a go, but I’m sure I could take your word for it! It will be an enlightening exercise.

I have the code on line and with comments at:
http://chiefio.wordpress.com/gistemp/
You will probably find that an easier start than trying to decipher the code as downloaded. I’m hoping to get a few more folks looking at it and ‘documenting’ what bits do in the comments to the pages…

3x2
April 17, 2009 4:00 am

On the subject of GIGO it is worth noting that, presuming the models are under continual development, GI also alters the code in the process.
As an example, suppose that as a modeller you are happy with your model representation of the Antarctic. Your model correctly (in line with the other models) sees no warming and possibly some cooling in the region. Along comes Steig and tells you that there is significant Antarctic warming.
As a modeller, what do you do with this information? If you uncritically take it at face value, then you accept that your model is missing something big and modify it accordingly. Your model now agrees with the new data. This may require major changes that ripple out through the rest of the model. But what if it turns out, as many suspect, that Steig’s result is a complete fabrication?
I propose Steig Amplification – similar to GIGO only it alters the code on the way through.

doodle
April 17, 2009 4:06 am

You need a celebrity to take up the cause and make it fashionable to question climate change. Maybe Susan Boyle can be the spokesperson in a year’s time, LOL.

Frank K.
April 17, 2009 5:44 am

E.M.Smith (03:02:13) :
Thanks for the link! Your analysis of GISTEMP is extraordinary and worthy of a post here at WUWT (perhaps in several installments). How about it, Anthony? :^)

CO2 Realist
April 17, 2009 7:20 am

E.M.Smith (02:22:33) says:
Can you imagine what would happen in a real production shop if your compilation procedure is ~”or whatever, and edit the code a bit”…
and
At the end we are told what to do as a manual step “if all went well”. One is left to wonder how one knows if all went well, and what the acceptance criteria might be…

Now you’re really scaring me. Apparently it is much worse than I thought.
E.M.Smith (03:02:13) says:
I have the code on line and with comments…

Great effort. I’ll have a look, though Fortran is not my strong point.
3×2 (02:35:47) says:
A complete house of cards.

Quite the understatement.
Anthony, here’s another vote for a post of E.M.Smith’s analysis.
REPLY: Convince him to pack it up into a single document and I’ll have a look. – Anthony

John Galt
April 17, 2009 8:03 am

@E.M. Smith:
We’re constantly told how complex the climate models are. Complexity is subjective, but how about lines of code? How big is the codebase? Does it really require a supercomputer to run?

CO2 Realist
April 17, 2009 8:17 am

Frank K. (05:44:04) writes:
E.M.Smith (03:02:13) :
Thanks for the link! Your analysis of GISTEMP is extraordinary and worthy of a post here at WUWT (perhaps in several installments). How about it, Anthony? :^)

I added my vote at CO2 Realist (07:20:39) and Anthony replied:
REPLY: Convince him to pack it up into a single document and I’ll have a look. – Anthony
So E.M.Smith, what do you say? I think it would be educational for many here.

Philip_B
April 17, 2009 9:25 am

Masonmart, hot deserts are colder than the humid tropics most of the time (nighttime, winter, spring, autumn). That is why their average temperature is 10C lower.
For example, the hottest temperature ever recorded on Earth was 58C at El Azizia, Libya (Sahara desert). Yet the winter (January) average temperature in El Azizia is only 11.5C.
Even in the middle of summer, hot deserts aren’t any hotter than the humid tropics. Both have average temps around 27C/28C. Compare El Azizia with Singapore.
http://hypertextbook.com/facts/2000/MichaelLevin.shtml

Tom P
April 17, 2009 11:51 am

The problems with the science of Monckton’s analysis have been presented elsewhere, e.g.:
http://arxiv.org/abs/0811.4600?context=physics
and the plots at the top of this very thread are hardly consistent with his claim of current global cooling.
But what most amused me in his submission to Congress referenced above was his frequent use of a logo in the background of his figures that bears a striking resemblance to the portcullis insignia of the Houses of Parliament:
http://img5.imageshack.us/img5/3170/portcullis.png
I suppose these were used by Monckton as a vague appeal to authority, although the use of the portcullis insignia is tightly regulated:
“In 1996, the usage of the crowned portcullis was formally authorised by licence granted by Her Majesty the Queen for the two Houses unambiguously to use the device and thus to regulate its use by others. The emblem should not be used for purposes to which such authentication is inappropriate, or where there is a risk that its use might wrongly be regarded, or represented as having the authority of the House.”
But the truly amusing part is not whether Monckton is misusing the insignia in his submission: although he inherited a title, Viscount Monckton is not even a member of the House of Lords. When he stood for election in 2007 by his fellow Conservative peers, he received precisely zero votes.

Roger Knights
April 17, 2009 4:14 pm

Thom Scrutchin (17:54:00) wrote:
“The new acronym should be AGWA for Anthropogenic Global Warming Alarmism.”
Great idea, because it’s pronounceable, and sounds silly.
(BTW, since AGW isn’t pronounceable, it isn’t an acronym.)

Kohl Piersen
April 17, 2009 5:16 pm

Tom P: “But what most amused me…” and what follows.
Why do you think that this public demonstration of your personal antipathy towards Lord Monckton is of any interest to anyone?
Rather than continuing to speak through your backside, I suggest that you put it back in your pants, where it undoubtedly belongs, and then …. sit on it!

Philip_B
April 17, 2009 7:02 pm

the plots at the top of this very thread are hardly consistent with his claim of current global cooling.
Tom P, thanks for giving us an example of the reality denial that pervades the Warming Believer camp.
According to you, temperatures going down is not cooling.

Tom P
April 18, 2009 12:36 am

Philip_B,
Please read the y-axis label of the three figures: it is the trend, not the temperature, that is decreasing. All three figures indicate we continue to warm, not cool.
Can you offer any explanations for that?
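For what it's worth, here is a toy numerical illustration of the distinction Tom P is drawing (synthetic data, unrelated to the actual records): a quantity can keep rising even while its successive trailing trends fall.

import numpy as np

# Temperature rises monotonically but ever more slowly, so each later
# 20-point trailing trend is smaller, yet never indicates cooling.
x = np.arange(40, dtype=float)
temp = np.sqrt(x + 1.0)  # always increasing

for end in (20, 30, 40):
    seg = slice(end - 20, end)
    slope, _ = np.polyfit(x[seg], temp[seg], 1)
    print(end, round(slope, 3))  # positive but decreasing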

Deep Climate
April 18, 2009 12:20 pm

Lucia’s analysis was inspired by my observation and analysis showing that observed long-term temperature trends were rising over the 2001-to-present period, even though the trend within that period was flat or declining.
I compare these observed longer (20-year) trends to the IPCC early 21st century projected trend of 0.2 C/decade, based on the multi-model ensemble used in IPCC AR4 WG1 Chapter 10 projections. This same benchmark was used by Roger Pielke Jr and Pat Michaels in their analyses, and is fairly standard as far as I know.
The 20-year trends for GISTemp, NOAA and HadCRUT were all ahead of the benchmark earlier this decade, and are presently a little under. This fluctuation about the benchmark value is hardly surprising, given normal interannual variations and a relatively cool La Nina year at the end point.
See:
http://deepclimate.org/2009/04/18/20-year-surface-trends-close-to-models/
http://deepclimate.files.wordpress.com/2009/04/20-year-trend.gif
Thanks for reading.

John M
April 18, 2009 1:26 pm

Deep Climate

I compare these observed longer (20-year) trends to the IPCC early 21st century projected trend of 0.2 C/decade, based on the multi-model ensemble used in IPCC AR4 WG1 Chapter 10 projections.

Except Lucia and others have pointed out to you that you’ve sort of cherry-picked that 0.2 C/decade benchmark (how ’bout those volcanoes?)
http://rankexploits.com/musings/2009/hadcrut-march-data-available/#comment-12890
http://rankexploits.com/musings/2009/hadcrut-march-data-available/#comment-12900

John M
April 18, 2009 1:33 pm

Hmm.
Looks like the individual comment links are not unique.
You’ll have to scroll down to read Lucia’s relevant comments.

Niels A Nielsen
April 18, 2009 1:47 pm

Deep Climate (12:20:03) :

I compare these observed longer (20-year) trends to the IPCC early 21st century projected trend of 0.2 C/decade, based on the multi-model ensemble used in IPCC AR4 WG1 Chapter 10 projections.

Deep is not comparing apples to apples. The models’ projected 20-year trends with endpoint 2009 would be far lower than 0.2C/decade if a couple of stratospheric volcanic eruptions had occurred within the last 5 years (volcanoes cause cooling) and no eruptions had occurred for the preceding 15 years.
That is not the case: El Chichon (’82) and Pinatubo (’91) erupted, followed by a volcanic lull until now, making the models’ projected trend higher than 0.2C/decade, as Lucia has explained.
Deep refuses to understand. I trust he will maintain his point about the fixed benchmark of 0.2C/decade should a large stratospheric volcano erupt in the near future, when the 20-year observational trend in five years’ time is no longer affected by the Pinatubo eruption at the other end of the 20-year trend line 😉
http://rankexploits.com/musings/2009/hadcrut-march-data-available/

David Stockwell
April 18, 2009 1:54 pm

PeteB:
I have submitted a different conclusion with the same analysis method, simply with two additional years’ data. The non-linear trend is no longer in the upper region of the IPCC projections for temperature. See the PowerPoint at http://landshape.org/enm/newcastle-lecture-update/ for the image.