Short-term trends from GISS Model E: "The model would be off by about 0.15C in the first five years"

On occasion, comments posted on WUWT are backed up with data or graphs from the commenter and are so germane that they merit their own post for discussion. This is one of those cases. Bill Illis has done a couple of guest posts on WUWT, the most recent being “Trade Winds Drive the ENSO“. In the comments on the story “When you can’t believe the model”, he posted a significant comment on his work with the NASA GISS Model E global climate model, backed up with his own research graphs. For those brave enough to slog through it, here is the manual for Model E. I thought Bill’s comments were worth sharing. – Anthony

http://www.norcalblogs.com/watts/images/gcmE1.gif

The image above is from my stock imagery and is for illustration only; it is not from Bill Illis.

Guest Comments by Bill Illis

A while ago, I pulled apart the components of GISS’s Model E and then extended the forecast it would have provided from 2003 (the end date of the data provided by GISS) to 2013, ten years.

The model would be off by about 0.15C in the first five years.

The more detailed version of this extension is here:

Click for a larger image

The simpler version is below.

Click for a larger image

Another way to look at it is that they have huge GHG temperature impacts built in (no way to get to +3.0C without it), but they need to build in almost as big negative temperature impacts from other sources to keep the hindcast close to the actual temperatures we have seen so far.

One could conclude they are just plugging the big negative numbers into the hindcast after the fact to make it work.

That is close to the point Leland Teschler was trying to make in this article (seen here).

Click for a larger image

Without a large uptick in temperatures in the next few years, the modelers really have to go back to the drawing board (or they need to discover another “negative forcing” to keep the models on track to reality).

anna v
February 20, 2009 8:44 am

Basil (06:13:41) :
An exposition of the connection of neural nets, climate and chaos is given here by Tsonis:
http://www.uwm.edu/~aatsonis/BAMS_proofs.pdf
I think in a sense it is the same logic as with analogue computing: since one cannot solve the coupled differential equations one can model the equations themselves (make analogues) and let the system solve them.
There have been discussions of the Tsonis and Koutsoyiannis papers at CA.

John F. Pittman
February 20, 2009 9:15 am

Kohl Pierson, go to lucia’s http://rankexploits.com/musings/ for a good amount of information. If you want a detailed discussion, find the argument, I believe it is on the Pat Frank thread, between Gerald Browning and Schmidt at RC. Also, read the cross posts at CA, just in case RC edited or deleted something. For your own, read Tebaldi, C. and R. Knutti, 2007, The use of the multi-model ensemble in probabilistic climate projections, Philosophical Transactions of the Royal Society A, 365, 2053-2075, doi:10.1098/rsta.2007.2076, PDF (0.3 MB) http://www.iac.ethz.ch/people/knuttir/papers/tebaldi07ptrsa.pdf

An Inquirer
February 20, 2009 9:15 am

“One could conclude they are just plugging the big negative numbers into the hindcast after the fact to make it work.”
This is a key statement: a devastating one if true, but an incendiary one if simply conjecture. Does Bill Illis reach this conclusion? – or is he just speculating that others have reached this conclusion? Has he done the necessary research to come to a conclusion?
Hopefully it is well known that climate models do not track GMT well if one includes just GHGs, and they do not get much better by adding our current understanding of solar energy. However, they track well if aerosols are added.
Over a year ago, I spent a couple of months studying GCM use of aerosols, including the available data sets on aerosols that can be used as inputs. The data on aerosols are more extensive than most skeptics realize. Nevertheless, on a global scale, aerosol data sets are sketchy relative to the number of grids and years in the models. Also, one must estimate (or assume / conjecture) what the forcing impact of aerosols on temperatures is. (This estimation process makes laughable the statement that GCMs are based on solid, well-understood physical principles.) While I did not conclude that aerosols are simply dummy variables used arbitrarily to make the hindcast fit, I did get the impression that both the data sets and the estimated forcing impacts were conveniently chosen.

February 20, 2009 9:21 am

Bill C: Where does Bill Illis say that volcanic aerosols should be eliminated from the equation? My take on what he’s written and provided is that he notes that they’re not included in his model, so that the user can take it into consideration.

Mike C
February 20, 2009 9:44 am

Bob, think about it… if they’re not included in the model, then they’re obviously eliminated from the equation.

February 20, 2009 10:28 am

Bill Illis,
Great analysis. Much better than the IPCC’s projections: click

Earle Williams
February 20, 2009 10:33 am

I can relate to many of the issues folks have raised about modeling. I developed a FORTRAN model for my MS thesis to calculate the direct-current response to buried resistive bodies (things, not corpses!). I still remember a few dreams of being stuck in a do loop and unable to get out… 🙂
Adjusting model inputs to yield model outputs that compare well with reality isn’t a bad thing. If a model can’t accurately reproduce the known data, then it is close to useless. So the tuning of the models in and of itself is nothing to merit raised eyebrows.
What should raise eyebrows is when the inputs are tweaked beyond the range of ‘normal’ for a given input. If the only way to get a model (any model, not just a GCM) to match real-world output is to feed it non-real-world input, then the model is wrong and needs to be changed.
When the inputs are something we can measure, say the amount of sulfate particulate in the atmosphere, we can readily test the fidelity of the model input to reality. When the model input is parameterized, i.e. not a measurable property of the real world but an aggregated one, then there is no way to compare the fidelity of the model input to the real world. Tweaks of parameterized inputs done to accurately hindcast must be very explicitly explained and justified. Absent that explanation and justification, the model and its outputs should be viewed with a high degree of suspicion.

Bill Illis
February 20, 2009 10:48 am

Bob and Mike C,
I just meant to say they are not included in the model (not that they aren’t required).
When I started doing this reconstruction, I originally thought that I would need to include a volcanic impact as well.
But after accounting for the ENSO and the AMO, there really isn’t any volcanic impact that needs to be included. Other than Pinatubo, the other volcanoes (including Krakatoa, Santa Maria, Novarupta, Agung, and El Chichon) did not influence the temperature trend very much at all.
I could plug some (negative) impact here or there but I did not want to start using plugged data, I just wanted it to be a straight-up mathematical calculation.
There is some room for some negative numbers with Pinatubo though.
I did look at this very closely because it did not make sense to me, but the data is the data. It is possible that the ENSO and the AMO themselves go down due to the volcanic influence, and my residuals would then show little need for them, but the base temperature data does not show much impact either.
Here are some charts showing it. I think the blue lines are the actual temperature lines in all these charts; some of the charts come from other projects, so there is a different focus in some of them.
http://img372.imageshack.us/img372/3685/volcanoehadleyxs2.png
http://img297.imageshack.us/img297/16/krakatoata5.png
http://img232.imageshack.us/img232/760/elchichonks8.png
Santa Maria and Novarupta occurred within the period of this chart. See if you can pick out the dates.
http://img142.imageshack.us/img142/535/1020modelfa6.png
I see Mount Pinatubo had a larger impact than the others.
http://img55.imageshack.us/img55/2716/mountpinatubanb7.png
Now, where the volcanoes have the greatest impact is in the stratosphere. If you go back to 1963 with Agung, this same pattern exists in the stratosphere temps: an immediate 1.0C to 1.5C increase in temperatures followed by a decline of 1.5C to 2.0C, at which point the stratosphere temps stabilize and then potentially build back up until the next volcano.
I wonder if the optical depth calculations used to estimate the volcano influences are really just picking up the high-stratosphere impact, while the surface is not being impacted as much as the optical depth data would indicate should be occurring. The stratosphere layers are often isolated from the surface layers.
http://img258.imageshack.us/img258/235/uahstratvolcanoesyb5.png
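A minimal sketch of the residual check described above, assuming hypothetical monthly series for the temperature anomaly, an ENSO index, and the AMO; the file name, column names, lag choices and eruption windows are illustrative placeholders, not Bill Illis's actual procedure:

```python
# Minimal sketch (not Bill Illis's actual code) of the residual check described
# above: regress the temperature anomaly on ENSO and AMO indices, then look at
# the residuals around eruption dates to see whether a separate volcano term
# is needed. File name, column names, lags and windows are placeholders.
import numpy as np
import pandas as pd

df = pd.read_csv("monthly_anomalies.csv", parse_dates=["date"], index_col="date")
# Assumed columns: "temp_anom" (C), "nino34" (ENSO index), "amo" (AMO index)

# Lag the ocean indices a few months, since surface temperature responds with a delay.
X = pd.DataFrame({"nino34": df["nino34"].shift(3),
                  "amo": df["amo"].shift(2)}).dropna()
y = df.loc[X.index, "temp_anom"]

# Ordinary least squares fit of the anomaly on the two indices.
A = np.column_stack([np.ones(len(X)), X["nino34"], X["amo"]])
coefs, *_ = np.linalg.lstsq(A, y.values, rcond=None)
residuals = y.values - A @ coefs
r2 = 1 - np.sum(residuals**2) / np.sum((y.values - y.values.mean())**2)
print(f"R^2 of ENSO+AMO fit: {r2:.3f}")

# If major eruptions cool the surface beyond what ENSO/AMO explain, the
# residuals should dip in the years following each eruption date.
eruptions = {"Krakatoa": "1883-08", "Santa Maria": "1902-10", "Novarupta": "1912-06",
             "Agung": "1963-03", "El Chichon": "1982-04", "Pinatubo": "1991-06"}
resid = pd.Series(residuals, index=X.index)
for name, month in eruptions.items():
    window = resid.loc[month : str(pd.Period(month) + 36)]
    if len(window):
        print(f"{name}: mean residual over following 3 yr = {window.mean():+.2f} C")
```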

Chris
February 20, 2009 11:58 am

Bill,
Great job! I’ve been saying the same thing for over a year at RC (I reposted your pertinent comments below). I have received nothing but poor responses and/or attacks when I do post these issues. Essentially the answer has been, “Temperature is going to go up anyway, so your point is without merit. No need to change our methodology.” Regarding aerosols, I looked into the data as well (led by a lady researcher from UM). No one can answer the question of why aerosol cooling is going up every year when in fact North America and Europe have cleaned up emissions from their power plants and factories. Plus, the former Soviet Union pollutes a lot less today than it did in the ’70s or ’80s. Granted, China is a mess, but the trend numbers that I’ve seen (both past and present) don’t pass the smell test (my doctoral thesis in chemical engineering was on SOx/NOx removal from coal-fired power plants). Trust me, climate modeling is a big hoax that Gavin and the others all realize. Their only hope is that things get warmer (not cooler). A cooling climate exposes their fallacy much like falling home prices exposed the fallacy of sub-prime mortgage-backed securities being rated AAA. I’m not saying that malice is involved, but ego and greed (just like the current financial mess).
Chris
Great quotes:
“Another way to look at it is that they have huge GHG temperature impacts built in (no way to get to +3.0C without it), but they need to build in almost as big negative temperature impacts from other sources to keep the hindcast close to the actual temperatures we have seen so far. One could conclude they are just plugging the big negative numbers into the hindcast after the fact to make it work. Without a large uptick in temperatures in the next few years, the modelers really have to go back to the drawing board (or they need to discover another “negative forcing” to keep the models on track to reality).”

timbrom
February 20, 2009 12:24 pm

Quick question, just so that I can appear knowledgeable to my ever-decreasing circle of AGW “believers.” Does GCM stand for Global Climate Models or General Circulation Models?

February 20, 2009 1:01 pm

35 years ago, I had the dubious pleasure of providing some input on the mining industry for the construction of Canada’s first computer model of the national economy. The first models had some 30 to 40 equations to describe the economy historically, and when it was given a trial run for the future, Canada appeared to be heading for long-term growth of about 10% a year. They adjusted the equations several times without much success, and finally a colleague picked up a piece of chalk, drew an upward-sloping line on the blackboard and said: this is a model of the Canadian economy – an average annual growth of about 4%. I think this kind of “targeting” figures largely in the climate models.

Mike C
February 20, 2009 1:02 pm

Bill, I would take into consideration the location, type and strength of the individual eruptions. Santa Maria, being in Guatemala, should have had a global impact. El Chichon, near Mexico City, mainly affected the Northern Hemisphere according to direct optical thickness measurements. Novarupta was in Alaska, so it would have had little impact other than at high latitudes in the Northern Hemisphere. Agung, Krakatoa and Pinatubo are all at low latitudes (Pinatubo definitely had global effects according to optical thickness measurements).
El Chichon knocked down global temps that otherwise would have been increased by the 1982-83 El Nino. Pinatubo knocked down global temps that would have been higher due to the string of El Ninos in the following 2-3 years.
So the challenge for you is to calculate the change in solar forcing created by each volcano. Then you need to calculate the temperature change created by individual ENSO events. Then you need to do the same with each of the other ocean circulations (AMO, PDO, etc.). That way you will have done what the AGW modelers have failed to do thus far: include all of the inputs in your model without having to increase the magnitude of any one to compensate for the lack of magnitude of another.

February 20, 2009 1:35 pm

Bill Illis: Volcanic eruptions not only lower global temperatures by reducing downward shortwave radiation, but those that occur in the tropics also suppress the “poleward” heat transfer of El Nino events. Mount Pinatubo (16N, 120E) squashed the 1991/92 El Nino and the minor mid-year El Nino in 1993. El Chichon (19N, 93W) totally counteracted the 1982/83 El Nino.
http://s5.tinypic.com/16c0vat.jpg
The 1902 eruption of Santa Maria (14N, 91W) would have been counteracted by the 1902/1903 El Nino. The 1912 eruption of Novarupta (58N, 156W) is a high-latitude NH eruption. It might have had some effect on the 1911/1912 El Nino; it is hard to tell with the time lags of El Nino impacts on high latitudes. And then again, back in the early 1900s the data were kind of sparse, which is another reason the 1911/12 El Nino and the Novarupta eruption may not be showing up that well in your early graph.
http://s5.tinypic.com/wbf82f.jpg
Choosing not to include volcanic eruptions because you don’t have a place for them in your model is one thing; not including them because you can’t see the effects in the data is another.
Off topic: I’m finishing up a post you’ll enjoy on OI.v2 SST data. I removed the North Atlantic SST anomaly data from the global SST anomaly data. It sets the trend way high, obviously, but it also suppresses lots of worthwhile data in the rest of the oceans. There is at least one thing you’ll find very interesting. It may help with your efforts. I’m trying to get it done by tomorrow A.M. With luck (getting some time to finish it), it’ll be done tonight.
Regards

February 20, 2009 1:41 pm

Don’t be so shocked or disappointed about what the warmists are after … It has nothing to do with saving the planet or the environment, or proving their point that their predictions of a scorched earth with high water are at hand.
This is about getting money from the rich to give to the poor, and also about controlling populations. The free-spirited middle class is too unpredictable and hard to control. Plus, they generate far too much envy from the loser classes.
End of story. This is politics not science. If the warmists fail, their masters will find a new fear to leverage their evil.

Richard M
February 20, 2009 2:38 pm

timbrom (12:24:35) :
“Quick question, just so that I can appear knowledgable to my ever decreasing circle of AGW “believers.” Does GCM stand for Global Climate Models or General Circulation Models?”
While I have seen both usages, I believe the latter is correct.

February 20, 2009 4:27 pm

Bill Illis (05:26:15) :
I still can’t see where you get your projections. You haven’t run the model. The GISS page you link to just has data to 2003. How did you calculate the extension? Are you fitting curves to past data and then extending them? If so, what curves are you using?

davidc
February 20, 2009 5:32 pm

Robert Wykoff (07:39:24) :
“I found the code in Hansens program to calculate temperature…
T = 58 + 5 * Log2(CO2/260)
The other 50,000 lines are there to make it look impressive”
That’s been my take on the whole idea of “sensitivity”. If you know that (i.e. what change in deg C for a given change in CO2 ppm), why do you need anything else? I’ve raised this point before in different places but have never got a reply.

Bill Illis
February 20, 2009 5:33 pm

Nick Stokes,
There are two components to the extension; the GHG component and the non-GHG component.
(This is going to get technical now but I guess I should fully explain it so…)
For the GHG side, Model E’s GHG-forcing temperature impact fits very well over time to Temp C (anomaly, not absolute Kelvin) = 4.053 Ln(CO2) − 23.
I use CO2 as a proxy for all the GHGs, but this is a reasonable proxy since CO2 is the biggest GHG and N2O is increasing at exactly the same rate. The other GHGs, methane and the CFCs, have stabilized recently, so using just CO2 could slightly over-estimate the 2003-to-2013 trend, but not by very much at all. (Actually, there has been a recent uptick in methane concentrations, while up to 2006 it looked as though it had stabilized. The preliminary numbers into the fall of 2008 show the uptick continuing.)
So, from 2003 to 2008, I used the actual global CO2 numbers and from 2008 and onward, I used the forecast CO2 growth rate trends (which are just slightly exponential).
I’ve been able to simulate the IPCC temp forecasts using this method and the math works backwards and forwards so I have no problem having faith in it.
For the non-GHG component, I just fitted a polynomial function, starting in 1995, to Model E’s non-GHG forcing (starting in 1995 allowed the big impact from Pinatubo to settle out of the numbers). From 2003 on, the negative forcing from Pinatubo still has a few years to completely return to the no-volcano normal (which increases the trend slightly in the first few years), and then afterward the negative trend from aerosols and solar forcing kicks in to turn the trend downward again.
So there is a very slight increase in the non-GHG component for a few years after 2003, and then it will start down again.
It is possible that a little more negative impact should be built in for solar, given the recent decline in the Sun, but there is a little decline built in already for the non-GHG component, so I just left it at that. It is also possible that a slightly more negative trend could be included for the Asian brown cloud, but the aerosol impact from Europe, North America and Russia is supposed to be declining now, so again I just left it at that. We are only talking about a maximum 0.05C change either way.
Add it all up and there you go. The first chart linked to in the post shows it as best I can.
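To make the arithmetic of the extension concrete, here is a minimal sketch of the GHG component only, using the quoted fit Temp = 4.053 ln(CO2) − 23. The CO2 values and the "slightly exponential" growth rate below are illustrative assumptions, not Bill Illis's actual inputs, and the non-GHG polynomial is only indicated in a comment:

```python
# Minimal sketch of the GHG component of the extension described above.
# The relation Temp (C anomaly) = 4.053*ln(CO2) - 23 is the fit quoted in the
# comment; the CO2 values and the ~0.55%/yr growth rate are illustrative
# assumptions only, not Bill Illis's actual inputs.
import math

def ghg_temp(co2_ppm: float) -> float:
    """Fitted Model E GHG-forcing temperature impact, in deg C anomaly."""
    return 4.053 * math.log(co2_ppm) - 23.0

# Roughly observed annual-mean CO2 through 2008 (placeholder values), then a
# slightly exponential growth path out to 2013.
co2 = {2003: 375.8, 2004: 377.5, 2005: 379.8, 2006: 381.9, 2007: 383.8, 2008: 385.6}
for year in range(2009, 2014):
    co2[year] = co2[year - 1] * 1.0055

for year in sorted(co2):
    print(year, round(ghg_temp(co2[year]), 3))

# The full extension would then add the non-GHG component: a polynomial fitted
# to Model E's non-GHG forcing from 1995 on, rising slightly for a few years as
# the Pinatubo recovery finishes and turning downward afterward.
```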

February 20, 2009 7:42 pm

OK Bill,
but now I can’t see the connection to Model E at all. You’ve used the forcing data that they used, but that isn’t Model E output – it’s just general atmospheric data that they’ve collected and made conveniently available. You’ve used general IPCC data, and IPCC projections of CO2 emissions, none of which is connected with Model E. And you seem to have used the log relation of CO2 to temperature, which again is ancient and has nothing to do with Model E. You haven’t run Model E, nor, as far as I can see, used the results of other runs. So why is this post titled “Short-term trends from GISS Model E”?

Editor
February 20, 2009 10:11 pm

Robert Wykoff (07:39:24) :
“I found the code in Hansens program to calculate temperature…
T = 58 + 5 * Log2(CO2/260)
The other 50,000 lines are there to make it look impressive”
What Hansen is doing here is now easy to understand. He’s taking whatever the present CO2 level is and dividing it by 260, which he has decided is the “proper” CO2 level. This factor then tweaks his temperature. The problem with this is that CO2’s insolation ability follows a diminishing-returns curve, not a log curve that keeps curving upward. Hansen is operating on the idea that our thin atmosphere is capable of a runaway greenhouse effect like Venus’s. This is the big fraud in his code.
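For reference, a minimal sketch evaluating the quoted toy formula (which, as Bill Illis notes further down, was almost certainly tongue in cheek and is not literally in the code). It simply shows the formula's behaviour: each doubling of CO2 adds the same fixed increment of 5 to T:

```python
# Evaluate the quoted toy formula T = 58 + 5*log2(CO2/260) at a few CO2 levels.
# Under a log2 relation every doubling of CO2 adds the same fixed increment
# (here 5 units), so the effect per additional ppm diminishes as CO2 rises.
import math

def toy_temp(co2_ppm: float) -> float:
    return 58.0 + 5.0 * math.log2(co2_ppm / 260.0)

for co2 in (260, 385, 520, 1040):
    print(f"CO2 = {co2:4d} ppm -> T = {toy_temp(co2):.2f}")
# CO2 =  260 ppm -> T = 58.00
# CO2 =  385 ppm -> T = 60.83
# CO2 =  520 ppm -> T = 63.00
# CO2 = 1040 ppm -> T = 68.00
```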

Mike Bryant
February 20, 2009 10:25 pm

It’s good to see the simpler formulas that Hansen has recently “nailed”. They can be falsified without major excursions into the computer codes.

February 21, 2009 1:31 am

Bill Illis:
From your posting above:
“Actual Model E hindcast versus GISS’ temperature anomaly, R^2 0.544
http://img355.imageshack.us/img355/9043/modelehindcastoz1.png
You can compare Model E’s hindcast to the simple model I built (R^2 0.713) using the ENSO, AMO and only about half the GHG impact they have built in. (There are no aerosols or volcanoe impacts built into this model.)
http://img301.imageshack.us/img301/7785/finalgissmodelns3.png

I note that there is no arbitrarily and increasingly negative “land use” fudge factor (er, correction) built into the simple model either. [I assume the Model E textbook adds, “It is left as an exercise for the student to explain exactly why land use would be a negative factor w/r future global temperatures.”]
Your note on the graphic for the aerosol “correction” shows that Hansen’s (Model E) aerosol correction values differ greatly from the historical record of how much pollution (smoke, primarily) has been produced, at what levels, from which sources. [Again, this is left as an exercise for the student to explain why the values are what they have entered ….]
For your own equation, please let me play devil’s advocate (er, AGW zealot) for a moment with your “simple model.” It appears to be a “curve fit” of past data “using the ENSO, AMO and only about half the GHG impact.”
Since the ENSO and AMO are chaotic, random events that don’t (or do they???) follow a predictable pattern, can we use them to predict the near future, or can we at best predict only a 70 year 1/2 degree temperature cycle? Are they a valid index for the future – even if they can’t be predicted nor fully explained at this time?
It appears logically reasonable to include a (small) CO2 factor in predicting future temperatures – the cleverly presented, theoretically valid “logic” of the CO2 religion is, after all, the ONLY thing that keeps their faith alive. Why would your CO2 fudge factor be so much lower than Hansen’s – other than that your factor actually fits the observed data over time, that is.
For your “curve fit solution,” what is the source of the input values (i.e., what exactly is “input” when you calculate a term for ENSO; who determines this value, and how often is it updated)? [Hansen’s Model E “urban use factor” and “aerosol factors,” for example, are not explained, do not have a validated “source,” are not publicly reported, are not checked, and do not appear to be based on any historically valid (verifiable) public data, nor on any logical extrapolation from today’s (assumed) data.] Are your extrapolations any better?
Why are you using the (R^2 0.713) term, and what is its source? If TSI changes, does this value change?
Is your “simple model” valid over a short period (6–12 months)? Can you “predict” the next 6–18 months based on today’s ENSO and PDO states – even if longer-term predictions are not practical or reasonable?
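For readers who want to check the R^2 figures quoted from Bill's post (0.544 for the Model E hindcast, 0.713 for the simple model), a minimal sketch of how such a comparison could be computed once the observed and hindcast series are in hand; the arrays below are placeholders, not the actual data:

```python
# Minimal sketch: compute R^2 of a hindcast against observed anomalies, so the
# 0.544 (Model E) versus 0.713 (simple model) comparison quoted above can be
# reproduced once the underlying series are in hand. Arrays are placeholders.
import numpy as np

def r_squared(observed: np.ndarray, hindcast: np.ndarray) -> float:
    ss_res = np.sum((observed - hindcast) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

observed = np.array([0.10, 0.15, 0.05, 0.20, 0.30])  # placeholder anomalies
model_e  = np.array([0.05, 0.20, 0.10, 0.15, 0.25])  # placeholder hindcast
simple   = np.array([0.11, 0.14, 0.06, 0.19, 0.28])  # placeholder hindcast

print("Model E hindcast R^2:", round(r_squared(observed, model_e), 3))
print("Simple model R^2:   ", round(r_squared(observed, simple), 3))
```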

Joel Shore
Reply to  Robert A Cook PE
February 23, 2009 10:50 am

[I assume the Model E textbook adds, “It is left as an exercise for the student to explain exactly why land use would be a negative factor w/r future global temperatures.”]

Converting forests to croplands, for example, raises the surface albedo since forests generally reflect less sunlight than the croplands do. Of course, clearing forests also releases CO2 which can cause more warming…But, if one is using the actual amount of CO2 in the atmosphere to determine the net forcings, any such change would be reflected in that term.
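A rough back-of-envelope illustration of that point (not a Model E calculation): with global-mean insolation of roughly 340 W/m², even a small assumed rise in planetary albedo from land-use change translates into a modest negative forcing. The albedo change used below is an illustrative assumption:

```python
# Back-of-envelope illustration of land-use change acting as a negative forcing
# (not a Model E calculation). The albedo change is an assumed illustrative value.
S0 = 1361.0                      # solar constant, W/m^2
mean_insolation = S0 / 4.0       # ~340 W/m^2, averaged over the sphere
delta_albedo = 0.001             # assumed rise in planetary albedo from land use
forcing = -mean_insolation * delta_albedo
print(f"Approximate forcing: {forcing:.2f} W/m^2")   # about -0.34 W/m^2
```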

timbrom
February 21, 2009 3:50 am

Bill
CO2 is the biggest GHG?

Bill Illis
February 21, 2009 5:52 am

Regarding the formula in Model E’s code, “T = 58 + 5*Log2(CO2/260)”, I imagine this was a little tongue-in-cheek comment. A variation of the formula would fall out of the model runs, not be in the code.

Pamela Gray
February 21, 2009 7:40 am

I think the natural cycle model can predict fairly accurately what is “due” in the future. We have other models that use that phrase. For example, earthquake and volcanic models use this terminology to warn us that we are “due” for an earthquake here or there, etc. Then we can go about making preparations for that event. Global models based on naturally occurring cycles (even ones that don’t follow a set 10 year on, 10 year off pattern) can predict that we are “due” for a trend down or up, and what to look for when the trend hits.