Model Charged with Excessive Use of Forcing

Guest Post by Willis Eschenbach

The GISS Model E is the workhorse of NASA’s climate models. I got interested in the GISSE hindcasts of the 20th century because of an interesting post by Lucia over at the Blackboard. She built a simple model (which she calls “Lumpy”) that does a pretty good job of emulating the GISS model results using only the forcings and a time lag. Stephen Mosher points out how to access the NASA data here (with a good discussion), so I went to the NASA site he indicated and got the GISSE results he points to. I plotted them against the GISS version of the global surface air temperature record in Figure 1.

Figure 1. GISSE Global Circulation Model (GCM or “global climate model”) hindcast 1880-2003, and GISS Global Temperature (GISSTemp) Data. Photo shows the new NASA 15,000-processor “Discover” supercomputer. Top speed is 160 trillion floating point operations per second (a unit known by the lovely name of “teraflops”). What it does in a day would take my desktop computer seventeen years.

Now, that all looks impressive. The model hindcast temperatures are a reasonable match to the observed temperature, both by eyeball and mathematically (R^2 = 0.60). True, it misses the early 20th century warming (1920-1940) entirely, but overall it’s a pretty close fit. And the supercomputer does 160 teraflops. So what could go wrong?

To try to understand the GISSE model, I got the forcings used for the GISSE simulation. I took the total forcings, and I compared them to the GISSE model results. The forcings were yearly averages, so I compared them to the yearly results of the GISSE model. Figure 2 shows a comparison of the GISSE model hindcast temperatures and a linear regression of those temperatures on the total forcings.

Figure 2. A comparison of the GISSE annual model results with a linear regression of those results on the total forcing. (A “linear regression” estimates the best fit of the forcings to the model results). Total forcing is the sum of all forcings used by the GISSE model, including volcanos, solar, GHGs, aerosols, and the like. Deep drops in the forcings (and in the model results) are the result of stratospheric aerosols from volcanic eruptions.

Now to my untutored eye, Fig. 2 has all the hallmarks of a linear model with a missing constant trend of unknown origin. (The hallmarks are the obvious similarity in shape combined with differing trends and a low R^2.) To see if that was the case I redid my analysis, this time including a constant trend. As is my custom, I merely included the years of the observation in the analysis to get that trend. That gave me Figure 3.

Figure 3. A comparison of the GISSE annual model results with a linear regression of those results on the total forcing, this time including a constant annual trend. Note the very large increase in R^2 compared to Fig. 2, and the near-perfect match of the two datasets.
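For readers who want to replicate this, the Figure 3 fit is just ordinary least squares on two predictors: the total forcing and the year. A minimal sketch in Python, using made-up stand-in series (the real forcing and hindcast series are in the spreadsheet linked at the end of the post):

```python
import numpy as np

# Stand-in data: in the actual analysis these would be the GISS total
# forcing (W/m^2) and the GISSE hindcast temperature anomaly (deg C),
# 1880-2003, from the NASA dataset. The numbers below are synthetic.
years = np.arange(1880, 2004)
rng = np.random.default_rng(0)
forcing = 0.01 * (years - 1880) + rng.normal(0, 0.1, years.size)
model_temp = (0.13 * forcing + 0.0025 * (years - 1880)
              + rng.normal(0, 0.05, years.size))

# Regress the model temperature on the forcing plus a constant annual
# trend; the year column supplies the trend, as described in the text.
X = np.column_stack([forcing, years, np.ones_like(forcing)])
coef, *_ = np.linalg.lstsq(X, model_temp, rcond=None)
fitted = X @ coef

ss_res = np.sum((model_temp - fitted) ** 2)
ss_tot = np.sum((model_temp - model_temp.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(coef[0], coef[1] * 100, r_squared)  # sensitivity, trend/century, R^2
```

With the real series in place of the synthetic ones, `coef[0]` is the sensitivity in °C per W/m² and `coef[1] * 100` the inherent trend per century.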

There are several surprising things in Figure 3, and I’m not sure I see all of the implications of those things yet. The first surprise was how close the model results are to a bozo simple linear response to the forcings plus the passage of time (R^2 = 0.91, average error less than a tenth of a degree). Foolish me, I had the idea that somehow the models were producing some kind of more sophisticated, complex, lagged, non-linear response to the forcings than that.

This almost completely linear response of the GISSE model makes it trivially easy to create IPCC style “scenarios” of the next hundred years of the climate. We just use our magic GISSE formula, that future temperature change is equal to 0.13 times the forcing change plus a quarter of a degree per century, and we can forecast the temperature change corresponding to any combination of projected future forcings …
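That “magic formula” can be written down directly. A toy emulator, using the regression numbers quoted above (0.13°C per W/m² plus a quarter degree per century):

```python
def gisse_emulator(delta_forcing_wm2, years_elapsed):
    """Toy linear emulator of the GISSE hindcast described in the text:
    0.13 deg C per W/m^2 of forcing change, plus an inherent trend of
    0.25 deg C per century."""
    return 0.13 * delta_forcing_wm2 + 0.25 * years_elapsed / 100.0

# Example: a 3.7 W/m^2 forcing change (roughly one CO2 doubling)
# applied over a century.
print(round(gisse_emulator(3.7, 100), 2))
```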

Second, this analysis strongly suggests that in the absence of any change in forcing, the GISSE model still warms. This agrees with the results of the control runs of the GISSE and other models that I discussed at the end of my post here. The GISSE control runs also showed warming when there was no change in forcing. This is a most unsettling result, particularly since other models showed similar (and in some cases larger) warming in their control runs.

Third, the climate sensitivity shown by the analysis is only 0.13°C per W/m2 (0.5°C per doubling of CO2). This is far below the official NASA estimate of the response of the GISSE model to the forcings. They put the climate sensitivity from the GISSE model at about 0.7°C per W/m2 (2.7°C per doubling of CO2). I do not know why their official number is so different.

I thought the difference in calculated sensitivities might be because they have not taken account of the underlying warming trend of the model itself. However, when the analysis is done leaving out the warming trend of the model (Fig. 2), I get a sensitivity of 0.34°C per W/m2 (1.3°C per doubling, Fig. 2). So that doesn’t solve the puzzle either. Unless I’ve made a foolish mathematical mistake (always a possibility for anyone, check my work), the sensitivity calculated from the GISSE results is half a degree of warming per doubling of CO2 …
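For anyone checking the unit conversion between the two ways of stating sensitivity: the paired numbers in the text imply a factor of about 3.8 W/m² per doubling of CO2 (an approximation inferred here, not a figure stated in the GISS documentation):

```python
# W/m^2 per CO2 doubling; approximate, inferred from the post's paired
# numbers (e.g. 0.7 C per W/m^2 <-> 2.7 C per doubling).
F_2XCO2 = 3.8

def per_doubling(sensitivity_c_per_wm2):
    """Convert a sensitivity in deg C per W/m^2 to deg C per CO2 doubling."""
    return sensitivity_c_per_wm2 * F_2XCO2

print(round(per_doubling(0.13), 1))  # fit with trend
print(round(per_doubling(0.34), 1))  # fit without trend (Fig. 2)
print(round(per_doubling(0.70), 1))  # NASA's stated GISSE sensitivity
```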

Troubled by that analysis, I looked further. The forcing is close to the model results, but not exact. Since I was using the sum of the forcings, obviously in their model some forcings make more difference than other forcings. So I decided to remove the volcano forcing, to get a better idea of what else was in the forcing mix. The volcanos are the only forcing that makes such large changes on a short timescale (months). Removing the volcanos allowed me to regress all of the other forcings against the model results (without volcanos), so that I could see how they did. Figure 4 shows that result:

Figure 4. All other forcings regressed against GISSE hindcast temperature results after the volcano effect is removed. Forcing abbreviations (used in original dataset): W-M_GHGs = Well Mixed Greenhouse Gases; O3 = Ozone; StratH2O = Stratospheric Water Vapor; Solar = Energy From The Sun; LandUse = Changes in Land Use and Land Cover; SnowAlb = Albedo from Changes in Snow Cover; StratAer = Stratospheric Aerosols from volcanos; BC = Black Carbon; ReflAer = Reflective Aerosols; AIE = Aerosol Indirect Effect. Numbers in parentheses show how well the various forcings explain the remaining model results, with 1.0 being a perfect score. (The number is called R squared, usually written R^2.) Photo Source
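The procedure behind Figure 4 can be sketched as: regress the volcanic (StratAer) forcing out of the model output, then score each remaining forcing by its R^2 against the residual. A hedged sketch, where the `forcings` dict layout is hypothetical and the keys follow the abbreviations in the caption above:

```python
import numpy as np

def r_squared(x, y):
    """R^2 of a simple linear regression of y on x (with intercept)."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    A = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

def rank_forcings(forcings, model_temp):
    """Regress the volcanic forcing out of the model output, then score
    every other forcing by its R^2 against the residual. `forcings` is
    assumed to be a dict of equal-length annual series keyed by the
    abbreviations in the Figure 4 caption (layout hypothetical)."""
    volcano = np.asarray(forcings["StratAer"], float)
    temp = np.asarray(model_temp, float)
    A = np.column_stack([volcano, np.ones_like(volcano)])
    coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
    residual = temp - A @ coef
    return {name: r_squared(series, residual)
            for name, series in forcings.items() if name != "StratAer"}
```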

Now, this is again interesting. Once the effect of the volcanos is removed, there is very little difference in how well the other forcings explain the remainder. With the obvious exception of solar, the R^2 values of most of the forcings are quite similar. The only two that outperform a simple straight line are stratospheric water vapor and GHGs, and then only by 0.01.

I wanted to look at the shape of the forcings to see if I could understand this better. Figure 5 has NASA GISS’s view of the forcings, shown at their actual sizes:

Figure 5: The radiative forcings used by the GISSE model as shown by GISS. SOURCE

Well, that didn’t tell me a lot (not GISS’s fault, just the wrong chart for my purpose), so I took the forcing data, standardized it, and took a look at the forcings in a form in which they could be seen. I found out the reason that they all fit so well lies in the shape of the forcings. All of them increase slowly (either negatively or positively) until 1950. After that, they increase more quickly. To see these shapes, it is necessary to standardize the forcings so that they all have the same size. Figure 6 shows what the forcings used by the model look like after standardization:

Figure 6. Forcings for the GISSE model hindcast 1880-2003. Forcings have been “standardized” (set to a standard deviation of 1.0) and set to start at zero as in Figure 4.
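The standardization used for Figure 6 is just a rescaling to unit standard deviation, shifted so each series starts at zero. As a sketch:

```python
import numpy as np

def standardize(series):
    """Scale a forcing to a standard deviation of 1.0 and shift it to
    start at zero, as described for Figure 6."""
    s = np.asarray(series, float)
    s = s / s.std()
    return s - s[0]

print(standardize([0.1, 0.2, 0.4, 0.8]))
```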

There are several oddities about their forcings. First, I had assumed that the forcings used were based at least loosely on reality. To make this true, I need to radically redefine “loosely”. You’ll note that by some strange coincidence, many of the forcings go flat from 1990 onwards … loose. Does anyone believe that all those forcings (O3, Landuse, Aerosol Indirect, Aerosol Reflective, Snow Albedo, Black Carbon) really stopped changing in 1990? (It is possible that this is a typographical or other error in the dataset. This idea is supported by the slight post-1990 divergence of the model results from the forcings as seen in Fig. 3)

Next, take a look at the curves for snow albedo and black carbon. It’s hard to see the snow albedo curve, because it is behind the black carbon curve. Why should the shapes of those two curves be nearly identical? … loose.

Next, in many cases the “curves” for the forcings are made up of a few straight lines. Whatever the forcings might or might not be, they are not straight lines.

Next, with the exception of solar and volcanoes, the shape of all of the remaining forcings is very similar. They are all highly correlated, and none of them (including CO2) is much different from a straight line.

Where did these very strange forcings come from? The answer is neatly encompassed in “Twentieth century climate model response and climate sensitivity”, Kiehl, GRL 2007 (emphasis mine):

A large number of climate modeling groups have carried out simulations of the 20th century. These simulations employed a number of forcing agents in the simulations. Although there are established data for the time evolution of well-mixed greenhouse gases [and solar and volcanos although Kiehl doesn’t mention them], there are no established standard datasets for ozone, aerosols or natural forcing factors.

Lest you think that there is at least some factual basis to the GISSE forcings, let’s look again at the black carbon and snow albedo forcings. Black carbon is known to melt snow, and this is an issue in the Arctic, so there is a plausible mechanism to connect the two. This is likely why the shapes of the two are similar in the GISSE forcings. But what about that shape, increasing over the period of analysis? Here’s one of the few actual records of black carbon in the 20th century, from 20th-Century Industrial Black Carbon Emissions Altered Arctic Climate Forcing, Science Magazine (paywall):

Figure 7. An ice core record from the Greenland cap showing the amount of black carbon trapped in the ice, year by year. Spikes in the summer are large forest fires.

Note that rather than increasing over the century as GISSE claims, the observed black carbon levels peaked in about 1910-1920, and have been generally decreasing since then.

So in addition to the dozens of parameters that they can tune in the climate models, the GISS folks and the other modelers got to make up some of their own forcings out of whole cloth … and then they get to tell us proudly that their model hindcasts do well at fitting the historical record.

To close, Figure 8 shows the best part, the final part of the game:

Figure 8. ORIGINAL IPCC CAPTION (emphasis mine). A climate model can be used to simulate the temperature changes that occur from both natural and anthropogenic causes. The simulations in a) were done with only natural forcings: solar variation and volcanic activity. In b) only anthropogenic forcings are included: greenhouse gases and sulfate aerosols. In c) both natural and anthropogenic forcings are included. The best match is obtained when both forcings are combined, as in c). Natural forcing alone cannot explain the global warming over the last 50 years. Source

Here is the sting in the tale. They have designed the perfect forcings, and adjusted the model parameters carefully, to match the historical observations. Having done so, the modelers then claim that the fact that their model no longer matches historical observations when you take out some of their forcings means that “natural forcing alone cannot explain” recent warming … what, what?

You mean that if you tune a model with certain inputs, then remove one or more of the inputs used in the tuning, your results are not as good as with all of the inputs included? I’m shocked, I tell you. Who would have guessed?

The IPCC actually says that because the tuned models don’t work well with part of their input removed, this shows that humans are the cause of the warming … not sure what I can say about that.

What I Learned

1. To a very close approximation (R^2 = 0.91, average error less than a tenth of a degree C) the GISS model output can be replicated by a simple linear transformation of the total forcing and the elapsed time. Since the climate is known to be a non-linear, chaotic system, this does not bode well for the use of GISSE or other similar models.

2. The GISSE model illustrates that when hindcasting the 20th century, the modelers were free to design their own forcings. This explains why, despite having climate sensitivities ranging from 1.8 to 4.2, the various climate models all provide hindcasts which are very close to the historical records. The models are tuned, and the forcings are chosen, to do just that.

3. The GISSE model results show a climate sensitivity of half a degree per doubling of CO2, far below the IPCC value.

4. Most of the assumed GISS forcings vary little from a straight line (except for some of them going flat in 1990).

5. The modelers truly must believe that the future evolution of the climate can be calculated using a simple linear function of the forcings. Me, I misdoubts that …

In closing, let me try to anticipate some objections that people will likely have to this analysis.

1. But that’s not what the GISSE computer is actually doing! It’s doing a whole bunch of really really complicated mathematical stuff that represents the real climate and requires 160 teraflops to calculate, not some simple equation. This is true. However, since their model results can be replicated so exactly by this simple linear model, we can say that considered as black boxes the two models are certainly equivalent, and explore the implications of that equivalence.

2. That’s not a new finding, everyone already knew the models were linear. I also thought the models were linear, but I have never been able to establish this mathematically. I also did not realize how rigid the linearity was.

3. Is there really an inherent linear warming trend built into the model? I don’t know … but there is something in the model that acts just like a built-in inherent linear warming. So in practice, whether the linear warming trend is built-in, or the model just acts as though it is built-in, the outcome is the same. (As a side note, although the high R^2 of 0.91 argues against the possibility of things improving a whole lot by including a simple lagging term, Lucia’s model is worth exploring further.)

4. Is this all a result of bad faith or intentional deception on the part of the modelers? I doubt it very much. I suspect that the choice of forcings and the other parts of the model “jes’ growed”, as Topsy said. My best guess is that this is the result of hundreds of small, incremental decisions and changes made over decades in the forcings, the model code, and the parameters.

5. If what you say is true, why has no one been able to successfully model the system without including anthropogenic forcing?

Glad you asked. Since the GISS model can be represented as a simple linear model, we can use the same model with only natural forcings. Here’s a first cut at that:

Figure 9. Model of the climate using only natural forcings (top panel). The all-forcings model from Figure 3 is included in the lower panel for comparison. Yes, the R^2 with only natural forcings is smaller, but it is still a pretty reasonable model.

6. But, but … you can’t just include a 0.42 degree warming like that! For all practical purposes, GISSE does the same thing only with different numbers, so you’ll have to take that up with them. See the US Supreme Court ruling in the case of Sauce For The Goose vs. Sauce For The Gander.

7. The model inherent warming trend doesn’t matter, because the final results for the IPCC scenarios show the change from model control runs, not absolute values. As a result, the warming trend cancels out, and we are left with the variation due to forcings. While this sounds eminently reasonable, consider that if you use their recommended procedure (cancel out the 0.25°C constant inherent warming trend) for their 20th century hindcast shown above, it gives an incorrect answer … so that argument doesn’t make sense.

To simplify access to the data, I have put the forcings, the model response, and the GISS temperature datasets online here as an Excel worksheet. The worksheet also contains the calculations used to produce Figure 3.

And as always, the scientific work of a thousand hands continues.

Regards,

w.

 

[UPDATE: This discussion continues at Where Did I Put That Energy.]


155 Comments
wayne
December 19, 2010 11:36 pm

Willis, thanks for the data. I love to extract amazing things from datasets such as the GISS one you provided. As you said, the weightings given each forcing are not logically all identical, so I added a weighting factor to each row in your spreadsheet and came up with this after letting Excel find an 11-way minimization of variances. The weights it came up with are listed below. Apply those weights to the GISS forcing data and you closely replicate (R^2 = 0.833) the GISS observed temperature data, according to them. The only real difference suppressing the R^2 is that the observed data is much more noisy and volatile, with bigger jumps up and down.
What I found interesting in this is the light weight it applied to GHGs and how massively it weighted SnowAlb and AIE. Some weights are even negative, implying that the sign is wrong in these forcings supplied by GISS.

W-M_GHGs    0.26
O3         -1.07
StratH2O    2.93
Solar       0.15
LandUse     1.98
SnowAlb     4.33
StratAer    0.04
BC         -2.35
ReflAer    -1.07
AIE         3.39
AddLinear   0.02

Thanks for the interesting posts!
Oh, the AddLinear is a column I added, as you mentioned, to apply an ‘unknown’ pure linear forcing of exactly 1 per year; it only used 0.02 of it in the minimization of the variances.
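The 11-way minimization wayne describes is an ordinary multiple-regression problem: one least-squares weight per forcing column. A sketch, assuming the forcings are arranged as columns of a matrix (the data layout is hypothetical):

```python
import numpy as np

def fit_weights(forcing_matrix, observed_temp, names):
    """Least-squares weight per forcing column, plus the R^2 of the
    weighted sum against the observed temperatures."""
    X = np.asarray(forcing_matrix, float)
    y = np.asarray(observed_temp, float)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ w
    r2 = 1.0 - ((y - fitted) ** 2).sum() / ((y - y.mean()) ** 2).sum()
    return dict(zip(names, w)), r2
```

Run against the spreadsheet columns, this reproduces the kind of weight table shown above (Excel’s solver is doing the same minimization).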

Geoff Sherrington
December 19, 2010 11:57 pm

For the encouragement of Willis,
The use of a pictorial background is excellent for assisting retention of the message.
Here are two fake “graphs” of the century-long global temperature record, reinforced with a pictorial message that expert practitioners can make simple mistakes with models. (I would give thanks and attribution, but the photo cartoon came to me with no author noted.)
http://www.geoffstuff.com/Plastic%20Surgeon.jpg

tty
December 20, 2010 12:35 am

This exercise in curve-fitting inevitably reminds me of von Neumann’s claim:
“Give me four variables and I can draw an elephant, give me five and I’ll make him wave his trunk”

E.M.Smith
Editor
December 20, 2010 12:55 am

Is there any reason why “Land Use” is a negative? I’d expect changes of land use to increase warming. Asphalt warm. Grass cool…
Also, what happens if you replace their “straight line all the same” parameters with some that are more representative of the actual data? For example, that “black carbon” curve… and maybe having a 1/2 C or so UHI correction in the thermometers…
IMHO the reason for the “Go Flat” at the end of the non-GHG curves is so that the GHG curve can start out lagging, then as it catches up, the others can be dropped out and leave GHG as very dominant while hiding the fact that it was WAY too fast a rise (nearly exponential?) when it ought to have been a decreasing impact (log). So you hide the needed real “log like behaviour” in the other curves having a compensating lag, and leave the GHG curve more exponential, that way runs into the further future have highly divergent heating, but you can say “Look the model matched in the past!”.
All in all, a “neat trick”.
So put in a Log decay curve on the GHGs, un-flatten the other curves, and see if suddenly your ‘future 40 years out’ looks like “not much happens”… Then tell me again why that model they are using has a non-Log GHG curve… (it does look like an exponential in the early stages to me. Would be nice to have a curve fit to it…)
(Fig 5 looks like GHG accelerates in the middle. Fig. 4 GHG looks to have a ‘belly sag’ in the middle and rise at the end. More precise than ‘eyeball’ analysis would be helpful 😉

charles nelson
December 20, 2010 1:11 am

Willis.
A thought experiment for you…
It’s the year 1880. Visualize all the locations on earth where accurate temperature measurements were being recorded. Then, after accepting that coverage was rather patchy back then, ask yourself these questions….
Was the method uniform across all of these sites?
How were these instruments calibrated?
Wet bulb dry bulb? Urbanization of surroundings etc etc…?
Was it exclusively maxima and minima being recorded or other day/night temperatures…if so at what times?
Were some being recorded in degrees F and others in degrees C for instance.
Most importantly…were these readings taken against a scale accurate to one tenth of one degree!?
Now visualize the world today and the pandemic spread of electronic meteorological measuring gear. All over the world there are tens of thousands of systems that can log a day’s data second by second, from high resolution digital thermometers.
Ask yourself if you think it is fair to put these two types of Data 1880 and 2010 on the same graph.
It’s only my opinion but as someone whose job it is to control temperatures, I can assure you that measuring air temperature in any meaningful way down to 1 tenth of a degree resolution is nigh on impossible in a natural space ( by that I mean a space where convection and general air circulation is possible). You probably could measure a steady stream of air more accurately or a sealed container.
In the real world of rooms, streets, fields, mountains jungles and oceans however, one tenth of a degree is so transient as to be meaningless.
Looking at your brutal ramp up of .4 degrees C over the second half of the twentieth century I’m beginning to think that the temperature rise that has been ‘observed’ during this period might well be down to the greater number of more accurate observations made as the century progressed.

December 20, 2010 1:20 am

Those models are garbage. All they do is mimic the Keeling curve, and where it conflicts with reality, they add ad-hoc aerosols.
One sign of a good theory is that it predicts things and does not need any “cosmological constants” added here and there. These playstation models need
a) some kind of solar forcing to explain 1910-1940 global warming by 0.7 deg C
b) aerosol plug to explain 1940-1975 cooling, combined with continuous eradication of this cooling trend from global datasets by cherry-picking stations and whatever data manipulation
c) CO2 forcing which finally takes over after 1975, when solar suddenly does not work
d) most of all, those models completely ignore oceans, which are the main climate driver.
ENSO/PDO/AMO/NAO based models easily explain the whole 20th century.

wayne
December 20, 2010 2:13 am

“E.M.Smith says:
December 20, 2010 at 12:55 am
Is there any reason why “Land Use” is a negative? I’d expect changes of land use to increase warming. Asphalt warm. Grass cool… ”
Just saw a video someone posted in the next posts that talked of that very subject. This scientist was middle-of-the-road on the subject but was speaking of asphalt and dark roofs causing UHI as towns and cities developed and also spoke of wooded areas, great absorbers, being replaced by light colored wheat fields which caused cooling. But, this is a very complicated subject of whether more dark has been replaced by man’s signature on the land overall or the other way around. Cities do occupy a small percentage of land area compared to farming but also many fields are also bare and plowed and dark brown.
My little box of data above, and it is purely GISS data supplied by Willis (I don’t place much confidence on most of it), seems to agree with land-use having a negative influence but once again, that is by GISS’s forcings and observed temperature anomalies. I’m still looking at those weightings that fit to the temp anomalies and trying to decipher what they may be saying if GISS’s data can be trusted enough to base any confidence on analysis of it. If the forcings are wrong and the GISS temps wrong then that analysis is also wrong.

Mike Borgelt
December 20, 2010 2:13 am

Juraj V. says:
December 20, 2010 at 1:20 am ENSO/PDO/AMO/NAO based models easily explain the whole 20th century.
Add the IOD (Indian Ocean Dipole)

John Edmondson
December 20, 2010 2:25 am

Thanks for the data, Willis. How is the conversion done from total forcing to response?
It is interesting to note that the big rise in the model response is due to the drop in strat Aer, similar to the point made in yesterday’s post at spaceweather.com.
Obviously, these simple models can be run forward in time to fill in the gap to 2010 and make a prediction to 2015.
The only problem I can see is trying to build the lack of solar activity into a forecast. The TSI bit is easy, but I will have to add an estimate for GCR influence on clouds into snow albedo.
Thanks again.

Shevva
December 20, 2010 2:37 am

Us humans are brilliant, that we can model such complex systems and pick out a tiny atom from this complex system and prove how it is affecting the Earth is Nobel prize stuff.
What’s that? The only reason they match the real world is that they fiddled the figures? Well, no matter; as the bankers have shown with their bonuses, I’m sure 2011 will not see a drop in their taxpayer-funded grants, even if they are lying and cheating.
Modern civilisation? People freezing to death because of winter fuel shortages. 100 years ago we’d have just shovelled some more coal on the fire.

R James
December 20, 2010 3:12 am

I agree with madman2001 – the background rubbish on the graphs is distracting and poor scientific presentation – get rid of it if you want to be taken seriously. Marketing whiz kids do this sort of thing – serious scientists don’t.

Nullius in Verba
December 20, 2010 3:14 am

The sensitivity question is just the usual transient vs equilibrium warming issue.
We’ve had a 40% increase in CO2, which is ‘half’ a doubling, so we ought to get about half the sensitivity as a temperature rise. Since we observe around 0.65 C warming, we get 1.3 C sensitivity. Simple as.
No model that purports to reproduce observations can have a transient sensitivity much higher than this. The equilibrium sensitivity is the scarier number.
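The arithmetic in that comment checks out: a 40% CO2 rise is log2(1.4) ≈ 0.49 of a doubling, and 0.65°C divided by that fraction gives roughly 1.3°C per doubling:

```python
import math

co2_ratio = 1.4          # a 40% rise in CO2, as stated in the comment
observed_warming = 0.65  # deg C, the comment's assumed warming

fraction_of_doubling = math.log(co2_ratio, 2)   # 'half' a doubling
implied_sensitivity = observed_warming / fraction_of_doubling
print(round(fraction_of_doubling, 2), round(implied_sensitivity, 1))
```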

tallbloke
December 20, 2010 3:19 am

The enhanced greenhouse theory is in big trouble.
The Ocean heat content of the top 700m has been flat to falling from 2003.5 and the radiative imbalance at the top of the atmosphere when reconciled with that shows a small drop.
Trenberth’s ‘missing heat’ isn’t missing somewhere in the system; It just ain’t there. Perhaps the TIMS TSI measurements being 4W below the older TSI measurements aren’t so far out after all.
I wonder when they’ll get around to trying to explain them.
http://tallbloke.wordpress.com/2010/12/20/working-out-where-the-energy-goes-part-2-peter-berenyi/

Chris Wright
December 20, 2010 3:28 am

“The IPCC actually says that because the tuned models don’t work well with part of their input removed, this shows that humans are the cause of the warming … not sure what I can say about that.”
I’ve always regarded this ‘proof’ as little more than a confidence trick. Climate models are just computer programs – and you can easily write programs to ‘prove’ anything you want. Here’s how I would reproduce this ‘proof’.
1. I would put in all the required physical laws and initial conditions: honest, but it would fail miserably.
2. I would add a large forcing due to CO2: still honest (assuming I believed AGW to be correct in the first place). It would still fail to accurately reproduce historical climate.
3. I would then add in other arbitrary forcings and adjustments to achieve a good agreement with the historical climate (I would give this process an impressive name such as ‘parameterisation’): completely dishonest, but the model now perfectly reproduces the historical climate.
This would be Exhibit A: it gives a perfect match.
I would then remove the CO2 forcing. By definition, it will no longer match the historical climate. This would be Exhibit B.
For gullible people such as David Attenborough (as shown at the end of his film entitled ‘The Truth About Global Warming’), Exhibits A and B would provide perfect proof for AGW.
But of course it would prove nothing. Because it would be dishonest. In other words, a confidence trick.
Chris

Steve Keohane
December 20, 2010 3:50 am

Nice work Willis. I like the low CO2 sensitivity, more indication it is a minor player in CC.
E.M.Smith says: December 20, 2010 at 12:55 am
Is there any reason why “Land Use” is a negative? I’d expect changes of land use to increase warming. Asphalt warm. Grass cool…

I’m pretty sure, Pielke Sr. has ‘Land Use’ as a positive, his perception of the most egregious UHI error. I think this is the correct paper:
http://www.agu.org/pubs/crossref/2007/2006JD008229.shtml

Jessie
December 20, 2010 4:01 am

anna v says:
December 19, 2010 at 11:04 pm
Willis Eschenbach says:
December 19, 2010 at 10:37 pm
That is interesting Anna and also the post by Willis has been very informative, as always. Thank you.
A similar story but from the health field. We used to blood let on a massive scale for testing of treponema pallidum (syphilis) in very remote areas. A very nasty disease and especially for infants/children born to undiagnosed mothers. And particularly those living in the now ‘eco-regions’ of the world.
The tests were reported within certain sensitivity parameters and treatment ordered within these parameters. We were able to treat well (very basic clinics, no telephones) based on these results (and our diligent provision of past history and treatment on the pathology forms). Then different pathology labs with different sensitivities directing different treatment regimes reared up.
The ruler or the instrument (specificity) had changed.
We realised almost 100% detection and treatment in a small population of tribal peoples. Years later they report rates 30-60x that of the national population.
What happened?
The attitude was changed from sound clinical health practices to sociological understandings. These latter understandings and thus parameters effected massive funding for ‘forcing’ of other regressive policies.
As mentioned in the posts:- Urederra and Hartley.
Urederra says:
December 19, 2010 at 6:24 pm
Why do they need a 160 teraflops supercomputer?
To make forcings ‘a la carte’ that fit with their desired predictions.
It is ecneics, science made backwards.

Nial
December 20, 2010 4:04 am

As an engineer I’m gob-smacked that there isn’t an agreed standard set of historical forcings against which all climate models are verified.
If I was able to make up my own acceptance tests, everything would work perfectly first time! Unbelievable.
Is there any mechanism to get them to release the “historical” data sets they’re using for their hindcasts?
Nial

December 20, 2010 4:30 am

The GISSE model results show a climate sensitivity of half a degree per doubling of CO2, far below the IPCC value.
Wow. Sounds like the science is settled. Seems everyone is converging on this value lately, whether they wanted to or not. A compendium of results showing sensitivity < 1°C might be a handy reference tool.

December 20, 2010 4:50 am

Back in 2007, I made a similar (rudimentary) analysis. Results can be found at http://users.telenet.be/j.janssens/Climate2007/Climatereconstruction.html
I got an r²=0.92 for a 6-parameter global climate model with CO2, and r²=0.74 for the model without CO2. Parameters (SOI, aa, AMO, NAO, volcano, CO2 vs. GISS yearly temperature data) were chosen for their long-term availability, and not really for their potential impact on climate.
It was during this project that I was confronted for the first time with the GISS data-tinkering, the deviation between surface and satellite measurements (better correlation with no-CO2 models), and the possibility that the solar influence might be less than I had originally assumed (perhaps I should have chosen a parameter other than aa).
My main conclusion at the time was that “…The foregoing analysis certainly does not rule out CO2 as a contributor to the observed global warming. Rather, it provides grounds that its contribution is much less than that claimed by the IPCC, and that AMO contributes a significant chunk to the temperature evolution. Also, the obtained results warrant a review of the temperature data, and the way in which they are obtained and handled. …”
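Janssens's with/without-CO2 comparison boils down to ordinary least squares on nested predictor sets. The sketch below is illustrative only: the predictors are synthetic stand-ins for the series he names (SOI, aa, AMO, NAO, volcano, CO2), not the real data, so the R² values it prints are not his.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # roughly the span of yearly data from 1880 onward

# Synthetic stand-ins for the five non-CO2 predictors, plus a rising
# CO2 proxy; the real series are not reproduced here.
X_no_co2 = rng.standard_normal((n, 5))
co2 = np.linspace(0.0, 1.0, n)
X = np.column_stack([X_no_co2, co2])

# A synthetic "temperature" built from all six predictors plus noise.
temp = X @ np.array([0.1, 0.05, 0.3, 0.05, -0.2, 0.6]) \
       + 0.1 * rng.standard_normal(n)

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (with intercept)."""
    A = np.column_stack([X, np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

r2_with = r_squared(X, temp)
r2_without = r_squared(X_no_co2, temp)
print(f"R^2 with CO2:    {r2_with:.2f}")
print(f"R^2 without CO2: {r2_without:.2f}")
```

Because the with-CO2 model nests the without-CO2 model, its R² can only be higher; the interesting question, as in the comment, is by how much.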

stephen richards
December 20, 2010 at 5:04 am

Willis
Have you tried Roy Spencer's Excel model? It is simple and not too different from yours.
I wrote some years ago at SteveMc's site that I was a physicist, software programmer, and project manager before my retirement, and that I had looked at the models as best I could at that time and thought they were lacking in control. Then along came Climategate and voilà, there was the proof. They have zero version control, zero VV&T (verification, validation and testing), zero intra-module control, zero variable management, and so on. I know for certain, no matter what anyone might tell me, that none of these models can be relied on even for a basic understanding of climate.
Thanks Willis. As Prof Mosher said, so do I.

Bill Illis
December 20, 2010 at 5:33 am

Trenberth published a few important papers in 2009, including the "missing energy" one, but a more useful one was "Tracking Earth's Energy Budget."
http://www.cgd.ucar.edu/cas/Trenberth/trenberth.papers/EnergyDiagnostics09final2.pdf
It was mainly a call for better monitoring, but it also contained the following chart, which tells the story better than any other you will see because it includes, for the very first time, the feedbacks that are occurring/expected.
http://img638.imageshack.us/img638/8098/trenberthnetradiation.jpg
==> IPCC Anthro forcing to date (lower than GISS Model E) = +1.6 W/m2
==> Feedbacks which are supposed to be occurring (mostly water vapour) = +2.1 W/m2
==> mysterious "Negative Radiative Feedback" that is occurring = -2.8 W/m2
Some of this Negative Radiative Feedback could be the oceans absorbing some of the forcings, but the most anyone can come up with for this is 0.5 W/m2, and it has gone to zero in the last several years. Some of it could be that the feedbacks just aren't occurring as expected. Even if that were true, there would still have to be some small negative feedback left anyway.
But Trenberth calculated the negative feedback number based on the 0.75C/W/m2 response that is expected in the theory and in the climate models. The negative feedback, however, wouldn’t exist if the actual climate responds according to the Stefan-Boltzmann equations instead (which is how it should be calculated anyway).
+3.7 W/m2 [of anthro and water vapour feedback forcing] -0.5 W/m2 [ocean absorption] = +3.2 W/m2. Dividing the observed warming to date (~0.7C) by this gives just 0.22C/W/m2, which is very close to what the SB equations say it should be.
So, either there is some mysterious really large negative feedback to date that we can’t find or the global warming community got so carried away with their 0.75C/W/m2 response factor and their climate models that they forgot how to do basic math.
GHG doubling with actual climate response to a given forcing to date = +1.5C
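Bill Illis's arithmetic can be checked directly. A minimal sketch, with two assumptions flagged: the ~0.7 C of observed warming is back-solved from the comment's own numbers (0.22 × 3.2 ≈ 0.7), and the Stefan-Boltzmann no-feedback response dT/dF = 1/(4σT³) is evaluated at the two conventional temperatures (255 K effective emission, 288 K mean surface).

```python
# Stefan-Boltzmann no-feedback response: dT/dF = 1 / (4 * sigma * T^3)
sigma = 5.67e-8                   # Stefan-Boltzmann constant, W m^-2 K^-4
for T in (255.0, 288.0):          # effective emission / mean surface temp, K
    print(f"T = {T:.0f} K: dT/dF = {1.0 / (4.0 * sigma * T**3):.3f} C per W/m2")

# The comment's numbers: forcing-plus-feedback to date, minus assumed
# ocean uptake, against ~0.7 C of observed warming (0.7 is an assumption,
# back-solved from the 0.22 C/W/m2 figure in the comment).
net_forcing = 3.7 - 0.5           # W/m2
observed_warming = 0.7            # C
response = observed_warming / net_forcing
print(f"Implied response: {response:.2f} C per W/m2")
print(f"Warming per CO2 doubling at this response: {3.7 * response:.1f} C")
```

The blackbody response works out to roughly 0.27 C per W/m2 at 255 K and 0.18 C per W/m2 at 288 K, bracketing the 0.22 figure in the comment.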

Frank K.
December 20, 2010 at 5:40 am

The main problem I have with GCMs like Model E is that in many cases (at least in the case of Model E) they are poorly documented, poorly designed and poorly written. For those who wish to see the Model E source code in all its FORTRAN glory, you can find it here
For all the money we spend on this "research", you'd think they could do better, especially (as one poster said above) since we are basing public policy decisions worth billions of dollars on the results of these simulations. Unfortunately, as Gavin Schmidt once replied on another blog, they don't have time to provide full documentation and testing of their code – they are paid to do "science"!!

Speed
December 20, 2010 at 6:21 am

Is anyone concerned that the GISSE Climate Model Results are much smoother than the GISS Global Temperature in Figure 1? While year-to-year variations are more weather than climate, and we expect a climate model to reproduce the trend and average of measured global temperature rather than exact annual values, why doesn't the model reproduce the wide variations that occur from year to year?
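Speed's question can be made quantitative: detrend each series and compare the standard deviations of the residuals. The sketch below uses synthetic series (a shared linear trend plus noise of different amplitudes) since the actual GISSTemp and Model E outputs are not reproduced here; the noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1880, 2001)
trend = 0.005 * (years - 1880)                          # shared warming trend, C

# Synthetic stand-ins: noisy "observations" vs. a smoother "hindcast".
obs   = trend + 0.12 * rng.standard_normal(len(years))
model = trend + 0.04 * rng.standard_normal(len(years))

def detrended_std(y, t):
    """Standard deviation of residuals after removing a linear trend."""
    fit = np.polyval(np.polyfit(t, y, 1), t)
    return (y - fit).std()

print(f"obs year-to-year spread:   {detrended_std(obs, years):.3f} C")
print(f"model year-to-year spread: {detrended_std(model, years):.3f} C")
```

A large ratio between the two spreads, as in Figure 1, is the kind of diagnostic the comment is asking about: the model tracks the trend but generates far less interannual variability than the observations.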

Roger Andrews
December 20, 2010 at 6:34 am

Willis
Regarding your post of Dec 19, 10.20pm, please see my earlier post of Dec 19, 5.51pm.

Martin Brumby
December 20, 2010 at 6:59 am

Willis Eschenbach says: December 19, 2010 at 10:37 pm
[…]
“Wouldn’t it be nice if someone from the GISSE modeling team would comment on this, or explain to me where I’m wrong? Or say anything?”
My guess is that they’re way too busy studying the “Situations Vacant” lists….
Another excellent post, Willis. And I like the backgrounds, too…..
