Dr. McKitrick’s new paper with Lise Tole is now online at Climate Dynamics. He also has an op-ed in the Financial Post on June 13; a version with the citations provided is here. Part II is online here, and the version with citations is here.
**McKitrick, Ross R. and Lise Tole (2012) “Evaluating Explanatory Models of the Spatial Pattern of Surface Climate Trends using Model Selection and Bayesian Averaging Methods” Climate Dynamics, DOI: 10.1007/s00382-012-1418-9**
The abstract is:
We evaluate three categories of variables for explaining the spatial pattern of warming and cooling trends over land: predictions of general circulation models (GCMs) in response to observed forcings; geographical factors like latitude and pressure; and socioeconomic influences on the land surface and data quality. Spatial autocorrelation (SAC) in the observed trend pattern is removed from the residuals by a well-specified explanatory model. Encompassing tests show that none of the three classes of variables account for the contributions of the other two, though 20 of 22 GCMs individually contribute either no significant explanatory power or yield a trend pattern negatively correlated with observations. Non-nested testing rejects the null hypothesis that socioeconomic variables have no explanatory power. We apply a Bayesian Model Averaging (BMA) method to search over all possible linear combinations of explanatory variables and generate posterior coefficient distributions robust to model selection. These results, confirmed by classical encompassing tests, indicate that the geographical variables plus three of the 22 GCMs and three socioeconomic variables provide all the explanatory power in the data set. We conclude that the most valid model of the spatial pattern of trends in land surface temperature records over 1979-2002 requires a combination of the processes represented in some GCMs and certain socioeconomic measures that capture data quality variations and changes to the land surface.
He writes on his website:
We apply classical and Bayesian methods to look at how well 3 different types of variables can explain the spatial pattern of temperature trends over 1979-2002. One type is the output of a collection of 22 General Circulation Models (GCMs) used by the IPCC in the Fourth Assessment Report. Another is a collection of measures of socioeconomic development over land.
The third is a collection of geographic indicators including latitude, coastline proximity and tropospheric temperature trends. The question is whether one can justify an extreme position that rules out one or more categories of data, or whether some combination of the three types is necessary. I would describe the IPCC position as extreme, since they dismiss the role of socioeconomic factors in their assessments. In the classical tests, we look at whether any combination of one or two types can “encompass” the third, and whether non-nested tests combining pairs of groups reject either 0% or 100% weighting on either. (“Encompass” means provide sufficient explanatory power not only to fit the data but also to account for the apparent explanatory power of the rival model.) In all cases we strongly reject leaving out the socioeconomic data.
In only 3 of 22 cases do we reject leaving out the climate model data, but in one of those cases the correlation is negative, so only 2 count; that is, in 20 of 22 cases we find the climate models are either no better than or worse than random numbers. We then apply Bayesian Model Averaging to search over the space of 537 million possible combinations of explanatory variables and generate coefficients and standard errors robust to model selection (aka cherry-picking). In addition to the geographic data (which we include by assumption), we identify 3 socioeconomic variables and 3 climate models as the ones that belong in the optimal explanatory model, a combination that encompasses all remaining data. So our conclusion is that a valid explanatory model of the pattern of climate change over land requires use of both socioeconomic indicators and GCM processes. The failure to include the socioeconomic factors in empirical work may be biasing analysis of the magnitude and causes of observed climate trends since 1979.
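A note on the 537 million figure: a BMA search over k candidate regressors considers all 2^k subsets of them, and 2^29 = 536,870,912, so the number quoted implies roughly 29 candidate variables (that count is inferred here from the arithmetic, not stated in the excerpt). A minimal check:

```python
# Back-of-envelope check of the "537 million combinations" figure:
# a Bayesian Model Averaging search over k candidate regressors
# considers every subset, i.e. 2**k models. 2**29 matches the
# number quoted above, suggesting ~29 candidate variables
# (inferred here, not stated explicitly in the excerpt).
k = 29
n_models = 2 ** k
print(f"{n_models:,} possible models")  # 536,870,912, i.e. ~537 million
```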
Minor typo – the op-ed was published on June 13.
[REPLY: Fixed. Thanks. -REP]
So FGOALS and CCSM3.0 have the best explanatory power as per this paper. I find this odd because on other measures (e.g. seasonal temperature variations in the oceans) they are amongst the worst.
Matt Briggs has a “random walk” climate generating program here, which can be easily downloaded.
http://wmbriggs.com/blog/?p=257
The standard deviation used by Matt Briggs was 0.1238 degrees C for a year.
Using 0.01 degrees, a random walk would give an average change proportional to the square root of n over n years. After 100 years, the average change would be plus or minus sqrt(100) × 0.01 = 10 × 0.01 = 0.1 C, which is plausible.
After 10,000 years, the average change would be sqrt(10,000) × 0.01 = 100 × 0.01 = 1 C, quite possible.
After 100,000,000 years, the average change would be sqrt(100,000,000) × 0.01 = 10,000 × 0.01 = 100 C, which is ridiculous.
Obviously there’s some dampening effect, making it less and less likely that temperatures will go from high to higher, or from low to lower, else life on earth would have been eliminated long ago.
What would a better randomized deviation be? Would it be proportional to log(n) rather than sqrt(n)?
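For what it’s worth, here is a minimal simulation sketch of the sqrt(n) scaling discussed above, assuming independent Gaussian annual steps of 0.01 C. This is an illustration only, not Matt Briggs’ actual program:

```python
import random

# Illustrative random walk: annual temperature steps with SD 0.01 C,
# as in the comment above. For an unbounded random walk the RMS
# displacement after n steps grows like sqrt(n) * step_sd, which is
# the scaling this reproduces.
def rms_displacement(n_years, step_sd=0.01, trials=500, seed=42):
    """Root-mean-square displacement after n_years annual steps."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        pos = 0.0
        for _ in range(n_years):
            pos += rng.gauss(0.0, step_sd)
        total += pos * pos
    return (total / trials) ** 0.5

for n in (100, 10_000):
    # Theory: sqrt(100)*0.01 = 0.1 C and sqrt(10_000)*0.01 = 1 C
    print(n, "years:", round(rms_displacement(n), 3), "C")
```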
“10 of the 22 climate models predicted a pattern that was negatively correlated with observations and had to be removed from most of the analysis to avoid biasing the results. In 10 other cases we found the climate models predicted a pattern that was loosely correlated with observations, but not significantly so—in other words not significantly better than random numbers. In only 2 cases was there statistically significant evidence of explanatory power.”
“In all 22 cases the probability you could leave out the socioeconomic data was computed as zero. But only in 3 of 22 cases did the data say you should keep the GCM, and in one of those cases the fit was negative (opposite to the observed patterns) so it didn’t count. So, again, only 2 of 22 climate models demonstrated enough explanatory power to be worth retaining, but in all 22 cases the data gave primary support to the socioeconomic measures the IPCC insists should not be used.”
“But the IPCC has taken an extreme position, that the socioeconomic patterns have no effect and any temperature changes must be due to global “forcings” like carbon dioxide (CO2) emissions. Studies that claim to detect the effects of CO2 emissions on the climate make this assumption, as do those that estimate the rate of greenhouse gas-induced warming. As Allen says, if this assumption isn’t true, there are a lot of papers that would have to be retracted or redone.”
“The three climate models consistently identified as having explanatory power were from China, Russia and a US lab called NCAR. Climate models from Norway, Canada, Australia, Germany, France, Japan and the UK, as well as American models from Princeton and two US government labs (NASA and NOAA), failed to exhibit any explanatory power for the spatial pattern of surface temperature trends in any test, alone or in any combination.”
The Financial Post article provides good context for the research reported here on WUWT. [FP claims an update on June 14th]
~~~~
The headline writer for the Financial Post places this under
“Junk Science Week: Climate models fail reality test”
. . . then provides a last line:
“Next week: Climate models offer millions of ways of getting the wrong answer.”
There are 82 comments, some about the headline and some trash, as of 6:30 am Thursday, the 21st.
Have I got this right?
‘Socioeconomic variables’ are associated with things like land use and the urban heat island effect. Yes/no?
Dr. Pielke Sr. has provided us with ample evidence that land use has a large effect on regional climate. The Urban Heat Island effect is well known.
In other words, it is quite reasonable that socioeconomic variables could have an effect on regional climate.
The failure of the GCMs comes as no surprise. The dynamical equations contained in these complex “sophisticated” models are derived by applying Newton’s second law to a single parcel of air which is assumed never to mix with the atmosphere (e.g. Haltiner and Martin 1957). Once the equations are derived they are arbitrarily “transformed” from a Lagrangian to an Eulerian frame of reference. Such a transformation allows the equations to be treated as though they have general applicability to all diffusive atmospheric conditions. This is contrary to the assumption, built into the derivation of the equations, that they apply only to a single indivisible air parcel.
The models would be rejected out-of-hand by any student of logic.
Reference: Haltiner, G.J. and F.L. Martin (1957). Dynamical and Physical Meteorology. McGraw-Hill Book Company Inc., New York/Toronto/London.
Douglas Leahey
I did not realize the GT and LT signs would remove the text.
So, the 2 lines are:
“Junk Science Week: Climate models fail reality test”
and
“Next week: Climate models offer millions of ways of getting the wrong answer.”
AJ on June 21, 2012 at 5:54 am said:
“So FGOALS and CCSM3.0 have the best explanatory power as per this paper. I find this odd because on other measures (e.g. seasonal temperature variations in the oceans) they are amongst the worst.”
Not odd at all. At least one model should be expected to get good agreement by pure dumb luck, given that there are more than twenty.
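That point can be made quantitative. Assuming, purely for illustration, that each of the 22 models is tested independently at the 5% significance level, the chance that at least one passes by luck alone is 1 − 0.95^22 ≈ 0.68, with about 1.1 spurious passes expected:

```python
# Multiple-comparisons arithmetic behind the "dumb luck" point.
# Assumptions (for illustration only): each of the 22 models is
# tested independently at the 5% significance level.
alpha, n_models = 0.05, 22
p_at_least_one = 1 - (1 - alpha) ** n_models
expected_false_positives = alpha * n_models
print(f"P(at least one spurious pass) = {p_at_least_one:.2f}")          # ~0.68
print(f"Expected spurious passes      = {expected_false_positives:.1f}") # ~1.1
```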
John F. Hultquist says:
June 21, 2012 at 6:41 am
HTML tags begin with a LT symbol and end with a GT symbol. The software saw the pair, assumed it was an HTML tag that it didn’t recognize, and removed it.
[In this context, LT = less than, GT = greater than, symbols when typed. Robt]
“The three climate models consistently identified as having explanatory power were from China, Russia and a US lab called NCAR.”
Any bets which one of the three had negative explanatory power?
This is not surprising. The climate models predict a warming earth, and this is happening. However, the idea that regional-level effects can be accurately predicted is a bit far-fetched, especially on as short a time scale as 23 years. Maybe over 100 years.
But it matters little how the warming is distributed around the world, it remains highly likely to be harmful to us.
commieBob says:
June 21, 2012 at 6:34 am
The Urban Heat Island effect is well known.
==========
Not in climate science. Be it Phil “the dog ate my homework” Jones, or BEST and its faulty statistical comparison, ignoring delta/delta and finding no correlation (no surprise).
The IPCC has latched onto these studies (no UHI) because they support the notion that coal is bad and must be taxed out of existence. Since coal is currently the only practical fuel in plentiful supply, this will create artificial worldwide shortages of energy allowing for windfall profits by those that invest ahead of the regulations.
We see this already in Australia, where the CO2 tax is forcing Australians to export their coal to China rather than using it to power their own economy. The Chinese could not have hoped for better terms if they paid the Oz government.
John Brookes says:
“But it matters little how the warming is distributed around the world, it remains highly likely to be harmful to us.”
Why? Basis? Logic? Reasoning? Anything? Or is it just fear of the unknown?
“In only 3 of 22 cases do we reject leaving out the climate model data, …”
The outputs of climate models may be many things, but they are not DATA.
John Brookes says:
June 21, 2012 at 7:17 am
The climate models predict a warming earth, and this is happening.
===========
Low temperatures are increasing, which is increasing the averages. However, high temperatures are not increasing. So all that can really be said is that climate is becoming less extreme. By averaging temperatures, climate science has painted a misleading picture as to what is actually happening, which is at the heart of the failure of climate models to match reality.
The lack of increase in high temperatures points to negative feedback in the system, something that not one of the GCMs used by the IPCC allows for. Every one of the IPCC GCMs assumes a positive climate feedback.
No climate model has been programmed with negative feedback to see if it does a better job of explaining climate. Why? The reason you use models is to test assumptions. This major assumption remains untested.
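To make the averaging point concrete, here is a toy sketch with invented numbers: if nightly lows rise while daily highs stay flat, the daily mean rises even as the diurnal range narrows.

```python
# Toy illustration of the averaging point above (numbers invented,
# purely for illustration): rising lows with flat highs raise the
# daily mean even though the diurnal range narrows.
def summarize(t_min, t_max):
    return (t_min + t_max) / 2, t_max - t_min  # (mean, range)

before = summarize(t_min=10.0, t_max=30.0)  # mean 20.0, range 20.0
after = summarize(t_min=12.0, t_max=30.0)   # mean 21.0, range 18.0
print("mean rose by", after[0] - before[0], "C")   # +1.0 C
print("range fell by", before[1] - after[1], "C")  # 2.0 C
```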
John Brookes says:
June 21, 2012 at 7:17 am
But it matters little how the warming is distributed around the world, it remains highly likely to be harmful to us.
=======
Warming is not harmful to humans. We cannot survive without clothing (technology) below 28C. Only the tropical jungles are this warm. Every other place on the planet is currently fatal to humans without technology.
does “socioeconomic indicators” = UHI?
But he doesn’t even mention sustainability…
Who is paying attention to any of this?
Is anyone in the current US administration paying any attention? Anyone at IPCC?
How about anyone wielding power in the developed world, or reporting on same?
Mark Wagner says:
June 21, 2012 at 8:02 am
does “socioeconomic indicators” = UHI?
=======
Studies that compare population to temperature ignore industrialization rates. This is how the UHI has been typically studied in the past (BEST). Thus highly populated areas are assumed to be identical, regardless of economic activity (Asia/USA/EU). On this basis there is little UHI signal. This new study shows that it is economic activity, not population that drives temperature.
In regards to:
John Brookes says:
June 21, 2012 at 7:17 am
Well, John…if the atmosphere has been warming over the last 15 years, it is not significant enough to proclaim it with such assurance. I could easily look at RSS and claim just the opposite.
Secondly, there is no logic in thinking that something that has no skill over a 23 year time period will suddenly acquire it if it does the same thing over a 100 year time period. Are you suggesting that 23 wrongs don’t make a right, but 100 will? If a model is built on false assumptions, it will never make good predictions, except for some occasional curve fitting; a blind-squirrel phenomenon. (See ‘the late 20th century and AGW’ for an example.)
Thirdly, your last statement is the most unfounded. It is derived from the false assumption that is a bedrock of modern environmentalism: Any change to the environment, especially if created by humans, is a negative, and often a crisis. This assumption denies the entire history of the planet, the theory of evolution and, indeed, life itself. Without change, this planet would still be a lifeless rock. But the planet is constantly changing, producing temperature swings far greater than humans can create. The result is a thriving biosphere, particularly when the planet is warm. The vast majority of species do better when the planet is warmer.
Does a warmer (than today) world present some need for adaptation? Certainly. But that need is tiny compared to what would be required in a cooling world of equal magnitude. And a warmer world would provide a great many blessings, like dramatically reducing the amount of energy humans need to survive and greatly increasing the available land and season length for food production. It is unfortunate that we face a much greater probability of a cooler world in the centuries ahead. The Holocene could double dip and continue for another 12,000 years, but it is more likely that its days are numbered, according to the recent (5 million year) history of this planet.
ferd berple says:
June 21, 2012 at 8:18 am
This new study shows that it is economic activity, not population that drives temperature.
=====
Thus the solution to global warming is to halt economic activity.