Modeling in the red

From an Ohio State University press release, where they see a lot of red and little else, comes yet another warm-certainty model:

STATISTICAL ANALYSIS PROJECTS FUTURE TEMPERATURES IN NORTH AMERICA

[Figure: Upper-left panel: posterior mean of the average temperature-change projections. Upper-right panel: posterior standard deviation of the projections. Lower panels: pixelwise posterior 2.5th (lower-left) and 97.5th (lower-right) percentiles. Units for all panels: °C. Source: Kang and Cressie (2012)]
COLUMBUS, Ohio – For the first time, researchers have been able to combine different climate models using spatial statistics to project future seasonal temperature changes in regions across North America.

They performed advanced statistical analysis on two different North American regional climate models and were able to estimate projections of temperature changes for the years 2041 to 2070, as well as the certainty of those projections.

The analysis, developed by statisticians at Ohio State University, examines groups of regional climate models, finds the commonalities between them, and determines how much weight each individual climate projection should get in a consensus climate estimate.
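
The press release does not spell out the algorithm, but a minimal sketch of the weighting idea, under assumed numbers, looks like this: each model's projection is weighted by its precision (inverse variance), so better-constrained models count for more in the consensus. Everything below is illustrative; it is not Kang and Cressie's actual code or data.

```python
import numpy as np

# Toy precision-weighted consensus for one grid block. The projections
# and standard deviations are invented for illustration; they are not
# values from Kang and Cressie's models.
proj = np.array([2.7, 2.3])    # two models' projected changes (deg C)
sigma = np.array([0.6, 0.4])   # assumed per-model standard deviations

# Precision weighting: models with smaller variance get more weight.
weights = 1.0 / sigma**2
weights = weights / weights.sum()

consensus = float(np.sum(weights * proj))
consensus_sd = float(np.sqrt(1.0 / np.sum(1.0 / sigma**2)))
print(f"consensus: {consensus:.2f} deg C (sd {consensus_sd:.2f})")
```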

Through maps on the statisticians’ website, people can see how their own region’s temperature will likely change by 2070 – overall, and for individual seasons of the year.

Given the complexity and variety of climate models produced by different research groups around the world, there is a need for a tool that can analyze groups of them together, explained Noel Cressie, professor of statistics and director of Ohio State’s Program in Spatial Statistics and Environmental Statistics.

Cressie and former graduate student Emily Kang, now at the University of Cincinnati, present the statistical analysis in a paper published in the International Journal of Applied Earth Observation and Geoinformation.

“One of the criticisms from climate-change skeptics is that different climate models give different results, so they argue that they don’t know what to believe,” he said. “We wanted to develop a way to determine the likelihood of different outcomes, and combine them into a consensus climate projection. We show that there are shared conclusions upon which scientists can agree with some certainty, and we are able to statistically quantify that certainty.”

For their initial analysis, Cressie and Kang chose to combine two regional climate models developed for the North American Regional Climate Change Assessment Program (NARCCAP). Though the models produced a wide variety of climate variables, the researchers focused on temperatures during a 100-year period: first, the climate models’ temperature values from 1971 to 2000, and then the climate models’ temperature values projected for 2041 to 2070. The data were broken down into blocks 50 kilometers (about 30 miles) on a side, throughout North America.
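
As a rough sketch of the bookkeeping described above (synthetic data; the grid dimensions and values are invented, not NARCCAP's), the per-block quantity being modeled is the difference between the two 30-year averages:

```python
import numpy as np

# Hypothetical gridded output: yearly mean temperature on a 50 km grid.
# Array shapes and values are invented; real NARCCAP fields differ.
rng = np.random.default_rng(42)
hist = 10.0 + rng.normal(0.0, 1.0, size=(30, 60, 50))    # 1971-2000
future = 12.5 + rng.normal(0.0, 1.0, size=(30, 60, 50))  # 2041-2070

# Per-block projected change: mean of the future 30-year period minus
# mean of the historical 30-year period, block by block.
change = future.mean(axis=0) - hist.mean(axis=0)
print(f"grid-average projected change: {change.mean():.2f} deg C")
```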

Averaging the results over those individual blocks, Cressie and Kang’s statistical analysis estimated that average land temperatures across North America will rise around 2.5 degrees Celsius (4.5 degrees Fahrenheit) by 2070. That result agrees with the findings of the United Nations Intergovernmental Panel on Climate Change, which suggest that, under the same emissions scenario used by NARCCAP, global average temperatures will rise 2.4 degrees Celsius (4.3 degrees Fahrenheit) by 2070. Cressie and Kang’s analysis is for North America, and it estimates not only average land temperature rise but also regional temperature rise for all four seasons of the year.

Cressie cautioned that this first study is based on a combination of a small number of models. Nevertheless, he continued, the statistical computations are scalable to a larger number of models. The study shows that climate models can indeed be combined to achieve consensus, and the certainty of that consensus can be quantified.

The statistical analysis could be used to combine climate models from any region in the world, though, he added, it would require an expert spatial statistician to modify the analysis for other settings.

The key is a special combination of statistical analysis methods that Cressie pioneered, which use spatial statistical models in what researchers call Bayesian hierarchical statistical analyses.
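
The release stays vague here, but in the bracket notation Cressie himself popularized, a generic Bayesian hierarchical model factors the joint distribution into three levels (this is the standard decomposition, not the paper's exact specification):

[Y, θ | Z] ∝ [Z | Y, θ] · [Y | θ] · [θ]

where Z is the observed climate-model output, Y is the true spatial field of temperature change, and θ holds the parameters: a data model, a process model, and a parameter prior, combined and normalized via Bayes' rule.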

The latter techniques come from Bayesian statistics, which allows researchers to quantify the certainty associated with any particular model outcome. All data sources and models are more or less certain, Cressie explained, and it is the quantification of these certainties that are the building blocks of a Bayesian analysis.

In the case of the two North American regional climate models, his Bayesian analysis technique was able to give a range of possible temperature changes that includes the true temperature change with 95 percent probability.
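
To make concrete what "includes the true temperature change with 95 percent probability" means, here is a toy normal-normal update for a single grid block. Every number is assumed for illustration; the paper's hierarchical spatial model is far more elaborate.

```python
import numpy as np
from scipy import stats

# Toy Bayesian update for one grid block's temperature change (deg C).
# Prior and data values are invented for illustration only.
prior_mean, prior_sd = 2.0, 1.0      # assumed prior on the change
obs = np.array([2.7, 2.3])           # the two models' projections
obs_sd = 0.5                         # assumed per-model noise level

# Conjugate posterior: precisions add, means combine precision-weighted.
post_prec = 1.0 / prior_sd**2 + obs.size / obs_sd**2
post_mean = (prior_mean / prior_sd**2 + obs.sum() / obs_sd**2) / post_prec
post_sd = np.sqrt(1.0 / post_prec)

# 95% credible interval: contains the change with 0.95 posterior
# probability, conditional on the model and the prior.
lo, hi = stats.norm.ppf([0.025, 0.975], loc=post_mean, scale=post_sd)
print(f"posterior {post_mean:.2f} deg C, 95% interval [{lo:.2f}, {hi:.2f}]")
```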

After producing average maps for all of North America, the researchers took their analysis a step further and examined temperature changes for the four seasons. On their website, they show those seasonal changes for the Hudson Bay, Great Lakes, Midwest, and Rocky Mountain regions.

In the future, the Hudson Bay region will likely experience larger temperature swings than the others, they found.

That Canadian region in the northeast part of the continent is likely to experience the biggest change over the winter months, with temperatures estimated to rise an average of about 6 degrees Celsius (10.7 degrees Fahrenheit) – possibly because ice reflects less energy away from the Earth’s surface as it melts. Hudson Bay summers, on the other hand, are estimated to experience only an increase of about 1.2 degrees Celsius (2.1 degrees Fahrenheit).

According to the researchers’ statistical analysis, the Midwest and Great Lakes regions will experience a rise in temperature of about 2.8 degrees Celsius (5 degrees Fahrenheit), regardless of season. The Rocky Mountains region shows greater projected increases in the summer (about 3.5 degrees Celsius, or 6.3 degrees Fahrenheit) than in the winter (about 2.3 degrees Celsius, or 4.1 degrees Fahrenheit).

In the future, the researchers could consider other climate variables in their analysis, such as precipitation.

This research was supported by NASA’s Earth Science Technology Office. The North American Regional Climate Change Assessment Program is funded by the National Science Foundation, the U.S. Department of Energy, the National Oceanic and Atmospheric Administration, and the U.S. Environmental Protection Agency Office of Research and Development.

###



95 Comments
Bill Tuttle
May 15, 2012 9:11 pm

“One of the criticisms from climate-change skeptics is that different climate models give different results, so they argue that they don’t know what to believe,” he said.
I have never seen anyone make that statement — did Cressie just transfer from ANU, or did he think that one up all on his own?

Ken in Beaverton, OR
May 15, 2012 9:12 pm

Looking at the groups that supported this “research”, I am not surprised by the conclusions.

May 15, 2012 9:14 pm

NEW! Models with consensus included, the must-have tool for the ambitious climate ‘scientist’.

Tired of theBS
May 15, 2012 9:25 pm

How far into the future is this model? If it’s August, they might be right this time.

May 15, 2012 9:29 pm

How can you statisticize chaos? Here it is, folks. We’re gonna fry… statistically. Oh, and everyone agrees.

May 15, 2012 9:30 pm

Trouble is that the projected temp could be -2.4C and the probability is just as valid. That’s what my computer says.

May 15, 2012 9:33 pm

“We show that there are shared conclusions upon which scientists can agree with some certainty, and we are able to statistically quantify that certainty.”
Huh?
Hudson Bay summers, on the other hand, are estimated to experience only an increase of about 1.2 degrees Celsius (2.1 degrees Fahrenheit).
After quantifying the certainty, they merely estimate that Hudson Bay will do their bidding.

DesertYote
May 15, 2012 9:36 pm

So they looked for commonality between two models written by the same group?

JPG
May 15, 2012 9:38 pm

But… if the models are wrong…

Geoff Sherrington
May 15, 2012 9:44 pm

There is much overlap between this work and the prediction of national and regional economic changes. If these guys are smart enough to derive fine detail from coarse models, it should be a cinch for them to work in economics and become very wealthy very quickly.
I remain totally unconvinced, if for no other reason than that many past temperature databases are so corrupt that one cannot have confidence in the future. As a test, would any of these authors like to wager large personal sums of $ on the predictions?
I thought not.

Alan Clark of Dirty Oil-berta
May 15, 2012 9:52 pm

“That Canadian region in the northeast part of the continent is likely to experience the biggest change over the winter months, with temperatures estimated to rise an average of about 6 degrees Celsius”…
Six degrees over 55 years? Big deal. The temperature rose from 7°C this morning to 26°C this afternoon.
Mine’s bigger.

G. E. Pease
May 15, 2012 9:58 pm

“All data sources and models are more or less certain, Cressie explained, and it is the quantification of these certainties that are the building blocks of a Bayesian analysis.” ?!
From
http://mathworld.wolfram.com/BayesianAnalysis.html
“Bayesian analysis is somewhat controversial because the validity of the result depends on how valid the prior distribution is, and this cannot be assessed statistically.”
Clearly, an invalid a priori assumption of perfectly certain data sources and models always leads to invalidly “certain” Bayesian results. In other words, GIGO.
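
A toy example of that point (all numbers invented): feed a Bayesian update weak data under two different priors and it will confidently return two different “answers”.

```python
import numpy as np

# Two different priors, same weak data: the posterior mostly echoes the
# prior. All numbers are invented to illustrate the GIGO point above.
obs = np.array([2.7, 2.3])   # the "data"
obs_sd = 3.0                 # very noisy observations

def posterior(prior_mean, prior_sd):
    prec = 1.0 / prior_sd**2 + obs.size / obs_sd**2
    mean = (prior_mean / prior_sd**2 + obs.sum() / obs_sd**2) / prec
    return mean, np.sqrt(1.0 / prec)

for pm in (0.0, 5.0):        # two analysts, two confident priors
    m, s = posterior(pm, prior_sd=0.5)
    print(f"prior mean {pm:+.1f} -> posterior {m:.2f} +/- {s:.2f}")
```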

curious george
May 15, 2012 10:22 pm

These guys never apply their certainty models to weather forecasts.

Werner Brozek
May 15, 2012 10:36 pm

Unless I am missing something, it appears that they assume the feedbacks due to a doubling of CO2 are positive and not negative. However, if the last 10 to 15 years are any indication, and if Dr. Spencer’s views on negative feedback are true, then it seems they made a wrong assumption right off the bat, and everything else they may say would be wrong.

davidmhoffer
May 15, 2012 10:39 pm

Let’s take some models that have been shown to have no predictive skill whatsoever, combine them, and make still more predictions. Yeah, that ought to get a Nobel prize…
I used to make snarky, sarcastic remarks about shoddy science, but the drivel of late has descended to a level of absurdity such that mocking it is pointless.

May 15, 2012 10:40 pm

Though the models produced a wide variety of climate variables, the researchers focused on temperatures during a 100-year period: first, the climate models’ temperature values from 1971 to 2000, and then the climate models’ temperature values projected for 2041 to 2070.
As Bill Tuttle remarks above, these people have no clue about what sceptics actually take issue with.
But as the quote shows, they cherry-pick a period when temperatures went up and CO2 went up to make their case. Why don’t they look at the ENTIRE time period, say 1950 to today, instead of cherry-picking the span when CO2 levels and temperatures both rose?
The answer, of course, which most of us sceptics realize, is that this shortened span is the only period from 1950 to the present that fits the meme that CO2 causes warming; everything outside it shows either cooling or stagnant temperatures. Instead of showing the truth and the whole picture, they remain fixated on demonstrating a pre-determined outcome. That is what sceptics take issue with first and foremost.
Of course, other problems with the models (GCMs, if you will) come to mind, including the assumed values for positive feedbacks on CO2 (and other greenhouse gases). They patch these in the GCMs by assuming that, magically, other anthropogenic influences were present whenever temperatures did not cooperate with the pre-determined conclusion. And it all comes back to the fact that, to this day, no one understands exactly why clouds change. So the models are all tuned to this very same 1971-2000 period as a “gold standard”, because it MUST be the period in which the warming was caused by CO2.
It’s such a large logical fallacy that I hardly know how to convey how absurd it is. No sceptic has a problem with computer models off the bat, or with models disagreeing with one another; the problems stem from one common source: a predetermined outcome always surfaces, and whatever study you find, it will compare the changes seen in this period with a future that we are told will warm the same way.
Why is this 30-year period so special? Why is 30 years the climate gold standard when we see a sine wave of temperatures on a 60-year cycle?
It boggles the mind, because the truth is easy to determine. Sure, most sceptics agree that we warmed over the 20th century; the effect of CO2 on that warming is the only point of contention. And yet no study they put out ever goes to the heart of this issue.
So yes, yet another worthless study, and another scientist who completely misunderstands the sceptical argument from the get-go and refuses to leave his or her echo chamber to see the forest for the trees.

otter17
May 15, 2012 10:54 pm

Go Bucks!

sophocles
May 15, 2012 10:56 pm

Did they include the sun and add it to the consensus?
No? Oops.

Neil Jones
May 15, 2012 11:13 pm

“consensus climate estimate” A consensus of guesses, er, models. Wow!

Toto
May 15, 2012 11:16 pm

These regional climate model runs are based on some scenarios for CO2 levels in the future. My question is somewhat different: How many (or which) regional weather forecasting models base their outputs on current CO2 levels?

Mike McMillan
May 15, 2012 11:19 pm

So they can predict with statistical certainty how computer climate models will predict the climate. When this technique is perfected, it might be more usefully applied to video games. Imagine knowing how Final Fantasy XII or Super Mario Bros will turn out before you begin.
“All data sources and models are more or less certain, Cressie explained…”
Are we more certain of HadCRUT3 than HadCRUT2, or 4, or USHCN more than USHCN v2?
Climate models are more sensitive to initial conditions than to CO2, and more sensitive to CO2 than reality. Perfect application of fuzzy logic to fuzzy thinking.

Phillip Bratby
May 15, 2012 11:22 pm

There’s only one word to describe this: GIGO

Lew Skannen
May 15, 2012 11:23 pm

“One of the criticisms from climate-change skeptics is that different climate models give different results, so they argue that they don’t know what to believe,”
Solution: Garbage In – Weighted Average Garbage Out.

Ally E.
May 15, 2012 11:31 pm

They need the people to agree with them. All the sheep for slaughter have to walk willingly into the pen. But the sheep are hesitating at the gate, so more scare tactics are needed to herd them through. That’s not working, so it’s back to rational and consensus. If the people aren’t scared enough to accept the need for global control, then, then, then, why it might just all fall apart!

Goldie
May 15, 2012 11:38 pm

So we take a bunch of models that can’t replicate current global temperature trends and combine them. Well, if some were above the actual trend and some below, we might end up with something in the middle. Sadly, all of these models over-predict temperature trends, so I can have pretty high confidence that this will be wrong too.
