From an Ohio State University press release, where they see a lot of red and little else, comes yet another warm-certainty model:
STATISTICAL ANALYSIS PROJECTS FUTURE TEMPERATURES IN NORTH AMERICA
![warming_figure2[1]](http://wattsupwiththat.files.wordpress.com/2012/05/warming_figure21.jpg?resize=640%2C470&quality=83)
Researchers performed an advanced statistical analysis on two different North American regional climate models and were able to estimate projections of temperature changes for the years 2041 to 2070, as well as the certainty of those projections.
The analysis, developed by statisticians at Ohio State University, examines groups of regional climate models, finds the commonalities between them, and determines how much weight each individual climate projection should get in a consensus climate estimate.
Through maps on the statisticians’ website, people can see how their own region’s temperature will likely change by 2070 – overall, and for individual seasons of the year.
Given the complexity and variety of climate models produced by different research groups around the world, there is a need for a tool that can analyze groups of them together, explained Noel Cressie, professor of statistics and director of Ohio State’s Program in Spatial Statistics and Environmental Statistics.
Cressie and former graduate student Emily Kang, now at the University of Cincinnati, present the statistical analysis in a paper published in the International Journal of Applied Earth Observation and Geoinformation.
“One of the criticisms from climate-change skeptics is that different climate models give different results, so they argue that they don’t know what to believe,” he said. “We wanted to develop a way to determine the likelihood of different outcomes, and combine them into a consensus climate projection. We show that there are shared conclusions upon which scientists can agree with some certainty, and we are able to statistically quantify that certainty.”
For their initial analysis, Cressie and Kang chose to combine two regional climate models developed for the North American Regional Climate Change Assessment Program (NARCCAP). Though the models produce a wide variety of climate variables, the researchers focused on temperatures during a 100-year period: first the climate models’ temperature values from 1971 to 2000, and then their projected values for 2041 to 2070. The data were broken down into blocks 50 kilometers (about 30 miles) on a side across North America.
Averaging the results over those individual blocks, Cressie and Kang’s statistical analysis estimated that average land temperatures across North America will rise around 2.5 degrees Celsius (4.5 degrees Fahrenheit) by 2070. That result is in agreement with the findings of the United Nations Intergovernmental Panel on Climate Change, which suggest that under the same emissions scenario as used by NARCCAP, global average temperatures will rise 2.4 degrees Celsius (4.3 degrees Fahrenheit) by 2070. Cressie and Kang’s analysis is for North America – and not only estimates average land temperature rise, but regional temperature rise for all four seasons of the year.
Cressie cautioned that this first study is based on a combination of a small number of models. Nevertheless, he continued, the statistical computations are scalable to a larger number of models. The study shows that climate models can indeed be combined to achieve consensus, and the certainty of that consensus can be quantified.
The statistical analysis could be used to combine climate models from any region in the world, though, he added, it would require an expert spatial statistician to modify the analysis for other settings.
The key is a special combination of statistical analysis methods that Cressie pioneered, which use spatial statistical models in what researchers call Bayesian hierarchical statistical analyses.
The latter techniques come from Bayesian statistics, which allows researchers to quantify the certainty associated with any particular model outcome. All data sources and models are more or less certain, Cressie explained, and it is the quantification of these certainties that forms the building blocks of a Bayesian analysis.
In the case of the two North American regional climate models, his Bayesian analysis technique was able to give a range of possible temperature changes that includes the true temperature change with 95 percent probability.
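To see in miniature what “quantifying the certainty” of a combined estimate means, here is a minimal sketch. It is emphatically not the paper’s method (which is a full spatial Bayesian hierarchical model over thousands of grid blocks); it just combines two hypothetical projections for a single block, with invented error levels, using inverse-variance weighting under a flat prior and reports an approximate 95% interval.

```python
import numpy as np

# Toy illustration only: two hypothetical model projections of warming (deg C)
# for one grid block, each with an assumed error standard deviation. These
# numbers are invented; the paper's actual analysis is a spatial Bayesian
# hierarchical model, not this simple conjugate-Gaussian combination.
projections = np.array([2.2, 2.9])   # model A, model B (hypothetical)
sigmas      = np.array([0.6, 0.4])   # assumed per-model uncertainties

# Precision-weighted (inverse-variance) consensus under a flat prior:
# the model with the smaller assumed error gets more weight.
weights = 1.0 / sigmas**2
consensus_mean = np.sum(weights * projections) / np.sum(weights)
consensus_sd   = np.sqrt(1.0 / np.sum(weights))

# Approximate 95% interval for the consensus estimate.
low, high = consensus_mean - 1.96 * consensus_sd, consensus_mean + 1.96 * consensus_sd
print(f"consensus: {consensus_mean:.2f} C  (95% interval {low:.2f} to {high:.2f} C)")
```

The only point of the toy example is that the model with the smaller assumed uncertainty pulls the consensus toward itself and tightens the interval; the real analysis does this jointly over space and seasons.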
After producing average maps for all of North America, the researchers took their analysis a step further and examined temperature changes for the four seasons. On their website, they show those seasonal changes for the Hudson Bay, Great Lakes, Midwest, and Rocky Mountain regions.
In the future, the Hudson Bay region will likely experience larger temperature swings than the others, they found.
That Canadian region in the northeast part of the continent is likely to experience the biggest change over the winter months, with temperatures estimated to rise an average of about 6 degrees Celsius (10.7 degrees Fahrenheit) – possibly because melting ice reflects less energy away from the Earth’s surface. Hudson Bay summers, on the other hand, are estimated to experience an increase of only about 1.2 degrees Celsius (2.1 degrees Fahrenheit).
According to the researchers’ statistical analysis, the Midwest and Great Lakes regions will experience a rise in temperature of about 2.8 degrees Celsius (5 degrees Fahrenheit), regardless of season. The Rocky Mountains region shows greater projected increases in the summer (about 3.5 degrees Celsius, or 6.3 degrees Fahrenheit) than in the winter (about 2.3 degrees Celsius, or 4.1 degrees Fahrenheit).
In the future, the researchers could consider other climate variables in their analysis, such as precipitation.
This research was supported by NASA’s Earth Science Technology Office. The North American Regional Climate Change Assessment Program is funded by the National Science Foundation, the U.S. Department of Energy, the National Oceanic and Atmospheric Administration, and the U.S. Environmental Protection Agency Office of Research and Development.
###
Oh. A consensus of models. Yawn.
Climate Science for beginners – Alarmism: models predict that by mixing different types of garbage you will make gold, even if sceptics see a larger pile of rubbish. /sarc
“The study shows that climate models can indeed be combined to achieve consensus, and the certainty of that consensus can be quantified.”
*bangs head slowly and repeatedly on table*
Lots of red there, obviously we are going to fry
Steve
That was my conclusion. If they had reset their models to 1980 and were able to accurately predict the US weather from 2000 to 2011, then I’d be impressed and think they might have something. Until they do that, it is GIGO and a failed exercise.
Bill
“Given the complexity and variety of climate models produced by different research groups around the world, there is a need for a tool that can analyze groups of them together,”
Most of the models are based on the same root program (flat Earth, no nighttime, clouds either absent or designed to warm, ocean currents only sort of, and 50-plus other missing major factors), share the same fallacious, flawed assumptions, and use the same adjusted input.
Now we have garbologists studying the output of garbologists running garbology programs. What we get is confirmation that it is all garbage, and they put a smelly stamp on it.
Wonderful.
Hats off to my fellow academic Buckeyes at Ohio State, and special hello to the staff at the Lantern (“How firm thy friendship!”). But a model is still only as good as its input. I guess what Noel Cressie is saying is: “This is the super-duper new and improved model! It’s not like those old, unreliable models that drove the IPCC’s decision-making process and influenced the entire AGW movement!” Sheesh. O-H!
“One of the criticisms from climate-change skeptics is that different climate models give different results, so they argue that they don’t know what to believe,”
Classic projection. As a skeptic, I’m not looking for anything to “believe” in. But as warmist modelers, that’s what it’s all about. Statistical analysis of confirmation-biased models, and models of models don’t provide any facts of reality, only fodder for a horribly biased world view and belief system. Get back to us when you have collected and analyzed actual observational data; as far as I’m concerned models are nothing more than a curiosity. Predicting the future with climate models is slightly easier than changing the past.
I’m so sick and tired of being told we have to change or we will wither and die. We will all die soon enough anyway (I don’t plan on living much past 100), so I would actually enjoy a little warmth before that day comes.
“One of the criticisms from climate-change skeptics is that different climate models give different results,…”
Yeah, but all of them ensure that there is warming – that is all you have discovered, and we knew that. Even they knew that. BTW, you are probably safe from the scalpel of Steve McIntyre because the work is so insignificant.
I’ve seen a pattern lately. More and more papers based on models. When you think about it this is quite clever. When the models fail, as they most certainly will, these guys just point their fingers at the modelers and say “not my fault”. They are still able to ride the gravy train while having limited personal risk.
Of course, if you are a modeler you should start feeling a little uneasy. Guess who’s going down hard.
Oops, maybe I shouldn’t have used the word scalpel with all this fear among the warm society academics. It was only a metaphor, I didn’t mean……
Like many other past climate models, the reality of the very recent climate does not seem to support the future model projections. Canadian national annual temperature departures from 1961-1990 averages since 1998 (the last 14 years) show wide yearly fluctuations, but the linear trend is completely flat. So are the summer and fall temperature departures: fluctuating, but the linear trend is flat. Spring departures have gone negative or cooling, and winter departures show some rise based mostly on the last few winters. Regionally, of the 11 climate regions in Canada, only two areas (mostly in the high Arctic) show a rise in temperature departures, namely the Arctic Tundra and the Mountains and Fiords. All other regions show declining or flat annual temperature departures. So someone will have to turn up the heat considerably to get the additional temperature rise being projected by the model. With the quiet sun being projected for the next few solar cycles and ocean cycles showing some cooling, there are likely going to be fewer climate-changing strong El Niños, so I anticipate no major warming for the next 2-3 decades to support the model predictions.
It’s turtles all the way down.
Such math/statistical effusions are equivalent to dividing unit-1 by zero, meaning that whether 1/0 = 1 or 1/0 = 0, unit-1 is equal to zero – a contradiction meaningless on its face. Among others, Bertrand Russell made appropriately pejorative comments on that score.
Temperatures in NE USA have been falling for decades. The projection of rising temperatures makes no sense. The authors are trying to make a linear projection using cyclic data. One might as well record temperatures from midnight to noon, and then use this to project temperatures at midnight. Your projection will show projected midnight temperature to be much higher than the night before.
I would like to second Mario Lento’s proposition for the adoption of the phrase “guess laundering” – average a bunch of models and present the outcome as fact.
ferd berple says:
May 16, 2012 at 7:56 am
Temperatures in NE USA have been falling for decades. The projection of rising temperatures makes no sense….
_____________________________
Well, the temperatures are not exactly increasing in the SE either. It is 73°F today in mid North Carolina. 2004 is about two years after the cycle 23 max, when the influence of the sun should be showing strongly. By July tenth I counted 43 days over ninety °F for 2004 vs. 26 days for 2010, and four days of 98°F in 2010 vs. nine days of 98°F in 2004.
So far this May we have had ONE day over ninety (91°F) vs. seventeen days over ninety in 2004, three of those days 95°F or more. At this point the five-day forecast is for highs in the seventies. Heck, we have only had five days above eighty so far, and it doesn’t look like it will be much warmer than the eighties for the rest of the month.
Here is the closest GISS station and you can see the down trend:
http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=425746930020&data_set=1&num_neighbors=1
Calling Matt Briggs!
Toto says:
May 15, 2012 at 11:16 pm
How many (or which) regional weather forecasting models base their outputs on current CO2 levels?
Correct me if I am wrong, but did the Met Office not do this for several years when they predicted barbecue summers and the like? Then, after several years of embarrassments, they stopped doing it.
“One of the criticisms from climate-change skeptics is that different climate models can’t predict the past correctly, so they argue that they don’t know what to believe of the future,” he said.
Seattle numerical weather forecaster Cliff Mass complains that the climate guys get all the big computers:
http://cliffmass.blogspot.ca/2012/05/us-climate-versus-weather-computers.html
“There is a vast overkill in pushing computer resources for climate prediction, while weather prediction is a very poor cousin.”
“Furthermore, there is no better way to improve climate models than to improve weather models, since essentially they are the same. You learn about model weaknesses from daily forecast errors.”
SSDD and/or GIGO.
Take your pick.
Makes as much sense as polling 2 republican conventions and concluding that Mitt Romney will win in a landslide.
bernie1815 says:
May 16, 2012 at 8:44 am (Edit)
Calling Matt Briggs!
##########
I’ll bet that Matt is a fan of Cressie. Cressie wrote the book (three, actually) and well over 200 refereed articles. Everybody who knows R and who works in spatio-temporal stats knows him.
Folks would do well to understand what Cressie did before having knee-jerk reactions.
Up until now the methods for combining climate model predictions have been pretty crude: averaging. However, if you look at the information from the various hindcasts (and how they are wrong in some ways and right in others), that information can be used to get a better ensemble forecast. A tighter forecast is good, for both confirmation and disconfirmation.
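As a rough sketch of that idea, and nothing more, here is one way hindcast skill could be turned into ensemble weights instead of taking a straight average. This is not Cressie and Kang’s actual method, and every number in it is invented for illustration.

```python
import numpy as np

# Hypothetical sketch: weight each model by its hindcast skill instead of
# straight averaging. All numbers are invented; this is not the paper's method.
observed_hindcast = np.array([0.1, 0.3, 0.2, 0.5])        # "observed" anomalies
model_hindcasts   = np.array([[0.0, 0.4, 0.1, 0.6],       # model A
                              [0.3, 0.6, 0.5, 0.9]])      # model B
model_projections = np.array([2.4, 3.1])                  # future warming (deg C)

# Skill score: inverse mean-squared hindcast error for each model,
# normalized so the weights sum to one.
mse     = np.mean((model_hindcasts - observed_hindcast)**2, axis=1)
weights = (1.0 / mse) / np.sum(1.0 / mse)

# Skill-weighted ensemble projection vs. the crude equal-weight average.
print("weights:", np.round(weights, 2))
print("weighted projection:", np.round(np.sum(weights * model_projections), 2))
print("plain average:      ", np.round(model_projections.mean(), 2))
```

The design point is simply that a model with a poor hindcast record contributes less to the combined projection than it would under plain averaging.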
The assumptions driving this model-based statistical analysis are, as usual, profoundly incorrect. They assume increased CO2 emissions, particularly the small fraction that is human-caused, will warm planetary temperatures to a level that is discernable or measurable. There simply is no hard evidence to support such a theory (which is exactly what it is).
As Dr. William M. Gray, CSU’s professor emeritus of atmospheric physics, observes:
“It is impossible for us skeptics to believe that the doubling of CO2 which causes a global average infrared (IR) radiation blockage to space ~3.7 Wm-2 for doubling of CO2 can be very much of a climate altering feature. Especially when we contrast this 3.7 Wm-2 IR blockage (from a doubling of CO2) with the much larger and continuous 342 Wm-2 average short-wave radiation impinging on the earth and the near balancing concomitant 342 Wm-2 net long-wave and solar (albedo) energy going back to space.
“The global climate will be little affected by this small amount of 3.7 Wm-2 IR energy blockage to space due to a doubling of CO2. It is this lack of scientific believability and the large economic and social disruptions which would result if the industrial world were to switch to renewable energy that motivates us skeptics to rebel against such obvious exaggerated claims for CO2 increase.”
It is impossible for us skeptics to believe that the doubling of CO2 which causes a global average infrared (IR) radiation blockage to space ~3.7 Wm-2 for doubling of CO2 can be very much of a climate altering feature.
#######################
1. The sun’s input to the climate system is around 1361 Watts per square meter.
2. Small changes in that (~1 Watt) at solar minimum, looking at the LIA, would seem to have an effect, right?
Changing the forcing (Watts) will have an effect on the temperature. If the sun went to zero, it would get cold. If the sun’s output increased, it would have an effect.
A doubling of CO2 leads to additional Watts. That is beyond question: tested, observed. The theory is correct; we use that theory to build things that work.
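For anyone who wants to check the arithmetic both sides are citing, a quick back-of-the-envelope using the commonly quoted simplified forcing expression ΔF = 5.35 ln(C/C0) W/m² (Myhre et al. 1998) reproduces the ~3.7 W/m² figure for a doubling and compares it with the 342 W/m² mentioned in Gray’s quote. This is only the number under dispute; it says nothing either way about feedbacks.

```python
import math

# Back-of-the-envelope check of the numbers being argued over, using the
# commonly cited simplified forcing expression dF = 5.35 * ln(C/C0) W/m^2
# (Myhre et al. 1998). An illustration only, not a climate model.
def co2_forcing(c_ratio):
    """Radiative forcing (W/m^2) for a CO2 concentration ratio C/C0."""
    return 5.35 * math.log(c_ratio)

doubling_forcing = co2_forcing(2.0)   # ~3.7 W/m^2, the figure Gray cites
incoming_solar   = 342.0              # global-average shortwave, W/m^2 (from the quote)

print(f"forcing for 2x CO2: {doubling_forcing:.2f} W/m^2")
print(f"as a fraction of 342 W/m^2: {doubling_forcing / incoming_solar:.1%}")
```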
These researchers need to try some pink sunglasses.