From Northeastern University via Eurekalert, and the department of modeling for 10 million dollars, this seems to be all they could come up with. Nature has a way, however, of taking the best-laid plans and rendering them moot. I don't think they've noted 'the pause' yet. There's no paper listed, no data references, nothing, making it one of the worst press releases I've seen in a while. The press release upstream at the university is hardly any better, citing the 97% consensus as if it had anything to do with extremes modeling, but at least they gave a link to the paper where Eurekalert didn't.
Big data confirms climate extremes are here to stay
In a paper published online today in the journal Scientific Reports, published by Nature, Northeastern researchers Evan Kodra and Auroop Ganguly found that while global temperature is indeed increasing, so too is the variability in temperature extremes. For instance, while each year's average hottest and coldest temperatures will likely rise, those averages will also tend to fall within a wider range of potential high and low temperature extremes than are currently being observed. This means that even as overall temperatures rise, we may still continue to experience extreme cold snaps, said Kodra.
“Just because you have a year that’s colder than the usual over the last decade isn’t a rejection of the global warming hypothesis,” Kodra explained.
With funding from a $10-million multi-university Expeditions in Computing grant, the duo used computational tools from big data science for the first time in order to extract nuanced insights about climate extremes.
The research also opens new areas of interest for future work, both in climate and data science. It suggests that the natural processes that drive weather anomalies today could continue to do so in a warming future. For instance, the team speculates that ice melt in hotter years may cause colder subsequent winters, but these hypotheses can only be confirmed in physics-based studies.
The study used simulations from the most recent climate models developed by groups around the world for the Intergovernmental Panel on Climate Change and “reanalysis data sets,” which are generated by blending the best available weather observations with numerical weather models. The team combined a suite of methods in a relatively new way to characterize extremes and explain how their variability is influenced by things like the seasons, geographical region, and the land-sea interface. The analysis of multiple climate model runs and reanalysis data sets was necessary to account for uncertainties in the physics and model imperfections.
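To make the "multiple climate model runs" step concrete, here is a minimal sketch of computing one extremes statistic across an ensemble and reporting its across-model spread. The numbers are made up (a random stand-in for 14 model runs of 50 seasonal maxima each), and this is only an illustration of the general idea, not the authors' actual method or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a multi-model ensemble: 14 "model runs",
# each with 50 years of seasonal maximum temperatures (degrees C).
n_models, n_years = 14, 50
ensemble = rng.normal(loc=35.0, scale=2.0, size=(n_models, n_years))

# One statistic per run: the 95th percentile of seasonal maxima.
p95_per_model = np.percentile(ensemble, 95, axis=1)

# The spread across models is one crude measure of model uncertainty
# in the projected extreme.
print(f"ensemble mean of P95: {p95_per_model.mean():.2f} C")
print(f"across-model spread (std): {p95_per_model.std():.2f} C")
```

The point of using many runs is that any single model's tail estimate is noisy; the across-model spread is what lets you say anything about how (un)certain the projection is.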
The new results provide important scientific as well as societal implications, Ganguly noted. For one thing, knowing that models project a wider range of extreme temperature behavior will allow sectors like agriculture, public health, and insurance planning to better prepare for the future. For example, Kodra said, “an agriculture insurance company wants to know next year what is the coldest snap we could see and hedge against that. So, if the range gets wider they have a broader array of policies to consider.”
The paper:
http://www.nature.com/srep/2014/140730/srep05884/full/srep05884.html
Asymmetry of projected increases in extreme temperature distributions
Evan Kodra & Auroop R. Ganguly
A statistical analysis reveals projections of consistently larger increases in the highest percentiles of summer and winter temperature maxima and minima versus the respective lowest percentiles, resulting in a wider range of temperature extremes in the future. These asymmetric changes in tail distributions of temperature appear robust when explored through 14 CMIP5 climate models and three reanalysis datasets. Asymmetry of projected increases in temperature extremes generalizes widely. Magnitude of the projected asymmetry depends significantly on region, season, land-ocean contrast, and climate model variability as well as whether the extremes of consideration are seasonal minima or maxima events. An assessment of potential physical mechanisms provides support for asymmetric tail increases and hence wider temperature extremes ranges, especially for northern winter extremes. These results offer statistically grounded perspectives on projected changes in the IPCC-recommended extremes indices relevant for impacts and adaptation studies.
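The asymmetry the abstract describes (the highest percentiles rising more than the lowest, so the overall range of extremes widens) can be sketched with synthetic numbers. This is a toy illustration, assuming only that a "projected" distribution is both warmer and more variable than a "historical" one; it is not the paper's analysis or data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for historical vs. projected winter minima (deg C).
# The projection is warmer on average AND more variable, so the warm
# tail rises more than the cold tail -- the asymmetry described above.
historical = rng.normal(loc=-10.0, scale=3.0, size=2000)
projected = rng.normal(loc=-8.0, scale=4.0, size=2000)

lo_hist, hi_hist = np.percentile(historical, [5, 95])
lo_proj, hi_proj = np.percentile(projected, [5, 95])

delta_low = lo_proj - lo_hist    # shift of the cold (5th percentile) tail
delta_high = hi_proj - hi_hist   # shift of the warm (95th percentile) tail

# Positive asymmetry => the warm tail moved more, widening the range
# of extremes even though both tails warmed.
asymmetry = delta_high - delta_low
print(f"cold-tail shift: {delta_low:+.2f} C, warm-tail shift: {delta_high:+.2f} C")
print(f"asymmetry: {asymmetry:+.2f} C")
```

With these stand-in distributions both tails shift warmer, but the warm tail shifts more, so the 5th-to-95th percentile range widens; that widening is the "asymmetric change in tail distributions" the abstract refers to.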
Figure S1 from the paper (image not reproduced)

“It suggests that the natural processes that drive weather anomalies today could continue to do so in a warming future.”
"It" in this case being the product of a 10 million dollar expenditure! "It" suggests that processes that have been around for billions of years will continue into the future. Give me 10 million dollars and I'll build a fancy-schmancy climate data cruncher that will strongly suggest that the processes that affect weather today, and have since time immemorial, will continue to affect weather in a [payer's choice: warming/cooling/stagnant] world.
I was looking forward to reading the paper to see their analysis until I read that they analyzed model output rather than real data. Never mind.
I wonder if the authors are trying to lay the foundation, or if someone else will take it as one, for the next round of scare mongering.
“Well, sure the average temperature isn’t going up but that is not the problem. All that CO2, or something, is leading to higher highs and lower lows, and that will result in catastrophic losses in agriculture and public health. We have to reduce our CO2 emissions, or something, to keep them in a more normal range (whatever that is).”
"… There are no nuances in this kind of data. It's like trying to extract the names of respondents from divorce statistics. …"
Honky Cat says:
It’s like trying to find gold in a silver mine
It’s like trying to drink whisky from a bottle of wine
Latitude says:
July 30, 2014 at 9:25 am
pay no attention to the thermometer….your insurance rates are going up
—————————
Bingo. Stampede your competition into overestimating the risk. Everybody in the industry wins.
Predicting the future from the intestines of small avian creatures is cheaper and more accurate.
After all, a fat bird in the fall indicates the bird was ready for winter.
$10 million to produce this garbage?
Climatology has not improved since Roman times.
At first I thought that degrees of freedom, error bars, and tests of significance are so old school. Now I realize this is not post-normal science but pre-normal regression back to the halcyon days of restorative snake oil elixirs and grandpa’s recipe for what ails ya.
No need for unbiased, cold investigation. All you have to do is find yourself a fancy media wagon and start barkin out the back end of it. Crowds will gather and buy all you have to sell. But there is one essential difference. What tickles my funny bone is that the sheeple gathering round the wagon all have diplomas and letters after their names.
Did John Kerry commission this in between foreign trips and sail boat escapades?
Are you kidding me!!!?? Does anyone actually do measurements anymore or work with real data? Everyone agrees that the world is warming so we’d expect asymmetry in the extremes. Do you need to crank “big data” to show this?
Can you get a graduate degree in Civil Engineering at Northwestern for this stuff? I’ve been contemplating a US road trip. Think I’ll avoid Illinois. Not sure about the bridges.
John in L du B: “Can you get a graduate degree in Civil Engineering at Northwestern for this stuff?”
Northeastern as in Boston, not Northwestern as in Evanston.
This crap is considered ‘science’?
“This crap is considered ‘science’?”
It is “science” if it pays well. It isn’t “science” if it does not.
quoting, “the team speculates that ice melt in hotter years may cause colder subsequent winters.”
The old “warming causes cooling” gag for the weak minded.
Another fine entry for the Climate Agnotology Hall of Infamy. Meh.
They would do well to learn from Google and their flu predictions:
The Parable of Google Flu: Traps in Big Data Analysis
In February 2013, Google Flu Trends (GFT) made headlines but not for a reason that Google executives or the creators of the flu tracking system would have hoped. Nature reported that GFT was predicting more than double the proportion of doctor visits for influenza-like illness (ILI) than the Centers for Disease Control and Prevention (CDC), which bases its estimates on surveillance reports from laboratories across the United States. This happened despite the fact that GFT was built to predict CDC reports. Given that GFT is often held up as an exemplary use of big data, what lessons can we draw from this error?
http://www.sciencemag.org/content/343/6176/1203.summary
Full-text PDF for the above: http://gking.harvard.edu/files/gking/files/0314policyforumff.pdf
Well, speaking as an IT and database professional, I have to make this point: the biggest fiction about big data is that it's a new concept. It's actually a new name for an old concept: statistical modelling.
quoting, “knowing that models project a wider range of extreme temperature behavior will allow sectors like agriculture, public health, and insurance planning to better prepare for the future. ”
those sectors would be better off with an Ouija Board than using that Big Data garbage generator.
“Heap big data”? So now Liz Warren is on the climate bandwagon?
It turns out the $10 million covers a lot more than this analysis. The full grant is described at http://www.nsf.gov/awardsearch/showAward?AWD_ID=1029711 which says in part:
I don’t know, it sounds like they’re finding greater uncertainty….
The grant lists 32 papers produced under it (it started in 2010). None seem very supportive of AGW, a lot seem rather neutral, e.g. Lack of uniform trends but increasing spatial variability in observed Indian rainfall extremes, NATURE CLIMATE CHANGE, v.2, 2012, p. 86-91. [I picked this because it seems to fit the study at hand.]
A graph-based approach to find teleconnections in climate data, Statistical Analysis and Data Mining, v.6, 2013, p. 158.
Intensity, duration, and frequency of precipitation extremes under 21st-century warming scenarios, Journal of Geophysical Research: Atmospheres, v.116, 2011, p. 14. [There's some confirmation bias; they should look at cooling conditions too.]
Sensitive and Specific Identification of Protein Complexes in “Perturbed” Protein Interaction Networks from Noisy Pull-Down Data, IPDPS, HiCOMB Workshop, 2011. [What!?]
The paper is strictly based on model outputs.
No use of any actual climate measurements whatsoever.
There is so much money being wasted in this climate science industry, a complete dead-weight loss to the economy and government coffers.
It needs to end now. We do not need one more single run of any climate model for any purpose whatsoever. It has all been done before umpteen times. We just need to measure what is really happening. Fund somebody else besides climate modellers to carry out this work.
I love how quickly this blog can deconstruct a paper ! All my critical thoughts have already been posted by others !
Warmists either don't recognize, or choose not to acknowledge, the intellect of skeptics, and don't understand how easy it is to tear down almost all of their arguments.
LOL, is about all I can muster anymore. I am in the wrong line of work (recycling).
The paper is basically a math problem.
Outrageous claims of climate extremes + Dumb people = Political power and wealth
“Big data confirms climate extremes are here to stay”.
Put a coin in the juke box. Cue the 45: scratch, scratch, scratch…
Climate change is here to stay, it will never die
It was meant to be that way, though I don’t know why
I don’t care what people say, climate change is here to stay
We don’t care what people say, climate change is here to stay
Climate change will always be our ticket to the end
It’ll go down in history, just you wait, my friend
Climate change will always be, it’ll go down in history
Climate change will always be, it’ll go down in history