Uncertain about Uncertainty

Guest Post by Willis Eschenbach

I was reading a study just published in Science magazine, pay-walled of course. It’s called “The Pace of Shifting Climate in Marine and Terrestrial Ecosystems”, by Burrows et al. (hereinafter B2011). However, I believe that the Supplementary Online Information (SOI) may not be paywalled, and it is here. The paper itself has all kinds of impressive-looking graphs and displays, like this one:

Figure 1. Temperature change 1960-2009, from B2011. Blue and red lines on the left show the warming by latitude for the ocean (blue) and the land (red).

I was interested in their error bars on this graph. They were using a 1° x 1° grid size, and given the scarcity of observations in many parts of the world, I wondered how they dealt with the uneven spacing of the ground stations, the lack of data, “infilling”, and other problems with the data itself. I finally found the details regarding how they dealt with uncertainty in their SOI. I was astounded by their error estimation procedure, which was unlike any I’d ever seen.

Here’s what the B2011 SOI says about uncertainty and error bars (emphasis mine):

We do not reflect uncertainty for our estimates or attempt statistical tests because …

Say what? No error bars? No statistical tests? Why not?

The SOI continues with their reason for the missing error bars. It is because:

… all of our input data include some degree of model-based interpolation. Here we seek only to describe broad regional patterns; more detailed modeling will be required to reflect inherent uncertainty in specific smaller-scale predictions.

So … using model based interpolation somehow buys you a climate indulgence releasing you from needing to calculate error estimates for your work? If you use a model you can just blow off all “statistical tests”? When did that change happen? And more to the point, why didn’t I get the memo?

Also, they’ve modeled the global temperature on a 1° x 1° grid, but they say they need “more detailed modeling” … which brings up two interesting questions. First question is, what will a 0.5° x 0.5° (“more detailed”) model tell us that a 1° x 1° model doesn’t? I don’t get that at all.

The second question is more interesting, viz:

They say they can’t give error estimates now because they are using modeled results … and their proposed cure for this problem is “more detailed modeling”???

I’d rave about this, but it’s a peaceful morning, the sun is out after yesterday’s storm, and my blood is running cool, so let me just say that this is a shabby, childish example of modern climate “science” (and “peer-review”) at its worst. Why does using a model somehow mean you can’t make error estimates or conduct statistical tests?
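For what it’s worth, there is nothing about model-interpolated data that prevents error estimation; a textbook approach is to bootstrap whatever statistic you computed. Here is a minimal sketch using synthetic data (NOT the B2011 dataset — the trend and noise levels are made up purely for illustration): fit a linear trend, then resample the residuals to get a confidence interval on the slope.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 50-year annual series (not real data): a 0.01 C/yr
# trend plus weather-like noise
years = np.arange(1960, 2010)
temps = 0.01 * (years - 1960) + rng.normal(0.0, 0.15, years.size)

coef = np.polyfit(years, temps, 1)      # OLS fit: [slope, intercept]
slope = coef[0]
fitted = np.polyval(coef, years)
resid = temps - fitted

# Residual bootstrap: resample the residuals, refit, collect slopes
boot = np.array([np.polyfit(years,
                            fitted + rng.choice(resid, resid.size),
                            1)[0]
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"trend = {slope:.4f} C/yr, 95% CI = [{lo:.4f}, {hi:.4f}]")
```

A few dozen lines like this would have given B2011 error bars on every gridded trend, modeled interpolation or not.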

Sadly, this is all too typical of what passes for climate science these days … and the AGW supporters wonder why their message isn’t getting across?

w.

78 Comments
Baa Humbug
November 6, 2011 7:51 pm

Not only does this sort of junk pass as science, it makes news worldwide.
Following is a sample of the media coverage of this study:
The Atlantic: New Evidence That Climate Change Threatens Marine Biodiversity
ABC Australia: Climate change affecting oceans faster: Study
Times of India: Marine life ‘needs to swim faster to survive climate change’
Sky News Australia: Aussie marine life climate change threat
The Australian: Marine life in climate change hot water
Softpedia: Species will have to move fast to adapt to climate change
Fish Update: Climate shifts could leave some species homeless, new research shows
Deccan Chronicle (India): Marine life ‘needs to swim faster to survive climate change’
FishNewsEU.com: Climate warming poses serious conservation challenge for marine life
Sometimes it feels like trying to swim against a strong current

November 6, 2011 7:52 pm

I’ve reproduced the original graphic showing the appropriate degree of uncertainty.

Baa Humbug
November 6, 2011 7:58 pm

Oh by the way the full paper can be accessed here

RoyFOMR
November 6, 2011 8:26 pm

W,
You are so in the past here. Don’t you understand that the customer is always right? Error bars just confuse the statistics, confound the democratic process and bring into question the motives of our great and good political masters!
Never forget that these savants have an inordinately large amount of Tax$ to spend. If they mess up the odd Trillion or two by innocently sending a few Mill into their accounts, then should we blame them?
At least they tried to save our planet and make a few bucks, unlike you. All that I’ve noted about what you’ve done to date is no more than a few brilliant and heart-warming essays, a realistic hypothesis or two, and a determined effort to take Post-Modernist Science back into the enlightenment!
You’ve clearly no shame. Unlike them.

November 6, 2011 8:30 pm

keith says:
“This reminds me of the NASA GISS 1200km ‘averaging’ – no attempt to account for significance. I remember using their online tool and reducing the distance to around 200kms – suddenly all these gaps appear all over the globe where previously there was massive warming and the global temp increase dropped by a third…”
I have a beef with some of what goes into GISS. Large areas are likely too-often represented by a thermometer at an Arctic location having above-regional-average local surface albedo feedback, or a land thermometer while the area represented includes a lot of ocean area.
There is another decadal temperature trend map of the world, from January 1979 to sometime recently, and it avoids surface station siting issues, growth of urban effects on surface station thermometers, and large regions represented by thermometers in local warming hotspots:
1st global map image (lower troposphere), in:
http://www.remss.com/msu/msu_browse.html
That one omits the “pole holes” within 7.5 degrees of the poles, and other areas on basis of little or no lower troposphere above the surface (elevation above 3 km or within 20 degrees of the South Pole).

Cherry Pick
November 6, 2011 9:06 pm

Their argument against using statistics on computer models is valid. Models contain man-made fudge factors called parameters. Statistical measures like averages, means and confidence intervals of model outputs are arbitrary – they are what the programmers choose them to be.
Another example of this that has amazed me is the concept of the ensemble mean. It’s as if a report of climate modeling were some kind of opinion poll. Ensemble statistics describe how well research groups agree on parameters, and as a consequence on the outputs of the computer runs that they have chosen to publish.

Baa Humbug
November 6, 2011 9:43 pm

Sometimes people ask why I bother. It’s because I can’t just sit quiet while this kind of nonsense is being put forward and then goes round the world three times. Like they say … better to light one small forest fire than to curse the darkness …
Or something like that.

And that’s why I love ya

paulsNZ
November 6, 2011 9:49 pm

AGW belief systems have errors which are relegated to the faith-based error-processing model employed by South Park, and it’s GONE.

John Trigge
November 6, 2011 10:17 pm

Willis – ask them for the error bars and expect the response:
“Why should I give you my error bars when you only want to find fault in it?”

November 6, 2011 11:14 pm

I don’t understand any of this. The modeling methodology they are using should be about the same as I use for a coal deposit or a metallic ore body. The reason one develops such models is the statistical confidence that is part of the spatial (grid cell) calculation (ore bodies are static, so the methods work very well). Other important model results include the maximum distance between data points needed to achieve confidence, and any directionality trends. That is, depending on the element or factor being modeled, the confidence distance will often vary between them, and the cells for each may have different shapes: squares, rectangles, or sometimes triangles and parallelograms.
If you don’t look at all these factors and work out the probability or confidence for each, you can be down the garden path faster than a loaded haul truck heading down a 10% grade. Sounds to me like the authors are on just such a ride. My mining-related advice: jump, boys, before I need to attend your funeral.
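The “confidence distance” idea from ore-body modeling can be sketched in a few lines: interpolate only where a station lies within the confidence distance, and report an honest gap (not an infilled guess) everywhere else. The station coordinates, readings, and 25 km cutoff below are all hypothetical, chosen just to show the mechanism:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical station locations (km) and readings on a 100 km square
stations = rng.uniform(0.0, 100.0, (12, 2))
values = 15.0 + 0.05 * stations[:, 0] + rng.normal(0.0, 0.5, 12)

def idw(point, max_dist=25.0, power=2.0):
    """Inverse-distance-weighted estimate at `point`; returns NaN
    (an honest gap, not a guess) when no station lies within
    max_dist -- the 'confidence distance'."""
    d = np.hypot(*(stations - point).T)
    near = d < max_dist
    if not near.any():
        return np.nan
    w = 1.0 / np.maximum(d[near], 1e-9) ** power
    return float(np.sum(w * values[near]) / np.sum(w))

print(idw(stations[0]))               # on top of a station: real estimate
print(idw(np.array([999.0, 999.0])))  # far outside coverage: nan
```

Mapping where the NaNs fall is exactly the coverage check the commenter describes; a gridded product that never returns NaN has quietly infilled those cells.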

Baa Humbug
November 6, 2011 11:15 pm

Though this paper is more detailed and houses more pretty graphs than that infamous toilet paper about the 4 dead polar bears, it nonetheless rivals that paper in the best a$$ wiper stakes.
What the authors of this paper have done is take the CRU TS3.1 data for land and the Hadley HadISST1.1 data for oceans, covering the period 1960 to 2009.
From the above data they have determined that at various regions of the world, spring arrives X days earlier and autumn arrives X days later. They have also plotted the spatial coverage of these changes and concluded that unless species are able to shift their range quick enough, they will be adversely affected.
But have any of the authors jumped on a boat or squirmed into a diving suit? NO
Have they named a single species, nay, a single specimen, that has shifted range due to these changes? NO
They open the paper with

“Climate warming is a global threat to biodiversity”,

a bullshyte statement taken from a paper O. E. Sala et al., Global biodiversity scenarios for the year 2100. and finish with an equally bullshyte statement

“Maps of the velocity of climate change and seasonal shift show the areas where the threat to biodiversity from organisms’ need to rapidly track thermal conditions by shifting distributions and retiming seasonal thermal events may be greatest; these areas may coincide with high biodiversity, especially in the oceans.”

No wonder Willis gets all hot under the collar

Mark
November 7, 2011 12:53 am

MrX says:
WOW! They believe that the “garbage-in, garbage-out” rule doesn’t apply to them, which also implies that uncertain input will produce uncertain output. KUDOS! They’ve just broken a universal rule of computing (and hence models). They will be renowned worldwide in computer science literature for this feat. NOPE!
Except as an example of how NOT to do things.
I wonder if they are aware of the actual shape of the Earth, or if they have treated the “squares” on a Mercator projection as being real 🙂 since they mention using a “1° x 1° grid”.

Mike Jowsey
November 7, 2011 12:56 am

Quote of the week: “…buys you a climate indulgence …”
Nice work Mr Eschenbach.

E.M.Smith
Editor
November 7, 2011 1:22 am

As the current global data have about 1200 stations (and even at the peak it was less than 8000), the use of any gridding with more cells than that is just pointless. It is just going to hide the ignorance of the data behind an ever larger number of ‘fantasy cells’ filled with an homogenized data food product.
So what they are saying is that it’s awful hard to put an error bar on all those 1 degree grid boxes as they are (all but about 1200 of them) filled with a fantasy value anyway. What is the error bar on a fantasy? Why, even more meaningless than the non-data created to ‘fill’ it…
I don’t know how their cells are constructed, but a 360 circle of 360 degrees of longitude I think gives about 129600 cells. Compare with 1200 actual thermometer values in the present.
Yeah, that’s a lot of fantasy “values”… So again I ask: How do you calculate error bars on those fantasy values?…

Jim Turner
November 7, 2011 3:03 am

If the objective was to generate a pretty picture with lots of orange bits, then they have succeeded.
A point in that direction is that they look to have used a Mercator projection or similar, which is a poor choice where comparative areas are significant. It massively exaggerates the area of higher latitudes, including all those intense orange areas in Canada, Alaska, Siberia, etc. For example, Greenland is actually about 2/3 the land area of India, not 4-5 times larger as it appears on this map.
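The distortion the commenter describes is easy to quantify: the true area of a 1° x 1° cell shrinks roughly as the cosine of latitude, while a Mercator-style plot draws every cell the same width. A short check using the spherical-Earth formula (mean radius assumed, which is a simplification):

```python
import math

R = 6371.0  # mean Earth radius, km (spherical approximation)

def cell_area_km2(lat_deg):
    """True area of a 1x1 degree cell centered at lat_deg."""
    dlon = math.radians(1.0)
    lat0 = math.radians(lat_deg - 0.5)
    lat1 = math.radians(lat_deg + 0.5)
    # integrate cos(latitude) across the one-degree band
    return R**2 * dlon * (math.sin(lat1) - math.sin(lat0))

equator = cell_area_km2(0.0)
arctic = cell_area_km2(70.0)
print(f"equator: {equator:.0f} km^2, 70N: {arctic:.0f} km^2, "
      f"ratio ~{equator / arctic:.1f}x")
```

An equatorial cell covers nearly three times the area of one at 70°N, so those Siberian and Canadian orange patches are visually inflated by about that factor on an unprojected or Mercator map.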

wayne Job
November 7, 2011 3:16 am

I take great delight in your tilting at windmills, Willis, and have a chuckle at the stupidity of some of the nonsense science. That said, I would like to go off topic: your deductions about the tropical thermostat and the almost constant heat input regardless of changing parameters were very good.
There remain four other thermostats that are a tad more perplexing: the two temperate zones, like piggy in the middle, hot on one side and cold on the other, and the two poles, both totally different in aspect but thermostats nonetheless.
Your odd analytical brain can maybe make some sense of their workings and tie them back to the tropical input. I am not even half clever enough to do this, but I have a sneaking suspicion that you can.

November 7, 2011 3:19 am

E.M.Smith says:
November 7, 2011 at 1:22 am
“”I don’t know how their cells are constructed, but a 360 circle of 360 degrees of longitude I think gives about 129600 cells. Compare with 1200 actual thermometer values in the present.””
Actually it is 360 degrees of longitude x 180 degrees of latitude, for half that many cells: 64,800. So if the data points were distributed with nearest neighbors more than 1 degree apart, 1 out of every 54 grid cells would contain real data.
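The corrected arithmetic in this exchange can be checked in a few lines (the 1,200-station figure is the commenters’ rough number, not an audited count):

```python
# A 1 x 1 degree global grid: 360 longitude bands x 180 latitude bands
lon_cells = 360
lat_cells = 180
total = lon_cells * lat_cells
print(total)               # 64800 cells, not 129600

stations = 1200            # commenters' rough count of current stations
print(total // stations)   # 54: at best ~1 cell in 54 holds a reading
```

Everything in the other 53 of every 54 cells is, as E.M.Smith puts it, interpolated rather than measured.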

November 7, 2011 3:38 am

Statistics? We don’t need no steenkin’ statistics…

Gail Combs
November 7, 2011 4:07 am

This is not science.
It is complete and utter propaganda fed to the media as ” a peer-reviewed paper” to give it more weight in the eyes of the public.
All it actually does is give science another black eye. NO science was done. Some idiot took a false computer model and used it to come up with “conclusions” designed to scare the (self-snip) out of the public.
A skeptic could do the same thing, showing a “rapid descent” into an ice age based on a model, temperatures falling in the last 10 to 15 years, and the end of the Holocene, starting with a CAGW paper.

Lesson from the past: present insolation minimum holds potential for glacial inception 2007
Abstract
The community of climatologists predicts a progressive global warming [IPCC Fourth Assessment Report—Climate Change, 2007. The Scientific Basis. Cambridge University Press, Cambridge] that will not be interrupted by a glacial inception for the next 50 ka [Berger and Loutre, 2002. An exceptionally long Interglacial ahead? Science 297, 1287–1288]. These predictions are based on continuously increasing anthropogenic greenhouse gas emissions and on the orbital forcing that will provide only muted insolation variations for the next 50 ka. To assess the potential climate development without human interference, we analyse climate proxy records from Europe and the North Atlantic of Marine Isotope Stage (MIS) 11 (423–362 ka BP), an interval when insolation variations show a strong linear correlation with those of the recent past and the future. This analysis suggests that the insolation minimum at 397 ka BP, which provides the best available analogue to the present insolation minimum, terminated interglacial conditions in Europe. At that time, tundra–steppe vegetation spread in Central Europe and pine forests dominated in the eastern Mediterranean region. Because the intensities of the 397 ka BP and present insolation minima are very similar, we conclude that under natural boundary conditions the present insolation minimum holds the potential to terminate the Holocene interglacial. Our findings support the Ruddiman hypothesis [Ruddiman, W., 2003. The Anthropogenic Greenhouse Era began thousands of years ago. Climate Change 61, 261–293], which proposes that early anthropogenic greenhouse gas emission prevented the inception of a glacial that would otherwise already have started.

http://www.sciencedirect.com/science/article/pii/S0277379107002715
And this (deleted) Article

Abrupt Climate Change: Should We Be Worried? – Woods Hole Oceanographic Institution
Robert B. Gagosian
President and Director
Woods Hole Oceanographic Institution
Prepared for a panel on abrupt climate change at the
World Economic Forum
Davos, Switzerland, January 27, 2003
Most of the studies and debates on potential climate change, along with its ecological and economic impacts, have focused on the ongoing buildup of industrial greenhouse gases in the atmosphere and a gradual increase in global temperatures. This line of thinking, however, fails to consider another potentially disruptive climate scenario. It ignores recent and rapidly advancing evidence that Earth’s climate repeatedly has shifted abruptly and dramatically in the past, and is capable of doing so in the future.
Fossil evidence clearly demonstrates that Earth’s climate can shift gears within a decade….

But the concept remains little known and scarcely appreciated in the wider community of scientists, economists, policy makers, and world political and business leaders. Thus, world leaders may be planning for climate scenarios of global warming that are opposite to what might actually occur…
REFERENCES:
1 “Are We on the Brink of a New Little Ice Age?”—testimony to the US Commission on Ocean Policy, September 25, 2002, by T. Joyce and L. Keigwin (Woods Hole Oceanographic Institution).
2 Abrupt Climate Change: Inevitable Surprises, US National Academy of Sciences, National Research Council Committee on Abrupt Climate Change, National Academy Press, 2002.
3 “Thermohaline Circulation, the Achilles’ Heel of Our Climate System: Will Man-Made CO2 Upset the Current Balance?” in Science, Vol. 278, November 28, 1997, by W. S. Broecker (Lamont-Doherty Earth Observatory, Columbia University).
4 “Rapid Freshening of the Deep North Atlantic Ocean Over the Past Four Decades,” in Nature, Vol. 416, April 25, 2002, by B. Dickson (Centre for Environment, Fisheries, and Aquaculture Science, Lowestoft, UK), I. Yashayaev, J. Meincke, B. Turrell, S. Dye, and J. Hoffort.
5 “Decreasing Overflow from the Nordic Seas into the Atlantic Ocean Through the Faroe Bank Channel Since 1950,” in Nature, Vol. 411, June 21, 2001, by B. Hansen (Faroe Fisheries Laboratory, Faroe Islands), W. Turrell, and S. Østerhus.
6 “Increasing River Discharge to the Arctic Ocean,” in Science, Vol. 298, December 13, 2002, by B. J. Peterson (Marine Biological Laboratory), R. M. Holmes, J. W. McClelland, C. J. Vörösmarty, R. B. Lammers, A. I. Shiklomanov, I. A. Shiklomanov, and S. Rahmstorf.
7 “Ocean Observatories,” in Oceanus, Vol. 42, No. 1, 2000, published by the Woods Hole Oceanographic Institution.
8 The Little Ice Age: How Climate Made History 1300-1850, by Brian Fagan (University of California, Santa Barbara), Basic Books, 2000.
9 “Cultural Responses to Climate Change During the Late Holocene,” in Science, Vol. 292, April 27, 2001, by P. B. deMenocal (Lamont-Doherty Earth Observatory, Columbia University).
10 “Holocene Climate Instability: A Prominent, Widespread Event 8,200 Years Ago,” in Geology, Vol. 26, No. 6, 1997, by R. B. Alley and T. Sowers (Pennsylvania State University), P. A. Mayewski, M. Stuiver, K. C. Taylor, and P. U. Clark.
11 “A High-Resolution Absolute-Dated Late Pleistocene Monsoon Record From Hulu Cave, China,” in Science, Vol. 294, December 14, 2001, by Y. J. Wang (Nanjing Normal University, China), H. Cheng, R. L. Edwards, Z. S. An, J. Y. Wu, C. C. Shen, and J. A. Dorale.
Originally published: February 10, 2003
Last updated: September 3, 2009

(Wayback snapshot: http://web.archive.org/web/20091118222058/http://www.whoi.edu/page.do?pid=12455&tid=282&cid=9986)
Notice the usual “get more grants free card” that is played

…..It is important to clarify that we are not contemplating a situation of either abrupt cooling or global warming. Rather, abrupt regional cooling and gradual global warming can unfold simultaneously. Indeed, greenhouse warming is a destabilizing factor that makes abrupt climate change more probable. A 2002 report by the US National Academy of Sciences (NAS) said, “available evidence suggests that abrupt climate changes are not only possible but likely in the future, potentially with large impacts on ecosystems and societies.”2…

Looks like he is saying we can have a ” regional ” Ice Age along with global warming… (Rolls eyes)

November 7, 2011 4:13 am

Hi Willis,
Thanks for this interesting post. I work on the same issue as you regarding how to deal with inevitable uncertainties. I met you in Chicago in 2010 and would like to send a private email to you. Please send me your contact address; the one I have from the Heartland literature 2010 is invalid.
Best,
Michael
[Done -w.]

Pete in Cumbria UK
November 7, 2011 4:15 am

I do kinda wonder if these folks have ever heard of or seen ‘2001’, and even if they have, did they come away with any sort of clue as to what the heck went on in there? Maybe they really are as utterly dumb and completely vacant as the love interest was in that recent YouTube video (where the sceptic caught fire and burned down to a cinder).
Also, have they ever come across Schrödinger’s Cat (hopefully not, as we all hope it’s still alive and well inside its box), and would they care to repeat the experiment with their own children as ‘The Cat’ and their computer determining what happened inside the box? Would they allow the peer reviewers (the peeps who bought the computer, are paying their wages and heating their offices) to tweak said computer and check their results? Would they go there or allow that with their own kids at stake? How do we go about asking ’em?

Chris B
November 7, 2011 6:18 am

They don’t estimate the error on SWAGs because it’s a constant. One hundred percent (100%).

Ben of Houston
November 7, 2011 8:18 am

I’ve seen this before, and I can tell you the real reason.
1: They either have no fricken clue what the actual error is due to incompetence, or
2: They calculated it to be something insane and discarded the result as impossible, not realizing that this meant their data was worthless.
