Lovejoy's 99% 'confidence' vs. measurement uncertainty

By Christopher Monckton of Brenchley

It is time to be angry at the gruesome failure of peer review that allows the publication of papers such as the recent effusion from Professor Lovejoy of McGill University, which was billed, in the gushing, widely-circulated press release that seems to accompany every mephitically ectoplasmic emanation from the Forces of Darkness these days, as follows:

“Statistical analysis rules out natural-warming hypothesis with more than 99 percent certainty.”

One thing anyone who studies any kind of physics knows is that claiming a result at 99% confidence – getting on for three standard deviations – requires, at minimum, that the data underlying the claim be exceptionally precise and trustworthy and, in particular, that the measurement error be minuscule.

Here is the Lovejoy paper’s proposition:

“Let us … make the hypothesis that anthropogenic forcings are indeed dominant (skeptics may be assured that this hypothesis will be tested and indeed quantified in the following analysis). If this is true, then it is plausible that they do not significantly affect the type or amplitude of the natural variability, so that a simple model may suffice:

ΔT_globe(t) = ΔT_anth(t) + ΔT_nat(t) + Δε(t)        (1)

ΔT_globe(t) is the measured mean global temperature anomaly, ΔT_anth(t) is the deterministic anthropogenic contribution, ΔT_nat(t) is the (stochastic) natural variability (including the responses to the natural forcings), and Δε(t) is the measurement error. The last can be estimated from the differences between the various observed global series and their means; it is nearly independent of time scale [Lovejoy et al., 2013a] and sufficiently small (≈ ±0.03 K) that we ignore it.”

Just how likely is it that we can measure global mean surface temperature over time either as an absolute value or as an anomaly to a precision of less than 1/30 Cº? It cannot be done. Yet it was essential to Lovejoy’s fiction that he should pretend it could be done, for otherwise his laughable attempt to claim 99% certainty for yet another me-too, can-I-have-another-grant-please result using speculative modeling would have visibly failed at the first fence.
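To see what hangs on that number, here is a minimal sketch in Python (illustrative only, and treating each quoted ± value as a one-sigma Gaussian error on an individual anomaly – an assumption, not something either paper states): the smallest difference between two anomalies that could be asserted at 99% confidence under the ±0.03 K error Lovejoy allows, and under the ±0.15 Cº combined uncertainty discussed further below.

```python
# A minimal, illustrative sketch (not Lovejoy's calculation): treating each
# quoted ± value as a one-sigma Gaussian error on an individual anomaly, how
# small a difference between two anomalies could be asserted at 99% confidence?
from scipy.stats import norm

def min_detectable_difference(sigma_per_value, confidence=0.99):
    """Smallest difference between two independent anomalies that is
    distinguishable from zero at the given two-sided confidence level."""
    z_crit = norm.ppf(0.5 + confidence / 2)      # two-sided critical z value
    return z_crit * sigma_per_value * 2 ** 0.5   # independent errors add in quadrature

for sigma in (0.03, 0.15):   # claimed measurement error vs combined HadCRUT4-style uncertainty
    print(f"±{sigma:.2f} K per anomaly -> smallest 99% detectable difference "
          f"≈ {min_detectable_difference(sigma):.2f} K")
```

With ±0.03 K the threshold comes out at roughly 0.11 K; with ±0.15 Cº it is roughly 0.55 Cº – which is the gist of the complaint about equation (1).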

Some of the tamperings that have depressed temperature anomalies in the 1920s and 1930s, to make the warming over the past century seem greater than it really was, are a great deal larger than a thirtieth of a Celsius degree.

Fig. 1 shows a notorious instance from New Zealand, courtesy of Bryan Leyland:


Figure 1. Annual New Zealand national mean surface temperature anomalies, 1990-2008, from NIWA, showing a warming rate of 0.3 Cº/century before “adjustment” and 1 Cº/century afterward. This “adjustment” is 23 times the Lovejoy measurement error.

 


Figure 2: Tampering with the U.S. temperature record. The 2008 version of the GISS record (right panel) shows the 1934 anomaly 0.1 Cº lower and the 1998 anomaly 0.3 Cº higher than in the original 1999 version (left panel). This tampering, calculated to increase the apparent warming trend over the 20th century, is more than 13 times the tiny measurement error mentioned by Lovejoy. The startling changes to the dataset between the 1999 and 2008 versions, first noticed by Steven Goddard, are clearly seen if the two slides are repeatedly shown one after the other as a blink comparator.

Fig. 2 shows the effect of tampering with the temperature record at both ends of the 20th century to sex up the warming rate. The practice is surprisingly widespread. There are similar examples from many records in several countries.

What is quantified, however – because Professor Jones’ HadCRUT4 temperature series states it explicitly – is the magnitude of the combined measurement, coverage, and bias uncertainties in the data.

Measurement uncertainty arises because measurements are taken in different places under various conditions by different methods. Anthony Watts’ exposure of the poor siting of hundreds of U.S. temperature stations showed up how severe the problem is, with thermometers on airport taxiways, in car parks, by air-conditioning vents, close to sewage works, and so on.

His campaign was so successful that the US climate community was shamed into shutting down or repositioning several poorly-sited temperature monitoring stations. Nevertheless, a network of several hundred ideally-sited stations with standardized equipment and reporting procedures, the US Climate Reference Network, tends to show less warming than the older US Historical Climatology Network.

That record showed – not greatly to skeptics’ surprise – a rate of warming noticeably slower than the shambolic legacy record. The new record was quietly shunted into a siding, seldom to be heard of again. It pointed to an inconvenient truth: some unknown but significant fraction of 20th-century global warming arose from old-fashioned measurement uncertainty.

Coverage uncertainty arises from the fact that temperature stations are not evenly spaced either spatially or temporally. There has been a startling decline in the number of temperature stations reporting to the global network: there were 6000 a couple of decades ago, but now there are closer to 1500.

Bias uncertainty arises from the fact that, as the improved network demonstrated all too painfully, the old network tends to be closer to human habitation than is ideal.


Figure 3. The monthly HadCRUT4 global temperature anomalies (dark blue) and least-squares trend (thick bright blue line), with the combined measurement, coverage, and bias uncertainties shown. Positive anomalies are green; negative are red.

Fig. 3 shows the HadCRUT4 anomalies since 1880, with the combined uncertainties also shown. At present, the combined uncertainties are ±0.15 Cº, or almost a sixth of a Celsius degree up or down, over an interval of 0.3 Cº in total. This value, too, is an order of magnitude greater than the unrealistically tiny measurement error allowed for in Lovejoy’s equation (1).

The effect of the uncertainties is that for 18 years 2 months the HadCRUT4 global-temperature trend falls entirely within the zone of uncertainty (Fig. 4). Accordingly, we cannot tell even with 95% confidence whether any global warming at all has occurred since January 1996.


Figure 4. The HadCRUT4 monthly global mean surface temperature anomalies and trend, January 1996 to February 2014, with the zone of uncertainty (pale blue). Because the trend-line falls entirely within the zone of uncertainty, we cannot be even 95% confident that any global warming occurred over the entire 218-month period.
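The arithmetic behind that comparison can be sketched in a few lines. The sketch below uses synthetic placeholder anomalies rather than the actual HadCRUT4 series, and reads the test in its simplest form: does the total change along the least-squares trend line over the 218 months exceed the combined ±0.15 Cº uncertainty?

```python
# A rough sketch of the test described above, on synthetic placeholder data
# (not the actual HadCRUT4 series): fit a least-squares trend to 218 monthly
# anomalies and compare the total change along the trend line with the
# combined ±0.15 K uncertainty.
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(218)                                           # Jan 1996 - Feb 2014
anomalies = 0.0004 * months + rng.normal(0.0, 0.1, months.size)   # hypothetical series

slope, intercept = np.polyfit(months, anomalies, 1)               # K per month
trend_change = slope * (months[-1] - months[0])                   # total change along the fit
combined_uncertainty = 0.15                                       # K, as quoted for HadCRUT4

print(f"change along trend line: {trend_change:+.3f} K")
if abs(trend_change) <= combined_uncertainty:
    print("trend line stays within the uncertainty band: no warming detectable")
else:
    print("trend line exceeds the uncertainty band")
```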

Now, if you and I know all this, do you suppose the peer reviewers did not know it? The measurement error was crucial to the thesis of the Lovejoy paper, yet the reviewers allowed him to get away with saying it was only 0.03 Cº when the oldest of the global datasets, and the one favored by the IPCC, actually publishes, every month, combined uncertainties that are ten times larger.

Let us be blunt. Not least because of those uncertainties, compounded by data tampering all over the world, it is impossible to determine climate sensitivity from the temperature data either to the claimed precision of 0.01 Cº or to 99% confidence.

For this reason alone, the headline conclusion in the fawning press release about the “99% certainty” that climate sensitivity is similar to the IPCC’s estimate is baseless. The order-of-magnitude error about the measurement uncertainties is enough on its own to doom the paper. There is a lot else wrong with it, but that is another story.


268 Comments
milodonharlani
April 11, 2014 10:23 pm

Magma says:
April 11, 2014 at 9:16 pm
A short comparison
S. Lovejoy: Physics PhD; climate, meteorological and statistical expert; 500+ publications
C. Monckton: Classics BA; nil; 1 (not peer reviewed)
A short comparison
Opponents of special relativity: Physics PhDs from the most prestigious German universities, with copious numbers of publications
A. Einstein*: Teaching diploma; Swiss patent clerk; two papers (not peer reviewed)
* In 1905, at the time of publication of his paper describing special relativity, “Zur Elektrodynamik bewegter Körper”, Annalen der Physik 17: 891

eo
April 11, 2014 10:26 pm

One of the most confusing aspects of statistics to the layman is the confidence limit. 100 per cent confidence actually means covering everything from minus infinity to plus infinity in the case of the normal curve. The larger the confidence level, or the higher the number of standard deviations applied, the more imprecise the resulting statement becomes. At the 100 per cent confidence level, everything is in.
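A quick illustration of the commenter’s point, assuming a standard normal distribution (arbitrary units): the higher the confidence level demanded, the wider the interval that must be quoted, and a 100% interval spans everything.

```python
# Illustration for a standard normal distribution: higher confidence levels
# require wider intervals, and 100% confidence would span the whole real line.
from scipy.stats import norm

for conf in (0.68, 0.95, 0.99, 0.999):
    lo, hi = norm.interval(conf)      # central interval containing `conf` of the probability
    print(f"{conf:.1%} confidence -> within ±{hi:.2f} standard deviations")
```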

Scott Scarborough
April 11, 2014 10:29 pm

I’d like to know when the next ice age is going to occur. These guys must know. If they can claim to know what variations are natural, to such accuracy as to claim that a 0.9 C change must be anthropogenic, they must have natural change nailed down pretty tight. So let’s have it: when is the next ice age?

April 11, 2014 10:34 pm

I used to make thermometers for a living. Well, the company did; I wrote statistical process control software that used the digital thermometers for predictive control of factory machinery. Today in the 21st century, armed with a thousand dollars, you can’t walk into a science/engineering shop and come out with a temperature probe that will give any claim to this type of accuracy. It is PURE fiction. Climate “science” is just a bunch of made up numbers published by fools and promoted by bigger fools. And crooks.

April 11, 2014 10:39 pm

Scott Scarborough: probably not for another few tens of thousands of years, at least as far as Milankovitch cycles go. The external forcing associated with ice ages is pretty small, though; most of the drop in temperatures appears to be due to internal feedbacks.
http://upload.wikimedia.org/wikipedia/commons/5/53/MilankovitchCyclesOrbitandCores.png

April 11, 2014 10:40 pm

“next ice-age.”
What “next”? I can’t be the only one to notice that a hunk of rock bigger than Australia is covered in ice up to 2 km deep, where 98% of all the fresh water on Earth is locked away? The temperature of this planet has been boringly stable for the last 15,000 years. And even the glitch that filled up Bass Strait with 50m of sea water wasn’t a real big deal. If you mention to a greenie that Bass Strait has only existed for 15,000 years their eyes glaze over.

April 11, 2014 10:46 pm

“Magma says:
April 11, 2014 at 9:16 pm
A short comparison
S. Lovejoy: Physics PhD; climate, meteorological and statistical expert; 500+ publications
C. Monckton: Classics BA; nil; 1 (not peer reviewed)”
Yes Magma – pretty sad isn’t it. I’m assuming you are saying Lovejoy should know better and has no excuse. If that’s not what you are saying, then I’ve enjoyed watching you embrace the “appeal to authority” logical fallacy while you ignore common sense.
Once again, one of my favorite graphs is relevant:
http://i.snag.gy/BztF1.jpg
With natural variation like that in the best proxies we have from the ends of the earth, the whole premise of looking for and finding a human signal – much less humans being responsible for almost all of the warming with a 99 percent confidence level – is just laughably silly.
“Facts are stubborn things, but statistics are more pliable”

Jeff Alberts
April 11, 2014 10:50 pm

“ΔTglobe/Δt”
Is physically meaningless.

prjindigo
April 11, 2014 10:53 pm

Precisely what I keep saying.
In temperature readings the error is often MORE than 1°C to begin with.

Jeff Alberts
April 11, 2014 10:57 pm

“They say statistics are like bikinis…”
“Pier reviews need…”

Guess we know where your mind is 😉

April 11, 2014 10:58 pm

Scott Scarborough – now that would be climate science worth pursuing. Talk about a climate emergency. Most of Canada will need to relocate as well as much of the USA. Of course we will get our old seashores back as our continental shelves get exposed. Much of Europe will need to move too.
The scary thing is that the good temperature proxies show we’ve been in a long-term cooling trend for thousands of years. We still have peaks, but each peak tops out at a lower temperature than the prior peak. This trend is now about 3000 years old.
http://i.snag.gy/BztF1.jpg
We are heading to the return of normal conditions, but when is still pretty unclear. There may be a tipping point where the geometry of our orbits, combined with lowered solar activity, pops us back into icy conditions in a very short period of time.
One of the things people don’t seem to realize is things like the ‘caveman’ melting out of a glacier, or that core drillings in Glacier National Park show that much of the world’s glaciers are only about 3000 years old, which the ice-core proxy graph above agrees with. We’ve been cooling for some time, and it was warmer for the vast majority of this current interglacial than it is now.
This full context is what makes this current agenda-driven science so sorry to see. The IPCC says the human CO2 contribution to warming was not significant except from the middle of last century to now. Yet even those short-term increases have happened at other times, to the same degree, since the end of the Little Ice Age. There is just nothing really going on here outside of natural variation that I can see.
It’d be nice if man were causing global warming, because then we might be able to engineer our way out of the return of what most people call the ice age (technically we are still in the ice age, but just not as ice-filled as normal in this short interglacial). But I don’t think we can do much to prevent the return to normal conditions.

April 11, 2014 11:11 pm

“Coverage uncertainty arises from the fact that temperature stations are not evenly spaced either spatially or temporally. There has been a startling decline in the number of temperature stations reporting to the global network: there were 6000 a couple of decades ago, but now there are closer to 1500.”
Wrong.
10,000 in North America alone
http://berkeleyearth.lbl.gov/auto/Regional/TAVG/Figures/north-america-TAVG-Counts.pdf

April 11, 2014 11:23 pm

Based on the evidence presented here, coupled with our inability to verify Magma’s identity, I can still say with 99% certainty that Magma’s parents had no children that lived past the numerical IQ of their shoe size. Sarc/off
As for the author, I think a financial audit is in order….

alleagra
April 11, 2014 11:27 pm

Magma :
You are best advised to sit down and consider for a few minutes just how illogical and ill advised your comment is. You have made a fool of yourself.
There is good news however. Hereafter, if you submit a well-argued comment or draw attention to pertinent evidence, you can be sure your contribution will be considered by everyone on its merits rather than on the basis of an embarrassing record or who you are and what you’ve done – because that’s how intelligent discussion proceeds.

asybot
April 11, 2014 11:32 pm

@Peterg : There must be some droll stuff in there.
Yep. For certain, 500 papers? That is why they use a photo-copier, it just keeps on repeating the same droll stuff over and over again.
And by now whoever built the copier should use it for advertising – or then maybe not; it must be one very tired copier (or some really tired interns).

Somebody
April 11, 2014 11:33 pm

“Magma, do you disagree with Feynman’s statement on the key to science?”
All of those that believe in the ‘science’ (AGW religious doctrine, really) disagree with it. By the way, Feynman has a beautiful chapter on probabilities and statistics (The Feynman Lectures on Physics, Volume I, chapter 6 http://www.feynmanlectures.caltech.edu/I_06.html ). They should read that, too.

Mindert Eiting
April 11, 2014 11:51 pm

Divide the earth into twelve latitude regions. Estimate for each region a time series of surface temperatures. If these series consisted of error only, their covariances should be zero. It does not matter whether we call it measurement error or natural error. A global signal must show up as positive covariance everywhere in the matrix. I have computed the matrix for the GHCN data (pairwise deletion of missing data) for 1702-2009. With increasing latitude distance the covariances drop to zero. There is only positive covariance between latitude regions in the Northern hemisphere, the more so the further we go toward the Arctic. Conclusions up to the reader. Has this been done in studies I have not seen?
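For anyone who wants to reproduce the exercise, here is a minimal sketch of the computation described, on synthetic placeholder data rather than GHCN (one column of anomalies per hypothetical latitude band, NaN marking missing months); pandas’ cov() performs the pairwise deletion of missing values that the comment mentions.

```python
# A minimal sketch of the described computation on synthetic placeholder data
# (twelve hypothetical latitude bands, NaN = missing month), not the GHCN data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n_months, n_bands = 1200, 12
shared = rng.normal(0.0, 0.2, n_months)             # a hypothetical global signal
data = pd.DataFrame({
    f"band_{i:02d}": shared + rng.normal(0.0, 0.5, n_months)
    for i in range(n_bands)
})
data = data.mask(rng.random(data.shape) < 0.1)      # knock out ~10% of values at random

# DataFrame.cov() excludes missing values pairwise, as in the comment.
cov = data.cov()

# A genuinely global signal should give positive covariance for every pair of
# bands; covariances falling to zero with latitude separation would point to
# regional rather than global variability.
print(cov.round(3))
```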

Toto
April 11, 2014 11:56 pm

“Statistical analysis rules out natural-warming hypothesis with more than 99 percent certainty.”
Not quite. Assuming everything else is legit with what he has done (ha!), what he has actually done is to show that HIS hypothesis of how “natural-warming” works is wrong. I suspect his hypothesis is only a straw-man and does not reflect the reality of any natural climate processes.

April 12, 2014 12:05 am

Magma, maybe Lovejoy obtained his qualifications from a mail-order university (a certain judge who was convicted of perjury obtained a PhD by mail order), or he paid someone to write his thesis.
The other possibility, if he really is an expert on temperature assessment, is that he is a liar.
As his Lordship states, any peer reviewer should have been able to assess that Lovejoy’s paper is nonsense and recommend that it not be published. Shame on the reviewers and the editor of the journal.

April 12, 2014 12:15 am

The result described in this paper only applies if natural variation is random and behaves like Brownian motion. As far as I know, no one has ever claimed that natural variation is stochastic. Instead, natural processes are proposed to be cyclic, with periods of 60 years and longer. This paper does not address this at all. Instead it reproduces standard high-school physics by showing that the probability of a dust particle in air following a path similar in shape to HadCRUT4 is very small.

Steve C
April 12, 2014 12:16 am

Magma says: April 11, 9:16 pm (paraphrased):
“Only a proctologist can recognise a turd.”
Wrong.

climatereason
Editor
April 12, 2014 12:32 am

Come on Mosh
We need you to turn up and explain again how it is OK to change past temperatures by using an algorithm.
tonyb

Bob Fernley-Jones
April 12, 2014 12:41 am

In connection with various scams, particularly involving cash investments, there is a popular saying (here in Oz):
If it sounds too good to be true, it probably is!
Perhaps this rule should be considered by CAGW alarmists, but apparently that is inconvenient for them (a classic case being the Mann hockey-stick graph).
Or, maybe there is an antonymous rule that is equally logical:
If it sounds too bad to be true, it probably is!

Peter Miller
April 12, 2014 12:48 am

As a geologist I can assure you all, there never were any climate cycles prior to 1950, nor were there any afterwards. The recent warming is 99% certain to be man made, climate was totally static prior to 1950.
That’s what we need to aim for: the climate of 1750AD, 1000AD, year 0, 5000BC or 15000 years ago.
What a crock; the above is such obvious BS. The average temperature of those years varies from approximately +3 to -8 degrees C compared with what it is today.
Natural climate cycles are the alarmists’ great heresy, which demand the immediate attention of the Climate Inquisition. For natural climate cycles are the Achilles Heel of alarmist theories, and most important, alarmists want to ensure nobody ever gets to know their dark secret that CAGW never happened in the geological record.
If the great Lew wanted to do something useful, he would probably find that those who don’t believe in natural climate cycles also do not believe, what was it now? Man landing on the moon and tobacco smoke causing cancer – and this time it would be true!!!!
Geology is a real science, while climate science is more akin to astrology. Try finding a non-government geologist who believes in CAGW and you will find it more difficult than finding a needle in a haystack.

lee
April 12, 2014 1:01 am

Louis says:
April 11, 2014 at 10:04 pm
‘The global temperature reconstructions that Lovejoy used to go back to the 1500s have larger uncertainties than the modern temperature records.’
With the way modern temperatures are brutalised I’m not so sure.