UPDATE – BOMBSHELL: audit of global warming data finds it riddled with errors

I’m bringing this back to the top for discussion, mainly because Steven Mosher was being a cad in comments, wailing about “not checking”, claiming McLean’s PhD thesis was “toast”, while at the same time not bothering to check himself. See the update below. – Anthony


Just ahead of a new report from the IPCC, dubbed SR15, about to be released today, we have this bombshell: a detailed audit shows the surface temperature data is unfit for purpose. The first ever audit of the world’s most important temperature dataset (HadCRUT4) has found it to be so riddled with errors and “freakishly improbable data” that it is effectively useless.

From the IPCC:

Global Warming of 1.5 °C, an IPCC special report on the impacts of global warming of 1.5 °C above pre-industrial levels and related global greenhouse gas emission pathways, in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty.

This is what consensus science brings you – groupthink with no quality control.

HadCRUT4 is the primary global temperature dataset used by the Intergovernmental Panel on Climate Change (IPCC) to make its dramatic claims about “man-made global warming”. It’s also the dataset at the center of “ClimateGate” from 2009, managed by the Climatic Research Unit (CRU) at the University of East Anglia.

The audit finds more than 70 areas of concern about data quality and accuracy.

But according to an analysis by Australian researcher John McLean, it’s far too sloppy to be taken seriously even by climate scientists, let alone by a body as influential as the IPCC or by the governments of the world.

Main points:

  • The Hadley data is one of the most cited, most important databases for climate modeling, and thus for policies involving billions of dollars.
  • McLean found freakishly improbable data, systematic adjustment errors, large gaps where there is no data, location errors, Fahrenheit temperatures reported as Celsius, and spelling errors.
  • Almost no quality control checks have been done: outliers that are obvious mistakes have not been corrected – one town in Colombia spent three months in 1978 at an average daily temperature of over 80 degrees C. One town in Romania stepped out of summer in 1953 straight into a month of spring at minus 46°C. These are supposedly “average” temperatures for a full month at a time. St Kitts, a Caribbean island, was recorded at 0°C for a whole month – twice!
  • Temperatures for the entire Southern Hemisphere in 1850 and for the next three years are calculated from just one site in Indonesia and some random ships.
  • Sea surface temperatures represent 70% of the Earth’s surface, but some measurements come from ships which are logged at locations 100km inland. Others are in harbors which are hardly representative of the open ocean.
  • When a thermometer is relocated to a new site, the adjustment assumes that the old site was always built up and “heated” by concrete and buildings. In reality, the artificial warming probably crept in slowly. By correcting for buildings that likely didn’t exist in 1880, old records are artificially cooled. Adjustments for a few site changes can create a whole century of artificial warming trends.

Details of the worst outliers

  • For April, June and July of 1978, Apto Uto (Colombia, ID: 800890) had average monthly temperatures of 81.5°C, 83.4°C and 83.4°C respectively.
  • The monthly mean temperature in September 1953 at Paltinis, Romania, is reported as -46.4°C (in other years the September average was about 11.5°C).
  • At Golden Rock Airport, on the island of St Kitts in the Caribbean, mean monthly temperatures for December in 1981 and 1984 are reported as 0.0°C. But from 1971 to 1990 the average in all the other years was 26.0°C.
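Outliers of this size are exactly what a basic sanity check would catch. As a rough illustration only (my own sketch, not the audit’s or HadCRUT4’s method; the 10°C threshold is an assumption), a few lines of Python comparing each monthly value against the mean of the remaining years flag the St Kitts zeros immediately:

```python
# Hypothetical sanity check of the kind the audit says was never applied.
# Values are taken from the Golden Rock Airport example above.

def flag_suspect_months(monthly_means, threshold=10.0):
    """Flag values that differ from the mean of the remaining
    months by more than `threshold` degrees C."""
    flagged = []
    for i, value in enumerate(monthly_means):
        others = monthly_means[:i] + monthly_means[i + 1:]
        climatology = sum(others) / len(others)
        if abs(value - climatology) > threshold:
            flagged.append((i, value))
    return flagged

# December means 1971-1990: roughly 26 C every year except the two
# zero readings noted in the audit.
december_means = [26.0] * 18 + [0.0, 0.0]
print(flag_suspect_months(december_means))  # → [(18, 0.0), (19, 0.0)]
```

Any of the three outliers listed above would fail an analogous check against its own station history.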

More at Jo Nova


The report:

Unfortunately, the report is paywalled. The good news is that it’s a mere $8.

The researcher, John McLean, did all the work on his own, so the fee is a way for him to be compensated for all the time and effort put into it. He writes:

This report is based on a thesis for my PhD, which was awarded in December 2017 by James Cook University, Townsville, Australia. The thesis was based on the HadCRUT4 dataset and associated files as they were in late January 2016. The thesis identified 27 issues of concern about the dataset.

The January 2018 versions of the files contained not just updates for the intervening 24 months, but also additional observation stations and consequent changes in the monthly global average temperature anomaly right back to the start of data in 1850.
The report uses January 2018 data and revises and extends the analysis performed in the original thesis, sometimes omitting minor issues, sometimes splitting major issues and sometimes analysing new areas and reporting on those findings.

The thesis was examined by experts external to the university, revised in accordance with their comments and then accepted by the university. This process was at least equivalent to “peer review” as conducted by scientific journals.

I’ve purchased a copy, and I’ve reproduced the executive summary below. I urge readers to buy a copy and support this work.

Get it here:

Audit of the HadCRUT4 Global Temperature Dataset


EXECUTIVE SUMMARY

As far as can be ascertained, this is the first audit of the HadCRUT4 dataset, the main temperature dataset used in climate assessment reports from the Intergovernmental Panel on Climate Change (IPCC). Governments and the United Nations Framework Convention on Climate Change (UNFCCC) rely heavily on the IPCC reports so ultimately the temperature data needs to be accurate and reliable.

This audit shows that it is neither of those things.

More than 70 issues are identified, covering the entire process from the measurement of temperatures to the dataset’s creation, to data derived from it (such as averages) and to its eventual publication. The findings (shown in consolidated form in Appendix 6) even include simple issues of obviously erroneous data, glossed-over sparsity of data, significant but questionable assumptions and temperature data that has been incorrectly adjusted in a way that exaggerates warming.

It finds, for example, an observation station reporting average monthly temperatures above 80°C, two instances of a station in the Caribbean reporting December average temperatures of 0°C and a Romanian station reporting a September average temperature of -45°C when the typical average in that month is 10°C. On top of that, some ships that measured sea temperatures reported their locations as more than 80km inland.

It appears that the suppliers of the land and sea temperature data failed to check for basic errors and the people who create the HadCRUT dataset didn’t find them and raise questions either.

The processing that creates the dataset does remove some errors, but it uses a threshold derived from two values calculated from part of the data – and errors weren’t removed from that part before the two values were calculated.
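The circularity described here is easy to demonstrate with toy numbers. In the sketch below (my illustration, not HadCRUT4’s actual algorithm; the 5-standard-deviation cutoff and the Romanian-style values are assumptions), a single extreme value inflates the standard deviation enough to place itself inside the acceptance threshold:

```python
# Illustrative sketch: a threshold computed from contaminated data
# can fail to reject the very outlier that contaminated it.
from statistics import mean, pstdev

normal_septembers = [10.9, 11.2, 11.5, 11.8, 12.1] * 4   # ~11.5 C typical
outlier = -46.4                                           # the Romanian value

def survives(value, reference, n_std=5.0):
    """True if `value` falls within mean +/- n_std * std of `reference`."""
    return abs(value - mean(reference)) <= n_std * pstdev(reference)

# Threshold derived from data still containing the outlier: it slips through,
# because the outlier itself has inflated the standard deviation.
print(survives(outlier, normal_septembers + [outlier]))   # True
# Threshold derived from clean data: the outlier is caught.
print(survives(outlier, normal_septembers))               # False
```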

Data sparsity is a real problem. The dataset starts in 1850, but for just over two years at the start of the record the only land-based data for the entire Southern Hemisphere came from a single observation station in Indonesia. At the end of five years just three stations reported data in that hemisphere. Global averages are calculated from the averages for each of the two hemispheres, so these few stations have a large influence on what’s supposedly “global”. Related to the amount of data is the percentage of the world (or hemisphere) that the data covers. According to the method of calculating coverage for the dataset, 50% global coverage wasn’t reached until 1906 and 50% of the Southern Hemisphere wasn’t reached until about 1950.
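The weighting issue can be seen in miniature. The sketch below (toy anomaly values, assumed for illustration) follows the averaging structure described above – the global figure is the mean of the two hemispheric means – so a lone Southern Hemisphere station carries as much weight as every Northern Hemisphere station combined:

```python
# Toy model of hemispheric averaging as described in the audit.

def hemispheric_mean(anomalies):
    return sum(anomalies) / len(anomalies)

def global_mean(nh_anomalies, sh_anomalies):
    # The global anomaly is the unweighted mean of the two hemispheres.
    return (hemispheric_mean(nh_anomalies) + hemispheric_mean(sh_anomalies)) / 2

# Hypothetical 1850-style situation: several NH stations, one SH station.
nh = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2]   # six NH station anomalies, mean 0.05
sh = [1.0]                               # a single Indonesian station
# The lone SH station drags the "global" figure to ~0.525, not ~0.05.
print(global_mean(nh, sh))
```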

In May 1861 global coverage was a mere 12% – that’s less than one-eighth. In much of the 1860s and 1870s most of the supposedly global coverage was from Europe and its trade sea routes and ports, covering only about 13% of the Earth’s surface. To calculate averages from this data and refer to them as “global averages” is stretching credulity.

Another important finding of this audit is that many temperatures have been incorrectly adjusted. The adjustment of data aims to create a temperature record that would have resulted if the current observation stations and equipment had always measured the local temperature. Adjustments are typically made when a station is relocated or its instruments or their housing are replaced.

The typical method of adjusting data is to alter all previous values by the same amount. Applying this to situations that changed gradually (such as a growing city increasingly distorting the true temperature) is very wrong, and it leaves the earlier data adjusted by more than it should have been. Observation stations might be relocated multiple times, and with all previous data adjusted each time, the very earliest data might be far below its correct value and the complete data record might show an exaggerated warming trend.
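The effect described here can be reproduced with a toy series. In the sketch below (my illustration, with assumed numbers: a flat true climate plus a 2°C-per-century urban drift), subtracting the full final offset from all earlier values over-cools the earliest records and manufactures a warming trend where none existed:

```python
# Toy model of the step-adjustment error: a gradual urban drift is
# "corrected" as if it had been a constant offset from the beginning.

years = list(range(1880, 1981))
true_climate = [10.0] * len(years)                    # no real trend
urban_drift = [0.02 * i for i in range(len(years))]   # slow 2 C/century creep
recorded = [t + u for t, u in zip(true_climate, urban_drift)]

# Step adjustment at relocation: subtract the final offset from every
# earlier value, as if the site had always been fully built up.
offset = urban_drift[-1]                              # 2.0 C by 1980
adjusted = [r - offset for r in recorded]

print(adjusted[0])    # 1880 now reads 8.0 C, 2 C below the true 10.0 C
print(adjusted[-1])   # 1980 correctly reads 10.0 C
# The adjusted series warms from 8.0 to 10.0 C: a wholly artificial trend.
```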

The overall conclusion (see chapter 10) is that the data is not fit for global studies. Data prior to 1950 suffers from poor coverage and very likely multiple incorrect adjustments of station data. Data since that year has better coverage but still has the problem of data adjustments and a host of other issues mentioned in the audit.

Calculating the correct temperatures would require a huge amount of detailed data, time and effort, which is beyond the scope of this audit and perhaps even impossible. The primary conclusion of the audit is, however, that the dataset shows exaggerated warming and that global averages are far less certain than has been claimed.

One implication of the audit is that climate models have been tuned to match incorrect data, which would render incorrect their predictions of future temperatures and estimates of the human influence on temperatures.

Another implication is that the proposal that the Paris Climate Agreement adopt 1850-1899 averages as “indicative” of pre-industrial temperatures is fatally flawed. During that period global coverage is low – it averages 30% across that time – and many land-based temperatures are very likely to be excessively adjusted and therefore incorrect.

A third implication is that even if the IPCC’s claim that mankind has caused the majority of warming since 1950 is correct then the amount of such warming over what is almost 70 years could well be negligible. The question then arises as to whether the effort and cost of addressing it make any sense.

Ultimately it is the opinion of this author that the HadCRUT4 data, and any reports or claims based on it, do not form a credible basis for government policy on climate or for international agreements about supposed causes of climate change.


Full report here


UPDATE: 10/11/18

Some commenters on Twitter, and also here, including Steven Mosher, who said McLean’s thesis/PhD was “toast”, seem to doubt that he was actually allowed to submit his thesis, and/or that it was accepted, thus negating his PhD. To that end, here is the proof.

McLean’s thesis appears on the James Cook University website: “An audit of uncertainties in the HadCRUT4 temperature anomaly dataset plus the investigation of three other contemporary climate issues”, submitted for a Ph.D. in physics from James Cook University (2017).

And, he was in fact awarded a PhD by JCU for that thesis.

Larry Kummer of Fabius Maximus directly contacted the University to confirm his degree. Here is the reply.

ADDED:

JOHN MCLEAN here.
For Mr Mosher,

I don’t insult and I don’t accuse without investigation. And if I don’t know I try to ask.

(a) Data files
If you want copies of the data that I used in the audit, as they were when I downloaded them in January, go to web page https://robert-boyle-publishing.com/audit-of-the-hadcrut4-global-temperature-dataset-mclean-2018/ and just scroll down.

Or download the latest versions of the files for yourself from the CRU and Hadley Centre, namely https://crudata.uea.ac.uk/cru/data/temperature/ and https://www.metoffice.gov.uk/hadobs/hadsst3/data/download.html. (The fact that the file names are always the same, which is confusing, is one of the findings of the audit.)

(b) Apto Uto not used? Figure 6.3 shows that it is used; the lower-than-expected spikes are because of other stations in the same grid cell, and the value of the cell is the average anomaly for all such stations.

(c) What stations are used and what are not?
The old minimum of 20 years of the 30 from 1961 to 1990 was dropped a few HadCRUT versions back. It then went to 15 years with no more than 5 missing in any decade. HadCRUT4 reduced it again to 14.

best wishes

John

Harry Passfield
October 7, 2018 11:52 am

I do hope this is the John McLean from Die Hard ++. ‘Cos he can really whip a*se.

Snarling Dolphin
October 7, 2018 12:14 pm

Is freakishly improbable the same as obviously manipulated? I’m thinking yes. Yes it is. About gd time someone calls bs bs.

Louis Hunt
October 7, 2018 12:20 pm

“…an IPCC special report on the impacts of global warming… in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty.”

For the IPCC to have multiple objectives, as stated above, seems problematic to me. How do you “eradicate poverty” or encourage “sustainable development” without providing poor countries with the money and technology to do so? And when you promise developing nations who sign on to the Paris Accords that wealthy nations will pay them climate reparations as the climate warms, how does that not put political pressure on them to make sure their temperature data shows warming? When payments depend on the climate warming, you can expect the climate to warm. That goes doubly for climate scientists whose paychecks and grants also depend on a warming climate and the predictions of negative consequences from such warming. So what incentive is there to remove errors in the temperature data if it reduces the warming trend?

Samuel C Cogar
Reply to  Louis Hunt
October 8, 2018 3:03 am

How do you “eradicate poverty” or encourage “sustainable development” without providing poor countries with the money and technology to do so?

“HA”, ….. American taxpayers have been trying to “eradicate poverty in the US” for the past 55+ years by providing trillions of dollars and technology to do so …… and the percent deemed to be impoverished now days is far, far greater than 55+ years ago.

And “HA, HA”, ….. for the past 50+ years, ….. American taxpayers have been trying to “eradicate poverty” and encourage “sustainable development” by giving the Palestinians TENS of BILLIONS of dollars and technology to do so, ……. and living standards there are still bout the same.

Paramenter
October 7, 2018 12:28 pm

“one town in Columbia spent three months in 1978 at an average daily temperature of over 80 degrees C”

Not too bad for holiday lovers. I anticipate twofold response to those findings:

1. Ignore as long as you can.
2. If you cannot ignore and findings are actually finding ways into the professional community/public and are gathering attention say that averaging process operating on large numbers will nicely cancel out all those unfortunate errors. Thus, we may not be able to figure out accurately actual temperatures but with appropriate statistics we can with certainty measure a warming trend. And all is fine.

John Tillman
Reply to  Paramenter
October 7, 2018 12:41 pm

Should be Colombia. The country is spelled correctly farther down.

HadCRUT is probably less of a mess than GISS and BEST.

John Tillman
Reply to  Paramenter
October 7, 2018 12:43 pm

No wonder Phil Jones didn’t want anyone looking at his “data”.

MarkW
Reply to  Paramenter
October 7, 2018 2:58 pm

The question then becomes, what do you do once the problem has been identified.
First check the original logs, if they still exist.
If that doesn’t work, an honest scientist would not use the data as it is obviously flawed.
A climate scientist would declare that we can just model the data by taking the average value for that month and then adjust that average based on what has happened at stations up to 600 miles away.
The climate scientist would then tell you that the data thus modeled is even more accurate than the original data could have been so you don’t need to worry about error bars.

Alasdair
October 7, 2018 12:36 pm

Global Temperature is as long as a piece of string.

The string of assumptions, definitions, locations, errors, manipulations, bias, presentation, accuracy, interpretation and wishful thinking, – plus a few more.

Currently it is a bit like a tangle of knitting.

October 7, 2018 1:25 pm

I’ve followed this issue for many years now and remember from way back, probably 10 years ago, that Prof Jones of CRU said in response to an FOI request (from a Warwick Hughes, I think):

We have 25 or so years invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it.

Am I right in thinking this is that data?

October 7, 2018 1:28 pm


Was the recent Slowdown caused by the super El Nino of 1998?

If you take the GISTEMP temperature series and replace the 1998 temperature anomaly with a new value that is spot on the trend line, does the Slowdown disappear?

Warning – the results of this article will be shocking, for some people.

https://agree-to-disagree.com/was-the-slowdown-caused-by-1998

October 7, 2018 1:40 pm

Support John McLean financially.

Pay for MULTIPLE copies of his ebook ($8) and DON’T forward it on unless you’ve paid for every copy you’re sending out.

(I’ve bought 4 so far)

John McLean
Reply to  StefanL
October 12, 2018 3:06 am

Hi StefanL,

I really appreciate what you’ve done. It’s nice to know that people support your work.

All the best,

John

Warren
October 7, 2018 1:56 pm

Just purchased a copy.
Probably won’t read it but John M deserves our money.
Robert Boyle Publishing has a very clunky ‘shop’.
Attention to detail is paramount in e-commerce.
First issue is no auto-fill and it gets worse from there.
RBP should have a ‘friend’ (who’s never used their ‘shop’ before) have a go at buying a copy under observation (without guidance). Things don’t quite happen the way one expects!

Reply to  Warren
October 7, 2018 2:14 pm

Warren
October 7, 2018 at 1:56 pm

I found the website OK…had to fill in all my details as this was my first purchase from them but download was very quick.

Pop Piasa
October 7, 2018 2:27 pm

“…in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty.”

🎵One of these things is not like the others,
One of these things doesn’t belong.
Can you tell which thing is not like the others,
Before I finish my song?🎶

October 7, 2018 2:40 pm

It is good to see James Cook University at Townsville being involved and assisting some actual scientific work for a change and not simply promoting alarmism over the Great Barrier Reef to raise funds.
I would guess that the Hadley data had been checked by the Warmistas there to ensure that improbable data, systematic adjustments, gaps with no data, location errors, Fahrenheit temperatures reported as Celsius, and spelling errors were left in the data only when they promoted and supported the Alarmist-Warmist cause. Any that did the opposite would have been ruthlessly rooted out already.

michael hart
October 7, 2018 3:01 pm

Given the nature of the way the data is collected from such disparate sources, these findings are not really surprising. Other technical and scientific considerations aside, this is one of the points where satellite readings score so much better. The relative costs in man-hours taken to produce the respective competing data sets would also make for interesting reading.

Steven Fraser
October 7, 2018 3:17 pm

I wonder what would be found if the called-out problem locations/date values were checked in the other ‘global’ datasets. Do they show the same artefacts?

Patrick MJD
October 7, 2018 4:01 pm
Graeme#4
Reply to  Patrick MJD
October 8, 2018 4:31 am

The Australian paper has included it in its latest news, so it will be in tomorrow’s news. Already the leftists are trying to discredit the report.

David L Hagen
October 7, 2018 4:10 pm

As I recall, I found similar errors in the original Minnesota temperature records when trying to analyze min, max, mean, and kurtosis.

Jeff
October 7, 2018 4:54 pm

Full free thesis here :

https://researchonline.jcu.edu.au/52041/1/52041-mclean-2017-thesis.pdf

Dedicated to the late Bob Carter who was his first supervisor at James Cook.

Whiskey
October 7, 2018 6:18 pm

You know, you call yourselves “skeptics” but no one here has challenged the article, in spite of its obvious shortcomings. Instead, you wait for the real skeptics, Stokes and Mosher, to say something. But they don’t, letting you relax in a puddle of embarrassment.

John Tillman
Reply to  Whiskey
October 7, 2018 6:26 pm

Whiskey,

Please point out to readers here the paper’s shortcomings which you find so obvious. Why wait for others to do so?

Thanks!

MarkW
Reply to  John Tillman
October 7, 2018 6:32 pm

Isn’t it obvious. The paper fails to agree with the models. That’s proof that the paper is flawed.

John Tillman
Reply to  MarkW
October 7, 2018 6:37 pm

When models disagree with reality, print the models.

The Man Who Shot Down HadCRUT.

Whiskey
Reply to  John Tillman
October 7, 2018 7:19 pm

John Tillman says: “Whiskey, Please point out to readers here the paper’s shortcomings which you find so obvious. Why wait for others to do so?”
Why? Because why should I do work for fake skeptics? If you are really skeptics, why are you doing this “pal review”? I mean, isn’t this what you are all against? You are a bunch of fakes.

John Tillman
Reply to  Whiskey
October 7, 2018 7:22 pm

Why not do such work in order to give us skeptics something to comment upon.

If the shortcomings are obvious, how hard could it be for you to do this work?

Thanks!

Anyone taking even the most cursory glance at HadCRU’s “data” has found the same sort of issues. Phil Jones himself has admitted that they warmed the sea “surface data” to bring it in line with the land, which their adjustments had warmed so unphysically.

Reg Nelson
Reply to  John Tillman
October 7, 2018 10:46 pm

Phil Jones also admitted the SH temp data was largely made up. This paper proves Jones was right.

John Tillman
Reply to  John Tillman
October 11, 2018 11:49 am

Reg,

Somehow I doubt that being shown right will make Dr. Phil happy.

MarkW
Reply to  Whiskey
October 11, 2018 11:33 am

Whiskey might as well stand on the table and shout that he has no intention of behaving like an adult.

MarkW
Reply to  Whiskey
October 7, 2018 6:34 pm

While it’s true that Nick did stop by, his complaints were easily dealt with.
As of the time of my post, Mosh has not posted to this thread.
You three really need to work on your co-ordination.

Whiskey
Reply to  MarkW
October 7, 2018 7:07 pm

I’m sorry, there are no notes on Stokes’s comments that relate to Bennett’s comments. So you are misrepresenting facts, a common ploy of so-called fake “skeptics”. Thank you.

Reg Nelson
Reply to  Whiskey
October 7, 2018 10:51 pm

Nick didn’t address the main point of the paper — the temp data is sparse, poor quality, poor precision, heavily adjusted, and not suitable for scientific purposes. He knows all of these things are true, so he ignores them.

We don’t know the global temp in 1850. To suggest that we do with a precision of 0.1 C is simply absurd.

Patrick MJD
Reply to  Reg Nelson
October 8, 2018 12:01 am

Not only that, but it was a global average in 1850! Absolute horseradish!

MarkW
Reply to  Whiskey
October 11, 2018 11:35 am

Wow, shooting down a claim I never made.
How alarmist of you.

Patrick MJD
October 7, 2018 6:24 pm
Whiskey
October 7, 2018 6:44 pm

Boy, almost 200 comments in and no one critical of the post. Not one skeptic among you. (Not that I really ever believed that, but I regress) I should get all your names and addresses for my next paper to enter as reviewers, I would love to have peer review as wimpy as this.

Roger Knights
Reply to  Whiskey
October 7, 2018 7:41 pm

“Boy, almost 200 comments in and no one critical of the post.”

It’s natural for people to jump the gun, based on a first look. But it’s been only 14 hours since publication here, not enough time for skeptics to screw in their loupes. Give it time.

michael hart
Reply to  Roger Knights
October 7, 2018 10:04 pm

I think Whiskey has “regressed” further than he/she thinks.

Maybe not all readers could afford to pay to buy the whole article before a free link was provided in the comments. However, I’m sure Whiskey will be delighted to learn that the word “measurments” on page 203 is widely considered to be an incorrect spelling.

John Tillman
Reply to  michael hart
October 7, 2018 10:13 pm

I might have given him too much credit. I thought he intentionally said “regressed” instead of “digressed” in an attempt at humor.

Editor
Reply to  Whiskey
October 7, 2018 9:13 pm

Between all the posts on the skeptic blogs about HadCRUT and other databases having adjustments that all made warming look worse than it was, and all the text in the README file, the result is not surprising.

The only obvious shortcoming I see is that McLean didn’t find all the errors.

Reply to  Whiskey
October 7, 2018 10:56 pm

Perhaps Whiskey could review it and point out where McLean has erred?

John Endicott
Reply to  Whiskey
October 11, 2018 11:17 am

Boy, almost 200 comments in and no one critical of the post

so where is your critical analysis of the post? Rather than bitch about everyone else not doing the work for you, why don’t you do the work yourself? hmmm?

MarkW
Reply to  Whiskey
October 11, 2018 11:38 am

In Whiskey’s “mind”, if I can use the term that loosely, a skeptic is someone who rejects everything.

Roger Knights
October 7, 2018 7:01 pm

ATTENTION! Jo Nova terms this news DATAGATE, which term I suggest that we adopt.

Roger Knights
October 7, 2018 7:31 pm

The head post and comments at Jo Nova’s site are outstanding. (She says she and her husband have been in touch with McLean for six months, so she had time to prepare a lengthy head post.) http://joannenova.com.au/2018/10/first-audit-of-global-temperature-data-finds-freezing-tropical-islands-boiling-towns-boats-on-land/

Robert from oz
October 7, 2018 7:33 pm

This is how the ABC have reported the news .

http://www.abc.net.au/news/science/2018-10-08/ipcc-climate-change-report/10348720

And for those not familiar with OZ we promise never to dig the Great Barrier Reef up for coal again .

Editor
Reply to  Robert from oz
October 8, 2018 5:56 am

That’s completely on the IPCC report release, different story.

Roger Knights
October 7, 2018 8:38 pm

Our minders and binders,
Whose blinders betray,
Be fault-finders, stem-winders—
No truth-finders they.

Percy Jackson
October 7, 2018 10:37 pm

My question would be “so what”? Certainly it is important to audit the quality of any dataset, but it is hard to see what is new here. Anybody who cared could have downloaded the data and realised that the coverage was sparse, especially in the Southern Hemisphere before 1950.

However what nobody has shown is that any of these issues change the results? The HadCRuT4 data series comes with associated errors. Nowhere does McLean show that the error estimates given in the official database are wrong. In addition, McLean states in his thesis that the errors do not suggest a systematic bias and that they are probably normally distributed. So at most the conclusion would be that we are less certain about the warming trend over the past century than we thought. Which also leads to the possibility that the earth has warmed more than we think.

Anthony Banton
Reply to  Percy Jackson
October 8, 2018 1:48 am

“However what nobody has shown is that any of these issues change the results? The HadCRuT4”

No they haven’t, as Nick Stokes explained above ….
“OK. This is no BOMBSHELL. These are errors in the raw data files as supplied by the sources named. The MO publishes these unaltered, as they should. But they perform quality control before using them”

Scott Bennett
Reply to  Percy Jackson
October 8, 2018 3:59 am

No, he stated that:

“The variation in coverage over time is a potential source of systemic (rather than random) errors that the process of averaging cannot remove. If temperature variation trends are not uniform across the globe changes in coverage will potentially cause a misrepresentation of the global average trend. ”

They do “change the result” because the IPCC base period 1850–1900 is of limited use because for almost all of the period from 1850 to 1950 the coverage of the Earth’s surface was less than 50%. It matters specifically because the all-important base period has zero reliability and anything before it even less. If you think one station is a perfectly reasonable way to estimate a hemisphere before 1850 then all is fine with the world.

Anthony Banton
Reply to  Scott Bennett
October 8, 2018 4:23 am

“…. limited use because for almost all of the period from 1850 to 1950 the coverage of the Earth’s surface was less than 50%.”

https://www.metoffice.gov.uk/hadobs/crutem4/data/diagnostics/global/nh+sh/index.html

But of some use because they give uncertainties.

Percy Jackson
Reply to  Scott Bennett
October 8, 2018 10:50 am

Scott,
One station is clearly not a perfect way to estimate temperatures but, short of constructing a time machine and going back in time to install more, that is all we have. So the options are to (a) make the best estimate possible with the data we have or (b) give up and make the illogical leap of claiming that if we don’t know the temperature in the past then the earth isn’t warming.

John Endicott
Reply to  Percy Jackson
October 11, 2018 11:14 am

No, Percy, the answer is (c): admit that we don’t know the temperature of the past (due to all the various ways in which past data is lacking) and make damn sure we are getting good data *now* so that in the future we can have a better understanding of the temperature. Until we have good data, we can’t leap to the conclusion that you want to. Garbage in, garbage out applies.

MarkW
Reply to  Percy Jackson
October 11, 2018 11:46 am

The correct answer is that while it may be warming, there’s no way we can say by how much and absolutely no way we can prove that the warming is more than would naturally be occurring.

I really find it fascinating how the best estimate, after fixing all these errors and admitting that the coverage is woefully inadequate, somehow comes out to be 10 times more accurate than the best thermometer used.

D. J. Hawkins
Reply to  Percy Jackson
October 11, 2018 11:47 am

(b) is a better choice than (a). And it’s hardly “illogical” at all. If you don’t know the temperature in the past, you CAN’T say the earth is warming.

If policy decisions with trillions of dollars are at stake, it’s better to do nothing and keep on as is than to cripple the global economy chasing a ghost. Now that we know the IPCC is suggesting a carbon tax as high as $27,000/ton of oil equivalent, we can see the true lay of the land. At that level, there won’t be any effect on the global economy because there won’t be a global economy.

Blunderbunny
October 7, 2018 11:05 pm

I’m happy he got his thesis out the door and I’m happy with his choice in terms of datasets, but much of this is hardly news or indeed novel. I guess what I’m saying is that I’d have liked him to have found more. We already knew much of this and so far that hasn’t stopped them, so for those hoping for a smoking gun, I think the wait continues.
