UPDATE – BOMBSHELL: audit of global warming data finds it riddled with errors

I’m bringing this back to the top for discussion, mainly because Steven Mosher was being a cad in comments, wailing about “not checking”, claiming McLean’s PhD thesis was “toast”, while at the same time not bothering to check himself. See the update below. – Anthony


Just ahead of a new report from the IPCC, dubbed SR15 and about to be released today, we have this bombshell: a detailed audit shows the surface temperature data is unfit for purpose. The first ever audit of the world’s most important temperature data set (HadCRUT4) has found it to be so riddled with errors and “freakishly improbable data” that it is effectively useless.

From the IPCC:

Global Warming of 1.5 °C, an IPCC special report on the impacts of global warming of 1.5 °C above pre-industrial levels and related global greenhouse gas emission pathways, in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty.

This is what consensus science brings you – groupthink with no quality control.

HadCRUT4 is the primary global temperature dataset used by the Intergovernmental Panel on Climate Change (IPCC) to make its dramatic claims about “man-made global warming”. It’s also the dataset at the center of “ClimateGate” from 2009, managed by the Climatic Research Unit (CRU) at the University of East Anglia.

The audit finds more than 70 areas of concern about data quality and accuracy.

But according to an analysis by Australian researcher John McLean, it’s far too sloppy to be taken seriously even by climate scientists, let alone a body as influential as the IPCC or by the governments of the world.

Main points:

  • The Hadley data is one of the most cited, most important databases for climate modeling, and thus for policies involving billions of dollars.
  • McLean found freakishly improbable data, systematic adjustment errors, large gaps where there is no data, location errors, Fahrenheit temperatures reported as Celsius, and spelling errors.
  • Almost no quality control checks have been done: outliers that are obvious mistakes have not been corrected – one town in Colombia spent three months in 1978 at an average daily temperature of over 80 degrees C. One town in Romania stepped out of summer in 1953 straight into a month of spring at minus 46°C. These are supposedly “average” temperatures for a full month at a time. St Kitts, a Caribbean island, was recorded at 0°C for a whole month, not once but twice!
  • Temperatures for the entire Southern Hemisphere in 1850 and for the next three years are calculated from just one site in Indonesia and some random ships.
  • Sea surface temperatures represent 70% of the Earth’s surface, but some measurements come from ships which are logged at locations 100km inland. Others are in harbors which are hardly representative of the open ocean.
  • When a thermometer is relocated to a new site, the adjustment assumes that the old site was always built up and “heated” by concrete and buildings. In reality, the artificial warming probably crept in slowly. By correcting for buildings that likely didn’t exist in 1880, old records are artificially cooled. Adjustments for a few site changes can create a whole century of artificial warming trends.

Details of the worst outliers

  • For April, June and July of 1978 Apto Uto (Colombia, ID:800890) had average monthly temperatures of 81.5°C, 83.4°C and 83.4°C respectively.
  • The monthly mean temperature in September 1953 at Paltinis, Romania is reported as -46.4 °C (in other years the September average was about 11.5°C).
  • At Golden Rock Airport, on the island of St Kitts in the Caribbean, mean monthly temperatures for December in 1981 and 1984 are reported as 0.0°C. But from 1971 to 1990 the average in all the other years was 26.0°C.

More at Jo Nova


The report:

Unfortunately, the report is paywalled. The good news is that it’s a mere $8.

The researcher, John McLean, did all the work on his own, so it is a way to get compensated for all the time and effort put into it. He writes:

This report is based on a thesis for my PhD, which was awarded in December 2017 by James Cook University, Townsville, Australia. The thesis was based on the HadCRUT4 dataset and associated files as they were in late January 2016. The thesis identified 27 issues of concern about the dataset.

The January 2018 versions of the files contained not just updates for the intervening 24 months, but also additional observation stations and consequent changes in the monthly global average temperature anomaly right back to the start of data in 1850.
The report uses January 2018 data and revises and extends the analysis performed in the original thesis, sometimes omitting minor issues, sometimes splitting major issues and sometimes analysing new areas and reporting on those findings.

The thesis was examined by experts external to the university, revised in accordance with their comments and then accepted by the university. This process was at least equivalent to “peer review” as conducted by scientific journals.

I’ve purchased a copy, and I’ve reproduced the executive summary below. I urge readers to buy a copy and support this work.

Get it here:

Audit of the HadCRUT4 Global Temperature Dataset


EXECUTIVE SUMMARY

As far as can be ascertained, this is the first audit of the HadCRUT4 dataset, the main temperature dataset used in climate assessment reports from the Intergovernmental Panel on Climate Change (IPCC). Governments and the United Nations Framework Convention on Climate Change (UNFCCC) rely heavily on the IPCC reports so ultimately the temperature data needs to be accurate and reliable.

This audit shows that it is neither of those things.

More than 70 issues are identified, covering the entire process from the measurement of temperatures to the dataset’s creation, to data derived from it (such as averages) and to its eventual publication. The findings (shown in consolidated form in Appendix 6) even include simple issues of obviously erroneous data, glossed-over sparsity of data, significant but questionable assumptions and temperature data that has been incorrectly adjusted in a way that exaggerates warming.

It finds, for example, an observation station reporting average monthly temperatures above 80°C, two instances of a station in the Caribbean reporting December average temperatures of 0°C and a Romanian station reporting a September average temperature of -45°C when the typical average in that month is 10°C. On top of that, some ships that measured sea temperatures reported their locations as more than 80km inland.

It appears that the suppliers of the land and sea temperature data failed to check for basic errors and the people who create the HadCRUT dataset didn’t find them and raise questions either.

The processing that creates the dataset does remove some errors, but it uses a threshold derived from two values calculated from part of the data, and errors weren’t removed from that part before those two values were calculated.
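The report doesn’t spell out the exact screening procedure, but the general failure mode is easy to sketch: if the statistics used to set the rejection threshold are computed from data that still contains the errors, the threshold inflates and the errors sail through. A minimal illustration with hypothetical numbers (this is not HadCRUT code):

```python
# Sketch (not the actual HadCRUT procedure): why a screening threshold
# computed from contaminated data can fail to catch the very errors
# it is meant to remove.

def screen(values, k=3.0):
    """Keep values within k standard deviations of the mean, where the
    mean and standard deviation are computed from the values themselves,
    errors included."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [v for v in values if abs(v - mean) <= k * sd]

# Monthly means for a tropical station, plus one obvious error (81.5 °C):
temps = [26.1, 26.3, 25.9, 26.4, 26.0, 26.2, 81.5]

# The 81.5 reading inflates both the mean and the standard deviation,
# so the threshold widens enough that the error passes the screen.
print(screen(temps))
```

Had the mean and standard deviation been computed from error-free data first, the 81.5 °C value would sit dozens of standard deviations out and be rejected instantly.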

Data sparsity is a real problem. The dataset starts in 1850, but for just over two years at the start of the record the only land-based data for the entire Southern Hemisphere came from a single observation station in Indonesia. At the end of five years just three stations reported data in that hemisphere. Global averages are calculated from the averages for each of the two hemispheres, so these few stations have a large influence on what’s supposedly “global”. Related to the amount of data is the percentage of the world (or hemisphere) that the data covers. According to the method of calculating coverage for the dataset, 50% global coverage wasn’t reached until 1906, and 50% of the Southern Hemisphere wasn’t reached until about 1950.
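The hemispheric averaging described above can be sketched in a few lines. The anomaly values here are invented for illustration (they are not the real 1850s readings), but the arithmetic shows why a lone Southern Hemisphere station carries half the weight of the “global” number:

```python
# Illustrative sketch (invented anomaly values, not real 1850s data):
# the global mean is the mean of the two hemispheric means, so a lone
# Southern Hemisphere station supplies 50% of the "global" figure.

nh_stations = [0.31, 0.28, 0.35, 0.25]   # several NH anomalies (°C)
sh_stations = [-0.80]                    # the single SH station

nh_avg = sum(nh_stations) / len(nh_stations)
sh_avg = sum(sh_stations) / len(sh_stations)   # just the one reading

# One bad or unrepresentative SH reading moves the "global" average
# by half its own error, no matter how many NH stations there are.
global_avg = (nh_avg + sh_avg) / 2
print(global_avg)
```

Adding a hundred more Northern Hemisphere stations would not dilute that single station’s influence at all.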

In May 1861 global coverage was a mere 12% – that’s less than one-eighth. In much of the 1860s and 1870s most of the supposedly global coverage was from Europe and its trade sea routes and ports, covering only about 13% of the Earth’s surface. To calculate averages from this data and refer to them as “global averages” is stretching credulity.

Another important finding of this audit is that many temperatures have been incorrectly adjusted. The adjustment of data aims to create the temperature record that would have resulted if the current observation stations and equipment had always measured the local temperature. Adjustments are typically made when a station is relocated or its instruments or their housing are replaced.

The typical method of adjusting data is to alter all previous values by the same amount. Applying this to situations that changed gradually (such as a growing city increasingly distorting the true temperature) is very wrong, and it leaves the earlier data adjusted by more than it should have been. Observation stations might be relocated multiple times, and with all previous data adjusted at each relocation, the very earliest data might end up far below its correct value, and the complete data record can show an exaggerated warming trend.
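The mechanism is worth making concrete. The numbers below are hypothetical (a flat true climate and a linear urban bias, not figures from the report), but they show how a single step adjustment for a gradually acquired bias over-cools the early record:

```python
# Sketch (hypothetical numbers, not from the report): how a one-step
# adjustment for a gradually acquired urban warm bias over-cools old data.

years = list(range(1880, 1981))          # 101 years of records
true_temp = [15.0] * len(years)          # assume a flat true climate

# Urban heat creeps in linearly, reaching +1.0 °C by 1980.
urban_bias = [1.0 * (y - 1880) / 100 for y in years]
measured = [t + b for t, b in zip(true_temp, urban_bias)]

# Step adjustment at relocation in 1980: subtract the full +1.0 °C
# from every earlier value, as if the bias had always been present.
adjusted = [m - 1.0 for m in measured]

# The 1880 record, which had no urban bias at all, is now 1.0 °C too
# cold, so the adjusted series shows a spurious warming trend.
print(adjusted[0], adjusted[-1])   # 14.0 15.0
```

In this toy example the true climate never changed, yet the adjusted record shows a full degree of warming over the century; repeat the step adjustment at several relocations and the early record is pushed down further each time.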

The overall conclusion (see chapter 10) is that the data is not fit for global studies. Data prior to 1950 suffers from poor coverage and very likely multiple incorrect adjustments of station data. Data since that year has better coverage but still has the problem of data adjustments and a host of other issues mentioned in the audit.

Calculating the correct temperatures would require a huge amount of detailed data, time and effort, which is beyond the scope of this audit and perhaps even impossible. The primary conclusion of the audit is, however, that the dataset shows exaggerated warming and that global averages are far less certain than has been claimed.

One implication of the audit is that climate models have been tuned to match incorrect data, which would render incorrect their predictions of future temperatures and their estimates of the human influence on temperatures.

Another implication is that the proposal that the Paris Climate Agreement adopt 1850-1899 averages as “indicative” of pre-industrial temperatures is fatally flawed. During that period global coverage is low – it averages 30% across that time – and many land-based temperatures are very likely to be excessively adjusted and therefore incorrect.

A third implication is that even if the IPCC’s claim that mankind has caused the majority of warming since 1950 is correct then the amount of such warming over what is almost 70 years could well be negligible. The question then arises as to whether the effort and cost of addressing it make any sense.

Ultimately it is the opinion of this author that the HadCRUT4 data, and any reports or claims based on it, do not form a credible basis for government policy on climate or for international agreements about supposed causes of climate change.


Full report here


UPDATE: 10/11/18

Some commenters on Twitter, and also here, including Steven Mosher (who said McLean’s thesis/PhD was “toast”), seem to doubt that he was actually allowed to submit his thesis, and/or that it was accepted, thus negating his PhD. To that end, here is the proof.

McLean’s thesis appears on the James Cook University website: “An audit of uncertainties in the HadCRUT4 temperature anomaly dataset plus the investigation of three other contemporary climate issues”, submitted for a Ph.D. in physics from James Cook University (2017).

And, he was in fact awarded a PhD by JCU for that thesis.

Larry Kummer of Fabius Maximus directly contacted the University to confirm his degree. Here is the reply.

ADDED:

JOHN MCLEAN here.
For Mr Mosher,

I don’t insult and I don’t accuse without investigation. And if I don’t know I try to ask.

(a) Data files
If you want copies of the data that I used in the audit, as they were when I downloaded them in January, go to web page https://robert-boyle-publishing.com/audit-of-the-hadcrut4-global-temperature-dataset-mclean-2018/ and just scroll down.

Or download the latest versions of the files for yourself from the CRU and Hadley Centre, namely https://crudata.uea.ac.uk/cru/data/temperature/ and https://www.metoffice.gov.uk/hadobs/hadsst3/data/download.html. (The fact that the file names are always the same, which is confusing, is one of the findings of the audit.)

(b) Apto Uto not used? Figure 6.3 shows that it is used; the lower-than-expected spikes are because of other stations in the same grid cell, and the value of the cell is the average anomaly for all such stations.

(c) What stations are used and what are not?
The old minimum of 20 years of the 30 from 1961 to 1990 was dropped a few HadCRUT versions back. It then went to 15 years with no more than 5 missing in any decade. HadCRUT4 reduced it again to 14.
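McLean’s description of the inclusion rule can be sketched as a simple check. This is a reconstruction from his comment, not CRU’s actual code, and the exact interaction of the year minimum with the per-decade limit is an assumption:

```python
def baseline_ok(years_with_data, min_years=14, max_missing_per_decade=5):
    """Sketch of a HadCRUT4-style inclusion test: does a station have
    enough data in the 1961-1990 baseline period? (Reconstructed from
    McLean's description; not the official CRU implementation.)"""
    baseline = set(range(1961, 1991))
    have = set(years_with_data) & baseline
    if len(have) < min_years:
        return False
    # The earlier 15-year rule also capped missing years per decade at 5.
    for start in (1961, 1971, 1981):
        decade = set(range(start, start + 10))
        if len(decade - have) > max_missing_per_decade:
            return False
    return True

# A station reporting 1961-1985: 25 years, at most 5 missing per decade.
print(baseline_ok(range(1961, 1986)))   # True
# A station reporting only 1961-1974 misses 1981-1990 entirely.
print(baseline_ok(range(1961, 1975)))   # False
```

Lowering `min_years` from 20 to 14, as the comment describes, admits stations with barely half the baseline period observed, which widens the uncertainty of the anomalies computed against that baseline.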

best wishes

John

October 7, 2018 8:31 am

Considering that climate models have failed, and continue to fail, to show what temperatures are doing, climate modelers should embrace this research and publicize it far and wide.

They can now claim their models are NOT wrong, but that the historical data used as input was (and they are not at fault). “Our output was wrong, but we are still right.” They could then re-run the models with different data that reduces the short-term temperature increases, but keeps the longer-term, steeper upward trajectory. They may even be able to claim, “it’s worse than we thought.”

Frankly, though, I don’t see them as sufficiently intelligent to use this gambit to stay relevant in the debate.

DonK31
October 7, 2018 8:32 am

Now we know why Phil Jones didn’t want to release his data and methods…it’s so easy to find something wrong with them.

Rod Everson
October 7, 2018 9:01 am

As a general reader, I found the explanation of how making site adjustments resulted in lowering older temperature records incorrectly to be one of the most interesting points. Tony Heller has been printing graphs for years now that show how local records of past temperatures have been consistently adjusted downward. If the reason for those downward adjustments can be shown to be primarily due to the obviously incorrect process described in this thesis, then that should be a major story in itself.

But is that the case, or are there many other reasons for adjustments always seeming to cool the past? If not, someone should write a paper exposing the fraud, for fraud it would be. Anyone thinking it happened as a result of an innocent mistake or miscalculation hasn’t been paying attention the past few years.

2hotel9
October 7, 2018 9:03 am

Errors? That’s their story and they’re sticking to it.

E J Zuiderwijk
October 7, 2018 9:08 am

Quack data to be used by quacks. It figures.

Whiskey
October 7, 2018 9:15 am

Is this the same John McLean that predicted (in early 2011) “it is likely that 2011 will be the coolest year since 1956 or even earlier” ???

Reply to  Whiskey
October 7, 2018 9:27 am

Whiskey

Nah, that was the John McLean in Die Hard.

Phoenix44
Reply to  Whiskey
October 7, 2018 9:35 am

And your point is?

That because you cannot attack the work you attack the man, thus proving once again the Ad Hom fallacy that alarmists love so much.

clipe
Reply to  Phoenix44
October 7, 2018 5:39 pm

” HotScot
October 7, 2018 at 9:23 am

Dodgy Geezer

But to be a little more positive, the hits on CAGW just keep coming and there are lots of young journalists and politicians waiting to pounce, and make a name for themselves.

Public opinion is waning, the scandal of wasted money is becoming obvious, the under-performance of Germany’s energy policy is being recognised, the withdrawal of renewable subsidies is coming home to roost and the disregard of anything climate related by the Chinese by planning and building ~1,200 coal fired power stations is making people sit up and think.

No one likes Trump (allegedly) yet his policies are seeing America grow, whilst the Paris agreement and the IPCC are largely recognised as excuses for a knees up for the bureaucrats we Brits hate so much (and most other countries). The Kavanaugh fiasco is recognised as a political hatchet job by the left that’s failed miserably and will, I’m sure, engender yet more support for Trump.

In short, we sceptics just cant stop winning and the levee will eventually break when someone recognises there’s a name to be made by vilifying the green blob for all the damage it’s done to the world.”

Whiskey
Reply to  Whiskey
October 7, 2018 6:02 pm

Not sure if you know what “ad hominem” means, but it does not mean going after what someone has said or done.
Here’s some more non ad hominem for you:
https://www.skepticalscience.com/John_McLean_arg.htm

MarkW
Reply to  Whiskey
October 7, 2018 6:28 pm

Once again, the warmists have to re-define the language in order to try and change the subject.

Skeptical Science? Really, is that the best you can do? Might as well quote Dr. Seuss.

Roger Knights
Reply to  Whiskey
October 7, 2018 6:40 pm

“Not sure if you know what “ad hominem” means, but it does not mean going after what someone has said or done.”

Correct. Other misuses of philosophical terms are “begs the question” (misused 95% of the time) and “appeal to authority” (ditto).

Scott Bennett
Reply to  Whiskey
October 7, 2018 9:51 pm

==>Whiskey

Not sure if you know what “ad hominem” means, but it does not mean going after what someone has said or done. – Whiskey

What? You went straight after the man and not his argument!

Is this the same John McLean that predicted.. – Whiskey

The only time criticism of the person is not an ad hominem argument is if a person’s merits are actually the topic of the argument! You went straight to his credibility and that is attacking the man! You have confused fallacious reasoning with criticism. You went straight to “going after what someone has said or done” and that is the very definition of argument ad hominem*.

You unwittingly applied a typical form of psychological priming to “poison the well”, a subtle use of ad hominem to influence the views of spectators.

*A fallacious argumentative strategy whereby genuine discussion of the topic at hand is avoided by instead attacking the character, motive, or other attribute of the person making the argument, or persons associated with the argument, rather than attacking the substance of the argument itself. – Wikipedia

scross
Reply to  Whiskey
October 12, 2018 9:44 am

I find it a bit odd that of the eight links provided there under “Climate myths by McLean”, only one of those links (the last one) actually references him by name.

Reply to  Whiskey
October 12, 2018 6:18 am

That was a pretty bad prediction if so, but not sure it was the same guy. Anyways, he’d be in good company with Hansen and others…

Hubert Lamb, Director of CRU, Sep 8 1972: “We are past the best of the inter-glacial period which happened between 7,000 and 3,000 years ago… we are on a definite downhill course for the next 200 years….The last 20 years of this century will be progressively colder.” http://news.google.com/newspapers?nid=336&dat=19720908&id=AiwcAAAAIBAJ&sjid=0VsEAAAAIBAJ&pg=5244,2536610

John Firor, Executive Director of NCAR, 1973: “Temperatures have been high and steady, and steady has been more than high. Now it appears we’re going into a period where temperature will be low and variable, and variable will be more important than low.”

scross
Reply to  TallDave
October 12, 2018 9:33 am

Given that the prediction isn’t an actual quote from McLean, but rather came from the writer of a media statement (see below); and given the oddity of the prediction; and given that the quick-and-dirty review of potentially relevant documents which I did didn’t turn up any evidence to back it up anyway, I have to assume that this was simply a misunderstanding on the part of the person who wrote the media statement. (Such a situation isn’t uncommon.) I didn’t dig too deeply into it, though, so I could be wrong.

John McLean: Statement: COOL YEAR PREDICTED: Updated with LATEST GRAPH

http://climaterealists.com/index.php?id=7349

John McLean
Reply to  Whiskey
October 19, 2018 5:08 am

Lordy, lordy, lordy! You mean that researchers can’t test hypotheses by making predictions and seeing if they come true or not?
And please tell us all how the many predictions made using climate models have turned out.

TomRude
October 7, 2018 9:16 am

If the climatic alarm was true, the very first task of scientists involved would have been to set up a tight grid of new stations overlapping the best existing ones and let the data flow in for the past 30 years then start to make sense of temperature, pressure, humidity etc…
Instead, algorithms, computer models, a complete dismissal of climatologists/geographers’ knowledge -see the Leroux versus Legras & consorts- and scientactivist media campaigns replaced the search for a diagnostic, away from politics.

Alan Tomalty
Reply to  TomRude
October 8, 2018 1:44 am

No need. The UAH satellite temperature data set is the only one that both sides trust. Everybody drools near the end of every month waiting for it to come out on the 2nd day of the next month. This dataset is now where the climate wars are fought because the alarmists don’t have any other credible data that they can point to. Eventually even the UAH dataset will crumble the alarmist sand castle as the daily tides have to always come back in.

Antony Banton
Reply to  Alan Tomalty
October 8, 2018 2:10 am

An outlier, by definition, is not the most likely……

http://postmyimage.com/img2/510_Tropospheretrends.png

Especially as neither RSS nor UAH is consistent with the sensor on the previous satellite that was superseded in 1998.
UAH says the present one is the correct one; pragmatically, RSS says we don’t know and splits the difference….

It is one instrument having taken over from the previous instrument, measuring in any case a depth of the troposphere and missing the surface, where the majority of warming is taking place over land.

John Tillman
Reply to  Antony Banton
October 8, 2018 10:08 am

Antony,

Don’t you think that a warming surface ought to warm the troposphere?

The GHE hypothesis supposes that a troposphere warmed by slowing down the migration of heat toward space will warm the surface. If the surface is warming before and faster than the troposphere, then the GHE hypothesis is falsified.

tty
Reply to  John Tillman
October 11, 2018 11:07 am

“The GHE hypothesis supposes that a troposphere warmed by slowing down the migration of heat toward space will warm the surface.”

No it doesn’t. Read:

https://wattsupwiththat.com/2018/10/09/richard-lindzen-lecture-at-gwpf-global-warming-for-the-two-cultures/

For a lucid explanation of what GHE really is.

John Tillman
Reply to  John Tillman
October 11, 2018 11:27 am

Tty,

I was going with the official US government version of the GHE. That doesn’t mean I agree with it. In fact, I agree with Lindzen’s version of the hypothesis. But my point was that, given this view of the GHE, observations don’t support that whatever warming has occurred is due to such an effect.

Lindzen says that water vapor and other greenhouse gases elevate “the emission level, and because of the convective mixing, the new level will be colder. This reduces the outgoing infrared flux, and, in order to restore balance, the atmosphere would have to warm.”

NASA, for example, by contrast explains the GHE thusly:

“A layer of greenhouse gases – primarily water vapor, and including much smaller amounts of carbon dioxide, methane and nitrous oxide – acts as a thermal blanket for the Earth, absorbing heat and warming the surface to a life-supporting average of 59 degrees Fahrenheit (15 degrees Celsius). Most climate scientists agree the main cause of the current global warming trend is human expansion of the “greenhouse effect” 1 — warming that results when the atmosphere traps heat radiating from Earth toward space. Certain gases in the atmosphere block heat from escaping.”

https://climate.nasa.gov/causes/

John Tillman
Reply to  John Tillman
October 11, 2018 11:54 am

The official US government and IPCC hypothesis could I guess be called retarded, since it supposes that more CO2 retards the movement of heat from the surface to space.

Red94ViperRT10
October 7, 2018 9:39 am

“…the surface temperature data is unfit for purpose…”

Gee, Anthony, where have we heard that before? https://wattsupwiththat.com/2012/07/29/press-release-2/

StephenP
October 7, 2018 9:45 am

Watch your back John McLean, you will have upset a very big applecart.

John McLean
Reply to  StephenP
October 19, 2018 5:09 am

But I was told that sacred cows make the best hamburgers.

Earthling2
October 7, 2018 9:52 am

Welcome to the Adjustocene, where if we don’t know what the historical temperatures were, we make them up. This fact has to be driven home over and over again: we don’t have a very good, reliable data set for most of the 19th century and much of the 20th century. Knowing that, it is only fitting that we accept a wider margin of error for what that hypothesized data might be, with a caveat that going forward the error bars on newer data can be somewhat tightened as we gather more accurate data from more of the surface of the earth. That means that 19th century data worldwide is speculative at best, and manufactured at worst. It really doesn’t mean much other than that we know it was still fairly cold after the previous 500 years of a cooling trend from the LIA, which we know with some certainty was much colder than any previous historical normal. That makes 1850 colder than any historic normal as a starting point for this current exercise. Adding 1.5 C to a really cold beginning doesn’t even allow for much natural variability.

If the IPCC wants credibility, then it should at least be honest with itself about the data it does have. Plus, it would be more reasonable if the threshold for dangerous warming were set at 1950 going forward, instead of some mythical temperature from 1850 at the tail end of the LIA, one of the coldest periods in the Holocene to date. That should be noted: it was a fairly cold time in the history of the world. If we do see long term temperatures trending 2 C higher over the next 30+ years to 2050, then that should be the basis for taking any kind of action with regards to limiting the economic output of the world, by limiting CO2 and other GHG production in the future, if it is demonstrated that GHGs are indeed a significant factor.

So far in the 21st century, temperatures seem to be within an acceptable range of error, in fact a global hiatus or a pause in any significant warming in these first 18-19 years of the 21st century indicates that any temperature increase is not linear with CO2 concentrations in the atmosphere. So let’s allocate resources to collecting honest and accurate weather and climate data so that wise decision making can be implemented in the next 30 years. We are just very early yet in declaring any emergency, and it hasn’t been demonstrated that any real significant threat has been identified, other than much of the very populous world is just not ready for any kind of normal inclement weather which is what leads to alarmism in general. Perhaps that is where any resources are first spent, which is hardening our defences to inclement weather.

Peter Morris
October 7, 2018 9:53 am

Holy cow you weren’t kidding. This is HUGE news. I knew the dataset was sparse in the 19th century, but that is ridiculous! There’s really no reason to trust HadCRUT until 1950 at the earliest.

I’m sure the response will be measured and sober.

Dodgy Geezer
Reply to  Peter Morris
October 7, 2018 10:32 am

…I’m sure the response will be measured and sober…

What response? This will simply be ignored.

Roger Knights
Reply to  Peter Morris
October 7, 2018 6:45 pm

Delingpole says that McLean says that of the 0.6 warming since 1950, 0.2 is likely exaggerated.

October 7, 2018 9:56 am

John McLean has been a WUWT guest blogger or has indirectly supplied article information a number of times before, and has always been instructive.

Some of his previous contributions:


“Reckless commitments to the Paris Climate Agreement, November 10, 2017”
“Friday Funny: more upside down data, March 25th, 2016; through an article by Bishop Hill where Jon McLean asked for a lookover”
“Hadley Climate data has been “corrected” thanks to alert climate skeptic, April 11th, 2016″

Bruce Cobb
October 7, 2018 10:01 am

I am 97% sure that the errors almost always favoring warming is a complete coincidence.

Reply to  Bruce Cobb
October 12, 2018 6:20 am

I have peer reviewed this statement and therefore it must be true.

MarkW
Reply to  TallDave
October 12, 2018 7:51 am

I have peer reviewed your peer review and find that I have no reason to object to it being published.

Timo Soren
October 7, 2018 10:04 am

Back in 2005 McIntyre posted a comment from P. Jones.

Why should I make the data available to you, when your aim is to try and find something wrong with it.

Well yes, we do want to look at it, and of course McIntyre was completely correct about the need to look!

Adam Gallon
Reply to  Timo Soren
October 7, 2018 2:22 pm

That was an exchange between Jones & Warwick Hughes. https://climateaudit.org/2005/10/15/we-have-25-years-invested-in-this-work/

Ristvan
October 7, 2018 10:16 am

It has been obvious for quite a while that the temp data is not fit for climate purpose. See, for example, the essay “When Data Isn’t” in the ebook Blowing Smoke.
Good to have yet another detailed confirmation of that basic fact.

Michael Jankowski
October 7, 2018 10:18 am

It is easy to predict the responses…

(1) errors are minor and make no difference
(2) there are other data sets which independently verify the temperature record
(3) examples of errors presented show readings both too cold and too warm, which would mostly cancel-out as errors often do
(4) McLean has misrepresented his qualifications previously
(5) McLean’s prior works were heavily criticized and/or avoided rigorous peer-review
(6) McLean is an industry shill

etc.

Reply to  Michael Jankowski
October 7, 2018 11:25 am

Yes, except that #4-5 will come first. The climate activists’ first impulse is generally the ad hominem attack. Any reference to actual data comes later, if at all.

Reply to  Dave Burton
October 7, 2018 11:26 am

Oops, typo. I meant #4-6.

MarkW
Reply to  Dave Burton
October 7, 2018 6:29 pm

As proof, read Whiskey’s posts above.

Reply to  Michael Jankowski
October 7, 2018 1:17 pm

Michael Jankowski
October 7, 2018 at 10:18 am

Yes, you’re probably right.

Alternatively they could just say… Look, there are x hundred thousand measurements in the record… of course there are a few errors (we’re only human) – we have always known that, but have concluded that they average out in the end. End of story.

simple-touriste
Reply to  Michael Jankowski
October 11, 2018 6:00 pm

“(2) there are other data sets which independently verify the temperature record”

And there are other accusers of Kavanaugh. There is always some other line of evidence to confirm the current one, which is weaker than expected. Then the new line of evidence is even sillier, but there is another one, which has not yet been DEBUNKED.

Debunking isn’t a word from the universe of science; it is applicable to fake sciences, because making up stuff is pure fraud and needs debunking, not just refutation.

Nik
October 7, 2018 10:19 am

These results mirror Steve McIntyre’s US work and results. Why would anyone expect the UK’s work to be any better?

Reply to  Nik
October 7, 2018 10:41 am

Nik

I think it’s Aussie work isn’t it?

Nik
Reply to  HotScot
October 7, 2018 1:48 pm

HotScot,

I was referring to the HadCRUT4 data set, which I thought is managed/controlled by the Climate Research Unit at the U. of East Anglia (in the UK). The Aussies did the great work of researching and exposing the worthlessness of that CRU dataset. My hat is off to them.

McIntyre exposed the poor quality of the surface stations network that NASA/GISS uses to collect data for a comparable data set.

Apologies for confusing antecedents.

Regards

Editor
Reply to  Nik
October 7, 2018 7:15 pm

Steve McIntyre is, among other things, a statistician. He’s best known for debunking MBH98, aka the Hockey Stick.

Anthony Watts, before he started this blog, was looking at the performance of Cotton Region weather instrument shelters: type of paint, wear and tear, etc. When he discovered that several such stations were part of the USHCN, he started http://surfacestations.org/ to recruit assistants to document more of them (82.5% were eventually logged). The system is operated by NOAA’s NCEI; NASA’s GISS does most of the adjusting and analysis to produce their GISTEMP database.

John Tillman
Reply to  HotScot
October 7, 2018 6:33 pm

Steve is Canadian.

October 7, 2018 10:37 am

How about someone doing an audit of the fundamental, underlying concepts.

How accurate or consistent a method someone has devised to measure the global average length of unicorn hairs is of little consequence, when the reality of unicorns is nil.

Alan Tomalty
Reply to  Robert Kernodle
October 8, 2018 1:54 am

I am going to post soon on the original scientific paper that Hansen did in 1976 on the temperature response of CO2 and other greenhouse gases to doubling. It is called GREENHOUSE EFFECTS DUE TO MAN-MADE PERTURBATIONS OF TRACE GASES. Hansen had been publishing for 10 years before that, but 1976 was key because in the previous decade everybody was worried about global cooling.

October 7, 2018 10:41 am

In the same way that the Villach conference stated that they should disregard all previous historical data, perhaps we should start again and disregard all data prior to 2018, and instead only look at data from satellites, balloons and Argo buoys. Disregard all ground-based readings.

MJE

ralfellis
October 7, 2018 10:46 am

There is another gross error in palaeoclimate too.

It has been assumed that ancient mountain treelines are controlled solely by temperature. But this has resulted in impossible lapse rates, and much head-scratching, because the treelines are far too low. Nevertheless, these low mountain temperatures and high temperature lapse rates have been used in all palaeoclimate models. And these get reflected in modern models too.

In truth, those low mountain treelines were caused by low CO2, not by temperature. So all the historic temperatures used in glacial era climate models are all wrong, and so all those models are wrong too.

But don’t worry, the science is settled…….!

Ralph

tom0mason
October 7, 2018 11:19 am

HADCUT4 = Highly Unlikely Data Confirming Unfeasible Theory 4

richardw
Reply to  tom0mason
October 7, 2018 1:33 pm

How Averaging Down Can Revise Unhelpful Temperatures

Reply to  tom0mason
October 7, 2018 1:35 pm

Tom,

Highly Adjusted Data Confirming …

: )

tom0mason
Reply to  Thomas Mee
October 7, 2018 11:15 pm

It was late when I tried for that, hence the failure.

This morning the BBC has reported on the IPCC’s meeting/report three times in half an hour, with no mention of the basic temperature data being a pile of horse’s excrement.

MarkW
Reply to  Thomas Mee
October 11, 2018 11:29 am

… Unreliable Temperatures

Bruce Cobb
Reply to  tom0mason
October 7, 2018 1:57 pm

AKA HADCRUD.

MarkW
Reply to  tom0mason
October 7, 2018 6:30 pm

Highly Adjusted Data Confirming Unfeasible Theory

John Tillman
Reply to  MarkW
October 7, 2018 6:35 pm

Confirming Refuted Unfeasible Theory.

October 7, 2018 11:21 am

Even the sea-level measurement data contain obvious errors, though coastal sea-level measurement data are certainly of much better quality than the temperature data. E.g., the great magnitude 9.2 Alaska earthquake was on March 27, 1964, but NOAA’s monthly sea-level measurement data for Seward, AK jumped one meter in January 1964, rather than in April. That’s obviously wrong.
http://sealevel.info/MSL_graph.php?id=seward&c_date=1964/4-2019/12&co2=0

Thank you for all you do for sound, trustworthy science, Jo, Anthony, and John McLean.
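[A jump like the Seward one is easy to screen for mechanically: flag any month-to-month change that dwarfs the series’ typical variability. A minimal sketch, using a hypothetical seasonal cycle with a 1 m step at index 24 rather than the actual NOAA record:]

```python
import math

# Synthetic monthly mean-sea-level series (mm): a seasonal wiggle,
# plus a sudden 1 m step at index 24 standing in for the quake.
monthly_msl_mm = [
    20 * math.sin(2 * math.pi * i / 12) + (1000 if i >= 24 else 0)
    for i in range(48)
]

# Month-to-month differences, and the median absolute step as a
# robust measure of "typical" variability.
diffs = [b - a for a, b in zip(monthly_msl_mm, monthly_msl_mm[1:])]
typical = sorted(abs(d) for d in diffs)[len(diffs) // 2]

# Indices (months) whose incoming step is wildly out of scale.
step_months = [i + 1 for i, d in enumerate(diffs) if abs(d) > 10 * typical]
print(step_months)  # → [24]
```

The point is not the threshold (10× the median step is arbitrary) but that such a screen is trivial to run, so an unexplained one-meter jump surviving in a published series is a quality-control failure, not a detection problem.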

William Ward
October 7, 2018 11:28 am

I just purchased the report and downloaded it. I also downloaded all of the data files used in the analysis. 7 large .zip files.

Leitwolf
Reply to  William Ward
October 7, 2018 1:24 pm

Where can you get that from… I mean, the raw data?

William Ward
Reply to  Leitwolf
October 7, 2018 5:04 pm

I see it on this page. It might be available even if you do not purchase the report. I did purchase, so I’m not sure whether the page shows those who have not purchased the same thing it shows me.

https://robert-boyle-publishing.com/product/audit-of-the-hadcrut4-global-temperature-dataset-mclean-2018/

Scroll down near the bottom of the page and you will see 7 links under “DATA USED”. If you don’t see them, then they must be limited to those who purchase. Let us know if you can get it.

Samuel C Cogar
October 7, 2018 11:44 am

The following was excerpted from a January 21, 2015 published commentary by Dr. Tim Ball, which was titled “2014: Among the 3 percent Coldest Years in 10,000 years?”

Challenges and IPCC Fixes

Every alteration, adjustment, amendment and abridgment of the record so far was done to create and emphasize increasingly higher temperatures.

1. The instrumental data is spatially and temporally inadequate. Surface weather data is virtually non-existent and unevenly distributed for 85 percent of the world’s surface. There are virtually none for 70 percent of the oceans. On the land, there is virtually no data for the 19 percent mountains, 20 percent desert, 20 percent boreal forest, 20 percent grasslands, and 6 percent tropical rain forest. In order to “fill-in”, the Goddard Institute for Space Studies (GISS), made the ridiculous claim that a single station temperature was representative of a 1200 km radius region. Initial claims of AGW were based on land-based data. The data is completely inadequate as the basis for constructing the models.

2. Most surface stations are concentrated in eastern North America and Western Europe and became the early evidence for human induced global warming. IPCC advocates ignored, for a long time, the fact that these stations are most affected by the urban heat island effect (UHIE).

Read more http://wattsupwiththat.com/2015/01/21/2014-among-the-3-percent-coldest-years-in-10000-years/
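[For readers unfamiliar with the “1200 km” claim in point 1: the idea is that an empty grid cell can be filled in from a station up to 1200 km away. An illustrative sketch (not GISS’s actual code; the station list and anomalies are hypothetical) of the simplest form of that rule:]

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def fill_cell(cell_lat, cell_lon, stations, radius_km=1200.0):
    """Anomaly of the nearest station within radius_km, else None."""
    best = None
    for lat, lon, anomaly in stations:
        d = haversine_km(cell_lat, cell_lon, lat, lon)
        if d <= radius_km and (best is None or d < best[0]):
            best = (d, anomaly)
    return None if best is None else best[1]

# Hypothetical stations near London and Paris, with made-up anomalies.
stations = [(51.5, -0.1, 0.8), (48.9, 2.4, 0.6)]
print(fill_cell(50.0, 1.0, stations))      # → 0.6 (Paris is nearer)
print(fill_cell(-45.0, -170.0, stations))  # → None (nothing within 1200 km)
```

A 1200 km radius covers roughly 4.5 million square kilometers, which is why Dr. Ball calls representing such a region by a single station ridiculous.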

MarkW
Reply to  Samuel C Cogar
October 7, 2018 2:54 pm

The only areas that come even close to being adequately monitored are also those areas that have been the most extensively modified by humans.

kenw
October 7, 2018 11:46 am

“Groupthink” is a modern oxymoron.
