UPDATE – BOMBSHELL: audit of global warming data finds it riddled with errors

I’m bringing this back to the top for discussion, mainly because Steven Mosher was being a cad in comments, wailing about “not checking”, claiming McLean’s PhD thesis was “toast”, while at the same time not bothering to check himself. See the update below. – Anthony


Just ahead of a new report from the IPCC, dubbed SR15 and due to be released today, we have this bombshell: a detailed audit shows the surface temperature data is unfit for purpose. The first ever audit of the world's most important temperature data set (HadCRUT4) has found it to be so riddled with errors and "freakishly improbable data" that it is effectively useless.

From the IPCC:

Global Warming of 1.5 °C, an IPCC special report on the impacts of global warming of 1.5 °C above pre-industrial levels and related global greenhouse gas emission pathways, in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty.

This is what consensus science brings you – groupthink with no quality control.

HadCRUT4 is the primary global temperature dataset used by the Intergovernmental Panel on Climate Change (IPCC) to make its dramatic claims about "man-made global warming". It's also the dataset at the center of "ClimateGate" from 2009, managed by the Climatic Research Unit (CRU) at the University of East Anglia.

The audit finds more than 70 areas of concern about data quality and accuracy.

But according to an analysis by Australian researcher John McLean, it's far too sloppy to be taken seriously even by climate scientists, let alone by a body as influential as the IPCC or by the governments of the world.

Main points:

  • The Hadley data is one of the most cited, most important databases for climate modeling, and thus for policies involving billions of dollars.
  • McLean found freakishly improbable data, systematic adjustment errors, large gaps where there is no data, location errors, Fahrenheit temperatures reported as Celsius, and spelling errors.
  • Almost no quality control checks have been done: outliers that are obvious mistakes have not been corrected – one town in Colombia spent three months in 1978 at an average daily temperature of over 80 degrees C. One town in Romania stepped out of summer in 1953 straight into a month of spring at minus 46°C. These are supposedly "average" temperatures for a full month at a time. St Kitts, a Caribbean island, was recorded at 0°C for a whole month, and twice!
  • Temperatures for the entire Southern Hemisphere in 1850 and for the next three years are calculated from just one site in Indonesia and some random ships.
  • Sea surface temperatures represent the 70% of the Earth's surface that is ocean, but some measurements come from ships logged at locations 100 km inland. Others are from harbors, which are hardly representative of the open ocean.
  • When a thermometer is relocated to a new site, the adjustment assumes that the old site was always built up and "heated" by concrete and buildings. In reality, the artificial warming probably crept in slowly. By correcting for buildings that likely didn't exist in 1880, old records are artificially cooled. Adjustments for a few site changes can create a whole century of artificial warming trends, as the toy sketch after this list illustrates.
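To see how this plays out, here is a toy Python sketch. Every number is invented for illustration; it is not McLean's code or HadCRUT4's, and only demonstrates the arithmetic of a uniform step adjustment applied to a gradually contaminated record.

```python
# Toy model: a station with NO real climate trend, but slowly growing
# urban warmth, "fixed" by one uniform step adjustment at a site move.
years = list(range(1900, 2000))

# Recorded temps: a true 15.0 C plus urban heat creeping in at 0.01 C/year.
recorded = [15.0 + 0.01 * (y - 1900) for y in years]

# A 1999 site move reveals a ~1 C offset, and the usual remedy subtracts
# it from ALL earlier values in a single step.
adjusted = [r - 1.0 for r in recorded]

# The early record is now ~1 C too cold, so the adjusted series shows a
# century-long warming trend at a site whose true climate was flat.
trend = (adjusted[-1] - adjusted[0]) / (years[-1] - years[0])
print(f"artificial trend: {trend:.3f} C/year")  # ~0.010 C/year
```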

Details of the worst outliers

  • For April, June and July of 1978, Apto Uto (Colombia, ID: 800890) had an average monthly temperature of 81.5°C, 83.4°C and 83.4°C respectively.
  • The monthly mean temperature in September 1953 at Paltinis, Romania is reported as -46.4 °C (in other years the September average was about 11.5°C).
  • At Golden Rock Airport, on the island of St Kitts in the Caribbean, mean monthly temperatures for December in 1981 and 1984 are reported as 0.0°C. But from 1971 to 1990 the average in all the other years was 26.0°C.

More at Jo Nova


The report:

Unfortunately, the report is paywalled. The good news is that it’s a mere $8.

The researcher, John McLean, did all the work on his own, so it is a way to get compensated for all the time and effort put into it. He writes:

This report is based on a thesis for my PhD, which was awarded in December 2017 by James Cook University, Townsville, Australia. The thesis was based on the HadCRUT4 dataset and associated files as they were in late January 2016. The thesis identified 27 issues of concern about the dataset.

The January 2018 versions of the files contained not just updates for the intervening 24 months, but also additional observation stations and consequent changes in the monthly global average temperature anomaly right back to the start of data in 1850.
The report uses January 2018 data and revises and extends the analysis performed in the original thesis, sometimes omitting minor issues, sometimes splitting major issues and sometimes analysing new areas and reporting on those findings.

The thesis was examined by experts external to the university, revised in accordance with their comments and then accepted by the university. This process was at least equivalent to “peer review” as conducted by scientific journals.

I’ve purchased a copy, and I’ve reproduced the executive summary below. I urge readers to buy a copy and support this work.

Get it here:

Audit of the HadCRUT4 Global Temperature Dataset


EXECUTIVE SUMMARY

As far as can be ascertained, this is the first audit of the HadCRUT4 dataset, the main temperature dataset used in climate assessment reports from the Intergovernmental Panel on Climate Change (IPCC). Governments and the United Nations Framework Convention on Climate Change (UNFCCC) rely heavily on the IPCC reports so ultimately the temperature data needs to be accurate and reliable.

This audit shows that it is neither of those things.

More than 70 issues are identified, covering the entire process from the measurement of temperatures to the dataset's creation, to data derived from it (such as averages) and to its eventual publication. The findings (shown in consolidated form in Appendix 6) even include simple issues of obviously erroneous data, glossed-over sparsity of data, significant but questionable assumptions and temperature data that has been incorrectly adjusted in a way that exaggerates warming.

It finds, for example, an observation station reporting average monthly temperatures above 80°C, two instances of a station in the Caribbean reporting December average temperatures of 0°C and a Romanian station reporting a September average temperature of -45°C when the typical average in that month is 10°C. On top of that, some ships that measured sea temperatures reported their locations as more than 80km inland.

It appears that the suppliers of the land and sea temperature data failed to check for basic errors and the people who create the HadCRUT dataset didn’t find them and raise questions either.

The processing that creates the dataset does remove some errors, but it uses a threshold derived from two values calculated from part of the data, and errors were not removed from that part before the two values were calculated.

Data sparsity is a real problem. The dataset starts in 1850 but for just over two years at the start of the record the only land-based data for the entire Southern Hemisphere came from a single observation station in Indonesia. At the end of five years just three stations reported data in that hemisphere. Global averages are calculated from the averages for each of the two hemispheres, so these few stations have a large influence on what's supposedly "global". Related to the amount of data is the percentage of the world (or hemisphere) that the data covers. According to the method of calculating coverage for the dataset, 50% global coverage wasn't reached until 1906 and 50% of the Southern Hemisphere wasn't reached until about 1950.

In May 1861 global coverage was a mere 12% – that’s less than one-eighth. In much of the 1860s and 1870s most of the supposedly global coverage was from Europe and its trade sea routes and ports, covering only about 13% of the Earth’s surface. To calculate averages from this data and refer to them as “global averages” is stretching credulity.

Another important finding of this audit is that many temperatures have been incorrectly adjusted. The adjustment of data aims to create the temperature record that would have resulted if the current observation stations and equipment had always measured the local temperature. Adjustments are typically made when a station is relocated or its instruments or their housing are replaced.

The typical method of adjusting data is to alter all previous values by the same amount. Applying this to situations that changed gradually (such as a growing city increasingly distorting the true temperature) is very wrong, and it leaves the earlier data adjusted by more than it should have been. Observation stations might be relocated multiple times, and with all previous data adjusted each time, the very earliest data might end up far below its correct value and the complete data record might show an exaggerated warming trend.

The overall conclusion (see chapter 10) is that the data is not fit for global studies. Data prior to 1950 suffers from poor coverage and very likely multiple incorrect adjustments of station data. Data since that year has better coverage but still has the problem of data adjustments and a host of other issues mentioned in the audit.

Calculating the correct temperatures would require a huge amount of detailed data, time and effort, which is beyond the scope of this audit and perhaps even impossible. The primary conclusion of the audit is however that the dataset shows exaggerated warming and that global averages are far less certain than has been claimed.

One implication of the audit is that climate models have been tuned to match incorrect data, which would render incorrect their predictions of future temperatures and estimates of the human influence on temperatures.

Another implication is that the proposal that the Paris Climate Agreement adopt 1850-1899 averages as “indicative” of pre-industrial temperatures is fatally flawed. During that period global coverage is low – it averages 30% across that time – and many land-based temperatures are very likely to be excessively adjusted and therefore incorrect.

A third implication is that even if the IPCC’s claim that mankind has caused the majority of warming since 1950 is correct then the amount of such warming over what is almost 70 years could well be negligible. The question then arises as to whether the effort and cost of addressing it make any sense.

Ultimately it is the opinion of this author that the HadCRUT4 data, and any reports or claims based on it, do not form a credible basis for government policy on climate or for international agreements about supposed causes of climate change.


Full report here


UPDATE: 10/11/18

Some commenters on Twitter, and also here, including Steven Mosher, who said McLean's thesis/PhD was "toast", seem to doubt that he was actually allowed to submit his thesis, and/or that it was accepted, thus negating his PhD. To that end, here is the proof.

McLean's thesis appears on the James Cook University website: "An audit of uncertainties in the HadCRUT4 temperature anomaly dataset plus the investigation of three other contemporary climate issues", submitted for a Ph.D. in physics from James Cook University (2017).

And, he was in fact awarded a PhD by JCU for that thesis.

Larry Kummer of Fabius Maximus directly contacted the University to confirm his degree. Here is the reply.

ADDED:

JOHN MCLEAN here.
For Mr Mosher,

I don’t insult and I don’t accuse without investigation. And if I don’t know I try to ask.

(a) Data files
If you want copies of the data that I used in the audit, as they were when I downloaded them in January, go to web page https://robert-boyle-publishing.com/audit-of-the-hadcrut4-global-temperature-dataset-mclean-2018/ and just scroll down.

Or download the latest versions of the files for yourself from the CRU and Hadley Centre, namely https://crudata.uea.ac.uk/cru/data/temperature/ and https://www.metoffice.gov.uk/hadobs/hadsst3/data/download.html. (The fact that the file names are always the same, which is confusing, is one of the findings of the audit.)

(b) Apto Uto not used? Figure 6.3 shows that it is used; the lower-than-expected spikes are because of other stations in the same grid cell, and the value of the cell is the average anomaly for all such stations.

(c) What stations are used and what are not?
The old minimum of 20 years of the 30 from 1961 to 1990 was dropped a few HadCRUT versions back. It then went to 15 years with no more than 5 missing in any decade. HadCRUT4 reduced it again to 14.

best wishes

John
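For readers poking at the files, here is a minimal sketch of the kind of baseline-qualification rule McLean describes in point (c). It is an illustration under stated assumptions, not the CRU code: only the 14-year threshold comes from his note, and the station data is invented.

```python
# Sketch of a baseline-qualification rule: a station needs at least
# 14 years of data inside the 1961-1990 baseline before its anomalies
# are usable. Simplified; not the actual CRU implementation.

def qualifies(yearly_means, min_years=14):
    """yearly_means: dict {year: mean in C, or None} covering 1961-1990."""
    present = [y for y in range(1961, 1991) if yearly_means.get(y) is not None]
    return len(present) >= min_years

def anomaly(value, yearly_means):
    """One observation expressed against the station's own baseline mean."""
    baseline = [v for v in yearly_means.values() if v is not None]
    return value - sum(baseline) / len(baseline)

# Invented station with 16 baseline years:
station = {year: 12.0 + 0.1 * (year % 3) for year in range(1961, 1977)}
print(qualifies(station))                # True under a 14-year minimum
print(round(anomaly(13.0, station), 2))  # ~0.89 C above its own baseline
```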

512 Comments
Dodgy Geezer
October 7, 2018 5:55 am

The last time someone did a PhD thesis which showed up the Climate Change fraud (it was on some tree ring samples, I believe) all the data magically disappeared…..

John Bills
October 7, 2018 5:56 am

And we have this:
Land Surface Air Temperature Data Are Considerably Different Among BEST‐LAND, CRU‐TEM4v, NASA‐GISS, and NOAA‐NCEI
First published: 28 May 2018
https://doi.org/10.1029/2018JD028355

Greg Goodman
Reply to  John Bills
October 7, 2018 6:26 am

It is nonsense to mix land and sea data as an "average", especially if you think this may tell you something about the supposed heating effects of IR radiation.

TEMPERATURES OF DIFFERENT MEDIA ARE NOT FUNGIBLE.

https://judithcurry.com/2016/02/10/are-land-sea-temperature-averages-meaningful/

Land and sea water have heat capacities which differ by a factor of two, meaning land warms faster. Adding the two to get an average biases the result to warm faster than a proper energy-based calculation.

Anyone who does not understand that should not be working on AGW.

As a crude fix, land temps should be weighted 50% less than SST.

Kudos to John McLean for doing this work and managing to get it accepted as his thesis. Well done.

JohnWho
Reply to  Greg Goodman
October 7, 2018 7:18 am

Um, not to mention that “land data” isn’t the temperature of the land at a specific point, it is the temperature of the atmosphere approximately 1 meter above the surface while “sea data” is a measurement of the water temperature at or relatively near the surface.

This isn’t “apples and oranges”, it is more “apples and apes”.

Reply to  JohnWho
October 7, 2018 8:51 pm

“apples and apes” LOL
I’m going to remember that!

nw sage
Reply to  Shelly
October 8, 2018 5:01 pm

Or Apes and what remains of the apples after the ape digests them.

coaldust
Reply to  JohnWho
October 9, 2018 8:32 am

How about “apples and orangutans”?

Gurnsy
Reply to  coaldust
October 11, 2018 3:08 pm

Ohhhh! Very good!

Jeff Alberts
Reply to  Greg Goodman
October 7, 2018 8:45 am

It’s also nonsense to average temperature data from different locations. Intensive properties.

MarkW
Reply to  Jeff Alberts
October 7, 2018 10:48 am

In that case, there is no such thing as temperature, as even a single thermometer is averaging the temperature of millions of individual atoms and molecules.

Harry Passfield
Reply to  MarkW
October 7, 2018 11:57 am

Mosh will be along in a moment to tell us that thermometers are only modelling temps.

Paul Penrose
Reply to  MarkW
October 7, 2018 12:39 pm

No, temperature is a useful concept to express the energy density (due to molecular motion) of a substance. Temperature is inherently quite localized, but depending on the circumstances, a single measurement can represent a large volume. You just need to keep in mind that accuracy will drop as you move further from the measurement point, and that this will vary by substance and circumstance. Unfortunately, many scientists forget, or ignore, these important truths about temperature.

Reply to  MarkW
October 7, 2018 1:21 pm

MODERATOR: some dishonest person is posting comments under my name again.

MarkW
Reply to  MarkW
October 7, 2018 2:49 pm

Paul, I was responding to the writer who claimed that any averaging of temperature was inherently invalid.

It can be done, you just have to account for the uncertainty via the error bars.
The problem with the way the climate scientists do it is that they claim two readings, hundreds of miles apart, are inherently more accurate than either reading individually.

Robert of Ottawa
Reply to  MarkW
October 7, 2018 7:20 pm

No, there is such a thing as temperature, just not "planetary temperature".

Ian W
Reply to  MarkW
October 7, 2018 10:41 pm

Except that the metrics are _not_ average temperature. They are the mean of the highest and lowest temperature in a 24 hour period, which is most certainly not the average. The set of means is then 'averaged' to provide a monthly or annual average, by which time all sense is lost.

Further, the enthalpy of the air is continually changing with its humidity: 100% humid air, say in a misty bayou in Louisiana at 75°F, contains twice as much energy in kilojoules per kilogram as near-zero-humidity air at 100°F in the Arizona desert. As it is 'trapped energy' that the concern is about, that is what should be measured. Temperature is the incorrect metric, and averaging atmospheric temperature is a nonsense.

The entire meteorological exercise shows that climate ‘scientists’ have a very poor grasp of metrology – possibly deliberately so.
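Ian W's bayou-versus-desert comparison is easy to check with the standard psychrometric approximation for the enthalpy of moist air. A minimal sketch, assuming sea-level pressure and textbook constants, with enthalpy measured from a 0°C dry-air reference:

```python
import math

# Specific enthalpy of moist air: h = 1.006*T + q*(2501 + 1.86*T) kJ/kg,
# with T in Celsius and q the mixing ratio in kg/kg. Illustrative only.

def sat_vapor_pressure_kpa(t_c):
    # Magnus approximation for saturation vapor pressure over water.
    return 0.6112 * math.exp(17.62 * t_c / (243.12 + t_c))

def enthalpy_kj_per_kg(t_c, rel_humidity, p_kpa=101.325):
    e = rel_humidity * sat_vapor_pressure_kpa(t_c)  # vapor pressure, kPa
    q = 0.622 * e / (p_kpa - e)                     # mixing ratio, kg/kg
    return 1.006 * t_c + q * (2501.0 + 1.86 * t_c)

def f_to_c(t_f):
    return (t_f - 32.0) * 5.0 / 9.0

print(enthalpy_kj_per_kg(f_to_c(75.0), 1.00))   # humid bayou:  ~72 kJ/kg
print(enthalpy_kj_per_kg(f_to_c(100.0), 0.05))  # dry desert:   ~43 kJ/kg
```

Measured from that common reference, the humid 75°F air carries far more energy per kilogram than the hotter desert air, roughly bearing out the factor-of-two claim.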

Dutch
Reply to  MarkW
October 8, 2018 11:04 am

Well said Mark. You're exactly right. There IS no such thing as temperature. It's just a unitless index of heat. The 'units' are just the name of the guy who came up with the particular index. Sadly AGW only 'exists' in temperature measurements. And likewise is totally bankrupt because it doesn't index back to ACTUAL heat. I get that you were being somewhat facetious but your point is not totally inane. It speaks directly to the lie of the Global Warming hypothesis, while simultaneously revealing why gullible twits who don't understand the relationship between temperature and heat buy these lies wholesale. You can lie about temperature, you can't lie about heat. Make it about the thing, in this case heat, and you can't cheat. Make it about the measurement instrument, in this case a thermometer that tells you a temperature (not the real thing) and you can fudge, lie and mislead all day. Which is precisely what happened. And now we know that it did happen and how. Though I logically deduced all this years ago, as did most everyone here and probably you too. Good stuff.

MarkW
Reply to  MarkW
October 11, 2018 10:20 am

IanW, I agree completely that the record in question is a real dog’s breakfast and isn’t fit for the purpose it is being used for.

My point is just to argue against the claim that averaging a bunch of thermometers can never be a scientifically meaningful exercise. It can be done, but you need to have the proper error bars on the results.

Reply to  MarkW
October 11, 2018 2:03 pm

You don't have much idea of thermodynamics and statistical physics, do you?

Steven Mosher
Reply to  Jeff Alberts
October 9, 2018 3:48 am

The intensive argument is wrong.

Essex fundamentally misunderstands what a spatial average of temperature is.

Even more hilarious is that Essex thinks you can't average color.

Guess he never worked in image recognition.

Paul Penrose
Reply to  Steven Mosher
October 9, 2018 2:54 pm

Mosh,
What is hilarious is that you think you know what temperature is a measure of. And if you say “heat” I’ll laugh even harder. Even more comical is the idea that the midpoint between the minimum and maximum temperatures for a month is the average temperature for that month.

Hugs
Reply to  Steven Mosher
October 11, 2018 11:32 am

Mosher, you don't average color. You can average only a numeric representation of color, like an RGB or hue/intensity/brightness vector.

Take red and green, for example. Their average in RGB is bright yellow when you go around the hue axis, brown when you just average the numbers, but rather dirty gray when you have pigments to mix. None of those are well defined as such – RGB, for example, is always a subspace of human vision with a crude, arbitrary metric.

The basis of a color space can be selected in many ways, and connecting color with a number is always a bit arbitrary.

It’s not hilarious to see you think it is hilarious.
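Hugs' point is easy to demonstrate with the Python standard library: the "average" of two colors depends entirely on which numeric representation you average. A minimal sketch:

```python
import colorsys

red, green = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)

# Channel-wise mean in RGB: a dull olive, the "dirty" result.
rgb_mean = tuple((a + b) / 2 for a, b in zip(red, green))
print(rgb_mean)  # (0.5, 0.5, 0.0)

# Mean in HSV: hues of 0 deg (red) and 120 deg (green) average to
# 60 deg, i.e. bright yellow at full saturation and value.
h1, s1, v1 = colorsys.rgb_to_hsv(*red)
h2, s2, v2 = colorsys.rgb_to_hsv(*green)
print(colorsys.hsv_to_rgb((h1 + h2) / 2, (s1 + s2) / 2, (v1 + v2) / 2))
# (1.0, 1.0, 0.0), i.e. pure yellow
```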

Jim Gorman
Reply to  Steven Mosher
October 11, 2018 12:36 pm

Here is a better question. If the globe is warming, then it should be warming everywhere. Why spend billions on thousands of thermometers, supercomputers, bureaucrats, etc. when you could get the result by opening your door, looking at the thermometer on your porch and recording the value? If you tell me that not every place is warming, then my next simple question would be what is the MINIMUM number of thermometers it would take to say the globe is warming.

We are going at it the wrong way. I hear more and more stations allow a better and more accurate average. Or, we need to forecast the climate for precautionary reasons. On and on. HOGWASH. If better and more measurements are needed to forecast the weather, then let the meteorologists pay for them along with the studies they generate. More and more measurements really only allow for statistics to be used to generate numbers that are inaccurate and for more and more corruption in the data. Then those same inaccurate numbers are used in models that admittedly, I say ADMITTEDLY, can’t make predictions. They can only make projections about what may or might happen.

As I sit here I can’t shake the vision that it is all a shell game with shills on every corner grubbing up money. Watch the pea! Watch the pea! Is the hand really faster than the eye? Where is the pea, sir? That’s not to say scientists are dishonest. I suspect they simply are being driven by the same desires and beliefs that old sailors were when they refused to sail past the horizon.

Dave Freer
Reply to  Steven Mosher
October 11, 2018 1:02 pm

Hey Mosh – where’s your apology? Seeing as you were flat out wrong and thereby maligned McLean, perhaps you should preface every post with an apology. It is what an honest researcher would do.

Greg
Reply to  Steven Mosher
October 11, 2018 1:05 pm

I guess Mosh has never tried using paint. You mix red, blue and yellow and you get shit brown. If you divide by 3 you still have shit brown.

HadCRUFT4 is climatologists’ equivalent of shit brown.

Reply to  Steven Mosher
October 11, 2018 2:12 pm

The intensive argument is correct. You could learn that if you would open up a thermodynamics and statistical physics book, but I reckon climastrology is much easier to 'comprehend'. That's why a Niels Bohr Institute researcher had the guts to put his name on an article pointing that out (and so should any honest physicist).

john
Reply to  Steven Mosher
October 11, 2018 4:08 pm

Spatial average?
You mean like the air temp is x and the ground temp 6 inches down is y in one place and time, and y2 in another, and y3 in a third, etc., and in some places and times the ground absorbs heat from the air while in other places and times it warms the air, and then there is a similar equation for all the areas covered by water of various depths?
And then you write all those numbers down on a universe-sized piece of paper and do infinite calculations, and it so happens to turn out that the answer is exactly what you need it to be to justify firebombing the world's economy?
Science is amazing!

Jean Parisot
Reply to  Steven Mosher
October 13, 2018 5:47 pm

The entire climate field needs a professional treatment from a spatial statistics perspective.

Reply to  Jeff Alberts
October 12, 2018 4:36 am

The temperature is not just changing during the day due to the Sun coming out. It's a completely different mass of air for which the min was measured than for the max. There are other reasons not to treat it like a simple intensive property, but that is the big one, even if looking at min and max separately. If the measurements were well spaced you could assume all the movement cancels out quite well, but it's far from it.

DiogenesNJ
Reply to  Robert B
October 12, 2018 4:00 pm

This whole issue is why I mostly ignore “surface temperature” and pay attention to the satellite readings.

The method inherently averages the signal from an enormous volume of the atmosphere.

It’s also why I believe Christy’s results over almost anything that comes out of nominal surface temp data, and probably why he tested his measurements only against balloon data and not any surface dataset.

Dr. S. Jeevananda Reddy
Reply to  Greg Goodman
October 7, 2018 5:18 pm

Earth heats up quickly and also releases heat quickly. This is not so with a water body, which heats up slowly and releases slowly, and whose maximum and minimum times are different. The basic principle of the land breeze and sea breeze follows this principle.

Dr. S. Jeevananda Reddy

barry
Reply to  Greg Goodman
October 7, 2018 5:42 pm

Greg Goodman says:

“As a crude fix, land temps should be weighted 50% less than SST.”

His source says:

“Several of the major datasets that claim to represent ‘global average surface temperature” are directly or effectively averaging land air temperatures with sea surface temperatures.

These are typically derived by weighting a global land average and global SST average according to the 30:70 land-sea geographical surface area ratio.”

https://judithcurry.com/2016/02/10/are-land-sea-temperature-averages-meaningful/

Greg
Reply to  barry
October 8, 2018 1:55 am

Exactly, Barry, but 30:70 weighting assumes incorrectly that the land temps and sea temps are fungible. They are not, for the reasons I stated. If you read the article to the end you would realise the 50% downgrading of land temps makes that a 15:85 land/sea weighting.
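For concreteness, here is the weighting arithmetic in a few lines of Python. The trend numbers are invented purely for illustration; only the weights come from the discussion above.

```python
# Blend a land trend and a sea trend under different weightings.
def blended(land, sea, w_land, w_sea):
    return (w_land * land + w_sea * sea) / (w_land + w_sea)

land_trend, sea_trend = 0.30, 0.15  # hypothetical C/decade

# Standard ~30% land / ~70% ocean area weighting:
print(round(blended(land_trend, sea_trend, 0.30, 0.70), 4))  # 0.195

# Greg's crude fix, halving the land weight (his 15:85):
print(round(blended(land_trend, sea_trend, 0.15, 0.85), 4))  # 0.1725
```

Whenever land warms faster than the sea, down-weighting land pulls the blended trend down; the weighting choice alone moves the answer.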

Thomas Zak
Reply to  Greg
October 11, 2018 10:28 am

Also, “In the Northern Hemisphere, the ratio of land to ocean is about 1 to 1.5. The ratio of land to ocean in the Southern Hemisphere is 1 to 4”. So the examples of sparse reporting in the southern hemisphere being extrapolated may require something other than a 30:70 land/sea assumption.

http://www.physicalgeography.net/fundamentals/8o.html

John Tillman
Reply to  Greg
October 11, 2018 5:26 pm

Earth’s 71% oceanic surface is hemispherically divided thus:

NH: 61% ocean.
SH: 81% ocean.

DaveW
Reply to  Greg Goodman
October 7, 2018 9:15 pm

As I understand it, the temperature used as an 'average' for the day is actually the midpoint between the high and low temperatures recorded, and not indicative of the true mean unless you have a very symmetrical distribution of temperatures. The midpoint is yanked left and right by the points on either end of the distribution – and these are often fleeting and vary by season, cloud cover etc. Highs and lows have some meaning in terms of local weather, and that is why they have been recorded, but if they have anything to say about the heat content of the atmosphere, I can't see it. No wonder no one seems to have a clue what may or may not be happening to climate.

Harry Passfield
Reply to  DaveW
October 8, 2018 6:06 am

DaveW:
I have likened the GAT (or is it now the GAST?) to being about as useful as averaging lottery balls over a period in the hope they will predict the favourable numbers for the future.

lee
Reply to  Greg Goodman
October 11, 2018 8:02 pm

But what about the poor Great Barrier Reef? It is subjected to local environmental phenomena. How will it know when the “Average” Global temperature reaches +1.5 – +2.0C? How will it know it is time to kick off this mortal coil?

Gary Ashe
Reply to  lee
October 13, 2018 4:40 am

Very unfortunate the barrier reef ending up in Australia, they’ve ruined it.

Greg Goodman
Reply to  John Bills
October 7, 2018 6:42 am

I looked at HadSST3 ( the sea section of this data ) years ago.
https://judithcurry.com/2012/03/15/on-the-adjustments-to-the-hadsst3-data-set-2/

Temperatures actually recorded as engine-room intakes or over-the-side bucket dips were freely changed from one to the other when they did not match expected statistical quotas, i.e. if a sector did not have enough buckets, some of them would get arbitrarily changed to engine-room intake readings.

This reveals a pervading attitude that if the data does not match what you expect, you can "correct" it. Changing the type of reading implies an adjustment, since the two are not the same.

Proposed ‘corrections’ are compared to model output as part of the validation process. Again implying that if the data does not match the model it must be wrong.

Adjustments for bucket measurements near the Japanese coast were validated by comparing to SST measured by Japanese fishing vessels … which used bucket measurements! I.e. Japanese buckets are fine, British buckets need correcting. This is considered part of the "validation" of the adjustments Hadley applies to the data.

If John McLean wants details he can search for the (fragmented) exchanges I had with the Met Office's John Kennedy in the comments below the Climate Etc. article, or drop me a note on my WP blog:
https://climategrog.wordpress.com/about/

Tom Malcolm
Reply to  Greg Goodman
October 7, 2018 7:54 am

Slightly off piste, but it may be of interest to you anyway. I was an engineer for a mining company in the late 70s through to the end of the 80s. We mined coal in the UK. Part of the licence to do so issued by the NCB was a requirement to install a weather station on each site and record the readings of rain, temp and pressure. These readings were given to the NCB, who forwarded them to the Met Office. I assume they used them.
As far as we were concerned this was of no interest whatsoever to our business, and the task of daily readings was given to the 'chain-boy', usually a sixteen-year-old who worked for the surveyors.
We were not the only mining contractors employed in this activity and I would estimate somewhere around a yearly average of about twenty sites across the UK for the period.
It defies credibility if anybody thinks these figures were in any way accurate. Equipment sited wherever, readings taken in every weather condition by an untrained teenager whose main interest was putting something on the paper and getting back to the warmth – yet they were used as figures correct to a tenth of a degree.

HotScot
Reply to  Tom Malcolm
October 11, 2018 4:59 pm

Tom Malcolm

This has been my enduring refrain since I started looking into climate change some years ago.

The guy chucking the bucket over the side of a ship and taking a temperature reading was not the scientist on board (hah, hah) or a senior officer; it was the cabin boy or deck hand, when he had time/could be bothered. In many cases it would be judged on "is it colder/warmer today than yesterday?"

Similarly, the guy trudging out to a Stevenson screen in the wind, snow and rain was the tea boy when he went out for a ciggie. Again, if he could be bothered.

The SST bucket measurements were largely along well plied trade routes, barely a ship would have been in the southern ocean to take a temperature in those days. And in much the same way, land temperatures were a local endeavour with no global implication so no one really cared what they were other than for academic purposes.

Even satellite temperature observations have been fraught with problems. Calibration, drift, obsolete equipment, newer better equipment, clouds etc.

Quite how we accept historic temperatures down to a tenth of a degree simply defies logic.

Komrade Kuma
Reply to  Greg Goodman
October 7, 2018 10:17 am

Add to that the incentive for the operating engineers to understate temperatures so they could justify working the engines harder than the manufacturer would recommend/warrant, and any integrity in the sea data was obliterated and replaced with a distinct downward bias => an apparent uptrend relative to the Argo buoy data used now.

Thomas
Reply to  John Bills
October 7, 2018 12:05 pm

Temperature isn’t even a measure of atmospheric heat content.

The atmosphere has sensible heat – the temperature measured by a thermometer – and latent heat: the energy that was required to evaporate water and which is returned as heat when the water condenses.

The total heat content of atmospheric air is called enthalpy. It's measured in units of BTUs per pound (BTU/lb) or kilojoules per kilogram (kJ/kg).

A summer afternoon in Florida, with a temperature of 90 °F, can have the same heat content as a 110 °F day in Arizona, because the air in Florida tends to have more latent heat in the form of water vapor or humidity.

Where I live, near Los Angeles, a humid winter afternoon at 65 °F could have the same heat content (enthalpy) as a summer afternoon at 100 °F but the temperatures are 35 °F apart.

For temperature to be a reasonable proxy of atmospheric heat content, the atmospheric water vapor content (relative humidity) would have to be the same for every measured temperature that is used to compute the global average temperature. That assumption seems absurd.

Zig Zag Wanderer
Reply to  Thomas
October 7, 2018 12:20 pm

I have always thought this too.

Having said that, I thought something like ‘wet bulb temperature’ (or something like that, I’m definitely no expert?) was defined to fix this problem. If it does not, then yes, air temperature itself is utterly useless as a metric.

Crispin in Waterloo
Reply to  Zig Zag Wanderer
October 7, 2018 4:13 pm

Zig Zag

Temperature is not “useless” but it is only one of the two parameters required to get a meaningful metric, which is the energy content.

We cannot say temperature is useless on its own, it has value, for example to indicate when freezing will take place, or to forecast melting. This is the current case in Alberta where farmers are on tenderhooks hoping for melting and ten days above zero. Something like 40-60% of the crops are in the fields and they have a foot of snow over them. It is a huge, potentially expensive issue. It started snowing in September, hard, and has not melted since. Massive losses loom. Bankruptcy threatens.

This is what we can expect during a significant downturn in temperature, not enthalpy. Hunger follows cold, not heat (as much).

ferd berple
Reply to  Crispin in Waterloo
October 7, 2018 6:14 pm

Alberta has a carbon tax to prevent warming. Sow the wind, reap the whirlwind.

Paul Penrose
Reply to  Thomas
October 7, 2018 12:49 pm

Temperature alone can’t tell you what the heat content of a volume of gas, liquid, or even a solid is. This is because temperature is a measure of energy density. This is why the temperature of a gas decreases when you lower the pressure – the density of the gas is lower, and so is the energy density (temperature). Because of the non-linearity of most substances around their phase change points, it is not a simple calculation to determine the total amount of energy in a mass. And this is even more difficult when you have a mix of different substances, unconstrained, and at a continuum of pressures.

Thomas
Reply to  Paul Penrose
October 7, 2018 1:32 pm

Paul Penrose,

You make some excellent points. Climate scientists seem to assume that these factors all average out, but I have never seen a detailed treatment of the subject that lends any credence to that assumption.

I once asked Richard Lindzen why temperature is used as a metric for global warming since it doesn't even measure heat content, which is all we're interested in with regard to AGW. He said averaging temperature is like averaging all the numbers in your phone book, i.e. meaningless. But he didn't otherwise address my core question, other than to agree with my understanding of temperature and enthalpy.

A few months later Dr. Lindzen was addressing Congress and made the same point that I had made to him (temperature is not a measure of heat content).

Richard Trenberth got all hot under the collar and said (yelled) something about the Clausius-Clapeyron relation, but nothing that made any convincing rebuttal.

If the world were in a petri dish, it would reach some equilibrium temperature, and the water vapor content in the air above the surface would be in accordance with the Clausius-Clapeyron relation. But the real world is much more complicated, with water existing in all three phases, in varying amounts, and with turbulent air/vapor and water flows.

It seems to me that the main thing wrong with the surface-air temperature record is that it measures a parameter that is meaningless with regard to the AGW debate.

It's astonishing that so many scientists could have a debate, for so many decades, over the temporal and spatial variations in a meaningless parameter.

Nevertheless, it does seem pretty obvious that the world has warmed since the little ice age. I suspect the argument will continue at least until it starts to cool again, which could be decades.

4kx3
Reply to  Thomas
October 8, 2018 10:43 am

Thomas

Please see Crispin’s comment nearby. Temperature is an important variable for predicting heat flow and phase changes, but it does not tell one how much heat there is except in very restricted circumstances. As noted here also it is enthalpy that describes the heat, and we can define an equivalent temperature as enthalpy/heat capacity for identifying heat effects. Equivalent temperature based on heat content can be defined for any region or even hemisphere and can span different materials as well.

https://pielkeclimatesci.wordpress.com/2010/07/22/guest-post-calculating-moist-enthalpy-from-usual-meteorological-measurements-by-francis-massen/
https://aea26.wordpress.com/2017/02/23/atmospheric-energy/

Reply to  Paul Penrose
October 7, 2018 4:01 pm

Hell, something a lot of people don't get is that we can't even measure temperature directly anyway – we use proxies like the expansion of a liquid or a solid relative to a reference point we assume is accurate. It is hard to imagine that something as fundamental to science as temperature has never been directly observed. On the matter of heat capacity, yes, that's something I've been saying for years, only to be met with blank stares. Explaining that air molecules themselves are actually in the thousands of degrees earns scorn, with no amount of explanation about compression or density getting through. Basically most people's attitude is that any questioning of anything by anyone Not Qualified as they see fit is clearly mad. No further inquiry required.

Alan Tomalty
Reply to  Karlos51
October 7, 2018 11:22 pm

Karlos said
“Explaining that air molecules themselves are actually in the thousands of degrees”

They aren't hot as in temperature. What you are referring to is the amount of energy contained therein; energy and mass are related by Einstein's famous equation E = mc^2, where E is energy output, m is mass and c is the speed of light.

Karlos, the measurement of temperature has been done accurately since the invention of the mercury thermometer in 1714 and many equations in physics depend on temperature as an independent variable. Physics has enough problems without YOU questioning the use of thermometers.

Reply to  Karlos51
October 8, 2018 2:09 am

oh dear..

and why the ‘you’ emphasized, do we know each other? If so then you don’t know me very well at all.

BCBill
Reply to  Paul Penrose
October 7, 2018 5:30 pm

Tenderhooks sound way better than the chewy kind!

gnomish
Reply to  BCBill
October 11, 2018 11:08 am

they come fresh from the boucanneery

Alan Tomalty
Reply to  Paul Penrose
October 7, 2018 11:01 pm

It is even worse than that.
Enthalpy = Internal energy + (pressure * volume)

The atmosphere does not have a constant volume, nor is the pressure the same at any 2 altitudes. However you can’t measure enthalpy directly because you cannot measure the internal energy directly. The best you can do, is measure the change of enthalpy if you can measure the heat gained or lost from the system and the work done by or on the system.

The ideal gas law PV = nRT, on the other hand, is only applicable to a system that has a definite boundary, whether it is open or closed. However, it can be used as a loose approximation to figure out the average temperature at planet surfaces as long as the pressure is over 10 kPa.

Temperature anomalies would be a much better indicator of real changes to the earth's climate if we could be sure that, on the same date in different years for any particular place, the temperature should not vary due to natural causes. We know that that is false, so the real reason anomalies are used is to infill geographical areas (that have no temperature stations) with the same anomalies as nearby temperature stations. GISTEMP has a definition limit of the word "nearby" to mean not more than 1200 km.
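A toy sketch of that infill idea follows. GISTEMP's actual method is distance-weighted rather than nearest-station; this simplified version just borrows the nearest station's anomaly within 1200 km, and all coordinates and anomalies are invented.

```python
import math

EARTH_RADIUS_KM = 6371.0

def distance_km(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def infill(cell_lat, cell_lon, stations, limit_km=1200.0):
    """Nearest-station infill; stations = [(lat, lon, anomaly), ...]."""
    best = None
    for lat, lon, anom in stations:
        d = distance_km(cell_lat, cell_lon, lat, lon)
        if d <= limit_km and (best is None or d < best[0]):
            best = (d, anom)
    return None if best is None else best[1]

stations = [(48.2, 16.4, 0.7), (59.9, 10.8, 0.3)]  # invented anomalies
print(infill(52.5, 13.4, stations))  # Berlin-ish cell -> 0.7 (nearest)
print(infill(10.0, 0.0, stations))   # nothing within 1200 km -> None
```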

Dr. S. Jeevananda Reddy
Reply to  Thomas
October 7, 2018 5:21 pm

This is exactly what happens with urban-heat island effect and rural-cold island effect.

Dr. S. Jeevananda Reddy

Frank
Reply to  Thomas
October 8, 2018 11:49 am

Thomas: You raise an interesting point: Which is more relevant, sensible heat or total heat? And relevant to what?

For radiative cooling (W = σT^4), only the sensible heat term matters.

For human comfort, meteorologists have devised a composite scale ("real feel" temperature) that contains both measures. We use air conditioning to cool and to lower humidity. For some reason, we feel most comfortable with an air temperature 15 degC cooler than our internal temperature (which is warmed by biochemical reactions). Most species have adapted to a particular environment. The physical properties (especially fluidity) of the lipid bilayers that surround all cells are critically dependent on temperature. (See the alkenone temperature proxy.) Chemical and biochemical reaction rates modestly depend on temperature, but proteins denature when the temperature gets too high. Aquatic species don't care about latent heat.

The rate of evaporation is proportional to wind speed and the "undersaturation" of the atmosphere, and does not depend directly on temperature.

Climate change is important. We wouldn't want to be living during the last ice age. Why would total heat content be a better measure of climate change than temperature alone? We have better data about temperature change, so one would need a good reason to switch to a different metric.

Thomas
Reply to  Frank
October 11, 2018 1:10 pm

Frank,

You wrote, "Why would total heat content be a better measure of climate change than temperature alone?"

Total heat is not necessarily a better metric of climate change but it is the only metric that can tell us if CO2 forcing is causing heat to accumulate in the system. Temperature alone will not tell us if heat is accumulating because temperature is not a measure of atmospheric heat content.

Frank
Reply to  Thomas
October 13, 2018 10:22 pm

Thomas: Thanks for the reply, which makes scientific sense. However, heat from the putative radiative imbalance created by rising GHG’s is accumulating in:

1) the sensible heat of the atmosphere (temperature), but often only measured at 2 m over the land; SSTs are used to predict the sensible heat in the atmosphere over the ocean. Satellites and radiosondes measure warming at all altitudes, but orbital drift has damaged the validity of satellite data. Radiosonde technology has changed a lot and that data has been subject to a great deal of processing. UAH uses radiosonde data to correct for orbital drift.

2) Latent heat (of water vapor) in the atmosphere.

3) Melting of glaciers and ice caps and changes in seasonal snow cover.

4) A little heat is raising the temperature of the land.

5) Warming of the ocean. The vast majority of heat (ca 95%) from the putative radiative imbalance caused by rising GHGs is supposed to be accumulating in the ocean and the goal of the ARGO floats is to measure that change. The skeptic Roger Pielke Sr was a big proponent of the ARGO program to measure ocean heat uptake. From a practical point of view, one could forget about 1), 2), 3) and 4) above and simply focus on ocean heat content, which has been about 0.7 W/m2 over the last decade.

As best I can tell from the sources linked below, there is about 2 g/cm2 of water vapor in the atmosphere and it has been growing at a rate of about 5%/decade over the past two decades. (The water in cloud droplets has already released its latent heat, so total column water is an inappropriate measure.) Latent heat is about 2500 J/g, or 5000 J/cm2 for the column. The 5% change is 250 J/cm2 per decade, or 25 J/cm2/year. That is 250,000 J/m2/year. With 31.6 million seconds/year, that is 0.008 J/m2/s or 0.008 W/m2. That is about 1% of the rate of heat accumulation in the ocean.

https://www.ecmwf.int/sites/default/files/elibrary/2015/13482-total-column-water-vapour-product-gome-sciamarchy-and-gome-2-instruments-poster.pdf

The atmosphere weighs about 10,000 kg/m2 and has a heat capacity of 1 kJ/kg/K. That makes 10,000 (kJ/K)/m2. If the atmosphere is warming at about 0.17 K/decade, that is 1,700 kJ/m2/decade, or 170 kJ/m2/yr, or 170,000 J/m2/yr, or 0.0054 J/m2/s, or 0.0054 W/m2. So the increases in sensible and latent heat in the atmosphere appear to be similar in magnitude and about 1% of the increase in heat in the ocean. So you are right to be worried about the increase in latent heat vs sensible heat, but both are trivial compared with ocean heat. (:))
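Frank's back-of-envelope numbers are easy to reproduce in code; the inputs below are his stated assumptions, not measurements.

```python
SECONDS_PER_YEAR = 31.6e6

# Latent heat: ~2 g/cm^2 of column water vapor at ~2500 J/g latent heat,
# growing ~5% per decade.
column_latent = 2.0 * 2500.0                     # 5000 J/cm^2
latent_gain = 0.05 * column_latent / 10.0        # 25 J/cm^2 per year
print(round(latent_gain * 1e4 / SECONDS_PER_YEAR, 4))  # ~0.0079 W/m^2

# Sensible heat: ~10,000 kg/m^2 of atmosphere, cp ~1 kJ/kg/K, ~0.17 K/decade.
sensible_gain = 10_000 * 1_000 * 0.17 / 10.0     # 170,000 J/m^2 per year
print(round(sensible_gain / SECONDS_PER_YEAR, 4))      # ~0.0054 W/m^2

# Both are roughly 1% of the ~0.7 W/m^2 cited for ocean heat uptake.
```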

Frank
Reply to  Thomas
October 8, 2018 11:58 pm

Thomas questioned why we use sensible heat (temperature) rather than sensible + latent heat (total heat) as a measure of "climate change", but didn't offer any compelling reasons why temperature was an inappropriate measure.

A related question is whether we should expect an important difference between trends in temperature and total heat. For total heat to go down as temperature rises, absolute humidity must go down. Saturation vapor pressure rises about 7% per degC, so relative humidity would fall even further. AFAIK, absolute humidity is not falling.

tty
Reply to  Frank
October 11, 2018 11:02 am

It should be obvious that it is the enthalpy that is the significant climate variable, not temperature, since temperature is only part of the energy content, and a variable part at that.

Reply to  tty
October 11, 2018 12:51 pm

tty- I agree, but the problem would be the ocean measurement, so I would use temperature, minus UHI and using only quality instrumental sites, as the most cost effective metric.

Frank
Reply to  tty
October 13, 2018 10:23 pm

TTY and Chad: See my reply above to Thomas.

thomas
Reply to  Frank
October 11, 2018 1:27 pm

Frank,

I did offer a compelling reason. Saturation vapor pressure rises as temperature rises, but absolute humidity rises only if there is water present. Absolute humidity is very low in the Sahara but very high around the Persian Gulf.

I also didn’t know if there is any trend in absolute humidity so I looked it up. According to NOAA it’s going up.

https://www.climate.gov/news-features/understanding-climate/2013-state-climate-humidity

I emailed NOAA to see where I can download the specific humidity data from. With that I can make a chart that shows annual changes in total heat (enthalpy). I'll post it here on wattsupwiththat.com if I'm successful.

David Walker
Reply to  John Bills
October 7, 2018 5:38 pm

Concerning sea surface temperatures.

Note that in the decades before the advent of significant coverage of the oceans by the buoy networks, the ocean temperature data was acquired mainly from ships' engine room water inlet temperatures or by measuring the temperature in buckets thrown over the side on a rope.

Ships' engine cooling water inlet temperature data is acquired from the engine room cooling inlet temperature gauges by the engineers at their convenience; there is no protocol for the recording of the temperatures.

There is no standard for the location of the inlets (especially their depth below the surface), for the position of the measuring instruments in the pipework, or for the time of day the reading is taken, and the temperature sensor may be anywhere between the hull of the ship and the engine cylinder head itself.

The instruments themselves are of industrial quality; their limit of error per DIN EN 13190 is ±2 deg C for a class 2 instrument, or sometimes even ±4 deg C, as can be seen in the tables here: DS_IN0007_GB_1334.pdf . After installation it is exceptionally unlikely that they are ever checked for calibration.

It is not clear how such readings can be compared with the readings from buoy instruments specified (optimistically, IMO) to a limit of error of tenths or even hundredths of a degree C, or why they are considered to have any value whatsoever for the purposes to which they are put, which is to produce historic trends apparently precise to 0.001 deg C upon which spending of literally trillions of £/$/whatever is decided.

But hey, this is climate “science” we’re discussing so why would a little thing like that matter?

http://www.nature.com/climate/2008/0809/full/453601a.html
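The statistical point lurking in David Walker's comment can be shown in a few lines: averaging many readings shrinks the random part of a ±2 deg C instrument error, but a shared systematic bias survives no matter how many readings are averaged. A toy simulation with invented numbers:

```python
import random

random.seed(1)

TRUE_SST = 15.0
N = 10_000
bias = 0.8  # suppose every gauge reads 0.8 C high (depth, installation...)

# Each reading: truth + shared bias + random instrument error of +/-2 C.
readings = [TRUE_SST + bias + random.uniform(-2.0, 2.0) for _ in range(N)]
estimate = sum(readings) / N

# The random +/-2 C scatter averages away; the 0.8 C bias does not.
print(round(estimate - TRUE_SST, 3))  # ~0.8
```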

Dodgy Geezer
October 7, 2018 6:05 am

I wonder what his professor was thinking when he agreed that this would be a fit subject for a PhD. Had he no understanding of the political implications of work in this area?

How long do you think he will remain in tenure? And for how long will Robert Boyle Publishing retain this file on their sales database?

David Lilley
Reply to  Dodgy Geezer
October 7, 2018 6:23 am

According to James Delingpole, as reported at NOTALOTOFPEOPLEKNOWTHAT, his supervisor was Peter Ridd. So, the answer to your question is that he has already lost his tenure. Peter Ridd, formerly of James Cook University, was the scientist who pointed out that the Great Barrier Reef was not dying, a blasphemy for which the punishment was loss of his post.

Reply to  Dodgy Geezer
October 7, 2018 8:12 am

Geezer, McLean's paper will be there as long as it needs to be. We thought these results were important, and David and I set up Robert Boyle Publishing with John. We put in a lot of work to make sure these results would not disappear.

http://joannenova.com.au/2018/10/first-audit-of-global-temperature-data-finds-freezing-tropical-islands-boiling-towns-boats-on-land/

Yes, his supervisor was Peter Ridd, famously sacked for saying that “the science was not being checked, tested or replicated” and for suggesting we might not be able to trust our institutions. John started the audit about 8 years ago. He’s done the last two years unpaid. He could have stopped after finding 26 issues.

The wildest scandal is that some of these errors are so obvious – a monthly average hotter than the hottest day on Earth – and no one even noticed. A high school geek could write code to find that one.

40 years after scientists switched to Celsius, HadCRUT still hasn't got there. That's 40 years of couldn't-care-less from the same people who tell us what light globes to use.

#DataGate
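Jo Nova's "high school geek" line is no exaggeration; a crude physical-bounds check takes a few lines of Python. The thresholds below are editorial guesses (the record surface air temperature extremes are roughly -89°C and +57°C), and the sample rows are the audit's published examples.

```python
# Flag monthly means outside generous physical bounds. A monthly MEAN
# above 80 C is impossible on its face.
PLAUSIBLE_MIN_C, PLAUSIBLE_MAX_C = -80.0, 60.0

def impossible_months(records):
    """records: iterable of (station_id, year, month, monthly_mean_c)."""
    return [r for r in records
            if r[3] is not None
            and not (PLAUSIBLE_MIN_C <= r[3] <= PLAUSIBLE_MAX_C)]

sample = [("800890", 1978, 4, 81.5),     # Apto Uto, from the audit
          ("800890", 1978, 5, 28.0),
          ("Paltinis", 1953, 9, -46.4)]  # passes a crude global bound
print(impossible_months(sample))         # catches only the 81.5 C month
```

Note that the Paltinis value slips past a global bound; catching it means checking each month against the station's own history, as sketched further down the thread.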

Reply to  Jo Nova
October 7, 2018 8:22 am

Sorry that comment got duplicated inline above. Mods?

HotScot
Reply to  Jo Nova
October 7, 2018 9:01 am

Jo Nova

Thank you for all your work Jo.

Reply to  Anthony Watts
October 7, 2018 10:31 am

Thanks! Thanks for putting in a request to support John too.

It’s a thankless job doing the original analysis.

Cheers HotScot 🙂 Ta

Dodgy Geezer
Reply to  Jo Nova
October 7, 2018 9:13 am

I strongly support the auditing of data to remove errors, and sceptical analysis of all scientific papers to uncover faults. Particularly in Climate Change, where the impact on individuals and society is so large.

But the Climate Change scientists do not support this, have a lot of money available and no interest in behaving either fairly or legally. How well is Robert Boyle Publishing equipped to handle court cases intended to bankrupt you?

Latitude
Reply to  Dodgy Geezer
October 7, 2018 10:13 am

Climate science is the only science I’ve ever heard of that’s always right…
even when they are proved to be wrong and have to correct it, they go right back to claiming they were always right

Reply to  Latitude
October 9, 2018 11:36 am

How can predictions of the future climate be “wrong”
until you wait 10, 20, or 100 years?

ATheoK
Reply to  Dodgy Geezer
October 7, 2018 10:23 am

“Dodgy Geezer October 7, 2018 at 9:13 am
I strongly support the auditing of data to remove errors…”

If, by “auditing the data to remove errors”, you include data verification, on site equipment testing and certification at installation and regularly afterwards, metadata validation, methods evaluation and certification, etc. etc.; then I agree with you.

Data can get disqualified, but should never be directly “adjusted”. Adjustments should be kept in separate files along with explicit metadata.

AGW is not Science
Reply to  ATheoK
October 11, 2018 11:24 am

Agreed. You can’t “adjust” temperature readings that are deemed incorrect by any metric unless you can substitute another actual reading at the same place at basically the same time. Anything else isn’t “data,” it’s just guesswork. The response to any supposed inaccuracy or bias in the data should be to stretch error bars, not CHANGE the instrument readings.

peterh
Reply to  Jo Nova
October 7, 2018 6:30 pm

A lot of data errors are blatantly obvious when you look at them, and trivial to write automated checks for, but you might not think of them unless you encounter them doing an audit or (maybe) are familiar with data collection procedures. Now that we know some of what to look for, it should be possible to automate checks and find more obviously invalid data.
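Here is a sketch of the kind of automated per-station check peterh has in mind: compare each monthly mean against the station's own history for the same calendar month. The 5-sigma threshold, the 0.5 deg C sigma floor and the data are illustrative assumptions.

```python
from statistics import mean, stdev

def is_outlier(history, new_value, n_sigma=5.0):
    """history: monthly means from other years for the SAME calendar month."""
    mu, sigma = mean(history), stdev(history)
    # Floor sigma so near-constant histories don't flag trivial wiggles.
    return abs(new_value - mu) > n_sigma * max(sigma, 0.5)

# Paltinis Septembers average about 11.5 C in other years:
septembers = [11.2, 11.7, 11.4, 11.9, 11.3]
print(is_outlier(septembers, -46.4))  # True:  the audit's -46.4 C month
print(is_outlier(septembers, 12.1))   # False: ordinary variation
```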

Editor
Reply to  Dodgy Geezer
October 7, 2018 12:30 pm

I just ordered my copy. Within a minute the computer at my credit union called to inquire about some suspicious charges.

After some identity verification steps, it listed five transactions; I reassured it that they were all legitimate and it went away happy.

It didn't have the decency to explain what the red flags were. Perhaps it knew that Robert Boyle Publishing is a new company, perhaps I made the first ever transaction there from the DCU, perhaps it knows I've never set foot in Australia, perhaps someone is pressuring the computer to keep an eye on this Robert Boyle fellow.

No biggie. However, it might be a good idea to be near your phone when you place your order.

Perhaps someone is keeping an eye on this McLean fellow already.

William Ward
Reply to  Ric Werme
October 7, 2018 12:59 pm

I had similar problems with my CC transaction. I replied to an automated Text Message from my bank confirming the transaction was made by me and then the charge went through. It worked out. Now I just need to make sure I was not charged multiple times, because it took multiple attempts for it to go through.

Glad to support the author. I’m looking forward to digesting the information.

Lewis P Buckingham
Reply to  Ric Werme
October 7, 2018 1:28 pm

No problems in Australia.

Reply to  Lewis P Buckingham
October 7, 2018 1:51 pm

Lewis P Buckingham
October 7, 2018 at 1:28 pm

I can confirm no problems here in NZ. Quick download with lots of good background information and graphs.

auto
Reply to  Ric Werme
October 7, 2018 4:00 pm

For what it is worth, I had no problem ordering, and then downloading, the paper.

I used a [UK] credit card, and have – so far – had nothing to suggest I am under head-exploders’ surveillance.

Auto

gregole
Reply to  auto
October 7, 2018 6:24 pm

No problem getting the paper here in Red State Arizona. Haven’t had a chance to read it all – I’m scanning websites to get the overall ’cause I’m super busy. So far it looks like a block-buster.

William Ward
Reply to  Ric Werme
October 7, 2018 5:15 pm

What country is Robert Boyle Publishing located in? If outside the US, that might explain why some US banks are cautious. Also, this report is the only thing they sell. They appear to have just started business and this paper is their first product. The lack of business history might trigger some banks and maybe the banks in the US are more cautious than in some other countries.

Steven Mosher
Reply to  Ric Werme
October 9, 2018 2:29 am

No open data. No open code.
No science.

Frank
Reply to  Steven Mosher
October 9, 2018 6:57 am

Mosh: It looks as if McLean never located HadCRUT4's raw data AFTER quality control, which should have eliminated most of the problems cited above.

“No open data. no open code. no science.” is a reasonable slogan. The question is who the slogan applies to.

Reply to  Steven Mosher
October 11, 2018 2:16 pm

You mean, like this open-source code that recently produced climastrological propaganda about Earth going Venus or something?

https://github.com/ddbkoll/PyRADS/issues/2

How does it feel to learn that the computer models you have blind faith in have fundamental errors in them? Does it not matter because they’re written by the mighty MIT postdoc gods or something? Or does it not matter as long as the errors support your cargo-cult science dogma?

Kneel
Reply to  Steven Mosher
October 11, 2018 7:56 pm

There are commercial agreements in place regarding the source of some data.
We are unable to find out which data is covered by these agreements, so therefore we can’t give you any of it.
Also, why should I give you my data, when all you want to do is find mistakes?

Go on Mosher, I DARE YOU to say the above is unacceptable AND also suggest that HADCRUT is acceptable.

Andrew Wilkins
Reply to  Steven Mosher
October 12, 2018 4:24 am

No apology from you regarding his PhD being “toast”

John Endicott
Reply to  Andrew Wilkins
October 12, 2018 8:13 am

Don’t be silly Andrew. An apology would take integrity and honesty.

Nylo
Reply to  Dodgy Geezer
October 7, 2018 11:03 pm

Dodgy Geezer,
“I wonder what his professor was thinking when he agreed that this would be a fit subject for a PhD. Had he no understanding of the political implications of work in this area?”

I think you have not really understood the implications of his results. He is saying, first, that the methodology cools the past too much, and second, that data prior to 1950 is not reliable due to poor coverage. So if data prior to 1950 (i.e. prior to significant human influence through CO2, according to the IPCC) is not reliable, and the procedures are, in addition, cooling the past too much, then it could be claimed that the NATURALLY CAUSED warming prior to 1950 has been exaggerated. Which could let the alarmists claim that our impact is much greater than previously thought. I can foresee a conspiracy here to remove the naturally-caused warming pre-1950 as a result of this audit.

Alan Tomalty
Reply to  Nylo
October 8, 2018 12:03 am

Nylo you must be new here. All Dodgy Geezer is saying; is that both the professor and the pupil are extremely courageous men. The blowback from this report will be enormous.

Nylo
Reply to  Alan Tomalty
October 8, 2018 3:27 am

Why do you think that I don’t understand it? I understand it very well, and I am explaining why it may not be so. The audit can be used against skeptics like me who believe that a significant part of the warming is most likely natural. And if it can be used against us, it’s good for alarmism, not something to punish the author for.

rbabcock
October 7, 2018 6:05 am

I’ve been reading WUWT comments and articles for years saying exactly what your essay points out. Not to be too trite, but what’s new?

About the only database you can put any faith in is UAH, and even it is an indirect way of measuring temps. But no real issues here: just publish numbers to the tenth or hundredth of a degree, include extremely detailed, colorful maps in bright reds and purples, and darn, they look like the real deal.

I can’t ever remember seeing an error bar on anything put out. Probably because publishing a number like 26.1 ± 3.0 just won’t get the message across.

TDBraun
Reply to  rbabcock
October 7, 2018 7:29 am

1) It’s not so easy to ignore an actual detailed PhD study of the subject — compared with the ease of ignoring commenters on a “Denialist” website. This study can be used by any skeptic to bolster his reasons for skepticism in an argument.

2) I thought the explanation of the way they adjusted a relocated site’s data, creating artificial cooling in its past, was new – I haven’t seen that before, at least – and it explains that mystery. Clearly it is incorrect. If they were to un-adjust that data and re-adjust it more realistically, magically several tenths of a degree of warming would disappear. (See the sketch below.)
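
A toy sketch (with invented numbers) of the mechanism described in point 2: a one-off step adjustment applied to the whole pre-move record, when the urban warming it corrects for actually crept in gradually.

```python
# Toy illustration (invented numbers) of the site-relocation adjustment issue.
# Suppose a station moved to a rural site in 1980, and the adjusters treat the
# ENTIRE pre-move record as 1.0 °C too warm, even though urban heat only
# crept in gradually from 1940 onwards.
years = list(range(1880, 2021))

def raw_reading(year):
    """Pretend 'true' record: flat climate, plus slowly creeping urban heat."""
    if year < 1940:
        return 15.0                           # genuinely rural, no urban heat
    if year < 1980:
        return 15.0 + (year - 1940) * 0.025   # UHI ramps up to +1.0 °C by 1980
    return 15.0                               # new rural site after the move

# Step adjustment: subtract 1.0 °C from everything before the move.
step_adjusted = [raw_reading(y) - (1.0 if y < 1980 else 0.0) for y in years]

# The 1880 value becomes 14.0 °C even though that era was genuinely rural,
# so the adjusted series shows a spurious century of warming.
print(step_adjusted[0], step_adjusted[-1])  # 14.0 vs 15.0 -> +1.0 °C "trend"
```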

Latitude
Reply to  TDBraun
October 7, 2018 7:49 am

More than that, TD….. when real people on the ground measure a 10-degree or greater difference from UHI… and they adjust for UHI by only 1–2 degrees

…but then they can claim the adjustments lower the temp

Randy Stubbings
Reply to  rbabcock
October 7, 2018 10:06 am

While I am not arguing in support of the HadCRUT4 data set (I have not read McLean’s analysis yet), I will at least point out that HadCRUT4 comes with lower and upper 95% confidence intervals along with an explanatory paper at https://www.metoffice.gov.uk/hadobs/hadcrut4/HadCRUT4_accepted.pdf. The difference between L95 and U95 in January 1850 is 0.801 degrees, and in August 2018 it is 0.448 degrees. Over the last few decades the range seems to have fluctuated seasonally from about 0.22 degrees to about 0.45 degrees.
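
For readers who want to check such figures themselves, here is a small sketch of how the interval widths might be computed from a HadCRUT4 ensemble time-series file. It assumes a space-separated layout in which the last two columns of each row are the lower and upper bounds of the combined 95% confidence interval; confirm against the Met Office format notes before relying on it, and the filename below is only a guess at the usual naming.

```python
# Sketch: width of the combined 95% confidence interval per month,
# assuming the last two columns of each row are the L95/U95 bounds
# (check the HadCRUT4 format documentation before trusting this).
def ci_widths(path):
    widths = {}
    with open(path) as f:
        for line in f:
            cols = line.split()
            if len(cols) < 3:
                continue
            date = cols[0]                      # e.g. "1850/01"
            lower, upper = float(cols[-2]), float(cols[-1])
            widths[date] = upper - lower
    return widths

# widths = ci_widths("HadCRUT.4.6.0.0.monthly_ns_avg.txt")  # assumed filename
# print(widths.get("1850/01"), widths.get("2018/08"))       # ~0.8 vs ~0.45 per the comment above
```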

Paul Penrose
Reply to  Randy Stubbings
October 7, 2018 1:01 pm

Randy,
Those confidence intervals are, at best, estimates. Their “ensemble” technique is novel, but untested by professional statisticians, so I don’t think you can give it much weight. I also found this little gem in the paper:

“This model cannot take into account structural uncertainties arising from data set construction methodologies. It is clear that a full description of uncertainties in near-surface temperatures, including those uncertainties arising from differing methodologies, requires that independent studies of near-surface temperatures should be maintained.”

I don’t see how you can cite any kind of confidence intervals without taking these things into account.

Chaamjamal
October 7, 2018 6:13 am

Yes sir it really is a bombshell and just in time for the IPCC Christmas Party.

My 2 cents on temperature reconstructions is that there is some evidence that CO2 data and the presumption of GHG forcing have a role in the way temperature reconstructions are constructed.

Please see paragraph#5 here

https://tambonthongchai.com/2018/09/25/a-test-for-ecs-climate-sensitivity-in-observational-data/

Reply to  Chaamjamal
October 7, 2018 6:12 pm

“Yes sir it really is a bombshell and just in time for the IPCC Christmas Party.”

^A miracle has happened.^

October 7, 2018 6:15 am

Not much of a surprise there.
For at least 60 years the Met Office was using an incorrect formula to calculate the CET, the world’s longest set of temperature data. The Met Office corrected this long-standing error in compiling annual data from daily and monthly temperatures only after I alerted them to it in early August 2014 and suggested a method of recalculation, which they have now adopted. Subsequently, from 01/01/2015, the Met Office recalculated the annual values for the whole set of data going back 350 years.
For more see:
http://wattsupwiththat.com/2015/01/08/anthropogenic-warming-in-the-cet-record/#comment-1831927

Bellman
Reply to  vukcevic
October 7, 2018 6:48 am

Difficult to see how the Met Office were getting CET wrong for 60 years, when they only started maintaining it in the 70s.

Curious George
Reply to  Bellman
October 7, 2018 7:58 am

It should have been 50 years.

Reply to  Bellman
October 7, 2018 9:56 am

“The Central England Temperature (CET) record is a meteorological dataset originally published by Professor Gordon Manley in 1953 ”
My abacus says that is “60 or more years”
https://en.wikipedia.org/wiki/Central_England_temperature

Bellman
Reply to  vukcevic
October 7, 2018 11:57 am

But not by the Met Office.

Reply to  Bellman
October 7, 2018 2:23 pm

Makes no difference: the Met Office used the incorrect formula year after year, decade after decade, for half a century, and most likely they would still be doing so; there is no excuse for it.

Bellman
Reply to  Bellman
October 7, 2018 4:42 pm

By “incorrect formula” you mean the simplification of treating all months as equal in length when averaging the annual figure? (See the sketch below.)

Yet you say it makes no difference whether it was for 60 or 40 years, or whether it was the Met Office or Professor Manley who committed this inexcusable mistake.
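
For concreteness, here is roughly what that simplification amounts to numerically. The monthly values are invented but CET-like; the point is the size of the discrepancy, not the particular year.

```python
# How much does treating all months as equal length change an annual mean?
# Invented, CET-like monthly means (°C) for a non-leap year.
monthly_means = [4.1, 4.3, 6.2, 8.5, 11.6, 14.7, 16.9, 16.6, 14.1, 10.4, 6.8, 4.6]
days = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]

simple = sum(monthly_means) / 12.0
weighted = sum(m * d for m, d in zip(monthly_means, days)) / sum(days)

print(f"simple mean:   {simple:.3f} °C")
print(f"weighted mean: {weighted:.3f} °C")
print(f"difference:    {abs(simple - weighted):.3f} °C")
# Typically a few hundredths of a degree: invisible at 1 decimal place,
# which is the crux of the disagreement above.
```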

Reply to  Bellman
October 8, 2018 1:02 am

You got it .
In engineering, where I spent most of my adult life, sloppy calculations are not advisable.

Bellman
Reply to  Bellman
October 8, 2018 5:22 am

But you keep making the “sloppy” accusation that the Met Office was giving slightly inaccurate annual figures for over 50 years. They couldn’t have been, because they weren’t publishing CET 50 years ago.

Reply to  Bellman
October 8, 2018 6:15 am

The Met Office is the caretaker of the CET data, and it was, and is, their duty to the public to present that data in the most accurate form they can muster, regardless of whether that is today, a decade ago, or half a century ago.
Are you actually suggesting that prior to 2015 the Met Office was not required to know that not all months of the year have the same number of days?
No serious person could defend half a century of erroneous calculations by a multimillion-pound public institution financed by hard-pressed taxpayers, including four decades of my however modest contributions.
Mr Bellman, if you happen to be here as an apologist for the Met Office’s sloppy work, and what you have shown above is your very best, you are not doing well, are you?

Bellman
Reply to  Bellman
October 8, 2018 7:05 am

No, I’m trying to explain that the MO could not have been making this “mistake” for 60 or 50 years, because 50 years ago the CET was not produced by them. I have no idea how long they used the slightly simplified formula for calculating averages. I don’t know when they started giving the data to 2 decimal places. The current online page only goes back to 2011.

When Manley published his final version of CET in the mid 70s, there is no indication of whether he weighted annual values by length of month. I’d expect he didn’t, as that would have made the process more time consuming. But it is largely irrelevant, as he only gave annual averages to 1 decimal place.

I have never worked for the MO and am not apologising for any simplifications they made in calculating the annual values. I just don’t think it is a serious problem, as the differences are minor and completely irrelevant to any long-term analysis of the data.

Editor
Reply to  Bellman
October 11, 2018 2:40 pm

Bellman and vukcevic ==> This reminded me that I posted a short essay titled “Historical Note: Greenwich, England Mean Temperature, 35-yr Daily Averages 1815-1849” in 2014, about the records made by John Henry Belville.

Reply to  vukcevic
October 8, 2018 7:42 am

Hi again
It was fun talking to you, however I would guess that the Met Office wouldn’t have enjoyed it as much.
Here is website link where they provide all kind of the CET data
https://www.metoffice.gov.uk/hadobs/hadcet/data/download.html

Bellman
Reply to  vukcevic
October 8, 2018 8:20 am

Thanks, I enjoy these chats myself.

I’m well aware of the links to CET data thanks, but I note that you have so far failed to produce any evidence that the Met Office have anything to do with Manley’s original reconstructions. Therefore I still wonder if you accept that your opening statement that “For about at least 60 or more years the Met Office was using incorrect/wrong formula to calculate the CET” is wrong, or your later claim that “the Met Office used incorrect formula year after year, decade after decade for a half a century”.

Reply to  vukcevic
October 8, 2018 3:40 pm

You are joking, aren’t you?
What? Are you suggesting that the Met Office front-desk receptionist is the one who was calculating the CET annual data?
It was Manley, Parker, Legg, Folland, etc.; they are all responsible for using the “incorrect formula year after year, decade after decade for a half (or more) of a century”!
I don’t think you will be on their Christmas card list, since your defence of the MO has badly misfired. As Mrs. May would say, ‘a bad defence is worse than no defence at all’.
Good night, see you some other time, some other place.
With best regards to you,
m.v.

Bellman
Reply to  vukcevic
October 8, 2018 5:43 pm

You seem to continue to miss my point, or maybe I’m missing yours.

Professor Gordon Manley, the inventor of the CET, had nothing to do with the Met Office (apart from working for them for a year in the 1920s). The Met Office had nothing to do with his CET, published in 1953 or 1973.

It’s largely irrelevant whether Manley in his 1973 paper calculated annual averages as a weighted average or not, as he only gave the figures to one decimal place.

My only “defense” of the MO has been to point out that they couldn’t be guilty of a 60-year error when they had nothing to do with the tables until about 40 years before the error was found.

dave
October 7, 2018 6:19 am

If only trees, preferably bristlecone pines, grew in the ocean then we would have a really reliable way of measuring past temperatures of 70% of the earth’s surface.

eyesonu
Reply to  dave
October 7, 2018 7:45 am

Maybe we could use ‘driftwood’.

Joe Crawford
Reply to  dave
October 7, 2018 10:52 am

Don’t forget. You only need one :<)

Susan
October 7, 2018 6:28 am

I’ve just checked my newsfeed: the story is not featured. I am not surprised.

Hugs
Reply to  Susan
October 8, 2018 2:02 am

What? Mainstream media skipping a politically inconvenient piece of work? /sarc

The data is riddled with errors. That’s why it needs adjustments. Only someone very biased in their presumptions could make the adjustments support the ‘predone’ conclusions.

F. Ross
October 7, 2018 6:31 am

“… (HadCRUT4) has found it to be so riddled with errors and “freakishly improbable data” that it is effectively useless.
…”
My, oh my, imagine that. All those papers, projections, studies, etc. based on this dataset over the years, just USELESS.
/sarc

Jeff Alberts
Reply to  F. Ross
October 7, 2018 8:49 am

I’m sure Nick will be along at any moment, explaining how the folks maintaining HADCRUT remove these outliers effectively and completely, so there is nothing to see here.

Taphonomic
Reply to  Jeff Alberts
October 7, 2018 11:23 am

I was waiting for Mosher to do a drive by and say it doesn’t matter cause BEST is okay.

Louis Hunt
Reply to  Taphonomic
October 7, 2018 11:42 am

Yes, I thought BEST claimed to have reviewed the temperature data and found nothing wrong with it. Did they lie to us?

DonM
Reply to  Louis Hunt
October 11, 2018 11:11 am

yes

Reply to  Jeff Alberts
October 7, 2018 4:08 pm

“explaining how the folks maintaining HADCRUT remove these outliers effectively and completely”
OK. This is no BOMBSHELL. These are errors in the raw data files as supplied by the sources named. The MO publishes these unaltered, as they should, but they perform quality control before using them. You can find such a file of data as used here. I can’t find a more recent one, but this will do. It shows, for example:
1. Data from Apto Uto was not used after 1970. So the 1978 errors don’t appear.
2. Paltinis, Romania, isn’t on that list, but seems to have been a more recently added station.
3. I can’t find Golden Rock, either in older or current station listings.

Paramenter
Reply to  Nick Stokes
October 8, 2018 1:20 pm

But they perform quality control before using them.

That’s a reasonable explanation. Still, removing significant noise from the underlying data, or trying to adjust it, introduces uncertainties. Furthermore, there are more issues highlighted, such as massive extrapolation of temperatures from small samples of records, an almost complete lack of historical records from ‘down under’, and ‘city adjustments’ which may have introduced a significant artificial cooling effect into many records.

TallDave
Reply to  Nick Stokes
October 8, 2018 4:53 pm

“But they perform quality control before using them”

Unfortunately, the quality control process is also riddled with obvious errors, so the net result is probably worse than the data.

Also, “we do QA later” doesn’t explain why obvious errors are still in the source data.

Reply to  TallDave
October 8, 2018 5:26 pm

“Also, “we do QA later” doesn’t explain why obvious errors are still in the source data.”
Because it is source data. People here would be yelling at them if they changed it before posting. You take the data as found, and then figure out what it means.

” the quality control process is also riddled with obvious errors”
You don’t know anything about the QC process.

TallDave
Reply to  TallDave
October 9, 2018 8:41 am

“Because it is source data. ”

That doesn’t explain why THE SOURCE didn’t correct the source data. Classic hiding the pea.

“You don’t know anything about the QC process.”

I know the quality control process is also riddled with obvious errors, so the net result is probably worse than the data.

TallDave
Reply to  TallDave
October 9, 2018 8:54 am

This is just the internal crap: When a thermometer is relocated to a new site, the adjustment assumes that the old site was always built up and “heated” by concrete and buildings. In reality, the artificial warming probably crept in slowly. By correcting for buildings that likely didn’t exist in 1880, old records are artificially cooled. Adjustments for a few site changes can create a whole century of artificial warming trends.
***
“It seems like neither organization properly checked the land or sea temperature data before using it in the HadCRUT4 dataset. If it had been checked then the CRU might have queried the more obvious errors in data supplied by different countries. The Hadley Centre might also have found some of the inconsistencies in the sea surface temperature data, along with errors that it created itself when it copied data from the hand-written logs of some Royal Navy ships.”
***
And this is just the internal stuff. As many, many analyses have pointed out, what gets done later in the name of QC is even worse, and every such step tends to add its own new error in the course of addressing other errors. To be accurate, the error bars should fill the graph.

TallDave
Reply to  TallDave
October 9, 2018 9:16 am

And who can forget CRU doing everything it could to avoid even an informal QC audit, to the point of deleting their own data, claiming confidentiality and lack of time and generally doing their best to circumvent FOIA.

http://blogs.nature.com/climatefeedback/2009/08/mcintyre_versus_jones_climate_1.html

TallDave
Reply to  TallDave
October 9, 2018 9:26 am

And that turned into just barrels of fun.

https://climateaudit.org/2010/03/05/phil-jones-called-out-by-swedes-on-data-availability/

https://climateaudit.org/2009/12/27/the-uk-met-office-subset/

https://climateaudit.org/2010/07/21/inquiry-disinformation-about-crutem/

“My long-standing position on CRUTEM was that CRU’s obstruction of data requests was most likely due to its desire to conceal that it did so little work on quality control; ”

Yep.

Steven Mosher
Reply to  TallDave
October 9, 2018 7:01 pm

““Because it is source data. ”

That doesn’t explain why THE SOURCE didn’t correct the source data. Classic hiding the pea.

“You don’t know anything about the QC process.”

I know the quality control process is also riddled with obvious errors, so the net result is probably worse than the data.

###############

You know no such thing.

Raw source data is riddled with errors, but skeptics LOVE THEIR RAW DATA.

The data suppliers should never touch raw data.

A) Data suppliers can apply QC and then document how they QCed. This is typically done with flags (see the sketch after this list).
B) Downstream users may apply their own QC and document it. NOAA does this; we do this.
C) In the grand scheme of things, QC versus no QC makes less than 1% difference.
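
A minimal sketch of the flag convention described in (A): the raw value is preserved verbatim and a separate flag records the QC verdict. The field and flag names are illustrative, not any particular archive’s schema.

```python
# Flag-based QC: never edit the raw value, just annotate it.
from dataclasses import dataclass

@dataclass
class Observation:
    station: str
    date: str
    raw_c: float           # reported monthly mean, preserved verbatim
    qc_flag: str = "OK"    # e.g. "OK", "RANGE_FAIL", "SPIKE"

def apply_range_check(obs, lo=-60.0, hi=50.0):
    """Set a flag instead of altering the raw value."""
    if not (lo <= obs.raw_c <= hi):
        obs.qc_flag = "RANGE_FAIL"
    return obs

o = apply_range_check(Observation("Apto Uto", "1978-06", 83.4))
print(o)  # raw_c is still 83.4; only the flag marks it as unusable downstream
```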

MarkW
Reply to  TallDave
October 11, 2018 11:16 am

As usual, Mosh demonstrates that he has no intention of arguing honestly.

Yes, we do insist on seeing the raw data, because very often the methods used to “clean” the data aren’t legitimate.

Nobody ever claimed that the raw data was pure, just that it was better than the data post cooking.

Regardless, the point is that this is an excellent example of why the climate scientists are so reluctant to provide raw data. It’s that bad.
The claim is that special statistical techniques can change this sow’s ear into a silk purse. That’s BS.

TallDave
Reply to  TallDave
October 11, 2018 11:20 am

“You know no such thing,”

On the contrary, I know the quality control process is also riddled with obvious errors, so the net result is probably worse than the data. Yes, the raw data is also terrible, but that doesn’t excuse an even worse QC process. Other published datasets are also rife with this kind of thing – I wrote my own little 4GL because I couldn’t believe SG’s claims about how many of the temperatures in the record were being generated by models.

Unfortunately the field is rife with this kind of behavior. It’s not just these shenanigans; it’s re-running Hansen 1988 with new data and claiming the results vindicate his failed predictions, or the various Climategate plottings against skeptics. It’s just activist slop everywhere, and it’s an affront to good science.

https://climateaudit.org/2018/10/07/pages2k-2017-south-america-revisited/

Reply to  Nick Stokes
October 9, 2018 2:40 am

The original certified data is rotting in a landfill in the Netherlands. What we have is adjusted data with no way of knowing what was adjusted – which they admit was adjusted. That adjustment has since been adjusted several times. Additionally, the researchers are hiding behind ‘the work is confidential and not available to the public’, paywalled or not. They’ve created a moving wave of higher temperatures: the current data is correct, but the former data always has to be corrected. In 30 years, today’s data will have to be adjusted. Why bother? It’s a belief system.
All the arguments against AGW are based on assuming that the data is correct, and AGW cannot stand up even to that. AGW as a theory should have died a death 10 years ago. Only belief in outdated, incorrect models keeps AGW alive.

Anthony Banton
Reply to  rishrac
October 9, 2018 3:01 am

“What we have is adjusted data with no way of knowing what was adjusted. Which they admit was adjusted. That adjustment has since been adjusted several times. ”

Of course you do – it’s in the original files direct from national Met services.
And no it hasn’t been adjusted several times – the continuing myth of the US GISS adjustments due to inadequate TOBs by weather observers and correct homogenisation to make apples = apples.
The biggest “adjustment” is to warm the past and reduce the global warming trend.
Or was that some kind of “double-bluff” conspiracy? (sarc)

http://www-users.york.ac.uk/~kdc3/papers/homogenization2015/temps_by_adj.png

John McLean
Reply to  Nick Stokes
October 19, 2018 4:36 am

Nonsense Nick.
– Figure 6.3, which uses data extracted from HadCRUT4, clearly shows the Apto Uto outliers.
– My guest post on Jo Nova’s blog (see https://wattsupwiththat.com/2018/10/11/bombshell-audit-of-global-warming-data-finds-it-riddled-with-errors/ ) has a worked example that shows how the Apto Uto data was included, moreover I show the fundamental flaw with the Hadley Centre’s so-called quality control.

John McLean
Reply to  Nick Stokes
October 19, 2018 4:58 am

Nick, I feel sorry for you. You believe what the CRU says despite the HadCRUT4 (and CRUTEM4) contradicting it.

Not only have I shown a fully worked Apto Uto situation (see http://joannenova.com.au/2018/10/hadley-excuse-implies-their-quality-control-might-filter-out-the-freak-outliers-not-so/#comment-2060139), but I’ve also looked at the Golden Rock Airport data and worked through the calculations that incorporate it into the CRUTEM4 data. I haven’t documented it for wider reading because, unlike Apto Uto, the Golden Rock HadCRUT4 grid cell contains a lot of ocean, and since the SST figures are correct, the relationship between the HadCRUT4 and CRUTEM4 values isn’t as obvious.
From some notes in front of me right now, in December 1984 only Golden Rock Airport (St Kitts), Raizet (Guadeloupe) and Melville Hall A (airport?) (Dominican Republic) reported data. The anomalies from these stations (i.e. Dec 1984 values minus December averages) are respectively -23.4, 0.3 and 0.3, the average of which is -7.6, which matches what’s in the CRUTEM4 grid cell for that month. The HadSST3 value is -0.72 (i.e. the sea surface temperature anomaly) and therefore HadCRUT4 is -2.43, which is quite a large spike compared to normal values. If Golden Rock had been 0.3 then CRUTEM4 would have been just 0.3 and HadCRUT4 probably about 0.6 rather than -2.43.
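
Working backwards from the numbers in this comment, and assuming the cell value is a simple linear blend of the land and sea anomalies (an assumption for illustration; the documented HadCRUT4 blending is described in the Met Office papers), one can recover the implied land weight:

```python
# Assume hadcrut = w * land + (1 - w) * sst in this grid cell
# (a simple linear blend, assumed here purely for illustration).
land_anoms = [-23.4, 0.3, 0.3]           # Golden Rock, Raizet, Melville Hall, Dec 1984
land = sum(land_anoms) / len(land_anoms)
print(f"CRUTEM4 cell value: {land:.1f}")  # -7.6, matching the comment

sst = -0.72                               # HadSST3 value quoted above
hadcrut = -2.43                           # HadCRUT4 value quoted above
w = (hadcrut - sst) / (land - sst)        # implied land weight
print(f"implied land weight: {w:.2f}")    # ~0.25, i.e. the cell is mostly ocean
```

One bad station anomaly, given roughly a quarter of the cell’s weight, is enough to turn an otherwise normal month into a -2.43 spike, which is the point of the worked example.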

Scott Bennett
Reply to  Jeff Alberts
October 7, 2018 4:39 pm

I’m seated and ready with my popcorn, eager to watch Nick Stokes chewing on this giant chunk of indefensibility:

The dataset starts in 1850 but for just over two years at the start of the record the only land-based data for the entire Southern Hemisphere came from a single observation station in Indonesia. At the end of five years just three stations reported data in that hemisphere.

I’m betting he will attempt to wash it down with liberal amounts of Hansen’s Coherence! 😉

S.W.B

Reply to  Scott Bennett
October 7, 2018 9:00 pm

CRUTEM4 (and HADCRUT) are shown with uncertainties. By the time you get back to 1950, they are large (about 0.5°C). SH uncertainty is over 1°C. I personally don’t use HADCRUT back to 1850, and I’m sure many don’t. But that is no reason to suppress the information.

Alan Tomalty
Reply to  Nick Stokes
October 8, 2018 12:26 am

Nick Stokes said
“I personally don’t use HADCRUT”

However the IPCC does and that is the problem.

Scott Bennett
Reply to  Nick Stokes
October 8, 2018 2:16 am

Nice chomp!

So, HadCRUT – when shown with uncertainties – is edible but completely unpalatable! 😉

Your response to a paper concerned with an audit of uncertainties is to point out that the dataset has uncertainties!
That is an answer, not a good one but it is an answer!

Given that governments are deciding energy and climate policies on claims based on the HadCRUT4 dataset, an independent audit of its accuracy and uncertainties was undertaken using data from 1850 to 2018. – John McLean

But isn’t it even worse than you admit, because the global mean is calculated as a land-area weighted average – with the emphasis on land area?

That lone Southern Hemisphere (SH) reporting station in 1850 grew to only nine by 1860, and to date 52.7% of all reporting grid cells have only 1, 2 or 3 observation stations – a figure that hasn’t improved beyond the minimum set in 1974.

Clearly, the paper points out that the uncertainties are greater than are known and that prior to 1950 HadCRUT4 is of limited value because for almost all of the period from 1850 to 1950 the coverage of the Earth’s surface was less than 50%.

This would seem important because the IPCC use the period 1850 to 1900 as a baseline.

Interestingly, the paper also found that from 1850 to 1900 SH temperature anomalies were disproportionately represented by particular latitude bands, and that this unequal contribution to the spatial coverage did not stabilise until 2015, unlike the Northern Hemisphere, which had stabilised by about 1950.

The bottom line is that the uncertainties described in the paper preclude the calculation of any meaningful trend in the database as a whole.

Anthony Banton
Reply to  Nick Stokes
October 8, 2018 9:56 am

“However the IPCC does and that is the problem.”

Only if you think that because we don’t know everything (precisely) then we know nothing.
If that’s the case then we will get nowhere in anything.

TallDave
Reply to  Nick Stokes
October 8, 2018 4:56 pm

They are shown with “station” and “sampling” uncertainty. I didn’t see one for “quality control uncertainty” but maybe that was because it was larger than the graph it applied to.

MarkW
Reply to  Nick Stokes
October 11, 2018 11:23 am

A grand total of nobody has suggested that the information be suppressed. We just point out that the data is not fit for use in determining the world’s temperature.

HotScot
Reply to  Nick Stokes
October 11, 2018 5:21 pm

OK, dumb question interval.

Is there a recognised standard like ISO 9001 for data QC across the scientific world?

John McLean
Reply to  Nick Stokes
October 19, 2018 4:48 am

Yes they are Nick, but I think you’ll find that those uncertainties are calculated from the available data (e.g. 2 standard deviations from the mean).
This method is only appropriate if the sample is representative of the whole, such as might be the case with political polls prior to elections, when a sample of let’s say 5,000 produces a result that is a pretty good estimate of tens of millions of votes.

The problems exposed by the HadCRUT4 data audit are very extensive, and we can’t say that they are all evenly balanced. While the error margin is probably rather evenly distributed between positive and negative for most issues, the “daylight savings” issue suggests that lower mean temperatures will result, the lack of adjustment for urbanised stations that closed would mean an upward bias, and the flawed adjustments for site relocations run right through the entire record and excessively lower earlier temperatures.
On top of that, the composite of two error margins is calculated in quadrature, not by simple addition (see the sketch below). The many different errors exposed by the audit would probably compound well beyond that.
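
For readers unfamiliar with the term: combining independent error margins “in quadrature” means taking the root-sum-of-squares rather than simply adding them. A minimal sketch, with invented margins:

```python
import math

def combine_in_quadrature(*margins):
    """Root-sum-of-squares of independent error margins."""
    return math.sqrt(sum(m * m for m in margins))

# Two invented, independent error sources of 0.3 and 0.4 °C:
print(combine_in_quadrature(0.3, 0.4))   # 0.5 °C, vs 0.7 °C by simple addition
```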

HotScot
October 7, 2018 6:37 am

I have said it before, but relying on data from cabin boys told to chuck a bucket over the side of a ship in the 1850s to judge SST, and expecting the tea boy, rather than the scientist in charge, not to be the one sent out in the wind, rain and snow to check a Stevenson screen, is simply a belief too far.

Not to mention that SST would have been routinely measured along well-worn trade routes, not in the Southern Ocean or large parts of the Pacific.

Satellites can’t really be relied on for their early days either, as there were calibration issues, breakdowns, etc., and Argo buoys (unsurprisingly) didn’t conform to the alarmists’ expectations, so they have been quietly ignored.

You’d have thought they’d have got it by now.

knr
Reply to  HotScot
October 7, 2018 7:10 am

Indeed, given the scale of the ocean, even current levels of measurement are like looking at a single hair and then claiming you know everything not only about the elephant it came from, but about the rest of the herd and the land it lives on.

Let us be fair: the standard of much in this area is in reality ‘better than nothing’, and that is ‘settled science’ in action.

R2Dtoo
Reply to  HotScot
October 7, 2018 10:10 am

But – multiple temps along one well-traveled shipping route can cover a lot of ocean when “homogenized” over 1200km/sarc.

John Tillman
Reply to  HotScot
October 7, 2018 8:28 pm

HotScot,

Before the Panama Canal, the Southern Ocean saw a lot of traffic, but with its terrible WX, I doubt that many good observations were made of SST between South America and Antarctica.

However the British coaling station at Sandy Point, today’s Punta Arenas, Chile, might have kept decent records.

Ships stopping for water and provisions at Valparaiso after rounding the Horn were so important to Chile’s economy that its navy supported Colombia’s war against the Panamanian separatists. But neither Chile nor Colombia could keep Teddy Roosevelt from pushing through the canal.

Steve Richards
October 7, 2018 6:39 am

Well done Dr McLean!

FRANK ATWILL
October 7, 2018 6:39 am

I can predict with a 99% probability that this story will not be covered by main media outlets.

Michael Jankowski
Reply to  FRANK ATWILL
October 7, 2018 3:56 pm

PhD theses rarely are.

John Tillman
Reply to  Michael Jankowski
October 7, 2018 5:13 pm

True, but Joshua Lederberg did get the Nobel Prize for his PhD thesis.

MarkW
Reply to  Michael Jankowski
October 7, 2018 6:24 pm

99% of papers are never covered by the media.

On the other hand ground breaking papers on topics of interest to the media usually are.
Unless the ground being broken is the ground the media has been standing on.

BCBill
Reply to  FRANK ATWILL
October 7, 2018 6:00 pm

CBC’s feeble and lone attempt at science reportage is called Quirks and Quarks. That show almost exclusively interviews female PhD candidates. This story fails the CBC publication test in that the researcher isn’t male, the work doesn’t align with the official CBC CAGW narrative and there is no gender preference angle. There isn’t a snowball’s chance in hell that CBC will cover it.

BCBill
Reply to  BCBill
October 7, 2018 6:01 pm

Should say “isn’t female”.

John Tillman
Reply to  BCBill
October 7, 2018 6:03 pm

PNW Neighbo(u)r,

Guess I should have waited a minute.

Assuming BC refers to British Columbia.

John Tillman
Reply to  BCBill
October 7, 2018 6:01 pm

Bill,

Isn’t female.

john
Reply to  BCBill
October 7, 2018 8:20 pm

There you go, paying attention again! Next I suppose you’ll be telling us that Justin is an intellectual lightweight who panders for votes to pamper his massive ego.

ScienceABC123
October 7, 2018 6:51 am

Garbage in – garbage out…

Crispin in Waterloo
Reply to  ScienceABC123
October 7, 2018 5:10 pm

ScienceABC

Not quite: quoting Willie Soon,

Garbage in, Gospel out.

Pop Piasa
Reply to  Crispin in Waterloo
October 11, 2018 9:13 pm

Shoot, all I ever get out from the garbage in is a brick of compacted trash! 🗑

d
October 7, 2018 6:54 am

Presumably this work will never be peer-reviewed, or published in an authoritative journal.

So it will be completely ignored by Climate Change scientists, and rejected for consideration by the IPCC….

mikewaite
Reply to  d
October 11, 2018 2:45 pm

Actually he has, with colleagues, published in respectable journals:
John McLean
– Published peer-reviewed papers –

4 – McLean, J. (2014) – “Late Twentieth-Century Warming and Variations in Cloud Cover”, Atmospheric and Climate Sciences, October 2014, (available online free of charge at http://www.scirp.org/journal/PaperInformation.aspx?PaperID=50837)

3 – de Freitas, C.R and J.D. McLean (2013) – ‘Update of the Chronology of Natural Signals in the Near-Surface Mean Global Temperature Record and the Southern Oscillation Index’, International Journal of Geosciences, 2013, 4, 234-239. (see http://mclean.ch/climate/docs/deFreitas_&_McLean_IJG_2013_SOI_&_Mean_Global_Temp.pdf )

2 – McLean, J.D., C.R. de Freitas and R.M. Carter (2009) – ‘Influence of the Southern Oscillation on tropospheric temperature’, Journal of Geophysical Research, vol. 114, D14104, doi:10.1029/2008JD011637. (see http://mclean.ch/climate/docs/McLean_deFreitas_Carter_JGR_2009.pdf)

(This paper provoked a ‘Comment’ (i.e. a criticism) but the journal took the almost unheard action of denying us the right of reply. See my comments at http://mclean.ch/climate/ENSO_paper.htm and the more detailed http://scienceandpublicpolicy.org/images/stories/papers/originals/agu_censorship.pdf.)

1 – McLean, J.D. (2006) – ‘A critical review of some recent Australian regional climate reports’, Energy and Environment, vol. 17 No. 1. (see http://mclean.ch/climate/docs/EE%2017-1_03%20McLean%20ok.pdf)

knr
October 7, 2018 7:02 am

The issue, of course, is what makes ‘good data’, and for that you need to consider what the purpose of the data is, not its scientific or empirical validity.

Once you understand that, you see how ‘bad data’ in an empirical sense becomes ‘good data’ from an ‘agenda’ point of view. And let us be fair: if your livelihood depends on results going a certain way, then when they do go that way, it’s tempting not to ask too many questions as to how valid they really are.

Add to that the reality that this is an area that fails basic experimental design, for it lacks both the range and the accuracy to cover what it claims to measure, and you can see the problem, despite the claims of ‘settled science’. And that is before we get to ‘adjustments’, which follow a pattern by ‘luck’ that should see the people behind them at the gaming tables of Vegas, where with that type of ‘luck’ they could earn millions.

TonyL
October 7, 2018 7:02 am

“St Kitts, a Caribbean island, was recorded at 0°C for a whole month”

This is a totally reasonable number. I was there once and bought an ice cream. (Ice cream is very expensive in the Caribbean.)
But for a better island average, a thermometer in a frozen foods cooler should probably be averaged out with a thermometer placed just above a Weber barbecue grill. (As the Surface Stations Project shows, the Weber grill is the favorite brand for sites reporting temperature data.)

In any event, this shows that sometimes thermometer placement can be significant, and inside an ice cream cooler may not be representative of a tropical island as a whole.

TallDave
Reply to  TonyL
October 8, 2018 4:58 pm

“the Weber grill is the favorite brand for sites reporting temperature data.”

Don’t be naive. That’s what the fossil fuel companies want you to think.

#WeberKnew

steve case
October 7, 2018 7:04 am

Bombshell? How many previous posts here and elsewhere have been headlined as bombshells? I’ll believe Global Warming/Climate Change will take a hit when the several billion in annual funding
https://www.globalchange.gov/about/budget
is reduced. Until then, to paraphrase Admiral David Glasgow Farragut, the response from the scientists who have bet their careers on Climate Change will be, “Damn the bombshells, full speed ahead!”

commieBob
Reply to  steve case
October 7, 2018 8:24 am

Armor matters.

Even though mid- to late-19th century cruisers typically carried up-to-date guns firing explosive shells, they were unable to face ironclads in combat. This was evidenced by the clash between HMS Shah, a modern British cruiser, and the Peruvian monitor Huáscar. Even though the Peruvian vessel was obsolescent by the time of the encounter, it stood up well to roughly 50 hits from British shells.

The CAGW ship is heavily armored. It can withstand many direct hits from explosive shells. Yes, we have bomb shells. No, they don’t have the effect we might hope for.

steve case
Reply to  commieBob
October 7, 2018 9:38 am

commieBob … 8:24 am

Thanks for reading my post – and the history lesson. Yes, it’s going to take more than science to sink the Climate Change juggernaut. Government funding is the mother’s milk of this insanity. Hit them in their pocketbooks and just maybe the hordes will start to seek employment elsewhere.

Nick K
Reply to  steve case
October 7, 2018 12:09 pm

Science has nothing to do with it. CAGW is a religion. Debunking an article of faith is completely impossible.

john
Reply to  steve case
October 7, 2018 8:25 pm

There is also an evil rot at the heart of our universities. Eventually something will have to be done about it.

catcracking
Reply to  commieBob
October 8, 2018 10:08 am

“The CAGW ship is heavily armored. It can withstand many direct hits from explosive shells. Yes, we have bomb shells. No, they don’t have the effect we might hope for.”
Absolutely, this applies to many other subjects besides CAGW.
Anyone who thinks otherwise should look at the facts versus the propaganda in the recent supreme court hearings and opinions for another example. Same tactics applied.
I was once naive and thought facts and data would rule the day; boy, was I wrong.

John Tillman
Reply to  commieBob
October 8, 2018 5:10 pm

Huáscar was captured by Chile in the Great Pacific War (1879–83), and is now a floating museum in Talcahuano harbor in the Greater Concepción metro area.

During the 1877 action against the Royal Navy in the Peruvian Civil War, she was the first ship ever attacked by self-propelled torpedoes.

mwhite
October 7, 2018 7:06 am

Won’t see this on the BBC science page

https://www.bbc.co.uk/news/science_and_environment

Clyde Spencer
Reply to  mwhite
October 7, 2018 9:40 am

mwhite,
How about a clue as to what we are supposed to be looking for on the page for which you provided a link?

MarkW
Reply to  Clyde Spencer
October 7, 2018 10:01 am

I believe you are supposed to be looking for an article which talks about the study mentioned in this post. With the point being you won’t find such an article.

Harry Passfield
Reply to  mwhite
October 7, 2018 12:12 pm

The BBC don’t allow deniers sceptics on air these days: to them, the science is settled even if it’s not been audited.

john
Reply to  Harry Passfield
October 7, 2018 8:29 pm

Peer review versus audit- I’m pretty sure the Mafia peer reviews their own business operations.

Andy Pattullo
October 7, 2018 7:11 am

This is excellent work and confirms many suspicions previously raised about the quality of the “global” long-term temperature measurements (or should I say adjustments). It would seem the only way to get a real understanding of what may have happened historically to global temperatures is to identify as many long-term, single-site records as possible without known artifacts from urbanization, station changes, etc., and look at their trends over time. If there was global temperature change, then the average trends for those stations should reflect both the direction and magnitude of that change. CET is a good example, and as far as I know it is not very alarming.

Bellman
Reply to  Andy Pattullo
October 8, 2018 4:37 am

Since 1970 CET has been warming at a rate of over 2°C / century. Faster than any global surface set.
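
For what a claim like this involves computationally: an ordinary least-squares slope over annual means, scaled to °C per century. A minimal sketch; the temperature series below is a placeholder, not real CET data, which is available from the Met Office HadOBS pages.

```python
# OLS trend over annual means, expressed in °C/century.
def ols_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1970, 2019))
temps = [9.5 + 0.02 * (y - 1970) for y in years]   # placeholder: +2 °C/century by construction
print(f"{ols_slope(years, temps) * 100:.2f} °C/century")
```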

Sommer
October 7, 2018 7:46 am

Would information like this hold up in an international RICO lawsuit?

Dodgy Geezer
Reply to  Sommer
October 7, 2018 8:00 am

What information?

I presume that Dr McLean is not an accepted Climate Scientist, and he has no peer-reviewed paper published in an acceptable journal. So we can see no reason to read anything written by him.

The BBC have already stated that ‘climate deniers’ must not be given any publicity. I assume that this audit counts as denial? Ergo – he will sink without a trace.

HotScot
Reply to  Dodgy Geezer
October 7, 2018 9:23 am

Dodgy Geezer

But to be a little more positive, the hits on CAGW just keep coming and there are lots of young journalists and politicians waiting to pounce, and make a name for themselves.

Public opinion is waning, the scandal of wasted money is becoming obvious, the under-performance of Germany’s energy policy is being recognised, the withdrawal of renewable subsidies is coming home to roost and the disregard of anything climate related by the Chinese by planning and building ~1,200 coal fired power stations is making people sit up and think.

No one likes Trump (allegedly) yet his policies are seeing America grow, whilst the Paris agreement and the IPCC are largely recognised as excuses for a knees up for the bureaucrats we Brits hate so much (and most other countries). The Kavanaugh fiasco is recognised as a political hatchet job by the left that’s failed miserably and will, I’m sure, engender yet more support for Trump.

In short, we sceptics just can’t stop winning, and the levee will eventually break when someone recognises there’s a name to be made by vilifying the green blob for all the damage it’s done to the world.

mikewaite
Reply to  Dodgy Geezer
October 7, 2018 9:36 am

I am not sure about that, DG . There is a 2009 paper from a J D McLean in the reasonably well esteemed AGU journal : J Geophys Res-Atmospheres , with one coauthor who is at JCU so I assume that this is the same person.
https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2008JD011637
It refers to the Southern Oscillation Index , one of the topics in the Mclean thesis from JCU
https://researchonline.jcu.edu.au/52041/
According to the publisher, the paper had a number of citations and an Attention Score (whatever that is) of 70, which seems to be quite good, apparently.

Duster
Reply to  Dodgy Geezer
October 7, 2018 2:45 pm

There are a lot of things piling up right now that are dissolving any credit AGW may have. While the media has collectively avoided any serious reporting of counter-narrative news, global crop losses this year due to unstable and unexpectedly persistent cold weather, and consequent increases in the prices of staples like wheat, soy and barley, will be making many people start asking where the warming went. The US actually saw “winter storm warnings” in the northwestern tier of states in late summer (https://agfax.com/2018/08/16/wheat-outlook-global-production-down-sharply-u-s-exports-lifted/). Serious losses have been experienced in both (all?) hemispheres. Russia has had serious crop reductions, and so have Australia and South Africa.

beowulf
Reply to  Dodgy Geezer
October 8, 2018 5:29 am

DG, try this list of his other peer reviewed papers, articles and submissions:

http://mclean.ch/climate/global_warming.html

D. J. Hawkins
Reply to  Sommer
October 7, 2018 10:43 am

Since RICO is a US law, whom would you bring to book and in what jurisdiction? There does not appear to be any international mechanism to address the breathtaking scope of this scam.

Sommer
Reply to  D. J. Hawkins
October 7, 2018 1:21 pm

There may be some overlap in the work that Charles Ortel is doing to expose the Clinton Charitable Tax Fraud and their illegal Climate Initiative schemes.
https://www.youtube.com/watch?time_continue=29&v=GBIYM1vwjco
Charles Ortel has talked about a RICO lawsuit in this context.

SMS
October 7, 2018 7:57 am

Let us not forget that nearly 40% of land temperature readings are estimated. This in addition to the bad data.

Murphyslaw
October 7, 2018 8:09 am

The IPCC are SMOOTH operators.

Phil
Reply to  Murphyslaw
October 7, 2018 11:54 am

+1.01357

auto
Reply to  Phil
October 7, 2018 3:07 pm

Phil,
Gorgeous.
I nearly lost a monitor to red wine!
Plus 2.834106554921 [approximately].

Auto

dearieme
October 7, 2018 8:30 am

“This process was at least equivalent to “peer review” as conducted by scientific journals.”

In my experience the examination of PhD dissertations in British and Commonwealth universities is routinely done to a distinctly higher standard than peer review for journals.

Fred250
Reply to  dearieme
October 7, 2018 11:48 am

“is routinely done to a distinctly higher standard than peer review for journals.”

Precisely !!

PhD reviewing is scientific, and very thorough.

Journal peer-review , is for journal publication.

Alan Tomalty
Reply to  Fred250
October 8, 2018 1:36 am

No, the PhD reviewing when it comes to climate science is a joke. Tell your professor and the chairman of the atmospheric science department that you want your PhD in atmospheric/climate science, and that your research has proved rock solid but that you don’t believe in CAGW. Your PhD will be delayed until you are programmed the CORRECT way. They will find any excuse not to give you the certificate until you demonstrate adherence to the religion.

Fred250
Reply to  Alan Tomalty
October 8, 2018 3:02 am

Yet John got this through.

Tenacity !! 🙂

Phil.
Reply to  Alan Tomalty
October 22, 2018 11:55 am

The procedure at JCU requires the candidate to submit a list of potential external examiners, from which the Advisory Panel chooses two who then independently examine the thesis.

Jtom
October 7, 2018 8:31 am

Considering that climate models have failed, and continue to fail, to show what temperatures are doing, climate modelers should embrace this research and publicize it far and wide.

They can now claim their models are NOT wrong, but that the historical data used as input was (and they are not at fault). “Our output was wrong, but we are still right.” They could then re-run the models with different data that reduces the short-term temperature increases, but keeps the longer-term, steeper upward trajectory. They may even be able to claim, “it’s worse than we thought.”

Frankly, though, I don’t see them as sufficiently intelligent to use this gambit to stay relevant in the debate.

DonK31
October 7, 2018 8:32 am

Now we know why Phil Jones didn’t want to release his data and methods… it’s so easy to find something wrong with them.

Rod Everson
October 7, 2018 9:01 am

As a general reader, I found the explanation of how making site adjustments resulted in lowering older temperature records incorrectly to be one of the most interesting points. Tony Heller has been printing graphs for years now that show how local records of past temperatures have been consistently adjusted downward. If the reason for those downward adjustments can be shown to be primarily due to the obviously incorrect process described in this thesis, then that should be a major story in itself.

But is that the case, or are there many other reasons for adjustments always seeming to cool the past? If not, someone should write a paper exposing the fraud, for fraud it would be. Anyone thinking it happened as a result of an innocent mistake or miscalculation hasn’t been paying attention the past few years.

2hotel9
October 7, 2018 9:03 am

Errors? That’s their story and they’re sticking to it.

E J Zuiderwijk
October 7, 2018 9:08 am

Quack data to be used by quacks. It figures.

Whiskey
October 7, 2018 9:15 am

Is this the same John McLean that predicted (in early 2011) “it is likely that 2011 will be the coolest year since 1956 or even earlier” ???

HotScot
Reply to  Whiskey
October 7, 2018 9:27 am

Whiskey

Nah, that was the John McLean in Die Hard.

Phoenix44
Reply to  Whiskey
October 7, 2018 9:35 am

And your point is?

That because you cannot attack the work, you attack the man, thus proving once again the ad hominem fallacy that alarmists love so much.

clipe
Reply to  Phoenix44
October 7, 2018 5:39 pm

” HotScot
October 7, 2018 at 9:23 am

Dodgy Geezer

But to be a little more positive, the hits on CAGW just keep coming and there are lots of young journalists and politicians waiting to pounce, and make a name for themselves.

Public opinion is waning, the scandal of wasted money is becoming obvious, the under-performance of Germany’s energy policy is being recognised, the withdrawal of renewable subsidies is coming home to roost and the disregard of anything climate related by the Chinese by planning and building ~1,200 coal fired power stations is making people sit up and think.

No one likes Trump (allegedly) yet his policies are seeing America grow, whilst the Paris agreement and the IPCC are largely recognised as excuses for a knees up for the bureaucrats we Brits hate so much (and most other countries). The Kavanaugh fiasco is recognised as a political hatchet job by the left that’s failed miserably and will, I’m sure, engender yet more support for Trump.

In short, we sceptics just can’t stop winning, and the levee will eventually break when someone recognises there’s a name to be made by vilifying the green blob for all the damage it’s done to the world.”

Whiskey
Reply to  Whiskey
October 7, 2018 6:02 pm

Not sure if you know what “ad hominem” means, but it does not mean going after what someone has said or done.
Here’s some more non ad hominem for you:
https://www.skepticalscience.com/John_McLean_arg.htm

MarkW
Reply to  Whiskey
October 7, 2018 6:28 pm

Once again, the warmists have to re-define the language in order to try and change the subject.

Skeptical Science? Really, is that the best you can do? Might as well quote Dr. Seuss.

Reply to  Whiskey
October 7, 2018 6:40 pm

“Not sure if you know what “ad hominem” means, but it does not mean going after what someone has said or done.”

Correct. Other misuses of philosophical terms are “begs the question” (misused 95% of the time) and “appeal to authority” (ditto).

Scott Bennett
Reply to  Whiskey
October 7, 2018 9:51 pm

==>Whiskey

Not sure if you know what “ad hominem” means, but it does not mean going after what someone has said or done. – Whiskey

What? You went straight after the man and not his argument!

Is this the same John McLean that predicted.. – Whiskey

The only time criticism of the person is not an ad hominem argument is if a person’s merits are actually the topic of the argument! You went straight to his credibility and that is attacking the man! You have confused fallacious reasoning with criticism. You went straight to “going after what someone has said or done” and that is the very definition of argument ad hominem*.

You unwittingly applied a typical form of psychological priming to “poison the well”, a subtle use of ad hominem to influence the views of spectators.

*A fallacious argumentative strategy whereby genuine discussion of the topic at hand is avoided by instead attacking the character, motive, or other attribute of the person making the argument, or persons associated with the argument, rather than attacking the substance of the argument itself. – Wikipedia

scross
Reply to  Whiskey
October 12, 2018 9:44 am

I find it a bit odd that of the eight links provided there under “Climate myths by McLean”, only one of those links (the last one) actually references him by name.

TallDave
Reply to  Whiskey
October 12, 2018 6:18 am

That was a pretty bad prediction if so, but not sure it was the same guy. Anyways, he’d be in good company with Hansen and others…

Hubert Lamb, Director of CRU, Sep 8 1972: “We are past the best of the inter-glacial period which happened between 7,000 and 3,000 years ago… we are on a definite downhill course for the next 200 years….The last 20 years of this century will be progressively colder.” http://news.google.com/newspapers?nid=336&dat=19720908&id=AiwcAAAAIBAJ&sjid=0VsEAAAAIBAJ&pg=5244,2536610

John Firor, Excecutive Director of NCAR, 1973: “Temperatures have been high and steady, and steady has been more than high. Now it appears we’re going into a period where temperature will be low and variable, and variable will be more important than low.”

scross
Reply to  TallDave
October 12, 2018 9:33 am

Given that the prediction isn’t an actual quote from McLean, but rather came from the writer of a media statement (see below); and given the oddity of the prediction; and given that the quick-and-dirty review of potentially relevant documents which I did didn’t turn up any evidence to back it up anyway, I have to assume that this was simply a misunderstanding on the part of the person who wrote the media statement. (Such a situation isn’t uncommon.) I didn’t dig too deeply into it, though, so I could be wrong.

John McLean: Statement: COOL YEAR PREDICTED: Updated with LATEST GRAPH

http://climaterealists.com/index.php?id=7349

John McLean
Reply to  Whiskey
October 19, 2018 5:08 am

Lordy, lordy, lordy! You mean that researchers can’t test hypotheses by making predictions and seeing if they come true or not?
And please tell us all how the many predictions made using climate models have turned out.

TomRude
October 7, 2018 9:16 am

If the climatic alarm were true, the very first task of the scientists involved would have been to set up a tight grid of new stations overlapping the best existing ones, let the data flow in for 30 years, and then start to make sense of temperature, pressure, humidity, etc.
Instead, algorithms, computer models, a complete dismissal of climatologists’ and geographers’ knowledge (see Leroux versus Legras and company) and scientactivist media campaigns replaced the search for a diagnosis, away from politics.

Alan Tomalty
Reply to  TomRude
October 8, 2018 1:44 am

No need. The UAH satellite temperature dataset is the only one that both sides trust. Everybody drools near the end of every month waiting for it to come out on the 2nd day of the next month. This dataset is now where the climate wars are fought, because the alarmists don’t have any other credible data they can point to. Eventually even the UAH dataset will crumble the alarmist sand castle, as the daily tides always have to come back in.

Antony Banton
Reply to  Alan Tomalty
October 8, 2018 2:10 am

An outlier, by definition, is not the most likely……

http://postmyimage.com/img2/510_Tropospheretrends.png

Especially as neither RSS nor UAH is consistent with the sensor on the previous satellite, which was superseded in 1998.
UAH says the present one is the correct one; pragmatically, RSS says we don’t know, and splits the difference….

It is one instrument having taken over from the previous instrument, measuring in any case a depth of the troposphere and missing the surface, where the majority of warming is taking place over land.

John Tillman
Reply to  Antony Banton
October 8, 2018 10:08 am

Anthony,

Don’t you think that a warming surface ought to warm the troposphere?

The GHE hypothesis supposes that a troposphere warmed by slowing down the migration of heat toward space will warm the surface. If the surface is warming before and faster than the troposphere, then the GHE hypothesis is falsified.

tty
Reply to  John Tillman
October 11, 2018 11:07 am

“The GHE hypothesis supposes that a troposphere warmed by slowing down the migration of heat toward space will warm the surface.”

No, it doesn’t. Read:

https://wattsupwiththat.com/2018/10/09/richard-lindzen-lecture-at-gwpf-global-warming-for-the-two-cultures/

for a lucid explanation of what the GHE really is.

John Tillman
Reply to  John Tillman
October 11, 2018 11:27 am

Tty,

I was going with the official US government version of the GHE. That doesn’t mean I agree with it. In fact, I agree with Lindzen’s version of the hypothesis. But my point was that, given this view of the GHE, observations don’t support the claim that whatever warming has occurred is due to such an effect.

Lindzen says that water vapor and other greenhouse gases elevate “the emission level, and because of the convective mixing, the new level will be colder. This reduces the outgoing infrared flux, and, in order to restore balance, the atmosphere would have to warm.”
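Lindzen’s mechanism can be put into back-of-envelope numbers. The sketch below uses standard textbook figures (roughly 240 W/m² of outgoing flux, a 6.5 K/km lapse rate) and an assumed 150 m rise in the effective emission level, purely for illustration:

    # Back-of-envelope sketch of the emission-level argument. All values
    # are standard textbook figures or illustrative assumptions.
    sigma = 5.67e-8              # W m^-2 K^-4, Stefan-Boltzmann constant
    olr = 240.0                  # W m^-2, global-mean outgoing longwave flux
    lapse_rate = 6.5             # K/km, typical tropospheric lapse rate

    T_e = (olr / sigma) ** 0.25  # effective emission temperature, ~255 K
    T_s = 288.0                  # K, global-mean surface temperature
    z_e = (T_s - T_e) / lapse_rate   # effective emission height, ~5 km
    print(f"emission temperature: {T_e:.1f} K at ~{z_e:.1f} km")

    # If added greenhouse gases raise the emission level by delta_z while
    # convection keeps the lapse rate fixed, the surface must warm by
    # lapse_rate * delta_z to restore the same T_e and outgoing flux.
    delta_z = 0.15               # km, assumed rise in emission level
    print(f"implied surface warming: {lapse_rate * delta_z:.2f} K")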

NASA, by contrast, explains the GHE thus:

“A layer of greenhouse gases – primarily water vapor, and including much smaller amounts of carbon dioxide, methane and nitrous oxide – acts as a thermal blanket for the Earth, absorbing heat and warming the surface to a life-supporting average of 59 degrees Fahrenheit (15 degrees Celsius). Most climate scientists agree the main cause of the current global warming trend is human expansion of the “greenhouse effect” — warming that results when the atmosphere traps heat radiating from Earth toward space. Certain gases in the atmosphere block heat from escaping.”

https://climate.nasa.gov/causes/

John Tillman
Reply to  John Tillman
October 11, 2018 11:54 am

The official US government and IPCC hypothesis could, I guess, be called retarded, since it supposes that more CO2 retards the movement of heat from the surface to space.

Red94ViperRT10
October 7, 2018 9:39 am

“…the surface temperature data is unfit for purpose…”

Gee, Anthony, where have we heard that before? https://wattsupwiththat.com/2012/07/29/press-release-2/

StephenP
October 7, 2018 9:45 am

Watch your back John McLean, you will have upset a very big applecart.

John McLean
Reply to  StephenP
October 19, 2018 5:09 am

But I was told that sacred cows make the best hamburgers.

Earthling2
October 7, 2018 9:52 am

Welcome to the Adjustocene, where if we don’t know what the historical temperatures were, we make them up. This point has to be driven home over and over again: we don’t have a very reliable data set for most of the 19th century and much of the 20th century. Knowing that, it is only fitting that we accept a wider margin of error for what that hypothesized data might be, with the caveat that, going forward, the error bars on newer data can be somewhat tightened as we gather more accurate measurements from more of the surface of the earth. That means 19th-century data worldwide is speculative at best and manufactured at worst. It really doesn’t mean much, other than that we know it was still fairly cold after the previous 500 years of a cooling trend through the LIA, which we know with some certainty was much colder than any previous historical normal. That makes 1850, the starting point for this current exercise, colder than any historical normal. Adding 1.5 °C to a really cold beginning doesn’t even allow for much natural variability.

If the IPCC wants credibility, then it should at least be honest with itself about the data it does have. It would also be more reasonable if the threshold for dangerous warming were set from 1950 going forward, instead of from some mythical temperature in 1850 at the tail end of the LIA, one of the coldest periods in the Holocene to date. It should be noted that this was a fairly cold time in the history of the world. If we do see long-term temperatures trending much higher, by 2 °C over the next 30+ years to 2050, then that should be the basis for taking any kind of action to limit the economic output of the world by limiting CO2 and other GHG production, if it is demonstrated that GHGs are indeed a significant factor.

So far in the 21st century, temperatures seem to be within an acceptable range of error; in fact, a global hiatus, or pause, in any significant warming over the first 18-19 years of the 21st century indicates that any temperature increase is not linear with CO2 concentration in the atmosphere. So let’s allocate resources to collecting honest and accurate weather and climate data so that wise decisions can be made in the next 30 years. It is very early yet to be declaring any emergency, and no real significant threat has been demonstrated, other than that much of the very populous world is simply not ready for any kind of normal inclement weather, which is what leads to alarmism in general. Perhaps that is where any resources should first be spent: hardening our defences against inclement weather.

Peter Morris
October 7, 2018 9:53 am

Holy cow, you weren’t kidding. This is HUGE news. I knew the dataset was sparse in the 19th century, but that is ridiculous! There’s really no reason to trust HadCRUT before 1950 at the earliest.

I’m sure the response will be measured and sober.

Dodgy Geezer
Reply to  Peter Morris
October 7, 2018 10:32 am

…I’m sure the response will be measured and sober…

What response? This will simply be ignored.

Reply to  Peter Morris
October 7, 2018 6:45 pm

Delingpole says that McLean says that of the 0.6 °C of warming since 1950, 0.2 °C is likely exaggerated.

ATheoK
October 7, 2018 9:56 am

John McLean has been a WUWT guest blogger, or has indirectly supplied article information, a number of times before and has always been instructive.

Some of his previous contributions:


“Reckless commitments to the Paris Climate Agreement, November 10, 2017”
“Friday Funny: more upside down data, March 25th, 2016; via an article by Bishop Hill where John McLean asked for a look-over”
“Hadley Climate data has been ‘corrected’ thanks to alert climate skeptic, April 11th, 2016”

Bruce Cobb
October 7, 2018 10:01 am

I am 97% sure that it is a complete coincidence that the errors almost always favor warming.
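The sarcasm has a statistical point behind it: if each error were equally likely to warm or cool the record, nearly all of them warming it would be wildly improbable. A toy sign test, with purely hypothetical counts (the audit is not tabulated this way; the numbers below are invented):

    from math import comb

    # Suppose 40 independent errors/adjustments were found, each a priori
    # equally likely to push the trend up or down, and 36 pushed it up.
    # Probability of at least 36 of 40 warming the record by chance:
    n, k = 40, 36
    p = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    print(f"P(>= {k} of {n} warm-biased by chance) = {p:.2e}")  # ~1e-7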

TallDave
Reply to  Bruce Cobb
October 12, 2018 6:20 am

I have peer reviewed this statement and therefore it must be true.

MarkW
Reply to  TallDave
October 12, 2018 7:51 am

I have peer reviewed your peer review and find that I have no reason to object to it being published.

Timo Soren
October 7, 2018 10:04 am

Back in 2005, McIntyre posted a comment from P. Jones:

Why should I make the data available to you, when your aim is to try and find something wrong with it.

Well, yes, we do want to look at it, and of course McIntyre was completely correct about the need to look!

Adam Gallon
Reply to  Timo Soren
October 7, 2018 2:22 pm

That was an exchange between Jones & Warwick Hughes. https://climateaudit.org/2005/10/15/we-have-25-years-invested-in-this-work/

Ristvan
October 7, 2018 10:16 am

It has been obvious for quite a while that the temperature data is not fit for climate purpose. See, for example, the essay “When Data Isn’t” in the ebook Blowing Smoke.
Good to have yet another detailed confirmation of that basic fact.

Michael Jankowski
October 7, 2018 10:18 am

It is easy to predict the responses…

(1) errors are minor and make no difference
(2) there are other data sets which independently verify the temperature record
(3) examples of errors presented show readings both too cold and too warm, which would mostly cancel out, as errors often do (see the sketch after this list)
(4) McLean has misrepresented his qualifications previously
(5) McLean’s prior works were heavily criticized and/or avoided rigorous peer-review
(6) McLean is an industry shill

etc.
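On point (3), the cancellation claim is itself testable: random, symmetric errors shrink roughly as 1/sqrt(N) when averaged, but a systematic bias survives averaging no matter how many readings go in. A minimal sketch with invented numbers:

    import numpy as np

    # Random errors average out; a constant bias does not.
    rng = np.random.default_rng(42)
    n = 10_000
    true_temp = 15.0
    random_errs = rng.normal(0, 1.0, n)   # symmetric noise, sd = 1 C
    bias = 0.3                            # constant warm bias, 0.3 C

    noisy_mean = (true_temp + random_errs).mean()
    biased_mean = (true_temp + random_errs + bias).mean()
    print(f"mean, random errors only: {noisy_mean:.3f} C "
          f"(expected error ~ {1.0 / np.sqrt(n):.3f} C)")
    print(f"mean, with 0.3 C bias:    {biased_mean:.3f} C")

So the cancellation defense holds only if the errors really are symmetric and unbiased, which is exactly what is in dispute.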

Dave Burton
Reply to  Michael Jankowski
October 7, 2018 11:25 am

Yes, except that #4-5 will come first. The climate activists’ first impulse is generally the ad hominem attack. Any reference to actual data comes later, if at all.

Dave Burton
Reply to  Dave Burton
October 7, 2018 11:26 am

Oops, typo. I meant #4-6.