Inquiry Launched Into Global Temperature Data Integrity

The International Temperature Data Review Project

London, 26 April 2015 – The Global Warming Policy Foundation, a London-based think-tank, is today launching a major inquiry into the integrity of the official global surface temperature records.

An international team of eminent climatologists, physicists and statisticians has been assembled under the chairmanship of Professor Terence Kealey, the former vice-chancellor of the University of Buckingham. Questions have been raised about the reliability of the surface temperature data and the extent to which apparent warming trends may be artefacts of adjustments made after the data are collected. The inquiry will review the technical challenges in accurately measuring surface temperature, and will assess the extent of adjustments to the data, their integrity and whether they tend to increase or decrease the warming trend.

Launching the inquiry, Professor Kealey said:

“Many people have found the extent of adjustments to the data surprising. While we believe that the 20th century warming is real, we are concerned by claims that the actual trend is different from – or less certain than – has been suggested. We hope to perform a valuable public service by getting everything out into the open.”

To coincide with the inquiry launch Professor Kealey has issued a call for evidence:

“We hope that people who are concerned with the integrity of climate science, from all sides of the debate, will help us to get to the bottom of these questions by telling us what they know about the temperature records and the adjustments made to them. The team approaches the subject as open-minded scientists – we intend to let the science do the talking. Our goal is to help the public understand the challenges in assembling climate data sets, the influence of adjustments and modifications to the data, and whether they are justifiable or not.”

All submissions will be published.

Further details of the inquiry, its remit and the team involved can be seen on its website: www.tempdatareview.org.

The controversy

Climatologists have long been aware of the poor state of global surface temperature records and considerable effort has been put into adjusting the raw data to correct known errors and biases. These adjustments are not insignificant. For example, it has been noted that in the temperature series prepared by NOAA for the USA, the adjusted data exhibit a much larger warming trend than the raw data.

Source: http://1.usa.gov/1gQRThX

It has also been noted that over the years changes to the data have often tended to cool the early part of the record and to warm more recent years, increasing the apparent warming trend.

Although the reasons for the adjustments that are made to the raw data are understood in broad terms, for many of the global temperature series the details are obscure and it has proved difficult for outsiders to determine whether they are valid and applied consistently. For all these reasons, the global surface temperature records have been the subject of considerable and ongoing controversy.

The panel

In order to try to provide some clarity on the scientific issues, the Global Warming Policy Foundation has invited a panel of experts to investigate and report on these controversies.

The panel features experts in physics, climatology and statistics and will be chaired by Professor Terence Kealey, the former vice-chancellor of the University of Buckingham.

Terms of reference

Detailed terms of reference for the panel have been published.

Submissions of evidence

With four major surface temperature series to consider, each incorporating several layers of adjustment, the scope of the inquiry is very wide. The panel is therefore seeking to benefit from the considerable expertise that already exists on the surface records and is inviting interested parties to submit evidence.

After review by the panel, all submissions will be published and can be examined and commented upon by anyone who is interested.

The deadline for submitting evidence is 30 June 2015.

Report

No timetable has been set for the panel to report.

Contact

The International Temperature Data Review Project

Chairman

Professor Terence Kealey

terence.kealey@buckingham.ac.uk

The International Temperature Data Review Project

http://www.tempdatareview.org/


507 Comments
Robin Hewitt
April 27, 2015 2:11 am

Little help please, I am not understanding Brandon’s graphs. I thought an anomaly graph was the difference between two other graphs. One graph could show the difference between the adjusted and the raw data, so two lines suggests to me that the data are being compared with something else. What?

richardscourtney
Reply to  Robin Hewitt
April 27, 2015 2:26 am

Robin Hewitt
You request

Little help please, I am not understanding Brandon’s graphs.

I suggest it is best not to go there.
When pressed on a point it is the normal practice of Brandon Gates to copy&paste stuff he does not understand and that often contradicts what he has tried to say. Any query of it results in his replying with long-winded and irrelevant diatribes which disrupt threads.
Richard

MikeB
Reply to  Robin Hewitt
April 27, 2015 4:19 am

Robin
When we talk about global warming we are talking about the temperature on the SURFACE of the Earth, because that is where we live. So, how do we measure the absolute temperature of the surface of the earth? There is no practical way to do that. Although there are a large number of weather stations dotted around the globe they do not provide a representative sample of the whole surface. Some stations may be on high mountains, others in valleys or local microclimates, and the coverage they provide is not evenly spaced over the surface of the Earth.
What is more, the station readings have not been historically cross-calibrated with each other and ‘observation times’ vary.
However, we have a much better chance of determining whether or not temperatures are increasing by comparing measurements from weather stations with those they gave in the past; we call these differences ‘anomalies’. That is to say, my thermometer may not be accurately calibrated, but I can still tell whether it is getting warmer or colder.
It is difficult (impossible) therefore to determine the absolute temperature of the earth accurately with any confidence. We could say that it is probably about 14 or 15 deg.C ( my 1905 encyclopaedia puts it at 45 deg.F). If you really want absolute temperatures then just add the anomalies to 14 or 15 degrees. It doesn’t change the trends.
All the world’s major datasets of global mean temperatures are thus expressed in terms of ‘anomalies’, the difference between the temperatures measured now to the temperatures averaged over a particular ‘baseline’ in the past (baselines typically average temperatures over a period of 20 or 30 years).
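The baseline-anomaly arithmetic described above can be sketched in a few lines. This is a minimal illustration only; the station, its readings, and the function name are all invented:

```python
# Anomaly = reading minus the station's own mean over a baseline period,
# so a constant calibration offset cancels out, as described above.
def anomalies(readings, baseline_years):
    """readings: dict mapping year -> mean temperature for one station."""
    baseline = [t for y, t in readings.items() if y in baseline_years]
    base_mean = sum(baseline) / len(baseline)
    return {y: round(t - base_mean, 2) for y, t in readings.items()}

# Hypothetical station, possibly miscalibrated by a constant offset:
station = {1961: 14.1, 1962: 14.3, 1963: 14.0, 1990: 14.9}
print(anomalies(station, range(1961, 1964)))  # 1990 anomaly comes out +0.77
```

Adding a fixed 0.5 degrees to every reading leaves the output unchanged, which is exactly the point MikeB makes: anomalies survive poor absolute calibration.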

Reply to  MikeB
April 27, 2015 10:21 am

Mike,
So now we are interested solely in “where we live” warming, rather than the global kind?
Big news!
Make sure everyone knows!
Nobody lives in the middle of the oceans, nor under them, nor on the Arctic sea ice, nor in Antarctica (visiting is not living, is it? If it is, I have some house guests who owe me rent), do they?

MikeB
Reply to  MikeB
April 27, 2015 11:24 am

Absolutely menicholas
Global warming is only a concern in respect of where we live.
In general, greenhouse gases warm the surface of the planet while, at the same time, they cool the atmosphere at high altitudes.
Didn’t you know that?

Reply to  MikeB
April 29, 2015 3:39 pm

I am having a little trouble with my scorecard, being relatively new here.
So I am not sure if you are serious.

son of mulder
April 27, 2015 2:52 am

If I look at the graph of difference after adjustments, is it reasonable to conclude that we were much better at measuring temperature 80 to 100 years ago?

Resourceguy
Reply to  son of mulder
April 27, 2015 10:16 am

Clearly the recommendation written before the start of the effort is to shoot down the satellites and defund ARGO. That would solve it.

mikewaite
April 27, 2015 3:22 am

A belated reply to Brandon for his answer (April 26, 3:22 pm) to my small point made earlier.
An apology for the delay, but the time zones are contrary and yesterday I was busy all day on a “dig”.
My point about the data being possibly adjusted before NOAA analyse them was really addressed to others who, like me, might be unaware of the mechanism to the NOAA work. I am aware of the extent of your information files and have in fact made a note of the references that you regard as the most useful. I do not expect this enquiry team to achieve much, but if it is satisfied that the data are largely uncorrupted then confidence in science, and in the topic of climate change, is maintained. Whilst if there are some dubious series of data, correcting or removing them will also make the science of climatology stronger. Win-win for everyone, surely.
Thank you for the reply to my “second derivative” idea. Yes, the WFT plot was the one that interested me, showing how successive cooling periods have shown shorter lengths and lower slopes as time, and CO2 concentration, progresses. So is it some underlying temporal change or is it just CO2? The other plots I will need time to digest.

Brandon Gates
Reply to  mikewaite
April 27, 2015 11:10 am

mikewaite,

A belated reply to Brandon for his answer(April 26 , 3.22pm) to my small point made earlier.
An apology for the delay , but the time zones are contrary and yesterday I was busy all day on a “dig” .

No worries.

My point about the data being possibly adjusted before NOAA analyze them was really addressed to others who , like me , might be unaware of the mechanism to the NOAA work .

Now I’m sorry … I don’t follow what you mean by: the mechanism to the NOAA work.
More to my original point, I get weary of folks here raising questions as if that settles the issue. Anybody can ask questions. As you may have noticed, I’m in a sour mood on this thread, apologies if my comments toward you have been barbed more than warranted.

I am aware of the extent of your information files and have in fact made a note of the references that you regard as the most useful. I do not expect this enquiry team to achieve much , but if it is satisfied that data is largely uncorrupted then confidence in science , and the topic of climate change , is maintained . Whilst if there are some dubious series of data , correcting or removing them will also make the science of climatology stronger . Win-win for everyone surely.

As I’ve said elsewhere, I’m not opposed to these sorts of audits in principle. I am asking the rhetorical question, how many such reviews is it going to take? On this particular one, again, as I’ve mentioned elsewhere, the questions are not at all novel and some of them don’t really make much sense. So I’m dubious to the extreme that it’s a good-faith attempt to do anything meaningful for the science, and like harrytwinotter does, think it’s just a noisemaking stunt.
It would be properly sceptical for me to reserve judgement until the report comes in, but I find that I cannot, in this instance, do that.

Thank you for the reply to my “second derivative” idea . Yes the WFT plot was the one that interested me , showing how successive cooling periods have shown shorter lengths , and lower slopes as time , and CO2 concentration progresses . So is it some underlying temporal change or is it just CO2 ? The other plots I will need time to digest .

I’m loath to discuss it in this thread. I’ll carry those questions over to where they’re in context and answer there.

kim
Reply to  Brandon Gates
April 27, 2015 11:50 pm

Heh, you can’t reserve judgement, cuz you’ve already judged. Amusing that you can give lip service to reserving judgement, but kiss it off anyway.
============

Brandon Gates
Reply to  Brandon Gates
April 28, 2015 1:23 pm

Kim,
Amusing that you see my self-admitted failure but choose to be critical anyway.

Brandon Gates
Reply to  Brandon Gates
April 28, 2015 1:51 pm

mikewaite,
As promised I have responded to your other questions about CO2 derivatives here: http://wattsupwiththat.com/2015/04/22/a-statistical-definition-of-the-hiatus-in-global-warming-using-nasa-giss-and-mlo-data/#comment-1920448

Reply to  Brandon Gates
April 29, 2015 12:30 am

I am disappointed.
Taking the time to write about one being in a foul mood is either irrelevant, a ploy to distract, unseemly and embarrassing pouting, or just a waste of keystrokes.
Who cares about someone else’s snotty moods?
Last time I will comment on it.

Reply to  Brandon Gates
April 29, 2015 3:40 pm

BTW, That was NOT at Kim.

April 27, 2015 7:28 am

Fishy. Prof. Muller’s BEST project was an alarmists’ PR trick — I smelled it from the beginning. This one? One thing is certain: following the famous sheriff’s maxim, they will NOT dig up anything they can’t bury.
Nobody wants to lose their academic pass to the chow trough. Nobody is big enough to go against the mass mesmerizers (remember what “progressive” racists did to Dr. Watson, the DNA discoverer?).
A lie is ever triumphant, simply because it is easier than the truth.

Richard Ilfeld
April 27, 2015 8:31 am

For SCIENCE, the ‘best’ thing a new group could do, IMHO, is put all available raw records into a common data form, along with identification of the instrument used, the sponsoring agent name, an abstract of the level of urbanization, and a link to any other mind-numbing detail on file. We have a plethora of homogenized records, while the underlying data, with which one can intelligently criticize the homogenization, remains generally anecdotal, or very inconveniently available.

Reply to  Richard Ilfeld
April 29, 2015 12:24 am

“We have a plethora of homogenized records, while the underlying data, with which one can intelligently criticize the homogenization, remains generally anecdotal, or very inconveniently available.”
Richard,
The main point of this whole thing. Thank you!

Resourceguy
April 27, 2015 9:16 am

BEST of luck on that effort. Or maybe it’s the effort that counts and not the outcomes.

April 27, 2015 9:32 am

I’m coming to the conclusion that discussions on this site tend to be rather long winded and repetitive. I have an anecdote and a suggestion.
This is the story of the annual convention of comedians which takes place every year, and the existence of a certain book, wherein is a compilation of all of the funniest jokes of all time.
Now, since all the comedians have read this book and know it, inside out, front to back, and have it memorized, when these folks are all gathered together and one of them wants to tell a joke, they no longer need to recite the entire joke. Rather they just say the page number of the joke they want to tell, which is invariably greeted with uproarious laughter.
I propose a similar format here.
We could compile all the different arguments that anyone has ever made, particularly those that are made over and over and over again, then number them.
From that point on, once all the regulars have the whole thing memorized, each person can just say the number of the argument they wish to present.
This will allow us all to work through these threads much more quickly and move on to the next story.
You are all welcome!
Whoever wants to compile the list can get started immediately.
[Wouldn’t work. The mods would still keep getting requests to correct the misspelled comment abbreviations. 8<) .mod]

Reply to  menicholas
April 27, 2015 10:36 am

Doh!

Reply to  menicholas
April 27, 2015 11:26 am

I heard that that was tried once with jokes. A member would say a certain number and all laughed. Then when a different person said a different number, no one laughed. When he asked why no one laughed, he was told because of the way he said it.
[Yes, that was the reference (jokes by number) that was being used. The mods merely abused it further. 8<) mod]

Reply to  Werner Brozek
April 27, 2015 11:31 am

Werner Brozek on April 27, 2015 at 11:26 am
– – – – – –
Werner Brozek,
Now that is funny.
John

Reply to  Werner Brozek
April 27, 2015 12:19 pm

Well you see, all us comedians know the whole joke, so there was no need to tell that part.
🙂
But good on ya for knowing that!

Reply to  Werner Brozek
April 27, 2015 12:20 pm

BTW, 47!

Reply to  Werner Brozek
April 29, 2015 12:34 am

“[Yes, that was the reference (jokes by number) that was being used. The mods merely abused it further. 8<) mod]"
But that is OK. I am more disappointed that Mr. Whitman thinks Mr. Brozek is funny, but I am not. Apparently, I told it wrong, too.
*sigh*

April 27, 2015 9:54 am

{bold emphasis mine – JW}
Terms of reference
The panel is asked to examine the preparation of data for the main surface temperature records: HadCRUT, GISS, NOAA and BEST. For this reason the satellite records are beyond the scope of this inquiry.
The following questions will be addressed.
1. Are there aspects of surface temperature measurement procedures that potentially impair data quality or introduce bias and need to be critically re-examined?
2. How widespread is the practice of adjusting original temperature records? What fraction of modern temperature data, as presented by HadCRUT/GISS/NOAA/BEST, are actual original measurements, and what fraction are subject to adjustments?
3. Are warming and cooling adjustments equally prevalent?
4. Are there any regions of the world where modifications appear to account for most or all of the apparent warming of recent decades?
5. Are the adjustment procedures clearly documented, objective, reproducible and scientifically defensible? How much statistical uncertainty is introduced with each step in homogeneity adjustments and smoothing?
A project of the Global Warming Policy Foundation
http://www.tempdatareview.org/remit

“For this reason the satellite records are beyond the scope of this inquiry” seems a rational and clean division of labor to make. Which leads to asking whether there have been audits of the satellite records by independent panels in a similar fashion with a similar charter, and, if not, whether there are currently any plans by independent panels to audit the satellite records.
“1. Are there aspects of surface temperature measurement procedures that potentially impair data quality or introduce bias [. . .]” addresses the nature of the processes that result in the ‘original’ data. It is crucial that the ITDR Project formally addresses this, as it establishes a clear understanding of the limits of surface-temperature-focused activities. I think it is important that it was the first item of the five terms of reference.
In item 2 of the terms of reference there is focus on determining whether there is a significant occurrence of transforming data; thus the ITDR Project seeks to document “[w]hat fraction of modern temperature data, as presented by HadCRUT/GISS/NOAA/BEST, are actual original measurements, and what fraction are subject to adjustments”. This item does not yet explicitly address the validity of the ‘adjustments’.
In item 3 of the terms of reference there is focus on comparing the occurrence of positive and negative direction in transformations of data; thus the ITDR seeks to document whether “warming and cooling adjustments [are] equally prevalent”. This item does not yet explicitly address the validity of the ‘adjustments’.
In item 4 of the terms of reference we see the introduction of a focus on the geographic location of transformed data; thus the ITDR seeks to document whether “there [are] any regions of the world where modifications appear to account for most or all of the apparent warming of recent decades”. This item does not yet explicitly address the validity of the ‘adjustments’.
Item 5 of the terms of reference focuses on the validity of the ‘adjustments’; thus the ITDR seeks to document whether “the adjustment procedures [are] clearly documented, objective, reproducible and scientifically defensible”. Item 5 is the beneficiary of the context set by items 1 through 4. I would like to stress what I think is a very crucial aspect of the work product of HadCRUT/GISS/NOAA/BEST that the ITDR could document within its current terms of reference: I think it is crucial to document an assessment of the formality and due diligence of their Quality Assurance Programs and their Quality Control Procedures/Processes.
The terms of reference are well articulated and clear.
John
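Item 2 of the terms quoted above is, at least in principle, mechanically checkable given parallel raw and adjusted series for the same stations. The sketch below is a hedged illustration only; the series, the function name and the 0.005 tolerance are all invented:

```python
# Fraction of values altered between a raw and an adjusted series, and the
# balance of warming vs cooling adjustments (terms of reference items 2 and 3).
def adjustment_stats(raw, adjusted, tol=0.005):
    pairs = [(r, a) for r, a in zip(raw, adjusted)
             if r is not None and a is not None]
    changed = [a - r for r, a in pairs if abs(a - r) > tol]
    return {
        "fraction_adjusted": len(changed) / len(pairs),
        "warming": sum(1 for d in changed if d > 0),
        "cooling": sum(1 for d in changed if d < 0),
    }

# Invented monthly means; None marks a missing raw value.
raw      = [14.2, 14.1, None, 14.4, 14.3, 14.5]
adjusted = [14.0, 14.1, 14.2, 14.4, 14.5, 14.7]
print(adjustment_stats(raw, adjusted))  # 3 of 5 comparable values adjusted
```

Run over a real archive, the same tally would answer items 2 and 3 directly; item 5 (whether each change is defensible) of course needs more than counting.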

Ed Coffer
Reply to  John Whitman
April 28, 2015 1:45 am

Item 3: Are warming and cooling adjustments equally prevalent?
This doesn’t even make sense. The non-climatic adjustments are made for specific reasons at individual stations. They are what they are, depending on why the adjustment was necessary. It’s like they don’t think it’s ‘fair’ if the adjustments are not equally up or equally down. It’s a political question, not a scientific one. This shows me that they aren’t really interested in facts or have any desire to improve the science.

kim
Reply to  Ed Coffer
April 28, 2015 2:00 am

‘does not yet explicitly address’.
========================

kim
Reply to  Ed Coffer
April 28, 2015 2:21 am

It seems, Ed, that this is one of the reasons for the inquiry. Have the adjustments been for good reasons or from a perverted algorithm?
I wish you’d keep talking. I’m trying to figure out whether you know what you are talking about. So far you seem like a more ignorant version of B Gates, with the same aura of outrage and sneer. Kiddo, that’s very poor tone in the face of curiosity.
========================

Reply to  Ed Coffer
April 28, 2015 8:55 am

kim on April 28, 2015 at 2:00 am
&
kim on April 28, 2015 at 2:21 am
– – – – – – – – –
kim,
Indeed, in the face of wonderfully open curiosity, condescending attitudes can be shrugged at, winked at and ignored as being counterproductive to voluntary civil discourse.
But does Ed Coffer’s condescending attitude in comments invalidate his ideas and argument? Probably not, except that the chronic condescending modus operandi makes one tend to just ignore what he says; because of that I often do tend to ignore him.
John

MarkW
Reply to  Ed Coffer
April 28, 2015 3:22 pm

They “claim” that the adjustments are being made for legitimate reasons.
That claim has yet to be scientifically proven.
Regardless, to answer the question. About 90% of recent records have been adjusted upwards, while about 90% of older records have been adjusted downwards.

Brandon Gates
Reply to  Ed Coffer
April 28, 2015 6:55 pm

kim,

Kiddo, that’s very poor tone in the face of curiosity.

Tone trolling on WUWT …. really? That’s about as sporting as fishing at the fish market.

Brandon Gates
Reply to  Ed Coffer
April 28, 2015 6:56 pm

MarkW,

They “claim” that the adjustments are being made for legitimate reasons.
That claim has yet to be scientifically proven.

What would constitute scientific proof in your view?

Reply to  Ed Coffer
April 28, 2015 10:05 pm

Brandon Gates on April 28, 2015 at 6:56 pm
– – – – – – – –
Brandon Gates,
Dual reality metaphysics of the climate alarmists is proof that they aren’t in any reality. They are so unconnected to reality that they are not even wrong about metaphysics, much less climate science.
John

Brandon Gates
Reply to  Ed Coffer
April 29, 2015 10:00 am

John Whitman,
I asked: What would constitute scientific proof in your view?
You answered …

Dual reality metaphysics of the climate alarmists is proof that they aren’t in any reality. They are so unconnected to reality that they are not even wrong about metaphysics, much less climate science.

… with … something … that does not even remotely directly address my question. I’d ask you whether you see the irony here, but I’m not sure you’re equipped to answer.

Reply to  John Whitman
April 28, 2015 8:37 am

Ed Coffer on April 28, 2015 at 1:45 am
– – – – – – –
Ed Coffer,
I think the statistical distribution involved in the transformations of data in time, in space and in direction is a normal, fundamental scientific task in an objective scientific process. With that element of the scientific process in hand, then when the other step, the scientific assessment of the validity of the transformations of data, is complete, we can see the impact of any invalid transformations of data. N’est-ce pas?
John

1sky1
April 27, 2015 10:56 am

This long-overdue inquiry is certainly welcome. The crucial problem in “global average temperature” indices, however, is not just ill-founded data adjustments.
In many vast regions around the globe, virtually the ONLY station records come from urban locations that have experienced great growth, particularly since WWII. Until index-makers come to realistic grips with the highly site-specific, biasing effects of UHI, instead of merely glossing over the problem, the propagandistic attribution of the “trend” apparent in urban records to increased CO2 levels will continue to seduce the unwary.
The situation with historical SST time-series is even more dire. Until the advent of satellite sensing, there were only OCCASIONAL observations made by TRANSITING ships of opportunity, using a great variety of measurement techniques. In many Marsden Squares outside well-traveled sea lanes, the WMO-mandated 4 instantaneous observations per day necessary for even spatially-crude estimates of the daily average SST are rarely available. It is only through sheer hubris that “climate scientists” pretend that their fabricated SST time-series meaningfully represent reality on anything resembling a global basis.
I would urge this project to examine the fundamental availability of unbiased data and not simply restrict itself to subsequent adjustments.

Coeur de Lion
April 27, 2015 11:21 am

I still love Valentia. So remote, such a long dataset. And a neat anchorage.

JP
April 27, 2015 11:34 am

Remove NOAA’s TOB adjustments and you pretty much remove all of the AGW of the last 40 years.

Reply to  JP
April 27, 2015 12:23 pm

Why do you think they invented it?

immi_the_dalek
April 27, 2015 2:15 pm

Those who object to all adjustments may like to consider this quote, and its source:

“The USHCNv2 monthly temperature data set is described by Menne et al. (2009). The raw and unadjusted data provided by NCDC has undergone the standard quality-control screening for errors in recording and transcription by NCDC as part of their normal ingest process but is otherwise unaltered. The intermediate (TOB) data has been adjusted for changes in time of observation such that earlier observations are consistent with current observational practice at each station. The fully adjusted data has been processed by the algorithm described by Menne et al. (2009) to remove apparent inhomogeneities where changes in the daily temperature record at a station differs significantly from neighboring stations. Unlike the unadjusted and TOB data, the adjusted data is serially complete, with missing monthly averages estimated through the use of data from neighboring stations. The USHCNv2 station temperature data in this study is identical to the data used in Fall et al. (2011), coming from the same data set.”

It’s from the draft Watts et al. (2012) paper.

jimmi_the_dalek
Reply to  immi_the_dalek
April 27, 2015 2:27 pm

Sorry, got the italics wrong
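The infilling step described in that passage (“missing monthly averages estimated through the use of data from neighboring stations”) can be illustrated with a toy anomaly-based estimate. To be clear, this is NOT the Menne et al. algorithm, just the general idea; the function name and every number below are invented:

```python
# Toy illustration of neighbour-based infill: estimate a missing monthly
# value at a target station from the mean anomaly of nearby stations.
def infill(target_clim, neighbour_obs, neighbour_clims):
    """target_clim: the target station's climatology for that month;
    neighbour_obs / neighbour_clims: parallel lists for the neighbours."""
    anoms = [o - c for o, c in zip(neighbour_obs, neighbour_clims)]
    return target_clim + sum(anoms) / len(anoms)

# Hypothetical numbers: neighbours run about +0.5 above their climatology,
# so the missing value is estimated as the target climatology plus 0.5.
estimate = infill(10.0, [12.5, 11.4, 13.6], [12.0, 11.0, 13.0])
print(round(estimate, 2))  # 10.5
```

Working in anomalies rather than raw readings is what lets a warm valley station borrow information from a cold mountain neighbour without importing the neighbour’s absolute temperature.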

April 27, 2015 3:58 pm

Thanks, Anthony. I posted an article about The International Temperature Data Review Project, and another on the United States Historical Climatology Network (USHCN) adjustments.
I note that these 0.3°C of adjustments and homogenization amount to half of the 0.6°C warming since 1940 in the NCDC temperature time series.

Dr. Deanster
April 27, 2015 5:15 pm

You folks are funny.
This review will happen … it will have some conclusions opposing the published records … skeptics will make a stink … Fox News will publish it, Limbaugh will talk about it … BUT …
… the Main Stream Media will sweep it under the rug, and it will never see the light of day. It will be DISMISSED … just like all other opposing evidence. Politicians will continue their propaganda campaign as if this didn’t even happen … the drive will still be the same as before, as noted by the UN chick:
“… our goal is not to save the earth, it is to change the economic model of the world.”
Just sayin’

kim
Reply to  Dr. Deanster
April 27, 2015 11:41 pm

Our goal is to change the shape of cultured Chinese ladies’ feet.
====================

Dr. S. Jeevananda Reddy
April 27, 2015 9:01 pm

Unit of measurement — prior to 1957, temperature was measured in °F and precipitation in inches; from 1957 onwards, temperature is measured in °C and precipitation in millimetres. The observational errors here are quite different.
Rounding of figures to the first decimal place — even and odd rounding conventions apply.
Instruments — changes of instrument matter; automatic weather station data differ from data measured in a Stevenson screen.
There are even more inaccuracies in ocean weather data measurements, from the past to date.
Dr. S. Jeevananda Reddy

MarkW
Reply to  Dr. S. Jeevananda Reddy
April 28, 2015 3:25 pm

Measurements have always been taken in metric. They were rounded to the nearest degree C.

Reply to  MarkW
April 28, 2015 8:29 pm

Are you serious?

The Fahrenheit scale was the primary temperature standard for climatic, industrial and medical purposes in English-speaking countries until the 1960s. In the late 1960s and 1970s, the Celsius scale replaced Fahrenheit in almost all of those countries—with the notable exception of the United States—typically during their metrication process. — Wikipedia

Reply to  MarkW
April 29, 2015 1:01 am

I was in grade school (a “special school”) when the plan was for the metric system to go into effect in the US, and I was one of the few people who I grew up with that actually learned it.
We never had any switchover here in the US, to this day. Most Americans have little clue how to relate to the system that the rest of the world uses.
I spent time in various sorts of jobs and activities over the years, and know that among the reasons for not switching were the retooling costs that would have hit auto and other heavy manufacturers; the need for auto mechanics (who needed new tools anyway once imported cars became widespread) and other technicians to buy new tools; pushback from fitting and fastener manufacturers; and the need to stock both sorts of tools and parts for some time after the change, were it to have occurred … etc.
Here in the US, there has yet to be any widespread switch at all.
I do not think it is even being taught to students in grade school these days, and there is no plan in place to change over at some point in the future.
Although, most devices have dual scales, and weather tables in the newspaper or elsewhere have readings in F and in C.

SAMURAI
April 30, 2015 8:21 pm

The revised UAH version 6.0 satellite TLT temp data, which now basically matches RSS and radiosonde data perfectly, shows just how distorted land-based temps have become.
According to many climatologists, the lower troposphere should be warming faster than surface temps if the CAGW hypothesis is viable.
UAH, RSS and radiosonde temp anomalies are now almost 50% LESS than GISS/HadCRUT4, which shows just how distorted the land-based temp records have become.
Even with the HUGE artificial increases to surface temp data, global temp trends have still been flat for almost 18 years, which shows the CAGW hypothesis is disconfirmed.
Arctic Ice is recovering, Antarctic ice extents are setting 35-yr records, sea level rise has been stuck at 6 inches per century for the past 200 years, severe weather frequency/intensity trends haven’t changed for the past 50~100 years (depending on weather phenomena evaluated), ocean pH is stuck at 8.1, and global temps have only recovered about 0.8C (probably less without the adjustments) since the end of the Little Ice Age in 1850, with most of this caused by natural factors–CO2 probably contributed just 0.2C of the total… what a joke.
Stick a fork in this turkey. CAGW is dead.
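The trend comparison the commenter is making can be made concrete. A minimal sketch of computing and comparing ordinary least-squares linear trends of two anomaly series, using synthetic, illustrative numbers (NOT the actual UAH/RSS/GISS/HadCRUT4 data):

```python
def trend_per_decade(years, anomalies):
    """Ordinary least-squares slope of anomaly vs. year, scaled to degC/decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    sxx = sum((x - mean_x) ** 2 for x in years)
    return 10.0 * sxy / sxx

# Hypothetical series for illustration only: 0.12 vs 0.18 degC/decade.
years = list(range(1979, 2015))
satellite = [0.012 * (y - 1979) for y in years]
surface = [0.018 * (y - 1979) for y in years]

sat = trend_per_decade(years, satellite)
sfc = trend_per_decade(years, surface)
print(f"satellite: {sat:.2f} degC/decade, surface: {sfc:.2f} degC/decade")
print(f"satellite trend is {100 * (1 - sat / sfc):.0f}% below surface trend")
```

Whether the real series differ by the claimed amount depends on the version and period chosen; the sketch only shows how such a percentage would be computed.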

May 4, 2015 11:50 am

Things that could be done to improve the perception or legitimacy of the land station temperature record.
1) Choose a time of observation (TOD) that minimizes the adjustments needed to historical data. Since the TOD bias correction is an estimate, the larger it is, the more potential error it can introduce. As pointed out, the larger the adjustments are as a percentage of the total record, the more the discussion becomes about whether the adjustments are legitimate rather than about CO2.
It makes sense to minimize the adjustment even if that means taking recent comparison readings at an odd time of day. If that is impractical, consider equalizing the adjustments applied to past and current data so that the overall adjustment curve does not look so heavily skewed.
2) Consider the admittedly laborious and costly exercise of physically verifying the homogenization and other adjustments for some records. These adjustments are controversial and need to be validated against evidence that something really did happen that makes sense of the adjustment taken. Choose a random sample of the fairly large adjustments in this category and seek to understand why each adjustment was necessary and whether it was provably a good one, using site inspections, alternative data records, or interviews as necessary. If the sample shows that the adjustments are reasonable, then people can assume the overall adjustment process is probably okay.
3) Compute the change in error introduced by the adjustments. It is assumed that the data prior to adjustment are less accurate than after, but some adjustments may genuinely increase accuracy, while others may produce better-“looking” results without increasing accuracy at all, or may even decrease it. We should know how the error bars are modified by the adjustments, given how large a share of the total effect the adjustments represent.
4) Consider selecting subsets of stations that have been consistent and have needed no adjustment (or much smaller ones than others), verify that they in fact have that characteristic, and then look at their records to see whether they corroborate the overall record after adjustment.
5) Use satellite data for the last 40 years to compare regional-level changes with those in the land station record, to verify that at least the recent record conforms to independent measurements.
6) Have some scientists examine the logic of all the adjustments, looking for possible reasons to adjust the data in the other direction. There is a worry that scientists with global warming on the brain do not see candidate adjustments that would go the other way. While individual adjustments do sometimes lower the net change for some data, overall the adjustments consistently increase the trend. Is there some adjustment that has been missed that would have moved the data the other way? Or, more simply, are there any other adjustments at all that have been overlooked?
7) Check whether any of the adjustments have been applied twice, double-counting their effect.
8) Provide data showing where the weaknesses in coverage area exist. For instance, vast parts of many countries were not covered in the past. Since infilling data for those regions is highly error-prone, possibly they should be excluded altogether and considered separately.
If these things, or even a subset of them, could be done, it would take the wind out of the argument that the adjustments are unreasonable.
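Points 2 and 6 above are simple enough to sketch in code. A minimal illustration, using hypothetical station IDs and made-up net adjustment values (everything below is invented for demonstration):

```python
import random

def audit_sample(adjustments, k=5, seed=0):
    """Point 2: draw a random sample from among the largest adjustments,
    to be physically verified against station history, site inspections, etc.
    `adjustments` maps station id -> net adjustment to the trend (degC)."""
    biggest = sorted(adjustments, key=lambda s: abs(adjustments[s]), reverse=True)
    pool = biggest[: 2 * k]              # restrict to the largest adjustments
    rng = random.Random(seed)            # fixed seed so the audit is reproducible
    return rng.sample(pool, min(k, len(pool)))

def sign_balance(adjustments):
    """Point 6: fraction of stations whose net adjustment increases the trend.
    A value far from 0.5 hints the adjustments are not direction-neutral
    (though a skewed balance can still be legitimate, e.g. a shared TOD bias)."""
    ups = sum(1 for a in adjustments.values() if a > 0)
    return ups / len(adjustments)

# Hypothetical data for illustration only:
stations = {"S001": 0.40, "S002": -0.05, "S003": 0.22,
            "S004": 0.31, "S005": -0.10, "S006": 0.15}
print("audit these:", audit_sample(stations, k=2))
print(f"share of upward adjustments: {sign_balance(stations):.2f}")
```

The point of the fixed seed is that the audit sample can be published and re-drawn by anyone, which matters more here than statistical elegance.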

Reply to  logiclogiclogic
May 5, 2015 9:53 pm

Mr. Logiccubed,
Many of these points seem to assume that the people doing all the adjusting are objective stewards of the data, rather than biased hacks doing everything they can to keep their sinking ship afloat.
It is no longer possible to give the benefit of the doubt to the warmistas.

Reply to  Menicholas
May 7, 2015 11:35 pm

It is getting kind of ridiculous to give the benefit of the doubt. The adjustments are so large and the explanations so meager. Eighteen and a half years of flatline is pretty hard for most people to stomach. I believe that if people saw a scandal showing there is significant error in the “global warming” of the last 100 years as a result of questionable adjustments, it could be the straw that finally breaks this in the public’s mind.
