Grandma Learns About Data Adjustment: A little story about how data adjustment might work in everyday life.

Note: On Thursday of this week, NOAA/NCDC will attempt to rewrite the surface temperature record yet again, making even more “adjustments” to the data to achieve a desired effect. This story by Mr. Core is worth noting in the context of data spin that we are about to be subjected to – Anthony Watts

Guest essay by E. L. Core

Grandma hangs up the phone, beaming. She has just talked with her daughter-in-law, Gabrielle, who had said, “Final report cards are out, and Gavin has straight A’s!” So, Grandma hurries over to see this remarkable report card for herself.

Sitting down at the kitchen table with Gabrielle and Gavin, Grandma opens the report card expectantly — though she has noticed a sheepish look on her grandson’s face. She looks the report card over. And over. And over. Instead of seeing all A’s, she’s seeing three A’s, two B’s, one C, and a D.

Puzzled, and with a sheepish look on her own face now, she hesitatingly asks her daughter-in-law, “Didn’t you tell me… Dear… that Gavin got straight A’s?”

“Yes, I did”, Gabrielle replies.

Noting the look of confusion on her mother-in-law’s face, she continues, “Here. Let me explain.”

“There are three A’s on the report card. You see them” — she points — “here, here, and here. So, we know he gets A’s.”

“Now, this first B, here,” Gabrielle continues. “You must understand that Gavin didn’t like that class. If he had liked that class, he would have put in more effort, and he would have gotten an A. So, that B should really be an A.”

Grandma sits quietly.

“And this other B, here,” Gabrielle says. “You must understand that the teacher just had it in for Gavin. If he had had a different teacher, he would have gotten an A. So, that B, too, should really be an A.”

Grandma sits quietly.

“Now, the C,” her daughter-in-law continues. “You must understand that Gavin didn’t like the class, and the teacher had it in for him, too. If Gavin had liked the class, and if he had had a different teacher, he would have gotten an A. So the C should really be an A.”

Grandma sits quietly.

“Now, the D,” says Gabrielle. “Gavin liked that class, and he had a good teacher, too. But three of his friends got an A in this class; they also got A’s in the very same three classes that Gavin’s report card has A’s in. So, the D should really be an A.”

“And that’s why I told you that Gavin has straight A’s.”

Grandma sits quietly.

Then, Gavin’s sister walks into the room. A sheepish look comes over her face when her mother asks Grandma, “Would you like me to explain how Gavrilla really won at the track meet?”

Grandma leaves quietly.


E. L. Core has a B.S. in Mathematics and Computer Science and is an associate editor at the Catholic Lane website, catholiclane.com. His series “Uncommon Core: Climate Sanity” is forthcoming later this year.

RiHo08
June 1, 2015 2:07 pm

No worries. Grade inflation by teachers is nothing compared to grade inflation by parents. If your child is not doing well, then that means you as a parent are not doing well. So, you inflate your child’s grades to inflate your own ego, which in turn allows you to discount what the teacher said about studying, showing your work, and turning in your homework.

tom s
Reply to  RiHo08
June 2, 2015 8:11 am

Perhaps, but the parent can do lots and lots and lots to help the kid, but the kid still has to be the one taking the test, and if he/she does not do well despite plenty of parental guidance, then what say you? Off subject, I know….

ScienceABC123
June 1, 2015 2:13 pm

If you want to make “adjustments” to the raw data, please follow these guidelines:
1) Maintain all of the raw data, and provide it upon request.
2) Specifically state the reason(s) for and amount(s) of adjustment(s) for every individual data piece adjusted. I’m looking for methodology here.
3) Provide the adjusted data upon request.
4) Require any work based on the adjusted data to clearly state, up front, that “adjusted data was used.”
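In code, guideline (2) might look something like the minimal sketch below. The record layout and field names are hypothetical, invented here for illustration; they are not any agency’s actual schema. The point is simply that the raw value is never overwritten and every adjustment carries its size and stated reason, so the adjusted series can be re-derived and audited on demand.

# Minimal sketch of guidelines 1-4: keep the raw value untouched and attach
# every adjustment as a separate, documented record. Hypothetical schema only.
from dataclasses import dataclass, field

@dataclass
class Adjustment:
    amount_c: float   # size of the adjustment in degrees C
    reason: str       # stated methodology, e.g. "time-of-observation change"

@dataclass
class Reading:
    station_id: str
    date: str         # ISO date of the observation
    raw_c: float      # the raw measurement, never overwritten
    adjustments: list[Adjustment] = field(default_factory=list)

    @property
    def adjusted_c(self) -> float:
        """Adjusted value derived on demand; the raw value stays available."""
        return self.raw_c + sum(a.amount_c for a in self.adjustments)

r = Reading("HYPOTHETICAL01", "1934-07-21", raw_c=36.1)
r.adjustments.append(Adjustment(-0.4, "station move, paired-station comparison"))
print(r.raw_c, r.adjusted_c)  # both the raw and the adjusted value remain reportable

Any product built from adjusted_c would then state, up front, that adjusted data was used (guideline 4).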

John
Reply to  ScienceABC123
June 1, 2015 2:37 pm

The problem is there is no ‘raw data’. There are many fragments of data, but they cannot simply be merged into a single dataset and called ‘raw data’.
For one, the distribution of weather stations is not even. To give them all the same weighting is clearly wrong. Then there are problems such as weather stations that are moved, changes in the time of day the data is collected, and changes to equipment. The homogenisation of the data is essential if you are to produce any meaningful data.

Reply to  John
June 1, 2015 2:59 pm

Congratulations John, your application to Warmista Liars Academy has been accepted.

Reply to  John
June 1, 2015 3:01 pm

The problem is there is no ‘raw data’. There are many fragments of data, but they cannot simply be merged into a single dataset and called ‘raw data’.

Nonsense John, you take a measurement and you publish it.
You can even take a measurement do some QA (providing all of the rules), add a QA flag and release it all as raw data.
It’s not hard to do.

The homogenisation of the data is essential if you are to produce any meaningful data.

This isn’t true either.

Paul Jackson
Reply to  John
June 1, 2015 3:14 pm

What you’re describing isn’t data, it’s product; it’s like going to the organic food store, asking for cheese, and being handed a can of Cheez Whiz. I like Cheez Whiz, but it ain’t the kind of cheese you should get in an organic food store. We want the unprocessed raw data, fragments, warts and all, so we can tell what they did to make their product, and we want to see if we can actually reproduce their product. Until we can reproduce the underlying product, all of this AGW is just blue smoke and mirrors.

skorrent1
Reply to  John
June 1, 2015 3:15 pm

“…if you are to produce any meaningful data.” What is “produced” is not “data”, it is manufactured product. The raw data are those individual measurements of temperature together with the descriptors of location and condition of the instrument and time of reading. All operations on each individual datum should be recorded and available for review to assure that “homogenisation” is not just another word for “corruption”.

Boulder Skeptic
Reply to  John
June 1, 2015 3:21 pm

And still, this process creates a meaningless number with so much uncertainty (which is never reported) that it shows nothing statistically important is happening with temperature. Anyone who thinks that temperature change measured in the second decimal place is in any way significant seems to me to be absolutely anti-science.
The one part I object to almost as much as the utter meaninglessness of a single number from a set of thermometers describing the Earth’s temperature at a given time is using thermometers spread far apart geographically and “homogenizing” them into a virtual temperature somewhere else. Utter BS. Which is clear because these adjustments keep moving adjusted temperatures further in one direction (a sure sign of bias) with each new adjustment. This phraud is covered well elsewhere and is going to get even more attention in the near future, so I’ll not rehash it in this comment.
At least satellite data are almost uniformly spread already and I can almost accept something like a single number.
Bruce

Evan Jones
Editor
Reply to  John
June 1, 2015 3:51 pm

Gridding (what you describe) is important. Homogenization, however, is crap.

Alx
Reply to  John
June 1, 2015 4:44 pm

… the distribution of weather stations is not even. To give them all the same weighting is clearly wrong.

The thing about weighting is that it is subjective: it requires human judgement as to how, and how much, to weight this station over that station, this factor over that factor. If only raw data is provided, that is certainly problematic: it would allow different people to make different judgments or put the readings in different contexts. And if that happens, then the temperature record really becomes a circus. Or, I should say, gets exposed as a circus.
Luckily we do not require people to make judgments as to what a gallon of petrol represents, or a pound of flour.

Reply to  John
June 1, 2015 5:45 pm

“Luckily we do not require people to make judgments as to what a gallon of petrol represents, or a pound of flour.”
And you’re confused as to what a station measures?
Infilling and homogenization increases the uncertainty, and few (if any) of the temp series offer a realistic value.

Niff
Reply to  John
June 1, 2015 6:14 pm

“The homogenisation of the data is essential if you are to produce any meaningful data.”

Yeah…I think I get it now. You have to add “meaning” otherwise we would get the wrong idea…!

M Seward
Reply to  John
June 1, 2015 10:00 pm

OK John, I see where you are coming from. But how do you explain the ‘adjustments’ overwhelmingly being upwards? Sorry, sport, but that is just a bit too cute for me. I think I will just refer to the satellite/balloon data. It seems to be so… what is the right word? Robust? Reliable? Unadjusted?
Maybe you are quite correct and it is just not possible to construct (contrive?) a meaningful number from such a raggedy-assed set of instrument data. So where does that leave us? Blind faith with its eyes gouged out?

Reply to  John
June 2, 2015 12:40 am

Actually I agree with you, John. Data has to be processed. Measurements are subject to variables; it’s a fact. And the better we understand those variables, the clearer the image. I’m sorry that you’re experiencing a “pile on” after your comment. It reminds me of a similar comment made by Lindzen during a debate, after his opponent chided him for not trusting the data. The implication there was raw data.

richardscourtney
Reply to  John
June 2, 2015 1:51 am

owenvsthegenius
Nobody “piled on”. Some refuted and others ridiculed nonsense.
But you say

Data has to be processed. Measurements are subject to variables; it’s a fact. And the better we understand those variables, the clearer the image.

Please explain how you think data being “processed” is preferable to citing the error bounds of each datum. And an explanation of how that preference can be equated with the scientific method would also be appreciated.
Thanking you in advance.
Richard

Reply to  richardscourtney
June 2, 2015 9:26 am

Richard, raw data and the error bounds should accompany adjustments. It ought to be clear what processing took place and why. Adjustments are a moving target. As to the scientific method… we are always attempting to contextualize data. In the case of a LIDAR scan, a 3D modeler has to remove noise from the scan and fill in blanks (snow and rain can cause these); the data has to be processed and layers applied to create a working rig. The LIDAR scan is not as useful on its own. This is not to say “toss out the raw”: adjustments have to be made clear so they can be verifiable and justifiable, otherwise the adjustments are not useful.

Tim Hammond
Reply to  John
June 2, 2015 2:16 am

All true, all irrelevant.
The data is the data. End of story. We don’t know what the data “should” be – if we did, we wouldn’t need the data.
When you homogenise, show the original data and show what you did. Allow people to critique it, try and replicate it, show if you are wrong.
Changing the underlying data because you know what it should say is simply wrong.

Hugh
Reply to  John
June 2, 2015 5:47 am

All true, all irrelevant.
The data is the data. End of story. We don’t know what the data “should” be – if we did, we wouldn’t need the data.

Correct, but, for example, the global mean temperature is not data in the first place, since you can’t measure it. You can have raw data from stations or satellites (measured in three spatial dimensions and one time dimension), from which you construct, in one way or another, the thing you purport to represent the global mean temperature, whatever that actually means.
This question is surprisingly philosophical. I don’t think adjustment is wrong at all, but care should be taken in what you call the end result. Do you call it a measurement, or a model? Do you realise the model is vulnerable to your conceptions and is, in fact, an interpretation or even an opinion rather than data?
What I’m trying to say is that people are willing to take a number at face value, be it with error bars or not, when the trouble is NOT with the missing error bar but rather with what the number actually represents. People believe too much in numbers. It is very difficult to understand that the number can be anything, as the text around it actually defines its purported meaning. And this I say as a mere M.Sc. watching from a distance; I guess both scientists and laymen promote numbers too highly compared to the legend.
‘Your table is precise, yes, but your legend fails to tell accurately what it really represents.’

richardscourtney
Reply to  John
June 2, 2015 9:48 am

owenvsthegenius
Thank you for your explanation of your point, which I requested.
However, as Tim Hammond says, your explanation is “all true but irrelevant”.
I asked

Please explain how you think data being “processed” is preferable to citing the error bounds of each datum. And an explanation of how that preference can be equated with the scientific method would also be appreciated.

You have not said in what – if any – way data being “processed” is preferable to citing the error bounds of each datum. And your discussion of attempts to “contextualize” data from other fields has no relevance of any kind.
Furthermore, you say

This is not to say “toss out the raw”: adjustments have to be made clear so they can be verifiable and justifiable, otherwise the adjustments are not useful.

In climastrology – which is the subject we are discussing – the raw data ARE thrown away, so according to you the “adjustments” to temperature data “are not useful”.
The bottom line is as Tim Hammond says:

The data is the data. End of story. We don’t know what the data “should” be – if we did, we wouldn’t need the data.
When you homogenise, show the original data and show what you did. Allow people to critique it, try and replicate it, show if you are wrong.
Changing the underlying data because you know what it should say is simply wrong.

Richard

Reply to  richardscourtney
June 2, 2015 10:25 am

Richard, I agree completely with Tim. I would add that in many cases the raw data + error bounds are preferable to adjusted data. In fact, the only instance where “tossing out the raw” is acceptable is when the raw cannot be read. This might occur when the data is so robust that it requires computational power + software + engineering that the layman can’t access. To clarify: the data has to be processed to be read. Otherwise raw should always accompany adjusted. And oftentimes the adjusted is little better than an opportunity to display bias. Without verification the adjustments are not useful; furthermore, in the case of “climate science”, with so much at stake, raw data should not be proprietary. We should be careful giving authority to inferences.
As to contextualizing: I used an example from a (related) field to illustrate an idea. I’m sure you get the point.

Reply to  richardscourtney
June 2, 2015 10:47 am

Richard, I almost forgot. Error bounds, yes they are a must. Without these we have nothing. Just a thought, how do we arrive at the margins for error? Please relate your answer to the topic

richardscourtney
Reply to  John
June 2, 2015 10:01 am

Hugh
I agree with you when you say

This question is surprisingly philosophical.

but we part company when you say

What I’m trying to say is that people are willing to take a number at face value, be it with error bars or not, when the trouble is NOT with the missing error bar but rather with what the number actually represents.

The number cannot represent anything unless it includes its error bars. Absent the error bars the number could be representing any value between minus infinity and plus infinity: an indication that could be of any value is not an indication of any actual value.
This was the purpose of my original question to owenvsthegenius; viz.

Please explain how you think data being “processed” is preferable to citing the error bounds of each datum.

In the absence of error bars any “adjustment” to the value has no real effect because it does not alter the fact that the value has no defined accuracy and precision whether or not it is adjusted.
Richard

richardscourtney
Reply to  John
June 2, 2015 11:05 am

owenvsthegenius
It seems we may be converging on some sort of agreement.
You say

Richard, I almost forgot. Error bounds, yes they are a must. Without these we have nothing. Just a thought, how do we arrive at the margins for error? Please relate your answer to the topic

All my comments have been related to the topic.
I refer you to the comment from Hugh and my subsequent answer to him.
Also, we cannot provide error estimates because
(a) temperature is an intensive property, so it cannot have a valid average according to physics
but
(b) average global temperature anomaly is calculated by ‘climate science’ although there is no definition of it (each team that computes it uses its own definition of it and changes that definition almost every month)
and
(c) there cannot be an agreed error estimate for a parameter that has no agreed definition
additionally and importantly
(d) there is no possibility of a calibration reference for average global temperature anomaly however it is defined.
For a fuller assessment of these issues please see this item, especially its Appendix B.
Richard

Duster
Reply to  John
June 3, 2015 3:19 pm

John, if there is a trend to global temperatures, then it must of necessity appear in the data from all individual stations. Otherwise, the trend cannot possibly be construed as a “global” trend. If regional trends overwhelm a global signal, then it seems reasonable that the global signal is not likely a significant one. Most of the other changes, such as reading times, should introduce a step change with a clear signal that can be removed without trouble. Before “adjusting” for station moves it would be reasonable to determine IF the station moved. There are numerous adjustments for “relocated” stations that never were moved. There are some notorious instances where the “move” was attributed to a change in the rounding of latitude and longitude. In other instances, “moves” or instrumentation changes were assumed and corrected for, even though no such changes had occurred. Rather than rely on modeling assumptions about what kind of “signal” such changes should yield, it would be better to simply DO THE WORK to determine whether the change is a legitimate one or spurious. More importantly, when it turns out that a station did not in fact move or experience an instrumentation change, the “adjustments” need to be revoked. Finally, if changes are going to be made to historical records, preservation of the original records is vital to any attempt to replicate or correct a combined (“merged”) data set.
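For illustration only, a crude version of the step-change check described above might look like the sketch below: scan a station’s series for the split point that maximizes the jump between the mean before and the mean after. The window length and the synthetic numbers are invented, and this is not the pairwise algorithm NOAA actually runs; it only shows why a reading-time or station change should leave a detectable step, and why verifying the station metadata still matters.

# Crude step-change scan over a single station's series (illustrative only).
import numpy as np

def largest_step(series, min_segment=24):
    """Return (index, size) of the split with the largest before/after mean shift."""
    best_idx, best_step = None, 0.0
    for i in range(min_segment, len(series) - min_segment):
        step = series[i:].mean() - series[:i].mean()
        if abs(step) > abs(best_step):
            best_idx, best_step = i, step
    return best_idx, best_step

# Synthetic example: a flat record with a +0.5 C jump introduced at month 120.
rng = np.random.default_rng(0)
temps = rng.normal(15.0, 0.3, 240)
temps[120:] += 0.5
idx, size = largest_step(temps)
print(idx, round(size, 2))  # lands near month 120 with a step of roughly 0.5 C

Whether such a step reflects a real station change is exactly the metadata question raised above: the adjustment is only defensible if the move or instrument change can be verified, and it should be revoked if it cannot.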

Reply to  ScienceABC123
June 1, 2015 2:37 pm

5) Put out a press statement to say that you have done 1 – 4.
6) Make sure the storage computer system crashes so that everything is lost.
Who said I was a cynic?

E.M.Smith
Editor
Reply to  Oldseadog
June 1, 2015 3:39 pm

The computer need not crash. Simply have it operated and housed at the Clinton Data Center… all needed data will be kept and only the “private” data will evaporate…

Crispin in Waterloo
Reply to  Oldseadog
June 1, 2015 5:28 pm

E.M.Smith
Can I use that method to make salt out of seawater?

Reply to  Oldseadog
June 1, 2015 6:11 pm

Crispin in Waterloo
June 1, 2015 at 5:28 pm
No. It only works to make seawater out of salt.

Robert of Ottawa
Reply to  ScienceABC123
June 1, 2015 3:12 pm

The problem with (2) is that the adjustments are made in a long, old piece of Fortran code, I suspect, without adequate comments or documentation. And it has been modified and added to over the years.

MartyH
June 1, 2015 2:17 pm

So here’s my question. Every time that there is an adjustment, shouldn’t that increase the uncertainty of the actual measurement? Say that a max temperature of 90 +/- 0.5 degrees was measured, but after adjustments is now 89 degrees. The uncertainty has to be at least +/- 0.75 degrees now, doesn’t it?
If you plotted the adjusted temperatures with adjusted error bars, would these adjustments really change anything?
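As a back-of-the-envelope illustration of this point, assume the measurement error and the uncertainty of the adjustment itself are independent; they then combine in quadrature. The ±0.5 and ±0.6 figures below are made up for the example, not taken from any dataset.

# Independent uncertainties add in quadrature (illustrative numbers only).
measurement_unc = 0.5   # degrees, stated instrument uncertainty
adjustment_unc = 0.6    # degrees, uncertainty of the adjustment itself
combined_unc = (measurement_unc**2 + adjustment_unc**2) ** 0.5
print(round(combined_unc, 2))  # ~0.78, wider than the raw +/-0.5

So unless the adjustment is known far more precisely than the thermometer reading itself, the error bars on the adjusted value should widen, not shrink.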

mpaul
Reply to  MartyH
June 1, 2015 3:15 pm

The claim is that the adjustment and homogenization process reduces the measurement errors and increases the measurement precision. You can’t make this stuff up.

Alx
Reply to  mpaul
June 1, 2015 4:47 pm

Well at least it is good for a laugh and maybe a bit of poetry:
A rose is a rose is a rose, but global temperature is as fleeting as the wind.

Reply to  mpaul
June 2, 2015 12:17 am

“You can’t make this stuff up”. No I couldn’t. My imagination just isn’t strong enough. Climate Change “Scientists” seem to have mastered the technique though.

George A
June 1, 2015 2:24 pm

The C and D were outliers, so should be discarded.

Bryan A
Reply to  George A
June 1, 2015 4:47 pm

Nah, just homogenized them

Bryan
Reply to  George A
June 1, 2015 4:56 pm

Clearly.

Crispin in Waterloo
Reply to  George A
June 1, 2015 5:29 pm

They weren’t ‘outliers’, they were ‘outright liars’ and had to be discarded.

Will Nelson
Reply to  George A
June 1, 2015 5:32 pm

I have a model….

June 1, 2015 2:28 pm

Gee whiz. How could they hope to obfuscate if they had to do that?

Steve
June 1, 2015 2:28 pm

I would think the adjustments they made to the NCDC data would be to bring the overall result closer to the most accurate system we have for measuring global temperatures, the RSS satellite record. They sold us on spending billions of dollars on satellite-based RSS because it would be so much more accurate than the measurements taken on land by the NCDC and others. Now not only is RSS data less referenced than land-collected (and adjusted) NCDC data, the NCDC data is continually adjusted to INCREASE the divergence between NCDC and the more accurate RSS data. NCDC data is a sales pitch, not scientific data; its primary purpose is to justify the budgets of the NCDC by exaggerating the amount of global warming going on and increasing the sense of urgency for budgets that support climate monitoring and climate studies. If the purpose of the NCDC data were to be as accurate as possible, they would be adjusting it to more closely match the RSS data.

June 1, 2015 2:29 pm

They don’t accept BEST database?

Bryan A
Reply to  keith Sketchley
June 1, 2015 4:50 pm

Nah only WORST which is why it needs adjusting and homogenization

Crispin in Waterloo
Reply to  Bryan A
June 1, 2015 5:30 pm

BEST’s WORST fear:
Watts’
Ordinary
Response
Stems
Tide

cnxtim
June 1, 2015 2:37 pm

Some of Gavin’s teachers were “deniers” the rest were politically correct.

June 1, 2015 2:41 pm

Thank you, Anthony.

June 1, 2015 2:52 pm

When we married many, many years ago my bride was a size 12. She is still a size 12, and sometimes a size 10, but I can assure you her weight is not the same as when we married. Size adjustment anyone?

Robert of Ottawa
Reply to  lenbilen
June 1, 2015 3:16 pm

Lenbilen, we were not married, ever.

Tom in Florida
Reply to  lenbilen
June 1, 2015 3:30 pm

No, just the Universal Law of Marriage at work.

Reply to  lenbilen
June 1, 2015 4:29 pm

Perhaps a density increase. Same size different weight . . . density change.
John

noaaprogrammer
Reply to  John Whitman
June 1, 2015 10:18 pm

Same weight, just redistributed.

Reply to  John Whitman
June 1, 2015 10:59 pm

I always hide mine in another dimension… Hey, maybe that’s where the enviro-climo’s stashed the missing heat! It’s got to be somewhere! [*shock*]

rogerknights
Reply to  lenbilen
June 2, 2015 7:15 am

That’s “vanity sizing.” Mail order sellers, and other sellers, realize that clothing that is too large for a customer won’t be as readily returned as clothing that is too small. They want to avoid returns. So vanity sizing gives them a favorable margin of error.

Gareth Phillips
June 1, 2015 3:01 pm

Looks like you have perplexed the resident lefty on the site at long last Anthony. What is Gavrilla, and what on earth is a ‘track meet’? I get the joke about adjustments, but the final line makes no sense in cold windy Wales!

Reply:
Gavrilla is the female cousin of Gavin, and a track meet is the US colloquialism for a track-and-field contest, you know, foot races, hurdles, shot put, pole vault, etc. I’ve recently been working with some instructors in the UK and boy do I understand your occasional confusion. ~ mod.

Reply to  Gareth Phillips
June 1, 2015 4:17 pm

Gavin and Gavrilla are brother and sister. Once I decided on “Gavin”, I wanted a very similar feminine name, and I settled on “Gavrilla”.

Bryan A
Reply to  ELCore (@OneLaneHwy)
June 1, 2015 4:52 pm

How about Gavina

RACookPE1978
Editor
Reply to  Bryan A
June 1, 2015 5:12 pm

I like Glavina, but Gavina is much better than Gadvilla – which is what her name would quickly be degraded to by other youngsters.

AB
Reply to  ELCore (@OneLaneHwy)
June 1, 2015 5:25 pm

How about Gavalene, rhymes with gasolene.

PiperPaul
Reply to  ELCore (@OneLaneHwy)
June 1, 2015 5:39 pm

Bryan A, I can’t think of any possible anagram of that proposed name that could be used cruelly.

trafamadore
June 1, 2015 3:02 pm

That three independent groups, plus individuals like Stokes, come to about the same relative value each month means that there must be an unbelievable coordination of dishonesty. Based on Stokes’s values, which we can see day to day, what do you think it is – that everyone puts their thumbs on the scale once they see his?
This is just a little batty.
BTW, it looks like May is going to come in close to February this year. Get ready for more records.

Reply to  trafamadore
June 1, 2015 7:53 pm

No, it just takes them all applying approximately the same “best practices” to the raw data – infilling and homogenization based on a normalized area- and latitude-based temp trend. They just need to read the same papers.
That is one of the reasons I do neither; I wanted to see what the actual data said.

Dawtgtomis
June 1, 2015 3:06 pm

I hereby nominate Gavin for a Nobel Prize!

Dawtgtomis
Reply to  Dawtgtomis
June 1, 2015 3:08 pm

(Just calibrating my sarc meter)

Reply to  Dawtgtomis
June 1, 2015 11:01 pm

I like it! 🙂

Ian W
June 1, 2015 3:10 pm

The children sitting 5 desks away got A’s so Gavin’s grades were homogenized and adjusted up to A’s as the lower grades were obviously incorrectly recorded.

Paul Jackson
Reply to  Ian W
June 1, 2015 3:36 pm

No, that’s not how it works. Gavin has been copying off Gavrilla’s tests for two-thirds of the semester. Finally Teacher Grannywings catches Gavin cheating and moves him 5 chairs away, so Gavin starts getting D’s instead of A’s! Principal Dufus notes that Gavrilla and Gavin have historically gotten the same scores and now they aren’t, so he adjusts Gavin’s scores upward without telling Miss Grannywings.

Hugh
Reply to  Paul Jackson
June 2, 2015 6:00 am

Later Dufus accepts that the A’s were incorrect in this cherry-picked case, but says that the school’s mean over the period 1850–2000 did not change, because some A’s given in the 1920s had been reinterpreted as D’s.

DirkH
June 1, 2015 3:19 pm

In real life, Gavin’s grades would be given as anomalies to a 30-year baseline somewhere in a past century, though. Which you compute by taking data from that period and adjusting it downwards.

Bryan A
Reply to  DirkH
June 1, 2015 4:56 pm

Obviously Gavin was sitting on the CO2 enhanced side of the room so it was too hot to concentrate

Gentle Tramp
June 1, 2015 3:24 pm

Well, this whole biased temperature data “Adjustment” business is just another example of Noble Cause Corruption. Data manipulation for a “Good Purpose” can’t be a sin…
Two days ago, even a former Swiss Minister and President, Moritz Leuenberger, admitted that he plainly lied to the public in connection with a CO2 reduction law. Here’s the Swiss newspaper report about that confession:
http://www.tagblatt.ch/ostschweiz/thurgau/kantonthurgau/tz-tg/Die-ganze-Wahrheit-haelt-gar-niemand-aus;art123841,4242625
The crucial quote of Moritz Leuenberger in this article is as follows:
«Der Klimagipfel in Kopenhagen kurz vor der Abstimmung zur Reduktion des CO2-Ausstosses war desaströs», gibt Leuenberger jetzt zu. Doch damals habe er dies absichtlich nicht den Medien gesagt und somit gelogen, damit die Schweizer dafür stimmen würden. Leuenberger: «Jetzt glaube ich, die Lüge ist legitim, wenn sie etwas Gutes bewirkt.»
(English translation: “The climate summit in Copenhagen, shortly before the (Swiss) referendum on reducing CO2 emissions, was disastrous,” Leuenberger now admits. But at the time he deliberately did not tell the media this, and so lied, so that the Swiss would vote in favour of it (the CO2 reduction law). Leuenberger: “Now I believe the lie is legitimate if it brings about something good.”)
Thus we see by this example quite plainly that “Noble Cause Corruption” is very real in Politics today!
The big problem with this kind of behavior is that such “well-meaning zealots” only believe – but don’t actually know – whether their dishonest crusades will really help mankind…
Just think, only 500 years ago, the majority of people believed that burning witches would be a very “Noble Cause” indeed. And today, Mr. Leuenberger and the majority of the misguided public believe that the vital and desert-greening plant-food CO2 is the new diabolical witch that must be hunted down…

June 1, 2015 3:25 pm

Anyone want to bet that they raise the past temps and lower the present temps?

June 1, 2015 4:14 pm

“Note: On Thursday of this week, NOAA/NCDC will attempt to rewrite the surface temperature record yet again, making even more “adjustments” to the data to achieve a desired effect. [. . .]” – Anthony Watts

Well. This is only a partially sarcastic comment. Most of this comment is not sarcasm.
/ partial sarcasm on . . .
I think NOAA/NCDC will justify rewriting the surface temperature record via more data ‘adjustments’ by insinuating that the IPCC endorsed GCMs (models) must be right. Thus, they will maintain that it is reasonable to significantly adjust the temperature data up to be more in agreement with the unquestionable models.
. . . partial sarcasm off /
John

knr
June 1, 2015 4:16 pm

“Who controls the past controls the future; who controls the present controls the past.” – George Orwell, Nineteen Eighty-Four.
Could you get a finer demonstration of this in action than ‘adjustments’ of past temperatures?
Of course it could be just ‘lucky chance’ that all adjustments fall in favour of ‘the cause’, but with that type of luck you’d think they’d spend more time at the tables in Las Vegas.
When you heap poor practice onto what is already in many ways poor data, you cannot ‘magically’ turn it into good data no matter how much you ‘believe’.

Alx
Reply to  knr
June 1, 2015 4:51 pm

Maybe today George Orwell would write,

“Who controls the past temperature data controls the future; who presently controls the raw data controls the past temperature data.”

old construction worker
June 1, 2015 4:27 pm

I seem to remember there was a problem: UK Met Office couldn’t reproduce their “homogenized” data or maybe it was some other office in the UK.

June 1, 2015 4:50 pm

ELCore (@OneLaneHwy) wrote this final line of his story,
“Grandma leaves [the room] quietly.”

ELCore,
And then there is the rest of your wonderful story . . .
Grandma then picks up her mobile phone and dials her son.
When her son answers she says, “Your wife is thinking in an odd way since she finished being an expert reviewer for the IPCC’s AR5.”
Her son says, “Mom, her odd way of thinking started before that while she was getting a Masters Degree in Climate Science from Penn State University.”
John

George Devries Klein, PhD, PG, FGSA
June 1, 2015 4:59 pm

And that, folks, is how the UN, the US EPA, NOAA, NASA and academic atmospheric sciences do their research and work!!!

Alx
June 1, 2015 5:03 pm

The lack of a controlled process or methodology for adjustments at NOAA is a tragicomedy.
But the constant adjustments do point out the difficulty of the concept of “Global Temperature”. I still insist it is a low-confidence measurement because at this time it is an extremely difficult thing to define and measure. There is progress being made, and satellites give us hope of some day arriving at the decisive definition of “Global Temperature”. Currently the whole business is sketchy, but throwing the Tarot cards I predict in about 100 years we’ll get it. My Tarot cards are computer modeled, so you can rest assured of their accuracy.

Reply to  Alx
June 1, 2015 6:03 pm

Crowley deck or Waite?

Reply to  M Simon
June 1, 2015 6:23 pm

They know it’s a “low confidence” measurement:
“For the global mean, the most trusted models produce a value of roughly 14°C, i.e. 57.2°F, [for global average temperature 1951-1980] but it may easily be anywhere between 56 and 58°F and regionally, let alone locally, the situation is even worse.”
http://data.giss.nasa.gov/gistemp/abs_temp.html
They just don’t put information like that in their press releases.

June 1, 2015 5:17 pm

In defense of NOAA/NCDC, someone has to be in last place on climate science credibility. An argument could be made that NASA GISS is probably in last place on temperature dataset credibility with NOAA/NCDC only slightly more credible.
John

Crispin in Waterloo
June 1, 2015 5:59 pm

While some of Gavin’s excuses are a little weak, one must understand that he started off the year knowing he was going to get straight A’s at the end. After trying all the various ways to learn, he got a result. This result did not fit his mental model. Obviously the problem lay in the data, as the model was perfect and exactly what he needed for further progress in school.
Thus the data must be defective and had to be corrected for bias: first changed to A’s, and then an explanation conjured up to justify each. He had two friends examine the original and corrected data, and they agreed with the process and complimented him on the fine work he had put in throughout the year. They both needed to make adjustments to their raw data as well, because there was something obviously wrong with each data set. Gavin agreed to return the favour and examine their work to see if it met the same standards he had used, a method he had already had validated by two external reviewers.
When parents criticised the result, Gavin explained that they were not students in these classes and had no standing to comment. Only students could understand the pressure they were under to produce A-grade work and how he was harassed by incompetent teachers.
An investigation of the whole matter by fellow students concluded that the locks on the doors in the men’s bathroom were faulty and needed upgrading.

Reply to  Crispin in Waterloo
June 1, 2015 6:09 pm

I imagine Gavin still sucks his thumb a lot.
John

June 1, 2015 6:00 pm

I skipped school because I don’t like going to classes. I became an aerospace engineer. That should count for straight A’s in the school I didn’t go to.

June 1, 2015 7:36 pm

My wife takes size six shoes, but she wears nines because they’re more comfortable.

Mervyn
June 1, 2015 8:09 pm

Anthony, I refer to your article of June 6, 2012:
http://wattsupwiththat.com/2012/06/06/noaas-national-climatic-data-center-caught-cooling-the-past-modern-processed-records-dont-match-paper-records/
Your following comment was spot on:
“Is this purposeful mendacity, or just another example of confirmation bias at work? Either way, I don’t think private citizen observers of NOAA’s Cooperative Observer Program who gave their time and efforts every day for years really appreciate that their hard work is tossed into a climate data soup then seasoned to create a new reality that is different from the actual observations they made. In the case of Arizona and changing the Climate Divisions, it would be the equivalent of changing state borders as saying less people lived in Arizona in 1934 because we changed the borders today. That wouldn’t fly, so why should this?”

jpatrick
June 1, 2015 8:20 pm

One way to treat data that isn’t well behaved is to just take the logarithm of it and then plot that. If it’s not quite right, just take the log of it again. That usually works.
If it’s necessary to match up thermometric data with proxy data, you might have to try a few adjustment factors to get them to match up.

Eugene WR Gallun
June 1, 2015 8:48 pm

Gavin Schmidt — I Got The Data In Me
(most sorry Kiki Dee)
I got no troubles at NASA
I’m a rocket nothing can stop
Survival’s always the first law
And I’m in with those at the top
I heat up
I cool down
A site I don’t like I discard it
The high and the mighty can frown
So say what they want they reward it
Man is the measure
Of all things that be!
The Progressive Alliance
And its New Age Science
Say I got the data in me!
I work in the mists and the fogs
By methods that none can review
To hide like a fox from the dogs
The premise of all that I do
The thermometers all want skilling
If their readings are not alarming
As the early ones all need chilling
So the later ones all need warming
Man is the measure
Of all things that be!
What Protagoras said
Onto Nietzsche led
So I got the data in me!
The truth’s a Consensus of thought
We agree to agree about
A joy for which long we have sought
Our minds ever free of all doubt
We are born uncertain of heart
And live in fear of things unknown
But Consensus is truly the start
Of our souls becoming our own
Man is the measure
Of all things that be!
To Progressive drums
The Superman comes!
And I got the data in me!
I heat up
I cool down
A site I don’t like I discard it
The high and the mighty can frown
So say what they want they reward it
Eugene WR Gallun

Eliza
June 1, 2015 9:16 pm

As long as everyone here takes this as a joke nothing will happen. This is in fact criminal activity akin to Mafia criminal activity. This needs to be taken up by lawyers and the people doing this need to be charged.

Frank Kotler
June 1, 2015 9:28 pm

When I was a lad, there were two things you just did not do in the laboratory. One was work without eye protection, and the other was alter your observations. I guess in a computer lab you don’t need the eye protection either…

Rob
June 2, 2015 12:36 am

I’ve got my originals.

June 2, 2015 12:50 am


Pointman

Eugene WR Gallun
Reply to  Pointman
June 2, 2015 1:10 am

Pointman — A real message??? That is just too good to be true. — Eugene WR Gallun

harrytwinotter
June 2, 2015 4:47 am

An anecdote – are you serious?

Ron Clutz
June 2, 2015 5:34 am

Weather stations measure the temperature of air in thermal contact with the underlying terrain. Each site has a different terrain, and owing to a host of landscape features documented by Pielke Sr., the temperature patterns will differ, even in nearby locations. However, if we have station histories (and we do), then trends from different stations can be compared to see similarities and differences.
In summary, temperatures from different stations should not be interchanged or averaged, since they come from different physical realities. The trends can be compiled to tell us about the direction, extent and scope of temperature changes.
https://rclutz.wordpress.com/2015/03/20/auditing-the-abuse-of-temperature-records/

Reply to  Ron Clutz
June 2, 2015 5:51 am

The trends can be compiled to tell us about the direction, extent and scope of temperature changes.

The process I use is to calculate the difference between one day and the next for each station. If a station is changed, it affects only that day or range of days. I feel like the only accurate baseline for a station is itself.
It’s not perfect, but all of the other ways (guessing, or a baseline of averages) have so many other things that can go wrong; I wanted to see how each station’s temp evolves.
It’s the same reason I don’t infill or homogenize. I also don’t exclude data based on its value unless it’s drastically off, like above 200°F or below -200°F.
I do some station exclusion based on the number of yearly samples, depending on what I’m looking for, but I also produce a report of the stations I used, sample sizes, and a few other parameters.
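A rough sketch of that first-difference approach might look like the following. The column names, the placement of the ±200°F cut-off, and the pandas layout are assumptions made for illustration; this is not the commenter’s actual code.

# Each station is its own baseline: compile only day-to-day changes,
# so a station change affects only the day(s) on which it occurs.
import pandas as pd

def daily_mean_change(df):
    """df has columns: station, date, tmax (deg F). Returns mean day-to-day change per date."""
    df = df[(df["tmax"] > -200) & (df["tmax"] < 200)]   # drop only wildly bad values
    df = df.sort_values(["station", "date"])
    diffs = df.groupby("station")["tmax"].diff()        # difference against the same station only
    return diffs.groupby(df["date"]).mean()             # compile across stations, day by day

# A running sum of these daily mean changes gives a trend whose absolute offset
# is arbitrary ("baseline starts at 0"), with no infilling or homogenization.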

Ron Clutz
Reply to  micro6500
June 2, 2015 6:04 am

When I compiled trends of stations in the SE USA, I got something looking very much like this: [chart image]

Reply to  Ron Clutz
June 2, 2015 8:16 am

Because I don’t infill, and I work with stations that are missing data, I tend to just look at the day-to-day change.
Here’s the US: [chart image]
Baseline starts at 0, so the offset can be whatever.

Reply to  Ron Clutz
June 2, 2015 12:01 pm

Read Willis.
Over 90% of temperature variation is determined by latitude and elevation.
Second, nobody in their right mind averages temperatures.

Ron Clutz
Reply to  Steven Mosher
June 2, 2015 12:23 pm

Glad to hear it.

Richard M
June 2, 2015 6:54 am

What is really needed to solve this problem is a global temperature proxy that goes up to the 1980s. After that we have satellite data. Does such a proxy exist? Seems like there are all kinds of proxy records. This may already exist but has not been synced up with the satellite data.

June 2, 2015 7:14 am

“What is really needed to solve this problem is….
What’s really needed is a cogent understanding of the why and how of homogenization rather than the wild and unsupported claims of conspiracy. Mr Watts has promised to issue a paper on this topic, but it has failed to appear. Why?

Reply to  Anthony Watts
June 2, 2015 8:02 am

Watts.
My handle is warrenlb. Where do you get ‘Beeton’ from?

Reply to  warrenlb
June 2, 2015 11:59 am

“What’s really needed is a cogent understanding of the why and how of homogenization rather than the wild and unsupported claims of conspiracy. Mr Watts has promised to issue a paper on this topic, but it has failed to appear. Why?”
1. The code that explains the adjustments has been posted for a very long time. Nobody cares to read it or run it. Nobody cares to look at why it handles specific cases the way it does. Iceland is a good example.
2. The adjustments cool the record. Skeptics don’t want to address that because it doesn’t fit the fraud/conspiracy theory.
3. Watts 2012 (yet to be published), if correct, still won’t address the issue. The land is 30% of the total. The US records are less than 5% of the total. The US is the worst record in terms of radical changes to observation practice. In other words, even if you found that the land record in the US was biased high by 50% since 1979, that would not change the core science in any material way.

Reply to  Steven Mosher
June 2, 2015 1:32 pm

Perhaps Mr Watts’s paper hasn’t appeared because he’s trying to work out the footwork to explain how cooling adjustments are really warming adjustments.

Reply to  Steven Mosher
June 2, 2015 6:15 pm

Why is Iceland “a good example”? A “good example” of what?

cheshirered
June 2, 2015 7:27 am

A cross-blog consortium of senior sceptics needs to come together to work on this one. There are too many divisive egos getting in the way of highlighting what is an outright scandal.
* Select adjusted data examples.
* Show how far they’ve been adjusted, and the consequences of the ‘new’ data.
* Challenge the relevant data provider to explain the adjustments, with workings.
* Demand politicians launch official inquiry.
We have FIFA execs, bank interest-rate setters and individual FX traders being charged with manipulating data to enrich themselves. The climate science guys are doing that AND misdirecting billions of pounds of public money. These people need holding to account. Why the silence?

Reply to  cheshirered
June 2, 2015 8:44 am

“Why the silence?”
My question is why do you believe in worldwide conspiracies?

RACookPE1978
Editor
Reply to  warrenlb
June 2, 2015 9:41 am

Because the so-called “conspirators” – and there is nothing “conspiratorial” at all about their stated claims and world-wide behavior – have spent billions worldwide promoting their goals, their methods, and their motives. YOU are the ones who use “conspiracy” and all of its implied hidden agendas. This IS United States policy, the US “highest global security threat” for the President AND Sec of State, DOD, and all of its subordinate agencies and all of their 92 billions in CAGW monies…..
It is totally politically and economically driven, and political-Gaea-theist. No conspiracies at all needed.

Reply to  warrenlb
June 2, 2015 1:27 pm


“This IS United States policy, the US “highest global security threat” for the President AND Sec of State, DOD, and all of its subordinate agencies and all of their 92 billions in CAGW monies…..”
In order for this amazing statement of yours to be true, ALL these parts of the US Government are complicit in your posited deception.
What’s more, ALL the nations of the planet must be in on it as well, since the National Science Academies of every nation on Earth – China, Japan, the UK, France, Germany, Canada, Australia, etc. – take formal positions affirming AGW. EVERYONE. Plus NASA, NOAA, major universities, and all scientific professional societies.
A bit of a stretch, as they say… or perhaps you don’t say…

Paul Matthews
Reply to  cheshirered
June 2, 2015 9:11 am

cheshire, GWPF is doing pretty much what you ask for:
http://www.thegwpf.org/inquiry-launched-into-global-temperature-data-integrity/
They are doing an inquiry and have a call for evidence, to be submitted by June 30th.

cheshirered
Reply to  Paul Matthews
June 2, 2015 11:14 am

You’re right, cheers. I’d forgotten about that one. Hope it delivers results.

Reply to  cheshirered
June 2, 2015 11:49 am

Actually Roger Pielke Sr. called for a collection of mutually agreed-upon experts, none of whom had an interest in the outcome.
So, not a team of skeptics, not a team of non-skeptics, but mutually agreed-upon experts.
GWPF had an opportunity to follow Roger’s prescription (one endorsed by Anthony, BTW), but GWPF failed.

Reply to  cheshirered
June 2, 2015 11:52 am

* Select adjusted data examples.
* Show how far they’ve been adjusted, and the consequences of the ‘new’ data.
* Challenge the relevant data provider to explain the adjustments, with workings.
* Demand politicians launch official inquiry.
Data adjustments are all explained in papers that skeptics refuse to read and code which they refuse to review.
Data adjustments COOL THE GLOBAL RECORD.

Alcheson
Reply to  Steven Mosher
June 2, 2015 12:58 pm

Slight bit of deception…. Data adjustments COOL THE PAST to enhance the warming rate, which then helps bring the rate of change more in line with the defective models. Unfortunately, falls are coming earlier and winters lasting longer. Today’s climate seems very similar to what it was back in the 70s, and to anyone who has been around that long, that causes disbelief in the credibility of the adjustments. In addition, the satellites and weather balloons completely go against the warmists’ strong desire to show rapid and soon-to-be-catastrophic warming. It is very highly unlikely that the adjustments can be correct; they do NOT fit with people’s observations.

Reply to  Steven Mosher
June 2, 2015 6:23 pm

“Cool the global record”? This must be a case of somebody saying one thing and meaning another.
Look at the graph “Global Temperature (meteorological stations)” in Hansen’s 1999 paper (Figure 4). Compare that with the current “Global Temperature (meteorological stations)” at the GISS website. The rise in temperature over the 20th century is practically doubled.

PeterinMD
June 2, 2015 10:52 am

As I understand it, the common practice is to use a 1200 km gridded system to infill missing data and homogenize the data. To put that into context, that would be like comparing Baltimore, MD, where I live, with Ocala, FL. That is ludicrous in every sense of the word. Two completely different climatic areas. You could use January temps in Baltimore to cool the January temps of Ocala from the past.

Reply to  PeterinMD
June 2, 2015 11:47 am

Wrong.
See Willis’s post on the relationship between latitude, elevation and temperature.

tadchem
June 2, 2015 11:22 am

My favorite story about how ‘data adjustment’ can work in real life can be summed up in the punch line: “If you can’t afford the surgery to remove the tumor, Mr. Johnson, for a fraction of the cost I could touch up your X-rays.”

June 2, 2015 1:43 pm

Has anyone on this thread read Dr Mosher’s posts and links about data homogenization, understood them, and cared to post their own point-by-point explanation vs his?

richardscourtney
Reply to  warrenlb
June 3, 2015 12:41 am

warrenlb
I answer your questions in turn.
Yes, yes, and no because his sophistry is not worth the bother of refuting;
e.g. when he writes “Cool the global record” he means ‘cool the past to increase the apparent warming rate’.
Richard

June 2, 2015 1:52 pm

Mr. layman here.
My impression is that the surface station data is being used for something for which it was not designed. The stations were set up to record local conditions, not global. The data can’t be changed (adjusted, homogenized, etc.) to give a truly accurate global result.
But they can be changed to give a desired or expected result. (Desired by the politicians and “political scientist”. Expected by the prevailing hypothesis.)

Reg Nelson
June 2, 2015 2:12 pm

If the goal is to gain a better understanding of the Earth’s Climate, then the effort should be made on ensuring the quality and quantity of the data going forward.
Why bother looking at data from the Griffith Observatory from the 1940s when you’ve got the Hubble Space Telescope? It doesn’t make any scientific sense to use sparse and flawed data from the past.

Dr. Deanster
Reply to  Reg Nelson
June 2, 2015 7:23 pm

It most certainly does make sense if you are going to make a statement regarding the present in comparison to the past.
I’ve always had a problem with the data adjustment, because it changes the trend within the raw data. IF a temp is taken every day at 10 a.m. and it produces a trend, it is not likely that that trend would change had the temp been taken every day at 5:30 a.m. instead. Moving maxes and mins are also not a good measure, outside of establishing records here and there.
If I ever win the lottery, I’m going to get into climate science … simply because I want to answer a lot of questions that aren’t being answered.

June 2, 2015 9:00 pm

To suggest that NOAA deliberately adjusts temperature data in order “to achieve a desired effect” is a very serious charge and should not be made lightly.
If the author has evidence to show that scientists have deliberately misapplied TOBS adjustments or miscalculated homogenization with the specific intent of skewing the results in a specific direction, then he should present such evidence.
Otherwise such charges border on “bearing false witness against your neighbor”.

Reply to  David Sanger
June 2, 2015 9:39 pm

David, why the incessant need to keep going back to fiddle with temperatures from the past?? Here, let me answer this for you, since you likely will not answer honestly. Answer: Because the models, which every warmist believes must be right, show considerable warming as CO2 rises, and warmists believe that at least 3C or more of warming must occur between 1900 and 2100. The data, uncorrected, do NOT show the expected warming. Thus, the warmists conclude there MUST be errors in the data and come up with reasons why the adjustments must be made. And as long as the plateau continues, there will always be a need to correct the past in order to show that the warming continues unabated. Support for this answer is seen in the growing discrepancy between the satellites and the ground-based temperatures with every new correction.

RACookPE1978
Editor
Reply to  David Sanger
June 2, 2015 10:24 pm

Yes, such a charge is serious.
So is destroying the world’s economy and killing millions of innocents, in order to …. nothing.

Reply to  RACookPE1978
June 3, 2015 8:09 am

all the more reason why the author should provide evidence to back up his accusations..

Reply to  David Sanger
June 3, 2015 10:25 am

Sanger
Posters on this forum repeatedly make such accusations without evidence; moreover, Mr Watts claims to have written a paper ‘proving’ that the data is fraudulent, yet we have not seen it.
It’s all part of the process of rejecting AGW by proving there’s a conspiracy to commit fraud. The only problem with this scenario is that the entire world of science must be ‘in on it’, since every Science Academy, Scientific Professional Society, and major University in the world assert Earth is warming and Man is the Cause.
You would think these folks would be trying to figure out why the adjustments are made, instead of making unsupported accusations. It seems apparent they do so not because of supporting evidence, but because they don’t like the answers of Science re: AGW.