Grandma Learns About Data Adjustment: A little story about how data adjustment might work in everyday life.

Note: On Thursday of this week, NOAA/NCDC will attempt to rewrite the surface temperature record yet again, making even more “adjustments” to the data to achieve a desired effect. This story by Mr. Core is worth noting in the context of data spin that we are about to be subjected to – Anthony Watts

Guest essay by E. L. Core

Grandma hangs up the phone, beaming. She has just talked with her daughter-in-law, Gabrielle, who had said, “Final report cards are out, and Gavin has straight A’s!” So, Grandma hurries over to see this remarkable report card for herself.

Sitting down at the kitchen table with Gabrielle and Gavin, Grandma opens the report card expectantly — though she has noticed a sheepish look on her grandson’s face. She looks the report card over. And over. And over. Instead of seeing all A’s, she’s seeing three A’s, two B’s, one C, and a D.

Puzzled, and with a sheepish look on her own face now, she hesitatingly asks her daughter-in-law, “Didn’t you tell me… Dear… that Gavin got straight A’s?”

“Yes, I did”, Gabrielle replies.

Noting the look of confusion on her mother-in-law’s face, she continues, “Here. Let me explain.”

“There are three A’s on the report card. You see them” — she points — “here, here, and here. So, we know he gets A’s.”

“Now, this first B, here,” Gabrielle continues. “You must understand that Gavin didn’t like that class. If he had liked that class, he would have put in more effort, and he would have gotten an A. So, that B should really be an A.”

Grandma sits quietly.

“And this other B, here,” Gabrielle says. “You must understand that the teacher just had it in for Gavin. If he had had a different teacher, he would have gotten an A. So, that B, too, should really be an A.”

Grandma sits quietly.

“Now, the C,” her daughter-in-law continues. “You must understand that Gavin didn’t like the class, and the teacher had it in for him, too. If Gavin had liked the class, and if he had had a different teacher, he would have gotten an A. So the C should really be an A.”

Grandma sits quietly.

“Now, the D,” says Gabrielle. “Gavin liked that class, and he had a good teacher, too. But three of his friends got an A in this class; they also got A’s in the very same three classes that Gavin’s report card has A’s in. So, the D should really be an A.”

“And that’s why I told you that Gavin has straight A’s.”

Grandma sits quietly.

Then, Gavin’s sister walks into the room. A sheepish look comes over her face when her mother asks Grandma, “Would you like me to explain how Gavrilla really won at the track meet?”

Grandma leaves quietly.


E. L. Core has a B.S. in Mathematics and Computer Science and is an associate editor at the Catholic Lane website, catholiclane.com. His series “Uncommon Core: Climate Sanity” is forthcoming later this year.


132 thoughts on “Grandma Learns About Data Adjustment: A little story about how data adjustment might work in everyday life”

  1. No worries. Grade inflation by teachers is nothing compared to grade inflation by parents. If your child is not doing well, then that means you as a parent are not doing well. So, you inflate your child’s grades to inflate your own ego, which in turn allows you to discount what the teacher said about studying, showing your work, and turning in your homework.

    • Perhaps, but the parent can do lots and lots to help the kid, but the kid still has to be the one taking the test, and if he/she does not do well despite plenty of parental guidance, then what say you? Off subject, I know….

  2. If you want to make “adjustments” to the raw data, please follow these guidelines:
    1) Maintain all of the raw data, and provide it upon request.
    2) Specifically state the reason(s) for and amount(s) of adjustment(s) for every individual data piece adjusted. I’m looking for methodology here.
    3) Provide the adjusted data upon request.
    4) Require any work based on the adjusted data to clearly state, up front, that “adjusted data was used.”

    • The problem is there is no ‘raw data’. There are many fragments of data, but they cannot simply be merged into a single dataset and called ‘raw data’.
      For one, the distribution of weather stations is not an even distribution. To give them all the same weighting is clearly wrong. Then there are problems such as weather stations that are moved, changes in the time of day the data is collected, and changes to equipment. The homogenisation of the data is essential if you are to produce any meaningful data.

      • Congratulations John, your application to Warmista Liars Academy has been accepted.

        • The problem is there is no ‘raw data’. There are many fragments of data, but they cannot simply be merged into a single dataset and called ‘raw data’.

          Nonsense, John: you take a measurement and you publish it.
          You can even take a measurement, do some QA (providing all of the rules), add a QA flag, and release it all as raw data.
          It’s not hard to do.

        The homogenisation of the data is essential if you are to produce any meaningful data.

        This isn’t true either.

      • What you’re describing isn’t data, it’s a product. It’s like going to the organic food store, asking for cheese, and being handed a can of Cheez Whiz. I like Cheez Whiz, but it ain’t the kind of cheese you should get in an organic food store. We want the unprocessed raw data, fragments, warts and all, so we can tell what they did to make their product, and we want to see if we can actually reproduce that product. Until we can reproduce the underlying product, all of this AGW is just blue smoke and mirrors.

      • “…if you are to produce any meaningful data.” What is “produced” is not “data”, it is manufactured product. The raw data are those individual measurements of temperature together with the descriptors of location and condition of the instrument and time of reading. All operations on each individual datum should be recorded and available for review to assure that “homogenisation” is not just another word for “corruption”.

      • And still, this process creates a meaningless number with so much uncertainty (which is never reported) that it shows nothing statistically important is happening with temperature. Anyone who thinks that temperature change measured in the second decimal place is in any way significant seems to me to be absolutely anti-science.
        The one part I object to almost as much as the utter meaninglessness of a single number from a set of thermometers describing the Earth’s temperature at a given time, is using thermometers geographically spread far apart and “homogenizing” them into a virtual temperature somewhere else. Utter BS. Which is clear because these adjustments keep sequentially moving adjusted temperatures further in one direction (sure sign of a bias) with each new adjustment. This phraud is covered well elsewhere and is going to get even more attention in the near future, so I’ll not rehash in this comment.
        At least satellite data are almost uniformly spread already and I can almost accept something like a single number.
        Bruce

      • … the distribution of weather stations is not an even distribution. To give them all the same weighting is clearly wrong.

        The thing about weighting is that it is subjective: it requires human judgement as to how and how much to weight this station over that station, this factor over that factor. If only raw data were provided, that would certainly be “problematic”: it would allow different people to make different judgments or put the readings in different contexts. And if that happens, then the temperature record really becomes a circus. Or, I should say, gets exposed as a circus.
        Luckily we do not require people to make judgments as to what a gallon of petrol represents, or a pound of flour.

      • “Luckily we do not require people to make judgments as to what a gallon of petrol represents, or a pound of flour.”
        And you’re confused as to what a station measures?
        Infilling and homogenization increases the uncertainty, and few (if any) of the temp series offer a realistic value.
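
      The weighting being debated above is usually not a per-station judgment call but a fixed geometric convention. A minimal sketch, assuming anomalies are weighted by cos(latitude) as a stand-in for grid-cell area; the station list here is entirely hypothetical:

```python
import math

def area_weighted_mean(stations):
    """Average station anomalies, weighting each by cos(latitude),
    a common proxy for the shrinking area of a latitude-longitude
    cell toward the poles."""
    weights = [math.cos(math.radians(lat)) for lat, _ in stations]
    total = sum(w * anom for w, (_, anom) in zip(weights, stations))
    return total / sum(weights)

# hypothetical (latitude, anomaly) pairs: an equatorial station and
# one at 60 N, which counts only half as much
stations = [(0.0, 1.0), (60.0, 2.0)]
print(round(area_weighted_mean(stations), 3))  # 1.333
```

      The convention is objective once chosen, but the choice itself (grid size, empty cells, infilling) is exactly where the judgment described in the parent comment comes back in.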

      • “The homogenisation of the data is essential if you are to produce any meaningful data.”

        Yeah…I think I get it now. You have to add “meaning” otherwise we would get the wrong idea…!

      • OK John, I see where you are coming from. But how do you explain the ‘adjustments’ overwhelmingly being upwards? Sorry, sport, but that is just a bit too cute for me. I think I will just refer to the satellite/balloon data. It seems to be so… what is the right word? Robust? Reliable? Unadjusted?
        Maybe you are quite correct and it is just not possible to construct (contrive?) a meaningful number from such a raggedy-assed set of instrument data. So where does that leave us? Blind faith with its eyes gouged out?

      • Actually I agree with you, John. Data has to be processed. Measurements are subject to variables, it’s a fact. And the better we understand those variables, the clearer the image. I’m sorry that you’re experiencing a “pile on” after your comment. It reminds me of a similar comment made by Lindzen during a debate, after his opponent chided him for not trusting the data. The implication there was raw data.

      • owenvsthegenius
        Nobody “piled on”. Some refuted and others ridiculed nonsense.
        But you say

        Data has to be processed. Measurements are subject to variables, it’s a fact. And the better we understand those variables, the clearer the image.

        Please explain how you think data being “processed” is preferable to citing the error bounds of each datum. And an explanation of how that preference can be equated with the scientific method would also be appreciated.
        Thanking you in advance.
        Richard

        • Richard, raw data and the error bounds should accompany adjustments. It ought to be clear what processing took place and why. Adjustments are a moving target. As to the scientific method… we are always attempting to contextualize data. In the case of a LIDAR scan, a 3D modeler has to remove noise from the scan and fill in blanks (snow and rain can cause these); the data has to be processed and layers applied to create a working rig. The LIDAR scan is not as useful on its own. This is not to say “toss out the raw”; adjustments have to be made clear so they can be verifiable and justifiable, otherwise the adjustments are not useful.

      • All true, all irrelevant
        The data is the data. End of story. We don’t know what the data “should” be – if we did, we wouldn’t need the data.
        When you homogenise, show the original data and show what you did. Allow people to critique it, try and replicate it, show if you are wrong.
        Changing the underlying data because you know what it should say is simply wrong.

      • All true, all irrelevant
        The data is the data. End of story. We don’t know what the data “should” be – if we did, we wouldn’t need the data.

        Correct, but, for example, the global mean temperature is not data in the first place, since you can’t measure it. You can have raw data from stations or satellites (measured in the three spatial dimensions and one time dimension), from which you construct, in one way or another, the thing you purport to represent the global mean temperature, whatever that actually means.
        This question is surprisingly philosophical. I don’t think adjustment is wrong at all, but care should be taken in what you call the end result. Do you call it a measurement, or model? Do you realise the model is vulnerable to your conceptions and is, in fact, an interpretation or even opinion rather than data?
        What I’m trying to say is that people are willing to take a number at face value, be it with error bars or not, when the trouble is NOT with the missing error bar, but rather with what the number actually represents. People believe too much in numbers. It is very difficult to understand that the number can be anything, as the text around it actually defines its purported meaning. And this I say as a mere M.Sc. from a distance; I guess both scientists and laymen promote numbers too highly compared to the legend.
        ‘Your table is precise, yes, but your legend fails to tell accurately what it really represents.’

      • owenvsthegenius
        Thank you for your explanation of your point, which I requested.
        However, as Tony Hammond says, your explanation is “all true but irrelevant”.
        I asked

        Please explain how you think data being “processed” is preferable to citing the error bounds of each datum. And an explanation of how that preference can be equated with the scientific method would also be appreciated.

        You have not said in what – if any – way data being “processed” is preferable to citing the error bounds of each datum. And your discussion of attempts to “contextualize” data from other fields has no relevance of any kind.
        Furthermore, you say

        This is not to say “toss out the raw”; adjustments have to be made clear so they can be verifiable and justifiable, otherwise the adjustments are not useful.

        In climastrology – which is the subject we are discussing – the raw data ARE thrown away so according to you the “adjustments” to temperature data “are not useful”.
        The bottom line is as Tony Hammond says

        The data is the data. End of story. We don’t know what the data “should” be – if we did, we wouldn’t need the data.
        When you homogenise, show the original data and show what you did. Allow people to critique it, try and replicate it, show if you are wrong.
        Changing the underlying data because you know what it should say is simply wrong.

        Richard

        • Richard, I agree completely with Tony. I would add that, in many cases, the raw + error bounds are preferable to adjusted. In fact, the only instance where “tossing out the raw” is acceptable is when the raw cannot be read. This might occur when the data is so robust that it requires computational power + software + engineering that the layman can’t access. To clarify: the data has to be processed to be read. Otherwise, raw should always accompany adjusted. And oftentimes the adjusted is little better than an opportunity to display bias. Without verification the adjustments are not useful; furthermore, in the case of “climate science”, with so much at stake, raw data should not be proprietary. We should be careful giving authority to inferences.
          As to contextualizing: I used an example from a (related) field to illustrate an idea. I’m sure you get the point.

        • Richard, I almost forgot. Error bounds, yes they are a must. Without these we have nothing. Just a thought, how do we arrive at the margins for error? Please relate your answer to the topic

      • Hugh
        I agree with you when you say

        This question is surprisingly philosophical.

        but we part company when you say

        What I’m trying to say is that people are willing to take a number for its face value, let it be with error bars or not, when the trouble is NOT with the missing error bar, but rather what the number actually represents.

        The number cannot represent anything unless it includes its error bars. Absent the error bars the number could be representing any value between minus infinity and plus infinity: an indication that could be of any value is not an indication of any actual value.
        This was the purpose of my original question to owenvsthegenius; viz.

        Please explain how you think data being “processed” is preferable to citing the error bounds of each datum.

        In the absence of error bars any “adjustment” to the value has no real effect because it does not alter the fact that the value has no defined accuracy and precision whether or not it is adjusted.
        Richard

      • owenvsthegenius
        It seems we may be converging on some sort of agreement.
        You say

        Richard, I almost forgot. Error bounds, yes they are a must. Without these we have nothing. Just a thought, how do we arrive at the margins for error? Please relate your answer to the topic

        All my comments have been related to the topic.
        I refer you to the comment from Hugh and my subsequent answer to him.
        Also, we cannot provide error estimates because
        (a) temperature is an intensive property, so it cannot have a valid average according to physics
        but
        (b) average global temperature anomaly is calculated by ‘climate science’ although there is no definition of it (each team that computes it uses its own definition of it and changes that definition almost every month)
        and
        (c) there cannot be an agreed error estimate for a parameter that has no agreed definition
        additionally and importantly
        (d) there is no possibility of a calibration reference for average global temperature anomaly however it is defined.
        For a fuller assessment of these issues please see this item, especially its Appendix B.
        Richard

      • John, if there is a trend to global temperatures, then it must of necessity appear in the data from all individual stations. Otherwise, the trend cannot possibly be construed as a “global” trend. If regional trends overwhelm a global signal, then it seems reasonable that the global signal is not likely a significant one. Most of the other changes, such as reading times, should introduce a step change and should have a clear signal that can be removed without trouble. Before “adjusting” for station moves, it would be reasonable to determine IF the station moved. There are numerous adjustments for “relocated” stations that never were moved. There are some notorious instances where the “move” was attributed to a change in the rounding of latitude and longitude. In other instances, “moves” or instrumentation changes were assumed and corrected for, even though no such changes had occurred. Rather than rely on modeling assumptions about what kind of “signal” such changes should yield, it would be better to simply DO THE WORK to determine whether the change is a legitimate one or spurious. More importantly, when one is corrected about whether a station has moved or experienced instrumentation changes, then the “adjustments” need to be revoked. And if changes are going to be made to historical records, preservation of the original records is vital to any attempt to replicate or correct a combined (“merged”) data set.

    • 5) Put out a press statement to say that you have done 1 – 4.
      6) Make sure the storage computer system crashes so that everything is lost.
      Who said I was a cynic?

      • @Oldseadog:
        The computer need not crash. Simply have it operated and housed at the Clinton Data Center… all needed data will be kept and only the “private” data will evaporate…

    • The problem with (2) is that the adjustments are made, I suspect, in a long, old piece of Fortran code, without adequate comments or documentation. And it has been modified and added to over the years.

  3. So here’s my question. Every time that there is an adjustment, shouldn’t that increase the uncertainty of the actual measurement? Say that a max temperature of 90 +/- 0.5 degrees was measured, but after adjustments is now 89 degrees. The uncertainty has to be at least +/- 0.75 degrees now, doesn’t it?
    If you plotted the adjusted temperatures with adjusted error bars, would these adjustments really change anything?

    • The claim is that the adjustment and homogenization process reduces the measurement errors and increases the measurement precision. You can’t make this stuff up.

      • Well at least it is good for a laugh and maybe a bit of poetry:
        A rose is a rose is a rose, but global temperature is as fleeting as the wind.

      • “You can’t make this stuff up”. No I couldn’t. My imagination just isn’t strong enough. Climate Change “Scientists” seem to have mastered the technique though.
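
      The error-propagation question raised in this thread can be sketched in a few lines. This is a minimal illustration, assuming the adjustment itself is an estimate with its own (hypothetical) uncertainty and that the two errors are independent, so they combine in quadrature:

```python
import math

def adjusted_uncertainty(u_measurement: float, u_adjustment: float) -> float:
    """Combine two independent uncertainties in quadrature."""
    return math.sqrt(u_measurement ** 2 + u_adjustment ** 2)

# A reading of 90 +/- 0.5 degrees adjusted down to 89, where the 1-degree
# adjustment is itself assumed good to only +/- 0.5: the combined error
# bar grows; no sequence of such adjustments can shrink it.
print(round(adjusted_uncertainty(0.5, 0.5), 3))  # 0.707
```

      Under these assumptions an adjustment can only widen the error bar, which is the commenter’s point: claiming that homogenization reduces measurement error requires the adjustments themselves to be known exactly.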

  4. I would think the adjustments they made to the NCDC data would be to bring the overall result closer to the most accurate system we have for measuring global temperatures, the RSS. They sold us on spending billions of dollars on the satellite-based RSS because it would be so much more accurate than the measurements taken on land by the NCDC and others. Now, not only is RSS data less referenced than land-collected (and adjusted) NCDC data, the NCDC data is continually adjusted to INCREASE the divergence between NCDC and the more accurate RSS data. NCDC data is a sales pitch, not scientific data; its primary purpose is to justify the budgets for the NCDC by exaggerating the amount of global warming going on and increasing the sense of urgency for budgets that support climate monitoring and climate studies. If the purpose of the NCDC data was to be as accurate as possible, they would be adjusting it to more closely match the RSS data.

  5. When we married many, many years ago my bride was a size 12. She is still a size 12, and sometimes a size 10, but I can assure you her weight is not the same as when we married. Size adjustment anyone?

    • That’s “vanity sizing.” Mail order sellers, and other sellers, realize that clothing that is too large for a customer won’t be as readily returned as clothing that is too small. They want to avoid returns. So vanity sizing gives them a favorable margin of error.

  6. Looks like you have perplexed the resident lefty on the site at long last Anthony. What is Gavrilla, and what on earth is a ‘track meet’? I get the joke about adjustments, but the final line makes no sense in cold windy Wales!

    Reply:
    Gavrilla is Gavin’s female cousin, and a track meet is the US colloquialism for a track and field contest, you know, foot races, hurdles, shot put, pole vault, etc. I’ve recently been working with some instructors in the UK and boy do I understand your occasional confusion. ~ mod.

  7. How three independent groups plus individuals like Stokes can come to about the same relative value each month means that there must be an unbelievable coordination of dishonesty. Based on Stokes’ values, which we can see from day to day, what do you think it is: that everyone puts their thumbs on the scale once they see his?
    This is just a little batty.
    BTW, it looks like May is going to come in close to February this year. Get ready for more records.

    • No, it just takes them all applying approximately the same “best practices” to the raw data: infilling and homogenization based on a normalized, area- and latitude-based temp trend. They just need to read the same papers.
      One of the reasons I do neither: I wanted to see what the actual data said.

  8. The children sitting 5 desks away got A’s so Gavin’s grades were homogenized and adjusted up to A’s as the lower grades were obviously incorrectly recorded.

    • No, that’s not how it works. Gavin has been copying off Gavrilla’s test for two-thirds of the semester. Finally Teacher Grannywings catches Gavin cheating and moves him 5 chairs away, so Gavin starts getting D’s instead of A’s! Principal Dufus notes that Gavrilla and Gavin have historically gotten the same scores and now they aren’t, so he adjusts Gavin’s score upward without telling Miss Grannywings.

      • Later Dufus accepts that the A’s were incorrect in this cherry-picked case, but says that the school’s mean over the period 1850-2000 did not change because some A’s given in the 1920s had been reinterpreted as D’s.

  9. In real life, Gavin’s grades would be given as anomalies relative to a 30-year baseline somewhere in a past century, though. Which you compute by taking data from that period and adjusting it downwards.

    • Obviously Gavin was sitting on the CO2 enhanced side of the room so it was too hot to concentrate

  10. Well, this whole biased temperature data “Adjustment” business is just another example of Noble Cause Corruption. Data manipulation for a “Good Purpose” can’t be a sin…
    Two days ago, even a former Swiss Minister and President, Moritz Leuenberger, did admit that he plainly lied to the public in connection with a CO2 reduction law. Here’s the Swiss newspaper report about that confession:
    http://www.tagblatt.ch/ostschweiz/thurgau/kantonthurgau/tz-tg/Die-ganze-Wahrheit-haelt-gar-niemand-aus;art123841,4242625
    The crucial quote of Moritz Leuenberger in this article is as follows:
    «Der Klimagipfel in Kopenhagen kurz vor der Abstimmung zur Reduktion des CO2-Ausstosses war desaströs», gibt Leuenberger jetzt zu. Doch damals habe er dies absichtlich nicht den Medien gesagt und somit gelogen, damit die Schweizer dafür stimmen würden. Leuenberger: «Jetzt glaube ich, die Lüge ist legitim, wenn sie etwas Gutes bewirkt.»
    (English translation: “The climate summit in Copenhagen shortly before a (Swiss) referendum on the reduction of CO2 emissions was disastrous,” Leuenberger now admits. But at the time he deliberately did not tell this to the media, and thereby lied, so that the Swiss would vote for it (the CO2 reduction law). Leuenberger: “Now I believe the lie is legitimate if it achieves something good.”)
    Thus we see by this example quite plainly that “Noble Cause Corruption” is very real in Politics today!
    The big problem with this kind of behavior is that such “well-meaning zealots” only believe – but don’t actually know – whether their dishonest crusades will really help mankind…
    Just think, only 500 years ago, the majority of people believed that burning witches would be a very “Noble Cause” indeed. And today, Mr. Leuenberger and the majority of the misguided public believe that the vital and desert-greening plant-food CO2 is the new diabolical witch that must be hunted down…

  11. “Note: On Thursday of this week, NOAA/NCDC will attempt to rewrite the surface temperature record yet again, making even more “adjustments” to the data to achieve a desired effect. [. . .]” – Anthony Watts

    Well. This is only a partially sarcastic comment. Most of this comment is not sarcasm.
    / partial sarcasm on . . .
    I think NOAA/NCDC will justify rewriting the surface temperature record via more data ‘adjustments’ by insinuating that the IPCC endorsed GCMs (models) must be right. Thus, they will maintain that it is reasonable to significantly adjust the temperature data up to be more in agreement with the unquestionable models.
    . . . partial sarcasm off /
    John

  12. “Who controls the past controls the future; who controls the present controls the past.” George Orwell’s Nineteen Eighty-Four.
    Could you get a finer demonstration of this in action than ‘adjustments’ of past temperatures?
    Of course it could be just ‘lucky chance’ that all adjustments fall in favour of ‘the cause’, but with that type of luck you’d think they’d spend more time on the tables at Las Vegas.
    When you heap poor practice onto what is already in many ways poor data, you cannot ‘magically’ turn it into good data, no matter how much you ‘believe’.

  13. I seem to remember there was a problem: UK Met Office couldn’t reproduce their “homogenized” data or maybe it was some other office in the UK.

  14. ELCore (@OneLaneHwy) wrote this final line of his story,
    “Grandma leaves [the room] quietly.”

    ELCore,
    And then there is the rest of your wonderful story . . .
    Grandma then picks up her mobile phone and dials her son.
    When her son answers she says, “Your wife is thinking in an odd way since she finished being an expert reviewer for the IPCC’s AR5.”
    Her son says, “Mom, her odd way of thinking started before that while she was getting a Masters Degree in Climate Science from Penn State University.”
    John

  15. And that, folks, is how the UN, the US-EPA, NOAA, NASA and academic atmospheric sciences do their research and work!!!

  16. The lack of controlled process or methodology for adjustments at NOAA is a tragic-comedy.
    But the constant adjustments do point out the difficulty of the concept of “Global Temperature”. I still insist it is a low-confidence measurement because at this time it is an extremely difficult thing to define and measure. There is progress being made, and satellites give us hope of some day arriving at the decisive definition of “Global Temperature”. Currently the whole business is sketchy, but throwing the Tarot cards I predict in about 100 years we’ll get it. My Tarot cards are computer modeled so you can rest assured of their accuracy.

  17. In defense of NOAA/NCDC, someone has to be in last place on climate science credibility. An argument could be made that NASA GISS is probably in last place on temperature dataset credibility with NOAA/NCDC only slightly more credible.
    John

  18. While some of Gavin’s excuses are a little weak, one must understand that he started off the year knowing he was going to get straight A’s at the end. After trying all the various ways to learn, he got a result. This result did not fit his mental model. Obviously the problem lay in the data, as the model was perfect and exactly what he needed for further progress in school.
    Thus the data must be defective and had to be corrected for bias, first to change them to A’s and then an explanation conjured justifying each. He had two friends examine the original and corrected data and they agreed with the process and complimented him for the fine work he had put in throughout the year. They both needed to make adjustments to their raw data as well because there was something obviously wrong with each data set. Gavin agreed to return the favour and examine their work to see if it met the same standards he had used, a method which he already had validated by two external reviewers.
    When parents criticised the result Gavin explained they were not students in these classes and had no standing to comment. Only students could understand the pressure they were under to produce A-Grade work and how he was harassed by incompetent teachers.
    An investigation of the whole matter by fellow students concluded that the locks on the doors in the men’s bathroom were faulty and needed upgrading.

  19. I skipped school because I didn’t like going to classes. I became an aerospace engineer. That should count for straight A’s in the school I didn’t go to.

  20. Anthony, I refer to your article of June 6, 2012:
    http://wattsupwiththat.com/2012/06/06/noaas-national-climatic-data-center-caught-cooling-the-past-modern-processed-records-dont-match-paper-records/
    Your following comment was spot on:
    “Is this purposeful mendacity, or just another example of confirmation bias at work? Either way, I don’t think private citizen observers of NOAA’s Cooperative Observer Program who gave their time and efforts every day for years really appreciate that their hard work is tossed into a climate data soup then seasoned to create a new reality that is different from the actual observations they made. In the case of Arizona and changing the Climate Divisions, it would be the equivalent of changing state borders as saying less people lived in Arizona in 1934 because we changed the borders today. That wouldn’t fly, so why should this?”

  21. One way to treat data that isn’t well behaved is to just take the logarithm of it and then plot that. If it’s not quite right, just take the log of it again. That usually works.
    If it’s necessary to match up thermometric data with proxy data, you might have to try a few adjustment factors to get them to match up.

  22. Gavin Schmidt — I Got The Data In Me
    (with apologies to Kiki Dee)
    I got no troubles at NASA
    I’m a rocket nothing can stop
    Survival’s always the first law
    And I’m in with those at the top
    I heat up
    I cool down
    A site I don’t like I discard it
    The high and the mighty can frown
    So say what they want they reward it
    Man is the measure
    Of all things that be!
    The Progressive Alliance
    And its New Age Science
    Say I got the data in me!
    I work in the mists and the fogs
    By methods that none can review
    To hide like a fox from the dogs
    The premise of all that I do
    The thermometers all want skilling
    If their readings are not alarming
    As the early ones all need chilling
    So the later ones all need warming
    Man is the measure
    Of all things that be!
    What Protagoras said
    Onto Nietzsche led
    So I got the data in me!
    The truth’s a Consensus of thought
    We agree to agree about
    A joy for which long we have sought
    Our minds ever free of all doubt
    We are born uncertain of heart
    And live in fear of things unknown
    But Consensus is truly the start
    Of our souls becoming our own
    Man is the measure
    Of all things that be!
    To Progressive drums
    The Superman comes!
    And I got the data in me!
    I heat up
    I cool down
    A site I don’t like I discard it
    The high and the mighty can frown
    So say what they want they reward it
    Eugene WR Gallun

  23. As long as everyone here takes this as a joke, nothing will happen. This is in fact criminal activity, akin to that of the Mafia. It needs to be taken up by lawyers, and the people doing this need to be charged.

  24. When I was a lad, there were two things you just did not do in the laboratory. One was work without eye protection, and the other was alter your observations. I [guess] in a computer lab, you don’t need the eye protection either…

    • Pointman — A real message??? That is just too good to be true. — Eugene WR Gallun

  25. Weather stations measure the temperature of air in thermal contact with the underlying terrain. Each site has a different terrain, and for a host of landscape features documented by Pielke Sr., the temperature patterns will differ, even in nearby locations. However, if we have station histories (and we do), then trends from different stations can be compared to see similarities and differences.
    In summary, temperatures from different stations should not be interchanged or averaged, since they come from different physical realities. The trends can be compiled to tell us about the direction, extent and scope of temperature changes.
    https://rclutz.wordpress.com/2015/03/20/auditing-the-abuse-of-temperature-records/

    • The trends can be compiled to tell us about the direction, extent and scope of temperature changes.

      The process I use is to calculate the difference between one day and the next for each station. If a station is changed, it affects only that day or range of days. I feel the only accurate baseline for a station is itself.
      It’s not perfect, but all of the other ways (guessing, or a baseline of averages) have so many other things that can go wrong; I wanted to see how each station’s temp evolves.
      It’s the same reason I don’t infill or homogenize. I also don’t exclude data based on its value unless it’s drastically off, like above 200°F or below −200°F.
      I do some station exclusion based on the number of yearly samples, depending on what I’m looking for, but I also produce a report of the stations I used, sample sizes, and a few other parameters.
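The per-station first-difference approach described above can be sketched in a few lines. This is a minimal illustration, not the commenter’s actual code; the record layout, function names, and thresholds (the ±200°F sanity bound, the yearly sample cutoff) are assumptions taken from the description.

```python
from collections import defaultdict

def first_differences(records, min_yearly_samples=300):
    """Day-to-day temperature differences per station, using each
    station only as its own baseline (no infilling, no homogenizing)."""
    by_station = defaultdict(list)
    for station_id, date, temp_f in records:
        # Exclude a value only if it is drastically impossible
        if -200.0 < temp_f < 200.0:
            by_station[station_id].append((date, temp_f))
    diffs = {}
    for station_id, series in by_station.items():
        series.sort()  # chronological order (ISO dates sort correctly)
        if len(series) < min_yearly_samples:
            continue  # too few samples to use this station
        diffs[station_id] = [
            (d2, t2 - t1)
            for (d1, t1), (d2, t2) in zip(series, series[1:])
        ]
    return diffs
```

Because each station is differenced only against itself, a step change at one station perturbs only the day or days on which it occurs, rather than contaminating a regional average.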

    • Read Willis.
      Over 90% of temperature variation is determined by latitude and elevation.
      Second, nobody in their right mind averages temperatures.

  26. What is really needed to solve this problem is a global temperature proxy that goes up to the 1980s. After that we have satellite data. Does such a proxy exist? Seems like there are all kinds of proxy records. This may already exist but has not been synced up with the satellite data.

  27. “What is really needed to solve this problem is….
    What’s really needed is a cogent understanding of the why and how of homogenization rather than the wild and unsupported claims of conspiracy. Mr Watts has promised to issue a paper on this topic, but it has failed to appear. Why?

    • Well Mr. Beeton, it’s because of people like you, who will attack it for sport, that we have been very, very careful, almost to a fault. We completely started over [from] our original effort, and since the paper is being done with zero funding (again, because of people like you, who will attack us for even getting a dime for the effort), it takes a very long time when everything is volunteer. That’s all I’m going to say about it, but feel free to be as upset as you wish.

    • “What’s really needed is a cogent understanding of the why and how of homogenization rather than the wild and unsupported claims of conspiracy. Mr Watts has promised to issue a paper on this topic, but it has failed to appear. Why?”
      1. The code that explains the adjustments has been posted for a very long time. Nobody cares to read it or run it. Nobody cares to look at why it handles specific cases the way it does. Iceland is a good example.
      2. The adjustments cool the record. Skeptics don’t want to address that because it doesn’t fit the fraud/conspiracy theory.
      3. Watts 2012 (yet to be published), if correct, still won’t address the issue. The land is 30% of the total; the US records are less than 5% of the total. The US is the worst record in terms of radical changes to observation practice. In other words, even if you found that the land record in the US was biased high by 50% since 1979, that would not change the core science in any material way.

  28. A cross-blog consortium of senior sceptics needs to come together to work on this one. There are too many divisive egos getting in the way of highlighting what is an outright scandal.
    * Select adjusted data examples.
    * Show how far they’ve been adjusted, and the consequences of the ‘new’ data.
    * Challenge the relevant data provider to explain the adjustments, with workings.
    * Demand politicians launch official inquiry.
    We have FIFA execs, bank interest-rate setters and individual FX traders being charged for manipulating data to enrich themselves. Climate science guys are doing that AND misdirecting billions of pounds of public money. These people need holding to account. Why the silence?

      • Because the so-called “conspirators” – and there is nothing “conspiratorial” at all about their stated claims and world-wide behavior – have spent billions worldwide promoting their goals, their methods, and their motives. YOU are the ones who use “conspiracy” and all of its implied hidden agendas. This IS United States policy, the US “highest global security threat” for the President AND Sec of State, DOD, and all of its subordinate agencies and all of their 92 billions in CAGW monies…..
        It is totally political and political-economic driven and political-Gaea-theist. No conspiracies at all needed.

      • @RACookPE1978
        “This IS United States policy, the US “highest global security threat” for the President AND Sec of State, DOD, and all of its subordinate agencies and all of their 92 billions in CAGW monies…..”
        In order for this amazing statement of yours to be true, ALL these parts of the US Government are complicit in your posited deception.
        What’s more, ALL the nations of the planet must be in on it as well, since the National Science Academies of every nation on Earth (China, Japan, UK, France, Germany, Canada, Australia, etc.) take formal positions concluding AGW. EVERYONE. Plus NASA, NOAA, major Universities, and all Scientific Professional Societies.
        A bit of a stretch, as they say… or perhaps you don’t say…

    • Actually, Roger Pielke Sr. called for a collection of mutually agreed-upon experts, none of whom had an interest in the outcome.
      So, not a team of skeptics. Not a team of non-skeptics, but mutually agreed-upon experts.
      GWPF had an opportunity to follow Roger’s prescription (one endorsed by Anthony, BTW), but GWPF failed.

    • * Select adjusted data examples.
      * Show how far they’ve been adjusted, and the consequences of the ‘new’ data.
      * Challenge the relevant data provider to explain the adjustments, with workings.
      * Demand politicians launch official inquiry.
      Data adjustments are all explained in papers that skeptics refuse to read and code which they refuse to review.
      Data adjustments COOL THE GLOBAL RECORD.

      • Slight bit of deception… Data adjustments COOL THE PAST to enhance the warming rate, which then helps bring the rate of change more in line with the defective models. Unfortunately, falls are coming earlier and winters are lasting longer. Today’s climate seems very similar to what it was back in the ’70s, to anyone who has been around that long, causing disbelief in the credibility of the adjustments. In addition, satellites and weather balloons completely go against the warmists’ strong desire to show rapid and soon-to-be-catastrophic warming. It is very highly unlikely that the adjustments are correct; they do NOT fit with people’s observations.

      • “Cool the global record”? This must be a case of somebody saying one thing and meaning another.
        Look at the graph “Global Temperature (meteorological stations)” in Hansen’s 1999 paper (Figure 4). Compare that with the current “Global Temperature (meteorological stations)” at the GISS website. The rise in temperature over the 20th century is practically doubled.

  29. As I understand it, the common practice is to use a 1200 km gridded system to in-fill missing data and homogenize the data. To put that into context, that would be like comparing Baltimore, MD, where I live, with Ocala, FL. That is ludicrous in every sense of the word. They are two completely different climatic areas. You could use January temps in Baltimore to cool the January temps of Ocala from the past.
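For perspective on the 1200 km figure, a quick great-circle calculation shows Baltimore and Ocala are indeed roughly that far apart. This is a generic haversine sketch; the coordinates are approximate city centres, not weather station locations.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Approximate city-centre coordinates (illustrative only)
baltimore = (39.29, -76.61)
ocala = (29.19, -82.14)
print(f"Baltimore to Ocala: {haversine_km(*baltimore, *ocala):.0f} km")
# prints a distance of roughly 1200 km
```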

    • Wrong.
      See Willis’ post on the relationship between latitude, elevation and temperature.

  30. My favorite story about how ‘data adjustment’ can work in real life can be summed up in the punch line: “If you can’t afford the surgery to remove the tumor, Mr. Johnson, for a fraction of the cost I could touch up your X-rays.”

  31. Has anyone on this thread read Dr Mosher’s posts and links about data homogenization, understood them, and cared to post their own point-by-point explanation vs his?

    • warrenlb
      I answer your questions in turn.
      Yes, yes, and no because his sophistry is not worth the bother of refuting;
      e.g. when he writes “Cool the global record” he means ‘cool the past to increase the apparent warming rate’.
      Richard

  32. Mr. layman here.
    My impression is that the surface station data is being used for something for which it was not designed. The stations were set up for local conditions, not global. The data can’t be changed (adjusted, homogenized, etc.) to give a truly accurate global result.
    But they can be changed to give a desired or expected result. (Desired by the politicians and “political scientists”. Expected by the prevailing hypothesis.)

  33. If the goal is to gain a better understanding of the Earth’s Climate, then the effort should be made on ensuring the quality and quantity of the data going forward.
    Why bother looking at data from the Griffith Observatory from the 1940s when you’ve got the Hubble Space Telescope? It doesn’t make any scientific sense to use sparse and flawed data from the past.

    • It most certainly does make sense if you are going to make a statement regarding the present in comparison to the past.
      I’ve always had a problem with the data adjustment, because it changes the trend within the raw data. If a temp is taken every day at 10 a.m. and it produces a trend, it is not likely that the trend would change had the temp been taken every day at 5:30 a.m. instead. Moving maxes and mins are also not a good measure, outside of establishing records here and there.
      If I ever win the lottery, I’m going to get into climate science … simply because I want to answer a lot of questions that aren’t being answered.
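The point about a fixed reading time can be illustrated with a toy model (synthetic data, not real observations): if readings are always taken at the same hour, the daily cycle adds a constant offset and leaves the fitted trend unchanged. This sketch deliberately ignores the min/max-thermometer reset problem that motivates real TOBS adjustments.

```python
import math

def slope(ys):
    """Least-squares slope of ys against day index 0..n-1."""
    n = len(ys)
    mx = (n - 1) / 2
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(ys))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

def temp(day, hour):
    """Toy temperature: slow warming trend plus a fixed diurnal cycle."""
    trend = 0.001 * day                                   # degrees per day
    daily = 10 * math.sin(2 * math.pi * (hour - 6) / 24)  # diurnal cycle
    return 50 + trend + daily

days = range(365)
at_10am = [temp(d, 10.0) for d in days]
at_530am = [temp(d, 5.5) for d in days]
print(slope(at_10am), slope(at_530am))  # both slopes are 0.001
```

Sampling at 5:30 a.m. instead of 10 a.m. shifts every reading by the same constant, so the fitted warming rate is identical at both hours.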

  34. To suggest that NOAA deliberately adjusts temperature data in order “to achieve a desired effect” is a very serious charge and should not be made lightly.
    If the author has evidence to show that scientists have deliberately misapplied TOBS adjustments or miscalculated homogenization with the specific intent of skewing the results in a specific direction, then he should present such evidence.
    Otherwise such charges border on “bearing false witness against your neighbor”.

    • David, why the incessant need to keep going back to fiddle with temperatures from the past? Here, let me answer this for you, since you likely will not answer honestly. Answer: because the models, which every warmist believes must be right, show considerable warming as CO2 rises, and warmists believe that at least 3°C or more of warming must occur between 1900 and 2100. The data, uncorrected, do NOT show the expected warming. Thus, the warmists conclude there MUST be errors in the data, and come up with reasons why the adjustments must be made. And as long as the plateau continues, there will always be a need to correct the past in order to show that the warming continues unabated. Support for this answer is seen in the growing discrepancy between the satellite and ground-based temperatures with every new correction.

    • @David Sanger
      Posters on this forum repeatedly make such accusations without evidence; moreover, Mr Watts claims to have written a paper ‘proving’ that the data is fraudulent, yet we have not seen it.
      It’s all part of the process of rejecting AGW by proving there’s a conspiracy to commit fraud. The only problem with this scenario is that the entire world of science must be ‘in on it’, since every Science Academy, Scientific Professional Society, and major University in the world assert Earth is warming and Man is the Cause.
      You would think these folks would be trying to figure out why the adjustments are made, instead of making unsupported accusations. It seems apparent they do so not because of supporting evidence, but because they don’t like the answers of Science re: AGW.

Comments are closed.