More on the Bombshell David Rose Article: Instability in the Global Historical Climate Network

There has been a visceral reaction by the defenders of the climate faith to the Mail on Sunday article by David Rose…

[Image: the Mail on Sunday article by David Rose]

…where the Karl et al. 2015 “pausebuster” paper was not just called into question by a NOAA whistleblower, who [says] procedures weren’t followed and that the authors “played fast and loose with the figures”, but was basically called fraudulent on the face of it, because it appears to have been done for political gain. In my opinion the lead authors, Thomas Karl and Thomas Peterson, who both retired from NOAA in the last two years, made this their “last big push”, so they didn’t fear any retribution.

Having met both of these people, and seen their zealotry, none of the shenanigans brought out by the David Rose article surprised me.

The faithful have been claiming that there’s no difference between the NOAA and HadCRUT temperature datasets depicted in the Rose article, saying it’s a baseline error that gives the offset. I’ll give them that; it may simply have been a mistake by the Mail on Sunday graphics department, I don’t know.

[Mail on Sunday graphic comparing the NOAA and HadCRUT datasets, showing the disputed offset]

When the baselines for anomalies are matched, the offset goes away:


Comparison of HadCRUT4 and NOAA global land/ocean monthly temperature anomalies put on a common 1961-1990 baseline. h/t The CarbonBrief
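For anyone who wants to reproduce this, re-baselining is trivial: subtract each series’ own 1961-1990 mean so that both anomaly series are expressed against the same reference period. A minimal Python sketch (the file names and column layout here are hypothetical placeholders for however you saved the two datasets):

    import pandas as pd

    def rebaseline(series, start="1961-01-01", end="1990-12-31"):
        # Re-express a monthly anomaly series against its own 1961-1990 mean.
        return series - series.loc[start:end].mean()

    # Hypothetical CSVs, each with a 'date' column and an 'anom' column.
    hadcrut = pd.read_csv("hadcrut4_monthly.csv", index_col="date", parse_dates=True)["anom"]
    noaa = pd.read_csv("noaa_monthly.csv", index_col="date", parse_dates=True)["anom"]

    # Once both are on the common baseline, the apparent offset disappears.
    both = pd.DataFrame({"HadCRUT4": rebaseline(hadcrut), "NOAA": rebaseline(noaa)})
    print(both.tail())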

BUT… there are other serious problems in the global climate data.

Despite what you might think, the NOAA and HadCRUT data are not entirely “independent”. They both use Global Historical Climate Network (GHCN) data, and the GHCN was administered by… drum roll… Thomas Peterson of NOAA, one of the co-authors of the Karl et al. 2015 “pausebuster” paper.

It’s the fox guarding the henhouse, and as you can see below, the data is seriously shonky.

Paul Matthews writes at the website CliScep:


The purpose of this post is to confirm one detail of Bates’s complaint. The Mail article says that “The land temperature dataset used by the study was afflicted by devastating bugs in its software that rendered its findings ‘unstable’.” and later on in the article, “Moreover, the GHCN software was afflicted by serious bugs. They caused it to become so ‘unstable’ that every time the raw temperature readings were run through the computer, it gave different results.”

Bates is quite correct about this. I first noticed the instability of the GHCN (Global Historical Climatology Network) adjustment algorithm in 2012. Paul Homewood at his blog has been querying the adjustments for many years, particularly in Iceland; see here, here, here and here for example. Often, these adjustments cool the past to make warming appear greater than it is in the raw data. When looking at the adjustments made for Alice Springs in Australia, I noticed (see my comment in this post in 2012) that the adjustments made to past temperatures changed, often quite dramatically, every few weeks. I think Paul Homewood also commented on this himself somewhere at his blog. When we first observed these changes, we thought that perhaps the algorithm itself had been changed. But it became clear that the adjustments were changing so often that this couldn’t be the case, and that it was the algorithm itself that was unstable. In other words, when new data was added to the system every week or so and the algorithm was re-run, the resulting past temperatures came out quite differently each time.

Here is a graph that I produced at the time, using data that can be downloaded from the GHCN ftp site (the unadjusted and adjusted files are ghcnm.tavg.latest.qcu.tar.gz and ghcnm.tavg.latest.qca.tar.gz respectively) illustrating the instability of the adjustment algorithm:

[Graph: Alice Springs raw temperature record vs. GHCN-adjusted versions as reported in January, March and May 2012]

The dark blue line shows the raw, unadjusted temperature record for Alice Springs. The green line shows the adjusted data as reported by GHCN in January 2012. You can see that the adjustments are quite small. The red line shows the adjusted temperature after being put through the GHCN algorithm, as reported by GHCN in March 2012. In this case, past temperatures have been cooled by about 2 degrees. In May, the adjustment algorithm actually warmed the past, leading to adjusted past temperatures that were about 3 degrees warmer than what had been reported in March! Note that all the graphs converge together at the right-hand end, since the adjustment algorithm starts from the present and works backwards. The divergence of the lines as they go back in time illustrates the instability.

There is a blog post by Peter O’Neill, Wanderings of a Marseille January 1978 temperature, according to GHCN-M, showing the same instability of the algorithm. He looks at adjusted temperatures in Marseille that exhibit the same apparently random jumping around, although the amplitude of the instability is a bit lower than in the Alice Springs case shown here. His post also shows that more recent versions of the GHCN code have not resolved the problem, as his graphs go up to 2016. You can find several similar posts at his blog.

There is a lot more to be said about the temperature adjustments, but I’ll keep this post focused on this one point. The GHCN adjustment algorithm is unstable, as stated by Bates, and produces virtually meaningless adjusted past temperature data. The graphs shown here and by Peter O’Neill demonstrate this. No serious scientist should make use of such an unstable algorithm. Note that this spurious adjusted data is then used as the input for widely reported datasets such as GISTEMP. Even more absurdly, GISS carry out a further adjustment of their own on the already-adjusted GHCN data. It is inconceivable that the climate scientists at GHCN and GISS are unaware of this serious problem with their methods.

Finally, I just downloaded the latest raw and adjusted temperature datasets from GHCN as of Feb 5 2017. Here are the plots for Alice Springs. There are no prizes for guessing which is raw and which is adjusted. You can see a very similar graph at GISS.

[Graph: Alice Springs raw and adjusted GHCN data as of February 5, 2017]

Full post: https://cliscep.com/2017/02/06/instability-of-ghcn-adjustment-algorithm/
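For readers who have kept dated copies of the qcu/qca files, this kind of comparison is straightforward to reproduce. GHCN-M v3 .dat files are fixed-width: an 11-character station ID, a 4-digit year, a 4-character element code, then twelve monthly values in hundredths of a degree C with -9999 for missing. A Python sketch (the snapshot file names are hypothetical, and the Alice Springs v3 station ID shown is an assumption that should be checked against the station inventory):

    import matplotlib.pyplot as plt

    def annual_means(path, station_id, element="TAVG"):
        # Parse complete-year annual means for one station from a GHCN-M v3 .dat file.
        years, means = [], []
        with open(path) as f:
            for line in f:
                if not (line.startswith(station_id) and line[15:19] == element):
                    continue
                vals = [int(line[19 + 8 * m : 24 + 8 * m]) for m in range(12)]
                if -9999 not in vals:  # keep only complete years
                    years.append(int(line[11:15]))
                    means.append(sum(vals) / 100.0 / 12.0)  # hundredths of a degree C
        return years, means

    ALICE = "50194326000"  # assumed GHCN v3 ID for Alice Springs -- verify before use
    for path, label in [("qcu_2012-01.dat", "raw"),
                        ("qca_2012-01.dat", "adjusted Jan 2012"),
                        ("qca_2012-03.dat", "adjusted Mar 2012")]:
        plt.plot(*annual_means(path, ALICE), label=label)
    plt.ylabel("annual mean temperature (C)")
    plt.legend()
    plt.show()

Diffing two qca snapshots of the same station is an even quicker way to see the instability: the raw file barely changes between downloads, while the adjusted history moves wholesale.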


314 thoughts on “More on the Bombshell David Rose Article: Instability in the Global Historical Climate Network”

    • They tell you that the man or woman reading the thermometer back in 1880 misread by 3 degrees.

      Those who believe that will also believe in airborne pigs.

      • jorgekafkazar February 7, 2017 at 12:00 pm
        “Hot Under: The linked article gives zero mention of other whistleblowers.”

        This looks like a “mention” to me:

        “The committee aide said they had heard from other NOAA whistleblowers as well, but would not bring that evidence forward until given permission by sources.”

      • Airborne pigs have been an essential element of the CAGW air-land-sea integrated assault on the scientific method since Hansen first addressed a thermo-enhanced House/Senate committee. Airborne pigs with stealth capacity have been part of NOAA’s armoury for quite some time now, as the Rose post reveals. We can now see very clearly that it has been very challenging to keep these airborne pigs flying, and the software is vital, perhaps even more vital than the pigs themselves.

      • The grant hogs of the IPCC fly all over, constantly! They only touch down to cash cheques and spew garbage science papers that they write up in an afternoon.

      • Airborne pigs flew a lot lower in the past. They have since been trained to fly higher and higher. Any pig caught level flying has been culled.

      • It slowly changes from 3 degrees to zero. Was it a magical shed that read too warm at the start of the readings but slowly got more reliable as time went on?

      • Too high from what? That’s the problem with all this temperature data. It “assumes” the outside temperature was a certain amount and then adjusts the data. Where’s the evidence? Where was the thermometer? How does one know the “correct” outdoor temperature? If we know what the temperature is supposed to be without data, why gather data at all? (of course, it seems in many cases, they don’t)

      • According to the NT Library, there has been a Stevenson screen at the Darwin PO from at least Nov 1889 until the PO was bombed during WW2.

      • climanrecon @10:20: the claim is that the historical record shows warming; however, when one looks at the raw data the warming disappears. Thus the support for the CAGW thesis is entirely due to the adjustments. You can claim these adjustments are reasonable, but the fact is that when support for a thesis is entirely due to “adjustments”, alarm bells should be ringing very loud and clear. It is a huge red flag for confirmation bias. If one cannot show confirmation in the raw objective data, the thesis has to be considered shaky at the very best.

        When, on top of the above, one sees repeated adjustments (to the earlier adjustments) over time and when each further adjustment increasingly supports the original thesis it is virtually proof of “at best” massive unjustified confirmation bias and “at worst” outright fraud.

      • Then the data should have been discarded. It is crystal clear it was never fit for purpose, and I imagine that could be said of an awful lot, if not most, of the data around the globe. This whole exercise is based on data that is unfit for purpose, which is then manipulated so self-serving funding seekers can keep themselves in the manner to which they like to be kept.

      • “Temperatures in 1880 may well have been recorded in a poorly ventilated shed, so yes, could be 3 degrees too high.”

        Pure speculation.

        An adjustment of 3°C means that there is at least ±3°C error margin that should have been carried forward.

      • ” …recorded in a poorly ventilated shed … ”
        Bunk. It is adjustment creep.
        If the shed was ventilated, e.g. open to prevailing airflow, then the temperature inside it would have been close to the outside temperature. If enclosed or unventilated, the interior temperature would likely have been a lot more than 3° higher. Either way, night temperatures would have been the same, as typical sheds (e.g. uninsulated) lose heat very rapidly after sunset. Some of us have actually spent time evaluating factors like these rather than sitting on our backsides speculating about them. Like Jorge says, show us the shed.

      • “…could be 3 degrees too high.”
        If it was in a simple shed, then got moved to a Stevenson screen later, there would be a noticeable discontinuity in the recorded data. The fact that there isn’t is strong evidence that this supposition is false.

        The same trick has been done in Greenland, where they take their temperature record seriously. The post-hoc modifications to the official temperature record are even less justifiable than for Australian data.

      • That pdf says nothing about those adjustments. It refers only to a site move in 1974 and to variations in the height of nearby grass having some unspecified possible effect on readings at other unspecified times.

      • Yet another reason I never homogenized my work. I wanted to see what was recorded, as I believe they had the best knowledge to effect any required adjustments then, not 80 years later.

      • @climanrecon
        Your BoM homogenization reference refers to photographs of grass growth and rainfall etcetera after 1974, which in its application could be an interesting separate discussion along with volcanoes, ENSO, UHI and whatnot.

        Rather than consider your speculations back to 1880, let’s instead look at available data from the beginning of time according to the Oz BoM (aka ACORN and claimed as ‘World’s best practice’). Their daily max & min data for Alice Springs are available here:

        http://www.bom.gov.au/climate/change/acorn-sat/#tabs=Data-and-networks

        If you plot the mean from 1910, the BoM homogenised linear trend gives a temperature rise of ~1.8 C.

        I don’t know where the GHCN data files are, but I’ve taken the final figure in the essay for Alice GHCN and eyeballed the linear trends for raw (blue) and “corrected” (red) as from 1910:
        T rise Blue: ~ 1.0 C
        T rise Red ~ 2.7 C

        Do you not think it’s odd that the GHCN experts adjust higher than the Oz experts when “correcting” Oz data?
        Hey look! Don’t trust me. Download the data into say Excel and check it out.

        Bob Fernley-Jones
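        For anyone who wants to check trends like these without eyeballing, a least-squares fit on annual means takes a few lines of Python (the CSV layout here is an assumption; adapt it to whatever the BoM download actually provides):

            import numpy as np
            import pandas as pd

            # Hypothetical layout: one row per year, columns 'year' and 'tmean'.
            df = pd.read_csv("alice_springs_annual.csv")
            df = df[df["year"] >= 1910].dropna()

            slope, _ = np.polyfit(df["year"], df["tmean"], 1)  # degrees C per year
            span = df["year"].max() - 1910
            print(f"trend: {10 * slope:.3f} C/decade; total rise over {span} years: {slope * span:.2f} C")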

      • climanrecon: “so yes, could be 3 degrees too high”

        I can understand the individual adjustments and the logic behind some of them. However, when you look at the scope of the changes, it invariably ends up with them cooling the past and warming the present. How is it even remotely possible for that to be the end result in all of these cases?

        Even if done innocently, there would seem to be some motivated reasoning going on. The scientists presume there must be warming, and then go hunting to find it. And voilà, indeed, they are finding it. But it is a bias nonetheless.

        Time and time again they are cooling the past and warming the present to steepen the slope. To a layperson like myself, that seems extremely suspicious.

      • It seems like the new database GHCN v4 will fix the instability issue in Alice Springs. More nearby stations give support, and remove the algorithm’s indecisiveness about what the adjustments should be.

        Trends in Alice Springs 1941-2016 (for the moment)
        GHCNv3 unadjusted: 0.120 C/decade
        GHCNv3 adjusted 0.237 C/decade

        GHCNv4 adjusted 0.146 C/decade

        The GHCNv4 adjustment scraps all data prior to 1941 (unreliable? quality issues?)
        https://www1.ncdc.noaa.gov/pub/data/ghcn/v4/beta/products/StationPlots/AS/ASN00015590/

        As a validation of the GHCNv4 trend we could use CRUTEM4 data for the gridcell (an average of all stations in it). CRU don’t adjust data themselves, but use adjusted data as it comes from BoM.

        The CRUTEM4 trend for the Alice Springs gridcell (1941-2016) is 0.147 C/decade !!

        So, all Batesians, please don’t hold back the development of science…
        GHCNv4 is good!!

    • So the temps at Alice Springs were off by a full 3 degrees in 1880?

      Not exactly. The temps recorded in 1880 were off by some number. When you ask the question determines how large that number is/was/will be.

      • It isn’t that Aussies hold our thermometers upside down. Rather that the rest of the world is upside down. I have a map that proves it.

  1. We all have been yelling about the ‘fixing’ of the data to make global warming appear for the last ten years. About time it exploded in the faces of the cheaters and liars who pushed global warming when there wasn’t any.

    • Of course stations like Alice have a very high impact, being in the middle of a massive, sparsely populated continent. I seem to recall that Darwin N.T. has similarly massive adjustments.

      It would probably take very noticeable manipulation of hundreds if not thousands of US stations to have the same impact on global averages as these two Aussie sites.

      • It did. But the worst Australian example is Rutherglen. It is a well-tended, well-sited agricultural research station with no site moves since its inception. It shows, if anything, cooling. Homogenized into warming. Both Darwin and Rutherglen are illustrated in the essay When Data Isn’t.

      • Nick…maybe true, but the same dodgy, one-size-fits-all adjustments were applied…. for no good reason. Just as well we have good people keeping the original raw data rather than allow the gatekeepers to simply delete it and claim it’s BEST. Ugh.

      • Got to disagree on the Volkswagen engineering comment. My thought is the software cheat was brilliant against the arbitrary requirements. No animals were killed; no measurement was faked; I’m not even sure exactly what law may have been broken. Kind of like Kirk’s solution to the Kobayashi Maru simulation.

        There’s an argument (that I’d like to see run to ground) that when Volkswagen fixes their cheat, the overall emissions could go up because of decreased fuel efficiency.

      • Gotta agree w/taz1999 — anything to get out of bogus “pollution” regulations is commendable — like the hallowed tradition of moonshiners avoiding state/federal taxes in the US.

      • There is another difference: The VW software engineers/managers broke the law in the USA, the climate data adjusters broke no law. (but the damage they did is far greater and is continuing)

      • NW Sage: the US does have laws regarding data integrity, “suitability for purpose”, archiving, etc. It was passed in 2001 as part of an appropriations bill (common practice in the US). “(Regulations) that provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies…”

        taz1999: The Volkswagen fiasco was caused by the EPA, since their own regulations required them to substantiate and randomly test vehicles for compliance. The Agency never did that. They wrote the regs to allow companies to self-certify that the cars passed the required tests, which they did. I believe the response was a gross over-reaction to the problem. Under normal conditions the VW diesels involved get better fuel economy but produce more NOx pollutants (real, smog-forming pollution), particularly at highway speeds. Given that only some 500,000 vehicles were involved, out of ~253 million in use, and that the pollution was on highways, not primarily in cities, the effects have been minuscule in terms of smog and other overall pollution.

      • Given the egregious rule-setting by the EPA based on spurious “science”, I’m uncomfortable with blaming Volkswagen’s engineers for anything other than being clever.

      • The so-called VW settlement for the emissions “scandal” was that VW, a German company, would invest billions in California to switch the state to EVs.

        See:
        VOLKSWAGEN
        Settlement billions to jolt EV industry
        Camille von Kaenel, E&E News reporter
        Published: Thursday, December 22, 2016
        The billions Volkswagen AG is required to provide to electrify transportation as part of its emissions scandal settlement will boost, and in some cases displace, a nascent industry.

  2. “The GHCN adjustment algorithm is unstable, as stated by Bates, and produces virtually meaningless adjusted past temperature data.” – unstable algorithms, meaningless “results” – the essence of the alarmist “climate science.”

    • Unstable = Unusable……..!!! Only two letters different, but a globe away.
      If I’d tried this sort of stunt in my mining and civil engineering work I’d have been out the door that fast my feet wouldn’t have touched the floor.
      But then only lives depended on the quality of my work – far less important than saving the world – er – and many very cushy careers.
      BJ in UK.

    • Unstable “scientists” producing meaningless results with enormous funds and causing massive economic damage.

  3. Thank you for continuing to follow this important debate. There are questions about modeling that need to be answered by scientists and others trained in this highly technical field. I enjoy learning from the debate and look forward to more.

    • Any time someone has to resort to models to prove a theory, you know they can’t prove their theory. These models were so bad they couldn’t even predict the pause. To then go back and try to rewrite the past and erase the pause is just so anti-science, it is mind-boggling.

  4. In May, the adjustment algorithm actually warmed the past, leading to adjusted past temperatures that were about three degrees warmer than what they had reported in March!

    OK.
    So the past is constantly changing. An unknowable fluid. Not fixed in reality.

    But at least the future is certain.

    • Reminds me of the fake news from the media, rewriting the past so that Democrats fought for freeing the slaves.

      • Catcracking February 7, 2017 at 9:00 am
        Actually they did, like it or not! McClellan, Porter, heck most of the Generals were democrats.
        The so-called political Generals were Republicans: Butler, Banks.

        Not sure about Grant off the top of my head.

        michael :-)

      • Grant was a Republican by the time he ran for president. I don’t know about his politics earlier.

      • “heck, they were all democrats”

        That might explain why the war took so damn long.

        What was the name of that battle after which Lee was allowed to slip away?

        That’s right. The Civil War.

      • There was an alternative party in the north called the American Party, which tried to waffle and kick the can down the road on the issue of slavery spreading into the Western Territories and new states.

        The Republican Party at its founding was the other alternative when the Whigs disappeared. The north overwhelmingly chose the anti-slavery Republican Party — while the American party only won one state, and subsequently vanished.

        Now the Boomers want to replace the Republican Party with a pro-slave south party (libertarians)

        Just watch. Now that their parents are dead they will go full-on pro-German activist and bemoan our role and victory in WWII, just like they did for WWI and the Civil War. It is going to be like clockwork. 3-2-1

        “We should never have gotten involved in WWII!” It is coming.

      • McClellan was as political as Jefferson Davis was. McClellan was definitely an administrator; unfortunately, one who blamed everyone but himself for his failures and retreats.

        Politics governed how Lincoln chose his Generals, until Grant kept winning his battles. Grant’s superior both tried to take credit for those victories and then tried to relieve Grant of his command.

      • In reply to “Mike the Morlock”

        Republican Generals who later became President:
        Ulysses S. Grant
        Rutherford B. Hayes
        James A. Garfield
        Chester A. Arthur (political appointee, quartermaster general of New York State)
        Benjamin Harrison

        William T. Sherman had a brother who was a Republican Senator, but as far as I know, he was apolitical, as I suspect many generals were. One of my favorite Sherman quotes:
        “Grant stood by me when I was crazy, and I stood by him when he was drunk, and now we stand by each other.”

      • Anyone who would call the Libertarian party pro-slavery, knows nothing about politics.
        And is a complete idiot to boot.

      • MarkW, try to get out a little more often. Every time the Civil War and Lincoln come up, there is a well-represented group of people who claim that Lincoln was a criminal and that slavery had nothing to do with the War of the Rebellion. They claim that this conflict was merely an issue of states’ rights, and that the federal government was infringing on the rights of the southern states by limiting slavery, which their economy was based on, and that Georgia, Virginia, N&S Carolina, etc. were merely standing up for their right to make their own laws in their own states.

        These people often claim to be Republicans but more often than not are now identifying as libertarian. But it needs to be made clear that the founding of the Republican party was in response to the issue of slavery, and Lincoln was the first Republican.

        I think you are not paying close enough attention to this very vocal group and you could make a better effort to understand it. Are you trying to say that libertarians never say that the South was right and the War of the Rebellion was not over slavery? Are you trying to say that people who identify as Republicans are not now trying to make Lincoln out as a criminal? In my experience, they are almost always libertarians. I have not met any Democrats who say this, but perhaps you have.

        I think you cannot be a Republican and take the side of the Slave South. It is a contradiction.

      • PS: The US had no business getting involved in WWI. It was a war for the control of colonial empires.
        There are many who feel that had the US stayed out of WWI, there never would have been a WWII.

      • MarkW says, “The US had no business getting involved in WWI. It was a war for the control of colonial empires.
        There are many who feel that had the US stayed out of WWI, there never would have been a WWII.”

        The Baby Boomers have adopted this ridiculous historical doctrine, so much so that there are even some encyclopedias which cite the victors of WWI as the cause for WWII. But thankfully, not that many.

        Other scholars point out rightly that Germany was using the opportunity of war to invade its neighbors and attempt to take over Europe. There were a lot of socialists and German sympathizers who later promoted the view that it was all pointless and that it would have been “six-one-way-half-a-dozen-the-other” if the Kaiser’s Second Reich ambitions had not been contained in Europe by WWI.

        By the way, Germany never paid one red cent in reparations.

        One of the German sympathizers who promoted the view that the Great War was pointless was John M Keynes. That is why I say that all Boomers are Keynesians. Either economically or historically.

    • That may be the best summary of the state of “climate science” today:

      The future is known with certainty, while the past continues to be uncertain.

      Thanks for the insight.

    • Are you seriously asking this question? There is no need for the adjustments except to support the global warming agenda. It can easily be demonstrated that the alarmist global warming agenda has to do with redistribution of wealth and increasingly socialistic control of Europe and America. The alarmists really do want to put the world back to the stone age as a means to reduce and control human populations through the UN.

      Hopefully, an axe will finally be taken into NASA and NOAA to get rid of departments that have grown to support this scam. The algorithm discussed here is only a fraction of the deception of those departments and meteorological services around the globe.

      • I remember during the late 1980s, during and soon after the Reagan years, the left talked of infiltrating the public service to implement ‘change from inside’. I think this was because it had become clear that democracy was not going to give them their socialist/communist/whatever paradise.

    • All stations need to be adjusted to match the accurate stations.
      Accurate stations are determined as being those that best match the models.

      • If you have good measurements and bad measurements, you throw the bad measurements out. Adjusting the bad measurements to match gives a false impression of the number of independent measurements and the overall reliability of the data. Any measurement that is ‘adjusted’ to match other measurements is not an ‘independent’ measurement, by definition.

      • As one alarmist once told me, they have to use the bad data, because it’s the only data they have.

        My response is that a wrong result is worse than no result.

    • Can someone tell me why temperature measurements need to go through an algorithm at all?

      ANSWER: … Al-Gore-ithm (yeah, it’s a tired joke by now, but alexwade might not have heard it yet)

    • “Can someone tell me why temperature measurements need to go through an algorithm at all?”

      Because otherwise there would be no warming. In many places there would be cooling. And you can’t get grant money for global warming if the evidence shows the globe is cooling.

  5. I have read that NOAA is going to review itself over this issue. How the hell can that be allowed to happen? Surely the review must be conducted by an independent body. We all know what happened with Climategate when the UEA investigated itself: they, of course, exonerated themselves.

  6. But it makes perfect sense!
    The climate is a chaotic system which is impossible to predict.
    They use algorithms which are non-deterministic, and are likewise impossible to predict.
    When you think about it, this is exactly what we should have expected all along.
    We certainly were given enough clues along the way that something like this might turn out to be the case.

    • TonyL

      You wrote “The climate is a chaotic system which is impossible to predict”

      In terms of average global temperatures, it is quite possible to predict, Google Climate Change Deciphered for proof of this.

      • What rock have you been under, Burl? The modelers and warmistas go to great lengths to say they do not do “predictions,” just “projections.” And the “projections” run too warm when compared to actual data.

        Few even try to assess an “average global temperature,” let alone “predict” it. Hence the widespread use of temperature anomalies.

      • Of course it is possible to predict, or as Shakespeare has it:

        “I can summon spirits from the vasty deep.”
        “Aye, and so can any man. But do they come when thou so callest?”

      • I predict the future all the time when I play poker, or make a bet at the racetrack or on a football game. Unfortunately, my predictions are no more accurate than random chance.

    • As difficult as it is to predict future temperatures, past temperatures are even harder to predict. Does anyone know what the GHCN temperature for Alice Springs in 1880 is going to be next week, or the week after that?

  7. In other words, when new data was added to the system every week or so and the algorithm was re-run, the resulting past temperatures came out quite differently each time.

    ……….exactly

    This is why it’s so hard to catch….the algorithm changes it every time it’s run

    Stick with this…and bust them for it

      • Good point, Tapho.

        When I saw Paul Matthews’ top graph with its under — OVER — UNDER — OVER misses (Jan, Mar, May, June), I thought of a Navy destroyer overshooting, then undershooting, to find its target. No good photos/videos of that (shrug).

        “Long!”
        “Short!”
        “Long!”
        “Short!”
        “LONG!”
        Sigh.

        :)

      • Two linear econometrists were out deer hunting. One, the hunter, held the rifle, while the other was “spotting” with binoculars. Lo and behold they saw a nice buck on a hillside quite some distance away. The hunter aimed and fired, kicking up dust three feet to the right of the buck, but with good elevation. “Three feet right!” exclaimed the spotter. The hunter compensated and fired again. This time the dust kicked up three feet to the left of the buck. “Got him!” exclaimed the spotter.

    • I am not disagreeing with Latitude; just placing some context into position.

      Normally, as in real-world IT, any program that returned different numbers every time it ran would be cause for employment termination, contract suspension and possibly fines.

      But in the NASA/NOAA/NCDC world, not only are senior officers content with such a program, the senior officers trust said program!

      How can the senior officers trust a program giving different results on every run!?

      When senior officers are unconcerned about obviously wandering program results, then what those senior officers care about must be rock solid dependable every time. So dependable, that senior officers are willing to suffer and spin wandering historic results.

      Somewhere in those programs are modules ensuring the past is cooled while the present is cooked.
      Those senior officers allowed history to wander because their first demands are met.

    • model runs are laike a bawx o’ chawklits, ya nevr know what yer gonna git but yer know it’s gonna be warmer.

  8. In the business world heads would roll for this kind of failure. It would make no difference if it was done for some kind of hidden purpose or came about simply through ineptness. And in the military, when failures like this occur it leads to disaster. People die. In the climatology field, the so-called scientists are given more money. They go on speaking tours. They write white papers and books. They get awards. They set up and go to “conventions” in far-flung and exotic places, where they plot new strategies with their political friends. So sad.

  9. It is looking more and more that Tony Heller was not really overstating the corruption of temperature records. Karl et al seem to be channeling Winston Smith in his 1984 job.

    • Heller has done some things wrong. But he has well documented changes in temperature records over time that exceed the supposed error bars by >2x. The one thing he missed was the US state-by-state changes when NOAA switched from Drd964x to nClimDiv in 2014. Documented in the essay When Data Isn’t in the ebook Blowing Smoke.

    • Tony Heller is testifying before the Washington State Senate environmental committee at the moment (starting 10am PST). The Democrats—ranking or not—do not like it.

      State senator invites climate-change denier to brief committee while he’s away

      By Walker Orenstein
      worenstein@thenewstribune.com

      A presentation by a climate-change denier scheduled for Tuesday in the Republican-led state Senate is drawing fierce criticism from Democrats who say the briefing is a misuse of government time.
      Tony Heller — who also blogs under the pseudonym Steven Goddard — will address an environmental committee run by state Sen. Doug Ericksen, a Republican from Ferndale who is temporarily leading communications at the Environmental Protection Agency for the Trump administration.

      State Sen. Reuven Carlyle blasted Ericksen Monday for Heller’s scheduled presentation in light of rescheduled hearings and work-flow interruptions caused by Ericksen’s EPA gig. Carlyle, of Seattle, is the ranking Democrat on the Environment, Energy and Telecommunications Committee.

      http://www.thenewstribune.com/news/politics-government/article131138229.html


      • Everything is going to be OK – NOAA is going to investigate itself. :)

        Think of a District Attorney letting an accused bank robber investigate himself.

      • pure gold, the entire warmista community will be in uproar that Tony Heller is being asked to do this :)

  10. Might be worth creating a verifiable archive of the versions of GHCN over the years for anyone interested in making comparisons of the changes over time. GHCN is not archived online (take a look at GHCN v2, which now just shows some precip data in the ftp, no temp). It should be archived properly.

    I have dated downloads of about 7 or 8 versions of GHCN V1, V2 and V3. Of particular interest to some may be that I have the final official GHCN v2, obtained directly from NOAA in 2013 (it was superseded several years earlier). I obtained this by special request when they stopped archiving it online, and I have the email trails to support this. I had planned to compare the historical temperature changes over time, as well as the adjustments, but have not got around to it yet, although I do have some big awk scripts that will do a lot of the work.

    Anyway, just a thought. NOAA clearly don’t keep archives of the results at different time points; perhaps a collective effort might be a useful exercise.

    • Steve McIntyre also requested this in a comment at cliscep.
      If anybody with the technical knowledge could set up an ftp site or equivalent where people could upload versions of the files that they have saved, that might be a useful resource. Each file is about 12MB.
      I have a sporadic collection of about 20 or so of the files, most from 2012.
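      If such an archive gets set up, a checksum manifest would make it verifiable: record a cryptographic hash and the file date for every snapshot, so later comparisons can prove the files were never altered after the fact. A minimal Python sketch:

          import hashlib, os, sys, time

          def manifest_line(path):
              # 'sha256  bytes  date  name' for one archived GHCN snapshot.
              h = hashlib.sha256()
              with open(path, "rb") as f:
                  for chunk in iter(lambda: f.read(1 << 20), b""):
                      h.update(chunk)
              stamp = time.strftime("%Y-%m-%d", time.localtime(os.path.getmtime(path)))
              return f"{h.hexdigest()}  {os.path.getsize(path)}  {stamp}  {os.path.basename(path)}"

          if __name__ == "__main__":
              for p in sys.argv[1:]:  # e.g. python manifest.py ghcnm.tavg.*.qca.tar.gz
                  print(manifest_line(p))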

    • Try Humlum’s site, climate4you.com. He has a chart showing the changes in the various records just since 2008. For GISS, the changes amount to half or three quarters of a degree!

  11. You do all know the UAH and RSS data has been multiply adjusted, don’t you?

    Adjustment is a legitimate technique – try and understand it before assuming it is deception or wrong.

    • Yes, if there is a real reason and if you carefully document all the adjustments. That is something you can ask of UAH and RSS; but if you ask the same of GISS, I guess you will never get a reply. Phil Jones, the former director of the Climatic Research Unit (CRU) of the University of East Anglia, has admitted that they have lost the original raw data of HadCRUT. By accident? Probably not.

    • So what was the temp at my house today? I thought I knew what it was, but you are telling me that in 30 years some guy behind his computer will tell me I was wrong and it actually was 3 degrees cooler because he needed to adjust it down to create some man-made warming. “We jest some dumb 2017 folk dat dont know how to read tempture, we need smart man n 2047 to tells us wat r temp waz”. Is that what you guys think of people in 1880?

    • Try defining “adjustment” honestly, Arch Deceiver.

      The above article discusses ad hoc adjustments, made without physical justification. That they are not physical is PROVEN by their wild divergence from observed data.

      UAH (and RSS, though lately, some of their adjusting looks more ad hoc than physical) only adjusts for physical reasons.

      … About 12 years ago, we discovered that even though two different satellites were looking at the same globe at the same time, there were differences in their measurements beyond a simple bias (time-invariant offset). We learned that these were related to the variations in the temperature of the instrument itself. If the instrument warmed or cooled (differing solar angles as it orbited or drifted), so did the calculated temperature. We used the thermistors embedded in the hot-target plate to track the instrument temperature, hence the metric is often called the ‘hot-target temperature coefficient.’

      To compensate for this error, we devised a method to calculate a coefficient that when multiplied by the hot-target temperature would remove this variation for each satellite. … The calculation of this coefficient depends on a number of things: (a) the magnitude of the already-removed satellite drift correction (i.e., diurnal correction); (b) the way the inter-satellite differences are smoothed; and (c) the sequence in which the satellites are merged.

      Since UAH and RSS perform these processes differently, the coefficients so calculated will be different. Again, recall that the UAH (and RSS) coefficients are calculated from a system of equations, they are not invented [or guessed at]. The coefficients are calculated to produce the largest decrease in inter-satellite error characteristics in each dataset. …

      (Source: https://wattsupwiththat.com/2012/05/09/christy-and-spencer-our-response-to-recent-criticism-of-the-uah-satellite-temperatures/ )
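      The “system of equations” Christy and Spencer describe boils down to a regression: pick the coefficient that best removes the hot-target-correlated error from the inter-satellite differences. A toy Python illustration of that idea (this is not UAH’s actual code or data, just the shape of the calculation, demonstrated on synthetic numbers):

          import numpy as np

          def hot_target_coefficient(t_ref, t_sat, hot_target):
              # Fit coefficient c and bias b so that t_sat - c*dH - b best matches t_ref.
              dh = hot_target - hot_target.mean()
              A = np.column_stack([dh, np.ones_like(dh)])
              (c, b), *_ = np.linalg.lstsq(A, t_sat - t_ref, rcond=None)
              return c, b

          # Synthetic overlap: satellite 2 carries a spurious 0.3*dH signal plus a 0.1 K bias.
          rng = np.random.default_rng(0)
          t_ref = rng.normal(0.0, 0.2, 120)
          dh = np.sin(np.linspace(0.0, 12 * np.pi, 120))   # instrument warming/cooling cycle
          t_sat = t_ref + 0.3 * dh + 0.1 + rng.normal(0.0, 0.02, 120)

          c, b = hot_target_coefficient(t_ref, t_sat, dh)
          corrected = t_sat - c * (dh - dh.mean()) - b     # recovers t_ref to within noise
          print(f"recovered c = {c:.3f}, bias = {b:.3f}")  # expect roughly 0.3 and 0.1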

    • Griff,
      Fine. But then state the requirements clearly. Review them. Then design the code. Review the design. Then write the code. Review the code. Independently validate and verify (test) the code to ensure it meets the design and requirements. Then follow proper configuration management to track changes to the code. Implement an independent QC function for all the above phases of development. NONE of this was done for the current load of crappy code they are using.

    • Griff, Griff, Griff: We’ve been reading this site for over 10 years, so we do know what goes on.
      Two questions for you:
      Do UAH and RSS use unstable algorithms?
      UAH and RSS use quality procedures. Why don’t NOAA?

      • Really? What is the difference between scientists working on RSS and UAH and those working with other temp data sets? Please do explain their different methodologies…

        And this website has as much political opinion from one viewpoint as it does science… it is a political view, not a scientific or journalistic view (i.e. no fact-checking and no seeking of comment from people and institutions quoted or reported on)

      • Having a political view that differs from yours invalidates the science on this site?

        As to the difference, that has been explained to you even more frequently than why Germany’s renewable power generation numbers have to be taken with a few truckloads of salt.

      • “And this website has as much political opinion from one viewpoint as it does science.”

        That’s only possible if you, yourself, only have one viewpoint, and perceive the world as having only two – yours and everyone else. Political opinion from the skeptic side is wide and varied, based mostly on observation, common sense, and an understanding of how people and society actually work. The only common unification is against people like you, with your opportunistic ‘facts’, and your devious little efforts to exploit a digitally manufactured ‘crisis’.

    • Wildly varying, unreliable results that almost always increase the warming trend suggest anything but legitimacy.

    • Griff you are, of course, correct. Some adjustments are needed. Would you please explain why, though, one would see such differences in the past at a single temperature station from month-to-month? And why would data from the past seem to continually change based on adding new data from the present? Shouldn’t the past become stable at some point?

    • Poor little Griffie.
      Try reading the article, this time have a friend explain the big words to you.
      It’s not the adjustments per se that are being criticized, it’s the fact that each time the algorithm that makes the adjustments is run, it comes up with different answers, even when the raw data hasn’t changed.
      It’s also the questionable fact that every single “adjustment” makes the amount of warming greater. There has never been an adjustment that goes the other way.

      • I should also point out that many of the complaints deal with flaws in the adjustments.
        As always, Griffie is forced to try and change the subject.

      • wow Griff, it’s frightening what the cooler temps earlier in the 20th Century caused:

        1933: Rare Hurricane Slams Into South Africa
        1933: Bitter Winter Weather In Russia & Europe: Snow Causes Wolves To Attack Train
        1933: West Australian Heat Wave – “Severest In History”
        1933: Heat Waves, Floods, Droughts, Famines Plague China
        1933: Spain’s Heat Wave: 130 Degrees In Shade
        1933: Heat Wave Causes New Jersey Road To “Explode”
        1933: Hottest June In U.S. History – Heat Wave & Drought
        1933: 21 Perish During Texas, Louisiana Tornado & Hail Storms
        1933: Drought In South Africa – “Worst Outlook For 50 Years”
        1933: Flooding In China Kills 50,000
        1933: India’s Ganges River Bursts Its Banks – Widespread Flood Damage & Fatalities
        1934: 80% of U.S. Suffers From Drought Conditions
        1934: “Heat Wave In China Kills One In Every Thousand”
        1934: Antarctic Has Incredible Heat Wave – 25 Degrees Over Zero
        1934: February Tornado Strikes Several U.S. States
        1934: World Wide Drought & Heat Causes Vast Majority of Alps’ Glaciers To Melt
        1934: Iowa Heat Wave In May – Pushes Temps Over 110 Degrees
        1934: All 48 U.S. States Over 100 Degrees During June
        1934: 14 Days of Above 100°F Temps Kill Over 600 Americans
        1934: South African Drought Severely Hits Farmers
        1934: Nebraska Temperatures Soar To 117 Degrees
        1934: Drought, Heat, Floods, Cyclones, & Forest Fires Hit Europe
        1934: British Drought Stunts Hay Growth
        1934: Worst Drought In England For 100 Years
        1934: 7 Days of Incessant, Torrential Rains Cause Massive Flooding In Eastern Bengal
        1934: Global Warming Causes 81% Of Swiss Glaciers To Retreat
        1934: Canadian Crops Blasted By Intense Heat Wave
        1934: “South African Floods Are Unprecedented”
        1934: Typhoon Hits Japan Followed By A Massive Tsunami
        1934: Record Heat And Drought Across The Midwest
        1934: China’s Fall Crops Burning Up During Drought & Heat
        1934: Five Million Americans Face Starvation From Drought
        1934: Adelaide, Australia Has Record Dry Spell
        1934: Gigantic Hailstorm Blankets South African Drought Region
        1934: Drought And Sweltering Heat In England
        1934: Record Heat Bakes Wisconsin – 104°F
        1934: 20 Nebraskans Succumb To Unprecedented 117 Degree Heat
        1934: Poland Swamped By Floods – Hundreds Perish
        1934: 115 Degrees In Iowa Breaks Record
        1934: 115 Degrees Reached In China In The Shade – Heat Wave Ruining Crops
        1934: Majority of Continental U.S. Suffers From Drought Conditions
        1934: Severe Northern Hemisphere Drought Causes Wheat Prices To “Skyrocket”
        1934: Extreme U.S. Winter Weather Leaves 60 Dead In Its Path
        1935: Severe Wind Storm Lashes Western States With 60 MPH Gusts
        1935: Florida Burns Its Dead After The Most Powerful Hurricane In US History
        1935: “The Worst Dust Storm In History” – Kansas City
        1935: Worst Drought Since 1902 Has Queensland, Australia In Its Grip
        1935: “50 Dust Storms In 104 Days”
        1935: France Cooked By Heat Wave
        1935: Tropical Windstorm Strikes Texas With 85 MPH Gusts
        1935: ‘Black Dusters’ Strike Again In The Texas Dust Bowl
        1935: India Hit With Extreme Heat Wave – 124 Degrees
        1935: Heat Wave, Drought & Torrential Rains Cause Misery In Europe
        1936: “Niagara Falls Freezes Into One Giant Icicle”
        1936: February Was Coldest In U.S. History
        1936: Italian Alps Glacier Shrinks: WWI Army Bodies Uncovered By Melting
        1936: Ice Bridge In Iceland Collapses From Heat Wave & Glacier Melt
        1936: Violent Tornadoes Pummel The South – 300 Dead
        1936: Dust, Snow & Wind Storm Hit Kansas Region In Same Day
        1936: Unprecedented Heat Wave In Moscow
        1936: Ukraine Wheat Harvest Threatened By Heat Wave
        1936: 780 Canadians Die From Heat Wave
        1936: Iowa Heat Wave Has 12 Days of Temperatures Over 100 Degrees
        1936: Heat Wave Deaths In Just One Small U.S. City: 50 Die In Springfield, IL
        1936: Missouri Heat Wave: 118 Degrees & 311 Deaths
        1936: Ontario, Canada Suffers 106 Degree Temps During Heat Wave
        1936: Alaska’s 10-Day Heat Wave Tops Out At 108 Degrees
        1936 : Record Heat Wave Bakes Midwest; “Condition of Crops Critical”
        1936: Midwest Climate So Bad That Climate Scientist Recommends Evacuation of Central U.S.
        1936: 12,000 Perish In U.S. Heat Wave – Murderous Week
        1936: Single Day Death Toll From Heat Wave – 1,000 Die
        1936: Iceland Hurricane Sinks Polar Research Ship Filled With Scientists
        1936: Severe Drought & Disastrous Floods In Southern Texas
        1936: 20,000 Homeless In Flame Ravaged Forests of Oregon
        1936: Northern California Seared By Forest Fires Over 400-Mile Front
        1936: Tremendous Gale & Mountainous Waves Pound S. California – 7 Persons Missing
        1936: Glacier Park Hotel Guests Flee As Forest Fire Advances – Worst Fire In Years
        1936: Iowa Christmas Season Heat Wave Sets Temperature Records – 58 Degrees

      • Really? This tired old 1934 chestnut?

        What we are looking at is a repeated series of high temps and weather events clustering over recent decades indicating a warming trend…

        It does not matter if once we had some anomalous event as high or as cold or as windy… what matters is a trend and a change in a trend.

        I refer you once more to the arctic sea ice…

      • Griff February 7, 2017 at 11:13 am

        I refer you once more to the arctic sea ice…

        And I remind you of Arctic ice in the 1940s: Pacific to Murmansk, tankers and freighters, no icebreakers, no radar.
        Also using Kaiser Liberty ships. Can’t do that today. Even icebreakers get stuck.

        The last 25 yrs? Ho-hum, nothing extreme or unusual occurring.

        michael

        Oh, and do read a bit about how the Liberty ships were constructed; if there was any amount of ice they would not have survived the journey.

      • nah, the tired old chestnut is trying to con the world that there is any warming of significance.

        Seems like the 1930s incidences of extreme weather were worse than today. Today we have year after year of bumper crops due to benign weather. Oops, a bit of a thorn in the side for the comedian Bill Nye.

    • Adjustment is a legitimate technique – so long as it is applied consistently through the data set. That is not what is happening here. This is not adjustment, this is alteration of data. Selectively. You cannot change data. EVER.

      • I still don’t know why their algorithm even touches the past data. Once the site data is verified and any TOBS taken into account on a day-by-day, site-by-site, one-at-a-time basis, and the agreed-upon spatial homogenization is performed, the data should never be looked at again. When new daily data comes in, it should be site-by-site quality checked, added to the database and processed. The results for that day should then be added to the top of the daily file for continued plotting. Any “homogenizing” should only touch TODAY’S data, and it should forever be locked in. The only exceptions would be for late-arriving data sheets, which would necessitate rerunning only the days covered by the new data. Homogenizing over the TIME domain is an abomination.
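        In code terms, the discipline described above is an append-only store: rows, once written, are frozen, and each day’s processing only ever sees the new increment. A minimal Python sketch of the idea (purely illustrative; not any agency’s actual pipeline):

            import csv, os

            def append_daily(archive_path, new_rows):
                # Append today's QC-checked (date, station_id, tmin, tmax) records.
                # Existing rows are locked: any attempt to rewrite one is an error.
                existing = set()
                if os.path.exists(archive_path):
                    with open(archive_path, newline="") as f:
                        existing = {(r[0], r[1]) for r in csv.reader(f)}
                with open(archive_path, "a", newline="") as f:
                    writer = csv.writer(f)
                    for row in new_rows:
                        if (row[0], row[1]) in existing:
                            raise ValueError(f"refusing to overwrite locked record {row[:2]}")
                        writer.writerow(row)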

      • Martin and Mark: I stand corrected. My money stays in my account long enough for me to wave it goodbye. They charge me $3.95 a month for this privilege. Except in those months where they adjust it, always upwards, for my having the audacity to actually get at it.

      • Hang on, my bank balance actually affects something in the real world. Namely, me.
        How is that like global temperature?

    • le·git·i·mate, adjective: conforming to the law or to rules.

      Adjustment is a legitimate technique – try and understand it before assuming it is deception or wrong.

      Agreed, but whose version of the law or whose version of the rules…? If the law or the rules change sporadically, then this does not lead a person toward understanding, except toward understanding deception.

      The problem appears to be with the seemingly sporadic nature of the corrections, which logically leads one to question what law or rules are being applied — chaos theory? … tossing beans in the wind to see which one lands where? … flipping a coin that day? …

    • giffiepoo is making false claims again.

      Adjustment is never a legitimate technique.

      Even corrections, which in the accounting world are called adjustments, must never touch the original record.
      Corrections are maintained separately with full documentation for the correction; e.g. late payment, transposed input, etc.

      Reports may include corrections, but the report will always note the corrected numbers along with information on specific details.

      Again, the original raw data is never changed!

      Nor are generic rationales allowed; e.g. TOD (“Time of Day”) or TOB (“Time of Observation”) may be a generic title for tracking a correction, but it is not a sufficient rationale.

      Especially egregious is using a TOB title while assuming a specific quantity of change. Without full documentation on how a specific quantity is arrived at, it is a specious assumption.

      Filling in missing numbers is also bogus. If the number is missing, so be it! Filling in vacant spaces is akin to assuming mid-ear vacant spaces can be filled.

      Using distant temperature stations is another specious reason for changing numbers. A major reason that the Arctic temperatures are so skewed is the use of temperatures from distant land-based population-center temperature stations.

      • Years ago I tracked 2 different locations that GISS used and downloaded their numbers each month. One of the locations was a local rural location and the other was Charlotte, NC. Interestingly, the Charlotte location had many recent months that were 999, which meant incomplete data. I was skeptical about how a major city would have so much missing data; anybody that watches the local news knows that you get the temps for your city every day. Well, after about a year of tracking them, they once again came out with another 999 for Charlotte. I looked at many sites and they all had data for every day that month. When GISS finally did their infill for that month, surprise, surprise: GISS created an extra 2 degrees of warming for that month compared to what was recorded by every other site that had the daily numbers and didn’t FAKE numbers. After that I quit tracking GISS, as it confirmed my thoughts that it is corrupt.

      • AtheoK writes: “Even corrections, which in the accounting world are called adjustments, must never touch the original record.”

        This has been common dogma in every scientific discipline I’ve ever been associated with. An absolute rule. Truth. It’s written into textbooks; never change the original data and never ever lose it; either has been a cardinal sin in every investigative science project I’ve participated in. One of the fundamental lessons taught me from day one.

        My first experience with investigative science was actually at NASA, at Ames Research Center in the 1970’s, which makes this so poignant, and my job at the time was to “spin tapes”. We had lab data that had been recorded on magnetic tape, a medium that degrades over time from magnetic bleed-through between the physical layers of a reel. To combat this and to preserve it as long as possible, every 5 years the tape had to be mounted on a drive, spun onto an empty reel (think “fast forward” without reading it), then rewound and replaced in the archive. We had thousands of reels of tape. Occasionally we’d get a request from another institution, usually a university, for a copy of some data set, and it would be spun during the copy, but we spun them every five years just to be certain they didn’t decay faster than they had to. Eventually optical storage was invented and that wasn’t needed anymore, but I’d long since graduated to better things by the time that happened.

        Preserving the primary source data is a cardinal rule. It’s never changed. If an investigator wants to perform a transformation on it, that’s done on a copy and never replaces the original. These folks have violated a principal rule of scientific investigation and should be fired for incompetence. There’s no excuse for it.

    • I think everyone understands the need for adjustments, but to adjust buoys measuring SSTs so that they are nearer temperatures measured by taking buckets of water at sea inlets is bizarre.

  12. If you want to see the adjustments in action, go to:

    https://data.giss.nasa.gov/gistemp/stdata/

    Where you can click on stations and see the changes. I tried clicking on stations with long series (> 100 years) and it certainly seems to me as though the past is being cooled. Tried Valentia in the SW tip of Ireland, Plymouth in southern UK of Funchal out in the Atlantic.

    Can anyone explain to me the physical reason why temperatures in the past were recorded too high relative to today and need systematic lowering? I keep asking this question and no one ever answers. Surely it cannot be too hard to answer with a back-of-the-envelope explanation. After all, the adjustments to GHCN account for about half of the warming over the 20th century. What is the physical cause? Surely UHI is the biggest problem, but that goes the other way when corrected?
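    For anyone who wants to repeat that check systematically, a minimal sketch follows. It assumes you have saved the same station’s annual anomalies as CSV snapshots on two different dates; the file names and column headers are assumptions for illustration, not GISS’s own format:

      # Diff two archived snapshots of one station's adjusted annual means
      # to see how the reported past changes between downloads.
      # Adapt the paths and columns to whatever you save from
      # https://data.giss.nasa.gov/gistemp/stdata/
      import csv

      def load(path):
          with open(path) as f:
              return {int(r["year"]): float(r["anom"]) for r in csv.DictReader(f)}

      old = load("valentia_snapshot_1.csv")
      new = load("valentia_snapshot_2.csv")

      for year in sorted(set(old) & set(new)):
          delta = new[year] - old[year]
          if abs(delta) > 0.05:  # flag revisions larger than 0.05 C
              print(f"{year}: {old[year]:+.2f} -> {new[year]:+.2f} ({delta:+.2f} C)")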

    • Seems to me I saw an article that said a good deal of the adjustment comes from “time of observation” adjustments. It seemed a balanced article that made a good case for that adjustment (a toy simulation of the mechanism appears at the end of this thread). But I’ll leave it to those who have worked in depth on analyzing the validity of such adjustments. However, I have yet to see any work that justifies the associated margin of error; I don’t see how you can make those kinds of adjustments without a significant increase in the margin of error associated with the now-adjusted measurements.

      • gnomish! Applause! SPOT on.

        Except…. Except that the cut-and-paste pet shop clerk was FORTHRIGHT and HONEST about what he was doing…. (Still, a great allegorical video choice, a little sort of like a gnome :) )

      • Tony Heller did a ToBs analysis and found it made little difference. The adjustments were not justified.

      • Whenever an assumption is made then justified as a correction to a datum, that is falsehood.

        Exacting measurements are required to determine what, if any, correction should be made to a datum. Even then, the original observation should not be changed nor obscured.

        NOAA/GISS/NCDC are apparently quite content with very slapdash methods for determining corrections.
        What they should be using is a “test engineer” approach: designing, testing, and only then using methods for comparing, tracking, and determining potential corrections.

        Any “adjustment or correction” is an admission of large error ranges; errors that are never carried forward into their alleged global anomalies.
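      The time-of-observation effect discussed in this thread is easy to reproduce in a toy model. The sketch below uses entirely synthetic temperatures and a crude diurnal cycle; it illustrates only the mechanism (an afternoon reset lets one hot afternoon dominate two successive 24-hour windows), not the size of any real adjustment:

        # Toy simulation of time-of-observation bias (TOB).
        # A max thermometer is read and reset once a day; all numbers
        # here are synthetic.
        import math, random

        random.seed(1)
        hours = []
        for day in range(365):
            base = 15 + 10 * math.sin(2 * math.pi * day / 365)  # seasonal cycle
            daily = base + random.gauss(0, 3)                    # day-to-day weather
            for h in range(24):
                # crude diurnal cycle peaking at 15:00
                hours.append(daily + 5 * math.sin(2 * math.pi * (h - 9) / 24))

        def mean_daily_max(reset_hour):
            """Mean of recorded daily maxima for a given daily reset time."""
            maxima, i = [], reset_hour
            while i + 24 <= len(hours):
                maxima.append(max(hours[i:i + 24]))
                i += 24
            return sum(maxima) / len(maxima)

        print("reset 07:00:", round(mean_daily_max(7), 2))
        print("reset 17:00:", round(mean_daily_max(17), 2))  # typically warmer

      The bias itself is real; the open question raised above is whether the size of the correction applied for it is justified, and what it should do to the stated error bars.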

  13. Can we have our grant money back from all the scientists who published papers on the (50) reasons for the pause when, apparently, it never existed! They must feel pretty embarrassed having wasted all that time.

  14. Griff, so for those of us who have not noticed any change in temperature, would you mind if we did not have to pay for climate scientists to keep playing with their computer models and games? They don’t seem to fulfill any useful purpose.

  15. “Adjustment is a legitimate technique.” And a necessary one. Fine.

    But there are reasons and rules for adjustments. Legitimate reasons, such as accounting for station moves and time-of-observation changes. And illegitimate reasons, such as only sometimes making one lone “outlier” station “match” the “correct” record of other nearby stations, adjusting when the “outlier” is lower than the “correct” station records but choosing NOT to adjust an “outlier” when the data seem to be lying high (a toy demonstration of this one-sided rule appears after this thread).

    The whistleblower claims the rhyme, reason, and rules governing the release of this report were based on politics rather than established data management techniques. Maybe so, maybe not. But to argue that getting the same answer twice proves that the data handling was “correct” is a fight we’ve had since Wegman testified against Mann. “Right Answer plus Wrong Method” is less than “Science.”

    • Getting the same answer twice is less of an achievement when one of the comparisons is derived from the same dataset being scrutinised.

      Not much better: the others were all told something to the effect of “you will find 1.0 C of warming per century; that’s the most popular number, therefore right”. Oddly enough, they all found a similar amount of warming.
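    The one-sided “outlier” treatment described above is easy to demonstrate with a toy network. Everything below is synthetic, and no real homogenization code is reproduced; it only shows that nudging stations toward their neighbours only when they read low produces a warm bias from symmetric noise:

      # Toy demonstration: one-sided outlier adjustment biases the mean.
      import random

      random.seed(42)

      def network_mean(one_sided):
          stations = [random.gauss(0.0, 0.5) for _ in range(20)]  # true anomaly 0
          mean = sum(stations) / len(stations)
          adjusted = []
          for s in stations:
              low = s < mean - 0.5
              high = s > mean + 0.5
              if low or (high and not one_sided):
                  adjusted.append(mean)  # "homogenize" toward the neighbours
              else:
                  adjusted.append(s)
          return sum(adjusted) / len(adjusted)

      runs = 10_000
      sym = sum(network_mean(one_sided=False) for _ in range(runs)) / runs
      asym = sum(network_mean(one_sided=True) for _ in range(runs)) / runs
      print(f"symmetric rule: {sym:+.3f} C")  # near zero: unbiased
      print(f"one-sided rule: {asym:+.3f} C")  # a warm bias appears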

  16. This is probably just the result of poorly written, buggy code. That’s what you get when you assign graduate students with little to no software education or experience to developing software. I can understand why the models are not properly designed, developed, and tested using industry-standard processes, since they are process models originally intended for research purposes only. But from the beginning the temperature data sets were intended to be used as a base input not only for further research, but for making policy decisions. It is inexcusable that they were just hacked together by a bunch of unqualified academics who couldn’t write a decent phone app, let alone something of this complexity. This requires a redo, from the ground up, by an experienced software development group.

    • It does not need a “re-do”. More than enough money and time has been wasted on this garbage science. Fire them all. Throw all the research in a dumpster. There is no indication that we are experiencing any unnatural warming. Most unfortunate, as here in Western Canada it is -23 with a lot of snow!

      • I disagree. There is value in recording climatic data and making it available to the public. It just has to be done correctly and in a transparent manner. And contrary to popular belief, most of the raw data is still available; it just needs to be collected and collated. This isn’t rocket surgery, people. It’s basic data collection and record keeping.

        Just because some unqualified people did a poor job of it in the past does not mean it can’t (or shouldn’t) be done properly now.

    • Quick, somebody call Jon Gruber.
      I think he knows someone who is good at writing programs; I believe they wrote the healthcare software which exploded on contact.

    • Paul Penrose writes: “This requires a redo, from the ground up, by an experienced software development group.”

      But if they’ve “lost” the original data we can’t recover and start over from 1885, 1910 or 1979. We have to start over from now.

      We’ve left the fox to guard the henhouse, with the anticipated results. “The dog ate my homework, Ms. Peachtree! Please don’t make me stay after school!” is what we can expect. These folks obviously have no integrity or accountability, and they’ve been allowed to do whatever they please for over 30 years. It’s way too late for a do-over.

      The only reasonable thing to do is reverse all policy decisions made on this subject since the 1960’s and begin collecting data again, this time (hopefully) with improved security and a verifiable chain of custody. If the data have been irretrievably lost and irreversibly corrupted there’s just no way to start over from where we were 40 years ago. That horse has left the barn.

      • There is no need to retrieve or redo any lost data.

        Climate Change is really VERY simple.

        It is caused by the reduction in dimming tropospheric SO2 aerosol emissions, as I have proved in my post Climate Change Deciphered (do a Google search).

        Carbon dioxide (CO2) has NO climatic effect.

        All that needs to be done to prevent future warming is to halt further reductions in anthropogenic SO2 aerosol emissions.

        As would be expected, increasing emissions will cause cooling.

        (The “rule of thumb” is 0.02 deg. C of warming or cooling for each net Megatonne of change in global SO2 aerosol emissions.)

      • Most of the raw data has not been lost. Phil Jones may not have the original data that he collected from all the weather services around the world, but they still have what they handed over to him. So it just needs to be collected and collated again – this time doing a better job.

        This historical temperature data still has value, and that value extends far beyond the CAGW conjecture. Let’s not conflate the (real) data with the misuse of it.

  17. Come on now. Having political activists in control of the data was always problematic. Government science is by its very nature subject to political manipulation. That should not be in dispute between skeptics and alarmists. It is after all the reason for panic in certain quarters.

    That this problem can only occur in one direction is a silly proposition. Remember, recently released felon John Beale was doing important work at the EPA while lying about everything. Do you believe that this did not include his work?

  18. If the algorithm is that unstable, it cannot be used.

    The algorithm should have settled down by now for older temperatures in particular, not be constantly reassessing all the nearby stations. You can get some new data covering the last few years, and maybe that changes the pairwise homogenization for recent years, but why would 100-year-old records change just because an algorithm was re-run?

    • What kind of algorithm changes the old data … ever? There may be justification for correcting it, but is that not a manual one-time thing based on the unique aspects of each surface station?
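    That failure mode can be illustrated with a toy. The sketch below is a deliberately crude single-breakpoint search, not GHCN’s pairwise algorithm; it only shows how re-running a whole-series fit after appending a few new values can move the detected breakpoint and silently rewrite decades-old data:

      # Toy illustration: appending new data and re-running a breakpoint
      # fit over the whole series can change adjusted values in the
      # distant past. Crude on purpose; not the real GHCN method.
      import random

      random.seed(7)
      series = [random.gauss(0.0, 0.3) for _ in range(100)]  # 100 "years" of noise

      def adjust(seq):
          """Find the split maximizing the mean shift, then align the past."""
          best_k, best_shift = 1, 0.0
          for k in range(10, len(seq) - 10):
              shift = sum(seq[k:]) / (len(seq) - k) - sum(seq[:k]) / k
              if abs(shift) > abs(best_shift):
                  best_k, best_shift = k, shift
          return [x + best_shift for x in seq[:best_k]] + list(seq[best_k:])

      run1 = adjust(series)
      run2 = adjust(series + [random.gauss(0.0, 0.3) for _ in range(3)])

      print("year 10, run 1:", round(run1[10], 3))
      print("year 10, run 2:", round(run2[10], 3))  # often different: the past moved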

  19. Someone once studied database pricing errors at grocery store checkouts and found they almost always favored higher profits for the stores.

    Sounds like the same guys are programming NOAA software now.

  20. “Adjustment is a legitimate technique – try and understand it before assuming it is deception or wrong”. Right! Except that the adjusters always adjust in the direction that supports their global warming/climate change conjecture, i.e., adjust the past down, adjust the present up. If I toss a coin that always comes up heads, I wouldn’t be surprised if somebody suggested I wasn’t using a balanced coin.

    • Hey Trebla! Yes, Griff says: “Adjustment is a legitimate technique – try and understand it before assuming it is deception or wrong”.

      Griff doesn’t seem to realize, we DO understand how adjustments are made. That’s why we are saying that it is wrong.

    • Trebla February 7, 2017 at 8:36 am
      “Adjustment is a legitimate technique – try and understand it before assuming it is deception or wrong”. Right! Except that the adjusters always adjust in the direction that supports their global warming/climate change conjecture, i.e., adjust the past down, adjust the present up.

      The paper referred to in this thread appears not to do what you say.
      Karl et al. 2015
      https://d2ufo47lrtsv5s.cloudfront.net/content/sci/348/6242/1469/F2.large.jpg?width=800&height=600&carousel=1

      • Hey Phil.! Yes, you are correct. It would be more accurate to say “the adjusters OVERWHELMINGLY adjust it in the direction that supports their global warming”. Saying “always” instead of “overwhelmingly” is a slight exaggeration. It is like saying that a card shark “wins every hand of poker!” when in fact he only wins 98% of them.

        In the case of the Karl et al paper the adjustments did, in fact, subtract out some of the previous adjustments, but it also allowed Karl to claim (falsely, as it happens) that the warming pause never happened.

  21. I have not been to Alice Springs lately, so I can’t comment, but it just so happens that I spent my youth in Marseille (add an “s” if you so wish, the way the Brits do), ’twas sixty years ago. The last time I was there, I felt like it was a bit warmer, but then again, I may be mistaken. Some trumpified alternative fact.

  22. In terms of social issues, birth control and the rhythm method were replaced in the 1990s by climate control and the al-gore-rithm method. Popes have praised both.

  23. So the world’s global climate temperature records are created by an out of control algorithm that functions as a random number generator? And we spend billions on climate science. Wow.

  24. Assuming (despite some objections raised in the past) that the Met Office CET data might be less manipulated than the various versions of NOAA’s global temperature, it can be seen that both the coldest (February) and the warmest (August) months of the year have shown a cooling trend since the beginning of this century.

    The observed cooling trend is entirely consistent with the decline in solar activity since the apparent end of the Grand Solar Maximum.

    • Looks as if you didn’t include the 2015/16 temperatures, which happen to be an El Nino. Since the trend starts at the end of the 97/98 El Nino, perhaps it should be run again, this time including the current El Nino. (That tends to remove the claim of a “cherry-picked” start date, although it still depends on which El Nino, both being natural events, was stronger.)

      • If you look carefully at the graph you will see that the temperatures for both 2015 (Feb = 4C, Aug = 15.9C) and 2016 (Feb = 4.9C, Aug = 17C) are included.
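    Anyone wanting to check those trends can do so in a few lines. The sketch below assumes you have downloaded the Met Office monthly CET file (whitespace-separated rows of year, 12 monthly means, and an annual mean, with -99.9 marking missing values); the local file name is an assumption:

      # Least-squares trend in February and August CET since 2001.
      def trend_per_decade(pairs):
          """Ordinary least-squares slope, scaled to deg C per decade."""
          n = len(pairs)
          mx = sum(x for x, _ in pairs) / n
          my = sum(y for _, y in pairs) / n
          num = sum((x - mx) * (y - my) for x, y in pairs)
          den = sum((x - mx) ** 2 for x, _ in pairs)
          return 10 * num / den

      feb, aug = [], []
      with open("cetml1659on.dat") as f:
          for line in f:
              parts = line.split()
              if len(parts) < 13 or not parts[0].isdigit():
                  continue  # skip header lines
              year = int(parts[0])
              if year >= 2001:
                  v_feb, v_aug = float(parts[2]), float(parts[8])
                  if v_feb > -50: feb.append((year, v_feb))  # -99.9 = missing
                  if v_aug > -50: aug.append((year, v_aug))

      print("Feb trend:", round(trend_per_decade(feb), 2), "C/decade")
      print("Aug trend:", round(trend_per_decade(aug), 2), "C/decade")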

  25. There is a lot more to be said about the temperature adjustments, but I’ll keep this post fixed on this one point. The GHCN adjustment algorithm is unstable, as stated by Bates, and produces virtually meaningless adjusted past temperature data. The graphs shown here and by Peter O’Neill show this. No serious scientist should make use of such an unstable algorithm. Note that this spurious adjusted data is then used as the input for widely reported data sets such as GISTEMP. Even more absurdly, GISS carry out a further adjustment themselves on the already adjusted GHCN data. It is inconceivable that the climate scientists at GHCN and GISS are unaware of this serious problem with their methods.
    ____________________________________

    So the positive feedback resonance in Catastrophic Climate Change doesn’t come from real world experience.

    But from superfunded climate swindle computer models.

    Ain’t that good news.
    Man ain’t that news.

    • Just think of all the other real scientific work that has been done over the last 25 years using said temperature data.
      None of it can be believed. If the work was written 20 years ago, when the past was much hotter, and we believe the current data, all of that work is useless.
      If, on the other hand, the work was done in the last 5 or 10 years and we do not believe the current data, then all of their current work is useless.
      The real scientists should hate the Climate Brigade, but because they always include the word “climate” in their papers they still get published, granted, and paid, so that makes it all all right.

      Science in total disrepute.

      • I have said for years, “All of science will end up paying for this.”

        Unfortunately, when science is not to be trusted, the snake oil salesmen take over. Dark times ahead for science and society.

  26. The only way to change this behavior is to administer consequence. And yes, I am suggesting legal and financial consequences for purposely manipulating data. We can start with Congressional subpoenas, questions under oath and perp walks…

    • @ Oatley.

      Exactly. Evidence that drives policy should be unimpeachable, replicable, archived and available for public analysis. Cook the books and you lose your licence, job or maybe even your liberty. This swamp needs a LOT of draining.

    • 100% correct! A message needs to be sent regarding political abuse of science and the public trust. No one who studies this can doubt that there are boatloads of deliberate and premeditated deception in all this. Mann, Karl, the whole damn hockey team, and many others absolutely knew that what they were doing was misrepresenting data. Lock ’em up and take away their pensions!

  27. Goodness gracious, that’s some serious cooked bookery…. You absolutely NEED an independent audit; otherwise there is ZERO credibility.

    • As Griff says, the adjustment above must be legitimate, and I must try to understand it. Yes, now I understand. But why was the decrease in the 1880s only 3 degrees? Oops, I forgot, it is LEGITIMATE!
      No further questions, your honor.

      • Also, degrees are like dollars — there’s an inflation factor that has to be figured in. It’s really quite complex, but take my word for it, … it is a legitimate practice.

  28. The past has an uncertainty associated with it. The future is unpredictable. And even the present is malleable to conform with human expectations. The scientific domain is established with the observation and self-evident knowledge that accuracy is inversely proportional to time and space offsets from an observer’s frame of reference.

    • Owen Martin February 7, 2017 at 9:27 am
      It’s a shame there was a mistake in the Daily Mail article.

      The faithful will now claim the whole thing is fake, Guardian readers will lap it up.

      They will follow Schopenhauer’s cynical advice:

      Should your opponent be in the right, but, luckily for your contention, choose a faulty proof, you can easily manage to refute it, and then claim that you have thus refuted his whole position.
      —Schopenhauer, The Art of Controversy, #37

    • The Daily Mail suffers from being rubbish.
      NOAA is meant to be respectable.
      They shouldn’t lower themselves to the level of the Daily Mail. That they are struggling even to reach that level is particularly incriminating.

      Yet it matters not.
      If all they can argue with is the illustration then people will notice they don’t mention the actual complaint.
      This is about dodgy data archiving and manipulation and interpretation. Baselines on pictures are not relevant.

    • Calling it a “mistake” is being very generous.

      Here is another squirrel for Janice. I know she likes pretty pictures.

      Taking bets on 2017 or 2018 being the first “blue-water Arctic” in 10,000 years. Good luck with the AMOC and the PJS after that.

      • Dear Tony,

        How sweet of you to send me that little homemade Valentine. I didn’t know you cared.

        Now, Tony, this will be hard for you to hear and your admiration is flattering, but, I must make something very plain: you and I are not ever going to go on a date. If I have done anything to give you hope in that direction, please forgive me. It was not intentional.

        Try Griff. You and she have a lot in common, you know. Just be sure to put lots of ice in her root beer and she will be happy.

        Good luck.

        Sincerely,

        Janice

      • There were predictions of an ice-free Arctic starting in the late 1930s. Didn’t happen.

        What are your predictions re: Antarctic ice extents?

      • tony mclode:

        Nice volume chart you present.

        How did they determine the ice volume? In an age where most of the ice area charts are also modeled…

        Just another pathetic model pumping out data for their climate alarmist fellow parasites.

        Cutting off funds paying for trick data sites and researchers pumping out CO2 climate twaddle; 10. 9. 8.

        PS mclode, the House of Representatives drives the budget legislation, and they are solidly Republican. They will be so happy to pull a number of alarmist financial plugs.

        PPS Perhaps you should get your employers to pay you in advance. Then when we deny you ever posted here, they’ll send collectors to talk to you.

        PPPS Those alarmists don’t play nice.

  29. In any genuine empirical science such a thing would be shocking. But then this is “Climate (as in pseudo) Science,” so it is hardly shocking.

    Bottom line is the data that supposedly “supports” warming “caused by CO2” [which has never been observed to drive temperature in the past (but they like to ignore that inconvenient fact)] is at best crap and at worst outright [word that shall not be spoken here].

  30. The existence of a pause in warming was denied publicly even when the consensus science could not explain it. Recall the admission of “no statistically significant warming” for over a decade by Phil Jones when asked by Parliament. Prior to that moment the pause was called a ruse of climate change denial.

    Eventually the pause was accepted but a furious effort was under way to discredit it. Karl, et al is part of that effort. As with the IPCC it cannot find what it was designed not to find. This is the real meaning of science fit for purpose. Hockey sticks are manufactured the same way.

    • I think Dr Roy prefers “CCC” (Cadillac Calibration Cherrypick). A divine intervention that sets uncertainties straight…
      The scientific equivalent of Maradona’s infamous “Hand of God” in the 1986 FIFA World Cup…

  31. The GHCN adjustment algorithm is unstable, as stated by Bates, and produces virtually meaningless adjusted past temperature data.

    Or, to use the technical term, gibberish.

  32. I may not win anything for picking the correct temperature record, but if enough of us “guess” the right one, don’t we collectively win trillions of dollars?

  33. Okay, the algorithm is unstable. So what? As Bates has pointed out, “I learned that the computer used to process the software had suffered a complete failure.”

    The computer that ran the unstable algorithm is no longer available to run the unstable algorithm. Problem solved.
    /sarc off

  34. In my world, data from properly calibrated instruments doesn’t have to be adjusted.

    The kind of slapdash software adjustments described in this story make my guts roil.

    • commieBob, absolutely!

      “Conclusion
      High precision temperature measurement is possible through the use of well-specified and suitably calibrated sensors and instrumentation. However, the accuracy of these measurements will be meaningless unless the equipment and sensors are used correctly”

    • commieBob: What, you don’t believe that the water temperature being measured by an Argo buoy suddenly and magically jumped by 0.12 C over the temperature that was actually measured?

    • There are very valid reasons for corrections. In a previous working life doing ballistics work, we had a change in the type of pressure gauge specified. The change resulted in an alteration of the volume under the gauge, which altered the overall pressure in the system, so tests conducted with one type of gauge could not be directly compared to results from the second, later gauge type. To enable comparison of the standard product against new and old gauges, a correction was determined and systematically applied where needed (14.4 MPa on our standard lot, from memory). Both gauge types were fully calibrated; it is just that, in application, they generate different results because they alter the test environment in slightly different ways. I don’t think the adjustments in the climate data fall into this type of correction, though, as they are different on different days, not constant over the time series.
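      That gauge example captures the legitimate pattern well: one documented, physically derived, constant correction, applied to a copy while the raw record stays untouched. A minimal sketch of that pattern, with hypothetical names and numbers (the 14.4 MPa figure is borrowed from the comment above purely for illustration):

        # A documented, constant calibration correction applied to a COPY;
        # the frozen raw record cannot be edited.
        from dataclasses import dataclass

        @dataclass(frozen=True)  # frozen: raw readings are immutable
        class PressureTest:
            test_id: str
            raw_mpa: float
            gauge: str  # "old" or "new"

        # Documented once, with provenance, never recomputed on the fly:
        GAUGE_CORRECTION_MPA = {"old": 0.0, "new": 14.4}

        def corrected(test: PressureTest) -> float:
            """Return the comparable value; the original stays on record."""
            return test.raw_mpa + GAUGE_CORRECTION_MPA[test.gauge]

        t = PressureTest("lot-0417", 310.2, "new")
        print(t.raw_mpa, "->", corrected(t))

      The contrast with the climate adjustments complained of above is exactly the one the commenter draws: the correction never changes from run to run, and the raw value is never overwritten.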

  35. These don’t matter; it’s all a distraction.
    Temps just follow dew point temperature, and if they included dew points and did the same stuff to both data sets, it would still follow dew point temps. It’s 57 in Cleveland today because warm water vapor out of the Gulf of Mexico blew north instead of east.

    Instead of 20 years of arguing about temperature series that don’t prove anything as used, why doesn’t anyone study how CO2 actually affects the day-to-day change in temperature response?

    • A nice analysis, here, by micro (Mike Crow):

      … CAGW is a rate-of-cooling problem, not a static temperature problem. Is CO2 changing the rate of cooling, thereby altering the expected surface temperature? Are the hypothesized positive feedbacks actually there? Are there any actual measurements of these parameters? … Every night, the Sun sets on every location on Earth, and the surface starts to cool by radiating heat into the cold black of space. What can weather station data tell us about this? The temperature record has daily min and max temperatures. When the Sun comes up in the morning, on most days it warms the surface from the minimum temp of the day, peaking late in the afternoon. Then, the Sun sets and temperatures start to plummet. I live at 41 N Lat, and on a clear night, the temperature will drop 20-30 F (Figure 1), over a degree F per hour. If there’s a CO2 effect in the temperature record, it should show up in nighttime cooling. The question is, does this loss of cooling actually show up in the data? …

      {‘The Preprocessing Step’} — I started with NCDC’s global summary of day’s data set which contains over 120 million station
      records, and starts in late 1929. The first thing to notice is how few samples there are each year prior to 1973 (Figure 2). What I wanted to look at is how much the temperature went up ‘today,’ and how much does it drop ‘tonight.’ Today’s Rising temp is: (today’s T-max –today’s T-min.). Falling temp is: (today’s T-max – tomorrow’s T-min), the drop in temperature overnight. … When annual averages are generated, I average the daily values for a particular station, then, average the annual values of the collections of stations in the area being examined. …

      Conclusion:

      The worldwide surface station measured, average, daily, rising temp and falling temps are: 17.465460F and 17.465673F for the period of 1950 to 2010. Not only are the falling temperatures slightly larger than the rising temperatures, 17.4F is only 50%-70% of a typical clear sky temperature swing of 25F to 30F, which can be as large as +40F depending on location and humidity. …

      Since recorded Min – Max temperatures show no sign of a loss of cooling on a daily basis since at least 1950, even if CO2 has increased the amount of DLR, something else (most likely variability of clouds) is controlling temperatures. This would seem to eliminate CO2 as the main cause of late 20th century warming.”

      (emphasis mine)

      (Source: https://wattsupwiththat.com/2013/05/17/an-analysis-of-night-time-cooling-based-on-ncdc-station-record-data/ )
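      The quoted definitions are simple enough to restate in code. A sketch using pandas, with assumed column names (the real GSOD fields differ) and simplified averaging:

        # "Rising" = today's Tmax - today's Tmin.
        # "Falling" = today's Tmax - tomorrow's Tmin.
        # Averaged per station first, then across stations.
        import pandas as pd

        def rise_fall(df):
            """df columns: station, date, tmax, tmin (one row per station-day)."""
            df = df.sort_values(["station", "date"]).copy()
            next_tmin = df.groupby("station")["tmin"].shift(-1)
            df["rising"] = df["tmax"] - df["tmin"]
            df["falling"] = df["tmax"] - next_tmin  # NaN on each station's last day
            return df.groupby("station")[["rising", "falling"]].mean().mean()

        demo = pd.DataFrame({
            "station": ["A"] * 3 + ["B"] * 3,
            "date": pd.to_datetime(["2010-07-01", "2010-07-02", "2010-07-03"] * 2),
            "tmax": [30.0, 32.0, 31.0, 25.0, 24.0, 26.0],
            "tmin": [18.0, 19.0, 18.5, 15.0, 14.0, 16.0],
        })
        print(rise_fall(demo))  # mean rising vs falling across the toy network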

    • There was an American guy, many years ago, who was plotting Temp vs Humidity and Temp vs CO2 for individual stations.
      Humidity correlated; CO2 did not.
      I just wish I could remember the blog. I remember the post; it was on “Temp being a Random Walk” on Bart’s forum, and the guy pushing the random walk I think was VS, and I think the US guy might have been Scott something. I will have to try and find it tomorrow.

      • He was right. There’s a relative-humidity regulation of nighttime cooling rates: it cools really fast until it starts having to condense water vapor. And you can see the asymmetry in the spring vs the fall as the length of day changes; longer nights have more time to radiate at the reduced rate.

      • Here, A. C. Osborn, are some clues for you (I looked it up in my handy, dandy, WUWT 10th Anniversary anthology :) ):

        in a comment:

        RiHo08: “Temperature time series data correlation was assessed last year March 2010 by VS on Bart Verheggen’s blog using 1880 to 2008 data and concluded that all temperatures fell within normal variance parameters. …”

        (https://wattsupwiththat.com/2011/02/14/pielk-sr-on-the-30-year-random-walk-in-surface-temperature-record/#comment-598796 )

        from this thread:

        Excerpt

        “… Bye, J., K. Fraedrich, E. Kirk, S. Schubert, and X. Zhu (2011), Random walk lengths of about 30 years in global climate, Geophys. Res. Lett., doi:10.1029/2010GL046333 — … We have applied the relation for the mean of the expected values of the maximum excursion in a bounded random walk to estimate the random walk length from time series of eight independent global mean quantities (temperature maximum, summer lag, temperature minimum and winter lag over the land and in the ocean) … The results … indicate a random walk length on land of 24 yr and over the ocean of 20 yr. Using three 1000 yr segments from mil01, the random walk lengths increased to 37 yr on land and 33 yr over the ocean … on the time scale of civilizations, [] the random walk length is likely to be about 30 years. …

        We [Pielke, Sr., et al.] agree with the authors of the paper on this statement. This is one of the reasons we completed the paper. Herman, B.M. M.A. Brunke, R.A. Pielke Sr., J.R. Christy, and R.T. McNider, 2010: Global and hemispheric lower tropospheric temperature trends. Remote Sensing, 2, 2561-2570; doi:10.3390/rs2112561 — …

        The actual dates of the hemispheric maxima and minima are a complex function of many variables which can change from year to year …

        Here we examine: the time of occurrence of the global and hemispheric maxima and minima; lower tropospheric temperatures; the values of the annual maxima and minima; and the slopes and significance of the changes in these metrics. …

        The 2011 Bye, et al. GRL paper conclusion reads: In 1935, the International Meteorological Organisation confirmed that ‘climate is the average weather’ and adopted the years 1901-1930 as the ‘climate normal period’. Subsequently a period of thirty years has been retained as the classical period of averaging (IPCC 2007). Our analysis suggests that this administrative decision was an inspired guess. Random walks of length about 30 years within natural variability are an ‘inconvenient truth’ which must be taken into account in the global warming debate. This is particularly true when the causes of trends in the temperature record are under consideration. …”

        (https://wattsupwiththat.com/2011/02/14/pielk-sr-on-the-30-year-random-walk-in-surface-temperature-record/ )

        Note: lots of informative comments on that thread.

    • Why doesn’t anyone study how CO2 actually affects the day-to-day change in temperature response?

      ANSWER: Because this does not conform to theory.

      You mean do actual observations? … Well, this is just too uncomfortable. I’d have to leave my cozy computer lab and miss lunch at that new bar & grill — I hear they serve a mean burger over there. I burn easily too. Now where did I put that funding application?

    • Good point. Nobody’s interested.
      Theoretically, if it were 280 ppm today it would be x degrees; if it were 400 it would be x + r degrees under the same other weather conditions.
      And it is not.

  36. I nominate Karl for the Lois Lerner and John C. Beale Award in public dis-service. Do I get a concurrence from the FBI?

  37. Isn’t the selection and agreement on a baseline important? I mean, if we are continuously told that climate doom kicks in once we are 2 degrees above the baseline, the baseline becomes obviously important. Adjusting that baseline can make it look closer or further from the arbitrary 2 degree mark. Also, are the datasets the same after the baselines are matched? It actually appears to me that NOAA shows more warming starting about 2010.

    • It is important NOT to post graphs with different baselines on the same chart without telling people that’s what you have done. That’s falsification.

      • The baseline business is an irrelevant distraction.
        This graph clearly shows the result of the Karl et al. paper:

        “June 18, 2015: NCDC has introduced a number of rather large administrative changes to their sea surface temperature record. The overall result is to produce a record giving the impression of a continuous temperature increase, also in the 21st century. As the oceans cover about 71% of the entire surface of planet Earth, the effect of this administrative change is clearly seen in the NCDC record for global surface air temperature above” (climate4you).
        The Karl et al. paper clearly states: “These results do not support the notion of a “slowdown” in the increase of global surface temperature”.
        The controversy is about the how, why, and when surrounding this adjustment.

      • Posting graphs with only post 1978 dates is also misleading.

        Please be sure to only post graphs that at a minimum include the Medieval Warming Period and the Little Ice Age.

        Graphs that include all of the Holocene would of course be even more informative.

    • I’ve been wondering about that – if the ‘reference period’ is adjusted down, doesn’t the ‘anomaly’ go up, even if the actual temperature is unchanged?

    • All the baseline does is shift the height above 0 by a consistent offset. If you plot it with a 1950–1980 baseline the numbers will all be larger than if you plot it with a 1980–2010 baseline. The shapes will be unaffected.
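      A quick numeric check of that point, with a made-up series; re-baselining moves every anomaly by one constant, leaving shapes and trends untouched:

        # Changing the anomaly baseline is a constant offset, nothing more.
        temps = {1950 + i: 14.0 + 0.01 * i for i in range(70)}  # toy series

        def anomalies(series, start, end):
            base = sum(series[y] for y in range(start, end + 1)) / (end - start + 1)
            return {y: t - base for y, t in series.items()}

        a1 = anomalies(temps, 1951, 1980)
        a2 = anomalies(temps, 1981, 2010)

        diffs = {round(a1[y] - a2[y], 6) for y in temps}
        print(diffs)  # one value only: the series differ by a single constant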

  38. One thing I find odd is that in a group with 8 co-writers, the data and code appear to have been on only one computer, which crashed.

    To me that says that NONE of the co-writers actually checked the code or data… AMAZING!!

    Oh well, their names are on the piece of anti-science crap now… that is their problem

      • Climate trend?
        Er, since when did ‘climate’ become ‘temperature’ (and vice-versa)?
        And since when did a (computer generated) ‘adjustment’ give us a better idea of what ‘actually’ happened in the past?
        Virtual fantasy.

      • Prove your assertions, Griff. You aren’t a working scientist. You don’t know what it is to have to debunk fraudulent science to children. I’m one of the scientists who helped establish that cannabis isn’t related to opiates in medicine. Do you like the government claiming that marijuana is like heroin, telling your child that heroin and marijuana are about the same, because the scientists of the federal government say so?

        No you wouldn’t; that’s fraud. You can not explain away all the fraud: from the Hansen computer programs that don’t have the laws of thermodynamics for the atmosphere in them, to the Mann computer programs that turned out to be nothing more nor less than thousands of lines of Fortran to manufacture hockey sticks, to the Phil Jones computer programs he was found fraudulently patching together around “Mike’s Nature trick”, or Keith Briffa’s fraudulent tree dating, using TINY numbers of the WRONG trees to claim the entire history of the world is different.

        You can’t explain how no one who claims to believe the fraud can even discuss basic atmospherics. You can’t explain how it is you’re so sure it’s right, but you don’t know the name of the law to find out the temperature of some air, or why the atmospheric mix, and gases, and vapors even have their own law of thermodynamics to solve their matter/energy relationships.

        What is the name of that law Griff?

        Why does the atmospheric mix, and gas, and vapor phase of matter have its own law of thermodynamics to solve its relationships, Griff?

        You see how swiftly anybody with any atmospheric specialization at all can check whether you’re just another fraud, like all the other frauds, who came down the pike claiming they were going to unleash an army of bloggers who would rule the world with stupid, and transform science? They did transform it: you helped. Climate science and any of you associated with it are the laughingstocks of the entire earth.

      • Just hang around. Cowtan’s last major paper used a multiplication by 3 instead of a division by 3, yet sailed through pal review. I’ll warrant there is a similar massive cock-up in this so-called verification paper. BEST (worst) has blotted its copybook more than once, and Hausfather & Cowtan are far from either independent or unbiased. In any event, you cannot verify any process that is so demonstrably flawed, so Hausfather’s effort stands alone now, ready for a climate audit I suspect.

        Meanwhile, here:
        https://judithcurry.com/2017/02/06/response-to-critiques-climate-scientists-versus-climate-data/

        Bates eviscerates Peter Thorne’s made-up nonsense that you were earlier linking to. Apparently Thorne wasn’t even there at the time!

      • Congrats Griff, that is the most worthless comment you have ever made. I could not have demonstrated your complete ignorance on the subject any better.

        Sheesh, we are not talking about an experiment or a theory here. We are talking about the production of a data set intended to be used by other scientists and decision makers around the globe. That is a task that must be done according to a process that includes peer review of ALL changes, proper QC, proper code and data configuration management (CM), etc. Right now it is just a slapdash operation, as if they were only using it among themselves, which is undoubtedly how it started.

        Time for the professionals (in data management) to step in. This leaves you out, Griff, as you have no clue what I’m talking about.

    • Andy, the whole nonsense about the original data being lost and the computer breaking is just that, nonsense. Almost everyone has enough experience to know computers get backed up, most especially computers used professionally. Most people even back up their personal computers.

      And the computer breaking? Is there some reason to think this computer was a one-off, purpose-built machine with no design documents to go with it? Really? No, that’s absurd. If the computer broke it can be replaced.

      It’s just a refusal to produce the requested material and the people responsible need to be held in contempt.

  39. So it seems that Hadley just copies NOAA, while we already knew NASA copies NOAA and just adds made-up “hot” data for the Arctic. But these are all described by the zealots as independent. In reality there is therefore just one official land-based dataset, which we now know to be seriously screwed up. What’s left are the 3 satellite, 2 radiosonde, and Argo buoy datasets that all agree with each other, plus an outlier Berkeley amateur effort which also uses made-up “hot” data in the Arctic and has no data at all for 80% of the planet.

    • Exactly right, JasG. All the surface temperature data is contaminated and is certainly unfit to use to make public policy.

  40. Time after time the warmists have been shown lying and falsifying data.

    The media and Western educational system promote the scam; basically it is an ideological war against conservatives and free-market capitalism.

    At least some of the truth is getting exposed.

  41. I think I know exactly why the algorithm is unstable. All of the symptoms make it look like they have implemented it using floating-point arithmetic. Floating-point rounding errors accumulate the longer the calculation runs, which leads to the largest adjustments being made at the end of the run. They start from the most recent date and run to the oldest, meaning the largest adjustments happen to the oldest data! (A small demonstration of order-dependent floating-point error appears at the end of this thread.)

    /ikh

    • +1 — I think you might be right!

      And, of COURSE they did not intend that. They, uh, ….. they just….. just got their start and end point mixed up. :)

    • It would be more accurate to say that this is a reason the models never give the same result twice. I was going to say “the right result”, but that is far more fundamental: treating the Earth like an onion, instead of a complex set of heat pumps pushing warm and cold air/water around, as it is known to be.

      The reason the data munger keeps giving different answers is, well, what else would you expect if you give PhDs in climatology tasks more suited to people who know how to program computers?
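    Whatever the merits of the speculation in this thread, the underlying numerical fact is real: floating-point addition is not associative, so the order in which values are accumulated changes the result. A minimal demonstration:

      # Floating-point sums depend on accumulation order.
      import math, random

      random.seed(0)
      values = [random.uniform(-1e6, 1e6) for _ in range(100_000)]

      newest_first = sum(values)            # one accumulation order
      oldest_first = sum(reversed(values))  # the opposite order
      exact = math.fsum(values)             # correctly rounded reference

      print(f"newest-first: {newest_first:.6f}")
      print(f"oldest-first: {oldest_first:.6f}")
      print(f"fsum:         {exact:.6f}")
      # The naive sums typically disagree in the low digits; any algorithm
      # whose output depends on such residuals will wobble on every re-run.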

  42. It seems to me that the people responsible for this should be on the witness stand testifying under oath with penalty of perjury.

    I like this bit from the answer:
    “If stations had intentionally been dropped to maximize the warming trend, one would expect to see more divergence between surface and satellite records over time as the number of stations used by GHCN decreases. However, a close examination of the residuals when satellite records are subtracted from surface station records shows no significant divergence over time compared to either UAH or RSS.”

    Until 2010, satellite temperatures were in line with surface temperatures.
    What happened then?

  44. There is no doubt that this development at NOAA is the start of a line of whistleblowers who will come out of the woodwork as the rats leave the sinking global warming ship. While Bates seems motivated by genuine concern, others will be motivated by self-preservation. As the fraud charges begin to flow, watch the number of individuals who will happily provide evidence to save their own skin. The extent and magnitude of the scandal will be there for all to see, and even the MSM will not be able to ignore it, though they are continuing to try.
    The carnival is over!

  45. Seems to me that a possibility exists that the climate in Alice Springs was a bit different then, that a shift in an ocean current may have occurred sometime in the last century.

    One must always challenge assumptions.

    • You do realize that Alice Springs is basically in the middle of continental Australia, don’t you? i.e., in the middle of a desert? Or did you forget the /sarc?

  46. Dr. John Bates works to enhance the quality and storage of climate data. He is a better scientist than I am, for I have tried to work with quality and storage, but failed, miserably at times.
    In the present blog context of adjustments to some Australian and global temperature data, I here present two working graphs from the towns of Darwin and Alice Springs in the Northern Territory. Others on this blog have presented related graphs that can be compared. BTW, I have stayed in both towns many times since 1960 and am familiar with some relevant history and geography.
    The graphs I show are genuine and composed from what was available from a routine search at past times – Nov 2010 for Darwin, Jan 2014 for the Alice. They have lost their metadata in a subsequent disc crash. My apologies.
    The explanations for the title blocks are, first for Alice Springs –
    BOM CD ca 1993. A colleague purchased from the BOM a full set of station data as it existed for public use in 1993 or close to then.
    BOM ONLINE. This is Climate Data Online from the BOM web site http://www.bom.gov.au/climate/data/ This searchable site has been stable, with essentially unaltered data, since I first viewed it about 2007.
    BOM CD 2007 et seq from CDO. BOM sold a product, a CD with over 1,000 Australian and Antarctic stations, with the CD record ending March 2007. Daily data with max and min temperatures, considered to be raw through matching with original observers’ handwritten library records.
    NASA GISS HOMOGENISED I cannot recall more detail than this. I would have searched easy key words and graphed what seemed to be the dominant search result. I have lost the version number.
    KNMI GHCN ver 2 ADJUSTED. Again, much as it says.

    And for Darwin, as above or self-explanatory except for Butterworth, which I recall digitising by hand from a printed graph. See Butterworth I (1993) On the inhomogeneity of climatic temperature records at Darwin. Bureau of Meteorology, Northern Territory Regional Office Research Papers 2:107–110


    These graphs are shown as examples of the variability of the temperature record over time, but also according to the compiling body. While more comment is available on request, I simply show these and note the obvious difficulty for anyone trying to derive a reliable, low-error summary time series good enough to be the primary standard for these 2 towns into the future.
    Remember as well the pair-matching procedures that have used pairs of stations up to 1,200 km apart. There are few reliable records within 1,200 km of each of these towns, which are themselves 1,200 km apart. The temperature history of Australia relies heavily on the data from these 2 places, and to the extent that Australia holds most Southern Hemisphere records, so does the global mean surface temperature.

    Geoff

  47. There are two historic temperature sources that can be added to the existing data which show that the GISS interpretation of Alice Springs is without foundation.

    One is Meteorological Data for Certain Australian Locations published in 1933 by the CSIR (http://www.waclimate.net/csir.pdf), which gives monthly averages 1874-1931, and the other is Australia’s 1953 Year Book which provides monthly averages during 1911-40 (http://www.bom.gov.au/jsp/ncc/cdio/weatherData/av?p_display_type=dailyDataFile&p_nccObsCode=122&p_stn_num=015540&p_c=-48294455&p_startYear=1953).

    The CSIR and Year Book temperatures are probably totally unadjusted; in any case, any changes were made by people who’d never heard of CO2 and global warming. GISS, of course, can also be compared to the BoM’s unadjusted RAW and adjusted ACORN datasets.

    CSIR and Year Book verify the bureau’s raw temps. The question of thermometer shelters is convoluted. The ACORN station catalogue (http://www.bom.gov.au/climate/change/acorn-sat/documents/ACORN-SAT-Station-Catalogue-2012-WEB.pdf) states:

    “There is no known documentation of the screen type at the Telegraph Station but the observations are consistent with a Stevenson screen having been used there. The site was enclosed by a rock wall about 1m high and painted white that would have interrupted wind flow and reflected heat. Observations moved to the Post Office on 23 January 1932. The Post Office site continued until 1953 but data after 1944 were not used in ACORN-SAT as there appear to have been changes at the Post Office site around the time that the airport site opened.”

    Alice Springs in 2010-16 had an ACORN mean of 21.1C, so temps over the past seven years have increased either 0.2C since 1874-1931 according to CSIR and BoM RAW, or 3.1C according to GISS.

    If you prefer a probable Stevenson in 1911-40, the Alice Springs mean has increased 0.5C to 2010-16 according to the Year Book and BoM RAW (the YB/RAW 1911-40 vs 2010-16 comparison is actually 20.63 vs 21.06C, so the unadjusted increase was 0.43C), 1.0C according to ACORN, or 2.8C according to GISS.

    If the 1m high white wall is considered an artificial factor, note that Darwin to the north was 28.23C in 1882-1931 and 28.07C in 1911-40, a 0.16C cooling that suggests a climate influence (Camooweal 1907-31: 25.06C, 1911-40: 24.95C – 0.11C cooler / Boulia 1889-1931: 24.29C, 1911-40: 24.18C – 0.11C cooler).

    Multiple unadjusted sources suggest between 0.2C to 0.5C warming over the past century at Alice Springs. The BoM’s ACORN experts have warmed it by 1.0C, yet GISS has found other unknown reasons to warm it by 2.8C. It might be assumed that Australia’s experts have overlooked 1.8C worth of artificial influence, according to their international compatriots.

    It’s noteworthy that the Alice Springs Post Office and Airport had a 12 year overlap 1942-53 during which the PO had a raw mean temperature of 20.41C and the airport had a mean of 20.45C.

    Its southern hemisphere isolation makes Alice Springs a very influential site, and all the records suggest it’s probably had no natural temperature change, yet adjustments have warmed it by anywhere from 1.0 C to 3.1 C.

  48. The Global Homogenised Climatology Network or the Global Hysterical Climatology Network, I’m not sure which at present. Where’s Griff when you really need him to clarify these minor errata?

  49. From the article: ” Note that this spurious adjusted data is then used as the input for widely reported data sets such as GISTEMP. Even more absurdly, GISS carry out a further adjustment themselves on the already adjusted GHCN data. It is inconceivable that the climate scientists at GHCN and GISS are unaware of this serious problem with their methods.”

    That’s right, it *is* inconceivable that these scientists were unaware of this problem. This “problem” allowed them to deliberately lie to the world about the surface temperatures and helped promote their human-caused climate change agenda.

    Anyone who is paying good money for this contaminated data ought to be suing to get their money back.

    The NOAA/NASA data manipulators have been caught out. I saw a report about it on Fox News this morning, and there were Senate hearings on the EPA administrator where this subject was broached. This isn’t going away.
