American Thinker on CRU, GISS, and Climategate

Climategate: CRU Was But the Tip of the Iceberg

By Marc Sheppard

Not surprisingly, the blatant corruption exposed at Britain’s premier climate institute was not contained within the nation’s borders. Just months after the Climategate scandal broke, a new study has uncovered compelling evidence that our government’s principal climate centers have also been manipulating worldwide temperature data in order to fraudulently advance the global warming political agenda.

Not only does the preliminary report [PDF] indict a broader network of conspirators, but it also challenges the very mechanism by which global temperatures are measured, published, and historically ranked.

Last Thursday, Certified Consulting Meteorologist Joseph D’Aleo and computer expert E. Michael Smith appeared together on KUSI TV [Video] to discuss the Climategate — American Style scandal they had discovered. This time out, the alleged perpetrators are the National Oceanic and Atmospheric Administration (NOAA) and the NASA Goddard Institute for Space Studies (GISS).

NOAA stands accused by the two researchers of strategically deleting cherry-picked, cooler-reporting weather observation stations from the temperature data it provides the world through its National Climatic Data Center (NCDC). D’Aleo explained to show host and Weather Channel founder John Coleman that while the Hadley Center in the U.K. has been the subject of recent scrutiny, “[w]e think NOAA is complicit, if not the real ground zero for the issue.”

And their primary accomplices are the scientists at GISS, who put the altered data through an even more biased regimen of alterations, including intentionally replacing the dropped NOAA readings with those of stations located in much warmer locales.

As you’ll soon see, the ultimate effects of these statistical transgressions on the reports which influence climate alarm and subsequently world energy policy are nothing short of staggering.

NOAA – Data In / Garbage Out

Although satellite temperature measurements have been available since 1978, most global temperature analyses still rely on data captured from land-based thermometers, scattered more or less about the planet. It is that data which NOAA receives and disseminates – although not before performing some sleight-of-hand on it.

Smith has done much of the heavy lifting involved in analyzing the NOAA/GISS data and software, and he chronicles his often frustrating experiences at his fascinating website. There, detail-seekers will find plenty to satisfy, divided into easily-navigated sections — some designed specifically for us “geeks,” but most readily approachable to readers of all technical strata.

Perhaps the key point discovered by Smith was that by 1990, NOAA had deleted from its datasets all but 1,500 of the 6,000 thermometers in service around the globe.

Now, 75% represents quite a drop in sampling population, particularly considering that these stations provide the readings used to compile both the Global Historical Climatology Network (GHCN) and United States Historical Climatology Network (USHCN) datasets. These are the same datasets, incidentally, which serve as primary sources of temperature data not only for climate researchers and universities worldwide, but also for the many international agencies using the data to create analytical temperature anomaly maps and charts.

Yet as disturbing as the number of dropped stations was, it is the nature of NOAA’s “selection bias” that Smith found infinitely more troubling.

It seems that stations placed in historically cooler, rural areas of higher latitude and elevation were scrapped from the data series in favor of more urban locales at lower latitudes and elevations. Consequently, post-1990 readings have been biased to the warm side not only by selective geographic location, but also by the anthropogenic heating influence of a phenomenon known as the Urban Heat Island Effect (UHI).

For example, Canada’s reporting stations dropped from 496 in 1989 to 44 in 1991, with the percentage of stations at lower elevations tripling while the numbers of those at higher elevations dropped to one. That’s right: As Smith wrote in his blog, they left “one thermometer for everything north of LAT 65.” And that one resides in a place called Eureka, which has been described as “The Garden Spot of the Arctic” due to its unusually moderate summers.

Smith also discovered that in California, only four stations remain – one in San Francisco and three in Southern L.A. near the beach – and he rightly observed that

It is certainly impossible to compare it with the past record that had thermometers in the snowy mountains. So we can have no idea if California is warming or cooling by looking at the USHCN data set or the GHCN data set.

That’s because the baseline temperatures to which current readings are compared were a true averaging of both warmer and cooler locations. And comparing these historic true averages to contemporary false averages – which have had the lower end of their numbers intentionally stripped out – will always yield a warming trend, even when temperatures have actually dropped.
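To make that arithmetic concrete, here is a minimal sketch with invented station values – this is emphatically not NOAA’s code, just the averaging fallacy described above:

```python
# Invented numbers: five stations, none of which warms at all.
stations = {"mountain": 2.0, "rural_north": 5.0, "valley": 10.0,
            "city": 14.0, "coastal_south": 16.0}

# Historic baseline: a true average over ALL stations.
baseline = sum(stations.values()) / len(stations)            # 9.40

# Later, the cooler stations are dropped; no reading has changed.
surviving = {k: v for k, v in stations.items() if v >= 10.0}
current = sum(surviving.values()) / len(surviving)           # 13.33

print(f"spurious 'warming': {current - baseline:+.2f} deg")  # +3.93
```

No station warmed, yet the headline number did – purely an artifact of who was left in the sample.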

Overall, U.S. online stations have dropped from a peak of 1,850 in 1963 to a low of 136 as of 2007. In his blog, Smith wittily observed that “the Thermometer Langoliers have eaten 9/10 of the thermometers in the USA[,] including all the cold ones in California.” But he was deadly serious after comparing current to previous versions of USHCN data and discovering that this “selection bias” creates a +0.6°C warming in U.S. temperature history.

And no wonder — imagine the accuracy of campaign tracking polls were Gallup to include only the replies of Democrats in their statistics.  But it gets worse.

Prior to publication, NOAA effects a number of “adjustments” to the cherry-picked stations’ data, supposedly to eliminate flagrant outliers, adjust for time-of-day heat variance, and “homogenize” stations with their neighbors in order to compensate for discontinuities. This last one, they state, is accomplished by essentially adjusting each to jibe closely with the mean of its five closest “neighbors.” But given the plummeting number of stations, and the likely disregard for the latitude, elevation, or UHI of such neighbors, it’s no surprise that such “homogenizing” seems always to result in warmer readings.
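For the mechanically inclined, here is a stripped-down sketch of that five-nearest-neighbor blending. Every detail in it (station names, coordinates, the 50/50 blending weight) is my own invention for illustration; NOAA’s actual pairwise homogenization algorithm is considerably more elaborate:

```python
import math

def flat_distance(a, b):
    # Crude distance in degrees; fine for an illustration.
    return math.hypot(a[0] - b[0], a[1] - b[1])

def homogenize(target, stations, k=5, weight=0.5):
    """Blend a station's reading toward the mean of its k nearest
    neighbors, ignoring their latitude, elevation, and UHI exposure."""
    name, pos, temp = target
    others = sorted((flat_distance(pos, p), t)
                    for n, p, t in stations if n != name)
    neighbor_mean = sum(t for _, t in others[:k]) / k
    return (1 - weight) * temp + weight * neighbor_mean

stations = [
    ("cool_rural", (60.0, -110.0), 4.0),
    ("airport_a",  (59.0, -108.0), 9.0),
    ("airport_b",  (58.5, -109.5), 8.5),
    ("city_c",     (61.0, -107.0), 9.5),
    ("suburb_d",   (60.5, -111.5), 8.0),
    ("town_e",     (59.5, -112.0), 7.5),
]
# The lone cool station is dragged toward its warm "neighbors": 4.0 -> 6.25
print(homogenize(stations[0], stations))
```

If the surviving neighbors skew urban and warm, the adjusted value for a cool station drifts upward regardless of its own record.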

The chart below is from Willis Eschenbach’s WUWT essay, “The smoking gun at Darwin Zero,” and it plots GHCN Raw versus homogeneity-adjusted temperature data at Darwin International Airport in Australia. The “adjustments” actually reversed the 20th-century trend from temperatures falling at 0.7°C per century to temperatures rising at 1.2°C per century. Eschenbach isolated a single station and found that it was adjusted to the positive by 6.0°C per century, and with no apparent reason, as all five stations at the airport more or less aligned for each period. His conclusion was that he had uncovered “indisputable evidence that the ‘homogenized’ data has been changed to fit someone’s preconceptions about whether the earth is warming.”

WUWT’s editor, Anthony Watts, has calculated the overall U.S. homogeneity bias to be 0.5°F to the positive, which alone accounts for almost one half of the 1.2°F warming over the last century. Add Smith’s selection bias to the mix and poof – actual warming completely disappears!

Yet believe it or not, the manipulation does not stop there.

GISS – Garbage In / Globaloney Out

The scientists at NASA’s GISS are widely considered to be the world’s leading researchers into atmospheric and climate changes. And their Surface Temperature (GISTemp) analysis system is undoubtedly the premier source for global surface temperature anomaly reports.

In creating its widely disseminated maps and charts, the program merges station readings collected from the Scientific Committee on Antarctic Research (SCAR) with GHCN and USHCN data from NOAA.

It then puts the merged data through a few “adjustments” of its own.

First, it further “homogenizes” stations, supposedly adjusting for UHI by (according to NASA) changing “the long term trend of any non-rural station to match the long term trend of their rural neighbors, while retaining the short term monthly and annual variations.” Of course, the reduced number of stations will have the same effect on GISS’s UHI correction as it did on NOAA’s discontinuity homogenization – the creation of artificial warming.
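NASA’s description translates to something like the sketch below – swap the station’s own long-term slope for the rural one, keep the short-term wiggles. The series and slopes here are fabricated, and the real GISTemp code is Fortran and far more involved; the point is only that a warm-biased “rural” slope passes straight through such a correction:

```python
import numpy as np

def match_rural_trend(urban, years, rural_slope):
    """Replace a non-rural station's own linear trend with the rural
    slope, retaining its short-term (monthly/annual) variations."""
    years = np.asarray(years, dtype=float)
    urban = np.asarray(urban, dtype=float)
    own_slope, _ = np.polyfit(years, urban, 1)
    centered = years - years.mean()
    detrended = urban - own_slope * centered     # keep level + wiggles
    return detrended + rural_slope * centered    # impose the rural trend

years = np.arange(1950, 2010)
urban = 0.03 * (years - 1950) + 0.1 * np.sin(years)  # fake warming city
adjusted = match_rural_trend(urban, years, rural_slope=0.005)
# If the "rural" neighbors are mislabeled airport stations, rural_slope
# is itself warm-biased and the "correction" corrects nothing.
```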

Furthermore, in his communications with me, Smith cited boatloads of problems and errors he found in the Fortran code written to accomplish this task, ranging from hot airport stations being mismarked as “rural” to the “correction” having the wrong sign (+/-) and therefore increasing when it meant to decrease or vice-versa.

And according to NASA, “If no such neighbors exist or the overlap of the rural combination and the non-rural record is less than 20 years, the station is completely dropped; if the rural records are shorter, part of the non-rural record is dropped.”

However, Smith points out that a dropped record may be “from a location that has existed for 100 years.” For instance, if an aging piece of equipment gets swapped out, thereby changing its identification number, the time horizon reinitializes to zero years. Even having a large enough temporal gap (e.g., during a world war) might cause the data to “just get tossed out.”

But the real chicanery begins in the next phase, wherein the planet is flattened and stretched onto an 8,000-box grid, into which the time series are converted to a series of anomalies (degree variances from the baseline). Now, you might wonder just how one manages to fill 8,000 boxes using 1,500 stations.

Here’s NASA’s solution:

For each grid box, the stations within that grid box and also any station within 1200km of the center of that box are combined using the reference station method.

Even on paper, the design flaws inherent in such a process should be glaringly obvious.
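To see why, consider a bare-bones sketch of the rule just quoted. The station, its reading, and the box center below are hypothetical, and GISS’s actual “reference station method” distance-weights and merges overlapping records rather than simply averaging – but the 1200 km reach is the point:

```python
import math

R_EARTH_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in km.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * R_EARTH_KM * math.asin(math.sqrt(a))

def box_anomaly(center, stations, radius_km=1200.0):
    """Fill a grid box from every station within radius_km of its center."""
    nearby = [anom for lat, lon, anom in stations
              if haversine_km(center[0], center[1], lat, lon) <= radius_km]
    return sum(nearby) / len(nearby) if nearby else None  # None = empty box

# One hypothetical airport reading "infills" a box ~1,000 km out to sea:
stations = [(21.3, -157.9, +1.2)]
print(box_anomaly((30.0, -160.0), stations))  # +1.2
```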

So it’s no surprise that Smith found many examples of problems surfacing in actual practice. He offered me Hawaii for starters. It seems that all of the Aloha State’s surviving stations reside in major airports. Nonetheless, this unrepresentative hot data is what’s used to “infill” the surrounding “empty” Grid Boxes up to 1200 km out to sea. So in effect, you have “jet airport tarmacs ‘standing in’ for temperature over water 1200 km closer to the North Pole.”

An isolated problem? Hardly, reports Smith.

From KUSI’s Global Warming: The Other Side:

“There’s a wonderful baseline for Bolivia — a very high mountainous country — right up until 1990 when the data ends.  And if you look on the [GISS] November 2009 anomaly map, you’ll see a very red rosy hot Bolivia [boxed in blue].  But how do you get a hot Bolivia when you haven’t measured the temperature for 20 years?”

Of course, you already know the answer:  GISS simply fills in the missing numbers – originally cool, as Bolivia contains proportionately more land above 10,000 feet than any other country in the world – with hot ones available in neighboring stations on a beach in Peru or somewhere in the Amazon jungle.

Remember that single station north of 65° latitude which they located in a warm section of northern Canada? Joe D’Aleo explained its purpose: “To estimate temperatures in the Northwest Territory [boxed in green above], they either have to rely on that location or look further south.”

Pretty slick, huh?

And those are but a few examples. In fact, throughout the entire grid, cooler station data are dropped and “filled in” by temperatures extrapolated from warmer stations in a manner obviously designed to overestimate warming…

…And convince you that it’s your fault.

Government and Intergovernmental Agencies — Globaloney In / Green Gospel Out

Smith attributes up to 3°F (more in some places) of added “warming trend” between NOAA’s data adjustment and GIStemp processing.

That’s over twice last century’s reported warming.

And yet, not only are NOAA’s bogus data accepted as green gospel, but so are its equally bogus hysterical claims, like this one from the 2006 annual State of the Climate in 2005 [PDF]: “Globally averaged mean annual air temperature in 2005 slightly exceeded the previous record heat of 1998, making 2005 the warmest year on record.”

And as D’Aleo points out in the preliminary report, the recent NOAA proclamation that June 2009 was the second-warmest June in 130 years will go down in the history books, despite multiple satellite assessments ranking it as the 15th-coldest in 31 years.

Even when our own National Weather Service (NWS) makes its frequent announcements that a certain month or year was the hottest ever, or that five of the warmest years on record occurred last decade, they’re basing such hyperbole entirely on NOAA’s warm-biased data.

And how can anyone possibly read GISS chief James Hansen’s Sunday claim that 2009 was tied with 2007 for second-warmest year overall, and the Southern Hemisphere’s absolute warmest in 130 years of global instrumental temperature records, without laughing hysterically? It’s especially laughable when one considers that NOAA had just released a statement claiming that very same year (2009) to be tied with 2006 for the fifth-warmest year on record.

So how do alarmists reconcile one government center reporting 2009 as tied for second while another had it tied for fifth? If you’re WaPo’s Andrew Freedman, you simply chalk it up to “different data analysis methods” before adjudicating both NASA and NOAA innocent of any impropriety based solely on their pointless assertions that they didn’t do it.

Earth to Andrew: “Different data analysis methods”? Try replacing “analysis” with “manipulation,” and ye shall find enlightenment. More importantly, since the drastically divergent results of the two “methods” can’t both be right, both are immediately suspect. Does that somehow elude you?

But by far the most significant impact of this data fraud is that it ultimately bubbles up to the pages of the climate alarmists’ bible: The United Nations Intergovernmental Panel on Climate Change Assessment Report.

And wrong data begets wrong reports, which – particularly in this case – beget dreadfully wrong policy.

It’s High Time We Investigated the Investigators

The final report will be made public shortly, and it will be available at the websites of both report-supporter Science and Public Policy Institute and Joe D’Aleo’s own ICECAP. As they’ve both been tremendously helpful over the past few days, I’ll trust in the opinions I’ve received from the report’s architects to sum up.

This from the meteorologist:

The biggest gaps and greatest uncertainties are in high latitude areas where the data centers say they ‘find’ the greatest warming (and thus which contribute the most to their global anomalies). Add to that no adjustment for urban growth and land use changes (even as the world’s population increased from 1.5 to 6.7 billion people) [in the NOAA data] and questionable methodology for computing the historical record that very often cools off the early record and you have surface based data sets so seriously flawed, they can no longer be trusted for climate trend or model forecast assessment or decision making by the administration, congress or the EPA.

Roger Pielke Sr. has suggested: “…that we move forward with an inclusive assessment of the surface temperature record of CRU, GISS and NCDC.  We need to focus on the science issues.  This necessarily should involve all research investigators who are working on this topic, with formal assessments chaired and paneled by mutually agreed to climate scientists who do not have a vested interest in the outcome of the evaluations.” I endorse that suggestion.

Certainly, all rational thinkers agree. Perhaps even the mainstream media, most of whom have hitherto mistakenly dismissed Climategate as a uniquely British problem, will now wake up and demand such an investigation.

And this from the computer expert:

That the bias exists is not denied. That the data are too sparse and with too many holes over time is not denied. Temperature series programs, like NASA GISS GIStemp, try, but fail, to fix the holes and the bias. What is claimed is that “the anomaly will fix it.” But it cannot. Comparison of a cold baseline set to a hot present set must create a biased anomaly. It is simply overwhelmed by the task of taking out that much bias. And yet there is more. A whole zoo of adjustments are made to the data. These might be valid in some cases, but the end result is to put in a warming trend of up to several degrees. We are supposed to panic over a 1/10 degree change of “anomaly” but accept 3 degrees of “adjustment” with no worries at all. To accept that GISTemp is “a perfect filter” is, simply, “nuts.” It was a good enough answer at Bastogne, and it applies here too.

Smith, who had a family member attached to the 101st Airborne at the time, refers to the famous line from the 101st commander, U.S. Army General Anthony Clement McAuliffe, who replied to a German surrender ultimatum during the December 1944 Battle of Bastogne, Belgium, with a single word: “Nuts.”

And that’s exactly what we’d be were we to surrender our freedoms, our economic growth, and even our simplest comforts to duplicitous zealots before checking and double-checking the work of the prophets predicting our doom should we refuse.

Marc Sheppard is environment editor of American Thinker and editor of the forthcoming Environment Thinker.

crosspatch

You know, there is nothing in this article that hasn’t been published at least twice before. It is an absolute crime that nothing comes of it. And this article, too, will be ignored by those in a position to do something about it.
Global warming is, indeed, man made … at GISS and CRU.

Nick Stokes

This “dropping cooler sites” is a silly meme. Whether true or not, what is used from the sites is anomaly data. Whether the mean is cooler or not does not affect the reported warming trends.

Doug in Seattle

I am not sure how this dropping of cooler stations works. It is my understanding that when NOAA or GISS homogenizes for missing readings or stations, they reach out to the nearest stations up to 1200 km distant and do their corrections based on the anomaly of the other stations – not the absolute temperature, as seems to be implied by D’Aleo and Smith.
While this doesn’t deal with UHI (which I think is quite important), it should not necessarily inject a warm bias unless they are cherry-picking the nearest stations for UHI. But that doesn’t appear to be what D’Aleo and Smith are alleging.

Baa Humbug

So let’s get this straight. Is the following hypothetical example correct?
3 stations measure 11 deg, 12 deg, and 13 deg, averaging 12 deg.
Then we drop the first 2 “cooler” ones, leaving us with the third at 13 deg – which by itself is 1 deg above the average.
You gotta hand it to them: they found a way to make a station read warming against itself just beautifully. (Except they got caught.)

Michael

This article warrants the widest possible dissemination, in my opinion. The more the climate “gate” opens, the more people will pass through it.
We haven’t been having unprecedented warming, but we’ve sure had some unprecedented temperature reporting, have we not?

latitude

Measuring temperature is easy,
you just use the Alice in Wonderland homogeneity bias
One pill makes you larger
and one pill makes you small

Hey Skipper

Central Alaska has a good, continuous, UHI free temperature record.
Conclusions:
— over the last 30 years, there has been 3 deg F of warming
— over the last 80 years, 0.5 F.
— essentially all detectable variation is directly related to the PDO.
Clearly, adjustment is required. Where data collides with theory, so much the worse for data.

Look at the results of NOAA’s “adjustments” of the raw temperature data: click [takes a few seconds to load]
This is a chart showing the reduction in the number of temperature stations: click. Most of the reduction was in rural stations – leaving the urban stations located on airport tarmacs, next to air conditioning exhausts, etc. Obviously this will skew the global temperature record higher than it really is.
And here we can see a graphic of the radical reduction in the number of temperature stations that compile the global temperature record: click
With $Trillions in higher taxes being demanded to “fight global warming,” an accurate temperature record is required. As we can see, NOAA “adjusts” the record upward by using very questionable methodology. An impartial investigation is necessary, with all sides of the debate fully involved.

Deanster

Great article … too bad it will not get any legs outside of the skeptic community.
I mean … “The American Thinker” … you might as well have said that Rush read this on his show, … thus, it must be rubbish.

anon

Here’s a graph from Joseph D’Aleo’s site showing how clearly the dropout affected the temperatures:
http://icecap.us/images/uploads/Stationdropout.jpg
The only thing I don’t understand is the scale, as it’s recreated here by Ross McKitrick (from the source data indicated by Joseph) but with a different station count on the vertical scale
http://www.uoguelph.ca/~rmckitri/research/nvst.html

Peter G

Puts a whole new meaning into ‘human-produced global warming’ doesn’t it?
Pete

photon without a Higgs

originally cool, as Bolivia contains proportionately more land above 10,000 feet than any other country in the world – with hot ones available in neighboring stations on a beach in Peru or somewhere in the Amazon jungle.
It’s so cold in the Amazon jungle you’d think you’re on a snowy mountain.

It’s the NOAA adjustments that really need to be looked at. They raise the anomalies by 0.6 degrees. The total anomaly (including adjustments and everything else) is 0.7 degrees.
See this NOAA graph:
http://www.ncdc.noaa.gov/img/climate/research/ushcn/ts.ushcn_anom25_diffs_urb-raw_pg.gif

photon without a Higgs

…demand such an investigation.
….I have no recollection Senator….

photon without a Higgs

Nick Stokes (18:29:14) :
This “dropping cooler sites” is a silly meme. Whether true or not, what is used from the sites is anomaly data. Whether the mean is cooler or not does not affect the reported warming trends.
Funny that they didn’t drop the stations most affected by UHI, which causes the anomaly to be unnaturally high.

richard verney

The short point is that if you do not use exactly the same data sets throughout in the calculation of the entire temperature record, you cannot even show a trend. Thus station drop-outs render the temperature record next to meaningless, since any apparent trend is a factor (to a greater or lesser extent) of station drop-out.
Would not any serious scientist seek to compile the record from the purest data available, i.e. data sets requiring no adjustment for UHI, changing land use etc.? This would lead one to reject urban station sets and instead use rural data sets.
In fact, the entire idea of creating an artificial global average temperature record is absurd, given that global warming is not a global problem. Some countries will benefit; for others it will be a problem (to a greater or lesser degree). The melting of Arctic/Antarctic ice and glaciers depends more upon their own micro-climatic conditions than on some notional global average. With the passing months, it is becoming ever more apparent that manmade global warming is little more than manmade manipulation of data.

Michael

[snip OT, and old news, we covered Colemans video here already]

photon without a Higgs

Nick Stokes (18:29:14) :
You think they dropped this one Nick?
http://joannenova.com.au/globalwarming/photos/surface-stations/tucson_arizona-labelled.jpg

photon without a Higgs

Funny how this was started in 1989, right after James Hansen gave his infamous 1988 presentation before Congress.
Maybe you had to help your predictions, hey James?

You really have to ask what is the reason these people would do this and how did they get it to this scale.
Some say control and power and some say grants and money. Whatever – it is almost unbelievable that it could go this far. The scale is breathtaking when you include NOAA and NASA. Maybe those moon pictures were fake – LOL.

Nick Stokes (18:29:14) :
“This “dropping cooler sites” is a silly meme. Whether true or not, what is used from the sites is anomaly data. Whether the mean is cooler or not does not affect the reported warming trends.”
As I read this, they were saying that the anomaly can be affected if warmer stations are extrapolated to an area (up to 1200 klicks away) where readings are no longer used, thus raising the anomaly for that area – especially true if the areas transposed are affected by improper UHI adjustments.
Also, if rural areas are dropped and UHI is not correctly accounted for, this could raise the anomaly.
Also, at times I hear reports that the global average temp was such and such for a given year, and so even if it is not in the IPCC papers, it (the global mean temperature) can be used politically for misinformation.

Leon Brozyna

The inertia of AGW is such that it will take more than a single event, such as the leaking of emails and documents at CRU, to effect significant change in the direction of climate change theory.
The ‘independent’ reviews at UEA and PSU, even allowing them the benefit of the doubt and assuming they will vigorously hunt for the truth, will never be enough to turn around the state of climate science.
The only way to rip the lid off the messy state of climate science is to keep pressing for full disclosure. Let the first tier investigations proceed, knowing that other investigations are set to proceed (investigate the investigators?).
While this article by Marc Sheppard is nothing ‘new’, it further reveals the nature of the problem and advances what Meteorologist Joseph D’Aleo and computer expert E. Michael Smith have so far uncovered and discussed. This serves to continue to raise the pressure, thereby inviting further investigations.
It will take many years to truly reconstruct climate science into a viable field of study and rid it of that silly notion that mankind has a major impact on climate. Climategate and its immediate fallout is but the first step of a long process.

Steve in SC

These adjusters need to be tried and executed as soon as possible.

photon without a Higgs

Nick Stokes (18:29:14) :
Well let’s do this, Nick:
Let’s go back to 1989 when all this started, take all the data that was dropped up until today, and use it as GISTemp product instead of the data that wasn’t dropped. And let’s just drop all the data that was used by GISS. According to you they are interchangeable.
Want to do it?

I’d like to see that graph. And I am sure you would have no problem with it too since you call this whole thing a ‘a silly meme’.

Why make it so complicated? Use rural stations – you could probably still have more records than they now use – plot them on overlapping time grids, and get your anomaly estimates for each decade based on the number of available stations for that decade. KISS could really help. Then we could argue about the quality of the rural stations, but not about all these (what appear to be) unnecessary adjustments.
Produce a second record for the mean of all the long-term rural temps. I think we would then have a better anomaly record, and a mean temp record.
Then we could get to the point of factoring in mean relative humidity for the same time period (it sounds difficult, and IMV is more difficult than it sounds).
Then we could calculate how much of whatever warming we found was at night versus day.
Also, I understand that exactly how all these adjustments are made is not fully disclosed. Is this correct?

chili palmer

As to GISS being even more influential than its UK counterparts, this from an article re Hansen’s warmest-decade story seems to say the UK has few if any stations in the Arctic and Antarctic. That leaves GISS as the main source for disaster scenarios at the poles. I wouldn’t take any single person’s word about such a thing:
“Other research groups also track global temperature trends but use different analysis techniques. The Met Office Hadley Centre in the United Kingdom uses similar input measurements as GISS, for example, but it omits large areas of the Arctic and Antarctic where monitoring stations are sparse.” – escience, 1/21/10
As to why they give all the bad info, it’s money and politics; climate is among the biggest businesses in the world now – all based on nothing. Carbon trading is the most profitable division of investment banks in Europe. The US has had 20 years of weak leadership, so we just rolled over for UN thugs. There are 5 climate lobbyists for every congressman in DC.

Thank you, Marc Sheppard, for keeping the spotlight of public attention focused on the global climate warming scandal.
I have personally witnessed a steady decline in the integrity of federally funded research since 1960.
Over the past 50 years, scientists have been trained with grant funds the way Pavlov’s dogs were trained with dog biscuits.
The National Academy of Sciences (NAS) uses its power of review over the budgets of NASA, NOAA, NSF, DOE, etc. to direct funds to researchers who will find evidence for favorite NAS opinions: Anthropogenic Global Warming, Oscillating Solar Neutrinos, the Standard Solar Model, etc., ad infinitum.
Beneath the Climategate iceberg is a half-century of distortion and data manipulation by those who control the purse strings of NASA, NOAA, NSF, DOE, etc.
What a sad state of affairs for science,
Oliver K. Manuel
Former NASA PI for Apollo

anon

I wonder if a volunteer team of several hundred, like Anthony’s surface station project, could plow their way through each station record as was done for Darwin Airport.
It might be quite technical, and everyone may not have the skills to understand everything, but perhaps there are preliminary analyses that can be done to get some idea of what the state of the records is. And with enough people it may not take too long, and we’d have the final and definitive answer on this entire mess.

rbateman

What GISS and NOAA have done is to write a program and implement it to make any weather anomaly they want.
With UHIs as high as 10 deg F, they control the horizontal and the vertical, and that’s how you get whole regions mysteriously showing warmer than ever before when the opposite is true in the real world.
Yes, they need to have their clocks cleaned.
There is only 1 way, and 1 way only, to clean up this stinky mess.
Pull up all the data from the stations they don’t use anymore. Most of them are still recording. Verify with printed material of the times.
Nothing that GISS or NOAA has been producing is beyond suspicion.

Nick Stokes, (I feel like I’m piling on at this point), the comments against your point are valid. Especially when the “anomaly” is measured from some baseline, and the “current value” is taken from a known, UHI-enhanced, hotter location extrapolated over into an area that is actually much cooler.
I especially like Photon’s offer – let’s just swap the dropped stations for the retained ones, and run the numbers again! What say you?

kuhnkat

Nick Stokes and others,
thanks for the apologetics.
When you toss out stations at higher elevation and higher latitude you are generally tossing out research stations, smaller villages and towns. When you look at what is left and find there is actually a high percentage of airports and larger towns with virtually no rural stations, yes, you ARE gridding with high trend sites!!
But then, you KNEW that and STILL wanted to apologise for James “coal trains are death trains” Hansen and the rest!! But thanks for dropping by Nick!!! Nice to see you making the rounds and making yourself look ignorant again!!

Margaret

The Carbon Pollution Reduction Scheme legislation is due to be reintroduced into the Australian Parliament in February – that is, if Mr Rudd follows through with his threat to reintroduce the bill and thereby create a ‘trigger’ for a double dissolution of both houses of Parliament and call an early election. I have sent a copy of this article to the Senators who voted in favour of the bill last year. At least they cannot say they did not know this fraud/conspiracy was going on.

igloowhite

For sure the New York Times will be all over this story.
What would we do without the New York Times telling the truth.

Nick Stokes

David (19:35:41) :
“thus raising the anomaly for that area, “

Anomalies don’t work that way. An anomaly is calculated for each individual site. They are then aggregated. The purpose is exactly to avoid the issue spuriously raised here. For climate purposes you only aggregate deviations from the individual site average. Then it doesn’t matter whether the mean temperatures represent the average mean for the area.
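A toy illustration (invented numbers, nothing like GISS’s actual program): two sites with very different means but identical trends produce identical anomalies, so dropping the cooler one changes nothing.

```python
def anomaly(series, base_years):
    # Each site is compared with its OWN baseline, not a shared one.
    base = sum(series[y] for y in base_years) / len(base_years)
    return {y: t - base for y, t in series.items()}

years = range(2000, 2010)
warm = {y: 15.0 + 0.02 * (y - 2000) for y in years}  # mean ~15 C
cool = {y:  2.0 + 0.02 * (y - 2000) for y in years}  # mean ~2 C, same trend

base = range(2000, 2005)
print(round(anomaly(warm, base)[2009], 2),
      round(anomaly(cool, base)[2009], 2))  # 0.14 0.14
```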

Orson

Here’s the data set I would like to see plotted:
SINCE radiosonde (weather balloon) data have been used to validate MSU data sets, which we have going back over 30 years, THEN find the ground data that corresponds to these release points.
THIS independently validated surface temperature data set can be plotted. Next, of course, adjustments for UHI, station relocations, and other factors may need to be performed.
Overall, I think this should be the World Temperature – not the phony ones we’ve put up with for so many years.

Nick Stokes

photon without a Higgs (19:44:41) :
“I am sure you would have no problem with it too”

Yes, not much. The data was probably dropped for a reason, but as long as it is reasonably OK, the fact that it came from a cooler area (if that is indeed the trend) won’t matter.

Certified Consulting Meteorologist Joseph D’Aleo and computer expert E. Michael Smith have done an outstanding job. Marc Sheppard’s report is well-written, too. He’s trying to explain it to people who have not been following this work, as many WUWT-ers have been.
I’m glad to see D’Aleo and Smith (and Watts et al.) getting some credit where it is certainly due. I also anxiously await the full report. It will cause a stir immediately, but it will also resonate for years. It may well mark the tipping point. Climategate-zilla.

philincalifornia

Nick Stokes (18:29:14) :
This “dropping cooler sites” is a silly meme. Whether true or not, what is used from the sites is anomaly data. Whether the mean is cooler or not does not affect the reported warming trends.

How about they were thrown out because, in contrast to your theory, they did affect the reported warming trends …. downwards ?? In fact, I thought a warmista meme was that, because of the lack of water vapor at the poles (sites with a distinctly cooler mean), anthropogenic CO2-induced warming (as in the temperature anomaly) would be more pronounced ?? Did that one die a death ??
Science has a nasty habit of biting theories in the ass, so I will keep an open mind on this until you, Nick, show me the data that supports your conclusion.

AJB

Latest Pachauri article in the Times … most viewed on-line
UN climate change expert: there could be more errors in report
Falling apart at the seams, just look at those comments … LOL !!

Matt O

Nick Stokes (18:29:14) :
This “dropping cooler sites” is a silly meme. Whether true or not, what is used from the sites is anomaly data. Whether the mean is cooler or not does not affect the reported warming trends.
Since measuring anomalies from fewer locations is as accurate as measuring anomalies from more locations, why don’t they simply measure from one location?

jerry

Using rural stations only has its own share of problems. You can get regional-scale changes similar to the UHI effect. For instance:
– Change of land use by making bigger farms by merging smaller farms, typically with removal of vegetation and fencing to make it more economical to do the farming with big machines.
– Forest clear-felling.
– Introduction of irrigation.
– Change in plant characteristics – e.g. wheat plants these days are significantly shorter than years ago.
– Use of artificial fertilizers as compared to older style crop rotation.
– Change in preferred crop – e.g. growing corn for bioethanol.
Now I don’t know the exact numeric effect of these changes, but I do know they will change the regional micro-climate by mechanisms such as
– changing the surface albedo (incoming and outgoing radiation balance)
– changing the surface turbulence ( i.e. mixing of near ground and higher air)
– changing the amount of particles in the air (cloud formation)
– change in the local evaporation rate (cloud & humidity)
Perhaps it is not sensible to correct for any of these or UHI? Perhaps a better solution is to take the figures as they are and recognize the simple truth of “yes it’s hotter in town now than 20 years ago”. That is the ground truth for the residents and what actually affects their lives. It is no comfort to know what caused it, just that it is so.
The secondary interest is whether the world is heating up or not. The issue is how to take a sparse and spotty set of measurements and fairly apportion them to some grid that can then be compared over time.
My naive approach would be to take all measurements, uncorrected, calculate the reference temperature and anomaly at each station, and then calculate gridded values of the reference temperature & anomaly using a geometric averaging process from relatively nearby stations, with a weighting factor that discounts multiple stations close together – e.g. the 10 weather stations in a city compared to the 5 nearby country stations. Obviously it would need to be a bit cleverer than that to handle stations starting and stopping, etc.
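A first cut at that weighting might look like the sketch below – every number in it is invented, the distances are crude, and it ignores stations starting and stopping, but it shows the clustering discount:

```python
import math

def grid_value(cell, stations, max_km=500.0, cluster_km=25.0):
    """stations: (lat, lon, anomaly) tuples. Inverse-distance weights,
    discounted for stations with many close neighbors so a city's 10
    stations don't swamp the 5 nearby country ones."""
    def dist(a, b):
        # Roughly 111 km per degree; good enough for a sketch.
        return math.hypot(a[0] - b[0], a[1] - b[1]) * 111.0

    total_w = total = 0.0
    for lat, lon, anom in stations:
        d = dist(cell, (lat, lon))
        if d > max_km:
            continue
        # Count close neighbors (includes itself, so divisor >= 1).
        cluster = sum(1 for la, lo, _ in stations
                      if dist((lat, lon), (la, lo)) < cluster_km)
        w = 1.0 / ((1.0 + d) * cluster)
        total_w += w
        total += w * anom
    return total / total_w if total_w else None

city = [(40.0 + i * 0.01, -90.0, 1.0) for i in range(10)]  # clustered, warm
country = [(41.0 + i, -90.0, 0.2) for i in range(5)]       # spread, cooler
print(grid_value((40.5, -90.0), city + country))  # ~0.5, not the city's 1.0
```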

Baa Humbug

I’d rather get my info from the horse’s mouth.
J Hansen said the following in an Oz TV interview with T Jones.
TONY JONES: Okay, can you tell us how the Goddard Institute takes and adjusts these global temperatures because sceptics claim that urban heat centres make a huge difference; that they distort global temperatures and they make it appear hotter that it really is.
So do you adjust, in your figures, for the urban heat zone effects?
JAMES HANSEN: We get data from three different sources and we now, in order to avoid criticisms from contrarians, we no longer make an adjustment. Even if we see there are eight stations in Alaska and seven of them have temperatures in the minus 30s and one of them says plus 35, which pretty obvious what happens, someone didn’t put the minus sign there, we just, we don’t correct that.
Instead we send an email or letter or a letter to the organisation that produces the data and say, you’d better check the Alaska temperatures, because we don’t want to be blamed for changing anything. But as far as adjusting for urban effects, we have a very simple procedure.
We exclude urban locations, use rural locations to establish a trend, and that does eliminate – though urban stations do have more warming than the rural stations, and so we eliminate that effect simply by eliminating those stations, but it’s very clear that the warming that we see is not urban, it’s largest in Siberia, and in the Arctic and the Antarctic, and there aren’t any cities there, and there’s warming over the oceans, there are no cities there. So it’s not urban warming that’s just nonsense.
So he says they drop the UHI stations “to establish a trend”
What happens when you drop “warm” ones? The trend is cool, so current temps will more than likely read warmer against the trend.
Alternatively, if he meant they drop UHI stations altogether, we know this is false, they did the exact opposite.

Steve Schaper

And now, apparently, Hansen is on record as desiring the destruction of all cities, and the murder of billions of people.
Aren’t there facilities for people like that?

Michael Larkin

This kind of analysis is something that everyone can understand, even the layman (maybe even politicians!). It has a strong appeal to common sense. If one could only get a mainstream newspaper to publish something like it!
On another point: that word “meme”. I’ve noticed it is very often used in the sense of something (obviously false) that is parroted by a person one happens to disagree with. Though essentially a crude and pejorative insult, it masquerades as sophisticated superiority. “Meme” has become a meme in its own right, and, ironically, parodies itself.
People who utter it often try to package up that which they disapprove of, colour it black, and toss it in the stupid bin. It doesn’t matter what is correct or incorrect in a post-modern world, does it? Everything is relative; my truth is whatever I choose it to be, and whatever I can claim the largest number (real or fabricated) support. As if truth bowed its head to mere numerical superiority.
Observe countless climate-related comments: “meme” usually implicitly declares the utterer as having little interest in intellectual enquiry. The world is divided between what resides in the stupid bin, and what not: but the latter may also contain memes. You know, the “approved” ones nobody wants to admit are memes because they are bad things only someone else ascribes to, right?
I wonder what it must be like to inhabit this world of memes, one set coloured white, and the other black. The very thought of it makes me shudder. How could I live without my doubts, my need to explore and occasionally be surprised and delighted with what I find, whether or not it gainsays things I previously thought to be true?
“Meme” folk don’t know what they are missing. Things are so much more uncertain, and therefore interesting, than they realise.

Baa Humbug

Further to my post above, the US has the most modern and comprehensive temp. data of any region in the world. When S McIntyre found an error, GISS corrected their data set. When questioned about the error, Hansen said it was only from 2000-2006 and only 0.15°C (0.15 over 7 yrs equates to 0.21 over ten years, or 2.1 per century – nearly all of the supposed scary warming).
He also stated that the contiguous US was only 2% of the globe’s land mass, so the error had no significance.
I contend that if the gold standard in data was wrong by 0.21 per decade, what hope is there that the rest of the global data (not of US standard) is remotely close to accurate?
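Spelling out that scaling:

```python
error_c = 0.15                    # Hansen's stated error over 2000-2006 (7 yrs)
per_decade = error_c / 7 * 10
print(round(per_decade, 2))       # 0.21 C per decade
print(round(per_decade * 10, 1))  # 2.1 C per century
```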

Oliver Ramsay

Nick Stokes (18:29:14) said:
“This “dropping cooler sites” is a silly meme. Whether true or not, what is used from the sites is anomaly data. Whether the mean is cooler or not does not affect the reported warming trends.”
——-
With just a casual observation that “silly meme” is a fatuous tautology, I’ll ask for an explanation of how an opinion that you concede might be true can still be described as a “silly meme”.
A meme that I find silly is that every objection made to the methods, models or mangled data of the climate orthodoxy can be dismissed by the faithful as a “silly meme”.
You didn’t choose to attempt to show us in what way this cogently written article erred. Bolivia would be a good place to start.

John F. Hultquist

I’ve just read through the first 26 comments and missed (if it is there) mention of all the “How not to measure temperature” posts that WUWT has had. I think Anthony and the volunteer crew of surface station scouts have shown that these data being collected and manipulated are not acceptable for the use to which they are being put, namely precision of fractions of a degree. They are useful for local weather reporting and classification of regions (see Köppen), as long as one is interested only in general aspects. The “global warming scam” takes these data way beyond their possibilities.

rbateman

‘We exclude urban locations, use rural locations to establish a trend, ‘
We golly be, if you have previously dropped the rural stations, and then you go and drop the urban stations… we didn’t do nuthin’. Must have been somebody else.