More on John Coleman's Special tonight – KUSI press release says NASA improperly manipulated data

UPDATE: See John Coleman’s hour-long news special “Global Warming – The Other Side”, now online in all five parts here.

via SpaceRef.com

PRESS RELEASE

Date Released: Thursday, January 14, 2010

Source: KUSI-TV

Climate researchers have discovered that NASA researchers improperly manipulated data in order to claim 2005 as “THE WARMEST YEAR ON RECORD.” KUSI-TV meteorologist, Weather Channel founder, and iconic weatherman John Coleman will present these findings in a one-hour special airing on KUSI-TV on Jan. 14 at 9 p.m. A related report will be made available on the Internet at 6 p.m. EST on January 14th at www.kusi.com.

In a new report, computer expert E. Michael Smith and Certified Consulting Meteorologist Joseph D’Aleo discovered extensive manipulation of the temperature data by the U.S. Government’s two primary climate centers: the National Climatic Data Center (NCDC) in Asheville, North Carolina, and the NASA Goddard Institute for Space Studies (GISS) at Columbia University in New York City. Smith and D’Aleo accuse these centers of manipulating temperature data to give the appearance of warmer temperatures than actually occurred by trimming the number and location of weather observation stations. The report is available online at http://icecap.us/images/uploads/NOAAroleinclimategate.pdf.

The report reveals that there were no actual temperatures left in the computer database when NASA/NCDC proclaimed 2005 as “THE WARMEST YEAR ON RECORD.” The NCDC deleted actual temperatures at thousands of locations throughout the world as it changed to a system of global grid points, each of which is determined by averaging the temperatures of two or more adjacent weather observation stations. So the NCDC grid map contains only averaged, not real temperatures, giving rise to significant doubt that the result is a valid representation of Earth temperatures.
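To make the gridding step described above concrete, here is a minimal sketch (in Python, with invented station readings and an arbitrary 5-degree cell size; an illustration of the general approach, not NCDC’s actual code) of how individual station temperatures get replaced by cell averages:

```python
# Illustration only: collapse individual station readings into grid-cell
# averages. Station locations and temperatures are invented, and the 5-degree
# cell size is an assumption, not NCDC's actual gridding scheme.
from collections import defaultdict

# (latitude, longitude, monthly mean temperature in deg C), hypothetical values
stations = [
    (34.1, -117.3, 18.2),
    (35.6, -118.9, 16.7),
    (36.2, -115.1, 21.4),
    (61.2, -149.9, -2.3),
]

def grid_cell(lat, lon, size=5.0):
    """Return the lower-left corner of the grid cell containing (lat, lon)."""
    return (size * (lat // size), size * (lon // size))

cells = defaultdict(list)
for lat, lon, temp in stations:
    cells[grid_cell(lat, lon)].append(temp)

# Each cell keeps only the average of whatever stations fall inside it;
# the individual station readings no longer appear in the gridded product.
gridded = {cell: sum(temps) / len(temps) for cell, temps in cells.items()}
for cell, avg in sorted(gridded.items()):
    print(cell, round(avg, 2))
```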

The number of actual weather observation points used as a starting point for world average temperatures was reduced from about 6,000 in the 1970s to about 1,000 now. “That leaves much of the world unaccounted for,” says D’Aleo.

The NCDC data are regularly used by the National Weather Service to declare a given month or year as setting a record for warmth. Such pronouncements are typically made in support of the global warming alarmism agenda. Researchers who support the UN’s Intergovernmental Panel on Climate Change (IPCC) also regularly use the NASA/NCDC data, including researchers associated with the Climate Research Unit at the University of East Anglia that is now at the center of the “Climategate” controversy.

This problem is only the tip of the iceberg with NCDC data. “For one thing, it is clear that comparing data from previous years, when the final figure was produced by averaging a large number of temperatures, with those of later years, produced from a small temperature base and the grid method, is like comparing apples and oranges,” says Smith. “When the difference between the warmest year in history and the tenth warmest year is less than three quarters of a degree, it becomes silly to rely on such comparisons,” added D’Aleo, who asserts that the data manipulation is a “scientific travesty” committed by activist scientists to advance the global warming agenda.

Smith and D’Aleo are both interviewed as part of a report on this study on the television special, “Global Warming: The Other Side” seen at 9 PM on January 14th on KUSI-TV, channel 9/51, San Diego, California. That program can now be viewed via computer at the website http://www.kusi.com/. The detailed report is available at http://icecap.us/images/uploads/NOAAroleinclimategate.pdf.

122 Comments
January 15, 2010 6:22 am

E.M.Smith (02:19:14) :

Yes. I’ve identified a couple of core patterns. One is that anything at altitude gets removed from the recent part of the dataset. For example, Japan now has no thermometer above 300 m. No mountains need apply. Similarly the Andes are deleted (from the recent part of the record).
Then see if there is a difference between the temperature trend of Set A and Set B up to 1985. In other words, is there any sign that the stations to be maintained have been pre-selected based on the temperature trend?
It would be interesting to do that. But it would take a lot of time and effort. Right now I have to split my time between temperature stuff and “making ends meet”, so I only get to put a little time into temperature stuff. But someone with the time could do this fairly easily.

I think this is stunning stuff. Far too much to be held back through a lack of funds.
Myself, I have no time to spare, but a little money. Having said that, I can process data, being a code monkey. Others may have time. Can we donate to the cause to help this along?

January 15, 2010 6:26 am

Multiplied by a column of air about 30,000 feet deep, that is approximately one thermometer for every volume of atmosphere equal to about 1,000 close-packed Mt. Everests. I suppose one could estimate the uncertainty by measuring the variations in temperature among 7,000 thermometers spread across the State of Louisiana. I imagine the uncertainty is too high to warrant substantive conclusions, let alone spending trillions of dollars to abate CO2 emissions.

Pascvaks
January 15, 2010 6:36 am

Ref – Turboblocke (04:44:35) :
“Perhaps the potential defendants should have read these documents before making their defamatory remarks…”
________________
I’m sure NOAA has gigathousands of stations in mothball status that are, and have been since 1818, silently sampling temperature and humidity and air pressure data – oooops, nearly forgot wind direction. The problem is that they have selectively chosen which devices to use in devising their data. Or they have been politically manipulating and creating data with the aim of achieving particular outcomes. Proving you use data is easy. Proving your data is credible is another story. Proving you’re the ‘trustworthy government agency’ the American people always thought you were can be near impossible. When it’s all said and done, you’re only as good as your reputation.

Allen
January 15, 2010 7:15 am

You know, the National Archives and Records Administration (NARA) has guidelines on what is an official record and what is not – and raw data that hasn’t been manipulated is generally considered an official record (kind of like the chain-of-evidence thingie in the judicial arena). There is probably a very good legal case against the people responsible for the data. My $.02

paul jackson
January 15, 2010 7:30 am

Non Gradus Anus Rodentum! -> not worth a rat’s ass

Ira
January 15, 2010 7:30 am

I’ve read all 215 pages of NASA GISS emails at Judicial Watch. Thanks to Steve McIntyre at Climate Audit for finding the original error in 2007 and writing courteous emails with very specific and reasonable requests. James Hansen and others at GISS, in their internal emails and emails to friendly reporters, call him a “court jester” and question whether he has “a light on upstairs?” His light is BRILLIANT! The best court jesters let the King know the truth in a valuable way that didn’t get their heads cut off :^)
Makiko Sato, the author of the email with the seven versions of the 1934 vs 1998 temperature anomaly data I graphed, and who is mentioned by Jim Hansen in the email excerpt below, appears to be the innocent truth teller at GISS. She may turn out to be the heroine of this story! (Along with the hero “court jester”, Steve McIntyre!)
Jim Hansen writes [2007-08-10 at 11:59 -500]: “The appropriate response is to show the curves for U.S. and global temperatures before and after (before and after McIntyre’s correction). Makiko doubts that this is possible because the earlier result has been “thrown away”. We will never live this down if we give such a statement. … By the way, I think that we should save the results of the analyses at least once per year …”
(If any of you downloaded my PowerPoint Show on Explaining Climategate, I’ve just updated it with the info just made available yesterday so you may want to download the newer version.)

Turboblocke
January 15, 2010 7:33 am

CharlesRKiss: LAND SURFACE stations. Think about it.
If you measure your own temperature at one point, you don’t know what your average temperature is. But you can tell when your temperature goes up. If you measure your own temperature at fifty different points you still don’t know what your average temperature is, but you can still tell that you’ve got a fever if the average of the measurements goes up.
Pascvaks: I don’t think that you’ve checked the data have you?

Pascvaks
January 15, 2010 8:03 am

Ref – Turboblocke (07:33:51) :
“Pascvaks: I don’t think that you’ve checked the data have you?”
__________________
What data? The original? No. The adjusted? No. The political? No. The theoretical? No. The selective? No. The tweaked? No.
Your question suggests that we would agree about something if I did. You’re probably right in some respect, I’ll admit that; but is that the point of all this? Isn’t there a question in your mind as well: “Can the ‘whatever’ data be trusted?” If you say yes, I’ll understand where you stand. If I say no to the same question, will you understand where I stand?

Krishna Gans
January 15, 2010 8:22 am

This story is based on / connected to this info:
http://www.uoguelph.ca/~rmckitri/research/nvst.html

beng
January 15, 2010 8:28 am

********
Tom P (17:45:23) :
Dr. Bob (17:25:02) :
This might help:
http://discover.itsc.uah.edu/amsutemps/execute.csh?amsutemps+002
With a value of 0.79 C we’ve just overtaken the monthly peak of the 1998 Super El Niño. We appear to be seeing a massive release of stored thermal energy in the Pacific.
After 1998 temperatures did not settle back to their values before the El Niño. If the same happens again it will be very difficult to explain how just natural processes could have added so much energy.

********
Not quite as difficult as explaining how temps naturally rose 6C globally in a few thousand yrs at the beginning of this interglacial. Or the natural temp changes during any interglacial/glacial transition for that matter.

January 15, 2010 8:46 am

beng (08:28:49) :
********
Tom P (17:45:23) :
With a value of 0.79 C we’ve just overtaken the monthly peak of the 1998 Super El Niño. We appear to be seeing a massive release of stored thermal energy in the Pacific.
After 1998 temperatures did not settle back to their values before the El Niño. If the same happens again it will be very difficult to explain how just natural processes could have added so much energy.
********
Not quite as difficult as explaining how temps naturally rose 6C globally in a few thousand yrs at the beginning of this interglacial. Or the natural temp changes during any interglacial/glacial transition for that matter.

Tom P also fails to notice that the energy release is happening worldwide, and because it’s not concentrated just in the Pacific, there isn’t a high water-vapour concentration (humidity) holding heat in. That’s why he’s utterly wrong to say the temperature won’t fall further after the current Modoki El Niño. Wait 12 months and see.

kwik
January 15, 2010 8:56 am

Tom P (17:45:23) :
“If the same happens again it will be very difficult to explain how just natural processes could have added so much energy.”
Not to mention how difficult it would be using scientific methods to show how un-natural processes could do it.
/Humour_on
Can anyone tell me who this “al” is whenever they mention Mann et. “al” ? Al Gore?
/Humour_off

mpaul
January 15, 2010 8:56 am

E.M. Smith — you clearly have cause and effect reversed. It is global warming that is causing the thermometers to go extinct. Get with the program.

January 15, 2010 2:25 pm

Turboblocke (07:33:51) :
If you don’t know the temperature, because of the uncertainty, you can’t tell whether it has increased, especially if the error is greater than the measured increase. That is what the error is for; it’s not just a number tacked on at the end of a measurement!
Sorry, Turbo, the error is too high.

January 16, 2010 5:28 am

fishhead (16:41:41) :
With only 1000 weather observation points and about 150 million square kilometers of land mass, that’s about one observation point per 150,000 square kilometers. By comparison, New York state is approximately 140,000 sq.km.

There are probably something over 100 million men in the US – yet it would be possible to estimate the average height of the US male from a random sample of around 1000.
We are looking for a trend – not precise measurements. A global trend of, say, 0.6 deg over 30 years (0.2 deg per decade) could be detected with a lot fewer than 1000 site observations, providing those sites gave reasonable spatial coverage…
… and probably even if the measuring apparatus (thermometers) only measured to the nearest degree.
I think there’s a lot of misunderstanding over this issue.
[You can say that again! RT – mod]
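For what it’s worth, the sampling claim in the comment above is easy to check numerically. Here is a minimal sketch (Python, with an invented “height” population; the mean and spread are assumptions for illustration, not real anthropometric data) of how closely a random sample of 1000 tracks a population mean:

```python
# Rough numerical check of the sampling argument above. The population is
# synthetic (mean 175 cm, sd 7 cm); the point is the size of the sampling
# error for n = 1000, not the particular numbers.
import random
import statistics

random.seed(42)

population = [random.gauss(175.0, 7.0) for _ in range(1_000_000)]
sample = random.sample(population, 1000)

# Standard error of the mean for n = 1000 is roughly sd / sqrt(n), about 0.22 cm here.
print(f"population mean: {statistics.mean(population):.2f} cm")
print(f"sample estimate: {statistics.mean(sample):.2f} cm (n = 1000)")
```

The catch, raised by E.M. Smith further down the thread, is that this only works if the sample is genuinely random; a sample biased by altitude or latitude behaves very differently.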

January 16, 2010 9:26 am

… and probably even if the measuring apparatus (thermometers) only measured to the nearest degree.
I think there’s a lot of misunderstanding over this issue.


[You can say that again! RT – mod]
Are you suggesting that my understanding is at fault? If so, what are you referring to in particular?
[It was a general comment, I try not to get personal in mod comments. RT – mod]

January 16, 2010 10:22 am

Are you suggesting that my understanding is at fault? If so, what are you referring to in particular?
[It was a general comment, I try not to get personal in mod comments. RT – mod]

It’s ok – I didn’t take it personally. I don’t mind my comments being challenged.

sky
January 16, 2010 2:06 pm

As much as I enjoyed John Coleman’s presentation, the very crux of data manipulation was never properly brought to light. It lies not in the mean temperature levels of the stations employed, as E.M. Smith implied, but in the time-series of average anomalies produced from an ever-changing set of stations. That, and not the zonal temperature factor, is what provides an open door to obtain any “trend” you want. An invariant set of UHI-uncorrupted stations must be used throughout the entire time interval to obtain reliable time-series.
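A toy simulation makes sky’s point tangible: even when each station’s anomaly series has no underlying climate trend, letting a UHI-affected station join the averaging set partway through manufactures a trend that a fixed, clean set would not show. This is a minimal sketch with invented numbers, not a reconstruction of GISS/NCDC processing:

```python
# Toy illustration of the comment above: anomalies averaged over an
# ever-changing station set vs. an invariant clean set. All values invented.
import numpy as np

years = np.arange(1950, 2010)
rng = np.random.default_rng(0)

# Station A: rural, no underlying trend (noise around a 0 C anomaly).
a = rng.normal(0.0, 0.2, years.size)
# Station B: urbanising site; same climate plus a slow UHI drift.
b = rng.normal(0.0, 0.2, years.size) + 0.02 * (years - years[0])

# Invariant, UHI-free set: station A for the whole record.
clean = a
# Ever-changing set: A alone before 1980, the A+B average afterwards.
changing = np.where(years < 1980, a, (a + b) / 2)

def trend_per_decade(series):
    return 10 * np.polyfit(years, series, 1)[0]

print(f"invariant clean set: {trend_per_decade(clean):+.3f} C/decade")
print(f"changing set:        {trend_per_decade(changing):+.3f} C/decade")
```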

January 16, 2010 10:56 pm

I made a comparison of raw and adjusted GISS temperature data for several European meteo stations with a long data record (1880–now). My impression is that a lot of data is ‘adjusted’ to favour a warming trend. You can find the results on:
http://people.mech.kuleuven.be/~jpeirs/Climate/Temperature_comparison.html
One would expect a downward correction to compensate for the Urban Heat Island effect, but you see the opposite for cities such as Paris, Geneva, Zurich, Trier, etc. For Brussels (Uccle) they simply cut off the ‘inconvenient’ hot period before 1950.
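For anyone wanting to repeat that kind of comparison, a minimal sketch follows. It assumes two locally saved CSV files with `year` and `temp_c` columns; the filenames and layout are hypothetical, and GISS’s actual station files are formatted differently and would need their own parser:

```python
# Hypothetical sketch of a raw-vs-adjusted station comparison. The file names
# and the simple year,temp_c layout are assumptions for illustration only.
import csv

def load_series(path):
    """Read a year,temp_c CSV into a {year: temperature} dict."""
    series = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            series[int(row["year"])] = float(row["temp_c"])
    return series

raw = load_series("uccle_raw.csv")            # unadjusted annual means (hypothetical file)
adjusted = load_series("uccle_adjusted.csv")  # adjusted annual means (hypothetical file)

# A positive difference means the adjustment warmed that year relative to raw.
for year in sorted(set(raw) & set(adjusted)):
    print(year, f"{adjusted[year] - raw[year]:+.2f} C")
```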

gober
January 17, 2010 5:52 pm

E.M.Smith
Thanks for that post / those links.

E.M.Smith
Editor
January 17, 2010 5:59 pm

Turboblocke (04:44:35) : Perhaps the potential defendants should have read these documents before making their defamatory remarks in print: […]
About 7000 stations and quality control documented in 1997.

The key here is that the stations are deleted from the record AT a POINT IN TIME. Their past is left intact.
So a bunch of cold thermometers participate in setting the baseline, then in the present they have been taken out and shot. But no worries, we can fabricate an infill anomaly from 1000 km away at the major jet airports classed as “rural” and compare it.
So yes, all 7000 are “USED”, just some cold ones are used for the baseline and some warm ones are used for “now”. The count of PRESENTLY used thermometers (i.e. in 2009) in GHCN is 1500+ (it hit a low a couple of years back at 1470 or so), but they are compared to 5996 in 1970 (back when the world had cold mountains, Campbell Island in New Zealand down toward Antarctica, Northern Siberian stations in the USSR). You know, what they call “Baseline” stations…
John Finn (05:28:07) : There are probably something over 100 million men in the US – yet it would be possible to estimate the average height of the US male from a random sample of around 1000.
Yes, but the problem is that the 1500 current stations are not random. They are preferentially warm.
We are looking for a trend – not precise measurements. A global trend of say, 0.6 deg over 30 years (0.2 deg per decade) , could be detected with a lot less than 1000 site observations providing those sites gave reasonable spatial coverage…
Yes, just like if we were looking for a trend in male height in the USA we could, oh, take the average height of a sample of Irish immigrants in the early years (growth limited by famine) and compare it with a small group “randomly” selected from the NBA … and “find a trend”.
The problem with “the anomaly will save us” is that the baseline is an obvious cold period cherry pick and the “present” is a warm biased set from thermometer deletions.
Other than that, no problem.
But yes, they DO use both sets of thermometers. Just at different times and for different “uses”…
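E.M. Smith’s height analogy can be put in numbers with a short sketch. It uses an invented station-temperature population that never changes over time; sampling the cold half “then” and the warm half “now” still produces an apparent warming. This is only a toy version of the biased-sampling argument, not an analysis of GHCN itself:

```python
# Toy version of the biased-sampling argument above: a population that never
# changes, sampled with a cold bias "then" and a warm bias "now". Invented data.
import random
import statistics

random.seed(1)
population = [random.gauss(14.0, 8.0) for _ in range(100_000)]  # station climates, deg C

cold_biased = random.sample([t for t in population if t < 14.0], 1000)  # mountains, high latitudes
warm_biased = random.sample([t for t in population if t > 14.0], 1000)  # airports, low altitudes

print(f"'baseline' (cold-biased) mean: {statistics.mean(cold_biased):.2f} C")
print(f"'present'  (warm-biased) mean: {statistics.mean(warm_biased):.2f} C")
# The difference looks like warming even though the population never changed.
```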

E.M.Smith
Editor
January 17, 2010 6:11 pm

sky (14:06:54) : It lies not in the mean temperature levels of the stations employed, as E.M. Smith implied, but in the time-series of average anomalies produced from an ever-changing set of stations. That, and not the zonal temperature factor, is what provides an open door to obtain any “trend” you want. An invariant set of UHI-uncorrupted stations must be used throughout the entire time interval to obtain reliable time-series.
Remarkably well said!
mpaul (08:56:59) :
E.M. Smith — you clearly have cause and effect reversed. It is global warming that is causing the thermometers to go extinct. Get with the program.

So you are saying the cold thermometers have ‘caught a cold’ and died out? Oh Dear. The poor little things. You are right, I’d completely missed that 😉
