I’ve watched part 4, which had an early release. The video is cheering, and supported with a multitude of graphics and interviews. “Chiefio” aka E.M. Smith and Joe D’Aleo make strong appearances.

Here is the KUSI introduction:
A computer programmer named E. Michael Smith and a Certified Consulting Meteorologist named Joseph D’Aleo join the program to tell us about their breakthrough investigation into the manipulations of data at the NASA Goddard Institute for Space Studies at Columbia University in New York and the NOAA National Climatic Data Center in Asheville, North Carolina.
E. Michael Smith kept a blog of his findings. See his site by clicking here.
Joe D’Aleo has written a detailed report on the findings. It is available here.
I have written a blog about this important climate news development. It is available by clicking here.
D’Aleo wrote an outstanding article on Climategate. It is available here.
You can read about the English Climategate leaked or hacked files from the Climatic Research Unit at the University of East Anglia at this newspaper site.
And, there is a US connection with the original Climategate, as well. Professor Michael Mann, of Penn State University, is in the middle of it. Here is the latest on it.
All five parts of the video are now online.
Click below to watch each segment of the KUSI Special Report, Global Warming: The Other Side
David (04:53:18) :
See Bob Tisdale’s site here:
http://bobtisdale.blogspot.com
Viv Evans (05:39:36) :
They will use the extra CO₂ taxes to fund their research to find yet more environmental problems to tax us on.
rbateman (11:52:22) :
boballab (08:33:44) :
I find it refreshing that ncdc allows us to browse the directory here:
http://www1.ncdc.noaa.gov/pub/download
There’s an interesting 48-page PDF of FOIA-requested emails. I assume others know about this, as I’m behind the curve on all this. See http://www1.ncdc.noaa.gov/pub/download/Additional%20Emails%20Relating%20to%20%20FOIA%20Requests.pdf
Looks like some other interesting stuff as well.
Dr. Bob (05:59:14) :
Ok, he’s talking about station dropout in video 4. He says something to the effect of there being 4 stations in California. But when I look at Anthony’s distribution of stations on surfacestations.org, there are many more than 4 stations in California.
Is he saying that when it comes time to do the math, only data from 4 stations is used?
It’s actually a bit more complicated than that; but in 9 minutes of video for the general public you can’t put in every detail and caveat.
(Yes, I was up late dealing with ‘the flood’ [ in a good way 😉 ] and I’ll be working through the comments and questions now that I’ve got tea in hand.)
I have not yet gotten to watch the video, but what I said (hope it came through editing) was that “in GHCN” there are 4 stations. Now, up until about a month ago, only four survived in GIStemp as well. (They used the USHCN added to GHCN, but USHCN ‘cut off’ in May 2007.) As of a few weeks back, they swapped to USHCN Version 2 (which does carry forward more thermometers into the present). Problem solved? Hardly.
USHCN Version 2 has a new “adjustment” re-cooking method (“peer” reviewed, of course) that puts in lots of new warming tilt. See the blink chart link here:
Mike McMillan (17:28:30) :
I’ve completed USHCN vs USHCN version 2 blink comparison charts for Wisconsin. As with the Illinois charts, the majority of stations had their raw data adjusted to show more warming by lowering the temperatures in the first half of the 20th century.
That brings the raw data more in line with the GISS homogenized versions. I haven’t blinked the original GISS with the new homogenized charts yet, but I’d bet a nickel they’ll show even more warming.
Wisconsin original USHCN raw / revised raw data –
http://www.rockyhigh66.org/stuff/USHCN_revisions_wisconsin.htm
Illinois original raw / revised raw –
http://www.rockyhigh66.org/stuff/USHCN_revisions.htm
Revised raw data. Oxymoron?
So I had the choice of going down the rabbit hole of GIStemp in the past vs GIStemp of a couple of weeks ago vs all the temp series that use only GHCN vs…
For TV, you just stick in “GHCN has” – not the whole detailed rat’s nest of revised revisions of changed modified versions of the temperatures.
I figure that story of “They put the thermometers back in, but had to cook them first!!” will make a nice follow on story 😉
But back to the point made above:
GHCN still has only 4 on the beach in California. Yes, there are other thermometers in California and you can find the data, but GHCN is what all the major Global Temperature Series use. Furthermore, GIStemp used only GHCN from May 2007 until just a few weeks ago. I will happily publish a “gee, they DO use more as of December when they changed their code and process AGAIN” as soon as they publish a “Gee, we’re sorry we grossly misled the public for 3 years and tried to hide it with a swap of datasets last month. We are retracting all the hysteria-driven press releases about excess warming in the USA from that time period.”
So what I said is factually correct AND correct in spirit even for GIStemp.
FWIW, anyone who would like a bit of fame could most likely get it by taking the USHCN, the GHCN (both unadjusted and adjusted) and the USHCN.v2 and plotting all 4 values on the same chart. Then just ask one question: These are all published by NOAA / NCDC as correct and valid. So which one is the really valid one? (There are 3 F differences in some of them, to the warming side, naturally…)
For bonus points, add the GIStemp product to the mix for its 3 temperature steps… (the stuff downloaded from GISS: “raw GHCN + USHCN corrections”, “after homogenizing”, etc.)
Seven total lines. All different. All supposedly “correct” in some way. All different by more than the AGW signal.
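The mechanics of that comparison are simple enough to sketch. The numbers below are invented stand-ins for the four NOAA / NCDC products named above (not real data); the point is only how one would measure the disagreement before plotting:

```python
# Invented stand-ins for four published versions of one record;
# none of these numbers are real NOAA / NCDC data.
versions = {
    "GHCN unadjusted": [52.1, 52.2, 52.0, 52.3, 52.1],
    "GHCN adjusted":   [52.4, 52.6, 52.5, 52.8, 52.9],
    "USHCN":           [52.0, 52.1, 51.9, 52.2, 52.0],
    "USHCN v2":        [53.1, 53.3, 53.4, 53.6, 53.8],
}

# For each year, how far apart are the supposedly "correct" versions?
n_years = len(next(iter(versions.values())))
yearly_spread = [
    max(v[i] for v in versions.values()) - min(v[i] for v in versions.values())
    for i in range(n_years)
]
print(round(max(yearly_spread), 1))  # widest disagreement, in degrees F
```

From there it is one plot call per series (e.g. with matplotlib) to put all the lines on the same axes, and the spread becomes visible at a glance.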
One final point:
I’m now getting a few “fleas” that have decided to try chewing on me. Here is a quote from another site:
Well guess where the tortured numbers pop up again. Three points to the guy in the corner who said, gee, that’s just the sort of thing that Joe D’Aleo would love and five to the lady in the black hat who said, that’s so good it should be featured on the new KUSI blockbuster show we set our TIVO to capture when we went to the tea party. It would be nice to see a libel suit dropped on these clowns, but it’s time to saddle up and point out that the icecap man is melting. Oh yeah, here is the “programming expert”, please help undress him
There is more of the usual, the number of stations has fallen over the past thirty years, etc. Pushback will be needed.
It is both amusing to see that the attack dog spirit is alive and well, and at the same time a bit sad. Somehow folks on “the other side” just don’t “get it” that it’s not about me, it’s about the data. Frankly, like so many others, I started on this with the idea that (as another comment put it) “There’s smoke, let’s go see where the fire is”. But when I looked, I found a guy putting wet leaves on a BBQ…
Take a picture of that and show it to folks, and somehow it becomes you that is the “issue”… Just bizarre.
So much hate and invective. So much focus on shooting the messenger. Just amazing.
At any rate “The truth just is. -E.M.Smith” and they will have to deal with that.
So if I get a load of fleas, don’t be surprised if I go quiet for a while as I find the DDT canister and dust some all over…
Hansen never worked over the data??? What about this?
http://zapruder.nl/images/uploads/screenhunter3qk7.gif
I would like to see real lab. results of the actual amount of warming from CO2, in a chart similar to the one on page 4 of
http://carbon-sense.com/wp-content/uploads/2008/05/hertzberg.pdf
A bar graph should show the actual amount in numbers.
I take it then that you are going directly to these KUSI pages and it does not let you click on the video graphic ?
http://www.kusi.com/weather/colemanscorner/81557272.html
http://www.kusi.com/weather/colemanscorner/81558532.html
http://www.kusi.com/weather/colemanscorner/81558842.html
http://www.kusi.com/weather/colemanscorner/81559212.html
http://www.kusi.com/weather/colemanscorner/81559582.html
Have you double-checked to see if your Firefox browser has JavaScript enabled?
Go to Tools —-> Options —-> Content, and see if the JavaScript check box is enabled.
Mine is and I can see them just fine. Looking at the page source, it looks like they call JavaScript on the page.
Larry
Well, E.M. Smith, I’ve got a serious issue with you:
There are now *3* Canadian Territories:
Yukon, Northwest, & Nunavut.
Glad we could clear up this factual error – (alarmist browsers take note – maybe you can spin this into something – clearly this shoots E.M. Smith’s credibility…)
I’d love to see Gavin or someone address this directly and in detail. I found the program and the supporting documentation on the blogs very convincing as to the horrors of what’s been done to the surface temp record. Having said that, I try not to make up my mind until the other side has had the opportunity to provide their own rebuttal.
I thought both the program and the blog data were very well done. I’d known about the dropouts, and I knew they were felt to bias the trend data warmer, but I now have a much deeper understanding of how that happened.
Thank god for the satellite data. Though even that can’t save us from continual diddling of the historic pre-1979 record. At this rate, there is going to be a new little ice age in the 1960s by and by so they can keep their trends scary.
OTOH, this may be why we are getting more and more stories of a “temporary pause” of 20 years or so in warming… they must be starting to realize they can only diddle the back numbers so far, and then must do some log-rolling and hope that warming starts up again.
Although I have found this program outstanding, I have to admit that the statements made by E.M. Smith and D’Aleo regarding the accuracy of the global surface temperature record were obviously wrong. The reduction in the number of temperature measuring sites could have a significant effect on the most recently observed global trends, but not because of changes in spatial distribution. The mentioned temperature series are based on anomalies and not on absolute temperatures. So the dropout of stations sited in cold places has nothing to do with the estimated monthly anomalies, although this event should produce a significant increase in uncertainty levels due to more limited coverage and therefore larger sampling errors.
In our case, the real culprit is the exact percentage of sites classified as ‘rural’, ‘semi-urban’ and ‘urban’ stations. A couple of posts ago, Jeff Id pointed out that the GHCN raw station data are contaminated by the well-known but always belittled UHI effect. On a global scale, urban stations have shown almost 3 times more increase in annual mean temperatures than the rural sites in the last 30 years. These findings confirmed my earlier suspicions. After 1990 the percentage of stations located in cities or at airports increased dramatically. The effects of this change may not have been negligible; even Tom Wigley admitted it privately in one of his emails to P.D. Jones:
We probably need to say more about this. Land warming since 1980 has been twice the ocean warming — and skeptics might claim that this proves that urban warming is real and important.
So true; urban warming is really, really important in order to get the pronounced 0.7–0.8 °C warming in 150 years. P.D. Jones of CRU relied on the following argument in his UHI assessment paper in 1990 (available here):
…in any gridded temperature data set, a single affected station is unlikely to have a large influence on the time series of the nearest grid point, because this is generally the weighted average of between 5 and 20 station records.
The often-cited Jones et al (1990) is a very doubtful paper now because of some obvious fabrications regarding the reliability of their rural reference networks, especially the Chinese one. However, the statement cited above received almost no attention, despite the fact that it contains an enormous flaw. The CRUTEM3 and HadCRUT gridded datasets are available on a 5×5 lat/lon grid; this means that we have 2592 grid boxes globally. The number of GHCN stations with continuously available data is below 1500, see here. A majority of these sites are located in Europe or in the United States, so the coverage on other continents is even sparser than a homogeneous spatial distribution would give. Even a ‘homogeneous’ case means only an average of 1.9 stations per grid box (if 29 percent of the grid boxes are located over land). In fact, the spatial coverage of landmasses excluding Europe and the USA is far worse than this number – for example, the coverage of Siberia is around 0.5 to 0.7 stations per grid cell, light-years away from the 5 to 20 interval.
Now we can conclude that the argument used by Dr. Jones regarding the reliability of gridded temperature datasets is untrue. Any urban station can have a significant effect on a single grid cell, and an increasing percentage of these contaminated sites due to station dropout can alter the observed global temperature anomaly significantly. It could have a very strong effect on the early part of the record, especially before 1900. Numerous observation sites started in small towns or rural locations, and they were encircled by urbanisation over the last 100–150 years. CRU calculates the temperature anomaly with respect to 1961–90, and it is quite obvious that analysis with the UHI-contaminated 1961–90 base period will produce cooler anomalies in the 19th century, when population and energy consumption in the vicinities of the measurement sites were lower.
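The grid-box arithmetic above is easy to check. A minimal sketch, using the 29 percent land share and the sub-1500 station count quoted in the text (the result comes out at roughly 2 stations per land box, consistent with the ~1.9 figure cited):

```python
# 5x5 degree grid: 36 latitude bands x 72 longitude bands.
lat_bands = 180 // 5
lon_bands = 360 // 5
total_boxes = lat_bands * lon_bands           # 2592 boxes globally

land_fraction = 0.29                          # share of boxes over land
land_boxes = total_boxes * land_fraction      # about 752 land boxes
stations = 1500                               # GHCN stations with continuous data
per_box = stations / land_boxes
print(total_boxes, round(per_box, 1))         # roughly 2 stations per land box,
                                              # nowhere near "5 to 20"
```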
rbateman, it shouldn’t be too hard to see if these have been tampered with. Examine the fonts used, especially those before 1980-82 when computer-based fonts became the norm. Even if there are courier fonts used, it’s easy to spot those made by a typewriter (remember those?) and those made by a PC.
I just browsed through that very quickly and found this:
Am I reading that right? These morons are continually modifying raw data and not keeping archival copies of the data before modification?
I think I am going to go bang my head on a wall for a while!
It takes effort to be that stupid!
Larry
Boballab, it seems the NCDC is using the Tobacco Industry defence: bury your opponents in paper.
-=NikFromNYC=- (23:24:47) : Global averages are presented as anomalies, meaning variations above/below an arbitrary time period’s average. If this was done for the input data too, then only the difference in variability of cold vs. hot regions would be modified by the great dying out of thermometers, not the absolute temperatures.
Aye, now there’s the rub… It is NOT done for the input data. The anomaly map is calculated in the final land data step of GIStemp: STEP3. All the homogenizing, UHI adjustment, etc. are done on “the monthly average of daily min/max averages”. Only at the very end is the Anomaly Map made.
There are a few points along the way in GIStemp where sets of averages of averages are calculated, and then offsets between these are used for some part; and technically you could call those “anomalies”, but they are not at all what folks think of when they think of anomalies. (For example, there is a UHI adjustment calculated in STEP2 in the PApars.f program. It does this by spiraling out (up to 1000 km) looking for stations to use. It adds up about 10 of these and uses their mean, sort of, as the comparison to the station to decide how much to adjust that station for UHI.)
Yes, it is TECHNICALLY an anomaly calculation. But then the average of that (semi-random) set of “nearby rural” stations (up to 1000 km away and including major airports and cities with significant UHI) is thrown away and only the station data (now suitably adjusted) move forward.
You see this all the way through GIStemp. Some average is used to adjust the station data average, then only the station data proceed. I would call these “offsets” rather than anomalies (and the code calls them offsets internally). Finally in STEP3, the Anomaly Map is made and “Ta Dah!!” it is “all anomalies so it’s perfect” is the mantra…
Well, one small problem. If there are insufficient “nearby rural” stations, the data are passed through unchanged. No UHI adjustment is done. So delete the rural stations in the present part of the data set, and you get induced warming via no UHI correction. A very large percentage of “rural” stations now are at major airports (such as the largest Marine base: Quantico, Virginia). Any guess how warm it is on the Quantico airstrip right now with all the flights to Haiti? Delete the “real rural” airports, and more of the UHI correction goes “the wrong way” – more induced artificial “warming”. All of this BEFORE the Anomaly step of STEP3.
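That failure mode can be sketched in a few lines. This is illustrative Python, not GIStemp’s actual Fortran: the 3-neighbor threshold is an assumption for the example, and the real PApars adjustment works on trends over time rather than a simple level offset, which I have collapsed here for brevity:

```python
MAX_RADIUS_KM = 1000.0   # search radius described above
MIN_RURAL = 3            # assumed minimum neighbor count, illustration only

def adjust_for_uhi(station_temp, neighbors):
    """neighbors: list of (distance_km, temp, is_rural) tuples.
    Pull the station onto the mean of nearby rural stations; if too few
    rural neighbors exist, the station passes through UNCHANGED."""
    rural = [t for d, t, is_rural in neighbors if is_rural and d <= MAX_RADIUS_KM]
    if len(rural) < MIN_RURAL:
        return station_temp           # no correction possible
    return sum(rural) / len(rural)    # corrected toward the rural mean

# Urban station with rural neighbors intact: corrected from 16.0 down to 14.0.
print(adjust_for_uhi(16.0, [(100, 14.0, True), (300, 14.5, True), (800, 13.5, True)]))
# Same station after the rural neighbors are dropped: 16.0 survives untouched.
print(adjust_for_uhi(16.0, [(100, 14.0, False)]))
```

The second call is the whole point: drop the rural stations and the “correction” silently stops happening, leaving the urban warmth in the record.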
I could go on with many other examples of how this works, but then I’d be retyping my whole blog. Just think on this: NOAA have announced that 100% of the Pacific Ocean basin will be from AIRPORTS in the near future (it is almost that now). A station on an island can “fill in” grid boxes up to 1200 km out to sea. So one hot station on, oh, Diego Garcia, where we built a giant air base that is “rural”, can, and does, warm a 2400 km diameter circle of cool ocean via “fill in”… and, there being no “nearby rural” station to compare with, will get NO UHI correction. Think those island airports changed much between the start of 1950 and the advent of the Jet Age tourist boom in the 1980s?
But that’s OK, the magical “anomaly” will fix it all up in STEP3 when we compare Diego Garcia today with what it was in 1950 …
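A toy numerical example of why the end-stage anomaly does not undo earlier damage: anomalies protect against station dropout only if each station is converted to its own anomaly before any averaging, which is not what the adjustment steps described above do. All numbers here are invented:

```python
warm = [15.0, 15.2, 15.4]   # warm station, +0.2 per period
cold = [-5.0, -4.8, -4.6]   # cold station, same +0.2 per period

def mean(xs):
    return sum(xs) / len(xs)

# Averaging ABSOLUTE temperatures first: when the cold station drops out
# in the last period, the average leaps from 5.0 to 15.4 -- a huge
# spurious "warming" that a later anomaly step cannot see or repair.
early_avg = mean([warm[0], cold[0]])    # both stations present
late_avg = mean([warm[2]])              # cold station gone
print(early_avg, late_avg)

# Per-station anomalies (each station vs its OWN first period) recover
# the true common trend regardless of which stations survive.
late_anom = mean([warm[2] - warm[0]])
print(round(late_anom, 2))              # the real +0.4 trend
```

The order of operations is the whole argument: anomaly-last (after averaging and adjusting absolute temperatures) inherits every dropout artifact; anomaly-first would not.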
There are three non-satellite global averages, GISS, Hadley and NCDC so there are three software packages to ask this question for. Isn’t the whole point of anomalies that using them instead of absolute temperature removes the problem that Smith is making a case for?
I can only speak to the process done inside GIStemp, but the fact that it has close agreement with NCDC and HadCRUT leads me to believe they are similar. Further, inside GIStemp the “dataset format” is called NCAR (as in NOAA NCDC North Carolina…) during the early steps, then swaps to a HadCRUT-compatible one for STEP4_5, where the Hadley Sea Surface Anomalies are blended in. This says that these folks share data formats (and thus at least the code to read and handle them). They also all work from the same set of “peer” reviewed literature, so will share methods from there as well. I’d love to take a team of programmers through all three and show how much they match, but there is only me and only one set of published code (GIStemp). NCDC is mum, and the UEA leak, while helpful, is not the whole code base.
Please forgive the length of this, but the “Anomaly” is a frequently used dodge by the AGW believers, and you cannot get them to look at what really happens in the code… and it sounds so good… but it is just a “honey pot” to distract you from the real process being done to the data.
Lucy Skywalker (10:37:09) : There are several D/L tools available. The one I’m referring to is this: https://addons.mozilla.org/en-US/firefox/addon/3006
It puts a small 3-colour animation in your toolbar which rotates when it finds something. Click on the arrow beside it, and you get a list of content. These clips were all entitled “movie**********mp4”.
Bernie (05:50:18) :
“My concern is that it is too easy to show the flaws in the piece and thereby dismiss more scientifically and rigorously constructed arguments. I had the same reaction to the recent Not Evil, Just Wrong.
That said, thanks for providing access to the show. I hope it leads to more detailed discussions.”
I thought it was too much like crude propaganda.
However, like it or not, this is largely an economic and political question, and the science is something of a sideshow; at any rate, debunking the bad science on which far-reaching and damaging policies are being based does not necessarily mean that sensible policies will follow. The important consequences are political and economic, and given the head of steam the political establishments have for AGW-based policies, the only thing that will cause a change of course is large numbers of voters demanding they be changed, or withholding their vote.
Consider the propaganda about the Hockey Stick, drowning polar bears, melting glaciers on Kilimanjaro, and the warmest everything since records began. None of these errors is ever admitted; they either quietly drop the line or blatantly push it, despite the evidence to the contrary.
Compare it with other political programs discussing defence, health provision, the economy, or whatever, and it becomes a model of rigorous analysis. I think it was pitched at the right level. I’d give it an A, considering its intended audience, even though some of it made me cringe.
“hotrod ( Larry L ) (12:25:27) :
Lucy Skywalker (10:37:09) :”
Lucy (and Hotrod),
I had the same problem at first (picture showed, no way to play it), but I finally just crashed my browser (go to task manager and kill the process) and then restarted it and it worked.
You probably know this site
http://www.john-daly.com/stations/stations.htm
I’m wondering if anybody has the data from the stations mentioned from about 2000 until now.
*********************
Bernie (09:36:28) :
tucci:
As I noted there are certainly plenty of “pathologies” visible in the CRU emails. Moreover, if your point is that the whole global temperature record is somewhat bizarre, I do not disagree. However, the point at issue is the effectiveness of this particular presentation. Most of the claims will be readily dismissed by the scientific establishment.
******************
Bernie: Did you notice there were scientists in the video who disagreed with the consensus? Just because they don’t go along with the hockey team does not mean they are not part of “the establishment.” They are, and they are doing what all scientists are supposed to do: they are being skeptical!
E. M. Smith: Thank You!!
bud dingler (23:48:47) :
i thought it was poor quality reporting (or reading from the prompter) and could not finish watching it. Came off like a Faux News piece.
There was NO prompter and not even a rehearsal. It was “walk in, mic up, sit down, question and answer, one take.” Don’t know what was done in the edit, though. I suppose the ‘voice overs’ (if any) might have been prompted.
In fact, they started the taping session with live news in the ‘ear bug’ and it sounded like we were going to go live… so I asked the camera / tech guy and he said it was just to give me something to listen to while he set up…
Only 3 people involved. John, Me, camera / tech guy.
Tried the lot. Still I see nothing. I have Java 6 update 17 (seems to be the latest). Watched Minnesotans on YouTube, no probs. Sometimes there are insoluble mysteries.
Lucy Skywalker, they are now posted to YouTube (at least part 1 of 7 is; I didn’t check to see if the other parts were there).
hotrod ( Larry L ) (12:39:16) :
Am I reading that right? These morons are continually modifying raw data and not keeping archival copies of the data before modification?
Yes, you’re correct. They have plausible deniability by simply claiming the manuscript data sources are still in the archive, if anyone wants them badly enough. What they don’t tell you is how inaccessible those manuscripts are while in their custody, or that the insects are reportedly eating at least some of the original manuscript records and their raw data. Oops, it was an accident. Sorry ’bout that. Shoulda paid us more money to take better care of those old wet and musty records no one wants to see anyway.
Re: J.Peden (Jan 15 08:51),
Take it easy there J.Peden. Geez, I was making a statement of fact. That is the counter argument! I did not say it was mine. Warmers argue the satellite and surface data are “in good agreement”, thus arguing the surface data being manipulated is irrelevant.
Here is a link to youtube for those having problems:
There are 7 parts, according to what I’ve seen.