Guest Post by Ira Glickstein
According to the latest from NASA GISS (Goddard Institute for Space Studies), 2010 is shaping up to be “the warmest of 131 years”, based on global data from January through November. They compare it to 2005 “2nd warmest of 131 years” and 1998 “5th warmest of 131 years”.
We won’t know until the December data is in. Even then, given the level of noise in the base data and the wiggle room in the analysis, each of which is about the same magnitude as the Global Warming they are trying to quantify, we may not know for several years. If ever. GISS seems to analyze the data for decades, if necessary, to get the right answer.
A case in point is the still ongoing race between 1934 and 1998 to be the hottest for US annual mean temperature, the subject of one of the emails released in January of this year by NASA GISS in response to a FOIA (Freedom of Information Act) request. The 2007 message from Dr. Makiko Sato to Dr. James Hansen traces the fascinating story of that hot competition. See the January WUWT and my contemporary graphic that was picked up by several websites at that time.
[My new graphic, shown here, reproduces Sato’s email text, including all seven data sets, some or all of which were posted to her website. Click image for a larger version.]
The Great Hot 1934 vs 1998 Race
1) Sato’s first report, dated July 1999, shows 1934 with an impressive lead of over half a degree (0.541ºC to be exact) above 1998.
Keep in mind that this is US-only data, gathered and analyzed by Americans. Therefore, there is no possibility of fudging by the CRU (Climategate Research Unit) at East Anglia, England, or bogus data from Russia, China, or some third-world country. (If there is any error, it was due to home-grown error-ists :^)
Also note that total Global Warming, over the past 131 years, has been, according to the IPCC, GISS and CRU, in the range of 0.7ºC to 0.8ºC. So, if 1934 was more than 0.5ºC warmer than 1998, that is quite a significant percentage of the total.
At the time of this analysis, July 1999, the 1998 data had been in hand for more than half a year. Nearly all of it was from the same reporting stations as previous years, so any adjustments for relocated stations or those impacted by nearby development would be minor. The 1934 data had been in hand for, well, 65 years (eligible to collect Social Security :^) so it had, presumably, been fully analyzed.
Based on this July 1999 analysis, if I was a betting man, I would have put my money on 1934 as a sure thing. However, that was not to be, as Sato’s email recounts.
Why? Well, given steadily rising CO2 levels, and the high warming sensitivity of virtually all climate models to CO2, it would have been, let us say inconvenient, for 1998 to have been bested by a hot golden oldie from over 60 years previous! Kind of like your great grandpa beating you in a foot race.
2) The year 2000 was a bad one for 1934. November 2000 analysis seems to have put it on a downhill ski slope that cooled it by nearly a fifth of a degree (-0.186ºC to be precise). On the other hand, it was a very good year for 1998, which, seemingly put on a ski lift, managed to warm up by nearly a quarter of a degree (+0.233ºC). That confirms the Theory of Conservation of Mass and Energy. In other words, if someone in your neighborhood goes on a diet and loses weight, someone else is bound to gain it.
OK, now the hot race is getting interesting, with 1998 only about an eighth of a degree (0.122ºC) behind 1934. I’m still rooting for 1934. How about you?
3) Further analysis in January 2001 confirmed the downward trend for 1934 (which lost an additional 26th of a degree) and the upward movement of 1998 (which gained an additional 21st of a degree), tightening the hot race to a 28th of a degree (0.036ºC).
Good news! 1934 is still in the lead, but not by much!
4) Sato’s analysis and reporting on the great 1934 vs 1998 race seems to have taken a hiatus between 2001 and 2006. When the cat’s away, the mice will play, and 1998 did exactly that. The January 2006 analysis has 1998 unexpectedly tumbling, losing over a quarter of a degree (-0.269ºC), and restoring 1934’s lead to nearly a third of a degree (0.305ºC). Sato notes in her email “This is questionable, I may have kept some data which I was checking.” Absolutely, let us question the data! Question, question, question … until we get the right answer.
5) Time for another ski lift! January 2007 analysis boosts 1998 by nearly a third of a degree (+0.312ºC) and drops 1934 a tiny bit (-0.008ºC), putting 1998 in the lead by a bit (0.015ºC). Sato comments “This is only time we had 1998 warmer than 1934, but one [on?] web for 7 months.”
6) and 7) March and August 2007 analysis shows tiny adjustments. However, in what seems to be a photo finish, 1934 sneaks ahead of 1998, being warmer by a tiny amount (0.023ºC). So, hooray! 1934 wins and 1998 is second.
OOPS, the hot race continued after the FOIA email! I checked the tabular data at GISS Contiguous 48 U.S. Surface Air Temperature Anomaly (C) today and, guess what? Since the Sato FOIA email discussed above, GISS has continued their taxpayer-funded work on both 1998 and 1934. The Annual Mean for 1998 has increased to 1.32ºC, a gain of a bit over an 11th of a degree (+0.094ºC), while poor old 1934 has been beaten down to 1.2ºC, a loss of about a 20th of a degree (-0.049ºC). So, sad to say, 1934 has lost the hot race by about an eighth of a degree (0.12ºC). Tough loss for the old-timer.
Analysis of the Analysis
What does this all mean? Is this evidence of wrongdoing? Incompetence? Not necessarily. During my long career as a system engineer I dealt with several brilliant analysts, all absolutely honest and far more competent than me in statistical processes. Yet, they sometimes produced troubling estimates, often due to poor assumptions.
In one case, prior to the availability of GPS, I needed a performance estimate for a Doppler-Inertial navigation system. They computed a number about 20% to 30% worse than I expected. In those days, I was a bit of a hot head, so I stormed over and shouted at them. A day later I had a revised estimate, 20% to 30% better than I had expected. My conclusion? It was my fault entirely. I had shouted too loudly! So, I went back and sweetly asked them to try again. This time they came in near my expectations and that was the value we promised to our customer.
Why had they been off? Well, as you may know, an inertial system is very stable, but it drifts back and forth on an 84-minute cycle (the period of a pendulum with the length of the radius of the Earth). A Doppler radar does not drift, but it is noisy and may give erroneous results over smooth surfaces such as water and grass. The analysts had designed a Kalman filter that modeled the error characteristics to achieve a net result that was considerably better than either the inertial or the Doppler alone. To estimate performance they needed to assume the operating conditions, including how well the inertial system had been initialized prior to takeoff, and the terrain conditions for the Doppler. Change assumptions, change the results.
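The blending those analysts did can be illustrated with a toy calculation. The sketch below is a minimal one-dimensional version with made-up numbers, not the actual Doppler-inertial flight filter: it only shows why weighting each sensor by its assumed error variance beats either sensor alone, and why the answer moves whenever the assumed variances move.

```python
# Minimal 1-D sensor-fusion sketch (hypothetical values, not the real
# system): blend a smooth-but-drifting estimate with a noisy-but-unbiased
# one, each weighted by the inverse of its assumed error variance.

def fuse(inertial, doppler, var_inertial, var_doppler):
    """Optimal linear blend of two independent estimates of one quantity."""
    w = var_doppler / (var_inertial + var_doppler)  # weight on the inertial value
    estimate = w * inertial + (1 - w) * doppler
    # Fused variance is the parallel combination of the two variances,
    # so it is always smaller than either input variance.
    var = (var_inertial * var_doppler) / (var_inertial + var_doppler)
    return estimate, var

# Assume the inertial says 100.0 (variance 4.0) and the Doppler says
# 104.0 (variance 1.0); the fused answer leans toward the Doppler.
est, var = fuse(inertial=100.0, doppler=104.0, var_inertial=4.0, var_doppler=1.0)
```

The combined result is better than both inputs, but only relative to the assumed variances: change the assumptions about initialization or terrain, and the promised performance number changes with them, which is the point of the story.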
Conclusions
Is 2010 going to be declared warmest global annual by GISS after the December data comes in? I would not bet against that. As we have seen, they keep questioning and analyzing the data until they get the right answers. But, whatever they declare, should we believe it? What do you think?
Figuring out the warmest US annual is a lot simpler. Although I (and probably you) think 1934 was warmer than 1998, it seems someone at GISS, who knows how to shout loudly, does not think so. These things happen and, as I revealed above, I myself have been guilty of shouting at analysts. But, I corrected my error, and I was not asking all the governments of the world to wreck their economies on the basis of the results.

“Of course we should care! Peer reviewed research from several data sets confirm that the planet is getting warmer. This is not a controversial statement anymore.”
Never was controversial, in my opinion. Any idiot can see that it has been getting warmer since the last time it was getting colder. DUUUH!
” It is a change which clearly needs to be mediated, to avoid consequences which will be catastrophic for everyone. If you have real data to disprove this, please present it!”
You are the one making the snake oil pitch–the obligation is upon you to present credible information.
@Onion says:
December 25, 2010 at 5:15 pm
“…Furthermore, the change has been systematic with more and more of the measurements by United States cooperative observers being in the morning, rather then {SIC} the afternoon….”
Did they really say that, with that atrocious misspelling included? If so, then “they” ought to be fired. Or else, you’re just another propagandist with poor grammar trolling here.
Onion says:
December 25, 2010 at 5:15 pm
‘Hansen 2001 titled “2001: A closer look at United States and global surface temperature change”…Changes in the GISS analysis subsequent to the documentation by Hansen et al. [1999] are as follows: (1) incorporation of corrections for time-of-observation bias and station history adjustments…’blah blah blah…
My dear Mr. Onion, does it really need to be pointed out that once you “adjust” or “correct” data it is no longer data; it is corrupted data. If it needs correcting it is not good data. It is bad data. It has become a guess. The suggestion by Hansen that it needs to be adjusted is an admission that they are comparing apples and oranges. It is this that becomes glaringly obvious when reading the harry_read_me documents from the Climategate file dump. They don’t know the provenance or quality, so they make some up. To use the vernacular, it is a crock.
I’m about to staunch the flow from this thread–I’m not through catching up to it.
But before I go I just wanted to observe that I am sure I remember Dad telling me about Roosevelt’s mention of the satellite temperature trend in the televised Fireside Chats.
Ira: Great post. As you said: “…As we have seen, they keep questioning and analyzing the data until they get the right answers. But, whatever they declare, should we believe it? What do you think?…”
I said in another post here some time back, now that almost all the thermometers are located on tarmac in range of jet blast at the world’s airports, there can be no more “global warming” unless of course the globe is warming. The last few years suggest the globe is in stasis on the ∆T front, and maybe, we have to put a minus sign in there somewhere.
The only other way that T will increase is through the manipulations of data. We’ve seen some doozies – Willis’s “Darwin” evaluation, New Zealand’s fantastic 2˚C increase over the course of a century or so (now entirely discredited), and the seeming paradox of UHI corrections pinning current temperatures at where they are, while reducing historical measurements.
Fortunately the wheels are falling off this ramblin’ wreck of supposed science, and the FOIA requests and inquiries by states’ attorneys general and so forth will accelerate until, hopefully, I’ll see Michael Mann and a bunch of other charlatans either defrocked, or hopefully, refrocked in neat striped uniforms that they can wear for 10 to 20 (years, that is).
And I would have guessed that 2008 or 2009 was the “Warmist’s Year”, since that is the last time everybody quaked (or worse) in their boots whenever a warmist ranted.
Re E.M.Smith says:
December 26, 2010 at 11:57 am:
“OK, I’ll bite: Exactly what is the “GIStemp Global Product” vs the “GIStemp US Product”?”
On the GISTEMP site there are two graphs: one for the globe, one for the US only. That is what I am referring to. It’s the same analysis, but this “race” only exists in the US graph. On the global graph the values for 1998 and 1934 have not changed greatly.
The Hansen’99 paper shows the GISTEMP met stations only value for 1998 was about +0.66C and the 1934 value was about +0.04C. The current GISTEMP website shows GISTEMP met stations only for 1998 is +0.7C and for 1934 is still +0.04C. So hardly any change.
And Land+Ocean 1998 has actually gone down. In Hansen’99, the land+ocean value for 1998 is given as 0.58C. On the GISTEMP website today it’s listed as 0.56C. But again these changes of a few hundredths of a degree are so small as to be irrelevant.
“He has TESTIFIED that he thinks “The greater good” outweighs law and property rights. So he would apply that same belief to the property of the temperature data as he is rabid about the “greater good” of doing whatever it takes to get AGW believed.”
I disagree. He could believe in both scientific integrity and civil disobedience.
“And, IMHO, I’ve done one. The Dt/dt method.”
But do you have a global temperature graph that can be compared with the other records? In order to show that GISTEMP is doing it wrong you need to show what the graph should look like done right. That’s necessary so we don’t end up chasing our tails and thinking GISTEMP is wrong when in fact we are nitpicking over one hundredth of a degree or something.
The graphic is clearly labeled “US Temp Anomaly ºC”, and I am clear that 2010 may be declared the Global hottest year. I live in the US and am most interested in temperatures here, but, the Global Warming activists say their theory applies worldwide. If GISS cannot get their data analysis straight for their own US 2% of the globe, how can we trust their Global declarations?
I think it is pretty clear the analysts were being pressured to come up with results that would bolster the “tipping point” and “runaway warming” theory of their bosses; else why would they spend so much time fiddling with old data? However, since I was not there, I did not conclude that they were incompetent or that there was wrongdoing. Do you know the reason their seven analyses differ?
As you say, “the cost estimates are all over the place”. They vary from expensive to astronomical. No one thinks they are inexpensive. Apart from that, I find your comment interesting.
A parting shot (this really has gotten tedious)…
My understanding of “science” and “publication” is this:
A “scientist” carefully records results (almost always in a speckled-gray book of lined blank pages, one that may have the word “Ledger” printed in black letters on a white panel on the cover, and probably has some sort of handwritten (or printed) title, somebody’s name, and, across the bottom, a line with [some-date] to [some later date] written in at least two kinds of ink).
After there is either some satisfaction with the results or a realization that there is some question to be answered, the scientist (or one of the grunts) types it up, prepares the data in some sensible form, maps out the procedures used and what the findings or questions are, and sends it off to an editor who tidies it up and then sends it forward (after perhaps an iterative process) to be published.
After publication, other scientists take the data and the procedures and try to reproduce the results, and either publish their own findings (with details much like those given above), or their additional data, or their changed procedures.
Eventually these exchanges fade, and if people begin to cite these articles without arguing with them, what they settled on (without a “vote”, but maybe after a conference reading) will be treated as if it were fact, and may become the foundation for new ideas.
A “social scientist” works out what needs to be done (and maybe why, perhaps), types that up, and sends it in for assignment to reviewers, who look at it to be sure that it does not say anything that has not already been said by the right people, does not propound any unacceptable ideas, and did not get assigned to a reviewer who hates the author; then it is published. It is then cited as “peer-reviewed accepted science” in passing laws, regulations, and protocols for redistributing the ill-gotten wealth of the people who worked for it to people who, for whatever reason, did not.
“Onion says:
December 26, 2010 at 8:53 am
If the station data actually show something significantly different than Hansen finds with GISTEMP, then what I would expect to find was a temperature record that very much disagrees with GISTEMP. That’s really the only way GISTEMP is ever going to be shown to be wrong, you need a benchmark.”
See: http://stevengoddard.files.wordpress.com/2010/12/2010temperatureanomalies.png
GISS says November 2010 was the warmest November on record. The other three disagree.
GISS says 2010 may be the warmest year on record. The year is not over yet, but there is every indication that for the other three, 1998 will remain the warmest year.
GISS says July and August had lower anomalies than October and November, according to the graphs. The other three disagree.
“Mycroft says:
December 26, 2010 at 6:52 am
Was 1934 a El Nino year as well?”
That is an excellent question! I could well be wrong, but according to the following table, it may be that they did not know about El Niños before 1950.
http://www.esrl.noaa.gov/psd/people/klaus.wolter/MEI/rank.html
“Espen says:
December 26, 2010 at 3:09 am
Instead of temperature anomalies, one should have used enthalpy (total energy) anomalies. 2010 is a good illustration of this, since a lot of the positive anomaly comes from arctic deserts, where it takes a lot less energy to heat a given mass of air by e.g. 5 C than it would take in the tropics – or even in Sahara.”
You bring up an excellent point! The following, with its huge up and down steep spikes, really illustrates this well: http://ocean.dmi.dk/arctic/meant80n.uk.php
Then to compound the problem, GISS uses few stations in the Arctic. So if the jet stream causes a corner of the Arctic to be warmer, that seems to be extrapolated over a much larger area than may be warranted. At least that is the impression I am under.
2010 was far from the warmest year on record. Anyone who makes that claim is deliberately cherry-picking the record.
See for yourself.
The favoured corner of the Arctic for warming measurements seems to be the southern tip of Greenland at the moment, where displaced warm water is concentrating.
Onion,
To <>
It is simple to compare GISTEMP against itself – see link:
http://cbullitt.wordpress.com/2010/09/25/all-your-agw-are-belong-to-us/
As Brian H mentions, the area around Greenland and Labrador has had less ice and gives very high anomalies (especially with 1200 km smoothing and rectangular maps); this particular effect seems to keep the world warm even when it is not warm for most of us.
It’s interesting that NASA themselves identify this as the reason for the high global temperature while the populated areas of the world shiver.
http://data.giss.nasa.gov/gistemp/2010november/
This of course was not the only warm anomaly in November, but it seems to have quite a constant influence. One can’t help wondering whether the lack of Arctic ice and the warm temperatures (maybe more ENSO-, PDO-, and AO-related) at all the coastal stations are an independent influence on global surface records.
Is there a methodological problem concerning measurement of global temperature in the face of changing Arctic ice cover?
Re Lars P:
That’s US temperatures. Compare global and they’ve hardly changed. For global warming, it’s global that’s relevant.
Re Werner Brozek:
The image you post shows the records with different baselines. Therefore the comparison is invalid; they have to be recalculated so they share the same baseline before comparing month for month.
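For what it’s worth, the re-baselining described here is simple arithmetic. Below is a minimal sketch with made-up numbers (not real GISS or HadCRUT values): shift each anomaly series so its mean over a common reference period is zero, and only then compare them.

```python
# Illustrative re-baselining sketch (hypothetical values): two anomaly
# series computed against different base periods differ by a constant
# offset, so raw month-for-month comparison is meaningless until both
# are re-expressed relative to the same reference period.

def rebaseline(series, base_years):
    """Shift a {year: anomaly} series so its mean over base_years is zero."""
    offset = sum(series[y] for y in base_years) / len(base_years)
    return {y: v - offset for y, v in series.items()}

a = {1990: 0.10, 1991: 0.20, 1992: 0.30, 2010: 0.80}  # e.g. a 1951-1980 base
b = {1990: 0.00, 1991: 0.10, 1992: 0.20, 2010: 0.70}  # e.g. a 1961-1990 base

common_base = [1990, 1991, 1992]
a2 = rebaseline(a, common_base)
b2 = rebaseline(b, common_base)
# The raw series disagree by a constant 0.1; after re-baselining they
# agree on every year, including 2010.
```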
“Onion says:
December 27, 2010 at 8:54 am
Re Werner Brozek:
The image you post shows the records with different baselines. Therefore the comparison is invalid; they have to be recalculated so they share the same baseline before comparing month for month.”
Base lines are one thing.
But see: http://motls.blogspot.com/
A quote from here: “Phil Jones’ HadCRUT3 dataset is attributing November 2010 the coolest rating among Novembers and among the four datasets – with a 0.43 °C global anomaly, it was the 7th warmest November – while the GISS dataset says that November 2010 was the warmest November on record.”
Strange GISS data in Siberia, do I have correct thoughts?
Hi Guys,
I just wanted to verify, in a nutshell, what this all-time-high temperature business is about. I selected a random point in Siberia that turned out to be closest to Turuhansk (65.8 N 87.9 E, http://data.giss.nasa.gov/cgi-bin/gistemp/findstation.py?datatype=gistemp&data_set=1&name=Turuhansk). The village has records since 1881, with some missing months, as could be expected. I calculated the all-time average for November as -19.5 C. The 1951-1980 average (the GISS baseline) gives the same -19.5. Latest November temperatures in GISS:
2006 -23.7
2007 -14.2
2008 -15.1
2009 -22.4
2010 -20.8
Now the GISS area plot with defaults today (go to http://data.giss.nasa.gov/gistemp/maps/ -> press Plot) gives an image which actually states that in that area of Siberia, and on average overall, temperatures are some 4 degrees above the long-term average for that area. So, based on the calculations I made, the image is very faulty, misleading even.
So the issue is now: if I, with one random point, see far too big a deviation in the officially announced area temperature average, what is the reliability of ‘warmest’? Or is there some error in this logic?
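That back-of-envelope arithmetic is easy to check. The sketch below uses the November values quoted in the comment and takes the stated 1951-1980 baseline as -19.5 C:

```python
# Check of the Turuhansk arithmetic above, using the November values
# quoted in the comment. Baseline (1951-1980 mean) taken as -19.5 C.
baseline = -19.5
recent = {2006: -23.7, 2007: -14.2, 2008: -15.1, 2009: -22.4, 2010: -20.8}

mean_recent = sum(recent.values()) / len(recent)  # about -19.2 C
anomaly = mean_recent - baseline                  # about +0.3 C
# A local anomaly of roughly +0.3 C at this station, versus the ~4 C
# shown for the area on the gridded map, is the discrepancy at issue.
```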
Richard Sharpe says:
What is the probability that the warming will continue and is not over?
It is important to keep two things clearly apart. The comment about Roman Optimum return is ONLY in the context of “IFF IPCC is correct and we are warming”. I’m saying “IFF they are right, we get a Roman Optimum”.
But I believe they are not right…
My opinion is that it’s basically zero chance of warming… We topped out in 1998 at a tad under 1934 (in keeping with the 60 year cycle inside a 7000 year downtrend).
Now we’re into the sleeping sun Grand Minimum (that I like, in a Gilbert and Sullivan lilt, to say “Now We’ve Got A Major Minima!!” … ) with a cold plunge to 2040 or so. Not good. IMHO, it’s going to be LIA redux but slightly colder (in keeping with the general long-term trend). AT MOST the CO2 can dampen the long-term trend, but it can do nothing about the 60 year cycle nor the Major Minima melody… and we’re about the right timing for a 1500 year cold Bond Event too. So, sadly, I don’t see any warmth hanging on for long. But I can enjoy it while it’s here…
What I’d like to believe and would like to see happen is a return to The Roman Optimum temperatures in about 2070 as we exit the New Little Ice Age (that I pray will be CO2 dampened) and take the next run up the warming half cycle. It would be very nice to have some added grain growing regions then. Heck, it would even be nice if we had some added rain just on the North African coastal areas as the rain bands shifted more south ( i.e. not a complete re-greening of the Sahara ala Sahara Pump theory, just a minor shift of coastal rains).
So there are my “three visions”.
1) IPCC right: New Roman Optimum. What’s not to like?
2) I’m right: OMG that’s cold. Potential Bond Event Zero (c) and you don’t want to think about that one…
3) Optimistic hope: Sort of a Roman Optimum after a CO2-dampened, not-too-cold, not-quite-a-LIA cool bump.
MarkkuP says:
Strange GISS data in Siberia, do I have correct thoughts?
Absolutely! What you have done is something that can be done pretty much everywhere that GISS, in their maps, says is surprisingly hot.
All the individual stations show nothing out of line in their history. The aggregate shows warming after GISS processing. Therefore the “warming” is in the processing, not in the data. I’ve seen that for many places, from Marble Bar, Australia, to Canada.
That is the “magic” of GIStemp. It can splice together stations with low or no trend and find warming. The thing that makes this possible, IMHO, is the change of what stations are in GHCN for which years. “Managing the splice”.
press Plot) gives image which states actually that in Siberia in that area and also overall average temperatures are some 4 degrees above long term average in that area. So based on the calculations I made the image is very faulty, misleading even.
Yup. That’s the result of the “homogenizing” and “UHI correcting” and “grid / box anomaly” creation done with “The Reference Station Method”. The individual stations don’t need to rise for a ‘grid box’ to be nice and toasty…
So the issue is now that if I with one random point see all too big deviation in officially announced area temperature average, what is the reliability of ‘warmest’ ? Or is there some error in this logic?
Your logic is very sound. The problem is in proving that the warming is bogus. One needs to repeat your exercise for EVERY ‘grid box’ on the planet (16,000 at the last code update by GISS); then you can show that it’s not just one little odd data point…
That was one of the drivers for the Dt/dt method. To let me automate some of that and make graphs for local geographies. The Marble Bar posting shows this effect using that tool:
http://chiefio.wordpress.com/2010/04/03/mysterious-marble-bar/
Does this falsify GIStemp? It would require much more work than I have time for (given that I’ve got a family to feed) to do conclusively. I like to think that it does show where the error is located, so folks know where to dig, and that someone will do the work to show that GIStemp is not just falsified in the specifics, but falsified in the general as well.
FWIW, I started by making tables of the actual temperatures for various locations. A bit of code that would calculate the simple averages. I then ran this on very small areas (so you don’t need anomaly processing to see artifacts) and noticed just what you are seeing. While a scan of the data showed no station with an out of the ordinary high temperature, the “grid box” showed a “hot hot hot” anomaly. That was what led to my finding that all the “warming signal” was in the newest 1/10 or so of the thermometers… and that it’s all just a giant “splice artifact” of recent changed processing and airports onto older stable data.
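The “splice artifact” mechanism described here can be shown with a deliberately artificial example. This is a hypothetical illustration, not a GIStemp reproduction: two stations with zero individual trend but different mean levels, where the cooler one drops out of the record partway through, yield a composite average with a warming step that exists in neither station’s own history.

```python
# Hypothetical splice-artifact demo (made-up numbers). Neither station
# has any trend of its own; the "warming" appears only because the set
# of reporting stations changes mid-record.
station_a = [10.0] * 10             # reports all 10 years, flat at 10.0
station_b = [9.0] * 5 + [None] * 5  # flat at 9.0, then drops out

composite = []
for a, b in zip(station_a, station_b):
    vals = [v for v in (a, b) if v is not None]
    composite.append(sum(vals) / len(vals))

# Early years average 9.5; after station_b disappears, the composite
# jumps to 10.0 -- a spurious +0.5 step with no physical warming.
trend = composite[-1] - composite[0]
```

Real gridded analyses use anomalies and reference-station adjustments precisely to guard against this effect, so this toy only shows why station churn has to be handled carefully, not that any particular product mishandles it.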
To falsify GIStemp fully would, IMHO, take about a staff of 4 ( 3 if done cheaply and they were dedicated) and about 1 year elapsed time ( 2 years if you have a ‘peer review and publish’ cycle at the end). 1 Staff Scientist to do the publication main writing. 2 Programmer types to run the equipment, write the code, and collect & manage the data sets. 1 Project Manager type (preferably who can also write code when the project hits a snag and write publication quality summaries for the Staff Scientist and PR type press releases and hustle for money while managing budgets and hang out with the clients and keep management happy and talk to the press and…) so the other three can concentrate on getting the real work done 😉
I can say that about P.M.s as I’ve been one for a decade or two …
But ‘bottom line’: Your thinking is exactly right.
I just read “Bundle Up, It’s Global Warming” By JUDAH COHEN (NY Times) and I can’t decide if the point was that AGW is now making us colder because the Himalayas, the Tien Shan and the Altai mountains are taller these days or because the snow is making the northern hemisphere more reflective than snow did when the climate was colder. Oh well, at least I can take cold and miserable comfort in knowing that even should I live to the middle of the next ice age AGW will still be working since it causes all sorts of weather to happen and weather is never climate change.
@E.M. Smith
“Can I quote you on that? It’s rather well said… ”
I’m flattered you think so, please feel free.
I’ve spent a lot of time looking at the links you provided, and the links they had, and then those other links…. I swear your web site is worse than picking up the dictionary; 3 hours later I can’t remember the word I wanted to look up! Thanks for an interesting start.
I’m new to the nuts-and-bolts aspects of the Global Warming, er, Climate Change, or whatever they’re calling the “world is getting hotter” debate this month. You mentioned running the GIStemp code on a Linux box and it made me think. The code for Linux is open source, with hundreds, if not thousands, of folks around the world helping to tweak and maintain it. Has anyone ever floated the idea of an open source global climate model? I suppose contributions would have to be moderated (even Wikipedia has gone that way), but really interested folks with the technical savvy might be able to make small but useful contributions without being intimidated by the scope of the project.
@D.J. Hawkins:
Don’t know if there’s an open source version; there’s a “more portable” version of GIStemp (translated to something better than FORTRAN 😉 ), and there are published codes for several of the predictive models.
http://climatecode.org/
are the folks doing the more open version of GIStemp (and some other stuff too, I think).
“open source climate code” as a google term returns over a million hits…
Glad you liked my site. If there’s anything that you get “stuck” on and would like an explanation, feel free to just float a question on some thread there. (I’m pretty loose about thread topic discipline… )
It’s all pretty dense at first, and there is a LOT of jargon and acronym soup, but none of it is very hard to master. Just take modest sized bites and chew on each for a while 😉
onion says:
“That’s US temperatures. Compare global and they’ve hardly changed. For global warming, it’s global that’s relevant.”
I understand you find NASA GISS Temp graph for US having a problem. Frankly speaking I agree with you, if the relative positions of the older records change there is something wrong.
Now the global record is based on several such local series, on the same data; the US was only a selection of it. We see trouble in the US graphs/data, and we saw trouble in other locations too. Where do these problems come from? This looks wrong to me. Do you find this OK?
BTW, does anybody have the GISS program code? Did anybody have a look at it?
Lars;
Since the US has always been considered to have the best sited, best maintained, best built, and best distributed weather station pattern on the planet, you can pretty well toss the lot. It might be amusing first, though, to “adjust” the UHI-contaminated readings by the -9 to -14°C recently found to be appropriate instead of the +.05°C favored by Hansen’s GISS, and re-run some models. I scenario-ize a massive meltdown of GCM computers, world-wide.