GISS Swiss Cheese

By Steve Goddard

We are all familiar with the GISS graph below, showing how the world has warmed since 1880.

http://data.giss.nasa.gov/gistemp/graphs/Fig.A2.lrg.gif

The GISS map below shows the geographic details of how they believe the planet has warmed. It uses 1200 km smoothing, a technique that lets them generate data where they have none, based on the idea that temperatures don’t vary much over 1200 km. By that logic, it would be “reasonable enough” to use the Monaco weather forecast to make picnic plans in Birmingham, England, or to assume that the weather and climate of Portland, Oregon can be inferred from those of Death Valley.

GISS 1200 km

The map below uses 250 km smoothing, which allows us to see a little better where they actually have trend data from 1880-2009.

GISS 250 km

I took the two maps above, projected them onto a sphere representing the earth, and made them blink back and forth between 250 km and 1200 km smoothing. The Arctic is particularly impressive. GISS has determined that the Arctic is warming rapidly across vast distances where they have no 250 km data (pink).
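To make the mechanics concrete, here is a rough Python sketch of how distance-weighted smoothing of this kind can fill an otherwise empty grid cell. It assumes the linear fall-off of weight with distance described in Hansen and Lebedeff (1987); the station coordinates are roughly those of Alert and Nord, the anomaly values are invented, and this is an illustration of the idea, not GISS’s actual code.

import math

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    # Haversine distance between two points on a spherical earth, in km.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def smoothed_anomaly(grid_lat, grid_lon, stations, radius_km=1200.0):
    # Distance-weighted mean of station anomalies within radius_km.
    # The weight falls linearly from 1 at the grid point to 0 at the cutoff,
    # so a single distant station can fill an otherwise empty cell.
    # Returns None when no station is in range (the grey "no data" areas).
    num = den = 0.0
    for lat, lon, anom in stations:
        d = great_circle_km(grid_lat, grid_lon, lat, lon)
        if d < radius_km:
            w = 1.0 - d / radius_km
            num += w * anom
            den += w
    return num / den if den > 0 else None

# Stations roughly where Alert and Nord sit: (lat, lon, anomaly in deg C); anomalies invented.
stations = [(82.5, -62.3, 1.8), (81.6, -16.7, 2.1)]
print(smoothed_anomaly(90.0, 0.0, stations, radius_km=250.0))   # None: nothing within 250 km of the pole
print(smoothed_anomaly(90.0, 0.0, stations, radius_km=1200.0))  # a value, interpolated from distant stations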

You can verify for yourself that there is no data in the region by using the GISTEMP station locator at http://data.giss.nasa.gov/gistemp/station_data/

If we choose 90N 0E (North Pole) as the center point for finding nearby stations:

http://data.giss.nasa.gov/cgi-bin/gistemp/findstation.py?datatype=gistemp&data_set=1&name=&world_map.x=369&world_map.y=1

We find that the closest station to the North Pole is Alert, NWT, 834 km (518 miles) away. That’s about the distance from Montreal to Washington, DC. Would temperature data from Montreal be valid for Washington, DC?
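The distance is easy to check: a degree of latitude spans roughly 111.2 km, so a station’s distance from the pole is just its co-latitude times that figure. A one-line sketch, using approximate coordinates for Alert (about 82.5 N):

def km_from_pole(lat_deg, km_per_degree=111.2):
    # Distance from the North Pole along a meridian, assuming a spherical earth.
    return (90.0 - lat_deg) * km_per_degree

print(round(km_from_pole(82.5)))  # ~834 km, matching the GISTEMP station list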

Even worse, there’s no data in GISTEMP for Alert, NWT since 1991. Funny, though: you can get current data right now, today, from Weather Underground, right here. WUWT?

Here’s the METAR report for Alert, NWT from today:

METAR CYLT 261900Z 31007KT 10SM OVC020 01/M00 A2967 RMK ST8 LAST OBS/NEXT 270600 UTC SLP051

The next closest GISTEMP station is Nord ADS, 935 km (580 miles) away.

Most Arctic stations used in GISTEMP are 1000 km (621 miles) or more from the North Pole. That is about the distance from Chicago to Atlanta. Again, would you use climate records from Atlanta to gauge what is happening in Chicago?

Note the area between Svalbard and the North Pole in the globe below. There is no data in the 250 km 1880-2009 trend map indicating that region has warmed significantly, yet the GISS 1200 km 1880-2009 map has it warming 2-4°C. Same story for northern Greenland, the Beaufort Sea, etc. There are a lot of holes in the polar data that have been filled by interpolation.

The GISS Arctic (non) data has been widely misinterpreted. Below is a good example:

Apr 8, 2009

Monitoring Greenland’s melting

The ten warmest years since 1880 have all taken place within the 12-year period of 1997–2008, according to the NASA Goddard Institute for Space Studies (GISS) surface temperature analysis. The Arctic has been subject to exceptionally warm conditions and is showing an extraordinary response to increasing temperatures. The changes in polar ice have the potential to profoundly affect Earth’s climate; in 2007, sea-ice extent reached a historical minimum, as a consequence of warm and clear sky conditions.

If we look at the only two long-term stations which GISS does have in Greenland, it becomes clear that there has been nothing extraordinary or record-breaking about the last 12 years (other than one probably errant data point). The 1930s were warmer in Greenland.

Similarly, GISS has essentially no 250 km 1880-2009 data in the interior of Africa, yet has managed to generate a detailed profile across the entire continent for that same time period. In the process of doing this, they “disappeared” a cold spot in what is now Zimbabwe.

Same story for Asia.

Same story for South America. Note how they moved a cold area from Argentina to Bolivia, and created an imaginary hot spot in Brazil.

Pay no attention to that man behind the curtain.




282 Comments
July 27, 2010 7:15 pm

stevengoddard says: “You do realize that a trend involves data at both ends, as well as the middle.” And you continued, “GISS is lacking data in Africa from the start of the period. They are also lacking data at the end of the period (present.)”
Let’s see, you object to the trends being presented in the GISTEMP product with 1200km radius smoothing, and for that dataset, here’s the map of annual anomalies for 1880:
http://i27.tinypic.com/2ll1zzb.jpg
And here’s 2009:
http://i28.tinypic.com/1zh3121.jpg
The African coverage may be incomplete in 1880, but again, Steven, GISS writes, “Trends: Temperature change of a specified mean period over a specified time interval based on local linear trends.” And they further qualify it with, “’Trends’ are not reported unless >66% of the needed records are available.”
With that in mind, here’s a gif animation of the maps of the GISTEMP annual temperature anomalies with 1200km radius smoothing, at 20 year intervals from 1890 to 1990. Note how the data in the African interior is almost complete by 1910:
http://i26.tinypic.com/16hojv5.jpg
So apparently GISS has greater than 66% of the needed records; otherwise, they don’t print the trend. Do you have any data or documentation that shows that GISS is presenting trends in areas where they have less than their 66% threshold for the dataset with the 1200km radius smoothing?
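For readers who want to see what a completeness rule of that sort amounts to, here is a minimal sketch: a trend is reported only when more than 66% of the years in the interval have data. The threshold logic follows the GISS wording quoted above, but the series is invented and this is not GISTEMP’s actual code.

def trend_if_complete(years, values, min_fraction=0.66):
    # Ordinary least-squares slope (deg C per year), or None if coverage
    # does not exceed min_fraction of the interval. `values` may contain
    # None for missing years.
    pairs = [(y, v) for y, v in zip(years, values) if v is not None]
    if len(pairs) <= min_fraction * len(years):
        return None
    xs, ys = zip(*pairs)
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

years = list(range(1880, 2010))
# Invented record: no data before 1910, mild warming afterwards (~77% coverage).
values = [None if y < 1910 else 0.005 * (y - 1910) for y in years]
print(trend_if_complete(years, values))  # ~0.005 deg C/yr, since coverage exceeds the 66% bar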

DR
July 27, 2010 7:23 pm

Further, Roger Pielke Sr. (RPS) has done considerable work in this area.
From his weblog in January:
NASA GISS Inaccurate Press Release On The Surface Temperature Trend Data

My comments below remain unchanged. Readers will note that Jim Hansen does not cite or comment on any of the substantive unresolved uncertainties and systematic warm bias that we report on in our papers. They only report on their research papers. This is a clear example of ignoring peer reviewed studies which conflict with one’s conclusions.

Have GISS apologists addressed the issues outlined by Pielke et al.? If so, where? Certainly not in this thread.

July 27, 2010 7:35 pm

Bob,
Are you trying to make an argument that GISS 250 km maps are incorrect? They clearly show large regions of “no data.”

July 27, 2010 7:53 pm

I give up. Everyone please continue to ignore my post from 1:44 PM. It’s now so far up the page that you would need oxygen tanks and a team of Sherpa to climb back up to it.
– dT

July 27, 2010 7:54 pm

stevengoddard says:
July 27, 2010 at 5:43 pm
“Some people in this discussion seem satisfied with the idea that interpolation sorta, kinda maybe works sometimes within a few degrees.
We are talking about a global temperature measurement reported within one one hundredth of a degree. It is ludicrous.”
Yep, keep hammering!
How most people come to understand temperature anomaly: We gather average observed temperatures for specific periods of time(T). We see that in one period of time the average temperature was N. In a later period of time we observe the average temperature was M. We note the difference between N and M. We see that the difference is P. (N-M=P) So the anomaly(Z) = P/T.
The way GISS understands anomalies: Gather average observed temperatures for specific periods of time(T) specific to one location. See that in one period of time the average temperature was N, specific to one location. In a later period of time, observe the average temperature was M, specific to one location. They note the difference between N and M, specific to one location.
Then, they realize there are other locations, only they haven’t gathered the observed temperatures. So they then state, if observed temperatures at this point and time here were N, then over there it must have been iA! And now, if the temperatures here are M then over there must be iB! So the difference for there is iA-iB which =iC and iC/T is the anomaly(Z) for there! So the total anomaly is Z + iZ /2! Which, of course = iZ.

July 27, 2010 8:04 pm

I forgot to add, please for the love of everything holy, take that to your high school algebra teacher or your high school science teacher and have them explain why I’m wrong. I’m wrong on several levels, but when it is explained how I’m wrong, you should come to an understanding as to why GISS is wrong………..on several levels…….sophomoric math and science. How much bastardization of the (once) hard sciences can we stomach? Are we going to continually lower the standards in public education just so we can pass this BS off as science? DAMMIT, my grandchildren go to public schools!! What’s wrong with you people?
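For anyone who does take that homework assignment, here is a minimal sketch of the textbook version: a station’s anomaly is its observed value minus that same station’s mean over a base period (GISS uses 1951-1980). The numbers below are invented, and this ignores the gridding and interpolation being argued about in this thread.

def station_anomaly(observed, base_period_values):
    # Anomaly = observed temperature minus the station's own base-period mean.
    baseline = sum(base_period_values) / len(base_period_values)
    return observed - baseline

# Invented July means (deg C) for one station over part of a base period.
base_julys = [21.4, 20.9, 21.7, 21.1, 21.3]
print(round(station_anomaly(22.0, base_julys), 2))  # +0.72 deg C relative to its own baseline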

Frank K.
July 27, 2010 8:30 pm

Doubting Thomas says:
July 27, 2010 at 7:53 pm
Hey dT. I looked at your post above and of course you are correct. The global temperature anomaly product of the GISS is merely an ad hoc index, and is thermodynamically meaningless. However, this does not prevent them from saying things like the earth is the “hottest” it’s ever been in 120 years of recorded history!
Moreover, while most of this discussion has focused on the land data, let’s not forget the sea surface temperature data. You know, those ocean temperatures obtained in the past by sailors (er, nautical climatologists) using very careful measuring techniques involving buckets and thermometers…

bemused
July 27, 2010 9:58 pm

Doubting Thomas,
Sorry, have just read your post. You’re right – global surface temperature will not tell the whole story. To look at energy in the system you have to consider sensible and latent heat in the full depth of the atmosphere (not just the near surface layer) and also the ocean heat content.
Here’s an interesting link looking at the moist enthalpy near the surface:
http://atmoz.org/blog/2008/05/07/using-surface-heat-content-to-assess-global-warming/
it doesn’t change the overall signal though…

bemused
July 27, 2010 10:32 pm

Steven Goddard at 4:36
I don’t think GISS says anything about daily temperature anomalies, so I have no idea where you get your x50 precision statement from. In my original post I said that this was a silly thing to do for a daily value.
I was simply explaining to you the difference between absolute temperatures and temperature anomalies; a simple point that you very clearly failed to grasp when you wrote this article and your earlier replies.

July 27, 2010 10:34 pm

Doubting Thomas says:
July 27, 2010 at 7:53 pm
“I give up……”
The reason no one commented is that it is a fairly nuanced perspective. I can’t argue with the logic. Sure, humidity (read: water) carries heat/energy. There is another who posts here who affirms that the energy is mostly carried in the oceans instead of the air. Similar, but not exactly the same perspective. I think there is much to be made of the water/energy vs. the air/energy topic. You guys should get together and write a paper.
Most won’t comment on it because it would change the dynamics of the argument. (I don’t because I can neither add to nor subtract from the assertion.) Most aren’t prepared to discuss the issue with either of you. Most acknowledge that temperature is a measurement of heat. Heat is an expression of energy. Water holds heat, so water holds energy. I’m reading Ohm’s law right now, but I can’t find a reference to water!?!? Dang, I’m talking about electricity when we’re talking about energy, or heat, which electricity is. See where I’m taking this? Your point likely sent many people to the books, but, in general, most alarmists and skeptics alike simply aren’t prepared or equipped to discuss this perspective.
When it leaves water, where does it go? In what form? How is it measured? How long is the heat in water? Does the density of the water determine how much energy it holds? Is this really relative to global warming or is it a constant which holds a certain amount of energy? Does ice or snow hold the energy?
Beyond doubt, water, in vapor form or as liquid, holds energy and is an important factor in our climate. But I’d rather talk about temp anomalies that exist when temps don’t.
Just because we don’t comment on it, doesn’t mean people haven’t read it, we just don’t know jack about it……I’d stay with it though…….seems valid.

July 27, 2010 10:39 pm

bemused
GISS doesn’t measure temperature anomalies, nor does anybody else. Take a trip to your local Wal-Mart and ask them for a “temperature anomaly thermometer.”
What GISS does around the Arctic is to take temperature readings in a few locations, manipulate them upwards, then extrapolate them across vast distances with no data. Then they calculate imaginary anomalies based on past imaginary temperatures.
Perhaps you should think more and accuse less?

July 27, 2010 11:19 pm

bemused says: July 27, 2010 at 9:58 pm (and thanks for responding), “it doesn’t change the overall signal though…” But we’re still stuck with the stark fact that the average “moist” enthalpy in Phoenix (how is “moist” different from “normal” enthalpy?) on 7/22 was greater than on 7/26 when the temperature was 5 deg. F higher. (Ref. my post of 1:44 PM.)
The cited post says, “Put another way, if global warming were to be framed as a change in surface energy as opposed to surface temperature, the degree of warming would be twice as large.” That is not “the same signal.” It’s a much stronger signal.
I have to study the post, and it’s way past my bedtime, but I don’t immediately see how these two statements can sleep comfortably together.
– dT

July 27, 2010 11:19 pm

Bill Illis says:
July 27, 2010 at 5:46 pm
GISS, NCDC, and Hadley/CRU are just being lazy.
I don’t know if it’s laziness. I think they’ve put effort into deciding which stations to use, and into how to phrase why they chose those stations so as to deflect any appearance of being shifty. So I wouldn’t say lazy. But I’m not as diplomatic as you, I think.

July 27, 2010 11:24 pm

For those who haven’t seen this video yet, this is Joseph D’Aleo talking about the dropped stations:

jaymam
July 27, 2010 11:25 pm

Thanks Bob Tisdale. I didn’t spot the decimal point.
So an average global rise of 0.008 degrees C each year is a catastrophic problem according to the warmists?

July 27, 2010 11:37 pm

James Sexton,
I dropped out of high school sometime in the 12th grade. If I can get it, you can get it. Find a psychrometric chart. This is simple engineering, not rocket science.
Evaporating water uses energy, but energy (they say) is always conserved. So any energy used to evaporate water is still there. It’s in the “form” of evaporated water. It’s called latent heat because it’s not expressed as temperature. Water exists on earth in three phases: ice, liquid and vapor. Anytime it changes from one phase to another it either uses, or gives back, energy. Since our wonderful planet has a whole lot of water, one has to take its three phases into account.
The temperature of atmospheric air is not a measure of its total energy content.
In the context of man-made global warming, all the argumentation about temperature (or temperature anomalies) is, at best, a big waste of time. Temperature, alone, cannot tell us anything about the energy budget of a wet planet.
– dT
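A minimal sketch of the bookkeeping dT is describing, using a common approximation for the moist enthalpy of near-surface air, h ≈ cp·T + Lv·q (sensible plus latent terms). The two example days are invented, not the Phoenix observations mentioned upthread, and the constants are rounded.

CP_DRY_AIR = 1005.0   # J/(kg K), specific heat of dry air at constant pressure
LV_WATER = 2.5e6      # J/kg, latent heat of vaporization (rounded)

def moist_enthalpy(temp_c, specific_humidity):
    # Approximate moist enthalpy per kg of air: sensible heat plus the latent
    # heat of the water vapor it carries (specific humidity in kg vapor per kg air).
    return CP_DRY_AIR * temp_c + LV_WATER * specific_humidity

# Invented example: a cooler but more humid day can carry more energy per kg of
# air than a hotter, drier one. Temperature alone misses the latent term.
humid_day = moist_enthalpy(temp_c=35.0, specific_humidity=0.018)  # 35 C, 18 g/kg
dry_day = moist_enthalpy(temp_c=40.0, specific_humidity=0.008)    # 40 C,  8 g/kg
print(round(humid_day), round(dry_day))  # ~80175 vs ~60200 J/kg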

July 27, 2010 11:49 pm

It’s cold in Pasadena, CA. Really cold. Could we get some of that east coast warming out here?
I read somewhere that there are amino acids in meteorites. The D’Aleo video is nice.
Bedtime … zzzzz
– dT

July 28, 2010 2:27 am

stevengoddard says: “Are you trying to make an argument that GISS 250 km maps are incorrect? They clearly show large regions of ‘no data.'”
No. I’m not saying the GISS maps with 250km radius smoothing are incorrect. But they are not the dataset that has the trends applied to it. It’s the maps with the 1200km radius smoothing that have the trends.

July 28, 2010 4:29 am

Steven Goddard: Have you ever plotted the GISTEMP combined land+sea surface temperature anomalies and compared the products with 250km and 1200km radius smoothing? That’s how you determine if the 1200km radius smoothing really influences the end product.
Here’s a comparison of the changes in zonal mean temperature anomalies from 1880 to 2009 for the datasets with the 250km and 1200km radius smoothing. I’ve used the data that GISS presents in the zonal mean plots below the trend maps, the ones you’ve used in this post.
http://i28.tinypic.com/2hqufbb.jpg
Note how the two datasets only diverge at high latitudes. Why? Those are the latitudes where GISS deletes SST data and extends land surface data out over the oceans. Remember my post on this? Here’s a link:
http://bobtisdale.blogspot.com/2010/05/giss-deletes-arctic-and-southern-ocean.html
And here’s a time-series comparison of the two GISTEMP combined products with 250km and 1200km radius smoothing, in which I’ve limited the latitudes to 60S-60N, excluding the poles:
http://i29.tinypic.com/55pc86.jpg
The difference in trends is 0.003 deg C/Decade or 0.03 deg C/Century. This indicates the 1200km radius smoothing adds basically nothing to the GISTEMP product for the vast majority of the globe.
In other words, outside of the Arctic, your complaints about the GISTEMP product with 1200km radius smoothing are unfounded.
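A minimal sketch of the comparison described above: fit a linear trend to each series and difference the slopes. The two series here are synthetic stand-ins constructed to differ by about 0.003 deg C/decade, not the actual GISTEMP 250km and 1200km products.

import numpy as np

years = np.arange(1880, 2010)
# Synthetic stand-ins: nearly identical warming, differing by 0.0003 deg C/yr.
series_250km = 0.0070 * (years - 1880)
series_1200km = 0.0073 * (years - 1880)

slope_250, _ = np.polyfit(years, series_250km, 1)    # deg C per year
slope_1200, _ = np.polyfit(years, series_1200km, 1)

# Difference in trend, expressed per decade as in the comment above.
print(round((slope_1200 - slope_250) * 10, 3))  # ~0.003 deg C/decade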

July 28, 2010 5:34 am

Bob Tisdale,
You might want to think about these statements from Hansen. You are completely missing the point.

the 12-month running mean global temperature in the GISS analysis has reached a new record in 2010…. GISS analysis yields 2005 as the warmest calendar year, while the HadCRUT analysis has 1998 as the warmest year. The main factor is our inclusion of estimated temperature change for the Arctic region.

July 28, 2010 5:49 am

I’ll be blunt. I’ve said it here before and I will say it again. I dislike the use of anomalies to describe a temperature record. Anomalies can be used in two ways: As “snapshots of real data” (good but unnecessary), and as “Fudge Factors” (Bad). Yes, we like to see TRENDS. But real data will show you trends just as well as anomalies when both are based on real data.
As far back as Kepler, who as best I can tell was the first to use the term anomaly in science (as in “mean anomaly”) when describing planetary motion, it has been used as a “fudge factor”; hence my dislike for the term and its usage. Kepler was wrong, by the way, although not by much, but he had to fudge it to make it work; it is fascinating to read how he did it. He actually built a model, yes a model (what a novel idea!), and had to fudge that too, by the way, and yes, it was also wrong.
So why are we using anomalies to express global temperatures? Both NASA and NOAA go through great lengths to justify the use of anomalies instead of temperatures. I’ll get to those in a second.
First, I have another question. Why would Hansen go to such great lengths back in 1987 to justify the validity of anomalies over ~1000km grids? Regardless of the answer, in this particular application, anomalies are obviously being used as “Fudge Factors”: to hide the absence of real data.
And why did they pick the mean to be from 1951 to 1980? This is the mean now used by GISS and NCDC. Well, in the 1987 paper they mention that: “The zero point of the temperature scale is arbitrary.”
Arbitrary – From Webster:
1 : depending on individual discretion (as of a judge) and not fixed by law
Why would they do that? I don’t have an answer, but I will say that where there is individual discretion, there is prejudice (not in the pejorative sense) – by definition.
And that’s the whole problem with anomalies: in this case the zero point is a 30-year period in the life of a 4-billion-year-old planet that just happens to coincide with one of the largest upticks in temperature in the last 100 or so years, not that 100 years means squat in the big planetary time-scale picture, whatever. And they are creating data where it doesn’t exist.
So why do that? What’s the urgent need for this fudging? I can only speculate and suggest that they needed these uniformly spread out global anomalies for one thing: Model Feed.
Now let’s look at the justifications for anomalies made by NASA and NOAA. First from the NCDC:
“Absolute estimates of global average surface temperature are difficult to compile for several reasons. Some regions have few temperature measurement stations (e.g., the Sahara Desert) and interpolation must be made over large, data-sparse regions. In mountainous areas, most observations come from the inhabited valleys, so the effect of elevation on a region’s average temperature must be considered as well. For example, a summer month over an area may be cooler than average, both at a mountain top and in a nearby valley, but the absolute temperatures will be quite different at the two locations. The use of anomalies in this case will show that temperatures for both locations were below average.”
So, in the first part they say “we need to fudge”. The second part, the last two sentences, is bogus because with real data sets for both locations there is no need for anomalies to calculate “below average” TRENDS.
And from GISS:
“Anomalies and Absolute Temperatures
Our analysis concerns only temperature anomalies, not absolute temperature. Temperature anomalies are computed relative to the base period 1951-1980. The reason to work with anomalies, rather than absolute temperature is that absolute temperature varies markedly in short distances, while monthly or annual temperature anomalies are representative of a much larger region. Indeed, we have shown (Hansen and Lebedeff, 1987) that temperature anomalies are strongly correlated out to distances of the order of 1000 km. For a more detailed discussion, see The Elusive Absolute Surface Air Temperature.”
So I went and read “The Elusive Absolute Surface Air Temperature” and read this:
“Q. If SATs cannot be measured, how are SAT maps created?
A. This can only be done with the help of computer models, the same models that are used to create the daily weather forecasts. We may start out the model with the few observed data that are available and fill in the rest with guesses (also called extrapolations) and then let the model run long enough so that the initial guesses no longer matter, but not too long in order to avoid that the inaccuracies of the model become relevant. This may be done starting from conditions from many years, so that the average (called a ‘climatology’) hopefully represents a typical map for the particular month or day of the year.”
Ahhh, models…. Remember Kepler? Love the word “hopefully” in there too…. Notice that, in contrast to the NCDC, they do not mention the lack of data directly; instead they refer to the Hansen paper saying that anomalies correlate out to ~1000km. They jump from the premise of big variations over “short distances” to concluding that anomalies are better for “monthly or annual” data and “much larger regions.” It still means the same thing: they don’t have the data and are fudging it. The really interesting thing is how they go to great lengths to say that SATs are worthless. But they still have to use them as their base data!
I speculate that what might have started as Model Feed has now evolved into another global depiction tool all on its own.
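A small numeric illustration of the NCDC passage quoted above: a mountaintop and a nearby valley can differ by fifteen degrees in absolute temperature yet report nearly the same departure from their own averages, which is the property the anomaly approach relies on. All numbers below are invented.

# Invented July temperatures (deg C) for two nearby sites with very different
# absolute climates: a valley station and a mountaintop station.
valley_normal, valley_this_july = 24.0, 22.5
summit_normal, summit_this_july = 9.0, 7.5

valley_anomaly = valley_this_july - valley_normal   # -1.5 deg C
summit_anomaly = summit_this_july - summit_normal   # -1.5 deg C

# The absolute temperatures differ by about 15 deg C, but both anomalies say
# the same thing: this July ran 1.5 deg C below each site's own average.
print(valley_anomaly, summit_anomaly)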

Ryan
July 28, 2010 6:09 am

Tisdale: The red blob in South America bears no relation to any of the surrounding data points. According to GISTEMP the nearest data point to the center of that red dot is Cuiaba, 630km away – which shows no warming. Next up would be Santarem showing 1Celsius warming. Next up Manaus – no real warming, maybe 0.5Celsius if you were generous. Next up Corumba (1000km away) – 1 Celsius warming. No more long term data after that.
The GISS map with 1200km smoothing shows a 2-4Celsius anomaly in Brazil. I cannot find any data to support it. The anomaly in GISTEMP is 1Celsius at most.
Looking at the 250km smoothing, I think the algorithm is doing rather more than you suppose. The only real data is dispersed around the red dot. The anomaly showing 1-2 Celsius at the north coast and the south coast appears to have been joined up by the algorithm, and the red center then appears to be an extrapolation of the trend from cooling/stasis at the east and west coasts through the warming center to a red-hot core.
All of this appears to be made up, however, since there is no actual data in that red dot at all, let alone any data showing a >1 Celsius anomaly. The area seems to be centered on Mato Grosso, a heavily forested part of Brazil with little human habitation. It doesn’t surprise me that there is no reliable data for this area, given that the stations that could be used to support it are all at the coast.

DR
July 28, 2010 6:14 am

Bob Tisdale;
Is the divergence of GISS from the rest of the global temperature products, then, based solely on their Arctic interpolation/extrapolation (which is it)?

Ryan
July 28, 2010 6:26 am

Mosher/Bemused: If smoothing anomalies over 1200km were a valid approach, then the maps for the 250km smoothed data would be more or less the same as the map using 1200km data. In fact that is clearly not the case, as you can prove to yourself by comparing the first two maps on this thread.
The map with 250km smoothing shows regions of +3-4 Celsius anomalies right next to -2 Celsius anomalies, only 250km apart. These differences don’t happen just once or twice; they happen all over the map.
Fact is, the idea that temperature anomalies would be the same over distances of 1200km came from the idea that global warming was, well, a global phenomenon. So a 2 Celsius warming seen in Canada must mean 2 Celsius warming everywhere else, more or less. Fact is, the 250km map vividly illustrates the flaw in that argument. It begs the question, “just how global is global warming?”
Still, I don’t expect either of you to respond to this post, since I notice that both of you dodge the more difficult questions in favour of using obfuscation and cod science to make weaker points. Tell you what, go away and read the Encyclopaedia Britannica and come back when you’re ready.
