GISS Swiss Cheese

By Steve Goddard

We are all familiar with the GISS graph below, showing how the world has warmed since 1880.

http://data.giss.nasa.gov/gistemp/graphs/Fig.A2.lrg.gif

The GISS map below shows the geographic details of how they believe the planet has warmed. It uses 1200 km smoothing, a technique which allows them to generate data where they have none, based on the idea that temperatures don’t vary much over 1200 km. By that logic, it would be “reasonable enough” to use the Monaco weather forecast to make picnic plans in Birmingham, England, or to assume that the weather and climate of Portland, Oregon can be inferred from those of Death Valley.
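For reference, the smoothing described in Hansen and Lebedeff's method weights each station's anomaly by its distance from the grid point, with the weight falling linearly to zero at the smoothing radius. A minimal sketch in Python (the station distances and anomaly values below are hypothetical, chosen only to illustrate the radius effect):

```python
def station_weight(distance_km, radius_km=1200.0):
    """Linear distance weighting: 1 at the grid point, 0 at the radius."""
    return max(0.0, 1.0 - distance_km / radius_km)

def gridpoint_anomaly(stations, radius_km=1200.0):
    """Weighted average of station anomalies within radius_km.

    stations: list of (distance_km, anomaly_degC) pairs.
    Returns None when no station falls inside the radius (a "gray" cell).
    """
    weighted = [(station_weight(d, radius_km), a) for d, a in stations]
    total = sum(w for w, _ in weighted)
    if total == 0.0:
        return None  # no data within the radius
    return sum(w * a for w, a in weighted) / total

# Hypothetical example: one nearby cool station, one distant warm one.
stations = [(100.0, 0.2), (1100.0, 2.0)]
print(gridpoint_anomaly(stations, radius_km=250.0))   # only the nearby station counts
print(gridpoint_anomaly(stations, radius_km=1200.0))  # the distant station now contributes
```

At a 250 km radius the distant station drops out entirely; at 1200 km it pulls the grid point's anomaly well away from the nearby reading.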

GISS 1200 km

The map below uses 250 km smoothing, which allows us to see a little better where they actually have trend data from 1880-2009.

GISS 250 km

I took the two maps above, projected them onto a sphere representing the earth, and made them blink back and forth between 250 km and 1200 km smoothing. The Arctic is particularly impressive: GISS has determined that the Arctic is warming rapidly across vast distances where they have no 250 km data (pink).

You can verify for yourself that there is no data in the region by using the GISTEMP station locator at http://data.giss.nasa.gov/gistemp/station_data/

If we choose 90N 0E (North Pole) as the center point for finding nearby stations:

http://data.giss.nasa.gov/cgi-bin/gistemp/findstation.py?datatype=gistemp&data_set=1&name=&world_map.x=369&world_map.y=1

We find that the closest station to the North Pole is Alert, NWT, 834 km (518 miles) away. That’s about the distance from Montreal to Washington, DC. Would temperature data from Montreal be valid for Washington, DC?
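The 834 km figure is easy to check with a great-circle calculation: Alert sits at roughly 82.5°N, 62.3°W, so its distance from the pole is essentially 7.5 degrees of latitude.

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Haversine great-circle distance between two coordinates in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# North Pole to Alert, NWT (approximately 82.5 N, 62.3 W)
print(round(great_circle_km(90.0, 0.0, 82.5, -62.3)))  # -> 834
```

Longitude is irrelevant when one endpoint is the pole itself, which is why the approximate coordinates still reproduce the quoted distance.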

Even worse, there has been no data in GISTEMP for Alert, NWT since 1991. Funny, though: you can get current data right now, today, from Weather Underground, right here. WUWT?

Here’s the METAR report for Alert, NWT from today:

METAR CYLT 261900Z 31007KT 10SM OVC020 01/M00 A2967 RMK ST8 LAST OBS/NEXT 270600 UTC SLP051

The next closest GISTEMP station is Nord, ADS at 935 km (580 miles) away.

Most Arctic stations used in GISTEMP are 1000 km (621 miles) or more from the North Pole. That is about the distance from Chicago to Atlanta. Again, would you use climate records from Atlanta to gauge what is happening in Chicago?

Note the area between Svalbard and the North Pole in the globe below. There is no data in the 250 km 1880-2009 trend map indicating that region has warmed significantly, yet the GISS 1200 km 1880-2009 map has it warming 2-4°C. Same story for northern Greenland, the Beaufort Sea, etc. There are a lot of holes in the polar data that have been interpolated over.

The GISS Arctic (non) data has been widely misinterpreted. Below is a good example:

Apr 8, 2009

Monitoring Greenland’s melting

The ten warmest years since 1880 have all taken place within the 12-year period of 1997–2008, according to the NASA Goddard Institute for Space Studies (GISS) surface temperature analysis. The Arctic has been subject to exceptionally warm conditions and is showing an extraordinary response to increasing temperatures. The changes in polar ice have the potential to profoundly affect Earth’s climate; in 2007, sea-ice extent reached a historical minimum, as a consequence of warm and clear sky conditions.

If we look at the only two long-term stations which GISS does have in Greenland, it becomes clear that there has been nothing extraordinary or record-breaking about the last 12 years (other than one probably errant data point). The 1930s were warmer in Greenland.

Similarly, GISS has essentially no 250 km 1880-2009 data in the interior of Africa, yet has managed to generate a detailed profile across the entire continent for that same time period. In the process of doing this, they “disappeared” a cold spot in what is now Zimbabwe.

Same story for Asia.

Same story for South America. Note how they moved a cold area from Argentina to Bolivia, and created an imaginary hot spot in Brazil.

Pay no attention to that man behind the curtain.



282 Comments
Steve M. from TN
July 27, 2010 9:24 am

Ok let me see if I can paraphrase what Bob Tisdale has been alluding to:
250km smoothing:
Grid cell X in the middle of Africa has 50% data recorded. This does not meet the GISS Standard of 66%, so the algorithm looks for stations within 250km to “fill in” the missing data. It can’t find any, so it is displayed as gray “no data available”
1200km smoothing:
Same grid cell: the algorithm looks for other stations within 1200 km and finds some, so it is able to “fill in” the missing data using trends from other stations, and displays an anomaly color on the map.
Am I close Bob?
This brings a question to mind: can “filled in” data propagate across multiple grid squares? Or does the algorithm use only “raw” data?
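The rule Steve M. paraphrases amounts to a simple decision procedure; a toy sketch of that logic (a hypothetical interface, not the actual GISTEMP code, and Bob Tisdale's reply further down notes the real pipeline applies the smoothing before trend analysis):

```python
def fill_cell(own_anomaly, coverage, nearby_anomalies, min_coverage=0.66):
    """Toy infill rule: use the cell's own record if coverage meets the
    threshold; otherwise average stations found within the smoothing
    radius; otherwise report no data (gray on the map)."""
    if coverage >= min_coverage:
        return own_anomaly
    if nearby_anomalies:
        return sum(nearby_anomalies) / len(nearby_anomalies)
    return None  # gray: "no data available"

# 250 km case: 50% coverage, no stations within the radius -> gray
print(fill_cell(0.4, 0.50, []))
# 1200 km case: same cell, two stations now inside the radius -> filled in
print(fill_cell(0.4, 0.50, [1.0, 2.0]))
```

Steve M.'s follow-up question (can filled-in data propagate?) corresponds to whether `nearby_anomalies` may itself contain infilled values rather than raw station data.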

Gail Combs
July 27, 2010 9:26 am

Geoff Smith says:
July 26, 2010 at 7:24 pm
So this is outright lying, but why? So many groups are doing this; can it be out of self-interest and funding?
Maybe there really is something to the Iron Mountain Report.
______________________________________________________________
Google “Maurice Strong”; he has his fingers in a lot of pies, starting with the UN’s First Earth Summit in 1972.
Who controls the food supply controls the people; who controls the energy can control whole continents; who controls money can control the world. – Kissinger
MONEY
“A Primer on Money” — published by the Government Printing Office in 1964
Report to Pres. Reagan: all taxes go to banks as interest
READ the comments on the Financial Stability Board
Official text: Financial Regulatory Reform: A New Foundation
Explanation of Obama’s “Financial Regulatory Reform
FOOD
History, HACCP regs and Food
food system is in trouble: My comment has a lot of links so I will not repeat them
ENERGY
Climategate e-mail on Global Governance & Sustainable Development (B1)
Here is more on the (B1) scenario IPCC Emissions Scenarios
Here is who Ged Davis is (Shell Oil executive with IPCC connection)
Here is the context and history:
In Maurice Strong’s 1972 First Earth Summit speech, Strong warned urgently about global warming
Obama’s Chief Science Adviser is John Holdren. In their 1973 book “Human Ecology: Problems and Solutions,” Holdren and co-authors Paul and Anne Ehrlich wrote:
The de-development plan is UN Division for Sustainable Development – full text of Agenda 21
UN REFORM – Restructuring for Global Governance
Our Global Neighborhood – Report of the Commission on Global Governance: a summary analysis
a lot of research and links about Agenda 21 in the USA
USA and EU sign law harmonization agreement
====================
REPLY: This is getting quite a ways off-topic – Anthony

dp
July 27, 2010 9:26 am

“Which direction do you walk and why?”
Which way is the wind blowing?

July 27, 2010 9:26 am

It was almost 100 degrees in Colorado yesterday. San Diego was 65 degrees.
Therefore, Death Valley must have been about 70 degrees yesterday afternoon, and Chicago must have been about 125 degrees.

Gail Combs
July 27, 2010 9:30 am

Mike says:
July 26, 2010 at 7:32 pm
So there is uncertainty and gaps in the data. Maybe that is why the first GISS graph has error bars on it. If you can demonstrate that their error bars are smaller than they should be you might have something worth talking about.
__________________________________________
I suggest you read AJStrata’s analysis of the error in the temperature data:
http://strata-sphere.com/blog/index.php/archives/11420

Christopher
July 27, 2010 9:39 am

Why is it that all the most severe warming always seems to happen in areas that are sparsely populated? Just a fluke? Or something more sinister?

July 27, 2010 9:42 am

Steven Mosher: July 27, 2010 at 9:13 am
2. I tell you that 1200km north of you is a spot where the temperature is 100F. East of you at 1200KM is spot that is 110F. South of you is a spot that is 80F. West of you at 1200km it is 115F. Which direction do you walk and why?
Southeast.
If it gets warm as I head north and cool as I head south, then I must be in the Southern Hemisphere. If the mystery desert is 2,400km wide, then I must be in Australia.
Therefore, I head southeast because I know some *great* bars in Sydney…

wayne
July 27, 2010 9:42 am

Steven Mosher says:
July 27, 2010 at 8:41 am
The warming produced by GHGs happens slowly over time and the impact is lagged. over time this warming shows up in a long term trend.
~~~~
Just as with a secular trend in solar insolation during the recovery from the LIA in the 1600s, mainly showing up in the 1900s. It was the sun, Steven Mosher.
There is about 2% water vapor in the air at any given time. CO2 could only have an effect at the ratio of CO2 to water vapor, about .04% / 2% at most, or in other words 2% of any warming since the year 1700; they are both GHGs.
Another way to get a rough estimate is by taking 20000 water molecules plus 270 CO2 molecules per one million in the year 1700 and comparing that to 20000 water molecules plus 400 CO2 molecules per one million in 2010. That is about 1 – 20270 / 20400, or less than a 1% effect. You see, the effect from the minuscule increase in CO2 is not what we have seen; at most, if water molecules have not decreased by the 400 less 270 difference to compensate, then there would have been no effect from the CO2 increase. It was the sun, Steven, the sun. I’m surprised you have eaten some of the figments being passed around. 🙂
Your big mistake: you leave out the 20000 per million water vapor molecules in your calculations. What happens to CO2 molecules also happens to water vapor molecules; there are just a huge amount more water vapor molecules.

July 27, 2010 10:07 am

wayne:
“There is about 2% water vapor in the air at any given time. CO2 could only have an affect at the ratio of CO2 to water vapor of about .04% / 2% at most or in other words 2% of any warming since year 1700, they are both GHGs.”
you should meet mr. stratosphere. I don’t think you understand: using a % tells you very little. CO2 exists throughout the atmospheric column, as does H2O at varying concentrations. Go say hello to mr. stratosphere. Then go look at “line broadening”.
Then come back. Or go study an LBL radiative transfer model. Or if you’re an engineer who has worked with radiative physics, just run the code you use every day to design sensors or missiles or airplanes that have IR stealth. We all know how important CO2 is to the transfer of radiation in the real world. Ya, it warms the planet. Definitely doesn’t COOL the planet. Warms it. The question is how much. Read your Lindzen, Spencer, Christy, Willis, McIntyre, Monckton, etc. etc. Yup. Warms the planet. How much? That’s the real question.

Ben
July 27, 2010 10:11 am

Looking at the GISS map at the top of this page, is there a way to produce a map of 1200 km smoothing with Death Valley assigned temps from Portland, Oregon and Monaco given temps from Birmingham, England, etc? I’m just curious what that would look like. It would probably make me uncomfortably cold all year long.

max
July 27, 2010 10:45 am

So it raises the question: what % of all the warming in the last 100 years comes from within these GISS 1200 km smoothing areas?

frank
July 27, 2010 10:45 am

David Jay says: July 26, 2010 at 6:45 pm
I understand the loss of the cool spot in Africa, averaging (smearing?) should move temperatures away from extremes. However, the hot spot in Brazil is a winner. I want to hear an explanation of that methodology!
A possible explanation: Let’s suppose a large number of stations were added to Brazil between 1880 and 1920, and that these stations show strong enough warming from 1920-2010 to deserve bright red color. These new stations don’t show up on the 1880-2010 data at 250 km resolution because the first 40 years of data are missing. At 1200 km resolution, we have data for many of these grid cells extrapolated from 250-1200 km away for 1880-1920. This allows us to combine extrapolated data for 1880-1920 with “real” data for 1920-2010 and see the red color that would be present on a 1920-2010 plot. The question is whether Steve Goddard will present data for any period besides 1880-2010 so that we can see if the mysterious red color arises from real data or extrapolated data.

July 27, 2010 10:54 am

Ben,
I am working on just such an analysis to show people how much missing data could ‘skew’ the results. First I have to finish the user guide.. maybe next week.
But you DON’T assign TEMPS; you assign ANOMALIES, that is, deviations from the norm.
For example, if Death Valley temps were constant, its anomaly would be ZERO.
If I use Death Valley anomalies to ‘fill in’ the Arctic, I would input a ZERO anomaly to the Arctic. That is, I would assume that if Death Valley hadn’t warmed, then it is safe to assume that the Arctic hadn’t warmed (if I choose to make that assumption).
When it comes to ‘infilling” I can
1. NOT infill
2. Infill the GLOBAL AVERAGE of all grids.
3. Infill using the closest grids.
4. Infill with the highest anomaly on the planet( worst case)
5. Infill with the lowest anomaly on the planet (best case)
then I can compare those 5 approaches. That gives me an idea of how important missing data could be.
That exercise might be instructive.
It’s about 50F in SF now. How warm is it where you are?
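The five approaches Mosher lists are easy to compare on a toy grid: a one-dimensional stand-in for grid cells with `None` marking missing data. This is purely an illustration of the comparison he proposes, not GISTEMP's actual method.

```python
def global_mean(grid, strategy):
    """Global mean of a 1-D toy grid under one of five infill strategies.

    grid: list of anomalies, with None marking missing cells.
    """
    present = [v for v in grid if v is not None]
    if strategy == "drop":          # 1. do NOT infill
        filled = present
    elif strategy == "global":      # 2. infill with the global average
        g = sum(present) / len(present)
        filled = [g if v is None else v for v in grid]
    elif strategy == "nearest":     # 3. infill from the closest cell
        filled = [v if v is not None else
                  min((abs(i - j), grid[j]) for j in range(len(grid))
                      if grid[j] is not None)[1]
                  for i, v in enumerate(grid)]
    elif strategy == "highest":     # 4. worst case: highest anomaly
        filled = [max(present) if v is None else v for v in grid]
    elif strategy == "lowest":      # 5. best case: lowest anomaly
        filled = [min(present) if v is None else v for v in grid]
    return sum(filled) / len(filled)

grid = [0.2, None, None, 1.0, 0.4]   # hypothetical anomalies, two missing cells
for s in ("drop", "global", "nearest", "highest", "lowest"):
    print(s, round(global_mean(grid, s), 3))
```

Note that options 1 and 2 produce the same global mean: dropping a cell from the average is mathematically equivalent to infilling it with the global average of the remaining cells, so the spread between the other strategies is what indicates how much the missing data could matter.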

Contrarian
July 27, 2010 11:00 am

Steve Mosher wrote,
“If 60-70N saw a positive trend of 1C, would you expect 70-90N to see
a higher trend? lower trend? or the same trend.”
I’ll tackle that one. You should answer “NA” unless you have *data* showing a reasonably strong correlation between trends from 60-70N and 70-90N over a different period of time. By hypothesis, there is no other period with data. So you answer “NA”: not available.

A C Osborn
July 27, 2010 11:04 am

Steven Mosher says:
July 27, 2010 at 8:37 am
Zeke Hausfather says:
July 27, 2010 at 7:42 am
So you guys are saying that E.M. Smith’s analysis of all of the raw data is wrong?
That Ken Stewart’s analysis of Australian data is wrong?
Don’t you find it odd that when other people look at the data they don’t get the same trends as you and GISS get?
Are you saying that they can’t do the analysis correctly?
If so, have you pointed out their errors?

Spector
July 27, 2010 11:06 am

Piecewise Linear Data Approximation
Just for reference, I have noticed that it is possible to do a piecewise linear step approximation of the global ocean surface temperature anomaly data from January 1880 to June 2010 with just five linear segments and have a root-mean-square error of 0.0934 deg C. For this data set, that appears to be equivalent to the error obtained using a 15-term, power-series approximation.
The Microsoft Excel Solver tool was allowed to pick the initial value of the first segment, the interior segment date points (subject to manual pruning) and the slopes for each segment. Interior initial values were the calculated end-points of the previous segment. The Microsoft Excel Offset and Match functions were used to select the segment data applicable for each source date.
This representation appears to show a cyclic process having about a 30-year half period. I present this because I have not seen this technique used before to represent climate data. I hope it is useful.

Seg          Decimal            Initial      Slope
No.           Dates              Value      Deg/yr
 1    1880.042 -to- 1910.621    -0.0324    -0.01061
 2    1910.621 -to- 1941.127    -0.3569     0.01357
 3    1941.127 -to- 1971.831     0.0571    -0.00085
 4    1971.831 -to- 2004.208     0.0309     0.01309
 5    2004.208 -to- 2010.458     0.4548    -0.00049
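Spector's table can be checked directly: since each interior initial value is the calculated endpoint of the previous segment, the fit is continuous and only the first initial value is needed. A sketch of the evaluation, with the breakpoints and slopes taken from the table above:

```python
# Breakpoints and slopes from the table above; segments are continuous,
# so only the first segment's initial value (-0.0324) is required.
BREAKS = [1880.042, 1910.621, 1941.127, 1971.831, 2004.208, 2010.458]
SLOPES = [-0.01061, 0.01357, -0.00085, 0.01309, -0.00049]
INITIAL = -0.0324

def piecewise(t):
    """Evaluate the five-segment linear approximation at decimal date t."""
    value = INITIAL
    for start, end, slope in zip(BREAKS, BREAKS[1:], SLOPES):
        if t <= end or end == BREAKS[-1]:
            return value + slope * (t - start)
        value += slope * (end - start)  # carry the endpoint into the next segment

print(round(piecewise(1941.127), 3))  # close to the tabulated 0.0571
print(round(piecewise(2004.208), 3))  # close to the tabulated 0.4548
```

Evaluating at the interior dates reproduces the tabulated initial values to within rounding, confirming the continuity Spector describes.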
July 27, 2010 11:15 am

wayne:
“You see, the effect from the minuscule increase in CO2 is not what we have seen; at most, if water molecules have not decreased by the 400 less 270 difference to compensate, then there would have been no effect from the CO2 increase. It was the sun, Steven, the sun. I’m surprised you have eaten some of the figments being passed around. 🙂
Your big mistake: you leave out the 20000 per million water vapor molecules in your calculations. What happens to CO2 molecules also happens to water vapor molecules; there are just a huge amount more water vapor molecules.”
You clearly don’t understand the physics of radiative transfer: the windows through which IR can pass and those through which it cannot. You too need to meet mr. stratosphere.
Here is a question. If you were building an airplane to fly at 50K feet, and you wanted that plane to be INVISIBLE to IR sensors on the ground, you would want to know how the IR energy that plane gave off was TRANSMITTED through the atmosphere. The plane has ‘hot parts’, a gas plume, and heating due to aero friction (called aero heating). Each of these heat sources radiates at different wavelengths (more or less).
So, you have this heat source (IR) in the sky. Can you see it at the ground? Can a Stinger missile “see” it? Can a SAM see it? How much IR energy makes it through the atmosphere?
Is the IR “blocked”? Yup. But some gets through. How much gets “blocked”? It depends what is “in the way.” Did the crazy engineers who designed this baby
http://en.wikipedia.org/wiki/Northrop_YF-23
have to understand, predict, and verify how IR transmits through the atmosphere?
YUP. Yup, we sure did. What tools did we use? RTEs, radiative transfer equations.
Did the physics underlying those tools predict accurately? Yup. Did those physics say that CO2 would block some of the IR? Yup. Was it altitude dependent? Yup. Are those same physics the ones that underlie the AGW theory? Yup. Are the models they use today much better than the classified ones we used in the early 80s? Yup.
Did we fly the plane and check our predictions? Yup. Were they accurate? Yup.
Did we use the same tools on the B2? Yup. CO2 has an effect. Been there, done that.
Is H2O more important? Yup. Does that make AGW false? Nope. It’s part of the theory.

A C Osborn
July 27, 2010 11:15 am

frank says:
July 27, 2010 at 10:45 am
It doesn’t look like it.
http://chiefio.files.wordpress.com/2010/03/brazil_full_hair.png

James Sexton
July 27, 2010 11:18 am

Unbelievable conversation. Steve Goddard, keep plugging away at it. Eventually, the rest will see the truth in what you are stating.
To the GISS rationalizers: we can’t extrapolate or assume things of which we have no experience. For instance, I know fire is hot, not because each time I see a fire I experience its heat, but because I’ve experienced fire’s heat on many occasions in many different manners, so I assume all fires operate in a similar fashion because of my experience. No one really knows if we can extrapolate 1200 km in Africa or not. Why? Because we’ve never observed, or experienced, the temperature anomalies there compared to where we do have thermometers. For instance, we don’t know how to figure the temp anomaly in Tanzania from thermometer anomaly data in Capetown. Maybe one can, maybe one can’t. We don’t know, because we’re not currently measuring the temp anomaly in Tanzania. Let’s work that out in a math formula. Let’s say, just for argument’s sake, the current average yearly temp in Capetown is 16C and the anomaly is +2. So the equation would look like 16(x)=2. The current average temp in Tanzania is ? Well, we don’t know, so we’ll call it Y. So the anomaly is ……….well, we don’t know, because we couldn’t possibly know how it relates to the anomaly in Capetown. I’m glad you guys aren’t accountants. Someone show me the formula for knowing something never measured, specific to the composition, time and proximity.
One thing we do know is that while we can average temperatures for the globe, the heat isn’t evenly distributed; we just don’t know how or why. (We know some things, but we don’t know many things.) But we (the GISS) are basically performing an average distribution function to determine specifically where the distribution of the heat is going. I don’t understand how you guys and gals can’t see that. Someone send me the extrapolating formula and I’ll destroy it in less than a day.
“It’s getting hotter where we don’t measure temps.” Brilliant and beautiful. With apologies to the late Pierre Bosquet: “It is magnificent, but it is not science: it is madness.”

frank
July 27, 2010 11:20 am

The fundamental question is: “How much error does extrapolation over 1200 km introduce?” Conventional wisdom asserts that there is a strong enough correlation between temperature anomalies at stations within 1200 km that useful information can be obtained from sites that far away. When I look at the real data presented in Hansen and Lebedeff (http://pubs.giss.nasa.gov/docs/1987/1987_Hansen_Lebedeff.pdf), the situation isn’t as simple as I thought. In the tropics, correlation is weak (0.8 to <0) even over distances of 250 miles. In the Arctic, the correlation between stations that are about 1000 km apart ranges from 0.8 to 0.3. What causes this variation? Are the better correlations only seen between inland stations, with worse correlations between coastal and inland stations? Are coastal stations well correlated only when they are influenced by the same ocean currents and poorly correlated when this isn’t true (for example, different coasts of Greenland, or locations strongly and not strongly influenced by the MOC/Gulf Stream)? Is the good correlation in the Arctic mostly present during seasons when sea ice is melting or forming and temperatures are constrained by phase change to be near 0 degC? If a more refined set of rules about where useful correlation exists could be abstracted from the data, we might find that the red areas in the Arctic would shrink dramatically.
J. Box, Int. J. Climatol. 22: 1829 – 1847 (2002) claims that temperatures on the East and West Coasts of Greenland are not correlated. He has a lot of other useful info on Greenland temperatures. http://www.astro.uu.nl/~werkhvn/study/Y3_05_06/data/ex6/gl.pdf shows that

Frank K.
July 27, 2010 11:25 am

“When you compute a global average, no explicit interpolation is necessary. You can interpolate points and then add them if you want, but the summed result is still just a combination of the data points, just with different weighting. Where points are sparse, you’re just regarding them as representing a larger area. That increases the error range.”
I suppose this is true if you don’t care what the final average is. If you’re interested in the numerical value of average itself, then the “weighting” becomes quite important…
In any case, the GISS “reference station” method is an ad hoc approach to deriving a thermodynamically meaningless anomaly index. Given the large uncertainty associated with the spatial correlation of the anomalies (see the original Hansen paper), it is amusing to see people talk about the “highest index value ever!” as if we really know these values within +/- 1 C.
By the way, since the anomalies are calculated relative to a local temporal average (using a predefined reference period), has anyone generated a spatial map of these averages? I’d be interested to see what the “ideal” reference temperatures are for different locations. For example, if the anomaly for 2009 for Chicago was +0.5 C, what is the underlying average that this anomaly is referenced to? 15 C? I wonder how different (or not) this is to, say, Detroit or Indianapolis.

July 27, 2010 11:26 am

Steve M. from TN: Regarding your July 27, 2010 at 9:24 am paraphrasing:
Nope. The 250km and 1200km smoothing has already been performed before the trends are analysed.

Gail Combs
July 27, 2010 11:27 am

RW says:
July 27, 2010 at 4:24 am
“It uses 1200 km smoothing, a technique which allows them to generate data where they have none – based on the idea that temperatures don’t vary much over 1200 km”
That’s incorrect. It is observed that there is a correlation between temperature anomalies at widely spaced locations….
….I got hold of weather station data from Montreal and Washington, choosing the station from each which had the longest record. I calculated the mean January temperature, and then subtracted it from the series, to convert it from absolute temperature to temperature anomaly. I calculated the correlation between the anomalies at the two locations. I found a Pearson coefficient of 0.75, which implies a significant correlation.
_____________________________________________________________
“Pearson correlation coefficient is largely used in economics and social sciences…..” Pearson Coefficient in Analyzing Social and Economical phenomena.
Assumptions in calculating the Pearson’s correlation coefficient:
“1. Independent of case: In Pearson’s correlation of coefficient, cases should be independent to each other.
2. Distribution: In Pearson’s correlation coefficient, variables of the correlation should be normally distributed.
3. Cause and effect relationship: In Pearson’s correlation coefficient, there should be a cause and effect relationship between the correlation variables.
4. Linear relationship: In Pearson’s correlation coefficient, two variables should be linearly related to each other, or if we plot the value of variables on a scatter diagram, it should yield a straight line.”

I do not see why anyone would be using Pearson’s correlation coefficient on temperature data. I do not think the data meets the criteria.
Independent of case:
I am taking it that “cases should be independent to each other” means “Two events are independent if the occurrence of one of the events gives us no information about whether or not the other event will occur; that is, the events have no influence on each other.” http://www.stats.gla.ac.uk/steps/glossary/probability.html#indepevents
Since storm systems sweep whole continents how could the two data sets (cases) be considered independent of each other?
Distribution:
I can not see how temperature data over a year could be considered normally distributed. I would think it is bimodal (winter /summer)
Cause and effect relationship:
How does the weather in Washington DC cause the weather in Montreal?
Untrained people keep plugging numbers into “statistical programs” without understanding the underlying principles. It is the main reason I hate the flavor of the month programs that supposedly teach managers and scientists statistics and actually muck things up instead.
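For what it's worth, the coefficient RW quotes is simple to reproduce by hand. The sketch below uses two hypothetical series that share a common trend, which illustrates part of Gail's concern: a shared signal alone yields a high Pearson r, with no cause-and-effect relationship between the two series required.

```python
import math

def pearson(xs, ys):
    """Pearson correlation: covariance over the product of standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

# Two hypothetical station series sharing the same underlying trend:
trend = [float(i) for i in range(10)]
a = trend
b = [t + 0.5 * (-1) ** i for i, t in enumerate(trend)]  # small local wiggle
print(round(pearson(a, b), 3))  # high r, driven almost entirely by the shared trend
```

Neither series here "causes" the other; they merely ride the same trend, which is why a high r by itself settles little about whether one location's anomalies can stand in for another's.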

JAE
July 27, 2010 11:34 am

It is so comforting to know that our premiere space agency is so capable.

Billy Liar
July 27, 2010 11:39 am

richard telford says:
July 27, 2010 at 2:01 am
‘It would not be hard to write a useful post, to test if the interpolation of anomalies to 1200km have any skill. It would require only a modicum of coding and statistical nous.’
I look forward to reading your post.
