By Steve Goddard
h/t to reader "Phil.", who led me to this discovery.
In a previous article, I discussed how UAH, RSS and HadCRUT show 1998 to be the hottest year, while GISS shows 2010 and 2005 to be hotter.
But it wasn't always like that. GISS used to show 1998 with a 0.64 anomaly, which is higher than their current 2005 record of 0.61.
You can see this in Hansen's graph below, which is dated August 25, 1999.
But something “interesting” has happened to 1998 since then. It was given a demotion by GISS from 0.64 to 0.57.
http://data.giss.nasa.gov/gistemp/graphs/Fig.A2.lrg.gif
The video below shows the changes.
Note that not only was 1998 demoted, but so were many other years since 1975 – the start of Tamino's "modern warming period." By demoting 1998, they are now able to show a continuous warming trend from 1975 to the present – which RSS, UAH and HadCRUT do not show.
Now, here is the real kicker. The graph below appends the post-2000 portion of the current GISS graph to the August 25, 1999 GISS graph. Warming ended in 1998, just as UAH, RSS and HadCRUT show.
The image below superimposes HadCRUT on the image above. Note that without the post-1999 gymnastics, GISS and HadCRUT match quite closely, with warming ending in 1998.
Conclusion: GISS recently modified its pre-2000 historical data and is now inconsistent with the other temperature sets. GISS data now shows steady warming from 1975 to 2010, which the other data sets do not show. Had GISS not modified its historical data, it would still be consistent with the other data sets and would not show warming post-1998. I'll leave it to the readers to interpret further.
————————————————————————————————————-
BTW – I know that you can download some of the GISS code and data, and somebody checked it out and said that they couldn’t find any problems with it. No need to post that again.

kadaka:
“If you obtained from GISS the data they crank through to obtain such wonderful products as their global average whatever, would you be looking at original unaltered data from GHCN, USHCN, and SCAR, or something adjusted?
What is “GLB.Ts.txt” anyway?”
It depends upon what step of the analysis you look at: GHCN, USHCN, and SCAR are read into the program. At each step, fully documented, you can see what is done. If you have questions about GLB.Ts.txt, go download the code and walk through it. It takes a competent individual a couple of days to get up to speed. When you show yourself ready to do the same work that John G or SteveMc or me or the Clearclimate code has done, then you have standing to make a comment.
Yeah, I noticed that. I did like his book; I guess I don't have to necessarily like him, though, to appreciate a reasonably significant contribution to the whole debate. He is coming off as condescending, impatient, and generally patting the heads of the unwashed. It's a bit unseemly if you ask me.
kadaka:
Here it's called a read.me:
GISS Temperature Analysis
=========================
Sources
——-
GHCN = Global Historical Climate Network (NOAA)
USHCN = US Historical Climate Network (NOAA)
SCAR = Scientific Committee on Antarctic Research
Basic data set: GHCN – ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2
v2.mean.Z (data file)
v2.temperature.inv.Z (station information file)
For US: USHCN – ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly
9641C_200907_F52.avg.gz
ushcn-v2-stations.txt
For Antarctica: SCAR – http://www.antarctica.ac.uk/met/READER/surface/stationpt.html
http://www.antarctica.ac.uk/met/READER/temperature.html
http://www.antarctica.ac.uk/met/READER/aws/awspt.html
For Hohenpeissenberg – http://members.lycos.nl/ErrenWijlens/co2/t_hohenpeissenberg_200306.txt
complete record for this rural station
(thanks to Hans Erren who reported it to GISS on July 16, 2003)
USHCN stations are part of GHCN; but the data are adjusted for various recording
and protocol errors and discontinuities; this set is particularly relevant if
studies of US temperatures are made, whereas the corrections have little impact
on the GLOBAL temperature trend, the US covering less than 2% of the globe.
Step 0 : Merging of sources (do_comb_step0.sh)
—————————
GHCN contains reports from several sources, so there often are multiple records
for the same location. Occasionally, a single record was divided up by NOAA
into several pieces, e.g. if suspicious discontinuities were discovered.
USHCN and SCAR contain single source reports but in different formats/units
and with different or no identification numbers. For USHCN, the table
“ushcn2.tbl” gives a translation key, for SCAR we extended the WMO number if it
existed or created a new ID if it did not (2 cases). SCAR stations are treated
as new sources.
Adding SCAR data to GHCN:
The tables were reformatted and the data rescaled to fit the GHCN format;
the new stations were added to the inventory file. The site temperature.html
has not been updated for several years; we found and corrected a few typos
in that file. (Any SCAR data marked “preliminary” are skipped)
Replacing USHCN-unmodified by USHCN-corrected data:
The reports were converted from F to C and reformatted; data marked as being
filled in using interpolation methods were removed. USHCN-IDs were replaced
by the corresponding GHCN-ID. The latest common 10 years for each station
were used to compare corrected and uncorrected data. The offset so obtained
was subtracted from the corrected USHCN reports to match any new incoming
GHCN reports for that station (GHCN reports are updated monthly; in the past,
USHCN data used to lag by 1-5 years).
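For concreteness, here is a minimal sketch of that offset step in Python (illustrative only, not the actual GISS code; it assumes the corrected and uncorrected reports are parallel lists of annual means with None for missing years):

# Sketch of the USHCN offset step (hypothetical names, not GISS code).
def ushcn_offset(corrected, uncorrected, n=10):
    """Mean (corrected - uncorrected) over the latest n common years."""
    common = [(c, u) for c, u in zip(corrected, uncorrected)
              if c is not None and u is not None]
    recent = common[-n:]                    # latest common years
    return sum(c - u for c, u in recent) / len(recent)

# Subtracting this offset from the corrected reports lets them splice
# onto new incoming GHCN reports for the same station:
# adjusted = [c - off if c is not None else None for c in corrected]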
Filling in missing data for Hohenpeissenberg:
This is a version of a GHCN report with missing data filled in, so it is used
to fill the gaps of the corresponding GHCN series.
Result: v2.mean_comb
Step 1 : Simplifications, elimination of dubious records, 2 adjustments (do_comb_step1.sh)
———————————————————————–
The various sources at a single location are combined into one record, if
possible, using a version of the reference station method. The adjustments
are determined in this case using series of estimated annual means.
Non-overlapping records are viewed as a single record, unless this would
result in introducing a discontinuity; in the documented case of St. Helena
the discontinuity is eliminated by adding 1C to the early part.
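As a toy illustration of that combination step (not the GISS implementation; records are assumed here to be dicts mapping year to estimated annual mean):

# Toy combination of two records at one location (illustrative only).
def combine_records(base, other):
    """Shift `other` to `base` over their overlap, then merge."""
    overlap = set(base) & set(other)
    if not overlap:
        return {**other, **base}        # non-overlapping: one record
    bias = sum(other[y] - base[y] for y in overlap) / len(overlap)
    merged = {y: t - bias for y, t in other.items()}
    for y, t in base.items():
        merged[y] = (t + merged[y]) / 2 if y in merged else t
    return merged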
After noticing an unusual warming trend in Hawaii, closer investigation
showed its origin to be in the Lihue record; it had a discontinuity around
1950 not present in any neighboring station. Based on those data, we added
0.8C to the part before the discontinuity.
Some unphysical looking segments were eliminated after manual inspection of
unusual looking annual mean graphs and comparing them to the corresponding
graphs of all neighboring stations.
Result: Ts.txt
Step 2 : Splitting into zonal sections and homogenization (do_comb_step2.sh)
———————————————————-
To speed up processing, Ts.txt is converted to a binary file and split
into 6 files, each covering a latitudinal zone of a width of 30 degrees.
At the same time, stations with less than 20 years of data are dropped, since
in the subsequent gridding step we require overlaps of at least 20 years
to combine station records.
The goal of the homogenization effort is to avoid any impact (warming
or cooling) of the changing environment that some stations experienced
by changing the long term trend of any non-rural station to match the
long term trend of their rural neighbors, while retaining the short term
monthly and annual variations. If no such neighbors exist, the station is
completely dropped; if the rural records are shorter, part of the
non-rural record is dropped.
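A rough sketch of that adjustment, assuming annual series as NumPy arrays; the production code reportedly fits a two-legged broken-line trend, while this sketch uses a single linear trend for brevity:

import numpy as np

def adjust_to_rural(urban, rural, years):
    """Give `urban` the long-term trend of its rural neighbors while
    keeping its own short-term variations (toy linear version)."""
    t = np.asarray(years, dtype=float)
    urban = np.asarray(urban, dtype=float)
    rural = np.asarray(rural, dtype=float)
    u_fit = np.polyfit(t, urban, 1)          # urban long-term trend
    r_fit = np.polyfit(t, rural, 1)          # rural long-term trend
    residual = urban - np.polyval(u_fit, t)  # short-term variation
    return np.polyval(r_fit, t) + residual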
Result: Ts.GHCN.CL.1-6 – before peri-urban adjustment
Ts.GHCN.CL.PA.1-6 – after peri-urban adjustment
Step 3 : Gridding and computation of zonal means (do_comb_step3.sh)
————————————————
A grid of 8000 grid boxes of equal area is used. Time series are changed
to series of anomalies. For each grid box, the stations within that grid
box and also any station within 1200km of the center of that box are
combined using the reference station method.
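Schematically, the per-box combination might look like the following (simplified here to a distance-weighted mean, with the weight tapering linearly to zero at 1200 km as in the published Hansen and Lebedeff description; the real code again uses the reference station method):

from math import radians, sin, cos, asin, sqrt

def dist_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2.0 * 6371.0 * asin(sqrt(a))

def box_anomaly(center, stations, radius=1200.0):
    """Distance-weighted mean anomaly for one grid box.
    stations: iterable of ((lat, lon), anomaly)."""
    num = den = 0.0
    for (lat, lon), anom in stations:
        d = dist_km(center[0], center[1], lat, lon)
        if d < radius:
            w = 1.0 - d / radius    # weight tapers to 0 at 1200 km
            num += w * anom
            den += w
    return num / den if den else None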
A similar method is also used to find a series of anomalies for 80 regions
consisting of 100 boxes from the series for those boxes, and again to find
the series for 6 latitudinal zones from those regional series, and finally
to find the hemispheric and global series from the zonal series.
WARNING: It should be noted that the base period for any of these anomalies
is not necessarily the same for each grid box, region, or zone. This is
irrelevant when computing trend maps; however, when used to compute
anomalies, we always have to subtract the base period data from the
series of the selected time period to get a consistent anomaly map.
Result: SBBX1880.Ts.GHCN.CL.PA.1200 and tables (GLB.Ts.GHCN.CL.PA.txt,…)
Step 4 : Reformat sea surface temperature anomalies
—————————————————
Sources: http://www.hadobs.org HadISST1: 1870-present
http://ftp.emc.ncep.noaa.gov cmb/sst/oimonth_v2 Reynolds 11/1981-present
For both sources, we compute the anomalies with respect to 1982-1992, and use
the Hadley data for the period 1880-11/1981 and Reynolds data for 12/1981-present.
Since these data sets are complete, creating 1982-92 climatologies is simple.
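In code, the climatology-and-splice logic for a single grid cell might look like this (a hedged sketch; sst maps (year, month) to a temperature, and the series are assumed complete, as the text notes):

def climatology(sst, y0=1982, y1=1992):
    """Monthly means over the base period (series assumed complete)."""
    nyears = y1 - y0 + 1
    return {m: sum(sst[(y, m)] for y in range(y0, y1 + 1)) / nyears
            for m in range(1, 13)}

def spliced_anomalies(hadley, reynolds):
    """Hadley through 11/1981, Reynolds from 12/1981 on, each as
    anomalies against its own 1982-1992 climatology."""
    clim_h, clim_r = climatology(hadley), climatology(reynolds)
    out = {}
    for (y, m), t in hadley.items():
        if (y, m) < (1981, 12):
            out[(y, m)] = t - clim_h[m]
    for (y, m), t in reynolds.items():
        if (y, m) >= (1981, 12):
            out[(y, m)] = t - clim_r[m]
    return out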
These data are replicated on the 8000-box equal-area grid and stored in the same way
as the surface data to be able to use the same utilities for surface and ocean data.
Areas covered occasionally by sea ice are masked using a time-independent mask.
The Reynolds climatology is included, since it also may be used to find that
mask. Programs are included to show how to regrid these anomaly maps:
do_comb_step4.sh adds a single or several successive months for the same year
to an existing ocean file SBBX.HadR2; a program to add several years is also
included.
Result: update of SBBX.HadR2
Step 5 : Computation of LOTI zonal means
—————————————-
The same method as in step 3 is used, except that for a particular grid box
the anomaly or trend is computed twice, first based on surface data, then
based on ocean data. Depending on the location of the grid box, one or
the other is used with priority given to the surface data, if available.
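The per-box rule then reduces to something like this one-liner (illustrative; the actual choice also depends on the box's location and the ice mask):

def loti_anomaly(surface, ocean):
    """Step 5 priority rule for one grid box: use the land-surface
    value when it exists, otherwise fall back to the ocean value."""
    return surface if surface is not None else ocean

# e.g. loti_anomaly(0.42, 0.31) -> 0.42
#      loti_anomaly(None, 0.31) -> 0.31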
Result: tables (GLB.Tsho2.GHCN.CL.PA.txt,…)
Final Notes
———–
A program that can read the two basic files SBBX1880.Ts.GHCN.CL.PA.1200 and
SBBX.HadR2 in order to compute anomaly and trend maps, etc., was available on our
web site for many years and still is.
For a better overview of the structure, the programs and files for the various
steps are put into separate directories with their own input_files,
work_files, temp_files directories. If used in this way, files created by
step0 and put into the temp_files directory will have to be manually moved
to the temp_files directory of step 1. To avoid that, you could
consolidate all sources in a single directory and merge all input_files
directories into a single subdirectory.
The reason to call the first step “Step 0” is historical: For our 1999 paper
“GISS analysis of surface temperature change”, we started with “Step 1”, i.e.
we used GHCN’s v2.mean as our only source for station temperature data. The
USHCN data were used for the 2001 paper “A closer look at United States and
global surface temperature change”, the other parts of “Step 0” were added later.
I suppose you know how to compute an integral of a function… right? Now, do you know the theory behind it? Like, say, Riemann–Stieltjes integration theory? Probably not. You know how to do the arithmetic but not the underlying math. That's OK, though, because why should everybody be self-sufficient in all aspects of a particular discipline? Why make everybody prove the theorems? Engineers don't need that stuff; they just need to know some conditions about where the function is integrable and how to compute it if it is. So maybe everybody doesn't need to walk the code, because I bet there will be a resource that summarizes the contents, thereby relieving everybody of having to retrace every step. Use other people's labor to come up with something new.
Alan Cheetham says:
August 29, 2010 at 10:52 am
Alan, Hansen's constant revision of the data reminds me of the Gilligan's Island episode where the Professor decides the island is sinking – all because Gilligan wanted to catch larger and larger lobsters!
Gilligan's Island is a perfect match for Hansen!
Steven Mosher says:
August 30, 2010 at 12:17 pm
“When you show yourself ready to do the same work that John G or SteveMc or me or Clearclimate code has done, then you have standing to make a comment.”
Steven, I've read your statements in the past with a great deal of earnestness. But you're acting as if you've just come down from a mountaintop carrying two stone tablets. The algorithm you've described, or method, if you will, isn't exactly revolutionary. So, unless we go into detail about how GISS does it, we don't have any standing to comment on the general error built into this type of averaging and in-fill? That's a load of crap. Now GISS has particular insights into algebraic equations? And only a select few may comment on this type of mathematics? I can be part of the lucky few if I play follow-the-bouncing-ball with the programmer's code? If it is anything like you described, I'll comment on it, because I essentially do the same thing in my line of work. It's crap. I've fewer inputs and fewer steps and still hate to generate a bill from it, much less pass laws because of it. Steven, I thank you for the work you did to open up GISS's methods and code. But on this particular day, you're viewing this discussion in a single dimension, and if there is a lesson from the climate discussion, it is that there are various views and dimensions to any particular part of it. I'd urge you to re-think your perspective on this matter.
Anu says:
August 30, 2010 at 8:46 am [ … ]
Knew it! No cojones — from the guy who pretends to be knowledgeable.☺
Anu says he might make a prediction, maybe [I predict he'll chicken out again], but then he says: "Of course, any prediction I make would lie somewhere between young molten Earth and snowball Earth," the ultimate Sharpshooter's Fallacy, making Hansen's giant error bars look puny by comparison. Very brave prediction… not.
Here’s my prediction: when the climate sensitivity number is finally determined, it will be found to be decisively under 1°C, making the emission of CO2 a complete non-issue.
Steven Mosher says:
August 30, 2010 at 12:11 pm
Well that explains the error. Clearly then they are comparing apples to oranges. But in your (very clear and easy to understand) example, when the original calculation was made, station 2 was still too young to use. So 10 years later, it is still too young to use for 1998. So why use it? The temperature for 1998 is not changing 12 years later. Except when they decide to add in values not deemed accurate enough at the time of the calculation, but 12 years later they are deemed such.
That is not science. That is philosophy.
Mosher won’t be back to this particular thread.
stevengoddard says:
August 30, 2010 at 8:52 am
Venter
No one’s work is beyond question or review. That is how science works.
True.
But that’s not how blogging works – to bring up one’s previous work is considered a “taunt” at some websites.
Jaye Bass says:
August 30, 2010 at 1:29 pm
“Mosher won’t be back to this particular thread.”
That's too bad; I was about to heap praise on him for his god-like insights into GISS that the rest of us mere mortals couldn't possibly understand – or that the ones who can grasp them are just too lazy to do so and aren't worthy of conversing with someone of his abilities, dedication and knowledge.
Steven Mosher says:
August 30, 2010 at 11:54 am
So many missed points so little time;
In re: Pols: This isn't like pols. This would be the equivalent of going back now and declaring that Al Gore didn't actually get 50.01 percent of the vote, he only got 48%, and that Barack got 50.01% in the most recent election. If your old figures keep changing (esp. in a monolithically downward way) then your methodology necessarily comes into question; see my 'walking up the down escalator' analogy in my earlier comment. You cannot say that '98 was the warmest ever at 0.64, then 5 years later claim '05 to be the warmest ever at 0.64 and that '98 was really .57, then come back in 2010 and say we're at the warmest ever at 0.64, while '05 was really .57 and '98 was at .54. That is just folly.
And that isn’t illogical at all; if your means of estimating global mean surface temperature (GMST) tends to systematically overstate today’s temperatures (requiring downward revision multiple times in the future) then you have a real problem stating with any useful degree of certainty what today’s GMST is.
"There is no CO2 mitigation strategy which will stabilize our temperature estimating methodology." I think you missed the point here, Steve; if CO2 causes global warming, then you can mitigate global warming by reducing CO2 emissions. There is no amount of CO2 reduction, however, that will allow you to better estimate global surface T. So if you don't have very accurate GMST estimates that are stable over time, there's no point in pointing to an experiment involving a bottle and heat lamps to tell us we have a problem to solve. The whole notion of the multiple 'tipping points' needed to make this into a true crisis is almost hysterical. Whether anyone agrees with the basic localized physics or not is irrelevant. Take the cap off your bottle and see what happens to heat in the uncontained atmosphere. But if you cannot point to increasing temps that are stable over time (not revised downward every 5 years) then you certainly cannot claim we need to reduce CO2 to reduce GMST.
Now: you can either produce an historic record of increasing global temps through your methodology that doesn't involve systematically revising prior temperature estimates downward, or you cannot. So far Anthony, Lindzen, McIntyre, et al. have been very successful in showing this to be very problematic for the Manns and Phil Jones types. Meanwhile Mann and Phil keep hiding data and methodology, getting rid of the MWP and hiding the decline. Pick which side of the argument you want to be on, but really we need to get rid of scientists who've chosen sides and get back to just studying what's going on. I think your last comment about reducing our dependence on fossil fuels says it all; yes, it's better to conserve resources, but that is completely absurd as an argument in the AGW/CO2 debate. It's got no place here. To bring it up should be the global warming equivalent of Godwin's law: argument over, you lose.
I will cede points 2, 3 and 4: we cannot quantify the problem reliably, we cannot estimate the harms reliably, and we cannot determine the costs and mitigation solutions accurately. Does that about sum it up? Yeah; it's manbearpig. It's the bogeyman. We can't define it, we can't tell you for sure what will end it, but we can tell you what you need to do about it, and it involves surrendering this little bit of your personal freedom right here. Okay, I don't believe this is a conspiracy, but it sure doesn't seem too far-fetched either.
I too am a lukewarmer; has the climate gotten warmer in the last 100 years? Yes, most likely. Have anthropogenic CO2 emissions contributed to this? Likely, or possibly, to some degree or another. I don't think they're the primary driving influence, and I am certain science hasn't answered that question in any meaningful way. However, we are more than two standard deviations below where the GCMs said we'd be by now on GMST, and I don't see the earth warming significantly lately. If GISS wants to drone on and on about the most recent decade, let's look at where their models said we'd be by now. And tipping points? Come on already. If their claims were at all valid, we'd have never entered the last three ice ages.
So, in conclusion: yes, it is a cult. If you want funding, you need to tie your subject back to global warming. If that's not a self-affirming feedback loop then I don't know what is. There is real science that needs to be done, but if it's done by those who already know what's causing the warming, then you will certainly get warming, and it will certainly be proven to be caused by CO2. Of that you can be certain.
They've lost any and all pretense of objectivity, which clearly precludes their involvement in scientific endeavor, so they need to step aside and allow some real scientists to come in, pick up the pieces, try to establish the historical data set, and get to work understanding how our atmosphere, oceans and climate really work.
Right now we’re too busy proving it’s warming by walking up a down escalator.
From: Anu on August 30, 2010 at 7:51 am
http://www.swcs.com.au/ansi.htm
And the first 8 items of the ANSI code page:
0 NUL  1 ☺  2 ☻  3 ♥  4 ♦  5 ♣  6 ♠  7 BEL
Please avoid trying to rewrite the history of those of us who lived through it. And enjoy this ANSI smiley. ☺

Mr Mosher,
You’re not doing your co-author any favors on this thread.
Bob:
“Bob Kutz says:
August 30, 2010 at 2:02 pm
Steven Mosher says:
August 30, 2010 at 11:54 am
So many missed points so little time;
In re: Pols: This isn't like pols. This would be the equivalent of going back now and declaring that Al Gore didn't actually get 50.01 percent of the vote, he only got 48%, and that Barack got 50.01% in the most recent election. If your old figures keep changing (esp. in a monolithically downward way) then your methodology necessarily comes into question; see my 'walking up the down escalator' analogy in my earlier comment. You cannot say that '98 was the warmest ever at 0.64, then 5 years later claim '05 to be the warmest ever at 0.64 and that '98 was really .57, then come back in 2010 and say we're at the warmest ever at 0.64, while '05 was really .57 and '98 was at .54. That is just folly."
The OLD FIGURES do not change. Get that through your head.
In 2000, GISS looks through 7364 stations and selects stations to construct an average. Their rules say NO STATION that has less than 20 years of data can be used. SO, for example, a station that started reporting in 1990 could not be used.
By 2010 that station NOW has 20 years. NOW its data from 1990-2000 CAN be used. That will of necessity change the estimate for 1990-2000. Further, you WANT to use more data. If GISS did NOT use this data, THEN you would claim they were disregarding data. When they do use the data, you cry that the estimate changes.
The example of the poll was to show the silliness of your complaint about not having a "stable" statistic.
And further, you can say in 2005 that 1998 was the warmest and then in 2006 say that, no, 2003 or 1934 or whatever was the warmest. It's the warmest GIVEN THE ADDITION OF NEW DATA. It would be silly NOT to change your claim. It would be dishonest NOT to change the claim.
latitude says:
August 30, 2010 at 12:03 pm
Mosh, you're a smart guy, too smart to be resorting to name-calling and trying to paint people as ignorant because they don't agree with you.
## You painted yourself, bud; I hold up a mirror.
Steve, can I get you a bigger shovel?
All I said was that I stopped reading after your comment I quoted.
I stand corrected, my impression of you was not correct, bud.
Bob Kutz:
“But if you cannot point to increasing temps that are stable over time (not revised downward every 5 years) then you certainly cannot claim we need to reduce CO2 to reduce GMST.”
Wrong. in several ways.
1. I can point to a stable estimate: Roman's. It's actually the most statistically sound approach, as it uses all available data. Also, CRU is stable. Also, NCDC is stable.
2. If you had a series that showed 1C of warming year to year, but it changed slightly year to year (sometimes .99C, sometimes 1.01C), you surely could NOT conclude as you do. The 'stability' of a measure year to year has NOTHING to do with the argument to mitigate (which I don't buy anyway). It COULD matter if:
A. all measures had an instability;
B. the instability was LARGE;
C. the instability kept me from doing skillful hindcasts.
3. If you had an unstable series that showed the problem was WORSE than you thought, you'd better pay attention to it. Especially if the "instability" derives from the fundamental choice to consider more data as more data becomes available.
Who cares, right? It's just a darn blog; these are just names on a page. Can't really be too sensitive to ad hoc explanations.
Bob,
“I too am a luke warmer; has the climate gotten warmer in the last 100 years? Yes, most likely. Have anthropogenic CO2 emissions contributed to this? Likely or possibly in some degree or another. I don’t think they’re the primary driving influence and I am certain science hasn’t answered that question in any meaningful way”
1. Glad you agree with Christy, Monckton, Spencer, Lindzen, Willis.
2. You cannot rationally maintain the following:
"The science hasn't answered the question in any meaningful way BUT I don't think that they are a primary driver."
#2 is illogical. The best you can say is that the science is uncertain. Unless you have a theory that explains or supports the notion that they ARE NOT the primary driver, you would be more consistent if you said "the best science is uncertain; we don't know. They could be primary, secondary, or hardly an effect at all." But instead you point to uncertainty in the science as cover to give an opinion. Not skeptical of your own ability to ascertain the truth. Selectively skeptical.
And yes, I know the GCMs are forecasting high. That plays a role in being a lukewarmer.
Jaye Bass says:
August 30, 2010 at 1:29 pm
Mosher won’t be back to this particular thread.
You should switch to ice predictions.
More predictable than grumpy blog posters.
Steven Mosher said on August 30, 2010 at 12:17 pm
Actually, as I previously posted, all I had to do was Google and I found that file, at least the current version.
And as I said in my first comment, we're looking at the results, such as that file, not the process. There is no need to know everything that happens between the animal and the package to evaluate whether the sausage is good or not. Plus, I fail to see how an in-depth study of how a particular sausage is made will make me think it tastes any better.
BTW, please avoid getting so worked up as it may lead to your being cranky towards your roommate, and we don’t like to see Charles get irritated.
😉
PhilJourdan says:
August 30, 2010 at 1:28 pm
Steven Mosher says:
August 30, 2010 at 12:11 pm
Well that explains the error. Clearly then they are comparing apples to oranges. But in your (very clear and easy to understand) example, when the original calculation was made, station 2 was still too young to use. So 10 years later, it is still too young to use for 1998. So why use it? The temperature for 1998 is not changing 12 years later. Except when they decide to add in values not deemed accurate enough at the time of the calculation, but 12 years later they are deemed such.
************************
In my example I shortened the number of years FOR ILLUSTRATION, so that people could work through the problem without taking their shoes off to count. The stitching of series is MORE complicated than I have explained in my simple example. The simple example is merely to explain the process.
Here is a rule: when two stations 5 miles apart overlap by 2 years, combine them.
Station 1: 0, 0, 0
Station 2: NA, NA, 1
Now in year 3 your estimate of the temp for "station 1 area" is 0, 0, 0.
BUT there is data you are NOT considering: station 2. Why not? Well, it's got a short record. A year goes by. You get this:
Station 1: 0, 0, 0, 0
Station 2: NA, NA, 1, 1
Now you do have enough data to include station 2, so you revise your estimate of the past based on the latest information:
0, 0, .5, .5
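Here is the same toy example in runnable form (a sketch; the "rule" is reduced to requiring at least two years of data before a station joins the simple average):

NA = None

def area_estimate(stations, min_years=2):
    """Average usable stations year by year; a station is usable once
    it has at least `min_years` of reported data."""
    usable = [s for s in stations
              if sum(v is not None for v in s) >= min_years]
    out = []
    for y in range(max(len(s) for s in usable)):
        vals = [s[y] for s in usable if y < len(s) and s[y] is not None]
        out.append(sum(vals) / len(vals) if vals else None)
    return out

print(area_estimate([[0, 0, 0], [NA, NA, 1]]))
# -> [0.0, 0.0, 0.0]       (station 2 still too short to use)
print(area_estimate([[0, 0, 0, 0], [NA, NA, 1, 1]]))
# -> [0.0, 0.0, 0.5, 0.5]  (station 2 now used; the past is revised)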
Now imagine the problem with GHCN. You have 7280 stations; add in SCAR and you have 7364. Now, 50% of the stations have "duplicate records," except the duplicates are not always duplicates: they can be the same station with a different instrument. Some stations have 15-year gaps, 2-year gaps, 1-year gaps. Add 20 years of fresh data to the end of a series and it will cascade changes all the way back to the start. Understanding how these changes propagate is not at first apparent. People who know me know I screamed about this very thing. Until I understood it, until I watched John G work through some aspects of the problem, until I actually sat down and tried to do the job myself. Then I said, "Hmm, tough problem. How many different ways are there to solve this, and what does each of those ways get right and what does it get wrong, and how much does it matter?"
Wow… how did that get through? Doggone keyboard… keeps sticking.