Hansen on the surface temperature record, Climategate, solar, and El Nino

The Temperature of Science (PDF available here)

James Hansen

My experience with global temperature data over 30 years provides insight about how the science and its public perception have changed. In the late 1970s I became curious about well-known analyses of global temperature change published by climatologist J. Murray Mitchell: why were his estimates for large-scale temperature change restricted to northern latitudes? As a planetary scientist, I thought there were enough data points in the Southern Hemisphere to allow useful estimates both for that hemisphere and for the global average. So I requested a tape of meteorological station data from Roy Jenne of the National Center for Atmospheric Research, who obtained the data from records of the World Meteorological Organization, and I made my own analysis.

Fast forward to December 2009, when I gave a talk at the Progressive Forum in Houston, Texas. The organizers there felt it necessary that I have a police escort between my hotel and the forum where I spoke. Days earlier, bloggers had reported that I was probably the hacker who broke into East Anglia computers and stole e-mails. Their rationale: I was not implicated in any of the pirated e-mails, so I must have eliminated incriminating messages before releasing the hacked e-mails.

The next day another popular blog concluded that I deserved capital punishment. Web chatter on this topic, including indignation that I was coming to Texas, led to a police escort.

How did we devolve to this state? Any useful lessons? Is there still interesting science in analyses of surface temperature change? Why spend time on it, if other groups are also doing it? First I describe the current monthly updates of global surface temperature at the Goddard Institute for Space Studies. Then I show graphs illustrating scientific inferences and issues. Finally I respond to questions in the above paragraph.

Current Updates

Each month we receive, electronically, data from three sources: weather data for several thousand meteorological stations, satellite observations of sea surface temperature, and Antarctic research station measurements. These three data sets are the input for a program that produces a global map of temperature anomalies relative to the mean for that month during the period of climatology, 1951-1980.

The analysis method has been described fully in a series of refereed papers (Hansen et al., 1981, 1987, 1999, 2001, 2006). Successive papers updated the data and in some cases made minor improvements to the analysis, for example, in adjustments to minimize urban effects. The analysis method works in terms of temperature anomalies, rather than absolute temperature, because anomalies present a smoother geographical field than temperature itself. For example, when New York City has an unusually cold winter, it is likely that Philadelphia is also colder than normal. The distance over which temperature anomalies are highly correlated is of the order of 1000 kilometers at middle and high latitudes, as we illustrated in our 1987 paper.
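The anomaly bookkeeping described above can be sketched in a few lines. This is an illustrative sketch only, not GISS code; the station values and the helper function are made up.

```python
# Sketch of the anomaly method: subtract each station's own 1951-1980
# monthly climatology, so stations at different absolute temperatures
# become comparable. Values below are illustrative, not real data.

def monthly_anomalies(series, base_start=1951, base_end=1980):
    """series: dict mapping (year, month) -> temperature in deg C.
    Returns dict mapping (year, month) -> anomaly vs. the base-period mean."""
    # Build the climatology: mean for each calendar month over the base period.
    sums, counts = {}, {}
    for (year, month), t in series.items():
        if base_start <= year <= base_end:
            sums[month] = sums.get(month, 0.0) + t
            counts[month] = counts.get(month, 0) + 1
    climatology = {m: sums[m] / counts[m] for m in sums}
    # Anomaly = observation minus that calendar month's climatological mean.
    return {(y, m): t - climatology[m]
            for (y, m), t in series.items() if m in climatology}

# Toy example: Januaries only; base-period mean = (0.0 + 1.0) / 2 = 0.5 C.
station = {(1951, 1): 0.0, (1980, 1): 1.0, (2009, 1): 1.5}
anoms = monthly_anomalies(station)
print(anoms[(2009, 1)])  # -> 1.0
```

Because each station is compared only with itself, a station on a mountaintop and one in a valley can both contribute to the same anomaly field, which is why anomalies form the smoother geographical field the text describes.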

Although the three input data streams that we use are publicly available from the organizations that produce them, we began preserving the complete input data sets each month in April 2008. These data sets, which cover the full period of our analysis, 1880-present, are available to parties interested in performing their own analysis or checking our analysis. The computer program that performs our analysis is published on the GISS web site.

Fig. 1. (a) GISS analysis of global surface temperature change. Open square for 2009 is 11- month temperature anomaly. Green vertical bar is 95 percent confidence range (two standard deviations) for annual temperature. (b) Hemispheric temperature change in GISS analysis.

Responsibilities for our updates are as follows. Ken Lo runs programs to add in the new data and reruns the analysis with the expanded data. Reto Ruedy maintains the computer program that does the analysis and handles most technical inquiries about the analysis. Makiko Sato updates graphs and posts them on the web. I examine the temperature data monthly and write occasional discussions about global temperature change.

Scientific Inferences and Issues

Temperature data – example of early inferences. Figure 1 shows the current GISS analysis of global annual-mean and 5-year running-mean temperature change (left) and the hemispheric temperature changes (right). These graphs are based on the data now available, including ship and satellite data for ocean regions.

Figure 1 illustrates, with a longer record, a principal conclusion of our first analysis of temperature change (Hansen et al., 1981). That analysis, based on data records through December 1978, concluded that data coverage was sufficient to estimate global temperature change. We also concluded that temperature change was qualitatively different in the two hemispheres. The Southern Hemisphere had more steady warming through the century while the Northern Hemisphere had distinct cooling between 1940 and 1975.

It required more than a year to publish the 1981 paper, which was submitted several times to Science and Nature. At issue were both the global significance of the data and the length of the paper. Later, in our 1987 paper, we proved quantitatively that the station coverage was sufficient for our conclusions – the proof being obtained by sampling (at the station locations) a 100-year data set of a global climate model that had realistic spatial-temporal variability. The different hemispheric records in the mid-twentieth century have never been convincingly explained. The most likely explanation is atmospheric aerosols, fine particles in the air, produced by fossil fuel burning. Aerosol atmospheric lifetime is only several days, so fossil fuel aerosols were confined mainly to the Northern Hemisphere, where most fossil fuels were burned. Aerosols have a cooling effect that still today is estimated to counteract about half of the warming effect of human-made greenhouse gases. For the few decades after World War II, until the oil embargo in the 1970s, fossil fuel use expanded exponentially at more than 4%/year, likely causing the growth of aerosol climate forcing to exceed that of greenhouse gases in the Northern Hemisphere. However, there are no aerosol measurements to confirm that interpretation. If there were adequate understanding of the relation between fossil fuel burning and aerosol properties it would be possible to infer the aerosol properties in the past century. But such understanding requires global measurements of aerosols with sufficient detail to define their properties and their effect on clouds, a task that remains elusive, as described in chapter 4 of Hansen (2009).

Fig. 2. Global (a) and U.S. (b) analyzed temperature change before and after correction of a computer program flaw. Results are indistinguishable except for the U.S. beginning in year 2000.

Flaws in temperature analysis. Figure 2 illustrates an error that developed in the GISS analysis when we introduced, in our 2001 paper, an improvement in the United States temperature record. The change consisted of using the newest USHCN (United States Historical Climatology Network) analysis for those U.S. stations that are part of the USHCN network. This improvement, developed by NOAA researchers, adjusted station records that included station moves or other discontinuities. Unfortunately, I made an error by failing to recognize that the station records we obtained electronically from NOAA each month, for these same stations, did not contain the adjustments. Thus there was a discontinuity in 2000 in the records of those stations, as the prior years contained the adjustment while later years did not. The error was readily corrected, once it was recognized. Figure 2 shows the global and U.S. temperatures with and without the error. The error averaged 0.15°C over the contiguous 48 states, but these states cover only 1½ percent of the globe, making the global error negligible.
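The dilution of that error into the global mean can be checked with trivial arithmetic; the numbers below simply restate those in the text.

```python
# Back-of-envelope check: a 0.15 C error confined to the contiguous U.S.,
# which covers about 1.5 percent of Earth's surface, dilutes to roughly
# 0.002 C in the global mean -- negligible, as the text says.
us_error = 0.15        # deg C, average error over the contiguous 48 states
us_fraction = 0.015    # their share of Earth's surface area
global_error = us_error * us_fraction
print(global_error)    # about 0.002 C
```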

However, the story was embellished and distributed to news outlets throughout the country. The resulting headline: NASA had cooked the temperature books – and once the error was corrected, 1998 was no longer the warmest year in the record, instead being supplanted by 1934.

This was nonsense, of course. The small error in global temperature had no effect on the ranking of different years. The warmest year in our global temperature analysis was still 2005.

Conceivably, the confusion between global and U.S. temperatures in these stories was inadvertent. But the estimate for the warmest year in the U.S. had not changed either. 1934 and 1998 were tied as the warmest year (Figure 2b), with any difference (~0.01°C) at least an order of magnitude smaller than the uncertainty in comparing temperatures in the 1930s with those in the 1990s.

The obvious misinformation in these stories, and the absence of any effort to correct the stories after we pointed out the misinformation, suggest that the aim may have been to create distrust or confusion in the minds of the public, rather than to transmit accurate information. That, of course, is a matter of opinion. I expressed my opinion in two e-mails that are on my Columbia University web site:

Click to access 20070810_LightUpstairs.pdf

http://www.columbia.edu/~jeh1/mailings/2007/20070816_realdeal.pdf.

We thought we had learned the necessary lessons from this experience. We put our analysis program on the web. Everybody was free to check the program, if they were concerned that any data “cooking” might be occurring.

Unfortunately, another data problem occurred in 2008. In one of the three incoming data streams, the one for meteorological stations, the November 2008 data for many Russian stations was a repeat of October 2008 data. It was not our data record, but we properly had to accept the blame for the error, because the data was included in our analysis. Occasional flaws in input data are normal in any analysis, and the flaws are eventually noticed and corrected if they are substantial. Indeed, we have an effective working relationship with NOAA – when we spot data that appears questionable we inform the appropriate people at the National Climatic Data Center – a relationship that has been scientifically productive.

This specific data flaw was a case in point. The quality control program that NOAA runs on the data from global meteorological stations includes a check for repetition of data: if two consecutive months have identical data the data is compared with that at the nearest stations. If it appears that the repetition is likely to be an error, the data is eliminated until the original data source has verified the data. The problem in 2008 escaped this quality check because a change in their program had temporarily, inadvertently, omitted that quality check.
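The repeated-month check described above can be sketched as follows. The function name and data layout are illustrative assumptions, not NOAA's actual quality-control code.

```python
# Sketch of a repeated-month quality check, as described in the text:
# if a station's values for two consecutive months are identical, flag
# the later month for comparison against neighboring stations.
# Illustrative only, not NOAA's code.

def flag_repeated_months(station_months):
    """station_months: list of (year, month, values) where values is a
    tuple of daily temperatures. Returns indices of suspect entries."""
    suspect = []
    for i in range(1, len(station_months)):
        prev_vals = station_months[i - 1][2]
        curr_vals = station_months[i][2]
        if curr_vals == prev_vals:   # exact repeat of the prior month
            suspect.append(i)
    return suspect

# A station whose November record exactly repeats October is flagged:
record = [(2008, 10, (1.2, 0.8, -0.3)),
          (2008, 11, (1.2, 0.8, -0.3)),
          (2008, 12, (-4.0, -5.1, -3.9))]
print(flag_repeated_months(record))  # -> [1]
```

In a real pipeline the flagged month would then be compared against nearby stations and withheld until the original source verified it, as the text describes.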

The lesson learned here was that even a transient data error, however quickly corrected, provides fodder for people who are interested in a public relations campaign, rather than science.

That means we cannot put the new data each month on our web site and check it at our leisure, because, however briefly a flaw is displayed, it will be used to disinform the public. Indeed, in this specific case there was another round of “fraud” accusations on talk shows and other media all around the nation.

Another lesson learned. Subsequently, to minimize the chance of a bad data point slipping through in one of the data streams and temporarily affecting a publicly available data product, we now put the analyzed data up first on a site that is not visible to the public. This allows Reto, Makiko, Ken and me to examine maps and graphs of the data before the analysis is put on our web site – if anything seems questionable, we report it back to the data providers for them to resolve. Such checking is always done before publishing a paper, but now it seems to be necessary even for routine transitory data updates. This process can delay availability of our data analysis to users for up to several days, but that is a price that must be paid to minimize disinformation.

Is it possible to totally eliminate data flaws and disinformation? Of course not. The fact that the absence of incriminating statements in pirated e-mails is taken as evidence of wrongdoing provides a measure of what would be required to quell all criticism. I believe that the steps that we now take to assure data integrity are as much as is reasonable from the standpoint of the use of our time and resources.

Fig. 3. (a) Monthly global land-ocean temperature anomaly, global sea surface temperature, and El Nino index. (b) 5-year and 11-year running means of the global temperature index.

Temperature data – examples of continuing interest. Figure 3(a) is a graph that we use to help provide insight into recent climate fluctuations. It shows monthly global temperature anomalies and monthly sea surface temperature (SST) anomalies. The red-blue Nino3.4 index at the bottom is a measure of the Southern Oscillation, with red and blue showing the warm (El Nino) and cool (La Nina) phases of sea surface temperature oscillations for a small region in the eastern equatorial Pacific Ocean.

Strong correlation of global SST with the Nino index is obvious. Global land-ocean temperature is noisier than the SST, but correlation with the Nino index is also apparent for global temperature. On average, global temperature lags the Nino index by about 3 months.
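A lag like this can be estimated by sliding one monthly series against the other and picking the offset with the highest correlation. The series below are synthetic, purely to demonstrate the method; nothing here is GISS data or code.

```python
# Estimating the lag at which two monthly series correlate best, as in
# the text's claim that global temperature lags the Nino index by about
# 3 months. Synthetic series only, to show the method.
import math

def correlation(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def best_lag(index, temp, max_lag=6):
    """Lag (months) at which temp[t] best matches index[t - lag]."""
    return max(range(max_lag + 1),
               key=lambda lag: correlation(index[:len(index) - lag],
                                           temp[lag:]))

# Synthetic Nino-like index and a temperature response delayed 3 months:
nino = [math.sin(2 * math.pi * t / 48) for t in range(120)]
temp = [0.5 * nino[max(t - 3, 0)] for t in range(120)]
print(best_lag(nino, temp))  # -> 3
```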

During 2008 and 2009 I received many messages, sometimes several per day, informing me that the Earth is headed into its next ice age. Some messages include graphs extrapolating cooling trends into the future. Some messages use foul language and demand my resignation. Of the messages that include any science, almost invariably the claim is made that the sun controls Earth’s climate, the sun is entering a long period of diminishing energy output, and the sun is the cause of the cooling trend.

Indeed, it is likely that the sun is an important factor in climate variability. Figure 4 shows data on solar irradiance for the period of satellite measurements. We are presently in the deepest, most prolonged solar minimum in the period of satellite data. It is uncertain whether the solar irradiance will rebound soon into a more-or-less normal solar cycle – or whether it might remain at a low level for decades, analogous to the Maunder Minimum, a period of few sunspots that may have been a principal cause of the Little Ice Age.

The direct climate forcing due to measured solar variability, about 0.2 W/m2, is comparable to the increase in carbon dioxide forcing that occurs in about seven years, using recent CO2 growth rates. Although there is a possibility that the solar forcing could be amplified by indirect effects, such as changes of atmospheric ozone, present understanding suggests only a small amplification, as discussed elsewhere (Hansen 2009). The global temperature record (Figure 1) has positive correlation with solar irradiance, with the amplitude of temperature variation being approximately consistent with the direct solar forcing. This topic will become clearer as the records become longer, but for that purpose it is important that the temperature record be as precise as possible.
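That comparison can be checked with the widely used simplified CO2 forcing expression ΔF = 5.35 ln(C/C0) W/m2, which is not from this article; the ~385 ppm baseline and ~2 ppm/year growth rate are approximate late-2000s values, also supplied here for illustration.

```python
# Back-of-envelope check of the text's comparison: the ~0.2 W/m2 solar
# forcing versus roughly seven years of CO2 forcing growth. Uses the
# widely cited simplified expression dF = 5.35 * ln(C/C0) W/m2; the
# baseline and growth rate are approximate 2000s values, not from
# the article itself.
import math

def co2_forcing(c_ppm, c0_ppm):
    """Simplified radiative forcing (W/m2) of CO2 at c relative to c0."""
    return 5.35 * math.log(c_ppm / c0_ppm)

c0 = 385.0     # approximate CO2 concentration (ppm), late 2000s
growth = 2.0   # approximate growth rate, ppm per year
years = 7
delta_f = co2_forcing(c0 + growth * years, c0)
print(round(delta_f, 2))  # -> 0.19
```

The result, roughly 0.2 W/m2, matches the solar-cycle forcing quoted in the text.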

Fig. 4. Solar irradiance through October 2009, based on concatenation of multiple satellite records by Claus Frohlich and Judith Lean (see Frohlich, 2006). Averaged over day and night Earth absorbs about 240 W/m2 of energy from the sun, so the irradiance variation of about 0.1 percent causes a direct climate forcing of just over 0.2 W/m2.
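The caption's arithmetic can be verified directly. The total solar irradiance and albedo figures below are standard approximate values, not taken from the article.

```python
# Checking the caption's arithmetic: a 0.1 percent change in total solar
# irradiance, spread over the sphere and reduced by Earth's albedo,
# yields a forcing of roughly 0.2 W/m2.
tsi = 1361.0    # total solar irradiance, W/m2 (approximate)
albedo = 0.3    # Earth's planetary albedo (approximate)
absorbed = tsi * (1 - albedo) / 4   # per m2 of surface, day and night
forcing = absorbed * 0.001          # 0.1 percent variation
print(round(absorbed), round(forcing, 2))  # -> 238 0.24
```

The factor of 4 is the ratio of Earth's surface area to its cross-section facing the sun; the absorbed value of ~238 W/m2 matches the ~240 W/m2 cited in the caption, and the forcing is "just over 0.2 W/m2" as stated.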

Frequently heard fallacies are that “global warming stopped in 1998” or “the world has been getting cooler over the past decade”. These statements appear to be wishful thinking – it would be nice if true, but that is not what the data show. True, the 1998 global temperature jumped far above the previous warmest year in the instrumental record, largely because 1998 was affected by the strongest El Nino of the century. Thus for the following several years the global temperature was lower than in 1998, as expected.

However, the 5-year and 11-year running mean global temperatures (Figure 3b) have continued to increase at nearly the same rate as in the past three decades. There is a slight downward tick at the end of the record, but even that may disappear if 2010 is a warm year.
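A centered running mean like the 5-year curve in Figure 3b can be computed as follows; the annual anomaly values are made up for illustration.

```python
# The 5-year and 11-year running means in Figure 3b smooth out El Nino
# scale variability. A centered running mean can be computed like this
# (the annual anomaly values below are made up for illustration):

def running_mean(values, window):
    """Centered running mean; None where the window is incomplete."""
    half = window // 2
    out = []
    for i in range(len(values)):
        if i - half < 0 or i + half >= len(values):
            out.append(None)   # not enough neighbors for a full window
        else:
            out.append(sum(values[i - half:i + half + 1]) / window)
    return out

anoms = [0.2, 0.4, 0.3, 0.5, 0.6, 0.4, 0.7]  # toy annual anomalies
print(running_mean(anoms, 5))
```

Note that the last few points of a centered mean are always provisional, which is why the "slight downward tick at the end of the record" can change as new years are added.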

Indeed, given the continued growth of greenhouse gases and the underlying global warming trend (Figure 3b) there is a high likelihood, I would say greater than 50 percent, that 2010 will be the warmest year in the period of instrumental data. This prediction depends in part upon the continuation of the present moderate El Nino for at least several months, but that is likely.

Furthermore, the assertion that 1998 was the warmest year is based on the East Anglia – British Met Office temperature analysis. As shown in Figure 1, the GISS analysis has 2005 as the warmest year. As discussed by Hansen et al. (2006), the main difference between these analyses is probably due to the fact that the British analysis excludes large areas in the Arctic and Antarctic where observations are sparse. The GISS analysis, which extrapolates temperature anomalies as far as 1200 km, has more complete coverage of the polar areas. The extrapolation introduces uncertainty, but there is independent information, including satellite infrared measurements and reduced Arctic sea ice cover, which supports the existence of substantial positive temperature anomalies in those regions.

In any case, issues such as these differences between our analyses provide a reason for having more than one global analysis. When the complete data sets are compared for the different analyses it should be possible to isolate the exact locations of differences and likely gain further insights.

Summary

The nature of messages that I receive from the public, and the fact that NASA Headquarters received more than 2500 inquiries in the past week about our possible “manipulation” of global temperature data, suggest that the concerns are more political than scientific. Perhaps the messages are intended as intimidation, expected to have a chilling effect on researchers in climate change.

The recent “success” of climate contrarians in using the pirated East Anglia e-mails to cast doubt on the reality of global warming* seems to have energized other deniers. I am now inundated with broad FOIA (Freedom of Information Act) requests for my correspondence, with substantial impact on my time and on others in my office. I believe these to be fishing expeditions, aimed at finding some statement(s), likely to be taken out of context, which they would attempt to use to discredit climate science.

There are lessons from our experience about care that must be taken with data before it is made publicly available. But there is too much interesting science to be done to allow intimidation tactics to reduce our scientific drive and output. We can take a lesson from my 5-year-old grandson who boldly says “I don’t quit, because I have never-give-up fighting spirit!”

Click to access 20091130_FightingSpirit.pdf

There are other researchers who work more extensively on global temperature analyses than we do – our main work concerns global satellite observations and global modeling – but there are differences in perspectives, which, I suggest, make it useful to have more than one analysis. Besides, it is useful to combine experience working with observed temperature together with our work on satellite data and climate models. This combination of interests is likely to help provide some insights into what is happening with global climate and information on the data that are needed to understand what is happening. So we will be keeping at it.

*By “success” I refer to their successful character assassination and swift-boating. My interpretation of the e-mails is that some scientists probably became exasperated and frustrated by contrarians – which may have contributed to some questionable judgment. The way science works, we must make readily available the input data that we use, so that others can verify our analyses. Also, in my opinion, it is a mistake to be too concerned about contrarian publications – some bad papers will slip through the peer-review process, but overall assessments by the National Academies, the IPCC, and scientific organizations sort the wheat from the chaff.

The important point is that nothing was found in the East Anglia e-mails altering the reality and magnitude of global warming in the instrumental record. The input data for global temperature analyses are widely available, on our web site and elsewhere. If those input data could be made to yield a significantly different global temperature change, contrarians would certainly have done that – but they have not.

References

Fröhlich, C., 2006: Solar irradiance variability since 1978. Space Science Rev., 248, 672-673.

Hansen, J., D. Johnson, A. Lacis, S. Lebedeff, P. Lee, D. Rind, and G. Russell, 1981: Climate impact of increasing atmospheric carbon dioxide. Science, 213, 957-966.

Hansen, J.E., and S. Lebedeff, 1987: Global trends of measured surface air temperature. J. Geophys. Res., 92, 13345-13372.

Hansen, J., R. Ruedy, J. Glascoe, and Mki. Sato, 1999: GISS analysis of surface temperature change. J. Geophys. Res., 104, 30997-31022.

Hansen, J.E., R. Ruedy, Mki. Sato, M. Imhoff, W. Lawrence, D. Easterling, T. Peterson, and T. Karl, 2001: A closer look at United States and global surface temperature change. J. Geophys. Res., 106, 23947-23963.

Hansen, J., Mki. Sato, R. Ruedy, K. Lo, D.W. Lea, and M. Medina-Elizade, 2006: Global temperature change. Proc. Natl. Acad. Sci., 103, 14288-14293.

Hansen, J. 2009: “Storms of My Grandchildren.” Bloomsbury USA, New York. (304 pp.)

Steve
December 22, 2009 5:56 am

Readers should examine the variability of the GISS analysis.
The large smoothing radius gives a spatial exaggeration
for areas with adjacent extremes – notably the Arctic and interior Africa.
This is probably why the GISS is the high outlier amongst global data sets
(CRU & Hadley SST before they were shutdown, UAH and RSS LT and MT)
It is also probably why the GISS was the low outlier during the early
twentieth century.
Look at all the independent data sets before taking Hansen’s word.

JackStraw
December 22, 2009 6:00 am

>>How did we devolve to this state?
You’re a scientist, yes? Then collect the data, which includes years of you and yours calling anyone who questioned your findings liars, idiots, corporate stooges, etc. and causing severe damage to the reputations of scientists who didn’t agree with you; mix in the billions of dollars your absurdly twisted “science” has cost us taxpayers; factor in the attempt to wreck our economy based on what you knew to be bogus information; and don’t forget the addition of years of scaring the hell out of the populace to increase your personal standing. Then build a computer model complete with tricks, fudge factors and don’t forget to hide the decline.
Let me know what your results look like.

INGSOC
December 22, 2009 6:01 am

Dr Hanson has demonstrated that he is clearly incapable of understanding the difference between politics and science. It is ironic that he complains that it is others that are confusing the two. Pot, meet kettle. The courts use a method to determine the veracity of a person’s statements. If someone can be shown to be lying about one thing, then it can be assumed that they are lying about everything else and their entire testimony can be disregarded. I am satisfied that Dr Hanson has been deceptive in more than a few of his statements. I no longer believe anything James Hanson has to say.

Frank K.
December 22, 2009 6:01 am

Roger Knights (05:12:42) :
Hansen:
“There is a slight downward tick at the end of the record, but even that may disappear if 2010 is a warm year.”
“there is a high likelihood, I would say greater than 50 percent, that 2010 will be the warmest year in the period of instrumental data.”
Why is it that climate “scientists” excoriate skeptics for confusing “weather” with “climate”, and yet willingly extrapolate the average weather over ONE year as some proof of global warming?

latitude
December 22, 2009 6:04 am

Cry me a river
Hansen became an activist and put himself in this position.
Now complains because he’s having to act accordingly.
But still won’t release the information people need to replicate his work.
smoke and mirrors

December 22, 2009 6:11 am

nanuuq (23:38:36) :
I have been reading many of the web sites, including this one discussing the current climate warming issue. In many ways I am disgusted with both of the extreme viewpoints. On this site we see an almost lynch mob, drown-the-witch mentality. Not looking for the scientific reality, but *GLOATING* in finding some minor fault in somebody else’s analysis.
You’ve obviously mistaken the commenters here for the alarmist ones in Texas that triggered Hansen’s police escort.
In many cases, ad hominem attacks are made, and comments carried on without any justification in science or analysis. I want to see both sides settle down and show SCIENTIFICALLY that their side has merit.
Ummmm — I’m guessing this was your first visit here, and you haven’t had time to do anything other than scan some random comments in this thread.

Bill Illis
December 22, 2009 6:14 am

This is what Hansen wrote in 1980 (a book review).
“One is left with the impression that the procedure for
more in-depth analysis of planetary atmospheres requires
only use of “sufficiently large computers.” In
fact, although computers are a useful tool, they often
do more harm than good; progress is impeded by hasty
publication of numerical results which leave us with a
swollen literature and conclusions which are difficult to
check and often undeserving of our confidence …”
Then the following year, 1981, he writes this (which set many of the parameters still used today in global warming theory – another in 1984 was also as influential – he is the grandfather for how the theory turned out; he pushed other positions out during the 1980s – this paper was also the first production of GISTemp global, page 961 – note how the temperature charts look different then).
http://www.sciencemag.org/cgi/rapidpdf/213/4511/957.pdf?ijkey=aAXr.Ejbb0ces&keytype=ref&siteid=sci
Hansen comes out of the climategate scandal relatively unscathed since he is in only a few of the emails. One of the reasons was that Hansen and Jones have had a mini-rivalry for several years. I’m sure when individuals were cc:ing Gavin Schmidt in the emails, they assumed it was going to Hansen as well. If they weren’t, then Gavin has some explaining to do to his boss.

Dave Springer
December 22, 2009 6:14 am

John Egan,
Are you aware that Hansen himself called for oil executives who deny global warming to be put on trial for crimes against humanity?
Are you also aware that Brietbart TV published the audio interview of Hansen where he said that?
Are you also aware that Anthony Watts already identified the source of the “threat” as Mr. Breitbart himself?
Are you aware that Mr. Breitbart called for Hansen to be treated as Hansen called for oil executive deniers be treated?
Hansen is no innocent. He’s getting back exactly what he gave out. That’s poetic justice. If Dr. Hansen can’t take the heat he should get out of the kitchen.

Bill Illis
December 22, 2009 6:18 am

Sorry, the link to the Science paper by Hansen in 1981 did not work. Here is another.
http://pubs.giss.nasa.gov/docs/1981/1981_Hansen_etal.pdf

drjohn
December 22, 2009 6:19 am

“If Dr. Hansen can’t take the heat he should get out of the kitchen.”
You mean get out of the Earth. LOL.

DirkH
December 22, 2009 6:19 am

But if Hansen were the CRUgate hacker why would any sceptic want to kill him? This makes no sense at all. Death threats? Links please! I wanna know which camp wanted to get him.

Steve in SC
December 22, 2009 6:25 am

I see that Herr Dr Professor Hansen has brought his entourage along for the ride.
Extrapolations are useful only in very limited cases because the error varies exponentially with the distance from the starting point. 1200 km? I don’t think so.
Any university associated with any team member should have their accreditation removed.

KeithGuy
December 22, 2009 6:25 am

In the UK we had a sadly departed comedic character called Stanley Unwin, who was famous for his creation of a kind of science babble language called Unwinese.
I wonder how he would describe Hansen’s work?…
“We taken the globally stationy, jangleybob and did an overstuffy in the tumloader, finisht the job with a ladleho of brandy butter, and a ‘twisty and corruptit of the basic numberyows.”

latitude
December 22, 2009 6:27 am

Hansen states that after all of this, routine data will have to be checked before it is released – “but that is a price that must be paid to minimize disinformation”.
So Hansen admits they have not been providing accurate information, have not been checking it before releasing it, and has the nerve to call it “disinformation” when they get caught doing that.
In my opinion, this is the sickest statement Hansen makes:
“Also, in my opinion, it is a mistake to be too concerned about contrarian publications – some bad papers will slip through the peer-review process, but overall assessments by the National Academies, the IPCC, and scientific organizations sort the wheat from the chaff.”
Since when did science change to “Papers with opposing views” are contrarian?
And since when are opposing views “bad” before they are even presented?
and sorted out?
The science is settled……..

Chris S
December 22, 2009 6:31 am

The Gospel according to Saint James.
The “Faithful” will be relieved.

Bob Kutz
December 22, 2009 6:33 am

I think the new gambit is to introduce adjusted data as raw data, since the raw is admittedly lost.
In that way, the warming becomes ‘real’.
http://www.ncdc.noaa.gov/img/climate/research/ushcn/ts.ushcn_anom25_diffs_urb-raw_pg.gif
If you take raw data, add this to it, and lose the raw data, you are all set. In fact, if you call the adjusted data ‘raw’, you can adjust it again!
Open question; if somebody went to microfiche archives of local newspapers and began extracting published daily highs and lows, could that rise to the level of scientific data, or is its provenance immediately suspect? Not that I have a lot of time, but at least I have experience in this type of ‘research’. If it was gone about in the same manner as the surfacestations.org project, could it produce a useful temperature record?
There may be better sources than local newspapers, but in my opinion anything related to an academic source has been under the control of people whose agenda is very much in question. At least with local newspapers, it would be very difficult for any one person or organization to have had widespread influence. Just some random thoughts from a layman in Iowa.

Anders L.
December 22, 2009 6:33 am

1DandyTroll
wrote:
““The distance over which temperature anomalies are highly correlated is of the order of 1000 kilometers at middle and high latitudes,”
Doesn’t this only holds true if the whole area, 1000 km, is either about equally cloudy or about equally free of clouds and are within the same weather systems, and of course being of the same elevation and situated about the same relative to the coast? For instance the temperatures between the beaches of LA don’t necessarily correlates with the temperatures 1000 km to the north east.”
If you had read what Dr. Hansen actually wrote, you would have noticed that he was talking about temperature ANOMALIES, not temperatures. So you missed the whole point.

Jim
December 22, 2009 6:34 am

It seems to me inconsistent that warming due to increased CO2 would be amplified by more water vapor due to the added warmth, while warming due to increased solar output would not. From the paper: “present understanding suggests only a small amplification.” Why would additional warmth from the Sun not evaporate more water and thereby cause what they call a “feedback” also?

DaveC
December 22, 2009 6:46 am

The “death train” quote is in this article.
The irony of a person who repeatedly calls for civil disobedience to fight AGW and who has testified on behalf of environmentalist vandals now needing a police escort to speak in public is sweet.

DaveC
December 22, 2009 6:48 am
Bob H.
December 22, 2009 6:50 am

I haven’t read through all of the comments, but the tone of the ones I have read is no better than those from RealClimate. Mr. Hansen stated they received data from three sources, presumably one of which is the CRU. If the data has been cooked before it gets to GISS, then even an honest scientist would come up with “alarmist” numbers. It’s known there are siting problems, and probably some other countries’ meteorological organizations have cooked the books, creating doubt. There is an old computer adage: “Garbage in, garbage out.”
Mr. Hansen may be an honest scientist, or not. If the CRU emails are any indication, then he and GISS weren’t involved in cooking the data. If GISS got garbage from CRU, then the analysis will be garbage, even if done correctly and without any bias. For the time being, I’d be willing to give him the benefit of the doubt. It’s entirely possible he and GISS got caught in CRU’s web of lies, as I believe he and GISS would accept the data from CRU as being honestly presented rather than cooked, as it seems to have been.

Gary Palmgren
December 22, 2009 6:51 am

Well Mr. Hansen, it is easy to show your work is trash. Long term temperature records from old stations are needed to estimate the climate trend. But there has been a steadily increasing population and a growing Urban Heat Island effect averaged over all stations. This has resulted in a bias toward increasing temperatures. This is recognized by everybody. Therefore the average of all “adjustments” to the temperature record must be a reduction in any upward trend in the raw data. If you do not have this, if you have not checked this, if you cannot demonstrate this, you are a self-deluded fool.
Instead of ensuring that your work is correct, you have become a political activist talking about death trains, imprisoning energy executives, and encouraging the stopping of power plants that I need to live. Your incompetence is a threat to my life.
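The sign check the commenter describes can be sketched in a few lines. All the numbers here are assumed for illustration (a 0.5 °C/century real trend plus a 0.3 °C/century spurious urban drift, not real station data): if an adjustment corrects an upward urban bias, the adjusted trend must come out lower than the raw trend.

```python
import numpy as np

years = np.arange(1900, 2001)
true_climate = 0.005 * (years - 1900)   # assumed real signal: 0.5 C/century
uhi_bias = 0.003 * (years - 1900)       # assumed spurious urban drift
raw = true_climate + uhi_bias
adjusted = raw - uhi_bias               # an adjustment that removes the drift

# Linear trend (degrees per year) of each series
raw_slope = np.polyfit(years, raw, 1)[0]
adj_slope = np.polyfit(years, adjusted, 1)[0]
print(raw_slope, adj_slope)  # adjusted slope is lower than raw slope
```

The commenter's claim is exactly this comparison applied to the published adjustment averages: the net effect of bias-correcting adjustments should lower, not raise, the raw trend.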

Bob H.
December 22, 2009 6:55 am

One more thing…it is a failing of a lot of people today to accept what comes out of a computer as being correct. This may be because of time constraints or the quantity of data dumped by the computer. I have found the output needs to be at the very least spot checked, and the question asked “Is the output reasonable?” This is something that seems to be missing in a lot of cases today.

December 22, 2009 6:56 am

It is interesting to me that the Trillions of dollars these guys are after is disappearing quicker than the snows on Kilimanjaro. Obama and his ilk around the world will reduce the level of GDP so much that the spoils of their efforts will disintegrate. So we’ll probably end up in a now completely socialist and cooling world with less infrastructure or ability to build it back. In the paraphrased words of Dean Wormer (at least as credible an expert on Global Warming as many!) “Cold, poor and stupid is no way to go through life, son.”

Pamela Gray
December 22, 2009 6:57 am

Dear Mr. Hansen,
Have you ever done a seed plot experiment? Let me express how you might do this. You take 20 or so seed plots, plant seed, and provide different treatments under controlled conditions. Through the experiment, you take data from the 20 or so seed plots. Then, for some reason, 50% of the plots disappear, and someone brings a BBQ into the greenhouse and sets it up next to a remaining plot, or WHATEVER. Instead of throwing the whole thing out and starting again, you continue to take data from fewer and fewer seed plots, including the plot of seed that now experiences the extra warmth of the BBQ, filling in the blanks where the now-gone plots used to be with data from the plots still there. That appears to be the way you would do this experiment, but please correct me if I’m wrong.
Were I to send this data to a statistician working for an agriculture company and say, “Try to get something out of this so we can sell seed,” I would be barred from ever bringing in a set of data again and would be demoted to sweeping the company sidewalks leading up to the greenhouses. Yet you are telling us that your data is just fine, never mind the station dropout or the BBQs, and you are selling your product regardless, along with a dog and pony show that makes it look good.
If you were a seed developer and this was the degree of research you applied to your product, I would not be buying your seed. Neither do I buy your temperature analysis.
Re: the whining in your post. Weren’t you arrested for something? And here you are complaining about other people’s boorish behavior.
