Pielke Senior: NASA GISS Inaccurate Press Release On The Surface Temperature Trend Data

From Dr. Roger Pielke Senior’s blog:

UPDATE PM JANUARY 16 2010 – Jim Hansen has released a statement on his current conclusions regarding the global average surface temperature trends [and thanks to Leonard Ornstein and Brian Toon for alerting us to this information]. The statement is If It’s That Warm, How Come It’s So Damned Cold? by James Hansen, Reto Ruedy, Makiko Sato, and Ken Lo.

My comments below remain unchanged. Readers will note that Jim Hansen does not cite or comment on any of the substantive unresolved uncertainties and the systematic warm bias that we report in our papers; they report only on their own research. This is a clear example of ignoring peer-reviewed studies that conflict with one’s conclusions.

***ORIGINAL POST***

Thanks to Anthony Watts for alerting us to a news release by NASA GISS (see) which reads

“NASA has not been involved in any manipulation of climate data used in the annual GISS global temperature analysis. The analysis utilizes three independent data sources provided by other agencies. Quality control checks are regularly performed on that data. The analysis methodology as well as updates to the analysis are publicly available on our website. The agency is confident of the quality of this data and stands by previous scientifically based conclusions regarding global temperatures.” (GISS temperature analysis website: http://data.giss.nasa.gov/gistemp/) [Note: I could not find the specific URL from NASA, so I welcome being sent the original source.]

This statement perpetuates the erroneous claim that the data sources are independent [I welcome information from GISS to justify their statement, and will post it if they do]. This issue exists even before considering any other concerns about their analyses.

I have posted a number of times on my weblog with respect to the lack of independence of the surface temperature data; e.g. see

Further Comment On The Surface Temperature Data Used In The CRU, GISS And NCDC Analyses

An Erroneous Statement Made By Phil Jones To The Media On The Independence Of The Global Surface Temperature Trend Analyses Of CRU, GISS And NCDC.

There also remain important unresolved uncertainties and systematic biases in the surface temperature data used by GISS [and by CRU and NCDC], which we reported in the peer-reviewed literature, i.e.

Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229

with only one Comment in the literature, by the CRU group, addressing just two of our issues:

Parker, D. E., P. Jones, T. C. Peterson, and J. Kennedy, 2009: Comment on “Unresolved issues with the assessment of multidecadal global land surface temperature trends” by Roger A. Pielke Sr. et al. J. Geophys. Res., 114, D05104, doi:10.1029/2008JD010450

which we refuted in

Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2009: Reply to comment by David E. Parker, Phil Jones, Thomas C. Peterson, and John Kennedy on “Unresolved issues with the assessment of multi-decadal global land surface temperature trends.” J. Geophys. Res., 114, D05105, doi:10.1029/2008JD010938

with the referees agreeing with our Reply (see reviews contained within this post).

The NASA GISS (and NCDC and CRU groups) have also not responded to the systematic warm bias that we reported in

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841.

Klotzbach, P. J., R. A. Pielke, Sr., R. A. Pielke, Jr., J. R. Christy, and R. T. McNider (2010), Correction to “An alternative explanation for differential temperature trends at the surface and in the lower troposphere”, J. Geophys. Res., 115, D01107, doi:10.1029/2009JD013655.

The GISS news release is symptomatic of a continued attempt to ignore scientific issues with their data analysis that conflict with the statement in the press release. This is not how the scientific process should be conducted.

We urge, based on the exposure of this type of behavior in the CRU e-mails, i.e. see

The Crutape Letters by Steven Mosher and Thomas W. Fuller, 2010, ISBN-10: 1450512437 / ISBN-13: 9781450512435,

that the suppression of alternative viewpoints end.

John Blake
January 17, 2010 6:51 pm

Back in mid-January 2009, GISS/NASA’s benighted James Hansen issued a pronunciamento to the effect that ’09 would register as “the hottest year on record.” Of course he said the same for 2006, 2007, and 2008, all of which tumbled out of bed on the downside.
Taking note of this extravagance, Dr. Pielke, Sr. bestowed on Hansen his coveted Bonehead Award for 2009. In line with Dr. Pielke’s newly minted tradition, perhaps this year’s Bonehead Trophy should go either to Ban Ki-moon (a truly impenetrable skull) or to Rajendra Pachauri, his kindred as an oh-so-diverse Statist oligarch. Lord knows, there are candidates enough to choose from.

LAShaffer
January 17, 2010 6:56 pm

[quote]Joel Shore (16:22:06) :
I am not sure how you get your intuition. What is your intuition for what would happen to you if you were placed in a room with 0.038% of plutonium?[/quote]
Are you insane? You’re comparing X-rays to microwave. Get a life. Get an EM spectrum chart. You obviously don’t have a clue. (Sorry, mod, I can’t take much more of this ignorance.)
[quote]deech56 (16:49:32) :
Really? Most of the atmosphere (O2 and N2) is transparent to infrared radiation,[/quote]
Demonstrably untrue by anyone with any knowledge of AAS/AES, unless rephrased to identify the specific wavelength ranges you are discussing. And the ranges had better be really d**m tight.
[quote] Ric Werme (17:10:52) :
At certain important infrared wavelengths of light,…[/quote]
What is this, a coordinated attack? Please explain in simple terms, in the same manner in which you whack jobs (sorry, that may not include you personally) always do, what exactly it is that makes one frequency of the EM spectrum more important than any other one? Is it your imaginary “windows”? Guess what, they are exactly that, IMAGINARY! There are no windows in the atmosphere, there is only the vacuum beyond it. The spectral “windows” are of no more importance than the lines on the map of the world, which separate countries from each other and are also – wait for it – IMAGINARY!
I have a suggestion for you true believers. The next time the temperature in your house starts heading south, simply open the CGA valve on a very large cylinder of compressed CO2 and bask in the warmth. (Seal your windows first; we wouldn’t want you to pollute the atmosphere the rest of us are trying to enjoy, would we?)

Smokey
January 17, 2010 7:01 pm

Leif Svalgaard (18:11:46),
You always find the weak part of an argument. That’s no fun!
Actually, I may not have been clear enough when I said 100% of solar radiation doesn’t warm the planet. I was referring only to those photons interacting with CO2 molecules, and which are then re-emitted into space. So a small part of the total [100%] incoming radiation from the Sun will be lost to space due to CO2.
Maybe 99.99% of photons are making it through the atmosphere because they avoid CO2 absorption, and 50% of those absorbed [i.e., .005% – half of the .01% that are absorbed] are ‘reflected’ back to space by CO2, leaving only the remaining .005% to warm the planet. You’re making me sweat here.
My point is that without any CO2, all of the incoming photons would pass through the atmosphere. But with CO2 there might be a cooling effect, because some of the photons captured by CO2 are re-emitted back to space, and cannot warm the planet. So adding CO2 may bleed off incoming solar energy, and have a slight cooling effect.
No more thought experiments! From now on I’m sticking to charts & graphs.
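[As a quick numerical check of the hypothetical fractions in the thought experiment above (the percentages are Smokey’s illustrative values, not measurements), a minimal sketch:]

```python
# Hypothetical fractions from the thought experiment above (illustrative only).
incoming = 1.0                  # all incoming solar photons, normalized
absorbed = 0.0001               # assume 0.01% interact with CO2
lost_to_space = 0.5 * absorbed  # half of those are re-emitted back to space

print(f"lost to space via CO2: {lost_to_space:.3%}")             # 0.005%
print(f"reaching the surface:  {incoming - lost_to_space:.3%}")  # 99.995%
```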

January 17, 2010 7:06 pm

Smokey (19:01:31) :
because some of the photons captured by CO2 are re-emitted back to space, and cannot warm the planet.
the standard argument is that of the LW photons trying to escape, some are returned to the surface by your very argument… hence warming.

Smokey
January 17, 2010 7:31 pm

Leif, deech,
You’re right, I had forgotten about outgoing radiation. Once I got started, my post took on a life of its own.
But my example of a CO2 molecule halfway between the Sun and the Earth is entirely reasonable. As Leif points out, what if it’s only one CO2 molecule? Can’t there be one CO2 molecule passing halfway between the Earth and the Sun? After all, it was just a thought experiment to show that a re-emitted photon wouldn’t be likely to hit the Earth from that distance/altitude.

Joel Shore
January 17, 2010 7:35 pm

Theo Goodwin says:

Old Jim Hansen just trots out his same old assumption. He believes that he can estimate temperatures in adjacent grids that have no sensors and do so without falsifying the data.

Actually, if you read the statement that Hansen et al. have put out ( http://www.columbia.edu/%7Ejeh1/mailings/2010/20100115_Temperature2009.pdf ) , you will see that they have done considerable testing of the assumption that they make. In fact, their testing of this goes back many, many years.
And, they do NOT estimate temperatures from temperatures in adjacent grids. They estimate temperature ANOMALIES in this way. The two are very different beasts: Temperatures are not strongly correlated over distance. (In fact, if you consider a mountainous region like the top of Mt. Washington and the valley only a few miles away, you can see how a few miles can make a huge difference in surface temperature!) However, temperature ANOMALIES turn out to be correlated over quite large regions. I.e., if we have a cold month here in Rochester, it is also likely to be colder than average in, say, Buffalo and Syracuse and the Adirondack Mountains…and, to a slightly lesser degree, even more distant places like New York City, Boston, Pittsburgh, Philadelphia, and Cleveland.
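[A toy illustration of this distinction, with made-up numbers — a sketch of the concept only, not GISS code:]

```python
import numpy as np

# Made-up January mean temperatures (°C) for two nearby stations (illustrative).
valley = np.array([-3.0, -5.5, -1.0, -4.0, -2.5])
summit = np.array([-18.0, -20.5, -16.5, -19.0, -17.0])

# Departures from each station's own mean: the "anomalies".
valley_anom = valley - valley.mean()
summit_anom = summit - summit.mean()

# Using the valley's raw temperature to estimate the summit's is ~15 °C off...
print(np.abs(valley - summit).mean())            # -> 15.0
# ...but using the valley's anomaly to estimate the summit's anomaly is close.
print(np.abs(valley_anom - summit_anom).mean())  # -> 0.2
```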

Joel Shore
January 17, 2010 7:49 pm

LAShaffer says:

[quote]Joel Shore (16:22:06) :
I am not sure how you get your intuition. What is your intuition for what would happen to you if you were placed in a room with 0.038% of plutonium?[/quote]
Are you insane? You’re comparing X-rays to microwave. Get a life. Get an EM spectrum chart. You obviously don’t have a clue. (Sorry, mod, I can’t take much more of this ignorance.)

***I*** don’t have a clue? I won’t even start trying to dissect what is wrong with your statement. I will merely point out that I was giving an example of why one cannot simply use intuition to say, “Boy, 0.038% is such a small quantity, it can’t possibly have an effect.” I was not yet explaining why, in the particular physically-relevant case of CO2 and the absorption of infrared radiation, the 0.038% can have a significant effect.

[quote]deech56 (16:49:32) :
Really? Most of the atmosphere (O2 and N2) is transparent to infrared radiation,[/quote]
Demonstrably untrue by anyone with any knowledge of AAS/AES, unless rephrased to identify the specific wavelength ranges you are discussing. And the ranges had better be really d**m tight.

I have no clue what you are trying to say here…but the fact is that O2 and N2 in the atmosphere have an essentially negligible amount of absorption of infrared radiation emitted at terrestrial temperatures.

What is this, a coordinated attack? Please explain in simple terms, in the same manner in which you whack jobs (sorry, that may not include you personally) always do, what exactly it is that makes one frequency of the EM spectrum more important than any other one?

Well, Planck’s Law, for one, which specifies the distribution of wavelengths over which an emitter at a certain temperature emits EM radiation ( http://en.wikipedia.org/wiki/Planck%27s_law ). The two most relevant emission spectra are then the solar spectrum (which emits largely in the visible and also into the infrared and near-IR) and the terrestrial spectrum (which is solidly in the infrared). (See, e.g., http://ockhams-axe.com/yahoo_site_admin/assets/images/emissionspec.293103427_std.gif )
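[For a rough sense of why those two spectra barely overlap, Wien’s displacement law (peak wavelength ≈ 2898 μm·K / T) puts their emission peaks a factor of ~20 apart; a quick sketch:]

```python
WIEN_B = 2898.0  # Wien's displacement constant, in μm·K

for name, temp_k in [("Sun (photosphere, ~5778 K)", 5778.0),
                     ("Earth (surface, ~288 K)", 288.0)]:
    print(f"{name}: peak emission near {WIEN_B / temp_k:.2f} μm")

# Sun:   ~0.50 μm (visible light)
# Earth: ~10.06 μm (thermal infrared)
```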

Mike Bryant
January 17, 2010 7:55 pm

Right Joel, all the work that E. M. Smith has done demonstrating the gross dishonesty in the numbers has been answered by your short refutation… Riiiiighhhht………….

Joel Shore
January 17, 2010 7:58 pm

Ric Werme says:

What you can say, and this is one of the key reasons I gave up on AGW, is that there’s so much CO2 in the atmosphere that adding more has much less impact than it did before it became nearly opaque.

…Which just demonstrates that you have not understood the basic physics of the greenhouse effect. See here http://www.aip.org/history/climate/simple.htm#L_0623 for a further explanation in an historical context, but the basic point is not whether or not a single absorption occurs. Rather, what you have is multiple absorptions and emissions, and what ends up being relevant is the temperature of the layer from which most of the radiation that escapes into space (without further absorption) is emitted. Increasing CO2 increases the height of this effective emitting layer and, since tropospheric temperatures decrease with height, the layer from which that emission occurs is colder. However, since the intensity of emission goes as temperature to the fourth power, this means that less energy is being emitted than is being absorbed from the sun, and the earth responds by warming until radiative balance is restored.
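[A minimal numerical sketch of the mechanism Joel describes (the 255 K effective emission temperature and 6.5 K/km lapse rate are standard textbook values; the 100 m rise in emission height is a hypothetical increment chosen purely for illustration):]

```python
SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
LAPSE = 6.5 / 1000  # typical tropospheric lapse rate, K per metre

t_emit = 255.0                   # effective emission temperature, K (~ -18 °C)
rise_m = 100.0                   # hypothetical rise of the emission height, m
t_new = t_emit - LAPSE * rise_m  # the higher emitting layer is colder

deficit = SIGMA * (t_emit**4 - t_new**4)
print(f"outgoing emission drops by ~{deficit:.1f} W/m^2")
# ~2.4 W/m^2 less is emitted than absorbed, so the surface warms
# until the emitting layer is back to ~255 K and balance is restored.
```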

davidc
January 17, 2010 8:49 pm

Some comments on: If It’s That Warm, How Come It’s So Damned Cold? by James Hansen, Reto Ruedy, Makiko Sato, and Ken Lo (link above in the lead comments).
This seems to have been written during the current NH cold spell. Since conditions are so obviously not what they’ve been predicting you would think that a comment with a title like this would be their best shot. Well, I find nothing convincing here at all. In fact, they hardly seem to be trying.
They say: “There is a contradiction between the observed continued warming trend and popular perceptions about climate trends. Frequent statements include: ‘There has been global cooling over the past decade.’”
The main reason for this (they say) is that HadCRUT has 1998 as the hottest year, while GISS has 2005. Investigating the difference, they show that if they adopt the same gridding as HadCRUT they get the same result (their Figure 4; warmest 1998, slight cooling since then). They don’t seem to notice that what they are demonstrating is that the whole of the rise they claim comes not from the actual data but from the interpolated “data”. Remove the interpolation, remove the warming. In passing, they also demonstrate that the interpolation is in no way necessary to get a result. They can do it without; they just get the wrong answer.
In attempting to justify their methods, they say that with interpolation “the patterns of warm and cool regions have realistic-looking meteorological patterns”, not bothering to say what it looks like without (are the actual observations maybe “unrealistic”?).
They acknowledge that this is qualitative support, and continue: “One way to estimate …uncertainty, or possible error, can be obtained via use of the complete time series of global surface temperature data generated by a global climate model that has been demonstrated to have REALISTIC spatial and temporal variability of surface temperature” [my caps]. No doubt this REALISTIC means that the model output agrees with the assumptions that underlie their interpolation algorithm. And yes, they find the “errors” are very small.
As far as I can see, that’s all they’ve got.
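[A toy illustration of the interpolation point davidc makes (entirely synthetic numbers; this mimics neither analysis exactly, only the arithmetic of infilling empty cells before averaging):]

```python
import numpy as np

# A strip of grid-cell anomalies (°C); NaN marks cells with no station (synthetic).
cells = np.array([0.1, np.nan, np.nan, np.nan, 0.9])

# HadCRUT-style: average only the observed cells.
obs_only = np.nanmean(cells)                # (0.1 + 0.9) / 2 = 0.50

# GISS-style (crudely): give each empty cell the value of its nearest
# observed neighbour, then average everything.
obs_idx = np.where(~np.isnan(cells))[0]
filled = cells.copy()
for i in np.where(np.isnan(cells))[0]:
    filled[i] = cells[obs_idx[np.argmin(np.abs(obs_idx - i))]]

print(obs_only, filled.mean())              # 0.50 vs 0.40
```

The two averages differ because infilling changes how much weight each observed station ends up carrying in the mean.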

Pofarmer
January 17, 2010 9:12 pm

“this means that less energy is being emitted than is being absorbed from the sun and the earth responds by warming until radiative balance is restored.”
Unless a cloud floats by.

Ric Werme
Editor
January 17, 2010 9:19 pm

Joel Shore (19:58:59) :
Ric Werme says:

What you can say, and this is one of the key reasons I gave up on AGW, is that there’s so much CO2 in the atmosphere that adding more has much less impact than it did before it became nearly opaque.

…Which just demonstrates that you have not understood the basic physics of the greenhouse effect. See here http://www.aip.org/history/climate/simple.htm#L_0623 for a further explanation in an historical context, but the basic point is not whether or not a single absorption occurs. Rather, what you have is multiple absorptions and emissions, and what ends up being relevant is the temperature of the layer from which most of the radiation that escapes into space (without further absorption) is emitted.

One problem I have with the historical contexts people like to fall back on is that I like to think science has progressed since the days of Herschel (solar activity and climate), Tyndall, and Arrhenius. In particular, the early CO2 studies neglected convection, and while the CO2 blanket slows down some of the IR transport, convection provides another means of transporting heat upward. The more CO2 slows down IR radiation, the more important other processes become, and that simply wasn’t examined. As far as I know, it may still not be well examined or understood.
While appeals to century-old science may help bolster arguments about the science being settled, they also imply that temperature should be tracking the Keeling CO2 curve, but it’s not.
Getting back to my original comment – are you suggesting that adding 100 ppm to current CO2 levels will have the same impact that the first 100 ppm did?

Pofarmer
January 17, 2010 9:21 pm

Uhm , that would be a cloud, not a could.
[REPLY – How many colds would a cold cloud cause if a cold cloud could cause colds? ~ Evan]

kadaka
January 17, 2010 11:49 pm

rbateman (16:51:32) :
A machinist would say that the CO2 concentration in the atmosphere is .38 of a thousandth.
Try opening your dial caliper to 1/3 of a thousandth. Takes some practice and due diligence to deal with the backlash.

That’s what micrometers are for. Hope it’s a good one with carbide faces. Set the four-tenths on the Vernier marks, you can eyeball it just under for .38, flick the spindle lock. If really fussy you can check the gap on an optical comparator.
And it better be a regular type. Electronic digitals are cute with the implied precision, but just try getting the correct feel for them to repeat that very last digit every time. And the mechanical digitals are only good until you drop them one time too hard and/or one time too many. 😉
Of course, machinists know that four-tenths can be critical, even one tenth can be important, especially on tight tolerances like small holes. So they likely wouldn’t say that. One good fart spread throughout the shop, now that would be a good description.

R John
January 17, 2010 11:51 pm

Joel Shore —
C’mon, how do you get around the logarithmic effect of the absorption of CO2 and any other gas? Adding CO2 at this point has zero effect on its absorbance!
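[For reference, the logarithmic relationship R John invokes is often written in the simplified form ΔF ≈ 5.35 ln(C/C0) W/m², the expression commonly attributed to Myhre et al. (1998); a quick sketch of how each successive 100 ppm contributes less than the one before:]

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified forcing expression: dF = 5.35 * ln(C / C0), in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

prev = 0.0
for c in (280, 380, 480, 580):
    f = co2_forcing(c)
    print(f"{c} ppm: {f:+.2f} W/m^2 (increment {f - prev:+.2f})")
    prev = f
# Increments: +1.63, +1.25, +1.02 — diminishing with each step.
```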

steven mosher
January 17, 2010 11:51 pm

Dr. Pielke.
It’s very interesting the way certain factions of climate science leave unfinished threads and unresolved issues. One such “hole” in the science is Peterson’s postulate (his 2003 paper on UHI), where he postulates (his word) that temperature stations should be located in “cool parks.” That seemed a fair “postulate” that was begging for on-site verification. With Surfacestations at over 1000 sites visited, perhaps we can say something definitive about this “cool park” postulate.
Also, thx for the plug

rbateman
January 18, 2010 12:47 am

“If it’s that warm, how come it’s so damned cold?”
Top 10 Answers:
10.) It never was that warm; you have been spotted with Heidi DeKline.
9.) It hasn’t been that warm in 10 years; Heidi dumped you for Frosty.
8.) The cold has to gain momentum before it drags the warm down with it kicking & screaming. (no snickering !)
7.) That’s the last time we let you bring burritos on the Polar Expedition.
6.) Hey, you made this stuff up, you tell me.
5.) This box of Frosted Climate Charms is packed by weight, not volume, some settling of science may occur.
4.) You’ve got a fever. Remember that Twilight Zone with Mrs. Bronson?
3.) Drinking in the sauna again, I see.
2.) Ask the guy with the pitchfork. He says he’s come to collect on a debt.
1.) It’s official: Hell is frozen over.

bob watkins
January 18, 2010 3:29 am

Dear Anthony,
I have been following your postings for a couple of months and I am obviously the reason for the huge increase in the number of hits that you have been having. Well done you!
I am not a meteorologist or a scientist. I am not a denier or an alarmist. But I do like to reduce our collective use of resources – be it oil or coal or gas or gold.
I was particularly interested in two recent posts “Coal Creek Redux” and “Darwin Zero”. I have been looking at some temperature data from Macau (or Macao if you prefer). Don’t ask why but I was.
For those that don’t know, Macau is a very small port city on the southern coast of China (http://en.wikipedia.org/wiki/Macau but use with the usual caution about wiki). It used to be a Portuguese colony until recently, and they have weather records that go back quite a long way. I have no special knowledge about the weather recording station there – it may be that it was a mercury thermometer hanging out of somebody’s window for all I know – but I doubt it. Macau is a very small place and I suspect that the weather station was in the old Portuguese fort (probably for all of its life) and that it was read religiously from when it was set up.
Anyway, cutting to the chase, the Macau meteorological service has a website which lists a hundred years of temperature data, 1901-2000 (see http://www.smg.gov.mo/www/smg/centenary2/index_four.htm ). They have more data as well, going forward and back, but not as easily accessible. I thought that I’d take a look at it. It is not in an easy database format, but with a bit of effort I was able to export the mean temperature data, a year at a time, on a daily basis. The advantage, it appeared to me, was that daily means, minima and maxima were less likely to have been ‘adjusted’. I thought, well, does this show any warming or cooling over the century?
Most of the analyses that I have seen elsewhere present an annual mean temperature and see what the trend is over the period. I did this for Macau. See below.
What’s up with that? Well in this case we have 55 annual points covering the 100 year period (because I have not completed grabbing the data). Each of these points represents the mean value from 365 (or 366 in a leap year) daily mean temperature values for the year in question. The trend line, using a linear regression, provides an estimate for the mean temperature in 1900 of 22.28°C and in 2000 of 22.62°C. Thus, we have an estimated warming of 0.34°C in the century from 1900 to 2000.
As you know, each time that you get a mean or average value of a variable and state it by itself, you lose some of the information that lies behind the mean. Thus a yearly mean temperature loses a lot of information about the variability within that year. I thought that getting the mean monthly temperatures might be a second estimate – and this is what the plot looks like for the years that I have grabbed so far. I’m sure somebody with more time could grab the years that I have not got so far (1906 to 1910, etc.).
What’s this mean? If taken at face value, it looks as if the mean of the ‘mean monthly temperatures’ averaged month at a time was about 22.4°C. It also appears to mean that the trend of increase in the mean temperature is 0.0003°C per month. (We need to be a bit careful because I’ve used Microsoft Excel and Excel does some strange things when you use dates and converts them to numbers – starting at 1 January 1900). This gives an estimated potential increase of 0.39°C for the century from 1900 to 2000.
Hold on a second though, this is giving equal weight to every month in the year. Are the very hot months and the very cold months skewing the data? Another way of looking at the same data is to check what happens in each quinquennium. See below:
In this case, the slopes of the trend lines are probably not relevant since they are only taken over a short time period of five years. We can see, however, that there is no large shift in the curves from the period 1901-1905 to the period 1996-2000. We do get estimates of 22.49°C for the first quinquennium and 22.71°C for the last quinquennium of the century. This represents an estimated warming trend of 0.22°C (over 95 years) or 0.23°C over 100 years.
Another way of looking at the same data is to first get the mean values for each month – we know that there is a nearly periodic cycle of about 12 months which affects the temperature. If we take the average for each month over the 100 years then (so far) we get the following average annual cycle:
We can use these monthly means to get the deviations in each year from the monthly means and therefore de-trend the monthly data for the effect of the annual cycle.
This de-trended data, using 660 monthly data points (out of an available 1200 data points for 100 years) provides a very slight estimated warming trend of 0.03°C per century. You can see that the spread of data is roughly in the range ±5°C and there do not appear to be ‘outliers’ of large magnitude in the data set. However, it might be as well to look at the distribution of the de-trended data. See below:
The distribution of the de-trended data is shown on the histogram. Also plotted is a Normal distribution for comparison. The data looks pretty normal and the extreme values – the hottest and coldest months in the century do not look outstandingly hot or cold – the sort of values that might be expected.
Of course, we can do a further check on this – and I suggest that somebody with more time could do so. This is to grab to maximum and minimum temperature data sets on a daily basis and check that the maximum monthly temperatures and the minimum monthly temperatures always dance above and below the mean monthly temperatures respectively. Another advantage of checking the maxima and minima would be that one could see whether the maxima are getting larger (hot getting hotter) or the minima are getting smaller (cold getting colder) or vice versa. This would tell us more about why there is a warming or cooling trend – if indeed there is such a trend.
(Just as an aside, there seems to be a tendency for modern environmental scientists to discard ‘outliers’ just because they are outliers. Clearly, there is a case for doing so if, for instance, somebody has misread the original handwritten record and has read ‘54’ instead of ‘34’, but I think one should only discard outliers if one can see why – and then tell the world which outliers have been discarded and why.)
Oh, I forgot to mention, Macau used to be a very small place indeed as far as population is concerned. In recent years its population has grown quite considerably. According to the World Bank, Macau had a population of 172,608 in 1960 and 526,178 in 2008 (http://www.google.com/publicdata?ds=wb-wdi&met=sp_pop_totl&idim=country:MAC&q=macau+population ). I do not know what the population was in 1900, but I would guess at less than 50,000, and maybe much fewer. Should there be a sign of a UHI (urban heat island) here? I suspect not, but that may be for others to look at.
What’s the point of all this? Well I think the first lesson that it teaches is that for each temperature reading station we need to have some history. Where was the temperature read and how? Has the temperature reading or system been changed over time? These are some of the points that were being made in the post about Darwin. Is the station data from a consistent source and is it reliable? (By the way, Willis might want to look at temperature data from nearby Indonesian islands such as Bali to compare with Darwin – probably more relevant than Alice Springs.) The second lesson is that each station needs to be analysed in detail before it is added to the global database. The third lesson is that using a linear regression trend to project forward may or may not be appropriate. Are there any other trends within the data which can be decomposed and thus eliminated from the data? Is Macau relevant in any way as a station to search for global warming? Search me!
I have not completed the analysis of the Macau data – just started in fact – but you can get different answers depending upon the analysis – particularly if you have a small number of data points.
As you can see, I’ve just done what I’m suggesting that nobody should do – presenting data that is incomplete and only partially analysed!
Cheers
Bob the Builder
PS You can see that I’m an amateur and can’t submit the charts that go with this post. Let me know how and I’ll send them separately.
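[For anyone who wants to reproduce the de-trending (strictly, de-seasonalising) step Bob describes, a minimal sketch in Python/pandas — the CSV file name and column names here are hypothetical, and the monthly means must first be exported from the SMG site as he describes:]

```python
import numpy as np
import pandas as pd

# Hypothetical file of monthly mean temperatures: columns "date" (YYYY-MM)
# and "t_mean" (°C), exported from the SMG pages as described above.
df = pd.read_csv("macau_monthly_means.csv", parse_dates=["date"])
df["month"] = df["date"].dt.month

# Long-term mean for each calendar month: the average annual cycle.
climatology = df.groupby("month")["t_mean"].transform("mean")

# Deviations from the monthly means, i.e. the series with the annual cycle removed.
df["deviation"] = df["t_mean"] - climatology

# Linear trend on the de-seasonalised series, expressed per century.
years = (df["date"] - df["date"].min()).dt.days / 365.25
slope_per_year = np.polyfit(years, df["deviation"], 1)[0]
print(f"estimated trend: {slope_per_year * 100:+.2f} °C per century")
```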

deech56
January 18, 2010 3:38 am

RE Ric Werme (17:47:32) :

deech56 (16:49:32) :
RE: Peter Miller (15:51:24) :
> I’m surprised nobody else answered your concerns.
Sorry – we’re only on here too much, not all the time.

Oh, is there egg on my face. After posting I did notice that I was replying to a fairly recent post; not only that, apparently the excellent reply by Joel Shore was in the queue. Oops.

deech56
January 18, 2010 3:45 am

RE LAShaffer (18:56:01) :

What is this, a coordinated attack? Please explain in simple terms, in the same manner in which you whack jobs (sorry, that may not include you personally) always do, what exactly it is that makes one frequency of the EM spectrum more important than any other one? Is it your imaginary “windows”?

I would guess that all the “whack jobs” have their physics textbooks open to the same chapter. Just think of it as independent confirmation of results. 😉

Douglas Haynes
January 18, 2010 3:48 am

For Joel
Can we not examine the pCO2-Temp-Time space as indicated from the EPICA Dome C ice cores to get a more empirical indication of the GHG effect of CO2 at concentrations above 280 ppmv? Doesn’t the pCO2-Temp-Time space indicate that as earth’s mean surface temperatures rise after each glacial epoch, pCO2 then rises 400 to 1200 years AFTER the temperature rise – through ocean CO2 degassing as the mean surface temperatures rise? And conversely, as global mean surface temperatures fall at the onset of the next glacial epoch, pCO2 falls 800 to 2000 years AFTER the temperature fall – through ocean solubilisation as surface temperatures cool. Remember that this pCO2-Temp-Time space operates at about 280 ppmv CO2. So how can CO2 be acting as a significant greenhouse gas at these concentrations, and higher, noting that its concentration in the atmosphere FOLLOWS global mean surface temperature excursions? This supports the conclusion that adding more CO2, above about 20 ppmv, does not cause an enhanced greenhouse gas effect as asserted – and it is indeed just that, an assertion, by the “AGW modellers”. In other words, the pCO2-Temp-Time space defined from a study of the ice cores is indeed consistent with empirical observations on the logarithmic drop-off in CO2 infrared absorbance as its concentration increases.
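[The lead–lag relationship Douglas describes is the kind of thing one can estimate with a simple cross-correlation; a sketch on synthetic series (made-up data, not the EPICA record; the true lag is set to 8 samples purely for illustration):]

```python
import numpy as np

rng = np.random.default_rng(0)
temp = np.cumsum(rng.normal(size=2000))  # random-walk "temperature" proxy
true_lag = 8                             # CO2 follows temperature by 8 steps
co2 = np.roll(temp, true_lag) + rng.normal(scale=0.5, size=temp.size)

# Correlate the series at a range of candidate lags; the peak marks the delay.
lags = np.arange(-20, 21)
corr = [np.corrcoef(temp[20:-20], co2[20 + k: co2.size - 20 + k])[0, 1]
        for k in lags]
print("estimated lag:", lags[np.argmax(corr)])   # -> 8 (CO2 lags temperature)
```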

deech56
January 18, 2010 3:58 am

RE Ric Werme (21:19:15) :

One problem I have with the historical contexts people like to fall back on, is that I like to think science has progressed since the days of Hershel (solar activity and climate), Tyndall, and Arrhenius. In particular, the CO2 studies neglected convection, and while the CO2 blanket slows down some of the IR transport, convection provides another means of transporting heat upward. The more CO2 slows down IR radiation, the more important other processes become, and that simply wasn’t examined. As far as I know, it may still not be well examined or understood.

Ric, you raise an interesting point, but we should look on these earlier studies as the building blocks of the later studies. They are incomplete in the same way that Watson and Crick (1953) was incomplete. (You might also note that their earliest papers had an error in the number of hydrogen bonds between C and G.) On a related note, Robert Grumbine wrote an interesting post about successive approximations. Reading the climate literature, especially the idea of climate sensitivity, is a study in the progress scientists have made in their understanding of climate.

E.M.Smith
Editor
January 18, 2010 5:33 am

Joel Shore (16:22:06) : Since the forcings on which Scenario A was based didn’t come to pass, […] and, in fact, the forcings used for Scenario B turn out to be quite close (although apparently still a little bit high) to what came to pass.
Sorry, I’m having a bit of trouble finding “forcing” in my physics book, what S.I. units is a “forcing” measured in? Can’t use it in a calculation if you don’t know what its units mean, after all. So is it degrees / day or delta degrees / day or ergs per fortnight or what?
Joel Shore (19:35:43) : And, they do NOT estimate temperatures from temperatures in adjacent grids. They estimate temperature ANOMALIES in this way.
Ah, yes, the old “Use the Anomaly Luke, the Anomaly will save us!” approach.
Well, your first problem is that you have to have a temperature in that adjacent “grid” (actually, grid box) before you can calculate the anomaly. One heck of a lot of these are simply “made up” from “nearby” station data up to 1000 km away. Real data in the baseline, made up in the comparison. Voila! Anomaly! (Works for Bolivia, all of it, as well as the Canadian Arctic – almost all of it, except for “the garden spot of the arctic” where a thermometer was left in Eureka…) and many many more.
But GIStemp calculates its UHI adjustment before the anomaly map step, and does it using interpolated temperatures from up to 1000 km away. And it does infill from up to 1000 km away using temperatures. And it homogenizes using temperatures. And… But yes, it does average an entire 10 of them together first, so while the code calls this an “offset” you can feel free to call it an anomaly if you like. I certainly think the GIStemp UHI calculation is an anomaly. After all, it gets the sign wrong about 1/4 to 1/3 of the time…
The two are very different beasts: Temperatures are not strongly correlated over distance. (In fact, if you consider a mountainous region like the top of Mt. Washington and the valley only a few miles away, you can see how a few miles can make a huge difference in surface temperature!)
Yes you can. Well, actually, no you can’t. Since mountain tops have been removed from GHCN by NOAA / NCDC with near-religious zeal, you can hardly find the poor dears in the basic data set that underpins all of the NCDC adjusted, GIStemp, and HadCRUT data series. The Andes are gone. The Canadian Rockies are gone. Poor Japan, flat as Kansas. No thermometer over 300 m elevation. It would be nice to be able to see the difference between cold places on mountains and everywhere warmer down lower, but we can’t. Well, you can, but only if you look back in the baseline for those anomalies where the cold locations are left in and then compare them to the present where they are taken out.
Oh, I’m sorry, that WAS your point! And here I missed it. Those wonderful Anomaly Maps that are all rosy red are there to show us what it looks like when you compare a cold mountain baseline with a warm valley / beach present, and I didn’t realize it. Silly me. They are just doing it for educational purposes.

However, temperature ANOMALIES turn out to be correlated over quite large regions. I.e., if we have a cold month here in Rochester, it is also likely to be colder than average in, say, Buffalo and Syracuse and the Adirondack Mountains…

And if we have a hot patch of tarmac at Diego Garcia in the tropical sun in the Jet exhaust, we are likely to have a cold patch of ocean water 1200 km away out to sea, but no worries, we’ll just take that Island Tarmac temp and fill in that 2400 km diameter circle of ocean with it, since it’s just an anomaly after all… (Yes, that is EXACTLY what GIStemp does.)
BTW, think on this: Lodi vs. San Francisco. They have a nice positive correlation throughout most of the year. Then comes summer. When it gets hot in Lodi ( like 100+F or 40 C) the valley air heats up and rises. Pulls the fog blanket in over SFO and cools them down nicely (like 50 F) . So you compute your ‘average offset’ from SFO (where NCDC in GHCN has a thermometer) and then use that to fill in LODI, which will even work for part of the year, sort of. Until summer. Then LODI will be given a way wrong number. But don’t worry, when LODI is stuck under valley tulie fog in winter, it’s way off the other way. So these two errors might accidentally offset. Maybe.
My favorite, though, is Pisa Italy where the UHI correction is 1.4 C in the wrong direction because they reference to a site in the German approach to the Alps as a “nearby” station…
Oh, and in the Pacific Basin 100% of reporting stations are scheduled to be Airports real soon now (they almost are already). Good luck finding a “rural reference” for detrending that UHI. (Oh, so sorry, I forgot, major airports are flagged as “rural” in GHCN so you can just use them to correct each other… kind of like the Marine Base at Quantico, Virginia is classed as rural. You know, near the air strip where right about now I’d expect near-continuous air ops, what with supporting troops in 2 theatres along with a disaster relief op in Haiti.) I’m sure there will be No Problems At All getting a nice clean red anomaly out of them…
/sarcoff>
I’m still looking for the real beef in this pile of anomaly bull and all I find are a load of hypothetical cows.
I’ll believe Hansen and his code when:
1) The QA data and suite are presented along with the QA run output.
2) They put the thermometers back in AND NOTHING CHANGES. (Oh, and put back in the SAME data, please. Not like that USHCN hack, where they left out the USA thermometers from May 2007 to Nov 2009, but finally put them back in; but only AFTER cooking the data some more so USHCN.Version2 has warming baked in that matches GIStemp)
3) A full benchmark suite and benchmark data are provided that measures and demonstrates exactly what the program does with representative data, white noise, red noise, biased hot series, biased cold series, and dead flat series. That will let you calculate the filter “Q” of GIStemp. A fundamental, and lacking, specification.
Until then, all you are saying is “GIStemp is a 100% perfect filter and can remove horrid levels of data bias with absolute perfection”. And since that isn’t possible, it’s a void statement. You must measure the Q of a filter, not assume it nor fantasize about it.
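[For readers unfamiliar with the infilling E.M. Smith describes, a sketch of the general idea (the linear distance taper and the 1000 km radius simply mirror his description above; this is an illustration, not the actual GIStemp code):]

```python
import numpy as np

def infill_anomaly(target_xy, station_xy, station_anoms, radius_km=1000.0):
    """Estimate an empty gridbox's anomaly from surrounding stations, with
    weights that fall off linearly to zero at radius_km (illustrative scheme)."""
    d = np.linalg.norm(np.asarray(station_xy) - np.asarray(target_xy), axis=1)
    w = np.clip(1.0 - d / radius_km, 0.0, None)
    if w.sum() == 0.0:
        return np.nan                     # no station close enough
    return float(np.average(station_anoms, weights=w))

# A single warm airport station 900 km away is the only data in range, so it
# alone sets the "anomaly" for the whole otherwise-empty gridbox.
print(infill_anomaly((0.0, 0.0), [(900.0, 0.0)], [1.2]))   # -> 1.2
```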

Mike Bryant
January 18, 2010 5:43 am

Very nice explanation, Mr. Smith. Looking forward to Joel’s refutation.. but not expecting it. Joel sometimes even admits it when he missteps. This would be a good time to do it.

deech56
January 18, 2010 6:30 am

RE Douglas Haynes (03:48:06) :

For Joel
Cannot we examine the pCO2-Temp-Time space as indicated from the Epeica Dome C ice cores to get a more empirical indication on the GHG effects of CO2 effect at concentrations above 280ppmv? Doesn’t the pCO2-Temp-Time space indicate that as earth’s mean surface T’s rise after each glacial epoch, pCO2 then rises 400 to 1200 years AFTER the T rise – through ocean CO2 degassing as the mean surface T’s rise?

I’m not Joel, but I will make a couple of points.
1. The change in CO2 occurs about 800 years after the start of the temperature change, but continues throughout the warming and cooling phases.
2. The forcing from Milankovitch cycles cannot account for the temperature change without the inclusion of feedbacks – and we know that CO2 is a greenhouse gas and as CO2 levels increase, they feed back on the earth’s heat balance.
3. Calculations of climate sensitivity have included the analysis of the last glacial maximum, and the number that comes up (1.2–4.3 °C) is close to the IPCC estimates. Of course, compared to the last million years plus, we are in uncharted territory in a way, so it is useful to look back even further, at a study in which the authors found a range of 1.6–5.5 °C; again, consistent with the IPCC estimates.
OK – that’s 3 points, and there’s probably a better answer in the queue.