Calculating global temperature

I’m happy to present this essay created from both sides of the aisle, courtesy of the two gentlemen below. Be sure to see the conclusion. I present their essay below with only a few small edits for spelling, format, and readability. Plus an image, a snapshot of global temperatures.  – Anthony

http://veimages.gsfc.nasa.gov/16467/temperature_airs_200304.jpg

Image: NASA. The Atmospheric Infrared Sounder (AIRS) instrument aboard NASA’s Aqua satellite senses temperature using infrared wavelengths. This image shows the temperature of the Earth’s surface, or of the clouds covering it, for the month of April 2003.

By Zeke Hausfather and Steven Mosher

People have a variety of questions about the calculation of a global temperature index, ranging from the selection of data and the adjustments made to it, to the actual calculation of the average. For some, there is even a question of whether the measure makes any sense at all. It’s not possible to address all these questions in one short piece, but some of them can be addressed and reasonably settled. In particular, we are in a position to answer the questions about potential biases in the selection of data and in how that data is averaged.

Before moving the discussion on to the important matters of adjustments to data or, for example, UHI issues in the source data, it is important to settle some answerable questions. Namely: do the averaging methods used by GISS, CRU, and NCDC bias the result? There are a variety of methods for averaging spatial data; do the methods selected and implemented by the big three bias the result?

There has been a trend of late among climate bloggers on both sides of the divide to develop their own global temperature reconstructions. These have ranged from simple land reconstructions using GHCN data (either v2.mean unadjusted data or v2.mean_adj adjusted data) to full land/ocean reconstructions and experiments with alternative datasets (GSOD, WMSSC, ISH).

Bloggers and researchers who have developed reconstructions so far this year include:

Roy Spencer

Jeff Id

Steven Mosher

Zeke Hausfather

Tamino

Chad

Nick Stokes

Residual Analysis

And, just recently, the Muir Russell report

What is interesting is that the results from all these reconstructions are quite similar, despite differences in methodologies and source data. All are also quite comparable to the “big three” published global land temperature indices: NCDC, GISTemp, and CRUTEM.

[Fig 1]

The task of calculating global land temperatures is actually relatively simple, and the differences between reconstructions can be distilled down to a small number of choices:

1. Choose a land temperature series.

Ones analyzed so far include GHCN (raw and adjusted), WMSSC, GISS Step 0, ISH, GSOD, and USHCN (raw, time-of-observation adjusted, and F52 fully adjusted). Most reconstructions to date have chosen to focus on raw datasets, and all give similar results.

[Fig 2]

It’s worth noting that most of these datasets have some overlap. GHCN and WMSSC both include many (but not all) of the same stations. GISS Step 0 includes all GHCN stations in addition to USHCN stations and a selection of stations from Antarctica. ISH and GSOD have quite a bit of overlap, and include hourly/daily data from a number of GHCN stations (though they have many, many more station records than GHCN in the last 30 years).

2. Choosing a station combination method and a normalization method.

GHCN in particular contains a number of duplicate records (dups) and multiple station records (imods) associated with a single wmo_id. Records can be combined at a single location and/or grid cell and converted into anomalies through the Reference Station Method (RSM), the Common Anomalies Method (CAM), the First Differences Method (FDM), or the Least Squares Method (LSM) developed by Tamino and Roman M. Depending on the method chosen, you may be able to use more stations with short records, or end up discarding station records that do not have coverage in a chosen baseline period. Different reconstructions have mainly made use of CAM (Zeke, Mosher, NCDC) or LSM (Chad, Jeff Id/Roman M, Nick Stokes, Tamino). The choice between the two does not appear to have a significant effect on results, though more work could be done using the same model and varying only the combination method.
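As a concrete illustration, the station-combination step can be sketched in a few lines. This is a toy version of the Common Anomalies Method using annual rather than monthly data and made-up station records, not the code any of the reconstructions above actually use:

```python
# Toy sketch of the Common Anomalies Method (CAM): each station is converted
# to anomalies against its own mean over a shared baseline period, and the
# anomalies are then averaged. Stations without baseline coverage are
# discarded, which is the trade-off CAM makes relative to LSM.

def common_anomalies(series, baseline):
    """series: {station: {year: temp}}; baseline: (start, end), inclusive.
    Returns {year: mean anomaly across stations}."""
    anomalies = {}
    for station, temps in series.items():
        base = [t for y, t in temps.items() if baseline[0] <= y <= baseline[1]]
        if not base:  # no coverage in the baseline period: drop the station
            continue
        ref = sum(base) / len(base)
        for y, t in temps.items():
            anomalies.setdefault(y, []).append(t - ref)
    return {y: sum(v) / len(v) for y, v in anomalies.items()}
```

Note how two stations with very different absolute temperatures but the same year-to-year changes contribute identical anomalies, which is the point of anomalizing before averaging.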

[Fig 3]

3. Choosing an anomaly period.

The choice of the anomaly period is particularly important for reconstructions using CAM, as it determines the number of usable records. An anomaly period that is too short can also produce odd behavior in the anomalies, but in general the choice makes little difference to the results. In the figure that follows, Mosher shows the difference between picking an anomaly period like CRU does (1961-1990) and picking the 30-year period that maximizes the number of monthly reports, which turns out to be 1953-1982 (Mosher). No other 30-year period in GHCN has more station reports. This refinement, however, has no appreciable impact.
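The baseline search Mosher describes can be sketched as a sliding-window count. The `reports` input here (a mapping of year to station-month counts) is an assumed, illustrative structure, not the actual GHCN inventory format:

```python
# Hypothetical sketch: slide a 30-year window across the record and keep the
# window containing the most monthly reports. `reports` maps year -> number
# of station-month values available in that year.

def best_baseline(reports, length=30):
    best, best_count = None, -1
    for start in sorted(reports):
        count = sum(reports.get(y, 0) for y in range(start, start + length))
        if count > best_count:
            best, best_count = (start, start + length - 1), count
    return best, best_count
```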

[Fig 4]

4. Gridding methods.

Most global reconstructions use 5×5 grid cells to ensure good spatial coverage of the globe. GISTemp uses a rather different method of equal-area grid cells. However, the choice between the two methods does not seem to make a large difference, as GISTemp’s land record can be reasonably well replicated using 5×5 grid cells. Smaller grid cells can improve regional anomalies, but will often introduce spatial bias into the results, as there will be large missing areas during periods or in locations where station coverage is limited. For the most part, the choice is not that important, unless you choose extremely large or small grid cells. In the figure that follows, Mosher shows that selecting a smaller grid does not impact the global average or the trend over time. In his implementation there is no averaging or extrapolation over missing grid cells: all the stations within a grid cell are averaged, and then the entire globe is averaged. Missing cells are not imputed with any values.
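The gridding step described above can be sketched as follows, assuming annual station anomalies as input and using the cosine of a cell’s center latitude as a stand-in for its area. As in Mosher’s implementation, empty cells are simply left out rather than imputed:

```python
import math

# Minimal 5x5-degree gridding sketch: average all station anomalies falling
# in a cell, then take the area-weighted (cos latitude) mean over occupied
# cells only.

def grid_average(stations, cell=5.0):
    """stations: list of (lat, lon, anomaly). Returns global mean anomaly."""
    cells = {}
    for lat, lon, anom in stations:
        key = (int(math.floor(lat / cell)), int(math.floor(lon / cell)))
        cells.setdefault(key, []).append(anom)
    num = den = 0.0
    for (i, _), anoms in cells.items():
        w = math.cos(math.radians(i * cell + cell / 2.0))  # ~ cell area
        num += w * (sum(anoms) / len(anoms))
        den += w
    return num / den
```

This also shows why gridding matters for oversampling: two stations crowded into one cell count no more toward the global mean than a single station in the next cell over.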

[Fig 5]

5. Using a land mask.

Some reconstructions (Chad, Mosh, Zeke, NCDC) use a land mask to weight each grid cell by its respective land area. The land mask determines how much of a given cell (say, 5×5) is actually land; a cell on a coast could have only a portion of land in it, and the land mask corrects for this. The percent of land in a cell is constructed from a 1 km by 1 km dataset. The net effect of land masking is to increase the trend, especially in the last decade. This factor is the main reason why recent reconstructions by Jeff Id/Roman M and Nick Stokes are a bit lower than those by Chad, Mosh, and Zeke.
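A sketch of the land-mask weighting: each cell’s weight becomes its spherical area (approximated by the cosine of its center latitude) times the fraction of the cell that is land. The land fractions below are invented for illustration; the real ones come from the 1 km by 1 km dataset mentioned above:

```python
import math

# Land-masked average: coastal cells that are mostly ocean get proportionally
# less weight in the global land mean than fully-land cells.

def masked_average(cell_anomalies, land_fraction):
    """Both arguments map (lat_center, lon_center) -> value."""
    num = den = 0.0
    for (lat, lon), anom in cell_anomalies.items():
        w = math.cos(math.radians(lat)) * land_fraction.get((lat, lon), 0.0)
        num += w * anom
        den += w
    return num / den
```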

[Fig 6]

6. Zonal weighting.

Some reconstructions (GISTemp, CRUTEM) do not simply calculate the land anomaly as the size-weighted average of all covered grid cells. Rather, they calculate anomalies for different regions of the globe (each hemisphere for CRUTEM; 90°N to 23.6°N, 23.6°N to 23.6°S, and 23.6°S to 90°S for GISTemp) and create a global land temp as the weighted average of each zone (weightings of 0.3, 0.4, and 0.3, respectively, for GISTemp; 0.68 × NH + 0.32 × SH for CRUTEM). In both cases, this zonal weighting results in a lower land temp record, as it gives a larger weight to the slower-warming Southern Hemisphere.
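Using the weights quoted above, the two zonal combinations reduce to one-liners (the zone anomalies passed in are illustrative numbers, not real data):

```python
# GISTemp-style three-band combination (90N-23.6N, 23.6N-23.6S, 23.6S-90S)
def gistemp_zonal(north, tropics, south):
    return 0.3 * north + 0.4 * tropics + 0.3 * south

# CRUTEM-style hemispheric combination
def crutem_zonal(nh, sh):
    return 0.68 * nh + 0.32 * sh
```

If the Southern Hemisphere zones run cooler than the station-weighted mean would suggest, these fixed weights pull the global land figure down, which is the effect described above.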

[Fig 7]

These steps will get you a reasonably good global land record. For more technical details, look at any of the many different models that have been publicly released:

http://noconsensus.wordpress.com/2010/03/25/thermal-hammer-part-deux/
http://residualanalysis.blogspot.com/2010/03/ghcn-processor-11.html
http://rankexploits.com/musings/2010/a-simple-model-for-spatially-weighted-temp-analysis/
http://drop.io/treesfortheforest
http://moyhu.blogspot.com/2010/04/v14-with-maps-conjugate-gradients.html

7. Adding in ocean temperatures.

The major decisions involved in turning a land reconstruction into a land/ocean reconstruction are choosing an SST series (HadSST2, HadISST/Reynolds, and ERSST have been explored so far: http://rankexploits.com/musings/2010/replication/), gridding and anomalizing the series chosen, and creating a combined land/ocean temp record as a weighted combination of the two. This is generally done by: global temp = 0.708 × ocean temp + 0.292 × land temp.
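The blend above is just an area-weighted sum of the two components, with oceans taking roughly 70.8% of the Earth’s surface:

```python
# global temp = 0.708 * ocean + 0.292 * land, per the weighting quoted above.
def land_ocean(land_anom, ocean_anom):
    return 0.708 * ocean_anom + 0.292 * land_anom
```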

[Fig 8]

8. Interpolation.

Most reconstructions only cover 5×5 grid cells with one or more stations for any given month. This means that any areas without station coverage in a given month are implicitly assumed to have the global mean anomaly. This is arguably problematic, as high-latitude regions tend to have the poorest coverage and are generally warming faster than the global average.

GISTemp takes a somewhat different approach, assigning a temperature anomaly to all missing grid boxes located within 1200 km of one or more stations that do have defined temperature anomalies. They rationalize this based on the fact that “temperature anomaly patterns tend to be large scale, especially at middle and high latitudes.” Because GISTemp excludes SST readings from areas with sea ice cover, this leads to the extrapolation of land anomalies to ocean areas, particularly in the Arctic. The net effect of interpolation on the resulting GISTemp record is small but not insignificant, particularly in recent years. Indeed, interpolation is the main reason why GISTemp shows somewhat different trends from HadCRUT and NCDC over the past decade.
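The idea of infilling within 1200 km can be sketched with a great-circle distance and a weight that falls linearly to zero at the radius. This is not the actual GISTemp algorithm (which works on its own equal-area grid); it only illustrates the kind of distance-weighted extrapolation being described:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points (haversine formula)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    a = math.sin((p2 - p1) / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def infill(box_lat, box_lon, stations, radius=1200.0):
    """Distance-weighted anomaly for an empty box, or None if no station
    lies within `radius` km. stations: list of (lat, lon, anomaly)."""
    num = den = 0.0
    for lat, lon, anom in stations:
        d = haversine_km(box_lat, box_lon, lat, lon)
        if d < radius:
            w = 1.0 - d / radius  # weight decays linearly to zero at the radius
            num += w * anom
            den += w
    return num / den if den > 0 else None
```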

[Fig 9]

9. Conclusion

As noted above, there are many questions about the calculation of a global temperature index. However, some of those questions can be fairly answered, and have been fairly answered, by a variety of experienced citizen researchers from all sides of the debate. The approaches used by GISS, CRU, and NCDC do not bias the result in any way that would erase the warming we have seen since 1880. To be sure, there are minor differences that depend upon the exact choices one makes: choices of ocean datasets, land datasets, rules for including stations, rules for gridding, area-weighting approaches. But all of these differences are minor when compared to the warming we see.

That suggests a turn in the discussion to the matters which have not been as thoroughly investigated by independent citizen researchers on all sides: data adjustments, metadata accuracy, and finally UHI. Now, however, the community on all sides of the debate has a set of tools to address these questions.



194 Responses to Calculating global temperature

  1. Alice says:

    Much ado about nothing, clearly. A change of 1 degree C over 130 years is of negligible importance — particularly when the underlying mechanisms of the change remain uncertain and largely unknown.

    The Earth has seen much more radical climate changes in the past, long before humans made the scene. All the hoopla over these global climate metrics approaches mental masturbatory status.

  2. Bob Tisdale says:

    And for those interested, I recently completed a detailed overview of the different Sea Surface Temperature datasets. Refer to:
    http://bobtisdale.blogspot.com/2010/07/overview-of-sea-surface-temperature.html

  3. Anthony,

    Could I suggest hyperlinking all the URLs to the word that immediately precedes them? It might make the text a bit more readable.

    REPLY: You are quite welcome. If either of you want to revise your manuscript, I’ll be happy to repost it. I posted it as it was provided. – Anthony

    Reply 2: I just did the linking for Mosh. Maybe I can get him to clean the kitchen in return. ~ ctm

  4. Varco says:

    Excellent post, entirely in character for this fine blog.

  5. See - owe to Rich says:

    Well, Mosh, it was nailbiting reading through that – I kept wondering when you would mention UHI. Finally you did. But how easy is that to correct for? It must be difficult, otherwise HadCRUT3 would have done it ;-)

    Rich.

  6. Rich,

    UHI is a tough one, simply because it can depend so much on micro-site effects that are difficult to quantify.

    There has been a fair bit of work on trying to use macro-site characteristics (through various urbanity proxies like GRUMP and satellite nightlights), e.g.:
    http://rankexploits.com/musings/2010/uhi-in-the-u-s-a/
    http://rankexploits.com/musings/2010/in-search-of-the-uhi-signal/

    That said, there is plenty of work still to be done. Now that everyone and their mother has created an anomaly and gridding tool, various bloggers can do their own analysis using existing metadata, CRN rating, etc.

  7. latitude says:

    I guess Jones was right after all. At least they all show no statistical warming for the past 15 years.

    But 1 degree in over 100 years?
    How much of that 1 degree rise is from adjusting the earlier temps down?

  8. jmc says:

    All graphs show an upward trend since 1970, but see figures 11 to 20 of “The Cause of Global Warming” by Vincent Gray, January 2001 here. The conclusion is:
    “Global temperature measurements remote from human habitation and activity show no evidence of a warming during the last century. … The small average and highly irregular individual warming displayed by surface measurements is therefore caused by changes in the thermal environment of individual measurement stations over long periods of time, and not by changes in the background climate.”

  9. R. Gates says:

    Superb article…exactly why WUWT is #1.

    Thanks!

  10. Steven Mosher says:

    Thank you Charles.

  11. David A. Evans says:

    How RAW is RAW?

    DaveE.

  12. rbateman says:

    I would like a link to just one of those plotted dataset files. The yearly mean temp, or the anomaly data plus the mean temp it is based upon.
    Something that Leif told me about regarding a standard in plotting data, to keep things in perspective.

  13. carrot eater says:

    Nicely done, gentlemen. This is a good summary, and should serve as a useful reference.

    Two things: it is not entirely clear from your language, but I’m pretty sure that GISS always interpolates to the center of the box, regardless of whether the box has stations or not. Of course, when the (sub)box does have stations of its own, then anything being interpolated from afar will have little influence.

    Second, it’s worth emphasizing again that most (looking over it, probably all) of your graphs presented here use unadjusted land station data. Quite simply, global warming is not an artifact of adjustments. This appears to be continually surprising to some people.

  14. EthicallyCivil says:

    The starting point for all of these analyses seems to be the “we’ve cooled the 40′s for your protection” post-Hansen baseline, the one that lowered past temperatures based on the need for homogeneity between urbanized and non-urbanized sites. Remember the one with the logic — “gee we know there is no UHI, thus if rural sites aren’t showing the same temperature growth as urban sites they must have been reporting too hot in the past” — that way the temperature growth is homogenized (and as such a UHI signal becomes *very* hard to find, as only a residual remains).

    Or am I wrong?

  15. George E. Smith says:

    So what happened to the story in the headline:- “”” Calculating global temperature
    Posted on July 13, 2010 by Steven Mosher “””

    So finally I thought we were going to find out how to measure global Temperature; especially with that impressive, but rather limited range, NASA AIRS picture.

    Too bad their instrument only goes from -81 to + 47 deg C. Why not cover the full +/- 90 deg C range that can cover from Vostok lows; to asphalt surface highs.

    But then all hell breaks loose, and apparently they lose sight of the headline; and all we find is graphs of anomalies, not real Temperatures.

    Yes, measuring the earth’s surface Temperature is simple; you put a thermometer in the center of each global grid cell, and you read them all simultaneously; then you multiply each reading by the cell area; add them all up and divide by the global surface area.
    Well I didn’t see anything like that happening here.
    Of course you have to choose the cell size properly so that you have at least one cell sample for each half cycle of the highest spatial frequency that occurs in the global temperature map.

    Looking at the daily weather map for the SF Bay area nine counties; it appears that you need a cell size no bigger than about 1 km on a side or else you will be undersampling.

    Well of course there probably haven’t been that many thermometers made since mercury was discovered.

    Then there’s that 1961 to 1990 base period for all those anomaly graphs; what is that all about? If they didn’t measure the correct global temperature during the base period, then of course the anomalies don’t have any real reference either.

    Why not admit, that ALL of these records; are simply the result of statistical manipulations on the output of a certain set of thermometers; and that any connection between the result of that computation and the actual temperature of the earth is without any scientific foundation; it’s a fiction.

    And it’s a meaningless fiction since there isn’t any scientific relationship between the temperature at any earth location and the local flux of energy into or out of planet earth. Each different type of terrain results in different thermal processes, and there is no common link between energy flow, and local Temperature.
    Energy flow is a consequence of Temperature differences between locations; it is not simply related to the atual Temperature at any place; or at any time.

    And anyone who thinks a 1200 km cell size is sufficient just doesn’t understand the problem.

  16. Steven Mosher says:

    See – owe to Rich says:
    July 13, 2010 at 2:26 pm (Edit)
    Well, Mosh, it was nailbiting reading through that – I kept wondering when you would mention UHI. Finally you did. But how easy is that to correct for? It must be difficult, otherwise HadCRUT3 would have done it ;-)

    Rich.

    **************

    before you even tackle that question you have to have an analysis method.

    When we look at individual sites we can see issues or questions. Theory tells us there should be a UHI signal, theoretically. The question is can you

    1. measure that signal
    2. correct for that.

    The work people have been doing all goes to the first question. The analysis method.
    In general there are two approaches that I can think of.

    A. Compare area averages. In this approach we would divide our samples in urban and rural. Then we would area average both samples and compare the two.
    Issues: metadata. How do we decide what is RURAL and what is URBAN? Second, urbanity happens over time. Third, UHI grows to a threshold. Fourth, do we have a good sample size? Fifth, how powerful is our signal detection approach, that is, how much error is in the method? The work that everybody has done here gives us tools to begin this approach. So people can play what-if games.

    B. paired station approach. for every rural station find an urban one close by and compare differences in trends over time. Something along the lines of peterson 2003.

    The point is you can’t even start the analysis without the tools. And before you use a tool you better test it. We now have a variety of tools, with more to come as well. I’m most excited about the work of Ron B, who is bringing some GIS expertise to the party, and he’s going to work in R, so I’ll throw him whatever support I can.

  17. BillyBob says:

    What kind of warming “have we seen”?

    Without the min/max we don’t know if:

    1) Both the min and max have gone up
    2) or the min has gone up and the max has not
    3) or the max has gone up and the min has not

    If you use the top 50 US state temperature records as a proxy, and 25 of the top 50 max records are in the 1930s, one could conclude that the max is lower than in the 1930s and the min has gone up.

    If all the warming is raised min’s caused by UHI and station dropout, we have zero to worry about.

    So, Zeke and Steve, is there a reliable record for min/max anywhere?

  18. tonyb says:

    Very good article, well written and logical in its format.

    I remain to be convinced of the merits of a global temperature comprised of thousands of stations that record one micro climate, then move or are subsumed by uhi and are consequently recording a completely different micro climate to the one they started off with. As for the idea that we have accurate global ocean temperature records dating back over a century-where are they supposed to have come from?

    Be that as it may, how do we explain the fact that CO2 is being blamed for rising temperatures since 1880 yet our instrumental records show temperatures have been increasing for 350 years?

    This is Co2 superimposed on CET showing linear regression
    http://c3headlines.typepad.com/.a/6a010536b58035970c0120a7c87805970b-pi

    These are other historic data sets with linear regression
    http://i47.tinypic.com/2zgt4ly.jpg
    http://i45.tinypic.com/125rs3m.jpg

    Other historic datasets from around the world are collected on my site here, together with weather observations, diaries and articles.
    http://climatereason.com/LittleIceAgeThermometers/

    The giss and cru records merely plugged into the end of a steadily rising trend established two hundred years previously. They did not record the start of the rising trend.

    tonyb

  19. George E. Smith says:

    On #4 Gridding Methods; just what the heck grid are we talking about ?

    I thought both Hadcrud, and GISStemp used data from some small number of thermometers spread around the world; so what the heck are these grid cells and what do 5 x 5 and 3 x 3 grids mean ?

  20. Hansen seems to think that a global temperature is not that easy …

    http://eureferendum.blogspot.com/2009/12/that-elusive-temperature.html

  21. Steven Mosher says:

    Alice:

    “Much ado about nothing, clearly. A change of 1 degree C over 130 years is of negligible importance — particularly when the underlying mechanisms of the change remain uncertain and largely unknown.

    The Earth has seen much more radical climate changes in the past, long before humans made the scene. All the hoopla over these global climate metrics approach mental masturbatory status.”

    1. We make no points about the importance of 1 degree. I see no factual basis to conclude that it is of no consequence.

    2. The question ( open question) is what does the future hold. I remain open minded about this.

    3. underlying mechanism. Well, the results are consistent with and confirm the theory of GHG warming, espoused BEFORE this data was collected. They don’t prove the theory; no theory is proven.

    4. radical past changes before humans? Heck, there was a time when earth wasn’t here. So I’m not sure what your point is.

    Our point was this: concerns about bias in the methods of GISS, CRU and NCDC can be put to rest. The big issue HAS ALWAYS been the adjustments and the metadata.

    That is the next topic for discussion, for serious discussion that is.

  22. John Trigge says:

    All of these reconstructions rely on data from dubious sources, as they start well before the (probably) unbiased satellite records.

    The Surface Stations project (thanks, Anthony) confirms garbage in, garbage out.

    Given the siting issues Anthony and others have PROVEN, how much credence can be placed on ANY of the temperature graphs?

    Until the World agrees on, finances and implements a global, open, transparent, long-term and unbiased temperature measurement system, all of these reconstructions will be able to be dismissed by one side whilst lauded by the other.

    AND NONE OF THEM PROVE THAT CO2 IS THE CAUSE

  23. John from CA says:

    Thanks — it’s great to see a comparison of approaches that generally produce the same result for a change. I was beginning to worry that we might have to pick a different computer model behind curtain #3 every day.

    Couple of questions:
    - why aren’t rogue measurements discarded as we would discard a looney response to a market research study?
    - the charts don’t indicate a degree of accuracy — do they need to?

  24. George E. Smith says:

    What is the reason for “Correcting” for UHI.

    Are not UHI regions of higher temperature than their rural surroundings ? And if they are then they surely must be contributing to higher temperature readings; so what is to correct; the temperature is going up because of UHI; and Mother Gaia takes all of that into consideration, when she decides what the earth’s temperature should be.

    So why are we doing it differently; is not correction simply making up false data ?

  25. DirkH says:

    Here’s a guy who did a very simple analysis of raw data who comes to the conclusion that there is no discernible trend:
    http://crapstats.wordpress.com/2010/01/21/global-warming-%e2%80%93-who-knows-we-all-care/
    He compensates the death of thermometers by using, for each year-to-year interval, only the thermometers that exist in both years – a so-called “constant set”.

    (NOTE: I added the image for DirkH, since it is interesting – Anthony)

    From Crapstats blog

  26. pat says:

    I have an idea. Let’s actually read the thermometer. Report what was read, and identify the location and environment. No more homogenization, models, adjustments, proxies, and tinkering with the past.

  27. George E. Smith,

    Gridding is the method used to spatially weight temperature measurements. For example, if you have 1000 stations in the U.S., and 1000 stations in the rest of the world, simply averaging the anomalies from all the stations would result in the U.S. temperatures having the same weight in the resulting global anomaly as the rest of the world combined. To avoid biasing oversampled areas, reconstructions generally assign each station to a 5×5 lat/lon grid cell, calculate the anomalies of the entire grid cell as an average of the anomalies of all stations contained therein (using a common anomaly period), and calculate the global anomaly as the area-weighted average of all covered grid cells for a given month.

    I go into a bit more detail about the specific methods in my post at Lucia’s place, but Jeff Id, Mosh, myself, NCDC, HadCRUT, etc. all use this same method (GISTemp uses something slightly different, with equal-sized grids instead of 5×5 cells).

    http://rankexploits.com/musings/2010/a-simple-model-for-spatially-weighted-temp-analysis/

  28. DirkH says:

    DirkH says:
    “He compensates the death of thermometers [...]”

    I’m referring to E.M. Smith’s famous phrase that describes the population decline of thermometers worldwide from 6000 to 1500, of course. But the constant set method also makes sure you eliminate artefacts from stations that stop reporting for a while to restart later.

  29. crosspatch says:

    But how easy is that to correct for? (meaning UHI)

    I would say “exceedingly difficult” because it impacts different stations in different ways and there is no one-size-fits-all algorithm to use for every single station. You can’t have simply a binary “urban/rural” rule because much depends on rate of urbanization. Even the predominant wind direction can play a role. Agricultural uses can change temperatures, too. An area that had been desert suddenly experiencing irrigation might show a higher average temperature with slightly cooler days but much warmer nights.

    If someone were to task me with making a global land surface temperature estimate, I would probably begin with either establishing new or selecting existing stations that are quite far from developed areas, just the opposite of the approach we seem to have from the Big Three where remote stations have been dropped over the years in favor of stations in more populated areas.

  30. Steven Mosher says:

    EthicallyCivil

    The data I used was “uncorrected” GHCN. This analysis does NOT address the issue of UHI. It addresses the TECHNICAL and MATHEMATICAL issues of how you select, combine, average and weight your input data. PERIOD.

    Questions about UHI corruption in the data are not addressed.

    Hansen has nothing to do with this approach.

    1. I use different (but there is overlap) data.
    2. I don’t use “adjusted data.”
    3. I don’t extrapolate as Hansen does.
    4. The weight of a land grid is calculated based on the actual percent of land in that grid. The mask is for lakes and sea, built from a 1 km by 1 km global dataset.
    5. Hansen combines stations; I use a CAM approach, like CRU.
    6. My baseline period is not cherry picked. I pick the period with the most stations reporting. That turns out to be 1953-82. That turns out not to matter.
    7. I use an entirely different method for combining duplicates in GHCN than Hansen does.

  31. Nick Stokes says:

    George E Smith

    “then there’s that 1961 to 1990 base period for all those anomaly graphs; what is that all about. If they didn’t measure the correct global temperature during the base period; then of course the anomalies don’t have any real reference either.”

    It’s an important fact about anomalies that they aren’t calculated relative to a global temperature – each is calculated relative to the local mean over the period. They just use the same period. But the LSM methods that are mentioned don’t use a base period at all. The LS method calculates the correct offset. That’s one of the points of the article – both ways give the same result.

  32. carrot eater says:

    Mosh

    “I use an entirely different method for combining duplicates in GHCN than Hansen does.”

    Had to chuckle at that one. For all the methodology choices that turn out not to matter much, that’s got to be one of the least consequential. No?

    Zeke,

    I like how Broberg came to see why you have to use anomalies, in this sort of exercise.
    http://rhinohide.wordpress.com/2010/07/10/trb-0-01-ghcn-simple-mean-average/#comment-602

  33. DirkH says:

    Steven Mosher says:
    July 13, 2010 at 3:43 pm
    “[...]6. My baseline period is not cherry picked. I pick the period with the most
    stations reporting. That turns out to be 1953-82. [....]”

    And right after 82, a steep temp rise (and declining thermometer population).
    Not accusing anyone of anything, just saying.

  34. John from CA says:

    Thanks Zeke and Steven – great article!

  35. Malcolm Miller says:

    This seems to me all a terrible waste of time and effort. We don’t know how to measure the surface temperature of the whole planet (always changing with night, day, weather, seasons, etc) from here. Maybe it could be measured from space. But that would give us no information about what it was in the past, or how it might have changed in 100 or 200 or 500 years (all very tiny intervals in terms of geological time!). So what is the ‘temperature of the planet’ and where do you put the thermometer? It seems to me that the present records and readings are so suspect and so inaccurate (those Stevenson screens!) that they are useless and don’t represent valid data.

  36. Tommy says:

    What if one were to take the topographical data of each cell, and graph the amount of land at various averaged altitudes, and compare the # of stations at those altitudes. Would certain altitudes be over-represented in the cell’s average, and would it matter?

    Just thinking out loud…

  37. Ron Broberg says:

    Dirk H: Here’s a guy who did a very simple analysis of raw data who comes to the conclusion that there is no discernible trend:

    That guy freely admits that he did no geographic weighting. GHCN has a high percentage of US stations – and a low percentage of Southern Hemisphere stations. By taking simple averages of the data, you don’t get an ‘even’ input from each region of the world. So his method is flawed if used as a “global” analysis.

    You might enjoy watching how I am adding geographic information into my “global gridded anomaly” code.
    http://rhinohide.wordpress.com/category/trb/
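    The geographic weighting Ron describes can be sketched as a grid-cell average with a cosine-latitude area weight. This is an illustrative sketch under assumed data shapes, not his actual code:

```python
import math

# Sketch of area weighting: average stations within 5x5 degree cells,
# then weight each occupied cell by cos(latitude), so regions with
# dense station coverage (e.g. the US) don't dominate the global mean.
def gridded_mean(stations, cell=5.0):
    """stations: list of (lat, lon, anomaly)"""
    cells = {}
    for lat, lon, val in stations:
        key = (math.floor(lat / cell), math.floor(lon / cell))
        cells.setdefault(key, []).append(val)
    num = den = 0.0
    for (i, _), vals in cells.items():
        mid_lat = (i + 0.5) * cell               # cell-centre latitude
        w = math.cos(math.radians(mid_lat))      # area weight
        num += w * (sum(vals) / len(vals))       # one vote per cell
        den += w
    return num / den
```

    With two stations in one cell and one in another, the lone station gets as much say as the pair, which is exactly the "even input from each region" a simple average lacks.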

  38. sky says:

    It should come as no surprise that, if all the station data commonly shared by all is processed by somewhat different procedures, then the end-products will differ only slightly. But the crucial question is what the end-product tells us about temperature variations beyond the immediate–and usually changing–environment at the station sites. In other words, are the “global” time-series produced truly indicative of the Earth’s entire continental surface?

    Lack of any credible century-long station-data from much of equatorial Africa and several other major regions, as well as the propensity for virtually all station sites to be affected by human activity, leaves that question much unsettled. And the whole enterprise of stitching together continuous anomaly-series from station records that are spatially inhomogeneous and temporally discontinuous needs to be examined far more critically. If we wish to discern climate variations over a century measured in tenths of degrees, the clerical approach of indiscriminately using ALL the station data will not provide a reliable answer. Long experience with world-wide station records and Surface Marine Observations made by ships of opportunity strongly argues for an uncertainty level that is as large as the climate “signal.” Only the most thoroughly vetted station records tell us anything meaningful about that signal. The rest should be ignored.

  39. Ron Broberg says:

    sky: Only the most thoroughly vetted station records tell us anything meaningful about that signal.

    While this selection is not ‘thoroughly vetted’, Nick Stokes took a stab at a global analysis using just GHCN stations that were current, at least 90 years long, and flagged as ‘rural’.

    Stokes: Just 60 Stations?
    http://moyhu.blogspot.com/2010/05/just-60-stations.html

    REPLY: Yes he did, and one of the most interesting things Chiefio found was that of the stations in GHCN that are tagged “rural”, 100 of them are actually at small rural airports. Between ASOS issues and issues like we’ve found at Carefree Skypark (a small “rural” airport at one time, more on this station tomorrow) it does not instill a lot of confidence in the data quality. – Anthony

  40. 1DandyTroll says:

    I think it’s all bull crap when different methods result in different results.

    Use the simplest method et voila, you get the proper result of what your equipment (which usually turns out to be reality) is showing you. The only thing then is the context.

    People who screw around with the data, trying to process it and mold it (into better data?) to fit their belief of what reality is supposed to be, are just doing that. The data only needs to change if your equipment is faulty; otherwise it’s supposed to function perfectly (and usually that change of data consists of exchanging the faulty equipment for a functional apparatus to get proper data).

    The data is the data; what might need processing, adjusting, molding even, is the context to explain the data. It’s not exactly the temperature data between 1900 and 2000 from New York, Paris, or Rome that needed processing and adjusting, but the context that needed updating, i.e. the explanation of the heat island effect with perfectly natural population growth. Unnatural growth would mayhap have been just the building of one skyscraper to house everybody, oh and of course painted white like the Chilean mountains (actually it would make sense to paint risers white, but not mountains, since that’d constitute anthropogenic climate change, well locally anyway, for the “local people”, and critters and whatnot).

    But of course if the alarmist believers put everything into a proper context, would they be alarmist believers then?

  41. Bill DiPuccio says:

    This is a well written overview of the problem by Steve Mosher!

    However, the iconic status of global average surface temperature and its actual utility to the climate debate remains dubious. I think this point should continue to be pressed. Ocean heat content (though not without problems) is a much better metric.

    Consider: Vast regions can undergo significant climate changes with little net effect on global average temperature. If region A increases 2 degrees, but region B decreases 2 degrees, the net difference is zero degrees.

    Moreover, changes in humidity (latent heat) do not register either. But if we must have a global average temp, a better metric would be moist enthalpy, viz. equivalent potential temp (theta-e). This would bring us closer to measuring the actual heat present in the planetary boundary layer.
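    The moist-enthalpy idea can be sketched with the standard first-order formula h = cp*T + Lv*q. The constants below are approximate and the function is an illustration only (theta-e itself involves more than this):

```python
# Rough sketch of moist enthalpy: heat content per kg of air includes
# both sensible heat (cp*T) and latent heat of the water vapour (Lv*q).
CP = 1005.0   # specific heat of dry air, J/(kg K), approximate
LV = 2.5e6    # latent heat of vaporization, J/kg, approximate

def moist_enthalpy(temp_k, q):
    """temp_k: air temperature in kelvin; q: specific humidity, kg/kg."""
    return CP * temp_k + LV * q
```

    This makes the commenter's point concrete: two air masses at the same dry-bulb temperature but different humidity carry different amounts of heat, so a temperature-only average misses part of the energy story.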

  42. carrot eater says:

    I can’t make head or tail of what was actually done in Dirk’s link.

    The ending implies strongly that the analysis only covers one country, perhaps New Zealand. In which case, I don’t know where the ‘GISS’ series came from, but that’s another matter.

    It’s also very unclear how the station combining was done.

    But if people want to see a constant-station sample, those have been done as well. Zeke’s done it; I’m sure he can find the link to his own graph somewhere.

  43. RomanM says:

    CE

    Mosh

    “I use an entirely different method for combining duplicates in GHCN than Hansen does.”

    Had to chuckle at that one. For all the methodology choices that turn out not to matter much, that’s got to be one of the least consequential. No?

    Have you bothered to look at the data itself at all? A preliminary glance indicates that the quality control has been somewhat lax. In some cases, the data from different versions of the same station is not even close either in value or in pattern. Simple averaging doesn’t cut it.

    Also, despite your constant nattering about the “need” for anomalies, this is not really the case when there are overlaps in station records. The LSM can handle that without resorting to referencing fixed periods of time (which can actually introduce seasonal distortion at other times in the record). Anomalising for comparison purposes can be done on the final result with greater accuracy.
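    A much-simplified illustration of the offset idea behind the LSM: RomanM's actual method solves a full least-squares system across many stations, but for two overlapping series the least-squares offset reduces to the mean difference over the overlap, with no fixed base period involved. The data layout below is hypothetical:

```python
# Two-series sketch of offset matching (not RomanM's full LSM):
# estimate the level difference from the overlap, align, then average.
def combine_two(a, b):
    """a, b: {time: temp}. Returns a combined series on a's level."""
    overlap = a.keys() & b.keys()
    if not overlap:
        raise ValueError("no overlap; offset is undetermined")
    # For two series, the least-squares offset is the mean difference
    delta = sum(a[t] - b[t] for t in overlap) / len(overlap)
    out = {}
    for t in a.keys() | b.keys():
        vals = [a[t]] if t in a else []
        if t in b:
            vals.append(b[t] + delta)   # shift b onto a's level
        out[t] = sum(vals) / len(vals)
    return out
```

    Note the result stays in actual degrees on the first series' level; any anomalising for presentation can be done afterwards, which is the point RomanM is making.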

  44. B. Kindseth says:

    If you were to start from scratch to measure average global temperature, how would you do it? As an engineer, I would identify different environment types, such as urban, suburban, woodland, farmland, desert, marsh, lake, etc. For each grid we would need to determine what percentage consists of these environment types, place temperature recording instrumentation in each area, and calculate a weighted average for each grid. Not having done this initially, the next best thing is to put out instrumentation now and correlate it with existing instrumentation. Historical land use patterns could then be used to adjust historical temperature data. Statistically adjusted data, even though the methodology has passed peer (i.e. pal) review, does not pass the smell test.

  45. cicero says:

    It would seem that once the UHI effect is removed from the temperature record, solar activity correlates pretty well with unbiased surface temperatures.

    The paper, ‘Can solar variability explain global warming since 1970?’ (Solanki and Krivova 2003, http://www.mps.mpg.de/homes/natalie/PAPERS/warming.pdf) concluded that, “The solar indicators correlate well with the temperature record prior to 1970…” but that “the Sun has contributed less than 30% of the global warming since 1970”. The authors decided that the difference has to be due to human-induced warming, but there seems to be another obvious possibility…

    Their conclusion was based on the surface temp dataset they obtained from CRU, which contains the UHI bias. I looked at just a dozen randomly chosen rural station plots throughout North America from the GISS site (checking against surfacestations.org – which is a great site, Anthony! – and Google Earth to be sure there were no nearby heat sources) and could see what appears to be a good correlation between these graphs and the plotted solar activity post-1970 from the Solanki paper.

  46. AMac says:

    sky (July 13, 2010 at 4:02 pm) –

    > But the crucial question is … are the “global” time-series produced truly indicative of the Earth’s entire continental surface?

    The great thing about this work is that citizen-scientists from across the Great Divide have taken the time to tackle a tool-building question. If that methods question can be answered, then citizen-scientists can use these methods to look at more interesting problems.

    One possibility was that models built by folks subscribing to AGW would show a lot of warming, while models built by those skeptical of AGW would show modest or no warming.

    Then we’d say that (1) at least some of the models appear to be biased and unreliable, or (2) none of the models are any good.

    But that didn’t happen. E.g. skeptic Jeff Id’s method produced a warming trend that’s on the high side, slightly. “That’s what it is,” he said.

    With added confidence that the models are reproducible and not biased by the preconceptions of the modelers, it should be possible to go forward to the contentious issues, like the ones you raise. Scientific progress!

  47. carrot eater says:

    RomanM

    Yet in many cases, the duplicates are pretty much the same where they overlap.

    And what ‘nattering’? When I say ‘anomaly’, I include what the RSM and LSM do. The moment you use offsets, you have relative temperatures, not absolutes. The terminology ‘anomaly’ is certainly not limited to the CAM. The CAM just introduces a fixed baseline at the time of combining, that’s all.

  48. carrot eater says:

    AMac,
    To avoid confusion, it might be better to not refer to the above work as models.

  49. Mac the Knife says:

    Zeke Hausfather and Steven Mosher:
    A most excellent post! Thank you very much!!!

    Anthony and Charles The Moderator:
    You guys Rock! Rock On, Numero Uno!!!!!!

    Carrot eater:
    There is no serious dispute that the planet has been warming in fits and starts since the last major ice age and, closer in time series, since the Little Ice Age. A most reasonable dispute ensues when claims are made that man made emissions of CO2 are direct causation for any of the extended period of global warming.

    Nature plays enough ‘tricks’ to keep us all guessing at correlations, without dubious statistical manipulations such as ‘Mike’s Nature trick’, as in “I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (ie from 1981 onwards) amd (sic) from 1961 for Keith’s to hide the decline”. When faced with potential malfeasance by rabid advocates of the man-made global warming hypothesis, it should come as no surprise that reasoning minds conclude some reported global warming is indeed “an artifact of adjustments”.

    Now… where is the link to that catchy tune “Hide the Decline!”?

  50. Geoff Sherrington says:

    Hitherto, it had been my understanding that the base period was selected over a span of continuous years then set to zero.

    Illustration http://www.geoffstuff.com/global-temperature-anomalies-1800-2006.png
    Source http://www.globalissues.org/article/233/climate-change-and-global-warming-introduction
    which is a version of GISS land + sea in a 2009 report. (Chosen at random, not because of any attribute).

    Now I realise that my assumption of setting to zero might be wrong. Although the material quoted here eyeballs to zero temp from 1951-1980, none of the graphs in the article here on WUWT seem to average to zero over the base period. Please, can someone clarify the procedure? In an ideal world, the “anomaly” (a method that I dislike personally) should be comparable as one moves between grid cells, hemispheres, land/sea, etc.

    It seems that stations are being adjusted, dropped out and perhaps added to the base period from time to time. Thus, if correct, the “anomaly” is a bit ephemeral, changing in K temperature each time the base period is altered.

    So this is a call to stay closer to real physics, by using at least degrees C actual. Graphics can be used to cut out the dead area at the base of the graph in hot places.

    Addendum: Is each author above aware of the probability that reconstructions as shown are based on country information that is, for some countries, already adjusted?
    For example, I do not think I have ever seen a graph of Australia land temperatures that is based on raw data as collected by the observers. Also, the Australian record is undergoing continuous revision as more metadata are being extended back in time.

    In short, I think that you are not dealing with a stationary historical data set.

  51. RomanM says:

    CE

    When I say ‘anomaly’, I include what the RSM and LSM do. The moment you use offsets, you have relative temperatures, not absolutes.

    Then you don’t understand these other methods very well.

    Anomalies in the usual sense calculate a temperature record relative to itself during a specific time period. The results you end up with are relative values.

    On the other hand, in LSM, no temperature series is ever compared to itself and at the end of the process you end up with a non-anomalised result.

    The two approaches really do not have much in common either in technique or in final result form.

  52. Charles Higley says:

    “pat says:
    July 13, 2010 at 3:33 pm
    I have an idea. Let’s actually read the thermometer. Report what was read, and identify the location and environment. No more homogenization, models, adjustments, proxies, and tinkering with the past.”

    I totally agree. When you look at the raw data from well-sited rural thermometers just about anywhere and do not find the warming these complicated treatments appear to show, then maybe there is something deeply flawed in the concept of a global temperature. We do not have the coverage and info to do this when we have to extrapolate so extensively. And the adjustments for UHI are usually wrongly signed, queering the sites.

    When 5 rural stations in the US show an average in which the 1930s were the warmest and the 1990s not even equal to 1953, while the bigger averages show the 90s to be warmer, obviously there must be a significant flaw. Too many local, raw, rural sites seriously disagree.

  53. DR says:

    So once again the accuracy and precision of the data is still not addressed.

    Sorry, I still fail to see the significance of reproducing the same results over and over without investigating the quality of measurement at each individual station, and also the inclusion of land use change (which alters the climate over time, i.e. the boundary layer) and UHI. It would seem those are the most important factors that need to be ironed out.

  54. Dave McK says:

    I enjoyed the article – the overview was interesting, if arcane.
    You guys make enough stats to start shipping bubble gum cards of the major players…lol.
    Do you suppose they’ll be worth something one day?

  55. Bill Illis says:

    Thanks everyone who participated in producing these reconstructions because it clearly took a lot of time and effort.

    I guess we can conclude then that the methodology of constructing a properly-weighted land temperature record does not vary much between the various choices of reasonable methodologies.

    Land temperatures in the GHCN dataset have increased about 0.9C and the Land/Ocean temperature series has increased about 0.55C since 1900.

    There are some other adjustments done that have GISTemp at +0.70C (Land) and +0.64C (Land and Ocean) since 1900 and Hadcrut3 at +0.966C (Land) and +0.703C (Land and Ocean) since 1900.

    This then leaves a number of questions remaining:

    1. How Raw is the Raw data in the GHCN dataset.
    2. What adjustments are done to have GISTemp and Hadcrut3 higher (and lower) than the reconstruction numbers.
    3. Can we pick a better series of high quality rural stations that have consistently reported over the whole period so we can avoid UHI and station-selection biases (continuing to use rising stations and discarding declining stations) in the GHCN dataset.

  56. George E. Smith says:

    “”” Nick Stokes says:
    July 13, 2010 at 3:45 pm
    George E Smith

    “then there’s that 1961 to 1990 base period for all those anomaly graphs; what is that all about. If they didn’t measure the correct global temperature during the base period; then of course the anomalies don’t have any real reference either.”

    It’s an important fact about anomalies that they aren’t calculated relative to a global temperature – each is calculated relative to the local mean over the period. They just use the same period. But the LSM methods that are mentioned don’t use a base period at all. The LS method calculates the correct offset. That’s one of the points of the article – both ways give the same result. “””

    Nick I don’t disagree with anything you said. But the Title of this Article was “Calculating the Global Temperature.” When in fact the methodology has nothing whatsoever to do with the global temperature. It is simply the result of manipulations of some given and quite arbitrary set of thermometers somewhere; compared against themselves each to each; and all of that can take place even if the planet earth was nowhere to be found.

    The point is that NOWHERE in this process can the result be connected to the planet to “Calculate the Global Temperature”. It simply calculates the variations of some quite arbitrary set of thermometers from themselves.

    The min/max daily temperature reading fails the Nyquist sampling criterion, and the spatial distribution of the set of thermometers also fails the Nyquist test, by orders of magnitude; so there’s no way that any recovered average can be correct, because of the aliasing noise, and the result is meaningless anyway.

    As I have said many times GISStemp calculates GISStemp; and nothing else; same goes for HADcrud.

    And even if one did actually measure the true average Temperature of the globe; it is not related in any way to the energy flows; so it tells us nothing about the stability of earth’s energy balance.

  57. Larry T says:

    Again basically all of these land datasets show one thing –
    that with growth of airports, plane traffic, and UHI effects the temperature readings are higher.

  58. Bruce of Newcastle says:

    Concerning UHI there is Dr Spencer’s analysis which suggests it can be quite large even for quite low population density areas in the US.
    http://www.drroyspencer.com/2010/03/direct-evidence-that-most-u-s-warming-since-1973-could-be-spurious/

    However, the really interesting study to my mind is by Edward Long, who chose apparently good quality rural stations in 48 US states and found that the raw data showed a warming of only 0.11 C/century. A key to Anthony’s concern is that he eyeballed the station sites “using a GPS map application”, which probably means no airport UHI issues.

    Now how can CO2 avoid warming rural sites if it is the main cause of warming? At 0.11 C/century it would take a cool 1800 years or so to reach IPCC’s arbitrary 2 C limit.

    Edward Long’s study can be found here:
    http://icecap.us/index.php/go/joes-blog/a_pending_american_temperaturegate/

    The PDF report is linked at this site and has a listing of the station ID’s.

  59. rbateman says:

    If we plotted the yearly mean temp instead of the anomaly, set the bottom of the graph at zero, and set the top at 2x the mean, then we’d see how this Global Temp scare is making a mountain out of a molehill.

  60. carrot eater says:

    RomanM

    Don’t worry, I understand all the methods just fine. To me, an ‘anomaly’ is simply a relative temperature. That’s the key part. Whether it’s relative to its own average (CAM) or something else (as with the intermediate calculations in RSM, LSM) is not that important, when explaining to people the principle of why you use relatives, instead of just averaging together absolutes.

  61. carrot eater says:

    Mac the knife,

    You may think that is not under dispute, but look around. You will see lots of commenters and bloggers say or imply that global warming only really shows up at the advertised magnitude when GISS or NOAA or CRU add in their adjustments. I don’t know how widely held this confusion is, but it’s been out there.

  62. Luis Dias says:

    So someone can delete all of your posts lambasting GISSTEMP after all?

    If Not Then Goto FacePalmLand.

  63. Layne Blanchard says:

    I also question: How Raw is Raw? And I can’t see how vast ocean expanse temps with no records in existence can be determined from measurements on the irregular land masses. And then there is this:

    http://icecap.us/images/uploads/DrKeen2.jpg

    I’m fully aware this is only a region, yet it must be used to determine a larger area.

    And let’s not forget all references to this: (et al in the arctic, e.g. Lucy Skywalker’s post):

    http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=431042500000&data_set=1&num_neighbors=1

    and this:

    [Wibjorn Karlen] In attempts to reconstruct the temperature I find an increase from the early 1900s to ca 1935, a trend down until the mid 1970s and so another increase to about the same temperature level as in the late 1930s.

    A distinct warming to a temperature about 0.5 deg C above the level 1940 is reported in the IPCC diagrams. I have been searching for this recent increase, which is very important for the discussion about a possible human influence on climate, but I have basically failed to find an increase above the late 1930s.

    See here:
    http://wattsupwiththat.com/2009/11/29/when-results-go-bad/

    ??????????

  64. DeNihilist says:

    Firstly, a HUGE pat on the back to Anthony, for having the kohannas to post this! Thanx Anthony!

    Second a huge thanx to all the citizen scientists who have spent their own precious time on these temp constructions. Because of you, we can now move on to the next questions. My favourite being where is this warmth from, min or max? BRAVO!

    And finally, thanx to all the respondents for being civil and asking questions about the article, not about the authors! The more we have of this type of exchange, the better chance we have of deciding as to whether or not we are in real trouble, a bit of trouble or no trouble.

    Zeke and Mosh, would it be advantageous to maybe do a dual post on your next step, one here and maybe one at Tamino’s or Gavin’s?

    Kudos to all!

  65. George E. Smith says:

    “”” Zeke Hausfather says:
    July 13, 2010 at 3:39 pm
    George E. Smith,

    Lucia gave an excellent explanation of why anomalies are more useful than absolute temps awhile back: “””

    Maybe anomalies are “more useful than absolute temps”; I’ll even stipulate that, although I can’t imagine what for. That doesn’t change the fact that anomalies have nothing whatsoever to do with Calculating the Global Temperature.

    The GISStemp process, and the HADcrud process calculate GISStemp and HADcrud respectively; and nothing else. They have no connection to the mean global temperature of the planet; which in turn has no connection to the energy balance of the earth energy budget. The thermal processes over different terrains are all different; and none of them are simply related to even the local Temperature; let alone to any global average temperature; so the whole process is an exercise in self delusion.

    Might as well average the telephone numbers in your local phone directory; it is meaningless, unless the average happens to be your phone number. It might not even be a valid phone number; but it still is the average of a quite arbitrary set of numbers.

    I quite understand the concept of scraping all the numbers off a thermometer, so they have no absolute calibration; and simply referencing the mark at any time to some other place it once was. But what a complete and utter waste of time and effort that is.

    And the results if they show any trends at all, simply show the trend of a particular algorithm applied to a particular set of quite arbitrary and meaningless numbers.

    It’s like holding your arms out straight in front of you and then reporting:- ” see there are my fingers right out there on the end of my hands; which is about where they always have been !”

  66. graham g says:

    Good paper, Anthony!

    I admire the scientists who post here, but a major simple fact seems to be absent in all the presentations. You only have to review the snapshot under Anthony’s starting comment, and ask yourself why the equatorial temperatures show as cool while the adjacent areas are quite warm.

    I find the complete disregard of relative humidity most strange. Is it solely because we cannot measure it remotely? The global land temperature graph is in accordance with my memories as a 70+ year old living in tropical Australia and PNG. The period around the 1940s was a time of major drought, which changed to major flooding years from the mid-1950s to the mid-1970s. Since then we have gone into a serious drought cycle that we may be coming out of in the past few years.

    Given that people around the world’s developed countries are now demanding the availability of huge amounts of energy for their enjoyment in its various forms, it’s not surprising that the UHI effect is so significant.

    I’m sure that the scientists are well aware of the impact of relative humidity.
    They were, when I was a boy. Why not now?

  67. carrot eater says:

    For all those asking, how raw is raw:

    In the case of unadjusted GHCN and USHCN: these are monthly means. They are often the means of the daily max and min temperatures, but in some countries other averages may be used. So somebody somewhere calculated those means. But there is no attempt to correct for TOB, UHI, station moves, changing conditions at the site, changing instrumentation, etc in the unadjusted source files.

    NOAA does attempt such adjustments, but they put the results in other files (adj for GHCN, TOB and F52 for USHCN). The individual countries often also do their own adjustments for their own stations, but these should be kept separate from what gets sent to NOAA. You can check that by checking against CLIMAT reports for recent years.
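    For anyone wanting to inspect the unadjusted v2.mean file directly, here is a hedged sketch of a line parser. The fixed-width layout assumed here (11-character station id, one duplicate digit, 4-digit year, then twelve 5-character monthly means in tenths of a degree C, with -9999 as the missing flag) should be verified against NCDC's readme before relying on it:

```python
# Hedged sketch: parse one record from GHCN v2.mean (assumed layout).
def parse_v2_line(line):
    station = line[0:11]          # country + WMO + modifier codes
    dup = line[11]                # duplicate-record number
    year = int(line[12:16])
    temps = []
    for m in range(12):
        raw = int(line[16 + 5 * m : 21 + 5 * m])
        # Values are in tenths of a degree C; -9999 means missing
        temps.append(None if raw == -9999 else raw / 10.0)
    return station, dup, year, temps
```

    The station id and values in the example below are synthetic, purely to show the field positions.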

  68. DeNihilist says:

    Oh, and R. Gates, in all sincerity, would you please contribute a post about the arctic? Anthony has already said that he would most likely post it. And this is the way science grows, by looking at all sides of the question.

  69. Nick Stokes says:

    Bill Illis,
    “1. How Raw is the Raw data in the GHCN dataset.”
    Quite raw, at least in current practice. It comes straight from the CLIMAT forms submitted by the various Mets. You can see these at OGIMET.
    “2. What adjustments are done to have GISTemp and Hadcrut3 higher (and lower) than the reconstruction numbers.”
    Gistemp code is available. But the message from the linked articles is that the adjusted GISS and HADcrut are not noticeably different from the raw GHCN results.
    “3. Can we pick a better series of high quality rural stations that have consistently reported over the whole period so we can avoid UHI and station-selection biases (continuing to use rising stations and discarding declining stations) in the GHCN dataset.”
    That was the criterion I used in the 60 stations exercise. Rural, 90 years of record, and still reporting in 2008. The exercise was intended just to show what a rather random choice could do – if I were doing it again, I would modify the weighting.

  70. Luis Dias says:

    George Smith, that’s quite an astonishing nihilist (and paranoid) vision of reality. I never thought I’d see that kind of thing. Even here.

  71. jorgekafkazar says:

    Steven Mosher says: “Well, the results are consistent with and confirm the theory of GHG warming, espoused BEFORE this data was collected. They dont prove the theory, no theory is proven.”

    “Consistent with,” perhaps. CO2 went up and temperatures went up. This does not prove causation. There have been periods in the past where temperatures went down and CO2 remained high.

    But “confirm the theory of GHG warming?” How did you reach that conclusion?

    The fact that the theory was espoused before the data was collected is irrelevant and “confirms” nothing. It may be necessary, but is far from sufficient. Correlation is not causation. If I had a theory that wind is caused by trees wiggling their branches, and data subsequently showed that when it’s windy, tree branches are always waving, then according to your logic [or what I think is your logic], this would “confirm” my earlier theory of dendroanemosicity.

  72. Bob Tisdale says:

    BillyBob: You asked, “…is there a reliable record for min/max anywhere?

    There is an absolute land surface temperature dataset from 1948 to present. It’s identified as the “CPC GHCN/CAMS t2m analysis (land)” and is available through the KNMI Climate Explorer in three different resolutions:
    http://climexp.knmi.nl/selectfield_obs.cgi?someone@somewhere

    The dataset is discussed in Fan and van den Dool (2007) “A global monthly land surface air temperature analysis for 1948–present” (24mb):
    ftp://ftp.cpc.ncep.noaa.gov/wd51yf/GHCN_CAMS/2007JD008470.pdf

    I used it in one post:
    http://bobtisdale.blogspot.com/2010/03/absolute-land-surface-temperature.html

    And in that post, I plotted annual maximum, average, and minimum global land surface temperatures:
    http://i43.tinypic.com/25qr8yo.png

  73. Zeke Hausfather says:

    Here are only stations with 100 year records, for those interested: http://i81.photobucket.com/albums/j237/hausfath/Picture135.png

    One of the nice things of having all of these tools out there is that folks can look at any particular subset of stations they desire, be it dark rural non-airport stations with 100+ year records or anything else.

    For those asking about the rawness of the data, GHCN v2.mean comes directly from CLIMAT reports submitted by national MET offices, with some very basic QA (they keep a file with all the QA rejects if you want to check). We also show reconstructions with two alternative datasets (WMSSC and GSOD). The latter (GSOD) uses mostly stations not included in GHCN and is constructed from raw daily readings.

    REPLY: Do you have a list of those stations by name and ID Zeke? I’d like to have a look at them. – Anthony

  74. Ron Broberg says:

    Just a note to look again at the charts at the top of the post and note the label “GSOD.” It’s not all about GHCN data.

    GSOD is an alternate dataset; it is not GHCN. Whereas GHCN is compiled from CLIMAT reports submitted by various National Meteorology Centers, GSOD is the daily summary of near-real-time (synoptic) data gathered hourly to every several hours, depending on the source. This ~hourly data is used for the ISH/ISD dataset (used by Dr Spencer). GSOD is a separate data stream, although many of the same stations will appear in both GHCN and GSOD. However, GSOD includes thousands of stations not included in GHCN after 1972, although there are many fewer stations before 1972.

  75. Zeke Hausfather says:

    Anthony,

    Ron did all the heavy lifting on the GSOD data. The daily temp values are a bit hard to work with (and he posted the scripts for them here: http://rhinohide.wordpress.com/2010/06/26/gsod-to-ghcn-round-2/ ), but Ron turned them into monthly means available here:

    http://rhinohide.cx/co2/gsod/data/201006/my.gsod.mean

    With a station inventory file here:

    http://rhinohide.cx/co2/gsod/data/201006/my.gsod.inv

    (Note that this data should not be used for commercial purposes per its redistribution restrictions: http://www.ncdc.noaa.gov/cgi-bin/res40.pl?page=gsod.html )

    REPLY: Thanks much -A

  76. Zeke Hausfather says:

    To supplement my last comment a tad, you can find the unprocessed GSOD data here:
    ftp://ftp.ncdc.noaa.gov/pub/data/gsod/

    And the metadata file here:
    ftp://ftp.ncdc.noaa.gov/pub/data/gsod/ish-history.txt (warning: large file)
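For reference, the daily-to-monthly step that Ron's linked scripts perform can be sketched roughly as follows (a minimal sketch, not the actual GSOD processing; the 20-day completeness threshold is an assumption for illustration, not GSOD policy):

```python
from collections import defaultdict
from datetime import date
from statistics import mean

def monthly_means(daily, min_days=20):
    """Average daily mean temperatures into monthly means.

    daily: iterable of (datetime.date, temp_C) pairs.
    Months with fewer than min_days reports are dropped;
    the threshold of 20 is an assumption for illustration.
    """
    by_month = defaultdict(list)
    for d, t in daily:
        by_month[(d.year, d.month)].append(t)
    return {ym: mean(ts) for ym, ts in sorted(by_month.items())
            if len(ts) >= min_days}

# Example: 30 days of June at a constant 15.0 C.
june = [(date(2009, 6, i + 1), 15.0) for i in range(30)]
print(monthly_means(june))  # {(2009, 6): 15.0}
```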

  77. Bill Illis says:

    Nick Stokes says:
    July 13, 2010 at 6:14 pm
    Bill Illis,
    “2. What adjustments are done to have GISTemp and Hadcrut3 higher (and lower) than the reconstruction numbers.”
Gistemp code is available. But the message from the linked articles is that the adjusted GISS and HADcrut are not noticeably different from the raw GHCN results.

    GISTemp and Hadcrut3 are quite different. I quoted the numbers above.

    “Land temperatures in the GHCN dataset (according to the reconstructions) has increased about 0.9C and the Land/Ocean temperature series has increased about 0.55C since 1900.

    GISTemp is 0.70C (Land) and 0.64C (Land and Ocean) since 1900 and Hadcrut3 at 0.966C (Land) and 0.703C (Land and Ocean) since 1900.”

Maybe when they are plotted on a chart at a certain resolution they look like they come close to matching up, but they don’t. Differences of 0.1C and 0.2C do matter in this business. Temps are already only 50% of that predicted; 40% is now in the redo category.

  78. Zeke Hausfather says:

Ahh, sorry, I misread (it’s been a long day…). You want the list of stations with 100+ year records. It’s here: http://drop.io/0yhqyon/asset/long-lived-ghcn-stations-txt

    REPLY: Yes. Thanks for picking that up. I was perusing the links you gave and ??? were sprouting – A

  79. Zeke Hausfather says:

    Bill Illis,

GISTemp Land isn’t actually calculating land temperature. Rather, it’s attempting to calculate global temperature with only land records. So it’s not strictly comparable to other records (though you can replicate it with the correct zonal weighting and no land mask).

    We had a fun time looking into the Great GISTemp Mystery over here awhile back: http://rankexploits.com/musings/2010/the-great-gistemp-mystery/

  80. An Inquirer says:

    If I read the graphs correctly, global temperature over land is up .8 degrees C in 110 years, but if you put in the oceans, then the global temperature is up only .4 degrees.

    I realize that there are issues in the quality & consistency of reading SST; however, I believe the following is correct:

    So the skeptic hypothesis is that UHI and siting issues have biased the land temperature upwards.
    And the believer’s hypothesis is that air over land is drier than air over water, and therefore we will see land temperatures rise faster than SSTs.

  81. Ron Broberg says:

One of the points of a thread like this is that the ‘game’ doesn’t stop when someone raises an objection. Unsure of the reliability of CRUTEM? Go out and build your own temperature reconstruction! Don’t understand how GISTEMP does station combinations? Rewrite their method in your own code! Unsure of the effect of the ‘dying of the thermometers’? Use a data set that doesn’t have the station loss. Object to the UHI in the analysis? Explore the urban effect with your own tools.

    Science doesn’t stop when a reasonable question is raised. Indeed, that is precisely the point at which science begins! … but only if you are willing to do the follow-up work required. :-)

  82. EthicallyCivil says:

    Re: Steven Mosher says:
    July 13, 2010 at 3:43 pm

    >>> The data I used was “uncorrected” GHCN.

Color me confused about the temperature record. I had thought there was some dispute about the temperatures in the 1940s relative to modern times: that earlier records indicated higher temperatures for the pre-1970 period than the modern record does. Am I incorrect in this?

    Civilly yours,

    EC

  83. Don B says:

    Assume, as a given, that figure 1 depicts global temperatures. There was no warming or cooling, net, between 1900 and the mid 1970s. All of the net warming was between late 1970s and about 2005.

    Why?

  84. Zeke Hausfather says:

    Anthony,

    Here is a newer graph with all GHCN v2.mean stations, only stations with > 100 year records, and only rural (via GRUMP) stations with > 100 year records. It gets noisier as the number of stations available decreases, but the trends don’t change too much.

    http://i81.photobucket.com/albums/j237/hausfath/Picture18-4.png

    And here are all the wmo_ids for the long-lived rural stations:

    http://drop.io/0yhqyon/asset/long-lived-rural-ghcn-stations-txt

    Mosh should be able to run this analysis with his code as well, or an analysis with any particular subset of stations you choose.

  85. Ron Broberg says:

    EC: The 1930s and 40s were about the same temp as now for the United States (CONUS).

    You can see some US-v-World with GHCN-v-USHCN in this post by Zeke:
    http://rankexploits.com/musings/2010/a-simple-model-for-spatially-weighted-temp-analysis/

  86. Ron Broberg says:

    (Note that this data should not be used for commercial purposes per its redistribution restrictions: http://www.ncdc.noaa.gov/cgi-bin/res40.pl?page=gsod.html )

I wonder how long before I get a friendly ‘cease and desist’ letter and I, just like Dr Jones before me, will have to pull down my data and tell my people that I can’t give them access and that they should just download their own?! :lol:

    REPLY: If that ever happens, I’ll be the first to step up for defending your right to use and publish it. – Anthony

  87. kuhnkat says:

    Wonder why they didn’t include Dr. Spencer’s computations??

    Oh yeah, he used a data set that did not have the bias built in through poorly formed homogenisation adjustments!!

    HAHAHAHAHAHAHAHAHAHAHAHAHAHA

  88. HaroldW says:

    Zeke and Steve –
    Thanks for the very informative article.

    It’s not particularly important, but figure 2 shows a large divergence in the GSOD dataset around 1969. Any idea of its cause?

    On a different topic, Bill Illis (July 13, 2010 at 5:27 pm) says: “Land temperatures in the GHCN dataset has increased about 0.9C and the Land/Ocean temperature series has increased about 0.55C since 1900.” Using a 30/70 area split for land/ocean, the implied gain in ocean temps since 1900 is about 0.4 C. [I'm not sure that sentence is clear, so I'll write it algebraically as "average land/ocean change = land_area_fraction * land_change + ocean_area_fraction * ocean_change." Using the above information, 0.55 C = 0.3 * 0.9 C + 0.7 * ocean_change;
    solving gives ocean_change = 0.4 C.]
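HaroldW's back-of-the-envelope calculation checks out; as a quick sanity check (using his assumed 30/70 land/ocean area split and Bill Illis's figures):

```python
# combined = f_land * land_change + f_ocean * ocean_change,
# solved here for the ocean term.
f_land, f_ocean = 0.3, 0.7
land_change = 0.9       # C since 1900 (GHCN land, per Bill Illis)
combined_change = 0.55  # C since 1900 (land/ocean)

ocean_change = (combined_change - f_land * land_change) / f_ocean
print(round(ocean_change, 2))  # 0.4
```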

    Visually comparing figures 1 & 8, it appears that in the 1915-1940 warming period, land and ocean warmed at similar rates, but during the more recent period 1970-2000 land warmed considerably faster than ocean. Obviously you have more precise information about the land-ocean difference — can you draw any conclusions from that data?

  89. Zeke Hausfather says:

    Ec,

    This may give you a sense of the variation in temps between countries/regions: http://i81.photobucket.com/albums/j237/hausfath/Picture155.png

  90. Bill Illis says:

    “Zeke Hausfather says:
    July 13, 2010 at 7:45 pm
    Anthony,

    Here is a newer graph with all GHCN v2.mean stations, only stations with > 100 year records, and only rural (via GRUMP) stations with > 100 year records. It gets noisier as the number of stations available decreases, but the trends don’t change too much.

http://i81.photobucket.com/albums/j237/hausfath/Picture18-4.png”

    I understand that the lines on the chart look similar. But let’s just turn the chart into numbers here.

    GHCN rural long-lived: +0.45C 1900 to 2009
    GHCN long-lived: +0.85C
GHCN v2: +0.70C

This is enough of a difference to start drawing conclusions about the differences rather than concluding they are the same.

  91. Zeke Hausfather says:

    kuhnkat,

    You mean this one? http://www.drroyspencer.com/wp-content/uploads/ISH-vs-CRUTem3NH-1986-thru-20091.jpg

    The only reason we didn’t include a temp reconstruction by Dr. Spencer is because he hasn’t released the data yet, just figures. I emailed him awhile back requesting global monthly means from his ISH implementation, but unfortunately didn’t receive a response.

    If you are referring to Spencer’s U.S. graph, well, we don’t have any U.S.-only charts in this post. I’d warrant it would be pretty close to GHCN v2.mean or USHCN raw for CONUS, however. I took a stab at replicating it using GHCN back in the day: http://rankexploits.com/musings/2010/effect-of-dropping-station-data/#comment-35519

  92. Ron Broberg says:

    Kuhnkat, GSOD is the daily summary of the same data set (ISH aka ISD) that Dr Spencer used.

The ISH data is only freely accessible in bulk from known US domains (.edu, .gov, .mil, and such). It is also much larger than the GSOD data (~300 GB for everything). I’m working on getting access to it. Alternatively, if you would like to donate $2000 US so that I can purchase the whole set on CD, I would be happy to accept. :D

    REPLY: if you can give me an exact citation for this, I can see if my friend Jim Goodridge, former State Climatologist, has it -Anthony

  93. Zeke Hausfather says:

    Bill Illis,

    I’m not sure where you are getting those slopes from, but they are much more similar than that.

    1900-2009 (degrees C per decade)
    All stations: 0.072
    Long-lived: 0.086
    Long-lived rural: 0.071

    1960-2009 (degrees C per decade)
    All stations: 0.222
    Long-lived: 0.246
    Long-lived rural: 0.238

    Spreadsheet with data outputs and slope calculations:

    http://drop.io/0yhqyon/asset/ghcn-longevity-urbanity-xls
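For those wanting to reproduce slopes like these, a least-squares trend in degrees C per decade can be computed along these lines (a sketch; the spreadsheet's actual calculation may differ in detail):

```python
import numpy as np

def trend_per_decade(years, anomalies):
    """Ordinary least-squares trend of an annual series, in C per decade."""
    slope_per_year = np.polyfit(years, anomalies, 1)[0]
    return slope_per_year * 10.0

# Synthetic check: a series warming at exactly 0.072 C per decade.
years = np.arange(1900, 2010)
series = 0.0072 * (years - 1900)
print(round(trend_per_decade(years, series), 3))  # 0.072
```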

  94. richcar 1225 says:

All the raw data sets show 1 degree per century. Satellite data since 1978 shows 1.2 degrees per century. Tony B’s long-history stations show 0.26 degrees per century, but with periods of similar rates of increase. I am ready to throw UHI under the bus and concede that the last thirty years fulfills the expectations of AGW supporters. However, they will have to fight the long-term trend of 0.26 degrees per century. I look for cooling to bring the trend back down.

  95. Ron Broberg says:

    ISH / ISD (same thing, two names)

    http://ols.nndc.noaa.gov/plolstore/plsql/olstore.prodspecific?prodnum=C00532-TAP-A0001

    http://www.ncdc.noaa.gov/oa/nndc/freeaccess.html

    http://www.ncdc.noaa.gov/oa/climate/isd/index.php

    Note: There are several references to COOP data in those pages. The ISH/ISD data is NOT the same as COOP. It is the ISH/ISD that I am most interested in.

    If there is a chance that we can arrange a data transfer, you can contact me at my email listed in the posting headers. And thanks for looking into it!

  96. anna v says:

It is true that the climate community has managed to focus the world’s attention on the temperature anomalies, and by sleight of hand taken the attention away from the temperatures themselves and how badly the models reproduce them.

    I am with George Smith on this .

    It is good that we see that the anomalies calculated in different ways agree within the magnitude of the change seen, so as to be able to say : there has been an x change in anomaly +/- y systematic from different methods.

    But it is like magic tricks with cards.

Let’s see what we are talking about. We are talking about an excess of retained energy in the continuous input/output flux coming from the sun and radiating to space, i.e. energy.

    What does the anomaly tell us about energy input output except “there is change”?
    It is absolutely impossible to calculate energies radiated by using the anomaly map.
    It would be impossible with the temperature map too, unless one had the gray body constants and radiation curves of the map.

    In addition, a 1C anomaly in a region that has an average temperature of 273K has a completely different physical manifestation than a 1C anomaly where the average temperature is 288K.

    I am also with George on UHI, that in an ideal temperature measurement it should be integrated in. Do we correct for rocky mountains? Deserts? Energy is energy and is what is important for life on earth.

    Now of course, in the way the anomalies are being used UHI is important. It is the anomalies that are really irrelevant except in the sense “here there be tigers”.

  97. JimF says:

    @jorgekafkazar says:
    July 13, 2010 at 6:35 pm

    I agree fully with your comments in that post.

Now, we seem to have substantiated above that the output of an imperfect (and apparently increasingly unreliable) temperature data collection system can be screwed with one way or another and still come up with, more or less, the same results in assessing the “global temperature”.

    At least we have this from the authors to tease us: “…Our point was this: concerns about bias in the methods of GISS, CRU and NCDC can be put to rest. The big issue HAS ALWAYS been the adjustments and the metadata.

    That is the next topic for discussion, for serious discussion that is….”

    and further (in comments):

    “…A turn to the question of data adjustments and a turn to the question of metadata accuracy and finally a turn to the question about UHI. Now, however, the community on all sides of the debate has a set of tools to address these questions….”

    No kidding, Dick Tracy. I’ll wait for that next installment. Still, after looking at all those graphs above, I’m left with the feeling that my grandparents and John Steinbeck were a bunch of whiners. Why, the Thirties were positively chilly! They told me they sweated buckets. So, did they lie to me?

  98. EthicallyCivil says:

    Ron Broberg says:
    July 13, 2010 at 7:48 pm

    >>> EC: The 1930s and 40s were about the same temp as now for the United States

That’s what’s bothered me. How can we have AGW that exempts the US, where we have some of the best temperature records? It just seems… convenient.

Who said “global warming only happens where nobody lives”? I can’t remember.

    EC.

  99. Steven Mosher says:

    David A. Evans says:
    July 13, 2010 at 3:11 pm (Edit)
    How RAW is RAW?
    *********************
    That’s the next question. But what does raw mean?

  100. Steven Mosher says:

    “If all the warming is raised min’s caused by UHI and station dropout, we have zero to worry about.

    So, Zeke and Steve, is there a reliable record for min/max anywhere?”

    1. station dropout is not an issue as far as Bias goes.
    2. GHCN has daily min/max. others have as well.

  101. Steven Mosher says:

    George E. Smith says:
    July 13, 2010 at 3:27 pm (Edit)
    On #4 Gridding Methods; just what the heck grid are we talking about ?

    I thought both Hadcrud, and GISStemp used data from some small number of thermometers spread around the world; so what the heck are these grid cells and what do 5 x 5 and 3 x 3 grids mean ?

    ************************************************************

There are roughly 7,000–7,200 stations in GHCN.
They provide data from 1701 to the present.
They are distributed around the globe.

In my case I start by selecting stations using the following rule: a station must have
15 complete years in the 1953–1982 time frame. I can vary this rule and determine the sensitivity to that selection rule.

Then the stations are combined according to the lat/lon grid cell they fall in. I can combine all stations in a 1-degree, 2-degree, 3-degree, or 5-degree bin.
Each station is standardized by subtracting its mean during the 1953–82 period.

The stations are averaged per grid cell (lat/lon).

Then all the grid cells are weighted by their area on the sphere (with or without a land mask).

Then all grid cells are combined into an area-weighted series.

By varying the size of the grid we can see the effect of gridding approaches.
GISTemp uses an equal-area grid. I use regular (lat/lon) grids and weight by area.
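The steps above can be sketched in code (a simplified sketch, not Mosher's actual implementation: annual rather than monthly series are assumed, and the 15-complete-years rule is reduced to requiring any baseline data):

```python
import numpy as np
from collections import defaultdict

def global_anomaly(stations, year, bin_deg=5.0, base=(1953, 1982)):
    """Area-weighted global anomaly for one year.

    stations: iterable of (lat, lon, temps) where temps maps
    year -> mean temperature in C.
    """
    cells = defaultdict(list)
    for lat, lon, temps in stations:
        baseline = [temps[y] for y in range(base[0], base[1] + 1) if y in temps]
        if not baseline or year not in temps:
            continue
        # Standardize: subtract the station's 1953-82 mean.
        anom = temps[year] - np.mean(baseline)
        # Bin the station into its lat/lon grid cell.
        cell = (int(np.floor(lat / bin_deg)), int(np.floor(lon / bin_deg)))
        cells[cell].append(anom)
    # Average stations within each cell, then weight each cell by the
    # cosine of its center latitude (proportional to its area on the sphere).
    num = den = 0.0
    for (i, _), anoms in cells.items():
        lat_center = (i + 0.5) * bin_deg
        w = np.cos(np.radians(lat_center))
        num += w * np.mean(anoms)
        den += w
    return num / den
```

Varying bin_deg between 1 and 5 degrees reproduces the grid-size comparison described above.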

  102. Steven Mosher says:

    DirkH says:
    July 13, 2010 at 3:50 pm (Edit)

    “And right after 82, a steep temp rise (and declining thermometer population).
    Not accusing anyone of anything, just saying.”

    That’s largely incorrect.
    1. the answer is not changed by station drop off and
    2. The stations do not drop that quickly after 1982.
    3. If I change the period you will still get a rise.
    4. If I pick 1000 stations for the whole period you will still see a rise.

Here are the stations that report by month (months expressed as fractions):

    1950 3342
    1950.08333333333 3360
    1950.16666666667 3368
    1950.25 3393
    1950.33333333333 3394
    1950.41666666667 3382
    1950.5 3401
    1950.58333333333 3413
    1950.66666666667 3407
    1950.75 3405
    1950.83333333333 3421
    1950.91666666667 3425
    1951 3974
    1951.08333333333 3972
    1951.16666666667 3974
    1951.25 4001
    1951.33333333333 4012
    1951.41666666667 4016
    1951.5 4031
    1951.58333333333 4044
    1951.66666666667 4062
    1951.75 4065
    1951.83333333333 4069
    1951.91666666667 4061
    1952 4119
    1952.08333333333 4125
    1952.16666666667 4135
    1952.25 4137
    1952.33333333333 4132
    1952.41666666667 4156
    1952.5 4172
    1952.58333333333 4174
    1952.66666666667 4184
    1952.75 4197
    1952.83333333333 4195
    1952.91666666667 4190
    1953 4238
    1953.08333333333 4250
    1953.16666666667 4259
    1953.25 4266
    1953.33333333333 4269
    1953.41666666667 4288
    1953.5 4286
    1953.58333333333 4292
    1953.66666666667 4281
    1953.75 4283
    1953.83333333333 4291
    1953.91666666667 4299
    1954 4343
    1954.08333333333 4346
    1954.16666666667 4351
    1954.25 4347
    1954.33333333333 4351
    1954.41666666667 4366
    1954.5 4361
    1954.58333333333 4360
    1954.66666666667 4378
    1954.75 4375
    1954.83333333333 4385
    1954.91666666667 4374
    1955 4362
    1955.08333333333 4355
    1955.16666666667 4370
    1955.25 4363
    1955.33333333333 4368
    1955.41666666667 4370
    1955.5 4371
    1955.58333333333 4371
    1955.66666666667 4364
    1955.75 4378
    1955.83333333333 4375
    1955.91666666667 4382
    1956 4419
    1956.08333333333 4407
    1956.16666666667 4420
    1956.25 4424
    1956.33333333333 4426
    1956.41666666667 4420
    1956.5 4430
    1956.58333333333 4427
    1956.66666666667 4435
    1956.75 4429
    1956.83333333333 4427
    1956.91666666667 4432
    1957 4435
    1957.08333333333 4437
    1957.16666666667 4450
    1957.25 4456
    1957.33333333333 4459
    1957.41666666667 4456
    1957.5 4474
    1957.58333333333 4471
    1957.66666666667 4453
    1957.75 4455
    1957.83333333333 4472
    1957.91666666667 4465
    1958 4486
    1958.08333333333 4484
    1958.16666666667 4488
    1958.25 4510
    1958.33333333333 4499
    1958.41666666667 4499
    1958.5 4499
    1958.58333333333 4498
    1958.66666666667 4503
    1958.75 4507
    1958.83333333333 4501
    1958.91666666667 4504
    1959 4525
    1959.08333333333 4534
    1959.16666666667 4536
    1959.25 4538
    1959.33333333333 4527
    1959.41666666667 4537
    1959.5 4540
    1959.58333333333 4543
    1959.66666666667 4541
    1959.75 4541
    1959.83333333333 4540
    1959.91666666667 4547
    1960 4576
    1960.08333333333 4601
    1960.16666666667 4606
    1960.25 4616
    1960.33333333333 4614
    1960.41666666667 4599
    1960.5 4606
    1960.58333333333 4595
    1960.66666666667 4614
    1960.75 4619
    1960.83333333333 4629
    1960.91666666667 4621
    1961 4776
    1961.08333333333 4781
    1961.16666666667 4794
    1961.25 4807
    1961.33333333333 4821
    1961.41666666667 4823
    1961.5 4811
    1961.58333333333 4824
    1961.66666666667 4820
    1961.75 4817
    1961.83333333333 4831
    1961.91666666667 4817
    1962 4877
    1962.08333333333 4879
    1962.16666666667 4892
    1962.25 4875
    1962.33333333333 4888
    1962.41666666667 4883
    1962.5 4891
    1962.58333333333 4897
    1962.66666666667 4921
    1962.75 4918
    1962.83333333333 4932
    1962.91666666667 4927
    1963 4994
    1963.08333333333 5010
    1963.16666666667 5004
    1963.25 5020
    1963.33333333333 5014
    1963.41666666667 5027
    1963.5 5025
    1963.58333333333 5033
    1963.66666666667 5037
    1963.75 5028
    1963.83333333333 5038
    1963.91666666667 5030
    1964 5066
    1964.08333333333 5071
    1964.16666666667 5087
    1964.25 5072
    1964.33333333333 5063
    1964.41666666667 5070
    1964.5 5077
    1964.58333333333 5068
    1964.66666666667 5068
    1964.75 5077
    1964.83333333333 5061
    1964.91666666667 5049
    1965 5143
    1965.08333333333 5147
    1965.16666666667 5143
    1965.25 5143
    1965.33333333333 5151
    1965.41666666667 5143
    1965.5 5147
    1965.58333333333 5146
    1965.66666666667 5143
    1965.75 5151
    1965.83333333333 5161
    1965.91666666667 5156
    1966 5176
    1966.08333333333 5186
    1966.16666666667 5196
    1966.25 5195
    1966.33333333333 5195
    1966.41666666667 5184
    1966.5 5192
    1966.58333333333 5194
    1966.66666666667 5186
    1966.75 5194
    1966.83333333333 5193
    1966.91666666667 5186
    1967 5192
    1967.08333333333 5198
    1967.16666666667 5196
    1967.25 5193
    1967.33333333333 5202
    1967.41666666667 5193
    1967.5 5201
    1967.58333333333 5203
    1967.66666666667 5199
    1967.75 5199
    1967.83333333333 5193
    1967.91666666667 5178
    1968 5202
    1968.08333333333 5199
    1968.16666666667 5202
    1968.25 5206
    1968.33333333333 5200
    1968.41666666667 5204
    1968.5 5205
    1968.58333333333 5197
    1968.66666666667 5209
    1968.75 5208
    1968.83333333333 5208
    1968.91666666667 5193
    1969 5211
    1969.08333333333 5218
    1969.16666666667 5221
    1969.25 5223
    1969.33333333333 5222
    1969.41666666667 5219
    1969.5 5217
    1969.58333333333 5213
    1969.66666666667 5215
    1969.75 5186
    1969.83333333333 5196
    1969.91666666667 5188
    1970 5176
    1970.08333333333 5193
    1970.16666666667 5179
    1970.25 5194
    1970.33333333333 5171
    1970.41666666667 5181
    1970.5 5164
    1970.58333333333 5186
    1970.66666666667 5188
    1970.75 5186
    1970.83333333333 5192
    1970.91666666667 5185
    1971 5054
    1971.08333333333 5065
    1971.16666666667 5045
    1971.25 5050
    1971.33333333333 5058
    1971.41666666667 5049
    1971.5 5033
    1971.58333333333 5045
    1971.66666666667 5064
    1971.75 5050
    1971.83333333333 5044
    1971.91666666667 5023
    1972 5039
    1972.08333333333 5045
    1972.16666666667 5047
    1972.25 5037
    1972.33333333333 5037
    1972.41666666667 5043
    1972.5 5033
    1972.58333333333 5030
    1972.66666666667 5035
    1972.75 5019
    1972.83333333333 5023
    1972.91666666667 5019
    1973 5020
    1973.08333333333 5034
    1973.16666666667 5024
    1973.25 5007
    1973.33333333333 5022
    1973.41666666667 5006
    1973.5 5008
    1973.58333333333 5014
    1973.66666666667 5005
    1973.75 4998
    1973.83333333333 4959
    1973.91666666667 4986
    1974 5019
    1974.08333333333 5023
    1974.16666666667 5016
    1974.25 5004
    1974.33333333333 4992
    1974.41666666667 5004
    1974.5 5001
    1974.58333333333 4993
    1974.66666666667 4985
    1974.75 4991
    1974.83333333333 4986
    1974.91666666667 4967
    1975 4974
    1975.08333333333 4988
    1975.16666666667 4960
    1975.25 4982
    1975.33333333333 4961
    1975.41666666667 4966
    1975.5 4951
    1975.58333333333 4942
    1975.66666666667 4941
    1975.75 4945
    1975.83333333333 4929
    1975.91666666667 4924
    1976 4873
    1976.08333333333 4892
    1976.16666666667 4872
    1976.25 4866
    1976.33333333333 4873
    1976.41666666667 4879
    1976.5 4867
    1976.58333333333 4864
    1976.66666666667 4867
    1976.75 4860
    1976.83333333333 4869
    1976.91666666667 4849
    1977 4850
    1977.08333333333 4864
    1977.16666666667 4863
    1977.25 4843
    1977.33333333333 4847
    1977.41666666667 4845
    1977.5 4823
    1977.58333333333 4835
    1977.66666666667 4831
    1977.75 4829
    1977.83333333333 4816
    1977.91666666667 4825
    1978 4838
    1978.08333333333 4861
    1978.16666666667 4853
    1978.25 4844
    1978.33333333333 4864
    1978.41666666667 4845
    1978.5 4829
    1978.58333333333 4818
    1978.66666666667 4810
    1978.75 4794
    1978.83333333333 4779
    1978.91666666667 4777
    1979 4779
    1979.08333333333 4779
    1979.16666666667 4776
    1979.25 4770
    1979.33333333333 4765
    1979.41666666667 4772
    1979.5 4741
    1979.58333333333 4760
    1979.66666666667 4753
    1979.75 4746
    1979.83333333333 4741
    1979.91666666667 4713
    1980 4731
    1980.08333333333 4741
    1980.16666666667 4742
    1980.25 4730
    1980.33333333333 4726
    1980.41666666667 4741
    1980.5 4745
    1980.58333333333 4746
    1980.66666666667 4749
    1980.75 4746
    1980.83333333333 4728
    1980.91666666667 4700
    1981 4448
    1981.08333333333 4498
    1981.16666666667 4519
    1981.25 4501
    1981.33333333333 4490
    1981.41666666667 4513
    1981.5 4483
    1981.58333333333 4497
    1981.66666666667 4518
    1981.75 4457
    1981.83333333333 4436
    1981.91666666667 4431
    1982 4286
    1982.08333333333 4326
    1982.16666666667 4292
    1982.25 4289
    1982.33333333333 4282
    1982.41666666667 4297
    1982.5 4266
    1982.58333333333 4275
    1982.66666666667 4267
    1982.75 4265
    1982.83333333333 4235
    1982.91666666667 4234
    1983 4220
    1983.08333333333 4265
    1983.16666666667 4273
    1983.25 4265
    1983.33333333333 4244
    1983.41666666667 4266
    1983.5 4234
    1983.58333333333 4244
    1983.66666666667 4238
    1983.75 4224
    1983.83333333333 4205
    1983.91666666667 4191
    1984 4168
    1984.08333333333 4168
    1984.16666666667 4154
    1984.25 4155
    1984.33333333333 4152
    1984.41666666667 4164
    1984.5 4164
    1984.58333333333 4160
    1984.66666666667 4116
    1984.75 4131
    1984.83333333333 4137
    1984.91666666667 4098
    1985 4098
    1985.08333333333 4103
    1985.16666666667 4113
    1985.25 4157
    1985.33333333333 4117
    1985.41666666667 4122
    1985.5 4107
    1985.58333333333 4085
    1985.66666666667 4099
    1985.75 4095
    1985.83333333333 4032
    1985.91666666667 4003
    1986 4032
    1986.08333333333 4056
    1986.16666666667 4045
    1986.25 4061
    1986.33333333333 4083
    1986.41666666667 4031
    1986.5 4059
    1986.58333333333 4036
    1986.66666666667 4021
    1986.75 3988
    1986.83333333333 3980
    1986.91666666667 3953
    1987 4018
    1987.08333333333 4004
    1987.16666666667 4015
    1987.25 3996
    1987.33333333333 3977
    1987.41666666667 3992
    1987.5 3995
    1987.58333333333 3977
    1987.66666666667 3999
    1987.75 4017
    1987.83333333333 3975
    1987.91666666667 3975
    1988 3992
    1988.08333333333 3963
    1988.16666666667 3992
    1988.25 3972
    1988.33333333333 3953
    1988.41666666667 3936
    1988.5 3941
    1988.58333333333 3977
    1988.66666666667 3945
    1988.75 3949
    1988.83333333333 3965
    1988.91666666667 3959
    1989 3899
    1989.08333333333 3905
    1989.16666666667 3887
    1989.25 3922
    1989.33333333333 3904
    1989.41666666667 3844
    1989.5 3894
    1989.58333333333 3847
    1989.66666666667 3864
    1989.75 3848
    1989.83333333333 3845
    1989.91666666667 3830
    1990 3646
    1990.08333333333 3670
    1990.16666666667 3675
    1990.25 3671
    1990.33333333333 3538
    1990.41666666667 3495
    1990.5 3367
    1990.58333333333 3367
    1990.66666666667 3325
    1990.75 3424
    1990.83333333333 3388
    1990.91666666667 3367
    1991 2664
    1991.08333333333 2601
    1991.16666666667 2606
    1991.25 2649
    1991.33333333333 2591
    1991.41666666667 2570
    1991.5 2541
    1991.58333333333 2535
    1991.66666666667 2683
    1991.75 2573
    1991.83333333333 2534
    1991.91666666667 2512
    1992 2564
    1992.08333333333 2524
    1992.16666666667 2475
    1992.25 2587
    1992.33333333333 2582
    1992.41666666667 2576
    1992.5 2538
    1992.58333333333 2596
    1992.66666666667 2538
    1992.75 2565
    1992.83333333333 2463
    1992.91666666667 2413
    1993 2441
    1993.08333333333 2395
    1993.16666666667 2391
    1993.25 2409
    1993.33333333333 2361
    1993.41666666667 2372
    1993.5 2418
    1993.58333333333 2474
    1993.66666666667 2439
    1993.75 2470
    1993.83333333333 2323
    1993.91666666667 2395
    1994 2426
    1994.08333333333 2395
    1994.16666666667 2402
    1994.25 2389
    1994.33333333333 2379
    1994.41666666667 2325
    1994.5 2421
    1994.58333333333 2370
    1994.66666666667 2381
    1994.75 2366
    1994.83333333333 2268
    1994.91666666667 2208
    1995 2285
    1995.08333333333 2183
    1995.16666666667 2322
    1995.25 2308
    1995.33333333333 2250
    1995.41666666667 2230
    1995.5 2212
    1995.58333333333 2277
    1995.66666666667 2284
    1995.75 2212
    1995.83333333333 2266
    1995.91666666667 2238
    1996 2362
    1996.08333333333 2306
    1996.16666666667 2262
    1996.25 2353
    1996.33333333333 2374
    1996.41666666667 2366
    1996.5 2337
    1996.58333333333 2315
    1996.66666666667 2331
    1996.75 2309
    1996.83333333333 2294
    1996.91666666667 2352
    1997 2316
    1997.08333333333 2377
    1997.16666666667 2367
    1997.25 2354
    1997.33333333333 2362
    1997.41666666667 2281
    1997.5 2294
    1997.58333333333 2337
    1997.66666666667 2341
    1997.75 2360
    1997.83333333333 2317
    1997.91666666667 2296
    1998 2326
    1998.08333333333 2358
    1998.16666666667 2335
    1998.25 2350
    1998.33333333333 2287
    1998.41666666667 2353
    1998.5 2325
    1998.58333333333 2354
    1998.66666666667 2345
    1998.75 2301
    1998.83333333333 2268
    1998.91666666667 2277
    1999 2300
    1999.08333333333 2324
    1999.16666666667 2364
    1999.25 2305
    1999.33333333333 2341
    1999.41666666667 2334
    1999.5 2343
    1999.58333333333 2315
    1999.66666666667 2315
    1999.75 2305
    1999.83333333333 2285
    1999.91666666667 2278
    2000 2282
    2000.08333333333 2271
    2000.16666666667 2282
    2000.25 2269
    2000.33333333333 2255
    2000.41666666667 2319
    2000.5 2225
    2000.58333333333 2166
    2000.66666666667 2280
    2000.75 2288
    2000.83333333333 2254
    2000.91666666667 2238
    2001 2240
    2001.08333333333 2261
    2001.16666666667 2273
    2001.25 2221
    2001.33333333333 2204
    2001.41666666667 2279
    2001.5 2223
    2001.58333333333 2234
    2001.66666666667 2271
    2001.75 2282
    2001.83333333333 2250
    2001.91666666667 2242
    2002 2224
    2002.08333333333 2184
    2002.16666666667 2270
    2002.25 2289
    2002.33333333333 2266
    2002.41666666667 2272
    2002.5 2276
    2002.58333333333 2220
    2002.66666666667 2281
    2002.75 2301
    2002.83333333333 2146
    2002.91666666667 2141
    2003 2291
    2003.08333333333 2252
    2003.16666666667 2233
    2003.25 2289
    2003.33333333333 2285
    2003.41666666667 2271
    2003.5 2237
    2003.58333333333 2189
    2003.66666666667 2261
    2003.75 2289
    2003.83333333333 2243
    2003.91666666667 2238
    2004 2177
    2004.08333333333 2231
    2004.16666666667 2241
    2004.25 2248
    2004.33333333333 2228
    2004.41666666667 2212
    2004.5 2242
    2004.58333333333 2045
    2004.66666666667 2080
    2004.75 2064
    2004.83333333333 2070
    2004.91666666667 2075
    2005 2074
    2005.08333333333 2078
    2005.16666666667 2083
    2005.25 2095
    2005.33333333333 2096
    2005.41666666667 2071
    2005.5 2047
    2005.58333333333 2040
    2005.66666666667 2088
    2005.75 2037
    2005.83333333333 2056
    2005.91666666667 2140
    2006 2091
    2006.08333333333 2189
    2006.16666666667 2161
    2006.25 1165
    2006.33333333333 1166
    2006.41666666667 1154
    2006.5 1152
    2006.58333333333 1114
    2006.66666666667 1181
    2006.75 1161
    2006.83333333333 1130
    2006.91666666667 1141
    2007 1164
    2007.08333333333 1147
    2007.16666666667 1155
    2007.25 1162
    2007.33333333333 1160
    2007.41666666667 1195
    2007.5 1176
    2007.58333333333 1168
    2007.66666666667 1166
    2007.75 1197
    2007.83333333333 1170
    2007.91666666667 1159
    2008 1000
    2008.08333333333 994
    2008.16666666667 993
    2008.25 1011
    2008.33333333333 1018
    2008.41666666667 1013
    2008.5 991
    2008.58333333333 1013
    2008.66666666667 1272
    2008.75 1261
    2008.83333333333 1293
    2008.91666666667 1075
    2009 1286
    2009.08333333333 1289
    2009.16666666667 1271
    2009.25 1219
    2009.33333333333 1256
    2009.41666666667 1279
    2009.5 1275
    2009.58333333333 1271
    2009.66666666667 1266
    2009.75 1274
    2009.83333333333 1279
    2009.91666666667 1269

  103. Steven Mosher says:

    CE

    “Had to chuckle at that one. For all the methodology choices that turn out not to matter much, that’s got to be one of the least consequential. No?”

    I’m not so sure. For this exercise I used simple averaging; I will probably post on the other methods. I did not implement Hansen’s RSM, but I probably will. The differences are small, <0.1 C. It was a bitch. Also, I think Roman is looking at that issue as well.

  104. Steven Mosher says:

    “Would certain altitudes be over-represented in the cell’s average, and would it matter?”

    No, it would not matter: each station is standardized by subtracting its own mean, so absolute offsets (altitude included) drop out before stations are averaged.
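
    A minimal sketch of that standardization, with made-up station values, shows why a cell’s altitude mix drops out:

```python
# Anomaly-method sketch: each station is expressed as departures from its
# own baseline mean, so a station's absolute level (e.g. altitude-driven
# cold) is removed before stations are averaged into a grid cell.
def anomalies(series, baseline):
    """Subtract the station's mean over the baseline years."""
    base = [v for yr, v in series if yr in baseline]
    mean = sum(base) / len(base)
    return [(yr, v - mean) for yr, v in series]

# Two hypothetical stations in one cell: a warm valley and a cold summit.
valley = [(1961, 15.0), (1962, 15.2), (1963, 15.1)]
summit = [(1961, 2.0), (1962, 2.2), (1963, 2.1)]

baseline = {1961, 1962, 1963}
for name, s in [("valley", valley), ("summit", summit)]:
    print(name, [round(a, 2) for _, a in anomalies(s, baseline)])
# Both stations yield the same anomaly series despite a 13 C offset,
# which is why the altitude mix within a cell does not bias the average.
```

    The numbers are illustrative only; the point is the cancellation, not the values.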

  105. Steven Mosher says:

    “This is a well written overview of the problem by Steve Mosher!”

    Astute readers (like my roommate charles the moderator) should be able to pick out the relatively minor contribution I made to the words. Zeke was kind enough to take my numbers and produce text and charts to describe what SEVERAL people have done.
    Without Roman’s help, Jeff Id’s help, SteveMc, Ron, Chad, Zeke, and a bunch of people on the R help list, this thing would not be done.

  106. BillyBob says:

    Mosher: “2. GHCN has daily min/max. others have as well.”

    GHCN v2 max/min for Canada (for example) drops from 500-600 stations to 20-30 in the 1990s. It’s a joke.

    The raw data shows the max is cooling by the way.

    For example, this is raw GHCN v2 max data for June/July/Aug, ranked in 10-year periods, for the USA. The 1930s were the hottest (max) period.

    Decade JJA
    1930 – 1939 30.24
    1929 – 1938 30.19
    1931 – 1940 30.17
    1928 – 1937 30.07
    1932 – 1941 30.06
    1933 – 1942 29.99
    1934 – 1943 29.93
    1927 – 1936 29.87
    1935 – 1944 29.77
    1925 – 1934 29.71
    1926 – 1935 29.68
    1936 – 1945 29.65
    1924 – 1933 29.53
    1917 – 1926 29.49
    1893 – 1902 29.49
    1952 – 1961 29.47
    1916 – 1925 29.46
    1892 – 1901 29.46
    1913 – 1922 29.44
    1951 – 1960 29.42
    1937 – 1946 29.41
    1923 – 1932 29.40
    1994 – 2003 29.40

  107. Ron Broberg says:

    EC: That’s what’s bothered me. How can we have AGW that exempts the US

    There are a couple of answers to that question.
    The one I’ll go with is that, according to the IPCC AR4 WG1,
    the “A” part of “GW” is only, in just the last several decades,
    beginning to stand out from other natural forcings.

    http://www.ipcc.ch/publications_and_data/ar4/wg1/en/spmsspm-understanding-and.html

    In other words, the earlier highs were dominated by natural causes. The current highs are, to some degree greater than zero, a product of increasing CO2 (as determined by modeling analysis).

  108. Steven Mosher says:

    DR says:
    July 13, 2010 at 5:18 pm
    So once again the accuracy and precision of the data is still not addressed.

    Sorry, I still fail to see the significance of reproducing the same results over and over without investigating the quality of measurement at each individual station and also the inclusion of land use change (which alters the climate over time i.e. boundary layer ) and UHI. It would seem those are the most important factors that need to be ironed out.

    ****************
    It’s very simple: you can’t do that analysis competently without a tool. The tools were being questioned, so several people built their own. Showing that the tool doesn’t bias the answer is a good first step for an analyst.

  109. Steven Mosher says:

    CE

    Thanks for the kind remarks..

  110. Steven Mosher says:

    BillyBob:

    “GHCN v2 max/min for Canada ( for example) drops from 500-600 stations to 20-30 in the 1990s. Its a joke.”

    Generally speaking I try to avoid making statements like this without backing them up with solid analysis. It’s very simple to test: take the average with only those 20-30 over the whole time period and compare, or take the period 1940-1990, randomly select any 30, and do this several times. Canada, being at high latitude, is more homogeneous (highly correlated in space) than lower latitudes.
    At the highest latitude (90 degrees) it’s cold in every direction you choose to walk.
    High spatial correlation means fewer stations are required to capture the signal.
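
    The random-selection check described above can be sketched in a few lines (synthetic data, not actual GHCN series; the station count, trend, and noise level are assumptions for illustration):

```python
import random

# Sketch of the subsampling test suggested above: recompute the regional
# mean from repeated random subsets of stations and compare against the
# all-station result. Stations are synthetic stand-ins, each sharing one
# regional signal plus independent station-level noise.
random.seed(42)

years = list(range(1940, 1991))
signal = {yr: 0.01 * (yr - 1940) for yr in years}  # 0.1 C/decade trend
stations = [{yr: signal[yr] + random.gauss(0, 0.5) for yr in years}
            for _ in range(500)]

def regional_mean(subset):
    return {yr: sum(s[yr] for s in subset) / len(subset) for yr in years}

full = regional_mean(stations)
for trial in range(3):
    sub = regional_mean(random.sample(stations, 30))
    gap = max(abs(sub[yr] - full[yr]) for yr in years)
    print(f"trial {trial}: max year-by-year gap {gap:.2f} C")
# When stations are highly correlated in space, 30 stations track the
# 500-station mean closely, which is the point of the argument above.
```
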

  111. Steven Mosher says:

    ““Consistent with,” perhaps. CO2 went up and temperatures went up. This does not prove causation. There have been periods in the past where temperatures went down and CO2 remained high.”

    Causation is never proved. The mechanism by which GHGs warm the planet is a physical theory. A physical theory that engineers use in the everyday construction of devices that we all enjoy. It’s a physical theory which Monckton, Lindzen, Christy, and Spencer all agree with. We bicker over the MAGNITUDE of the effect, but no serious skeptic denies the basics of radiative transfer.

    “But “confirm the theory of GHG warming?” How did you reach that conclusion?”
    Confirm. Note, I don’t say verify.
    I’m a confirmational holist.

    http://en.wikipedia.org/wiki/Confirmation_holism

  112. DirkH says:

    Steven Mosher says:
    July 13, 2010 at 11:03 pm
    “DirkH says:
    July 13, 2010 at 3:50 pm

    “And right after 82, a steep temp rise (and declining thermometer population).
    Not accusing anyone of anything, just saying.”

    That’s largely incorrect.
    1. the answer is not changed by station drop off and
    2. The stations do not drop that quickly after 1982.
    3. If I change the period you will still get a rise.
    4. If I pick 1000 stations for the whole period you will still see a rise.”

    Your points (1),(2),(3) and (4) might be correct, but they don’t interfere with my words:
    “And right after 82, a steep temp rise (and declining thermometer population).”

    Your “That’s largely incorrect.” talks about a possible conclusion that I intentionally did not write down.

  113. KenB says:

    This whole issue is so overcomplicated by the failure to transparently clean up poor temperature records, audit the sites for compliance, and then rate them properly for adjustment of things like UHI, along with a cavalier one-size-fits-all global extrapolation that is also, on the face of it, not transparent. It’s hardly surprising we are where we are.

    I can also understand the frustration of Anthony when a site is proved to be so badly sited that it is withdrawn from the present system, but its rotten data is left to contaminate the historical record. (If that’s wrong, feel free to correct me, Anthony.)

    When looking back over historical hottest and coldest records, it’s bad enough that some of these may be due to historical errors, deliberate skewing to maintain a location’s claim (that happened in Australia; source: BOM history, The Weather Watchers), or poorly sited equipment that is poorly maintained.

    But then applying guesswork and adjusting temperatures down in historical records can also skew modern interpretation, especially where trends are constantly used to illustrate extreme views. It takes very little to adjust a model bias, and when trust is lost within the scientific process, suspicion abounds.

    Hopefully, with co-operation on sites such as this, some real consensus on data interpretation and the weighting methods to be applied might eventually be reached and confidence restored.

    my 2 cents !

  114. DirkH says:

    Ron Broberg says:
    July 13, 2010 at 4:00 pm
    “Dirk H: Here’s a guy who did a very simple analysis of raw data who comes to the conclusion that there is no discernible trend:

    That guy freely admits that he did no geographic weighting. GHCN has a high percentage of US stations – and a low percentage of Southern Hemisphere stations. ”

    Is the SH warming faster than the NH? It didn’t seem so to me in GISS’s famous global anomaly maps. Here’s one from Dec 2008:
    http://global-warming.accuweather.com/2009/01/despite_recent_trends_giss_sti_1.html

    So global warming seems to affect foremost the landmasses that lack thermometers. Hmm, what could one do to find out more?

    Add thermometers? I don’t know if that is a scientific answer, though, me not being a scientist…

  115. stephen richards says:

    the mechanism by which GHGs warm the planet is a physical theory. A physical theory that engineers use in the everyday construction of devices that we all enjoy. It’s a physical theory which Monckton, Lindzen, Christy, and Spencer all agree with. We bicker over the MAGNITUDE of the effect, but no serious skeptic denies the basics of radiative transfer.

    Over the magnitude, yes, once the physics is inserted into a system as complex as the climate. It is possible that there is no effect from CO2 in the climate, but no one has yet completed a full mathematical model (SteveMc’s engineering study) which proves it one way or the other.

    On another note, Mosh et al., thanks for this example; I really do appreciate all this effort. I think a clear statement of purpose at the beginning would have headed off many of the comments received, and I am convinced the introduction should have mentioned all that George Smith wrote.

    I believe you know he is correct in what he says, but that was not the purpose of your essay. In the final analysis it is only energy balance / lapse rate that matters, and global temp is for the public/media.

    Lastly, I have become aware that GISS have been adjusting historic data along with the latest data and that these adjustments have tended to lower the historic data relative to the later. Am I correct, if so how has this been accounted for in your examples?

  116. Ryan says:

    Amusing article that comes to the conclusion that all the simple arithmetic is done correctly. So it seems even climate scientists can add up.

    Still, it puzzles me that the theory states that the additional CO2 should make the most impact when insolation is highest, yet they insist on averaging temperatures over a year. It seems to me that a more sensitive approach would be to look only at temperature measurements on the longest day for both Northern and Southern hemispheres, to see if there is any real evidence that CO2 is doing its evil work, rather than watering down the possibility of detection by adding in all the winter temperatures too. This approach would also retain the natural variation of temperatures due to “weather”, which would allow mathematicians to consider the statistical significance of any anomaly.

  117. carrot eater says:

    stephen richards

    GISS adjustments do not enter into the above, at all. The source data are taken from a source upstream from GISS.

    As for what GISS does, see the effect here.
    http://clearclimatecode.org/gistemp-urban-adjustment/

  118. carrot eater says:

    DirkH

    Can you clarify what methods and data your source used? It’s very difficult to tell.
    By parsing language, I kind of think he used New Zealand data only, and may have used the First Difference Method for the calculation, but it’s not obvious to me.
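
    For readers unfamiliar with it, the First Difference Method mentioned above averages year-over-year changes at each station, rather than anomalies against a common baseline. A minimal sketch with made-up numbers:

```python
# First Difference Method sketch: average each station's year-to-year
# temperature change across all stations reporting in both years, then
# cumulate the mean differences into an index. Values are illustrative.
def first_diff_index(stations):
    years = sorted({yr for s in stations for yr in s})
    index = [0.0]
    for prev, cur in zip(years, years[1:]):
        # Mean change across stations present in both adjacent years.
        diffs = [s[cur] - s[prev] for s in stations if prev in s and cur in s]
        index.append(index[-1] + (sum(diffs) / len(diffs) if diffs else 0.0))
    return dict(zip(years, index))

a = {2000: 10.0, 2001: 10.3, 2002: 10.1}
b = {2000: 5.0, 2001: 5.5}          # station drops out after 2001
idx = first_diff_index([a, b])
print({yr: round(v, 2) for yr, v in idx.items()})
# prints {2000: 0.0, 2001: 0.4, 2002: 0.2}
```

    Note how the dropout station contributes to the 2000-2001 step but simply vanishes from the 2001-2002 step, which is how the method handles changing station counts.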

  119. Malaga View says:

    George E. Smith says:
    July 13, 2010 at 3:19 pm
    Why not admit, that ALL of these records; are simply the result of statistical manipulations on the output of a certain set of thermometers; and that any connection between the result of that computation and the actual temperature of the earth is without any scientific foundation; it’s a fiction.

    AGREED: It is a fiction.

    George E. Smith says:
    July 13, 2010 at 3:31 pm
    is not correction simply making up false data?

    AGREED: That is what scientists seem to do these days.

    George E. Smith says:
    July 13, 2010 at 5:31 pm
    The point is that NOWHERE in this process, can the result be connected to the planet to “Calculate the Global Temperature” It simply calculates the variations of some quite arbitrary set of thermometers from themselves.

    AGREED: End of story.

    George E. Smith says:
    July 13, 2010 at 5:45 pm
    The GISStemp process, and the HADcrud process calculate GISStemp and HADcrud respectively; and nothing else. They have no connection to the mean global temperature of the planet; which in turn has no connection to the energy balance of the earth energy budget.

    AGREED: Anomalies just seem to be about fear and obscuration.

    Luis Dias says:
    July 13, 2010 at 6:14 pm
    George Smith, that’s quite an astonishing nihilist (and paranoid) vision of reality. I never thought I’d see that kind of thing. Even here.

    Welcome to the real world.

    rbateman says:
    July 13, 2010 at 5:34 pm
    If we plotted the yearly mean temp instead of the anomaly, set the bottom of the graph at ZERO, set the top at 2x the mean, then we’d see how this Global Temp scare is making a mountain out of a molehill.

    AGREED: A very simple, very sensible and very correct way to look at their data

  120. Dusty Rhodes says:

    A very interesting post. I was particularly surprised at the lack of difference in the results from the various methods used thereby eliminating method as a problem area.

    Would overlaying the standard deviation of at least the baseline data be helpful in interpreting the graphs?

    Slightly O/T, but I was looking at the 234-year-long Central England Temperature record (link below) and noticed that the global average curve was almost entirely below the CET one. Now, given that (just considering the northern hemisphere) there is a great deal more of the earth’s (warmer) surface south of the UK than north of it, I would have thought, intuitively, that the global curve would be above CET. Or am I missing something fundamental?

    Thoughts anyone?

    http://www.decc.gov.uk/assets/decc/statistics/climate_change/1_20100319151831_e_@@_surfacetemperaturesummary.pdf

  121. Smokey says:

    Global temperature from 1979 plotted on a normal y-axis.

  122. Smokey says:

    Ron Broberg says at 11:31 pm:

    “The current highs are to some degree greater than zero a product of increasing CO2 (as determined by modeling analysis )”

    That may or may not be correct, but it should be kept in mind that it is an assumption based on a computer model. It is not real world data, and quoting the IPCC still doesn’t make it anything more than a conjecture.

  123. Geoff Sherrington says:

    There has to be a large component of guessing in all these reconstructions. If the old thermometer method was adequate, why was there a change to thermocouples and thermistors? As the daily sampling rate went from 1 a day to hundreds a day, did not this cause different smoothing assumptions, removal of spikes, etc? Why should the pre- and post- mercury be able to be spliced; what was the duration and magnitude of the splice? Would one not expect spike removal in recent times to drive down maxima? How can you use old records when the metadata sheets are still being studied and adjustments made as we speak? How can you define a generic term “rural” and know it was always thus through instrumented time? A station in the middle of a vast paddock will respond to the heat of a lamp used to light it for reading before sunrise – is this everywhere quantified and corrected? Even the height of the grass growing around it can cause substantial change.

    The error terms, when treated in a high-quality manner such as for data that really matter for health or safety, are so large that it is impossible to draw solid conclusions. I’m with George E. Smith – July 13, 2010 at 5:45 pm

    I think the comment of Luis Dias – July 13, 2010 at 6:14 pm is harsh and unjustified. The more temperature series I plot, the more I find stations with no change over the last 40 years, within reasonable interpretation. It takes only one such station, in theory, to disprove global warming; but when large numbers of them exist, then the concept of global warming has to explain a negative temperature driver at each one, which global warming theorists have failed to do.

  124. tallbloke says:

    Mosh says:
    “underlying mechanism. Well, the results are consistent with and confirm the theory of GHG warming, espoused BEFORE this data was collected.”

    Which of these two graphs suggests the better correlation Mosh?

    http://tallbloke.files.wordpress.com/2010/07/soon-arctic-tsi.jpg

    Take your time…

  125. beng says:

    I’m impressed by the effort. However, as an engineer concerned about heat-balances, measuring the surface air temperature of the earth doesn’t tell us much of anything. The air has negligible mass compared to the oceans/earth for one thing.

    Measuring OHC (ocean heat-content) is the only rational way to make global heat-balance determinations or even determining reasonable trends.

  126. Rob R says:

    Nice graph from Smokey. Kind of puts things into perspective.

    What I want to see is the result of the questioning of the data that should begin from this point. But I want the various “temperature trenders” whose graphs appear above to consider the point made by Geoff Sherrington in relation to raw data. Is there any change when truly raw data (as reported in the original observations taken at each climate station) is used? I am not really interested in the tortured excuse for data that we find in the NOAA, GISS and CRU/Hadley databases. And what happens when the most obvious anthropogenic influences (e.g. UHI) are eliminated at each (preferably rural) site?

    I am also rather suspicious of the anomaly method and would prefer careful construction of annual altitude- and latitude-adjusted temperatures for equal-area grid cells (or parts thereof).

    I think the “temperature trenders” also need to consider the findings of Roy Spencer in relation to the degree of UHI associated with population density. The greatest impact appears to occur at the beginning, i.e. from essentially zero population density to a few 10s or 100s of people per square km, rather than from 100s to 1000s. So sites that appear on most measures to be rather rural may nevertheless be impacted by substantial UHI effects regardless of proximity to a major centre. I suspect that James Hansen’s “nightlights” strategy is not an adequate solution to the problem.

  127. drj11 says:

    And of course there is Clear Climate Code, a reconstruction, in Python, of GISTEMP, for clarity.

    Because clarity is our goal, we think the source code should be of interest to people who want to see the nuts and bolts of one particular implementation. Source code is here.

  128. drj11 says:

    Where is the code for MoshTemp? No Graphs Without Code!

  129. Ron Broberg says:

    DirkH: Is the SH warming faster than the NH?

    No. The SH is warming slower than the NH.

    DirkH: So global warming seems to affect foremost landmasses with a lack of thermometers.

    No surprise there. Oceans are vast and they warm slowly.

    DirkH: Hmm, what could one do to find out more?

    Check out the bloggers linked in the original post. They are all doing more.

    DirkH:Add thermometers? I don’t know if that is a scientific answer, though, me not being a scientist…

    Neither am I, but I found more thermometers in the GSOD data set mentioned in posts above.

  130. tallbloke says:

    Can anyone shed light on this for me please. UAH trend 1980-2010 is almost identical to HADsst2gl trend.
    http://woodfortrees.org/plot/hadsst2gl/from:1980/trend/offset:-0.104/plot/uah/from:1980/trend
    But we see tropospheric temps rise much more than sst’s when there is an ENSO event like in 1998 or 2009-10.
    So how can the Hadley SST trend be the same as the UAH tropospheric trend over the longer term? Or am I missing something obvious?

  131. drj11 says:

    @Smokey: I don’t think the global temperature is roughly zero, as that graph seemingly shows. Not temperature then, is it?

  132. tallbloke says:

    beng says:
    July 14, 2010 at 5:36 am

    I’m impressed by the effort. However, as an engineer concerned about heat-balances, measuring the surface air temperature of the earth doesn’t tell us much of anything. The air has negligible mass compared to the oceans/earth for one thing.

    Measuring OHC (ocean heat-content) is the only rational way to make global heat-balance determinations or even determining reasonable trends.

    Agreed. However, the current OHC record is unreliable. :(

  133. Smokey says:

    drj11,

    You’re right, it’s a temperature anomaly chart.

  134. tallbloke says:

    Mosh’s Figure 8 Gistemp/hadley area Land/Ocean graph shows an overall warming of around 0.6C but Gistemp shows around 0.8.
    http://woodfortrees.org/plot/gistemp/from:1900/mean:36

    WUWT?

  135. Gary says:

    How much overlap is “some” overlap in the datasets? Unless you show a chart indicating otherwise, it looks from the highly congruent curves that “some” is pretty high. So then the issue is not agreement of seemingly independent reconstructions, but the reliability of the source data — which the SurfaceStations project has shown to be suspect. The bias in reconstruction sausage-making is not in the grinding method, it’s in the raw meat.

    Please generate a comparison of the overlap in datasets, Mosh.

  136. Bill Illis says:

    Zeke Hausfather says:
    July 13, 2010 at 8:23 pm
    “Bill Illis,

    I’m not sure where you are getting those slopes from, but they are much more similar than that.

    1900-2009 (degrees C per decade)
    All stations: 0.072
    Long-lived: 0.086
    Long-lived rural: 0.071″

    Thanks for the data: Going back to 1880 using the same data:

    1880-2009 (degrees C per decade)
    All stations: 0.063
    Long-lived: 0.072
    Long-lived rural: 0.051

    1880-2009 (Increase over 129 years)
    All stations: 0.808C
    Long-lived: 0.933C
    Long-lived rural: 0.662C

    That is different enough in my opinion.
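
    The per-decade slopes being compared here are ordinary least-squares fits scaled by the period length. The slope-to-total-increase conversion can be checked with a minimal sketch (synthetic linear data, not the actual station series):

```python
# OLS trend sketch: fit a least-squares slope to annual values, report it
# in degrees per decade, and scale by the period length for the total
# increase. The input is an exact synthetic 0.063 C/decade line.
def ols_slope(years, values):
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den  # degrees per year

years = list(range(1880, 2010))
values = [0.0063 * (y - 1880) for y in years]  # synthetic stand-in series
slope = ols_slope(years, values)
print(f"{slope * 10:.3f} C/decade, {slope * (2009 - 1880):.2f} C over the period")
# prints 0.063 C/decade, 0.81 C over the period
```

    On real anomaly series the slope comes from the same fit over noisy data, so the quoted per-decade figures and the 129-year totals are consistent by construction.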

  137. Tim Clark says:

    Steven Mosher says: July 13, 2010 at 3:29 pm
    3. underlying mechanism. Well, the results are consistent with and confirm the theory of GHG warming, espoused BEFORE this data was collected. They dont prove the theory, no theory is proven.

    Steven, look at the trend from ~1917-1943. Steven, look at the trend from ~1975-1999 (both periods equally cherry picked). Please explain how “the results are consistent with and confirm the theory of GHG warming”.

  138. jaypan says:

    Hasn’t NASA/GISS more important things to do than just taking care of weather/climate: “… foremost … to find a way to reach out to the Muslim world … to help them feel good about their historic contribution to science … and math and engineering.” How about climate?
    Sorry, this time I couldn’t resist.

  139. Gary,

    There are four datasets, ISH, GSOD, WMSSC, and GHCN, but ISH/GSOD and WMSSC/GHCN are mostly overlapping. However ISH/GSOD has -many- more stations (20,000+) post-1970 than GHCN (~6000, and only ~2000 post 1990).

    Fig 2 shows reconstructions from GSOD, WMSSC, and GHCN. You could add in UAH or RSS as well if you want, though they are measuring something slightly different.

  140. Bill Illis,

    Well, there are < 100 long-lived rural stations prior to 1900, so I'd imagine there is some spatial bias creeping in prior to then unless they are remarkably well distributed. I'll look into it some more when I have a chance.

  141. BillyBob says:

    Mosher: “High spatial correlation means fewer stations are required to capture the signal.”

    2006 GHCN has 2 stations reporting July Max in Canada. Do you think 2 is “too few”?

    Anyway … it’s colder now in July than it was. A lot colder.

    GHCN v2 July Max in Canada Year,JulyMax mean,Count of Stations

    Year JulMax JulCount
    1840 24.4 1
    1841 25.8 1
    1842 25.4 1
    1843 25.5 1
    1844 25.8 1
    1845 25.4 1
    1846 26.2 1
    1847 25 1
    1848 22.7 1
    1849 24.8 1
    1850 25.8 1
    1851 22.4 1
    1852 23.9 1
    1853 25.1 1
    1854 29.3 1
    1855 24.8 1
    1856 26.8 1
    1857 24.9 1
    1858 24.2 1
    1859 23.7 1
    1860 22.8 1
    1861 23.7 1
    1862 24.7 1
    1863 23.8 1
    1864 26.7 1
    1865 23.3 2
    1866 27.4833333333333 6
    1867 25.9166666666667 6
    1868 30.6166666666667 6
    1869 23.8833333333333 6
    1870 25.9666666666667 6
    1871 24.51 10
    1872 25.3214285714286 14
    1873 25.0733333333333 15
    1874 24.2857142857143 14
    1875 23.7705882352941 17
    1876 24.452380952381 21
    1877 25.0129032258065 31
    1878 24.9714285714286 28
    1879 23.7583333333333 36
    1880 24.0323529411765 34
    1881 24.2055555555556 36
    1882 23.6194444444444 36
    1883 23.2057142857143 35
    1884 21.4066666666667 45
    1885 24 41
    1886 24.2955555555556 45
    1887 25.4851063829787 47
    1888 23.2377777777778 45
    1889 23.6916666666667 48
    1890 24.2 50
    1891 22.5545454545455 55
    1892 24.401724137931 58
    1893 23.8954545454545 66
    1894 25.0913043478261 69
    1895 23.1041095890411 73
    1896 24.3 74
    1897 23.7575342465753 73
    1898 24.3728395061728 81
    1899 23.6303797468354 79
    1900 23.1247191011236 89
    1901 23.8759036144578 83
    1902 22.9685393258427 89
    1903 21.952808988764 89
    1904 23.3631578947368 95
    1905 23.784375 96
    1906 25.0797872340426 94
    1907 23.05 96
    1908 24.5416666666667 108
    1909 23.0342105263158 114
    1910 24.2504347826087 115
    1911 23.3418032786885 122
    1912 22.2008196721311 122
    1913 22.875 140
    1914 24.8126506024096 166
    1915 22.4983695652174 184
    1916 24.4989473684211 190
    1917 24.6735751295337 193
    1918 23.5094059405941 202
    1919 24.771144278607 201
    1920 24.3913705583756 197
    1921 25.6245283018868 212
    1922 23.7359090909091 220
    1923 23.9490990990991 222
    1924 23.9655462184874 238
    1925 23.8238493723849 239
    1926 24.5540983606557 244
    1927 23.273640167364 239
    1928 23.6304721030043 233
    1929 24.145867768595 242
    1930 24.2072874493927 247
    1931 24.4859922178988 257
    1932 22.5463035019455 257
    1933 24.0145038167939 262
    1934 23.803007518797 266
    1935 24.4981684981685 273
    1936 25.4597826086957 276
    1937 24.6967971530249 281
    1938 24.3885416666667 288
    1939 24.3436241610738 298
    1940 23.7579124579125 297
    1941 25.1128712871287 303
    1942 22.974375 320
    1943 23.7377643504532 331
    1944 23.3178885630499 341
    1945 23.1789473684211 342
    1946 23.0744186046512 344
    1947 24.1592261904762 336
    1948 22.802915451895 343
    1949 22.8943502824859 354
    1950 22.1146814404432 361
    1951 22.7132075471698 371
    1952 23.1331606217617 386
    1953 22.4753886010363 386
    1954 21.9177215189873 395
    1955 23.3086294416244 394
    1956 21.6509900990099 404
    1957 22.0631707317073 410
    1958 22.098313253012 415
    1959 23.031990521327 422
    1960 22.4906474820144 417
    1961 22.5366197183099 426
    1962 20.7688073394495 436
    1963 21.9846681922197 437
    1964 21.8165909090909 440
    1965 21.1193333333333 450
    1966 22.01431670282 461
    1967 21.9592274678112 466
    1968 21.3837953091684 469
    1969 20.9380753138075 478
    1970 22.2475308641975 486
    1971 21.3024291497976 494
    1972 20.5340248962656 482
    1973 22.2576612903226 496
    1974 21.3495088408644 509
    1975 23.2423762376238 505
    1976 21.261554192229 489
    1977 21.1826446280992 484
    1978 21.59670781893 486
    1979 22.6260504201681 476
    1980 21.1364224137931 464
    1981 22.1787685774947 471
    1982 21.6768558951965 458
    1983 21.675550660793 454
    1984 22.2848758465011 443
    1985 22.2145833333333 432
    1986 20.4527186761229 423
    1987 21.9730046948357 426
    1988 22.3857142857143 427
    1989 23.2185096153846 416
    1990 23.2041095890411 73
    1995 19.9 28
    1996 19.2171428571429 35
    1997 19.4028571428571 35
    1998 20.5628571428571 35
    1999 19.08 35
    2000 19.8942857142857 35
    2001 19.88 35
    2002 17.2333333333333 24
    2003 18.2217391304348 23
    2004 19.2066666666667 15
    2005 18.7833333333333 12
    2006 18.5 2
    2007 18.38 10
    2008 21.66 5
    2009 20.0206896551724 29
    2010 0

  142. drj,

    I was remiss to omit a reference to CCC’s work, though you guys are a tad different from the other efforts, being more a replication than a reconstruction (and you don’t fare well on spaghetti graphs, being indistinguishable from GISTemp :-P ).

    Mosh should have his code polished, commented, and posted soon.

  143. Gail Combs says:

    Geoff Sherrington says:
    July 14, 2010 at 5:27 am

    There has to be a large component of guessing in all these reconstructions…..
    __________________________________________
    A. J. Strata did an analysis of the error in the temperature “product” we are fed.

    “I am going to focus this post on two key documents that became public with the recent whistle blowing at CRU. The first document concerns the accuracy of the land based temperature measurements, which make up the core of the climate alarmists claims about warming. When we look at the CRU error budget and error margins we find a glimmer of reality setting in, in that there is no way to detect the claimed warming trend with the claimed accuracy….”

    http://strata-sphere.com/blog/index.php/archives/11420

  144. Ryan says:

    “UHI is a tough one, simply because it can depend so much on micro-site effects that are difficult to quantify.”

    Actually it is quite easy. Simply measure the temperature near the site, then measure the temperature at the nearest genuinely rural site. The difference is the UHI.

    What is difficult is not UHI; UHI is easy. What is difficult is estimating how much UHI there was 100 years ago compared to today. Those sites that are clearly affected by UHI > 0.1 C, as detected by the method above, should be dismissed for this reason.
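
    The urban-rural pairing Ryan describes can be sketched as follows; all station coordinates and temperatures here are hypothetical:

```python
import math

# Sketch of a simple UHI estimate: a site's offset from the nearest
# genuinely rural station. Stations and values are made up.
def nearest_rural_offset(site, rurals):
    def dist(a, b):
        # Rough great-circle distance via the haversine formula (km).
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * math.asin(math.sqrt(h))
    nearest = min(rurals, key=lambda r: dist(site["loc"], r["loc"]))
    return site["temp"] - nearest["temp"]

city = {"loc": (45.0, -75.7), "temp": 12.4}
rurals = [{"loc": (45.4, -75.0), "temp": 11.1},
          {"loc": (46.5, -74.0), "temp": 10.2}]
print(f"UHI estimate: {nearest_rural_offset(city, rurals):.1f} C")
# As noted above, the hard part is not this snapshot but estimating how
# the offset has changed over the past century.
```
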

  145. KevinUK says:

    tallbloke says:
    July 14, 2010 at 5:29 am

    “Mosh says:

    underlying mechanism. Well, the results are consistent with and confirm the theory of GHG warming, espoused BEFORE this data was collected.”

    Which of these two graphs suggests the better correlation Mosh?

    http://tallbloke.files.wordpress.com/2010/07/soon-arctic-tsi.jpg

    Take your time…”

    So tallbloke, even though Mosh hasn’t replied as yet, do you now agree with me that Mosh is a ‘true believer’? I’ve known this for a long time now, but recently on a different thread on tAV you seemed surprised about this fact.

    Because Mosh has written a book on Climategate, most people wrongly assume that Mosh is an AGW skeptic. He isn’t and never has been an AGW skeptic but is in fact a ‘wolf in sheep’s clothing’ CAGW promoter.

    His latest collaborations with Ron B, Zeke H, Nick S etc, although laudable, are IMO nonetheless something of a waste of effort, as it’s clear that the whole concept of a so-called global mean surface temperature (GMST) is flawed.

    So what’s the point in ‘creating the tools’ as Mosh puts it, so that other issues like land use and UHI can be investigated, if the whole exercise is flawed? Answer: because Mosh & Co somehow think that GMST means something when it doesn’t, and that if they continue their work we’ll all somehow be convinced that the possibility of CAGW is real and that we should therefore do something to avoid it.

    IMO it’s a concept/construction created primarily so that those who seek to promote CAGW can attempt to claim that the planet has towards the latter part of the 20th century somehow warmed in an unnatural (claimed by them to be unprecedented) way due to man’s emissions of GHGs.

    They aren’t in the slightest bit interested in the poor correlation between 20th century temperature cooling/warming trends and CO2 emissions, and certainly not in the much more plausible alternative explanations of these multi-centennial historic trends. What matters to them is the message, i.e. that man is having an effect on the planet and so we must ‘act now’, change how our society is organised and agree to enrich them at our expense.

  146. Leonard Weinstein says:

    Please observe the 1940 to present land record was a 0.6 C increase while the global (including ocean) was 0.4 C. This means the ocean alone was close to 0.3 C. When this is combined with land records that were taken away from cities (which have some argument as being less biased by UHI effects), it appears the actual global total should be closer to 0.3 C. This is not even taking into account less reliable sea data in the early record from a different process. Since the rise to 1940 is admitted to be mainly natural, it appears that an increase of about 0.3 C since, with possibly still some continuing recovery from the LIA occurring, is not such a big deal. The current projections from many, including some AGW supporters, that the next 20 years are likely to bring cooling seem to drive the whole AGW bandwagon off the track.
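    Leonard’s land/ocean decomposition is just an area-weighted average run backwards; a quick check, assuming the usual round figure of ~29% land:

```python
# Quick check of the arithmetic above: if the global (land+ocean) rise is
# an area-weighted average of the land and ocean rises, the implied
# ocean-only rise follows directly. The 29% land fraction is the usual
# round figure, an assumption for this sketch.
land_fraction = 0.29
land_rise = 0.6     # deg C since 1940, as stated above
global_rise = 0.4   # deg C since 1940, as stated above

ocean_rise = (global_rise - land_fraction * land_rise) / (1 - land_fraction)
print(round(ocean_rise, 2))   # close to the ~0.3 C quoted above
```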

  147. BillN says:

    Re:
    george e. smith
    graham g
    anna v
    beng
    and others…

    In my own mind I am trying to move beyond the numerology of the temp data and toward looking at the energy and energy flows.

    But, as thermo was my worst performance in undergrad, and psychrometrics is the most arcane field known to man, can someone please point me to a cogent discussion of the right way to “average” the energy of two different locations. Is it the energy of the air? What about radiation, thermal “inertia,” etc.?

    I have messed about with some basic Excel sheets based on black/grey body energy equivalencies (where the T^4 makes the average energy differ from the “usual” average) but want to expand into latent heat, etc.

    I’m sure this is basic stuff that is covered elsewhere, not just a thermo book (I have those about) but hopefully a real-world applied research article.
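    The T^4 point can be shown in a few lines: the arithmetic mean of two temperatures is not the temperature that radiates the mean energy. A minimal sketch, with two invented temperatures:

```python
# Illustrates the T^4 point above: black-body emission goes as
# sigma*T^4 (Stefan-Boltzmann), so the "radiative average" of a hot and
# a cold site sits above the simple arithmetic mean of the two
# temperatures. The two temperatures are invented for illustration.
SIGMA = 5.670e-8                 # Stefan-Boltzmann constant, W m^-2 K^-4

t_hot, t_cold = 303.15, 263.15   # two illustrative surface temps, K

simple_mean = (t_hot + t_cold) / 2
mean_flux = (SIGMA * t_hot**4 + SIGMA * t_cold**4) / 2
radiative_mean = (mean_flux / SIGMA) ** 0.25   # temp that emits the mean flux

print(round(simple_mean, 2), round(radiative_mean, 2))
```

    With these numbers the radiative mean comes out roughly 2 K above the simple mean; latent heat, convection and thermal inertia would move the "right" average around further still.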

    It would be interesting to take a 5km x 5km x 30,000ft profile as follows:
    (1) humidity
    (1.1) humidity regime as f(altitude)
    (1.2) humidity regime as f(time)
    (1.3) humidity as f(location)
    (2) temperature
    (2.1) temp regime as f(altitude)
    (2.2) temp regime as f(time)
    (2.3) temp as f(location)
    (3) GHG conc
    (3.1) total GHG potential as f(time)
    (3.2) total GHG potential minus water vapour as f(time)
    (4) land use
    (4.1) urban as %area
    (4.2) suburban as %area
    (4.3) rural as % area
    (4.x etc) forest, swamp, arid, etc as %area
    (5 etc) barometric pressure, wind, cloud cover, precip, sunlight, etc

    Now fully instrument four contiguous 5km x 5km blocks in a representative 20km x 20km area. No, let’s do two of these 20km x 20km areas – one around Madison, WI, and one around Huntsville, AL – for comparison. And arrange the 5km x 5km boxes to have different %’s of lakes, urban, forest, etc. Cost is clearly f(resolution).

    What would we find for energy budget over a multi-year period?

    I guess the root question is “Has this already been done?”

    Cheers,
    BillN

  148. George E. Smith says:

    “”” Leonard Weinstein says:
    July 14, 2010 at 10:09 am
    Please observe the 1940 to present land record was a 0.6 C increase while the global (including ocean) was 0.4 C. This means the ocean alone was close to 0.3 C. When this is combined with land records that were taken away from cities (which have some argument as being less biased by UHI effects), it appears the actual global total should be closer to 0.3 C. This is not even taking into account less reliable sea data in the early record from a different process. Since the rise to 1940 is admitted to be mainly natural, it appears that an increase of about 0.3 C since, with possibly still some continuing recovery from the LIA occurring, is not such a big deal. The current projections from many, including some AGW supporters, that the next 20 years are likely to be cooling, seems to drive the whole AGW band wagon off the track. “””

    Leonard, when I see these stories; such as the brief note contained in this post of yours; I always find myself asking the same question; are these “ancient” (1940s) reports still talking about “anomalies” or are they claiming actual Global Temperature measurements; such as a real scientist might report, and actually referenced to some internationally recognised standard; such as the Kelvin Temperature scale for example.

    And then that invariably leads me to note the paper by John Christy et al, in, I believe, “Geophysical Research Letters” for January 2001; surely a reputable peer reviewed Journal.

    In that paper, Christy et al report on about 20 years of simultaneous records of oceanic water near surface (-1 m) Temperatures, and near surface (+3 m) lower tropospheric atmospheric Temperatures; which atmospheric Temperatures should surely fit in most appropriately with the data obtained from the standard Stevenson Screen “owl boxes” and whatever that other mini owl box is called; maybe it’s for bats.

    As you know; prior to that time (1980ish), which would include your 1940s, global records of air temperatures over the ocean (which is about 73% of the entire planet surface) were inferred by actually measuring, not the temperature of that air, but the temperature of a bucket of water hauled over the side from some quite unknown water depth, and then measured on deck (with winds). Since about 1929 (apparently), the water bucket started to be replaced by measurements, in a hot engine room, of the intake water picked up by the ship and evidently used for engine cooling and the like. Once again; such water being gathered from some quite non-standard water depth that depended on the specific vessel that reported the data.

    For some reason, totally unfathomable to me (pun intended), it was assumed, evidently back to the 1850s, that the water and air temperatures would be in equilibrium. This would seem almost self-evident; given that ocean currents are of the order of a few knots, and meander all over the place, while air speeds over the ocean can be upwards of 100 kph or higher at times; so naturally one would expect them to reach equilibrium. Well, to be pedantic, I would not expect that; but evidently many generations of Climate Scientists have believed it.

    So what Christy et al reported in Jan 2001 was that the simultaneous water and air temperature data, taken from a fixed water depth of one metre and a fixed air height of 3 metres over the same buoy, showed that the increase in temperature over that 20 year interval for the air measurements was about 40% less than what the water temperature measurements recorded. I’m ad libbing here about the 40%; maybe they said the water temps inflated the rise by 40% or some other relation; but you get the idea; the air temperatures recorded a sizeable reduction in temperature increase compared to what the water temperatures claimed for the same period and location.

    OK; simple problem; so now you have to take all the previous 150 years of oceanic temperature data; well all of that that Phil Jones hasn’t lost or thrown away; and you have to scale it back to 60% of what it was (growth wise) to get equivalent air data that can be properly merged with land based owl box temperatures measured up those poles by the barbecues.

    Well NOT SO FAST !! The key result that Christy et al reported; for just that limited 20 years of observation; was not the 60% factor; but they found that the water and air Temperature ARE NOT CORRELATED !!

    Not only are they not the same as had been historically believed for 150 years; affecting climate data for 73% of the global surface area; but they are not even correlated; which means that it is inherently impossible to go back and correct those erroneous water temperature data sets from the 1850s on; to arrive at comparable lower troposphere near surface air temperatures. The true air temperatures are NOT recoverable.

    So excuse me if I take a jaundiced view of world climate data from the 1940s; I don’t believe any temperature measurement data for the earth’s oceans, and by inference for the whole planet, that precedes 1980, when those first oceanic buoy measurements were begun; well, when Christy et al began their collection.
    I did actually e-mail Professor Christy about that paper; and my recollection is that he said they found some correlations in some parts of the ocean. But he clearly would not have reported a lack of correlation unless he had found a statistically significant lack of correlation somewhere. Well, maybe the best way to put that would be statistically insignificant, rather than significant. I’m not going to put words in Prof Christy’s mouth. I heartily recommend that people read HIS paper and see what HE said.

  149. An excerpt of a report from the National Academy of Sciences as listed on a popular warmist website…
    Although our knowledge of climate change may be partial, we can be certain that our climate is changing and that human CO2 emissions are responsible. The US National Academy of Sciences issued a clear statement just a month ago which reads: “Some scientific conclusions … have been so thoroughly examined and tested, and supported by so many independent observations … that their likelihood of … being … wrong is vanishingly small. Such conclusions … are … regarded as settled facts. This is the case for the conclusion that the Earth … is warming and that much of this warming is very likely due to human activities.”
    (My comments on this topic at the Warmist blog)
    Studentskeptic says:
    Regarded as settled facts. This is the case for the conclusion that the Earth … is warming and that much of this warming is “very likely due to human activities.”

    1. I love the circular logic here, regarded as settled fact, the consensus, the science is settled….. “very likely”…

    Picture a doctor… We are absolutely sure that you have a health issue (although there are some who would disagree)… We are relatively sure that it is likely to be your male reproductive system that’s causing it, so we’ll have to remove it. It will take 33 years to remove your reproductive system piece by piece and there’s only a 0.06% change in your situation by removing your reproductive organs, but we’re very likely sure that is the cause.

    Who’s gonna sign up for that doctor? I’m monitoring that particular blog for the first volunteer. (snicker)

  150. EthicallyCivil says:

    So… the US hasn’t warmed. The Russian dataset is compromised for a variety of reasons. The SH is warming more slowly than the NH.

    Where is there a clear warming signal coming from?

    Civilly,

    EC

  151. Steve Fitzpatrick says:

    Zeke and Mosh,

    Thanks for this post; it is an excellent summary.

    The issue of UHI is for sure real, but its size does need better definition.

    FWIW… A modest UHI effect combined with slower than expected (based on models) heat accumulation in the oceans (ARGO data) means that the long term (200+ yrs) climate sensitivity is likely on the low end of the IPCC range…. on the order of 1.5 C – 2.0 C per doubling of CO2, 300 years out. The immediate sensitivity (20-30 yrs) looks more like 0.75 C – 1.0 C per doubling. Since the “age of carbon” will start declining within the next 40-50 years, due to supply limitations, the long term sensitivity is just never going to be seen. Ocean and biosphere absorption of CO2 will overtake emissions of CO2 within ~50 years, and atmospheric CO2 concentration will start declining.

    Immediate and forced draconian reductions in CO2 emissions, at huge economic and human cost, can’t be justified based on any reasonable estimate of future warming.

  152. George E. Smith says:

    “”” Steve Fitzpatrick says:
    July 14, 2010 at 11:19 am
    Zeke and Mosh,

    Thanks for this post; it is an excellent summary.

    The issue of UHI is for sure real, but its size does need better definition.

    FWIW… A modest UHI effect combined with slower than expected (based on models) heat accumulation in the oceans (ARGO data) means that the long term (200+ yrs) climate sensitivity is likely on the low end of the IPCC range…. on the order of 1.5 C – 2.0 C per doubling of CO2, 300 years out. The immediate sensitivity (20-30 yrs) looks more like 0.75 C – 1.0 C per doubling. Since the “age of carbon” will start declining within the next 40-50 years, due to supply limitations, the long term sensitivity is just never going to be seen. Ocean and biosphere absorption of CO2 will overtake emissions of CO2 within ~50 years, and atmospheric CO2 concentration will start declining.

    Immediate and forced draconian reductions in CO2 emissions, at huge economic and human cost, can’t be justified based on any reasonable estimate of future warming. “””

    Steve, where is to be found the definitive paper, presumably in some peer-reviewed recognised climate journal, that proves that the concept of “climate sensitivity” is valid; which is to say that T2 − T1 = cs · log2(CO2_2 / CO2_1), with base-2 logarithms of course.

    Such a paper would presumably incorporate credible (peer recognised) data sets of Mean Global Surface Temperature, and also of Atmospheric CO2 abundance for that same period of time, showing that the graph is clearly Logarithmic, as distinct from Linear or of any other mathematical functional possibility, within the uncertainty limits of that recognised data; and/or such a paper would provide some rational and peer reviewed and recognised Physical theoretical basis for believing that those two data sets should be connected by a logarithmic function, as distinct from any other mathematical relationship.

    I’ve been trying for years to locate either the theoretical Physics basis for “Climate Sensitivity” (cs) or the empirical actually measured data (as distinct from proxy “data”) that establishes the concept of Climate Sensitivity; as a fixed Temperature increase in mean global surface Temperature per CO2 doubling in the atmosphere; so far with no such luck.

    I have even read/heard anecdotally that in fact Professor Stephen H. Schneider of Stanford University is the inventor and father of “Climate Sensitivity”; and I haven’t been able to confirm either that, or locate his defining paper that establishes what seems to be the Rosetta Stone of Climate Science.
    (cs) is apparently the “Planck’s constant” or “Boltzmann’s Constant” of Climate Science; a fundamental constant of Physics; yet I can’t locate the origins of the theory.

    You seem to be knowledgeable on the subject; so where are the seminal papers on “Climate Sensitivity” ?
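    For concreteness, the functional form being questioned is easy to write down; the sensitivity value below (3.0 C per doubling) is purely an illustrative placeholder, since its true value is exactly what is in dispute:

```python
import math

# The relationship quoted above: T2 - T1 = cs * log2(CO2_2 / CO2_1).
# cs = 3.0 C per doubling is only an illustrative placeholder; whether
# any such fixed constant exists is the point of the question.
def delta_t(cs, co2_start_ppm, co2_end_ppm):
    return cs * math.log2(co2_end_ppm / co2_start_ppm)

print(round(delta_t(3.0, 280, 560), 2))   # one full doubling -> cs itself
print(round(delta_t(3.0, 280, 390), 2))   # pre-industrial to ~2010 levels
```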

  153. Mac the Knife says:

    George Smith 11:10AM

    Spot on, George!
    The prior ocean temp data is functionally worthless, for all of the reasons you accurately describe… and more. Ship engine cooling water temperature, measured ‘at the inlet source’, not only varies in sampling depth from ship to ship based on the inlet location; it varies continuously on each ship as the fuel and cargo load changes. We have no way of determining what depth (within roughly the range of minimum to maximum draft depth per ship) any given data point relates to, rendering the entire data set meaningless.

    Add to that the lack of correlation of ocean water temp to near ocean atmospheric temps, and all attempts at using historical ocean temp measurements as ‘proxies’ for historical atmospheric temps becomes valueless as well.

    GIGO and GIGO…..

    But, perhaps if we adjusted them a bit…….. ? };>)

  154. Steven Mosher says:

    “underlying mechanism. Well, the results are consistent with and confirm the theory of GHG warming, espoused BEFORE this data was collected.”

    Which of these two graphs suggests the better correlation Mosh?

    http://tallbloke.files.wordpress.com/2010/07/soon-arctic-tsi.jpg

    Take your time…”

    Well since the slide you refer to

    1. doesn’t consider the entire globe.
    2. doesn’t even transform CO2 properly (if the effect is a LOG effect you had best transform the predictor variable, like DUH)

    I much prefer this:

    http://comp.uark.edu/~jgeabana/gwfig1ab2.png

    if you think that GHGs do not cause warming (as in water vapor, CO2, methane) then
    1. Talk to Christy, Monckton, Lindzen, Spencer, Willis.
    2. Never trust another satellite sensor again.
    3. Return your cell phone; it’s violating laws of physics.

  155. Steven Mosher says:

    Steve Fitzpatrick says:
    July 14, 2010 at 11:19 am (Edit)
    Zeke and Mosh,

    Thanks for this post; it is an excellent summary.

    The issue of UHI is for sure real, but it’s size does need better definition.”
    *******************************************************************
    The size of the UHI effect can be approached in several ways. I prefer to bound it from above, as in “not higher than”. There are a couple of ways to do that, but I’d suggest that looking at the UAH record of tropospheric temps, specifically the trends there from 1979 onward, gives you an upper bound for the effect from 1979 to present.
    Prior to 1979, that’s a lot harder because the effect (as Spencer notes) could be non-linear, at least initially. There are some folks working on estimates, but the bottom line is that the UHI effect is a small contribution to the overall warming. The relative size of land to ocean makes it hard for the UHI bias to have any leverage in the final numbers.

    “FWIW… A modest UHI effect combined with slower than expected (based on models) heat accumulation in the 0ceans (ARGO data) means that the long term (200+ yrs) climate sensitivity is likely on the low end of the IPCC range…. on the order of 1.5C – 2.0 C per doubling of CO2, 300 years out. The immediate sensitivity (20-30 yrs) looks more like 0.75 C – 1.0 C per doubling. Since the “age of carbon” will start declining within the next 40-50 years, due to supply limitations, the long term sensitivity is just never going to be seen. Ocean and biosphere absorption of CO2 will overtake emissions of CO2 within ~50 years, and atmospheric CO2 concentration will start declining.

    Immediate and forced draconian reductions in CO2 emissions, at huge economic and human cost, can’t be justified based on any reasonable estimate of future warming.”

    I’d associate myself with those comments, they are in the Lukewarmer camp.

    A. Adding GHGs will warm the planet, not cool it.
    B. The true sensitivity is at the low end of the IPCC range.

    Yes, I’m a true believer, whatever that means.

  156. Steven Mosher says:

    George Smith:

    “I’ve been trying for years to locate either the theoretical Physics basis for “Climate Sensitivity” (cs) or the empirical actually measured data (as distinct from proxy “data”) that establishes the concept of Climate Sensitivity; as a fixed Temperature increase in mean global surface Temperature per CO2 doubling in the atmosphere; so far with no such luck.”

    You should well know that a sensitivity or gain cannot be derived. In the same way that the gains for a flight control system cannot be derived. Sensitivity can be bounded by simulation and modelling (within big boundaries) and it can be inferred from observations if you are lucky.

    If you have questions.. start with James, and try to look a bit harder.

    http://julesandjames.blogspot.com/2006/09/can-we-believe-in-high-climate.html

    http://julesandjames.blogspot.com/2006/03/climate-sensitivity-is-3c.html

  157. Yarmy says:

    Given that the general agreement of these independent reconstructions provides strong evidence that nobody has been fiddling the figures, it makes the UEA et al obstructions and prevarications all the more baffling and ultimately Pyrrhic.

    REPLY: Data fiddling takes place in adjustments to the raw data; that is where the real issue lies, along with UHI/microsite effects -A

  158. Steven Mosher says:

    Steve where is to be found, the definitive paper; presumably some peer reviewed recognised climate journal; that proves that the concept of “climate sensitivity” is valid; which is to say that T2 − T1 = cs · log2(CO2_2 / CO2_1); of course base 2 logarithms.

    Why do people who refuse to read the primary texts argue that they don’t exist?

    Odd.

    Start here

    http://www.agu.org/pubs/crossref/1998/98GL01908.shtml

    http://folk.uio.no/gunnarmy/paper/myhre_grl98.pdf

    The basic inputs would be from HITRAN

    HITRAN is where we start (it’s a database anybody who has worked with sensors can tell you about; you can’t make good bombs or missiles actually WORK without it).

    Then you would work with an RTE, Hmm I think they used

    GENLN2 and some others

    http://adsabs.harvard.edu/abs/1992ggll.rept…..E

    or a better list

    http://www.cramster.com/reference/wiki.aspx?wiki_name=List_of_atmospheric_radiative_transfer_codes

    I’m most familiar with MODTRAN.

    So ya, you start with the database of molecules. A database that those of us who built weapon systems to protect the free world rely on. Then you use RTE, radiative transfer equations. ya, the same physics we use to design weapons systems and sensor systems that protect the free world. So, basically same data, same theory, different purpose.

    If you want to blame somebody for the advances in climate science (seeing the role of CO2), blame the Air Force. It’s data and code generated initially for them. Go figure.
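    The Myhre et al. (1998) paper linked above is the source of the widely quoted simplified CO2 forcing expression, delta-F = 5.35 · ln(C/C0) W/m². As a sketch:

```python
import math

# Simplified CO2 forcing expression from Myhre et al. (1998), linked
# above: delta_F = 5.35 * ln(C/C0) in W/m^2. Note this gives a forcing,
# not a temperature; converting forcing to temperature needs a
# sensitivity parameter, which is exactly the contested quantity.
def co2_forcing(c_ppm, c0_ppm):
    return 5.35 * math.log(c_ppm / c0_ppm)

f_doubling = co2_forcing(560, 280)   # forcing for one doubling
print(round(f_doubling, 2))          # ~3.7 W/m^2, the textbook figure
```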

  159. Steven Mosher says:

    Thanks Zeke,

    The code will be posted as a turn-key package with the basics covered.
    I’ve pulled out a subset of all the code and passed that on to Ron B, as I would like to help him (if possible) get something that is rewritten from the ground up.

    I’ll make a version available in a week or so that folks can play with. It’s directed at beginners in R.

  160. BillyBob says:

    “if you think that GHGs do not cause warming (as in water vapor, CO2, methane) then 2. never trust another satellite sensor again.”

    You have satellite data for the 1930s?! Great. Can I see it?

    As for your UHI comments… without min/max, you cannot even begin to consider whether UHI is contaminating the GTR.

  161. Tenuc says:

    George E. Smith says:
    July 13, 2010 at 5:31 pm
    “The min/max daily temperature reading fails the Nyquist sampling criterion; and the spatial distribution of the set of thermometers also fails the Nyquist test; and by orders of magnitude; so there’s no way that any recovered average can be correct because of the aliasing noise; and the result is meaningless anyway.

    As I have said many times GISStemp calculates GISStemp; and nothing else; same goes for HADcrud.

    And even if one did actually measure the true average Temperature of the globe; it is not related in any way to the energy flows; so it tells us nothing about the stability of earth’s energy balance.”

    I’m with you all the way George. GMT is a useless proxy for the amount of energy held in our climate system, and climate is driven by deterministic chaos. Even if you could measure temperature accurately enough to be able to isolate the tiny bit caused by carbon-based GHGs, linear trends wouldn’t tell us anything worthwhile. However, this is the marker which has been chosen by climate scientists to confirm or deny CAGW, so we have to live with it, even though it has feet of clay.

    Steven Mosher’s post shows that the same basic ‘rawish’ data sets, run through various models, all seem to give the same sort of time series result for GMT anomaly, which is very small. Steven has demonstrated that the discussion needs to move on to identify factors other than man-made GHGs, so we can see the true magnitude of the effect.

    I think this needs to be split between naturally caused changes (e.g. solar, geothermal, ocean cycles, biosphere) and non-GHG effects of man (e.g. UHI, roads, farming, energy use, forestry, industrial pollution). All factors need to be considered if a true estimate of climate sensitivity is to be produced.

  162. Steve Fitzpatrick says:

    Steve Mosher,

    Thanks for answering George Smith… I have not the time.

    Cheers.

  163. George E. Smith says:

    “”” anna v says:
    July 13, 2010 at 8:56 pm
    It is true that the climate community has managed to focus the world’s attention to the temperature anomalies, and by sleight of hand taken the attention from the temperatures themselves and how badly the models reproduce them.

    I am with George Smith on this . “””

    Anna; coming from somebody with your background; it is comforting to know that I am not alone, in seeing what is in the Emperor’s wardrobe.
    Imagine a different playing field; say one that is actually more in tune with Professor Stephen H. Schneider’s actual academic qualifications; the field of biology.

    Suppose we pick say 7000-8000 locations around the globe; here and there of a size not too unlike the space taken up by a typical “weather station” much like Anthony’s survey has shown us.
    And we want to study data on animals for the World Wildlife Fund; that never used to be known as the WWF, until Vince McMahon put the WWF on the map.
    So maybe we can assign say a Hectare to each of the 7000-odd sites. We go out periodically and we count the animals present on the station; well, we stipulate here that anything from the size of an ant on up is an animal. Doesn’t really matter what kind of animal it is. Does GISS take into account what make or model of barbecue grill it is at each official weather station? So an animal is as good as any other animal.
    Now we only keep track of the changes in the station’s animal population; it matters not a jot; whether the elephants, all left on safari; or whether a locust plague just moved in; one animal is as good as another on our animal anomaly chart.

    So now we have the means of monitoring animal anomalies all over the world; without regard to the fact that there are no Penguins in the Arctic Ocean.

    For 150 years through thick and thin, we simply report on the statistical summation of the animal anomalies garnered from our seven thousand locations; well from time to time, we may add some stations, or take some away. Well there’s the Urban Animal Dearth (UAD) problem that arises when some sheep farm is converted into an 8 lane freeway; we have to make corrections to the data.

    Well this is all very interesting and can employ countless numbers of otherwise unemployed and maybe unemployable biologists; but bottom line is.

    Does this process really tell the World Wildlife fund anything meaningful about the state of the earth’s fauna; or flora if we did that instead.

    You see there’s a basic assumption that the state of the earth’s animal populations as represented by those 7000 chosen locations originally is a GOOD representation of the original or baseline condition.

    Now I understand the anomaly concept as it levels mountains and canyons, and reduces the planetary surface to a billiard ball; rather clever idea actually; but it isn’t a good facsimile of reality.

    Now Mother Gaia really has a nicely equipped Laboratory; which is why she always gets the right answer. She literally has a thermometer on board each and every atom or molecule. Well maybe not formally; we like to statistically gather a number of molecules at any instant and take some form of average kinetic energy for those, as a representation of the Temperature; but we could add an asterisk, and simply say that we could do a time average of the energy of any single molecule for some short period and declare that to be the asterisked Temperature of that molecule. So in that sense Mother Gaia’s thermometers are everywhere and she integrates the consequences of all of them to decide over time; just what weather and climate she is going to allow.

    Well we don’t have that many thermometers; or animal accountants either; so we have to SAMPLE.

    Well sampling is something we have come to understand at WUWT. You walk up to a tree, with something like a laboratory cork borer; and you pick a spot about chest high, where you can work conveniently, and we drive the cork borer into the tree; hopefully pointed about in the direction of the very center of the roughly circular outline; and we extract a nice core of layered samples of each of the annual growth rings; that we happen to hit on.

    Now there’s that old Murphy’s Law joke that says that that single core that we just bored from the tree, is a PERFECT sample of everything that is inside that tree. Murphy’s law says that is so; and we can prove it, by cutting down the whole tree, and running it through a sausage slicer; we could call it an Ultramicrotome instead; and then cutting the slices into narrow segments or sectors; and when we examine each and every single element of each of the rings of that entire tree we will find that every piece is absolutely identical to the pieces in the original cork borer sample. That is what Murphy’s law says; but it also adds that the trick is to stop when you extract the borer core; and not assassinate the entire tree.

    Well as we know too well from actual full slices of murdered trees, our bored core is anything but a representative sample of that complete former glorious edifice that was a tree.

    Well that is the problem of SAMPLED DATA SYSTEMS. How the hell do we know that the sample(s) we take of some continuous function (of maybe several variables), is a truly representative sample of the entire function.

    Well fortunately we do know the answer to that question; and the entire modern world of high density high bandwidth communications, is totally dependent on the sure knowledge that we know when enough is enough.

    If we have a continuous, let’s say time varying, function (of whatever); in general that function cannot change its value in zero time; it takes some finite time to change from one value to another. We say the function is “Band Limited”. The continuous time function can be represented by some Fourier series or integral of functions with different frequencies, the simplest being a sum of sinusoidal components; having some maximum upper frequency; beyond which no material signal components exist; that maximum frequency being the bandwidth or band limit of the signal; and remember that instead of time; it could be a function of ANY variable or set of variables.

    In the case of the time varying function; it is known that we can represent the band limited signal COMPLETELY by a regular sequence of INSTANTANEOUS samples of the continuous signal; measured at discrete times. And the theory tells us that we can exactly recover the band limited continuous signal from just those instantaneous samples; IF WE DO IT ALL CORRECTLY.

    So why is that useful? Well, if we have a voice telephone signal that contains NO components above say 3 kHz, and we have a communications “channel” with a maximum signal frequency capability of say 500 kHz, we know it is possible to transmit a signal pulse of some amplitude and just one microsecond width on that channel. So we could sample our voice signal at say 10 kHz, every 100 microseconds, create a one-microsecond pulse from each sample, and send them all down the line. Well hell; we have another 99 microseconds in between our samples that isn’t doing anything useful. We could take another 99 telephone calls, do the same thing to those, and then interleave all 100 sets of samples and send them down the same wire. With a little bit of overhead catalog information, we can sort out the pulses at the other end of the wire and send them to 100 different telephones; and then we can reconstruct all those messages perfectly (in theory).

    So we do know for absolute certainty that sampled data theory works; and a hell of a lot better than I have described it here.

    So what does this have to do with measuring the earth’s temperature? Well, the theory says that we don’t have to be as well endowed as Mother Gaia is. We don’t need a thermometer in every molecule.

    Sampled data theory tells us that if we take our Temperature samples properly; that is by sampling CORRECTLY both in TIME and SPATIAL POSITION; then from just those samples we can reconstruct the continuous two dimensional data map that was our global temperature map in time and space; and then from that reconstructed data we are free to calculate any averages or other information about the set that we want; the information that was in the original signal is all there in the properly gathered samples.

    Well, the principal theorem of sampled data theory is the Nyquist Theorem, which says that any band limited continuous function can be fully represented by a set of instantaneous samples of the function’s value, provided we take at least one sample during each half cycle of the highest frequency present in our band limited function. In the case of our 3 kHz voice phone call, we needed to sample at at least 6 kHz to satisfy the requirement. We chose to do it at 10 kHz instead, to have some margin.

    So what happens if our signal was not band limited like we assumed? Suppose there is some screaming child in the background of our phone call, and he is putting out some 6 kHz harmonics in his wailing, which is 1 kHz beyond the 5 kHz Nyquist limit of our 10 kHz sampling. We will find, in practice and in theory, that upon reconstruction of the signal from the samples we now have a spurious (aliasing noise) signal that was NOT present in the original; and since it started at 1 kHz beyond our limit, the reconstructed signal will contain a component 1 kHz BELOW it, at 4 kHz. We originally had NO signal at 4 kHz; now we do; and since 4 kHz is inside the reconstructed band, it is inherently impossible to filter that spurious noise out and get rid of it.
    If our child was shrieking with some 10 kHz harmonics, equal to our sample rate and just twice our Nyquist limit, the aliased noise lands at 10 − 10 = 0 kHz: zero frequency. We have another word for the zero frequency component of ANY signal. We call it the AVERAGE VALUE OF THE SIGNAL.
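    A quick numerical sketch of the folding described above; the 6 kHz tone and 10 kHz sample rate come from the example, while the FFT peak-finding is my own illustration:

```python
import numpy as np

fs = 10_000                      # sample rate (Hz), as in the phone example
n = 2048
t = np.arange(n) / fs

# A 6 kHz tone sits 1 kHz above the 5 kHz Nyquist frequency of fs
x = np.sin(2 * np.pi * 6_000 * t)

# In the sampled spectrum the energy has folded down below Nyquist
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(n, d=1 / fs)
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)                   # lands near 4 kHz, not 6 kHz
```

    Nothing in the samples distinguishes the aliased 4 kHz component from a genuine one, which is the point being made.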

    Hey, isn’t that exactly what we were trying to calculate in the first place: the AVERAGE temperature, over time and space, of our planet?

    Well, sadly, we only have to undersample our signal by a factor of two, in either time or space, from what the Nyquist Theorem tells us we need, and we end up with a non-removable noise corruption of the very thing we were trying to determine: the average value of our continuous function.

    Min/max daily measurement already violates Nyquist by at least a factor of two, since the diurnal temperature cycle is not a single-frequency sinusoid; it has at least a second-harmonic 12-hour periodic component, which sits right at our sample rate of twice-daily measurement.
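    The min/max point can be made concrete with a toy diurnal cycle; the amplitudes below are invented for illustration, and only the presence of a 12-hour harmonic matters:

```python
import math

TRUE_MEAN = 15.0   # invented baseline temperature (deg C), purely illustrative

def temp(hour):
    """Toy diurnal cycle: a 24 h fundamental plus a 12 h second harmonic."""
    w = 2 * math.pi * hour / 24
    return TRUE_MEAN + 5 * math.sin(w) + 2 * math.cos(2 * w)

# Dense sampling (every 0.01 h) recovers the true daily mean
samples = [temp(h / 100) for h in range(2400)]
fine_mean = sum(samples) / len(samples)

# The (min + max) / 2 estimate does not
minmax_mean = (min(samples) + max(samples)) / 2

print(round(fine_mean, 3), round(minmax_mean, 3))
```

    Dense sampling recovers the 15 °C mean, while the (min+max)/2 estimate lands more than a degree and a half low; the bias depends entirely on the shape of the harmonic content.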

    And of course; with 7000 odd global position sampling stations; it is a total joke to talk about sampling strategy.

    Sampling theory does not require regularly spaced samples; they can be at different intervals; but the maximum separation cannot exceed the Nyquist interval of one half the period of the highest signal frequency; and any non-uniform sampling regimen requires an even higher number of samples, not a lower number.

    So no; I do not place any confidence in whatever results the experts come up with for the Mean Global Temperature; their problem is not with statistical methodology; the central limit theorem is not a cure for a Nyquist headache.

    Now I’m not going to say that global anomaly sampling is worthless; very little information is totally worthless.

    But I wish they would stop making claims for it that are unsupportable.

    And Anna; back to my original point; I’m happy to read that you are perhaps somewhat like minded. Wish I knew all the particle Physics of your experience.

  164. George E. Smith says:

    Just a brief note here for Steven Mosher, and Zeke Hausfather.

    Please be advised that I AM NOT here knocking your work or the paper/essay that you presented here; I have not fully digested it yet; and Steve, I did see and appreciate your clarification of just what grid sampling process you (or others) use. You essentially cleared up my confusion.
    When you were talking grids, I had visions of Peter Humbug and his Playstation models, with the gridded computer modelling of whatever it is that he models. So I just wanted to be sure that what you were talking about was not the same thing.

    As I said, I haven’t yet fully digested the detail of your extensive paper here, that looks like you and Zeke have put a lot of effort into.

    I usually start to grasp the essence of some of those papers, and Steve Goddard’s too, about the time they disappear off the bottom of Anthony’s very busy menu. So I am often reading, and sometimes posting, a full two pages of posts below the current page.

    So hopefully, by the time I digest your essay, I will have some idea of what the different methodologies do in the way of changing the apparent results.

    My rant about the whole concept of anomalies is NOT addressed to you or anyone else who posts information or analyses such as the two of you presented here.

    And make no mistake; I am constantly learning from the efforts of posters who present information like this.

    Sometimes I think Willis thinks I’m a bloody pest; which I am; but I do appreciate his efforts as intensely.

    Thanks again; for your analysis; and yes I did see your response to my questions about the gridding Steven.

    George

  165. sky says:

    RomanM (July 13, 2010 at 4:23 pm):

    Having worked extensively with LSE methods, I’m confident of the results when there is a homogeneous spatial field and a consistent datum-level is maintained by the available data. Certainly, it is an attractive approach in principle. As a practical matter, however, I’m not so confident that offsets in station-records are effectively treated when the data from neighboring stations is spatially inhomogeneous and has a nondescript temporal bias introduced by intensifying UHI.

    As a concrete example, consider the two stretches of record (through 1950 and later) for Gibraltar, whose station was moved to a more-inland location 7 km away from the original site. The closest station whose record overlaps the break is a very short segment from across the strait at Tanger. Long overlapping records are available only from Casablanca, Lisboa, and Marseilles, all of which show pronounced, but non-uniform, UHI warming. Perhaps you could allay the skepticism of Sherrington and others by demonstrating how your algorithm estimates the offset in the post-1950 stretch of the Gibraltar record.

  166. George E. Smith says:

    “”” Steven Mosher says:
    July 14, 2010 at 1:00 pm
    Steve, where is to be found the definitive paper, presumably in some peer reviewed, recognised climate journal, that proves the concept of “climate sensitivity” is valid; which is to say that T2 − T1 = CS · log2(CO2_2 / CO2_1); of course base-2 logarithms.

    Why do people who refuse to read the primary texts argue that they don’t exist? “””

    Well, Steve, I don’t recall ever saying these texts don’t exist. I’ve been Googling Stephen H. Schneider till I am blue in the face, trying to find whatever paper it was in which he reputedly coined the term “Climate Sensitivity” and pointed out (presumably) its logarithmic relationship; which is quite inherent in the very concept of a fixed temperature increase for any doubling of the atmospheric CO2 abundance. So far it hasn’t popped out; so maybe my starting assumption that he is the father of “Climate Sensitivity” is all wrong, and somebody else “discovered” the logarithmic relationship.

    Now I’ve read all kinds of folk-tale descriptions that talk about the available CO2 “trapping sites” getting removed (as busy with another capture), so that the “effect” “tapers off” with additional CO2.

    But “Tapers off” or “expansion of the wings” of some absorption spectrum is not what I usually construe the term logarithmic to mean.

    I understand exactly what the logarithmic/exponential relationship is, and constantly use it over maybe six orders of magnitude (of current) in the case of the forward voltage of semiconductor diodes; so that is what I envision when somebody tells me that in climatology there is a constant global surface temperature offset that accompanies each doubling of CO2.
    That is a little bit surprising, given that the Mauna Loa data set has so far not observed even one third of one doubling of CO2; and that, presumably along with the post-1958 global temperature data, would seem to be the information that shows this logarithmic relationship.

    And for small increments the logarithmic function is simply linear in the variable; which raises the question of how one tells the difference, given that the slope of the purported straight line T/log CO2 is only known within a 3:1 uncertainty range.

    So the issue is not one of my making; I have simply accepted the IPCC and other authoritative sources of information, and followed where that leads.

    So I don’t say the information isn’t there; or doesn’t exist; and I have read countless hours of the “primary texts” most of which in no way addresses that single issue.

    I have even considered writing to Professor Schneider and asking him for a copy of the original defining paper (if he was indeed the originator), but I suspect he simply wouldn’t respond. I don’t even get responses from acknowledged skeptical authors when I query them about their work.

    John Christy was singularly forthcoming in responding to a query of mine; about his paper on the ocean buoy measurements.

    Either I am asking the question incorrectly or for some reason; I am not making it clear what I am trying to learn.

    But asserting that there is no such information has not been one of my approaches.

  167. sky says:

    It’s sad to see that what started out as a methodological discussion of how the “global temperature” is calculated, has morphed into position statements about what the results obtained from a highly incomplete and patently urban-biased GHCN data set show or don’t show about AGW. The fact of the matter is that we have consistent, truly global measurements only since the dawn of the satellite era. Prior to that, nothing resembling a credible time-series is available for the great bulk of the oceans and for vast regions of the land masses as well.

    Outside of well-traveled sealanes, rarely does one find enough observations in a 5-degree Marsden square to construct a truly reliable climatological summary over the past century, even under the assumption of homogeneity within the square. (Since SMOs are made 4 times a day, a credible time-series of monthly averages would require at least 4 x 365 x 100 = 146,000 observations, assuming continuous coverage.) Nor does substituting even more sparsely available SSTs for air temperatures, as is done by the Hadley Centre and others, provide anything beyond a stopgap measure to fill the void. We simply don’t know what the actual time-history of dry-bulb temperature over the world’s oceans looks like for most of the prior century.

    And even if we did, it would still leave us with a physically incomplete specification of total energy transfer from surface through the atmosphere to space. Contrary to what Steve Mosher avers here, that energy transfer is not by radiative means alone! Only those who are learning their physics from the mistake-riddled science-of-doom website can ignore the central role of evaporation from the oceans in the global energy budget. With ~1.5 m of the oceans evaporating each year, the latent heat transport far exceeds the net backradiation from the moistened atmosphere.

  168. charles the moderator says:

    w00t!

    Steve cleaned the kitchen today!

  169. Smokey says:

    Charles,

    Two words for you two: paper… plates. ☺

  170. Owen says:

    There are no urban heat islands in the troposphere (that I am aware of).

  171. anna v says:

    Yes, George, our two different approaches lead to the same conclusion.
    Mine is never to lose sight of the constants of motion, ( energy, momentum, angular momentum + some esoteric ones) whose conservation is absolutely independent of the type of solutions of the differential equations entering the problem.

    Thanks for your statistical/information-theory approach that teaches the same lesson. You give a very concise and clear summary.

  172. anna v says:

    BillN says:
    July 14, 2010 at 10:38 am

    can someone please point me to a cogent discussion of the right way to “average” the energy of two different locations. Is it the energy of the air? What about radiation, thermal “inertia,” etc.?

    Read George’s very clear outline of what not to do when extracting information statistically.

    Here is my physics view point for the specific problem, sun, earth, atmosphere:
    The energy comes mainly from the sun’s radiation impinging on the top of the atmosphere (gravitational input is ignored because it is much smaller; magnetic and electric even more so).
    It goes out in various forms again finally as mainly low energy electromagnetic radiation with various mechanisms.
    Input and output are not in equilibrium, and that is what is being discussed with AGW: that CO2 changes the equilibrium and allows higher temperatures to manifest than would exist without the excess.

    The problem is not in averaging energy. Energy once measured can be averaged easily since it is a scalar.

    The problem is in the measurement. The real answer to “is the earth warming” would come if we stood outside the stratosphere and measured all incoming and all outgoing radiation. The sign of the difference would give us a clue whether we are heating or cooling. In principle one could do it systematically with satellites and fully obey Nyquist’s criterion. I have not seen such an analysis. What we get from satellites are temperatures over the globe at various heights, which are then extrapolated mathematically to the surface.

    What is being done is using the temperatures of sampled locations on the surface of the earth as proxies of the energy, and anomalies of temperatures as proxies of temperatures. One convolution to go from energy to temperature, and another convolution (i.e., integration over large numbers of variables) from temperature to anomalies of temperatures.

    Suppose we do manage to measure temperatures well (Nyquist satisfied); how do we go to energy?

    It is the black-body radiation formula j = c*T^4, where j is the radiation flow and c the black-body constant. The change in c due to not being a black body is called “emissivity”, and is different for different materials. This holds for solids and liquids. For air, the radiation formula is something else.

    So the question “which temperature” is important. In principle, the skin surface temperature (the first mm of ground, water, or ice) is what should enter the formula. What one sees in all the climate essays is the temperature at 2 meters, which is what meteorological stations measure. Logical for gauging the weather; man lives in the first 2 meters above the ground. Illogical for measuring radiated energy, because it is the ground that radiates it.
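    A small numerical illustration of the temperature-as-energy-proxy problem described here: because flux goes as T^4, two regions with the same average temperature do not radiate the average flux. The 250 K and 330 K values below are hypothetical, chosen only to make the effect visible:

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def flux(t_kelvin):
    """Black-body emittance j = sigma * T^4 (emissivity 1 assumed)."""
    return SIGMA * t_kelvin ** 4

# Two hypothetical regions, at 250 K and 330 K, average to 290 K
flux_of_mean_t = flux((250 + 330) / 2)     # flux computed from the mean T
mean_of_fluxes = (flux(250) + flux(330)) / 2   # mean of the two actual fluxes

print(round(flux_of_mean_t, 1), round(mean_of_fluxes, 1))   # the second is ~11% larger
```

    So an average built from temperatures alone says nothing exact about the average radiated energy unless the spatial temperature distribution is also known.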

    Then the second level of distortions enters: that of using anomalies. Anomalies have a meaning when one knows the base, for specific locations. Averaging over the globe can have no quantitative meaning. It can only say: energy retained is increasing/decreasing.

    In my opinion this climate change experiment is designed all wrong and of course will come up with distorted results, as did Chicken Little when the drop of a feather was amplified to the drop of the sky.

  173. Edward says:

    Why all the praise? Would be much more thorough if Min/Max/Avg trends were each calculated and displayed as that is where the real mystery lies. C’mon, if we are going to dig into it, then let’s dig into it. Min/max/2 anomalies hide important data IMO. Please add to the reconstructions, maybe delineated by season, Spring/Summer/Fall/Winter, separate min and max and avg anomalies, for both Urban and Rural, and now we are talking. Otherwise, it’s all integrated noise…Until you take that step, you are just validating the integrated mess that we currently have to swallow. The mystery is in the mins no? and the urban vs rural? and the adjustments…dissect please…then put it back together.

  174. tonyb says:

    Steven or Zeke

    Why do you believe that the global sea temperatures back to 1880 have any merit as a scientific measure when clearly they don’t?

    Having met someone who actually took the bucket measurements, I can say “haphazard” would be too small a word to describe what went on. This haphazard activity took place along a tiny fraction of the World’s oceans, making the records even more pointless.

    Similarly, the peripatetic nature of land based stations and the way they are influenced by encroaching cities, the changes in their micro climate due to moves, poor siting, and the differences in the manner in which temperatures were taken (especially before the advent of min/max thermometers) also mean that individually they have some merit, but collectively they are just a jumble of material.

    To believe we have a reliable global record that can be parsed to fractions of a degree is to defy reality. (It was still a great piece of work though)

    tonyb

  175. Geoff Sherrington says:

    Steven Mosher says July 14, 2010 at 1:00 pm: “If you want to blame somebody for the advances in climate science ( seeing the role of C02) blame the Air Force. Its data and code generated initially for them. Go figure.”

    We presume the US military has its own set of weather stations, like at each missile silo site; and that these just might be fairly free of the need for adjustment, being somewhat fixed in position, as the USAF airports might also be.

    If the US military has its own climate set, it has potential value. You should ask them for it if they have not disclosed it. Go figure. I’m not a US citizen, they will not tell me. I just have to live with the USA stories that have been coming non-stop since Timothy Leary took LSD in California and Steven Schneider demonised global cooling.

  176. Pascvaks says:

    And the Indian Ocean? What’s it doing these days? Seems Ocean Temp (and Tides) on the rise.

    http://nsf.gov/news/news_summ.jsp?cntn_id=117322&org=NSF
    “Indian Ocean Sea-Level Rise Threatens Coastal Areas: Rise is especially high along coastlines of Bay of Bengal and Arabian Sea, as well as Sri Lanka, Sumatra and Java”

  177. George E. Smith says:

    Regardless of what data sets people manipulate, or how they report their results as far as temperature anomalies go, the simple fact remains that the IPCC, as in InterGOVERNMENTAL Panel on Climate Change, advises these governments as to what the “Mean Global Surface Temperature” of planet earth is, and what it is projected to be at some future time; seemingly always 100 years into the future.

    So the public perception of all of this is that what is being reported, and projected/forecasted, IS Global Temperature; not global anomaly.

    And they widely publicise the “Climate Sensitivity” as being the increase in Mean Global Temperature, for ANY doubling of atmospheric CO2; thereby proclaiming a logarithmic relationship.

    Now presumably, the physical origin of such a CO2-based relationship has to be the surface emittance of LWIR radiation, that being the initial source of the energy which the CO2 (or other GHG) is supposed to capture, creating warming.

    So taking the Temperature range of the colorful global map, at the top of the present paper; we have extremes of -81 to +47 deg C. This is far from the most extreme range; but it is good enough to illustrate some issues.

    Taking just three Temperatures; -81, +15, and +47 deg C for the min, max and global average, we have 192 K, 288 K, and 320 K Temperatures. The SB BB radiation limits for these Temperatures, then have factors of 0.1975, 1.000, and 1.524 respectively, and taking 390 W/m^2 for the global average, we would have; 77.0, 390.0, and 594 W/m^2.

    So these are the upper bounds on what surface emissions might be; and as we can see, the coldest regions are down by a factor of five, while the hotter regions are at least 1.5 times the global average, in terms of W/m^2 of surface LWIR emission.
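    A quick check of the arithmetic in the last two paragraphs; the 288 K / 390 W/m^2 reference pairing and the three spot temperatures are taken from the text, while the script itself is mine:

```python
T_REF, J_REF = 288.0, 390.0      # the text's reference: 288 K paired with 390 W/m^2

def sb_factor(t_kelvin):
    """Stefan-Boltzmann scaling relative to the 288 K reference: (T / 288)^4."""
    return (t_kelvin / T_REF) ** 4

for t in (192.0, 288.0, 320.0):  # map minimum, global average, map maximum
    f = sb_factor(t)
    # reproduces the 0.1975 / 1.000 / 1.524 factors and 77 / 390 / 594 W/m^2 figures
    print(f"{t:5.0f} K  factor {f:6.4f}  flux {J_REF * f:6.1f} W/m^2")
```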

    These are the numbers that CO2 has to deal with in the way of energy to capture; so already we have a strong temperature bias on the fundamental driving source that is supposed to power the CO2 warming engine; that is claimed to result in a logarithmic rise in Temperature with CO2.

    Now suppose we have some trace impurity (it could be CO2) in the atmosphere, such that a given thickness of atmosphere (maybe a cm) absorbs say 1E-6 of the incident radiation; so the transmission through that cm is 0.999999 of what came in. The next cm of air is also supposed to transmit 0.999999, so the total transmission of the 2 cm layer should be 0.999999^2, which is 0.999998000001; near enough to 0.999998. We conclude that three cm of air will transmit 0.999997 of the incident radiation.

    Clearly the light is being lost (absorbed) linearly with thickness, for all practical purposes. And we can deduce that if instead we keep the thickness constant and raise the CO2 amount by 2x or 3x, the absorption should likewise be linear with CO2 abundance; but note those little round-off errors which we discarded: they show it is not exactly linear, but in fact fits a behavior of the form t = exp(-alpha.s), where alpha is some absorption coefficient; and of course that exponential decay formula is nearly linear for very small arguments.
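    The compounding described above is the familiar Beer-Lambert form; a minimal sketch, assuming only the 0.999999-per-cm transmission figure from the text:

```python
import math

T_PER_CM = 0.999999            # one-cm transmission, the figure used in the text
alpha = -math.log(T_PER_CM)    # the equivalent absorption coefficient per cm

def transmission(thickness_cm):
    """Beer-Lambert form: t = exp(-alpha * s)."""
    return math.exp(-alpha * thickness_cm)

# Layer-by-layer compounding and the exponential form agree
print(T_PER_CM ** 2)           # ~0.999998000001, the figure in the text
print(transmission(3))         # matches 0.999999 ** 3 to rounding
```

    For arguments this small the exponential is indistinguishable from a straight line, which is exactly the "nearly linear" point being made.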

    This is the standard form of assumption of ordinary Optical absorption; but there is a basic assumption that we have overlooked; and that is the common presumption that the absorbed light disappears forever; never to see the light of day again.

    This is often true in a lot of optical materials; where the absorbed energy, ultimately is converted to heat and warms the absorbing glass or whatever; that heat to be ultimately conducted out of the sample.

    Some common materials have a different idea in mind. An example would be certain kinds of optical glass color filters that have a very sharp absorption cutoff at specified wavelengths.
    Take Schott glass RG645, for example. A 3 mm thick sample of this wine-red glass is supposed to have 50% internal transmission at 645 nm wavelength (you also have to allow for perhaps 4-5% Fresnel reflection loss at each surface).
    At 600 nm wavelength the transmission of this glass may be less than 0.01%; these are very sharp cutoff optical materials.

    You can prove this sort of high attenuation with a tunable laser, a monochromator, and a wide band sensor. You tune the laser to your desired wavelength, say 600 nm; then locate that line in the monochromator, and measure the signal drop when you insert the filter glass; and you do measure these extreme values of signal loss at quite nearby wavelengths.

    If you leave out the monochromator and remeasure the transmitted signal, you find you don’t get anywhere close to 10^-4 extinction; the “transmission” is very much higher.
    What has happened is that this series of glasses is quite fluorescent, or luminescent, and the absorbed laser energy simply stimulates some longer wavelength emission from the glass; so much of the energy passes right on through, but with a shift in wavelength. You can add another glass, say RG665 or RG695, and the same thing will happen; the energy is largely red shifted and re-emitted at a longer wavelength, with much less energy loss than the data sheet says.

    Well, now we are getting into some familiar territory, because this is about what GHGs are doing. The CO2 absorbs surface emitted LWIR in the 13.5 to 16.5 micron range, probably in a number of narrower closely spaced lines; but that energy is usually distributed to other atmospheric molecules in collisions before the CO2 has a chance to re-emit the absorbed photon.
    This warms the ordinary atmosphere; and that in turn radiates a thermal continuum spectrum.

    The problem in this case, is that the spectrum emitted by the warmed atmosphere is not too different from that which the CO2 absorbed in the first place; and it is quite likely that some of that radiation will in turn be recaptured by some other CO2 molecule repeating the process.

    As a result, we do not have a situation, where the absorbed energy simply vanishes from the scene; so the classical ordinary Optical absorption rules don’t work.

    So what happens if, say, a metre of air with CO2 in it should capture 90 or 99% of the incoming radiation, so there is little to transmit to the next layer of air, and we now double the CO2? Well, we’ve already caught most of the energy; so what else could happen?

    Well, not so fast. The absorption/re-emission process goes on unabated; and all that will happen is that that same percentage of the radiation will be captured in a thinner air layer, which will in turn presumably heat a bit more, since the mass of air is now lower; and then that air layer will re-emit in all directions to pass the energy on.

    This is why arguments of CO2 “saturation” are not too convincing.
    That is a process which works in the classical optical absorption case, where the light is captured forever. The capture/re-emission cascade does complicate the issue, because the atmospheric emitted LWIR radiation should be emitted isotropically, so it spreads around more, and doesn’t all go either up or down.

    But the point is that more CO2 will continue to absorb the LWIR; it just takes a thinner layer to do so, and the cascade string gets longer before the radiation ends up either back on the ground or escaped to space.

    But I don’t see anything in this process, that leads to a simple logarithmic temperature response; at least theoretically.

    The entire cascade process is something I have no intention of trying to work out; but I presume that it is doable with some supercomputer program; and has likely been done anyway.

    But I think it is important to realize that the CO2 or other GHG absorption process, does not conform to the simple classical Optical absorption rules, since the absorbed energy is actually re-radiated, in the same general spectral region, as the original surface sourced radiation. It is a mistake to think of the CO2 band as being “saturated” at some CO2 level.

    Luckily; we have the water to take care of all of that anyway.

  178. Steven Mosher says:

    Geoff:

    “We presume the US military has its own set of weather stations, like at each missile silo site; and that these just might be fairly free of the need for adjustment, being somewhat fixed in position, as the USAF airports might also be.

    If the US military has its own climate set, it has potential value. You should ask them for it if they have not disclosed it. ”

    I think you missed the point.

    The role the Air Force played isn’t generally talked about in the climate wars.

    1. Creation of a database HITRAN.
    http://en.wikipedia.org/wiki/HITRAN

    2. Study of the stratosphere, which is crucial in understanding how CO2 operates throughout the ENTIRE column (started in the 50s, I believe).
    here is some recent work– sensor related
    http://en.wikipedia.org/wiki/Stratospheric_Observatory_for_Infrared_Astronomy

    3. Development of RTE

    http://spiedl.aip.org/getabs/servlet/GetabsServlet?prog=normal&id=PSISDG002309000001000170000001&idtype=cvips&gifs=yes&ref=no

    Here is the simple point. Those of us who worked building weapons and systems for the Air Force had a keen interest in how radiation transfers through the atmosphere.
    We needed to understand how aircraft at say 50,000 feet would be seen on the ground by different sensors. What did they look like in, say, X band? Or IR, or you name it. Understanding how radiation transferred from the highest altitude to the ground was key in the development of Stealth. And the flip side is true as well:
    if we wanted to spot IR targets on the ground, we had to understand how that radiation would come up through the atmosphere. Those problems are well understood and not debated (seriously) by any person who ever had to build a system that people’s lives depended on. Period.

    The tools we used were HITRAN, a database of molecules and their behavior, and (in my case) MODTRAN, a computer model that simulated how radiation would transfer through the atmosphere. Back in the day, MODTRAN was classified.
    It’s now public (since 2000):
    http://www.kirtland.af.mil/library/factsheets/factsheet.asp?id=7915

    MODTRAN and the higher fidelity RTE (line by line) models are now everyday engineering tools based in solid physics. If they didn’t work, if they were not accurate, we could not build stealth aircraft, or IR missiles, or sensors that performed as expected. So when people tell me that CO2 has no effect on radiation escaping the earth, I have to say they don’t know what they are talking about.

  179. Steven Mosher says:

    “Why all the praise? Would be much more thorough if Min/Max/Avg trends were each calculated and displayed as that is where the real mystery lies.”

    1. Two years ago I started with daily data. Then I looked at hourly, convinced that the
    “mystery” is there. It’s not. There is no mystery. The world is warmer now than it was back in 1850. Not much of a mystery. How much? That’s a matter for calculation.

    ” C’mon, if we are going to dig into it, then let’s dig into it. Min/max/2 anomalies hide important data IMO. Please add to the reconstructions, maybe delineated by season, Spring/Summer/Fall/Winter, separate min and max and avg anomalies, for both Urban and Rural, and now we are talking. Otherwise, it’s all integrated noise…Until you take that step, you are just validating the integrated mess that we currently have to swallow. The mystery is in the mins no? and the urban vs rural? and the adjustments…dissect please…then put it back together.”

    Grab a compiler and chip in; plenty of work for you to do.

  180. Steven Mosher says:

    tonyb,

    Let me make it CLEAR.

    The QUALITY of the data is a SEPARATE QUESTION. SEPARATE… QUESTION.
    SEPARATE QUESTION. SEPARATE QUESTION. SEPARATE QUESTION. SEPARATE QUESTION. SEPARATE QUESTION.

    I build a calculator. To test it, I ask two women their age and weight.

    Woman A says: I’m 29 and i weigh 105 lbs.
    Woman B says: I’m 39 and I weigh 129 lbs.

    I’m testing a calculator. I’m not interested (YET) in whether or not these women are lying or telling the truth. I’m interested in whether the calculator gives the right answer ASSUMING the data is correct.

    I’m sorry, but I can’t put it ANY simpler.

    Maybe I’ll feed the beast some test data, just to make it clear
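    The analogy amounts to a unit test of the averaging code with the data taken at face value. A minimal sketch (the function and record names are mine, not from any actual codebase):

    ```python
    def mean_age_weight(records):
        """Average age and weight over a list of (age, weight) records.

        The test checks only the arithmetic; whether the reported
        numbers are honest is a separate question entirely.
        """
        n = len(records)
        mean_age = sum(age for age, _ in records) / n
        mean_weight = sum(weight for _, weight in records) / n
        return (mean_age, mean_weight)

    # The two "women" from the analogy, taken at their word:
    records = [(29, 105), (39, 129)]
    assert mean_age_weight(records) == (34.0, 117.0)
    ```

    If the calculator returns (34, 117) for these inputs, the calculator works; whether the inputs were truthful is a question about the data, not the code.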

  181. Steven Mosher says:

    I cleaned the kitchen and both bathrooms.

    so there!

  182. George E. Smith says:

    “”” Steven Mosher says:
    July 15, 2010 at 3:28 pm
    Geoff:

    “We presume the US military has its own set of weather stations, like at each missile silo site; and that these just might be fairly free of the need for adjustment, being somewhat fixed in position, as the USAF airports might also be. “””

    Steven; how true, that a lot of the data we have access to was obtained by the air force for their purposes.
    Back in the 40s-50s they were doing high-altitude studies, partly to find out what hazards future high-altitude pilots would face.

    These studies turned up early evidence that the apparent color temperature of the sun varied seasonally and randomly; and they deduced that the difference was in the near-UV region of the spectrum, where the sun deviated somewhat from the 6 kK BB spectrum.

    Those early spectral variations were clearly an indication that Ozone holes existed back then; before anybody ever thought about that.

  183. anna v says:

    George E. Smith says:
    July 15, 2010 at 3:14 pm

    Well not so fast. The absorption/re-emission process goes on unabated; and all that will happen is that that same percentage of the radiation will be captured in a thinner air layer; which will in turn presumably heat a bit more, since the mass of air is now lower, and then that air layer will re-emit in all directions to pass the energy on.

    This is why arguments of CO2 “saturation” are not too convincing.
    That is a process which works in the classical optical absorption case, where the light is captured forever. The capture/re-emission cascade does complicate the issue, because the atmospheric emitted LWIR radiation should be emitted isotropically, so it spreads around more, and doesn’t all go either up or down.

    But the point is that more CO2 will continue to absorb the LWIR; it just takes a thinner layer to do so, and the cascade string gets longer before the radiation ends up either back on the ground or escaped to space.

    But I don’t see anything in this process, that leads to a simple logarithmic temperature response; at least theoretically.

    The entire cascade process is something I have no intention of trying to work out; but I presume that it is doable with some supercomputer program; and has likely been done anyway.

    But I think it is important to realize that the CO2 or other GHG absorption process, does not conform to the simple classical Optical absorption rules, since the absorbed energy is actually re-radiated, in the same general spectral region, as the original surface sourced radiation. It is a mistake to think of the CO2 band as being “saturated” at some CO2 level.

    Luckily; we have the water to take care of all of that anyway.

    Well, George, I am not good at molecular-level physics :), but I got a good precis from Tom Vonk a year or so ago of what happens when a CO2 molecule absorbs an infrared photon. In a nutshell, it has not enough time to re-radiate it; the energy cascades down into rotational and vibrational states, with much smaller frequencies of emission, and those states eventually thermalize the N2, O2, etc. hitting the CO2. There is one little CO2 and 150 or so others around it. The result is thermalization, not re-emittance and re-absorption, from what I understood.

  184. tonyb says:

    Steven Mosher

    Obviously BOTH your women are lying: one about her weight, the other her age. And I’m not convinced that you cleaned BOTH bathrooms, let alone the kitchen. :)

    Yeah, why don’t you feed the beast some data. THEN we can start to examine the real truth about weight, age AND cleaning.

    Tonyb

  185. George E. Smith says:

    “”” anna v says:
    July 15, 2010 at 9:19 pm
    George E. Smith says:
    July 15, 2010 at 3:14 pm
    ………………………..
    Well, George, I am not good at molecular level physics :), but I got a good precis from Tom Vonk a year or so ago, of what happens when a CO2 absorbs an infrared photon. In a nutshell, it has not enough time to re-radiate it, the energy cascades down into rotational and vibrational states, with much smaller frequencies of emission, and those states eventually thermalize the N2 O2 etc hitting the CO2. There is one little CO2 and 150 or so others around it. The result is thermalization, not re-emittance and re-absorption, from what I understood. “””

    Well Anna, I believe you have it exactly right there; we don’t differ on that. Throughout the lower reaches of the atmosphere, where all the action takes place anyway, there’s not much chance for spontaneous re-emission from the excited CO2 molecule. Phil says there is, at stratospheric levels where mean free paths get long enough for that to happen; and that makes sense to me.

    But lower down, the collisions with ordinary air molecules distribute that extra energy and result in heating of the air, as Tom evidently explained it to you. I’m sure that is the correct picture.

    But the air and surface temperatures are not so different; so the air itself ultimately radiates a thermal spectrum; nobody ever told the sun, or the argon or whatever in ordinary incandescent lamps, that gases are not supposed to emit black-body-like radiation. So far as I know they do that quite well; it’s just that at 288 K, the 10.1-micron-peaked spectrum the air radiates is not perceived by human senses as “heat”.

    So my point was, and is, that the energy that is captured by CO2 from surface-emitted LWIR is not too different from what will be emitted from the (heated) atmosphere itself; so the GHG-captured energy does not stay captured as happens in classical optical absorption in solids; and that new emission from the air itself is a perfectly good target for other GHG molecules to go after for subsequent captures. That is what I mean by “cascades”: a continuous capture and thermalization, followed by further emission and further recapture of not-too-unlike LWIR radiation.

    So the normal exponential transmission decay due to ordinary optical absorption does not describe what really happens in the atmosphere. One could almost say that the CO2 is acting as a catalyst, to capture energy and deliver it to the atmosphere which will ultimately re-emit a similar radiation spectrum.

    So my view of the process is not different from what Tom evidently explained to you.
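    The capture/re-emission cascade being described can be caricatured as a random walk of photons through absorbing layers. This is a toy sketch of that picture only, not a radiative-transfer calculation; the layer count and absorption probability below are arbitrary illustrative numbers:

    ```python
    import random

    def escape_fraction(n_layers, absorb_prob, n_photons=100_000, seed=1):
        """Toy cascade: each surface-emitted photon walks up through
        n_layers; in each layer it is absorbed with probability
        absorb_prob and re-emitted up or down with equal chance.
        Returns the fraction escaping to space rather than returning
        to the surface."""
        rng = random.Random(seed)
        escaped = 0
        for _ in range(n_photons):
            layer = 0        # just above the surface
            direction = 1    # initially heading up
            while 0 <= layer < n_layers:
                if rng.random() < absorb_prob:
                    # absorbed, then re-emitted isotropically (up or down)
                    direction = 1 if rng.random() < 0.5 else -1
                layer += direction
            if layer >= n_layers:
                escaped += 1
        return escaped / n_photons

    # More absorbing layers lengthen the cascade and shrink the escape
    # fraction, without any layer being "saturated" in the classical sense:
    e5 = escape_fraction(5, 0.3)
    e10 = escape_fraction(10, 0.3)
    assert e10 < e5
    ```

    The point the toy model illustrates is the one made above: absorption here does not behave like classical one-way optical extinction, because every captured photon’s energy is eventually re-emitted and can be captured again.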

  186. Steven Mosher says:

    Well tonyb here is the thing.

    You disbelieve, on no evidence, that I cleaned both bathrooms and the kitchen.

    Now, I know in a way in which you will never know that I did in fact clean. Yet, on no evidence available to you, you doubt. Perhaps you should be a complete skeptic and doubt your doubt.

    I could of course have Charles vouch for me, but you might doubt his testimony.
    I could supply before-and-after pictures, and you could doubt their provenance and whether I in fact did the cleaning. I could produce a movie, but George Lucas has shown us what can be done with that. Simply, you could remain unconvinced that the sun came up today. At some point, to move forward, people have to examine the evidence for themselves, or describe beforehand what they will accept as evidence.

  187. tonyb says:

    Steven Mosher

    I certainly would never doubt CTM; after all, he could delete my posts. There’s a time for pragmatism and a time for scepticism :)

    Tonyb

  188. charles the moderator says:

    Mosh did clean the kitchen and both bathrooms, just a short amount of time after I purchased the Swiffer refills and cleaning solution.

    He still has a bunch of stuff by the front door to take down to Goodwill though.

  189. tonyb says:

    CTM

    So Steve presented his data selectively then? Hmmm. :)

    TonyB

  190. sky says:

    George E. Smith says:
    “CO2 absorbs surface-emitted LWIR in the 13.5 to 16.5 micron range; probably in a number of narrower closely spaced lines; but that energy is usually distributed to other atmospheric molecules in collisions, before the CO2 has a chance to re-emit the absorbed photon. This warms the ordinary atmosphere; and that in turn radiates a thermal continuum spectrum.”

    You are correct that the radiatively “inert” constituents of the atmosphere are the recipients of energy from GHGs. They, rather than the GHGs themselves, provide what little thermal retentivity the atmosphere possesses. But band saturation simply implies that there is no more energy at that wavenumber available, which in the case of CO2 is a few tens of meters above the ground. Every CO2 molecule above that level is redundant, because radiation at that wavenumber has become extinct.

  191. Gary Pearse says:

    After acceptance of the several mathematical ways global avg temps are computed, the residual problems are not just whether the data is properly manipulated or UHI accounted for and the like. As good as it all will be when we can all agree on proper treatment of the temperature data, the chief problems remain:

    a) Is the instrumental record (left as it is or twisted into knots) a big enough sample of earth temp history, even of the last millennium or two let alone millions of years, to be of any help in deciding whether the present temperatures and trends are within natural variability? We are talking here about a degree or two over two centuries. Let me be of help here. The answer is no. Even the most highly engineered system couldn’t match the performance of a theoretical planet with all its chaotic and multi-sourced influences (weather patterns, orbital, solar, clouds, cosmic radiation, progress of the solar system in the loops of its host spiral arm, volcanoes, earthquakes, planetary perturbations, asteroid/meteor impacts, movement of tectonic plates…) that kept itself within temp changes of a degree or two, up or down, in time periods of centuries. It is clear that modern climate science began with non-earth-scientists (astronomers and physicists) who noted an upward trend in temperature during the instrumental record of the last century or two. They shot their mouths off and grabbed the attention of the media. Then skeptics belatedly came out of the woodwork (notably geologists, archeologists and historians) to point out that Swiss villages that had been flourishing for a millennium or so in the lower valleys were crushed by advancing glaciers during the 18th century; that in the 18th/19th the Bosphorus froze over, New York harbour froze over and people walked to Staten Island, and London had “Frost Fairs”; and the Medieval Warm Period, when Vikings settled and farmed Greenland and were subsequently frozen out in the LIA; and the Roman Warm Period, when Hannibal crossed the Alps with his elephants (they had to eat a lot of grass where there is now snow);
    and we found a chap in leather with a quiver of arrows who died over 4000 years ago in an accident while hunting in a mountain pass, and who only came to light when hundreds of metres of snow and ice, which had subsequently buried him, melted down. The modern climate scientists, faced with this damning evidence, burned up over $50B concocting proxies to erase the LIA and the MWP and the RWP, instead of taking steps to advance the discipline and put things in perspective.

    b) Nothing one does about advancing methodologies of mincing temperatures into acceptable anomaly sausages will settle the issue of the magnitude of the effect of increased CO2. I can probably be of help here, too. If the industrial revolution of the past 150 years has resulted in only a degree C increase in temp (even forgetting about the contribution of natural rebound from the LIA), then we have nothing to worry about here. Natural variability will overwhelm it; many think a dip into cooler may be in the offing.

    c) Back to the business of adjusting temps a half a degree here or there. I have a proposal that makes good sense. If we have 60 thermometers (Nick Stokes) of long record, let’s keep them going, or replace them with something new alongside to have a duplicate record for when the old ones clap out. If GHGs are so powerful as to subdue natural variability, then we should eventually see this in multi-degree increases. It matters not what the average global temp might be, or if we could ever generate this artifact. The 60 instruments are enough on their own to answer the question of how serious this will become. If the record starts to bend down again after all this new CO2, then let’s relax.

  192. Hu McCulloch says:

    See my post on the First Difference Method over on CA, at
    http://climateaudit.org/2010/08/19/the-first-difference-method/ .
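    For readers unfamiliar with it, the first-difference method combines stations by averaging year-over-year changes rather than absolute temperatures, which makes it insensitive to each station’s absolute offset. A minimal sketch under the simplifying assumption of complete, aligned annual series (the function name is mine, and real implementations must handle gaps and station dropout):

    ```python
    def first_difference_average(stations):
        """Combine station series via the first-difference method:
        average the year-over-year differences across stations, then
        cumulatively sum to recover a combined series (base year = 0)."""
        n_years = len(stations[0])
        combined = [0.0]
        for t in range(1, n_years):
            diffs = [s[t] - s[t - 1] for s in stations]
            combined.append(combined[-1] + sum(diffs) / len(diffs))
        return combined

    # Two toy stations with very different absolute levels but similar
    # trends; the constant offsets cancel in the differencing:
    series = first_difference_average([[10.0, 10.5, 11.0],
                                       [20.0, 20.4, 21.1]])
    # series tracks the average year-over-year change, not the levels
    ```

    The cumulative sum step is what distinguishes this from simply averaging anomalies: only changes enter the combination, so a station joining the network at a high or low absolute temperature introduces no step.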

  193. rdr200 says:

    Thanks for the article.
    1) I am a non-scientist. I enjoyed the article, since it seems to be written to make it easy for folks like me to follow along.
    2) Is there a 101-type article slowly tracing some calculation from the database to the station anomaly, and then from the station anomalies to the global anomaly (with confidence interval)?
    3) It would be nice to see the graphs with confidence intervals around the results.
    4) It would be nice to see some graphs with, say, 1 deg × 1 deg boxes and confidence intervals.

Comments are closed.