New study shows half of the global warming in the USA is artificial

PRESS RELEASE – U.S. temperature trends show a spurious doubling due to NOAA station siting problems and post-measurement adjustments.

Chico, CA July 29th, 2012 – 12 PM PDT – FOR IMMEDIATE RELEASE

A comparison and summary of trends from the paper is shown below. Acceptably placed thermometers, away from common urban influences, read much cooler nationwide:

A reanalysis of U.S. surface station temperatures has been performed using the recently WMO-approved siting classification system devised by Météo-France’s Michel Leroy. The new siting classification more accurately characterizes the quality of a location for monitoring long-term, spatially representative surface temperature trends. The new analysis demonstrates that reported 1979–2008 U.S. temperature trends are spuriously doubled, with 92% of that over-estimation resulting from erroneous NOAA upward adjustments of well-sited stations. The paper is the first to use the updated siting system to address USHCN siting issues and data adjustments.

The new, improved assessment, for the years 1979 to 2008, yields a trend of +0.155 °C per decade from the high-quality sites, a +0.248 °C per decade trend for poorly sited locations, and a trend of +0.309 °C per decade after NOAA adjusts the data. Station siting quality is expected to be a comparable issue for the monitoring of land surface temperature throughout the Global Historical Climatology Network and in the BEST network.
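The trends quoted above are ordinary least-squares slopes fitted to the 1979–2008 series and scaled to °C per decade. A minimal sketch of that calculation, using a synthetic series with a known slope (the data here are illustrative, not the paper's station records):

```python
import numpy as np

def decadal_trend(years, temps):
    """Ordinary least-squares slope of temperature vs. year,
    expressed in degrees C per decade."""
    slope_per_year = np.polyfit(years, temps, 1)[0]
    return slope_per_year * 10.0

# Synthetic 1979-2008 series constructed with a known 0.155 C/decade trend
years = np.arange(1979, 2009)
temps = 0.0155 * (years - 1979)
print(round(decadal_trend(years, temps), 3))  # 0.155
```

Run on real station data, the same fit would be applied to annual mean anomalies rather than a noise-free ramp.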

Today, a new paper has been released that is the culmination of knowledge gleaned from five years of work by Anthony Watts and the many volunteers and contributors to the SurfaceStations project started in 2007.

This pre-publication draft paper, titled An area and distance weighted analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends, is co-authored by Anthony Watts of California, Evan Jones of New York, Stephen McIntyre of Toronto, Canada, and Dr. John R. Christy of the Department of Atmospheric Science, University of Alabama in Huntsville, and is to be submitted for publication.

The pre-release of this paper follows the practice embraced by Dr. Richard Muller of the Berkeley Earth Surface Temperature project, who, in a June 2011 “Science Talk” interview with Scientific American’s Michael Lemonick, said:

I know that is prior to acceptance, but in the tradition that I grew up in (under Nobel Laureate Luis Alvarez) we always widely distributed “preprints” of papers prior to their publication or even submission. That guaranteed a much wider peer review than we obtained from mere referees.

The USHCN is one of the main metrics used to gauge temperature changes in the United States. The first wide-scale effort to address siting issues, Watts (2009), a collated photographic survey, showed that approximately 90% of USHCN stations were compromised by encroaching urbanity in the form of heat sinks and sources, such as concrete, asphalt, air-conditioning heat exchangers, roadways, and airport tarmac. This finding was backed up by an August 2011 U.S. Government Accountability Office investigation and report titled Climate Monitoring: NOAA Can Improve Management of the U.S. Historical Climatology Network.

All three previous papers examining the station siting issue using early data gathered by the SurfaceStations project were inconclusive in finding effects of siting on the temperature trends used to gauge temperature change in the United States over the last century: Menne et al. (2010), authored by Dr. Matt Menne of NCDC; Fall et al. (2011), authored by Dr. Souleymane Fall of Tuskegee University and co-authored by Anthony Watts; and Muller et al. (2012), authored by Dr. Richard Muller of the University of California, Berkeley, founder of the Berkeley Earth Surface Temperature project (BEST).

Lead author of the paper, Anthony Watts, commented:

“I fully accept the previous findings of these papers, including those of the Muller et al. 2012 paper. These investigators found exactly what would be expected given the siting metadata they had. However, the Leroy 1999 site rating method used to create the early metadata, and employed in the Fall et al. 2011 paper I co-authored, was incomplete and didn’t properly quantify the effects.

The new rating method employed finds that station siting does indeed have a significant effect on temperature trends.”

Watts et al. 2012 employs a new methodology for station siting, pioneered by Michel Leroy of Météo-France in the paper Leroy 2010 and endorsed by the World Meteorological Organization (WMO) Commission for Instruments and Methods of Observation at its fifteenth session (CIMO-XV, September 2010) as a WMO-ISO standard, making it suitable for re-evaluating previous studies on the issue of station siting.

Previous papers all used a distance-only rating system from Leroy 1999 to gauge the impact of heat sinks and sources near thermometers. Leroy 2010 shows that method to be effective for siting new stations, as NCDC did when it adopted Leroy 1999 methods for its Climate Reference Network (CRN) in 2002, but ineffective for retroactive siting evaluation.

Leroy 2010 adds one simple but effective physical metric: the surface area of the heat sinks and sources within the thermometer viewshed, to quantify the total heat dissipation effect.
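As a sketch of how such a two-factor rating could work, here is a toy classifier that combines a Leroy 1999-style distance criterion with the Leroy 2010 surface-area criterion. The thresholds are invented for illustration and are not Leroy's published class boundaries:

```python
def site_class(nearest_sink_m, sink_area_pct):
    """Toy siting classifier: distance to the nearest heat sink/source
    (metres) combined with the share of the viewshed covered by sinks and
    sources (percent). Lower class numbers mean better siting; the
    thresholds below are illustrative, not Leroy's actual values."""
    if nearest_sink_m >= 100 and sink_area_pct < 1:
        return 1   # fully compliant
    if nearest_sink_m >= 30 and sink_area_pct < 10:
        return 2   # compliant
    if nearest_sink_m >= 10 and sink_area_pct < 25:
        return 3
    return 4       # non-compliant

print(site_class(120, 0.5), site_class(5, 50))  # 1 4
```

The point of the second argument is exactly the press release's point: a station the same distance from a small heat source and a large one should not receive the same rating.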

Using the new Leroy 2010 classification system on the older siting metadata used by Fall et al. (2011), Menne et al. (2010), and Muller et al. (2012) yields dramatically different results.

Using Leroy 2010 methods, the Watts et al 2012 paper, which studies several aspects of USHCN siting issues and data adjustments, concludes that:

These factors, combined with station siting issues, have led to a spurious doubling of U.S. mean temperature trends in the 30-year data period covered by the study, 1979–2008.

Other findings include, but are not limited to:

· Statistically significant differences exist between compliant and non-compliant stations, as well as between urban and rural stations.

· Poorly sited station trends are adjusted sharply upward, and well-sited station trends are adjusted upward to match the already-adjusted poor stations.

· Well-sited rural stations show a warming trend nearly three times greater after NOAA adjustment is applied.

· Urban sites warm more rapidly than semi-urban sites, which in turn warm more rapidly than rural sites.

· The raw-data Tmean trend for well-sited stations is 0.15 °C per decade lower than the adjusted Tmean trend for poorly sited stations.

· Airport USHCN stations show significantly different trends from other USHCN stations and, due to equipment issues and other problems, may not be representative stations for monitoring climate.

###

We will continue to investigate other issues related to bias and adjustments, such as time-of-observation bias (TOB), in future studies.

FILES:

This press release in PDF form: Watts_et_al 2012_PRESS RELEASE (PDF)

The paper in draft form: Watts-et-al_2012_discussion_paper_webrelease (PDF)

The Figures for the paper: Watts et al 2012 Figures and Tables (PDF)

A PowerPoint presentation of findings with many additional figures is available online:

Overview -Watts et al Station Siting 8-3-12 (PPT) UPDATED

Methodology – Graphs Presentation (.PPT)

Some additional files may be added as needed.

Contact:

Anthony Watts at: http://wattsupwiththat.com/about-wuwt/contact-2/

References:

GAO-11-800, August 31, 2011: Climate Monitoring: NOAA Can Improve Management of the U.S. Historical Climatology Network. Highlights Page (PDF), Full Report (PDF, 47 pages), Accessible Text and Recommendations (HTML)

Fall, S., Watts, A., Nielsen-Gammon, J., Jones, E., Niyogi, D., Christy, J., and Pielke, R.A. Sr., 2011: Analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends. Journal of Geophysical Research, 116, D14120, doi:10.1029/2010JD015146

Leroy, M., 1999: Classification d’un site. Note Technique no. 35. Direction des Systèmes d’Observation, Météo-France, 12 pp.

Leroy, M., 2010: Siting classification for surface observing stations on land. JMA/WMO Workshop on Quality Management in Surface, Climate and Upper-air Observations, Tokyo, Japan, 27–30 July 2010. http://www.jma.go.jp/jma/en/Activities/qmws_2010/CountryReport/CS202_Leroy.pdf

Menne, M. J., C. N. Williams Jr., and M. A. Palecki, 2010: On the reliability of the U.S. surface temperature record, J. Geophys. Res., 115, D11108, doi:10.1029/2009JD013094

Muller, R.A., Curry, J., Groom, D., Jacobsen, R., Perlmutter, S., Rohde, R., Rosenfeld, A., Wickham, C., and Wurtele, J., 2012: Earth Atmospheric Land Surface Temperature and Station Quality in the United States. http://berkeleyearth.org/pdf/berkeley-earth-station-quality.pdf

Watts, A., 2009: Is the U.S. surface temperature record reliable? Published online at: http://wattsupwiththat.files.wordpress.com/2009/05/surfacestationsreport_spring09.pdf

World Meteorological Organization Commission for Instruments and Methods of Observation, Fifteenth session, (CIMO-XV, 2010) WMO publication Number 1064, available online at: http://www.wmo.int/pages/prog/www/CIMO/CIMO15-WMO1064/1064_en.pdf

Notes:

1. The SurfaceStations project was a crowdsourcing project started in June 2007 and carried out entirely by citizen volunteers (over 650). It was created in response to the realization that very little physical site-survey metadata exists for the United States Historical Climatology Network (USHCN) and Global Historical Climatology Network (GHCN) surface station records worldwide. This realization came about from a discussion of a paper and some new information on Dr. Roger Pielke Sr.’s Research Group weblog, in particular a thread regarding the paper: Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res.

2. Some files in the initial press release had small typographical errors. These have been corrected. Please click on the links above for the new press release and figures files.

3. A work page has been established for Watts et al 2012 for the purpose of managing updates. You can view it here.

==========================================================

Note: This will be top post for a couple of days, new posts will appear below this one. Kinda burned out and have submission to make so don’t expect much new for a day or two. See post below this for a few notes on backstory. Thanks everybody!  – Anthony

NOTE: 7/31/12 this thread has gotten large and unable to load for some commenters, it continues here.

A. Scott
July 30, 2012 2:49 pm

Excellent story in The Register
Forget ‘climate convert’ Muller: Here’s the real warming blockbuster
Apply official WMO methods, warming shrinks massively

If new techniques endorsed by the World Meteorological Organisation are applied to official figures, over half of the global warming reported by US land-based thermometers between 1979 and 2008 simply disappears, researchers have found.
The new study used the same raw temperature measurements as US government federal scientific agencies, but the team deployed a revised metric that was better at taking into account the quality of the weather stations that housed the thermometers.
Previous studies have used a cruder metric to gauge station quality, which has to be taken into account so as to allow for the effect of asphalt, urban development and other local factors on the readings at any given thermometer. The new station-quality metric improves on older methods, not merely relying on distance but also the density of heat sinks and sources near the thermometers.
When the more sophisticated classification system is used, some dramatic results are seen. The new study reveals that the US National Oceanic and Atmospheric Administration (NOAA) discarded the temperature trend from the higher quality weather stations in favour of a warming temperature trend from low quality weather stations.
But the most extraordinary aspect is that this improved metric for categorising weather station quality has been endorsed by the World Meteorological Organization since 2010. It was proposed by Michel Leroy of Météo-France, the French state weather service, who devised its cruder predecessor in 1999

Much more in the story here

Skiphil
July 30, 2012 2:50 pm

Anthony’s paper shows why it matters that the level of attention to the sites and measurements has been (seriously) inadequate. Would physicists or chemists, biomedical researchers or engineers accept this low quality of data in most other specialties?? (let’s hope not) Now the GAO or some relevant entity needs to push NOAA to give proper attention to these matters, with up-to-date WMO site standards enforced universally (all kinds of political and management issues there, I realize).
From the GAO link above and the accompanying docs it is clear the USHCN network has not been managed at a consistent level of quality (even for the “old” site standards) and precision of measurements:
from “Highlights” of GAO report (Aug. 2011) on NOAA monitoring of USHCN sites

“…NOAA does not centrally track whether USHCN stations adhere to siting standards and the requirement to update station records, and it does not have an agencywide policy regarding stations that do not meet its siting standards. Performance management guidelines call for using performance information to assess program results. NOAA’s information systems, however, are not designed to centrally track whether stations in the USHCN meet its siting standards or the requirement to update station records. Without centrally available information, NOAA cannot easily measure the performance of the USHCN in meeting siting standards and management requirements….”

ed herold
July 30, 2012 2:53 pm

By the response that I’ve seen from the MSM thus far, the adage “if a tree falls in the forest and no one is there…” comes to mind

Skiphil
July 30, 2012 2:55 pm

p.s. Perhaps the USA and all (willing) countries need to create more new “model” WMO sites in an appropriate distribution, while maintaining existing sites for more comparisons of data…. budgetary issues, yes, but small in the overall scheme of climate related spending….. of course I realize that improving data quality going forward will not improve the existing historical record, except that it would create new data streams for comparisons by site, before and after the upgraded stations, etc.

July 30, 2012 2:57 pm

Joel Shore, well done. You managed to slag off Anthony, the entire WUWT community, Roy Spencer and promote Tamino all at the same time. Your concern is noted.

Alma
July 30, 2012 2:57 pm

I’m not science literate and I’m wondering. Is this like the kid who put the thermometer against the lightbulb to convince his mother he is sick and shouldn’t go to school?

michaelozannenne
July 30, 2012 2:59 pm

“There is also the issue of equipment inhomogeneity. Modern MMTS sensors show a significantly lower warming trend than the obsolete CRS shelters. Yet rather than lowering the trends of CRS stations, the trends of MMTS stations are, yet again, sharply adjusted upwards.”
OK I have hippo hide ,comes from years of quality management activity in the auto industry, I’ll ask the apparently dumb question…
One assumes that nobody waved a magic wand and decreed “let there be MMTS” there must have been an adoption R&R process where the new sensors were validated against the old across the full range of temperatures, seasons and sub-climates; in the presence of a calibrated third instrument more precise than both candidates; with data collected to estimate the error in the instrument, the error in the measurement process, drift rates and recalibration intervals. With the new instrument not adopted unless it was as or more accurate than the old, with a narrower estimate of instrument and systematic error.
Either an instrument can be trusted to x ± tolerance or you don’t use it. If you get a better ruler and it shows your old measurements are a load of dingo’s kidneys, then surely you have to at least expand the error bands on the history or discard it altogether…

Ally E.
July 30, 2012 2:59 pm

Twisters says:
July 30, 2012 at 9:22 am
So in conclusion, global warming is happening, but not as much as some other studies have found. But it is happening.
*
Coming up from the LIA, that’s no surprise and is completely natural. The last fifteen years or so has seen the temperature flatten out and it is now slowly tipping into decline. So, global warming was happening, yes, up until 2009 thereabouts. Now it’s not. The Earth is cooling down again, hopefully not too far.
From a “we’re going to fry” point of view, I think you can relax now. Oh, and tell your friends. 🙂

michaelozanne
July 30, 2012 3:03 pm

“There is not a single illustration of trend uncertainty incurred by altering the number of stations or their spatial locations, no bootstrapping, Monte Carlo analysis”
You recall this is actual observations, not repeated model simulation…

clipe
July 30, 2012 3:09 pm

Die kalte Sonne
“Ein dicker Hund” (German: “a real whopper”)

July 30, 2012 3:16 pm

Well done Anthony!
We have a court case just completed against NIWA – the NZ equivalent of NOAA etc. Now awaiting judgement.
I plotted the raw data and the NIWA adjustments. I found that this rule gives a line very close to their adjustment line: “If the data is older than 1975, adjust it downwards by 1 deg /century: if it is younger than 1975, adjust it down by 0.3 deg/century”
A clear indication – but not proof – of fiddling.
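The rule described above can be written as a piecewise-linear adjustment anchored at 1975. This is a sketch of one possible reading of that rule; the slopes and the anchor year come from the comment itself, not from any NIWA documentation:

```python
def inferred_adjustment(year):
    """Adjustment (deg C) applied to the raw reading for a given year,
    under the commenter's inferred rule: data older than 1975 are pulled
    down at 1 deg/century, younger data at 0.3 deg/century, with zero
    adjustment at the 1975 anchor."""
    if year < 1975:
        return -(1975 - year) * 1.0 / 100.0
    return -(year - 1975) * 0.3 / 100.0

# A reading from 1875 is pulled down a full degree; 2005 only slightly
print(inferred_adjustment(1875), inferred_adjustment(2005))
```

Because older readings are pulled down much harder than recent ones, this adjustment pattern steepens the apparent warming trend, which is the commenter's point.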

Tucci78
July 30, 2012 3:18 pm

At 1:27 PM on 30 July, William Roberts kvetches:

This must be a joke, right? There is not a single illustration of trend uncertainty incurred by altering the number of stations or their spatial locations, no bootstrapping, Monte Carlo analysis, nothing. My sixth grader could do a more robust analysis for a middle school science project. I mean, the term “uncertainty” is found once in the whole manuscript… as part of a reference title.

While it would seem to me that what is reported in the manuscript is sufficient to warrant publication – because it certainly does increase the fund of knowledge in the field of climatology, much to the lament of las warmistas and their “progressive” (or are they calling themselves “fascists” again this week?) political goons – there are definitely possibilities for further work to be derived.
In the pharmaceuticals manufacturers’ efforts to gain as much promotional “noise” as possible from research conducted in compliance with FDA and EMEA requirements for marketing approval, this comes under the heading of publications planning, the extraction from available study data of as many additional articles and presentations as can be managed.
Is anybody sensible of potentials for further analyses per Mr. Roberts’ suggestions?

Ivan
July 30, 2012 3:18 pm

From the paper:
“By way of comparison, the University of Alabama Huntsville (UAH) Lower Troposphere CONUS trend over this period is 0.25°C/decade and Remote Sensing Systems (RSS) has 0.23°C/decade, the average being 0.24°C/decade. This provides an upper bound for the surface temperature since the upper air is supposed to have larger trends than the surface (e.g. see Klotzbach et al (2011). Therefore, the surface temperatures should display some fraction of that 0.24°C/decade trend. Depending on the amplification factor used, which for some models ranges from 1.1 to 1.4, the surface trend would calculate to be in the range of 0.17 to 0.22, which is close to the 0.155°C/decade trend seen in the compliant Class 1&2 stations.”
However, just two pages later, we read:
“Modern MMTS sensors show a significantly lower warming trend than the obsolete CRS shelters. Yet rather than lowering the trends of CRS stations, the trends of MMTS stations are sharply adjusted upwards. It is difficult, however, to be certain of the true effect thanks to the relatively small number of Class 1,2, rural, non-airport stations.”
So, if the MMTS rural, non-airport stations are relevant, the trend is 0.032, as shown in figure 8, and not 0.155 as stated in the press release. But I suppose the authors were too afraid to openly say that the entire warming trend in the USA was manufactured by the upward adjustments. Is the fact that we don’t have “enough” (what’s enough in this context?) Class 1 and 2 rural stations to reliably confirm the near-zero trend a sufficient basis to accept, and moreover trumpet and advertise, a trend that we know with certainty is spurious (0.155 °C)? If the “obsolete” CRS data are not good, as the paper explicitly claims, how then could the trend derived from the same data be good?
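The amplification arithmetic in the passage quoted above is easy to check: the mean of the UAH and RSS lower-troposphere trends is divided by the model amplification factor to get the implied surface trend. A sketch, with all values taken from the quote:

```python
# CONUS lower-troposphere trends quoted in the paper, C/decade
uah, rss = 0.25, 0.23
upper_air = (uah + rss) / 2  # average of the two satellite records

# Dividing by the model amplification factors (1.1 to 1.4) gives the
# implied surface-trend range quoted as 0.17 to 0.22 C/decade
implied = [round(upper_air / amp, 2) for amp in (1.4, 1.1)]
print(implied)  # [0.17, 0.22]
```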

pat
July 30, 2012 3:21 pm

30 July: WaPo: Brad Plumer: Two climate papers get hyped first, reviewed later. Isn’t that a bad idea?
Meanwhile, climate skeptic Anthony Watts trumpeted a new paper that questioned some of the techniques used by NOAA to calculate U.S. temperature trends. Watts’ paper was quickly heralded by climate-change doubters everywhere.
And yet, as my colleague Jason Samenow discusses in detail, neither of these research endeavors have yet undergone full peer review. Watts said he’s planning to submit his paper to a journal, while Muller’s group conceded that their studies still haven’t made it through review. That makes these papers no different from thousands of others around the world waiting to be reviewed and published by journals. So why should these findings receive special hype?…
(For those curious, Victor Venema, a scientist who does work in a related field, took an early look at Watts’ paper and offered some words of caution.)
One possibility is that these papers are so crucial that they can’t possibly wait years before being thoroughly vetted…
http://www.washingtonpost.com/blogs/ezra-klein/wp/2012/07/30/two-climate-papers-get-hyped-first-reviewed-later-isnt-that-a-bad-idea/
no mention of yours, anthony:
30 July: ABC Australia: World Today: New findings add to certainty on climate change, while one sceptic has a turnaround
University of Melbourne climate scientist, Professor David Karoly, says Professor Muller’s results confirm what numerous other studies have already shown.
DAVID KAROLY: If you consider a victory someone accepting clearly what evidence shows then yes, it is a victory, but I would not consider that to be an important victory because the vast majority of climate scientists around the world have been assessing the data for an extended period and have reached these conclusions more than 10 years ago…
SIMON LAUDER: He also says that his conclusions are stronger than the IPCC’s.
DAVID KAROLY: He does say that and his comments are difficult to assess at present, mainly because the details of his study are in fact under peer review for a scientific journal. But in fact the methodology that he’s used to link the observed warming to increasing greenhouse gases, the so-called attribution step, is not nearly as robust as many other studies have undertaken over the last 10 years.
SIMON LAUDER: It goes to show there’s always a place for some scepticism in science…
http://www.abc.net.au/worldtoday/content/2012/s3556287.htm

Ivan
July 30, 2012 3:22 pm

And another problem: even if we keep the CRS data, and exclude the airports, the trend is significantly lower than 0.155, namely 0.124, see the figure 5.

DS
July 30, 2012 3:22 pm

[SNIP: A whole host of site violations in such a short comment. Well done, Sir! -REP]

July 30, 2012 3:23 pm

Looks like the BEST team has overcooked the temperature data
http://www.vukcevic.talktalk.net/Best-NH.htm

A. Scott
July 30, 2012 3:34 pm

Revkin has a response – sort of – from NCDC at Dot Earth
Fair amount of technical gibberish – I’m sure it means something to the pros – but worthless to the public discussion. Mann’s “self-aggrandizement” comment came to mind 😉
I asked Revkin to ask NCDC a couple simple questions:

Andrew … please ask the folks at NOAA/NCDC to answer a couple simple questions – without the over technical, and not meaningful to the public, rhetoric …
Have you done any review using the WMO endorsed Leroy 2010 siting specs and if not, why not?
If the answer is no – then a followup – will you do even a small sampling using this siting criteria and report back to the public quickly?
Why are you adjusting highest quality rural, non-airport sites UPWARD to match the poor quality sites?
The Leroy 2010 siting standard is simple common sense. It ADDS the thermal mass – heat sink and heat source – to the siting quality equation. Past siting formulas (Leroy 1999) used only distance with no consideration of the mass of the source or sink.
An example … a lit match, a campfire, a bonfire, a fully engulfed burning building, and a forest fire. All are heat sources. No one would argue the effect would be the same at a fixed distance away from each.

They seem relevant to ask here as well.
What possible reason or justification is there to adjust high quality stations to match poorly sited stations?
And using my example – how could anyone support a claim that the thermal mass – the SIZE – of the nearby heat sources and sinks – is not critically important to the quality rating of a surface station.
I would actually think a third criterion is highly important to consider regarding siting quality, in addition to distance and mass: that of predominant winds. The effect is different if the station is downwind of a large thermal mass, especially in areas with a strong predominant wind pattern.

July 30, 2012 3:41 pm

Seriously, this is a bad headline. If you don’t like my or others’ suggestions (above), can you go back to the original? This headline sounds like you’ve proven half of global warming itself is man-made (i.e., artificial).
Because that’s what the headline literally says, and that wasn’t the point of your study at all.
Was it?

July 30, 2012 3:43 pm

David Longinotti says:
July 30, 2012 at 7:46 am
Excellent work, but I think the new headline is misleading. “Artificial” generally means ‘man-made’ (a term which is typically used as the opposite of “natural”). So what the headline states is that half of the US warming is due to humans, and half is not. This does not reflect the important assertion of the study, which is that US warming has been significantly overestimated due to inappropriate measurement techniques and adjustments.

Quite.

July 30, 2012 3:45 pm

Revkin: A Closer Look at Climate Studies Promoted Before Publication 7/30 17:04.
It contains a response from NOAA’s Peter Thorne that concentrates on the USHCN.

[W]e have done quite a lot of work this past year that directly builds confidence in the verity of our adjustments to the United States Historical Climatology Network [USHCN] which it would be remiss not to mention in the context of current discussions.

http://dotearth.blogs.nytimes.com/2012/07/30/a-closer-look-at-climate-studies-promoted-before-publicatio/#more-45511
Thorne’s non-response to me boils down to: ”Squirrel!! Don’t read Watts. Look at our NEW network. Never mind why we need a new network. Never mind that USHCN will never give us the history to answer for the past century.” Thorne’s response speaks volumes in what it doesn’t say.

Paul K2
July 30, 2012 3:50 pm

I put up the question on Revkin’s Dot Earth blog of why Watts et al. (2012) didn’t use the most obvious method to identify siting issues. Why go through decades of data, with all the changes that occurred over the years? Why not just compare the five classes of stations in this paper with the last five years of data from the US Climate Reference Network (USCRN)? If siting issues were important, there should be obvious differences between the Class 1/2 stations and the USCRN stations in the same grid cell, and between the Class 3/4/5 stations and those USCRN stations.
If there aren’t large differences in the recent-years data between poorly sited stations, properly sited stations, and the “gold standard” USCRN stations, then the findings of Watts et al. are incorrect, and the other adjustments and homogenization methods correctly adjust the raw data to yield the right decadal temperature trends.

Entropic man
July 30, 2012 3:56 pm

Nick Stokes prompted me to look at variability. My own area rarely shows year on year variations of 0.5C. When I looked at the NASA/Goddard graph for the continental US I was amazed at the spread of temperatures and how quickly they changed. Look, for example, at 1918 to 1921.
http://data.giss.nasa.gov/gistemp/graphs_v3/
I hope Watts et al have the data analysis and statistics nailed. They are likely to be challenged at some point during peer review. It is no easy thing to demonstrate, to high significance, a difference of 0.154C in annual averages due to station errors against a background in which the averages alone are capable of varying by over 2C in a few years.

Skiphil
July 30, 2012 4:01 pm

WOW!
“900 Responses to New study shows half of the global warming in the USA is artificial”
Well done, Anthony et al., dedicated mods and commentators
…. but what I really wanted to note is that I’m hitting the tip jar, well deserved:
http://surfacestations.org/donate.htm
[REPLY: Your donation is gratefully accepted. Anthony will put it to good use, as there will be journal fees and other expenses that have to be covered. -REP]

Fred Singleton
July 30, 2012 4:10 pm

Well done, you even got a link on the BBC latest headlines, even if it was a report on the BEST findings, which is not attributed to any reporter.
Richard Black will be furious.
