By Dr. Roger Pielke Sr.
First, as posted on my son’s weblog, the global temperature anomaly is essentially irrelevant in terms of the climate change issues that matter to society and the environment. Even in terms of global warming, it is a grossly inadequate measure, as discussed, for example, in
Pielke Sr., R.A., 2003: Heat storage within the Earth system. Bull. Amer. Meteor. Soc., 84, 331-335.
Pielke Sr., R.A., 2008: A broader view of the role of humans in the climate system. Physics Today, 61, Vol. 11, 54-55.
The global average surface temperature, however, has unfortunately become the icon of the IPCC community and of the policy debate. As my son wrote in his post:
“The debate over climate change has many people on both sides of the issue wrapped up in discussing global average temperature trends. I understand this as it is an icon with great political symbolism. It has proved a convenient political battleground, but the reality is that it should matter little to the policy case for decarbonization.”
This political focus resulted in Richard Muller’s testimony yesterday on his Berkeley Earth Surface Temperature project before the Science, Space and Technology Committee of the House of Representatives. In his (in my view, premature) testimony he makes the following claims:
“The world temperature data has sufficient integrity to be used to determine global temperature trends”
“…. we find that the warming seen in the “poor” stations is virtually indistinguishable from that seen in the “good” stations.”
“The Berkeley Earth agreement with the prior analysis surprised us, since our preliminary results don’t yet address many of the known biases.”
The last statement from his testimony contradicts the first two.
All his study has accomplished so far is to confirm that NCDC, GISS and CRU honestly used the raw observed data as the starting point for their analyses. This is not a surprising result. We have never questioned this aspect of their analyses.
The uncertainties and systematic biases that we have published in several peer-reviewed papers, however, remain unexplored so far by Richard Muller and colleagues as part of The Berkeley Earth Surface Temperature project. We summarized these issues in our paper
Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229
where the issues include:
- a systematic bias in the use of multi-decadal trends in minimum air temperatures
- the use of surface observing sites that are not spatially representative of the region
- the failure to consider the variation of surface air temperature trends with height above the surface
- the lack of incorporation of the effect of concurrent multi-decadal trends in the surface air absolute humidity
- the absence of the statistical documentation of the uncertainty of each step in the adjustment of raw data to a “homogenized data set” (e.g. time of observation bias; equipment changes; station moves)
- the need to assess the absolute temperatures at which a temperature trend occurs, since a temperature anomaly at a cold temperature has less of an effect on outgoing long wave radiation than the same temperature anomaly at a warm temperature (a quick illustration follows this list).
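To illustrate that last point with a simple back-of-the-envelope calculation (a minimal sketch using the black-body Stefan-Boltzmann law; the two temperatures are chosen only for illustration, not taken from any observing site):

```python
# Illustrative only: how much extra outgoing long-wave emission a +1 K anomaly
# produces at a cold versus a warm absolute temperature, using the black-body
# Stefan-Boltzmann law F = sigma * T^4.
SIGMA = 5.670e-8  # W m^-2 K^-4

def olr_increase(t_kelvin, anomaly=1.0):
    """Extra emission (W/m^2) from raising the temperature by `anomaly` K."""
    return SIGMA * ((t_kelvin + anomaly) ** 4 - t_kelvin ** 4)

for t in (250.0, 290.0):  # roughly a cold high-latitude vs. a warm low-latitude surface
    print(f"+1 K at {t:.0f} K -> {olr_increase(t):.2f} W/m^2 extra emission")
# ~3.6 W/m^2 at 250 K versus ~5.6 W/m^2 at 290 K: the same anomaly matters more,
# radiatively, at the warmer absolute temperature.
```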
We have explored most of these issues in peer-reviewed papers and found them to be important remaining uncertainties and biases. Richard Muller and his colleagues have not yet examined these concerns, yet he chose to report his very preliminary results at a House hearing. A sample of our papers includes:
Fall, S., N. Diffenbaugh, D. Niyogi, R.A. Pielke Sr., and G. Rochon, 2010: Temperature and equivalent temperature over the United States (1979 – 2005). Int. J. Climatol., DOI: 10.1002/joc.2094
Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841
Montandon, L.M., S. Fall, R.A. Pielke Sr., and D. Niyogi, 2011: Distribution of landscape types in the Global Historical Climatology Network. Earth Interactions, 15:6, doi: 10.1175/2010EI371
Steeneveld, G.J., A.A.M. Holtslag, R.T. McNider, and R.A Pielke Sr, 2011: Screen level temperature increase due to higher atmospheric carbon dioxide in calm and windy nights revisited. J. Geophys. Res., 116, D02122, doi:10.1029/2010JD014612.
Richard Muller should be examining the robustness of our conclusions, as part of his project.
Richard does appropriately acknowledge Anthony’s and Steve McIntyre’s contributions in his testimony, where he writes:
“Without the efforts of Anthony Watts and his team, we would have only a series of anecdotal images of poor temperature stations, and we would not be able to evaluate the integrity of the data. This is a case in which scientists receiving no government funding did work crucial to understanding climate change. Similarly for the work done by Steve McIntyre. Their “amateur” science is not amateur in quality; it is true science, conducted with integrity and high standards.”
This is well-deserved recognition for both research colleagues. One does not need a “Ph.D.” after one’s name to do world-class research!
Anthony Watts has prepared an excellent response to Richard Muller’s presentation in his posts, including
Clarification on BEST submitted to the House
His insightful dissection of the problems with Richard Muller’s presentation and of NCDC’s inconsistent behavior (with which I completely agree) includes the statements that
“NOAA’s NCDC created a new hi-tech surface monitoring network in 2002, the Climate Reference Network, with a strict emphasis on ensuring high quality siting. If siting does not matter to the data, and the data is adequate, why have this new network at all?”
“Recently, while resurveying stations that I previously surveyed in Oklahoma, I discovered that NOAA has been quietly removing the temperature sensors from many of the USHCN stations we cited as the worst (CRN4, 5) offenders of siting quality. For example, here are before and after photographs of the USHCN temperature station in Ardmore, OK, within a few feet of the traffic intersection at City Hall.”
“Expanding the search my team discovered many more instances nationwide, where USHCN stations with poor siting that were identified by the surfacestations.org survey have either had their temperature sensor removed, closed, or moved. This includes the Tucson USHCN station in the parking lot, as evidenced by NOAA/NCDC’s own metadata online database….”
He concludes with
“It is our contention that many fully unaccounted for biases remain in the surface temperature record, that the resultant uncertainty is large, and systemic biases remain. This uncertainty and the systematic biases needs to be addressed not only nationally, but worldwide. Dr. Richard Muller has not yet examined these issues.”
I completely agree with Anthony’s submission to the House committee in response to Richard Muller’s testimony. Richard Muller has an important new approach to analyze the surface temperature data. We hope he adopts a more robust and appropriate venue to present his results.
I wonder how much level 9 earthquakes on the Richter scale affect ocean temperatures. Has anyone ever studied this?
Stephan says:
April 1, 2011 at 2:46 pm
ALL the GISS, HADCRUT, BEST etc. graphs show warming from 1980 on, NOT before. Please explain.
The “zero” line is the average for a specified 30-year period, not for the whole dataset. This period is generally somewhere in the late 20th century. Since the datasets all show an overall increase in temperature, the earlier datapoints will mainly be below the baseline. In other words, whether a particular year is above or below the baseline depends solely on how you define the baseline. If you made the baseline the temperature in 1862 (?), all years would be shown as “warming” to different extents.
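To make that concrete, here is a toy illustration (a minimal sketch with an invented, steadily warming series; nothing to do with any real dataset): the sign of a given year’s “anomaly” flips depending on which years define the baseline.

```python
# Toy series with a steady warming trend; whether a given year plots "above"
# or "below" zero depends entirely on which years are chosen as the baseline.
temps = {year: 14.0 + 0.005 * (year - 1900) for year in range(1900, 2011)}  # hypothetical

def anomalies(series, base_start, base_end):
    base = sum(series[y] for y in range(base_start, base_end + 1)) / (base_end - base_start + 1)
    return {y: t - base for y, t in series.items()}

late = anomalies(temps, 1961, 1990)   # a typical late-20th-century baseline
early = anomalies(temps, 1900, 1929)  # an early baseline
print(f"1950 anomaly vs 1961-1990 baseline: {late[1950]:+.2f} C")   # negative
print(f"1950 anomaly vs 1900-1929 baseline: {early[1950]:+.2f} C")  # positive
```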
Bob Paglee: “Those historical charts showing temp data from 7 different areas are very interesting and the slope or temperature gradients seem extremely modest. The chart for central England is easiest to read so I printed it and replicated the temperature scale shown at the 1650 origin. Using my replicated scale to measure the height of the trend line at 1800, I read it as 9.2. I repeated this for the end point marked 2000 and read this as 9.5. This would indicate a rise of only 0.3 over that 200-year interval, or a gradient of 0.15 per century. This seems to be off reality by an order of magnitude. In an earlier post today I described how I had done the same for the 30-year satellite temp chart (data from UAH Huntsville) and estimated the temperature gradient to be about 1.3C per century, albeit over a much shorter period than the 200-year chart. Am I missing something?”
Uhhh….I haven’t yet even taken notice of the slope values compared to the global average slope. This could provide an independent heads up about the global average if indeed there is a great mismatch. I did plot the global average against CET once, but I adjusted it by eye only. It does correspond fairly well to the CET in shape at least: http://oi49.tinypic.com/w1g68.jpg
It’s certainly possible that cities are experiencing some sort of cocoon effect that insulates them from the real world outside their limits.
Let’s quick look at CET slope versus the global average, for real….http://oi51.tinypic.com/2czv6z5.jpg
The slopes are not an order of magnitude apart: 0.3°C/century (CET) vs. 0.4°C/century (global average). That doesn’t make sense though?! I thought it was higher than that for the global average. UAH is too short to provide a trend instead of just a recent fluctuation slope, so sure, UAH can be way higher in slope. Anyway, evidently HadCRUT3, the longest global average (30 years longer than the others, which begin in 1880), seems to have a wimpy slope. Let’s eyeball it from a google search of a plot: ah, WoodForTrees.org has plotting… http://www.woodfortrees.org/plot/hadcrut3gl/trend/plot/hadcrut3gl and from 1860 to 1960 I eyeball a slope of 0.45°C/century, just like my own plot gave.
“Without the efforts of Anthony Watts and his team, we would have only a series of anecdotal images of poor temperature stations, and we would not be able to evaluate the integrity of the data.”
The more I hear of Richard Muller’s statements, the more I begin to wonder if the BEST project wasn’t intended from its inception to preempt the coming papers based on the surfacestations.org project.
Perhaps the quote from Muller should read “Without the efforts of Anthony Watts and his team [that were disclosed to us in confidence], we would have only a series of anecdotal images of poor temperature stations, and we would not be able to [announce the continued] integrity of the [existing] data [before even completing a full analysis].”
The timing of the creation of BEST, and the speed with which pronouncements are coming (that there are no problems with the existing temperature records) may be just a coincidence. But there is no scientific or legislative reason for the rapid pronouncements issuing from Muller. They do, however, have excellent preemptive propaganda value, given the access he had to the surfacestations data. Anything published now that differs from Muller’s preliminary conclusions may look like sour grapes and have significantly reduced impact (they hope).
Giving the “benefit of the doubt” to progressives pursuing a political agenda like CAGW is never a good idea. Wasn’t James Delingpole recently scammed by progressives pretending to want to do an objective analysis of climate issues? Hate to say it, but did that perhaps happen here?
kramer says: April 1, 2011 at 7:24 am
“I’d like to see some scientist(s) take 10 or so calibrated temperature sensors…”
Why don’t we have more experiments to verify the impact of the things that may affect the recorded data? As well as experimenting on things like UHI, why can’t we find a location where the paleo-dendro people think we can get a good reading, and which has a good/adequate historical thermometer nearby, and then predict how wide the tree rings are going to be based on the real thermometer? Get agreement on the values and acceptable error, then chop the trees down/core them and see if your prediction was correct. This experiment might even provide insight into other factors that might affect the result, like precipitation, sunshine etc., if any of that kind of data had been recorded at the local weather station.
Of course this requires getting away from your computer screen for a bit and might require some real scientists who have experience with lab work and experiments. I do feel that the ‘climate science’ field is heavily filled with mathematicians and more theoretical physicists who won’t have the necessary skills to do this, hence the reliance on processing other people’s data (people who did actually go out in the field at some point) and computer modelling.
I agree with Roger Pielke, but would go further. The case against CO2 is deeply flawed. The IPCC claims that greenhouse gases warm the planet by absorbing heat reradiated by the earth, which would otherwise go out into space. The amount of energy/heat absorbed by a gas is determined by its absorption spectrum. The absorption spectrum of CO2 across the bandwidth occupied by radiated heat is very narrow, with bands at 4.3 and 15 microns. This means that CO2 could absorb only a few percent of the energy/heat. Contrast this with the absorption spectrum of water vapour, which covers most of the total spectrum. Take into account the percentages of the atmosphere occupied by CO2 and water vapour: they are, respectively, 0.039% for CO2 and 0.40% for water vapour, the latter reaching 1-2% near the surface. It is absurd to blame a gas comprising so little of the atmosphere, and which is a very weak absorber of heat/energy, for global warming. This is especially so when we consider that the current warming, which represents a recovery from the Little Ice Age that ended about 1820-40, parallels a number of other warming episodes following previous cool periods.
If anyone would like a copy of the full paper I have written demonstrating that the IPCC’s case is not supported by even a single scrap of scientific evidence, please email me at jpenhall@bigpond.net.au.
John Penhallurick
Climategate continues……
Bad data, mixed with good data, then sold as pristine data…. all in the name of getting funding…. don’t be alarmed that he omits important facts… they don’t cause much change in the overall data….
CRU/MET/Jones/Briffa/Man school of science and methods…
Anthony,
When is your analysis of the surface stations going to come out, and demolish these claims?
“There will be no structured academic or scientific attempt to discern a truthful temperature record.”
It is obvious, because that is impossible to achieve.
We do not have enough historical data.
Do you think a couple dozen stations at the start of the 20th century will give you 0.x precision?
We would need more than a million stations to try to achieve that, and even then…

I also advocate that relying on a single weather station per location is a mistake. We need 5 or more around the same place.
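For a rough sense of the numbers: if station anomalies behaved like independent measurements with a spread of about 2°C (an assumption made purely for illustration; real stations are spatially correlated and carry systematic biases), the uncertainty of a simple mean would shrink only as the square root of the station count:

```python
# Illustration only: standard error of a simple mean, sigma / sqrt(n), assuming
# independent stations with an assumed ~2 C spread in their anomalies.
# Real networks are correlated, and sqrt(n) averaging removes no systematic bias.
import math

sigma = 2.0  # assumed spread (deg C) of individual station anomalies
for n in (24, 1000, 1_000_000):
    print(f"{n:>9} stations -> standard error ~ {sigma / math.sqrt(n):.3f} C")
# 24 stations -> ~0.4 C; a thousand -> ~0.06 C; a million -> ~0.002 C.
```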
Perhaps we skeptics should be challenging Dr. Muller to quantify the four aspects that distort the determination of global temperature: (1) the dropping of “cooler” weather stations from the GHCN; (2) the inappropriate siting of weather stations (one for you, Anthony); (3) the current impact of nighttime cloud cover (which is increasing as we start to enter a cooling phase); and (4) the urban heat island (UHI) effect. I’m sure that if all four were examined objectively by the good Professor, we’d find a global temperature that is starting to decline in response to factors beyond our control (e.g., solar activity, albedo effect and oceanic temperature oscillations).
Greg B.
Is it self-interest AND vested interests? In science it is said we have moved away from religion and politics, but not from Marxism, it seems.
http://wattsupwiththat.com/2011/03/31/clarification-on-best-submitted-to-the-house/#comment-634498
Pamela Gray says: April 1, 2011 at 6:36 am
Well stated.
Stephan says: April 1, 2011 at 12:47 pm
Thank you, that is interesting.
The Lavoisier Group has written a number of articles since 2000, including on concerns with the science (and costs) of AGW in contrast to its religion. Articles are listed chronologically here http://www.lavoisier.com.au/index.php, and others appear in Quadrant magazine.
Francis Bacon (1561-1626), father of the scientific method (positivism), encouraged the inquirer, before beginning inductive reasoning from fact and axiom to physical law, ‘to free his mind from certain false notions or tendencies that distort truth.’
These false notions or tendencies (biases) are:
‘Idols of the Tribe
Idols of the Den
Idols of the Marketplace
Idols of the Theatre’
Though Bud and Terence and Clint E will likely do the trick with their methods, at least for me. http://en.wikipedia.org/wiki/Francis_Bacon
Anthony et al.’s work on the Surface Stations project has been pre-empted, and that is a perversion of Bacon’s positivism. Let us call positivism positive science: a long history, including of methodology and, dare I say, human progress, unlike a negative science. I do not discount serendipitous discoveries (as mentioned by NikFromNYC, April 1, 2011 at 11:45 am), or that the *HEAVY SIGH* from Jeremy http://wattsupwiththat.com/2011/03/31/clarification-on-best-submitted-to-the-house/#comment-633385 and various other comments suggest we wait for correct corrections by Muller.
Perhaps the issue is one of language, dare I say phenomenology? Perhaps WUWT is upheaving the anti-positivist Feyerabend by exposing the voodoo science used by the media, policy and market? Some students of post-positivism certainly have goodness of fit with ‘….not a single idea, no matter how absurd and repulsive, not a sensible aspect, and that there is not a single view, no matter how plausible and humanitarian, that does not encourage and then steal our stupidity and our criminal tendencies’ (1991).
Positivism (Bacon), that which is positivist science, looks to something that is posited: direct experience, scientific observation through the scientific method. The study of the given; datum/data.
Natural religion (and natural law), that which is reasoned to by way of rational argument, is based on knowledge of the world, stemming from the nature of things: both the nature of the world and human nature.
History would have it that positive law, that which has been posited in legislation and enacted by the lawgiver, can be distinguished from natural law, which arises from reasoning about obligations and responsibilities to the natural world and human nature.
‘An action considered wrongful in positive law is not considered wrong in itself. It is considered wrong because it has been forbidden by a legislator.’ Mala quia prohibita: acts that are evil because they are prohibited.
In comparison, under natural law prohibita quia mala are acts prohibited because they are evil. As reasoned, they stem from the nature of things.
The climate of science research and of researchers informing policy and legislation is supported by ‘evidence’. Evidence continues to trump facts, including the transparency of the process by which the evidence is arrived at.
Recently discussed on WUWT was the code of conduct and ethics legislated for government servants and professionals. Positivism is at the coal face, and Anthony et al. are at that coal face. Kuhn would term it a revolution, I expect.
Galileo, well, he would leave his middle finger behind after his life of science. And Machiavelli, he too would have agreed on that (objective).
Are there any theologians out there doing research into this new religion of warmism? I think a PhD is waiting for someone.
I don’t understand the obsession with trying to compute a global average. Identify local records that are reliable. Adjust for UHI etc. Present the results individually and in aggregate.
You’ll come up with stats such as X % showed a warming trend of average magnitude x deg C.
Y % of local stations showed a cooling trend of average magnitude y deg C.
Z % show no trend
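A minimal sketch of how that tally could be produced (hypothetical station data, a plain least-squares fit per station, and an arbitrary ±0.2°C/century cutoff for “no trend”):

```python
# Sketch: fit a linear trend to each local record, then report the mix of
# warming / cooling / flat stations rather than one global average.
# `stations` maps a station name to a list of (year, temperature) pairs.
from statistics import mean

def slope_per_century(record):
    """Ordinary least-squares slope, converted to deg C per century."""
    xs, ys = zip(*record)
    xbar, ybar = mean(xs), mean(ys)
    per_year = sum((x - xbar) * (y - ybar) for x, y in record) / sum((x - xbar) ** 2 for x in xs)
    return 100.0 * per_year

def summarize(stations, flat=0.2):  # trends within +/-0.2 C/century count as "no trend"
    slopes = [slope_per_century(rec) for rec in stations.values()]
    warming = [s for s in slopes if s > flat]
    cooling = [s for s in slopes if s < -flat]
    n = len(slopes)
    print(f"{100 * len(warming) / n:.0f}% warming, average {mean(warming):.2f} C/century" if warming else "0% warming")
    print(f"{100 * len(cooling) / n:.0f}% cooling, average {mean(cooling):.2f} C/century" if cooling else "0% cooling")
    print(f"{100 * (n - len(warming) - len(cooling)) / n:.0f}% show no trend")
```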
Jessie says: (April 2, 2011 at 4:23 am)
Is it self-interest AND vested interests? In science it is said we have moved away from religion and politics, but not from Marxism, it seems. …
You set vibrations stampeding in the brain with this comment, Jessie. Thanks!
Dr. Pielke, please demonstrate how the sentences you say “contradict” the other are in the form (A and Not A). Thanks.
Greg Beasley:
Could you please give me a reference regarding the increase in cloud cover which you referred to earlier?
Many thanks,
“…….And, I’m prepared to accept whatever result they produce, even if it proves my premise wrong. I’m taking this bold step because the method has promise. So let’s not pay attention to the little yippers who want to tear it down before they even see the results. I haven’t seen the global result, nobody has, not even the home team, but the method isn’t the madness that we’ve seen from NOAA, NCDC, GISS, and CRU, and, there aren’t any monetary strings attached to the result that I can tell. ….
Anthony Watts – 11 March 2011
So Anthony, how is it going with accepting the results, especially now they have proved you wrong? Or will you just do your usual?
Whoops! Not what you all wanted to hear, huh? Now, all of a sudden the temperature of the earth means nothing, and professor Muller is a political hack. Obviously people who deny climate change are wedded to their beliefs. So why don’t they just forget about the science; just call it a religion — maybe “Everything’s OKism.”
Dave says:
“Obviously people who deny climate change are wedded to their beliefs.”
That describes the alarmist crowd, of course. Scientific skeptics have always known that the climate naturally changes. Only Michael Mann’s acolytes believe in his Hokey Stick, which shows no change in climate — no MWP, no LIA — until the industrial revolution. The really strange thing is that since Mann’s ‘Stick has been debunked, the alarmist contingent still refuses to accept natural climate variability. Cognitive dissonance in action.
James Sexton says: Sorry, anyone expecting something different than what we got hasn’t been paying attention. This should serve as a lesson to those that believe there will be a genuine attempt to discern the truth. There will be no structured academic or scientific attempt to discern a truthful temperature record. Not here in the States, not in G.B. and not in Australia.
As one of those who thought he was “paying attention” and “expected something different”, I find myself justly chastised….
I’d gotten an early “third hand description” of the method to be used at BEST and it has some very good points. (I’ve said nothing as it was under “seal of honor” to keep it private until they publish.) But, IMHO, their basic technical approach has much good in it. (It has some of the same ‘benefits’ as my dT/dt method, so of course it would appeal to me.) 😉
But it doesn’t matter what technically is good if the management has already chosen what the answers will be. “Self Confirmation Bias” is writ very large when it becomes “Management Confirmation Bias”. The second part of what had been shared was a set of “goals of the project” that included idealized statements about searching for a true honest temperature product based on open, public, and clear analysis. THAT part, as of now, is clearly violated. Ethos, meet “round file”.
OK, with that said, I’m still hopeful that the final “product” will provide some improvements over the GISS, CRU, and NCDC “products”. If nothing else, removing a lot of the strange circumlocutions in GIStemp would be a major benefit.
BUT, what I found with the review of world locations (i.e. not “gridding” and “homogenizing” the data, but doing ‘regional averages’, some as small as one island) was that the temperature trends are “all over the map”. Some nations up, their neighbor down, some months up and others down in the same area. Countries right next to each other with their individual monthly trends in opposition. It is only in the averaging of all those things that a “rising Global Average Temperature trend” can be created. It is exactly THAT kind of discovery that will never be found if “management” has already decided that they will match GIStemp and HADcrut et.al.
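A toy illustration of that point, with invented numbers (nothing to do with the real GHCN data): regions trending in opposite directions can still average to a smoothly rising “global” number.

```python
# Invented regional series: one cooling, two warming at different rates.
# Their simple average rises steadily even though region A cools throughout.
regions = {
    "A": lambda yr: 0.8 - 0.010 * (yr - 1950),   # cooling region
    "B": lambda yr: -0.5 + 0.020 * (yr - 1950),  # strongly warming region
    "C": lambda yr: 0.1 + 0.005 * (yr - 1950),   # mildly warming region
}
for yr in (1950, 1980, 2010):
    vals = [f(yr) for f in regions.values()]
    print(yr, [f"{v:+.2f}" for v in vals], f"average {sum(vals) / len(vals):+.2f} C")
# The average climbs at about 0.5 C/century while region A cools the whole time.
```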
Guys and gals, it’s been nearly 30 years since this became an issue. It won’t happen. Academia will ride this pony until it drops. There is no impetus to discern the truth. And there will be no epiphany for the charlatans.
Unfortunately, there is more truth in that statement than I care to recognize 😉
30 years in, all the grad students are now the Ph.D. teachers passing on the themes of THEIR dissertation to a new set of empty heads. The “peer reviewers” will now be only the “anointed”. The dominant “memes” will have ossified. We have entered the realm where to advance, a new point of view will need to wait for the old guard to die.
It can take a long time until there is a new puff of smoke over the Vatican and even longer for a shift in the team of cardinals from whom the next vision will be taken…
Stephan says: OK, let’s say we accept the initial results and all the others (I said I would accept the BEST analysis). Why in hell is ALL the data up to 1980 significantly, and I would say very significantly, BELOW the 0C anomaly baseline? Eyeballing, it seems to comprise 80-90% of data up to 1980. There is something very wrong here. How can the baseline be correct? The data is actually “cold” but the baseline is warm?
The “baseline” is computed independently of the trend. It comes first. Then the “baseline” is used as a reference point for the trended data. They are homogenized, UHI adjusted, gridded, and grid fill-in fabricated (most grid cells are empty in the present. In the baseline for GHCN there are 7000+ stations; at present, about 1200 last I looked. GIStemp now uses 16,000 grid boxes. BY DEFINITION about 9000 of them are fabricated in the baseline era and about 90% of them are fabricated in the present). Then, and only then, those baseline and present “grid boxes” are compared to each other to create the “trend”. Very few actual thermometers need apply…
One of the “great promises” of BEST is the avoidance of that kind of grid/box nonsense and the making up of non-existent “data” to make non-existent trends…
I still have “hope” for that part of their effort, but with management violating their ethos, it becomes much more suspect.
In my view this is showing that the NORMAL temp is COLDER (80-90% of the time). This is in ALL the global temperature graphs produced by GISS, NOAA, HADCRUT and BEST. The way the baseline has been calculated must be wrong!
Nope, the baseline is probably calculated OK (IMHO – and with some caveats…); it’s the post-baseline “fabricated data, homogenized, UHI ‘wrong way’ adjusted, and grid/box fantasized” part, which re-writes the past to be colder, that is the problem.
bob paglee says: I repeated this for the end point marked 2000 and read this as 9.5. This would indicate a rise of only 0.3 over that 200-year interval, or a gradient of 0.15 per century. This seems to be off reality by an order of magnitude. In an earlier post today I described how I had done the same for the 30-year satellite temp chart (data from UAH Huntsville) and estimated the temperature gradient to be about 1.3C per century, albeit over a much shorter period than the 200-year chart. Am I missing something?
Yes, the long term cyclicality of temperatures. There is at a minimum a 60 year cycle that looks like it’s tied to the PDO. So in the mid-’70s we had a ‘great temperature shift’ and the world swapped from “Ice Age Coming!!” hysteria to the newfound “Global Warming!!” hysteria. Then, about 30 years later, you get the swap to the other half of the cycle, and from 1998 to now we’ve been cooling.
So, a 200 year chart will “smooth those ripples” while a “30 year average” as used in “climate science” will ride that wave up and down and up and down and… so you find a strong trend in 30 year data, and it averages away in 120 yr+
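Here is a quick synthetic example of that effect (entirely made-up series, assuming a 0.3°C/century background trend plus a 60-year cycle of 0.2°C amplitude): a 30-year fit taken on the rising half of the cycle comes out far steeper than a 200-year fit through the same series.

```python
# Synthetic series: small long-term trend plus a 60-year cycle. Compare a
# 200-year least-squares trend with a 30-year trend on the cycle's upswing.
import math

def temp(year):
    # 0.3 C/century background plus a 60-year cycle of 0.2 C amplitude
    return 0.003 * (year - 1800) + 0.2 * math.sin(2 * math.pi * (year - 1800) / 60.0)

def trend_per_century(start, end):
    years = list(range(start, end + 1))
    vals = [temp(y) for y in years]
    ybar, vbar = sum(years) / len(years), sum(vals) / len(vals)
    num = sum((y - ybar) * (v - vbar) for y, v in zip(years, vals))
    den = sum((y - ybar) ** 2 for y in years)
    return 100.0 * num / den

print(f"200-year fit (1800-2000): {trend_per_century(1800, 2000):.2f} C/century")
print(f" 30-year fit (1965-1995): {trend_per_century(1965, 1995):.2f} C/century")
# The long fit recovers roughly the 0.3 C/century background; the 30-year fit
# rides the trough-to-peak upswing of the cycle and comes out far steeper.
```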
Stephan says:
ALL the GISS, HADCRUT, BEST etc. graphs show warming from 1980 on, NOT before. Please explain.
IMHO, it is because 2 things happened.
1) A pot load of well sited stations were dropped from the GHCN from about 1990 on. This “locks in” prior trend data from the cold bottom of the baseline PDO phase. The methods used “fabricate” the missing “data” from that point forward in the making of their fantasy ‘grid box’ values. Surviving stations are largely (over 90%) located at airports. Airports grew massively from 1960 to date with the onset of the Jet Age and wealth. Acres and hectares of tarmac are great solar heaters. Now you are comparing a 10,000-foot-long paved area surrounded by thousands of running motor vehicles, with thousands of TONS of kerosene being burned per day, to a prior ‘grid box’ of a more diverse and often grassy character…
2) There were changes in the “QA” methodology that cause a pronounced “knee” or hockey stick turn in about 1980. I’ve detailed some of it in a posting about the QA method used on the USHCN (which is similar, as near as I can tell, to that used on the GHCN).
Now blend those two, run it through a data-food-processor, and voilà: “Global Warming” that does not show up in any actual locations…
There is a “DIY” demonstration of the “knee” here:
http://chiefio.wordpress.com/2010/07/31/agdataw-begins-in-1990/
Anyone can reproduce this with a spreadsheet, and it clearly shows that the “warming” is not CO2 related, as CO2 didn’t just suddenly show up one year…
http://chiefio.wordpress.com/2010/04/11/qa-or-tossing-data-you-decide/
Is where I noticed some of the odd connections between knee location and changes of QA processes.
Here is one of my earliest moments of discovering that the monthly trends were vastly different in different countries and even between months in the same country. IMHO, looking at the data “by month” lets you see where there are “issues” in the data.
http://chiefio.wordpress.com/2010/04/15/dmtdt-climate-change-by-the-monthly-anomaly/
Last year I made an “April Fools” posting that, in fact, just pokes fun at an actual problem in the data. The onset of “warming” progresses country by country around Africa. Perhaps as the new QA method was adopted? Or as ASOS were installed?
http://chiefio.wordpress.com/2010/04/01/global-warming-from-africa-contagious-spreading-at-100-miles-per-year/
It really is a “smoking gun” for something being very wrong in the data gathered, though delivered a bit “tongue in cheek”.
http://chiefio.wordpress.com/2010/04/22/dmtdt-an-improved-version/
Gives a later version of the “by month” view of the data. Try to explain those monthly trends via CO2. Just try…
If anyone wants more, there are a lot of independent-method (dT/dt) temperature explorations in the category at my site here (a stripped-down sketch of the basic dT/dt idea follows the list below):
http://chiefio.wordpress.com/category/dtdt/
But they all tend to point out the same things:
1) “Global Warming” is a methodology artifact, not CO2 induced.
2) It has “onset” coincident with method change long after CO2 arrived.
3) The monthly data are not consistent with the “trend” in the long term averages.
4) GIStemp, HADcrut, et.al. all make the same errors as they use the same methods.
5) Airports are a very bad place to do climate research.
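For anyone wondering what the dT/dt idea boils down to, here is a stripped-down sketch of the general first-difference approach (a simplified illustration, not the actual code behind the postings above): each station contributes only its changes from one reported year to the next, those changes are averaged across whichever stations report in both years, and the averages are accumulated into a track over time, so no baseline period, gridding, or fill-in of empty boxes is needed.

```python
# Stripped-down first-difference sketch: average per-station changes between
# consecutive reported years, then accumulate them into a relative track.
def first_difference_track(stations):
    """stations: dict of station name -> dict of {year: temperature}."""
    all_years = sorted({y for rec in stations.values() for y in rec})
    track, level = {all_years[0]: 0.0}, 0.0
    for prev, yr in zip(all_years, all_years[1:]):
        diffs = [rec[yr] - rec[prev]
                 for rec in stations.values() if prev in rec and yr in rec]
        if diffs:                      # only stations reporting in both years vote
            level += sum(diffs) / len(diffs)
        track[yr] = level
    return track  # cumulative change (deg C) relative to the first year
```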
I would certainly not take issue with anyone challenging the findings here, but it is also important not to paint Muller as a typical warming believer. I believe a good portion of his intent was to try and debunk the findings of other “institutions”. He has played from a very skeptical viewpoint on climate science, and the largest private contributor to the project was a Charles Koch foundation. Obviously, he has a dog in the fight, but for some reason, it came out of a different dog house.