Clarification on BEST submitted to the House

UPDATES: A number of feckless political commentators have simply missed this response I prepared, so I’m posting it at the top for a day or two. I’ll have a follow-up on what I’ve learned since then in the next day or two. Also, NCDC weighs in at the LA Times, calling BEST’s publicity effort, made without publishing science papers, “seriously compromised.”

Also – in case you have not seen it, this new analysis from an independent private climate data company shows how the siting of weather stations affects the data they produce. – Anthony

——————————————————————————————

As many know, there’s a hearing today in the House of Representatives with the Subcommittee on Energy and Environment, Committee on Science, Space, and Technology and there are a number of people attending, including Dr. John Christy of UAH and Dr. Richard Muller of the newly minted Berkeley Earth Surface Temperature (BEST) project.

There seems to be a bit of a rush here, as BEST hasn’t completed all of their promised data techniques that would be able to remove the different kinds of data biases we’ve noted. That was the promise; that is why I signed on (to share my data and collaborate with them). Yet somehow, much of that has been thrown out the window, and they are presenting some results today without the full set of techniques applied. Based on my current understanding, they don’t even have some of them fully working and debugged yet. Knowing that, today’s hearing presenting preliminary results seems rather topsy-turvy. But post-normal science political theater is like that.

I have submitted this letter to be included in the record today. It is written for the Members of the committee, to give them a general overview of the issue, so it may seem generalized and previously covered in some areas. It also addresses technical concerns of mine, shared by Dr. Pielke Sr., on the issue. I’ll point out that on the front page of the BEST project, they tout openness and replicability, but none of that is available in this instance, even to Dr. Pielke and me. They’ve had a couple of weeks with the surfacestations data, and now, without fully completing the main theme of data cleaning, are releasing early conclusions based on that data, without providing the ability to replicate. I’ve seen some graphical output, but that’s it. What I really want to see is a paper and methods. Our upcoming paper was shared with BEST in confidence.

BEST says they will post Dr. Muller’s testimony with a notice on their FAQs page, which also includes a link to video testimony. So you’ll be able to compare. I’ll put up relevant links later. – Anthony

UPDATE: Dr. Richard Muller’s testimony is now available here. What he proposes about Climate-ARPA is intriguing. I also thank Dr. Muller for his gracious description of the work done by myself, my team, and Steve McIntyre.

A PDF version of the letter below is here: Response_to_Muller_testimony

===========================================================

Chairman Ralph Hall

Committee on Science, Space, and Technology

2321 Rayburn House Office Building

Washington, DC 20515

Letter of response from Anthony Watts to Dr. Richard Muller testimony 3/31/2011

It has come to my attention that data and information from my team’s upcoming paper, shared in confidence with Dr. Richard Muller, is being used to suggest some early conclusions about the state of the quality of the surface temperature measurement system of the United States and the temperature data derived from it.

Normally such scientific debate is conducted in peer reviewed literature, rather than rushed to the floor of the House before papers and projects are complete, but since my team and I are not here to represent our work in person, we ask that this letter be submitted into the Congressional record.

I began studying climate stations in March 2007, stemming from a curiosity about the paint used on the Stevenson Screens (thermometer shelters) used since 1892, and still in use today in the Cooperative Observer climate monitoring network. Originally the specification was for lime-based whitewash – the paint of the era in which the network was created. In 1979 the specification changed to modern latex paint. The question arose as to whether this made a difference. An experiment I performed showed that it did. Before conducting any further tests, I decided to visit nearby climate monitoring stations to verify that they had been repainted. I discovered they had, but also discovered a larger and more troublesome problem: many NOAA climate stations were next to heat sources and heat sinks, and had been surrounded by urbanization during the decades of their operation.

The surfacestations.org project started in June 2007 as a result of a collaboration begun with Dr. Roger Pielke Sr. at the University of Colorado, who had done a small-scale study (Davey and Pielke 2005) and found identical issues.

Since then, with the help of volunteers, the surfacestations.org project has surveyed over 1000 United States Historical Climatology Network (USHCN) stations, which are chosen by NOAA’s National Climatic Data Center (NCDC) to be the best of the NOAA volunteer-operated Cooperative Observer network (COOP). The surfacestations.org project was unfunded, relying on the help of volunteers nationwide, plus an extensive amount of my own volunteer time and travel. I have personally surveyed over 100 USHCN stations nationwide. Until this project started, even NOAA/NCDC had not undertaken a comprehensive survey to evaluate the quality of the measurement environment; they only looked at station records.

The work and results of the surfacestations.org project is a gift to the citizens of the United States.

There are two methods of evaluating climate station siting quality. The first is the older 100-foot rule implemented by NOAA (http://www.nws.noaa.gov/om/coop/standard.htm), which says:

The [temperature] sensor should be at least 100 feet from any paved or concrete surface.

A second siting quality method is that of NOAA’s Climate Reference Network (CRN), a high-tech, high-quality electronic network designed to eliminate the multitude of data bias problems that Dr. Muller speaks of. In the 2002 document commissioning the project, NOAA’s NCDC implemented a strict code for the placement of stations, to be free of any siting or urban biases:

http://www1.ncdc.noaa.gov/pub/data/uscrn/documentation/program/X030FullDocumentD0.pdf

The analysis of metadata produced by the surfacestations.org project considered both techniques, and in my first publication on the issue, with 70% of the USHCN surveyed (Watts 2009), I found that only 1 in 10 NOAA climate stations met the siting quality criteria of either the NOAA 100-foot rule or the newer NCDC CRN rating system. Now, two years later, with over 1000 stations (82.5% of the network) surveyed, the 1-in-10 figure holds true using NOAA’s own published criteria for rating station siting quality.

Figure 1 Findings of siting quality from the surfacestations project

During the nationwide survey, we found that many NOAA climate monitoring stations were sited in what can only be described as suboptimal locations. For example, one of the worst was identified in data by Steve McIntyre as having the highest decadal temperature trend in the United States before we actually surveyed it. We found it at the University of Arizona Atmospheric Sciences Department and National Weather Service Forecast Office, where it was relegated to the center of their parking lot.

Figure 2 – USHCN Station in Tucson, AZ

Photograph by surfacestations.org volunteer Warren Meyer

This USHCN station, COOP #028815, was established in May 1867 and has had a continuous record since then. One can safely conclude that it did not start out in a parking lot. One can also safely conclude, from human experience as well as peer-reviewed literature (Yilmaz, 2009), that temperatures over asphalt are warmer than those measured in a field away from such modern influence.

The surfacestations.org survey found hundreds of other examples of poor siting choices like this. We also found equipment problems related to maintenance and design, as well as the fact that the majority of cooperative observers contacted had no knowledge of their stations being part of the USHCN, and were never instructed to perform an extra measure of due diligence to ensure their record keeping, or that their siting conditions should remain homogeneous over time.

It is evident that such siting problems do in fact cause changes in absolute temperatures, and may also contribute to new record temperatures. The critically important question is: how do these siting problems affect the trend in temperature?

Other concerns, such as the effect of concurrent trends in local absolute humidity due to irrigation, which creates a warm bias in the nighttime temperature trends, the effect of height above the ground on the temperature measurements, etc. have been ignored in past temperature assessments, as reported in, for example:

Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841.

Steeneveld, G.J., A.A.M. Holtslag, R.T. McNider, and R.A Pielke Sr, 2011: Screen level temperature increase due to higher atmospheric carbon dioxide in calm and windy nights revisited. J. Geophys. Res., 116, D02122, doi:10.1029/2010JD014612.

These issues are not yet dealt with in Dr. Richard Muller’s analysis, and he agrees.

The abstract of the 2007 JGR paper reads:

This paper documents various unresolved issues in using surface temperature trends as a metric for assessing global and regional climate change. A series of examples ranging from errors caused by temperature measurements at a monitoring station to the undocumented biases in the regionally and globally averaged time series are provided. The issues are poorly understood or documented and relate to micrometeorological impacts due to warm bias in nighttime minimum temperatures, poor siting of the instrumentation, effect of winds as well as surface atmospheric water vapor content on temperature trends, the quantification of uncertainties in the homogenization of surface temperature data, and the influence of land use/land cover (LULC) change on surface temperature trends.

Because of the issues presented in this paper related to the analysis of multidecadal surface temperature we recommend that greater, more complete documentation and quantification of these issues be required for all observation stations that are intended to be used in such assessments. This is necessary for confidence in the actual observations of surface temperature variability and long-term trends.

While NOAA and Dr. Muller have produced analyses using our preliminary data that suggest siting has no appreciable effect, our upcoming paper reaches a different conclusion.

Our paper, Fall et al 2011 titled “Analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends” has this abstract:

The recently concluded Surface Stations Project surveyed 82.5% of the U.S. Historical Climatology Network (USHCN) stations and provided a classification based on exposure conditions of each surveyed station, using a rating system employed by the National Oceanic and Atmospheric Administration (NOAA) to develop the U.S. Climate Reference Network (USCRN). The unique opportunity offered by this completed survey permits an examination of the relationship between USHCN station siting characteristics and temperature trends at national and regional scales and on differences between USHCN temperatures and North American Regional Reanalysis (NARR) temperatures. This initial study examines temperature differences among different levels of siting quality without controlling for other factors such as instrument type.

Temperature trend estimates vary according to site classification, with poor siting leading to an overestimate of minimum temperature trends and an underestimate of maximum temperature trends, resulting in particular in a substantial difference in estimates of the diurnal temperature range trends. The opposite-signed differences of maximum and minimum temperature trends are similar in magnitude, so that the overall mean temperature trends are nearly identical across site classifications. Homogeneity adjustments tend to reduce trend differences, but statistically significant differences remain for all but average temperature trends. Comparison of observed temperatures with NARR shows that the most poorly-sited stations are warmer compared to NARR than are other stations, and a major portion of this bias is associated with the siting classification rather than the geographical distribution of stations. According to the best-sited stations, the diurnal temperature range in the lower 48 states has no century-scale trend.

The finding that the mean temperature trend shows no statistically significant difference dependent on siting quality, while the maximum and minimum temperature trends do, indicates that the lack of a difference in the mean temperatures is coincidental for the specific case of the U.S. sites, and may not be true globally. At the very least, this raises a red flag on the use of the poorly sited locations for climate assessments, as these locations are not spatially representative.

Whether you believe the century of data from the NOAA COOP network we have is adequate, as Dr. Muller suggests, or if you believe the poor siting placements and data biases that have been documented with the nationwide climate monitoring network are irrelevant to long term trends, there are some very compelling and demonstrative actions by NOAA that speak directly to the issue.

1. NOAA’s NCDC created a new hi-tech surface monitoring network in 2002, the Climate Reference Network, with a strict emphasis on ensuring high quality siting. If siting does not matter to the data, and the data is adequate, why have this new network at all?

2. Recently, while resurveying stations that I had previously surveyed in Oklahoma, I discovered that NOAA has been quietly removing the temperature sensors from some of the USHCN stations we cited as the worst offenders of siting quality (CRN ratings 4 and 5). For example, here are before and after photographs of the USHCN temperature station in Ardmore, OK, within a few feet of the traffic intersection at City Hall:

Figure 3 Ardmore USHCN station, MMTS temperature sensor, January 2009

Figure 4 Ardmore USHCN station, MMTS temperature sensor removed, March 2011

NCDC confirms in their metadata database that this USHCN station has been closed, the temperature sensor removed, and the rain gauge moved to another location, the fire station west of town. It is odd that, after being in operation since 1946, NOAA would suddenly cease to provide equipment to record temperature at this station just months after it was surveyed by the surfacestations.org project and its problems highlighted.

Figure 5 NOAA Metadata for Ardmore, OK USHCN station, showing equipment list

3. Expanding the search, my team discovered many more instances nationwide where USHCN stations with poor siting identified by the surfacestations.org survey have had their temperature sensors removed, or have been closed or moved. This includes the Tucson USHCN station in the parking lot, as evidenced by NOAA/NCDC’s own online metadata database, shown below:

Figure 6 NOAA Metadata for Tucson USHCN station, showing closure in March 2008

It seems inconsistent with NOAA’s claim that siting has no impact on the data that they would close a station that had been in operation since 1867 just a few months after our team surveyed it in late 2007 and made its issues known.

It is our contention that many unaccounted-for biases remain in the surface temperature record, and that the resultant uncertainty is large. These systematic biases and this uncertainty need to be addressed not only nationally, but worldwide. Dr. Richard Muller has not yet examined these issues.

Thank you for the opportunity to present this to the Members.

Anthony Watts

Chico, CA

Bowen
April 4, 2011 3:38 pm

linda says:
April 4, 2011 at 3:22 pm
“You are an idiot. Too bad you won’t be alive 50 years from now when the dire consequences of climate change have ravaged the earth.”
You mean like it did to the Sahara Forest?

Max Albert
April 4, 2011 3:53 pm

Are these the same surface stations (with error margins of 1 deg C or more) used to “compute” temperatures two figures past the decimal point? Whatever happened to the “significant digit” rule?

Davidg
April 4, 2011 4:40 pm

Paul Krugman, yet another unqualified climate critic from the Times, like his friend Tom Friedman, knows nothing about climate or science either, but he knows an opportunity when he sees one. The opportunity to call Muller a skeptic!! To use him as a dead fish to beat Anthony (who is a surrogate and symbol of smart, skeptical people on this issue, but he does more than just talk, and that worries these guys; they want to shut him up). Here’s another tissue of falsehoods from the Times:
[snip – I can’t allow a reprint of the entire article in comments due to copyright issues, but here is the link:
http://www.nytimes.com/2011/04/04/opinion/04krugman.html – Anthony ]

Davidg
April 4, 2011 5:15 pm

This is an exact replay of the political ambush on Chris Landsea of NOAA by the IPCC at a Harvard University press conference/mugging about 7 years ago; this is detailed in Booker’s recent work, The Real Global Warming Disaster. That’s why Landsea resigned, on an issue of principle, which is also at work here.

jrship
April 4, 2011 5:20 pm

[Snip. Enough with the denier label. ~dbs, mod.]

Dave Springer
April 4, 2011 5:50 pm


You may have missed my point. Hopefully it isn’t lost on Dodds.

Dave Springer says:
April 4, 2011 at 11:11 am

John Dodds says:
April 4, 2011 at 8:48 am
The idea that adding CO2 can add extra energy to cause warming is absurd. CO2 can NOT create energy. or don’t you believe in the Laws of physics and conservation of energy?

This really makes me cringe. Put your hand on the hood of a white car that’s been sitting in full sun for a while. Then do the same thing with a black car. If you understand the law of physics one tiny bit you’ll know the answer to that without actually burning your hand.

The idea that adding black pigment to paint adds energy to the car is absurd. Yet it makes the surface of the car hotter. Mr. Dodds therefore has some pondering to do about how things that modify radiative absorption and reflection (like pigment in paint and greenhouse gases in the atmosphere) really can cause the temperature of things to change.

Steve S
April 4, 2011 5:59 pm

Just a quick aside, some on the left are cheering this as a victory for ‘their side’, and at least one, C. Johnson, of you know where, is calling Dr. Muller a denialist.

Thad
April 4, 2011 6:17 pm

This is the bit that jumped out at me:
In fact, in our preliminary analysis the good stations report more warming in the U.S. than the poor stations by 0.009 ± 0.009 degrees per decade, opposite to what might be expected, but also consistent with zero. We are currently checking these results and performing the calculation in several different ways. But we are consistently finding that there is no enhancement of global warming trends due to the inclusion of the poorly ranked US stations.
The claim here is that they actually compared ‘good’ vs ‘poor’ stations in an effort to find bias and basically found none. Of course we should always take preliminary statements with a grain of salt until the actual journal papers are published, but this really is a big deal if it stands up to peer review.

Dave Springer
April 4, 2011 7:27 pm

lapogus says:
April 3, 2011 at 2:19 am
“It is sad to see Muller appear to suggest that CO2 can be blamed for 0.6 of the 0.7C warming, when there is no evidence for this.”
Actually there is evidence of it. Human activity emits X amount of CO2 annually. The amount of CO2 in the atmosphere increases by X/2 annually. It is reasonable to assume that were it not for human emission atmospheric CO2 would not be increasing.
The absorptive properties of some gases (including CO2) to long wave radiation are as much of a fact as facts get in science, having been first experimentally measured in many gases over 150 years ago by John Tyndall.
Of the 0.7C rise in average surface temperature, the part attributable to going from 280ppm CO2 to 380ppm should be a little less than 0.5C. The remainder of the rise comes from anthropogenic methane, which is usually, if misleadingly, lumped together with CO2 without specifically mentioning that about a third of the greenhouse warming isn’t CO2.
The greenhouse warming effect of additional greenhouse gases isn’t linear. It’s a case of diminishing returns. The next 100ppm increase in CO2 will cause only half as much warming as the prior 100ppm. It works out that for every CO2 doubling beginning at 280ppm the expected surface temperature rise is about 1.1c.
The problem is that 1.1c per doubling is the exact opposite of catastrophic. It’s beneficial – longer growing seasons, greater photosynthetic efficiency, and less water needed per unit of plant growth.
So… to get from BAGW (Beneficial Anthropogenic Global Warming) to CAGW (Catastrophic Anthropogenic Global Warming) climate boffins invented, out of whole cloth, an amplification mechanism where every degree of CO2 GHG warming causes an additional 2 degrees of water vapor GHG warming. That amount of warming would perhaps be catastrophic. In order to provide any evidence whatsoever that this water vapor amplification is real the climate boffins must cherry pick a historic temperature starting point that is low and compare it to today. They need every single tenth of a degree over Muller’s 0.7c they can get in order to support the amplification hypothesis. The evidence is all against them as it’s rather well established in the geologic column that the earth in the past has had CO2 levels far far higher than today (10x or more higher) and was only warmer by the UN-amplified amount we’d expect and there was never a runaway greenhouse which the catastrophic climate boffin model must certainly entail.
I believe what we are seeing is a combination of climate boffins wanting their work to be seen as very very important and hence very very deserving of rich and increasing funding. Leftist politicians are latching onto it because they see it as a means to expand their empires through greater control and taxation. Then there’s the starry eyed ecoloons who basically hate modern civilization for what it does to the natural environment and believe that reducing CO2 emissions will help save the earth from the effects of industrialization.
All I’m interested in is the truth and I believe the truth is that anthropogenic CO2 emissions are beneficial to both humanity and the biosphere as a whole. If you happen to adore barren rocks and ice in the natural world then CO2 is bad for that but most of the rest of the living world is not enamored of brutally cold winters and permanently frozen ground. The tree huggers appear to have morphed into ice huggers. Ain’t that a hoot?
Natural variation, cloud cover changes, oceanic and solar magnetic cycles are all much more likely to be the cause, rather than an increase in atmospheric CO2 from 0.0285% to 0.0385% .
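The per-doubling arithmetic in the comment above can be checked numerically. The claimed relationship is logarithmic: warming = sensitivity × log2(C/C0). The 1.1C-per-doubling sensitivity is the commenter’s own figure, not an established value; the snippet below is a sketch of the calculation only.

```python
import math

def warming_from_co2(c_now, c_ref=280.0, sens_per_doubling=1.1):
    """Warming (deg C) from a purely logarithmic CO2 response.

    sens_per_doubling is the assumed warming per doubling of CO2;
    1.1 deg C is the commenter's figure, used here for illustration.
    """
    doublings = math.log2(c_now / c_ref)
    return sens_per_doubling * doublings

# 280 ppm -> 380 ppm is about 0.44 of a doubling
print(round(warming_from_co2(380), 2))  # 0.48, i.e. "a little less than 0.5C"
# A full doubling (280 -> 560 ppm) yields exactly the assumed sensitivity
print(round(warming_from_co2(560), 2))  # 1.1
```

Under this model it is each successive doubling, not each successive 100 ppm, that contributes equally, which is the diminishing-returns behavior the comment describes.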

BillyBob
April 4, 2011 9:06 pm

Dave Springer: “The idea that adding black pigment to paint adds energy to the car is absurd. Yet it makes the surface of the car hotter.”
The idea that adding 2 or 3 more coats of black paint to an already black car would add energy is absurd.
Adding more CO2 in between the CO2 molecules that have already stopped the IR, and thinking that would cause more IR to magically come into being … now that’s absurd.
However, what is even more absurd is suggesting CO2 is the only thing that has changed in climate over the last 100 years.
Land use, more bright sunshine, fewer aerosols because of pollution cleanup, more humidity, etc.
If T = some combination of a,b,c,d,e,f,g,h and you only measure and count c, and dismiss any effect a,b,d,e,f,g,h has, you are really, really dishonest.
The IPCC and AGW supporters deliberately and wilfully ignore all the other changes.
They ignore all the T increases that came before CO2.
It is a sad corruption of science.

Colin
April 4, 2011 9:47 pm

Dave Springer: “Actually there is evidence of it. Human activity emits X amount of CO2 annually. The amount of CO2 in the atmosphere increases by X/2 annually. It is reasonable to assume that were it not for human emission atmospheric CO2 would not be increasing.”
This statement draws a false conclusion. Neither the natural sources nor the natural sinks of atmospheric carbon dioxide have been accurately quantified. The geologic record shows that CO2 concentration is a consequence, not a cause, of changes in temperature. Given the size of those natural sources and sinks, human emissions are a rounding error.
There is no greater blunder in science than the statement “It is reasonable to assume…”
You just failed the Sherlock Holmes test.

Girma
April 4, 2011 10:59 pm

Why does not someone show the following information in the congressional hearing?

IPCC: For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios. Even if the concentrations of all greenhouse gases and aerosols had been kept constant at year 2000 levels, a further warming of about 0.1°C per decade would be expected.


http://bit.ly/9pwVyH
Here is a graph that compares the above projections of the IPCC with the actual warming rate of 0.03 deg C per decade.
http://bit.ly/hzs82o
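The “warming rate per decade” figures being compared above are linear least-squares trends fitted to a temperature series. As a minimal sketch of that calculation (synthetic data, purely illustrative):

```python
def decadal_trend(years, temps):
    """Ordinary least-squares slope of temps vs. years, in deg C per decade."""
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    slope = (sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
             / sum((y - mean_y) ** 2 for y in years))
    return slope * 10.0  # per-year slope scaled to per-decade

# Synthetic series warming at exactly 0.03 deg C per decade
years = list(range(2001, 2011))
temps = [0.003 * (y - 2001) for y in years]
print(round(decadal_trend(years, temps), 3))  # 0.03
```

Real comparisons of course depend heavily on the dataset and the start and end years chosen for the fit.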

tonyb
Editor
April 4, 2011 11:19 pm

Dave Springer said;
“It is sad to see Muller appear to suggest that CO2 can be blamed for 0.6 of the 0.7C warming, when there is no evidence for this.”

“Actually there is evidence of it. Human activity emits X amount of CO2 annually. The amount of CO2 in the atmosphere increases by X/2 annually. It is reasonable to assume that were it not for human emission atmospheric CO2 would not be increasing.”
Dave, Temperatures have risen fractionally since the end of the Little Ice Age. Are you in effect saying that it was warming caused by man that ended that epoch?
tonyb

tonyb
Editor
April 4, 2011 11:24 pm

Dave
Re my message concerning the LIA ending. I appreciate you have a nuanced view on CAGW, so this is a rhetorical question for you, but one that warmists might want to argue, so I’m interested in your viewpoint.
tonyb

JPeden
April 5, 2011 12:04 am

linda says:
April 4, 2011 at 3:22 pm
You are an idiot. Too bad you won’t be alive 50 years from now when the dire consequences of climate change have ravaged the earth.
Oops, point of fact, linda: using atmospheric CO2 concentrations as the critical driver, the ipcc’s own CO2=CAGW Climate Science hasn’t yet managed to get even one of its unique, different or changed-from-natural-climate predictions right.
But look no farther than your own statement, linda. Since for you the term “climate change” means or is the same as “CO2=CAGW”, you have also fallen into the same trap the ipcc Climate Science Propaganda Op. laid for others but fell into itself: as far as your and the ipcc’s own, allegedly scientific use of “climate change” is concerned, by changing the usual definition now there can be no “climate change” whatsoever except for fossil fuel CO2-caused Catastrophic Anthropogenic Global Warming!
The simple fact revealed by its “climate change” word-use game is that since ipcc “Climate Science” has purposefully manipulated itself, and its own “science”, into the denialistic position of having to say there has never been any prior “climate change”, and likewise that there won’t be any “climate change” at all unless fossil fuel CO2 causes CAGW, the U.N.’s “Intergovernmental Panel on Climate Change” gives very strong evidence that either it doesn’t understand its own words, possibly even to the point of being self-deluded by them, or else it intends to try to delude people like you via a classic “perception is reality” Propaganda Operation.
Because that’s all the ipcc’s CO2=CAGW Climate Science really is.

Jeremy
April 5, 2011 12:43 am

I don’t understand why you’ve bothered to point out in great detail that some temperature recording points have been removed. It’s a good thing, right? If they’re controversial, removing them is the right thing to do. Even if they don’t affect the overall results, it’s better to move them and be above reproach.
So I don’t get why you rail against the siting of stations, and then rail against them being moved.

P. Solar
April 5, 2011 12:47 am

I just scanned Muller’s testimony. His level of science becomes instantly clear.
“Human caused global warming is somewhat smaller. According to the most recent IPCC report (2007), the human component became apparent only after 1957, and it amounts to “most” of the 0.7 degree rise since then. Let’s assume the human-caused warming is 0.6 degrees.
The magnitude of this temperature rise is a key scientific and public policy concern. A 0.2 degree uncertainty puts the human component between 0.4 and 0.8 degrees – a factor of two uncertainty. Policy depends on this number. It needs to be improved.”
Now IPCC actually defined “most” to be its unequivocal literal meaning of “more than 50%”.
So why does this great scientist and renowned physics professor decide to ASSUME they meant 85.7%?

P. Solar
April 5, 2011 1:21 am

Muller says to congress: “In an initial test, Berkeley Earth chose stations randomly from the complete set of 39,028 stations. Such a selection is free of station selection bias.”
No professor, random selection contains random bias. To state it is “free” of bias is untrue. But you know that already don’t you, because you have a long career in science and the help of top level statisticians on your team.
If one then wanted to produce a predetermined result, one may then do as many random selections as is necessary and select the ones with desired random bias.
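The point about random selection can be illustrated with synthetic data: a random subsample is unbiased on average, but any single draw carries sampling error, and repeated draws produce a spread of results from which a favorable one could be chosen. A minimal sketch (the 39,028 figure echoes the station count quoted above; the trend numbers are invented for illustration):

```python
import random
import statistics

random.seed(42)

# Synthetic "station trends" (deg C/decade): true mean 0.20,
# station-to-station scatter 0.15 (both numbers invented)
population = [random.gauss(0.20, 0.15) for _ in range(39028)]

# Draw many random subsets of 100 stations and look at the spread of their means
subset_means = [statistics.mean(random.sample(population, 100))
                for _ in range(200)]

print(round(min(subset_means), 3), round(max(subset_means), 3))
# Each draw is "random", yet individual draws can differ noticeably from the
# population mean: random selection is unbiased on average, not error-free
# in any single realization.
```

The spread between the smallest and largest subset means shows how much room a single “random” selection leaves for an unrepresentative result.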

P. Solar
April 5, 2011 1:39 am

Jeremy says: “I don’t understand why you’ve bothered to point out in great detail that some temperature recording points have been removed. It’s a good thing, right? ”
Yes, I’m not sure why Anthony slates this. He repeatedly says it is done “quietly”.
The metadata should record the reason for site removal or discontinuation as being poor quality of the equipment. I doubt this is being done. Maybe that’s what he means about it being quiet. If that’s the case, the point should be made clearly.
This is in fact a backhanded recognition of his work. Although it’s a good job we have the photos and evidence of the station survey.

Dave Springer
April 5, 2011 5:23 am

BillyBob says:
April 4, 2011 at 9:06 pm

Dave Springer: “The idea that adding black pigment to paint adds energy to the car is absurd. Yet it makes the surface of the car hotter.”
The idea adding 2 or 3 more coats of black paint to an already black car would add energy is absurd.

Right you are, BillyBob. But we’re not talking about adding more paint. We’re talking about adding more pigment. If you take a gallon of white paint and put a thimbleful of black pigment in it, it turns a shade of gray. Paint the car with that and the car will be warmer sitting in the sun than a white car. Add more pigment to get a darker shade of gray and it gets warmer than the lighter shade.

It isn’t a perfect analogy, of course. Greenhouse gases work selectively, allowing visible light to pass right through to warm the surface but then hampering the escape of invisible long-wave radiation. A better analogy is with insulation. It’s absurd to say that putting insulation in your attic will add heat to your home in the winter. But by the same token, it won’t take as much energy from your furnace to maintain the same indoor temperature. The sun is our furnace. If we add more insulation in the attic (CO2) and the sun (our furnace) keeps adding the same amount of energy, the temperature will rise, and it rises due to the better insulation. This part of the global warming narrative is beyond credible dispute. The arguable parts are
1) the water vapor amplification
2) the magnitude of natural warming/cooling
3) the practical ramifications.
My position is that
1) there is no water vapor amplification
2) natural climate variation is far greater
3) without water vapor amplification the practical ramification is highly beneficial
The earth has been in an ice age for the past 3 million years, with a recent cyclicity of 100,000 years of glaciers covering everything north of Washington, D.C. with a mile-thick sheet of ice, followed by 10,000 years of glacial retreat back to the poles. At the present moment we’ve had over 10,000 years of glacial retreat. Therefore the LAST thing we should be worrying about is global warming. Unless you happen to think that ice ages are a good thing. Then you’re a nutcase totally disconnected from reality, i.e. an ice hugger. The new tree hugger is an ice hugger – just as loopy, still a self-anointed savior of mother nature, but with a new and improved windmill boogeyman at which to tilt.

Dave Springer
April 5, 2011 6:19 am

Muller: “Human caused global warming is somewhat smaller. According to the most recent IPCC report (2007), the human component became apparent only after 1957, and it amounts to “most” of the 0.7 degree rise since then. Let’s assume the human-caused warming is 0.6 degrees.”
It might have become apparent at that point, but there hasn’t been more anthropogenic warming per decade in recent years than in earlier ones. Anthropogenic emissions have been rising at an exponential rate since the 18th century. The greenhouse effect of CO2 grows only logarithmically with its concentration, so each added increment does less. The combination of those two facts means that anthropogenic GHG warming has been roughly the same each decade for almost 200 years. The total of it over all that time is a bit less than 0.5 °C, with the remainder of the 0.7 °C being due to anthropogenic methane. This small linear increase in average temperature rides on top of vastly larger natural variations. The latter half of the 20th century has been dominated by temperature rise associated with multidecadal ocean temperature oscillations, particularly the Atlantic Multidecadal Oscillation (AMO), which has a roughly 60-year period: 30 years of cooling followed by 30 years of warming. Other ocean cycles, ENSO and the PDO, have different cycle lengths which combine with the AMO to either reinforce or offset it, making the 60-year cycle a bit less predictable in magnitude and timing. When all three happen to have their peaks or valleys aligned, we get abnormally high or low events. I believe I’ve read here that we had a bit of a perfect storm with these cycles all peaking together in the 1990s, with the mother of all El Niños (ENSO) putting a capstone on it in 1998.
On top of these ocean cycles are solar cycles, with periods ranging from 11 years (the sunspot cycle) to centuries (the waxing and waning magnitude of the 11-year cycle), and then there are galactic cycles with periods in the tens or even hundreds of millions of years. All these solar and galactic cycles combine to determine the flux of high-energy cosmic rays (actually heavy particles moving near the speed of light) which impact the upper atmosphere and cause a cascade of other particles upon impact. The cascade events are thought to produce condensation nuclei, which throttle the formation of high-altitude clouds. More or fewer high-altitude clouds, governed by more or fewer cosmic rays, mean more or less sunlight is reflected, which in turn means more or less warming insolation from the sun reaches the surface. Tiny variations in cloud cover have huge effects on surface temperature. Clouds are the Achilles heel of the global circulation models (GCMs) upon which the climate boffins rely to make their doom-and-gloom predictions.
Dig it: nobody really knows what the “average” albedo of the earth is or how much it varies over time. Attempts to measure it are not in satisfactory agreement with a range that varies by about 7% (33%-40%). The measurements all agree on one thing however – albedo varies from year to year. A 1% change in albedo has a far greater effect on average surface temperature than all the anthropogenic forcings combined. So we have to consider what anthropogenic soot is doing to albedo, anthropogenic aerosols, land use changes, and a whole raft of natural variations as well. Sorting out the anthropogenic from the natural appears to be quite impossible. All we can do is theorize about anthropogenic CO2 and methane and say what happens if everything else remains equal. The thing is that nothing else remains equal for long and the other things that don’t remain equal effectively mask anthropogenic forcings because the natural variations are potentially so much larger and, to a large degree, unpredictable.
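The diminishing-returns claim in the comment above corresponds to the standard logarithmic forcing approximation; a minimal sketch (the 5.35 W/m² coefficient is the widely cited simplified expression from the literature, an assumption added here, not a figure from the comment):

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified logarithmic CO2 radiative forcing in W/m^2,
    using the commonly cited 5.35 W/m^2 coefficient (an assumption)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each doubling adds the same forcing increment, which is why
# exponentially growing emissions give roughly linear warming in time:
first_doubling = co2_forcing(560) - co2_forcing(280)
second_doubling = co2_forcing(1120) - co2_forcing(560)
print(first_doubling, second_doubling)  # both ~3.7 W/m^2
```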

rizzo
April 5, 2011 7:13 am

Lol yep I’m sure nobody takes into account the area around the stations changing. You’re the only one who figured it out, great job, smart guy.

REPLY:
Then show it. Show me where they’ve done it specifically and report back here, along with an explanation of why they have a specification for it yet only 1 in 10 stations actually adheres to it. – Anthony

BillyBob
April 5, 2011 9:42 am

Dave Springer: “But we’re not talking about adding more paint. ”
Yes you are. Another layer of the exact same paint.
The only thing different about adding more CO2 is that it takes less distance from the ground to absorb the energy.
Instead of 30 m, a doubling of CO2 causes all the warming to occur in 15 m.
Another doubling causes all the warming in 7.5 m.
etc., etc.
Now, if you have any science that suggests the warming occurs in, say, 10 miles at current ppm and then 5 miles at a doubling, I’ll look at it.
But the figures I’ve seen say the distances are minuscule.
And it doesn’t matter whether CO2 warms 30 m, 15 m or 7.5 m from the ground.
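BillyBob’s halving distances follow from Beer–Lambert-style absorption, where the e-folding length scales inversely with concentration; a quick sketch (the 30 m starting length is the commenter’s illustrative figure, and the 390 ppm baseline is an assumption added here):

```python
def absorption_length(concentration_ppm, base_ppm=390.0, base_length_m=30.0):
    """E-folding absorption length scales as 1/concentration
    (base values are illustrative, not measured constants)."""
    return base_length_m * base_ppm / concentration_ppm

# Each doubling of concentration halves the distance:
for c in (390.0, 780.0, 1560.0):
    print(c, "ppm ->", absorption_length(c), "m")  # 30, 15, 7.5
```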

Thomas Lux
April 9, 2011 4:54 am

Great work… I have another theory about why land surface stations might be producing temperature readings that appear to be rising more rapidly now than prior to 1950 or so. In the past 60 years, Europe and North America have undergone rapid REFORESTATION. The forests in North America had become seriously depleted during the 1850–1950 period, as wood was harvested at a very rapid pace to accommodate the needs of rapidly increasing populations. After WWII, North America and Europe started to improve forest management procedures, and over the next 50 years much of the depleted forest acreage was restored. During the past 60 years the biomass of the forests in Europe and North America has increased dramatically. That’s probably good. However, it also means that a great deal of additional ground water is now being evaporated daily from those trillions of new leaves. This constitutes a change in the biosphere which would likely increase land-based temperatures slightly, simply as a result of the slightly elevated level of H2O now vs. pre-1950. If H2O is responsible for 80–90% of the GE, and if this occurs in a logarithmic manner, e.g. k·ln(H2Onew/H2Oold) with k ≈ 5 or so as a Stefan–Boltzmann adjustment, then increasing average H2O only slightly, from say 10,000 ppm to 10,500 ppm, might ADD as much as 0.2 °C to any temperature anomaly occurring from 1950 to 2010?
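The back-of-envelope figure at the end of this comment can be reproduced directly (k = 5 and the ppm values are the commenter’s assumed inputs, not established data):

```python
import math

k = 5.0                                  # commenter's assumed Stefan-Boltzmann scaling
h2o_old, h2o_new = 10_000.0, 10_500.0    # ppm, the commenter's illustrative figures

delta_t = k * math.log(h2o_new / h2o_old)
print(round(delta_t, 2))                 # ~0.24 C, close to the ~0.2 C claimed
```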

Thomas Lux
April 9, 2011 5:22 am

In a recently published paper, Dr. Hans Jelbring argues that the theory that radiative forcing dominates the Greenhouse Effect is simply wrong. Using a model earth to explain his beliefs, Dr. Jelbring concludes that most of the GE is simply the result of gravity working on the entire mass of atmospheric gases, O2 and N2 included, which creates the lion’s share of the GE here and on most planets with substantial atmospheres, e.g. Venus, Saturn, etc.
In Dr. Jelbring’s words: “The generally claimed importance of ‘greenhouse’ gases rests on an unproven hypothesis (ref 1). The hypothesis is based on radiative models of energy fluxes in our atmosphere. These are inadequate, since radiative processes within the atmosphere are poorly described, convective energy fluxes are often inadequately described or omitted, and latent heat fluxes are poorly treated. The whole GE in these models is wrongly claimed being caused by ‘greenhouse gases’. The considerations in this paper indicate that effects of the greenhouse gases, other radiative effects, and convection effects all might modulate GE to a minor unknown extent. Hence, the atmospheric mass exposed to a gravity field is the cause of the substantial part of GW. The more atmospheric mass per unit planetary area, the greater GE has to develop. Otherwise Newton’s basic gravity model has to be dismissed.”
http://ruby.fgcu.edu/courses/twimberley/EnviroPhilo/FunctionOfMass.pdf
If Jelbring is right, then CO2, which constitutes a very small portion of the total mass of the atmosphere ON EARTH, would likely be responsible for only a tiny fraction of the entire GE. Note that in Jelbring’s view, the greenhouse gases are all of the gases in earth’s atmosphere. In the end, he argues that the GE is simply a function of the height of the atmosphere D times g/cp, where g = gravity and cp = the specific heat capacity of air, which is about 1.006 kJ/(kg·K). So for earth, where the atmosphere exists mostly below 15,000 meters, the GE is equal to the lapse rate between 0.1 bar and the earth’s surface.
This is an interesting observation which, if true, turns the entire AGW argument upside down! Those who are interested in climate change would find his views interesting, at the very least!
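For what it’s worth, the g/cp factor in Jelbring’s expression is just the dry adiabatic lapse rate, and the arithmetic is easy to check (a sketch only; the ~33 K value below is the textbook surface greenhouse effect, added here as an assumption, not a number from the comment):

```python
g = 9.81      # gravitational acceleration, m/s^2
cp = 1006.0   # specific heat of dry air, J/(kg K); the comment's 1.006 is in kJ/(kg K)

lapse_rate = g / cp              # K per metre (dry adiabatic lapse rate)
print(lapse_rate * 1000)         # ~9.75 K per kilometre

# Under the comment's reading, GE ~ lapse rate x an effective height.
# A textbook ~33 K greenhouse effect would then imply a height of:
print(33.0 / lapse_rate)         # ~3,400 m
```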
