Watts et al. gets a mention.
3. NEW INFORMATION ON SURFACE TEMPERATURE PROCESSES
In general, the issue of global warming is dominated by considering the near-surface air
temperature (Tsfc) as if it were a standard by which one might measure the climate
impact of the extra warming due to increases in greenhouse gases. Fundamentally, the
proper variable to measure is heat content, or the amount of heat energy (measured in
joules) in the climate system, mainly in the oceans and atmosphere. Thus the basic
measurement for detecting greenhouse warming is how many more joules of energy are
accumulating in the climate system over that which would have occurred naturally. This
is a truly “wicked” problem (see House Testimony, Dr. Judith Curry, 17 Nov 2010)
because we do not know how much accumulation can occur naturally.
Unfortunately, discussions about global warming focus on Tsfc even though it is affected
by many more processes than the accumulation of heat in the climate system. The
problems have been widely documented and largely center on changes in the local
environment, i.e. buildings, asphalt, etc. This means that using Tsfc, as measured today,
as a proxy for heat content (the real greenhouse variable) can lead to an overstatement of
greenhouse warming if the two are assumed to be too closely related.
A new paper by my UAHuntsville colleague Dr. Richard McNider (McNider et al. 2012)
examined why daytime high temperatures (TMax) are not warming much while
nighttime low temperatures (TMin) show significant warming. This has been known for
some time and has been documented in several locations around the world (e.g.
California – Christy et al. 2006, East Africa – Christy et al. 2009, Uganda – just-released
data). Without going into much detail, the bottom line of the study is that as humans
disturb the surface (cities, farming, deforestation, etc.) this disrupts the normal formation
of the shallow, surface layer of cooler air during the night when TMin is measured. In a
complicated process, due to these local changes, there is greater mixing of the naturally
warmer air above down to the shallow nighttime cool layer. This makes TMin warmer,
giving the appearance of warmer nights over time. The subtle consequence of this
phenomenon is that TMin will show warming, but the warming comes from a turbulent
process that redistributes heat near the surface, not from the accumulation of heat
related to greenhouse warming of the deep atmosphere. This matters because many of
the positive feedbacks that amplify the CO2 effect in climate models depend on warming
of the deep atmosphere, not the shallow nighttime layer.
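The redistribution argument can be illustrated with a toy two-layer calculation (my own sketch, not from McNider et al.): mixing warmer air from aloft into the shallow nocturnal layer raises the TMin reading while the column's heat content stays exactly the same.

```python
# Toy two-layer calculation (my own illustration, not from the paper):
# mixing warmer air from aloft into the shallow nighttime layer raises
# TMin even though no heat is added to the column.

# Shallow cool surface layer vs. deep warmer layer; heat content is
# taken as mass * temperature with a constant heat capacity.
shallow_mass, shallow_T = 1.0, 10.0   # thin nocturnal layer (cool)
deep_mass, deep_T = 9.0, 15.0         # deep residual layer (warmer)

heat_before = shallow_mass * shallow_T + deep_mass * deep_T

# Surface disturbance enhances turbulent mixing: blend the layers.
mixed_T = heat_before / (shallow_mass + deep_mass)
heat_after = (shallow_mass + deep_mass) * mixed_T

print(f"TMin rises from {shallow_T:.1f} to {mixed_T:.1f}")
print(f"heat content before/after: {heat_before:.1f}/{heat_after:.1f}")
```

The thermometer at screen height sees the warming, but the column's joules are unchanged — exactly the distinction the testimony draws between Tsfc and heat content.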
During the day, the sun generally heats up the surface, and so air is mixed through a deep
layer. Thus, the daily high temperature (TMax) is a better proxy of the heat content of
the deep atmosphere since that air is being mixed more thoroughly down to where the
thermometer station is. The relative lack of warming in TMax is an indication that the
rate of warming due to the greenhouse effect is smaller than models project (Section 2).
The problem with the popular surface temperature datasets is that they use the average of
the daytime high and nighttime low as their measurement (i.e. (TMax+TMin)/2). But if
TMin is not representative of the greenhouse effect, then the use of TMin with TMax will
be a misleading indicator of the greenhouse effect. TMax should be viewed as a more
reliable proxy for the heat content of the atmosphere and thus a better indicator of the
enhanced greenhouse effect. This exposes a double problem with models. First of all,
they overwarm their surface compared with the popular surface datasets (the non-circle
symbols in Fig. 2.1). Secondly, the popular surface datasets are likely warming too much
to begin with. This is why I include the global satellite datasets of temperature which are
not affected by these surface problems and more directly represent the heat content of the
atmosphere (see Christy et al. 2010, Klotzbach et al. 2010).
Fall et al. 2011 found evidence for spurious surface temperature warming in certain US
stations which were selected by NOAA for their assumed high quality. Fall et al.
categorized stations by an official system based on Leroy 1999 that attempted to
determine the impact of encroaching civilization on the thermometer stations. The result
was not completely clear-cut: Fall et al. showed that disturbance of the surface around a
station was a problem, though not a dominant one. A new manuscript by Muller et al.
2012, using the old categorizations of Fall et al., found roughly the same thing. Now,
however, Leroy 2010 has revised the categorization technique to include more details of
changes near the stations. This new categorization was applied to the US stations of Fall
et al., and the results, led by Anthony Watts, are much clearer now. Muller et al. 2012
did not use the new categorizations. Watts et al. demonstrate that when humans alter the
immediate landscape around the thermometer stations, there is a clear warming signal
due simply to those alterations, especially at night. An even more worrisome result is
that the adjustment procedure for one of the popular surface temperature datasets actually increases the temperature of the rural (i.e. best) stations to match and even exceed the more urbanized (i.e. poor) stations. In this case it appears the adjustment process took the spurious warming of the poorer stations, spread it throughout the entire set of stations, and even magnified it. This is ongoing research and bears watching, as other factors are still under investigation, such as changes in the time of day at which readings were taken, but at this point it helps explain why the surface measurements appear to be warming more than the deep atmosphere (where the greenhouse effect should appear).
Full testimony PDF here: christy-testimony-2012
Politicians are like the media. If it bleeds it leads. They want sound bites. Catastrophe cake of any kind will get first billing, especially if iced with imitation intelligence (i.e. peer review). Democrats (the greenies for you in other countries) are about saving the future by killing the present. Conservatives are about growing the present in order to save the future. So feed them catastrophe cake iced with artificial intelligence and you will have their ear. Just change the message to match the politician.
Christy’s charts are absolute gold. Bravo.
“I suppose if one wanted to reduce U.S. emissions, one could legislate what the world should and
should not buy. This, of course, is not a serious idea.”
It is if you are a leftist…
Tav = (Tmax + Tmin)/2
sloppy, high-school stuff.
T == Energy
epic fail, not.even.wrong.
(yet somehow the gravy train rolls along)
Ric Werme says:
August 1, 2012 at 2:06 pm
Theo Goodwin says:
August 1, 2012 at 10:50 am
“The problem with the popular surface temperature datasets is they use the average of the daytime high and nighttime low as their measurement (i.e. (TMax+TMin)/2).”
This “average,” this contrivance, is one of the most bone-headed ideas known to mankind. Why people who call themselves scientists, or even TV weathermen, would “average” crucial pieces of raw data, especially pieces that are qualitatively different in character, and then use that average to create the all important “surface temperature datasets” is a question that should strike terror into everyone interested in the quality of climate science or even TV meteorology. Stupid, stupid, stupid, stupid, stupid, stupid.
Can we now agree to use the raw data?
“What do you propose we do with the decades of data that are max/min temperatures recorded by hand from max/min thermometers? Replace it with treering data?”
The same thing we always do when we make decisions that hurt ourselves, our colleagues, our employers, and science. Take full responsibility. That means accepting disciplinary action. Throw the trash out and start over. Advertise to the world that it is trash and that it must be thrown out. Then we will have gotten out of the way of science and ended our contribution to the blocking of scientific progress. If we are of Japanese heritage, we also offer to accept a 50 percent permanent reduction in pay.
cbltoo says:
August 1, 2012 at 12:38 pm
…..It’s not about the facts – it’s about influencing opinion, not debating one set of data against another. Desmog is run by David Suzuki’s PR firm, for heaven’s sake.
Are there no PR advisors who could help the realist scientists convince ordinary people and politicians?
________________________________
I agree that it is about propaganda, and the other side has had it nailed since Willi Münzenberg’s ‘Innocents’ Clubs’ began in the 1920s.
The problem is “They” can use all the PR advisors they want, but if we so much as walk in the door of a PR firm we are dead meat. Just look at Big Oil funding. Big Oil has been behind this and funding it from the get-go. Maurice Strong was an oil man, yet he ran the First Earth Summit and Kyoto. Shell and BP provided the original funding for CRU. Ged Davis, a Shell Oil VP and recent head of the World Business Council for Sustainable Development’s Scenario Project team (aka Agenda 21), wrote the attachment in ClimateGate (1) email 0889554019. It is a rough draft of Agenda 21/Sustainable Development that was sent to climate scientists, government officials and Greenpeace for comments.
Look at the Muller media blitz and Muller’s Big Oil connection. If you go to the listing of the TEAM at Muller Assoc. you find Arthur Rosenfeld, Former California Energy Commissioner, among others.
Further down you find Marlan Downey
Click on Marlan Downey, Oil and Gas Executive
and you find
Yet with all the Big Oil connections between government officials, Greenpeace (Check out their standard oil grants aka Rockefeller money) and the CAGW hoax you never ever see a peep about it in the MSM or even on the blogs.
However, let Exxon fork over a measly few thousand a year and it is screamed to the rafters. Heartland Institute has received $676,500 from ExxonMobil since 1998; the maximum was $90,000, in 2005 and again in 2006.
Greenpeace, on the other hand, has received $1,215,285 since 1996 from just the Rockefellers, and Sierra Club received $450,000 from the Bush family.
From the PDF of the testimony –
‘The non-falsifiable hypotheses can be stated this way, “whatever happens is consistent
with my hypothesis.” In other words, there is no event that would “falsify” the
hypothesis. As such, these assertions cannot be considered science or in any way
informative since the hypothesis’ fundamental prediction is “anything may happen.” In
the example above if winters become milder or they become snowier, the non-falsifiable
hypothesis stands. This is not science.’
I just fancy putting this on a poster outside Hansen’s office.
Over here in this part of the world we were always taught that the USA 1930’s dustbowl conditions were caused by “overfarming”.
John Christy’s testimony shows that the 1930s was an extreme weather event, of at least a 1-in-100 yr magnitude. It is clearly shown in the mid west graphs as producing the most Tmax daily records and the lowest observed annual rainfall this decade. It is also represented in temperature records from the west coast.
I wonder what the AGW zealots would have said then if they were around at the time?
bali007 says:
T == Energy
epic fail, not.even.wrong.
Since the internal energy of a gas is directly proportional to temperature, why is there such a problem with this?
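For reference, the relation the commenter is leaning on is the ideal-gas result U = (3/2)nRT for a monatomic gas (the largely diatomic atmosphere uses 5/2, but the proportionality to T is the same). A quick sketch:

```python
# Sketch of the ideal-gas relation behind the comment: for a monatomic
# ideal gas, U = (3/2) * n * R * T, so internal energy is directly
# proportional to absolute temperature at fixed n.
R = 8.314  # gas constant, J/(mol K)

def internal_energy(n_moles: float, temp_kelvin: float) -> float:
    """Internal energy (J) of an ideal monatomic gas."""
    return 1.5 * n_moles * R * temp_kelvin

# Doubling T doubles U for a fixed amount of gas:
u_300 = internal_energy(1.0, 300.0)
u_600 = internal_energy(1.0, 600.0)
print(u_600 / u_300)  # → 2.0
```

The caveat relevant to the thread: the proportionality holds for a fixed, well-mixed mass of gas, whereas a single near-surface reading need not track the column's total energy — which is the testimony's point.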
The use and abuse of statistics in climate science by the current U.S. Administration is far from a unique problem. The North American stock market abruptly jumped with gains on the opening of the market upon release of the latest unemployment statistics claiming the number of new jobs created was better than expected while the unemployment rate increased. After untangling this twisted reporting, you find the number of jobs created was extremely low in comparison to the past years for several decades, and the unemployment numbers are a statistical artifact due to “seasonal adjustments.” Yes, arbitrary adjustments are used in a manner which reverses the reality of the raw numbers. Some people might describe such goings on to a Congress critter as something akin to putting lipstick on a pig without using the sow’s ear to make a matching purse.
The details and charts illustrating the problems with political abuse of labor statistics can be seen at: Seasonal And Birth Death Adjustments Add 429,000 Statistical “Jobs”
http://www.zerohedge.com/news/seasonal-and-birth-death-adjustments-add-429000-statistical-jobs
The scientific case must be presented to Congress, but the opposition to this testimony has demonstrated a will to abuse such science for the same political purposes as the labor statistics, and more.
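The sign reversal described above can be reproduced with hypothetical numbers (illustrative only, not the actual BLS figures), using a simplified additive seasonal adjustment:

```python
# Hypothetical numbers (illustrative only, not the actual BLS figures),
# using a simplified additive seasonal adjustment: subtract the typical
# gain for the month from the raw change.

raw_change = 60_000               # jobs actually added this month
typical_seasonal_gain = 120_000   # historical average gain for this month

adjusted_change = raw_change - typical_seasonal_gain
print(adjusted_change)  # → -60000: a raw gain reported as an adjusted loss
```

A month can thus add jobs in raw terms yet be reported as a decline, simply because it added fewer than that month usually does.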
A little late, but a big Thank You. I doubt if Boxer listened to you (Inconvenient Truths for her), but many more heard you. Again, God Bless and Thanks!
DaveR says:
August 2, 2012 at 4:41 pm
Over here in this part of the world we were always taught that the USA 1930′s dustbowl conditions were caused by “overfarming”.
John Christy’s testimony shows that the 1930s was an extreme weather event, of at least a 1-in-100 yr magnitude. It is clearly shown in the mid west graphs as producing the most Tmax daily records and the lowest observed annual rainfall this decade. It is also represented in temperature records from the west coast.
I wonder what the AGW zealots would have said then if they were around at the time?
===========================================================================
“We’re all going to die!” (Unless you give us your money, your choices, let us tell your children Government is the answer …)
John Brookes:
At August 3, 2012 at 5:05 am you ask: