Guest commentary by John Coleman
(reprinted from KUSI-TV by Mr. Coleman’s request)
Was July 2012 the hottest July in United States history? Was the last 12-month period the warmest ever? NOAA, the National Oceanic and Atmospheric Administration, says “Yes”. The agency released a map and statement in early August stating without reservation that July 2012 and the last 12 months were the hottest ever. The national media gave the report a “headline play,” as we say in the media business. And many of the media stories linked this hot weather with global warming.
However, skeptical scientists have produced studies showing that the last 15 years have seen a cooling in the United States. This is the NOAA NCDC Climate at a Glance U.S. annual mean temperature trend over the last 15 years.

No doubt it has been very, very hot and very, very dry. But if you check the facts behind these reports, the claims are not all that certain. Consider this: the data behind the NOAA claims shows the average temperature in July 2012 was only two-tenths of one degree warmer than July 1936. And there are several papers posted on the internet claiming that if the temperatures and the means of processing the data had not been “adjusted” several times in the past 50 years, 1934 would still be the warmest year ever, considerably warmer than the last 12 months. I conclude that while there is some basis for the NOAA record-warmth position, those who challenge NOAA’s claims also make good cases. And it seems to me that while the recent hot, dry weather is clearly out of the ordinary, as it stands for now, it is not the sort of extreme event that might prove global warming. And any connection between the hot, dry weather and warming caused by the activities of mankind remains totally unproven.
Global warming is about a predicted dramatic increase in the temperature year after year leading to the melting of the polar ice caps resulting in a dramatic rise in ocean water levels producing coastal flooding. It also predicts non-stop droughts, massive worldwide killer heat waves and superstorms. Were the conditions that have developed in the United States this summer to become worldwide (which they have not) and to continue increasing in intensity year after year for a decade, that would be the sort of global warming event that would lead to the devastating results predicted by Al Gore and the United Nations Intergovernmental Panel on Climate Change (IPCC).
The runaway heating predicted by the global warming advocates’ computer models is supposed to occur as a result of the activities of our civilized society, primarily the use of fossil fuels to generate electricity, to fuel internal combustion engines in cars and trucks, and to power jet airplanes. The theory is that carbon dioxide (CO2) in the exhaust from burning fossil fuels is an extraordinary greenhouse gas that amplifies the greenhouse effect in our atmosphere. Even if the predicted warming of the climate occurs, that does not prove the CO2 causative theory. So far, efforts to prove that theory with computer models are totally off the mark.
Mankind began to burn coal extensively in around 1770 and to use oil extensively starting in the 1880s. The use of fossil fuels greatly increased between 1945 and 2000. A measure of atmospheric CO2 known as the Keeling Curve (named for the scientist who set up the measuring system) shows a steady rise in atmospheric CO2 as a result. (Despite the increase, the total of CO2 remains much less than 1% of the atmosphere.)

So a key question in the global warming debate has been “How much warming have we experienced so far as a result of our use of fossil fuels and the related release of carbon dioxide exhaust into the atmosphere?”
This seems as though it would be a very simple question to answer. To get the answer, you simply look at the chart of average temperatures through the years and measure the warming. Compare the warming since man has been using fossil fuels with the warming in a similar period before that and, bingo, the influence of mankind’s activities should be clear. You might be surprised to hear that there are lots of complications that make this seemingly simple exercise impossible.
So what are the complications? After all, despite what anyone else has said or done, shouldn’t anyone be able to take all of the temperatures recorded across the nation, average them to get the temperature for each hour, then average all 24 hours at the end of the day to get the average for that day, and so on?
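As an aside, the averaging procedure just described is easy to sketch in code. The station names and readings below are invented purely for illustration:

```python
# Toy illustration of the "simple" averaging procedure described above.
# Hourly readings (deg F) for one day at three hypothetical stations.
hourly = {
    "STATION_A": [55 + h for h in range(24)],       # invented data
    "STATION_B": [60 + h % 12 for h in range(24)],  # invented data
    "STATION_C": [50.0] * 24,                       # invented data
}

def national_hourly_means(stations):
    """Average all stations for each hour of the day."""
    return [sum(s[h] for s in stations.values()) / len(stations)
            for h in range(24)]

def daily_mean(stations):
    """Average the 24 national hourly means into one daily figure."""
    hours = national_hourly_means(stations)
    return sum(hours) / 24

print(round(daily_mean(hourly), 2))
```

The arithmetic is trivial; the catch, as the essay goes on to explain, is that the raw hour-by-hour records needed to feed it mostly do not exist.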
Well, it turns out that is not possible, because only a few pre-selected long-term observation records have been saved in raw, hour-by-hour and day-by-day form. The great mass of long-term temperature records is not available. Basically, the historical data is not in a format that lends itself to a basic, start-over analysis.
There has been a long series of competing research reports, papers, charts and documents on historical temperatures and the long term trends. There is no consensus on what they show and why, just prolonged and spirited debate among the scientists. In the end, I fear, neither side is going to “win” this argument.
My friend Anthony Watts, a Chico, California-based meteorologist, the man who hosts the Wattsupwiththat website (the most popular climate website in the world), and several other scientists have recently released a draft version of a new paper detailing a re-analysis of the temperature data for the United States. They applied a recently developed but well-accepted international process to separate long-term U.S. temperatures from a pre-selected set of “official” weather observation points. They concluded that the temperature adjustments used by the National Oceanic and Atmospheric Administration produce a temperature increase of 0.30 degrees Celsius per decade, while this new method results in half as much of an increase: 0.15 degrees Celsius per decade. This paper has met with both cheers and jeers in the meteorology community. It is a big deal. But, in the end, I fear it will be countered by a paper that uses still another method of analysis, and the debate will only go on and on.
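For readers unfamiliar with how such figures are produced: a trend in degrees per decade is simply the slope of a least-squares line through the annual means, multiplied by ten. Here is a minimal sketch using an invented, perfectly linear series (not the Watts or NOAA data):

```python
# A trend in deg C per decade is the slope of a least-squares line
# through annual mean temperatures, scaled by 10. Data invented.
def trend_per_decade(years, temps):
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    slope = (sum((y - my) * (t - mt) for y, t in zip(years, temps))
             / sum((y - my) ** 2 for y in years))  # deg C per year
    return slope * 10                              # deg C per decade

# Synthetic series warming at exactly 0.015 C/yr (0.15 C/decade):
years = list(range(1979, 2009))
temps = [14.0 + 0.015 * (y - 1979) for y in years]
print(round(trend_per_decade(years, temps), 3))    # -> 0.15
```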
Let’s look at the history. There were no reliable thermometers until the early 1700s, when Daniel Gabriel Fahrenheit produced his mercury thermometer and, in 1724, the temperature scale that bears his name. All temperature data before that is estimated using such methods as analyzing tree rings and ice cores. While the scientists who do that work are confident that they produce accurate data, there are all sorts of issues: what type of tree rings to use, the geographic distribution of the trees, and how to isolate temperature when other factors, such as the amount and timing of rainfall, also influence a tree’s growth. And ice cores come only from the polar regions and mountain glaciers, and so do not represent the entire world temperature pattern. I can accept basic trends from these systems, but I have trouble believing they produce temperature records accurate to better than ±2 or 3 degrees.
As for the thermometer, the early ones were crude: tubes of mercury or colored alcohol stapled onto wooden backboards with degree markers printed on them. How accurate a temperature reading can you get from looking at something like that? I conclude that you should not regard those readings as accurate to within a tenth of a degree. Actually, I am skeptical of even ±1 degree.
In the early days of the Weather Bureau here in the United States, the thermometers were housed in little wooden, louvered boxes on legs that held them five feet above the surface. A study by Watts several years ago showed that whether the boxes were painted white or whitewashed actually had a measurable impact on the temperatures recorded by the thermometers inside.
For many years some of these measurements were made on the rooftops of buildings in the middle of cities, while others were in rural pastures. Very often the temperatures were so affected by the siting of the observation point that they had little to do with the average temperature of the surrounding region. This led to studies, again a major one under the direction of Mr. Watts, showing that the siting of thermometers is a major issue in producing reliable temperature readings. His volunteer field observers, and Anthony himself, visited 1,007 of the 1,221 observation stations in the United States. They found some located near asphalt parking lots and others near air conditioner exhausts. A trash-burning barrel stood just a few feet from another official thermometer. So when you look at historical temperature records, it is difficult to know the accuracy of the readings and how representative they are of the area.
For decades the number of thermometers was rather limited. Weather observation points followed the migration of the population. For years, the thermometers used to produce official, written logs of long-term readings were mostly in cities and towns. As aviation developed, many official weather observation points moved to airports, which were mostly on the edge of cities or even out in the nearby countryside. All of this affected the temperature readings. Now most major airports are surrounded by extensive business centers, with the heat sink of the expanding runways and the exhaust of departing jet aircraft affecting the air temperatures as well.
As the United States Weather Bureau was established and evolved, standards for making observations were adopted and the data became more reliable. Eventually the National Climatic Data Center (NCDC) was established. Each official observation center then sent a monthly report of temperatures and other weather data from its official weather observation station, on a paper form, to the NCDC. In the 1970s this data was entered into computers for the first time. The mailed monthly paper reports continued for decades, however, and on some level continue today.
As the sites changed and the system of data collection became organized, the thermometers themselves were evolving. Mercury thermometers were replaced with thermocouple thermometers at some weather observation stations in the mid-1900s. Eventually platinum resistance thermometers (PRTs), which infer temperature from the changing electrical resistance of a platinum wire, replaced thermocouples. And at many locations those louvered instrument sheds were replaced by metal tubes with fans to circulate the air over the thermometer inside.
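For the curious, the standard industrial relation between a platinum element’s resistance and its temperature (the IEC 60751 Callendar-Van Dusen equation, valid above 0 °C) can be sketched as follows. This is a textbook formula, not a description of any particular weather station’s electronics:

```python
import math

# IEC 60751 Callendar-Van Dusen coefficients for industrial platinum
# (valid for T >= 0 C); R0 is the resistance at 0 C (100 ohm for a PT100).
A = 3.9083e-3
B = -5.775e-7
R0 = 100.0

def pt100_resistance(t_c):
    """Resistance (ohms) of a PT100 element at t_c degrees Celsius."""
    return R0 * (1.0 + A * t_c + B * t_c ** 2)

def pt100_temperature(r_ohm):
    """Invert the quadratic to recover temperature from resistance."""
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r_ohm / R0))) / (2.0 * B)

r = pt100_resistance(25.0)              # ~109.7 ohm at 25 C
print(round(r, 3), round(pt100_temperature(r), 3))
```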
Meanwhile, inexpensive battery-powered automated weather stations began to spring up at homes, schools and businesses everywhere. When the home computer and the internet swept the world, these automated stations hit the world wide web. Soon there were tens of thousands of them. All sorts of websites now display these readings. Eventually even the successor to the Weather Bureau, the National Weather Service (NWS), began to link to them, plot them on maps and redistribute them on official NWS websites. Then came the smartphones, whose widgets picked up these automated internet stations. The stations were eventually organized into an official NOAA/NWS group called MADIS (Meteorological Assimilation Data Ingest System). However, these observations are not saved or used in any official climate database.
So while hundreds of thousands of temperature records are now available worldwide, the NCDC continues to use only a network of government weather stations. And in recent years it has made extensive data adjustments. In one of these, it altered the older data, supposedly to match the current data produced by the newer instruments. This change cools the temperatures of previous decades, which has the net effect of making more recent temperatures comparatively warmer and therefore increases the apparent long-term warming.
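The arithmetic effect of such an adjustment is easy to demonstrate with a toy series. The numbers below are invented and are not NCDC’s actual adjustments; they only show that cooling the early part of a record steepens the fitted trend:

```python
# Toy demonstration: subtracting a small offset from the early decades
# of a series steepens its fitted long-term trend. All numbers invented.
def slope(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

years = list(range(1900, 2000))
raw = [14.0 + 0.005 * (y - 1900) for y in years]           # 0.5 C/century
adjusted = [t - 0.3 if y < 1950 else t for y, t in zip(years, raw)]

print(round(slope(years, raw) * 100, 2))       # C per century, raw
print(round(slope(years, adjusted) * 100, 2))  # C per century, adjusted
```

With these invented values, cooling the pre-1950 data by 0.3 degrees nearly doubles the fitted century-scale trend even though no recent reading changed.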

I don’t doubt that there has been a general slow increase in atmosphere temperatures. You must understand this is a natural warming trend, a natural result of the continuing interglacial period that began with the melting of the great ice sheets 12,000 years ago. This warming trend has nothing to do with mankind’s use of fossil fuels.
In the 1980s, NCDC and other agencies that have sprung up as offshoots of NASA (the National Aeronautics and Space Administration) began to greatly reduce the number of reporting points used in their temperature calculations. So even as tens of thousands of new thermometers came online, these government agencies were greatly reducing the number of observations used in their worldwide temperature calculations. The number used dropped from 6,000 in the 1970s to just 1,500 a decade later. And the new system averaged these temperatures to produce readings at spaced grid points. As these changes took effect, the average annual temperature produced by their systems rose quickly as compared to their own temperature charts of previous years. NCDC and supporting researchers have presented numerous scientific papers to justify these changes. Others have produced papers, reports and charts to strongly refute the government agencies’ claims.
Essentially, much of the distortion may come from the selected observation points not being representative of the average for the grid box in which they are located. The problem of using relatively few temperatures to represent a region gets very complex where coasts, hills and valleys, and rural and urban areas mix. As an example, suppose a grid box covers west-central Colorado and the observation at Grand Junction is used to represent the area; the mountainous terrain that surrounds Grand Junction averages 10 degrees cooler. Or think about San Diego County, California, where the weather zones range from cool Pacific coastal 60s, to 80s in inland valleys, to 40s to 90s in mountains ranging up to 7,000 feet high (mountain temperatures vary enormously with the seasons), to a below-sea-level desert where temperatures are regularly over 100. How do you pick one or two temperatures to represent this county? Those who have put together and use these grids based on a relatively small set of observations have done studies that, they say, prove it all averages out, and that adding more points to the observation network has little impact on the final averages. I might consider their argument, but I am totally sure they are out of bounds when they start to base claims of record warmth or record increases on differences of fractions of a degree, or even a degree or two.
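A toy calculation makes the sensitivity plain. The four station readings below are invented, loosely echoing the San Diego County example; the point is how far the box average moves depending on which stations are chosen:

```python
# Invented readings for one grid box with very different micro-climates,
# loosely inspired by the San Diego County example above (deg F).
stations = {
    "coast": 68.0,
    "valley": 84.0,
    "mountain": 52.0,
    "desert": 104.0,
}

def box_average(readings, chosen):
    """Average only the chosen stations, as a sparse network must."""
    return sum(readings[name] for name in chosen) / len(chosen)

full = box_average(stations, list(stations))
coast_only = box_average(stations, ["coast"])
print(full, coast_only)   # 77.0 vs 68.0: a nine-degree swing
```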
In this discussion so far, I have looked at the temperature issues mostly from the viewpoint of my home country, the United States. But the matter of temperatures is global, and elsewhere around the planet the problems of coming up with reliable data explode. The United States constitutes less than 4% of the surface of the Earth. The oceans, on the other hand, cover 71% of the Earth’s surface. Deserts, unpopulated mountain ranges, tropical forests and the polar regions cover significant parts of the planet. Historical and even current temperature measurements from these regions remain skimpy to non-existent.
Take the case of the oceans. For hundreds of years, we depended on freighters and passenger ships to lift a bucket of water from the ocean along their routes, measure its temperature, and make reports when they reached port. The buckets were eventually replaced by sensors in the ships’ engine intake valves, a method that sampled water at a different depth; ship intake readings ran slightly warmer than those from buoys. Later, buoys placed at intervals offshore radioed in temperatures. In recent years, sophisticated buoys that communicate via satellite have been deployed worldwide. So far the data from them does not show any significant warming of the ocean water, and it is being discounted by global warming advocates. In any case, there is no long-term, usable ocean temperature record from which to produce a meaningful long-term database.
For more than three decades there have been satellite measurements of atmospheric temperatures on a global basis. This system is building a new record that, in a hundred years, will make a significant database for measuring climatic change. For now, however, a few decades is too short a period to be hugely significant. What the satellite data does show, for now, is a rather steady, gradual increase in global temperatures, in line with the long-term increase over the last 12,000 years. It does not support the dramatic increases predicted by the global warming advocates’ models.
My friends who are climate change skeptics would say the lack of a dramatic rise is the final “nail in the coffin” for global warming alarmists. I have to hold back from reaching that conclusion; I simply hold to my skeptical position. I don’t believe that carbon dioxide is a major greenhouse gas, that our use of fossil fuels is creating a climate crisis now, or that it will do so in the future. I don’t see any evidence of a CO2 “tipping point” at which temperatures will go out of control. In fact, there is evidence that CO2 has been much higher in the distant past.
I also point out that CO2 is fertilizing our crops producing a food supply significantly greater than it would be if we were not adding carbon dioxide to the atmosphere.
I conclude the temperature data does not prove global warming. The alarmists are wrong. But the temperature data is so unreliable and garbled that neither alarmists nor skeptics can use it to conclusively prove they are right.
Sparks says:
March 3, 2013 at 9:10 pm
Leif, are you watching the southern hemisphere of the sun?

lsvalgaard says:
March 3, 2013 at 9:44 pm
Sparks says:
March 3, 2013 at 9:10 pm
Leif, are you watching the southern hemisphere of the sun?
==============
Our motto is “all the Sun; all the time”. So, yes.

u.k.(us) says:
March 3, 2013 at 9:50 pm
Please define “our”, Leif.

Sparks says:
March 3, 2013 at 10:03 pm
Is it “sole omni tempore” or “Solis omni tempore”?

lsvalgaard says:
March 3, 2013 at 10:16 pm
“Please define ‘our’, Leif.”
You may know that the department where I work at Stanford is a partner in the Solar Dynamics Observatory satellite [if not: http://solar-center.stanford.edu/sdo/ ]. That is ‘our’.
“Is it ‘sole omni tempore’ or ‘Solis omni tempore’?”
I think that just ‘Sol’ will do.

“You may know that the department where I work at Stanford is a partner in the Solar Dynamics Observatory satellite [if not: http://solar-center.stanford.edu/sdo/ ]. That is ‘our’.”
==========
Nice one, Leif.
Walked right in to it.
Checking for intelligent life, and found it.
Steven Mosher says:
March 3, 2013 at 12:47 pm
“Global warming is about a predicted dramatic increase in the temperature year after year leading to the melting of the polar ice caps resulting in a dramatic rise in ocean water levels producing coastal flooding.”
1. There isn’t a single model that predicts year-after-year warming. In AR4, around 20 models were used. They produced about 60 runs total. If you look at the individual runs you will see that in some years temperatures go up and in some years they go down.
///////////////////////////////////////////////////////////////////////////
Steven
Looking at each of the 60 runs of the 20 models: which models correctly predicted a period of falling temperature and/or a stasis in the temperature rise? And what component in each such model caused it to correctly predict the fall, or the stasis, as the case may be? That is, what did the model take into account that correctly overpowered the warming that would otherwise have occurred as a consequence of the continued rise in CO2?
Identifying this component (e.g., aerosols, ENSO, more cloudiness, less TSI, etc.) and testing it against empirical observation would help us learn a lot more about climate, what drives it and how it works, and, of course, about the correctness and usefulness of the models themselves.
We need to make sure that these models have not predicted periods of no warming simply because of some randomness programmed into them, and we need to see whether they are ‘predicting’ on the basis of correct components, or whether it is just a fluke coincidence of GIGO.
I would like to see an audit of each of these model runs.
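The point that individual runs wander year to year even under a steady forced trend can be illustrated with a toy simulation. This is not a climate model, just trend plus noise, with every parameter invented:

```python
import random

random.seed(42)

# Toy stand-in for an ensemble: each "run" is a 0.02 C/yr trend plus
# random year-to-year noise. Purely illustrative: a warming trend does
# not imply that every year is warmer than the one before it.
def toy_run(n_years=30, trend=0.02, noise=0.15):
    return [trend * t + random.gauss(0.0, noise) for t in range(n_years)]

runs = [toy_run() for _ in range(60)]

# Count year-over-year *declines* inside the individual runs.
declines = sum(1 for run in runs
               for a, b in zip(run, run[1:]) if b < a)

# Average rise from first to last year across the whole ensemble.
ensemble_mean_rise = (sum(r[-1] for r in runs)
                      - sum(r[0] for r in runs)) / len(runs)

print(declines > 0, ensemble_mean_rise > 0)
```

Individual runs show many down years, yet the ensemble as a whole still rises; that is the distinction between a single realization and a forced trend.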
Thanks, John.
The GHG theory is a total failure. It violates the laws of thermodynamics; surface-radiated energy is within the emission spectra of the so-called GHGs, so it could not have any effect on already energy-saturated molecules. This theory was dreamt up to cover poor thinking about the Earth’s energy budget. The model used for this poor thinking is a flat Earth with no night or day, just cold sunlight. This model is nothing like reality, but it introduces the GHG theory to make up a supposed energy shortfall. There is no energy shortfall; the sun is hot enough.
My emphasis,
“Mankind began to burn coal extensively in around 1770 and to use oil extensively starting in the 1880s. The use of fossil fuels greatly increased between 1945 and 2000. A measure of atmospheric CO2 known as the Keeling Curve (named for the scientist who set up the measuring system) shows a steady rise in atmospheric CO2 as a result. (Despite the increase, the total of CO2 remains much less than 1% of the atmosphere.)”
This is a bit ambiguous to my mind. It implies that the burning of so-called fossil fuels caused the whole of the rise in CO2 levels. Was this intended?
SteveT
You can see the problem when a skeptic makes a small error that doesn’t have much to do with an essay. Immediately, the small error gets attacked, and then the claim is made that the entire essay is wrong. You can see this with Mr. Coleman’s reference to warming since the last glaciation. I suspect he meant the warming since the Little Ice Age over the last 300 years. Replacing the error with that factual warming does nothing to change the rest of the essay. And it has little to do with temperature data, questionable adjustments and the lack of recent warming. It does point out the mindset of alarmists. They are in full panic mode.
It should be corrected, but it’s kind of humorous to watch the mental gyrations of people like James Abbott.
The entire argument that “this year was the warmest ever” is purely foolish. Any scientist who makes this claim and asserts it means something needs to go back to STAT 101 before they are qualified to post nonsense like that. I cannot believe how seemingly intelligent people can repeat that claim and not realize it is meaningless. In statistics, there is a tool to test whether data is increasing or decreasing, and that tool is linear regression, as used in the post above. Any argument about warming or lack of warming based on “records” is just obfuscation. To think that NOAA itself represents itself with this STAT 101 error is almost pure comedy, and then we remember that this is our tax dollars at work, paying big bucks for obfuscation. What nonsense!
What we truly need is honest scientists at NOAA, NASA, etc., who are willing to publish the truth no matter what it says about their pet theories and their pet computer models. How in the world can we truly understand the climate if the “most authoritative source” the US has is making such simple statistical errors and misleading people?
It’s truly horrendous when you think about it.
And then it’s even worse when you look at the data we have. I don’t think it’s truly necessary to show how bad the temperature data is to prove anything. Just look at the general slope of the data and ask yourself this:
We have been warming since we came out of the Little Ice Age, and it appears we have stopped warming over the last 15 years. So wouldn’t you expect to see record highs in the data over those recent 15 years? And to think people at NOAA are surprised to find these record highs and then repeat it like it means something….
That is like NOAA going to the top of Mount Everest and yelling to the world: “Hey, this is the highest point on the planet.” Well, duh, everyone knows that. Thank you, Captain Obvious (err, NOAA)… we truly learned nothing today except that you are clueless.
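The commenter’s expectation is easy to check on a synthetic series: in data that rises and then goes flat, the all-time records still cluster in the recent flat period. All numbers here are invented:

```python
# Synthetic series: steady rise for 85 "years", then flat for 15.
# Even with no warming in the final 15 years, the all-time record
# highs all sit at the end of the rise and in the recent plateau.
rising = [10.0 + 0.02 * t for t in range(85)]   # years 0-84
plateau = [rising[-1]] * 15                     # years 85-99, no warming
series = rising + plateau

record_years = [i for i, v in enumerate(series) if v == max(series)]
print(record_years[0], len(record_years))       # records only at the end
```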
A note about Greenland in the Eemian. Listen up, Mosher.
Why did Mr. Coleman leave out 2012 in his analysis of the last 15 years of annual data? That’s an egregious error.
How dumb does he think people are?
Come on….
Eric Huxter says:
NASA data suggests that changes in non-CO2 greenhouse gases are having little effect on their calculations of overall climate forcings.
http://www.esrl.noaa.gov/gmd/aggi/
Which apparently are still rising inexorably. Strange they are not having continued impacts on global temperatures.
————————————————-
The data doesn’t say that at all — they say 30% of the increase in noncondensing GHG radiative forcing since 1979 is due to gases other than CO2.
There was a well known flattening in average global methane levels over the last few years of the 1990s and the first half of the 2000s, perhaps due to industry changes in Russia. That flattening ended around 2008, and atmospheric methane levels are increasing again.
Phobos says:
March 4, 2013 at 8:00 am
Why did Mr. Coleman leave out 2012 in his analysis of the last 15 years of annual data?? That’s an egregious error.
Perhaps 2012 was too hot…
Mike Borgelt says:
March 3, 2013 at 3:32 pm
Bad article. The Holocene optimum was warmer and there has been gradual cooling since then.
The issue isn’t whether CO2 is a greenhouse gas; it is whether adding more at current levels will make any measurable or noticeable difference.
You see this is the part I have a problem with.
How can a gas (CO2) that is less than 1% of the atmosphere release enough energy at just the right quality and quantity to keep the other 99% from cooling?
When a parcel of air with a volume of one cubic mile cools, how much energy is released? How much of the energy released from that parcel comes from CO2, and how much is available to re-warm the parcel?
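The first part of that question can at least be bounded with textbook constants (sea-level air density about 1.225 kg/m³, specific heat at constant pressure about 1005 J/(kg·K)). This says nothing about CO2’s share; it only bounds the total energy involved:

```python
# Rough bound on the energy released when one cubic mile of sea-level
# air cools by 1 C, using textbook constants (illustrative only).
MILE_M = 1609.344                 # metres per statute mile
RHO = 1.225                       # kg/m^3, sea-level air density
CP = 1005.0                       # J/(kg*K), specific heat of air (const. pressure)

volume_m3 = MILE_M ** 3           # ~4.17e9 m^3
mass_kg = RHO * volume_m3         # ~5.1e9 kg
joules_per_kelvin = mass_kg * CP  # ~5.1e12 J per degree of cooling

print(f"{joules_per_kelvin:.2e} J per K")
```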
Box of Rocks says: “How can a gas (CO2) that is less than 1% of the atmosphere release enough energy at just the right quality and quantity to keep the other 99% from cooling?”
1. Do you accept that the Earth’s surface emits infrared radiation upward?
2. Do you accept that CO2 absorbs infrared radiation at certain frequencies?
3. Do you accept that CO2 re-emits infrared radiation at certain frequencies?
If so, then it’s just a matter of calculating how much heat is effectively trapped by CO2. Scientists have been perfecting that calculation for over 100 years. It’s a difficult calculation because the absorption spectrum of CO2 is so complicated, and pressure and temperature influence the absorption and reemission, but the calculation is doable by numerical methods.
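To give a flavor of those numerical methods, here is a toy single-frequency Beer-Lambert integration through a layered atmosphere. The absorption coefficient is invented, and real line-by-line codes sum over thousands of pressure- and temperature-dependent spectral lines; this only illustrates the layer-by-layer bookkeeping:

```python
import math

# Toy Beer-Lambert transmission at a single frequency through a layered
# atmosphere with exponentially decaying absorber density. The value of
# K is made up; this is a sketch of the method, not real spectroscopy.
K = 2.0e-4          # surface absorption coefficient per metre (invented)
SCALE_H = 8000.0    # density scale height, metres
TOP = 40000.0       # top of the toy atmosphere, metres
N_LAYERS = 4000

def transmission():
    """Fraction of surface radiation reaching the top, midpoint rule."""
    dz = TOP / N_LAYERS
    tau = 0.0
    for i in range(N_LAYERS):
        z = (i + 0.5) * dz                       # mid-point of the layer
        tau += K * math.exp(-z / SCALE_H) * dz   # optical depth of layer
    return math.exp(-tau)                        # Beer-Lambert law

print(round(transmission(), 4))
```

The sum converges to the analytic optical depth K·H·(1 − e^(−TOP/H)); refining the layers is exactly the kind of numerical bookkeeping the comment refers to.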
Phobos says:
“1. Do you accept that the Earth’s surface emits infrared radiation upward?”
Outgoing longwave radiation has not changed much, either up or down, in decades. Therefore, your argument fails.
lsvalgaard says:
March 4, 2013 at 8:27 am
Perhaps 2012 was too hot…
Not in England it wasn’t 😉
@DB Stealey: Here we are discussing the basics of the greenhouse effect and how to calculate it, not whether it is increasing. Try to keep up.
Phobos,
I don’t blame you for being miffed that I debunked your 10:03 am post. ☺
Mr Green Genes says:
March 4, 2013 at 10:18 am
“Perhaps 2012 was too hot…”
Not in England it wasn’t 😉
Coleman was talking about the US.
Thanks, Mr. Coleman.
A very interesting article, a good historic account, and a correct scientific approach to a most intractable problem.
Phobos says:
March 4, 2013 at 10:03 am
Box of Rocks says: “How can a gas (CO2) that is less than 1% of the atmosphere release enough energy at just the right quality and quantity to keep the other 99% from cooling?”
1. Do you accept that the Earth’s surface emits infrared radiation upward?
2. Do you accept that CO2 absorbs infrared radiation at certain frequencies?
3. Do you accept that CO2 re-emits infrared radiation at certain frequencies?
If so, then it’s just a matter of calculating how much heat is effectively trapped by CO2. Scientists have been perfecting that calculation for over 100 years. It’s a difficult calculation because the absorption spectrum of CO2 is so complicated, and pressure and temperature influence the absorption and reemission, but the calculation is doable by numerical methods.
You left out these items:
4. Do you accept that CO2 is thermally activated by collisions with other particles?
5. Do you accept that this energy can then be radiated to space?
If so, then it’s just a matter of calculating how much heat is effectively released from the atmosphere by CO2. Maybe you should try it. Just possibly you’ll then understand why Stealey’s chart shows no change. The warming effect of items 1-3 is countered by the cooling effect of items 4-5.