CO2 did not drive the rapid warming of the 20th century.
Story submitted by Stan Robertson
The difference between a good idea and a bad idea is often a quantitative matter. For example, many people would think it a good idea to replace internal combustion engines with electric motors. But if the intent is to reduce the burning of fossil fuels, then switching to electric motors would not help unless the electricity were generated without burning fossil fuels. Some people think that it has been a good idea to use corn to produce ethanol for fuel; I am not one of them, because the energy return on investment is either negative or minuscule at best. From the standpoint of greenhouse gas emissions, it is a horrendous loser. Ethanol may be a biofuel, may burn more cleanly, and might help ameliorate ozone problems, but considering that nearly a gallon of oil is consumed in addition to the gallon of ethanol produced and burned, it is a quantitative loser. (Not that I care at all about the CO2.)
One of the ideas that seems to be widely believed is that human-produced greenhouse gases, chiefly CO2, have dominated the warming of the earth in the last century. It is a simple quantitative matter to show that this is completely false.
According to the calculations of the UN IPCC, a doubling of the atmospheric concentration of CO2 (with an accompanying rise of other greenhouse gases) would reduce the outgoing infrared radiation from the earth by a net 2.7 watt/m^2 at the top of the atmosphere. This is known as the “climate forcing” that will occur along with a doubling of the CO2. It is a relatively straightforward, but messy, calculation. I have repeated the IPCC calculation for CO2 alone and obtained a larger number, but after including the IPCC adjustments for other greenhouse gases and the effects of sulfate aerosols accompanying coal burning, we agree. It is important to note that the surface temperature increase accompanying CO2 is proportional to the logarithm of the CO2 concentration. Thus, while the CO2 concentration increases exponentially with time, the temperature increases only linearly.
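This logarithmic dependence is easy to verify numerically. Here is a minimal Python sketch, using the article's 2.7 watt/m^2 per doubling and the pre-industrial baseline of 280 ppm quoted later in the article:

```python
import math

F_DOUBLING = 2.7  # net forcing per CO2 doubling, W/m^2 (the article's figure)
C0 = 280.0        # pre-industrial CO2 concentration, ppm

def forcing(c_ppm):
    """Net climate forcing in W/m^2 relative to the pre-industrial concentration."""
    return F_DOUBLING * math.log(c_ppm / C0) / math.log(2)

# A concentration rising exponentially gives a forcing rising only linearly:
# half a doubling (280 * sqrt(2) ppm) yields exactly half the forcing.
half_doubling = forcing(280 * math.sqrt(2))   # 1.35 W/m^2
full_doubling = forcing(560)                  # 2.7 W/m^2
```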
In order to maintain equilibrium with the incoming UV/VIS radiation received by the earth, the surface temperature would need to increase enough to allow it to radiate an additional 2.7 watt/m^2 at the top of the atmosphere after any CO2 doubling. At a nominal surface temperature of 15 C (288 K), the earth surface radiates about 390 watt/m^2 on average, but the radiation that exits the top of the atmosphere is only 240 watt/m^2. Thus the earth would need to produce an additional (390/240)x2.7 watt/m^2 = 4.4 watt/m^2 at the surface in order to offset the direct effect of doubling the atmospheric CO2. At 288 K, the earth radiates an additional 5.4 watt/m^2 per 1C of temperature rise. Thus the direct effect temperature increase of a CO2 doubling would be 4.4/5.4=0.8 C.
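The numbers in this paragraph follow directly from the Stefan-Boltzmann law; a short Python check of the arithmetic, using the constants given in the text:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
T = 288.0         # nominal surface temperature, K (15 C)

surface_flux = SIGMA * T**4              # ~390 W/m^2 radiated at the surface
top_flux = 240.0                         # outgoing flux at top of atmosphere, W/m^2
surface_forcing = (surface_flux / top_flux) * 2.7   # ~4.4 W/m^2 needed at the surface
sensitivity = 4 * SIGMA * T**3           # d(flux)/dT at 288 K, ~5.4 W/m^2 per C
delta_T = surface_forcing / sensitivity  # ~0.8 C direct-effect warming per doubling
```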
At the present 0.5% per year rate of increase of CO2, it will take about 140 years to double its concentration. But as we all know, a 0.8 C temperature increase over 140 years is not the result that the UN IPCC is alarmed about. The IPCC climate models include large positive feedback effects that raise their expected temperature increase into the range of 2 to 4.5 C, with the most probable value at about 3 C.
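The 140-year figure follows from compound growth; a one-line check in Python:

```python
import math

growth = 0.005  # 0.5% per year increase in CO2 concentration
t_double = math.log(2) / math.log(1 + growth)  # ~139 years to double
```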
There are four main arguments against this: (1) we have already experienced half of a 2.7 watt/m^2 climate forcing since pre-industrial times, yet it has been accompanied by only a 0.8 C temperature increase, and as shown below there are reasons for believing even this to be due primarily to natural causes; (2) there is no evidence confirming the existence of any large feedback effects since the end of the last deglaciation; (3) the rate of temperature increase within the past century has been within the bounds of normal climate variability; and (4) as shown below, the heating effect of CO2 has been quantitatively inadequate to explain the actual warming that occurred in the last century.
There have been two periods of rapid warming that account for most of the warming that occurred in the last century, as shown below.
Let’s examine the earlier of these rapid warming periods. By 1944, the atmospheric CO2 concentration had increased from the pre-industrial level of about 280 ppm up to 310 ppm. At that time the concentration was increasing at a rate that would require about 600 years to double. The fraction of a doubling that had occurred by 1944 would have been log(310/280)/log(2) = 0.15, so the forcing would have been accumulating at a rate of 0.15 x 2.7 watt/m^2 per 60 decades, or 0.0068 watt/m^2 per decade. Its direct warming effect at the surface would thus be only (390/240) x (0.0068 watt/m^2 per decade) = 0.01 watt/m^2 per decade. This would have raised the temperature by (0.01 watt/m^2 per decade) / (5.4 watt/m^2/C) = 0.002 C per decade. This is such a pitifully small fraction of the 0.174 C per decade rate of warming that actually occurred from 1917 to 1944 that it is pretty clear that CO2 had nothing to do with the warming of the first half of the last century. Even the IPCC climate modelers concede this point.
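The chain of arithmetic in this paragraph can be reproduced step by step; all the constants below are the article's own:

```python
import math

frac = math.log(310 / 280) / math.log(2)  # ~0.15 of a doubling reached by 1944
toa_rate = frac * 2.7 / 60                # forcing rate, W/m^2 per decade (600-yr doubling time)
surface_rate = (390 / 240) * toa_rate     # ~0.01 W/m^2 per decade at the surface
warming_rate = surface_rate / 5.4         # ~0.002 C per decade attributable to CO2
observed = 0.174                          # observed warming rate 1917-1944, C per decade
```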
But there is still more to be learned from that period. Apparently some natural phenomenon allowed the earth to absorb energy at a significant rate and produce the temperature increase of the first half of the century. Let’s see how large that rate might have been. To begin, the earth would have had to take in enough heat to at least produce the additional surface radiation that would accompany a temperature rise of 0.174 C per decade from 1917 to 1944. This would be (5.4 watt/m^2/C) x (0.174 C/decade) = 0.94 watt/m^2 per decade, already 94 times the CO2 heating rate.
But, in addition, as shown by both the ARGO buoy system and heat transfer calculations, at least 700 meters of the upper ocean can respond to heating on a time scale of a decade. The additional amount of heat required to raise its temperature by 0.174 C per decade would be c x d x 0.174 C = 5.2×10^8 joule/m^2, where c = 4.3×10^6 joule/m^3/C is the volumetric heat capacity of sea water and d = 700 m. Dividing by the number of seconds in 10 years, this corresponds to an average heating rate of 1.7 watt/m^2. But since the rate would start at zero, it would have to reach 3.4 watt/m^2 by the end of the decade in order to attain this average. Adding the 0.94 watt/m^2 of surface radiation losses reached by the end of the warming period, the total heating rate would have to ramp up by 4.3 watt/m^2 per decade to provide the warming that actually occurred in either of the rapid warming periods. This is 430 times the direct CO2 surface heating rate for 1917-1944.
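The ocean heat-uptake estimate can likewise be checked numerically, with the heat capacity and layer depth as given in the text:

```python
c_vol = 4.3e6     # volumetric heat capacity of sea water, J/m^3/C (article's value)
depth = 700.0     # depth of the responsive upper-ocean layer, m
dT = 0.174        # temperature rise per decade, C
seconds_per_decade = 10 * 365.25 * 24 * 3600

heat = c_vol * depth * dT             # ~5.2e8 J/m^2 per decade of warming
avg_flux = heat / seconds_per_decade  # ~1.7 W/m^2 averaged over the decade
end_flux = 2 * avg_flux               # ~3.4 W/m^2 at decade's end, if ramping from zero
total = end_flux + 5.4 * dT           # ~4.3 W/m^2 per decade including surface radiation losses
```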
Since essentially the same rate of temperature increase occurred from 1976 to 2000, we can compare that 4.3 watt/m^2 with the heating that might have been caused by CO2 in the last part of the century. From 1944 to 2000, the CO2 concentration increased from 310 ppm to 370 ppm, with a doubling time of about 140 years. The corresponding rate of climate forcing at the surface would be (390/240) x (log(370/310)/log(2)) x (2.7 watt/m^2) / 14 decades = 0.08 watt/m^2 per decade.
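And the corresponding check for the second half of the century, again using the article's constants:

```python
import math

frac = math.log(370 / 310) / math.log(2)      # fraction of a doubling, 1944-2000
surface_rate = (390 / 240) * frac * 2.7 / 14  # ~0.08 W/m^2 per decade at the surface
```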
Due to the higher rate of growth of CO2 concentration in the second half of the 20th century, this is eight times as large as the direct surface heating effect of CO2 in the first half. Nevertheless, it is still smaller by a factor of about 54 than the rate of heating that actually occurred.
These straightforward calculations make it painfully obvious that CO2 forcing is not what drove the two periods of rapid heating during the last century. Until the natural causes of these rapid warming periods are understood and included in the climate models, there is no reason to believe the models. This is simple first-year physics.
Stan Robertson, Ph.D., P.E., retired in 2004 after teaching physics at Southwestern Oklahoma State University for 14 years. In addition to teaching at three other universities over the years, he has maintained a consulting engineering practice for 30 years.