
Forecasting the Earth’s Temperature
by David Whitehouse via Benny Peiser’s CCnet
The recent spate of scientific papers attempting to predict what the earth’s temperature might be in the coming decades, and to explain the current global temperature standstill, is very interesting because of the methods used to analyse temperature variations, and because it illustrates the limitations of our knowledge.
Recall that only one or two annual data points ago many scientists, as well as the most vocal ‘campaigners,’ dismissed the very idea that the world’s average annual temperature had not changed in the past decade. Today it is an observational fact that can no longer be ignored. We should also not forget that nobody anticipated it. Now, post facto, scientists are looking for an explanation, and in doing so we are seeing AGW in a new light.
The main conclusion, and perhaps it’s no surprise, to be drawn about what will happen to global temperatures is that nobody knows.
The other conclusion to be drawn is that, without exception, the papers assume a constantly increasing AGW in line with the increase of CO2. This means that any forecast will ultimately lead to rising temperatures, as AGW is forever upward and natural variations have their limits. But there is another way of looking at the data. Instead of assuming an increasing AGW, why not look for evidence of it in the actual data? In other words, let the data have primacy over the theory.
Lean and Rind try to isolate and analyse the various factors that affect decadal changes in the temperature record: El Nino, volcanic aerosols, solar irradiance and AGW. Their formula that links these factors together into a time series is quite simple (indeed there is nothing complicated about any of the papers looking at future temperature trends), though in the actual research paper there is not enough information to follow through their calculations completely.
El Nino typically produces 0.2 deg C warming, volcanic aerosols 0.3 deg C cooling on short timescales, solar irradiance 0.1 deg C (I will come back to this figure in a subsequent post) and the IPCC estimate of AGW is 0.1 deg C per decade.
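To show what such a “simple formula” looks like in practice, here is a minimal sketch in Python, with invented indices, invented coefficients and synthetic data – it is emphatically not Lean and Rind’s actual regression – of decomposing an annual temperature series into ENSO, volcanic, solar and trend components by least squares:

```python
# Toy sketch of a Lean & Rind style decomposition (illustrative only).
# T(t) is modelled as a linear combination of an ENSO index, a volcanic
# aerosol index, a solar irradiance index and a linear "AGW" trend.
# All indices and coefficients below are invented for illustration.
import numpy as np

years = np.arange(1980, 2009)
n = len(years)

# Hypothetical annual-mean indices (in reality these come from observations).
enso = np.sin(2 * np.pi * (years - 1980) / 4.5)                  # stand-in ENSO index
volcanic = np.where((years > 1991) & (years < 1994), -1.0, 0.0)  # Pinatubo-like dip
solar = np.cos(2 * np.pi * (years - 1980) / 11.0)                # ~11-year solar cycle
trend = (years - years[0]) / 10.0                                # decades since 1980

# A synthetic "observed" anomaly built from the rough magnitudes quoted above:
# ~0.2 C ENSO, ~0.3 C volcanic cooling, ~0.1 C solar, ~0.1 C/decade AGW, plus noise.
rng = np.random.default_rng(0)
t_obs = (0.2 * enso + 0.3 * volcanic + 0.1 * solar + 0.1 * trend
         + 0.05 * rng.standard_normal(n))

# Ordinary least squares apportions the variance among the chosen factors.
X = np.column_stack([enso, volcanic, solar, trend, np.ones(n)])
coeffs, *_ = np.linalg.lstsq(X, t_obs, rcond=None)
for name, c in zip(["ENSO", "volcanic", "solar", "AGW/decade", "offset"], coeffs):
    print(f"{name:>10s}: {c:+.3f} deg C")
```

The point is only that, once the indices are chosen, apportioning the variance among them is elementary; the substantive assumptions are the choice of indices and, as noted above, the treatment of AGW as a steadily rising trend.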
It should also be noted that natural forces are able to produce a 0.5 deg C increase, although over a longer period. The 0.5 deg C warming observed between, say, 1850 and 1940 is not due to AGW.
The temperature increase since 1980, approx 0.4 deg C, is in fact smaller than the rise seen between 1850 and 1940. It took place in less than two decades and was followed by the current standstill. A fact often overlooked is that this recent temperature increase was much greater than that due to the postulated AGW effect (0.1 deg C per decade); it must therefore have included natural increases of a greater magnitude.
This is curious. If the recent temperature standstill, 2002-2008, is due to natural factors counteracting AGW, and AGW was only a minor component of the 1980-1998 temperature rise, then one could just as logically take the view that the 1980-1998 increase was due to a conspiracy of natural factors forcing the temperature up, rather than natural factors keeping the temperature down post-2002. One cannot have one rule for the period 2002-2008 and another for 1980-1998!
Lean and Rind estimate that 73% of the temperature variability observed in recent decades is natural. However, looking at the observed range of natural variations, and their uncertainties, one could make a case that the AGW component, which has only possibly shown itself between 1980 and 1998, is not a required part of the dataset. Indeed, if one did not have in the back of one’s mind the rising CO2 concentration and the physics of the greenhouse effect, one could make out a good case for reproducing the post-1980 temperature dataset with no AGW at all!
Natural variations dominate any supposed AGW component over timescales of 3-4 decades. If that is so, then how should we regard 18 years of warming and decades of standstill or cooling in an AGW context? At what point do we question the hypothesis of CO2-induced warming?
Lean and Rind (2009) look at the various factors known to cause variability in the earth’s temperature over decadal timescales. They come to the conclusion that between 2009-14 global temperatures will rise quickly by 0.15 deg C – faster than the 0.1 deg C per decade deduced as AGW by the IPCC. Then, in the period 2014-19, there will be only a 0.03 deg C increase. They believe this will be chiefly because of the effect of solar irradiance changes over the solar cycle. Lean and Rind see the 2014-19 period as being similar to the 2002-08 temperature standstill, which they say has been caused by a decline in solar irradiance counteracting AGW.
This should cause some of the more strident commentators to reflect. Many papers have been published dismissing the sun as a significant factor in recent warming. The gist of them is that solar effects dominated up to 1950, but have recently been swamped by AGW. Now, however, we see that the previously dismissed tiny solar effect is able to hold AGW in check for well over a decade – in fact forcing a temperature standstill of duration comparable to the recent warming spell.
At least the predictions from the various papers are testable. Lean and Rind (2009) predict rapid warming. Looking at the other forecasts for near-future temperature changes we have Smith et al (2007) predicting warming, and Keenlyside et al (2008) predicting cooling.
At this point I am reminded that James Hansen ‘raised the alarm’ about global warming in 1988 when he had less than a decade of noisy global warming data on which to base his concern. The amount of warming he observed between 1980 and 1988 was far smaller than known natural variations and far larger than the IPCC would go on to say was due to AGW during that period. So whatever the eventual outcome of the AGW debate, logically Hansen had no scientific case.
There are considerable uncertainties in our understanding of natural factors that affect the earth’s temperature record. Given the IPCC’s estimate of the strength of the postulated AGW warming, it is clear that those uncertainties are larger than the AGW effect that may have been observed.
References:
Lean and Rind (2009), Geophys. Res. Lett., 36, L15708.
Smith et al. (2007), Science, 317, 796-799.
Keenlyside et al. (2008), Nature, 453, 84-88.
George E Smith.
I’m not so sure your complaint about the sampling density is on track.
http://www.met.tamu.edu/class/atmo632/(77)ShenKN94.pdf
In short we may get a better characterization of the global temp by focusing on site quality rather than site quantity. hence the importance of Anthony’s work.
Ron de Haan,
“I don’t know what the future will bring but if I am allowed to gamble, I say we will continue to cool.”
Of course you are allowed to gamble. I have a standing bet with a local “sceptic” which is based precisely on the premise that the world has been cooling since last year. The conditions of the bet are thus (a rough sketch of the settlement arithmetic follows the terms):
1. This annual bet concerns the movement in measurements of the troposphere temperature, cumulatively averaged for each calendar year from 2009 onwards compared to a baseline of the average for calendar year 2008.
2. The bet is payable each year by the loser to the winner when the last month’s data from a calendar year first becomes available and the cumulative multiyear average can be compared to the baseline.
3. In the event of a tie (!) the bet is not paid for that year.
4. The bet for each year will be £100 multiplied by the square root of the number of years of data i.e. £100 for 2009, £141 for the average of 2009 and 2010, etc.
5. The dataset used will be the latest version of the lower tropospheric global time series released with the data for the last month of the year as published by the University of Alabama, Huntsville, known as UAH TLT and currently in file tltglhman_5.2.
6. Either party can withdraw from the bet at any time without any financial penalty.
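For concreteness, here is a rough sketch of how the settlement under terms 1-4 would be worked out each year. The anomaly numbers below are placeholders, not real UAH TLT values:

```python
# Rough sketch of the bet settlement (terms 1-4 above).
# Anomaly values are placeholders; the real numbers would come from the
# UAH TLT annual means.
import math

baseline_2008 = -0.05                                   # hypothetical 2008 annual mean
annual_means = {2009: 0.10, 2010: 0.20, 2011: 0.05}     # hypothetical later years

values_so_far = []
for year in sorted(annual_means):
    values_so_far.append(annual_means[year])
    cumulative_avg = sum(values_so_far) / len(values_so_far)   # term 1
    stake = 100 * math.sqrt(len(values_so_far))                # term 4: 100 * sqrt(n)
    if cumulative_avg > baseline_2008:
        winner = "warming side wins"
    elif cumulative_avg < baseline_2008:
        winner = "cooling side wins"
    else:
        winner = "tie - no payment"                            # term 3
    print(f"{year}: cumulative avg {cumulative_avg:+.3f} vs baseline "
          f"{baseline_2008:+.3f} -> {winner}, stake £{stake:.0f}")
```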
I’m willing to bet, on these terms, against your claim that we are cooling. Are you on (in a currency multiple of your choice)?
Or, from Mr. Potatoe,
“The future will be better tomorrow”.
I’ll go you one better. There is no “Global Mean Temperature”. It’s an artificial construct that is completely meaningless.
Perhaps Anthony can explain the process whereby the average TV meteorologist gets the data to make the forecast. The way it was explained to me, it all comes from model output generated somewhere, but not by the meteorologist locally. Is this the norm?
My sense is that much of today’s climate forecasting is more akin to fortune-telling than science. Here an astrometeorologist, Theodore White, shares his climate forecast:
http://forums.accuweather.com/index.php?showtopic=13603
Some of his forecast may turn out to be right and some will almost certainly turn out to be wrong, but it all comes across to me as guesswork.
I am sensing that there is somebody in your family who has been sick, someone who is older, and you are hoping that they will get better. This person will get through what ails them, but hold them close, because soon their time will come. I am also sensing that you have some money woes. Be frugal and be wise, but don’t fret, because I am seeing that you will soon come into some unexpected money. I am also sensing something with the climate. Yes, it feels like it is going to change. It will get colder and then it will get warmer, and there will be some stormy weather…
“”” steven mosher (13:16:50) :
George E Smith.
I’m not so sure your complaint about the sampling density is on track.
http://www.met.tamu.edu/class/atmo632/(77)ShenKN94.pdf
In short we may get a better characterization of the global temp by focusing on site quality rather than site quantity. hence the importance of Anthony’s work. “””
I didn’t see a single lecture there about the Nyquist Sampling Theorem, or sampled data systems theory either.
Modern communications systems simply would not work if the Nyquist sampling theorem were not valid, and neither will climate sampled-data gathering systems. The other thing Nyquist teaches us is that we only need to undersample by a factor of two to get to where even the function average is itself corrupted by aliasing noise. And in fact we know that even the daily max/min temperature samples taken by the global station network thermometers don’t meet the Nyquist condition for the temporal sampling; so even at a single measurement site the data are not sufficient to obtain noise-free information for that single site, and we don’t even need to consider the orders-of-magnitude spatial undersampling over the whole globe.
The aliasing noise injected by failure to observe proper sampling protocol is in-band noise, so it is permanently unremovable without throwing away good signal as well. No amount of mathematical regression, running averaging, or any other statistical trickery can correct for aliasing noise.
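To make the max/min point concrete, here is a toy sketch – an invented diurnal curve, not real station data – of the bias that two-samples-per-day recording produces when the diurnal cycle contains a harmonic above the corresponding Nyquist frequency of one cycle per day:

```python
# Toy illustration of undersampling a diurnal temperature curve.
# Two samples per day can only represent frequencies up to 1 cycle/day,
# so a second harmonic (2 cycles/day) is not resolved and the usual
# (Tmax + Tmin)/2 "daily mean" is biased. The curve is invented.
import numpy as np

hours = np.linspace(0.0, 24.0, 24 * 60, endpoint=False)  # 1-minute resolution

def diurnal(t_hours):
    """Invented asymmetric diurnal cycle: fundamental plus a second harmonic."""
    u = 2.0 * np.pi * (t_hours - 9.0) / 24.0
    return 15.0 + 5.0 * np.sin(u) + 1.5 * np.cos(2.0 * u)

temps = diurnal(hours)
true_mean = temps.mean()                          # integrated daily mean (15.0 C here)
minmax_mean = 0.5 * (temps.max() + temps.min())   # what the station record yields

print(f"true daily mean        : {true_mean:.2f} C")
print(f"(max+min)/2 estimate   : {minmax_mean:.2f} C")
print(f"bias from undersampling: {minmax_mean - true_mean:+.2f} C")
```

The diurnal shape at any real site is of course different; the qualitative point is George’s – an in-band aliasing error of this kind cannot be averaged away afterwards.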
Why don’t climate “scientists” assert that they are not subject to the law of gravity either? They might as well, if they are going to cavalierly dismiss the Nyquist sampling criterion.
George
(in reference to the last decade’s lack of rise) ‘We should also not forget that nobody anticipated it’
However, in 1983 James Hansen predicted it was possible. He predicted a general warming trend, and noted that it could take as long as 20 years for CO2 warming to overcome natural variation.
So Hansen knew enough to know that the last 10 years of no warming were possible, but no climate scientist knew enough to predict the timing of such events. Now climate science is learning more and making more of an effort to predict such medium-term variations (i.e. in the 5 to 20 year range). Being a relatively new effort, there will probably be plenty of mistakes early on.
One thing I always hear from global warming alarmists, when you point out our inability to predict short-term climate trends, is that long-term climate trends are easier to predict. How can there be a scientific (as in empirically derived) basis for that statement?
“There is no “Global Mean Temperature”. It’s an artificial construct that is completely meaningless.”
Oh, god, not this again… it’s so depressing to see so many people fall for this tragic nonsense. What does “temperature” mean to you? Only if you think the answer is “nothing, ever” is your previous statement logically possible.
RW whined:
“The GISS temperature record starts in 1880. By 1988, therefore, there was 108 years of instrumental global temperature data, not “less than a decade of noisy global warming data”.”
Yup, and they are still ADJUSTING it to get it right for their theory!!
HAHAHAHAHAHAHAHAHAHAHAHA
Your disingenuous propaganda really stands out here!!
“long term climate trends are easier to predict. How can there be a scientific (as in empirically derived) basis for that statement?”
An analogy would be predicting the average temperature four months ahead vs. four days ahead. We know that seasonal temperature trends (climate) will override the noise (weather) in the long run, but they won’t in the short run.
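A toy numerical version of the same point (arbitrary numbers, not a climate model): bury a fixed linear trend in year-to-year noise and the fitted trend is hopeless over short windows but stable over long ones.

```python
# Toy illustration: a fixed linear trend buried in year-to-year noise.
# Over short windows the noise dominates the fitted trend; over long
# windows the trend dominates. All numbers are arbitrary.
import numpy as np

rng = np.random.default_rng(42)
trend_per_year = 0.01          # deg C / year "climate" signal
noise_sd = 0.15                # deg C of interannual "weather" noise

def fitted_trends(window_years, n_trials=2000):
    """Distribution of least-squares trends fitted to noisy windows."""
    t = np.arange(window_years)
    slopes = []
    for _ in range(n_trials):
        y = trend_per_year * t + noise_sd * rng.standard_normal(window_years)
        slopes.append(np.polyfit(t, y, 1)[0])   # fitted slope, deg C / year
    return np.array(slopes)

for window in (10, 30, 100):
    s = fitted_trends(window)
    frac_positive = (s > 0).mean()
    print(f"{window:3d}-year window: fitted trend = {s.mean():+.4f} "
          f"+/- {s.std():.4f} C/yr, positive in {100 * frac_positive:.0f}% of trials")
```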
Number of Days with Temperatures Above 32F at Milwaukee- any Trend?
By Rusty Kapela, WCM, NWS Milwaukee
http://www.crh.noaa.gov/news/display_cmsstory.php?wfo=mkx&storyid=31040&source=0
http://icecap.us/images/uploads/DESMOINESRECORDS.jpg
See full story and graphs at http://www.icecap.us (second column)
An interesting read, but this posting presumes that the data set showing the warming is reliable. What I see in it is simply that the data set is not reliable and the processing done on it is worse. For example, there is a large bias toward warming of large chunks of open area via a simple “airport on an island” effect. Perhaps all that has happened in the last decade is that we have built fewer new airports and are flying less.
http://chiefio.wordpress.com/2009/09/08/gistemp-islands-in-the-sun/
There’s a small problem.
The article says: “Lean and Rind (2009) look at the various factors known to cause variability in the earth’s temperature over decadal timescales. They come to the conclusion that between 2009-14 global temperatures will rise quickly by 0.15 deg C – faster than the 0.1 deg C per decade deduced as AGW by the IPCC. Then, in the period 2014-19, there will be only a 0.03 deg C increase. They believe this will be chiefly because of the effect of solar irradiance changes over the solar cycle.”
(my emphasis)
The L&R paper says [10]: “We assume that future irradiance cycles replicate cycle 23, with cycle 24 commencing at the beginning of 2009”.
Would you not say that this assumption has already been shown to be false?
Tom P – good luck with your bet, but I bet (figuratively) that you won’t collect more than once, because the other party can pull out at any time.
George E. Smith (09:37:42) : But I suspect that GISStemp is a reasonable representation of the tiny network of thermometers (owl boxes) that make up the (near) surface network.
It isn’t.
It uses too many airports for UHI adjustment (treating them as rural). It uses places very unlike an urban area and very far away as proxies for it (they are not). It divides the world into 6 latitude bands (and hides the major movement of thermometers south in those wide bands). And much much more…
http://chiefio.wordpress.com/2009/08/23/gistemp-fixes-uhi-using-airports-as-rural/
http://chiefio.wordpress.com/2009/08/30/gistemp-a-slice-of-pisa/
http://chiefio.wordpress.com/2009/08/17/thermometer-years-by-latitude-warm-globe/
http://chiefio.wordpress.com/gistemp/
This is completely wrong, and reverts to the common fallacy that AGW is based on an examination of the temperature record. It isn’t. It has always been based on an analysis of the greenhouse effect, and the accumulation of GHGs. Hansen’s paper was based on that too. That’s why he gave his physics-based projections citing varied GHG emission scenarios.
Ron de Haan
It must feel a little demeaning to be scrabbling around with US city data rather than making a global inference.
As you’re talking but not taking, can I infer you’re not willing to put your money where your mouth is?
RW (11:57:26) : The GISS temperature record starts in 1880. By 1988, therefore, there was 108 years of instrumental global temperature data, not “less than a decade of noisy global warming data”.
Uh huh… And exactly how many thermometers were around in 1880 and how evenly distributed over the globe? Look again at that phrase “global temperature data”. Guess what, we STILL don’t have global temperature data from any land sources. We have guesses, extrapolation, fudged holes, etc. For a look at percentage of thermometers by latitude by year:
http://chiefio.wordpress.com/2009/08/17/thermometer-years-by-latitude-warm-globe/
Some folks like percentages more than counts. OK, here are the percentages of “thermometer years” (basically, the percentage of the temperature records) for each decade, by latitude band. They are again labelled with S for south, N for north, EQ for equator, W for warm, C for cold, and T for temperate:

Decade   S.P   S.C   S.T   S.W    EQ   N.W   N.T   N.C   N.P
1709     0.0   0.0   0.0   0.0   0.0   0.0   0.0 100.0   0.0
1719     0.0   0.0   0.0   0.0   0.0   0.0   0.0 100.0   0.0
1729     0.0   0.0   0.0   0.0   0.0   0.0   0.0 100.0   0.0
1739     0.0   0.0   0.0   0.0   0.0   0.0   0.0 100.0   0.0
1749     0.0   0.0   0.0   0.0   0.0   0.0  17.5  82.5   0.0
1759     0.0   0.0   0.0   0.0   0.0   0.0  37.0  63.0   0.0
1769     0.0   0.0   0.0   0.0   0.0   0.0  38.9  61.1   0.0
1779     0.0   0.0   0.0   0.0   0.0   0.0  39.8  60.2   0.0
1789     0.0   0.0   0.0   0.0   0.0   0.0  49.7  50.3   0.0
1799     0.0   0.0   0.0   0.0   0.0   1.1  54.6  44.2   0.0
1809     0.0   0.0   0.0   0.0   0.0   1.8  54.2  44.0   0.0
1819     0.0   0.0   0.0   0.0   0.0   1.8  52.3  45.9   0.0
1829     0.0   0.0   0.0   0.0   0.1   1.8  52.2  45.8   0.1
1839     0.0   0.0   0.0   0.0   0.3   2.5  49.5  47.7   0.1
1849     0.0   0.0   0.5   0.0   1.0   3.1  50.3  44.5   0.6
1859     0.0   0.0   1.0   0.0   0.9   4.4  55.6  37.3   0.8
1869     0.0   0.0   5.0   0.0   1.0   2.2  57.2  34.1   0.5
1879     0.0   0.0   5.2   0.3   2.8   4.0  65.8  21.5   0.4
1889     0.0   0.0   4.7   1.1   2.0   5.2  67.0  19.8   0.3
1899     0.0   0.1   3.4   1.5   1.8   5.1  69.6  18.3   0.2
1909     0.0   0.4   4.9   2.7   1.9   5.9  66.9  17.0   0.2
1919     0.0   0.4   6.2   4.4   2.0   5.7  63.8  17.1   0.3
1929     0.0   0.3   6.0   4.6   2.1   6.7  62.4  17.3   0.5
1939     0.0   0.4   5.9   4.9   2.4   8.2  58.2  19.3   0.7
1949     0.0   0.5   5.9   5.9   2.6   9.3  54.8  20.2   0.8
1959     0.1   0.6   4.9   6.4   6.0  14.4  48.6  17.8   1.1
1969     0.4   0.8   5.2   7.2   8.1  14.6  45.8  16.8   1.2
1979     0.4   0.9   6.3   8.1   7.2  13.8  45.7  16.4   1.1
1989     0.3   0.9   6.4   7.8   5.8  11.8  49.1  16.8   1.1
1999     0.2   0.9   5.9   6.9   6.2  11.8  58.7   8.7   0.7
2009     0.3   0.8   4.4   5.7   6.8  13.6  57.4  10.3   0.7

I make that 7.3% of them in the Southern Hemisphere in 1879 as we enter the start of the GIStemp history. BTW, there were all of 27 in the Southern Warm latitudes. ALL of it. NONE in the Southern Cold band.
Not exactly what I’d call “Global”…
Further to Nick Stoke’s point, AGW was raised significantly earlier than 1988.
For instance, Google the Jason report, the Nierenberg report, and the Charney report. By the late 70s and early 80s, enough climate modelling and research into CO2 had been done that the American government was commissioning committee reports to attempt to summarise the science – a kind of mini IPCC report, but on a national rather than global scale. These reports were certainly not estimating CO2 warming on the basis of past temperature history, as the temperature history for the previous 30 years was at that stage believed to show cooling (the 70s ice age scare).
steven mosher (13:16:50) :
George E Smith.
I’m not so sure your complaint about the sampling density is on track.
http://www.met.tamu.edu/class/atmo632/(77)ShenKN94.pdf
In short we may get a better characterization of the global temp by focusing on site quality rather than site quantity. hence the importance of Anthony’s work.
I think these are orthogonal points.
If you want a true, real, and accurate GLOBAL average, I think George is correct.
But if all you really need (and want) is a valid sample for determining the trend of the temperature series, then a small and high quality sample ought to answer the question “Is there a general warming going on?” (Note, not “Global” but “general” – widespread and of interest, but not spanning a well sampled total global surface.)
So I think you are both right, but in specific and slightly orthogonal domains. (Having staked out the middle ground, all parties can now commence shooting at me in the crossfire 8-})
RW (15:07:00) :
“There is no “Global Mean Temperature”. It’s an artificial construct that is completely meaningless.”
Oh, god, not this again… it’s so depressing to see so many people fall for this tragic nonsense. What does “temperature” mean to you? Only if you think the answer is “nothing, ever” is your previous statement logically possible.
Wrongo!
All it takes is that you have an intrinsic property. An average of an intrinsic property is meaningless. If you don’t know what an intrinsic property is, or means, please take the time to learn. It really does matter. A Lot.
It is very important to keep it clear in your mind that temperature is an intensive variable. If you glaze over at that in the smallest degree and skip past it without an in-depth grasp of it, you will continue to waste time and space on a pointless pursuit of the impossible. Most folks, it seems, do exactly that (given how much bandwidth is wasted on the issue to no avail…)
These folks have a nice short description a couple of paragraphs down:
http://www.tpub.com/content/doe/h1012v1/css/h1012v1_30.htm
What that means, “intensive”, is that one instance of the property for one entity means nothing to another instance of another entity. It is not dependent on the mass of the object, for example.
The taste of MY meal means nothing to the taste of YOUR meal, and averaging them together can be done BUT MEANS NOTHING. Were you averaging in one drop of Tabasco sauce from my meal, or one ounce? It matters to the average, but is not known… What is an average of pepper and cloves?
Another, more pertinent, example might be taking two different pots of water and averaging their temperature. You get two numbers, but know nothing about the THERMAL ENERGY in the two pots. The temperatures become an average, but the average means nothing. It certainly is not representative of the average thermal energy.
Take the two pots of water and mix them: the resultant temperature is NOT the same as the average of the two temperatures. You must know the mass of water in each pot to get that result. And we did not measure the mass. Thermal energy is an extensive property, and its average does have meaning.
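A back-of-the-envelope check (arbitrary numbers, equal specific heat capacity assumed) makes the point:

```python
# Two pots of water: averaging their temperatures is not the same as the
# temperature you get when you actually mix them, unless the masses happen
# to be equal. Numbers are arbitrary; specific heat assumed constant.
m1, t1 = 0.5, 90.0    # kg, deg C  (small hot pot)
m2, t2 = 5.0, 10.0    # kg, deg C  (large cold pot)

average_of_temps = (t1 + t2) / 2.0               # what a "mean temperature" does
mixed_temp = (m1 * t1 + m2 * t2) / (m1 + m2)     # mass-weighted, energy-conserving

print(f"average of the two readings: {average_of_temps:.1f} C")   # 50.0 C
print(f"temperature after mixing   : {mixed_temp:.1f} C")         # about 17.3 C
```

Only the mass-weighted figure respects the thermal energy in the two pots; the plain average of the two readings tells you nothing about it.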
The same thing happens on a colossal scale globally. We measure the temperature over a snow field, and ignore the massive heat needed to melt the snow with no change of temperature and ignore the mass of the snow. We measure the temperature of the surface of the ocean and ignore the shallow and great depths. We measure the surface temperature of a forest, and ignore the TONS of water per acre being evaporated by transpiration.
Then we average those temperature readings together and expect them to tell us something about the heat and energy balance of the planet.
That is lunacy in terms of physics and mathematics.
So take all the above, and firmly fix in your mind the truth:
An average of a bunch of thermometer readings MEANS NOTHING.
Got it?
Mike Jonas,
“Tom P – good luck with your bet, but I bet (figuratively) that you won’t collect more than once, because the other party can pull out at any time.”
Even on these forgiving terms I haven’t had a single taker yet. “All talk and no trousers” might be applicable to any reader on this site who claims the world is cooling.
All it takes is that you have an intrinsic property. An average of an intrinsic property is meaningless. If you don’t know what an intrinsic property is, or means, please take the time to learn. It really does matter. A Lot.
That ought to have said:
All it takes is that you have an intensive property. An average of an intensive property is meaningless. If you don’t know what an intensive property is, or means, please take the time to learn. It really does matter. A Lot.
In keeping with the rest of the posting…
Nick Stokes (16:06:32) :
This is completely wrong, and reverts to the common fallacy that AGW is based on an examination of the temperature record. It isn’t. It has always been based on an analysis of the greenhouse effect, and the accumulation of GHGs. Hansen’s paper was based on that too. That’s why he gave his physics-based projections citing varied GHG emission scenarios.
So it has always been based on a fantasy about gasses and models and projections, oh my! Ok, got it…
It’s so much easier to understand when you leave the data out and accept that it has always been confirmation bias and self delusion.