Forecasting Guru Announces: "no scientific basis for forecasting climate"

It has been an interesting couple of days. Today yet another scientist has come forward with a press release saying not only that their audit of IPCC forecasting procedures found that they “violated 72 scientific principles of forecasting”, but that “The models were not intended as forecasting models and they have not been validated for that purpose.” This organization should know: they certify forecasters in many disciplines and, in conjunction with Johns Hopkins University in Washington, DC, offer a Certificate of Forecasting Practice. The story below originally appeared in the blog of Australian Dr. Jennifer Marohasy. It is reprinted below with some pictures and links added for WUWT readers. – Anthony


J. Scott Armstrong, founder of the International Journal of Forecasting

Guest post by Jennifer Marohasy

YESTERDAY, a former chief at NASA, Dr John S. Theon, slammed the computer models used to determine future climate claiming they are not scientific in part because the modellers have “resisted making their work transparent so that it can be replicated independently by other scientists”. [1]

Today, a founder of the International Journal of Forecasting, Journal of Forecasting, International Institute of Forecasters, and International Symposium on Forecasting, and the author of Long-range Forecasting (1978, 1985), the Principles of Forecasting Handbook, and over 70 papers on forecasting, Dr J. Scott Armstrong, tabled a statement declaring that the forecasting process used by the Intergovernmental Panel on Climate Change (IPCC) lacks a scientific basis. [2]

What these two authorities, Drs Theon and Armstrong, are independently and explicitly stating is that the computer models underpinning the work of many scientific institutions concerned with global warming, including Australia’s CSIRO, are fundamentally flawed.

In today’s statement, made with economist Kesten Green, Dr Armstrong provides the following eight reasons as to why the current IPCC computer models lack a scientific basis:

1. No scientific forecasts of the changes in the Earth’s climate.

Currently, the only forecasts are those based on the opinions of some scientists. Computer modeling was used to create scenarios (i.e., stories) to represent the scientists’ opinions about what might happen. The models were not intended as forecasting models (Trenberth 2007) and they have not been validated for that purpose. Since the publication of our paper, no one has provided evidence to refute our claim that there are no scientific forecasts to support global warming.

We conducted an audit of the procedures described in the IPCC report and found that they clearly violated 72 scientific principles of forecasting (Green and Armstrong 2008). (No justification was provided for any of these violations.) For important forecasts, we can see no reason why any principle should be violated. We draw analogies to flying an aircraft or building a bridge or performing heart surgery—given the potential cost of errors, it is not permissible to violate principles.

2. Improper peer review process.

To our knowledge, papers claiming to forecast global warming have not been subject to peer review by experts in scientific forecasting.

3. Complexity and uncertainty of climate render expert opinions invalid for forecasting.

Expert opinions are an inappropriate forecasting method in situations that involve high complexity and high uncertainty. This conclusion is based on over eight decades of research. Armstrong (1978) provided a review of the evidence and this was supported by Tetlock’s (2005) study that involved 82,361 forecasts by 284 experts over two decades.

Long-term climate changes are highly complex due to the many factors that affect climate and to their interactions. Uncertainty about long-term climate changes is high due to a lack of good knowledge about such things as:

a) causes of climate change,

b) direction, lag time, and effect size of causal factors related to climate change,

c) effects of changing temperatures, and

d) costs and benefits of alternative actions to deal with climate changes (e.g., CO2 markets).

Given these conditions, expert opinions are not appropriate for long-term climate predictions.

4. Forecasts are needed for the effects of climate change.

Even if it were possible to forecast climate changes, it would still be necessary to forecast the effects of climate changes. In other words, in what ways might the effects be beneficial or harmful? Here again, we have been unable to find any scientific forecasts—as opposed to speculation—despite our appeals for such studies.

We addressed this issue with respect to studies involving the possible classification of polar bears as threatened or endangered (Armstrong, Green, and Soon 2008). In our audits of two key papers to support the polar bear listing, 41 principles were clearly violated by the authors of one paper and 61 by the authors of the other. It is not proper from a scientific or from a practical viewpoint to violate any principles. Again, there was no sign that the forecasters realized that they were making mistakes.

5. Forecasts are needed of the costs and benefits of alternative actions that might be taken to combat climate change.

Assuming that climate change could be accurately forecast, it would be necessary to forecast the costs and benefits of actions taken to reduce harmful effects, and to compare the net benefit with other feasible policies including taking no action. Here again we have been unable to find any scientific forecasts despite our appeals for such studies.

6. To justify using a climate forecasting model, one would need to test it against a relevant naïve model.

We used the Forecasting Method Selection Tree to help determine which method is most appropriate for forecasting long-term climate change. A copy of the Tree is attached as Appendix 1. It is drawn from comparative empirical studies from all areas of forecasting. It suggests that extrapolation is appropriate, and we chose a naïve (no change) model as an appropriate benchmark. A forecasting model should not be used unless it can be shown to provide forecasts that are more accurate than those from this naïve model, as it would otherwise increase error. In Green, Armstrong and Soon (2008), we show that the mean absolute error of 108 naïve forecasts for 50 years in the future was 0.24°C.

7. The climate system is stable.

To assess stability, we examined the errors from naïve forecasts for up to 100 years into the future. Using the U.K. Met Office Hadley Centre’s data, we started with 1850 and used that year’s average temperature as our forecast for the next 100 years. We then calculated the errors for each forecast horizon from 1 to 100. We repeated the process using the average temperature in 1851 as our naïve forecast for the next 100 years, and so on. This “successive updating” continued until year 2006, when we forecasted a single year ahead. This provided 157 one-year-ahead forecasts, 156 two-year-ahead and so on to 58 100-year-ahead forecasts.

We then examined how many forecasts were further than 0.5°C from the observed value. Fewer than 13% of forecasts of up to 65-years-ahead had absolute errors larger than 0.5°C. For longer horizons, fewer than 33% had absolute errors larger than 0.5°C. Given the remarkable stability of global mean temperature, it is unlikely that there would be any practical benefits from a forecasting method that provided more accurate forecasts.
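The “successive updating” procedure described above is simple enough to sketch in a few lines of Python. Note that the temperature series used here is a synthetic placeholder (a slow linear drift), not the actual Hadley Centre data, so only the forecast counts, not the error values, correspond to the figures in the text.

```python
# Sketch of the "successive updating" naive-forecast audit described above.
# Each year's observed temperature serves as the forecast for every later
# year up to 100 years ahead; absolute errors are collected per horizon.

def naive_forecast_errors(temps, max_horizon=100):
    """Return {horizon: [absolute errors]} for a naive (no-change) model."""
    errors = {h: [] for h in range(1, max_horizon + 1)}
    for base in range(len(temps)):
        for h in range(1, max_horizon + 1):
            target = base + h
            if target >= len(temps):
                break
            errors[h].append(abs(temps[target] - temps[base]))
    return errors

# Synthetic placeholder series: one value per year for 1850..2007
# (158 values), drifting upward by 0.002 degrees C per year.
temps = [0.002 * year for year in range(158)]
errs = naive_forecast_errors(temps)

print(len(errs[1]), len(errs[100]))  # 157 one-year-ahead, 58 100-year-ahead
```

With 158 annual values the counts come out exactly as in the text: 157 one-year-ahead forecasts, 156 two-year-ahead, down to 58 one-hundred-year-ahead. From there, tallying how many errors exceed 0.5°C at each horizon is a one-line filter.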

8. Be conservative and avoid the precautionary principle.

One of the primary scientific principles in forecasting is to be conservative in the face of uncertainty. This principle also argues for the use of the naïve no-change extrapolation. Some have argued for the precautionary principle as a way to be conservative; it is a political, not a scientific, principle. As we explain in our essay in Appendix 2, it is actually an anti-scientific principle in that it attempts to make decisions without using rational analyses. Instead, cost/benefit analyses are appropriate given the available evidence, which suggests that temperature is just as likely to go up as down. However, these analyses should be supported by scientific forecasts.

The reach of these models is extraordinary. For example, the CSIRO models are currently being used in Australia to determine water allocations for farmers and to justify the need for an Emissions Trading Scheme (ETS) – the most far-reaching of possible economic interventions. Yet, according to Dr Armstrong, these same models violate 72 scientific principles.

********************

1. Marc Morano, James Hansen’s Former NASA Supervisor Declares Himself a Skeptic, January 27, 2009. http://epw.senate.gov/public/index.cfm?FuseAction=Minority.Blogs&ContentRecord_id=1a5e6e32-802a-23ad-40ed-ecd53cd3d320

2. “Analysis of the U.S. Environmental Protection Agency’s Advanced Notice of Proposed Rulemaking for Greenhouse Gases”, by Drs. J. Scott Armstrong and Kesten C. Green, a statement prepared for US Senator Inhofe analysing the US EPA’s proposed policies for greenhouse gases. http://theclimatebet.com





335 Comments
anna v
January 30, 2009 9:08 am

Pamela, many posts above you floated the idea of testing the GCM with and without CO2 and seeing whether there is any statistical significance.
There is no statistical significance that can be extracted from the spaghetti diagrams of the AR4 studies. The reason is that in order to get a 1-sigma error bar from a model, one has to vary by 1 sigma each of the many parameters that enter the models. I suspect that this would result in huge changes in the graphs, which might even show cooling as often as warming, so your test could not be resolved within the errors.
Here is the reference, from the donkey’s mouth itself, that the IPCC does not do errors:
chapter 8 of AR4, “Metrics of Model Reliability”, http://www.ipcc.ch

The above studies show promise that quantitative metrics for the likelihood of model projections may be developed, but because the development of robust metrics is still at an early stage, the model evaluations presented in this chapter are based primarily on experience and physical reasoning, as has been the norm in the past.

note, experience and physical reasoning.
My personal opinion is that any grid based model cannot work for the complexity of climate, and the only way I see a future for climate modeling is following chaos and complexity theory and tools, as Tsonis et al ( discussed in a thread in CA) are doing.

Jeff Alberts
January 30, 2009 9:09 am

In case anyone is interested in my previous post about 10 day forecasts, here is the third day snapshot along with the other two:
http://jalberts.net/images/DC_01-28-2009.jpg
http://jalberts.net/images/DC_01-29-2009.jpg
http://jalberts.net/images/DC_01-30-2009.jpg
Less of a drastic change from the 29th to the 30th than there was from the 28th to the 29th. But still most days changed what I would call significantly. I ask again, what’s the point of a 10 day forecast that isn’t the same from one day to the next?

Flanagan
January 30, 2009 9:10 am

Niels: don’t you know that the isotope composition of the Vostok ices are representative of the global ones? Do you want me to give scientific sources about the validation procedures (that you probably won’t read)? There are many of them. How many papers can you cite that PROVE it is not the case?
Excuse me, but the rest is blabla and rhetoric. It’s not sufficient to say “I think that…” or “obviously…”. That’s not the way science works…

Flanagan
January 30, 2009 9:15 am

John: could you please give us a link to data showing the 20 years-long cooling at 1 deg/century you mentioned?

Jeff Alberts
January 30, 2009 9:38 am

My personal opinion is that any grid based model cannot work for the complexity of climate, and the only way I see a future for climate modeling is following chaos and complexity theory and tools, as Tsonis et al ( discussed in a thread in CA) are doing.

Anna V, please forgive my ignorance, but isn’t chaos theory, by definition, unpredictable?

gary gulrud
January 30, 2009 9:39 am

“Do you want me to give scientific sources about the validation procedures (that you probably won’t read)?”
I think it would be good practice for you to do your own work. The issue is CO2 ppm reconstructions from the cores, not cosmogenic isotopes.
Your ‘proof’ will be centered on the chemistry of CO2 on a water planet.

Flanagan
January 30, 2009 9:57 am

Excuse me Gary, but it’s about temperatures not CO2.

Tim Clark
January 30, 2009 10:36 am

Sekerob (08:00:36) :
Anthony, so when I speak of the money trail, you think of a direct line to big oil. Well, Exxon-Mobil did a significant about-face.

About a climate factor of years ago I received a substantial sum of money from Gulf Oil to do research titled “Improving recovery of biofuels with growth regulators” (sorry, proprietary info, no peer reviewed data there). Now what camp do I fall in, with that dichotomy? Am I pro-oil or a greenie? I need to know because I’ve lived with guilt for so many, many years./sarc

Tim Clark
January 30, 2009 11:03 am

Since I’m clearing my conscience, I neglected to mention “Revegetation of Retorted Oil Shale”, funded by Paraho Oil. Now I really, really need forgiveness.

January 30, 2009 11:04 am

Juan #07 13 48
You posted here about ships’ logs.
At Exeter University (UK) someone has just received a grant to look for details of climate change in the biggest archive of ships’ logs in the world – the hundreds of thousands assembled by the Royal Navy.
http://www.timesonline.co.uk/tol/news/environment/article4448895.ece
The doctor conducting the research has excellent credentials; unfortunately he is being sponsored by Exeter University. The Hadley Centre/Met Office is located two miles away and finances a climate science chair there, and Exeter is attempting to build a world-leading expertise on the subject, so I do fear there will be an inbuilt bias.
I ran a long thread over at CA on climate. This link leads to literally hundreds of pieces of research/papers/books covering many thousands of years of climate and weather information, including the climate references of the Byzantine empire 383 to 1452 AD.
http://www.climateaudit.org/phpBB3/viewtopic.php?f=3&t=520
Thread entitled ‘Have we been this way before?’
I then summarised everything in this thread ‘All Things must pass.’
http://www.climateaudit.org/phpBB3/viewtopic.php?f=3&t=541
This is taken from it;
The civilisation of Akkad, c. 2000 BC. Lines taken from The Curse of Akkad:
For the first time since cities were built and founded,
The great agricultural tracts produced no grain,
The inundated tracts produced no fish,
The irrigated orchards produced neither syrup nor wine,
The gathered clouds did not rain, the masgurum did not grow.
At that time, one shekel’s worth of oil was only one-half quart,
One shekel’s worth of grain was only one-half quart. . . .
These sold at such prices in the markets of all the cities!
He who slept on the roof, died on the roof,
He who slept in the house, had no burial,
People were flailing at themselves from hunger.
Could you post the very interesting information you have mentioned here to my thread ‘Have we been this way before?’ Could you also post a link to your own thread so I can visit it?
Sorry to everyone for going OT but there is a wealth of historic and scientific information demonstrating the current warming episode is nothing new
TonyB

MikeF
January 30, 2009 11:24 am

Tim Clark,
You have to understand that you were just a useful tool for the evil oil companies, who needed to fool people into thinking that they do something good while getting tax write-offs at our expense.
It’s a good thing that Exxon-Mobil now understands and repents.
/sarc

MikeF
January 30, 2009 11:37 am

Flanagan,

Niels: don’t you know that the isotope composition of the Vostok ices are representative of the global ones?

I am not sure I understand your implication here. Are you saying that ice cores taken from a single area are as good (or better) at measuring global temperature as records derived from all of the temperature networks around the world? Please clarify.

Bill D
January 30, 2009 12:15 pm

Stefan:
Yes, there are thousands of scientific studies validating our understanding of what controls population abundance. Of course, a big advantage for population ecology is that we can do experiments. I usually work with zooplankton, where generation time is a week or two and where factors such as algal productivity, fish predators, and competitors can cause predictable order-of-magnitude changes and even evolutionary (genetic) changes over time scales of months. Also (of course) nature is complex, so the more we learn, the more questions we have to answer.
Oftentimes when predictions are false, it’s because some factor that was not considered turned out to be important. This is why scientists should be excited when their expectations are not confirmed by data or experiments. At that point, finding out why the expectation was wrong should lead to a good discovery.
I’m getting ready to start some lab experiments next week.

January 30, 2009 12:17 pm

I’m not a scientist and will never claim to have the knowledge to delve into the specifics of either argument. I love the discussion, however. Looking in at the fishbowl of those who decide to debate is fascinating.
I once read a news story about scientists wishing to correct the mistake they made once with algae, fish and predators at a pond somewhere. They had initially been so concerned about the fish, which were dying because of algae growth, that they introduced a bug. The bug killed all the algae, and the fish grew in number, then it caused an increase in the local population of a different fish, which ate a different plant, so they were pondering whether they should add a different fish to eat the littler fish, or maybe a predator, or a chemical. My reaction was “stop screwing with it…the system is smarter than you”.
Our climate system is smarter than all of the people who think they know how to fix it. My reaction is “stop screwing with it”, which means to stop doing things that harm the system. If CO2 is increasing, we should take measures to minimize the change. If forests are being cleared, we should find other alternatives that don’t require that particular wood. Society must contain itself or it will be dealt with eventually.
I see hydrogen cars creating no pollution, with hydrogen created from solar and wind energy, as a pollutant-free alternative. Humans will never be controllable, being a chaotic factor in our ecosystem. If we do not take measures to prevent the instability of our environment, it will take measures to correct itself.
Believe all you want in the non-existence of global warming, or the certainty of it. Either way, we need to come to a middle ground that works for ALL sides to an agreeable degree. Otherwise, it is certain that eventually the least-costly and least-resistance option will prevail. (probably chosen by the people who trust this blog to say there is no problem) Most certainly, this will be the worst option, and eventually nature will correct itself by removing the item creating the issue, naturally.
The system is much smarter than we are.

January 30, 2009 12:41 pm

salutwineco, I agree with your premise that the climate system is smarter than we are. Then it follows, of course, that the system of civilization is also smarter than we are.
And from that it follows that we should do nothing to alter the system, unless it is shown conclusively that our inaction would cause more problems to the system than taking action.
Finally, we can conclude from your rationale that spending $Trillions to “mitigate” the putative effects of the trace gas CO2 would be a preposterously stupid undertaking, since we don’t know the effects of the changes we would be introducing into a climate system that is currently in balance.
But then, the AGW issue has nothing to do with science or logic. It is simply a way for the government to tax the air we breathe.

anna v
January 30, 2009 12:54 pm

Jeff Alberts (09:38:19) :
Anna V, please forgive my ignorance, but isn’t chaos theory, by definition, unpredictable?
Deterministic chaos, which is the kind weather and climate belong to, has mathematical formulations.
Try http://en.wikipedia.org/wiki/Chaos_theory as a start.
The Tsonis paper is here : http://www.uwm.edu/~aatsonis/

Chris
January 30, 2009 1:09 pm

I believe that there are variables at play in climate prediction that exceed the bounds of even chaos theory.
Not only do we not know all the factors at work, but we’ve made only rudimentary progress at ranking the variables we do know about.
In short, we ain’t figured out yet how to figure it out.
I love the hurricane forecasts each year as entertainment, and am astonished anyone actually gets paid to make them. They are anywhere close to the mark so rarely that it could simply be an accident when they are. Anyone who wants free money and has the appropriate degree should come up with a serious sounding name and start doing it.
http://www.gonzogeek.com

January 30, 2009 1:12 pm

“Spending Trillions” is always relative… You have to assume the option is to spend $Trillions less. Money will be spent either way. Just examine the “price of gasoline”. Do you include the price of wars, death, health, governments, or other factors that influence such spending?
“Inaction would cause more problems” is only true if you consider the human equation somehow beneficial, which is again subjective. There is no “beneficial” or “harmful” in the natural world, only a balance derived from necessity. If humans cause so much harm that the system becomes unstable, the system will remove or reduce humans.
I’m unclear about your assumption that “civilization” is smarter. I see no correlation of “civility” to “nature”, so the next paragraph doesn’t logically follow for me. To take no action on the assumption that we shouldn’t act until necessary is similar to not steering your car until you are off course and certain of having an accident.
You’re right about one point… we shouldn’t spend $Trillions simply to mitigate CO2 gas. However, we will spend money regardless, and it will eventually reach $Trillions. Logically we can all agree we should do it in a way that produces the least effect on the environment for the greatest return in productivity. It’s a curve… no “YES” or “NO” answer, so we’ll have to decide the best course through agreement of the many sides of the argument.
The one absolute is that to not affect the current balance will guarantee the status quo, in which humanity is currently thriving. To proceed with no concern for the effect of humanity on the environment, because we believe we understand that CO2 won’t be a factor, is the height of hubris and the complete disregard for the complexities of the system.

gary gulrud
January 30, 2009 2:13 pm

Flanagan (09:57:24) :
“Excuse me Gary, but it’s about temperatures not CO2.”
No, excuse me, I was discourteously inattentive. I hope you’re arguing anticorrelation of 10Be with temp, what’s R^2? 18O has fallen and can’t get up.

DaveE
January 30, 2009 2:14 pm

The whole debate seems to have moved from the validity of a ‘forecasting expert’ commenting on ‘forecasting’.
One of the tenets was, ‘A naïve (simple) model is the best way forward due to the tendency of complex models to deviate from reality quickly’ (paraphrase).
Didn’t a certain Dr. Roy Spencer come up with a naive model that got a lot closer to reality than the GCMs some little while ago?
I will now be attacked vehemently. 😉
DaveE.

Niels A Nielsen
January 30, 2009 2:24 pm

Flanagan: “Niels: don’t you know that the isotope composition of the Vostok ices are representative of the global ones? Do you want me to give scientific sources about the validation procedures (that you probably won’t read)? There are many of them. How many papers can you cite that PROVE it is not the case?”
No, I didn’t know that the “Vostok ices” are representative of the global… ices… on time scales of thousands or tens of thousands of years. Are they?
Well, the Vostok “ices” do not seem to “represent” the Greenlandic “ices” on those timescales:
“We observe that most of the cold episodes recorded in the Greenland ice correspond to peaks in isotope Antarctic records once the common timescale derived from methane measurements is adopted. These Antarctic peaks are commonly interpreted as reflecting warmings in Antarctica.”
http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6V61-402K7X9-8&_user=10&_rdoc=1&_fmt=&_orig=search&_sort=d&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=f072fbd8712e718d633045eb9155cc08
Are Greenlandic and Antarctic temperatures correlated or rather anticorrelated on the thousands years scale?
“The results show that for any period in the time between 20,000 to 55,000 years before present Antarctica gradually warmed when the North was cold and warm water export from the Southern Ocean to the North Atlantic was reduced. In contrast the Antarctic started to cool every time more warm water started to flow into the North Atlantic during warm events in the north. This result suggests a general link of long-term climate changes in both hemispheres via a “Bipolar Seesaw” when the Atlantic overturning circulation changes.”
http://www.esf.org/media-centre/press-releases/ext-single-news/article/climate-seesaw-links-northern-and-southern-hemisphere-in-the-glacial-period-98.html
Consider also the IPCC spaghetti graph of multiproxy studies of global temperatures for the last 1000 years. Are these single-proxy, geographically isolated ice-core studies spanning hundreds of thousands of years a more reliable source of prehistoric global temperatures than the multiproxy studies are of the more recent past?
Recent finding: the last ice age ended in more or less one year, according to data from Greenland, NORDGRIP (source in Danish)
http://www.dpc.dk/sw15040.asp

foinavon
January 30, 2009 4:04 pm

anna v (04:39:52) :

I am sorry but your “vast amount” for “climate sensitivity near 3 C per doubling CO2” is not empirical evidence. It is the output of modeling programs subject to the feedback mechanism with H2O, a creative invention of the modelers.

Not really Anna (and no need to be sorry!). The bulk of the evidence for a climate sensitivity near 3 °C per doubling of atmospheric CO2 doesn’t come from models, although the models are certainly consistent with the large amount of observational and analytical evidence that informs us on the Earth’s temperature response to enhanced greenhouse gas levels. This comes from analysis of the paleotemperature/paleoCO2 data right throughout the Phanerozoic period, analysis of the tropospheric temperature response to forcings that can be matched to the CO2-induced forcing (e.g. the solar cycle or the temperature response to volcanic forcing), analysis of the temperature variations during ice age transitions, and so on. Lots of different means of attacking this problem give a general result that the Earth has a climate sensitivity near 3 °C (plus/minus a bit!) for doubling of atmospheric CO2. Here’s some of the data (see [***] below).

If this sensitivity existed, we would not be here now. The icecore records show a lag of CO2 to temperature rise of at least 800 years. Were the sensitivity what you claim, the response would be instantaneous, since CO2 comes out with temperature, but still needs aeons to build up in the atmosphere.

I’m not sure why you think we wouldn’t be here now! Can you explain?
You have to be a little careful with respect to the lag, and you seem to be getting things back to front. The climate sensitivity refers to CO2-induced warming of the Earth, and not warming-induced recruitment of CO2 from the oceans (and terrestrial sources). The lag is likely the result of the slow warming response (and CO2 release) of the Southern oceans to Southern hemisphere warming resulting from enhanced Spring insolation due to variations in the Earth’s orbital properties (Milankovitch cycles). In a deglaciation, CO2 rise lags temperature rise in Antarctic cores. But remember that the CO2 rise precedes the temperature rise in Greenland cores.
That’s the “lag”. Once the CO2 concentration rises in the atmosphere, the Earth’s temperature starts to rise (all else being equal, of course). However the final equilibrium temperature (e.g. 3 °C of warming from doubled CO2) takes quite a while to be realized due to the inertia in the climate system. On the other hand, because the temperatures rose so achingly slowly during glacial-interglacial transitions (around 5 °C globally over 5000 years, or 0.01 °C per decade on average, around 15-20 times slower than the warming of the last 30-odd years), the temperature rise did more or less “keep pace” with the CO2 rise.

The short term records also show that CO2 goes up and down with temperature with a lag of a few months from temperature, so that even short term the humidity feedback is nonexistent. How can something which is a result drive a cause, and with large sensitivity at that?

Again you have to be careful with your interpretation of observations that are actually rather well characterised scientifically. Obviously there is a sinusoidal element to the CO2 rise due to greenhouse gas emissions, since the N hemisphere plant growth/decay cycle “piggybacks” on the relentless CO2 rise. So the CO2 doesn’t go “up and down” with temperature. What is apparent is that the rate of increase of CO2 in the atmosphere has some dependence on the very short term global temperature variation due to the stochastic elements in the climate system. So for example in El Nino years and shortly after, the tropical forests are water-depleted, grow poorly and are prone to forest fires and so tend to release a bit more CO2 into the atmosphere, so CO2 rises a bit more than expected in/shortly after El Nino years. Likewise in La Nina years and shortly after, there is a tendency for a “pull down” of CO2 from the atmosphere as the tropical forests show enhanced growth. That’s quite well understood and is described in the published science.
Otherwise the very marked warming of the last 30 years, and the overall warming in the “industrial era” (from the mid-late 19th century), is consistent with expectations from greenhouse-induced warming and a climate sensitivity near 3 °C. Your notion of “results” and “causes” is not consistent with very straightforward observations. So for example you propose that the massively enhanced CO2 is a “result”. But that doesn’t accord with the evidence. We know that the Earth releases around 90 ppm of CO2 during the 5 or 6 °C of warming during glacial to interglacial transitions (180-270 ppm). That’s around 15 ppm per °C. That’s a pretty consistent observation across all the glacial-interglacial transitions. So for the 0.8 °C of warming we’ve had since the start of the industrial era, we should only have released around 10 ppm (probably much less, since the temperature rise is much faster than the ocean equilibration timescale). In fact we’ve driven CO2 massively upwards, from around 280 to 385 ppm (and growing). So there’s something fundamentally wrong with your analysis…
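The outgassing arithmetic in the comment above can be checked in a few lines. The figures (90 ppm released per deglaciation, roughly 6 °C of glacial warming, 0.8 °C and a 280 to 385 ppm rise for the industrial era) are taken directly from the comment, not independently sourced; the script merely redoes the division.

```python
# Back-of-envelope check of the ocean-outgassing argument above, using
# the figures quoted in the comment (not independently sourced here).

glacial_co2_rise_ppm = 90.0   # 180 -> 270 ppm across a deglaciation
glacial_warming_c = 6.0       # roughly 5-6 C of global warming

# Implied outgassing rate: ~15 ppm of CO2 released per degree of warming.
outgassing_rate = glacial_co2_rise_ppm / glacial_warming_c

# Warming-driven outgassing alone would then explain only ~12 ppm of the
# industrial-era CO2 rise (in line with the "around 10 ppm" figure above)...
modern_warming_c = 0.8
expected_outgassing_ppm = outgassing_rate * modern_warming_c

# ...versus the ~105 ppm actually observed (280 -> 385 ppm).
observed_rise_ppm = 385.0 - 280.0

print(outgassing_rate, expected_outgassing_ppm, observed_rise_ppm)
```

The point of the comparison is simply that the observed rise is nearly an order of magnitude larger than what temperature-driven release alone would predict at the quoted rate.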
———————————————————-
non-model-based analysis of the earth’s climate sensitivity:
Annan and Hargreaves review some of the data here. Note that some of this is model-based. You can just ignore it if you’re unhappy!
Annan JD, Hargreaves JC (2006) Using multiple observationally-based constraints to estimate climate sensitivity. Geophys. Res. Lett. 33, art #L06704
http://www.jamstec.go.jp/frcgc/research/d5/jdannan/GRL_sensitivity.pdf
More recent estimates of climate sensitivity from observational data are:
K-T Tung and CD Camp (2007) Solar cycle warming at the Earth’s surface and an observational determination of climate sensitivity (in the press)
http://www.amath.washington.edu/research/articles/Tung/journals/solar-jgr.pdf
Royer DL et al. (2007) Climate sensitivity constrained by CO2 concentrations over the past 420 million years Nature 446, 530-532

Lance
January 30, 2009 4:18 pm

Just to let those people who use “desmog.com” as a link know: it’s hard to take anything you post seriously; look for it somewhere else maybe.
ALSO,
They are run by a big PR company in connection with Andrew Weaver, who is a computer scientist doing weather modeling with the school of Earth and Ocean Sciences at the University of Victoria (I live here). And he has just recently released a book flogging the Hansen/Mann flawed theory and is also part of a bigger PR effort. One wonders what was done with the $250,000 Hansen received from Heinz/Kerry?

John Finn
January 30, 2009 4:48 pm

John: could you please give us a link to data showing the 20 years-long cooling at 1 deg/century you mentioned?
1937-56 Hadley or GISS – take your pick.
Now, can you give us a link to data showing the “1.5C per century since 1850”

January 30, 2009 7:55 pm

salutwineco…enjoy your comments, but agree with Smokey.
The premise of what you say seems to be that human beings are a negative influence on our planet, and in fact are likely to totally f*** it up.
Note I use the word “likely”. Not definitely.
I grew up in England in the 1950’s. People smoked at the movies. People died of lung congestion due to “smog” (smoke and fog.) The whole place was filthy dirty.
Not any more.
We are aware of emissions, we recycle (sometimes stupidly, using more energy in the process) we penalise polluters.
In other words, we are learning.
We don’t need Al Gore’s anti-capitalist, pro Al Gore crusade. The planet is in good shape and getting better than it was. Enhance wealth…people then breed less; enhance literacy too – it leads to enlightenment…