
Guest Post by Steven Goddard
Global Climate Models (GCMs) are very complex computer models containing millions of lines of code, which attempt to model the cosmic, atmospheric and oceanic processes that affect the earth’s climate. These have been built over the last few decades by groups of very bright scientists, including many of the top climate scientists in the world.
During the 1980s and 1990s, the earth warmed at a faster rate than it did earlier in the century. This led some climate scientists to develop a high degree of confidence in models which predicted accelerated warming, as reflected in IPCC reports. However, during the last decade the accelerated warming trend has slowed or reversed. Many climate scientists have acknowledged this and explained it as “natural variability” or “natural variations.” Some believe that the pause in warming may last as long as 30 years, as recently reported by the Discovery Channel.
But just what’s causing the cooling is a mystery. Sinking water currents in the North Atlantic Ocean could be sucking heat down into the depths. Or an overabundance of tropical clouds may be reflecting more of the sun’s energy than usual back out into space.
“It is possible that a fraction of the most recent rapid warming since the 1970’s was due to a free variation in climate,” Isaac Held of the National Oceanic and Atmospheric Administration in Princeton, New Jersey wrote in an email to Discovery News. “Suggesting that the warming might possibly slow down or even stagnate for a few years before rapid warming commences again.”
Swanson thinks the trend could continue for up to 30 years. But he warned that it’s just a hiccup, and that humans’ penchant for spewing greenhouse gases will certainly come back to haunt us.
What has become obvious is that there are strong physical processes (natural variations) which are not yet understood, and are not yet adequately accounted for in the GCMs. The models did not predict the current cooling. There has been lots of speculation about what is causing the present pattern – changes in solar activity, changes in ocean circulation, etc. But whatever it is, it is not adequately factored into any GCMs.
One of the most fundamental rules of computer modeling is that if you don’t understand something and you can’t explain it, you can’t model it. A computer model is a mathematical description of a physical process, written in a human-readable programming language, which a compiler can translate to a computer-readable language. If you cannot describe a process in English (or your native tongue), you certainly cannot describe it mathematically in Fortran. The Holy Grail of climate models would be the following function, which of course does not exist.
FUNCTION FREEVARIATION(ALLOTHERFACTORS)
C Calculate the sum of all other natural factors influencing the temperature
…..
RETURN
END
Current measured long-term warming rates range from 1.2-1.6 C/century. Some climatologists claim 6+ C for the remainder of the century, based on climate models. One might think that these estimates are suspect, due to the empirically observed limitations of the current GCMs.
As one small example, during the past winter NOAA’s Climate Prediction Center (CPC) forecast that the upper Midwest would have above-normal temperatures. Instead, temperatures were well below normal.
http://www.cpc.ncep.noaa.gov/products/archives/long_lead/gifs/2008/200810temp.gif
http://www.hprcc.unl.edu/products/maps/acis/mrcc/Last3mTDeptMRCC.png
Another, much larger, example is that the GCMs would be unable to explain the causes of ice ages. Clearly the models need more work, and more funding. The BBC printed an article last year titled “Climate prediction: No model for success.”
And Julia Slingo of Reading University (now the UK Met Office’s Chief Scientist) admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.
“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.
“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”
……
One trouble is that as some climate uncertainties are resolved, new uncertainties are uncovered.
Some modellers are now warning that feedback mechanisms in the natural environment which either accelerate or mitigate warming may be even more difficult to predict than previously assumed.
Research suggests the feedbacks may be very different on different timescales and in response to different drivers of climate change.
…….
“If we ask models the questions they are capable of answering, they answer them reliably,” counters Professor Jim Kinter from the Center for Ocean-Land-Atmosphere Studies near Washington DC, who is attending the Reading meeting.
“If we ask the questions they’re not capable of answering, we get unreliable answers.”
I am not denigrating the outstanding work of the climate modelers – rather I am pointing out why GCMs may not be quite ready yet for forecasting temperatures 100 years out, and that politicians and the press should not attempt to make unsupportable claims of Armageddon based on them. I would appreciate it if readers would keep this in mind when commenting on the work of scientists, who for the most part are highly competent and ethical people, as is evident from this UK Met Office press release.
Stop misleading climate claims
11 February 2009
Dr Vicky Pope
Dr Vicky Pope, Met Office Head of Climate Change, calls on scientists and the media to ‘rein in’ some of their assertions about climate change.
She says: “News headlines vie for attention and it is easy for scientists to grab this attention by linking climate change to the latest extreme weather event or apocalyptic prediction. But in doing so, the public perception of climate change can be distorted. The reality is that extreme events arise when natural variations in the weather and climate combine with long-term climate change. This message is more difficult to get heard. Scientists and journalists need to find ways to help to make this clear without the wider audience switching off.”

Forgot: In climate4you, under Climate Reflections (scroll down about 2/3 of the page – Figure F), the rate of change since 1908 is DEAD FLAT. (Also, “one roughly” should be in the Michaels quote.)
How can someone simultaneously say that natural variability is strong enough to suppress the climate signal from GHGs for possibly the next 30 years, while also maintaining that the climate signal from GHGs has been observably distinguished from natural climate variability in our temperature record? In the mid-to-late 70s, the earth was winding up a cooling trend, and the case for global warming was based primarily upon the large temperature increases in the 30 years since then. If, despite the increase in annual CO2 emissions, global warming is not yet strong enough to counteract natural climate cooling in the 30 years going forward, then how can anyone look at the past 30 years and conclude that any particular amount of the warming trend of the last 30 years was or was not natural? Has somebody proven that the climate naturally cools more strongly than it warms?
In my mind, this is an admission that there is no proof of manmade global warming that can be taken from the Earth’s temperature record.
Fortran 77? Wasn’t that Fortran 4 I used to tutor? (Friends, don’t get old.)
Local SW Missouri ornithologists raised the alarm last month about the unmistakable thermal impact on some silly little purple birds which had been forced northward out of their usual winter haven here by relentless global warming. Rather than spontaneously combust, these crafty avians headed for a cooler clime, a whole forty miles north of here. Please ignore the fact that these birds can migrate over three hundred miles a day, by recent measurement. A disreputable skeptic (oh, wait, that was me) pointed out that the local farmers had plowed their pastures and planted corn in order to cash in on the “alternative fuel” bonanza which was surely to happen. Farms north of here are likely too rocky and steep to raise much good corn. Since little purple birds can’t crack a kernel of corn, better vittles are probably where corn fields ain’t.
Backed by the latest satellite images, dire scientific pronouncements, and scary climate models, the little purple birds headed north. Right into a winter storm from our frozen friends in Canada. The little purple birds froze to death. More victims of global warming. Shame on capitalism. Bird-brains. Film at eleven. (/sarc off)
@kurt (17:29:07) :
We flogged away at this question a few days ago on another thread (see WUWT 2/19/09, http://wattsupwiththat.com/2009/02/19/when-you-can’t-believe-the-model/ ) with two camps emerging: One, computers cannot do any more than their human programmers tell them; and Two, computers can determine things no human ever could do. I am firmly in both camps, having written and tested both sorts.
The theory and practice of Artificial Intelligence holds that there exist several classes of problems in Camp Two for which there are no known solutions; some are so large that even a computer the size of a galaxy would require millions of years to solve them.
A more tractable example of Camp One is where no direct solution exists to a given problem, hence trial and error must be used. Further, the number of iterations can continue for a very long time, with successive results improving only in the very small decimal places. Therefore, some sufficiently small difference is allowed between iterations, and the answer is declared at that point. Some problems have sufficiently numerous variables and sufficiently complex calculations that they can only be solved iteratively. Humans can do this, but it takes a very long time and is prone to calculation errors.
A simple analogy from engineering is the heat loss from the surface of an insulated pipe carrying a hot fluid, calculated by determining the temperature gradient through the insulation and the wall of the hot pipe. One can establish the internal temperature of the pipe, for example, carrying hot steam at 900 F. The pipe wall thickness is known, and its thermal conductivity is also known based on the type of material. One also knows the temperature of the surrounding air, for example it may be 40 F. The thickness of insulation is known, and its thermal conductivity is known.
To solve this, one makes an initial guess for the temperature at the outer edge of the insulation, then solves the various heat-balance equations, including radiation heat losses. A calculated temperature at the outer edge of the insulation results, usually not the same as the initial estimate. One continues to input the new temperature, re-solve, and check to see if the old temperature is sufficiently close to the new temperature.
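A minimal sketch of that trial-and-error loop in Python (every number, coefficient, and the simplified flat-wall heat balance here is an assumption for illustration, not a real design calculation):

import math

# All values below are assumed, illustrative numbers (not a real design).
T_FLUID = 900.0   # steam temperature inside the pipe, deg F
T_AIR = 40.0      # ambient air temperature, deg F
R_WALL = 0.02     # thermal resistance of the pipe wall, hr*ft2*F/Btu
R_INS = 2.0       # thermal resistance of the insulation, hr*ft2*F/Btu
H_CONV = 1.5      # surface convection coefficient, Btu/(hr*ft2*F)
EPS_SIGMA = 0.9 * 0.1714e-8  # emissivity times Stefan-Boltzmann constant

def rankine(t_f):
    # Convert Fahrenheit to absolute (Rankine) for the radiation term.
    return t_f + 459.67

t_surf = 200.0  # initial guess for the outer insulation surface, deg F
for i in range(200):
    # Heat conducted from the hot fluid out to the guessed surface temperature
    q_cond = (T_FLUID - t_surf) / (R_WALL + R_INS)
    # Heat leaving the surface by convection plus radiation
    q_out = H_CONV * (t_surf - T_AIR) + EPS_SIGMA * (rankine(t_surf)**4 - rankine(T_AIR)**4)
    # Nudge the guess toward balance; declare the answer when the change is tiny
    t_new = t_surf + 0.05 * (q_cond - q_out)
    if abs(t_new - t_surf) < 0.01:
        break
    t_surf = t_new

print(f"Converged in {i + 1} iterations: surface ~{t_surf:.1f} F, loss ~{q_cond:.0f} Btu/hr per ft2")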
Now, extrapolate this simple problem into an atmospheric model, where the atmospheric volume is divided into a grid of thousands of cells, each cell having three dimensions. The heat transferred from one cell to another, in all three dimensions, must match the heat received from other cells.
Temperatures are a measure of heat, so the model (had better) solve for temperature. Further, the properties of the air in a cell change with temperature, as the volume increases when temperature increases. This creates a mass transfer that must be properly calculated and solved iteratively. Further, there are moisture issues, or humidity, with rising vapor from oceans entering the appropriate cells.
There is heat input from the sun, which varies not only by the time of day, but with the seasons as the earth orbits and the axial tilt changes with respect to the sun. The day-night changes in heat input also are important. Clouds act both to reflect daytime sunlight, and as a thermal blanket at night.
The iterations must continue for some time because we are interested in very small changes over very long time periods, so solving to 1 degree C probably is not going to pass muster.
Other considerations are water vapor condensation as rain releases heat in upper cells, and rain falling to the earth that may absorb CO2, or it may evaporate somewhat as the drops fall through less humid air, or it may fall to earth. Also, the water droplets may freeze and fall as sleet or hail, or they may initially form as snow. Albedo effects from snow and ice are of course important.
One can appreciate that this gets fairly complicated, thus the need for a computer to solve all the equations. It may be possible that humans could solve the equations, but the organization and time required would be immense.
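To make the cell-to-cell bookkeeping concrete, here is a deliberately tiny toy in Python: a one-dimensional column of cells exchanging heat by diffusion, with a made-up exchange coefficient. A real GCM does this in three dimensions with vastly more physics; this only shows the conservation constraint that every time step must honor:

import numpy as np

N_CELLS = 20
temps = np.full(N_CELLS, 250.0)  # initial cell temperatures, K (assumed)
temps[0] = 290.0                 # bottom cell held at a fixed surface value
K = 0.2                          # exchange coefficient per step (assumed; below 0.5 for stability)

for step in range(1000):
    flux = K * (temps[1:] - temps[:-1])  # heat flowing between adjacent cells
    temps[:-1] += flux                   # what one cell gains...
    temps[1:] -= flux                    # ...its neighbor loses, so heat is conserved
    temps[0] = 290.0                     # re-apply the surface boundary condition

print(temps.round(1))  # the column slowly relaxes toward the surface temperature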
From my readings on the subject, not all of what I just described is modeled, and there are other important things I have not included. One such thing is CO2 levels increasing with time. Also, methane levels increasing. Also, dust or smoke from desert winds or cooking fires. Also, aerosols from both natural sources, such as volcanoes, and man-made sources. One effect from warming is that frozen methane hydrates may thaw and enter the atmosphere. The recent stories on methane bubbles in the Arctic are on point.
I don’t know if this helps any, but it is how I view these models. I remind myself that these modeling guys may be making heroic efforts and good progress relative to where they were 10 years ago, but they are really kidding themselves, and especially elected officials, with any predictions of the future.
All fine responses. I especially enjoyed Henry Phipps’ commentary on the poor little purple birds. Poor buggers. Just like the spotted owls, they only needed to spread their wings, and fly to happier hunting grounds.
But global warming got ’em first.
FatBigot,
Ah yes, thanks for that link. A good read! And now it makes perfect sense why an opposing view would not be heard. I must have missed that part, if it was indeed reported.
G. Alston:
“That’s the thing. These models were written from the ground up to simulate climate response to CO2. It’s what they do; why they exist. That anyone is surprised when the answer to the question is CO2 still amazes me.”
I’m a patent attorney, so not only do I have an engineering degree, but I review a lot of client material when preparing a patent application. Usually, when someone constructs a model, mathematical or physical, that points to factor X as being a prime cause of effect Y in the model, they also try to quantify how robust that result is if values for other unknown variables in the model are changed. This at least gives you a subjective handle on how well you can count on the modeled results.
Part of me is just as cynical as your response exhibits – primarily because I’ve read through passages of the IPCC reports and see a lot of deceptive wording and hollow conclusions. Nonetheless, I have a day job and don’t have the time (probably not the mathematical expertise) to thoroughly review the published material on the mathematical models of the global warming crowd. I can’t say for certain that the creators of the models have not in fact spent considerable time doing assorted model runs using widely varying values for unknown parameters, including climate sensitivity to CO2, so as to test the robustness of the result that known climate cannot be simulated without a contribution from CO2. Certainly, this would be both a feasible and sensible thing to do.
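A sketch of what such a robustness check might look like in Python (the toy model and parameter ranges are invented for illustration): sweep the poorly known parameters and see whether the headline conclusion survives.

import itertools

# A hypothetical toy model: output depends on one factor of interest (x)
# and two poorly known nuisance parameters (a, b). All values invented.
def toy_model(x, a, b):
    return a * x + b

# Sweep the nuisance parameters over plausible ranges and check whether
# the modeled effect of raising x keeps the same sign and rough magnitude.
effects = []
for a, b in itertools.product((0.5, 1.0, 2.0), (-1.0, 0.0, 1.0)):
    effect = toy_model(2.0, a, b) - toy_model(1.0, a, b)
    effects.append(effect)

print(f"effect of x across {len(effects)} runs: min={min(effects)}, max={max(effects)}")
# A wide spread means the conclusion is fragile; a narrow one, robust.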
Having said that, if, as Phillip B stated, only selected model runs are published, and no one documents the different values used for model parameters during the runs, I have a hard time viewing these models as part of any scientific endeavor – science is all about procedure; documenting what steps were made so that others can reproduce the experiments and see whether consistent results are obtained. In fact, the thoroughness of the procedures used to test the falsity of a hypothesis is what eventually gives credence to the hypothesis, if it survives those tests (think relativity and evolution). If no one can devise an effective test of a hypothesis, or alternatively if no one discloses the complete details of the procedures used to test the hypothesis, how can you possibly evaluate the accuracy of that hypothesis?
A couple of points to keep in mind.
1. Had 2007 been the hottest year on record as the Met Office and Hansen predicted, it would have been cited as proof that the models were correct. Instead, 2007 saw temperatures drop by more than 0.5C and was discounted as “natural variability.”
2. Looking at the graph of the last 30 years, one would have to have a very active imagination to see 6C of warming in the next 90 years.
http://www.woodfortrees.org/plot/uah/from:1978/plot/rss/from:1978/plot/uah/from:1978/trend/plot/rss/from:1978/trend
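Point 2 is simple arithmetic. A back-of-envelope Python check using figures quoted in this discussion (the 1.2-1.6 C/century measured rate from the post above, and the 6 C over 90 years claim being questioned):

# Illustrative arithmetic only, using figures quoted in this discussion.
observed = 1.4 / 100.0   # midpoint of the measured 1.2-1.6 C/century, in C/yr
claimed = 6.0 / 90.0     # rate implied by 6 C of warming over the next 90 years

print(f"observed ~{observed:.3f} C/yr; claimed ~{claimed:.3f} C/yr; "
      f"the claim needs warming ~{claimed / observed:.1f} times faster")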
Defenders of the AGW faith attempt to bury these inconvenient truths in statistical games, but we are not as gullible as they are hoping.
Roger Sowell:
I have no problem with the assertion that computers are practically necessary to solve many real-world iterative problems. What I question is whether the solution you get from the computer can teach you more about the underlying relationships used to construct the model. To take your thermal pipe example, would completing that iterative process, either manually or using a computer, ever teach you more about the laws of convection, conduction, or radiation? Could the results be used to better tell you what the thermal conductivity of the pipe wall is, if it were not possible to measure that thermal conductivity in the real world?
One of the fundamental issues I have with global warming theory is my belief that our inability to measure most of the physical relationships that govern the climate system will always cripple our understanding of the climate. You never let theory outstrip observations. That’s what happened when we modeled the atom like a miniature solar system with electrons orbiting the nucleus like little planets – it may have been a reasonable hypothesis, but when our observational abilities caught up with our imaginations, we found out that electrons didn’t move like that. The Ptolemaic version of the solar system, with everything orbiting the earth, wasn’t put to rest until our observational abilities improved to the point of being able to detect stellar parallax – stars shifting back and forth over the year as the earth moved from one side of the sun to the other.
The utility of modeling a system in more detail than our ability to observe that system, in my mind, is inherently suspect.
kurt,
From my understanding, all the models employ a “forcing,” which is a positive number that says how much T changes when [CO2] changes. The main function of individual models is just how minor perturbations about that trend appear. The prediction of rising T is simply an expression of the assumption that the “forcing” is positive, not zero. So, yes, the output is just the assumption in disguise.
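A caricature of that point in Python (a one-line toy, not how any actual GCM is written; the sensitivity value and CO2 numbers are assumptions for illustration):

import math

SENSITIVITY = 3.0  # hypothetical warming per doubling of CO2, deg C (assumed)

def toy_warming(co2_ppm, baseline_ppm=280.0):
    # Warming scales with the log of the CO2 ratio, times the assumed sensitivity.
    return SENSITIVITY * math.log2(co2_ppm / baseline_ppm)

for co2 in (280, 385, 560):
    print(f"{co2} ppm -> {toy_warming(co2):+.2f} C")

# Set SENSITIVITY to zero and every "projection" flattens: the warming
# trend in the output is the sign of the assumed forcing, nothing more.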
Henry Phipps, I have a question. Were the birds “purple” before or after they flew into the cold north?
Those climate computer models aren’t proper models. This is a model.
Some bright folks are working on golf ball design and how the dimples affect air flow around the ball, and are trying to create accurate computer models of a golf ball in flight. Working out the solution to these equations — even on the fastest personal computers today — is not feasible since it would take more than 15 years of computing time just to get a glimpse of the flow around the golf ball for a fraction of a second.
Now all we have to do is find out all the climate drivers, then plug them into a computer model, hit the “Run” button and sit back and wait…
richard m,
30 years was chosen as “climate” at the end of a 30-year warming period. So if the temperature then decreases at about the same rate for, say, 15 years, it is still a warming trend. Think employment security.
kurt, G. Alston, and others:
You seem to have some misconceptions about the climate models.
1) CO2 sensitivity is not an input parameter in the models. The CO2 sensitivity is determined from the RESULTS of the models.
2) The GISS model itself, along with the inputs and results of their model runs, are freely available at the GISS website. (I don’t know about how open other climate models are.)
3) The climate models were not “written from the ground up to simulate climate response to CO2”; they were written to simulate climate change from ANY forcing – solar, albedo, aerosols, ozone, etc. – not just CO2.
4) Many models have been run both with and without anthropogenic CO2 (there’s a section about it in the IPCC report). Needless to say, the two scenarios yield very different results.
kurt — Part of me is just as cynical as your response exhibits – primarily because I’ve read through passages of the IPCC reports and see a lot of deceptive wording and hollow conclusions.
Cynical? No.
Unless you start off by trying to figure out how GHGs are working, there’s no reason to posit GHG amplifiers.
If you consider that the language of the model outputs is always in the form of “forcings” (amplifiers), you could conclude that the purpose of the model isn’t to ascertain an admixture of varying parameters at all, but to determine how GHGs are working. There’s no need for an invocation of forcings otherwise, and RC wouldn’t be posting that an N% increase of CO2 doesn’t result in the theoretical increase, but more (due to these forcings).
Reed Coray (19:53:28) :
Henry Phipps, I have a question. Were the birds “purple” before or after they flew into the cold north?
Reed, the birds were purple, the science is settled, and we have no time for debate about this.
What IS distressing is that parts of the ornithologists who observed the birds in the blizzard have also turned persistently purple. Looks to be a rather tame Christmas party at the Department of Ornithology this next holiday season.
Best regards, Henry
I came on this thread late, and few may see this, but I’m compelled to add a view on these math/physics models. Over more than 30 years, I wrote business and financial models using linear, reiterative math. Late in my systems career, I was asked to devise a linear math model to emulate a chaotic system. I declined. I can’t. I know no one who can mathematically emulate a chaotic system and produce a reliable result. I know OF no one who can do that.
The atmosphere is chaotic. The oceans are chaotic. The gaseous interface between them is chaotic. Chaos cubed will never be emulated with any accuracy by even hyper-reiterative linear math, even if the modeler could reasonably control the iteration time frames. Nothing more than a targeted estimate range could emerge from even the best linear math models.
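A tiny Python demonstration of the underlying problem, using the textbook logistic map rather than anything from an actual GCM: two starting points differing by one part in a billion lose all resemblance within a few dozen iterations.

# Logistic map in its chaotic regime: x -> r*x*(1-x) with r = 4.
# Two initial conditions differing by 1e-9 diverge completely,
# which is the hallmark of chaotic dynamics.
r = 4.0
a, b = 0.300000000, 0.300000001

for step in range(1, 51):
    a = r * a * (1.0 - a)
    b = r * b * (1.0 - b)
    if step % 10 == 0:
        print(f"step {step:2d}: a = {a:.6f}  b = {b:.6f}  diff = {abs(a - b):.6f}")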
If an AGW advocate can tell us of a genius modeler using chaotic math systems for climate projections, I’ll grovel and apologize. Until then, I regard all of the 21-22 IPCC models as tools producing exactly what they were designed to produce: propaganda data.
deadwood (18:34:45) :
I am curious as to whether one of the modelers has recalibrated his model to the now 10 years of stagnation in warming.
If this has been done, I’ve not heard about it.
If not, then why not? (Other than, of course, it would mean the sensitivity of CO2 would need to be cranked way down!)
http://www.nature.com/nature/journal/v453/n7191/abs/nature06921.html
They added the PDO to their GCM to get
“Our results suggest that global surface temperature may not increase over the next decade, as natural climate variations in the North Atlantic and tropical Pacific temporarily offset the projected anthropogenic warming.”
It is called “moving the goal posts.” Or, “the pot of gold is always at the end of the rainbow.”
Several prominent figures have said that ancient religious memes are replaying themselves in a modern context. Al Gore is a modern prophet (and as usual the prophecies don’t come true), global warming is a Hell for punishing carbon sinners while global cooling/Greening is Heaven and a return to Eden, carbon credits are Indulgences, scientists in their lab coats are today’s robed priests, in the media figures like George Monbiot are the equivalent of raving mad preachers like Savonarola, and the UN is the new Vatican for the faithful who want their creed to be universal.
Nostradamus used to read the future by looking into a bowl of water. None of his prophecies have come true (he even gave one a specific date in 1999), but half a millennium on he still has believers.
That’s what computer models are like to AGW faithful – necromancy.
In the US it should be considered unconstitutional for government to be involved with a New Age religion.
kurt,
“Could the results be used to better tell you what the thermal conductivity of the pipe wall is, if it were not possible to meause that thermal conductivity in the real world?”
Actually, I believe the answer to that is yes, but only if one is able to measure the outer wall temperature, that is, the edge of the insulation exposed to the atmosphere. If one is unable to measure that outer wall temperature, the problem becomes indeterminate. There would be too many unknown variables, and not enough equations.
Part of the GCMs’ problem is no one can truly test or verify the models, as we cannot easily add 100 ppm of CO2 to the atmosphere without waiting for a good many years, and we cannot cut 50 ppm CO2 from the atmosphere very easily, if at all. These changes in CO2 are the step-tests that should be conducted to determine the physical responses to a change in a control system, and thereby verify the models. Therefore, every modeler is guessing, based on some estimates of how laboratory science scales up in the large atmosphere. I refer, of course, to CO2 absorption of certain wavelengths of infra-red radiation from the earth’s surface, and subsequent re-radiation.
Similarly, it is difficult if not impossible to perform step-tests for albedo changes (zero polar ice), cloud cover (from zero cloud cover to 50 percent of the earth), volcanic gaseous and particulate emissions (like babies, they keep their own schedules), and so on.
The skeptics have devastating claims; for example, Steven Goddard, among others, points to the evidence that none of the GCMs predicted the current cooling trend reported by the four major temperature measurements. Such cooling should be impossible with increasing levels of CO2, if the GCMs are correct. Cooling is predicted from major volcanoes and from decreases in CO2, neither of which occurred.
This fact alone should be hammered home to the elected officials who are drafting climate change legislation.
When the elementary particle physics community comes up with a number, for example “there are 3 neutrinos,” from a fit to the data, it uses computer programming and modeling intensively. The difference with “there is a 3 C rise in 100 years” coming from the GCMs is the error bars.
Errors are strictly propagated in all particle physics modeling, both statistical and systematic, and all numbers come out with an error: +/- statistical, +/- systematic.
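For readers unfamiliar with the convention being described: independent uncertainties are quoted separately and combined in quadrature. A minimal Python sketch with made-up numbers:

import math

value = 2.984   # hypothetical fitted result (illustrative numbers only)
stat = 0.008    # statistical uncertainty from the fit
syst = 0.006    # systematic uncertainty from apparatus and model choices

# Independent error sources combine in quadrature (root of the sum of squares).
total = math.sqrt(stat ** 2 + syst ** 2)
print(f"{value} +/- {stat} (stat) +/- {syst} (syst); combined +/- {total:.3f}")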
No error propagation exists in the IPCC model outputs that I can find in the voluminous reports. This denotes that any numbers given are meaningless, sleight of hand, video game presentations.
What is done instead, to fool the scientific audience:
1) The model is fitted to the temperature data. There are so many parameters that this is not difficult to do (von Neumann and his elephant; see the sketch after this list).
2) In order to simulate the chaotic dynamics of climate (they do acknowledge that climate is chaotic), they change the initial conditions as they feel like and create spaghetti lines around the optimum fit, believing they simulate chaos. The deviations from the fit they treat as errors. BUT the variations are not 1-sigma variations giving a chi**2 per degree of freedom; they are just to please the eye of the beholder, because no such chi**2 is reported anywhere.
For example, a 1-sigma variation of the albedo alone would throw off the curves by 1 C (try the toy model over at junkscience.com).
3) Then many models, which means different models of similar structure and assumptions, are all put together on a spaghetti graph, again claiming the width as representative of errors, even though the only “valid” argument is “chaotic spread,” including the chaotic brain waves of the modelers. What amazes me is that statisticians (in other blogs) are treating these spaghetti projection widths as if they were true errors, and discussing the number of angels on the point of a needle.
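On point 1, a quick Python illustration of von Neumann’s complaint, on synthetic data (everything here is made up for illustration): give a fit enough free parameters and it will hug any record, which says nothing about its predictive power.

import numpy as np
from numpy.polynomial import Polynomial

# Synthetic "temperature record": a gentle trend plus random noise.
rng = np.random.default_rng(0)
years = np.arange(30.0)
data = 0.02 * years + rng.normal(0.0, 0.1, years.size)

# Fit with ever more free parameters (polynomial of rising degree).
for degree in (1, 5, 15):
    fit = Polynomial.fit(years, data, degree)
    resid = data - fit(years)
    rms = np.sqrt(np.mean(resid ** 2))
    print(f"{degree + 1:2d} parameters: residual RMS = {rms:.4f}")

# The many-parameter fit matches the noise almost perfectly, yet it
# encodes no physics: agreement with the record is not validation.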
A second blow against the GCMs is what a lot of people with mathematical physics backgrounds have been trying to say, but cannot get through. The grid model of the earth cannot simulate the interdependencies of the numerous coupled nonlinear differential equations that enter the climate problem (a classical dynamic chaos problem). By construction, the grid with the average values assumed in these huge blocks presumes that the solutions to these differential equations are well behaved, so that the first-order terms can be used for most of the variables entering (the average is the first-order term in an expansion of any function). It is all but inevitable that this hypothesis will fail, because the solutions are highly nonlinear. That is why, for weather and climate, GCMs can only work for a limited number of time steps. After a while, the real solution diverges as the higher-order terms kick in.
The only useful modeling for the future has to be a model that incorporates these nonlinearities, as Tsonis et al. have attempted.
All I can say is AMEN. As an engineer who has done incredibly simple mechanical models covering minutes of simulated time, observing the errors involved in even the simplest and shortest of runs, it strikes me as amazingly arrogant that climate modellers think they have an accurate climate model for 100 years out. The models can help you do many useful things, but they are not a crystal ball; they are not a soothsaying proxy.
The Tsonis link:
http://www.uwm.edu/~aatsonis/2007GL030288.pdf
Also, at http://www.uwm.edu/~aatsonis/ there are expositions of the connection between chaos and climate.
It seems most commentators on this site have the mistaken impression that a model is only useful if it can provide accurate predictions. However, they are missing an important aspect of the modeling endeavor. In many cases the purpose of a model is not so much to make accurate predictions as to provide a means to test the basic assumptions upon which the model is based. In short, a model provides a means to relate one set of observables to another set of observables and to understand these relationships. The model also provides a means to test the limits of these relationships (or assumptions) and to determine when they break down.
In this sense much of the climate modeling work has been hugely successful in that scientists have been able to rule out some unimportant relationships and have greatly expanded our understanding of both climate and weather. They have also been able to test various scenarios and make predictions. Of course if some of the underlying assumptions are incorrect their predictions will be incorrect, but this is still a valuable exercise in that it provides a way to evaluate their assumptions and later analyze which of those assumptions broke down.
Unfortunately, most people don’t grasp the fundamental value of the modeling exercise in expanding our understanding of a phenomenon. Because of this, there is tremendous pressure on modelers to produce predictive models. Unfortunately, this can often lead the less careful modelers to overestimate and overstate the abilities of their models and to overlook or even hide known flaws. Furthermore, the situation becomes exacerbated when the media or government officials take modeling results out of context and try to formulate policy based on a hypothetical prediction.
The bottom line is that models are extremely valuable tools even when they are not predictive. However, it is critical to understand their limits and to be very cautious when applying a model’s results outside of these limits.
I think many climate scientists are doing a great job in developing their models. However, it seems to me that some experts in the field are a bit too confident, especially when we all know there are severe limitations for modeling so many incredibly important phenomena, such as cloud formation, humidity, aerosols, etc…
Hi. First off I have to admit that I haven’t read all of the comments to your article, so if someone else has already stated this, well then I obviously agree with them!!
‘Climate Change’ has been the international buzzword for what feels like a long time. As far back as I can remember, most of my geography lessons were taken up by the ‘man-made’ phenomena of acid rain, floods caused by the diversion of rivers, massive storms destroying thousands upon thousands of acres of land and, recently, global warming. It seems to me that a majority of this is reported on a popularity basis. Since the world economy went into meltdown I have only seen one report on global warming, and that was simply in response to the presence of a few centimetres of snow falling on a pitifully unprepared England. The headlines themselves are, in my opinion, little more than designed to shock and scare people, proclaiming in big scary letters precisely how many people will drown if (and apparently when) sea levels rise.
These climate models that are spewing out all these predictions about the doom of the earth at the hands of an over-industrialised population should not be taken as an absolute truth, as your article points out. The technology is not nearly accurate enough, and while it is no doubt very complicated, it seems to still be at least a few years away from being useful to the required extent. Not that I am discounting it completely! I just believe that, like all sciences (including my own branch of science, Ethology), it needs to be trialled, analysed, reviewed, altered, trialled, analysed, reviewed and so on. I think this technology can be perfected; I just think that is a little way off yet.
My point in brief: just because the weather map says it should be raining on your house, and when you look there is indeed rain falling from the sky, doesn’t mean the weather map is accurate for everyone.