Forecasting The Arctic Oscillation

Recently, the chief of the UK Met Office went on television to say:

“Our short term forecasts are among the best in the world.”

Yesterday, the UK Met Office made a rare mea culpa, admitting they had botched their own recent snow forecast. It is worth pointing out that they aren’t the only ones with egg on their faces.

http://wattsupwiththat.files.wordpress.com/2008/09/met_office_forecast_computer-520.jpg

In early October 2009, the Arctic Oscillation (AO) took an unexpected dip deep into negative territory, which led to the sixth snowiest October on record in the Northern Hemisphere and the snowiest on record in the US. If you look at the 14-day forecast at the bottom of the graph below, you can see that the dip caught NOAA forecasters off guard.

Source: NOAA Arctic Oscillation Forecast

According to the Rutgers University Snow Lab, October 2009 was the snowiest October on record in the US.

Contiguous United States

Month     Rank      Area  Departure   Mean
12-2009   1/44      4161       1292   2869
11-2009   39/44      585       -512   1097
10-2009   1/42       538        385    153
9-2009    5/41        21         13      8
8-2009    12-41/41     0         -5      5
7-2009    24-40/40     0        -17     17
6-2009    32-42/42     0        -64     64
5-2009    37/43        34      -151    185
4-2009    17/43       859       106    753
3-2009    23/43      1964       -18   1983
2-2009    17/43      3172       110   3062
1-2009    15/43      3696       185   3511

Source: Rutgers University Snow Lab

NCAR’s Kevin Trenberth captured the moment perfectly in this East Anglia email, dated October 12.

From: Kevin Trenberth <trenbert@xxxxxxxxx.xxx>

To: Michael Mann <mann@xxxxxxxxx.xxx>

Subject: Re: BBC U-turn on climate

Date: Mon, 12 Oct 2009 08:57:37 -0600

Cc: Stephen H Schneider <shs@xxxxxxxxx.xxx>, Myles Allen <allen@xxxxxxxxx.xxx>, peter stott <peter.stott@xxxxxxxxx.xxx>, “Philip D. Jones” <p.jones@xxxxxxxxx.xxx>, Benjamin Santer <santer1@xxxxxxxxx.xxx>, Tom Wigley <wigley@xxxxxxxxx.xxx>, Thomas R Karl <Thomas.R.Karl@xxxxxxxxx.xxx>, Gavin Schmidt <gschmidt@xxxxxxxxx.xxx>, James Hansen <jhansen@xxxxxxxxx.xxx>, Michael Oppenheimer <omichael@xxxxxxxxx.xxx>

Hi all

Well I have my own article on where the heck is global warming? We are asking that here in Boulder where we have broken records the past two days for the coldest days on record. We had 4 inches of snow. The high the last 2 days was below 30F and the normal is 69F, and it smashed the previous records for these days by 10F. The low was about 18F and also a record low, well below the previous record low. This is January weather (see the Rockies baseball playoff game was canceled on saturday and then played last night in below freezing weather).

Trenberth, K. E., 2009: An imperative for climate change planning: tracking Earth’s global energy. Current Opinion in Environmental Sustainability, 1, 19-27, doi:10.1016/j.cosust.2009.06.001. [1][PDF] (A PDF of the published version can be obtained from the author.)

The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t.

http://www.eastangliaemails.com/emails.php?eid=1048&filename=1255352257.txt

Once again, this begs the question – if the GCMs can’t forecast the AO two weeks in advance, how can they possibly forecast snow and cold 70 years in advance? University of Colorado professor Mark Williams used climate models in 2008 to come up with the remarkable prediction below, in a year when Aspen broke its snowfall record.

Study: Climate change may force skiers uphill

From the Associated Press

Tuesday, December 16, 2008

DENVER — A study of two Rocky Mountain ski resorts says climate change will mean shorter seasons and less snow on lower slopes.

The study by two Colorado researchers says Aspen Mountain in Colorado and Park City in Utah will see dramatic changes even with a reduction in carbon emissions, which fuel climate change.

University of Colorado-Boulder geography professor Mark Williams said Monday that the resorts should be in fairly good shape the next 25 years, but after that there will be less snowpack — or no snow at all — at the base areas, and the season will be shorter because snow will accumulate later and melt earlier.

If carbon emissions increase, the average temperature at Park City will be 10.4 degrees warmer by 2100, and there likely will be no snowpack, according to the study. Skiing at Aspen, with an average temperature 8.6 degrees higher than now, will be marginal.

Since the first of October, Colorado has averaged two to eight degrees below normal, as has most of the US:

http://www.hprcc.unl.edu/products/maps/acis/WaterTDeptUS.png

Source: NOAA High Plains Regional Climate Center

In December 2009, Colorado averaged three to fifteen degrees below normal, once again correlating with a strongly negative Arctic Oscillation.

http://www.hprcc.unl.edu/products/maps/acis/hprcc/Dec09TDeptHPRCC.png

Source: NOAA High Plains Regional Climate Center

Climate models are iterative through time, which means that once they go off into the weeds they cannot recover. If AO trends cannot be forecast more than a few days in advance, it seems problematic to make any sort of meaningful long-term climate projection using GCMs.
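The “off into the weeds” behaviour of an iterative model has a standard minimal illustration. The sketch below is hypothetical and deliberately tiny (it is the classic Lorenz-63 toy system stepped with simple forward Euler; the step size and parameter values are assumptions chosen for brevity, and no actual GCM works this crudely): two runs started one part in a billion apart end up in completely different states.

```python
# Minimal illustration (not a GCM): the Lorenz-63 toy system integrated with
# forward Euler. Two runs that start one part in a billion apart become
# completely different states within the simulated window.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 20.0])
b = a + np.array([1e-9, 0.0, 0.0])  # tiny initial-condition error

for _ in range(3000):               # 3000 steps of "model time"
    a, b = lorenz_step(a), lorenz_step(b)

print(np.abs(a - b).max())  # separation on the order of the attractor, not 1e-9
```

Shrinking the initial error only delays the divergence, it never prevents it, which is why initial-condition forecasts such as the AO outlook have a hard horizon.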

177 Comments
tallbloke
January 16, 2010 4:56 am

vukcevic (03:18:35) :
Yes, but that’s not how the climate models work. Which is the point at issue.

Frank K.
January 16, 2010 5:15 am

If you look at the way most GCMs are formulated numerically, they are very similar to weather models in that they start with an initial condition and integrate the governing equations (subject to user-defined unsteady boundary conditions) using small time steps (I believe on the order of 30 minutes per time step). And like the weather models, the governing differential equations are very non-linear, which means stable solutions on any time scale are not guaranteed. Moreover, all time-dependent numerical schemes are subject to numerical errors called truncation errors, which arise because the numerical representation (discretization) of the equations is not exact. This error accumulates with each time step and can eventually swamp the desired solution over long periods of time integration. This is why 100-year GCM simulations should, at best, be taken with a grain of salt.
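Frank K.’s truncation-error point can be demonstrated on the simplest possible equation. This is a toy illustration, not climate-model code: forward Euler applied to dy/dt = y, whose exact solution e^t is known, so the accumulated error can be measured directly.

```python
# Toy demonstration of truncation-error accumulation: forward Euler on
# dy/dt = y (exact solution e^t). The same step size gives a larger
# relative error the longer the integration runs.
import math

def euler(t_end, dt=0.01):
    y = 1.0
    for _ in range(round(t_end / dt)):
        y += dt * y  # one explicit time step; the local error is O(dt^2)
    return y

for t_end in (1.0, 5.0, 10.0):
    rel_err = abs(euler(t_end) - math.exp(t_end)) / math.exp(t_end)
    print(t_end, rel_err)  # relative error grows with the integration horizon
```

A real GCM cannot be checked against an exact solution this way, which is part of the complaint above.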
Now, Leif and others have argued that, while we can’t predict precisely what the average surface temperatures will be in the year 2100, we can say that it is likely they will be higher than today. If that is the question we want answered, then it doesn’t take a GCM to tell you that – you can get that result from much simpler models! Unfortunately, GCMs today are being misused as predictive tools as shown in the article Anthony cited:
“If carbon emissions increase, the average temperature at Park City will be 10.4 degrees warmer by 2100, and there likely will be no snowpack, according to the study. Skiing at Aspen, with an average temperature 8.6 degrees higher than now, will be marginal.”
Does anyone believe that we can predict that Park City will be 10.4 degrees (I assume F) warmer by 2100 than today?? I don’t. These kinds of press releases are being done for two reasons: (1) to publicize the research and therefore justify the expenditure of thousands of research dollars on the author’s NSF or DOE research grant, and (2) to attempt to influence near-term public policy.

DirkH
January 16, 2010 5:35 am

One remark about the iterative nature of climate models and “once they’re off track they just go into the weeds”. Some people here found that too much of a generalization.
In fact, assuming a naturally self-stabilizing climate (with negative feedbacks), even a faulty model will return to a normal state eventually, but it will be arbitrarily out of sync with events in reality.
As the IPCC climatologists assume only positive feedbacks, tipping points and runaway greenhouse effects (this is necessary in their models to reach doomsday efficiently, and that’s necessary to keep the funding going), their models will not stabilize (they’re precariously balanced and will fall off the cliff).
So I see two possible kinds of climate models, which are both useless in the long run but for different reasons.
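DirkH’s stability argument reduces to the sign and size of the net feedback in a one-line recurrence. This is a deliberately crude sketch; the linear feedback factor is an invented illustration, not anything from a published model.

```python
# A one-parameter caricature of feedback: T[n+1] = f * T[n] + forcing.
# Net-negative feedback (|f| < 1) settles to a fixed point; net-positive
# feedback (f > 1) runs away, as in a "tipping point" narrative.
def iterate(feedback, forcing=1.0, steps=50):
    t = 0.0
    for _ in range(steps):
        t = feedback * t + forcing
    return t

print(iterate(0.5))  # settles near forcing / (1 - f) = 2.0
print(iterate(1.1))  # grows without bound
```

In the stable case the fixed point is reached regardless of where the iteration starts, which is exactly DirkH’s "returns to a normal state but out of sync" scenario.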

Sean
January 16, 2010 5:47 am

It is with great trepidation that I weigh in on this discussion but here goes.
I’ve often pointed out on this blog that the Met Office uses GCMs to make their seasonal forecasts in the UK, and their poor track record over the last 3 years calls into question the credibility of GCMs in general to make long term predictions. I also understand Leif’s point that with something as noisy (variable) as regional weather, a model that integrates over decades might produce more accurate trends over the long term. What I don’t understand is, after 3 years of missed forecasts, why a more reliable method is not used instead. Meteorologists who look at the way the oceans are set up going into the spring or fall are able to make much better predictions, albeit not perfect, for the upcoming season.
The second point I want to make regards the longer term forecasts using GCMs, or any other models for that matter. The long term trends rely on very accurately accounting for the energy budget of the planet and integrating it over time. By their own admission, the models cannot really do clouds well, and the uncertainty in the radiation budget caused by clouds is at least as great as or greater than the increased heat retention from CO2 and other greenhouse gases. That being the case, how can anyone profess 90% certainty going forward about temperature trends? A lot more progress in climate science could be had with a little humility and more teamwork between the climate change factions.

Steve M. from TN
January 16, 2010 5:54 am

I’m throwing the BS flag! In my area of the country, they’re showing a -2F anomaly for 1-14 Jan. On only 3 of the 14 days did the HIGH temperature make it over the average of 35 for this area. Using the Min/Max temperatures from weather.com for that period, the average temperature here was 21.5F. That is 14F below normal.

lgl
January 16, 2010 5:54 am

Richard Holle (02:17:43) :
“if I pull out the past three patterns of 6558 days, or 240 lunar declinational cycles, and plot them side by side, day by day, they show a resultant pattern that gives a better forecast than any of the models.”
Show us.

January 16, 2010 6:03 am

As Baa Humbug mentioned, Piers Corbyn claims to have a better record than, for instance, the Met Office. He recently took them to task, claiming that his predictions would have prevented England from running out of grit.
See http://www.weatheraction.com/docs/WANews10No5.pdf But please don’t forget my barycenter hero, Dr. Theodor L[snip]t, who predicted years ago, before he died, Little Ice Age conditions around 2030. See
http://www.iceagenow.com/New%20Little%20Ice%20Age.htm

matt v.
January 16, 2010 6:16 am

What are experts predicting about the future climate? I am posting what individuals have said, and in the second post, what various organizations are saying. I don’t know if any are predicting AO levels as the key indicator.
William M Gray, Professor Emeritus, Dept of Atmospheric sciences, Colorado State University
“A weak global cooling began from the mid-1940’s and lasted until mid-1970’s. I predict this is what we will see in the next few decades”
http://tropical.atmos.colostate.edu/Includes/Documents/Publications/gray2009.pdf
Don Easterbrook, Professor Emeritus, Dept of Geology, Western Washington University.
“Setting up of the PDO cold phase assures global cooling for next approx. 30 years. Global warming is over. Expect 30 years of global cooling, perhaps severe [2-5 Degrees F]”
He predicts several cooling scenarios
The first is similar to 1945-1977 trends, the second is similar to 1880-1915 trends and the third is similar to 1790-1820 trends.
http://www.heartland.org/bin/media/newyork09/PowerPoint/Don_Easterbrook.ppt#630,38,Projected global temp to 2100
and
http://www.heartland.org/bin/media/newyork09/PowerPoint/Don_Easterbrook.ppt#608,49,Implications
Syun Akasofu, Professor of Geophysics, Emeritus , University of Alaska, also founding director of ARC
He predicts the current pattern of temperature increase of 0.5C /100 years resulting from natural causes will continue with alternating cooling as well as warming phases. He shows cooling for the next cycle until about 2030/ 2040.
http://www.heartland.org/bin/media/newyork09/PowerPoint/Syun_Akasofu.ppt#524,30,Slide%2030
Mojib Latif, Professor, Kiel University, Germany
He makes a prediction for one decade, namely the next decade [2009-2019], and he basically shows the global average temperature declining to a range of about 14.18 C to 14.28 C, from 14.39 C in 2008.
He also said that you may well enter a decade or two of cooling relative to the present temperature level; however, he did not indicate when any two decades of cooling would happen, or whether the second decade after the next would be a cooling one.
In other words, he is predicting cooling for the next decade and a possibility of a second as well.
http://www.wcc3.org/sessions.php?session_list=PS-3
Noel Keenlyside, Dr., from the Leibniz Institute of Marine Sciences at Kiel University.
Quote from BBC article
The Earth’s temperature may stay roughly the same for a decade, as natural climate cycles enter a cooling phase, scientists have predicted.
A new computer model developed by German researchers, reported in the journal Nature, suggests the cooling will counter greenhouse warming.
http://news.bbc.co.uk/2/hi/science/nature/7376301.stm
Anastasios Tsonis, Professor and Head of the Atmospheric Sciences Group, University of Wisconsin, US
“We have such a change now and can therefore expect 20 -30 years of cooler temperatures”
http://www.dailymail.co.uk/sciencetech/article-1242011/DAVID-ROSE-The-mini-ice-age-starts-here.html

anna v
January 16, 2010 6:18 am

Richard Holle (02:17:43) : | Reply w/ Link
My reply: I am only suggesting that we study the effects of the Lunar declinational tides in the atmosphere, because if I pull out the past three patterns of 6558 days, or 240 lunar declinational cycles, and plot them side by side, day by day, they show a resultant pattern that gives a better forecast than any of the models.
With compensation for the changes in the solar cycles, from the past three patterns of high activity, to this one of lower activity I would get the correct amount of cooling, and better representation of the depth, of the cold arctic air invasions that are arriving on time, just larger than before.

Piers Corbyn’s secret weather method uses the moon and sun, as he has explained in several videos, and he seems to be quite successful in long term weather predictions, so you may be right.
On the other hand, when many dynamical inputs enter, and such is the case of earth climate, it is chaos tools that should be used.
anecdotal: in my part of the earth, Greece, the moon phases are traditionally used by sailors and farmers to “predict the weather” as follows: If the wind/clouds/etc change with the moon phase, expect the same weather to the end of the phase. If it does not change then, it will keep the same through the next phase.

matt v.
January 16, 2010 6:20 am

Here is what various organizations are saying about the future climate.
Met Office, UK
“Even then, due to natural variations in climate, we expect to see 10 year periods both globally and regionally with little or no warming….”
“We found about one in every eight decades has near or negative global temperature trends
Met office decadal forecast predicts renewed warming in 2010 with about half the years to 2015 likely to be warmer globally than the current warmest year on record [1998?]”
They are also predicting global temperatures to rise by 4C by 2060 which translates to about 0.08C per year or 10 times faster than the last decade or last century [0.007 C/year]
http://www.metoffice.gov.uk/corporate/pressoffice/2009/pr20090914.html
Hadley Center –Met Office
The effects of global warming over the coming decades will be modified by shorter-term climate variability.
These three possible trends of winter temperature in northern Europe from 1996 to 2050 were simulated by a climate model using three different (but plausible) initial states. The choice of initial state crucially affects how natural climate variations evolve on a timescale of decades. But as we zoom out to longer timescales, the warming trend from greenhouse gases begins to dominate, and the initial state becomes less important. Keenlyside and colleagues use observations of the sea surface temperature to set the initial state of their model. Their results indicate that, over the coming decade, natural climate variability may counteract the underlying warming trend in some regions around the North Atlantic. (Figure courtesy of A. Pardaens, Met Office Hadley Centre.)
http://www.nature.com/nature/journal/v453/n7191/fig_tab/453043a_F1.html#figure-title
CRU
I am not aware of any published forecasts by CRU.
Recently released e-mails from the CRU “climategate” show the IPCC scientists saying that, “we can’t account for the lack of warming at the moment and it is a travesty that we can’t”.
IPCC
They predicted no cooling but called for global temperatures to rise by 0.21C in each of the next two decades. They also made various climate rise projections ranging from a median of about 2-4 C to worst case option of up to 6C by 2100.
http://www.heartland.org/bin/media/newyork09/PowerPoint/Syun_Akasofu.ppt#524,30,Slide%2030
FORECASTS BASED ON SOLAR CYCLES
There are at least half a dozen climate forecasts, all predicting global cooling based on the next 2 solar cycles being significantly lower in terms of sunspot activity. They are not included here, to keep this narrative brief.

Roger Knights
January 16, 2010 6:20 am

Leif Svalgaard (00:26:32) :
I don’t believe the current models are correct [too many things are parameterized], but my beef was with the bland assertion that because we cannot predict two weeks ahead, all prediction further out is impossible.

I think it is a weak argument to make the simple claim, so often heard here, that unpredictable short-term wiggliness means that no long-term trend can be foreseen. And remember, it isn’t the fallible models that are predicting a warming trend, but “simple physics,” supposedly. I.e., if more insulation is added to the earth, it will lose heat at a slower rate, effectively raising its thermostat, resulting in a long-term heating-up.
The proper counter to that argument is not to say that short-term predictions can’t be made reliably, but that there are potential negative feedbacks involved that will offset this extra insulation, and/or that the amount of insulation that will be added by increasing CO2 is tiny, given a real-world absence of postulated positive feedbacks.
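The “simple physics” argument, and where the contested feedbacks enter it, fits in a few lines. The sketch below is the standard zero-dimensional back-of-envelope form, not anyone’s actual model; the sensitivity parameter `lam` is precisely the number the feedback debate is about, and both values used for it here are illustrative assumptions.

```python
# Back-of-envelope warming estimate: dT = lambda * dF, using the widely
# quoted simplified CO2 forcing expression dF = 5.35 * ln(C/C0) in W/m^2.
import math

def delta_T(c_ratio, lam):
    dF = 5.35 * math.log(c_ratio)  # radiative forcing for a CO2 ratio C/C0
    return lam * dF                # lam (K per W/m^2) encodes all feedbacks

for lam in (0.3, 0.8):  # roughly: weak-feedback vs strong-positive-feedback
    print(lam, round(delta_T(2.0, lam), 2))  # warming for doubled CO2
```

The forcing term itself is barely disputed in this thread; nearly the whole argument is over which `lam` the real system exhibits, which is Roger’s point about negative feedbacks.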

Frederick Michael
January 16, 2010 6:27 am

Leif is right, but maybe a better example is that you can accurately predict the mean of a random variable (heck, you can KNOW it — in a simulation) and yet an individual sample would still be highly uncertain.
Any time you look at a long term statistic, such as climate, many short term fluctuations average out. There are many things that flop around a lot but have a predictable mean. Quantum mechanics relies on this.
So, bad short term predictions do not NECESSARILY undermine the long term predictions. However, they can be evidence that the forecaster misrepresents his level of confidence in predictions. This WOULD necessarily undermine any statements about his level of certainty for other predictions.
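Frederick Michael’s distinction between a knowable mean and an unknowable individual sample is easy to demonstrate with purely illustrative numbers:

```python
# The mean of many noisy draws is tightly constrained even though any single
# draw is not: 100,000 samples from a normal distribution N(0, 10).
import random

random.seed(0)  # fixed seed so the run is reproducible
samples = [random.gauss(0.0, 10.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)

print(round(mean, 2))        # close to the true mean of 0 (std error ~0.03)
print(round(samples[0], 2))  # a single draw lands anywhere within a few sigma
```

The catch, as the comment notes, is that this only helps if the forecaster’s stated uncertainty for the individual samples was honest in the first place.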

Veronica
January 16, 2010 6:33 am

jgfox
“Yes, it may be impossible to exactly predict tomorrow’s sports outcomes, but you can easily predict the outcomes of sports events decades away when you use computers and software as powerful as those used at the omniscient Met Office. ”
bad analogy. Sports wins are neither progressive nor cumulative. A short inter-season build-up of “Win” can be overturned by the loss of a star player or the purchase of your club by a Russian oligarch.
The point of supposed climate change is that it piles up. More than that in fact, that positive feedback makes it accelerate. Therefore trends should be a darn sight easier to spot than whether Man Utd is going to win the cup.
However I wonder if we switched off the Hadley supercomputer and let Exeter cool down, would we provoke a new ice age?

Eric (skeptic)
January 16, 2010 6:33 am

tucker (04:46:51) : “Regarding GCM’s: Does a failure in predictive quality in the short term (ten years) require that the long term (50+ years) predictive quality must also fail due to propagation of the short term actual results??”
No. The short term failures are due to chaotic influences of small changes in initial conditions. The long term is not a propagation of multiple short terms, but it is constrained by somewhat unknown forcings (e.g. solar influences) and completely unpredictable events (e.g. volcanoes). The long term may also fail due to model error; for example, the model may not properly simulate tropical convection, typically due to lack of resolution.
But a useful model should converge to cycles over large scales of time and space which should include the potentialities of the recent shifts in AO. IOW, there is no reason a model run could not result in a -6 std dev like we saw in reality. But in no model would that be “predictable” in any sense of the word except one: they would occur with the frequency seen in reality.
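Eric’s closing distinction (that extremes should occur with the right frequency, not at predictable times) can be sketched with a toy autocorrelated index. The AR(1) form and every number below are assumptions for illustration, not a model of the actual AO:

```python
# A toy AR(1) "index": no individual dip below the threshold is predictable,
# but the long-run fraction of time spent below it is stable across runs.
import random

def frac_below(seed, threshold=-1.5, n=200_000, phi=0.9):
    rng = random.Random(seed)
    sd_innov = (1 - phi ** 2) ** 0.5  # keeps the stationary variance at 1
    x, count = 0.0, 0
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sd_innov)
        if x < threshold:
            count += 1
    return count / n

print(frac_below(1), frac_below(2))  # different realizations, similar fractions
```

Two seeds disagree completely about when the dips happen yet agree closely on how often, which is the only sense in which such a model “predicts” extreme excursions.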

tmtisfree
January 16, 2010 7:04 am

Tom Vonk once put it in perspective rather nicely:
“In the beginning people thought of the climate as a deterministic system. The climate trajectory was supposed to be computable and predictable and inaccuracies were only due to the lack of computing power and crudeness of the parametrizations.
After having multiplied the computing power by 100 – 150 in the last 10 years and the time spent on parametrizations by a similar factor, the models are still as inaccurate as they were, and what has increased is only the confidence that they indeed are inaccurate.
This in itself is a powerful signal that there is a very fundamental error somewhere in the approach.
If something like that had happened in a more serious scientific branch (like high energy physics f.e.), people would have dropped the wrong methodology already long ago.
The approach begins to slightly change now.
First as there can be no trust in any individual model, the “ensemble theory” has been invented according to which every model gets “something” right (but nobody knows what) and something wrong (everything else).
By averaging the model results, the wrongness cancels out or at least reduces but the rightness stays.
This theory seems to me silly and based on no serious physics.
Second, the modellers grudgingly abandoned determinism and try to heal the problem by ergodicity.
[Gavin] Schmidt even says that their models are “chaotic” showing hereby that he doesn’t know what chaos is.
What they try in fact is to handle the climate with statistics – while the evolution of every individual parameter that constitutes the climate can’t be computed and predicted, the AVERAGES (time and/or space) of the said parameters are robust and significant while any difference between a realisation of the parameter and its average obeys some statistical law.
That is the theory in which Realisation = Climate + Noise.
It is analogous to Kolmogorov turbulence theory and that’s why I guess Schmidt is calling that “chaos”.
Of course any analogy stops here because the assumptions taken by Kolmogorov (homogeneity and isotropy) that give sense to his theory are absent from the climate theory.
And of course, not surprisingly, according to D. Koutsoyannis [“On the credibility of climate predictions”, 2008] and other work that begin to appear now the “healing” of the models by salvaging at least the time averages, fails too.
What is left […] is the deterministic chaos.
The best example and analogy is the solar system problem.
It is clearly deterministic and even not very complex because there are only a few bodies and a few O[rdinary ]D[ifferential ]E[quation].
Now it happens that it behaves like the Lorenz system – the trajectories of the bodies are not predictable.
While a trajectory is always computable by numerically solving the ODE for an arbitrary time period, this computed trajectory (that Dan Hughes would call “just a series of numbers”) has little to do with the real trajectory.
The system is not stochastic either – asking about probabilities of trajectory eccentricity or time averages of distances (Body A – Body B) makes no sense.
Making N runs (N large) with varying initial conditions and varying time periods will give some insights about what the system MIGHT do but no insight at all about what it WILL do or indeed with what probability it MIGHT do this or that.
And the differences between trajectories are not just small numbers – it may be so dramatic as the difference between turning a circle 500 millions years and definitely leaving the solar system.
Fortunately there is at least ONE question that can be asked of such a system and that is the one of stability.
We may ask after having observed the system for a certain time, are the trajectories stable (e.g. will there not be a catastrophic divergence)?
There are mathematical tools for that like f.e. the KAM theorem [Kolmogorov-Arnold-Moser theorem].
Of course it is more than probable that the climate system is nowhere near to an integrable Hamiltonian system where KAM would apply but similar approaches could be attempted.
As there will be more and more results like those of D. Koutsoyannis, I am convinced that people will do one day the last step to be made.
Namely that the climate is neither deterministic nor ergodic.
1) The trajectories unpredictably evolve between different quasi steady states. The causality is not clear cut and for instance there is an infinity of different initial conditions that lead to the same final state.
2) There is no particular time scale at which the system is more stable or predictable (e.g. yearly averages don’t behave “better” than hourly averages).
3) There is no statistical law describing the distances between 2 different trajectories and no probabilities of achievement of the different quasi steady states.
4) Observation of the last 3 billion years shows that the envelope of the possible trajectories is bounded, so the system is stable.
5) The results above are independent of the computing power and the size of integration steps.
6) The variation of a single parameter (f.e. CO² concentration) may lead to wide range of different final states and symmetrically those states can be reached without any variation of this parameter (CO² concentration).
Then and only then people will stop bothering about CO² because they will understand that all kind of unexpected things happen and will happen regardless of CO² concentrations and the longer we observe, the more unexpected things will happen.
With the usual irrational resistance of people towards any change, most of these unexpected things will be perceived as bad and dangerous 🙂
However taking action with regard to a supposed qualitative impact of some climate variable on the final state after a certain time would make sense only if the specific costs/inconvenients of the action were near to 0 or if the time horizon was very short.”

Harry
January 16, 2010 7:11 am

Leif,
“Once again, this begs the question – if the GCMs can’t forecast the AO two weeks in advance
I think this logic is faulty. Let me illustrate with some examples:”
I’ll give my own example..
One would think they could predict flooding based on ‘inches of rain’ over a period of time. It’s a simple fluids calculation: X inches of rain = Y gallons of water, which will all end up in the river, which has a maximum flow rate of Z gallons/minute without flooding.
Unfortunately, one needs to know the level of soil saturation and the state of other watersheds when the rain begins (which requires a hydrologist). Otherwise one unnecessarily alarms the population.
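Harry’s flood example can be made concrete. The function below is a hypothetical toy (the bucket-style capacity numbers are invented for illustration, not hydrology), showing how the same rainfall produces zero or substantial runoff depending on a state variable the naive volume calculation omits:

```python
# Same storm, different outcomes: runoff depends on how much room the soil
# has left, not just on inches of rain.
def runoff(rain_in, soil_capacity_in, soil_saturation):
    room = soil_capacity_in * (1.0 - soil_saturation)  # water soil can absorb
    return max(0.0, rain_in - room)                    # excess reaches the river

print(runoff(3.0, 4.0, soil_saturation=0.1))  # dry soil: no runoff
print(runoff(3.0, 4.0, soil_saturation=0.9))  # wet soil: ~2.6 in runs off
```

The hidden state variable plays the same role for the hydrologist that initial conditions play for the weather modeller.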

Pascvaks
January 16, 2010 7:21 am

Just listening…
The difference between “Micro” and “Macro” seems to be at issue here. I’m not sure of the “M.cro” word for ‘middle’ but I’ll bet it’s also a player in the game.
Let’s say weather is “micro” and climate is “macro”, what’s in the middle? (This isn’t Abbott and Costello by the way.) Is there something in the middle that’s messing everything up? Something, to date, undefined and unfactored?
It also occurred to me that the ‘Forest for the Trees’ problem is applicable here. Some seem too close and some too far away. Some only see forest, some only see trees.
Lately, on the question of Earth’s temperature: is it 98.6F and rising? or 98.6F and falling? I keep thinking that we’re much too close to the problem. I look at all the graphs for each year at various elevations (or air pressure levels), and various lines of latitude, etc., and wonder: what’s the “Earth’s” temperature? You know, like being several million miles out in space and looking at this tiny little blue marble, and pointing some thingie wingie at the speck, pulling a trigger, and looking at a readout that says 98.6F?
I’ve learned that there is something called space weather, I’ve known all my life about the weather where I stood or slept. I’ve seen, read, and heard about the weather where I wasn’t. And of course there’s the weather in history books. There must be something between Micro and Macro. Mucro? Mecro? Mocro?

Steve Goddard
January 16, 2010 7:47 am

Leif,
One thing that climate models and solar models share is that they both seem to be consistently wrong.
But a big difference is that GCMs are designed to be both iterative and precise. The whole idea of a “tipping point” is completely dependent on cumulative behaviour.
For example, Hansen believes that decreasing snow cover in Siberia will cause release of methane which will in turn cause “life as we know it to cease.” On the other hand, increasing snow cover in Siberia will clearly not have that effect.

tucker
January 16, 2010 7:51 am

Pascvaks (07:21:27) :
Just listening…
The difference between “Micro” and “Macro” seems to be at issue here. I’m not sure of the “M.cro” word for ‘middle’ but I’ll bet it’s also a player in the game.
Let’s say weather is “micro” and climate is “macro”, what’s in the middle? (This isn’t Abbott and Costello by the way.) Is there something in the middle that’s messing everything up? Something, to date, undefined and unfactored?
The problem as I see it is that the climate of the past 100 years and the future 100 years is micro climate. The real climate history of Earth extends back billions of years. Man’s recorded history is only ~10,000 years long. Even that isn’t enough to say what is “normal” in climate.
… and isn’t that the real debate anyway? Someone out there, maybe a James Hansen, decided 25 years ago that the future climate had to equal the 1951-1980 mean, regardless of the fact that we didn’t, and still do not, know if that soup is too hot, too cold, or just right. The only “truth” we know is that climate is ever changing and not stable with regard to temps and precip. Why are we so arrogant as to believe we have changed it in the past, and can change it in the future at our whim?

Steve Goddard
January 16, 2010 7:55 am

Scientists Predict Big Solar Cycle (24)
Dec. 21, 2006: Evidence is mounting: the next solar cycle is going to be a big one. Solar cycle 24, due to peak in 2010 or 2011 “looks like its going to be one of the most intense cycles since record-keeping began almost 400 years ago,” says solar physicist David Hathaway of the Marshall Space Flight Center. He and colleague Robert Wilson presented this conclusion last week at the American Geophysical Union meeting in San Francisco.
http://science.nasa.gov/headlines/y2006/21dec_cycle24.htm

Steve Goddard
January 16, 2010 8:23 am

I’m not sure how many commenters here have worked with climate models. They break the surface of the earth up into a grid, divide the atmosphere into layers, and make very precise iterative calculations of many parameters, which are dependent both temporally and spatially. They use 64-bit math to minimize cumulative errors.
NCAR is building a $150 million supercomputer in Wyoming because of the need for precision. Success of the model in a particular time step is dependent on success of the previous time step for all neighboring grid elements and atmospheric layers.
GCMs are not averaging models as some here have suggested. Rather they are cumulative.
(BTW the computer is being built in Wyoming because of cheaper coal-fired electricity.)
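The point about 64-bit math and cumulative error can be seen in a few lines. A minimal sketch, not from any model code: in 32-bit arithmetic a sufficiently small increment is entirely absorbed by rounding, so repeated additions change nothing, while 64-bit arithmetic accumulates them correctly.

```python
# Why iterative models care about word size: an increment smaller than the
# float32 ulp at 1.0 (~1.2e-7) is rounded away on every single addition.
import numpy as np

steps = 100_000
inc = 1e-8                       # below the float32 resolution near 1.0

t32 = np.float32(1.0)
inc32 = np.float32(inc)
t64 = 1.0                        # Python floats are 64-bit
for _ in range(steps):
    t32 = np.float32(t32 + inc32)   # rounds back to 1.0 every time
    t64 += inc

print(float(t32), t64)           # t32 is still exactly 1.0; t64 is ~1.001
```

In a model that steps forward millions of times, with each step feeding the next, this kind of silent loss is exactly the cumulative error the 64-bit arithmetic is there to suppress.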

anna v
January 16, 2010 8:35 am

tmtisfree (07:04:28)
Yes.
I would add that once one has deterministic chaos at one scale, the larger scale is also chaotic. I think there is a theorem to that effect.
It makes no sense to say weather is chaotic in the small time scales but climate is not.
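The hallmark of deterministic chaos, sensitive dependence on initial conditions, is easy to demonstrate. A standard textbook example (my choice of map and numbers, not anything from the comment above) is the logistic map in its chaotic regime:

```python
# Logistic map x -> 4x(1-x): two trajectories starting 1e-12 apart are
# driven order-one apart within a few dozen iterations.

def logistic(x):
    return 4.0 * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-12
max_sep = 0.0
for step in range(60):
    a, b = logistic(a), logistic(b)
    max_sep = max(max_sep, abs(a - b))

print(max_sep)   # the tiny initial difference is amplified enormously
```

The separation roughly doubles each iteration (the Lyapunov exponent is ln 2), so a 1e-12 difference saturates at order one within about 40 steps, despite the map being fully deterministic.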

January 16, 2010 8:37 am

Kath (22:48:38) :
Here in the North wet coast, we’ve had a relatively mild winter so far, with the Pineapple Express bringing us rain on a regular basis. I wonder how the 2010 Winter Olympics will cope with the poor conditions?
—————-
The Olympics will be great, because the wet weather is dumping snow on the mountains and conditions are the best they have been in several years.
A 247 cm base at the bottom, with 48 cm of new snow in the last 48 hours; the base is solid at all elevations and all 200 runs are open with loose powder.
So do not wonder anymore.

KDK
January 16, 2010 8:40 am

“They have correlated the climate to something which doesn’t do much more than drive a drop into a swimming pool.”
Technically speaking, when I surf on the ocean and take a whiz (it happens), I have just raised the oceanic levels (STRICTLY TECHNICALLY SPEAKING) for only a fraction of a moment in time… for what if, at the same time, a wave crashed onto some rocks and formed a puddle that had not been there a moment ago? Would that not immediately offset what I have done?
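For what it's worth, the back-of-envelope arithmetic bears the joke out. Using a rounded textbook figure for the ocean surface area (my assumption, ~3.6e14 m²):

```python
# Spread one litre over the global ocean surface and see how far sea
# level "rises". The area figure is an approximate textbook value.

ocean_area_m2 = 3.6e14       # approximate global ocean surface area
volume_m3 = 1.0e-3           # one litre

rise_m = volume_m3 / ocean_area_m2
print(rise_m)                # a few times 1e-18 m, far below an atomic radius
```

That is roughly a hundred-millionth of an atomic radius, so "a drop in a swimming pool" is, if anything, generous.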
There is just NO way, using the science we have with SO MANY undetected (imo) variables, to determine what CO2 would do in a negative way. AGW focuses ONLY on the negative, NEVER on the positive… never.
Humans can ONLY make theories based on what we know, and I say, we KNOW very little overall with much assurance… in fact, many theories that do ‘work’ may have undetectable variables that play a role–well, I’d say that is a definite.
Leave the earth alone and focus on profiteering corporations (govs) abuse of the humans (and all species) with real pollution… mercury, aspartame, fluoride, etc., and their quest for control via agenda 21 (this would come in line with Cap/Trade as the precursor) and Codex Alimentarius.. all legitimate concerns if people ONLY knew.

January 16, 2010 8:41 am

tallbloke (04:56:59) :
“Yes, but that’s not how the climate models works. Which is the point at issue.”
Ah, models, models… mostly a waste of time. An explicit correlation which holds, and which has a reason when it fails (e.g. volcanic eruptions), is the only kind of model that might produce a credible output. Here is Einstein’s much longer quote (very appropriate):
“The fact that on the basis of such laws (laws of nature, or physics if you wish – my rem.) we are able to predict the temporal behaviour of phenomena in certain domains with great precision and certainty is deeply embedded in the consciousness of the modern man, even though he may have grasped very little of the contents of those laws. He need only consider that planetary courses within the solar system may be calculated in advance with great exactitude on the basis of a limited number of simple laws. In a similar way, though not with the same precision, it is possible to calculate in advance the mode of operation of an electric motor, a transmission system, or of a wireless apparatus, even when dealing with a novel development.
To be sure, when the number of factors coming into play in a phenomenological complex is too large, scientific method in most cases fails us. One need only think of the weather, in which case prediction even for a few days ahead is impossible. Nevertheless no one doubts that we are confronted with a causal connection whose causal components are in the main known to us. Occurrences in this domain are beyond the reach of exact prediction because of the variety of factors in operation, not because of any lack of order in nature.”