Why Numerical Climate Models Fail at Long-term Climate Prediction

Guest Essay by Kip Hansen — 12 November 2024 — 3000 words — Very Long Essay

There has been a great deal of “model bashing” here and elsewhere in the blogs whenever climate model predictions are mentioned.  This essay is a very long effort to cool off the more knee-jerk segment of that recurring phenomenon.

We all use models to make decisions; most often just tossed-together mental models along the lines of: “I don’t see any cars on the road, I don’t hear any cars on the road, I looked both ways twice, therefore my mental model tells me that I will be safe crossing the road now.”  Your little ‘safe to cross the road?’ model is perfectly useful and (barring evidence unknown or otherwise not taken into account) can be depended upon for personal road-crossing safety. 

It is not useful or correct in any way to say “all models are junk”.   

Here, at this website, the models we talk about are “numerical climate models” [or a broader search of references here], which are commonly run on supercomputers.  Here’s what NASA says:

“Climate modelers run the climate simulation computer code they’ve written on the NASA Center for Climate Simulation (NCCS) supercomputers. When running their mathematical simulations, the climate modelers partition the atmosphere into 3D grids. Within each grid cell, the supercomputer calculates physical climate values such as wind vectors, temperature, and humidity. After conditions are initialized using real observations, the model is moved forward one “time step”. Using equations, the climate values are recalculated creating a projected climate simulation.”

The Wiki explains:

“A general circulation model (GCM) is a type of climate model. It employs a mathematical model of the general circulation of a planetary atmosphere or ocean. It uses the Navier–Stokes equations on a rotating sphere with thermodynamic terms for various energy sources (radiation, latent heat). These equations are the basis for computer programs used to simulate the Earth’s atmosphere or oceans. Atmospheric and oceanic GCMs (AGCM and OGCM) are key components along with sea ice and land-surface components.”

I am open to other definitions for the basic GCM.  There are, of course, hundreds of different “climate models” of various types and uses.

But let us just look at the general topic that produces the basis for claims that start with the phrase: “Climate models show that…”

Here are a few from a simple Google search on that phrase:

Climate Models Show That Sea Level Rise from Thermal Expansion Is Inevitable

Global climate models show that Arctic sea ice is on a course to disappear for at least part of the year

Climate models show that global warming could increase from 1.8 to 4.4°C by 2100.

Historical data as well as future climate models show that global warming is (approximately) directly proportional to the increase of CO2 concentrations

All climate models show that the addition of carbon dioxide to the atmosphere will lead to global warming and changes in precipitation.

Climate models show that Cape Town is destined to face a drier future

Let’s try “climate science predicts that”

Climate science predicts that the land areas of the world should be warming faster than the ocean areas and our temperature datasets confirm this

Patterns of extreme weather are changing in the United States, and climate science predicts that further changes are in store

There are innumerable examples.  But let’s ask:  “What do they mean when they say ‘Climate science predicts…’?”

In general, they mean one of the following two things:

1) That some climate scientist, or the IPCC, or some group in some climate report, states [or is commonly believed to have stated, which is very often not exactly the case] that such a future event/condition will occur.

2) Some climate model [or some single run of a climate model, or some number of particular climate model outputs which have been averaged] has predicted/projected that such a future event/condition will occur.

Note that the first case is often itself based on the second. 

Just generally dismissing climate model results is every bit as silly as just generally dismissing all of climate skepticism.  A bit of intelligence and understanding is required to make sense of either.  There are some climate skepticism points/claims made by some people with which I disagree and there are climate crisis claims with which I disagree. 

But I know why I disagree. 

Why I Don’t Accept Most Climate Model Predictions or Projections of Future Climate States

Years ago, on October 5, 2016, I wrote Lorenz validated, which was published on Judith Curry’s blog, Climate Etc.  It is an interesting read, and important enough to re-read if you are truly curious about why numerical climate modeling has problems so serious that it has come to be seen by many, myself included, as giving valid long-term projections only accidentally. I say ‘accidentally’ in the same sense that a stopped clock shows the correct time twice a day, or maybe as a misadjusted clock, running at slightly the wrong speed, gives the correct time only occasionally and accidentally.  

I do not say that a numerical climate model never gives, or can never give, a correct projection. 

Jennifer Kay and Clara Deser, both at University of Colorado Boulder and associated  with  NCAR/UCAR [National Center for Atmospheric Research,  University Corporation for Atmospheric Research], with 18 others,  did experiments with climate models back in 2016 and produced a marvelous paper titled:  “The Community Earth System Model (CESM) Large Ensemble Project: A Community Resource for Studying Climate Change in the Presence of Internal Climate Variability”.

The full paper is available for download here [.pdf].

Here is what they did (in a nutshell):

“To explore the possible impact of miniscule perturbations to the climate — and gain a fuller understanding of the range of climate variability that could occur — Deser and her colleague Jennifer Kay, an assistant professor at the University of Colorado Boulder and an NCAR visiting scientist, led a project to run the NCAR-based Community Earth System Model (CESM) 40 times from 1920 forward to 2100. With each simulation, the scientists modified the model’s starting conditions ever so slightly by adjusting the global atmospheric temperature by less than one-trillionth of one degree, touching off a unique and chaotic chain of climate events.” [ source ]

What are Deser and Kay referring to here?

“It’s the proverbial butterfly effect,” said Clara Deser… “Could a butterfly flapping its wings in Mexico set off these little motions in the atmosphere that cascade into large-scale changes to atmospheric circulation?” 

Note:  The answer to the exact original question posed by Edward Lorenz is “No”, for a lot of reasons that have to do with the scale and viscosity of the atmosphere, and it is a topic argued endlessly.  But the principle of the matter, “extreme sensitivity to initial conditions”, is true and correct, and is demonstrated in Deser and Kay’s study in practical use in a real climate model. – kh

What happened when Deser and Kay ran the Community Earth System Model (CESM) 40 times, repeating the exact same model run forty different times, using all the same inputs and parameters, with the exception of one input:  the Global Atmospheric Temperature? This input was modified for each run by:

less than one-trillionth of one degree

or

< 0.0000000000001 °C

And that one change resulted in the projections for “Winter temperature trends (in degrees Celsius) for North America between 1963 and 2012”, presented as a panel of images: 30 individual maps plus an OBS (observations) panel and an EM (ensemble mean) panel.

First, notice how different each of the 30 projections is.  Compare #11 to #12 right beside it: #11 has a cold northern Canada and Alaska whereas #12 has a hot northern Canada and Alaska.  Then look down at #28. 

Compare #28 to OBS (observations, the reality, actuality, what actually took place).   Remember, these are not temperatures but temperature trends across 50 years.  Not weather but climate. 

Now look at EM, next to OBS in the bottom row.  EM = Ensemble Mean – they have AVERAGED the output of 30 runs into a single result.

They set up the experiment to show whether or not numerical climate models are extremely sensitive to initial conditions.  They changed a single input by an infinitesimal amount – far below real-world measurement precision (far below our ability to measure ambient air temperature).  That amount?  One-trillionth of a degree Centigrade – 0.0000000000001 °C.  To be completely fair, the actual change was even less than that.

In the article the authors explain that they are fully aware of the extreme sensitivity to initial conditions in numerical climate modelling.  In fact, in a sense, that is their very reason for doing the experiment.  They know they will get chaotic (as in the field of Chaos Theory)  results.  And, they do get chaotic results.  None of the 30 runs matches reality.  The 30 results are all different in substantial ways.  The Ensemble Mean is quite different from the Observations, agreeing only that winters will be somewhat generally warmer – this because models are explicitly told it will be warmer if CO2 concentrations rise (which they did). 
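
That extreme sensitivity is easy to reproduce with a toy system. Below is a minimal Python sketch, purely illustrative and in no sense the CESM or any climate model: it integrates 30 copies of the classic Lorenz-63 equations from starting points whose x values differ by multiples of 0.0000000000001, then compares a single member with the average of all 30. The member count, time step and run length are arbitrary choices made only for the illustration.

```python
# Toy illustration only: the Lorenz-63 system, NOT a climate model.
# 30 "ensemble members" start from x values that differ by multiples of 1e-13.
import numpy as np

def lorenz63(S, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side for an array of states S with shape (members, 3)."""
    x, y, z = S[:, 0], S[:, 1], S[:, 2]
    return np.stack([sigma * (y - x), x * (rho - z) - y, x * y - beta * z], axis=1)

def integrate(S0, dt=0.01, n_steps=20000):
    """Fixed-step 4th-order Runge-Kutta, all members advanced together."""
    S = np.array(S0, dtype=float)
    traj = np.empty((n_steps + 1,) + S.shape)
    traj[0] = S
    for i in range(1, n_steps + 1):
        k1 = lorenz63(S)
        k2 = lorenz63(S + 0.5 * dt * k1)
        k3 = lorenz63(S + 0.5 * dt * k2)
        k4 = lorenz63(S + dt * k3)
        S = S + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i] = S
    return traj                                   # shape: (time, members, 3)

members = 30
S0 = np.tile([1.0, 1.0, 20.0], (members, 1))
S0[:, 0] += np.arange(members) * 1e-13            # the "trillionth of a degree" style nudge

x = integrate(S0)[:, :, 0]                        # the x variable of every member
late = slice(10000, None)                         # after the members have decorrelated

print("x at the final step, min to max across the 30 members:", x[-1].min(), x[-1].max())
print("std. dev. of x, a single member (late part)  :", x[late, 0].std())
print("std. dev. of x, the 30-member mean (late part):", x[late].mean(axis=1).std())
```

Every member wanders over the whole attractor, yet by the end no two members agree, and the 30-member mean is far smoother than any single member, which is the kind of flattening one would expect in the EM panel relative to the individual runs.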

But what they call those chaotic results is “internal climate variability”.

That is a major error.  Their pretty little pictures show the numerically chaotic output of nonlinear dynamical systems expressed as mathematical formulas (most of which are themselves highly sensitive to initial conditions), each result fed back into the formulas at each succeeding time step of their climate model. 

Edward Lorenz showed in his seminal paper, “Deterministic Nonperiodic Flow”, that numerical weather models produce results extremely sensitive to initial conditions, and that the further into the future one runs them – the more time steps calculated – the wider and wider the spread of chaotic results becomes. 

What exactly did Lorenz say?  “Two states differing by imperceptible amounts may eventually evolve into two considerably different states … If, then, there is any error whatever in observing the present state—and in any real system such errors seem inevitable—an acceptable prediction of an instantaneous state in the distant future may well be impossible….In view of the inevitable inaccuracy and incompleteness of weather observations, precise very-long-range forecasting would seem to be nonexistent.”
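
Lorenz’s statement is easy to see numerically. Here is a second, self-contained sketch with the same toy Lorenz-63 system, this time using SciPy’s general-purpose ODE solver for brevity: two starting states that differ by one part in a trillion track each other for a while and then become completely unrelated. The growth rate and timing shown are properties of the toy system only, not of any weather or climate model.

```python
# Two runs of the toy Lorenz-63 system whose starting x values differ by 1e-12.
# Illustration of sensitivity to initial conditions only; not a weather model.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz63(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_eval = np.arange(0.0, 60.0, 10.0)               # report the separation every 10 time units
kwargs = dict(t_eval=t_eval, rtol=1e-10, atol=1e-12)

a = solve_ivp(lorenz63, (0.0, 60.0), [1.0, 1.0, 20.0], **kwargs)
b = solve_ivp(lorenz63, (0.0, 60.0), [1.0 + 1e-12, 1.0, 20.0], **kwargs)

separation = np.linalg.norm(a.y - b.y, axis=0)
for t, s in zip(t_eval, separation):
    print(f"t = {t:5.1f}   separation between the two runs = {s:.3e}")
# The separation grows roughly exponentially until it saturates at the size
# of the attractor itself; after that the two "forecasts" are unrelated.
```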

These numerical climate models cannot help but fail to predict or project accurate long-term climate states. This situation cannot be obviated.  It cannot be ‘worked around’.  It cannot be solved by finer and finer gridding.

Nothing can correct for the fact that sensitivity to initial conditions — the primary feature of Chaos Theory’s effect on climate models — causes models to lose the ability to predict long-term future climate states.

Deser and Kay clearly demonstrate this in their 2016 and subsequent papers. 

What does that mean in the practice of climate science?

That means exactly what Lorenz found all those years ago —  quoting the IPCC TAR: “The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible.”

Deser and Kay label the chaotic results found in their paper as “internal climate variability”.  This is entirely, totally, absolutely, magnificently wrong.

The chaotic results, which they acknowledge are chaotic results due to sensitivity to initial conditions,  are nothing more or less than:  chaotic results due to sensitivity to initial conditions.  This variability is numerical – the numbers vary and they vary because they are numbers [specifically not weather and not climate].

The numbers that vary in climate models vary chaotically because they come out of calculating nonlinear partial differential equations such as the Navier–Stokes equations, which describe the motion of a fluid in space, such as the atmosphere or the oceans.  Navier–Stokes plays a major role in numerical climate models.  “The open problem of existence (and smoothness) of solutions to the Navier–Stokes equations is one of the seven Millennium Prize problems in mathematics” – a solution to the posed problem will get you $1,000,000.00.  For that reason, a linearized version of Navier–Stokes is used in models.

How does this play out then, in today’s climate models – what method is used to try to get around these roadblocks to long-term prediction?

“Apparently, a dynamical system with no explicit randomness or uncertainty to begin with, would after a few time steps produce unpredictable motion with only the slightest changes in the initial values. Seen how even the Lorenz equations (as they have become known over time) present chaotic traits, one can just imagine to what (short, presumably) extent the Navier-Stokes equations on a grid with a million points would be predictable. As previously mentioned, this is the reason why atmospheric models of today use a number of simplifying assumptions, linearizations and statistical methods in order to obtain more well-behaved systems.” [ source – or download .pdf ]

In other words, the mantra that climate models are correct, dependable and produce accurate long-term predictions because they are based on proven physics is false – the physics is treated to subjective ‘simplifying’ assumptions, to linearizations of the known mathematical formulas (which make the unsolvable solvable), and is then subjected to statistical methods to “obtain more well-behaved systems”.  
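
To make the ‘more well-behaved’ point concrete, here is a purely illustrative Python sketch (neither map appears in any GCM; they are simple stand-ins): a stable linear recurrence forgets a tiny difference in its starting value, while the chaotic, nonlinear logistic map amplifies it until the two runs are unrelated. Linearizing and smoothing a system buys well-behaved output at the price of no longer being the original nonlinear system.

```python
# Illustration only: nearby starting values in a stable linear map converge,
# while in the chaotic nonlinear logistic map (r = 4) they diverge completely.
def run(step, x0, n=60):
    xs = [x0]
    for _ in range(n):
        xs.append(step(xs[-1]))
    return xs

linear = lambda x: 0.9 * x + 0.05            # stable linear recurrence, fixed point at 0.5
logistic = lambda x: 4.0 * x * (1.0 - x)     # chaotic nonlinear recurrence

for name, step in (("linear  ", linear), ("logistic", logistic)):
    a = run(step, 0.4)
    b = run(step, 0.4 + 1e-10)               # starting values differ by 1e-10
    gaps = [abs(p - q) for p, q in zip(a, b)]
    print(name, "|difference| at steps 0, 20, 40, 60:",
          ["%.2e" % gaps[i] for i in (0, 20, 40, 60)])
```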

Natural variability can only be seen in the past.  It is the variability seen in nature – the real world – in what really happened.

The weather and climate will vary in the future.  And when we look back at it, we will see the variability.

But what happens in numerical climate models is the opposite of natural variability.  It is numerical chaos.  This numerical chaos is not natural climate variability – it is not internal climate variability. 

But, how can we separate out the numerical chaos seen in climate models from the chaos clearly obvious in the coupled non-linear chaotic system that is Earth’s climate?

[and here I have to fall back on my personal opinion – an informed opinion but only an opinion when all is said and done]

We cannot.

I can show (and have shown) images and graphs of the chaotic output of various formulas that demonstrate numerical chaos.  You can glance through my Chaos Series here, scrolling down and looking at the images. 

It is clear that the same type of chaotic features appear in real world physical systems of all types.  Population dynamics, air flow, disease spread, heart rhythms, brain wave functions….almost all real world dynamical systems are non-linear and  display aspects of chaos.  And, of course, Earth’s climate is chaotic in the same Chaos Theory sense.

But, doesn’t that mean that the numerical chaos in climate models IS internal or natural variability?   No, it does not. 

A perfectly calculated trajectory of a cannonball’s path based on the best Newtonian physics will not bring down a castle’s wall.  It is only an idea, a description.   The energy calculated from the formulas is not real.  The cannonball described is not a thing.  And, to use a cliché of an adage: The map is not the territory.

In the same way, the numerical chaos churned out by climate models is similar in appearance to the type of chaos seen in the real world’s climate but it is not that chaos and not the future climate.  Lorenz’s ‘discovery’ of numerical chaos is what led to the discoveries that Chaos Theory applies to real world dynamical systems. 

Let’s take an example from this week’s news:

Hurricane Rafael’s Path Has Shifted Wildly, According to Tracker Models

Shown are the projected paths produced by our leading hurricane models as of 1200 UTC on 6 November 2024.  The messy black smudge just above western Cuba is the 24 hour point, where the models begin to wildly diverge. 

Why do they diverge?  For all the reasons above – everything in this essay.  These hurricane path projections are a down-and-dirty sample of what chaos does to weather prediction, and thus to climate prediction.  At just 24 hours into the future, the projections begin to diverge.  By 72 hours, the hurricane could be anywhere from just northwest of the Yucatan to already hitting the coast of Florida. 

If you had a home in Galveston, Texas, what use would these projections be to you?   If NCAR had “averaged” the paths to produce an “ensemble mean”, would it be more useful? 

Going back up to the first image of 30 projected winter temperature trends – a very vague metric – is the EM (ensemble mean) of those particular model runs, created using one of the methods suggested by the Copernicus Climate Change Service, more accurate than any of the other futures?  Or is it just accidentally ‘sorta like’ the observations?

# # # # #

Author’s Comment:

This is not an easy topic.  It produces controversy.  Climate scientists know about Lorenz, chaos, sensitivity to initial conditions, non-linear dynamical systems and what that means for climate models.  The IPCC used to know but ignores the facts now.

Some commenter here will cry that “It is not an initial conditions problem but a boundaries problem” – as if that makes everything OK.  You can read about that in a very deep way here.  I may write about that attempt to dodge reality in a future essay.

I will end with a reference to the eclectic R G Brown’s comments which I sponsored here, in which he says:

“What nobody is acknowledging is that current climate models, for all of their computational complexity and enormous size and expense, are still no more than toys, countless orders of magnitude away from the integration scale where we might have some reasonable hope of success. They are being used with gay abandon to generate countless climate trajectories, none of which particularly resemble the climate, and then they are averaged in ways that are an absolute statistical obscenity as if the linearized average of a Feigenbaum tree of chaotic behavior is somehow a good predictor of the behavior of a chaotic system!  …  This isn’t just dumb, it is beyond dumb. It is literally betraying the roots of the entire discipline for manna.”

And so say I.

Thanks for reading.

# # # # #


324 Comments

Richard Greene
November 9, 2024 2:42 am

A computer game that guesses the climate in 100 years is not a model of the climate on this planet.

It is propaganda used to confuse people by appearing to be scientific … when it is just climate astrology.

Science requires data
There are no data for the future climate
Therefore, the so-called models are not science.

They are climate confuser games.

They appear to be science only because scientists are involved, and they predicted warming … that has happened in the past 50 years.

Scientists have predicted warming from CO2 emissions for the past 127 years so predicting warming as CO2 rose after the 1960s could have been done on the back of an envelope.

Richard Greene
November 9, 2024 3:37 am

Best model

[image]

November 9, 2024 4:41 am

If any model is just plain wrong in essential ways – running it on a supercomputer won’t make it better.

Reply to  Joseph Zorzin
November 9, 2024 6:02 pm
Reply to  karlomonte
November 10, 2024 4:29 am

craziest place on the planet!

November 9, 2024 5:13 am

Kip, it is good to see Lorenz mentioned here in your very well-informed article. His insights into the energy dynamics of the general circulation are gem quality, in my estimation.

His descriptions of energy transformations, or “energy conversion”, are important to understand. And if one grasps what is happening in the atmosphere in this respect, there need be no expectation at all that incremental non-condensing GHGs such as CO2, CH4, N2O will end up driving any of the climate-related metrics of temperature, precipitation, ice coverage, etc.

********************
From Edward N. Lorenz (1960) “Energy and Numerical Weather Prediction”
https://doi.org/10.3402/tellusa.v12i4.9420

“2. Energy, available potential energy, and gross static stability

Of the various forms of energy present in the atmosphere, kinetic energy has often received the most attention. Often the total kinetic energy of a weather system is regarded as a measure of its intensity. The only other forms of atmospheric energy which appear to play a major role in the kinetic energy budget of the troposphere and lower stratosphere are potential energy, internal energy, and the latent energy of water vapor. Potential and internal energy may be transformed directly into kinetic energy, while latent energy may be transformed directly into internal energy, which is then transformed into kinetic energy. It is easily shown by means of the hydrostatic approximation that the changes of the potential energy P and the internal energy I of the whole atmosphere are approximately proportional, so that it is convenient to regard potential and internal energy as constituting a single form of energy. This form has been called total potential energy by Margules (1903).

In the long run, there must be a net depletion of kinetic energy by dissipative processes. It follows that there must be an equal net generation of kinetic energy by reversible adiabatic processes; this generation must occur at the expense of total potential energy. It follows in turn that there must be an equal net generation of total potential energy by heating of all kinds. These three steps comprise the basic energy cycle of the atmosphere. The rate at which these steps proceed is a fundamental characteristic of the general circulation.”
***************

This is why I have posted a time-lapse video of the ERA5 reanalysis model’s hourly parameter “vertical integral of energy conversion.” The text description gives the full explanation.
https://youtu.be/hDurP-4gVrY

November 9, 2024 6:52 am

Years ago two German (I think) physicists wrote a paper about the global warming claims. They concluded much the same thing. We can’t know all the initial conditions, we can’t get grid cells small enough, we can’t model every inch of the surface of the Earth, etc.

I think their initials were T & G.

Mike71
November 9, 2024 7:15 am

Should models that cannot predict the past accurately and cannot predict the present accurately be used to predict the future, with confirmation reachable only far in the future?  Or should they be dismissed?  Personally, I dismiss them.

ferdberple
November 9, 2024 7:32 am

Spot on Kip. Most of us are aware that 1/3 is 0.33333… and there is a round-off error between fractions and decimals. Binary computers have the exact same problem. Thus the scams and movies about interest round-off in banks. It is a real problem that shows up in all sorts of unexpected ways. Forecasting the future is physically and mathematically prone to errors that grow faster than the answer.
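
For anyone who wants to see the round-off concretely, here is a minimal Python snippet (any language using binary floating point behaves the same way):

```python
from decimal import Decimal

# Neither 1/3 nor 0.1 can be stored exactly in binary floating point.
print(Decimal(1 / 3))       # the value actually stored for 1/3
print(Decimal(0.1))         # the value actually stored for 0.1

# The tiny representation errors accumulate with repeated use:
total = 0.0
for _ in range(10):
    total += 0.1
print(total == 1.0)         # False
print(total - 1.0)          # a residual of about -1e-16, not zero
```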

November 9, 2024 7:39 am

To repeat the Wikipedia definition of GCMs that Kip provided in his article above:
“A general circulation model (GCM) is a type of climate model. It employs a mathematical model of the general circulation of a planetary atmosphere or ocean. It uses the Navier–Stokes equations on a rotating sphere with thermodynamic terms for various energy sources (radiation, latent heat). These equations are the basis for computer programs used to simulate the Earth’s atmosphere or oceans. Atmospheric and oceanic GCMs (AGCM and OGCM) are key components along with sea ice and land-surface components.”
(my bold emphasis added)

Here is a summary of the failings of using Navier-Stokes equations as the basis of modeling a system as complex and variable (from sea-level to TOA) as Earth’s atmosphere, courtesy of a Quora AI bot (see https://www.quora.com/Why-are-the-Navier-Stokes-equations-incomplete-What-real-physical-effects-do-they-ignore ):

“The Navier-Stokes equations describe the motion of fluid substances and are fundamental in fluid mechanics. However, they are often considered “incomplete” because they do not account for certain real physical effects and complexities in fluid behavior. Here are several key reasons:

1. Turbulence

Description: The Navier-Stokes equations are notoriously difficult to solve in turbulent conditions. Turbulence is a complex, chaotic flow regime characterized by vortices and eddies.
Effect Ignored: The equations do not provide a complete description of turbulent flow, and while turbulence models (like the k-ε model or Large Eddy Simulation) can be used, they are approximations that can still miss certain dynamics.

2. Compressibility

Description: The Navier-Stokes equations are often applied under the assumption of incompressible flow, which simplifies calculations.
Effect Ignored: In high-speed flows (like those encountered in aerodynamics), compressibility effects become significant, requiring modifications to the equations (the compressible Navier-Stokes equations).

3. Non-Newtonian Fluids

Description: Many fluids do not behave according to Newtonian mechanics (where the stress is linearly proportional to the strain rate).
Effect Ignored: The Navier-Stokes equations typically assume a linear relationship between stress and strain rate, which does not apply to non-Newtonian fluids such as polymers, slurries, or biological fluids.

4. Heat Transfer and Phase Changes

Description: The basic form of the Navier-Stokes equations does not incorporate thermal effects or phase changes (like evaporation or condensation).
Effect Ignored: In many practical applications, heat transfer (via conduction, convection, and radiation) and phase transitions significantly influence fluid behavior.

5. Chemical Reactions

Description: The Navier-Stokes equations do not account for any chemical reactions that may occur within the fluid.
Effect Ignored: In processes like combustion or mixing of reactive fluids, the interaction between flow dynamics and chemical kinetics is critical.

6. Boundary Conditions and External Forces

Description: The equations require boundary conditions and can be influenced by external forces (like electromagnetic forces or gravitational variations).
Effect Ignored: Simplifications in boundary conditions or neglecting certain forces can lead to inaccuracies in predictions.

Conclusion

Overall, while the Navier-Stokes equations provide a robust framework for understanding fluid motion, their incompleteness arises from their assumptions and simplifications. Advances in computational fluid dynamics (CFD) and various modeling techniques aim to address these limitations, but complete solutions remain challenging, particularly in turbulent and complex flow scenarios.”

Oh well, N-S-based GCMs are a start . . . one that hard data shows is “oh so preliminary” and not to be trusted.

Reply to  Kip Hansen
November 9, 2024 9:00 am

Just so!

ferdberple
November 9, 2024 7:52 am

Anyone who thinks the future is a point in time we arrive at, and is thus predictable, need only look at Wheeler’s delayed-choice experiment.

Not only is the future not written. It turns out the past isn’t either.

ferdberple
November 9, 2024 8:08 am

BCD – Binary Coded Decimal – is a base-10 numeric representation used in banking and finance computers to avoid the “chaotic” errors that occur when performing human decimal arithmetic on binary computers.

The purpose of BCD is to ensure that humans and computers will get the same results when doing calculations. In typical binary computers this is not always the case.
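
Python does not expose hardware BCD, but its base-10 decimal module makes the same point: do the arithmetic in base 10 and the computer gets the answer a human expects; do it in binary floating point and it may not. A minimal sketch:

```python
from decimal import Decimal

# Binary floating point: the "human" answer is missed by a tiny amount.
print(0.1 + 0.2 == 0.3)                                        # False
print(0.1 + 0.2)                                               # 0.30000000000000004

# Base-10 arithmetic (the same idea as BCD): exact, matches human expectation.
print(Decimal("0.10") + Decimal("0.20") == Decimal("0.30"))    # True
print(Decimal("0.10") + Decimal("0.20"))                       # 0.30
```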

don k
November 9, 2024 8:24 am

Kip: In general, I would agree with you. I would point out that numerical integration CAN produce excellent results. IF you know what you’re doing. In fact it does a great job of predicting satellite trajectories because barring the satellites themselves adjusting their trajectories or collision, the forces on them are usually strongly dominated by gravity. And we can predict gravity. In fact, there are simple equations that can predict most orbits almost as well on a cocktail napkin. There seem to be no such equations for climate. My feeling is that climate modelling today is no more accurate/useful than reading tarot cards, tea leaves, or sheep entrails.

It’ll likely improve. But it has a LONG way to go before any reasonable person ought to take it seriously.

One thing that deeply impresses me is that Lorenz’s contributions to chaos theory didn’t spring from abstract thought but rather from practical experience. It apparently was triggered by the inability of Lorenz and his co-workers to restart an early weather model from an intermediate printout with three-decimal-digit precision and get about the same results as the initial run. https://en.wikipedia.org/wiki/Edward_Norton_Lorenz
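
That restart story is easy to re-create with any chaotic recurrence. A minimal sketch, using the logistic map as a stand-in for Lorenz’s early weather model: run it, ‘print out’ an intermediate value rounded to three decimal places, restart from the rounded value, and the two runs soon part company.

```python
# Re-creating the flavor of Lorenz's restart problem, with the logistic map
# standing in for his weather model: restarting from a value rounded to three
# decimal places soon yields a completely different continuation.
step = lambda x: 4.0 * x * (1.0 - x)

x = 0.37
history = [x]
for _ in range(100):
    x = step(x)
    history.append(x)

y = round(history[50], 3)              # the "printout": step 50, three decimals
for n in range(51, 101):
    y = step(y)
    if n in (52, 56, 60, 70, 100):
        print(f"step {n}: original run = {history[n]:.6f}   restarted run = {y:.6f}")
```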

Reply to  Kip Hansen
November 9, 2024 11:34 am

This seems like hand-waving to pave over something inconvenient.

ferdberple
November 9, 2024 8:26 am

The greatest source of feedback in climate models is the human beings that built the model.

If the model did not return the answer they were expecting, they would change the model, even if the model was giving the correct answer.

ferdberple
November 9, 2024 8:39 am

Years ago I converted a financial forecasting model for a large engineering firm to run on a new computer.

Debugging the code I discovered a memory fault was clobbering their final profit projection for the year. I fixed the code and presented the program to senior management for approval.

Management immediately zeroed in on the final projection. It was wrong. It didn’t match their current system.

I explained that their existing system had a bug. The current value was in error. The new system was the correct value. We had triple checked, 10 ways to Sunday.

Senior management insisted I put the bug back in. True story. So why should climate models be different.

Reply to  ferdberple
November 9, 2024 11:36 am

Great story — presumably senior management had at least some formal engineering training?

old cocky
Reply to  ferdberple
November 9, 2024 1:56 pm

I explained that their existing system had a bug. The current value was in error. The new system was the correct value. We had triple checked, 10 ways to Sunday.

Senior management insisted I put the bug back in. True story. So why should climate models be different.

Don’t make two changes at once.

Either:
Port the system to the new platform, with the same results first, then fix the bug(s)
-or-
Fix the bug(s) first, then port to the new platform.

The problem with fixing bugs at the same time as the porting exercise is that you don’t know if the output differences are the result of fixing the existing bugs, or introducing new bugs.
It’s absolutely crucial in porting exercises to not modify behaviour.

Actually, I’d probably have added a bug compatibility switch to allow comparison runs to validate outputs against the existing system, and also allow parallel runs with the bug fix.

November 9, 2024 8:52 am

Kip, thanks. Looking forward to when you do write your ‘boundaries’ essay.

Art Slartibartfast
November 9, 2024 10:17 am

Kip: It makes one wonder whether, if you run the model 30 times without changing the parameters at all, you still get the exact same outcome.

Reply to  Art Slartibartfast
November 9, 2024 12:03 pm

You probably don’t have to change the parameters. Just change the order of the calculations done by the various algorithms. My bet is that you will get a different answer for every change in order you make.
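
That bet rests on a real property of binary floating point: addition is not associative, so summing the very same numbers in a different order can give a slightly different total, and an iterated model then amplifies that slight difference. A minimal Python illustration:

```python
import random

# Floating-point addition is not associative.
print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))    # False

# Summing the same list forwards and backwards rarely gives identical totals.
random.seed(1)
values = [random.uniform(-1e6, 1e6) for _ in range(100000)]

forward = sum(values)
backward = sum(reversed(values))
print(forward == backward)                        # almost always False for a list this long
print(forward - backward)                         # tiny, but not zero
```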

old cocky
Reply to  Kip Hansen
November 9, 2024 2:03 pm

I think there is some randomisation incorporated as well (volcanos, etc). Hopefully that can be disabled.

November 9, 2024 11:01 am

Kip, the 30 individual runs amount to a perfect model test. A climate model is the perfect model of the climate state simulated by that same model.

One then modifies the input a touch (10⁻¹² K) and runs the model again. See what comes out. How long does it take for the model to diverge from the climate of which it is the perfect model?

So, your run 1 is the climate standard for a perfect model test,

Runs 2-30 test whether the perfect model can reproduce its own climate, given small errors in initial conditions.

It cannot.

Perfect model tests have been run in the past.
https://doi.org/10.1007/s003820050340
The climate models always fail.

Ireneusz
November 9, 2024 11:10 am

A tropical storm will approach the Texas coast. It will meet a cold front there. There will be a lot of rain in Louisiana.
[image]
[image]

Ireneusz
Reply to  Kip Hansen
November 9, 2024 10:46 pm
The Dark Lord
November 9, 2024 3:10 pm

The models are wrong … full stop … this whole discussion is just an intellectual circle j*rk … may as well argue about angels on the head of a pin …

Reply to  Kip Hansen
November 9, 2024 7:40 pm

The IPCC statement does not mean that the “focusing on ensembles and probability distributions approach” is necessarily valid or that it somehow magically produces valid dependable long-term predictions of future climate states.

They are attempting to say that a distribution of wrong outputs will give a mean that is correct.

That is not logically consistent and is gibberish from a statistical standpoint.

Reply to  Kip Hansen
November 10, 2024 6:03 am

In fact the models do *not* provide a “probability distribution approach” at all. I’ve never seen an “ensemble” of climate models where each model is assigned a “probability” other than a uniform one where all outputs are equally likely. Finding the mean of equally likely outcomes does *not* tell you that the mean is the most likely outcome; it is more of a median than an average, it just tells you the point where you have an equal number of outcomes above and below.

In fact, the climate models don’t even include all possible future outcomes. They are all basically the same model with the same outcome and none of them even measure the impact of the outcome on the human population. Not a single model predicted the greening of the earth, the increase in food production, and the increased survival rate of humans due to lowered deaths from hypothermia. And those are just a few of the “holistic” factors that need to be included in order to make an educated judgement on the possible outcomes.

November 9, 2024 6:21 pm

The discrete and indirect solar forcing of NAO/AO anomalies is not chaos!

Ireneusz
Reply to  Ulric Lyons
November 9, 2024 10:38 pm

That’s why long-term forecasts can’t work, because the solar wind reaching Earth is highly variable, as seen in the current solar cycle. Only changes on a scale of thousands of years can be predicted, due to the Earth’s orbital changes.
https://www.swpc.noaa.gov/products/real-time-solar-wind
http://wso.stanford.edu/gifs/Polar.gif

Ireneusz
November 9, 2024 11:06 pm

In particular, long-term winter forecasts are not reliable because the troposphere interacts with the stratosphere, and circulation changes in the stratosphere are poorly studied. In particular, we do not know what the strength of the stratospheric polar vortex depends on. In my opinion, the difference between the northern and southern polar vortex depends on the difference in the geomagnetic field.
In winter, it is the pattern of the polar vortex in the stratosphere that determines the circulation in the troposphere, down to the 500 hPa level, which is clear to any meteorologist. It is also known that the SSW (sudden stratospheric warming) develops from the top of the stratosphere down to the 500 hPa level.
http://www.geomag.bgs.ac.uk/data_service/models_compass/polarnorth.html
http://www.geomag.bgs.ac.uk/data_service/models_compass/polarsouth.html
http://www.geomag.bgs.ac.uk/images/charts/jpg/polar_n_z.jpg
http://www.geomag.bgs.ac.uk/images/charts/jpg/polar_s_z.jpg

Ireneusz
Reply to  Ireneusz
November 9, 2024 11:25 pm

We can see that the stronger solar wind over a longer period has strengthened the polar vortex in the upper stratosphere, which is moving into the lower layers.
[image]

Ireneusz
November 10, 2024 12:43 am

On November 14, storm Rafael will reach Louisiana.
[image]

Old.George
November 10, 2024 9:28 am

I retired as a professor of computer science a quarter century ago. Accurate computer modeling of a chaotic (in the math sense) system was then considered not possible without getting _all_ the inputs exactly correct. One mis-measurement (butterfly) anywhere – to say nothing of something like rounding all temps to the nearest degree when recording observations – could yield a completely different projection.

Has that situation changed?

Ireneusz
November 10, 2024 10:23 am

Again, an upper low over Spain and downpours in the east.
[image]

Ireneusz
November 10, 2024 11:57 am

The tropical storm is “hiding” south of Louisiana.
[image]