Study: High End Model Climate Sensitivities Not Supported by Paleo Evidence

Guest essay by Eric Worrall

University of Michigan researchers have done the unthinkable, and checked climate model predictions against available paleo-climate data to see if the predictions are plausible.

Some of the latest climate models provide unrealistically high projections of future warming

Date: April 30, 2020
Source: University of Michigan
Summary: A new study from climate researchers concludes that some of the latest-generation climate models may be overly sensitive to carbon dioxide increases and therefore project future warming that is unrealistically high.

A new study from University of Michigan climate researchers concludes that some of the latest-generation climate models may be overly sensitive to carbon dioxide increases and therefore project future warming that is unrealistically high.

In a letter scheduled for publication April 30 in the journal Nature Climate Change, the researchers say that projections from one of the leading models, known as CESM2, are not supported by geological evidence from a previous warming period roughly 50 million years ago.

The researchers used the CESM2 model to simulate temperatures during the Early Eocene, a time when rainforests thrived in the tropics of the New World, according to fossil evidence.

But the CESM2 model projected Early Eocene land temperatures exceeding 55 degrees Celsius (131 F) in the tropics, which is much higher than the temperature tolerance of plant photosynthesis — conflicting with the fossil evidence. On average across the globe, the model projected surface temperatures at least 6 C (11 F) warmer than estimates based on geological evidence.

“Some of the newest models used to make future predictions may be too sensitive to increases in atmospheric carbon dioxide and thus predict too much warming,” said U-M’s Chris Poulsen, a professor in the U-M Department of Earth and Environmental Sciences and one of the study’s three authors.

“Our study implies that CESM2’s climate sensitivity of 5.3 C is likely too high. This means that its prediction of future warming under a high-CO2 scenario would be too high as well,” said Zhu, first author of the Nature Climate Change letter.

Read more: https://www.sciencedaily.com/releases/2020/04/200430113003.htm

“People underestimate the power of models. Observational evidence is not very useful” – attributed to John Mitchell, UK Met Office

Most fields of science don’t accept a model unless it has been rigorously validated against available data, but climate science is different; the modelling process itself frequently seems to be accepted as evidence that the climate model is correct, a circular chain of reasoning that leads to positions which, outside of climate science, would be considered absurd.

Let us hope this novel protocol of testing climate models against available evidence catches on.

The paywalled study is available here.


88 Comments
J Cuttance
May 2, 2020 3:48 pm

News flash: The low-end models are rubbish too

John Tillman
Reply to  J Cuttance
May 2, 2020 5:29 pm

Except for the very lowest-end (i.e., closest to observed reality) Russian model. All the dozens of others should be tossed. But of course they won’t be, because CLIMATE CHANGE EMERGENCY!!!

BC
May 2, 2020 5:02 pm

I have read of a couple of ‘hindcasting’ analyses that have demonstrated the, at best, ‘shortcomings’ in the usual climate models. For example:
https://www.washingtonexaminer.com/opinion/op-eds/the-great-failure-of-the-climate-models
I recall another from a few years ago – I think it was reported on Jo Nova’s blog – but I can’t find it and don’t appear to have kept a copy of it. But, when searching for it, I found this in my archive:

In a 2009 interview with FlightGlobal, the late Joe Sutter, former Boeing 747 chief engineer, cautioned against over-reliance on computer-assisted design tools in aircraft development. “There should not be an over-emphasis on what computers tell you, because they only tell you what you tell them to tell you,” he said.

https://www.flightglobal.com/opinion/opinion-what-aircraft-designers-should-learn-from-joe-sutter/121608.article
Very droll!

May 2, 2020 5:16 pm

Most fields of science don’t accept a model unless it has been rigorously validated against available data, but climate science is different;

I remember seeing an episode of Modern Marvels on the History Channel years ago.
In the episode, a car company used computer models to design bumpers that would stand up to a crash.
When they got a model with the desired results, they didn’t put it straight into production.
They built prototype bumpers, crashed them, and compared the results of the computer model against reality.
The computer models saved them money in design and research, but their output wasn’t mistaken for reality. How the prototype performed was the reality that production would be based on.

peterg
May 2, 2020 5:53 pm

I am doing a bit with SPICE, the electronic circuit modelling program. I am always impressed at how well it predicts the DC voltage levels in transistor circuits. Some computer models mostly work.

Clyde Spencer
Reply to  peterg
May 2, 2020 8:56 pm

peterg
As a general rule of thumb, properly constructed deterministic models provide results that are superior to stochastic models. The problem is, not all systems lend themselves to deterministic formulations.

Reply to  peterg
May 3, 2020 6:18 am

DC is pretty easy: there are no frequency-related terms and you don’t even need calculus to solve it. Now do AC circuits, especially at RF frequencies. When you get the circuit you want and actually build it, the fun begins. Parasitics, varying permeability, etc. Models can only get you to a starting point, not the final solution.
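(For what it’s worth, the “DC is pretty easy” point can be seen in a back-of-envelope bias calculation; the circuit values below are purely illustrative, not taken from any comment above.)

```python
# Back-of-envelope DC bias point for a common-emitter BJT stage with a
# resistive divider on the base. All component values are illustrative.
VCC = 12.0             # supply voltage (V)
R1, R2 = 47e3, 10e3    # base divider resistors (ohms)
RC, RE = 2.2e3, 1.0e3  # collector and emitter resistors (ohms)
VBE = 0.7              # assumed base-emitter drop (V)

# Ignore base current (stiff-divider assumption): simple arithmetic, no calculus.
Vb = VCC * R2 / (R1 + R2)   # base voltage
Ve = Vb - VBE               # emitter voltage
Ie = Ve / RE                # emitter current, roughly the collector current
Vc = VCC - Ie * RC          # collector voltage

print(f"Vb = {Vb:.2f} V, Ve = {Ve:.2f} V, Ic = {Ie * 1e3:.2f} mA, Vc = {Vc:.2f} V")
```

No frequency terms appear anywhere; the AC/RF case is where parasitics and layout effects force you back to prototypes.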

tom0mason
Reply to  Jim Gorman
May 3, 2020 12:50 pm

Yes, Jim Gorman,
It is the tool we have and, as you say, a starting place.
Then, later, RF circuits can become unstable again when environmental factors (sometimes only small changes in temperature, vibration, dust accumulation, and damp) interfere with all those carefully constructed mathematical parameters of the components, cables, connectors, and PCB materials.

May 2, 2020 5:57 pm

Some of the latest climate models provide unrealistically high projections of future warming

That statement implies that some climate models provide realistically high projections of future warming.

How does anyone decide that some projection is realistically high? Is there a committee that decides? A consensus of climate casuists?

How does one judge any given projection is realistic at all? There is no adequate physical theory of the climate available to make any estimate.

Critical thinkers are at a premium in consensus climatology. One may analogize that the bloody-minded CO2-monotheists have driven off all the free-thinking heretics.

These people live in a science-abusive cloud-cuckoo land.

Clyde Spencer
Reply to  Pat Frank
May 2, 2020 8:58 pm

Come on, Pat! You’re just being realistic. 🙂

Reply to  Clyde Spencer
May 2, 2020 9:36 pm

Careful, Clyde. No one expects the Spanish Inquisition. 🙂

John Bruyn
May 2, 2020 5:59 pm

There are 4 major points climate modellers need to consider very seriously:
1. The equatorial speed of Earth’s rotation makes all gases circulate through the atmosphere vertically and act as surface cooling agents;
2. CO2 in polar ice cores is lower than in the tropics, as snow itself has minimal CO2 if any, which means that the air trapped in firn is from the warmer seasons only and gives a false impression if taken as an annual value;
3. Earth’s orbit and orientation with respect to the sun vary all the time, affecting the sun’s illumination of the poles;
4. Anthropogenic CO2 additions to the trillions of tonnes circulating through the atmosphere and sequestered annually by photosynthesis and by precipitation (as H2CO3) cannot be cumulative and, if anything, would have a net global cooling effect.

John Shotsky
Reply to  John Bruyn
May 2, 2020 7:05 pm

There is one more thing about CO2…
It is ALL emitted at the surface, and it is ALL absorbed at the surface. It has a higher concentration at the surface for those very reasons. To call it ‘well mixed’ is plain wrong. Most of it is at the surface because the atmosphere is thickest at the surface. So, well-mixed is a misnomer. It doesn’t ‘rain’ CO2, and CO2 isn’t lighter than the other major constituents of the atmosphere – in fact, it is heavier, which is why it settles and puts fires out. So, to claim that there is some ‘blanket’ of CO2 ‘out there’ that traps heat is plain disingenuous, or is claimed by those that have no clue about gases.

Gerald Machnee
May 2, 2020 6:40 pm

May be overly sensitive to CO2?????????

Todd Peterson
May 2, 2020 7:50 pm

Eric, Your last sentence made Richard P Feynman PROUD.

Steven Mosher
May 3, 2020 1:04 am

“University of Michigan researchers have done the unthinkable, and checked climate model predictions against available paleo-climate data to see if the predictions are plausible.”

It is done all the time.

One problem is this.

1. Your paleo “data” is not readily comparable, so you have to turn it into temperatures. You know, tree rings into temperature. This is a MODEL.
2. Your GCM puts out data. This is not observational data. It too is a MODEL.

So you compare the two models of temperature: one statistically derived from proxies, one physically derived from first principles.

Comparing two models is hard. Long ago I was sitting in an AGU session that was discussing how to compare models with paleo data. The question was raised: if the paleo model of rainfall doesn’t match the GCM model of rainfall, which is correct?

Anyway, comparing models to paleo data is nothing new. There is a whole project devoted to it:
PMIP

https://pmip.lsce.ipsl.fr/

I do wish authors of posts would check the actual activities that people engage in.

https://pmip.lsce.ipsl.fr/about_us/history

The Paleoclimate Modelling Intercomparison Project (PMIP) emerged from two parallel endeavours. During the 1980s, the Cooperative Holocene Mapping Project showed the utility of combining model simulations and syntheses of paleoenvironmental data to analyse the mechanisms of climate change. At the same time, the climate-modelling community was becoming increasingly aware that responses to changes in forcing were model dependent. The need to investigate this phenomenon led to the establishment of the Atmospheric Modelling Intercomparison Project (AMIP) – the first of a plethora of model intercomparison projects of which PMIP (and CMIP1) are part.

The specific aim of PMIP was, and continues to be, to provide a mechanism for coordinating paleoclimate modelling and model-evaluation activities to understand the mechanisms of climate change and the role of climate feedbacks. To facilitate model evaluation, PMIP has actively fostered paleodata synthesis and the development of benchmark datasets for model evaluation. During its initial phase (PMIP1), the project focused on atmosphere-only general circulation models; comparisons of coupled ocean-atmosphere and ocean-atmosphere-vegetation models were the focus of PMIP2.

In PMIP3, project members are running the CMIP5 paleoclimate simulations and will lead the evaluation of these simulations. However, PMIP3 will also run experiments for non-CMIP5 time periods and will be coordinating the analysis and exploitation of transient simulations across intervals of rapid climate change in the past. PMIP also provides an umbrella for model intercomparison projects focusing on specific times in the past, such as the Pliocene Modelling Intercomparison Project (PlioMIP), or on particular aspects of the paleoclimate system, such as the Paleo Carbon Modelling Intercomparison Project (PCMIP).

PMIP membership is open to all paleoclimatologists, and we actively encourage the use of archived simulations and data products for model diagnosis or to investigate the causes and impacts of past climate changes.

Quoted from Braconnot et al, “Evaluation of climate models using palaeoclimatic data”, Nature Climate Change 2, 417-424 (2012), doi:10.1038/nclimate1456

Steven Mosher
Reply to  Eric Worrall
May 3, 2020 5:06 am

Your claim:

“University of Michigan researchers have done the unthinkable, and checked climate model predictions against available paleo-climate data to see if the predictions are plausible.”

It is not unthinkable. They have done this for a long time.
Now you add the quibble… but Eocene.

You didn’t check:

https://www2.atmos.umd.edu/~dankd/Eocene.html

https://www.ncdc.noaa.gov/global-warming/early-eocene-period

https://www.ipcc.ch/site/assets/uploads/2018/02/WG1AR5_Chapter05_FINAL.pdf

Now I ask you.

Here is your claim:

“University of Michigan researchers have done the unthinkable, and checked climate model predictions against available paleo-climate data to see if the predictions are plausible.”

How could it be unthinkable when it has been done before and published years ago?

MAYBE you should just report the facts and not try to color them with false claims.

Reply to  Eric Worrall
May 4, 2020 2:29 pm

So just what are they PIMPing?
(Sorry. Had to be said.)

CheshireRed
May 3, 2020 4:08 am

High sensitivity has always been the Achilles heel of climate models, with the only surprise being that climate sceptics haven’t challenged alarmists’ projected warming claims anywhere near strongly enough.

Geological records show higher CO2 in the past, yet there was no ‘runaway warming’. Have the laws of physics changed or something? Erm…nope. That nails the ‘tipping points’ / positive feedbacks / amplification / high forcing nonsense. Neither proxy data nor today’s observations support the theory (if it were true we wouldn’t be here now), so ‘runaway warming’ is falsified right there.

Way past time AGW nonsense was put to bed forever.

Steven Mosher
May 3, 2020 5:11 am

“Most fields of science don’t accept a model unless it has been rigorously validated against available data, but climate science is different; ”

Models are validated against a SPECIFICATION.

For example, my specification can be “model the temperature of X to within 2 K of the measured temperature.” A model that was off by more than 2 K from the actual would fail validation. One that was off by 1.9 K would pass validation.

And sometimes the spec is rather easy to hit. The model shall be more skillful than a naive forecast.
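(A minimal sketch of what validating against such a specification could look like, in Python; the 2 K tolerance, the naive mean-forecast baseline, and every number below are illustrative assumptions, not taken from the study or from the comment above beyond what it describes.)

```python
import numpy as np

def passes_tolerance_spec(modeled, observed, tolerance_k=2.0):
    """Spec: every modeled temperature must be within tolerance_k of the observation."""
    worst = np.max(np.abs(np.asarray(modeled) - np.asarray(observed)))
    return worst <= tolerance_k, worst

def beats_naive_forecast(modeled, observed):
    """Looser spec: the model must have lower RMSE than a naive forecast
    that simply predicts the mean of the observations."""
    modeled, observed = np.asarray(modeled), np.asarray(observed)
    rmse_model = np.sqrt(np.mean((modeled - observed) ** 2))
    rmse_naive = np.sqrt(np.mean((observed.mean() - observed) ** 2))
    return rmse_model < rmse_naive

# Hypothetical proxy-derived temperatures and model output (K).
observed = [291.2, 293.5, 295.1, 296.8]
modeled  = [292.9, 294.1, 296.8, 298.6]

ok, worst = passes_tolerance_spec(modeled, observed)
print(f"2 K spec passed: {ok} (worst error {worst:.1f} K)")
print(f"More skillful than naive forecast: {beats_naive_forecast(modeled, observed)}")
```

The point being that “validated” is always relative to whichever tolerance the spec writer chose.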

sycomputing
Reply to  Steven Mosher
May 3, 2020 10:11 am

Models are validated against a SPECIFICATION.

That’s interesting. Is this regardless of industry?

Mark Pawelek
May 3, 2020 5:23 am

This falsification of CESM2 will probably find modelers tweaking their model to make CESM3, a model which will equal CESM2 in BS.

Dr Roger Higgs
May 3, 2020 12:41 pm