Meet The Team Shaking Up Climate Models

From Yahoo News, via the Christian Science Monitor

A new team is trying a new approach to climate modeling using AI and machine learning. Time will tell whether this is a positive effort or an extremely complicated exercise in curve fitting. Their goal is regional-scale predictive models useful for planning. Few admit publicly that these do not exist today, despite thousands of “studies” using downscaled GCMs.

“There are some things where there are very robust results and other things where those results are not so robust,” says Gavin Schmidt, who heads NASA’s respected climate modeling program at the Goddard Institute for Space Studies. But the variances push skeptics to dismiss the whole field.

“There’s enough stuff out there that people can sort of cherry-pick to support their preconceptions,” says Dr. Hausfather. “Climate skeptics … were arguing that climate models always predict too much warming.” After studying models done in the past 50 years, Dr. Hausfather says, “it turns out they did remarkably well.”

But climate modelers acknowledge accuracy must improve in order to plot a way through the climate crisis. Now, a team of climatologists, oceanographers, and computer scientists on the East and West U.S. coasts have launched a bold race to do just that.

They have gathered some of the brightest experts from around the world to start to build a new, modern climate model. They hope to corral the vast flow of data from sensors in space, on land, and in the ocean, and enlist “machine learning,” a kind of artificial intelligence, to bring their model alive and provide new insight into what many believe is the most pressing threat facing the planet.

Their goal is accurate climate predictions that can tell local policymakers, builders, and planners what changes to expect by when, with the kind of numerical likelihood that weather forecasters now use to describe, say, a 70% chance of rain.

Tapio Schneider, a German-born climatologist at the California Institute of Technology and Jet Propulsion Laboratory in Pasadena, California, leads the effort.

“We don’t have good information for planning,” Dr. Schneider told a gathering of scientists in 2019. Models cannot tell New York City how high to build sea walls, or California how much to spend to protect its vast water infrastructure.

They simply vary too much. For example, in 2015 in Paris, 196 countries agreed there will be alarming consequences if the planet warms by 2 degrees Celsius, measured from the industrial age. But when will we get there? Of 29 leading climate models, the answer ranges from 20 to 40 more years – almost the difference of a human generation – under current levels of emissions. That range is too wide to set timetables for action, which will require sweeping new infrastructure, everything from replacing fossil fuels to switching to electric vehicles to elevating homes.

“It’s important to come up with better predictions, and come up with them fast,” Dr. Schneider says.

This is funny

And it threatens to ruffle feathers in the climate science world, especially at the established modeling centers, like Dr. Schmidt’s NASA group at Goddard. “I think they have oversold what they can do,” Dr. Schmidt says. Is a new model needed? “They would say yes. I would probably say no.”

Apparently a quite modest group.

The other distinguishing feature, Dr. Marshall notes, is those working on it. “The model is actually less important than the team of scientists that you have around it,” he contends. In fact, the 60 to 70 researchers and programmers in the CliMA group represent a veritable United Nations.

Somebody put a map on the wall at the CliMA house, a converted provost’s home at Caltech, and asked everyone to pinpoint their homes. “There were a lot of needles,” Dr. Schneider says.

Here’s the AI part

A climate model that “learns”

CliMA decided on an innovative approach: harnessing machine learning. Satellite and sensor information is freely available – much of it for weather forecasters. Dr. Schneider envisions “training” their model with the last three decades of data, and then routinely feeding it the latest updates. The model itself could “learn” from the data and calibrate its performance with formulas refined by AI, even as the climate changes.
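To make “calibrate its performance” concrete, here is a minimal sketch of what tuning a single model parameter against observations can look like. The toy energy-balance model, the parameter, and the least-squares fit are illustrative assumptions on my part, not CliMA's actual method (the article notes only that their model is written in Julia).

```python
# Minimal sketch: tune one parameter of a toy zero-dimensional energy-balance
# model against "observations". The model, the parameter, and the use of
# least squares are illustrative assumptions, not CliMA's actual method.
import numpy as np
from scipy.optimize import least_squares

def toy_model(feedback, forcing):
    """Equilibrium temperature anomaly dT = F / lambda for each forcing F."""
    return forcing / feedback

rng = np.random.default_rng(0)
forcing = np.linspace(0.5, 4.0, 30)          # W/m^2, one value per "year"
true_feedback = 1.2                          # W/m^2/K, pretend truth
obs = toy_model(true_feedback, forcing) + rng.normal(0.0, 0.1, forcing.size)

# "Training": find the feedback value that best reproduces the observations.
fit = least_squares(lambda lam: toy_model(lam, forcing) - obs, x0=[2.0])
print(f"calibrated feedback: {fit.x[0]:.2f} W/m^2/K (true: {true_feedback})")
```

Whether a real climate model's parameterizations can be pinned down this way from only about 30 years of data is exactly what several of the comments below question.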

Other issues discussed include the reasons for choosing to program in Julia. To read the rest, go to the full article here.

HT/Clyde Spencer

Joseph Zorzin
January 26, 2021 9:21 am

“196 countries agreed there will be alarming consequences if the planet warms by 2 degrees Celsius, measured from the industrial age”

Gee, I thought climate “science” was done by scientists, not countries.

Reply to  Joseph Zorzin
January 26, 2021 12:30 pm

Intergovernmental is the first word in IPCC. It comes under the UNFCCC under the UNEP. That’s governments all the way down.

Rory Forbes
Reply to  Joseph Zorzin
January 26, 2021 5:26 pm

Oh, dear … you’re out of touch. Apparently science is also done by large lists of “science” organizations and associations. Appeals to authority and “consensus” (numbers) are the new scientific falsification and repetition of experiments. Computer artifacts are the new data.

hunterson7
January 26, 2021 9:21 am

The model algorithms may “learn” but the climate fanatics most certainly don’t.

Rory Forbes
Reply to  hunterson7
January 26, 2021 5:28 pm

I have no idea what it is they believe the computers are “learning”, but it certainly hasn’t much to do with climate or even weather. It’s just GIGO by a more technical name.

Joseph Zorzin
January 26, 2021 9:28 am

“Dr. Schneider envisions “training” their model with the last three decades of data, and then routinely feeding it the latest updates. The model itself could “learn” from the data and calibrate its performance with formulas refined by AI, even as the climate changes.”

What will the model learn? The underlying physics of the countless variables affecting the many climates of a very complex planet? How awesome. Next we can ask AI to explain dark matter and dark energy, how the first life formed, and whether or not there is a God.

climatebeagle
Reply to  Joseph Zorzin
January 26, 2021 11:24 am

They need to split the data into training and validation sets, so really they would have less data to train on. Typically the split is random, but I'm not sure that could be done here, so the AI model will likely be highly dependent on the period chosen for training, e.g. 1990-2010 or 2000-2020.
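For illustration, a minimal sketch (Python, made-up numbers, and an arbitrary 80/20 ratio) of what a chronological rather than random split looks like:

```python
# Chronological train/validation split for time-ordered data, where a random
# split would leak future information into training. Array contents and the
# 80/20 ratio are made up for illustration.
import numpy as np

years = np.arange(1990, 2021)                  # e.g. 31 years of records
temps = np.random.default_rng(1).normal(14.5, 0.3, years.size)

split = int(0.8 * years.size)                  # train on the earlier 80%...
train, val = temps[:split], temps[split:]      # ...validate on the later 20%
print(f"train: {years[0]}-{years[split - 1]}, validate: {years[split]}-{years[-1]}")
```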

Some thoughts on the data needed for AI modelling in general here: https://machinelearningmastery.com/much-training-data-required-machine-learning/

I just don’t think there’s enough modern data available to have a model make long term predictions.

Reply to  climatebeagle
January 26, 2021 12:37 pm

“I just don’t think there’s enough modern data available to have a model make long term predictions.”
Right, exactly. I left off the sarcasm sign.

Pat Frank
Reply to  climatebeagle
January 26, 2021 3:28 pm

Empirically parameterized models are not predictive.

Reply to  Pat Frank
January 27, 2021 5:24 am

empirical:  originating in or based on observation or experience

I’m not convinced that the parameterizations are empirical at all!

But I understand what you are saying. If the parameterizations are not based in theory or actual physical system validation then they can’t actually predict anything.

Reply to  Charles Rotter
January 26, 2021 2:33 pm

Learning AIs work well in some situations, e.g. feeding in symptoms from multiple people leading to a diagnosis of Disease A. An AI can “learn” how to synthesize a diagnosis of Disease A from varied symptoms that may not be the same in each person. That is *MUCH* different from trying to divine physical theories after being trained with inputs and outputs. The AI simply won't be able to tell why Disease A causes the varied symptoms, just that it does.

I should also point out that if some Disease A cases are mis-diagnosed but are still used as training data, the AI won't be able to figure out why the mis-diagnoses happened. It will just fold that input into the “learning”. It's the same thing with feeding manipulated temperature data into a learning AI and telling it that the data should result in Output A. When fed actual data, the AI won't know what to do with it!
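A minimal sketch of that last point, with synthetic data, a 20% mis-diagnosis rate, and a generic classifier (all of them illustrative assumptions):

```python
# A model trained on partly mislabeled examples folds those errors into what
# it "learns". Synthetic data; the noise rate and classifier are arbitrary.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(0.0, 1.0, (1000, 5))              # 1000 cases, 5 "symptoms"
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # true diagnostic rule

y_noisy = y.copy()
flip = rng.random(y.size) < 0.20                 # mis-diagnose 20% of cases
y_noisy[flip] = 1 - y_noisy[flip]

clean = LogisticRegression().fit(X[:800], y[:800]).score(X[800:], y[800:])
noisy = LogisticRegression().fit(X[:800], y_noisy[:800]).score(X[800:], y[800:])
print(f"accuracy trained on clean labels: {clean:.2f}; on mislabeled: {noisy:.2f}")
```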

fred250
Reply to  Joseph Zorzin
January 26, 2021 6:10 pm

The main climate cycles, the AMO and PDO, are about 60-80 years long.

They need to “train” on un-adjusted data over that period, at the very least.

(if there is any “unadjusted” data left!)

Sara
January 26, 2021 9:32 am

“196 countries agreed there will be alarming consequences if the planet warms by 2 degrees Celsius, measured from the industrial age. But when will we get there? Of 29 leading climate models, the answer ranges from 20 to 40 more years…”

Okay, this is getting sillier and sillier.

What part of ‘you can’t control what this planet does – EVER’ is so hard for them to understand?

Just askin’, because I’m looking at 4.5 inches of snow on my front steps and it came out of the doomed Southwest USA (everything is doomed, y’know) and there are forecasts of more later today and tonight. And the birds haven’t even found a good reason to evacuate their cozy warm spots to come to my bird feeding station. I put out a good spread for them, too.

Birds know it’s cold and snowing to beat the band, and don’t worry about warm weather being a problem. They just find other turf to invade if supplies and water run out in their usual territory.

I mean, why on earth do birds have more common sense than all those people with “advanced degrees”?????

RockyRoad
January 26, 2021 9:35 am

So it doesn’t matter that the earth is supersaturated with water vapor and atmospheric CO2 with respect to radiation absorption, or that none of the models include clouds, or that the estimation variance of their GCMs destroys any confidence in their numbers.
Where to begin, where to begin.

January 26, 2021 9:40 am

They had the very best idea and action in India just recently.
Truly, truly horribly and bizarrely, they used it against farmers.
They actually bit the hand that feeds them.

Quote:
“Mobile internet services were suspended in parts of Delhi and some metro stations closed as security forces scrambled to restore order.”
From here

They. Switched. Off. The. Internet.
It was revealed why…
here ##

Could have been my brother in that picture getting his head mashed in by Government.
‘cept the UK Government, even in the late 80’s was so bad, my bro did his own head in because of it. With a hunting rifle.
He didn’t drink or go to the (recently ill-legalised back then) raves, you see.
Maybe that was his mistake.
Government advice now is to drink alcohol and do cannabis, just so that you can endure their relentless meddling, as C.S. Lewis referred to when he spoke of “Omnipotent busybodies”.
Raves are even More Illegal now tho. Bit like farming has become, it wastes The Climate you see.

The power of Magical Thinking eh
Hideous innit, Gavin?

## The Indian Government asserts that the big metal stick in that picture DID NOT CONNECT with the elderly peasant.
Everyone else close to the story and actual event says otherwise.
What is it about turkeys?

Paul Johnson
January 26, 2021 9:45 am

What happens when the new model gives the “wrong” answers?

Jordan
Reply to  Paul Johnson
January 26, 2021 11:48 am

That’s easy, Paul. Nobody can challenge it, because nobody knows how the freaking thing gets its answers. But if you dare question it, you will be accused of Flat Earthism, or the like.

Mark Pawelek
Reply to  Paul Johnson
January 28, 2021 7:22 am

What happens when a tree falls in the forest but no one is there to record it?

This new model will probably be a trick to eliminate all but the right answers by working on some Al-Gore-ithm which only records right answers. You may be sure there’ll be a scam or three in there.

January 26, 2021 9:48 am

Will they tell the AI that CO2 is the bad guy, or will it figure it out for itself?

Jordan
Reply to  Right-Handed Shark
January 26, 2021 11:49 am

CO2 is the Satan Molecule. CO2 knows how to get the right answers. (And don’t forget CO2’s greatest trick of all – making itself invisible!)

January 26, 2021 9:53 am

Before they begin, and to be sufficiently woke, the group needs to publish its Racial Diversity Action Plan and hold training meetings on how they will be avoiding white privilege while addressing Climate Justice. Any science will be an afterthought from then on.

Oh, and they must not forget to wear their Covid face talisman in California, especially after Dementia Joe told them to.

Carlo, Monte
Reply to  Joel O’Bryan
January 28, 2021 7:41 am

Don’t forget the “equity”, absolutely vital.

January 26, 2021 10:09 am

NASA’s respected climate modeling program

HAHAHAHAHAH – good one!

January 26, 2021 10:26 am

Wasn’t there a 1980s documentary that used artificial intelligence to tackle this and came up with the answer 42?

Clyde Spencer
Reply to  Redge
January 26, 2021 5:11 pm

Yes, but they asked the wrong question!

Reply to  Clyde Spencer
January 26, 2021 9:55 pm

Yes, and by then the civilization that received the (incorrect!) answer had devolved back to loincloth & spear technology. We are in deep trouble!
[and I nearly fell off the couch laughing during that scene — priceless!]

Paul Penrose
January 26, 2021 10:35 am

Meet the new model, same as the old models.

Best sung to the tune of Revolution Number 9.

Paul Penrose
Reply to  Paul Penrose
January 26, 2021 12:55 pm

That should be “My Generation”. Damn, now I sound like Joe Biden.

lackawaxen123
January 26, 2021 11:05 am

GIGO … with an AI kicker …

January 26, 2021 11:07 am

The physics of the climate is not in hand. No heuristic AI climate model will be any good, because it will be no more than inferential. New AI models will be just the old models in spangles. The alarm-mongers need new glitters to keep the money and attention flowing.

Contra Gavin, nothing in consensus climate science is “robust.” Nothing. Every bit of it — climate model projections, air temperature record, paleo-temperature reconstructions — is nothing more than a subjectivist narrative decorated with mathematics.

None of it is science. The models are erroneous and unreliable. The air temperature record is riven with systematic measurement error. And the paleo-temperature reconstructions have no distinct physical meaning at all.

All of which is studiedly ignored by climate so-called scientists, who have no idea how to do actual physical science.

From that diagnosis I exclude physical meteorologists.

If you want someone to blame, look no further than the American Physical Society and the American Institute of Physics.

They should have been front-and-center exposing the AGW garbage. But they’ve not been just silent. They’ve been supportive of it. To their everlasting shame. Not to say criminal negligence.

Jordan
Reply to  Pat Frank
January 26, 2021 12:01 pm

Well said Pat.
A favourite of mine is Global Warming Potential. GWP cannot be measured, therefore it cannot be observed. If it cannot be observed, it cannot be tested. And if it cannot be tested, it’s not science.
Just think about all the hoo-hah about methane and the Arctic. GWP is used to frighten the children with the idea that a methane burp would be worse than CO2 by an order of magnitude.
Just try to imagine the mentality of anybody who could make that claim with a straight face. An army of researchers has been unable to reduce the huge Charney CO2 ECS range, which has stood firm for some 30 years. If anything, CO2 ECS has become even more uncertain in that period.
GWP is an untestable way to argue that methane is 10x worse than something that has stubbornly refused to yield to very many attempts to measure it.
You couldn’t make it up!
/rantover

Joel O'Bryan
Reply to  Pat Frank
January 26, 2021 12:10 pm

The training period they pick for the AI to train on would be the key to getting the result they want. If they include the 1910-1940 warming period, which occurred without much of a CO2 rise, the AI would become unstable and produce random results. So they WILL cherry-pick the training period to force the AI to produce a preconceived result.
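A minimal sketch of how strongly a fitted trend depends on the training window; the synthetic series (a ~65-year cycle riding on a slow rise) is made up purely for illustration:

```python
# The trend a model "learns" depends heavily on which window it is trained
# on. Toy series: a ~65-year oscillation on top of a slow linear rise.
import numpy as np

years = np.arange(1900, 2021)
temps = 0.005 * (years - 1900) + 0.15 * np.sin(2 * np.pi * (years - 1900) / 65)

for lo, hi in [(1910, 1940), (1970, 2000), (1900, 2020)]:
    m = (years >= lo) & (years <= hi)
    slope = np.polyfit(years[m], temps[m], 1)[0] * 100   # degC per century
    print(f"{lo}-{hi}: fitted trend = {slope:+.2f} degC/century")
```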

fred250
Reply to  Joel O'Bryan
January 26, 2021 4:52 pm

See my comment below at 4:42.

Reply to  Pat Frank
January 26, 2021 5:18 pm

I guess I don’t see where the proper data is going to come from. If you want an AI to generate the proper relationships, you cannot feed it homogenized, interpolated, modified temperature data. You need to feed it accurate, as-measured data and let it learn what to do with it. Surface temps, both land and sea, are not going to suffice, especially mid-range temps calculated from Tmax and Tmin and tortured into a Global Average Temperature. You will need temps, wind speeds/directions, humidity, etc. at varying altitudes with global coverage. You will need ocean current speed, direction, and temps at multiple depths.

It still appears to me like garbage in, garbage out in a fancier bag.

Clyde Spencer
Reply to  Pat Frank
January 26, 2021 5:24 pm

Pat,
You said, “The air temperature record is riven with systematic measurement error.”

A couple of recent epiphanies: The annual global temperature average has a very large standard deviation (some tens of degrees), based on an estimate from the Empirical Rule. That strongly suggests a non-normal probability distribution. (Actually, I demonstrated that in a previous guest editorial.) Yet the unstated assumption in the manipulation of the data, particularly the increase in precision attributed to the large number of temperature readings, is that the temperature data is Gaussian, which it isn’t.

Related to that is the rationalization that the magic Law of Large Numbers justifies increased precision. However, when using different thermometers with different calibration curves and different errors, mixing temperatures from thermometers of different pedigrees will not improve precision; it will give worse results than depending on just the most accurate and precise instrument.
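A toy illustration of part of that point: a fixed per-instrument calibration bias does not average away, no matter how many readings each instrument contributes (all numbers made up):

```python
# Averaging many readings does not remove per-instrument systematic bias.
# Each "thermometer" gets its own fixed calibration offset; the grand mean
# converges to truth plus the mean bias, not to the true value.
import numpy as np

rng = np.random.default_rng(3)
true_temp = 15.0
biases = rng.normal(0.0, 0.5, 100)           # one fixed offset per instrument
readings = true_temp + biases[:, None] + rng.normal(0.0, 0.2, (100, 1000))

print(f"grand mean of 100,000 readings: {readings.mean():.3f}")
print(f"true temperature:               {true_temp:.3f}")
print(f"mean instrument bias:           {biases.mean():+.3f}")
```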

Pat Frank
Reply to  Clyde Spencer
January 26, 2021 6:28 pm

Completely agree, Clyde (assuming you meant SDs of ‘[several] tenths of degrees’). I’ve found the same problems with thermometers as you describe.

And like you, have seen that the workers assume all the measurement error is random, and averages away. I’ve had email exchanges with both Phillip Brohan and John Kennedy of UKMet about that. They brushed off concerns about non-random error.

What’s even worse is that they extend the same assumption to ship temps. Ships are called “platforms” and they assume each platform has a constant distribution of random error. And then they 1/sqrtN all of it away. Just as for land temps.

The available SST calibration literature doesn’t support that assumption. But as you say, the CLT and LLN are invoked, and the god of bad data smiles upon them. They’ve gotten away with that nonsense for 50 years.

The carelessness is unconscionable. They’re incompetent by design or else by training. There’s no other possibility.

Clyde Spencer
Reply to  Pat Frank
January 26, 2021 8:41 pm

Pat
No, I meant 10s of degrees for a global, average-temperature SD. The Empirical Rule says that with a normal distribution, 99.7% of all samples will fall within +/- 3 SD. That is nearly all the samples — good enough for government work! The annual range of temperatures on Earth from Antarctica to Tunisia is about 230 deg F, maybe more. Dividing 230 by 6 (+/- 3 SD) gives about 38 deg F. Because the PDF is skewed, with a long tail on the cold side, that is probably an overestimate. Even if we go out to 4 SD, that still gives an SD of ~29.

However, the point is that, as an order of magnitude, the SD is tens of degrees rather than tenths, or even whole degrees. So, when the Earth’s average global temp is reported, it should be shown as ~50 +/- 30 deg F [not +/- 0.001 or even +/- 0.01 deg F]. By convention, +/- 1 SD is taken to indicate the general precision of a sample mean. Because of the large range, the precision is low.

The global average is problematic also because the daily average is a mid-range value rather than a true mean calculated from at least hourly values. With probably millions of samples, does anyone try to claim they measure IQs to better than 1 point out of 100?
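In display form, that back-of-envelope estimate (Clyde's own numbers, not a derivation):

$$\hat{\sigma} \;\approx\; \frac{\text{range}}{6} \;=\; \frac{230\ ^{\circ}\mathrm{F}}{6} \;\approx\; 38\ ^{\circ}\mathrm{F}$$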

If you didn’t see it the first time, read my article at:
https://wattsupwiththat.com/2017/04/23/the-meaning-and-utility-of-averages-as-it-applies-to-climate/

Reply to  Clyde Spencer
January 27, 2021 6:04 am

The real problem is that you cannot increase precision with statistics. Precision is controlled by the measurement device and cannot be recovered from any probability distribution. 10,000 measurements of the same thing by a device that measures to the nearest gram WILL NOT allow one to increase the precision to anything less than a gram.

This is why significant figures were developed. Too many people confuse the “error of the mean” with the “precision of the measurement”. The error of the mean only defines the precision of the calculated mean. IOW, as N goes to infinity, the calculated mean gets closer and closer to the true mean. This does not, however, influence the precision of the measurements used. Significant-digit rules must then be applied to round the calculated value to the proper precision.
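For reference, the quantity usually being invoked here is the standard error of the mean,

$$\sigma_{\bar{x}} \;=\; \frac{\sigma}{\sqrt{N}},$$

which shrinks as N grows, while the resolution of each individual measurement stays fixed.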

I have asked many mathematicians who want to argue this what RULE they use to determine the final precision of a mean when the calculation returns an irrational number. No one ever gives an answer. There is none!

I also ask why we spend thousands of dollars on precision micrometers and micro-ohmmeters if simple rulers and ohmmeters would suffice, given that it supposedly takes only a finite number of measurements to reduce the uncertainty to a very small number. Or why certified labs treat precision like a god in their methods and equipment. No one ever answers that one either.

Pat Frank
Reply to  Clyde Spencer
January 27, 2021 9:37 am

Hi Clyde — I see your point. Your SD is what I called ‘magnitude uncertainty’ in my 2010 paper on air temperature. It’s the uncertainty in a mean that arises because of the variability of input magnitudes.

Magnitude uncertainty is readily understandable when all the measurements are in a single location. Global temperatures have a spatial distribution as well as a magnitude distribution, which complicates the meaning.

Chris Essex, et al., wrote a paper in 2007 on exactly your point: Does a Global Temperature Exist?

They say that, “[Global air temperature] arises from projecting a sampling of the fluctuating temperature field of the Earth onto a single number…” just as you indicate.

Their major conclusion is that there is no such thing as a global temperature. It’s a physically meaningless statistic. Also pretty much your conclusion.

My own work just focuses on measurement error, a much more modest target. But at least it’s physically meaningful. 🙂

Clyde Spencer
Reply to  Pat Frank
January 27, 2021 12:20 pm

Thank you for the links, Pat.

Pat Frank
Reply to  Clyde Spencer
January 28, 2021 1:59 pm

Happy to do so, Clyde.

Robber
January 26, 2021 12:46 pm

Gotta love weather models and all their forecasting expertise when my local forecast for tomorrow says 40% chance of rain, possible evening shower. Might as well toss a coin.

Paul of Alexandria
January 26, 2021 3:34 pm

“Machine learning”. Which means that absolutely nobody will have any clue how it gets its results!

Reply to  Paul of Alexandria
January 27, 2021 6:06 am

But, they will know the answer is close to what it was “trained” to provide.

Editor
January 26, 2021 4:29 pm

“Dr. Schneider envisions ‘training’ their model with the last three decades of data.”

This is absolutely preposterous. You can’t ‘train’ a climate model on 30 years of data.

fred250
January 26, 2021 4:42 pm

Models are meant to be “validated” against something.

So a challenge to all the AGW model sympathisers…

Which model shows a peak in the 1940s in the Arctic, followed by a drop of 1.5 – 2ºC to the mid-1960s?

This is what THEIR data shows.

[two charts attached showing Arctic temperature data]

I wonder how long it will be before we get an answer to that one. 😉

Mike Jonas (Editor)
Reply to  fred250
January 26, 2021 6:41 pm

Your question is easily answered. Aerosols have a cooling effect, and the whole of climate is covered by CO2 and aerosols, given a high enough ECS. For the ~1940-70 cooling, they estimated the aerosol content from the observed cooling (there weren’t any aerosol measurements). They then ran their models and were really impressed by how accurate they were. Their accuracy proved they were right. No reason why it shouldn’t work just as well further back in time.

fred250
Reply to  Mike Jonas
January 26, 2021 7:37 pm

The question asked “WHICH MODEL” ?

I’m still waiting. And I expect I will be for a long time !! 🙂

TheMightyQuinn
January 26, 2021 5:02 pm

A new strategy to shake the money tree.

ScienceABC123
January 26, 2021 6:09 pm

Again, the biggest problem with computer models is getting them to match up with reality.

My experience with the local weather reporter suggests his 10-day weather report accuracy drops to 50% at 48 hours; beyond that, it’s a random moving target.

January 26, 2021 6:52 pm

If the models have proved “remarkably accurate” then why is this new process needed?

I read it as: “we have this nailed, so we’ll change how we are doing it.”

?

January 26, 2021 6:52 pm

But how good are the data sources?

Did much come of the BEST database attempt, hopefully free of the jerk who blabbered prematurely?

January 26, 2021 7:55 pm

Meet The Haruspices Prophesying the Climate Future With New Sheep Entrails

Fixed it.

Tom Abbott
January 26, 2021 8:08 pm

From the article: “Their goal is regional scale predictive models useful for planning. Few admit publicly that these do not exist today despite thousands of “studies” using downscaled GCM’s.”

If they want regional-scale temperatures, they should use regional surface temperature chart readings. Unmodified, regional surface temperature chart readings.

Why do you need a model when you have the real thing? Oh, yeah, I forgot.

I have a prediction garnered from the regional surface temperature charts: It was just as warm in the Early Twentieth Century as it is today, which means we don’t have to worry about regulating CO2.

January 26, 2021 9:06 pm

I can do better than climate models.

I have devised a surface temperature regulator and tethered it at 0N, 156E to test it.

I currently have the set point at 30C. When the temperature gets above 30C I put up sun shades. If the temperature is slow to come back down with just the sun shade I can accelerate the rate of cooling by throwing cold water over the surface.

It works quite well. It overshot to 31.5C at the start of the trial, but once I got the control system tuned it worked brilliantly. The core of the control system relies on AI.

The attached chart shows 16 days of operation during initial trials last year.

[attached chart: Temp_Regulation.png]

fred250
Reply to  RickWill
January 26, 2021 9:22 pm

WHAT..

Using water as a COOLANT

Why hasn’t anyone thought of that before? 🙂

IPCC etc. say it causes warming!

Reply to  fred250
January 26, 2021 9:35 pm

Fred
You can see how quickly the temperature drops off once I spray the water over the surface. I keep the water chilled at close to freezing. Sometimes it is actually below freezing but not often.