From the Institute for Basic Science and the “we’re all gonna die” department comes Episode #2971 of model madness via press release:
Earth’s future climate at 9 km worldwide resolution
Global Warming does not affect our planet evenly. Some areas such as the Arctic region or high mountain peaks warm faster than the global average, whereas others, including large parts of the tropical oceans, show reduced temperature trends compared to the mean. The heterogeneity of future rainfall patterns is even more pronounced. To adapt to future climate change, policymakers and stakeholders need detailed regional climate information, often on scales much smaller than the typical resolution (~100-200 km) of climate models used in the reports of the Intergovernmental Panel on Climate Change (IPCC).
A team of scientists from the IBS Center for Climate Physics (ICCP), Pusan National University in South Korea and the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI), Bremerhaven, Germany has achieved an important breakthrough in climate modeling, providing unprecedented insights into Earth’s future climate and its variability. Their research was published in the open access journal Earth System Dynamics.
Utilizing the AWI-CM3 earth system model, a novel iterative global modeling protocol, and two of South Korea’s fastest supercomputers (Aleph at the Institute for Basic Science and Guru at the Korea Meteorological Administration), the researchers have simulated climate change at scales of 9 km in the atmosphere and 4-25 km in the ocean. These extensive computer model simulations offer a more accurate representation of future climate conditions, enabling better planning for climate adaptation.
The AWI-CM3 high-resolution model accurately represents global climate, including small-scale phenomena such as rainfall in mountainous regions, coastal and island climate processes, hurricanes and ocean turbulence (Fig. 1). By resolving more regional details and their interactions with the large-scale atmosphere and ocean circulations, the model demonstrates superior performance compared to most lower-resolution climate models.

A main product of the simulations is a set of detailed global maps of expected climate change (temperature, rainfall, winds, ocean currents, etc.) for an anticipated 1°C of future global warming.
“It is important to keep in mind that Global Warming is spatially quite heterogeneous. For a 1°C global temperature increase, the Siberian and Canadian Arctic will warm by about 2°C, whereas the Arctic Ocean will experience warming of up to 5°C. In high mountain regions, such as the Himalayas, the Andes and the Hindu Kush, the model simulates a 45-60% acceleration relative to the global mean”, says MOON Ja-Yeon from the ICCP, lead author of the study. To ensure broad access to these high-resolution climate projections, the team has launched an interactive data platform, where users can explore future climate change on regional and global scales (Fig. 2). Normalized climate change data for a 1°C Global Warming level can be downloaded and opened directly in the Google Earth application. These data can provide information on expected future changes in climate variables, such as windspeed and clouds, which are relevant for the future deployment of wind or solar farms, respectively.

“Our study also highlights the regional impacts of major modes of climate variability, such as the Madden Julian Oscillation, the North Atlantic Oscillation, and the El Niño-Southern Oscillation, as well as their response to greenhouse warming,” says Prof. Thomas JUNG from the AWI and co-corresponding author of the study. According to the AWI-CM3 simulations, the amplitudes of both the Madden Julian Oscillation and the alternating El Niño and La Niña events will increase in the future, which will lead to intensified rainfall impacts in affected regions. The simulations further indicate an increase in the frequency and intensity of extreme rainfall events (>50 mm/day) in areas such as eastern Asia, the Himalayas, the Andes, Amazonia, mountain-tops in Africa and the east coast of North America, with significant implications for flooding, erosion, and landslides.
“Most global climate models used in the assessment reports of the IPCC are too coarse to resolve small islands, such as those in the western tropical Pacific. These islands are already threatened by global sea level rise. Our new climate model simulations now provide new regional insights into what these regions can expect in terms of changes in ocean currents, temperatures, rainfall patterns and weather extremes. We hope that our dataset will be used extensively by planners, policy- and decision-makers and the public,” says Prof. Axel TIMMERMANN, Director of the ICCP and co-corresponding author of the study.
The study’s findings offer critical information for assessing climate risks and implementing adaptation measures on regional scales.
Journal: Earth System Dynamics
DOI/Link : 10.5194/esd-16-1103-2025
___________________________________________________
Uh huh, show me Permanent Service for Mean Sea Level
Yes, I wonder why some Islands are increasing in size? Being volcanic, I guess they sometimes rise, and sometimes sink. Sea levels are impossible to measure.
No! Mostly shifting tides and storms sometimes increase and sometimes decrease the shorelines. Shifting altitudes above sea level has little to do with it.
It’s the coral reefs that are adjusting to changing sea levels.
Sea levels are difficult to measure, but it can be done. People have been doing it for centuries.
Beware “impossible“.
Thanks for the link. Out of interest I looked up Bunbury, AUS. It showed 40 yrs flat then a rising slope after a gauge relocation in 2001. Coincidence? I’m thinking taking gauge data at face value is no different to thermometers.
The AWI-CM3 high-resolution model accurately represents global climate
I’m sorry, I don’t believe that for a moment.
“These islands are already threatened by global sea level rise.”
Then again, maybe they aren’t…
“This Pacific Island Was Expected to Disappear, But It’s Actually Growing Larger“
https://www.sciencealert.com/pacific-island-nation-expected-to-sink-is-getting-bigger
Islands Are Growing Not Shrinking
https://canadafreepress.com/article/islands-are-growing-not-shrinking
And we all remember what happened to the Maldives. They put the tourists off and then desperately started building new resorts.
Still, they’re on narrative; sort of.
“I’m sorry, I don’t believe that for a moment.”
I don’t believe it for a nanosecond.
Can they predict what will happen 5 minutes from now? Thought not.
Utilizing the AWI-CM3 earth system model, a novel iterative global modeling protocol, and two of South Korea’s fastest supercomputers (Aleph at the Institute for Basic Science and Guru at the Korea Meteorological Administration), the researchers have simulated climate change at scales of 9 km in the atmosphere and 4-25 km in the ocean.
My intent was to include the above in my comment. Here is the comment: “Has anyone tested whether these models actually work ‘… at scales of 9 km in the atmosphere and 4-25 km in the ocean’? Have they compared the model’s forecasts to real weather using a large number of sensors placed at scales of 9 km in the atmosphere and 4-25 km in the ocean?”
(Posted early — before coffee. Apologies.)
I had a look at their specs. No clouds or external influences etc.
https://awi-cm3-documentation.readthedocs.io/en/latest/
It’s hilarious to me that they predict rainfall but ignore clouds. Seems like there should be a correlation between those two quantities.
Google has been providing images of earth for about 20 years… they’ve got a f%(%ng super computer. Hey, the clouds are the white puffy parts that show up really well over water.
Your comment went a-roving.
Bullshit in 4k, is still bullshit.
My understanding is the current IPCC-related climate models implement a 25 km grid with varying altitude slices in roughly the same range.
The article would have been interesting had it excluded the alarmist nonsense.
Using South Korea’s fastest supercomputer means they get the wrong answer sooner.
My understanding is that Dr. Sum Ting Wong was in charge of running that supercomputer simulation.
Another model muddle.
A mucking model muddle….
Can it forecast weather?
Can it boil an egg?
Simulated.
This quote says it all: “We hope that our dataset will be used extensively by planners, policy- and decision-makers and the public.”
It’s obvious their intention is to further increase the scare mongering, not to advance accurate global climate information.
Sounds like another expensive guess-machine running on flawed assumptions.
It’s all cyclical. The Sun is sometimes nearer Earth, sometimes further away. At the moment, the Earth is in a long slow cooling phase.
More better video games.
You call THAT the dooming? THIS is the dooming!
‘Global leaders in benevolence’: 50 per cent of Aussies rely on government for income
Sounds like Scotland
The Aussie press today was in activist heaven, realising that half the population was kept by the other half. They forgot to do simple math.
Ages 0 to 20 – learning, not earning.
Ages 20 to 60 – working, earning
Ages 60 to 80 – retired, not earning.
Half and half. Seems natural, OK?
Geoff S
Not natural. The retirement age is way too young given modern life expectancies.
The pre-industrial retirement age used to be called “life expectancy”.
Life expectancy today in Nigeria is supposedly under age 55. I wonder how their retirement plans work?
Retirement age and pension age are not the same
Current pension age in Oz is 67 and rising each year
I retired at age 53 and did not get the age pension until I was 65 (75 now)
There are also many retirees and pensioners who still either work part-time or who volunteer their time and skills so are still working, albeit for free.
I’m more concerned with who is employing the ‘workers’. I recently heard (unconfirmed) that 1 in 4 in Oz are government employees.
Whatever this model does, it cannot provide a computed regional prognosis for precipitation with less uncertainty than our best present measurements.
https://drive.google.com/file/d/1FpucVfLY-uh76GdAwlHmrlhDZ2WrDK9-/view?usp=drive_link
Source – the plot (global map) is generated from the “error” statistic database found here for May 2025.
https://psl.noaa.gov/data/gridded/data.gpcp.html
Quick summary – the uncertainties on a monthly basis range from 0.2 mm/day to over 2.0 mm/day. Good luck with that model. It is a mistake to think it conveys anything meaningful about the future.
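For anyone who wants to reproduce that range check, a minimal sketch is below. It assumes the PSL page serves the GPCP monthly fields as NetCDF and that an error-estimate variable is included; the filename and variable name here are guesses, so check the actual file listing before running.

```python
# Minimal sketch (assumptions: the GPCP monthly NetCDF includes an
# error-estimate field; "precip.mon.error.nc" and "error" are placeholder
# names -- confirm them on https://psl.noaa.gov/data/gridded/data.gpcp.html).
import xarray as xr

ds = xr.open_dataset("precip.mon.error.nc")   # hypothetical filename
err = ds["error"].sel(time="2025-05")         # hypothetical variable name, mm/day

print("min:", float(err.min()), "max:", float(err.max()))  # compare with the 0.2-2.0 mm/day range quoted above
```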
Thanks, David.
“mistake to think it conveys anything meaningful”.
Seems to me it conveys more of the same old created panic.
The current dressing for word salad is euphemisms for the greatest of anything.
The hottest weather, the fastest wind speeds, the highest glacier melt rate, the most saline seawater, the biggest hailstones, etc, with excessive use of superlatives such as incredible, absolutely, inarguably, near certainly, never anticipated and so on. Even “unprecedented”.
The whole modern preferred tone of writing lacks precise objectivity, and that degrades science. Like copping a B-grade movie when a documentary was wanted. Geoff S
“Never anticipated” sounds like the perfect consensus aka settled science, no?
‘Whatever this model does, it cannot provide a computed regional prognosis for precipitation with less uncertainty than our best present measurements.’
As a ‘radiative transfer model’ (RTM), this model can’t provide any insight into anything related to how energy is actually conveyed from the Earth’s surface to space. Doesn’t matter if they’re ‘utilizing the AWI-CM3 earth system model’ or MODTRAN, if there is an RTM at its core, it means that collisional dynamics are ignored and that other mechanisms of atmospheric energy transfer, e.g., convection and latent heat, are either parameterized or missing completely. In other words, the physics are wrong, albeit awfully convenient for the alarmists and their hired guns / useful idiots in model land.
‘Final Perspective
While the term “greenhouse effect” is commonly used, it oversimplifies the true processes at work. The real maintenance of atmospheric heat involves the slow, upward movement of sensible and latent heat through convection, with greenhouse gases ultimately enabling energy release to space via radiation only where collisional processes are much less frequent. The concept of “trapping” misses the crucial fact that, for most of the atmosphere, energy is not held back by radiative processes alone, but by the slow pace of convective and latent transport compared to the near-instantaneous movement of energy by radiation at the top of the atmosphere.
Energy in the atmosphere is not trapped in the classical sense by greenhouse gases. Convection and latent heat transport, not radiative trapping, primarily maintain atmospheric temperatures in the regions where collisional processes are dominant. This physical perspective aligns with observed atmospheric dynamics and modern atmospheric science.’
https://wattsupwiththat.com/2025/07/20/open-thread-153/#comment-4095744
This is a reason for the use of gradient equations with a time component and an accurate attribution of energy among conduction, convection, and radiation components. As far as I know, we do not have data that is fit for purpose to begin analyzing this. Guesses abound with large uncertainties.
‘As far as I know, we do not have data that is fit for purpose to begin analyzing this.’
Exactly, Jim. And even if we did, we’d then be left trying to model coupled non-linear systems in a Navier-Stokes reality, hence the near-universal ‘acceptance’ of the radiant transfer paradigm within the entire modeling community, comprising everyone from the most radical alarmists to the most conservative luke warmers.
But, but, but, the science is settled!
/sarc
CO2 input, IR transfer function, temperature output is not an energy model.
So, the fancy new “model” at least helps to expose the idiocy of the GAT! The upper part of the NH is expected to warm up to 5 times as much as the “mean”. They just neglect to explain what area (2-5 times as large) will not warm at all to produce the “mean”! Maybe the Tropics are cooling? Or maybe the new model is trash!
GAT is idiocy. Period.
Mean temperature is idiocy. A simple (Tmax + Tmin)/2 is not a representative calculation given the time domain of the temperature changes. It also is bogus for calculating T^4 EMR emissions.
Your points are valid.
A partial explanation:
One has to include the surface area or atmospheric volume that each applies to.
A 5% change related to 10% of the whole is 0.5% of the total while a 1% change to 50% of the whole is also 0.5% of the total.
I believe you are thinking of average.
Mean is defined as the point at which half the data is above and half below. Because of this, every single data point above the mean could increase in temperature by 100C, and it wouldn’t impact the mean in the slightest.
Mean = the sum of all numbers divided by the count
Median = the middle value when numbers are arranged in order
Mode = the number that appears most frequently in the set.
The mean is the arithmetic average. Usually when people talk about the average they are referring to the mean.
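For the curious, the three statistics are a one-liner each in Python’s standard library; a deliberately skewed toy set shows how far apart they can sit:

```python
# Toy example: mean, median and mode of a skewed set of numbers.
import statistics

data = [1, 2, 2, 3, 4, 5, 30]       # one large value skews the set

print(statistics.mean(data))         # arithmetic average: 47/7 ~ 6.7
print(statistics.median(data))       # middle value when sorted: 3
print(statistics.mode(data))         # most frequent value: 2
```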
I guess one should add that often what people are doing is look for one number to represent something important about a set of numbers – observations for instance, with a view to justifying policies..
Any of the three can be misleading. Whether they are, and how they are, depends on the distribution. If you have a long tailed distribution, with a cluster of cases in the middle but a few very large or small ones at the extremes, then you may find that, for a given policy decision, the mean leads you to omit important considerations. Same if you have a lopsided distribution.
So politicians advocating policies based on an average then sometimes find themselves the butt of furious minorities out on the far right or left side of the distribution.
The mean is the arithmetic average given a normalized distribution data population.
In other distributions, the mean is the point with the greatest population density. It also may be the arithmetic average but my studies in statistics are decades old.
My recollection is mode applies when there is no clear distribution or the data population is below a threshold (such as 100).
I am willing to have an education refresh.
The mean is independent of distribution. For a given set of numbers, sum the numbers and then divide by the number of them. For instance, if your set consists of 3,4,5 your formula would be
(3+4+5)/3 = 12/3 = 4
Any other set of three numbers which add up to 12 will also have a mean of 4. As will any set of 100 numbers which add up to 400.
This is why it can be misleading when used to compare populations. If you had used the average age at death (life expectancy) to compare English Victorian cities with rural populations you might have found similar average life expectancy, but very different child mortality, deaths from disease and accidental deaths.
Plotted on a chart with age versus number of deaths you would see very different distributions. But the mean life expectancy would be the same.
[I don’t know if this example is historically correct…]
Your points about sets of numbers with different populations all having the same mean value is exactly why one cannot calculate a GAT.
10 C to 50 C has a max-min average of 30 C
So does 20 C to 40 C.
In terms of radiated emissions, the difference between those two ranges is striking, but unsurprising given T^4.
It’s even worse if you use the daily MID-RANGE value instead of the daily average value. Las Vegas and Miami can easily have the same mid-range temperature while having vastly different climates.
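To put rough numbers on the T^4 point, here is a back-of-the-envelope sketch: plain Stefan-Boltzmann blackbody emission with emissivity 1 and no atmosphere, so purely illustrative, comparing the two ranges above (same 30 C mid-range, different average emission):

```python
# Illustrative only: blackbody emission sigma*T^4 for two temperature ranges
# that share the same mid-range (30 C) but not the same average emission.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emission(t_celsius):
    return SIGMA * (t_celsius + 273.15) ** 4

print("mid-range 30 C emits:", round(emission(30.0), 1), "W/m^2")
for lo, hi in [(10.0, 50.0), (20.0, 40.0)]:
    avg = (emission(lo) + emission(hi)) / 2   # crude two-point average
    print(f"{lo}-{hi} C: two-point average emission = {avg:.1f} W/m^2")
```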
One thing is clear (I did go back to refresh a bit). In a normal distribution, the mean, the median, and the mode are identical. The normal distribution is symmetrical about the mean so this makes sense.
There are several other types of distributions, but I only have so many minutes each day to research stuff. I still work.
It is tough to determine distributions with so few measurements. A monthly average has ~30 data points to examine in a histogram. Transition months make it even harder. Annual averages have only 12 monthly data points. Using Tavg ((t1+t2)/2) hides even more and should result in a larger variance.
The problem is that it all gets thrown away as information becomes more and more concentrated by averaging averages of averages. No average ever inherits variance from its components.
Anomalies are the worst example. NIST TN 1900 Example 2 computes an expanded experimental standard deviation of the mean for a monthly Tmax average as ±1.8°C. Even assuming a monthly baseline has 0 uncertainty, that means an anomaly should be “x ±1.8°C”. Does anyone other than CAGW climate scientists think that uncertainty in a month just becomes zero?
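For readers who have not seen NIST TN 1900 Example 2, the arithmetic is short. The sketch below follows the same recipe (standard deviation of the mean times a Student’s t coverage factor) but with invented daily values, not the NIST data, so treat the numbers as illustration only.

```python
# Sketch of a TN 1900 Example-2 style calculation: the measurand is the
# monthly mean of daily Tmax, with a 95 % expanded uncertainty. The daily
# values are made up for illustration; they are not the NIST example data.
import math
import statistics
from scipy import stats

tmax = [25.8, 26.1, 24.9, 27.3, 25.0, 26.8, 24.2, 25.5, 26.6, 27.1,
        25.9, 24.7, 26.3, 25.2, 26.0, 27.5, 24.8, 25.6, 26.9, 25.3, 26.2]

n = len(tmax)
mean = statistics.mean(tmax)
u = statistics.stdev(tmax) / math.sqrt(n)   # standard uncertainty of the mean
k = stats.t.ppf(0.975, df=n - 1)            # coverage factor (~2.09 for 20 dof)

print(f"monthly mean Tmax = {mean:.2f} C +/- {k * u:.2f} C (95 %)")
```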
“half of X is below average…” Given a general distribution that’s usually roughly true. But if the data set is skewed high or low, more or less than half can be above or below average.
Treating average as one-size-fits-all bit the military decades ago. The air force tried to make airplane cockpits that fit any man. They had thousands of men measured and designed aircraft around the average. https://mannhowie.com/average-pitfall Exactly zero men fit all the average parameters.
A similar thing happened with an attempt to find the perfect average woman. A large number of American women were measured to determine average, height, measurements etc. Then the search was on to find at least one woman who fit all those criteria. None could be found so the award was given to one who was as close as they could get. I can’t find an article on that, I did read one online a few years ago.
Good point. In my days building models for a mining refinery process we “calibrated” using the most recent annualised data. Some inputs were averages, others were modes or medians. Fudge factors nudged the final output to match free variables as best as possible. Sensitivity simulations were run on key variables on smaller data sets to test the operating window (valid range). The challenge was where there were inverse relationships: pull one higher and it pushes another lower. This needed to be updated each year, and moreover, rerunning with last year’s actuals was not really that accurate across all key metrics – a couple yes, but not all. For that a smaller standalone model of a single module/unit was used.
A climate model covering a global annual “average” and having a valid operating window for forecasting decades ahead is highly unlikely. A hemispheric or regional model, with a smaller operating window, would stand a better chance than a global average one. Maybe then add the 2 (or 6, 8, whatever) coverage models together and average them at the end to give a global number. After all, it’s simultaneously summer in the NH and winter in the SH.
Know it or not, you just accurately described what occurs in creating a measurand’s value from a functional relationship of several input quantities.
The inverse relationship you describe is evidence of autocorrelation, which also adds to uncertainty. The relationship between NH and SH is an autocorrelation which is never addressed in climate science. Do you ever wonder why the “GAT ± uncertainty” is never, ever quoted? Climate science’s excuse is that with 9000+ stations, dividing by √9000 makes uncertainty negligible. They seem to think how accurately the mean has been calculated is a measure of measurement uncertainty.
In essence, the uncertainty of the measurand is a sum of the uncertainties of the input values.
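A toy illustration of the √9000 point: with genuinely independent errors the spread of the mean does fall like 1/√N, but give the “stations” even a modest shared (correlated) error component and the shrinking stops. Everything below is synthetic, so it only illustrates the principle, not the actual station network.

```python
# Toy demonstration: spread of the mean of N station errors, independent
# vs. correlated. A shared error component never averages away.
import numpy as np

rng = np.random.default_rng(0)
n_stations, n_trials, rho, sd = 9000, 500, 0.2, 0.5

indep = rng.normal(0.0, sd, size=(n_trials, n_stations))
print("independent errors, sd of mean:", indep.mean(axis=1).std())  # ~ sd/sqrt(N)

shared = rng.normal(0.0, sd, size=(n_trials, 1))                    # common component
corr = np.sqrt(rho) * shared + np.sqrt(1 - rho) * rng.normal(0.0, sd, size=(n_trials, n_stations))
print("correlated errors, sd of mean: ", corr.mean(axis=1).std())   # ~ sd*sqrt(rho), nowhere near sd/sqrt(N)
```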
Hit reply before I saw “michel” wrote something like what I was going to write.
Yeah, you meant “median”.
The real problem is whether the “mean” is the optimum temperature. If the optimum is higher than the mean, then there is no current problem.
Since the absorption bands for CO2 largely overlap the absorption bands for H2O, it stands to reason that places with little water vapor in the air will experience more warming for the same CO2 increase, compared to areas with lots of water vapor in the air.
Anyone who claims that the tropics are going to see much, if any warming from CO2, simply doesn’t understand basic physics. Or is ignoring it completely.
Try the desert. Higher CO2 ppm does not make it warmer, especially at night.
Just wondering how you can accurately model something that is itself just a long-term average of something else.
“Forget all those past models we told you were accurate and used for forecasting. This one is the real deal.”
“providing unprecedented insights into Earth’s future climate and its variability.”
“unprecedented insights”
un·prec·e·dent·ed /ˌənˈpresədən(t)əd/
adjective
never done or known before.
“By resolving more regional details and their interactions with the large-scale atmosphere and ocean circulations, the model demonstrates a superior performance compared to most lower-resolution climate models.”
Drives me nuts.
“unprecedented” -> “incremental”
in·cre·men·tal /ˈiNGkrəˌmen(t)l,ˈinkrəˌmen(t)l/
adjective
relating to or denoting an increase or addition, especially one of a series on a fixed scale.
IF the research had been privately funded OR had done something other than run the same type of simulation on a bigger computer OR reached a general prediction different from the prediction historically advocated by its researchers OR touted accuracy over resolution THEN it might have qualified to be called unprecedented… Though it would still be arguable.
AW’s title “Another Day, Another Model of Future Climate Doom” sums it up in content and sentiment.
From the above article:
You know, when I think about climate science, the first centers-of-excellence in that field that I think of— the “go to” guys, as it were—are the ICCP and the Pusan National University and the Alfred Wegener Institute.
/sarc
But then again, the discussion is about climate modeling, not about real science.
Yes.
“the Arctic Ocean will experience warming of up to 5°C”
IBS
Its BS
Happened in 1922 😉
Read bottom highlighted section.
Then it cooled again.
The IBS Center? What does Irritable Bowel Syndrome have to do with climate physics? 😉
Everything!
Wait. You said physics. Had you said atmospheric physics, you would have gotten a HooRah!
It applies, but more so to climate science and climate scientists.
I wish they would make some kind of useful prediction, such as about how the Northern Hemisphere winter will be this year.
Go check The Farmer’s Almanac which is now on the web.
The most extreme impacts always seem to be in the most remote, inaccessible and least instrumented areas. Coincidence?
“Scientists say…”
“… our most vulnerable populations…”
Therefore we need “climate justice.”
Can this model be run back in time? Suppose we start at 0 AD. Will it show the Roman Warm Period, the Medieval Warm Period, the Little Ice Age, the Dust Bowl in the 1930’s, the cool phase from ca. 1950 to 1970, and the warming up to the present?
Simulation, faking it. Iterative, running the simulation over and over, using the previous fake outcome as the basis for subsequent fakery. Generative, using the fake outcomes to make up additional data from thin air (see the nonexistent weather stations the UK Met Office reports readings from).
Simulations have to start somewhere, preferably with accurately measured data that hasn’t been smoothed, homogenized, cleaned, or manipulated in any way. A representative sample of the data may be used, as long as the selection is an even distribution chosen with zero bias.
If they’re using data from the Climatic Research Unit at East Anglia, I assume it fails all those criteria.
With the data in hand, the simulation has to be designed *without any outcome bias*. Bias designed into a simulation is easy to find by feeding it generated data that should generate an output that is as ‘flat’ as the input. If the outputs of the simulation trend away from null change, then there’s a bias in the simulation.
Can we crowdfund a campaign to re-collect as much historical weather data as possible, then simply plot it all as-found? Don’t bother with trying to make mis-matched data from different areas and times past all fit smoothly together. Just plot the raw data and release it as-is for anyone to use.
That way, anyone making BS statements about past climate can easily be called out on it, because everyone will be able to check their work against the raw data.
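The “flat input” test described above is easy to operationalize. A minimal sketch follows; simulate() is just a stand-in pass-through, since the actual adjustment or model step under test would be dropped in there.

```python
# Minimal sketch of the null-input bias test described above: feed the step
# under audit trend-free data and check that no trend appears in the output.
# simulate() is a placeholder; substitute the adjustment/model step to test.
import numpy as np

rng = np.random.default_rng(1)

def simulate(series):
    return series  # pass-through placeholder

flat_input = rng.normal(15.0, 1.0, size=1200)   # 100 years of monthly noise, no trend
output = simulate(flat_input)

months = np.arange(output.size)
trend_per_decade = np.polyfit(months, output, 1)[0] * 120
print(f"output trend: {trend_per_decade:+.4f} per decade (should be ~0 for an unbiased step)")
```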
But, annotation is critical.
Do not change the values, but when other changes influence the data, annotate those, such as site relocations, A/C installed 3 feet away, etc.
The scientific protocol would require one to end the existing data string and begin an entirely new one. That isn’t done because it would require a statistical analysis to determine if grouping short time data strings is causing spurious trends in the data. It is much simpler to yell “bias” and adjust older recorded data so it can be spliced onto the newer information and make it appear to be 100 years of continuous data gathering.
There are enough “long” records that dropping some and adding some over time should not change the distribution or statistical descriptors of the remaining data one iota. So the excuse “we need more long records” is just garbage. It means the data they do have is garbage to begin with. You can’t fix that by adding *more* garbage data.
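A quick synthetic illustration of the splicing point above: chop a trend-free record into short segments, give each “relocation” a small step offset, splice them end to end, and a trend appears that was never in the underlying record. Made-up numbers, illustration only.

```python
# Synthetic illustration: splicing short segments with step offsets at each
# "relocation" manufactures a trend that is not in the underlying record.
import numpy as np

rng = np.random.default_rng(2)
segments, offset = [], 0.0
for _ in range(5):                                         # five station moves
    segments.append(rng.normal(15.0, 0.5, 240) + offset)   # 20 trend-free years each
    offset += 0.3                                          # each relocation reads a bit warmer

spliced = np.concatenate(segments)
years = np.arange(spliced.size) / 12.0
slope = np.polyfit(years, spliced, 1)[0]
print(f"spurious trend: {slope * 100:+.2f} C per century")
```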
“offer a more accurate representation of future climate conditions”
Really ??
So they ran this model 20 years ago, and it predicted global rainfall patterns, temperature patterns etc through two major El Nino events.. ???
Really !!!!!
Their statement of “a more accurate representation” is total bovex…
They have zero clue about future climate conditions… only a FAKE prophecy.
It is not wise to fool Mother Nature.
These studies are becoming more painful to read.
“Normalized climate change data for a 1°C Global Warming level can be downloaded and opened directly in the Google Earth application. These data can provide information on expected future changes in climate variables, such as windspeed and clouds, which are relevant for the future deployment of wind or solar farms, respectively.”
Nah, the output of these models are not “data” just unverified projections.
“We hope that our dataset will be used extensively by planners, policy- and decision-makers and the public.”
Let’s hope not. Models are not science. They have led us into some of the worst public policy decisions in the history of mankind, wasting trillions of dollars with zero improvement in human health or wellbeing.
Global Warming does not affect our planet evenly.
Sure doesn’t and the null hypothesis was the climate is always changing until a bunch of whitecoats started smoking something they shouldn’t-
Discovery at ‘most dangerous glacier’ sparks joy for climate skeptics
Computers and social media didn’t help and where AI will lead us God only knows.