Agreement paves way for U.S. to accelerate use of weather, climate model improvements
February 7, 2019
The United States is making exciting changes to how computer models will be developed in the future to support the nation’s weather and climate forecast system. NOAA and the National Center for Atmospheric Research (NCAR) have joined forces to help the nation’s weather and climate modeling scientists achieve mutual benefits through more strategic collaboration, shared resources and information.
The organizations recently signed a Memorandum of Agreement establishing a new partnership to design a common modeling infrastructure that will be transparent and easy to access and use by public and private researchers, including academia and industry. By leveraging efficiencies and synergies, reducing duplication of effort, and creating shared model repositories, future research advances will more quickly benefit the public through better weather and climate forecasts.
“Historically, different architectures for developing weather and climate models across the public and private sector created challenges for implementing the very best systems quickly,” said Neil Jacobs, Ph.D., NOAA assistant secretary of commerce for environmental observation and prediction. “This new framework streamlines the entire process and gives both researchers and forecasters the same tools across the weather enterprise to accelerate the development of forecast models.”
The agreement establishes the governance to allow NOAA and NCAR to prioritize and coordinate existing and ongoing investments. It does not replace existing governance structures or commit new funding for collaborative work.
The agreement marks a fundamental shift toward community modeling, a concept that will enable the entire weather enterprise to accelerate the transition of new approaches from research into operations. It also allows NOAA, with its partners, to transition to the Unified Forecast System (UFS), a community-based, coupled, comprehensive weather and climate modeling system.
NCAR brings considerable expertise to the partnership, as its scientists have worked with the research community for many years to develop community weather and climate models.
“By combining NCAR’s community modeling expertise with NOAA’s excellence in real-time operational forecasting, this agreement will accelerate our ability to predict weather and climate in ways that are vital for protecting life and property,” said Antonio Busalacchi, president of the University Corporation for Atmospheric Research, which manages NCAR on behalf of the National Science Foundation. “This will enable the nation to produce world-class models that are second to none, generating substantial benefits for the American taxpayer.”
Additionally, NOAA is taking steps to establish a new Earth Prediction Innovation Center, made possible by the recent reauthorization of the Weather Research and Forecasting Innovation Act of 2017. The virtual Center will enable the research community to develop new and emerging model technology that can be quickly transitioned into forecast operations at NOAA’s National Weather Service. The operational global Earth system models will be made available to the research community to support scientific and research work.
“The Earth Prediction Innovation Center, UFS, and the joint NOAA/NCAR agreement are critical elements that will position the U.S. to regain its standing as a leader on the international earth-system modeling stage. The improved modeling capability will improve our life-saving watches and warnings,” Jacobs added.
With a new stage set for community modeling, NOAA is poised to upgrade the Global Forecast System (GFS) in the months ahead with the addition of a new dynamical core, the Finite-Volume Cubed-Sphere (FV3). The FV3-based GFS will be the first NOAA application derived from community modeling and will improve forecast accuracy.
Together, NOAA and NCAR are working with partners across the weather and climate modeling enterprise to deliver the best products and infrastructure that enable forecasters to save lives and protect property nationwide.
HT/Yooper
They should stop focusing on delivering the “best” products and instead develop better products. The best now are really not very good.
Short on details.
Indeed, but it provides insight into the properties of individual knowers, has provided a means to model complicated scenarios involving groups of knowers and has improved our understanding of the dynamics of inquiry.
What a load of codswallop.
I couldn’t tell what is being done or improved. It is typical bureaucratic gobbledygook.
Unless they’re starting from scratch, they’re wasting their time. The 100+ GCMs we already have are just variations on a basic theme, but the basic theme has multiple flaws. That’s why the projections of the models are in close agreement with each other as they continue to drift farther and farther from reality.
actually, they are not in close agreement.
next, they continue to track reality within tolerable limits (±15%).
name a flaw. cite the model and explain the better implementation you have.
you won’t.
my guess is you never looked at a single line of GCM code.
Actually it reads to me as bureaucratic talk for combining two departments into one, trimming manpower and cost.
Might as well get everyone on the same page…
…we’re all going to die
Yes, the new framework has been streamlined. You no longer need to provide any input data and you always get the same output: “It’s worse than we thought. Stop burning fossil fuels. We need more socialism!”
‘The United States is making exciting changes to how computer models will be developed in the future to support the nation’s weather and climate forecast system.’
Is this a political move to give weather forecast credibility to the ‘climate forecast system,’ whatever that is?
Let me give you the climate forecast for the next 50 years.
The Sahel will continue greening.
No other climate regions will change.
That’s it. No ‘climate forecast system’ needed.
Right – this is an attempt to give credibility to climate modeling that it doesn’t deserve. You only need to wait hours before you know whether your weather models were right or not. There is a difference between weather and climate or so we’ve been told.
Taxpayers’ money being used for the left’s propaganda generation: “This will enable the nation to produce world-class models that are second to none, generating substantial benefits for the American taxpayer.” Same as “If you like your health care plan, you can keep it.” And when they are done with this project, does that mean the code will be public domain? NIST did a STEP project; the code is at github…
Since they can’t even hindcast without making adjustments, why are we wasting time on this?
Modelers believe their models, even when they are not verified or shown to be way off base. Modelers make good money producing models and using techno gibberish to explain 1) why they are not predictive and 2) how they can fix it with more time and money. All models are wrong, some are useful.
“Leveraging efficiencies and synergies,” tells you all you need to know about the bureaucracy behind this project.
Waste more taxpayer money to make wrong predictions?
That’s what goobermints are for.
Wasting money.
“Politics is the art of looking for trouble, finding it everywhere, diagnosing it incorrectly and then applying the wrong remedies.”
– Groucho Marx
Totally off topic, does anybody have any views on David Malpass as head of the World Bank?
It is hard not to be cynical about this sort of announcement.
it’s easy to be cynical.
hard to be authentically skeptical, since that requires domain expertise.
What is clear is there will be lots of meetings to attend with the new partnership, all on the taxpayer’s dime 🙂
Mr Mosher, it is a tacit admission that ALL PREVIOUS MODELS ARE JUNK.
I thought the science was settled years ago.
It was, but there are more meetings and lunches to be had.
I enjoy reading about the high-quality US reference station network and how changes there compare to the overall reported US data. What is the latest on them?
In the past, NOAA and NCAR seem to have paid little or no attention to that information–in spite of the fact that the network was established to meet the highest standards and to minimize influence of UHI on reported warming.
You and I both know why they don’t use that data: it does not give them the answer they and their masters are seeking.
“…a community-based, coupled comprehensive weather and climate modeling system…” etc., etc., etc.
Shouldn’t NOAA be required to write in English using meaningful words?
They’ll probably write in C, C++ and Python.
This is why Python was named after Monty Python. The Ministry of Silly Walks has morphed into GISS, NOAA, et al.
Public Service speak.
Never use 3 words when 100 will do.
Refer to Sir Humphrey Appleby in Yes Minister.
Should have included this.
https://m.youtube.com/watch?v=8keZbZL2ero
Retired,
https://www.google.com/search?client=ms-android-samsung&q=developer+of+c%2B%2B+says+users+of+c%2B%2B+don%27t+understand+c%2B%2B&spell=1&sa=X&ved=2ahUKEwjf_beJmq_gAhUJNOwKHaihCUYQBSgAegQICxAC&biw=360&bih=560
Johann,
Thanks for this:
“developer of c++ says users of c++ dont understand c++”
I can believe that, but it is, I suspect, true of many computer languages.
I tried learning to program in C++ more than thirty years ago and just couldn’t get it. I thought I was just too dumb (which actually may be true), but regardless, it’s consoling to learn others had as much trouble and hated it as much as I did.
You ever tried “Forth”?
Retired, C++ says:
1. a car is red.
2. that car is a pickup.
3. that car is a 4-wheel drive.
__________________________________________
1. is the programming language C.
2. is the programming language C+. Never heard of it.
3. is the programming language C++.
__________________________________________
Users don’t understand C++. Astounding.
C++ is object-oriented computing: you design objects that model real-world objects virtually on a computer.
__________________________________________
C++ programmers aren’t able. Why believe CAGW modelers are able?
We could surely use better models. Much, much better. Maybe even adequate.
There are some disturbingly ignorant remarks within this thread. Weather models are not climate models. Weather forecast models exist now that are more than adequate for providing highly accurate 3-day forecasts, mildly accurate 5-day forecasts, and useful trend indications out as far as 10 days.
They utilize almost no ground station input. They’re overwhelmingly dominated by automated digital real-time space systems and multispectral sensors, which are down-linked and network-connected and can thus pipe millions of observations into every computer run, several times a day. Then come sources like digital ocean data, aircraft sensors, balloon sensors, etc. A tiny component of the data (almost nothing) comes from the ground stations.
Accurate weather models are one of humanity’s more significant accomplishments. So what’s with all the caustic remarks about them?
“Accurate weather models are one of humanity’s more significant accomplishments.”
What exactly do you mean by “accurate?” And for what time period: a day, a week, a month, a year, a decade?
Here’s an example: the solar eclipse last year. You could have booked a flight to Oregon six months in advance and been absolutely certain there was going to be an eclipse that day at that time. On the other hand, suppose you wanted to watch it from the shoreline. Could any weather forecaster have guaranteed a clear morning a week in advance so you could watch the approaching shadow coming in over the ocean?
The Earth’s weather is a “chaotic dynamical system”, and these are known to display “exponential divergence”: given two close initial conditions, the distance between the solutions evolved from them grows exponentially in time.
Forty years ago in his textbook on dynamical systems, the mathematician V.I. Arnold showed that even for a grossly simplified weather system (only 2-D winds on a perfect sphere), it would take 100-significant-digit initial data to generate a 1-significant-digit 30-day forecast (and that is as a purely mathematical problem, without any allowance for round-off error and the other things that happen when you treat it as a computer problem).
FWIW
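To see that divergence concretely, here is a minimal sketch using the Lorenz (1963) system, the classic toy model of chaotic flow. It is an illustration only, not any operational weather model; the parameters and step size are just the textbook defaults:

```python
# Minimal sketch: exponential divergence of two nearby trajectories in the
# Lorenz (1963) system. Toy illustration only -- not an operational model.
import numpy as np

def lorenz(v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz equations at state v = (x, y, z)."""
    x, y, z = v
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(v, dt):
    """One classical fourth-order Runge-Kutta step of size dt."""
    k1 = lorenz(v)
    k2 = lorenz(v + 0.5 * dt * k1)
    k3 = lorenz(v + 0.5 * dt * k2)
    k4 = lorenz(v + dt * k3)
    return v + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

dt = 0.01
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-10, 0.0, 0.0])   # second run differs by 1e-10 in x

for step in range(2501):
    if step % 500 == 0:
        print(f"t = {step * dt:5.2f}   separation = {np.linalg.norm(a - b):.3e}")
    a, b = rk4_step(a, dt), rk4_step(b, dt)
```

Run it and the initial 1e-10 separation grows by roughly a factor of e every ~1.1 time units (the leading Lyapunov exponent of this system is about 0.9) until it saturates at the size of the attractor.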
Have you not been paying attention? The caustic remarks aren’t about weather models. The caustic remarks are about computer game climate models, which are being used to project doom and gloom decades out and as a bludgeon to try to institute all manner of societal change.
This announcement wasn’t just about weather models.
Exactly. Most of us here love weather models and weather scientists. Weather models are generally very accurate and weather scientists are real scientists.
But we despise climate models and climate scientists. Climate models are just sciency alarmist porn, and most climate scientists are no better than witch doctors.
If a weather forecast model is wrong, it will be obvious pretty quickly. And if it’s a critical forecast, such as where and how strong a hurricane will be when it makes landfall, they can fine-tune the forecast as more recent data comes in.
Climate forecast models? They’re already off target just a few decades out. That’s obvious.
Yet some want us to spend trillions of dollars and surrender freedoms based on them.
Genuine climate scientists can use them as a tool (one that needs correcting) to better understand.
Climate “political scientists” use them as a club to better subdue the opposition.
Yes, but…
The WSM and WDM families of microphysics modules in WRF (and other models) simulate condensation/freezing/evaporation/sublimation for clouds, and gravitational settling for rain, snow, etc. (but not updraft effects; those are handled elsewhere). They exhibit a variety of misbehaviors:
1) The settling scheme exhibits convergence: as a parcel of rain falls, it can become more concentrated rather than diffusing. That’s on the “no-no” list for transport algorithms.
2) The effect of (1) is so bad that the “top” of a falling rain parcel can and does fall below the “bottom”, leading to negative rain concentrations, which the algorithms subsequently zero out. This violates the mass conservation these schemes claim to have.
3) The fall-velocity-interpolation algorithm in these modules can and does cause the gravitational settling velocity to sometimes be negative (upwards). Physically speaking, rain does not fall upwards.
Fix these (as I did for my last employer, a private company, not the government), and you get more accurate forecasts. Fix some other things as well, and you end up with a 40% faster (as well as more accurate) computer code. So much for NCAR computational expertise…
FWIW.
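For anyone curious what a better-behaved settling step looks like, here is a minimal sketch of a first-order upwind (donor-cell) scheme, which is monotone and cannot produce the negative concentrations listed above. It illustrates the general technique only; it is not the actual WSM/WDM code or the poster’s fix, and the grid, units, and time-step limit are simplifying assumptions:

```python
# Minimal sketch of a monotone, first-order upwind (donor-cell) settling step.
# Illustration of the general technique only -- not the actual WSM/WDM code.
# Index 0 is the model top; grid, units, and CFL limit are assumptions.
import numpy as np

def settle_upwind(q, w, dz, dt):
    """Advance rain mixing ratio q one gravitational-settling step.

    q  : rain mixing ratio per layer (kg/kg), index 0 = top
    w  : fall speed per layer (m/s, positive downward); clipped at zero so
         settling can never move mass upward
    dz : layer thickness (m); dt must satisfy dt * max(w / dz) <= 1 (CFL),
         which is what keeps every updated q non-negative
    """
    w = np.maximum(w, 0.0)                 # rain does not fall upward
    flux = q * w                           # downward flux out of each layer
    dq = -dt * flux / dz                   # mass leaving each layer...
    dq[1:] += dt * flux[:-1] / dz[1:]      # ...arrives in the layer below
    return q + dq                          # bottom outflow = surface rain

q = np.array([0.0, 1e-3, 2e-3, 1e-3, 0.0])   # toy rain profile (kg/kg)
w = np.full(5, 5.0)                          # 5 m/s fall speed
dz = np.full(5, 100.0)                       # 100 m thick layers
dt = 10.0                                    # dt * w / dz = 0.5, CFL-safe
print(settle_upwind(q, w, dz, dt))           # never goes negative
```

The donor-cell form conserves column mass exactly, up to the rain that exits the bottom layer as surface precipitation, and diffuses rather than sharpens a falling parcel.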
If what you say is true, then we should be warehousing the output of the weather models and using it to build climate models.
Get rid of the not-fit-for-purpose surface temperature data with its subjective and repeated adjustments to the past.
This would very quickly show whether the error term in climate models converges to zero or grows indefinitely.
Because unless the error term can be shown to converge, the climate models are worthless. In fact they are worse than useless, because they provide a false certainty.
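A minimal sketch of the check the comment above describes: compute forecast error against later observations as a function of lead time and watch whether it stays bounded. The arrays here are synthetic, hypothetical stand-ins for a real forecast archive:

```python
# Minimal sketch of the error-convergence check described above. The arrays
# are synthetic stand-ins for a real archive of forecasts and observations.
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_leads = 200, 10                  # 200 archived runs, 10 lead steps

obs = rng.normal(15.0, 5.0, size=(n_runs, n_leads))
# Hypothetical forecasts whose error grows linearly with lead time:
fcst = obs + rng.normal(0.0, 1.0, size=obs.shape) * np.arange(1, n_leads + 1)

rmse = np.sqrt(np.mean((fcst - obs) ** 2, axis=0))   # error vs. lead time
for lead, err in enumerate(rmse, start=1):
    print(f"lead {lead:2d}: RMSE = {err:5.2f}")
# Bounded RMSE at long leads would support the model; steady growth means
# the error term never converges and long-range projections are worthless.
```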
accurate?
one site is telling me it will be -6/2°C with snow showers, the other is saying 0/5°C and sunny. For Tuesday, just 2 days from now. I guess I should prepare for rain…
To clarify, I would like to see a reliable 100-hour weather forecast. 100-day seems unrealistic. Climate “models” are used to issue 100-year “projections” (wisely, they don’t call them “predictions” because of legal implications). How exactly it is done is a closely guarded secret; apparently a panel decides which model runs are politically correct enough to be included. As a result, these projections are only as good as the panel.
“Climate ‘models’ are used to issue 100-year ‘projections’ (wisely, they don’t call them ‘predictions’ because of legal implications).”
100-year “prognostications” would be more accurate.
I guess that they’ll incorporate the European hurricane model in the common model repository? It seems to work better than the US one.
“… exciting changes to how computer models will be developed in the future …” “… climate modeling scientists achieve mutual benefits through more strategic collaboration, shared resources and information.”
“… a new partnership to design a common modeling infrastructure that will be transparent and easy to access and use by public and private researchers, including academia and industry.”
Accessible to the public? Including skeptics?
Note that in all this PR gibberish, there is no mention of developing new models – they are going to develop a new modeling infrastructure, within a new framework.
And will they use Agile Development, and be CMMI Level 5?
I wonder if there is room in this new initiative to include a truly innovative strategy called model validation. I know some folks in private industry who find it very useful for determining whether their critical models actually resemble real-world processes.
If I am understanding this correctly, the idea is to eliminate the embarrassment that climate models disagree with each other by standardising on a single climate model. Of course there will still be the bigger embarrassment that the real climate refuses to follow the climate model; no matter how hard they adjust the data.
If it doesn’t work, sack the lot of them and give the contract to AccuWeather.
And the good news is, it will still be GIGO
More money for NOAA so they can continue to bastardize the temperature records. This new partnership doesn’t fill me with enthusiasm.
It’s the people behind the models that I worry about. If it were up to me, I would fire them, not give them more money.
Here’s a novel idea: Make their pay commensurate with the accuracy of their models.
I agree that having a single model vs. competing ones is the most efficient way to develop a modeling regime, but it is certainly not the most effective way. Competition usually surfaces the best; a monopoly stifles innovation.
This initiative has the opportunity to develop ‘one source of truth’ that may be wrong, but will be hard to challenge…
If the planned collaboration works like CRU and Met Office……….
My cynic bone is vibrating. “Community modeling” sounds too much like ‘Communist’ modeling. They are trying to make us believe the resulting models will be transparent and available to all but they will still control all the variables….like forcing. Will the variables be up for public discussion, scrutiny, and change or not?
There is an open source, community model out there. I sent in a query and got a reply along the lines of: “this is a volunteer operation, and short-handed. The code is open source here: xxxxxxxx.org. If you want to take a look and propose a change, you can submit it there.”
The existing community model is open source.
you can make any change you like, just don’t expect others to accept it unless it is better.
you won’t make any changes. it requires knowledge and skill.
also, forcings are not a variable. they are an input.
you can change them.
you won’t.
“This will enable the nation to produce world-class models that are second to none…
I would hope that this is driven by at least some bit of embarrassment since so far the Russian model is the only one that comes close to reality. While the current accuracy of the Russian model might only be due to guesswork and luck it is still showing up the other modeling teams, especially here in the U.S., as being too hidebound by their irrationality and confirmation bias.
You’ve done it now, you are being reported to the Mueller team for supporting Russian interference in American climate. Only true American models may be used.
:<)
“This will enable the nation to produce world-class models that are second to none…”
Good, because our current world-class models aren’t second. They’re way worse than that…
Is there a non-verbal warning sign for “Bulls**t in Progress” ?