U.S. Operational Weather Prediction is Crippled By Inadequate Computer Resources

Reposted from the Cliff Mass Weather Blog

U.S. global numerical weather prediction has now fallen into fourth place, with national and regional prediction capabilities a shadow of what they could be.
There are several reasons for these lagging numerical weather prediction capabilities, including lack of strategic planning, inadequate cooperation between the research and operational communities, and too many sub-optimal prediction efforts.
But there is another reason of equal importance: a profound lack of computer resources dedicated to numerical weather prediction, both for operations and research.

The bottom line:  U.S. operational numerical weather prediction resources used by the National Weather Service must be increased 10 times to catch up with leading efforts around the world and 100 times to reach state of the science. 
Why does the National Weather Service require very large computer resources to provide the nation with world-leading weather prediction?
Immense computer resources are required for modern numerical weather prediction.  For example, NOAA/NWS TODAY is responsible for running:

  • A global atmospheric model (the GFS/FV-3) running at 13-km resolution out to 384 hours.
  • Global ensembles (GEFS) of 21 forecasts at 35-km resolution.
  • The High-Resolution Rapid Refresh (HRRR) and Rapid Refresh (RAP) models out to 36 h.
  • The atmosphere/ocean Climate Forecast System model out to 9 months.
  • The National Water Model (combined WRF and hydrological modeling).
  • Hurricane models during the season.
  • Reanalysis runs (rerunning past decades to provide calibration information).
  • The North American Mesoscale model (NAM).
  • The Short-Range Ensemble Forecast system (SREF).

This is not a comprehensive list.  And then there is the need for research runs to support development of next-generation systems.  As the world-leading European Centre for Medium-Range Weather Forecasts (ECMWF) has suggested, research computing resources should be at least five times greater than operational requirements to be effective.
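Why do these requirements balloon so quickly? A rough rule of thumb (an illustrative assumption for this post, not an official NWS figure) is that refining a model's horizontal grid multiplies the number of grid columns by the square of the refinement factor and also forces a proportionally shorter time step, so total cost grows roughly as the cube. A minimal sketch:

```python
# Back-of-envelope cost scaling for grid refinement. Illustrative assumption:
# cost ~ (1/dx)^2 for the horizontal grid, times (1/dx) for the shorter
# time step required for numerical stability, i.e. cost ~ (1/dx)^3.

def relative_cost(dx_old_km: float, dx_new_km: float) -> float:
    """Approximate compute multiplier when refining grid spacing."""
    return (dx_old_km / dx_new_km) ** 3

print(relative_cost(13, 6.5))  # halving 13 km to 6.5 km: ~8x the compute
```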

NY Times Magazine: 10/23/2016


How a Lack of Computing Resources Is Undermining NWS Numerical Weather Prediction
The current modeling systems (some described above) used by the National Weather Service are generally less capable than they should be because of insufficient computer resources.  Some examples:
1.  Data Assimilation.  The key reason the U.S. global model is behind the European Center and the other leaders is that they use an approach called 4DVAR, a resource-demanding technique that involves running the modeling system forward and backward in time multiple times.  Inadequate computer resources have prevented the NWS from doing this (a standard textbook form of the 4DVAR calculation is sketched below).
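For readers unfamiliar with the method: 4DVAR seeks the initial state that best fits both a prior ("background") forecast and all observations collected over a time window. In standard textbook notation (a generic form, not NOAA's specific formulation), it minimizes

$$
J(\mathbf{x}_0) = \tfrac{1}{2}(\mathbf{x}_0 - \mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}_0 - \mathbf{x}_b) + \tfrac{1}{2}\sum_{i=0}^{N}\bigl(H_i(\mathbf{x}_i) - \mathbf{y}_i\bigr)^{\mathsf T}\mathbf{R}_i^{-1}\bigl(H_i(\mathbf{x}_i) - \mathbf{y}_i\bigr),
\qquad \mathbf{x}_i = M_{0\to i}(\mathbf{x}_0),
$$

where $\mathbf{x}_b$ is the background state, $\mathbf{B}$ and $\mathbf{R}_i$ are the background- and observation-error covariances, $\mathbf{y}_i$ are the observations at time $i$, $H_i$ maps model state to observation space, and $M_{0\to i}$ is the forecast model itself. Every iteration of the minimization requires a forward model integration plus a backward (adjoint) integration, which is why 4DVAR demands so much more computing than simpler assimilation schemes.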
2.  High-resolution ensembles.   One National Academy report after another, one national workshop committee after another, and one advisory committee after another has told NWS management that the U.S. must have a large high-resolution ensemble system (at least 4-km grid spacing, 30-50 members) to deal with convection (e.g., thunderstorms) and other small-scale weather features.  But the necessary computer power is not available (a sketch of why member count matters follows below).
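Why 30-50 members rather than a handful? Because forecasters turn the ensemble into probabilities: the fraction of members producing an event is the estimate of its likelihood, and with few members those probabilities come in uselessly coarse steps. A toy sketch with made-up numbers, not real model output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for 40 ensemble members' 24-h precipitation at one point (mm).
members = rng.gamma(shape=2.0, scale=8.0, size=40)

# Probability of exceeding a heavy-rain threshold = fraction of members above it.
threshold_mm = 25.0
prob = float(np.mean(members > threshold_mm))
print(f"P(precip > {threshold_mm} mm) ~ {prob:.0%}")
```

With 40 members the finest probability increment is 2.5 percent; with 5 members it is 20 percent, too coarse to distinguish a marginal threat from a serious one.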

European Center Supercomputer

3.  Global ensembles.  A key capability of any first-rate global prediction center is running a large global ensemble (50 members or more), with sufficient resolution to realistically simulate storms and the major effects of terrain (20-km grid spacing or better).  The European Center runs a 52-member ensemble at 18-km grid spacing.  The U.S. National Weather Service?  21 members at 35-km resolution.  Not in the same league (a rough comparison of the compute behind those two configurations is sketched below).
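To put those two configurations on one scale, apply the same illustrative cost-scaling assumption as above (cost proportional to member count times the cube of the resolution refinement, ignoring vertical levels and forecast length):

```python
def ensemble_cost(members: int, dx_km: float) -> float:
    # Illustrative assumption: cost ~ members * (1/dx)^3.
    return members * (1.0 / dx_km) ** 3

ratio = ensemble_cost(52, 18) / ensemble_cost(21, 35)
print(f"ECMWF ensemble ~ {ratio:.0f}x the compute of the GEFS")  # roughly 18x
```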
I spend a lot of time with NOAA and National Weather Service model developers and group leaders.  They complain continually about how they lack computer resources for development and testing.  They tell me that this resource deficiency prevents them from doing the job they know they could do.  These are good people who want to do a state-of-the-art job, but they can't due to inadequate computer resources.
NOAA/NWS computer resources are so limited that university researchers with good ideas cannot test them on NOAA computers or in facsimiles of the operational computing environment.  NOAA grant proposal documents make it clear: NOAA/NWS cannot supply the critical computer resources university investigators need to test their innovations.

So if a researcher has a good idea that could improve U.S. operational weather prediction, they are out of luck:  NOAA/NWS doesn’t have the computer resources to help.  Just sad.
U.S. Weather Prediction Computer Resources Stagnate While the European Center Zooms Ahead
The NOAA/NWS computer resources available for operational weather prediction are limited to roughly 5 petaflops (pflops).   Until Hurricane Sandy (2012), National Weather Service management was content to possess one-tenth of the computer resources of the European Center, but after that scandalous situation went public (including coverage on the NBC Nightly News), NOAA/NWS management secured a major increase to the current level, which is just under what is available to the European Center.

Image courtesy of Rebecca Cosgrove, NCEP Central Operations

But the situation is actually much worse than it appears.   The NWS computer resources are split between operational and backup machines and depend on an inefficient collection of machines of differing architectures (Dell, IBM, and Cray).  There is an I/O (input/output) bottleneck on these machines (they cannot move data in and out efficiently), and storage capabilities are inadequate.
There is no real plan for seriously upgrading these machines, other than a 10-20% enhancement over the next few years.
In contrast, the European Center now has two machines with a total of roughly 10 pflops peak performance, with far more storage and better communication channels into and out of the machines.
And keep in mind that the ECMWF computers have far fewer responsibilities than the NCEP machines.  NCEP computers have to do EVERYTHING, from global to local modeling, from hydrological prediction to seasonal time scales.  The ECMWF computers only have to deal with global model computing.
To make things even more lopsided, the European Center is now building a new computer center in Italy and they recently signed an agreement to purchase a new computer system FIVE TIMES as capable as their current one.

They are going to leave NOAA/NWS weather prediction capabilities in the dust.  And it did not have to happen.
And I just learned today that the UK Met Office, number two in global weather prediction, announced that it will spend 1.2 BILLION pounds (that's about 1.6 billion dollars) on a new weather supercomputer system, which will leave both the European Center and the U.S. weather service behind.   U.S. weather prediction will drop back into the third tier.

Fixing the Problem
Past NOAA/NWS management bear substantial responsibility for this disaster, with Congress sharing some blame for not being attentive to this failure.  Congress has supplied substantial funding to NOAA/NWS in the past for model development, but such funding has not been used effectively.
Importantly, there IS bipartisan support in Congress to improve weather prediction, something that was obvious when I testified at a hearing for the House Environment Subcommittee last November.  They know there is a problem and want to help.

There is bipartisan support in Congress for better weather modeling

A major positive is that NOAA is now led by two individuals (Neil Jacobs and Tim Gallaudet) who understand the problem and want to fix it. And the President's Science Adviser, Kelvin Droegemeier, is a weather modeler who understands the problem.

So what must be done now?
(1)  U.S. numerical prediction modeling must be reorganized, since it is clear that the legacy structure, which inefficiently spreads responsibility and support activities, does not work.  The proposal of NOAA administrator Neil Jacobs to build a new EPIC center as the centerpiece of U.S. model development should be followed (see my blog on EPIC here).
(2) NOAA/NWS must develop a detailed strategic plan that not only makes the case for more computer resources but demonstrates how such resources will improve weather prediction.  Amazingly, they have never done this.  In fact, NOAA/NWS does not even have a document describing in detail the computer resources they have now (I know, I asked a number of NOAA/NWS managers for it; they admitted to me it doesn't exist).
(3)  With such a plan, Congress should invest in the kind of computer resources that would enable U.S. weather prediction to become first rate.  Ten times the computer resources (costing about 100 million dollars) would bring us up to parity; 100 times would allow us to be state of the science, including such things as running global models at convection-permitting resolution, something I have been working on in my research (a quick check of that arithmetic is sketched below).
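The "100 times" figure is consistent with the illustrative cube-law scaling used earlier in this post: taking the global model from its 13-km grid to a convection-permitting grid of roughly 3 km implies about 80 times the compute from resolution alone, before counting larger ensembles or more ambitious data assimilation:

```python
# Illustrative only: resolution-driven cost multiplier under cost ~ (1/dx)^3.
print(round((13 / 3) ** 3))  # 13-km -> ~3-km global grid: ~81x the compute
```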
Keep in mind that a new weather prediction computer system would be no more expensive than a single high-tech jet fighter.  Which do you think would provide more benefit to U.S. citizens?  And remember, excellent weather prediction is the first line of defense against severe weather that might be produced by global warming.

82 million dollars apiece

(4)  Future computer resources should be divided between high-demand operational forecasting, which requires dedicated large machines, and less time-sensitive research/development runs, which could make use of cloud computing.  Thus, future NOAA computer resources will be a hybrid.
(5)  Operational numerical prediction in the National Weather Service has been handled by the NCEP Central Operations Center.  This center has not been effective, has unnecessarily slowed the transition of important changes into operations, and must be reorganized or replaced with a more agile, responsive entity.
U.S. citizens can enjoy far better weather forecasts, saving many lives and tens of billions of dollars per year.   But to do so will require that NOAA/NWS secure vastly increased computer resources, and reorganize weather model development and operations to take advantage of them.


101 Comments
Ric Werme (Editor)
February 17, 2020 9:36 pm

Gee, I remember when WUWT had several posts ridiculing the UK’s new supercomputer.

That implies instead of falling all over ourselves complaining about our current equipment we should be ridiculing the need for more equipment.

Or was WUWT wrong then?

https://wattsupwiththat.com/2017/08/04/report-127-million-climate-supercomputer-no-better-than-using-a-piece-of-paper/

https://wattsupwiththat.com/2009/08/28/met-office-supercomputer-a-megawatt-here-a-megawatt-there-and-pretty-soon-were-talking-real-carbon-pollution/

niceguy
Reply to  Ric Werme
February 17, 2020 10:23 pm

Just because climate-modeling supercomputers are patently useless doesn't mean weather-modeling supercomputers can't do real good, when the long-term nonsense is avoided.

Michael Jankowski
Reply to  Ric Werme
February 18, 2020 9:32 am

Those links were ridiculing the inaccuracy in results and ironic energy footprint.

niceguy
February 17, 2020 10:19 pm

The "Alabama will be hit" story gave us not only Sharpiegate but also Trump's spaghetti graph. What did we learn from that? That the models give different results and that they were mostly wrong. Looking at Trump's spaghetti graph, Alabama was going to be hit, but Dorian made a great effort to turn and mostly avoid land.

But everybody is talking about the inane Sharpie drawing and not the spaghetti.

Also, we learned that the same people panicked by a modicum of possible warming 50 years from now are calmer than a Zen monk when faced with up to a 30% chance of dangerous winds over some part of Alabama slightly more than a week out, and don't believe such a risk warrants any preparation more than 4-5 days in advance.

Also, we learned that the same people think a spaghetti graph is about spaghetti-sized weather phenomena, and that only a spaghetti directly over Alabama would mean a risk for Alabama.

Rob_Dawg
February 17, 2020 10:58 pm

Executing legacy Fortran IV code faster doesn't address even the top ten most effective efforts in weather prediction.

A modest suggestion for the Bezos $10 billion.

Mass produced, we could make GPS satellite phone solar powered weather stations for a few hundred bucks. Call it a million units for half a billion. Put them everywhere. The weather data alone would probably have a payback period measured in weeks. The money saved in not doing stupid things about climate because of what we learn would take years but would prove priceless.

It won't happen because that's not what Bezos is actually interested in. Just like better weather prediction is not the reason for wanting more computing power.

Reply to  Rob_Dawg
February 21, 2020 1:30 pm

What exactly do you mean by a “GPS satellite phone”? On first glance, this looks like ‘word salad’ …

Rod Evans
February 17, 2020 11:20 pm

I am struggling to understand the logic in play with this desire for ever bigger computers to model weather/climate.
One of the tests of a forecasting system and its value is: does it relate closely to what happened, or is happening now? In other words, does it actually do the job you have paid it to do?
Do the forecasts actually happen? Do the hindcasts accurately reflect what has already happened?
So far the IPCC models all run hot. They do not reflect the climate conditions we are experiencing. Not content with the fundamental faults the modelling performance reflects, some whiz thought the best thing to do with outputs that are clearly wrong is to average them. So we end up with an average output that is just averagely wrong.
These are the kind of people who are asking for more money and resources to continue on with their unique approach to climate research.
The question must eventually come up: what is the point of giving researchers more money/resources when they don't seem capable of using it for any purpose beyond showing they don't understand the complexity of what they are working on?
A 100-petaGIGO computer is no better than any other sized GIGO activity.

RockyRoad
February 18, 2020 2:23 am

I saw a statement by some guru working at weather.com several years ago that said today’s forecast really was just 80% accurate; tomorrow’s forecast was 60% accurate; the day after that it was 40% accurate; the next day it was only 20% accurate; and down to 0% accurate on the fifth day!!

He was describing the logic for adding the 5-day forecast and not bothering with anything beyond that! (Up to that point, they had been offering a 10-day forecast that was actually 14-days long!!….derrr…)

What will more computing power give us?… 10% accuracy on the sixth day??

Grumbler
February 18, 2020 2:38 am

Kevin A.
“…deep by 72 inches tall you can fit 2,649 cards producing 55.645 PetaFlop, consuming 39,746 watts and all this for less than $300,000.00.
This is 0.7 watt per TeraFlop – unheard of.”

Won’t this mean the end of Bitcoin?

Kevin A
Reply to  Grumbler
February 19, 2020 12:01 am

No, it means I’ll be waiting forever to take delivery while the ‘coin farmers’ eat up the first two generations, on Amazon someone is listing a Nano for twice its value because Amazon is out… Such is life

Not Chicken Little
February 18, 2020 2:52 am

I am working on a computer program that will predict all sorts of random and chaotic events. It works really well, especially for past events! Now I am tuning it to better predict the future, and this program is so good it is already predicting it will be a great success at predicting the future!

So now all I need is better computers and funding to make this all a reality! When it’s finished it will probably be able to predict lottery numbers, so we can use it to make EVERYONE rich!

Please send money ASAP. You can find me here at WUWT. Thank you. By the way my program is predicting it will be cold this winter, so you all might want to prepare for that. No charge for that prediction. 🙂

cjc
February 18, 2020 5:05 am

The problem is made much worse by the fact that the models are programmed to match the limitations of 1980s-vintage vector computers and their compilers, rather than those of current microprocessor-based systems (the assumptions are that memory accesses are free, and that one should do at most one IF and only three or four arithmetic operations per loop, with no "vector dependencies").
The facts now are that memory accesses are a hundred times more expensive than even single arithmetic operations, and that the processors are "pipelined superscalar," capable of executing as many as fifty arithmetic operations simultaneously, provided the code puts them together.
In the WRF meteorology model, this mis-structuring causes advection to be three times slower than it should be, and microphysics to be as much as eight times slower, according to cleanups and benchmarking that I have performed. There’s a LOT of “Not Invented Here” at NCAR.
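A small illustration of the memory-traffic point, in Python/NumPy rather than WRF's Fortran (a toy sketch of the general principle only: extra passes over large arrays cost more than the arithmetic they carry):

```python
import time
import numpy as np

n = 20_000_000
a = np.random.rand(n)
b = np.random.rand(n)
out = np.empty_like(a)

def timed(f):
    t0 = time.perf_counter()
    f()
    return time.perf_counter() - t0

# Style 1: expression with temporaries -- each operator allocates a fresh
# intermediate array, so the data streams through main memory several times.
def with_temporaries():
    return (a * 2.0 + b) * 0.5

# Style 2: identical arithmetic done in place -- no intermediate allocations,
# so far less memory traffic for the same operation count.
def in_place():
    np.multiply(a, 2.0, out=out)
    np.add(out, b, out=out)
    np.multiply(out, 0.5, out=out)

print(f"temporaries: {timed(with_temporaries):.3f}s  in-place: {timed(in_place):.3f}s")
```

A compiler fusing a Fortran loop nest can go further still, keeping each value in registers through the whole chain of operations, which is the restructuring being described above.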

RockyRoad
Reply to  cjc
February 18, 2020 6:51 am

Yes, there's been a lot in the news lately about the development and benefits of quantum computing, which, from what I understand, more closely aligns in theory with the phenomena that control the weather.

It is supposed to be the next big thing!

We would be wise to wait until that computer architecture matures before spending big bucks on big iron to make big predictions–about anything!

Reply to  cjc
February 18, 2020 10:24 am

cjc
Good point regarding memory access. The Met Office recognised this problem a good few years ago. I believe they have the LFRic project to try to address it, but I don't know how far they have got.
It can’t be an easy job writing code for one of those Crays.

RockyRoad
Reply to  Finn McCool
February 18, 2020 1:08 pm

Their OS takes care of the multi-threading.

DrTorch
February 18, 2020 6:52 am

The grant-funded science industry is always sounding the alarm of how the US is falling behind someone else. It’s Japan, China, S. Korea (electronics, biotech). Israel, India or Russia (Software, mathematics, military tech). Sometimes Europe (LHC, weather modeling).

For decades this has been the case, yet somehow the US still competes, even with only 5% of the world’s population.

The bigger problems of course are infiltrators and espionage into the US university and laboratory systems, and poor public schools being made yet worse with high immigration. But those rarely get mentioned during the hype.

Rud Istvan
February 18, 2020 2:03 pm

I have posted before, here and elsewhere, on both weather and climate models. Have also communicated directly with Cliff Mass, as numerical weather prediction is his thing and he knows his stuff.
Three observations.
1. Cliff’s main point here isn’t about supercomputer resources. It is about the dysfunctional present organization that manages them and plans the future. President Trump could fix that given a chance second term.
2. Bringing NWS resources up to snuff is CHEAP compared to the billions wasted annually on US climate models that have no hope of success because of the unavoidable parameterization problem that drags in attribution.
3. A more accurate 3-5 day forecast would be invaluable. Trucking firms, airlines, utilities, farmers (planting/harvesting) would all benefit enormously. Living on the beach in Fort Lauderdale, a narrower 3 day cone of hurricane uncertainty would be well worth a few extra tax dollars. Look at the evacuation disaster from Miami triggered by an erroneous Irma cone track just a few years ago.

Reply to  Rud Istvan
February 18, 2020 2:27 pm

Improve short-term accuracy until the forecast is reliable far enough out to act where needed and not act where not needed.
Makes sense and will build confidence so that fewer people will ignore the warnings when issued.

February 18, 2020 2:35 pm

Mr. Layman here.
In weather forecasting, how much computing power is put into matching/comparing past weather patterns?
It would seem that, even if we don’t understand the “whys”, if a past pattern is repeating then it’s likely the result will repeat.
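What the commenter describes is essentially "analog forecasting": search the historical record for the atmospheric states most similar to today's and look at what followed them. It sees some use for calibration, but in a chaotic system two near-identical patterns diverge quickly, which limits it as a primary method. A toy sketch of the matching step (hypothetical arrays, not a real archive):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical archive: 10,000 past days of a gridded field, each flattened
# to a 500-value vector (a real archive would be vastly larger).
archive = rng.standard_normal((10_000, 500))
today = rng.standard_normal(500)

# The k most similar past days by Euclidean distance become the "analogs".
k = 5
distances = np.linalg.norm(archive - today, axis=1)
analogs = np.argsort(distances)[:k]
print("closest analog days (archive indices):", analogs)
```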

Svend Ferdinandsen
February 18, 2020 2:47 pm

It could be they need larger computers, but it could also help with more offices equipped with windows, so that they could actually see the weather. Just a simple thought.

It looks intentional that all predictions are erased before the actual weather arrives.

Mark Luhman
February 18, 2020 10:07 pm

Let's face it, computers have no predictive qualities whatsoever. Anyone who tells you different should be asked to put next year's salary on the stock market based on their computer model. The only takers will be fools.

Patrick MJD
February 19, 2020 3:26 am

My experience of bigger, faster computers is they just crash faster and more heavily. The biggest advancement IMO is virtualisation which is nothing new (I keep telling my Windows Hyper-V “experts”).

Kevin A
Reply to  Patrick MJD
February 19, 2020 9:17 am

I have to agree, after fighting Windows for two weeks trying to run Linux I returned the software and purchased a new computer for Ubuntu to live on a 4TB drive. And no, Windows Subsystem for Linux (WSL 2) is not the answer unless you’re not trying to get anything done.

War
February 19, 2020 9:18 pm

How about this story from the German media outlet DW that 5G is a threat to accurate weather forecasting? Anyone heard of this? Article: 'Will 5G mobile networks wreck weather forecasting?'

Also, I would add that we're clueless and in uncharted territory with large-scale weather at this point; we need results from the CERN study on clouds and cosmic rays.

Reply to  War
February 21, 2020 1:43 pm

re: “How about this story that 5G is a threat to accurate weather forecasting”

Bollocks.

Give it to me in "first principles" physics or it's just horse hockey …