U.S. Operational Weather Prediction is Crippled By Inadequate Computer Resources

Reposted from the Cliff Mass Weather Blog

U.S. global numerical weather prediction has now fallen into fourth place, with national and regional prediction capabilities a shadow of what they could be.
There are several reasons for these lagging numerical weather prediction capabilities, including lack of strategic planning, inadequate cooperation between the research and operational communities, and too many sub-optimal prediction efforts.
But there is another reason of equal importance: a profound lack of computer resources dedicated to numerical weather prediction, both for operations and research.

The bottom line:  U.S. operational numerical weather prediction resources used by the National Weather Service must be increased 10 times to catch up with leading efforts around the world and 100 times to reach state of the science. 
Why does the National Weather Service require very large computer resources to provide the nation with world-leading weather prediction?
Immense computer resources are required for modern numerical weather prediction.  For example, NOAA/NWS TODAY is responsible for running:

  • A global atmospheric model (the GFS/FV-3) running at 13-km resolution out to 384 hours.
  • Global ensembles (GEFS) of 21 forecasts at 35-km resolution
  • The High-Resolution Rapid Refresh (HRRR) and Rapid Refresh (RAP) models out to 36 h.
  • The atmosphere/ocean Climate Forecast System model out to 9 months.
  • The National Water Model (combined WRF and hydrological modeling)
  • Hurricane models during the season
  • Reanalysis runs (rerunning past decades to provide calibration information)
  • The North American Mesoscale model (NAM)
  • The Short-Range Ensemble Forecast System (SREF)

This is not a comprehensive list.  And then there is the need for research runs to support development of next-generation systems.  As suggested by the world-leading European Centre for Medium-Range Weather Forecasts (ECMWF), research computer resources should be at least five times greater than the operational requirements to be effective.

NY Times Magazine: 10/23/2016


How Lack of Computing Resources is Undermining NWS Numerical Weather Prediction
The current modeling systems (some described above) used by the National Weather Service are generally less capable than they should be because of insufficient computer resources.  Some examples:
1.  Data Assimilation.  The key reason the U.S. global model is behind the European Center and the other leaders is that they use an approach called 4DVAR, a resource-demanding technique that involves running the modeling system forward and backward in time multiple times.  Inadequate computer resources have prevented the NWS from doing this (a sketch of the 4DVAR cost function is given after item 3 below).
2.  High-resolution ensembles.   One National Academy report after another, one national workshop committee after another, and one advisory committee after another has told NWS management that the U.S. must have a large high-resolution ensemble system (grid spacing of 4 km or finer, 30-50 members) to deal with convection (e.g., thunderstorms) and other high-resolution weather features.  But the necessary computer power is not available.

European Center Supercomputer

3.  Global ensembles.  A key capability of any first-rate global prediction center is to run a large global ensemble (50 members or more), with sufficient resolution to realistically simulate storms and the major impacts of terrain (20-km grid spacing or better).  The European Center has a 52-member ensemble run at 18-km grid spacing.  The U.S. National Weather Service?  21 members at 35-km resolution.  Not in the same league.
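
For readers who want a little more detail on 4DVAR (item 1 above), the cost function the technique minimizes is sketched below in its standard textbook form; this is a generic sketch, not necessarily the exact formulation run at any particular center:

\[
J(\mathbf{x}_0) = \tfrac{1}{2}\,(\mathbf{x}_0-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}_0-\mathbf{x}_b)
 + \tfrac{1}{2}\sum_{i=0}^{N}\bigl[\mathbf{y}_i - H_i(M_{0\to i}(\mathbf{x}_0))\bigr]^{\mathsf T}\mathbf{R}_i^{-1}\bigl[\mathbf{y}_i - H_i(M_{0\to i}(\mathbf{x}_0))\bigr]
\]

Here x_b is the short-range forecast used as the prior ("background"), B and R_i are the background and observation error covariance matrices, y_i are the observations falling within the assimilation window, H_i maps the model state to the observed quantities, and M_{0->i} is the forecast model integrated from the start of the window to observation time i.  Finding the x_0 that minimizes J requires repeated forward integrations of the model and backward integrations of its adjoint, which is the "forward and backward in time multiple times" workload described in item 1 and the reason 4DVAR is so demanding of computer time.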
I spend a lot of time with NOAA and National Weather Service model developers and group leaders.  They complain continually about how they lack computer resources for development and testing.  They tell me that this resource deficiency prevents them from doing the job they know they could do.  These are good people who want to do a state-of-the-art job, but they can’t due to inadequate computer resources.
NOAA/NWS computer resources are so limited that university researchers with good ideas cannot test them on NOAA computers or in facsimiles of the operational computing environment.  NOAA grant proposal documents make it clear:  NOAA/NWS cannot supply the critical computer resources university investigators need to test their innovations, as a recent NOAA grant document states explicitly.

So if a researcher has a good idea that could improve U.S. operational weather prediction, they are out of luck:  NOAA/NWS doesn’t have the computer resources to help.  Just sad.
U.S. Weather Prediction Computer Resources Stagnate While the European Center Zooms Ahead
The NOAA/NWS computer resources available for operational weather prediction are limited to roughly 5 petaflops (pflops).   Until Hurricane Sandy (2012), National Weather Service management was content to possess one tenth of the computer resources of the European Center, but after that scandalous situation went public following the storm (including coverage on the NBC Nightly News), NOAA/NWS management secured a major increment to the current level, which is just under what is available to the European Center.

Image courtesy of Rebecca Cosgrove, NCEP Central Operations

But the situation is actually much worse than it appears.   The NWS computer resources are split between operational and backup machines and depend on an inefficient collection of machines of differing architectures (Dell, IBM, and Cray).  There is an I/O (input/output) bottleneck on these machines (which means they can’t get information into and out of them efficiently), and storage capabilities are inadequate.
There is no real plan for seriously upgrading these machines, other than a 10-20% enhancement over the next few years.
In contrast, the European Center now has two machines with a total of roughly 10 pflops of peak performance, with far more storage, and better communication channels into and out of the machines.
And keep in mind that the ECMWF computers have far fewer responsibilities than the NCEP machines.  NCEP computers have to do EVERYTHING from global to local modeling, from hydrological prediction to seasonal time scales.  The ECMWF computers only have to deal with global model computing.
To make things even more lopsided, the European Center is now building a new computer center in Italy and they recently signed an agreement to purchase a new computer system FIVE TIMES as capable as their current one.

They are going to leave NOAA/NWS weather prediction capabilities in the dust.  And it did not have to happen.
And I just learned today that the UK Met Office, number two in global weather prediction, just announced that it will spend 1.2 BILLION pounds (that’s about 1.6 billion dollars) on a new weather supercomputer system, which will leave both the European Center and the U.S. weather service behind.   U.S. weather prediction will drop back into the third tier.

Fixing the Problem
Past NOAA/NWS management bear substantial responsibility for this disaster, with Congress sharing some blame for not being attentive to this failure.  Congress has supplied substantial funding to NOAA/NWS in the past for model development, but such funding has not been used effectively.
Importantly, there IS bipartisan support in Congress to improve weather prediction, something that was obvious when I testified at a hearing for the House Environment Subcommittee last November.  They know there is a problem and want to help.

There is bipartisan support in Congress for better weather modeling

A major positive is that NOAA is now led by two individuals (Neil Jacobs and Tim Gallaudet) who understand the problem and want to fix it.  And the President’s Science Adviser, Kelvin Droegemeier, is a weather modeler who understands the problem.

So what must be done now?
(1)  U.S. numerical prediction modeling must be reorganized, since it is clear that the legacy structure, which inefficiently spreads responsibility and support activities, does not work.  The proposal of NOAA administrator Neil Jacobs to build a new EPIC center to be the centerpiece of U.S. model development should be followed (see my blog on EPIC here).
(2) NOAA/NWS must develop a detailed strategic plan that not only makes the case for more computer resources, but demonstrates how such resources will improve weather prediction.  Amazingly, they have never done this.  In fact, NOAA/NWS does not even have a document describing in detail the computer resources they have now (I know, I asked a number of NOAA/NWS managers for it–they admitted to me it doesn’t exist).
(3)  With such a plan, Congress should invest in the kind of computer resources that would enable U.S. weather prediction to become first rate.  Ten times the computer resources (costing about 100 million dollars) would bring us up to parity; 100 times would allow us to be state of the science (including such things as running global models at convection-permitting resolution, something I have been working on in my research).  A rough scaling estimate is sketched below.
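
The scaling behind these factors can be sketched with simple arithmetic.  A common rule of thumb is that the cost of a forecast run grows roughly as the number of ensemble members times (1/dx)^3, since refining the horizontal grid in both directions also forces a proportionally shorter time step; vertical levels, physics, and data assimilation add more on top.  The little calculation below uses only the configurations quoted earlier in this post, and the cubic scaling is an approximation rather than an exact accounting:

```python
# Back-of-the-envelope NWP cost scaling (illustrative only).
# Rule of thumb: cost ~ members * (1/dx)^3 -- a finer grid in both
# horizontal directions plus a correspondingly shorter time step.
# Vertical resolution, physics, and data assimilation are ignored.

def relative_cost(members, dx_km):
    return members * (1.0 / dx_km) ** 3

# Global ensembles quoted earlier in the post
gefs  = relative_cost(members=21, dx_km=35)   # NWS GEFS
ecmwf = relative_cost(members=52, dx_km=18)   # European Center ensemble
print(f"European ensemble / GEFS cost ratio: ~{ecmwf / gefs:.0f}x")   # ~18x

# Deterministic global model: 13-km GFS today vs. convection-permitting
for target_km in (4, 3):
    ratio = relative_cost(1, target_km) / relative_cost(1, 13)
    print(f"13 km -> {target_km} km global model: ~{ratio:.0f}x more compute")
# Roughly 34x at 4 km and 81x at 3 km -- consistent with the factor of 10
# to catch up and the factor of 100 to reach the state of the science.
```

None of this includes the research allocation (at least five times operations, per the ECMWF guidance cited earlier), so these factors are, if anything, conservative.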
Keep in mind that a new weather prediction computer system would be no more expensive than a single high-tech jet fighter.  Which do you think would provide more benefit to U.S. citizens?  And remember, excellent weather prediction is the first line of defense against severe weather that might be produced by global warming.

82 million dollars apiece

(4)  Future computer resources should be divided between high-demand operational forecasting, which requires dedicated large machines, and less time-sensitive research/development runs, which could make use of cloud computing.  Thus, future NOAA computer resources will be a hybrid.
(5)  Current operational numerical prediction in the National Weather Service is carried out at the NCEP Central Operations Center.  This center has not been effective, has unnecessarily slowed the transition to operations of important changes, and must be reorganized or replaced with a more facile, responsive entity.
U.S. citizens can enjoy far better weather forecasts, saving many lives and tens of billions of dollars per year.   But to do so will require that NOAA/NWS secure vastly increased computer resources, and reorganize weather model development and operations to take advantage of them.

101 Comments
February 17, 2020 3:07 pm

Isn’t it axiomatic that the only thing faster computers will do is expedite wrong predictions about climate?

Reply to  Gordon Dressler
February 17, 2020 3:35 pm

He is talking about US operational weather prediction.

Greg
Reply to  Nick Stokes
February 17, 2020 4:37 pm

The problem with weather prediction is that most of the resources are being squandered on attempts to do climate prediction instead of weather prediction.

Greg
Reply to  Greg
February 17, 2020 4:44 pm

I love the idea of using “cloud computing” to predict weather !

The irony is they don’t have mathematical models to accurately reproduce evaporation, advection, condensation and precipitation.

what is the point of cloud computing if you don’t know how to compute a cloud ?

Steven Mosher
Reply to  Greg
February 18, 2020 6:01 am

show your work proving this

Bryan A
Reply to  Greg
February 18, 2020 12:33 pm

Looking at the images between the US and EU computers, it is immediately apparent that what the US is missing is the Pretty Pictures on the outside of the servers.
The US servers don’t Feel Pretty

Greg
Reply to  Greg
February 17, 2020 5:03 pm

Certainly, if they stopped wasting resources doing global climate runs of the next hundred years, they’d have more resources free for the next five days.

Ten times the computer resources (costing about 100 million dollars) would bring us up to parity, 100 times would allow us to be state of the science (including such things as running global models at convection-permitting resolution, something I have been working on in my research).

Why does predicting US weather a few days out involve “global model” runs? I thought US was 2% of the globe. If they limited it to modelling USA they would already have 50 times more resources available. If they worked on 3 days instead of 100 years they’d have 12000 times more again and could increase resolution to a more useful and productive level.

This whole attempt to grab more resources disingenuously presents a need for weather prediction when what they really want to do is global climate modelling.

Reply to  Greg
February 18, 2020 8:51 pm

This whole thing is a pampered spoiled whine about “keeping up with the Joneses”.

“U.S. Weather Prediction Computer Resources Stagnate While the European Center Zooms Ahead
The NOAA/NWS computer resources available for operational weather prediction is limited to roughly 5 petaflops (pflops).”

Quickly followed by the real eye gouge:

“Atos, a global leader in digital transformation, has signed a new four-year contract worth over €80 million (approximately £67.8 million) with the European Centre for Medium-Range Weather Forecasts (ECMWF) to supply its BullSequana XH2000 supercomputer, which is one of the most powerful meteorological supercomputers in the world. It will increase ECMWF’s computing power by a factor of around 5”

Nevermind efficient effective program core, go for the biggest baddest fastest computer extant for running their CO₂ climate programs masquerading as weather programs.

Carbon Bigfoot
Reply to  Greg
February 20, 2020 4:44 am

Mosh & Stokes woke up!! Please tell us what the NWS is going to do when the 50,000 low earth orbit 5G satellites are in-place. It doesn’t matter– humans and the biosphere will be radiated to the max and be extinct in 5 years.
https://www.5gspaceappeal.org/

Reply to  Greg
February 17, 2020 5:56 pm

With computers currently on hand, “A seven-day forecast can accurately predict the weather about 80 percent of the time and a five-day forecast can accurately predict the weather approximately 90 percent of the time. However, a 10-day—or longer—forecast is only right about half the time.” — https://scijinks.gov/forecast-reliability/

From these statistics, it does not appear the poor ability to forecast weather out beyond 10 days is due to inadequate computing power or speed, but rather the accumulation of errors arising from un-modeled natural sources/processes over time. Thus, future “improvements” should focus on the software (prediction algorithms), not the hardware memory size or processor(s) gigaflop rates.

Memory and processor speeds mainly benefit GCM’s trying to predict weather evolution over 20 years or more into the future, hence my comment about “climate” rather than “weather”.

Michael S. Kelly
Reply to  Gordon Dressler
February 17, 2020 7:22 pm

Gordon Dressler ====> The number of error terms in a general circulation model (GCM) is unknown, including two that are ignored completely: coding errors, and undetected bit errors. The latter should be of concern because of the stupendous number of computations in a 100 year GCM run, yet I haven’t seen it addressed anywhere in the climate science community. It has been studied elsewhere, and the results are not comforting. Conventional error-checking routines do not catch 2 and 3 non-adjacent bit errors, and those errors occur frequently enough to completely derail a time-integration over a multi-year span.

Short term weather computation, on the other hand, isn’t as sensitive to such scattered errors because of the shorter integration span.

Your comment on the algorithms is well taken, but I would add that a focus on their implementation (including their coding) must take place. I’ve read the manuals on a couple of GCMs, one of which mentioned (without specifics) some algorithms which “damped out” computational excursions resulting from initial conditions (the pressure, temperature and velocity of the atmosphere at all nodes at time = 0) that were not consistent with reality. I’ve done enough software development to know that “stabilizing algorithms” can inadvertently become part of the “physics” of a code. Rather easily, in fact. And without a NASA-scale independent validation and verification effort, no one will ever know whether that has happened.

There’s a lot more required to get a GCM to work than just a bigger computer, but a bigger computer would help enormously for the weather models we have today.

Reply to  Gordon Dressler
February 18, 2020 4:13 am

” However, a 10-day—or longer—forecast is only right about half the time.” ”

Then they are wasting computer time, because that can be done with pencil and paper. A good meteorologist should be able to guess better than that. I’m no scientist but I’ve been doing a little almanac experiment. After looking up 15 years of rain history at my house, I’ve been surprised that I can “predict” rain days a full month in advance with 50% to 80% accuracy.

MarkW
Reply to  Gordon Dressler
February 18, 2020 8:40 am

Running smaller cells reduces the rate at which errors can build up.

Don K
Reply to  Gordon Dressler
February 18, 2020 8:54 am

AFAICS, weather forecasts give reasonable predictions for maybe ten days. And that’s very useful. The next thing we need wouldn’t seem to me to be predictions that are good for eleven days. It’s seasonal predictions that aren’t more or less complete garbage.

Since 1950 or so, computing capacity has been improving about an order of magnitude every decade. And weather forecasting capability has been improving about one day a decade. At that rate, we’ll probably be able to make pretty good seasonal forecasts someday. Maybe sometime around the year 2920.

Let me submit that we’ve probably pretty much reached the practical limits of weather forecasting by computer modelling of fronts, temperatures, air pressures, et. al. We’ve likely reached (and probably passed) the point of diminishing returns for that approach.

Whither from here? I don’t know. Predicting Atlantic tropical cyclone activity based on ENSO seems to work somewhat. So maybe there are other paths. But I think probably we’re near the end of this particular road.

Reply to  Gordon Dressler
February 18, 2020 8:56 pm

“Gordon Dressler February 17, 2020 at 5:56 pm
With computers currently on hand, “A seven-day forecast can accurately predict the weather about 80 percent of the time and a five-day forecast can accurately predict the weather approximately 90 percent of the time.”

NWS weather predictions are so poor that I no longer read beyond 2-3 days.
Even then, NWS frequently reverses weather predictions 180° for predictions within three days.
I find Joe Bastardi and Ryan Maue far more accurate and for more distant time periods.

Michael S. Kelly
Reply to  Nick Stokes
February 17, 2020 7:04 pm

This is one area where the state of CFD and other weather-related computational art is adequate to make predictions far enough out (days) to make an enormous difference in outcomes for humanity. It does require a lot of computer power, but it is a quantifiable amount with a predictable price tag. It is certainly amenable to a cost-benefit analysis, and I have little doubt that the benefits far outweigh the cost.

At FAA/AST, the office that licenses and permits commercial space launch, I was able to establish a practice of using high-fidelity weather models available on-line to give the day-of-launch winds aloft needed to satisfy range safety constraints in lieu of balloons, sounding rockets, and more exotic atmospheric sounding techniques. It has worked just fine. Extending the period of prediction from one day to a few days is quite feasible.

As an aside, I was Chief Engineer at FAA/AST until my “retirement” in June 2019.

Megs
Reply to  Nick Stokes
February 18, 2020 3:23 am

Nick, US operational weather predictions?

BOM (Bureau of Meteorology) told our State and Federal Ministers, two weeks ago that it was not likely that Australia would see significant rainfall until April of this year….

How can you possibly predict ‘climate’ on any level, at any time in the near or far future?

Latitude
Reply to  Gordon Dressler
February 17, 2020 4:27 pm

..and a 50% chance of rain….still means either it will or it won’t… equally

LdB
Reply to  Latitude
February 17, 2020 5:09 pm

But they can model it in higher resolution 🙂
Your thrust I do agree with: you need to first show that I get something useful for the expenditure.
There are lots of science areas that injecting money could make a difference and you can’t afford to fund them all.

Reply to  Latitude
February 18, 2020 2:17 pm

We report our precipitation to our local NWS.
I once asked about “chance of rain” forecast.
If I understood him correctly, 4 to 10 days out a 50% chance of rain means that there is a 50% chance it might rain SOMEWHERE in the forecast area.
1 to 3 days out means it will rain in 50% of the forecast area.
So, if the short-term forecast is 90% chance of rain and it didn’t rain on your house but it did rain on 90% of the forecast area, they nailed it.
If I misunderstood him, I welcome correction.

commieBob
Reply to  Gordon Dressler
February 17, 2020 5:41 pm

It’s not an axiom per se.

Edward Norton Lorenz, a pioneer computer modeller, discovered chaotic systems by accident.

As recounted in the book “Chaos” by James Gleick, Dr. Lorenz’s accidental discovery of chaos came in the winter of 1961. Dr. Lorenz was running simulations of weather using a simple computer model. One day, he wanted to repeat one of the simulations for a longer time, but instead of repeating the whole simulation, he started the second run in the middle, typing in numbers from the first run for the initial conditions.

The computer program was the same, so the weather patterns of the second run should have exactly followed those of the first. Instead, the two weather trajectories quickly diverged on completely separate paths.

At first, he thought the computer was malfunctioning. Then he realized that he had not entered the initial conditions exactly. The computer stored numbers to an accuracy of six decimal places, like 0.506127, while, to save space, the printout of results shortened the numbers to three decimal places, 0.506. When typing in the new conditions, Dr. Lorenz had entered the rounded-off numbers, and even this small discrepancy, of less than 0.1 percent, completely changed the end result.

Even though his model was vastly simplified, Dr. Lorenz realized that this meant perfect weather prediction was a fantasy. link

So, folks begging for new super computers for weather forecasting have to demonstrate that Lorenz was wrong.
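
Lorenz’s accident is easy to reproduce with the simple three-variable system he published in 1963.  Below is a minimal sketch (plain fixed-step Euler integration, chosen for brevity rather than accuracy) that runs the same model twice, the second time from an initial condition rounded to three decimal places, and prints how far apart the two runs drift:

```python
# Minimal Lorenz-63 demonstration of sensitive dependence on initial
# conditions.  Illustrative sketch only: simple fixed-step Euler stepping.
import numpy as np

def lorenz63_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dxdt = sigma * (y - x)
    dydt = x * (rho - z) - y
    dzdt = x * y - beta * z
    return state + dt * np.array([dxdt, dydt, dzdt])

a = np.array([1.0, 1.0, 1.506127])   # "full precision" initial condition
b = np.array([1.0, 1.0, 1.506])      # same state, rounded to three decimals

for step in range(1, 20001):
    a = lorenz63_step(a)
    b = lorenz63_step(b)
    if step % 5000 == 0:
        print(f"t = {step * 0.001:4.0f}   separation = {np.linalg.norm(a - b):10.6f}")

# The separation starts at about 0.0001, stays small for a while, and then
# grows until the two runs no longer resemble each other at all -- even
# though the model and the code are identical.
```

The forecasting community’s answer to Lorenz is not a single deterministic run but ensembles of perturbed runs, which is exactly why the post above keeps counting ensemble members.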

Michael Jankowski
Reply to  Gordon Dressler
February 17, 2020 5:43 pm

Climate models just require a Commodore VIC-20.

Pillage Idiot
Reply to  Michael Jankowski
February 17, 2020 8:41 pm

Yeah, but the old guys used to be able to do climate models with just a slide rule!

Rod Evans
Reply to  Pillage Idiot
February 17, 2020 10:44 pm

Actually, the old guys used to be able to do weather predictions by holding a wet finger in the air and spotting the migration habit of various birds. Their climate predictions were only as good as today’s computer models, but nobody is perfect….

Reply to  Rod Evans
February 18, 2020 2:12 am

Rod Evans


🙂 🙂

Darrin
Reply to  Pillage Idiot
February 18, 2020 6:23 pm

My old guy (dad) has a lifetime of observations to rely on and is more accurate than the weather forecaster. We would watch the evening news, the weather forecaster would come on showing the radar pics, where the highs and lows were, wind speeds, etc. then tell us the forecast. Dad would disagree and say it’s going to do this not that and be right. How does he do it? As a kid I thought he was superman, as an adult I’ve come to the realization that spending a lifetime living in one area gives you a pretty darn good idea of what the weather is going to do if you’ve bothered to pay attention throughout your life.

I’ve now copyrighted a new term to describe his method of weather forecasting so if you use it I get royalties! That term is Pattern Recognition. If anyone is interested in sending funds I’ll do more research into how it works.

JohnM
Reply to  Gordon Dressler
February 17, 2020 6:05 pm

I love the graphic that shows that our current machines are identical, yet smaller than the proposed computers. Sounds like we already have the better system.

Rick K
February 17, 2020 3:08 pm

Too bad they spent all their money fighting CO2. Otherwise, they could afford top-of-the-line computer resources.

Alan the Brit
February 17, 2020 3:09 pm

I read that ghastly report in the papers today, around £1.3billion spent by guvment (aka the taxpayer) on a new puter to create a new model to arrive at the wrong answers even faster than ever before!!! Yipee, it’s only somebody elses money, let’s spend, spend, spend!!! AtB

joe
Reply to  Alan the Brit
February 17, 2020 4:26 pm

They could save money by scrapping all their computers and hiring a couple of political scientists to make the forecasts.

Nik
Reply to  joe
February 17, 2020 5:11 pm

Or by looking out a window.

LdB
Reply to  joe
February 17, 2020 5:11 pm

Nick is available; he’d probably only cost a fraction of the cost.

Reply to  Alan the Brit
February 18, 2020 2:14 am

Rod Evans

Amazing what £1.3bn will buy you nowadays.


Reply to  Alan the Brit
February 18, 2020 6:13 am

Alan,
You forgot to add in Police Scotland’s cost estimate of £250 million to police the COP26 nonsense in Glesga later this year.

RayG
February 17, 2020 3:27 pm

Why should the U.S. taxpayer pay for more hardware to run Climate Science ™ models that are demonstrably of no utility?

Bob Vislocky
February 17, 2020 3:47 pm

The US could easily make up for the computing shortfall and surpass ECMWF predictive capabilities by developing a superior MOS (Model Output Statistics) system. A good MOS system from an outdated NWP model would smoke a crappy MOS system based on a state-of-the-art NWP model. Unfortunately the US does what it can to sabotage their MOS system too (by continually tweaking NWP models, replacing supposedly outdated NWP models even though the MOS still contributes skill, not running new NWP models on past data to get a new MOS system up & running quickly, not taking a consensus MOS, etc.).

Archie
February 17, 2020 3:50 pm

I just moved out of North Carolina where the NWS predictions were worthless even down to the 4-6 hour scale. Something is wrong, that’s for sure.

n.n
February 17, 2020 3:56 pm

Another victim, perhaps collateral damage, of [catastrophic] [anthropogenic] climate cooling… warming… change.

February 17, 2020 3:57 pm

It is immaterial if the US weather-forecasting resources are less than those available in the EU or the UK. It is very important if they aren’t adequate for providing US residents timely and accurate weather forecasts. Stop comparing “us” to “them” and just tell us what we need.

And make sure that the infrastructure is used for weather forecasting and not for chasing ephemeral dreams about CO2.

Prjindigo
February 17, 2020 4:02 pm

Re-analysis could be run on distributed processing.

I’m a 0.1%’er for folding@home XD

markl
February 17, 2020 4:06 pm

“But the situation is actually much worse than it appears.” But of course it is and money will fix it. Funny how the Farmer’s Almanac and Joe Bastardi have been so accurate without this level of Supercomputers. That kind of computer resource for predicting weather is over the top. No matter how much computing power is available it won’t be able to forecast a chaotic non linear system.

PaulH
Reply to  markl
February 17, 2020 4:15 pm

I was thinking the same thing. It seems like experienced weather people with an understanding of historical weather events make for better forecasts.

Reply to  PaulH
February 17, 2020 4:35 pm

Exactly!

Stevek
February 17, 2020 4:20 pm

Are we sure they are using the computers in the best way possible? As a computer scientist I have seen many times inefficient code that can be sped up 100x by fairly simple optimization techniques.

Latitude
Reply to  Stevek
February 17, 2020 4:31 pm

of course they aren’t…it’s the government

..and predicting weather…..it has a very low threshold…..as long as they look busy and complain they don’t have enough money….it just goes on and on

LdB
Reply to  Stevek
February 17, 2020 5:06 pm

You mean it’s government, where some little bureaucrat wants to build their own little empire department with a multi-billion dollar budget. The issue for me comes down to bang for buck; you want to see definitive numbers on what you get for the expenditure.

commieBob
Reply to  Stevek
February 17, 2020 5:49 pm

When I first taught DSPs, it was common to code the core of a program in assembly to get it as efficient as possible. Five years later, the tools had progressed sufficiently that attempting to hand optimize the code usually only slowed things down.

Robert of Texas
February 17, 2020 4:49 pm

The only fix that a manager-type can understand is spend-more-money. Sometimes it actually helps, but often is very wasteful. The problem with computer models and computing power USUALLY begins with mediocre programming and a lack of thinking out-the-box.

An example, back in the 1980’s I was tasked with implementing a LANDSAT algorithm written in FORTRAN on an IBM 360. I downloaded the computer program (a deck of computer punch cards) and got it to work on the university IBM 360, but the professor using it was not very happy. A student could load their card deck containing data and then expect a printed output sometime after 24 hours. This is just how long the program took in the time-share partition it was allocated to. I took this program, rewrote it (still in FORTRAN, but a newer version), put it on a much smaller mid-range computer, and tuned it until results were being printed in under 10 minutes. The original coder(s) were just not very good…they had the right results, but understood nothing about how to build an efficient running program – it wasn’t even clear if they had understood that they COULD make it run faster. I didn’t need a bigger computer to make an improvement, just better code.

I have run into this again and again – a request for new more expensive computers when a simple analysis and some tuning were all that was required. If you do not understand underneath the covers of how computers work, you have no idea how to make a program run faster. Now some of this has been built into compilers, but good old understanding can still work wonders.

The managers are told by average programmers that the computers just are not fast enough, so they ask for money to upgrade everything. Often this results in some improvements, but not always. So, start with a really clever computer programmer and let them analyze the situation. They likely will get you better results and spend less money. DO NOT USE A COMMITTEE unless you want to waste more time and money.

Older organizations, including those inside corporations, often become bogged down in procedure. Much of the procedure doesn’t even apply anymore, but that is how the managers learned it so that is the way it’s done. The only successful way to break out of this mold is to start from scratch – it’s painful and wasteful at first, but after some time you evolve a much more efficient and productive environment. Keep HR and old management as far and long away as possible – they ARE a (maybe the) problem.

And finally, try thinking outside the box. Is there another way to write this procedure (or object)? Is there a way to approximate something that is good enough and runs 10 or 100 times faster? Does this “part” even add to my accuracy, or just slow everything down? Does everything have to run the same number of iterations, or can some things run less often? The questions are many, and the results are often surprising. Take measurements and then decide based on the data.

BUT, this being a typical government-ruined agency, they probably have lost the ability to analyze in new and thoughtful ways, think outside-the-box, or even fix the stuff they already have. Move the entire function (but not the managers) into a science and technology University and things will improve immensely – but keep one person in charge and accountable.

Guy Dombrowski
Reply to  Robert of Texas
February 18, 2020 7:11 am

Robert you are so right !
Being an old engineer turned computer programmer I am always amazed by the kind of garbage that is written by big computer teams.
I learned programming with 8-bit processors having 32 k RAM and 80 k floppies.
You had to make every byte count.
One good programmer is more efficient than a big team any day.

MarkW
Reply to  Guy Dombrowski
February 18, 2020 8:47 am

I had one manager yell at me over the extra time I took to optimize a particular process.
Then he went and bragged to his management about how his team had reduced processing time by a factor of 4.

RB
February 17, 2020 4:56 pm

In the meantime, use the UK, European, and Canadian models, which are superior to the US models.

Ron Long
February 17, 2020 5:00 pm

Get up in the morning, start the coffee, open a window and look outside. No computer necessary.

george1st:)
February 17, 2020 5:12 pm

Surely a few economists could predict the weather .
After all , they tell us what the climate is doing .

Michael Jankowski
February 17, 2020 5:34 pm

Based on the Met Office prediction failures, we can’t be THAT bad in the US.

AWG
February 17, 2020 5:40 pm

This may seem like a stupid idea, but since the WX prediction machines in Europe are calculating on a global basis, why can’t the US just chip in to those existing systems? The US and Russia have been ferrying people from all over the globe up to the ISS rather than each nation building their own space station. If they don’t need to build their own space station, why does the US have to build their own WX prediction machines?

February 17, 2020 5:42 pm

The Australian BOM recently ”predicted” no significant rain for AU until at least April based on the ”vibe of the thing”……. It’s raining cats and dogs all over the East coast. Has been for 3 weeks or so, and will no doubt continue. This is a massive failure no matter which way you swing it.
And they want me to believe that they know what it will be doing in 2030 -40-50 -2100?

Curious George
Reply to  Mike
February 17, 2020 5:59 pm

Mike, this is weather. It has nothing to do with climate 🙂

Reply to  Curious George
February 17, 2020 6:17 pm

It’s all weather. The climate here hasn’t changed for probably 500 years. The ”average of 30 years” is a ridiculous joke.

Michael Jankowski
Reply to  Mike
February 17, 2020 6:04 pm

Yeah on Jan 20 the Aussie BOM predicted Sydney only had a 35-40% chance of exceeding average rainfall from Feb-April, which is about 15 inches. They hit that total with a 4-day period in Feb already. Heaviest rainfall since 1990.

Zig Zag Wanderer
Reply to  Mike
February 17, 2020 9:39 pm

I don’t trust the BOM to tell me what the weather is right now, let alone tomorrow. Any further than tomorrow is an ongoing joke that changes every day.

The ONLY useful thing I get from them is the rain radars. At least that lets me know if it’s going to rain in the near future.

Mike Rossander
February 17, 2020 5:44 pm

re: “U.S. citizens can enjoy far better weather forecasts, saving many lives and tens of billions of dollars per year.”

Objection – assumes facts not in evidence. Weather forecasts do save lives. That does not mean that incrementally better weather forecasts will necessarily save more lives. Perfect weather forecasting might save lots of lives in a perfect world with robots who follow the instructions to evacuate (and who don’t hurt more people during the evacuation). In the real world, humans are ornery cusses who generally don’t do what they’re told. I’m not seeing, for example, any evidence that the weather-related casualty rates in Europe are all that much better than in the US despite their having, as the article says, 10 times the prediction capability.

K vS
February 17, 2020 6:39 pm

Two comments;
First the smartass comment, Everyone complains about the weather but never does anything about it.
Second, certainly there is money to be made by accurate predictions for the insurance as well as agricultural industry. Why the single-minded focus on making government bigger? There is no end to increasing budgets for the “good of the people” when there is no metric for when enough is good enough.

Kevin A
February 17, 2020 6:50 pm

So, buy one rack of computers:
Nvidia will release in March (two weeks) a device the size of a credit card (70X45mm) that does 21 TeraFLOP for 15 watts; in a rack 24 inches wide by 36 inches deep by 72 inches tall you can fit 2,649 cards producing 55.645 PetaFlop, consuming 39,746 watts, and all this for less than $300,000.00
This is 0.7 watt per TeraFlop – unheard of.
While Eni Unveils 52 Petaflop Supercomputer, World’s Most Powerful Industrial System

The world’s most powerful industrial supercomputer February 6, 2020
Eni, the Italian energy company headquartered in Milan, today announced the supercomputer named HPC5, a GPU-accelerated system capable of performing 52 million billion operations per second, is now in use.
Obsolete already !

Thomas Edwardson
Reply to  Kevin A
February 18, 2020 9:22 pm

This 0.7 watt per TeraFlop is unheard of for a reason. It’s not true. I’m hoping the above post was done in jest, and I missed the sarcasm tags. If so, I apologize in advance for the following critique …

I’m an IT geek for a pharmaceutical company who has built a number of High Performance Compute clusters to perform computational studies for various projects in conjunction with X-ray crystallography and Magnetic Resonance Imaging and some very large distributed databases. I am also a customer of, and have been inside of, the Blue Waters supercomputer at the National Petascale Computing Facility at the University of Illinois at Urbana-Champaign.

Anyway, GPUs do not exist in a vacuum. They get mounted onto PCIe cards (75 watts) with special external power cables that bring the max wattage for the device to almost 300 watts. The goal for the designer is always to cram as many of the GPUs as will fit inside the 300 watt power limit of the carrier. If the GPU card is only drawing 15 watts, then it is idling. So, if you want to actually use 2,649 of these cards, @ 300 watts each you need just shy of 800 kilowatts. Those cards have to be mounted in real computers with their own CPUs, memory, and network cards, which will add at least another 200 kilowatts, or a total of about one megawatt. You can’t fit a megawatt in one cabinet. Most existing data centers were built for 8 to 10kw cabinets. I have some cabinets that run 18kw, but they require supplemental cooling. So, the best you could hope for is about 60 cabinets to hold your GPUs. Density might be increased with water cooling, but you still have to physically deliver power to the cabinets. Finally, the latest Nvidia Tesla V100’s run about $6,000 US each, so your $300,000 price tag is going to be a little light. Even with a deep volume discount, you will be over $10 million just for GPUs.
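
The figures in this exchange are easy to sanity-check.  The sketch below simply redoes the arithmetic with the numbers quoted by the two commenters above (the card count, per-card wattages, cabinet power, and unit price are theirs, not independently verified):

```python
# Back-of-envelope check of the GPU-cluster numbers quoted above.
cards = 2649

# Kevin A's figures: 21 TFLOP and 15 W per card
print(round(cards * 21 / 1000, 1), "PFLOP nominal")        # ~55.6 PFLOP
print(round(cards * 15 / 1000, 1), "kW at 15 W per card")  # ~39.7 kW

# Thomas Edwardson's figures: ~300 W per populated PCIe slot, plus host
# servers, and 8-18 kW per data-center cabinet
gpu_kw = cards * 300 / 1000            # ~795 kW for the GPU slots alone
total_kw = gpu_kw + 200                # + ~200 kW of host servers ~= 1 MW
print(round(gpu_kw), "kW for GPUs,", round(total_kw), "kW total")
print("cabinets needed at 18 kW each:", round(total_kw / 18))   # ~55
print("GPU cost at $6,000 each: $", cards * 6000)               # ~$15.9M
```

The arithmetic on both sides holds up; as the replies below make clear, the real disagreement is over which per-card power figure and form factor is realistic for a production machine.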

Kevin A
Reply to  Thomas Edwardson
February 18, 2020 11:56 pm

https://developer.nvidia.com/embedded/jetson-xavier-nx
15 watt /21 TeraFlop = 0.000000000000714285714 or 7.14E-13
Price at 2,500 units is about 50% off the SRP of $399.
The Nano 10 watt / 472 GFLOP = 0.0000000000211864406780 or 2.11E-11 for $99.00 SRP
And yes, I used to cram many Xeons into boxes as Senior Technical Engineering Support for Intel until I burned out… Now I just tinker with bleeding tech in my retirement..
FYI: You can run the benchmarks they did to get those numbers

Kevin A
Reply to  Kevin A
February 19, 2020 9:03 am

With liquid cooling instead of air:
223 PetaFLOP with 222,581 Xavier NX
or $1,059,908.68 + $50K cooling unit.
158,986.3 watts or 31,797.26 amps at 5 volt
Using a Mosfet like IXTN660N04T4 you could build a power supply with 96 devices for $4K + change.

Thomas Edwardson
Reply to  Kevin A
February 19, 2020 7:22 pm

Ah, the NVidia Xavier NX. You are comparing apples and oranges, or maybe grapes to watermelons.

The Xavier NX is a toy compared to the likes of the NVidia Tesla V100 and P100, which are used in general purpose supercomputing like weather forecasting (If there is such a thing as “general purpose supercomputing”, it runs Linux and includes CPUs, GPUs, Infiniband interconnects, and shared global filesystems like Luster). The specs for the NX quote 21 TOPs (int8). TOPs are integer operations. A TOP is not a FLOP. The spec sheet says the NX is good for 6 TFLOPs (FP16) (FP16 = 16 bit floating point, which are 1970’s grade precision). The Nvidia V100 supports 7.8 TFLOPS (FP64) (64 bits), which is markedly better.

But again, the GPU requires a server with PCIe slots for mounting. If you use High Performance Compute servers like the IBM Power9, a DELL Poweredge r740, or an HPE Apollo 6500, then mounting 2649 GPU cards will get you a supercomputer like Summit https://en.wikipedia.org/wiki/Summit_(supercomputer), or Stampede https://www.tacc.utexas.edu/systems/stampede. These are large systems that span several 100s of cabinets and consume megawatts of power.

Please explain how you will mount 2649 Nvidia Xavier NX GPU cards in one cabinet. Be specific.

For the rest of you, here are some pretty pictures of 10 of the fastest supercomputers in existence … https://www.networkworld.com/article/3236875/embargo-10-of-the-worlds-fastest-supercomputers.html

And if you liked those, then try these oldies but goodies … https://royal.pingdom.com/10-of-the-coolest-and-most-powerful-supercomputers-of-all-time/

Waza
February 17, 2020 7:52 pm

Hi all.
I’m guessing there are sort of 4 type of weather/ climate models.
1. 3-7 day forecast
2. 7-16 (384hrs) day forecast
3. 16day to 3 months forecast.
4. Decadal climate models.
I’m also guessing this article is about the US ability to do 2 above. BUT IMO 3 (long-range seasonal) is by far more important for farmers and emergency planners.

Chaswarnertoo
Reply to  Waza
February 17, 2020 11:37 pm

Pay Piers Corbyn, Jezza’s smart older brother. No supercomputer but more accurate than the Met. Office. GIGO.

Anthony Banton
Reply to  Chaswarnertoo
February 18, 2020 10:30 am

Please provide a link to independent statistics to verify that assertion.
As an ex-professional Metman, I can say it has long been known that Mr Corbyn is the only one who does (assert he is God’s gift to Wx forecasting).
To boot he refuses to say how he does it – other than via some vague ‘sunspot analysis’.
With that insight he can (on his own account) predict regional weather – Globally LOL

This is what Willis said in this thread ……
https://wattsupwiththat.com/2012/07/05/putting-piers-corbyn-to-the-test/

“I don’t understand how I’m supposed to tell if Piers is right or not. He only makes four “forecasts” that are so vague that Nostradamus would be proud of them:

• Waves of major thunderstorms, tornadoes and giant hail continue mainly in N/E parts.
• Searing heat will grip West / South parts with extremely dangerous ‘out-of-control’ forest fires especially later in month.
• Frequent low pressure over Great Lakes / N/E
• Variable band of high(er) pressure from NW to SE parts divides USA through July

As far as I can see, not one of those is specific enough to be falsifiable. How hot is “searing”? How many fires? Just how high is a “variable band of high(er) pressure”? How “frequent” will the low pressure be, and how low does it have to be to count?
I like Piers, and I’ve corresponded with him. But I keep waiting for him to make an actual verifiable checkable falsifiable forecast. Perhaps he’s made one, I haven’t checked them all, but the ones I have looked at have been the equivalent of these, vague claims about “searing” heat and “frequent low pressure”.
My own forecast is that we will continue to have waves of thunderstorms, tornadoes, forest fires, and frequent instances of high and low pressure, particularly over the NE, SW, NW, and SE parts of the US.
w.”

Exactly Willis:
There are gullible idiots out there in their thousands to buy into Corbyn’s bollocks.

Witness the one I’m replying to.

Joel O'Bryan
February 17, 2020 8:05 pm

NOAA and Team GFS is still trying to recover from the TS Sandy beat down they got in 2012 by ECMWF.
Send money.
Government Bureaucracies succeed in winning more money by failing. Success means neglect and less money. It *IS* how things work in big government… or actually *don’t*.

February 17, 2020 8:20 pm

As an operational forecaster, I don’t see this as desperate as portrayed.
For this post, I’ll focus on the current GFS in the western US. Since the new version was released last year, the improvements have been noticeable. For the Front Range of Colorado, the GFS does a better job than the Euro on containing snow washing over the Continental Divide – ie – it is more accurate. I have also noticed for QPF, when there are differences between the GFS & Euro, the GFS is commonly better, especially when the GFS has greater QPF than the Euro. It is kicking the Euro’s butt tonight, where the GFS showed 2-5″ of snow in the Colorado Front Range & the Euro showed less than 1/2 that. I already have 2″ & it is still dumping – GFS winning again.
In short, I love the idea of getting better computational capacities for the US systems, however, I don’t see that the American systems are so desperately behind the times as this post suggests.

Reply to  Jeff L
February 17, 2020 9:39 pm

For me, it’s not that GFS should be as good as ECMWF for operational forecasting… it shouldn’t even be a race.
GFS or whatever NOAA has should be light-years ahead. Not even in the same league.

Money wasted for decades on climate simulation systems has cost operational forecasting. Dearly.
The operational forecasting community should stand up to the NOAA climate model scammer community.
We want our cutting edge supercomputers back.

WXcycles
Reply to  Jeff L
February 18, 2020 5:20 am

Jeff, I find it hard to swallow that you’d not think a 4 km resolution global model with far more data input and more development opportunities would be a major advance worth having. GFS is a terribly inferior model option these days. Fine if you want to be 4th best and slipping to 6th real soon.

AntonyIndia
February 17, 2020 8:44 pm

All these hot super computers run for the many different past/future climate models everywhere will contribute quite a lot to global warming, apart from the extra CO2 produced to keep them on 86400 seconds /24/7/365…

Editor
February 17, 2020 9:36 pm

Gee, I remember when WUWT had several posts ridiculing the UK’s new supercomputer.

That implies instead of falling all over ourselves complaining about our current equipment we should be ridiculing the need for more equipment.

Or was WUWT wrong then?

https://wattsupwiththat.com/2017/08/04/report-127-million-climate-supercomputer-no-better-than-using-a-piece-of-paper/

https://wattsupwiththat.com/2009/08/28/met-office-supercomputer-a-megawatt-here-a-megawatt-there-and-pretty-soon-were-talking-real-carbon-pollution/

niceguy
Reply to  Ric Werme
February 17, 2020 10:23 pm

Just because climate modeling super computers are patently useless doesn’t mean weather modeling super computers can’t do real good, when the long term nonsense is avoided.

Michael Jankowski
Reply to  Ric Werme
February 18, 2020 9:32 am

Those links were ridiculing the inaccuracy in results and ironic energy footprint.

niceguy
February 17, 2020 10:19 pm

The “Alabama will be hit” story gave us not only Sharpiegate but also Trump’s spaghetti graph. What did we learn from that? That the models give different results and that they were mostly wrong. Looking at Trump’s spaghetti plot, Alabama was going to be hit, but Dorian made a great effort to turn and mostly avoid the land.

But everybody is talking about the inane sharpie drawing and not the spaghetti.

Also, we learnt that the same people panicked by a modicum of possible warming 50 years from now are calmer than a Zen Monk when faced with up to 30% chances of dangerous winds over some part of Alabama slightly more than a week after the announcement, and don’t believe that such risk warrants any preparation more than 4-5 days in advance.

Also, we learnt that the same people think the spaghetti graph is about spaghetti-sized weather phenomena and that only a strand over Alabama would be a risk for Alabama.

Rob_Dawg
February 17, 2020 10:58 pm

Executing legacy Fortran IV code faster doesn’t address even the top ten most effective efforts in weather prediction.

A modest suggestion for the Bezos $10 billion.

Mass produced, we could make GPS satellite phone solar powered weather stations for a few hundred bucks. Call it a million units for half a billion. Put them everywhere. The weather data alone would probably have a payback period measured in weeks. The money saved in not doing stupid things about climate because of what we learn would take years but would prove priceless.

It won’t happen because that’s not what Bezos is actually interested in. Just like better weather prediction is not the reason for wanting more computing power.

Reply to  Rob_Dawg
February 21, 2020 1:30 pm

What exactly do you mean by a “GPS satellite phone”? On first glance, this looks like ‘word salad’ …

Rod Evans
February 17, 2020 11:20 pm

I am struggling to understand the logic in play, with this desire for ever bigger computers to model weather/climate?
One of the tests of a forecasting system and its value is, does it relate closely to what happened, or is happening now. In other words does it actually do the job you have paid it to do?
Do the forecasts actually happen? Do the hind casts accurately reflect what has already happened?
So far the IPCC models all run hot. They do not reflect the climate conditions we are experiencing? Not content with the fundamental faults the modelling performance reflects, some wiz thought the best thing to do with outputs (that are clearly wrong) is to average them? So we end up with an average output that is just averagely wrong.
These are the kind of people who are asking for more money and resources to continue on with their unique approach to climate research.
The question must eventually come up. What is the point of giving researchers more money/resources, that don’t seem capable of using it for any purpose, beyond showing they don’t understand the complexity of what they are working on?
A 100 petaGIGO computer is no better than any other sized GIGO activity.

RockyRoad
February 18, 2020 2:23 am

I saw a statement by some guru working at weather.com several years ago that said today’s forecast really was just 80% accurate; tomorrow’s forecast was 60% accurate; the day after that it was 40% accurate; the next day it was only 20% accurate; and down to 0% accurate on the fifth day!!

He was describing the logic for adding the 5-day forecast and not bothering with anything beyond that! (Up to that point, they had been offering a 10-day forecast that was actually 14-days long!!….derrr…)

What will more computing power give us?… 10% accuracy on the sixth day??

Grumbler
February 18, 2020 2:38 am

Kevin A.
“deep by 72 inches tall you can fit 2,649 cards producing 55.645 PetaFlop, consuming 39,746 watts and all this for less then $300,000.00
This is 0.7 watt per TeraFlop – unheard of.“

Won’t this mean the end of Bitcoin?

Kevin A
Reply to  Grumbler
February 19, 2020 12:01 am

No, it means I’ll be waiting forever to take delivery while the ‘coin farmers’ eat up the first two generations, on Amazon someone is listing a Nano for twice its value because Amazon is out… Such is life

Not Chicken Little
February 18, 2020 2:52 am

I am working on a computer program that will predict all sorts of random and chaotic events. It works really well, especially for past events! Now I am tuning it to better predict the future, and this program is so good it is already predicting it will be a great success at predicting the future!

So now all I need is better computers and funding to make this all a reality! When it’s finished it will probably be able to predict lottery numbers, so we can use it to make EVERYONE rich!

Please send money ASAP. You can find me here at WUWT. Thank you. By the way my program is predicting it will be cold this winter, so you all might want to prepare for that. No charge for that prediction. 🙂

cjc
February 18, 2020 5:05 am

The problem is made much worse by the fact that the models are programmed to match the limitations of 1980’s-vintage vector computers and their compilers, rather than those of current microprocessor-based systems (the assumptions are that memory accesses are free, and that one should do at most one IF and only three or four arithmetic operations per loop, with no “vector dependencies”).
The facts now are that memory accesses are a hundred times more expensive than even single arithmetic operations, and that the processors are “pipelined superscalar” that are capable of executing as many as fifty arithmetic operations simultaneously, provided that the code puts them together.
In the WRF meteorology model, this mis-structuring causes advection to be three times slower than it should be, and microphysics to be as much as eight times slower, according to cleanups and benchmarking that I have performed. There’s a LOT of “Not Invented Here” at NCAR.
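
cjc’s point about memory traffic can be illustrated in a few lines, although with stand-ins rather than the actual WRF Fortran.  The sketch below (an assumption: it uses NumPy plus the numexpr package, installable with pip) contrasts the old many-small-passes style, where every operation streams the whole array through memory and produces a temporary, with a fused evaluation that performs all of the arithmetic on each element while it is in cache:

```python
# Illustration of "arithmetic per memory access" -- not the WRF code itself.
import time
import numpy as np
import numexpr as ne   # assumed available: pip install numexpr

n = 10_000_000
a, b, c = (np.random.rand(n) for _ in range(3))

t0 = time.perf_counter()
# Vector-machine style: each operator below is a separate full pass over
# memory and allocates a temporary array.
unfused = 2.0 * a + 3.0 * b - c / 7.0
t1 = time.perf_counter()
# Fused style: numexpr compiles the whole expression and streams through
# the arrays in cache-sized blocks, doing several operations per element.
fused = ne.evaluate("2.0 * a + 3.0 * b - c / 7.0")
t2 = time.perf_counter()

assert np.allclose(unfused, fused)
print(f"separate passes: {t1 - t0:.3f} s   fused: {t2 - t1:.3f} s")
```

numexpr also uses multiple threads, so this is not a perfectly controlled experiment, but much of the gain comes from doing more arithmetic per memory access, which is the same restructuring being described for the Fortran.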

RockyRoad
Reply to  cjc
February 18, 2020 6:51 am

Yes, there’s been a lot in the news lately about the development and benefits of quantum computing, which, from what I understand, more closely aligns in theory with the phenomena that control the weather.

It is supposed to be the next big thing!

We would be wise to wait until that computer architecture matures before spending big bucks on big iron to make big predictions–about anything!

Reply to  cjc
February 18, 2020 10:24 am

cjc
Good point regarding memory access. The Met office recognised this problem a good few years ago. I believe that they have the LFRic project to try and address the problem but I don’t know how far they have got.
It can’t be an easy job writing code for one of those Crays.

RockyRoad
Reply to  Finn McCool
February 18, 2020 1:08 pm

Their OS takes care of the multi-threading.

DrTorch
February 18, 2020 6:52 am

The grant-funded science industry is always sounding the alarm of how the US is falling behind someone else. It’s Japan, China, S. Korea (electronics, biotech). Israel, India or Russia (Software, mathematics, military tech). Sometimes Europe (LHC, weather modeling).

For decades this has been the case, yet somehow the US still competes, even with only 5% of the world’s population.

The bigger problems of course are infiltrators and espionage into the US university and laboratory systems, and poor public schools being made yet worse with high immigration. But those rarely get mentioned during the hype.

Rud Istvan
February 18, 2020 2:03 pm

I have posted before, here and elsewhere, on both weather and climate models. Have also communicated directly with Cliff Mass, as numerical weather prediction is his thing and he knows his stuff.
Three observations.
1. Cliff’s main point here isn’t about supercomputer resources. It is about the dysfunctional present organization that manages them and plans the future. President Trump could fix that given a chance second term.
2. Bringing NWS resources up to snuff is CHEAP compared to the billions wasted annually on US climate models that have no hope of success because of the unavoidable parameterization problem that drags in attribution.
3. A more accurate 3-5 day forecast would be invaluable. Trucking firms, airlines, utilities, farmers (planting/harvesting) would all benefit enormously. Living on the beach in Fort Lauderdale, a narrower 3 day cone of hurricane uncertainty would be well worth a few extra tax dollars. Look at the evacuation disaster from Miami triggered by an erroneous Irma cone track just a few years ago.

Reply to  Rud Istvan
February 18, 2020 2:27 pm

Improve short-term accuracy until it extends long enough to act where needed and not act where not needed.
Makes sense and will build confidence so that fewer people will ignore the warnings when issued.

February 18, 2020 2:35 pm

Mr. Layman here.
In weather forecasting, how much computing power is put into matching/comparing past weather patterns?
It would seem that, even if we don’t understand the “whys”, if a past pattern is repeating then it’s likely the result will repeat.

Svend Ferdinandsen
February 18, 2020 2:47 pm

It could be they need larger computers, but it could also help with more offices equipped with windows, so that they could actually see the weather. Just a simple thought.

It looks intentional that all predictions are erased before the actual weather arrives.

Mark Luhman
February 18, 2020 10:07 pm

Let’s face it, computers have no predictive qualities whatsoever; anyone who tells you different should be asked to put their next year’s salary on the stock market based on their computer model. The only takers will be fools.

Patrick MJD
February 19, 2020 3:26 am

My experience of bigger, faster computers is they just crash faster and more heavily. The biggest advancement IMO is virtualisation which is nothing new (I keep telling my Windows Hyper-V “experts”).

Kevin A
Reply to  Patrick MJD
February 19, 2020 9:17 am

I have to agree, after fighting Windows for two weeks trying to run Linux I returned the software and purchased a new computer for Ubuntu to live on a 4TB drive. And no, Windows Subsystem for Linux (WSL 2) is not the answer unless you’re not trying to get anything done.

War
February 19, 2020 9:18 pm

How about this story that 5G is a threat to accurate weather forecasting from the German media outlet DW, anyone heard of this? article -‘Will 5G mobile networks wreck weather forecasting? ‘

Also I would add that we’re clueless and in uncharted territory with weather at this point on a large scale; we need results from the CERN study on clouds and cosmic rays.

Reply to  War
February 21, 2020 1:43 pm

re: “How about this story that 5G is a threat to accurate weather forecasting”

Bullocks.

Give it to me in “first principles” in physics or it’s just horse hockey …