The Future of U.S. Weather Prediction Will Be Decided During the Next Month

Reposted from the Cliff Mass Weather Blog

During the next few weeks, leadership in NOAA (the National Oceanic and Atmospheric Administration) and the National Weather Service (NWS) will make a key decision regarding the future organization of U.S. numerical weather prediction, a decision that will determine whether U.S. weather forecasting remains third rate or advances to world leadership.  It is that important.

Specifically, they will define the nature of a new center for the development of U.S. numerical weather prediction systems in a formal solicitation of proposals (using something called an RFP, a Request for Proposals).
This blog will describe what I believe to be the essential flaws in the way NOAA has developed its weather prediction models: how the U.S. came to be third-rate in this area, why this is a particularly critical time with unique opportunities, and how the wrong approach will lead to continued mediocrity.
I will explain that only a profound reorganization of how NOAA develops, tests, and shares its models will be effective.  It will be a relatively long blog and, at times, somewhat technical, but there is no way around that considering the topic.  I should note that this is a topic I have written about extensively over the past several decades (including many blogs and an article in the peer-reviewed literature); I have also given dozens of presentations at professional meetings, testified in Congress, and served on a number of NOAA/NWS advisory committees and National Academy panels dealing with these issues.
The Obvious Problems
As described in several of my previous blogs, U.S. numerical weather prediction, the cornerstone of all U.S. weather prediction, is behind other nations and far behind the state of the art.  Our global model, the GFS, is usually ranked third or fourth, behind the European Center and the UK Met Office, and often tied with the Canadians.
We know the main reason for this inferiority: the U.S. global data assimilation system is not as good as those of the leading centers.  (Data assimilation is the step of using all available observations to produce a comprehensive, physically consistent description of the atmosphere.)
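To make the idea concrete, here is a toy, hedged sketch of the analysis step at the heart of any data assimilation system: blend the model’s prior forecast with observations, weighting each by its assumed error. The two-variable state, the covariances, and the single observation below are illustrative assumptions, not anything taken from NOAA’s operational code.

```python
# Toy sketch of a data assimilation analysis step (illustrative values only).
import numpy as np

def analysis_update(x_b, y, H, B, R):
    """Return the analysis x_a = x_b + K (y - H x_b), where the Kalman gain K
    weights the observations by their error relative to the background forecast."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_b + K @ (y - H @ x_b)

x_b = np.array([280.0, 5.0])              # background: temperature (K), wind (m/s)
H   = np.array([[1.0, 0.0]])              # we observe only the temperature
B   = np.array([[1.0, 0.3], [0.3, 2.0]])  # assumed background-error covariance
R   = np.array([[0.5]])                   # assumed observation-error covariance
y   = np.array([281.2])                   # a single observation

print(analysis_update(x_b, y, H, B, R))   # temperature nudged toward the observation;
                                          # wind adjusted through the B correlation
```

The quality of those error covariances and observation operators, estimated in millions of dimensions rather than two, is where the leading centers earn much of their edge.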
The U.S. seasonal model, the CFSv2, is less skillful than the European Model and is aging, while the U.S. is running a number of poorly performing legacy modeling systems (e.g., the NAM and Short-Range Ensemble System).  Furthermore,  our global ensemble system has too few members and lacks sufficient resolution.  The physics used in our modeling systems are generally not state-of-the-art, and the U.S. lacks a large, high-resolution ensemble system capable of simulating convection and other small-scale phenomena. Finally, operational statistical post-processing, the critical last step in weather prediction, is behind that of the private sector, like weather.com or accuweather.com.
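Statistical post-processing sounds abstract, so here is a deliberately simple, MOS-style sketch: learn a correction from past raw model output and matching observations, then apply it to a new forecast. The numbers and the single-predictor linear fit are illustrative assumptions; operational systems use many predictors and far more sophisticated statistics.

```python
# Hedged sketch of MOS-style statistical post-processing (toy numbers).
import numpy as np

past_model_temps = np.array([271.0, 275.0, 280.0, 284.0, 290.0])  # raw model output (K)
past_observed    = np.array([272.5, 276.0, 281.5, 284.5, 291.5])  # verifying observations (K)

# Fit a linear correction: observed ~ slope * model + intercept
slope, intercept = np.polyfit(past_model_temps, past_observed, 1)

new_raw_forecast = 286.0
corrected = slope * new_raw_forecast + intercept
print(f"raw {new_raw_forecast:.1f} K -> post-processed {corrected:.1f} K")
```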

The latest global statistics for upper-air forecast skill at 5 days show the U.S. in third place.
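For readers wondering how such rankings are scored: the headline comparison is typically the anomaly correlation of 5-day 500 hPa height forecasts against verifying analyses. Below is a minimal sketch of that score with stand-in numbers; real verification follows WMO conventions (area weighting, a standard climatology, and so on).

```python
# Hedged sketch of the anomaly correlation coefficient (ACC) used to rank
# global models; the arrays below are illustrative stand-ins for gridded fields.
import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    f = forecast - climatology          # forecast anomaly from climatology
    a = analysis - climatology          # analyzed ("observed") anomaly
    return np.sum(f * a) / np.sqrt(np.sum(f**2) * np.sum(a**2))

clim     = np.array([5500.0, 5600.0, 5700.0, 5800.0])     # 500 hPa heights (m)
analysis = clim + np.array([ 40.0, -30.0,  20.0, -10.0])
forecast = clim + np.array([ 35.0, -20.0,  25.0,  -5.0])
print(round(anomaly_correlation(forecast, analysis, clim), 3))  # 1.0 would be perfect
```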

There is one area where U.S. numerical weather prediction is doing well: high-resolution rapid refresh weather prediction.  As we will see, there is a reason for this positive outlier.
The generally inferior U.S. weather modeling is made much worse by NOAA’s lack of computer resources.  NOAA probably has 1/00th of what they really need, crippling NOAA’s modeling research as well as its ability to run state-of-the-science modeling systems.
Half-way Steps Are Not Enough
Although known to the professional weather community for decades, the inferiority of U.S. weather prediction became obvious to the media and the general U.S. population during Hurricane Sandy (2012), when the European Center model provided a skillful forecast days ahead of the U.S. GFS.  After a number of media stories and congressional inquiries, topped off by a segment on the NBC Nightly News about the abysmal state of U.S. weather prediction, NOAA/NWS leadership began to take steps that were funded by special congressional budget supplements.

New computers were ordered (the U.S. operational weather prediction effort previously had only 1/10th the computer resources of the Europeans), an improved hurricane model was developed, and NOAA/NWS began an effort to replace the aging U.S. global model, the GFS.  The latter effort, known as the Next Generation Global Prediction System (NGGPS), included funds to develop a new global model and to support applicable research in the outside community.
During the past 8 years, there has been a lot of activity in NOAA/NWS with the goal of improving U.S. weather prediction, and some of it has been beneficial:

  • NOAA management has accepted the need to have one unified modeling system for all scales, rather than the multitude of models they had been running.
  • NOAA management has accepted the idea that the U.S. operational system must be a community system, available to and used by the vast U.S. weather community.
  • NOAA management has increased funding for outside research, although they have not done this in an effective way.
  • NOAA has replaced the aging GFS global modeling system with the more modern FV-3 model.
  • NOAA has made some improvements to its data assimilation systems, making better use of ensemble techniques.
  • Antagonistic relationships within NOAA, particularly between the Earth System Research Lab (ESRL) and the NWS Environmental Modeling Center (EMC), have greatly lessened.

But with all of these changes and improvements in approach, U.S. operational weather prediction run by NOAA/NWS has not advanced relative to other nations or to the state of the science.  We are still third or fourth in global prediction, with the vaunted European Center maintaining its lead.  A large number of inferior legacy systems are still being run (e.g., NAM and SREF), computer resources are still inadequate, and the NOAA/NWS modeling system is used by very few outside of the agency.
This is not success.  This is stagnation.
But why?  Something is very wrong.
As I will explain, the key problems holding back NOAA weather modeling can be addressed (and quickly), but only if NOAA and Congress are willing to follow a different path.  The problem is not money, and it is not the quality of NOAA’s scientists and technologists (they are motivated and competent).  It is about organization.

Let me repeat this.  It is all about ineffective organization.
With visionary leadership now at NOAA and the potential for a new center for model development, these deficiencies could be fixed.  Rapidly.

The REAL Problems Must be Addressed
So with substantial resources available, the acute need for better numerical weather prediction in the U.S., and the acknowledged necessity for improvement, why is U.S. numerical weather prediction stagnating?  There are several reasons:
1.  No one individual or entity is responsible for success

Responsibility for U.S. numerical weather prediction is divided over too many individuals or groups, so in the end no one is responsible.  To illustrate:

  • The group responsible for running the models, the NWS Environmental Modeling Center (EMC), does not control most of the people who develop new models, since they are located OUTSIDE of the NWS, in NOAA labs such as ESRL.
  • Financial responsibility for modeling systems is divided among several groups including OSTI (Office of Science and Technology Integration) and OWAQ (NOAA Office of Weather and Air Quality), and a whole slew of administrators at various levels (head of the National Weather Service, head of NCEP, head of EMC, NOAA Administrator, and many more).

U.S. weather prediction is not the best?  No one is responsible and fingers are pointed in all directions.
2.  The research community is mainly using other models, and thus not contributing to the national operational models.
The U.S. weather research community is the largest and best in the world, but in general they are NOT using NOAA weather models.  Thus, research innovations are not effectively transferred to the operational system.

The National Center for Atmospheric Research in Boulder, Colorado

Most American weather researchers use the weather modeling systems developed at the National Center for Atmospheric Research (NCAR), such as the WRF and MPAS systems.  They are well documented, easy to use, and supported by NCAR staff and a large user community, with tutorials and annual workshops.  Time after time, NOAA has rejected using NCAR models and decided to go with in-house creations, which has led to a separation of the operational and research communities.  It was a huge and historic mistake that has left several at NCAR reluctant to work with NOAA again.

There is one exception to this depressing story: the NOAA ESRL group took on WRF as the core of its Rapid Refresh modeling systems (RAP and HRRR).  These modeling systems, not surprisingly, have been unusual examples of great success and state-of-the-science work in NOAA.

3.  Computer resources are totally inadequate to produce a world-leading numerical weather prediction modeling system.
NOAA currently has roughly 1/10 to 1/100th of the amount of computer resources necessary for success.  Proven technologies (like 4DVAR and high-resolution ensembles) are avoided,  ensembles (running the models many times to secure uncertainty information) are low resolution and small, and insufficient computer resources are available for research and testing.
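To illustrate the parenthetical point about ensembles: running the same forecast many times with perturbed inputs yields a distribution, from which a best estimate, a spread, and event probabilities fall out. The eight members and the 10 mm threshold below are purely illustrative.

```python
# Hedged sketch of how an ensemble turns many runs into uncertainty information.
import numpy as np

# 24-hour precipitation (mm) from eight hypothetical ensemble members
members_mm = np.array([3.0, 5.5, 4.2, 12.0, 6.1, 4.8, 5.0, 7.3])

print("ensemble mean (mm):  ", round(members_mm.mean(), 1))
print("spread, std dev (mm):", round(members_mm.std(ddof=1), 1))
print("P(precip > 10 mm):   ", round(float(np.mean(members_mm > 10.0)), 2))
```

With too few members or too-coarse resolution, both the spread and the probabilities become unreliable, which is exactly the deficiency noted above.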
Even worse, NOAA computer resources are very difficult for visitors to use because of security and bureaucracy issues; gaining access can take the better part of a year, if it is granted at all.
There is a lot of talk about using cloud computing, but there is still the issue of paying for it, and cloud computing has issues (e.g., great expense) for operational computing that requires constant, uninterruptible large resources.

With responsibility for U.S. numerical weather prediction diffused over many individuals and groups, no one has put together a coherent strategic plan for U.S. weather computing or made the case for additional resources.    Recently, I asked key NWS personnel to share a document describing the availability and use of NWS computer resources for weather prediction:  no such document appears to exist.
4.  There is a lack of careful, organized strategic planning.

NOAA/NWS lacks a detailed, actionable strategic plan for how it will advance U.S. numerical weather prediction: how modeling systems will advance over the next decade, including detailed plans for coordinated research and computer acquisition.  Major groups, such as the European Center and the UK Met Office, have such plans.  We don’t.  Such plans are hard to make when no one is really responsible for success.
NOAA has tried to deal with the lack of planning by asking U.S. researchers to join committees pulling together a Strategic Implementation Plan (SIP), but these groups have been of uneven quality, they have tended to produce long laundry lists, and their recommendations have no clear road to implementation.

5.  The most innovative U.S. model development talent is avoiding NOAA/NWS and going to the private sector and other opportunities.
U.S. operational weather prediction cannot be the best when the best talent coming out of our universities doesn’t want to be employed there.  Unfortunately, that is the case now.  Many of the best U.S. graduate students do not want to work for NOAA/NWS; they want to do cutting-edge work in a location that is intellectually exciting.


EPIC:  The Earth Prediction Innovation Center
Congress and others have slowly but surely realized that U.S. numerical weather prediction is still in trouble and want to deal with this problem.  To address the issue, Congress passed recent legislation (the National Integrated Drought Information System Reauthorization Act of 2018), which instructs NOAA to establish the Earth Prediction Innovation Center (EPIC) to accelerate community-developed scientific and technological enhancements into operational applications for numerical weather prediction (NWP).  Later appropriations legislation provided funding.
Last summer, NOAA held a community workshop regarding EPIC and asked for input on the new center.  There was strong support, with most participants supporting a new center outside of NOAA.  The general consensus: it will take a real change in approach to produce a real change in outcome.  They are right.

Two Visions for EPIC

There are two visions of EPIC, and the essential question is which one NOAA will propose in its request for proposals, to be released during the next month.
A Center Outside of NOAA with Substantial Autonomy and Independence
In this vision, EPIC will be an independent center outside of NOAA.  It will be responsible for producing the best unified modeling system in the world, supplying the one point of responsibility that has been missing for decades.
This EPIC  center would maintain advisory committees that would directly couple to model developers, and should have sufficient computer resources for development and testing.   It would build and support a community modeling system, including comprehensive documentation, online support, tutorials, and workshops.
Such a center should be in a location attractive to visitors and should entrain groups at NCAR and UCAR (like the Developmental Testbed Center).  It will maintain a vibrant lecture series and employ some of the leading model and physics scientists in the nation.

EPIC should be led by a scientific leader of the field, with a strong core staff in data assimilation and physics.  This EPIC center will be able to secure resources from entities outside of NOAA (although NOAA funding will provide the core support).
Such an EPIC center might well end up in Boulder, Colorado, the intellectual center of U.S. weather research (with NCAR, the NOAA ESRL lab, the University of Colorado, the Joint Center for Satellite Data Assimilation, and more), and there is hope that UCAR (the University Corporation for Atmospheric Research) might bid on the new center.  If it did so and won the contract, substantial progress could be made in reducing the yawning divide between the U.S. research and operational numerical weather prediction communities.

The Alternative: A Virtual Center Without Independence Or Responsibility
There are some in NOAA who would prefer that the EPIC center simply be a contractor to NOAA that supplies certain services.  It would not have responsibility for providing the best modeling systems in the world, but would accomplish NOAA-specified tasks like external support for the unified modeling system and fostering the use of cloud computing.  It is doubtful that UCAR would bid on such a center, but it might be attractive to some “beltway bandit” entity.  This would be a status-quo solution.
The Bottom Line
From all my experience in dealing with this issue, I am convinced that an independent EPIC, responsible for producing the best weather prediction system in the world, might well succeed. It is the breakthrough that we have been waiting for.
Why?  Because it can simultaneously solve the key issues that have been crippling U.S. operational numerical weather prediction centered in NOAA: the lack of single-point responsibility, the complex array of too many players and decision makers, and the separation of the research and operational communities, to name only a few.
A NOAA-dependent virtual center, which does not address the key issues of responsibility and organization, will almost surely fail.
And let me stress: the problems noted above are the result of poor organization and management.  NOAA and NWS employees are not the problem.  If anything, they have been the victims of a deficient organization, working hard to keep a sinking ship afloat.


The Stars are Aligned
This is the best opportunity to fix U.S. NWP I have seen in decades.  We have an extraordinary NOAA administrator (Neil Jacobs) whose top priority is fixing this problem (and he is an expert in numerical weather prediction as well).  The nation (including Congress) knows about the problem and wants it fixed.  The President’s Science Advisor (Kelvin Droegemeier) is also a weather modeler and wants to help.  There is bipartisan support in Congress.
During the next month, the RFP (request for proposals) for EPIC will be released by NOAA.  We will then know NOAA’s vision for EPIC, and thus we will know whether this country will reorganize its approach and potentially achieve a breakthrough success, or fall back upon the structure that failed us in the past.

HT/Yooper

Ron Long
January 28, 2020 2:28 am

Charles Rotter, it’s great that you reposted this article about upgrading NOAA. This is exactly the kind of problem that President Trump is great at fixing, and his Science Adviser as well. Shirly the USA can be the leader in global weather forecasting/prediction, with all of its resources and talent, so I can’t help but wonder if part of the problem is a detour of funding and talent into Climate Change issues? Myself, I visit the Accuweather website to find out what clothes to pack when I go somewhere and they have never let me down.

Reply to  Ron Long
January 28, 2020 3:40 am

Ron,
Who is Shirly?
Surely you jest!

Latitude
Reply to  Ron Long
January 28, 2020 5:21 am

detour of funding and talent into Climate Change issues….of course

Where NOAA excels is twisting and manipulating to show more global warming

Aeronomer
Reply to  Latitude
January 30, 2020 2:30 pm

My thoughts exactly. Wonder how much better this situation might be if they weren’t wasting so many resources propping up the Climate Cult.

Reply to  Latitude
January 31, 2020 2:06 pm

Yes, a single responsible entity for modeling would work against NOAA’s data massaging and climate alarmism. Why would you expect the people currently at the top of the agency to choose that course?

Bloke down the pub
January 28, 2020 2:39 am

Coming from a country that is historically fixated on the weather, it’s probably not surprising that the UKMO rates highly, especially considering the money that’s been thrown at the issue over the years.

Jean Parisot
January 28, 2020 2:46 am

How will the taxpayers be assured that resources provided for improvements in weather forecasts aren’t wasted on climatology?

Bill Powers
Reply to  Jean Parisot
January 28, 2020 4:28 am

I can assure you that if the money is appropriated by Congress, the spend will be wasteful. That doesn’t mean what we buy won’t work. Think of our military. It works great with $2,000 hammers, but the job could get done with a lot less money from the taxpayers.

Rocketscientist
Reply to  Bill Powers
January 28, 2020 8:52 am

Bill, I have commented on this previously and will reiterate: In most cases it’s really only a $20 hammer (not so much for specialized tools, as they can be expensive for only a few, but ancillary systems certainly) with several hundred $$ of certifications from specialized labs to provide a safe system.
This “death by a thousand paper cuts” has accumulated over the years because at one time a process or test was naively omitted and a disaster occurred.
Often requirements are imposed across many unnecessary systems and we push back. Reciprocally, often the “push back” will cost more in requirements changes and coordination scheduling meetings than the actual cost savings to the system.
How many “pyrrhic victories” would you have us fight before complaining we are wasting time and budget?

On the up side, often many of these “challenges” are recorded for future consideration in documents ostensibly titled “LESSONS LEARNED”. However, my experience has demonstrated that often those documents gather dust quickly and merely remain “Lessons Recorded”.

Russ Wood
Reply to  Rocketscientist
February 1, 2020 8:53 am

It’s not just the military – it’s almost ANYTHING that flies! As an aeronautical engineering apprentice in the early 60’s, the small airliner my company was making needed a rubber stop for the bathroom door – you know, like almost everybody has to stop a door handle hitting the wall. Cost – a couple of (old) pence. I had to help the paperwork for this bit of rubber to get through ‘the system’, and it took weeks and page after page of paperwork to prove that it wouldn’t affect the aircraft’s safety. I don’t know how much this added to the final cost of the aircraft – probably a couple of hundred pounds!

Reply to  Bill Powers
January 28, 2020 11:05 am

Sometimes a thing properly described as “hammer” isn’t the claw-hammer you picture down at the local True Value, but instead is a unique pneumatic rivet setting hammer designed for riveting tight spaces inside of aircraft wings.

But hey, who wants to get in the way of a good slander.

MarkW
Reply to  Michael
January 28, 2020 6:43 pm

Other famous examples:
The expensive toilet seat turned out to be a specialized seat for a toilet in a submarine. It was stainless steel and had clamps on it that could seal the seat to the main body of the toilet forming a water tight seal.
The expensive coffee percolator turned out to be a pressurized brewer for a strategic bomber. The normal air pressure in these bombers while on patrol was low enough that water boiled at a temperature that was too low to properly brew coffee. Why couldn’t they just store all the coffee they needed? I don’t remember; I believe it had something to do with the length of the patrol.

MarkW
Reply to  Bill Powers
January 28, 2020 6:39 pm

There’s also the case where administrative costs for a contract aren’t billed as a separate item, but spread across all the individual items in the contract.

Bill T
Reply to  MarkW
January 29, 2020 3:42 am

Amen. Often half the cost of a job or product is in overhead.

Jean Parisot
Reply to  Bill Powers
January 28, 2020 10:48 pm

I’m too well acquainted with DCAA to not be concerned about the tactical spending issues. I’m more concerned about the strategic resource allocation. How will the use of weather tools be segregated from climate ‘research’? It seems as if the political push for climate consensus is an omnivorous beast showing up in every discipline and weakest tangential paper. Weather data processing is going to need strong walls to keep their resources focused on weather science.

Carbon Bigfoot
Reply to  Jean Parisot
January 29, 2020 5:06 am

How are the 20,000 5G satellites going to interfere with the weather forecasting? Experts predict the impact on the biosphere will be devastating.
https://www.5gspaceappeal.org/

Reply to  Carbon Bigfoot
January 29, 2020 6:20 am

re: “How are the 20,000 5G satellites …”

No comparison to the radiant energy UV/vis/IR of the sun at over 1,000 W / m^2 at the surface …

Do you like sitting in “the warming rays of the sun” on a cold or chilly day?

How about sitting or standing in front of a roaring fire in a fireplace, at a campsite or a ‘burn barrel’?

Do you ever consider such things, Carbon Bigfoot?

EternalOptimist
January 28, 2020 3:08 am

Charles,
If this is sorted out exactly as you would wish, how much better than accuWeather will it be ?

meanonsunday
Reply to  EternalOptimist
January 28, 2020 8:32 am

Accuweather is mostly just processing data that they get from NOAA/NWS. So they aren’t competing in any meaningful sense; Accuweather doesn’t even try to do 90% of the job. Any improvement by NWS will get passed on to Accuweather. Also keep in mind that Accuweather only has an interest in making the predictions that can generate commercial value, so even in the final step they are not trying to be as comprehensive as NWS.

Jeff Alberts
Reply to  EternalOptimist
January 28, 2020 9:05 am

The article wasn’t written by Charles, it was reposted from the Cliff Mass blog.

Patrick MJD
January 28, 2020 3:10 am

Predicting weather predicting, next month? Hummm…ok!

Big T
January 28, 2020 3:15 am

I wet my finger, hold it in air, and tell you 100 percent of the time, whether it is cold or hot.

MarkW
Reply to  Big T
January 28, 2020 7:30 am

How good is your finger at figuring out whether it will be cold or hot, tomorrow?

John Endicott
Reply to  MarkW
January 28, 2020 9:32 am

fingers aren’t any good at figuring out tomorrow’s weather, for that you need a beetle (see: the “Granny Versus the Weather Bureau” episode of the Beverly Hillbillies).

Reply to  Big T
January 28, 2020 1:05 pm

Big T January 28, 2020 at 3:15 am re: “I wet my finger, hold it in air, and tell you 100 percent of the time, whether it is cold or hot.”

An advancement, the “Weather Rock”:

Prjindigo
January 28, 2020 3:24 am

I predicted the exact time of the turn and the turn angle on Irma from watching GOES east conus alone, NHC’s output was a joke.

There’s a LOT of work needs done and the first thing is to remove surface temperature from the model of national weather patterns. It means nothing at all.

Roy W. Spencer
Reply to  Prjindigo
January 28, 2020 10:43 am

Umm.. the surface is where the largest energy exchanges occur. Leaving surface temperature out of the model would cause the model to fail. Are you saying this because you don’t like NOAA’s climate-monitoring temperature product? That isn’t what forecast models use.

January 28, 2020 3:28 am

A Center Outside of NOAA with Substantial Autonomy and Independence
In this vision, EPIC will be an independent center outside of NOAA. It will be responsible for producing the best unified modeling system in the world, supplying the one point of responsibility that has been missing for decades.

It is important that EPIC should be free of political control. Otherwise it will become “NPW”, National Public Radio Weather.
😐

old construction worker
January 28, 2020 3:32 am

‘…end up in Boulder, Colorado,’? Due to all the restrictions, that would be the last place to build anything.

Ian W
Reply to  old construction worker
January 28, 2020 6:00 am

True
Perhaps Norman OK would be a better location.

J Mac
Reply to  Ian W
January 28, 2020 9:50 am

Ian and ocw,
Norman OK lacks sufficient ancillary ‘group think’ and peer group pressure to maintain essential levels of climate change alarmism in any weather or climate modeling facility. Norman is insufficiently progressive politically to attract the needed ‘saving the planet from CO2 plant food’ cadres of weather modelers and researchers. Norman is also pathetically lacking in environmental, bureaucratic, political, and organized labor divisiveness to be considered for a new ‘national’ lab. And finally, Norman OK is just sooooo not where the hip kids want to play……

For these reasons and more, Boulder CO is the perfect place for yet another bureaucratic EPIC failure in weather/climate modeling. It’s highly recommended by 97% of the cool warmer kids!

January 28, 2020 3:44 am

”1. No one individual or entity is responsible for success.

It also means no one individual or entity is responsible for failure. In the eyes of a bureaucracy, this diffusion of blame is a feature, not a bug.

Dodgy Geezer
January 28, 2020 3:45 am

“….NOAA probably has 1/00th of what they really need….”

Impressive! They’re using imaginary numbers…..

Derg
January 28, 2020 4:13 am

Should weather forecasting be something the Federal Government does?

I guess I can see it needed by the armed services..maybe they need their own department.

WXcycles
Reply to  Derg
January 28, 2020 4:50 am

“Should weather forecasting be something the Federal Government does?”
—-

Yes, they should, it has major national-security implications.

John Bell
Reply to  WXcycles
January 28, 2020 5:40 am

Is that not hyphen-abuse?

WXcycles
Reply to  John Bell
January 28, 2020 6:18 pm

You can-not have too-many hyphens.

MarkW
Reply to  WXcycles
January 28, 2020 6:46 pm

R-e-a-l-l-y?

Don Perry
Reply to  WXcycles
January 28, 2020 7:43 am

Then the government should secure the services of PRIVATE entities. If any entity can royally screw things up, it’s the federal government.

WXcycles
Reply to  Don Perry
January 28, 2020 6:16 pm

And yet ECMWF works brilliantly.

The problem for a private company is they won’t be able to get access to all the necessary available global data, plus are subject to going broke and providing none of a vital service any more.

If a private company wanted to do it, and be the best in the world at it, (because second place is not good enough to make a buck) they would be.

Any company could do it now but they don’t so I wouldn’t hold my breath waiting for a private global WX model to emerge which outperforms ECMWF. Either the US government does it or the Chinese attempt it, but probably no one else will.

MarkW
Reply to  WXcycles
January 28, 2020 6:47 pm

It’s hard to compete with a product that is given away for free.

WXcycles
Reply to  WXcycles
January 28, 2020 7:29 pm

Not all competition is about markets and dollars, that’s a very narrow view Mark-W, excelling and being the first and the best is a better measure of the competition.

But ECMWF is not just given away; there are 4 model runs per day, and a partial tropospheric subset from 2 of the runs is made available to the public (and not available to all countries or organisations either).

commieBob
January 28, 2020 4:45 am

Again, we have the problem of what is good enough. If we threw the entire federal budget at new super computers, how much would weather forecasting improve? How much better are the Europeans?

You could argue that weather forecasting money is being inefficiently spent. That’s fine and can be fixed.

Computers are a zillion times better than they were in the 1970s when they began to have the capability of running a semi-decent model. link Weather forecasting isn’t a zillion times better by any metric.

If we have 100 times the computer power, how much will the models’ performance improve? My guess is that it won’t improve much.

WXcycles
Reply to  commieBob
January 28, 2020 5:13 am

“How much better are the Europeans?”

It’s chalk and cheese; the GFS is operationally obsolete now, and if not replaced it may as well be scheduled for shutdown, because no one’s going to keep using such an obsolete weather forecasting model. Higher-resolution, more developed models with more initiating and calibrating data fed into them are a lot better at providing more accurate and detailed forecasts, for longer. The ECM output is a generation ahead.

commieBob
Reply to  WXcycles
January 28, 2020 2:01 pm

The weather is the prototypical chaotic system. link Better computers don’t improve the problem much, if at all.

WXcycles
Reply to  commieBob
January 28, 2020 6:06 pm

You can continue to maximize the available forecasting capacity with much faster computers processing much more data at higher resolution though, the Europeans have been doing that and it works brilliantly, and will most likely continue to improve as computer power rises and more is learned. There is a chaotic limit of around 2 weeks to forecasts with useful skill, but you can certainly make everything within that two week window much more accurate and useful from here.

DCE
Reply to  commieBob
January 28, 2020 7:36 am

All using new computers will get you is the wrong answer, faster. If the models aren’t changed, then faster computers won’t make any difference.

The existing model needs to go away. New one(s) need to replace it. It’s as simple as that.

Anthony Banton
Reply to  DCE
January 28, 2020 9:11 am

“All using new computers will get you is the wrong answer, faster. If the models aren’t changed, then faster computers won’t make any difference.”

Not true.
A faster computer allows for a higher resolution model, and that always leads to improvement.
It’s not difficult to increase resolution in the horizontal and vertical; all that prevents it is a slow computer. When the models began to extend into the stratosphere, that made a big difference to forecasting the stratospheric PV and consequently the troposphere when the SPV disrupted ‘top down’.

D. J. Hawkins
Reply to  Anthony Banton
January 28, 2020 1:27 pm

You only get a higher resolution output if the model supports it. I seriously doubt the GFS has a “Turbo” mode that could be engaged if only it were running on faster iron. Increasing the spatial resolution and reducing the time step has to be baked into the model.

MarkW
Reply to  D. J. Hawkins
January 28, 2020 6:49 pm

Both could be set to use either environment variables, control files or passed in values.
Depends on how the program was written in the first place.

WXcycles
Reply to  D. J. Hawkins
January 28, 2020 7:44 pm

Doubt that DJ, there’s been many years to re-write and further develop the GFS code to run it alongside the operational code at any desired resolution, as an experiment in higher-res performance, better skill and research of development directions. That’s how WX model development and upgrade proceeds. It won’t be a fresh start it’ll be an adaptation and evolution. I’d be shocked if a considerably higher res system were not operating for many years already over a smaller real estate than the planet, awaiting resources to be scaled-up to a global replacement model.

WXcycles
January 28, 2020 4:47 am

Great job by Cliff Mass to put his finger on the problem and the solution, I’ve wondered for years why the US doesn’t get its act together and produce a competitive forecast model that actually provides fresh insight. So stale at present, I mostly don’t bother with the US models.

Interesting that Cliff said money is not the problem, then argued at length for computer acquisition and operational cost considerations. Often an organisational mess stems from a chronic lack of money, which on the surface seems to be the number one road-block, but of course you also need the plan and the direct control necessary to get it done, then keep pushing it further.

I hope it happens, but it’ll have to be ambitious, the Euros will not be standing still and incrementalism is not going to take that crown, it’ll need a major organised push to get past them. Good for everyone.

mortimer
January 28, 2020 5:23 am

Ah. The whole thing is skewered by the US weather-war program that is still undercover and no one aboveground wants to talk about. It’s probably the biggest secret program we have going. No weather prediction can be accurate without taking into account AND EXPOSING the geoengineering program.

Reply to  mortimer
January 28, 2020 1:18 pm

Yeah … sure … “chem you-know-whats” being sprayed … or HA ARP* … yeah … sure.

* Now owned by a university
https://en.wikipedia.org/wiki/High_Frequency_Active_Auroral_Research_Program

brians356
Reply to  mortimer
January 28, 2020 6:31 pm

You need to stay off ZeroHedge. Way off.

DHR
January 28, 2020 5:40 am

NOAA, NCAR, ESRL, UCAR, NWS, EMC, OSTI, OWAQ and lord knows how many other “Offices” in this muddle. And your solution to our poor state of weather forecasting is to add another called EPIC? Good grief! Sounds to me like a “solution” that only an incorrigible bureaucrat could conceive. If the organization is responsible for the problem, then simplify the organization, don’t add to it.

Not Chicken Little
January 28, 2020 5:43 am

I don’t know about “taking the crown” – isn’t the USA already #1 at taking taxpayer money and throwing it wastefully at non-problems, and at problems that our government is not suited to address or should not be allowed to address?

Unless the object is to make a few well-connected pigs at the public trough rich – then yeah, we’d better get moving, you taxpayers need to cough up more cash!

MarkW
Reply to  Not Chicken Little
January 28, 2020 7:33 am

In terms of wasting money, I doubt anyone can threaten the lead of the EU.

WXcycles
Reply to  MarkW
January 28, 2020 6:25 pm

How is having the code of a significantly better medium term global forecast model a “waste”? That code is an incredibly valuable national asset (as is the ever growing data flow feeding it), it’s a step forwards for the countries involved and for humanity in general, that won’t be undone from there. The money and effort spent doing it is anything but “wasting”.

DANNY DAVIS
January 28, 2020 5:45 am

I’m wondering how both Agriculture and Aviation are “getting by” now?
I’m somewhat aware of the technology that Aviation has available to pilots in the air. I’ve been on board a friend’s Bonanza as we flew cross-country about 10 years ago. We were constantly monitoring our own real-time satellite imaging of the developing weather as our flight progressed. This was in the summer with thunderstorms developing across the midwest as we were flying east. We did our own forecasting as the weather developed and were able to successfully arrive alive!
I was especially impressed with the terrain modeling while flying thru IFR obscured skies. My friend, the Pilot in command – fully IFR cert was using the autopilot connected waypoints while I was watching a simulated view of the terrain generated real-time on a glass screen. Simply amazing technology available to General Aviation.
I suspect that UPS & FedEx have an incredible capability to keep their operations ahead of the weather.

DCE
Reply to  DANNY DAVIS
January 28, 2020 7:40 am

Chances are they aren’t using NWS predictions. There are plenty of commercial/private firms performing weather forecasting that are more accurate than NWS. It wouldn’t surprise me to find that UPS and FedEx are using private forecasting firms.

David Joyce
January 28, 2020 5:46 am

My 30,000 ft. view: Here’s another subdivision of the organization further balkanizing the organization’s lines of communications and responsibility. But looking for consolidation in the government is a bit like asking to change Jupiter’s orbit.

Kurt Linton
January 28, 2020 5:57 am

Does anybody else in here hate when people (who quite often call themselves scientists) misuse the term “outlier”?

January 28, 2020 6:01 am

As a forecaster, my experience says the new & improved GFS is noticeably better. This is the first full winter in operational mode & I can say that I no longer automatically discount its guidance if it differs with the Euro. When they differ, the GFS has been right as much or more than the Euro (at least for the Western US snow forecasting I do). Similarly, the 3km NAM is often an impressive product, as long as you know how to scale model outputs for local conditions.
The Canadian GEM still is pretty poor in the mountain west – I do discount it pretty severely if it differs with other models – occasionally pulls the rabbit out of the hat … but that is pretty rare.
In short, IMO, the situation isn’t quite as desperate as the author makes it seem.

MaxD
Reply to  Jef L
January 28, 2020 11:03 am

As a very long time operational meteorologist in western Canada, it is always good to hear fellow forecasters’ views of the models. I still do contract forecasting, mainly wildland fire weather forecasting. The main models I use are the Canadian RDPS and GDPS and the American GFS. They all have their strong and weak points. Generally I find that the Canadian models will perform better than the GFS and will go with those if there is a discrepancy. The GFS often seems to greatly overdevelop systems and produces far too much precipitation in the longer term.
I agree as well, IMO, that the models are pretty darn good in the short to medium range and the situation is not desperate.

Reply to  MaxD
January 28, 2020 6:46 pm

Totally agree on the GFS overdeveloping systems further out… I mostly was referring to 5 to 7 days or closer in. Details further out are dodgy, although pattern trend (or lack thereof, run to run) is often useful guidance. I have always wondered if the GEM does a better job in Canada… perhaps tuned a bit better to local conditions. Your experience & comments would lead me to say… a definite possibility.

MaxD
Reply to  Jeff L
January 28, 2020 9:30 pm

I/we have often wondered the same with possible tuning as well. It could well be that the GEM has more detailed tuning or terrain over parts of Canada, whereas the GFS may be more “broad brushed”. What we do seem to find is the GFS works well with temperatures and RH under big ridge scenarios, but not so well when the pattern is more changeable. But, as very often seems to be the case, one of the various models will come much closer to reality, but you never know which one that will be in advance. The thrill of forecasting.

rbabcock
January 28, 2020 6:14 am

This reminds me of the dilemma Boeing had when they decided on how to compete with the Airbus 321NEO.

The A-321 was designed after the 737 and had longer landing gear to better fit into the jetways and accommodate fanjet engines. The 737 was set lower to the ground since back when it was designed a lot of airports didn’t have jetways and people deplaned using stairs to the tarmac. When the first fanjets came out Boeing was barely able to fit them under the wing without scraping on takeoff and landing (737-300 to -900).

The newer generation of fanjets are much bigger in diameter. Airbus, with the longer landing gear, was able to put them under the wing of the A-321 easily. Boeing either had to start with a clean sheet design to replace the 737 or move the engines farther forward under the wing to get enough clearance to work. The clean sheet option would give them a much superior airplane to the A-321 but would entail $$$$ and lots of time to get it designed and certified. Option 2 gave them the 737-MAX. We all know how that has turned out so far.

Maybe NOAA needs to step back and go the clean sheet design option. Start totally from scratch and do it right.

commieBob
Reply to  rbabcock
January 28, 2020 7:19 am

When the first fanjets came out Boeing was barely able to fit them under the wing without scraping on takeoff and landing …

Yep. Flying into a smallish airport with a single runway and notorious crosswinds (Resolute, NWT, in the 1970s), was much more comfortable in a 727 than a 737. They never scraped a 737 engine when I was a passenger but I never understood why.

The problem with starting fresh is the cost of certifying the airframe. One of the problems with the 737 Max 8 was that it was basically still running on the old 737 certification data.

The MCAS design was based on data, architecture, and assumptions that were reused from a previous aircraft configuration without sufficient detailed aircraft-level evaluation of the appropriateness of such reuse, and without additional safety margins and features, … link

Reply to  commieBob
January 28, 2020 9:06 am

The old Boeing 727s were my favorite. Heard one pilot quoted as saying it was like flying a large fighter plane.

Curious George
Reply to  rbabcock
January 28, 2020 8:16 am

The problem with Boeing is extreme cost-cutting. To save money, they moved the headquarters from Seattle to Chicago. Then a little problem with adopting the MCAS system developed – and the two teams were 1,721 miles apart.

Tired Old Nurse
January 28, 2020 6:14 am

Why not just outsource to Accuweather and save the money?

Tired Old Nurse

Richard Ilfeld
January 28, 2020 6:21 am

There is a huge spend in the commercial world: TV stations, for example, are competitive in-market, spending millions on weather services. We have private services (probably because of government issues); I’d suggest they only retain customers through good performance. NOAA needs to learn to thread the needle the way NASA is trying to. Make sure there is private investment and competition, yet also make sure that critical knowledge is coordinated and, when a public purpose is served (like hurricane forecasts), shared in a timely fashion.

This is different than the model in countries where they’ve grown up with a single national weather service, with more of a command economy in the public service area. Are all the raw numbers from all the TV radars collected into a historic database? Can they be? Or does the government need its own network? Or can they be combined, to increase the resolution of NEXRAD for building a data history without competing in the hour-to-hour forecast arena?

Making it all-government likely dooms it to mediocrity, and leaves it prey to mendacity like climate change directives for data adjustment.

January 28, 2020 6:23 am

I like the EPIC recommendation, but would like to point out that even the “vaunted” European Centre for Medium-range Weather Forecasts (ECMWF) model has some serious issues. I noticed last summer (2019) at least a couple of times the ECMWF showed development of an intense tropical storm to near hurricane strength OVER LAND in the western bulge of Africa. The GFS-FV3 correctly showed only a weak low pressure system in both cases. I also remember seeing the ECMWF develop a weak tropical storm OVER LAND in South Texas/Northern Mexico once last summer as well. Needless to say, no other models showed intensification over land.

WXcycles
Reply to  Bryan - oz4caster
January 28, 2020 6:36 pm

I’ve seen that too; I suspect it’s an artifact of attempting to improve high-end thunderstorm forecasts.

Coach Springer
January 28, 2020 6:40 am

Do we really need NWS anymore? NOAA?

Reply to  Coach Springer
January 28, 2020 2:35 pm

Replace the NWS with TWC?
No thanks!

And let me stress. The problems noted above are the result of poor organization and management. NOAA and NWS employees are not the problem. If anything, they have been the victims of a deficient organization, working hard to keep a sinking ship afloat.

The problem has been that they’ve become bureaucracies, that is, organizations with no accountability. No one in the top tiers is held personally responsible for doing poorly. They are too often rewarded regardless of the reliability of the product they produce.

Robert of Texas
January 28, 2020 6:48 am

The problem with any government run or even funded agency is all the bureaucracy and regulations. It is more important to push for social justice than to actually meet a scientific goal, for example. The levels of management become, in a word, unmanageable. People who have NO COMPETENCY in the field end up running the bureaucracy – getting there through politics and not because they are great scientists. I don’t know how many times I have seen this in our government and related agencies.

You can start all over with a new agency and make a lot of progress – for about 10 to 15 years – and then it all starts over again. The politicians rise through the ranks, the rules and regulations focus on the wrong goals, and the new agency becomes once again, incompetent.

This can ONLY be fixed by keeping these entities private. Have 2 or 3 competing private entities at any given time and keep paying the winner bonuses while changing out the want-to-be’s. Remember to keep all code and data public domain. Competition will keep these companies trim and competent (at least relative to a government agency).

As an aside – Norman (OK) *would* be a good place for such a weather center – there are a lot of still-competent weather scientists found there, and lots of engineers. They are a less elite style of people, more down-to-Earth workers.

NASA is a good example of a once proud agency that now is mostly incompetent. They are more concerned with diversity and gender promotions than with actually performing competent science and engineering. When social justice becomes the driving force, it will be the only goal met.

Now just think of all health care becoming government paid for and controlled, and then try to sleep…

John of Fabius
January 28, 2020 6:51 am

None of my farmer friends rely on the government for weather forecasting. They all pay for the accuracy provided by the private sector.

Anthony Banton
Reply to  John of Fabius
January 28, 2020 9:22 am

But any private sector company has to buy in model product!
They number crunch the world’s weather …. which is what is needed even on a regional scale for a period longer than a day.
A private company will just make it more user friendly.
That is why it is unfair.
Organisations such as the UKMO actively give info to METEOGroup …. who took away their BBC contract. Because it was cheaper.
Well of course it was.
MeteoG don’t have to run and maintain state-of-the-art supercomputers just for a start.

MarkW
Reply to  Anthony Banton
January 28, 2020 6:53 pm

Can you explain why UKMO is giving away their product at below cost?

WXcycles
Reply to  John of Fabius
January 28, 2020 6:48 pm

They are all using major global and regional models paid for by taxpayers John. They provide next to nothing original, they interpret and reformat then represent the data from the taxpayer funded models. Private does not work on that scale beyond that sort of piggy-back WX service model (which is not to belittle what they do or the value of it).

John Shewchuk
January 28, 2020 7:31 am

Cliff. Excellent overview. I echo much of what you say when, during my talks about weather, I always get the question … “why are we so bad” compared to the European model? It’s eye-opening to see the politics behind the science — but that’s usually the problem with anything — especially climate change. Anyway, this recent article helps explain some of the data issues …

https://qz.com/1769842/how-meteorologists-are-using-data-to-improve-hurricane-forecasts/

Curious George
January 28, 2020 7:47 am

Don’t concentrate just on models. Data feed is equally important. How do models handle satellite data? Data from ships, airplanes, and ARGO buoys?

WXcycles
Reply to  Curious George
January 28, 2020 6:54 pm

And enough spot ‘truthing’ sensors to calibrate global input data on the fly.

January 28, 2020 8:00 am

the U.S. global data assimilation system is not as good as those of leading centers.

Prb’ly. In addition, Joe Bastardi says the GFS model has way too much feedback in its parameters & thus chronically misses cold air in the longer forecasts.
http://www.weatherbell.com/premium

January 28, 2020 8:25 am

The state-of-the-art is in very long range solar based prediction of annular mode anomalies.

WXcycles
Reply to  Ulric Lyons
January 28, 2020 6:58 pm

” … solar based prediction of annular mode anomalies.”

I agree about prediction of annular mode anomalies, but who’s doing such a solar-based prediction?

Steve Oregon
January 28, 2020 9:05 am

It is very likely that NOAA is simply overburdened with activism by its management and rank and file bureaucrats.
Producing Climate Crusader tools like this.
https://www.climatecentral.org/news/noaas-new-cool-tool-puts-climate-on-view-for-all-16703

It’s just another deep state operating without consequences.

Reply to  Steve Oregon
January 28, 2020 2:52 pm

Remove “politics” from political science and what’s left?
Something that doesn’t pretend to understand everything, but wants to.
Something that isn’t always right and often wrong, but is honestly trying to get it right.

Jeff Alberts
January 28, 2020 9:16 am

How do they define “accurate”?

I was watching the weather radar for my area a couple weeks back when we were expecting snow. The “future” radar kind of went haywire about 4 hours ahead, at least that’s the way it looked to me. After those 4 hours, the current radar sort of resembled the earlier “future” radar (actual cloud cover and snowfall was not nearly as widespread), but it’s unclear to me how accurate I can expect it to be, or how accurate it can be.

Reply to  Jeff Alberts
January 28, 2020 3:07 pm

I’ve often wondered about the “future radar” stuff.
What I’ve never seen is a split screen of the “future radar” side by side with the later actual radar.
I wouldn’t expect it to be an exact match but I would like to see unbiased comparisons.

Reply to  Gunga Din
January 29, 2020 6:27 am

re: ” “future radar” stuff.”

Conceptualization/presentation created for the public from the “chance of precipitation” number cranked out by models; it works better for people’s minds to SEE what is meant by that abstract number, “chance of precipitation” than just hear it “voiced” or flashed on the screen.

Notice also the colored “zones” over a geographical area depicting expected precipitation amounts shown by some stations in advance of a storm system; those can also be used to ‘give life’ to that “chance of precipitation” number.

markl
January 28, 2020 10:01 am

DCE said: “…All using new computers will get you is the wrong answer, faster…..” Not entirely true but close to it. More likely it means “being able to run more scenarios for AGW until we get one that supports the narrative”. Trying to tame a chaotic system with computers is a fool’s errand. It’s like trying to continually cut something in half to get to zero.

WXcycles
Reply to  markl
January 28, 2020 7:07 pm

“…All using new computers will get you is the wrong answer, faster…..”

Ignorant nonsense of the worst kind. With improved speed comes better coding and more data and higher resolution, you get a forecast system that’s more accurate more often for longer. WX forecast models have zip to do with AGW, what are you talking about? Even chaotic systems are physics and described in math, and have a degree of predictability that can be leveraged and teased out by research and experiment. It’s anti-science absurdity to pretend they don’t – it’s also observably wrong.

Vuk
January 28, 2020 12:02 pm

NOAA does a lot of space weather prediction, and they do a good job as far as I can judge, while I think that the NOAA/NASA co-chaired international panel forecast of the SC25 max is on the high side:
https://www.swpc.noaa.gov/news/solar-cycle-25-forecast-update
Today’s solar magnetogram has two sunspots; the one in the NH is from the old SC24 cycle, while the one in the SH is from the new SC25 cycle.
January to date has had 16 spotless days, while the monthly SSN count points to 5 or 6.

Philo
January 28, 2020 12:08 pm

With NOAA, NCAR, ESRL, UCAR, NWS, EMC, OSTI, OWAQ, plus independent military systems we have too many offices.
The National Weather Service should collect data and supply it to users. The National Climate Atmospheric Research unit stands alone and should become the National Climate Research unit- this should be where weather modelling research should be done.
Earth Systems Research probably can stand alone in coordination with other labs.
University Corporation for Atmospheric Research should be handled by limited term fellowships at government labs or by using only university funds with no Federal money or offsets included.
Environmental Modeling Center becomes a part of ESRL
Office of Scientific and Technical Information combines with the National Technical Reports Library
Office of Weather and Air Quality- Air Quality is moved to the EPA. Office of Weather(Quality) is folded into the ESRL.

That eliminates five separate duplications of effort and puts the functions together in complementary units.
Of course all the various units will fight like hell and NOAA will be the worst. Parts of NOAA should go back to being the National Air and Space Authority where all the international politics and oversight belongs.

n.n
January 28, 2020 12:32 pm

Weather forecasts with probabilistic degrees of continuity.

M__ S__
January 28, 2020 12:54 pm

The weather will probably be predicted by a poll

n.n
Reply to  M__ S__
January 28, 2020 4:54 pm

Democracy works until overridden for special, peculiar, and politically congruent interests.

January 28, 2020 1:40 pm

CLASSIC 1980’s weather forecast using hand-drawn maps AND a visible-light satellite image-

Intro starts with venerable KXAS ch 5 Dallas / Ft. Worth Texas weatherman Harold Taft going through some of the earlier ‘tools of the trade’ used when forecasting wasn’t such an exact science:

n.n
January 28, 2020 6:43 pm

Prediction implies surmounting a discontinuity in time, space, knowledge, or skill. Prediction is precluded by chaos (i.e. incomplete or insufficient characterization and computationally unwieldy). Forecasts, however, will improve in a system that is noteworthy for its semistable processes.

January 29, 2020 2:36 am

There is an interesting concept in storm forecasting that has been thrown up recently, and that is the coincidence in timing between the acceleration point of any large storm and solar ‘Ap’ impacts.
Some outline discussion is given in :
https://howtheatmosphereworks.wordpress.com/about/solar-activity-and-surface-climate/storm-analysis/
with further discussion on the ‘Electromagnetic Theory’ section on the same site.
If you have access to the raw data, try a cross reference between any large storm acceleration point and the relevant ‘Ap’ impact records.
An example may be that of “The Perfect Storm” of 1st November 1991 cross referenced to Carrington Rotation – CR1848;- ’Ap’ impacts October 27 to November 02, peaking at ‘Ap = 128’ on October 29.
As with all things in nature, nothing is ever 100%; however, if we observe any large storm which suddenly accelerates from, say, Cat 3 to Cat 5, the relationship to a sudden ‘Ap’ impact is far too common to be pure coincidence.
A good area for further research as it may imply the ability to predict such events a few days in advance.

Reply to  The Atmosphere Guy
January 29, 2020 7:39 am

Links in first principles? (i.e., what are the links in the first principle laws in physics; Ampere’s Law, Gauss’s Law, Newton’s Law, Maxwell etc.)

WXcycles
Reply to  The Atmosphere Guy
January 31, 2020 3:38 pm

Existing models already predict cyclonic storm category changes with very good skill levels, in time, space and location, with reference only to accurate and sufficient atmospheric met data to predict trends, so why bother with external observations?

Occam’s Razor.

4caster
February 6, 2020 2:27 pm

For those advocating switching public forecasting and warning duties to the private sector, who will take on the liability issues of issuing warnings for public safety? Will the public have to pay for services that have historically been provided by taxpayer funds…a nominal few cents per household annually? In addition, private entities will be financially unable to sustain a profit if they are forced to collect and process the data needed for numerical weather forecast models. Thus, the government would be relegated to a data collection and processing entity. But, the point brought up by others about national security is exceedingly salient. The cost-to-benefit ratio in paying for government weather offices with meteorologists analyzing and disseminating forecasts, and especially warnings, is net positive to society, as evidenced by numerous academic studies throughout the years (I leave it to the readers as a research task to search). But conflict-of-interest issues have historically been a major factor in keeping weather forecasting and warning in the public arena. How can society be assured that a forecast concern sponsored by an umbrella company, or a raincoat maker, or other weather-dependent companies, will not be beholden to its sponsors and issue scientifically correct and neutral products? I also would strongly advise AGAINST basing any new model organization at Boulder…it seems talent there has been misused.