Time to Defund Climate Models?

From MasterResource

By Steve Goreham — April 29, 2025

“The Trump administration is cutting funding for climate research across all federal departments…. Maybe it’s time for NASA to stick to space exploration, NOAA to stick to weather forecasting, and for the climate models to be shut down.”

Climate models have been the basis for concern about climate change for more than 35 years. The US government, the United Nations, and organizations across the world have used model projections to warn about global warming and to demand a shift to renewable energy. But Trump administration budget cuts at NASA, NOAA, and other federal agencies threaten to shut down the models, the heart of climate change alarmism.

In June of 1988, Senator Tim Wirth, then chair of the Committee on Energy and Natural Resources, held the first-ever hearing on the science of climate change. Dr. James Hansen, head of a computer-modeling team at NASA, testified that he was “ … 99 percent confident that the world really was getting warmer and that there was a high degree of probability that it was due to human-made greenhouse gases.”

Since Dr. Syukuro Manabe of the Geophysical Fluid Dynamics Laboratory in Washington, D.C. developed one of the first climate models in the 1960s, modelers have been warning that humans are causing dangerous climate change. Global surface temperatures have risen only a little more than one degree Celsius over the last 140 years, but models project a faster additional rise of 0.5–3.5°C by the year 2100.

Climate models have been used by scientists, researchers, and governmental policy makers to estimate possible future climate impacts. Global organizations such as the United Nations' Intergovernmental Panel on Climate Change (IPCC) and the World Bank use model projections to urge climate action. Non-governmental organizations such as Greenpeace use model projections to raise funds. But the Trump administration appears to be about to shut down the US climate models.

There are more than 40 climate models operating across the world, with 13 of the leading models located in the US. The US models are operated by the National Aeronautics and Space Administration (NASA) in New York City, the National Oceanic and Atmospheric Administration (NOAA) in Princeton, New Jersey, and the Department of Energy (DOE) in Boulder, Colorado. Each of these organizations has been ordered to reduce staff as part of Trump administration budget cuts.

The White House may soon tell NASA to focus work on space programs, not climate change. In February, the administration denied NASA officials permission to travel to an international climate meeting in China. At the same time, NASA management cut off funding for a support contract for the 7th Assessment Report of the IPCC. NASA has been a primary contributor to previous IPCC Assessment Reports. Preliminary government spending plans for fiscal year 2026 would cut NASA’s science budget by almost half, to $3.9 billion.

The administration also wants to end climate change programs at NOAA. Plans call for a 27% cut to NOAA’s budget, down to $4.5 billion. Final budget totals for NASA and NOAA will need to be approved by Congress, with members concerned about the climate sure to put up a fight.

Climate models run on supercomputers and are expensive. Supercomputers cost about $50 million up front and $20 million per year to support each climate-modeling team. The NASA, NOAA, and DOE modeling teams may not be able to survive large projected cuts.

Beyond climate models, budgets of other climate projects will also be cut. The Sea Level Research Group at the University of Colorado has been studying sea level rise for about two decades. This group gets much of its funding from NASA and other federal agencies. The Mauna Loa Observatory in Hawaii has been measuring the rise in atmospheric CO2 concentration since the 1950s, but it may be closed due to NOAA funding cuts. Three NASA satellites used to collect climate data also need to be replaced, but there are no plans to do so.

The Trump administration is cutting funding for climate research across all federal departments, with major impacts on US and world efforts to force action on climate change. Maybe it’s time for NASA to stick to space exploration, NOAA to stick to weather forecasting, and for the climate models to be shut down.

——————————-

Steve Goreham is a speaker on energy, the environment, and public policy and author of Green Breakdown: The Coming Renewable Energy Failure. His previous posts at MasterResource are here.



151 Comments
SteveZ56
April 30, 2025 1:41 pm

Global circulation models are fairly good at predicting the weather up to about 5 days in advance, after which the observed weather starts to diverge from predictions made 6 or more days earlier.

These models start with measurements of temperature, pressure, humidity, precipitation, and wind speed observed at a given time at multiple ground stations, as well as observations at altitude using weather balloons generally launched once every 12 hours.

These measurements at “time zero” are then input to a three-dimensional grid, and some interpolation procedure is used to fill in the gaps between weather stations. The models then attempt to solve nonlinear differential equations across all grid volumes for a time step in order to predict future values of the same parameters one time-step later, and those calculated values are used as the initial conditions for the next time step, etc.
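A minimal sketch of that loop structure in Python (a toy one-dimensional diffusion of temperature with made-up grid and step values, not the coupled nonlinear equation set a real GCM solves):

```python
import numpy as np

# Toy illustration of the time-marching loop: a 1-D temperature field is
# advanced with an explicit finite-difference step, and each step's output
# becomes the initial condition for the next step. Real GCMs solve coupled
# nonlinear equations on a 3-D grid, but the loop structure is the same.
nx, dx, dt = 50, 1.0, 0.1   # grid points, spacing, time step (arbitrary units)
kappa = 1.0                 # diffusivity (arbitrary)

# "Interpolated" initial state in kelvin, standing in for the time-zero analysis.
temp = np.interp(np.arange(nx), [0, 24, 49], [280.0, 295.0, 285.0])

def step(t):
    """One explicit diffusion step; endpoints held fixed as crude boundary conditions."""
    t_new = t.copy()
    t_new[1:-1] = t[1:-1] + kappa * dt / dx**2 * (t[2:] - 2.0 * t[1:-1] + t[:-2])
    return t_new

for _ in range(1000):       # march forward; output of step n is input to step n+1
    temp = step(temp)

print(temp.round(1))
```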

Errors can be introduced into the calculation if the spatial interpolation procedure does not accurately represent the initial conditions over an entire grid cell at the ground level, and additional errors can be introduced if the interpolation procedure as a function of altitude does not follow the actual temporal variation of parameters between balloon measurements 12 hours apart.
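A toy example of the first kind of error (all numbers hypothetical): linear interpolation between two stations 100 km apart cannot see a cold pocket sitting between them, so every grid cell initialized from that interpolation starts out wrong.

```python
import numpy as np

# Hypothetical surface temperatures (deg C) at two stations 100 km apart,
# with an unsampled cold pocket near the 60 km mark between them.
station_x = np.array([0.0, 100.0])            # station positions, km
station_t = np.array([12.0, 14.0])            # station readings, deg C
true_x = np.array([0.0, 30.0, 60.0, 100.0])   # positions of the "real" field, km
true_t = np.array([12.0, 12.5, 6.0, 14.0])    # real temperatures, deg C

grid_x = np.linspace(0.0, 100.0, 11)               # model grid columns
init_t = np.interp(grid_x, station_x, station_t)   # what the model is initialized with
actual = np.interp(grid_x, true_x, true_t)         # what the atmosphere actually holds

print(np.round(init_t - actual, 1))   # time-zero error carried into every later step
```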

Such errors tend to propagate, and generally increase in magnitude as the model makes predictions many time steps into the future. Using relatively small grid cells and short time steps tends to improve the accuracy of the model, but the calculation time is proportional to the product of the number of grid cells and the number of time steps, so both must be limited if the model is to output results for a future time before that time has actually arrived (real-time prediction).
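To make the cost side of that tradeoff concrete, a back-of-the-envelope sketch (assuming, as is typical of explicit schemes, that the time step must shrink in proportion to the grid spacing to stay numerically stable):

```python
# Rough cost scaling, not any particular model's bookkeeping: refining the grid
# spacing by a factor r in all three dimensions multiplies the number of cells
# by r**3, and if the time step must shrink by the same factor for stability,
# the number of steps grows by r as well, so total work scales like r**4.
def relative_cost(r):
    cells = r ** 3   # finer spacing in x, y, and z
    steps = r        # proportionally shorter time steps
    return cells * steps

for r in (1, 2, 4):
    print(f"{r}x finer spacing -> roughly {relative_cost(r)}x the computation")
```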

Weather-forecasting models also assume that the radiation output of the sun is constant and that the amount of sunlight reaching the atmosphere depends only on position, season, and time of day.

Also, if a weather-forecasting model is run for a given initial time, and then new data are available an hour later, they can be entered into the model to “fine-tune” predictions several days into the future.

Using a global circulation model to predict climate decades into the future is a much more difficult problem. If a weather-forecasting model designed for 5-day (120-hour) predictions runs in an hour, then with the same grid size and time steps, a climate model covering the next 50 years would take about 5 months to run.
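The arithmetic behind that estimate (one hour of computation per 5 simulated days, with grid size and time step unchanged):

```python
# If 5 simulated days cost 1 hour of computation, then at the same grid size
# and time step, simulating 50 years costs:
hours_per_run, days_per_run = 1.0, 5.0
simulated_days = 50 * 365.25
runtime_hours = simulated_days / days_per_run * hours_per_run

print(round(runtime_hours))                  # ~3650 hours
print(round(runtime_hours / 24))             # ~152 days
print(round(runtime_hours / 24 / 30.44, 1))  # ~5 months
```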

In order to obtain a more reasonable runtime, the grid cells need to be larger and/or the time-steps longer than a weather-forecasting model, which increases the possibility of interpolation errors, particularly for parameters over the oceans for which data are scarce.

There are also many parameters that cannot be predicted by a climate model. A 5-day weather-forecasting model for the east coast of the United States can safely ignore a storm in the Pacific Ocean, but a long-range climate model cannot predict the number of tropical cyclones in an ocean a year in advance, nor their location.

There are also many unknowns about the future decades from now. Will the solar radiation rate increase or decrease? How many sunspots will there be? Will ocean currents be different than now? What year will have the next El Niño or La Niña, and how strong or weak will it be?

If the climate models are supposed to calculate the effect of increasing CO2 in the atmosphere, measurements at Mauna Loa may reflect some “background” concentration, but are there local areas where CO2 is much more concentrated, and what are those concentrations? What is the distribution of CO2 concentration over the Pacific, Indian, and Atlantic Oceans?

All these unknown potential sources of error effectively render long-range climate modeling useless; its projections are essentially semi-educated guesses.

It is far more useful to concentrate on short-term (< 10 days) weather forecasting, which can be used to warn people away from severe weather events such as hurricanes, tornados, and floods.

April 30, 2025 1:53 pm

All models say what they are told to say.

April 30, 2025 1:55 pm

If they knew how the weather and climate worked, they would need only one model. Instead they have 40, none of which is any use.

April 30, 2025 2:45 pm

The polymathic wisdom Steve has displayed at The Investment Casting Institute, The MidAmerica Petroleum & Convenience Store Expo, the Hand Tools Institute and the Association of Chevron & Texaco Marketers commends his Cabinet nomination as Secretary of Petroleum Telemarketing.

This vital new office will end the climate hoax and lower the cost of gasoline once and for all by using Amazon to deliver organic, unpasteurized & additive-free crude oil to the nation's kitchens.

Bob
April 30, 2025 3:09 pm

Climate models are the only science the CAGW hucksters have, but since climate models aren't proper science, they have no science to support their claims. They are in a bad place.

April 30, 2025 3:43 pm

13 of the leading models located in the US

Surely, if the science is settled, they should only need one model.

Reply to StuM
April 30, 2025 8:15 pm

That don’t pay no rent.

Tony Tea
April 30, 2025 7:19 pm

Now they can get back to their Airfix models.

April 30, 2025 7:58 pm

Several years ago I read a white paper on GCMs summarizing their use of parameters in their calculations (the paper reviewed multiple models). I thought it was on WUWT, but I am not sure. The basic gist was that most models had over 220 parameters, and of the ones that could be corroborated against physical measurements, 30+ were classified as outside of the empirically measured ranges that the parameters represented.

If anyone remembers this, could you please link to it, to either corroborate or correct my memory?

May 1, 2025 4:48 am

Of course defund climate modeling. Why would the world want to know more about the deadly warming yet to come?

Reply to Warren Beeton
May 1, 2025 7:25 am

Government bureaucrats should not be in charge of funding research. There is too much self-interest involved.

Research needs to be focused toward a specific goal and have a specific time frame. An example would be determining a functional relationship that defines the water-vapor feedback resulting from added CO2.