by Judith Curry
I recently participated in a Technical Conference sponsored by the U.S. Federal Energy Regulatory Commission (FERC).
This was a very interesting conference. Unfortunately, there is no podcast or record of the written statements submitted by the panelists.
The main part of my written statement is provided below.
JC remarks to FERC
The remarks that follow respond to issues raised for Panels #1 and #2, in context of CFAN’s experience in dealing with extreme weather- and climate-related issues for the energy sector.
How extreme can it get?
Extreme weather events are rare, by definition. When planning for future weather extremes, several different approaches are used:
- recent climatology: 1-in-10 or 1-in-20 year standard
- 50- or 100-year return time
- worst cases in the historical record
- incremental changes to the first three approaches, associated with manmade global warming.
The extreme events of 2020 (e.g. TX cold, record number of hurricane landfalls, extensive fires in CA) belie the utility of a 1-in-10 or 1-in-20 year standard. The return period approach doesn’t help much either. For example, Texas saw three 500-year floods during 2015-2017. The 100-year event is not based on history, but on estimated probabilities that assume stationarity of the climate record. However, the climate is not stationary on any time scale – apart from the secular trend of global warming, there is multi-decadal to millennial scale natural climate variability that provides an envelope for decadal and interannual climate variability.
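The stationarity assumption behind the "N-year event" label also hides how likely such an event is over an infrastructure lifetime. The short sketch below works through the standard calculation (hypothetical numbers, and valid only if the annual exceedance probability really were constant):

```python
# Probability of seeing at least one "N-year" event over a planning
# horizon, under the stationarity assumption that the label depends on.
def prob_at_least_one(return_period_years: float, horizon_years: int) -> float:
    annual_p = 1.0 / return_period_years           # e.g. 1% for a 100-year event
    return 1.0 - (1.0 - annual_p) ** horizon_years # chance of one or more events

# A "100-year" event has roughly a 26% chance of occurring at least once
# during a 30-year infrastructure lifetime -- if the climate were stationary.
print(round(prob_at_least_one(100, 30), 3))
print(round(prob_at_least_one(500, 50), 3))
```

Even under its own stationarity assumption, the "100-year" label describes a risk that is far from negligible on decadal planning horizons; non-stationarity only widens the uncertainty.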
Here is an anecdote concerning a client who needed help in assessing the vulnerability to hurricanes of a new power plant to be located on the Gulf of Mexico coast. A risk assessment firm calculated the 100-year storm surge to be 10.1 ft, and the 500-year storm surge to be 13 ft. A quick look at the historical hurricane record shows an estimated storm surge of 12 ft near that location in the 1920s, and an estimated 15 ft storm surge from a hurricane in the 1840s – periods with significantly cooler climates than now. Neither conventional statistics on return periods nor climate model-driven expectations of slightly more intense hurricanes by 2100 provides a complete picture of what the power plant may face from hurricane storm surge over the next 30-50 years. When I recommended moving the power plant inland, the client said that the site had previously been approved for an earlier power plant, and that getting a new site approved would take a decade.
In assessing the risk from extreme weather events, I advise clients to develop an understanding of the entire historical record of events impacting the locale, as well as any relevant paleoclimatic data that is available. If it has happened before, it can happen again.
What about the role of global warming in changing the intensity or frequency of extreme weather events? Apart from a reduced frequency of the coldest temperatures, the signal of global warming in the statistics of extreme weather events remains much smaller than that from natural climate variability, and is expected to remain so at least until the second half of the 21st century.
Rather than focusing on the relatively small and uncertain impacts of global warming on extreme events, a broader range of extreme weather events from the historical record can provide a better basis for avoiding ‘big surprises.’
How can we assess regional vulnerability to weather extremes for the next 30 years?
While much of the climate change literature focuses on projections to 2100 from global climate models, the electric utilities sector needs projections of regional climate variability and change on decadal time scales.
To bridge this gap, there is a growing number of companies and university groups that are producing regional, decadal climate projections from global climate model simulations. Specifically, the 21st century climate simulations prepared for the IPCC assessment reports are bias-corrected based on a comparison of historical climate simulations with observations. The same bias correction is applied to the 21st century simulations, which are then ‘downscaled’ to a finer horizontal resolution. The downscaling approach may be statistical or dynamical; dynamical downscaling uses the coarser resolution outputs from a global climate model simulation as the boundary conditions for higher-resolution simulation using a regional climate model.
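The bias-correction step described above can take several forms; the simplest is an additive "delta" shift, sketched below with invented temperature values. Real workflows typically use more sophisticated methods such as quantile mapping, and dynamical rather than statistical downscaling, so treat this only as an illustration of the basic idea:

```python
import numpy as np

# Sketch of simple additive bias correction (the "delta" method):
# estimate the model's mean bias over the historical period against
# observations, then shift the future simulation by that bias.
def bias_correct(obs_hist, model_hist, model_future):
    bias = np.mean(model_hist) - np.mean(obs_hist)  # model minus observed
    return np.asarray(model_future) - bias          # shifted future series

obs_hist     = np.array([14.0, 14.2, 14.1])  # observed temps (illustrative, C)
model_hist   = np.array([15.0, 15.3, 15.2])  # model over same period: ~1.1 C warm
model_future = np.array([16.0, 16.4, 16.5])  # raw 21st-century simulation

print(bias_correct(obs_hist, model_hist, model_future))
```

The key limitation, elaborated below, is that no amount of bias correction or downscaling can add information that the underlying global simulation lacks.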
The problems with using global climate models as a basis for assessing future regional weather extremes are:
- The climate model simulations used for the IPCC assessment reports include only scenarios for future emissions; they do not include predictions of natural climate variability (solar output, volcanic eruptions or the evolution of large-scale multi-decadal ocean circulations).
- Because the global climate models do not adequately represent the multi-decadal ocean circulations, they do a poor job at simulating regional and decadal-scale climate variability.
- Climate models do not accurately simulate the magnitude or frequency of extreme weather events.
- Downscaling doesn’t help if the underlying global climate model is not producing an accurate simulation.
In the absence of climate models that are fit-for-purpose for predicting future extreme weather events on regional and decadal time scales, alternative methods are being developed. CFAN has developed a semi-empirical methodology for providing scenarios of regional extreme weather events for the next 30 years. This approach combines historical data and climate dynamics analysis with scenarios of natural climate variability plus the outputs from global climate models. Multiple scenarios are selected for each driver of the forecast – emissions, solar, volcanoes and large-scale ocean circulations – with an emphasis on plausible scenarios, rather than extreme scenarios that cannot completely be ruled out. Based on recent information provided by the International Energy Agency (IEA), emissions scenarios to 2050 are best represented by the IPCC RCP4.5 or RCP6.0 scenarios (not the oft-used extreme RCP8.5 scenario).
The multiple outcomes derived from different combinations of the scenarios for each driver are organized using a possibility diagram that portrays the distribution of scenario outcomes. The likelihood of a particular outcome is associated with the plausibility of the input scenarios and also the number of different combinations of inputs that produce a particular outcome. Regional extreme weather events are then linked to these scenarios of climate change. This linkage is made through an analysis that relates the extreme weather categories to atmospheric and oceanic circulation patterns and global temperature change.
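The combination-counting logic behind a possibility diagram can be sketched in a few lines. The drivers, levels, and warming increments below are invented for illustration; they are not CFAN's actual scenario values:

```python
from itertools import product
from collections import Counter

# Hypothetical warming contributions (C) for each level of each driver.
drivers = {
    "emissions": [0.8, 1.2],
    "solar":     [-0.1, 0.0],
    "volcanoes": [-0.2, 0.0],
    "ocean":     [-0.2, 0.0, 0.2],
}

# Tally how many driver combinations produce each total outcome:
# outcomes reachable via many combinations are treated as more likely,
# while the extremes are reachable only via a few combinations.
outcomes = Counter(
    round(sum(combo), 1) for combo in product(*drivers.values())
)
for total, count in sorted(outcomes.items()):
    print(f"{total:+.1f} C : {count} combination(s)")
```

The resulting tally naturally concentrates mass in the middle of the range: the plausible worst case sits at the tail, produced by only one combination of inputs.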
In several regional climate impact assessment projects that have involved CFAN, the client has hired 2-3 different groups to assess the regional impacts of climate change. Apart from different methodologies, such assessments invariably involve expert judgment, and ‘which expert’ matters. The bottom line is that currently there is no generally accepted ‘best practice’ for making regional projections of extreme weather events on a decadal time scale.
Overall, the climate research community has not focused on the scientific problem of projecting future regional impacts of extreme weather events. Given the importance of such projections for adaptation to climate change, FERC could usefully motivate a focus on these applications.
However, in my opinion there has been an over-emphasis on manmade climate change as the cause of increasing extreme weather events. Natural climate variability remains the largest driver of variations in extreme weather events, with at most incremental changes associated with manmade global warming. Greater attention is needed to understanding the full range of climate variability that contributes to extreme weather events. Many of the worst U.S. weather disasters occurred in the 1930s and 1950s, a period that was not significantly influenced by manmade global warming. The 1970s and 1980s were a relatively quiet period, with weather disasters increasing again in the 21st century. The evolution of natural multi-decadal modes of climate variability suggests that we could see another quiet period in coming decades, followed by a more active period. Until the influence of natural climate variability on extreme weather is better understood, we may be misled in our interpretations of recent trends and their attribution to manmade global warming.
Probabilities, possibilities and uncertainty
As the time horizon of a weather or climate forecast increases and the spatial scale decreases, forecast uncertainty increases.
For a very short-term weather forecast, the uncertainty in the forecast is low and there is deterministic skill.
On timescales of 1-14 days, ensemble global weather forecast models provide meaningful probabilities in the sense of the forecasted mean being better on average than a climatological forecast, and the 90% range of the ensemble envelope nearly always bounds the actual outcome.
On timescales of 3-6 weeks, there are forecast periods with ‘windows of opportunity’ where the forecasts do better than climatology, but often the actual outcome occurs outside of the bounds of the 90% range of the ensemble forecast.
On seasonal time scales of 2 to 9 months, forecasts are commonly presented in terciles, with outcome probabilities provided for near average and above and below average outcomes.
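The tercile presentation used for seasonal forecasts amounts to comparing the ensemble members against the middle third of the climatological distribution. The sketch below uses synthetic normally distributed data (an invented warm-shifted 50-member ensemble) purely to show the mechanics:

```python
import numpy as np

# Sketch of tercile probabilities from an ensemble forecast: the fraction
# of members falling below, within, and above the middle third of the
# climatological distribution. All values here are illustrative.
def tercile_probs(ensemble, climatology):
    lo, hi = np.percentile(climatology, [100 / 3, 200 / 3])  # tercile edges
    ens = np.asarray(ensemble)
    below = np.mean(ens < lo)
    above = np.mean(ens > hi)
    return below, 1.0 - below - above, above

climatology = np.random.default_rng(0).normal(15.0, 2.0, 1000)  # past seasons
ensemble    = np.random.default_rng(1).normal(16.0, 2.0, 50)    # warm-shifted forecast

b, n, a = tercile_probs(ensemble, climatology)
print(f"below: {b:.0%}  near: {n:.0%}  above: {a:.0%}")
```

A warm-shifted ensemble yields an above-normal probability well above the climatological one-third, which is exactly the form in which seasonal outlooks are communicated.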
The climate model simulations to 2100 used by the IPCC are not predictions; they should be interpreted as a sensitivity analysis of climate change to different scenarios of emissions. These simulations are possible outcomes that are contingent on the assumptions made about emissions, the absence of future variability in solar output and volcanic activity, and the absence of meaningful phasing of the multi-decadal ocean circulation patterns. Attempts to create probabilities from the CMIP climate model simulations and regard them as predictions lead to misleading interpretations.
With regard to CFAN’s regional decadal projections, the objective is to bound the range of plausible outcomes for the frequency of extreme events, along with the plausible worst case. There is weak justification for providing likelihoods of the individual outcomes; this situation is referred to as scenario uncertainty.
Reducing vulnerability of electric utilities to extreme weather events
Electric utilities are vulnerable to extreme heat and cold waves, hurricanes, wildfires, flooding, droughts and wind gusts, with regionally-varying levels of risk from each of these.
There are two broad approaches for reducing vulnerability to extreme weather events:
- Strategic adaptation – hardening of infrastructure and increasing reserve capacity
- Tactical adaptation – planning and strategies for readiness and mitigation of damage from an anticipated severe event.
Strategic adaptation in terms of infrastructure and reserve capacity is developed in response to expected conditions over the relevant time horizon (nominally 30 years). The question then becomes ‘how much resilience can you afford?’ This is a choice between the robustness provided by 1-in-10 year versus 1-in-20 year standards. It is not cost effective to harden the infrastructure to accommodate every plausible worst-case weather scenario, which may not occur during the infrastructure lifetime of 30-50 years.
When an extreme event occurs that is outside of the expectations used in designing the infrastructure, too often the response is to passively watch a cascading disaster unfold and then clean up afterwards. Tactical adaptation strategies can be developed from considering plausible worst-case scenarios. Such strategies develop response protocols and then deploy them in a phased manner in response to probabilistic weather forecasts. This can result in better outcomes, with less damage and more rapid restoration of services.
Since 2013, CFAN has been working with an electric utility provider whose service region is impacted by hurricanes. Reconstructed landfalling winds from historical hurricanes are used to drive their outage models to produce a range of possible outage scenarios. A catalog of synthetic worst-case storms provides an additional basis for stress-testing their system using their outage model and for assessing their response strategies.
When a hurricane could potentially impact their region, risk management begins 7 days prior to the possible landfall. CFAN provides extended-range probabilistic forecasts of tropical cyclone tracks, intensity and landfall winds that are used to drive their outage models. Based on CFAN’s ensemble forecasts of landfall winds, estimates are made of manpower requirements, allowing for early requests for mutual aid so that repair crews are in place several days before the actual landfall. The catalog of synthetic worst-case storms is used to assess the worst-case possibility for the pending landfall.
My main point is that protocols developed for worst-case scenarios can be usefully deployed for forecasted extreme events to produce better outcomes.
There were 5 panels at the Conference; I participated in Panel 2. Here are the questions formulated for the first three panels. These questions obviously address very important issues, and the formulation of the questions is interesting in itself.
This panel will explore the ways in which planning inputs and practices—including those used in resource adequacy planning, transmission planning, integrated resource planning, and asset development and management—should evolve to achieve outcomes that reflect consumer needs for reliable electricity in the face of patterns of climate change and extreme weather events that diverge from historical trends. The panel may include a discussion of the following topics and questions:
- With respect to typical inputs to planning, such as expected future load, weather, temperature, etc., how can such futures-based inputs be projected more accurately (or usefully) than simply extending historical trends forward?
- Are there best practices for developing probabilistic/stochastic methods for estimating these typical planning inputs, including through use of expert-developed climate scenarios such as the Representative Concentration Pathway (RCP) scenarios for baseline CO2 projections developed by the Intergovernmental Panel on Climate Change?
- Are there best practices for conducting climate change and extreme weather vulnerability assessments? How should these assessments (and any resulting climate change resilience plans) interact with existing planning processes, e.g., transmission planning and resource adequacy planning?
- Are there expert-developed climate change scenarios, including “down-scaled” ones for smaller regions, that can be incorporated into planning processes at all relevant levels? What additional information, if any, do utilities need from government, academia, or other entities with expertise in climate change and meteorology to develop effective vulnerability assessments?
- How should climate vulnerability assessments be translated into actions that promote least-cost outcomes for consumers? What are the specific steps and considerations that lead from identification of a climate vulnerability to least-cost solution that addresses that vulnerability?
- What are the planning best practices that proactively protect the needs of vulnerable populations?
- What, if anything, should FERC consider to encourage or require jurisdictional utilities to better assess vulnerabilities to climate change or extreme weather and implement appropriate corrective action plans?
This panel will explore how well existing planning processes address climate change and extreme weather events and possible improvements to planning processes. This panel will engage in a broad ranging discussion of relevant best practices throughout the industry for assessing the risks posed by climate change and extreme weather and developing cost-effective mitigation. The panel may include a discussion of the following topics and questions:
- To what extent do existing resource adequacy processes (e.g., Loss of Load Expectation Analysis, Effective Load Carrying Capacity Analysis) assess the risk of common mode failures? How can these processes be improved?
- Given the increasing incidence of extreme weather events, is the existing 1-in-ten-year standard, commonly used as a benchmark for resource adequacy, still an appropriate resource adequacy standard or is a new approach needed? What role do existing, modified, or new Reliability Standards have to play in addressing planning issues associated with climate change and extreme weather?
- How should risks of climate change and extreme weather be incorporated into transmission planning processes? How does the appropriate approach change depending on specific threats most relevant to the region (e.g., extreme heat, drought, sea-level rise, etc.)?
- In light of the potential for increased instances of extreme weather, is a more probabilistic approach to transmission planning necessary? What are the potential benefits and drawbacks of such an approach?
- To what extent do existing transmission planning processes assess the benefits that transmission facilities provide during infrequent (e.g., one in twenty year) events? Should changes be considered to better assess the benefits of such facilities? If so, what should these changes look like?
- How do transmission planners evaluate the need for, and benefits of, increased inter-regional transfer capacity? In evaluating potential transmission projects that would increase regional import capability, do transmission planners consider the potential reliability benefits these projects would provide during extreme weather events? If not, should such benefits be considered and if so, how? Should the establishment and maintenance of some minimum amount of interregional transfer capability be required, and if so, how should the particular amount be determined and by whom?
- To what extent is the Value of Lost Load (VOLL) currently used as an input to resource adequacy processes and transmission planning processes? Would incorporating more accurate estimates of long-term and short-term VOLL into resource adequacy processes and transmission planning processes result in more cost-effective solutions to address the challenges of climate change and extreme weather?
- How can innovative mitigation strategies be incorporated in the various planning processes, such as planning for controlled sectionalization of parts of a grid to improve resilience?
- Are there potential rate incentives that rate regulators may consider to encourage investment in infrastructure to address the risks of climate change and extreme weather?
- What additional actions, if any, should FERC consider to encourage or require jurisdictional utilities to adopt robust planning practices that adequately consider climate change and extreme weather?
This panel will explore the ways in which existing operating practices—including but not limited to those pertaining to seasonal assessments, outage planning and coordination, reserve procurement, demand-side management, unit commitment and dispatch, short-term asset management, and emergency operating procedures—may necessitate updated techniques and approaches in light of increasing instances of extreme weather and longer-term threats posed by climate change. This panel may include a discussion of the following topics and questions:
- How can market structures or rules be reformed to give generators and other resources stronger incentive to be prepared for the challenges of climate change or extreme weather that they may face? Can new market products (e.g., seasonal products), or enhancements to existing market structures, be designed based on defined reliability/resilience needs in order to address the challenges of climate change and extreme weather?
- What current practices exist with respect to recalling or cancelling non-critical generation and transmission maintenance outages during a reliability event; are these practices sufficient to ensure that all possible resources and infrastructure needed to address an extreme weather event are available when such events happen unexpectedly?
- Given the dependence of electric system reliability on other systems (gas, water, etc.), what situational information related to those other systems is critical to electric system operator awareness during extreme weather events? Should electric system operators consider modifications to their control rooms or software to enhance their situational awareness related to these other systems?
- Can the use of market-based congestion management tools such as redispatch, seams coordination, and market-to-market processes, be expanded to more areas of the country in order to help address the challenges of climate change and extreme weather? In particular, are there opportunities to improve coordination between RTOs/ISOs and neighboring non-market areas so that RTOs/ISOs will no longer have to rely on the traditional Transmission Loading Relief (TLR) process to manage excessive transmission congestion at those borders instead of the market-based approaches RTOs/ISOs use internally? If so, would this type of market-to-non-market coordination require the negotiation of joint operating agreements (or other arrangements), and what are the tradeoffs with replacing the TLR process in this scenario?
- What best practices exist in the use of innovative mitigation strategies (such as controlled sectionalization, microgrids) in operations to reduce loss of load and improve resilience during extreme weather events?
- What are the most effective means of engaging flexible demand to mitigate emergency conditions? Are there methods to improve the use of flexible demand in addition to the solicitation of voluntary load reductions through mass communications during extreme weather events? Do existing interoperability and communications standards enable robust participation of flexible demand resources to address climate change and extreme weather challenges, or is more consensus-based standards development work needed by relevant stakeholders?