Climate Change, Extreme Weather, and Electric System Reliability

Reposted from Dr. Judith Curry’s Climate Etc.

Posted on June 27, 2021 by curryja 

by Judith Curry

I recently participated in a Technical Conference sponsored by the U.S. Federal Energy Regulatory Commission (FERC).

This was a very interesting conference. Unfortunately, there is no podcast or record of the written statements submitted by the panel.

The main part of my written statement is provided below.

JC remarks to FERC

The remarks that follow respond to issues raised for Panels #1 and #2, in the context of CFAN’s experience in dealing with extreme weather- and climate-related issues for the energy sector.

How extreme can it get?

Extreme weather events are rare, by definition. When planning for future weather extremes, several different approaches are used:

  1. recent climatology: 1-in-10 or 1-in-20 year standard
  2. 50- or 100-year return time
  3. worst cases in the historical record
  4. incremental changes to #1 – #3 associated with manmade global warming.

The extreme events of 2020 (e.g. TX cold, record number of hurricane landfalls, extensive fires in CA) belie the utility of a 1-in-10 or 1-in-20 year standard. The return period approach doesn’t help much either. For example, Texas saw three 500-year floods during 2015-2017. The 100-year event is not based on history, but on estimated probabilities that assume stationarity of the climate record. However, the climate is not stationary on any time scale – apart from the secular trend of global warming, there is multi-decadal to millennial scale natural climate variability that provides an envelope for decadal and interannual climate variability.
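To make the stationarity assumption concrete, here is a minimal sketch (illustrative only, not part of the statement) of the return-period arithmetic: under stationarity, an N-year event has annual probability 1/N, so the chance of at least one occurrence over a planning horizon follows directly.

```python
# Sketch of return-period arithmetic under the stationarity assumption
# that the text questions. p_annual = 1/N each year, years independent.

def exceedance_prob(return_period_years: float, horizon_years: int) -> float:
    """P(at least one exceedance over the horizon) = 1 - (1 - 1/N)^T."""
    p_annual = 1.0 / return_period_years
    return 1.0 - (1.0 - p_annual) ** horizon_years

# A "100-year" event is far from remote over a 30-year infrastructure lifetime:
print(round(exceedance_prob(100, 30), 3))  # ~0.26, i.e. about a 1-in-4 chance

# Joint probability of three independent "500-year" floods in three years;
# the vanishingly small result shows how the Texas 2015-2017 record
# strains the stationarity assumption.
print((1 / 500) ** 3)
```
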

Here is an anecdote that relates to a client who needed help in assessing the vulnerability to hurricanes of a new power plant that was to be located on the Gulf of Mexico coast. A risk assessment firm calculated the 100-year storm surge to be 10.1 ft, and the 500-year storm surge to be 13 ft. A quick look at the historical hurricane record shows an estimated storm surge of 12 ft near that location in the 1920s, and an estimated 15 ft storm surge from a hurricane in the 1840s – periods with significantly cooler climates than now. Neither conventional statistics on return periods nor climate model-driven expectations of slightly more intense hurricanes by 2100 provides a complete picture of what the power plant may be facing over the next 30-50 years from a hurricane storm surge. When I recommended moving the power plant inland, the client said that this site was previously approved for an earlier power plant, and getting a new site approved would take a decade.

In assessing the risk from extreme weather events, I advise clients to develop an understanding of the entire historical record of events impacting the locale, as well as any relevant paleoclimatic data that is available. If it has happened before, it can happen again. 

What about the role of global warming in changing the intensity or frequency of extreme weather events? Apart from a reduced frequency of the coldest temperatures, the signal of global warming in the statistics of extreme weather events remains much smaller than that from natural climate variability, and is expected to remain so at least until the second half of the 21st century.

Rather than focusing on the relatively small and uncertain impacts of global warming on extreme events, a broader range of extreme weather events from the historical record can provide a better basis for avoiding ‘big surprises.’

How can we assess regional vulnerability to weather extremes for the next 30 years?

While much of the climate change literature focuses on projections to 2100 from global climate models, the electric utilities sector needs projections of regional climate variability and change on decadal time scales.

To bridge this gap, there is a growing number of companies and university groups that are producing regional, decadal climate projections from global climate model simulations. Specifically, the 21st century climate simulations prepared for the IPCC assessment reports are bias-corrected based on a comparison of historical climate simulations with observations. The same bias correction is applied to the 21st century simulations, which are then ‘downscaled’ to a finer horizontal resolution. The downscaling approach may be statistical or dynamical; dynamical downscaling uses the coarser resolution outputs from a global climate model simulation as the boundary conditions for higher-resolution simulation using a regional climate model.   
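The simplest flavour of the bias correction described above (a mean "delta" shift) can be sketched in a few lines. This is an illustration only; operational practice uses more elaborate statistical methods such as quantile mapping, or dynamical downscaling, and all numbers here are invented.

```python
# Illustrative "delta" bias correction: shift future model output by the
# model's mean bias over the historical period. All numbers are invented.

def bias_correct(model_future, model_historical, observed_historical):
    """Subtract the model's mean historical bias from its future output."""
    bias = (sum(model_historical) / len(model_historical)
            - sum(observed_historical) / len(observed_historical))
    return [value - bias for value in model_future]

# Invented annual-mean temperatures (deg C):
obs_hist   = [14.0, 14.2, 13.9, 14.1]
model_hist = [15.0, 15.3, 14.8, 15.1]   # this model runs ~1 deg too warm
model_fut  = [16.0, 16.4, 16.2]

print([round(t, 2) for t in bias_correct(model_fut, model_hist, obs_hist)])
# → [15.0, 15.4, 15.2]
```

Note that this shift does nothing to repair errors in the model's simulated variability, which is exactly the limitation the bullet points below raise.
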

The problems with using global climate models as a basis for assessing future regional weather extremes are:

  • The climate model simulations used for the IPCC assessment reports include only scenarios for future emissions; they do not include predictions of natural climate variability (solar output, volcanic eruptions or the evolution of large-scale multi-decadal ocean circulations).
  • Because the global climate models do not adequately represent the multi-decadal ocean circulations, they do a poor job at simulating regional and decadal-scale climate variability.
  • Climate models do not accurately simulate the magnitude or frequency of extreme weather events.
  • Downscaling doesn’t help if the underlying global climate model is not producing an accurate simulation.

In the absence of climate models that are fit-for-purpose for predicting future extreme weather events on regional and decadal time scales, alternative methods are being developed. CFAN has developed a semi-empirical methodology for providing scenarios of regional extreme weather events for the next 30 years. This approach combines historical data and climate dynamics analysis with scenarios of natural climate variability plus the outputs from global climate models. Multiple scenarios are selected for each driver of the forecast – emissions, solar, volcanoes and large-scale ocean circulations – with an emphasis on plausible scenarios, rather than extreme scenarios that cannot completely be ruled out. Based on recent information provided by the International Energy Agency (IEA), emissions scenarios to 2050 are best represented by the IPCC RCP4.5 or RCP6.0 scenarios (not the oft-used extreme RCP8.5 scenario).

The multiple outcomes derived from different combinations of the scenarios for each driver are organized using a possibility diagram that portrays the distribution of scenario outcomes. The likelihood of a particular outcome is associated with the plausibility of the input scenarios and also the number of different combinations of inputs that produce a particular outcome. Regional extreme weather events are then linked to these scenarios of climate change. This linkage is made through an analysis that relates the extreme weather categories to atmospheric and oceanic circulation patterns and global temperature change.
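The counting logic behind such a possibility diagram can be sketched briefly. This is an illustration only, with invented driver names and warming contributions; it is not CFAN's actual methodology.

```python
# Illustrative possibility-diagram counting: enumerate combinations of
# per-driver scenarios and count how many combinations land on each
# outcome. All driver names and numbers are invented for the sketch.
from itertools import product
from collections import Counter

# Hypothetical warming contribution (deg C by 2050) of each scenario:
drivers = {
    "emissions": {"RCP4.5": 0.7, "RCP6.0": 0.8},
    "solar":     {"flat": 0.0, "weak_minimum": -0.1},
    "volcanoes": {"quiet": 0.0, "one_major": -0.1},
    "ocean":     {"warm_phase": 0.2, "cool_phase": -0.2},
}

outcomes = Counter()
for combo in product(*[d.items() for d in drivers.values()]):
    total = sum(value for _, value in combo)
    outcomes[round(total, 1)] += 1

# More input combinations producing an outcome -> that outcome is judged
# more plausible, which is the counting idea described in the text.
for outcome in sorted(outcomes):
    print(f"{outcome:+.1f} C: {outcomes[outcome]} combination(s)")
```
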

In several regional climate impact assessment projects that have involved CFAN, the client has hired 2-3 different groups to assess the regional impacts of climate change. Apart from different methodologies, such assessments invariably involve expert judgment, and ‘which expert’ matters. The bottom line is that currently there is no generally accepted ‘best practice’ for making regional projections of extreme weather events on a decadal time scale.

Overall, the climate research community has not focused on the scientific problem of projecting future regional impacts of extreme weather events. Given the importance of such projections for adaptation to climate change, FERC could usefully motivate a focus on these applications.

However, in my opinion there has been an over-emphasis on manmade climate change as the cause of increasing extreme weather events. Natural climate variability remains the largest driver of variations in extreme weather events, with at most incremental changes associated with manmade global warming. Greater attention is needed to understanding the full range of climate variability that contributes to extreme weather events. Many of the worst U.S. weather disasters occurred in the 1930’s and 1950’s, a period that was not significantly influenced by manmade global warming. The 1970’s and 1980’s were a relatively quiet period, with weather disasters increasing again in the 21st century. The evolution of natural multi-decadal modes of climate variability suggests that we could see another quiet period in coming decades, followed by a more active period. Until the influence of natural climate variability on extreme weather is better understood, we may be misled in our interpretations of recent trends and their attribution to manmade global warming.

Probabilities, possibilities and uncertainty

As the time horizon of a weather or climate forecast increases and the spatial scale decreases, forecast uncertainty increases.

For a very short-term weather forecast, the uncertainty in the forecast is low and there is deterministic skill.

On timescales of 1-14 days, ensemble global weather forecast models provide meaningful probabilities in the sense of the forecasted mean being better on average than a climatological forecast, and the 90% range of the ensemble envelope nearly always bounds the actual outcome.
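That calibration property (the central 90% ensemble range nearly always bounding the outcome) can be illustrated with a toy simulation on synthetic data; real forecast verification uses archived forecasts against observations.

```python
# Toy calibration check with synthetic data (illustrative only): how
# often does the central ~90% range of an ensemble bound the outcome?
import random

random.seed(0)

def central_90_range(ensemble):
    """Approximate central 90% range from sorted ensemble members."""
    members = sorted(ensemble)
    lo = members[int(0.05 * len(members))]
    hi = members[int(0.95 * len(members)) - 1]
    return lo, hi

trials, hits = 1000, 0
for _ in range(trials):
    # A 50-member ensemble drawn from the same distribution as the truth,
    # i.e. a perfectly calibrated forecast system.
    ensemble = [random.gauss(0.0, 1.0) for _ in range(50)]
    outcome = random.gauss(0.0, 1.0)
    lo, hi = central_90_range(ensemble)
    hits += lo <= outcome <= hi

print(hits / trials)  # roughly 0.9 for a well-calibrated ensemble
```

At longer lead times, as the text notes next, a real ensemble is no longer drawn from the same distribution as the outcome, and the hit rate drops well below the nominal 90%.
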

On timescales of 3-6 weeks, there are forecast periods with ‘windows of opportunity’ where the forecasts do better than climatology, but often the actual outcome occurs outside of the bounds of the 90% range of the ensemble forecast.

On seasonal time scales of 2 to 9 months, forecasts are commonly presented in terciles, with outcome probabilities provided for near average and above and below average outcomes.

The collection of climate model simulations to 2100 used by the IPCC are not predictions; they should be interpreted as a sensitivity analysis of climate change to different scenarios of emissions. These simulations are possible outcomes that are contingent on the assumptions made about: emissions, the lack of variability in solar and volcanoes, and the absence of meaningful phasing of the multi-decadal ocean circulation patterns. Attempts to create probabilities from the CMIP climate model simulations and regard them as predictions lead to misleading interpretations.

With regards to CFAN’s regional decadal projections, the objective is to bound the range of plausible outcomes for the frequency of extreme events and the plausible worst case.  There is weak justification for providing likelihoods of the individual outcomes, which is referred to as scenario uncertainty.

Reducing vulnerability of electric utilities to extreme weather events

Electric utilities are vulnerable to extreme heat and cold waves, hurricanes, wildfires, flooding, droughts and wind gusts, with regionally-varying levels of risk from each of these.

There are two broad approaches for reducing vulnerability to extreme weather events:

  1. Strategic adaptation – hardening of infrastructure and increasing reserve capacity
  2. Tactical adaptation – planning and strategies for readiness and mitigation of damage from an anticipated severe event.

Strategic adaptation in terms of infrastructure and reserve capacity is developed in response to expected conditions over the relevant time horizon (nominally 30 years).  The question then becomes ‘how much resilience can you afford?’ This is a choice between the robustness provided by 1-in-10 year versus 1-in-20 year standards. It is not cost effective to harden the infrastructure to accommodate every plausible worst-case weather scenario, which may not occur during the infrastructure lifetime of 30-50 years.

When an extreme event occurs that is outside of the expectations used in designing the infrastructure, too often the response is to passively watch a cascading disaster unfold and then clean up afterwards. Tactical adaptation strategies can be developed from considering plausible worst case scenarios. Such strategies develop response protocols and then deploy them in a phased manner in response to probabilistic weather forecasts. Such strategies can result in better outcomes, with less damage and more rapid restoration of services.

Since 2013, CFAN has been working with an electric utility provider whose service region is impacted by hurricanes. Reconstructed landfalling winds from historical hurricanes are used to drive their outage models to produce a range of possible outage scenarios. A catalog of synthetic worst-case storms provides an additional basis for stress-testing their system using their outage model and for assessing their response strategies.

When a hurricane may impact their region, risk management begins 7 days prior to the possible landfall. CFAN provides extended-range probabilistic forecasts of tropical cyclone tracks, intensity and landfall winds that are used to drive their outage models. Based on CFAN’s ensemble forecasts of landfall winds, estimates are made of manpower requirements, allowing for early requests for mutual aid so that repair crews are in place several days before the actual landfall. The catalog of synthetic worst-case storms is used to assess the worst-case possibility for the pending landfall.

My main point is that protocols developed for worst-case scenarios can be usefully deployed for forecasted extreme events to produce better outcomes.

FERC questions

There were 5 panels at the Conference; I participated in Panel 2. Here are the questions formulated for the first three panels. These questions obviously address very important issues, and the formulation of the questions is interesting in itself.

Panel 1

This panel will explore the ways in which planning inputs and practices—including those used in resource adequacy planning, transmission planning, integrated resource planning, and asset development and management—should evolve to achieve outcomes that reflect consumer needs for reliable electricity in the face of patterns of climate change and extreme weather events that diverge from historical trends.  The panel may include a discussion of the following topics and questions:

  1. With respect to typical inputs to planning, such as expected future load, weather, temperature, etc., how can such futures-based inputs be projected more accurately (or usefully) than simply extending historical trends forward?
  2. Are there best practices for developing probabilistic/stochastic methods for estimating these typical planning inputs, including through use of expert-developed climate scenarios such as the Representative Concentration Pathway (RCP) scenarios for baseline CO2 projections developed by the Intergovernmental Panel on Climate Change?
  3. Are there best practices for conducting climate change and extreme weather vulnerability assessments?  How should these assessments (and any resulting climate change resilience plans) interact with existing planning processes, e.g., transmission planning and resource adequacy planning?
  4. Are there expert-developed climate change scenarios, including “down-scaled” ones for smaller regions, that can be incorporated into planning processes at all relevant levels?  What additional information, if any, do utilities need from government, academia, or other entities with expertise in climate change and meteorology to develop effective vulnerability assessments?
  5. How should climate vulnerability assessments be translated into actions that promote least-cost outcomes for consumers?  What are the specific steps and considerations that lead from identification of a climate vulnerability to least-cost solution that addresses that vulnerability?
  6. What are the planning best practices that proactively protect the needs of vulnerable populations?
  7. What, if anything, should FERC consider to encourage or require jurisdictional utilities to better assess vulnerabilities to climate change or extreme weather and implement appropriate corrective action plans?

Panel 2

This panel will explore how well existing planning processes address climate change and extreme weather events and possible improvements to planning processes.  This panel will engage in a broad ranging discussion of relevant best practices throughout the industry for assessing the risks posed by climate change and extreme weather and developing cost-effective mitigation. The panel may include a discussion of the following topics and questions:

  1. To what extent do existing resource adequacy processes (e.g., Loss of Load Expectation Analysis, Effective Load Carrying Capacity Analysis) assess the risk of common mode failures?  How can these processes be improved?
  2. Given the increasing incidence of extreme weather events, is the existing 1-in-ten-year standard, commonly used as a benchmark for resource adequacy, still an appropriate resource adequacy standard or is a new approach needed?  What role do existing, modified, or new Reliability Standards have to play in addressing planning issues associated with climate change and extreme weather?
  3. How should risks of climate change and extreme weather be incorporated into transmission planning processes? How does the appropriate approach change depending on specific threats most relevant to the region (e.g., extreme heat, drought, sea-level rise, etc.)?
  4. In light of the potential for increased instances of extreme weather, is a more probabilistic approach to transmission planning necessary? What are the potential benefits and drawbacks of such an approach?
  5. To what extent do existing transmission planning processes assess the benefits that transmission facilities provide during infrequent (e.g., one in twenty year) events? Should changes be considered to better assess the benefits of such facilities? If so, what should these changes look like?
  6. How do transmission planners evaluate the need for, and benefits of, increased inter-regional transfer capacity? In evaluating potential transmission projects that would increase regional import capability, do transmission planners consider the potential reliability benefits these projects would provide during extreme weather events? If not, should such benefits be considered and if so, how?  Should the establishment and maintenance of some minimum amount of interregional transfer capability be required, and if so, how should the particular amount be determined and by whom?
  7. To what extent is the Value of Lost Load (VOLL) currently used as an input to resource adequacy processes and transmission planning processes?  Would incorporating more accurate estimates of long-term and short-term VOLL into resource adequacy processes and transmission planning processes result in more cost-effective solutions to address the challenges of climate change and extreme weather?
  8. How can innovative mitigation strategies be incorporated in the various planning processes, such as planning for controlled sectionalization of parts of a grid to improve resilience?
  9. Are there potential rate incentives that rate regulators may consider to encourage investment in infrastructure to address the risks of climate change and extreme weather?
  10. What additional actions, if any, should FERC consider to encourage or require jurisdictional utilities to adopt robust planning practices that adequately consider climate change and extreme weather?

Panel 3

This panel will explore the ways in which existing operating practices—including but not limited to those pertaining to seasonal assessments, outage planning and coordination, reserve procurement, demand-side management, unit commitment and dispatch, short-term asset management, and emergency operating procedures—may necessitate updated techniques and approaches in light of increasing instances of extreme weather and longer-term threats posed by climate change.  This panel may include a discussion of the following topics and questions:

  1. How can market structures or rules be reformed to give generators and other resources stronger incentive to be prepared for the challenges of climate change or extreme weather that they may face?  Can new market products (e.g., seasonal products), or enhancements to existing market structures, be designed based on defined reliability/resilience needs in order to address the challenges of climate change and extreme weather?
  2. What current practices exist with respect to recalling or cancelling non-critical generation and transmission maintenance outages during a reliability event; are these practices sufficient to ensure that all possible resources and infrastructure needed to address an extreme weather event are available when such events happen unexpectedly?
  3. Given the dependence of electric system reliability on other systems (gas, water, etc.), what situational information related to those other systems is critical to electric system operator awareness during extreme weather events?  Should electric system operators consider modifications to their control rooms or software to enhance their situational awareness related to these other systems?
  4. Can the use of market-based congestion management tools such as redispatch, seams coordination, and market-to-market processes, be expanded to more areas of the country in order to help address the challenges of climate change and extreme weather? In particular, are there opportunities to improve coordination between RTOs/ISOs and neighboring non-market areas so that RTOs/ISOs will no longer have to rely on the traditional Transmission Loading Relief (TLR) process to manage excessive transmission congestion at those borders instead of the market-based approaches RTOs/ISOs use internally? If so, would this type of market-to-non-market coordination require the negotiation of joint operating agreements (or other arrangements), and what are the tradeoffs with replacing the TLR process in this scenario?
  5. What best practices exist in the use of innovative mitigation strategies (such as controlled sectionalization, microgrids) in operations to reduce loss of load and improve resilience during extreme weather events?
  6. What are the most effective means of engaging flexible demand to mitigate emergency conditions?  Are there methods to improve the use of flexible demand in addition to the solicitation of voluntary load reductions through mass communications during extreme weather events? Do existing interoperability and communications standards enable robust participation of flexible demand resources to address climate change and extreme weather challenges, or is more consensus-based standards development work needed by relevant stakeholders?
48 Comments
waza
June 29, 2021 6:19 pm

A very good article.
I agree there are not good linkages between global and regional predictions.
BUT
There also aren’t any linkages to real world engineering.

waza
June 29, 2021 6:34 pm

Investing in weather/climate models which help predict weather events the following season or two, to help farmers and emergency management teams, is an honourable goal which I support.

Climate models that predict a region MAY have a few percentage more or less annual rainfall in 2100 are totally fraudulent.

Reply to  waza
June 30, 2021 10:39 am

Weather is deterministically chaotic so it is intrinsically unpredictable on a seasonal scale. Following a false model is worse than following none (and being ready for the likely range of events). The folks selling seasonal predictions should know this but the intrinsic unpredictability of nonlinear dynamics is hard to accept. There is no money in unpredictability.

Sweet Old Bob
June 29, 2021 6:39 pm

” . When I recommended moving the power plant inland, the client said that this site was previously approved for an earlier power plant, and getting a new site approved would take a decade. ”

Easy peasey … raise it 19 feet like Galveston did after the 1900 hurricane .

😉

Chaswarnertoo
Reply to  Sweet Old Bob
June 29, 2021 11:56 pm

Yep. Take worst case scenario and add some Brewster’s factor.

markl
June 29, 2021 8:05 pm

Well thought additions to climate modeling. Better? Who knows. What we do know is current models are crap and nature always wins. We can’t move forward until we recognize the failures and limitations to date.

bigoilbob
June 29, 2021 8:17 pm

“Unfortunately there is no podcast or record of the written statements submitted by the panel.”

Agree. Is this routine for these government sponsored tech conferences? I asked my son, a long time LLL Comp. Sci. guy, if he could score a vid of Steve Koonin’s talk, but no could do.

Zig Zag Wanderer
June 29, 2021 9:03 pm

Given the increasing incidence of extreme weather events…

Is there actual evidence of this? I’m dubious.

Chaswarnertoo
Reply to  Zig Zag Wanderer
June 29, 2021 11:55 pm

Nope. It’s not true. The only increase is in the monetary damage done as we are building in inappropriate places.

Alan the Brit
Reply to  Chaswarnertoo
June 30, 2021 12:55 am

Also don’t forget insurance companies change their policies every now & then depending on their marketing philosophy, whereby they insure new for old replacement policies which puts up pay-outs & increases premiums etc!!! The actual costs are not truly reflected! A UK policy change took place a while ago by allowing building on flood plains, largely domestic residential properties, & then one day the plains flood to everyone’s surprise!!! Building in known earthquake areas is a real problem without adequate structural design for resistance to such forces of nature, etc! I dare say a puter model produced some output to tell the powers that be that flooding risks would be reduced due to climate change (mustn’t call it globul warming) so one can build on a flood plain!!!

StephenP
Reply to  Alan the Brit
June 30, 2021 1:30 am

A sceptic might say that brown envelopes stuffed with cash may have had a part in the decision to build on the flood plain.

http://Www.pbctoday.co.uk/news/planning-construction-news/corrupt-planning-decisions/80113/

beng135
Reply to  StephenP
July 1, 2021 9:18 am

Yep, happens all the time here in the US. Floodplain crowded w/mobile homes gets washed away. Floodplain stays empty for some yrs, then a mobile home or two appears, then gradually gets filled up again. Wash away again, repeat, etc. Eco-loons like griff blame globullcrap-warming….

griff
Reply to  Zig Zag Wanderer
June 30, 2021 1:09 am

yes.

for example the UK Met Office has shown that human-induced climate change made the 2018 record-breaking UK summer temperatures about 30 times more likely than it would have been naturally

Christopher Hanley
Reply to  griff
June 30, 2021 2:04 am

Unfalsifiable.

Zig Zag Wanderer
Reply to  griff
June 30, 2021 5:01 am

the UK Met Office has shown that human-induced climate change made the 2018 record-breaking UK summer temperatures about 30 times more likely than it would have been naturally

Evidence, please?

MarkW
Reply to  Zig Zag Wanderer
June 30, 2021 5:50 am

It was reported by the Guardian. That proves it’s true.

Reply to  griff
June 30, 2021 10:43 am

These so-called attribution studies are completely unscientific. They just compare a climate model with and without human forcing. The fact that other models have different sensitivities is ignored. So is the fact that no model is trustworthy. It is models all the way down.

Climate believer
Reply to  Zig Zag Wanderer
June 30, 2021 2:11 am

The magic gas can provide pretty much any statistic you want.

Jeff Alberts
June 29, 2021 11:08 pm

“as well as any relevant paleoclimatic data that is available.”

Is there any paleo data that is reliable enough to be useful, in any meaningful way? They all have caveats, very coarse resolution, error bars the size of Texas, and all sorts of other problems.

Earthling2
Reply to  Jeff Alberts
June 30, 2021 12:50 am

Maybe a periodic volcano that had accurate paleoclimatic dating that might be predictive of future eruptions? Mt. Hood or Mt. Baker for example are probably due like the big quake some day.

John Dueker
June 30, 2021 12:32 am

Alternate title: How to scam clients by using alarmist models and climate buzz words while ignoring history.

climanrecon
June 30, 2021 1:04 am

It is not just extreme weather that is a risk, normal weather can render wind and solar power almost totally useless, otherwise on-message Prof Socolow of Princeton Uni has suggested that widespread wind lulls should be given names like hurricanes, due to the risk they pose to the electricity system. Strangely there is little interest in this suggestion.

In South Australia it is katabatic (downslope) winds in the hills near Adelaide that provide big profits to all the wind farms clustered there, but they are also a system-black waiting to happen: there is an hour-long period when the wind dies as it transitions from upslope wind during the day to downslope wind around sunset, just when electricity demand peaks and solar power dies.

griff
June 30, 2021 1:06 am

Extreme weather events are rare… but increasingly frequent.

The UK has in the last 20 years seen strings of 1 in 100 year and even 1 in 300 year flood events… in several locations twice in a decade.

The UK climate has changed and storms damaging to infrastructure and severe and flash flooding are now much more frequent.

Christopher Hanley
Reply to  griff
June 30, 2021 2:06 am

The data to support those claims doesn’t exist.

MarkW
Reply to  Christopher Hanley
June 30, 2021 5:53 am

What data does exist, refutes griff’s claims.

Dave Andrews
Reply to  griff
June 30, 2021 7:41 am

Griff,

If you have a 1 in 100 year event this year, or even a 1 in 300 year event, it doesn’t mean that you cannot have a similar event the following year.

Doonman
Reply to  Dave Andrews
June 30, 2021 10:25 pm

Not to mention that a 1 in 15,000 year event is called a glaciation and it lasts for 90,000 years.

Peta of Newark
June 30, 2021 1:18 am

Because rising temperatures, fires and floods ## are NOT manifestations of Climate or Weather.

Climate/Weather makes the real ‘thing’ actually visible – at least for those who like wearing very dark Ray-Bans behind an arc-welding mask. While sleep-walking.

They are caused by Soil Erosion (SE)
As is rising CO2 AND Global Greening = unrelated things with the same SE cause

## Add to that list on a more Human/Personal Scale

  • Obesity & diabetes
  • Cardiovascular Disease
  • The GHGE & other Junk Science
  • Cancer
  • Autoimmune disorder
  • Autism & other Dementias
Editor
June 30, 2021 2:21 am

“Many of the worst U.S. weather disasters occurred in the 1930’s and 1950’s, a period that was not significantly influenced by manmade global warming. The 1970’s and 1980’s were a relatively quiet period, with weather disasters increasing again in the 21st century.”.

Back in, I think, 2008, Klotzbach and Gray produced a report in which they showed a chart of US land-falling severe Atlantic hurricanes in a 25-year warming period and in a 25-year cooling period. Their conclusion was that those hurricanes are worse in cooling periods than in warming. This JC article, as quoted, appears to support Klotzbach and Gray’s analysis.

PS. Does anyone have a link to the Klotzbach and Gray article? I had it saved, but following a computer crash I have lost it, and I can no longer find it online.

Tom Abbott
Reply to  Mike Jonas
June 30, 2021 3:14 pm

“Their conclusion was that those hurricanes are worse in cooling periods than in warming.”

Yes, that would be my understanding.

AleaJactaEst
June 30, 2021 4:20 am

“Apart from a reduced frequency of the coldest temperatures, the signal of global warming in the statistics of extreme weather events remains much smaller than that from natural climate variability, and is expected to remain so at least until the second half of the 21st century.”

BOOM

Editor
June 30, 2021 5:39 am

My comment on Judith Curry’s page with this article:

“The collection of climate model simulations to 2100 used by the IPCC are not predictions; they should be interpreted as a sensitivity analysis of climate change to different scenarios of emissions. These simulations are possible outcomes that are contingent on the assumptions made about: emissions, …”.

With respect, Judith, I think that is incorrect. Because so many factors are omitted from the models – your article mentioned only a few – the models are effectively random-number generators operating within pre-set limits. These limits are set close to the pre-determined outcome based on the TCR and ECS of each model. The models’ outputs might look like sensitivity analyses, but this is an illusion. That was demonstrated several years ago, when a set of models was run, and then run again with only the initial conditions changed, by less than a trillionth of a degree. Many of the regional temperatures in the re-runs changed by more than one degree, i.e. more than a trillion times the change in initial conditions. From memory, the highest regional temperature difference between runs was over 5 degrees. That’s not sensitivity analysis; that’s random-number generation.

PS. If anyone has a link to that study, I would appreciate it being posted here.
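[Ed.: the divergence behavior described above is easy to reproduce with the classic Lorenz-63 toy system. This is only a minimal sketch of sensitive dependence on initial conditions, not any actual climate model; the perturbation size, step size, and run length are illustrative.]

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = state
    dxdt = sigma * (y - x)
    dydt = x * (rho - z) - y
    dzdt = x * y - beta * z
    return state + dt * np.array([dxdt, dydt, dzdt])

def run(state, n_steps=5000):
    """Integrate for n_steps (dt=0.01, so 50 model time units)."""
    for _ in range(n_steps):
        state = lorenz_step(state)
    return state

a = run(np.array([1.0, 1.0, 1.0]))
b = run(np.array([1.0, 1.0, 1.0 + 1e-12]))  # perturb by a trillionth
diff = np.abs(a - b)
print(diff)  # the trillionth has been amplified by many orders of magnitude
```

After 50 model time units the two trajectories are fully decorrelated: the final states differ by amounts enormously larger than the initial perturbation, which is the behavior the comment describes.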

Russell
June 30, 2021 5:48 am

Of course, the major environmental influence that is very likely to affect the energy sector has been completely ignored. I’ll give you a hint, it comes from the big golden ball in the sky when it spits plasma from its surface directly towards the earth (CME). All other climate change planning for the energy sector is rubbish relative to the probability and impact of the risks from sun (and its gravitational impacts on earth). But don’t tell the kids we are not watching out for this to protect their futures … they’ll all go on strike.

June 30, 2021 7:06 am

I summarize this excellent, lengthy statement as:
1. Most climate change is natural, which
2. We do not understand, hence
3. It cannot be predicted, so
4. We need non-predictive ways of planning for extreme events.

Of course FERC assumes extreme events are due to predictable AGW. Part of the Biden climate push. No good can come from these false assumptions.

It doesn't add up...
June 30, 2021 7:30 am

Completely missing from the analysis is any discussion of the impact of extended periods of Dunkelflaute (simultaneous low wind and low sun) that now threaten the integrity of energy supply. Texas has already suffered rolling blackouts from this twice this year: once in winter, and again in summer. European countries have come close to the brink several times. Study after study ignores these risks, pretending that a few GWh of grid batteries will solve it. It will take tens of TWh per country, and well into PWh on a global scale.

The thing is that these weather events also have frequencies of occurrence, with extremes of long duration. Worse, they can accumulate over time. A few storms with 40 mph winds do not deplete fossil or nuclear fuel supply. But if we come to rely on storage and renewables instead, a few months of sub-par output can deplete storage and prevent it from being refilled before a big event. That means you need even more storage for that 1-in-30-year bad event to be covered, and it must be kept regularly topped up.

Or you recognise that such a system is unworkable to begin with, and you use your data to prove it and to work out viable solutions for the future.
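[Ed.: a rough sketch of the storage-sizing arithmetic in this comment, with entirely hypothetical numbers: 1 GW of flat demand and a 10-day wind lull at about 20% output. The sizing rule is simply the deepest cumulative energy deficit over the series.]

```python
def storage_required(output_mw, demand_mw, hours_per_step=24):
    """Minimum storage (MWh) needed to cover every shortfall in the
    series, assuming storage starts full and can only refill up to full."""
    level = 0.0   # energy balance relative to "full"
    worst = 0.0   # deepest cumulative deficit seen
    for gen, load in zip(output_mw, demand_mw):
        level += (gen - load) * hours_per_step
        level = min(level, 0.0)  # surplus beyond "full" is spilled
        worst = min(worst, level)
    return -worst

# hypothetical 30-day series: steady 1 GW demand, strong wind
# except for a 10-day lull at 200 MW
demand = [1000.0] * 30
wind = [1200.0] * 10 + [200.0] * 10 + [1200.0] * 10
print(storage_required(wind, demand))  # 192000.0 MWh, i.e. 192 GWh
```

Even this single plant-scale lull demands nearly 200 GWh; scaled to a national grid, and allowing for back-to-back lulls that prevent refilling, the TWh-to-PWh figures in the comment follow.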

June 30, 2021 8:03 am

Wind and solar add to the list of extreme events that affect reliability: low wind and heavy clouds, neither of which affects fossil power.

I have found a darkly amusing way the utilities are handling these new threats in their planning. Each says that when wind and solar fail, it will get the juice from its neighbors, which is impossible if everyone does it. The utilities are making a fortune building wind and solar, so this impossibility is well hidden in the planning documents.

Gordon A. Dressler
June 30, 2021 8:38 am

Uhhh . . . with all due respect to Judith Curry (and she deserves a LOT!), nowhere in the above article is there “evidence beyond a reasonable doubt” that atmospheric CO2 concentration levels are the cause of global warming—let alone that mankind’s emissions of CO2 are the predominant cause of such.

The old adage “Correlation does not equal causation” is appropriate in this context.

Moreover, even if one were to assume atmospheric CO2 levels did contribute to past global warming, W. A. van Wijngaarden and W. Happer [2020], “Dependence of Earth’s Thermal Radiation on Five Most Abundant Greenhouse Gases” (free download available at https://arxiv.org/abs/2006.03098 ) present convincing evidence that the warming effect of atmospheric CO2 is now at an essentially asymptotic limit, and thus further increases cannot lead to any significant additional global warming.

The van Wijngaarden and Happer CO2-is-near-total-saturation-at-420-ppm assertion gives us an excellent explanation for why Earth did not experience “runaway” greenhouse warming, and thus the extinction of all life, when Earth previously experienced CO2 levels 8 to 15 times higher than today’s level.

Consequently, attempting to tie future climate-associated “extreme weather events” to present or future atmospheric CO2 concentration levels is just so much fluff.
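[Ed.: the diminishing-returns behavior invoked in this comment can be illustrated with the widely used simplified logarithmic forcing expression of Myhre et al. (1998). This is not the van Wijngaarden and Happer line-by-line calculation, only a standard approximation; note it shows diminishing but still nonzero increments per doubling.]

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing (Myhre et al. 1998), in W/m^2,
    relative to a pre-industrial baseline of 280 ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# each successive 140 ppm increment adds less forcing than the last
for c in (280.0, 420.0, 560.0, 700.0):
    print(c, round(co2_forcing(c), 2))
```

Under this formula a doubling from 280 to 560 ppm adds about 3.7 W/m^2, and the step from 420 to 560 ppm contributes less than the step from 280 to 420 ppm did, which is the saturation-flavored behavior at issue in the thread.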

David Wojick
Reply to  Gordon A. Dressler
June 30, 2021 10:46 am

She clearly says that most climate change is natural.

Gordon A. Dressler
Reply to  David Wojick
June 30, 2021 11:17 am

Yes she does, but my post was specific to the point that increasing atmospheric CO2 concentration levels—whether from natural sources or from human sources—should not be considered as a significant influence on climate-associated “extreme weather events” going forward.

David Wojick
Reply to  Gordon A. Dressler
June 30, 2021 12:32 pm

I am pretty sure that Curry accepts the popular hypothesis that the CO2 increase is human-caused. (My view is that we do not know.) Thus saying most climate change is natural implies that the CO2 increase does not play a big role. Given the panel questions, the specific role of CO2 is not relevant.

Gordon A. Dressler
Reply to  David Wojick
June 30, 2021 1:37 pm

David Wojick posted: “Given the panel questions the specific role of CO2 is not relevant.”

Really? Under Panel 1, Question #2 includes this verbatim phrase “. . . including through use of expert-developed climate scenarios such as the Representative Concentration Pathway (RCP) scenarios for baseline CO2 projections developed by the Intergovernmental Panel on Climate Change?”

Tom Abbott
Reply to  Gordon A. Dressler
June 30, 2021 3:19 pm

“my post was specific to the point that increasing atmospheric CO2 concentration levels—whether from natural sources or from human sources—should not be considered as a significant influence on climate-associated “extreme weather events” going forward.”

That’s right on the money. We may have reached Peak CO2 Warming now. And the temperatures are currently cooling, too.

Coach Springer
June 30, 2021 9:08 am

A methodical breakdown. Such a thing is always ignored by politicians and managers when convenient, or when in the midst of group / psychological influence. Maybe they should plan on sequestering decision making.

True story without revealing my source: Over the weekend, flash flooding in the area caused the cooling lake at a nuclear power plant to quickly rise several feet. Their planning addresses weather events on 100-year and actual historical records. Many individuals concluded that the lake was higher than it ever had been, even though their written procedures specifically noted a prior higher level.

Human beings want to mislead themselves.

Olen
June 30, 2021 10:21 am

An enjoyable article

The three little pigs are an example of proper construction to withstand harsh weather, as opposed to picking up the pieces after a lot of destruction.

Okinawa builds to withstand typhoons, and they don’t shut down during a typhoon at its peak.

Kit P
June 30, 2021 12:06 pm

I am old school. A 25% reserve margin for generating capacity is the best way to ensure reliability. So is spending the money to maintain vegetation under transmission lines. Diversity of fuel sources is important too.

Every time there is a reliability event, there is a pattern of government, environmental groups, and consumer groups eroding these basic principles.

There is sometimes a new factor, but if the basics had been maintained, the grid would have kept delivering power.

I predict the state of California will not learn the lesson of history.

Tom Abbott
June 30, 2021 3:08 pm

From the article: “Here is an anecdote that relates to a client who needed help in assessing the vulnerability to hurricanes of a new power plant that was to be located on the Gulf of Mexico coast. A risk assessment firm calculated 100-yr storm surge to be 10.1 ft, and the 500-year storm surge to be 13 ft. A quick look at the historical hurricane record shows an estimated storm surge of 12 feet near that location in the 1920s, and an estimated 15 ft storm surge from a hurricane in the 1840’s – periods with significantly cooler climates than now. Neither conventional statistics on return periods or climate model-driven expectations of slightly more intense hurricanes by 2100 provide a complete picture of what the power plant may be facing over the next 30-50 years from a hurricane storm surge. When I recommended moving the power plant inland, the client said that this site was previously approved for an earlier power plant, and getting a new site approved would take a decade.”

I think you may be making a wrong assumption here. You are assuming that warmer weather will create stronger hurricanes, but is that true? Don’t we need a colder climate for stronger hurricanes? In a warmer climate like now, hurricanes have less strength, and tornadoes are fewer and weaker than in the recent past when it was colder, because there is less of a temperature contrast between weather fronts when it is warmer.

The 15-ft surge may be all the client needs to worry about.
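[Ed.: a small sketch of the return-period arithmetic behind this exchange. Even a “100-year” event has a substantial chance of striking during a 30-to-50-year plant life, assuming independent years with constant probability (a simplification, since the thread’s point is precisely that the climate is not stationary).]

```python
def chance_of_exceedance(return_period_years, horizon_years):
    """Probability of at least one event of the given return period
    occurring within the horizon, assuming independent years."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

print(round(chance_of_exceedance(100, 50), 3))  # 0.395
print(round(chance_of_exceedance(500, 50), 3))  # 0.095
```

So even under stationary statistics, the plant in the anecdote faces roughly a 40% chance of seeing its “100-year” surge over a 50-year life; non-stationarity and the 1840s/1920s historical surges only make the picture worse.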

Trying to Play Nice
June 30, 2021 4:10 pm

We had an unprecedented extreme weather event where I live yesterday. They called it a thunderstorm. Lightning struck some part of the local grid and about 1000 of us poor customers were without power. I’m glad I didn’t have an EV with 3 miles left on it trying to charge in the garage when that power went out. That nice SUV with an ICE really made it easy to get to a restaurant to get some dinner.

Tom Abbott
Reply to  Trying to Play Nice
July 1, 2021 6:59 pm

I see where New York State is telling its citizens to conserve electricity. I don’t think they have enough windmills.
