HT/Willis
A new paper published in Environmental Research Letters predicts summer fire burned area using pre-fire season climate conditions.
LETTER • OPEN ACCESS
Winter and spring climate explains a large portion of interannual variability and trend in western U.S. summer fire burned area
Ronnie Abolafia-Rosenzweig, Cenlin He and Fei Chen
Published 29 April 2022 • © 2022 The Author(s). Published by IOP Publishing Ltd
Environmental Research Letters, Volume 17, Number 5
Citation Ronnie Abolafia-Rosenzweig et al 2022 Environ. Res. Lett. 17 054030
Abstract
This study predicts summer (June–September) fire burned area across the western United States (U.S.) from 1984 to 2020 using ensembles of statistical models trained with pre-fire season climate conditions. Winter and spring climate conditions alone explain up to 53% of the interannual variability and 58% of the increasing trend of observed summer burned area, which suggests that climate conditions in antecedent seasons have been an important driver of broad-scale changes in summer fire activity in the western U.S. over the recent four decades. Relationships between antecedent climate conditions and summer burned area are found to be strongest over non-forested and middle-to-high elevation areas (1100–3300 m). Statistical models that predict summer burned area using both antecedent and fire season climate conditions have improved performance, explaining 69% of the interannual variability and 83% of the increasing trend of observed burned area. Among the antecedent climate predictors, vapor pressure deficit averaged over winter and spring plays the most critical role in predicting summer fire burned area. Spring snow drought area is found to be an important antecedent predictor for summer burned area over snow-reliant regions in the nonlinear statistical modeling framework used in this analysis. Namely, spring snow drought memory is realized through dry anomalies in land (soil and fuel) and atmospheric moisture during summer, which favours fire activity. This study highlights the important role of snow drought in subseasonal-to-seasonal forecasts of summer burned area over snow-reliant areas.
Export citation and abstract BibTeX RIS
Original content from this work may be used under the terms of the Creative Commons Attribution 4.0 license. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
1. Introduction
Since the beginning of the 21st century, the Intergovernmental Panel on Climate Change has projected increases in summer wildfire risk in North America associated with decreased snowpack and increased summer drying caused by human-induced global warming [1, 2]. This projection holds true to a large degree in the western United States (U.S.) as observations have revealed persistent increasing trends in temperature, aridity and wildfire burned area since the mid-1980s [3–8]. It is well established that the observed increase in fire activity in the western U.S. is predominately explained by warmer and drier springs and summers caused by natural and human-induced climate changes [3, 8–11]. This is consistent with analyses of global paleo fire and climate records that indicate rapid warming has historically played an important role in modulating broad-scale fire activity over the past two millennia [12, 13]. Climate simulations considering a range of greenhouse gas concentration trajectories have projected that abrupt human-caused climate changes will likely contribute to further increases in fire hazard over the next century in the U.S. [14, 15]. This projection has striking societal and environmental consequences because fires cause thousands of smoke-related deaths per annum which is projected to increase throughout this century [15, 16], increased COVID-19 mortality [17], permanent changes to ecosystems [18], persistent changes to streamflow [19], and multi-billion-dollar suppression expenditures (www.nifc.gov/fire-information/statistics/suppression-costs). A robust understanding of the relationships between fire activity and climate conditions susceptible to change can enable better preparation for these consequences via implementation of policy and management that addresses these relationships as well as models that more accurately predict fire activity to inform resource allocation.
Since the early 2000s, a suite of statistical analyses has quantified relationships between summer fire activity and various climate conditions in the western U.S. [3, 5, 9–11, 20–22]. Many of these relationships have been reviewed by Littell et al [23, 24]. These statistical models primarily differ based on spatial and temporal resolutions and domains, model complexity, and climate predictors. However, they each support that broad-scale fire activity across the western U.S. has been mostly explained by fluctuations in antecedent and fire season climate conditions since the mid-1980s. For instance, Westerling et al [3] concluded that wildfire activity in the western U.S. made an abrupt transition in the mid-1980s from infrequent and short-burning wildfires to more frequent longer-burning fires due to unusually warm springs and longer and drier summers. A series of follow-up studies have shown that after this transition, the majority of interannual variability in burned area can be explained by climate conditions [5, 9, 11].
The above body of work provides valuable insights on relationships between climate variability and fire. However, it does not provide in-depth analyses that relate only pre-fire climate conditions to summer fire activity, which are essential for lead-time (e.g. subseasonal-to-seasonal) fire forecasts [25]. Previous studies that have compared antecedent climate conditions with fire season burned area have reported significant relationships between pre-fire climate and fire activity, but have not comprehensively explored how much of the interannual variability or trend in fire season burned area can be explained by pre-fire season climate alone [9, 10, 26–30]. Several of these studies have reported the effects of antecedent soil moisture conditions. Namely, wetter conditions 1–3 years before fire seasons in fuel-limited regions correspond with greater fire activity, and drier antecedent conditions in the months preceding the fire season tend to favour more fire activity [26–28]. A recent study that explored relationships between antecedent climate and fire season activity in a multivariate analysis showed machine learning models that predict burned area using both pre-fire and fire season conditions outperform models that use only fire season conditions [20]. Although previous research underlines that antecedent conditions contain unique information for enhancing fire-activity predictability, it has not yet established a comprehensive quantitative relationship between fire season severity and pre-fire climate alone. Moreover, interannual changes in winter and spring snowpack have important relationships with fires [3, 10], motivating our hypothesis that a large portion of summer burned area variability and trend can be explained by pre-fire climate conditions alone, with an important contribution from pre-fire snowpack conditions in snow-reliant areas over the western U.S.
Indeed, dramatic temperature-driven declines in snowpack across the western U.S. [31] are considered an important link between climate trends and increasing fire hazard [3, 10]. Namely, reductions in winter and spring snowpack (particularly during snow drought) drive earlier spring melt, increase surface temperature and evaporation due to snow-albedo feedback, and more quickly deplete moisture of vegetation and soil, leading to longer and drier summers [32]. The historically observed trend of declining snowpack has been projected to continue through the remainder of this century [33, 34]. Thus, understanding relationships between snow drought and fire over contemporary landscapes is imperative to project how future snowpack changes will impact fire activity, particularly in the western U.S. However, snow-drought-fire interactions, and the ability of snow drought combined with other winter and spring climate conditions to predict summer fire activity, have not been fully understood and evaluated.
In this study, we use ensembles of nonlinear statistical models to predict summer fire burned area across the western U.S. based on pre-fire (winter and spring) climate including snow drought conditions. A key novelty of this study is that we explore the predictability of burned area with combinations of pre-fire climate conditions and the role of snow-drought-fire interactions in modulating summer fire hazard. The goal of this study is to answer the following three science questions. (i) What percent of interannual variability and trend in summer burned area can be explained by pre-fire climate conditions alone? (ii) What predictive information is contained in summer climate conditions that is not provided by winter and spring climate conditions? (iii) To what degree can the inclusion of snow drought enhance the predictability of summer fire activity relative to other traditionally used climate predictors? The overarching objectives of this study are to advance the understanding of seasonal climate-fire relationships, explore a methodology capable of predicting broad-scale fire burned area across the western U.S., and thus better inform policy and resource allocation.
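As a rough illustration of the kind of approach the authors describe (a nonlinear statistical model trained on antecedent climate predictors and evaluated out of sample against summer burned area), here is a minimal sketch on synthetic data. The single predictor, the quadratic model, the leave-one-year-out scheme, and all numbers are stand-ins for illustration, not the paper's actual models or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 37 years (1984-2020) of data: a winter-spring
# vapour pressure deficit anomaly (the paper's top antecedent predictor)
# and a summer burned area with a nonlinear (convex) response plus noise.
n_years = 37
vpd = rng.normal(0.0, 1.0, n_years)
burned = np.exp(0.8 * vpd) + rng.normal(0.0, 0.2, n_years)

# Leave-one-year-out prediction with a simple quadratic fit: a toy version
# of "train a nonlinear statistical model on antecedent climate, then
# predict summer burned area for an unseen year".
pred = np.empty(n_years)
for i in range(n_years):
    train = np.delete(np.arange(n_years), i)
    coeffs = np.polyfit(vpd[train], burned[train], deg=2)
    pred[i] = np.polyval(coeffs, vpd[i])

# Fraction of interannual variance explained out of sample, analogous in
# spirit to the paper's "explains up to 53% of the interannual variability".
ss_res = np.sum((burned - pred) ** 2)
ss_tot = np.sum((burned - burned.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"out-of-sample R^2 = {r2:.2f}")
```

The leave-one-out step matters: fitting and scoring on the same years would inflate the apparent skill, which is the distinction several commenters below raise between "prediction" and curve-fitting.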
2. Method
2.1. Study domain
The spatial domain considered in this study includes all areas classified by MODIS satellite observations [35] as forest, grassland, or savanna in the western U.S. (north of 32° latitude and west of −104° longitude) below typical tree lines (<3300 m) [36] that have received adequate snowfall in winter and spring (peak snow water equivalent (SWE) > 100 mm) (figure 1(a)). The peak SWE threshold is imposed to limit the study domain to areas potentially affected by snow drought following Livneh and Badger [33], which supports assessing the importance of spring snow drought as a predictor of summer fire activity. The main results and conclusions presented herein are qualitatively insensitive to the choice of this peak SWE threshold (see sensitivity analyses in figures S1–S3 available online at stacks.iop.org/ERL/17/054030/mmedia). The preceding elevation and vegetation screening is performed to confine the domain to areas susceptible to fires, which reduces the domain area by 8% while retaining 99% of the burned area relative to all areas that meet the peak SWE threshold. After applying the aforementioned spatial screening, the study domain contains 43% of the land area and 61% of the burned area during 1984–2020 relative to the entire western U.S. (figure 1(a)). The interannual variability of burned area for the study domain is representative of the burned area over the entire western U.S., indicated by a very high correlation (r = 0.97) between fire season burned area across the study domain and the entire western U.S. (figure 1(c)). We select summer months (June–September) as the fire season in this analysis, which contains 94% of the total burned area within the study domain (figure 1(b)). Figure S10 shows that including May in the summer months (i.e. May–September) does not affect the qualitative relationships between pre-summer climate and summer burned area presented herein.
Note that the domain has experienced an increasing trend in annual burned area from 1984 to 2020 for each calendar month (figure S4 and table S1).
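For concreteness, the spatial screening described above (vegetation class, elevation below ~3300 m, peak SWE above 100 mm) amounts to a conjunction of gridded masks. A minimal sketch, assuming toy arrays and assumed MODIS IGBP class codes; the grids and code values are illustrative stand-ins, not the paper's actual data:

```python
import numpy as np

# Hypothetical gridded inputs on the same grid: MODIS IGBP land-cover
# class, elevation in metres, and climatological peak SWE in mm.
land_cover = np.array([[1, 4, 10], [12, 8, 5]])
elevation = np.array([[2500, 3400, 1200], [900, 2100, 2800]])
peak_swe = np.array([[250, 180, 60], [120, 300, 90]])

# Assumed IGBP codes for forest (1-5), savanna (8-9), and grassland (10).
FOREST_GRASS_SAVANNA = [1, 2, 3, 4, 5, 8, 9, 10]

veg_mask = np.isin(land_cover, FOREST_GRASS_SAVANNA)
elev_mask = elevation < 3300   # below typical tree line
snow_mask = peak_swe > 100     # areas potentially affected by snow drought

# A grid cell enters the study domain only if it passes all three screens.
study_domain = veg_mask & elev_mask & snow_mask
```

Only cells passing all three screens survive, which is why the screening can trim 8% of the area while keeping 99% of the burned area: fire occurs almost entirely inside the vegetated, sub-treeline portion of the snow-affected region.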
Figure 1. Study domain details. (a) White shaded regions represent the snow-affected area used for the analysis. 500 m resolution observed burned fractions from 1984 to 2020 (in yellow to red shades) are overlain. (b) Box and whisker plots of burned area by month highlight that 94% of the burned area from 1984 to 2020 has occurred during summer months. Variability in boxplots is from the spread in burned area across different years, and whisker lengths are equivalent to the interquartile range. Outliers are plotted as red ‘+’. (c) Scatter plot of summer burned area for the study domain shown in (a) (horizontal axis) and the entire western U.S. (vertical axis) during 1984–2020 reveals that interannual variability of summer burned area in the study domain is representative of the entire western U.S.
You can read the full open access article here.
California gets dry enough to burn every year.
And the number of loony eco-terrorists who start wildfires seems to be increasing.
Regards,
Bob
… that should be fitted with the GPS ankle tags.
I would go with GPS locating collars like for Polar Bears
Can we add Taser tech to the GPS collar please?
Like the Geo-Fence dog collar!
Is this one big enough?
Not just a cartoon, it’s real:
https://foxmetronews.com/news/alibaba-president-shares-at-the-world-economic-forum-on-carbon-footprint-tracker-technology-for-individuals/
California literally burns somewhere every day. But I don’t know how you come up with a computer model to determine when a homeless encampment will catch fire again. Los Angeles alone reports multiple fires nearly every day. How do you factor in arsonists? How do you factor in gender reveal parties gone wrong?
Hi Hoyt! Interesting question. Our model does not determine where fires will occur. It just predicts how many acres in the western US will catch on fire. We found that models can predict this accurately only using climate data. Given that the models can accurately predict burn area with only climate data (no information about forest management or ignitions), we conclude that most of western US fire activity can be explained by climate alone. This is not to say other factors are not important, but they are not needed to accurately predict fire activity on a broad scale in the western US. This strongly suggests climate is a key driver.
Correlation/causation??
I call BS on your use of “climate”. By the definitions of the CAGW religion, it takes 30 years to be called climate.
It is weather patterns, NOT “climate” that determines how dry the fuel load is and thus how much will burn before contained by natural barriers and/or firefighters, mostly natural barriers. Weather is what your models are measuring, not CLIMATE.
Did they look for a correlation between acres burned and incompetent fire management?
With No Touch, Let It Burn, Watch It Rot policies, maximum acreage burned is a desired outcome, not an accident of nature.
BTW, whole globe (including the oceans) warming leads to more rain and wetter springs and summers. The notion that warmth = drought is alarmist crack pottery. Just the opposite happens. And they know it! We need a new word for baseless fear mongering — I suggest “hobgoblin-ing”.
“… suggests that climate conditions in antecedent seasons have been an important driver to broad-scale changes in summer fire activity in the western U.S. over the recent four decades. ”
What about the last 10 or 12 decades ?
Oh …
You mean these decades?
Note in the curve on fires that it drops to its lows by 1960–2000, after which it begins to rise again. Given the statistics, the small rising trend is almost certainly attributable to arson. The activist surge really didn’t start until 2000. This is why they chose only the last 30 yrs. Everything predating that period (a warming period since 1850 according to the “scholarly bullshit”) doesn’t fit the meme.
The big tell is the finding that “arson” or “human caused” isn’t even mentioned in the paper (identified in over 50% of the fires), so they even know they are being deceitful! This type of tell and the period they cherry pick out of data that doesn’t support the study are the main algorithms that will ultimately be used to weed out millions of scholarly bullshit papers.
I had posited this quick SWAG below as a response to Kip stating “~85% of wildland fires in the United States are caused by humans.”:
“A possibly good SWAG would be to assume a certain percentage of the total amount dry enough to burn based on winter & spring climate conditions will be set afire by humans. One could adjust for economic factors, including gas prices as less people travel when economy’s in the tank & when it costs a lot to fill up the tank!”
Upon looking at my graph again, I realized I should’ve added a ~30-yr upward trend-line adjustment for a possible rise in arson/carelessness & worse forest management, if that’s occurring, too. Proximity of dry areas & cities could be an adjustment that varies with the economy. For a SWAG, it may not be too far off.
I’m not excusing arsonists, but please don’t confuse or conflate number of fires with acreage burned. Many human-caused fires, including arsons, are close in — near to habitations, roads, and fire stations. They get put out quickly and at small acreages.
On remote lands lightning is the most common ignition source. The remote wildfire policy used to be contain, control, and extinguish as rapidly as possible, often with smoke jumpers. Then ~1980 the feds adopted Let It Burn, aka Fires Used For Resource Benefit, and the burned acreage jumped up. Now the feds “Block It In”, which means drop back the fire lines a few miles and kiss off hundreds of thousands of acres from the get-go.
Those Let It Burn policies, human decisions based on twisted political expediency, are directly and entirely responsible for increasing acreage burned.
You still have to explain what was different in the 40yrs prior to 2000 when it began to rise from the bottom of the curve. The actual rise is indeed small. Lightning doesn’t explain much since 1960. The long stretch of low acreage burned is probably the average contribution over time.
Please see my extensive comments below. The change in fire suppression strategies to Let It Burn began ~1988, most notably with the Yellowstone Fire (793,880 acres). The increasing acreage burned nationally since then is due to govt. policy, not some “natural” phenomena.
https://kutv.com/news/local/man-accused-of-starting-destructive-2017-brian-head-fire-will-face-trial-judge-rules
AND the fire burned so much because dead dry standing beetle killed trees were not logged due to envirowaco lawsuits.
Any increased fire “activity” during the dust bowl years of the early 1930’s? . . . is that perhaps why the “recent four decades” was cherry-picked?
There wasn’t satellite data in earlier decades so our study begins at the start of the satellite record in 1984. Indeed there have been points in the past where other factors may have been more dominant in regulating fire activity than climate (massive land cover changes from a growing population), but in the past 40 years, models can predict most of the broad scale burned area variability and trend with climate alone.
Got any proof to go with that handwaving claim?
In particular, this first sentence statement in your abstract is outrageously sophomoric:
“This study predicts summer (June–September) fire burned area across the western United States (U.S.) from 1984 to 2020 using ensembles of statistical models trained with pre-fire season climate conditions.”
You perform what amounts to glorified statistical curve-fitting over the past 36 years, and then turn around to claim your study is a “prediction”? It is more accurately classified as a postdiction, as it involves explanation after the fact.
Good grief!
Please get back to us when you can compare true predictions made today, using your ensembles of trained statistical models without any additional “training”, with what actually happens over, say, the next five years.
A twenty-year comparison of your predictions to actual history would be better . . . but I think five years will be all that is needed.
“There wasn’t satellite data in earlier decades so our study begins at the start of the satellite record in 1984.”
Really?
“NASA’s Earth Resources Technology Satellite (ERTS) launched July 23, 1972 . . . Later renamed, Landsat 1 became the first earth-observing satellite explicitly designed to study planet Earth.
“During its six-years, Landsat 1 acquired images covering about 75% of the Earth’s surface . . . Within days of the launch, Landsat 1 collected imagery of an astounding 81,000-acre (327.8 square kilometers) fire burning in isolated, central Alaska. For the first time ever, scientists and resource management officials were able to see the full extent of damage from a fire in a single image while it was still burning.”
— source: https://www.usgs.gov/landsat-missions/landsat-1
1972 is more than one decade before 1984.
Facts matter.
My first mental question to pop up…”Why did they start in 1984?” And then when I saw the word “since,” my next mental association was “baselining.”
Once again, “Pick your period, pick your trend.”
Self fulfilling prophecy. They prophesy a target figure and their acolytes go out with petrol and matches in hand to fulfill it.
The issue is you can’t forecast arsonists. There has been a woman setting at least two fires that began in British Columbia. In all probability a misguided environmentalist willing to risk other lives to prove their alarmist message.
Don’t worry! Stuart Party of the BC Green Party, says climate deuiers (sic) are funded by resource extraction companies, and the LEAP Manifesto will save you.
– – – – – – – – –
Stuart Parker (BC Green Party): Why wildfire season will produce more denialists if we don’t change course
Ever since climate activists adopted the “no debate” policy with respect to climate denialists, and since the rise of the modern far right, we have seen an ongoing decline in Canadians’ belief that anthropogenic climate change is real.
https://www.straight.com/news/stuart-parker-why-wildfire-season-will-produce-more-denialists-if-we-dont-change-course
But….but….but….according to the U.S. National Parks Service: “Humans and Wildfire. Nearly 85 percent* of wildland fires in the United States are caused by humans. Human–caused fires result from campfires left unattended, the burning of debris, equipment use and malfunctions, negligently discarded cigarettes, and intentional acts of arson. ”
Where is the prediction on human arsonists? Or human foolishness? Or human neglectfulness?
Eighty-five percent!
Kip,
Just so!
But you’re not supposed to be so closely examining the above article that was published in Environmental Research Letters.
A word search of the segments of that “research” that is quoted in the article above by Charles Rotter reveals:
1) Zero hits on the word “arson” (which also includes “arsonist” and “arsonism”).
2) The only hits for “human” were “human-induced global warming”, “human-induced climate changes”, and “human-caused climate changes”, all contained within the first paragraph of the Introduction. No mention whatsoever of human-initiated fires, which beyond arson would include “controlled burns” that got out of control.
3) Zero hits for “man caused”.
Finally, my own personal observation: one could have stopped reading the article’s quotes upon encountering this phrase “In this study, we use ensembles of nonlinear statistical models to predict . . .”
And a word search on fuel and management does get some hits, but nothing to indicate that forest management allowing the fuel load to increase has anything to do with it.
I stopped reading at the phrase you quote in No. 3.
Epic – “non-linear” they say.
This is gonna be good
So what’s the answer then, tell tell tell – some considerable numbers of folks might be interested to know what the result of “n divided by zero” actually is.
(n being any number other than zero)
If you actually haven’t done a division by zero, if you haven’t ventured into and beyond a singularity – why are you using the term/words “non-linear”
Not to describe A Squiggly Line I do hope. Lines are lines are lines and, surprise surprise, lines are linear.
Do these folks know what they’re talking about?
Gordon ==> “nonlinear statistical models” == Chaos == Unpredictable.
A possibly good SWAG would be to assume a certain percentage of the total amount dry enough to burn based on winter & spring climate conditions will be set afire by humans. One could adjust for economic factors, including gas prices as less people travel when economy’s in the tank & when it costs a lot to fill up the tank!
Kip my friend, I’m being repetitive again, but please don’t confuse or conflate number of fires with acreage burned. Many human-caused fires, including arsons, are close in — near to habitations, roads, and fire stations. They get put out quickly and at small acreages.
On remote lands lightning is the most common ignition source. The remote wildfire policy used to be contain, control, extinguish as rapidly as possible, often with smoke jumpers. It was called the 10AM policy because the goal was to put the fire out by 10AM the next day. Then ~1980 the feds adopted Let It Burn, aka Fires Used For Resource Benefit, and the burned acreage jumped up. Now the feds “Block It In”, which means drop back a few miles and kiss off hundreds of thousands of acres from the get-go.
Those Let It Burn policies, human decisions based on political expediency, are directly and entirely responsible for increasing acreage burned.
I can cite dozens of fires where a small start was deliberately left unsuppressed and grew to megafire size (100,000+ acres).
Another factor must be mentioned. Since ~1988 the USFS and BLM have halted almost all logging on fed lands and aggressively removed (ripped up) access roads. Roadlessness has been a policy goal. These actions have led to fuel build up and continuity of fuels. More connected fuels also lead to bigger fires.
For instance, despite 20+ years of pleas for thinning of defensible zones along roads by local communities — such as the Quincy Library Group — little such work has been done. As a result, small fires have become massive. Almost a million acres burned in one fire, the Dixie Fire in 2021, which destroyed much of the Plumas and Lassen NFs.
Million acre fires were unheard of until recently. Prior to 2018 the largest fire in CA history was the Santiago Fire of 1889 at 300,000 acres. Now there have been 7 CA fires larger than that, all 2018 and after.
Mike ==> I’m not concerned with human-caused, but ARSON. Intentionally started wildfires.
Mike ==> https://nypost.com/2021/08/12/arson-spree-near-californias-dixie-fire-blamed-on-former-professor/
Kip — okay. Your concerns are valid. But… the article was about total acres burned. That’s my concern, the vast destruction of our heritage forests due to whatever cause. However, I’ll play along with you. Our forests are being incinerated for a variety of reasons, all of which may be considered arson.
First, deliberate arson ignitions as you noted. The crazy kook prof started 4 small fires on the periphery of the Dixie Fire but the bulk of the acreage burned was ignited by a downed powerline. In 2020 two major fires in Oregon were entirely arson-set by antifa nutjobs: the Holiday Farms Fire (173,400 acs) and the Riverside Fire (138,000 acs).
Second, an additional ~800,000 acres burned in Oregon in 2020 due to small fires that were ignited by lightning in July and allowed to burn without suppression. When the east wind came in Sept these fires blew up, raged out of control and swept down the Cascades to the Willamette Valley. In my book that’s govt. arson, the deliberate refusal to fight the fires when small regardless of the potential for catastrophe.
Third, the lack of management for decades allowed for fuel accumulation. By removing roads and eschewing fire breaks, the fuel was also contiguous across thousands of square miles. Any spark is fated to blow up and consume vast acreages. That’s the case with the Dixie Fire and so many others. In my book hands-off management is also tantamount to govt. arson. It’s more than incompetence, it’s programmed failure.
So arson is responsible although it’s arson of various forms. It’s not “climate change” because the climate hasn’t changed in any significant way. Climate skeptics may feel relieved about that, but the reality of landscape scale destruction can’t be denied. I’m a forester. It breaks my heart to see my forests destroyed. It’s not an academic debate to me. CC gets a pass, but the real causes remain unaddressed.
So Winter and Spring snowfall explains half the wildfire “trend”. Since the snow on the ground in the Rockies lasts till June, that isn’t any sort of unexpected observation……The study authors need to get outdoors more instead of reading reports all day long….
Did they notice the super-strong correlation with forest fires after a week of dry weather ?
Just like AGW, correlation does not mean causation. I wonder where they put man initiated fires in their models? How about electrical caused fires from downed power lines? Do they have a special place in the models as well or are they lumped in with man initiated? This is just another bullshit “study” that came to a prescribed conclusion that gained traction only because it supports a narrative.
…and just so we all remember, the worst years for wildfires were in the late 1800’s to the early 1900’s. What caused the great fires around the great lakes region? Or the 1910 fire that burned 3 million acres around the Yellowstone area? Let me guess…global warming from buffalo and elk farts? This is another bulls**t paper to add to the infinite and growing stack.
Not a single mention in the article about the arsonists who set numerous west coast fires. Apparently, modeling human behavioral characteristics and their effect on your modeling project is not a popular enough subject to include.
Retired firefighter here.
Every year we get predictions of: “The worst fire season ever!” Sometimes it’s dry and hot and thousands of acres are burned. Kiss your wife goodbye, shut the hatch in April, and you don’t see her until Thanksgiving. However another version of “The worst fire season ever” is a clean aircraft, a clean retardant tank and a check with only base pay.😁 It’s all relative.
Someone must’ve got some grant money.
No snow drought in the Cascades this year.
We lived in Topanga Cyn for many years directly adjacent to Topanga State Park. I remember a controlled burn that got out of hand, another time across the canyon a person using a metal bladed weed eater started a fire that he couldn’t control and out came the water bombers. Then one year a young man started a fire on the east side of the mountains which rolled over the mountains and demolished quite a few Malibu homes.
We had to evacuate one time when a fire came within ~ 1/2 mile.
Of course the oily brush in the Santa Monica mountains goes up very easily.
So fires are common and of course mud slides during the winter rains.
All normal in So California.
Can’t these people do anything without a model? I think it is getting to the point that models should be withheld from them. If you can’t do your work without a model you must return your grants.
Stopped reading at “Since the beginning of the 21st century, the Intergovernmental Panel on Climate Change has projected increases in summer wildfire risk”. That’s my clue that what follows is devoid of any scientific value. #modernharuspicy
WTF, no mention of the powerful impact that El Nino and La Nina have? That’s nuts!
The current drought has been greatly amplified if not caused by the long-lived La Nina…………cold water anomalies in the central and eastern tropical Pacific Ocean. This effect is very strong and is strongly correlated with prior events.
OK, I get it. Those are NATURAL factors. They told us from the get go what THIS study/method would discuss:
“Since the beginning of the 21st century, the Intergovernmental Panel on Climate Change has projected increases in summer wildfire risk in North America”
The BC government website shows that 60% of the fires in last year’s above-average season were started by people, most accidental, but still: without that 60% of fires it’s just another average year.
Having fought forest fire for several years in my youth and having been involved in occupational fire observation for a period longer than this study, I suggest that there is a very strong correlation between bad fire years and El Nino in the Pacific Northwest. There rarely were, and still rarely are, more than 2 severe fire seasons in a row, and severe fire years occur regularly on a 5, 6 or 7 year cycle. Whatever they were measuring likely correlates well with El Nino.