Guest essay by Larry Hamlin
The October 10, 2016 Times article addresses a study recently published in the Proceedings of the National Academy of Sciences in which the authors claimed that, using large-scale climate models and annual wildfire data from 1984 to 2015, they determined that man-made climate change increased the aridity of wildfire fuel by 55%, which doubled the area of the western U.S. that burned during this period.
This latest story differs significantly in its presentation of a supposed wildfire connection to climate change from a wildfire story published in the Times on October 18, 2015, in which Governor Brown’s attempt to link man-made climate change to wildfires was unsupported by fire experts.
In the October 18, 2015 Times article, the wildfire experts unsupportive of Brown’s position noted that:
“But climate scientists’ computer models show only that global warming will bring consistently hotter weather in future decades.
Their predictions that warming will bring more forest fires — mostly in the Rockies and at other higher elevations, while fires may actually decrease in Southern California — also are for future decades.
Even in a warmer world, they say, land management policies will have the greatest effect on the prevalence and intensity of fire.
A study published in August by a Columbia University team led by climatologist Park Williams concluded that global warming has indeed shown itself in California, by increasing evaporation that has aggravated the current drought.
But Williams said his research, the first to tease out the degree to which global warming is affecting California weather, did not show climate change to be a major cause of the drought.
Even climate ecologists who describe a strong tie between fire frequency and weather say they cannot attribute that connection to phenomena beyond normal, multi-decade variations seen throughout California history.
“There is insufficient data,” said U.S. Forest Service ecologist Matt Jolly. His work shows that over the last 30 years, California has had an average of 18 additional days per year that are conducive to fire.
In addition, predictions of the impact that global warming will have on future fires in California vary.
A team of researchers at UC Irvine recently reported that in 25 years, climate change will increase the size of fires driven by Santa Ana winds in Southern California. But their models varied on how much increase to expect: from 12% to 140%.
Predictions from a UC Merced expert include a possible decrease of such fires as dry conditions slow vegetation growth.
Today’s forest fires are indeed larger than those of the past, said National Park Service climate change scientist Patrick Gonzalez.
At a symposium sponsored by Brown’s administration, Gonzalez presented research attributing that trend to policies of fighting the fires, which create thick underlayers of growth, rather than allowing them to burn.
“We are living right now with a legacy of unnatural fire suppression of approximately a century,” Gonzalez told attendees.”
The new wildfire study relies on UN IPCC AR5 CMIP5 model ensemble simulations (RCP8.5 scenario) to obtain an anthropogenic climate signal that can be removed from the observational aridity record.
The study then explicitly assumes that anthropogenic increases in fuel aridity are additive to the wildfire extent that would have arisen from natural climate variability alone during 1984–2015.
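The signal-separation step can be sketched with purely synthetic numbers. Everything below (the trend size, noise levels, and oscillation period) is a made-up illustration, not the study’s actual data or code; it only shows the general shape of the subtraction: take the forced-ensemble mean as the anthropogenic signal and subtract it from the observed record to produce a “natural” counterfactual.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1984, 2016)
n = len(years)

# Hypothetical "ensemble" of forced model runs: a common anthropogenic
# trend plus run-to-run internal variability (all values are synthetic).
anthro_trend = 0.02 * (years - 1984)          # assumed forced aridity signal
ensemble = anthro_trend + rng.normal(0, 0.15, size=(20, n))

# Study-style step: treat the ensemble mean as "the anthropogenic signal".
anthro_signal = ensemble.mean(axis=0)

# Synthetic "observed" fuel-aridity record = natural variability + forcing.
natural = 0.3 * np.sin(2 * np.pi * (years - 1984) / 15) + rng.normal(0, 0.1, n)
observed = natural + anthro_trend

# Remove the model-derived signal to estimate a "natural" counterfactual.
counterfactual = observed - anthro_signal

print(f"observed mean aridity:       {observed.mean():.3f}")
print(f"counterfactual mean aridity: {counterfactual.mean():.3f}")
```

Note that the anthropogenic contribution recovered this way is, by construction, whatever the model ensemble puts in; that is the circularity targeted by the detection-and-attribution critique discussed below.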
This technique of using climate model simulations to manufacture a divergence between unforced and anthropogenically forced model ensemble runs is the same highly questionable technique that was used in the UN IPCC AR5 report to justify its man-made climate change detection and attribution arguments.
This model driven detection and attribution scheme was challenged by climate scientist Judith Curry (https://judithcurry.com/2014/08/24/the-50-50-argument/) who noted that “the IPCC has failed to convincingly demonstrate ‘detection.’
Because historical records aren’t long enough and paleo reconstructions are not reliable, the climate models ‘detect’ AGW by comparing natural forcing simulations with anthropogenically forced simulations.”
She added that “The IPCC then regards the divergence between unforced and anthropogenically forced simulations after ~1980 as the heart of their detection and attribution argument. See Figure 10.1 from AR5 WGI: (a) is with natural and anthropogenic forcing; (b) is without anthropogenic forcing:”
Dr. Curry then points out a number of critical flaws in these comparisons as follows:
“Note in particular that the models fail to simulate the observed warming between 1910 and 1940.
The glaring flaw in their logic is this. If you are trying to attribute warming over a short period, e.g. since 1980, detection requires that you explicitly consider the phasing of multidecadal natural internal variability during that period (e.g. AMO, PDO), not just the spectra over a long time period.
Attribution arguments of late 20th century warming have failed to pass the detection threshold which requires accounting for the phasing of the AMO and PDO.
It is typically argued that these oscillations go up and down, in net they are a wash. Maybe, but they are NOT a wash when you are considering a period of the order, or shorter than, the multidecadal time scales associated with these oscillations.
Further, in the presence of multidecadal oscillations with a nominal 60-80 yr time scale, convincing attribution requires that you can attribute the variability for more than one 60-80 yr period, preferably back to the mid 19th century.
Not being able to address the attribution of change in the early 20th century to my mind precludes any highly confident attribution of change in the late 20th century.”
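Dr. Curry’s point about window length versus oscillation period can be illustrated numerically. The sketch below uses a purely synthetic 65-year sine wave (an assumed, AMO-like period; none of the amplitudes or dates are real climate data) to show that a zero-trend oscillation, sampled only over a ~30-year window that happens to fall on its rising phase, produces an apparent trend.

```python
import numpy as np

years = np.arange(1850, 2015)
# Synthetic AMO-like oscillation: 65-year period, zero long-term trend.
# Amplitude and period are illustrative assumptions, not observations.
oscillation = 0.2 * np.sin(2 * np.pi * (years - 1850) / 65)

def trend_per_decade(t, y):
    """Least-squares slope of y against t, expressed per decade."""
    return np.polyfit(t, y, 1)[0] * 10

# Over the full record the oscillation averages out: near-zero trend.
full = trend_per_decade(years, oscillation)

# Over a short window on the rising phase, the same series mimics a trend.
window = (years >= 1975) & (years <= 2005)
short = trend_per_decade(years[window], oscillation[window])

print(f"trend over full record (per decade): {full:+.4f}")
print(f"trend over 1975-2005   (per decade): {short:+.4f}")
```

Under these assumptions the short-window slope is roughly an order of magnitude larger than the full-record slope, which is why attribution over a single ~30-year period cannot, by itself, distinguish forced warming from the phase of a multidecadal oscillation.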
Dr. Curry concludes that the UN IPCC climate models are unfit for this purpose, that the detection claim rests on circular reasoning, and that the attribution assertions fail to assess the impact of forcing uncertainties.
The significant shortcomings Dr. Curry identifies in using UN IPCC AR5 climate model simulations to detect and attribute a divergence between separate forced and unforced model runs, a divergence that supposedly separates the natural and man-made components of climate impacts since 1980, apply to the use of these model ensembles in this latest wildfire study just as they apply to the broader use of these model runs in the UN IPCC AR5 report.
The new wildfire study is also seriously deficient in that it never addresses the fact that the number of wildfires across the U.S. did not increase during the 1984–2015 study period. (http://www.nifc.gov/fireInfo/fireInfo_stats_totalFires.html)
The new study compared burned-area acreage over two periods, 1984–1999 and 2000–2015.
The number of U.S. wildfires for the period 1984–1999 is essentially unchanged from the number for the period 2000–2015.
Additionally, the latest year-to-date wildfire data (http://www.nifc.gov/fireInfo/nfn.htm) for the last ten years show no consistent upward trend whatsoever in either burned acres or the number of U.S. wildfires.
The new wildfire study also fails to address the fact that U.S. drought data do not support claims that increased nationwide drought is driving an increased number of wildfires.