Debunking the L.A. Times story claiming new study shows human-caused warming doubled western U.S. area burned since 1984

Guest essay by Larry Hamlin

The October 10, 2016 Times article addresses a study recently published in the Proceedings of the National Academy of Sciences, in which the authors claimed that, through the use of large-scale climate models and annual wildfire data from 1984 to 2015, they determined that man-made climate change increased the aridity of wildfire fuel by 55%, which doubled the area of the western U.S. that burned during this period.


This latest story is significantly different in its presentation of a supposed wildfire connection to climate change from a wildfire story published in the Times on October 18, 2015, in which Governor Brown’s attempt to link man-made climate change to wildfires was unsupported by fire experts.


In the October 18, 2015 Times article, wildfire experts unsupportive of Brown’s position noted that:

“But climate scientists’ computer models show only that global warming will bring consistently hotter weather in future decades.

Their predictions that warming will bring more forest fires — mostly in the Rockies and at other higher elevations, while fires may actually decrease in Southern California — also are for future decades.

Even in a warmer world, they say, land management policies will have the greatest effect on the prevalence and intensity of fire.

A study published in August by a Columbia University team led by climatologist Park Williams concluded that global warming has indeed shown itself in California, by increasing evaporation that has aggravated the current drought.

But Williams said his research, the first to tease out the degree to which global warming is affecting California weather, did not show climate change to be a major cause of the drought.

Even climate ecologists who describe a strong tie between fire frequency and weather say they cannot attribute that connection to phenomena beyond normal, multi-decade variations seen throughout California history.

“There is insufficient data,” said U.S. Forest Service ecologist Matt Jolly. His work shows that over the last 30 years, California has had an average of 18 additional days per year that are conducive to fire.

In addition, predictions of the impact that global warming will have on future fires in California vary.

A team of researchers at UC Irvine recently reported that in 25 years, climate change will increase the size of fires driven by Santa Ana winds in Southern California. But their models varied on how much increase to expect: from 12% to 140%.

Predictions from a UC Merced expert include a possible decrease of such fires as dry conditions slow vegetation growth.

Today’s forest fires are indeed larger than those of the past, said National Park Service climate change scientist Patrick Gonzalez.

At a symposium sponsored by Brown’s administration, Gonzalez presented research attributing that trend to policies of fighting the fires, which create thick underlayers of growth, rather than allowing them to burn.

“We are living right now with a legacy of unnatural fire suppression of approximately a century,” Gonzalez told attendees.”

The new wildfire study relies upon UN IPCC AR5 CMIP5 model ensemble simulations (RCP8.5 scenario) to obtain an anthropogenic climate signal that could be removed from the observational aridity record.

The study then explicitly assumes that anthropogenic increases in fuel aridity are additive to the wildfire extent that would have arisen from natural climate variability during 1984–2015.

This technique of using climate model simulations to manufacture a divergence between unforced and anthropogenically forced model ensemble runs is the same highly questionable technique that was used in the UN IPCC AR5 report to justify its man-made climate change detection and attribution arguments.
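The ensemble-differencing mechanics can be sketched in a few lines of Python. Everything below is synthetic and hypothetical: the member count, noise level, and imposed trend are invented purely for illustration, not real CMIP5 or aridity data:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1984, 2016)
n = years.size  # 32 fire seasons

# Hypothetical stand-ins -- neither ensemble is real CMIP5 output.
# "Unforced" runs: natural variability only (noise about a flat mean).
unforced = rng.normal(0.0, 0.3, size=(10, n))
# "Forced" runs: the same kind of variability plus an imposed linear trend.
trend = 0.02 * (years - years[0])
forced = rng.normal(0.0, 0.3, size=(10, n)) + trend

# Attribution step: the anthropogenic "signal" is defined as the difference
# between the forced and unforced ensemble means...
anthro_signal = forced.mean(axis=0) - unforced.mean(axis=0)

# ...and subtracting it from an observed record yields a "natural only"
# counterfactual. Any bias in either ensemble propagates straight into it.
observed = trend + rng.normal(0.0, 0.3, size=n)  # synthetic "observations"
natural_counterfactual = observed - anthro_signal
```

The point of the sketch is that the “anthropogenic signal” is defined entirely by the model ensembles, so any bias in either the forced or the unforced runs is carried, unexamined, into the attribution.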

This model-driven detection and attribution scheme was challenged by climate scientist Judith Curry, who noted that “the IPCC has failed to convincingly demonstrate ‘detection.’

Because historical records aren’t long enough and paleo reconstructions are not reliable, the climate models ‘detect’ AGW by comparing natural forcing simulations with anthropogenically forced simulations.”

She added that “The IPCC then regards the divergence between unforced and anthropogenically forced simulations after ~1980 as the heart of their detection and attribution argument. See Figure 10.1 from AR5 WGI: (a) is with natural and anthropogenic forcing; (b) is without anthropogenic forcing:”


Dr. Curry then points out a number of critical flaws in these comparisons as follows:

“Note in particular that the models fail to simulate the observed warming between 1910 and 1940.

The glaring flaw in their logic is this. If you are trying to attribute warming over a short period, e.g. since 1980, detection requires that you explicitly consider the phasing of multidecadal natural internal variability during that period (e.g. AMO, PDO), not just the spectra over a long time period.

Attribution arguments of late 20th century warming have failed to pass the detection threshold which requires accounting for the phasing of the AMO and PDO.

It is typically argued that these oscillations go up and down, in net they are a wash. Maybe, but they are NOT a wash when you are considering a period of the order, or shorter than, the multidecadal time scales associated with these oscillations.

Further, in the presence of multidecadal oscillations with a nominal 60-80 yr time scale, convincing attribution requires that you can attribute the variability for more than one 60-80 yr period, preferably back to the mid 19th century.

Not being able to address the attribution of change in the early 20th century to my mind precludes any highly confident attribution of change in the late 20th century.”
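Dr. Curry’s point about phasing is easy to demonstrate with a toy calculation. The series below is a synthetic sine wave, not a real AMO or PDO index: a pure 65-year oscillation with zero underlying trend still yields a substantial positive linear “trend” when fitted only over its rising phase, such as 1980–2010:

```python
import numpy as np

years = np.arange(1880, 2011)  # two full 65-year cycles, 1880-2010
# A pure 65-year oscillation with zero underlying trend -- a hypothetical
# stand-in for AMO/PDO-like internal variability, not a real index.
osc = 0.2 * np.cos(2 * np.pi * (years - 1880) / 65.0)

# Linear trend fitted over the full record vs. over 1980-2010 only.
full_slope = np.polyfit(years, osc, 1)[0]
recent = years >= 1980
short_slope = np.polyfit(years[recent], osc[recent], 1)[0]

# The full record fits to essentially zero slope, but the 1980-2010 window
# sits on the rising phase and fits to a sizeable positive "trend".
```

Here the full two-cycle record fits to a slope of essentially zero, while the 1980–2010 window alone fits to roughly +0.15 units per decade, all of it internal variability.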

Dr. Curry concludes that the UN IPCC climate models are unfit for this purpose, use circular reasoning in claiming detection, and fail to assess the impact of forcing uncertainties on their attribution assertions.

The significant shortcomings Dr. Curry identifies in attempting to use UN IPCC AR5 climate model simulations to detect and attribute a divergence between separate forced and unforced model runs, which supposedly define the natural and man-made components of climate impacts since 1980, apply to the use of these model ensembles in this latest wildfire study just as they do to their broader use in the UN IPCC AR5 report.

The new wildfire study is extremely deficient in failing to address at all the fact that the number of wildfires across the U.S. has not increased during the study period of 1984–2015.

The new study considered two time periods for comparing burned-area acres: 1984–1999 and 2000–2015.

The number of U.S. wildfires for the period 1984–1999 is essentially unchanged from the number of U.S. wildfires for the period 2000–2015.


Additionally, the latest year-to-date wildfire data for the last ten years show no consistent upward trend whatsoever in either burned acres or the number of U.S. wildfires that have occurred.


The new wildfire study also fails to address the fact that U.S. drought data does not support claims that increased nationwide droughts are driving an increased number of wildfires.



37 thoughts on “Debunking the L.A. Times story claiming new study shows human-caused warming doubled western U.S. area burned since 1984”

  1. If (and that is a big if) there is an increase in fires over the past 120 years, to me it would be because of the increase in population, the increase in coverage, such as cell phones, satellites, TV coverage, and so on, and overly regulated forest practices. For the rest? Business as usual.

  2. The stories also omit fire management strategies changing over time, which probably have a greater effect on the extent and number of fires than climate change.

    • Which is exactly what has happened here in Australia. A representative of the emergency services stated on the news the other night that this year’s fire season could be worse than previous years, because we have had a very wet winter and spring resulting in more growth in fuel load, while showing off larger water-bombing craft that have not been used before. There was a little mention of hazard reduction like back burning, but not much.

      • Indeed, the terrible fires in Tasmania (was it 2 or 3 years ago?) were primarily the result of the fire breaks, previously maintained by logging companies to preserve their stocks, becoming overgrown. Ironically, the fires destroyed far more trees than the loggers would have cut down.

    • This is an interesting article where I find myself agreeing with a “climate change scientist” – Patrick Gonzalez – but only in his definition of anthropogenic effects. In this case, he blames unnatural fire suppression over the past century vs. the natural approach, which would let it burn and destroy all of the fuel, reducing the risk of fire in subsequent years. It also reduces the intensity of the fires that start. When there is too much fire “fuel”, fires burn hotter and can literally scorch the earth and leave it barren. Modern-day forest conservation needs to evolve to allow for clearing of overgrowth. Logging, when done responsibly, reduces fire risk substantially and protects the forest as a habitat.

  3. See this note I left on Tips on October 3rd. [Link to that seems to do odd things, so here it is again.]

    I left a comment on Paul Homewood’s site. This was prompted by the idea of CO2 greening the planet.
    It seems appropriate to have it here also.
    Just want folks to be aware of this: When you see CAGW types jump on news of wildfires, be prepared. It is not really about climate change.

    The statement that CO2 helps plants grow and “green the world”, while true, has a dark underside.
    We went to a presentation on Thursday evening about wildfire in the Western USA. The term “megafire” is used. Here’s why:
    Prior to settlement by Europeans there were two sources of fire on the landscape: lightning, and fires set by Natives. Nature could put the fires out. Natives learned to live with this, setting fires in spring and fall when the burn rate was slow. The advantages of fire included helping the growth of food plants and suppressing very large hot fires – megafires. Natives lost these lands in sometimes slow and sometimes fast processes – another story.
    White settlers and, then in the early 1900s, the government began to put out fires and learned how to do so.
    For about 100 years, especially the last 50 or 60, fire suppression has been very successful. The landscape has been filling with trees and low woody plants. These grow, die, and produce fuel. The extra CO2 helps the growth. Meanwhile the forest products industry has declined.

    The presentation we witnessed included a small amount of climate change comments but not enough to destroy the message. That being that the landscape is growing increasingly prone to big hot destructive fires and top political types are ignoring the issue.

    A problem, also, is that homes are being built in the wildland – urban interface (the WUI — woo-E) where it is either impossible to protect or very costly to protect from a wildfire.

    When you see CAGW types jump on news of wildfires, be prepared. It is not really about climate change.

  4. Buuubuuuubuuuubuuuubuuuubuuuut … “FIIIIIIIIIIIIIIIIIRRRRRRRRRReeeeeeeesssssss tooooowiiiiiiiiiiiicccccceeeeee AS BAAAAAAAAAAaaaaaaddddddddddd!!!!!!”

    That has such a nice ring to it! The carney hawker in me just can’t resist!

  5. It’s the fuel load, stupid. All that fuel has built up over the past century so that now these fires are devastating and, in some areas in a burn, nothing survives, not even the soil.
    But you can’t stop or convince the true believers. The world could turn to a block of ice and they’d blame it on CO2.

    • Yes. Fire needs fuel and oxygen. Wind can push it and make it spread, but only where there is fuel. There are no fires in deserts. Greens love fire fuel and won’t let people remove it. I think that means Greens are why there are more megafires, right?

      • Reality,

        Auto – glad that there are others who see through the perverted ‘thing’ that was a desire to do no harm to the planet – our planet.
        I raised money for the World Wildlife Fund (then) in, I think, the early 1960s [before I was a teenager, anyway!].

  6. Reblogged this on Climatism and commented:
    As evidence for anthropogenic global warming theory dwindles, with widespread debunking of the favoured canaries of doom (like the recent record September Arctic sea-ice growth along with its decadal recovery), the CAGW-obsessed mainstream media will simply double down on their falsehoods to reinforce their ideologically driven agenda.

    They certainly won’t re-evaluate or tell truths about “global warming”, as too many jobs, money and reputations are now at stake.

  7. Just looking at the figure 10.1 AR5 WGI at the beginning of this article: I have not looked at AR5 so far, but this entire temperature record from the year 1980 on is falsified to show a non-existent warming. In the real world 1980 is followed by an 18-year hiatus of no warming, not by the steep climb they show. Their phony warming is approximately 0.2 degrees Celsius over this interval, which is about 0.1 degrees Celsius per decade instead of zero. That extra warming is then carried over to the super El Nino of 1998 that follows. From there it is passed on to the entire twenty-first century, which they also raise a second time so its flat top stands above the super El Nino itself. The scale used is too small for more accurate measurements, perhaps intentionally. My advice is to throw these temperature curves out and start over with real observations. If that requires revising AR5, so be it. The manager responsible should get on the job and find out how and why such forgeries can be presented to the scientific community as observations.

  8. Californians (area 423,970 km²) should thank their lucky stars that their largest wildfire ever was the Rush fire, covering 315,577 acres in Lassen County. In this wildfire, fortunately, no one died.
    In Victoria, Australia (area 237,629 km²), the largest bushfire ever was on Black Thursday, 6th February 1851; it covered 58,719,000 acres, or a quarter of Victoria’s area. In the latter, mercifully, only 17 people died, thanks to the small, mainly urbanised population at the time. Surprisingly, CO2 was not involved.
    If Americans want to see real wildfires, they should come out to South Eastern Australia during January or February.

  9. The main long-term anthropogenic effect seems to have been improved fire prevention and fighting:

    (United States Department of Agriculture).

    • For a while after the massive Yellowstone fire, states did seem to let fires burn. In wilderness areas, some still do. I had hoped that Yellowstone had actually made a difference, but it appears maybe it didn’t. Or maybe megafires are what the Left wants, in order to push global warming predictions. Who knows. Sigh.

      • Much of the wilderness areas are surrounded by a patchwork of private, state and federal lands. Let the fire go in wilderness areas, and good luck stopping it at the border. That’s what’s happening to the let-it-burn philosophy. We had a couple of very public outcries in my state over how much private land was burnt up due to the let-it-burn philosophy.

  10. For a longer term perspective, this is what the Environmental Impact Report by the California Department of Forestry and Fire Protection had to say a couple of years ago:

    For purposes of analysis, the history of wildfire in California can be loosely categorized into pre-European settlement fire regimes and post-European settlement fire regimes, especially the last fifty years where rigorous fire suppression efforts have been undertaken.
    Natural fire regimes that existed prior to European settlement in California (pre-1700) involved a wide range of fire frequencies and effects on ecosystems; roughly one-third of the State supported frequent fire regimes of 35 years or less. Some areas likely burned on an almost annual basis. Pre-European settlement fire patterns resulted in many millions of acres burning each year, with fire acting as a major ecological force maintaining ecosystem vigor and ranges in habitat conditions. The pre-settlement period is often viewed as the period under which the “natural” fire regime standard for assessing the ecological role of fire developed.

    In the suppression (modern) era, statewide fire frequency is much lower than before the period of European settlement. Between 1950 and 2008, California averaged 320,000 acres burned annually, only a fraction of the several millions of acres that burned under the pre-settlement regimes.

    Before the twentieth century, many forests within California were generally open and park like due to the thinning effects of recurrent fire. Decades of fire suppression and other forest management have left a legacy of increased fuel loads and ecosystems dense with an understory of shade-tolerant, late-succession plant species. The widespread level of dangerous fuel conditions is a result of highly productive vegetative systems accumulating fuels and/or reductions in fire frequency from fire suppression. In the absence of fire, these plant communities accrue biomass, and alter the arrangement of it in ways that significantly increase fuel availability and expected fire intensity. As such, many ecosystems are conducive to large, severe fires, especially during hot, dry, windy periods in late summer through fall. Additionally, the spatial continuity of fuels has increased with fewer structural breaks to retard fire spread and intensity. The increased accumulations of live and dead fuels may burn longer and more completely, threatening the integrity and sustainability of the ecosystems.
    Species composition within these forests is also rapidly changing. Plant and animal species that require open conditions and/or highly patchy edge ecotones are declining and streams are drying as evapotranspiration increases due to increased stocking. Additionally, streams are being infiltrated by silt and debris following high severity fires, and unnaturally severe wildfires have destroyed vast areas of forest (Bonnicksen, 2003). Some insects and disease have reached epidemic proportions in parts of the State and forest conditions are conducive to more outbreaks. The understory of these once open forests is now dominated by smaller shade tolerant trees that would have previously been thinned and/or consumed by fire.

  11. The CAGW crowd tends to misuse some scientific terms that we should work very hard to avoid accepting and using. In this case it’s the word “study”. “Study” suggests that some phenomenon was closely reviewed and data collected. As I understand it, this was simply a model constructed and then run. Likewise, the CAGW crowd loves to use the word “data” to refer to the results of model runs. “Data” should be used only for measurements of real-world phenomena, not numbers generated by unproved models. Instead of “study” we should simply reference “unproved modeling” and instead of “data” we should simply reference “model results”. Otherwise we simply encourage the continuing degradation of true science.

  12. The largest fire to date in the contiguous United States was the Peshtigo Fire, which burned more than 3.7 million acres (1.5 million hectares) in Wisconsin and Michigan. Second in line is the so-called Great Fire of 1910. It burned over 3 million acres (1.2 million hectares) in less than a week in the Idaho/Montana border area. How did modern forest practices and fire fighting cause or enhance the destruction of those fires? They didn’t! Trees grow. Fires burn trees when the weather conditions are right for it. Hot temps and high winds are a deadly combination.

    Can humans make things worse? Certainly. But the nonsense about the biggest fire evah!!!! is just nonsense. But humans can improve the situation as well. And the answer is not to stand by and watch the forests burn, as happened in the Yellowstone environs in 1988. 2.2 million acres (900,000 hectares) burned as the idiots who could have fought the fire early did not. They were going to use nature’s way, and so let the fire grow until it could only be extinguished when the snow began to fall. Burning is not the only way to clear the clutter on the forest floor. Humans can clear the clutter. Humans can thin the trees. Humans can help if the self-anointed green rulers will stand aside.

    Too often in our time forest management has become a religion of burning. The Molech of the modern day. It is considered holy to let a tree burn. Or let it be killed by fire and not harvested, but rather left to stand as more dry fuel to help with the next fire. It is sinful to take that same tree and use it for any purpose. The false dichotomy of “rape and pillage the land” vs. “don’t touch any tree forever” should be rejected for a reasoned approach of wise use, consistent thinning and consistent conservation of all land resources.


    • Thanks for your testimony and for providing the link. Very informative and well presented. From your intro:

      “Foresters know there are many examples of where human activity affects both the total number and size of wildfires. Policy makers who halt active forest management and kill “green” harvesting jobs in favor of a “hands-off” approach contribute to the buildup of fuels in the forest. This eventually increases the risk of catastrophic wildfires. To attribute this human-caused increase in fire risk to carbon dioxide emissions is simply unscientific. However, in today’s world of climate alarmism, where accuracy doesn’t matter, I am not at all surprised to see many journalists spreading the idea that carbon emissions cause large wildfires.”

  13. I don’t know; there are two blog posts dealing with the latest adapted claims about forest fires and aridity.
    All of that is consistent with the adaptation of AGW climatology to the aridity problem.

    I will try to point out again the big paradox of AGW climatology.

    It basically consists of connecting the yearly CO2 flux, mostly and predominantly the variation of CO2 emissions over time, as causing and driving the CO2 concentration; that is where the assumed effect of humans comes into play. But then, by some miraculous intervention, everything becomes a matter of CO2 concentration, where the emissions have not much say, actually depend on it, and are totally ignored.

    So when in one direction it is claimed that the game changer is the variation of CO2 emissions, then, in an attempt to prove an anthropogenic forcing, everything is moved to CO2 concentration, completely ignoring and contradicting the former, the very means and cause of the CO2 concentration variation.

    I will give two examples of this; hopefully I am not wrong.

    The first one: ice core data, Antarctic ice core data in this case.

    If you look at these data, you find estimated temperatures and CO2 concentrations over long periods of time, with a given correlation in their variation.
    So far so good.

    But where the problem starts is with the numerical estimates attached to these variations.

    The temperature one consists of some 12 °C variation (swing) from min to max, which is further adjusted to estimate the climatic swing, standing at approximately half that value, due to the basic fact that the temperature variation is not uniform over the globe: the maximum will be in the polar regions, and the minimum, probably not even distinguishable from zero, will be in the tropics. So to estimate the climatic temperature swing, further adjusted estimates are considered and applied.

    Now when it comes to the estimated numbers for the atmospheric CO2 concentration from these ice core data, there is a problem.
    Whatever the numbers first estimated actually are, and whatever correlation those numbers propagate, the CO2 concentration swing is still further adjusted, as the temperature one is. But the strange thing here is that there is no justification or reason to do that, unless one contemplates and accepts that the numbers representing the CO2 concentration variation were wrong and wrongly estimated in the first place.

    In “climate science” it is estimated that the maximum swing of CO2 concentration is ~120 ppm, while according to the ice core data, where this estimation originates, such a variation does not actually exist at that amount.

    While in the similar situation for temperatures there could be a reason to adjust the numbers further, due to a non-uniform temperature variation over the globe, in the case of CO2 concentration the same science claims and upholds that CO2 is very well mixed and that the CO2 concentration is very uniformly distributed over the globe.
    Besides, all the other observations support this; even the latest satellite CO2 data do. So where do they find the reason to further adjust such numbers, unless the reason is simply that the estimation was wrong in the first place, meaning that the numbers do not add up, and therefore further unjustified adjustment-estimates are applied?

    And it is not difficult to show or explain why such an estimation could be wrong.

    It is not the atmospheric CO2 concentration that affects the ice CO2 concentration, which is being used as a medium for such estimations; it is actually the yearly CO2 flux and its variation over time, especially the emissions and their variation over time, that results in the variation of both the atmospheric and the ice CO2 concentration.
    So looking at the ice CO2 concentration and trying to estimate the atmospheric CO2 concentration and its variation by “drawing” a straight line between the two, with no regard to the yearly CO2 flux and its variation through time, will most probably result in a faulty and wrong estimation.
    The yearly CO2 sinks do not affect the ice CO2 concentration, while their impact and relation to the emissions result in the atmospheric CO2 concentration variation.

    The second example, which follows more or less the same line, is about the claim that over a very long period of time the atmospheric CO2 concentration has dropped a great deal, from somewhere around 4000 ppm to somewhere around 300 ppm at present.

    What first amazes me about that is that while everyone argues and “fights” with everyone else over data of the modern era of the last 120 years, data of much higher resolution and from sources like Mauna Loa and satellites, at the same time most accept the estimations from other, less reliable and scarcer means and methods covering immense time periods further in the past as unquestionable and written in stone, when actually a lot of interpretation, guesses and assumptions are involved there, with a very high probability of error.

    In this case too, as in the first one, as per my understanding, there is the same error: the “drawing” of a straight line between the CO2 concentration in the mediums used for the estimation and the atmospheric CO2 concentration, with no regard for the actual cause in both cases, the yearly CO2 flux and its variation.
    So in my view a very wrong one.

    The same data, if looked at and interpreted differently, can show a very different picture, in which the overall atmospheric CO2 concentration has never been at such high amounts, and there is no case to support that it or its variation was any different in magnitude over that whole period up to the present, with no actual detectable change.
    All those data show is the yearly CO2 flux and its variation changing slightly over that time period, while the CO2 concentration variation and its overall magnitude did not change over the same period.

    After all this long quarrel, where is the paradox?

    According to AGW, the atmospheric variation of CO2 concentration causes climate change.
    That variation causes every change, even in biomass, and also predominates over the actual forcing of the CO2 emissions, which by the many interpretations in “climate science” have not much value, even when contemplated at some point.
    And here it comes: the human CO2 emissions are special; these alone are predominant over the atmospheric CO2 concentration and therefore cause an anthropogenic climate change.

    The paradox: the effect causes the cause, with everything relying on an immense and highly amplified backward feed from the effect to the causality, turning everything upside down.
    A condition probably present in GCMs, though not at such a magnitude, and the result of an unrealistic simulation.

    Why could this be relevant in this case? Simply because the atmospheric CO2 concentration does not affect biomass, and when it comes to emissions the effect is undefined and most probably of no significance; besides, it does not seem quantifiable in the case of human CO2 emissions, especially when there is no real evidence of their effect on the CO2 concentration, especially with the missing heat.

    Sorry for going on so long with this, and apologies if I am wrong… :)


  14. The fires may be more extreme, but that is mainly due to the spread of non-native cheatgrass across the US. As it spreads it crowds out drought-resistant plants.

    “An NSF-funded study conducted by Balch and other scientists shows that cheatgrass has been involved in a disproportionately large number of fires in the Great Basin, a 600,000-square kilometer area that includes parts of Nevada, Utah, Colorado, California and Oregon. “Over the past decade, cheatgrass fueled the majority of the largest fires, including 39 of the largest 50 fires, even though this species only dominates about 6 percent of the land in the Great Basin,” said Balch. “In addition, cheatgrass burned twice as frequently as any other vegetation”

    “Cheatgrass is highly flammable and densely growing populations provide ample, fine textured fuels that increase fire intensity and often decrease the intervals between fires”

    “The early-season growth habits of cheatgrass provide a competitive advantage by allowing it to grow tall and abundant before native species emerge. During years of high precipitation, this grass can produce more than 10,000 plants per square yard. Cheatgrass turns brown and dies by early summer leaving behind thick, continuous dry fuels and creating extreme wildfire hazards”

    “Though several components can affect flame length and fire spread, a typical cheatgrass fire on flat terrain with wind speeds of 20 miles per hour may generate flame lengths up to eight feet in height; the fire can travel more than four miles per hour. Grass fires are dangerous because they move quickly and grasses act as ladder fuels igniting larger and more volatile vegetation”

    “Scientists suspect that cheatgrass increases the number and severity of fires because it grows in arid lands and dries out before native vegetation does — a continuous carpet of fuel for fires”
    Why is cheatgrass a problem?
    Because cheatgrass quickly develops a large root system in the spring, by the time native grass seedlings start to grow in April or May, cheatgrass has stolen most water out of the top foot of soil. Although mature native grasses can get water from lower soil regions, seedlings cannot get their roots deep enough into soil to access water before drought sets in, and thus, die of thirst. Without this ability to reproduce, native grasses inevitably decline, and so over time, cheatgrass becomes more and more common until eventually it dominates. Cheatgrass also encourages fires. Because it dries up earlier in the year and burns easily, where cheatgrass is abundant wildfires occur earlier and more often, which damage or kill native grasses and make it impossible for sagebrush to grow back after fire. A decrease in sagebrush also means decreased numbers of native wildlife species because many shrub-steppe animals depend on this shrub for food, hiding, cover, or nesting”

  15. Another largely BS meta-study. No new research, just past data of acreages burned and some blind belief in general circulation models that are well known to be unreliable.

    They also pick 1984 to compare with? Why not go back further, to the Tillamook Burn, the Peshtigo fire, and a dozen others? IOW, they are cherry-picking their comparison date.

    Land management has changed fire conditions, or we should say land mismanagement. The FS and BLM have become a second NPS, as their annual cuts have done nothing but decrease through the paralysis of analysis and being arm in arm with the tree huggers.

    Fire size is at best a multi-factor correlation statistic. That means mean annual global temperature is just one factor. Want to bet that for every fire, other factors have a higher R² than mean annual global temperature? In fact, among all factors, I would guess that mean annual global temperature ranks way down the list.
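    That ranking claim can be stated quantitatively with a quick single-factor R² comparison on synthetic data. All factor names and coefficients below are invented for illustration; the exercise only shows how one would rank candidate drivers, not what the real ranking is:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500  # synthetic fire seasons; every number here is invented

# Hypothetical drivers: fuel load dominates, temperature contributes little.
fuel_load = rng.uniform(0, 1, n)
wind = rng.uniform(0, 1, n)
mean_temp = rng.uniform(0, 1, n)
burned = 5.0 * fuel_load + 2.0 * wind + 0.3 * mean_temp + rng.normal(0, 0.5, n)

def r_squared(x, y):
    # Single-factor R^2: the squared Pearson correlation with burned area.
    return np.corrcoef(x, y)[0, 1] ** 2

# Rank the candidate drivers by their individual R^2 against burned area.
ranking = sorted(
    {"fuel_load": r_squared(fuel_load, burned),
     "wind": r_squared(wind, burned),
     "mean_temp": r_squared(mean_temp, burned)}.items(),
    key=lambda kv: kv[1], reverse=True)
```

    With fuel load given most of the weight in the synthetic burned-area formula, temperature lands at the bottom of the ranking, which is the commenter’s bet in miniature.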

    Did the research paper say, “more research is necessary, please send more funds”?

Comments are closed.