From the AGU fall meeting in San Francisco. Personally, I think wildfire risk (especially in the USA) would be better predicted by observing ocean patterns (ENSO, PDO, AMO, etc.) than by trying to apply climate models. Further, it seems they are giving 2012 too much weight in the scheme of things. Also, I had to laugh at this statement:
In contrast with wildfires, agricultural and prescribed fires are less affected by climate, especially drought, during the fire season.
Gosh, “less affected”? How about “not at all”? Maybe they are thinking farmer-forester mind control.
– Anthony
Climate Models Project Increase in U.S. Wildfire Risk
Scientists using NASA satellite data and climate models have projected that drier conditions will likely cause increased fire activity across the United States in coming decades. Other findings about U.S. wildfires, including their carbon emissions and how the length and strength of fire seasons are expected to change under future climate conditions, were also presented Tuesday at the annual meeting of the American Geophysical Union in San Francisco.

Doug Morton of NASA’s Goddard Space Flight Center in Greenbelt, Md., presented the new analysis of future U.S. fire activity. The analysis was based on current fire trends and predicted greenhouse gas emissions.
“Climate models project an increase in fire risk across the U.S. by 2050, based on a trend toward drier conditions that favor fire activity and an increase in the frequency of extreme events,” Morton said.
The analysis by Morton and colleagues used climate projections, prepared for the Fifth Assessment Report of the United Nations Intergovernmental Panel on Climate Change, to examine how dryness, and therefore fire activity, is expected to change.
The researchers calculated results for low and high greenhouse gas emissions scenarios. In both cases, results suggest fire seasons that are longer and more intense across all regions of the U.S. over the next 30-50 years. Specifically, high fire years like 2012 would likely occur two to four times per decade by mid-century, instead of once per decade under current climate conditions.
==============================================================
A Landsat 7 image of the 60,000 acres burned by the High Park wildfire just west of Fort Collins, CO, as of June 18, 2012. The fire, which was started by a lightning strike on June 9, had destroyed 189 homes as of June 19. In the June 18 image, clouds hover just north of the burned area, with smoke from the fire visible as blue. Credit: USGS/NASA
A visualization of cumulative fires from Jan. 1 through Oct. 31, 2012, detected by the MODIS instrument on board the Terra and Aqua satellites. Bright yellow shows areas that are more intense and have a larger area that is actively burning, flaming and/or smoldering. Credit: NASA
==============================================================
Through August of this year, the U.S. burned area topped 2.5 million hectares (6.17 million acres), according to a fire emissions database that incorporates burned-area estimates produced from observations by the Moderate Resolution Imaging Spectroradiometer instruments on NASA’s Aqua and Terra satellites. That is short of the record 3.2 million hectares (7.90 million acres) burned in 2011, but exceeds the area burned during 12 of the 15 years since record keeping began in 1997. This and other satellite records, along with more refined climate and emissions models, are allowing scientists to tease out new information about fire trends.
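As a quick sanity check on the burned-area figures quoted above, the hectare-to-acre conversions can be verified directly. A minimal sketch; the conversion constant (2.47105 acres per hectare) is a standard value, not taken from the article:

```python
# Verify the burned-area conversions quoted in the press release.
ACRES_PER_HECTARE = 2.47105  # standard conversion, not from the article

for hectares_millions, stated_acres_millions in [(2.5, 6.17), (3.2, 7.90)]:
    computed = hectares_millions * ACRES_PER_HECTARE
    # The article's figures agree to within rounding.
    print(f"{hectares_millions} M ha = {computed:.2f} M acres "
          f"(article: {stated_acres_millions} M acres)")
```

Both quoted figures match the standard conversion to within rounding of the underlying (unrounded) hectare totals.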
“Fire is an inherently global phenomenon, and the only practical way to track large-scale patterns and changes in fire activity is with satellites,” said Louis Giglio of the University of Maryland at College Park and Goddard.
As the U.S. land area burned by fire each year has increased significantly in the past 25 years, so too have the emissions. Carbon dioxide emissions from wildfires in the western U.S. have more than doubled since the 1980s, according to Chris Williams of Clark University in Worcester, Mass.
The satellite-based view allowed Williams and his colleagues to quantify how much carbon has been released from fires in the U.S. West. The team used data on fire extent and severity derived from Landsat satellites to calculate how much biomass is burned and killed, and how quickly the associated carbon was released to the atmosphere. The team found carbon emissions from fires have grown from an average of 8 teragrams (8.8 million tons) per year from 1984 to 1995 to an average of 20 teragrams (22 million tons) per year from 1996 to 2008, a 2.4-fold increase in the latter period.
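The teragram-to-ton figures above can also be checked. A minimal sketch, assuming the article's "million tons" are U.S. short tons (1 Tg = 10^6 metric tons; 1 metric ton ≈ 1.10231 short tons — standard constants, not from the article):

```python
# Check the teragram figures quoted for western U.S. fire carbon emissions.
SHORT_TONS_PER_METRIC_TON = 1.10231  # standard constant, not from the article

for teragrams, stated_million_tons in [(8, 8.8), (20, 22)]:
    # 1 Tg = 1e12 g = 1e6 metric tons, so Tg -> millions of short tons
    # is a single multiplication.
    million_short_tons = teragrams * SHORT_TONS_PER_METRIC_TON
    print(f"{teragrams} Tg ≈ {million_short_tons:.1f} million short tons "
          f"(article: {stated_million_tons})")

# Note: the ratio of the rounded averages is 20/8 = 2.5; the stated
# 2.4-fold increase presumably reflects the unrounded annual averages.
```

Both parenthetical figures are consistent with a short-ton interpretation, which supports reading the article's "tons" as U.S. tons rather than metric tonnes.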
“With the climate change forecast for the region, this trend likely will continue as the western U.S. gets warmer and drier on average,” Williams said. “If this comes to pass, we can anticipate increased fire severity and an even greater area burned annually, causing a further rise in the release of carbon dioxide.”
Researchers expect a drier and more wildfire-prone U.S. in future decades. Previous research confirmed the connection between the measure of an environment’s potential evaporation, or dryness, and fire activity.
From a fire and emissions management perspective, wildfires are not the entire U.S. fire story, according to research by Hsiao-Wen Lin of the University of California at Irvine. Satellite data show agricultural and prescribed fires are a significant factor and account for 70 percent of the total number of active fires in the continental U.S. Agricultural fires have increased 30 percent in the last decade.
In contrast with wildfires, agricultural and prescribed fires are less affected by climate, especially drought, during the fire season.
“That means there is greater potential to manage fire emissions, even in a future, drier climate with more wildfires. We need to use cost-benefit analysis to assess whether reductions in agricultural fire emissions — which would benefit public health — would significantly impact crop yields or other ecosystem services,” Lin said.
Related Links:
› Powerpoint slides (in PDF format) from the 2012 AGU Conference briefing
› Video of active fires across the U.S. in 2012
› Link to video in Powerpoint presentation
› Link to Flickr gallery in Powerpoint presentation


Since the models have been falsified by 15+ years of no warming, any study based on those models is worthless. I wonder how much this worthless study cost the taxpayers?
“Fire is an inherently global phenomenon”.
I bet those Antarctic fires are a bloomin’ nuisance.
Clearly a funding-source-driven conclusion. The build-up of ground tinder and the grounding of fire bombers (by the Forest Service) are apparently outweighed by climatic factors in the duration of the season or intensity of wildfires. Ditto the increased presence of more criminal/careless humans in wilderness areas.
Remember the first rule of successful forecasting (career-wise): take the very latest news, pretend it is showing the way ahead, and extend it into the future at will. Everybody will believe you, as last year’s news has already been forgotten.
Fuel load in the forest has increased because evil white men run around putting fires out. The ecologically pure Native Americans set fire to the forests and grasslands on a regular basis because they observed that more fire produced more animals that they could eat.
Nature is going to take care of excess fuel load by beetle kill or fire. Take your pick.
Real forecasters, e.g. WeatherAction.com have never considered CO2 at all, and know that it plays no part in any warming at all, as if it did, even a little, there would be a small ascending signal in the temperature record and other indicators. GHE effect? What GHE effect?
‘“Climate models project an increase in fire risk across the U.S. by 2050, based on a trend toward drier conditions that favor fire activity and an increase in the frequency of extreme events,” Morton said.’
An increase in the frequency of extreme events? Did Morton et al. input that into the climate model, then? The models ‘project’. A clever use of that word, making it sound like “predict”. Sounds to me like good old cart-before-the-horse, telling me that my 96th birthday is going to be a silviculturist’s nightmare. This stuff makes me shudder, but not because I’m afraid of burning to a crisp.
http://www.news.com.au/breaking-news/national/genes-not-people-cause-tas-devil-tumours/story-e6frfku9-1226530712559
It will not be long before the wildfire claim goes the same way as the ‘climate change is killing devils’ BS.
regards
The best predictor of wildfire risk is to measure the amount of dead fuel available on the ground.
Nobody in the government wants to look at this because it would prove that forest management policy has been bass-ackwards in the US for more than 50 years.
It doesn’t matter how hot or how dry or how windy it is. If there is no fuel on the ground a fire can’t spread.
I could have sworn that even the people who write the models have admitted that they are of no use for trying to figure out regional-level changes.
If conditions are drier, wouldn’t there be less stuff growing in the first place, and wouldn’t that put a damper on fires?
lol
“Greenhouse gas emissions” sure go up during a fire, but they don’t cause it.
If only those damned plants and trees didn’t photosynthesize, then there wouldn’t be so much to burn. But more atmospheric CO2 means greater photosynthetic yields, so more fires… so…
and round and round we go…
Speculation based on error prone Nintendo models.
I agree. It’s all bad news from global warming.
Ignore long term observations in a warming world since 1850 and focus on funding driven drivel.
Mark W: That was true in Wyoming this year. Even the cheatgrass didn’t come up. The mountain burned, but that fire, according to the experts, is way overdue. The middle has not burned yet, but it will, whether it’s hot, cold, or the climate stays the same.
I notice there was no mention of job security in all of this. Several years back a fire was started by someone who was tired of fighting fires away from home. With unemployment being what it is, wouldn’t this at least give those firefighters more work? After all, it’s all about jobs, isn’t it?
I agree with many of the comments above that forest fire prediction needs to be based on many factors, one of which might be a tendency toward drier or moister conditions. However, the fuel load factor is most likely of far greater importance. Since the early 1980s we have essentially stopped harvesting timber on federal lands, which make up about three-quarters of the timbered land area in the Western U.S. As a result, the current inventory (per U.S. Forest Service figures) of merchantable timber is over 40% greater than it was in the 1950s, to say nothing of the quantity of non-merchantable timber that is present, often in dense, stagnated conditions. Ironically, we import over $8 billion worth of lumber annually, with some imports coming from as far away as New Zealand and Finland. With the normal multiplier effect, if we were to re-establish our timber industry we could add about $50 billion to our GDP and increase employment, while improving the health of our timber resource. The current mantra is to buy food grown locally; to be consistent, should we not apply the same logic to lumber purchases?
Another irony is that with higher CO2 levels we are seeing a significant increase in the rate of growth across the spectrum of the plant kingdom, so the fuel load problem is only going to get worse.
Attempting to model forest fire frequency under such narrow constraints as the subject study is an exercise in futility.
>>>Climate Models Project Increase in U.S. Wildfire Risk
Err, how does that square with NW Europe having the wettest summer on record? They are making this up as they go along, surely.
When you’re selling eternal damnation, fire is an essential ingredient.
These people are desperate, if they don’t sell something soon, funding will evaporate.
Do these scientists believe in “spontaneous combustion”? I’ve seen no detailed analysis of the cause of wildfires, though I did see an article recently that presented evidence that most fires start close to roads and forest trails. “Joining the dots” has become popular amongst alarmists, so could I suggest a large degree of human involvement? If that’s right it rather puts a large damper on the theory of climate-induced fires, unless we can add arson to the long list of the claimed effects of “global warming” or rather “climate change” or “climate weirding” or whatever the term has morphed into of late.
What frustrating drivel. When I see statements like “since record keeping began in 1997” and gross exaggerations like the “visualization” image, I tend not to lend much credibility to the rest of it. I can assure you there are decent fire records from long before 1997, and the visualization is more of an illusion, using dots scaled to an impossibly large size. Even on the large image the dots are approximately 5 miles across (based on comparison to known landmarks). This is a common trick in mapping to emphasize or de-emphasize your point – change the size of dots, or the thickness of a polygon outline, to tell the visual story you are selling. The visualization makes it appear that a quarter of the continent was on fire this summer, along with all of Pacific Mexico, more than half of Florida, etc.
As Willis said, ‘Models all the way down!’
With the mandating and enforcement of Green idiocy such as not allowing the clearing of fire breaks or second growth from the forest floors, a rise in both spontaneous and maliciously-lit fires is inevitable, regardless of non-winter weather.
“That is short of the record 3.2 million hectares (7.90 million acres) burned in 2011, but exceeds the area burned during 12 of the 15 years since record keeping began in 1997.”
“Carbon dioxide emissions from wildfires in the western U.S. have more than doubled since the 1980s,”
“The team found carbon emissions from fires have grown from an average of 8 teragrams (8.8 million tons) per year from 1984 to 1995 to an average of 20 teragrams (22 million tons) per year from 1996 to 2008, increasing 2.4 times in the latter period.”
If records have only been kept over the last 15 years, I wonder how the team obtained the information in order to make the other two statements.
Bushfires [Wildfires]? It’s all to do with wet years and dry years, and fuel supply. Seems pretty logical.
http://www.fao.org/docrep/ARTICLE/WFC/XII/0278-B1.HTM
“A Study on Forest Fire Occurrence in China
Shu Lifu, Tian Xiaorui and Wang Mingyu
[Wildfire Research Group, Chinese Academy of Forestry, Behind Summer Palace, Beijing, China]
It is suggested that such different annual forest fire variation have some connection with annual atmospheric movement and climatic changes. A year that suffers a severe drought is mostly attacked by severe forest fires while a year is least prone to fires when there is much precipitation, high humidity. On the other hand, the accumulation of flammable materials in the forest also contributes a lot to the annual variation of forest fires. A forest area with coverage of large quantities of flammable materials mostly easily catches fire in a dry year and causes serious disasters.”
http://blog.smu.edu/research/2012/05/15/ancient-tree-ring-records-from-the-southwest-u-s-suggest-todays-megafires-are-atypical/
“The U.S. would not be experiencing massive large-canopy-killing crown fires today if human activities had not begun to suppress the low-severity surface fires that were so common more than a century ago,” said Roos, an assistant professor in the SMU Department of Anthropology.
Today’s extreme droughts caused by climate change probably would not cause megafires if not for a century of livestock grazing and firefighting, which have combined to create more dense forests with accumulated logs and other fuels that now make them more vulnerable than ever to extreme droughts. One answer to today’s megafires might be changes in fire management.
“If anything, what climate change reminds us is that it’s pretty urgent that we deal with the structural problems in the forests. The forests may be equipped to handle the climate change, but not in the condition that they’re currently in. They haven’t been in that condition before,” Roos said.”
[Disappointing that Roos uses the words “climate change” when he clearly doesn’t mean “Climate Change”]
But then the article goes on to get to the core:
“They discovered that the Medieval Warm Period was no different from the Little Ice Age in terms of what drives frequent low-severity surface fires: year-to-year moisture patterns.
“It’s true that global warming is increasing the magnitude of the droughts we’re facing, but droughts were even more severe during the Medieval Warm Period,” Roos said. “It turns out that what’s driving the frequency of surface fires is having a couple wet years that allow grasses to grow continuously across the forest floor and then a dry year in which they can burn. We found a really strong statistical relationship between two or more wet years followed by a dry year, which produced lots of fires.”
When I lived in CA, every April someone would come on TV, and if it had been a dry year, would warn that the upcoming fire season would be worse than normal because of the lack of moisture.
If it had been a wet winter, they’d predict the fire season would be worse than normal because of all the extra vegetation that had grown up.
As a long-distance hiker, I’ve walked through a number of fires – some wild, some prescribed burns – in places like Florida, Montana, New Mexico, and California over the last 20 years. Any assumption that prescribed burns are in any way “controlled” is entirely mistaken.