Wildfires: Separating Demagoguery from the Science

Guest essay by Jim Steele, Director emeritus of the Sierra Nevada Field Campus, San Francisco State University, and author of Landscapes & Cycles: An Environmentalist’s Journey to Climate Skepticism

Published in Range Magazine winter 2017/2018

In 2016 Climate Central, notorious global warming demagogues, published the article Climate Change’s Fingerprints All Over California Wildfires. Ignoring a well-documented history of natural climate change, ignoring the ill-advised 20th-century policy of fire suppression, and ignoring the increased percentage (~80 to 90%) of fires ignited by humans, Climate Central tried to persuade the public that California fires (as well as all recent fires) are “part of a dire global warming-fueled trend toward larger, more frequent and intense wildfires.”

Whether you believe recent warming was natural or caused by rising CO2, warmer temperatures have promoted better growing conditions and that has been good for man and beast. During the cold Little Ice Age tree lines retreated. From the 1400s to the end of the 1800s forests thinned, especially where it was too cold for tree seedlings to establish. Since the beginning of the 20th century that trend reversed, our climate warmed and growing seasons lengthened. Indeed, more warmth can generate more wood for fires. On the other hand, along with improved agricultural efficiency, this more favorable growing climate has allowed us to feed a rapidly growing global population despite Stanford scientist Paul Ehrlich’s dire predictions we would experience mass starvation by the 1970s.

The Fire Suppression Effect

The statistical rise in fires since 1970 is mostly due to changes in fire suppression policies. The debate over the pros and cons of fire has a long history. Native Americans had used fire to promote favored food plants and wildlife. Fire historian Stephen Pyne noted timber owners and ranchers in California promoted the use of prescribed “light burning” in the 1880s to reduce fuels, maintain pastures and reduce the likelihood of larger, more destructive fires. Small natural wildfires also created natural fire breaks and a patchy forest mosaic that reduced a fire’s ability to spread beyond a local patch. Unfortunately, a few terrifying fires led land managers to embark on a policy of complete fire suppression. The Peshtigo, Wisconsin fire of 1871 blackened 1.5 million acres and caused the deaths of 1,500 to 2,500 people. Fires threatened the recently formed Yellowstone National Park in 1886, and the army was called in to fight them.

But by 1996 fire ecologist Thomas Swetnam echoed the growing consensus against fire suppression. He wrote, “The paradox of fire management in conifer forests is that, if in the short term we are effective at reducing fire occurrence below a certain level, then sooner or later catastrophically destructive wildfires will occur. Even the most efficient and technologically advanced firefighting efforts can only forestall this inevitable result. It is clear from many years of study and published works that the thinning action of pre-settlement surface fires maintained open stand conditions and thereby prevented the historically anomalous occurrence of catastrophic crown fires that we are experiencing in today’s Southwestern forests.”

Around the 1970s, some government agencies began adopting “let it burn policies” if human habitat was not threatened. An increasing use of prescribed burns attempted to reduce abnormal fuel loads and restore the natural fire balance. But fire ecologists still “estimated that approximately 3 to 6 times more area must be burned to restore historical fire regimes.” The unnaturally low fire frequencies of the 1980s and 90s can be seen in Figure 5 from a 1999 research paper by Dr. Swetnam. Based on fire scars of old living trees from 64 southwest study sites, fires were 5 to 15 times more numerous and widespread between 1700 and 1880 than during the 1990s. When global warming demagogues argue climate change has now resulted in 5 times more fires than observed in the 1970s, they fail to inform the public this increase is largely due to a shift away from the previous complete fire suppression policy to selectively allowing fires to burn.


Figure from Swetnam (1999) Historical Fire Regime Patterns in the Southwestern United States

Not only were fires naturally more common before “global warming”, earlier fires could be huge. Newspaper articles from Tucson, Arizona reported individual fires that scorched over a million acres before 1890. Wisconsin’s Peshtigo Fire blackened 1.5 million acres in 1871, and over 3 million acres were torched in the Big Blowup (aka the Great Fire of 1910). The largest fire in Canadian history was the Miramichi Fire of 1825, which burned 3 million acres in New Brunswick and extended into the state of Maine. Unfortunately, large fires are more likely today because past fire suppression has caused an unnatural buildup of fuels.

Misuse of Global Average Temperatures

Fires are more likely during droughts. So, demagogues blame a “dire global warming-fueled trend” for increasing droughts and thus fires. But regional temperature trends usually differ from the global average statistic. For example, the western Arctic was cooling in the 80s and early 90s until shifting winds drove thick insulating ice into the warmer Atlantic, allowing stored subsurface heat to more readily ventilate. Arctic temperatures then rose twice as fast as the global average. In contrast, the eastern half of Antarctica has not warmed at all. The misleading use of a global average statistic reminds me of an old joke.

A man got his head stuck in a hot oven. While trying to extricate himself he got his feet stuck in the freezer. Not knowing what to do, his wife summoned a doctor hoping he could ease her husband’s pain. But after a careful examination, the doctor concluded her husband was just fine. On average, his body temperature was perfectly normal.

In contrast to the global average, the southeastern USA has not warmed since 1900. The illustration below is from a 2017 research paper, Timing and Seasonality of the United States ‘Warming Hole’. It shows summer temperatures cooled by about 1°C from 1930 to 1950. While warmth in the northern USA began to recover from 1950 to 1975, the southeast remained cool. Despite some recent warming, as of 2005 temperatures in much of the southeast were still slightly cooler than in 1901.


Figure from Mascioli et al. (2017) Timing and Seasonality of the United States ‘Warming Hole’; Environ. Res. Lett. 12

In 2016, devastating fires burned over 100,000 acres across 7 states of the southeastern USA. PBS NewsHour hyped the fires with the headline “How Big Droughts and Forest Fires Can Become the New Normal in Appalachia”. They interviewed U.S. Forest Service ecologist James Vose, who stated, “It’s very rare to have this many fires burning this amount of area in the Southeast.” But before extensive logging, the Southeast was dominated by the longleaf pine, a fire-adapted tree whose widespread dominance could only be maintained by frequent fires that removed competing vegetation. And with no warming trend since 1900, Appalachia’s “old normal” was likely no different than the “new normal”.

After the USA’s widespread mid-century cooling, California’s average temperature began warming in the 1970s. But as exemplified by temperatures in Yosemite, the trend in maximum temperatures for the northern two thirds of California has declined since the 1930s. Because maximum temperatures are the main determinant of heat stress, it is hard to honestly blame California’s fires on a “dire global warming-fueled trend.”


Data source: US Historical Climate Network

Extreme swings between wet and dry years, driven by El Niños and La Niñas, are exactly what natural climate change predicts. Periodic La Niñas induce droughts that amplify the effects of California’s annual summer drought and cause anomalously high temperatures. El Niños induce greater winter rainfall and more growth, that then serves as fuel for the next dry fire-season.

California’s blue oaks are sensitive to changes in precipitation, and based on their tree rings scientists have reconstructed California’s precipitation anomalies. Negative anomalies indicate less rain and more drought, and positive anomalies indicate heavier rains. The blue star highlights the extreme drought conditions of 2014, and the dashed blue line serves as a reference to 2014. We see that extreme drought conditions, similar to or worse than 2014, happened 3 or 4 times a century. Likewise, there were frequent periods of anomalously high rainfall. Despite 700 years of these natural extreme weather swings, Stanford’s Noah Diffenbaugh blames recent swings on global warming, stating, “This is exactly what state-of-the-art climate models predicted should have happened, and what those models project to intensify in the future as global warming continues.”


Figure from Griffin, D., and K. J. Anchukaitis (2014), How unusual is the 2012–2014 California drought?, Geophys. Res. Lett., 41

Should We Trust Model Projections of Impending Doom?

There is no consensus among climate scientists regarding the effects of increasing CO2 on the strength and frequency of El Niño events. Some models indicate more La Niña-like conditions. Some models indicate more El Niño-like conditions. Tree rings suggest no trend since the 1300s. Most likely, periodic droughts and high fire risks will always be a fact of life, exactly as natural climate change predicts.

It is worth noting the only “evidence” scientists have that the earth’s changing climate has been driven by rising CO2 is based on their models’ failures to simulate 20th-century warming when only “known” natural factors are considered. When increasing CO2 is added, their models can simulate average global warming since the 1970s. But their models fail to simulate earlier oscillating weather patterns. So, there is a high likelihood climate models have failed to incorporate some critical natural factors affecting climate change. For example, the natural Pacific Decadal Oscillation (PDO) results in 20- to 30-year periods of more frequent El Niños, which alternate with periods of more La Niñas. The negative phase of the PDO amplifies the impacts of La Niña droughts and increases the risk of fires from California to the Colorado Rockies to the southeastern USA. Yet the PDO was not even named until 1997 and is still not accurately incorporated into global climate models.

In 2014, the scientist who discovered the PDO co-authored a research paper demonstrating how the PDO explained observed climate swings along much of west coast North America. The impact of the PDO was highly significant but contributions from greenhouse gases were insignificant.

As illustrated below in a graph highlighted in a past National Climate Assessment, CO2-driven climate models failed to replicate the extent and severity of observed droughts since 1900. The numbers on the left axis represent the proportion of the USA and Mexico that was in drought. The red and black lines represent actual observations. During the Dust Bowl years, 20% to 35% of the USA and Mexico were in extreme drought. Gray lines represent the scatter of individual models. The blue line represents averaged model results, which project that as CO2 rises we’ll experience growing widespread catastrophic droughts in the 21st century. That catastrophic projection is what the media hypes. But should we trust dire future predictions from models that totally failed to simulate the extreme droughts of the 20th century? Would you trust a doctor’s diagnosis if he failed to correctly diagnose his previous patients?


Figure from Wehner et al. (2011) Projections of Future Drought in the Continental United States and Mexico, Journal of Hydrometeorology

December 3, 2017 9:05 pm

A good review of forest fires in general, and in connection with global warming.

Nick Stokes
December 3, 2017 9:15 pm

“As illustrated below in a graph highlighted in a past National Climate Assessment, CO2 driven climate models failed to replicate the extent and severity of observed droughts since 1900.”
There is no indication that “models” failed to replicate. What is shown is a model average. Models exhibit weather extremes, but they aren’t synchronised. So they even out in the average.
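Stokes’ point, that individual runs show extremes but an unsynchronised ensemble averages them away, can be illustrated with a toy sketch (hypothetical sine-wave “models”, not actual climate output):

```python
import math
import random

random.seed(0)

# 20 toy "models": the same drought cycle, each with a random phase offset
n_models, n_years = 20, 100
runs = []
for _ in range(n_models):
    phase = random.uniform(0, 2 * math.pi)
    runs.append([math.sin(2 * math.pi * t / 25 + phase) for t in range(n_years)])

# Average across models, year by year
ensemble_mean = [sum(r[t] for r in runs) / n_models for t in range(n_years)]

# Every run swings the full -1..1 range; the unsynchronised average is far flatter
print(max(abs(v) for v in runs[0]))        # close to 1.0
print(max(abs(v) for v in ensemble_mean))  # much smaller
```

The cancellation is why an ensemble mean can look smooth even when every member produces Dust Bowl-scale swings; by itself it says nothing about whether any member got the timing right.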

Reply to  Nick Stokes
December 3, 2017 10:09 pm

And the average shows no 20th century drought, and thus fails the reality check. The reliability of a model is its ability to hindcast past climate, and these models failed.

Crispin in Waterloo but really in Beijing
Reply to  Jim Steele
December 4, 2017 5:34 am

What they ought to produce is a projection of forest fires. The trend is down as the climate warmed, and that should have continued down with additional warming.

What they have is a forecast of droughts. It is not necessary to have a drought to have a forest fire. Further, the dropping temperature in many regions of the USA (not just the SE which has been cooling for a century) is an indicator of a lowering risk.

I am not impressed by the forecast. It looks wrong, and I do not trust ‘ensembles’ anyway. To date nothing forecast by ensembles has come close to predicting temperatures 5 years out. Respect has to be earned.

Reply to  Nick Stokes
December 4, 2017 6:28 am

One model runs way too hot. Another runs way too cold. Each individually is garbage, but averaged together they are perfect.

Reply to  MarkW
December 4, 2017 10:07 am

The North Atlantic cod are the perfect example. The Canadian Government had their sampling estimates of population, and they also estimated numbers from catch. They then derived a magicky average sort of number. Both estimates turned out to be high, and the cod were managed into oblivion. Nothing can be done with bad projections.

Jim Gorman
Reply to  Nick Stokes
December 4, 2017 7:18 am

This is a statement from a mathematician, not a scientist. Taking the average of models whose output is incorrect can not, I repeat, CAN NOT possibly give a correct answer except through coincidence. To imply the average is a correct solution to a real world problem has me doubled over in laughter.

Read the analogy again about the doctor and the farmer with his head in the oven and his feet in the freezer. That is exactly what you are saying.

Reply to  Nick Stokes
December 4, 2017 8:26 am

Stokes: “There is no indication that “models” failed to replicate.”
Nonsense! The facts that there are many (!) models, that they aren’t synchronised, that they even out in the average, and that they produce a 20th-century zero-line average totally inconsistent with real data are precisely indications that the models failed miserably. If not, then nothing is.

Clyde Spencer
Reply to  Frederic
December 4, 2017 9:17 am

The annual standard deviation of the ensembles should provide a good proxy for the quality of the ‘projections.’ That is, a small SD would indicate good agreement between the individual models. Unfortunately, it is obvious that the SD is quite large, despite claims of high precision in the ensemble average.
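Spencer’s suggested check, using the ensemble spread as a proxy for agreement, is a one-liner with the standard library. The numbers here are invented for illustration, not taken from any assessment:

```python
import statistics

# Hypothetical per-model drought-fraction projections for a single year
projections = [0.05, 0.30, 0.12, 0.45, 0.08, 0.22]

mean = statistics.mean(projections)   # the headline ensemble average
sd = statistics.stdev(projections)    # sample SD, the proposed quality proxy

print(f"ensemble mean: {mean:.3f}")
print(f"ensemble SD:   {sd:.3f}")  # SD comparable to the mean => poor agreement
```

When the SD is the same order as the mean itself, as it is here, the ensemble average carries far less precision than a single smooth blue line suggests.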

Reply to  Nick Stokes
December 4, 2017 9:19 am

If the model is an average and models even out in the average, shouldn’t the model reflect reality then?

December 3, 2017 9:18 pm

Regarding the incidence of forest fires, one major issue not mentioned in the article is the correlation between fuel load and the number of forested acres burned annually. While the U.S. Forest Service does little in the way of administering timber sales, they do conduct a complete inventory on our nation’s timber resources, both public and private, every decade. Their inventories show we have 57% more standing timber today than we had in 1953. We essentially quit harvesting on federal lands in the late 1970s, but the trees continued to grow. As a result, many areas are facing stagnation and the resultant insect and disease infestations. While the number of fires varies greatly from year to year, the trendline on acreage burned correlates very closely to the increase in timber volume.

Lesson: We need to employ sound silvicultural practices, such as selective harvests and thinning to both improve the health of our forests and reduce the number and intensity of forest fires.

Gerald Landry
Reply to  drhealy
December 3, 2017 11:08 pm

drhealy, maybe it’s time to promote more solid-timber home building instead of stick framing with layers of man-made material, e.g. strand-board sheathing, plastic wrap, insulation, vapour barrier and drywall, exterior vinyl siding, etc.
In Northern Ontario there have been 12 pulp and paper mill closures. Although the fir softwood resources may be diminished, there is the aspen poplar, which matures in 50 years or less and starts falling over; it has to be removed. So many overpriced log home packages, and lower-cost square-timber beveled-edge packages, are offered with tree species of lesser value.
I was caught with a very large stand of black poplar, and the only buyers were in northern Minnesota, buying in Canada near the border; for my stand that meant 3 extra hours of trucking. Finally a pulp mill 75 minutes away bought the stand chipped. That was a relief, because it was disheartening watching mature regrowth from a forest fire in the early ’50s over-mature and flop over by the year 2000.

Javert Chip
December 3, 2017 9:48 pm

Obvious & stunning intellectual dishonesty.

David A
Reply to  Javert Chip
December 3, 2017 11:04 pm


Javert Chip
Reply to  David A
December 4, 2017 7:48 am

David A

I probably should have said “More” obvious & stunning …

Most alarmist material is unworthy of serious consideration (except that it continues to poison the minds of climate evangelicals),

The “wildfire” paper is simply beneath contempt.

Reply to  Javert Chip
December 4, 2017 1:38 am

We discuss the stunning intellectual dishonesty of alarmists here every day, wasn’t that obvious?

Stephen Skinner
Reply to  ClimateOtter
December 4, 2017 2:59 am

The last graphic “Figure from Wehner et. al. (2011) ” is astonishing alarmism worthy of the best B Movies.

Extreme Hiatus
December 3, 2017 10:29 pm

Step further back in time and indigenous people were managing their landscapes with fire. In places like California, reducing fire danger was one of many benefits of regular burning. The first Spanish mariners saw fires all along the southern coast there.

Smokey the Bear-style fire suppression, which allows the buildup of fuels, is the real problem. No fuel, no fire, no matter how hot and dry it is. But too much fuel plus hot and dry conditions makes fires much bigger.

Extreme Hiatus
Reply to  Extreme Hiatus
December 3, 2017 10:31 pm

Bigger and more intense.

December 3, 2017 10:40 pm

Why are there still multiple climate models? If the science is settled, then surely there must be a consensus model, not simply an average?

David A
Reply to  Robber
December 3, 2017 11:08 pm

Because if only the best 3 models were kept, then the funded research based on scary but wrong CAGW projections of warming (the model mean and worse) would not be justified.

In other words the scary models justify the money train into peer review scare stories.

Reply to  David A
December 4, 2017 1:33 pm

And, the least scary models can be used to justify claims that the models in toto have “forecast” the possibility of extended periods of little or no warming.

Reply to  Robber
December 4, 2017 2:51 am

ENSO models exemplify the problem. There are 25 different ENSO models reported at IRI.

If over a period of 1 year you choose the 6 best performing models, and you follow them over the next year, they have a similar chance of becoming a second year “6 best performing models” as the other 19, so most of them stop being a best performing model. I have done the test.

The conclusion is clear: It is only by chance that ENSO models are good at predicting ENSO a few months into the future, and that is why all of them have to be kept, so at any time some of them are predicting correctly. If we just use the best from the past and discard the rest, we will soon discover they don’t predict anything.

I believe the situation is no different for general climate models.
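Javier’s test, selecting the best-performing models one year and checking whether they stay on top the next, can be sketched as a simulation under the null hypothesis that skill is pure chance (toy random scores, not actual IRI model output):

```python
import random

random.seed(1)

N_MODELS, TOP = 25, 6
trials = 2000
overlap_total = 0

for _ in range(trials):
    # Skill scores in two consecutive years, drawn independently,
    # i.e. performance is pure chance with no persistent skill
    year1 = [random.random() for _ in range(N_MODELS)]
    year2 = [random.random() for _ in range(N_MODELS)]
    best1 = set(sorted(range(N_MODELS), key=lambda i: year1[i])[-TOP:])
    best2 = set(sorted(range(N_MODELS), key=lambda i: year2[i])[-TOP:])
    overlap_total += len(best1 & best2)

# Under pure chance, expect TOP*TOP/N_MODELS = 6*6/25 = 1.44 repeats on average
print(overlap_total / trials)
```

If the observed persistence of “best” ENSO models is no better than roughly 1.4 out of 6, that matches the no-skill null; substantially more repeats would be evidence of real predictive skill.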

Nick Stokes
Reply to  Javier
December 4, 2017 3:02 am

“I believe the situation is no different for general climate models.”
Yes, that’s true. Climate models predict climate. They can’t predict weather (beyond a few days), which includes decadal stuff (though people are now trying). And yet people want to sift them on the basis of how well, for example, they predicted the “pause”. In fact, all models predict periods like this, but they are not synchronised. They don’t get the timing right. This follows from the fact that their evolution is virtually unrelated to the initial condition, which is the only information that could fix the timing. So the ones that did happen to show a pause at the right time did so by chance; an achievement that could not be expected in the future.

Reply to  Nick Stokes
December 4, 2017 3:29 am

Climate models predict climate.

Climate models do not predict climate. They produce a numerical output according to their programming that when compared with climate evolution fails miserably.

The CMIP5 model ensemble, closed and finished in 2010-2011, has been running in the future since 2006. Almost immediately it was unable to follow SAT, which fell out of its 25-75% range, while its ability to keep past SAT within that range was notable (or notably adjusted). SAT has only been within the CMIP5 ensemble during the peak of the very strong El Niño. Does CMIP5 believe we are in permanent peak strong El Niño conditions?

As you say, the model ensemble should not reproduce the variability, but it should reproduce the trend. However, the fall in SAT over the past 21 months leaves very little hope (for modelers) that the trends are even close.

There is a significant probability that by 2020-2025 SAT could be well outside the 5-95% range. That could very well have been the current situation if it wasn’t for El Niño.

The models are simply wrong. They run too hot. They predict a lot more warming than we are observing and that we are going to observe.

Reply to  Javier
December 4, 2017 4:24 am

Whenever I buy a lottery ticket, I always get one that has winning numbers… eventually. Timing is always my big problem.

Reply to  Javier
December 4, 2017 7:27 am

Only in climate science does the average of false = true.

Nick Stokes
Reply to  Javier
December 4, 2017 6:15 pm

“That could very well have been the current situation if it wasn’t for El Niño.”
Climate is chaotic. ENSO events happen at unpredictable times. Since 2006 there were 2 big Niñas and one big Niño, which had the effect shown, exceeding the CMIP average near the end. Models have those events too, but at unsynchronised times, which means they don’t show in the ensemble average. If there is another decade dominated by La Niña, the time series might well drop below the limit. If there isn’t, probably not.

Reply to  Javier
December 5, 2017 3:50 am

1. An El Niño is a temporally limited hot event. Clearly the average output of many models will not show it. So it should be quite obvious that if it is only during an El Niño that the observed temperature series actually reaches the median/average of the model runs (e.g. 1998 and 2016), then the models are running hot.

2. If you make many stochastic and/or chaotic model runs and then average them (or indeed average multiple different models), what you are left with is the underlying non-stationary assumption of the models, nothing more nor less. That is the nature of averaging. You see what the modellers have coded as their fundamental assumption about the long-term behaviour of climate. What they have assumed is a close-to-linear increase of temperature due to CO2.

If the average or median result of (2) only agrees with observations (reality) when we are in a warm spike (1) then it is quite apparent the models run hot. No amount of arm waving is going to alter that fact.

Solomon Green
Reply to  Robber
December 4, 2017 5:52 am

For the same reason that most investment firms run numerous mutual funds. So that there is a better chance of one or more appearing to outperform the relevant index. The lucky ones can then be advertised. If and when some have failed to perform for several years in a row, they can be quietly dropped.

Unfortunately climate models can be run for very many years before the *****s who produce them can be convinced that their model will never produce a successful long term forecast.

If he follows me down to Hades, Nick Stokes and I could still be debating this in 100 years’ time.

Javert Chip
Reply to  Solomon Green
December 4, 2017 9:26 am


I understand and generally agree with your point about climate models.

However, there are fundamental difference between x>70 poorly-understood climate models and 9,500 USA mutual funds:
1) All climate models claim to model the same phenomena (climate); the 9,500 mutual funds track 9,500 different things (no two are intended to be the same)
2) Climate models are poorly understood; each of the 9,500 funds has a prospectus uniquely defining the set of “things” and how they are tracked (car makers, tire mags, gold mines, etc, etc)
3) Climate models claim not to track short-term (weather) and forecast long-term (climate); mutual funds track short-term and do not forecast long-term
4) Climate models use manipulated data – nobody even tries to reconcile modeled and actual results (funds use clean data, and results are tracked daily)

Fund performance may be unacceptable for at least two reasons: the fund does not accurately track the set of defined “things”, or the underlying set of “things” is accurately tracked, but financial performance of the “things” is not acceptable. Either way, funds die.

Bottom line: Mutual funds die when they fail to perform as measured against reality; climate models don’t.

December 3, 2017 11:00 pm

The latest $ multibillion dollar fire in Sonoma was started (on the windiest 24hr of the year) by a homeless Mexican … of dubious legal status. A guest worker here in the Wine Country … “doing the jobs that Americans JUST WON’T DO” … like start disastrous conflagrations. Global Warming my ass. More like open borders and the 3rd-worldization of America. BTW … whatever happened to this fine unemployed firestarter ? I haven’t heard word ONE in the news about him since the fire(s)

David A
Reply to  kenji
December 3, 2017 11:09 pm

A damn good question???

Reply to  kenji
December 4, 2017 7:40 am

December 3, 2017 11:12 pm

The Black Thursday bushfires of 1851 were the largest Australian bushfires ever recorded, caused by an intense drought that occurred throughout 1850, a year of massive heatwaves. On 6 February 1851, a furnace-like wind came down from the north, gaining force and speed. It is believed that the disaster began in the Plenty Ranges when a couple of bullock drivers left logs burning unattended, setting fire to long, dry grass affected by the drought. This had been the second hot and dry year in a row.
The primary cause of catastrophic bushfires during this period lay in poor understanding of local fire regimes and in inappropriate management by new settlers. Aboriginal people had managed these areas for tens of thousands of years, using fire-stick farming to clear out fuel build up and maintain tracts of open land and hunting grounds.
The Black Thursday bushfires burnt five million hectares (twelve million acres), or a quarter of the State of Victoria. Twelve people’s lives were lost, along with one million sheep, thousands of cattle and countless native animals.
By comparison, modern bushfires look like very small beer.

Nick Stokes
Reply to  ntesdorf
December 4, 2017 1:45 am

“the largest Australian bushfires ever recorded”
They were barely recorded. These claims are extremely dubious, and seem to be based on a rather hysterical report by a journalist on the day, making claims that the communications of the time could not possibly support. In 1851 large parts of Victoria were still unexplored, and very little settled. There were no railways and virtually no roads outside Melbourne and Geelong. Here is an 1849 map of land that had been opened up for settlement. It is a tiny fraction of the State.

I don’t know of any Plenty Range.

Reply to  Nick Stokes
December 4, 2017 3:11 am

You may be right about the scale of the 1851 fire being poorly documented. But there was a large scale fire on Feb 6th 1851, coinciding with the hottest day on Melbourne’s thermometer record.

FATAL AND DESTRUCTIVE BUSH FIRE — Intelligence reached town yesterday morning of a most destructive bush fire that had been raging on the previous day, at the Plenty River. On the station formerly known as Anderson’s Station, between the Plenty River and Diamond Creek, the destruction was very great, and it is stated that a poor woman, wife of a shepherd named McClelland, was, with five children, suffocated in a hut from the smoke of the fire which raged around them, and left no means of escape. The Coroner has been made aware of this fact, and has appointed to-day to hold an inquest on the bodies at the Bridge Inn, Plenty River. Eight or ten farms in the neighbourhood have been entirely destroyed, stacks, buildings, fences, everything; whilst several men are missing, and fears are entertained that they have perished. A possibility, however, exists of their having saved themselves by timely flight.

THE WEATHER. — Thursday was one of the most oppressively hot days we have experienced for some years. In the early morning the atmosphere was perfectly scorching, and at eleven o’clock the thermometer stood as high as 117° in the shade; at one o’clock it had fallen to 109°, and at four in the afternoon was up to 113°.
The blasts of air were so impregnated with smoke and heat, that the lungs seemed absolutely to collapse under their withering influence ; the murkiness of the atmosphere was so great that the roads were actually bright by contrast. The usual unpleasantness of hot wind was considerably aggravated by the existence of extensive Bush fires to the northward, said by some to have an extent of 40 or 50 miles. In the evening, after an hour’s battle for the supremacy, the cool breeze from the sweet south came down, sweeping away the pestilential exhalation of the day, and bringing in its train a light and refreshing rain; for a considerable time yesterday, the parched earth greedily absorbed it as it fell, but a day’s continuance of such very seasonable weather, will do no more than cool the surface.

Leo Smith
Reply to  Nick Stokes
December 4, 2017 3:22 am

Well that’s better than one bristlecone pine tree, anyway.

Nick Stokes
Reply to  Nick Stokes
December 4, 2017 3:28 am

Yes, but a few things to note. One is that the story was in second place in the paper, after a report of a meeting of the Industrial Society. Second, it was more realistic in scale; it describes a fire in a region of what is now just a few Melbourne suburbs. It would be a small fire in comparison with those of 1939, 1983 or 2009. Even the rumored 40 or 50 miles is not huge. And it predates the Melbourne instrumental temperature record (from 1856).

Samuel C Cogar
Reply to  Nick Stokes
December 4, 2017 5:03 am

December 4, 2017 at 3:28 am

Second, it was more realistic in scale; it describes a fire in a region of what is now just a few Melbourne suburbs.

Nick Stokes,

was it your Climate Modeling Program that told you that the aforesaid 1851 bush fire was limited to an area between the Plenty River and Diamond Creek (Melbourne suburbs)….. and that no other areas within a thousand (1,000) miles were affected by that roaring inferno?

HA, and they have been saying that your Climate Modeling Program doesn’t have good “hindsight”.

You shur nuff proved them wrong about that.

Reply to  Nick Stokes
December 4, 2017 8:17 am

In Western Australia we just had the Ferguson inquiry into last year’s fatal bushfires. The problem has nothing to do with climate change and everything to do with how suburbia is sprawling into the environment. Bushlands that are left near suburbia have high fuel loads and are extremely inaccessible, which is why they are still bush land. The fire services between rural and city were disconnected, as fire responsibility was largely geographically based.

The full 264 pages of the report are at the bottom of the link

Javert Chip
Reply to  Nick Stokes
December 4, 2017 9:30 am


Your comment regarding Australian fires “…These claims are extremely dubious, and seem to be based on a rather hysterical report of a journalist on the day, making claims that the communications of the time could not possibly support…” sounds exactly like CAGW.

Nick Stokes
Reply to  Nick Stokes
December 4, 2017 2:32 pm

“that told you that the aforesaid 1851 bush fire was limited to an area”
No, that is a description of the report quoted by Khwarizmi. In fact, there were other reports, from areas around Geelong, Portland, Westernport, Horsham. But the areas described are a small fraction of the state, although a fairly large part of the settled area. A reasonable interpretation is that it was a very hot day, and fires broke out where people were, in ways indicated by the Plenty River report. As to what happened in the vast area beyond settlement, who knows?

Samuel C Cogar
Reply to  Nick Stokes
December 5, 2017 4:18 am

Nick Stokes – December 4, 2017 at 2:32 pm

[quoting Sam C] “that told you that the aforesaid 1851 bush fire was limited to an area”

No, that is a description of the report quoted by Khwarizmi.

That is exactly right, …… Nick Stokes, ….. but that is not what you FACTUALLY ATTESTED TO in your above post ….. and the very reason I jumped your case for doing so, …. to wit, what you previously stated:

It would be a small fire in comparison with those of 1939, 1983 or 2009.

There is no way in ell you could possibly know that, given the historical accounts presented hereon.

John F. Hultquist
December 3, 2017 11:13 pm

We’ve been to the multimedia production called

the Era of Megafires

Paul Hessburg, Ph.D., is a Research Ecologist with the Pacific Northwest Research Station, U.S. Forest Service. He has been studying historical & modern era forests of the Inland West for the last 32 years, …

This is not about cAGW, but in an open presentation with several different speakers and audience participation, there was a small leakage of AGW at the time we went. I suspect it varies.

The bottom line is there will be very big fires in our future and AGW has nothing to do with it.
Same bottom line as this post.

Thanks Jim Steele.

December 3, 2017 11:26 pm

Long as I can remember for California – they had two seasons – and fire season was one of them. The other one involved landslides. If the climate models failed to replicate the droughts of the 60’s and 70’s, they’re really worthless.

Steve Case
December 4, 2017 12:15 am

[chart: annual acreage burned]

Reply to  Steve Case
December 4, 2017 2:47 pm

Do you – or does any other reader here – have a handle on the number of fires/big fires/catastrophic fires over a similar period?
If so, it would be interesting to see that on the ‘same’ graph.
I do wonder if there is an inverse relationship between the acreage burned [in your graph], and the number of bad fires.

I wonder . . . . . . .


Don K
December 4, 2017 12:40 am

Excellent article as usual, Jim. One thing though. One needs to be careful when comparing forest conditions in Eastern and Western North America. In the West, fires mostly occur in largely coniferous forests in the Summer and Autumn during the annual extended periods of low to no rainfall. In the East, they occur in mostly mixed hardwood forests in the Spring during erratic periods of low rainfall. They become uncommon when tropical humidity arrives in the early Summer. Also, the Eastern forests are almost entirely second growth. As I’m sure you can appreciate, second growth trees are fairly tightly packed compared to the climax forests they replace. Or so I’m told. The original forests seem to be entirely gone although here and there one comes across huge old trees that were somehow not cut during the logging era.

The new trees quickly reach full height of 20 meters or so. But the trunks are pretty thin. Windstorms take care of gradually thinning the tree population. But the forest floors quickly become covered with fallen trees that make excellent tinder if the normally frequent rain fails to keep things soaked. Even in our normally damp climate, it takes decades for the fallen trunks to decay. A few of us who walk regularly in the patch of woods across from my house spend time every year clearing deadfall trees from the informal trails through the woods.

Jeff Alberts
Reply to  Don K
December 4, 2017 5:59 pm

The governor of Washington, Jay Inslee, seemingly blames all the Washington forest fires on “climate change”. Apparently he doesn’t know the state’s history, and how it became the Evergreen State.

December 4, 2017 1:18 am

Every creature on this planet has evolved to survive a wide range of environmental conditions, such that we are nowhere near any sort of limit. And humans are the most adaptable of all.

Leo Smith
December 4, 2017 3:34 am

I actually think there is. Compared with some years ago when every newspaper ran an almost daily advertorial from some renewable energy press release and in the UK the Daily Telegraph sported two environmental journalists, now both are sacked as renewable energy isn’t paying for their salaries.

To be sure the casual name-dropping of ‘climate change’ continues in the state-funded and EU-funded BBC, but really the push is all about Brexit, and more conventional political bun fights, whilst at grass roots level, people are more concerned about the vast numbers of Islamic immigrants roaming the streets aware that any misdeeds they commit will pass unpunished as soon as they play the religioso-ethnic trump card.

And in a sense, that is linked to climate change oddly enough, as it is becoming clear that the vast sums of money down to Arab Oil are not just constructing palaces, or buying racehorses, but are being used to buy active propaganda to promote Islamic migration and culture in Europe.

General de Gaulle built France’s nuclear reactors to ensure France would never be beholden to Arabs, whom he detested.

In the same way fracking is also to be encouraged.

The new bête noire is Russia, and North Korea, ‘right wing neo fascist populism’ (that’s us folks!) as the politicians struggle to respond to their new puppet masters.

December 4, 2017 3:36 am

Nearly all the problems of the natural world, including fires, relate mostly to anthropogenic causes. Just not anthropogenic climate change. We need to change our ways even if we are not responsible for changing climate.

[??? .mod]

Reply to  Javier
December 4, 2017 4:47 am

Man is part of the natural world, and in fact, the dominant species.

Reply to  icisil
December 4, 2017 11:16 am

Agreed. How “scientists” who profess to believe in Evolution can suddenly say humans are a threat to the planet is beyond me. Survival of the fittest? That’s humans. Apex predator? That’s humans. How did we get to be a “parasite” on the planet? How can evolution “make a mistake”? Please, someone explain this to me. It makes no sense whatsoever and never has.

Reply to  Javier
December 4, 2017 5:02 am

Right, but we can’t afford to let those changes be dictated by uninformed guesswork.

Samuel C Cogar
December 4, 2017 4:28 am

Excerpted from published commentary:

Unfortunately, large fires are more likely today because past fire suppression has caused an unnatural build up (of) fuels.

That claim of “an unnatural buildup of fuels” on the forest floors and elsewhere can’t possibly be true, simply because 97% of all climate scientists and millions of their minions have been attesting to a CAGW “fact”: that all that yearly deposition of dead biomass on the forest floors and elsewhere rots away during the wintertime, which causes the “wintertime” INCREASE in atmospheric CO2 ppm as denoted on the Keeling Curve CO2 graph.

Yours truly, …. Eritas Rabuf

December 4, 2017 5:34 am

Here is something interesting about the Longleaf Pine. There are many pine trees in the southeastern US. The British used to cut down these trees to make tar for their ships. You will find many towns named after pines and the Tar River. The colonists cut down the pine trees, floated them down the Tar River toward the ocean, and used the trees to make pine tar. The Longleaf Pine needs fire to exist. For its first few years in life, it grows very very slow. But it is virtually fireproof at this stage. Other trees do not grow slow. So by the time the Longleaf Pine starts to grow in earnest, it has already lost the race for sunlight. But if there was a fire, it would win. After those slow years, it grows quickly. With a fire, other trees must start over, but not the Longleaf Pine.

Because of fire suppression, you do not see many Longleaf Pine trees anymore. There are many Loblolly Pines, very tall and very skinny pines with very deep tap roots. Loblolly Pines are perfect for hurricane weather because of their deep taproot. They may snap in a hurricane but they don’t blow over. Another interesting fact for those who claim global warming causes hurricanes. Why does the Loblolly have a deep taproot if powerful hurricanes are new?

Jeff Alberts
Reply to  alexwade
December 4, 2017 6:02 pm

“Here is something interesting about the Longleaf Pine.”

All I know is my wife makes baskets from the needles.

December 4, 2017 6:51 am

[chart: U.S. temperature history]

Well, there’s that familiar temperature profile again: The one that shows the 1930’s to be as hot or hotter than subsequent years. This same temperature profile can be found in unaltered temperature charts from all around the world, yet the Climate Gurus trot out the bogus, bastardized Hockey Stick chart, which eliminates the 1930’s heat, and they want us to believe that things are getting hotter and hotter every year, when the truth is the current day is cooler than the 1930’s.

It is true that temperatures have warmed, but only since the 1980’s, and the warmth has not surpassed the warmth of the 1930’s, so all in all, we have been in a temperature downtrend since the 1930’s and have a long way to go to break that downtrend, despite the claims of the Climate Gurus.

The warmth we have been experiencing is a temporary warmth that has a short history from 1980 to the present. We had similar warmth, of the same magnitude, from the period from 1910 to 1940, yet after 1940, we had cooling through the 1970’s. Going by that pattern, we should start experiencing a temperature decline for the next few decades, just like occurred during the 1940 to 1980 timeframe.

Any temperature/anomaly chart you see that does not show the 1930’s as hotter than subsequent years is a bogus, bastardized Hockey Stick chart that was concocted by Alarmists to sell the CAGW narrative. Like the one below:
[chart: hockey-stick temperature anomaly]

December 4, 2017 7:09 am

Although the PDO was not named until 1997, I remember a meteorologist, probably at Accu-Weather, being aware of it much sooner. He predicted a warming due to a Pacific shift that warmed the subtropical and midlatitude eastern Pacific that started in 1977.

Reply to  Donald L. Klipstein
December 4, 2017 8:00 am

Indeed, an ocean regime shift was recognized in 1977; it just wasn’t understood to be a multidecadal oscillation until later.

December 4, 2017 8:26 am

This is a pretty stupid issue to blame on global warming. What it will do is call attention to the temperature readings of the areas affected. Those areas are forests, far away from the Urban Heat Island Effect, and will most likely show cooling due to the increased density of the forests. The real questions that need to be asked are: 1) Don’t we have the ability to thin out those forests to prevent catastrophic fires? and 2) If we have the ability, why aren’t we doing it? Ask those two questions and you get at the root of the problem.

December 4, 2017 8:40 am

You have neglected the “religiously believe” part of the eco-religious fanatics.

Here, near Stump Town (Portland Oregon), the California Urban immigrants have
elected clueless wonders who have decreed that there
…a) shall be “wilderness areas”
……one of which “crawled” down the mountain a mile or so a couple years ago
……to be about 200 yards of steep, flammable, fuel laden monoculture
….. from the 1906 Faubion homestead house at the bottom of the hill.
..b) the USFS “shall not touch the forest – it shall be ‘natural’ forever and ever”

The result is that the fuel piles up in the monoculture, lamented root rot forest,
…. left over from the fires of 1880s and 90’s.

Look in the news about how the Columbia River Gorge “burns”
…and, of course, it is somebody’s fault
…for playing in the woods and setting fire to the stacked up fuel.

The “Bull Run Wilderness” is required to have no one in it.
according to the Stump Town clueless wonders
…so the “water is pure and clean and unfiltered and wonderful.”

Of course, the overly dense replants from the 1880’s fires will burn
…but it will be “somebody else’s fault”
Just ask the USFS, they are frustrated by the ignorance and cluelessness.

December 4, 2017 9:04 am

I remember living in Oregon in the late 50’s with my dad saying, you have not seen a forest fire until you have seen the “Tillamook Burn” in 1933: https://oregonencyclopedia.org/articles/tillamook_burn/#.WiV8feRe4aE. It was “forever” burned into his memory as something to avoid.

December 4, 2017 9:25 am

While temperature may have some effect on wildfires, fuel and wind have far more. We have “red flag warnings” in December in Wyoming. There’s plenty of fuel and plenty of wind. The fact that it’s cold out is not a major thing.

This is for Montana (quickest reference I could find on winter red flag warnings):

You don’t need 90°F heat to have fire danger. Warming is NOT essential, not at all.

December 4, 2017 12:27 pm

A bit of Demagoguery from the other side: (First of December) “It’s snowing in London. The city is hysterical. Panic buy milk. Panic buy huskies. Panic buy snow booze. The city is doomed.”
Snow is falling at the rate of 4 snowflakes per minute. Londoners are urged to stay calm, build sledges and carry on.

December 4, 2017 2:42 pm

Reblogged this on WeatherAction News and commented:
Not only were fires naturally more common before “global warming”, earlier fires could be huge. Newspaper articles from Tucson, Arizona reported individual fires that scorched over a million acres before 1890. Wisconsin’s Peshtigo Fire blackened 1.5 million acres in 1871 and over 3 million acres were torched in the Big Blowup (aka Great Fire of 1910). The largest fire in Canadian history was the Miramichi Fire of 1825 that burned 3 million acres in New Brunswick and extended into the state of Maine. Unfortunately, large fires are more likely today because past fire suppression has caused an unnatural build up of fuels.
Another great essay from Jim Steele

Reply to  craigm350
December 4, 2017 7:30 pm


December 4, 2017 3:08 pm

Once upon a time in North Eastern Europe, not too long ago, there were old men and women that made their living by gathering fallen wood in the forests. This was the time before easy energy from gas coal and electricity. Their efforts kept the homes heated, the stoves hot and the forest floor with a light fuel load. Their efforts also helped the wildlife have forage on the forest floor. The old men and women that gathered kindling and fallen wood are now all gone. The forests have a heavy fuel load, and are poorly managed by ideologues. What could possibly go wrong.
Of course the bureaucratic solution is to clear cut the old forests, sell the wood for fuel, and take their cut.

December 5, 2017 10:37 am

I have been out to Santa Rosa at the site of the Tubbs fire. I have a home that nearly was lost in the Cedar Fire and was spared by dumb luck. There is one reason and one reason only that wildfire is so devastating to residences at the urban interface today – combustible insulation. Not global warming. Ironically it is the elimination of fire retardants such as asbestos in insulation and the addition of combustible insulation that makes homes matchsticks today. Whether cellulose, fiberglas (which contains a plastic matrix) or spray foam, it all burns. I inherited a couple of homes, so I own three homes of different ages and construction standards, 1950s, 1980s, and 2000s. The changes with time are remarkable. The oldest has plaster buttonboard walls, empty wall cavities, and composite shingle roof to replace the old wood shake shingles. Concrete stucco exteriors. Windows are the old single-pane glass, but they are aluminum (non-combustible) framed. It is difficult to find a way to burn such a house down.

Look at homes today. Gypsum drywall held together by paper. Wall cavities stuffed with combustible insulation. Vinyl-framed windows. My own attic has a builder-installed deep layer of blown cellulose. Go out to a burn site. What you see is that homes burned from the inside out, the exterior walls collapsing inward on an empty shell. Common building items such as insulation, drywall and vinyl windows are not to be found anywhere, consumed by fire. Guidelines are published on how to build in the wildland-urban interface, but they are generally ignored. Embers enter the attic, set the insulation on fire, and the fire proceeds down the insulated exterior walls, leaving only the exterior stucco to collapse. Fire destruction appears to be random, but the fire is not transmitted on the ground. The reason it appears random is that the fire is spread by airborne burning embers carried by the wind that enter attics through attic ventilation, which is oversized.

Don’t blame the weather, blame modern construction methods.

Dave Sandbrook
December 6, 2017 1:12 pm

One of the unaddressed factors in the average fire size increase is the change in tactics implemented since the 1970s. There no longer is a “suppress all fires, throw everything at every fire” policy and has not been the policy for decades. Fires that do not immediately threaten communities or other high value resources (municipal watersheds, critical wildlife habitat, et al) are evaluated as to total threat, and strategies are chosen commensurate with the values at risk. In remote undeveloped areas the default strategy is now to back off to key ridges or other defensible features and let the fire burn, stopping it at the ridgetop many miles from the origin.
More prescribed burning is not feasible. The biggest limiting factor in implementing more prescribed burning is air quality regulations. The land management agencies are about maxed out now. They want to burn more, but there are not enough allowable burn days under the Clean Air Act. One northern California National Forest looked at the impact of air quality regulations: comparing the days of favorable weather conditions (not too windy but enough wind for smoke dispersal, not too dry or too moist, etc.) against the allowable burn days granted by the local Air Quality Management agency, it found an average of only 18 days per year when a prescribed burn was both permitted and the weather was suitable for burning. They were already burning as much as they could.
As for the previous fire suppression policy — everyone wants our public lands managed according to the best available science. Full fire suppression was the best science at that time. As the science changed so did the policy.
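[A note on the arithmetic above: the 18-day window is essentially the intersection of two independent constraints, suitable weather and regulatory permission. A minimal sketch, using invented round numbers rather than that Forest’s actual data:]

```python
# Hypothetical illustration: suppose weather is suitable for a prescribed
# burn roughly 1 day in 4, and the air-quality agency allows burning
# roughly 1 day in 5. The usable burn window is the intersection.
days = range(365)
weather_ok = {d for d in days if d % 4 == 0}      # ~92 weather-suitable days
air_quality_ok = {d for d in days if d % 5 == 0}  # ~73 allowable burn days

burn_window = weather_ok & air_quality_ok
print(len(burn_window))  # 19 -- the same order as the ~18 days reported
```

[Because the two constraints bind independently, the usable window shrinks roughly multiplicatively, which is why it ends up far smaller than either limit alone.]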
