From the “models can make you believe anything” department comes this breathless doom-laden press release from the same school that brought you the big wheel of silly climate model results.
Study finds more extreme storms ahead for California
New technique predicts frequency of heavy precipitation with global warming
On December 11, 2014, a freight train of a storm steamed through much of California, deluging the San Francisco Bay Area with three inches of rain in just one hour. The storm was fueled by what meteorologists refer to as the “Pineapple Express” — an atmospheric river of moisture that is whipped up over the Pacific’s tropical waters and swept north with the jet stream.
By evening, record rainfall had set off mudslides, floods, and power outages across the state. The storm, which has been called California’s “storm of the decade,” is among the state’s most extreme precipitation events in recent history.
Now MIT scientists have found that such extreme precipitation events in California should become more frequent as the Earth’s climate warms over this century. The researchers developed a new technique that predicts the frequency of local, extreme rainfall events by identifying telltale large-scale patterns in atmospheric data. For California, they calculated that, if the world’s average temperatures rise by 4 degrees Celsius by the year 2100, the state will experience three more extreme precipitation events per year than the current average.
The researchers, who have published their results in the Journal of Climate, say their technique significantly reduces the uncertainty of extreme storm predictions made by standard climate models.
“One of the struggles is, coarse climate models produce a wide range of outcomes. [Rainfall] can increase or decrease,” says Adam Schlosser, senior research scientist in MIT’s Joint Program on the Science and Policy of Global Change. “What our method tells you is, for California, we’re very confident that [heavy precipitation] will increase by the end of the century.”
The research was led by Xiang Gao, a research scientist in the Joint Program on the Science and Policy of Global Change. The paper’s co-authors include Paul O’Gorman, associate professor of earth, atmospheric, and planetary sciences; Erwan Monier, principal research scientist in the Joint Program; and Dara Entekhabi, the Bacardi Stockholm Water Foundations Professor of Civil and Environmental Engineering.
Currently, researchers estimate the frequency of local heavy precipitation events mainly by using precipitation simulated by global climate models. But such models typically carry out complex computations to simulate climate processes across hundreds or even thousands of kilometers. At such coarse resolution, it is extremely difficult for the models to adequately represent small-scale features such as moist convection and topography, which are essential to making accurate predictions of precipitation.
To get a better picture of how future precipitation events might change region by region, Gao decided to focus not on simulated precipitation but on large-scale atmospheric patterns, which climate models are able to simulate much more reliably.
“We’ve actually found there’s a connection between what climate models do really well, which is to simulate large-scale motions of the atmosphere, and local, heavy precipitation events,” Schlosser says. “We can use this association to tell how frequently these events are occurring now, and how they will change locally, like in New England, or the West Coast.”
While definitions vary for what is considered an extreme precipitation event, in this case the researchers defined such an event as being within the top 5 percent of a region’s precipitation amounts in a particular season, over periods of almost three decades. They focused their analysis on two areas: California and the Midwest, regions which generally experience relatively high amounts of precipitation in the winter and summer, respectively.
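The top-5-percent definition is easy to make concrete. The sketch below is purely illustrative — the daily precipitation series is synthetic and the distribution parameters are arbitrary stand-ins, not the paper’s data — but it shows how a percentile threshold turns a rainfall record into a count of “extreme” days.

```python
import numpy as np

# Hypothetical daily winter precipitation (mm/day) over ~27 seasons of ~90 days
rng = np.random.default_rng(0)
precip = rng.gamma(shape=0.5, scale=8.0, size=27 * 90)  # stand-in data

# An "extreme" day falls within the top 5 percent of precipitation amounts,
# i.e., above the 95th percentile of the record
threshold = np.percentile(precip, 95)
extreme_days = precip > threshold

print(f"95th-percentile threshold: {threshold:.1f} mm/day")
print(f"Extreme days flagged: {extreme_days.sum()} of {precip.size}")
```

By construction, roughly one day in twenty clears the threshold; the interesting question the paper asks is whether that count drifts upward as the climate warms.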
For both regions, the team analyzed large-scale atmospheric features such as wind currents and moisture content, from 1979 to 2005, and noted their patterns each day that extreme precipitation occurred. Using statistical analysis, the researchers identified telltale patterns in the atmospheric data that were associated with heavy storms.
“We essentially take snapshots of all the relevant weather information, and we find a common picture, which is used as our red flag,” Schlosser explains. “When we examine historical simulations from a suite of state-of-the-art climate models, we peg every time we see that pattern.”
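The press release keeps the statistics vague, so the following is only a caricature of that “common picture” idea, not the authors’ actual method: build a composite of the large-scale fields on known extreme days, then raise the red flag on any day whose pattern projects strongly onto that composite. All sizes, field values, and the correlation cutoff here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily large-scale fields (moisture, winds, ...) flattened to
# one vector of coarse-grid values per day
n_days, n_grid = 1000, 50
fields = rng.normal(size=(n_days, n_grid))

# Pretend the historical record labels ~5% of days as extreme-precipitation
# days, and give those days a shared large-scale signature
extreme = rng.random(n_days) < 0.05
signature = rng.normal(size=n_grid)
fields[extreme] += 2.0 * signature

# The "common picture": a composite (mean) pattern over known extreme days
composite = fields[extreme].mean(axis=0)

# Red flag: any day whose pattern correlates strongly with the composite
def is_storm_like(day_pattern, composite, cutoff=0.5):
    r = np.corrcoef(day_pattern, composite)[0, 1]
    return r > cutoff

flags = np.array([is_storm_like(day, composite) for day in fields])
print(f"Days flagged: {flags.sum()} (labeled extreme days: {extreme.sum()})")
```

Because the synthetic extreme days share a signature, the flags largely recover them; in the real scheme the same matching is applied to climate-model output, where the large-scale fields are trusted even though the simulated rainfall is not.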
Using the new scheme, the team was able to reproduce collectively the frequency of extreme events that were observed over the 27-year period. More importantly, the results are much more accurate than those based on simulated precipitation from the same climate models.
“None of the models are even close to the observations,” Gao says. “And regardless of the combination of atmospheric variables we used, the new schemes were much closer to observations.”
Bolstered by their results, the team applied their technique to large-scale atmospheric patterns from climate models to predict how the frequency of heavy storms may change in a warming climate in California and the Midwest over the next century. They analyzed each region under two climate scenarios: a “business as usual” case, in which the world is projected to warm by 4 degrees Celsius by 2100, and a policy-driven case, in which global environmental policies regulating greenhouse gases are assumed to keep the temperature increase to 2 degrees Celsius.
For each scenario, the team flagged those modeled large-scale atmospheric patterns that they had determined to be associated with heavy storms. In the Midwest, yearly instances of summer extreme precipitation decreased slightly under both warming scenarios, although the researchers say the results are not without uncertainty.
For California, the picture is much clearer: Under the more intense scenario of global warming, the state will experience three more extreme precipitation events per year, on the order of the December 2014 storm. Under the policy-driven scenario, Schlosser says “that trend is cut in half.”
The team is now applying its technique to predict changes in heat waves from a globally warming climate. The researchers are looking for patterns in atmospheric data that correlate with past heat waves. If they can more reliably predict the frequency of heat waves in the future, Schlosser says that can be extremely helpful for the long-term maintenance of power grids and transformers.
“That is actionable information,” Schlosser says.
This research was supported in part by the National Science Foundation, the National Aeronautics and Space Administration, and the Department of Energy.
Written by Jennifer Chu, MIT News Office
PAPER: 21st century changes in U.S. regional heavy precipitation frequency based on resolved atmospheric patterns
Ok, here’s some “actionable information”.
They failed to examine the null hypothesis: what sort of rainfall patterns would California get with a 4°C cooling?
In citing “…the team was able to reproduce collectively the frequency of extreme events that were observed over the 27-year period,” they fail to note that they are only reproducing weather during a period of warming in California’s history. And since models are tunable, this could be little more than self-tuned confirmation bias.
But here’s the real kick in the pants from California history: The Great Flood of 1862
The Great Flood of 1862 was the largest flood in the recorded history of Oregon, Nevada, and California, occurring from December 1861 to January 1862. It was preceded by weeks of continuous rains (or snows in the very high elevations) that began in Oregon in November 1861 and continued into January 1862. This was followed by a record amount of rain from January 9–12, and contributed to a flood which extended from the Columbia River southward in western Oregon, and through California to San Diego, and extended as far inland as Idaho in the Washington Territory, Nevada and Utah in the Utah Territory, and Arizona in the western New Mexico Territory.
The event was climaxed by a warmer, more intense storm with much more rain that was much more serious, due to the earlier large accumulation of snow, now melted by the large turbulent heat fluxes into the snow over the lower elevations of the mountains. Throughout the affected area, all the streams and rivers rose to great heights, flooded the valleys, inundated or swept away towns, mills, dams, flumes, houses, fences, and domestic animals, and ruined fields. An early estimate of property damage was $10,000,000. However, later it was estimated that approximately one-quarter of the taxable real estate in the state of California was destroyed in the flood. Dependent on property taxes, the State of California went bankrupt. The governor, state legislature, and state employees were not paid for a year and a half. 200,000 cattle drowned, and the state’s economy shifted from ranching to farming.
The floods were likely caused by precipitation from atmospheric rivers, or narrow bands of water vapor about a mile above sea level that extend for thousands of kilometers.
Prior to the flooding, Oregon had steady but heavier than normal rainfall during November and heavier snow in the mountains.
The weather pattern that caused this flood was not from an El Niño, and from the existing Army and private weather records, it has been determined that the polar jet stream was to the north as the Pacific Northwest experienced a mild rainy pattern for the first half of December 1861. The jet stream then slid south and freezing conditions were reported at Oregon stations by December 25. Heavy rainfall began falling in California as the longwave trough moved down over the state, remaining there until the end of January 1862 and causing precipitation everywhere in the state for nearly 40 days. Eventually the trough moved even further south, causing snow to fall in the Central Valley and surrounding mountain ranges.
What was the global CO2 level in 1862? According to The Keeling Curve at Scripps, it was about 260 ppm, compared to today’s 400+ ppm.
So if global warming is caused by more CO2, and according to MIT, warmer times will cause ‘more extreme storms’, why have we not seen events like the one in 1862, or worse?
And what makes these events, when CO2 levels were lower, not worthy of them running the model backwards in time?
March 1907 and January 1909 Floods
Significant flooding on all major rivers in the Sacramento Valley. A record instantaneous peak flow was set in one year, and the record overall flow volume in the other. A total of 300,000 acres were flooded in the Sacramento Valley in 1907.
– Long-term Strategic Impact: The flood episodes resulted in an overhaul of planned statewide flood control designs. Previous designs were based upon Midwest experience, which relied upon confining rising rivers between levees. The concept of bypasses and overflow weirs had been suggested and rejected. Following the 1907 and 1909 record floods, a new Lead Planning Engineer was selected and the current California flood control design was devised.
1969 Winter Storms and Floods
Significant flooding on Central Valley rivers and reformation of Tulare Lake in the San Joaquin Valley as extended precipitation fell across the state. Heavy snow fell in all mountain ranges and the monthly rainfall record was set in Sacramento. Forty counties were disaster-declared.
Personally, I think their modeling is all wet, and just about as credible as the last doomsday climate model MIT produced.