
Guest post by Bob Tisdale
SkepticalScience recently published a post titled Mercury rising: Greater L.A. to heat up an average 4 to 5 degrees by mid-century. It’s a cross-post of a UCLA press release with the same title. It struck me as odd, because we recently showed that Western North American land surface temperatures have declined in recent years. Refer to Figure 6 in the post IPCC Models vs Observations – Land Surface Temperature Anomalies for the Last 30 Years on a Regional Basis.
The press release is based on the Hall et al (2012) climate model study Mid-Century Warming in the Los Angeles Region published at the website C-Change.La.
So, does a 4 to 5 deg F rise in Greater Los Angeles land surface temperature anomalies by 2050 sound realistic? That equates to a rise of about 2.2 to 2.8 deg C. Considering that, based on a preliminary look at the data, Greater Los Angeles land surface temperature anomalies have been cooling for the past 3-plus decades, it seems to be a real stretch of the imagination.
The GHCN-CAMS land surface temperature dataset comes in a number of resolutions, including 0.5-degree latitude by longitude. Lucky for us, it’s available through the KNMI Climate Explorer. And that means we can capture data for some reasonably small geographical areas. The GHCN-CAMS land surface temperature dataset was presented in the Fan and van den Dool (2008) paper A global monthly land surface air temperature analysis for 1948–present. It will allow us to get an idea of what Greater Los Angeles surface temperatures have been doing since 1948.
The UCLA press release states (my bold face):
Some of the smallest changes predicted, yet still nearing a 4-degree increase, are in Oxnard (3.68 degrees), Venice (3.70), Santa Barbara (3.73), Santa Monica (3.74), San Pedro (3.78), Torrance (3.80), Long Beach (3.82) and Santa Ana (3.85). Among the highest predicted increases are Wrightwood (5.37), Big Bear Lake (5.23), Palm Springs (5.15), Palmdale (4.92), Lancaster (4.87), Bakersfield (4.48) and Santa Clarita (4.44). Table 2 in the study calls out 27 distinct locations, such as downtown Los Angeles (3.92), San Fernando (4.19), Woodland Hills (4.26), Eagle Rock (3.98), Pasadena (4.05), Pomona (4.09), Glendale (3.99) and Riverside (4.23).
So they’ve made predictions for an area larger than the City or County of Los Angeles. For the sake of discussion, let’s say it represents the area bordered by the coordinates of 33N-35.5N, 121W-117W. See Figure 1. Those coordinates fit with the 0.5 degree grids. And we’ll call that dataset Santa Barbara-Ventura-Los Angeles-Orange Counties.
Figure 1
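For anyone who wants to reproduce the regional averaging, here’s a minimal sketch in Python (not my actual workflow) of how the 0.5-degree GHCN-CAMS cells inside that box might be averaged. The filename and variable name are placeholders for whatever the KNMI Climate Explorer download actually contains.

# Minimal sketch: area-weighted average of GHCN-CAMS 0.5-degree cells
# over the 33N-35.5N, 121W-117W box. Filename and variable name are
# placeholders for the actual KNMI Climate Explorer download.
import numpy as np
import xarray as xr

ds = xr.open_dataset("ghcn_cams_05deg.nc")  # hypothetical filename
tas = ds["tas"]                             # hypothetical variable name

# Select the box. If the file stores longitude as 0-360,
# 121W-117W becomes 239E-243E; latitude is assumed ascending.
box = tas.sel(lat=slice(33.0, 35.5), lon=slice(239.0, 243.0))

# Weight each cell by the cosine of its latitude, since 0.5-degree
# cells shrink toward the pole.
weights = np.cos(np.deg2rad(box.lat))
regional = box.weighted(weights).mean(dim=("lat", "lon"))

# Anomalies relative to the full-period monthly climatology.
clim = regional.groupby("time.month").mean("time")
anoms = regional.groupby("time.month") - clim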
Figure 2 presents a time-series graph of the Santa Barbara-Ventura-Los Angeles-Orange Counties land surface temperature anomalies since January 1948. The data has a linear trend of 0.177 deg C/decade. In order for the land surface temperatures for that dataset to rise 2.5 deg C by 2050, the linear trend would have to change drastically, to about 0.65 deg C/decade from June 2012 through December 2050.
Figure 2
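The arithmetic behind that required trend is simple enough to check. A minimal sketch, assuming anoms is the monthly anomaly series from the sketch above:

# Fit the observed linear trend, then work out the rate needed to
# warm 2.5 deg C between mid-2012 and the end of 2050.
import numpy as np

# Decimal years for each monthly value.
years = anoms["time.year"].values + (anoms["time.month"].values - 0.5) / 12.0

slope = np.polyfit(years, anoms.values, 1)[0] * 10.0  # deg C per decade
print(f"Observed trend: {slope:.3f} deg C/decade")    # ~0.18 for this box

decades_remaining = (2051.0 - 2012.5) / 10.0          # ~3.85 decades
print(f"Required trend: {2.5 / decades_remaining:.2f} deg C/decade")  # ~0.65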
In Figure 3, I’ve smoothed the Santa Barbara-Ventura-Los Angeles-Orange Counties land surface temperature anomalies with a 13-month running-average filter to reduce some of the variability. What caught my eye was the shift in 1976 that coincides with the Pacific Climate Shift. Curiously, it appears the dataset has been cooling since that shift.
Figure 3
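The 13-month smoothing is nothing exotic. A minimal sketch of a centered running mean, again using the anoms placeholder from above:

# Centered 13-month running mean: each point averages the surrounding
# +/- 6 months. The first and last 6 months are left as NaN.
import numpy as np

def running_mean(x, window=13):
    kernel = np.ones(window) / window
    smoothed = np.convolve(x, kernel, mode="same")
    half = window // 2
    smoothed[:half] = np.nan
    smoothed[-half:] = np.nan
    return smoothed

smoothed = running_mean(np.asarray(anoms, dtype=float))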
Let’s take a look at the linear trends before and after the 1976 Pacific Climate Shift, switching back to the “raw” data. Before the climate shift, from January 1948 to December 1975, the Santa Barbara-Ventura-Los Angeles-Orange Counties land surface temperature anomalies rose at a rate of only 0.108 deg C/decade; after it, from January 1977 to May 2012, they cooled at a rate of -0.082 deg C/decade. See Figure 4.
Figure 4
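Those two segment trends come from the same fit applied to masked sub-periods. A short sketch with the years and anoms placeholders from above:

# Separate linear trends for the pre-shift (Jan 1948 - Dec 1975) and
# post-shift (Jan 1977 - May 2012) segments.
import numpy as np

pre = (years >= 1948.0) & (years < 1976.0)
post = (years >= 1977.0) & (years <= 2012.4)  # through May 2012

for label, mask in [("1948-1975", pre), ("1977-2012", post)]:
    seg = np.polyfit(years[mask], np.asarray(anoms)[mask], 1)[0] * 10.0
    print(f"{label}: {seg:+.3f} deg C/decade")  # ~ +0.108 and -0.082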
In Figure 5, I’ve added the projection of about 2.5 deg C of warming by 2050 to the graph to show how unrealistic that projection looks, especially when we consider that surface temperatures for the Santa Barbara-Ventura-Los Angeles-Orange Counties data have been dropping for 3-plus decades.
Figure 5
Maybe I looked at too large an area. Let’s take a look at the data for the coordinates of 33.5N-34.5N, 118.5W-117.5W. See Figure 6 for the location. We’ll call that dataset “Los Angeles Plus.”
Figure 6
As shown in Figure 7, the pre-1976 warming rate for the “Los Angeles Plus” land surface temperature anomalies is greater than that of the larger dataset, at about 0.33 deg C/decade. But the post-1976 trend is still negative, at -0.074 deg C/decade.
Figure 7
One last try: let’s shrink the area again (Figure 8), this time looking at the land surface temperature anomalies for the coordinates of 34N-34.5N, 119W-118W. That captures Malibu, the Valley and much of the City of Los Angeles.
Figure 8
Doesn’t help. As shown in Figure 9, the trend after the 1976 Pacific Climate Shift for the Malibu-The Valley-Los Angeles land surface temperatures is negative at -0.086 deg C/decade.
Figure 9
CLOSING
Based on this quick look at land surface temperature data for the Greater Los Angeles area, the Hall et al (2012) study referred to in the UCLA press release appears to have no basis in reality.
SOURCE
The GHCN-CAMS land surface temperature data presented in this post is available through the KNMI Climate Explorer.
As with the Warmista attack on the North Carolina legislature, actual scientific data is irrelevant compared to the consensus computer projections. After all, today’s CO2 level is ‘unprecedented’, so therefore the future warming and sea-level hike will be unprecedented too! And you thought Word Magic was only for medievals!
Thanks for taking a look at this “Study.” I heard about it in passing and was just floored. The LA mayor is taking this info and running with it, talking about saving the planet by using green energy.
The defense of models in such cases is always that they aren’t intended to simulate regional variation. But if that is the case, why is it that so many use them to simulate regional variation?
Will the defenders of models step forward to repudiate the attempts to use models to create scary scenarios on regional scales? Doubtful.
But…but…but…decadal ENSO oscillation parameters on weather variations (and sensors) don’t matter. CO2 effects will simply overwhelm and outdo whatever that little ol’ pond can throw at ya.
L.A. summers used to be hot; looking at Pasadena in particular, we’re talking a majority of the summer days over 90°F, and many over 100. But those who live there have noticed a dramatic change, as the last 4 summers, in a row, have been cool. 90° has been a virtual ceiling, with just a handful of days going over 90, and literally just a couple of days reaching 100. This is plainly inconsistent with the notion that we are in a period of acute global warming. Instead, if we were actually living in the hottest period of history, nearly all the days should be over 90, and a near majority over 100. But it’s the opposite… And now, to predict that temps are going to skyrocket in L.A. is just ludicrous, laughable. Makes you shake your head in disbelief. The warmists are full of it.
Do you mean that we DON’T have to send Gore all that cash?
This appears on the surface to just be silly. I have either lived in or regularly visited the Carmel-to-San Diego area and have always found sea temperatures to control the biggest part of the climate or weather. Do they predict a 5-plus percent increase in Pacific Ocean temperatures? If not, they must be wrong: the ocean is a huge heat sink, and it dominates the coastal climates.
“CLOSING
Based on this quick look at land surface temperature data for the Greater Los Angeles area, the Hall et al (2012) study referred to in the UCLA press release appears to have no basis in reality.”
Sure it does. You see, 2050 is 38 years away. That means that next year their prediction will be for 2051. The year after that, the prediction will be for 2052. The year after that……..
Their point is we have to do something now!!!!!!!!
As ever, the strategy seems to be to throw as much #$%^ at the wall, figuring that a good percentage will stick to the public mind. If sheer volume were determinant, this debate would have ended long ago.
When it comes to history, the first impulse of the Climatists of warming alarmism was to ignore it. Now the Climatists believe they can use statistics to predict the future. However, we cannot say that what the authors of the study are saying appears to have no basis in reality; unfortunately, all of the previous instances are examples of political propaganda, not science.
http://evilincandescentbulb.wordpress.com/2012/06/24/1887/
This study is just a projection based on Hansen’s data manipulation. Now Hansen will fraudulently cool the past and warm the present more to comply.
No matter what some might say, a ‘global’ temperature is an unnatural beast. Looking at individual country/city temperatures doesn’t support the sort of alarmist temperature increase mentioned here.
Here is the longest temperature dataset in the world, which has been falling steadily:
http://www.metoffice.gov.uk/hadobs/hadcet/
Anecdotally, it has been impossible to grow tomatoes out of doors here the last four summers, and all my garden succulents have died during the last few winters. The current anomaly brings us back to the same temperatures that England had in the 1730s.
tonyb
Yes, another study that is “models all the way down”. Why would someone play computer games when there is real data that can be looked at and analyzed? I have said it many times, computer models are only really useful in telling the scientist what to go measure in the real world to confirm or disprove a hypothesis. Any other use is political science.
SkepticalScience would say anything to create support for their smug self important musings.
The problem is that with climatology there is nothing to lose by making alarming predictions. Indeed, there is everything to gain: notoriety and more funding, which encourages more high-pitched and alarming predictions. A meteorological forecaster, by contrast, gets almost immediate feedback on their forecasts. If a forecaster is wrong, it results, at a minimum, in loss of ‘face’ and often in loss of employment.
It is a pity that there is no way a ‘downside’ could be applied to climate modelers. It would be nice to impose a retrospective refund of the grants/funds made to them for the research and modeling (plus the interest they would have accrued) at the time their forecast/projection is shown to have been untrue. As it is, they can be feted by the politicians and draw huge income from forecasting the flooding of New York, say, or the inundation of Southern Florida in “15 years’ time”; but when that date is reached, there is no reckoning. Rather the opposite, as the mainstream media expects everyone to be spellbound by their next “in 15 years…” prediction.
Therefore, as we pass these predicted but nonexistent ‘tipping points’ (refugees, species extinctions, inundations) and nothing happens, at the very least we should be able to publish a “you were totally wrong” record, like the Climate FAIL Files but FAR more public. Perhaps a table of climate ‘scientists’ together with a ‘skill’ score or believability ratio based on forecasts that came to pass or failed. This could even be extended to institutions or research establishments as well as individuals. Imagine: UEA, skill 0%; NASA, skill 5%; Willis Eschenbach, 80%; Joe Bastardi (WeatherBell), 95% (purely example figures of course, but close to the truth). This would provide stats that could be given to the MSM: why quote that climate ‘scientist’ when they are totally unreliable? More importantly, to the funding authorities such as the NSF: why are you funding failures? Only attributed and documented forecasts would be used, but it is time that the irresponsibility was curbed.
I suggest keeping a close eye on your utility bills if you live in LA. The city’s utility is struggling with trying to meet all of the state and local mandates being forced on it. Guess where they are going to get the money to meet these mandates.
Note: By struggling I mean they are facing a significant challenge. From my limited experience with their people, they are very good.
Are the slopes before and after 1975 significantly different from the slope from 1950 to 2010? Looking at the data, it seems barely plausible to say that it’s two slopes rather than just one.
“So, does a 4 to 5 deg F rise in Greater Los Angeles land surface temperature anomalies by 2050 sound realistic?”
Ultimately, truth in science comes down to what observation or experiment says; I couldn’t care less about the computer models. Los Angeles shows no evidence of a warming trend according to the real data, which I got from NOAA and posted here: http://reasonabledoubtclimate.wordpress.com/2012/06/25/hey-california-hot-enough-for-ya-just-wait-maybe/
Sounds like a great setup for the introduction of a mitigation tax on current power bills in order to develop a future fund for a watermelon committee to develop policies and papers on how to live with Gaia in sustainable harmonious interdependent trifectas. That last part has to do with their twisted notion of betting on scenarios A, B, AND C.
The acronym just rolls off the tongue: FFFWCTDPPOHTLWGSHIT.
“the UCLA press release appears to have no basis in reality”
Of course not, it has a basis in computer models. I bet no one bothered to look and see if the model predictions appeared realistic. It’s just assumed the models are right.
So what do the models of L.A. temp. show if adjusted for the fires caused by gang violence and riots? I bet that flattens out the graph something fierce….
THIS IS HUGE: NEW NORWEGIAN SCIENTIFIC RESEARCH: SOUTH POLE NOT MELTING
Twenty-year-old models which have suggested serious ice loss in the eastern Antarctic have been compared with reality for the first time – and found to be wrong, so much so that it now appears that no ice is being lost at all.
“Previous ocean models … have predicted temperatures and melt rates that are too high, suggesting a significant mass loss in this region that is actually not taking place,” says Tore Hattermann of the Norwegian Polar Institute, member of a team which has obtained two years’ worth of direct measurements below the massive Fimbul Ice Shelf in eastern Antarctica – the first ever to be taken.
According to a statement from the American Geophysical Union, announcing the new research:
It turns out that past studies, which were based on computer models without any direct data for comparison or guidance, overestimate the water temperatures and extent of melting beneath the Fimbul Ice Shelf. This has led to the misconception, Hattermann said, that the ice shelf is losing mass at a faster rate than it is gaining mass, leading to an overall loss of mass.
The team’s results show that water temperatures are far lower than computer models predicted …
Hatterman and his colleagues, using 12 tons of hot-water drilling equipment, bored three holes more than 200m deep through the Fimbul Shelf, which spans an area roughly twice the size of New Jersey. The location of each hole was cunningly chosen so that the various pathways by which water moves beneath the ice shelf could be observed, and instruments were lowered down.
The boffins also supplemented their data craftily by harvesting info from a biology project, the Marine Mammal Exploration of the Oceans Pole to Pole (MEOP) effort, which had seen sensor packages attached to elephant seals.
“Nobody was expecting that the MEOP seals from Bouvetoya would swim straight to the Antarctic and stay along the Fimbul Ice Shelf for the entire winter,” Hattermann says. “But this behaviour certainly provided an impressive and unique data set.”
Normally, getting sea temperature readings along the shelf in winter would be dangerous if not impossible due to shifting pack ice – but the seals were perfectly at home among the grinding floes.
Overall, according to the team, their field data shows “steady state mass balance” on the eastern Antarctic coasts – ie, that no ice is being lost from the massive shelves there. The research is published in the journal Geophysical Research Letters.
http://astuteblogger.blogspot.com/2012/06/this-is-huge-new-norwegian-scientific.html
Chris says: “Are the slopes before and after 1975 significantly different from the slope from 1950 to 2010?”
The trend for the full term of the Santa Barbara-Ventura-Los Angeles-Orange Counties land surface temperature anomalies is shown in Figure 2 to be 0.177 deg C/decade, while the trend for the pre-1976 data (shown in Figure 4) is 0.108 deg C/decade and the trend for the post-1976 data is negative, at -0.082 deg C/decade.
Here’s the graph of the full term data for the “Los Angeles Plus” region.
http://i48.tinypic.com/166ako6.jpg
It has a long-term trend of 0.233 deg C/decade. But the trend for the pre-1976 data (shown in Figure 7) is 0.329 deg C/decade, and the trend for the post-1976 data is negative, at -0.074 deg C/decade.
And for the full term of the “Malibu-The Valley-Los Angeles” region…
http://i45.tinypic.com/152fiog.jpg
…the long-term trend is 0.205 deg C/decade. The trend for the pre-1976 data (shown in Figure 9) is 0.202 deg C/decade, so that one’s basically the same as the long-term trend. Yet the trend for the post-1976 data is negative, at -0.086 deg C/decade.
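As for whether the two slopes differ significantly, one quick, hedged check is to compare each segment’s slope against its standard error from an ordinary least-squares fit, reusing the pre and post masks from the sketch in the post. Keep in mind that plain OLS ignores month-to-month autocorrelation in the data, so the true uncertainties are wider than these standard errors suggest.

# Slope and 2-sigma range for each segment, in deg C per decade.
import numpy as np
from scipy import stats

for label, mask in [("pre-1976", pre), ("post-1976", post)]:
    fit = stats.linregress(years[mask], np.asarray(anoms)[mask])
    slope, se = fit.slope * 10.0, fit.stderr * 10.0
    print(f"{label}: {slope:+.3f} +/- {2 * se:.3f} deg C/decade")

If the two 2-sigma intervals don’t overlap, the slopes differ at roughly the 95% level; if they do overlap, Chris’s “just one slope” reading can’t be ruled out from the trends alone.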
Even Gavin at RC concedes that forecasting regional climate is a crap shoot. Models can’t forecast it. If they could, climate scientists would all be winning big at Monte Carlo.
Gavin writes
“The basic issue is that for short time scales (in this case 1979-2000), grid point temperature trends are not a strong function of the forcings – rather they are a function of the (unique realisation of) internal variability and are thus strongly stochastic”
http://www.realclimate.org/index.php/archives/2012/06/unforced-variations-june-2012/
Ian W says:
June 25, 2012 at 3:33 pm
The problem is that with climatology there is nothing to lose by making alarming predictions.
========
Unlike almost every other profession, where a prediction that turns out to be significantly wrong leaves you liable to a lawsuit.
LA residents brace yourselves for zillions of $$ in higher taxes, fees and insurance rates as a result of this report.
Time to pass a law, as has been done in other states, that costs can only be increased based on observations, not models. Otherwise there is a huge incentive for model builders to incorporate all sorts of nonsense to make money.