California Heating Up, a new NASA/CSU study finds, but data questionable

Image: Map of California showing changes in temperature, 1950-2000. Average temperatures warmed in nearly all parts of California between 1950 and 2000. Image credit: NASA/JPL/Cal State L.A.

Average temperatures in California rose almost one degree Celsius (nearly two degrees Fahrenheit) during the second half of the 20th century, with urban areas leading the trend to warmer conditions, according to a new study by scientists at NASA and California State University, Los Angeles. Results of the study appeared in the journal Climate Research.

But 50 years of temperature trends hardly proves anything relevant about climate change, other than that it’s gotten warmer in the past fifty years. Fifty years, in terms of our planet’s and the sun’s processes, is a blink. I have to think that because NASA chose to co-author this paper with researchers at California State University, some of the statewide “global warming as man-made problem” bias crept into the thinking behind this paper, i.e., “we need another study to show that it’s getting hotter so action is justified.”

What is troubling about this study is that many of California’s historical climatological stations, when analyzed over a 100-year period rather than a 50-year period, show a net cooling, or a reversal of the trend. The northern Sacramento Valley has very few reporting stations that go back 100 years, so I have only four data points, but it makes me wonder just what data the NASA/CSU study used to conclude that our area has warmed 1.1 degrees F over the last 50 years.

I’ve prepared some side-by-side graphs below of Sacramento Valley stations to illustrate that point:

Image: North Sacramento Valley city temperature trends for 50- and 100-year periods

My data source: U.S. Historical Climatology Network (USHCN) Data Set
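
For anyone who wants to check these trends themselves, here is a minimal sketch of the calculation. The file name and the simple "year,temp_f" CSV layout are assumptions for illustration; the actual USHCN files are fixed-width and would need a different loader.

```python
# Minimal sketch: compare 50- vs. 100-year linear trends for one station.
# ASSUMPTION: a CSV of annual mean temperatures with columns "year,temp_f";
# the real USHCN files are fixed-width and need a different loader.
import numpy as np

def trend_f_per_decade(years, temps):
    # Least-squares slope, converted to degrees F per decade.
    slope, _intercept = np.polyfit(years, temps, 1)
    return slope * 10.0

data = np.genfromtxt("station_annual.csv", delimiter=",", names=True)
years, temps = data["year"], data["temp_f"]

for span in (50, 100):
    recent = years >= years.max() - span + 1
    print(f"{span}-year trend: {trend_f_per_decade(years[recent], temps[recent]):+.2f} F/decade")
```

Run against a station whose 100-year slope is negative and whose 50-year slope is positive, and you get exactly the kind of trend reversal shown in the graphs above.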

Yet the NASA/CSU paper claims “The only area to cool was a narrow band of the state’s mainly rural northeast interior.” None of the stations above are in that area; they are in the North Sacramento Valley.

Even odder than that, cold and snowy Mt. Shasta, where you’d expect to hear about depleted snowpack, the melting glacier on the side of the mountain, and other “signatures” of “global warming,” shows a significant drop in temperatures over the last 50 years. Yet the NASA/CSU study concludes that a 2.1 degree F rise in temperature occurred in that area.

Image: Mt. Shasta temperature trends for 50- and 100-year periods

Granted, a few data points don’t equal a complete study, but the fact that in a couple of hours I’ve been able to find and plot several places that don’t match the trends in the NASA/CSU study calls their methodology into question. Note that the cities I used are all small rural cities, while the NASA/CSU study plotted major, medium, and minor cities in California to draw its conclusions. In their own paper they admit that the areas that have grown the most have shown the greatest temperature increases:

Southern California had the highest rates of warming, while the NE Interior Basins division experienced cooling. Large urban sites showed rates over twice those for the state, for the mean maximum temperatures, and over 5 times the state’s mean rate for the minimum temperatures. Average temperatures increased significantly in nearly 54 percent of the stations studied, with human-produced changes in land use seen as the most likely cause. The largest temperature increases were seen in the state’s urban areas, led by Southern California and the San Francisco Bay area, particularly for minimum temperatures.

For example, look at Pasadena, CA: once a small city itself, over the last 100 years it has become a dot in the sea of the second-largest American city, Los Angeles. Its temperature trend, unsurprisingly, is sharply upward for both the 50- and 100-year periods. It’s drowning in a sea of asphalt and concrete; is it any wonder it shows a temperature increase?

Image: Pasadena temperature trends for 50- and 100-year periods

The inescapable conclusion is that the NASA/CSU study is plotting the effects of urban heat islands and applying that trend to the entire landmass of California to produce the statewide temperature-trend map they present.

A simple filtering based on urban growth factors would yield a temperature map with a far different result.
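
As a rough illustration of what such a filter could look like, here is a sketch that averages trends only from stations that are still small towns. Every number below is a made-up placeholder, and the population field is an assumed metadata join (e.g., from census records), not something in the USHCN files themselves.

```python
# Sketch of the filtering idea: average trends only from stations in
# places that remained small towns. All values are illustrative
# placeholders, not USHCN data.
import numpy as np

def rural_mean_trend(stations, max_population=10_000):
    # Keep stations whose year-2000 population is under the threshold.
    kept = [s for s in stations if s["population_2000"] < max_population]
    names = [s["name"] for s in kept]
    return np.mean([s["trend_f_per_decade"] for s in kept]), names

stations = [
    {"name": "SmallTown A", "population_2000": 6_300,   "trend_f_per_decade": -0.10},
    {"name": "SmallTown B", "population_2000": 4_100,   "trend_f_per_decade": +0.05},
    {"name": "BigCity C",   "population_2000": 134_000, "trend_f_per_decade": +0.45},
]

trend, names = rural_mean_trend(stations)
print(f"mean rural-only trend: {trend:+.2f} F/decade using {names}")
```

With the big-city station excluded, the statewide average drops sharply, which is the point: the map you get depends on which stations you let into the average.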

To their credit though, they recognize this fact: “If we assume global warming affects all regions of the state, then the small increases our study found in rural stations can be an estimate of this general warming over land. Larger increases would therefore be due to local or regional changes in land surface use due to human activities.”

For the most part, “urban warming” has dwarfed “global warming” in its magnitude, a fact that is lost on some who look at temperature data from weather stations worldwide and treat them all equally in the quest to prove a theory.

4 Comments
Nick Freitas
March 28, 2007 9:06 pm

One might conclude that if tomorrow there were a huge amount of federal grant money out there for people doing studies of global cooling, we would be on the brink of an ice age.

Tina
March 29, 2007 11:15 pm

For example, look at Pasadena, CA… Its temperature trend, unsurprisingly, is sharply upward for both the 50- and 100-year periods…
I think many of the people who live in these concrete jungle cities become rabid warming freaks because of their daily experiences in traffic.

Lon
March 30, 2007 3:14 pm

Anthony,
You make a number of good points, particularly that the writers may have applied changes in urban temperature measurements over large regions for graphical impact.
As someone who has designed and built electronic temperature sensors I have certain concerns about the data itself.
Unless temperature sensors are regularly calibrated, I think it is unreasonable to expect accuracy better than a couple of degrees.
Even some that are calibrated may not have good accuracy. The LM34, a commonly used semiconductor temperature sensor, is only accurate to +/-2 degrees F. This is pretty typical of analog or digital semiconductor sensors. The temperature error for this part is also non-linear, so it’s not a simple offset that you can account for during data collection. Furthermore, there are lots of additional errors that can creep into a temperature-measuring device beyond the sensor itself.
http://www.national.com/pf/LM/LM34.html
One could argue that numerical analysis done on the data points would tease out errors. But if a scientist doesn’t know the exact accuracy of a temperature sensor, then they can’t account for errors in their system.
Some of the temperature sensing stations may be very accurate and regularly calibrated. But maybe they’re not?
I have a hard time trusting that the data are accurate to the level of identifying 1- or 2-degree changes over decades. This is especially true since the techniques for making these measurements have changed over that time frame.
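Here is a quick numerical sketch of that point (the drift rate below is an assumed value, purely for illustration): averaging removes random noise, but an unknown calibration drift goes straight into the apparent trend.

```python
# Illustration: averaging many readings removes random noise, but a slow,
# unknown calibration drift aliases directly into the apparent trend.
# The drift rate (0.02 F/year) is an assumed value, not a measured one.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1950, 2001)
true_temps = 60.0 + rng.normal(0.0, 0.5, years.size)   # flat "true" climate

drift = 0.02 * (years - years[0])                      # sensor drifts 1 F over 50 yr
measured = true_temps + drift + rng.normal(0.0, 0.3, years.size)

true_slope = np.polyfit(years, true_temps, 1)[0]
meas_slope = np.polyfit(years, measured, 1)[0]
print(f"true trend:     {true_slope * 10:+.3f} F/decade")
print(f"measured trend: {meas_slope * 10:+.3f} F/decade")   # ~0.2 F/decade spurious
```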
Lon

Anthony
March 30, 2007 8:18 pm

Lon, thank you for the comments. FINALLY somebody who understands the kind of biases that creep into temperature measurements!
I’m intimately familiar with National Semi’s LM34 and its accuracy problems. One of my early jobs at the university as a research assistant was to build remote electronic weather stations. I soon learned how inaccurate many electronic devices can be at measuring temperature.
The problem with the National Weather Service temperature data sets (and world data sets too) is that they are full of biases that I’m not sure have been accurately accounted for. People such as Jim Price of CSUC, who is on the IPCC, say they have been, yet nobody has shown me any hard evidence of that. I’d be a lot less skeptical if I could see how the IPCC accounted for temperature measurement biases. But they won’t share.
Some people that I try to explain this to accuse me of splitting hairs. But these bias problems in temperature measurement are quite real.
What works against my arguments about the difficulty of getting accurate temperature records is the everyday simplicity of temperature and its common measurement. We live by temperature, we have it reported constantly, we all have thermometers at home, we measure our children’s fevers with thermometers, we barbecue with thermometers.
Measuring temperature is easy, right? You just stick the thermometer in whatever gas, liquid, or solid you want to measure the temperature of, and voilà, there it is. People tend to think of thermometers as perfect devices. Some are close, especially when taking measurements in a closed system, like a fermentation vat at Sierra Nevada.
But in an open system in our atmosphere, there are many, many more biases that can affect the measurement within a few inches or feet of the thermometer. Here are just a few:
– Reflected sunlight from nearby buildings or objects
– Re-radiated infrared from nearby cement or asphalt surfaces or the ground itself (which is why airports make terrible places for temperature measurement)
– The structure the thermometer is mounted to, which can conduct heat to the thermometer
Now add to that:
– Accuracy of the thermometer itself
– Linearity of the thermometer over its measurement range
– Long term repeatability of the thermometer’s accuracy
– Long term repeatability of the thermometer’s linearity
And then we have urban effects such as:
– Localized vegetation removal or addition over time
– Localized building changes over time
– Localized asphalt or concrete surfaces addition or removal
And finally within the Global Temperature data set we find:
– Changing the location of the weather station thermometer
– Changing the thermometer itself at some point, i.e., repair/replace
– Changing the thermometer type, from mercury to electronic
– Changes in the thermometer shelter, with different types of paint over time, all of which have different absorptive and reflective properties
OK, with all these biases that you have to account for to make long-term temperature measurements reflect the true temperature of the location, can you be absolutely sure of the data? Especially when you are looking for trends that may be 1 degree or less over 50-100 years?
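To put a number on just one of those biases, here is a sketch of what a single undocumented step change, like a station move or a thermometer swap, does to a 100-year trend. The 1.0 degree F step size is an assumed value for illustration.

```python
# Illustration: a one-time step change (station move, new thermometer)
# halfway through a flat 100-year record produces a spurious linear trend
# of roughly 1.5x the step size per century. The 1.0 F step is assumed.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1900, 2001)
temps = 60.0 + rng.normal(0.0, 0.5, years.size)   # flat "true" climate
temps[years >= 1950] += 1.0                        # undocumented step in 1950

slope = np.polyfit(years, temps, 1)[0]
print(f"apparent trend: {slope * 100:+.2f} F/century from a 1.0 F step")
```

That one step alone manufactures an apparent warming of about 1.5 F per century at a station whose true climate never changed.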
Or let’s try a thought experiment, Lon: you’ve been commissioned by the IPCC to make a new thermometer for use around the world at climate measurement stations. As an electrical engineer, could you design an air-temperature thermometer that is:
– Linear to within 0.5% over a temperature range of -20F to 120F
– Accurate to within 0.1 degree over that same range
– Repeatable in the linearity and accuracy defined above for a period of 20 years, or even 10 years
– Identical within the specs above, so that if one fails, it can be immediately swapped with another one from parts stock with no worry about introducing bias
OK, there’s your challenge. Could you do it?