From Dr Roy Spencer’s Global Warming Blog
by Roy W. Spencer, Ph.D.
Summary
- Previous research has shown that the temperatures recorded at Death Valley National Park (DVNP) have curious warm biases on very hot days, possibly due to instrument deficiencies or to the sensor's proximity to mounting hardware and other manmade structures.
- Here it is shown, from 21 years of summertime (June, July, August) data, that DVNP has many more days on which its temperature is much higher than at the nearby Stovepipe Wells station than days on which Stovepipe Wells is the hotter of the two.
- These lines of evidence suggest that the hot summer daytime temperatures reported at Death Valley National Park have potentially large biases, and should only be used for their entertainment value.
In our continuing examination of the world-record hottest temperature of 134 deg. F recorded at Greenland Ranch (now the Death Valley National Park station) on 10 July 1913, we are finding some curious behavior in recent summertime temperatures there. (The Bulletin of the American Meteorological Society [BAMS] has accepted my proposal for a BAMS article presenting the evidence that the 134 deg. F world record was 8 to 10 deg. F higher than the temperature that actually occurred on that date.)
Previous Work on Excessively Hot Death Valley Temperatures
Weather forecaster and storm chaser Bill Reid has blogged extensively over the years on the evidence against the 134 deg. F world record. A good place to start is his most recent post (Part 6) that deals with the Greenland Ranch foreman who made the excessively hot temperature measurements in the first half of July 1913. Bill has agreed to co-author the BAMS paper with John Christy and me.
There was also an experiment carried out with a variety of temperature instrumentation placed next to the DVNP weather station during 2021 and 2022. This revealed that on the near-record hot day of 9 July 2021 (130 deg. F), the “official” DVNP sensor produced temperatures a few degrees hotter than the other instruments (AMS conference poster here). The photo in Fig. 1 shows that the older-style DVNP instrument (which is not aspirated) is mounted next to a lot of metal structure and a small solar panel.
Fig. 1 Death Valley National Park weather station, with additional instrumentation added by Dirk Baker (Campbell Scientific, Inc.) and co-investigators to compare to the ‘official’ temperature readings in 2021 and 2022. (Figure adapted from this AMS conference presentation).
The experimental setup in Fig. 1 used several temperature sensors, some with aspirated shields, others with no aspiration. The data shown in their AMS conference presentation suggests to me that the near-record 130 deg. F reading on 9 July 2021 was 2-3 deg. F too hot partly because of the non-aspirated design of the sensor. There was some additional warm bias that could have been due to all of the mounting structure seen in Fig. 1, including a small solar panel next to the DVNP station sensor.
More Evidence: DVNP vs. Stovepipe Wells Temperatures
For the last 21 years there have been two stations in Death Valley: the DVNP station next to the Furnace Creek Visitors Center, and a Climate Reference Network (CRN) station at Stovepipe Wells, 29 km northwest of the DVNP station.
Fig. 2 shows a comparison of the daily maximum temperatures (Tmax) recorded at these two stations for every day in June, July, and August in all years from 2004 through 2024.
Fig. 2. Comparison between daily high temperatures (Tmax) recorded at Stovepipe Wells and Death Valley National Park, for all days in June, July, and August for the years 2004 through 2024. The dashed red line represents the median difference between the 2 stations (2 deg. F, DVNP warmer than Stovepipe Wells). Gray lines connect the days in chronological order.
The median of the Tmax differences between these 2 stations is 2 deg. F (DVNP warmer, represented by the dashed red line), while the average difference is 2.3 deg. F. The expected difference based upon elevation alone is 1.3 deg. F (DVNP station is 278 ft lower in elevation than Stovepipe Wells).
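As a quick check on that elevation figure, here is a back-of-the-envelope sketch for the 278 ft height difference. The post does not say which lapse rate was assumed, so two common values (my assumption, not Spencer's) are shown.

```python
# Back-of-the-envelope elevation adjustment; the two lapse rates are assumptions,
# the 278 ft height difference is from the post.
dz_ft = 278  # DVNP station sits ~278 ft lower than Stovepipe Wells

for name, lapse_f_per_kft in [("standard atmosphere (6.5 K/km)", 3.57),
                              ("dry adiabatic (9.8 K/km)", 5.38)]:
    dT = dz_ft / 1000.0 * lapse_f_per_kft
    print(f"{name}: expected DVNP warm offset of about {dT:.1f} deg F")

# Prints roughly 1.0 and 1.5 deg F; the post's 1.3 deg F lies between the two.
```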
Note in Fig. 2 that there seem to be more outliers to the left of the dashed red line than to the right. That is, there are more days where DVNP is much warmer than Stovepipe Wells than there are days when Stovepipe Wells is much warmer than DVNP station.
This can be better seen if we look at a frequency distribution of these station differences, adjusted for the 2 deg. F median difference between stations (Fig. 3).
Fig. 3. Frequency distributions of the number of days on which one Death Valley station is hotter than the other, after shifting the distributions to account for the 2 deg. F median difference between the stations.
As shown in Fig. 3, DVNP station has many more days on which it is hotter than Stovepipe Wells than Stovepipe Wells has days on which it is hotter than DVNP station. For the 3-4 deg. F hotter category the ratio is 2x, for the 5-9 deg. F hotter category it is 3x, and for 10 deg. F or greater it is 7.8x.
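For readers who want to reproduce this kind of tally from the two stations' daily records, here is a minimal sketch. The file name and column names are hypothetical placeholders, not the actual data sources behind Figs. 2 and 3.

```python
import pandas as pd

# Hypothetical input: one row per JJA day (2004-2024) with each station's Tmax in deg F.
df = pd.read_csv("death_valley_jja_tmax.csv")        # placeholder file name
diff = df["tmax_dvnp"] - df["tmax_stovepipe"]        # DVNP minus Stovepipe Wells

print("median difference:", diff.median())           # ~2 deg F in the post
print("mean difference:  ", diff.mean())             # ~2.3 deg F in the post

adj = diff - diff.median()                           # remove the median station offset

# Count how often each station is the much-hotter one, mirroring the Fig. 3 categories.
for lo, hi, label in [(3, 5, "3-4 deg F"), (5, 10, "5-9 deg F"), (10, 1e9, ">=10 deg F")]:
    dvnp_hotter = ((adj >= lo) & (adj < hi)).sum()
    spw_hotter = ((-adj >= lo) & (-adj < hi)).sum()
    print(f"{label}: DVNP hotter on {dvnp_hotter} days, Stovepipe Wells hotter on {spw_hotter} days")
```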
This suggests there is something wrong with the Death Valley National Park instrumentation itself or the immediate environment around the temperature sensor that causes some days to be biased too hot. Bill Reid, who has researched this issue extensively, suspects that days with low wind have excessive heat build-up at the DVNP thermometer site, both in the general area around the instrumentation, and due to the non-aspirated design of the temperature sensor used there.
The difference in exposure at DVNP station and Stovepipe Wells is shown in Fig. 4.
Fig. 4. Google Earth imagery of Stovepipe Wells station (top) and Death Valley N.P. station (bottom), stations circled in red. The inset photo at top is of the Stovepipe Wells Climate Reference Network station, courtesy of William T. Reid. The E-W distance across these images is just over 0.5 km.
As can be seen in Fig. 4, the Death Valley N.P. station has quite a bit of development surrounding it, with parking lots, a paved campground, the Visitors Center, solar panels (black), and trees just to the south. The Stovepipe Wells site has almost no development and no vegetation. It is possible that, under the prevailing southerly flow of summer, the structures and trees to the south of the DVNP station lead to stagnation of air flow around the temperature sensor.
Conclusions
The evidence presented here, along with evidence presented previously by Bill Reid, Dirk Baker, and others, suggests that Death Valley National Park temperatures should not be relied upon for accurate daytime readings, and that near-record temperatures there are biased too high. The reasons for the biases are not obvious, but the evidence suggests poor sensor ventilation during the daytime, when various structures in the vicinity heat up: the shield of the sensor itself, its supporting structure, or manmade objects around the station site. It is also possible that the trees and other structures to the south of the station restrict air flow, further reducing effective convective heat transport away from the solar-heated desert surface.
It is my opinion that “official” Death Valley temperatures should use the Stovepipe Wells site data, which come from state-of-the-art Climate Reference Network instrumentation. The traditional site near the Death Valley National Park Visitors Center should only be used for entertainment purposes.
Maybe the National Park Service should investigate adding a CRN station; a good location would be about 1.6 km southwest of the current station, well away from the Furnace Creek tourist area.
How reliable are the Stovepipe Wells data?
PROBITY and PROVENANCE are hugely problematic for climate “data”.
CACA spewers would love to see that record expunged. Then Furnace Creek’s 130 F in August 2020 would replace it, keeping the narrative alive.
But just to make sure, the NPS in 2021 put up a dubiously sited new automated WX station in Death Valley:
https://www.nps.gov/deva/learn/news/badwater-weather-station.htm
I have questions. How long has the DVNP device been at the Furnace Creek location? Do other official sites use the DVNP device? Being 28 kilometers apart, wouldn't there be more than elevation making a difference? Why would trees interrupting the wind pattern make the DVNP reading unreliable?
Shown in the graphic (see below) are plots of temperatures from the weather station at Furnace Creek from 1922 to 2001. In 1922 the concentration of CO2 was 302 ppmv (0.595 g CO2/cu. m. of air), and by 2001 it had increased to 371 ppmv (0.729 g CO2/cu. m. of air), but there was no corresponding increase in the temperature of the dry air. The reason is quite simple: there is too little CO2 in the air.
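For anyone checking the gram-per-cubic-metre figures above: they follow from the ideal-gas molar volume. A minimal sketch, assuming dry air at standard temperature and pressure (0 deg C, 1 atm), reproduces the quoted values to within rounding.

```python
# Convert a CO2 mixing ratio in ppmv to grams of CO2 per cubic metre of air,
# assuming ideal-gas behavior at STP (0 deg C, 1 atm).
M_CO2 = 44.01      # g/mol, molar mass of CO2
V_MOLAR = 0.0224   # m^3/mol, molar volume of an ideal gas at STP (22.4 L/mol)

for ppmv in (302, 371, 422):
    grams_per_m3 = ppmv * 1e-6 * M_CO2 / V_MOLAR
    print(f"{ppmv} ppmv -> {grams_per_m3:.3f} g CO2 per cubic metre of air")

# Prints approximately 0.593, 0.729, and 0.829 g per cubic metre, close to the
# figures quoted in this thread (small differences are rounding).
```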
The empirical temperature data from this remote arid desert falsifies the claim by the IPCC that CO2 causes warming of air and, by extension, "global warming". We now know from the recent UN COP29 conference in Baku what all this rhetoric since 1988 about greenhouse gas emissions, global warming, and climate change is really about: a scheme by the UN to redistribute funds from the rich donor countries, via the UNFCCC and the UN COP, to the poor countries to help them cope with global warming and climate change. The poor countries are now clamoring for trillions of dollars. President-elect Trump will most certainly terminate funding to these UN organizations.
NB: The graphic of the Furnace Creek temperatures was obtained from the late John Daly's website "Still Waiting For Greenhouse", available at http://www.John-Daly.com. From the home page, scroll down to the end and click on the bar "Station Temperature Data". On the World Map, click on NA, scroll down and click on Pacific, and finally scroll down and click on Death Valley. John Daly found many weather stations that showed no warming to ca. 2002. You should check out some of these.
Those are fake data from a fake-science crackpot named John Daly, who did not even have a college degree. Nor did he ever study science.
Daly blamed global warming on sunspot trends and El Niños, a true science-denying Nutter.
Fellow Australian El Nino Nutter and WUWT Court Jester, BeNasty2000, would love John Daly.
Daly denied that the average sea level is rising, on the basis of the "Isle of the Dead" mean sea level benchmark.
BBC NEWS | Science/Nature | Mark of hot dispute
Henry Ford only had a sixth grade education. Thomas Edison was home-schooled and self-educated. Both were accomplished engineers and/or scientists. Your dependence on education for judging people is entirely wrong.
Isaac Newton had a BA and an MA in arts and education. He was self-taught in physics and other fields.
John Daly obtained the temperature data for his studies from the GISS database. The GISS database now has temperature data for Death Valley that starts in 1910.
FYI, at the MLO in Hawaii, the concentration of CO2 in dry air is 422 ppmv. One cubic meter of this air contains only 0.829 g of CO2 and has a total mass of 1.29 kg.
This small amount of CO2 can heat up such a large amount of air by only a very small amount, if at all.
BTW, Did you read Kauffman’s essay: “Climate Change Reexamined”?
Bill Gates: college dropout
Walt Disney: no college
Wright Brothers: high school dropouts
Yet without a college degree he is more correct than you are. Quit belittling people. Bill Gates, Walt Disney, and the Wright Brothers didn’t have college degrees either.
In general, I agree that the NP site is now contaminated by development issues (your picture just barely shows the start of the massive irrigated golf course green space immediately S-SE). However, you insinuate that altitude alone is the determinant of temperature ("The expected difference based upon elevation alone is 1.3 deg."). How can you support this assertion? I have spent a good portion of my life in the southwestern deserts, Mojave and Sonoran, and find it very difficult to accept that argument. Terrain characteristics such as arroyos, rock outcroppings, sand flats, and creosote/mesquite growth are all major factors in specific site temperature. In my experience with my backpack thermometer, temperatures can vary by 5 or more degrees quite rapidly.
As you note, wind variations can significantly affect temperature in this region, and not just by cooling. Wind moving through a heated rock formation will carry that heat. Sit downwind of a solid rock formation or a rocky arroyo/canyon on a 110-degree day and you will understand.
If your goal is to get a temperature site that is free of human interference, I completely support your argument. However, do not attempt to correct/adjust/equate the new temperature site to the previous one. Start from scratch and wait 30-50 years to see the trends that develop. Stovepipe Wells and Furnace Creek are very different locations.
I have long advocated for the “global average temperature” being a joke because it makes no adjustments to readings based on terrain, geography, humidity, wind, etc. It’s why the temperatures in San Diego and Ramona, CA can be so different – totally different environments. Yet climate science just blissfully averages them together. They then justify it by claiming temperature is a continuous gradient between locations and the average is a valid indicator for an area.
Go back and examine the original 1909 document.
1. You'll see the meticulous handwriting of a meticulous person.
2. You'll see that on the evening of the 9th the overnight low was very high. I surmise that high cloud cover blew in, blocking long-wave radiation and preventing nighttime cooling.
Fred Corkill, the ranch superintendent, filled out the forms, but he was not the one who made the measurements… ranch foreman Oscar Denton did. Corkill was usually not at the ranch, so we assume Denton was alone when he recorded the temperatures.
This is the fourth Spencer article about TMAX in Death Valley. These are weather reports, not 30-year-or-longer climate trends. Why is this important enough for even one article? Because a hottest-ever claim in 1913 may be overstated? So what?
I have read that Spencer wants to retire and there is already no funding for UAH calculations. That suggests the need for a series of articles on UAH data: Why they are important and should be funded. Comparisons of UAH and NASA GISS. Comparisons of UAH and RSS.
This article is about how two weather stations 28km apart do not have the same summer TMAX readings. Exactly what was expected. How does that analysis help refute the false claims of a coming global warming crisis … whose symptoms would mainly be warmer winters in colder states and nations, with TMIN increasing more than TMAX?
And yet climate science blissfully averages the temperatures from two locations that are a mere 28km apart but which are quite different terrains and geographies. Meaning that average is useless for anything! Including the global average temperature.
Personally, I don’t care how hot it is in Death Valley- if hotter than ever, so what. All I care about is that it’s about 10 deg F right now in Wokeachusetts. I’d love for it to be a few degrees warmer.
“The traditional site near the Death Valley National Park Visitors Center should only be used for entertainment purposes.”
I am adapting this idea to the overall climate issue:
The modeled scenarios for the IPCC CMIP3-5-6-soon7 exercises should only be used for entertainment purposes.
There.
My take from this article is that the various housings for thermometers can have a large effect on temperature readings. Non-aspirated housings can have a large systematic uncertainty because they depend on wind alone for ventilation (kind of like windmills). All housings can have a systematic uncertainty due to lack of maintenance, when airflow openings are not kept clear of things like spider webs, wasp nests, ice in winter, etc. Even CRN stations can suffer from these uncertainties, although CRN stations do monitor fan speeds so that proper ventilation can be assured.
Even a 2°F systematic uncertainty can pretty much make metrics like the growth in land global average temperature highly suspect. This systematic uncertainty is something that Dr. Pat Frank has been asserting for quite some time in his papers. Yet climate scientists and AGW advocates just keep on asserting that nothing is wrong with the data and its processing!
Older-style weather stations will trend slightly upward as the Stevenson screen paint oxidizes, its emissivity changes, and airflow is reduced by spider webs, moth wings, dust, dirt, and tree fuzz. Not all stations, just on average. Add in the psychological desire of technicians to recalibrate instruments so that they read the same after a maintenance effort. Today we are replacing these with aspirated stations using RTDs, which are more responsive to short-term changes of sunshine in scattered-cloud conditions, so an upward trend in recorded temperatures will continue for a couple of decades due to this station updating alone, completely aside from UHI effects.
If Kansas can reach 121 degrees, DV likely could have reached the recorded temp.
https://www.plantmaps.com/kansas-record-high-and-low-temperature-map.php
"The science is not showing the results that I want the science to show me, so we have to throw away the old science in favor of the science that currently shows the results I want" seems to be the most recent trend here at Watts Up With That.
The equipment being used at Death Valley is the same equipment that has been in use for how long? But because someone out there wants to knock down the world record holder, they need to change things!
I am all for improving things, set up a secondary nearby station with the new and improved equipment and run it side by side. But, for the purposes of trends since inception, the old equipment has to remain for continuity, at least for a reasonable period of time, maybe a decade, although I am told it takes 30 years to set a climate trend, so that seems closer to what should happen. If after 10 to 30 years the new equipment proves out, then ditch the old.
Yes!
Dr Spencer is probably under extreme pressure to toe the CAGW line.
The issue is not to “knock down” anything.
One issue is recognizing that all temperature readings have an uncertainty associated with them. Various devices and locations have different uncertainties.
Another issue is using measurements recorded as integers in a mathematical process that assumes one can achieve resolutions of one-thousandth of a degree. This violates every rule of measurement in other scientific endeavors.
Ultimately, one should never see a measurement that doesn’t also include an uncertainty value.
The 134°F reading should be stated as at least 134 ±2°F. Please note, this interval speaks loudly to the fact that deciding what the actual temperature was is nothing more than a pure guess. You can say it could have been 136 and I'll respond that it could have been 132. Neither will be more correct than the other!
Climate science has made a science of ignoring the fact that all measurements are nothing more than estimates. It is why NIST and ISO have moved to expressing uncertainty as an interval.
Or even 134 -1, +2°F.
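To illustrate where an interval like ±2°F can come from, here is a minimal GUM-style sketch; the 1°F sensor standard uncertainty is an assumed placeholder, not a documented figure for the Greenland Ranch instrument.

```python
import math

# Minimal GUM-style combination of two uncertainty components (deg F).
u_quantization = 1.0 / (2 * math.sqrt(3))  # reading recorded to the nearest whole degree
u_sensor = 1.0                             # ASSUMED instrument standard uncertainty (placeholder)

u_combined = math.sqrt(u_quantization**2 + u_sensor**2)
U_expanded = 2 * u_combined                # coverage factor k = 2 (~95% coverage)

print(f"combined standard uncertainty: {u_combined:.2f} deg F")
print(f"expanded uncertainty (k = 2):  +/-{U_expanded:.1f} deg F")  # roughly +/-2 deg F
```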
Why is this ONE station getting so much attention? It covers a small area, but it gets unprecedented coverage here. The reason is because the record high temperature it recorded in the past, in an extremely low humidity climate, makes the Catastrophic Global Warming narrative hard to maintain.
Yes, they are aiming to take out the past temperature readings of this station. The first 3 posts I read recently on this location were all about saying that the record high temperature was impossible and had to have been fabricated, and thus should be removed from the official record. Now this one. There is an agenda behind the attacks.
“But, for the purposes of trends since inception, the old equipment has to remain for continuity,”
Not even 30 years is long enough to track climate change. It should be more like 1000 years. There are far too many cyclical processes with various periods and changing causes/magnitudes, resulting in a climate that varies over long periods of time. The claim that "we need continuity" is nothing more than a red herring argument: you do *NOT* need continuity of records. When you have a multiplicity of data sources, dropping a few or adding a few should not significantly affect the overall average. If you want to know what is happening now, then even a two-year or five-year record is more than sufficient.
The question of "does what happened 100 years ago affect what is happening today" is what needs to be answered. It seems to me that it is pretty obvious that it doesn't. The 1930s had maximum temperatures as high as or higher than today's. The "global average temp" that uses daily mid-point temps can't even tell you whether it is higher Tmax temps or higher Tmin temps driving the "gat" up. Knowing that is absolutely necessary in order to know what is happening to climate, and also to judge how applicable current climate theories are.
The current climate models aren’t as accurate as the Farmers Almanac when it comes to projecting next year’s weather – how accurate can they be over 100 years when they can’t even get the primary baseline contributor of “next year” projection right?
Death Valley Summer TMAX Temperatures
1913 Too Darn Hot
1933 Too Darn Hot
1953 Too Darn Hot
1973 Too Darn Hot
1993 Too Darn Hot
2013 Too Darn Hot
2023 Too Darn Hot
Too Darn Hot
In such a case of comparison, i.e., the reading of a global temperature record, the data errors must be rigorously analyzed, with systematic and random errors treated additively. There are huge systematic errors in siting and in the instrumentation used, to a degree that the readings cannot be compared. And the data must be read at the same time during the day, also assuming same cloud coverage and wind velocity. Further, if reflective surrounds, ground slope, and height above ground are substantially different, the readings have a very high random error, since the radiative heat and air-chimney effects reaching the instruments are so different. Plus, when instrument make, relative arrangement, and shielding are not virtually identical, the systematic errors pile up. Certainly the altitude gradient of 3.56 deg F per 1000 feet must be added in. I bet the error bandwidth between the two sites is at least 6 degrees F.
Note that here we use modern instrumentation, presumably with high accuracy. Now imagine how we reference 150-year-old readings, which would have had to be measured with bubble thermometers to better than 0.1 degrees to be reasonably stated as accurate to 0.3 degrees. Parallax reading error alone is much greater than 0.1 degrees (remember, folks claim a global rise of 1.356 degrees C over 140 years). So, keep in mind the big unknowns of long ago when someone rings the warming alarm bell, claiming to "know it all".
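As an illustration of the bookkeeping described above (systematic errors summed linearly, independent random errors combined in quadrature), here is a minimal sketch; every number in it is a hypothetical placeholder, not an actual error budget for these two sites.

```python
import math

# Hypothetical error terms in deg F -- placeholders only, chosen to show the
# mechanics of an error budget, not measured values for DVNP or Stovepipe Wells.
systematic_terms = {"siting/exposure": 1.5, "shield/aspiration": 1.0, "surface reflectivity": 0.5}
random_terms = {"instrument noise": 0.3, "reading/quantization": 0.3, "timing of Tmax": 0.5}

bias_bound = sum(systematic_terms.values())                      # worst case: biases add linearly
u_random = math.sqrt(sum(v**2 for v in random_terms.values()))   # independent random errors add in quadrature

print(f"worst-case systematic bias bound: {bias_bound:.1f} deg F")
print(f"combined random standard uncertainty: {u_random:.2f} deg F")
```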
“And the data must be read at the same time during the day, also assuming same cloud coverage and wind velocity.”
I’ve long advocated that all temps should be read at 0000GMT and 1200GMT. That would form a metric that *might* be useful – but it would still be subject to all of the systematic bias and random error you mention, plus all the others you didn’t.
I remembered an old WUWT post about siting issues in Death Valley.
Not sure if this applies directly, but here it is.
https://wattsupwiththat.com/2013/06/30/it-seems-noaa-has-de-modernized-the-official-death-valley-station-to-use-older-equipment-to-make-a-record-more-likely/
Relevant to this discussion is [McKay 2024], whose 20-year experiment corroborates the extreme high temperatures reported by the official station at Furnace Creek.
From the paper.
“Connecting” the data in order to CREATE a long record is a questionable objective without a very detailed uncertainty analysis that shows a constant uncertainty budget of the Min/Max thermometer.
I see NO uncertainty budget that quantifies the values of uncertainty for any of the thermometers. There is a brief comment that says:
It should be noted that this exceeds the uncertainty of CRN stations (±0.3°C) by a factor of three. CRN stations are quite sophisticated and one must wonder if the ±0.1°C is supportable. It seems to be picked out of the air.