How not to measure temperature (or climate) #97 – California's warming air temperatures are related to population and site bias

A couple of days ago, I highlighted a worst of the worst NOAA climate monitoring station in Arizona with the help of a scientist from the University of Washington.

My friend Jim Goodridge, former California State Climatologist, continues to be busy in his retirement, and sends this along today. He's been tracking a group of weather stations in California for over 20 years. In fact, it was Jim who first gave me the light-bulb moment when I realized that global warming wasn't really all it was cracked up to be, with this short publication in the Bulletin of the American Meteorological Society in 1996.

I guess you could say it was the graph that launched a thousand blog posts, because as we all know, CO2 can’t heat differently based on county population.

goodridge_1996_CA-UHI_county

So with that in mind, have a look at his current analysis:

CA-100year-air-temperature-trend
Note that the larger the red dot, the larger the positive trend; the larger the dark blue dot, the larger the negative trend.

Most notable are the red dots, which cluster in the San Francisco Bay Area and the Los Angeles basin. You can also make out the I-80 corridor from San Francisco to Reno, traced in red.

Jim's Excel spreadsheet is here; you can play with the data yourself: CA-Temp-map-100-Years

Of the stations showing the greatest 100-year warming rate, the one station right on the border of Arizona and California, Parker 6NE, has the greatest of all, at 0.0625 °F per year, as this screen cap from his worksheet shows:

parker6NE-table-100year-trends

That's a whopping 6.25 °F (3.47 °C) per century! That's a bigger rate than some of the climate model predictions. Wow, the greenhouse effect must surely have gone into overdrive in Parker, right?
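The arithmetic behind those figures is easy to check. A quick sketch, taking the 0.0625 value from the worksheet as degrees Fahrenheit per year:

```python
# Convert a per-year warming rate into per-century rates in °F and °C.
def per_century(rate_f_per_year):
    f_per_century = rate_f_per_year * 100      # °F over 100 years
    c_per_century = f_per_century * 5.0 / 9.0  # a °F *difference* scales by 5/9 to °C
    return f_per_century, c_per_century

f100, c100 = per_century(0.0625)
print(f"{f100:.2f} °F/century = {c100:.2f} °C/century")  # 6.25 °F/century = 3.47 °C/century
```

Note the 5/9 factor, not the 32-offset formula: we are converting a temperature difference, not a temperature reading.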

I decided to have a look at the Parker 6NE station, and started with the B91 forms of original data. Boy, was I surprised:

Parker6NE-Dec2015-B91form Parker6NE-Nov2015-B91form

Source: http://www.ncdc.noaa.gov/IPS/coop/coop.html?_page=2&state=AZ&foreign=false&stationID=026250&_target3=Next+%3E

Look at all that missing data, which I've marked in yellow: 10 days in November 2015 and 16 days in December 2015. Of course NOAA/NCEI "corrects" this by infilling it with data from surrounding stations so that no station record is incomplete in their database. In the case of December 2015, over 50% of the readings aren't actually real data from the station in Parker; they are "fabricated" from other data using NOAA/NCEI's special FILNET sauce. No worries, all's fair in love and climate science, right?
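For readers curious what infilling looks like in principle, here is a deliberately simplified sketch. This is NOT NOAA's actual FILNET algorithm; it just illustrates the general neighbor-anomaly idea with made-up numbers: a missing daily value at the target station is estimated as the target's own mean plus the average anomaly of nearby stations on that day.

```python
# Toy neighbor-based infilling (illustrative only, not NOAA's FILNET).
def infill(target, neighbors):
    """target: daily temps with None for missing days.
    neighbors: complete daily temp lists from nearby stations."""
    known = [t for t in target if t is not None]
    target_mean = sum(known) / len(known)
    neighbor_means = [sum(n) / len(n) for n in neighbors]
    filled = []
    for day, t in enumerate(target):
        if t is not None:
            filled.append(t)
        else:
            # mean anomaly of the neighbors on this day, transplanted to the target
            anom = sum(n[day] - m for n, m in zip(neighbors, neighbor_means)) / len(neighbors)
            filled.append(target_mean + anom)
    return filled

target = [50.0, None, 54.0, None, 52.0]
neighbors = [[48.0, 49.0, 52.0, 53.0, 50.0],
             [51.0, 52.0, 55.0, 56.0, 53.0]]
print(infill(target, neighbors))
```

The point of the toy is the dependency it creates: the "filled" values inherit whatever biases the neighbor stations carry.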

NASA GISS keeps a plot of Parker 6NE data, and it seems missing data has been a hallmark of this station for quite some time. Notice all the gaps:

Parker6NE-nasa-giss-plot

Source: http://data.giss.nasa.gov/cgi-bin/gistemp/show_station.cgi?id=425000262500&dt=1&ds=14

With that many gaps in annual data, you’d think this station might not be suitable for climate science use, much less categorized as a “best of the best” USHCN station, right? No worries, all’s fair in love and climate science.

Steve Goddard had a look at Parker 6NE a few months ago, and plotted the infilled data from NOAA NCEI:

Parker6NE-Tmin-plot Parker6NE-Tmax-plot

Parker6NE-Tmax-Tmin-divergence-plot

It seems pretty clear that the majority of the warming trend comes from the minimum temperature, which has a sharply higher trend than the daytime maximum temperature. This mirrors the temperature trend of nearby Las Vegas, NV, which has had explosive growth. The UHI signal in the nighttime Tmin is very clear:

LasVegas_average_temps

It turns out that most of that trend is in overnight temperatures, which are most affected by the explosive infrastructure growth of Las Vegas and the resultant UHI:

LasVegas_lows

Inconveniently, there is no upward trend in maximum temperatures; in fact, it appears there has been a slight downward trend since the late 1930s and early 1940s:

LasVegas_highs
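The Tmin/Tmax divergence shown in these plots is just the difference between two linear trends. A minimal sketch with synthetic data (a flat Tmax and a drifting Tmin standing in for a nighttime UHI signal; these are not the actual Parker or Las Vegas records) shows the computation:

```python
# Fit separate ordinary-least-squares trends to Tmin and Tmax series.
def ols_slope(years, temps):
    n = len(years)
    mx = sum(years) / n
    my = sum(temps) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, temps))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

years = list(range(1916, 2016))
tmax = [105.0 for _ in years]                     # flat daytime highs
tmin = [70.0 + 0.05 * (y - 1916) for y in years]  # +0.05 °F/yr nighttime drift

print(ols_slope(years, tmin))  # ~0.05
print(ols_slope(years, tmax))  # ~0.0
```

A rising Tmin with a flat (or falling) Tmax is the signature to look for: greenhouse forcing alone would not be expected to produce it at a single site while neighbors show nothing similar.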

So surely, Parker 6NE must have had similar explosive growth contributing to UHI, making the Tmin trend grow large, right?

Nope. It's a siting issue. According to the B91 form, the Parker 6NE USHCN climate monitoring station is located at radio station KLPZ in Parker. It is a volunteer observing site, which goes some way toward explaining why NOAA gets what it pays for: 16 days of missing data in December 2015.

A cursory look at the station in Google Earth shows the problem. Can you spot the official climate monitoring temperature sensor in this aerial view?

Parker6NE-KLPZ-radio-aerial-view

I couldn’t either. But thanks to Google Earth street view, I found it. You may have to click the images to see better. Annotations are mine.

Parker6NE-KLPZ-radio-street-view2 Parker6NE-KLPZ-radio-street-view

The junk piles and junk cars are certainly a nice touch for NOAA's official climate observing station, don't you think? Note also the big "swamp cooler" on the roof of the radio station, about 20 feet to the right of the MMTS temperature sensor. That will put extra humidity into the nearby air, which will contribute to local warming via moist enthalpy, in addition to the heat-sink effects of the junk, cars, and nearby building. Those who live in the Deep South understand how a humid summer night can stay at 80 degrees for a Tmin, while out in the Arizona desert, away from swamp-cooler A/C units, the temperature can fall to 50 degrees at the same latitude on the same day, with an even higher Tmax.

And then there’s the nearby tree, which we know will limit LWIR going from the ground to the upper atmosphere at night, keeping the air near the ground warmer than it normally would be. That’s a factor too.

But I think the biggest factor is the solid metal fence that surrounds the compound, which can be clearly seen in the aerial view. Then there's the building itself to the south. Together they essentially cut off the temperature sensor from wind flow near the ground in any direction, and as we know from basic meteorology, windless nights are the biggest problem for UHI. In this case, thanks to the fence, all nights are less windy at the sensor than they would otherwise be, resulting in less mixing of boundary-layer air and warmer temperatures at night. This site mimics a big-city UHI effect due to the factors I've noted.

But NOAA says they can “fix” garbage temperature station data like this.

If it were up to me, I'd remove this station from all climate databases rather than try to fix this hodgepodge of inaccurate and highly biased data. But NOAA and their fanboys prefer to keep junk data like this.

This is why I’ve said before and will continue to say:

“The majority of weather stations used by NOAA to detect the climate-change temperature signal have been compromised by encroachment of artificial surfaces like concrete and asphalt, and heat sources like air conditioner exhausts. This study demonstrates conclusively that this issue affects temperature trend and that NOAA's methods are not correcting for this problem, resulting in an inflated temperature trend. It suggests that the trend for U.S. temperature will need to be corrected.” He [Watts] added: “We also see evidence of this same sort of siting problem around the world at many other official weather stations, suggesting that the same upward bias on trend also manifests itself in the global temperature record.”

“Our viewpoint is that trying to retain stations with dodgy records and adjusting the data is a pointless exercise. We chose simply to locate all the stations that DON'T need any adjustments and use those, thereby sidestepping that highly argumentative problem completely. Fortunately, there were enough in the USHCN: 410 out of 1218.”

But NOAA keeps these garbage climate stations anyway. No worries, all's fair in love and climate science.

ADDENDUM: I hope Anthony won't mind if I add this. I took Jim Goodridge's Excel data from above, added county population density data, and that gave me the following graph:

Temperature Trend vs. Log Population Density
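The graph is an ordinary least-squares fit of station trend against log10 of county population density. Here is a minimal sketch of that fit; the density and trend numbers below are hypothetical placeholders, as the real values live in the linked spreadsheet:

```python
import math

# OLS fit of temperature trend vs. log10(population density).
def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b  # trend ≈ a + b * log10(density)

density = [1, 10, 100, 1000, 10000]  # people per square mile (hypothetical)
trend = [0.2, 0.7, 1.4, 1.9, 2.6]    # °F/century (hypothetical)
a, b = ols([math.log10(d) for d in density], trend)
print(f"slope: {b:.2f} °F/century per factor-of-10 increase in density")
```

The slope `b` is the quantity of interest: a nonzero slope means the measured trend grows with how urbanized the county is, which CO2 cannot explain.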

Best to everyone,

w.

116 Comments
Just some guy
February 20, 2016 2:26 am

It's really too bad the satellite record only goes back to '79. I consider myself a "lukewarmer". But honestly, I'm not entirely convinced the "true" global temperature today is any warmer than it was in the 1930s. Without trustworthy data (or trustworthy keepers of said data), the only thing we can say for sure is that the ground-based "record" is worthless.

Science or Fiction
Reply to  Just some guy
February 20, 2016 2:44 am

Worthless and messed up beyond recognition.

knr
February 20, 2016 2:39 am

Science 101 tells us that if you can't accurately measure something, you cannot know it; you can only guess at it.
The good news is that this is not, and never has been, 'science'.
What we see here is a classic legacy issue. In the old days before 'settled science', we accepted that these problems existed; given the problematic nature of weather prediction, they were not considered to be a show-stopper.
And then came climate 'science', with its need for dramatic claims of unquestionable accuracy, where, as part of its religious nature, doubt, let alone the critical analysis normally seen in science, was forbidden. Only, all of these types of problems had not been solved; they had merely been pushed aside or 'forgotten'.

February 20, 2016 3:00 am

How not to measure temperature (or climate) #97

Considering the Surface Stations project, I wager we are well past how not to measure temperature #97.
I recall when the government said it was going to spend a fortune of the poor people's money (taxes, that is) to put up satellites to measure the global temperature. I was under the impression that these satellites were going to provide good coverage of the whole of the earth, even over the oceans, where we have no thermometers at all. (At least I don't think we have a set of floating air-temperature devices covering the whole ocean.)
I had hoped that the system would expand until we could see the temperatures from the surface to the top of the atmosphere, or that at least that would be the goal, even if it is a bit of an over-reach for now. We have spent trillions of dollars on "climate research" fakery, so we could have spent the money to get near-total coverage from space.
But something went wrong. The climate boys discovered that the temperatures measured from space did not fit the "CO2 will destroy us" mythology, and so NASA and others decided that the satellite system was no longer a priority. Heck, it was hard to "adjust" the space readings on a daily basis to cool the '30s and heat the present. What good are they if they don't support the narrative? (See "Wag the Dog" for a wonderful speech on that issue; one of the best scenes in movie history.)
To a common man of the '50s or before, the fact that "the books had to be cooked" tells you that there is something very fishy in the reports purporting to give you the "bottom line". But in this degraded time of low logical skills, the "climate boys" think that confirmation bias is a good thing. Oh my.

emsnews
February 20, 2016 3:13 am

I was raised in California and Arizona, as were my parents, grandparents, and great-grandparents; in the case of California, the family goes back to 1849. There have been several cool/hot cycles during this time, which we all talked about over the last 150 years. My grandfather loved to tell how they all nearly starved during a very cold cycle in the 19th century: as his mother cooked their last food, some dried plums, into a "pudding", she caught her toe in a knothole in the floor and tripped, and it fell and spilled, so they ate it all off the floor, which greatly impressed him as a very small child.
I remember the late 1960s to the early 1970s, when we all talked about a new Ice Age coming because it was very cold and wet, even in Tucson, of all places! I remember buying up all the furs we could find at Value Village to make blankets out of so we wouldn't freeze at night, since our pre-statehood homes in downtown Tucson had no central heating. Brrrr. It was really cold at night!
Wiping out the past is what 'climatologists' have been doing for several years now, and it is very irritating to watch this travesty in science.
By the way, this February in upstate NY feels like Antarctica, with lots of wind and chill. Highs will be below freezing this coming week, when we should be warming up, not getting colder. It is much colder than December, and this lopsided weather is a sign of climate change... toward colder weather in general.

Marcus
Reply to  emsnews
February 20, 2016 4:41 am

…If Canada gets any colder, it will no longer be known as ” The Great White North “, it will suddenly be known as ” The Great Northern Popsicle ” and Canadians will start flooding across the border by the millions ! Lucky for me I’m half American !

Reply to  Marcus
February 20, 2016 11:52 am

How will you get over “the wall” Marcus? /sarc off 😉

Analitik
February 20, 2016 3:19 am

Has the Parker site changed much, to cause the minimum temps to rise like that? If it was always like that, shouldn't the minimum temps follow the same pattern of variation as the maximums?
Yes, it's a BS site, but has the site gotten worse over time, besides the tree growing larger? (e.g. originally just the building, then the fence added, then the cooler added to the building, then the junk cars added...)

Reply to  Analitik
February 20, 2016 11:57 am

Analitik: “shouldn’t the minimum temps follow the same pattern of variation as the maximums?”
No. The minimums will vary less, due to many factors, and will be "smoothed", per the explanation Anthony gave, along with other factors. UHI is likely one of the reasons that the diurnal temperature range is smaller now than at a pristine site. It's anthropogenic, but not CO2.

Editor
February 20, 2016 3:33 am

I would like to add that the highest temperature ever recorded in the UK was recorded in July in SE England. The Met Office thought all their birthdays had come at once when this was announced. Fortunately, common sense prevailed when it was pointed out that the only weather station to measure this record-breaking temperature was sited next to a runway at Heathrow Airport!

knr
Reply to  andrewmharding
February 20, 2016 4:59 am

Although that information had to be dragged out of them, and of course the 'headlines' over this claim had come and gone by then. Even today the Met Office still makes much of this 'record'.

February 20, 2016 3:50 am

I would suggest making a (multivariate) regression of temperature on both CO2 and population as independent variables. Then we would probably see that population growth contributes significantly to the R-squared. Your first graph hints at this graphically. Otherwise the observed correlation of temperature with population could be dismissed as spurious.
Note that the frequency of apparitions of Mary in the period 1531-2010 is also correlated with global temperature. ("Seeing Mary", National Geographic Magazine, December 2015, pp. 40-41)
M.H.Nederlof
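The two-predictor regression suggested above can be sketched with hand-rolled normal equations; the CO2, population, and temperature numbers below are entirely hypothetical, purely to show the mechanics:

```python
# Two-predictor OLS (temperature on CO2 and population) via the normal
# equations on centered variables.
def ols2(x1, x2, y):
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    c1 = [a - m1 for a in x1]
    c2 = [a - m2 for a in x2]
    cy = [a - my for a in y]
    s11 = sum(a * a for a in c1)
    s22 = sum(a * a for a in c2)
    s12 = sum(a * b for a, b in zip(c1, c2))
    s1y = sum(a * b for a, b in zip(c1, cy))
    s2y = sum(a * b for a, b in zip(c2, cy))
    det = s11 * s22 - s12 * s12
    b1 = (s22 * s1y - s12 * s2y) / det   # coefficient on CO2
    b2 = (s11 * s2y - s12 * s1y) / det   # coefficient on population
    b0 = my - b1 * m1 - b2 * m2
    return b0, b1, b2

co2 = [310, 320, 340, 370, 400]      # ppm (hypothetical)
pop = [1.0, 1.5, 2.5, 4.0, 6.0]      # millions (hypothetical)
temp = [0.1, 0.25, 0.55, 0.95, 1.4]  # °C anomaly (hypothetical)
print(ols2(co2, pop, temp))
```

The caveat the commenter hints at applies here too: when CO2 and population both rise over time, they are strongly collinear, so the split between `b1` and `b2` is only trustworthy if the predictors vary somewhat independently.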

emsnews
February 20, 2016 4:04 am

A history of airport thermometers: when prop planes flew, they didn't heat up much of anything, and in the very old days airports were grass. Then the runways were paved and it got hotter; then jets and lots more asphalt and highways were added, making airports very big heat islands. Using them for temperature records over many years is insane.

Bruce Cobb
February 20, 2016 4:43 am

One would almost think they weren’t interested in the actual temperature trend.

Gregg C.
Reply to  Bruce Cobb
March 3, 2016 7:40 am

Actually, they weren't interested in trends or the climate at all when setting up these weather stations. Many stations are at airports because aviation is very interested in the local weather, and a few degrees extra from UHI doesn't matter at all. Wind speed and precipitation are a lot more interesting as far as a pilot is concerned, and perhaps whether it is below freezing.

Wim Röst
February 20, 2016 5:35 am

The effect of urbanisation bias is excluded in the following research by Willie Soon, Ronan Connolly, and Michael Connolly. After doing so, the new record is consistent with other important estimates. See below.
WR: my conclusion: using biased surface temperatures makes it IMPOSSIBLE to do good science on climate. You cannot compare bad data to anything at all. Therefore getting unbiased surface temperature records should be the first thing to do for every self-respecting meteorological organisation.
Anthony, you did and still do the most essential thing: searching for the temperature records that reflect REALITY. My compliments.
===
Re-evaluating the role of solar variability on Northern Hemisphere temperature trends since the 19th century
Willie Soon, Ronan Connolly, Michael Connolly
ABSTRACT
(….) Then, in order to account for the problem of urbanization bias, we compile a new estimate of Northern Hemisphere surface air temperature trends since 1881, using records from predominantly rural stations in the monthly Global Historical Climatology Network dataset. Like previous weather station-based estimates, our new estimate suggests that surface air temperatures warmed during the 1880s–1940s and 1980s–2000s. However, this new estimate suggests these two warming periods were separated by a pronounced cooling period during the 1950s–1970s and that the relative warmth of the mid-20th century warm period was comparable to the recent warm period.
(….)
We then compare our weather station-based temperature trend estimate to several other independent estimates. This new record is found to be consistent with estimates of Northern Hemisphere Sea Surface Temperature (SST) trends, as well as temperature proxy-based estimates derived from glacier length records and from tree ring widths

Luke
February 20, 2016 7:15 am

Sorry, but this issue has recently been addressed by Hausfather et al. (2016), and it is clear that the homogenized data track temperature trends at "pristine" stations better than unadjusted data do. The computer code is also available. I challenge anyone here to publish a rebuttal to Hausfather et al.
From the abstract
Numerous inhomogeneities including station moves, instrument changes, and time of observation changes in the U.S. Historical Climatological Network (USHCN) complicate the assessment of long-term temperature trends. Detection and correction of inhomogeneities in raw temperature records have been undertaken by NOAA and other groups using automated pairwise neighbor-comparison approaches, but these have proven controversial due to the large trend impact of homogenization in the United States. The new U.S. Climate Reference Network (USCRN) provides a homogenous set of surface temperature observations that can serve as an effective empirical test of adjustments to raw USHCN stations. By comparing nearby pairs of USHCN and USCRN stations, we find that adjustments make both trends and monthly anomalies from USHCN stations much more similar to those of neighboring USCRN stations for the period from 2004-2015 when the networks overlap. These results improve our confidence in the reliability of homogenized surface temperature records.
The url for the article:
http://onlinelibrary.wiley.com/doi/10.1002/2015GL067640/full

Reply to  Luke
February 20, 2016 8:30 am

Zeke does not show what you think he shows. Perhaps not even what he thinks he shows.
CRN stations are by definition pristine and rural. The closest USHCN stations will also be rural, but perhaps not pristine. That there is a good match after USHCN pairwise homogenization just implies few microsite problems. It says nothing about stations that are mostly not pristine (this post being an example), or UHI-afflicted (Las Vegas in this post being an example), or the general growth of both microsite issues and UHI with population density (the Goodridge data in this post being a clear example).
Concerning the conclusions of this post, Zeke's paper is NOT a rebuttal.

Chris Z.
February 20, 2016 7:33 am

@Luke: Am I getting that right? Adjusted data is more similar to the standard it has been adjusted to than unadjusted data? Well, that's the definition of "adjustment", and it would only have been newsworthy if the adjustments went AWAY FROM the current standard rather than TOWARDS it. But what makes you (or Hausfather) think that these adjustments bring any of the data closer to local reality than the raw measurements?
I for one view it with suspicion when two originally independent data sets (i.e., two neighboring stations in whatever network) generally become "much more similar" after adjustment. It sounds like wilfully losing information to me: the adjustment does not spot and correct real errors as they happen, but smears everything with the same brush, so you end up with lots of stations, but with so much spurious correlation introduced by adjustment as to make them no better than a mere handful of independent measurements. It is foolish to assume that random noise will even out this way; it is more like every signal contaminating all the others, ending up with a pretty meaningless mish-mash.

Luke
Reply to  Chris Z.
February 20, 2016 8:12 am

Chris Z says “but what makes you (or Hausfather) think that these adjustments bring any of the data closer to local reality than the raw measurements?”
The answer is very simple: the data say so.
"the adjustment does not spot and correct real errors as they happen, but smears everything with the same brush"
No, the adjustments are much more complex than you suggest. Any time you have data sets with millions of observations, there is the potential for errors. The question is whether you try to identify those data points and adjust them, or simply use the raw data. Their analysis suggests that the algorithms used to make the adjustments improve the data and make it more similar to pristine sites. If you really want to know what they did, the code used for the adjustments is available.

Toneb
February 20, 2016 8:22 am

Evaluating the impact of U.S. Historical Climatology Network homogenization using the U.S. Climate Reference Network
Non-pay-walled version:
http://www-users.york.ac.uk/~kdc3/papers/crn2016/CRN%20Paper%20Revised.pdf
“Conclusions
During the period of overlap between the USHCN and USCRN networks, we can confidently conclude that the adjustments to the USHCN station records made them more similar to proximate homogenous USCRN station records, both in terms of trends and anomalies. There are no systematic trend biases introduced by adjustments during this period; if anything adjusted USHCN stations still underestimate maximum (and mean) temperature trends relative to USCRN stations. This residual maximum temperature bias warrants additional research to determine the exact cause.
While this analysis can only directly examine the period of overlap, the effectiveness of adjustments during this period is at least suggestive that the PHA will perform well in periods prior to the introduction of the USCRN, though this conclusion is somewhat tempered by the potential changing nature of inhomogeneities over time. This work provides an important empirical test of the effectiveness of temperature adjustments similar to Vose et al. [2012], and lends support to prior work by Williams et al. [2012] and Venema et al. [2012] that used synthetic datasets to find that NOAA's pairwise homogenization algorithm is effectively removing localized inhomogeneities in the temperature record without introducing detectable spurious trend biases.”

February 20, 2016 9:03 am

I used to think that the foundation of caGW was built on sand.
I was wrong.
It’s built on asphalt.

kevin kilty
February 20, 2016 9:29 am

Because of the persistently cold winter here in Laramie, I got interested in local temperature and began tracking the airport AWOS data (KLAR). I noticed in January that on days where the hourly data never rose above something like 20 °F, the reported maximum temperature would indicate something higher, by as much as 7 degrees. So far I have not found data detailed enough to show the nature of what must be some short-lived anomalous spikes. I have investigated the siting of the station, which is between a taxiway and a runway, but not especially close to either. All I can surmise is drift of jet exhaust from the terminal building far to the north of the station. Nonetheless, this has made me suspicious about the value of data from this station for some uses. I am pretty certain KLAR is not used in climate compilations, but it is one of the better sources of data from this region.
Then I found by serendipity a publication of the College of Ag at UW entitled "Temperature probabilities in Wyoming". The data used to build the probability density were from the period 1930 to 1960, and I thought to myself that this should be a long enough period to build a reasonable control chart, if climate is a stationary process, and then test it against historical data from after 1960. I built a chart for minimum temperature, but found that observed data violated the lower control limit immediately and often. While the minimum temperature in January never reached below -30 °F during the 1931 to 1960 period, which was my 95% lower control limit, minimum January temperatures from 1961 to 1975 or so routinely reached -40 to -50. I suspect that the coldest part of winter in Laramie shifted from February during the 1931 to 1960 period to January in the period 1961 to 2016. It may be shifting even earlier in the winter season in recent years; I haven't yet had time to test this idea.
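The control-chart test described here can be sketched in a few lines: fit a mean and standard deviation to a baseline period, set limits at k standard deviations, then flag later observations that breach the lower limit. The numbers below are synthetic, not the actual Laramie record:

```python
# Control-chart check: baseline-period limits, then flag later violations.
def control_limits(baseline, k=2.0):
    n = len(baseline)
    mean = sum(baseline) / n
    sd = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5  # sample SD
    return mean - k * sd, mean + k * sd

baseline = [-20, -25, -18, -22, -28, -24, -19, -26, -21, -23]  # Jan minima, °F (synthetic)
lo, hi = control_limits(baseline)
later = [-30, -45, -22, -50, -41]  # post-baseline Jan minima (synthetic)
violations = [t for t in later if t < lo]
print(lo, violations)
```

Frequent violations, as the commenter found, mean either the process is not stationary or the baseline did not capture the true variability; the chart cannot tell you which by itself.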

February 20, 2016 11:37 am

Parker?
Yes.
It's a good proof of how homogenization works!
The RAW data is crap:
Raw monthly anomalies: 3.07
After quality control: 3.05
After breakpoint alignment: -0.27
Skeptical hypothesis? Homogenization pollutes good stations with bad stations.
Wrong.
Homogenization can correct even microsite issues.
http://berkeleyearth.lbl.gov/stations/34568
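For readers unfamiliar with the "breakpoint alignment" step mentioned above, here is a toy version of the idea (NOT Berkeley Earth's actual algorithm): difference the suspect station against a trusted neighbor, find the split point with the largest mean shift in the difference series, and subtract that shift from the later segment.

```python
# Toy breakpoint alignment against a trusted neighbor (illustrative only).
def detect_step(diff):
    """Return the split index and mean shift that maximize the step in `diff`."""
    best_i, best_shift = None, 0.0
    for i in range(1, len(diff)):
        left = sum(diff[:i]) / i
        right = sum(diff[i:]) / (len(diff) - i)
        if abs(right - left) > abs(best_shift):
            best_i, best_shift = i, right - left
    return best_i, best_shift

neighbor = [10.0] * 10
suspect = [10.0] * 5 + [13.0] * 5  # a 3-degree jump halfway through
diff = [s - n for s, n in zip(suspect, neighbor)]
i, shift = detect_step(diff)
aligned = suspect[:i] + [t - shift for t in suspect[i:]]
print(i, shift, aligned)
```

The skeptical worry and Mosher's reply both hinge on the same property: the correction is only as good as the neighbor it is aligned against.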

Reply to  Steven Mosher
February 20, 2016 1:25 pm

If the data is crap to begin with, then what are you left with after polishing it?
(An aside for Mosh: have you ever taken an area, say 50 square miles with 20 sites, randomly eliminated all but 2, run your program, and checked whether your results are anywhere near what the 20 sites originally said?)

Reply to  Gunga Din
February 21, 2016 7:25 pm

Mosh? My "aside".
Are you working on it or ignoring it?
The only "control" for climate science is to compare the results against a known, and there are no knowns to serve as a control. So take what your program says you know based on a bunch of observations, drop a bunch of those observations out, and see how they match.
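The check proposed here is essentially a holdout experiment. A toy sketch, using inverse-distance weighting as a stand-in interpolator (not Berkeley Earth's actual kriging-style method) and made-up station positions on a smooth west-to-east temperature gradient:

```python
import random

# Holdout check: hide most stations, estimate them from the survivors,
# and measure the reconstruction error.
def idw(x, y, kept, power=2):
    """Inverse-distance-weighted estimate at (x, y) from kept = [(sx, sy, temp), ...]."""
    num = den = 0.0
    for sx, sy, t in kept:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0:
            return t
        w = 1.0 / d2 ** (power / 2)
        num += w * t
        den += w
    return num / den

random.seed(0)
# 20 stations on a 50-mile square; "truth" is a smooth west-east gradient.
stations = [(random.uniform(0, 50), random.uniform(0, 50)) for _ in range(20)]
truth = {s: 60.0 + 0.2 * s[0] for s in stations}

kept = [(sx, sy, truth[(sx, sy)]) for sx, sy in stations[:2]]  # keep only 2 of 20
errors = [abs(idw(sx, sy, kept) - truth[(sx, sy)]) for sx, sy in stations[2:]]
print(max(errors))  # worst-case error reconstructing the 18 hidden stations
```

With only two survivors, every estimate is squeezed between their two readings, so stations near the edges of the gradient are reconstructed badly, which is exactly the failure mode the commenter wants exposed.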

Reply to  Gunga Din
February 21, 2016 7:52 pm

“drop a bunch of those observations out, and see how they match.”
They do, but what he doesn't explain is how they turn a station reading into something that can be compared to the field they calculate without doing the same processing of the station data.

NZ Willy
February 20, 2016 11:40 am

Yep, agree the solid fence makes a huge difference because it cuts off ground breezes. Levees do the same. Very distinctive effect when you’re walking in their vicinity.

February 20, 2016 11:49 am

Willis
“ADDENDUM: I hope Anthony won’t mind if I add this. I took Jim Goodrich’s Excel data from above, added county population density data, and that gave me the following graph:”
Sorry, COUNTY population will get you wildly inconsistent results.
Even population at the sites will get you nothing.
Oke, who pioneered using population density, ABANDONED the idea:
1. It only captured MAX UHI.
2. It was dimensionally incorrect.
3. It could change if the city in question changed its administrative boundary.
4. The coefficients of the regression are confounded with site/continent/wind.
Google "UHI in 419 large cities". You will find a massive study done on the topic. Urban AREA is the variable you want.

1sky1
February 20, 2016 1:59 pm

Introduction of micro-siting factors such as fences, air conditioners, etc. may lead to abrupt step-changes in monthly average temperatures, but they do not produce the ongoing gradual increases characteristic of UHI. The gorilla in the room at Parker is the completion in 1934-1938 of Parker Dam upstream of the town. By curtailing spring floods, the soil moisture content, and thus evaporative cooling, was gradually changed for hundreds of square miles around. That is a meso-scale effect totally ignored here.
BTW, the original station record is nowhere near as broken as the GISS-homogenized version shows. Viewed in the correct context, it is a very credible record.

February 20, 2016 2:11 pm

“Nope. It’s a siting issue. According to the B91 form, the Parker 6NE USHCN climate monitoring station is located at radio station KLPZ in Parker, it is a volunteer observing site, which sort of explains why NOAA gets what it pays for when we have 16 days of missing data in December 2015.
“NASA GISS keeps a plot of Parker 6NE data, and it seems missing data has been a hallmark of this station for quite some time. Notice all the gaps:”
Here is Parker:
See how we adjust the trend DOWN.
http://berkeleyearth.lbl.gov/stations/29136
Here is Parker 6E:
http://berkeleyearth.lbl.gov/stations/34568
I'm not finding Parker 6NE.

February 20, 2016 2:22 pm

It took me a long time to locate the Parker monitoring station on the map, even given the clues. It was so disguised by junk piles and dead cars, fences, asphalt, A/C evaporative coolers, concrete, and the radio station that it was lost. This certainly qualifies it as a NOAA Grade A Global Warming Recording Station.

Jim Goodridge
February 20, 2016 3:10 pm

Hello Anthony
Jim Goodridge, 2/20/16
Weather records at best are imperfect indices of weather.
There is no such thing as a perfect weather record.
The search for more perfect observations yielded data loggers.
These loggers record humidity, radiation, temperature, wind and rain each minute.
Instruments are exposed above the highest expected snow accumulation.
Wind causes rain gage undercatch, as described by Koschmieder in 1934.
When historical continuity is important, too bad: cities encroached on exposure.
The tenure of observers is at most only one lifetime.
We can accept blemished weather records or be forced to do without.
More perfect weather records are on the way with RAWS, SNOTEL and CIMIS.
Data loggers with more perfect exposure started about 1980.
California's oldest temperature record was in the 1840s at Fort Ross,
With records in degrees Réaumur, published in Moscow in the 1840s.
We either accept blemished records and pray their contents contain useful information, or do without.
A hundred years is not a long time for civilization, only for us today.
From abacus to Excel in 50 years is progress; it is accelerating with great promise,
From stick measurements on rooftops to data loggers on mountaintops.
It is too soon to throw out blemished records when needing a historical perspective.

Littleoil
February 20, 2016 3:19 pm

Warmists have adjusted the records to show it was colder in the past, then sold this apparent warming as an increase in the midday maximum, with images of bushfires and boiling seas.
This excellent article may be the turning point to show what has really happened and why there is nothing to fear.

February 20, 2016 7:58 pm

There's no evidence of a loss of nightly cooling, so land use, ocean cycles, and idiotic post-processing are strong candidates for all of the measured increases, whatever they really are.

February 22, 2016 7:51 am

FWIW, Pikes Peak, Colorado is probably as susceptible to UHI as more populous locations. It is a popular tourist destination during summer months.