From the “global warming data looks better with heat-sinks and air conditioners” department.
Dr. Mark Albright of the University of Washington writes:
Here is a great example of how NOT to measure the climate! On our way back to Tucson from Phoenix on Monday we stopped by to see the Picacho 8 SE coop site at Picacho Peak State Park. Note the white MMTS temperature monitor 1/3 of the way in from the left. The building is surrounded by the natural terrain of the Sonoran Desert, yet the worst possible site, adjacent to the paved road and a SW-facing brick wall, was chosen in 2009 as the location to monitor temperature.
Here is a view looking northeast:
For an aerial view in Google Maps:
The NCEI HOMR metadata repository tells us:
COMPATABLE [sic] EQUIPMENT MOVE 55 FEET DUE WEST. EQUIPMENT MOVED 05/06/2009. (That is when the new state park visitor center was built.)
http://www.ncdc.noaa.gov/homr/#ncdcstnid=20001376&tab=MISC
Additional photos:
Note the air conditioner heat exchangers within a few feet of the MMTS sensor:

Picacho 8 SE has it all: brick building, parking lot, road, and air conditioner heat exchangers within a few feet of the MMTS sensor.
This one takes the cake, and I think it is worse than our former worst-case USHCN station (now closed) located in a parking lot in Tucson at the University of Arizona:
Picacho 8 SE is a COOP site, not part of USHCN, but it (along with others) is used as a basis for the adjustments to the stations that have not been compromised. This is the crux of the problem, and why it is so important to seek out the good and unperturbed stations for their record, and discard the rest. No amount of general-purpose algorithms and adjustments can fix garbage temperature data produced by stations like this, nor should we even try. This is a Class 5 station, the worst of the worst, and it should be closed rather than continuing to pollute the climate dataset.
In our AGU 2015 poster and press release, it was stated:
“The majority of weather stations used by NOAA to detect climate change temperature signal have been compromised by encroachment of artificial surfaces like concrete, asphalt, and heat sources like air conditioner exhausts. This study demonstrates conclusively that this issue affects temperature trend and that NOAA’s methods are not correcting for this problem, resulting in an inflated temperature trend. It suggests that the trend for U.S. temperature will need to be corrected.” He [Watts] added: “We also see evidence of this same sort of siting problem around the world at many other official weather stations, suggesting that the same upward bias on trend also manifests itself in the global temperature record.”
“Our viewpoint is that trying to retain stations with dodgy records and adjusting the data is a pointless exercise. We chose simply to locate all the stations that DON’T need any adjustments and use those, therefore sidestepping that highly argumentative problem completely. Fortunately, there were enough in the USHCN: 410 out of 1218.”
1. Comprehensive and detailed evaluation of station metadata, on-site station photography, satellite and aerial imaging, street-level Google Earth imagery, and curator interviews has yielded a well-distributed 410-station subset of the 1218-station USHCN network that is unperturbed by Time of Observation changes, station moves, or rating changes, and that has a complete or mostly complete 30-year dataset. It must be emphasized that the perturbed stations dropped from the USHCN set show significantly lower trends than those retained in the sample, both for well and poorly sited station sets.
2. Bias at the microsite level (the immediate environment of the sensor) in the unperturbed subset of USHCN stations has a significant effect on the mean temperature (Tmean) trend. Well sited stations show significantly less warming from 1979 to 2008. These differences are significant in Tmean, and most pronounced in the minimum temperature data (Tmin). (Figure 3 and Table 1)
3. Equipment bias (CRS v. MMTS stations) in the unperturbed subset of USHCN stations has a significant effect on the mean temperature (Tmean) trend when CRS stations are compared with MMTS stations. MMTS stations show significantly less warming than CRS stations from 1979 to 2008. (Table 1) These differences are significant in Tmean (even after upward adjustment for MMTS conversion) and most pronounced in the maximum temperature data (Tmax).
4. The 30-year Tmean trend of unperturbed, well sited stations is significantly lower than the Tmean trend of the NOAA/NCDC official adjusted, homogenized surface temperature record for all 1218 USHCN stations.
5. We believe the NOAA/NCDC homogenization adjustment causes well sited stations to be adjusted upwards to match the trends of poorly sited stations.
6. The data suggests that the divergence between well and poorly sited stations is gradual, not a result of spurious step change due to poor metadata.
The result speaks for itself:
Figure 3 – Tmean comparisons of well sited (compliant, Class 1 & 2) USHCN stations to poorly sited (non-compliant, Class 3, 4, & 5) USHCN stations, by CONUS and region, against official NOAA adjusted USHCN data (V2.5) for the entire (compliant and non-compliant) USHCN dataset.
Figure 4 – Comparison of 30-year trends for compliant (Class 1, 2) USHCN stations, non-compliant (Class 3, 4, 5) USHCN stations, and NOAA final adjusted V2.5 USHCN data in the continental United States.
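For readers who want to poke at this kind of subset comparison themselves, here is a minimal sketch of a 30-year trend calculation. It is not the study's actual pipeline; the file name and column names ("ushcn_subset.csv", "station", "class", "year", "tmean_anomaly") are illustrative assumptions.

```python
# Minimal sketch of a subset trend comparison (illustrative only; the
# input file and column names are assumptions, not the study's data).
import numpy as np
import pandas as pd

def decadal_trend(years, anoms):
    """Ordinary least-squares slope, expressed in degrees per decade."""
    slope = np.polyfit(years, anoms, 1)[0]
    return slope * 10.0

df = pd.read_csv("ushcn_subset.csv")  # columns: station, class, year, tmean_anomaly
df = df[(df["year"] >= 1979) & (df["year"] <= 2008)]

for label, classes in [("compliant (Class 1-2)", {1, 2}),
                       ("non-compliant (Class 3-5)", {3, 4, 5})]:
    sub = df[df["class"].isin(classes)]
    # Average stations within each year first, then fit a single trend.
    annual = sub.groupby("year")["tmean_anomaly"].mean()
    print(f"{label}: {decadal_trend(annual.index.values, annual.values):+.3f} deg/decade")
```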
So what does the data plot from that weather station look like? Is there a noticeable change in 2009 after the station was moved?
http://berkeleyearth.lbl.gov/stations/27958
Thanks. Another apparent Berkeley data ingestion fail. They have the station move in 2011, not 2009 per the cited metadata. OTOH, they have regional expectation quality control fails thereafter. Presumably too hot.
Well, the visitors’ centre was built in June 2009; at least that is the date of the move. How good was the siting before the move?!
It does appear from the “break point adjusted” graph at Berkeley that the station record was notably too high from 2009-2011 and then about half the error gets corrected.
The other two adjustments may reflect that it was too volatile in comparison to regional neighbouring sites, which may have more greenery. It ran too warm during 1995-2003, then apparently showed more cooling 2003-2010.
This may be due to its obviously arid nature. Dry rock/sand has about half the heat capacity of soil or ‘moist rock’. An important factor in sensitivity to incoming radiation is available moisture. [Geoffroy 2015]
This kind of land record bias gets roughly doubled when land temps are added to sea temps to form the GMST ‘average’.
See recent discussion at Judith’s.
https://judithcurry.com/2016/02/10/are-land-sea-temperature-averages-meaningful/
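To make the heat-capacity point explicit, here is the back-of-envelope slab relation it rests on (my framing of the argument, not a formula quoted from Geoffroy 2015):

```latex
% Temperature response of a surface slab of depth d, density \rho, and
% specific heat c to an absorbed flux F acting over a time \Delta t:
\Delta T \;\approx\; \frac{F \, \Delta t}{\rho \, c \, d}
% Halving the volumetric heat capacity \rho c (dry rock/sand vs. moist
% soil) roughly doubles \Delta T for the same incoming radiation.
```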
In January 2009 WUWT carried a discussion of the mysterious “big red spot” near Florence. Picacho Peak came up in the head-scratching:
Gary A.
January 10, 2009 at 4:38 pm
“Picacho Peak State Park is open. Please note that our new Visitor Center is still under construction, but the park is open for public use. The new Visitor Center is scheduled to be completed by the end of February 2009. Thank you for your understanding.”
Going from memory now, so caveat emptor. Shortly after this discussion I tried to get a picture of the Picacho Peak park site. The ranger declined to give permission or disclose the exact location, because the station was at that time near to a residence and he felt there was a privacy issue. I think he did indicate that changes were in the works.
Wonder if Dr. Albright had to pay the entrance fee to get in…?
Can’t expect them to walk too far in the summer heat. I’m just surprised that the Coop isn’t inside the AC room for easy monitoring.
Bryan A February 17, 2016 at 2:30 pm
Don’t be silly, the cooler would result in a decrease in temperature. We all know that NEVER happens! 😉
Well, that shadow of the gizmo would be coming from an early afternoon sun, when it is nice and toasty.
Fortunately, red brick is known to be a very poor reflector in the solar spectrum and IR region. Nobody in his right mind would make a telescope mirror out of red brick.
Well, that stuff is such a poor reflector I can barely see what it is; but now it looks more like concrete artificial brick instead of the real thing.
Well, it’s a pretty clever arrangement: the solar energy may go right through that concrete, so when the sun moves back around to the northeast, the building wouldn’t stop all of it from getting to the gizmo.
g
Words escape me.
Wait…. IDIOTS comes to mind.
They are not idiots, this is intentional …IMHO.
Well, I wouldn’t call them IDIOTS exactly, but I can understand where you’re coming from with the dilution of the gene pool:
http://www.smh.com.au/national/education/atar-charade-bring-back-student-caps-says-nsw-education-minister-adrian-piccoli-20160214-gmu1v3.html
INEPTS.
I don’t think they are idiots; I think they are quite cunning, in fact, to place these things so as to allow for data-driven validation of their pet theory.
Excellent post Anthony!
Bad data is worse than no data, especially for this type of investigation, where you’re looking at data that is quite sensitive to changes in outcome/trend. It typically leads to wrong conclusions and thinking. I see it all the time in clinical data…
Excellent post Anthony!
I agree, that series of pictures tells the story!
“Bad data is worse than no data.”
Indeed. That’s why the saying “it’s not what you don’t know that gets you in trouble; it’s what you know that simply isn’t true” is so important to remember.
On the other hand, bad data — if accompanied by good metadata — potentially can be corrected.
Well, actually, all that is in that can is a counter that counts the number of sunny days, which they detect by looking for a reflection from the concrete blocks.
g
Evan Jones, please explain how you can “correct” bad data? I love the JoNova website, BUT this makes no sense to me!
Evan Jones wrote, “On the other hand, bad data — if accompanied by good metadata — potentially can be corrected.” Lawyers love to hear that kind of talk about prescriptions. It’s why they advertise for customers who have been harmed by taking a particular prescription. The initial rollout fails to mention you could have a heart attack if you take this new whizbang pill, precisely because they pumped up bad data with good metadata. Abracadabra: The new shiny pill is safe because we made the data say Uncle.
Youse guys are going to have me sounding like Mosh.
Good data is better. But bad data can sometimes be corrected.
All surface station data (arguably other than CRN) is bad data. All of it. Quite apart from the siting issue, the equipment itself is biased.
A CRS unit has an inherent trend bias: a pimped-up Tmax and a dumbed-down Tmin, as a result of the box itself (paint issues make it worse). MMTS conversion creates jumps.
I can correct for that. I do. Even our unperturbed Class 1\2 data must be corrected for equipment bias.
One cannot achieve perfection. But sometimes one can identify issues and improve otherwise downright incorrect data to the level of usefulness.
Or, as in the case of NOAA, make it worse.
Evan Jones February 17, 2016 at 8:51 pm
Youse guys are going to have me sounding like Mosh.
Good data is better. But bad data can sometimes be corrected….
Mmm, “corrected” data is subject to additional potential error sources. You can treat it as “data” in the same way as “good” data, but the identification and correction of the error becomes problematic, particularly because there is a “should” implicit in any correction. A correction assumes we actually know what the reading “should” have been at some degree of accuracy.
Yes, every correction is subject to MoE. And a regional correction is not necessarily applicable to an individual station even if the offset difference is used rather than the offset.
But, no, corrections do not imply that you knew the “correct” result beforehand and are dickering with the data to achieve that result. Or at least they shouldn’t. When I apply a jump correction for equipment change, I don’t know what the effect on Tmin or Tmax will be until I see the results. (A toy version of such a jump correction is sketched below.)
What is important (and not generally done) is to show both raw and adjusted data, explain what you did to it and why, all in a manner that can be easily replicated.
When things get too complex, there is more and more mush for confirmation bias to rule. As a game designer/developer creating historical “models”, I am all too aware of that and have seen all too much of it. When VeeV wants me to feed my stuff into his black box, my impulse is to infer (from the results) what is going into the black box and create on my own a cruder but entirely transparent and understandable box that can be understood and discussed, positively or negatively, by anyone.
P.S.: Don’t knock it till you’ve tried it.
Just because some folks are doing it wrong doesn’t necessarily mean it can’t be done right.
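As an aside, here is a toy version of the documented-changepoint jump correction mentioned above. It is a sketch of the general idea only, not Evan's actual procedure: the changepoint index, the window length, and the use of a simple difference of means are all simplifying assumptions (a real correction would also control for seasonality and trend inside the windows).

```python
import numpy as np

def correct_documented_jump(temps, change_idx, window=730):
    """Toy jump correction at a metadata-documented equipment change.

    temps: 1-D array of daily temperatures; change_idx: index of the
    change. Estimates the step as the difference of means in windows
    on either side of the change and removes it from the later segment.
    """
    temps = np.asarray(temps, dtype=float)
    before = temps[max(0, change_idx - window):change_idx]
    after = temps[change_idx:change_idx + window]
    jump = np.nanmean(after) - np.nanmean(before)   # estimated offset
    adjusted = temps.copy()
    adjusted[change_idx:] -= jump                   # remove the step
    return adjusted, jump
```

Two-year windows at least average out the seasonal cycle, but any real trend inside the windows still leaks into the estimated jump, which is exactly why such corrections need to be published alongside the raw data.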
“Evan Jones February 17, 2016 at 8:51 pm
Youse guys are going to have me sounding like Mosh.
Good data is better. But bad data can sometimes be corrected….”
Perhaps, but sometimes it cannot.
How does one determine which bad data is reasonably correctable and which is not? And how does one know that the corrections are within reason?
If you have the data and metadata, and if you know what to correct in the first place, you can take a good shot.
But when you do you are playing with fire. Fire is a useful tool, but it is fraught with inherent dangers. Homogenization is juggling with nitro. Proceed with care.
NOAA adjustments appear to be incorrect and incomplete. They have strayed and not followed the full path. But that does not mean it cannot be done in such a way as to improve rather than further damage the results.
Having said all that, proper metadata is often lacking outside the USHCN. For that matter, since the raw data is incorrect, no correction will ever get it perfect. Certainly not on a station-by-station basis.
We do not strive for perfection; we cannot. Yet we can strive for advantage. We do. We will.
How do you know if what you end up with is better or worse?
One can never be 100% certain. But NOAA has gone full illogical over equipment adjustment and I haven’t even looked at their TOBS correction yet. They fail to account for microsite in any meaningful manner.
The best way to ensure the best result is complete transparency of method. No black boxes need apply.
” The best way to ensure the best result is complete transparency of method. No black boxes need apply.”
I tell my data customers: unless you can guarantee your results are correct, you’re better off not “fixing” them at all.
But raw data won’t do. It clearly needs to be fixed. There can be no guarantee of accuracy; this is an ongoing effort.
NOAA adjustments make the matter worse. But we think our own (unperturbed Class 1\2 stations plus equipment adjustment) are a great improvement, and while they cannot capture the entire spectrum, they improve accuracy rather than detract from it.
As I say, we will provide transparent methods and data. Others (hostile or not) will review this and, no doubt, will offer suggestions that will improve it further. But there are no guarantees in this game — except the guarantee that the raw data is unreliable and should not be used as such.
” except the guarantee that the raw data is unreliable and should not be used as such.”
And yet, I think it’s obvious that the corrected data is worse.
And the raw data is perfectly useful for collecting station trend data; absolutes are not needed, trends are.
Yes and no.
Bad data makes bad trends, whether raw or badly adjusted.
The correct solution is to tackle the adjustments and make them correctly. Which I am doing.
But there is no way to tell if it is right.
That’s the problem with any blind change to data: how can you confirm something you don’t have (the true value) based only on what you think might have altered the data you do have, with no way to go back and verify that your change is an improvement, or by how much?
REPLY — The idea is not to do it blind. Always give full reasons and exact details. The way NOAA does it makes the problems worse. The reasons are obvious and easily explained. Neither is it (nor can it be) just right; that opportunity was blown the day the data was written. But it can be made better instead of worse, and it can be explained in a logical manner. Numbers and red flags included. ~ Evan
That looks like a lot of man-made warming.
I noticed the hockey stick, when it came out, had a lot of mann-made warming also.
Besides the two A/C condensing units, the third pad-mounted object in the enclosure next to the large electrical disconnect is a dry-type 480V/208V transformer. Also a good 365d/24h continuous heat source.
Good observations. Bad sensor location. But the condensing unit is probably not the biggest problem here, as the unit sucks ambient air in through the sides (coils) and blows it upward at 500 fpm or so, at which point it will continue to rise due to being warmer air. Except for odd wind currents sometimes reversing this upflow, the condensing unit will actually draw unaffected ambient air across the sensor box.
Louis,
When air is pulled into something and then forcefully ejected from it, it changes the air flow around it, right? Since there are TWO LARGE air conditioning units there, most likely running 24 hours a day, and the air flow behind them is almost completely restricted, isn’t it possible that those units “suck” in and “blow out” with enough force to create a mini air current of their own that pulls heat rising from that asphalt towards them? If so, that gauge is directly in the path of that asphalt-heated air.
Is that cover (on the gauge) made out of a PVC-type plastic or metal painted white? Every single element near that “white” gauge is darker than the gauge and made of rock, cement, brick or asphalt, which absorb heat faster and release it more slowly than the “white” gauge in the same location. So even at night, as the asphalt cools most slowly of all the elements around that gauge, those air conditioning units could be pulling that slow-to-cool air over the sensor and making it appear as though nighttime temps aren’t falling as quickly or as much as they actually are…
Am I making sense?
Anthony, is it possible to generate a chart that goes all the way through 2015 using only well sited stations? I realize that the ones that were well sited in 2008 might not be well sited any more, but if they are, I’d love to see the trend through the entire “hiatus” period.
This report took a Herculean effort by you, your coauthors, and a huge number of volunteers. You all deserve scientific medals and accolades. Thank you!
Second that! Are there any volunteers here in the UK to do a similar exercise, I wonder? I think I’ve seen pictures of one or two airports which weren’t airports some decades ago. Most of the airfields in Britain were built to the Class A form, with concrete runways, perimeter track and dispersal bays — about 200 of them — in an enormous construction job in 1941/42, and many have been converted for commercial use since the war. Including Heathrow. Many have gone back to farm land.
I believe in Figure 4 above, the blue line shows compliant stations.
The worst sited temperature monitors I have ever seen are at Seville airport in southern Spain, right next to the tarmac, close to buildings and occasionally fanned with the hot plumes of jet exhaust.
Seville is a very hot place in midsummer, and I wouldn’t mind betting this monitor gives you serious ‘global warming’ before homogenisation and a lot more afterwards.
A few years ago potential surveys were mentioned regarding other countries. It seems some nations are not fond of picture taking and the like around public facilities. This might be a good way to get shot or just picked up by police and questioned. More reason now than ever.
A sad state of affairs but I can see the point.
Actually, that Seville airport temperature monitor is right where it needs to be, because that’s what the plane crews need to know to see if it is safe to take off.
Airline pilots don’t give a rip about global warming; only landing and take-off conditions, especially the landing part.
Take-off is entirely optional; the rest of the flight is mandatory!
G
Another in agreement here. The importance of Anthony and his team’s work on this issue cannot be overstated.
I realize that the ones that were well sited in 2008 might not be well sited any more, but if they are, I’d love to see the trend through the entire “hiatus” period.
And it will take a long time to evaluate that (and the other metadata). Probably save it for followup.
2005 – 2014 saw a strong cooling signal in the US. As expected, the poorly sited USHCN shows much more cooling than CRN, just as it showed exaggerated warming during the warming periods.
Aphan: Re my earlier reply to RonPE: Is there something about the terms “suck in” and “blow out” that makes you uncomfortable enough to put them in quotation marks? I was only trying to explain that the heat generated by a condensing unit with vertical discharge doesn’t ordinarily affect a nearby sensor any more than other normal thermal effects in the area.
The more they can pre-cook the data, the less fudging they need to do afterwards. More efficient that way.
Wait…what? There are magical little numbers next to the posts…does this mean what I think (hope) it means?!!
To be honest, you’d think it might occur to someone to abandon that hopeless morass and use radiosonde-confirmed LT satellite data instead. Or did I miss something really obvious there?
You missed that if they did that, there would be no warming in this century. Doesn’t fit the story line.
But Dr Carl Mears of RSS says “stronger case can be made using surface temperature datasets, which I consider to be more reliable than satellite datasets”
http://www.remss.com/blog/recent-slowing-rise-global-temperatures
I wonder what purpose he sees in their own data if he considers it inferior to surface measurements like this one?
LOL… Now they say the satellite data is no good because it has too many “adjustments”!!
The adjustments made to sat data are large but fairly simple. The adjustments made to surface stations (except CRN, I hope) are large and relatively complex. Even our unperturbed (both well and poorly sited) data must be adjusted both for CRS trend max-min distortion and jumps for equipment conversion.
And that alone can make a cooling station warm and a warming station cool. Or warm more. Or cool more. Or hardly change the trend at all. CRS adjustment is easy to apply. But MMTS conversion jumps are all over the place (as H&L-06 correctly observes).
It’s time for the National Weather Service to break away from NOAA. I think they would function better as a scientific organization if they weren’t under NOAA’s control and oversight. It’s time for some of the nation’s meteorologists and climatologists (not all of them have been bought off or are bad) to quit catering to special interest groups, lobbyists, and government payoffs (grants for research), and to quit playing politics with the weather. I long for the days of integrity and honesty from these folks to return.
We’ve had several Senate hearings on the climate. Has the question of station siting ever been raised? Have the bureaucrats involved ever been called on to account for their choices?
Comrade, yes, the question has been raised; Dr. Curry, among others, has noted the problems with both measurement and data adjustment in various testimony.
No, the bureau-trash has not been held accountable for anything, although there are some efforts with Senator Alexander out in front. Although when you have a prez who’s dumb enough to think that all the Marines are dead (aka “Marine Corpse”)… then you might just have issues getting anything done.
Yes. Dr. Christy, our co-author, has brought up the microsite issue on more than one occasion.
This is more reason for an Open Source Temperature Data Repository.
More evidence of NOAA ineptitude. The switch to state-by-state NClimDiv in 2014 swept in COOP stations such as this, contributing to NOAA’s newest ‘official’ Arizona warming. ‘New and improved’ NClimDiv introduced or increased warming in all but 8 states. For CONUS, it increased the decadal warming rate from 0.088°F to 0.135°F. Another fine ‘sciency’ contribution from warmunist Tom Karl and his merry band of NOAA rogues.
In fact this station ceased to contribute to nClimDiv in Feb 2014. Very large inventory file here.
Criminy, Nick. Those coordinates are godawful. Pathetically useless for actually spotting them on Google Earth. Haven’t they discovered GPS yet outside the USHCN? Takes me back to the Bad Old USHCN1 days. Sheesh.
All data is within the margin of error. Considering there is no (or an infinite) margin of error with a non-random, non-replicated sample size of one (n=1), that is not really an issue.
The real issue is that it is incorrect to even generate simple statistics like averages on apples and oranges.
GIGO
There should be funding cuts associated with ineptitude, or expect more of the same laziness.
I think the entire fed gov, except Defense, should be required to engage in zero-based budgeting every year, and have something like ISO guidelines to evaluate last year’s expenditures. The bozos would be doomed-as-doomed-could-be…
This should be applied to all the western developed nations’ governments.
/sarc on/ of course, it’s mitigated by the shadow of the tree in the afternoon. Oh, wait! The tree is too close to the station!!!! Danger Will Robinson… /sarc off/
Very interesting how much more like the map of bad sites the “official” temperature record is.
Anthony: Do you have a gridded map showing how the compliant stations are situated with respect to the USCRN locations?
Why would data from these sites even be considered for climate analyses? Would they accept a surface temp taken on their child’s pinky right after hand washing for determining health? Would a comparison of multiples of such measurements give accurate anomalies? Could you get better data by averaging or otherwise manipulating measurements on both hands?
If the measurement methods are not accurate and taken with reasonably uniform methods, the data are not trustworthy. Indeed, GIGO.
You look at these pictures and then wonder at the so-called “scientist” who claims that the surface record is better than the data from satellites. Better for what purpose one wonders.
Anthony:
Where did the Mark Albright comment come from?
He emailed it to me.
Ouch, doesn’t that hurt his career, tenured or not? (Sorry to get cynical).
I still know a number of these professors from my days at the UW. At some point I need to buy some of them a pile of drinks…
Peter, if you buy them “a pile of drinks”, the warmistas will claim “they were bought off”!
It’s a bad site for sure, but you can measure the derivative of daily and annual temperature change, show the step in temps in 2009, and show that over a 24-hour period the nightly cooling hasn’t changed since 2009.
Any temperature increase is from thermal storage, not a loss of cooling.
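If anyone wants to try that on the daily record for this station, a sketch along these lines would do it. The file name and columns ("date", "tmax", "tmin") are assumptions about the download format, and the move date is the 05/06/2009 figure from the HOMR metadata quoted above.

```python
import pandas as pd

# Assumed daily file with columns: date, tmax, tmin (illustrative only).
df = pd.read_csv("picacho_daily.csv", parse_dates=["date"]).sort_values("date")

# Overnight cooling: today's Tmax minus the following morning's Tmin.
df["night_cooling"] = df["tmax"] - df["tmin"].shift(-1)

move = pd.Timestamp("2009-05-06")               # equipment move per HOMR
before = df.loc[df["date"] < move, "night_cooling"].mean()
after = df.loc[df["date"] >= move, "night_cooling"].mean()
print(f"mean overnight cooling before move: {before:.2f}")
print(f"mean overnight cooling after move:  {after:.2f}")
```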
Just curious, do you base this on theory or did you actually check data already?
Can you clarify this? When is temperature increase ever anything other than gaining more heat than losing it?
@Chic,
Yes I studied surface data and did numerous measurements with an IR thermometer.
I found it cools slightly more at night than it warmed the day before, and it took a long time to figure out how that could be.
Tropical oceans warm and evaporate lots of water, and that water is carried poleward to cool.
The extratropical (I think this is the proper term) jet stream sets the dividing line between warm tropical air and cool, dry polar air masses. A shift in the path of the jet stream, caused by a change in the oceans’ AMO/PDO, can easily change the ratio of the two air masses over the continents; then poorly-thought-out processing of surface data infills large areas with one or the other, and all of a sudden the whole world is warming.
And my regional derivative charts show large regional shifts, as I would expect them if this were true.
micro6500,
How can this be? If night was always cooler than the previous day, minimum temperature would continually drop and the daily temperature range would have to increase on average to maintain at least a constant temperature.
Is there some reference where your experience is documented in more detail? With this kind of behavior going on, how in the world (no pun intended) is a daily min/max average temperature considered acceptable for a measurement of land, let alone ocean, temperatures?
As I mentioned, I believe it’s the heat of water vapor that was evaporated in the tropics moving onto the continents. Same reason why most of the east coast had a warm winter: tropical air out of the Gulf, which is 15-20°F warmer, displaced Canadian air. Same thing happens in the summer.
https://micro6500blog.wordpress.com/2015/11/18/evidence-against-warming-from-carbon-dioxide/
Charts, descriptions, and links to code and the data used to create the charts, plus a half dozen ways the data was sliced, in case you don’t like the way I sliced it.
I believe it is quite useful for detecting the change in temps, not the absolute value. But I also don’t believe surface stations can determine a global average; most of the planet isn’t measured. But you can see what happened at a large number of stations placed around the world.
If only it were that simple. If it were only an offset step change, then that would work. However, this fundamentally changes the shape of the graph. A/C units in particular exacerbate measurement of hot days, since they add to the temperature in the immediate area in proportion to the building’s load (which is not only temperature-dependent, but based on how much the building is used). You also can’t use the difference across the switch as your delta, since the change from day to day or year to year is many times the decadal trend. If you ran the two locations concurrently for several years to get a baseline delta, I could see it working. However, that just doesn’t happen. Trying to adjust for such a change will obliterate your data completely.
You can actually see the effect of such actions in homogenization graphs, which effectively take the delta and replace the existing trend with the regional trend. While it might be interesting for a local trend, it then makes it useless for determining regional trends as you have replaced what you are looking for with what you “should” find plus the noise from the original.
Furthermore, a number of issues are caused by encroachment, where an area gradually becomes built up, so they can’t be handled as a step change at all.
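To illustrate the point about homogenization replacing what you are looking for with the regional expectation, here is a deliberately crude caricature on synthetic data. The trend numbers are made up, and real pairwise homogenization is far more sophisticated than this; the sketch only shows the "swap the local trend for the regional one" effect described above.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 2009)
t = years - years[0]

regional_trend = 0.02   # deg/yr: the "regional expectation" (made up)
local_bias = 0.04       # deg/yr: gradual encroachment bias (made up)

# A station that warms faster than its region because of encroachment.
station = (regional_trend + local_bias) * t + rng.normal(0, 0.2, t.size)

# Caricature of homogenization: keep the station's residual noise but
# swap its fitted trend for the regional one.
fit = np.polyfit(years, station, 1)
residuals = station - np.polyval(fit, years)
homogenized = regional_trend * t + residuals

print("raw slope        :", round(np.polyfit(years, station, 1)[0], 3))
print("homogenized slope:", round(np.polyfit(years, homogenized, 1)[0], 3))
# The encroachment signal (and any genuinely local climate signal) is
# gone; only the regional expectation plus noise remains.
```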
I imagine the AC thermostat lets temps in the building rise at night. Less heat from the AC units. Lights, PCs, etc. would be turned off also. Less heat from the transformer.
There’s too much day-to-night changing of man-made influence on the site’s temps to trust any of the numbers, or trends from the numbers, as being “natural” or worth adding in any way to the data used to produce a real “global temperature”. I’m not a meteorologist, but it’s probably not even good enough for a local forecast.
@Ben Klijn,
I will have to ponder on your comment.
But you might look at my work and ponder the same:
https://micro6500blog.wordpress.com/2015/11/18/evidence-against-warming-from-carbon-dioxide/
If this site were in a cold area, there would be a homeless guy camped there.
“withing” should be “within” in the picture caption.
…just mentioning…
[Noted. Peer-review works! .mod]
Any plans to turn the AGU poster into a published, peer-reviewed manuscript? Seems about ready to go.