Through The Looking Glass with NASA GISS

Guest essay by John Mauer

One aspect of the science of global warming is the measurement of temperature on a local basis. This is followed by placing the data on a worldwide grid and extrapolating to areas that don’t have data. The local measurement site in the northwest corner of Connecticut is in Falls Village, near the old hydroelectric powerhouse:

Falls Village weather station, Stevenson Screen at right

Aerial view of Falls Village GHCN weather station. Note the power plant to the north. Image: Google Earth

The data from that site appears to start in 1916 according to NASA records; it shows intermittent warming of about 0.5 degrees C over the last 100 years. However, NASA recently decided to modify that data in a direct display of political convenience to exaggerate the rate of warming, that is, making the current period appear to show increased warming. What follows is a look at the data record as given on the site of NASA's Goddard Institute for Space Studies (GISS), a look at alternate facts.

The temperature data for a given station (site) appears as monthly averages of temperature. The actual data from each measurement (daily?) is not present. The calculation of an annual temperature is specified by GISS. A reasonable summary includes several steps.

Although the data set is not labeled, these measurements are presumably min/max measurements. The monthly numbers are the average of at least 20 temperature measurements for each month. The monthly averages are then combined to get seasonal means (quarters) starting with December, and the four quarterly means are averaged to get an annual mean.

The entire data set is also subjected to a composite error correction based on the difference of each data point from its monthly mean (the monthly anomaly): the seasonal correction is the average of the appropriate monthly corrections, and the yearly correction is the average of the seasonal corrections. The net of these corrections is added to the average. The result is an annual "temperature" which is a measure of the energy in the atmosphere coupled with any direct radiation impingement.
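As a concrete illustration, the quarterly averaging just described might look like the following minimal Python sketch. This is an assumed reading of the GISS description above, not GISS's actual code, and the anomaly-based correction step is omitted:

```python
# Minimal sketch of the annual-mean scheme described above -- an assumed
# reading of the GISS description, not their actual code. Seasons are
# meteorological quarters starting with December of the preceding year.
def annual_mean(monthly, prev_december):
    """monthly: 12 monthly mean temperatures (Jan..Dec) for one year;
    prev_december: the December mean of the preceding year."""
    djf = (prev_december + monthly[0] + monthly[1]) / 3.0  # Dec-Jan-Feb
    mam = sum(monthly[2:5]) / 3.0                          # Mar-Apr-May
    jja = sum(monthly[5:8]) / 3.0                          # Jun-Jul-Aug
    son = sum(monthly[8:11]) / 3.0                         # Sep-Oct-Nov
    return (djf + mam + jja + son) / 4.0                   # annual mean
```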

Raw annual temperatures at Falls Village, 1916-2016

The plot of 101 years of temperatures in Falls Village is shown above. Although there is routinely a great deal of variation from year to year, several trends can be separated from the plot. A small but noticeable increase occurs from 1916 through about 1950. The temperature then decreases until 1972, with much less volatility, and increases again until roughly 1998, after which it holds steady to the present. However, the present volatility is very high.

The El Niño events (a change in ocean currents with corresponding atmospheric changes) of 1998 and 2016 are very visible, but not unique, in Falls Village. (Please note the Wikipedia editing on the subject of climate change is tightly controlled by a small group of political advocates.)

Sometime in the last few years, just before the Paris Conference on Climate Change, GISS decided to modify the temperature data to account for perceived faults in its representation of atmospheric energy. While the reasons for change are many, the main reason given appeared to be urban sprawl into the measurement sites. They said, "GISS does make an adjustment to deal with potential artifacts associated with urban heat islands, whereby the long-term regional trend derived from rural stations is used instead of the trends from urban centers in the analysis." Falls Village is a small rural town of approximately 1,100 people, surrounded mostly by farmland.

Adjustments applied by GISS to the raw Falls Village temperature data

The plot of all the changes to the raw temperature data from Falls Village is shown above. First, of course, several sections of data are ignored, presumably because of some flaw in the collection of data or equipment malfunctions. Second, almost all of the temperatures before 1998 are reduced by 0.6 to 1.2 degrees C, which makes the current temperatures look artificially higher. Curiously, this tends to fit the narrative of no pause in global warming that was touted by GISS.

Comparison of raw vs. final data. Source: NASA GISTEMP

Link: https://data.giss.nasa.gov/cgi-bin/gistemp/stdata_show.cgi?id=425000626580&ds=5&dt=1

Further, if the reasoning of GISS is taken at face value, the apparent temperatures of New York City would be affected. Falls Village (actually the Town of Canaan, Connecticut) is about two hours outside of the City and is part of an expanded weekend community.

 


244 Comments
Neillusion
February 22, 2017 8:09 am

The article states that it is considered to be rural, so no UHI adjustment would have been considered necessary.

Walter Sobchak
February 22, 2017 8:12 am

Figures don’t lie. Liars figure.

Neillusion
February 22, 2017 8:15 am

Surely someone has done a study on the effect of buildings/trees/tarmacadam/water? It would be easy to do: just set up, say, 10 stations at various locations on one test site with buildings etc., measure distances, and compare results. This could even be done today for a short period, say a week, for an initial indication of variation. Obviously, years would be needed for comprehensive results, but even a week of measurements would show something, especially if the measuring instruments were state of the art. It could all be networked back to a central station.

Reply to  Neillusion
February 22, 2017 9:04 am

A body of water with a 30m radius close to a site will give you temps 1C cooler.

Reply to  Steven Mosher
February 22, 2017 10:05 am

That can hardly be generally true. It would depend on how deep the water is, whether it stands or flows, how far exactly from the site, predominant wind direction etc.
Fake news.

DCA
Reply to  Steven Mosher
February 22, 2017 11:20 am

Michael,
How else can they claim that the UHI effect is cooling?

Reply to  Neillusion
February 22, 2017 9:06 am

The elevated temperature from pavement/asphalt diminishes within 10-15m of its border (calm day)

Kaiser Derden
Reply to  Steven Mosher
February 22, 2017 10:46 am

diminishes to zero? Surely you jest 🙂

DCA
Reply to  Steven Mosher
February 22, 2017 11:08 am

Mosh,
Why can’t you make an equitable comparison? How much is the cooling effect of water on a calm day within 10-15m of its border? How much is the “elevated temperature from pavement/asphalt” within a 30m radius? What about other sources of UHI, i.e. car, AC and jet engine exhaust? Are these assumptions or have they been tested?
At first glance it appears that the cooling effect is greater than the heating effect by a factor of 2 to 3. Your inability to compare apples to apples greatly diminishes your credibility.

MarkW
Reply to  Steven Mosher
February 22, 2017 2:14 pm

Doncha just hate it when that happens. On the other hand, the effects of a lake can carry for miles.
At least according to Steven.

Reply to  MarkW
February 22, 2017 2:50 pm

On the other hand, the effects of a lake can carry for miles.
At least according to Steven.

It does, I frequently have to shovel it off my driveway in the winter. And I’m about 30 miles away.

george e. smith
Reply to  Steven Mosher
February 22, 2017 2:47 pm

At what altitude above that pavement ??
g

Evan Jones
Editor
Reply to  Steven Mosher
February 22, 2017 4:52 pm

The elevated temperature from pavement/asphalt diminishes within 10-15m of its border (calm day)
Heck, you can go further than that. It diminishes the further away it gets. According to Leroy (2010), the impact of a heat sink from 1-10m distance from the sensor will work out to ~8 times that of the same area of sink from 10-30m.
But there is a sizable number of HCN sensors that are rated as Class 3 (NWS non-compliant) rather than Class 2 (compliant) that have no heat sink exposure within 10m — but do have over 10% exposure within 30m (which works out to ~8 times the same exposure within 10m).
So you are correct to say that heat sink exposure does diminish after 10m distance, as you say. But that also is not to say that heat sink exposure at under 30m is not — very — important, at least when we are dealing in terms of trend differences of tenths/hundredths of a degree C per decade.
Exposure at distances from 30-100m can only make the difference between Class 1 and Class 2. Both of those ratings are NWS-compliant and both of the offsets are zero degrees C. So that is a distinction without much of a practical difference for our (limited) purposes.
But anything within 30m can potentially bump a station into non-compliance.
Well, we can adjust for that! (And we will, too, if the VeeV won’t see the light).
One thing I am very interested in is how our stationset will run using your methods. But for valid results, any homogenization you do cannot, Cannot, CANNOT be cross-class. So no compliant stations being pairwised with non-compliant stations if you please! What that means in plain English is that you can only use Class 1s and 2s for pairwise. Any 3\4\5 stations would have to be adjusted for microsite BEFORE using them to pairwise with Class 1s or 2s.
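For readers trying to follow the arithmetic in this comment, here is a hedged sketch of the rating logic as described in the thread. The ~8:1 weighting and the 10%/30m thresholds come from the comment above; the class boundaries are my inference and are hypothetical, not Leroy's (2010) actual specification, which has more criteria:

```python
# Hedged sketch of the microsite rating logic as described in this
# thread -- NOT Leroy (2010) itself, which has additional criteria.
# Per the comment above, sink area at 1-10m counts ~8x the same area
# at 10-30m, and >10% exposure within 30m pushes a station to Class 3.
def effective_exposure(pct_sink_1_10m, pct_sink_10_30m):
    """Weighted heat-sink exposure; the 8:1 ratio is the one cited above."""
    return 8.0 * pct_sink_1_10m + pct_sink_10_30m

def microsite_class(pct_sink_1_10m, pct_sink_10_30m):
    """Crude class assignment inferred from the thread (hypothetical)."""
    if pct_sink_1_10m >= 10.0:
        return 4  # heavy sink right at the sensor
    if pct_sink_1_10m > 0.0 or pct_sink_10_30m >= 10.0:
        return 3  # non-compliant per the description above
    return 2      # compliant (Class 1 vs 2 depends on the 30-100m zone)
```

Under these inferred rules, Falls Village as rated later in this thread (sink present within 10m, but covering under 10% of that area) would come out Class 3.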

Reply to  Steven Mosher
February 22, 2017 7:22 pm

OK, so the measuring site is in the middle of several square miles of said pavement/asphalt along with other monitoring sites. Yessiree, UHI in my book.

Reply to  Steven Mosher
February 22, 2017 10:45 pm

lol a light breeze can carry the heat from a parking lot hundreds of meters

Evan Jones
Editor
Reply to  Steven Mosher
February 23, 2017 4:33 am

OK, so the measuring site is in the middle of several square miles of said pavement/asphalt along with other monitoring sites. Yessiree, UHI in my book.
Well, sure. And that will increase the baseline temperature. But that is merely an offset. I have found that UHI does not appear to have a heck of a lot of influence on trend (sic).
When I do not grid the data, urban stations show less warming than non-urban. When I grid the data, they show a bit more warming than non-urban.
Urban stations are less than 10% of the USHCN total, in any case. So any effect on trend is marginalized.
But bad MICROSITE, i.e., the immediate environment of the station, has a huge effect on trend — even if the bad microsite is constant and unchanging over the study period.
UHI is edge-nibbling. Microsite is your prime mover. Microsite is the New UHI.

Reply to  Steven Mosher
February 23, 2017 6:55 am

Chad Jessup February 22, 2017 at 7:22 pm
OK, so the measuring site is in the middle of several square miles of said pavement/asphalt along with other monitoring sites. Yessiree, UHI in my book.

Well your book is in error, the site is in the middle of several square miles of woods, and 23 m from the river (likely a cooling influence).

Reply to  Neillusion
February 22, 2017 12:14 pm

I propose that we dub these important physical discoveries “Mosher’s laws”. Now I’m fully convinced that Berkeley Earth contains nothing but the unvarnished, untarnished truth.

Evan Jones
Editor
Reply to  Michael Palmer
February 22, 2017 5:07 pm

He is correct. But I think he is not following that particular trail closely enough. OTOH, he has said that microsite is a valid subject for study and has the potential to expose a systematic error in the data, i.e., not “nibbling around the edges”. (He doubts it will make much difference, of course, as well he may. But he said he thinks it is a good issue for study.)

Reply to  Michael Palmer
February 22, 2017 5:52 pm

He is correct about what? He doesn’t even make complete statements. “It is not only not true, it is not even wrong.”

Evan Jones
Editor
Reply to  Michael Palmer
February 23, 2017 3:43 am

He is correct that heat sink effect diminishes at 10-15m.
(Although I think he is drawing the wrong conclusions from that: A diminished effect can still be quite significant.)

Reply to  Michael Palmer
February 23, 2017 6:46 am

The heat sink effect surely diminishes in some sort of continuous manner. That means you need at least 2 numbers to describe the decrease as a function of distance.
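To make the point concrete: assuming, purely for illustration, an exponential falloff, the two numbers would be an amplitude and a length scale.

```python
import math

# Purely illustrative two-parameter model of heat-sink influence versus
# distance (the exponential form is an assumption, not a measured law):
# delta_T(d) = A * exp(-d / L), with amplitude A and length scale L.
def sink_effect_c(distance_m, amplitude_c=1.0, length_scale_m=10.0):
    return amplitude_c * math.exp(-distance_m / length_scale_m)

# e.g. ~0.37 C at 10 m and ~0.05 C at 30 m, for A = 1 C and L = 10 m
```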

Evan Jones
Editor
Reply to  Michael Palmer
February 24, 2017 4:43 am

The heat sink effect surely diminishes in some sort of continuous manner. That means you need at least 2 numbers to describe the decrease as a function of distance.
Yes.
Not only that, but, say, you have a house and the front end is 15m from the sensor and the back end is 25m away. Well, the front end will have more effect than the back end. That makes it next to impossible, and certainly impractical, to calculate the effects precisely.
So Leroy crudes it out so it can actually be measured in this lifetime. It’s not exact, but, as the engineers say, it will do for practical purposes. OTOH, we have always realized that Leroy is a bit of a meataxe (although a very good meataxe), and one of the things we want to look at in followup is re-doing Leroy’s method to achieve greater and more uniform accuracy.

Reply to  Michael Palmer
February 25, 2017 7:54 am

Engineers use rules of thumb for estimating the amounts of explosive for blowing up bridges also. After the calculation, they take everything times ten just to be safe. This makes sense, as long as all you care about is that the bridge gets blown up. It is not a sufficient basis for constructing bridges.

Sylvia Marten
February 22, 2017 8:32 am

I think that Trump should offer an amnesty to fraudulent ‘climate scientists’ – come clean and walk away; if you keep cheating, go to jail.

Clyde Spencer
Reply to  Sylvia Marten
February 22, 2017 11:27 am

+1

Not Chicken Little
February 22, 2017 8:35 am

I guess I just don’t understand the whole thing, as the warmists are always telling me…
Hasn’t warming been going on for about 10,000 years now, give or take, and sea level rise the same? Don’t any direct measurements we have that could be considered even partially global go back only several hundred years at most?
What has been Man’s contribution to this warming and sea level rise over that period? Show me, and prove it. Show me, and prove, how the computer climate models accurately portray the climate as best as we can reconstruct it, over that period and over the recent directly measured past. Is that too much to ask?
Why is “climate science” not held to similar high standards as are other scientific disciplines? And God help us if engineers had to meet only the low standards of “climate science”…

Winnipeg boy
February 22, 2017 8:37 am

I like this analogy that I read on one of these sites: “What is the average color of your television annually?”
How useful is that information?

Stevan Reddish
Reply to  Winnipeg boy
February 22, 2017 11:03 am

To show how silly some averages are, I like to ask “If all my darts hit the wall around the board, can I say my average is a bulls eye?”
SR

MarkW
Reply to  Winnipeg boy
February 22, 2017 2:10 pm

Reminds me of the Texas sharpshooter fallacy.
Basically fire a bunch of shots at a barn. Find where most of the bullets hit. Paint your bulls eye there.

george e. smith
Reply to  Winnipeg boy
February 22, 2017 2:33 pm

the triple 20 will still beat your bullseye.
g

Steve Fraser
February 22, 2017 8:38 am

The surfacestations project surveyed this site, and rated it a 4 for heat source nearby.

Evan Jones
Editor
Reply to  Steve Fraser
February 22, 2017 5:17 pm

Well, it’s a Class 4 using Leroy (1999). With the upgunned Leroy (2010), it is a Class 3. (There is heat sink within 10m, but covering under 10% of the area within 10m.)

RWturner
February 22, 2017 8:58 am

Data revisionism only takes place in climastrology, and it usually happens completely opposite of what logic would dictate, i.e. lowering past temperatures and raising current ones because of UHI effect.

Evan Jones
Editor
Reply to  RWturner
February 23, 2017 4:37 am

Data adjustment is very necessary. Raw data, writ large, won’t do. But that just means it is all the more important to do the adjustments right. (And clear. And explainable. And replicable.)

February 22, 2017 9:03 am

“The data from that site appears to start in 1916 according to NASA records; it shows intermittent warming of about 0.5 degrees C over the last 100 years. However, NASA recently decided to modify that data in a direct display of political convenience to exaggerate the rate of warming, that is, making the current period appear to show increased warming. What follows is a look at the data record as given on the site of NASA’s Goddard Institute for Space Studies (GISS), a look at alternate facts.”
Are you accusing Matt Menne and Claude Williams of scientific misconduct?
For the record?
Is the Publisher of this site aware that you are making a charge of scientific misconduct?
The temperature data for a given station (site) appears as monthly averages of temperature. The actual data from each measurement (daily?) is not present.
Check GHCN Daily. duh

Jake
Reply to  Steven Mosher
February 22, 2017 10:34 am

Steve, with the facts presented in this piece, are you of the opinion that there would be need for adjusting the temperatures at this particular site? If so, why?
This is an honest question. I’m not a lay person, I am a chemist who has done some study into this protocol. I’m still not sure why adjustment would be needed. Broadening uncertainty I could understand. Shifting data points would contradict much of my training in physical science.

Evan Jones
Editor
Reply to  Jake
February 23, 2017 4:18 am

I can think of adjustments that need to be applied.
1.) There is a TOBS flip from 18:00 to 7:00 in 2010. There needs to be a pairwise comparison to adjust for the jump. (NOAA supposedly does this.)
2.) CRS units have a severe problem with Tmax trend because the bulbs are attached to the box itself — the station is carrying its own personal Tmax heat sink around on its shoulders. CRS Tmax trends are more than double any other equipment (either as warming trend OR cooling trend).
But instead of adjusting CRS to conform with MMTS (and ASOS and PRT, and, and, and), NOAA adjusts MMTS trends to match CRS units. In other words, they adjust in exactly the wrong direction. (In our study we account for the jumps, but we include MMTS units in our pairwise.)
So all CRS units need an adjustment to reduce Tmax trend. Instead, MMTS trends are increased by NOAA.
3.) It is a non-compliant Class 3 station. A microsite adjustment needs to be applied, one that will reduce the trend by somewhere between a third and a half.
It’s not that the data does not need adjustment. Unfortunately, it does. But NOAA is doing it wrong, Wrong, WRONG. (As far as I can tell, this is NOT fraud — just error.)
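A minimal sketch of the pairwise step-change idea in item 1 above, as a toy illustration rather than NOAA's actual pairwise algorithm; it assumes a single known break date and one well-correlated reference neighbor:

```python
import numpy as np

# Toy step-change correction at a known changepoint (e.g. the 2010 TOBS
# flip mentioned above). NOT NOAA's pairwise algorithm -- just the idea:
# estimate the jump from the target-minus-reference difference series,
# then remove it from the post-break segment.
def adjust_step(target, reference, break_idx):
    diff = np.asarray(target, float) - np.asarray(reference, float)
    offset = diff[break_idx:].mean() - diff[:break_idx].mean()
    adjusted = np.asarray(target, float).copy()
    adjusted[break_idx:] -= offset  # undo the estimated jump
    return adjusted
```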

Reply to  Jake
February 23, 2017 8:52 am

Evan Jones wrote, “There is a TOBS flip from 18:00 to 7:00 in 2010.”
Say what? Surely by 2010 they were automated and recording measurements every few minutes, right? So how can Time of OBServation be an issue?

Evan Jones
Editor
Reply to  Jake
February 23, 2017 4:17 pm

Heh!
Sure, the readout is on all the time for both MMTS and ASOS, so you could at any time do a reading. And you can get hourly data on ASOS, as it is.
But all NOAA ticks down is max and min. That’s all that goes into USHCN2.5 data, anyway. Yeah, you can get the hourly scoop on the Airport stations from NOAA, but only max and min go into the HCN station data. However, airports typically observe at 24:00, which is a good time to do it.
With MMTS, the hourly data could be recorded, but simply isn’t, so far as I know. Therefore, TOBS adjustment is necessary. We simply drop stations with TOBS flips and don’t adjust. (Bearing in mind that dropping TOBS-flipped stations is, in essence, the logical equivalent of an adjustment.)

Reply to  Jake
March 2, 2017 8:02 pm

That is amazing and appalling. The amounts of data are very small, by today’s standards. Data storage and transmission are practically free. The stations are expensive. Why on earth would they ever discard any data? SMH.

Kaiser Derden
Reply to  Steven Mosher
February 22, 2017 10:48 am

no, you are the one assuming misconduct … moron …

DCA
Reply to  Kaiser Derden
February 22, 2017 11:18 am

Accusations of misconduct seem to be an automatic go-to defense by alarmists. That these scientists (sic) might just be incompetent is never considered by these self-proclaimed supremacists.

Jason Calley
Reply to  Kaiser Derden
February 23, 2017 5:07 am

Misconduct or incompetence… how to decide which it is? Generally speaking, look at the patterns of the errors. Simple incompetence generally makes errors which do not have a long term alignment with some ulterior motive. Misconduct produces patterns of errors which correspond with some desire or purpose.
Remember also that incompetence can only be judged in relation to the documented credentials of the people involved. A medical doctor who even once prescribes cyanide for a headache instead of aspirin cannot claim it was simple error, incompetence on his part. What level of incompetence is believable for a PhD employed by NASA as a climate expert?
For climate scientists, this is not complicated. Are the changes that have been made to the data equally scattered, some cooling, a roughly equal number warming, with warming and cooling adjustments having no unusual patterns related to past or present? Alternatively, do the changes produce new trends which did not exist in the raw data, new trends which support plausible desires or purposes of the people making the adjustments?
I know what it looks like to me.

Evan Jones
Editor
Reply to  Steven Mosher
February 22, 2017 5:26 pm

I didn’t bring this up before, Mosh, but I guess you realize there is a big-ass problem with CRS. Tmax trend is crazy-outlier-large (either in a warming OR cooling trend — doesn’t matter which) when compared with any other equipment.
The MMTS, ASOS Hygro, or CRN PRTs all show a radically different story. Either all of them are wrong or CRS is wrong.
And you can imagine what effect THAT will have on MMTS adjustment . . .

Gloateus Maximus
Reply to  Steven Mosher
February 23, 2017 7:01 am

Galileo recanted his heretical heliocentrism, but still was right that the earth moves, contrary to Church orthodox doctrine.

Rob Dawg
February 22, 2017 9:05 am

Adjusted or not, anything that close to the Housatonic River for the last century isn’t useful for climate research.
The Housatonic River runs 140 miles from Pittsfield, MA, through Lee, Great Barrington and many other small communities on its way through Connecticut to the Long Island Sound.
Over that period, major manufacturers of everything from Plexiglas to paper used the river for dumping industrial pollution. Indeed, when the paper companies upstream were producing colored paper, you could tell the color even into the 1970s.

Jake
Reply to  Rob Dawg
February 22, 2017 10:53 am

Rob, please explain. I grew up downwind from the Naugatuck River, so on the right day the smell was nearly unbearable. Let me know how that may or may not have impacted temperature data. Are you thinking of the aerosols? If so, that would have a cooling effect, and therefore temperatures should be adjusted ….. up?

Jake
Reply to  Jake
February 22, 2017 10:54 am

I hit send prematurely …. to explain, the Naugy and the Housy rivers run pretty much parallel to each other. The Naugatuck Rubber Company dumped into the Naugy, and the smell was, well ….. we didn’t like North Winds much ……

Bill Illis
February 22, 2017 9:12 am

From the Climate Explorer. The average raw temperature anomaly in the 20 closest stations to Falls Village (one of these stations goes back to 1793).
There is absolutely no reason to adjust Falls Village based on the pairwise homogeneity algorithm unless it is rigged or faulty or so unstable that it just does not work.
http://climexp.knmi.nl/data/tlist_temperature_all_-73.37:_:41.95_10_1_-1_:___mean_19611990a.png

Reply to  Bill Illis
February 22, 2017 9:18 am

Rigged?
Mr Illis. Are you charging Matt Menne and Claude Williams with scientific misconduct?
For the record.
And is the publisher of this site standing behind this charge?

Reply to  Bill Illis
February 22, 2017 9:27 am

1793? really? the 20 closest stations?
there are 20 stations within 25km of this site.
the oldest one goes back to 1884.
If you are going to accuse people of misconduct, show your work.
I’d hate to see you and others who publish this stuff getting sued.

Sheri
Reply to  Steven Mosher
February 22, 2017 10:04 am

Veiled threats of lawsuits don’t really advance science.

Kaiser Derden
Reply to  Steven Mosher
February 22, 2017 10:49 am

nice try … you really are a nasty piece of work aren’t you …

Bill Illis
Reply to  Steven Mosher
February 22, 2017 1:07 pm

New Haven Connecticut, 1781-
http://climexp.knmi.nl/data/ta72504.3.png

Reply to  Steven Mosher
February 22, 2017 5:33 pm

Steven Mosher, currently running the BEST break-point “adjust temperatures higher” algorithm.
Hopefully, Steven is not one of those who can be prosecuted for running fake temperature adjustment algorithms. I’ve posted on boards with Steven for about 10 years now, for most of which he would have been described as a skeptic. I hope in changing sides he has not sacrificed his integrity, at least not on the prosecutable side.

Bill Illis
Reply to  Steven Mosher
February 23, 2017 4:12 am

Let’s look at Falls Village in Berkeley Earth partially managed by Steven Mosher now. This is a good representation of what this adjustment algorithm actually does.
It takes the raw temperature below.
http://berkeleyearth.lbl.gov/auto/Stations/TAVG/Figures/37510-TAVG-Raw.png
And then finds 13 different “break-points” in this raw data, separates the original record into 13 different sections, and then restitches the sections back together into a “new regional record” that goes up by almost +2.0C versus the “no change” in the raw record.
I mean, there are even 3 different “time of observation” breakpoints in the year 1983 alone. As if they changed the time of observation at this station 3 different times in 1983, all of which made the historical records go down, at a period when this whole time-of-observation problem was supposed to have been sorted out 50 years earlier.
Obviously, this is a “biased algorithm”. How it got so biased I don’t know, but I doubt it was an accident, because people would have fixed it after they found just one example like Falls Village, when there are 13,000 more just like it in the Berkeley Earth system. They would have already noticed how biased it is.
http://berkeleyearth.lbl.gov/auto/Stations/TAVG/Figures/37510-TAVG-Comparison.png
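For readers unfamiliar with the mechanics, here is a toy sketch of the "cut and restitch" procedure Bill Illis describes; this is an illustrative caricature, not Berkeley Earth's actual code:

```python
import numpy as np

# Toy version of the "scalpel" idea described above -- NOT Berkeley
# Earth's actual code. Cut the series at the given breakpoints, then
# shift each segment so its mean matches a reference (regional) series
# over the same span, and restitch.
def restitch(series, reference, breakpoints):
    """series, reference: equal-length 1-D arrays; breakpoints: sorted indices."""
    out = np.asarray(series, float).copy()
    edges = [0] + list(breakpoints) + [len(out)]
    for a, b in zip(edges[:-1], edges[1:]):
        out[a:b] += reference[a:b].mean() - out[a:b].mean()
    return out
```

The point of contention in this sub-thread is not the mechanics but the inputs: the more breakpoints declared, and the more the reference embeds a trend, the more the restitched record inherits that trend.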

seaice1
Reply to  Steven Mosher
February 23, 2017 6:19 am

Sheri: veiled lawsuit threats don’t advance science, but neither do hinted allegations of misconduct. Let’s get it out there: is Illis accusing them of misconduct or not? If not, let’s say it plain that there is NO accusation of misconduct. Then we all know where we stand.
So, Mr Illis: “Steven is not one of those who can be prosecuted for running fake temperature adjustments.” Please tell us who is running fake temperature adjustments, and please clarify that by fake you mean they are deliberately and knowingly publishing false data to mislead the public.

1sky1
Reply to  Steven Mosher
February 23, 2017 3:09 pm

Obviously, this is a “biased algorithm”. How it got so biased I don’t know but I doubt it was an accident because people would fix it after they found just one example like Falls Village…

Spot on! The technical source of the bias of the “break algorithm” lies in its fundamentally erroneous ex ante model of a monotonically declining “red noise” spectral structure for the data. That assumption, which misidentifies many sharp moves due to various quasi-cyclical components as “empirical breaks,” winds up fragmenting intact records into mere snippets, to be shifted upwards or downwards to conform to the model. Because of nearly ubiquitous–but largely unrecognized–UHI in the data base, the shifting of snippets toward the regional mean anomaly surreptitiously transfers the UHI effects to stitched-together non-urban records.
There can be little doubt that the uncritical embrace of this biasing algorithm, devised by statisticians with no expertise in geophysical signal behavior, is no accident on the part of agenda-driven “climate science.”

February 22, 2017 9:14 am

You realize that NASA GISS does not adjust this GHCN data as you claim,
and that NOAA creates the adjusted data for NASA?
Is the publisher of this site aware of the fake news in your piece?
Blaming NASA for NOAA changes in data looks irresponsible. Are you trying to do a hit job on NASA?

Reply to  Steven Mosher
February 22, 2017 11:00 am

Great points Mosher. An official agency publishing and promoting trillion-dollar policies based on a dataset has no responsibility for the accuracy of its content, whereas commenters on a website must be held to the highest standards.

Reply to  Steven Mosher
February 22, 2017 11:46 am

Steve,
do you realize that GISS, in 1961, was not supposed to be involved in surface temperature data in the first place? Their original mission was to support NASA on space exploration:
“Thus an initial emphasis on astrophysics and lunar and planetary science”
It was Dr. Hansen who changed the mission when he took over the directorship in 1981.

Reply to  Steven Mosher
February 22, 2017 12:03 pm

First, the government is responsible. Everyone who touches it has a duty to assure that it is correct regardless of which office creates it. That’s why audit trails are important.
Second, I think the Trump fire has raised temperatures a bit!

Evan Jones
Editor
Reply to  Jim Gorman
February 23, 2017 5:08 pm

Hmm. Don’t overanalyze. Especially when one is in opposition. We haven’t the data to arrive at serious value judgments. And we all have our loyalties. I know I do, and I don’t mince words, not on that subject. Life is an armed truce. But we knew that, anyway.
Besides, I may not agree with all of Mosh’s methodology or any of his conclusions, but BEST is a remarkable piece of work and has a different approach than we do.
In simple terms, he does “jumps”, while we do “gradual”. But I think a gradual, systematic, spurious bias will slip right past BEST, because BEST does jumps, not gradual. And our two biggies — Microsite and CRS bias — are gradual and systematic and not only won’t be picked up by BEST, but will serve just fine to make homogenization crash and burn.
Yet I think that by looking at both the BEST method and at our own method, when we publish — for their strengths — we (or others) might well create a better, more sophisticated method than either alone.

February 22, 2017 9:35 am

Although the data set is not labeled, these measurements are presumably min/max measurements.
The GHCN site clearly lists the data as being Max/min taken at 6pm (ideally) each day.
The temperature data for a given station (site) appears as monthly averages of temperature. The actual data from each measurement (daily?) is not present.
Here’s some data from 1916, for example:
http://www1.ncdc.noaa.gov/pub/orders/IPS/IPS-FEF45D69-ABC4-4E68-B19D-0E2C58F574D6.pdf

February 22, 2017 9:35 am

“The result is an annual “temperature” which is a measure of the energy in the atmosphere coupled with any direct radiation impingement.”
Unless the enthalpy for each data point is calculated then they are not getting a “measure of energy in the atmosphere”.
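mkelly's point is easy to quantify with the standard psychrometric approximation for moist-air enthalpy (my illustration, not from the comment): two air samples at the same temperature can carry very different amounts of energy depending on humidity.

```python
# Standard psychrometric approximation for the specific enthalpy of
# moist air, in kJ per kg of dry air (illustrative, not from the post):
#   h ~= 1.006*T + w*(2501 + 1.86*T),  T in deg C, w = mixing ratio
def moist_air_enthalpy(t_c, mixing_ratio):
    return 1.006 * t_c + mixing_ratio * (2501.0 + 1.86 * t_c)

# Same 30 C thermometer reading, very different energy content:
print(moist_air_enthalpy(30.0, 0.005))  # dry-ish air: ~43 kJ/kg
print(moist_air_enthalpy(30.0, 0.020))  # humid air:   ~81 kJ/kg
```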

Reply to  mkelly
February 22, 2017 10:46 am

Unless the enthalpy for each data point is calculated then they are not getting a “measure of energy in the atmosphere”.

I have it here, in these zips
https://sourceforge.net/projects/gsod-rpts/files/Reports/Ver%203%20beta
if you want to look and see what it says.
Email me if you have any questions about what it all means.

Ian W
Reply to  mkelly
February 22, 2017 1:16 pm

micro6500 February 22, 2017 at 10:46 am
Thanks, a lot of work. As I cannot find your email, perhaps you would explain how the DayWatts and DayWattsFlat were calculated and what they represent.
(Nice to see Little Risington BTW not been on that hill for a while 🙂 ).

February 22, 2017 9:56 am

As OA points out above, this is all barking up the wrong tree. GISS does very little in handling this data. They use GHCN adjusted data; that is where to look for the adjustment activity. GHCN’s sheet listing the records and its adjustments is here. The article says that GISS does not have the daily record – again, looking in the wrong place. It is in GHCN Daily here. If you really want the raw stuff, the handwritten forms are accessible here. Metadata is accessible here.

Reply to  Nick Stokes
February 22, 2017 10:08 am

“GISS does very little in handling this data.”
So why are they handling it?
Andrew

Reply to  Bad Andrew
February 22, 2017 10:19 am

To calculate regional and national averages and their time history.

Reply to  Bad Andrew
February 22, 2017 10:21 am

“To calculate regional and national averages and their time history.”
Why can’t a computer at NOAA do that?
Andrew

Reply to  Bad Andrew
February 22, 2017 10:30 am

Nick,
GISS was originally created by Dr. Jastrow as a supporting group for NASA’s SPACE EXPLORATION projects.
They have no justification to do work that other agencies already do; it is a waste of taxpayers’ money.

Reply to  Bad Andrew
February 22, 2017 10:36 am

“They have no justification to do work that other agencies already do; it is a waste of taxpayers’ money.”
GISS was doing it first. They have a long record, and their product is well used. It would cost very little to produce (I do a similar calc on my home computer). There is no reason to stop.

Reply to  Bad Andrew
February 22, 2017 10:39 am

“There is no reason to stop.”
Yes there is. You stated it yourself. They do very little, according to your own comment.
Andrew

Reply to  Bad Andrew
February 22, 2017 10:42 am

Nick Stokes,
It sounds like you are making excuses to maintain the status quo. That’s not a very scientific posture.
Andrew

Reply to  Bad Andrew
February 22, 2017 10:45 am

Now you are a liar, Nick!
I just posted that it was originally founded to do this:
“Thus an initial emphasis on astrophysics and lunar and planetary science”
It was not about global warming or climate change at all back in 1961, when GISS was founded by astronomer Dr. Jastrow.
https://www.giss.nasa.gov/research/news/20080303/

Reply to  Bad Andrew
February 22, 2017 11:10 am

“They do very little, according to your own comment.”
My comment was that they don’t do the data handling – the adjustments for inhomogeneities, which are a minor but unavoidable part of calculating a proper global average. One of the ironies of these posts, which seem to come every month or so, is that the reason GISS attracts this misplaced attention is that they seem to run a more user-friendly website for accessing the data, although NOAA has more overall. Do you really want to lose that?

Reply to  Bad Andrew
February 22, 2017 11:12 am

Nick: “GISS was doing it first. They have a long record, and their product is well used. It would cost very little to produce (I do a similar calc on my home computer). There is no reason to stop.”
Before we launched all those expensive satellites, NASA was bubbling over with various excellent reasons to stop using surface stations. Those reasons haven’t changed, and the network has only gotten patchier since. It took a team of unpaid volunteers to even bother to document the current site conditions.
If GISS was showing a satellite-era trend significantly cooler than satellites, it would be quietly buried in a field at midnight instead of being kept on life support and implausibly promoted as more accurate than those really expensive satellites NASA wanted from taxpayers.

Reply to  Bad Andrew
February 22, 2017 11:16 am

“they seem to run a more user-friendly website”
Seriously, Nick. How much money does NASA consume to produce “a more user-friendly website”?
Why not have NOAA produce “a more user-friendly website”?
Andrew

Reply to  Bad Andrew
February 22, 2017 11:36 am

“How much money does NASA consume to produce “a more user-friendly website”?”
No use asking me. Why don’t you find out? I expect it is very little, as is the cost of producing Gistemp.

Reply to  Bad Andrew
February 22, 2017 11:51 am

Nick, if GISTEMP costs almost nothing to produce, then there’s no need for taxpayers to fund it.

Reply to  Bad Andrew
February 22, 2017 12:02 pm

“then there’s no need for taxpayers to fund it.”
Lots of things that don’t cost much are still worth doing.
Running the internet doesn’t cost that much. So do you think they should stop?

Reply to  Bad Andrew
February 22, 2017 12:05 pm

“Why don’t you find out?”
We can’t get these folks to cooperate with FOIA requests or Congressional inquiries, but surely they’ll fall all over themselves in eagerness to help some commenter on a website critical of their work.

Reply to  Bad Andrew
February 22, 2017 12:05 pm

“Running the internet doesn’t cost that much.”
Huh?
Andrew

Reply to  Bad Andrew
February 22, 2017 12:13 pm

Nick — if you’re referring to the maintenance of Internet namespace databases (and not the trillions of dollars in private infrastructure), that is administered by a nonprofit.
Would you like some help with a Kickstarter campaign?

seaice1
Reply to  Bad Andrew
February 23, 2017 6:27 am

I say this has introduced me to a new and surprising way of deciding which public projects to finance. Instead of looking at the value of the output and the costs of the input, then funding if the benefits seem to be greater than the costs, this new way is much simpler. You simply look at the costs, and if they are quite low you scrap the program! Such simplicity must be applauded. After all, if the costs are low, what does it matter if the value is huge?
This new policy will ensure that cheap and excellent value for money projects will be scrapped, and only expensive projects will be funded.

Reply to  Bad Andrew
February 23, 2017 6:43 am

“You simply look at the costs, and if they are quite low you scrap the program!”
Seaice, you are being deliberately obtuse. The reason the costs are quite low is because they aren’t really doing anything. Key point for you to try and comprehend.
Andrew

seaice1
Reply to  Bad Andrew
February 25, 2017 11:45 am

“Nick, if GISTEMP costs almost nothing to produce, then there’s no need for taxpayers to fund it.”
That is the point I was responding to. If it does not cost much there is no need to fund it. Absurd.

Kaiser Derden
Reply to  Nick Stokes
February 22, 2017 10:51 am

The article says that GISS does not have the daily record … and you confirm that by saying that GHCN has them … thanks

Reply to  Kaiser Derden
February 23, 2017 6:27 am

Immediately below the graph on the GISS site it explicitly says where the data is kept and provides a link to it:
“Key
Based on GHCN data from NOAA-NCEI and data from SCAR.
GHCN-Unadjusted is the raw data as reported by the weather station.
GHCN-adj is the data after the NCEI adjustment for station moves and breaks.
GHCN-adj-cleaned is the adjusted data after removal of obvious outliers and less trusted duplicate records.
GHCN-adj-homogenized is the adjusted, cleaned data with the GISTEMP removal of an urban-only trend.”

Evan Jones
Editor
Reply to  Kaiser Derden
February 24, 2017 5:36 am

And step 2 is where it all goes badly awry. The reason being that those breaks are fixed by doing pairwise, and upwards of 80% of the stations used are invalid on the grounds of poor microsite alone. And 100% of Stevenson Screen records have a Tmax trend that is more than spuriously doubled.
But neither microsite nor CRS bias creates a break, so it just slips through. And since bad microsite alone creates a systematic error that affects four stations out of five (even if non-CRS), NCEI “corrects” the situation not by adjusting the ~80% bad down to conform with the ~20% good stations, but adjusting the 20% good stations to match the 80% bad ones.
And that, folks, is how homogenization bombs. It works as intended if the majority of the data is good. In that case, the bad is conformed to the good. But if most of the data is bad, then it does the exact opposite. An average of good and bad data is not so great. Obviously. But misapplied homogenization takes a bad situation and, rather than making it better, makes it even worse.
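A toy numerical illustration of that majority problem (hypothetical numbers, chosen only to show the mechanism; the "pull toward the neighborhood median" is a caricature of homogenization, not any agency's algorithm):

```python
import numpy as np

# Hypothetical trends in degC/decade: station 0 is well-sited, the other
# four share a spurious warm bias. A crude pull toward the neighborhood
# median stands in for homogenization here.
trends = np.array([0.10, 0.32, 0.30, 0.35, 0.28])
homogenized = 0.5 * trends + 0.5 * np.median(trends)
print(homogenized[0])  # the unbiased station is dragged up to 0.20
```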

Reply to  Nick Stokes
February 22, 2017 11:39 am

“I expect it is very little, as is the cost of producing Gistemp.”
I guess so, since they don’t really do anything.
Andrew

John Mauer
Reply to  Nick Stokes
February 23, 2017 1:17 pm

Thanks, Nick. I am exposing my ignorance, but this was the easiest way to ask the question. I live about 20 miles south and was really curious why the old temperature data was modified.

Michael Carter
February 22, 2017 9:57 am

I am at the very beginning of studying an extended and near-complete record of one station in New Zealand. It is an exercise to see how the modern record has been created. One must start with how the early raw data were read and how monthly and yearly averages were established.
My understanding is that nowadays average (mean) daily temperature is established through finding the mean between max and min daily temp. But: when was it that instruments could automatically record max and min? This is what I am researching at the moment. As yet I can find no record of thermometer specs throughout the record.
The most likely scenario before automation (to establish max and min) is that the reader recorded temp at specific times (or a time) in the day. It is most unlikely that a reader would attempt to find the max and min on a daily basis. Reading the device at 9 am in winter has very different implications to reading at the same time in summer. The first (winter) may well record the min temp, but the second (summer) most probably won’t. Many of our stations were situated at hydro power stations, forests and research institutes. I cannot imagine these staff getting out of bed at 5 am in summer to ensure that they record a min temp. Neither would they hang around a station during the afternoon to find the max.
There are people in our system who know what was done to establish the “mean” from early data which I am assuming were recorded at specific times during the day. I am going to keep digging until I get an answer.
This is the most basic of questions

Reply to  Michael Carter
February 22, 2017 1:16 pm

To Michael: The first thermometer to read the maximum and minimum temperatures was invented in 1780 by James Six of Canterbury (UK). It recorded the current temp and also the max and min since last read. It had no time clock, so the time of the occurrence of the max/min temp was not known, but it was probably read every 24 hours. It needed to be reset before recording the next 24 hours’ temps.

Michael Carter
Reply to  kendo2016
February 22, 2017 8:22 pm

Yep – you’re right. Found out today through research 🙂

Reply to  kendo2016
February 23, 2017 6:18 am

This is like the one I used to use in the early 60s in my school’s Stevenson screen.
I used to make the measurement at lunchtime every day and reset the indicators.

Reply to  Michael Carter
February 23, 2017 6:22 am

The link I gave earlier showed that at Falls Village the max/min temp was read at 6pm although the records from 1916 showed the actual time was more variable but still in the afternoon.

Mike Takacs
February 22, 2017 10:09 am

Mosher, you are quite litigious today. Taking lessons from your pal Mikey?

February 22, 2017 10:42 am

Goddard Institute for SPACE STUDIES
Founded by Dr. Jastrow in 1961 and directed by him for the first 20 years. Note that INITIALLY it was: “Thus an initial emphasis on astrophysics and lunar and planetary science”
“The GISS formula is designed to yield high research productivity and a flexibility that allows research directions to shift as NASA objectives develop. Thus an initial emphasis on astrophysics and lunar and planetary science has evolved into a focus on understanding the causes and consequences of global climate change on Earth, of direct relevance to the first objective in NASA’s mission “…to understand and protect our home planet.”
Guess who became Director in 1981,who changed the mission to Climate Change?
https://www.giss.nasa.gov/research/news/20080303/

Pablo
February 22, 2017 10:48 am

A surplus of solar energy on a dry surface produces only sensible heat, which makes for a very hot surface and equally hot air above.
Evaporation from an equally solar radiated moist surface replaces some of that sensible heat with latent heat which cools the surface and warms the air somewhere else far away as vapour condenses.
Hence tropical rainforests probably do more to cool the surface and move heat somewhere else than anything else.
Without tropical rain forests rainfall would cease to be as regular, the land would be subject to extremes, ecosystems would be destroyed and mankind would have a tougher time surviving.
Speaking as an eco-warrior, justifiably concerned about mankind’s real negative impacts around the world, I’d say the whole CO2 misconception/distraction (to put it kindly) has given environmentalism a bad name and has been a huge embarrassment for science, from which it will take a while to recover.

Reply to  Pablo
February 22, 2017 11:11 am

Speaking as an eco-warrior, justifiably concerned about mankind’s real negative impacts around the world, I’d say the whole CO2 misconception/distraction (to put it kindly) has given environmentalism a bad name and has been a huge embarrassment for science, from which it will take a while to recover.

I’m not, but I have complained about this for 10+ years. Think what that money wasted on climate change could have done!

bit chilly
Reply to  micro6500
February 22, 2017 3:46 pm

have a +1 from me on that .

stan stendera
Reply to  micro6500
February 22, 2017 9:30 pm

Bit Chilly. U R a piker. +100 from me!!!!

DWR54
February 22, 2017 10:48 am

Not sure if this has been mentioned before and apologies if so; I haven’t had time to read the entire thread.
Berkeley Earth (BE) also analysed Falls Village and came up with results similar to NASA. Here are the data plotted using BE’s breakpoint algorithm: http://berkeleyearth.lbl.gov/auto/Stations/TAVG/Figures/37510-TAVG-Comparison.pdf
This suggests that recent temperatures at Falls Village fall below those recorded at nearby stations, indicating a local discrepancy of some sort, defined as an ‘Empirical Break’.
(Can’t help noticing that the tree beside the screen in the photo is casting a shadow over the screen. Was that always the case, I wonder?)

Kaiser Derden
Reply to  DWR54
February 22, 2017 10:53 am

we don’t live in a pristine environment … there will be local differences driven by a lot of factors, and averaged out over the globe that is fine … there is no need to adjust every location to a pristine baseline … nobody lives in a pristine baseline …

DWR54
Reply to  Kaiser Derden
February 22, 2017 10:57 am

Perhaps the fact that we don’t live in pristine environments is one good reason why we ‘should’ adjust for non-climatic influences. Shouldn’t we adjust for influences like UHI, for example?

Tom Halla
Reply to  DWR54
February 22, 2017 11:11 am

Many statistical tests for significance assume a random distribution of errors. If someone “adjusts” the sample, it makes the tests rather invalid.

Reply to  Kaiser Derden
February 22, 2017 11:39 am

DWR54,
how do we know what the UHI effect for EACH temperature recording station is?
That alone is why surface station data is a big mess.

Jason Calley
Reply to  Kaiser Derden
February 23, 2017 12:22 pm

Hey DWR54! “Shouldn’t we adjust for influences like UHI, for example?”
Yes, if the influences can be justified and quantified. But justification and especially quantification require information. Do we really have some new source of information that allows us to correctly adjust data from the 1880s? No? Then why are the old numbers changing?

Bindidon
Reply to  Kaiser Derden
February 23, 2017 2:53 pm

Jason Calley on February 23, 2017 at 12:22 pm
No? Then why are the old numbers changing?
This is always the same question, like an eternal refrain.
All anomalies computed in a series relative to the average of a given “baseline” period (e.g. 1951-1980 or 1981-2010) will change every time any absolute value within the baseline is changed (for example, to correct an error).
But all the other absolute values in the time series are left unchanged.
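A toy example of Bindidon's point, with hypothetical numbers: correcting a single value inside the baseline period shifts every anomaly in the series, while all the other absolute values stay untouched.

```python
# Toy illustration: anomalies are relative to the baseline mean, so
# correcting one baseline value shifts every anomaly (hypothetical data).
temps = [10.0, 11.0, 9.0, 12.0, 10.5]        # annual means; first 3 = baseline
base_mean = sum(temps[:3]) / 3.0             # 10.0
anoms = [t - base_mean for t in temps]

temps[1] = 10.4                              # correct an error in a baseline year
base_mean = sum(temps[:3]) / 3.0             # now 9.8
anoms_after = [t - base_mean for t in temps] # every anomaly moved by +0.2
```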

Reply to  DWR54
February 22, 2017 12:16 pm

If a tree has grown and is now shading it, why would you adjust data from 100 years ago rather than more current data? Part of the problem is that there is no audit trail for any of this. This is beside the point of just what global temperature really means and what it actually indicates. If you use fudged-up data to calculate some fudged-up figure, all you have is something that is meaningless.

bit chilly
Reply to  DWR54
February 22, 2017 3:51 pm

most rivers and stream run in a valley of some sort . even a relatively small depression over a significant length appears to be colder than the surrounding area, often by a significant amount. in-laws live on the bank of the local river half a mile south of me and around 300 feet lower elevation. i often see temperatures up to 4 c cooler in the morning on the car temperature display than when i left my house 5 minutes previously.

Reply to  DWR54
February 23, 2017 2:56 pm

Or a change in predominant wind direction off the river…

Reply to  DWR54
February 24, 2017 9:53 am

Yes, the Stevenson screens were adopted when it was discovered that not every weather station had a handy shade tree, so that the standard “temperature in the shade” might not always be the actuality as the sun hit it. There is shade on the shade, and both the shrubbery and the shelter minimally interfere with breezes.
J.K. Mackie’s (variant spellings in the family include M’Kie, Mackey, MacKay…) record from March 1916 shows that they recorded a daily max, min, and the range, and then took an arithmetic average max per day and an average min per day each month. Wouldn’t be surprised if they just averaged those 2 to arrive at a monthly aggregate “average”…or at least a central tendency. As long as it is consistent, it should be OK and not bias a trend into the mix.
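In code, the 1916-style bookkeeping described above would amount to something like this (the final midpoint step is the assumption flagged above):

```python
# Sketch of the 1916-style monthly bookkeeping described above: average
# the daily maxima, average the daily minima, then (assumed final step)
# take their midpoint as the monthly "average".
def monthly_average(daily_max, daily_min):
    mean_max = sum(daily_max) / len(daily_max)
    mean_min = sum(daily_min) / len(daily_min)
    return (mean_max + mean_min) / 2.0
```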

February 22, 2017 10:50 am

This dataset was supposed to be obsoleted decades ago with the advent of satellites and the shutting down or degrading of so many surface stations. Instead, it became a wonderful opportunity to promote a political agenda with adjustments that seem plausible on the surface (haha) but are deeply problematic when delving into the devils of the details.

Reply to  talldave2
February 22, 2017 11:50 am

Really hope Trump’s team just takes an axe to the whole GISS temperature dataset.
But Nick Stokes should feel free to keep publishing it from his PC, at no cost to taxpayers.

Reply to  talldave2
February 22, 2017 2:40 pm

I use unadjusted temperatures. I can use adjusted. It makes very little difference.

Reply to  talldave2
February 22, 2017 8:13 pm

Homogenization is far from the only adjustment. But make up any rules you want, it’s your dataset.

Reply to  talldave2
February 22, 2017 9:58 pm

Forrest,
“See the problem with your line of argument?”
I wrote a code using quite different methods to GISS, described here. It is similar to what BEST later used. I use unadjusted GHCN data, which is a difference, but has little effect on the result. But you don’t know that until you have done it. Where there are clear inhomogeneities, you have to adjust for them, even if it all balances out in the end. I can just see people here taking the other tack if they didn’t (Negligent!).

Reply to  Nick Stokes
February 23, 2017 4:12 am

I use unadjusted GHCN data, which is a difference, but has little effect on the result.

that’s because your processing generates the trends itself. You guys screw with the data so much, it doesn’t matter much.
It’s interesting: I use the measurements as is. When I scrape off the day-to-day change of min temp and average out over a year, if I take the last 30 years and invert it, it’s a good match to satellite temps, which makes me think the satellites are detecting the heat passing through the troposphere.

Reply to  talldave2
February 23, 2017 5:14 am

Soon, Nick Stokes will be running a duplicate version of the internet on his PC. because running the internet doesn’t cost that much. lol
Andrew

Bindidon
Reply to  talldave2
February 23, 2017 9:45 am

talldave2 on February 22, 2017 at 10:50 am
Why should a dataset become obsolete with the advent of satellites when both look so similar?
Here is a chart comparing, exclusively for the GHCN V3 FALLS VILLAGE station, unadjusted and adjusted data together with the UAH6.0 2.5° grid cell just above the station:
http://fs5.directupload.net/images/170223/suuzbhch.jpg
You see that all three plots differ so little that any claim about so-called adjustments really sounds a bit paranoid.
But not only the plots show such convergence. Numbers do as well, e.g. highest and lowest temperature anomalies wrt 1981-2010 (in °C) from December 1978 till December 2016.
Highest is December 2015 for all three datasets
– UAH: +5.35
– GHCN unadj: +6.79
– GHCN adj: +7.04
Lowest is February 2015 for both GHCN datasets as well (December 1989 for UAH)
– UAH: -5.32
– GHCN unadj: -8.28
– GHCN adj: -8.33
The similarities between surface and troposphere temperatures at peaks and downs in the chart are sometimes amazing, especially when you look at it in a pdf file.
This of course you see only when looking at anomaly based charts: there are about 24 °C difference between UAH and GHCN, making comparisons of absolute values impossible.

J Mac
February 22, 2017 11:11 am

If the data doesn’t fit, adjust it a bit!
Climatologist’s maxim: One fudged data table is worth a thousand weasel words!

February 22, 2017 11:25 am

Back in 2011 I tried to find out exactly what accounted for the NOAA/GISS U.S. temperature adjustments. I wrote about that attempt in a comment on WUWT in 2012, here:
https://wattsupwiththat.com/2012/08/13/when-will-it-start-cooling/#comment-1057783
Approximately all of the reported warming in the U.S. 48-State surface temperature record from the 1930s to the 1990s was due to adjustments. So I “asked the Climate Science Rapid Response Team” (a/k/a, the Defenders Of The Faith, the Congregation for the Doctrine of Anthropogenic Global Warming) to help me locate the old data, and to explain the alterations which had added so much apparent warming to the U.S. surface temperature record.
They were unable to do so, though they did direct me to some interesting material — some of which made me queasy.
In the WUWT conversation, Amos Batto claimed that the “data and software algorithms are publically available.” But when I told him that I couldn’t find it, and I asked him to find it, he went away.
I never did find an explanation for the majority of those temperature adjustments, nor did I ever track down the original data graphed in Hansen’s 1999 paper. Eventually I reconstructed the data, pretty closely, by digitizing Hansen’s graph, using WebPlotDigitizer.

February 22, 2017 5:36 pm

That the author and most commenters seem ignorant of the fact that raw GHCN dailies are available from NOAA is a real head scratcher. Nick Stokes even supplied a link to the raw daily file for Falls Village, which includes TMAX, TMIN, TOBS, precip, snow, and snow depth, plus measurement, quality, and source flags. Data begins Feb 1916. Snow and snow depth measurements stop in 2010, precipitation measurements stop in 2014. (Why?)
When someone says they are using raw data, however, I wonder how they deal with missing days and data flagged for various “quality issues.”
Glancing at Falls Village, the biggest temperature increase appears to be balmier summer nights. No trend in precipitation. Record high temperature of 104F in 2002 beat the previous 103 in 1933. Has anyone asked long-term residents if they feel climatically threatened?

Reply to  verdeviewer
February 23, 2017 11:56 am

“Snow and snow depth measurements stop in 2010, precipitation measurements stop in 2014. (Why?)”
It’s a coop (volunteer) station, and seems to be fading out. You can read this in the metadata here. Why it is happening in stages is a mystery.

Reply to  Nick Stokes
February 24, 2017 10:30 am

Seems a shame to shut down a station in continuous operation since 1916.
The Norfolk 2SW coop station, 7.8 miles away, has continuous records since 1943, but with many missing days. It shows more warming than Falls Village over the same period. The NWS shut down the hydrological reporting for that station in 2010 and has since been installing automated equipment:
http://www.greatmountainforest.org/workingforest/weather/history.php
What a difference 7.8 miles makes in extreme highs. Falls Village reached its record high of 104F in 2002, 5F above its 2001 high. Norfolk reached its record high of 98F in 2001, 7F above its 2002 high. Extremely odd.

Reply to  Nick Stokes
February 24, 2017 12:19 pm

More on the lack of correspondence between record TMAX days at Falls Village and Norfolk, CT.
Falls Village:
2001-08-03 — 91
2001-08-04 — 85
2001-08-05 — 88
2001-08-06 — 92

2002-07-29 — 93
2002-07-30 — 104
2002-07-31 — 91
Norfolk:
2001-08-03 — 87
2001-08-04 — M
2001-08-05 — 98
2001-08-06 — 81

2002-07-29 — 75
2002-07-30 — 87
2002-07-31 — 85
An argument against interpolation?

February 22, 2017 5:37 pm

Where is that outflow coming from? Is that the sewage treatment plant?

Reply to  verdeviewer
February 22, 2017 5:47 pm

Ah, never mind, it’s the hydro outlet. Built in 1914.
