Through The Looking Glass with NASA GISS

Guest essay by John Mauer

One aspect of the science of global warming is the measurement of temperature on a local basis. The local data are then placed on a worldwide grid and extrapolated to areas that have no data. The local measurement site in the northwest corner of Connecticut is in Falls Village, near the old hydroelectric powerhouse:

[Photo: Falls Village weather station, Stevenson Screen at right]
[Aerial view of Falls Village GHCN weather station. Note power plant to the north. Image: Google Earth]

The data from that site appears to start in 1916 according to NASA records; it shows intermittent warming of about 0.5 degrees C over the last 100 years. However, NASA recently decided to modify that data in a direct display of political convenience, making the current period appear to warm more than the raw record shows. What follows is a look at the data record as given on the site of the NASA Goddard Institute for Space Studies (GISS), a look at alternate facts.

The temperature data for a given station (site) appears as monthly averages of temperature; the actual data from each measurement (daily?) is not present. GISS specifies the calculation of an annual temperature, which can be summarized in several steps.

Although the data set is not labeled, these measurements are presumably min/max measurements. Each monthly number is the average of at least 20 temperature measurements for that month. The monthly averages are then combined into seasonal (quarterly) means, with the year starting in December, and the four quarters are averaged to get an annual mean.

The entire data set is also subjected to a composite error correction, calculated from the difference of each data point from its monthly mean (the monthly anomaly). The seasonal correction is then the average of the corresponding monthly corrections, and the yearly correction is the average of the seasonal corrections. The net of these corrections is added to the average. The result is an annual “temperature,” which is a measure of the energy in the atmosphere coupled with any direct radiation impingement.
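To make the averaging concrete, here is a minimal sketch of the seasonal and annual step just described (the anomaly-based corrections are omitted). The monthly values are invented for illustration; this is not the actual GISS code.

```python
# Minimal sketch of the seasonal/annual averaging described above (without
# the anomaly corrections). Monthly values below are invented, not GISS data.
import numpy as np

def annual_mean(monthly_this_year, december_prev_year):
    """monthly_this_year: 12 monthly means Jan..Dec (deg C);
    december_prev_year: December mean of the previous year (deg C)."""
    djf = np.mean([december_prev_year] + monthly_this_year[0:2])  # Dec-Jan-Feb
    mam = np.mean(monthly_this_year[2:5])                         # Mar-Apr-May
    jja = np.mean(monthly_this_year[5:8])                         # Jun-Jul-Aug
    son = np.mean(monthly_this_year[8:11])                        # Sep-Oct-Nov
    return np.mean([djf, mam, jja, son])                          # annual = mean of the four seasons

# Invented monthly means for one year (deg C):
prev_dec = 1.5
months = [-3.2, -1.0, 4.1, 9.8, 15.2, 20.1, 23.4, 22.7, 18.0, 11.2, 5.3, 2.0]
print(round(annual_mean(months, prev_dec), 2))
```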

[Figure: Falls Village raw annual temperature record]

The plot of 101 years of temperatures in Falls Village is shown above. Although there is routinely a great deal of variation from year to year, several trends can be separated from the plot. A small but noticeable increase occurs from 1916 through about 1950. The temperature then decreases, with much less volatility, until 1972, and increases again until roughly 1998, after which it holds steady to the present. However, the present volatility is very high.

The El Niño events (a change in ocean currents with corresponding atmospheric changes) of 1998 and 2016 are very visible, but not unique in Falls Village. (Please note that Wikipedia editing on the subject of climate change is tightly controlled by a small group of political advocates.)

Sometime in the last few years, just before the Paris Conference on Climate Change, GISS decided to modify the temperature data to account for perceived faults in its representation of atmospheric energy. While the reasons for change are many, the main reason given appeared to be urban sprawl into the measurement sites. They said, “GISS does make an adjustment to deal with potential artifacts associated with urban heat islands, whereby the long-term regional trend derived from rural stations is used instead of the trends from urban centers in the analysis.” Falls Village is a small rural town of approximately 1,100 people, surrounded mostly by farmland.

[Figure: GISS adjustments to the Falls Village raw temperature data]

The plot of all the changes to the raw temperature data from Falls Village is shown above. First, of course, several sections of data are ignored, presumably because of some flaw in data collection or equipment malfunctions. Second, almost all of the temperatures before 1998 are reduced by 0.6 to 1.2 degrees C, which makes the current temperatures look artificially higher. Curiously, this tends to fit the narrative of no pause in global warming that was touted by GISS.

[Comparison of raw vs. final data. Source: NASA GISTEMP]

Link: https://data.giss.nasa.gov/cgi-bin/gistemp/stdata_show.cgi?id=425000626580&ds=5&dt=1

Further, if the reasoning of GISS is taken at face value, the apparent temperatures of New York City would be affected. Falls Village (actually the Town of Canaan, Connecticut) is about two hours outside of the City and is part of an expanded weekend community.

 


244 Comments
Ian W
February 22, 2017 6:11 am

It would appear that GISS is ripe for a full 3rd party audit. Followed by a validation of their methods against actual data for sites where the values were recorded on paper and have been kept. That would seem to be an ideal exercise for some metrologists and statisticians.

Reply to  Ian W
February 22, 2017 8:13 am

take that one further again, if the original records are not available, the custodians are not to be considered scientists at all.
I recall seeing books in the past here in Australia published by BOM with recorded temperatures for major towns and cities in various states, page after page of nothing but tabulated temperature records. I’d love to find one of these to be able to cross reference it against the ‘temperature records’ of today.

RWturner
Reply to  Ian W
February 22, 2017 8:55 am

GISS should simply be closed. NASA is being refocused for space exploration after all.

george e. smith
Reply to  Ian W
February 22, 2017 12:40 pm

I would say that both the olive colored and the blue black (on my screen) curves are fake.
There clearly are discrete points on the graph. I’ll assume that those are accurate measured numbers; as accurate as a reasonable thermometer can be expected to be. (What is the name of the person who measured those accurate temperatures ?)
Every one of those points lies on the curve; exactly. Well that is great. The Temperature should be a curve of some shape or other.
And if it is a real curve and not a fake curve, then of course it MUST go through (exactly) every one of those accurately measured points. (and it does).
So maybe the curve is real after all. Now what if they made measurements more often; let’s say just one more measurement in that total time frame. Well that extra point of course MUST also fall on that curve, because we have now decided it must be a real curve.
Well that would also be true if we took one extra measurement halfway between the points shown on the graph. Well every one of those extra points must also lie on the exact same curve because we just said it is a real curve.
So no matter how many extra Temperature readings we take, the curve will not change. It already records every possible Temperature that occurred during the total time depicted.
But what can we predict from this real accurate curve of Temperatures over the total time interval?
One thing is apparent; no matter how many measurements we make, we can never predict whether the very next measurement we make will be higher, or lower, or maybe exactly the same, as the measurement we just made. There is NO predictability of ANY future trend; let alone any future value.
Well there is only one problem with this conjecture.
It is quite apparent that the given curve, which we believe to be true, and not a fake, has points of infinite curvature on it. The slope of the curve can change by any amount at each and every point of measurement, without limit.
So we must conclude that the given curve is NOT a band limited function.
It includes frequency components all the way from zero frequency to infinite frequency.
But when we look at the data set of measured accurate Temperatures used to construct this real non fake curve, we find that there is not an infinite number of data points in the set; just a few dozen in fact.
So clearly the data set is not a proper sampled set of the plotted curve function. It is in fact grossly under-sampled, so not only can we not reconstruct what the real function is, we are under-sampled to such a degree, that we cannot even determine any statistical property of the numbers in the data set, such as even the average. It is totally corrupted by aliasing noise, and quite unintelligible.
So I was correct right at the outset.
Both of those curves are fake; and do not represent anything real that anybody ever observed or measured.
What did you say was the name of the chap(ess) who read that thermometer ??
G

DisGISSted
Reply to  Ian W
February 22, 2017 12:58 pm

This sort of behaviour by GISS is, in my opinion, at about the level of paedophile priests. The data handling equivalent of ‘kiddy fiddling’.

1sky1
Reply to  Ian W
February 22, 2017 4:51 pm

It is quite apparent that the given curve, which we believe to be true, and not a fake, has points of infinite curvature on it. The slope of the curve can change by any amount at each and every point of measurement, without limit.
So we must conclude that the given curve is NOT a band limited function.
It includes frequency components all the way from zero frequency to infinite frequency.

This is a complete fallacy! The frequency response of all glass thermometers band-limits the measured temperature variations to frequencies lower than a few cycles per minute. Furthermore, the decimation of daily mean temperatures into monthly averages shifts the effective cut-off frequency into the cycles per year range. The monthly averages are then further decimated into yearly data, narrowing the baseband range to half a cycle per year. Band-limited signals contain only finite power; thus there ARE limits to how much the slope can change month to month or year to year. And aliasing is by no means the overwhelming factor erroneously claimed here.
The real problem lies in the gross falsification of actual measurements through various ad hoc adjustments and the fragmentation of time series perpetrated in Version 3 of GHCN. Even the unadjusted (olive curve) does not show the average yearly temperature as indicated by actual measurements.

Richie
Reply to  Ian W
February 23, 2017 4:55 am

Dementia setting in? I recall that during the first week of March in Dallas sometime in the early ’60s, it hit 106F. Looking at an official record of Dallas Love Field temps the other day, the warmest day I could find for the period was 97F. It’s possible what I remember from the ’60s was actually a record temp at a non-official station, perhaps Meacham Field in Fort Worth. But what I wonder is whether that 106 figure has been reduced in the record books by “adjustments”?

Reply to  Ian W
February 23, 2017 7:14 am

george e. smith February 22, 2017 at 12:40 pm
I would say that both the olive colored and the blue black (on my screen) curves are fake.
There clearly are discrete points on the graph. I’ll assume that those are accurate measured numbers; as accurate as a reasonable thermometer can be expected to be. (What is the name of the person who measured those accurate temperatures ?)

Hi George, in 1916 it was someone called ‘J.K. Maclein’ (well, Mac-something; his handwriting is a bit difficult to read).

george e. smith
Reply to  Ian W
February 23, 2017 2:52 pm

Well I can’t even imagine what 1sky1 is even talking about.
It matters not a jot what sort of thermometers are used to make Temperature measurements so long as they give accurate readings. I took it for granted that whoever made the measurements used to construct those graphs, took accurate readings.
So I have NO PROBLEM with the measured Temperatures, nor with whoever made those measurements.
The problem is with whoever was the total idiot who drew those two colored graphs.
A scatter plot of the original measured Temperature values would have made a wonderful graph.
For the benefit of 1sky1 and others similarly disadvantaged; a scatter plot is a plot of INDIVIDUAL POINTS on an X-Y graph or other suitable graphical axis system.
For the Temperatures measured by whomever it was, the Temperature is the Y value, and the presumed time of measurement is the X value.
Scatter plots can easily be plotted by anyone with a copy of Micro$oft Excel on their computer.
If the total idiot who drew those graphs had used Excel, (s)he could have made a wonderful scatter plot graph of those measurements.
Moreover, that person could even have had Excel construct a CONTINUOUS graphical plotted curve that passes through EVERY ONE of those measured data points EXACTLY; perhaps using some sort of cubic spline or other approximation.
Such a CONTINUOUS curve, would not necessarily be an exact replication of the real Temperature that was the subject of the original measurements; but it would be acceptably close to what that exact curve would have been, and points on such a fitted CONTINUOUS curve intermediate between the measured values, would have been close to what real value could have been measured at that time.
But the total idiot who drew these two graphs chose to just connect the measured and scatter plotted points with straight line segments, which results in a DISCONTINUOUS, NON BAND LIMITED, INFINITE FREQUENCY RESPONSE FAKE GRAPH.
Apparently, 1sky1 doesn’t even know the difference between a CONTINUOUS function and a DISCONTINUOUS function.
The first one is band limited, and can be properly represented by properly sampled point data values.
The second one has no frequency limit whatsoever, so it cannot be properly represented by even an infinite number of sampled values.
I suggest 1sky1 that you take a course in elementary sampled data theory, before throwing around words like “fallacy” or “erroneously”, whose meanings you clearly don’t understand.
G

1sky1
Reply to  Ian W
February 23, 2017 3:48 pm

Well I can’t even imagine what 1sky1 is even talking about.

At least you’ve got that much right, George! Since temperature is a continuous physical variable, any periodic sampling will generate a discrete time series. The crucial question is how well that sampled time-series represents the continuous signal. That’s what I address in pointing out the various band-limitations imposed by instruments and data averaging. (With well-sampled time-series, band-limited interpolation according to Shannon’s Theorem can always be employed to reconstruct the underlying continuous signal.)
In stark contrast, you’re apparently concerned with the purely graphic device of connecting the discrete yearly average data points by straight lines. That’s looking at the paper wrapping instead of the substantive information content. The notion of using some spline algorithm on an Excel scattergram of the time series (instead of Shannon’s Theorem) to obtain a continuous function speaks volumes. It shows that, despite throwing around the terminology of signal analysis, its well-established fundamentals continue to escape you.
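As a concrete illustration of the band-limited reconstruction referred to above, here is a minimal sketch of Whittaker-Shannon (sinc) interpolation applied to an arbitrary synthetic signal; it is not climate data and not anyone’s production code, and truncation of the finite record means the fit is only approximate near the ends.

```python
# Sketch of Whittaker-Shannon (sinc) reconstruction of a band-limited signal
# from uniform samples. The test signal and sample rate are arbitrary choices.
import numpy as np

fs = 10.0                          # samples per unit time; Nyquist frequency = 5
T = 1.0 / fs
t_samp = np.arange(0.0, 4.0, T)    # sample instants

def signal(t):
    # Band-limited test signal: highest frequency 3.1 < Nyquist (5)
    return np.sin(2*np.pi*1.3*t) + 0.5*np.cos(2*np.pi*3.1*t)

x_samp = signal(t_samp)

t_fine = np.linspace(0.0, 4.0, 2000)
# x(t) = sum_n x[n] * sinc((t - n*T) / T), using numpy's normalized sinc
x_rec = np.array([np.sum(x_samp * np.sinc((t - t_samp) / T)) for t in t_fine])

err = np.abs(x_rec - signal(t_fine))
interior = (t_fine > 0.5) & (t_fine < 3.5)   # away from the record ends
print(f"max error, interior: {err[interior].max():.3f}; near the ends: {err[~interior].max():.3f}")
```

Run as written, the interior error is much smaller than the error near the ends, where truncation of the sinc sum dominates.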

george e. smith
Reply to  Ian W
February 24, 2017 9:42 am

Well I see it is a total waste of time trying to educate the trolls.
1sky1 simply doesn’t understand the fundamental concept that NO PHYSICAL SYSTEM is accurately described by a DISCONTINUOUS FUNCTION.
Ergo; by definition, a discontinuous graph purporting to be an accurate plot of some measured data from a real physical system, is FAKE. No real system can behave that way.
Gathering the data; of necessity sampled, is one issue. I have no problem with how this particular data was obtained.
How it is presented is a totally different issue that 1sky1 does not seem able to grasp.
Shannon’s theorem on information transmission is not even relevant to the issue. It’s a matter of interpolating real measured data with totally phony fake meta-data.
G

1sky1
Reply to  Ian W
February 24, 2017 2:55 pm

Well I see it is a total waste of time trying to educate the trolls.

Being “educated” by amateurs who are patently clueless about the rigorous theory of reconstruction of continuous bandlimited signals from discrete samples (http://marksmannet.com/RobertMarks/REPRINTS/1999_IntroductionToShannonSamplingAndInterpolationTheory.pdf) provides no intelligent benefit. On the contrary, it prompts senseless fulminations about “fake” graphs of well-sampled time series, based upon nothing more than the superficial impression of how the discrete data points are visually connected.

george e. smith
Reply to  Ian W
February 28, 2017 2:13 pm

Well sky see if you can giggle yourself up a book that’s a bit more on the ball, than the one you linked to the contents pages of.
Nowhere in that entire text book, does it teach reconstruction of a band limited continuous function from properly sampled instantaneous samples of the function, by simply connecting the properly sampled data points, with discontinuous straight line segments. Such a process does not recover the original continuous function, which a correct procedure will do.
And your reference text is a bit of a Johnny-come-lately tome anyway.
Well, as was Claude Shannon, whose writings on sampled data systems came about 20 years after the original papers by Hartley and Nyquist of Bell Labs, circa 1928.
I’m sure there are precedent papers from the pure mathematicians, preceding Hartley’s Law, and Nyquist’s Sampling Theorem, but then Cauchy and co, weren’t exactly in the forefront of communications technology as was Bell Telephone Laboratories.
G

February 22, 2017 6:12 am

It always surprises me that past measurements can be read so much better now, after 80 years, than they could at the time.
In 2097 GISS will probably record that we are hunting woolly mammoths today.

AGW is not Science
Reply to  M Courtney
February 22, 2017 6:58 am

“In 2097 GISS will probably record that we are hunting woolly mammoths today.”
LMFAO – glad I didn’t have food in my mouth when I read that!

Latitude
Reply to  M Courtney
February 22, 2017 7:59 am

This is the one thing I have never understood … and in my opinion should be the main sticking point.
Forget accurate temperature measurements….and everything else.
They run an algorithm every time they enter new temp data.
They say any adjustments to the new data…..is more accurate than the old data.
..so the algorithm retroactively changes the old data….every time they enter new data.
And yet everything published about global warming is based on the temp history today.
…which will be entirely different tomorrow
We will never know what the temp is today…..because in the future it’s constantly changing
oddest thing about it all….the models reflect all of that past adjusting
and inflate global warming at exactly the same rate as all the adjustments

Reply to  Latitude
February 22, 2017 8:24 am

oddest thing about it all….the models reflect all of that past adjusting
and inflate global warming at exactly the same rate as all the adjustments

Not odd at all, they base their adjustments on the same theory used in the models.
Mosher has stated more than once, they create a “temp” field mostly based on Lat, Alt, and distance to large bodies of water, and the rest is just noise.

Aphan
Reply to  Latitude
February 22, 2017 10:06 am

Remember when Nancy Pelosi said “We have to pass the bill before we can know what’s in the bill!”?? Well, climate science is like that…”We have to adjust the temperatures before we can know what the temperature is/was”.
Makes perfect sense. 🙂 *snark*

Alan the Brit
Reply to  M Courtney
February 22, 2017 8:44 am

But we are hunting them today, there’s one over the way it’s huge & grey & hairy it eats children & it’s……………..oh sorry it was a big red bus instead dropping off some school kids!!! By mistake! sarc off!

Reply to  Alan the Brit
February 22, 2017 1:17 pm

School buses are sunflower yellow here in the states.

Reply to  Alan the Brit
February 22, 2017 2:21 pm

Sounds like your buses have liver damage.
Perhaps they’ve had too much bioethanol?

DHR
Reply to  M Courtney
February 22, 2017 9:54 am

Actually, it seems that only thermometer data recorded around the year 1970 are accurate, according to GISS. They have reduced the readings prior to that date and increased them after. See climate4you for the details. I am now searching for a circa-1970 thermometer in my box of old stuff in the garage. It seems it is the only device that works ok, according to the world class scientists at GISS.

alfred giesbrecht
Reply to  DHR
February 22, 2017 1:00 pm

DHR; is this a fact? Could you show it. alf

bit chilly
Reply to  DHR
February 22, 2017 3:18 pm

brilliant dhr 🙂

Thomas
Reply to  M Courtney
February 22, 2017 4:05 pm

Ha, ha, ha

TheLastDemocrat
February 22, 2017 6:29 am

Could someone develop an easy-to-follow description of this concept of “atmospheric energy?”
I understand that a temp reading is a measure of ambient temperature, and also of “atmospheric energy.”
To the degree that a thermometer, properly used, is not so great at measuring atmospheric energy as ambient temperature, it seems that atmospheric energy is measured by the use of at least two indicators: the local thermometer and some other indicator.
Apparently the other indicator or indicators is somehow simply not superior, since the thermometer is also needed. So, this other indicator and the thermometer are both flawed (as all measures are).
And, it seems that some kind of systematic bias in the thermometer can be determined based on this second indicator, although the second is acknowledged as flawed.
How does the logic go? What is the other indicator?
If the thermometer is wrong, isn’t it systematically wrong, and so all values ought to be changed the same amount? If an old thermometer was wrong, wouldn’t it be wrong every day for years? And if the new one is wrong, wouldn’t it be the same amount of wrong every day for years?
Is a thermometer more reliable at some temp ranges than others (barring extremes)?
–All of this does not add up.

Ian W
Reply to  TheLastDemocrat
February 22, 2017 6:47 am

Temperature is actually the incorrect metric for atmospheric heat energy as the amount of water vapor in the volume of air alters its ‘enthalpy’. The correct metric is kilojoules per kilogram and can be calculated from the temperature and relative humidity.
A volume of air in a misty bayou in Louisiana with an air temperature of 75F and a humidity of 100% has twice the amount of energy as a similar volume of air in Arizona with an air temperature of 100F and humidity close to 0%.
It is therefore incorrect to use atmospheric temperature to measure heat content, and a nonsense to average them. Averaging the averages of intensive variables like atmospheric temperature is meaningless. It is like an average telephone number or the average color of cars on the interstate: mathematically simple but completely meaningless.
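For a rough check of those numbers, here is a minimal sketch that estimates moist-air enthalpy from temperature and relative humidity. It assumes the Magnus approximation for saturation vapor pressure, standard sea-level pressure, and textbook constants, so it is an illustration rather than anyone’s official method.

```python
# Rough moist-air enthalpy comparison (kJ per kg of dry air).
# Assumptions: Magnus formula for saturation vapor pressure, sea-level
# pressure of 101.325 kPa; constants are textbook approximations.
import math

def enthalpy_kj_per_kg(temp_c, rh_percent, pressure_kpa=101.325):
    e_sat = 0.6112 * math.exp(17.62 * temp_c / (243.12 + temp_c))  # saturation vapor pressure, kPa
    e = (rh_percent / 100.0) * e_sat                               # actual vapor pressure, kPa
    q = 0.622 * e / (pressure_kpa - e)                             # mixing ratio, kg water / kg dry air
    # h = cp*T + q*(L + cpv*T), with cp ~ 1.006, L ~ 2501, cpv ~ 1.86 (kJ-based)
    return 1.006 * temp_c + q * (2501.0 + 1.86 * temp_c)

louisiana = enthalpy_kj_per_kg((75 - 32) * 5 / 9, 100)   # 75 F, 100% RH
arizona   = enthalpy_kj_per_kg((100 - 32) * 5 / 9, 1)    # 100 F, ~0% RH
print(f"Louisiana: {louisiana:.1f} kJ/kg, Arizona: {arizona:.1f} kJ/kg, ratio: {louisiana/arizona:.2f}")
```

Run as written, the two come out to roughly 72 and 39 kJ/kg, close to the factor of two claimed above.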

RobR
Reply to  Ian W
February 22, 2017 7:46 am

Ian W, in my view you are right. The Enthalpy of a volume of air is easily calculated from the wet bulb temperature, the dry bulb temperature and the barometric pressure and I would bet that all three have been recorded at weather stations for a long time. What’s more you can safely average Enthalpy. I can’t help but wonder why this is not done.

The Old Man
Reply to  Ian W
February 22, 2017 9:34 am

@IanW: I was about to write a post on my blog on this. Another part of the delta energy content not directly aligned based on simple temperature measurement is the variance in the sensible heat of disparate materials for global surface measurements, and the high phase change heat content for water at constant temperature points as it transitions from solid to liquid to gas and back.

rd50
Reply to  Ian W
February 22, 2017 10:37 am

Everything you presented is correct.
The problem is measuring humidity. Imagine this!!

Reply to  Ian W
February 22, 2017 11:55 am

TY, this layman has been saying for many years we CAN’T compute a single temperature for the earth…..we don’t have accurate measurements to average for starters, and as you posted an average is meaningless.

Reply to  Ian W
February 22, 2017 12:11 pm

When the goal is to influence politicians, and the general populace, it is a bad idea to make them cranky by confusing them. You present your big picture in terms they are familiar with, and thus believe they understand.

Reply to  Ian W
February 22, 2017 12:35 pm

Here is my reply using only the average letter in the English language:
Mmmmmmmmmmm mmm mm mmmmmmm m mmmmm mmm mmmmmmm m mmmmm mm mmm mm mmmmmmm mmmm m mmmm!

MarkW
Reply to  Ian W
February 22, 2017 2:04 pm

Bill, we could compute such a temperature, the problem is that the error bars would have to be around +/- 20C or so.

Reply to  Ian W
February 23, 2017 8:25 am

Mauer
Thank you for opening this can of worms.
Do you have a reference for “GISS decided to modify the temperature data to account for perceived
faults in its representation of atmospheric energy”?
To amplify on thoughts by Forest Gardener, Ian W, RobR, and others, the measure of heat energy is enthalpy. In joules per kilogram, the Bernoulli principle expression is
h = (Cp * T - .026) + q * (L(T) + 1.84 * T) + g * Z + V^2/2
where Cp is heat capacity, T is temperature in Celsius, q is specific humidity in kg H2O/kg air, g is gravity, L(T) is the latent heat of water (~2501), Z is altitude, and V is wind speed.
An interesting study is https://pielkeclimatesci.files.wordpress.com/2009/10/r-290.pdf
Pielke shows that the difference between effective temperature h/Cp and thermometer temperature can be tens of degrees.
Classical weather reporting does not include the data necessary to calculate enthalpy to an accuracy better than several percent. This inaccuracy is greater than effects attributable to CO2. Hurricane velocity winds add single degrees of effective temperature, but modest winds can add tenths of degrees.
I speculate that part of the reason for the never ending adjustment of temperatures is an attempt to compensate for the inaccuracies inherent in temperature only estimates of energy.
For a more formal discussion of wet enthalpy,
http://www.inscc.utah.edu/~tgarrett/5130/Topics_files/Thermodynamics%20Notes.pdf

Reply to  TheLastDemocrat
February 22, 2017 7:09 am

Absolutely right in my opinion (scientific opinion might I add). At the least the data should be taken as is, with error bars. Every measurement tool has a measurement error. If it’s decided past measurements aren’t right then the bars need expanding by some amount to cover the uncertainty. The errors then need computing forward. If the errors are too large you’ll end up with some total nonsense at the end that shows errors larger than the signal. That means your data isn’t good enough to draw any conclusions.
There must be a reason that many climate model outputs seldom feature error bars.
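As a toy illustration of pushing errors forward, here is a sketch that propagates an assumed per-reading uncertainty through monthly and annual averaging under the (optimistic) assumption of independent errors. The 0.5 C figure is made up for the example, not a station specification.

```python
# Propagate an assumed per-reading measurement error through monthly and
# annual averaging, treating errors as independent (quadrature).
import math

per_reading_sigma = 0.5          # deg C, assumed uncertainty of one min/max reading
readings_per_month = 2 * 30      # min and max, ~30 days
monthly_sigma = per_reading_sigma / math.sqrt(readings_per_month)
annual_sigma = monthly_sigma / math.sqrt(12)   # 12 monthly means averaged

print(f"monthly mean sigma: {monthly_sigma:.3f} C")
print(f"annual  mean sigma: {annual_sigma:.3f} C")
# Note: this shrinkage only holds if errors really are independent and
# unbiased; systematic errors (siting, instrument drift) do not average out.
```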

Jeff Alberts
Reply to  TheLastDemocrat
February 22, 2017 7:26 am

“I say that the data is not fit for calculating climatic trends.”
But it is fit for calculating climactic trends. ;/

JohnTyler
Reply to  TheLastDemocrat
February 22, 2017 8:08 am

“….This means that 20C measured 80 years ago is not necessarily the same as 20C measured today…..”
This is probably true, but since the heat island effect was less pronounced 80 years ago it implies that temperature measurements back then were MORE accurate (or more representative of the “true” temperature) than today’s (concrete, asphalt, brick, glass influenced) measurements.
So if any corrections (i.e., data fudging ) are made, it should be made to TODAY’S temperature readings by LOWERING (artificially high) them !!!
But if this were done, well, there goes the millions of $$$$$ into the AGW scam and it would be game over.

Earl Jantzi
Reply to  TheLastDemocrat
February 22, 2017 9:12 am

NOAA has built a “climate reference network” of 50 stations out in the “boonies” where they don’t expect urbanization for 50-100 years. They refuse to publish those temps because they don’t fit their theory. The models, by the way, don’t adjust their error bars when they move the data from one model run to the next one, SO, at the end of 100 years of model runs, the ERROR BARS are plus or minus 14-28 degrees. That makes it very hard to find a gain of a few degrees in the 50+ degree error bands. PURE NONSENSE, to be polite.

DHR
Reply to  Earl Jantzi
February 22, 2017 10:08 am

Data for the NOAA Climate Reference Network can be found at https://www.ncdc.noaa.gov/crn/ The network is currently (I think) 114 stations in the lower 48, 18 in Alaska and 2 in Hawaii. On the site click on “Graph the Data” within “National Temperature Comparisons.” Set the time interval to “previous 12 months”, set the start and end date you want and click “plot.” You will see that for the duration of the CRN program, roughly from 2005 on, US temps have not changed. The chart can be set back as far as 1889 but as we know, the prior temps have been reduced so are not meaningful. I expect the info from the CRN will make or break the warmists in another 10 or 20 years because the data cannot be adjusted.

Thomas
Reply to  Earl Jantzi
February 22, 2017 4:07 pm

Are you talking about USCRN?
Is there another reference network?

Kaiser Derden
Reply to  TheLastDemocrat
February 22, 2017 10:39 am

if the past location was essentially pristine then it should be unadjusted and all adjustments should be applied to the forward records based on the changes driving those adjustments …

Reply to  TheLastDemocrat
February 22, 2017 11:29 am

I think the point of the article was that it read 20C eighty years ago and is still reading 20C. Exactly what changed so that, of the two readings of 20C, only the one from 80 years ago gets adjusted?

MarkW
Reply to  TheLastDemocrat
February 22, 2017 2:07 pm

Beyond that, in the past, that 20C was calculated by averaging the daily max and daily min. With modern sensors, it’s the average of 24 hourly readings. (Some are taken more often. A few a little less often.)
Even without any other source of contamination, it still wouldn’t be possible to compare the past number with the current number because they weren’t arrived at in the same fashion.
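A quick synthetic example of that point: for any diurnal cycle that is not symmetric, the min/max midpoint and the 24-hour mean differ. The temperature profile below is invented purely for illustration; real station profiles differ.

```python
# Compare (Tmin + Tmax)/2 with the mean of 24 hourly readings for a synthetic,
# asymmetric diurnal cycle.
import numpy as np

hours = np.arange(24)
theta = 2 * np.pi * (hours - 9) / 24
temps = 15 + 6 * np.sin(theta) + 2 * np.cos(2 * theta)   # skewed daily cycle, deg C

minmax_mean = (temps.min() + temps.max()) / 2
hourly_mean = temps.mean()
print(f"(min+max)/2 = {minmax_mean:.2f} C, 24-hour mean = {hourly_mean:.2f} C, "
      f"difference = {minmax_mean - hourly_mean:+.2f} C")
```

For this invented profile the two “daily averages” differ by nearly 2 C, with no instrument error involved at all.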

george e. smith
Reply to  TheLastDemocrat
February 22, 2017 2:30 pm

It is ALL error, which is why there are no bars.
If it is plotted on a graph it is in error.
g

Butch
February 22, 2017 6:29 am

..If each and every WUWT follower would do this for their own area, I bet we could put together a great historical log of these false/unjustified “adjustments” ! Considering how many people follow this great blog, it should be a “YUGE” list !!!

Reply to  Butch
February 22, 2017 7:34 am

Sounds simple. Who would do the training and quality control?

MarkW
Reply to  M Courtney
February 22, 2017 9:19 am

We could follow the NOAA standard, and not require any.

Sheri
Reply to  M Courtney
February 22, 2017 9:54 am

Training and quality control? Once one knows how to download data and create a spread sheet…..We’re not building models, just recording data and its adjustments.

Aphan
Reply to  M Courtney
February 22, 2017 10:10 am

MarkW,
You always make me smile!

Reply to  M Courtney
February 22, 2017 12:37 pm

“Sheri
February 22, 2017 at 9:54 am
Training and quality control? Once one knows how to download data and create a spread sheet…..We’re not building models, just recording data and its adjustments.”
My experience says very few people know how to make good user documentation. Writers generally assume the user will know various things they themselves know, thus don’t cover, and they frequently, very frequently, use multiple different terms (labels, names, etc.) for the same thing, without ever telling the user that these different words are supposed to be the same.
The point is that no step of obtaining, calculating, or presenting the data would be obvious to the novice. For there to be any chance of getting people to participate it would be necessary to present extremely clear step-by-step instructions. Otherwise, the first result will be that most people quit in frustration when they can’t make the leap from step n to step n+1. The second result will be that many different, and not correct, procedures will be practiced by those who are persistent enough to be able to get SOMETHING done by trial and error.

February 22, 2017 6:29 am

“The result is an annual “temperature” which is a measure of the energy ”
It’s not a measure of anything. It’s an anti-physical value which, if used as a temperature in a physical calculation, is guaranteed to give the wrong result. Temperature is an intensive value (meaning: you don’t add such values) which cannot be defined for a system that’s not even in a dynamic equilibrium, let alone a thermodynamic one.

skorrent1
Reply to  Adrian Roman
February 22, 2017 10:56 am

Amen, brother! Especially as the “energy” balance we’re looking for is a radiative balance (T^4); linear temperature averages have no meaning.

ferdberple
February 22, 2017 6:35 am

In the future, none of the current “record” temperatures will be records. They will have been adjusted downwards and the temperatures of the day will be “new” records. Adjusting the past is the worst of all possible methods to deal with correcting data.

MarkW
February 22, 2017 6:42 am

What kind of maroon believes that min and max can be used to calculate a daily average temperature?

Reply to  MarkW
February 22, 2017 6:51 am

It’s at least the range of the day, which has more info than the average of the two numbers. Everything they do throws away data they don’t like.

Reply to  micro6500
February 22, 2017 6:52 am

I should have added this, it seems pertinent. [image]

KenW
Reply to  MarkW
February 22, 2017 7:10 am

Bingo….

Reply to  MarkW
February 22, 2017 7:24 am

What if all the min’s for the month were averaged and same for the max data – then average the monthly calculated min’s/max’s.
Just asking. Not sure how GISS actually handles the min/max and I don’t know how ‘real science’ would use min/max.

Reply to  kokoda
February 23, 2017 2:39 pm

This chap wrote several papers… “Why all models that use Global Mean Temperature as a reference to the air temperatures must be wrong”, Dr Darko Butina…. Sorry it’s not linked.

Steve Ta
Reply to  MarkW
February 22, 2017 7:34 am

There’s more than one type of average – the median is a perfectly respectable choice.

Tim Hammond
Reply to  Steve Ta
February 22, 2017 8:40 am

What’s the median of two measurements?

MarkW
Reply to  Steve Ta
February 22, 2017 2:11 pm

Wouldn’t you need at least 3 numbers to get a median?
Or can a median be a fractional position. IE halfway between point one and point 2? In which case, is it really any different from the average?
I confess, it’s been too many years since high school math.

Reply to  MarkW
February 22, 2017 2:49 pm

Wouldn’t you need at least 3 numbers to get a median?
Or can a median be a fractional position. IE halfway between point one and point 2? In which case, is it really any different from the average?
I confess, it’s been too many years since high school math.

In the Global Summary of Days data set, mean is the average of min and max.

george e. smith
Reply to  Steve Ta
February 22, 2017 2:36 pm

Except the median is not any kind of average, so it doesn’t count as an average; good or bad.
Hint: that’s why they call it the median, and NOT the average.
g

Jer0me
Reply to  Steve Ta
February 22, 2017 3:02 pm

George, there are different types of average, the mean, the median and the mode (not sure if there are more).
What we need is an average of the averages 🙂

February 22, 2017 6:51 am

GISS is nothing but artefact. As Schmidt said himself, “what we choose to do with the data determines the temperatures”

rxc
Reply to  Mark - Helsinki
February 22, 2017 2:39 pm

That is very post-modernist thinking. It is what you think about the data, and how you feel about it, that establishes its identity.

February 22, 2017 6:54 am

It’s NOAA\NCEI that need to be stopped from making up data for places they have none. That is where the real problems are!
NOAA’s data warms as it moves down latitudes: when they lose stations at higher latitudes, warmer southern data warms up the missing station data further north.
BE does this too, failing to cool southern data that is used to make up data further north.

Duster
Reply to  Mark - Helsinki
February 23, 2017 2:03 am

The “real problem(s)” are that it is clear, and was centuries ago when the concept was invented, that “climate” is a summary of weather. It is not a real phenomenon but a reified “idea.” Even paleoclimatologists mistakenly discuss Pleistocene “climate” as if it were real, though they have considerably greater justification. The other “real” problem is that we don’t know in detail what drives weather. The basics are in place, but there are critical shortcomings such as the effect of clouds, and more importantly the manner that storms cool the planet. If you couple the altitudes at which clouds form with the geometry of the average optical path for an LWIR photon at that altitude, most of the energy released during condensation and cloud formation will be radiated away from the planet. If you have ever watched squalls pass with virga dropping from the clouds but not reaching the ground, you are watching weather cooling that part of the planet locally. Energy is released at altitude, the virga falls groundward and returns to a vapor state and is carried back into the cloud layer. Basically, a refrigeration-like cycle. Climate is more attended to because it is already “summarized” and appears effectively simpler, but every single bit of data that addresses “climate” is really weather data at the base.

Reply to  Duster
February 23, 2017 8:46 am

Duster wrote, “If you have ever watched squalls pass with virga dropping from the clouds but not reaching the ground, you are watching weather cooling that part of the planet locally. Energy is released at altitude, the virga falls groundward and returns to a vapor state and is carried back into the cloud layer. Basically, a refrigeration-like cycle.”
Right! But that’s not a refrigeration “-like” cycle, it is a classic phase-change refrigeration cycle, exactly like your refrigerator, except that the refrigerant is H2O instead of a CFC or HCFC.
Duster also wrote, “If you couple the altitudes at which clouds form with the geometry of the average optical path for an LWIR photon at that altitude, most of the energy released during condensation and cloud formation will be radiated away from the planet. “
I’ve wondered about that. It seems like half would radiate downward, so at most half could initially be headed toward space. But, of course, when the radiation is re-absorbed by other CO2 molecules, it just warms the atmosphere, usually at a similar altitude, so eventually it should get other chances to be radiated toward space. So, can you quantify “most of”?

Pablo
February 22, 2017 6:57 am

Talk of an average temperature for the globe is meaningless. The world could be one with 15ºC from pole to pole or one with -10ºC at the poles and +40ºC in tropics and still be the same average.
We have just one ice sheet over the south pole at present with a bit of sea ice in winter over the north pole.This is because we are living in a warm interglacial period.
These balmy times usually only last for around 10,000yrs and we are nearing the end of that period.
Within the next few hundred years the northern ice-sheet will return to stay with us for the next 100,000yrs until the next interglacial, and so on and so on for a few million years until continents shift enough to allow for better penetration of ocean warmth into the polar night.
Water, the great moderator of extremes.

Pablo
Reply to  Pablo
February 22, 2017 7:20 am

More correctly … interstadial should replace interglacial as the term for warm periods within the colder stadials.

Scottish Sceptic
February 22, 2017 6:57 am

NASA Climate are history and so are all their bogus adjustments.
What matters now, is:
1. getting people we trust to compile a metric that really does tell us what is happening.
2. Adjusting for urban heating by REDUCING modern temperatures near settlements
3. Filling in all the gaps around the world
4. Creating a global quality assured system with ownership of the temperature stations so their compliance to required standards can be enforced.

Alex Mason
February 22, 2017 7:02 am

How are the adjustments made? I.e., how is it decided that the adjustment should be x degrees? The graph just looks like someone decided it should be 0.4 deg here and 0.6 deg there. There must be some routine that works it out for each point (though if there were, I’d expect each adjustment to be different perhaps). I’d love to know why, for instance, an adjustment of 0.9 deg was used in the 60’s and then it jumps to 0.6 degrees in the 70’s….what changed to make that a realistic adjustment?
Above all though, it is not data now, is it? It’s some computed numbers which someone’s opinion has had a part in forging. If they want people to take it seriously there needs to be significant background work showing why the adjustments are valid.

O R
February 22, 2017 7:10 am

Your contentions are totally wrong. GISS doesn’t calculate the monthly average. GISS doesn’t adjust the data as you claim. They get the monthly adjusted (and unadjusted) data from GHCN, a NOAA branch.
The GHCN adjustment is the difference between the yellow and the blue line in this chart:
https://data.giss.nasa.gov/cgi-bin/gistemp/stdata_show.cgi?id=425000626580&ds=5&dt=1
The adjustment that GISS may do is the UHI adjustment, but that is zero at Falls Village because the site is classed as rural.
You can check this by comparing the black and the blue line, they are identical. The black line is what GISS finally uses.
It’s different in Central Park, NY City:
https://data.giss.nasa.gov/cgi-bin/gistemp/stdata_show.cgi?id=425003058010&ds=5&dt=1
There, GISS actually reduces the trend of the adjusted GHCN data (to that of nearby rural stations).
Is that a bad thing?

Butch
Reply to  O R
February 22, 2017 7:22 am

Mark – Helsinki
February 22, 2017 at 6:51 am
GISS is nothing but artefact. As Schmidt said himself, “what we choose to do with the data determines the temperatures”

Jeff Alberts
Reply to  O R
February 22, 2017 7:43 am

Averaging temperatures from different stations is the bad thing. Physically meaningless.

Reply to  Jeff Alberts
February 22, 2017 8:08 am

Averaging temperatures from different stations is the bad thing. Physically meaningless.

It doesn’t have to be: you average the SB flux, and then convert it back to a temp.
That added about 1.2F to temps, but otherwise doesn’t change much. I’m in the process of switching my code to do all temp processing like this.
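For readers wondering what that looks like in practice, here is a minimal sketch of averaging Stefan-Boltzmann flux rather than temperature; the two station temperatures are made-up values, and this is not micro6500’s actual code.

```python
# Average Stefan-Boltzmann flux instead of temperature, then convert back.
# The two station temperatures are invented for illustration.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def flux_weighted_mean_c(temps_c):
    fluxes = [SIGMA * (t + 273.15) ** 4 for t in temps_c]   # T^4 weighting
    mean_flux = sum(fluxes) / len(fluxes)
    return (mean_flux / SIGMA) ** 0.25 - 273.15              # back to deg C

stations = [-10.0, 30.0]                       # two stations, deg C
linear_mean = sum(stations) / len(stations)
flux_mean = flux_weighted_mean_c(stations)
print(f"linear mean: {linear_mean:.2f} C, flux-weighted mean: {flux_mean:.2f} C")
```

For these two made-up stations the flux-weighted mean comes out about 2 C above the simple mean, in the same direction as the roughly 1.2 F shift mentioned above.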

Ian W
Reply to  Jeff Alberts
February 22, 2017 9:44 am

micro6500 February 22, 2017 at 8:08 am
Do remember to account for enthalpy or you are wasting your time. The hydrologic cycle is what drives climate. That includes the thermohaline currents and clouds as well as humidity and the resulting wet and dry lapse rates.

Reply to  Ian W
February 22, 2017 10:49 am

Yes I calculate enthalpy, and you should like this too. [image]

george e. smith
Reply to  Jeff Alberts
February 22, 2017 2:40 pm

Averaging anything is physically meaningless. Nothing physical pays any attention to it; well nothing physical can even sense an average if and when one happens.
G

Steve Oregon
February 22, 2017 7:15 am

If this year resembles 1983 and Powell and Mead fill up he’ll look even more foolish. [image]
Looks to me like Wettest may soon be reality in all of the West with a likely cyclical return to the same kind of wet years of the 70s to the 90s.
And it’s not just the west coast. Inland upstream water basins are way up and feeding Lakes Powell and Mead in ways that may fill both as was the case in that earlier era.
As sure as the Texas drought ended so is the rest of the west coast drought.
https://www.wired.com/2015/05/texas-floods-big-ended-states-drought/
http://www.sacbee.com/news/state/california/water-and-drought/article126087609.html
“When the snowpack is way above normal and the Sierra Nevada precipitation index is above ‘82-’83, it’s time,” he said. Northern California has received so much rain this year that the region is on pace to surpass the record rainfalls of 1982-83.
http://lakepowell.water-data.com/
Rivers feeding Lake Powell are running at 149.53% of the Feb 22nd avg. Click for Details
http://lakemead.water-data.com/
https://en.wikipedia.org/wiki/Lake_Mead
Multiple wet years from the 1970s to the 1990s filled both lakes to capacity,[10][11] reaching a record high of 1225 feet in the summer of 1983.[11] In these decades prior to 2000, Glen Canyon Dam frequently released more than the required 8.23 million acre feet (10.15 km3) to Lake Mead each year, allowing Lake Mead to maintain a high water level despite releasing significantly more water than for which it is contracted.

Reply to  Steve Oregon
February 22, 2017 9:37 am

I think you meant that comment for another thread

February 22, 2017 7:17 am

It would be worthwhile doing some absolute basic research before writing these kinds of articles. The adjustments shown have nothing to do with GISS. GISS appear to have made no changes to the GHCN input they use.

Butch
February 22, 2017 7:21 am

..If you torture the data long enough, it will say whatever you want !!

Jannie.
February 22, 2017 7:25 am

Another excellent report by WUWT. Unfortunately this kind of report and discussion would make the average person’s eyes glaze over after about eleven seconds, which is good for the attention span of their postmodern education. Fortunately there are people who can actually outline the detail and further the argument.
Simply put:
Everybody has heard about the non stop stream of Fake News coming from the leftist establishment, so you won’t be surprised to hear that NASA is peddling Fake Data on Global Warming.

Alan Davidson
February 22, 2017 7:32 am

The same methodical manipulation of temperature trends by reducing older temperatures and increasing recent temperatures to produce an artificial warming trend, has been reported for many years in many blogs covering all regions of the world. It appears to be quite consistent and systematic.
Since both NOAA and NASA GISS use GHCN temperature data, it is not clear to me whether this systematic manipulation is being done by NOAA, by NASA, or perhaps more likely by both NOAA and NASA GISS in a co-ordinated fashion. Hopefully a congressional inquiry will soon get to the bottom of this.

george e. smith
Reply to  Alan Davidson
February 22, 2017 2:43 pm

So what is the average Temperature for this month for Zealandia; I mean the whole continent ??
G

Neillusion
February 22, 2017 7:49 am

The Google Earth pic of the site in 1991 does not seem to show two trees next to the weather station. The pic is not too clear but it seems that the trees, if they were there, were nowhere near the size they are in the current pic. This would affect readings in the last 25 years surely?

Tom Halla
February 22, 2017 7:52 am

All is revealed!! NASA has a working time machine!!!! How else can they go back in time and determine accurate temperatures in the past???? I am sure I will find confirmation somewhere on the net of the great cover-up of all time!!!!!!
Oh, they don’t have a time machine? Never mind. . .

Fred Harwood
February 22, 2017 7:59 am

I live 20 minutes north of the generating station. One change to the plant was the large change in the transformer locations, right to the south of that Stevenson Screen. Now the main step-up transformer is in that location. I’ll get a photo.

Steve Fraser
February 22, 2017 8:00 am

It’s not obvious to me why the ‘homogenized’ plot would have gaps when the raw does not. Could it be that upward spikes in the raw are clipped off to prevent the homogenized from rising high too soon?

Griff
February 22, 2017 8:05 am

Just look up the Berkeley Earth results… thousands and thousands of surface temp sites checked, UHI influence eliminated.

Gloateus Maximus
Reply to  Griff
February 22, 2017 8:10 am

Griff,
Please keep commenting here.
Your never failing comic relief is appreciated!

ECB
Reply to  Griff
February 22, 2017 8:11 am

If I recall, they stole AW’s work.

Tim Hammond
Reply to  Griff
February 22, 2017 8:43 am

You checked thousands and thousands of sites that quickly? I don’t think that can be true.

MarkW
Reply to  Griff
February 22, 2017 9:24 am

As always, Griff defines as good, any “study” that reaches the correct result.
If they claim that they have a 100% perfect method for removing UHI, then they do. After all, they got the correct result, so the methods must be good.

Michael Jankowski
Reply to  Griff
February 22, 2017 10:43 am

“Eliminated”…so it was quantified for every location and subtracted out? Would love to see the annual values on that for a number of cities.
More like it was allegedly “eliminated” by algorithmic (no pun intended) background processing and hand-waving.

Clyde Spencer
Reply to  Griff
February 22, 2017 11:25 am

Griffy,
I have serious reservations about the methodology BEST used to ‘prove’ UHI influence is negligible.

Michael Jankowski
Reply to  Griff
February 22, 2017 7:42 pm

Laughably, BEST calculated a UHI cooling from 1950 to 2010. Instead of reaching the conclusion that something was f*d, they said it justified that UHI was minimal.

February 22, 2017 8:07 am

Thank you. This covers many of the riddles I’ve been puzzling over.
One big point is that the actual raw data are unavailable. The kind of instrument, readings and times of day, missing hours or days or weeks or months are all lost in the black-hole, the bit-bucket, the court-house fire or flood. The means used to interpolate and aggregate to arrive at monthly figures…well, scraps and hints and possibilities tantalize, but are not readily accessible.
But, the watermelons say we should give up our liberty and earnings and property because CATASTROPHE!

Reply to  Mib8
February 22, 2017 9:41 am

The actual data, the handwritten charts are readily available, I linked to one below. Just go here:
https://www.ncdc.noaa.gov/IPS/coop/coop.html?_page=2&state=CT&foreign=false&stationID=062658&_target3=Next+%3E

Reply to  Phil.
February 24, 2017 9:02 am

That’s roughly what I was seeking. Max, min, reading “at obs” over the 24 hours before observation. Suggests a recording unit, perhaps a circular pen trace, that a human would collect once a day, and transcribe readings to another form, keeping the original in a file for some fixed time before tossing them. We used to have one in computing center “machine room”.
That is real progress.
But then, it has to be digested down to a daily, monthly, quarterly, annual figure…somehow.

Steve Fraser
February 22, 2017 8:08 am

Another thought, about siting… though this is in a nice little grass area, it has problems… 1) body of water close by, 2) blacktop roadway and parking lot close by, nearly surrounding it, 3) shade trees too close, 4) building too close, 5) transformers in grid connections too close.
