A question for Zeke Hausfather

Zeke is upset that I made this statement in a story at Fox News:

Is history malleable? Can temperature data of the past be molded to fit a purpose? It certainly seems to be the case here, where the temperature for July 1936 reported … changes with the moment. In the business and trading world, people go to jail for such manipulations of data.

he says:

In the spirit of civility, I would ask Anthony to retract his remarks. He may well disagree with NCDC’s approach and results, but accusing them of fraud is one step too far.

I’d point out that Zeke has his interpretation but nowhere did I say “fraud”. He’s mad, and people don’t often think clearly when they are mad. That’s OK.

Without getting into semantics, I’d like to ask Zeke these simple questions:

  1. What is the CONUS average temperature for July 1936 today?
  2. What was it a year ago?
  3. What was it ten years ago? Twenty years ago?
  4. What was it in late 1936, when all the data had been first compiled?

We already know the answers to questions 1 and 2 from my posting here, and they are 76.43°F and 77.4°F respectively, so Zeke really only needs to answer questions 3 and 4.

The answers to these questions will be telling, and I welcome them. We don’t need broad analyses or justifications for processes, just the simple numbers in Fahrenheit will do.

January 23, 2013 3:02 pm

Anthony, come on, I can already hear the crickets warming up…. LOL

January 23, 2013 3:02 pm

Absolutely! It’s well past time to stop the history revisionism.

Doug Proctor
January 23, 2013 3:07 pm

The warmists don’t understand that the amount of warming the “consensus” of the IPCC believes is human-caused is of the order of the 1.0F difference between versions of the CONUS record. That an inaccurate UHIE correction could be 40% of the “unprecedented” warming. That a drop from just 28% to 26% in global cloud cover could account for 100% of the warming. Or that a regional change, say a 28% to 18% change over only 20% of the planet, could do the same.
Global warming is a process of fractions made out to be a thing of large, whole numbers. As if multitudes of “extreme weather” events were the way the world is getting warmer, so you could see a wall of heat moving across the landscape. A reversal of cause and effect.

January 23, 2013 3:08 pm

Isn’t it said that “History is written by the vanquished?” no no wait … that can’t be correct …

Doug UK
January 23, 2013 3:10 pm

How can current “warming” be proved by making yesterday’s temperatures cooler?
Was that what Global Cooling was all about in the 1970’s? (sarc)
Keep your eye on that pea now!

Jean Parisot
January 23, 2013 3:13 pm

Eventually, CAGW will be a textbook case of stock manipulation.

January 23, 2013 3:14 pm

Anthony:
As you know, all the climatological temperature time series data sets are changed with time, and it is a rare month when at least one of them does not alter past data. This is not only true for the US temperature compilations: it is also true for the hemispheric and global temperature time series.
It seems appropriate to remind people of how GISS global temperature data sets have been adjusted, and this picture is worth a thousand words
http://jonova.s3.amazonaws.com/graphs/giss/hansen-giss-1940-1980.gif
Richard

Brent Hargreaves
January 23, 2013 3:16 pm

I’d like to ask Zeke why his organization reduced the recorded Jan 1900 temperature at Teigarhorn, Iceland from 0.7C to -0.2C. http://endisnighnot.blogspot.co.uk/2012/03/giss-strange-anomalies.html
If historical temperatures are adjusted downward (they did this from 1900 to 1962) and more recent temperatures are bumped up then….. oh, you finish the sentence Zeke. You use the word “fraud”; I use the word “fiddling”. Tell ya what, Zeke, let’s agree on this phrasing: “Distorting the historical record in order to create an artificial warming trend in furtherance of a political or financial agenda”.

Joe Grappa
January 23, 2013 3:17 pm

“I’d point out that Zeke has his interpretation but nowhere did I say “fraud”.”
Well, you did say that in the business world people go to jail for such manipulations of data, so you were implying the error, if there was one, was a lot worse than an honest mistake.

January 23, 2013 3:22 pm

Accusing someone of manipulating data, suggesting that they should go to jail, but not uttering the word “fraud” is a trick. “Climate skeptics are like those people who refuse to admit the holocaust happened.” See? I never used the d word but i said the same thing.
REPLY: Sort of like suggesting some ex NASA scientists skeptical today caused the space shuttle to blow up?
Can you answer the questions? – Anthony

David, UK
January 23, 2013 3:22 pm

“In the business and trading world, people go to jail for such manipulations of data.”
To be fair, that does sound like an example of “fraud” even if the word was not used. Unless “manipulations of data” for which “people go to jail” has an alternative definition.
Not disagreeing with the comparison, considering how climate data is indeed adjusted and manipulated. Just confused as to why the comparison with illegal “manipulations of data” in the business world doesn’t constitute an accusation of fraud. Is a spade not a spade?

January 23, 2013 3:22 pm

Huh, not only did it change, it also gained an extra significant digit.
Wait till I tell my 7th grade science teacher he was wrong.

Andrew
January 23, 2013 3:25 pm

Anthony – this must be akin to handing the microphone to someone who develops incontinence when handed one in a karaoke bar.
Let the bed wetting commence.

January 23, 2013 3:31 pm

Steven Mosher:
At January 23, 2013 at 3:22 pm you say

Accusing someone of manipulating data, suggesting that they should go to jail, but not uttering the word “fraud” is a trick. “Climate skeptics are like those people who refuse to admit the holocaust happened.” See? I never used the d word but i said the same thing.

Words fail!
The holocaust happened. Claiming it did not is a lie.
The historic temperature data are often altered. Saying they are is the truth.
Why people change the historic temperature data can be debated. But it is NOT ascribing any motivation to point out that it is done and to ask if it is acceptable.
Richard

Robert of Ottawa
January 23, 2013 3:31 pm

Richard Courtney, why should a measurement be re-adjusted later? Why, also, in the case of Crimatology, do all the adjustments have the same effect of cooling the past? This is a neat strategy, as it enables the Crims to show that current temperatures are what they say they are, so obviously no malarkey there.

January 23, 2013 3:31 pm

Go look at the US HCN data from the GHCN site for 2012 and then tell me how they got anything useful out of it. The last 4 days of December are missing for virtually all sites and there are huge swaths of missing data across the nation for October and other months. A good exercise would be to see if the missing data was cold. I suspect it was but have been too busy in the last couple of weeks to write the code to analyze that hypothesis.
I do have all of the data pulled and archived though if anyone wants the mid-January dataset. 2012 is a freakin’ mess. This from the best data source in the world, from a government that thinks they can manage money. They can’t even record data from an instrument. Yeah, I trust that record.

Terry Bixler
January 23, 2013 3:36 pm

Mr. Mosher, you avoid the point at all costs. Why are the temperatures for items 1-4 not the same? If they are the same then they have not been manipulated. If they have been manipulated, all of the points of manipulation need to be described: Who, What, Where, Why and When. Certainly modern computer databases can handle those issues. Of course you know this but choose to not address the issue of professional data keeping.

Ed Caryl
January 23, 2013 3:36 pm

I may be proved wrong, but a series of thousands of small occasions of rounding up, confirmation bias, infilling missing data just slightly too high, siting problems, mis-estimation of UHIE, and other errors that just happen to go in the direction wanted, can account for all of it. Intentional fraud? That may never be found.

Pompous Git
January 23, 2013 3:36 pm

Steven Mosher said @ January 23, 2013 at 3:22 pm

Accusing someone of manipulating data, suggesting that they should go to jail, but not uttering the word “fraud” is a trick. “Climate skeptics are like those people who refuse to admit the holocaust happened.” See? I never used the d word but i said the same thing.

I fail to see where Anthony suggested that the people who are clearly manipulating the data “should go to jail”. He merely pointed out that people engaged in business are put in jail for manipulating data. Anthony might well have been arguing that business data manipulators deserve the same treatment that climate data manipulators receive.
Anthony said:

“[Zeke’s] mad, and people don’t often think clearly when they are mad. That’s OK.”

Yes, the CAGWers are all quite mad and unable to think clearly. What’s not OK is that they expect those of us that are able to think clearly to share their delusions.

January 23, 2013 3:37 pm

Robert of Ottawa:
At January 23, 2013 at 3:31 pm you ask me

Richard Courtney, why should a measurement be re-adjusted later? Why, also, in the case of Crimatology, do all the adjustments have the same effect of cooling the past? This is a neat strategy, as it enables the Crims to show that current temperatures are what they say they are, so obviously no malarkey there.

I have been objecting to these changes for strong personal reasons for many years. Clearly, you have not read this
http://www.publications.parliament.uk/pa/cm200910/cmselect/cmsctech/memo/climatedata/uc0102.htm
Richard

Manfred
January 23, 2013 3:39 pm

Would such OVERWRITING of metadata “to correct the uncertainty” have happened in any other branch of science?
http://climateaudit.org/2011/07/12/hadsst3/

Zeke Hausfather
January 23, 2013 3:45 pm

Hi Anthony,
Unfortunately, calculating the absolute temperature for the entire contiguous U.S. in July of 1936 is a non-trivial matter. I could download the raw absolute average temperatures from all stations available July of 1936. However, many of those stations were located on the rooftops of city buildings (this was the pre-airport age, after all). These stations also used liquid-in-glass thermometers which produce notably higher maximum temperature readings than modern electronic instruments. All of these things mean that a simple average of instruments available at that time would tell us something interesting about the conditions at the locations of those instruments, but not necessarily produce an unbiased estimate of CONUS temperatures.
Instead, I will start with the Climate Reference Network, and assume that it represents a true unbiased estimate. I will estimate the actual U.S. temperature from 2004 to 2012 by assigning each CRN station to a 2.5×3.5 lat/lon gridcell (using the standard NCDC gridding approach), average the temperature readings in each gridcell, and average the grid cells based on their land area to get a CONUS average.
It turns out that during the period of overlap (e.g. 2004-2012), USCRN and USHCN largely agree on U.S. temperatures, at least once they have been converted into temperature anomalies (absolute temperatures will differ due to siting, elevation, wind exposure, and other factors). See this figure: http://rankexploits.com/musings/wp-content/uploads/2013/01/Screen-Shot-2013-01-16-at-10.37.51-AM.png
If I add in the average absolute temperature obtained from CRN over that period to the HCN anomalies, I can get a good estimate of the actual temperatures for CONUS in the past. This yields an estimate of 75.44 F for July 1936. This is slightly higher than the 2012 July temperature of 75.34 F. However, the result will differ a tad based on the grid size chosen and the baseline years used. Even in the homogenized data the July temperatures in the 1930s are about as warm as today; for annual average temperatures, not so much.
This does raise the point that NCDC probably erred in hyping July 2012 as the “hottest ever” when the difference (0.2 C) between it and July of 1936 is within the range of error introduced by methodological choices.
For folks still confused about the use of anomalies, and why they are important when estimating regional averages of temperatures (especially when individual stations are subject to biases), this post should serve as a useful primer: http://rankexploits.com/musings/2010/the-pure-anomaly-method-aka-a-spherical-cow/
REPLY: Thanks for pointing out the 0.2C error band issue. Note that I’m not asking you to calculate it, just show what the reported averages were. The Climate Reference Network is irrelevant to the July 1936 issue, as it did not exist prior to 2001. That said, it all becomes an issue of reporting.
While the work products of those at NCDC may be fully steeped in the belief they are doing things correctly, comparing things that have been changed through history is most certainly an incorrect way to do it. NCDC is charged with knowing these figures, yet they change depending on who you ask.
Again, what are the temperatures for July 1936? Which one is the “real” one? – Anthony
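For readers who want to see concretely what the gridding step Zeke describes above involves, here is a minimal sketch in Python. It is not NCDC’s or Zeke’s actual code: the 2.5×3.5 degree cell size comes from his comment, while the station readings and the cos-latitude stand-in for land-area weighting are illustrative assumptions only.

import math
from collections import defaultdict

def gridded_mean(stations, dlat=2.5, dlon=3.5):
    """stations: list of (lat, lon, temp_F). Returns an area-weighted average."""
    cells = defaultdict(list)
    for lat, lon, temp in stations:
        # assign each station to a dlat x dlon lat/lon grid cell
        cells[(math.floor(lat / dlat), math.floor(lon / dlon))].append((lat, temp))
    weighted_sum = weight_total = 0.0
    for readings in cells.values():
        cell_mean = sum(t for _, t in readings) / len(readings)
        cell_lat = sum(la for la, _ in readings) / len(readings)
        weight = math.cos(math.radians(cell_lat))  # crude stand-in for the cell's land area
        weighted_sum += weight * cell_mean
        weight_total += weight
    return weighted_sum / weight_total

# hypothetical stations: (latitude, longitude, July mean in F)
print(gridded_mean([(40.1, -105.2, 74.8), (40.9, -104.0, 75.6), (33.5, -86.8, 81.2)]))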

Zeke Hausfather
January 23, 2013 3:47 pm

As far as the fraud question goes, while you didn’t use that word, accusations of data manipulation and remarks that folks would be jailed if they engaged in the same behavior in the business or financial realms amount to effectively the same thing, in my reading at least. Perhaps I should have used the term unethical behavior instead of fraud, and I apologize if I put words in your mouth.
REPLY: that’s OK. To the point though, the historical data has in fact been changed (manipulated as you describe), and the question is: when reporting record temperature comparisons to the public, is this ethical?
Really, what is the correct temperature for July 1936? Nobody seems to know for certain, and that’s my point. – Anthony

RockyRoad
January 23, 2013 3:49 pm

Freudian slip by Zeke, it appears.
By the way, I always see questions asked of Steven Mosher, but he never answers them. I’m going to apply the nickname “Crickets” to Steven.
Sorry, Steven, er…., Crickets, but it fits.

January 23, 2013 3:56 pm

Zeke Hausfather:
I have personal interest in the frequent “adjustment” of the data sets. You can see my interest from my answer – especially its link – to Robert of Ottawa at January 23, 2013 at 3:37 pm.
Hence, I appreciate your having made your post at January 23, 2013 at 3:45 pm.
However, your post says different methods give different results. It does not say why each of the methods is often changed, and that is the only answer which matters, isn’t it?
Richard

January 23, 2013 3:57 pm

Zeke Hausfather says:
January 23, 2013 at 3:45 pm

Hi Anthony,
Unfortunately, calculating the absolute temperature for the entire contiguous U.S. in July of 1936 is a non-trivial matter. I could download the raw absolute average temperatures from all stations available July of 1936. However, many of those stations were located on the rooftops of city buildings (this was the pre-airport age, after all). These stations also used liquid-in-glass thermometers which produce notably higher maximum temperature readings than modern electronic instruments. All of these things mean that a simple average of instruments available at that time would tell us something interesting about the conditions at the locations of those instruments, but not necessarily produce an unbiased estimate of CONUS temperatures.

I notice that you failed to answer any question Anthony actually asked, but let me throw in a few of my own:
Where did the extra significant digit come from?
Why is the adjustment in the same order of magnitude as the claimed signal?

tz
January 23, 2013 3:58 pm

“Mad”, as in “angry” or “insane”?
REPLY: “angry” – Anthony

Zeke Hausfather
January 23, 2013 3:59 pm

Whelp, I made an error. This does illustrate the danger of trying to do things too quickly. My code was accidentally still set to use the raw data rather than the homogenized data.
After homogenization, we get:
July 1936: 74.59
July 2012: 75.35

Zeke Hausfather
January 23, 2013 4:01 pm

Anthony,
What exactly do you want to know about July 1936? The simple average of the temperature readings of all stations? The best estimate of the full CONUS temperature field? It’s not a simple question with a simple answer.

Zeke Hausfather
January 23, 2013 4:05 pm

Stark Dickflüssig,
I gave my best estimate of the CONUS average absolute temperature for that month and year. Given that we don’t have a measurement of every inch of the U.S., but rather a specific set of locations, any attempt at creating an average temperature will require at least a few assumptions.
Anthony,
I think CRN is somewhat germane. We need a good unbiased estimate of the absolute temperature field, and I would argue that it provides a rather good one, or at least a better estimate than a weighted average of absolute temperatures in 1936.
REPLY: I think you’ve been overthinking – Anthony

Sean Houlihane
January 23, 2013 4:05 pm

I still read it as fraud, only not trying to hide behind more questions which we all know are irrelevant to the intention.

Latitude
January 23, 2013 4:08 pm

I still find it amazing that glorified weathermen can clearly see the past, and correct it where it’s wrong…
….and they still can’t pick lotto numbers

gnomish
January 23, 2013 4:08 pm

who fiddles the data? who diddles the chart?
who plays tickle-monster with the temps?
who has soapy showers with statistics?
would the consensus be ‘climate molesters’?

RockyRoad
January 23, 2013 4:09 pm

Zeke Hausfather says:
January 23, 2013 at 3:47 pm

As far as the fraud question goes, while you didn’t use that word, accusations of data manipulation and remarks that folks would be jailed if they engaged in the same behavior in the business or financial realms amounts to effectively the same thing in my reading at least. Perhaps I should have used the term unethical behavior instead of fraud, and I apologize if I put words in your mouth.

As a practicing mining engineer for 15 years in charge of block model generation and economic analysis to substantiate $Millions to $Billions of investment capital, had I performed the data manipulation on assay values you “climate scientists” do on temperature data, I’d certainly be out of a job and, depending on the severity of the situation, perhaps end up in jail. (Some people call it “salting”; others call it “creative accounting”.)
So continue with your “data manipulation” if you wish, but you’re going to have a hard time getting this engineer to believe your conclusions. (And yes, I did read the article by lucia you linked to in your prior post.)

Lew Skannen
January 23, 2013 4:11 pm

Climate changes over time.
Records of climate change over time.
All under the heading of ‘climate variability’.

January 23, 2013 4:14 pm

Corporate officers have legal responsibilities beyond not committing fraud.
A corporate officer can certainly go to jail for ‘a mistake’.
Here is an example.
http://online.wsj.com/article/SB10001424052970204443404577052173679627572.html

David A. Evans
January 23, 2013 4:16 pm

I watched the data in GISS figd.txt change in Nov. 2009. (remember that GISS & NCDC are essentially the same.)
2006 went from about 6th to joint warmest with 1998, whilst 1934 went from joint warmest to about 6th.
I’m supposed to believe any of this?
Zeke. Does anyone even know what the RAW data is anymore?
DaveE.

Patrick B
January 23, 2013 4:17 pm

Claiming an error margin of just 0.2 degrees – when calculating the CONUS temperature for a month in 1936 – THAT IS EITHER INCOMPETENCE OR FRAUD, take your pick.

Zeke Hausfather
January 23, 2013 4:19 pm

Anthony,
There is a good reason why all the major groups (NCDC, GISS, UAH, RSS, Hadley) primarily report anomalies. Absolute temperatures are tricky things, at least when you aren’t able to sample the full field.
GISS has a rather good explanation: http://data.giss.nasa.gov/gistemp/abs_temp.html

January 23, 2013 4:22 pm

Corporate officers can go to jail for other than fraud. They can go to jail for incompetence or just not exercising due diligence. Here is an example.
http://online.wsj.com/article/SB10001424052970204443404577052173679627572.html
Therefore Anthony’s statement implied one of a range of illegal activities for corporate officers, not just fraud.

RobertInAz
January 23, 2013 4:23 pm

“Accusing someone of manipulating data, suggesting that they should go to jail, but not uttering the word “fraud” is a trick.”
Commentators on business networks frequently make a similar observation with respect to government accounting versus private sector accounting practices. There is no implication that the folks in the government entities should be jailed – the observation is that there are clearly different standards between government practices and the rules the rest of us live by.

Zeke Hausfather
January 23, 2013 4:23 pm

David A. Evans,
To access raw data, browse to one of the NCDC’s FTP folders (e.g. ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2.5/) and click on the files labeled .raw
I also show the raw vs. adjusted results for the U.S. in my post over at the Blackboard: http://rankexploits.com/musings/wp-content/uploads/2013/01/USHCN-adjusted-raw.png
Global raw vs adjusted temperatures can be seen here: http://www.skepticalscience.com/pics/GHCN_RawvAdj.jpg

Ray Donahue
January 23, 2013 4:28 pm

” In the business and trading world, people go to jail for such manipulations of data.”
This statement may also be construed to highlight the seriousness of climate data manipulation via comparison with an activity which most would recognize as having consequences, whether legal or reputation-damaging.

Zeke Hausfather
January 23, 2013 4:32 pm

Anthony,
The way that NCDC calculates absolute temperatures is by calculating anomalies and adding in a constant absolute temperature for each month, based (I believe) on modeled outputs of a U.S. temperature field for the climate normals period. I’m not well versed in the details, however, as I rarely if ever deal with absolute temperatures.
My approach was to try and replicate what NCDC does, but to use the CRN to determine the “true” absolute temperatures for the period in which CRN and HCN overlap. I realize this is somewhat complicated (and perhaps overthinking), but the point is that there is no “simple” way to get an absolute temperature value.
As far as anomalies go, you can see (homogenized) July anomalies here: http://i81.photobucket.com/albums/j237/hausfath/ScreenShot2013-01-23at43154PM_zps3d02490c.png
REPLY: and this exactly illustrates my concern: nobody seems to be able to reproduce the CONUS average temperature for July 1936. Therefore, claims about “hottest ever” etc. are indeed nothing but hype. NCDC knows full well what they are doing here.
The historical accounts should be in the Monthly Weather Review for 1936. Since I illustrated here how a simple average of the CRN stations in the CONUS came close to the COOP values, I see the point you are trying to make. That said, today’s data is not yesterday’s data. My interest is in the true data from 1936, not a derivative comparison of what it “might” be.
– Anthony

January 23, 2013 4:37 pm

Zeke, The difference between NOAA and USCRN can be as much as plus or minus 4.7F for individual states for a month.
Yet you argue the “trend” is close enough. It isn’t.
For example, in Sept 2012, NOAA was 4.78F warmer than USCRN for Tennessee.
http://sunshinehours.wordpress.com/2012/10/09/uscrn-vs-noaa-september-2012/
Don’t try and argue that NOAA and USCRN know what the temperature in Tennessee was for September 2012.

steven haney
January 23, 2013 4:40 pm

Zeke, Anthony and Richard, why is there no longer any discussion of temp. data prior to 1850? Why would the end of “the Little Ice Age” be the starting point for data in this discussion, when there are plenty of temp. records from around the globe available from the 1770’s on, when thermometers were simple, uniform and accurate? It seems to me that using a data set starting at the low point would be like reading one heartbeat of an EKG starting with a flat line and ignoring the previous beat. Every heartbeat is a hockey stick when analyzed this way!

MrE
January 23, 2013 4:41 pm

Fudge the data? Where on earth would scientists learn to do that and get rewarded? Answ. First year students in University science lab classes.

Leon0112
January 23, 2013 4:41 pm

Zeke – I agree that there is a fair amount of measurement error in these calculations due to siting and other issues. This is probably even more serious for global temperature averages than the US alone. The measurement errors are sufficiently large that trying to determine causes and effects of various variables would be very trying. Indeed, it makes it quite difficult for the science to be “settled”. It makes it quite difficult to justify raising the prices of food and energy to the poor and middle class, just in case.
I know I did nothing “just in case” the Mayans were right. Dooming poor people to a lower standard of living, based on theories with such high levels of measurement error is bad policy.

January 23, 2013 4:41 pm

Zeke Hausfather:
Following my post addressed to you at January 23, 2013 at 3:56 pm you have answered each of the posts addressed to you from others. It seems that you may have overlooked my question.
I would be grateful if you were to answer my question. To save you scrolling to find it, and for hope of clarification, I rephrase it and expand it here.
Temperatures were measured decades ago so their combined values should be a constant in the absence of altered homogenisation and consolidation methods. Why does the method used to homogenise and consolidate historic temperatures (regional, hemispheric and global) change with time? Which of the used methods can be considered to be correct and why?
Please note that I am not asking about differences between the methods of different teams (e.g. HadCRU and GISS). I am asking about the sequence of different methods used by your organisation.
Richard

Owen in GA
January 23, 2013 4:42 pm

Zeke Hausfather,
When you were an undergraduate taking a lab course, what would have happened to your academic career if you had gone back to your lab book and “adjusted” the recorded data so your conclusion matched expectation? I suggest that you would not be working in science now if you had done such a thing. Experimental data is sacrosanct! It can never be altered. If you measure an item as a meter, it is a meter forever (even if your professor set the lab up with altered meter sticks to make sure everyone was staying honest – and yes I would do that!) There can be no adjustment of the past, only traceable, explained in metadata, correction of the present.
Smoothing algorithms should only change current analytical data and should always be traceable to the original data. Too many of the changes have no documentation of the individual adjustments. Most of us understand adjustments are needed to make apples to apples comparisons for sites that have had significant land use changes around them, but that doesn’t seem to be what is happening. The homogenization process seems to just smear the worst errors or site problems across whole regions and time periods rather than to correct for poor or missing data.
Quite frankly, the whole UHI adjustment process as it now happens seems to be counter-intuitive. If the problem is urban encroachment of the site, the correct adjustments would do one of two things to make an apples to apples comparison of the same site at different points in time and development: adjust the current readings down by an amount equivalent to an experimentally determined value of UHI, or adjust the past up by the same amount, though I’d rather the first than the second, and would rather no adjustment at all until a valid experimental valuation of UHI for different types of sites can be done. The way the current adjustment process is done implies that UHI has a cooling effect on the sensors which is just plain crazy. (See the experiment being done at Oak Ridge on siting issues for ideas on how to get at this value!)
Most of us here understand that science is hard work. I suspect that most people working these data sets are trying to do the right thing for the science with the best information they have available. Thus you will rarely hear the fraud word from this site, but you will hear things like “obviously mistaken”, “out of their gourds”, “too full of themselves to see their error”, or even “suffering from mushroom syndrome” and very rarely for some of the most arrogant practitioners of data manipulation and scientific information suppression, “dishonest”. For the most part, I am looking for data that passes the smell test and so far that has been in short supply. Some of the problem is the ever-popular “science by press release” that the MSM picks up and runs with out of all proportion to the underlying data/paper, but then again, some of the abstracts are written with the obvious intent to engender that response. (I’ve been told my abstracts are too dry, so I am probably not one to ask how to correct that problem.)
The problem with this site is not that people don’t understand science; the problem is that they do, and that makes it hard to slip “science in the dark” past us. (Not implying that NCDC does this, but there is still the little matter of all the little adjustments of the past – especially those that happen between major dataset methodology revisions!)

Mark Bofill
January 23, 2013 4:44 pm

Zeke Hausfather,
Kudos for coming here to comment on this in a forthright and civil manner, for what it’s worth.

MattS
January 23, 2013 4:46 pm

tz,
Yes.

January 23, 2013 4:47 pm

Another question for Zeke: Is arch-alarmist and $Billionaire Jeremy Grantham still funding your Yale climate blog? Just wondering…

MrE
January 23, 2013 4:47 pm

I don’t know why they don’t just use the temperature from newspaper archives to get raw data. There are too many vested interests in Climate now to trust anyone unless it’s completely open. The secrecy/just trust us thing is not working. Mosher and Zeke used to be for openness and hopefully still are.

January 23, 2013 4:48 pm

Steven haney:
At January 23, 2013 at 4:40 pm you ask

Zeke, Anthony and Richard, Why is there no longer any discussion of temp. data prior to 1850?

My answer is that there were few temperature measurement sites prior to 1850 so the methods used to compile e.g. global temperature are not applicable for then.
Which is not to say I think the methods used are applicable for times after 1850. I don’t, but I think they could be.
Richard

Manfred
January 23, 2013 4:56 pm

Zeke Hausfather,
does Berkeley try to work with and verify Watts’ new station classification? I think the old one you used treats a paved footpath the same way as a parking lot or even an airport runway if it is only at the same distance.

Zeke Hausfather
January 23, 2013 4:56 pm

Anthony,
The Monthly Weather Review for 1936 (or USHCN raw data for 1936) will give me the recorded measurements at all stations available. However, a simple average of these will not necessarily result in a good estimate of U.S. absolute temperatures, due to unrepresentative siting (many stations back then were fairly urban), instrumentation (CRS measures higher max temps than actually occur), and spatial coverage (there were fewer stations back then, and the elevation of stations doesn’t necessarily represent the elevation profile of the CONUS).
The way regional/national average absolute temperatures are calculated is to add anomalies to a modeled field, or to spatially interpolate between absolute readings. The use of anomalies is preferable when evaluating changes over time as it corrects for differing absolute temps and isolates changes; the absolute approach is better for current weather reports where the decadal-scale continuity of measurements is irrelevant.
Put simply, you can average all the readings from all the stations in 1936 and get a rough average of “true” U.S. temperatures, but it won’t be particularly comparable to temperatures in 2012. To create a comparable estimate you both need to use anomalies (to correct for things like elevation and to some extent urban locations) and some sort of homogenization to correct for station moves, instrument changes, and the like.
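To make the anomaly bookkeeping Zeke describes here a little more concrete, a minimal sketch follows. It is illustrative only, not the NCDC algorithm: the 1981-2010 base period, the per-station dictionaries, and the CRN-derived reference value are all assumptions made for the example.

def july_anomaly(station_julys, year, base_years=range(1981, 2011)):
    """station_julys: dict mapping year -> that station's absolute July mean (F).
    Returns the anomaly of `year` relative to the station's own base-period mean."""
    base = [station_julys[y] for y in base_years if y in station_julys]
    return station_julys[year] - sum(base) / len(base)

def conus_july_estimate(stations, year, reference_absolute):
    """stations: list of per-station dicts as above; reference_absolute: an assumed
    'true' absolute July climatology (e.g. one derived from CRN, per the comment).
    Average the anomalies, then add back the reference to report an absolute value."""
    anomalies = [july_anomaly(s, year) for s in stations if year in s]
    return reference_absolute + sum(anomalies) / len(anomalies)

# hypothetical usage: conus_july_estimate(station_records, 1936, reference_absolute=75.0)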

Rud Istvan
January 23, 2013 4:56 pm

Past temperatures ‘colding’ is apparently a universal AGW phenomenon:
Steirou and Koutsoyiannis, Investigation of methods for hydroclimatic data homogenization, Euro. Geophys. Res. Absts. 14: 957-961 (2012). For their global sample of 163 stations, the homogenized data showed a 0.76°C increase while the raw data showed only a 0.42°C increase. Some of the changes are hilarious–like for Romania. Obviously, thermometers just weren’t up to the global warming job back then.
A presentation describing this study is available at itia.ntua.gr/2012EGU_homogenization_1.
A fuller discussion is contained in the ebook The Arts of Truth, chapter on AGW.

mpainter
January 23, 2013 4:58 pm

The manipulation of data is adulteration of data, however justified. And the justifications? They boil down to this: the data is unreliable, so we are going to “fix” it.
In other words, you murder the data and at your trial you testify that the data did not deserve to live.
Someday the data murders will be examined in a new light and the murderers called to account.

steven haney
January 23, 2013 4:58 pm

It seems to me 1936 is one third of the way into an atrial contraction in my analogy. Aren’t deeper (longer) data sets always preferable in science? Hoping Zeke has a meaningful answer, as I can understand his being angry and upset with Anthony’s allegation, yet I, too, am miffed by data sets changing… Can’t we go apples to apples instead of from Merlot grapes to Champagne grapes to Pinot grapes?

knr
January 23, 2013 4:58 pm

It’s not adjustments that are the problem but the lack of good rationale behind making them, the poor way these adjustments are tracked, and the manner in which raw data is lost so you can’t even go back to the sources to do a real review. And it’s noteworthy that by ‘lucky’ chance these adjustments always favour ‘the cause’, which may have something to do with the way people like Dr Doom, who have proved themselves more advocates than scientists, are the ones making them.

knr
January 23, 2013 5:03 pm

Zeke Hausfather, just how raw is the raw data you cite? Has it been partly cooked, or is it straight off the bone?
Meanwhile, whenever you hear claims of accuracy levels better than the instruments measuring them can achieve, no matter how much maths you throw at it, you have to ask yourself how much contact with reality these claims have.

Zeke Hausfather
January 23, 2013 5:07 pm

Manfred,
Anthony has not released the new station ratings for independent groups to work with. I’m sure I (and possibly Berkeley, though I can’t speak on behalf of the group) would be happy to try and replicate his results when he is ready.
Knr,
For the U.S. at least, I’m pretty sure it is exactly as it appears in the log books. For worldwide data there may be some undocumented adjustments in the early part of the record for some countries (e.g. TOBs changes), but where it exists the USHCN and GHCN raw (or QCU) files include the raw data.

Zeke Hausfather
January 23, 2013 5:09 pm

steven haney,
In my post at the Blackboard I go into some depth about the adjustments that are done, the reasons why they are done, and attempts to evaluate if they are proper or not.
http://rankexploits.com/musings/2013/a-defense-of-the-ncdc-and-of-basic-civility/

Ian W
January 23, 2013 5:11 pm

Zeke Hausfather says:
January 23, 2013 at 4:19 pm
Anthony,
There is a good reason why all the major groups (NCDC, GISS, UAH, RSS, Hadley) primarily report anomalies. Absolute temperatures are tricky things, at least when you aren’t able to sample the full field.

I cannot remember seeing this qualification in the press releases about 2012 being the hottest year ever.
Zeke, can you give a good reason why here you describe uncertainty and not really knowing what the 1936 temperatures were, yet are happy to announce to the world and the politicians making decisions on your reports – with absolute certainty – that 1936 (whose temperatures you don’t know but model in various ways) was cooler than 2012 (if you use the old weather station network, not your new one – again not mentioned)? I am sure you will admit that this is not really the way a scientist would be expected to report information; was the report reworded for you by a PR group perhaps?

January 23, 2013 5:15 pm

I still do not understand how an English major with no scientific background got on the BEST team? Maybe Mosher could fill us in.

January 23, 2013 5:27 pm

Look folks. You need to just get over it. AGW is very, very real.
All you need to do to prove this is see all of the adjustments Homo sapiens has done to the 4 major data sets to see with your own eyes that humans have indeed warmed them.

January 23, 2013 5:29 pm

Mr Watts, You may want to remove that link you have under “is upset” because Mr Hausfather has redirected it to some obnoxious page. This has me a little “mad” myself.

REPLY:
it works OK for me and goes here: http://rankexploits.com/musings/2013/a-defense-of-the-ncdc-and-of-basic-civility/
You may have a virus/malware doing that interception. I doubt that Zeke would ever do such a thing – Anthony

Zeke Hausfather
January 23, 2013 5:30 pm

Ian W,
In order to compare 2012 to other years, you only need anomalies, not absolute temperatures.
Poptech,
I don’t understand how an English major with no scientific background could work for the defense industry or develop 3D graphics. Maybe people can become versed in an area without formal schooling if they spend the time and effort to learn. Just a thought.

Mike Jowsey
January 23, 2013 5:34 pm

richardscourtney says:
January 23, 2013 at 3:14 pm
Thanks for that link Richard – as you say, a thousand words painted.

davidmhoffer
January 23, 2013 5:36 pm

Zeke Hausfather says:
January 23, 2013 at 4:19 pm
Anthony,
There is a good reason why all the major groups (NCDC, GISS, UAH, RSS, Hadley) primarily report anomalies. Absolute temperatures are tricky things, at least when you aren’t able to sample the full field.
>>>>>>>>>>>>>>>>>>>>>>
So Zeke, perhaps you could explain something to me. Let’s consider three baseline temperatures each with an anomaly of +1:
[-41] => [-40]
0 => 1
+40 => 41
Each of these represents an anomaly of +1. Let us now convert each of these into w/m2 anomaly via Stefan Boltzmann Law:
-41 => -40 => 2.89 w/m2
0 => 1 => 4.64 w/m2
+40 => 41 => 6.99 w/m2
Could you please explain to me how it makes sense to average anomalies when they represent completely different changes in energy flux? Why does a change in energy balance of, for example, 6 w/m2 equate to an anomaly of over two degrees in winter in the arctic, but less than one degree in the desert at the equator? If we are trying to detect a supposed change in energy balance that is in theory 3.7 w/m2 per doubling of CO2, what possible value is there in averaging anomalies that come from different temperature regimes and have no direct relationship to energy balance as an average?
Thanks in advance,
dmh
[Please confirm the -41 to -40 change is correct. Mod]
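For anyone who wants to check the flux figures quoted above, here is a quick Stefan-Boltzmann calculation in Python. It is an editorial sketch, not part of the original comment, and it uses a rounded 273 K conversion, which is what the quoted figures appear to assume.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
ZERO_C = 273.0    # rounded C-to-K conversion, matching the quoted figures

def flux_change(baseline_c, anomaly_c=1.0):
    """Change in blackbody flux for a temperature anomaly at a given baseline."""
    t0 = baseline_c + ZERO_C
    return SIGMA * ((t0 + anomaly_c) ** 4 - t0 ** 4)

for baseline in (-40, 0, 40):
    print(baseline, round(flux_change(baseline), 2))
# prints roughly 2.89, 4.64 and 6.99 W/m2 for a +1 C anomaly at -40, 0 and +40 C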

January 23, 2013 5:43 pm

yes anthony i can answer the questions

January 23, 2013 5:43 pm

Mr Watts, Doubt all you want, I now get “Access Denied The owner of this website (rankexploits.com) has banned your IP address (207.200.116.13). (Ref. 1006)”
This is the result of the previous “obnoxious” page.

January 23, 2013 5:46 pm

Zeke Hausfather:
It is now nearing 2 am here so I am about to retire. I stayed up in case you chose to answer my question which I first posed to you at January 23, 2013 at 3:56 pm and expanded for clarification at January 23, 2013 at 4:41 pm.
It now seems clear that you have chosen to not answer my question.
OK. That is your rightful choice.
However, if you do decide to answer it then I hope you will understand my failure to acknowledge that kindness for some hours is because I am in bed.
Richard

January 23, 2013 5:48 pm

If the 1936 temperature were a little too low rather than a little too high, would people be going through such gyrations to … er, “fix” it?

Bill Illis
January 23, 2013 5:48 pm

This is the US temperature record in F by month as it was recorded by the NCDC in November 2002.
July 1936 : 77.5F
http://www1.ncdc.noaa.gov/pub/data/cmb/national-temp.txt
Now it is 1.1F lower.
We need a new team at the NCDC, to go in and fix all the adjustments that have been done. Reverse ALL of them. US temperatures and global temperatures.
Then start over with real statisticians who know what they are doing and have no incentive to artificially increase the trend.

Silence DoGood
January 23, 2013 5:50 pm

So, through all the comments here I come to this conclusion:
Based on what Zeke has said, there really is no way to compare temps in 1936 to 2012 and have it be anywhere close to accurate. Therefore, it ‘could’ be that 2012 was in fact the “hottest year ever”, but it is not necessarily accurate to say that it ‘was’, and furthermore NCDC and NOAA owe the U.S. public a retraction. Is this a decent summation of events here?

Bill H
January 23, 2013 5:53 pm

Joe Grappa says:
January 23, 2013 at 3:17 pm
“I’d point out that Zeke has his interpretation but nowhere did I say “fraud”
Well, you did say that in the business world people go to jail for such manipulations of data, so you were implying the error, if there was one, was a lot worse than an honest mistake.
===================================================
Bernie Madoff would be proud… He just adjusted the numbers to better establish his position too.
Just Sayin…

Chris Edwards
January 23, 2013 5:54 pm

This illustrates why previous tyrants who were taking power burned the books that did not agree with them!

January 23, 2013 5:57 pm

It is quite obvious to anyone that the NCDC temperatures for the past are altered over time. Climate4you documents it. Here is a graph that they publish every month that shows the changes over time to two months in the database: Jan 1915 and Jan 2000. When the data were first downloaded in 2008 the difference between the temperatures was 0.39 C and in December of 2012 the difference was 0.52C with 1915 having somehow grown consistently cooler since 2008 and 2000 somehow having grown warmer. How did that happen?
http://www.climate4you.com/images/NCDC%20Jan1915%20and%20Jan2000.gif
Why are these data changed each month? When was it “correct”? Is the quality of the database being “improved”? Or is it being manipulated to validate a desired conclusion? How can we trust them?

January 23, 2013 6:14 pm

a man with one watch knows what time it is; a man with two watches is never quite sure.
– Lee Segall

Gerald Machnee
January 23, 2013 6:30 pm

Those playing around with whether Anthony said or implied “fraud” are missing the point. The real implication is that nobody is investigating or charging the manipulators. Therefore they will never be charged with fraud or whatever misdeed they could be guilty of. As well, too many of the manipulators are in charge or hold high offices. We can call them the silent ——-.

January 23, 2013 6:35 pm

“there really is no way to compare temps in 1936 to 2012, and have it be anywhere close to accurate”
That’s how it looks to me. Another thing. I just read someplace that glacial periods last far longer than interglacial periods. Is this true? If so, it is clear that the forces propelling/creating glacial periods are very strong. Are we supposed to believe that man’s measly 6% contribution to greenhouse gases (that which is purportedly warming the globe catastrophically) is enough to overcome and indefinitely postpone the next glacial period?

thisisnotgoodtogo
January 23, 2013 6:50 pm

Mosher said
“Accusing someone of manipulating data, suggesting that they should go to jail, but not uttering the word “fraud” is a trick”
No, not a trick. IF they were in business THEN they would be committing fraud, as they would be doing it to receive financial benefit.
Mosher, you’ve been exposed to this reasoning over at CA so many times that for you to pull this stunt now is easily seen to be
A TRICK

January 23, 2013 7:00 pm

For the record, I received an email communication from “rankexploits” saying that they had cleared things up and that the link should now work. But I’m not in the mood to put it to a test.

Steve B
January 23, 2013 7:01 pm

Zeke Hausfather says:
January 23, 2013 at 4:56 pm
Anthony,
The Monthly Weather Review for 1936 (or USHCN raw data for 1936) will give me the recorded measurements at all stations available. etc etc
Absolute nonsense excuse for changing data. If I was your boss you would be sacked on the spot. Data is data. You have no idea how to adjust the data, whether up or down or by how much. You can give all the formulas you like but it is just guesswork looking official. We the people have zero confidence that any data could be honest and true when people like you, Mosher et al just arbitrarily make stuff up and then try to baffle us with BS.
Just sayin

January 23, 2013 7:05 pm

Zeke Hausfather says:
Poptech,
I don’t understand how an English major with no scientific background could work for the defense industry or develop 3D graphics. Maybe people can become versed in an area without formal schooling if they spend the time and effort to learn. Just a thought.

Nice spin game Zeke, too bad working in the defense industry as a “threat analyst” is not a scientific position, and neither is “developing 3D graphics”. Yet all his padded resume says is he “helped bring 3D graphics to market”. Sounds like he did administrative and marketing duties at Creative Labs and you are trying to give him credit for actual computer engineering and programming work. You wouldn’t be trying to inflate his credentials now, would you?
So I ask again,
I still do not understand how an English major with no scientific background got on the BEST team?
It is rather embarrassing to have someone so unqualified on the team. Was he there to spell check and fix the grammar in the papers you cannot get published in any high impact journals?

January 23, 2013 7:08 pm

Zeke,
Can you confirm that this is Steve Mosher’s education: BA’s in English Literature and Philosophy
Could you list his scientific credentials that would make him an invaluable member of the BEST team?

D.J. Hawkins
January 23, 2013 7:25 pm

Zeke Hausfather says:
January 23, 2013 at 5:09 pm
steven haney,
In my post at the Blackboard I go into some depth about the adjustments that are done, the reasons why they are done, and attempts to evaluate if they are proper or not.
http://rankexploits.com/musings/2013/a-defense-of-the-ncdc-and-of-basic-civility/

I believe you are missing the point that most are trying to make. It isn’t quite so much that adjustments are being made, it’s that it never stops. Just to take one example, July 1936: if you were to sample the posted data for that date in 12 consecutive months in any year of the recent decade, you’d probably see 12 different values. There’s no telling for sure, of course, because there’s no data archive for each iteration, but the comparisons made by individuals who have posted at WUWT and saved snapshots for one reason or another strongly suggest it’s true. One would think that all the data’s in by now, so why the constant shuffling? What’s worse, there doesn’t seem to be much communication regarding why in any specific instance we add 0.01 here and take away 0.02 there. It seems that someone in charge of the algorithm has decided to make perfect the enemy of the good. If I wanted to replicate someone’s work where said individual utilized the NCDC database (adjusted), odds are I couldn’t.
You claim that LIG thermometers record too high on max temps vs electronic. I couldn’t find any study to that effect with a quick Google search. Do you have a citation?
Your note regarding early locations of thermometers was especially amusing. No doubt Logan airport today is much cooler than Boston Commons was in 1910 (/sarc, in case it wasn’t clear). No doubt creeping urbanization has also had a negligible effect (/sarc again). But in any event, why are modern (>1980) temperatures adjusted UP??? With modern electronic devices comprising about 80% of the COOP network, what possible reason could there be? Inquiring minds want to know.

davidmhoffer
January 23, 2013 7:41 pm

davidmhoffer says:
January 23, 2013 at 5:36 pm
>>>>>>>>>>>>>>>>>>
Like richardscourtney I must retire for the evening, my question to Zeke Hausfather thus far not answered. I shall check back in the morning for an answer. My expectation however is that Hausfather will join a long list of other noted scientists who also cannot justify averaging anomaly data because there simply is no math or physics that shows doing so to be relevant. It is averaging of apples and pears to arrive at the weight of pumpkins.
I’ve challenged several scientists to take their precious temperature data, convert it to w/m2, THEN average it and THEN trend it. If we’re trying to detect a change in w/m2 at surface due to increases in CO2, then why the BLEEP are we not measuring changes in w/m2 at surface?
What could possibly be a simpler concept? If CO2 changes the w/m2 at the surface, then why aren’t we measuring and trending w/m2 at the surface?

cartoonasaur
January 23, 2013 7:47 pm

LMAO!!! What some commentators are saying, in essence and really, as a matter of FACT, is that because the data of the past changes constantly, there is ABSOLUTELY no such thing as WARMEST EVER. Moreover, all comparisons of present and past temps anywhere and everywhere are meaningless because without the existence of a data set that does NOT change, nothing can be rationally compared anyway… And what use is a data set of THE PAST changing constantly NOW? Is it even proper to call it data?

Matthew R Marler
January 23, 2013 7:51 pm

In the business and trading world, people go to jail for such manipulations of data.
Unless you redefine “such”, that is an accusation of fraud.
Well, unless you are instead claiming unjust imprisonment.

Lynn Clark
January 23, 2013 8:04 pm

It occurs to me that there may be a problem with the way Anthony worded his four questions to Zeke. Anthony asked:
What is the CONUS average temperature for July 1936 today?
What was it a year ago?
What was it ten years ago? Twenty years ago?
What was it in late 1936, when all the data had been first compiled?
The way I read it, Anthony was asking what the CONUS average temperature was for July 1936 AS REPORTED BY NCDC in the present (today), last year (a year ago), ten years ago, twenty years ago, and in late 1936. In other words, what did NCDC claim the CONUS average temperature for July 1936 was in those five points in time. It seems that Zeke is focused on what the CONUS average temperature was in the present (today), a year ago, ten years ago, twenty years ago and in late 1936. I don’t think that Anthony was asking about CONUS average temperature for July in 1936, 1992, 2002, 2012 and 2013.
Perhaps the four questions would be better worded this way:
What does NCDC claim that the CONUS average temperature for July 1936 is now?
One year ago, what did NCDC claim that the CONUS average temperature for July 1936 was?
Ten years ago, what did NCDC claim that the CONUS average temperature for July 1936 was?
Twenty years ago, what did NCDC claim that the CONUS average temperature for July 1936 was?
In late 1936, when all the data had first been compiled, what did NCDC (or its predecessor(s)) claim that the CONUS average temperature for July 1936 was? (Before the raw temperature data had any adjustments applied to it.)
But in any case Zeke has only answered two of Anthony’s questions. All the rest of his responses are just obfuscation.

January 23, 2013 8:06 pm

You claim that LIG thermometers record too high on max temps vs electronic. I couldn’t find any study to that effect with a quick Google search. Do you have a citation?

D.J. what I found was Quayle et. al. (1991), which says,
http://journals.ametsoc.org/doi/pdf/10.1175/1520-0477%281991%29072%3C1718%3AEORTCI%3E2.0.CO%3B2
“…it is beyond the scope of this study to conduct the field tests and research necessary to answer these questions definitively.”

davidmhoffer
January 23, 2013 8:06 pm

davidmhoffer;
-40 => 41
0 => 1
+40 => 41
>>>>>>>>>>>>>>>>>>>>>
Of course in my haste due to being late for something, I should have written
-40 => -39
But the math is otherwise correct, and let me make the point yet again another way.
In the event we had only two baselines, one at -40 and one at plus 40, we could have anomalies of +2 and -1 respectively. This would result in an average anomaly of +0.5 and a change in w/m2 of -2.6 w/m2.
Of what value Zeke, is averaging anomalies when doing so can result in a positive temperature anomaly due to a negative change in energy balance?
[fixed, Mod]
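To illustrate the general point being made here, a small editorial sketch follows. This is not the commenter’s own calculation, and the magnitude it prints differs from the figure quoted above depending on the baselines and constants assumed; the sign of the effect is the point: the average anomaly comes out positive while the average change in radiated flux comes out negative.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
ZERO_C = 273.15

def flux_change(baseline_c, anomaly_c):
    t0 = baseline_c + ZERO_C
    return SIGMA * ((t0 + anomaly_c) ** 4 - t0 ** 4)

cases = [(-40.0, +2.0), (40.0, -1.0)]  # (baseline C, anomaly C), as in the comment
mean_anomaly = sum(a for _, a in cases) / len(cases)               # +0.5 C
mean_flux = sum(flux_change(b, a) for b, a in cases) / len(cases)
print(mean_anomaly, round(mean_flux, 2))  # positive mean anomaly, negative mean flux change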

Crispin in Waterloo
January 23, 2013 8:11 pm

My lack of formal qualification in my field of work (there being nowhere to study it) has forced me to hide under a cloak of competence, precision and replicability. All lab results are, as a result, provided with raw and adjusted data, full copies of formulae and detailed version control of the protocols. I should also mention independent review by other labs not of my choosing for method and precision. Call me old fashioned but it is how I read it should be done. Thank heavens I never learned from a climate scientist.

Scott Basinger
January 23, 2013 8:16 pm

Bah, give Zeke a break. He’s one of the good eggs.

January 23, 2013 8:21 pm

NOAA’s NCDC, Berkeley Earth, NASA’s GISS, and Hadley CRU have all had intense critical focus on their history of industrial-era data manipulation changes; for some of them the manipulation changes occurred over the last two decades. The intense critical focus came from outside of those organizations. It is not uncivil to paint a full range of possible scenarios that bracket the simple honest question of why. Some scenarios must necessarily disturb those orgs, even when stated diplomatically; those disturbing scenarios are not personal per se, so they are not uncivil per se.
A full investigation of all those orgs is scientifically prudent and should be done from within the scientific community, with the caveat that the investigation process have no secret forums and aliases. Another caveat is real-time openness and transparency, not post hoc openness and transparency.
John

D.B. Stealey
January 23, 2013 8:21 pm

Scott Basinger,
Zeke will get a break when he starts answering questions instead of obfuscating, dodging, and ignoring questions. I’m still waiting for an answer to my question, others’ questions are ignored, and Anthony’s questions have not been properly answered.

A Crooks
January 23, 2013 8:31 pm

Where did I find this?
‘Despite these uncertainties and doubts, the IPCC continues in the manner best described by a Polish aphorism much heard during the Soviet-dominated 1970s:
“The future is certain, only the past is unpredictable.” ’

climatebeagle
January 23, 2013 8:34 pm

I’m so confused, this thread has four different values for July 1936 in the US, yet elsewhere Steven Mosher keeps saying you can drop stations and the answer doesn’t change.
Will no-one think of the poor climate modelers, successfully predicting(*) the past only to have it change on them!
To be serious, does this affect the modellers, if they validate the models against past temperatures (as the UKMO seems to indicate), then was the model correct ten years ago, but not today because the temp record has changed?
* UKMO terminology

Tilo Reber
January 23, 2013 8:36 pm

I’ve had many unanswered questions from Zeke – like how does his data homogenization fix the UHI problem as opposed to just spreading it out evenly. If you want to clean your house, do you just take the piles of dirt and distribute them evenly over your floor, Zeke?
Just noticed an interesting one Zeke. On page 27 of this NOAA report:
http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/lanina/enso_evolution-status-fcsts-web.pdf
we see modeled CFSv2 forecasts taking a hard and immediate angled change to the El Nino side. How does that happen in the real world? What are the chances of that happening in the real world? What kind of model produces such instantaneous changes of direction?

Bernal
January 23, 2013 8:41 pm

“I still do not understand how an English major with no scientific background got on the BEST team?”
Yeah, but you should see the degree after it was adjusted.

john robertson
January 23, 2013 8:41 pm

Nice to see still no answers.
Using anomalies is science?
How?
When the mean keeps moving, has no error bars and is carefully not documented on each anomaly graph.
Without the specific mean for these anomalies, in degrees C, with estimated error range, stated on the graph, the anomalies are not information.
Of course it makes the headline: 2012 hottest ever, 0.76°F higher than 1936. Error range: ±3°F.
Might do some damage to the cause if the accuracy of your guess was openly stated?

Peter Laux
January 23, 2013 8:44 pm

Wow, Zeke certainly “led with the chin”, he was gunna bash your hand with his face Anthony.
Or to put it even more bluntly, ” Anthony has shot Zeke fair in the ass …… Can’t even find the bullet hole!”

D.J. Hawkins
January 23, 2013 8:45 pm

Poptech says:
January 23, 2013 at 8:06 pm
You claim that LIG thermometers record too high on max temps vs electronic. I couldn’t find any study to that effect with a quick Google search. Do you have a citation?
D.J. what I found was Quayle et al. (1991), which says,
http://journals.ametsoc.org/doi/pdf/10.1175/1520-0477%281991%29072%3C1718%3AEORTCI%3E2.0.CO%3B2
“…it is beyond the scope of this study to conduct the field tests and research necessary to answer these questions definitively.”

Thanks for the info. The gist, on reading, is that on a regional basis you need to boost the old minimums and scrunch the old maximums. Now, I know the fellas doing this work aren’t steering the boat, but don’t you think that someone might have said “Gee, maybe we should run a representative sample of LIGs and MMTSs in parallel to get a good handle on the potential for the introduction of systematic biases”? This is kinda like the whole not-testing-the-mirror fiasco with Hubble, only dumber. I’d like to think it’s the fault of some penny-pinching rent-seeking desk jockey but…
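If co-located pairs of the two instrument types existed, the parallel-run comparison suggested above is a short calculation. A minimal sketch, with invented daily maxima standing in for real LIG and MMTS readings at one hypothetical site:

```python
# Sketch: estimating an instrument bias from co-located liquid-in-glass (LIG)
# and electronic (MMTS) daily maximum readings. All values are invented; a real
# study would need many sites, full seasons, and proper siting metadata.
from statistics import mean, stdev

lig_tmax  = [31.2, 33.4, 29.8, 35.1, 34.0, 30.5]  # deg C, hypothetical
mmts_tmax = [30.9, 33.1, 29.7, 34.6, 33.6, 30.3]  # same days, same site

diffs = [l - m for l, m in zip(lig_tmax, mmts_tmax)]
print(f"mean LIG minus MMTS: {mean(diffs):+.2f} C "
      f"(std dev {stdev(diffs):.2f} C, n = {len(diffs)})")
```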

coalsoffire
January 23, 2013 8:46 pm

D J Hawkins asks “why are modern temperatures adjusted UP???”
Probably so they can be adjusted down more easily later on as required to keep the wave (trend) in play. So far Mosher defends this with snark (isn’t he the English major, after all) and Hausfather uses obfuscation and legerdemain. The bottom line is that this ridiculous process serves the very useful purpose of giving the warmists the propaganda coup of claiming recent years are the warmest ever. This comes out of one side of the mouth while the other is saying we could never know what the real temperature was in the 30’s. So it’s useful; therefore it will continue until a better ploy is found, notwithstanding that it fails to meet any reasonable standard of logic, science, truth, or ethical practice. It’s the nature trick to hide the decline all over again – appending one record for apples onto another for oranges and pretending (even as they make them more and more different) that by always adjusting the oranges down and the apples up they magically become the same fruit – and the warmest fruit ever to boot. It’s actually pretty funny to think that anyone falls for it, and funnier still to see people trying to defend it.

January 23, 2013 8:54 pm

Yeah, but you should see the degree after it was adjusted.

LOL.

January 23, 2013 9:00 pm

It’s easy to compare stations that are close together and see that even over short distances (<50 miles) temperatures are different; spatially averaging to compensate for missing stations does not create a value that has any relationship to reality. And if averaging the actual station values of 1936 doesn’t give you a valid number, how can you compare it to an average made up from invented numbers to compensate for missing stations, and then tell me that you can measure a trend from this? It is all basically made up, and in no way is it relatable to the made-up number from a different year (because they all have a different number of stations).
You either compare your measurements, or you can’t compare anything, or you’re being dishonest.

January 23, 2013 9:02 pm

D.J. the paper even mentions that “pre-1975 LIGs may be of higher quality” but they never do any research to determine this. IMO there is no way to reliably test this, as there is no way to retrieve and test any of the LIGs as first installed, if at all. How do they know their accuracy did not change over time? Especially over 100 years? Different brands, makes and models, as well as design, materials, manufacturing processes and quality control, are all variables they cannot account for.

Venter
January 23, 2013 9:18 pm

Bernal, post of the day, LOL

January 23, 2013 9:19 pm

I have read numerous studies showing the Holocene Climatic Optimum of 9,000 to 5,000 years ago had significantly higher temperatures and sea levels than now, and that there were succeeding warm periods – the Egyptian, Minoan, Roman, and Medieval – that were each cooler than their predecessor, and that each were warmer than the present. Another recent study showed that Greenland was 2.5 degrees Celsius warmer 8,000 years ago, which supported another Greenland ice core study that determined that 9,100 of the past 10,000 years in Greenland were warmer than any one of the past 100 years.
Why then, Zeke, is there such a desire to worry the temperature records for 1.5% of the Earth’s surface for the past 100 years to death? What importance is there that, thanks to unclearly documented “homogenization”, the temperature for July 2012 compared to July 1936 deserves headlines? As you wrote, the temperatures when compared fall within the error band. A bit better use of time would be to determine where 1934 fits into the great scheme of comparisons. Or how all of the maximum temperature readings of the 1930s, 1940s, and 1950s can be explained away. Or how the studiously augmented US temperatures of the past decade stack up against the flat global temperature trend of almost two decades and continuing.
When all is properly homogenized, and following the example of Phil Jones, all the raw data is “lost”, then the climate community can start rewriting Dr. Lamb’s “Climatic History and the Future” and his other works to conform the climatic history of the past 20,000 years to its desired form. Of course that would also require the “homogenization” of the thousands of studies by hundreds of scientists that Lamb cites, but once a task is begun, it needs finishing, doesn’t it?

Eliza
January 23, 2013 9:24 pm

I avoid that site like the plague

Moe
January 23, 2013 9:51 pm

Looks like Zeke could be right, NOAA says 2012 was the hottest year on record.

January 23, 2013 10:26 pm

davidmhoffer said @ January 23, 2013 at 7:41 pm

Like richardscourtney I must retire for the evening, my question to Zeke Hausfather thus far not answered. I shall check back in the morning for an answer. My expectation however is that Hausfather will join a long list of other noted scientists who also cannot justify averaging anomaly data because there simply is no math or physics that shows doing so to be relevant. It is averaging of apples and pears to arrive at the weight of pumpkins.
I’ve challenged several scientists to take their precious temperature data, convert it to w/m2, THEN average it and THEN trend it. If we’re trying to detect a change in w/m2 at surface due to increases in CO2, then why the BLEEP are we not measuring changes in w/m2 at surface?
What could possibly be a simpler concept? If CO2 changes the w/m2 at the surface, then why aren’t we measuring and trending w/m2 at the surface?

More like averaging the density of apples and pears to arrive at the density of pumpkins when the customer only wants the price per kilogram of your pumpkins 😉

January 23, 2013 10:35 pm

I want to thank you Zeke Hausfather for coming here and truthfully addressing the issues raised by followers of this site.
I read the link: http://data.giss.nasa.gov/gistemp/abs_temp.html
and it ends with this: “For the global mean, the most trusted models produce a value of roughly 14°C, i.e. 57.2°F, but it may easily be anywhere between 56 and 58°F and regionally, let alone locally, the situation is even worse.”
Thanks for the link. It is a good explanation, but if I’m not mistaken, it proves Anthony’s point exactly.

AndyG55
January 23, 2013 10:40 pm

Mosher..
And those that go along with the data manipulation and accept it, are nothing more than “collaborators” with ALL its nasty connotations.
You are becoming one !
It seems any pretence you once had towards science, has disappeared. Pity.
Question though… who got to you ????

Kasuha
January 23, 2013 10:54 pm

Let’s have a look at history revisionism. There we have an older data processing result where the 1936 temperature is 77F. Here we have a newer data processing result where the 1936 temperature is 76F. There we have an older processing result saying there was no MWP. Here we have a newer data processing result saying the MWP was there.
Now we should ask ourselves which of these historic revisionisms we really want to stop, and why. Or how exactly it is with the malleability of history.
Now seriously. That original statement was definitely suggesting that the change was made on purpose. I’m not aware of any proof for it. I don’t have any high confidence in GISS temperature data processing, from the latest comparison using climate model data it came out worst of the three (BEST, HADCRUT, GISS). But to prove they’re doing it wrong on purpose, you’d have to come with a bit stronger evidence.

Don Monfort
January 23, 2013 11:02 pm

“Matthew R Marler says:
January 23, 2013 at 7:51 pm
In the business and trading world, people go to jail for such manipulations of data.
Unless you redefine “such”, that is an accusation of fraud.
Well, unless you are instead claiming unjust imprisonment.”
Matt, can you cite the criminal code that prohibits manipulation of 1936 temp data? Is it called fraud? Is jail time a possible penalty? Is there a statute of limitations with regard to the age of the data?

thisisnotgoodtogo
January 23, 2013 11:25 pm

climatebeagle says:
January 23, 2013 at 8:34 pm
“I’m so confused, this thread has four different values for July 1936 in the US, yet elsewhere Steven Mosher keeps saying you can drop stations and the answer doesn’t change.”
I think it’s called “Mosher’s Delayed-Choice Gedanken Experiment” or something like that.

January 23, 2013 11:53 pm

This is a very important matter and I am pleased to see it being pursued by WUWT. Zeke Hausfather appears to be rather cornered by questions 3 & 4, as his reaction to questioning shows. It is prudent to be careful as I have seen cornered rats before and they are extremely vicious and have large teeth.

AndyG55
January 23, 2013 11:54 pm

“But to prove they’re doing it wrong on purpose, you’d have to come with a bit stronger evidence.”
If you look at the percentage of “adjustments” that lead to an increase in temperature trend, that will immediately tell you that it hasn’t happened by chance. This is totally beyond the possibility of any statistical scenario.
There is NO DOUBT that there has been systematic “adjustments ” to increase trends from the original raw data.
I’m betting that communications have continued between GISS and Had as well, but they have been much more careful to hide them.
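That claim is testable in principle with a simple sign test: if adjustments had no preferred direction, the number that increase the trend should look like fair-coin flips. A minimal sketch; the counts are placeholders, not a real tally of any dataset.

```python
# Sketch: two-sided sign test on adjustment directions. The counts below are
# placeholders for illustration, not an audit of NCDC, GISS, or anyone else.
from math import comb

n_adjustments = 100        # hypothetical number of adjustments examined
n_trend_increasing = 75    # hypothetical count that steepen the warming trend

k = max(n_trend_increasing, n_adjustments - n_trend_increasing)
# probability of a split at least this lopsided under a fair coin (two-sided)
tail = sum(comb(n_adjustments, i) for i in range(k, n_adjustments + 1))
p_value = tail / 2 ** (n_adjustments - 1)

print(f"{n_trend_increasing}/{n_adjustments} trend-increasing, two-sided p = {p_value:.1e}")
```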

E.M.Smith
Editor
January 24, 2013 12:11 am

@Monique:
Glacials last about 120,000 years, interglacials (like now) about 12,000 or about 1/10 as long.
The natural state of the earth is frozen into an ice age glacial. We only have a warm interglacial when the combination of earth axis tilt, circularity of the orbit, and the timing of when the tilt is aimed at the sun (‘precession’) is just right. That points the north pole at the sun for a longer period of time and melts the ice. Other than that, we’re frozen.
The “magic cutoff” is just a couple of Watts / square meter different from where we are now. Orbital mechanics make it inevitable that we lose those last Watts. Soon.
So yes, there is a ‘tipping point’ but only into a glacial, not to ‘extra warm’. (Not enough Watts any more for that.) See the chart in the top of this posting:
http://chiefio.wordpress.com/2012/12/29/annoying-lead-time-graph/
@Zeke:
The “raw” data are not really raw. They have been “Quality Controlled” in a variety of ways… that sometimes involve replacement with synthesized values. That was one of my first surprises in looking at the “data”.
@All:
I did a compare of GHCN v1 to v3 for the same time period. Each station has an anomaly taken as the very first step. (The ‘climate science’ guys often wait to the end when they have a ‘grid box’ then take a fictional ‘anomaly’ between two boxes without real thermometers in them. Ask them at what point a given thermometer has an anomaly taken of that thermometer data only compared to itself. In other words, a proper anomaly step prior to any other manipulation…)
http://chiefio.wordpress.com/v1vsv3/
Here is the graph of ONLY anomaly vs anomaly for the “same” data set (GHCN) for the same period of time: [graph]
The past cools, the present warms, and about the same amount as “global warming”.
All comparison done ONLY as anomalies. A thermometer ONLY compared to itself.
The anomaly dodge is just that, a dodge. The data contents are being tilted toward a warming trend. Why? Is it valid? Those are left unanswered…
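For readers who want to see what “a thermometer only compared to itself” means operationally, here is a minimal sketch of a per-station anomaly comparison between two dataset versions; the station values are invented stand-ins, not GHCN v1/v3 numbers.

```python
# Sketch: per-station anomalies, each station compared only to its own mean
# over the same years, then differenced between two dataset "versions".
# The values are invented stand-ins, not real GHCN data.
def anomalies(series):
    base = sum(series) / len(series)
    return [t - base for t in series]

station_v1 = [14.1, 14.3, 13.9, 14.6, 14.8]  # one station, five years
station_v3 = [13.8, 14.0, 13.9, 14.7, 15.0]  # same station after later adjustments

diff = [a3 - a1 for a1, a3 in zip(anomalies(station_v1), anomalies(station_v3))]
print([round(d, 2) for d in diff])  # negative early, positive late: a tilt
```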

Matt
January 24, 2013 12:12 am

Please explain what you were hinting at by saying people in finance would go to jail, if not for fraud?

Mindert Eiting
January 24, 2013 12:15 am

If there are reasons to assume that bias is involved in a comparison, you should introduce a bias term in your analysis. No one considers this fraud because it is part of a procedure that can be criticized. Take sunspot counts. Should we believe that astronomers in the seventeenth century were as accurate as their colleagues today? So, add G points to the counts by old German astronomers because of their fuzzy telescopes. Subtract F points from the values of French astronomers because they exaggerated their observations for obvious reasons. Let’s hope that the original data do not get lost in this circus. The question is whether you do this in your analysis or in your data. When you do it in your data, it’s called fraud.

RACookPE1978
Editor
January 24, 2013 12:20 am

Hmmmn.
Zeke:
How did you “generate” your (assumed-ever-so-accurate) “anomalies” that are the source of your continued processing … if you can’t tell us what the original early-century, middle-of-century, and end-of-century minimum and maximum temperatures actually were every day?
Are you not admitting blatantly that you don’t know what the basis of your anomalies is if you can’t tell us what the ACTUAL temperatures were each day at at each location?
Sure – using anomalies is correct. It’s the right way to generate long-term trends. BUT HOW DID YOU GET THE ANOMALY RIGHT if you can’t tell us the temperatures?
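To make the question concrete: an anomaly for one station is just its absolute reading minus that station’s own base-period mean for the same calendar month, so the absolute temperatures have to exist first. A minimal sketch with an invented record and a hypothetical 1961–1990 base period:

```python
# Sketch: a station anomaly computed from that station's own absolute readings.
# The record and base period are hypothetical.
BASE_YEARS = range(1961, 1991)  # 1961-1990 inclusive

def monthly_baseline(records, month):
    """Mean absolute temperature for one calendar month over the base period."""
    vals = [t for (year, m, t) in records if m == month and year in BASE_YEARS]
    return sum(vals) / len(vals)

def anomaly(records, month, t_abs):
    return t_abs - monthly_baseline(records, month)

# records: (year, month, mean temperature in deg C); an invented July series
records = [(y, 7, 22.0 + 0.02 * (y - 1961)) for y in range(1961, 2013)]
print(round(anomaly(records, 7, 23.4), 2))  # a July reading vs the July baseline
```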

GabrielHBay
January 24, 2013 1:11 am

A most valuable thread. I have often mused to myself that I must be a real d****r, not just a skeptic, for not even accepting the commonly accepted (even by skeptics) line of 20th century (meaningful) warming. I have consistently taken the view that, based on what I have seen explained and presented, there is no way I can make even that little concession to the warmists. Ups and downs, yes. Systematic warming, no. We just do not know. The official data is crap. I feel vindicated.

Stephen Richards
January 24, 2013 1:49 am

What is it about climate scientists that they alone think it is normal scientific practice to manipulate pre-validated data? I don’t know how many times I have said this, BUT UNDER NO CIRCUMSTANCES IS IT RIGHT TO MANIPULATE VALIDATED DATA. You can call it manipulation if you wish, but it appears, to all intents and purposes, to be fraud.
AND what the [snip . . no need . . mod] is the problem with Mosher? Is he looking for work at NASA GISS?

January 24, 2013 3:04 am

E.M.Smith:
You conclude your post at January 24, 2013 at 12:11 am saying

Here is the graph of ONLY anomaly vs anomaly for the “same” data set (GHCN) for the same period of time: [graph]
The past cools, the present warms, and about the same amount as “global warming”.
All comparison done ONLY as anomalies. A thermometer ONLY compared to itself.
The anomaly dodge is just that, a dodge. The data contents are being tilted toward a warming trend. Why? Is it valid? Those are left unanswered…

Yes!
I again point to a Parliamentary Submission I made which is linked at
http://www.publications.parliament.uk/pa/cm200910/cmselect/cmsctech/memo/climatedata/uc0102.htm
and
I draw your attention to the draft paper which is it Appendix B.
In the Parliamentary Submission I said of mean global temperature (MGT)

9.
It should also be noted that there is no possible calibration for the estimates of MGT.
The data sets keep changing for unknown (and unpublished) reasons although there is no obvious reason to change a datum for MGT that is for decades in the past. It seems that – in the absence of any possibility of calibration – the compilers of the data sets adjust their data in attempts to agree with each other. Furthermore, they seem to adjust their recent data (i.e. since 1979) to agree with the truly global measurements of MGT obtained using microwave sounding units (MSU) mounted on orbital satellites since 1979. This adjustment to agree with the MSU data may contribute to the fact that the Jones et al., GISS and GHCN data sets each show no statistically significant rise in MGT since 1995 (i.e. for the last 15 years). However, the Jones et al., GISS and GHCN data sets keep lowering their MGT values for temperatures decades ago.

Richard

chinook
January 24, 2013 3:05 am

Zeke posted the GISS link that explains their methodology for surface temp analysis: http://data.giss.nasa.gov/gistemp/abs_temp.html
But, this paragraph raises some questions and eyebrows at what they’re implying:
‘Q. If the reported SATs are not the true SATs, why are they still useful ?
A. The reported temperature is truly meaningful only to a person who happens to visit the weather station at the precise moment when the reported temperature is measured, in other words, to nobody. However, in addition to the SAT the reports usually also mention whether the current temperature is unusually high or unusually low, how much it differs from the normal temperature, and that information (the anomaly) is meaningful for the whole region. Also, if we hear a temperature (say 70°F), we instinctively translate it into hot or cold, but our translation key depends on the season and region, the same temperature may be ‘hot’ in winter and ‘cold’ in July, since by ‘hot’ we always mean ‘hotter than normal’, i.e. we all translate absolute temperatures automatically into anomalies whether we are aware of it or not.’

I’m trying to understand their explanation: on one hand the SAT is meaningful to nobody, but the anomalies are extremely meaningful, especially in light of the admitted inaccuracies of local measurements. So even though (or because) the SATs are meaningless, they use an average of anomalies and suddenly meaningless data is meaningful. And they admit that public perception of anomalies is an easily manipulated thing. I only have a year of college statistics, but something here just isn’t passing the smell test. And how can making changes to past SATs that are meaningless, which result in new anomalies that are meaningful, especially to public perception, possibly result in an honest and actually meaningful conclusion? Perhaps I’m missing something very obvious. If so, my apologies.

January 24, 2013 3:50 am

chinook:
In your post at January 24, 2013 at 3:05 am you explain your view and ask

Perhaps I’m missing something very obvious. If so, my apologies.

You have no need to apologise because you have “missed” nothing. You have independently discovered what some of us have been complaining about for years.
I commend you to scroll up to my post which is immediately above yours and to read Appendix B in the link I provide there.
Richard

Venter
January 24, 2013 4:10 am

Chinook, it doesn’t pass your or anybody’s smell test [ except the AGW clique ] as it is a pile of steaming BS.
And the people promoting and accepting such BS as gold standard have the nerve to be offended if someone names what they are doing. For them, it seems like their activity is not wrong but the act of naming it truthfully is wrong.

Joe Public
January 24, 2013 4:41 am

Zeke Hausfather claims that “LIG thermometers record too high on max temps vs electronic”.
But a “max” temperature at one location will be well below “min” temperature at many other locations.

mpainter
January 24, 2013 4:54 am

Zeke Hausfather:
Half of the science of the global-warmers is meant to obliterate knowledge that has been obtained by past studies. That is what Michael Mann’s hockey stick was meant to do – eradicate the MWP.
The adulteration of data that is seen at the NOAA, by Hansen and GISS, is more of the same.
It is certain that if such adulterated data were used to promote investment schemes, the promoters would face prison terms. What sort of person defends such practices? Well, you for one.

mycroft
January 24, 2013 4:57 am

“In the spirit of civility”!!!!?? Beggars belief! After all the uncivil words used against skeptics, perhaps this young man should take time to see who is civil and who is not… I think he will find Anthony and WUWT at the top end of the civility scale, and warmists and their sycophants at the other end of the scale.

Theodore
January 24, 2013 4:59 am

“Zeke Hausfather says:
January 23, 2013 at 4:19 pm
Anthony,
There is a good reason why all the major groups (NCDC, GISS, UAH, RSS, Hadley) primarily report anomalies. Absolute temperatures are tricky things, at least when you aren’t able to sample the full field.
GISS has a rather good explanation: http://data.giss.nasa.gov/gistemp/abs_temp.html
We skeptics agree absolute temperatures are tricky things, which is why we have little faith in databases that continue to adjust them. However, Zeke, if you don’t know what the absolute temperature is for a station, you can’t calculate its anomaly.
I know you are trying to calculate the trend in anomalies, but without faith in the accuracy of the measurements and adjustments there is no faith in the accuracy of the trend. Sure, you can measure a change in the trend with your methodology. What you cannot do is determine whether that trend is the result of several factors not related to global warming or actual temperature trends. So you are comparing an anomaly trend, but the anomalies are caused by trends in UHI, station siting, land use changes, station moves, cherry picking of stations, and trends in climate scientists’ adjustments to the database.
You claim it is hard to know the absolute temperatures from 1936, but without knowing that, the trend data is pretty worthless. And pronouncements that such and such month or year was the hottest ever are absolute garbage out and not of any scientific value.
We know temperature was measured differently in 1936 and that it may require some adjustment to compare directly to today’s stations. However, there is little faith that those adjustments are accurate, based on hard measurements of the difference in measuring tools, that they use appropriate statistical techniques, and a host of other reasons the data is considered untrustworthy.
This is further confirmed when the temperature (and thus the anomaly trend based upon it) changes from year to year and version to version. The equipment used in 1936 has not changed between 2011 and 2012, or 1998. Yet each time a new database is produced, those years are adjusted downward to exaggerate the anomaly trend. What new information discovered between 2011 and 2012 caused you to adjust 1936 by 1 degree? If you can’t point that out, then how can we trust that it was not just fudged to make sure the trend is moving in the direction that CAGW proponents want it to move?
“Zeke Hausfather says:
January 23, 2013 at 3:45 pm
Hi Anthony,
Unfortunately, calculating the absolute temperature for the entire contiguous U.S. in July of 1936 is a non-trivial matter. I could download the raw absolute average temperatures from all stations available July of 1936. However, many of those stations were located on the rooftops of city buildings (this was the pre-airport age, after all). These stations also used liquid-in-glass thermometers which produce notably higher maximum temperature readings than modern electronic instruments. All of these things mean that a simple average of instruments available at that time would tell us something interesting about the conditions at the locations of those instruments, but not necessarily produce an unbiased estimate of CONUS temperatures.

Mark Bofill
January 24, 2013 6:30 am

jkivoire says:
January 23, 2013 at 5:43 pm
Mr Watts, Doubt all you want, I now get “Access Denied The owner of this website (rankexploits.com) has banned your IP address (207.200.116.13). (Ref. 1006)”
This is the result of the previous “obnoxious” page.
————————–
If this is Lucia’s page (the Blackboard) you’re probably being shot down by her anti-bot defenses. Happens to me once in a while too. She periodically posts about her tweaks to identify and prevent bot access and posts apologies for false rejects, for example:
http://rankexploits.com/musings/2012/how-constant-are-hacking-attempts/
…not that you’ll be able to read this link if you’re being blocked as a bot…

pochas
January 24, 2013 6:36 am

It’s fraud. So why are we inclined not to prosecute, to “let it slide”? Because the admission shows what fools we have been.

climatebeagle
January 24, 2013 6:38 am

Even more confusing: Steven Mosher also said that adding data does change the answer, but dropping stations doesn’t. That’s one cool algorithm.
http://wattsupwiththat.com/2012/03/19/crus-new-hadcrut4-hiding-the-decline-yet-again-2/#comment-928684

January 24, 2013 6:48 am

In engineering school we took lots of data. We had to report the accuracy of the instrument, every time. This is the definition of “taking data.” The debate about CO2 and its effects involves data from years and years ago, taken with instruments with varying accuracy. The people involved in attempting to prove that CO2 has harmful effects are attempting to prove this to the public, who did not go to engineering school, would not know even what the word “accuracy” means, and believe what they read in the papers. “Never argue with a man who buys ink by the barrel.”
People like Zeke and Steven live by raising red herrings, statements that don’t actually have to BE true, they just have to SOUND true to the uninformed. As long as they still are able to get away with it, they will. As long as the Obamanation continues, and as long as academic and government administrators have a liberal bent, this will continue. There is nothing for it, as these people believe that the modern lifestyle is evil, and they will do anything to make it stop. This, despite the fact that they all LEAD the modern lifestyle. This AGW myth will continue until the general public notices that the weather seems about the same as it has always been, and then there will be no story, no publicity, and the libs will move on to something else.
Life goes on…

RockyRoad
January 24, 2013 6:59 am

So what Zeke is telling us (and defending, I might add) is that the various temperature datasets are actually MODELS of the temperature. I say this because they all adjust the raw data using sets of algorithms – many of which do not withstand logical scrutiny and justification. They’re all just “educated guesses”.
But taking a step back from the temperature models, as a student of sampling theory I would assert that the sample density is not sufficient for the temperature phenomena being studied; that “fudging” is the best term to describe the method(s) used to project point data to large segments of the earth’s surface; and that these projections have no bearing on the true temperature of said regions. Additional mathematical gyrations foisted on the data try to address these issues (and by doing so admit to my allegations) but are no substitute for the real thing.
Subjective models based on insufficient sampling give anything but accurate results. And until these problems are addressed, discussion will continue ad nauseam with no resolution in sight.
(Still, kudos to Mr. Hausfather for joining the discussion.)

Matt Skaggs
January 24, 2013 7:20 am

davidmhoffer wrote:
“I’ve challenged several scientists to take their precious temperature data, convert it to w/m2, THEN average it and THEN trend it. If we’re trying to detect a change in w/m2 at surface due to increases in CO2, then why the BLEEP are we not measuring changes in w/m2 at surface?”
Thanks David, that had not occurred to me, you are absolutely right. The answer is of course not willful misdirection, but the fact that no one is driving the AGW bus. Clearly climate scientists should be formally pursuing whether W/m2 is tracking CO2. I’m adding this to my list of explicit predictions made by AGW theory that no one is really pursuing in any formal manner, such as polar amplification and the mid-troposphere hot spot. The fact that these cats won’t herd means that we are unlikely to see any of the specific, non-trivial AGW predictions (the true yardstick of “settled science” being the number of non-trivial and non-obvious predictions conclusively shown to be true) resolved in the near future.

rilfeld
January 24, 2013 7:30 am

“Steven haney:
At January 23, 2013 at 4:40 pm you ask
Zeke, Anthony and Richard, Why is there no longer any discussion of temp. data prior to 1850?
My answer is that there were few temperature measurement sites prior to 1850 so the methods used to compile e.g. global temperature are not applicable for then.
Which is not to say I think the methods used are applicable for times after 1850. I don’t, but I think they could be.
Richard”
It would be of great benefit if we had temperature measures from outside the arena of instrumentation to provide checkpoints. Though unable to provide a ‘hard’ confirmation or denial of instruments, we do have a few. The freezing (or not) of the river Thames. The range patterns of various species of winter wheat. The temperature-related yields of the citrus and strawberry crops. Extreme anomalies of this sort are usually reported contemporaneously. Granted, very imprecise, but in the ‘hottest year on record’ I’d expect to see a lot of such confirming reports outside of climate political correctness and sheepishness. The wheat doesn’t know how the thermometer is sited (or cited). … like continuing tree ring studies through the present to ‘validate the natural instrument’, or are we hitting a sore spot there?

January 24, 2013 7:30 am

davidmhoffer says January 23, 2013 at 7:41 pm

What could possibly be a simpler concept? If CO2 changes the w/m2 at the surface, then why aren’t we measuring and trending w/m2 at the surface?

Better yet, how about measuring the LWIR radiated back into space? Or both?
BUT WAIT … we have something comparable: UAH and RSS sat msmts …
.

January 24, 2013 7:48 am

Now let’s see if I’ve comprehended
How temperature data gets “mended.”
They make the past cooler,
Then they take a ruler,
Et voilà! The warming’s not ended.

January 24, 2013 7:57 am

Couple of points.
I’ve been studying how much of the day’s temp rise is lost that night (think of it as a daily anomaly), then averaging that across all stations. Since the temps used are from the same station in the same 24-hour period, it seems to me to be more immune to station errors. I further filter out stations that don’t provide records over most of the year; I’ve used anywhere from 240 days/year to 365 days/year with no significant difference in results. The result is that there’s no loss of the ability to cool the day’s temps, and when the daily averages are looked at by latitude range, you can see the effect of the change in length of day.
I’ve also recently gotten an IR thermometer, looking for some reflection of IR off the atmosphere overhead. While it gives an interesting result, it either reads below scale (-40F, which is IR out to ~12.5u) or it reads max scale (608F). The interesting part is that it seems to read 608 in the opposite direction of the Sun as well as when near the Sun, as if solar IR (<~5u) is being reflected by the atmosphere. All of this was on a 35F day with clear skies. There is no other IR signal coming from the atmosphere. On the other hand, it will read clouds just fine.
Lastly, there’s no government agency I’ve ever worked with that doesn’t calibrate its instruments on a regular basis. They always use calibrated standards traceable to some other calibrated instrument. Are we supposed to believe that the NWS neglected this important step for decades, yet was so interested in the data that it was recorded and saved for decades?
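A minimal sketch of one plausible reading of this “daily anomaly” idea (each day’s rise versus the following night’s fall), using an invented three-day record for a single hypothetical station:

```python
# Sketch: compare each day's warming (Tmax minus that morning's Tmin) with the
# following night's cooling (Tmax minus the next morning's Tmin). Invented data.
days = [  # (tmin_morning, tmax, tmin_next_morning) in deg C
    (12.0, 24.0, 11.5),
    (11.5, 25.0, 12.0),
    (12.0, 23.5, 11.0),
]

rises = [tmax - tmin for tmin, tmax, _ in days]
falls = [tmax - tmin_next for _, tmax, tmin_next in days]
net = [r - f for r, f in zip(rises, falls)]  # > 0 means warmth retained overnight

print("mean daily rise:  ", round(sum(rises) / len(rises), 2), "C")
print("mean nightly fall:", round(sum(falls) / len(falls), 2), "C")
print("mean net retained:", round(sum(net) / len(net), 2), "C")
```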

January 24, 2013 8:24 am

rilfeld:
At January 24, 2013 at 7:30 am you say to me

It would be of great benefit if we had temperature measures from outside the arena of instrumentation to provide checkpoints. Though unable to provide a ‘hard’ confirmation or denial of instruments, we do have a few. The freezing (or not) of the river Thames. The range patterns of various species of winter wheat. The temperature-related yields of the citrus and strawberry crops. Extreme anomalies of this sort are usually reported contemporaneously. Granted, very imprecise, but in the ‘hottest year on record’ I’d expect to see a lot of such confirming reports outside of climate political correctness and sheepishness. The wheat doesn’t know how the thermometer is sited (or cited). … like continuing tree ring studies through the present to ‘validate the natural instrument’, or are we hitting a sore spot there?

It is not a “sore spot” with me.
Such proxies are used; e.g. Soon & Baliunas
http://www.int-res.com/articles/cr2003/23/c023p089.pdf
Incidentally, please ignore the wiki comments on this excellent paper: the wiki comments have been given the ‘Connolley Treatment’, so read it and evaluate it for yourself.
Of much more direct use are British Admiralty ships’ log temperature records.
http://badc.nerc.ac.uk/view/badc.nerc.ac.uk__ATOM__dataent_1239019538627371
TonyB is the person you really need to question about this subject. He often posts on WUWT and has probably given it more study than any other person or organisation.
Richard

davidmhoffer
January 24, 2013 8:33 am

_Jim
Better yet, how about measuring the LWIR radiated back into space? Or both?
BUT WAIT … we have something comparable: UAH and RSS sat msmts …
>>>>>>>>
I thought so at one time as well. Turns out not. They measure a specific frequency of a specific isotope of something or other (I forget) which is directly proportional to temperature. So even THEY are measuring temperature. Of course, they are measuring it at millions of points in time and space and they COULD turn it into w/m2 before averaging it… but they don’t.
Then there is ERBE (the Earth Radiation Budget Experiment), which actually DOES measure what we’re after, but they don’t publish the data or trends in any useful manner. I started looking at their raw data download once and gave up in short order.

davidmhoffer
January 24, 2013 8:40 am

Zeke Hausfather;
Well Zeke, you complained about the way Anthony leveled his criticisms, and in attempting to defend yourself from them the following becomes obvious:
1. You cannot justify the changes to the temperature record that Anthony pointed out.
2. You cannot justify the methods by which average temperatures at any given point and time are calculated.
3. You cannot justify the manner in which anomalies are calculated from the temperatures.
4. You cannot justify averaging the anomalies together for any purpose at all.
Your silence speaks volumes.

Larry Geiger
January 24, 2013 9:07 am

“These stations also used liquid-in-glass thermometers which produce notably higher maximum temperature readings than modern electronic instruments.” This is absolutely ridiculous. In this case you have no conclusion because YOU HAVE NO DATA! Making corrections based on how you think that thermometer was reading, and how the reader was reading it, becomes totally bogus. If this is the excuse for all of these corrections, then it’s time to give up claiming ANY knowledge of where the climate is going until we have a couple of hundred years of REAL DATA!

January 24, 2013 10:26 am

[Anthony:] In the business and trading world, people go to jail for such manipulations of data. [Zeke:] but accusing [NCDC] of fraud is one step too far.
My take on the argument.
Going 76 mph on I-10 in rural portions of Texas won’t get you stopped for speeding.
Going 76 mph on I-10 in downtown Houston will eventually result in, “May I see your license and insurance, please?”
Going 76 mph in a 20 mph school zone could quickly result in: “KEEP YOUR HANDS IN SIGHT, Step out of the car, hands on the hood, and SPREAD ‘EM!”
Whether crimes are committed by an action depends upon the circumstances.
Furthermore, actions don’t have to rise to criminal level to be wrong.

January 24, 2013 10:47 am

Liquid-in-glass thermometers CAN read a little high at the lower end of their range. This is because the ones that read up to 100 C are calibrated in boiling water, and the heat makes the glass a little longer, distorting its reading when the glass is cooler. This certainly does not mean that WEATHER thermometers do this. Once again, these people only need a plausible excuse to pull their dirty tricks; it does not have to be TRUE.
Mr. Moderator, typing in this box can be a real ordeal, can’t it be fixed?
[Reply: Sorry, that is controlled by WordPress. You could try a different browser, see if that helps. — mod.]

Michael
January 24, 2013 11:52 am

Even the most developed nations in the world can’t keep accurate/complete temperature records. There are two Environment Canada weather stations at the Calgary Alberta Canada international airport, located, according to the Environment Canada web site, 760 metres (just under half a mile) apart. The average temperature (as shown at 12:45PM local time today) for December 2012 (Tmax) was -4.5C at one and -5.7C at the other. Both are missing data for at least one day in computing the Tmax average. For November it was 1.9C and 0.8C, October 6.8 and 7.6 … . I don’t know which or if both or neither get fed into the calculation for global temperature datasets … but I can tell you that I have no confidence at all that anyone can compute a global average temperature within 1/100th of a degree C, F, or K.
And, yes, Anthony, semantics aside, restating prior years’ temperatures may not be fraud, but the alternative is incompetence. Neither generates confidence in the ability to rely on the data.
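A minimal sketch of the missing-day problem mentioned above, with invented December maxima; dropping a single cold day shifts the monthly mean by about half a degree in this toy example.

```python
# Sketch: effect of one missing day on a monthly Tmax mean. Values are invented.
december_tmax = [-2.0, -4.5, -8.0, -1.5, -3.0, -6.5, -5.0, -2.5]  # short fake month

full_mean = sum(december_tmax) / len(december_tmax)
without_coldest = [t for t in december_tmax if t != min(december_tmax)]
partial_mean = sum(without_coldest) / len(without_coldest)

print(round(full_mean, 2), round(partial_mean, 2), round(partial_mean - full_mean, 2))
```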

john robertson
January 24, 2013 12:09 pm

The statement,
Liquid in glass thermometers produce higher maximums than modern electronic sensors,
is verified where?
Data please, who ran this experiment?
Where is the process described?
Which electronic sensors were used?
Which brings me back to electronic sensors, these are accurate in what temperature range?
-40 to 20C is what Environment Canada apparently uses, which would cast doubt on their ability to measure winter lows or summer highs with any certainty.
This is not even good enough for government standards.

January 24, 2013 12:26 pm

Michael says:
January 24, 2013 at 11:52 am
“Even the most developed nations in the world can’t keep accurate/complete temperature records. There are two Environment Canada weather stations at the Calgary Alberta Canada international airport, located, according to the Environment Canada web site, 760 metres (just under half a mile) apart.”
As part of the work I did with the NCDC data set, for each graph I made, I also make a google maps file with station locations as listed in the station metadata. You can find the graphs with a google maps link here:
http://www.science20.com/virtual_worlds/blog/updated_temperature_charts-86742
Note Google maps only allows you to put so many thumbtacks per page, some of the maps have dozens of pages (they’re at the bottom of the station list on each page).
One of the things I have in mind is enhancing the data on each thumbtack, for now it’s a minimal set of data.

January 24, 2013 12:36 pm

richardscourtney:
I think it’s pretty clear what’s happening: CO2 emissions are cooling our past.
That’s why Hansen is so worried — if this continues, our grandparents will have died in an Ice Age and we’ll never be born.
Now, spending $14T makes perfect sense…

January 24, 2013 12:53 pm

talldave2:
Thanks! I enjoyed that. 😉
But I would prefer to not contribute to the $14T yet, if you don’t mind. Selfish, huh?
Richard

Editor
January 24, 2013 1:01 pm

Let me offer this in clarification of some of the issues.
1. Temperature is an intensive property of matter, which means it doesn’t necessarily change with the amount of stuff you measure. Temperature is also not conserved. This means that there is no “right” way to average it. Instead, there are a number of ways, each of them with different advantages and disadvantages. This makes it hard to say what the average temperature of the US “is” today, much less what it “was” in 1936.
2. We are dealing with a fragmented, scattered, corrupted dataset of surface temperature. Yes, it is valuable to look at the raw data. It is also valuable to remove what errors we can reliably identify, and see what the result looks like.
3. When we remove any given error in actual temperature data (not anomalies), such as a spurious step change, we are left with a problem—when we adjust the data to remove the error, do we move the recent post-step data in one direction to correct the error, or do we move the older, pre-step data the opposite direction? The consensus is to adjust the older data, although either way works. We could just as easily move recent data down as to move older data up, or vice versa. (One problem with adjusting recent data is that you have to add in the adjustments when adding today’s temperature to a dataset, which may be why older data is usually adjusted.)
4. As a result of the consensus being to adjust the older data when errors are removed, which keeps the most recent data in agreement with the current thermometer readings, the historical temperature data will change depending on what errors we have removed from the raw data. (We could keep historical temperatures the same, but then current temperatures would be changing depending on the errors removed, and that’s even more confusing.)
5. As long as there is a clear audit trail, which as far as I know BEST has provided, I don’t think anyone either would or should go to jail for doing any of that, whether in business or out of business. The only problem comes up when you claim adjusted data is raw data, or you hide your adjustments, or the like. I don’t see that Zeke or BEST have done any of that.
Finally, thanks to Anthony and Zeke for discussing it.
w.
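A minimal sketch of point 3, using an invented series with a spurious step: shifting the pre-step segment up or the post-step segment down gives an identical trend, but different “historical” values.

```python
# Sketch: removing a spurious step change either by raising the older segment
# or by lowering the newer one. Series and step size are invented.
series = [14.0, 14.1, 14.2, 14.1, 15.3, 15.4, 15.5, 15.4]  # step after index 3
step_at, step_size = 4, 1.2

adjust_old = [t + step_size if i < step_at else t for i, t in enumerate(series)]
adjust_new = [t - step_size if i >= step_at else t for i, t in enumerate(series)]

def slope(xs):
    """Least-squares trend per time step."""
    n = len(xs)
    t_mean, x_mean = (n - 1) / 2, sum(xs) / n
    num = sum((i - t_mean) * (x - x_mean) for i, x in enumerate(xs))
    return num / sum((i - t_mean) ** 2 for i in range(n))

print(round(slope(adjust_old), 4), round(slope(adjust_new), 4))  # same trend
print(adjust_old[:4], "vs", adjust_new[:4])  # but the "past" differs by 1.2
```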

steven haney
January 24, 2013 1:06 pm

Rilfeld, I completely agree. The trees, wheat, vineyards and even the River Thames all agree that 2012 WAS NOT the WARMEST EVER! The Greenland glaciers (with ice core documentation cited in today’s top posting) would LOL (if they could) if told 2012 was the warmest ever. I posed my question concerning pre-1850 data (which was scrubbed by GISS soon after M. Crichton’s “State of Fear” was published) to show how ALL the DATA from CRU has been cherry picked! 70% (roughly) of the warming trend from 1850 to 1996 would be erased by using a data set beginning in 1775. “Every EKG looks like a hockey stick if you only observe one heart beat” is my real point. That is why 1936 is not a big deal to me, but Anthony’s core point of the changing historical temps every month by CRU/GISS constituting FRAUD (my word, not his) is VALID… and Zeke has no coherent argument against this FACT. Thanks to all who have replied… I AM OUT!

January 24, 2013 1:16 pm

Mark Bofill says:
Mr Watts, Doubt all you want, I now get “Access Denied The owner of this website (rankexploits.com) has banned your IP address (207.200.116.13). (Ref. 1006)”
This is the result of the previous “obnoxious” page.
————————–
If this is Lucia’s page (the Blackboard) you’re probably being shot down by her anti-bot defenses. Happens to me once in a while too. She periodically posts about her tweaks to identify and prevent bot access and posts apologies for false rejects, for example:
http://rankexploits.com/musings/2012/how-constant-are-hacking-attempts/
…not that you’ll be able to read this link if you’re being blocked as a bot
What a waste of time; it is absolutely impossible to block any serious bot network by IP banning. She keeps incorrectly referring to automated web crawlers (Google, Bing, Baidu etc.) and bot spammers as “hacking” attempts. Yeah sure, everyone is trying to hack the Blackboard blog, oh please. It is hilarious to watch her repeatedly banning commentators like Zeke.

January 24, 2013 1:19 pm

Willis Eschenbach:
I write to ask a clarification.
At January 24, 2013 at 1:01 pm you say

When we remove any given error in actual temperature data (not anomalies), such as a spurious step change, we are left with a problem—when we adjust the data to remove the error, do we move the recent post-step data in one direction to correct the error, or do we move the older, pre-step data the opposite direction?

I think that can be agreed. But it also begs the question which Zeke Hausfather has avoided despite it being put to him in various forms.
All “spurious” step changes have been adjusted once a temperature time series has been compiled. There is no obvious reason to make additional adjustments when new data is added to the end of the time series unless and until the additional data adds an additional “spurious” step change.
But each of the temperature time series (GISS, HadCRUTn, etc.) alters past data several times a year. Why?
Richard

pete
January 24, 2013 1:48 pm

“So what Zeke is telling us (and defending, I might add) is that the various temperature datasets are actually MODELS of the temperature.”
Exactly. And if people treat these values as model outputs or statistics rather than data then we can start to have meaningful discussions about what they actually represent. Trying to pass them off as data is patently ridiculous.

Matthew R Marler
January 24, 2013 2:01 pm

Don Monfort: Matt, can you cite the criminal code that prohibits manipulation of 1936 temp data? Is it called fraud? Is jail time a possible penalty? Is there a statute of limitations with regard to the age of the data?
Is there a point there? My comment was directed toward Anthony Watts’ language. It was his language that implied criminal activity.

Matthew R Marler
January 24, 2013 2:09 pm

In the business and trading world, people go to jail for such manipulations of data.
Perhaps Zeke will ask some jurors to decide whether that constitutes libel or slander (when spoken on TV and then written in a blog, I don’t know which word applies). Or if, as with “torturing the data”, the phrase “go to jail for such manipulations of data” is within the bounds of permissible hyperbole when directed at a “public figure” like Zeke.

PKthinks
January 24, 2013 2:19 pm

There are few areas of science where retrospective readjustments of data that always seem to have the same bias would be tolerated; ZH is upset because he knows this himself.
The proof of the bias is that the long-term trends are not usually affected, only the comparisons of different periods in the temperature record.
Only one ‘side’ would spend (waste) time on the 1936 US temperature record or harmonise HadCRUT with GISTEMP post-1998.
History may judge this frantic effort to keep re-presenting temperature records as a very cynical attempt at manipulation, for it is not obvious whether it tells us more about the temperature record or about the climate (of climate science).

January 24, 2013 3:12 pm

Matthew R Marler:
At January 24, 2013 at 2:09 pm you write

In the business and trading world, people go to jail for such manipulations of data.

Perhaps Zeke will ask some jurors to decide whether that constitutes libel or slander (when spoken on TV and then written in a blog, I don’t know which word applies). Or if, as with “torturing the data”, the phrase “go to jail for such manipulations of data” is within the bounds of permissible hyperbole when directed at a “public figure” like Zeke.

The statement is not libel, not slander and not hyperbole.
The statement is merely a legal fact.
Richard

johnbuk
January 24, 2013 3:13 pm

As far as I’m concerned this is indeed a truly fascinating post and thread, made more so by Zeke and others prepared to answer their case. I’m not qualified to determine the rights and wrongs of the particular scientific argument and so, like most of the CAGW debate I’ve had to rely on my own BS meter to come to a view.
I like chinook’s (January 24, 2013 at 3:05 am) summary of his own take on the argument which sums up my own thoughts – like him I thought maybe I had missed something as the whole process didn’t look as “scientific” as I imagined it should. Richard Courtney’s response was quite reassuring! Ian W (January 23, 2013 at 5:11 pm) also hit the nail on the head as well regarding the uncertainty demonstrated on the thread by Zeke versus the certainty of the same report when announced to the world at large! I’d be interested to know if Zeke agreed with this level of published certainty on a personal level or if he was overruled by a majority view (consensus perhaps)?
I retired from a career in UK Banking in 2000 and some 8 years later was appalled and embarrassed by what was coming out of the woodwork from these glorified “betting shops” run by the spivs who had grabbed the reins. I avoided telling anyone what I did for a living after that. I suppose I should be grateful I wasn’t a “Climate Scientist”.
Many thanks to Anthony for this excellent site and for all who post here from both sides of the argument.

Craig M
January 24, 2013 3:20 pm

The best of intentions often lead to the worst crimes. It is Orwellian and in any free thinking world should be a crime as green policies – led by the adjustments telling us to panic and fear and fear and panic (rinse & repeat) – are destroying lives by artificially inflating fuel and food prices.
The reference to Madoff is not far off the mark in that context. Really to think scientists are any less likely to resort to fraud etc than any other profession is as absurd as CAGW frying the world. We hardly shower ourselves in glory as a species – do scientists have a gene that makes them immune?
I am reminded of a comment by Tallbloke some months ago.
“Metrics should be maintained and calibrated by impartial bodies whose principal remit is the custodianship of data, not its application. Allowing the definition and calibration of the metrics to be in the same hands as those writing new theory is a recipe for bias. We should not repeat the mistakes of the past so quickly.”

Matthew R Marler
January 24, 2013 3:52 pm

Richard S. Courtney: The statement is merely a legal fact.
I disagree, but I’d be willing to go along with a jury decision after a legal proceeding. It looks to me like an accusation of illegal behavior, but there are disagreements among language users about the true meanings of phrases like “such manipulations”.

January 24, 2013 5:35 pm

talldave2 says:
January 24, 2013 at 12:36 pm
richardscourtney:
I think it’s pretty clear what’s happening: CO2 emissions are cooling our past.
That’s why Hansen is so worried — if this continues, our grandparents will have died in an Ice Age and we’ll never be born.
Now, spending $14T makes perfect sense…
===================================================================
Bet I could find an old DeLorean for a lot less.
(If only Hansen was as smart as Doc Brown …)

Gary Pearse
January 24, 2013 5:42 pm

Zeke’s explanations seem heartfelt, but I think the niggling question is if 1936 was one thing in 1980, something else in 1987 and something else again in more recent times – are we done with it now or does it remain malleable as Anthony complains. Personally, I think Zeke is as honest as the next guy, but you know, if one believes in CAGW, then it is human nature to err in the desired direction, correct in the desired direction, round-off in the desired direction…I see it all the time with theweathernetwork in Canada who in their forecasts tend to overestimate the temperature more often than not. I think an honest man who believes in a theory, would be attracted to those parts of the record for “homogenization” that don’t fit the theory very well and probably don’t give a damn about 1926 or 1946. I think fraud may be in the picture when a new “high” isn’t quite as high as that pesky 1936 and needs a little nudge. Surely we can all agree that (say) if the IPCC has a choice, they will go for the higher temp, the higher rainfall amount etc.etc – like our weather forecaster above. It is high time that weather records and corrections should be in the hands of disinterested statisticians. Anything else is Colonel Sanders looking after our chickens.

Gary Pearse
January 24, 2013 6:01 pm

If CAGW is going to happen, it wouldn’t matter if the record was not perfect. If CAGW is of a significant magnitude, it shouldn’t need little helping hands of hundredths of degrees C. Leave it all alone. Let’s, by all means, make improvements in equipment, distribution, etc. and rely on the newer stuff to tell us. I think the advent of satellite measurements has constrained adjustments to the more recent temps, and all there is left to adjust is the pre-1979 record downwards, if indeed there is an effort to put a thumb on the scale. Egads, we shouldn’t be adjusting 1936 down some fraction of a degree to make July 2012 a new record, we should chop off 2 or 3 degrees. A psychologist might say this type of activity shows a certain desperation on the part of those shoring up an ailing theory.

thelastdemocrat
January 24, 2013 6:40 pm

Billion$ ride upon the implications.
Couldn’t we manufacture a bunch of thermometers as used in the old days, and site them as in the old days, then examine the degree of agreement between the old style and the new style?
How many variations of the old style would we need to provide an answer in the face of a few likely scenarios (i.e., mercury purity was either this or that value, so we make two sets of thermometers)?
I can go on the web and buy all sorts of thermometers. These data (note: plural) would be ready for a year’s worth of analysis within a year.

Venter
January 24, 2013 8:32 pm

Matthew Marler, the statement was directed at NOAA/NCDC, and they are Government Organisations with multimillion dollar budgets and staff. If they feel it is libellous, let them take it up. This was not directed at Zeke.
As per what Zeke posted here, there’s no way to know what the CONUS absolute average temperature was in 1936. So what NOAA/NCDC stated in announcing that 2012 was the warmest year ever was false. So what did Zeke do? Did he pull them up? Did he make a post about their statement? No, he goes after Anthony for an interview comment Anthony gave about NOAA/NCDC. Did Zeke at any time write posts about the language Kevin Trenberth of NOAA/NCDC, James Hansen of NASA/GISS and other taxpayer-funded and salaried pro-AGW scientists used about skeptics and Anthony? Is Zeke an employee or legal representative of NOAA/NCDC? So why is he their mouthpiece? He is condoning their act and deciding what Anthony should say about what their statement meant to Anthony. He says it can be termed an “unethical” act, but not in the terms Anthony used. Bloody hypocritical of him to do so.
To hell with it. Who’s Zeke to decide what language Anthony should use to express his personal opinion about the behaviour of a taxpayer-funded Government Organisation, especially when they indulge in deliberate malfeasance? Who’s Zeke to decide that “unethical” is a good word and other words are not? What kind of a weird world do you all inhabit?

MattS
January 24, 2013 9:23 pm

richardscourtney,
“But each of the temperature time series (GISS, HadCRUTn, etc.) alters past data several times a year. Why?”
As an IT person, if I had to guess, I would say that the most likely reason is that the routines for in-filling/estimating missing data are recursive, so that as new real data gets added it automatically affects all of the old estimates/in-fills, which of course affects the past monthly and annual averages.
Now if you ask me, doing things like that in a supposedly scientific endeavor is just plain wrong.
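If that guess is right, the effect is easy to demonstrate. The toy routine below is purely illustrative (it is not NCDC’s or GISS’s actual algorithm): it infills a missing month with the mean of whatever values are available for that station, and re-running it after two warm recent years are appended silently changes the “historical” 1936 estimate, and with it every average built on top of it.

# Toy infill routine, for illustration only - NOT any agency's real method.
def infill(series):
    """Replace missing (None) entries with the mean of the observed values."""
    observed = [v for v in series if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in series]

# July anomalies for one station, 1935-1938, with 1936 missing (units arbitrary)
history = [1.2, None, 0.8, 1.0]
print(infill(history)[1])            # 1936 estimated as 1.0

history = history + [2.1, 2.3]       # two warm recent years appended
print(infill(history)[1])            # 1936 estimate is now 1.48 - the "past" moved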

steven haney
January 24, 2013 11:49 pm

MattS, I Agree… Just Plain Wrong! Unjustifiable… Unconscionable… Illegal… Indefensible.

Don Monfort
January 25, 2013 12:08 am

Have you gone crazy, Mattstat?
Matthew R Marler says:
January 24, 2013 at 2:01 pm
Don Monfort: Matt, can you cite the criminal code that prohibits manipulation of 1936 temp data? Is it called fraud? Is jail time a possible penalty? Is there a statute of limitations with regard to the age of the data?
Is there a point there? My comment was directed toward Anthony Watts’ language. It was his language that implied criminal activity.
Matthew R Marler says:
January 24, 2013 at 2:09 pm
In the business and trading world, people go to jail for such manipulations of data.
Perhaps Zeke will ask some jurors to decide whether that constitutes libel or slander (when spoken on TV and then written in a blog, I don’t know which word applies). Or if, as with “torturing the data”, the phrase “go to jail for such manipulations of data” is within the bounds of permissible hyperbole when directed at a “public figure” like Zeke.
Look Matt, Anthony was engaging in hyperbole. My mom used to do it when I transgressed. She would send me to get a switch while hollering that I deserved to be whupped within an inch of my life. I would have left town if I had taken that literally. Zeke is not going to court. He has no cause of action, mainly because Anthony did not say it about Zeke. Try to catch up. If you are going to play a lawyer on the internet, you should Google defamation, slander, and libel before you make a fool of yourself, again.

TC
January 25, 2013 12:35 am

MattS, I suspect you’ve hit the nail on the head there.
A comment from Zeke would be in order. If this is what they’re doing then they’re just playing around with numbers and, as a consequence, totally corrupting the historical data. And they think they’re doing science?

January 25, 2013 4:27 am

Matthew R Marler:
Your post at January 24, 2013 at 3:52 pm says

Richard S. Courtney:

The statement is merely a legal fact.

I disagree, but I’d be willing to go along with a jury decision after a legal proceeding. It looks to me like an accusation of illegal behavior, but there are disagreements among language users about the true meanings of phrases like “such manipulations”.

You are being silly.
It is simply a legal fact that “In the business and trading world, people go to jail for such manipulations of data.”
Don’t disagree with me about it: disagree with those who are in jail because they altered financial data from the past and failed to record the unaltered data.
It is also a legal fact that
people don’t go to jail for altering climatological data from the past and failing to record the unaltered data.
It is NOT libel, slander or hyperbole to state either or both of those legal facts.
Richard

January 25, 2013 4:43 am

MattS:
Your post at January 24, 2013 at 9:23 pm says

richardscourtney,

“But each of the temperature time series (GISS, HadCRUTn, etc.) alters past data several times a year. Why?”

As an IT person, if I had to guess, I would say that the most likely reason is that the routines for in-filling/estimating missing data are recursive, so that as new real data gets added it automatically affects all of the old estimates/in-fills, which of course affects the past monthly and annual averages.
Now if you ask me, doing things like that in a supposedly scientific endeavor is just plain wrong.

I agree with every word of your post, and I have obtained statements which imply your “guess” is correct.
But I want the matter to be openly admitted, and that is why I keep asking my question again and again (including repeatedly on this thread as your post quotes).
I want it to be admitted because I strongly agree with you that “doing things like that in a supposedly scientific endeavor is just plain wrong”. And I think those involved privately agree with this, too: please note the silence (including from Zeke Hausfather on this thread) which is the only response I get to my question.
Richard

January 25, 2013 5:00 am

RockyRoad and pete:
pete’s post at January 24, 2013 at 1:48 pm quotes RockyRoad saying at January 24, 2013 at 6:59 am

So what Zeke is telling us (and defending, I might add) is that the various temperature datasets are actually MODELS of the temperature.

and pete’s post replies

Exactly. And if people treat these values as model outputs or statistics rather than data then we can start to have meaningful discussions about what they actually represent. Trying to pass them off as data is patently ridiculous.

Indeed so.
I yet again draw attention to the draft paper which is Appendix B of my Parliamentary Submission which can be read at
http://www.publications.parliament.uk/pa/cm200910/cmselect/cmsctech/memo/climatedata/uc0102.htm
and was blogged from publication by the frequent changes to global temperature data sets (as the Submission explains).
The draft paper considers mean global temperature according to both understandings (i.e. as a modelled real parameter and as a statistic) and the paper considers the implications of each understanding.
Richard

January 25, 2013 5:02 am

Ouch! I typed “blogged from publication” but intended “blocked from publication”. Sorry.

MattS
January 25, 2013 6:44 am

richardscourtney,
“But I want the matter to be openly admitted, and that is why I keep asking my question again and again (including repeatedly on this thread as your post quotes).”
The problem with this is that the people responsible for it aren’t programming experts and may not understand what they have done and therefore may not be aware that it is happening.

MattS
January 25, 2013 6:49 am

richardscourtney,
I read a number of Zeke’s replies in this light.
He talks about how complicated it is to estimate the average temp for CONUS and that different methods will produce different results. He seems completely blind to the fact that the DATA behind the estimates has changed over time.

January 25, 2013 7:09 am

MattS:
At January 25, 2013 at 6:49 am you say

richardscourtney,
I read a number of Zeke’s replies in this light.
He talks about how complicated it is to estimate the average temp for CONUS and that different methods will produce different results. He seems completely blind to the fact that the DATA behind the estimates has changed over time.

If by “completely blind” you mean in the same way that Nelson saw “no ships”, then I agree.
Please note that
(a) I posed the question to him personally at January 23, 2013 at 3:56 pm
(b) I rephrased it for clarity in a post addressed to him at January 23, 2013 at 4:41 pm
and
(c) I reminded him of it in my post addressed to him at January 23, 2013 at 5:46 pm.
Those personal requests for an answer have obtained no response of any kind.
In your post to me at January 25, 2013 at 6:44 am you say

richardscourtney,

“But I want the matter to be openly admitted, and that is why I keep asking my question again and again (including repeatedly on this thread as your post quotes).”

The problem with this is that the people responsible for it aren’t programming experts and may not understand what they have done and therefore may not be aware that it is happening.

Well, if that were true then Zeke Hausfather could have responded to my posts to him with something similar to, “I don’t know so I will get back to you when I find out”. He did not do that. Instead, he was “completely blind” to my posts addressed to him.
Richard

January 25, 2013 8:12 am

MattS on January 25, 2013 at 6:49 am
richardscourtney,
“I read a number of Zeke’s replies in this light.
He talks about how complicated it is to estimate the average temp for CONUS and that different methods will produce different results. He seems completely blind to the fact that the DATA behind the estimates has changed over time.”

– – – – – – –
MattS,
Thanks for your straightforward discussion style on statistical topics. Zeke’s discussions seemed oblique; not only did he sidestep the most fundamental statistical concern, data integrity, he also avoided reasonable probes of negative scenarios about NCDC and GISS.
How do we mitigate against any kind of original data corruption by government-funded bodies, e.g. GISS, NCDC or the MET?
My thought is that the original data should be archived in parallel with three different types of custodians. One set placed with a body like the Smithsonian. A duplicate set placed with a major private university library archivist who must be unconnected with its science departments. Several other duplicate sets placed in the hands of several private volunteer groups.
As each new month’s data is taken, it would be auto-archived simultaneously at each location.
With that approach, screwing with the original data would be virtually impossible.
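To make the parallel-custodian idea concrete, here is a sketch (hypothetical file names; not any agency’s actual procedure) in which every custodian records a SHA-256 hash of each monthly data file on receipt. Any later alteration of an archived file changes its hash and shows up the moment the custodians’ records are compared.

# Sketch of tamper-evidence for parallel archives. Hypothetical file names;
# not an actual NCDC/GISS/MET procedure.
import hashlib

def file_digest(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(custodian_records):
    """custodian_records maps custodian name -> {filename: digest}.
    Report any file whose digests disagree across custodians."""
    all_files = set().union(*(r.keys() for r in custodian_records.values()))
    for name in sorted(all_files):
        digests = {c: r.get(name) for c, r in custodian_records.items()}
        if len(set(digests.values())) > 1:
            print("MISMATCH for %s: %s" % (name, digests))

# Each custodian would run file_digest() on, say, "ushcn_2013_01.dat" when it
# arrives; verify() then compares the recorded digests across custodians.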
John

Matthew R Marler
January 25, 2013 8:46 am

Richard S Courtney: disagree with those who are in jail because they altered financial data from the past and failed to record the unaltered data.
Tell us who went to jail for “such” manipulations of the data as Zeke is responsible for.
Why not just admit that, as far as anyone knows, no one has gone to jail for hierarchical modeling of time series data of different lengths with possibly inaccurate models of the spatio-temporal correlations (and the other methods used to adjust past data for changes in the recording environment and instruments). Anthony Watts has written many good critiques of the temperature record, but he went a step too far with the allegation of criminality.
You and I are clearly reading the word “such” differently. You are treating it as though it does not even appear in the sentence: In the business and trading world, people go to jail for such manipulations of data.

Don Monfort
January 25, 2013 9:55 am

Outside the arena, people go to jail for brawls such as routinely occur in hockey games. But don’t get all nitpickety and start whining that I am accusing hockey players of being criminals. I ain’t. If I wanted to do that, I would say something like this: Hockey players are criminals.

January 25, 2013 10:46 am

Matthew R Marler:
re your post at January 25, 2013 at 8:46 am.
Give it up. The horse you are flogging is dead.
You were wrong. I repeatedly explained how and why you were wrong.
You have repeatedly ignored everything I have said and tried to flog the horse with a different whip.
Flogging the horse won’t work because it is dead.
Richard

john robertson
January 25, 2013 10:55 am

So it’s crickets… Thank you, Zeke Hausfather and Steve Mosher; there is no good reason for the drifting values of the past, other than that it suits your needs?
No ethical problem with rewriting history to suit?
Maybe CO2 emissions of today do cause past temperatures to fall; I will check my Grandfather’s diaries for the ice age he must have lived through.

January 25, 2013 12:07 pm

Is there a reasonable doubt that there has been corruption of data at NCDC and GISS?
I think it is reasonably established that there has been some kind of systemic and frequent manipulation of past data, always increasing the trend toward ever higher warming rates, and only in one direction.
The UAH and RSS data since the beginning of the satellite era provide a very critical contrast to the NCDC and GISS products; the satellite data add credibility to those who reasonably view that there is systemic and increasing data corruption at NCDC and GISS.
Any observation of apparent NCDC and GISS data corruption gives rise to the question: is any data corruption intentional? The 10+ years of acts and words from GISS leader Hansen clearly indicate an unbalanced leader who allows no doubt of his view’s truth, just as his GISS data manipulations have allowed only increasing revisionist warming rates in the data history.
Hansen’s unbalanced leadership is evidence for expecting data corruption toward a unidirectional extreme. He can remain as GISS’s unbalanced leader only because he represents a very fashionable, pseudo-scientific cause supported by activists in the media, in NGOs and in politics.
It looks as if Hansen is not merely a symptom of GISS’s numerous public displays and products that support the idea, held by many critical viewers, that GISS has a unidirectional data corruption problem; Hansen is the root cause of any such problem.
NCDC is subject to the same kind of analysis as I have just done on GISS.
Fraud? I do not care what one calls any confirmed malfeasance by NCDC and GISS; the most severe punishment is that the individuals involved are placed high on the list of history’s worst scientific pretenders.
John

January 25, 2013 4:57 pm

“Glacials last about 120,000 years, interglacials (like now) about 12,000 or about 1/10 as long.
The natural state of the earth is frozen into an ice age glacial.”
Wow. Thank you, E.M. Smith. That’s another piece of information that never seems to make it into msm articles or discussions about global warming.

Richard M
January 25, 2013 5:42 pm

Researcher bias is a well-known fact. It has been studied over and over again in the medical fields. Without proper controls a researcher will always bias results in the direction they believe is “true”. The fact that CAGW “believers” have been the ones adjusting the data is all the proof that is needed to KNOW with near certainty that the adjustments are too high.
If skeptics controlled the data we’d probably be adjusting it too far in the other direction. Reality exists somewhere in the middle. It may very well be that the raw data is as close to the truth as we’ll ever get.

MattS
January 25, 2013 5:50 pm

richardscourtney,
None are so blind as he who will not see.

Matthew R Marler
January 26, 2013 2:11 pm

richardscourtney: You have repeatedly ignored everything I have said
Not so. I have repeatedly addressed the word “such”, which to me, in that sentence, implies an accusation of criminal activity. I am not sure that we speak the same dialect of English, as you persistently deny the importance of that word altogether.

January 26, 2013 4:07 pm

Matthew R Marler:
At January 26, 2013 at 2:11 pm you say to me

richardscourtney:

You have repeatedly ignored everything I have said

Not so. I have repeatedly addressed the word “such”, which to me, in that sentence, implies an accusation of criminal activity. I am not sure that we speak the same dialect of English, as you persistently deny the importance of that word altogether.

It is absolutely “so” that you have repeatedly ignored everything I said. For example, this from me to you at January 25, 2013 at 4:27 am

It is simply a legal fact that “In the business and trading world, people go to jail for such manipulations of data.”
Don’t disagree with me about it: disagree with those who are in jail because they altered financial data from the past and failed to record the unaltered data.
It is also a legal fact that
people don’t go to jail for altering climatological data from the past and failing to record the unaltered data.
It is NOT libel, slander or hyperbole to state either or both of those legal facts.

I can only conclude that it is an act of desperation for you to now want to dispute the meaning of the word “such”. And your implication that the word “such” implies “criminal activity” is plain wrong.
The On-Line Dictionary says this of the word ‘such’.

such (sŭch)
adj.
1.
a. Of this kind: a single parent, one of many such people in the neighborhood.
b. Of a kind specified or implied: a boy such as yourself.
2.
a. Of a degree or quality indicated: Their anxiety was such that they could not sleep.
b. Of so extreme a degree or quality: never dreamed of such wealth.
adv.
1. To so extreme a degree; so: such beautiful flowers; such a funny character.
2. Very; especially: She has been in such poor health lately.
pron.
1.
a. Such a person or persons or thing or things: was the mayor and as such presided over the council; expected difficulties, and such occurred.
b. Itself alone or within itself: Money as such will seldom bring total happiness.
2. Someone or something implied or indicated: Such are the fortunes of war.
3. Similar things or people; the like: pins, needles, and such.
Idiom:
such as
For example.

Clearly, as I quote in this post, I stated that the “manipulations” are “of a kind” in that they each consist of altering data from the past and failing to record the unaltered data.
Simply, your post is yet another example of your ignoring everything I have written to you in your attempt to flog your ‘dead horse’. You are wrong: live with it.
Richard