NCDC: 'our algorithm is working as designed'

In a statement to PolitiFact today, NCDC said the following:

“… our algorithm is working as designed”

One wonders, though, about the sorts of things that have been found wrong in their USHCN data file, which is represented to the public as “high quality”.

Here are a few other things that worked as designed:

The Tacoma Narrows Bridge (1940):


 

Early NASA Rockets (1950’s-60’s):


 

The Titanic (1912): On 14 April, the RMS Titanic, described by its builders as practically unsinkable, sinks after hitting an iceberg.



 

The de Havilland Comet (1952): Twenty-one of these commercial airliners were built. Over the type’s entire service life the Comet was involved in 26 hull-loss accidents, including 13 fatal crashes which resulted in 426 fatalities. After the conclusive evidence revealed in the inquiry that metal fatigue concentrated at the corners of the aircraft’s windows had caused the early crashes, all aircraft were redesigned with rounded windows.



 

Mariner 1 (1962): The first US spacecraft dispatched to Venus drifts badly off course because of an error in its guidance system. The error is a small one — a wrong punctuation character (a hyphen) in a single line of code — but the course deviation is large. Mariner 1 ends up in the Atlantic Ocean after being destroyed by a range safety officer. It has been called “the most expensive hyphen in history.”

Launch of Mariner 1 atop an Atlas-Agena rocket

The Mars Climate Orbiter (1999):


The Mars Climate Orbiter was destroyed because it came in far too low over the planet during orbit insertion.

The primary cause of this discrepancy was that one piece of ground software produced results in an “English system” unit, while a second system that used those results expected them to be in metric units. Software that calculated the total impulse produced by thruster firings calculated results in pound-seconds. The trajectory calculation used these results to correct the predicted position of the spacecraft for the effects of thruster firings. This software expected its inputs to be in newton-seconds.
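To make that failure mode concrete, here is a minimal sketch (hypothetical function names, nothing like the actual JPL or Lockheed Martin ground software) of how an unlabeled number crossing a software interface poisons everything downstream: one routine reports impulse in pound-force seconds, the consumer assumes newton-seconds, and every trajectory correction comes out low by a factor of roughly 4.45.

```python
# Minimal illustration of the Mars Climate Orbiter class of bug: a bare number
# crosses an interface with no unit attached. Function names are hypothetical.

LBF_S_TO_N_S = 4.448222  # one pound-force second expressed in newton-seconds

def total_impulse_lbf_s(thrust_lbf, burn_seconds):
    """Producer: reports total impulse in pound-force seconds."""
    return thrust_lbf * burn_seconds

def delta_v_m_s(impulse_n_s, spacecraft_mass_kg=600.0):
    """Consumer: expects newton-seconds and returns a velocity change in m/s."""
    return impulse_n_s / spacecraft_mass_kg

impulse = total_impulse_lbf_s(thrust_lbf=20.0, burn_seconds=10.0)  # 200 lbf*s
wrong = delta_v_m_s(impulse)                    # number silently read as N*s
right = delta_v_m_s(impulse * LBF_S_TO_N_S)     # converted first: ~889.6 N*s
print(f"used {wrong:.3f} m/s, actual {right:.3f} m/s, off by {right/wrong:.2f}x")
```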

The discrepancy between calculated and measured position, resulting in the discrepancy between desired and actual orbit insertion altitude, had been noticed earlier by at least two navigators, whose concerns were dismissed.


 

The NCDC Climate at a Glance plotter for the public (2014):

While we are being told that “all is well” and that “our algorithm is working as designed”, it is easy to discover that if you try to plot the temperature data for any city in the United States (Dallas, Texas, for example), the plots you get for high temperature, low temperature, and average temperature are identical:

[Plots: Dallas Tmax, Dallas Tmin, and Dallas Tavg from Climate at a Glance, all identical]

Try it yourself:

Go here:

http://www.ncdc.noaa.gov/cag/

Change the settings to a statewide time series, pick a city, and it returns data where the min temp, avg temp, and max temp are all the same. It is unknown whether it is even the right data for the city.
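If you export the three series to CSV files, a few lines of script are enough to confirm the problem. This is only a sketch: the filenames are hypothetical, and the column handling assumes plain "date,value" rows rather than whatever header the Climate at a Glance download actually uses.

```python
# Check whether exported Tmax, Tmin and Tavg series are identical, date by date.
import csv

def read_series(path):
    """Read a hypothetical 'date,value' CSV export into a dict; skip headers."""
    series = {}
    with open(path, newline="") as f:
        for row in csv.reader(f):
            try:
                series[row[0]] = float(row[1])
            except (IndexError, ValueError):
                continue  # header or malformed row
    return series

tmax = read_series("Dallas_Tmax.csv")
tmin = read_series("Dallas_Tmin.csv")
tavg = read_series("Dallas_Tavg.csv")

identical = [d for d in tmax if tmax[d] == tmin.get(d) == tavg.get(d)]
print(f"{len(identical)} of {len(tmax)} dates have Tmax == Tmin == Tavg")
```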

h/t to WUWT readers Wyo_skeptic, Gary T., and Dr. Roy Spencer

135 Comments
Rhoda R
July 1, 2014 8:19 pm

Did anyone ask what the design goal was that these algorithms were designed to meet?

editstet
July 1, 2014 8:19 pm

Ah, well, that certainly simplifies things.

editstet
July 1, 2014 8:23 pm

Or maybe NOAA scientists took the song Night and Day too literally.

pokerguy
July 1, 2014 8:28 pm

“…working as designed.”
Well that’s a relief.

Abbott
July 1, 2014 8:33 pm

Working exactly as designed. Designed to provide evidence that global warming was an issue of the utmost political importance.

Bill H
July 1, 2014 8:35 pm

Who gave these guys permanent glue for their models?

temp
July 1, 2014 8:40 pm

So my question is… can’t someone whip together a quick study cherry-picking the warmest past temperatures and the coldest present ones? Take screen caps and download the data (since it changes nearly daily it shouldn’t be hard to mix and match), patch it all together, and say it’s the true data set, which according to these guys it is even if the numbers change daily, and then produce a study showing we’re all going to die from global cooling?
Since none of the adjustments are documented and they change nearly daily, it should only take a few weeks to patch together a global-cooling scare paper. How will they counter it when you produce screen caps and downloaded data? The only way I can think of is outright admitting they change the whole temp record on a near-daily basis. That type of admission is kind of hard to spin even in a short press release.
“The best way to fight the system is from inside the system”: while I generally don’t believe that, this does seem like a great way to rub egg on their faces using the algorithms they approve of…
Also, can we now move from simple incompetence to plain old premeditated malice?

AleaJactaEst
July 1, 2014 8:46 pm

good enough for Gubment work.

Rob Dawg
July 1, 2014 8:47 pm

And when they get a multimillion dollar grant to “update” the algorithm it will be considered to be working even better.
Of course the upshot will be a reason to stop any attacks while the studies are underway. Check back in 2019.

Louis
July 1, 2014 8:52 pm

If their algorithms are working as designed, I would sure like to hear their explanation for why temperature adjustments for Luling, Texas were necessary, and why their adjustments added 1.35 C to the annual temperature for 2013 compared to the actual readings for that station. If they can’t (or won’t) explain such adjustments, then there is no reason to believe or trust them ever again.

crosspatch
July 1, 2014 8:52 pm

This really saddens and angers me. Sad because as someone who has respected science, I see a perversion of it. Angry because the perverts are stealing from my children’s earnings and quality of life. Dr. Curry said it best:

When the adjustments are of the same magnitude of the trend you are trying to detect, then the structural uncertainty inspires little confidence in the trends.

george e. smith
July 1, 2014 8:52 pm

Well Algorithms ALWAYS work as designed.
Whether they are designed to do anything useful is a separate issue.

Rud Istvan
July 1, 2014 9:08 pm

Anthony, call them on the max/min/avg mistake. They might respond, since it is obviously and embarrassingly wrong.
You just called them on much bigger climate temp issues, and were ‘blown off’. Time to escalate. And not just here. “The algorithm does what we intended” is going to be one of those salient moments all round. What a lovely statement of intent for any court of law looking to convict.

Pamela Gray
July 1, 2014 9:12 pm

Now that’s what I call reproducible science. 4 marks!!! For hilarity!!!!

Doug Proctor
July 1, 2014 9:22 pm

So a plot of daily range for these cities should give a zero line ….?????

hunter
July 1, 2014 9:25 pm

Guns always shoot where they are pointed, too.

davidmhoffer
July 1, 2014 9:30 pm

Aw, you left out the Hubble Telescope. I think it a most appropriate example for no other reason than that every single component and sub-assembly worked exactly as designed. It was only the fully assembled device that failed to work properly.
I sense the same mind-numbing denial of the obvious in this case. The algorithm no doubt did work exactly as designed. That by no means proves that the design achieved an output commensurate with actual results, and, as the trends above show, it is quite possible to have an algorithm that works as designed yet, as part of a larger system, like the Hubble Telescope, produces information that is wildly and obviously wrong. Sadly, a quick look at the original photos from Hubble was enough to convince a rank layman that something was wrong. I don’t think a quick look by the MSM will have the same effect.

Konrad
July 1, 2014 9:40 pm

“… our algorithm is working as designed”
Indeed.
However I think it’s NCDC’s designs that are not going to work out…

KenB
July 1, 2014 9:47 pm

Our al gore rhythm works exactly as designed, so what if the dammed thing makes things warmer that is what the political warmers want and they pay the bills, so shut up and move on! Nothing to see here………

John F. Hultquist
July 1, 2014 9:58 pm

All users want is a simple temperature series.
Here’s an old favorite cartoon of “What the user wanted.”
http://pages.uoregon.edu/ftepfer/SchlFacilities/TireSwingTable.html

ntesdorf
July 1, 2014 10:09 pm

“… our algorithm is working as designed” SNAFU! It was obviously designed to spread the message that there really was Global Warming out there, just as the Warmistas had predicted.
“When the adjustments are of the same magnitude of the trend you are trying to detect, then the structural uncertainty inspires little confidence in the trends.”…Well said, Dr. Curry.

Louis
July 1, 2014 10:14 pm

george e. smith has a point. Saying that an algorithm works “as designed” doesn’t say anything meaningful about its accuracy or correctness. Those who designed the algorithm could have had a certain end result in mind without caring at all about it being correct.

Bryan A
July 1, 2014 10:19 pm

It is certain that the algorithm is functioning as designed, as is evidenced by the 3°F increase in temp since 1940. Any algorithm designed to indicate an increase in temperature that in turn does indicate said increase is obviously functioning properly.

david dohbro
July 1, 2014 10:19 pm

I just did the same for San Francisco, CA and got the same results: the average, max and min temperatures for the month of May are all the same (and for any other month): https://www.dropbox.com/s/l5n1byjsos070jh/NCDC%20SF-Temps.pdf
Then I went on to check Fresno, CA. Same deal here, now for the month of February….
https://www.dropbox.com/s/338iuwiu83d3ld2/NCDC%20Fresno-Temps.pdf
Very, very, very disturbing.
The question now is “what is the algorithm designed to do?” Is it supposed to be accurate, be wrong, be false, be corrupt, be misinforming, etc.?

John Coleman
July 1, 2014 10:25 pm

LOL Great work Anthony. You da man. See you in Vegas

MarkG
July 1, 2014 10:47 pm

Re the Comet: over 100 were built, and it continued in passenger service until the early 1980s. The Nimrod derivative continued in military service until 2011.
It’s also worth noting that the Comet would have worked better if built as designed. If I remember correctly, the measured stress around the window turned out to be about twice what their engineering model predicted, but the cracks started at rivet holes around the window, and the original design was for the windows to be glued in, not riveted.

AlecM
July 1, 2014 10:48 pm

You didn’t read it properly: we’re dealing with AlGoreithms working exactly as designed… :0)

jorgekafkazar
July 1, 2014 11:22 pm

Lysenko is alive and well, working at NCDC.

July 1, 2014 11:37 pm

When we had code that didn’t work but met the specs we used to call it “broken as designed”.

rogerknights
July 1, 2014 11:46 pm

“Our algorithms are working as designed.”

GM could say the same of its ignition switches.
What a question-begging thing to say.
They’re really asking for it.
(A congressional investigation.)

Slabadang
July 1, 2014 11:50 pm

It’s a political non-answer and at the same time a silent admission!
Whether it works is decided by reality, not by the design of the algorithm. It’s the lack of a real answer that reveals them. Don’t accept this stupid and dishonest statement!

Greg Goodman
July 2, 2014 12:03 am

NCDC: ‘our algorithm is working as designed’
This actually says nothing. Of course it works as designed; the question is whether it was designed correctly.
Whether that is poor wording or deliberate evasion (bureaucratic responding whilst not replying) is another question. Incompetence or misdirection.
But what they are in fact saying is that they see no problem with 40% of the data in the USHCN database being non-observational numbers.

Greg Goodman
July 2, 2014 12:06 am

AlecM says:
You didn’t read it properly: we’re dealing with AlGoreithms working exactly as designed… :0)
AlGoreithms, very astute! I think that will get reused.

Rabe
July 2, 2014 12:07 am

Come on, at least the average was calculated correctly. [/s]

Greg Goodman
July 2, 2014 12:08 am

AlecM says:
You didn’t read it properly: we’re dealing with AlGoreithms working exactly as designed… :0)
AlGoreithms, very good. I think that will get reused.

Ian W
July 2, 2014 12:10 am

It is of course purely coincidental that this ‘verified correct’ algorithm was put in place just prior to the National Climate Assessment. Perhaps its outputs were validated against its sponsor’s requirements too.

Greg Goodman
July 2, 2014 12:10 am

“The de Havilland Comet (1952): Twenty-one of these commercial airliners were built.The Comet was involved in 26 hull-loss accidents, including 13 fatal crashes which resulted in 426 fatalities. After the conclusive evidence revealed in the inquiry that metal fatigue concentrated at the corners of the aircraft’s windows had caused the crashes, all aircraft were redesigned with rounded windows.”
It was not the passenger windows that failed but a radar “window”. Lessons learnt were applied to passenger windows.

Greg Goodman
July 2, 2014 12:15 am

David Dohbro: “The question now is ‘what is the algorithm designed to do?’”
A detailed specification is now required: firstly to check whether what it is supposed to be doing is scientifically justifiable, and secondly so that someone outside the organisation can check whether the AlGoreithms are working. You know, the old validation bit.

Martin A
July 2, 2014 12:16 am

Twenty-one of these commercial airliners were built.The Comet was involved in 26 hull-loss accidents,
??

richardscourtney
July 2, 2014 12:18 am

temp:
You fail to acknowledge a classic ‘Catch 22’ when at July 1, 2014 at 8:40 pm you ask

So my question is… can’t someone whip together a quick study cherry-picking the warmest past temperatures and the coldest present ones? Take screen caps and download the data (since it changes nearly daily it shouldn’t be hard to mix and match), patch it all together, and say it’s the true data set, which according to these guys it is even if the numbers change daily, and then produce a study showing we’re all going to die from global cooling?

No, the fact of the frequent changes prevents publication of a paper which reports effects of the changes.
More than a decade ago I tried to publish a paper on the matter and it was blocked from publication by this problem. This Parliamentary Submission discusses an email (from me) leaked as part of ‘Climategate’ that complained at the blocking of the publication. (Its Appendix A is the email and its Appendix B is a draft of the paper).
It seems that if the corrupted scientific publication problem is to be corrected then that corruption needs to be addressed by political action in the US. The global warming scare was started by Margaret Thatcher, and now in the UK and much of the EU the scare is continued by the ‘rabid right’ supported by the political center and left, so there is no possibility of correcting these matters here. WUWT is US-based and the excellent Senator Inhofe has recently bolstered his political position.
Richard

Steve in Seattle
July 2, 2014 12:26 am

Well, I used the month of July, start year 1950 and an interval of one month, for Seattle. The plots are all the same, but in the table shown below each plot there appears to be something strange that I haven’t figured out: the rankings are the same for the first 7 years in my series (all I could screen capture), yet there are different anomaly numbers. I was going to blame the “plot” software as being FUBAR, however now I’m not so sure the problem(s) don’t lie deeper.
It’s late here, I’m tired, to be continued; perhaps others can pick their cities and help expand the query. Note to self: better investigate Dropbox.

Another Ian
July 2, 2014 12:34 am

Louis says:
July 1, 2014 at 10:14 pm
george e. smith has a point. Saying that an algorithm works “as designed” doesn’t say anything meaningful about its accuracy or correctness. Those who designed the algorithm could have had a certain end result in mind without caring at all about it being correct.
Louis
There’s “sleight of hand” and there’s “Sleight of tongue”

Mervyn
July 2, 2014 1:09 am

“… our algorithm is working as designed”
Yes, sure thing… as designed to give a rising temperature trend.
Look, to suggest an algorithm is working as designed says everything we need to know about the fudged temperature data we have been warned about again and again. As a reminder, check out the following link:

DirkH
July 2, 2014 1:09 am

They use a hash code of the station name as seed for a pseudo random number generator, add a constant upward drift, and generate your station-specific global warming data on the spot. I presume.
Because they had a harddisk failure and the real data is unrecoverable, as happens always in government institutions.

richardscourtney
July 2, 2014 1:14 am

Another Ian and Louis:
Louis emphasises an important point when he writes at July 1, 2014 at 10:14 pm saying

george e. smith has a point. Saying that an algorithm works “as designed” doesn’t say anything meaningful about its accuracy or correctness. Those who designed the algorithm could have had a certain end result in mind without caring at all about it being correct.

There is a clear meaning to the statement

our algorithm is working as designed

This can only mean that the algorithm is doing what it was “designed” to do and the OED definition of “design” is here and says

design
noun
1. A plan or drawing produced to show the look and function or workings of a building, garment, or other object before it is built or made:
‘he has just unveiled his design for the new museum’

In other words, the statement
“our algorithm is working as designed”
means
our algorithm is working as it was intended to function before it was made.
As I said on the other thread, I am astonished that any civil servant would provide so incompetent a reply as that! Perhaps NCDC needs to employ a British ‘Sir Humphrey’ to teach their spokespeople how to provide an answer which says nothing in so obscure a manner that few can understand it.
Richard

July 2, 2014 1:20 am

This is the second bug found in that Climate at a Glance site. A couple of weeks ago Paul Homewood found that they had forgotten to divide by 12 to get an average.
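That divide-by-12 slip is trivial to demonstrate; a toy sketch with made-up monthly numbers, not NCDC’s actual code:

```python
# Toy illustration of the earlier Climate at a Glance bug: an annual value
# computed by summing twelve monthly means without dividing by 12.
monthly_means = [34.1, 38.2, 47.5, 58.0, 67.3, 76.4,
                 81.0, 79.8, 72.6, 60.9, 48.7, 37.5]   # made-up monthly values

buggy_annual = sum(monthly_means)          # forgot the division: 702.0
correct_annual = sum(monthly_means) / 12   # 58.5
print(buggy_annual, round(correct_annual, 1))
```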

Stephen Skinner
July 2, 2014 1:28 am

Yes, absolutely get the point, but accuracy please.
“The de Havilland Comet (1952): Twenty-one of these commercial airliners were built.The Comet was involved in 26 hull-loss accidents, including 13 fatal crashes which resulted in 426 fatalities.”
There were 3 hull losses due to design with a loss of 87 lives. What’s your point here and where did you get these figures?

Stephen Skinner
July 2, 2014 1:48 am

Stephen Skinner says:
July 2, 2014 at 1:28 am
Yes, absolutely get the point, but accuracy please.
Alright. Shot with my own gun. Besides the 3 hull losses due to structural failure there were 2 others due to take-off characteristics, which led to a wing leading-edge redesign. What should be noted is the manufacturer worked hard to fix the design, which they did.

richardscourtney
July 2, 2014 2:10 am

Stephen Skinner:
I think you make an important point but you understate it in your post at July 2, 2014 at 1:48 am where you write

What should be noted is the manufacturer worked hard to fix the design, which they did.

The illustration goes to the crux of the issue raised by the document reported in the above article.
The aircraft manufacturer needed to know why their design caused planes to crash. Their failure analysis induced investigations which discovered that metal fatigue was a more serious problem than previously imagined and, very importantly, that the shape of components (in their case the corners of windows) could concentrate stress to very high values in small localities. These findings were of great importance to much engineering (i.e. not only aircraft engineering).
The reported document cites a question and provides this answer

Are the examples in Texas and Kansas prompting a deeper look at how the algorithms change the raw data?

No – our algorithm is working as designed. NCDC provides estimates for temperature values when:
1) data were originally missing, and
2) when a shift (error) is detected for a period that is too short to reliably correct. These estimates are used in applications that require a complete set of data values.

This reply to the asked question is a clear declaration that no investigation is intended and, therefore, nothing can be learned from the investigation.
The reply denies science.
Richard

cnxtim
July 2, 2014 2:26 am

CAGW defending the indefensible, as long as the budget holds out..

Non Nomen
July 2, 2014 2:44 am

They got what they asked for: BS.

July 2, 2014 2:57 am

“Working exactly as designed. Designed to provide evidence that global warming was an issue of the utmost political importance.”
Agree, to promote an ide(ology) and a narrow political Agenda.

michael hart
July 2, 2014 3:00 am

Planned Obsolescence was also a great track by 10,000 Maniacs.

DirkH
July 2, 2014 3:04 am

richardscourtney says:
July 2, 2014 at 12:18 am
“and now in the UK and much of the EU the scare is continued by the ‘rabid right’ supported by the political center and left ”
Well Richard some day you will have to explain to me what differentiates the EU “rabid right”, in your words, from social democrats (or INGSOC Fabians) – as they both want the exact same high tax, big government, mass immigration, transgendered, welfare state policies.

ROM
July 2, 2014 3:05 am

From what I have read and seen over the last few days, and this typical-of-today forked-tongue NCDC debacle caps it off, due to the constant and never-ending undocumented, unexplainable, near-randomised “adjustments” there is no longer any provable, long-term, on-the-ground, unadjusted, uncorrupted and untampered temperature data available that could provide a true indication of temperature trends, whether up or down, over the last century or over any shorter period within that century, both in the USA and globally.
When I think of those thousands of observers over the last hundred-plus years who faithfully collected that data through rain and snow and cold and heat and dust and flies and so on, in the belief that they were assisting in weather forecasting and recording the weather for posterity to examine and research, and then passing that data in its all-embracing truth on to future generations, I have nothing but total contempt and disgust for those so-called climate scientists who have been entrusted with the stewardship of that data and paid rather handsomely for doing so.
To deliberately alter and corrupt that data after all the work all those observers did down through the last dozen decades is truly a massive blight and indelible stain on both the climate alarmist scientists involved and on climate science in its totality, and a complete abrogation of climate science’s responsibility to ensure that the data in its pristine, uncorrupted and unaltered form is preserved for future generations to examine and delve into for their own research purposes.
It is a telling indication of the status and mentality of climate alarmist science that it should treat the prime data collected over all those years by dedicated observers with such utter contempt that it is prepared to stoop to outright corruption of that precious data just to satisfy its own extremely self-centred and totally selfish elitist agenda, without any thought for future generations and their needs and requirements for accurate historical data from the long-gone past.

Stephen Skinner
July 2, 2014 3:29 am

richardscourtney says:
July 2, 2014 at 2:10 am
I agree totally with what you are saying and the main point of this article. However, the deaths of 87 souls is a tragedy in itself, but it is not correct to say that over 400 deaths occurred because of the design, because they didn’t. The figure of 13 fatal hull losses and 426 fatalities caused by a design fault is not correct and is a dreadful slur on those who worked on this plane.

richardscourtney
July 2, 2014 3:45 am

Stephen Skinner:
re your post addressed to me at July 2, 2014 at 3:29 am.
For the record, yes, I agree.
Richard

tttt
July 2, 2014 3:49 am

About the max/min/avg issue: as you have probably noted, they changed to the nClimDiv dataset in March. And on their ftp site they clearly say they are planning to have max/min temps up in mid-June. Apparently these are still not up.
I really fail to see any issue in this?

richardscourtney
July 2, 2014 4:01 am

tttt:
I write to answer the question you ask at July 2, 2014 at 3:49 am where you write in total

About the max/min/avg issue: as you have probably noted, they changed to the nClimDiv dataset in March. And on their ftp site they clearly say they are planning to have max/min temps up in mid-June. Apparently these are still not up.
I really fail to see any issue in this?

The “issue” is that USHCN is publishing and using data which is plain wrong.
You say they claim an intention to provide corrected data in “mid-June”, but there is no stated (or obvious) reason why they would not now publish correct max., min, and average data. When taken in the context that they ‘adjust’ all their data each month, their claimed intention is rightly being treated with scorn especially when in answer to a question they have replied that they are doing what they intended to do.
Richard

A C Osborn
July 2, 2014 4:10 am

ROM says: July 2, 2014 at 3:05 am
The Raw data is not lost; there are plenty of records around with that data.
It is the Final data that is totally corrupted.

Bruce Cobb
July 2, 2014 4:18 am

Obamacare works as designed.
Lysenkoism worked as designed.
N*az*ism worked as designed.

Stephen Skinner
July 2, 2014 4:37 am

Bruce Cobb says:
July 2, 2014 at 4:18 am
Obamacare works as designed.
Lysenkoism worked as designed.
N*az*ism worked as designed.
I think this is stretching the imagination to link Obamacare with Nazism. Surely there are those who push AGW who will make the same loose association between unrelated events because it will get an emotional reaction that they want. Such as trying to associate those who are sceptical of AGW wildest claims with Holocaust deniers, creationists etc.
You may not like Obamacare but how do you link this to Nazism?
Apologies generally as this is off to the side of the general topic (again).

Bill Illis
July 2, 2014 4:53 am

Let’s just throw out all these adjustments and go back to the Raw record.
There are too many questions about the accuracy of the Final adjusted dataset.
Let’s start collating the Raw data in the correct way and start presenting that from now on to the public and to decision-makers with the proper caveats.

DC Cowboy
Editor
July 2, 2014 5:02 am

“Our algorithms are working as designed.”
This is a classic statement by the higher ups in a bureaucracy in hull down, protect the agency mode.
Just speculation on my part, but, I find it hard to believe that they could validate the entire record this quickly. What they appear to have done is checked the software and pronounced it ‘good’. I would further speculate that the folks Anthony was corresponding with did not put out this statement, nor would they agree with it. Having been an IT Security geek in the Federal Government I personally saw this behavior at several agencies trying to cover the fact that our systems had become a playground for Chinese Intel hackers (and Russian and Israeli and …).
Heck, I ‘designed’ a model that emulated airliners landing at an airport based on fuel usage, etc. to order the landings. It worked ‘as designed’. The only trouble was it was landing planes with negative fuel (which would have been a tad bit of a hard landing pretty short of the runway, I’m thinking). Again, the algorithm worked ‘as designed’; unfortunately the design was, as my Prof diplomatically put it, “inelegant”.
Other posters are absolutely correct “Our algorithms are working as designed.” is a meaningless statement that implies correctness where none is warranted.

ROM
July 2, 2014 5:07 am

A C Osborn says:
July 2, 2014 at 4:10 am
I have no computer or coding expertise, but I want to ask a couple of questions, particularly as NASA has already lost a heck of a lot of data from its early space and rocket days due to the cleaning out of rooms full of tapes and so on, as well as changes in data archiving codes, where the older codes could no longer be read because the machines and expertise for the formats used for past archiving were no longer around.
So how much of that original weather data as recorded by the observers, and even later electronically, is still actually around and accessible?
Has anybody ever taken a look at that side of the system?
Where is the hard data archived, or is it all now digitised?
If digitised, where is it stored?
[A few major companies have recently lost very large amounts of valuable archived material due to glitches in the Cloud system. And Phil Jones seems to have conveniently lost lots of inconvenient climate-related stuff as well.]
Where in the system is the backup data stored?
What systems are in place to guarantee and maintain both the future viability and the absolute accuracy of the original data, as written down by the observers, through various transitions in coding systems, and its ready availability for access by any potential user of the data?
What systems are in place that will accurately transpose archived and digitised data into new formats and archival systems as they appear in the times ahead?
And for that matter, how much of the archived data has already been lost because it can no longer be read due to the complete loss of the protocols and coding systems used in the original data archiving?
In view of the casual corruption of the data that we are witnessing on every front in the climate alarmist ideology, I have a very strong suspicion that not much attention has been paid to actually carefully archiving the original data for future generations after it is downloaded into the NCDC’s and etc systems.
For another major and utterly inexcusable example of gross corruption of the original data as it was recorded by observers, duck over to Lucia’s Blackboard, “How not to calculate temperatures, part 3”, and have a look at:
Bob Koss (Comment #130615) June 27th, 2014 at 4:16 pm
and his comment on the Key West data

wwschmidt
July 2, 2014 5:30 am

Just like the IRS is working exactly as designed.

Greg Goodman
July 2, 2014 5:39 am

“… our AlGoreithm is working as designed”
quote of the week ( year ) . Priceless.

Catcracking
July 2, 2014 5:40 am

The design goal was to show a temperature rise so we could get our Bonus.

EternalOptimist
July 2, 2014 5:48 am

Is it time for us to start talking about calculated temperatures when talking to the warmistas?
i.e. ‘if the calculated temperatures are as you say, then your prediction may well be correct,
but with the measured temperatures, there is no chance’

KevinM
July 2, 2014 6:01 am

temp says…
“So my question is… can’t someone ”
Why can’t you? You have time to post blog comments. Go do it.

ROM
July 2, 2014 6:07 am

What in the heck are all these crazy, mostly irrational adjustments to the temperature record supposed to prove or achieve in any case?
The whole NCDC temperature adjustment debacle is taking on the appearance of a bunch of self-indulgent mad scientists messing with some numbers to no known purpose or benefit to anybody, just to prove to themselves that they are smarter than the ordinary guy who has to pay for this stupid scam.
The only purpose for all these temperature adjustments is for the climate modellers to have a standard temperature to initiate their models, and for climate scientists to use up whole forests of trees writing papers on how the temperature has risen 0.05 degrees in the last ten years, leading to a chance of catastrophic warming by 2221.
And that’s about as useless a bit of data tampering as it is possible to achieve, given that the climate models have yet to predict anything that has any relationship at all to how the real climate has actually behaved.
This whole climate alarmist thing is taking on the characteristics of a gigantic wank by a group of academic elitist climate scientists who have become so enamored of their own personal prowess and intellect that they can’t see just how much increasing contempt they are creating amongst the common folk for their self-centred personal indulgence at those common folk’s expense.

North of 43 and south of 44
July 2, 2014 6:15 am

hunter says:
July 1, 2014 at 9:25 pm
Guns always shoot where they are pointed, too.
—————————————————————-
Even when empty!

July 2, 2014 6:18 am

They did not say what the design was.

July 2, 2014 6:27 am

Stephen Skinner says:
July 2, 2014 at 4:37 am
Bruce Cobb says:
July 2, 2014 at 4:18 am
Obamacare works as designed.
Lysenkoism worked as designed.
N*az*ism worked as designed.
I think this is stretching the imagination to link Obamacare with Nazism.
============================================================
Agreed. However is well into the Muslim Brotherhood – who DO have well documented links to the Nazis in the 30s and during WWII.
Just saying.

July 2, 2014 6:28 am

“however is well” … “however HE is well”

July 2, 2014 6:30 am

Catcracking says:
July 2, 2014 at 5:40 am
The design goal was to show a temperature rise so we could get out Bonus
==============================================
Not good enough. They should take a leaf out of the UK Met Office’s book: they award themselves huge bonuses regardless of how well their forecasting works. For the record, this is an organisation that for thirteen of the past fourteen years “forecast” (or, rather, “hoped”) UK temps to be higher than they actually were.
“Bonuses all round!!” is the cry, “after all, it’s only taxpayers’ money, and there’s always more where that came from”.
/rant

JeffC
July 2, 2014 6:33 am

I would love to see the business requirements document for those algos

Bruce Cobb
July 2, 2014 6:46 am

@Stephen Skinner,
I made no such link, but you did. Interesting….

Latitude
July 2, 2014 6:52 am

Over the years….and no one has noticed?……bullcrap
“never attribute malice to what can be explained by simple incompetence.”

Steve in SC
July 2, 2014 7:08 am

“working as designed” eh. Mendacity on parade.

José Tomás
July 2, 2014 7:09 am

Watts, you are in Taranto’s column from yesterday:
http://online.wsj.com/articles/best-of-the-web-today-free-speech-movement-1404243186
Congrats, but we need more.

Genghis
July 2, 2014 7:09 am

Here is the thing: the Scientists are all being absolutely truthful, accurate and factual, just as Politicians are always absolutely truthful, accurate and factual. I am not being sarcastic; they really are absolutely truthful, accurate and factual. They defined the terms.
It really is us skeptics who are lying, sloppy, false and, most importantly, ignorant of the definitions.
For example, skeptics like to talk about absolute temperature. That’s nice, it sounds good, but it is absolutely meaningless. If I read a thermometer right now and it reads 82˚, is that the absolute temperature? No! All it takes to falsify it is someone somewhere else getting a different reading on their thermometer.
Boom! Just like that, skeptics are underwater swimming for their lives. So we skeptics scream out averages! Yes, averages are the absolute temperature! The Scientists just shake their heads sadly and calmly explain that it is the change in averages, anomalies, that are real, and they are right.
The skeptic’s head is spinning. “If absolute temperatures aren’t real, how do you get something that is real by averaging something that doesn’t exist?” “Simple,” the scientist explains, “we use the law of large numbers, statistics and computer modeling.”
“But what if your original data is incomplete or inaccurate?” the skeptic asks.
“It doesn’t matter,” the scientist replies. “Our methods always produce the same results. The errors are too insignificant to matter.”
The Scientists are right.

~FR
July 2, 2014 7:24 am

Why did they send their statement to PolitiFact?

Ed Zuiderwijk
July 2, 2014 7:28 am

“My software: no error”. Famous last words before your system crashes (or a paperclip appears).

José Tomás
July 2, 2014 7:29 am

Reposting to get Anthony’s attention 🙂
Anthony, you are in Taranto’s column from yesterday:
http://online.wsj.com/articles/best-of-the-web-today-free-speech-movement-1404243186
Congrats, but we need more.

beng
July 2, 2014 7:29 am

If you like your Al-gore-rithm, you can keep your Al-gore-rithm. Besides, what does it matter now?

Ron C.
July 2, 2014 7:30 am

What Goddard and Homewood and others are doing is a well-respected procedure in financial accounting. The Auditors must determine if the aggregate corporation reports are truly representative of the company’s financial condition. In order to test that, a sample of component operations are selected and examined to see if the reported results are accurate compared to the facts on the ground.
Discrepancies such as those we’ve seen from NCDC call into question the validity of the entire situation as reported. The stakeholders must be informed that the numbers presented are misrepresenting the reality. The Auditor must say of NCDC something like: “We are of the opinion that NCDC statements regarding USHCN temperatures do not give a true and fair view of the actual climate reported in all of the sites measured.”
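In software terms the audit Ron C. describes is a spot check. A hedged sketch, assuming you have already loaded the raw and published (final) monthly values for each station into two dictionaries keyed by station ID; how those are read from the USHCN files is left out.

```python
# Audit-style sampling: draw a random sample of stations and flag any whose
# published values stray from the raw source records by more than a tolerance.
import random

def audit_sample(raw, final, n_stations=25, tolerance=0.1, seed=0):
    """raw/final: dicts mapping station ID -> list of monthly values."""
    common = sorted(set(raw) & set(final))
    rng = random.Random(seed)
    sample = rng.sample(common, min(n_stations, len(common)))
    flagged = []
    for sid in sample:
        worst = max((abs(a - b) for a, b in zip(raw[sid], final[sid])), default=0.0)
        if worst > tolerance:
            flagged.append((sid, worst))
    return flagged  # stations where "the books" differ from the source records
```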

July 2, 2014 7:38 am

I’m going to assume something like the following has been done:
Find a continuously reporting, well sited station.
At the “raw data” level, remove this station from the process for a period of time representing a break in that station’s reporting.
Now do whatever process is done to the “raw data” and then run the “working as designed algorithm”.
How accurately did the final adjusted data match what that perfectly good station actually reported?
Just wonderin’.
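The test proposed above is easy to state in code. A minimal sketch, with infill_estimate standing in as a placeholder for whatever NCDC’s estimation step actually does (the real pairwise homogenization code is not reproduced here):

```python
# Withheld-station test: hide a known-good station's observations for a period,
# let the estimation routine fill them in, then measure how far the estimates
# land from what the station actually reported. infill_estimate is a placeholder.

def withheld_station_error(observed, infill_estimate, gap_start, gap_end):
    """observed: list of monthly values from a continuously reporting station.
    infill_estimate(series_with_gaps, i): returns an estimate for month i."""
    with_gaps = [None if gap_start <= i < gap_end else v
                 for i, v in enumerate(observed)]
    errors = [abs(infill_estimate(with_gaps, i) - observed[i])
              for i in range(gap_start, gap_end)]
    return sum(errors) / len(errors)   # mean absolute error over the withheld gap
```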

Dougmanxx
July 2, 2014 7:58 am

Here is the proper working of their software. I have taken data snapshots of a station in Oberlin, Ohio over the past several days. They are not changing the temperature monthly. It changes just about EVERY DAY. As an example, I give you the year 1936 in Oberlin, Ohio. Station number USH00336196. I picked 1936 because of the whole “hottest month ever” fiasco. I picked Oberlin, because it has a full and complete “record”, even though the station was officially closed in February of 2010, after reporting stopped in May 2009! Enjoy:
06_26_2014 USH00336196 1936 -678E -768E 379E 560E 1626 1922 2287 2253a 1948 1135 173 88a 0
06_27_2014 USH00336196 1936 -678E -768E 379E 560E 1626 1922 2287 2253a 1948 1135 173 88a 0
06_29_2014 USH00336196 1936 -700E -793E 357E 539E 1605 1901 2266 2232a 1927 1114 152 67a 0
07_01_2014 USH00336196 1936 -702E -795E 355E 537E 1604 1900 2265 2231a 1925 1112 150 66a 0
07_02_2014 USH00336196 1936 -698E -792E 358E 539E 1606 1902 2267 2233a 1928 1115 153 68a 0
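Comparing snapshots like these is easy to automate. A sketch, assuming each line follows the whitespace-separated layout shown above (station ID, year, then twelve monthly values that may carry trailing flag letters such as E or a):

```python
# Diff two snapshots of a USHCN monthly line for one station/year and report
# which monthly values changed between the two download dates.
import re

def monthly_values(line):
    parts = line.split()
    # parts[0] = station ID, parts[1] = year, parts[2:14] = twelve monthly values
    return [int(re.sub(r"[^0-9-]", "", p)) for p in parts[2:14]]

def changed_months(line_a, line_b):
    return [(m + 1, a, b) for m, (a, b) in
            enumerate(zip(monthly_values(line_a), monthly_values(line_b))) if a != b]

jun26 = "USH00336196 1936 -678E -768E 379E 560E 1626 1922 2287 2253a 1948 1135 173 88a 0"
jul01 = "USH00336196 1936 -702E -795E 355E 537E 1604 1900 2265 2231a 1925 1112 150 66a 0"
print(changed_months(jun26, jul01))   # every month of 1936 shifted between pulls
```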

patrioticduo
July 2, 2014 7:59 am

It’s very interesting that in carrying out post-incident root cause analysis of all of these errors from the past, it was public disclosure, openness and cooperation that were key to finding the sometimes incredibly small problem that resulted in catastrophic failure. But when Government is involved, it is not so much the root cause that is the biggest failure but the secrecy, protectionism and condescension with which bureaucracies act and operate. NCDC is no different. We must keep the pressure on or they will just move on.

Eliza
July 2, 2014 8:05 am

Anybody know what this is about?
http://www.cfact.org/2014/07/02/its-coming/
maybe FOIA? or just the NGIPCC conference?

LogosWrench
July 2, 2014 8:10 am

Working as designed or as directed?

TimO
July 2, 2014 8:25 am

“… our algorithm is working as designed”
Why, yes… it is following the political line and purpose…what a surprise.

Rod Everson
July 2, 2014 8:38 am

To those lamenting the corruption of temperature data, and especially to those wanting to preserve the original data:
John James Cowperthwaite, British civil servant, from Wikipedia:
“He returned to Hong Kong in 1945 and continued to rise through the ranks. He was asked to find ways in which the government could boost post-war economic outlook but found the economy was recovering swiftly without any government intervention. He took the lesson to heart and positive non-interventionism became the focus of his economic policy as Financial Secretary. He refused to collect economic statistics to avoid officials meddling in the economy.” (Emphasis added)
The result was, I believe, the fastest growing economy on the earth at the time, growth that went on for decades. Cowperthwaite was onto something. As others have stated repeatedly, this effort to determine a single temperature for the earth at any point in time is ridiculous on its face and, as Cowperthwaite indicated would happen, attempting to do so has led to damaging meddling in the economy.
I realize science requires data, and uncorrupted data at that, but there are always politicians hovering, looking to meddle at the slightest excuse. And when those same politicians are funding the science, the meddling is inevitable.

Steve Keohane
July 2, 2014 8:52 am

ntesdorf says:July 1, 2014 at 10:09 pm
You nailed my response with ‘SNAFU’, but then that is SOP for the gov’t.

Latitude
July 2, 2014 9:03 am

Dougmanxx says:
July 2, 2014 at 7:58 am
Here is the proper working of their software. I have taken data snapshots of a station in Oberlin, Ohio over the past several days. They are not changing the temperature monthly. It changes just about EVERY DAY.
====
100%…..Doug keep repeating that until people stop saying “monthly”….every time they pull a record, it changes it….that can even be several times a day

more soylent green!
July 2, 2014 9:21 am

“Working as designed” may not mean what you think it means. It means the algorithm works the way it was created to work. It does not mean it’s scientifically correct or it provides the correct results.
“Working as designed” means somebody created a list of the requirements and the software matches those specifications. Are the requirements correct? Are the results correct? Those are entirely different questions.
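A toy illustration of that distinction (the spec and code below are hypothetical and have nothing to do with NCDC’s actual software): the function passes its “as designed” test perfectly, yet the design gives a wrong answer whenever the neighbours used for infilling share a common bias.

```python
# "Working as designed" versus "working correctly". The function implements its
# hypothetical spec exactly (estimate a missing value as the mean of its
# neighbours), so the spec test passes. Whether the design yields a correct
# value is a separate question that the spec test never asks.

def infill(neighbour_values):
    """Spec: the estimate is the arithmetic mean of the neighbouring stations."""
    return sum(neighbour_values) / len(neighbour_values)

assert infill([10.0, 12.0, 14.0]) == 12.0   # matches the spec: "as designed"

true_value = 12.0
biased_neighbours = [v + 1.5 for v in [10.0, 12.0, 14.0]]  # shared warm bias
print(infill(biased_neighbours) - true_value)   # 1.5 too warm, test still passes
```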

Eliza
July 2, 2014 9:24 am

Goddard explains it all here in a very lucid, well-presented exposition. A very clear speaker; he should represent the skeptic side in the media.
https://soundcloud.com/jim-poll/wjr-2014-07-02-1025am-stevegoddard-readscience-proc02

Larry Ledwick
July 2, 2014 9:27 am

davidmhoffer says:
July 1, 2014 at 9:30 pm
Aw, you left out the Hubble Telescope. I think it a most appropriate example for no other reason that every single component and sub-assembly worked exactly as designed. It was only the fully assembled device that failed to work properly.

Chernobyl also worked as designed. When you take an inherently unstable reactor design and shut off safety systems and do things the manual tells you not to do, it melts down.
Every time your Windows 95 system blue screened and crashed, it was working exactly as designed. The problem was the design was faulty. I had a computer instructor years ago who when a student started complaining “It is not supposed to do that” the answer from him would be something like “well it obviously is supposed to do that. Now you need to find out why it is doing that rather than what you wanted it to do.”

Gary Pearse
July 2, 2014 9:28 am

This is pure civil service reaction to a major blunder, especially after they have been under considerable pressure from outsiders and government concerning the quality of the US temperature record. Having laid this bare, don’t let them get out of it with this bull. Let’s get this data-destroying algorithm out into the open too, analyze the heck out of it and lambaste it publicly! It should be revealed far and wide that the temp record of the US is changed every day!!! I suspect they have taken perfectly good stations out of service because they don’t show warming and replaced this data with their higher-temp-gradient fill-in and algorithmic warming gradualism.
Isn’t there some kind of auditor general or something to report all this to? No more mister nice guy, Anthony – these guys and their confreres have trampled all over your good faith before. This is an outrage – not science at work. And they can’t even take refuge in being incompetent now. I’m amazed that the algorithm itself hasn’t been presented at these House Committees on the topic. How on earth can anyone ever do a meaningful study if the data changes daily? I think the dismantling has to be monitored by independents and not let these guys “fix it” themselves.

Alan Robertson
July 2, 2014 9:33 am

Steve Keohane says:
July 2, 2014 at 8:52 am
ntesdorf says:July 1, 2014 at 10:09 pm
You nailed my response with ‘SNAFU’, but then that is SOP for the gov’t.
____________________
SNAFU carries a sort of incompetence implication, which is not the proper concept. What we are witnessing goes far beyond incompetence. This is malfeasance by design. Our US Gov’t. bureaucracy is corrupt from top to bottom and follows an agenda of increasing its own power over the people. This is no longer a left or right issue.

richardscourtney
July 2, 2014 9:34 am

more soylent green!:
re your post at July 2, 2014 at 9:21 am.
I considered what the statement means in this post on the other thread.
Richard

Tom J
July 2, 2014 9:34 am

rogerknights
July 1, 2014 at 11:46 pm
says:
‘“Our algorithms are working as designed.”
GM could say the same of its ignition switches.
…’
Notice the flood of GM recalls that are coming as we speak? And the fact that the federal government sold the last of its shares of GM stock in Dec. 2013? Coincidence? Possibly.

Crispin in Waterloo but really in Singapore
July 2, 2014 9:40 am

I rode on a Comet once. They were incredibly noisy. A family friend was killed on the way to Mauritius. Good riddance.
Ditto global warming alarmism eco-loons and all that that entails. My what a wonderful world this would be!
Next, the huge gap between rich and poor, an international tribunal to permanently fix the borders of all countries and an effective ban on the invasion of privacy by a State, anywhere,
Haba na haba ujaza kibaba. (Little by little you get what you want).

richardscourtney
July 2, 2014 9:42 am

DirkH:
I am writing to acknowledge that I read your post addressed to me.
I do not have to – and see no reason to think I will have to – explain elementary political philosophy to you. And I have no intention of attempting to overcome your prejudices which prevent you from understanding such things.
Importantly, at July 2, 2014 at 12:18 am in this post I made (and explained) an on-topic suggestion on who, how and where to inform if the problems with ‘the algorithm’ are to be addressed. Your response is off-topic nonsense.
Richard

Martin A
July 2, 2014 9:43 am

” A family friend was killed on the way to Mauritius. Good riddance.”
That’s an awful thing to say.

richardscourtney
July 2, 2014 9:48 am

DirkH:
I erroneously wrote my acknowledgement of your post addressed to me on the other thread. Sorry.
Richard

Kenw
July 2, 2014 9:59 am

Martin A says:
July 2, 2014 at 9:43 am
” A family friend was killed on the way to Mauritius. Good riddance.”
That’s an awful thing to say.
I think he was referring to the Comet.

Keith Sketchley
July 2, 2014 10:04 am

I think Rhoda R has the right question.
Besides the goal of the algorithm working as required to do the job, one method of getting there is detailed requirements. The process of making them should explore subtleties, and should look at the data from interfacing modules, in and out. That does require communication with other people, the lack of which a young engineer who was part of the panic team that fixed Obama’s medical software botch said was a big part of the cause of the botch; the team found many small things wrong.

Dan in Nevada
July 2, 2014 10:16 am

davidmhoffer says:
July 1, 2014 at 9:30 pm
“Aw, you left out the Hubble Telescope…every single component and sub-assembly worked exactly as designed…”
That’s not quite true. If I remember correctly, Kodak made a primary mirror that was perfect in every way. That one still sits in a warehouse somewhere. The government instead used a mirror made by a company not known for having any sort of expertise for precision large optics. Whether it was for affirmative action, gifting taxpayer’s money to a crony, or some other reason, it was definitely another case of the government screwing something up in a very expensive way. Like they always do. I’m sure glad they’re taking care of my health care needs.

Dan in Nevada
July 2, 2014 10:32 am

Stephen Skinner says:
“I think this is stretching the imagination to link Obamacare with Nazism…”
Obamacare and a very large percentage of our economy most certainly are run along National Socialist lines. You could just as easily say Fascist and maybe be a little less provocative. However, a fascist economy is characterized by nominally private companies/corporations being “guided” (coerced) by the government. Think Oskar Schindler and you’ll be pretty close. To pretend that health care under the ACA resembles the free market in any way is completely disingenuous.
I’ve heard it said that the upside of Fascism is that the people in charge know how to make the trains run on time. This aspect is what appealed to FDR when he fantasized he could remake the American economy in his own image. The downside, of course, is that they don’t know how to make the trains go where they’re needed.
If your point was that the comparison invoked the Holocaust, genocide, etc., then sure, that’s unfair and mean. But I didn’t take it that way.

Resourceguy
July 2, 2014 10:57 am

Pay no attention to those algorithms behind the curtain. The great and powerful Oz has spoken.

July 2, 2014 11:58 am

Dan in Nevada says:
July 2, 2014 at 10:32 am
“If your point was that the comparison invoked the Holocaust, genocide, etc., then sure, that’s unfair and mean. But I didn’t take it that way.”
OK. But running the trains on time, or healthcare for that matter, are not what jumps out at me when I hear the term Nazism.

Michael J. Dunn
July 2, 2014 12:15 pm

1) “Our algorithm is working as designed,” is a truism, not a defense. How could it do otherwise? The proper response is “So what?”
2) I recall from my aerodynamics classes that the Tacoma Narrows bridge was actually not built as designed. The roadbed width changed and the designed trusswork girders were replaced by I-beam plate girders. Because the transverse wind could not permeate through the plate girders, it shed von Karman vortices. The changed roadbed width caused the torsional resonance of the bridge to match the vortex-shedding frequency. It was a forced oscillator at its resonance condition.

Ron C.
July 2, 2014 12:47 pm

The algorithm is not the point.
The several USHCN samples analyzed so far show that older temperatures have been altered so that the figures are lower than the originals. For the same sites, more recent temperatures have been altered to become higher than the originals. The result is a spurious warming trend of 1-2F, the same magnitude as the claimed warming from rising CO2. How is this acceptable public accountability? More like “creative accounting.”
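The pattern Ron C. describes is straightforward to quantify for any one station once both series are in hand. A sketch with made-up numbers: raw and final are equal-length annual series, and the function fits a least-squares line to the final-minus-raw differences, so a positive slope is warming added by the adjustments themselves.

```python
# Trend introduced by the adjustments alone: fit a straight line to the
# (final - raw) differences for one station. Toy data below, not real USHCN values.

def adjustment_trend_per_century(raw, final):
    diffs = [f - r for r, f in zip(raw, final)]
    n = len(diffs)
    x_mean = (n - 1) / 2
    d_mean = sum(diffs) / n
    slope = (sum((i - x_mean) * (d - d_mean) for i, d in enumerate(diffs))
             / sum((i - x_mean) ** 2 for i in range(n)))
    return slope * 100   # degrees per century added by the adjustments

raw = [50.0] * 100                                       # flat raw record
final = [50.0 - 0.7 + 1.4 * i / 99 for i in range(100)]  # past cooled, present warmed
print(round(adjustment_trend_per_century(raw, final), 2))  # ~1.41 deg/century
```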

G. E. Pease
July 2, 2014 1:17 pm

I was a Spacecraft Navigation Engineer employed by JPL from 1965 through 1979. I want everyone to know that I was not employed there when either of the two JPL screw-ups mentioned occurred, but I did subsequently learn interesting details about them.
I worked on the successful 1965 Mariner 4 Mars flyby Navigation Team and was the Navigation Team Leader for the successful 1967 Mariner 5 Venus flyby:
http://www.jpl.nasa.gov/missions/mariner-5/
The clueless 1998 Mars Climate Orbiter Navigation Team Leader who ignored his team’s red-flag warnings was a PhD holder who was obviously not the right person to be in charge of dealing with real-world problems.
It should be noted, however, that all the information supplied to the JPL Mars Climate Orbiter team that used unlabeled English force units was produced by software from the prime spacecraft contractor (Lockheed Martin Astronautics) in violation of the NASA contract, which specified that only metric-system units were to be provided to JPL:
http://www.cse.lehigh.edu/~gtan/bug/localCopies/marsOrbiter
During Clinton’s and Obama’s administrations, I’ve been acutely aware of the malaise of cluelessness (or political deception?) that has become rampant in Government-controlled “Climate Science.”

July 2, 2014 1:22 pm

If the max/min/avg temps aren’t changing, why all the fuss about “Climate Change”? 😎

Ron C.
July 2, 2014 1:32 pm

We are learning from this that USHCN only supports the notion of global warming if you assume that older thermometers ran hot and today’s thermometers run cold. Otherwise the warming does not appear in the original records; they have to be processed, like tree proxies. Not only is the heat hiding in the oceans, even thermometers are hiding some!

Udar
July 2, 2014 1:41 pm

Given that answer (working as designed), can we finally dispense with the notion that all this is caused by simple incompetence? I think this answer pretty much proves actual malice.

Graeme W
July 2, 2014 1:55 pm

Many, many years ago, I attended an International Conference on Software Engineering. I can still remember one of the presentations where it was claimed that up to 90% of software faults are due to specification problems, not coding. This was in the context of mission-critical software, such as used in banks, aircraft, and nuclear reactors.
If the problem is one of specification, then ‘the code is working as specified’ (a paraphrase of what’s been stated in the press release) does not mean that code is working correctly.
Do the specifications really intend that temperature estimates should be supplied for stations that have been shut down? If so, isn’t that a flaw in the specifications?
Who has the specifications so they can be checked to see if they are valid? Only after that has been done should the software be checked to see if it meets the specifications.
As an aside, another point from the conference was that a lot of errors in coding from specifications arise from the use of ambiguous terms. For that reason some sectors have a dictionary with precise definitions and specifications are only allowed to use terms from that dictionary. This reduces the chance of misunderstanding (as occurred in Mars Climate Orbiter case mentioned in the post).

Stephen Skinner
July 2, 2014 2:00 pm

Anthony
This is my no. 1 go-to site for all information, news and suchlike related to meteorology and related topics. I trust that on balance the quality of posts will be high, with an emphasis on trying to find out what’s going on with all the research and AGW-related articles etc.
Therefore please correct the error in this article relating to the de Havilland Comet.
“The de Havilland Comet (1952): Twenty-one of these commercial airliners were built. The Comet was involved in 26 hull-loss accidents, including 13 fatal crashes which resulted in 426 fatalities”
This has been lifted directly from Wikipedia without any editing and implies that the design caused 13 fatal crashes and 426 fatalities. There were 5 crashes related to the design of the wing and fuselage, 4 of which were fatal and resulted in 110 fatalities. The plane was withdrawn from service and the design fixed. Once returned to service there were nine further accidents, 3 of them fatal, but they include flying into the ground, a bomb, unstable approaches/take-offs, and a faulty instrument, and they took place during the remainder of that plane’s career, much the same as any other plane. It first flew in 1949 and the last flight was in 1996.
REPLY: If you can provide a reference, I’ll happily do so. – Anthony

wayne
July 2, 2014 2:55 pm

From a systems analyst’s viewpoint, all algorithms work as designed and are so constructed, right or wrong. Is that not the problem? The design, and not the algorithm, since all algorithms work as designed?
What a meaningless response from NCDC!

July 2, 2014 4:08 pm

Stephen Skinner says:
July 2, 2014 at 2:00 pm
REPLY: If you can provide a reference, I’ll happily do so. – Anthony
Within the same Wikipedia article on the Comet there is a section ‘Early Hull Losses’ which describes the first issues with the wing when over-rotating on take-off (involving 2 planes) and 1 plane breaking up in a thunderstorm. The subsequent sections describe a further 2 losses due to structural failure. (http://en.wikipedia.org/wiki/De_Havilland_Comet)
This is as reliable as any, as it is much the same as the various books on the subject, and I can’t show you those. There is also this site (http://www.oocities.org/capecanaveral/lab/8803/fcometcr.htm#local), which itemises all hull losses for this type, which is actually 27 for its entire life.
Anyway, I don’t want to detract from the main point of this article, which is the poor quality and lack of attention to detail exhibited by the NCDC.

george e. smith
July 2, 2014 5:34 pm

“””””…..Dan in Nevada says:
July 2, 2014 at 10:16 am
davidmhoffer says:
July 1, 2014 at 9:30 pm
“Aw, you left out the Hubble Telescope…every single component and sub-assembly worked exactly as designed…”
That’s not quite true. If I remember correctly, Kodak made a primary mirror that was perfect in every way. That one still sits in a warehouse somewhere. The government instead used a mirror made by a company not known for having any sort of expertise for precision large optics. Whether it was for affirmative action, gifting taxpayer’s money to a crony, or some other reason, it was definitely another case of the government screwing something up in a very expensive way. …..”””””
Well that’s not quite true either.
Dan is quite correct; Kodak did make a spare mirror, which is believed to be essentially perfect. The original, and still in use, Hubble mirror is also essentially perfect; but it was made “perfectly” to the wrong prescription. I forget exactly how that came about, but the story is known. Also, I believe the current Hubble mirror was made by Perkin Elmer, and it would be a gross error to characterize PE as “a company not known for having any sort of expertise for precision large optics.”
Few companies know as much about that as Perkin Elmer does. I actually own an example of a PE precision large-optics system. It is an eight- or maybe ten-element Double Gauss aerial camera lens: 36 inch (915 mm) focal length, F/4 speed, and it took 24 x 24 inch pictures on infra-red roll film. So the effective aperture is 9 inches diameter, but the front and rear elements are 12 inches clear aperture. The rear element is connected to its neighbor by a corrugated metal bellows filled with argon. That allows focusing from quite close out to 65,000 feet. I have just the lens and its adjustable iris; the shutter was removed.
They removed the shutter for safety, since it is in effect a nine-inch-diameter guillotine that would cut your arm off if you fired it with your arm in it.
The piece I have weighs 300 pounds. It came out of a recon version of the B-47 SAC bomber.
I was going to make a large-field star camera out of it, by regrinding the surface that had the dichroic IR-pass filter on it and changing its curve to make it a photo-visual lens rather than IR.
I once set it up on the backs of a couple of dining room chairs, outside, and lay underneath it holding an eyepiece to gaze at the stars. Did I already mention it weighs 300 pounds?
Without knowing the incorrect prescription to which Perkin Elmer accurately built the Hubble prime mirror, it would have been impossible to design and make the corrector optics, installed later by astronauts, that restored Hubble to almost its original design resolution capability.

george e. smith
July 2, 2014 5:49 pm

The De Havilland Comet was an aircraft way ahead of its time. If it first flew in 1949, you can guess when it was designed. There weren’t too many military planes at the time that were pressurized and of aluminum stressed-skin design (the B-29, the B-36, and some German WW-II planes; the Ju-86B, I think, comes to mind), and the Comet flew higher than those. Aluminum metal fatigue was somewhat unknown when the Comet was designed. Few if any military aircraft saw the total number of pressurized flight cycles that eventually revealed the metal-fatigue problem.
De Havilland was the unfortunate trail-blazer for commercial aviation.
And subsequent to the string of mysterious crashes, a Comet was pressurized to destruction in a water tank (the same way the DOT tests scuba air tanks these days).
The principal beneficiary of the Comet disasters was a company called Boeing Aircraft, which moved into the commercial vacuum left by the Comet’s demise.

Anto
July 2, 2014 6:02 pm

Would love to know what Harry’s readme file says these days. Must be hysterical.

old44
July 2, 2014 7:58 pm

davidmhoffer says:
July 1, 2014 at 9:30 pm
Aw, you left out the Hubble Telescope. I think it a most appropriate example for no other reason that every single component and sub-assembly worked exactly as designed. It was only the fully assembled device that failed to work properly.
I sense the same mind-numbing denial of the obvious in this case. The algorithm no doubt did work exactly as designed. That by no means proves that the design achieved an output commensurate with actual results, and, as the trends above show, it is quite possible to have an algorithm that works as designed yet, as part of a larger system like the Hubble Telescope, produces information that is wildly and obviously wrong. Sadly, a quick look at the original photo from Hubble was enough to convince a rank layman that something was wrong. I don’t think a quick look by the MSM will have the same effect.
————————————————————————————————————————————
On a 12.5 short ton, 14 ft dia. telescope ground to an accuracy of 0.00000125″, the error was 0.00008″, due to a miscalculation of the effect of gravity. I am sure you could do better.
The Golden Rule of Metrology is:
Your work will always be criticised by someone who has trouble measuring the height of his children.

george e. smith
July 2, 2014 8:35 pm

“”””””…..Old44……”””””
“””””…..————————————————————————————————————————————
On a 12.5 short ton, 14ft dia. telescope ground to an accuracy of 0.00000125″ the error was 0.00008″ due to a miscalculation of the effect of gravity. I am sure you could do better…….”””””
Old44 might have pegged it correctly. I believe the error was something of that nature: the PE mirror was tested at surface gravity, and someone forgot it would be operating in (near enough) zero gravity.
I don’t recall exactly whose error it was, but I’m under the impression that PE built it to exactly what they were given.
And for the life of me, I can’t convert all those zeros to some fraction of some wavelength, so I can’t verify (or falsify) Old44’s numbers (a rough conversion is sketched just below).
It’s a shame it wasn’t feasible for astronauts to swap out the faux PE primary for the EK jewel.
Something about momentum, or some such pestilence. Maybe they should’ve asked MacGyver to figure out how to accomplish that switch.
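For anyone who doesn’t want to count the zeros by hand, here is a quick back-of-the-envelope conversion of old44’s inch figures into wavelengths of the 632.8 nm test light cited in the NASA report quoted further down the thread. It assumes old44’s numbers are as posted; nothing here is taken from NASA:

```python
# Convert old44's quoted figures (in inches) into wavelengths of the
# 632.8 nm HeNe test light cited in the NASA failure report. The inch
# values are old44's numbers as posted, not verified against any source.
INCH_TO_NM = 25.4e6          # 1 inch = 25.4 mm = 25,400,000 nm
TEST_WAVELENGTH_NM = 632.8   # HeNe laser line used to test the mirror

figures_in = {
    "claimed grinding accuracy": 0.00000125,
    "claimed figure error": 0.00008,
}

for label, inches in figures_in.items():
    nm = inches * INCH_TO_NM
    print(f"{label}: {nm:.0f} nm ≈ {nm / TEST_WAVELENGTH_NM:.2f} waves")

# Prints roughly: 32 nm ≈ 0.05 waves, and 2032 nm ≈ 3.21 waves
```

On those numbers the quoted error comes out at roughly 2 micrometres, a bit over three waves of surface departure; the report’s 0.4-wave figure below is an rms wavefront error, a different (and smaller-looking) measure of the same flaw, so the two are not directly comparable.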

CRS, DrPH
July 2, 2014 9:29 pm

“Our algorithm is working as designed….”
Check. http://www.gao.gov/key_issues/climate_change_funding_management/issue_summary
Keep the money flowing at all costs, folks.

You know, that guy...
July 2, 2014 11:04 pm

So it was designed to be Ass Backward???

ROM
July 3, 2014 1:25 am

From NASA’s
The Hubble Telescope –
Optical Systems Failure Report
http://www.company7.com/c7news/19910003124_1991003124.pdf
Extracted from the Executive Summary
The Board’s investigation of the manufacture of the mirror proved that the mirror was made in the wrong shape, being too much flattened away from the mirror’s center (a 0.4-wave rms wavefront error at 632.8 nm). The error is ten times larger than the specified tolerance.
&
The RNC [Reflective Null Corrector] was designed and built by the Perkin-Elmer Corporation for the HST Project. This unit had been preserved by the manufacturer exactly as it was during the manufacture of the mirror. When the Board measured the RNC, the lens was incorrectly spaced from the mirrors. Calculations of the effect of such displacement on the primary mirror show that the measured amount, 1.3 mm, accounts in detail for the amount and character of the observed image blurring.
No verification of the reflective null corrector’s dimensions was carried out by Perkin-Elmer after the original assembly. There were, however, clear indications of the problem from auxiliary optical tests made at the time, the results of which have been studied by the Board. A special optical unit called an inverse null corrector, designed to mimic the reflection from a perfect primary mirror, was built and used to align the apparatus; when so used, it clearly showed the error in the reflective null corrector. A second null corrector, made only with lenses, was used to measure the vertex radius of the finished primary mirror. It, too, clearly showed the error in the primary mirror. Both indicators of error were discounted at the time as being themselves flawed.
The Perkin-Elmer plan for fabricating the primary mirror placed complete reliance on the reflective null corrector as the only test to be used in both manufacturing and verifying the mirror’s surface with the required precision. NASA understood and accepted this plan. This methodology should have alerted NASA management to the fragility of the process and the possibility of gross error, that is, a mistake in the process, and the need for continued care and consideration of independent measurements.
The design of the telescope and the measuring instruments was performed well by skilled optical scientists. However, the fabrication was the responsibility of the Optical Operations Division at the Perkin-Elmer Corporation (P-E), which was insulated from review or technical supervision. The P-E design scientists, Management, and Technical Advisory Group, as well as NASA management and NASA review activities, all failed to follow the fabrication process with reasonable diligence and, according to testimony, were unaware that discrepant data existed, although the data were of concern to some members of P-E’s Optical Operations Division. Reliance on a single test method was a process which was clearly vulnerable to simple error. Such errors had been seen in other telescope programs, yet no independent tests were planned, although some simple tests to protect against major error were considered and rejected. During the critical time period, there was great concern about cost and schedule, which further inhibited consideration of independent tests.
The most unfortunate aspect of this HST optical system failure, however, is that the data revealing these errors were available from time to time in the fabrication process, but were not recognized and fully investigated at the time. Reviews were inadequate, both internally and externally, and the engineers and scientists who were qualified to analyze the test data did not do so in sufficient detail. Competitive, organizational, cost, and schedule pressures were all factors in limiting full exposure of all the test information to qualified reviewers.

ROM
July 3, 2014 2:05 am

De Havilland Comet DH 106 production lists show 117 airframes of all variants produced.
De Havilland Comet airliner crashes:
http://www.oocities.org/capecanaveral/lab/8803/fcometcr.htm#table2
20 crashes from June 1953 to January 1971, including:
3 aircraft lost to airframe fatigue and design faults
2 lost to a faulty airfoil leading-edge design, causing loss of lift from too sharp a pull-up in the final phase of take-off
2 instrument failures
1 hull loss from a wheels-up landing due to inadequate pre-landing checks by the crew (air traffic controllers in training)
1 bomb
11 due to pilot error

Alan Robertson
July 3, 2014 5:58 am

National Center for Data Control

John Hewitt
July 3, 2014 6:26 am

Where’s Mosh to defend the indefensible?

beng
July 3, 2014 6:36 am

***
John Hewitt says:
July 3, 2014 at 6:26 am
Where’s Mosh to defend the indefensible?
***
Conspiring w/Stokes.