ISO-8000 Data Quality – something climate science could benefit from

I received the following email in my inbox this morning, inviting me to attend the first ISO 8000 data quality conference. Looking at the membership directory for this group, I’m not at all surprised that NASA, NOAA, NCDC, NWS, GISS and others are not members, given the mess that the surface data set is in.

But this is exactly what is needed: better data quality control. We have ISO 9000 all over private industry, to make sure that products meet or exceed quality specifications. Yet even though our government has the Data Quality Act (DQA), which is supposed to cover things like climate data, the simple fact is that it is not enforced. And even when it is questioned, as I did last year by sending a letter to NASA (twice) regarding DQA issues, it was simply ignored.

If climatologists want people to trust the data they gather and present, having an ISO 8000 certification would go a long way towards providing assurance. Given that entire economies will be affected by policy based on the climate data that has been presented, wouldn’t it make sense to at least hold it to the same quality standard that private industry now embraces voluntarily? – Anthony


Dear Sir/Madam,

ECCMA is holding the first ISO 8000 data quality conference in Battle Creek, Michigan, the home town of the Defense Logistics Information Service (DLIS). With a packed two-day agenda that includes over twenty government and industry speakers, the conference is focused on the challenges and the rewards of developing and managing data quality. The conference is preceded on Tuesday, October 14th, by an ECCMA ISO 8000-110:2008 Master Data Quality Manager Certification course.

What differentiates success from failure is, more often than not, the quality of the data. Just ask those in the NASA control room on September 23, 1999, when they realized the loss of the $125 million Mars Climate Orbiter, or ask any business, large or small, about the “mistakes” that plague them on an almost daily basis; in the final analysis it is truly “all about the data”, or rather all about the quality of the data.

A committee of the International Organization for Standardization, better known as ISO, has risen to the challenge of developing an international standard that defines what is and what is not “quality data”. The first part in the ISO 8000 series is a foundational standard for master data, the data that describes individuals, organizations, locations, goods and services.

ISO 8000-110:2008 defines the basic requirements that must be met for master data to be considered quality master data, and of course if it is not “quality data” then it clearly must be something else. Unlike many other quality standards, ISO 8000-110:2008 specifically deals with those aspects of data quality that can easily be checked by a simple computer program. When a company claims that the data it is sending is “ISO 8000-110:2008 Quality Master Data”, verification is but a mouse click away.
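To make that concrete, here is a minimal sketch of the kind of automated check the paragraph above describes. The required fields, allowed values and rules below are invented for illustration; the standard itself defines requirements against an agreed data dictionary, not these particular rules.

```python
# A toy illustration of a machine-checkable data quality test, in the
# spirit of ISO 8000-110. The fields and allowed values are hypothetical.

REQUIRED_FIELDS = {"identifier", "description", "unit"}
ALLOWED_UNITS = {"degC", "degF", "mm", "hPa"}

def check_record(record):
    """Return a list of data-quality violations for one master data record."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append("missing required fields: %s" % sorted(missing))
    if record.get("unit") not in ALLOWED_UNITS:
        problems.append("unit %r is not in the agreed dictionary" % record.get("unit"))
    if not str(record.get("description", "")).strip():
        problems.append("empty description")
    return problems

sample = {"identifier": "ST-0001", "description": "daily max temperature", "unit": "degC"}
print(check_record(sample) or "record passes the toy checks")
```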

Commenting on the new standard, Mr. Peter Benson, the project leader for ISO 8000, said: “Complying with the requirements of ISO 8000-110:2008 is not difficult or expensive; it is simply common sense which, if consistently applied, will save us all a lot of time, money and of course a lot of unwelcome frustration”.

ISO 8000-110:2008 is based on the experiences of the Defense Logistics Information Service (DLIS) in building and maintaining world class databases of master data.

The conference is being held in Battle Creek, the home of the Defense Logistics Information Service (DLIS), a Defense Logistics Agency (DLA) activity (www.dlis.dla.mil), and celebrates the transition of the Federal Cataloging System and the NATO Codification System into an open public standard through the eOTD, ISO 22745, ISO 29002 and finally ISO 8000. The conference is a landmark event focused on the application of the new international standards for data quality, which define the process for the creation of quality data in the form of consistent and unambiguous descriptions of individuals, organizations, locations, goods or services.

For conference registration and updates please visit www.eccma.org

Peter Benson

Executive Director and Chief Technical Officer

Electronic Commerce Code Management Association (ECCMA)

Project leader for ISO 22745 (technical dictionaries) and ISO 8000 (data quality)

Bob B
September 23, 2008 9:44 am

Anthony, I totally agree. It appears “climatologists” don’t want to be held accountable. Having monitored and certified data and accounting practices would go a long way towards avoiding future “Hockey Sticks”. A few lessons in gauge R&R with data and measurements should be required for all climate scientists.

crosspatch
September 23, 2008 9:48 am

It has been my experience that government is expert at avoiding their own regulations and requirements that they impose on others. For example, in a contract let to a private company, they might require adherence to ISO-8000 data standards with ISO-9000 process standards yet exempt themselves from the same requirements. Government is very good at avoiding having to hold itself to any standard of accountability unless it also has the right to change those standards to fit the reality of the moment.

Steve Keohane
September 23, 2008 9:52 am

Anthony, it makes sense for this type of protocol to be followed. I was going to make a snide comment regarding the inability of the government or its institutions to follow such a logical approach. However, after working with some of the folks at NIST who were interested in my investigations in submicron optical metrology for a semiconductor firm in the early 90’s, I know that some do very good work. I had forgotten that after the past few years of examining this climatology mess. Thank you for all that you do.

UKIPer
September 23, 2008 10:00 am

OT: “Anyone who thinks global warming has stopped has their head in the sand”, from the truly professional http://www.metoffice.gov.uk/research/hadleycentre/news/warming_goes_on.html

Richard deSousa
September 23, 2008 10:11 am

Yeah, now if only pigs could fly… 😉

Bob B
September 23, 2008 10:12 am

UKIper: they only include UHI-infected surface measurements and play with timescales and slopes.

Les Johnson
September 23, 2008 10:42 am

UKIper: I notice that Hadley didn’t put the YTD 2008 data in. When I put in the Hadley 2008 projected temperature (from Jan 08) of 0.37, I get a negative slope since 1998.
Of course, there is no chance of the 2008 average ever coming close to 0.37.

terry46
September 23, 2008 10:44 am

Off topic, but something I’ve noticed is that when you see an article on the internet about global warming, they ALWAYS show the ice chunks melting in the Arctic and you ALWAYS see the polar bears. How old are these pictures, which look like the same pictures they were showing 10 years ago? Maybe they are from the movie The Day After Tomorrow. On the other hand, when there’s a story about record snowstorms or record cold, they hardly ever show a picture at all. It’s like they don’t want to show the effects that go against their beliefs.

Steven Hill
September 23, 2008 10:48 am

http://gmy.news.yahoo.com/v/9856612
Polar bears are in danger again as all of the Arctic ice melts

AnonyMoose
September 23, 2008 11:06 am

“Anyone who thinks global warming has stopped has their head in the sand” – not ISO 8000 certified. And they talk about 100 years but only show a graph from the 1975 warming onward.

Pieter F
September 23, 2008 11:14 am

UKIPer (10:00:20) — the “truly professional” what? Climatologists or propagandists? That piece, called “Global warming goes on”, is so corroded with hyperbole and cherry-picking of data as to be worthless in an empirical setting and frightening in the political setting.
1) The publication was done in August of 2008, but the graph used only went to 2006. The important and telling reversion to the mean seen from 2007 to 2008 was LEFT OUT (a propaganda technique known as card stacking).
2) A 33-year trend is insufficient to establish a climate trend, especially if the most recent data flattens the curve (unless, of course, the most recent data is left out). The text of the article made much of “longer-term changes,” but failed to use deeper historical data in its presentation. The graph would be much flatter (and less scary) if their data went back to 1930. The directly observed data is available, but the cooling of the late 40s into the 70s was LEFT OUT, apparently because it would not have made their graph appear so steep.
3) Hansen in 1988 used the 1951–1980 mean and projected anomalies forward; the metoffice publication used a 1961–1990 average and included 15 years of data within that average to establish the anomalous trend. I wonder what would happen to the presentation if they both followed the same mean? (See the re-baselining sketch after this comment.)
4) Hansen predicted a 1.1°+ global temperature rise from 1988 to 2008. IT DIDN’T HAPPEN, suggesting the models based on surface temps don’t work.
5) If data from the Eemian and the early 75% of the late Holocene interglacials are considered, the authors’ statement that “11 of the last 13 years were the warmest ever recorded” would stand out only as a moment of profound ignorance.
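On point 3, switching base periods is simple arithmetic. Here is a sketch of how an anomaly series computed against one reference period can be re-expressed against another, so the two presentations could be put on the same mean. The series below is synthetic; none of these numbers are real.

```python
# Sketch of re-baselining an anomaly series so that series built on
# different reference periods (e.g. 1951-1980 vs 1961-1990) can be
# compared on the same mean. Illustration only.

def rebaseline(anomalies, base_years):
    """Shift anomalies so their mean over base_years becomes zero."""
    offset = sum(anomalies[y] for y in base_years) / len(base_years)
    return {year: value - offset for year, value in anomalies.items()}

# synthetic series: a steady 0.01 degC/yr rise, zeroed on the 1951-1980 mean
series = {year: 0.01 * (year - 1965.5) for year in range(1951, 2009)}
vs_6190 = rebaseline(series, range(1961, 1991))
print(round(vs_6190[2008], 3))  # same trend, shifted zero point
```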

Pierre Gosselin
September 23, 2008 11:17 am

Speculation?
Svensmark’s “speculation” just happens to correlate awfully darn well with past events, and the phenomenon has been confirmed in laboratory experiments.
They don’t see any correlation because they refuse to see it, and will continue to do so unless we plunge into a new LIA.
Clearly these NASA scientists are under tremendous political pressure to either toe the AGW propaganda line or kiss their careers goodbye. What a shame.

Pierre Gosselin
September 23, 2008 11:22 am

Oops! Wrong place; the previous comment belongs elsewhere…

Leon Brozyna
September 23, 2008 1:23 pm

Why ever would folks such as NASA, NOAA, NCDC, NWS, or GISS want, on their own, to take part in such a program? I’m sure they view themselves as being standard setters and beyond such petty details.

Paul Shanahan
September 23, 2008 2:11 pm

Les Johnson Said: UKIper: I notice that Hadley didn’t put the YTD 2008 data in. When I put in the Hadley 2008 projected temperature (from Jan 08) of 0.37, I get a negative slope since 1998.
When I look at the 10-year data (Jan 1999 to Jan 2008), I get a negative slope too, of around -0.02.
However, August to August, I get a positive slope of around 0.08.
Unfortunately, the average (with 2008 averaged over 8 months) comes out at a positive 0.04. I realise that this is just preliminary, as the 2008 average is missing 4 months of data. Who knows what will happen by the year’s end.
I hate to say it, but based on the HADCRUT data, it looks like their claims are feasible.

Editor
September 23, 2008 2:19 pm

A slight re-phrasing of the metoffice press release…
==============================
Global prosperity goes on
23 September 2008
Anyone who thinks global prosperity has stopped has their head in the sand. The evidence is clear – the long-term trend in global prosperity is rising, and the stock market is largely responsible for this rise. Global prosperity does not mean that each year will be wealthier than the last; natural market forces will mean that some years will be much wealthier and others poorer…
In the last couple of years, the underlying prosperity has been partially masked by a strong sub-prime loan crisis. Despite this, 11 of the last 13 years are the highest stock markets ever recorded.

Les Johnson
September 23, 2008 4:58 pm

Paul: I am using HADCRUT3 yearly average. I put in the HADCRUT projected 2008 temp of 0.37.
If 2008 is not used, the slope is still positive from 1998. It does not go negative until the start date is moved up to 2001.
This is the data I use.
1998 0.600291
1999 0.354782
2000 0.307351
2001 0.424473
2002 0.484325
2003 0.50359
2004 0.493032
2005 0.530529
2006 0.463274
2007 0.41
2008 0.37
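An ordinary least-squares fit through those values, exactly as quoted, bears this out: the slope comes out at roughly -0.002 °C/yr with the projected 0.37 for 2008 included, and roughly +0.002 °C/yr for 1998–2007 without it. A quick sketch:

```python
# Least-squares trend through the HADCRUT3 yearly values quoted above.

years = list(range(1998, 2009))
temps = [0.600291, 0.354782, 0.307351, 0.424473, 0.484325, 0.50359,
         0.493032, 0.530529, 0.463274, 0.41, 0.37]

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

print(ols_slope(years, temps))            # ~ -0.0022: negative with 2008 = 0.37
print(ols_slope(years[:-1], temps[:-1]))  # ~ +0.0023: positive for 1998-2007
```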

MattN
September 23, 2008 6:20 pm

As someone currently going through ISO certification right this minute, I know a few rules of thumb.
1) Say what you do, do what you say.
2) Accountability, accountability, accountability…
3) Do not under any circumstances attempt to intimidate the auditors. When they say jump, you better have a document handy that says how high you will be jumping for them.


old construction worker
September 23, 2008 9:08 pm

‘Yet even though our government has the Data Quality Act (DQA) which is supposed to cover things like climate data,’
We’ve got to keep the pressure on them, just like Alaska suing over the polar bear issue.

Bruce Foutch
September 23, 2008 9:55 pm

Mr. Watts,
http://www.oilit.com/papers/Benson.pdf
My reading of the above linked overview of ISO 8000 reminds me of my ISO 9000 Lead Assessor training and how I quickly formed the opinion that ISO 9000 was not all that appropriate for government research projects, since it has more to do with supply chains where goods and services move from manufacturer to customer, and where conformance to ISO 9000 is customer driven.
My first thought from a brief look at ISO 8000 is that it really would not be the standard you would use for climate data quality assurance. ISO 8000 seems to be an XML-based supply-chain management standard, dealing with the codification of data that “describes individuals, organizations, locations, goods, services, rules and regulations” (from the link).
Sorry to run counter to you on this one. Maybe I will come around after I spend a little more time looking into the scope wording contained in the standard. I will order a copy tomorrow.

Bruce Foutch
September 23, 2008 10:20 pm

Perhaps we can put together something similar to this handbook for the quality control of climate science data:
http://www.who.int/tdr/publications/publications/pdf/glp-handbook.pdf

Bobby Lane
September 23, 2008 10:32 pm

Bruce makes a good observation. How do we know if the standard fits the practices? That we need a standard for climate data handling is beyond doubt, but we need the right standard too.
Even so, the data quality cannot be good if the siting of the temperature stations is not good. What you start with matters a lot. If that is off due to UHI or other siting anomalies, you’re off to a rocky start. And who knows how good the algorithms are that are used to adjust for siting anomalies. NOAA obviously does not check the quality of their sites to begin with, or Anthony would have no need of his Surface Stations project. And how do you use an algorithm that treats all the UHI-affected stations the same (homogenizing) when each station site is affected differently by a variety of things in its environment? Adjusting for the same conditions means you assume the same conditions exist, or that minor differences are negligible.

Ross Berteig
September 24, 2008 12:27 am

Re: Bruce Foutch (21:55:06) :
It might not be the right standard to describe the raw temperature data, but there is more to the temperature record than the temperature itself.
From (mostly) lurking here since the whitewash experiments began, it has become exceedingly clear that the surface station network could benefit from following better practices with what I would call their station metadata… but which could be called data that “describes individuals, organizations, locations, goods, …”. We see the issues in the fun people have even finding the stations at their documented locations. And don’t get me started on the number of unrecorded equipment changes and moves. Then there are the consumers of the data who apparently fail to do their own consistency checks, resulting in the issues identified over at Climate Audit with data series misplaced by thousands of miles when used by Mann et al. (a sketch of such a check follows this comment).
Better still would be to see the beginnings of a quality audit program of their own, not to mention a calibration program. Does anyone know the ages of the thermometers used at the Mohonk Lakes USHCN weather station, let alone when (or even if) they were last calibrated?
As full disclosure, I am married to an ASQ Certified Quality Auditor (she’s also a CQE and CQM, and has been certified as an ISO-9000 auditor) who is employed in the pharmaceutical industry. I’ve gained a fair amount of respect for proper quality systems and their auditors as a result.
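The kind of consistency check described above is cheap to automate. A hypothetical sketch follows; the station name, coordinates and the 10 km tolerance are all invented here, not taken from any real catalogue:

```python
# Hypothetical consistency check: flag any station whose coordinates as
# used in an analysis sit implausibly far from the catalogued location.
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, via the haversine formula."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(a))

catalogue = {"EXAMPLE STATION": (41.8, -74.2)}   # documented location (invented)
as_used = {"EXAMPLE STATION": (41.8, -104.2)}    # location attached to the series

for name, (lat, lon) in as_used.items():
    d = distance_km(lat, lon, *catalogue[name])
    if d > 10.0:  # arbitrary tolerance for this sketch
        print("%s: used and documented locations differ by %.0f km" % (name, d))
```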

Paul
September 24, 2008 4:00 am

Les Johnson Said: Paul: I am using HADCRUT3 yearly average. I put in the HADCRUT projected 2008 temp of 0.37.
I will have to check what I got as results when I get home again (at work at the mo). It was the first time I’ve put my own data together so there may be something dodgy in what I’ve done.

Retired Engineer
September 24, 2008 8:52 am

The stuffy definition of quality is conformance to specification. If Microsoft had a published spec that said Windows would crash less than 10 times per day and your system crashed 8 times, they could claim a high-quality product.
ISO 9000 is an audit trail. It says you have documented procedures for your production and can track everything through the process. It doesn’t mean your product or service is any good, but it is well documented.
ISO 8000 is similar. Your procedure for data collection is well documented and any exceptions must be noted. No late-night mysterious ‘adjustments’. You can still put measurement boxes in sewage treatment plants, but you must document it.
While this won’t assure good, accurate data, there would be well-defined procedures and policies concerning the data. Transparency.
That would be a big step in the right direction.
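A sketch of what documented, transparent adjustments could look like in practice: the raw reading is never overwritten, and every correction is appended to an audit trail recording what changed, why, by whom and when. The class and field names here are illustrative, not from ISO 8000 or any agency's procedures:

```python
# Illustrative audit trail for a measurement: the raw value is kept, and
# each documented adjustment records what changed, why, by whom and when.
from datetime import datetime, timezone

class AuditedReading:
    def __init__(self, raw_value):
        self.raw_value = raw_value
        self.adjustments = []  # append-only; nothing is ever overwritten

    def current(self):
        return self.adjustments[-1]["to"] if self.adjustments else self.raw_value

    def adjust(self, new_value, reason, operator):
        self.adjustments.append({
            "from": self.current(),
            "to": new_value,
            "reason": reason,       # the documented exception
            "operator": operator,
            "at": datetime.now(timezone.utc).isoformat(),
        })

reading = AuditedReading(22.4)
reading.adjust(21.9, "time-of-observation correction per written procedure", "jsmith")
print(reading.current())  # 21.9, with the full history in reading.adjustments
```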

Ron de Haan
January 23, 2009 10:11 am

Anthony, not as spectacular as this article, but in my opinion a possible second piece of the climate/weather puzzle:
“Climate records stretching back 5000 years seem to show a strong link between rainfall in the tropics and changes in the Earth’s magnetic field, according to new research”.
http://planetearth.nerc.ac.uk/news/story.aspx?id=296