From the National Physical Laboratory, a rather curious press release containing a fact I didn’t know: the gas calibration standard sample for measuring CO2 comes from an obscure location in the Rocky Mountains. One would have thought Mauna Loa would be the source. I’ve added a Google Earth map below to show the location. The only concern I have is that any synthetic standard used as a baseline can be subject to synthesis error.
NPL scientists blend synthetic air to measure climate change
New gas standard to meet increasing demand
Scientists at the National Physical Laboratory (NPL) have produced a synthetic air reference standard which can be used to accurately measure levels of carbon dioxide and methane in the atmosphere. This will greatly help scientists contribute to our understanding of climate change.
A paper published in Analytical Chemistry describes how researchers at NPL have created a synthetic gas standard for the first time, which is comparable to the World Meteorological Organisation (WMO) scale and can be quickly produced in a laboratory and distributed, meeting growing demand.
The bulk of demand for gas standards comes from atmospheric monitoring stations around the world. The data collected from these is important to our understanding of climate change.
To reliably compare the concentration of carbon dioxide and methane in air at different locations, and over time, a primary standard to which all measurements relate is required. We must be able to relate the measurements to a trusted base unit, so we can reliably compare measurements between London and Beijing, or between 1990 and 2014.
The current primary standards for carbon dioxide and methane are a suite of cylinders of compressed air captured from Niwot Ridge in Colorado and held at the National Oceanic and Atmospheric Administration (NOAA).
They are used to create secondary standards, which are used to calibrate the instruments that measure greenhouse gases around the world.
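As a rough sketch of what that calibration chain amounts to in practice (my own illustration, not NPL’s or NOAA’s procedure; the cylinder values and instrument responses below are invented), an analyser’s raw readings are fitted against reference cylinders of known mole fraction, and sample readings are then placed on that scale:

# Minimal sketch, assuming a simple linear instrument response.
# All numbers are hypothetical, for illustration only.
import numpy as np

# Assigned CO2 mole fractions of three reference cylinders (ppm)
reference_ppm = np.array([380.0, 400.0, 420.0])

# Raw analyser responses measured on those cylinders (instrument units)
instrument_response = np.array([379.2, 399.5, 419.9])

# Fit a straight-line calibration: reference = slope * response + offset
slope, offset = np.polyfit(instrument_response, reference_ppm, 1)

def to_calibrated_ppm(raw_reading):
    """Place a raw analyser reading on the scale defined by the references."""
    return slope * raw_reading + offset

# A sample air measurement, reported on the calibrated scale
print(f"Calibrated CO2: {to_calibrated_ppm(401.1):.2f} ppm")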
A new improved measurement technique – cavity ring-down spectroscopy (CRDS) – has resulted in a dramatic increase in the number of atmospheric measurements taken. As the requirement for data that is comparable to the WMO scale increases, there is a corresponding increase in the demand for comparable reference standards.
Supplying the demand for reference standards comparable to the WMO scale is becoming an issue. An infrastructure to disseminate reference standards prepared gravimetrically – i.e. by weighing the gas in the cylinder – that are traceable to the International System of Units (SI) offers a means of broadening availability. Such standards could avoid the cost and complexity of sampling air under global background conditions, which can only be done at remote locations.
NPL has developed a solution, producing a synthetic standard which can be used to calibrate carbon dioxide and methane measuring instruments. Rather than sampling air directly, NPL created the sample in the laboratory by carefully blending a mix of gaseous components found in air.
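The arithmetic behind “weighing the gas in the cylinder” is straightforward in principle, even if doing it to NPL accuracy is not. A minimal sketch, assuming ideal behaviour and using invented masses (not figures from the paper): the amount fraction of each component follows from the mass weighed into the cylinder and its molar mass.

# Rough sketch of gravimetric amount-fraction bookkeeping.
# Masses below are invented for illustration; molar masses are approximate.
molar_mass_g_per_mol = {
    "N2": 28.0134,
    "O2": 31.9988,
    "Ar": 39.948,
    "CO2": 44.0095,
    "CH4": 16.0425,
}

# Hypothetical masses weighed into the cylinder (grams)
mass_added_g = {
    "N2": 780.8,
    "O2": 209.5,
    "Ar": 9.34,
    "CO2": 0.620,
    "CH4": 0.00103,
}

moles = {gas: m / molar_mass_g_per_mol[gas] for gas, m in mass_added_g.items()}
total_moles = sum(moles.values())

for gas, n in moles.items():
    fraction = n / total_moles
    # Trace components come out around the few-hundred-ppm (CO2) and ppm (CH4) level
    print(f"{gas}: {fraction:.6f} mol/mol ({fraction * 1e6:.2f} ppm)")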
However, preparing reference standards synthetically presents a significant challenge. Industrially produced carbon dioxide has a different isotopic distribution to that of atmospheric air, which measurement instruments read differently.
Paul Brewer, Principal Research Scientist at NPL, said: “By using high accuracy gravimetry, we were able to prepare a gas mixture that accurately replicated the naturally occurring isotopic carbon dioxide. The samples were tested using NPL’s world leading measurement equipment and expertise, which demonstrated that the synthetic standard was comparable with the NOAA standard and suitable for use with the international measurement scale for atmospheric monitoring.”
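For readers unfamiliar with the notation: the isotopic composition of carbon is usually quoted as delta-13C, the sample’s 13C/12C ratio relative to the VPDB reference, in per mil. A small sketch of that bookkeeping (my illustration, not NPL’s method; the reference ratio and the delta values are approximate, commonly quoted figures):

# delta-13C <-> ratio conversion; all figures approximate.
R_VPDB = 0.011180  # approximate 13C/12C ratio of the VPDB reference

def delta_to_ratio(delta_per_mil):
    """Convert delta-13C (per mil vs VPDB) to a 13C/12C ratio."""
    return R_VPDB * (1.0 + delta_per_mil / 1000.0)

def ratio_to_delta(ratio):
    """Convert a 13C/12C ratio back to delta-13C (per mil vs VPDB)."""
    return (ratio / R_VPDB - 1.0) * 1000.0

atmospheric = delta_to_ratio(-8.0)   # background air CO2, roughly -8 per mil
industrial = delta_to_ratio(-40.0)   # CO2 made from natural gas, roughly -40 per mil

print(f"13C/12C, atmospheric CO2: {atmospheric:.6f}")
print(f"13C/12C, industrial CO2:  {industrial:.6f}")
print(f"Relative difference: {(industrial / atmospheric - 1) * 100:.1f}%")
print(f"Round-trip check: {ratio_to_delta(atmospheric):.1f} per mil")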
The research has demonstrated that air standards comparable to the WMO scale can be prepared synthetically with an isotopic distribution matching that in the atmosphere. The methods used can be replicated, leading to widespread availability of standards for globally monitoring these two high-impact greenhouse gases. For the international atmospheric monitoring community and for gas companies, this could solve the pressing supply issue.
The project has received widespread support from the atmospheric measurement community. Euan G. Nisbet, Foundation Professor of Earth Sciences at Royal Holloway, maintains an Atlantic network of greenhouse gas measurements. He says: “Standards are a critical problem in greenhouse gas measurement. Developing high accuracy reference standards of carbon dioxide and methane with international comparability, and traceability to the SI, will greatly contribute to our work, and to improving our understanding of how greenhouse gases affect the atmosphere.”
The full paper can be viewed here: http://pubs.acs.org/doi/abs/10.1021/ac403982m

Dudley Horscroft says:
February 27, 2014 at 6:02 am
Probably no problems with Nitrogen and Argon, plus most of the other mini-constituents, but one wonders how they manage to ensure that O16 and O18, C12 and C14 are present in the correct quantities. Also Hydrogen and Deuterium. IIRC the half life for Deuterium is very large, but that of C14 is of the order of about 5300 years. What about O18?
O16, O18, C12 are all stable isotopes and have no half-life.
I wonder if we could see a net CO2 sink if we had enough cold years in a row. We know that when it gets cold, CO2 enters the ocean, and when oceans get hot they emit CO2. So wouldn’t it be interesting to see a couple of years where more is absorbed than released? I am curious whether this is something we could or should expect. I would expect it gets more likely at higher CO2 levels, and once we start to curb our emissions, perhaps around the end of this century or the end of the next. I just hope we do not have to wait that long before the CO2 AGW house of cards falls.
@Crispin
Yes, I understand that, but I didn’t understand what WUWT means by “synthesis error” and why he’s worried about it. Once synthesised, the gas mixture is accurately measured, so it doesn’t much matter exactly what the concentrations are, only that you *know* what the concentrations are to a high degree of accuracy. Unless I’m missing something.
This sounds like a good way forward for improving testing.
However…
…does sort of imply that the measurement method assumes perfect mixing of isotopes across the Globe.
Which does raise the question of whether the claimed measurement accuracy has any real meaning.
From the abstract:
The reference standards developed here have been compared with standards developed by the National Institute of Standards and Technology and standards from the WMO scale. They demonstrate excellent comparability.
Would have been nice to see a quantitative comparison rather than a qualitative one.
It isn’t what they are doing that raises eyebrows. It is how they are talking about it.
Have a look.
http://maps.google.com/gallery/details?id=zttWLnOPAlrs.kQBKSYw5ok5U&hl=en
“Industrially produced carbon dioxide has a different isotopic distribution to that of atmospheric air”
Curious as to the difference in isotopes. It seems to me that CO2 condensed out of the atmosphere would have substantially the same isotopes as CO2 NOT condensed from the atmosphere. Extraction of CO2 from air is “not efficient” so it’s not used as an “industrial process”. However, it does work and I would suggest that having the correct isotopes would outweigh the increased cost over an “industrial process”.
Regards,
Steamboat Jack (Jon Jewett’s evil twin)
Dudley Horscroft:
“IIRC the half life for Deuterium is very large”
Infinite, as far as we know. It’s nonradioactive.
James Strom: “since nothing in the debate revolves around measurements of such precision? So I’m inclined to agree with the comment above that this may be a non-event.”
I don’t know how many different sorts of measurements this standard might be applied to, or the different calibration chains between the reference standard and the measurement device (e.g. direct calibration against the reference standard, or calibration of national standards against the reference standard, etc.), but the precision of any primary or reference standard tends to need to be a couple of orders of magnitude greater than the required precision of actual measurements.
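A minimal sketch of that point about precision (my numbers, purely illustrative): uncertainties at each link of a calibration chain combine in quadrature, so a reference standard that is roughly ten times better than the field instrument contributes almost nothing to the total.

import math

# Hypothetical standard uncertainties at each link of the chain, in ppm CO2
u_reference_standard = 0.01   # primary/reference gas standard
u_working_standard = 0.05     # lab working standard calibrated against it
u_field_instrument = 0.10     # field analyser calibrated against the working standard

# Combined standard uncertainty (root-sum-of-squares)
u_combined = math.sqrt(
    u_reference_standard**2 + u_working_standard**2 + u_field_instrument**2
)
print(f"Combined standard uncertainty: {u_combined:.3f} ppm")
# ~0.112 ppm: the chain is dominated by its weakest link, which is why the
# reference standard needs to be much better than the measurements it anchors.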
“And, by the way, I agree with you that measurements have to be improved over time, as has been done in sampling ocean temperatures or even in the quality of weather stations, but each change in technology also creates a challenge of harmonizing the old data with the new.”
Indeed – and from my observation it’s one of the things to which metrologists give great thought. When I had some involvement with NPL, about 20 years ago – when it moved from direct government operation to contractor operation – I discovered a member of staff, a scientist in his 90s, who still cycled to work every day and whose input was specifically valued because of his detailed knowledge of the history of improvements in some important standards.
I get the feeling the Mauna Loa hegemony is coming to an end. Perhaps this is a good thing.
“Standards are a critical problem in greenhouse gas measurement.”
They need standards to measure water vapor?
As rgb notes, it’s not really climate-news. It’s partly a technical note, just saying that someone has now taken the trouble to prepare such a standard.
And partly also a ‘business’ note because the increased number of people taking measurements means there is an increased number of people who might be willing to buy it.
So we now have artificial, synthesised air and computer models to tell us about the future climate. It will be like looking at a woman who’s had Botox and plastic surgery. Looks good, expensive to maintain, but we all know it’s not the real thing.
Originally from Colorado and been to that location. Here is a picture I snapped during the collection process….
http://www.huffingtonpost.com/2014/01/23/deer-farts-on-camera_n_4653026.html
Labs need standards. There should be nothing controversial about this.
What? Unicorn farts are not accurate enough? How about Gore Gas? It is likely to be very exacting having bubbled all the way through a Nobel prize recipient.
“… Developing high accuracy reference standards of carbon dioxide and methane … will greatly contribute to … improving our understanding of how greenhouse gases affect the atmosphere.” (Euan G. Nisbet <– really)
More precision in measuring a conjectured cause for which there is, so far, ZERO evidence of causation, does exactly NOTHING to improve understanding.
Either Nisbet is:
1. Lying or
2. Ignorant-to-the-point-of-scientific-incompetence.
Which is it, Nisbet? Are you a cr00k or a dope?
Chris4692 says:
February 27, 2014 at 9:33 am
“Labs need standards. There should be nothing controversial about this.”
Agreed
But, but, but, Finaglebean has assured us there is no question about the reliability of the official monitoring networks’ output, yet they didn’t even have a reliable standard to work against.
HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA
Janice Moore says:
February 27, 2014 at 10:54 am
With the current politico-science situation in AGW among funders, I have no problem with puffery for funding. Note that the understanding could go the way of less impact of greenhouse gases as well as more impact.
From this requirement to ‘increase’ the accuracy of atmospheric CO2/methane readings we can deduce that up till now ‘they’ haven’t had a ‘reliable’ measure of CO2/methane.
Gibby says:
http://keelingcurve.ucsd.edu/how-is-co2-data-processed/ (2013)
I suggest that everyone read the measurement and calibration procedures!
How we measure background CO2 levels on Mauna Loa (2008)
I’m not convinced this is a bad idea, having some kind of standard for the composition of “air”. After all, they’ve established Vienna Standard Mean Ocean Water (VSMOW) as a reference for the isotopic composition of water, used for temperature and other scientific measurements.
https://en.wikipedia.org/wiki/Vienna_Standard_Mean_Ocean_Water
After rising above the constitutions of modern western democracies, ISO Guide 34, ISO 17043 and ISO 17025 are a piece of cake for the AGW scientologists.
Makes me wonder what the estimated uncertainty of the reference standard is for the carbon dioxide and methane assays (measured in parts per million and parts per billion respectively), let alone the uncertainty caused by the size and homogeneity of the measurand, which is nothing less than the earth’s atmosphere.
It’s sobering to recall that AGW rests on knowing the atmospheric composition to that sort of accuracy, even for times before electricity was discovered.
Jon Jewett says:
February 27, 2014 at 8:32 am
Curious as to the difference in isotopes. It seems to me that CO2 condensed out of the atmosphere would have substantially the same isotopes as CO2 NOT condensed from the atmosphere.
Even condensing CO2 out of the atmosphere shifts the isotopic composition a little: the lighter isotopes condense faster than the heavier ones. Evaporating water and CO2 again favours the lighter isotopes somewhat over the heavier ones. The change is about -10 per mil from oceans to atmosphere and -2 per mil from atmosphere to oceans. For CO2 condensed directly out of the atmosphere I have no idea of the size of the shift, but as can be seen with condensing water vapour, it is not unimportant.
But if you make CO2 from natural gas (as is common for industrial use), the carbon isotopic composition is at -40 to -80 per mil. That doesn’t make much difference for the NDIR measurements used at most stations today, as the technique is not very sensitive to the absorption differences between isotopes, but as resolution increases with newer techniques, new and more accurate standards, even for the isotopic composition, may be needed.
Anyway, accurate measurements need calibration and cross-calibration, whether for blood tests or CO2 measurements. The higher the resolution of the measurements, the more accurately the calibration gases need to be known, and the more closely their isotopic composition needs to match that of what is being measured.
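To put a rough number on that last point, here is a simple two-component 13C mass balance (my sketch, with assumed values; the delta of a blend is approximately the mole-weighted mean of its parts):

def blend_delta13c(ppm_a, delta_a, ppm_b, delta_b):
    """Approximate delta-13C of a mixture of two CO2 sources (per mil)."""
    return (ppm_a * delta_a + ppm_b * delta_b) / (ppm_a + ppm_b)

# Hypothetical blend: 395 ppm of CO2 at a background value (~ -8 per mil)
# topped up with 5 ppm of CO2 derived from natural gas (~ -40 per mil)
delta_mix = blend_delta13c(395.0, -8.0, 5.0, -40.0)
print(f"delta-13C of the blend: {delta_mix:.2f} per mil")
# ~ -8.4 per mil: even a small fossil-derived top-up shifts the isotopic
# signature away from background air, which matters once analysers become
# sensitive to the isotopic composition of the calibration gas.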