New lab reference standard for CO2 and Methane

From the National Physical Laboratory, a rather curious press release containing a fact I didn’t know: the gas calibration standard sample for measuring CO2 comes from an obscure location in the Rocky Mountains. One would have thought Mauna Loa would be the source. I’ve added a Google Earth map below to show the location. The only concern I have is that any synthetic standard used as a baseline can be subject to synthesis error.

NPL scientists blend synthetic air to measure climate change

New gas standard to meet increasing demand

Scientists at the National Physical Laboratory (NPL) have produced a synthetic air reference standard which can be used to accurately measure levels of carbon dioxide and methane in the atmosphere. This will greatly help scientists contribute to our understanding of climate change.

A paper published in Analytical Chemistry describes how researchers at NPL have, for the first time, created a synthetic gas standard that is comparable to the World Meteorological Organisation (WMO) scale and can be quickly produced in a laboratory and distributed, meeting growing demand.

The bulk of demand for gas standards comes from atmospheric monitoring stations around the world. The data collected from these stations is important to our understanding of climate change.

To reliably compare the concentration of carbon dioxide and methane in air at different locations, and over time, a primary standard to which all measurements relate is required. We must be able to relate the measurements to a trusted base unit, so we can reliably compare measurements between London and Beijing, or between 1990 and 2014.

The current primary standards for carbon dioxide and methane are a suite of cylinders of compressed air captured from Niwot Ridge in Colorado and held at the National Oceanic and Atmospheric Administration (NOAA).

[Google Earth image: Niwot Ridge, Colorado]

They are used to create secondary standards, which in turn are used to calibrate the instruments that measure greenhouse gases around the world.

A new improved measurement technique – cavity ring-down spectroscopy (CRDS) – has resulted in a dramatic increase in the number of atmospheric measurements taken. As the requirement for data that is comparable to the WMO scale increases, there is a corresponding increase in the demand for comparable reference standards.

Supplying the demand for reference standards comparable to the WMO scale is becoming an issue. An infrastructure to disseminate reference standards prepared gravimetrically – i.e. by weighing the gas in the cylinder – that are traceable to the International System of Units (SI) offers a means of broadening availability. This could overcome the cost and complexity of sampling air under global background conditions, which can only be done at remote locations.
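The gravimetric approach described above – weighing each gas component into the cylinder – determines amount fractions from the masses alone. A minimal sketch of that arithmetic, with made-up masses for illustration and nothing drawn from NPL’s actual recipe:

```python
# Illustrative sketch (not NPL's actual procedure): computing the amount
# fraction of CO2 in a gravimetrically prepared mixture from the weighed
# masses of each component added to the cylinder.

# Hypothetical weighed masses (grams) and molar masses (g/mol)
components = {
    "N2":  (780.8, 28.0134),
    "O2":  (239.2, 31.9988),
    "Ar":  (13.3,  39.948),
    "CO2": (0.625, 44.0095),
}

# Convert each weighed mass to an amount of substance (moles)
moles = {gas: m / M for gas, (m, M) in components.items()}
total = sum(moles.values())

# Amount fraction of CO2, expressed in umol/mol (ppm)
x_co2_ppm = moles["CO2"] / total * 1e6
print(f"CO2 amount fraction: {x_co2_ppm:.1f} umol/mol (ppm)")
```

With these invented masses the result lands near current background levels (roughly 400 ppm); the real uncertainty budget would also include weighing precision and the purity of each parent gas.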

NPL has developed a solution, producing a synthetic standard which can be used to calibrate carbon dioxide and methane measuring instruments. Rather than sampling air directly, NPL created the sample in the laboratory by carefully blending a mix of gaseous components found in air.

However, preparing reference standards synthetically presents a significant challenge: industrially produced carbon dioxide has a different isotopic distribution from that of atmospheric carbon dioxide, which measurement instruments read differently.

Paul Brewer, Principal Research Scientist at NPL, said: “By using high accuracy gravimetry, we were able to prepare a gas mixture that accurately replicated the naturally occurring isotopic carbon dioxide. The samples were tested using NPL’s world-leading measurement equipment and expertise, which demonstrated that the synthetic standard was comparable with the NOAA standard and suitable for use with the international measurement scale for atmospheric monitoring.”

The research has demonstrated that air standards comparable to the WMO scale can be prepared synthetically with an isotopic distribution matching that in the atmosphere. The methods used can be replicated, leading to widespread availability of standards for globally monitoring these two high-impact greenhouse gases. For the international atmospheric monitoring community and for gas companies, this could solve the pressing supply issue.

The project has received widespread support from the atmospheric measurement community. Euan G. Nisbet, Foundation Professor of Earth Sciences at Royal Holloway maintains an Atlantic network of greenhouse gas measurements. He says: “Standards are a critical problem in greenhouse gas measurement. Developing high accuracy reference standards of carbon dioxide and methane with international comparability, and traceability to the SI, will greatly contribute to our work, and to improving our understanding of how greenhouse gases affect the atmosphere.”

###

The full paper can be viewed here: http://pubs.acs.org/doi/abs/10.1021/ac403982m


70 thoughts on “New lab reference standard for CO2 and Methane”

  1. Developing high accuracy reference standards of carbon dioxide and methane with international comparability, and traceability to the SI, will greatly contribute to our work, and to improving our understanding of how greenhouse gases affect the atmosphere.”

    and defraud the taxpayer of even more money

  2. My first thought is that I’m surprised that the reference standard isn’t coming from the NIST Standard Reference Sample Laboratory, especially considering that NOAA is involved, and therefore the program is under the umbrella of the US Department of Commerce.

  3. Probably no problems with Nitrogen and Argon, plus most of the other mini-constituents, but one wonders how they manage to ensure that O16 and O18, C12 and C14 are present in the correct quantities. Also Hydrogen and Deuterium. IIRC the half life for Deuterium is very large, but that of C14 is of the order of about 5300 years. What about O18?

    Thinks, wonders what the effect of someone creating ‘methane’ with C14 and Deuterium would be.

  4. Hmmm…combining this

    “A new improved measurement technique – cavity ring-down spectroscopy (CRDS) – has resulted in a dramatic increase in the number of atmospheric measurements taken.”

    with this

    Euan G. Nisbet, Foundation Professor of Earth Sciences at Royal Holloway maintains an Atlantic network of greenhouse gas measurements. He says: “Standards are a critical problem in greenhouse gas measurement. Developing high accuracy reference standards of carbon dioxide and methane with international comparability, and traceability to the SI, will greatly contribute to our work, and to improving our understanding of how greenhouse gases affect the atmosphere.

    and this

    The samples were tested using NPL’s world leading measurement equipment and expertise, which demonstrated that the synthetic standard was comparable with the NOAA standard and suitable for use with the international measurement scale for atmospheric monitoring.”

    one wonders as to the motivation and intended results of the new standard. I’ve always been, er, skeptical of using Mauna Loa as some sort of gold standard for CO2, especially as it’s a Volcano spewing all sorts of gasses, and (I would have thought) not exactly constant in its output. So, in that regard, more measurements from more places would be welcome. On the other hand, I get a “fox guarding the henhouse” (or maybe the inmates running the asylum :) ) feeling, going back to “you seen one, you seen Yamal”….

    Maybe in addition to the surfacestations project there should be a GGM (greenhouse gas measurement)stations project? Something along the line of watching the watchers?

  5. “The only problem I have is that with any synthetic standard used as a baseline, it can be subject to synthesis error.”

    Eh? I’ve no idea what you mean by “synthesis error”, but the point of a reference standard is that the concentration is known very accurately, usually measured with several techniques to confirm the concentration (+/- error). So a synthetic mixture will be very precisely and accurately measured, and this is what is reported as the ‘standard reference values’ of the mixture. It’s used to calibrate other instruments, since one can check the results of the analysis with the standard reference values.

  6. I’d like to see the increase in accuracy, under the new standard, quantified. This PR reads as if measurements under the old standards were untrustworthy.

  7. “However preparing reference standards synthetically presents a significant challenge. Industrially produced carbon dioxide has a different isotopic distribution to that of atmospheric air, which measurement instruments read differently.”

    So then how do measurement instruments read CO2 from the burning of fossil fuels, as compared to that of “atmospheric air” and “natural’ sources of CO2 such as this reference location in Colorado?

  8. Since one major source of error is instrumental calibration error (which can easily be systematic) this is a generally good thing although as noted above, I would have expected it to be done already by NIST since that is precisely their raison d’etre. But however, whoever, it is a good idea. It isn’t really “climate news”, though, because one suspects that measuring naked CO_2 concentration to within a few (relative) percent just isn’t that difficult, and whether CO_2 at a location is 397 or 400 ppm simply isn’t relevant to the ongoing debate. The isotopic fractionation might BE relevant, but that’s a lot more difficult and probably isn’t routinely done by the devices that track CO_2 globally — it is more likely done at only a few locations and relies on the “well-mixed” hypothesis which is, for that matter, probably not a bad one outside of the immediate vicinity of CO_2 sources.

    A non-event, but good for them.

    rgb

  9. I am not sure that I fully understand this press release, since the regularly used method for analytically determining CO2 concentrations is FTIR with a set of in-house prepared standards, with the reference standard acting as the calibration verification (pretty much what happens every day in vehicle emissions inspection stations). So in some highly unique situations where you have to establish a man-made CO2 molecule vs. a naturally occurring one, I guess this standard is of use, but CRDS is too new and specific for the average sampling business to run out and start playing with one, when I am sure an FTIR is several orders cheaper, with well-established procedures, and more reliable in the field.
    Here is a nice summary of CRDS: http://www.uvm.edu/~jgoldber/courses/chem226/Lafranchi_CRDS.pdf

  10. Since CO2 has no effect on climate, and varies according to time of day, time of year, where the wind blows from, what is the point? Andrewmharding is right, another clever wheeze to defraud the taxpayer.

  11. Perhaps this is due to my great distrust of our government and everything it does, but can this synthetic mix be fudged so that real air samples match what they want them to be rather than what they really are?

  12. how interesting. it’s bed-time for me, but Serco is managing NPL til March:

    Dec 2012: Financial Times: Gill Plimmer: Time called on Serco’s NPL contract
    Serco, the FTSE 100 outsourcing company, has lost its contract to run the National Physical Laboratory – which built the first atomic clock – after the government said it would seek academic partners to take over the centre instead.
    The laboratory has been managed by Serco on a profit-share basis since 1994. But David Willetts, science minister, has decided that the government can “encourage greater interaction with businesses” by ending the contract in March 2014, when the company’s 17-year tenure comes to an end…

    http://www.ft.com/cms/s/0/19e53b8a-4085-11e2-8f90-00144feabdc0.html

    NPL: What is NPL?
    The National Physical Laboratory is operated on behalf of the National Measurement Office by NPL Management Limited, a wholly owned subsidiary of Serco Group plc.
    The Minister of State for Universities and Science, David Willetts, has announced his decision on NPL’s future operations from 2014 when the current arrangement as a Government Owned – Contractor Operated facility by Serco Group plc comes to an end.

    http://www.npl.co.uk/about/what-is-npl/

    this is the purpose:

    NPL: FAQS: Answers to a range of questions that we have been asked about the Centre for Carbon Measurement
    Equally carbon offset credits depend on confidence that the credit being purchased represents a genuine reduction of a tonne of carbon dioxide emissions. Implementing such policies and schemes will require increasing accuracies of physical measurement, as greater capital inputs and financial flows rely on the outcomes.

    http://www.npl.co.uk/carbon-measurement/faqs/

    check out Serco’s Wikipedia pages (more than one). loved this: Serco is one of the 55 contractors hired by United States Department of Health and Human Services to work on the Healthcare.gov web site.

  13. “This will greatly help scientists contribute to our understanding of climate change.” Good, cause “our current understanding” is way the hell off!

  14. James Strom: “I’d like to see the increase in accuracy, under the new standard, quantified. This PR reads as if measurements under the old standards were untrustworthy”
    That’s not how metrology works. Existing measurements aren’t “untrustworthy”, they’re less precise. Measurement users need increasingly precise measurements which themselves need increasingly precise standards/reference standards – look at the changes/improved precision of the time and length standards over the last century – driven by need and technical change.

    To those of you scratching your heads over why NPL is doing this rather than NIST: NIST isn’t the only world-class measurement standards laboratory, and NPL has a high standing in gas analysis and in the production of reference standards – and the SI isn’t a US institution, it’s the Système International.

    And this isn’t a sceptic/warmist issue – it’s just science and technology moving along.

  15. It’s one thing to create primary standards for individual gas components… but trying to blend a primary standard of a blend of gasses, and which is also isotopically standardized among several different components simply boggles the mind. I suppose then that you must presume that the standard will change over time, and you must know the relatively precise moment in time that the blend was prepared.

    Then, if you want to really get weird about it, try to consider how storage conditions may affect those standards (temperatures, natural or artificial shielding against stray neutrons, etc).

    I’m just not sure that such a standard atmosphere gas blend that is both chemically and isotopically correct could be prepared as an acceptably stable standard. And if it can be prepared, is it worth the cost vs. individual standards of the components you are measuring?

  16. My biggest concern is that they will be claiming high precision for these numbers, even though they are distinct from the temperature measurements, which are the real issue.

  17. Will they detect a massive spike in CO2 levels then realise that they have to adjust the temperatures up from 1998 to compensate meaning that it is warmer than ever? /sarc

  18. …. or will it detect a massive fall meaning they have found the reason for the current Hiatus, pause or whatever they are calling it these days //sarc

  19. @Kit Carruthers

    “Eh? I’ve no idea what you mean by “synthesis error”, but the point of a reference standard is that the concentration is known very accurately, usually measured with several techniques to confirm the concentration (+/- error).”

    It is pretty easy to get a reading to within 1 ppm. The issue with calibrating is all the equipment is calibrated at once (or can be) so having ‘standard air’ with known concentrations of everything saves a lot of time and overcomes certain issues to do with piping and gases ‘hanging around’ in the intake system. It is laborious to do a zero and span calibration using a different source for each gas on several machines. Easier to run a standard gas to everything (as if it was from the incoming air supply outdoors) then train all the instruments at the same time. This is done several times a day at a place like the atmospheric monitoring station at Cape Point, South Africa. Top end instruments do this automatically either after a certain time or if the detected concentration changes a preset amount.

    rgbatduke – the isotope ratios are definitely tracked. That is where the money is.

    Instrument precision is amazing these days. But they require calibration and linearization using three gas concentrations in many cases: zero, middling and top (‘span’) of range.

    Many gases are monitored – far more than CO2 and methane and mercury and SO2. There are enough PhD’s in it for all.
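The zero/middling/span procedure this comment describes amounts to fitting a response curve through three known reference points and then reading unknown samples off that curve. A rough sketch of the idea, with invented calibration values (the voltages and concentrations here are made up for illustration):

```python
# A minimal sketch of three-point (zero / middling / span) calibration:
# fit a quadratic response curve through three reference-gas points,
# then use it to convert raw instrument readings into concentrations.
import numpy as np

# (instrument reading in volts, known reference concentration in ppm)
cal_points = [(0.012, 0.0), (1.87, 250.0), (3.71, 500.0)]

volts = np.array([v for v, _ in cal_points])
ppm   = np.array([c for _, c in cal_points])

# Quadratic fit: concentration as a polynomial in detector voltage.
# Three points determine the quadratic exactly.
coeffs = np.polyfit(volts, ppm, deg=2)

def reading_to_ppm(v):
    """Convert a raw detector voltage to a calibrated concentration."""
    return float(np.polyval(coeffs, v))

print(reading_to_ppm(2.95))  # a reading between the mid and span points
```

Running the three reference gases through all instruments at once, as the comment notes, lets every analyser be trained against the same curve at the same time.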

  20. I find it incredible that Paul Brewer, Principal Research Scientist at NPL, said: “By using high accuracy gravimetry, we were able to prepare a gas mixture that accurately replicated the natural occurring isotopic carbon dioxide.”
    Gravimetry is the measurement of the strength of a gravitational field.
    The preparation of highly accurate standard gas mixtures requires *gravimetric methods*, preferably with traceability to NIST mass standards. It involves weighing compressed gas cylinders to a high degree of precision. As of 1990 that precision was ±0.04 %. US Bureau of Mines Report of Investigations RI 9312 (1990) has the details.

  21. Climate science must be the only science in the world that purports to measure something, and then 30-60 years after it begins measuring it, decides that maybe it ought to have some sort of measurement standard applied. What a joke!

  22. rgbatduke is right about this. It is an important event for those who are active in analysis; and it is a small, but important part of getting the science right. The issues of isotope ratios and some of the other more arcane elements are probably only important for a few folks, but good chemical (and atmospheric) analysis depends on high-quality reference standards. If anything, reliable reference standards are one small step in making sure things aren’t fudged.

  23. Roger Hird says:
    February 27, 2014 at 6:34 am

    That’s not how metrology works. Existing measurements aren’t “untrustworthy”, they’re less precise. Measurement users need increasingly precise measurements which themselves need increasingly precise standards/reference standards – look at the changes/improved precision of the time and length standards over the last century – driven by need and technical change.
    ______

    Thanks for your comment Roger. My comment may have been cryptic. Let me put things in a different way: will the new methods provide information that is capable of making any change in our understanding of climate? I’ll grant that the new method will be an improvement, but wouldn’t everything in the climate debate remain exactly the same, since nothing in the debate revolves around measurements of such precision? So I’m inclined to agree with the comment above that this may be a non-event.

    And, by the way, I agree with you that measurements have to be improved over time, as has been done in sampling ocean temperatures or even in the quality of weather stations, but each change in technology also creates a challenge of harmonizing the old data with the new.

  24. Dudley Horscroft says:

    February 27, 2014 at 6:02 am
    Probably no problems with Nitrogen and Argon, plus most of the other mini-constituents, but one wonders how they manage to ensure that O16 and O18, C12 and C14 are present in the correct quantities. Also Hydrogen and Deuterium. IIRC the half life for Deuterium is very large, but that of C14 is of the order of about 5300 years. What about O18?

    O16, O18, C12 are all stable isotopes and have no half-life.

  25. I wonder if we could see a net CO2 sink if we had enough cold years in a row. We know that when it gets cold, CO2 enters the ocean, and when oceans get hot, they emit CO2. So wouldn’t it be cool to see a couple of years where more is absorbed than released? I am curious if this is something we could or should expect. I would expect it gets more likely at higher CO2 levels, once we start to curb our emissions, perhaps around the end of this century or the end of the next. I just hope we do not have to wait that long before the CO2 AGW house of cards falls.

  26. @ Crispin

    Yes, I understand that, but I didn’t understand what WUWT means by “synthesis error” and why he’s worried about it. Once synthesised, the gas mixture is accurately measured, so to an extent it doesn’t matter exactly what the concentrations are, only that you *know* what the concentrations are to a high degree of accuracy. Unless I’m missing something.

  27. This sounds like a good way forward for improving testing.
    However…

    Industrially produced carbon dioxide has a different isotopic distribution to that of atmospheric air, which measurement instruments read differently.

    …does sort of imply that the measurement method assumes perfect mixing of isotopes across the Globe.
    Which does raise a question as to whether the claimed measurement accuracy has any real meaning?

  28. From the abstract:
    The reference standards developed here have been compared with standards developed by the National Institute of Standards and Technology and standards from the WMO scale. They demonstrate excellent comparability.

    Would have been nice to see a quantitative comparison rather than a qualitative one.
    It isn’t what they are doing that raises eyebrows. It is how they are talking about it.

  29. “Industrially produced carbon dioxide has a different isotopic distribution to that of atmospheric air”

    Curious as to the difference in isotopes. It seems to me that CO2 condensed out of the atmosphere would have substantially the same isotopes as CO2 NOT condensed from the atmosphere. Extraction of CO2 from air is “not efficient” so it’s not used as an “industrial process”. However, it does work and I would suggest that having the correct isotopes would outweigh the increased cost over an “industrial process”.

    Regards,

    Steamboat Jack (Jon Jewett’s evil twin)

  30. Dudley Horscroft:

    “IIRC the half life for Deuterium is very large”

    Infinite, as far as we know. It’s nonradioactive.

  31. James Strom: “since nothing in the debate revolves around measurements of such precision? So I’m inclined to agree with the comment above that this may be a non-event.”

    I don’t know how many different sorts of measurements this standard might be applied to, or the different calibration chains between the reference standard and the measurement device (eg direct calibration with reference standard or calibration of national standards with the reference standard etc) but the precision of any primary or reference standard tends to need to be a couple of orders of magnitude greater than the required precision of actual measurements.

    “And, by the way, I agree with you that measurements have to be improved over time, as has been done in sampling ocean temperatures or even in the quality of weather stations, but each change in technology also creates a challenge of harmonizing the old data with the new.”

    Indeed – and from my observation it’s one of the things to which metrologists give great thought. When I had some involvement with NPL, about 20 years ago – when it moved from direct government operation to contractor operation – I discovered a member of staff, a scientist, in his 90s who still cycled to work every day and whose input was specifically valued because of his detailed knowledge of the history of improvements in some important standards.

  32. “Standards are a critical problem in greenhouse gas measurement.”

    They need standards to measure water vapor?

  33. As rgb notes, it’s not really climate-news. It’s partly a technical note, just saying that someone has now taken the trouble to prepare such a standard.

    And partly also a ‘business’ note because the increased number of people taking measurements means there is an increased number of people who might be willing to buy it.

  34. So we now have artificial, synthesised, air and computer models to tell us about the future climate. It will be like looking at a woman whose had Botox and plastic surgery. Looks good, expensive to maintain, but we all know it’s not the real thing.

  35. What? Unicorn farts are not accurate enough? How about Gore Gas? It is likely to be very exacting having bubbled all the way through a Nobel prize recipient.

  36. “… Developing high accuracy reference standards of carbon dioxide and methane … will greatly contribute to … improving our understanding of how greenhouse gases affect the atmosphere.” (Euan G. Nisbet <– really)

    More precision in measuring a conjectured cause for which there is, so far, ZERO evidence of causation, does exactly NOTHING to improve understanding.

    Either Nisbet is:

    1. Lying or
    2. Ignorant-to-the-point-of-scientific-incompetence.

    Which is it, Nisbet? Are you a cr00k or a dope?

  37. But, but, but, Finaglebean has assured us there is no question of the reliability of the official monitoring networks output, yet, they didn’t even have a reliable standard to work against.

    HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

  38. Janice Moore says:
    February 27, 2014 at 10:54 am

    “… Developing high accuracy reference standards of carbon dioxide and methane … will greatly contribute to … improving our understanding of how greenhouse gases affect the atmosphere.” (Euan G. Nisbet <– really)

    More precision in measuring a conjectured cause for which there is, so far, ZERO evidence of causation, does exactly NOTHING to improve understanding.

    With the current politico-science situation in AGW among funders, I have no problem with puffery for funding. Note that the understanding could go the way of less impact of greenhouse gasses as well as more impact.

  39. From this requirement to ‘increase’ the accuracy of atmospheric CO2/methane readings we can deduce that up till now ‘they’ haven’t had a ‘reliable’ measure of CO2/methane.

  40. Gibby says:

    I am not sure that I fully understand this press release since the regularly used method for analytically determining CO2 concentrations is using FTIR

    http://keelingcurve.ucsd.edu/how-is-co2-data-processed/ 2013

    The Scripps instrument itself has seen several upgrades. The original Scripps Applied Physics analyzer was retired in 2006 when it was replaced by a more modern CO2 analyzer made by Siemens after a year of overlap. A further upgrade is currently underway in a collaboration between Scripps and Earth Networks Corp. using an even newer technology based on cavity ring-down (CRD) spectroscopy. The CRD instrument was installed in December 2012 and has operated in parallel with both the Siemens and the NOAA analyzers. The plan is to retire the Siemens analyzer by the end of 2013 after a year of overlap with the CRD instrument. Typically the CO2 concentrations independently determined from all three analyzers agree within a few tenths of a ppm (0.2 to 0.3 ppm).

    I suggest that everyone read the measurement and calibration procedures!

    How we measure background CO2 levels on Mauna Loa 2008

    Air is slowly pumped through a small cylindrical cell [where] CO2 absorbs infrared light. More CO2 in the cell causes more absorption, leaving less light to hit the detector. We turn the detector signal, which is registered in volts, into a measure of the amount of CO2 in the cell through extensive and automated (always ongoing) calibration procedures.

    [There are] frequent calibrations of the instrument with reference gas mixtures of CO2-in-dry-air spanning the expected range of the measurements. The reference gas mixtures are stored in high pressure aluminum cylinders. At Mauna Loa the calibration is done every hour by interrupting the flow of outside air through the cell, and replacing it with flows of three reference gas mixtures in succession, 5 minutes each.
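The NDIR principle quoted above can be caricatured with the Beer-Lambert law: more CO2 in the cell absorbs more infrared light, so less reaches the detector. A toy model, with an arbitrary absorption coefficient chosen only to make the effect visible (real instruments depend on the hourly reference-gas calibrations described in the quote, not on a formula like this):

```python
# Toy Beer-Lambert illustration of the NDIR principle: transmitted
# intensity falls exponentially as CO2 in the cell increases, so the
# detector sees less light at higher concentrations.
import math

def detector_signal(co2_ppm, i0=1.0, k=4e-4):
    """Transmitted intensity for a given CO2 amount fraction (toy model).

    i0 is the incident intensity; k is an arbitrary effective absorption
    coefficient per ppm, invented for this sketch.
    """
    return i0 * math.exp(-k * co2_ppm)

for ppm in (350, 400, 450):
    print(ppm, round(detector_signal(ppm), 4))
```

The calibration procedure then maps the measured detector voltage back onto concentration by bracketing the expected range with reference gases, which cancels drifts in i0 and the effective absorption path.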

  41. After rising above the constitutions of modern western democracies, ISO Guide 34, ISO 17043 and ISO 17025 are a piece of cake for the AGW scientologists.

    Makes me wonder what is the estimated uncertainty of the reference standard for the carbon dioxide and methane assays (measured in parts per million and parts per billion respectively). Let alone the uncertainty caused by the size and homogeneity of the measurand, which is nothing less than the earth’s atmosphere.

    It’s sobering to recall that AGW is founded on the atmospheric composition, in that sort of accuracy, also before electricity was discovered.

  42. Jon Jewett says:
    February 27, 2014 at 8:32 am

    Curious as to the difference in isotopes. It seems to me that CO2 condensed out of the atmosphere would have substantially the same isotopes as CO2 NOT condensed from the atmosphere.

    Even condensing CO2 out of the atmosphere shifts the isotopic composition a little bit: the lighter ones will condense faster than the heavier ones. Evaporating water and CO2 again shows the lighter ones somewhat higher than the heavier ones. The change is about -10 per mil oceans to atmosphere and -2 per mil atmosphere to oceans. Direct condensing CO2 out of the atmosphere: no idea, but as can be seen in condensing water vapour, not unimportant.

    But if you make CO2 out of natural gas (as is common for industrial use), the carbon isotopic composition is at -40 to -80 per mil. That doesn’t make much difference for the NDIR measurements used in most stations today, as the IR absorption differences between isotopes are small for the current technique, but as the resolution increases with new techniques, new, more accurate standards even for the isotopic composition may be needed.

    Anyway, accurate measurements need calibration and cross-calibration, no matter if that is for blood tests or CO2 measurements. The higher the resolution of the measurements gets, the more accurate the calibration gases need to be known, and the more the calibration gases need to have a similar isotopic composition as what has to be measured.
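The per-mil figures in this comment are delta-13C values: the relative deviation of a sample’s 13C/12C ratio from a reference standard, times 1000. A small sketch of the conversion, assuming the commonly cited VPDB reference ratio of about 0.011180:

```python
# delta-13C notation: delta = (R_sample / R_standard - 1) * 1000, in
# per mil, where R is the 13C/12C ratio. The reference is VPDB; the
# ratio below is the commonly cited value, assumed for this sketch.
R_VPDB = 0.011180

def delta13C(r_sample):
    """delta-13C in per mil relative to VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

def ratio_from_delta(d):
    """Invert: the 13C/12C ratio for a given delta-13C (per mil)."""
    return R_VPDB * (1.0 + d / 1000.0)

# Atmospheric CO2 sits near -8 per mil; CO2 made from natural gas can
# be -40 per mil or lighter, as the comment notes.
print(ratio_from_delta(-8.0))               # slightly below the VPDB ratio
print(delta13C(ratio_from_delta(-40.0)))    # round-trips to ~ -40
```

The practical point of the comment follows directly: a -40 per mil synthetic CO2 has measurably fewer 13C atoms than -8 per mil atmospheric CO2, which matters once instruments resolve the isotopologues separately.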

  43. for all those who want CO2 measured accurately – be careful what you wish for.

    2011: Bloomberg: Earth Networks, Scripps to Deploy 100 Carbon Monitoring Stations Globally
    Scientists and carbon traders typically rely on self- reported data from governments and businesses, based on consumption of fossil fuels, to determine how much a country or region is emitting.
    “I don’t trust some governments to accurately measure their carbon inventories,” Scripps director Tony Haymet said in an interview. “We need these top-down measurements to verify.”…
    On Jan. 10, there was a “fairly moderate” concentration of carbon dioxide in the air, he said yesterday. “Today, the wind has shifted to the south, and carbon measurements are significantly higher because of all the pollution coming from Baltimore.”

    http://www.bloomberg.com/news/2011-01-12/earth-networks-scripps-to-deploy-100-carbon-monitoring-stations-globally.html

  44. Jaakko Kateenkorva says:
    February 27, 2014 at 2:03 pm

    Makes me wonder what is the estimated uncertainty of the reference standard for the carbon dioxide and methane assays (measured in parts per million and parts per billion respectively).

    Keeling senior had good glass-blowing skills and made a glass instrument himself in the 1950s to calibrate CO2 measurements/devices with a gravimetric method accurate to 1:40,000. Good enough to calibrate the standard NDIR measurements to better than 0.1 ppmv. That calibration instrument was in use at Scripps until a few years ago to prepare calibration gases for samples taken at Mauna Loa, independent of NOAA, which took over the calibrations from Scripps about a decade ago. The NOAA (continuous) and Scripps (flask) measurements differ by a maximum of 0.1 ppmv from each other (1 sigma), each with their own calibration gases and instruments.

    The one problem they didn't know about at the start of the measurements was that their calibration gases were mixtures of nitrogen and CO2, not air and CO2, out of fear that oxidation inside the containers might cause the mixtures to deteriorate. When it was observed years later that the instruments showed different results with CO2 in N2 than with CO2 in air, they recalibrated all the instruments and corrected all previous values accordingly (the raw voltage data were still available).

    I don’t know the accuracy of the CH4 standards…

  45. kuhnkat says:
    February 27, 2014 at 12:13 pm

    But, but, but, Finaglebean has assured us there is no question about the reliability of the official monitoring networks' output, yet they didn't even have a reliable standard to work against.

    If they monitor your blood composition, you rely on today's good standards. That doesn't mean the old standards were bad, only that modern blood tests have much better resolution, so the standards they are compared against must be more accurate too…

  46. pat says:
    February 27, 2014 at 2:22 pm

    for all those who want CO2 measured accurately – be careful what you wish for.

    That story has nothing to do with accurate measurements of “background” CO2 levels, which are measured far away from the main industrial and natural sources and sinks: mid-ocean, or coastal with wind from the sea side.

    The 100 extra stations are tall towers to be installed (and partly already installed) over land to measure regional sources and sinks of CO2, specifically where the human emissions originate. The latter are nowadays calculated from fossil fuel sales (taxes!), but are probably underestimated due to under-the-counter sales…

  47. The NBS used to be in Boulder; it is now NIST. They still do the same work, and are most famous for time and frequency standards: WWVB on 60 kHz and WWV at 5 MHz, 10 MHz, and 15 MHz. You can set your watch by WWV if you can decode the 100 Hz subcarrier; WWVB sends its time code with 17 dB power changes.

  48. Industrial CO2 is produced from various processes, e.g. recovered from steam methane reforming, refining, recovered from fermentation, etc. Most industrial CO2 is recovered from gas wells. It is not practical to recover it directly from air.

  49. Ferdinand Engelbeen –

    have noted your comments, though I was considering NPL's own stated purposes at a link I provided earlier. Appreciate you pointing out the difference in this thread's particular case, though.

  50. Quantitative Uncertainty
    Re: “it can be subject to synthesis error.”
    Paul Brewer et al. establish an objective reproducible standard traceable to fundamental primary standards at national labs like NPL and NIST.

    We report the preparation and validation of the first fully synthetic gaseous reference standards of CO2 and CH4 in a whole air matrix with an isotopic distribution matching that in the ambient atmosphere. The mixtures are accurately representative of the ambient atmosphere and were prepared gravimetrically. The isotopic distribution of the CO2 was matched to the abundance in the ambient atmosphere by blending 12C-enriched CO2 with 13C-enriched CO2 in order to avoid measurement biases introduced by measurement instrumentation detecting only certain isotopologues. The reference standards developed here have been compared with standards developed by the National Institute of Standards and Technology and standards from the WMO scale. They demonstrate excellent comparability. . . .
    The uncertainty in the gravimetric amount fractions of the reference mixtures described here have been calculated by applying the principles of the Guide to the Expression of Uncertainty in Measurement.^18 Mixtures of CO2 and CH4 have been prepared with uncertainties of less than 0.1% and 0.2% (k = 2), respectively.

    Ref: (18) International Organization for Standardization (ISO). Guide to the Expression of Uncertainty in Measurement (GUM); Geneva, 1995.
    For the latest edition see: JCGM 100:2008 (GUM 1995 with minor corrections), Sept. 2008, BIPM.
    Reporting to the BIPM international standard guidelines for uncertainty indicates formal quantitative scientific evaluation making it highly likely it is quantitatively reproducible. (k=2 means “Expanded uncertainties giving a confidence level of approximately 95%” i.e. within +/- 2 sigma).

    Such documentation of uncertainty is extremely rare in IPCC publications. Climate science would take a major step up if it would begin to quantify uncertainty in all publications in keeping with these international guidelines of BIPM’s GUM – JCGM 100 : 2008.
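The GUM recipe cited above amounts to combining independent standard-uncertainty components in quadrature and scaling by the coverage factor k. A minimal sketch with made-up component values (not Brewer et al.'s actual uncertainty budget):

```python
import math

def expanded_uncertainty(components, k=2.0):
    """Root-sum-of-squares combination of independent standard
    uncertainties, scaled by coverage factor k (GUM, uncorrelated case)."""
    u_combined = math.sqrt(sum(u * u for u in components))
    return k * u_combined

# e.g. weighing, purity and validation components, all in ppm (illustrative)
U = expanded_uncertainty([0.10, 0.12, 0.08])
```

With k = 2 the expanded uncertainty corresponds to roughly a 95% confidence interval, which is what the paper's “less than 0.1% and 0.2% (k = 2)” figures express.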

  51. David L. Hagen says:
    February 28, 2014 at 6:01 am

    Thanks a lot for the references!

    The WS-CRDS method thus uses extremely narrow IR bands at the peak of the main isotopologue of both CO2 and CH4. That means the calibration standards may not differ too much in isotopic composition from the target composition, or the real levels will be under- or overestimated. The same holds for differences in N2/O2/Ar composition, which change the broadening of the CO2 and CH4 bands and thus the peak heights of 12CO2 and 12CH4… But even so, the changes are very small, in the hundredths of a ppmv…

    I wonder if they can detect (sooner or later) the relative abundances of 12CO2 and 13CO2 in the atmosphere from continuous sampling by using narrow IR bands for each individual isotope peak wavelength…

  52. Ferdinand Engelbeen
    Paul Brewer et al. used a Picarro G2301 CRDS. See the G2301 datasheet: it specifies 25 ppb precision over 5 minutes and 500 ppb drift per month. Hence the benefit of calibration gases.

    PS For another CRDS ref:
    13CO2/12CO2 isotopic ratio measurements with a continuous-wave quantum cascade laser in exhaled breath Vasili L. Kasyutich et al.

    For wavelength modulation spectroscopy (WMS) see:
    UTILIZATION OF MULTIPLE HARMONICS OF WAVELENGTH MODULATION ABSORPTION SPECTROSCOPY FOR PRACTICAL GAS SENSING, Kai Sun DEC 2013, Dissertation

    For the potential for reducing satellite measurement uncertainty compared to current see Nigel Fox NPL TRUTHS project. e.g. Seeking the TRUTHS about climate 2011. Paper: Accurate radiometry from space: an essential tool for climate studies Nigel Fox et al. 2011.
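The G2301 drift figure above is why analyzers are routinely bracketed with reference gases: measuring standards of known concentration lets one remove gain and offset drift. A sketch of a two-point correction (the linear gain/offset model and all numbers are illustrative; stations typically run three or more standards):

```python
def two_point_calibration(raw_lo, raw_hi, true_lo, true_hi):
    """Build a correction function from two reference-gas readings,
    assuming a linear (gain + offset) instrument response."""
    gain = (true_hi - true_lo) / (raw_hi - raw_lo)
    offset = true_lo - gain * raw_lo
    return lambda raw: gain * raw + offset

# Analyzer read 380.7 and 420.9 ppm on standards certified at 380.0 and 420.0
cal = two_point_calibration(380.7, 420.9, 380.0, 420.0)
x = cal(400.8)   # corrected mid-scale reading
```

Repeating this whenever the standards are run keeps the quoted 500 ppb/month drift from accumulating in the reported record.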

  53. David L. Hagen,

    Thanks again for the references. Amazing what modern analytical techniques can achieve these days.
    My last pre-retirement encounter with such instruments was already 10 years ago, and the instrument makers obviously haven't stopped applying new techniques…

    Thus my idea of using that for 13C/12C ratios is already in use. Again too late for a patent of my own…

  54. This group knows that the atmospheric CO2 concentration is around 400 PPM. Mathematically, 400 PPM means that .04% of our atmosphere consists of CO2. Ask people in your workplace, lunch room, dinner table, or coffee house what percentage of the atmosphere they think consists of CO2. Regardless of which side of the debate they are on, the answers given are usually off by a factor of 100 to 1,000 or more.
    We are fighting a battle of public perception more than a battle over the merits of the science. The science is clear. CAGS alarmists have convinced a good part of the world that increased CO2 is causing the earth to warm with catastrophic effects. The earth has not warmed in 17 years, despite continued increases in both the amount of CO2 and its rate of increase. When your results do not meet your prediction, or projection, you are wrong, end of story. Thank you, Dr. Feynman.
    In every article, in every comment on WUWT, and at every opportunity, we should start the discussion with the fact that CO2 is only .04% of the atmosphere. It is much harder for the other team to convince people that CO2 is a problem once the general population understands that CO2 is .04%. What if it doubles? Now it is .08%, still an infinitesimal amount. One comment said that new technology can measure CO2 concentrations to +/- .04%. This means that we can detect a change in CO2 concentration of .000016% of the atmosphere. While quite a technological achievement, the ability to measure such small changes in a trace gas does not have any meaning, and does not present any useful information, in the real world.
    I would like to ask our gracious host to always express the CO2 concentration in percentage terms, in addition to the more scientific expression, i.e. 400 PPM (.04%), so that the numerically challenged in our society can better appreciate the tiny amount of CO2 that is purported to cause such a potential catastrophe.
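The percentage arithmetic above is just a fixed factor: 1 ppm = 10^-4 percent. A trivial sketch:

```python
def ppm_to_percent(ppm):
    """Convert a parts-per-million mixing ratio to a percentage."""
    return ppm / 1e4

co2_now = ppm_to_percent(400)      # 0.04 %
co2_doubled = ppm_to_percent(800)  # 0.08 %
```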

  55. Bill Sprague says:
    February 28, 2014 at 1:03 pm

    Bill, the tiny percentage of CO2 in the atmosphere is a non-argument. The same percentage of hydrogen cyanide in the atmosphere would kill most animals… What matters is the effect of such a percentage.

    In the case of CO2, the effect is small: a doubling of CO2 from 280 to 560 ppmv will increase global temperatures by not more than 1°C, which is no disaster and mainly beneficial for all plant life. The rest of the “projections” is based on failed climate models. That is the message we need to bring to the masses: all the disaster scenarios are based on models, no more real than the movies from the Hollywood industry…

  56. Ferdinand,
    Thank you for your comment, and I completely understand your argument. Your argument is scientific; mine is about public persuasion and perception. The public perception is that CO2 is 20% of the atmosphere, headed to 40%, and we are all going to die. My point is that 400 PPM sounds like a lot, because 400 is a big number: a car payment, or even a house payment for some. Ask John Kerry what percentage of the atmosphere is CO2. I doubt that he knows, and I doubt that he could explain in layman's terms to other laymen why .04% is something to worry about.
    Yes, we also need to point out the inaccuracy of the models. The difficulty is that Scientist A says yes they are accurate and Scientist B says no they are not, and the public does not have enough education to tell the difference on their own. They hear that 97% of scientists agree that CAGS is real and that government needs to “do something”: fund windmills, solar, etc. My point is not one based on science, although the science is on our side. It is based on the simple logic that CO2 is only .04% of the atmosphere, so what are you worried about? It is 4 pennies out of $10,000.

    It is about speaking to the general public in a readily comprehensible way. The genius of Michael Mann was the hockey stick: a simple, visual depiction of runaway CO2 associated with runaway temperatures. The reality is that if his hockey stick were the prize in a box of Cracker Jacks, it would be too small to find, and the customer would feel cheated because he did not get his prize.

  57. Engelbeen and Hagen above did an excellent job of explaining the need for and use of reference gases – thanks, guys!

    The concern by others that the availability of new standards is going to somehow change CO2 measurements is completely misplaced. As I read the release, it’s just announcing the development of a reliable method of creating secondary reference standards for atmospheric gas measurement, in response to greatly increased demand for such standards. This happens in metrology all the time; there are primary, secondary, and tertiary standards, and the aim is always to make each successive generation of standard match the higher levels as closely as possible. Verifying that is simply a matter of comparing the standards with each other with the most sensitive equipment available.

    In this case, it doesn’t matter a whole lot where and when the primary atmosphere standard was gathered; as mentioned earlier, calibrations are most likely done at bottom, mid and top-scale, and I’d be extremely surprised if there was any significant non-linearity inherent in the instruments’ designs, in any proximity to the values actually being measured.

    This is seriously a non-issue for CO2 monitoring and the climate debate: all it means is that it will be easier and cheaper to calibrate the measuring instruments more accurately. And as noted by someone else above, the current level of accuracy is far greater than any level that would impact the climate debate, on the order of small fractions of a percent of a value that we're concerned about doubling.

  58. I have often wondered why they measure CO2 on top of the largest active volcano on the planet, not to mention on an island in the middle of the Pacific Ocean, the largest producer of CO2 on the planet.

  59. elmer says:
    March 1, 2014 at 5:38 am

    I have often wondered why they measure CO2 on top of the largest active volcano on the planet.

    The first continuous measurements were in fact at the South Pole, but as that record has a gap of a few years, the Mauna Loa record is the longest continuous series on the planet. But there are a lot of places nowadays where CO2 and other trace gases are measured, see:
    http://www.esrl.noaa.gov/gmd/dv/iadv/ already mentioned by Billy Liar.
    They all show similar CO2 levels within natural variability (mainly seasonal).

    Winds at the 3,400 m level of the station are mainly trade winds free of local pollution, but if there are downslope winds bearing CO2 from volcanic vents (or slightly CO2-depleted upslope winds from the valleys), those hours are excluded from the daily to yearly averages.

    Some 20 ppmv/year of CO2 (40 GtC) comes out of the equatorial waters and is absorbed near the poles. These are mainly continuous flows which hardly affect local levels. Without any mixing by winds, that would increase the local level at Mauna Loa by some 0.05 ppmv/day, but there are nearly always winds…
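The exclusion of vent-contaminated downslope hours described above can be caricatured as a variability filter: an hour counts as “background” only when neighbouring hourly means agree within a small threshold. A rough sketch (the function, the 5-hour window and the 0.25 ppm threshold are illustrative assumptions, not NOAA's published selection algorithm):

```python
def background_flags(hourly_means, window=5, max_spread=0.25):
    """Flag each hourly mean as background (True) when the hourly means
    in a surrounding window agree within max_spread ppm."""
    flags = []
    for i in range(len(hourly_means)):
        lo = max(0, i - window // 2)
        neighbourhood = hourly_means[lo:lo + window]
        flags.append(max(neighbourhood) - min(neighbourhood) <= max_spread)
    return flags

# Quiet trade-wind hours pass; a vent-contaminated spike is rejected
flags = background_flags([400.0, 400.1, 402.5, 400.1, 400.0])
```

Hours around the 402.5 ppm spike fail the spread test, while quiet trade-wind hours pass and enter the daily average.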

  60. A paleo-fact(or) to keep in mind is that the oxygen in the atmosphere was originally stripped from CO2. I.e., there was more than 20% CO2, and plants (including algae and early bacteria) reversed the ratios by eating it! What a massive pig-out that was.
