OCO: I can see your house emitting CO2 from here

From NASA JPL and the department of future CO2 emissions ticketing:

OCO-2 Data to Lead Scientists Forward into the Past

NASA’s Orbiting Carbon Observatory-2, which launched on July 2, will soon be providing about 100,000 high-quality measurements each day of carbon dioxide concentrations from around the globe. Atmospheric scientists are excited about that. But to understand the processes that control the amount of the greenhouse gas in the atmosphere, they need to know more than just where carbon dioxide is now. They need to know where it has been. It takes more than great data to figure that out. 

“In a sense, you’re trying to go backward in time and space,” said David Baker, a scientist at Colorado State University in Fort Collins. “You’re reversing the flow of the winds to determine when and where the input of carbon at the Earth’s surface had to be to give you the measurements you see now.”

Harry Potter used a magical time turner to travel to the past. Atmospheric scientists use a type of computer model called a chemical transport model. It combines the atmospheric processes found in a climate model with additional information on important chemical compounds, including their reactions, their sources on Earth’s surface and the processes that remove them from the air, known as sinks.

Baker used the example of a forest fire to explain how a chemical transport model works. “Where the fire is, at that point in time, you get a pulse of carbon dioxide in the atmosphere from the burning carbon in wood. The model’s winds blow it along, and mixing processes dilute it through the atmosphere. It gradually gets mixed into a wider and wider plume that eventually gets blown around the world.”

Some models can be run backward in time — from a point in the plume back to the fire, in other words — to search for the sources of airborne carbon dioxide. The reactions and processes that must be modeled are so complex that researchers often cycle their chemical transport models backward and forward through the same time period dozens of times, adjusting the model as each set of results reveals new clues. “You basically start crawling toward a solution,” Baker said. “You may not be crawling straight toward the best answer, but you course-correct along the way.”
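Baker's "crawl toward a solution" is, in spirit, an iterative inverse problem: run the transport model forward, compare with observations, and correct the surface sources. Below is a minimal, purely illustrative sketch in which transport is reduced to a known linear operator `A` mapping surface fluxes to observations; real chemical transport models are nonlinear and vastly larger, and all names here (`A`, `x_true`, the step size) are assumptions for the toy example, not anything from the mission.

```python
import numpy as np

# Toy version of the forward/backward cycling described above.
# Assume transport is a fixed linear operator A that maps unknown
# surface CO2 fluxes x to atmospheric observations y.
rng = np.random.default_rng(0)
n_obs, n_src = 40, 10
A = rng.random((n_obs, n_src))   # "transport": sources -> observations
x_true = rng.random(n_src)       # unknown surface fluxes
y = A @ x_true                   # what the instrument "sees"

# Iterative inversion (Landweber iteration): repeatedly run the model
# forward, measure the misfit, and push the flux estimate "backward"
# along the gradient -- course-correcting toward the answer.
x = np.zeros(n_src)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # step size small enough to converge
for _ in range(5000):
    residual = A @ x - y                 # forward run vs. observations
    x -= step * (A.T @ residual)         # backward correction of the sources

print(np.allclose(x, x_true, atol=1e-6))
```

The estimate does not move straight to the answer; each cycle only shrinks the misfit, which is why real inversions are repeated dozens of times over the same period.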

Lesley Ott, a climate modeler at NASA’s Goddard Space Flight Center, Greenbelt, Maryland, noted that simulating carbon dioxide’s atmospheric transport correctly is a prerequisite for improving the way global climate models simulate the carbon cycle and how it will change with our changing climate. “If you get the transport piece right, then you can understand the piece about sources and sinks,” she said. “More and better-quality data from OCO-2 are going to create better characterization of global carbon.”

Baker noted that the volume of data provided by OCO-2 will improve knowledge of carbon processes on a finer scale than is currently possible. “With all that coverage, we’ll be able to resolve what’s going on at the regional scale,” Baker said, referring to areas the size of Texas or France. “That will help us understand better how the forests and oceans take up carbon. There are various competing processes, and right now we’re not sure which ones are most important.”

Ott pointed out that improving the way global climate models represent carbon dioxide provides benefits far beyond the scientific research community. “Trying to figure out what national and international responses to climate change should be is really hard,” she said. “Politicians need answers quickly. Right now we have to trust a very small number of carbon dioxide observations. We’re going to have a lot better coverage because so much more data is coming, and we may be able to see in better detail features of the carbon cycle that were missed before.” Taking those OCO-2 data backward in time may be the next step forward on the road to understanding and adapting to climate change.

To learn more about the OCO-2 mission, visit these websites:

http://www.nasa.gov/oco2

http://oco.jpl.nasa.gov

NASA monitors Earth’s vital signs from land, air and space with a fleet of satellites and ambitious airborne and ground-based observation campaigns. NASA develops new ways to observe and study Earth’s interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing. The agency shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.

For more information about NASA’s Earth science activities in 2014, visit:

http://www.nasa.gov/earthrightnow

OCO-2 is managed by NASA’s Jet Propulsion Laboratory, Pasadena, California.

July 23, 2014 12:13 am

They truly don’t realize how untrustworthy climate models are considered by any sane person capable of critical thinking.
Also, how this will relate to the age-old mantra about CO2 being “well mixed” is anybody’s guess.

Steve B
July 23, 2014 12:16 am

Useless people with useless jobs I think.

mickgreenhough
July 23, 2014 12:27 am

If my house emits CO2 that is a good thing as CO2 increases the yield of plants.

Andy
July 23, 2014 12:29 am

What an absolute waste of money. I was always glad the climate wonks in NASA seemed apart from the scientists and rocket scientists in proper NASA. Now, sadly they seem to have become a little connected…

July 23, 2014 12:35 am

From the report:
Harry Potter used a magical time turner to travel to the past. Atmospheric scientists use a type of computer model called a chemical transport model. It combines the atmospheric processes found in a climate model with additional information on important chemical compounds, including their reactions, their sources on Earth’s surface and the processes that remove them from the air, known as sinks.
That first sentence says it all. Harry Potter used a magical time turner, while atmospheric “scientists” use a magical computer model which itself uses a magical climate model as a component. (one of the four “best” no doubt)
The great thing about computer models is that you can program them to yield whatever answer your bias and your funding masters need to see. I wish they would stop using the word “science” in conjunction with these boys and their computer games.

July 23, 2014 12:36 am

Measurement is good.
Evidence is good.
I have no argument with the real world; it really doesn’t care about my opinion.
Climate models on the other hand..

Jack Savage
July 23, 2014 12:37 am

“Some models can be run backward in time — from a point in the plume back to the fire, in other words — to search for the sources of airborne carbon dioxide.”
So… you are going to trace the hurricane in China back to the individual butterfly wing in New Mexico? Good luck with that!

July 23, 2014 12:37 am

Warning to the Climate Research Unit (Jones, emails, lost data etc., Norwich, UK): it is now also watched from above.
http://i.telegraph.co.uk/multimedia/archive/02983/gods-face0_2983566k.jpg
Photo taken just a few miles up the road.

Frank
July 23, 2014 12:51 am

I think Google should drive all over the country with CO2 cameras. Then overlay the results on Google Earth. Then we can zoom in on James Hansen, Al Gore, Barbara Streisand, Michael Mann’s, et al’s homes emitting clouds of poison. Clip ’em and put ’em on Facebook.

Leigh
July 23, 2014 12:57 am

How many ways do they need to be told that CO2 is not the problem?
Another waste of taxpayers’ money
that would be better spent on medical research into cancer or any other disease.

pat
July 23, 2014 1:10 am

NASA – look here:
22 July: Guardian: Damian Carrington: Germany, UK and Poland top ‘dirty 30’ list of EU coal-fired power stations
Environmental study highlights health effects from pollution, with Germany coming top, and the UK third in total coal consumption
http://www.theguardian.com/environment/2014/jul/22/germany-uk-poland-top-dirty-30-list-eu-coal-fired-power-stations
.pdf (17 pages): Panda.org: Europe’s Dirty 30:How the EU’s coal-fired power plants are undermining its climate efforts
This report was researched and written by Kathrin Gutmann from Climate Action Network (CAN) Europe, Julia Huscher from Health and Environment Alliance (HEAL), Darek Urbaniak and Adam White from WWF European Policy Office, Christian Schaible from the European Environmental Bureau (EEB) and Mona Bricke, Climate Alliance Germany.
http://awsassets.panda.org/downloads/dirty_30_report_finale.pdf

Stephen Richards
July 23, 2014 1:19 am

“It takes more than great data to figure that out.” Yep: add a few dozen incompetent climate astros along with Gavin and Michael, and ça y est (there you have it).

jono1066
July 23, 2014 1:21 am

Besides being able to plot positive CO2 levels, can you ask them also to tune it to look for negative CO2 plumes showing where the CO2 is being absorbed, which appears to happen on a regular annual basis (according to the ML trace)?
And I hope they never point the satellite at a forest or two at the wrong time; all that CO2 being released would blow the sensors.

CodeTech
July 23, 2014 1:29 am

I sometimes wonder how these people will handle having all of this expensive and high tech stuff disprove their theory.
Then I realize, their theory was disproved years ago, and they still search fruitlessly for evidence that will never appear. It’s sad, really. Like the lonely ghost ship, adrift on the high seas, ever searching for a lost love…

MangoChutney
July 23, 2014 1:48 am

I’d lay money on all the man-made CO2 since the beginning of time being traced back to the Western countries, even though, as omnologos points out, what happens with the “well mixed” claims is anybody’s guess.

Greg Goodman
July 23, 2014 2:11 am

OCO-2 ( pronounced Oh-CO2 ! ) could help explain why CO2 in the Arctic so closely matches ice area for much of the year.
http://climategrog.wordpress.com/?attachment_id=970
It should also help explain why the amplitude of annual variation is greatest in the Arctic, not where human emissions are greatest.
This should finally demonstrate that out-gassing and absorption due to temperature change in the oceans is a key factor in atmospheric CO2, not an insignificant 10 ppmv as is currently proposed.

Greg Goodman
July 23, 2014 2:13 am
Jaakko Kateenkorva
July 23, 2014 2:22 am

Well, this was the only thing missing to advance the modern anthropocentric age to its conclusion: it can finally be compared to a teenager using the latest iPhone to take blackhead-discovering selfies.

Greg Goodman
July 23, 2014 2:24 am

Full graph of which the above is a close up:
http://climategrog.wordpress.com/?attachment_id=997
Most of the short term variability in Arctic CO2 seems to be accounted for by ice area (SST being very stable in the presence of an ice/water mix).
This leaves about 0.82 ppmv/year gradual rise that may be attributable to human emissions AND SST variations in ice-free parts of the North Atlantic.

Greg Goodman
July 23, 2014 2:44 am

pat says:
July 23, 2014 at 1:10 am
NASA – look here:
22 July: Guardian: Damian Carrington: Germany, UK and Poland top ‘dirty 30’ list of EU coal-fired power stations
Environmental study highlights health effects from pollution, with Germany coming top, and the UK third in total coal consumption
=====
And when Britain started to construct a clean, modern coal power station at Kingsnorth in Kent … it got stopped because of protests from the enviros.
So the deliberate attempts to refer to a colourless, odourless, non-toxic gas as “dirty” end up causing more emissions of REAL pollutants from older coal-fired power stations.
Head meets arse .. “can I come in?”

Greg Goodman
July 23, 2014 2:52 am

omnologos says:
They truly don’t realize how untrustworthy climate models are considered by any sane person capable of critical thinking. Also, how this will relate to the age-old mantra about CO2 being “well mixed” is anybody’s guess.
===
Well mixed does not mean perfectly uniform the world over. Nothing at all in climate conforms to that.
Variations of a few ppmv in a total of 400 means that CO2 is “well mixed”.
Studying geographic deviations from the well mixed level should tell us about sources and sinks and the carbon cycle.
This could be very informative but it seems clear the major aim of this satellite will be finger pointing.

July 23, 2014 2:55 am

pat says: July 23, 2014 at 1:10 am
………….
or this
Tilbury power station mothballed after investment burns out
RWE Npower-owned Tilbury, claimed to be biggest biomass plant in the world providing 10% of Britain’s renewable power
http://www.theguardian.com/business/2013/aug/16/tilbury-power-station-mothballed

Jimbo
July 23, 2014 3:01 am

Here is the settled science of the IPCC.
I vaguely recall that CO2 is a “Well-Mixed Greenhouse Gas” [IPCC].
I vaguely recall that NASA said that CO2 in the atmosphere “is rather ‘lumpy’.” [NASA JPL]
Here are some preliminary results for CO2 hotspots from around the world. The Sahara is a great emitter of CO2, the USA not as much. We must act now.

lee
July 23, 2014 3:05 am

Cue – ‘Oh the wayward wind is a restless wind”

Rogueelement451
July 23, 2014 3:12 am

A question for the boffins: we know that hot CO2 rises, evidenced by hot air balloons. If CO2 is unencumbered by excess baggage like balloons, at what speed would it rise to achieve an equilibrium in temperature, and how high would it have to go to attain that equilibrium?

pat
July 23, 2014 4:02 am

another region for NASA to watch! can barely believe a couple of sane voices were heard at this Californian talkfest:
23 July: The Energy Collective: by Michael Shellenberger & Ted Nordhaus: High-Energy Africa
Africa has experienced massive economic growth over the last decade, but in order for this growth to translate into significant development outcomes, big investments will be needed to provide electricity to the 600 million sub-Saharan Africans who lack it, said a panel of development experts at Breakthrough Dialogue.
While some advocates have suggested that small-scale, distributed renewable energy technologies can meet the needs of sub-Saharan Africa, two of the panelists argued that Africa’s power sector will be much more diverse, and, at least in the near future, dominated by hydro and fossil fuels…
Given their high cost and a lack of infrastructure to support them, at least in the near-term, renewable technologies like solar will not be able to power Africa, argued Asafu-Adjaye. Renewables may become useful in the medium-term, but only once significant investment is directed toward new power plants and rehabilitating old ones.
“Renewables cannot do the heavy lifting,” he said. “We need to invest in the power grid in order to power industries, modernize agriculture, and power schools and hospitals. Going forward, sub-Saharan Africa needs to exploit cheap sources of energy, and this could come from oil and coal.”…
(Mimi Alemayehou, formerly of the Overseas Private Investment Corporation): “I think for Africa, because the need is so great, with 600 million people in the dark, every energy source needs to be on the table including coal,” she said. “We can’t make blanket statements. In my country of birth, Ethiopia, 90 percent of energy is clean because we have hydro. But you go south to Mozambique, and trust me, gas will be developed way before solar.”…
As the US Executive Director of the African Development Bank, Alemayehou oversaw the proposal for the Gibe dam project, which planned to double the installed capacity of Ethiopia.
Rising pressure from environmental groups like International Rivers, heightened by the approaching Copenhagen climate change conference, ultimately forced the African Development Bank, the World Bank, and the European Investment Bank to withdraw funding from the dam. In the end, the Chinese funded the project.
“Unfortunately, I don’t think the Chinese care much about the Pelosi Amendment or other things we had carved out in our project, including additional funding for community engagement,” said Alemayhou. “The Ethiopians got what they wanted in terms of doubled capacity, but the issues that the NGOs raised are still there.”…
http://theenergycollective.com/michaelshellenberger/440746/high-energy-africa
Agenda: Breakthrough Dialogue 2014: High-Energy Planet
Human Needs and Environmental Protection in the 21st Century
June 22 – 24, 2014, California
http://thebreakthrough.org/index.php/dialogue/agenda/

Rogueelement451
July 23, 2014 4:05 am

I am asking that in relation to it being “a well mixed gas”

mjc
July 23, 2014 4:28 am

“Rogueelement451 says:
July 23, 2014 at 3:12 am
A question for the boffins: we know that hot CO2 rises, evidenced by hot air balloons. If CO2 is unencumbered by excess baggage like balloons, at what speed would it rise to achieve an equilibrium in temperature, and how high would it have to go to attain that equilibrium?”
How much grant money is available to study that?
It will take quite a bit of money to get the equipment (satellites) in place and a top-notch model coder to whip up a model to figure out what that equilibrium should be…

Rogueelement451
July 23, 2014 4:34 am

unless we could tag molecules ,,, hmmmm

Crispin in Waterloo
July 23, 2014 4:45 am

Omnologos asks about mixing.
Because data is better than speculation here is some.
Late last year I was measuring CO2 in Ulaanbaatar Mongolia at the north edge of the city to find out the correction to apply for combustion analysis. At 9AM it was about 600 ppm and it rose during the morning to peak at 1100 ppm by 11AM after which it dropped rapidly to below 500. The facility is not near any single large emitter. It shows that a city can accumulate significant amounts of emissions and that they drift uphill when the air is warmed by the sun.
Measurements were made using a calibrated Rosemount X-Stream NDIR gas analyzer with a 200 mm cell having a short term precision of 0.01% of value.
Conclusion: CO2 is not well mixed in major urban areas.

July 23, 2014 4:50 am

I suggest that even more than the oceans it is the land masses, dominated by the larger Northern Hemisphere land masses, that rule atmospheric CO2.
Please examine the beautiful 15fps AIRS data animation of global CO2 at
http://svs.gsfc.nasa.gov/vis/a000000/a003500/a003562/carbonDioxideSequence2002_2008_at15fps.mp4
It is difficult to see the impact of humanity in this impressive display of nature’s power.
In the animation, does anyone see the impact of industrialization? USA? Europe? India? China? Anything related to humanity? I don’t.
I do see evidence of natural seasonal fluxes on land, and also evidence of deep ocean currents.
The animation does make it look like we Canadians and the Russians have lots of heavy industry emitting megatonnes of CO2 in the far northern Arctic. Not so.
This is no proof, but it appears that atmospheric CO2 flux and CO2 concentration have significant natural drivers. This does not rule out a human-made component to the observed CO2 increase due to fossil fuel combustion, deforestation, etc.

Daniel H
July 23, 2014 4:53 am

Translation:
In the event that the raw data from OCO-2 fails to show that anthropogenic sources in Western nations are responsible for the bulk of CO2 emissions, we are already prepared to run it through a tweaked chemical transport model that can transform the data so that it correctly conforms to our hypothesis. This model transformation should be described in terms of Harry Potter time-travel magic so that it can be easily understood and accepted as scientifically valid by President Obama, Vice-President Biden, Secretary Kerry, Nancy Pelosi, and Barbara Boxer.
/sarc (I hope!)

Man Bearpig
July 23, 2014 4:54 am

Here is the answer.

Claude Harvey
July 23, 2014 5:13 am

With Goddard Space Flight Center doing the calculating, the answers are preordained. Satisfactory answers may not emerge quickly, in which case the data will be processed through the Goddard data sausage machine until the desired product appears. James Hansen may be called out of retirement to assist in this important mission.

Editor
July 23, 2014 5:35 am

I’m disappointed with the several negative comments about OCO-2. Currently we have good CO2 data for a handful of sites around the world. Soon we’ll have decent data from many points around the world. Not perfect, but decent. If it tells us what to look at next, that’s fine.
One thing about raw science is that it often raises questions we didn’t realize should be investigated until we got enough data to be able to ask the question. Space science has been especially good at this, e.g. seeing magnetic fields on the Sun, the accidental discovery of the van Allen radiation belts, seeing sprites and jets above thunderstorms. Studying these required looking at things in a new way with new tools.
Perhaps we do know all that needs to be known about CO2 distribution and transport, CO2 sources and sinks. I suspect not, and we can’t be sure of that until we take a closer look.
I don’t understand why some commenters call this a waste of money. I can think of many much greater wastes of my tax dollars that should be addressed before cutting back on basic research.

Editor
July 23, 2014 5:44 am

Claude Harvey says:
July 23, 2014 at 5:13 am
> With Goddard Space Flight Center doing the calculating, the answers are preordained.
Please explain the connection between GSFC – http://www.nasa.gov/centers/goddard/home/ – and GISS – http://www.giss.nasa.gov/ – beyond their nasa.gov common parent. Do you have examples of GSFC preordaining answers from their current missions? See http://www.nasa.gov/centers/goddard/missions/index.html

Ted Clayton
July 23, 2014 6:00 am

It matters just what the spatial data acquisition capabilities, characteristics and limitations of the OCO sensors & sampling program are. I took a look around earlier to check on these factors, and was surprised that answers were not ‘on the shelf’ where expected. I.e., when we look at the capabilities of a Hubble Telescope and many other orbital or astronomical systems and spacecraft missions, these same kinds of questions arise … they determine just how much progress is supposed to be possible, using the newest tool. Perhaps I just didn’t look in the right places…
I did see mention, that data-points ‘see’ about 3 square kilometers or roughly a square mile of ground-surface. I also saw that it was considered marginally possible that emissions of larger individual cities or metropolitan areas could be documented. That’s important, since cities are obviously large emitters, but are supposedly difficult to sample/characterize, CO2-wise (mainly, evidently, because the gases waft away & are gone elsewhere, rather quickly … the discussion in this article about “plumes” suggests that this might be how it is planned to examine city-contributions (by mentioning “forest fires”, by proxy)).
The case/question of cities points to a crucial data-characteristic: Can OCO re-sample the same location on the planet surface, repeatedly? If it can, that means certain things, and if it can’t, that means other things. Can the satellite/sensor ‘aim’, even a little? Can it pick a spot? Can it ‘know’ that a recurring sample-location is approaching, and have itself set & ready?
Tracking backwards in time & space is ‘clever’, but cleverness is not a substitute for direct, real-time data-measurements. The back-tracking enterprise involves us in the same sort of assumptions & modeling exercises that we are tangled up in, trying to understand climate itself, from an overly-sparse and uncertain data-set. This is the basic problem we have with climate, now … and it’s what we should be trying to move away from & reduce, not embrace … not dig ourselves deeper into the cleverness-hole.
Cleverness is probably a disease of the scientific process & personality. It is certainly the prelude to illusions, delusions, fantasies and several DSM-4/5 conditions noted by shrinks. Then, we fall in love with the creations of our cleverness, and defend them far out of keeping with their factual worth.

July 23, 2014 6:04 am

Ric Werme says:
July 23, 2014 at 5:35 am
I’m disappointed with the several negative comments about OCO-2. Currently we have good CO2 data for a handful of sites around the world. Soon we’ll have decent data from many points around the world. Not perfect, but decent. If it tells us what to look at next, that’s fine.
,,,

I agree, especially if they provide the “un-fooled around with” data.
Not only will this show how well mixed CO2 really is, but it may also help lay to rest the existence of the “hot spot”.
I’m thinking we should be looking for the silver lining behind this CO2 cloud.

kenw
July 23, 2014 6:28 am

no one seems to have mentioned Big Brother potential…..

Editor
July 23, 2014 7:02 am

“If you get the transport piece [of the GCM] right, then you can understand the piece about sources and sinks,” she said. “More and better-quality data from OCO-2 are going to create better characterization of global carbon.”

Too bad that there is almost nothing of lesser significance than the sources and sinks and distribution of a minor climate player like incremental changes in CO2. Well, maybe it will be of some use when the planet starts to cool, the air gets drier, and CO2 becomes less redundant with water vapor, but even then it’s not like we’re going to be trying to alter natural sources and sinks. We’re just going to be spewing as much CO2 as we can and burning coal in the winter in a way that is dirty in the old fashioned sense of producing soot so as to continually dust the great white north with albedo-reducing black.

Ian W
July 23, 2014 7:50 am

Daniel H says:
July 23, 2014 at 4:53 am
Translation:
In the event that the raw data from OCO-2 fails to show that anthropogenic sources in Western nations are responsible for the bulk of CO2 emissions, we are already prepared to run it through a tweaked chemical transport model that can transform the data so that it correctly conforms to our hypothesis. This model transformation should be described in terms of Harry Potter time-travel magic so that it can be easily understood and accepted as scientifically valid by President Obama, Vice-President Biden, Secretary Kerry, Nancy Pelosi, and Barbara Boxer.
/sarc (I hope!)

No, that is the sequence of events followed by every other measurement system that the climate ‘scientists’ have had taxpayers fund. There is another possibility: that even the various homogenization algorithms and adjustments cannot torture the data enough, in which case, like Argo, it will be brushed under the carpet and become a system that dare not speak its name.

July 23, 2014 7:57 am

For those who question the “well mixed” point of CO2:
In 95% of the atmosphere, the CO2 levels are within 2% of full scale. That includes the huge seasonal variation (mainly in the NH) and the year-by-year variability in increase rate (mainly ENSO influences in the tropics). Taking into account that some 20% of all CO2 moves in and out of the atmosphere within a year, that is well mixed.
In 5% of the atmosphere, that is the first few hundred meters over land, CO2 is not well mixed, as there are huge sources and sinks at work which are not mixed fast enough with the rest of the atmosphere, especially under inversion. See e.g. the CO2 levels over different heights at Cabauw (The Netherlands), where the largest variations are near the ground, where plants, plant decay and humans absorb and emit large quantities of CO2:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/cabauw_day_week.jpg
What this satellite will do is make the measurements worldwide, but still intermittent and still only point measurements, not fluxes. To calculate fluxes, one needs a model and calibration with ground instruments like the above tall towers.
Another drawback is that the measurements are only possible during daylight, while plants respire a lot of CO2 at night…
About the human contribution: that is only possible for huge point sources (like power plants) and larger (industrial) areas and towns. The human contribution of ~4.5 ppmv/year is only ~0.01 ppmv/day and thus far below the detection limit of the satellite (~1 ppmv), except for concentrated sources…
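The detection-limit arithmetic in the comment above is easy to check. A quick sketch, taking the commenter's ~4.5 ppmv/year human contribution and ~1 ppmv instrument limit at face value (both are the comment's figures, not official OCO-2 specifications):

```python
# Check the commenter's back-of-envelope numbers (his figures, not
# official OCO-2 specifications).
human_ppmv_per_year = 4.5                 # claimed yearly human contribution
detection_limit_ppmv = 1.0                # claimed instrument detection limit

daily = human_ppmv_per_year / 365         # ppmv added per day
print(round(daily, 4))                    # about 0.0123 ppmv/day
print(daily < detection_limit_ppmv)       # far below the stated limit
```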

Allencic
July 23, 2014 8:03 am

No matter what the satellite sees, and no matter what it measures with regard to CO2, we know what the conclusions will be: “It’s worse than we thought, women and minorities affected most and it will take a new expensive tax to fix it.” There, ain’t climate science easy?

Tom J
July 23, 2014 8:04 am

This is truly clever. Genius, in fact. We all know that the only way the climate models can possibly be judged is by applying them to what we ‘know’ has occurred in the past since we can’t instantly transport ourselves to the future to see how well they’ve predicted it. And, what we’ve consistently found is that the climate models suck at predicting what the past has been. How to fix this incorrigible problem? Let’s get rid of the hard data points from the past and replace them with nebulous data points that can be subject to whatever interpretation we choose to apply just like the UN does with its ‘projections’ for future climate. So if a hard data point doesn’t fit the script, just replace it with a soft data point. Voila, problem solved. And we all know CO2 governs absolutely everything in the climate.
Ah, but let us not be cynical. Science would never ever lie or make up convenient data points. Especially when they’re doing so for “politicians (who) need answers quickly.” And, of course, those politicians would never dream of circumventing the Constitution by imposing the power hungry’s dream of an ex post facto law. Unless, of course, an emergency needed an answer quickly.
Clever indeed!

Phil.
July 23, 2014 8:19 am

JohnWho says:
July 23, 2014 at 6:04 am
Ric Werme says:
July 23, 2014 at 5:35 am
I’m disappointed with the several negative comments about OCO-2. Currently we have good CO2 data for a handful of sites around the world. Soon we’ll have decent data from many points around the world. Not perfect, but decent. If it tells us what to look at next, that’s fine.
,,,
I agree, especially if they provide the “un-fooled around with” data.

Well, the “un-fooled around with” data is the absorption spectral bands at 0.76, 1.61 and 2.06 microns; would you know what to do with that data if they gave it to you?

Ted Clayton
July 23, 2014 8:22 am

Alec Rawls @ July 23, 2014 at 7:02 am

Too bad that there is almost nothing of lesser significance than the sources and sinks and distribution of a minor climate player like incremental changes in CO2. … We’re just going to be spewing as much CO2 as we can and burning coal in the winter in a way that is dirty in the old fashioned sense of producing soot so as to continually dust the great white north with albedo-reducing black.

Power plants work more-efficiently, in cold climates (bigger Delta-T). As early as the beginning of the nuclear power era, special attention was placed on superconductors, since with lossless electrical transmission we could site nukes far away from NIMBYs, and where cooling-opportunities (always a special challenge with nuclear – no ‘exhaust’ to carry off waste-heat) were better . And we would get significantly more generating-capacity from our investment. The same could also be done/applies to fossil-fired plants.
The problem with melting unwanted snow & ice with ‘incidental’ smoke stack particulate is that it fails to meet the basic location & timing requirements of an effective dusting-program. Plus, belching black clouds represent inefficiencies of the plant. ‘Scatter-gun’ dusting is crude, but worse, dusting does no good when additional snowfalls are expected, since the dust will just be covered by new snow. Good dusting has to wait until after snow-season, and the sun returns.
Carbon-particulate manufacture & application was extremely important at an early stage of modern techno-civilization, and we have a wealth of nifty (‘largely forgotten’) sci-tech in this area. We can either prepare optimized materials (powders) that are loaded on large airplanes, or we can load liquid fuels onto planes which then carry well-designed soot-generators, or both. Fleets of these aircraft then hit the skies during a brief window in the spring.
In the off-season, these planes can also spread seed & fertilizer … and might be diverted to fight forest fire.

July 23, 2014 8:26 am

Ric Werme: “I can think of many much greater wastes of my tax dollars that should be addressed before cutting back on basic research.”
Me, I’m agnostic about the project, but as citizens we should not just recite “basic research” and take that as sufficient justification. I’m sure we could come up with a trillion dollars’ worth of basic research, all of which has the potential of producing knowledge beneficial to us. The “basic research” argument proves too much.
As I say, I’m agnostic, but Mr. Engelbeen’s comments regarding the density variation’s being largely restricted to the lowest altitudes, which the satellites cannot distinguish, and the proponents’ musings about turning the clock back (from higher to lower entropy) make me skeptical.

Samuel C Cogar
July 23, 2014 9:21 am

The Posted commentary states:
on July 23, 2014
NASA’s Orbiting Carbon Observatory-2, which launched on July 2, will soon be providing about 100,000 high-quality measurements each day of (atmospheric) carbon dioxide (CO2) concentrations from around the globe.
——————–
IMHO, anyone who believes the above is surely a likely candidate for also being an avid proponent of the magical powers of the Flying Spaghetti Monster.
Now I will retract the above claim if someone, anyone, can explain to me how it is possible for satellite “imagery” to “see” and/or “count” the number of CO2 molecules in any given or specific locale of the earth’s atmosphere.
Now the satellite measurement of near-surface molecular CO2 concentrations (quantities) is quite different from the higher-altitude satellite measurement of the infrared radiation whose source is thought to be emissions from CO2 molecules. I said “thought to be” because of the “overlap” in the absorption frequencies of CO2 and H2O, to wit:
Ref: CO2 and H2O Infrared (IR) absorption bands: http://www.randombio.com/spectra.png
And I offer the following excerpts from: http://profhorn.meteor.wisc.edu/wxwise/satmet/lesson8/GOESprof.html as reference facts, to wit:
The upwelling radiation sensed by a satellite sensor is governed by a) emission from the earth’s surface transmitted through the atmosphere and b) emission from the atmospheric layers transmitted through the outer layers of the atmosphere.
And …
Because the concentration of CO2 is nearly uniform in the atmosphere, the weighting functions specific to the CO2 absorbing bands show little variation with location. However, water vapor concentrations vary greatly from one location to another. The non-uniform concentration of water vapor in the atmosphere will cause the weighting functions specific to the H2O absorbing bands to vary by location.
———————-
Thus they are per se “subtracting” the noise of the H2O vapor to determine the effect of the CO2.
Now the above-stated “nearly uniform” is true for high-altitude measurements, but not for near-surface measurements, where the actual sources of the CO2 emissions into the atmosphere occur, and/or where the actual sinks that are absorbing CO2 from the atmosphere are situated.
And the next important fact is that there are extremely few, if any, low-altitude locales on the earth’s surface where the quantity (#) of CO2 molecules outnumbers the quantity (#) of H2O molecules. On the contrary, H2O molecules will normally outnumber CO2 molecules by a minimum factor of 10X. Except, of course, maybe directly over a huge mass of rotting garbage.
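[The “subtracting the H2O noise” point above amounts to solving a small linear system: with two spectral windows whose CO2/H2O sensitivities differ, both columns can be separated even when H2O dominates. The cross sections and columns below are invented for illustration; real retrievals fit many wavelengths at once, but the 2x2 case shows the principle. –mod]

```python
# Toy two-window separation of overlapping absorbers (invented numbers).

def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve [[a11, a12], [a21, a22]] @ [x, y] = [b1, b2] by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

# Hypothetical cross sections (cm^2/molecule) in two nearby micro-windows:
s_co2 = (3.0e-23, 0.5e-23)   # CO2 absorbs strongly in window 1
s_h2o = (0.2e-23, 2.5e-23)   # H2O absorbs strongly in window 2

true_Nco2, true_Nh2o = 8.0e21, 5.0e22   # pretend columns (H2O >> CO2)

# "Measured" optical depths: tau_i = s_co2[i]*N_co2 + s_h2o[i]*N_h2o
tau1 = s_co2[0] * true_Nco2 + s_h2o[0] * true_Nh2o
tau2 = s_co2[1] * true_Nco2 + s_h2o[1] * true_Nh2o

N_co2, N_h2o = solve_2x2(s_co2[0], s_h2o[0], s_co2[1], s_h2o[1], tau1, tau2)
print(N_co2, N_h2o)   # recovers both pretend columns
```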

more soylent green!
July 23, 2014 10:44 am

I wonder if we could surveil the official CO2 measurement sites (http://co2now.org/Know-CO2/CO2-Monitoring/co2-measuring-stations.html) and the surrounding areas and determine how well-mixed CO2 really is in those areas.

more soylent green!
July 23, 2014 11:07 am

It really sounds to me like they are overselling the capabilities of a single satellite. They do a sweep over an area and then attempt to trace the sources of CO2 backwards? This seems fraught with potential for error.
However, if they can run the model backwards, maybe they can help me figure out where I left the keys to the back shed? I know I had them just the other day…

July 23, 2014 11:59 am

more soylent green! says:
July 23, 2014 at 10:44 am
I wonder if we could surveil the official CO2 measurement sites (http://co2now.org/Know-CO2/CO2-Monitoring/co2-measuring-stations.html) and the surrounding areas and determine how well-mixed CO2 really is in those areas.
Quite good, as the raw measurements show.
The hourly averaged raw data from 4 stations used to be downloadable from NOAA, but the download location recently changed.
Anyway, here is a plot of the raw data at Mauna Loa and the South Pole for 2008, together with the “selected” daily and monthly averages:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/co2_mlo_spo_raw_select_2008.jpg
The outliers caused by volcanic vents and upwind conditions from the valleys (Mauna Loa) or mechanical problems (South Pole) are not used for averages.
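[The kind of selection Engelbeen describes can be sketched with a simple median filter. The rejection criterion here (0.5 ppm from the daily median) is a guess for illustration, not NOAA’s actual flagging procedure. –mod]

```python
import statistics

# Sketch of outlier rejection before daily averaging of hourly CO2 data.
# The 0.5 ppm threshold is illustrative, not NOAA's actual criterion.

def daily_average(hourly_ppm, max_dev=0.5):
    """Average hourly CO2 after rejecting values > max_dev ppm from the median."""
    med = statistics.median(hourly_ppm)
    kept = [v for v in hourly_ppm if abs(v - med) <= max_dev]
    return sum(kept) / len(kept)

# Six hourly values; 389.7 mimics a volcanic-vent or upslope-wind outlier:
hours = [385.2, 385.3, 385.1, 389.7, 385.2, 385.4]
print(round(daily_average(hours), 2))   # the outlier is excluded from the average
```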

July 23, 2014 1:06 pm


JohnWho says:
July 23, 2014 at 6:04 am
I agree, especially if they provide the “un-fooled around with” data.
Phil. says:
July 23, 2014 at 8:19 am
Well, the “un-fooled around with” data is the absorption in the spectral bands at 0.76, 1.61 and 2.06 microns. Would you know what to do with that data if they gave it to you?

Yes.
Provide it to people who do.
“Adjusted” data, when the adjustments aren’t fully revealed, often appears suspect, does it not?

Duster
July 23, 2014 2:58 pm

Crispin in Waterloo says:
July 23, 2014 at 4:45 am
. . .
Conclusion: CO2 is not well mixed in major urban areas.

Cool, that implies an “Urban Carbon Island” effect. Anthony’s next project.

catweazle666
July 23, 2014 5:09 pm

Ah, more computer games…
YAWN

Chuck Bradley
July 23, 2014 6:28 pm

I don’t understand all the doubt expressed in the comments. This is just the everyday egg unscrambling operation.

David Walton
July 23, 2014 6:40 pm

What, they didn’t include any methane detection diagnostics?

Down to Earth
July 23, 2014 9:49 pm

I invite criticism of this statement: “The OCO-2 readings will be flawed because of air-traffic exhaust emissions.”
After the Malaysian jet disaster, an air-traffic pattern map was shown on TV. It showed heavy traffic around Ukraine (lots of concentrated jet exhaust) and a big open airspace over Ukraine (no flights at all). It seems this would skew the CO2 readings of the surrounding area vs. Ukraine. So by extension, it seems CO2 readings could be skewed by traffic patterns around the world. Comments or thoughts?

Claude Harvey
July 23, 2014 9:53 pm

Re: Ric Werme says:
July 23, 2014 at 5:44 am
Claude Harvey says:
July 23, 2014 at 5:13 am
> With Goddard Space Flight Center doing the calculating, the answers are preordained.
“Please explain…”
I apologize for tarring other NASA agencies with the GISS brush. I should have said “With the Goddard Institute of Space Studies doing the calculating….”
CH

July 24, 2014 2:32 am

How will hourly monitoring help us better understand phenomena that take centuries to evolve in a significant way, e.g. the Little Ice Age or the Medieval Warming?

Samuel C Cogar
July 25, 2014 4:25 am

Claude Harvey says:
July 23, 2014 at 9:53 pm
With Goddard Space Flight Center doing the calculating, the answers are preordained.
—————–
Averages are like rowboats.
They rise and fall with the tidal forcing of the “ebb & flow” of either newly recorded temperatures or “adjustments” to the historical temperature records.