From the “if the government won’t visualize it, a climate skeptic will” department.
Guest essay by Erik Swenson
In July 2014, NASA launched its most advanced carbon dioxide monitoring satellite, the Orbiting Carbon Observatory-2 (OCO-2). The first OCO was lost during launch in 2009, when the rocket's payload fairing failed to separate and the satellite fell into the ocean. There has been a lot of anticipation regarding the data from this instrument. However, more than a year after its launch, little public information has been presented about its results. The only data NASA has made available are images of CO2 shown at an AGU 2014 session.
These images are shown below.
Figure 1: NASA-provided OCO-2 data for Oct 1 – Nov 11, 2014
Source: http://www.nasa.gov/sites/default/files/thumbnails/image/mainco2mappia18934.jpg
Figure 2: NASA-provided OCO-2 data for Nov 21 – Dec 27, 2014
Source: http://svs.gsfc.nasa.gov/vis/a010000/a011700/a011788/F5a.png
Back in May 2015, there was a release of some visualized data showing mixing ratios of CO2 over the oceans:
For whatever reason, NASA has not published any recent updates of the OCO-2 satellite data. Many people are interested in the OCO-2 data but have not been able to access it. NASA has now provided access to the raw data, but it is in the HDF file format, which common commercial programs such as Excel cannot read.
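For those who want to poke at the files themselves, here is a minimal sketch of pulling one variable out of a Lite file with the standard HDF5 C library. The file name is a placeholder, and the dataset path “xco2” is an assumption about the Lite file layout – verify the actual structure with h5dump before relying on it.

#include <hdf5.h>
#include <vector>
#include <cstdio>

int main() {
    // Open the (placeholder-named) Lite file read-only
    hid_t file = H5Fopen("oco2_lite_sample.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
    if (file < 0) { std::fprintf(stderr, "cannot open file\n"); return 1; }

    // "xco2" is assumed to be a 1-D dataset, one value per sounding
    hid_t dset  = H5Dopen2(file, "xco2", H5P_DEFAULT);
    hid_t space = H5Dget_space(dset);
    hsize_t n = 0;
    H5Sget_simple_extent_dims(space, &n, NULL);

    std::vector<float> xco2(n);
    H5Dread(dset, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL, H5P_DEFAULT, xco2.data());

    std::printf("%llu soundings, first xco2 = %.2f ppm\n",
                (unsigned long long)n, n ? xco2[0] : 0.0f);

    H5Sclose(space); H5Dclose(dset); H5Fclose(file);
    return 0;
}

Compile against the HDF5 library (e.g. g++ read_oco2.cpp -lhdf5).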
I have written a program to parse this data and graph it in a form that closely matches the NASA images. The data is available for 9/20/2014 – 9/22/2015 as of this writing. I have generated the plots in approximately six-week intervals; it takes about that much data to cover most of the globe with observations. You can see what the orbit path looks like in this NASA visualization story:

A few implementation notes:
The data from each sample is put into an array. Each sample is added to the array as a circular blob: the center point of the circle has a weight of 1 for the averaging function, and the remaining points in the circle are weighted in a decreasing manner away from the center. This choice is based on the NASA images, which show circular artifacts. (A sketch of this weighting appears after these notes.)
All of the images use the same min/max scale of 380 – 415 ppm. This does not give the best dynamic range for each individual image, but it presents a consistent range across all of them.
The NASA images are cropped beyond 60 degrees north and south latitude. I have chosen to show whatever data is there.
All data points from the OCO-2 Lite files are plotted regardless of warn_level, which is used to judge the quality of a sample. The OCO-2 Lite files say they contain the “high-quality” samples, so I chose to use them all.
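For reference, here is a compact sketch of the blob weighting described in the first note above (illustrative names only; the full routine is posted in the comments below):

#include <cmath>

// Accumulate one sounding into the weight and CO2 sums of nearby grid cells.
// The center cell gets weight 1; every other cell inside the circle gets
// avgRadius / distance, which tapers toward the edge of the blob.
// The caller must keep x +/- avgRadius and y +/- avgRadius inside the grid.
void splat(float* weightSum, float* co2Sum, int gridWidth,
           int x, int y, float xco2, int avgRadius) {
    for (int dY = -avgRadius; dY <= avgRadius; dY++) {
        for (int dX = -avgRadius; dX <= avgRadius; dX++) {
            float dist = std::sqrt(float(dX * dX + dY * dY));
            if (dist > avgRadius) continue; // outside the circular blob
            float w = (dist == 0.0f) ? 1.0f : avgRadius / dist;
            int idx = (x + dX) + (y + dY) * gridWidth;
            weightSum[idx] += w;
            co2Sum[idx] += w * xco2;
        }
    }
}

After all soundings are splatted, the displayed value for a cell is simply co2Sum[idx] / weightSum[idx].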
The data used for these images is from the OCO-2-Lite v7 data set. It can be accessed here:
https://co2.jpl.nasa.gov/#mission=OCO-2
Finished visualizations
The data here is presented without comment. I will leave it to others to decide what this data means. So, without further ado – here is the data I have processed.
Figure 3: Processed data from Oct 1 – Nov 11, 2014
Figure 3 is an attempt to reproduce the first NASA image (Oct 1 – Nov 11, 2014) to see how closely my algorithm matches it. Note that NASA has adjusted the data set multiple times since that image was released; the current version is v7, and I am not sure what changes have been made to the data.
Figure 4: Processed data from Nov 16 – Dec 31, 2014
Figure 5: Processed data from Jan 1 – Feb 15, 2015
Figure 6: Processed data from Feb 16 – Mar 31, 2015
Figure 7: Processed data from Apr 1 – May 15, 2015
Figure 8: Processed data from May 16 – Jun 30, 2015
Figure 9: Processed data from Jul 1 – Aug 15, 2015
Figure 10: Processed data from Aug 16 – Sep 12, 2015
UPDATE: Erik Swenson provides this map in comments showing CO2 over the entire year, from September 2014 to October 2015 – Anthony

Also, reader “edimbukvarevic” provides this map of anthropogenic CO2 emissions for comparison:
Well that answers my question; I'd forgotten I read that.
The doc says that anything above WL19 is not reliable and has heavy biases. Just plotting everything regardless is not really a valid approach.
Sorry to rain on the party, but the N. Atl. ‘hotspot’ in the ‘annual average’ is totally spurious since it only contains summer-centred months. Glowing boreal forests may well be a similar issue and will get cut out by excluding unreliable data: WL >= 19. The equatorial African hotspot is also dubious data and very sparse.
The only remaining feature seems to be the Chinese smokestack. Alarmists will now all boycott Chinese goods and we'll get our jobs back. Hooray!
Hi Mike – I am fairly sure the Lite data set only contains warn levels 0–19.
I spot checked a few of the images. There is very little data above warn_level 15. The cutoff is right about there, which is where NASA says the data can start to have high errors.
I certainly hope NASA gets their act together and publishes their version of these images. I make no claim to understand all the caveats in their data stream. I only tried to publish the Lite data, which has gone through NASA's filtering at least to some level. The general themes in the images are probably sound, since they generally agree with the first image NASA released. The second Nov/Dec image looks very different in the NH, and I am not sure what is going on there.
On what basis would the “glowing boreal forests” be a similar issue? They clearly show (albeit very differently) in both winter and summer.
.gov's goals are clear when it comes to AGW: replacing a free market with a governed one. This, whatever the good intentions, most often ends with the same or a worse result.
An example could be a state's tax on gasoline: if market volume declines in any one locale, .gov will ensure that an abundance is necessarily spent driving around in circles burning it, to keep the levels running. The more overburdened spenders in the populace, the better.
The same can't be said of truly free markets: if one country starts producing more, with byproduct CO2, than another, the former would need to cut its production capacity due to market competition.
AGW guarantees a system will run at constant nominal levels to produce and maintain the tax burden required by those governing (spinning circles), the lag of which Earth may or may not be able to handle in all eventuality.
Steve P
I was originally looking at V6 – the links I have show it's been removed and replaced by V7 on the FTP site, and documents which were residing there have disappeared – which is actually quite a cheek imho. I hope this doesn't indicate some constructive unhelpfulness 🙂
The Data User Guide (pdf) has some stuff worth reading about data integrity.
As I understand it, OCO-2 is continually ground-truthed against terrestrial measurements at known fixed locations so instrument drift can be evaluated.
Tnx for the pdf link tomo.
It was like a hammer to the head.
I needed it.
They have gone to great lengths to show you how accurately they can measure the snapshot in time. At first review it looks like they attain terrific accuracy. If there are flaws, they will improve them.
So why is this so important?
If I tell you that watering your lawn is illegal and I can’t track it, I can’t make you pay the fine.
For the first time, there is a reliable snapshot of potential CO2 violators, just in time for Paris. Next step is the bill in the mail which of course can be offset with carbon tax credits run by market makers.
Options
1. Don’t pay and risk being sued in international court.
2. Don’t pay and risk being put on a no trade list.
3. Don’t pay and risk more severe sanctions.
4. Don’t pay and risk military escalation.
Will people really be willing to let their sons and daughters die if the above escalation happens? My child died fighting in the first Carbon War?
And there ya have it. CAGW is the greatest risk mankind faces. Not because we will make life unsustainable, but because we will fight wars to enforce the tax.
Sick, twisted, rotten to the core.
Thanks tomo,
I've been looking at the Data User Guide linked from the OCO-2 Documentation page. The only thing that jumped out at me (wrt high-latitude imaging) was this blurb, from the list of figures:
Figure 2-1. Nadir, glint, and target observations. (a) Nadir observations are acquired over the sunlit hemisphere at latitudes where the surface solar zenith angle is <85°. On all orbits except downlink orbits, as the Observatory passes over the northern terminator, it pitches up to point the instrument aperture at the Sun for solar radiometric calibrations. (b) Glint observations are made at latitudes where the solar zenith angle at the apparent glint spot is less than ~75°. (c) For target observations, the spacecraft scans the instrument across a stationary surface target as it flies overhead.
http://disc.sci.gsfc.nasa.gov/OCO-2/documentation/oco-2-v6/OCO2_DUG.V6.pdf
and:
For OCO, the nominal plan was to switch from Nadir to Glint observations on alternate 16-day global ground-track repeat cycles so that the entire Earth is mapped in each mode every 32 days. A similar approach has been adopted for OCO-2. Comparisons between Nadir and Glint observations will provide opportunities to identify and correct for biases introduced by the viewing geometry. Target observation will be acquired over an OCO-2 validation site roughly once each day.
https://directory.eoportal.org/web/eoportal/satellite-missions/o/oco-2
I suppose there may be a more precise explanation in the DUG, but I don’t have the energy right now to wade more deeply into it. Volunteers encouraged.
It is clear that OCO-2 is a work in progress and that measurement anomalies will be encountered.
One cannot escape the perception that the delay in presenting gridded data is highly political – detailed analysis of the data is something that will come with time…
What is completely unacceptable is that challenging data has effectively been hidden from scrutiny – particularly in light of the PR brouhaha that accompanied the project launch.
I hope that the OCO-2 team has been diligent and honest (no reason to suspect otherwise – the delay is, I suspect, orchestrated by higher-ups…) – others in the climate farce are not, and will seek to twist, obfuscate and so on.
What seems clear is that CO2 in the atmosphere is a good deal more complicated than the modelers claimed, and that they missed a bulging shedload of stuff.
Settled science? erm… nope.
OCO-2 Data V7 README (pdf)
And finally
The OCO-2 Documentation Page Which is quite comprehensive
usurbrain: “Might want to look into the thousands of coal fires that are burning underground… why no effort to extinguish those fires?”
Geologist E. Kirsten Peters, in her book “The Whole Story of Climate”, states: “Geologists estimate up to 200 million tons of coal burn each year in China in these unwanted blazes, accounting for up to 10% of national coal production.” India, Africa, Australia and Pennsylvania also have their share.

I am not a scientist, merely a lifelong gardener interested in weather & climate, who believes that if the IPCC & its fellow travelers were truly concerned about the impact of CO2 on planet Earth, much would be made by them, in effort & money, to put out as many of these fires as can be extinguished. It's acknowledged that not all can be put out, as it's nigh impossible to locate all the oxygen sources feeding the fires. That the topic isn't brought up by them is highly suspicious. So if you folks despair that ordinary folks are duped by Gore & the IPCC, take heart. Many of us aren't!!!

Point of curiosity: in assessing the periodic rise & fall of CO2 in the atmosphere, is it worth taking into account the increase in the sheer numbers of leaves trees put on following heavy rains? Trees use extra leaves to transpire water out through the leaf stomata to remove water from water-logged soil, since their roots need contact with oxygen in the soil rather than staying waterlogged. When that has been accomplished, they shed the extra leaves so as not to dry the area out too much. Following the central TX May and June flooding this spring, leaf fall was almost equal to fall.

Can't thank you enough for WUWT, even when I don't understand all of what is discussed.
Ginger
I've been reading years' worth of posts and comments at WUWT and am struck by the simple honesty and, dare I say, tenderness of your post. It's reassuring that good people such as yourself understand that CAGW is but a con, a hoax. Talk amongst your friends. The CAGW movement has no validated science to prove itself, and so is using politics, shady business deals and corrupt scientists to promote its cause.
In the not-too-distant future, the movement is going to be applying a carbon tax on countries that do not meet CO2 reduction goals. These costs will be passed on to citizens. The program will likely be enforced through the UN in the form of sanctions and preferred trade deals. The UN will be supported by the USA, England, and Germany. China and Russia will not object, for now, because they have been allowed to finance coal plants for non-first-tier nations as well as finance the development of their resources. The IMF will no longer invest in business models that are considered high carbon output, such as coal-fired power.
Casualties of this movement will include more deserving issues going unaddressed, such as the burning coal mines. The real danger is that standards of living will decrease in the nations that pay, and tempers will begin to flare. Countries don't like sanctions, and eventually people will grow tired of forking over unnecessary payments for things that don't matter.
So, please, spread your gentle wisdom in your circle.
gymnosperm:
That's not true. Look at Erik's fig 4: Nov–Dec 2014. There is nothing above the Canadian border.
There may well be some important info about how those forests function, but the ‘annual’ average is incomplete and thus biased toward certain months. It WILL be misleading.
What is shown about these forests is definitely worth closer investigation, but I'm warning against all those jumping to conclusions based on a biased and incorrect annual average.
Erik Swenson
Thanks for the reply Erik.
I was looking at their document:
“Warn Level, Bias Correction, and Lite File Product Description”
For example Figure 5 – Average Warn Level Map
Central Africa and Amazonia seem almost void of data (??), and the Russian and Canadian zones are full of very high WL data.
I think the fact that you have done this may prompt some action; congratulations. BTW, if you could post some code, I may be tempted to have a poke myself (I don't have time to start from scratch).
I think your original post is good and is sufficiently close to NASA output. The annual mean is a great idea but needs to be corrected to only include zones that have data for each month.
If you have time to do that, I think it would be well worth it, to avoid people jumping to false conclusions. At that point it would be good to ask our host to add that graph as an update to your article.
Thanks again for this initiative, very interesting.
Many thanks to Erik again. I agree that the annual mean map should be corrected to only include areas with complete annual data, otherwise it’s misleading.
You are welcome. I appreciate all the comments from everyone. It has been a great response. I will try to come up with a good way to do that.
Ok, I think I have this working. Data is only shown if a data point (or the averaged point in the circle) is present in all 12 months. In addition, I filtered to warn_level 15 or below (although I don’t think that filtered much of anything).
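For the curious, the masking rule amounts to only a few lines (a sketch with illustrative names, not my exact code): keep one weight-sum grid per month and draw a cell only if its weight is non-zero in every month.

#include <vector>

// True if a grid cell received data in all 12 monthly accumulation grids.
// monthlyWeight holds one weight-sum grid per month; cell is the grid index.
bool presentAllYear(const std::vector<std::vector<float>>& monthlyWeight, int cell) {
    for (const auto& month : monthlyWeight)
        if (month[cell] <= 0.0f) return false; // no data that month: mask out
    return true;
}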
I have two time spans, from 09/06/2014 – 09/05/2015 and another from 09/29/2014 – 09/28/2015. Interestingly, the CO2 levels are visibly higher when using the late September data.
http://i59.tinypic.com/api6qd.png
http://i60.tinypic.com/ei0aop.png
You can see how much CO2 has risen year over year in these next two pics.
http://i58.tinypic.com/110he2w.png
http://i61.tinypic.com/2cd7f9d.png
Thank you very much Mike for your insights. I will clean up my code a little and post it somewhere.
Since you seem to know a thing or two about the filters, here is the code I used. Hopefully with no serious flaws 🙂
// Store the number of 'hits' in location 0 of the vector
// Store the additive warn and co2 levels in locations 1 and 2 respectively
for (int dY = -avgRadius; dY <= avgRadius; dY += 1) {
    // A little Pythagoras anyone? x^2 = hyp^2 - dY^2
    int minX = sqrt(avgRadius * avgRadius - dY * dY);
    for (int dX = -minX; dX <= minX; dX += 1) {
        float dist = sqrt(dY * dY + dX * dX);
        // Use the full value at the center (and prevent div by 0)
        if (dX == 0 && dY == 0) {
            texDataA[((x + dX) + (y + dY) * gridWidth) * 4 + 0]++;
            texDataA[((x + dX) + (y + dY) * gridWidth) * 4 + 1] += warn;
            texDataA[((x + dX) + (y + dY) * gridWidth) * 4 + 2] += xco2;
        } else {
            texDataA[((x + dX) + (y + dY) * gridWidth) * 4 + 0] += avgRadius / dist;
            texDataA[((x + dX) + (y + dY) * gridWidth) * 4 + 1] += (avgRadius / dist) * warn;
            texDataA[((x + dX) + (y + dY) * gridWidth) * 4 + 2] += (avgRadius / dist) * xco2;
        }
    }
}

// For display, compute the average value for each location
for (int y = 0; y < gridHeight; y++) {
    for (int x = 0; x < gridWidth; x++) {
        // Only cells that received at least one hit get an average
        if (texDataA[(x + y * gridWidth) * 4 + 0] > 0) {
            texDataA[(x + y * gridWidth) * 4 + 3] =
                texDataA[(x + y * gridWidth) * 4 + 2] / texDataA[(x + y * gridWidth) * 4 + 0];
            float xco2 = texDataA[(x + y * gridWidth) * 4 + 3];
            if (xco2 > maxCo2) maxCo2 = xco2;
            if (xco2 < minCo2) minCo2 = xco2;
        }
    }
}

BTW it is interesting that you use a tapered blob intensity, whereas NASA uses a uniform circle.
When the results for each observation are overlain, this is effectively doing a spatially weighted average, i.e. a spatial filter.
In the NASA case this is a running average over the width of their blob. Your method uses a weighting that tapers from the center, which is often referred to loosely as a ‘tent’ filter or tent kernel.
This is a crude filter but far better than a running average. This is why your rendition of central Africa is far better than the NASA fuzz. Nice work; maybe they will learn from it.
Re: Figure 8: Processed data from May 16 – Jun 30, 2015
This shows how little mixing there is between NH and SH CO2 in the short term.
Figure 7: Processed data from Apr 1 – May 15, 2015
The annual cycle is dominated by boreal forests and, surprisingly, N. Atlantic out-gassing. I would guess that this early land output is microbial activity (or termites!) happening before the trees wake up fully.
This cannot be assumed to be net zero over the year. Sadly, OCO-2 will not tell us about that because it does not have full annual coverage in this region.
BTW, the annual average of the SH is more reliable since, despite similar coverage problems, the annual variation is small. The Southern Ocean is a huge sink.
Figure 5: Processed data from Jan 1 – Feb 15, 2015
Middle of NH winter. Here we see the clearest indication of human emissions: US east coast, central Europe and China.
In a warmer world, we’ll need less heating 🙂
Mike, I disagree. See figs 9 and 10. Where are the human emissions? Power and industrial plants don't shut down, for sure. These maps confirm that human emission fluxes are very small compared to the natural ones.
I tend to agree that fig 5 is just the beginning of the US SE warming up. You can see in the subsequent images that the rest of the forested regions light up. I think it is also apparent in the yearly average that the zones tend to follow the world map of forests.
I think the Fall images are telling, though. There is very little activity anywhere in the NH. If the estimates of emissions vs. natural CO2 are correct (about 5%), I think it would be very hard to see the emissions in the current OCO-2 data.
The very purpose of OCO-2 is an attempt to detect the anthropogenic portion of CO2 within the global biogeochemical fluxes. The analysis team will spare no expense trying to extract the “fossil fuel” emission signal into a “dramatic” display. It’s part of the mission statement in the user guide.
Erik, very high compliments for your work on this. The estimate of 5 percent anthropogenic CO2 is located within the OCO-2 user guide but is not supported by facts. The IPCC and everyone else say A-CO2 is about 3 percent of the natural CO2 global atmospheric flux, i.e. the global biogeochemical carbon cycle is at least 33 times larger than the human addition. The forest observations are great, but don't forget the soil biology under the forests. Soil temperature is highly important to biological activity and is notoriously non-linear.
Anthony – sorry to bug you, but can we please get the images updated here to the proper scale? They are starting to show up in Google's image searches (yay!), but the ones they have are the wrong ones. I emailed you the corrected versions. I would like to scrub the comments to eliminate the bogus charts too, if we can.
Thanks,
Erik
I made a video of the full-year data as it is built up. It is pretty interesting to watch. You have only been able to see the month-to-month variance, while I have been seeing the daily buildups. As the YT description says, this is only for warn_level 15 and below data. As well, only points (and their averages) that exist in all 12 months are drawn. So, you can see the full cycle over a year, with the ebb and flow of CO2.
Engelbeen,
If I belabor the point I apologize, but:
Please explain, in the history of our planet, how a new CO2 sink could simultaneously arrive with a new CO2 source. And particularly after Einstein denied the concept of simultaneity – sorry, non-sequitur there. But seriously: everything has been going along swimmingly for many thousands of years, and all of a sudden the Industrial Revolution, and also all of a sudden, coincidentally (which Einstein also denied), a new natural CO2 sink?
My favorite tennis player, John McEnroe, ” You cannot be serious!!!”
I think not, my friend…
Michael,
There is no “new” sink, only a new source, human emissions:
Before human emissions, the CO2 levels in the atmosphere were mainly governed by temperature, as can be seen in ice cores: about 8 ppmv/K change over glacial and interglacial intervals. As the ice core temperature proxy (either dD or d18O) mainly reflects local temperature changes in Antarctica, the hemispheric/global temperature changes were about half that, and the sensitivity of CO2 to temperature changes is about 16 ppmv/K (regardless of the lags involved).
The literature shows a pCO2 response of seawater to temperature of between 4-17 ppmv/K. 16 ppmv/K is within that range, and most of the change seems to be a reaction of ocean CO2 to ocean surface temperature changes. This is confirmed by the very small changes in the 13C/12C ratio over the glacial/interglacial periods. If vegetation were leading, there would be huge changes in the 13C/12C ratio, opposite to the CO2 changes.
What happens if you increase the CO2 pressure in the atmosphere (no matter whether from volcanoes or from humans)? Henry's law says that for any temperature there is a unique ratio between the concentration of a soluble gas in the atmosphere and in the liquid. In that case, the partial pressures of the gas in the atmosphere and the liquid are equal.
If the partial pressure of CO2 (pCO2) changes in the oceans (by temperature) or the atmosphere (by humans), there will be a flux of CO2 in or out, in proportion to the difference in pCO2 between atmosphere and ocean surface.
The pCO2 of the atmosphere increased by 110 ppmv (that is about the same as partial pressure in μatm – the difference is water vapor; ppmv is for dry air) above the “normal” dynamic equilibrium pressure for the current (area-weighted) average temperature of the ocean surface. That means there is an increasing flux of CO2 from the atmosphere into the oceans, in proportion to the increase in the atmosphere. That is your “new” sink, which is not new at all, but has worked over the millennia to remove any extra CO2 from huge forest fires or huge volcanic eruptions, mostly fast enough to be immeasurable in the smoothed ice core history. Human emissions, meanwhile, are faster than what the oceans (and similarly vegetation) can remove in the same year that they are released, so they pile up in the atmosphere…
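Written compactly (the transfer coefficient $k$ is left unspecified here, as it varies with wind speed and temperature):

$$F_{\mathrm{net}} = k\,\bigl(pCO_2^{\mathrm{atm}} - pCO_2^{\mathrm{eq}}(T)\bigr) \approx k \times 110\ \mu\mathrm{atm}$$

where $pCO_2^{\mathrm{eq}}(T)$ is the dynamic-equilibrium value set by the (area-weighted) average ocean surface temperature, so the net flux is currently into the ocean and grows as the atmospheric excess grows.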
The mass of the oceans is over 800x that of the atmosphere, which makes the oceans roughly 800/0.04% – about two million times – more massive than the CO2 in the atmosphere. Just exactly how much CO2 can the oceans absorb, and just exactly how fast? Since no one knows, your theory is facile, un-falsifiable. Scientific? Maybe…
Michael Moon October 7, 2015 at 1:24 pm
The mass of the oceans is over 800x that of the atmosphere.
But most of the ocean is not in contact with the atmosphere; the part that is, is termed the ocean mixed layer.
Michael,
Far more is known about the CO2 exchanges between atmosphere and oceans than you think. About 1000 GtC is in the “mixed layer”, the upper few hundred meters of the oceans, where wind and waves make a fast exchange (half-life of less than a year) possible with the 800 GtC in the atmosphere. Due to ocean chemistry, the Revelle/buffer factor means that the ocean's mixed layer follows CO2 changes in the atmosphere at about 10%. Henry's law says 100%, but Henry's law applies only to CO2 gas in the liquid, which in seawater is only 1% of all carbon; the rest is about 90% bicarbonates and 9% carbonates. These are in chemical equilibrium with each other, where a change in one of them results in changes in all the others. The net result is that a 100% change in the atmosphere gives a 100% change in free CO2 in the oceans, but only a 10% change in the total carbon of the ocean waters.
The 30% change in the atmosphere in the past 160 years is thus good for a 3% change in the ocean's mixed layer, thus an increase of only 30 GtC in the 1000 GtC present, and a very slight decrease in pH (less than 0.1 pH unit).
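The arithmetic, written out, with a Revelle/buffer factor $RF \approx 10$:

$$\Delta C_{\mathrm{mixed}} \approx C_{\mathrm{mixed}} \times \frac{1}{RF} \times \frac{\Delta pCO_2}{pCO_2} = 1000\ \mathrm{GtC} \times \frac{1}{10} \times 0.30 = 30\ \mathrm{GtC}$$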
The deep oceans have a much larger capacity, but there is little exchange between the surface layer (and thus the atmosphere) and the deep oceans. The only exchanges are via the sinks near the poles and the returning upwelling near the (mainly Pacific) coasts – near 5% of the ocean area in each direction. That is a much slower process, and while the capacity is huge, the speed of uptake can't cope with human emissions.
Some figures: with the current 110 ppmv extra pressure in the atmosphere, some 0.5 GtC/year is the increase in the ocean mixed layer (only as a result of the extra increase per year in the atmosphere, as the surface layer is near saturation), about 3 GtC/year is absorbed by the deep oceans, and about 1 GtC/year by the biosphere. Human emissions are ~9 GtC/year… The latter two sinks are not saturated and don't show any sign of reduction in uptake speed (contrary to the IPCC's Bern model).
Well then why is the annual increase in CO2 not the exact same fraction of human emissions each year, why in fact is it dropping as a percentage of human emissions? Emissions go up, the percentage absorbed goes down, and the annual increase has decreased dramatically as emissions increase. There is no correlation. The oceans outgas CO2 in tremendous quantities each year, and also absorb tremendous quantities. And, of course, plankton dies and sinks to the bottom, pulling carbon out of the cycle. I think this is complex, and the assumption that our emissions disturb the “balance” that existed before is simplistic.
Michael,
Well then why is the annual increase in CO2 not the exact same fraction of human emissions each year, why in fact is it dropping as a percentage of human emissions?
Straightforward process control: the sinks do not depend on the momentary human emissions; they depend on the total extra pressure of CO2 in the atmosphere above the dynamic equilibrium (“steady state”). The latter depends on temperature and other natural influences which affect the uptake speed (Pinatubo, El Niño). If humans emitted on average around 4.5 GtC/year, half of the current emissions, the levels in the atmosphere would stay about the same, as human emissions and the net sink rate would be near equal. That is besides the natural variability, which ranges from 10% to 90% of human emissions from one year to the next, and about 40-60% over decadal periods.
In the period 1985-1995 the % increase was also lower (around 40%) due to Pinatubo; in that period the sink rate was about 60% of human emissions, as in the current period.
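To put that in the simplest possible form (a single first-order sink is of course an approximation), the sink rate is proportional to the excess above equilibrium:

$$\frac{dC}{dt} = E(t) - \frac{C - C_{\mathrm{eq}}}{\tau}, \qquad \tau \approx \frac{110\ \mathrm{ppmv}}{4.5\ \mathrm{GtC/yr} \div 2.13\ \mathrm{GtC/ppmv}} \approx 52\ \mathrm{yr}$$

using the ~110 ppmv excess and ~4.5 GtC/year net sink quoted above (1 ppmv ≈ 2.13 GtC). With constant emissions of E ≈ 4.5 GtC/year, the right-hand side is zero and the level plateaus – the “half of the current emissions” case.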
There doesn't need to be a ‘new’ sink, just the old one absorbing more. Henry's law says that the amount absorbed by the ocean increases as the pCO2 in the atmosphere increases; similarly with photosynthesis.
Erik Swenson:
Nice work! Thanks – but…
1) Your year-average chart combines well with the periodic stuff to suggest that what we're seeing is a series of snapshots from a mixing process. Given the 16-day mapping period, we would need additional data about air movements before we could interpret what a reading taken on one orbital path really tells us about the source and volume of the gas triggering the readings.
2) In addition, I would want to know a lot more about the variability of readings from different altitudes before believing anything the pictures might suggest about the distribution, density, and sourcing of the material.
Hi Paul,
The assumption is that NASA has already done all the major accounting and corrections in this data. These are not the raw CO2 values, but the adjusted and calibrated data. As far as winds go, it does not appear that the winds carry things very far from their point of origin, if you watch the animations.
For any further processing/analysis, we will have to wait for NASA or the scientists that are looking at OCO-2 data for results. I am sure we are going to see a lot of papers published on this data. People are probably looking at the relationships of this data with other geo data like wind, pressure, etc. At least I hope so. 🙂
Eric…. just fantastic work – your last sentence….
I’ve asked (via Twitter) if nullschool.net will be integrating OCO-2 data at all – which would be erm.. interesting 🙂
Thank you Eric. The dominance of natural flows is actually found in the IPCC carbon budget if you take their stated uncertainties into account when doing the budget. Please see
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2654191
If not mentioned above – MATLAB reads HDF data of various kinds and can process it in many ways too: http://se.mathworks.com/help/matlab/ref/hdfread.html