Guest Post by Willis Eschenbach
Well, I was going to write about hourly albedo changes, honest I was, but as is often the case I got sidetracked. My great thanks to Joanne Nova for highlighting a mostly unknown paper on the error estimate for the Argo dataset entitled "On the accuracy of North Atlantic temperature and heat storage fields from Argo" by R. E. Hadfield et al., hereinafter Hadfield2007. As a bit of history, three years ago in a post entitled "Decimals of Precision" I pointed out inconsistencies in the prevailing Argo error estimates. My calculations in that post showed that their claims of accuracy were way overblown.
The claims of precision at the time, which are unchanged today, can be seen in Figure 1(a) below from the paper "Observed changes in top-of-the-atmosphere radiation and upper-ocean heating consistent within uncertainty" by Norman G. Loeb et al., paywalled here, hereinafter Loeb2012.
Figure 1. This shows Fig. 1(a) from Loeb2012. ORIGINAL CAPTION: a, Annual global averaged upper-ocean warming rates computed from first differences of the Pacific Marine Environmental Laboratory/Jet Propulsion Laboratory/Joint Institute for Marine and Atmospheric Research (PMEL/JPL/JIMAR), NODC, and Hadley, 0–700 m
I must apologize for the quality of the graphics, but sadly the document is paywalled. It’s OK, I just wanted to see their error estimates.
As you can see, Loeb2012 is showing the oceanic heating rates in watts per square metre applied over each year. All three groups report about the same size of error. The error in the earliest data is about 1 W/m2. However, the size of the error starts to decrease once the Argo buoys come on line in 2006. At the end of their record all three groups are showing errors well under half a watt per square metre.
Figure 2. This shows Fig. 3(a) from Loeb2012. Black shows the available heat for storage as shown by the CERES satellite data. Blue shows heating rates to 1800 metres, and red shows heating rates to 700 metres. ORIGINAL CAPTION: a, Global annual average (July to June) net TOA flux from CERES observations (based on the EBAF-TOA_Ed2.6 product) and 0–700 and 0–1,800 m ocean heating rates from PMEL/JPL/JIMAR
Here we see that at the end of their dataset the error for the 1800 metre deep layer was also under half a watt per square metre.
But how much temperature change does that half-watt per square metre error represent? My rule of thumb is simple.
One watt per square metre for one year warms one cubic metre of the ocean by 8°C
(Yeah, it’s actually 8.15°C, but I do lots of general calcs, so a couple of percent error is OK for ease of calculation and memory). That means a half watt for a year is 4°C per cubic metre.
So … for an 1800 metre deep layer of water, Loeb2012 is saying the standard error of their temperature measurements is 4°C / 1800 = about two thousandths of a degree C (0.002°C). For the shallower 700 metre layer, since the forcing error is the same but the mass is smaller, the same error in W/m2 gives a larger temperature error of 4°C / 700, which equals a whopping temperature error of six thousandths of a degree C (0.006°C).
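For those who want to check the arithmetic in R, here's a quick sketch. The seawater density and specific heat are round assumed values of my choosing, which is why it lands near 8°C rather than exactly 8.15°C:

# Sanity check of the rule of thumb, and the layer conversions.
# ASSUMED round values: seawater density ~1025 kg/m^3, specific heat ~3850 J/(kg K).
secs_per_year = 365.25 * 24 * 3600   # ~3.156e7 seconds
joules = 1 * secs_per_year           # energy delivered by 1 W/m2 over one year, J per m2
heat_cap_m3 = 1025 * 3850            # joules to warm 1 cubic metre of seawater by 1 degC
joules / heat_cap_m3                 # ~8 degC per cubic metre, per W/m2 per year
0.5 * 8 / 1800                       # ~0.002 degC error for the 0-1800 m layer
0.5 * 8 / 700                        # ~0.006 degC error for the 0-700 m layer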
I said at that time that this claimed accuracy, somewhere around five thousandths of a degree (0.005°C), was … well … highly unlikely.
Jo Nova points out that, curiously, although the paper was written back in 2007, it got little traction at the time and has gotten little since. I certainly hadn't read it when I wrote my post cited above. The following paragraphs from their study are of interest:
Using OCCAM subsampled to typical Argo sampling density, it is found that outside of the western boundary, the mixed layer monthly heat storage in the subtropical North Atlantic has a sampling error of 10–20 Wm−2 when averaged over a 10° x 10° area. This error reduces to less than 10 Wm−2 when seasonal heat storage is considered. Errors of this magnitude suggest that the Argo dataset is of use for investigating variability in mixed layer heat storage on interannual timescales. However, the expected sampling error increases to more than 50 Wm−2 in the Gulf Stream region and north of 40°N, limiting the use of Argo in these areas.
Our analysis of subsampled temperature fields from the OCCAM model has shown that in the subtropical North Atlantic, the Argo project provides temperature data at a spatial and temporal resolution that results in a sampling uncertainty in mixed layer heat storage of order 10–20 Wm−2. The error gets smaller as the period considered increases and at seasonal [annual] timescales is reduced to 7 ± 1.5 Wm−2. Within the Gulf Stream and subpolar regions, the sampling errors are much larger and thus the Argo dataset will be less useful in these regions for investigating variability in the mixed layer heat storage.
Once again I wanted to convert their units of W/m2 to a temperature change. The problem I have with the units many of these papers use is that "7 ± 1.5 Wm−2" just doesn't mean much to me. In addition, the Argo buoys are not measuring W/m2; they're measuring temperatures, which are then converted to W/m2. So my question upon reading the paper was, how much will their cited error of 7 W/m2 for one year change the temperature of the "mixed layer" of the North Atlantic? And what is the mixed layer anyhow?
Well, they've picked a kind of curious thing to measure. The "mixed layer" is the top layer of the ocean that is mixed by both the wind and by the nightly overturning of the ocean. It is of interest in a climate sense because it's the part of the ocean that responds to the changing temperatures above. It can be defined numerically in a number of ways. Basically, it's the layer from the surface down to the "thermocline", the point where the ocean starts cooling rapidly with depth. Jayne Doucette of the Woods Hole Oceanographic Institution has made a lovely drawing of most of the things that go on in the mixed layer. [For unknown reasons she's omitted one of the most important circulations, the nightly overturning of the upper ocean.]
According to the paper, the definition that they have chosen is that the mixed layer is the depth at which the ocean is 0.2°C cooler than the temperature at ten metres depth. OK, no problem, that’s one of the standard definitions … but how deep is the mixed layer?
Well, the problem is that the mixed layer depth varies by both location and time of year. Figure 4 shows typical variations in the depth of the mixed layer at a single location by month.
Figure 4. Typical variations of the depth of the mixed layer by month. Sorry, no provenance for the graph other than Wiki. Given the temperatures I’m guessing North Atlantic. In any case, it is entirely representative of the species.
You can see how the temperature is almost the same all the way down to the thermocline, and then starts dropping rapidly.
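To make the 0.2°C criterion concrete, here's a little R sketch applying it to a made-up temperature profile. The numbers are invented purely for illustration, not taken from the paper:

# mixed layer depth: the shallowest depth at which the water is
# 0.2 degC cooler than the temperature at 10 metres depth
depth = c(0, 10, 20, 30, 40, 50, 60, 70, 80, 100)                  # metres, invented profile
temp = c(20, 20, 19.95, 19.93, 19.9, 19.85, 19.7, 19.2, 18.5, 17)  # degC, invented profile
t10 = temp[depth == 10]       # reference temperature at 10 metres
min(depth[temp < t10 - 0.2])  # first depth past the criterion: 60 metres for this profile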
However, I couldn’t find any number for the average mixed layer depth anywhere. So instead, I downloaded the 2°x2° mixed layer depth monthly climatology dataset entitled “mld_DT02_c1m_reg2.0_Global.nc” from here and took the area-weighted average of the mixed layer depth. It turns out that globally the mixed layer depth averages just under sixty metres. The whole process for doing the calculations including writing the code took about half an hour … I’ve appended the code for those interested.
Then I went on to resample their 2°x2° dataset to a 1°x1° grid, which of course gave me the same answer for the average, but it allowed me to use my usual graphics routines to display the depths.
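I haven't shown my regridding routine, but the simplest version of that resampling is just to split each 2° cell into four identical 1° cells, which leaves the area-weighted average untouched. A minimal sketch, using the mldmap matrix from the code appended at the end of the post:

# split each 2-degree cell into a 2x2 block of identical 1-degree cells
mldmap1deg = kronecker(mldmap, matrix(1, 2, 2))
dim(mldmap1deg)   # 180 360, i.e. a 1x1 degree grid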
I do love climate science because I never know what I'll have to learn in order to do my research. This time I've gotten to explore the depth of the mixed layer. As you might imagine, in the stormiest areas the largest waves mix the ocean to the greatest depths, which are shown in green and blue. You can also see the mark of the El Nino/La Nina along the Equator off the coast of Ecuador. There, the trade winds blow the warm surface waters to the west, and leave the thermocline closer to the surface. So much to learn … but I digress. I could see that there were a number of shallow areas in the North Atlantic, which was the area used for the Argo study. So I calculated the average mixed layer depth for the North Atlantic (5°N–65°N, 0°W–90°W). This turns out to be 53 metres, about seven metres shallower than the global average.
Now, recalling the rule of thumb:
One watt per square metre for one year raises one cubic metre of seawater about eight degrees.
Using the rule of thumb, one W/m2 over one year, spread through the 53-metre-deep mixed layer, raises its temperature by about 8/53 ≈ 0.15°C. However, they estimate the annual error at seven W/m2 (see their quote above). This means that Hadfield2007 are saying the Argo floats can only determine the average annual temperature of the North Atlantic mixed layer to within plus or minus 1°C …
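In R, the same arithmetic (using the rounded 8°C rule of thumb from above):

# rule of thumb applied to the 53-metre North Atlantic mixed layer
degC_per_Wm2 = 8 / 53   # ~0.15 degC per year for each W/m2, spread through 53 metres
7 * degC_per_Wm2        # ~1.06 degC for their 7 W/m2 annual sampling error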
Now, to me that seems reasonable. It is very, very hard to accurately measure the average temperature of a wildly discontinuous body of water like oh, I don’t know, say the North Atlantic. Or any other ocean.
So far, so good. Now comes the tough part. We know that Argo can measure the temperature of the North Atlantic mixed layer with an error of ±1°C. Then the question becomes … if we could measure the whole ocean with the same density of measurements as the Argo North Atlantic, what would the error of the final average be?
The answer to this rests on a curious fact: assuming that the errors are symmetrical, the error of the average of a series of measurements, each of which has its own inherent error, is smaller than the average of the individual errors. If the errors are all equal to say E, then if we are averaging N items each of which has an error E, the error of the average scales as sqrt(N)/N * E, which is the same as E/sqrt(N).
So for example if you are averaging one hundred items each with an error of E, your error is a tenth of E [ sqrt(100)/100 ].
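A quick simulation shows the scaling in action; the values of N and E here are invented just for the demonstration:

# average N measurements, each with an independent error of SD = E,
# and check that the error of the average is E * sqrt(N)/N
set.seed(42)   # arbitrary seed so the result is reproducible
N = 100        # number of items averaged (invented for illustration)
E = 1          # error of each individual measurement
averages = replicate(10000, mean(rnorm(N, mean = 0, sd = E)))
sd(averages)   # ~0.1, i.e. a tenth of E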
If the N errors are not all equal, on the other hand, then what scales by sqrt(N)/N is not the error E but
sqrt(E^2 + SD^2)
where SD is the standard deviation of the errors.
Now, let's assume for the moment that the global ocean is measured at the same measurement density as the North Atlantic in the study. It's not, but let's ignore that for now. Regarding the 700 metre deep layer, we need to determine how much larger in volume it is than the NA mixed layer. It turns out that the global ocean down to 700 metres is 118 times the volume of the NA mixed layer.
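For what it's worth, you can get a ratio in the same ballpark from round numbers. The two areas below are my own rough assumed figures, not values from the paper; the 118 comes from the actual gridded areas:

# rough cross-check of the volume ratio
# ASSUMED: global ocean area ~3.6e14 m^2; NA box (5N-65N, 0-90W) ocean area ~4.2e13 m^2
vol_global_700 = 3.6e14 * 700   # global ocean down to 700 metres, m^3
vol_na_mixed = 4.2e13 * 53      # North Atlantic mixed layer, m^3
vol_global_700 / vol_na_mixed   # ~113, close to the 118 from the gridded data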
Unfortunately, while we know the mean error (7 W/m2 ≈ 1°C), we don't know the standard deviation of those errors. However, they do say that there are many areas with larger errors. So if we assume something like a standard deviation of 3.5 W/m2 ≈ 0.5°C, we'd likely be conservative; it may well be larger.
Putting it all together: IF we can measure the North Atlantic mixed layer with a mean error of 1°C and an error SD of 0.5°C, then with the same measurement density we should be able to measure the global ocean to
sqrt(118)/118 * sqrt( 1^2 + 0.5^2 ) ≈ 0.1°C
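Or in R:

# error of the global average at the same sampling density
N = 118    # volume ratio from above
E = 1.0    # mean error, degC
SD = 0.5   # assumed standard deviation of the errors, degC
sqrt(N) / N * sqrt(E^2 + SD^2)   # ~0.10 degC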
Now, recall from above that Loeb2012 claimed an error of something like 0.005°C … which appears to be optimistic by a factor of about twenty.
And my guess is that underestimating the actual error by a factor of 20 is the best case. I say this because they've already pointed out that "the expected sampling error increases to more than 50 Wm−2 in the Gulf Stream region and north of 40°N". So their estimate doesn't even hold for all of the North Atlantic.
I also say it is a best case because it assumes that a) the errors are symmetrical, and that b) all parts of the ocean are sampled with the same frequency as the upper 53 metres of the North Atlantic. I doubt that either of those is true, which would make the uncertainty even larger.
In any case, I am glad that once again, mainstream science verifies the interesting work that is being done here at WUWT. If you wonder what it all means, look at Figure 1, and consider that in reality the error bars are twenty times larger … clearly, with those kinds of errors we can say nothing about whether the ocean might be warming, cooling, or standing still.
Best to all,
PS: I’ve been a bit slow writing this because a teenage single mother and her four delinquent children seem to have moved in downstairs … and we don’t have a downstairs. Here they are:
CUSTOMARY REQUEST: If you disagree with someone, please quote the exact words you find problems with, so that all of us can understand your objection.
CODE: These days I mostly use the computer language “R” for all my work. I learned it a few years ago at the urging of Steve McIntyre, and it’s far and away the best of the dozen or so computer languages I’ve written code in. The code for getting the weighted average mixed layer depth is pretty simple, and it gives you an idea of the power of the language.
# load the ncdf package for reading netCDF files
library(ncdf)

# specify URL and file name -----------------------------------------------
mldurl="http://www.ifremer.fr/cerweb/deboyer/data/mld_DT02_c1m_reg2.0.nc"
mldfile="Mixed Layer Depth DT02_c1m_reg2.0.nc"

# download file -----------------------------------------------------------
download.file(mldurl,mldfile)

# extract and clean up variable (90 rows latitude by 180 columns longitude by 12 months)
nc=open.ncdf(mldfile)
mld=aperm(get.var.ncdf(nc,"mld"),c(2,1,3)) # the "aperm" changes from 180 rows x 90 cols to 90 x 180
mld[mld==1.000000e+09]=NA # replace missing values with NA

# create area weights ------ (they use a strange unequal 2° grid with the last point at 89.5°N)
latline=seq(-88,90,2)
latline[90]=89.5 # adjust the final gridpoint to 89.5°N
latline=cos(latline*pi/180) # weight by the cosine of the latitude
latmatrix2=matrix(rep(latline,180),90,180) # each column of the weight matrix is the latitude weights

# take array gridcell averages over the 12 months
mldmap=rowMeans(mld,dims = 2,na.rm = T)
dim(mldmap) # checking the dimensions of the result, 90 latitude x 180 longitude
# [1]  90 180

# take weighted mean of gridcells
weighted.mean(mldmap,latmatrix2,na.rm=T)
# [1] 59.28661