Guest Post by Willis Eschenbach
I’ve been thinking about the Argo floats and the data they’ve collected. There are about 4,000 Argo floats in the ocean. Most of the time they are asleep, a thousand metres below the surface. Every 10 days they wake up and slowly rise to the surface, taking temperature measurements as they go. When they reach the surface, they radio their data back to headquarters, slip beneath the waves, sink down to a thousand metres and go back to sleep …
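For those who like to see a process as code, here is a minimal sketch of that duty cycle; the depths, timing, and function names below are purely illustrative and are not the actual float firmware.

```python
# Idealized Argo duty cycle, as described in the text above (illustrative sketch only).
import random

PARK_DEPTH_M = 1000   # drift depth between profiles
CYCLE_DAYS = 10       # nominal days between profiles

def measure_temperature(depth_m):
    # Placeholder profile: ~4 C at park depth, warmer toward the surface, small sensor noise.
    return 4.0 + 16.0 * (1.0 - depth_m / PARK_DEPTH_M) + random.gauss(0, 0.01)

def transmit_to_shore(float_id, profile):
    # Stand-in for the satellite uplink that happens at the surface.
    print(f"float {float_id}: sent {len(profile)} levels, surface temp {profile[-1][1]:.2f} C")

def one_cycle(float_id):
    """One park / ascend-and-measure / transmit / descend cycle."""
    profile = []
    for depth in range(PARK_DEPTH_M, -1, -100):   # rise slowly, sampling on the way up
        profile.append((depth, measure_temperature(depth)))
    transmit_to_shore(float_id, profile)
    # ...then sink back to PARK_DEPTH_M and sleep for CYCLE_DAYS.

one_cycle(float_id=42)
```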
At this point, we have decent Argo data since about 2005. I’m using the Argo dataset 2005-2012, which has been gridded. Here, to open the bidding, are the ocean surface temperatures for the period.
Figure 1. Oceanic surface temperatures, 2005-2012. Argo data.
Dang, I like that … so what else can the Argo data show us?
Well, it can show us the changes in the average temperature down to 2000 metres. Figure 2 shows that result:
Figure 2. Average temperature, surface down to 2,000 metres depth. Temperatures are volume-weighted.
The average temperature of the top 2000 metres is six degrees C (43°F). Chilly.
We can also take a look at how much the ocean has warmed and cooled, and where. Here are the trends in the surface temperature:
Figure 3. Decadal change in ocean surface temperatures.
Once again we see the surprising stability of the system. Some areas of the ocean have warmed at 2° per decade, and some have cooled at 1.5° per decade. But overall? The warming is trivially small, 0.03°C per decade.
Next, here is the corresponding map for the average temperatures down to 2,000 metres:
Figure 4. Decadal change in average temperatures, 0-2000 metres. Temperatures are volume-averaged.
Note that although the amounts of the changes are smaller, the trends at the surface are geographically similar to the trends down to 2000 metres.
Figure 5 shows the global average trends in the top 2,000 metres of the ocean. I have expressed the changes in another unit, 10^22 joules, rather than in °C, to show it as variations in ocean heat content.
Figure 5. Global ocean heat content anomaly (10^22 joules). Same data as in Figure 4, expressed in different units.
The trend in this data (6.9 ± 0.6 e+22 joules per decade) agrees quite well with the trend in the Levitus OHC data, which is about 7.4 ± 0.8 e+22 joules per decade.
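For anyone who wants to check the conversion between a volume-averaged temperature trend and a heat-content trend, here is a rough back-of-envelope sketch. The ocean area, seawater density, and specific heat are round textbook values of my own choosing, so treat the output as order-of-magnitude only.

```python
# Rough conversion: 0-2000 m temperature trend (deg C/decade) -> heat content trend (J/decade).
OCEAN_AREA_M2 = 3.6e14    # ~3.6e8 km^2 of ocean surface
LAYER_DEPTH_M = 2000.0    # top two kilometres
DENSITY_KG_M3 = 1025.0    # typical seawater density
CP_J_PER_KG_K = 3990.0    # typical specific heat of seawater

layer_mass_kg = OCEAN_AREA_M2 * LAYER_DEPTH_M * DENSITY_KG_M3

dT_per_decade = 0.02      # deg C per decade, the trend discussed in the post
dQ_per_decade = layer_mass_kg * CP_J_PER_KG_K * dT_per_decade

print(f"{dQ_per_decade:.1e} joules per decade")   # ~5.9e22 J, same ballpark as the 6.9e22 trend
```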
Anyhow, that’s the state of play so far. The top two kilometers of the ocean are warming at 0.02°C per decade … can’t say I’m worried by that. More to come, unless I get distracted by … oooh, shiny!
Regards,
w.
SAME OLD: If you disagree with something I or anyone said, please quote it exactly, so we can all be clear on exactly what you object to.
Willis,
Re: Figure 3. The weighted average global rate of temp increase should be (1/2) × 0.1 + (1/2) × (-0.07) = 0.015 °C, not 0.03 °C.
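For what it's worth, the usual way to reduce a gridded trend map to a single global number is to weight each gridcell by its area, which on a regular latitude-longitude grid scales with cos(latitude). A minimal sketch of that calculation, with made-up grid values since the actual gridded trends are not reproduced here:

```python
import numpy as np

# Hypothetical 1-degree trend grid (deg C/decade); stand-in values, not the Argo data.
lats = np.arange(-59.5, 60.5, 1.0)        # roughly the Argo-covered latitudes
lons = np.arange(0.5, 360.5, 1.0)
rng = np.random.default_rng(0)
trend = rng.normal(0.03, 0.3, size=(lats.size, lons.size))

# Each cell's area scales with cos(latitude) on a regular lat-lon grid.
weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(trend)

print(f"area-weighted global mean trend: {np.average(trend, weights=weights):.3f} deg C/decade")
```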
Going off on a tangent, but here is an interesting paper; it shows that it is best to get out of the lab and monitor real life.
Changes in coral microbial communities in response to a natural pH gradient.
Extract
Most of the studies to date have examined the impact of ocean acidification on corals and/or associated microbiota under controlled laboratory conditions. Here we report the first study that examines the changes in coral microbial communities in response to a natural pH gradient (mean pH_T 7.3-8.1) caused by volcanic CO2 vents off Ischia, Gulf of Naples, Italy. Two Mediterranean coral species, Balanophyllia europaea and Cladocora caespitosa, were examined. The microbial community diversity and the physiological parameters of the endosymbiotic dinoflagellates (Symbiodinium spp.) were monitored. We found that pH did not have a significant impact on the composition of associated microbial communities in both coral species. In contrast to some earlier studies, we found that corals present at the lower pH sites exhibited only minor physiological changes and no microbial pathogens were detected. Together, these results provide new insights into the impact of ocean acidification on the coral holobiont.
Roy Spencer says:
March 2, 2014 at 4:40 am
Excellent question, Dr. Roy, and it’s on my list. I also want to compare the Argo surface temperature dataset with the CERES surface radiation dataset … so many drummers, so little time.
w.
Cardin Drake,
I agree, this is a good article.
I also agree with your comment:
Since the atmosphere hasn't warmed in the time period we have had the Argo floats, it clearly can't be responsible for warming in the oceans.
Peter Foster said at 4:37 am
Willis, are you using raw data or "adjusted data"? I get confused by graphs from Bob Tisdale that show most of the ocean basins cooling, so how do we suddenly get ocean warming when the data used by Bob shows cooling?
No kidding.
tallbloke said at 1:26 am
Interesting to compare the post-adjustment data Willis uses to Craig Loehle’s 2009 work which shows cooling from 2004.
…
The Argo data is suspect because they selectively turned off the low reading floats, without turning off the high reading floats. In other words, they changed the data to match expectations.
…
The low reading floats were removed because they provided data the scientists had not expected to see. They had expected to see the oceans warming, so they removed the floats that showed cooling. What was left is biased to meet experimenter expectations.
That’s what I figure was done.
jorgekafkazar said at 8:55 am
There's no question that it [the Argo data] has been "adjusted," but based on statements of some who know him, I can't draw the conclusion that Dr. Willis consciously did so to fall into line with AGW theory.
And the shepherd boy who cried wolf finally got eaten by one.
Maybe the adjustments Dr. Willis directed his team to make were legitimate, but after seeing his "Ask a Climate Scientist" YouTube video, I am not impressed with him in any way.
ferdberple said at 8:55 am
… to correctly adjust the data, the experimenter must do so in a blind fashion, otherwise he/she will unconsciously introduce bias and mistakenly believe it to be neutral.
In the case of the Argo floats this was not done. Floats were selectively removed based on the trend they showed. This allowed the experimenters to unconsciously bias the results, by deciding which trend was correct and which trend was not.
As a result, the Argo floats are not measuring ocean temperatures. They are measuring the experimenters' unconscious bias.
B I N G O !
ferdberple said at 9:34 am
… the Argo adjustments came about because the floats were showing decreasing average ocean temperature over time. This led to the belief that something must be wrong with the floats. Had the floats shown increasing average ocean temperatures, researchers would have instead concluded that the floats were correct.
Oh, I'm sure they would tell you they would have made corrections if results were deemed to be too high. One would think that the corrections to Global Warming data of all kinds (Temperature, Sea Level, Polar Bear Populations, Glacier Recession, Sea Ice, etc.) would show a normal distribution around zero, but I think that is far from the case.
Thus, the adjustments are a result of experimenter expectations, not ocean temperatures. Thus the result cannot be relied upon.
Again, B I N G O !
Willis; wonderful movies!
Any chance of showing a version centered on the Atlantic view?
I know, so many requests, so little time…
At least there isn’t new analysis for this one.
I also know, the Pacific is the big kahuna, Tisdale has flogged that through my thick skull.
I’d enjoy seeing the ‘Gulf Stream’, and maybe the ‘Bermuda triangle’.
For that matter, should I be noticing any Gyre-ations ?
Regards,
RR
Well Willis, I haven't digested the significance of your various new physics SI units, but I'm inclined to agree with this observation: "Dang, I like that …"
The movies are a lot better than Hollywood.
I may have missed it somewhere, or it's not there, but your final Fig. 5 trend line in Joules per decade could be converted to some sort of Watts per square metre net input storage rate over the total ocean surface, and/or extrapolated to the whole Earth surface, to get a net Earth energy gain rate.
Any idea what that rate might be? And I agree, your trend rate and the Levitus version agree with each other, given your error bands.
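A rough answer to that question can be sketched with standard figures for the ocean and Earth surface areas; this is my own arithmetic, not Willis's, so take the exact numbers loosely.

```python
# Convert the Figure 5 trend (~6.9e22 J per decade) into an average flux.
SECONDS_PER_DECADE = 10 * 365.25 * 24 * 3600   # ~3.16e8 s
OCEAN_AREA_M2 = 3.6e14
EARTH_AREA_M2 = 5.1e14

trend_j_per_decade = 6.9e22
power_w = trend_j_per_decade / SECONDS_PER_DECADE   # ~2.2e14 W continuous

print(f"per unit ocean surface: {power_w / OCEAN_AREA_M2:.2f} W/m^2")   # ~0.6 W/m^2
print(f"per unit Earth surface: {power_w / EARTH_AREA_M2:.2f} W/m^2")   # ~0.4 W/m^2
```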
A nice essay to wake up to this morning, Willis.
I only know OHC as either "Overhead Cam" or "Oxygen-free Hard Copper", and I presume yours is something else??
Stephen Wilde
You said above:
“If energy is being added to the system then El Ninos become more prominent and discharge energy from the oceans more rapidly.
If energy is being lost by the system them La Ninas become more prominent and retain energy in the oceans for longer.”
Have you ever studied emergent phenomena in things like plasmas or even chemical reactions? The plasma analogy is actually quite apt to ENSO.
In contained plasmas, there is a point where a critical amount of energy input into the plasma causes a sharp and often chaotic-like transition to coherent voltage oscillations. These oscillations may start off as a mixture of periodic and chaotic but with an increase in power become more periodic. The increase in power is very slight however, for instance input anode current changing from 1 A to 1.02 A.
Below this threshold the plasma just exhibits random noise with no periodic voltage oscillations. The transition is related to acoustic effects where the ions start to oscillate slightly with electrons still being the major charge carrier. But put in enough power and suddenly the ions will cause negative resistivity, as in it looks like current is travelling backwards. In effect the ions become the major charge carriers, but it requires a lot of power.
The commonality with the Earth's climate would be that you have two elements that can carry heat: a slowly charging one, the ocean; and a quickly changing one, the atmosphere. So ENSO could actually be a consequence of a certain power coming from the Sun, that is enough to cause oscillations in the ocean that are defined by the shape of the ocean bed itself and the geoid. This may be a reason why ENSO appears in the Pacific.
Now if heat is removed from the system, ENSO itself, according to this idea, would become more erratic, with larger pulses and longer times between cycles. Until it would disappear entirely. The atmosphere would uncouple from the ocean in this respect. Put much more energy in and you may find ENSO is more stable but has larger temperature variations.
Having studied plasma interactions, especially discharge plasmas, and seeing that the Earth has acoustic characteristics (a contained ocean) maybe we should be looking into ocean oscillations and their history. If we are seeing less periodicity it may be signalling a cooling world.
But this is just a theory based on observations in another field. It may be different, as you say, and in effect the Earth's climate balances itself.
Forget it, I got it; it was posted on the front door !!
Stephen Wilde says:
“The late 20th century warming spell showed high solar activity, reduced global cloudiness, a poleward drift of the climate zones and jets..”
Sorry, but poleward jets means faster trade winds.
mickyhcorbett75
I don’t exclude any of that.
As system energy content declines I would expect the ocean cycles to slow down, just as you say, but I think the oceans would still control the atmosphere until they shrank to a fraction of their current size.
Once the oceans do shrink to a point where they lose control then I would expect a more vigorous atmospheric circulation to develop to maintain system equilibrium.
On Mars, for example, we see periodic planet wide dust storms which alter albedo to act as a negative system response to excess warming.
My contention is that, whatever forces seek to destabilise the thermal behaviour of a planetary atmosphere, that atmosphere will reconfigure its circulation in a full and complete negative system response; otherwise the atmosphere could not be retained.
Stephen Rasey says:
March 2, 2014 at 2:17 am
In agreement with tty at 1:38.
The margins of error might not be considerable, but are uncomfortably close in size to the signal.
"The trend in this data (6.9 ± 0.6 e+22 joules per decade)"
That indeed looks very precise: 69 ± 6 ZJ, or 0.022 ± 0.002 deg C per decade
(at 27.5 ZJ per 0.01 deg C for 0-2000 meters).
Can we believe we have that much precision, to 0.002 deg C per decade? And we have not yet measured a full decade. With 4,000 Argo floats, 130,000 soundings per year, that is one Argo float for every 200,000 km^3 of ocean.
That is remarkable precision, especially when the average temperature of the entire 0-2000 m column runs from 0.000 to 10.000 deg C and the surface runs from 0.000 to 30.000+ deg C (extra decimal points added to make a point).
WHAT remarkable precision are YOU talking about???
From 6.9 +/- 0.6 E +100! Amps per kelvin, I get about 8.7% precision !!
What on earth do the units have to do with precision ??
If you CALCULATE the temperature change per attosecond, you are NOT suddenly getting Guinness world record precision !
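A trivial illustration of that point: the relative precision of 6.9 ± 0.6 is the same whatever unit you attach to it, because any unit-conversion factor cancels out. A throwaway sketch:

```python
# Relative precision is dimensionless: scaling value and uncertainty by any
# unit-conversion factor leaves their ratio unchanged.
value, uncertainty = 6.9, 0.6
print(f"relative precision: {uncertainty / value:.1%}")   # ~8.7%

factor = 10.0   # e.g. converting 1e22 J to ZJ
print(f"after conversion: {(uncertainty * factor) / (value * factor):.1%}")   # still ~8.7%
```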
Ulric Lyons said:
“Sorry, but poleward jets means faster trade winds.”
Can you substantiate that ?
I’m sure that a recent thread here on WUWT confirmed that the trade winds had recently increased whilst the middle latitude jets have become more equatorward / meridional.
http://wattsupwiththat.com/2014/02/10/seven-years-ago-we-were-told-the-opposite-of-what-the-new-matthew-england-paper-says-slower-not-faster-trade-winds-caused-the-pause/
which confirms that the trade winds were slower during the past warming spell of poleward jets but are now stronger with more equatorward jets.
“an unprecedented strengthening of the equatorial trade winds appears to be largely responsible for the hiatus in surface warming observed over the past 13 years.”
and we all know that the jets stopped drifting poleward around 2000, don't we?
Apparently, it is possible to get the raw Argo data and some groups have done work with it, as seen here:
http://www.argo.ucsd.edu/Ollitrault_ANDRO.pdf
Looks like the raw data is available if it’s needed, though no idea if there are any restrictions on access…
per Nick Boyce
"(9) Therefore, the temperature of the world's oceans, to a depth of 2000 meters, increased by 0.09°C, from 1950 to 2013."
I make that 0.0143 deg C per decade (0.09°C spread over the 6.3 decades from 1950 to 2013), compared to Willis's 0.02 deg C per decade.
So what was the point you were making, again ??
Robert, thanks once again for your interesting comments.
rgbatduke says:
March 2, 2014 at 7:02 am
Well, somewhere in between, but it’s not bad. Here’s the current distribution:


I analyzed the total number of samples ever taken (as of 2012) in this graphic:
ORIGINAL CAPTION: Figure 2. Number of temperature profiles ever taken by Argo floats in various areas of the ocean. Percentages in the second row refer to the percentage of the total ocean area having that number of temperature profiles. Percentages in the third row refer to the percentage of the ocean area from 60°N to 60°S having that number of temperature profiles.
You can see that the sampling is not totally even … but it’s not bad. Note the undersampling in the area of the ITCZ. I assume this is from diverging currents at depth.
Mmmm … no clear answer. Given the results in Figure 1, which are on a grid 60 nautical miles on a side at the Equator, I think we have enough samples to make some conclusions. You can see pretty fine-grained stuff in Figures 1 & 2.
While all of that is certainly true, in climate science we are always faced with datasets that are too short … I do note that the statistical significance of the trend is clear.
For me, the key is precision, not accuracy. If I can see small coherent physically logical features and details in the graphics, that’s what counts. In general I’m looking for understanding, not accuracy.
I considered putting in error bars, but I didn’t have enough information to do it—I don’t know how many individual Argo measurements were made per gridcell.
I can tell you that (assuming that the individual gridcells have either no errors or symmetrical errors) the standard error of the mean of each month's global average surface temperature is on the order of a tenth of a degree …
I probably should have included the following in the head post, which answers the question …

As always, of course, this is the statistical accuracy …
For the surface temperature, there are N = 29407 gridcells, and an area-adjusted standard deviation of 15°. By the usual calculations, that’s a standard error of the mean of about a tenth of a degree [ stdev/sqrt(N) = 0.09 ]. As you might imagine, the deeper you go, the smaller that gets. Here are the results for the surface. Note that there is no significant trend in the surface data.

For the average down to 2,000 metres, you have 27 times the number of samples, and the standard deviation is smaller ( ~12° ), so we get a SEM of about a hundredth of a degree.
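The "usual calculations" referred to above, spelled out as a quick sketch; the gridcell count, standard deviations, and factor of 27 are the numbers quoted in the comment itself.

```python
import math

# Surface: quoted gridcell count and area-adjusted standard deviation.
n_surface, sd_surface = 29407, 15.0
print(f"surface SEM: {sd_surface / math.sqrt(n_surface):.2f} deg C")    # ~0.09

# 0-2000 m average: 27 times as many samples, smaller standard deviation (~12 deg).
n_deep, sd_deep = 27 * 29407, 12.0
print(f"0-2000 m SEM: {sd_deep / math.sqrt(n_deep):.3f} deg C")         # ~0.013
```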
Look, I have my own very real concerns about the accuracy of the OHC data, see Decimals of Precision.
And I am aware that the sampling is sparse, and the dataset is short. I am also aware that what I am reporting is the STATISTICAL uncertainty, and not the accuracy of the results compared to the actual reality. All I can do is report what I find.
I don't think, however, that the warming is 0.02 ± 1°C per decade …
Thanks as always,
w.
ferdberple says:
March 2, 2014 at 8:41 am
Egads, ferd, don’t make that kind of claim without a citation. What is your basis for the claim?
w.
“Most of the time they are asleep, a thousand metres below the surface. Every 10 days they wake up and slowly rise to the surface, taking temperature measurements as they go. When they reach the surface, they radio their data back to headquarters, slip beneath the waves, sink down to a thousand metres and go back to sleep …”
A naive question perhaps, but are they all synchronised in some way? Just askin'.
TC says:
March 2, 2014 at 11:48 am
Good question. The answer is no.
w.
Stephen Wilde says:
“Can you substantiate that ?”
Yes, negative AO on the right:
Are the reported temps at all affected by the sinking and rising actions, or by passing through different temp zones? Or do they sink, wait a few to stabilize, record and store, THEN rise and transmit after fully risen, with no recording during transit?
I don't know enough about them to really phrase the question well, sorry.
Just concerned the sink/rise action may skew the results.
RichardLH says:
March 2, 2014 at 5:37 am
Speed says:
March 2, 2014 at 4:41 am
“Averages are sometimes useful, sometimes not. My nine year old car has operated at an average speed of 1.14 miles per hour over its lifetime. This reveals almost nothing about the car or how I drive.”
But averages per day, month or year may well provide data that is of interest as to usage.
My 2012 Subaru Impreza has just reached 10,000 miles. Its average (moving) speed for that 10,000 miles is just 16.0 mph. On a level road with the engine fully warmed up, it goes 15 mph with my foot off the gas pedal, at engine idle speed. I have never driven at 16.0 mph in my life for more than ten seconds per instance.
For that 10,000 miles of actual moving, I have averaged 24.6 mpg.
The car has never been above 300 feet above MSL (no hilly driving).
On my typical freeway and expressway roads, where most of that 10,000 miles went, I drive between 45 and 60 mph. Over that speed range my instant mpg averages from 55 down to 45 mpg. Yes, it dips if I go over an overpass. Yes, I am very good at high-mpg driving, and no, I do not hold up traffic; quite the opposite, as my averages confirm.
Averages tell you something about ANY set of already accurately known numbers. They tell you nothing about any unknown numbers.
On average, Hurricane Sandy, from birth to death, did very little damage anywhere. Well, you can cherry-pick and find some small parts of its total path area where it did some observable damage.
Planet earth (Mother Gaia) does not compute averages; she integrates everything, but if something breaks, it may take a while to fix.
I don't see how 4,000 devices can produce a 4-dimensional (x, y, temp, time) image that has that level of resolution in each frame.
I assume about 30,000 pixels per frame (in the oceans), and taking into consideration Harry Nyquist's rule, there ought to be about 60,000 sensors to yield the given frame, also assuming they are producing data for each frame.
There must be at least an order of magnitude or more of interpolation to produce the animation. Since interpolation isn't real data, I would like to see an animation of the real discrete data without averaging and wizz-bang-edness. I suspect the real data is not as fancy looking.
Or am I completely wrong?
Wow, people are concerned about only 4,000 Argo buoys. Darn sight better than one Yamal Charlie Brown Christmas tree!
Hansen says thermometers are good out to 1,000 km; that's more than 3 million sq. km per buoy, or over 12 billion square km total.
Why all this sudden interest in the Nyquist sampling theorem? Nobody in Climatism pays any attention to the theory of sampled-data systems; well, they probably don't even know that the climate is a (multi-variable) sampled-data system.
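Some rough coverage numbers behind the sampling concerns in the two comments above; the ocean area is a standard figure and the gridcell count is the one Willis quotes earlier in the thread, so this is only a back-of-envelope sketch.

```python
import math

OCEAN_AREA_KM2 = 3.61e8   # total ocean surface area
N_FLOATS = 4000
N_GRIDCELLS = 29407       # surface gridcell count quoted earlier in the thread

area_per_float = OCEAN_AREA_KM2 / N_FLOATS
print(f"~{area_per_float:,.0f} km^2 of ocean per float")        # ~90,000 km^2
print(f"~{math.sqrt(area_per_float):.0f} km average spacing")   # ~300 km between floats
print(f"~{N_GRIDCELLS / N_FLOATS:.1f} gridcells per float")     # so most cells in any one month are interpolated
```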
The view of Figure 2 from here in New Zealand is fascinating, particularly as slowed down by Steve Keohane (http://tinypic.com/view.php?pic=12519p1&s=8).
We can see our typical yearly, cyclical, regional seasons clearly: February being the hottest and August the coldest. NZ is about 2,000 km long, and you can see why the north of the country has become a population magnet.
That says nothing about measurement, but it does show that the surrounding ocean temperatures have a commonality with our seasonal weather at the macro level.