Guest Post by Willis Eschenbach
Anthony has an interesting post up discussing the latest findings regarding the heat content of the upper ocean. Here’s one of the figures from that post.
Figure 1. Upper ocean heat content anomaly (OHCA), 0-700 metres, in zettajoules (10^21 joules). Errors are not specified but are presumably one sigma. SOURCE
He notes that there has been no significant change in the OHCA in the last decade. It's a significant piece of information. I still have a problem with the graph, however, which is that the units are meaningless to me. What does a change of 10 zettajoules mean? So following my usual practice, I converted the graph to more familiar units, degrees C. Let me explain how I went about that.
To start with, I digitized the data from the graph. Often this is far, far quicker than tracking down the initial dataset, particularly if the graph contains the errors. I work on the Mac, so I use a program called GraphClick; I'm sure the same or better is available on the PC. I measured three series: the data, the plus error, and the minus error. I then put this data into an Excel spreadsheet, available here.
Then all that remained was to convert the change in zettajoules to the corresponding change in degrees C. The first number I need is the volume of the top 700 metres of the ocean. I have a spreadsheet for this. Interpolated, it says 237,029,703 cubic kilometres. I multiply that by 62/60 to adjust for the density of salt vs. fresh water, and multiply by 10^9 to convert cubic kilometres to tonnes. I multiply that by 4.186 mega-joules per tonne per degree C. That tells me that it takes about a thousand zettajoules to raise the upper ocean temperature by 1°C.
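For anyone who wants to check the arithmetic, here is the conversion as a minimal Python sketch. The volume figure, the 62/60 density adjustment, and the 4.186 MJ per tonne per degree C are the values used above; the rest is unit bookkeeping.

```python
# How many zettajoules does it take to warm the 0-700 m ocean layer by 1 degC?
volume_km3 = 237_029_703          # top 700 m of the ocean, cubic kilometres
density_ratio = 62 / 60           # rough salt- vs fresh-water density adjustment
tonnes = volume_km3 * density_ratio * 1e9   # 1 km^3 of fresh water = 1e9 tonnes
specific_heat = 4.186e6           # joules per tonne per degree C

joules_per_degC = tonnes * specific_heat
print(f"{joules_per_degC / 1e21:,.0f} ZJ per degC")   # ~1,025 ZJ, "about a thousand"
```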
Dividing all of the numbers in their chart by that conversion factor gives us the same chart, but in units of degrees C. Calculations are shown on the spreadsheet.
Figure 2. Upper ocean heat content anomaly, 0-700 metres, in degrees C.
I don't plan to say a whole lot about that, other than to point out the following facts; I'll leave the rest to the commenters:
• The temperature was roughly flat from 1993-1998. Then it increased by about one tenth of a degree in the next five years to 2003, and has been about flat since then.
• The claim is made that the average temperature of the entire upper ocean of the planet is currently known to an error (presumably one sigma) of about a hundredth of a degree C.
• I know of no obvious reason for the 0.1°C temperature rise 1998-2003, nor for the basically flat temperatures before and after.
• The huge increase in observations post 2002 from the addition of the Argo floats didn’t reduce the error by a whole lot.
My main question in this revolves around the claimed error. I find the claim that we know the average temperature of the upper ocean with an error of only one hundredth of a degree to be very unlikely … the ocean is huge beyond belief. This claimed ocean error is on the order of the size of the claimed error in the land temperature records, which have many more stations, taking daily records, over a much smaller area, at only one level. Doubtful.
I also find it odd that the very large increase in the number of annual observations due to the more than 3,000 Argo floats didn’t decrease the error much …
As is common in climate science … more questions than answers. Why did it go up? Why is it now flat? Which way will the frog jump next?
Regards to everyone,
w.
Some here are suffering from the misconception that averaging data having errors always produces more accurate data. The truth of that presumption depends on the nature of the errors (unbiased random noise) and on measurement of a constant (fixed) value. If you use different thermometers with different biases, averaging does not help; nor does averaging data from the same thermometer with a significant bias, or using the same thermometer to measure different things. Is there a discussion of these issues for Argo?
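A quick simulation makes the distinction concrete (a minimal sketch; the noise and bias magnitudes are invented for illustration):

```python
import random

true_value = 15.0   # degC, the quantity being measured
n = 10_000          # number of readings averaged

# Unbiased random noise: the average converges on the true value
noisy = [true_value + random.gauss(0, 0.5) for _ in range(n)]
print(sum(noisy) / n)    # ~15.00

# A shared +0.3 degC calibration bias: averaging leaves it untouched
biased = [true_value + 0.3 + random.gauss(0, 0.5) for _ in range(n)]
print(sum(biased) / n)   # ~15.30, no matter how large n gets
```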
richard verney says:
I was astounded the first time I took my then new 35mm Nikonos camera and light meter diving. Astounded at how rapidly the red dial on the meter faded to black with increasing depth. Astounded at how rapidly the light values dropped and the color values shifted towards blue.
Where has all the red light gone, long time passing?
Where has all the red light gone, long time ago…
From the Wikipedia entry on ocean acoustic tomography, http://en.wikipedia.org/wiki/Ocean_acoustic_tomography
“The integrating property of long-range acoustic measurements
Ocean acoustic tomography integrates temperature variations over large distances, that is, the measured travel times result from the accumulated effects of all the temperature variations along the acoustic path, hence measurements by the technique are inherently averaging. This is an important, unique property, since the ubiquitous small-scale turbulent and internal-wave features of the ocean usually dominate the signals in measurements at single points. For example, measurements by thermometers (i.e., moored thermistors or Argo drifting floats) have to contend with this 1-2 °C noise, so that large numbers of instruments are required to obtain an accurate measure of average temperature. For measuring the average temperature of ocean basins, therefore, the acoustic measurement is quite cost effective. Tomographic measurements also average variability over depth as well, since the ray paths cycle throughout the water column.”
Note the mention of 1-2°C of "noise", which is in fact real fluctuation in temperature. This places an error floor on any point measurement, in addition to the instrumental error.
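The integrating property is easy to demonstrate with a toy model (a sketch only; the 1.5°C point noise and the 400 independent patches along the path are assumed numbers):

```python
import random, statistics

sigma_point = 1.5   # degC, assumed small-scale fluctuation at any single point
patches = 400       # assumed independent temperature patches along the sound path

# A single moored sensor sees the full point-to-point noise ...
point_readings = [random.gauss(0, sigma_point) for _ in range(1000)]
print(statistics.stdev(point_readings))   # ~1.5 degC

# ... while each acoustic travel time averages over the whole path
path_averages = [statistics.mean(random.gauss(0, sigma_point) for _ in range(patches))
                 for _ in range(1000)]
print(statistics.stdev(path_averages))    # ~1.5 / sqrt(400) = ~0.075 degC
```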
@DB Stealey: Making things up to discredit an argument isn’t cool. Please stick to the science.
To MikeR:
You are correct that the ARGO project buoys measure ocean temperature. Our knowledge of ocean temperatures before the project came on-line in 2003 is decidedly fragmentary. Willis stated in his original posting that the reason he was converting heat content into average temperature rise was a personal preference, as he found the heat energy change difficult to visualize. As to the "missing heat" problem, the so-called deep ocean is just about the only place left where this "missing heat" can be hiding. This was the reason for Anthony making his original post, commenting on a blog post at a rival Web site. Anthony adduced as evidence a graph from the NOAA Pacific Marine Environment Laboratory (PMEL), which listed the total heat energy change in the 0-700 meter layer.
Philip Lee says: "Is there a discussion of these issues for Argo?"
Yet again, please read the paper (Levitus et al 2012). Their Supplementary Material has an Appendix titled “Error Estimates of Objectively Analyzed Oceanographic Data”; equation 4 there addresses your point explicitly.
@Phobos –
Let's make this simple. You are trying to cut a board that must be exactly 8′ 7-13/64″ long for a shelf in your closet. The only, and I mean ONLY, thing you have to make any measurements with is an old hardware store yard stick. The smallest readable graduations are 1/4 inch. You have no string, rope, building square, framing triangle, etc., just the yard stick. I don't care how many times you measure the board and how many times you average each measurement or use RMS, you will never get the correct length.
1. The fact that you have 3000 buoys does not mean that you have 3000 samples of the same temperature that can be averaged using the RMS accuracy rules (square root of the sum of the squares – in the industry we usually said RMS). All measurements MUST be of the exact same entity at the exact same time under the exact same conditions.
2. The fact that you add up 3000 surface level temperatures (e.g., numbers between 30 and 90 degrees F), then divide by 3000 and get a number out to 3 or more decimal points, does not mean that is the temperature to within anything better than +/- 1.5 degrees C (or 0.25%; in fact you must use the WORST accuracy to be accurate). PERIOD. The RMS accuracy rule for averaging samples does not apply. IT DOES NOT WORK THAT WAY.
Also, the accuracy for essentially every instrument I have ever worked with, including precision laboratory standard instruments, when expressed in terms of %, is a percent of the MAXIMUM reading for the range selected. (It is a common misconception that the 0.01% means percent of the reading; this is not normally the case, even for instruments selling for more than $10,000. Read the fine print on the accuracy specifications.) You might want to Google "NBS traceable calibration facility", find one in your area, and talk to them.
3. I suggest you contact someone with a degree in applied mathematics. Note I said APPLIED mathematics. Generally, theoretical mathematicians know the theory but haven't the foggiest idea how to apply it.
P.S. Back in the early '70s I was involved in the process of getting several US Government agencies to allow the use of the square root of the sum of the squares for determining the accuracy of measurements and instrument channels. Prior to that time we were forced to use the direct sum of the "inaccuracy." This meant that an instrument train with 5 devices, each with +/- 0.25% accuracy, was treated as +/- 1.25% accurate. I have an advanced degree in Applied Mathematics and was a member of the Instrument Society of America back then.
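For the record, the two combination rules he describes differ like this (a minimal sketch of the arithmetic, assuming independent errors for the root-sum-square case):

```python
from math import sqrt

accuracies = [0.25] * 5   # percent, five devices in the instrument train

direct_sum = sum(accuracies)                  # the old conservative rule: 1.25%
rss = sqrt(sum(a ** 2 for a in accuracies))   # root-sum-square: ~0.559%

print(direct_sum, round(rss, 3))
# RSS is only justified when the individual errors are independent;
# a shared systematic bias must still be added directly.
```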
Phobos says:
February 26, 2013 at 10:55 am
It is simply statistical sampling — much like what a candy company does as its boxes come off the assembly line, in order to measure and control quality. They don't measure the same box over and over again, but sample a subset of the boxes over time. With sufficiently sized sampling, the difference between the sampled set and the entire population is small — this is, after all, the basis of all statistical reasoning.
Not quite correct. With what you describe above, they know in advance what the outcome should be. The size of the box, the weight of the candy bar, etc., are known, and you are finding the difference from the known. You have no idea what a specific block of water is supposed to have for a temperature.
Let me give an example of why I think the ocean error claims are unsupportable.

The BEST dataset folks have given us their error estimates. They are using on the order of 30,000 stations, which take temperatures either several or many times daily, to measure a two-dimensional field, the average temperature of the air at about two metres off of the surface. They say the 95% error (two sigma) in their estimate is on the order of four hundredths of a degree (0.04°C).
The paper shown above, on the other hand, is using on the order of 3,000 Argo floats, which take temperatures once every ten days, to measure a much larger three-dimensional field, the average temperature of the top seven hundred metres of the ocean … and despite that, they claim a two sigma error of half of what the BEST folks get, two hundredths of a degree (0.02°C). A tenth of the data, a three-dimensional field, 2.3 times the area of the land data … and half the error?
I’m sorry, but I simply don’t see how that is even theoretically possible … yes, the ocean has less variation, but we’re talking orders of magnitude here. I’m more than happy to listen to any explanation of how that small an error is possible, I just haven’t heard one yet.
Here's an example. In 2005, the listed error above (two sigma) is 0.022 degrees. In that year, there were 48,132 samples of the ocean temperature taken by Argo floats. The average surface temperature of all of the floats (raw data) was 19.25°C, the standard deviation was 8.73°C, and thus the standard error of the mean (two sigma) was 0.08°C … and that's just for the top layer.
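For transparency, here is that standard-error arithmetic as a short sketch, using only the figures just quoted from my digitized data:

```python
from math import sqrt

n = 48_132      # Argo surface samples in 2005
sigma = 8.73    # degC, standard deviation of the raw surface readings

two_sigma_sem = 2 * sigma / sqrt(n)
print(round(two_sigma_sem, 3))   # ~0.080 degC, versus the claimed ~0.022 degC
```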
That is also a raw error, things will get worse once the irregular distribution of the data is included. The error is particularly exaggerated in any area where the currents mean that the free-floating buoys provide poor coverage. Unfortunately, these include areas of downwelling, spreading currents which push the Argo floats out of the very areas of larger temperature variation that we want to measure.
Finally, the global coverage of the Argo floats is poor. Only 1.5% of the samples were taken north of 60°N, and the same south of 60°S. Here’s the distribution of the floats compared to the distribution of the ocean by latitude …
The northern hemisphere, particularly from 30-60 north, is way over-represented in the data, and the southern hemisphere is correspondingly under-represented south of 40°S. This can only serve to increase the error of the average from the size of raw error I calculated above.
I don’t see how they get the two sigma error down to a couple hundredths of a degree … any explanations welcome.
w.
Uzurbrain says:
February 26, 2013 at 12:01 pm
“1. The fact that you have 3000 buoys does not mean that you have 3000 samples of the same temperature that can be averaged using the RMS accuracy rules (square root of the sum of the squares – in the industry we usually said RMS). All measurements MUST be of the exact same entity at the exact same time under the exact same conditions.”
Wonderful comment. You have an excellent understanding of science. I have learned over the years that it is impossible to get Alarmists to understand that a scientist would not take temperature measurements from the surface of the ocean and from a depth of 200 meters and assume that they are comparable. The important word here is ‘assume’, as you know. The science must reference all the relevant conditions associated with each measurement.
Invariably, Alarmists treat any two temperature measurements as comparable. Some have posted comments in which they argue in detail that all such differences in the conditions of measurements disappear in the statistical wash. Well, I guess they do but so does the science.
Phobos says:
“@DB Stealey: Making things up to discredit an argument isn’t cool. Please stick to the science.”
I never claimed to be cool. And I will stick to anything I think is important. Once again: is any part of your income derived from public funds?
As a taxpayer I get very tired of the constant stream of alarmist propagandists showing up here and blogging on my taxpayer dollars. If your answer to my question above is No, I’ll accept that. Otherwise, do what the public is paying you to do — and it isn’t posting endless comments throughout the work day.
Anthony already asked you who you work for, and you evaded him. From the content of your posts, you have an agenda. Your comments look like they were cut ‘n’ pasted from SkS, the internet’s premier pseudoscience blog. But unlike SkS, you will get plenty of pushback here at the internet’s “Best Science” site.
mkelly says:
February 26, 2013 at 12:02 pm
Bazinga! In addition, the entire environment has been designed and refined over the years for the sole purpose of achieving uniformity in each and every piece of candy. The candy manufacturers struggle to define their event space. Alarmists either do not have a clue what defining an event space means or they are being deceptive.
Steven Mosher says: February 25, 2013 at 11:12 pm
“Nick,
Ya, you’d be amazed if you look at a ships track over time in ICOADS by how little it changes and by how little it changes during the course of a day.. relative to the air that is.”
Steven, you would be amazed at how the sea temperature changes minute to minute if you are in a small boat. Then you reflect that there is as much heat in the upper 1m of the ocean as there is in all the atmosphere above it. Then you look at a thundering great vessel in ICOADS and you say "How the ^$#@ can anyone think that something that size can give a good idea of the temperature of a layer of water 1m deep?"
It sometimes helps to go and observe. You learn a lot that way. You learn to respect Nature and not to try to second guess Her.
Phobos says:
February 26, 2013 at 10:55 am
“It is simply statistical sampling — much like what a candy company does as its boxes come off the assembly line, in order to measure and control quality.”
In good faith and total honesty, I cannot believe that you wrote that sentence. Sir, you are assuming that the ocean is no less uniform than the Hershey Bars coming off the product line. Who or what guarantees that uniformity? The God of Alarmism?
You, Sir, have won the prize for most transparent myth maker among Alarmists. Mosher comes in second. Alarmists have no concept of empirical research and the work that must be done to define and respect the integrity of the natural phenomenon that you desire to measure. On empirical matters, Alarmists are as dumb as rocks.
Willis Eschenbach wrote: “Only 1.5% of the samples were taken north of 60°N.”
The Arctic Ocean’s volume is only 1.1% of total ocean volume.
Uzurbrain says: "All measurements MUST be of the exact same entity at the exact same time under the exact same conditions."
Of course not — if that were true, you could never measure something’s variation with time. You couldn’t determine the average annual temperature in your backyard. You couldn’t measure the speed of an automobile.
You are confusing what is theoretically desirable with what is practical. All measurements — ALL of them — are a compromise between what’s ideal and what’s doable.
The sun warms the lower mid-latitudes and equatorial regions of the Pacific Ocean. The relatively constant westward winds blow the warmed surface water to the west, where it churns downwards against the coastlines there (the Australia area) and is stored at depths of hundreds of metres.
During La Niña, more warmed surface water is blown farther west and driven down once it hits the coast. This process upwells more deep, cooler water in the eastern Pacific (off the Americas) to be warmed under less cloudy skies. (The air over the cooler water is drier and creates fewer clouds.) The western Pacific stands slightly higher because water is being piled up there.
When La Niña conditions subside (the westward winds reverse or slow), gravity then lets the piled-up western Pacific water flow back in a reverse wave to the east, warming the surface of the equatorial regions and resulting in an El Niño. This releases the deep warm water that was stored there.
There is no evidence that CO2 has any effect on this process.
And the volume of the Southern Ocean (south of 60 deg S) is 5.4% of total ocean volume.
Phobos said on February 26, 2013 at 12:33 pm:
Thus, since it is only capable of holding a relatively very small amount of heat, and since it is far colder than other waters such as the equatorial (and thus takes less heat per kelvin of rise than water around the equator or the middle latitudes would), the Arctic is really not representative of the world's oceans, and should be disregarded during conversations about "global warming" and "warming oceans", as it is irrelevant to the overall discussion of ocean heat content.
Good point, Phobos.
@DB Stealey: My only agenda is good science. My income is my business, but nothing like you think it is.
Again, focus on the science.
Willis Eschenbach says:
February 26, 2013 at 10:05 am
Richard, that is total and absolute bullshit. I give you the Lie Direct on that.
I even dedicated an entire post to the question of the absorption of IR. In that post I answered your damn questions, over and over and over.
The fact that you didn't like my answers doesn't entitle you to make false claims about whether I've answered, Richard. That's underhanded and untrue, and I expect better of you.
See my post called Radiating the Oceans for a full discussion of this issue, including my answers to Richard that he’s trying to pretend never happened …
Richard, I expect an apology, or we’re done discussing forever. I won’t have dealings with a man who tells lies in an attempt to damage my reputation.
w.
I'm now mixing into a discussion that is not mine, and I'll be sorry …
Willis, your 4 arguments there don't stand up for me.
The sea surface is the only place where the ocean loses heat, which is why the inverted gradient forms, with a cooler surface.
If the gradient were the other way around, the ocean would be a net heat sink; this goes on up to the moment when thermal balance is achieved, and then the surface cools again, as the ocean gains warmth not only at the surface but in deeper layers too (as deep as the sun's radiation can penetrate).
DLR cannot directly warm the oceans due to the ocean's cool skin. The inverted temperature gradient at the ocean surface ensures that heat does not go from the cool skin above to the strata below.
Even with mixing of the surface water, when the water above is cooler than the water below, it will not add warmth to the water below but will cool it.
The only way DLR can warm the ocean is by increasing the sea surface temperature.
If the sea surface temperature increases, then the temperature gradient allows for warmer water below, but this water is warmed by the sun's radiation, which penetrates deeper.
As long as the cool-skin gradient does not invert itself, there is no heat transfer from the surface to the lower levels.
Argument 1. People claim that because the DLR is absorbed in the first mm of water, it can’t heat the mass of the ocean. But the same is true of the land. DLR is absorbed in the first mm of rock or soil.
When you put your hand on land, it is warmer above and cooler below. This explains why heat is transferred from above to below.
The oceans have a cool skin above, so the comparison is wrong.
Argument 2. If the DLR isn’t heating the water, where is it going?
DLR is part of the energy exchange at the surface.
You can calculate the net heat flow at the surface and DLR is part of that net heat flow.
If it were heating the surface of the ocean, then the surface would get warmer than the water below; then you could talk about warming and heat transfer to the water below through mixing or diffusion. As long as the surface is cooler than the water below it, it does not warm the water below it.
Argument 3. The claim is often made that warming the top millimetre can’t affect the heat of the bulk ocean.
This is a different point. As explained, this top millimetre is not warmer. One can mix it as long as one wants; it will not add energy to the warmer water below.
The only way the ocean can get warmer is for the surface to get warmer.
Argument 4. Without the heating from the DLR, there’s not enough heating to explain the current liquid state of the ocean.
Yes, as said, this sets the temperature of the sea surface. In consequence, the ocean below warms from the sun's radiation until it comes into balance and loses the same heat at that very surface.
The ocean can lose heat only at the surface, not everywhere. On the "dark side" it freezes and loses no heat there, whereas where the 1000 W/m2 arrives it thaws and takes in all the heat it can. It then radiates only at the surface where it is not frozen, in balance with the atmosphere above.
The ocean would in any case be partly unfrozen, not completely frozen, as 170 W/m2 is a flat-world myth; 1000 W/m2 would melt any ice. What we get from DLR is a bigger unfrozen surface, due to the slowdown of heat loss.
I think this is the way it makes sense.
@Phobos 12:33 pm
See: http://ngdc.noaa.gov/mgg/global/etopo1_ocean_volumes.html
This says the Arctic is 4.3% of ocean area and 1.4% of ocean volume.
Since we seem to be concentrating on the upper 0-700 m of the ocean column, it seems reasonable that a better estimate would be that the Arctic is about 3.5% of the volume of interest.
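A rough sketch of that reasoning (my own back-of-envelope, not from the NOAA page):

```python
# Within a 0-700 m layer, any region deeper than 700 m contributes volume in
# proportion to its surface AREA, not its full volume. So the Arctic's share
# of the 0-700 m layer lies between its total-volume share and its area share.
arctic_area_frac = 0.043   # 4.3% of ocean area (ETOPO1 page above)
arctic_vol_frac = 0.014    # 1.4% of total ocean volume

# If the whole Arctic were deeper than 700 m, its share of the layer would
# equal its area share (4.3%); its broad shallow shelves pull the figure
# down, so ~3.5% is a plausible middle estimate.
print(f"between {arctic_vol_frac:.1%} and {arctic_area_frac:.1%}")
```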
Error bars should represent sampling uncertainty, or, stated another way, the probability that the correct value lies within a certain range.
Error bars are largely determined by sample size, and to a lesser extent by the distribution of the data. Population (ocean) size beyond a certain amount has a very small effect.
But that comes with the crucial caveat that the sampling is random. As Argo floats drift, they do not sample randomly (I don't know whether or not their initial deployment is random). Error bars from non-random sampling are meaningless, unless you know what effect your non-random sampling has, and I see no indication the Argo people do.
On one university course, I was required to read Use and Abuse of Statistics (a very well written book as I recall). Climate Science merits a whole new edition of its own.
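A toy simulation of that caveat (all numbers invented for illustration): a non-random sample can carry a tiny formal error bar while missing the true mean by a wide margin.

```python
import random, statistics

# Toy "ocean": surface temperature falls linearly from 25 degC at the equator
# to -1 degC at the pole (latitudes 0..90)
population = [25 - 26 * lat / 90 for lat in range(91)]
true_mean = statistics.mean(population)   # 12.0 degC

# Non-random sampling: drifting floats oversample latitudes 20-59 only
sample = [random.choice(population[20:60]) for _ in range(3000)]
estimate = statistics.mean(sample)
two_sigma = 2 * statistics.stdev(sample) / len(sample) ** 0.5

print(f"true {true_mean:.1f}, estimate {estimate:.2f} +/- {two_sigma:.3f}")
# The formal +/- 2 sigma bar is ~0.1 degC, yet the estimate misses the true
# mean by well over a degree: the bar says nothing about the sampling bias.
```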
Phobos says:
February 26, 2013 at 12:52 pm
Uzurbrain says: "All measurements MUST be of the exact same entity at the exact same time under the exact same conditions."
“Of course not — if that were true, you could never measure something’s variation with time. You couldn’t determine the average annual temperature in your backyard. You couldn’t measure the speed of an automobile.”
You define “exact same time” in terms of the physical processes measured. All the Hershey Bars are measured at the exact same time – the instant when they fall from the production line into the packaging machine. You cannot be this dumb.
A few thoughts regarding ocean data and observations:
The ARGO buoy network was giving data that contradicted the AGW narrative, so it was “adjusted”. Prior to adjusting it, this is what ARGO showed.
Envisat provided satellite sea level measurements. But they didn’t show what the climate lobby wanted, so like ARGO, the data was ‘adjusted‘.
Sea surface temperature [SST] varies widely depending on where the measurements are taken.
…scientists from the Norwegian Polar Institute reported that they’d measured sea temperatures beneath an East Antarctic ice shelf and found no signs of warming whatsoever… [source]
If the rise in CO2 caused global warming, the OHC would be rising at an accelerating rate. It isn’t. The trend is the same, whether CO2 is low or high. Moreover, in recent years OHC has flattened, in parallel with the stalling of global warming. It is becoming very difficult to square these observations with the AGW narrative. When the models disagree with observation, the models are wrong.