Ocean Temperature And Heat Content

Guest Post by Willis Eschenbach

Anthony has an interesting post up discussing the latest findings regarding the heat content of the upper ocean. Here’s one of the figures from that post.

Figure 1. Upper ocean heat content anomaly (OHCA), 0-700 metres, in zettajoules (10^21 joules). Errors are not specified but are presumably one sigma. SOURCE

He notes that there has been no significant change in the OHCA in the last decade. That is a significant piece of information. I still have a problem with the graph, however, which is that the units are meaningless to me. What does a change of 10 zettajoules mean? So, following my usual practice, I converted the graph into more familiar units: degrees C. Let me explain how I went about that.

To start with, I digitized the data from the graph. Often this is far, far quicker than tracking down the original dataset, particularly if the graph includes the error bars. I work on the Mac, so I use a program called GraphClick; I'm sure the same or better is available on the PC. I measured three series: the data, the plus error, and the minus error. I then put this data into an Excel spreadsheet, available here.

Then all that remained was to convert the change in zettajoules to the corresponding change in degrees C. The first number I need is the volume of the top 700 metres of the ocean. I have a spreadsheet for this. Interpolated, it gives 237,029,703 cubic kilometres. I multiply that by 62/60 to adjust for the density of salt versus fresh water, and by 10^9 to convert cubic kilometres to tonnes. I then multiply by 4.186 megajoules per tonne per degree C. That tells me it takes about a thousand zettajoules to raise the upper ocean temperature by 1°C.
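For anyone who wants to check that arithmetic, here is a minimal sketch of the conversion in Python. The volume figure and the 62/60 density adjustment are the ones quoted above; the constant names and the rounding are mine, and nothing here comes from the PMEL dataset itself.

```python
# Back-of-the-envelope check of the zettajoule-to-degrees conversion described above.
VOLUME_KM3 = 237_029_703          # top 700 m of the ocean, cubic kilometres (from the spreadsheet)
TONNES_PER_KM3_FRESH = 1e9        # one cubic kilometre of fresh water is ~1e9 tonnes
SEAWATER_DENSITY_RATIO = 62 / 60  # rough salt-vs-fresh density adjustment
SPECIFIC_HEAT_MJ = 4.186          # megajoules per tonne per degree C

mass_tonnes = VOLUME_KM3 * TONNES_PER_KM3_FRESH * SEAWATER_DENSITY_RATIO
joules_per_degree = mass_tonnes * SPECIFIC_HEAT_MJ * 1e6   # MJ -> J
zettajoules_per_degree = joules_per_degree / 1e21

print(f"{zettajoules_per_degree:,.0f} ZJ per degree C")    # ~1,025 ZJ
# Converting the chart is then just a division, e.g. a 10 ZJ anomaly:
print(f"10 ZJ is about {10 / zettajoules_per_degree:.3f} degrees C")   # ~0.010
```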

Dividing all of the numbers in their chart by that conversion factor gives us the same chart, but in units of degrees C. Calculations are shown on the spreadsheet.

Figure 2. Upper ocean heat content anomaly, 0-700 metres, in degrees C.

I don't plan to say a whole lot about that; I'll leave it to the commenters, other than to point out the following facts:

• The temperature was roughly flat from 1993-1998. Then it increased by about one tenth of a degree in the next five years to 2003, and has been about flat since then.

• The claim is made that the average temperature of the entire upper ocean of the planet is currently known to an error (presumably one sigma) of about a hundredth of a degree C.

• I know of no obvious reason for the 0.1°C temperature rise 1998-2003, nor for the basically flat temperatures before and after.

• The huge increase in observations post 2002 from the addition of the Argo floats didn’t reduce the error by a whole lot.

My main question in this revolves around the claimed error. I find the claim that we know the average temperature of the upper ocean with an error of only one hundredth of a degree to be very unlikely … the ocean is huge beyond belief. This claimed ocean error is on the order of the size of the claimed error in the land temperature records, which have many more stations, taking daily records, over a much smaller area, at only one level. Doubtful.

I also find it odd that the very large increase in the number of annual observations due to the more than 3,000 Argo floats didn’t decrease the error much …

As is common in climate science … more questions than answers. Why did it go up? Why is it now flat? Which way will the frog jump next?

Regards to everyone,

w.

230 Comments

Mindert Eiting
February 26, 2013 2:49 pm

Willis at 12:09 pm. I agree that the claim of one standard error equaling 0.01 for the global mean is curious, but check whether this is the standard error of the global mean or something else, like the annual mean per float (or per stratum). Each float takes a measurement every ten days, so there is already a small error in the annual mean per float. For the remainder, assume that in a certain year we have a random sample of size 3,000 from an imaginary population of float means. The square root of 3,000 is 54.77. The global standard deviation is 8.73. Then the standard error of the global mean is 8.73/54.77, or 0.16.
The statistical theorem holds for random samples, whether the distribution is normal or not. Note that the standard error is the standard deviation of sample means over repeated random sampling. Over the course of years, the float samples are dependent, so the standard error may be re-interpreted as the standard deviation of means in dependent or non-random samples. To give an extreme example: take one year of data and make one hundred copies of it. Agreed that the mean over the copies has a standard deviation of zero? That would never be called a standard error. So the reported value of 0.01 may apply to dependent samples, underestimating the real standard error.
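Mindert's point is easy to check numerically. Here is a minimal sketch using his figures (3,000 floats, a global standard deviation of 8.73); the data are purely synthetic, so no real Argo values are involved.

```python
# Standard error of the mean for an independent random sample, plus the
# "hundred copies" thought experiment: dependent samples shrink the spread of
# the means without adding any information.
import math
import random

n_floats = 3000
global_sd = 8.73

# Textbook standard error for a random sample of independent float means:
se_independent = global_sd / math.sqrt(n_floats)
print(f"SE for a random sample of {n_floats}: {se_independent:.2f}")   # ~0.16

# Duplicate one year's sample 100 times: the means are identical, so their
# spread is exactly zero, yet we have learned nothing new about the ocean.
random.seed(1)
one_year = [random.gauss(0.0, global_sd) for _ in range(n_floats)]
copies = [one_year[:] for _ in range(100)]
copy_means = [sum(c) / len(c) for c in copies]
print(f"Spread (max - min) of the 100 copy means: {max(copy_means) - min(copy_means):.6f}")
```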

Merovign
February 26, 2013 3:25 pm

Allow me to quote Blackadder:
“So what you’re saying, Percy, is that something you have never seen is only slightly less blue than something else… that you have never seen.”
That's kind of the vibe I get when something changes so far within the margin of error that its detection may well start the process of organizing a Canonization committee.

February 26, 2013 3:52 pm

gymnosperm says:
February 26, 2013 at 7:52 am
I think everyone has the Tisdale function backwards. ENSO transfers energy from the ocean to the atmosphere so ocean enthalpy should have decreased after the 1997 El Nino.

Well, not according to my understanding. Sure, there is a transfer of heat from the ocean to the atmosphere, but there is also a redistribution of warm water. Also, La Nina is generally a "charging" phase where net energy is being absorbed by the oceans and piled up in the Western Pacific Warm Pool. When the trades relax, that water spreads back east along the equator and loses heat to the atmosphere. When the trades return, that warm water is redistributed.
We have an interesting condition right now in that we don’t really have much energy in the warm pool. The trades have been nominal. If they were to slacken now, we wouldn’t see much of an El Nino. Look at the coast of Southeast Asia right now. There is no massive buildup of warm water there.
http://weather.unisys.com/surface/sst_anom_new.gif

bw
February 26, 2013 3:55 pm

Going back to the original abstract in Nature, the graph is based on pooling global data in an attempt to find a warming signal. Pooling global temperature data is entirely bogus.
No point in asking what the error bars represent; they are fantasy.
There is also a supplement trying to explain the error methodology. The authors of the original analysis are pooling, homogenizing, and excluding data in an attempt to find a "global" value.
Sounds like most of the so-called global warming conjecture.

February 26, 2013 5:17 pm

If I ran the Argo zoo, I'd moor 30 floats, or make them powered so that they could be kept at the same location, in order to quantify the bias resulting from free-floating drift, especially as the TAO buoys have solved the deep-ocean mooring problem.

kadaka (KD Knoebel)
February 26, 2013 5:30 pm

From Willis Eschenbach on February 26, 2013 at 2:47 pm:

Actually, you both seem to have missed my point, my intention must not have been clear. What I tried to say was that if you ignore any part of the ocean, it only increases the error in the average. It never decreases it.

Actually, on the glance-through (I wasn't reading Phobos' scribblings too carefully), it looked like he was dismissing your concern about above 60°N being only 1.5% of samples by mentioning that the Arctic Ocean itself is only 1.1% of the total volume, as if that was supposed to make sense.
My toss-off comment finished with “following the logic” to the Arctic Ocean not being important in discussions about climate, due to the small size Phobos cited… And who’d believe that?
As it is, there are certain small bits of water that, due to their location, have a far greater effect on climate than their “percent of volume” indicates. Which makes dismissing a region due to the low percentage an even worse error.

February 26, 2013 6:08 pm

Willis writes “I’d put them in a rigorously temperature controlled pressure chamber. I’d simulate a series of dives to their resting level (typically 1000 metres) and take the pressure and temperature and other recordings as usual. ”
I hope they did that with the floats that were thought to be giving cold-biased values (as well as any floats they pulled that were giving the expected warming bias).
I've not seen anyone reporting the results of testing those floats, however. If it turned out they read accurately in those tests, then the readings ought to stand until they came up with another reason…
A week or so ago here in Hobart we had a stinking hot 35C day, followed by a 15C day, and then back up to about a 25C day after that. Those extreme days may seem "wrong" to an algorithm looking for erroneous readings, but in fact they were all perfectly accurate.
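The Hobart example is easy to reproduce. Below is a toy sketch of a naive quality-control rule that flags any large day-to-day jump; the threshold and the temperature sequence are invented for illustration, and the perfectly accurate extreme days get flagged anyway.

```python
# Toy quality-control check: flag any reading that jumps more than a fixed
# threshold from the previous day. The sequence mimics the Hobart example:
# real weather, yet the naive rule marks it as suspect.
daily_max_c = [22, 23, 21, 35, 15, 25, 22]   # stinking hot day, then a cold one
JUMP_THRESHOLD_C = 8                         # arbitrary threshold, for illustration only

for yesterday, today in zip(daily_max_c, daily_max_c[1:]):
    if abs(today - yesterday) > JUMP_THRESHOLD_C:
        print(f"flagged: {yesterday} C -> {today} C (accurate, but rejected)")
```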

Eyes Wide Open
February 26, 2013 6:18 pm

Two key points:
1) As someone mentioned earlier, any depth below 100m is a heat sink, so how much of the rise in OHC is simply filling this sink?
2) It would be interesting to get a profile of the heat increase over the last 10 years at 0m, 10m, 50m, 100m, and then every 100m below that. My guess would be that the upper layers have lost heat over the last 10 years, offset by gains in the lower layers (due to the heat-sink effect). That would explain the cooling global climate over that period!
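A rough sketch of the layer-by-layer bookkeeping that point 2 is asking for is below; the layer boundaries follow the comment, but the temperature changes, ocean area and seawater properties are placeholder values, not data.

```python
# Hypothetical heat-content change by depth layer. The per-layer temperature
# changes below are invented placeholders purely to show the calculation.
OCEAN_AREA_M2 = 3.6e14      # rough global ocean surface area, m^2
SEAWATER_DENSITY = 1025.0   # kg per m^3
SPECIFIC_HEAT = 4000.0      # J per kg per K, rough value for seawater

layers_m = [(0, 10), (10, 50), (50, 100), (100, 200), (200, 300),
            (300, 400), (400, 500), (500, 600), (600, 700)]
delta_t_c = [-0.05, -0.02, 0.00, 0.01, 0.02, 0.02, 0.02, 0.01, 0.01]  # placeholders

total_joules = 0.0
for (top, bottom), dt in zip(layers_m, delta_t_c):
    volume = OCEAN_AREA_M2 * (bottom - top)      # ignores the area shrinking with depth
    total_joules += volume * SEAWATER_DENSITY * SPECIFIC_HEAT * dt

# With these made-up numbers the upper layers cool while the deeper layers warm,
# and the 0-700 m total still comes out positive.
print(f"Net 0-700 m heat content change: {total_joules / 1e21:+.1f} ZJ")
```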

February 26, 2013 6:32 pm

Phobos says: February 26, 2013 at 12:52 pm
You are confusing what is theoretically desirable with what is practical. All measurements — ALL of them — are a compromise between what’s ideal and what’s doable.
——————
Thank you for admitting the stupidity of trying to measure the temperature of the ocean, AND that is the heart of the problem. You are telling us that we have to live with what is doable. Measuring the temperature of the ocean is not doable. The number of unknown variables exceeds the capabilities of the combined power of all existing supercomputers.
I assume you know that the ocean has a surface area of 63,780,000 sq miles (165,200,000 km²), and many known and unknown currents, many known and unknown thermocline depths, areas where for some unknown reason the temperature is unexplainably warmer or colder, and areas like the Sargasso Sea. Even if they were spread equidistantly over the entire ocean, you could not do this with 3,000, 30,000 or 300,000 probes, because you would be combining different, incompatible measurements, creating a meaningless collection of garbage. If you were to use 3,000,000 probes, each would be covering 22 square miles, and if they had static probes at each desired depth (at least 10 intervals over the 700 meters) you might start to get some relevant, though not perfect, data. Till then, all you are doing is wasting money and providing a forecast of Armageddon and apocalypse based upon instruments and reports that have an error band wider than the measurement they are taking. But it looks good, because it is displayed to 3 decimal points!
Like I said earlier, you cannot measure the quantity of water in a lake with a barrel to the nearest milliliter, regardless of how many times you measure it. That is what you are advocating: it is doable to measure the quantity with a barrel; therefore, we will measure the quantity in 30,000 places, average them out, and report the findings with a precision of 0.01 milliliters. Then, for good measure, claim that with the standard deviation of the normal distribution allowing for a standard error of one sigma, divided by chi square over pi, and so forth, and so on, and etcetera, and more bu.. sh..
Get a good book on Applied Statistics. Read it.

dp
February 26, 2013 7:21 pm

These dramatic rates of change in the sea temperature represent huge rates of change in the influencing energy source. If 10 petajoules is required to sustain any particular rate of change, what rate of change is necessary to cause the flat-lining? Is there a negative energy source out there? Without one, that energy source has to work like a toggle switch *and* the rate at which the ocean loses energy has to be very non-linear. There is no flywheel effect in that graph, and I find that troubling.

February 26, 2013 7:23 pm

Phobos says:
“Don’t forget, the uncertainty of an average is less than the uncertainty of any of what it’s averaging.”
Not true; oftentimes it can be higher. How much statistics do you really know? To say something so outright false just makes me think you are misinformed about statistics in general.
Here is what you SHOULD have said if you had taken Stat 101:
"The uncertainty of an average is less than the uncertainty of the individual measurements only in certain cases. In the case of ARGO, I claim that the measurements are all equivalent, and so the uncertainty of the average is equal to this:
For example, if you have two temperatures T1 and T2, each with a measurement uncertainty of dT, the uncertainty dA of the average is
dA = dT/sqrt(2)
For N points the denominator is sqrt(N).
Then of course, if you were being honest, you would add the following:
"Levitus et al 2012 claims this is the only error, and I read and lazily agreed without giving it any further thought. So my opinion on this matter is rather worthless. So just go read them if you disagree with me, because my brain has shut down and I cannot think."
Why that last paragraph? Well you are being lazy. You cite a paper and state that the error is what they say it is. Did you have any thoughts, concerns, or even some opinion on their evaluation of the data? Come on man, up your game. Give us something. Don't just go around thumbing your nose at everyone else while you are being lazy intellectually.
OK, enough with "that"…. My thoughts on ARGO are these:
As we add 700 buoys at (certain) locations every year, we are adding instruments which have not been active prior to this year. *duh* Therefore, what is the error of the instruments over time, being subjected to various weather phenomena, including in some cases hurricanes? (The ARGO floats can dive to avoid some of this, but then again that raises the question of whether the diving will affect the instruments.)…
Another issue:
"Float reliability has improved each year and the float lifetime has been extended." So in essence, every year you add more accurate instrumentation. This is probably a good thing, but it brings out errors such as what Willis stated: namely, that you have instruments which, as I mentioned earlier, are around longer and longer… which becomes a larger and larger problem. In essence, adding more accurate and reliable instruments might actually increase your error for that reason. You would never know, though, if you did not test the instruments over time and find out whether you had a systematic drift in the readings. (I think someone mentioned this possible error earlier….)
Another systematic error not caught is the time of the measurements and whether this has a systematic effect. My guess is probably not, but more readings would not hurt either, so to get rid of this error, have the things record more often. Perhaps they are unable to do so, in which case any new buoys added should measure more often.
I don't know; there was so much covered by Willis that I tend to wonder if the grids are a problem now too. I would have to think on that, but I tend to think that the grids are going to be hairy to make, since the positioning of these buoys is not exact.
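Two of benfrommo's points, the dT/sqrt(N) formula quoted above and the worry about instrument drift, can be combined in a few lines. This is a synthetic sketch only; the noise level and the drift are invented numbers.

```python
# Random instrument noise averages away as 1/sqrt(N); a calibration drift
# shared by the instruments does not, no matter how many readings you take.
import math
import random

random.seed(42)
true_temp = 10.0
noise_sd = 0.5     # per-instrument random error, degrees C (invented)
drift = 0.05       # shared calibration drift, degrees C (invented)

for n in (100, 1_000, 10_000):
    readings = [true_temp + drift + random.gauss(0.0, noise_sd) for _ in range(n)]
    mean_error = sum(readings) / n - true_temp
    predicted_se = noise_sd / math.sqrt(n)
    print(f"N={n:6d}  mean error={mean_error:+.4f}  predicted random SE={predicted_se:.4f}")
# As N grows, the random part shrinks toward zero, but the +0.05 drift stays in the mean.
```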

Manfred
February 26, 2013 7:35 pm

Mike Jonas says:
February 25, 2013 at 10:37 pm
re "the 0.1°C temperature rise 1998-2003". Could it be from the 1997-8 El Nino? If the upwelling warm water comes from below 700m depth, or if the temperature measure doesn't treat all 700m equally, it could be the El Nino. I think this would be in line with Bob Tisdale's thinking that there is a 'step function' at an El Nino.
—————————
It would make sense to see some increase in 1997/1998 due to the El Nino.
But not an increase continuing until 2003.
And not one stopping just when you start measuring for the first time with a good instrument.
And not one showing no increase during the subsequent El Ninos in 2005 and 2010.
The most plausible explanation for the increase is then bad data or bad adjustment.

February 26, 2013 9:20 pm

benfrommo says:
February 26, 2013 at 7:23 pm
“Well you are being lazy. You cite a paper and state that the error is what they say it is. Did you have any thoughts, concerns, or even some opinion on their evaluation of the data? Come on man, up your game. Give us something. Don’t just go around thumbing your nose at everyone else while you are being lazy intellectually.”
And that, ladies and gentlemen, is probably the root of the problem.
Nice distillation benfrommo!

February 26, 2013 9:50 pm

This is an awful lot of comments for nothing more than a thought experiment. Are we all still pretending the OHC was measurable in 97-98? And then suddenly we start talking about ARGO? HELLO!!!??!!! ARGO started being deployed in 2000. This isn’t an apples to oranges comparison! It’s apples to imagination! It’s a meaningless conversation.
But, then, so are the thoughts behind trying to measure the OHC with the buoys. This isn't like static ground thermometers. And even if it were, what they are measuring is different with each measurement. It isn't as if they're measuring the same piece of water, even if the location is exactly the same. Quit letting people pretend this gives us some value or understanding. It doesn't. It's as meaningful as me measuring the heat content of the water from my tap each day. As a bonus, I'll measure it in a pot and change depths.
Wait, never mind. I forgot that some people believe that heat from SUV driving magically drops to the depths of the oceans and lurks. Please ignore and carry on with the delusions.

Andyj
February 27, 2013 3:31 am

The graph under review has serious issues.
For a start, it shows anomalies. To me that means 99.99% of the world's oceans have none, and the Y axis is calculated as a mean. Heating of the oceans to depths of 700m from heating the air, by simply blowing air on it… AND being able to measure the difference year on year… Guys like James Sexton say this graph has issues. No doubts about that.

February 27, 2013 5:40 am

Ray says:
February 26, 2013 at 7:46 am
“Ben I have no idea if geothermal heat is factored into the assorted Climate models in use or how a warm “Black Body” behaves.”
The current heat flux through the crust is negligible (~0.1 W/m^2) and is not factored in, AFAIK.
But the seismic event I mentioned involved 100 million km^3 of magma erupting from inside the Earth into the Pacific Ocean. That's enough magma to cover all of Canada under a 10 km thick layer of sizzling hot magma. You can't ignore events of this magnitude.
This event has the potential to warm ALL of the world's ocean water by some 15-20 K.
So with the oceans pre-heated from within the Earth, it makes no sense to discuss black- or grey-body behaviour. Just look at how many joules per second the sun delivers to Earth and how that energy warms the already warm oceans.

February 27, 2013 6:51 am

Very interesting, Willis…. It appears you found some more selection bias all over again. The more we find, the more we show that the scientists were either being lazy or incompetent. I don't know which, but if they can write a paper like that and not even mention these effects, to the tune of simply "We did not consider x, y and z," then they are either being lazy and not thinking this through, or perhaps they are being dishonest. Now, I make no claim to either one, and perhaps they just figured adding that was a waste of time, but regardless, that is rather shoddy work!
In any regard, I think politics may also be playing a large part in this. Here is an excerpt from the ARGO page:

"We are increasingly concerned about global change and its regional impacts. Sea level is rising at an accelerating rate of 3 mm/year, Arctic sea ice cover is shrinking and high latitude areas are warming rapidly. Extreme weather events cause loss of life and enormous burdens on the insurance industry. Globally, 8 of the 10 warmest years since 1860, when instrumental records began, were in the past decade.
These effects are caused by a mixture of long-term climate change and natural variability. Their impacts are in some cases beneficial (lengthened growing seasons, opening of Arctic shipping routes) and in others adverse (increased coastal flooding, severe droughts, more extreme and frequent heat waves and weather events such as severe tropical cyclones)."

Of all of that, two things strike me. (This goes toward the motivations of the scientists, to perhaps explain whether they are being intentionally dishonest or just lazy.)
They claim that 8 of the 10 warmest years since 1860 were in the past decade? Did they not think this through? Of course the last decade is warmer than any previous decade. That is to be expected when you start the data during the Little Ice Age and have over 100 years of warming. What did they expect, record cold? That just makes no sense to me… because frankly, that information does not tell you whether you are in a warming world or whether the warming was in the past. After all, do we not expect to find the highest elevations on Earth at the tops of mountains?
Don't we expect to find the highest temperatures in the data at the top of the proverbial temperature chart? And why in the world are these world-class scientists not able to realize that the "warmest decade" meme is pointless to repeat, because it comes from being lazy and not thinking for yourself?
And then the appeal to "extreme weather." Not one paper written has confirmed this "CRAZY GUESS," and yet it is repeated by scientists as if it were gospel truth. Do they realize that, since the science does not back this up, simply stating it means they aren't thinking at all? I don't know what to say to add to that… In any regard, reading about ARGO and the scientists involved makes me think they are either very lazy or very incompetent. I don't know which, but this decadent line of thinking is what is going to kill climate science more than anything else. Normal, rational people, after all, can reason things out, and if you go on about unproven quack theories like extreme weather, the science will suffer, as your conclusions are based on faulty input.
Hello, can we say garbage in, garbage out again?
Perhaps that is the problem: these scientists, either for political reasons or for other reasons, are not thinking things through well enough. In any regard, pure laziness is not an excuse for sub-par papers that claim a certain error when they do not even mention other sources of error. Come on, a sixth-grader could have done better in that regard. Heck, my sixth-grade science project was graded more harshly than I am grading these scientists, so if my post bothers them, perhaps they just need to grow thicker skin…

Phobos
February 27, 2013 7:07 am

Willis Eschenbach wrote: “I’m just not finding how those things are being accounted for.”
If you read Levitus et al 2012, you see they are not claiming to calculate the average temperature of the oceans, but of the “World Ocean.”
The World Ocean isn’t the exact ocean; it’s meant to be a representation of it, subject to the limitations of what’s experimentally doable and affordable. It’s a model of the exact ocean, if you want to call it that.
Of course, such constructions are the norm in environmental science, and indeed in most science. One can never measure the ideal system, so one somehow obtains reasonable facsimiles of it and does the measurements there, accounting for differences as well as reason and creativity allow, and doing one's best to indicate the amount of resulting uncertainty.
The interest (in this case) isn't so much in whether the modeled system gives a precise measurement of the average temperature of the ocean (which would, of course, require an infinite number of measurements {x,y,z,t}, one for every point in the ocean and at every instant in time), but in how the consistently modeled system changes with time.
All data in science depends on a model. ALL data. I suspect you know this, and are just looking for ways to dismiss the OHC results in any way possible. There are an infinite number of objections that could be raised; but, do you have a better method?

Mark Bofill
February 27, 2013 7:20 am

Phobos says:
February 27, 2013 at 7:07 am

All data in science depends on a model. ALL data. I suspect you know this, and are just looking for ways to dismiss the OHC results in any way possible. There are an infinite number of objections that could be raised; but, do you have a better method?
—————————-
Phobos, I think Willis was pretty clear here:

My main question in this revolves around the claimed error. I find the claim that we know the average temperature of the upper ocean with an error of only one hundredth of a degree to be very unlikely … the ocean is huge beyond belief. This claimed ocean error is on the order of the size of the claimed error in the land temperature records, which have many more stations, taking daily records, over a much smaller area, at only one level. Doubtful.

His issue is with the error claims, not with the model.

Phobos
February 27, 2013 8:13 am

Bofill: Again, they aren’t measuring the average temperature of the huge ocean; they’re measuring the average temperature of the “World Ocean.”
As Levitus et al 2012 write in their Supplementary Material: “The results describing the variability of ocean heat content shown here are based on gridded (1-degree latitude-longitude grid), interpolated fields at standard depth measurement levels….”
You have about 3,000 buoys surfacing about every 10 days, taking temperature profiles all the way up. At any given depth, that’s about 110,000 measurements per year, so about one measurement every 3500 km^2, randomized over most of that ocean layer.
Sure, it’d be great to have 10 times as many buoys, or a hundred times more. Is there a reason to think that would give greater precision of the model’s average temperature?
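A quick check of those sampling figures is below; the ocean area used is an approximate textbook value rather than anything from Levitus et al., and Argo does not in fact cover the whole ocean, so treat the per-profile area as a rough ballpark.

```python
# Rough check of the Argo sampling-density figures quoted in the comment above.
N_FLOATS = 3000
CYCLE_DAYS = 10
OCEAN_AREA_KM2 = 3.61e8     # approximate global ocean surface area

profiles_per_year = N_FLOATS * 365 / CYCLE_DAYS
km2_per_profile = OCEAN_AREA_KM2 / profiles_per_year

print(f"profiles per year: {profiles_per_year:,.0f}")      # ~109,500
print(f"one profile per ~{km2_per_profile:,.0f} km^2")     # ~3,300 km^2
```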

Theo Goodwin
February 27, 2013 8:17 am

Phobos says:
February 27, 2013 at 7:07 am
You have set a record for self-serving tripe on WUWT and maybe in the whole arena of science. You write:
“The World Ocean isn’t the exact ocean; it’s meant to be a representation of it, subject to the limitations of what’s experimentally doable and affordable. It’s a model of the exact ocean, if you want to call it that.
Of course, such constructions are the norm in environmental science, and indeed in most science. One can never measure the ideal system, so one somehow obtains reasonable facsimiles of it and does the measurements there, accounting for differences as well as reason and creativity allow, and doing one's best to indicate the amount of resulting uncertainty."
So, if you do not have much money for experiments, or much time, interest, or imagination, then you do what you can: you collect some numbers somehow, and from that you project what you call a model of the World Ocean? Is that about right? Please notice in the comments above that the words that have risen to the top are "lazy" and "thoughtless." You have nothing to say about evaluation of data. After raising the "sampling candy bars" analogy, you fail to respond to criticisms of it.
However, I can say in your defense that I have not encountered an Alarmist who would attempt to think through the steps that must be taken in nature (the real world) to validate the measurements that support what you call your model.