In a number of posts, we’ve discussed and illustrated the difficulties with ocean heat content data. (There are links to those earlier posts at the end of this one.) The data presented in this post is supported by the 2012 Levitus et al paper World ocean heat content and thermosteric sea level change (0–2000 m), 1955–2010 [8.1 MB].
One topic discussed but not illustrated (until now, in Figure 1) was that the annual variations in temperature at depths between 700 and 2000 meters are on the order of hundredths, if not thousandths, of a deg C, and that it is unrealistic to think we could measure the temperatures of the oceans at depth with that accuracy. It turns out that the annual variations are typically in thousandths of a deg C. The full vertical scale of the graph in Figure 1 spans only two hundredths of a deg C.
Keep in mind that Figure 1 presents the approximated changes in annual temperature for the depths of 700-2000 meters—not the anomalies. The annual changes are determined simply: by subtracting the value of the previous year from the value of the year being plotted.
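The annual-change calculation can be sketched in a few lines. The anomaly values below are hypothetical placeholders, not NODC data; they're only there to show the subtraction.

```python
# Sketch of the annual-change calculation described above, using
# hypothetical anomaly values (deg C) rather than the actual NODC series.
anomalies = {2008: 0.045, 2009: 0.051, 2010: 0.054, 2011: 0.057}

years = sorted(anomalies)
# Change for each year = that year's value minus the previous year's value.
annual_changes = {yr: round(anomalies[yr] - anomalies[yr - 1], 3)
                  for yr in years[1:]}
print(annual_changes)  # the changes land in the thousandths of a deg C
```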
The ocean temperature data for the depths of 700-2000 meters is an important topic. With the surface temperatures no longer warming and with the slowdown in the rate at which the oceans are warming at depths of 0-700 meters, climate scientists are now saying the missing heat is being forced to depths below 700 meters. Of course, that assumes the missing heat actually exists.
DATA AND METHOD
The NODC provides vertically averaged temperature anomalies for 0-700 meters here, and for the depths of 0-2000 meters here. To use that data to determine the variations in temperatures at depths between 700 and 2000 meters we need to know the volume of water in the depth ranges of 0-700 meters and 0-2000 meters. In reality, we don’t necessarily need the volumes. We can work with percentages.
According to an EPA document about ocean heat content data (here):
…the top 700 meters of the ocean (nearly 2,300 feet)…accounts for just under 20 percent of the total volume of water in the world’s oceans.
“[J]ust under 20 percent” could mean many things, so we’ll use the rounded figure of 20%.
Then on the last page of the NOAA presentation here, under the heading of “ARGO Future Possibilities”, they have the bullet point:
-52% of ocean volume below 2000 m
That obviously means that about 48% of the ocean volume is above 2000 meters.
Assuming those percentages are close to correct, 41.66% of the volume of the oceans to depths of 2000 meters comes from the top 700 meters (20/48), and 58.33% comes from the depths of 700-2000 meters (28/48). I told you this was a rough estimate. So now we can use a simple weighted-average equation, solved for one of the components, to determine the changes in ocean temperature for the depths of 700-2000 meters. The result is illustrated above in Figure 1, where the data are presented as the annual change in temperature.
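Here's a minimal sketch of that weighted-average back-calculation. The volume fractions follow from the EPA and NOAA figures quoted above (20%/48% and 28%/48%); the anomaly values in the example are hypothetical placeholders, not NODC data.

```python
# Back out the 700-2000 m temperature anomaly from the 0-700 m and
# 0-2000 m anomalies, using the rough volume fractions discussed above.
F_UPPER = 0.20 / 0.48   # fraction of the 0-2000 m volume in the top 700 m (~0.4166)
F_LOWER = 0.28 / 0.48   # fraction in the 700-2000 m layer (~0.5833)

def temp_700_2000(t_0_700, t_0_2000):
    """Solve t_0_2000 = F_UPPER*t_0_700 + F_LOWER*t_700_2000
    for the 700-2000 m component."""
    return (t_0_2000 - F_UPPER * t_0_700) / F_LOWER

# Hypothetical example: 0-700 m anomaly of 0.10 deg C,
# full-column (0-2000 m) anomaly of 0.05 deg C.
print(round(temp_700_2000(0.10, 0.05), 4))
```

Note that because the lower layer carries the larger volume weight, small differences between the two input series translate into very small temperature changes at depth.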
PUTTING THAT IN PERSPECTIVE
Figure 2 compares the annual changes in global sea surface temperature anomalies (HADSST3) and the vertically averaged temperatures of the global oceans to depths of 0-700 and 700-2000 meters.
In Figure 3, the data has been returned to their standard time-series form and I’ve added global land+sea surface temperature anomalies (HADCRUT4) as a reference. All of the data have been zeroed at 1955.
Many people find it hard to believe that global surface temperatures can be estimated to the precision shown. They'll certainly have difficulty with the warming shown at depths of 700-2000 meters.
THE NODC’S HOCKEY STICK WHEN THEY TRANSITION TO ARGO DATA
Figure 4 presents the vertically averaged temperature anomalies for the global oceans for the depths of 700-2000 meters. They warm at a relatively constant and very slow rate until the ARGO floats begin to have complete coverage of the global oceans around 2003. Then they warm at a rate that's almost 24 times faster than the rate seen in the 5 decades leading up to it.
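The before-and-after trend comparison can be sketched with a least-squares fit. The series below is invented, constructed purely to show the mechanics of computing the two slopes and their ratio; it is not the NODC data.

```python
# Hypothetical sketch of comparing warming rates before and after 2003
# with least-squares linear fits. The data are invented for illustration.
import numpy as np

years = np.arange(1955, 2013)
# Toy series: a very slow rise to 2002, then a much steeper rise.
temps = np.where(years < 2003,
                 0.0001 * (years - 1955),
                 0.0001 * (2003 - 1955) + 0.0024 * (years - 2003))

early = years < 2003
slope_early = np.polyfit(years[early], temps[early], 1)[0]   # deg C per year
slope_late = np.polyfit(years[~early], temps[~early], 1)[0]  # deg C per year
print(round(slope_late / slope_early, 1))  # ratio of the two warming rates
```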
It doesn’t look very realistic, now, does it?
IS THE HOCKEY STICK DEPENDENT ON THE PERCENTAGES USED?
Let’s assume that the percentages I discovered online are wrong, and that ocean volume is directly proportional to depth. That is, we’re going to assume that there are no continental shelves, so that the volume of the oceans from 0-700 meters is 35% of the volume from 0-2000 meters (700/2000 = 0.35). In our hypothetical example world, the volume of water from 700-2000 meters therefore represents the other 65% of the depths from 0-2000 meters. Plugging those percentages into our standard weighted-average equation produces the curve shown in Figure 5. The hockey stick still exists, but it’s not as pronounced.
IS THE NODC FOOLING ITSELF WITH THE PENTADAL DATA?
Or are they trying to fool us?
In past posts (here and here), we showed how the NODC increases the warming rate of their ocean heat content data by presenting it in pentadal form. See the comparison graphs at depths of 0-700 meters (here) and 0-2000 meters (here). Figure 6 compares the roughly estimated vertically averaged temperature anomalies for the global oceans, for the depths of 700-2000 meters, from 1955 to 2012, in annual and pentadal forms. The same percentages of ocean volumes at depth (0.4166 for 0-700 meters and 0.5833 for 700-2000 meters) were applied to calculate both. In this comparison, I did not smooth the annual data because I wanted the break point in 2002 to be visible.
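For readers unfamiliar with the term, pentadal data are 5-year averages. A centered 5-year running mean, sketched below with hypothetical values, is one common way to produce such a series; the NODC's actual procedure may differ.

```python
# Sketch of a centered 5-year (pentadal) running mean. The input values
# are hypothetical anomalies, and the NODC's actual method may differ.
def pentadal(series):
    """Centered 5-year running mean of an annual list; the first and last
    two years are dropped because a full 5-year window isn't available."""
    return [round(sum(series[i - 2:i + 3]) / 5, 4)
            for i in range(2, len(series) - 2)]

annual = [0.00, 0.01, 0.00, 0.02, 0.01, 0.03, 0.02]  # hypothetical anomalies
print(pentadal(annual))
```

Smoothing of this kind suppresses the year-to-year variations, which is why the pentadal curve can look so different from, and trend more steeply than, the annual data.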
Not only does the data in pentadal form have a significantly higher warming rate, but its curve bears little resemblance to that of the annual data.
To account for the slowdown in the warming of surface temperatures and in the heat content of the upper 700 meters of the global oceans, the climate science community is claiming manmade global warming is bypassing those levels and warming the oceans at depths below 700 meters. But that warming of the subsurface temperatures of the global oceans from 700-2000 meters represents variations measured in thousandths of a deg C. It is not reasonable to think we can measure the temperatures of the global oceans to that accuracy. Now consider that ocean heat content data have to be adjusted—tweaked, modified, corrected, whatever—in order for them to show warming during the ARGO era. (The UKMO EN3 data in the graph here should represent the uncorrected ARGO-era data for depths of 0-2000 meters.)
The vertically averaged temperatures for the depths of 700-2000 meters can be approximated from the NODC's data for the depths of 0-700 and 0-2000 meters. The result shows a very sudden shift in the rate of warming for depths of 700-2000 meters. The shift coincides with the introduction of the ARGO floats to rarely sampled portions of the global oceans—the mid-to-high latitudes of the oceans of the Southern Hemisphere, for example. This suggests that the warming presented by the data at those depths may result from the more-complete sampling of the global oceans.
We’ve discussed the problems with ocean heat content data in numerous posts over the past few months. (Links to those posts are provided at the end of this post.)
Ocean heat content for the depths of 0-2000 meters is not a dataset in which one should have any confidence. It appears that it was introduced solely so that global warming enthusiasts could claim global warming continues.