Rough Estimate of the Annual Changes in Ocean Temperatures from 700 to 2000 Meters Based on NODC Data

OVERVIEW

In a number of posts, we’ve discussed and illustrated the difficulties with ocean heat content data. (There are links to those earlier posts at the end of this one.) The data presented in this post is supported by the 2012 Levitus et al paper World ocean heat content and thermosteric sea level change (0–2000 m), 1955–2010 [8.1 MB].

One topic discussed but not illustrated (until now, in Figure 1) was that the annual variations in temperature at depths between 700 and 2000 meters are measured in hundredths, if not thousandths, of a deg C, and that it is unrealistic to think we could measure the temperatures of the oceans at depth with that accuracy. It turns out that the annual variations are typically in thousandths of a deg C. The full scale of the temperature anomalies in the graph in Figure 1 is two hundredths of a deg C.

Fig 1 Ocean Temp Changes 700-2000m

Figure 1

Keep in mind that Figure 1 presents the approximated changes in annual temperature for the depths of 700-2000 meters—not the anomalies. The annual changes are determined simply: by subtracting the value of the previous year from the value of the year being plotted.
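Computed as described, the series in Figure 1 is just a first difference of the annual values. A minimal sketch in Python (the anomaly values are made-up placeholders, not NODC data; the function name is mine):

```python
def annual_changes(anomalies):
    """Year-to-year change: each year's value minus the previous year's."""
    return [curr - prev for prev, curr in zip(anomalies, anomalies[1:])]

# Four made-up annual anomalies (deg C) yield three annual changes.
changes = annual_changes([0.010, 0.013, 0.012, 0.016])
```

Note how small the results are: a few thousandths of a deg C, the same order as the variations plotted in Figure 1.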

The ocean temperature data for the depths of 700-2000 meters is an important topic. With the surface temperatures no longer warming and with the slowdown in the rate at which the oceans are warming at depths of 0-700 meters, climate scientists are now saying the missing heat is being forced to depths below 700 meters. Of course, that assumes the missing heat actually exists.

DATA AND METHOD

The NODC provides vertically averaged temperature anomalies for 0-700 meters here, and for the depths of 0-2000 meters here. To use that data to determine the variations in temperatures at depths between 700 and 2000 meters we need to know the volume of water in the depth ranges of 0-700 meters and 0-2000 meters. In reality, we don’t necessarily need the volumes. We can work with percentages.

According to an EPA document about ocean heat content data (here):

…the top 700 meters of the ocean (nearly 2,300 feet)…accounts for just under 20 percent of the total volume of water in the world’s oceans.

“[J]ust under 20 percent” could mean many things, so we’ll use the rounded figure of 20%.

Then on the last page of the NOAA presentation here, under the heading of “ARGO Future Possibilities”, they have the bullet point:

-52% of ocean volume below 2000 m

That obviously means that about 48% of the ocean volume is above 2000 meters.

Assuming those percentages are close to correct, 41.66% of the volume of the oceans above 2000 meters comes from the top 700 meters, and the other 58.33% from the depths of 700-2000 meters. I told you this was a rough estimate. We can now use a simple weighted-average equation, solving for one of the components, to determine the changes in ocean temperature for the depths of 700-2000 meters. The result is illustrated above in Figure 1, with the data presented as the annual change in temperature.
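The weighted-average step can be written out explicitly. A sketch, assuming the 20% and 48% volume figures quoted above (the anomaly values in the example are placeholders, not NODC data; the names are mine):

```python
F_UPPER = 0.20 / 0.48    # 0-700 m share of the 0-2000 m volume, ~0.4166
F_LOWER = 1.0 - F_UPPER  # 700-2000 m share, ~0.5833

def anomaly_700_2000(t_0_700, t_0_2000):
    """Solve T(0-2000) = F_UPPER*T(0-700) + F_LOWER*T(700-2000)
    for the 700-2000 m component."""
    return (t_0_2000 - F_UPPER * t_0_700) / F_LOWER

# Placeholder anomalies (deg C): 0.12 for 0-700 m, 0.08 for 0-2000 m.
t_lower = anomaly_700_2000(0.12, 0.08)  # ~0.051 deg C
```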

PUTTING THAT IN PERSPECTIVE

Figure 2 compares the annual changes in global sea surface temperature anomalies (HADSST3) and the vertically averaged temperatures of the global oceans to depths of 0-700 and 700-2000 meters.

Fig 2 Ocean Temp Changes Comparison A

Figure 2

In Figure 3, the data have been returned to their standard time-series form, and I’ve added global land+sea surface temperature anomalies (HADCRUT4) as a reference. All of the data have been zeroed at 1955.

Fig 3 Temp Anom Comparison A

Figure 3

Many people find it hard to believe that global surface temperatures can be estimated to the precision shown. They’ll certainly have difficulty with the warming shown at depths of 700-2000 meters.

THE NODC’S HOCKEY STICK WHEN THEY TRANSITION TO ARGO DATA

Figure 4 presents the vertically averaged temperature anomalies for the global oceans for the depths of 700-2000 meters. They warm at a relatively constant and very slow rate until the ARGO floats begin to have complete coverage of the global oceans around 2003. Then they warm at a rate that’s almost 24 times faster than the rate seen in the five decades leading up to it.

Figure 4 ARGO Hockey Stick

Figure 4

It doesn’t look very realistic, now, does it?

IS THE HOCKEY STICK DEPENDENT ON THE PERCENTAGES USED?

Let’s assume that the percentages I discovered online are wrong, and that the relationship of the volumes of the oceans at depths of 0-700 meters and 0-2000 meters is a linear one. That is, we’re going to assume that there are no continental shelves and that the volume of the oceans from 0-700 meters is 35% of the volume from 0-2000 meters (700/2000 = 0.35). Therefore, in our hypothetical example world, the volume of water from 700-2000 meters represents the other 65% of the depths from 0-2000 meters. Plugging those percentages into our standard weighted-average equation presents the curve shown in Figure 5. The hockey stick still exists, but it’s not as pronounced.
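The sensitivity test is the same solve-for-one-component calculation with a different volume split. A sketch (the anomaly values are placeholders again, not NODC data):

```python
def lower_layer_anomaly(t_0_700, t_0_2000, f_upper):
    """700-2000 m anomaly implied by a 0-700 m volume fraction f_upper."""
    return (t_0_2000 - f_upper * t_0_700) / (1.0 - f_upper)

# The same placeholder anomalies (deg C) under both splits:
rough = lower_layer_anomaly(0.12, 0.08, 0.20 / 0.48)  # ~41.66/58.33 split
linear = lower_layer_anomaly(0.12, 0.08, 0.35)        # hypothetical 35/65 split
```

For these placeholder inputs the derived anomaly shifts by roughly 15% between the two splits, which illustrates why changing the percentages alters the magnitude of the derived signal without making the break point disappear.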

Figure 5 ARGO Hockey Stick 35-65 Split

Figure 5

IS THE NODC FOOLING ITSELF WITH THE PENTADAL DATA?

Or are they trying to fool us?

In past posts (here and here), we showed how the NODC increases the warming rate of their ocean heat content data by presenting it in pentadal form. See the comparison graphs at depths of 0-700 meters (here) and 0-2000 meters (here). Figure 6 compares the roughly estimated vertically averaged temperature anomalies for the global oceans, for the depths of 700-2000 meters, from 1955 to 2012, in annual and pentadal forms. The same percentages of ocean volumes at depth (0.4166 for 0-700 meters and 0.5833 for 700-2000 meters) were applied to calculate both. In this comparison, I did not smooth the annual data because I wanted the break point in 2002 to be visible.
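Assuming the NODC’s pentadal series is a centered five-year average of the annual values (an assumption on my part about their exact construction), the smoothing looks like this:

```python
def pentadal(values):
    """Five-year running mean: each output is the average of
    five consecutive annual values."""
    n = 5
    return [sum(values[i:i + n]) / n for i in range(len(values) - n + 1)]

print(pentadal([1.0, 2.0, 3.0, 4.0, 5.0, 6.0]))  # → [3.0, 4.0]
```

Any break point, like the one in 2002, gets spread across five output values by such an average, which is one reason a pentadal curve can bear so little resemblance to the annual data it was built from.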

Figure 6 Annual v Pentadal 700-2000m Anomalies

Figure 6

Not only does the data in pentadal form have a significantly higher warming rate, but its curve bears little resemblance to that of the annual data.

CLOSING

To account for the slowdown in the warming of surface temperatures and in the heat content of the upper 700 meters of the global oceans, the climate science community is claiming manmade global warming is bypassing those levels and warming the oceans at depths below 700 meters. But that warming of the subsurface temperatures of the global oceans from 700-2000 meters represents variations measured in thousandths of a deg C. It is not reasonable to think we can measure the temperatures of the global oceans to that accuracy. Now consider that ocean heat content data have to be adjusted (tweaked, modified, corrected, whatever) in order for them to show warming during the ARGO era. (The UKMO EN3 data in the graph here should represent the uncorrected ARGO-era data for depths of 0-2000 meters.)

The vertically averaged temperatures for the depths of 700-2000 meters can be approximated from the NODC’s data for the depths of 0-700 and 0-2000 meters. The result shows a very sudden shift in the rate of warming for depths of 700-2000 meters. The shift coincides with the introduction of the ARGO floats to rarely sampled portions of the global oceans, the mid-to-high latitudes of the Southern Hemisphere oceans, for example. This suggests that the warming presented by the data at those depths may result from the more complete sampling of the global oceans.

We’ve discussed the problems with ocean heat content data in numerous posts over the past few months. They include:

Is Ocean Heat Content Data All It’s Stacked Up to Be?

NODC’s Pentadal Ocean Heat Content (0 to 2000m) Creates Warming That Doesn’t Exist in the Annual Data – A Lot of Warming

Ocean Heat Content (0 to 2000 Meters) – Why Aren’t Northern Hemisphere Oceans Warming During the ARGO Era?

Ocean heat content for the depths of 0-2000 meters is not a dataset in which one should have any confidence. It appears that it was introduced solely so that global warming enthusiasts could claim global warming continues.

July 4, 2013 7:42 am

Figures 4 and 5 above show the results of selection bias in the Argo data. We see a sharp break in the trend at the time the Argo floats were introduced. Because the scientists knowingly eliminated the cooling floats, believing them to be in error, the data now doesn’t fit with the long term historical trend. Instead it appears skewed.
What we are seeing is similar to the problem with tree ring calibration. The scientists are selecting some trees/floats to believe and rejecting others, based on what the trees/floats are telling them. This changes the underlying requirement in statistics that your sample be random. As a result when you apply statistics to them the results are spurious – they are false and misleading.
The results from Argo and calibrated tree rings cannot be trusted statistically because they are not a true random sample of the population.

richard verney
July 4, 2013 7:52 am

Does it matter? Isn’t this all simply of academic interest only?
If the ocean below 700m is truly warming, the energy has been dissipated. What process possibly exists that allows it to come back and bite us?
At what rate could the ‘extra’ energy in the 700m to 2000m depth come back to warm the ocean in the 699 to 600m depth, then in turn the 599 to 500m depth, etc.?
When chemical cargoes get contaminated in shore tanks, or ships, one way to recondition the cargo so that it is usable and within the required specification is not to redistill it, or to filter out the contaminant, etc., but simply to dilute the contaminated cargo with copious amounts of good on-spec cargo. In this manner the contaminant becomes so diluted that eventually the bulk is brought within the required range of specification. This is what is happening to the ‘extra’ energy allegedly absorbed in the below-700m range. It is simply being dissipated in the bulk of the ocean, and we know of no process that would enable it to come back and heat the surface layer of the ocean in a time frame measured in less than millennia.
So the warmists need to explain:
1. What physics and process explains why increased CO2 emissions, leading to ever-increasing amounts of backradiation in the 20th century, heated the atmosphere but not the deep ocean (i.e., below 700m), yet now (i.e., in the 21st century) that same physics and process leads to heating the deep ocean but not the atmosphere.
2. What physics and process explains how the energy which is heating the deep ocean passes through the 0 to 700m range undetected.
3. What physics and process explains how the extra energy absorbed in the deep ocean will return to warm the surface, when this will occur, and at what rate the surface layer will be warmed.

Ryan Stephenson
July 4, 2013 7:52 am

OT, but I will be taking my son wild camping in the home of the “Hound of the Baskervilles” – that is Dartmoor UK – at the weekend and saw some interesting climate related history on Wikipedia:
http://en.wikipedia.org/wiki/Dartmoor
Turns out this area was much warmer in the Bronze Age (up until about 3000 years ago), so warm in fact that farming was widespread in the area. Farming is now only possible in the lower areas. The medieval warm period also had an impact.

July 4, 2013 8:07 am

Bob Tisdale says:
July 4, 2013 at 7:52 am
Yup. I noticed how ridiculous the 41.66 % and 58.33% looked
============
no more ridiculous than calculating average ocean temperature to a precision of 0.0004 degree.

July 4, 2013 8:19 am

richard verney says:
July 4, 2013 at 7:52 am
Does it matter? Isn’t this all simply of academic interest only?
===========
It allows the climate scientists to claim that they weren’t wrong. That CO2 is continuing to warm the planet, only now it is warming the oceans.
It is the old epicycles argument. We weren’t wrong, the orbits are perfect circles, within perfect circles, because god creates perfection. The models correctly predicted warming, once we include both surface and ocean warming, because CO2 causes warming.

July 4, 2013 8:26 am

Recently, because I’m in Barcelona, I’ve been reading about how, during the Spanish Civil War (1936-1937) Stalinists vilified and outright lied about those who did not share their ideas of centralized control and inequality of benefits for the masters of the country vs the workers (counter-intuitive these days, but true: Trotsky wanted to have only local governments and everybody getting paid the same). Reading about the party line of CAGW I am struck by the similarity in word and deed by our governments on climate change and those of Stalinist Russia in the 1930s, when they feared loss of power to those who had more “revolutionary” ideas of liberty and equality.
Are we actually seeing a “party line”, including the use of reporting distorted to the point of fabrication (nobody except the North Koreans thinks they can totally fabricate stories these days)? The vilification of the skeptic position, all of us in the pockets of Big Oil, the refusal by the MSM to talk about the failure of, among other things, the mid-tropospheric hotspot and the melting of Antarctica, is not unlike some of the shrieking done by the Stalinists against those even slightly in disagreement. The financial aid the Soviets gave to those they saw as useful to them was kept quiet back then, as is the funding that funnels to Mann et al, plus the fees Hansen, Suzuki and Gore get – not to mention the pure profits those at the level of Gore enjoy.
Whatever you might say, there is definitely a “party line” that is being supported and promoted. A “good” eco-green says that the NOAA data is good, and whatever data Hansen uses is better. HadCru temperatures, satellite temperatures, balloon temperatures and tidal gauge measurements are not equal to what the IPCC uses. All this data goes towards supporting international control centralized in the UN, and rules and regulations that “efficiently” manage CO2 emissions – rules and regulations spilling forth that cross national boundaries and circumvent – seriously circumvent – our legislative bodies. This is exactly what the Stalinists did in Spain.
It should be understood that the Stalinist position was that it is/was better to lose the battle (Spain) than lose the war (centralized control and an “efficient” economy as defined by the principals). In this regard one might look to coal plants in the UK and the XL pipeline in the Canadian/US situation. In the UK premature shut-down of the coal-fired power plants (along with nuclear power plants) could easily lead to power outages. Here the loss is felt by the UK civilian population, which of course includes the employed people. The XL pipeline, should it be vetoed, will cost jobs, pensions and such to the Texas refinery worker. In each case there is an ideological solution to these probable problems: renewable energy in the UK, now replaced by the false idea of shale gas, and in the US, the increasing but false expectation of North Dakota Bakken oil. The failure of both of these “solutions” is akin to losing Spain while maintaining at least the appearance of control, future or present, in the heart of Stalinist Russia.
What we seem to be seeing when our governing bodies refuse to accept that their “scientific” data is somewhat (at least) suspect, is that the party line, i.e. fossil fuel generated prosperity, is not acceptable. This bottom line is what counts, all else is expendable. The question that we really, really need to ask is, what is in it for them?
Note: I do not think there is a “them” in the organized sense. Just in an ideological sense. Obviously there can be mutual benefits with, for example, an investing rich like Gore, so that they all act the same way because they all will benefit from the same way. A Suzuki or a Hansen or an Erhlich similarly act without the (large) financial benefit in this case, because their egos are what benefit: it is perhaps better in the afterlife to be a prophet than a rich man, after all.

RichardLH
July 4, 2013 8:31 am

Bob: I have little fear of poor-quality data now that I have a high-quality instrument to look at it through 🙂 A mathematically accurate one at that.

July 4, 2013 8:32 am

ferd berple says:
July 4, 2013 at 8:19 am
richard verney says:
July 4, 2013 at 7:52 am
Does it matter? Isn’t this all simply of academic interest only?
===========
“It allows the climate scientists to claim that they weren’t wrong. That CO2 is continuing to warm the planet, only now it is warming the oceans.”
I see a bigger problem for the warming proponents. Okay, so we are heating up the ocean depths by minuscule amounts. Is this REPLACING the effect of CO2 in the atmosphere at this time? Is the CO2 not now “capturing” any upwelling LWIR and heating the air? Does the CO2 in the atmosphere care whether a bunch of heat is suddenly heating up the ocean depths? How does the ocean suck up this terrible heat so efficiently that there is none available to heat the atmosphere for 16 years? As expected, when you look for ways to cling to a failing theory you get to look more and more foolish to more and more people. Perhaps it’s the process of looking ever more stupid that causes a sea change in the beliefs and a new paradigm.

HankHenry
July 4, 2013 9:12 am

This ocean heat business is all so ad hoc. It’s clearly an attempt to correct existing models with things outside the model. It’s all just an admission of the failure of the models. The bottom line is that ocean heat must be accounted for in any realistic climate model, but the data is scant while the effect is potentially just as monstrous as cloud variability.
All that’s been discovered is that modelling climate is even harder than modelling weather.

EthicallyCivil
July 4, 2013 9:42 am

If one assumes the data is valid, it could be evidence of an unmodelled natural variation: one in which a massive heat sink transitions between net absorption and net release. One could posit some long-period deep-current oscillation that could tend to sequester, or tend to release, heat. Given the specific heat of water, this would seem far larger in magnitude than the variation in the atmospheric absorption of IR.

Luther Wu
July 4, 2013 9:44 am

Gary Pearse says:
July 4, 2013 at 8:32 am
_______________
Don’t you see? CO2 is a switch, turning on the ocean heater.
/

Matthew R Marler
July 4, 2013 9:50 am

Assuming those percentages are close to being correct, that means 41.66 % of the volume of the oceans to depths of 2000 meters comes from the top 700 meters and 58.33% at 700-2000 meters.
The four significant figures are a treasure. 😎
Thanks for another interesting post.

Kevin Kilty
July 4, 2013 9:58 am

AndyG55 says:
July 4, 2013 at 6:31 am
Don’t the Argo buoys go with the current.
And wouldn’t the currents naturally flow from colder to warmer, as the warmer upwells.
Maybe this is what the Argo buoys are seeing?

This is the nub of one potential problem with the data — The time rate of change of temperature is the sum of two terms 1) the partial derivative of temperature w/resp to time, which is to say the “ocean warming” signal; and 2) the signal that comes from the velocity vector of a buoy crossing ocean isotherms — the advective term. I’ve seen no attempt to separate the two. Keep in mind the advective term has the potential for a systematic bias as you said.
The sensors in these buoys have a measurement repeatability of around 0.005 C and very low drift, but this is under lab conditions. I don’t know that anyone has documented what they do under field use. Moreover, the figure of average ocean temperature anomaly is made assuming that one can simply add the individual measurements to make a mean–but what becomes of the error budget is anyone’s guess. We don’t know if the errors of each sensor follow a single normal distribution or not. Thus it is difficult to say in the figures where would be the level of significant change.

Bill H
July 4, 2013 10:19 am

With the numerous corrections to ARGO data being well over the .002 or .05 deg C, would this not create an error bar which would render the data not only questionable but useless?
I fear that the MOE is far greater than the supposed warming.

July 4, 2013 10:22 am

For four years I was involved in anti-submarine patrols with the Canadian Air Force in the North Atlantic. A major part of the detection involved measuring transmission of sound through the water, either actively by dropping very small depth charges (PDC) or passively by detecting sound from microphones we could drop in the water and have lowered to a particular depth. We also had the ability to drop thermometers that provided water temperature at different depths. Besides this there were ship-borne and land-based detectors, including microphones on the bottom of the ocean. All these were used to create bathythermographs, that is, water temperature with depth.
Knowledge of temperature layering was essential for placement of microphones because the sound travelled within those layers. I was always impressed by the remarkable number of layers that could exist and also the fact that temperature could have an inversion, that is increase with depth.
It was possible to have a microphone within metres of a submarine and not detect the sound, but it was also possible to detect a submarine hundreds of kilometres away within a particular layer. The number and variance in the layers over time was quite remarkable and leaves me skeptical about any claims of knowledge of water temperature in the top “friction layer” of the oceans.
Submariners are aware of these sub surface currents and use them to their advantage. For example, German U-Boats wanting to move undetected through the Straits of Gibraltar turned off engines and drifted in at one depth and out at another as the currents were opposite through the Straits. They also knew of the sound layering and “hid” within a certain layer.
I mention the friction layer because within the air and the oceans it is affected by the Ekman Spiral. Studies done with Professor Bell on a 1000 foot tower outside of Winnipeg, Manitoba with instruments every 200 feet showed very similar layering and inversions as I had seen in the oceans. It is just one reason why I am so skeptical about the construct of the computer models and their translations of energy from the upper ocean to the lower atmosphere.
After I left the military and started work on climate (when the consensus was cooling) I linked up with oceanographer, the late Roger Pockington who was studying water temperature changes in a transect from Port Au Basque Newfoundland to Bermuda. We often discussed the value of those military temperature readings, unavailable because of secrecy classifications. Maybe we could get Al Gore to get them released like he did the Top Secret ice data for the “Gore Box” in the Arctic.
http://drtimball.com/2013/thinning-arctic-ice-more-al-gore-aided-and-abetted-misinformation/

Bill H
July 4, 2013 10:22 am

ferd berple says:
July 4, 2013 at 8:19 am
richard verney says:
July 4, 2013 at 7:52 am
Does it matter? Isn’t this all simply of academic interest only?
===========
It allows the climate scientists to claim that they weren’t wrong. That CO2 is continuing to warm the planet, only now it is warming the oceans.
It is the old epicycles argument. We weren’t wrong, the orbits are perfect circles, within perfect circles, because god creates perfection. The models correctly predicted warming, once we include both surface and ocean warming, because CO2 causes warming.
======================================================
Fred,
I dare say that this is how we learn we are going in circles using circular logic.. 🙂

chris y
July 4, 2013 10:48 am

richard verney asks a very good question-
“What physics and process involved explains how the extra energy absorbed in the deep ocean will return to warm the surface and when this will occur and at what rate will the surface layer be warmed.”
I think this is already settled. Now that the 700m-2000m ocean volume has warmed by about 0.01C since 2003, it eventually could (or maybe already has) cause surface temperatures to warm by up to… 0.01C.
I remember reading some years ago that one of the climate science glitterati (someone like Trenberth, Hansen, Schneider) suggested that if the CO2 flamethrower heat wound up in the deep oceans, then global warming would no longer be a crisis or even a worry.
Yet here we are pointing to deep ocean anthropogenic heating that may or may not even exist, with a magnitude in the hundredths of a degree, and with no physical means of being transferred back into large temperature changes at the surface.

dp
July 4, 2013 11:54 am

This heating, microscopic as it is, should show up in the satellite-sensed global gravity map, which should change over time as the water density changes. These changes are regional – impossible for them to be global, as there’s not enough directed energy to cause the entire ocean heat to change at these rates, and undirected energy at the rates needed would smoke the land surface of the planet.
Gravity distribution may be a better proxy for our Earth system balance of energy than temperature. Temperature really sux at it.
How many petawatts of energy does it take to move warm surface water to deeper depths in the volume needed to support this NODC data? Where does that energy come from?

climatereason
Editor
July 4, 2013 12:53 pm

Bob
I think I’ve had it explained to me about 5 times by warmistas how the warming (should it exist) leapfrogs the top layers of water, over the middle layers, and suddenly reappears much lower down in a frightening form that can be impressively measured in thousandths of a degree.
Could you please explain the process to me one more time using words of not more than two syllables. Or a diagram would be nice. Or a video. Cartoon? Anything.
tonyb

Latitude
July 4, 2013 12:56 pm

Tony, it’s just an excuse for a math error…..their math was wrong (missing)
might as well say it’s on the moon

Kajajuk
July 4, 2013 1:53 pm

ohnmarshall says:
July 4, 2013 at 6:19 am
I cannot imagine a process that would ”force” the heat lower into the ocean depths.
———————————————————
I can imagine a process and provide a “sciency” rational to support it using CO2 as the culprit.
Consider a bell jar in an 1880 lab containing increasing concentrations of CO2 exposed to sunlight; voilà, temperature rise. But the volume is fixed, so temperature and pressure increase. Now consider an open system where volume is essentially unbounded and temperature and pressure have functional constraints, i.e., T and P cannot change locally before being dissipated to the surrounding “locales”.
Increasing CO2 concentrations tend to increase the turbidity of the atmosphere and, to a lesser extent, its temperature. After certain thresholds are reached, the turbulence helps to “self”-organize natural turbulence driven by temperature and pressure gradients, which stirs up the oceans by creating more localized eddies that combine to make the oceans’ natural flows more turbulent. This increases the mixing of the oceans, bringing heat content farther down, and generates wild swings in local weather on land, albeit the “local” gets larger and larger.