An Ocean of Overconfidence

Guest Post by Willis Eschenbach

I previously discussed the question of error bars in oceanic heat content measurements in “Decimals of Precision“. There’s a new study of changes in oceanic heat content, by Levitus et al., called “World Ocean Heat Content And Thermosteric Sea Level Change (0-2000 m), 1955-2010” (paywalled here). [UPDATE: Available here, h/t Leif Svalgaard] It’s highlighted over at Roger Pielke Senior’s excellent blog, where he shows this graph of the results:

Figure 1. From Levitus 2012. Upper graphs show changes in ocean heat content, in units of 10²² joules. Lower graphs show data coverage.

Now, there are some oddities in this graph. For one, the data start at year 1957.5, presumably because each year’s value is actually a centered five-year average … which makes me nervous already, very nervous. Why not show the actual annual data? What are the averages hiding?

But what was of most interest to me are the error bars. To get the heat content figures, they are actually measuring the ocean temperature. Then they are converting that change in temperature into a change in heat content. So to understand the underlying measurements, I’ve converted the graph of the 0-2000 metre ocean heat content shown in Figure 1 back into units of temperature. Figure 2 shows that result.

Figure 2. Graph of ocean heat anomaly 0-2000 metres from Figure 1, with the units converted to degrees Celsius. Note that the total change over the entire period is 0.09°C, which agrees with the total change reported in their paper.
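To show how that back-conversion works, here is a minimal sketch in Python. The ocean mass and specific heat below are my own round-number assumptions (roughly the 673 quadrillion tonnes quoted later in this post, and a nominal ~4,000 J/kg/°C for seawater), not the exact constants Levitus et al. use.

```python
# Hedged sketch: convert a change in ocean heat content (joules) into the
# equivalent mean temperature change (deg C) of the 0-2000 metre layer.
# Round-number assumptions, not the paper's exact constants.

OCEAN_MASS_0_2000M_KG = 6.73e20        # ~673 quadrillion tonnes of seawater
SPECIFIC_HEAT_J_PER_KG_C = 4000.0      # nominal specific heat of seawater

def heat_to_temperature(delta_q_joules):
    """Mean temperature change implied by a change in heat content."""
    return delta_q_joules / (OCEAN_MASS_0_2000M_KG * SPECIFIC_HEAT_J_PER_KG_C)

# The reported 1955-2010 rise of roughly 24 x 10^22 joules works out to:
print(round(heat_to_temperature(24e22), 3), "deg C")   # ~0.089 deg C
```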

Here’s the problem I have with this graph. It claims that we know the temperature of the top two kilometres (1.2 miles) of the ocean in 1955-60 with an error of plus or minus one and a half hundredths of a degree C.

It also claims that we currently know the temperature of the top 2 kilometres of the global ocean, which is some 673,423,330,000,000,000 tonnes (673 quadrillion tonnes) of water, with an error of plus or minus two thousandths of a degree C.

I’m sorry, but I’m not buying that. I don’t know how they are calculating their error bars, but that is just not possible. Ask any industrial process engineer. If you want to measure something as small as an Olympic-size swimming pool full of water to the nearest two thousandths of a degree C, you need a fistful of thermometers; one or two would be wildly inadequate for the job. And the top two kilometres of the global ocean is unimaginably huge, with as much volume as 260,700,000,000,000 Olympic-size swimming pools …
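The swimming pool comparison is just arithmetic; here is a quick sketch, assuming a nominal 2,500-tonne Olympic pool (50 m × 25 m × 2 m). Willis’s figure of 260.7 trillion pools implies a slightly larger assumed pool volume, but the order of magnitude is the same.

```python
# Rough arithmetic behind the swimming-pool comparison (nominal pool size assumed).
ocean_mass_tonnes = 673_423_330_000_000_000   # top 2000 m of the global ocean
pool_mass_tonnes = 2_500                      # nominal 50 m x 25 m x 2 m pool
print(f"{ocean_mass_tonnes / pool_mass_tonnes:.3e} pools")   # ~2.7e14 pools
```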

So I don’t know where they got their error numbers … but I’m going on record to say that they have greatly underestimated the errors in their calculations.

w.

PS—One final oddity. If the ocean heating is driven by increasing CO2 and increasing surface temperatures as the authors claim, why didn’t the oceans warm in the slightest from about 1978 to 1990, while CO2 was rising and the surface temperature was increasing?

PPS—Bonus question. Suppose we have an Olympic-sized swimming pool, and one perfectly accurate thermometer mounted in one location in the pool. Suppose we take one measurement per day. How long will we have to take daily measurements before we know the temperature of the entire pool full of water to the nearest two thousandths of a degree C?

Eugene WR Gallun
April 23, 2012 9:35 pm

I have not read all the above but speaking as a layperson nothing manmade continues to work the same after a period of time — particularly if it is around salt water. Do they take some of these out of the water and check to see if they are still accurately recording data? They must have some spot check program in place — but wait — this is government funded — maybe not. If such exists it would be nice if it were given a quick mention to allay needless doubts in an old boatboy’s mind.

Brian H
April 23, 2012 9:40 pm

How long will we have to take daily measurements before we know the temperature of the entire pool full of water to the nearest two thousandths of a degree C?

Forever, i.e. never. The temperature will be fluctuating by more than that every minute or two.
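For what it’s worth, here is a minimal back-of-the-envelope sketch of the bonus question, assuming (generously) that each daily reading is unbiased with an assumed 0.5 °C scatter about the true pool mean and that the pool itself never changes; under the textbook 1/√N rule it still takes tens of thousands of days, and once the pool temperature drifts, as Brian H notes, the answer is effectively never.

```python
import math

# Thought experiment only: days of once-daily readings needed before the
# standard error of the mean falls below 0.002 deg C, assuming each reading
# scatters by 0.5 deg C (assumed) about a pool mean that never changes.
reading_scatter_c = 0.5        # deg C, assumed scatter of a single reading
target_error_c = 0.002         # deg C, the claimed precision

days_needed = math.ceil((reading_scatter_c / target_error_c) ** 2)
print(days_needed, "days, or about", round(days_needed / 365), "years")  # 62500 days, ~171 years
```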

dennisambler
April 24, 2012 1:01 am

The late Dr Robert Stevenson critiqued the first Levitus paper, Levitus et al 2000, in this article:
“Yes, the Ocean Has Warmed; No, It’s Not ‘Global Warming’”
by Dr. Robert E. Stevenson, http://www.21stcenturysciencetech.com/articles/ocean.html
He described the process of temperature measurement in those earlier times, as he and his colleagues had spent many years taking those temperatures:
“In the 1960s, more ships were out at sea: from Fisheries Laboratories, U.S. Coast and Geodetic Survey (now NOAA), and research institutions at Scripps (La Jolla, Calif.), Woods Hole (Massachusetts), Miami, and Texas A&M (in the Gulf of Mexico). The British sailed the new Discovery, the Germans the new Meteor, and there were small ships sailing from Denmark, Japan, and France. Many cruises were dedicated to the geophysics of the sea floor, where deep-ocean casts for water and temperatures were few and far between.
“Surface water samples were taken routinely, however, with buckets from the deck and the ship’s engine-water intake valve. Most of the thermometers were calibrated into 1/4-degrees Fahrenheit. They came from the U.S. Navy. Galvanized iron buckets were preferred, mainly because they lasted longer than the wood and canvas. But, they had the disadvantage of cooling quickly in the winds, so that the temperature readings needed to be taken quickly.
“I would guess that any bucket-temperature measurement that was closer to the actual temperature by better than 0.5° was an accident, or a good guess. But then, no one ever knew whether or not it was good or bad. Everyone always considered whatever reading was made to be precise, and they still do today.
“The archived data used by Levitus, and a plethora of other oceanographers, were taken by me, and a whole cadre of students, post-docs, and seagoing technicians around the world. Those of us who obtained the data are not going to be snowed by the claims of the great precision of ‘historical data found stored in some musty archives.’”
As Willis has shown, those claims of precision are still there.
Stevenson trained NASA astronauts in oceanography and marine meteorology. He was Secretary General of the International Association for the Physical Science of the Oceans from 1987
to 1995, and worked as an oceanographer for the U.S. Office of Naval Research for 20 years.

LazyTeenager
April 24, 2012 2:33 am

thermometer mounted in one location in the pool. Suppose we take one measurement per day. How long will we have to take daily measurements before we know the temperature of the entire pool full of water to the nearest two thousandths of a degree C?
———–
Interesting question.
Let’s change it slightly. Let’s measure temperature every 1 second with a precision of 0.1C.
Number of readings per day is 60x60x24.
That’s 86,400.
So 0.1 / √86,400 ≈ 0.0003 C is the precision of the average measurement, assuming the pool is in equilibrium and isolated and the individual errors are independent.
What we don’t know is how much variation there is in the pool due to changes in heating and cooling. Air currents and other stuff like clouds can affect this. Then you need to look at the measurements and their distribution and the frequency properties of the noise.
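A minimal sketch of that 1/√N arithmetic, assuming the readings are independent and the pool really is in equilibrium:

```python
import math

# Standard error of the mean for one day of 1-second readings, assuming
# independent errors of 0.1 deg C on each individual reading.
readings_per_day = 60 * 60 * 24              # 86,400 readings
single_reading_precision_c = 0.1             # deg C

sem = single_reading_precision_c / math.sqrt(readings_per_day)
print(f"{sem:.5f} deg C")                    # ~0.00034 deg C
```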

LazyTeenager
April 24, 2012 2:36 am

I’ve converted the graph of the 0-2000 metre ocean heat content shown in Figure 1 back into units of temperature. Figure 2 shows that result.
———–
I don’t know how it’s possible to do that. The heat capacity of water is not fixed. It varies with temperature and possibly even a little with pressure.

LazyTeenager
April 24, 2012 2:44 am

Had another think and read. The variation in heat capacity is moderate and not enough to affect an estimate of the average temperature in the context of Willis’ argument.

LazyTeenager
April 24, 2012 2:50 am

why didn’t the oceans warm in the slightest from about 1978 to 1990, while CO2 was rising and the surface temperature was increasing?
———
Changes in ocean circulation causing redistribution of the heat, perhaps? Since sea surface temps strongly influence air surface temps, this implies most of the discrepancy was at depth.

P. Solar
April 24, 2012 4:05 am

JK says:
>>
In this simple model the errors on the floats are uncorrelated: whether one float finds itself at a higher or lower temperature than the average for its region is independent of whether its neighbours are higher or lower than the average for their regions. Note, in this model it is the errors that are uncorrelated, not the average. If a float is in a high-average region then its neighbour is more likely to be in a high-average region. That’s a different question.
In that case then averaging, adding up these uncorrelated error terms, will indeed reduce their expected sum by a square root factor.
If you disagree with that, please explain where the disagreement is. Otherwise we can move on to clarify what changes in more realistic situations.
>>
Within the assumption that these errors are normally distributed and uncorrelated, that seems correct. The question is what the error of any reading is; maybe this could be assumed to be less than 10 K, for example, but I’m not sure how it could be evaluated.
Then there is: the square root of what? The paper states that many cells have only one reading.
http://data.nodc.noaa.gov/woa/PUBLICATIONS/grlheat12.pdf
This version has the supplementary information on how they calculate the uncertainty due to propagation of errors in the averaging technique. You have to read it a few times, but the crux is that they start with a complicated formula from another paper but do not have enough data to apply it, so they make some very inappropriate simplifications.
>>
… may be from one observation in data sparse regions up to several hundred or more.
Ideally, we would like to have a time series of measurements in each ODSQ [one degree sqr] with which we can produce statistics but this is simply not possible with the available data. Hence to evaluate standard deviations we will use the spatial variance of ODSQs containing data.
>>
For heavily populated squares they have a mean and a std error around that mean. Which, with lots of data, should be relatively small.
The problem is, for the cells with few data, they use the std error of the well-populated cells and assume this reflects the error of a single reading from the true mean.
Now if you stop to think, that means that they are incorrectly assuming the error of a one-off reading is the same as the error of a well-balanced sample with hundreds of readings.
So their uncertainty calculations seem to be based on attributing the uncertainty for “several hundred” readings to cells of just ONE reading.
That dirty little secret gets tucked away in the S.I. which is not even available in the published version of the paper.
Now, until they can be bothered to do a correct analysis and a valid error estimation, using “hundreds” instead of one suggests that their error estimates are at least AN ORDER OF MAGNITUDE too small.
That is probably at least some way to Willis’ puzzle in the other thread:
http://wattsupwiththat.com/2012/01/26/decimals-of-precision-trenberths-missing-heat/
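To make that concrete with a purely illustrative sketch (assumed numbers, not the paper’s actual formula): the honest uncertainty of a cell with a single reading is the full reading-to-reading scatter, whereas a cell with several hundred readings gets that scatter divided by √N; reusing the second number for the first cell understates it by roughly √N.

```python
import math

# Illustrative only: uncertainty of a one-reading cell versus the standard
# error of a well-sampled cell, for an assumed 0.5 deg C scatter of individual
# readings within a one-degree square.
scatter_c = 0.5                      # deg C, assumed
n_well_sampled = 300                 # "several hundred" readings

sem_well_sampled = scatter_c / math.sqrt(n_well_sampled)   # ~0.029 deg C
one_reading_uncertainty = scatter_c                         # still 0.5 deg C

print(f"well-sampled cell: +/- {sem_well_sampled:.3f} deg C")
print(f"one-reading cell : +/- {one_reading_uncertainty:.3f} deg C")
# Borrowing the first figure for the second cell understates its uncertainty
# by a factor of sqrt(300), roughly 17 -- about an order of magnitude.
```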

peter_ga
April 24, 2012 5:00 am

I only have some rough digital signal processing background.
To actually measure the ocean’s temperature, applying Shannon’s sampling theorem, it would be necessary to have enough thermometers to sample at twice the highest spatial frequency in the data’s 3-D Fourier transform. Given the numerous hot and cold oceanic currents, which would introduce relatively high spatial frequency components, this would suggest an impossible number. Dividing by the square root of the number of measurements is pointless in this situation. That would only reduce the Gaussian noise of the individual measurements, which could be assumed to be fairly precise.
If high frequency components are ignored, then they fold back into the measured spectrum with approximately equal power to whatever power they have at the ignored frequencies, greatly reducing the low frequency measurement accuracy.
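As a toy one-dimensional sketch of that folding (made-up numbers): a feature with a wavelength just under the sampling interval aliases into a spurious large-scale offset that no amount of per-thermometer precision can remove.

```python
import numpy as np

# Toy 1-D aliasing illustration: a "temperature" field with a 500 km
# large-scale wave plus a 98 km feature, sampled every 100 km.
x = np.linspace(0.0, 1000.0, 100_001)                  # km, fine reference grid
field = 0.5 * np.sin(2 * np.pi * x / 500.0) \
      + 0.3 * np.sin(2 * np.pi * x / 98.0)

true_mean = field.mean()                               # ~ +0.003
sampled_mean = field[::10_000].mean()                  # one sample per 100 km
print(f"true mean    : {true_mean:+.4f}")
print(f"sampled mean : {sampled_mean:+.4f}")           # ~ +0.16: the unresolved
# 98 km feature folds into the coarse samples as a large spurious offset.
```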

Pamela Gray
April 24, 2012 6:00 am

Let’s argue this from another angle. The oceans have warmed. Knowing the theorized mathematical equation for the ability of long wave radiation to warm the ocean skin, and thus, by mixing, the layers below, it would be possible to determine what amount of that warming could be due to re-radiated LW infrared. It would also be possible to determine the part of THAT warming that is due to the anthropogenic increase in CO2.
My hypothesis would be that the LW warming would fall within the standard error of their data. This is a paper that would then appear to point towards a natural phenomenon.

April 24, 2012 6:47 am

Bonus question. Suppose we have an Olympic-sized swimming pool, and one perfectly accurate thermometer mounted in one location in the pool.
If it were an Argo buoy, it would be stuck at the outlet drain within a few minutes, because Argo buoys are free floating. In the ocean this means they drift towards areas of downwelling, which are warmer, and away from areas of upwelling, which are cooler.
Bias, anyone?

wsbriggs
April 24, 2012 6:53 am

Well queried Willis.
My question about the Argo floats has to do with the build up of bio material on the surface of the diving units. I suspect that there is something in place to control the film buildup, but from just using a squeegee in my shower, I know that over time buildup still occurs. The film will affect the conductivity, and definitely with a change in conductivity, the measurement will drift. This would be true even if there weren’t currents etc. Moreover, the drift would be positive: the electronics emit heat, and small as that amount might be, it still would cause an upward bias. Since the accuracy with which they are purporting to measure the temperature is so good, it would seem to be necessary to take into account small error contributions to the temperature measurement.
The above quibble is void if there is something like a plasma cleaner built into the devices.

April 24, 2012 6:57 am

Eric Worrall says: April 23, 2012 at 11:19 am
Steamboat Jack, I was kind of thumbing my nose at them 🙂
********
Eric,
My apologies if I misunderstood your post, but I am losing my sense of humor.
The CAGW crowd is a part of a loose coalition of like-minded individuals that want to fundamentally remake the world in general and these United States in particular. Unfortunately (for them) the end results of their agenda are easy to see if you care to look.
According to the 1950 census, Capitalism and the auto industry had made Detroit one of the richest cities in the US and by extension the world. It is now number two. From the bottom. Ahead of only Cleveland.
The 1950 census also showed that Capitalism and the auto industry had created the largest, richest Black (as in African descent of ANY nationality) community in the world. Everything bad said about Henry Ford is true. He also put business ahead of most everything else, so all he cared about was how hard a man would work. He hired a lot of Americans who just coincidentally had African ancestry because they would work hard. (Imagine that: judging a man on the content of his character some half a century before it became a liberal cliché!) In 2010, the city council proposed bulldozing some forty (yes, 40) square miles of abandoned housing. The richest Black community in the world had been rendered uninhabitable.
Chicago, New York and California are headed the way of Detroit, Buffalo, and Flint.
Texas is holding out, but the current régime is using the EPA to shut down electric power production here. That would, of course, cripple the economy. The reason is so that the economic refugees fleeing Democrat party strongholds would have nowhere to go. (It is sort of like the reason that the Berlin Wall was built.)
For example, Illinois is ranked:
• 50th for fiscal policy
• 47th in job creation
• 1st in unfunded pension liabilities
• 2nd largest budget deficit
• 1st in failing schools
• 1st in bonded indebtedness
• Highest sales tax in the nation
• Most judges indicted (Operations Greylord and Gambat)
As a result people are abandoning Illinois, largely Chicago. (The population of Chicago has fallen to the 1920’s level). Because of that loss, the state has lost 7 of 25 Congressmen.
There is a Chicago way to deal with competition: You destroy them so that they can’t compete. Cutting off Texas energy would do just that.
I am worried.
Regards,
Steamboat Jack (Jon Jewett’s evil twin)

April 24, 2012 7:06 am

Excellent comment, Steamboat Jack. Our corrupt leaders are ceding the battlefield to other countries. The result is the impoverishment of America.
It doesn’t have to be this way. America is a “can-do” country that can out compete anyone. But we are being sold out by anti-American traitors. There is no better term for what is being done.

Gail Combs
April 24, 2012 12:50 pm

mtobis (mtobis) says:
April 23, 2012 at 8:56 am
It is frequently possible to measure changes more accurately than absolute values.
______________________________________
markx says: April 23, 2012 at 9:26 am
….What you need to get increasing accuracy with repeated measures is, strangely enough, repeatable conditions.
______________________________________
mtobis, that is the fallacy that is being discussed. I am going to expand on what markx said.
IF you are taking 10 or 25 or 100 readings with the same instrument within a short time period of a UNIFORM material in a lab, then you can use the average to give you a very good estimate of the “True Value” with tight error bars (+/- 2 sigma). Once you are talking about a non-uniform material (the ocean) measured by different instruments at different times at different depths at different locations, then you are talking about huge error bars, because you are adding up all the errors inherent in each measurement and not increasing the accuracy by repeating the measurement under the same conditions.
Think of that swimming pool again vs a mix-room tank filled with water and with specially engineered blender blades. I dump the same percentage of a chemical into the water of both the swimming pool and the mix tank (blenders on). A half hour later I take 100 samples from the mix tank and from the pool, at different depths and at different locations. With the mix tank I use the same equipment and technician to analyse the 100 samples. With the swimming pool I use different people to take each sample at different depths and at different locations, and I use different people at different labs with different equipment to analyse for the chemical.
Do you really think I am going to come up with the same statistical distribution for the two sets of data points, and similar standard deviation and therefore similar error bars? Or do you think the swimming pool is going to have a much wider spread (range) in the data and therefore a much larger error?
Which mean of the two sets of data do you think is going to come closer to the “True Value,” that is the known percentage of the chemical placed in the water?
The swimming pool data is analogous to the ocean data and Levitus et al. are trying to convince us they are dealing with a mix tank with the blender on when they apply the statistics to estimate the error.
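A purely illustrative simulation of that point (all numbers made up): the same number of samples, very different spreads, and only the well-mixed tank’s error bar is small and honest.

```python
import numpy as np

rng = np.random.default_rng(0)
true_concentration = 10.0              # percent of chemical actually added

# Well-mixed tank: every sample sees nearly the true value; only small
# analytical noise from a single lab and technician.
tank = true_concentration + rng.normal(0.0, 0.05, size=100)

# Unmixed swimming pool: strong variation with depth and location, plus extra
# scatter from different samplers, labs, and instruments.
pool = (true_concentration
        + rng.normal(0.0, 2.0, size=100)     # spatial non-uniformity (assumed)
        + rng.normal(0.0, 0.5, size=100))    # inter-lab measurement error (assumed)

for name, data in (("mix tank", tank), ("pool    ", pool)):
    sem = data.std(ddof=1) / np.sqrt(data.size)
    print(f"{name}: mean {data.mean():6.2f}  std {data.std(ddof=1):5.2f}  "
          f"95% CI +/- {1.96 * sem:4.2f}")
```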

Gail Combs
April 24, 2012 12:59 pm

DR says: April 23, 2012 at 9:34 am
If there are any fellow Metrologists following, do you also split a gut laughing when seeing the claimed error bars for many of these so-called “studies”? In the real world, uncertainty must be accounted for empirically, not by playing statistics games.
__________________________________
Yes, I ran a lab and I know how hard it is to get decent data under lab conditions much less field conditions. Trying to get good interlab correlation among the various plants had us tearing our hair out trying to find the sources of error. Temperatures good to .02 C from 1955 ocean measurements??? ROTFLMAO

April 24, 2012 1:16 pm

Smokey says:
April 24, 2012 at 7:06 am
America is a “can-do” country that can out compete anyone. But we are being sold out by anti-American traitors. There is no better term for what is being done.

Actually, there are several — but there are ladies present…

Gail Combs
April 24, 2012 1:30 pm

Curiousgeorge says: April 23, 2012 at 12:37 pm
Small world! 🙂 I knew Scott thru ASQ….
We had this discussion at Boeing in the ’90′s…..
____________________________
VERY small world. In the 90’s I worked for a company that sold aircraft turbine blades to Boeing. I had a dust-up with them when they refused to fire an obnoxious lab tech that I finally nailed for falsifying data. (It was a bit of guerrilla warfare for about two years.) I am the one who got fired, BTW, because the lab tech had an “Angel” placed high up. So much for the usefulness of ISO or the Federal Aviation Administration. It took the loss of three planes to finally nail that company.
My cynical outlook on scientists is well based in reality.

Gail Combs
April 24, 2012 2:21 pm

Chuck Wiese says:
April 23, 2012 at 2:22 pm
…. There is no way to separate the contributing wavelengths that vary anywhere from solar ultra violet to the far end of the infrared spectrum, so in this regard, the claim of a rising heat content as being a proof that greenhouse gases were the cause is patently ridiculous.
_____________________________
Actually there is, given this graph of the solar radiation intensity for different wavelengths at various ocean depths. The climate scientists are shooting themselves in the foot and proving it is the SUN, not CO2, that is the culprit for ocean heating. This is because, as the graph states, there is no absorption of energy at depth from the wavelengths given off by GHGs. Those wavelengths are around 10 micrometers, well beyond where the chart approaches zero around a wavelength of 2.5 micrometers (depth of 0.01 m). Essentially the GHG energy cannot penetrate beyond the surface “skin”.

Underwater Sound Propagation: Temperature and sound velocity profiles
….The ocean depths, where little or no sunlight penetrates is of uniform temperature… In these isothermal regions, the sound speed, C, increases with increasing pressure as a function of depth….
In colder waters, less solar radiation results in a small or no isothermal surface layer. The temperature continuously declines with depth from the surface…. Knowledge of the temperature profile and hence the sound speed profile, enables the prediction of sound transmission paths. Such information serves as input to the tactical environment in which submarine and surface navies operate…

Engineers have to get the science right. They cannot play the games ivory tower scientists do, or they fail.

Gail Combs
April 24, 2012 2:44 pm

stevefitzpatrick says:
April 23, 2012 at 7:06 pm
…Which is not at all surprising when you consider that the ocean surface temperature has increased modestly (about 0.6 – 0.7C) over the past 100 or so years. I have a sincere question for you: do you really doubt that a rising ocean surface temperature would not be expected to lead to a rising ocean heat content? I can understand arguing about uncertainty in the exact values; I can’t understand arguing that a significant increase in ocean heat content is not expected based on the historical ocean surface temperature trend.
__________________________
The problem is that when a “scientist” makes claims for impossible precision/accuracy in the measurements (a statistics/math mistake) it calls into doubt the entire body of data that he has adjusted and manipulated.
Do I think the oceans have warmed? Probably but I would be a lot less cynical if I did not see so much game playing going on.

Gail Combs
April 24, 2012 2:54 pm

theofloinn says:
April 23, 2012 at 8:18 pm
ISO 9000 is like any other tool, capable of being used in foolish ways.
_______________________________
As W. Edwards Deming said: “Quality starts in the boardroom.” It does not matter what system is used if $$$ is all that concerns the members of the board.
My other big gripe was Just in Time. One nasty ice storm and you are up the creek without a paddle, and have nothing to go into the furnaces. (Fought and lost that battle too until nature whapped the CEO upside the head)