# Ocean Temperature And Heat Content

Guest Post by Willis Eschenbach

Anthony has an interesting post up discussing the latest findings regarding the heat content of the upper ocean. Here’s one of the figures from that post.

Figure 1. Upper ocean heat content anomaly (OHCA), 0-700 metres, in zettajoules (10^21 joules). Errors are not specified but are presumably one sigma. SOURCE

He notes that there has been no significant change in the OHCA in the last decade. That is a significant piece of information. I still have a problem with the graph, however, which is that the units are meaningless to me. What does a change of 10 zettajoules mean? So following my usual practice, I converted the graph to more familiar units, degrees C. Let me explain how I went about that.

To start with, I digitized the data from the graph. Often this is far, far quicker than tracking down the original dataset, particularly if the graph contains the errors. I work on the Mac, so I use a program called GraphClick; I’m sure the same or better is available on the PC. I measured three series: the data, the plus error, and the minus error. I then put this data into an Excel spreadsheet, available here.

Then all that remained was to convert the change in zettajoules to the corresponding change in degrees C. The first number I need is the volume of the top 700 metres of the ocean. I have a spreadsheet for this. Interpolated, it says 237,029,703 cubic kilometres. I multiply that by 62/60 to adjust for the density of salt water versus fresh water, and multiply by 10^9 to convert to tonnes. I multiply that by 4.186 megajoules per tonne per degree C. That tells me that it takes about a thousand zettajoules to raise the upper ocean temperature by 1°C.
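For anyone who wants to check the arithmetic, here is the same calculation as a short script (a sketch using the figures above, which are the post's own and not independently verified):

```python
# Back-of-envelope check of the conversion factor:
# how many zettajoules (10^21 J) raise the 0-700 m ocean layer by 1 deg C?

volume_km3 = 237_029_703            # volume of top 700 m of ocean, cubic km (interpolated)
density_ratio = 62 / 60             # rough salt-vs-fresh water density adjustment used above
tonnes = volume_km3 * 1e9 * density_ratio    # 1 km^3 of fresh water is ~1e9 tonnes
joules_per_tonne_per_degC = 4.186e6          # specific heat of water, J per tonne per deg C

joules_per_degC = tonnes * joules_per_tonne_per_degC
zj_per_degC = joules_per_degC / 1e21

print(f"{zj_per_degC:.0f} ZJ per deg C")     # ~1025 ZJ, i.e. "about a thousand"
```

So dividing the chart's zettajoule values by roughly 1000 gives degrees C directly.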

Dividing all of the numbers in their chart by that conversion factor gives us the same chart in units of degrees C. Calculations are shown in the spreadsheet.

Figure 2. Upper ocean heat content anomaly, 0-700 metres, in degrees C.

I don’t plan to say a whole lot about that, I’ll leave it to the commenters, other than to point out the following facts:

• The temperature was roughly flat from 1993-1998. Then it increased by about one tenth of a degree in the next five years to 2003, and has been about flat since then.

• The claim is made that the average temperature of the entire upper ocean of the planet is currently known to an error (presumably one sigma) of about a hundredth of a degree C.

• I know of no obvious reason for the 0.1°C temperature rise 1998-2003, nor for the basically flat temperatures before and after.

• The huge increase in observations post 2002 from the addition of the Argo floats didn’t reduce the error by a whole lot.

My main question in this revolves around the claimed error. I find the claim that we know the average temperature of the upper ocean with an error of only one hundredth of a degree to be very unlikely … the ocean is huge beyond belief. This claimed ocean error is on the order of the size of the claimed error in the land temperature records, which have many more stations, taking daily records, over a much smaller area, at only one level. Doubtful.

I also find it odd that the very large increase in the number of annual observations due to the more than 3,000 Argo floats didn’t decrease the error much …

As is common in climate science … more questions than answers. Why did it go up? Why is it now flat? Which way will the frog jump next?

Regards to everyone,

w.

## 230 thoughts on “Ocean Temperature And Heat Content”

1. Hoser says:

Hey, a broken clock constantly reports very reproducible values. Doesn’t mean it’s accurate.

Many thanks indeed !

That claim of ‘error knowledge’ should be voided as well as the ‘0.1’ of a degree C increase.

XD

3. Why did it go up?

4. Mike Jonas says:

Re the “0.1°C temperature rise 1998-2003”: could it be from the 1997-8 El Nino? If the upwelling warm water comes from below 700 m depth, or if the temperature measure doesn’t treat all 700 m equally, it could be the El Nino. I think this would be in line with Bob Tisdale’s thinking that there is a ‘step function’ at an El Nino.

5. Willis,
” I find the claim that we know the average temperature of the upper ocean with an error of only one hundredth of a degree to be very unlikely … the ocean is huge beyond belief. This claimed ocean error is on the order of the size of the claimed error in the land temperature records, which have many more stations, taking daily records, over a much smaller area, at only one level. “

But the ocean temperature isn’t noisy. At depth there is no daily variation – and weak spatial gradients. There are no winds, no variable sunlight. The main issue seems to be deciding where the float actually was when readings were taken.

Here is a document that looks at a particular trajectory in detail. Fig 8 in particular shows the error issues.

6. Lew Skannen says:

I am always interested in error bars. I suspect that they are not really wanted, and everyone in the modelling world would be happier if they would just disappear, but they stubbornly refuse to do so. They are cleaned up and brought out for show when they need to be accounted for, but they are expected to behave themselves, not make a scene, not speak unless spoken to, and certainly not relax and reveal any more of themselves than strictly necessary.
In reality if they were allowed to be themselves and behave as they would at home I suspect that the error bars would undo their corsets and belt buckles and flop out all over the place.
This would then spoil the image of the neat, tidy, prim and proper graph because it would be indistinguishable from a page of wall to wall error bars running rampant. A bit like a ball room dancer at a Hells Angels long weekend booze up.

7. Arnost says:

• I know of no obvious reason for the 0.1°C temperature rise 1998-2003, nor for the basically flat temperatures before and after.

Adjustments to the bathythermograph fall rate (particularly for the data from 1996)?
http://www.nodc.noaa.gov/OC5/WOD09/bt_bias_notes.html

8. DOH! You’ve changed a heat content measurement of the entire ocean to a level of abstraction, a potential to reach a certain temperature at a specific depth. Not an improvement.


10. Mario Lento says:

First: Bob Tisdale explains with clarity what happens with ENSO and step changes.
Second: What happened to the water energy below 700 metres? Any mixing there could have a tremendous effect.

Anyway – the earth seems particularly stable with regard to its energy budget based on your fine work Willis. One caveat: getting measurements accurate enough to resolve such tiny variations seems difficult at best. I personally have found it difficult to get such accurate and repeatable measurements from temperature sensors, even with RTDs… you know, those platinum ones…
Great post!


12. Mario Lento says:

@James Cook: No, he converted energy budget to an average temperature increase in a normalized body of water equal to the volume reported. It helps us see what the magnitude of the energy budget would do to the ocean temperatures of that upper 0.7km of ocean. It’s just to put things into perspective James…

13. Steven Mosher says:

Nick,
Ya, you’d be amazed if you look at a ship’s track over time in ICOADS by how little the water temperature changes along the track, and by how little it changes during the course of a day.. relative to the air, that is.

@Willis:

“• The claim is made that the average temperature of the entire upper ocean of the planet is currently known to an error (presumably one sigma) of about a hundredth of a degree C.”

Actually, that is not what the number represents.

REPLY: Mosh – instead of making another crypto-comment, why not tell us what it represents and show a citation? – Anthony

14. tobias says:

Thanks, Hoser, I had a good laugh at that. I use it often and chuckle every time.

15. Willis Eschenbach says:

James Cook says:
February 25, 2013 at 11:02 pm

DOH! You’ve changed a heat content measurement of the entire ocean to a level of abstraction, a potential to reach a certain temperature at a specific depth. Not an improvement.

Thanks, James. “Potential to reach a certain temperature”? I did several things, but I’ve no clue what you are referring to.

w.

16. Mindert Eiting says:

Willis, with N randomly sampled observations the standard error equals the standard deviation times square root N. Standard error 0.01 and N = 3000 suggests standard deviation of half a degree Celsius or almost all annual means (per float) between plus/minus one degree. Crucial is randomness. Is there also a graph of number of floats each year? If not fixed, for each year the number added and the number deleted? The whole effect may be due to sample change. Compare the surface station record.

17. The temperature rise from 1998 – 2003 is when the Argo system was being deployed. The slightly lower earlier temperatures most probably represent nothing more than some attempt to adjust for the depth/time error inherent in XBT temperature profiles.

The brief 1993 to 1997 part of the record is only enough to indicate a steep rise thereafter but too short to raise uncomfortable questions regarding the earlier decades of no noticeable increase.

18. Australis says:

It is notable that two incompatible measurement systems are brought together in this graph. Is it a coincidence that the (relative) warming spurt occurred on the eve of the commencement of the ARGO floats?

We’ve seen this before, when sea level rises doubled at the time measurements glissaded from tide gauges to satellites, around 1992. And when hockey sticks glissaded from paleo measures to instruments around 1960. And … ?

19. bw says:

Most solar input occurs above the thermocline. Actually, almost all solar energy input occurs within the top meter of the ocean depth. That one meter of ocean has over 3000 times the volumetric heat capacity of the air above it. Most of the global energy budget therefore arises at the tropical ocean surface. The greatest variable affecting that budget is how much solar energy reaches the surface. Since tropical ocean albedo is very low, the only remaining major variable is the transparency/albedo of the atmosphere above the ocean, ie clouds.
Another variable is how much mixing/transport occurs from the ocean surface downward.
The thermocline is usually around 100 meters. Below the thermocline, the ocean depths are pretty much an infinite heat sink. Even relatively small changes in surface currents will carry more thermal energy than the entire atmosphere above the surface. What would happen to the northern polar ice if there were a small shift in the direction of the Gulf Stream??
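The "over 3000 times" figure in this comment checks out against textbook values (a sketch; the sea-level air density and specific heats below are standard approximations, not from the post):

```python
# Rough check: volumetric heat capacity of water vs. air at sea level.
cp_water = 4.18e6          # J per m^3 per K (1000 kg/m^3 * 4180 J/kg/K)
cp_air = 1.2 * 1005        # J per m^3 per K (sea-level density * specific heat of air)
ratio = cp_water / cp_air
print(round(ratio))        # ~3466, consistent with "over 3000 times"
```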

20. PaulC says:

Does anyone know how much energy it takes to heat 237 million cubic kilometres of salt water by 0.1 deg in 5 years? Is it possible given heat from the sun and from the earth’s core (volcanoes etc.)?
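A rough answer, using the roughly 1000 ZJ per degree conversion from the main post (the ocean area and year length below are approximate standard values, not from the post):

```python
# Energy for a 0.1 deg C rise in the 0-700 m layer, and the implied flux.
zj_per_degC = 1025               # conversion factor from the main post
energy_zj = 0.1 * zj_per_degC    # ~102 ZJ for a 0.1 deg C rise
seconds = 5 * 3.156e7            # five years, in seconds
power_w = energy_zj * 1e21 / seconds
ocean_area_m2 = 3.6e14           # global ocean surface area, approx.
flux = power_w / ocean_area_m2
print(f"{energy_zj:.0f} ZJ, i.e. ~{flux:.1f} W/m^2 over the ocean surface")
```

For scale, geothermal heat flux averages roughly 0.1 W/m², while absorbed solar at the surface is on the order of 160 W/m², so the sun could supply ~1.8 W/m² easily; whether it did is the question.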

21. Luther Wu says:

PaulC says:
February 25, 2013 at 11:57 pm

Does anyone know how much energy it takes to heat 237 million cubic kilometres of salt water by 0.1 deg in 5 years? Is it possible given heat from the sun and from the earth’s core (volcanoes etc.)?
___________________________
The graph says it is possible. Maybe we could get some models to agree. Maybe I should just go back to sleep.

22. PaulC says:

Luther Wu says “The graph says it is possible

That’s the point – the graph looks wrong. An out-of-character step usually means Mann-made figures.

23. The oceans have currents … with areas of cold upwelling. You dump floats in the sea and it is a matter of fact that they will float away from the areas of upwelling, from cold to warm.

24. Mindert Eiting says:

Sorry, I meant above standard deviation divided by square root N. An example of simple book-keeping you almost never see. GHCN: 1940 has 4266 stations, 1940-1969 added 6074, deleted 696. In 1970 we have 9644 stations, 1970-1999 added 2918, deleted 9434. In 2000 we have 3128 stations, 2000-2010 added 14, deleted 1539. The samples were never spatially random. The changes were extremely non-random (homogenization). If something similar happened with the floats, the whole effect is dubious beyond repair.
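The back-of-the-envelope in the earlier comment (standard error equals standard deviation divided by the square root of N) works out as follows, assuming the floats really are independent random samples, which is the crux of the point:

```python
# Working backwards from the claimed one-sigma error to the implied
# per-float standard deviation, via SE = sd / sqrt(N).
import math

se_claimed = 0.01        # claimed one-sigma error, deg C
n_floats = 3000          # approximate number of Argo floats
sd_implied = se_claimed * math.sqrt(n_floats)
print(f"implied per-float std dev: {sd_implied:.2f} deg C")  # ~0.55
```

That is the "half a degree Celsius" figure in the comment; any non-randomness in the sample inflates the true error beyond 0.01.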

25. Nick said;

‘At depth there is no daily variation – and weak spatial gradients. There are no winds, no variable sunlight. The main issue seems to be deciding where the float actually was when readings were taken.’

Can you give us your definition of ‘depth’ in the context you are using it? Self-evidently there is a huge difference between the top and bottom of the ocean, and sufficient spatial gradient to make it worth thinking of tapping into it as an energy source. The temperature difference will vary of course according to whether you are in the tropics or northern latitudes.
Are you referring purely to abyssal depths?

tonyb

From Nick Stokes on February 25, 2013 at 10:41 pm:

But the ocean temperature isn’t noisy. At depth there is no daily variation – and weak spatial gradients. There are no winds, no variable sunlight. The main issue seems to be deciding where the float actually was when readings were taken.

Here is a document that looks at a particular trajectory in detail. Fig 8 in particular shows the error issues.

For those of us who prefer our URLs un-obfuscated, here’s your link:
http://www.cawcr.gov.au/bmrc/ocean/staff/gbb/Argo_error_estimates_v3.doc

Considering the many posts Willis has had before about the ARGO system, how the floats operate, etc, plus his extensive knowledge of the oceans otherwise, it is curious that apparently you are taking it upon yourself to educate such a poor benighted fool, speaking down as you see needed to reach his decidedly low level of understanding.

Having downloaded the document, I see it is an incomplete work in progress, no references younger than 2005. Thus this “error estimates” document is missing info from the last seven-plus years of operation of this rather young monitoring system.

Got anything more relevant? If not, I bet Willis can quickly get some better links for you.

27. Jesper says:

A 700 m depth shouldn’t be used to convert joules to degrees. The ocean mixes down only to about 50-100 m in most places. The energy change is concentrated in the upper layers, and not fully distributed over 700 m. Using a more realistic mixing depth of, say, 70 m would increase your temperature change by a factor of 10, and better reconcile the energy change with observed temperatures.

28. Michel says:

Why transform all heat content change into hypothetical temperature change?
Water comes and goes in the form of rain and vapour. To add 1 mm to the global sea level it takes approx. 0.9 zettajoules. Yearly rainfalls over the whole globe (860 mm on land and seas) take or give approx. 1000 zettajoules.
How to disentangle energy content between heat capacity and heat of evaporation, taking into account short, long term, geographical variations? More research is needed…
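Michel's 0.9 ZJ-per-millimetre figure checks out roughly via the latent heat of vaporisation (the ocean area and latent heat below are standard approximate values, not from the comment):

```python
# Energy to evaporate (or condense) enough water to change
# global sea level by 1 mm.
ocean_area_m2 = 3.6e14        # global ocean surface area, approx.
depth_m = 0.001               # 1 mm
mass_kg = ocean_area_m2 * depth_m * 1000.0   # water density ~1000 kg/m^3
latent_heat = 2.46e6          # J/kg, latent heat of vaporisation near 15 deg C
energy_zj = mass_kg * latent_heat / 1e21
print(f"{energy_zj:.2f} ZJ per mm")          # ~0.89 ZJ, close to Michel's 0.9
```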

29. richard verney says:

bw says:
February 25, 2013 at 11:52 pm

Most solar input occurs above the thermocline. Actually, almost all solar energy input occurs within the top meter of the ocean depth. That one meter of ocean has over 3000 times the volumetric heat capacity of the air above it. Most of the global energy budget therefore arises at the tropical ocean surface. The greatest variable affecting that budget is how much solar energy reaches the surface. Since tropical ocean albedo is very low, the only remaining major variable is the transparency/albedo of the atmosphere above the ocean, ie clouds.
Another variable is how much mixing/transport occurs from the ocean surface downward.
The thermocline is usually around 100 meters. Below the thermocline, the ocean depths are pretty much an infinite heat sink. Even relatively small changes in surface currents will carry more thermal energy than the entire atmosphere above the surface. What would happen to the northern polar ice if there were a small shift in the direction of the Gulf Stream??
////////////////////////////////////
The optical absorption of solar radiation in water does not support the assertion that almost all solar energy is absorbed within the top metre. Of course, ocean water is not pristinely clear, and contains not only salts (not simply NaCl) but also other particulate matter which affects the absorption characteristics.

The assertion that most of solar occurs above the thermocline (usually around 100 metres) may be correct.

I would suggest that it is because most of solar is not absorbed within the top metre of the ocean but is instead absorbed at lower depths that ocean overturning (as well as conduction where the energy flux is downwards once one gets to about the 5 metre depth) works so effectively.

It is worth pausing to consider that after approximately 3.5 to 4 billion years (or so) of receiving solar energy, the average temperature of the oceans is only about 4 deg C. To consider that the oceans have an average temperature akin to the surface temperature is misleading and short sighted. It is the fact that the average temperature of the oceans is low that we get ice ages. If the average temperature of the oceans was more akin to the surface temperatures earth would not get ice ages (may be some relatively small growth in polar ice caps but no more) since the heat capacity in the oceans would be so great that extensive ice could not develop.

I consider that when we say that the average temperature of the earth is plus 14 or plus 15 degC (whatever figure that is quoted for this misleading concept), this is in fact misleading since it fails to take account of the average temperature of the oceans. It is looking at the position in an interglacial. If we were to consider the average temperature of the earth in a glacial period, it would be far lower since the surface temperature of the oceans would have cooled to a temperature much more akin to the average temperature of the ocean.

I do not consider that this is properly taken into account when considering the earth’s energy budget and when making comparisons with the moon (in any event it is crazy to make a comparison with the moon that is so different in almost every respect save only the distance from the sun). It is another example of a snapshot error so frequently made in climate science; a failure to consider data over a long enough base period, and making extrapolations from an inappropriately short time series data set.

30. izen says:

This ignores the vast mass of the ocean that is below 700m

If ocean heat content has really not increased significantly over this time then it poses big problems for the measurements of ice melt in Greenland and Antarctica. Sea level rise would need to be driven by much greater amounts of ice melt if thermal expansion is a smaller factor.

Or you could dismiss the OHC and sea level data as grossly mistaken or fraudulent. But that would destroy any credibility as multiple lines of evidence support both the rise in sea level and the degree of ice melt.

31. richard verney says:

Further to my last post regarding the absorption of solar irradiance in water, perhaps I should add 2 further points.

First, we are lucky that the absorption of solar irradiance in water, due to the wavelength characteristics of solar irradiance, is very different to the absorption characteristics of LWIR in water. Had it been similar, the oceans would have burnt off long ago. We are very fortunate that the top surface of the oceans does not absorb any significant quantity of incoming solar.

Second, when assessing to what extent the earth’s atmosphere raises the average temperature above that of the moon, one should make this assessment over an entire glacial/interglacial cycle. The average temperature of the moon probably varies very little over such a cycle. However, the temperature of the earth does. One needs to assess the average surface of temperature of the land over this complete cycle and the average surface temperature of the ocean over this entire cycle, and it is only once this assessment has been made that one can begin to consider to what extent the earth’s atmosphere raises the temperature of the earth over and above that of the moon.

32. richard verney says:

[snip . . nope . . mod]

Bart says:
February 25, 2013 at 1:47 pm
Phobos says:
February 25, 2013 at 1:27 pm

“— much more relevant than just the 0-700 m level — “

How so? IR from CO2 backradiation does not penetrate nearly that far. How are you supposing the heat gets there, when there is no change in the waters above?…”
////////////////////////////////////////////////////////////////////////////////////////////////
Before one considers how much DWLWIR penetrates the oceans to depth, one has to consider whether any and, if so, how much DWLWIR even reaches the very top surface layer of the ocean.

The optical absorption of LWIR in water is such that 20% of LWIR is fully absorbed within 1 micron, 40% is fully absorbed within 2 microns, and 60% is fully absorbed within 4 microns. About 83% of it is fully absorbed within the first 10 microns. See for example http://scienceofdoom.com/2010/10/06/does-back-radiation-heat-the-ocean-part-one/ This website cites a plot taken from Wiki.
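Taken at face value, those absorption fractions imply an e-folding (1/e) depth of only a few microns. A quick sketch, assuming simple exponential attenuation at each quoted depth (only approximate, since broadband LWIR is not a single exponential):

```python
# What e-folding depth does each quoted (depth, fraction-absorbed) pair imply,
# if attenuation were exponential? Figures are the ones quoted above.
import math

absorbed = {1: 0.20, 2: 0.40, 4: 0.60, 10: 0.83}   # depth in microns: fraction absorbed
for depth, frac in absorbed.items():
    transmitted = 1 - frac
    efold = -depth / math.log(transmitted)
    print(f"{depth:3d} um: implied e-folding depth ~{efold:.1f} um")
```

The implied values do not agree exactly (a single exponential does not fit broadband LWIR), but all fall in the 4-6 micron range, consistent with the point that water is effectively an LWIR block.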

For all practical purposes water is a LWIR block, much like a parasol is a sun block, or a high factor sun cream block is a powerful harmful UV block.

The first question that arises is, given that water is such an effective LWIR block, how does any DWLWIR reach the top surface of the oceans? This question arises because the ocean model referred to in Wiki (and the like) is an example of an ideal ocean, far divorced from the realities of life. The ocean considered is the ocean that is as flat as a mill pond. But in real life, this is not earth’s oceans.

According to a study conducted by Stanford University, the average wind speed over water is Beaufort 4 (see http://www.stanford.edu/group/efmh/winds/global_winds.html). Of course, averages can be misleading due to variability, and in the Atlantic and South Pacific the wind speed is greater (see http://www.ceoe.udel.edu/windpower/ResourceMap/index-world.html). I have spent some 30 years studying ship’s logs covering trading worldwide, and I can confirm that it is rather rare to see wind conditions of less than force 4 being recorded on ocean voyages; such a review would suggest an average more like force 5 (on worldwide ocean trading routes).

NOAA gives a description of sea conditions in various wind conditions here: http://www.spc.noaa.gov/faq/tornado/beaufort.html. It will be noted that at force 3 “crests begin to break, scattered whitecaps” and at force 4 there are “numerous whitecaps”. What does this mean? It means that already at force 3 the wind is drawing off the very top surface of the water, which can be seen by the naked eye as crests beginning to break. The optical resolution of the human eye is not high, and the fact that the unaided human eye can, from a distance, see white crests means that more than a few microns of water is being ripped off. A human hair is between 17 and 180 microns thick (http://en.wikipedia.org/wiki/Hair), perhaps on average approximately 50 microns (consider, when standing say 5 metres from a person, how easy is it to see individual strands of hair?). The fact that the human eye can see white crests, which by force 4 are “numerous”, suggests that not less than about 50 microns of water is being ripped off the oceans and lies immediately above them, particularly within say ½ metre above the surface, as wind-swept spray and spume.

Accordingly, before any DWLWIR can reach the oceans it has to first find its way through the wind swept spray and spume which lies immediately above the oceans. Given the optical absorption of LWIR in water, for practical purposes if there is even just 6 microns of wind swept spray and spume lying above the oceans at most only about 25% of DWLWIR even gets to reach the top surface of the oceans. If there is more than 6 microns of windswept spray and spume, even less than 25% of DWLWIR could penetrate this barrier. This is an issue which seems to be overlooked by those promoting the AGW meme.

It may well be the case that in force 5 conditions and above, none of the DWLWIR even reaches the very top layer of the oceans because it cannot penetrate the IR block consisting of the wind swept spray and spume that exists immediately above but divorced from the ocean below.
It is only once the DWLWIR has penetrated the spray and spume, that one has to ask how does that residual element (ie., such DWLWIR that is not absorbed by the windswept spray and spume which exists and lies immediately above the top surface of the ocean) heat the deep ocean? ie., the point to which Bart alludes. It is not easy to see by what mechanism DWLWIR can effectively heat the ocean, for a number of reasons:

1. Little if any DWLWIR actually reaches the top surface of the ocean since, for reasons detailed above, in the real world, most of it must surely be fully absorbed by the windswept spray and spume that lies immediately above the ocean (ie., say within ½ metre or so above the ocean).

2. Of the residual DWLWIR that has found its way past the windswept spray and spume some 60% of it is fully absorbed within 4 microns of the top surface of the ocean. But what happens to the DWLWIR so absorbed? Absorption of IR photons is essentially a light speed event, and theoretically (assuming the K&T energy budget is correct) there is so much energy absorbed within the first 4 microns of the top surface of the ocean that there would be copious amounts of evaporation (perhaps so much as to provide approximately 15 metres, or so, of global rainfall). This would arise unless in some way the energy received could be dissipated downwards into the deeper ocean before the top 4 microns are heated to evaporation point by DWLWIR being absorbed in the top 4 micron layer. But how is the energy dissipated downwards? What is the mechanism that dissipates this energy downwards?

3. It is not easy to see by what mechanism the LWIR absorbed in the first 4, or so, microns is dissipated downwards. It would appear that it cannot be conducted downwards, since the energy flux is upwards, not downwards, at the top of the ocean, and there is no known mechanism whereby conduction can take place against the direction of energy flux. See http://en.wikipedia.org/wiki/Sea_surface_temperature, from which it will be seen that the top surface of the ocean is cooler, and that the ocean temperature increases from the top 10 microns through to about 5 metres; only from a depth of about 5 metres onwards does the ocean begin to cool. It follows from this that the energy flux is upwards, not downwards, so how can any energy absorbed within the first 4 or so microns be conducted downwards?

4. The only other method that I have seen suggested is ocean overturning. However, this is a slow mechanical process measured in many hours (about half a day). It is not clear that ocean overturning can effectively wrap and drag down the very top micron layer (this is a problem in itself), but even if it could, this is a slow mechanical process which cannot dissipate energy downwards at a speed greater than or nearly equal to the speed and rate at which DWLWIR is absorbed in the top 4 micron layer. Given the speed of photonic absorption in the top 4 or so micron layer, the mechanical process involved in ocean overturning cannot dissipate that energy down to depth before the energy absorbed in the top 4, or so, microns would raise the temperature in the first 4, or so, microns of the oceans to a temperature sufficient to drive evaporation from the top microns of the ocean.

I have asked Willis a number of times to explain the process by which energy from DWLWIR absorbed in the top few microns of the oceans can be dissipated downwards to depth before the energy absorbed in those microns heats those microns of water to a temperature driving evaporation. Despite many requests being made of him, he has at no time explained the mechanical process involved. He has not explained how heat can be conducted to depth against the direction of energy flux, nor how ocean overturning can dissipate the energy absorbed in the top microns before that energy would drive evaporation from those very microns of water.

As I see matters, there is a significant problem with respect to the behaviour of DWLWIR and its interaction with the oceans which presently is not fully addressed, or even addressed at all, by those who support the AGW meme.

PS. I do not like referencing Wiki, but I consider that on the aspects covered above it is not contentious.

PPS. As noted I have studied ship’s logs for approximately 30 years. Today, ships record ocean temperature by taking the water temperature at the inlet manifold of the engine cooling system. This water is drawn at depth. Depending on the design and configuration of the vessel (to what extent it is laden and how it is being trimmed), an ocean going ship will draw water from a depth of between say 5m and 13m. A depth of 8m to 10m is probably quite typical (if there is such a thing). It will therefore be appreciated that when a ship records water temperature it is measuring water temperature drawn at a depth of say about 8 to 9m, not surface temperature. In practice, there is for the main part relatively little difference between the water temperature at about 8m and surface, but of course, ship’s logs, since they are not recording surface temperature, will understate what the surface temperature of the ocean is.

33. sean.fr says:

You cannot just relabel heat content into degrees using a multiplier. The measurement is in degrees; the content is derived using knowledge of salinity and pressure on a cell-by-cell basis. You would not average the 10 km of air above you. You have to separate out the layers and zones. The key thing to find is what is happening in the deep. If you want to argue that a lot of warming is already programmed in, this is where you need to claim it is hiding.

34. Mike Ozanne says:

” I know of no obvious reason for the 0.1°C temperature rise 1998-2003, nor for the basically flat temperatures before and after.”

Well us widget makers who saw a mean shift which occurred during the implementation of an improved measurement system, would take it to mean that our prior record history was probably bull. I had a quick look through the ARGO website and couldn’t find reference to any processes of reliability and reproducibility or cross calibration against existing systems prior to deployment. But there was this item ” Analyses of decadal changes presently focus on comparison of Argo to sparse and sometimes inaccurate historical data” which would also seem to suggest that the prior records are suspect.

35. Richard Verney

Good comment.

If it is the upper ocean temperature that affects our global/land temperature, it follows that whether the upper layers are warm or cold matters much more than what happens lower down, which fluctuates very little.

In the summer the upper surface is greatly affected by sunshine intensity and duration, as I can testify from my experiments in the English Channel last year! It is also affected by other factors, of course: if it is mostly cloudy there will be little solar gain, and if the nights are clear there is a surprisingly high loss of heat. As the BBC reported this morning, the North Sea is colder than normal, a combination of the poor summer and lack of sun ever since, amongst other things.

Lamb in 1982 mentioned that in 1690 to 1699 (the depths of the LIA) the ocean surface temperature between Iceland and the Faroe Islands was probably 5°C lower than today. This suggests the Arctic cold water had spread far to the south and that (probably) sunshine levels were low.
tonyb

36. We had an interesting weather report this morning on BBC R4 0700. The reporter forecast colder than usual (?) weather due to the NE winds from Scandinavia blowing over a colder than normal (?) North Sea. Remember this report was on the arch alarmist media BBC. But the North Sea is a shallow small area of water that would warm quicker than the oceans IF the claimed global warming was true. So perhaps this indicates a cooler NW Europe than would be expected probably due to a slightly less active sun.
Comment on Richard Verney’s dit above: the top 200m of water, the photic zone, is affected by solar energy; below that, sunlight cannot penetrate. Lunar comparisons are valid because the moon receives the same insolation as Earth, and zenith-position temperatures there are around 120C, which is the radiative equilibrium temperature for 1370W/m2. We get the same, reduced for albedo and atmospheric absorption to 960W/m2, which gives a radiative equilibrium temperature of 88C.

Surface temperatures on earth in the zenith position can reach near 60C in dry desert positions, though rarely over 40C in wet rainforest positions. These are both air temperatures. Surface heating of the atmosphere causes convection, which pulls in cooler air to continue the heat-removing process. The surface is always hotter than the surrounding atmosphere, and surface measurement shows that the 88C in dry desert positions is nearly attainable. I have seen an egg fried on a black rock in Death Valley, CA.

Lunar comparison demonstrates the cooling effect of an atmosphere. There is more than enough heat from the sun to preclude the need for a GHG theory.
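For readers who want to check the 120C and 88C figures above, they follow from the Stefan-Boltzmann radiative-equilibrium relation T = (S/sigma)^(1/4). A quick sketch (Python), using only the fluxes quoted in the comment:

```python
# Radiative-equilibrium temperature T = (S / sigma)^(1/4),
# using only the flux values quoted in the comment above.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def t_equilibrium_c(flux_w_m2):
    """Blackbody equilibrium temperature (deg C) for a given absorbed flux."""
    return (flux_w_m2 / SIGMA) ** 0.25 - 273.15

print(round(t_equilibrium_c(1370)))  # lunar zenith, ~121 C
print(round(t_equilibrium_c(960)))   # reduced for albedo/absorption, ~88 C
```

Both quoted numbers come out within a degree or so, so the arithmetic at least is internally consistent.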

Assuming the figure is correct, could anyone give us an idea of what effect a 0.1 degree centigrade change over 20 years would have?

38. phlogiston says:

“I know of no obvious reason for the 0.1°C temperature rise 1998-2003, nor for the basically flat temperatures before and after.”

You need to read your Bob Tisdale. This jump-up in ocean temperatures coinciding with a big El Niño is at the core of his analysis and interpretation of ENSO and climate. What the above graph shows is the effect on ocean heat of the monster 1997-1999 El Niño and following La Niña. Bob has dissected the east Pacific temperatures from “rest-of-the-world” ocean temps, and shows that, while the east Pacific itself has not warmed over the last half century, it has successfully exported its ENSO heat such that rest-of-the-world ocean temperatures show discrete, quantum-like step-ups precisely at the major El Niño events of the last few decades.

BTW what goes up must come down. Expect the inverse phenomenon some time soon.

39. DennisA says:

The Mystery of Global Warming’s Missing Heat by Dr Robert Stevenson, 2000
http://www.21stcenturysciencetech.com/articles/ocean.html

“Because of the high density/specific heat of sea water, the entire heat in the overlying atmosphere can be contained in the top two meters of the oceans. This enormous storage capacity enables the oceans to “buffer” any major deviations in temperature, moderating both heat and cold waves alike.

Evaporation is constantly taking place at the surface of the seas. It is greatest in the tropics and weakest near the polar regions. The effect of evaporation is to cool the oceans and, thereby, the surface atmosphere.

Sunlight penetrates the water surface readily, and directly heats the ocean up to a certain depth. Around 3 percent of the radiation from the Sun reaches a depth of about 100 meters.

The top layer of the ocean to that depth warms up easily under sunlight. Below 100 meters, however, little radiant energy remains. The ocean becomes progressively darker and colder as the depth increases.

The infrared radiation penetrates but a few millimeters into the ocean. This means that the greenhouse radiation from the atmosphere affects only the top few millimeters of the ocean. Water just a few centimeters deep receives none of the direct effect of the infrared thermal energy from the atmosphere! Further, it is in those top few millimeters that evaporation takes place. So whatever infrared energy may reach the ocean as a result of the greenhouse effect is soon dissipated.

It is clear that solar-related variations in mixed-layer temperatures penetrate to between 80 and 160 meters, the average depth of the main pycnocline (density discontinuity) in the global ocean. Below these depths, temperature fluctuations become uncorrelated with solar signals, deeper penetration being restrained by the stratified barrier of the pycnocline.

Consequently, anomalous heat associated with changing solar irradiance is stored in the upper 100 meters. The heat balance is maintained by heat loss to the atmosphere, not to the deep ocean.”

If the sudden rise in temperature is due to upwelling of warm water or a trend toward warmer areas, then there is no way it can be taken as an increase in the heat content of the entire ocean. But then, I’m just stating the obvious (so that those claiming a change in the heat reservoir can see the fallacy of their assumptions).

And I agree with Willis that it’s unlikely we can know the average temperature of the upper ocean to 0.01 degrees, because it is huge beyond belief. But even more, the medium is in motion, moving in every direction under the sun and changing directions constantly. This whole exercise reminds me, though on a much smaller scale, of trying to determine the change in the average temperature of a tornado in progress as time advances; my head spins just thinking about it.

(As a comparative example, determining the metal content in a stationary orebody to an equal degree of confidence would be difficult enough, but every drilling campaign would show a different average simply by moving the holes to random locations; so what’s reality? The corollary with the ocean is that the medium being sampled is moving in a quasi-random fashion, with the added complexity that the buoys aren’t stationary with respect to the ocean’s currents; they’re going up and down while the water moves laterally, yet likely nonuniformly. But who knows? Again, my head spins.)

I’ve contributed nothing but doubt.
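For what it’s worth, the sampling problem described above can be illustrated with a toy model (Python). The “field” and the number of floats are invented for illustration; nothing here is real ocean data:

```python
# Toy model (NOT real ocean data): sample a spatially varying 1-D
# "temperature" field at a handful of random points, repeat the campaign
# several times, and watch the estimated mean wander.
import math
import random

random.seed(1)

def field(x):
    """Hypothetical field, mean ~15 C with ~1 C spatial structure."""
    return 15.0 + math.sin(20 * x) + 0.5 * math.sin(53 * x)

# Dense numerical average as the "true" mean of the field.
true_mean = sum(field(i / 100_000) for i in range(100_000)) / 100_000

estimates = []
for _campaign in range(10):
    pts = [field(random.random()) for _ in range(30)]  # 30 random "floats"
    estimates.append(sum(pts) / len(pts))

spread = max(estimates) - min(estimates)
print(f"true mean ~ {true_mean:.3f} C, campaign-to-campaign spread ~ {spread:.2f} C")
```

Even in this tame, stationary toy, the campaign-to-campaign spread is orders of magnitude larger than a hundredth of a degree; a moving medium with drifting instruments can only be worse.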

41. Ben Wouters says:

For some real change in the OHC see the graph in this post:
http://principia-scientific.org/supportnews/latest-news/124-real-global-warming.html

The deep oceans lost about 17C in the last 85 million years.
For an explanation of the high temperatures at that time, think of at least 100 million km^3 of magma erupting in the Pacific prior to this period. The Earth has been cooling down slowly but surely since then.
With the temperature of the deep oceans explained by geothermal heat, and the sun adding the rest, surface temperatures are easily explained without a greenhouse effect or similar constructs.
This is also an explanation for the unbelievably stable surface temperatures we have on our planet.

42. richard verney says:

John Marshall says:
February 26, 2013 at 2:27 am
“…Comment on Richard Verney’s dit above— The top 200m of water is affected by solar energy, the photic zone, below that sunlight cannot penetrate. Lunar comparisons are valid because the moon receives the same insolation as earth and…”
//////////////////////////////////////////
I do not accept that a meaningful comparison with the moon can be made. Apart from the distance between the moon and the sun, and the earth and the sun, every factor is different. Maybe one can make a ballpark comparison, but is that ballpark figure accurate to within 10%, i.e. within about 28K? I doubt it.

The earth has a very different and constantly changing albedo; it spins on a tilted axis with precession of the seasons; it is a water world with a significant latent heat capacity, acting as a heat store and sink; the earth day and lunar day are radically different; the earth has an atmosphere which absorbs some part of the incoming solar (the precise absorption is not known and varies, maybe only slightly, over time); and the atmosphere contains clouds which reflect, absorb and block solar from reaching the surface. The pattern of clouds is chaotic and constantly changing: the time of day when they appear and disappear in relation to the solar zenith, the altitude at which they appear, the area of coverage, the height/volume of coverage, their nature and consistency, how much water they hold and the size of droplet, and the albedo characteristics of the earth below any cloud formed in the atmosphere above, all have an effect on how much solar is received by the land and oceans below the clouds. This variability is so large that it could in itself completely explain the thermometer temperature record of the past 150 years. Then there is volcanic activity, the topography of mountain ranges, etc. The foregoing is just an illustration of some of the differences, not a complete list.

All in all, the differences are so large and significant that one would not expect planet earth to have the same temperature as the lifeless moon. I do not consider that GHGs are the reason why planet earth is some 33degC warmer than the moon (even assuming that the 33degC figure is correct).

That said, I agree with you that the effect of Earth’s atmosphere is to cool the Earth rather than to warm it, and I consider your assertion that “There is more than enough heat from the sun to preclude the need for a GHG theory” to be probably correct.

43. Willis:
Small (5%) quibble. Your heat capacity is for pure water at 4 celsius, atmospheric pressure. None of which, generally, apply to sea water.

Google “NASA correcting ocean cooling”. That JPL guy openly admits that he threw away all “too cool” ARGO buoy data, since the consensus says it must be warming. I have had barely any faith in ARGO since then, however fine a system it was designed to be.

45. izen says:

@- John Marshall
“Surface temperatures on earth in the zenith position can reach near 60C in dry desert positions though rarely over 40C in wet rainforest positions”

Why mention zenith position energy fluxes when only an infinitely small point is ever at the zenith position? The average flux for the total surface is a fraction of this.

@-“Lunar comparison demonstrates the cooling effect of an atmosphere.There is more than enough heat from the sun to preclude the need for a GHG theory.”

No, there isn’t. The GHG effect is old science, very well established, and can be clearly observed on all rocky planets with an atmosphere. You have no hope of explaining the temperatures of Mars and Venus without a GHG effect, and the variation in GHG over a glacial cycle on Earth clearly reveals their role in the climate.

46. Paul says:

I think we are due for another big la nina event.

47. TimTheToolMan says:

Nick writes “But the ocean temperature isn’t noisy. At depth there is no daily variation – and weak spatial gradients. There are no winds, no variable sunlight. ”

You can’t have it both ways. If the oceans are warming below where we’re measuring, to keep AGW “on track”, then that must involve currents moving that heat downwards, because diffusion can’t cut it. However, nobody has spotted them: there are no papers (that I’ve heard of) that quantify the effect with measurements, and yet it would be the obvious thing to do.

Hence we don’t have the resolution to see it OR it’s not happening. Your choice.

48. richard verney says:

Nick Stokes says:
February 25, 2013 at 10:41 pm

“…But the ocean temperature isn’t noisy…”
//////////////////////////////////////////////////////////////////////////////////
Anyone who has been diving would beg to differ; the temperature of the ocean can vary quite significantly within matters of metres.

I have spent some 30 years studying ships’ logs, and they too would suggest that that assertion is not correct. It is not infrequent that one sees different sea temperatures recorded every 4 hours. At say 11.5 knots (a fairly typical speed for an ocean-trading vessel), that is a change in temperature within only 46 nautical miles.

The idea that we have an appreciation of the average ocean temperature to within a tenth of a degree is misconceived; the assertion that we know it to within one hundredth of a degree is farcical in the extreme.

Even if ARGO were increased a thousandfold, I doubt that we would have a realistic assessment to within one tenth of a degree.

49. cmarrou says:

I am surprised Willis isn’t familiar with zeta-joules. I just saw Catherine Zeta-Joules Sunday on the Academy Awards, and I’m quite familiar with her work.

50. Owen in GA says:

John Eggert says:
February 26, 2013 at 3:44 am
Willis:
Small (5%) quibble. Your heat capacity is for pure water at 4 celsius, atmospheric pressure. None of which, generally, apply to sea water.

Actually he addressed that in his conversion. You may or may not agree with his adjustment, but that is a different quibble.

I multiply that by 62/60 to adjust for the density of salt vs. fresh water, and multiply by 10^9 to convert to tonnes. I multiply that by 4.186 mega-joules per tonne per degree C.
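The quoted steps can be checked in a few lines. This sketch (Python) simply replays the post’s arithmetic, so the only inputs are the numbers quoted above:

```python
# Replaying the conversion quoted above, using only the post's numbers:
# ocean volume (top 700 m), a 62/60 density adjustment for seawater,
# 1e9 tonnes per km^3, and 4.186 MJ per tonne per degree C.
VOLUME_KM3 = 237_029_703
DENSITY_ADJ = 62 / 60
TONNES_PER_KM3 = 1e9
MJ_PER_TONNE_DEG_C = 4.186

joules_per_deg_c = (VOLUME_KM3 * DENSITY_ADJ * TONNES_PER_KM3
                    * MJ_PER_TONNE_DEG_C * 1e6)
zj_per_deg_c = joules_per_deg_c / 1e21
print(f"~{zj_per_deg_c:.0f} ZJ per degree C")  # about a thousand
```

The product comes out at roughly 1025 ZJ per degree C, which is the “about a thousand zeta-joules” figure the post rounds to.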

51. richard verney says:

izen says:
February 26, 2013 at 3:59 am

“…No, there isn’t. The GHG effect is old science, very well established and can be clearly observed on all rocky planets with an atmosphere. You have no hope in explaining the temperatures of Mars and Venus without a GHG effect and the variation in GHG over a glacial cycle on Earth
reveals clearly their role in the climate…”
//////////////////////////////////////////////////
This is a very weird assertion, since it is well known that the temperature of Venus can be fully explained by atmospheric pressure and lapse rate (I am not saying that it is so explained, but it certainly can be), and Mars has a very high concentration of GHGs (CO2 being approximately 96% of the Martian atmosphere) and yet no significant GHG warming is observed. Mars has a roughly similar percentage of CO2 to Venus, which has just under 97% CO2, but unlike Venus it is not a hot world. The usual argument from warmists with regard to Mars is that whilst the CO2 concentration is high (96%, roughly the same as Venus), in absolute terms there is little atmosphere on Mars, so the effect of GHGs is not seen. There is a lack of partial pressure (average atmospheric pressure on Mars being just 0.6 kilopascals, or 0.087 psi).

52. John Eggert says: February 26, 2013 at 3:44 am
Willis: Small (5%) quibble. Your heat capacity is for pure water at 4 celsius, atmospheric pressure. None of which, generally, apply to sea water.
– – – –
Are you sure about that John?
“I multiply that by 62/60 [1.03333…] to adjust for the density of salt vs. fresh water…”
That looks like about 3.3% to me, in the ball park.

53. Björn says:

Every now and then I get this fantasy thought-flash of Lewis Carroll somehow time-travelling forward to our time period, stopping for a while, having a quick peek at the global warming consensus crowd and their arguments, then going back to his own age and publishing his agony in eight fits, a.k.a. “The Hunting of the Snark”, as his personal (and unexplained to his contemporaries) comment on the future he saw.
For those who have not read it, below is a link to a very nice illustrated online version of all eight fits (of laughter???).

b.w. Björn

54. richard verney says:

Further to my last post, the last sentence was incomplete. It should have read:-

“There is a lack of partial pressure (average atmospheric pressure on Mars being just 0.6 kilopascals or 0.087 psi) such that the spatial distribution of CO2 molecules is too spread out for them to produce the greenhouse effect which it is claimed that CO2 produces on Venus and Earth”

I trust that makes a little more sense. I am certainly not endorsing the warmists’ argument in this regard, but obviously partial pressure and the spatial distribution of molecules is a factor which could have some relevance.

55. JPS says:

Here are some interesting factoids, according to Wikipedia:
The ENTIRE global energy usage for one year is approximately 0.5 ZJ. After checking the numbers above (they seem to be correct, FWIW), that means if you could somehow dump that entire sum of energy into the ocean, it would raise the top 700m by only about 0.0005K. Also, the entire amount of solar energy intercepted by the earth in one year comes out to around 5500 ZJ. Wow.
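The back-of-envelope figures above are easy to replay (Python). The ~1025 ZJ per degree C is the post’s conversion factor for the top 700 m of ocean; the solar figure is gross interception at the top of the atmosphere, before albedo:

```python
# Replaying the back-of-envelope figures above. The ~1025 ZJ per degree C
# comes from the post's conversion for the top 700 m of ocean; the solar
# figure is gross interception at the top of the atmosphere (no albedo).
import math

ZJ = 1e21                 # joules per zeta-joule
ZJ_PER_DEG_C = 1025       # post's conversion factor, approx.

delta_t = 0.5 / ZJ_PER_DEG_C   # one year of global energy use, ~0.5 ZJ
print(f"warming of the top 700 m: {delta_t:.4f} C")

solar_const = 1370        # W/m^2, as used elsewhere in this thread
r_earth = 6.371e6         # m
joules_per_year = solar_const * math.pi * r_earth**2 * 3.156e7
print(f"solar energy intercepted per year: ~{joules_per_year / ZJ:.0f} ZJ")
```

One year of humanity’s entire energy use works out to about five ten-thousandths of a degree of upper-ocean warming, and the gross solar interception lands in the ~5500 ZJ ballpark quoted above.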

56. Bill Illis says:

NODC reports the standard error in 2012 of the 0-700 metre ocean heat content as 0.361×10^22 joules (one-third of the PMEL SE), which works out to just 0.0035C.

http://data.nodc.noaa.gov/woa/DATA_ANALYSIS/3M_HEAT_CONTENT/DATA/basin/yearly/h22-w0-700m.dat

I used your spreadsheet on the 0-2000 metre ocean (assuming there is 10% less area as one moves down to 2000 metres), and the temperature change from 1955 to 2012 is only 0.07C; from mid-2004 (when the Argo floats start to become reliable) to 2012, it is 0.018C.

57. CEH says:

I’m getting very suspicious when people tell me that they can measure temperatures to a hundredth of a degree outside the lab. ARGO is even worse: they say that they measure temperature to an accuracy of ±0.005 degrees Celsius.
http://www.argo.ucsd.edu/FAQ.html#accurate
I’ve been trying to find out what kind of tech they’re using to achieve this, to no avail. Does anyone here know?

58. richard verney says:

Izen

It is difficult to make any comparison between rocky planets since none are blackbodies and therefore, as a matter of first principle, BB calculations are misconceived.

However, even more fundamental than that is that you are making a comparison between a rocky planet and a water world. Some 70% of the earth is not rocky but watery. Water has very unusual characteristics and throws a complete spanner in the works. No meaningful comparison can be made between a water world and a rocky planet.

59. richard verney says:

Juraj V. says:

February 26, 2013 at 3:47 am
///////////////////////////////////////////////////

Quite extraordinary. It was apt of you to remind us of this.

The problem in climate science is that one can have little confidence in the quality of any data set; they are all far too brief, and errors, accuracy and precision are not properly acknowledged.

The upshot of all of this is that any honest scientist would have to acknowledge that we do not know whether, on a global basis, it is warmer today than it was in the 1880s and/or 1930s, and that as far as the USA is concerned it is probably cooler today than it was in the 1930s.

60. Gary Pearse says:

richard verney says:
February 26, 2013 at 2:04 am

“If there is more than 6 microns of windswept spray and spume, even less than 25% of DWLWIR could penetrate this barrier. This is an issue which seems to be overlooked by those promoting the AGW meme.”

Richard, to me, this effect would allow the ocean surface to absorb more LWIR. The spray would be heated and fall back onto the surface, be mixed in, and more spray made. In the case of a calm ocean, the absorption is low, as you note. Perhaps the phenomenon of heated spray is the reason why there is a measurable difference in heat content of the upper ocean layer. I suggest (if it is possible to know the “ocean skin removal rate”) integrating this “thin” effect 10 microns at a time over a period of say one year. Take the peeling off of the ocean layer as occurring every 10 seconds; it would only occur during the day, but sun angle would be less variable for spray up above the surface: 365/2 days * 86,400 secs/day / 10 secs * 10 microns ≈ 16 metres! Where’s my PhD!! and grants? Do you think Trenberth will put my name on his paper, having found how the missing heat gets into the deeper ocean?

61. [ snip – waaaay off topic – into outer space off topic -mod]

62. Reblogged this on gottadobetterthanthis and commented:
Anthony posts giving evidence that the ocean isn’t warming, just like the rest of the earth. Well, duh. Then here Willis runs the conversion of the heat content anomaly to temperature deviation. Different units, but the same thing. I know Mosh asserts that the “anomaly” measurement isn’t subject to the usual limits on temperature measurement, but I’ll point out that a typical digital thermometer has specs along the line of ± (0.1% reading +1°C). Accordingly, the heat analysis as presented claims that the oceans warmed over the last 20 years 0.1°C ± 1.25°C. Is that a reasonable assertion? Is it even rational?

63. Luther Wu says:

Paul says:
February 26, 2013 at 4:07 am

I think we are due for another big la nina event.
_______________
The US Southwest and much of the Southern and Central Plains are in their third year of a La Niña-type weather pattern, with drought as a result, even though ENSO has been alternating between El Niño and La Niña. The persistent pattern with a blocking high seems to have broken recently. Many residents of the areas mentioned hope you guessed wrong.

64. Gary Pearse says:

The ocean looks to be heading for a cooling off.

Antarctic ice has bottomed out and is on its way back up. Also note that the sea ice extent for the 2013 SH summer minimum is a record high for the entire satellite period 1979-2013 (tied with 2003, if the acuity of my old eyes can be trusted). With the prolonged deep freeze in the arctic,

http://ocean.dmi.dk/arctic/meant80n.uk.php

look for a new record maximum global sea ice extent this year. Note that Japan, which was bemoaning the long trip tour operators had to make to show tourists arctic ice a few years ago, now has it jammed up against the whole north coast of Hokkaido and pressing down the sides; they will now have no sea tourists! Even North Koreans can take a stroll on the sea ice if they dare. I don’t think this is the signal CAGW proponents have been waiting for.

65. Pamela Gray says:

Thank goodness we no longer study the individual ups and downs of solar parameters at such close scale (well…at least some of us don’t). If we did, we would be in a panic from year to year! My hunch is that eventually the bloom will be off the rose of minute scale ocean heat variability and we will all laugh at our previous wriggle watching consternations.

Professor Ole Humlum has an analysis of temperature and CO2 over the Jan 1980 to Dec 2011 period, where satellite data tell us that ‘total’ insolation is constant to within 0.1%.

http://www.sciencedirect.com/science/article/pii/S0921818112001658

“The maximum positive correlation between CO2 and temperature is found for CO2 lagging 11-12 months in relation to global sea surface temperature, 9.5-10 months to global surface air temperature, and about 9 months to global lower troposphere temperature.”

The oceans are at maximum saturation of CO2 and other elemental gases, and outgas based on system-wide thermal changes, which may well be influenced by changes in particle bombardments which are NOT part of the TOA ‘constant’ insolation values. Solar and galactic forces may well drive the fission thermostat that is the real basis for climate and CO2 changes.

67. Ben Wouters says:

John Marshall says:
February 26, 2013 at 2:27 am
“There is more than enough heat from the sun to preclude the need for a GHG theory.”
You’re neglecting the night side of the moon, when temperatures drop to 70K and lower
(the lowest temps on the moon are ~25K, in craters near the poles).
The average lunar surface temp is ~197K, so you need a good explanation for our ~290K average surface temp, since with our higher albedo we receive even less solar radiation than the moon.
see http://principia-scientific.org/supportnews/latest-news/123-moons-hidden-message.html
for more details.
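For reference, the temperature gap this sub-thread keeps circling around comes from the standard effective-temperature calculation. A sketch (Python), using the usual textbook albedo of 0.30 (an assumed round value, not a measured input from this thread):

```python
# The standard effective-temperature calculation behind the "~33 degrees"
# gap discussed in this thread: mean absorbed flux S*(1-albedo)/4, turned
# into a blackbody temperature. Albedo 0.30 is the usual textbook value.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/m^2/K^4
S = 1370           # solar constant, W/m^2, as quoted in this thread
ALBEDO = 0.30

t_eff = (S * (1 - ALBEDO) / 4 / SIGMA) ** 0.25
print(f"effective temperature: {t_eff:.0f} K")              # ~255 K
print(f"gap to ~288 K surface mean: ~{288 - t_eff:.0f} K")  # ~33 K
```

Whatever one thinks explains it, this is where the oft-cited ~33 degree figure comes from: a ~255 K radiative-equilibrium temperature versus a ~288 K observed surface mean.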

68. Rob Potter says:

I have probably missed something here, but isn’t the ARGO data temperature originally, before it is converted to energy? Can you not just get hold of the ARGO data and calculate temperature directly, without having to make the various estimations for volume, salt-water density, etc.?

Not that it would make any difference (or should not, anyway), but it would remove a good chunk of the comments above which seem to be quibbling about the calculation methodology.

Also, as an abuser rather than a user of statistical methodology: has anyone (Willis, I know you have done a lot of ARGO posts in the past) looked at the range of temperatures seen in these buoys, together with the variation in readings from one area? This might go part of the way to answering your question of whether the error bars are that good or not. I suspect they come from the final ‘god’ number for the total energy content and are nothing more than the presumed instrument error.

69. Rob Potter says:

By the way, I meant that I am an abuser of stats, not you, Willis! Sorry if there was any misunderstanding!

70. Gary Pearse says:

Juraj V. says:
February 26, 2013 at 3:47 am

“Google “NASA correcting ocean cooling”. That JPL guy openly admits that he threw away all “too cool” buoy ARGO data, since consensus says it must be warming. I have barely any faith even in ARGO since then, whatever fine system it has been designed to be.”

If they can read cool then they can probably also read hot. Leaving them in is probably closer to the right overall OHC. I’m sure this cavalier treatment of data adds enormously to the “error”. Were I charged with evaluating the cool ones, I would take half a dozen, move them into a different sector, and see how they compare. If the “anomaly” disappeared, I would move them back and add the data back into the sheet. The treatment of the data shows that good science is not the objective. Shouldn’t this info be sent to the senate/congress committees?

71. MattN says:

If your calculation is right, then the 200 zeta-joule increase in OHC since 1950 translates to 0.2C, and that doesn’t seem very concerning to me.

72. MikeR says:

Can someone give me some background on this issue – I’m totally confused. Graphs aside, surely no one is measuring the heat content of the ocean! Isn’t it correct that they are measuring temperature, with buoys and such? So even if someone decided to convert that into heat content for some reason beyond my understanding, why should we convert it back? Rather, what is the basic data available on temperature, above 700 meters or below or whatever? Do we know what’s been happening in the last decade and before, and how does it depend on depth? I had heard that there is “missing heat”, that some are guessing that it passed down into the very deep ocean beyond reach of our instruments… what are the facts about this?
Thanks.

73. Jeff Alberts says:

Pamela Gray says:
February 26, 2013 at 6:17 am

Thank goodness we no longer study the individual ups and downs of solar parameters at such close scale (well…at least some of us don’t). If we did, we would be in a panic from year to year! My hunch is that eventually the bloom will be off the rose of minute scale ocean heat variability and we will all laugh at our previous wriggle watching consternations.

That’s pretty much how I’ve felt about the hype in both directions. Those predicting warming or cooling.

74. richard verney says:
February 26, 2013 at 2:04 am
I have asked Willis a number of times to explain the process by which energy from DWLWIR absorbed in the top few microns of the oceans can be dissipated downwards
========
The DWLWIR is calculated to be about the same order of magnitude as solar radiation at the surface. Yet every swimmer knows that the ocean surface warms only in the daytime and cools at night. On cloudy days the ocean surface doesn’t warm, yet according to the typical energy budget drawn by Climate Science there is little change in the W/m2 reaching the surface on a cloudy day as compared to a sunny day.

And on a rainy day there are more W/m2 reaching the surface than on a sunny day. This can easily be verified by the speed at which rain melts ice and snow as compared to the rate at which sunshine melts them. Yet the ocean surface does not warm on a rainy day.

This certainly suggests that the typical energy budget as drawn by Climate Science, which in effect treats the earth’s land area as large and its oceans as small, is wrong; and that the absorption of DWLWIR by the oceans goes primarily into evaporation, because it is absorbed by such a thin layer of molecules at the surface.

75. Ian L. McQueen says:

@bw

You asked “What would happen to the northern polar ice if there were a small shift in the direction of the Gulf Stream??”

In one of the many comments at http://wattsupwiththat.com/2012/08/27/sea-ice-news-volume-3-number-11-part-2-other-sources-show-no-record-low/, Arno Arrak said at August 27, 2012 at 7:35 pm: “Apparently a rearrangement of the North Atlantic current system at the turn of the century caused the currents to start carrying warm Gulf Stream water into the Arctic Ocean. Direct measurements of current temperature reaching the Arctic in 2010 showed that it exceeded anything measured for the last two thousand years of Arctic history. See E&E 22(8):1069-1083 (2011).”

IanM

76. Arno Arrak says:

It is pretty obvious that the step up is related to the super El Nino of 1998. In the satellite record it is clear that this super El Nino initiated a step warming that raised global temperature by a third of a degree in four years. It was obvious too that this step warming had an oceanic origin and was not anthropogenic. Your upper ocean temperature data confirm this and add additional data that may lead to an understanding of this rare phenomenon that happened only once in the twentieth century.

77. Nic Lewis says:

“garymount says:
February 26, 2013 at 4:58 am

John Eggert says: February 26, 2013 at 3:44 am
Willis: Small (5%) quibble. Your heat capacity is for pure water at 4 celsius, atmospheric pressure. None of which, generally, apply to sea water.
– – – –
Are you sure about that John?
“I multiply that by 62/60 [1.03333…] to adjust for the density of salt vs. fresh water…”
That looks like about 3.3% to me, in the ball park.”

As well as adjusting for the density of salt vs fresh water, one needs to adjust for their relative specific heat capacities. The SHC of fresh water is about 4180 J/kg/K. The SHC of seawater is about 3900 J/kg/K, about 6.7% lower. AFAIK, the adjustment factor is not very sensitive to temperature or pressure.

But, as John Eggert says, it is only a minor point (and it is one that I have seen a well known climate scientist overlook).
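Nic’s point can be put in numbers. A small sketch (Python), taking seawater density as roughly 1025 kg/m^3 (an assumed round value; the post itself used the 62/60 factor) and the specific heats quoted in the comment:

```python
# Combining the two adjustments Nic Lewis describes, to get the change in
# volumetric heat capacity (J per m^3 per degree C). The 1025 kg/m^3
# seawater density is an assumed round value (the post itself used 62/60).
fresh = 1000 * 4180   # kg/m^3 times J/kg/K, fresh water
sea = 1025 * 3900     # seawater, with the SHC from the comment

print(f"fresh water: {fresh:.3g} J/m^3/K")
print(f"sea water:   {sea:.3g} J/m^3/K")
print(f"ratio: {sea / fresh:.3f}")  # ~0.956: a few percent lower, not higher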

If the error bars are as small as indicated, how is it that the Argo data has needed to be adjusted to such a degree? Argo data initially showed ocean cooling, until the floats that showed the cooling were eliminated from the data set.

Then there is the large increase in temperature from 1997 to 2003: 0.12C over 6 years. Give me a break. There is no known process that could warm the upper oceans by that amount in such a short period of time without frying the land surface of the earth.

I call BS on the error bars.

79. mkelly says:

John Marshall says:

February 26, 2013 at 2:27 am

To your point. Have you ever listened to a baseball game when the announcer tells everyone that it is 128 F in the outfield or the pre-game when a player will hold a thermometer on the field to show how hot it is. The stands reduce convection. The atmosphere cools the surface.

I am sure there are differences between real turf and the fake stuff with regard to temperature in the outfield.

80. Folks may remember the scandal over the Argo adjustments. This is clearly a case of confirmation bias. When the Argo data started coming in it didn’t show the expected picture. It showed the oceans were cooling, so a hunt was made to eliminate the floats that were showing cooling because they must clearly be in error.

I submit that no such hunt would have been made for sensors that were showing warming, because that was showing the expected result. Given that there should on average be as many faulty sensors reading high as faulty sensors reading low, given the large number of floats, it is likely that by eliminating the low reading sensors we now have an unbalanced population of faulty sensors reading high. This would explain the large jump in ocean temps 1997-2003 around the time Argo was first deployed.

One could just as easily reduce the average height of adult humans by eliminating the male portion of the population, or raise the average height by eliminating the female portion. By taking lots of measurements you could claim that your error bars were statistically very small. But it wouldn’t be true.
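The selection effect described above is easy to demonstrate numerically. A minimal sketch, with a hypothetical fleet of unbiased sensors from which only the "too cool" readings are removed:

```python
import random

random.seed(42)
true_temp = 10.0  # hypothetical true water temperature, deg C

# Hypothetical fleet: unbiased sensors with symmetric random errors (sd 0.5 C)
readings = [true_temp + random.gauss(0.0, 0.5) for _ in range(10000)]
mean_all = sum(readings) / len(readings)

# Now discard only the sensors reading "too cool" (more than 0.5 C low),
# as the commenter suggests happened with the Argo floats.
kept = [r for r in readings if r > true_temp - 0.5]
mean_kept = sum(kept) / len(kept)

print(f"mean, all sensors:          {mean_all:.3f}")  # close to 10.0
print(f"mean, cool sensors removed: {mean_kept:.3f}")  # biased warm
```

Removing one tail of a symmetric error distribution shifts the surviving average warm by roughly a tenth of a degree in this sketch, even though every individual sensor was unbiased.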

81. Ray says:

Ben Wouters says:
February 26, 2013 at 3:35 am

Ben, I have no idea if geothermal heat is factored into the assorted climate models in use, or how a warm “black body” behaves. But I worked below ground in Homestake Gold Mine for a period of time and know from personal experience that, below the frost line, the temperature goes up X degrees for every Y increase in depth. The Earth is a warm body that, even without any solar input, directly impacts the temperature of the seas and atmosphere.

82. mkelly says:

garymount says:

February 26, 2013 at 4:58 am

From the Engineering Toolbox, the specific heat of sea water. About a 6% difference, but the overall contention remains the same:
Water, sea at 36 °F: 3.93 kJ/(kg·K)

83. I think everyone has the Tisdale function backwards. ENSO transfers energy from the ocean to the atmosphere, so ocean enthalpy should have decreased after the 1997 El Nino. The atmosphere should show an increase, and it did very briefly, but has been flat ever since. The 1997 inflection point keeps showing up in strange places. It also happens to be the beginning of the acceleration of the magnetic north pole's motion from about 10 to over 50 kilometers per year.

Is anyone else having trouble getting on this site? I have been continuously blocked by ad popups along the top that prevent anything else from loading. This happens both when I approach from my browser and my wordpress reader. I can only get through with multiple refreshes.

84. FauxScienceSlayer says:
February 26, 2013 at 6:20 am
http://www.sciencedirect.com/science/article/pii/S0921818112001658
=================
more:
Changes in global atmospheric CO2 are lagging about 9 months behind changes in global lower troposphere temperature. ► Changes in ocean temperatures explain a substantial part of the observed changes in atmospheric CO2 since January 1980. ► Changes in atmospheric CO2 are not tracking changes in human emissions.
===========
As with the ice cores: early studies of temp and CO2 failed to take lag and lead times into account, which made it look like CO2 was driving temperature. Something Al Gore “conveniently” forgot to mention in his movie.

CO2 lags temperature in the modern records as well, which is strong evidence that CO2 is not a forcing agent in global temperatures. Rather, global temperatures are forcing CO2, and some other mechanism is causing climate change.

This is further confirmed by the failure of the CO2-based climate models to predict observed temperatures going forward. This should have been the nail in the coffin for CO2 theory. In any other branch of science it would have been. Except that CO2 theory is closely tied to environmental fears over fossil fuels and pollution. It is this fear that is preventing an honest scientific examination of the cause and effect issue. Instead we get opportunists like Gore and Packy using their positions of authority to knowingly suppress information for personal gain.

85. Jim G says:

Sampling error derives not only from a lack of significant numbers of observations, but also from a lack of representativeness of the samples collected relative to the universe being sampled, and from the methods and/or equipment used to collect those samples (siting and calibrations come to mind). These issues plague much of the “data” used in “climate science”: tree rings, sediments, etc., as well as poorly measured temperatures.

86. Robany Bigjobz says:

I don’t think converting the OHCA to temperature anomaly is actually helpful. It relies on the unstated assumption that the top 700m of ocean are well mixed and in thermal equilibrium. I doubt that either of these assumptions are valid.

It is simple to imagine that different temperature profiles from 0-700m would have identical heat content but very different climatic effects and appearances on, say, satellite temperature measurements. A thin, warm layer lying atop a steady decline in temperature might conceivably drive higher levels of evaporation than a relatively cool, deep top layer with relatively warmer water as you approach 700m.

If we’re ultimately concerned with the energy balance of the planet, let’s look at it in terms of total energy rather than the temperature of a very thin layer of the whole system.

87. Phobos says:

Willis wrote: “He notes that there has been no significant change in the OHCA in the last decade.”

1) There is much more to the ocean than the 0-700 m layer.
2) Even for this layer, the statement is incorrect: The OLS 10-year trend there is 44 TW, with a 2-sigma uncertainty of 30 TW. That’s statistically significant warming.

88. Phobos says:

Willis wrote: “I find the claim that we know the average temperature of the upper ocean with an error of only one hundredth of a degree to be very unlikely … the ocean is huge beyond belief.”

We don’t know the average temperature of the upper ocean to 0.01 C, and no one is claiming that we do.

In fact, no one is making any claims about the temperature of the ocean.

The calculation is about the *change* in average temperature of the ocean: dT = dQ/mc , and the uncertainty attaches to dT, not to T.
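That dT = dQ/mc bookkeeping can be sketched with the figures from the head post (treating the post's volume and heat-capacity numbers as approximations):

```python
# Converting a change in ocean heat content (ZJ) to a change in
# average temperature, dT = dQ / (m * c), using the head post's figures.
volume_km3 = 237_029_703              # top 700 m of ocean, cubic km
tonnes_per_km3 = 1.0e9 * 62.0 / 60.0  # density-adjusted tonnes per km^3
shc_J_per_t = 4.186e6                 # joules per tonne per degree C

mass_t = volume_km3 * tonnes_per_km3
zj_per_degC = mass_t * shc_J_per_t / 1e21   # ZJ needed to warm the layer 1 C
print(f"~{zj_per_degC:.0f} ZJ per degree C")  # ~1000 ZJ, as the post says

dQ_zj = 10.0                  # example: a 10 ZJ change in heat content
dT = dQ_zj / zj_per_degC
print(f"10 ZJ is a dT of ~{dT:.3f} C")  # ~0.01 C
```

The uncertainty in dT then scales directly with the uncertainty in dQ, divided by the same ~1000 ZJ/°C factor.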

89. Uzurbrain says:

Where are they getting these accurate thermometers? There are a lot more errors than they think. I have worked with “precision” laboratory-standard NBS-traceable thermometers for calibrating plant equipment. These instruments cost $5,000 to $10,000, and are just 0.05% accurate! When you “calibrate” another thermometer, the “calibrated” thermometer will be LESS accurate than the source. Then you have the process of calibrating the source against a reference. If done in a typical “calibrating” lab it will not be to the actual NBS reference standard but a “proxy.” Thus, more errors.

Now you have to throw in the “measurement” while in use. Essentially every electronic temperature reading device will introduce another error based upon the ambient temperature of the area surrounding the electronic device (think of the buoy that the equipment is in while taking these readings). The most accurate “precision” measuring devices are designed for “Laboratory Use Only.” With the probe in a location away from the measuring instrument, the temperature displayed will change with the laboratory's ambient temperature. Typical numbers are on the order of 0.0001 to 0.00001 per degree change in ambient temperature (and this is for the BEST, costliest laboratory-grade instruments). Not much, but that means that when the buoy is at the surface (85 °F) and then sinks to 700 meters (35 °F) you have just introduced an error of at least 0.0050! All of that RMS large-number averaging B/S will not remove that error; it will be the same for every one!

Oh, and that reminds me: proper use of RMS averaging of multiple readings assumes that the errors are random and more or less equally distributed on either side of the correct reading. The statistical theory behind RMS averaging assumes this. Any good mathematician will tell you this.
You can’t prove that 7 is the most likely number from a roll of dice if you are testing this theory with loaded dice, regardless of how many “samples” you take! You will only prove they are loaded.

90. DesertYote says:

Nick Stokes says:
February 25, 2013 at 10:41 pm

Willis,
” I find the claim that we know the average temperature of the upper ocean with an error of only one hundredth of a degree to be very unlikely … the ocean is huge beyond belief. This claimed ocean error is on the order of the size of the claimed error in the land temperature records, which have many more stations, taking daily records, over a much smaller area, at only one level. “

But the ocean temperature isn’t noisy.

###

You don’t know much about physical oceanography. E.g., you weasel about the lack of wind at depth, while ignoring the well known and well documented existence of the ocean's wind equivalent, which represents a far more energetic system than mere gas. You also seem to be under some delusion that ocean water is well mixed. It is not. One example would be the bubbles or lenses of sharply bounded lower- or higher-density water, whose temperature is generally higher (but can also be lower) than the surrounding volume containing them. Until the mechanisms that create these structures and govern their behavior are understood, no one can claim to understand the ocean well enough to be certain of anything, least of all the total heat content of the ocean from a few distribution-biased measurements.

91. @Nick Stokes 10:41pm:

But the ocean temperature isn’t noisy. At depth there is no daily variation – and weak spatial gradients. There are no winds, no variable sunlight

Define noisy, in the context of knowing its average temperature to 0.03 deg K, when the fluid body has a range of 0 to 32 deg over the world, variable by season, latitude, longitude, hour of the day, and depth.

Interannual atmospheric variability forced by the deep equatorial Atlantic Ocean, Brandt-2011, Nature 473,497–500(26 May 2011).
http://www.nature.com/nature/journal/v473/n7348/fig_tab/nature10013_F2.html
Figure 2: shows E-W velocities (color) as a function of depth (-20 to 20 cm/sec), with peak to peak reversals in as little as 300 m of depth, repeatedly. (Y-axis depth to 3500 m. x-axis is time) Location: 0 N, 23 W. (Moored, non-Argo, data)

This is only one moored buoy from one location over a two-year span. But the shocking thing is the shear, the contra-flow of currents stacked vertically over one spot on the equator. Admittedly, the velocities are not huge, but at differences of more than 1/3 of a knot they are not insignificant either.

Surface Currents in the Atlantic Ocean
http://oceancurrents.rsmas.miami.edu/atlantic/gulf-stream_2.html
is a good index page to many good maps of surface drift buoys (1978-2003), from a total-dataset spaghetti map (Figure 3), to individual buoys entrained in the Florida Current and then meandering in the Atlantic (Figs. 6, 7), and vertical profile transects of the Gulf Stream velocity field (Fig. 13).

Is it noisy? Or is it signal? It sure isn’t uniform.

92. People send me stuff!
This is an interview with J. Gregory (IPCC) and Chambers. Even if you don't understand Swedish, have patience, because the interviews in English will surprise many of you. It's a completely new tone when it comes to sea level rise. (The upper podcast link on the page.)

Enjoy!!

93. bacullen says:

If the error bars are correct and 1 sigma then multiply them by 3 to get to p<0.1 and see what happens. The delta in '98 is meaningless. The whole curve lies within the noise! It may be suggestive to some but for the whole time period it still lies w/in the noise.
GIGO!

94. Curt says:

MikeR says:
February 26, 2013 at 6:53 am

“Can someone give me some background on this issue – I’m totally confused. Graphs aside, surely no one is measuring the heat content of the ocean! Isn’t it correct that they are measuring temperature, with buoys and such?”

Yes, what they are measuring is temperature, whether with the older XBTs or the newer Argo floats. It is sensible, in principle at least, to convert this to energy units using thermal capacitance values, because energy is conserved, and in all calculations, the conservation equations are the key ones to be solved. Average temperature levels don’t really have a physical meaning, but average energy units do.

So what Willis is really doing here is a form of back-calculation, to get a feeling for the sensitivity of the original temperature measurements. Yes, oversampling and averaging can reduce the uncertainty in measurements, but that only goes so far, before any systematic errors can overwhelm the remaining random errors.

I, too, have many questions about the OHC values and the conversion from using older methods to Argo data in the years leading up to 2003. I have seen several claims that the jump seen in this plot is not reflected in several other data sets that should also be affected by a true temperature rise. I don’t have time now to look for references – can anyone find them?

95. Phobos says:

Don’t forget, the uncertainty of an average is less than the uncertainty of any of what it’s averaging.

For example, if you have two temperatures T1 and T2, each with a measurement uncertainty of dT, the uncertainty dA of the average is

dA = dT/sqrt(2)

For N points the denominator is sqrt(N).
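The dA = dT/sqrt(N) claim can be checked numerically. A minimal sketch, with the key assumption stated: the N errors must be independent and unbiased, which is exactly what several commenters here dispute:

```python
import random

random.seed(0)
true_value, sigma, N = 15.0, 1.0, 10000   # per-measurement uncertainty dT = 1.0

# Repeat the whole N-measurement experiment many times and look at
# how much the *averages* scatter from run to run.
averages = []
for _ in range(200):
    run = [true_value + random.gauss(0.0, sigma) for _ in range(N)]
    averages.append(sum(run) / N)

m = sum(averages) / len(averages)
spread = (sum((a - m) ** 2 for a in averages) / len(averages)) ** 0.5
print(f"scatter of the averages: {spread:.4f}")  # near sigma/sqrt(N) = 0.01
```

With correlated or biased errors the sqrt(N) reduction stalls: averaging ten thousand readings from a miscalibrated instrument just reproduces the miscalibration.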

Levitus et al are averaging a very large number of temperature measurements. Their 2012 Supplementary Material shows this in Figure S12: the dTs range up to 1.50 C, but they are averaging thousands of measurements all across the ocean.

Their Supp Material gives a detailed exposition of their error handling techniques. It’s worth reading.

96. Phobos says:

Jim G says: “Sampling error derives not only from a lack of significant numbers of observations but also from a lack of representativeness of those samples collected to the universe being sampled and from methods and or equipment used to collect those samples (siting and callibrations come to mind).”

This is true for *any* measurement of *anything*. All data is model-dependent — all of it.

97. DCA says:

RE: ocean cooling between 2003-2005.

According to the NASA: “other indicators of ocean heat—satellite measurements of the balance of incoming and outgoing energy at the top of the atmosphere, and satellite and buoy data on sea level rise—didn’t show the cooling trend.”

However, UAH shows a cooling from +0.3 to -0.2, and SST also shows a decline for this period.

Why does this not make sense?

98. DCA says:

In addition to my comment, even the graphs above show a decline in the period.

99. Geoff Withnell says:

richard verney says:
February 26, 2013 at 1:42 am

…Accordingly, before any DWLWIR can reach the oceans it has to first find its way through the wind swept spray and spume which lies immediately above the oceans. Given the optical absorption of LWIR in water, for practical purposes if there is even just 6 microns of wind swept spray and spume lying above the oceans at most only about 25% of DWLWIR even gets to reach the top surface of the oceans. If there is more than 6 microns of windswept spray and spume, even less than 25% of DWLWIR could penetrate this barrier. This is an issue which seems to be overlooked by those promoting the AGW meme.

It may well be the case that in force 5 conditions and above, none of the DWLWIR even reaches the very top layer of the oceans because it cannot penetrate the IR block consisting of the wind swept spray and spume that exists immediately above but divorced from the ocean below.

Ah, doesn’t a significant amount of that spray fall back into the ocean? Transporting at least some of that DWLWIR energy it absorbed to the top layer of the ocean?

100. Lars P. says:

richard verney says:
February 26, 2013 at 2:04 am
It would appear that it cannot be conducted downwards since the energy flux is upwards not downwards at the top of the ocean and there is no known mechanism whereby conduction can take place against the direction of energy flux.

Yes, this is a very straightforward point that is mostly avoided in the ocean warming discussion by alarmists. The downwelling infrared can warm only the surface of the ocean, as heat transfer cannot happen against the temperature gradient:
“It is well known that temperatures at the sea surface are typically a few-tenths degrees Celsius cooler than the temperatures some tens of centimeters below [Saunders, 1967; Paulson and Simpson, 1981; Wu, 1985; Fairall et al., 1996; Wick et al., 1996; Donlon et al., 2002].”

If the ocean surface is getting warmer, then the mass below will get warmer, as it will not be able to cool so fast, and might store more heat from the sun; however, to my understanding the DLIR stops at the surface of the ocean.
Therefore, with a known SST, we may evaluate the temperature of the ocean below. No increase in SST means no increase in the total heat content once the heat transfer below is in balance.

101. Willis: Please note the pre-Argo values are probably MEANINGLESS.

This whole exercise is a “faith based” activity that makes the snake-handling, Bible-thumping Pentecostals look RATIONAL!

I think a superficial examination of the “source data” and the derivation of the 1950s-to-2000 part of the curve will illustrate its unreliability.

THEREFORE THE ONLY INFORMATION WE HAVE FROM THIS STUDY IS THE CURRENT INFORMATION WHICH PRETTY MUCH CORRELATES WITH THE FLAT TROPOSPHERIC TEMPERATURES OF THE LAST 17 YEARS.

Equilibrium system, period. STABLE. Negative feedback. Willis’ thunderstorm thermostats.

102. Lars P. says:

gymnosperm says:
February 26, 2013 at 7:52 am
Is anyone else having trouble getting on this site? I have been continuously blocked by ad popups along the top that prevent anything else from loading. This happens both when I approach from my browser and my wordpress reader. I can only get through with multiple refreshes.
I am using Firefox (version 18.0.2 now). I don’t remember ever having any problems or ads popping up when accessing this site.

103. @Phobos 8:28 am
In fact, no one is making any claims about the temperature of the ocean.
The calculation is about the *change* in average temperature of the ocean: dT = dQ/mc , and the uncertainty attaches to dT, not to T.

That claim might wash with Stevenson Screens, which are bolted to the ground, and assumes that nothing is built or destroyed around them and that nothing moves but the air. All of those caveats are usually false.

But ARGO floats move with the currents, in 3D velocity fields in a complicated geometry where the instantaneous divergence is zero but the curl is non-zero. From measurement to measurement, the floats are recording a different piece of the ocean, representing a different proportion of the whole. The only way to get a delta-average-T is to know the average-T at different times.

Your “to dT, not to T” argument holds no water, much less an ocean.

104. jim2 says:

“If a man will begin with certainties, he shall end in doubts; but if he will be content to begin with doubts, he shall end in certainties.” —Francis Bacon (1605) The Advancement of Learning, Book 1, v, 8

105. Juraj V. says:
February 26, 2013 at 3:47 am

‘Google “NASA correcting ocean cooling”. That JPL guy openly admits that he threw away all “too cool” buoy ARGO data, since consensus says it must be warming. I have barely any faith even in ARGO since then, whatever fine system it has been designed to be.’

That too-cool data exists suggests that all the buoys could be miscalibrated (why would only the cool ones be in error?), and that suggests there is an instrument bias that must be incorporated into the measurement errors. If there hasn’t been an effort to determine buoy biases and how they should be handled in measurement estimates, the buoy program is failing its scientific purposes.

106. Phobos says:

Stephen Rasey says: “From measurement to measurement, the float are recording a different piece of the ocean, representing a different proportion to the whole.”

Levitus et al 2012 discuss such concerns explicitly in their Supplementary Material. You should read it. It’s really just applying the statistics of sampling to the ARGO data.

107. Phobos says:

Uzurbrain says: “Where are they getting these accurate thermometers?”

Their temperature measurements are accurate to 1.5 C or less, but nowhere near your 10^-4 C to 10^-5 C. Levitus et al 2012 show this explicitly in their paper, and discuss it. Really, you should at least read and understand a paper before dismissing it.

108. Austin says:

How well calibrated are the Argo floats before deployment? Have any been retrieved and calibrated? How were they calibrated – against one temp or the whole temperature range? And under what pressures?

With 510 million square km of ocean, and assuming a random distribution of floats, for a confidence interval of .01 – within the sensitivity of your graph at .01 C – you would need A LOT more floats. Tens of thousands.
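Austin's float-count question is the standard sample-size calculation. A minimal sketch, with an assumed (hypothetical) per-measurement scatter:

```python
# How many independent measurements pin the mean to +/- E at 95% confidence?
# n >= (z * sigma / E)^2. sigma here is an assumed, illustrative value.
z = 1.96       # z-score for 95% confidence
sigma = 0.5    # hypothetical per-measurement standard deviation, deg C
E = 0.01       # desired half-width of the confidence interval, deg C

n = (z * sigma / E) ** 2
print(f"n >= {n:.0f} independent measurements")  # ~9600 for these inputs
```

Each Argo float profiles repeatedly, so the raw measurement count is not the constraint; the real question is how many of those measurements are effectively independent samples of the ocean, given currents and spatial correlation.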

109. Willis Eschenbach says:

Richard Verney, you say:

I have asked Willis a number of times to explain the process by which energy from DWLWIR absorbed in the top few microns of the oceans can be dissipated downwards to depth before the energy absorbed in those microns heats those microns of water to a temperature driving evaporation. Despite many requests being made of him, he has at no time explained the mechanical process involved.

Richard, that is total and absolute bullshit. I give you the Lie Direct on that.

I even dedicated an entire post to the question of the absorption of IR. In that post I answered your damn questions, over and over and over.

The fact that you didn’t like my answers doesn’t entitle you to make false claims about whether I’ve answered, Richard. That's underhanded and untrue, and I expect better of you.

See my post called Radiating the Oceans for a full discussion of this issue, including my answers to Richard that he’s trying to pretend never happened …

Richard, I expect an apology, or we’re done discussing forever. I won’t have dealings with a man who tells lies in an attempt to damage my reputation.

w.

110. Jim G says:

Phobos says:

February 26, 2013 at 8:58 am

Jim G says: “Sampling error derives not only from a lack of significant numbers of observations but also from a lack of representativeness of those samples collected to the universe being sampled and from methods and or equipment used to collect those samples (siting and callibrations come to mind).”

“This is true for *any* measurement of *anything*. All data is model-dependent — all of it.”

Agreed. The point is that, per the examples I gave, “climate science” is particularly weak on the collection of representative data, and error confidence limits do not take the quality of the data into consideration.

111. Arno Arrak says:

Further on the step warming. There is a poorly documented step warming in 1976 that has been associated with the PDO phase change from cool to warm, another oceanic phenomenon. Can you get similar upper ocean data for this time period?

112. bones says:

MikeR says:
February 26, 2013 at 6:53 am

Can someone give me some background on this issue – I’m totally confused. Graphs aside, surely no one is measuring the heat content of the ocean! Isn’t it correct that they are measuring temperature, with buoys and such? So even if someone decided to convert that into heat content for some reason beyond my understanding, why should we convert it back? Rather, what is the basic data available on temperature, above 700 meters or below or whatever? Do we know what’s been happening in the last decade and before, and how does it depend on depth? I had heard that there is “missing heat”, that some are guessing that it passed down into the very deep ocean beyond reach of our instruments… what are the facts about this?
Thanks.

Curt says:
February 26, 2013 at 8:51 am

Yes, what they are measuring is temperature, whether with the older XBTs or the newer Argo floats. It is sensible, in principle at least, to convert this to energy units using thermal capacitance values, because energy is conserved, and in all calculations, the conservation equations are the key ones to be solved. Average temperature levels don’t really have a physical meaning, but average energy units do.

So what Willis is really doing here is a form of back-calculation, to get a feeling for the sensitivity of the original temperature measurements.
———————————————
Let me add to Curt’s comments. The ocean is heated directly by absorbing solar radiation. The atmosphere is primarily warmed by the oceans rather than the reverse. Since heat flows from warmer to cooler, any warming of the atmosphere would not, on average, exceed that of the oceans, which for the last decade have warmed at a rate of about 0.3 C per century. If one wants to do energy balances, then it makes sense to convert the measured temperatures to heat units, but if you want to know what to expect for surface air temperatures, pay attention to Willis.

If heat has been lost to the ocean depths below 700 m, it will not be coming back to heat the surface. Temperatures at those depths are only a few degrees. There aren’t many places on earth cold enough to be warmed by the deep waters.

113. Uzurbrain says:

Phobos says: February 26, 2013 at 9:54 am

Uzurbrain says: “Where are they getting these accurate thermometers?”

Their temperature measurements are accurate to 1.5 C or less, but nowhere near your 10^-4 C to 10^-5 C.
——————————->
In which case their error band should be about 100 times larger than shown. Just because you can get an electronic instrument to show a reading to 5 decimal places does not mean that is the accuracy. You claim +/- 1.5 C; that is about +/- 3 F. Even with ten thousand measurements averaged, after considering accuracy, calibration errors, measurement errors, and ambient effects upon the measurement, you end up with GARBAGE IN, GARBAGE OUT. Even a moron would agree that you cannot express a change in the water content of a lake in milliliters when you are taking your sample measurements with a barrel or even a liter jug, regardless of how many “samples” you take and average. The math does not support the conclusions.

114. Jim G says:

Willis Eschenbach says:

“Richard, that is total and absolute bullshit”.

I am assuming that this is a technical term used in climate science and differs from just plain bullshit as opposed to absolute bullshit?

115. Phobos says:

Jim G says: “per the examples I gave, “climate science” is particulary weak on collection of representative data and error confidence limits.”

How so? There are ~3000 ARGO buoys in the oceans, with ~750 replaced annually. How many more do you think are needed to get meaningful statistics? Budgets have limitations….

116. oMan says:

Willis: you rock. What a genius way to cut through the clutter of “data” and try to elucidate the “meaning” and, in this case, the fact that the “meaning” is completely lost in the measurement errors. Thanks. These graphs offer no support for policy recommendations, in fact, the reverse. Anyone using them for that should be ignored.

117. To start with, I digitized the data from the graph. Often this is far, far quicker than tracking down the initial dataset, particularly if the graph contains the errors.

There was no need to digitize the data from the graph.
From Anthony’s post of February 25,
http://wattsupwiththat.com/2013/02/25/fact-check-for-andrew-glickson-ocean-heat-has-paused-too/
click on the link directly below the 0-700 m Heat Content Anomaly graph:
http://oceans.pmel.noaa.gov/
Click “Data” in the menu at the top of the page, click on “Global 0-700m Ocean Heat Curve” to see the data http://oceans.pmel.noaa.gov/Data/OHCA_700.txt
It describes the errors as SE (standard error).

118. It should be obvious that the water the ARGO buoys measured in 1993 isn’t the same water they measured in 2012; the water moves on (sinks, evaporates, rises, mixes) and is replenished from God knows where, while the buoys are relegated to a certain stratum of the ocean. So this whole study is of dubious value.

Willis’ calculations and discussion are good and interesting, but it’s an unquantifiable, dynamic/non-static “population” from inception. To NOT find a change over time would be the exceptional case.

119. Doug Proctor says:

All discussion is dependent on the accuracy of the data. If the data cannot be known to this precision, then we have no basis for discussion.

But didn’t the data start out as temperatures, that were THEN converted to zeta-joules? I don’t think you measure, but calculate zeta-joules.

Huge numbers of data points in different places and at different times do not reduce error estimates; only repeated measurements of the same, unchanging parameter reduce measurement errors. Each time you measure a different thing that is itself changing, you get back to your fundamental measurement limitation.

So I doubt that the accuracy is as good as claimed.

120. Phobos says:

Uzurbrain says: “Just because you can get an electronic instrument to show a reading to 5 decimal points does not mean that is the accuracy. You claim +/- 1.5 C that is about +/- 3 F.”

Again, _read the paper_. 1.5 C is their largest uncertainty. Most are less than 0.5 C, and many less than 0.25 C (Supp Mat, Figure S12). Hence the uncertainty of the average can be small.

121. Phobos says:

RockyRoad says: “It should be obvious that the water the ARGO buoys measured in 1993 isn’t the same water they measured in 2012–the water moves on (sinks, evaporates, rises, mixes) and is replenished from God knows where, while the buoys are relegated to a certain strata of the oceans. So this whole study is of dubious value.”

By this argument, you can’t measure the average air temperature in your back yard. Is that really what you think?

122. ferdberple says:
February 26, 2013 at 8:04 am

CO2 lags temperature in the modern records as well, which is strong evidence that CO2 is not a forcing agent in global temperatures. Rather, global temperatures are forcing CO2, and some other mechanism is causing climate change.

At the risk of starting the same discussions again:
CO2 changes lag temperature changes in the modern record, but human-caused CO2 emissions lead the increase of CO2. Not temperature. See:

and at an incredibly stable ratio:

Temperature can’t be the cause: a maximum of 16 ppmv/°C for seawater at equilibrium (Henry’s Law). And vegetation goes the opposite way: more uptake with higher temperatures, which gives about 8 ppmv/°C over multi-decadal to multi-millennial timescales.
Thus a maximum of 8 ppmv since the depth of the LIA. The rest of the 100+ ppmv increase is human-induced.
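The commenter's back-of-envelope balance, using his own figures (not established values):

```python
# The comment's figures, taken at face value for illustration only:
ocean_outgassing = 16    # ppmv per deg C, seawater at equilibrium (Henry's law)
vegetation_uptake = -8   # ppmv per deg C, extra uptake with warmth
warming_since_lia = 1.0  # assumed deg C of warming since the LIA

natural_rise = (ocean_outgassing + vegetation_uptake) * warming_since_lia
print(f"natural contribution: ~{natural_rise:.0f} ppmv")  # vs. 100+ ppmv observed
```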

That doesn’t give a clue of what CO2 does on temperature. My own estimate: around 1°C for a CO2 doubling, mostly benign for nature, including humans…

123. Phobos says:

Doug Proctor says: “Huge numbers of data points in different places and different times do not reduce error estimates; only repeated measurements of the same parameter that is unchanging reduces errors (of measurements). Each time you measure a different thing that is itself changing you get back to your fundamental measurement limitation.”
——————————–
It is simply statistical sampling — much like what a candy company does as its boxes come off the assembly line, in order to measure and control quality. They don’t measure the same box over and over again, but sample a subset of the boxes over time. With sufficiently sized sampling, the difference between the sampled set and the entire population is small — this is, after all, the basis of all statistical reasoning.

124. Patrick B says:

” Lew Skannen says:
February 25, 2013 at 10:53 pm

I am always interested in error bars. I suspect that they are not really wanted and everyone in the modelling world would be happier if they would just disappear but stubbornly refuse to do so. They are cleaned up and brought out for show when they need to be accounted for, but they are expected to behave themselves, not make a scene, not speak unless spoken to and certainly not relax and reveal any more of themselves than strictly necessary.
In reality if they were allowed to be themselves and behave as they would at home I suspect that the error bars would undo their corsets and belt buckles and flop out all over the place.
This would then spoil the image of the neat, tidy, prim and proper graph because it would be indistinguishable from a page of wall to wall error bars running rampant. A bit like a ball room dancer at a Hells Angels long weekend booze up.”

Thank you Lew. Probably half my comments on various climate change sites revolve around questioning the error analysis. I was trained in chemistry and it’s amazing how quickly error analysis builds a large error bar. I believe that if proper error analysis was performed on climate measurements, we would be able to authoritatively state the variations in average temperature between winter and summer for the past two decades and not much else.

125. D.B. Stealey says:

I suspect that Phobos has a job paid at least in part with public funds. Is he given free rein to post his climate alarmism on blogs constantly throughout the work day? As a hard-bitten taxpayer, I would like to have a discussion with his boss — and with his boss’s boss.

126. Some here are suffering from the misconception that averaging data having errors produces more accurate data. The truth of that presumption depends on the nature of the errors (unbiased random noise) and on measurement of a constant (fixed) value. If you use different thermometers with different biases, averaging does not help; nor does averaging data from the same thermometer with a significant bias, or from the same thermometer used to measure different things. Is there a discussion of these issues for Argo?

127. Richard G says:

richard verney says:
___
I was astounded the first time I took my then new 35mm Nikonos camera and light meter diving. Astounded at how rapidly the red dial on the meter faded to black with increasing depth. Astounded at how rapidly the light values dropped and the color values shifted towards blue.
Where has all the red light gone, long time passing?
Where has all the red light gone, long time ago…

128. paullinsay says:

From the wikipedia entry on Ocean Acoustics, http://en.wikipedia.org/wiki/Ocean_acoustic_tomography

“The integrating property of long-range acoustic measurements

Ocean acoustic tomography integrates temperature variations over large distances, that is, the measured travel times result from the accumulated effects of all the temperature variations along the acoustic path, hence measurements by the technique are inherently averaging. This is an important, unique property, since the ubiquitous small-scale turbulent and internal-wave features of the ocean usually dominate the signals in measurements at single points. For example, measurements by thermometers (i.e., moored thermistors or Argo drifting floats) have to contend with this 1-2 °C noise, so that large numbers of instruments are required to obtain an accurate measure of average temperature. For measuring the average temperature of ocean basins, therefore, the acoustic measurement is quite cost effective. Tomographic measurements also average variability over depth as well, since the ray paths cycle throughout the water column.”

Note the mention of 1-2 C “noise”, actually fluctuations in temperature. This places an error limit on any measurement that is in addition to instrumental error.

129. Phobos says:

@DB Stealey: Making things up to discredit an argument isn’t cool. Please stick to the science.

130. Chris R. says:

To MikeR:

You wrote:

Can someone give me some background on this issue – I’m totally confused. Graphs aside, surely no one is measuring the heat content of the ocean! Isn’t it correct that they are measuring temperature, with buoys and such? So even if someone decided to convert that into heat content for some reason beyond my understanding, why should we convert it back? Rather, what is the basic data available on temperature, above 700 meters or below or whatever? Do we know what’s been happening in the last decade and before, and how does it depend on depth? I had heard that there is “missing heat”, that some are guessing that it passed down into the very deep ocean beyond reach of our instruments… what are the facts about this?

You are correct that the ARGO project buoys measure ocean temperature. Our knowledge of ocean temperatures before the project came on-line in 2003 is decidedly fragmentary. Willis stated in his original posting that the reason he was converting heat content into average temperature rise was a personal preference, as he found the heat energy change difficult to visualize. As to the “missing heat” problem, the so-called deep ocean is just about the only place left where this “missing heat” can be hiding. This was the reason for Anthony making his original post, commenting on a blog post at a rival Web site. Anthony adduced as evidence a graph from the NOAA Pacific Marine Environment Laboratory (PMEL), which had listed the total heat energy change in the 0-700 meter layer.

131. Phobos says:

Philip Lee says: "Is there a discussion of these issues for Argo?"

Yet again, please read the paper (Levitus et al 2012). Their Supplementary Material has an Appendix titled “Error Estimates of Objectively Analyzed Oceanographic Data”; equation 4 there addresses your point explicitly.

132. Uzurbrain says:

@ Phobos –
Let’s make this simple. You are trying to cut a board that must be exactly 8′ 7-13/64″ long for a shelf in your closet. The only, and I mean ONLY, thing you have to make any measurements with is an old hardware-store yard stick. The smallest readable graduations are 1/4 inch. You have no string, rope, building square, framing triangle, etc., just the yard stick. I don’t care how many times you measure the board and how many times you average each measurement or use RMS, you will never get the correct length.

1. The fact that you have 3000 buoys does not mean that you have 3000 samples of the same temperature that can be averaged using the RMS accuracy rules (square root of the sum of the squares – in the industry we usually said RMS). All measurements MUST be of the exact same entity at the exact same time under the exact same conditions.
2. The fact that you add up 3000 surface-level temperatures (e.g., numbers between 30 and 90 degrees F), divide by 3000, and get a number out to 3 or more decimal points does not mean that the temperature is known to anything better than +/- 1.5 degrees C (or 0.25%; in fact you must use the WORST accuracy to be accurate). PERIOD. The RMS accuracy rule for averaging samples does not apply. IT DOES NOT WORK THAT WAY.
Also, the accuracy of essentially every instrument I have ever worked with, including precision laboratory-standard instruments, when expressed in terms of %, is a percent of the MAXIMUM reading for the range selected. (It is a common misconception that “0.01%” means percent of the reading – this is not normally the case, even for instruments selling for more than $10,000. Read the fine print on the accuracy specifications.) You might want to Google “NBS traceable calibration facility”, find one in your area, and talk to them.
3. I suggest you contact someone with a degree in applied mathematics. Note I said APPLIED mathematics. Generally, theoretical mathematicians know the theory but haven’t the foggiest idea how to apply it.

P.S. Back in the early ’70s I was involved in the process of getting several US Government agencies to allow the use of the square root of the sum of the squares for determining the accuracy of measurements and instrument channels. Prior to that time we were forced to use the direct sum of the “inaccuracy.” This meant that an instrument train with 5 devices, each with +/- .25% accuracy, was treated as +/- 1.25% accurate. I have an advanced degree in Applied Mathematics and was a member of the Instrument Society of America back then.
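The two combination rules described in the P.S. are easy to compare; a minimal sketch using the figures quoted there (five devices at +/-0.25% each):

```python
# Direct sum vs. root-sum-of-squares (RSS) combination of component
# inaccuracies, for an instrument train of five devices at +/-0.25% each.
errors = [0.25] * 5  # per-device accuracy, in percent

direct_sum = sum(errors)                  # worst-case stacking
rss = sum(e ** 2 for e in errors) ** 0.5  # statistical (RSS) combination

print(f"direct sum: +/-{direct_sum:.3f}%")  # +/-1.250%
print(f"RSS:        +/-{rss:.3f}%")         # +/-0.559%
```

The RSS figure is less than half the worst-case stack, which is why the change of rules the commenter describes mattered in practice.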

133. mkelly says:

Phobos says:

February 26, 2013 at 10:55 am

It is simply statistical sampling — much like what a candy company does as its boxes come off the assembly line, in order to measure and control quality. They don’t measure the same box over and over again, but sample a subset of the boxes over time. With sufficiently sized sampling, the difference between the sampled set and the entire population is small — this is, after all, the basis of all statistical reasoning.

Not quite correct. With what you describe above they know in advance what the outcome should be. The size of the box, weight of candy bar etc is known and you are finding a difference from the known. You have no idea what a specific block of water is suppose to have for a temperature.


134. Willis Eschenbach says:

Let me give an example of why I think the ocean error claims are unsupportable.

The BEST dataset folks have given us their error estimates. They are using on the order of 30,000 stations, which take temperatures either several or many times daily, to measure a two-dimensional field, the average temperature of the air at about two metres off of the surface. They say the 95% error (two sigma) in their estimate is on the order of four hundredths of a degree (0.04°C)

The paper shown above, on the other hand, is using on the order of 3,000 Argo floats, which take temperatures once every ten days, to measure a much larger three-dimensional field, the average temperature of the top seven hundred metres of the ocean … and despite that, they claim a two sigma error of half of what the BEST folks get, two hundredths of a degree (0.02°C). A tenth of the data, a three-dimensional field, 2.3 times the area of the land data … and half the error?

I’m sorry, but I simply don’t see how that is even theoretically possible … yes, the ocean has less variation, but we’re talking orders of magnitude here. I’m more than happy to listen to any explanation of how that small an error is possible, I just haven’t heard one yet.

Here’s an example. In 2005, the listed error above (two sigma) is .022 degrees. In that year, there were 48,132 samples of the ocean temperature taken by Argo floats. The average surface temperature of all of the floats (raw data) was 19.25°C, the standard deviation was 8.73°C, and thus the standard error of the mean (two sigma) was 0.08°C … and that’s just for the top layer.
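For anyone wanting to reproduce that back-of-envelope figure, here is the arithmetic as a short sketch, using only the numbers quoted in the comment above:

```python
# Two-sigma standard error of the mean from the 2005 figures quoted above:
# N = 48,132 Argo surface samples with a standard deviation of 8.73 deg C.
from math import sqrt

n_samples = 48_132
sd = 8.73  # deg C, standard deviation of the raw surface readings

sem_2sigma = 2 * sd / sqrt(n_samples)
print(f"two-sigma SEM = {sem_2sigma:.3f} deg C")  # about 0.080 deg C
```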

That is also a raw error, things will get worse once the irregular distribution of the data is included. The error is particularly exaggerated in any area where the currents mean that the free-floating buoys provide poor coverage. Unfortunately, these include areas of downwelling, spreading currents which push the Argo floats out of the very areas of larger temperature variation that we want to measure.

Finally, the global coverage of the Argo floats is poor. Only 1.5% of the samples were taken north of 60°N, and the same south of 60°S. Here’s the distribution of the floats compared to the distribution of the ocean by latitude …

The northern hemisphere, particularly from 30-60 north, is way over-represented in the data, and the southern hemisphere is correspondingly under-represented south of 40°S. This can only serve to increase the error of the average from the size of raw error I calculated above.

I don’t see how they get the two sigma error down to a couple hundredths of a degree … any explanations welcome.

w.

135. Theo Goodwin says:

Uzurbrain says:
February 26, 2013 at 12:01 pm

“1. The fact that you have 3000 buoys does not mean that you have 3000 samples of the same temperature that can be averaged using the RMS accuracy rules (square root of the sum of the squares – in the industry we usually said RMS). All measurements MUST be of the exact same entity at the exact same time under the exact same conditions.”

Wonderful comment. You have an excellent understanding of science. I have learned over the years that it is impossible to get Alarmists to understand that a scientist would not take temperature measurements from the surface of the ocean and from a depth of 200 meters and assume that they are comparable. The important word here is ‘assume’, as you know. The science must reference all the relevant conditions associated with each measurement.

Invariably, Alarmists treat any two temperature measurements as comparable. Some have posted comments in which they argue in detail that all such differences in the conditions of measurements disappear in the statistical wash. Well, I guess they do but so does the science.

136. D.B. Stealey says:

Phobos says:

“@DB Stealey: Making things up to discredit an argument isn’t cool. Please stick to the science.”

I never claimed to be cool. And I will stick to anything I think is important. Once again: is any part of your income derived from public funds?

As a taxpayer I get very tired of the constant stream of alarmist propagandists showing up here and blogging on my taxpayer dollars. If your answer to my question above is No, I’ll accept that. Otherwise, do what the public is paying you to do — and it isn’t posting endless comments throughout the work day.

Anthony already asked you who you work for, and you evaded him. From the content of your posts, you have an agenda. Your comments look like they were cut ‘n’ pasted from SkS, the internet’s premier pseudoscience blog. But unlike SkS, you will get plenty of pushback here at the internet’s “Best Science” site.

137. Theo Goodwin says:

mkelly says:
February 26, 2013 at 12:02 pm

Bazinga! In addition, the entire environment has been designed and refined over the years for the sole purpose of achieving uniformity in each and every piece of candy. The candy manufacturers struggle to define their event space. Alarmists either do not have a clue what defining an event space means or they are being deceptive.

138. Philip Lloyd says:

Steven Mosher says: February 25, 2013 at 11:12 pm
“Nick,
Ya, you’d be amazed if you look at a ships track over time in ICOADS by how little it changes and by how little it changes during the course of a day.. relative to the air that is.”
Steven, you would be amazed at how the sea temperature changes minute to minute if you are in a small boat. Then you reflect that there is as much heat in the upper 1m of the ocean as there is in all the atmosphere above it. Then you look at a thundering great vessel in ICOADS and you say “How the ^\$#@ can anyone think that something that size can give a good idea of the temperature of a layer of water 1m deep?”
It sometimes helps to go and observe. You learn a lot that way. You learn to respect Nature and not to try to second guess Her.

139. Theo Goodwin says:

Phobos says:
February 26, 2013 at 10:55 am

“It is simply statistical sampling — much like what a candy company does as its boxes come off the assembly line, in order to measure and control quality.”

In good faith and total honesty, I cannot believe that you wrote that sentence. Sir, you are assuming that the ocean is no less uniform than the Hershey Bars coming off the product line. Who or what guarantees that uniformity? The God of Alarmism?

You, Sir, have won the prize for most transparent myth maker among Alarmists. Mosher comes in second. Alarmists have no concept of empirical research and the work that must be done to define and respect the integrity of the natural phenomenon that you desire to measure. On empirical matters, Alarmists are as dumb as rocks.

140. Phobos says:

Willis Eschenbach wrote: “Only 1.5% of the samples were taken north of 60°N.”

The Arctic Ocean’s volume is only 1.1% of total ocean volume.

141. Phobos says:

Uzurbrain says: “All measurements MUST be of the exact same entity at the exact same time under the exact same conditions.”

Of course not — if that were true, you could never measure something’s variation with time. You couldn’t determine the average annual temperature in your backyard. You couldn’t measure the speed of an automobile.

You are confusing what is theoretically desirable with what is practical. All measurements — ALL of them — are a compromise between what’s ideal and what’s doable.

142. Mario Lento says:

The sun warms the lower mid-latitudes of the equatorial regions of the Pacific Ocean. The relatively constant westward winds blow the warmed ocean surface to the west, where it churns downwards against the coastlines there (the Australia area) and is stored at depth, to hundreds of metres.

During La Nina, more warmed surface water is blown farther west and driven down once it hits the coast. This process upwells more deep, cooler water in the eastern Pacific (the Americas) to be warmed under less cloudy skies. (The cooler water is drier and creates fewer clouds.) The western Pacific is slightly higher because water is being piled up there.

When La Nina conditions subside (the westward winds reverse or slow), gravity then plunges the western Pacific down, and the warm water flows in a reverse wave to the east to warm the surface of the equatorial regions, resulting in an El Nino. This then releases the deep warm water that was stored there.

There is no evidence that CO2 has any effect on this process.

143. Phobos says:

And the volume of the Southern Ocean (south of 60 deg S) is 5.4% of total ocean volume.

Phobos said on February 26, 2013 at 12:33 pm:

Willis Eschenbach wrote: “Only 1.5% of the samples were taken north of 60°N.”

The Arctic Ocean’s volume is only 1.1% of total ocean volume.

Thus since it is only capable of holding a relatively very small amount of heat, and since it is far colder than other waters such as equatorial waters and thus it takes less heat per kelvin rise than it would around the equator or the middle latitudes, the Arctic is really not representative of the world oceans, and should be disregarded during conversations about “global warming” and “warming oceans” as it’s irrelevant to the overall discussions of ocean heat content.

Good point, Phobos.

145. Phobos says:

@DB Stealey: My only agenda is good science. My income is my business, but nothing like you think it is.

Again, focus on the science.

146. Lars P. says:

Willis Eschenbach says:
February 26, 2013 at 10:05 am
Richard, that is total and absolute bullshit. I give you the Lie Direct on that.
I even dedicated an entire post to the question of the absorption of IR. In that post I answered your damn questions, over and over and over.

The fact that you didn’t like my answers doesn’t entitle you to make false claims about whether I’ve answered, Richard; that’s underhanded and untrue, and I expect better of you.

See my post called Radiating the Oceans for a full discussion of this issue, including my answers to Richard that he’s trying to pretend never happened …

Richard, I expect an apology, or we’re done discussing forever. I won’t have dealings with a man who tells lies in an attempt to damage my reputation.

w.
I mix now into a discussion that is not mine, and I’ll be sorry …
Willis, your 4 arguments there don’t stand up for me.
The sea surface is the only place where the ocean loses heat; this is why the inverted gradient forms, with a cooler surface.
If the gradient were reversed, the ocean would be a net heat sink; this lasts until thermal balance is achieved, and then the surface cools again, as the ocean gains warmth not only at the surface but in deeper layers too (as deep as solar radiation can penetrate).
DLR cannot directly warm the oceans because of the ocean’s cool skin. The inverted temperature gradient at the ocean surface ensures that heat does not pass from the cool skin down to the strata below.
Even with mixing of the surface water, when the water above is cooler than the water below, it will not add warmth to the water below but will cool it.
The only way DLR can warm the ocean is by increasing the sea surface temperature.
If the sea surface temperature increases, then the temperature gradient allows for warmer water below, but this water is warmed by the solar radiation which penetrates deeper.
As long as the cool-skin gradient does not invert itself, there is no heat transfer from the surface to the lower levels.

Argument 1. People claim that because the DLR is absorbed in the first mm of water, it can’t heat the mass of the ocean. But the same is true of the land. DLR is absorbed in the first mm of rock or soil.
When you put your hand on land, it is warmer above and cooler below. This is why heat is transferred from above to below.
The oceans have a cool skin on top, so the comparison is wrong.

Argument 2. If the DLR isn’t heating the water, where is it going?
DLR is part of the energy exchange at the surface.
You can calculate the net heat flow at the surface, and DLR is part of that net heat flow.
If it were heating the surface of the ocean, then the surface would get warmer than the water below, and then you could talk about warming and heat transfer downward through mixing or diffusion. As long as the surface is cooler than the water below it, it does not warm the water below it.

Argument 3. The claim is often made that warming the top millimetre can’t affect the heat of the bulk ocean.
This is a different point. As explained, this top millimetre is not warmer. One can mix it as long as one wants; it will not add energy to the warmer water below.
The only way the ocean can get warmer is if the surface gets warmer.

Argument 4. Without the heating from the DLR, there’s not enough heating to explain the current liquid state of the ocean.
Yes, as said, this sets the temperature of the sea surface. In consequence the ocean below warms from solar radiation until it comes into balance and loses the same heat at that very surface.
The ocean can lose heat only at the surface, and not everywhere on it. On the “dark side” it freezes and loses no heat there, whereas where the 1000 W/m2 arrives it thaws and takes in all the heat it can. It then radiates only at the surface where it is not frozen, in balance with the atmosphere above.
The ocean would in any case be partly unfrozen, not completely frozen, as 170 W/m2 is a flat-world myth and 1000 W/m2 would melt any ice; what we get from DLR is a larger unfrozen surface, due to the slowdown of heat loss.
I think this is the way it makes sense.

147. @ Phobos 12:33 pm
See: http://ngdc.noaa.gov/mgg/global/etopo1_ocean_volumes.html
This says the Arctic is 4.3% of ocean area, 1.4% of Ocean Volume.

Since we seem to be concentrating on the upper 0-700 m of the ocean column, it seems reasonable that a better estimate would be that the Arctic is about 3.5% of the volume of interest.

148. Error bars should represent sampling uncertainty. Or stated another way, the probability the correct value is within a certain range.

Error bars are largely determined by sample size, and to a lesser extent the distribution of the data. Population (ocean) size over a certain amount has a very small effect.

But with the crucial caveat that sampling is random. As Argo floats drift, they do not sample randomly (I don’t know whether or not their initial deployment is random). Error bars with non-random sampling are meaningless, unless you know what effect your non-random sampling has, and I see no indication the Argo people do.

On one university course, I was required to read Use and Abuse of Statistics (a very well written book as I recall). Climate Science merits a whole new edition of its own.
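The effect of non-random sampling is easy to demonstrate with a toy model. The sketch below uses an invented zonal temperature profile (nothing here is real Argo data); it compares a latitude-random sample with one clustered at 30-60°N, the band the thread says is over-represented:

```python
# Toy demonstration: with a spatial gradient, a clustered (non-random)
# sample gives a biased mean, and no standard-error formula reports the bias.
import math
import random
import statistics

random.seed(1)

def temperature(lat):
    # Invented zonal-mean profile: warmest at the equator, plus 1 deg C noise.
    return 30.0 * math.cos(math.radians(lat)) + random.gauss(0, 1)

random_lats = [random.uniform(-90, 90) for _ in range(3000)]  # random sampling
drift_lats = [random.uniform(30, 60) for _ in range(3000)]    # clustered sampling

random_mean = statistics.fmean(temperature(lat) for lat in random_lats)
drift_mean = statistics.fmean(temperature(lat) for lat in drift_lats)

print(f"random sample mean:    {random_mean:.2f}")  # expected near 30*2/pi = 19.1
print(f"clustered sample mean: {drift_mean:.2f}")   # biased warm, near 21
```

The clustered sample is biased by nearly 2°C even though both samples are the same size; the usual 1/sqrt(n) error bar says nothing about that bias.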

149. Theo Goodwin says:

Phobos says:
February 26, 2013 at 12:52 pm
Uzurbrain says: “All measurements MUST be of the exact same entity at the exact same time under the exact same conditions.”

“Of course not — if that were true, you could never measure something’s variation with time. You couldn’t determine the average annual temperature in your backyard. You couldn’t measure the speed of an automobile.”

You define “exact same time” in terms of the physical processes measured. All the Hershey Bars are measured at the exact same time – the instant when they fall from the production line into the packaging machine. You cannot be this dumb.

150. D.B. Stealey says:

A few thoughts regarding ocean data and observations:

The ARGO buoy network was giving data that contradicted the AGW narrative, so it was “adjusted”. Prior to adjusting it, this is what ARGO showed.

Envisat provided satellite sea level measurements. But they didn’t show what the climate lobby wanted, so like ARGO, the data was ‘adjusted‘.

Sea surface temperature [SST] varies widely depending on where the measurements are taken.

…scientists from the Norwegian Polar Institute reported that they’d measured sea temperatures beneath an East Antarctic ice shelf and found no signs of warming whatsoever… [source]

If the rise in CO2 caused global warming, the OHC would be rising at an accelerating rate. It isn’t. The trend is the same, whether CO2 is low or high. Moreover, in recent years OHC has flattened, in parallel with the stalling of global warming. It is becoming very difficult to square these observations with the AGW narrative. When the models disagree with observation, the models are wrong.

151. Willis Eschenbach says:

February 26, 2013 at 1:13 pm

Phobos said on February 26, 2013 at 12:33 pm:

Willis Eschenbach wrote: “Only 1.5% of the samples were taken north of 60°N.”

The Arctic Ocean’s volume is only 1.1% of total ocean volume.

Thus since it is only capable of holding a relatively very small amount of heat, and since it is far colder than other waters such as equatorial waters and thus it takes less heat per kelvin rise than it would around the equator or the middle latitudes, the Arctic is really not representative of the world oceans, and should be disregarded during conversations about “global warming” and “warming oceans” as it’s irrelevant to the overall discussions of ocean heat content.

Good point, Phobos.

Actually, you both seem to have missed my point, my intention must not have been clear. What I tried to say was that if you ignore any part of the ocean, it only increases the error in the average. It never decreases it.

Additionally, you both are not mentioning the huge undersampling of the south compared to the north.

Next, arctic water is somewhere around zero degrees C and the global average is 19°C, so in absolute (kelvin) terms the Arctic water contains about 93% of the heat of the oceanic average.

Next, arctic water is shallow, but we’re only looking at the top 700m in any case.

Next, my claim about the entire ocean north of 60°N has somehow been morphed into a claim about the Arctic Ocean … say what? Five percent of the ocean area is north of 60°N, not 1.5%.

Finally, what we’re looking at is not the amount of heat in the arctic water. It is the change in that heat which is of interest. Two things about that. First, it takes about the same amount of energy (trivial differences aside) to raise water from 1°C to 2°C as it does to raise it from 24°C to 25°C. Second, variations in ice cover will mean that the arctic water temperatures vary accordingly. Additionally, warm pulses of water from El Ninos toward the poles change the temperature of the northern Pacific and the Arctic Ocean, as well as the southern Pacific and the Southern Ocean.

My point remains: undersampling and ignoring part of the ocean can only increase the error, they do not decrease it.

w.
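The “about 93%” figure in the comment above is just the ratio of absolute temperatures; a quick check:

```python
# Ratio of absolute (kelvin) temperatures: arctic water near 0 deg C
# versus the quoted global ocean average of 19 deg C.
arctic_k = 273.15 + 0.0
global_avg_k = 273.15 + 19.0

print(f"{arctic_k / global_avg_k:.1%}")  # 93.5%
```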

152. Mindert Eiting says:

Willis at 12:09 pm. I agree that the claim of one standard error equaling 0.01 for the global mean is curious but check whether this is the standard error of the global mean or something else like the annual mean per float (or per stratum). Each float does a measurement every ten days. So there is already a small error in the annual mean per float. For the remainder we assume that we have in a certain year a random sample of size 3000 from an imaginary population of float means. Square root 3000 equals 54.77. Global standard deviation equals 8.73. Then the standard error of the global mean equals 8.73/54.77 or 0.16.

The statistical theorem holds for random samples, whether the distribution is normal or not. Note that the standard error is the standard deviation of sample means over repeated random sampling. In the course of years float samples are dependent. So the standard error may be re-interpreted as the standard deviation of means in dependent or non-random samples. To give an extreme example: take one year of data and make one hundred copies of your data. Agree that the mean over copies has standard deviation zero? This would never be called a standard error. So the reported value of 0.01 may apply at dependent samples, underestimating the real standard error.
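Mindert’s arithmetic checks out; here is a minimal sketch of both the standard-error figure and the “copies” thought-experiment (the sample data are simulated, using the mean and standard deviation quoted earlier in the thread):

```python
# 1) Standard error of the global mean: sd 8.73 deg C, random sample of 3000.
# 2) The "copies" example: duplicating one sample gives identical means,
#    so the spread over copies is zero, which is not a standard error at all.
import random
import statistics
from math import sqrt

se = 8.73 / sqrt(3000)
print(f"sqrt(3000) = {sqrt(3000):.2f}, standard error = {se:.2f}")  # 54.77, 0.16

random.seed(0)
data = [random.gauss(19.25, 8.73) for _ in range(3000)]
copy_means = [statistics.fmean(data) for _ in range(100)]  # 100 identical copies
print(f"std dev over copies = {statistics.pstdev(copy_means):.6f}")  # 0.000000
```

The zero spread over dependent “samples” is exactly why a reported 0.01 could understate the real standard error.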

153. Merovign says:

“So what you’re saying, Percy, is that something you have never seen is only slightly less blue than something else… that you have never seen.”

That’s kind of the vibe I get when something changes so far within the margin of error that its detection may well start the process of organizing a Canonization committee.

154. gymnosperm says:
February 26, 2013 at 7:52 am
I think everyone has the Tisdale function backwards. ENSO transfers energy from the ocean to the atmosphere so ocean enthalpy should have decreased after the 1997 El Nino.

Well, not according to my understanding. Sure, there is a transfer of heat from the ocean to the atmosphere, but there is also a redistribution of warm water. Also, La Nina is generally a “charging” phase where net energy is being absorbed by the oceans and piled up in the Western Pacific Warm Pool. When the trades relax, that water spreads back east along the equator and loses heat to atmosphere. When the trades return, that warm water is redistributed.

We have an interesting condition right now in that we don’t really have much energy in the warm pool. The trades have been nominal. If they were to slacken now, we wouldn’t see much of an El Nino. Look at the coast of Southeast Asia right now. There is no massive buildup of warm water there.

155. Willis Eschenbach says:

Mindert Eiting says:
February 26, 2013 at 2:49 pm

Willis at 12:09 pm. I agree that the claim of one standard error equaling 0.01 for the global mean is curious but check whether this is the standard error of the global mean or something else like the annual mean per float (or per stratum).

Good question, and the answer’s right there in the chart, it’s the number that’s always quoted—it’s the claimed error in the global mean temperature including everything—instrument accuracy, instrument precision, sampling error, the whole deal.

w.

156. bw says:

Going back to the original abstract in Nature, the graph is based on pooling global data in an attempt to find a warming signal. Pooling global temperature data is entirely bogus.
No point in asking what the error bars represent; they are fantasy.
There is also a supplement trying to explain the error methodology. The authors of the original analysis are pooling, homogenizing, excluding data in an attempt to find a “global” value.
Sounds like most of the so-called global warming conjecture.

157. Willis Eschenbach says:

A further point:

If I ran the Argo zoo, I’d tell the guys to pull maybe thirty floats at random. I’d put them in a rigorously temperature controlled pressure chamber. I’d simulate a series of dives to their resting level (typically 1000 metres) and take the pressure and temperature and other recordings as usual.

That would at least give us an idea of how well they are doing in the real world.

w.

158. If I ran the Argo zoo, I’d moor 30 floats, or make them powered so that they could be kept at the same location, in order to quantify the bias resulting from free-floating drift, especially as the TAO buoys have solved the deep-ocean mooring problem.

From Willis Eschenbach on February 26, 2013 at 2:47 pm:

Actually, you both seem to have missed my point, my intention must not have been clear. What I tried to say was that if you ignore any part of the ocean, it only increases the error in the average. It never decreases it.

Actually on the glance-through (wasn’t reading Phobos’ scribblings too carefully), it looked like he was dismissing your concern about above 60°N being only 1.5% of samples, by mentioning the Arctic Ocean itself is only 1.1% of the total volume, as if that was supposed to make sense.

My toss-off comment finished with “following the logic” to the Arctic Ocean not being important in discussions about climate, due to the small size Phobos cited… And who’d believe that?

As it is, there are certain small bits of water that, due to their location, have a far greater effect on climate than their “percent of volume” indicates. Which makes dismissing a region due to the low percentage an even worse error.

160. TimTheToolMan says:

Willis writes “I’d put them in a rigorously temperature controlled pressure chamber. I’d simulate a series of dives to their resting level (typically 1000 metres) and take the pressure and temperature and other recordings as usual. ”

I hope they did that with the floats that were thought to be giving cold-biased values (as well as any pulled floats giving the expected warm bias).

I’ve not seen anyone reporting the results of testing of those floats however. If it turned out they read accurately in those tests then the readings ought to stand until they came up with another reason…

A week or so ago here in Hobart we had a stinking hot 35C day followed by a 15C day and then back up to about a 25C day after that. Those extreme days may seem “wrong” to an algorithm looking for erroneous readings but in fact they were all perfectly accurate.

161. Eyes Wide Open says:

Two key points:

1) As someone mentioned earlier, any depth below 100m is a heat sink, so how much of the rise in OHC is simply filling this sink?
2) It would be interesting to get a profile of the heat increase over the last 10 years at 0m, 10m, 50m, 100m and then every 100m below that. My guess would be that the upper layers have lost heat over the last 10 years offset by gains in the lower layers (due to the heat sink thing). That would explain the cooling global climate over that period!

162. Uzurbrain says:

Phobos says: February 26, 2013 at 12:52 pm

You are confusing what is theoretically desirable with what is practical. All measurements — ALL of them — are a compromise between what’s ideal and what’s doable.
——————
Thank you for the admission of the stupidity of trying to measure the temperature of the Ocean AND that is the heart of the problem. You are telling us that we have to live with what is doable. Measuring the temperature of the ocean is not doable. The number of unknown variables exceeds the capabilities of the combined power of all of the existing supercomputers.

I assume you know that the ocean has a surface area of 63,780,000 sq miles (165,200,000 km²), and many known and unknown currents, and many known and unknown thermocline depths, and areas where for some unknown reason the temperature is unexplainably warmer or colder, and areas like the Sargasso sea. Even if equidistantly spread over the entire ocean you cannot do this with 3,000, 30,000 or 300,000 probes in the ocean because you would be combining different incompatible measurements creating a meaningless collection of garbage. If you were to use 3,000,000 probes, each would be covering 22 square miles, and if they had static probes at each desired depth (at least 10 intervals over the 700 meters) you might start to get some relevant, but not perfect data. Till then all you are doing is wasting money and providing a forecast of Armageddon and apocalypse based upon instruments/reports that have an error band wider than the measurement they are taking. But it looks good because it is displayed to 3 decimal points!
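The coverage arithmetic in that comment is easy to check; a minimal sketch using the comment’s own surface-area figure (the probe counts are the commenter’s hypotheticals):

```python
# Area of ocean surface per probe, for the comment's hypothetical probe counts.
OCEAN_AREA_SQMI = 63_780_000  # ocean surface area in square miles, as quoted above

for n_probes in (3_000, 30_000, 300_000, 3_000_000):
    area_per_probe = OCEAN_AREA_SQMI / n_probes
    print(f"{n_probes:>9,} probes -> {area_per_probe:>9,.0f} sq mi each")
```

With 3,000,000 probes this comes to about 21 square miles apiece, matching the comment’s rough “22 square miles”; with the actual ~3,000 floats it is over 21,000 square miles per float.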

Like I said earlier, you cannot measure the quantity of water in a lake with a barrel to the nearest milliliter regardless of how many times you measure the amount of water. That is what you are advocating. It is doable to measure the quantity with a barrel; therefore, we will measure the quantity in 30,000 places, average them out and report the findings with a precision of 0.01 milliliters. Then for good measure claim that with the standard deviation of the normal distribution allowing for standard error of one sigma, divided by chi square over pi, and so forth, and so on, and etcetera, and more bu.. sh..

Get a good book on Applied Statistics. Read it.

163. dp says:

These dramatic rates of change in the sea temperature represent huge rates of change in the influencing energy source. If 10 petajoules is required to sustain any particular rate of change, what rate of change is necessary to cause the flat lining? Is there a negative energy source out there? Without it that energy source has to work like a toggle switch *and* the rate at which the ocean loses energy has to be very non-linear. There is no flywheel effect in that graph and I find that troubling.

164. Phobos says:

“Don’t forget, the uncertainty of an average is less than the uncertainty of any of what it’s averaging.”

Not true; oftentimes it can be higher. How much statistics do you really know? To say something so outright false just makes me think you are misinformed on statistics in general.

Here is what you SHOULD have said if you had taken stat 101:

“The uncertainty of an average is less than the uncertainty of its inputs in certain cases. In the case of ARGO, I claim that the measurements are all equal, and therefore the uncertainty of the average is equal to this:

For example, if you have two temperatures T1 and T2, each with a measurement uncertainty of dT, the uncertainty dA of the average is

dA = dT/sqrt(2)

For N points the denominator is sqrt(N).
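The dA = dT/sqrt(N) rule quoted here is the standard result for the mean of independent, equally uncertain measurements, and it can be verified numerically; a minimal sketch, assuming independent Gaussian errors (the 0.5 °C per-reading uncertainty is an arbitrary illustration):

```python
import math
import random

random.seed(42)

def average_uncertainty(dT: float, n: int, trials: int = 20_000) -> float:
    """Empirically estimate the standard deviation of the mean of n readings,
    each carrying an independent Gaussian error of standard deviation dT."""
    means = []
    for _ in range(trials):
        readings = [random.gauss(0.0, dT) for _ in range(n)]
        means.append(sum(readings) / n)
    mu = sum(means) / trials
    var = sum((m - mu) ** 2 for m in means) / (trials - 1)
    return math.sqrt(var)

dT = 0.5  # per-reading uncertainty in degrees C (illustrative)
for n in (2, 100):
    empirical = average_uncertainty(dT, n)
    predicted = dT / math.sqrt(n)  # the dA = dT/sqrt(N) rule quoted above
    print(f"n={n:>3}: empirical {empirical:.4f}, predicted {predicted:.4f}")
```

Note the assumption doing all the work: the errors must be independent and free of any shared bias. A calibration drift common to many floats would not shrink with N at all, which is the crux of the disagreement running through this thread.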

Then of course if you were being honest, you would add the following excerpt:

“Levitus et al 2012 claims this is the only error, and I read and lazily agreed without giving it any further thought. So my opinion on this matter is rather worthless. So just go read them if you disagree with me, because my brain has shut down and I cannot think.”

Why that last paragraph? Well, you are being lazy. You cite a paper and state that the error is what they say it is. Did you have any thoughts, concerns, or even some opinion on their evaluation of the data? Come on man, up your game. Give us something. Don’t just go around thumbing your nose at everyone else while you are being lazy intellectually.

OK, enough with “that”… my thoughts on ARGO are these:

As we add 700 buoys at (certain) locations every year, we are adding instruments which have not been active prior to this year. *duh. Therefore, what is the error of the instruments over time, being subjected to various weather phenomena including in some cases hurricanes? (The ARGO floats can dive to avoid some of this, but then again that raises the question of whether the diving will affect the instruments.)…

Another issue:

“Float reliability has improved each year and the float lifetime has been extended.” So in essence every year you add more accurate instrumentation. This is probably a good thing, but it brings out errors such as what Willis stated. Namely, you have instruments which, as I mentioned earlier, are around longer and longer… which becomes a larger and larger problem. In essence, adding more accurate and reliable instruments might actually increase your error for that reason. You would never know, though, if you did not test the instruments over time and find out whether you had a systematic drift in the readings. (I think someone mentioned this possible error earlier….)

Another systematic error not caught is the time of measurements and whether this has a systematic effect. My guess is probably not, but more readings would not hurt either, so to get rid of this error, have the things record more often. Perhaps they are unable to do so, in which case any new buoys added should measure more often.

I don’t know; there was so much covered by Willis that I tend to wonder if the grids are a problem now too. I would have to think on that, but I tend to think that grids are going to be hairy to make, since the positioning of these buoys is not exact.

165. Manfred says:

Mike Jonas says:
February 25, 2013 at 10:37 pm
re the “the 0.1°C temperature rise 1998-2003″. Could it be from the 1997-8 El Nino? If the upwelling warm water comes from below 700m depth, or if the temperature measure doesn’t treat all 700m equally, it could be the El Nino. I think this would be in line with Bob Tisdale’s thinking that there is a ‘step function’ at an El Nino.
—————————

It would make sense to see some increase in 1997/1998 due to the El Nino.

But not until 2003.
And not stopping just when you start measuring for the first time with a good instrument.
And not without any increase during the subsequent El Ninos in 2005 and 2010.

166. William McClenney says:

benfrommo says:
February 26, 2013 at 7:23 pm

“Well you are being lazy. You cite a paper and state that the error is what they say it is. Did you have any thoughts, concerns, or even some opinion on their evaluation of the data? Come on man, up your game. Give us something. Don’t just go around thumbing your nose at everyone else while you are being lazy intellectually.”

And that, ladies and gentlemen, is probably the root of the problem.

Nice distillation benfrommo!

167. Willis Eschenbach says:

I wanted to point out something interesting about the location of the argo floats.

SOURCE

Note that a fairly large expanse of the continental shelf is free of Argo floats. This is for a curious reason.

The Argo floats sleep in the midnight-black depths of the ocean, below the photic limit, a kilometre down (3,300 ft, 0.6 miles). They spend ten days there with all systems on idle.

And then (perhaps after diving even deeper yet) they rise to the surface, taking measurements as they go. When they hit the surface of the ocean, ET calls home, they radio the info to the satellite, sign off, and drop a kilometre back down into inky darkness to sleep another ten days.

I’m sure you’ve already figured out the reason for the lack of Argo floats in the figure above over the continental shelves … which are often less than 1,000 metres in depth … like 600, or 300.

This leads to a funny problem. When deep ocean currents hit the continents, they often are forced to the surface by the undersea contours. Often they play along the coastline like a loose underwater hose, shifting north and south with the seasons and other pressures.

This leads to a large temporal and spatial variation in the temperature (and salinity and pH) of the significant volume of water over the continental shelves and along the coastlines.

This coastal water is also subject to the large variations in temperature due to the variable influx of the generally colder but sometimes warmer fresh water coming from the rivers.

And unfortunately, almost none of this water is being measured by the Argo floats at all … as they are generally set, they can’t go where the bottom is shallower than 1,000 metres.

My point, as always, is the effect that all of this has on the error estimates, when large areas (continental margins) have large spatial and temporal variation in temperature, and are not included in the calculations.

Next, note the lack of floats in the intertropical convergence zone (ITCZ) just above the equator. This is the most active region on the planet, the driving, pulsing, shifting mass of thunderstorms that powers the global circulation and variably cools regions of the upper ocean … and it too is undersampled.

I’m just not finding how those things are being accounted for. I’ve read the Levitus including the supplement, and I can’t see they’re mentioned.

w.

168. This is an awful lot of comments for nothing more than a thought experiment. Are we all still pretending the OHC was measurable in 97-98? And then suddenly we start talking about ARGO? HELLO!!!??!!! ARGO started being deployed in 2000. This isn’t even an apples-to-oranges comparison! It’s apples to imagination! It’s a meaningless conversation.

But, then, so are the thoughts behind trying to measure the OHC with the buoys. This isn’t like static ground thermometers. And even if they were, what they are measuring is different each measurement. It isn’t like they’re measuring the same piece of water, even if the location is exactly the same. Quit letting people pretend this gives us some value or understanding. It doesn’t. It’s as meaningful as me measuring the heat content of the water from my tap each day. As a bonus, I’ll measure it in a pot and change depths.

Wait, never mind. I forgot that some people believe that heat from SUV driving magically drops to the depths of the oceans and lurks. Please ignore and carry on with the delusions.

169. Andyj says:

The graph under review has serious issues.
For a start, this shows anomalies. To me that means 99.99% of the world’s oceans have none, and the Y axis is calculated as a mean. Heating the oceans to depths of 700m from the heating of the air, by simply blowing air over them… AND being able to measure the difference year on year… Guys like James Sexton say this graph has issues. No doubts about that.

170. Ben Wouters says:

Ray says:
February 26, 2013 at 7:46 am
“Ben I have no idea if geothermal heat is factored into the assorted Climate models in use or how a warm “Black Body” behaves.”
The current heat flux through the crust is negligible (~0.1 W/m^2) and not factored in, afaik.
But the seismic event I mentioned involved 100 million km^3 of magma erupting from inside the Earth into the Pacific Ocean. That’s enough magma to cover all of Canada under a 10km thick layer of sizzling hot magma. You can’t ignore events of this magnitude.
This event has the potential to warm ALL of the world’s ocean water some 15-20K.
So with the oceans pre-heated from within Earth it makes no sense to discuss black or grey body behaviour. Just look at how many joules/s the sun delivers on Earth and how that energy warms the already warm oceans.

171. Very interesting Willis… appears you found some more selection bias all over again. The more we find, the more we show that the scientists were either being lazy or incompetent. I don’t know which, but if they can write a paper like that and not even mention these effects, to the tune of simply “We did not consider x, y, and z,” then they are either being lazy and not thinking this through, or perhaps they are being dishonest. Now, I make no claim to either one, and perhaps they just figured adding that was a waste of time, but regardless, that is rather shoddy work!

In any regard, I think politics may also be playing a large part in this. Here is an excerpt from the ARGO page:

“We are increasingly concerned about global change and its regional impacts. Sea level is rising at an accelerating rate of 3 mm/year, Arctic sea ice cover is shrinking and high latitude areas are warming rapidly. Extreme weather events cause loss of life and enormous burdens on the insurance industry. Globally, 8 of the 10 warmest years since 1860, when instrumental records began, were in the past decade.

These effects are caused by a mixture of long-term climate change and natural variability. Their impacts are in some cases beneficial (lengthened growing seasons, opening of Arctic shipping routes) and in others adverse (increased coastal flooding, severe droughts, more extreme and frequent heat waves and weather events such as severe tropical cyclones).”

Of all of that, two things strike me….. (this goes towards the motivations of the scientists to perhaps explain whether they are being intentionally dishonest or perhaps just lazy.)

They claim that 8 out of the last 10 years have been the warmest since 1860? Did they not think this through? Of course the last decade is warmer than any previous decade. This is to be expected when you start the data during the Little Ice Age and have over 100 years of warming. What did they expect, record cold? That just makes no sense to me… because frankly that information does not tell you whether you are in a warming world or whether the warming was in the past… After all, do we not expect to find the highest elevations on Earth at the tops of mountains?

Don’t we expect to find the highest temperatures in the data at the top of the proverbial temperature chart? And why in the world are these world-class scientists not able to realize that the warmest-decade meme is pointless to repeat, because it comes from being lazy and not thinking for yourself?

And then the call to “extreme weather.” Not one paper written has confirmed this “CRAZY GUESS,” and yet it’s repeated by scientists like it’s gospel truth. Do they realize that since the science does not back this up, simply stating it means they aren’t thinking at all? I don’t know what to add to that… In any regard, reading about ARGO and the scientists involved makes me think they are either very lazy or very incompetent. I don’t know which, but this decadent line of thinking is what is going to kill climate science more than anything else. Normal rational people, after all, can reason things out, and if you go on about unproven quack theories like extreme weather, the science will suffer as your conclusions are based on faulty input.

Hello, can we say garbage in, garbage out again?

Perhaps that is the problem… these scientists, either for political reasons or for other reasons, are not thinking things through well enough. In any regard, pure laziness is not an excuse for sub-par papers that claim a certain error when they do not even mention other sources of error. Come on, a sixth grader could have done better in that regard. Heck, my science project for sixth grade was graded rougher than what I am grading these scientists on, so if my post bothers them, perhaps they just need to grow thicker skin…

172. Phobos says:

Willis Eschenbach wrote: “I’m just not finding how those things are being accounted for.”

If you read Levitus et al 2012, you see they are not claiming to calculate the average temperature of the oceans, but of the “World Ocean.”

The World Ocean isn’t the exact ocean; it’s meant to be a representation of it, subject to the limitations of what’s experimentally doable and affordable. It’s a model of the exact ocean, if you want to call it that.

Of course, such constructions are the norm in environmental science, and indeed in most science. One can never measure the ideal system, so one somehow obtains reasonable facsimiles of it and does the measurements there, accounting for differences as well as reason and creativity allow, and doing one’s best to indicate the amount of resulting uncertainty.

The interest (in this case) isn’t so much in whether the modeled system gives a precise measurement of the average temperature of the ocean — which would, of course, require an infinite number of a set of measurements {x,y,z,t} for every point in the ocean and at every instant in time — but in how the consistently modeled system changes with time.

All data in science depends on a model. ALL data. I suspect you know this, and are just looking for ways to dismiss the OHC results in any way possible. There are an infinite number of objections that could be raised; but, do you have a better method?

173. Mark Bofill says:

Phobos says:
February 27, 2013 at 7:07 am

All data in science depends on a model. ALL data. I suspect you know this, and are just looking for ways to dismiss the OHC results in any way possible. There are an infinite number of objections that could be raised; but, do you have a better method?
—————————-
Phobos, I think Willis was pretty clear here:

My main question in this revolves around the claimed error. I find the claim that we know the average temperature of the upper ocean with an error of only one hundredth of a degree to be very unlikely … the ocean is huge beyond belief. This claimed ocean error is on the order of the size of the claimed error in the land temperature records, which have many more stations, taking daily records, over a much smaller area, at only one level. Doubtful.

His issue is with the error claims, not with the model.

174. Phobos says:

@Mark Bofill: Again, they aren’t measuring the average temperature of the huge ocean; they’re measuring the average temperature of the “World Ocean.”

As Levitus et al 2012 write in their Supplementary Material: “The results describing the variability of ocean heat content shown here are based on gridded (1-degree latitude-longitude grid), interpolated fields at standard depth measurement levels….”

You have about 3,000 buoys surfacing about every 10 days, taking temperature profiles all the way up. At any given depth, that’s about 110,000 measurements per year, so about one measurement every 3500 km^2, randomized over most of that ocean layer.
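Those sampling-density figures can be re-derived; a rough sketch using round numbers (the ~361 million km² total ocean surface area is an assumed standard figure, not from the comment):

```python
# Back-of-envelope check of the Argo sampling-density figures quoted above.
# Buoy count and cycle length are the comment's round numbers.
N_FLOATS = 3_000
CYCLE_DAYS = 10
OCEAN_AREA_KM2 = 361_000_000  # assumed total ocean surface area

profiles_per_year = N_FLOATS * 365 / CYCLE_DAYS       # ~110,000 profiles/yr
km2_per_profile = OCEAN_AREA_KM2 / profiles_per_year  # area per profile per year
print(f"{profiles_per_year:,.0f} profiles/yr, one per {km2_per_profile:,.0f} km^2")
```

This lands near 3,300 km² per profile per year at any given depth, in the neighbourhood of the ~3,500 km² quoted (the exact figure depends on which ocean area one uses).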

Sure, it’d be great to have 10 times as many buoys, or a hundred times more. Is there a reason to think that would give greater precision of the model’s average temperature?

175. Theo Goodwin says:

Phobos says:
February 27, 2013 at 7:07 am

You have set a record for self-serving tripe on WUWT and maybe in the whole arena of science. You write:

“The World Ocean isn’t the exact ocean; it’s meant to be a representation of it, subject to the limitations of what’s experimentally doable and affordable. It’s a model of the exact ocean, if you want to call it that.

Of course, such constructions are the norm in environmental science, and indeed in most science. One can never measure the ideal system, so one somehow obtains reasonable facsimiles of it and does the measurements there, accounting for differences as well as reason and creativity allow, and doing one’s best to indicate the amount of resulting uncertainty.”

So, if you do not have much money for experiment or much time, interest, or imagination, then you do what you can – you collect some numbers somehow and from that you project what you call a model of the World Ocean? Is that about right? Please notice in the comments above that the words that have risen to the top are ‘lazy’ and ‘thoughtless’. You have nothing to say about evaluation of data. After raising the “sampling candy bars” analogy, you fail to respond to criticisms of it.

However, I can say in your defense that I have not encountered an Alarmist who would attempt to think through the steps that must be taken in nature (the real world) to validate the measurements that support what you call your model.

176. Mark Bofill says:

Phobos says:
February 27, 2013 at 8:13 am

Sure, it’d be great to have 10 times as many buoys, or a hundred times more. Is there a reason to think that would give greater precision of the model’s average temperature?
—-
I’m sorry if I was unclear. I don’t have any issue with the number of buoys deployed. I don’t read Willis as having taken issue with this either. I believe the issue is with the error claims. Willis appears to be expressing skepticism that these measurements are accurate to within a hundredth of a degree. He does not appear to be stating that we require accuracy to within a hundredth of a degree, and neither am I.

177. Phobos says:

Mark Bofill says: “I believe the issue is with the error claims. Willis appears to be expressing skepticism that these measurements are accurate to within a hundredth of a degree.”

It’s good to raise questions. It’s also good to answer them. I don’t see Willis crunching the numbers to provide an alternative number. On the other hand, there has been a lot of work done on the ARGO system over the years, much of which is concerned about data integrity:
http://www.argo.ucsd.edu/Bibliography.html

And, Levitus et al 2012 have an entire Appendix on it; clearly they have thought about the issue a lot, and, importantly, crunched the numbers.
http://www.agu.org/pubs/crossref/pip/2012GL051106.shtml

178. Theo Goodwin says:

Phobos says:
February 27, 2013 at 7:07 am

“The interest (in this case) isn’t so much in whether the modeled system gives a precise measurement of the average temperature of the ocean — which would, of course, require an infinite number of a set of measurements {x,y,z,t} for every point in the ocean and at every instant in time — but in how the consistently modeled system changes with time.”

Phobos says:
February 27, 2013 at 9:01 am
‘Mark Bofill says: “I believe the issue is with the error claims. Willis appears to be expressing skepticism that these measurements are accurate to within a hundredth of a degree.”

It’s good to raise questions. It’s also good to answer them. I don’t see Willis crunching the numbers to provide an alternative number. On the other hand, there has been a lot of work done on the ARGO system over the years, much of which is concerned about data integrity:
http://www.argo.ucsd.edu/Bibliography.html

And, Levitus et al 2012 have an entire Appendix on it; clearly they have thought about the issue a lot, and, importantly, crunched the numbers.
http://www.agu.org/pubs/crossref/pip/2012GL051106.shtml

OK, Phobos, which is it: how consistently the model changes with time, or the integrity of the measurements? Maybe you are going to address the latter? So far, you have not.

179. Willis Eschenbach says:

Phobos says:
February 27, 2013 at 7:07 am

Willis Eschenbach wrote:

“I’m just not finding how those things are being accounted for.”

If you read Levitus et al 2012, you see they are not claiming to calculate the average temperature of the oceans, but of the “World Ocean.”

The World Ocean isn’t the exact ocean; it’s meant to be a representation of it, subject to the limitations of what’s experimentally doable and affordable. It’s a model of the exact ocean, if you want to call it that.

Huh? That claim is totally meaningless without a quote or a page number for a citation ..

w.

180. Theo Goodwin says:

Phobos writes

“which would, of course, require an infinite number of a set of measurements {x,y,z,t} for every point in the ocean and at every instant in time”

I cannot imagine where you got this idea. You are allowed to sample, just as the Hershey Company samples. But you have to identify the physical processes that you are sampling and the physical characteristics of the measuring instrument. Both projects demand rigorous experimentation. From what I see, you have assumed that the ocean is everywhere uniform, at least down to 200 meters. You have assumed that there are no physical processes to measure. Regarding the buoys, you assume that they are all ideal and remain so come what may. As a scientist, your level of rigorous experimentation should at least come up to the level of the Hershey Company. Treating the ocean as everywhere uniform is simply myth making.

181. Willis Eschenbach says:

Phobos says:
February 27, 2013 at 9:01 am

Mark Bofill says: “I believe the issue is in the error claims. Willis appears to be expressing skepticism that these measurements are accurate to within a hundreth of a degree.”

It’s good to raise questions. It’s also good to answer them. I don’t see Willis crunching the numbers to provide an alternative number. On the other hand, there has been a lot of work done on the ARGO system over the years, much of which is concerned about data integrity:
http://www.argo.ucsd.edu/Bibliography.html

And, Levitus et al 2012 have an entire Appendix on it; clearly they have thought about the issue a lot, and, importantly, crunched the numbers.
http://www.agu.org/pubs/crossref/pip/2012GL051106.shtml

It appears you hold the bizarre idea that anyone who discovers an error in a scientific work is required to “provide an alternative number”. Nothing could be further from the truth. The fact that I think there is an error, as I’ve discussed here and in “Decimals of Precision”, doesn’t impose on me any obligation to do anything at all, much less “provide an alternative number”.

w.

182. Theo Goodwin says:

Phobos writes:

“It’s good to raise questions. It’s also good to answer them. I don’t see Willis crunching the numbers to provide an alternative number.”

Of all sins against logic and scientific method, this one is the most offensive. Alarmists claim to reason that any claim they present, no matter how ridiculous, is authoritative until critics present an alternative. The only thing behind the Alarmists’ claim is narcissistic grandiosity.

183. Uzurbrain says:

Willis Eschenbach says: February 27, 2013 at 10:08 am

Reviewed your post on “Decimals of Precision.” An excellent description of what is needed to provide the precision purported in all of these “scientific” papers. The more I look into the global warming hype, the more I am convinced that if I am a Skeptic, then those providing and supporting the AGW theory are “Pseudo-Scientists.”

184. Michael Moon says:

On ARGO’s website they list their instrument packages. SBE (Sea-Bird Electronics) lists the accuracy of their instrument at 0.001 C, with a drift of 0.0002 C/month. They use a Platinum Resistance Thermometer, a very accurate sensor. So this accuracy is probably true, as their website is extremely sophisticated. Averaging tens of thousands of these readings, assuming all are this accurate, will give a pretty good look at OHC.
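Taking those spec figures at face value, the accumulated drift over a multi-year deployment is worth computing; a minimal sketch (the 4-year deployed lifetime is an illustrative assumption, not a number from the spec sheet):

```python
# Accumulated error implied by the spec figures quoted above:
# 0.001 C initial accuracy and 0.0002 C/month drift. The 4-year
# lifetime is an illustrative assumption.
ACCURACY_C = 0.001
DRIFT_C_PER_MONTH = 0.0002
LIFETIME_MONTHS = 4 * 12

worst_case_drift_c = DRIFT_C_PER_MONTH * LIFETIME_MONTHS  # drift alone
worst_case_total_c = ACCURACY_C + worst_case_drift_c      # drift + initial accuracy
print(f"worst-case drift after {LIFETIME_MONTHS} months: {worst_case_drift_c:.4f} C")
print(f"worst-case total error: {worst_case_total_c:.4f} C")
```

At the quoted drift rate, an uncorrected sensor could accumulate roughly 0.01 °C of drift over four years, the same order as the error bar debated in this thread, which is why the drift question keeps coming up.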

185. Phobos says:

Willis Eshenbach said: “It appears you hold the bizarre idea that anyone who discovers an error in a scientific work is required to “provide an alternative number”.

You think you have discovered an error. You have not. All you have said is: this number looks suspicious to me. That’s easy to say, and anyone can say it. Until you are more specific or quantitative, it doesn’t rise above just another blog comment.

186. Phobos says:

Theo Goodwin says: “Alarmists claim to reason that any claim they present, no matter how ridiculous, is authoritative until critics present an alternative.”

Hardly. But what we have here is an experienced research group who has been analyzing ocean data for a long time and who have published a peer reviewed paper in a respectable journal that contains an entire Appendix on error analysis, versus a blog comment that says, this doesn’t look right to me.

Until more specifics are offered, and/or I crunch the numbers myself, I’m going with the scientific experts.

187. george e. smith says:

From Lars P.

“””””…..Argument 1. People claim that because the DLR is absorbed in the first mm of water, it can’t heat the mass of the ocean. But the same is true of the land. DLR is absorbed in the first mm of rock or soil.
When you put your hand on land it is warmer above and cooler below. This explains why heat is transferred from above to below.
The oceans have a cool skin above, so the comparison is wrong……”””””

Water has its maximum absorption coefficient at 3.0 microns wavelength, where it’s about 8,000 cm^-1. It is still above 1,000 cm^-1 out to 10 microns wavelength, where the peak of the Earth’s radiation spectrum lies for a temperature of 288 K.
1,000 cm^-1 means the 1/e absorption depth is 10 microns; only 37% survives to that depth. So five times that is 50 microns, by which distance 99% of the incident LWIR radiation has been absorbed. So the atmospheric downward LWIR doesn’t make it anywhere near one millimetre deep.
That absorption in just a couple of thousandths of an inch results in enhanced evaporation of surface water; and that is the reason the very surface layer is cooler than the water a few cm down. Evaporation removes about 590 calories per gram of water evaporated, which is much more heat energy than what results from just the Temperature depression.

And because of that surface gradient (temperature increasing from the surface down, even if only for a few mm or cm), there is no heat conduction from the surface to the ocean depths.

This is of course predicated on a relatively still surface. Wind driven storms of course may roil the surface waters, and stir things up a bit, but the principal effect of downward LWIR radiation absorption in the sea surface is evaporation, not downward conduction of heat.

The far more energetic photons of the incoming solar spectrum radiant energy encounter water absorption coefficients that are more like 0.0001 to 0.001 cm^-1 so the 1/e absorption depth is in the range of ten to one hundred metres; not ten to one hundred microns.
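george e. smith’s depth figures follow directly from Beer-Lambert attenuation. Here is a rough sketch of that arithmetic, using only the absorption coefficients quoted above (the function name is mine):

```python
import math

def survival_fraction(depth_cm: float, alpha_per_cm: float) -> float:
    """Beer-Lambert: fraction of radiation surviving to depth_cm in a
    medium with absorption coefficient alpha_per_cm (cm^-1)."""
    return math.exp(-alpha_per_cm * depth_cm)

# LWIR near 10 microns: alpha ~ 1,000 cm^-1, so the 1/e depth is 10 microns
alpha_lwir = 1000.0
print(1.0 / alpha_lwir * 1e4)  # 1/e depth in microns: 10.0

# ~37% survives one e-folding depth; well under 1% survives five (50 microns)
print(survival_fraction(1.0 / alpha_lwir, alpha_lwir))  # ~0.37
print(survival_fraction(5.0 / alpha_lwir, alpha_lwir))  # ~0.007

# visible solar: alpha ~ 0.0001 to 0.001 cm^-1, so 1/e depths of 100 m to 10 m
for alpha in (1e-4, 1e-3):
    print(1.0 / alpha / 100.0, "m")
```

The five-e-foldings rule puts essentially all the LWIR deposition within the top 50 microns, consistent with the evaporation argument above.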

Don’t take my word for it; go read it for yourself in the literature. I can suggest “The Infra-Red Handbook” prepared for the US Navy, as one reputable source.

I don’t know why we continually rehash this belief that downward LWIR from the atmosphere stores heat in the ocean.
Now for the solid rocky surface, it is different, as the rocks don’t evaporate as easily as the ocean water.

188. Uzurbrain says:

Michael Moon says: February 27, 2013 at 10:51 am
“On the ARGO website they list their instrument packages. SBE Sea Bird Electronics lists the accuracy of their instrument at 0.001 C, with a drift of 0.0002.”
——————
Those numbers do not make sense; they exceed the capabilities of ultra-high-precision, laboratory-only, NIST-traceable thermometers costing $10,000 and more.
You are being “snowed” with the B/S. They are not “calibrating” or “analyzing” the whole instrument string. That data ONLY represents the potential accuracy of the ELECTRONICS. It does not include the PRT (Platinum Resistance Thermometer) or the effects of the change in resistance of the PRT leads, connections, and the change in the ambient temperature of the electronics (ambient temperature could affect the reading by as much as a full degree). In essence they are using a micrometer to measure a 2x4. It would be like calibrating your car speedometer and then replacing the 15″ wheels with 17″ wheels: your speedometer is no longer accurate, so get ready for a speeding ticket when it says you are doing 63 in a 65 zone. It has to be calibrated all the way to the source (moving roller/pavement) to be “accurate.”
SeaBird has designed an electronics package that will display numbers to 5 or 6 decimal points. Yes, they have data showing that even after 5 years, the ELECTRONICS (not the electronics AND PRT) gives them the same number within +/- 0.002 C. BUT they are reading a Temperature Transfer Standard (TTS), NOT the actual temperature in an actual Triple Point of Water (TPW) bath and a Gallium Melt Point (GMP) bath. They just hook up the TTS with “calibrated leads” to the SeaBird, set the TTS for TPW, read the display, set the TTS for GMP, read the display, and voilà: 0.002 C accurate. That only proves the ELECTRONICS is accurate, nothing more. They have ignored the accuracy of the PRT and ASSUMED that the PRT follows the standard resistance curve. In my 50 years of experience I have found that even PRTs costing over $5,000 (yes, five thousand) have been off by as much as 0.1% (when 0.01% was specified on the PR) and have NOT followed the standard curve. AND again, the accuracy stated for PRTs is almost always a percentage of the maximum range. In layman’s terms that means 0.01% of 100 degrees is 0.01 degrees, which is 0.03% of 33 degrees.
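Uzurbrain’s point about accuracy quoted as a percentage of full range can be shown with trivial arithmetic (a sketch; the function name is mine, and the 100-degree range and 33-degree reading are the figures from the comment):

```python
def abs_error_from_range_spec(pct_of_range: float, full_range_deg: float) -> float:
    """Absolute error implied by an accuracy spec quoted as a
    percentage of the instrument's full measurement range."""
    return pct_of_range / 100.0 * full_range_deg

# a 0.01%-of-range spec on a 100-degree range is a fixed absolute error...
err = abs_error_from_range_spec(0.01, 100.0)
print(round(err, 6))  # 0.01 degrees

# ...which is a proportionally larger relative error at a 33-degree reading
print(round(err / 33.0 * 100.0, 4))  # ~0.03 (% of 33 degrees)
```

The absolute error is fixed by the range, so the relative error grows as the reading shrinks, which is the commenter’s complaint about specs stated against the maximum range.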
You have been snowed, they have been snowed, and we have been snowed. They need somebody that knows what they are doing. That is the problem with book learned scientists that have never seen actual, real, practical, applied application of real world instrumentation.

189. Stephen Rasey says:

Some observations about the ARGO location map:
First, it is Cartesian in Latitude, so it is not “Equal Area”. The map’s creators should make the Y-axis linear in sin(Latitude) to make it an equal-area map.
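On an equal-area cylindrical map the Y coordinate is linear in sin(latitude), so the band spacing shrinks toward the poles in proportion to cos(latitude). A minimal sketch of that remapping (the function name is mine):

```python
import math

def equal_area_y(lat_deg: float) -> float:
    """Y coordinate for a cylindrical equal-area projection: linear in
    sin(latitude), so equal areas on the map are equal areas on the sphere."""
    return math.sin(math.radians(lat_deg))

# a 10-degree latitude band at the equator covers far more surface area
# than a 10-degree band at the pole; the sin() mapping reflects that:
equator_band = equal_area_y(10.0) - equal_area_y(0.0)  # ~0.174
polar_band = equal_area_y(90.0) - equal_area_y(80.0)   # ~0.015
print(equator_band / polar_band)  # ~11x the area
```

On the plain Cartesian map, float density at high latitudes is visually exaggerated relative to the ocean area it represents.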

The density of points in the Sea of Japan is remarkably high compared to the Southern Caribbean, the Gulf of Mexico, and the Philippine Sea. Is it evidence of a trap? There is also a surprising gap between the Philippines and Papua New Guinea, one of the deepest average depths in the Pacific.

There is an unusual lineation at 165 W 30-50 deg N, a clump bordered by a dearth on each side. I thought this might be the ridge of sea mounts running north of Midway, but those are further west at 170 E.

It is hard to see why Argo floats tend to the west side of the Bay of Bengal and Arabian Sea. Water depth doesn’t preclude floats in the Straits of Madagascar.

Argo floats avoid southern West Africa, which cannot be explained by shallow water. Likewise offshore Peru.

To a lesser degree, the southern Atlantic Ocean seems not very random. Is there an E-W clump at 30 deg S, or is it an optical illusion created by the latitude line? If so, where is the illusion in the Pacific? There is a dearth south of 30 deg S, 10-45 deg W.

I’d sure like to see a time-series plot of the number of Argo floats south of 60 deg S over time.

Willis, did you post to the net the compiled data from the 8000+ downloads you described in “Where in the World is Argo?”

Once I get it all downloaded, I’ll put it together in some more reasonable format and stick it back out on the web, so people won’t have to go through that madness for the data. — Willis Feb/6/2012

Map of ocean depth – NOAA to compare to Argo location map.

190. Theo Goodwin says:

Phobos says:
February 27, 2013 at 11:09 am

Then come to the United States and meet the people who say “Show me!” Scientists are given exactly the respect they earn; that is, they are permitted to offer evidence and explanations in support of their assertions. They will be questioned. They will be judged on the quality of their answers. They may respond and a new cycle of debate begins.

The playing field that I have described is level in accordance with scientific method. If you find something wrong with it, please tell me.

191. Lars P. says:

george e. smith says:
February 27, 2013 at 11:41 am

From Lars P.
“””””…..Argument 1. People claim that because the DLR is absorbed in the first mm of water, it can’t heat the mass of the ocean. But the same is true of the land. DLR is absorbed in the first mm of rock or soil.
When you put your hand on land it is warmer above and cooler below. This explains why heat is tranferred from above to below.
The oceans have a cool skin above, so the comparison is wrong……”””””
…………………….
Don’t take my word for it; go read it for yourself in the literature. I can suggest “The Infra-Red Handbook” prepared for the US Navy, as one reputable source.

I don’t know why we continually rehash this belief that downward LWIR from the atmosphere stores heat in the ocean.
Now for the solid rocky surface, it is different, as the rocks don’t evaporate as easily as the ocean water.

George, I perfectly agree.
My answer was to Willis, who made that particular claim in the post above. Arguments 1, 2, 3, and 4 are Willis’ arguments, and I tried to show they were wrong. It looks like I did not succeed in making my point clear.

Here was the Willis post that I was answering:
Willis Eschenbach says:
February 26, 2013 at 10:05 am

Where he was linking to his blog post here – where you can find the arguments:

192. Theo Goodwin says:

Michael Moon says:
February 27, 2013 at 10:51 am

How do they describe the rigorous experimentation they did to validate the instruments in the ocean? Did they assume that the ocean is everywhere uniform, at least down to 200 meters? If not, against what physical processes were the instruments validated?

What is their program for removing and re-validating instruments that have been in use for (what period of time)? What have they discovered about instrument deterioration (change) for instruments that were actually in use? What are the differences between instruments that traveled some distance from their starting points and those that did not?

If these questions seem annoying, do remember that the future of industrial civilization may very well depend upon the answers.

193. Theo Goodwin says:

Stephen Rasey says:
February 27, 2013 at 12:37 pm

Excellent post, Thanks. Once again unpaid skeptics do part of the necessary work for the paid scientists.

194. Theo Goodwin says:

Uzurbrain says:
February 27, 2013 at 12:34 pm

Ditto. Unpaid skeptic substitutes for paid scientist and, in this case, reveals some falsehoods and some important work not done.

195. Theo Goodwin says:

Phobos says:
February 27, 2013 at 11:09 am
“Theo Goodwin says: “Alarmists claim to reason that any claim they present, no matter how ridiculous, is authoritative until critics present an alternative.”

Hardly. But what we have here is an experienced research group who has been analyzing ocean data for a long time and who have published a peer reviewed paper in a respectable journal that contains an entire Appendix on error analysis, versus a blog comment that says, this doesn’t look right to me.

Until more specifics are offered, and/or I crunch the numbers myself, I’m going with the scientific experts.”

Do you really not see that what you say here is exactly what the followers of Ptolemy said to Copernicus, Brahe, Kepler, and Galileo? There is no authority in science except scientific method and observational evidence.

196. Theo Goodwin,

Have a look for yourself. These are commercial instruments with guarantees involved, and as usual, when someone buys something like this and expects to get what he paid for, he probably will. Having bought accurate instruments and used them, these guys impress me as the real deal.

The group I do not trust are the ones apparently at UC-San Diego who took the first 5 years of data, saw that OHC was Falling, and adjusted the data to show the opposite. The instrument was not at fault there.

197. george e. smith says:

“””””Lars P. says:

February 27, 2013 at 12:55 pm

george e. smith says:
February 27, 2013 at 11:41 am…..”””””

Yes, Lars, your post was just a convenient place to put in some elaboration, so I wasn’t directing any critical comment at you.

The ocean shallows really are a diabolical interface.

We know that there is generally a cooling with depth, relatively fast at first (my scuba daughter reminds me of this constantly) and then a slower decline into the abyssal depths, so that is the direction of conductive diffusion for solar energy implanted in the top few hundred metres; but the top mm have a reverse gradient due to evaporation of the most energetic molecules in the surface film, and that is enough to stop the LWIR energy from breaking through the barrier and then diffusing to the depths.
Of course these effects aren’t necessarily constant over all the oceans regardless of local conditions, but this is a good picture of the mostly tropical regions, where most of the solar energy is.

198. TimTheToolMan says:

Phobos writes “You think you have discovered an error. You have not.”

Alternatively one could ask whether Levitus et al. have adequately justified their precision in their paper, particularly for the sparse 1955-era data on which the overall trend relies. The more measurements we have (i.e. after Argo), the flatter the trend.

199. Uzurbrain,

This is from National Physical Laboratory of the UK.

“For the highest accuracy, special glass-sheathed standard PRTs, usually of 25 ohms at 0 °C, are calibrated at the fixed points of the International Temperature Scale of 1990 (see above). The ITS-90 specifies equations to relate the resistance to temperature and, using these, uncertainties can be achieved of 0.001 °C or better. Standard PRTs can be used from temperatures as low as −259 °C up to 660 °C, or even 962 °C, with some increase in uncertainty and loss of reproducibility.”
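For readers unfamiliar with PRTs: the resistance is converted to temperature through a reference curve. The ITS-90 functions NPL refers to are more elaborate, but the common industrial approximation (the Callendar-Van Dusen equation with IEC 60751 coefficients, valid here for t >= 0 °C) gives a feel for the sensitivities involved. This sketch is an illustration, not the ITS-90 calculation:

```python
def pt_resistance(t_c: float, r0_ohm: float = 25.0) -> float:
    """Resistance of a platinum RTD at t_c (deg C, t_c >= 0) via the
    Callendar-Van Dusen equation with IEC 60751 coefficients.
    Default r0 matches the 25-ohm standard PRT mentioned by NPL."""
    A = 3.9083e-3   # per deg C
    B = -5.775e-7   # per deg C squared
    return r0_ohm * (1.0 + A * t_c + B * t_c * t_c)

# sensitivity near 0 C: resistance change produced by one millidegree
delta = pt_resistance(0.001) - pt_resistance(0.0)
print(delta)  # ~1e-4 ohm, i.e. roughly 0.1 milliohm per millidegree
```

Resolving 0.001 °C therefore means resolving roughly a tenth of a milliohm on a 25-ohm element, which is why lead resistance and bridge stability dominate the error budget.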

I of course do not know the UCSD people, nor whether their data quality control is straightforward or tricked up. They were caught with their hand in the cookie jar once, but maybe now they are telling the straight dope.

200. Theo Goodwin says:

Phobos says:
February 26, 2013 at 10:55 am
“Doug Proctor says: “Huge numbers of data points in different places and different times do not reduce error estimates; only repeated measurements of the same parameter that is unchanging reduces errors (of measurements). Each time you measure a different thing that is itself changing you get back to your fundamental measurement limitation.”
——————————–
It is simply statistical sampling — much like what a candy company does as its boxes come off the assembly line, in order to measure and control quality. They don’t measure the same box over and over again, but sample a subset of the boxes over time. With sufficiently sized sampling, the difference between the sampled set and the entire population is small — this is, after, the basis of all statistical reasoning.”

Phobos, you did not address your error here. In fact, you did not acknowledge it and when prompted you pretended that it does not exist. Yet this error is huge. You have assumed that the ocean is everywhere uniform down to 200 meters. In effect, you make a preposterous claim of uniformity. Once again, Alarmist thought is top down and has no place for empirical work.

201. Uzurbrain says:

Michael R. Moon says: February 27, 2013 at 1:19 pm
“Theo Goodwin, http://www.seabird.com/pdf_documents/Datasheets/911plusbrochureAug11.pdf”

Also this appears to be for a shipboard operated instrument. If not, please explain how all of these buoys are getting 120 or 240 VAC.

Assuming the buoy equipment is this equipment, then, with my experience calibrating equipment traceable to NIST standards with ultra-high-accuracy laboratory-grade equipment in a room that had alarms if the temperature, pressure or humidity exceeded specified bounds (required to assure measuring equipment accuracy), I find it hard to believe (dare I say impossible) that you can take equipment that claims to have that accuracy, put it in a water/pressure-proof can, drop it to a depth of 700 meters, make it ascend to the surface, do that over 30 times a year, and have the readings keep the accuracy described in the referenced data sheet.

I have seen test equipment provide incorrect readings while making calibration comparisons simply because the room temperature had changed from the low end to the high end of the allowable limits. And SBE is attesting that this does not happen as the equipment’s own temperature follows the ambient temperature change? I don’t think so. They use a Wien bridge for measuring the temperature and pressure. That means there are reference elements in the legs of the Wien bridge. When the ocean temperature surrounding the buoy changes, the temperature of these elements will change also, and there goes your “factory certified calibration.” That is why the calibration labs have alarms on the ambient conditions. Look closely at that sales brochure (it is not a spec sheet) and you will note that there is no indication of ambient temperature or pressure effects upon the indicated accuracies. That is the first thing you look for when selecting ultra-high-accuracy measuring instruments. Even NIST reference standards, e.g. the “Kilogram” or “Meter”, which are solid objects, have specifications that they be at a specified temperature, pressure, elevation, etc.

Look up the Vishay resistor they use (I found this Vishay data on the web)
Temperature coefficient of resistance (TCR):
± 0.05 ppm/°C (0 °C to + 60 °C, + 25 °C Ref.)
± 0.2 ppm/°C (– 55 °C to + 125 °C, + 25 °C Ref.)

Those numbers tell me that the Exalted precision equipment (/Snark off) becomes useless if you do not know what the ambient temperature is and have not corrected for it. Are they doing that? I don’t think so. It also tells me that unless the equipment is used at EXACTLY the same temperature as it was calibrated at, it will only have an accuracy as indicated by the above variance in the reference resistor. Then you have to include the capacitors in the other two legs of the Wien bridge. Assuming these are approximately the same as the above, you then need to double the error, unless, again, you have the equipment in a lab. And you will note that at the top, and for most of the lower 3/4 of the buoy’s travel, it will be in the +/- 0.2 ppm range.
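The quoted Vishay TCR figures translate into drift like this (a sketch; the 25 °C calibration temperature and 2 °C deep-water temperature are my illustrative assumptions, and the ~3900 ppm/°C PRT sensitivity is the generic platinum figure, not an SBE spec):

```python
def resistance_shift_ppm(tcr_ppm_per_c: float, delta_t_c: float) -> float:
    """Shift in a reference resistor's value (ppm) when its own
    temperature moves delta_t_c away from the calibration temperature."""
    return tcr_ppm_per_c * delta_t_c

# a 0.05 ppm/C resistor calibrated at an assumed 25 C, operating at 2 C:
shift = resistance_shift_ppm(0.05, 2.0 - 25.0)
print(shift)  # ~ -1.15 ppm

# against a generic platinum sensitivity of ~3900 ppm/C, that reads as
# an apparent temperature error on the order of 0.0003 C
print(abs(shift) / 3900.0)  # ~0.0003
```

Whether that is negligible depends on the claimed error budget; the commenter’s point is that the brochure does not state the ambient-temperature terms at all.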

It sounds like these designers fell into the same design pit as those that designed the first Venus probe that transmitted back at the wrong frequency because they forgot things like this.

202. Let’s see if I’m following this right.

“Phobos” is mainly referring to Levitus 2012 and its Appendix as the font of knowledge where all of this is explained, and if all you ignorant denying slobs would (could?) just read and understand it, you’d all know (and especially Willis would know) why you are all wrong.

Levitus 2012:
Comment On Ocean Heat Content “World Ocean Heat Content And Thermosteric Sea Level Change (0-2000 m), 1955-2010″ By Levitus Et Al 2012
by Dr. Roger Pielke Sr.

An Ocean of Overconfidence
by Willis Eschenbach

More Ocean-sized Errors in Levitus et al.
by Willis Eschenbach

Levitus data on ocean forcing confirms skeptics, falsifies IPCC
at Niche Modeling

The Overstatement Of Certainty In The Levitus Et Al 2012 Paper
by Dr. Roger Pielke Sr.

Yeah, I see Phobos’ point. If only that lousy slacker Willis had ever read Levitus 2012, he would naturally understand how he’s completely wrong and Phobos is so perfectly correct.

Also, I’m getting that Phobos says you must have your own numbers to challenge other numbers. Much like how it’s not enough to prove to the judge that you weren’t doing 60 in the 45 mph zone and thus the ticket is wrong; you must provide a verified speed value, measured with certified instruments traceable to national standards, showing you were below 45.5 mph (allowing for rounding) to beat the rap.

203. Theo Goodwin says:

Uzurbrain says:
February 27, 2013 at 4:25 pm

Another excellent post. Thanks.

204. Theo Goodwin says:

February 27, 2013 at 5:55 pm

Thanks for bringing many of us up to speed.

205. Mark Bofill says:

Theo Goodwin says:
February 27, 2013 at 7:27 pm

206. Michael Moon says:

Sea Bird Electronics makes commercial equipment. Maybe there is another customer, maybe there is not. I am not in that business.

If you suggest that there are no instruments accurate as these fellows offer for sale with guaranteed accuracy, I would ask, “Why not? Clearly the technology exists.” If they made some error, buy one, prove the error, and sue for your losses.

This is 2013. There is a lot of amazing equipment available. It does not change the skeptics’ point at all: Argo has only been in use since 1999, with a few years of errors, a lot like the microwave satellites.

207. Willis Eschenbach says:

Phobos says:
February 27, 2013 at 11:02 am

Willis Eschenbach said:

“It appears you hold the bizarre idea that anyone who discovers an error in a scientific work is required to “provide an alternative number”.

You think you have discovered an error. You have not. All you have said is, this number looks suspicious to me. That’s easy to say and anyone can say it. Until you are more specific or quantitative, it doesn’t rise above just another blog comment.

Thanks for the thoughts, Phobos. I’ve given a variety of reasons why I think the number is incorrect, both here and in Decimals of Precision. I’ve compared, for example, the BEST data (38,000 stations, each one taking at least two measurements per day, 28 million samples per year) with Argo (3,500 floats, one sample every 10 days, 125 thousand samples per year). I’ve calculated the raw error of the Argo floats, and explained the various issues that all increase the raw error.
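Willis’s sample-count comparison can be reproduced with the figures he gives (all numbers are his; the variable names are mine):

```python
# BEST: 38,000 stations, at least two readings per day
best_samples_per_year = 38_000 * 2 * 365
print(best_samples_per_year)  # 27,740,000 (~28 million)

# Argo: 3,500 floats, one profile every 10 days
argo_samples_per_year = 3_500 * 365 // 10
print(argo_samples_per_year)  # 127,750 (~125 thousand)

# the land dataset has on the order of 200x more samples per year
print(best_samples_per_year / argo_samples_per_year)  # ~217
```
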

No one has yet explained the small size of the claimed error, or where they’ve taken account of many of the issues I’ve raised. I’m not saying there is no answer. I’m saying that I haven’t found it, and despite my putting the question out, no one has pointed me to it.

So … to return to what I described as your bizarre idea, beyond doing what I’ve done above, I have no obligation to “crunch the numbers to provide an alternative number”. I’m trying to understand how they got the number they did … not to provide an alternate number myself.

Regards,

w.

208. richard verney says:

Willis

I revert further to your comment at February 26, 2013 at 10:05 am which calls for my response.

Willis, your present Article, is on certainties and error margins in the measurement/assessment of ocean temperatures and hence the heat content of the oceans, and on that point I am fully with the general thrust of your observation. My comment that has apparently caused you offence was not a comment on the present Article (concerning measured/assessed “Ocean Temperatures and Heat Content”) but was instead a comment on a comment made by another commentator. As such, my observation was only partially germane to your present Article.
I am familiar with your post “Radiating the Oceans”. You have posted many good articles on WUWT, but, with respect, your article on “Radiating the Oceans” was not one of your best; in fact I would venture to say it was one of your worst.

The reason I hold this view is that the most important aspect of understanding climate, and in particular whether AGW is real (or, if real, more than merely nominal), is understanding the oceans and the atmospheric conditions above the oceans on a micro and macro basis. Not only does the heat capacity of the oceans dwarf that of the atmosphere, the oceans act as the heat sink and the heat pump of the planet, and in so doing they are the key driver of Earth’s climate. The heat pump is both the generator and driving force behind the water cycle, and also behind the ocean current conveyor belts which in turn distribute polewards the vast energy received in the equatorial, tropical and subtropical areas. There can be no AGW unless the oceans are heating, and there can be no cAGW unless the top metres of the ocean are warming at a significant rate; if heat energy is being sequestered and dissipated to lower depths (say much below the thermocline, or at any rate below the 700 m level) then, subject to that being only a very transient phenomenon of short duration, there can be no cause for immediate alarm. Accordingly, it is necessary to consider the role of GHGs and the consequential claimed back-radiation (DWLWIR) in detail. Your article on “Radiating the Oceans” was, with respect, far too superficial, contained little in the way of science, and did not address the fundamental issues that arise; in failing to do so, it shed very little light on this vitally important topic (quite probably the most important topic in the context of cAGW) and failed to take the science or the debate forward.

I am of the view that one cannot begin to properly discuss ‘radiating the oceans’ without considering:

(i) what the optical absorption characteristics of LWIR in water are;

(ii) how much DWLWIR reaches the oceans (in considering this, one has to consider the effect of wind-swept spray and spume, to what extent it acts as a LWIR block and thus a barrier preventing some or all of the DWLWIR penetrating the ocean, instead imparting the energy received into the atmosphere immediately above the ocean below, AND whether any, and if so how much, DWLWIR is reflected back off the surface of the oceans so that it does not penetrate the ocean but is instead bounced off into the atmosphere);

(iii) given the optical absorption characteristics of LWIR in water, how much energy is absorbed in the top micron layers (bearing in mind that DWLWIR is omni-directional, such that DWLWIR which meets the ocean at, say, 10 degrees to the horizontal will penetrate the water predominantly horizontally with little vertical downward component), at what speed this energy is absorbed (bearing in mind we are talking about energy being received in joules per second), and what the effect of the absorption of that energy is; in particular what it does, where it goes, and how (through what mechanisms and/or physical processes) that energy is distributed and/or transported from the top few microns, and over what time scale this dissipation takes place (when considering this issue one needs to consider the mechanical process of ocean overturning and whether this slow mechanical process dissipates the energy in the top micron layer faster than that energy would excite water molecules in those microns to a level at which they would evaporate);

(iv) what the temperature profile of the ocean is, in the top micron layers, the top surface layer, the first 10 metres, down to the ocean thermocline and below; and

(v) whether the energy budget for the oceans is effectively that they are receiving 170 W m^-2 (solar) + 320 W m^-2 (DWLWIR) and losing 390 W m^-2 (surface radiation) and 100 W m^-2 (sensible heat/convective/evaporative losses), thereby balancing at 490 W m^-2; or whether it is the null hypothesis (energy flux) position that the oceans receive 170 W m^-2 (solar) and lose 70 W m^-2 (radiation loss) and 100 W m^-2 (sensible heat/convective/evaporative losses), thereby balancing at 170 W m^-2; and what evidence and/or scientific literature supports one or the other budget.
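As a quick arithmetic check, both budgets as stated do balance; the disagreement is over which set of flux terms describes reality, not over the sums:

```python
# budget (a): solar + DWLWIR in; surface radiation + sensible/evaporative out
solar, dwlwir = 170, 320           # W/m^2
surface_rad, sens_evap = 390, 100  # W/m^2
assert solar + dwlwir == surface_rad + sens_evap == 490

# budget (b): solar in; net radiative loss + sensible/evaporative out
net_rad_loss = 70                  # W/m^2
assert solar == net_rad_loss + sens_evap == 170

print("both budgets internally balance")
```
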

When considering those issues, it is very important not to mix up heat and energy. On a science blog such as WUWT, I would have thought that those issues, amongst others, would have been considered and addressed in any worthwhile article dealing with ‘radiating the oceans.’ As WUWT is a sceptical blog, one might also have considered the atmospheric composition immediately above the oceans. Water vapour, not CO2, is the main GHG, and over the oceans (particularly the equatorial, tropical and subtropical oceans) there is a greater concentration of water vapour than the global average, such that one could ponder whether a small increase in the concentration of CO2 in the atmosphere immediately above the oceans could have any significant effect in driving up ocean heat content when it is so overwhelmingly dwarfed by levels of water vapour (which the ocean itself is producing). That, of course, is a slightly different issue, but sufficiently germane since most readers are interested in whether ocean heat content is rising and if so why.

I do not wish to get embroiled and thereby side tracked by a debate on semantics as to the meaning of ‘an explanation’. I recall that we have briefly touched on that before, and I commented that we shall have to beg to differ upon the meaning of explanation.

Not wishing to put words in your mouth (and apologies if I am so doing), my perception of your comments, made over a period of time, can fairly be summarised as: (i) it is a fact that the oceans are not frozen; (ii) the energy budget for the oceans is that they are receiving 170 W m^-2 (solar) + 320 W m^-2 (DWLWIR), and are losing 390 W m^-2 (surface radiation) and 100 W m^-2 (sensible heat/convective/evaporative losses), thereby balancing at 490 W m^-2; (iii) if one were to remove the 320 W m^-2 (DWLWIR) received by the oceans in that energy budget, the oceans would gradually cool and freeze; and (iv) since the oceans are not frozen, QED they must be receiving 320 W m^-2 of DWLWIR. Sorry, but I do not consider that to be an explanation of the issues arising. Indeed, since the energy budget employed by you in (ii) above is contested, as a matter of first principle it cannot itself be used as the means by which to establish its own validity. It is no more proof of its own validity than my citing the null hypothesis energy budget (the oceans receive 170 W m^-2 (solar), and are losing 70 W m^-2 (radiation loss) and 100 W m^-2 (sensible heat/convective/evaporative losses), thereby balancing at 170 W m^-2) and saying: look, DWLWIR is irrelevant and can be completely ignored since it is no part of the null hypothesis energy flux budget. Merely citing your claimed version of the energy budget, or my merely citing the null hypothesis energy budget, is not proof that either one of those budgets is correct. We need to consider in detail the implications of those budgets for the physical world and what effect they would have on the physical world if they were correct; further, we need extraneous corroborating evidence in support of one or the other.

Unfortunately, in climate science, notwithstanding the vast sums thrown at it, little is known and even less understood. For practical purposes, the data sets are not fit for purpose (too short a period, collection issues, contamination, a failure to consider errors and uncertainties, bastardisation through countless adjustments whose need and correctness are unclear, etc.) and there is a lack of empirical observational evidence and experimentation. It is unfortunate that climate scientists repeatedly use averages, since the use of averages sheds very little light on the real world; one of the few things you can be reasonably sure about is that in the real world the average condition is rarely encountered. Models are not objective and can tell us little more than the assumptions pre-programmed into them as a result of the prejudices of the programmer, and it is difficult to see how the average of ‘crap’ is pure gold; the average of numerous sow’s ears does not produce a silk purse. The upshot of this is that we are all fumbling around in the dark, so for the most part it is difficult to know and understand what is going on. I consider myself a sceptic, by which I mean I am sceptical of the correctness and/or relevance of nearly all contentions in support of AGW, and sceptical of the correctness and/or relevance of nearly all contentions against AGW, and I have an open mind to be persuaded of the relevance and correctness of any issue, whether in support of AGW or against it. I am only interested in getting to the bottom of the issues and learning the truth. I want to know what is going on, and why, and how. I would have thought that that is a view you also share, so it is not constructive to get side-tracked with semantics unless those semantics have a bearing on the scientific principles involved.

I will revert later today (or tomorrow) with a list of questions germane to “Radiating the Oceans” on which I would appreciate receiving your views. It may well be that some of the questions cannot be answered due to the poor state of climate science, but even to home in on areas that cannot presently be properly/fully answered, say due to lack of research and/or experimentation, is in itself useful.

Willis, it is apparent that you are an intelligent man. It is also apparent that you have much experience with the sea. I would have thought that you might wish to put that intelligence and experience to good use and actually address head on some of the varied issues that arise should the ocean energy budget be as you claim in your Article “Radiating the Oceans”. In fact these issues would make a good topic for a follow up Article. Perhaps you could pen an article on “Radiating the Oceans, Part 2” in which you could take the science and debate forward.

209. Uzurbrain says:

Michael Moon says: February 27, 2013 at 9:05 pm
“Sea Bird Electronics makes commercial equipment. Maybe there is another customer, maybe there is not. I am not in that business.
If you suggest that there are no instruments accurate as these fellows offer for sale with guaranteed accuracy, I would ask, “Why not? “
————————-
I am not claiming that there are no instruments as accurate as SBE makes. I am claiming that all of the data I have been referred to (http://www.seabird.com/technical_references/LongtermTSstabilityAGUDec08Handout2Pages.pdf) AND quoted as the accuracy even on the ARGO web site (http://www.argo.ucsd.edu/FAQ.html#accurate) for the ocean buoys is GROSSLY misleading, and I would say on purpose. And therefore all of the data that everyone is taking and using concerning ocean temperature is very suspect. At the minimum it is at least an order of magnitude LESS accurate than the oceanographers claim.
The data sheets say what they say, and if the equipment were used in a controlled laboratory environment it would repeatedly deliver the numbers displayed on those sheets. However, they do not give you all of the numbers behind the accuracy professed. Yes, they can build equipment to do exactly what the data sheet describes, if you KNOW what that data sheet is telling you. The problem is that many Skeptics and most of the scientists do not know what is missing, and the scientists/technicians who do know are not telling you. That is probably why you do not understand my problem with the Sales Brochure. Here are the things that I see missing. (I do not classify any of this material as data sheets or calibration sheets; too much is missing. I will refer to it as a Sales Brochure.)
1. The Sales Brochure and calibration sheet (the Dec 08 handout) show calibration against a TTS (Temperature Transfer Standard), not against an actual triple point of water (TPW) bath and a gallium melt point bath, or even boiling water at standard temperature and pressure. A TTS is an ultra-high-accuracy resistor that can be used to provide the EXACT resistance of various temperature sensors (e.g. RTD, PRT, etc.). It does not “magically” produce the exact temperature (think oven or refrigerator) specified. The TTS is hooked up to the electronics with wires (calibrated leads); thus the temperature probe itself is not in the loop.
2. All of the data about accuracy is for the electronics only. PERIOD. It does not include, or even state, the accuracy of the sensors, pressure (depth) or temperature. They just claim the sensors are highly accurate and high speed – fast. (See #1.)
3. Also not included are the effects of ambient temperature on the equipment. All of the SBE equipment I have seen data sheets for will repeatedly and flawlessly provide the data professed on the data sheets when, and only WHEN, operated in a laboratory environment. They provide no data on what happens to the equipment or electronics under different ambient conditions. Read them again – it is not there. Real scientific instruments provide this data. Look through a Fisher Scientific, Omega Engineering, or other scientific instrument catalogue and the ambient data is given for the better equipment (or will be with a phone call). Why isn’t SBE providing this data?
4. Where is the data for the PRT sensor? Are they telling us that every one is exactly the same, providing exactly the same resistance, with exactly the same curve and the same readings at each of the ONLY two reference points at which this super-expensive boondoggle has been calibrated, as every other PRT they make? I have only one word for that claim – B…… Again, why is this data missing?
5. Also missing are the effects of the probe (the enclosure surrounding the “highly accurate” PRT temperature sensor). It is designed to withstand test depth plus some unspecified margin (not given in their “Sales Brochure”), or it would leak. That means the temperature of the ocean must be transferred through to the PRT, so there is a gap between the two surfaces. That gap causes four readily apparent things: 1. it decreases the speed of response; 2. it causes latency, because the probe body must also change temperature; 3. the biggest problem – the gap causes a difference in temperature; and 4. with cyclical pressure increases/decreases the gap will grow, aggravating the condition. (I have seen it happen many times.) This is the reason you calibrate such equipment with a real triple point bath, etc. But that is EXPENSIVE, VERY EXPENSIVE. I have done it. You could buy a house, or at least a very nice car, for the price of calibrating the entire set of sensors in each probe. That is why they use a TTS instead.
6. As explained earlier, all of the “Sales Brochures” indicate information and accuracy obtainable only under laboratory conditions (an environmentally controlled area with at least temperature, humidity, and pressure equal to the conditions of calibration, +/- a few degrees). Equipment like this (this expensive) can show a difference in displayed value of 1–2% when subjected to a temperature 100 degrees different from the factory calibration ambient. The ARGO probes are, from my understanding, subject to about a 50 degree F change from bottom to top of travel. That tells me you will get about a 1% error that they neither discuss nor deny. (By the way, for those reading this: do the surface thermometers have this same problem? The electronics are stuck in a small shelter that reaches the “measured” temperature. Does not look good to me.)
This is my problem. I know they can make accurate equipment; I have used it. I have no problem with the accuracies declared; it is that they have not declared everything. I know the limitations of this equipment, and everything I read tells me they are ignoring and hiding the limitations of using laboratory equipment in the field. Would you try to use a balance scale (like the ones you see in the halls of justice) on a small ship? If you say yes, would you let me cash your paycheck and pay you with gold measured on that scale? That is what ARGO is doing.

210. Jim G says:

Phobos says:

“How so?”

The actual size of the universe you are sampling dwarfs your number of sampling points. Your universe in this case could be considered semi-infinite, given all of the potential locations where instruments could be sited, and no amount of sampling of infinity yields any confidence interval whatsoever. Add to that the arguments regarding precision of instruments, siting, methodology, etc. Sample-size error, though questionable in itself in this instance, is only one source of potential error, so quoting the precision levels being claimed is highly questionable.

211. Willis Eschenbach says:

richard verney says:
February 28, 2013 at 4:40 am

Willis

I revert further to your comment at February 26, 2013 at 10:05 am which calls for my response.

Willis, your present Article, is on certainties and error margins in the measurement/assessment of ocean temperatures and hence the heat content of the oceans, and on that point I am fully with the general thrust of your observation. … blah, blah, blah, bunch of science …

Richard, my comment to you did not call for a scientific response.

You retract your lie, and we can move forwards. Here is my previous comment to you, since you have ignored it completely.

Richard, that is total and absolute bullshit. I give you the Lie Direct on that.

I even dedicated an entire post to the question of the absorption of IR. In that post I answered your damn questions, over and over and over.

The fact that you didn’t like my answers doesn’t entitle you to make false claims about whether I’ve answered, Richard. That’s underhanded and untrue, and I expect better of you.

See my post called Radiating the Oceans for a full discussion of this issue, including my answers to Richard that he’s trying to pretend never happened …

Richard, I expect an apology, or we’re done discussing forever. I won’t have dealings with a man who tells lies in an attempt to damage my reputation.

I still expect a retraction and an apology, Richard. You lied about my proven willingness to answer your often idiotic questions. I’ve done so over and over, and I will not have you lying to people about it.

w.

212. george e. smith says:

I’m really surprised at the lack of understanding of SAMPLING. When I see people using terms like “statistical sampling” it makes the hair on my neck stand on end – as if just taking lots of “samples” and applying statistical mathematics were of any value.

Sampling is at the center of all modern communications. Fiber optic transmission systems work at such high capacity because no single signal has to occupy the whole fiber. A continuous (analog) signal can be sampled at regular intervals in such a way that the whole signal can be represented by just the (near zero length) samples, which means there are long gaps between one sample and the next. Those gaps are a perfect place to put samples of a completely different signal, or even hundreds of different signals. Clever circuitry allows one to sort out the samples and send those for any one signal to a place where the entire original analog signal can be reconstructed.

The rules are quite simple. The signal must be “band limited”, meaning there must be some cutoff frequency beyond which the signal carries no information. All real signals are band limited, but one may have to curtail the band limit (B) to some organized value. The next rule is the important one: the samples must be taken at least as often as one sample for each half wavelength at the band-limit frequency. So, for example, an audio telephone signal limited to say 4 kHz must be sampled at least as fast as once every 125 microseconds.
This rule is known as the Nyquist sampling theorem. If you obey those rules, you can in principle exactly reconstruct the complete continuous signal.

If you violate that rule, and have a signal at a frequency of B+b while sampling at a frequency 2B, you will find that the reconstructed message is bound to contain signal components at frequency B−b.

Well, so what? Well, B−b is a frequency within the signal bandwidth B that was NOT present in the original signal, so it is an error or noise component that cannot be removed by any means without also removing components of the real signal. No amount of statistical prestidigitation or central limit theorems can buy you a reprieve from a violation of the Nyquist criterion; the message is corrupted forever.
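[This folding is easy to verify numerically. A minimal sketch with illustrative numbers only – a 4 kHz band limit and an offending component 500 Hz above it, nothing taken from any real data system:]

```python
import math

B = 4000.0   # band limit, Hz
b = 500.0    # how far the offending component lies above the band limit, Hz
fs = 2 * B   # sampling at exactly the Nyquist rate for band limit B

# Sample a cosine at B + b (outside the band) and one at B - b (inside it)
n_samples = 16
high = [math.cos(2 * math.pi * (B + b) * n / fs) for n in range(n_samples)]
low  = [math.cos(2 * math.pi * (B - b) * n / fs) for n in range(n_samples)]

# The two sample sequences are identical: once sampled at 2B, the B + b
# component is indistinguishable from a spurious in-band B - b component.
max_diff = max(abs(h - l) for h, l in zip(high, low))
print(max_diff)  # effectively zero (floating-point round-off only)
```

Since the sampled sequences match exactly, no later processing can tell the alias apart from a genuine in-band signal, which is the point made above.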

Well, you may say, in climate or weather data sampling we really don’t need to reconstruct the original continuous function – say a continuous local temperature record; it is sufficient to calculate, say, the average temperature over a 24-hour day.

Well, we still have a problem: that erroneous noise component has a frequency B−b, for a signal outside the band limit by (b). So what if (b) = (B), meaning the signal has a component at the sampling rate (2B)? The reconstruction noise component is now at B−B = 0 – zero frequency.
And the zero-frequency component of a signal is just the average value of that signal over a complete cycle.

So if you sample a temperature just twice a day, and that temperature has a 24-hour fundamental frequency component plus some higher frequency components (because the daily temperature cycle is not a pure sinusoid), then you can’t even recover the average value of the temperature over the day.

So now if you throw in spatial variations of the temperature over, say, the whole of planet Earth, then you had better not have any cyclic changes in temperature over distances less than twice the spacing of your measurement (sampling) stations.

Hansen seems to think that 1200 km is a perfectly good distance to be represented by a single temperature measured at one point. So what about weather in the SF Bay region, where temperatures can vary wildly over distances of a few km?

Well I don’t want to belabor the point, but the bottom line is, that BEFORE you apply your statistical magic to any set of numbers, you better be sure that each of those numbers is indeed a valid sample of the quantity you want to study. Otherwise, you are simply applying statistics to completely meaningless noise components, which don’t represent anything useful.

Now in the case of Argo buoys that are ducking and diving, you certainly are seeing what they read as they rise to the surface, but the local water is changing with ocean current meanderings, and two nearby buoys, can be in quite different water streams.

At certain times in tidal flows inside San Francisco bay, you can lean over the side of your fishing boat and put one hand in each of two completely different water bodies; one can be muddy brown coming down from upstream rivers, and the other the normal not-too-clear water of the bay.

Statistics doesn’t correct for improper sampling procedures.

213. Uzuubrain,

Do any of these issues induce a systemic error? It does not look like it to me. The ascent from 1000 m to the surface takes 10 hours, which should be enough to equalize temperatures pretty closely at each reading, and in any event would not bias the results.

PRT’s are extremely accurate – quote marks or no quote marks, they are.

Maybe you should consult for Sea Bird?

214. Uzurbrain says:

Michael R. Moon says: February 28, 2013 at 10:45 am

” Uzuubrain, Do any of these issues induce a systemic error? ”

YES – That is what I am trying to explain to you.
The data sent by these instruments will be “systematically” reported as approximately 0.1% high (or low, depending on whether the temperature coefficient of the electronics is positive or negative) when the buoy is at an ambient temperature of 85 degrees F, and about 0.5% off in the opposite direction when the buoy is at an ambient temperature of 30 degrees F. This is a pure guess, based upon actual equipment whose data sheets do state the effect of changes in ambient temperature.

Think of it this way: you have a micrometer, scientifically calibrated to be within 0.001% accuracy at 20 degrees C (°C). The attached data sheet states that the reading grows by 0.001 for each °C above 20 °C (the calibration temperature) and shrinks by the same amount for each °C below it, between minus 50 and plus 150 °C. That means that when you measure something at an ambient temperature of 120 °C, the reading will be off by 0.100 +/- 0.001 – not by just 0.001, as the data sheets I keep being referred to would have you believe. If you can’t or don’t understand this, get a new line of business/endeavor.
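[That micrometer arithmetic fits in a few lines. The coefficient and calibration point below are the hypothetical data-sheet numbers from the analogy above, not figures from any real SBE or ARGO specification:]

```python
def ambient_drift_error(ambient_c, cal_temp_c=20.0, coeff_per_c=0.001):
    """Reading offset due to ambient temperature alone, for an
    instrument with a linear ambient-temperature coefficient."""
    return coeff_per_c * (ambient_c - cal_temp_c)

# At the 120 C ambient in the example, the drift term is 100x larger
# than the quoted 0.001 calibration accuracy.
print(ambient_drift_error(120.0))  # 0.1
print(ambient_drift_error(20.0))   # 0.0 - zero only at the calibration point
```

The point of the analogy survives intact: a calibration-bench accuracy figure says nothing about the error once the instrument operates far from its calibration temperature.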

215. usurbrain says:

Michael R. Moon says: February 28, 2013 at 10:45 am

” Uzuubrain, PRT’s are extremely accurate quote marks or no quote marks, they are.”

I have worked with PRT’s and know EXACTLY how accurate they are. Do you? I also know that you MUST know their EXACT ACCURACY to use them; otherwise they are not “extremely accurate”, they are worthless! To get that “extremely accurate” accuracy you must have the calibration curve, and the data detailing the deviation from the nominal curve. That is provided by the manufacturer. SBE simply states in their sales brochure that they are “extremely accurate” – without the quotes. The quotes are mine, because that is all SBE states about the accuracy of the PRT’s used. If you know the values, tell me what they are.

216. So what you are telling us is, above 85 F and below 30 F, the instruments are not as accurate as inside that range. Above 85 is virtually never, and below 30 is impossible unless under the ice where they don’t go anyway. Were you involved in this project somehow? Did your relationship with some supplier end? Is this constructive?

217. Theo Goodwin says:

Michael R. Moon says:
February 28, 2013 at 12:19 pm
“So what you are telling us is, above 85 F and below 30 F, the instruments are not as accurate as inside that range. Above 85 is virtually never, and below 30 is impossible unless under the ice where they don’t go anyway.”

Aren’t some of them near the equator, and don’t all of them surface (into sunlight) part of the time? How is 85 “virtually never”? Don’t we need a plot of this activity?

218. Theo Goodwin says:

george e. smith says:
February 28, 2013 at 10:30 am

“Well I don’t want to belabor the point, but the bottom line is, that BEFORE you apply your statistical magic to any set of numbers, you better be sure that each of those numbers is indeed a valid sample of the quantity you want to study. Otherwise, you are simply applying statistics to completely meaningless noise components, which don’t represent anything useful.

Now in the case of Argo buoys that are ducking and diving, you certainly are seeing what they read as they rise to the surface, but the local water is changing with ocean current meanderings, and two nearby buoys, can be in quite different water streams.

At certain times in tidal flows inside San Francisco bay, you can lean over the side of your fishing boat and put one hand in each of two completely different water bodies; one can be muddy brown coming down from upstream rivers, and the other the normal not-too-clear water of the bay.

Statistics doesn’t correct for improper sampling procedures.”

Thank you for this clear, concise, and darn near eloquent statement of the fundamental problem with making assumptions about what is being sampled. I think one problem that really handicaps Alarmists is that so few of them have actually been in a place like San Francisco Bay and discovered adjoining but distinct bodies of water. None of them seems to have any interest in, experience of, or instinct for empirical research.

219. Theo Goodwin says:

Michael Moon says:
February 27, 2013 at 9:05 pm
“Sea Bird Electronics makes commercial equipment. Maybe there is another customer, maybe there is not. I am not in that business.

If you suggest that there are no instruments accurate as these fellows offer for sale with guaranteed accuracy, I would ask, “Why not? Clearly the technology exists.” If they made some error, buy one, prove the error, and sue for your losses.”

Given this comment, I would say that your experience working with instruments of this sort is zero.

220. usurbrain says:

Michael R. Moon says: February 28, 2013 at 12:19 pm
” So what you are telling us is, above 85 F and below 30 F, the instruments are not as accurate as inside that range. … ”

At and around 20 °C / 68 °F the instrument will be fairly close to the stated accuracy (if they had included the accuracy of the PRT in their data – one of my points is that this data is missing). It is not a step change at 85 and 30. Normally, though, you can draw a straight line, with a slope equal to the ambient-temperature coefficient, through the temperature at which the instrument was calibrated (as you do when interpolating logarithms) and use this line to express how far off it will be at temperatures other than the calibration point.
I was not involved with this project, or I could have gotten REAL data sheets. I was involved with the design, use, and operation of highly accurate instrumentation, have several designs in use in the industry, and am now retired. My point is, as I said in a much earlier comment, that SBE is snowing us. The equipment will work great in the lab; they have not provided data detailing how it will work in the real world, nor the data needed to make intelligent use of the measurements their equipment provides and ARGO relies upon. Look at the ARGO web page http://www.argo.ucsd.edu/FAQ.html – ARGO parrots the SBE accuracy, and they claim the whole buoy costs only $15,000. I have worked with instruments costing 4–5 times that which measure only one parameter, and those instruments include all of the data in the specification sheet – especially the effects of ambient temperature on the instrument. I smell a fish.

I don’t quite get what all the fuss is about here. Theo Goodwin has made an unkind remark – to what purpose? It is true, I never bought a thermometer accurate to 0.001 C with a drift of 0.0002 C/month; so what? I have worked with plenty of accurate instruments; most engineers have. Uzurbrain has retired from this business and takes issue with the procedures followed by the ARGO team. Calibrating an instrument that reads temperature as resistance, by measuring its resistance, is doomed to failure? The properties of platinum are well understood. Machining within simple tolerances to match the design properties will give an accurate and precise (I know the difference, do you?) set of readings. The ARGO readings are junk? I am not sold.

Water temperatures over 85 F must be a small, if not vanishingly small, sub-set of the overall readings, and even then the accuracy is respectable. I think Uzurbrain should take this up with Sea Bird, if he has not already done so and been rejected.

222. Uzurbrain says:

Michael Moon says: February 27, 2013 at 9:05 pm
“This is 2013. There is a lot of amazing equipment available. It does not change the Skeptic meme at all, only in use since 1999 with a few years of errors, a lot like the microwave satellites.”

Microwave satellites have been in use since at least 1965; I used them to communicate with Washington DC from Hawaii back then. They have learned a lot since. And speaking of satellites, look into their “temperature control system”. They, and the space probes, NEED – would not function without – a system to keep the control and MEASURING equipment within the bounds specified by the design of the equipment, usually within a few degrees of the calibration point. Put your cell phone in a deep freezer (about −10 to −20 °F; that is only 50 degrees away from the 30 you said the probes rarely reach) and see how well it works immediately upon removal. ** CAUTION ** Do this with a throwaway phone, and do not blame me if it never works again – it probably never will. ** CAUTION **

Do you think the air-conditioning on the ISS is there for the astronauts? The equipment needs are determined first – or the ISS doesn’t work. The same is true of submarines. I was told many times that the AC is for the equipment, not you. The vents and ducts were configured to cool the equipment; we got what was left over, and even that was shut off when we ran on reduced power. A simple Google search for “Space Probe Temperature Control” and/or “Space Station Temperature Control” will provide a wealth of information backing up my concerns about the effect of temperature on equipment operating in harsh environments, which is also applicable to the ARGO buoys.

Also, reading again about the operation of the ARGO buoys, it seems they stay at the bottom of their dive in an essentially sleeping condition for 10 days. That makes me think just about all of the electronics will be at 30–40 °F, which raises even more concerns. They need a temperature control system, or they are useless. How could they possibly get enough of a solar charge in 24 hours to keep warm for ten days? I smell more fish.

223. Theo Goodwin says:

Michael Moon says:
February 28, 2013 at 3:40 pm
“I don’t quite get what all the fuss is about here. Theo Goodwin has made an unkind remark to what purpose? It is true, I never bought a thermometer accurate to 0.001 C with a drift of 0.0002 C/month, so what?”

I did not mean to put you down, or to question your experience generally. I meant to say that you have no experience with this particular set of instruments. For example, if one surfaces near the equator, it will exceed 85 degrees Fahrenheit.

224. usurbrain says:

@Michael Moon
Isotech is one of the leading, if not the best, manufacturers of high-precision instrumentation. They make the equipment used to calibrate things like the SBE equipment, so that SBE can brag about how good it is. ARGO is probably even using Isotech equipment. I have found some of their equipment sheets on the internet. Here is a link for one of their better high-precision thermometers (~$5000):
http://www.isotechna.com/v/vspfiles/pdf_datasheets/isotech/millik.pdf
On the 3rd page, under “Operating Conditions”, you will see the equipment operating range: 0–45 C / 32–113 F. Outside this temperature range, proper operation is not guaranteed. The second part of the Operating Conditions states “Full Specification”: 15–30 C / 50–85 F. That means the advertised accuracy is only valid inside that range. This data is NOT listed for the SBE equipment. SBE has also neglected to tell you what the probe is.
Here is the page for the best advertised Isotech high-accuracy probe (~$500). It is better than the industry standard as described in the ISA handbook, and would be used to calibrate an industry-standard probe (a working/process probe).
http://www.isotechna.com/v/vspfiles/pdf_datasheets/asl/T100-Series.pdf
On the 2nd page, under “Calibrated Probe Accuracy”, at 0.01 C and 30 C you will see the number 0.009. That means their probe can be off, or incorrect, by 0.009 °C (about 0.016 °F). This data is missing from any SBE or ARGO material I can find anywhere on the net; all they give is the accuracy of the electronics. If ARGO is using probes like these, then to get the REAL accuracy you need to ADD the probe inaccuracy for 0 to 30 C to the specified electronics inaccuracy: 0.001 + 0.009 = 0.010. And that is what I have been trying to tell you for three days.
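[The addition described is a worst-case error budget: probe and electronics uncertainties simply summed. A sketch using the figures quoted above (0.001 C electronics, 0.009 C probe – one reading of the Isotech sheet, not confirmed ARGO numbers), with the root-sum-square combination commonly used for independent errors shown for comparison:]

```python
import math

electronics_c = 0.001  # quoted electronics accuracy, deg C
probe_c = 0.009        # calibrated-probe accuracy at 30 C, deg C

# Worst case: both errors at their limits, in the same direction
worst_case = electronics_c + probe_c

# Root-sum-square: the usual combination if the errors are independent
rss = math.sqrt(electronics_c**2 + probe_c**2)

print(round(worst_case, 4))  # 0.01
print(round(rss, 4))         # 0.0091 - still ~9x the electronics-only figure
```

Either way of combining the terms, the probe uncertainty dominates, so quoting the electronics figure alone understates the system accuracy by roughly an order of magnitude.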

You two learned gentlemen should take your critique of the ARGO data QC procedure to them, examine their procedure in detail, and get back to us. 3200 probes? Someone spent some money on this; did they get their money’s worth?

226. Uzurbrain says:

@Michael Moon
I hope this discussion has helped you look more analytically at the statements being made about Global Warming. I thought it was true 15 years ago. However, as I looked into some of the claims, I found that essentially every one of them had hidden or missing data. Eventually I came to the conclusion that the whole AGW theory was more like a Michael Crichton novel. Not knowing his writing style back in the 70’s, I was convinced that one of his early books was fact/history/government secret instead of fiction. I spent many hours trying to prove his “account”, and was able to verify a large portion of it; with the help of some of my government access I was able to verify some things that were not available to the public. I was convinced that his account was true – “with the names changed to protect the innocent.” I was dumbfounded when, 10 or 15 years later, I learned that most of his “fiction” uses 99.9% verifiable, believable facts woven into a pseudo-historical novel. This is what I believe the AGW crowd has done. They provide us with reams and reams, terabytes and terabytes, of data, all verifiable and repeatable – but they are missing the 0.1% that would make it absolutely true. An instrument with an accuracy/precision of 0.001% is touted over and over, but they do not mention that it is not suitable for making the measurements they are making. They hide the data that would let an informed individual determine that it is being used in an inappropriate manner, and even hide the accuracy of the complete system.

A good analogy is the numbers given for stereo equipment. Years ago they would tell you a stereo had a frequency response of 1 Hz to 100 kHz. It sounded good in the showroom and you bought it. However, they did not tell you that the harmonic distortion was greater than 2–3% for frequencies below 60 Hz and above 10 kHz, nor that the best available speakers rolled off at 3 dB per octave below 100 Hz. Would you want to listen to that? It works great for speech and AM radio. That is what SBE is doing, and what 90% of the data collectors and instruments providing AGW data are doing – you just have to know what to look for, and not fall into the trap of believing that Michael Crichton was writing history or government secrets. As I recall, his last book was supposedly going to explode the myth about AGW. I followed his writing on AGW back when he was posting it on the web.

Take Care.

227. richard verney says:

Further to: Willis Eschenbach says: February 28, 2013 at 10:24 am
………………………………………………………………………………………………………………………………….
Willis

At times, it would appear that precision in the use of language is not your strongest talent. An objective observer of these exchanges may well conclude the offence you claim to have sustained from my comment (February 26, 2013 at 2:04 am) to be feigned, and your response thereto to be petty and/or immature and/or to make you look ridiculous.

At times, it would appear that you misconstrue what is said. Whether this is deliberate or unintentional, I do not know. I do gain the impression (from quite a number of your comments) that you like engaging in games with people, and often a plank in this game is a misconstruction of what has been said to you.

You state: “I still expect a retraction and an apology, Richard. You lied about my proven willingness to answer your often idiotic questions.” However, I have not said that you have ignored my comments, nor that you have been unwilling to respond to them, nor that you have offered no explanation whatsoever. I have merely said that you have not explained various issues.

There is, as a matter of the English language (which I understand to be the language practised in the USA), a difference between ‘offering no explanation whatsoever’ and ‘having not explained’. Objectively, it is possible for someone to offer an explanation; but if, for example, that explanation is incomplete, or is wrong, then it has not explained the issue/question/matter. You are confusing, as a matter of English, ‘not having explained’ with ‘not having offered an explanation’. They are quite different concepts. The difference is quite elementary, and I am surprised that time has to be wasted pointing it out.

Let me give you an example of this difference that may help you understand the meaning of these words/expressions. Say I read an article about the geology of some moon rock collected by Neil Armstrong when he walked on the moon. Being the idiot that you imply I am, I did not know, nor could I understand, how Neil Armstrong got to the moon, walked around, collected some rock and brought it back to Earth. Fortunately for me, I have this good friend called Willis whom I meet from time to time in the pub. One night after a heavy boozing (drinking) session, I ask him whether he can explain how Neil Armstrong got the moon rock and was able to get it back to Earth. Willis explained to me that ‘Neil Armstrong has a trampoline in his garden which is positioned under a tall tree. One day Neil Armstrong went into his garage and picked up his son’s plastic bucket and spade. He then climbed the tall tree and jumped onto the trampoline from a great height. The trampoline went boingngngng and catapulted him all the way to the moon. Neil Armstrong then used the spade to dig up some rock, which he put in the bucket. Then Neil Armstrong jumped up. The moon has far less gravity than Earth, and because of this Neil Armstrong jumped so high that he was able to jump all the way back to Earth clutching the bucket. Fortunately, Neil Armstrong landed safely in the sea, swam ashore, and used the bucket and spade to make a pretty sand castle, which he decorated with the moon rock he had collected.’ Now, my good mate Willis has offered an explanation as to how the moon rock was collected by Neil Armstrong and brought back to Earth, but he has not explained how it was done. If a couple of days later I meet up with my good mate Willis and say to him that he has not explained how the moon rock was collected, this is not a lie, since although Willis has proffered an explanation, he has not in practice explained matters.

You accuse me of lying. This is a slanderous accusation. However, I am man enough not to take offence, since it is apparent that your incorrect assertion is based upon your misconstruction of the English language, and I do not expect a scientist to be a master of language (although precision of language is important when describing a scientific principle).

You further state: “You lied about my proven willingness to answer your often idiotic questions.” Leaving aside for one moment the assertion that I “lied” (which assertion is incorrect for the reasons detailed above), the remainder of your assertion borders on the slanderous, since it carries the inference that the author of the ‘idiotic questions’ is himself idiotic – the most probable explanation for someone asking an idiotic question being that they are idiotic. I suspect you had that inference in mind when you chose the language that you used. If I were to adopt a petty attitude, I would call you out on that: identify each and every one of the idiotic questions, explain why you consider each to be idiotic, and, in the absence of a reasonable explanation from you justifying your assertion, demand a retraction and an apology from you. However, I am not so petty.

Now if we can both be man enough to move on and not waste time with petty exchanges of this ilk, and instead concentrate upon addressing the science, I will revert with some questions that I consider to be relevant. You will be at complete liberty to point to any question that you consider idiotic, explaining at the same time the reasons behind your holding that view, but hopefully you will be more constructive, and perhaps we can narrow some of the issues and home in on the importance of any scientific matter that may arise. As I said earlier, you might like to consider an article on 'radiating the oceans, part 2' in which you consider the wider issues that arise, and perhaps incorporate into such an article some of the points that you have made when considering the ARGO data.

Let me know whether you are now willing to move on and address the science, and I shall revert with some relevant questions/issues. If you are not so inclined, I am not going to waste further time addressing petty semantics.

228. @Richard Verney and Willis: Richard offers intellectual responses reminiscent of "As the World Turns", including quite eloquent prose. Many of the insults within the prose involve lengthy analogies, which I find demeaning. There is a troubling sardonic tone and mincing of words, with some olive branches seemingly asking for a truce. Truthfully, I am not sure what you really want to accomplish with regard to a good hashing of the science here between you and Willis. To me, you should soften things with an apology and admit you were trying to hurt. To say it another way, there is no reason to make statements that border on slander because you feel Willis did not answer your questions as you would have expected or liked. It does seem you are lashing out in anger about them and needed to vent, but I'd rather hear you and Willis explain and discuss this neat science and physics stuff.

229. Michael Moon says: "They use a Platinum Resistance Thermometer, a very accurate thermocouple."
++++
Not to nitpick, and the spirit of what you are saying is true, but I've found many engineers use the words "thermocouple" and "temperature sensor" as one and the same. I thought this was a good place to explain the important difference.

They often don't realize that there are significant differences between sensor technologies. The platinum-based sensors are a type of RTD (resistance temperature detector), and they are very accurate for a number of reasons. Thermocouples, by contrast, work by measuring the voltage generated between two different metals, which varies across a range of temperatures. There is also a voltage generated at the cold junction (where the ends of the leads are connected), so that junction's temperature needs to be known and compensated for. If it is not, any variation in the cold-junction temperature will shift the reading by the same delta T. Most engineers in the solar hot water industry that I've met don't even know this, and it has led them to drive a lot of sensor engineers and controls engineers crazy!
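The cold-junction compensation described above can be sketched in a few lines. This is a hedged illustration only: it assumes a roughly linear Seebeck coefficient of about 41 µV/°C (in the ballpark for a type K thermocouple near room temperature), whereas real instruments use full NIST polynomial tables. The function name and numbers are mine, not from the comment.

```python
# Hedged sketch: cold-junction compensation for a thermocouple reading.
# Assumes a linear Seebeck coefficient (~41 uV/degC, roughly type K near
# room temperature); real instruments use NIST polynomial tables instead.

SEEBECK_UV_PER_C = 41.0  # approximate sensitivity, microvolts per degC

def thermocouple_temp_c(measured_uv, cold_junction_c):
    """Convert a measured thermocouple voltage (in microvolts) to the
    hot-junction temperature in degC, compensating for the cold junction."""
    # The voltage reflects the temperature *difference* between the hot
    # and cold junctions, so the cold-junction temperature is added back.
    return measured_uv / SEEBECK_UV_PER_C + cold_junction_c

# Example: 2050 uV measured while the cold junction sits at 25 degC
print(round(thermocouple_temp_c(2050.0, 25.0), 1))  # 75.0 (50 degC delta + 25)
```

The key point the comment makes is visible here: if the cold-junction term is omitted or wrong, the reported temperature is off by exactly that amount.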

Thermocouples are cheap, easy to make, reliable, and rugged. They are best used in applications that would harm other sensors; we used to use them to measure molten steel. They were good for a few dips before they'd burn up!

RTDs operate with a very low current, and the voltage measured across the sensor element is used to deduce its resistance, which correlates with temperature via a precise curve based on several constants. The idea is to avoid heating the sensor with the excitation current, so as not to change the temperature of what you are trying to read!
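The "precise temperature curve based on several constants" for platinum RTDs is the Callendar-Van Dusen equation. As a hedged sketch (not anything from the comment itself), here is the standard IEC 60751 form for temperatures above 0 °C, inverted to recover temperature from a resistance reading; `r0` is the 0 °C resistance, 100 Ω for a Pt100 or 1000 Ω for a Pt1000:

```python
import math

# Hedged sketch: Callendar-Van Dusen conversion for a platinum RTD,
# valid for T >= 0 degC. Constants are the standard IEC 60751 values.
A = 3.9083e-3
B = -5.775e-7

def rtd_temp_c(resistance_ohm, r0=100.0):
    """Invert R(T) = r0 * (1 + A*T + B*T**2) for T (degC, T >= 0)."""
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - resistance_ohm / r0))) / (2.0 * B)

# A Pt100 reading 138.5055 ohms corresponds to about 100 degC
print(round(rtd_temp_c(138.5055), 2))  # ~100.0
```

Note how small the excitation needs to be: at 1 mA through a Pt100, the sensor dissipates only about 0.1 mW, which keeps self-heating negligible, exactly the point made above.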

Anyway, the best sensors also measure the resistance of the lead wires in the circuit, so they use 3 or 4 wires instead of just 2. By measuring the lead-wire resistance, you can more accurately measure the resistance of the sensor element itself. The newer 1000 ohm RTDs are less sensitive to lead-wire resistance, so for most applications the leads' resistance is negligible.
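A quick back-of-the-envelope calculation shows why the Pt1000 tolerates lead resistance so much better, as the comment says. This is an illustration under assumed numbers: the 0.5 Ω round-trip lead resistance is a value I picked for the example, and the slope uses the near-0 °C sensitivity from the standard IEC 60751 "A" coefficient.

```python
# Hedged illustration: temperature error from uncompensated lead-wire
# resistance in a 2-wire RTD hookup. The 0.5-ohm lead figure is an
# assumed example value, not a measurement from the discussion.

ALPHA = 3.9083e-3  # per-degC fractional sensitivity near 0 degC (IEC 60751 "A")

def lead_error_c(lead_ohm, r0):
    """Approximate error (degC) from lead resistance, using the
    near-0 degC slope dR/dT ~= r0 * ALPHA ohms per degC."""
    return lead_ohm / (r0 * ALPHA)

print(round(lead_error_c(0.5, 100.0), 2))   # Pt100:  ~1.28 degC error
print(round(lead_error_c(0.5, 1000.0), 2))  # Pt1000: ~0.13 degC error
```

The same half-ohm of copper that would bias a Pt100 by over a degree barely registers on a Pt1000, which is why 2-wire hookups are often acceptable with the 1000 ohm parts while a Pt100 really wants the 3- or 4-wire scheme.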