New study from Scripps puts a crimp on claims of recent rising ocean temperatures

This is interesting, and revealing. Using a new method based on krypton and xenon ratios in Antarctic ice cores, researchers estimated a mean ocean temperature rise of just 0.1°C over the last 50 years, well below many other estimates of ocean temperature increase. For comparison, the same method shows that mean global ocean temperature increased by 2.57 ± 0.24 degrees Celsius over the last glacial transition (20,000 to 10,000 years ago).

From UCSD Scripps:

New Study Identifies Thermometer for Global Ocean

Researchers now able to reconstruct past ocean temperatures

There is a new way to measure the average temperature of the ocean thanks to researchers at Scripps Institution of Oceanography at the University of California San Diego. In an article published in the Jan. 4, 2018, issue of the journal Nature, geoscientist Jeff Severinghaus and colleagues at Scripps Oceanography and institutions in Switzerland and Japan detailed their ground-breaking approach.

Determining changes in the average temperature of the entire world’s ocean has proven to be a nearly impossible task due to the distribution of different water masses. Each layer of water can have drastically different temperatures, so determining the average over the entirety of the ocean’s surface and depths presents a challenge.

Severinghaus and colleagues were able to bypass these obstacles by determining the value indirectly. Instead of measuring water temperature, they determined the ratios of noble gases in the atmosphere, which are directly related to the ocean’s temperature.

“This method is a radically new way to measure change in total ocean heat,” said Severinghaus. “It takes advantage of the fact that the atmosphere is well-mixed, so a single measurement anywhere in the world can give you the answer.”

In the study, the scientists measured values of the noble gases argon, krypton, and xenon in air bubbles captured inside ice in Antarctica. As the oceans warm, krypton and xenon are released into the atmosphere in known quantities. The ratio of these gases in the atmosphere therefore allows for the calculation of average global ocean temperature.
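
The inversion step can be sketched in a few lines of Python (illustrative only: the sensitivity constants below are hypothetical placeholders, not calibrated values; the actual study inverts measured solubility functions through a four-box ocean model, shown in Figure 1 of the paper):

```python
# Minimal sketch of the noble-gas inversion, NOT the paper's model.
# Warmer water holds less gas, so atmospheric Kr/N2 and Xe/N2 ratios
# rise as mean ocean temperature (MOT) rises. Xenon is more soluble,
# hence more temperature-sensitive, than krypton.
# The sensitivities below are HYPOTHETICAL placeholders.
SENS = {"Kr/N2": 0.5, "Xe/N2": 1.0}  # per mil change per deg C of MOT (assumed)

def mot_anomaly(ratio_anomalies: dict) -> float:
    """Least-squares MOT anomaly (deg C) from measured ratio anomalies
    (per mil vs. today), under the assumed linear sensitivities."""
    num = sum(SENS[g] * d for g, d in ratio_anomalies.items())
    den = sum(SENS[g] ** 2 for g in ratio_anomalies)
    return num / den

# Glacial-age air with depleted heavy noble gases implies a colder ocean:
print(mot_anomaly({"Kr/N2": -1.3, "Xe/N2": -2.6}))  # -> about -2.6 deg C
```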

Measurements were taken from ice samples collected during the West Antarctic Ice Sheet (WAIS) Divide coring project, of which Severinghaus is a leader. Over the course of six field seasons in Antarctica, a drill removed ice in cylindrical samples 2.7 meters (just under 9 feet) in length. The final sample was taken at a depth of 3,405 meters (over 11,000 feet) in 2011. This record spans nearly 100,000 years and the age of the layers can be determined to within 50 years. Earth’s atmosphere mixes on a scale of weeks to months, so a measurement of these air bubbles gives what is essentially a global average. For this study, scientists focused on samples 8,000 to 22,000 years old, and collected data in increments averaging 250 years in resolution.

New insights into the glaciation cycles that occurred on Earth long before humans began affecting the temperature of the atmosphere and oceans are now possible using the technique of measuring noble gas quantities. The study determined that the average global ocean temperature at the peak of the most recent ice age was 0.9 ºC (33.6 ºF). The modern ocean’s average temperature is 3.5 ºC (38.3 ºF). The incremental measurements between these data points provide an understanding of the global climate never before possible.

“The reason this study is so exciting is that previous methods of reconstructing ocean heat content have very large age uncertainties, [which] smooths out the more subtle features of the record,” said co-author Sarah Shackleton, a graduate student in the Severinghaus lab at Scripps. “Because WAIS Divide is so well dated, this is the first time that we’ve been able to see these subtle features in the record of the deglaciation. This helps us better understand the processes that control changes in ocean heat content.”

This paper is the result of fifteen years of work for Severinghaus, along with graduate students and postdoctoral scholars in his lab. Discussions with another professor at Scripps, atmospheric scientist Ralph Keeling, brought about the idea. Keeling studies the argon levels in the atmosphere to get a similar record of ocean heat going back a few decades. However, air bubbles trapped in ice don’t preserve argon levels accurately. Severinghaus discovered that xenon and krypton are well preserved in ice cores, which provides the temperature information that can then be used by scientists studying many other aspects of the earth’s oceans and atmosphere over hundreds of thousands of years.

Going forward, the ratios of these same noble gases can be determined from atmospheric samples taken anywhere in the world. For example, a measurement from the Ellen Browning Scripps Memorial Pier in La Jolla represents a global average of ocean temperature. Severinghaus hopes to fine-tune the procedure.

“Our precision is about 0.2 ºC (0.4 ºF) now, and the warming of the past 50 years is only about 0.1 ºC,” he said, adding that advanced equipment can provide more precise measurements, allowing scientists to use this technique to track the current warming trend in the world’s oceans.

With this study, Severinghaus and colleagues have shown that measurements of noble gases in the atmosphere provide the historical record long sought by the scientific community, and can be further optimized to gain insights into modern ocean temperature changes as well.

This research was supported by the National Science Foundation (grant numbers 05-38630 and 09-44343), and the Swiss National Science Foundation.


The paper: Mean global ocean temperatures during the last glacial transition

Abstract:

Little is known about the ocean temperature’s long-term response to climate perturbations owing to limited observations and a lack of robust reconstructions. Although most of the anthropogenic heat added to the climate system has been taken up by the ocean up until now, its role in a century and beyond is uncertain. Here, using noble gases trapped in ice cores, we show that the mean global ocean temperature increased by 2.57 ± 0.24 degrees Celsius over the last glacial transition (20,000 to 10,000 years ago). Our reconstruction provides unprecedented precision and temporal resolution for the integrated global ocean, in contrast to the depth-, region-, organism- and season-specific estimates provided by other methods. We find that the mean global ocean temperature is closely correlated with Antarctic temperature and has no lead or lag with atmospheric CO2, thereby confirming the important role of Southern Hemisphere climate in global climate trends. We also reveal an enigmatic 700-year warming during the early Younger Dryas period (about 12,000 years ago) that surpasses estimates of modern ocean heat uptake.

https://www.nature.com/articles/nature25152

Reconstructing past ocean temperatures

Many techniques exist to reconstruct past ocean temperatures. The majority of these approaches, however, can be used to study only specific depths or seasons, or are based on complicated and poorly understood biological processes. Bernhard Bereiter and colleagues use noble gases in ice cores to build a high-resolution reconstruction of mean ocean temperature from the Last Glacial Maximum to the early Holocene. They find an overall ocean warming of about 2.5 ℃ over this period, which is closely correlated with variations in Antarctic temperature. A dramatic ocean warming exceeding that of the modern era occurred during the Younger Dryas period—a time of sharp cooling over much of the high-latitude Northern Hemisphere land mass.

Figure 1: Schematic of the four-box model used to derive MOT, including the modern (‘Today’) and LGM characteristics of the boxes.
Figure 2: Mean Ocean Temperature records relative to today derived from three different atmospheric noble gas ratios and their mixture.
Figure 3: Comparison of our best-estimate MOT record with other palaeoclimatic records for the last glacial transition.

104 Comments

Tom13 - the non climate scientist
January 4, 2018 6:08 am

Take a look at how SkepticalScience is presenting the rapid warming of the oceans

https://www.skepticalscience.com/usgcrp-oceans-buffering-climate.html

As an aside – it is baffling how the ocean heat content can be determined when the margin of error of the instruments measuring the alleged change is 50-100x the amount of the change.

The Argo system is a great improvement, but the margin of error is huuuuge

Jeff Cagle
Reply to  Tom13 - the non climate scientist
January 4, 2018 6:32 am

The baffling paradox is resolved by considering the difference between the error in a single measurement and the error of the mean.

If I have 100 measurements of the same temp taken by the same thermometer, and those measurements show standard deviation of 0.1C, the mean of those measurements will have an error of only 0.1 / sqrt(100) = 0.01.

If you construct a confidence interval (say, 95%) around the individual measurement, you would use the standard deviation of the single measurement. You will get an interval that, 95 times of 100 repetitions of the experiment, will contain the true measurement for that instance.

If you construct a confidence interval around the mean, you would use the standard error of the mean, which gives you a much tighter interval. The resulting interval will, 95 times of 100, contain the true mean of measurements.

So: it is entirely possible to use 0.1C thermometers to get a very narrow confidence interval for the mean. All that is needed is to sample many, many times.
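
A quick simulation makes the arithmetic concrete (a minimal sketch, assuming purely random, zero-mean instrument noise and no systematic bias):

```python
import random
import statistics

# Sketch: 100 readings of the SAME true temperature with purely random,
# zero-mean instrument noise (SD = 0.1 C). No systematic bias assumed.
random.seed(42)
TRUE_TEMP = 15.0
readings = [TRUE_TEMP + random.gauss(0.0, 0.1) for _ in range(100)]

sd = statistics.stdev(readings)    # ~0.1, the per-reading scatter
sem = sd / len(readings) ** 0.5    # standard error of the mean, ~0.01
print(f"mean = {statistics.mean(readings):.3f}, SD = {sd:.3f}, SEM = {sem:.3f}")
```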

RACookPE1978
Editor
Reply to  Jeff Cagle
January 4, 2018 6:39 am

If you construct a confidence interval around the mean, you would use the standard error of the mean, which gives you a much tighter interval. The resulting interval will, 95 times of 100, contain the true mean of measurements.

So: it is entirely possible to use 0.1C thermometers to get a very narrow confidence interval for the mean. All that is needed is to sample many, many times.

Important Correction: So: it is entirely possible to use the same 0.1C thermometers under the same (or very similar) circumstances to get a very narrow confidence interval for the mean. All that is needed is to sample the same thing the same way many, many times.

MarkW
Reply to  Jeff Cagle
January 4, 2018 7:14 am

“If I have 100 measurements of the same temp taken by the same thermometer”

Both conditions are violated here.
We have hundreds of different thermometers, and each reading is of a different patch of ocean, both horizontally and vertically.

Tom13 - the non climate scientist
Reply to  Jeff Cagle
January 4, 2018 7:31 am

“So: it is entirely possible to use 0.1C thermometers to get a very narrow confidence interval for the mean. All that is needed is to sample many, many times.”

So all you have to hope for is that the millions and millions of individual margins of error on each measurement magically cancel out.

Jeff Cagle
Reply to  Jeff Cagle
January 4, 2018 7:47 am

So here is a question I don’t know the answer to: How frequently do the ARGO floats make readings?

I assume that they are sampling many times per hour, but perhaps I am mistaken?

If I am correct, then you have the same thermometer reading the same thing many many times.

Reply to  Jeff Cagle
January 4, 2018 8:20 am

You’re obviously a mathematician. Please do some reading on metrology. Tell the group: is an accurate mean the “real temperature” or just a mean of the readings? Are there errors that need to be stated because of the accuracy of the measuring device? In other words, should you tell folks that the actual measurement is ‘mean value +- error’?

Let me also point out that you are not measuring the “same thing” many times. Temperature is not the “same thing”. The “same thing” refers to a physical entity, not the units you use to describe it. Basically, the “same thing” would be a unique piece of water.

We’ve had this argument before on this site. First you need to understand accuracy versus precision. Here is a link, http://webs.mn.catholic.edu.au/physics/emery/measurement.htm. Pay close attention to significant digits and how the “mean” is stated, i.e. 0.72 ± 0.03 mm. What is the “maximum probable error” with your 0.1 measuring device?

MarkW
Reply to  Jeff Cagle
January 4, 2018 8:44 am

Jeff, the Argo floats are free floating, they also move up and down through the water column. So each reading is a different piece of the ocean.

tty
Reply to  Jeff Cagle
January 4, 2018 9:02 am

“So here is a question I don’t know the answer to: How frequently do the ARGO floats make readings?”

They take one series of readings from 2000 m to the surface about every ten days. So they never ever measure the same place or the same temperature twice.

Ragnaar
Reply to  Jeff Cagle
January 4, 2018 10:37 am

Let’s say you can measure this: warmer than 15 C or not warmer than 15 C. Assume the possible range is negative 5 C to 35 C. Will many samples average to something closer than half the range (15 C to 35 C)? Will they provide better information than one reading can? Yes. How many times do we need to see the same argument made here, that something that can measure only to 0.5 C cannot provide information more accurate than that?
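
That claim is easy to test in simulation (a sketch assuming the natural spread of temperatures straddles the threshold and that the size of the spread is roughly known):

```python
import random
from statistics import NormalDist

# Sketch: a 1-bit "thermometer" that only reports warmer/not-warmer
# than 15 C. With natural variability straddling the threshold, the
# FRACTION of warm readings pins down the mean far more tightly than
# any single reading could. Assumes the spread (SD) is roughly known.
random.seed(0)
TRUE_MEAN, SD, THRESHOLD, N = 16.0, 3.0, 15.0, 100_000

warm = sum(random.gauss(TRUE_MEAN, SD) > THRESHOLD for _ in range(N))
frac = warm / N

# Invert the Gaussian tail probability to recover the mean:
mean_est = THRESHOLD + SD * NormalDist().inv_cdf(frac)
print(f"warm fraction = {frac:.4f}, estimated mean = {mean_est:.2f} C")
```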

Rick C PE
Reply to  Jeff Cagle
January 4, 2018 11:44 am

Jeff Cagle ==> First, the previous comments are correct: this only applies to multiple measurements of the same thing under the same conditions. Second, you divide the SD by the square root of N. No problem there, but this does not account for the instrument uncertainty, which could easily be +/- 1 C.

The hypothetical 0.01 C in your example actually means that if you repeated the 100 measurements there would be a 95% chance that the new average would be within two SDs (i.e. +/- 0.02 C) of the original mean. But the uncertainty of this result must also consider the uncertainty of the instrument, which could have a bias that would make the result inaccurate.

Paul Penrose
Reply to  Jeff Cagle
January 4, 2018 12:06 pm

Jim Gorman has it correct. This is an accuracy versus precision argument. In the best cases, precision can be improved by taking multiple readings and averaging them. Accuracy cannot be improved by any mathematical procedure performed on the data after it has been recorded. Another thing to consider: oftentimes the stated accuracy of an instrument covers only the analog side of the device. Digitizing the data will introduce additional errors that are not always disclosed. In those cases, you need to factor in the accuracy of the digitizer as well.

Dave Fair
Reply to  Jeff Cagle
January 4, 2018 1:33 pm

Why is it seemingly only the skeptical statisticians point out the gross errors of the Team?

Reply to  Jeff Cagle
January 4, 2018 2:48 pm

“If you construct a confidence interval around the mean, you would use the standard error of the mean, which gives you a much tighter interval. The resulting interval will, 95 times of 100, contain the true mean of measurements.”

The resulting interval is based on a 5% error rate, without any ability to identify which 5%.
It’s a mathematical version of “follow the pea”.

This 95% based mean is applicable to a specific instrument at a specific location, under identical conditions.
It also requires that the instrument accuracy is explicitly tested and certified to specific levels of accuracy/precision during and after deployment.

Most buoys use the manufacturer’s stated accuracy/precision.
Yet even under laboratory conditions, instruments require initialization and accuracy certification, plus frequent recertification, along with recording and tracking of every instrument’s accuracy/precision.

Floating instruments, whether anchored or diving, are hotspots for marine life and growth: barnacles, algae, mollusks, arthropods, seaweed.
These instruments are susceptible to contamination; e.g. floating protein/alkaloids scum (sea froth), floating detritus and debris, oil, grease, bird guano, etc..
Everything listed affects an instrument’s measurements.
The longer an interval between capturing/hauling/recovering/cleaning/recertifying instruments, the greater the error; especially sea based and extremely remote instruments.

Which ignores the base fact that most of these instruments are not serviced in place. Instead the instruments are replaced with fresh clean instruments and the buoy is redeployed.
Nor are newly installed instruments parallel tracked to establish instrument variance.

Utterly destroying the concept that all measurements from a deployed buoy are from the same exact instrument.

Then there is the massive problem where buoys deployed by USA services utilize different measuring devices. Worldwide, there is an even larger range of instruments.

Summing and averaging different instruments requires that the error rate for each instrument be measured, tracked and taken into account. Averaging away error bounds is impossible.

• A) Especially when errors are not identified and tracked to begin with. Brand new equipment may be delivered by manufacturers with specific accuracy specs; that does not negate the need for proper instrument certification, error identification and error tracking.

• B) Nor can a summed/averaged amount ever have better accuracy than the least accurate instrument measurements that are included in that sum.
Summing inaccurate numbers then drawing a circle around some center average to claim bulls-eye accuracy is specious.

D. J. Hawkins
Reply to  Jeff Cagle
January 4, 2018 3:31 pm

Cagle
You can go to:
http://www.argo.ucsd.edu/How_Argo_floats.html

for a detailed account of the Argo float operation. In brief, they are designed to drift at a target depth, usually 1,000 m and then every 10 days they “blow ballast” and ascend to the surface in about 6 hours. They sample the temperature and salinity along the way up and when they get to the surface they transmit their data by satellite. They then flood the ballast chamber again and drop back down to their parking depth and drift wherever the current takes them.

It’s very easy to see that there is no way you can apply the law of large numbers to this kind of sampling scheme. Each measurement is unique to each parcel of water in both space and time. “No SQRT(N) for you.”

Anders Valland
Reply to  Jeff Cagle
January 5, 2018 12:24 am

As many have pointed out in this thread, you are talking about something very different from measuring a physical entity in 4 dimensions. My comment here is that this 4th dimension is an additional screw-up for your assertion. The measurements are taken at different times, on a physical entity that changes with time. You would need to take multiple simultaneous measurements at the same location, which is never the case.

This issue is even greater with measurements in air, since changes in moisture content affect the heat capacity of the air and thus change the meaning of the temperature each and every time you measure.

I believe that in certain parts of the ocean these Argo floats are measuring different types of seawater (higher or lower salt content, ingress of fresh water) thus encountering similar issues, but I have not looked into whether it is as significant as it is for air.

You can not, as in ever, use the /sqrt(n) for this.

knr
Reply to  Jeff Cagle
January 5, 2018 1:39 am

And of course you need to know that the means of measurement, which amazingly gets left out so very often in this area, itself does not change. In reality it certainly can, hence one reason manufacturers quote error margins and the need for calibration.
This is real 101 stuff: your value is only ever as good as the means by which it is obtained, and it does not matter what you’re trying to measure. Throwing statistics at it does not change that; it just means your ‘guess’ can be claimed to be more intelligent.

Jeff Cagle
Reply to  Jeff Cagle
January 7, 2018 2:05 pm

Found it. http://www.argo.ucsd.edu/How_Argo_floats.html

The floats make 200 measurements over a 6-hr period on ascent (or descent), so the average Delta t is 0.03hr = 108 seconds. That’s reasonably granular given the slight temperature gradient from 2000m to 0m and given the tremendous agreement between spatially correlated floats.

Jeff Cagle
Reply to  Jeff Cagle
January 7, 2018 2:33 pm

Replies:

Jim G: You’re obviously a mathematician. Please do some reading on metrology. Tell the group: is an accurate mean the “real temperature” or just a mean of the readings? Are there errors that need to be stated because of the accuracy of the measuring device? In other words, should you tell folks that the actual measurement is ‘mean value +- error’?

Math and physics, actually. I’m comfortable with measurement theory, but not so much of an expert that I would apply to NIST.

The mean of the reading is the mean of the reading. It is also an unbiased estimator of the actual temperature.

As such, it does not provide the actual value, but a very good bet as to what future measurements would be.

Jim G: If your device is only accurate to plus or minus 0.1, this inaccuracy carries through to the end. This means any reading carries a +/-0.1 uncertainty. Just picking a temperature out of the air, you would need to quote it as 55.01 +/-0.1. In other words, between 55.11 and 54.91. Guess what? The mean is very precise, i.e. 55.01, but the accuracy of the mean can vary a whole lot.

I don’t know your background, but this statement is confused.

It is not the case that “this inaccuracy carries through to the end.” Rather, the inaccuracy of the device, if known, can be included in the computation of the standard deviation; when the SEM is computed, it is still computed as s/sqrt(n).

In fact, random inaccuracies will be reflected in the actual standard deviation of the measurements, which is why the SD is an estimator of inaccuracy.

And the key point is that the standard error of the mean is an unbiased estimator of the error in the mean of the measurements. If the experiment were to be repeated, it is highly likely that the mean of that new experiment would fall within the 95% CI.

Jeff Cagle
Reply to  Jeff Cagle
January 7, 2018 2:54 pm

ATheoK: This 95% based mean is applicable to a specific instrument at a specific location, under identical conditions.
It also requires that the instrument accuracy is explicitly tested and certified to specific levels of accuracy/precision during and after deployment.

Most buoys use the manufacturer’s stated accuracy/precision.
Yet even under laboratory conditions, instruments require initialization and accuracy certification, plus frequent recertification, along with recording and tracking of every instrument’s accuracy/precision.

Floating instruments, whether anchored or diving, are hotspots for marine life and growth: barnacles, algae, mollusks, arthropods, seaweed.
These instruments are susceptible to contamination; e.g. floating protein/alkaloids scum (sea froth), floating detritus and debris, oil, grease, bird guano, etc..
Everything listed affects an instrument’s measurements.
The longer an interval between capturing/hauling/recovering/cleaning/recertifying instruments, the greater the error; especially sea based and extremely remote instruments.

Which ignores the base fact that most of these instruments are not serviced in place. Instead the instruments are replaced with fresh clean instruments and the buoy is redeployed.
Nor are newly installed instruments parallel tracked to establish instrument variance.

Utterly destroying the concept that all measurements from a deployed buoy are from the same exact instrument.

All of the potential errors you raise could make a difference. Now: do they?

As I say to my students, as I replied to D.J. Hawkins, It’s not enough to say “there could be errors.” We need to quantify those errors.

A good way to do that is to examine ARGO floats from one run to the next. Do they show a lot of variance? I bet D.J. that they do not. Would you care to join the wager?

Ian Mecdonald
Reply to  Tom13 - the non climate scientist
January 4, 2018 8:46 pm

“If I have 100 measurements of the same temp taken by the same thermometer, and those measurements show standard deviation of 0.1C, the mean of those measurements will have an error of only 0.1 / sqrt(100) = 0.01.”

That statement just rings alarm bells for me. Basic principles of physics: You can’t get something for nothing, and any signal path is only as good as the weakest link. Think about it this way: Would a hundred distorted sound systems give a better result than one distorted sound system? Or, just a louder distorted sound?

Reply to  Ian Mecdonald
January 5, 2018 6:28 am

I don’t have a problem with the basic assumption here. The problem occurs with this question: is the precise mean the REAL measurement? That is described by the accuracy of the measuring device. This kind of error doesn’t disappear with the statistical methods applied. It is also the most neglected when quoting results.

If your device is only accurate to plus or minus 0.1, this inaccuracy carries through to the end. This means any reading carries a +/-0.1 uncertainty. Just picking a temperature out of the air, you would need to quote it as 55.01 +/-0.1. In other words, between 55.11 and 54.91. Guess what? The mean is very precise, i.e. 55.01, but the accuracy of the mean can vary a whole lot.

I blame the modelers for this. A computer can spit out a very precise number, but unless it is programmed to also assess the accuracy and spit that out too, you only get one very precise number. There probably aren’t very many computer programmers versed in measurement theory, so they don’t even address the issue. Consequently, a lot of so-called climate scientists also ignore this issue and simply rely on a statistical result that may or may not be accurate. Papers not dealing with the accuracy of results should never be published.
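
The contrast with purely random noise is easy to simulate (a sketch assuming a fixed, unknown calibration offset within a +/-0.1 spec; the 0.08 value is made up for illustration):

```python
import random
import statistics

# Sketch of the accuracy-vs-precision point: a fixed calibration bias
# survives averaging no matter how many readings are taken.
random.seed(1)
TRUE_TEMP = 55.00
BIAS = 0.08       # hypothetical unknown systematic offset, within +/-0.1 spec
NOISE_SD = 0.10   # random per-reading noise

readings = [TRUE_TEMP + BIAS + random.gauss(0.0, NOISE_SD)
            for _ in range(10_000)]
mean = statistics.mean(readings)
sem = statistics.stdev(readings) / len(readings) ** 0.5

# SEM shrinks toward ~0.001, but the mean still sits ~0.08 off the truth:
print(f"mean = {mean:.3f} (true {TRUE_TEMP}), SEM = {sem:.4f}")
```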

Jeff Cagle
Reply to  Tom13 - the non climate scientist
January 7, 2018 2:50 pm

DJ: It’s very easy to see that there is no way you can apply the law of large numbers to this kind of sampling scheme. Each measurement is unique to each parcel of water in both space and time. “No SQRT(N) for you.”

Every measurement of any quantity whatsoever is unique to time and space. That doesn’t prevent us from taking multiple measurements and carefully interpreting the results.

In this case, the ARGO floats are sampling the heat content of the ocean, one spot at a time.

Does that raise the possibility that different spots have different heat contents? Absolutely. In fact, they will certainly have different heat contents, both spatially and over time.

Now: how different? Go look at the data. Here’s a prediction: If you take a look at a randomly selected float and examine the temperature readings for 10 randomly selected runs, you will find that the temperature between 1000 and 2000 dbar changes not at all, and the temperature between 1000 and 0 dbar changes only a little.

If you’re willing to take that bet, we can each do it and report results back here.

The point I’m trying to make, the same one that I make to my students, is that it’s not enough to say “there might be errors.” A good scientist will try to quantify those errors.

Isn’t that what Judith Curry has been saying all along?

Alastair Brickell
Reply to  David Middleton
January 4, 2018 12:24 pm

David Middleton
January 4, 2018 at 6:28 am

“However, air bubbles trapped in ice don’t preserve argon levels accurately.”

Do we know that CO2 levels are any better preserved or reliable from ice cores?

Ian Mecdonald
Reply to  Alastair Brickell
January 4, 2018 8:50 pm

“Do we know that CO2 levels are any better preserved or reliable from ice cores?”

Being that CO2 is very soluble in water, I’ve always had my doubts about that. Ice also sublimes without melting in dry air, a fact not often realised.

Reply to  Alastair Brickell
January 5, 2018 9:05 am

Alastair,

Once the air is enclosed in the ice, there are no measurable changes in the composition anymore. The problem is with the smallest molecules: at the moment that the remaining pores are getting smaller and smaller, the smallest atoms/molecules can still escape and are underrepresented in the air bubbles. CO2 is slightly wider than the “close-off” diameter, thus is not affected. Neon is much smaller and is highly affected; oxygen and argon are affected, but the larger atoms like xenon and krypton used in this study are not.

See table 1 in:
http://www.bowdoin.edu/~mbattle/papers_posters_and_talks/Closeoff_fractionation_EPSL.pdf

That means that you can’t use ice core air for O2/N2 ratio measurements in the past, but CO2/N2 ratio measurements are not affected.

Reply to  Alastair Brickell
January 5, 2018 9:10 am

Ian Mecdonald,

It depends on the temperature of the ice how much liquid water is present at the surface and thus how much CO2 is absorbed. It doesn’t play any role during measurement time, as the measurements are made under vacuum and any evaporated water is trapped in a cold trap at -70ºC, leaving no liquid water at the ice surface…

jpatrick
January 4, 2018 6:29 am

Interesting approach, but in the end, it’s still just another proxy, and for that reason it must be viewed with some doubt. A proxy is when you measure something else to infer what the temperature was. It involves a chain of assumptions, any of which could falsify the temperature output.

MarkW
Reply to  jpatrick
January 4, 2018 7:15 am

When you have multiple proxies, all giving pretty much the same result, that increases confidence in all of the proxies.

Tom13 - the non climate scientist
Reply to  MarkW
January 4, 2018 8:20 am

“When you have multiple proxies, all giving pretty much the same result, that increases confidence in all of the proxies.”

Steve McIntyre has made numerous posts on this issue over at climateaudit.com, especially dealing with the SH proxies. Gergis and gang, along with Mann, Jones and that gang, have heavily weighted proxies going the “correct way”, but have repeatedly underweighted proxies going the “wrong way” or excluded those proxies based on post facto criteria.

John B
Reply to  MarkW
January 7, 2018 8:34 pm

Sorry MarkW but that’s circular reasoning. How do you tell a proxy is good? It agrees with the other proxies, and therefore the other proxies are good too.

However it can also lead to reinforcing bad results and the rejection of good ones. Just because things agree doesn’t make them correct.

Cassio
Reply to  jpatrick
January 4, 2018 9:52 am

That was my thought too, jpatrick. And a proxy needs to be validated empirically before it can be used with confidence. But how can we validate a proxy for the global mean ocean temperature when we cannot even measure it directly with thermometers? A proxy that is validated only by other proxies that have also not been empirically validated themselves remains unvalidated. An awful lot of unvalidated theoretical assumptions seem to be implicit in this case.

Jon
Reply to  jpatrick
January 4, 2018 1:04 pm

Not just a proxy but an unproven proxy. How can you [proxy] it if you haven’t measured what it proxies – the actual temperature. Perhaps it’s a proxy for climate models? No reality necessary

[Fixed. -mod]

Jon
Reply to  Jon
January 4, 2018 1:05 pm

Sorry “proge” should be proxy

Reply to  jpatrick
January 4, 2018 5:20 pm

Must be viewed with doubt?

weird.. when skeptics look at Greenland ice core temps.. there is no doubt
when they look at CO2/temp lags from ice cores… no doubt
UAH satellite temps… no doubt
Svensmark… no doubt
Any solar cycle study.. no doubt,
Goddard GIF … no doubt
newspaper clippings of the cooling scare in the 70s. no doubt

Science is systematic methodological doubt .. not selective doubt.

Reply to  Steven Mosher
January 4, 2018 10:34 pm

Steven Mosher January 4, 2018 at 5:20 pm

Must be viewed with doubt?

weird.. when skeptics look at …

Thanks, mosh. Generally true, but a bit OTT. I think you meant

Must be viewed with doubt?

weird.. when many skeptics look at …

Me, I’ve learned to avoid any and all blanket statements of every kind, 100% of the time … I suppose I need the /sarc tag to avoid misunderstanding …

Have a marvelous New Year, my friend,

w.

Editor
January 4, 2018 6:31 am

The ratio of noble gases anywhere in the atmosphere can detect a globally averaged 0.1 ºC in this…

Malcolm Carter
Reply to  David Middleton
January 4, 2018 2:43 pm

So I put a rock on the bathroom scale and it weighed 1 kg. I came back 100 times and each time it weighed 1 kg. Now can I say with some confidence that it weighs 1.0 kg? Now I intend to leave it there over the next week, weighing it at intervals until I have weighed it 10000 times. Then I will be confident that it weighs 1.00 kg.
Mathematical precision doesn’t seem to be the same as scientific precision.

RACookPE1978
Editor
Reply to  Malcolm Carter
January 4, 2018 2:52 pm

Malcolm, David Middleton:

Now, compare that to how we “measure” ocean and surface temperatures: Take one rock from 1500 different quarries twice a day for 40 years. Measure those rocks on 1500 different scales, but change the scales irregularly, and don’t bother calibrating any of the scales. Do not record when the quarries change. Report the average weight of the rocks has increased by 0.31 kilogram, and so we must kill 400 million innocent people.

Reply to  Malcolm Carter
January 4, 2018 5:24 pm

when you estimate area temperatures you are not measuring the same thing multiple times.
You need to understand how spatial PREDICTION works.
Not measuring the same thing multiple times.

Reply to  Malcolm Carter
January 4, 2018 6:38 pm

It really doesn’t matter what mathematical tricks you use to do spatial prediction. You are using MEASUREMENTS of something physical to obtain your data. You simply must use the same techniques used in any physical science to carry forward the errors your recorded data has. To do otherwise, is doing pseudoscience. Showing more precision than the measuring device actually has is also pseudoscience.

Stevek
January 4, 2018 6:31 am

Very interesting. Suppose we assume that the 0.1 degrees in the last 50 years is precise (no error). What conclusions can be drawn about the CO2 hypothesis, assuming aerosols by man have had no to little impact on climate?

Reply to  Stevek
January 4, 2018 7:29 am

We can draw the same conclusion we could draw a priori… The oceans can absorb a lot of heat without a significant increase in bulk temperature.

Reply to  Stevek
January 4, 2018 7:39 am

We believe the world warmed 4-5°C from the last glacial maximum to the interglacial. This study defends that the ocean warmed ~2.5°C. If the ocean warms about half as much as the planet, and the ocean has warmed only 0.1°C in the past 50 years, it means that the ocean has not kept up with global warming, as the world’s warming is estimated at 0.6-0.8°C for that period. Global warming is thus skin deep, as it has not affected the ocean much, and it would be easier to revert if global cooling takes place.

A significant blow also to those who defend that the Younger Dryas was a return to glacial conditions (Fred Singer, for example). The oceans say no.

Latitude
Reply to  Javier
January 4, 2018 8:30 am

…or it’s proof that the recording of temperature history has been adjusted to the f r a u d line..
..and world warming is really only about 0.2C

Reply to  Latitude
January 4, 2018 8:52 am

Possible but unlikely. Glacier melting is real.

tty
Reply to  Javier
January 4, 2018 9:09 am

“This study defends that the ocean warmed ~2.5°C. If the ocean warms about half as much as the planet, and the ocean has warmed only 0.1°C in the past 50 years, it means that the ocean has not kept up with global warming”

The deep ocean has definitely warmed much less than 2.5 degrees, for the simple reason that otherwise it would have been frozen solid during the ice age. It is nearly as cold as is physically possible now and wasn’t much colder during the LGM. Nearly the whole increase must have occurred above the thermocline, a relatively small part of the ocean volume, so the ocean surface has warmed several times 2.5 degrees on average (which agrees with SST proxies, which regularly show differences of 10 degrees or more between glacials and interglacials).

Latitude
Reply to  Javier
January 4, 2018 9:14 am

don’t know so much as melting as retreating….that would be snowfall

tty
Reply to  Javier
January 4, 2018 9:31 am

The amount of energy needed to warm the ocean 2.5 degrees would heat the atmosphere about 3000 degrees so you might just as well claim that the atmosphere is lagging.
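
That figure checks out with round numbers (a back-of-envelope sketch using approximate textbook values for mass and specific heat):

```python
# Back-of-envelope check of tty's ratio, using round textbook values.
OCEAN_MASS = 1.4e21   # kg
ATMOS_MASS = 5.1e18   # kg
CP_WATER = 4000.0     # J/(kg K), seawater approx
CP_AIR = 1000.0       # J/(kg K), approx

energy = OCEAN_MASS * CP_WATER * 2.5       # J to warm the ocean 2.5 C
dT_atmos = energy / (ATMOS_MASS * CP_AIR)  # equivalent atmospheric warming
print(f"{dT_atmos:.0f} C")                 # ~2700 C, i.e. "about 3000 degrees"
```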

cerescokid
Reply to  Javier
January 4, 2018 10:55 am

So does this mean that the “warmer” waters that are melting Antarctica ice shelves from below at an accelerated rate, are doing so because of a measly 0.1 C? Seems like some powerful impacts from such a small change in temperature.

Reply to  cerescokid
January 4, 2018 11:14 am

Seems like some powerful impacts from such a small change in temperature.

As usual the average temperature change has little to do with actual regional temperature changes. The North Atlantic has been cooling this past decade very fast, while other parts of the ocean have warmed.

http://www.climate4you.com/images/NODC%20NorthAtlanticOceanicHeatContent0-700mSince1979%20With37monthRunningAverage.gif

Reply to  Javier
January 4, 2018 8:26 pm

tty,
1. Because of its salinity, ocean water under high pressure has a freezing point well below 0 deg C.
2. For the oceans to freeze at depth, due to that pressure/expansion property of water, the oceans have to freeze top-down. They can’t freeze bottom-up. Buoyancy doesn’t allow that even if you try to artificially induce freezing at depth.

tty
Reply to  Javier
January 5, 2018 1:57 am

Joel O’Bryan:

“1. Because of its salinity, ocean water under high pressure has a freezing point well below 0 deg C.”

Of course, about -2 degrees.

“2. For the oceans to freeze at depth, due to that pressure/expansion property of water, the oceans have to freeze top-down. They can’t freeze bottom-up. Buoyancy doesn’t allow that even if you try to artificially induce freezing at depth.”

Exactly. Which is the reason I said the deep ocean can’t have warmed 2.5 degrees. The AABW (Antarctic Bottom Water), the predominant water mass in the deep ocean, is at -0.9 degrees. A warming of 2.5 degrees means that it would have been at -3.4 degrees, which is well below freezing.

By the way the sea can actually freeze from the bottom up. It happens under sea-ice in Antarctica but it takes rather special conditions. Google “brinicle”.

Hans-Georg
January 4, 2018 6:34 am

“We also reveal an enigmatic 700-year warming during the early Younger Dryas period (about 12,000 years ago) that surpasses estimates of modern ocean heat uptake.”
“That probably comes from the natives of Atlantis, who already produced CO2 before they moved to Egypt to build pyramids,” says Erich von Däniken. But joking aside, this again proves that even ocean temperatures can increase without any significant changes in CO2 content (if one wants to believe the studies of longer-term CO2 levels in the atmosphere). There are probably two conclusions from the study: first, we may well keep burning fossil energy without significantly increasing ocean temperatures (0.02 degrees in 10 years is unlikely to be significant); and second, there must be other drivers for ocean temperatures. Or a third solution would be: the studies on the CO2 content of the atmosphere in the past are all crap.

Reply to  Hans-Georg
January 6, 2018 10:42 am

Why would we want to burn fossil fuels when nuclear is a million times as energy dense, meaning that with fossil fuels we have to disturb a million times as much of the planet for the same amount of energy? And how much energy do we use disturbing a million times as much of the planet?

Andy Pattullo
January 4, 2018 7:04 am

If we can’t actually measure the average temperature of the whole ocean to a high degree of accuracy, then there is no “gold standard” against which to test this new proxy measure of ocean temperature, and therefore no proof of accuracy in spite of any grand claims of precision.

Reply to  Andy Pattullo
January 4, 2018 8:26 am

+10

Lee L
Reply to  Andy Pattullo
January 4, 2018 10:26 am

I’m thinking you might find some other old, thick ice somewhere else on the planet to test against, say Greenland? And although this doesn’t verify that the method actually measures some average sea surface temperature, this second set of measurements should show the same and corresponding ‘evidence’ of atmospheric events that the Antarctic series shows. You should be able to line them up side by side in time and get similar-sized ‘temperature’ excursions. If you can do this, then at least you are part way to a useful measure of something global, especially when you get on to a third or fourth series to compare against.

Brian
Reply to  Lee L
January 7, 2018 5:53 am

❝… find some other old, thick ice somewhere else on the planet to test against … ❞

Didn’t it say that this has been a fifteen year-long process?
“This paper is the result of fifteen years of work for Severinghaus, along with graduate students and postdoctoral scholars in his lab.”

I guess they haven’t thought of that, yet.

January 4, 2018 7:14 am

Understand the oceans and you understand the atmospheric temperature. No need for CO2. CO2 doesn’t warm water.

Understand the Oceans, Understand the Climate, NO CO2 Needed
https://co2islife.wordpress.com/2017/11/05/understand-the-oceans-understand-the-climate-no-co2-needed/

NOAA and NASA Admit the Sun NOT CO2 is Causing the Arctic Sea Ice to Disappear
https://co2islife.wordpress.com/2017/12/13/noaa-and-nasa-admit-the-sun-not-co2-is-causing-the-arctic-sea-ice-to-disappear/

MarkW
January 4, 2018 7:31 am

My last physics class was decades ago.
Isn’t at least one of those gases produced through radioactive decay?
If so, wouldn’t the things man has done that disturb the earth, mostly construction and mining, have increased the amount of that particular gas being released from the earth?

paqyfelyc
Reply to  MarkW
January 4, 2018 8:30 am

You don’t have to bother about radioactive decay as a source of those any more (nor any less!) than any other source (like out-gassing from the Earth’s core). These have to be taken into consideration, and I guess they did.

tty
Reply to  MarkW
January 4, 2018 9:34 am

They all are to some extent, but the quantities are infinitesimal over such a short interval as 10,000 years.

Editor
January 4, 2018 7:57 am

Pardon me, but I have immense doubt about the claimed ability of changes in noble gas concentrations to represent total ocean average temperature in increments of “degrees C”. The idea has merit when applied to relative differences (more atmospheric noble gases, warmer ocean); for instance, his graph shows ocean temperatures remarkably steady (flat — no change) since the end of the Younger Dryas — which is quite possibly right.
To place any numbers, particularly 1/10ths of a degree C, on those minute changes is just plain nutty.

TDBraun
January 4, 2018 8:12 am

Layman question: Krypton is about 1 part per million in the atmosphere. I wonder how they can accurately count such tiny amounts inside tiny, centuries-old ice bubbles. It seems like they would need to identify each and every single molecule in the bubble to have the accuracy they are claiming, and even then how would you know the odds of an extra krypton atom or two straying into that bubble by chance? So I wonder how they do it.

Reply to  TDBraun
January 4, 2018 9:11 am

Modern spectroscopic techniques are very accurate indeed and measuring ppm’s is routine. Your second point is more pertinent though and it isn’t always possible to accurately know gas diffusion rates in ice and furious debates rage about such things. For example the warmunards were simply devastated by the news that temperature leads co2 by centuries in the ice core data and they immediately set about redressing that by changing the assumptions about how representative ice gas bubbles are of paleo-atmospheres. They may of course succeed in reducing that lag but they will never be able to introduce a sign change and it’s funny to watch them sweat over their hysterical efforts. Just one of the many ways you know you are not dealing with science when all data is feverishly worked on to force it into compliance with a priori conclusions.

Luc Ozade
Reply to  cephus0
January 4, 2018 9:54 am

They may of course succeed in reducing that lag but they will never be able to introduce a sign change and it’s funny to watch them sweat over their hysterical efforts. Just one of the many ways you know you are not dealing with science when all data is feverishly worked on to force it into compliance with a priori conclusions.

I love your description, cephus

David E. Hein
Reply to  cephus0
January 4, 2018 12:28 pm

Yep. Same with the pause, MWP, LIA…

O R
January 4, 2018 8:31 am

Anthony,
“Using a new method of measuring krypton and xenon ratios in Antarctic ice core, an estimated temperature rise of just 0.1°C over the last 50 years was determined. This is well below many other estimates of ocean temperature increase”

Looking at NOAA/NODC temperature data for the upper 2000 m of the oceans, the difference between pentads 50 years apart, 2012-2016 vs 1962-1966, suggests that the temperature has risen by 0.10 C.
If we assume that the whole ocean volume has warmed by a little more than half of that, say 0.06 C, the estimate from the new study (0.1 C) is actually well above it, not below as you claim.

tty
Reply to  O R
January 4, 2018 9:38 am

“If we assume that the whole ocean volume has warmed by a little more than half of that”

Much too high. The deep ocean has a turnover time on the scale of 1,000 years, so it will have changed very little over a 50 year period.

Reply to  tty
January 4, 2018 10:51 am

TTY, more likely 800 years based on the ice core CO2 lag and the underlying thermohaline circulation driver.

January 4, 2018 8:44 am

Interesting, but this line in the abstract is dubious.

“Although most of the anthropogenic heat added to the climate system …”

Are they talking about the Joules released by burning fossil fuels or are they referring to a violation of conservation of energy?

January 4, 2018 9:21 am

Xenon diffuses through steel and cladding materials. Why won’t it diffuse through ice?

Reply to  Lonnie E. Schubert
January 5, 2018 11:11 am

Lonnie,

Not much to see here:
https://www.osti.gov/scitech/biblio/4120957

Anyway, it’s a matter of temperature: diffusion speed increases a lot with temperature. In ice cores at -40ºC there is no observable migration of CO2 over 800,000 years. Xenon is larger in diameter and krypton is smaller; neither shows more or less migration than CO2.

January 4, 2018 9:29 am

Earth’s surface is ~70% ocean, which averages around 4,000 metres in depth. The Argo floats measure down to a depth of around 2,000 metres, and there are none in the Arctic at all. Until these two stupendous data gaps are filled with several decades of data, our speculation shouldn’t be confused with knowledge.

tty
January 4, 2018 9:57 am

Well, now I’ve read the paper and it is quite interesting, and in my opinion a real advance. However, I do not put much faith in their pre-Holocene absolute figures, since they are dependent on the assumptions in their (very simplified) ocean model. We do not have enough information about the circulation in the glacial ocean to make detailed guesses about the volumes of different water masses.

And as a matter of fact they themselves more or less admit as much in the supplementary information (this by the way is perhaps the most thought provoking section of the paper):

“Another hypothesis that could explain the MOT pattern during the Younger Dryas is that a cold, isolated water mass was ventilated during YD1. This water mass would have last been ventilated several millennia earlier, for example during the cold LGM, and only the push of the Younger Dryas onset (collapse of AMOC) would have brought this cold water up to the surface to equilibrate. The end of YD1 would then mark the point in time when this water mass was fully ventilated and hence this scenario would be able to provide an explanation for the stalled warming before the AMOC acceleration. Such a drastic change in ocean ventilation could be explained with a switch from a glacial ocean circulation mode to a modern/interglacial mode as mentioned in the main text. Multiple lines of evidences suggest the existence of such different ocean circulation modes, and in the case of the shift from interglacial to glacial mode, the ‘MIS 5-4 transition’ at around 70 kyr bp has been suggested as such.”

This would also fit in nicely with the peculiar “radiocarbon plateau” during the Younger Dryas and the abrupt changes in pCO2 shown by stomatal studies.

January 4, 2018 10:43 am

I don’t think very highly of this new paper, based on the reproduced figure 2. First, the error bars—if real (I did not spend the money to go behind the paywall)—are on the order of half a degree. Half a degree is an enormous amount of ocean heat, even if confined to just the mixed layer. Second, all three noble gases provide a lower anomaly than the mix. How can that be? Each has its own Henry’s law partial pressure solution constant. Those don’t change by mixing.

Reply to  ristvan
January 4, 2018 11:21 am

Rare to see someone discussing errors and how they affect the result.

mothcatcher
January 4, 2018 11:05 am

First things first, perhaps?

“………………As the oceans warm, krypton and xenon are released into the atmosphere in known quantities. The ratio of these gases in the atmosphere therefore allows for the calculation of average global ocean temperature……..”

Can anybody tell me how this is established? Is it from lab experimentation? And what evidence is there that this ratio transfers with precision to the ice cores over the whole period under investigation? I guess if the guy has been working on this for 15 years he has it sewn up, but I’d like to see it discussed. Haven’t read the paper.

tty
Reply to  mothcatcher
January 4, 2018 12:08 pm

The ocean is almost saturated with respect to the noble gases, which makes it fairly easy to estimate the total amounts and the proportion in the atmosphere from basic physics.
However, that “almost”, which is due to the fact that the cold waters in the Arctic and Antarctic do not have time to equilibrate fully before sinking, complicates things and is one reason I don’t trust the absolute values.

I must say however that they are unusually honest in discussing sources of error and uncertainties. Straight off I can’t think of any major factor they have ignored, though they are rather optimistic as to the size of the probable uncertainty.

The Reverend Badger
Reply to  tty
January 4, 2018 3:37 pm

IF the oceans were one lump of water at one uniform temperature you might have a hope of relating the temperature to atmospheric gas concentrations. But multiple lumps at varying temperatures will have multiple varied atmospheric gas concentrations above each lump (before mixing). To then mix all those varied concentrations up into one figure and back calculate an “average” temperature is pure scientific BS.
It’s utterly wrong logically.

This is exactly the same stupidity which thinks that radiative flux is a conserved quantity, that you can add different radiative fluxes from sources at different temperatures, and then compute the sink temperature using S-B. Which is what Trenberth did in those K-T diagrams.

Fundamental logical and scientific stupidity.

I cannot believe so called scientists get away with stuff like this. In my day this kind of logical fallacy was exposed in first year undergraduate work. Why oh why do not more proper PhDs call it out ? It’s like having astrologers run NASA.

tty
Reply to  tty
January 4, 2018 4:12 pm

“But multiple lumps at varying temperatures will have multiple varied atmospheric gas concentrations above each lump (before mixing).”

Exactly, before mixing, but atmospheric mixing time is very short compared to the time it takes ocean temperatures to change significantly.

The Reverend Badger
Reply to  mothcatcher
January 4, 2018 3:52 pm

Let’s try a simple example: here are 3 flasks containing gas mixtures in ratios 60/40, 62/38 and 64/36.
The three flasks represent 3 lumps of ocean at different temperatures T1, T2, T3. Mix the gases together and calculate the resulting ratio. What EXACTLY is the essential information you need in order to have a meaningful result?
Note: as in the real world, we do not know exactly the size of each lump of ocean at each temperature, so I am not telling you how big the flasks are.

I await any answers with interest, including from those who wrote the paper.

tty
Reply to  The Reverend Badger
January 4, 2018 4:17 pm

As a matter of fact that problem is soluble since there are four noble gases with differing solubilities, so their ratios will give you a unique solution.
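
A toy linear inversion shows why (a sketch with made-up sensitivity numbers, not values from the paper): with at least as many independent gas ratios as unknown box temperatures, the mixed atmospheric signal can be unscrambled.

```python
import numpy as np

# Toy version of tty's point: four gases with DIFFERENT (hypothetical)
# solubility sensitivities give four independent equations, enough to
# recover several unknown "box" temperatures from the mixed atmosphere.
# Rows: gases; columns: ocean boxes. Entry (i, j) is the assumed
# atmospheric response of gas i to warming box j (made-up numbers).
A = np.array([
    [0.21, 0.12, 0.05],
    [0.47, 0.25, 0.11],
    [0.92, 0.58, 0.33],
    [1.50, 1.10, 0.72],
])

true_box_temps = np.array([-2.0, -3.0, -1.0])  # deg C anomalies (toy)
observed = A @ true_box_temps                   # mixed atmospheric signals

# Overdetermined (4 equations, 3 unknowns): solve by least squares.
recovered, *_ = np.linalg.lstsq(A, observed, rcond=None)
print(recovered)  # -> [-2. -3. -1.]
```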

David E. Hein
January 4, 2018 12:37 pm

Reminds me of an old joke about a guy who goes to the doctor complaining of elbow pain. The nurse hands him a cup and asks for a urine sample. The man reiterates that it’s his elbow and the nurse says; ”The Doctor can diagnose just about anything from a urine sample.” The man claims he doesn’t have to go and takes the cup home, where he decides to show this quack a thing or two. So he has his wife, daughter, son, and dog pee in the cup. Then the man leaves a sample of, we will just say, a self-congratulating nature. A couple of days later the Dr. calls and tells the man he has the diagnosis. ”Well sir, I have good news and bad. The bad news is your wife has a yeast infection, your daughter is pregnant, your son is gay, your dog has scabies and if you stop self-congratulating then your elbow will heal.”

The Reverend Badger
Reply to  David E. Hein
January 4, 2018 3:43 pm

It’s a good joke but the reality of DNA specific testing needs more work before we can tell which (if any) of the fluid residues come from the daughter (Let’s call her Madeleine for fun). Maddie’s DNA needs to be separated from Mum and Dad and the dog (Eddie).

tty
Reply to  The Reverend Badger
January 4, 2018 4:06 pm

Separating the doggie’s DNA would be simple, but it might be a bit difficult deciding whether it is the mother or the daughter which is pregnant (DNA sequencing would easily determine that they were mother and daughter and whether the father was actually the father).

The Reverend Badger
January 4, 2018 3:26 pm

What an excellent series of comments, debate and scientific knowledge as a result of this posted topic.

It has been so good I reckon the average IQ of WUWT participants has now increased by 0.01.

(+.02 / -5)

scraft1
January 4, 2018 4:39 pm

Yes, Reverend Badger, a welcome respite from the climate wars.

Dr. Deanster
January 4, 2018 7:09 pm

So … if the ratio of noble gases in the atmosphere is an indication of global ocean temp, and given that global ocean temp is a significant predictor of global atmospheric temp …. then why are we still using thermometers and such to measure global temperature? Seems we would be much more accurate to use the change in the well mixed noble gas concentration to determine change ….. again, provided this is true.

Editor
January 4, 2018 11:46 pm

For those interested in the question of how accurately and how precisely we know the temperature of the surface ocean, here are a few of my previous posts on the subject.

Decimals of Precision – Trenberth’s missing heat 2012-01-26

Over at Judith Curry’s excellent blog there’s a discussion of Trenberth’s missing heat. A new paper about oceanic temperatures says the heat’s not really missing, we just don’t have accurate enough information to tell where it is. The paper’s called Observed changes in top-of-the-atmosphere radiation and upper-ocean heating consistent within…

An Ocean of Overconfidence 2012-04-23

I previously discussed the question of error bars in oceanic heat content measurements in “Decimals of Precision“. There’s a new study of changes in oceanic heat content, by Levitus et al., called “World Ocean Heat Content And Thermosteric Sea Level Change (0-2000), 1955-2010″ (paywalled here). It’s highlighted over at Roger…

Where in the World is Argo? 2012-02-06

The Argo floats are technical marvels. They float around below the surface of the ocean, about a kilometre down, for nine days. On the tenth day, they rise slowly to the surface, sampling the pressure, temperature, and salinity as they go. When they reach the surface, they radio home like…

Argo, Temperature, and OHC 2014-03-02

I’ve been thinking about the Argo floats and the data they’ve collected. There are about 4,000 Argo floats in the ocean. Most of the time they are asleep, a thousand metres below the surface. Every 10 days they wake up and slowly rise to the surface, taking temperature measurements as…

Best of the New Year to all,

w.

Reply to  Willis Eschenbach
January 5, 2018 2:07 am

Determining the average temperature of the ocean and the amount of heat it holds is an exercise in futility.

Reply to  wayne Job
January 5, 2018 3:44 pm

wayne Job January 5, 2018 at 2:07 am

Determining the average temperature of the ocean and the amount of heat it holds is an exercise in futility.

I disagree entirely. Too defeatist, and doesn’t recognize the math.

The important question is the size of the uncertainty. For any given application, a given uncertainty may or may not be an issue.

w.

Ian_UK
January 5, 2018 12:09 am

All this is a bit academic, according to UK’s Lancaster University. On BBC News this morning, they reported on a global coral bleaching problem caused by climate change and warming oceans. Apparently, the corals can cope with gradual warming (1deg so far) but sudden changes, as induced by, eg, El Ninos, can’t be tolerated. Also apparent is that this is a very modern phenomenon.

I can’t find a link to the source story yet.

tty
Reply to  Ian_UK
January 5, 2018 2:13 am

“Also apparent is that this is a very modern phenomenon.”

It isn’t. How do you think all the reef flats just below high tide in the tropics originated? By reefs growing up close to the surface and then getting bleached/killed by low water.

Remember that the Great Barrier Reef is dead 90% of the time, it is only a living reef during interglacials. Most of the time it is just a low range of barren limestone hills.

Reply to  Ian_UK
January 5, 2018 3:54 pm

Ian_UK January 5, 2018 at 12:09 am

All this is a bit academic, according to UK’s Lancaster University. On BBC News this morning, they reported on a global coral bleaching problem caused by climate change and warming oceans. Apparently, the corals can cope with gradual warming (1deg so far) but sudden changes, as induced by, eg, El Ninos, can’t be tolerated. Also apparent is that this is a very modern phenomenon.

I can’t find a link to the source story yet.

As usual, Ian, nature is much more complex. Bleaching is the natural reaction of coral to thermal changes. It can be induced by cold as well as heat. As tty mentions above, it’s what happens on reef flats.

The key to understanding bleaching is that a coral reef has two parts—the inhabitants, and the apartment house. The apartment house is the white coral skeleton, which is built slowly but surely by the inhabitants. But what happens when conditions change? Say the winds and currents shift, and some coral reef apartment house is too hot for the inhabitants?

What happens is that the inhabitants die.

However, the apartment house is still there … and nature abhors a vacuum. So who moves in to the already-built structure? Well, inhabitants who can live in warmer temperatures … and they’ll live there until such time as conditions change again.

This is not modern, it’s unbelievably ancient. It is how coral reefs have dealt with changing conditions since forever—the strain of coral polyps that are living there all die out and are replaced by new polyps which can tolerate the new conditions. It’s their version of ongoing local evolution.

For the same reason, it’s not a tragedy. It only takes a couple of years for the apartment house to be fully repopulated. Why so fast?

Because the habitat is already built. So the new inhabitants don’t have to expend any energy on building, they can put it all into growth.

Best to you,

w.

ptolemy2
January 5, 2018 3:58 am

Any new accurate proxy of past conditions is to be welcomed.

Archie Lever
January 5, 2018 6:12 am

Interesting stuff about the new method of measuring ocean temperature increase. As far as Argo is concerned, I have never been convinced about the averaging of Argo results to measure differences in OHC. The fact is that the floats move around spatially and report at different times.

If I were measuring the average CHANGE in temperature of any body between times t1 and t2, I would expect that the temperature sensors would be placed spatially over the body as uniformly as possible, and then a snapshot taken of ALL the sensor temperatures at t1 and t2 to calculate an accurate difference.

Applied to the Earth’s oceans, this would mean many, many fixed tethered buoys at a range of depths, lats and longs, each measuring a fixed chunk of ocean. They would ALL record temperature instantaneously at t1, t2, t3 et al. in order for a snapshot temperature to be taken at t1 and compared with similar snapshots at t2 etc. for the whole planet.

Technically this would be a massive task with huge problems setting up planet wide fixed tethered buoys and reporting from depth.

thomho
January 6, 2018 9:53 pm

While not the main issue in this interesting blog, there are a number of conflicting descriptions of the Argo floats that seem to need resolution. Some posts on this blog have described the floats as hovering at 1000 metres then ascending, taking temperatures and other data, e.g. salinity, which they report to satellites when they reach the surface. Other posts say they hover at 1000 metres then descend to 2000 metres before ascending to the surface.
In the past, comments on WUWT have been critical of such Argo operations, as it has been said that, given the average depth of the oceans is 3500 metres, the floats only measure data from the top 57% by depth (and possibly much less by volume).
A more recent article now claims the Argo floats descend to the sea floor (cited as 5,500 metres) before ascending.
http://argofloats.wikispaces.com/Argo+Floats
That article also states there are about 3800 floats administered by some 27 nations, with the USA administering by far the largest number (over 2000) of floats (Australia about 500). It also says that the reason for their hovering at 1000 metres is that currents are weaker at that depth, plus sea life such as molluscs, which might over time encrust the floats and thus affect the accuracy of their measurements, does not exist at that depth.

As it seems it could be useful to establish the operation of the Argo floats as a basis for future discussion, can any reader confirm these facts?