A new method for correcting systematic errors in ocean subsurface data

News Release 8-Apr-2020

IMAGE: Ocean heat content is one of the most reliable indicators of climate change. Credit: Jiang Zhu

A homogeneous, consistent, high-quality in situ temperature data set spanning several decades is crucial for detecting climate change in the ocean.

Systematic errors in the global archive of temperature profiles pose a significant problem for the estimation and monitoring of global ocean heat content, one of the most reliable indicators of climate change. For almost four decades, from the 1940s through the 1970s, the majority of ocean temperature observations within the upper 200 meters were obtained by means of mechanical bathythermographs (MBTs). In fact, MBTs account for 68% of ocean subsurface data between 1940 and 1966.

The new study, by Viktor Gouretski and Lijing Cheng of the Institute of Atmospheric Physics, Beijing, published in the Journal of Atmospheric and Oceanic Technology, investigates the quality of MBT data by comparing them with reference profiles obtained by means of Nansen bottle casts and Conductivity-Temperature-Depth (CTD) profilers.

This comparison reveals significant systematic errors in the MBT data. The MBT temperature bias is as large as 0.2°C in the global average before 1980 and falls below 0.1°C after 1980. To eliminate this bias from the original data, a new empirical correction scheme for MBT data is derived, in which the correction is country-, depth-, and time-dependent.
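
The release does not give the correction's functional form, but a minimal sketch of how a country-, depth-, and time-dependent correction might be applied could look like the following. The table values and the lookup structure here are purely hypothetical illustrations, not the paper's actual corrections:

```python
# Hypothetical sketch: applying a country-, depth-, and time-dependent
# bias correction to an MBT temperature profile. The table below is
# illustrative only -- the real corrections are derived in the paper by
# comparison against Nansen-bottle and CTD reference profiles.

# (country, decade) -> list of (max_depth_m, bias_degC) bands
BIAS_TABLE = {
    ("US", 1950): [(50, 0.20), (100, 0.15), (200, 0.10)],
    ("US", 1980): [(50, 0.08), (100, 0.06), (200, 0.05)],
}

def correct_profile(temps, depths, country, year):
    """Subtract the tabulated bias from each (temperature, depth) pair."""
    decade = (year // 10) * 10
    bands = BIAS_TABLE.get((country, decade))
    if bands is None:
        return list(temps)  # no correction available for this group
    corrected = []
    for t, z in zip(temps, depths):
        # pick the first depth band containing z, else the deepest band
        bias = next((b for zmax, b in bands if z <= zmax), bands[-1][1])
        corrected.append(t - bias)
    return corrected

profile = correct_profile([18.4, 15.1, 12.0], [25, 80, 150], "US", 1955)
```

The point of the sketch is only the shape of the scheme: the correction subtracted from a reading depends on who collected it, how deep it was taken, and when.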

Several bias correction schemes were tested. To assess their performance objectively, four metrics were introduced, and bias reduction factors were calculated for each correction scheme. The scheme accounting for both the depth bias and the thermal bias performed best, significantly reducing the original bias. Further, the new MBT correction scheme shows better performance than three MBT correction schemes proposed earlier in the literature (from Japan, the United States of America, and Germany).
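
The release does not define the four metrics or the bias reduction factors; one plausible reading, offered only as an illustration, is the fraction by which a correction shrinks the mean absolute MBT-minus-reference difference:

```python
# Hypothetical "bias reduction factor": how much a correction scheme shrinks
# the mean absolute MBT-minus-reference difference. The release does not
# define the actual metrics, so this is an assumed illustrative reading.

def mean_abs_bias(mbt, reference):
    """Mean absolute difference between MBT readings and reference values."""
    return sum(abs(m - r) for m, r in zip(mbt, reference)) / len(mbt)

def bias_reduction_factor(raw, corrected, reference):
    """1.0 means the bias was fully removed; 0.0 means no improvement."""
    before = mean_abs_bias(raw, reference)
    after = mean_abs_bias(corrected, reference)
    return 1.0 - after / before

raw       = [15.2, 14.9, 15.3]       # biased MBT readings (deg C)
reference = [15.0, 14.7, 15.1]       # collocated CTD / Nansen-bottle values
corrected = [t - 0.15 for t in raw]  # apply a flat 0.15 C correction
factor = bias_reduction_factor(raw, corrected, reference)
```

Under a metric of this kind, competing schemes can be ranked by a single number per scheme, which matches the release's description of comparing schemes via reduction factors.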

The reduction of these biases increases the homogeneity of the global ocean database, which is especially important for climate-change-related studies, such as improved estimation of ocean heat content changes.

“This new technique will be used in the IAP ocean gridded temperature product and ocean heat content estimate in 2020,” said Cheng. “We expect it to significantly improve their quality during the 1940-1970 period.”

This study is funded by the National Key R&D Program (2016YFC1401800 and 2017YFA0603202) and the Chinese Academy of Sciences (CAS) President’s International Fellowship Initiative (PIFI).

###

From EurekAlert!

47 thoughts on “A new method for correcting systematic errors in ocean subsurface data”

      • If you subtract a bias of 0.2 C before 1980 and subtract 0.1 C after 1980, then the trend increases by 0.1 C. As I recall, the entire 10^23 joules of OHC is about 0.03 C over the whole oceans. They are suggesting an increase in warming of 200%.
        Warming is now at Ludicrous Speed.

        • Calm down, this is “subsurface”, look at how deep that goes, I doubt they mean down to 4km deep. This is just another incremental tweak, like Hansen et al have been doing to the surface record since 1980. A tenth and then another inconspicuous tenth: slowly, slowly catch ye monkey.

          A steady campaign of incremental adjustments has already erased the hottest decade of the 20th century in the USA, which occurred in the 1930s. Now the present is “unprecedented”.

          Anyway we are now entering into a period of manufactured economic crisis, we will not be hearing much about climate for a while to come.

          • Achhh, it’s all hogwash anyway. Oceans have 1,000 times more heat capacity than the atmosphere. So the error at 0.1C would kick an atmospheric error to 100C in terms of global energy balance!

            The supposed warming/cooling/sky-is-falling narrative is within the noise of the measurement errors. Although it is then much easier to debunk any and all global warming nonsense if you concentrate on ocean heat content, since the oceans have 1,000 times the heat capacity of the entire atmosphere.

            To spell it out – the 1000x higher heat capacity of the oceans means that if the oceans yield an average of 0.001 C in heating to the atmosphere, the atmospheric temperature would rise by 1 C. Ergo, if the ocean temperature error is 0.1 C, any “assessment” of global ocean temps as a barometer for “global warming” is useless. (A 0.1 C ocean error is a 100 C atmosphere error, based on their respective heat capacities.)
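
The ratio arithmetic spelled out in the comment above can be written out directly. This takes the comment's assumed ~1000:1 ocean-to-atmosphere heat-capacity ratio at face value; the ratio is the commenter's round figure, not a precise constant:

```python
# Back-of-the-envelope version of the comment's argument, using its own
# assumed ~1000:1 ocean-to-atmosphere heat-capacity ratio.
HEAT_CAPACITY_RATIO = 1000  # ocean / atmosphere heat capacity (commenter's figure)

def equivalent_atmospheric_change(ocean_delta_t):
    """Express an ocean temperature change as the atmospheric temperature
    change that the same amount of energy would produce."""
    return ocean_delta_t * HEAT_CAPACITY_RATIO

small_ocean_change = equivalent_atmospheric_change(0.001)  # ~1 C in the atmosphere
ocean_error        = equivalent_atmospheric_change(0.1)    # ~100 C in energy terms
```

This is only the comment's own arithmetic made explicit: any ocean-temperature error is amplified by the ratio when recast as an atmospheric energy equivalent.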

      • It isn’t SST we see blasted in the headlines, it’s air temperatures. It’s these data sets that are “adjusted” and always to the warm side. And quite frankly all are estimates anyway. They all have error bars and even the error bars have error bars.

      • (1) This removed past cooling (for a period of time): ”The skeptics will make a meal of this when it comes out, but if they did their job properly (I know this is impossible!) they would have found it. It relates to a problem with SST data in the late 1940s. The problem will get corrected for at some point. SSTs need adjusting as there must be from buckets for the period from Aug45 by about 0.3 gradually reducing to a zero adjustment by about the mid-1960s. The assumption was that after WW2 they were all intake measurements and didn’t need adjusting. This will reduce the 1940-1970 cooling in NH temps. Explaining the cooling with sulphates won’t be quite as necessary.” – Phil Jones

        (2) Karl et al’s 2015 paper clearly increased warming trends, particularly in the ocean. https://curryja.files.wordpress.com/2015/06/slide12.png .

      • Yes, remove warming from the early records to steepen warming (the homogens are restrained more in the recent end of the record). Be honest. Would committed consensus science do anything but to enhance the warming? They say:

        “… being mostly important for climate change related studies, such as the improved estimation of the ocean heat content changes.”

        and “…crucial for the detection of climate changes in the ocean.”

        and money quote

        “significant systematic errors in MBT data. The MBT temperature bias is as large as 0.2°C before 1980 on the global average and reduces to less than 0.1°C after 1980.”

        Steven, you have cynically repeated this ‘more cooling than warming changes’ claim in all such discussions, knowing full well that they are not random changes, but deliberately targeted to steepen the warming curve. This is one of the benefits to the consensus cause of using only ‘anomalies’.

        A big “cooling” was pushing the record degree of warming to date, that of the 1930s and 40s, down by half a degree C (virtually all the warming that has taken place since 1850), neatly removing the steep cooling from 1945 to 1979 in the process (the Ice Age Cometh decades), both events an embarrassment for the Theory.

        • You shouldn’t be surprised — it comes to us from EurekAlert! — the official sponsor of junk science.

          They discovered that it wasn’t warming enough so they cool the past more than the present. Prediction of upcoming headline: New study shows oceans warming three times faster than was thought!

      • “We expect it to significantly improve their quality during the 1940-1970 period.”

        He who controls the past controls the future. 😉

      • SST adjustments remove warming

        All SST adjustments? No.

        You are presumably referring to Folland’s folly: the ridiculous -0.5 deg C step change inserted to account for the shift from British domination of maritime traffic before WWII to US domination after the war. This was later blended by the M.O. Hadley Centre into a less noticeable slide of 0.5 deg, by wantonly rewriting the recorded logs of sampling method from bucket to engine room where it was felt there were “too many” of a certain type.

        An arbitrary change of 2% per year was decided on, and records were falsified to conform to this assumption. The 2%/year produces a nice exponential decay instead of a rather kludgy 0.5 step, which had everyone saying WTF is this? Your data is rubbish.

        Yes, it did introduce cooling but it did so just before the period when IPCC said CO2 became significant. So it introduced cooling in such a way as to accentuate the supposed AGW.

        You have often attempted this dishonest redirection, as though saying it introduces cooling ( which is technically correct ) somehow negates the idea that this manipulation was done – like all the others – to rig the record to support the AGW mantra.

        Without this cooling “correction” the 20th century is one almost continuous rise, which does not fit at all with the hypothesis of CO2-driven warming, since CO2 is only reckoned to be significant after 1960 according to the IPCC.

        So yes, the Hadley post war adjustment was a cooling adjustment but one which is there to support the AGW agenda.

    • Hi Paul, I just read your paper over at GWPF – thanks for writing that (I hope and assume that it is you).

      Yes, and with respect to this new story about some data fiddling to reduce the bias from as much as 0.2 deg C (gosh, terrible!) to 0.1 deg C (fine) in temperature profiles in the actual ocean! Well, I am glad it is so short, but I still think I might have lost consciousness around the second or third paragraph. Such was the egregious inconsequence of the thing – as an argument for (or against) the global warming scam it is akin to holding a crisp bag out of a window to slow a runaway train.

    • Definitely. “Adjustments” is just another term for “make it up and make it fit my world view.” I think this one should say 6.3 instead of 5.6 because I believe global warming would have increased it.
      Now, who am I going to believe? The tree ring data or the falsified data that we just adjusted? Definitely the falsified data we just adjusted; we will paste that on and delete the tree ring data. No one will care.

  1. I have more confidence in my personal manual bathythermograph than any of this data with 0.1 deg C error correction, to wit: I stick my big toe in the bath water to see if it’s just right.

    • If it melts your rubber ducky, Ron, it’s too warm.

      It’s either a sure sign of Global Warming (alarum!) or you need to have a plumber check your water heater thermostat. Either way, it was all caused by CO2.

      My condolences on the loss of your rubber ducky, may he rest in peace.
      .
      .
      .
      Wait… Maybe they need to call a plumber to adjust the bias. After all, what do these researchers know from hot water? Call a pro.

  2. So the systematic errors and bias have been reduced from 0.2 to 0.1? So how long until the last stated 0.11-degree ocean temperature increase since the ’70s is declared to be “twice as bad as we thought”?

    • Hey, 0.11º might not sound like much to you, but it’s the difference between life and death to some poor coral.. (paraphrasing Tony Hancock)

    • So how long until the last stated 0.11 degrees ocean temperature increase since the 70’s is declared to be “twice as bad as we thought” ?

      BINGO!

      From my file of quotes, tag lines and smart remarks:

      If the Climate Crisis headline says,
      “Worse than previously thought”
      Historical data has been re-written.

  3. Chucked the bathy over the back end a thousand times I guess in a long seagoing career. Little glass smoked slide and an accuracy of perhaps two degrees? Reported to the Admiralty Chart Depot. Useful thermostructure data for acoustic propagation but temperature? Not so much.

    • Agree, I was on a few Gulf of Mexico cruises using BTs, precision seemed difficult with such a small view. New guys need to find one to check against Nansen bottles, etc. Are there any left? What did the oceanographers of the period think?

  4. We are witnessing the dissolution of a rational society. Truth has been replaced with propaganda tailored to benefit the few. We have come to expect this in politics and the media, but we are now accepting it in “science”. Since academia has been bought and paid for, where is our hope for the future?

  5. Homogeneity is a great word, and I think I know what it means.

    I can’t see how the concept applies to a dataset.

    Perhaps I’m wrong, but I think this article means we’re changing data for “sciency” reasons.

    Even though it is not used here, I’m compelled to share my belief that ANYONE who uses the shibboleth “THE SCIENCE” knows nothing about science.

  6. > A new empirical correction scheme for MBT data is derived…

    Not just a scheme, not just a derived model but a Scheme and a Model.

  7. Removing a mean bias leaves a (+/-) uncertainty due to residual bias from individual biases unequal to the mean.

    Removal of a mean bias does not remove the uncertainty width due to the systematic error from uncontrolled variables. The field calibration error in ARGO float temperatures is typically of order (+/-)0.6 C. And ARGO floats are typically more accurate than MBTs or XBTs.

    There is no differencing method that will reduce or mitigate that uncertainty.

    An uncertainty of (+/-)0.6 C is equivalent to an uncertainty of about (+/-)5.2 x 10^22 Joules in heat content of the first 200 m of ocean. The uncertainty is so large that the whole of AGW ocean heat content change is below the limits of uncertainty. It’s invisible.

    That won’t stop them getting a scary number to tout about, though.

    • What Pat said.

      And maybe a little more: Taking Data, and then Correcting it, is anathema to professional engineers. Your corrections could be wrong! The instrument said what it said. You cannot add information to the readings, and if you need a better instrument, buy one.

      Wow.

  8. A whole tenth of a degree. These guys are good!

    You know, according to the UAH chart, the year 1998 is one-tenth of a degree cooler than the “hottest year evah!”, the year 2016. This difference is described as being “within the margin of error” of the measuring instrument.

    That’s what one-tenth of a degree is, a very small number, which means a very small difference.

  9. It is all a joke but not really?

    We have some of the data, but presumably we lose a little bit more every year to lost and corrupted files.
    We have to interpret the data now, knowing the failings of the old systems, having tested said failings.

    What is not to like in the Mosherite adjusted world?
    Ask yourself how could you do it better?

    “Several bias correction schemes were tested. To assess their performance objectively, four metrics were introduced, and bias reduction factors were calculated for each correction scheme. The scheme accounting for both the depth bias and the thermal bias performed best, significantly reducing the original bias.”

    First up, you either have a recognised actual bias or you don’t.
    You do not have to make up bias correction schemes to get a result “you should get.”

    That is called introducing a bias.

    Either there are four known biases. Or not.
    You cannot choose to use a combination of two and disregard the rest.
    Why stop at 0.1 C when you could remove the bias completely?

    Sarcasm.

    • Everything about it smells like Smoke and Mirrors.

      How can you “significantly reduce the original bias”? Isn’t this a methodology question, not a bias question?

      And 0.1 C looks like a very suspicious figure for a world ocean. The error margin must be triple this figure at least.

  10. Systematic?
    As in related to a ‘system’?

    Keep in mind that the alleged temperature increases are within the error bounds, even if the ocean temperature equipment were installed in a laboratory environment instead of the real world, where biotic organisms live, grow, cover, and interfere.

    Perhaps they meant systemic? But then, fixing systemic problems means identifying and correcting the error source.

    • Don’t confuse random measurement error with uncertainty. The uncertainty associated even with the Argo devices is more than the difference in temp that is being alleged. You can’t just wish that away using some statistical manipulation.

      • Absolutely! “Adjusting” data to account for presumed bias in the measurements is not a correct way to account for the ‘problem’ unless a very thorough analysis of the means of taking the original data is made. And of course that is impossible with centuries-old data.
        The only acceptable way to account for it is to add an appropriate amount to the uncertainty – i.e., plus or minus a larger number. This seldom makes the analysis clearer, but it is the honest thing to do.
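
The approach described in the comment above, widening the error bars rather than adjusting the data, can be sketched one common way: combining independent uncertainty components in quadrature (root-sum-square). The specific numbers here are illustrative assumptions, not values from the study:

```python
# One common way (not necessarily the commenter's) to widen an uncertainty
# instead of adjusting the data: combine the instrument uncertainty and an
# allowance for unquantified bias in quadrature (root-sum-square).
import math

def combined_uncertainty(instrument_u, bias_allowance):
    """Root-sum-square combination of independent uncertainty components."""
    return math.sqrt(instrument_u**2 + bias_allowance**2)

# e.g. +/-0.1 C instrument uncertainty plus a +/-0.2 C bias allowance
u = combined_uncertainty(0.1, 0.2)
```

The data themselves are left untouched; only the stated +/- band grows, which is the honest-accounting point the comment is making.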

  11. The only real data is actual readings. Unfortunately, the trillions of readings required, from every depth below every square kilometre of the ocean’s vast surface, to gain an accurate measure of the ocean’s heat content at any one moment are never going to happen – let alone repeating the measurements to catalogue the changes over decades.

    To compensate for this agreed lack of data, scientists can amuse themselves by creating an estimate of reality: writing computer programs (models) which function using a lot of assumed values for the multitude of variables involved in such a vast system.

    That is great as an academic exercise, but the results should never, ever, be allowed anywhere near organisations with responsibility for determining public policies that involve economic ramifications resulting from the decided policies.

Comments are closed.