Big Claims About Tiny Numbers

Guest Post by Willis Eschenbach

[UPDATE: An alert commenter, Izaak Walton, notes below that I’d used 10e21 instead of 1e21. This means all my results were too large by a factor of 10 … I’ve updated all the numbers to fix my error. Mea maxima culpa. This is why I love writing for the web … my errors don’t last long.]

Our marvelous host, Anthony Watts, alerted me to a new paper yclept “New Record Ocean Temperatures and Related Climate Indicators in 2023”.

Of course, since I’m “the very model of a modern major general”, my first thought was “Is there gender balance among the authors as required by DEI?”. I mean, according to the seminal paper “Ocean sciences must incorporate DEI, scholars argue“, that’s a new requirement. Not balance by sex. Balance by gender.

However, it turns out that there are thirty-five authors of the new paper. I downloaded the citation. It says “Cheng, L., Abraham, J., Trenberth, K., Boyer, T., Mann, M., Zhu, J., Wang, F., Yu, F., Locarnini, R., Fasullo, J., Zheng, F., Li, Y., Zhang, B., Wan, L., Chen, X., Wang, D., Feng, L., Song, X., Liu, Y., Reseghetti, F., Simoncelli, S., Gouretski, V., Chen, G., Mishonov, A., Reagan, J., Von Schuckmann, K., Pan, Y., Tan, Z., Zhu, Y., Wei, W., Li, G., Ren, Q., Cao, L., Lu, Y.”

Ooogh … gonna be hard to determine their genders. Can’t just check their names, that would be transphobic. Have to contact each one and ask them about their sexual proclivities … that’ll go over well …

In addition, there’s a numerical problem with genders.

Here, from the San Francisco “GIFT” program, which will give $1,200/month in taxpayer money preferentially to illegal alien ex-con transgender prostitutes with AIDS who can’t speak English, is their checkbox list of genders. (And no, I’m not kidding—that is their preferred recipient, the person that goes to the head of the line for “free” taxpayer money. But I digress…)

So buckle up and keep your hands inside the vehicle at all times: let’s take a ride through their official list of genders.

GENDER IDENTITY (Check all that apply)

Cis-gender woman
Woman
Transgender Woman
Woman of Trans experience
Woman with a history of gender transition
Trans feminine
Feminine-of-center
MTF (male-to-female)
Demigirl
T-girl
Transgirl
Sistergirl
Cis-gender man
Man
Transgender man
Man of Trans experience
Man with a history of gender transition
Trans masculine
Masculine-of-center
FTM (female-to-male)
Demiboy
T-boy
Transguy
Brotherboy
Trans
Transgender
Transsexual
Non-binary
Genderqueer
Agender
Xenogender
Fem
Femme
Butch
Boi
Stud
Aggressive (AG)
Androgyne
Tomboy
Gender outlaw
Gender non-conforming
Gender variant
Gender fluid
Genderfuck
Bi-gender
Multi-gender
Pangender
Gender creative
Gender expansive
Third gender
Neutrois
Omnigender
Polygender
Graygender
Intergender
Maverique
Novigender
Two-spirit
Hijra
Kathoey
Muxe
Khanith/Xanith
X-gender
MTX
FTX
Bakla
Mahu
Fa’afafine
Waria
Palao’ana
Ashtime
Mashoga
Mangaiko
Chibados
Tida wena
Bixa’ah
Alyha
Hwame
Lhamana
Nadleehi
Dilbaa
Winkte
Ninauposkitzipxpe
Machi-embra
Quariwarmi
Chuckchi
Whakawahine
Fakaleiti
Calabai
Calalai
Bissu
Acault
Travesti
Questioning
I don’t use labels
Declined
Not Listed: _________________

Heck, there are only about a hundred “genders” there. That means there shouldn’t be any problem determining which author in this paper is a “Calabai” and which is a “Calalai” …

In addition, the number of authors brings up what I modestly call “Willis’s First Rule Of Authorship”, which states:

Paper Quality ≈ 1 / (Number Of Authors)²

But enough digression … moving on to the paper, there’s a fascinating claim in the abstract, viz:

In 2023, the sea surface temperature (SST) and upper 2000 m ocean heat content (OHC) reached record highs. The 0–2000 m OHC in 2023 exceeded that of 2022 by 15 ± 10 ZJ (1 Zetta Joules = 10²¹ Joules) (updated IAP/CAS data); 9 ± 5 ZJ (NCEI/NOAA data).

So … what is the relationship between ZJ and the temperature of the top 2000 meters? Let me use the NCEI/NOAA data. Here are the calculations; skip them if you wish, the answer’s at the end. Lines marked with [1] are the computed results. Everything after a # is a comment.

> (seavolume=volbydepth(2000)) #cubic kilometers

[1] 647,988,372

> (seamass = seavolume * 1e9 * 1e3 * 1.025) # kg

[1] 6.641881e+20

> (specificheat=3850) # joules/kg/°C

[1] 3850

> (zjoulesperdeg=specificheat * seamass / 1e21) #zettajoules/°C, to raise seamass by 1°C

[1] 2557.124

> (zettajoules2023 = 9) # from the paper

[1] 9

> (tempchange2023 =zettajoules2023 / zjoulesperdeg) # °C

[1] 0.0035

So all the angst is about a temperature change of three and a half thousandths of one degree. EVERYONE PANIC!!

But that wasn’t the interesting part. The interesting part is their uncertainty, which per NCEI/NOAA is ± 5 ZJ. Let me note to start that the results of the two groups, IAP/CAS and NCEI/NOAA, differ by 6 ZJ …

Using the above calculations, 5 ZJ is ± 0.0019°C … they are seriously claiming that we can measure the temperature of the top 2,000 meters of the ocean to within ±0.0019°C.
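For anyone who wants to check the arithmetic without firing up R, here’s the same conversion redone as a short Python sketch. The ocean volume is taken from the R output above, and the density and specific heat are the same values used in the calculations:

```python
# Convert ocean heat content changes (in zettajoules) to an average
# temperature change of the top 2000 m, mirroring the R session above.

sea_volume_km3 = 647_988_372                        # from volbydepth(2000), cubic km
sea_mass_kg = sea_volume_km3 * 1e9 * 1e3 * 1.025    # km^3 -> m^3 -> kg at 1.025 kg/L
specific_heat = 3850                                # joules/kg/degC for seawater
zj_per_degC = specific_heat * sea_mass_kg / 1e21    # ZJ to warm the layer by 1 degC

print(round(zj_per_degC, 1))       # 2557.1 ZJ per degree
print(round(9 / zj_per_degC, 4))   # 2023 increase: 0.0035 degC
print(round(5 / zj_per_degC, 5))   # claimed uncertainty: 0.00196 degC
```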

And how are they doing that?

They say “The main subsurface observing system since 2005 is the profiling floats from the Argo program”. These are amazing floats that sleep a thousand meters down deep in the ocean, then periodically wake up, sink further down to two thousand meters, and then rise slowly to the surface, measuring temperature and salinity along the way. When they reach the surface, they phone home like ET, report the measurements, and sink down a thousand meters to go to sleep again. They’re a fascinating piece of technology. Here’s a map of the float locations from a few years back.

There are about 4,000 floats, each of which measures the temperature as it rises from 2000 meters up to the surface every 10 days. Note that they tend to concentrate in some areas, like the intertropical convergence zone by the Equator and the US East Coast, while other areas are undersampled.

So to start with, ignoring the uneven sampling, each float is theoretically representative of an area of about 92,000 square kilometers, down to two kilometers depth. That’s a bit more area than Austria, Portugal, or the state of South Carolina.
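As a quick sanity check on that figure (a sketch: it assumes the commonly quoted ~361 million km² for the global ocean surface, and of course ignores the uneven coverage):

```python
# Rough area "represented" by each Argo float, ignoring uneven sampling.
ocean_area_km2 = 361e6        # approximate global ocean surface area, sq. km
n_floats = 4000
area_per_float_km2 = ocean_area_km2 / n_floats
print(round(area_per_float_km2))   # ~90,000 sq. km -- roughly Austria or Portugal
```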

Now consider their claim for a moment. We put one single thermometer in Austria, take one measurement every 10 days for a year … and claim we’ve measured Austria’s annual average temperature with an uncertainty of ±0.0019°C???

Yeah … that’s totally legit …

But wait, as they say on TV, there’s more. That’s just measuring the surface temperature, but the Argo floats are measuring a 3D volume, not the surface. So their claimed uncertainty is even less likely.

Here’s another way to look at it. We’re talking about the uncertainty of the average of a number of measurements. As we get more measurements, our uncertainty decreases … but it doesn’t decrease directly proportionally to the number of measurements.

Instead, it decreases proportionally to the square root of the number of measurements. This means if we want to decrease the uncertainty by one decimal point, that is to say we want to have one-tenth of the uncertainty, we need one hundred times as many measurements.

And of course, this works in reverse as well. If we have one-hundredth of the number of measurements, we lose one decimal point in the uncertainty.

So let’s apply that to the ARGO floats.

Claimed uncertainty with 4,000 floats = ± 0.0019°C

Therefore, uncertainty with 40 floats = ± 0.019°C

And uncertainty with 4 floats = ± 0.019 times the square root of 10 ≈ ± 0.06°C …

Their claimed uncertainty says that four ARGO floats could measure the temperature of the entire global ocean to an uncertainty of less than one tenth of one degree … yeah, right.
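The square-root scaling above can be written out explicitly. A sketch, assuming independent, identically distributed measurements — which is the most generous possible assumption, since correlated or unevenly spaced measurements make the uncertainty worse, not better:

```python
import math

def scaled_uncertainty(u_claimed: float, n_claimed: int, n: int) -> float:
    """If uncertainty scales as 1/sqrt(N), cutting N by a factor k
    multiplies the uncertainty by sqrt(k)."""
    return u_claimed * math.sqrt(n_claimed / n)

u_4000 = 0.0019   # claimed uncertainty (degC) with 4,000 floats
print(round(scaled_uncertainty(u_4000, 4000, 40), 3))   # 0.019 degC with 40 floats
print(round(scaled_uncertainty(u_4000, 4000, 4), 2))    # 0.06 degC with 4 floats
```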

Sadly, I fear that’s as far as I got in their paper … I was laughing too hard to continue. I’m sure it’s all sciency and everything, but they lost me by hyperventilating over an ocean warming of three and a half thousandths of a degree and put me over the edge by claiming an impossibly small uncertainty.

Here, a sunny morning in the redwood forest after a day of strong rain, with football playoffs (not the round ball kind) starting in a little while—what’s not to like?

My very best to all,

w.

[ADDENDUM] To close the circle, let me do a sensitivity analysis. The paper mentions that there are some other data sources for the analysis like XBTs (expendable bathythermographs) and other ship-deployed instruments.

So let’s assume that there were a further 4,000 scientific research vessels that each made a voyage taking thirty-six XBT measurements. That would double the total number of measurements taken during the year. Never mind that there aren’t 4,000 scientific research vessels; this is a sensitivity analysis.

That would change the calculations as follows:

Claimed uncertainty with 8,000 floats + measurements = ± 0.0019°C

Therefore, uncertainty with 80 floats + measurements = ± 0.019°C

And uncertainty with 8 floats + measurements = ± 0.019 times the square root of 10 ≈ ± 0.06°C …

We come to the same problem. There’s no way that 8 thermometers taking temperatures every 10 days can give us the average temperature of the top two kilometers of the entire global ocean with an uncertainty of less than 0.1°C.

MY USUAL: When you comment please quote the exact words you are discussing. It avoids endless misunderstandings.

281 Comments
David Spain
January 15, 2024 1:34 pm

Willis, from your paper quality formula:

Paper Quality ≈ 1 / (Number Of Authors)²

Did you intend to say that the denominator is (Number of Authors) squared or was that 2 a missing footnote/citation? Trying to determine the paper quality within an order of magnitude. So is this PQ=~.0285714 or PQ=~.0008163?

dk_
Reply to  David Spain
January 15, 2024 1:51 pm

recommended update
1/(2*(number authors) * (number of author genders))

bdgwx
Reply to  David Spain
January 15, 2024 2:09 pm

A publication documenting the Higgs Boson mass had 5154 authors. Its PQ would be 0.00019 or 0.000000038 (depending on whether it was squared or not). That means this publication would be 150x or 21500x (again… depending on whether it was squared or not) higher quality than one documenting the mass of a fundamental particle.

Obviously this is an absurd (and probably mostly tongue-in-cheek) method for determining quality since authorship is based on who contributed and since larger efforts have a larger number of contributors.

bdgwx
Reply to  Willis Eschenbach
January 15, 2024 4:37 pm

It was just one institution…CERN.

Reply to  bdgwx
January 16, 2024 5:00 am

Try looking up ATLAS Collaboration and CMS Collaboration.

bdgwx
Reply to  DavsS
January 16, 2024 5:33 am

Exactly. That’s how we know the experiment was done at just one institution…CERN.

Izaak Walton
January 15, 2024 2:03 pm

Willis,
your calculations differ by about an order of magnitude from those in the original paper. From their abstract:
“Associated with the onset of a strong El Niño, the global SST reached its record high in 2023 with an annual mean of ~0.23°C higher than 2022 and an astounding > 0.3°C above 2022 values for the second half of 2023”.

So they have converted a 15 ZJ energy difference into a 0.23 degree temperature change while you give 0.035. Do you know who is correct or why there is such a big difference?

Izaak Walton
Reply to  Willis Eschenbach
January 15, 2024 2:35 pm

Hi Willis,
you are right. Thanks.

Izaak Walton
Reply to  Willis Eschenbach
January 15, 2024 7:23 pm

Hi Willis,
Can you check your figures? You use 1e9 to represent 10^9 but 10e21 to represent 10^21.
If you replace 10e21 in your calculation by 1e21 the temperature change goes down by a factor of 10 which I am sure you would be pleased by.

Reply to  Izaak Walton
January 15, 2024 9:07 pm

Good catch Mr. Walton!

bdgwx
Reply to  Izaak Walton
January 16, 2024 5:37 am

Nice catch. I was so blinded by the fact that the significant figures matched my results that I didn’t even notice they were missing a zero.

bdgwx
Reply to  Izaak Walton
January 15, 2024 2:13 pm

SST = Sea Surface Temperature. Willis’ figure is for the first 2000m. His figure is very close to that which I calculated independently.

Reply to  bdgwx
January 15, 2024 9:25 pm

So meaninglessly tiny !

January 15, 2024 3:17 pm

It is worth noting that the Argo dataset itself is subject to ongoing QA, with subsequent adjustments and edits, and separate datasets that have passed the QA filters and adjustments.

I always wonder whether that results in an inadvertent bias towards what people think should be happening….

https://argo.ucsd.edu/data/data-faq/#RorD

What is the difference between an “R” and a “D” profile file?

R files contain data that have only passed automated simple QC tests in real time and so may contain temperature, pressure and/or salinity errors. Most of these errors are the result of sensor drifts. D files have passed expert QC inspection and have had sensor drifts removed. They also have statistical uncertainty assigned to each observation based on both the sensor accuracy and the correction accuracy. If your application requires very high accuracy, use the ADJUSTED fields in the D-files and read and inspect their assigned errors and qc flags.

In particular, if you are doing scientific work sensitive to small pressure biases (e.g. calculations of global ocean heat content or mixed layer depth) Argo recommends the following guidelines:

Use the quality flags in the Argo data files and data labeled with QC = 1

Only use delayed-mode data (D files)

Only use ADJUSTED data variables

PRES_ADJUSTED_ERROR should be checked and where values are greater than 20db, these data should be rejected

If your work is not sensitive to small pressure biases, it is probably acceptable to use “R” files. “R” files should be free from gross errors in position, temperature and pressure. Additionally, if there is a known offset in salinity, this is applied and appears in the PSAL_ADJUSTED variable.

Reply to  markx
January 16, 2024 6:46 am

“Most of these errors are the result of sensor drifts. D files have passed expert QC inspection and have had sensor drifts removed.”

Really? They have ship-board calibration labs out checking the 4000 buoys? How do they find the darn things?

What they are saying is that what they can identify AT THE CALIBRATION LAB is measured, before dumping them into the sea? Afterwards? GUESSWORK!

Why don’t they just identify adjustments to the field data as TYPE B measurement uncertainty estimates and tell us how they arrived at that value?

January 15, 2024 3:42 pm

Let’s remember that values reported for the Earth Energy Imbalance (EEI) such as in Loeb, et al 2021 ( https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2021GL093047 ) rely on the Argo float data and also the CERES EBAF TOA data which itself is “constrained to the ocean heat storage.”

To my mind therefore, any claim to have determined the EEI to the precision stated (two decimal points W/m^2) is unconvincing for the same reasons Willis gives in this post. If everything depends on ocean heat content trends, and we can’t know the value close enough for the calculation, then EEI cannot be determined accurately in any case.

“CERES_EBAF_Edition4.1 is the Clouds and the Earth’s Radiant Energy System (CERES) Energy Balanced and Filled (EBAF) Top-of-Atmosphere (TOA) and surface monthly means data in netCDF format Edition 4.1 data product. Data was collected using the CERES Scanner instruments on both the Terra and Aqua platforms. Data collection for this product is ongoing.

CERES_EBAF_Edition4.1 data are monthly and climatological averages of TOA clear-sky (spatially complete) fluxes and all-sky fluxes, where the TOA net flux is constrained to the ocean heat storage.” (Emphasis mine.)
Source https://asdc.larc.nasa.gov/project/CERES/CERES_EBAF_Edition4.1

I would welcome a correction if I am somehow misunderstanding how this works.

bdgwx
Reply to  David Dibbell
January 15, 2024 4:35 pm

To my mind therefore, any claim to have determined the EEI to the precision stated (two decimal points W/m^2) is unconvincing for the same reasons Willis gives in this post.

CERES estimates an uncertainty of ±0.48 W/m2 for the EEI in 2019 according to the publication you linked to.

Reply to  bdgwx
January 15, 2024 5:10 pm

It’s a nonsense number, considering the uncertainties of the constituent irradiance measurements are 5 W/m2 or greater.

Reply to  karlomonte
January 15, 2024 6:00 pm

To your point, I note that the 2km resolution near-real-time GOES East images for Band 16 show how unreasonable it is to suppose that the CERES sensors can ever result in a 1 deg x 1 deg gridded product with an uncertainty better than your number.

Reply to  bdgwx
January 15, 2024 5:11 pm

I take it you are referring to the statement “The linear trend of CERES implies a net EEI of 0.42 ± 0.48 W m−2 in mid-2005 and 1.12 ± 0.48 W m−2 in mid-2019.”

Yes, the EEI in W/m^2 (and also the +/- uncertainty) are stated to a precision of two decimal places. That is what I am pointing out as unconvincing.

bdgwx
Reply to  David Dibbell
January 15, 2024 5:22 pm

It is unconvincing that the uncertainty is ±0.48 W/m2?

Would it be convincing if it were stated as ±0.5 W/m2?

Reply to  bdgwx
January 15, 2024 6:00 pm

Still no.

bdgwx
Reply to  David Dibbell
January 15, 2024 6:10 pm

If not 1 or 2 decimal places then how many would be convincing?

Reply to  bdgwx
January 15, 2024 9:00 pm

It’s an order of magnitude thing.

Reply to  bdgwx
January 16, 2024 3:48 am

KM is correct. Watch from space with an open mind to be shown that it is at least an order of magnitude (base 10) beyond our capabilities to diagnose a tiny imbalance. Perhaps some will not be persuaded even by these observations from a single instrument from a fixed location in space.
https://youtu.be/Yarzo13_TSE

Source of these visualizations: https://cdn.star.nesdis.noaa.gov/GOES16/ABI/FD/16/

bdgwx
Reply to  David Dibbell
January 16, 2024 5:07 am

So it is not the number of decimal places you are unconvinced by but the magnitude of the value? No?

Are you thinking the EEI could be as high as 6 W/m2? No?

Reply to  bdgwx
January 16, 2024 5:47 am

It is not the magnitude of the EEI itself, but the order(s)-of-magnitude variation of emitter outputs and solar inputs that make it unrealistic to think of an EEI computed from sensors as useful for diagnosis.

Reply to  David Dibbell
January 16, 2024 6:39 am

Part of the problem here is the use of “averages” – typical for climate science.

Since the radiation varies as T^4, the actual radiation total (i.e. the integral of the radiation curve) will be much higher than that given by an average measurement.

Consider the exponential decay of temperature at night. The radiation at sunset, for example, will be T0^4. The radiation from the average temperature of the curve will be nowhere near that high.

The use of averages should be anathema to science – averages lose far too much necessary information.
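Tim Gorman’s point is easy to illustrate numerically. A sketch with made-up day/night temperatures, using the Stefan–Boltzmann law (emission ∝ T⁴); by Jensen’s inequality the average of T⁴ always exceeds the fourth power of the average T:

```python
# Radiating at the average temperature understates the average radiation,
# because emission goes as T^4 (Stefan-Boltzmann law).
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/m^2/K^4

def emission(t_kelvin: float) -> float:
    return SIGMA * t_kelvin ** 4

warm, cold = 300.0, 260.0                      # hypothetical day/night temps, K
avg_of_fluxes = (emission(warm) + emission(cold)) / 2
flux_at_avg_temp = emission((warm + cold) / 2)

print(round(avg_of_fluxes, 1))      # mean of the two fluxes (~359.2 W/m^2)
print(round(flux_at_avg_temp, 1))   # flux at the mean temperature (~348.5 W/m^2)
```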

Reply to  Tim Gorman
January 16, 2024 6:54 am

Good point about averages. In the GOES East images, the Atacama Desert area along the western coast of South America is a good example of the rapid rise and decay of longwave radiation.

bdgwx
Reply to  David Dibbell
January 16, 2024 6:50 am

It’s not the number of decimal places and it’s not the magnitude of EEI itself? Correct?

Is it the number of decimal places and order of magnitude of the two components of EEI, i.e. ASR and OLR? Correct?

Reply to  bdgwx
January 16, 2024 7:10 am

Please go back and read my original comment.

bdgwx
Reply to  David Dibbell
January 16, 2024 7:49 am

I’m responding to To my mind therefore, any claim to have determined the EEI to the precision stated (two decimal points W/m^2) is unconvincing for the same reasons Willis gives in this post.” Is there something else in your original comment you want me to focus on?

Reply to  bdgwx
January 16, 2024 8:14 am

You are free to disagree and give the reason.

bdgwx
Reply to  David Dibbell
January 16, 2024 8:31 am

I don’t have a question about anything else in your post.

The question I have right now is… is it the number of decimal places and order of magnitude of the two components of EEI, i.e. ASR and OLR? Correct?

The reason I’m asking the question is because I can abate your concern regarding ASR and OLR (assuming that is your concern) by informing you that the publication cited in Willis’ post does not calculate EEI using the ASR − OLR method. In fact, it doesn’t use ASR or OLR at all.

Reply to  bdgwx
January 16, 2024 9:18 am

Willis’ post was about ocean heat content, and that is why I mentioned it in my comment. If you disagree with my original comment or any of my subsequent replies, then you are free to specify and support your alternative viewpoint. I am not asking you for anything.

bdgwx
Reply to  David Dibbell
January 16, 2024 9:41 am

My intent was to provide useful information. I got the feeling you felt Loeb et al. 2021 was saying they knew the EEI to within 0.01 W/m2. They don’t. They only know it to within 0.48 W/m2.

Reply to  bdgwx
January 16, 2024 9:24 pm

Still nonsense.

Reply to  bdgwx
January 17, 2024 3:49 am

“I got the feeling you felt Loeb et al. 2021 was saying they knew the EEI to within 0.01 W/m2. They don’t.” Glad that’s cleared up. No, I did not take the value reported in Loeb et al. 2021 that way.
“They only know it to within 0.48 W/m2.” I am unconvinced the bounds of the interval can be known to that precision.

Reply to  bdgwx
January 15, 2024 8:42 pm

To find an imbalance you need accurate measurements. The image shows CRN station error ranges for insolation and soil temps. These are supposed to be the best available measuring systems.

Think carefully about how one-hundredths decimal places are arrived at!

PSX_20240115_223747
Jerry Magnan
January 15, 2024 3:52 pm

And Michael Mann is one of the co-authors of the report. He’s everywhere!

Anyway, thank you, Willis! Some comments from a layman:

NASA’s Vital Signs Sea Level chart does show a jump in sea level in 2023, which could be partly explained by some increase in ocean temperature among other possible factors. It reflects what the sea level gauge in Boston also showed – an atypical rise in 2023, following a drop in 2022 from 2019–2021 levels. Basically, the ocean has been rising around 3.5–4.0 mm/yr for 30 years, and the recent 2023 global rise pretty much keeps it within that range.

Some quibbles. The report states on its first page:
“…the global SST reached its record high in 2023 with an annual mean of ~0.23degC and an astounding > 0.3degC above 2022 values for the second half of 2023…”

First quibble – Why are they calling a fast temperature increase in the second half of 2023 “astounding”? Why the extreme emotional wording in a scientific paper?

Second quibble -Their own Fig. 3 shows that there have been near-comparably dramatic one-year SST jumps in approximately 1957, 1977 and 1997. (In each instance, the SST leveled off or dropped the next year or soon after.) Why weren’t they called out as dramatic events in the paper?

Final quibble – Why focus on a short 6-month period in a very strong El Nino year that the report points out is following two years of an SST-suppressing La Nina, as shown in their Fig.1? Which could have explained the short-term jump as an understandable short term temperature rebound. Also, were there other dramatic 6-month changes (in either direction) in the historical record (as shown in Fig.3) that could have also earned the “astounding” rating? This would have helped to put recent developments in perspective.

Just askin’.

Reply to  Jerry Magnan
January 15, 2024 6:59 pm

I’ll give you a hint. They are not “disinterested observers.” They are advocates for their preferred view of reality.

Reply to  Jerry Magnan
January 15, 2024 9:28 pm

If Mann is a co-author, you can be absolutely sure it is one big non-scientific CON-job.

Erik Magnuson
Reply to  Jerry Magnan
January 15, 2024 9:35 pm

IMHO, seeing M. Mann as one of the authors leads me to wonder how much torturing the data had to endure before being shown to the public?

John Hultquist
January 15, 2024 4:04 pm

I thought, with a title including “tiny numbers” I might learn why I and most of the rest of Canada and the U. S. have had a week of tiny-number temperatures. Fortunately, I have less snow and less cold than many, so I empathize with my many comrades.
All those buoys going up and down and floating about also made me think of Bernie Taupin/Elton John’s “Tiny Dancers“.
I’ll get my coat. 🙂

January 15, 2024 4:58 pm

Off subject but has anybody noticed that the ass has fallen out of ENSO and there is a large cold water tongue from Chile making its way quickly across the Pacific?

ENSO-diminishing-2024-01-12
Reply to  Streetcred
January 15, 2024 4:59 pm

and today

Enso-2024-01-16
January 15, 2024 5:33 pm

Note that they tend to concentrate in some areas, like the intertropical convergence zone by the Equator and the US East Coast, while other areas are undersampled.

The floats seem to be particularly sparse in the area where the University of Maine, using the Reanalyzer program to predict global temperatures around the 4th of July, created a stir with their claim of unprecedented warmth.

lanceman
January 15, 2024 5:48 pm

I noticed all the authors with Asian names are academics in the People’s Republic of China. Not that I am claiming there is a conspiracy by China to hype climate change to induce us to enact policies which would weaken us economically, but what would happen if these scientists staged public protests about China not doing enough to fight climate change? Has there EVER been such a protest in the PRC?

Reply to  lanceman
January 16, 2024 4:49 am

I don’t recall St Greta ever going there.

Gnrnr
January 15, 2024 6:49 pm

What is this int he first sentence Willis?

“yclept”

Gnrnr
Reply to  Gnrnr
January 15, 2024 6:50 pm

typo? (like mine…)

Gnrnr
Reply to  Willis Eschenbach
January 15, 2024 8:41 pm

Thanks Willis. I’ve never seen it before and thought I’d ask.

Reply to  Willis Eschenbach
January 15, 2024 9:03 pm

The first letter ‘y’ pronounced long-‘e’ tells me it likely came from French.

Reply to  karlomonte
January 16, 2024 3:37 am

… came from French.

No, it originated from the past participle of the Old English verb ‘clepen’, to call or name. In those olden days past participles were formed by adding the prefix ‘ge-’, as in German or Dutch: geclept. In Middle English the prefix was changed to ‘y-’ -> ‘yclept’, which has survived into modern English (purely for humorous or poetic effect), making it the only English verb still in use with this archaic participle prefix. https://en.wiktionary.org/wiki/yclept

Reply to  Johanus
January 16, 2024 7:40 am

Interesting, thanks.

lb
Reply to  Johanus
January 16, 2024 1:10 pm

As far as I know Middle English was heavily influenced by French. So the idea that the “ee” sound was imported from French seems not too far-fetched. Consider modern French words like “éclair” or “étrange” or even “excuse”.
🙂

Reply to  lb
January 16, 2024 2:30 pm

Yes, the Norman invasion (aka Battle of Hastings) in 1066 had an enormous effect on the English language. In fact it marked the end of Old English, spoken by everyone. The conqueror, King William, spoke French which trickled down on the entire English society. Middle English became the language of the lower classes.

So, perhaps, the transition of ‘ge-‘ to ‘y-‘ was facilitated by francization.

https://english.stackexchange.com/questions/551683/palatization-of-y-from-ga

But the sound of é (accent aigu) in your examples is ‘aye’ not ‘ee’. That usually evolved from Latin words beginning with ‘ex’.

Reply to  Gnrnr
January 15, 2024 7:05 pm

Does your browser not support a search engine?

Gnrnr
Reply to  Clyde Spencer
January 15, 2024 8:40 pm

I’m reasonably well read and have never ever seen anything like it before, so I didn’t go looking. Did you need to work at being a dick or did it come naturally?

Reply to  Gnrnr
January 15, 2024 9:23 pm

Considering my gender, it comes naturally. The point is, you were asking Willis to explain something that was perfectly legitimate and you could have discovered that on your own in 7 seconds and avoided this ugly exchange.

Gnrnr
Reply to  Clyde Spencer
January 15, 2024 9:41 pm

Or you could just have not responded in the first place and avoided the exchange, but you are a dick, so there is that. I’m of the same gender, yet managed to not be a dick (at least not as a start point), go figure.

Reply to  Gnrnr
January 16, 2024 8:42 am

I didn’t start out with a crude insult! You seem to be spun up a little tight to take a gentle jab about your laziness and turn it into a crude personal attack. I’m sorry, but somehow a button got pushed when you demonstrated a lack of responsibility for educating yourself. If Willis had spelled the word wrong, or it was so arcane that it doesn’t show up in an online search, then it would have been appropriate to ask him about it. However, you went directly to him, not showing any initiative to explore it on your own. Might it be that your lashing out at me is a defense because you recognize you came off looking helpless?

Reply to  Clyde Spencer
January 16, 2024 9:05 am

Upon some reflection, I suspect my reaction comes from the days when I used to teach. All too often, I had students ask me questions whose answers were either in the assigned reading in the text, or in the instructions for the lab procedure. They apparently hadn’t read either, and were taking my time away from other students who had invested the time but were still having difficulty understanding it. My priority was always for those who had made the effort but needed additional help to get past some roadblock. However, there is also that I’m becoming a grumpy old man with less patience than when I was young.

Reply to  Willis Eschenbach
January 15, 2024 8:44 pm

Every time I read one of your articles or comments my IQ goes up!

Thank you for being you!

Reply to  Willis Eschenbach
January 15, 2024 8:56 pm

Even freshwater fishermen know this if they spend much time with depth finders. Why would fish suspend in the middle of a lake halfway to the bottom? Temperature is one variable.

Reply to  Willis Eschenbach
January 15, 2024 9:09 pm

Can I play “Devil’s advocate”?

Given the Argo system, with its 4000 units roughly randomly distributed in non-polar regions, if we take whatever number it comes up with now, say an average temperature, and compare it to a similar number calculated at a different time, can a valid comparison be made? Could we really, confidently say that it’s warmer or colder (or has higher/lower total energy content) than another period?

The Argo system is basically taking random readings over the most of the globe, isn’t that enough to filter out noise? How many Argos would it take to have some serious confidence in the data? There’s millions of square kilometers of surface – as a ballpark estimate, I feel happy with 1 per kilometre – how to know if that is enough or too excessive?

Back in engineering class in the Pleistocene era, we were doing experiments with small things, say the classic thermodynamic experiment with a rod where one end is in a bucket of melting water and the other end heated by a flame. Because the rod was relatively thin, it didn’t matter where you took the temperature, left or right side, top or bottom, when taking measurements every few centimeters from one end to the other. But how to deal with a column of water 2 kilometers deep and spaced out hundreds of kilometers from the next one. It’s like we would need the Argos to move laterally on command, to come close together and move apart so that we can see how much the temperature can fluctuate side to side even at depths in the thermocline and in the very deep water.

Is the very deep water, at 2 km, very stable at 4°C, even compared pole to equator?

Thanks!

Reply to  Willis Eschenbach
January 16, 2024 5:21 am

… take a look at some of the datasets

Editorial note : My “sense of humour” was surgically removed when I was very young.

For the heat content and 0-2000m temperature anomalies a good start is the NCEI / NOAA webpage.

URL 1 : https://www.ncei.noaa.gov/data/oceans/woa/DATA_ANALYSIS/3M_HEAT_CONTENT/DATA/basin/

The “./3month” sub-directory contains OHC data, for global (world, 0-2000m, quarterly) average anomalies you want the 4 “h22-w0-2000m*.dat” files.

The “./3month_mt” sub-directory contains temperature data, for global average anomalies you want the 4 “T-dC-w0-2000m*.dat” files.

The “./onemonth” subdirectory only contains OHC anomaly data, and only from January 2005. The 12 “month-h22-w0-2000m*.dat” files are therefore “limited” to the ARGO network time period.
_ _ _ _ _ _ _ _ _ _ _

The lead author of the paper cited in the ATL article, “Cheng, L”, has a very useful website (in English !) for OHC-related reanalysis datasets.

URL 2 : http://www.ocean.iap.ac.cn/pages/dataService/dataService.html

Clicking on the “Time series” tab (left-hand side, 4th option from the top) gives access to the 0-700m and 0-2000m files, both of which go from January 1940 to December 2023 (at the time of typing this).

Note that he has “recently” (?) added a “2000-6000m” option, but that only has “valid” (reanalysis …) numbers from January 1992.

Another set of “extremely interesting” (to me, at least) files can be found under the (penultimate) “Other Obs data” tab.

Scroll down to “3. Data in the paper of Cheng et al. 2019 Science” entry, which has the title “Global OHC0-2000m changes (Unit: Zeta Joules; Resolution: annual mean) from 1955 to 2100 from CMIP5 models and four latest observational datasets”.

There you will discover a set of “CMIP5 models” files, which provide OHC numbers for individual model “projections” going all the way to the year 2100 for RCPs 2.6, 4.5 and 8.5, from which “ensemble means” can be easily calculated for any desired “Reference Period”.

Curious George
Reply to  Willis Eschenbach
January 16, 2024 12:45 pm

Re Rayleigh–Bénard convection: Does the heat from below mean the geothermal average of about 100 mW/m²? My BS meter awakens. I believe – and your graphs agree – that the deeper ocean is colder.

Reply to  Willis Eschenbach
January 16, 2024 2:34 pm

All three of the metrology experts I have books/papers on say that identifying systematic biases in measurements is next to impossible using statistical analysis. Since you don’t know the systematic bias in nearby units, they can’t be used to calibrate another one – unless you increase the measurement uncertainty to account for the possible systematic bias in the nearby units, i.e. a Type B measurement uncertainty component.

As you point out, a sudden change in readings can be caused by several things, none of which can be considered a “quality control” issue.

As I’ve stated elsewhere, when I see someone stating they can “adjust” data to account for measurement uncertainty in a field measurement device I just automatically count them as either ignorant or charlatans.

prjndigo
January 15, 2024 7:43 pm

Hoookay.

So let’s talk about a margin of error based on the barely 6°C of warming, starting at 280 K ambient at Earth’s distance from the sun … and its effects on pangendered red squirrel attack helicopters in Wales.

January 15, 2024 8:10 pm

Using the above calculations, 5 ZJ is ± 0.019°C … they are seriously claiming that we can measure the temperature of the top 2,000 meters of the ocean to within ±0.019°C.

_______________________________________________________________

Maybe that does beat the IPCC’s AR4 Chapter 5 Opening Statement:

     The oceans are warming. Over the period 1961 to 2003, global ocean temperature has risen by 0.10°C from the surface to a depth of 700 m.

Reply to  Steve Case
January 15, 2024 9:43 pm

Ya, the balls, or ovaries, to claim the temperature in the ocean upper layer is 0.10°C higher than in 1961 – not 0.11 or 0.09°C – and based on how many measurements going on in 1961? Wasn’t it just buckets and ropes at that time?

Are we really supposed to be astonished by 0.1 degrees over 4 decades? And that is in the top layer of the ocean, the part most affected by weather, which can range anywhere from near freezing to ~30°C – so the change is less than 1% of the range, and certainly less than anyone or anything would notice or be affected by.

SomeBlokeFromCambridge
January 15, 2024 11:30 pm

I’d used 10e21 instead of 1e21. This means all my results were too large by a factor of 10

Errr… did you mean to write that?

1e21 is 1 x itself 21 times. So still = 1

Whereas 10e21 is 10 x itself 21 times. So a factor of 1,000,000,000,000,000,000,000 (I think I’ve counted the 0s right)

Doh!

SomeBlokeFromCambridge
Reply to  Willis Eschenbach
January 17, 2024 12:36 am

Doh – you are correct.
My “brain fart” 🙂

Gnrnr
Reply to  SomeBlokeFromCambridge
January 16, 2024 3:43 am

An “e” followed by a number means 10 to the power of that number; at least it has always meant that in engineering. So 1 would be written as 1e0 (pointless, yet correct), 0.001 would be 1e-3, 0.0027 would be 2.7e-3, etc. It’s mostly to do with using significant figures correctly.
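The convention can be checked directly in Python, which uses the same notation:

```python
# "e" notation: the digits after 'e' are the power of ten,
# not a count of self-multiplications.
assert 1e21 == 10**21     # 1 x 10^21, not 1^21
assert 10e21 == 10**22    # 10 x 10^21 -- ten times larger than 1e21
assert 1e0 == 1           # pointless, yet correct
assert 1e-3 == 0.001
assert 2.7e-3 == 0.0027

print(10e21 / 1e21)       # a factor of ten (to float rounding)
```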

Richard Greene
January 16, 2024 12:50 am

The satire is tedious, like a joke with a long set up and an unfunny punchline.

Stick to science please.

Where is an editor when one is needed?

I’m also offended that your list did not include albino dwarfs, often called “People of No Color”

January 16, 2024 1:37 am

I’m sure the Chukchi people of Siberia will be delighted that their ethnicity has now transitioned into a sexual orientation LOL

January 16, 2024 3:53 am

“An alert commenter, Izaak Walton, notes below….”

See how nice climate skeptics are: they tolerate and even thank critics. Climate alarmists don’t do the same; they’ll just accuse any critics of being science deniers. Or just ignore them.

rovingbroker
January 16, 2024 4:48 am

Willis wrote, “An alert commenter, Izaak Walton, notes below that I’d used 10e21 instead of 1e21. This means all my results were too large by a factor of 10 “

1e21 is 1 times 1, 21 times which is … 1.

oeman50
January 16, 2024 6:21 am

Willis,

I particularly like that your “Willis’s First Rule of Authorship” squares the number of authors in the denominator.

January 16, 2024 12:37 pm

This is what NOAA reckons in centigrade rather than in joules, since the 1950s:

comment image

Reply to  Ulric Lyons
January 17, 2024 4:40 am

A “zoomed in” view of the same data.

Note that “IAP” is the data from Lijing Cheng’s website (see my previous post), aligned to match the “most recent, therefore with more coverage and probably more accurate / precise” 1995-2020 NOAA data.

OHC_0-2000m_1955-2023
Reply to  Ulric Lyons
January 17, 2024 4:53 am

For reference, a copy of “Fig. 2” from the Cheng et al (2024) paper of the ATL article.

This shows the ocean heat content (OHC, in ZJ), allowing for the ratio between (lots of) Joules and (fractions of) degrees Celsius to be estimated.

Note also that the “cooling the past” result of their reanalysis is present in this figure, it’s just slightly less obvious than in my version …

In any case, a (roughly) 0.19°C temperature rise in 69 years — i.e. 2.75 hundredths of a degree per decade — hardly counts as “rapid” warming … in my books, at least.

Cheng-et-al_Fig-2_OHC-2000m_1958-2023
Reply to  Mark BLR
January 17, 2024 1:12 pm

The warming from 1995 looks AMO related, probably the associated decline in low cloud cover.

Rick C
January 16, 2024 12:46 pm

Willis, thanks for the post. After reading the gender list I’m not even sure what I am.

Anyway, I also find their uncertainty claim ludicrous. My guess is they used the old trick of dividing the MU of the individual data points used in the average by the square root of N. In this case I assume that one data point would be the average of the data collected in one float’s excursion from 2000 m to the surface. With 4000 floats collecting 36.5 data points per year, that’s N = 146,000. To get an MU of 0.0019 C, the MU of each reading would be 0.72 C (0.0019 × SQRT(146,000)). Maybe someone knows what the actual MU of an Argo float traverse is. I did find that the Argo temperature sensor accuracy is claimed to be 0.002 K. That’s for a single measurement of a stable measurand. So the claim is that they have measured the average temperature of the upper 2000 m of the world’s oceans to an uncertainty better than that of the instrument spec. Absurd!
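Rick C’s back-calculation can be replayed in a few lines (the float count, profile rate, and 0.0019 C figure are his round numbers; the √N division is the assumed procedure being criticized, not an endorsed one):

```python
import math

n_floats = 4000
profiles_per_year = 36.5            # roughly one profile every ten days
N = n_floats * profiles_per_year    # profiles per year

# If the claimed uncertainty of the mean (0.0019 C) came from dividing a
# per-profile uncertainty u by sqrt(N), the implied u is:
u_profile = 0.0019 * math.sqrt(N)

print(int(N))                  # 146000
print(round(u_profile, 2))     # ~0.73 C per profile (the comment's 0.72)
```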

This site has had dozens of posts explaining why the “Law of Large Numbers” and the divide by the square root of N is not applicable to measurements of this type.

Reply to  Rick C
January 16, 2024 9:38 pm

0.0019 x sqrt(146000) is probably only the standard deviation of the average. If the instrumental uncertainty is 0.5C, then as a minimum the combined uncertainty should be the root-sum-sqr of the two or 0.9C. The 1 or 2 mK number is only the sensor calibration and doesn’t include other systematic effects.
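A minimal sketch of the root-sum-square combination described above, using the comment’s assumed values (0.5 C instrumental, 0.72 C statistical):

```python
import math

u_instrument = 0.5      # C, assumed instrumental uncertainty
u_statistical = 0.72    # C, the back-calculated per-profile spread

# Root-sum-square combination of independent uncertainty components.
u_combined = math.sqrt(u_instrument**2 + u_statistical**2)
print(round(u_combined, 1))   # ~0.9 C, matching the comment's figure
```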

Reply to  Rick C
January 25, 2024 4:28 am

The measurement uncertainty of the entire float is between +/- 0.3C and +/- 0.5C. Not much different than any other field temperature measurement device that is land based. Since none of those measurements are of the same thing using the same device under the same environmental conditions (i.e. repeatability requirements) the total measurement uncertainty grows as root-sum-square as km pointed out. The total measurement uncertainty is HUGE. As it should be!

Note carefully that you NEVER see a climate scientist talking about MEASUREMENT uncertainty, only how precisely they can calculate the average value. Precision is not accuracy.

It’s all based on the common climate science meme of: “all measurement uncertainty is random, Gaussian, and cancels”.

u(y) = u(random) + u(systematic). Even if *some* of the random uncertainty cancels, it doesn’t *all* cancel. And the systematic uncertainty never cancels; it just adds.

Temperature distributions are *not* Gaussian. The daily temperature profile is typically sinusoidal during the day and typically an exponential decay at night. Put them together and you get a multi-modal distribution, meaning the “average” is physically meaningless (the way climate science computes it, it’s actually a midrange of the daily max and min, not a true average). And then climate science uses this physically meaningless “average” to build an edifice of averages, totally ignoring the fact that even if the distributions were Gaussian you would need the VARIANCE to fully describe the resulting distribution, not just the “average” value.

When you are measuring multiple things a single time using a different device under different environmental conditions, you simply cannot ASSUME that measurement uncertainty is random across all the data elements and thereby cancels. You must prove such an assumption is warranted – but climate science never does.
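A toy illustration of the midrange-versus-mean point, using an entirely made-up daily profile (half-sine daytime warming, exponential night-time cooling); the numbers are arbitrary and only show that the two statistics can differ substantially:

```python
import math

def temp(h):
    """Toy temperature (C) at hour h of a 24-hour day -- illustrative only."""
    if h < 12:
        return 10 + 15 * math.sin(math.pi * h / 12)   # 10 C -> 25 C -> 10 C
    return 5 + 5 * math.exp(-(h - 12) / 6)            # cools from 10 C toward 5 C

hours = [k / 10 for k in range(240)]                  # 0.1 h steps over 24 h
temps = [temp(h) for h in hours]

true_mean = sum(temps) / len(temps)                   # time-weighted average
midrange = (max(temps) + min(temps)) / 2              # the (Tmax+Tmin)/2 statistic

print(round(true_mean, 1))   # ~13.4 C
print(round(midrange, 1))    # ~15.3 C -- nearly 2 C above the true mean
```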

Reply to  Willis Eschenbach
January 27, 2024 6:55 am

I’m sorry Willis. It’s buried on my hard drive (or on one of several). If I find it I’ll post it. Don’t hold your breath.

The last time I looked the document was no longer available on the internet and I’m not well-versed enough to try and find it in historical archives.

This one gives an indication of the standard deviation of the temperature being around +/- 0.1C. https://www.mdpi.com/2077-1312/8/5/313

Even that amount of uncertainty is enough to prevent determining average temperature out to the hundredths digit.

In essence, the document looked at the problems within the water flow channel, the water pumps, the length of time the water was stored, etc. Any kind of detritus in the water channel can affect the bias of the temperature measured by the sensor. As is typical in climate science today, the SEM is substituted for the float measurement uncertainty instead of the actual measurement uncertainty. In other words, the more samples the floats take, the more precise the average value becomes; but that tells you nothing about the actual measurement uncertainty.

Reply to  Willis Eschenbach
January 28, 2024 4:25 am

Willis,

Thank you for finding these references.

The published study pretty much confirms your suspicions, along with those of others, that the ARGO system has problems measuring actual ocean temperatures due to the small area actually being covered.

It also confirms the inanity of many on WUWT who are neither learned nor experienced in the science of making measurements. One cannot simply look at one component in a measuring device to determine its resolution or accuracy. Additionally, the term microclimate exists for a reason, and it must be considered in estimating any measurement uncertainty between weather stations.

Reply to  Willis Eschenbach
January 28, 2024 10:47 am

Thanks Willis. This sounds like the study I saw. I’ll print some of this out and file it. Maybe I’ll be able to find it later!

Reply to  Willis Eschenbach
January 28, 2024 12:20 pm

Thank you . I’ll try to find time to read it.

I have to tell you, however, whenever I see the word “reanalysis” it totally turns me off. Especially in climate science, there are simply too many unstated, implicit assumptions made about how to combine observational data from different sources, assumptions that just aren’t reasonable. Different variances in the data are typically ignored, so differences are not weighted. The same goes for measurement uncertainty: all stated values are assumed to be 100% accurate when forming trend lines, even if the measurement uncertainty is greater than the differences one is trying to identify. If model outputs are used as part of the reanalysis, then metrology protocols are typically ignored under the assumption that model outputs have no measurement uncertainty.

If they have truly identified uncertainty greater than the typical accepted accuracy of the sensor itself, then I’ll be shocked. If their uncertainty is based on SEM calculations then I’ll know they aren’t really identifying measurement uncertainty.

I’ll try to let you know what I find.

Reply to  Willis Eschenbach
January 29, 2024 5:32 am

Since the 1970s, the global ocean has absorbed more than 90% of the excess heat that was mainly caused by the increasing greenhouse gas emissions from the anthropogenic activities.

I may be misinterpreting this, but greenhouse gas emissions are unlikely to have caused the ocean heat content to increase. IR from GHGs only affects the thinnest layer of surface molecules, so it is unlikely that the OHC increase, especially at depth, is caused by GHGs.

I’ll admit I am not an expert on ocean heating so I may be mistaken.

Reply to  Willis Eschenbach
January 27, 2024 1:28 pm

Willis, one thing to look at is what NOAA says for CRN uncertainty. The sensors are pretty much the same, yet the actual uncertainty rises to ±0.3°C for the system.

I’m sure I don’t need to point out to you that measuring devices are a conglomeration of components, but other folks may need some understanding. That system, when made, must follow the adage that uncertainties add, ALWAYS. Each component in the system adds to the total uncertainty. Consequently, the total uncertainty is never, ever as low as the best component.

People who make calibrated measurements feel it in their bones: when the measurement uncertainty of a field measurement is attributed to a single component, something is not right.
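The “uncertainties add” adage can be sketched with a made-up component budget, chosen here only so the total lands near NOAA’s ±0.3°C CRN figure; components are combined in root-sum-square, the usual GUM convention for independent contributions:

```python
import math

# Hypothetical uncertainty budget for a temperature measurement chain (C).
# The values are illustrative assumptions, not NOAA's actual budget.
components = {
    "sensor":       0.002,   # the oft-quoted bare-sensor spec
    "electronics":  0.05,
    "flow/housing": 0.1,
    "drift":        0.28,
}

# Root-sum-square over all components: the system uncertainty can never
# be smaller than its worst component, let alone its best.
u_system = math.sqrt(sum(u**2 for u in components.values()))
print(round(u_system, 2))    # ~0.3 C, despite the 0.002 C sensor
```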

SteveZ56
January 16, 2024 1:27 pm

This is typical of the alarmists’ desire to scare people with large numbers: imagine 15 thousand million million million Joules of extra heat in a year! OMG!

But expressed as an average temperature rise of 0.0035 C per year, no one would be impressed, and people would rightly question (as Willis did) whether we can measure temperatures to within that precision in salt water at a 2,000-meter depth at a pressure of 19.6 MPa (about 2850 psi). Have the researchers considered the possibility of instrument drift? How are the temperature measurement devices recalibrated when the buoys surface? Methinks the signal/noise ratio is fairly small…

There is a whole lot of water in the oceans, and it has a huge heat capacity, so those 15 zettajoules amount to not much on a global scale. Oh, by the way, Joules aren’t very big. You can get about 120 million Joules by burning a gallon of gasoline.

Expressed another way, if the 15 ZJ per year of heat absorption is correct, dividing by 8760 hours/yr and 3600 seconds/hour results in an energy absorption rate of 476 terawatts (476,000 GW). Dividing this by the surface area of the oceans (361 million km^2 = 3.61*10^14 m2) results in an average heat absorption intensity of about 1.32 W/m2, or about 0.1% of the intensity of the sun at the zenith.
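The arithmetic in that paragraph checks out; a quick replay:

```python
# Replaying the paragraph's arithmetic: 15 ZJ/yr as an average flux.
ZJ = 1e21                              # one zettajoule, in joules
energy_per_year = 15 * ZJ              # claimed annual ocean heat uptake
seconds_per_year = 8760 * 3600

power_W = energy_per_year / seconds_per_year
print(f"{power_W / 1e12:.0f} TW")      # ~476 TW

ocean_area_m2 = 361e6 * 1e6            # 361 million km^2, in m^2
flux = power_W / ocean_area_m2
print(f"{flux:.2f} W/m^2")             # ~1.32 W/m^2
```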

The oceans are a huge heat sink, easily able to damp out anything happening in the atmosphere due to infrared absorption by CO2.

January 24, 2024 7:21 pm

It is just amazing how the warmists cry foul (or should that really be “fowl” in this case) every time a statistician or econometrics analyst tells them they are doing it wrong. “They aren’t climate scientists!” No, it takes a climate “scientist” to think they can borrow precision. If they don’t understand the tools, why do they insist they are doing it right? When this whole thing started, it was McIntyre and McKitrick that made them stand up and go, “cheater”.