A Small Margin Of Error

Guest Post by Willis Eschenbach

I see that Zeke Hausfather and others are claiming that 2018 is the warmest year on record for the ocean down to a depth of 2,000 metres. Here’s Zeke’s claim:

Figure 1. Change in ocean heat content, 1955 – 2018. Data available from the Institute of Atmospheric Physics (IAP).

When I saw that graph in Zeke’s tweet, my bad-number detector started flashing bright red. What I found suspicious was that the confidence intervals seemed far too small. Not only that, but the graph is measured in a unit that is meaningless to most everyone. Hmmm …

Now, the units in this graph are “zettajoules”, abbreviated ZJ. A zettajoule is a billion trillion joules, or 1E+21 joules. I wanted to convert this to a more familiar number, which is degrees Celsius (°C). So I had to calculate how many zettajoules it takes to raise the temperature of the top two kilometres of the ocean by 1°C.

I go over the math in the endnotes, but suffice it to say at this point that it takes about twenty-six hundred zettajoules to raise the temperature of the top two kilometres of the ocean by 1°C. That’s 2,600 ZJ per degree.

Now, look at Figure 1 again. They claim that their error back in 1955 is plus or minus ninety-five zettajoules … and that converts to ± 0.04°C. Four hundredths of one degree Celsius … right …

Call me crazy, but I do NOT believe that we know the 1955 temperature of the top two kilometres of the ocean to within plus or minus four hundredths of one degree.

It gets worse. By the year 2018, they are claiming that the error bar is on the order of plus or minus nine zettajoules … which is three thousandths of one degree C. That’s 0.003°C. Get real! Ask any process engineer—determining the average temperature of a typical swimming pool to within three thousandths of a degree would require a dozen thermometers or more …

The claim is that they can achieve this degree of accuracy because of the Argo floats. These are floats that drift at depth in the ocean. Every ten days they rise slowly to the surface, sampling temperatures as they go. At present (well, as of three days ago) there were 3,835 Argo floats in operation.

Figure 2. Distribution of all Argo floats which were active as of January 8, 2019.

Looks pretty dense-packed in this graphic, doesn’t it? Maybe not a couple dozen thermometers per swimming pool, but dense … however, in fact that’s only one Argo float for every 93,500 square km (36,000 square miles) of ocean. That’s a box that’s 300 km (190 miles) on a side and two km (1.2 miles) deep … containing one thermometer.
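
As a rough cross-check on that figure, here is the arithmetic in the same style as the endnotes. The total ice-free ocean area of about 3.6e8 square km is my own round-number assumption, not a value from the IAP analysis.

oceanarea=3.6e8 # approximate ice-free ocean area in square km (assumed round figure)
floats=3835 # active Argo floats as noted above
areaperfloat=oceanarea/floats
print(paste(round(areaperfloat), "square km of ocean per Argo float"))
[1] "93872 square km of ocean per Argo float"
print(paste(round(sqrt(areaperfloat)), "km on a side for the equivalent square"))
[1] "306 km on a side for the equivalent square"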

Here’s the underlying problem with their error estimate. As the number of observations goes up, the error shrinks in proportion to one divided by the square root of the number of observations. And that means that if we want to gain one more decimal place of precision, we need a hundred times as many data points.

For example, if we get an error of, say, a tenth of a degree C from ten observations, then to reduce the error to a hundredth of a degree C we need one thousand observations …
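
Here is a toy illustration of that square-root rule, in the same R style as the endnotes. It simply averages random numbers; the 0.5°C spread of the individual “readings” is an arbitrary assumption for the demonstration, not anything taken from the Argo data.

set.seed(42) # toy demonstration of the 1/sqrt(N) rule, not the IAP method
sdev=0.5 # assumed spread of individual readings in °C, purely illustrative
for (n in c(10, 1000, 100000)) {
  means=replicate(1000, mean(rnorm(n, sd=sdev))) # 1000 trial averages of n readings each
  print(paste(n, "observations give a standard error of about", signif(sd(means),2), "°C"))
}
# prints roughly 0.16, 0.016, and 0.0016 °C: each extra decimal of precision costs 100x the data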

And the same is true in reverse. So let’s assume that their error estimate of ± 0.003°C for 2018 data is correct, and it’s due to the excellent coverage of the 3,835 Argo floats.

That would mean that we would have an error of ten times that, ± 0.03°C, if there were only 38 Argo floats …
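
Here is a one-line check of that scaling, again in the style of the endnotes; the 0.003°C figure is simply the claimed 2018 error converted above.

err2018=0.003 # claimed 2018 error in °C, from the conversion above
print(paste(round(err2018*sqrt(3835/38), 3), "°C implied error with only 38 floats"))
[1] "0.03 °C implied error with only 38 floats"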

Sorry. Not believing it. Thirty-eight thermometers, each taking three vertical temperature profiles per month, to measure the temperature of the top two kilometers of the entire global ocean to plus or minus three hundredths of a degree?

My bad number detector was still going off. So I decided to do a type of “Monte Carlo” analysis. Named after the famous casino, a Monte Carlo analysis uses randomly drawn data to test whether an answer is reasonable.

In this case, what I did was to get gridded 1° latitude by 1° longitude data for ocean temperatures at various depths down to 2000 metres from the Levitus World Ocean Atlas. It contains the long-term average at each depth, for each gridcell, for each month of the year. From that I calculated the global average for each month from the surface down to 2000 metres.

Now, there are 33,713 1°x1° gridcells with ocean data. (I excluded the areas poleward of the Arctic/Antarctic Circles, as there are almost no Argo floats there.) And there are 3,825 Argo floats. On average, some 5% of them share a gridcell with another float, so the Argo floats are sampling on the order of ten percent of the gridcells … meaning that despite having lots of Argo floats, at any given time about 90% of the 1°x1° ocean gridcells are not sampled. Just sayin …
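
The arithmetic behind that coverage estimate, in the style of the endnotes. Reading “5% in a common gridcell” as meaning that about 95% of the floats occupy distinct gridcells is my own interpretation:

ngridcells=33713 # 1°x1° ocean gridcells with data
nfloats=3825 # Argo floats used in the simulation
distinctfraction=0.95 # assumed fraction of floats occupying distinct gridcells
coverage=nfloats*distinctfraction/ngridcells
print(paste(round(coverage*100), "% of ocean gridcells sampled at any given time"))
[1] "11 % of ocean gridcells sampled at any given time"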

To see what difference this might make, I did repeated runs, choosing 3,825 ocean gridcells at random each time. I then ran the same analysis as before—get the averages at depth, and then calculate the global average temperature month by month for just those gridcells. Here’s a map of the simulated Argo locations for one typical run.

Figure 3. Typical simulated distribution of Argo floats for one run of Monte Carlo Analysis.
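
For anyone who wants to try something similar, here is a minimal sketch of the resampling step, in the same R style as the endnotes. The data frame “woa” and its column “t_mean” (one row per ocean gridcell, holding that cell’s 0-2,000 metre mean temperature for a given month) are hypothetical names of my own, not Willis’s actual code, and a real version would also area-weight the gridcells by latitude.

# minimal sketch of one month's Monte Carlo comparison (hypothetical 'woa' data frame)
nruns=100 # number of random "Argo deployments" to simulate
nfloats=3825 # gridcells sampled per run
simmeans=replicate(nruns, {
  sampled=woa[sample(nrow(woa), nfloats), ] # pick random gridcells as float locations
  mean(sampled$t_mean) # "global" average temperature from those cells only
})
truemean=mean(woa$t_mean) # average using every gridcell
print(paste("full-field mean", round(truemean,2), "°C; subsampled means spread ±",
  signif(2*sd(simmeans),2), "°C (approximate 95% interval)"))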

And in the event, I found what I suspected I’d find. Their claimed accuracy is not borne out by experiment. Figure 4 shows the results of a typical run. The 95% confidence interval for the results varied from 0.05°C to 0.1°C.

Figure 4. Typical run, average global ocean temperature 0-2,000 metres depth, from the Levitus World Ocean Atlas (red dots) and from 3,825 simulated Argo locations. White “whisker” lines show the 95% confidence interval (95%CI). For this run, the 95%CI was 0.07°C. The small white whisker line at bottom center shows the claimed 2018 95%CI of ± 0.003°C.

As you can see, using the simulated Argo locations gives an answer that is quite close to the actual temperature average. Monthly averages are within a tenth of a degree of the actual average … but because the Argo floats only measure about 10% of the 1°x1° ocean gridcells, that is still more than an order of magnitude larger than the claimed 2018 95% confidence interval for the IAP data shown in Figure 1.

So I guess my bad number detector must still be working …

Finally, Zeke says that the ocean temperature in 2018 exceeds that in 2017 by “a comfortable margin”. But in fact, it is warmer by only 8 zettajoules … which is less than the claimed 2018 error. So no, that is not a “comfortable margin”. It’s well within even their unbelievably small claimed error, which they say is ± 9 zettajoules for 2018.

In closing, please don’t rag on Zeke about this. He’s one of the good guys, and all of us are wrong at times. As I myself have proven more often than I care to think about, the American scientist Lewis Thomas was totally correct when he said, “We are built to make mistakes, coded for error.”

Best regards to everyone,

w.

PS—when commenting please quote the exact words that you are discussing. That way we can all understand both who and what you are referring to.

Math Notes: Here is the calculation of the conversion of zettajoules to degrees of warming of the top two km of the ocean. I work in the computer language R, and these are the actual calculations. Everything after a hashmark (#) in a line is a comment.

# sw_cp() and gsw_rho() are seawater-property functions (gsw_rho() is from the
# 'gsw' package; sw_cp() appears to come from the 'marelac' package)
heatcapacity=sw_cp(t=4,p=100) # heat capacity, with temperature and pressure roughly as at 1000 m depth
print(paste(round(heatcapacity), "joules/kg/°C"))
[1] "3958 joules/kg/°C"

seadensity=gsw_rho(35,4,1000) # density for salinity 35, temperature 4°C, pressure 1000 dbar (roughly 1000 m depth)
print(paste(round(seadensity), "kg/cubic metre"))
[1] "1032 kg/cubic metre"

seavolume=1.4e9*1e9 #cubic km * 1e9 to convert to cubic metres
print(paste(round(seavolume), "cubic metres, per levitus"))
[1] "1.4e+18 cubic metres, per levitus"

fractionto2000m=0.46 # fraction of ocean above 2000 m depth per Levitus

zjoulesperdeg=seavolume*fractionto2000m*seadensity*heatcapacity/1e21
print(paste(round(zjoulesperdeg), "zettajoules to heat 2 km seawater by 1°C"))
[1] "2631 zettajoules to heat 2 km seawater by 1°C"

z1955error = 95 # 1955 error in ZJ
print(paste(round(z1955error/zjoulesperdeg,2),"°C 1955 error"))
[1] "0.04 °C 1955 error"

z2018error = 9 # 2018 error in ZJ
print(paste(round(z2018error/zjoulesperdeg,3),"°C 2018 error"))
[1] "0.003 °C 2018 error"

yr2018change = 8 # 2017 to 2018 change in ZJ
print(paste(round(yr2018change/zjoulesperdeg,3),"°C change 2017 - 2018"))
[1] "0.003 °C change 2017 - 2018"
372 Comments
Crispin in Waterloo but really in Beijing
January 11, 2019 6:15 pm

The first chart shows me that the CO2 concentration is driven by the ocean temperature. There is a terrible correlation between the CO2 concentration and the temperature of the atmosphere, but there is a very good correlation with the ocean temperature. As the oceans contain a huge amount of CO2, what else could we expect, based on sound knowledge of vapour pressure?

The simplest explanation for that remarkable correlation is that as the oceans heat up, the CO2 out-gases and the atmospheric concentration increases.

Because a mechanism for heating the oceans using atmospheric CO2 as a driver is missing, we are left with the obvious: as the sun heats the oceans (net) the CO2 emerges, as expected. End of short story.

Steve Heins
Reply to  Crispin in Waterloo but really in Beijing
January 11, 2019 6:28 pm

” a mechanism for heating the oceans using atmospheric CO2 as a driver is missing”

False.

The mechanism is clear and obvious.

The sun heats the ocean during the day.

CO2 retards the cooling at night.

Net effect is the oceans warm.

TRM
Reply to  Steve Heins
January 11, 2019 6:58 pm

Please provide links to papers that show how and how much “CO2 retards the cooling at night”.

Thanks

Steve Heins
Reply to  TRM
January 11, 2019 7:34 pm

https://www.nature.com/articles/nature14240

Please don’t ask me to explain to you how down welling IR works, as it is basic radiative physics.

Red94ViperRT10
Reply to  Steve Heins
January 11, 2019 8:00 pm

You linked to a paywalled article. Bad form. Since I can’t read it, nothing is proven, but just from the abstract I would say, 1) insufficient data sample, and 2) correlation does not prove causation! Try again? Furthermore, a closer look at that “correlation” would reveal that changes in atmospheric CO2 lag changes in temperature by ~9-10 months, and the future can’t cause the past, so not even a correlation to brag about!

Hugs
Reply to  Steve Heins
January 11, 2019 8:24 pm

Don’t demand things handed to you on a gold plate. We others (Willis included, I guess) do take the GHE as a fact, and are bored to death by people demanding to be hand-fed ‘proof’ of something they’re not going to accept anyway. Just a waste of everybody’s time.

Red94ViperRT10
Reply to  Steve Heins
January 11, 2019 9:25 pm

@Hugs January 11, 2019 at 8:24 pm

I do take the GHE as a fact, effected by ALL the greenhouse gasses. Out of all those, there has never been an experiment to show which one controls that temperature rise, if any. It might be like my insurance premiums, where either my children are covered or they’re not; I can’t have it part-ways. And there has been no proof of the TCS or ECS from a change in the concentration of atmospheric CO2. I believe there is one, but it could vary all the way to <0, once all the feedbacks have their feedback.

@Willis, BTW, thanks for the Sci-Hub link, I'll bookmark it!

Menicholas
Reply to  Steve Heins
January 11, 2019 10:14 pm

We are all, or mostly adults here, and should be able to do our own homework.
It is not advancing the conversation to demand that everyone justify every statement, when most of these are things we discuss at length over and over again.
I agree it is tiresome, although it is also quite annoying that some people act like the long discussions we all just had on the same topic, several times in the past month, never happened.
Steve Heins and Steve O, this means you.
But that is how warmistas and their apologists work.
Just pretend like all previous discussions never happened.
They want to wear us down by repetition.
It is enough to just link to one of the many recent articles and long discussions on the topic.

Anthony Banton
Reply to  Steve Heins
January 12, 2019 7:31 am

“You linked to a paywalled article”

No it’s not …
http://asl.umbc.edu/pub/chepplew/journals/nature14240_v519_Feldman_CO2.pdf

Ike Kiefer
Reply to  Steve Heins
January 12, 2019 7:59 am

“Basic radiative physics” LOL
It is unfortunate that the real world and real physics cannot fit into climate alarmists’ models or heads.

1. Earth is not a blackbody, but has albedo at surface, upper and lower faces of tropospheric clouds, and both faces of stratospheric clouds. If you can’t model clouds, you can’t model climate.

2. Core mechanism of GHE conjecture and CAGW alarmism is a violation of known physics. After actual scientists repeatedly pointed out photons cannot be trapped, climate “scientists” invented the novel physical phenomenon of “back radiation,” which is the downward-biased re-radiation of upwelling LWIR from the surface due to CO2 absorption and re-emission in the troposphere. Back radiation only appears in models, not in actual validated observations.

3. Photons absorbed by any gas molecule are a billion times more likely to be thermalized into rotational, vibrational, and translational degrees of freedom whose phonon energy quanta are quickly exchanged via super-elastic collisions with other molecules long before they can be re-radiated.

4. Quantum dynamics dictates that any re-emission of energy as a photon is omni-directional, with no bias toward the surface.

5. Convection and conduction rates in the atmosphere are high enough to easily carry away upward any mysterious energy bloom that might appear at any altitude in the troposphere. The actual equations physicists and meteorologists use for gas energy transfer ignore radiation because its contribution is insignificant.

“Basic physics” is the antidote to “climate science” Lysenkoism.

GCSquared
Reply to  Steve Heins
January 12, 2019 11:03 am

Your reference examines the Southern Great Plains and the North Slope of Alaska, i.e., it’s a land-based study, and dirt doesn’t circulate.

In oceans, some of the warmed water will mix somewhat during the course of a day. Thus, at night, IR radiation from the ocean will leave not just from the immediate surface, but from the warmed layer, and so will have to traverse water, in the denser fluid rather than gaseous phase. The IR spectrum is likely to be different compared with ground-based emissions.

The IR release mechanism, and the “basic radiation physics”, is different enough to call the relevance of your reference into question. Perhaps you can enlighten us by citing a more appropriate ocean-based study.

Red94ViperRT10
Reply to  Steve Heins
January 12, 2019 11:13 am

So I looked at the paper in your link (I hope Anthony Banton linked to the same paper you referenced)… I’m still scratching my head. First off, it doesn’t really read like a research paper as compiled by post-doctoral researchers, or even a doctoral researcher, more like a report compiled by a technician or salesperson attempting to demonstrate that his piece of equipment makes the measurements he claims it can make.

Nonetheless, best I can gather through the jargon, the researchers collected one “… sample clear-sky measured AERI spectrum…” from 2001 (apparently this was the paper’s closest brush with actual data, but they managed to evade it, man that was close!), constructed and applied filters to eliminate all other effects except the CO2, then used another model, not even constructed by them (I don’t think), to estimate what the radiative forcing should be from that point through 2010, starting from that 2001 sample reading, given the changes in CO2 as reported by “…CarbonTracker 2011 (CT2011)20, which is a greenhouse gas assimilation system based on measurements and modelled [sic] emission and transport…”. It doesn’t say where the “measurements” were taken nor when, but that is not an actual reading taken in 2010/11 at the exact same location as the 2001 radiation reading, nor do we have a 2001 CO2 reading to compare it against, not in the paper anyway. And I don’t see a 2010 radiation reading taken at the same location as the 2001 reading. And never at any time does the paper even make an attempt to connect the changes in CO2, or the (modeled) changes in radiative forcing, to the actual temperature. All I can see is models all the way down.

And even from that sample reading, I’m guessing you’re effectively measuring the temperature of the air. (The only way I can picture to measure the insulation value of air would be to place an emitter on the ground (or in space) that emits all the known wavelengths that can transport heat, and a sensor in space (or vice versa), and take a clear-sky nighttime measurement to see how much emission doesn’t make it to the sensor. That would be a valid experiment!)

David L. Hagen
Reply to  Steve Heins
January 12, 2019 11:18 am

Ike Kiefer
Partially true – but you overstate your case. R Essenhigh includes the small contribution to gas absorption/radiation in atmospheric lapse rate.
Essenhigh, R.H., 2006. Prediction of the standard atmosphere profiles of temperature, pressure, and density with height for the lower atmosphere by solution of the (S− S) integral equations of transfer and evaluation of the potential for profile perturbation by combustion emissions. Energy & fuels, 20(3), pp.1057-1067.
Ferenc Miskolczi exhaustively analyses molecular absorption/radiation for all greenhouse species across most frequencies with the line-by-line model HARTCODE
eg see The Greenhouse Effect and the Infrared Radiative Structure of the Earth’s Atmosphere

Red94ViperRT10
Reply to  Steve Heins
January 12, 2019 7:37 pm

In my January 11, 2019 at 9:25 pm comment, rather than “…the TCS or ECS … could vary all the way to <0…" I meant to say …could fall in a range all the way to <0…

Samuel C Cogar
Reply to  Steve Heins
January 13, 2019 3:51 am

Ike Kiefer – January 12, 2019 at 7:59 am

4. Quantum dynamics dictates that any re-emission of energy as a photon is omni-directional, with no bias toward the surface.

A fact of science, …… but a fact that the proponents of AGW/CAGW climate change REFUSE to consider or admit to because it is CONTRARY to their inferred claims that all IR energy emitted from the earth’s surface to GHGs in the atmosphere, …… is either “trapped” in said atmospheric GHGs ….. or is emitted directly back to the earth’s surface.

Vicus
Reply to  Steve Heins
January 13, 2019 11:35 am

“Downwelling IR”
Has never been documented, or replicated. Don’t state conjecture as facts.
Strike one.

IR emissivity is a 360° sphere. What percentage “downwells” and what prevents “upwelling”? Your search for the answer will disprove your beliefs.
Strike two.

Since a greenhouse works by preventing convection, generally with a glass barrier, and “downwelling IR” does not prevent convection, please explain the [non-existent] mechanism by which a trace gas prevents atmospheric convection?
(It doesn’t).
Strike three.

The last point is very, very important to understand.

Steve Heins
Reply to  TRM
January 11, 2019 7:37 pm

For example, I suggest you go to Nevada, and visit the desert during the day when it is 90 degrees F, then stay there when the sun goes down to see how cold it gets.

After you do that go to Atlanta GA, during the day when it’s 90 degrees F, then stay there when the sun goes down to see how cold it gets.

You’ll learn to appreciate the greenhouse gas effect of H2O.

Same thing happens with CO2 to a lesser extent.

Menicholas
Reply to  Steve Heins
January 11, 2019 10:19 pm

Good point.
The driest desert is Antarctica.
No warming.
Case closed.
NEXT!

paul courtney
Reply to  Steve Heins
January 12, 2019 6:29 am

Mr. Heins: Thanks for keeping it so very simple. So while you were in the desert, without the interference of water vapor, you were able to precisely observe and measure the GHG effect of CO2? Can you provide the results? We’d all be very keen to have you describe your method, too! Surely you did this before walking to Atlanta (surely you didn’t burn fossil fuel for travel?) for stage 2 of your experiment? You know, where you were able to observe and measure the “lesser extent” you mention. Being a curious sort, you just had to know the numbers on this “extent”, right? Lookin’ fwd to seeing your data, so we can put this all to rest.

Anthony Banton
Reply to  Steve Heins
January 12, 2019 7:40 am

“So while you were in the desert, without the interference of water vapor, you were able to precisely observe and measure the GHG effect of CO2? Can you provide the results?”

He did.
In the paper he linked up-thread ……

http://asl.umbc.edu/pub/chepplew/journals/nature14240_v519_Feldman_CO2.pdf

Reply to  Anthony Banton
January 12, 2019 8:16 am

Anthony

your paper is supported by which actual measurements that you made yourself?

How much warming is there in your own backyard? Did you check and can you give me a figure [that I can verify]?

like I said,
there is no warming here, man-made or otherwise,

click on my name to read my reports on that.

donb
Reply to  Steve Heins
January 12, 2019 8:30 am

@P.C.
If someone wishes to challenge a reviewed and published scientific paper, that person is expected to publish his/her own paper, giving reasons such as experimental data that contradict the original paper, demonstrating error in mathematical calculations, or showing obvious error in logic. Saying “I don’t agree with what you say and don’t believe in your conclusion” is woefully insufficient in science.

Anthony Banton
Reply to  Steve Heins
January 12, 2019 10:05 am

“like I said,
there is no warming here, man-made or otherwise,”

If you say so Henry.

However the study observationally says otherwise.
And I don’t have to do it myself to verify.

I’m not a conspiracy theorist, and I take researchers at their word.

Reply to  Anthony Banton
January 12, 2019 10:12 am

Anthony,

it is rather foolish to trust the results of others. Not very scientific. That is why the gap between sceptics and ‘believers’ is getting bigger and bigger. At the very least, you could have done some verification?

e.g. how much warmer did it get in the place where you stay over the past 40 years? Let me know what result you got.

Click on my name and follow the links to figure out how you can easily determine the trend in your own neighborhood.

Menicholas
Reply to  Steve Heins
January 12, 2019 1:14 pm

“If someone wishes to challenge a reviewed and published scientific paper, that person is expected to publish his/her own paper, giving reasons such as experimental data that contradict the original paper, demonstrating error in mathematical calculations, or showing obvious error in logic.”

Hey Don, I suggest you ask Dr. Soon, not to mention anyone else who has failed to go along with the warmista orthodoxy, just exactly what happens when someone tries to do just that?
Are you naïve or a liar or both?

Chris Riley
Reply to  Steve Heins
January 12, 2019 3:49 pm

I was taught in the sixth grade that the reason a 90 degree day in a humid climate zone is followed by a warmer night than would be the case in a dry climate is that the dew point of humid air is higher than the dew point of dry air. On a 90 degree day in Atlanta with a relative humidity of 57%, the dew point would be 72.5 degrees. At this temperature the water vapor in the air will begin to condense, thus releasing the latent heat of evaporation contained in the water vapor. This slows the rate at which temperature can fall relative to that which is seen in a dry climate.

In Death Valley, a 90 degree day at a relative humidity of 10% will have a dew point of 26.2 degrees, and will not “benefit” from the latent heat of the water vapor condensation until the temperature falls to that level.

With all other things equal (or ignored) (UHI, cloud cover, etc.), the location with the lower dew point will experience the coolest night.

Is this no longer true?

Vicus
Reply to  Steve Heins
January 13, 2019 11:45 am

You’re conflating latent surface heat with gasses acting as a “greenhouse gas”, a barrier, when it actually conducts & convects rather than traps.

That desert cools faster due to the surrounding environment being unable to hold latent surface heat. Comparing that to a massive concrete & asphalt complex with heat-generating motors, cars, people, et cetera, it is incorrect to explain why it “stays warmer” by invoking water vapor “trapping heat”.

The greatest con pulled on humanity is unassailable belief in “greenhouse gasses”.

ATheoK
Reply to  TRM
January 11, 2019 11:28 pm

And do not forget to specifically address how the 402 molecules of CO₂ out of every ten thousand atmospheric molecules are somehow more powerful or efficient within their minuscule infrared range than water vapor is over its vastly greater atmospheric levels and a very broad range of interactive infrared frequencies.

Graemethecat
Reply to  ATheoK
January 12, 2019 1:35 am

Actually, I believe there are currently 4 molecules of CO2 per ten thousand (0.04%), not 402. This merely reinforces your point, of course.

ATheoK
Reply to  ATheoK
January 12, 2019 5:40 am

“Graemethecat January 12, 2019 at 1:35 am
Actually, I believe there are currently 4 molecules of CO2 per ten thousand (0.04%), not 402. This merely reinforces your point, of course.”

You are, of course, absolutely accurate, Graemethecat!

My error in not correcting the calculation when rewriting from parts per million to parts per ten thousand.
Thank you for correcting my oversight and misstatement!

Hivemind
Reply to  TRM
January 12, 2019 1:14 am

“CO2 retards the cooling at night”

This is the famous “radiative forcing” myth which is coded into all of the computer models. The ones that don’t actually work, predicting 2x actual warming, not predicting the pause, etc. The atmosphere actually pumps heat about through convection, as Willis has discussed many times.

John Shotsky
Reply to  Hivemind
January 12, 2019 6:24 am

If you want to see the effect of back radiation, it is easy to calculate…
Place a 1 sq cm of foil on a substrate such as Styrofoam, facing the sky at night.
Place another, with Styrofoam on both sides, at the same height.
Measure the temperature difference between them.
Guess what? The one facing the sky will be COOLER, because it is radiating away from billions of molecules, while only a few ‘photons’ from CO2 will affect that cooling. If you can’t measure it, it ain’t there.
The obvious point is that the entire earth’s surface is radiating at all times, but more during the day. The ‘back radiation’ cannot affect daytime temperatures because it is overwhelmed by the sun. Night time radiation is at a lower rate than daytime radiation, but it is STILL dependent on the surface temperature. If the surface temperature is somehow ‘heated’ by back radiation, the RATE of radiation would increase to eliminate it. It is affected by the 4th power of temperature – an almost unbelievable thermostat. You cannot trap heat – if you try, radiation will increase to offset it.

Red94ViperRT10
Reply to  Hivemind
January 14, 2019 4:42 pm

@John Shotsky January 12, 2019 at 6:24 am

If you want to see the effect of back radiation, …
Place a 1 sq cm of foil on a substrate such as Styrofoam, facing the sky at night.
Place another, with Styrofoam on both sides, at the same height.
Measure the temperature difference between them.

No, actually I don’t see the effect of back radiation. I see the effect of insulation. But mostly I don’t see the effect of “back radiation” because it doesn’t exist, it’s a figment of Mr. Trenberth’s fevered imagination.

Bartemis
Reply to  Steve Heins
January 11, 2019 7:35 pm

Most ocean cooling is via evaporation and convection.

Steve Heins
Reply to  Bartemis
January 11, 2019 7:46 pm

What is your scientific definition of “most” ??

Is it 95%?

Is it 80%?

or maybe 73%?

PS, convection and evaporation do not remove ocean heat into space; all they do is move the heat from point A to point B, not off-planet.

Hugs
Reply to  Steve Heins
January 11, 2019 8:08 pm

See Trenberth et al 2009.

Most is usually more than 50 per cent.

Looking at how badly water freezes up, but how seldom it rises above 30C, I really think the thermostat is two-ended. At the high end, evaporation and cloud formation stop further warming. At the low end, latent heat of freezing, and insulating, floating ice prevents the Earth becoming a total snowball.

However, the air above water may become really cold, and much of the measured global warming is due to less intense winter night cold above the frozen ground. Is that bad? Maybe, but not as bad as the cold itself.

richard verney
Reply to  Steve Heins
January 11, 2019 10:21 pm

@Steve Heins

Whenever considering ocean temperature and behaviour, one should always have in mind the below plot:

[linked image: near-surface ocean temperature profiles, curves (a) and (b)]

(a) is the night-time temperature, and (b) is the daytime temperature.

As you will note, the top half millimetre of the ocean is always cooler, both day and night. This is notwithstanding that 95% of all DWLWIR is fully absorbed in the top 10 MICRONS (maybe even less given the omni-directional nature of DWLWIR) of the ocean.

Why is the top half millimetre cooler? The obvious answer is evaporation. Lick your hand and blow on it and you will feel the cooling effect of evaporation.

One can see that there is no heating of the ocean by DWLWIR since at night the only source of energy in is DWLWIR and the ocean temperature is constant between about the top half millimetre and about 5 metres.

Contrast that with the daytime where one can see solar energy at work. Solar is not absorbed in the top MICRONS, and one can see that the top half millimetre (where solar begins to get absorbed) is warmer than the 5 metre level and the heat is getting less and less as one descends from the top half millimetre to 5 metres.

Night-time temperatures above the ocean are determined by the temperature of the ocean itself. If there is any impeding of temperature loss, it would appear that this is due to water vapour immediately above the oceans, and not by CO2.

This planet is a water world and it is water that dominates everything.

Menicholas
Reply to  Steve Heins
January 11, 2019 10:26 pm

“What is your scientific definition of ‘most’?”
What is your definition of a scientific definition, as opposed to say…oh, I don’t know, a dictionary definition?
I wonder, do you use the phrase “scientific fact” much?

Menicholas
Reply to  Steve Heins
January 11, 2019 10:30 pm

Funny how someone can point out how water behaves in the air in deserts, but then forget all of that and make up a different selective-attention fact set a minute later when talking about the ocean.

Anthony Banton
Reply to  Steve Heins
January 12, 2019 8:28 am

richard:

“One can see that there is no heating of the ocean by DWLWIR …..”

The process of ocean warming is via reduction of cooling ….

http://images.remss.com/papers/rsspubs/Gentemann_JGR_2008_thermal_variability.pdf

An explanation here from Nick Stokes ….
https://moyhu.blogspot.com/2010/10/can-downwelling-infrared-warm-ocean.html

The skin layer is still warmer than without the (extra) GHE, and the transference of heat to the atmosphere is via that layer. As heat flow depends on DeltaT, the warmer water just below does not do that as well with a warmer skin.

Walter Sobchak
Reply to  Steve Heins
January 12, 2019 9:27 am

“convection and evaporation do not remove ocean heat into space; all they do is move the heat from point A to point B, not off-planet.”

The sun heats the ocean. The warm water evaporates and warms the air.

The warm moist air rises, no, not rises, shoots up, 10 to 15 km, where the water vapor condenses and releases heat energy which is either conducted to adjacent gas molecules or is radiated.

More than half of the radiation is into space because the altitude at which it occurs is well above the surface, and because the cloud formation creates a high albedo surface underneath the cloud top. When you stand under the cloud and look up it is dark. When you fly over the cloud it is light. The colored graphic displays of satellite infrared images of cloud tops show the big high ones as red, because they are warmer than the surrounding air.

The effects are dramatic, we call them storms.

Menicholas
Reply to  Steve Heins
January 12, 2019 1:33 pm

The red clouds are warmer than surrounding air that they rose through, or they would not have risen to where they are.
But it is incorrect that the reds are the warmest…those are the coldest cloud tops…colder because they are higher, and all air cools as it rises.
The clouds are radiating far more than the surrounding air, though. If the air comprising a cloud is no longer rising, it is because it has reached a level where it is the same temperature as the air around it, so it rises no more. But the water vapor and ice crystals in clouds actively radiate in many wavelengths, and the surrounding dry air does not. So the clouds and the air around them are the same temp, but the dry air is transparent and invisible and the clouds are not.
Air can only rise if it is warmer than the air it is rising into. Once that condition no longer is the case, it stops rising, with the exception of air which has moved rapidly and acquired enough momentum to overshoot.

Here is a satellite picture with a temperature scale … note the scale is reversed from the way such scales are often presented … colder, and redder, is on the right:
http://www.recmod.com/hurricane/hurricane2005/katrina/satellite/irloop-8-28b.jpg

BTW, Walter, I am not disagreeing with the point you are making, just this detail.
I do not think it undermines your main point, though.

Walter Sobchak
Reply to  Steve Heins
January 12, 2019 5:50 pm

Menicholas: Thank you. You are correct. My main point is that the heat energy those cloud particles have given up has been radiated towards outer space.

Alan Tomalty
Reply to  Steve Heins
January 11, 2019 10:25 pm

http://applet-magic.com/cloudblanket.htm

Clouds overwhelm the Downward Infrared Radiation (DWIR) produced by CO2. At night with and without clouds, the temperature difference can be as much as 11C. The amount of warming provided by DWIR from CO2 is negligible but is a real quantity. We give this as the average amount of DWIR due to CO2 and H2O or some other cause of the DWIR. Now we can convert it to a temperature increase and call this Tcdiox. The pyrgeometers assume emission coeff of 1 for CO2. CO2 is NOT a blackbody. Clouds contribute 85% of the DWIR. GHG’s contribute 15%. See the analysis in link.

The IR that hits clouds does not get absorbed. Instead it gets reflected. When IR gets absorbed by GHG’s it gets reemitted either on its own or via collisions with N2 and O2. In both cases, the emitted IR is weaker than the absorbed IR. Don’t forget that the IR from reradiated CO2 is emitted in all directions. Therefore a little less than 50% of the absorbed IR by the CO2 gets reemitted downward to the earth surface. Since CO2 is not transitory like clouds or water vapour, it remains well mixed at all times. Therefore since the earth is always giving off IR (probably a maximum at 5 pm everyday), the so called greenhouse effect (not really but the term is always used) is always present and there will always be some backward downward IR from the atmosphere.

When there aren’t clouds, there is still DWIR, which causes a slight warming. We have an indication of what this is because of the measured temperature increase of 0.65 from 1950 to 2018. This slight warming is for reasons other than just clouds, therefore it is happening all the time. Therefore in a particular night that has the maximum effect, you have 11 C + Tcdiox. We can put a number to Tcdiox. It may change over the years as CO2 increases in the atmosphere. At the present time with 409 ppm CO2, the global temperature is now 0.65 C higher than it was in 1950, the year when mankind started to put significant amounts of CO2 into the air. So at a maximum Tcdiox = 0.65C. We don’t know the exact cause of Tcdiox, whether it is all H2O caused or both H2O and CO2 or the sun or something else, but we do know the rate of warming. This analysis will assume that CO2 and H2O are the only possible causes. That assumption will pacify the alarmists because they say there is no other cause worth mentioning. They like to forget about water vapour, but in any average local temperature calculation you can’t forget about water vapour unless it is a desert.
A proper calculation of the mean physical temperature of a spherical body requires an explicit integration of the Stefan-Boltzmann equation over the entire planet surface. This means first taking the 4th root of the absorbed solar flux at every point on the planet and then doing the same thing for the outgoing flux at Top of atmosphere from each of these points that you measured from the solar side and subtract each point flux and then turn each point result into a temperature field and then average the resulting temperature field across the entire globe. This gets around the Holder inequality problem when calculating temperatures from fluxes on a global spherical body. However in this analysis we are simply taking averages applied to one local situation because we are not after the exact effect of CO2 but only its maximum effect.
In any case Tcdiox represents the real temperature increase over the last 68 years. You have to add Tcdiox to the overall temp difference of 11 to get the maximum temperature difference of clouds, H2O and CO2. So the maximum effect of any temperature changes caused by clouds, water vapour, or CO2 on a cloudy night is 11.65C. We will ignore methane and any other GHG except water vapour.

So from the above URL link, clouds represent 85% of the total temperature effect, so clouds have a maximum temperature effect of .85 * 11.65 C = 9.90 C. That leaves 1.75 C for the water vapour and CO2. CO2 will have relatively more of an effect in deserts than it will in wet areas but still can never go beyond this 1.75 C. Since the desert areas are 33% of 30% (land vs oceans) = 10% of earth’s surface, then the CO2 has a maximum effect of 10% of 1.75 + 90% of Twet. We define Twet as the CO2 temperature effect over all the world’s oceans and the non-desert areas of land. There is an argument for less IR being radiated from the world’s oceans than from land, but we will ignore that for the purpose of maximizing the effect of CO2 to keep the alarmists happy for now. So CO2 has a maximum effect of 0.175 C + (.9 * Twet).

So all we have to do is calculate Twet.

Reflected IR from clouds is not weaker. Water vapour is in the air and in clouds. Even without clouds, water vapour is in the air. No one knows the ratio of the amount of water vapour that has now condensed to water/ice in the clouds compared to the total amount of water vapour/H2O in the atmosphere, but the ratio can’t be very large. Even though clouds cover on average 60% of the lower layers of the troposphere, since the troposphere is approximately 8.14 x 10^18 m^3 in volume, the total cloud volume in relation must be small. Certainly not more than 5%. H2O is a GHG. Water vapour outnumbers CO2 by a factor of 25 to 1 assuming 1% water vapour. So of the original 15% contribution by GHG’s of the DWIR, we have .15 x .04 = 0.006 or 0.6% to account for CO2. Now we have to apply an adjustment factor to account for the fact that some water vapour at any one time is condensed into the clouds. So add 5% onto the 0.006 and we get 0.0063 or 0.63%. CO2 therefore contributes 0.63% of the DWIR in non-deserts. We will neglect the fact that the IR emitted downward from the CO2 is a little weaker than the IR that is reflected by the clouds. Since, as in the above, a cloudy night can make the temperature 11C warmer than a clear sky night, CO2 or Twet contributes a maximum of 0.0063 * 1.75 C = 0.011 C.

Therefore, since Twet = 0.011 C, we have in the above equation CO2 max effect = 0.175 C + (.9 * 0.011 C) = ~ 0.185 C. As I said before, this will increase as the level of CO2 increases, but we have had 68 years of heavy fossil fuel burning and this is the absolute maximum of the effect of CO2 on global temperature.
So how would any average global temperature increase by 7C or even 2C, if the maximum temperature warming effect of CO2 today from DWIR is only 0.185 C? This means that the effect of clouds = 85%, the effect of water vapour = 13.5 % and the effect of CO2 = 1.5%.

Sure, if we quadruple the CO2 in the air which at the present rate of increase would take 278 years, we would increase the effect of CO2 (if it is a linear effect) to 4 X 0.185C = 0.74 C Whoopedy doo!!!!!!!!!!!!!!!!!!!!!!!!!!

eyesonu
Reply to  Alan Tomalty
January 12, 2019 6:34 am

Alan,

My focus is on IR exchange to and from the ‘bottom’ of clouds. I have spent considerable time pondering this. Maybe you can help me out here.

With regards to your quote “… The IR that hits clouds does not get absorbed. Instead it gets reflected. When IR gets absorbed by GHG’s it gets reemitted either on its own or via collisions with N2 and O2. In both cases, the emitted IR is weaker than the absorbed IR. …”

My reasoning is that up-welling IR from the ground (consider for discussion that the ground is warmer than the cloud base) will be absorbed and a portion re-emitted downwards at a lower emission temperature and thus energy level. Now the cloud base is not like a thin sheet of foil or a micro boundary layer but is in fact a ‘spongy’ semi-transparent boundary layer of considerable thickness (possibly a couple of hundred feet thick). Some IR coming up from below may not be captured and re-emitted until well into the depth of this boundary layer, at which point the re-emission could send it deeper into the boundary layer, or if sent downward it could be captured by the water condensate that it missed on the way up and its travel redirected yet again, with some continuing deeper (upwards) into the cloud. This leads me to believe that clouds are absorbing some significant portion of the up-welling IR. Also, if the general cloud emission temperature is less than the ground emission temperature, it would seem reasonable to think that some energy was retained in the cloud even by the IR that was re-emitted downwards and back to the ground.

In the real world, where I can climb or drive up a mountain into the fog, the density of the fog increases over a period of elevation change of a few hundred feet until visibility becomes very poor. I would assume that cloud bases normally behave the same, but I don’t have a balloon to prove my point.

Charles Nelson
Reply to  Steve Heins
January 11, 2019 11:37 pm

The thermal mass of the oceans is in the region of 1100 times the thermal mass of the atmosphere…maybe you could explain how an increase from 350ppm to 400ppm in CO2 in the atmosphere could cause a detectable increase in ocean temperatures?
I know it’s only basic physics… not your advanced ‘downwelling radiation’ science… but it would be fascinating to hear your attempt at the math.
Thanks.

mbur
Reply to  Charles Nelson
January 13, 2019 5:24 pm

….thermal mass!? How much does water expand when going from liquid to vapor?…

Menicholas
Reply to  mbur
January 14, 2019 7:41 am

One mole of any gas at STP occupies a volume of 22.4 liters.
STP is one atmosphere of pressure and 273 K, or 1013 millibars and 0.0 C.
Liquid water is very close to 1 cubic centimeter per gram, so one mole (18 grams) occupies roughly
18 cc.
18 cubic centimeters is 0.018 liters.

DocSiders
Reply to  Steve Heins
January 12, 2019 2:03 pm

Every day in the tropics the sun heats the ocean water by several (or more) degrees triggering emergent phenomena called clouds and thunderstorms. Every day, these emergent responses leave the water cooler than the triggering temperature. Cooling continues overnight. The warmer the water, the earlier in the day the clouds and thunderstorms form (clouds block sunlight and reflect it into space…storms drive humid air into high altitudes where it condenses radiating heat into space — CO2 assists in this high altitude radiative escape). This daily energy transfer activity is several orders of magnitude greater than the local effects of CO2.

And it is not entirely clear how much warming is done at ocean surfaces from downwelling IR. Only the upper few hundred microns absorb this IR…which immediately causes some surface evaporative cooling. So downwelling IR might actually cool the oceans.

Things are not nearly as simple as your Nature article would have it.

Climate science is not rocket science…IT’S AT LEAST AN ORDER OF MAGNITUDE MORE COMPLICATED than rocket science.

Johann Wundersamer
Reply to  Steve Heins
January 12, 2019 2:58 pm

Steve Heins, Clouds retard the cooling at night.

What’s your point.

Steve Heins
Reply to  Johann Wundersamer
January 12, 2019 3:07 pm

CO2 and clouds do the same thing.

DocSiders
Reply to  Johann Wundersamer
January 13, 2019 6:55 am

That climate science is not nearly as simple as most AGW scientists admit publicly.

It is certainly not simple enough to model accurately without at least 3 orders of magnitude more (good) DATA and 2 orders of magnitude more spatial resolution to the data and a whole lot more (who knows how much) computing power.

You do not know that downwelling IR heats the water… at all. Evaporation requires lots of energy. IR penetrates water hardly at all (most is reflected… mirror-like). Lots of energy into a little bit of water causes evaporation… which causes cooling of the underlying body.

Broad spectrum Sunlight certainly penetrates and heats water…orders of magnitude more than the downwelling IR does. So, Things that block sunlight have way more effect on ocean temperatures than downwelling IR.

IT’S NOT AS SIMPLE AS CALCULATING JOULES OF DOWNWELLING IR HEATING UP WATER LIKE A STOVETOP BURNER.

In models, emergent phenomena like clouds and storms are treated as averaged linear events, and these phenomena are FAR FROM THAT. They respond to the temperature of the water and act as heat engines producing powerful negative feedbacks.

None of the climate models can or do “model” these ubiquitous emergent phenomena. CO2 effects are real of course but puny compared to these (and many other) governing emergent phenomena.

Reply to  DocSiders
January 13, 2019 7:24 am

Doc,
Water absorbs both in the UV and the IR. There might be some IR warming going down up to a meter or so
but the effect of the UV is much more pronounced as it warms the top of the surface easily to boiling point,
causing evaporation, mostly. The subsequent condensation of water vapor is what keeps the temperature on earth so much more equal,
be happy about that!
Hence, the increase or decrease in UV coming to earth is what is causing warming or cooling, respectively;
do the measurement!!
[sorry I don’t have the equipment]

DocSiders
Reply to  Johann Wundersamer
January 13, 2019 7:44 am

Over the Intertropical oceans there are virtually no clouds overnight. “SO THERE”.

AGAIN…my point was that climate is NOT simple enough to point to one factor (like ocean heating from downwelling radiation) and say “SO THERE…end of discussion”.

Gary Pearse
Reply to  Steve Heins
January 12, 2019 3:11 pm

IR doesn’t heat the ocean and neither does the atmosphere; the atmosphere is heated by the ocean.

Steve Heins
Reply to  Gary Pearse
January 12, 2019 3:25 pm

If the air above the ocean is warmer than the water, heat will flow from the air into the water.

tty
Reply to  Steve Heins
January 12, 2019 5:13 pm

Yes, by conduction. Gases are quite bad conductors, so the flow will be minimal, even for a large temperature differential.

John Shotsky
Reply to  Steve Heins
January 12, 2019 8:35 pm

Yes, air warmer than the ocean will cause the ocean to warm – a little. But the opposite is also true, and that happens daily, monthly, yearly. But the ocean will also be warmed by radiation from the sun, and it will radiate more due to that warming. So, what is the point of the comment? Heat goes in, heat goes out, and it has always been that way, and always will be. There is nothing of interest in what has always been, and always will be. Unless one does not understand that is how it is.

Red94ViperRT10
Reply to  Steve Heins
January 14, 2019 6:14 am

Global average air temperature is lower than global average water temperature. So besides the fact that very little heat transfers from the atmosphere to the ocean, what little does transfer is lowering the temperature on average.

Steve Heins
Reply to  Gary Pearse
January 12, 2019 3:32 pm

If a given area of ocean emits 20 units of energy in the IR band, and the atmosphere returns 1 unit of energy via down welling IR, the net emission rate of the ocean is 19 units. ……..Down welling IR retards cooling.

John Shotsky
Reply to  Steve Heins
January 12, 2019 5:52 pm

That’s a big if… there are billions of radiating surface molecules for EVERY radiating CO2 molecule that is earth-directed. Half are not earth-directed. ALL earth-radiating molecules are space-bound. People think of CO2 molecules as if they were all street lights with shields to prevent outward radiation. Think of it as ½ the CO2 concentration is earth-directed… yep… 200 ppm…

Steve Heins
Reply to  Steve Heins
January 12, 2019 6:13 pm

So John, what you are saying is that there is an effective IR mirror of 200 ppm, that reflects all outbound IR back towards the surface.

Thank you, you’ve just confirmed that the CO2 retards cooling.

Red94ViperRT10
Reply to  Steve Heins
January 12, 2019 9:41 pm

In general, air by itself is a pretty good insulator, if it’s static, and you don’t need to invoke any mythical DWIR or any other such mumbo jumbo. That’s why you fill a wall cavity with insulation, to keep the air as static as possible, keep it from moving around (the insulation, if you remove all the air from it, is not a very good insulator by itself). The atmosphere is NOT static, its specific composition notwithstanding!!! Case closed.

rishrac
Reply to  Steve Heins
January 13, 2019 1:50 pm

If all of the increase in atmospheric CO2 could be explained by man-made CO2, then the retained additional warming can not exceed 0.00012, provided that CO2 is a perfect insulator. Do you really expect that to warm the oceans?
Steve, you do know that nitrogen is a greenhouse gas too. It has a heat value as well. Do you know what the difference is between the heat-retaining value of nitrogen as opposed to CO2? The amount of nitrogen is 78% compared to 0.042%. At 300 K, N is 1.040 and CO2 is 0.846. How much CO2 do you think it would take to overcome the specific heat of nitrogen? You’re thinking there is this enormous amount of CO2 and it’s not. If there is any warming by CO2, it is extremely small. That is, if all the increase in atmospheric CO2 is man-made, which I have my doubts… strong ones.

Donald Kasper
Reply to  Crispin in Waterloo but really in Beijing
January 11, 2019 6:31 pm

There is no correlation whatsoever shown in the first graph. The illusion comes from scaling the ocean heat flux. To tell if there is a correlation, you plot CO2 on one axis against heat flux on the other axis and generate a least-squares trend. The first graph is Powerpoint bullshit, not science.

Menicholas
Reply to  Donald Kasper
January 11, 2019 10:32 pm

Agreed.
If one wanted to show something else, the scale could be changed to alter the appearance completely.

R Shearer
Reply to  Crispin in Waterloo but really in Beijing
January 11, 2019 6:51 pm

Zeke is the Jim Acosta of climate.

Louis Hooffstetter
Reply to  R Shearer
January 11, 2019 8:21 pm

I agree. Willis is far too kind to call him one of the good guys. He’s just another obfuscating, fear mongering, climate con artist and the tweet referenced in Willis’ article proves it. It’s his latest attempt to intentionally mislead and frighten the general public about the climate. To me, he’s part of the global warming deep state.

taz1999
Reply to  Louis Hooffstetter
January 12, 2019 9:02 am

Louis Hooffstetter

I’m going with your comment on this guy Zeke (whom I’d never heard of before). The only rationale for picking an odd unit of measure, and even ignoring your own error bars, is that you could not generate the wanted conclusion and alarm otherwise.

Reply to  taz1999
January 12, 2019 9:36 am

agree

Menicholas
Reply to  Louis Hooffstetter
January 12, 2019 12:48 pm

100%
A large part of the problem is trying to play nice with people who have as an end game some very nasty outcomes.
If the greenies, globalists, and other factions of the warmista cabal ever get control of our economy, governments, and lives, the state of the entire world will make the darkest days of the Soviet Union look like a sunny beachside bikini babe barbecue.
Things are already getting pretty bad in some places, but nothing like what they have in mind.

Hugs
Reply to  Crispin in Waterloo but really in Beijing
January 11, 2019 7:26 pm

The first chart shows me that the CO2 concentration is driven by the ocean temperature.

How many times this claim has been presented, and nope, it’s not relevant. CO2 ticks up by a quantity much smaller than human emissions. The ocean is taking in and sinking a large portion of human-made CO2. In fact, it is changing gasses quickly to both directions but the balance is much on the sink side.

That Zeke posted this was pretty cheap of him. CO2 should not be scaled to correlate with OHC, as, by the best knowledge, they are not correlated in real time.

Louis Hooffstetter
Reply to  Hugs
January 11, 2019 8:56 pm

“CO2 ticks up by a quantity much smaller than human emissions. The ocean is taking in and sinking a large portion of human-made CO2. In fact, it is changing gasses quickly to both directions but the balance is much on the sink side.”

WRONG! WRONG! WRONG!
Even climate “scientists” agree that the amount of CO2 contributed to our atmosphere by natural sources dwarfs the human contribution. The last ice age began ~2.6 million years ago, began to end ~15,000 years ago, and we are still coming out of that ice age today. 20,000 years ago the CO2 concentration in the atmosphere was only ~190 ppm (ice core data), which indicates the frigid oceans removed a huge amount of CO2 from the atmosphere. Climate “scientists” also constantly tell us that the oceans are getting warmer and warmer every day. So if the oceans reached equilibrium with the atmosphere (with regards to CO2) during the last ice age, there is no possible way they can absorb more than a gnat’s ass of CO2 from the atmosphere today. They’re supersaturated with respect to CO2, and have to off-gas CO2 as they warm. That’s why the first chart shows the CO2 concentration (in the atmosphere) is driven by ocean temperature.

Hugs
Reply to  Louis Hooffstetter
January 12, 2019 2:47 am

”Even climate “scientists” agree that the amount of CO2 contributed to our atmosphere by natural sources dwarfs the human contribution. ”

I kind of agree, but your fact is a totally irrelevant factoid.

Hugs
Reply to  Louis Hooffstetter
January 12, 2019 2:51 am

‘there is no possible way they can absorb more than a gnat’s ass of CO2 from the atmosphere today’

And this is just bollocks. The growing partial pressure does the trick. Good for us, otherwise we’d be doomed.

Louis Hooffstetter
Reply to  Hugs
January 12, 2019 6:54 am

I respectfully disagree:
* At the peak of the last Ice Age (~20,000 years ago), the concentration of CO2 in our atmosphere was only ~190 ppm (from ice core data).
* The minimum concentration necessary for woody plants to survive is ~150 ppm. If the concentration falls below 150 ppm, all plants die.
* The concentration at which plants begin to die is ~225 ppm. So at the peak of the last Ice Age plants were suffering. Exposed areas near glaciers were barren deserts (as evidenced by large deposits of Loess, a wind-swept, flour-like soil dust).
* ~10,000 years ago, the concentration of CO2 in our atmosphere was only ~280 ppm (from ice core data).
* By 1880 (when CO2 measurements began), the level of CO2 in our atmosphere was only ~283 ppm.
* We are currently at 406.58 ppm (most recent Mauna Loa observatory reading I could find). But prior to the most recent cycle of Ice Ages (that started about 3.5 million years ago) the concentration of CO2 in our atmosphere was around 1000 ppm.
* Greenhouse experiments also show that when the level of CO2 falls below ~500 ppm, plant growth is severely retarded. So as far as plants are concerned, the current level of CO2 in our atmosphere is dangerously low.
* These same experiments also show that optimum plant growth occurs between 900 to 1,200 ppm. So commercial greenhouses add CO2 to the air to maintain levels within this range.
* Why do plants grow best when the atmosphere contains 900 to 1,200 ppm CO2? The reason is that our atmosphere contained 900 to 1,200 ppm CO2 when plants evolved.
* So our atmosphere is depleted in CO2 relative to when plants evolved, and still below the ‘comfort’ level for plants (500 ppm). And for most of the Earth's geologic history, the levels of CO2 in the atmosphere have been much higher than today (without resulting in computer-modeled disaster).

Hugs
Reply to  Hugs
January 12, 2019 11:44 am

Louis,

Sorry for some foul language.

Your logic is very convoluted and can really only be refuted properly with detailed balance calculations. For now I will just point out one problem with your claim. If the growth in the atmospheric portion were spontaneous, what would make it happen at just the same time as the industrial emissions? Why was CO2 still low during the Holocene climatic optimum?

If you claim substantial outgassing, the present should be much warmer than the medieval and Holocene climate optima, and outgassing would also require a significant rise in ocean temperature. We are not seeing that. We see CO2 rising steadily over the industrial period, and the rise corresponds roughly to emissions. The full human emissions are not in the atmosphere, so the remainder had to go somewhere. If you say it didn't sink into the ocean, but the ocean was /also/ net outgassing, you need to find a huge land-based sink that took up all the human emissions and some ocean CO2 as well.
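A back-of-the-envelope check of that mass-balance point can be sketched in a few lines of Python. The round figures used below (roughly 10 GtC/yr of human emissions, an observed rise of about 2.4 ppm/yr, and ~2.13 GtC of carbon per ppm of CO2) are illustrative assumptions, not numbers taken from this thread:

```python
# Rough carbon mass balance: does the atmospheric rise account for all emissions?
# All inputs are round-number assumptions for illustration.

emissions_gtc = 10.0    # human emissions, GtC per year (assumed)
rise_ppm      = 2.4     # observed atmospheric CO2 rise, ppm per year (assumed)
gtc_per_ppm   = 2.13    # conversion factor, GtC of carbon per ppm of CO2

rise_gtc = rise_ppm * gtc_per_ppm       # carbon that stays in the air each year
net_sink = emissions_gtc - rise_gtc     # the remainder must be absorbed somewhere

print(f"Atmospheric increase: {rise_gtc:.1f} GtC/yr")
print(f"Implied net natural sink (ocean + land): {net_sink:.1f} GtC/yr")
# A negative net_sink would mean nature is a net source; with these numbers it is not.
```

On these round numbers only about half of what is emitted shows up as the atmospheric increase; the rest implies a net natural sink, which is the point of the mass-balance argument.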

There's no such sink, and isotope analyses put severe limits on where the CO2 goes. While I am not a scientist in this area, which I admit is a serious weakness on my side, I regard your idea as a fanciful, Salby-like crackpot idea.

With respect, there is only so much you can claim without a darn good full explanation. I don't think you really answered the partial pressure question. When the partial pressure grows, CO2 starts dissolving into the ocean even if the surface is a bit warmer. That is the case now.

Thanks for a civil answer, though. That’s more than I can do.

Alastair Brickell
Reply to  Louis Hooffstetter
January 12, 2019 12:12 pm

Louis Hooffstetter
January 12, 2019 at 6:54 am

Louis, regarding your comment below. A good summary of the importance of CO2. I think greenhouses often pump in around 1200ppm. However, my understanding is that CO2 was last at 1000ppm in the Cretaceous…is there data available that indicates it was at this level as recently as 3.5MYA?

Menicholas
Reply to  Alastair Brickell
January 12, 2019 1:04 pm

Climate scientists are spending lots of time and money to determine what the exact CO2 concentration and temp was at every interval over the past 50 million years, and beyond, to prove their notions of CO2 being the thermostat of the atmosphere.
After all, who could dispute that if they are perfectly correlated in previous ages, that there is some cause and effect relationship?
Haha…just kidding!
They have not spent one penny or one breath studying, researching, or talking about this since it was pointed out that the ice cores actually refute the notion of CO2 causing the surface to warm.
Because anyone, even the jackass warmistas, can easily see the other side of this logical coin: if such careful studies were done and showed that there is no cause-and-effect correlation, no tipping point, and no runaway effect due to increasing water vapor once CO2 gets above a certain level, then the entire hypothesis would be dead and buried, conclusively and completely, once and for all.
But these are the kinds of studies and research that could be, and were, easily done by geologists and others even many decades ago, and they could surely be done in far greater detail, with more certainty and less margin for error, with modern instrumentation and methods.
So with all the humongous mountains of our money being carelessly spent on ultra-repetitive nonsense which amounts to circular reasoning and confirmation of prior assumptions, is it not about time to ask…WTF?

Louis Hooffstetter
Reply to  Alastair Brickell
January 14, 2019 10:18 am

Hugs: No problem with the foul language. “Bollocks” is relatively mild compared to the language I use on occasion.
When I think about the ocean/atmosphere system of Earth, I imagine an unopened 2 liter bottle of soda. It's not a perfect model, but it makes sense to me. The mass of the oceans is ~270 X the mass of the atmosphere, so in my model, the soda represents the oceans and the void space at the top is the atmosphere. The exchange of gases between the two is controlled by temperature and pressure, but in our actual ocean/atmosphere system, pressure has less influence than it does in my unopened soda bottle model. And since the mass of the soda (oceans) is so much greater than the atmosphere, it's the temperature of the soda (oceans) that dominates the process. As the soda warms, gases diffuse into the atmosphere. When the soda cools, gases are removed from the atmosphere. This can be illustrated with two unopened bottles of soda. Place one in the freezer for ~45 minutes, and place the other one in the sun. When the two are opened side by side, the cool bottle of soda will have some pressure (fizz), but the warm bottle may have enough to cause soda to foam out of the top. The reason the CO2 curve in the ice core data lags the temperature curve is that it takes a while for the oceans to warm relative to the atmosphere, and so there is a lag time before they warm enough to begin to degas.

Alastair: I have seen the 3.5 mya reference in several places (that I can’t recall) but here is one: https://www.ucdavis.edu/news/what-ancient-co2-record-may-mean-future-climate-change/

Gary Pearse
Reply to  Hugs
January 12, 2019 3:47 pm

Hugs, what happens with CO2 going into or out of the ocean is determined both by the temperature of the water and by the concentration of CO2 in the air. The higher the temp, the higher the outgassing – think of a hot can of Coca-Cola. Opposing this, the higher the atmospheric content of CO2, the greater the dissolution into the ocean. It isn't a one-or-the-other type of thing. One must know what the concentration of the gas in the ocean is, what the concentration of the gas in the atmosphere is, and what the temperature of the ocean is doing – going up or down – to decide whether CO2 is going net in or out. In general, given a period of no change in these factors, CO2 in the ocean is in equilibrium with that in the atmosphere. Were you to cool the ocean down, it would dissolve more; warm it up and it would outgas.

The picture is complicated by living sea organisms that take CO2 from the water, allowing more to be dissolved. However, major changes in the main physical parameters – the amount of CO2 in the atmosphere and water temperatures going up or down – decide which way the CO2 goes.
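A minimal sketch of those two competing effects, using plain Henry's law with a van 't Hoff temperature correction and treating seawater as pure water (a simplification; the constants are the commonly quoted textbook values for CO2 in water, roughly 0.034 mol/(L·atm) at 25 C and a temperature coefficient of about 2400 K):

```python
import math

KH_298 = 0.034    # Henry's law solubility of CO2 in water, mol/(L*atm), at 298.15 K
DH_R   = 2400.0   # van 't Hoff temperature coefficient, in kelvin

def kh(t_celsius):
    """Henry's law solubility constant of CO2 at the given water temperature."""
    T = t_celsius + 273.15
    return KH_298 * math.exp(DH_R * (1.0 / T - 1.0 / 298.15))

def dissolved_co2(t_celsius, pco2_atm):
    """Equilibrium dissolved CO2 (mol/L) for a given temperature and partial pressure."""
    return kh(t_celsius) * pco2_atm

# Warming by one degree pushes CO2 out of solution (a few percent)...
print(dissolved_co2(15.0, 280e-6), dissolved_co2(16.0, 280e-6))
# ...but raising the partial pressure from ~280 to ~410 microatmospheres pushes far
# more back in, so the two effects have to be weighed together:
print(dissolved_co2(16.0, 410e-6))
```

With these illustrative numbers, the one-degree warming lowers equilibrium solubility by roughly three percent, while the higher partial pressure raises the equilibrium dissolved CO2 by nearly half, which is exactly the "both factors at once" point being made above.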

Julian Flood
Reply to  Crispin in Waterloo but really in Beijing
January 11, 2019 7:36 pm

A missing mechanism….

Willis, why the blip?

JF

björn
Reply to  Crispin in Waterloo but really in Beijing
January 12, 2019 3:09 am

Well then, Nobel Laureate Albert Gore made popular the 800-year lag of CO2 behind temperature. If your hypothesis is correct, paleoclimatological data would show the world started warming 800 years ago. Is that the case, or am I missing something?

whiten
Reply to  björn
January 12, 2019 3:00 pm

Yes bjorn,
I think you are missing the obvious.

First, the CO2 you are referring to is the concentration, which depends solely on the CO2 emission flux… so the proper lag of CO2 is to CO2… 🙂

Second, the 800-year lag you mention corresponds to a climatic variation, in either temps or CO2 concentration, playing out very slowly over time. Jumping 1.2C in 300 years, versus the case behind the 800-year lag (1.2C in 3000 years), means that the lag time will be shorter… but there will still be one.

The CO2 flux is very much connected to, related to, and dependent on the thermal flux…
So the faster the “positive” variation in the “coupled” flux, the shorter the lag time of the CO2 concentration.
There still is a lag of CO2 behind the temp increase, meaning that the atmosphere started to warm before the CO2 concentration started to increase. In terms of time this is ~100 years.

What is even more interesting about the lag is that it actually relates far more to temp variation than to time per se.
The data for the last 300 years show a lag of CO2 concentration against a ~0.4C variation.
The modern data also show no detectable lag in the CO2 flux response to the thermal flux… and no detectable acceleration, as yet, of CO2 concentration during the present thermal stagnation of the atmosphere over the last 20 years, even though the increase of CO2 concentration is still high.
In this context, the lag, when considered in both time and temp terms, means that on a slow, long-time path the maximum climate thermal swing cannot be more than about 4C, that is, no more than 2C variation up or down from the mean (~0.4C per ~800 years).

Hopefully this helps with your question.

cheers

Samuel C Cogar
Reply to  Crispin in Waterloo but really in Beijing
January 12, 2019 8:02 am

Crispin in Waterloo but really in Beijing January 11, 2019 at 6:15 pm

The simplest explanation for that remarkable correlation is that as the oceans heat up, the CO2 out-gases and the atmospheric concentration increases.

Exactly correct you are, Crispin in Waterloo

And in reference to what the author stated, to wit:

Call me crazy, but I do NOT believe that we know the 1955 temperature of the top two kilometres (6,562 feet) of the ocean to within plus or minus four hundredths of one degree.

I agree with the “non-belief” stated in the above statement, …… as well as the fact that I do NOT believe that we knew what the “ocean heat content” (temperature) of the top 2 km (6,562 feet) of the ocean was during the “1955-1990” time period as denoted on the included graphic, ….. Figure 1. Change in ocean heat content, ….. simply because that portion of the graph appears to me to be “random guesstimates”.

What could possibly cause the “heat content” (temperature) of the top 2 kilometers (6,562 feet) of the ocean waters to change (increase/decrease) as quickly as denoted on said graph?

But anyway, I kinda really like the above noted graph …… simply because it shows a direct correlation (as defined by Henry's Law) between the 60-year increase in average ocean water temperature ….. and ….. the 60-year increase in average atmospheric CO2 ppm quantities.

The CAGW claim that the greening/rotting of the NH biomass is the “driver” of average biyearly/yearly atmospheric CO2 ppm quantities is nothing more than “junk science” agitprop.

Bill Taylor
Reply to  Crispin in Waterloo but really in Beijing
January 12, 2019 9:48 am

as a kid i noticed when you let a soda get to room temperature it went “flat” meaning as it warmed it released its co2, because warmer water holds LESS co2.

Reply to  Bill Taylor
January 12, 2019 10:20 am

100%
good observation
which is also the very reason why there is correlation with rising global T and rising CO2…
but rising CO2 does not cause rising global T

if all of us could just agree on that? that would be good…

Bill Taylor
Reply to  henryp
January 12, 2019 11:27 am

the historic records show clearly the temperature rises and then co2 rises……the global warming caused by humans crowd are claiming the cause comes AFTER the effect

Reply to  Bill Taylor
January 12, 2019 11:29 am

true.

tty
Reply to  Bill Taylor
January 12, 2019 11:52 am

It will go flat once you open it, even without getting to room temperature, because the concentration of CO2 in the soda is higher than in the air above the surface.

Ever notice all those little CO2 bubbles in soda pop? They form as soon as you open the bottle, no matter how cold the pop is.

Gary Pearse
Reply to  tty
January 12, 2019 4:01 pm

Yes, but the temperature is indeed a factor. I did a contract job in Tanzania in the 1980s and, looking for bottled water to drink in the local area, I had to settle for soda water, and we had no refrigeration. When taking the cap off a bottle, you had to wait patiently, letting the gas escape slowly. If you snapped the cap off, you would lose 90% of your water in an explosion of gas. I only absentmindedly did this once!

Bill Taylor
Reply to  Gary Pearse
January 13, 2019 9:15 am

TY Gary, i grow tired of seeing my accurate information “corrected” here when it was already accurate and needed no correction.

Douglas Proctor
Reply to  Crispin in Waterloo but really in Beijing
January 12, 2019 11:19 am

Is the real question not how one measurement can be representative of a 300 km x 300 km x 2 km volume of water, rather than how you determine the mathematical error in 4,000 measurements?

How accurate is one measurement of the volume? +/- 0.5C? Each reading 200 km apart? I'm thinking of the SST maps. Regional heat mapped to what fineness?

Robin
Reply to  Crispin in Waterloo but really in Beijing
January 13, 2019 12:35 am

I am not a scientist, but I have experience in industry that tells me the oceans cannot be getting warmer and more acidic at the same time, as the ‘warmists’ argue, because of what you correctly say about the off-gassing of CO2 as the oceans warm up.

Nigel Franks
Reply to  Crispin in Waterloo but really in Beijing
January 13, 2019 11:17 am

The solubility of CO2 in salt water is more complicated than solubility in pure water. As well as temperature, it depends on pH, partial pressure, absolute pressure and salinity.

Chaamjamal
January 11, 2019 6:17 pm

Not easy to sell bullshit to Willis.

Chaamjamal
Reply to  Chaamjamal
January 11, 2019 6:46 pm

Yet another ocean heat content issue in the context of agw is the use of circular reasoning. Pls see

https://tambonthongchai.com/2018/10/06/ohc/

Menicholas
Reply to  Chaamjamal
January 12, 2019 3:12 pm

Chaamjamal,
Thank you for this very interesting link!
Very interesting video of the undersea volcano.
Fascinating really.
Is that clip part of a longer video, documentary, or discussion?
I would love to watch much more of this if there is more.
Perhaps a link?

Thank you in advance.
Nick M

Keith Rowe
Reply to  Chaamjamal
January 11, 2019 7:30 pm

The problem I have with it is the “if this were true” test. If we go by NASA's sea level rise, which has been pretty flat for the past three years, while all the heat is supposedly going into the ocean, then where is the thermal expansion of the oceans happening? Is it being balanced by massive ice sheet builds on Greenland and Antarctica? The math doesn't work out for me. Simple logic shows something isn't right.

bit chilly
Reply to  Chaamjamal
January 11, 2019 7:50 pm

it shouldn’t be easy or even possible to sell this bullshit to any adult on the planet. i know willis asked not to rag on zeke, but given he has displayed the fact he is an intelligent and articulate adult numerous times i find it very hard to believe these claims are “mistakes”.

i have huge respect for the time and effort willis puts into these analyses. here in scotland the most apt response to the claims by zeke would be “aye right, ya walloper”.

that would suffice, because no one with an iq above 90 should swallow that we know the ocean heat content down to 2000m to the degree of accuracy claimed here, either 50 years ago or today. anyone claiming so is either stupid or a liar who knows no one of any consequence in the political or sci-political arena will ever call them out for selling the snake oil they are paid to sell.

ATheoK
Reply to  bit chilly
January 11, 2019 11:32 pm

I agree with you bit chilly.

Sweet Old Bob
January 11, 2019 6:18 pm

Sorry Zeke ….HORSEFEATHERS !

MarkW
Reply to  Sweet Old Bob
January 11, 2019 6:47 pm

A mistake is getting your answer wrong by a factor of two or three.
Getting the error bars wrong by 2 or 3 orders of magnitude falls into an entirely different category.

Jimmy Haigh
January 11, 2019 6:19 pm

Is this yet another “peer reviewed” error-strewn study? Peers ain’t what they used to be.

Hugs
Reply to  Jimmy Haigh
January 11, 2019 7:28 pm

No, it’s a freaking marketing meme.

bit chilly
Reply to  Hugs
January 11, 2019 7:52 pm

+1

Louis Hooffstetter
Reply to  Jimmy Haigh
January 11, 2019 9:04 pm

“Peers ain’t what they used to be.”
Zeke’s peer reviewers are fellow climate “scientists” who also don’t follow the scientific method or know how to use significant figures.

fbc3
January 11, 2019 6:20 pm

As always, average temperature is a meaningless value. So just lead with that.

Hugs
Reply to  fbc3
January 11, 2019 7:30 pm

The average T of water is not meaningless, but Zeke was using ZJ, not temperature.

Chaamjamal
Reply to  Willis Eschenbach
January 11, 2019 6:29 pm

Enjoy all your posts sir particularly the data analysis ones. Thank you.

Marcus
Reply to  Willis Eschenbach
January 11, 2019 6:55 pm

(easiest?)

Menicholas
Reply to  Willis Eschenbach
January 11, 2019 7:00 pm

Willis,
The easiest person too!
*I know how you hate typos*
BTW…good observations here.
I recall the first time I heard about this zettajoules crap; within two minutes I had looked up the volume of the ocean and done the math.
It took me longer to find out what the resolution of the Argo thermometers was, but I knew instantly that the result was BS of a very egregious nature.
Zettajoules my eye!
At the very least they could use a more easily relatable unit…like Hiroshimas!

Tom Halla
January 11, 2019 6:21 pm

I agree. No way was the 1955 measurement that precise, or the current ARGO measurements either.

doonman
Reply to  Tom Halla
January 11, 2019 8:04 pm

Didn’t we have to adjust the Argo surface float data upward because it didn’t align with WWII ship intake water temperature data?

Menicholas
Reply to  doonman
January 11, 2019 10:23 pm

Yes, yes they did!
That was how they Karlized away the pause a few years back.
As sleazy and disingenuous as anything could be.

commieBob
January 11, 2019 6:26 pm

Once again, we have results specified with much greater accuracy and precision than that provided by the test instruments.

The assumption that, if you have a ton of data, you gain accuracy and precision is only true under specific circumstances. From experience, I would say that data gathered from nature almost never fulfills those criteria. I'm not a mathematician, and I do wish that someone would come along and produce a theoretically sound proof. I can provide examples, but that's not the same.

Donald Kasper
Reply to  commieBob
January 11, 2019 6:35 pm

0.001 C and probably 0.01 C measurements are measuring the microcurrents along the buoy skin and not the ocean, and are probably influenced by heat from the buoy electronics, battery, and motors. Measuring at different scales of resolution does not mean you are measuring the same thing more accurately; it may also mean you are measuring other phenomena.

Donald Kasper
Reply to  Donald Kasper
January 11, 2019 6:36 pm

Measuring 0.001 C ocean microeddies and microfluxes is not the same as measuring world ocean currents in bulk.

Louis Hooffstetter
Reply to  Donald Kasper
January 11, 2019 9:14 pm

ARGO floats can measure temperature to 1/1000th of a degree C with accuracy?
Is that really possible? Now my BS alarm is going off.

Alex
Reply to  Louis Hooffstetter
January 11, 2019 9:54 pm

Argo site says +/- 0.002 C. That is the precision of the instrumentation. How accurately it reads a real temperature is up for debate. To be fairly certain, I would recalibrate every month. That doesn't happen.
I wouldn’t be betting any part of my anatomy on it.

Richard
Reply to  Louis Hooffstetter
January 12, 2019 12:00 pm

“The temperatures in the Argo profiles are accurate to ± 0.002°C and pressures are accurate to ± 2.4dbar. ”
http://www.argo.ucsd.edu/Data_FAQ.html#accurate

Menicholas
Reply to  Louis Hooffstetter
January 12, 2019 3:25 pm

To claim that the resolution limit of the device is the same as the accuracy of the measurement is complete nonsense.
To do so is to claim that there are no errors from any source and that the unit is constantly and perfectly performing at its theoretical design limitations.

Red94ViperRT10
Reply to  Louis Hooffstetter
January 12, 2019 9:52 pm

Last month I got interested in how good of a thermometer I could get, so I went searching (online of course). Even after I asked for a “laboratory thermometer” the best I could find was ±0.25°C. If anyone can build a thermometer that actually reads to ±0.002°C, where do I buy one, and how much will it cost me? And how often will I have to get it calibrated? And how quickly can it give me that reading? Do I have to stand in one place for 20 minutes waiting for it to acclimate so I can actually read it?

Brooks Hurd
Reply to  Louis Hooffstetter
January 13, 2019 9:32 pm

I believe that the people who wrote the Argo document stating 0.002 K “accuracy” have confused accuracy and precision. You determine precision by measuring the same thing multiple times and determining the variation in the readings. Accuracy is determined by measuring a known temperature multiple times. Known temperatures are the melting and boiling points of various substances. This can be done in a laboratory, but not very easily under 2,000 m of ocean water.

There are other factors which affect accuracy beyond the characteristics of the RTD sensor. The sensor's resistance must be measured to convert to a temperature. This means that the voltage and current need to be measured with extremely high precision to match the RTD sensor precision. Electronic components are affected by temperature changes, so I wonder whether the Argo electronic components are in an isothermal enclosure. I find it hard to believe that the Argo temperature measurements come within a couple of orders of magnitude of the 0.002 K that they claim on the Argo website.
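To put a number on what ±0.002 K demands of the electronics, here is a small sketch using a standard 100-ohm platinum RTD with the IEC 60751 coefficients, purely for illustration; the actual Argo CTD sensors are a different design, so this is an assumption about a generic RTD, not a description of the real hardware:

```python
# What does +/-0.002 C mean in resistance terms for a generic PT100 RTD?
# Coefficients are the standard IEC 60751 values (valid for T >= 0 C).

R0 = 100.0        # ohms at 0 C
A  = 3.9083e-3    # per deg C
B  = -5.775e-7    # per deg C squared

def r_pt100(t_c):
    """Resistance of an ideal PT100 at temperature t_c (deg C, t_c >= 0)."""
    return R0 * (1.0 + A * t_c + B * t_c ** 2)

t    = 10.0                        # a plausible mid-depth ocean temperature (assumed)
sens = R0 * (A + 2.0 * B * t)      # sensitivity dR/dT, about 0.39 ohm per deg C
dR   = sens * 0.002                # resistance change corresponding to 0.002 C

print(f"Sensitivity at {t} C: {sens:.4f} ohm/C")
print(f"0.002 C corresponds to {dR * 1000:.2f} milliohms on a ~{r_pt100(t):.1f} ohm sensor")
print(f"i.e. the resistance must be resolved to ~{dR / r_pt100(t) * 1e6:.0f} parts per million")
```

Resolving a couple of millikelvin on such a sensor means resolving well under a milliohm on roughly a hundred ohms, a few parts per million for the whole measurement chain, which is the scale of the demand Brooks is pointing at.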

ATheoK
Reply to  Donald Kasper
January 12, 2019 12:22 am

Almost, Donald;
My only quibble is the microcurrents.

“3.3.5 Water Temperature
While there are generally few problems with water temperature measurements (WTMP1, WTMP2), it should be noted that the depth of water temperature sensors vary with buoy hull or C-MAN site, and that the temperature probes on buoys are attached to the inside of the hull. Since buoy hulls are highly thermally conducting, the temperatures measured may reflect the average temperature of the water around the submerged hull rather than the temperature of the water nearest the probe.”

There is another curious caveat that leaves ocean temperature measurement begging:
From the same source:

“Table 1. Accuracies Achieved During Field Comparisons.
Parameter | WMO requirement | NDBC accuracy | Basis
Air Temperature | 0.1 deg. C | 0.09 deg. C | Duplicate sensor comparison
Water Temperature* | 0.1 deg. C | 0.08 deg. C | Duplicate sensor comparison
Dew Point | 0.5 deg. C | 0.31 deg. C | Post calibration
Wind Direction | 10 deg | 9.26 deg | Adjacent buoy comparison
Wind Speed | 0.5 m/s or 10% | 0.55 m/s | Duplicate sensor comparison
Sea Level Pressure | 0.1 hPa | 0.07 hPa | Duplicate sensor comparison
Wave Height | 0.2 m or 5% | 0.2 m | Comparison to standard
Wave Period | 1 s | 1 s** | Comparison to standard
Wave Direction | 10 deg | 10 deg | Comparison to standard

* Water temperature is taken by a buoy internal thermistor. Ocean Temperature is
taken with direct water contact.

** Resolutions for periods greater than 10 seconds are greater than 1 second.”

A caveat that leaves one wondering why “ocean temperature” is not further defined with a relevant accuracy?

That thermal conducting buoy masks the micro-currents, otherwise I would agree with you.

* NDBC 0.08°C claimed accuracy,
* Continual growth of marine life on the buoy,
* Buoy sampling throughout a column of water,
* i) i.e. sampling different layers of water that are often drastically different temperatures,
* ii) Where a slight change in water layer thickness could drastically affect any averaged temperatures,
* iii) That difference in water layers means each buoy is not measuring the same thing. Even the same buoy, with currents changing the column of water is not likely to be measuring the same thing.
* iv) Meaning, ocean buoys are another situation where averages are an improper reflection of what is being measured.

The continual growth of marine life, along with the minor fact that buoys attract marine life obscures that claimed accuracy more as time passes.

Nor are all of the buoys Willis shows under USA control. Even within the USA, drifting buoys are spread amongst different owners and programs.
This image shows the 1,684 United States Argos drifting buoys over the last six months.

The lines are the tracks of the buoys. I am also puzzled by tracks without buoys.

Different programs, different owners, different equipment, different instruments, different maintenance teams, and different maintenance schedules, practices and procedures are all ignored in the 0.08°C accuracy claimed by NDBC above.

MarkW
Reply to  commieBob
January 11, 2019 6:50 pm

Even if they were measuring the results from fixed buoys, they couldn't increase accuracy the way they claimed.
And fixed buoys at least have the advantage of measuring the same patch of ocean each time.
The ARGO buoys are free floating, so the place being measured is different each time.

AndyHce
Reply to  MarkW
January 12, 2019 8:50 pm

Fixed buoys are measuring the constantly changing ocean moving through their fixed position.

Many measurements by many floating buoys producing a better idea of the average temperature within a volume of ocean certainly makes some sense, but surely the accuracy of that average cannot exceed the individual accuracy of the measurements.

Bartemis
Reply to  commieBob
January 11, 2019 7:42 pm

To get accuracy increasing with the square root of N, the samples must be independent and unbiased.

Steve Heins
Reply to  Bartemis
January 11, 2019 7:52 pm

Are two dimensions (time and space) of separation enough for you to conclude independence? If you claim “bias” you will have to provide conclusive proof of it.

Erik Magnuson
Reply to  Steve Heins
January 11, 2019 10:15 pm

Other way around, the burden of proof is on the assertion that the measurements are unbiased.

Peter Sable
Reply to  Steve Heins
January 12, 2019 5:50 pm

If you claim “bias” you will have to provide conclusive proof of it.

White noise is the most *optimistic* noise spectrum for a measurement, and this means the null hypothesis should be that the noise of sampling and measurement is correlated (red noise), and one must prove that it's not correlated in order to use N^(1/2).

(strictly speaking, blue noise has negative correlation and improves better than N^(1/2), but blue noise is rare in nature)

Also, since we are comparing two time periods, we need to be able to conclude that the *space* is not correlated.

So let me ask you, is the temperature measurement of a 1×1 cell of the Earth's surface strongly correlated with the cell next to it? The answer is quite probably yes. Therefore likely red noise, and thus N^(something less than 1/2) improvement with sample size.
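A quick Monte Carlo sketch of that point; the lag-1 correlation of 0.9 below is an arbitrary illustrative value, not a claim about the real spatial correlation of ocean temperatures:

```python
import numpy as np

rng = np.random.default_rng(0)

def error_of_mean(n_samples, phi, trials=1000):
    """Standard deviation of the sample mean for AR(1) noise with lag-1 correlation phi."""
    eps = rng.normal(size=(trials, n_samples))
    x = np.empty((trials, n_samples))
    x[:, 0] = eps[:, 0]
    scale = np.sqrt(1.0 - phi ** 2)   # keeps the per-sample variance at 1
    for i in range(1, n_samples):
        x[:, i] = phi * x[:, i - 1] + scale * eps[:, i]
    return x.mean(axis=1).std()

for n in (10, 100, 1000):
    white = error_of_mean(n, phi=0.0)   # independent samples
    red   = error_of_mean(n, phi=0.9)   # strongly correlated samples
    print(f"N={n:5d}  white: {white:.3f}  (1/sqrt(N) = {1 / np.sqrt(n):.3f})  red: {red:.3f}")
```

With independent (white) noise the spread of the mean tracks 1/√N closely; with strongly correlated (red) noise it shrinks far more slowly, which is why the √N gain has to be demonstrated rather than assumed.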

Micky H Corbett
Reply to  Bartemis
January 12, 2019 12:11 am

They also have to be identically distributed. So you need to determine that samples all follow the same shape and characteristics. This is impossible without decent signal to noise ratio.

What Zeke has missed is a basic part of the scientific method. The tools (the method) should provide you with sufficiently low uncertainty in your data to be able to see the variations predicted by your hypothesis.

If you don't have this, you can still assume you do, but your work then carries much less relevance to the real world.

Geoff Sherrington
Reply to  Micky H Corbett
January 12, 2019 1:04 am

MHC,
Thank you for this common sense comment. Geoff.

Richard Saumarez
Reply to  Micky H Corbett
January 12, 2019 1:45 pm

This also assumes that the sampling is uniform, i.e. that each buoy represents the same volume of water; otherwise the mean has to be volume-weighted (sum of temperature × volume, divided by total volume) to get the heat content right. W.E. demolishes this pretty effectively with a Monte Carlo simulation.

I am continually astonished by some of the basic measurement / interpretation errors in climate science. If they were engineers, they might have learnt these things the hard way.
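A minimal sketch of the volume-weighting issue, with two invented "regions" standing in for unevenly sampled parts of the ocean (the temperatures and volumes below are made up purely for illustration):

```python
# One reading per region: a simple average treats both regions equally,
# while the heat content depends on how much water each reading represents.

regions = [
    {"temp_c": 15.0, "volume_m3": 1.0e17},   # warm, well-sampled water (invented)
    {"temp_c":  3.0, "volume_m3": 5.0e17},   # cold, sparsely-sampled water (invented)
]

simple_mean = sum(r["temp_c"] for r in regions) / len(regions)

total_volume  = sum(r["volume_m3"] for r in regions)
weighted_mean = sum(r["temp_c"] * r["volume_m3"] for r in regions) / total_volume

print(f"Simple average of the two readings: {simple_mean:.2f} C")
print(f"Volume-weighted average:            {weighted_mean:.2f} C")
```

Here the simple average lands at 9 C while the volume-weighted figure is 5 C; any heat-content estimate built on the former would be biased warm, which is the uniform-sampling assumption Richard flags.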

January 11, 2019 6:30 pm

The sun is the object that heats up the Earth’s oceans. CO2 in the air and the air itself would have a hard time heating up the oceans which have a heat content 1200 times that of the atmosphere. The oceans heat the air not vice-versa. Hotter oceans expel more CO2 to be counted at Mauna Loa, the World’s largest volcano.

Donald Kasper
Reply to  Nicholas William Tesdorf
January 11, 2019 6:39 pm

Earth oceans heated by the sun, geothermal heat flux of the ocean floor, undersea rift system volcanics, subduction volcanics and friction, heat from biodegradation in the sea floor sediment. Chemical reactions in the ocean also have endothermic and exothermic results, depending on the reaction.

Albert
January 11, 2019 6:31 pm

Zeke is clearly allied with all the usual propaganda-pushers. What makes him “one of the good guys”?

Hugs
Reply to  Albert
January 11, 2019 7:35 pm

Not necessarily allied, but sometimes a bit blinded. He's one of the good guys because he doesn't call names or sneer, and he has meaningful discourse with people who disagree with him, unlike, say, Michael E Mann.

a
Reply to  Hugs
January 11, 2019 8:02 pm

Hugs,
Thanks for the reply. Perhaps Willis was calling for civility. That’s always a good idea.

JohnWho
January 11, 2019 6:34 pm

Even worse, the way I understand it, the Argo floats do exactly that – they float around and don’t even measure each time the same part of the ocean.

Donald Kasper
Reply to  JohnWho
January 11, 2019 6:42 pm

You can get a sense of buoy repeatability by taking multiple buoys that occur in a certain grid spacing and comparing their outputs. You can compare repeatability to grid size sampling as well. You can also test repeatability vs ocean depth and latitude. You can compare buoy to ship data and coastal station data. None of that was done.

Menicholas
Reply to  Donald Kasper
January 11, 2019 7:22 pm

You can also make sure everyone involved is honest, agenda-free, and non-biased.
Which we know is true, right?
“Scuse me, I gotta wipe the laugh tears off my face…

Steve Heins
Reply to  Menicholas
January 11, 2019 7:42 pm

Specifically, what part of Donald Kasper’s post do you disagree with?

Menicholas
Reply to  Steve Heins
January 11, 2019 8:24 pm

Specifically, which part of what I wrote gave you the idea I think he is “involved” with the Argo project?

GCSquared
Reply to  Donald Kasper
January 12, 2019 11:19 am

VERY good suggestion; it’s always good to examine the basics.

There could always be systematic variations during the course of an ascent that differ from one rise to the next. The bit about the precision improving as root N holds only for normal processes. There's no reason to reject the possibility of “long tail” statistics outright.

diogenese2
Reply to  JohnWho
January 12, 2019 1:07 am

The whole ocean is floating around. The buoys go with it, so to some extent each buoy IS measuring the same part of the ocean, which rather makes nonsense of grid measurements.

EdB
January 11, 2019 6:38 pm

Simply brilliant Willis. Thanks.

I am dubious that ocean temperatures were known to better than plus or minus 1 C right up until satellite measurements, and then plus or minus 0.3 C until the Argo floats were deployed.

MarkW
January 11, 2019 6:39 pm

“That’s 0.003°C. Get real! Ask any process engineer—determining the average temperature of a typcial swimming pool to within three thousandths of a degree would require a dozen thermometers or more …”

Maybe for one of those three-foot-diameter kiddie pools.

MarkW
Reply to  MarkW
January 11, 2019 6:39 pm

And you would need thermometers that are sensitive to 0.0001C to do it.

Steve Heins
Reply to  MarkW
January 11, 2019 6:48 pm

Nope, you don’t understand anything about the random sampling of a mean.

[Formula: the standard error of the mean, SE = s / √N]

“N” is the number of samples

MarkW
Reply to  Steve Heins
January 11, 2019 6:52 pm

As usual Steve, you haven’t a clue as to what you are talking about.
That formula only applies when you are measuring the same thing with the same instrument.
Neither condition applies here.

Steve Heins
Reply to  MarkW
January 11, 2019 7:02 pm

No MarkW, you are wrong, as I do have a clue. The formula applies when you are using an estimator for a population mean. “Observations”, as reflected in the “N” in the formula, are not restricted to either “the same thing” or the same “instrument.” What is being measured is not a “thing”, it is a population mean, and the “s” is the error term for all observations. There is no restriction to the “same instrument.”

You don't have a clue about statistical sampling, do you?

MarkW
Reply to  MarkW
January 11, 2019 7:04 pm

Ah yes, Steve, insult by assertion.
The subject has been reviewed to death here, and your position has been consistently eviscerated.

Beyond that, you can never reduce your margin of error to less than the accuracy of your probe.

Steve Heins
Reply to  MarkW
January 11, 2019 7:05 pm

Tell us all MarkW, what “instrument” do you use to measure GAST? (global average surface temperature.)

I would really like to see a picture of that measuring device.

Steve Heins
Reply to  MarkW
January 11, 2019 7:09 pm

MarkW, the “probe” you refer to is NOT by itself capable of measuring the global average surface temperature.

However, by increasing the number of observations, one will gain more accuracy in measuring the global average surface temperature.

This logic applies to the measurement of sea levels and the observation of sea level rise.

Steve Heins
Reply to  MarkW
January 11, 2019 7:19 pm

MarkW says: “The subject has been reviewed to death here”

No it has not. All the discussion here has been about using an instrument repeatedly to increase accuracy. That is not what happens in statistical sampling.

For example, you can use a ruler to measure the height of an adult American male. You cannot use that same ruler to measure the average height of an adult American male. The difference being what you are measuring with said ruler.

You (and everybody else) confuse the item being measured. A ruler measures height. A ruler doesn't measure an average.

Louis Hooffstetter
Reply to  MarkW
January 11, 2019 9:22 pm

Steve, you still have to use significant figures. No matter how you slice and dice the data using statistics, the answer can never have more significant figures than the least accurate measurement. Climate “scientists” ignore this rule.

Don’t be a climate “scientist”.

Simon
Reply to  MarkW
January 12, 2019 10:53 am

MarkW is once again all BS and bluster and once again caught with his pants down.

MarkW
Reply to  MarkW
January 12, 2019 12:40 pm

As usual, Steve and Simon have nothing intelligent to say. They just whine that others don’t agree with them.
Steve, if you were half as intelligent as you believe yourself to be, you would know that all instruments require calibration and recalibration.
Secondly, it’s up to those making the claim to demonstrate that the equipment they are using is fit for the task.
As always, you have reversed the null hypothesis and the scientific method in general.

MarkW
Reply to  MarkW
January 12, 2019 12:41 pm

If you aren’t measuring the same thing, it doesn’t matter how many measurements you take, it doesn’t increase the accuracy beyond the accuracy of the individual probes.

Steve, just because you weren’t allowed to use a computer before a few weeks ago is not evidence that a subject has not been discussed before.

commieBob
Reply to  Steve Heins
January 11, 2019 7:42 pm

So, let’s use the example of Argo buoys.

We have buoys taking N samples at various times and places and we calculate a population mean.

Suppose that we take a different set of Argo buoys which take N samples at different times and places.

We now have two figures for population mean. Can you tell me how close those two figures should be? Hint: you can’t. The ocean is anything but homogeneous. The distribution of the Argo buoys will make a big difference in the calculated population mean.

BTW – I stumbled on a wonderful article about the history of Student’s T-test. link As an added bonus, the article takes a swipe at p-value as a method for evaluating the validity of experimental results.

BCBill
Reply to  commieBob
January 11, 2019 11:01 pm

Nice article on Gosset. Thanks.

Geoff Sherrington
Reply to  Steve Heins
January 12, 2019 12:47 am

Steve Heins January 11, 2019 at 7:19 pm

Steve,
A common error is the argument that one can use a crude ruler to get information that can be processed with statistics to give the average height of an adult female or whomever.
While that works in theory, it is only correct if there have been prior, accurate measurements to depict the statistical distribution that is later relied upon to calculate the average. That is, the trick works only if someone has done the thorough, proper job before you.
With this Argo data, I have not seen much at all about arrays of buoys deployed so as to have maximum constancy of known exogenous variables, which has long been the customary path to good science. Can you provide a reference to the manner in which the claims of accuracy were calculated for Argo?
Thanks Geoff.

Donald Kasper
Reply to  MarkW
January 11, 2019 6:43 pm

You would need one thousand thermometers or more.

nw sage
January 11, 2019 6:43 pm

“Here’s the underlying problem with their error estimate. As the number of observations goes up, the error bar decreases by one divided by the square root of the number of observations. And that means if we want to get one more decimal in our error, we have to have a hundred times the number of data points.”
This comment is similar to CommieBob’s above.
The quoted section is not a correct analysis. As worded, the number of observations will only improve the error estimate at ALL if the observations are all made using the same thermometer, measuring the same piece of seawater, at the same time (so no heat transfer can take place). Since each ARGO buoy changes depth and floats from place to place, and each has a different thermometer – albeit calibrated – no amount of manipulation of the buoy data will improve the apparent error bar. See Dr Ball's piece on this. He said it much more elegantly than I.

Donald Kasper
Reply to  nw sage
January 11, 2019 6:47 pm

There is the measurement of a thing and there is the error of the instruments themselves, and the two were confused in that statement. The improvement from increased sampling is not a simple square-root function at all; it depends on the amount of noise. There are diminishing returns on increased sampling, not automatically better accuracy. The misconception is that noise is random and tends to cancel out with more samples. Yeah, however, every now and then you get another wildly anomalous reading from the noise, and that defeats your additional sampling.

Steve Heins
Reply to  Donald Kasper
January 11, 2019 6:56 pm

The standard error declines in proportion to the inverse square root of the number of samples. This is a mathematical fact. Your argument about “noise” is wholly contained within the standard deviation of the sampling instrument.

MarkW
Reply to  Steve Heins
January 11, 2019 7:05 pm

Only if the samples are the same thing.
So your formula doesn’t apply in this case.

David L. Hagen
Reply to  Steve Heins
January 12, 2019 11:09 am

Steve Heins – ONLY the statistical uncertainty (Type A error) declines with the number of samples OF THE SAME THING.
Type B error, i.e. systematic error, does NOT decline with the inverse square root of the number of samples.
Study NIST TN1297
https://www.nist.gov/pml/nist-technical-note-1297
Study international guidelines BIPM GUM
https://www.bipm.org/en/publications/guides/gum.html
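A tiny numerical illustration of that Type A / Type B distinction; the 0.05 C shared bias and 0.5 C per-reading scatter are invented values used only to show the behaviour:

```python
import numpy as np

rng = np.random.default_rng(1)

true_temp = 10.000   # deg C, the quantity we are trying to measure
bias      = 0.05     # systematic offset common to every reading (Type B, invented)
scatter   = 0.5      # random error of a single reading (Type A, invented)

for n in (10, 1_000, 100_000):
    readings = true_temp + bias + rng.normal(0.0, scatter, size=n)
    err = readings.mean() - true_temp
    print(f"N={n:6d}  error of the mean: {err:+.4f} C  (the shared bias was {bias:+.3f} C)")
```

The random scatter averages away as the sample grows, but the shared bias is still there at N = 100,000, which is why quoting s/√N alone understates the real uncertainty whenever a systematic component exists.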

Richard Saumarez
Reply to  David L. Hagen
January 12, 2019 1:57 pm

That’s a really interesting website. I’ve just skimmed the “propagation of uncertainties with Monte-Carlo analysis” and it’s very interesting and useful. Food for thought!

I suppose that the simplest answer is that one is estimating two distributions, which will converge differently with the number of samples.

January 11, 2019 6:43 pm

Bravo, Willis! Nicely done.

Just from eyeballing the graph, it looks like that graph shows a 200 ZJ increase in OHC over twenty years (1998 to 2018). If it takes 2600 ZJ to raise the temperature of the top 2 km of the world’s oceans by 1°C, then 200 ZJ (over 20 years) would be 200/2600 = 0.077°C average temperature change (about 0.0038°C/year).

That’s a minuscule temperature difference. Even if it’s real, that’s certainly not a very scary rate of global warming.

What’s more, there were no Argo floats in 1998. So that data had to come from other sources.

Argo didn’t get to 3000 floats (targeted coverage level) until late 2007. So where’s all that data coming from?

(Maybe I’m asking questions that are in the paper, which I haven’t read.)
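For anyone who wants to redo the zettajoule arithmetic above, here is a sketch; the volume, density and specific heat are round-number assumptions chosen to reproduce the roughly 2,600 ZJ per degree figure from the head post:

```python
# Convert zettajoules of ocean heat content change into degrees Celsius.
# All three physical inputs are rough assumptions, not measured values.

volume_m3 = 6.4e17     # approx. volume of the ocean's top 2,000 m (assumed)
density   = 1025.0     # kg per cubic metre, typical seawater
spec_heat = 3990.0     # J/(kg*K), typical seawater

joules_per_degC = volume_m3 * density * spec_heat
zj_per_degC     = joules_per_degC / 1e21

print(f"Energy to warm the top 2 km by 1 C: ~{zj_per_degC:.0f} ZJ")
print(f"200 ZJ therefore corresponds to ~{200 / zj_per_degC:.3f} C")
```

That reproduces the ~0.077°C figure above for a 200 ZJ change.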

MarkW
January 11, 2019 6:45 pm

Willis, in your Monte Carlo runs, what is the assumed accuracy of the thermometers?

Hugs
Reply to  Willis Eschenbach
January 11, 2019 8:20 pm

In fact, how does the grid cell size affect the conclusion? What would be the necessary size of grid cells to reach the claimed accuracy?

Another question. Why is the graph Zeke posted so smooth on the right-hand side if the errors are as large as you claim? Doesn't the smoothness show that the computed CI is not really relevant?

Crashex
Reply to  Willis Eschenbach
January 12, 2019 7:17 am

Oh. So if you included any reasonable level of accuracy and precision error in the “measured average subset of the ocean”, then the error bars would increase.

Good to know.

David L. Hagen
January 11, 2019 6:51 pm

Thanks Willis for raising the issue and exploring it.

For those seeking further details in published docs etc. here are several recent pubs.
Still not much mention of systematic uncertainties (Type B errors) per BIPM’s GUM
https://www.bipm.org/en/publications/guides/gum.html

Annie Wong, Robert Keeley, Thierry Carval and the Argo Data Management Team
(2018) Argo Quality Control Manual For CTD and Trajectory Data. Version 3.1 16 January 2018 http://dx.doi.org/10.13155/33951

Huang, B., Angel, W., Boyer, T., Cheng, L., Chepurin, G., Freeman, E., Liu, C. and Zhang, H.M., 2018. Evaluating SST analyses with independent ocean profile observations. Journal of Climate, (2018).
https://journals.ametsoc.org/doi/pdf/10.1175/JCLI-D-17-0824.1

Kuusela, M. and Stein, M.L., 2018. Locally stationary spatio-temporal interpolation of Argo profiling float data. Proceedings of the Royal Society A.
https://arxiv.org/pdf/1711.00460

Roach, C.J., Balwada, D. and Speer, K., 2018. Global Observations of Horizontal Mixing from Argo Float and Surface Drifter Trajectories. Journal of Geophysical Research: Oceans. https://bit.ly/2M65kQ7

geoff@large
Reply to  David L. Hagen
January 14, 2019 12:08 am

Thanks DavidH, very helpful.

A G Foster
January 11, 2019 6:56 pm

“Summary:
Climate change from human activities mainly results from the energy imbalance in Earth’s climate system caused by rising concentrations of heat-trapping gases. About 93% of the energy imbalance accumulates in the ocean as increased ocean heat content (OHC). The ocean record of this imbalance is much less affected by internal variability and is thus better suited for detecting and attributing human influences (1) than more commonly used surface temperature records. Recent observation-based estimates show rapid warming of Earth’s oceans over the past few decades (see the figure) (1, 2). This warming has contributed to increases in rainfall intensity, rising sea levels, the destruction of coral reefs, declining ocean oxygen levels, and declines in ice sheets; glaciers; and ice caps in the polar regions (3, 4). Recent estimates of observed warming resemble those seen in models, indicating that models reliably project changes in OHC.”

The summary says nothing of methods; nothing but alarm and paranoia. Judging by the summary alone it is clear we are dealing with propaganda, obviously intended to counter the devastating effects of the Resplandy/Lewis fiasco. –AGF

Jeff Alberts
Reply to  A G Foster
January 12, 2019 9:56 pm

“The ocean record of this imbalance is much less affected by internal variability and is thus better suited for detecting and attributing human influences (1) than more commonly used surface temperature records.”

Wha? This really beggars belief. It’s easier to detect AGW in the oceans, where humans aren’t, than on land, where humans are??

Mick
January 11, 2019 6:58 pm

Love your work Willis 🙂

I think the world's big powers have more data… I mean the navies' submarines. The subs need to know the thermal layers of the oceans to seek and hide. All those subs zig-zag the oceans checking temperature in real time and documenting all changes, but it's all classified, methinks.

Red94ViperRT10
Reply to  Mick
January 14, 2019 6:36 am

Last I heard, it’s not. You can thank VP Al Gore for that. See, even as VP he was already on board the CAGW gravy train. He thought the data set you just mentioned would help him prove the warming, so he had all those records declassified. Now, where you would go today to get your hands on them I still can’t tell you.

Ve2
January 11, 2019 7:05 pm

Why this sudden interest in the accuracy of Argo floats from the climate changers?
Several years ago they ignored the Argo information and relied instead on the intake-manifold temperatures of ocean-going ships to discredit the Argo floats.

January 11, 2019 7:07 pm

I remember when the Argo floats were measuring a cooling ocean… until Josh Willis “fixed” the data. Here’s an article:
https://earthobservatory.nasa.gov/Features/OceanCooling/

One of the ways that bias creeps into scientific experiments and measurements is that scientists tend to scrutinize results which conflict with their expectations much more closely than they scrutinize results which support their expectations. It’s not even dishonest, it’s just human nature, but it still distorts the results. It’s why medical studies are blinded, and even double-blinded.

So, errors which bias the results the “wrong way” get found and corrected, but errors which bias the results the “right way” don’t.

Josh Willis is an extreme climate alarmist. So when the Argo floats found ocean cooling instead of warming, he looked hard to find an error. Does anyone think he would have looked so hard if the measurements had initially shown warming, instead of cooling?

Louis Hooffstetter
Reply to  Dave Burton
January 11, 2019 9:33 pm

“If you torture the data long enough, it will confess.”
Ronald Coase, Professor Emeritus of Economics at the University of Chicago Law School & 1991 Nobel Prize winner for Economics.

A C Osborn
Reply to  Dave Burton
January 12, 2019 9:34 am

+1000.

Is there anything that has not been adjusted in the “right direction”?

Menicholas
Reply to  Dave Burton
January 12, 2019 3:19 pm

Thank you very much for this information and the links, Mr. Burton.
Stuff like this is very important to understanding why everything coming from the climate mafia orthodoxy points in one direction.
Simply put, when it does not do so, they simply change it so it does or, failing an ability to change it, they simply erase it, bury it, or otherwise remove it from any study or conversation.
He who controls the data controls the narrative.

Red94ViperRT10
Reply to  Menicholas
January 14, 2019 6:45 am

Such as the satellite measurements of atmospheric CO₂… the warmunists were sure they would see CO₂ plumes over all the major population centers, because of the evil fossil fuel combustion going on. When they didn’t, and in fact found major plumes over the Amazon basin and other large aggregations of plant life (areas rather sparse in fossil fuel combustion, I might add), the website that was supposed to produce continuously updated graphics showing the results from the satellite just disappeared without any announcement. I think this came up a few weeks ago and someone knew of how to access the raw data, but the graphical interface no longer exists? Can someone refresh my memory on that? Or did it just stop updating?

M__ S__
January 11, 2019 7:08 pm

Don’t you know you’re just supposed to believe these days. If it sounds technical and a lot of decimal places are used then it MUST be right.

Dennis Sandberg
January 11, 2019 7:11 pm

In summary, we’ll have a 5 C ocean temperature increase about the same time we have a 0.1 pH decrease (never).

Bart Tali
January 11, 2019 7:12 pm

The most interesting thing I learned from this is that it takes 2600 zettajoules to raise the top 2000m of the ocean by 1°C. Aren’t there two t’s in zettajoule, by the way?

In any case, if you look at the rise of 200 ZJ from the year 2000 approximately to 2018, that translates to about 0.08°C.

So am I supposed to become concerned because the ocean warmed by 0.08°C in roughly 20 years? It doesn’t sound all that frightening.

Hugs
Reply to  Willis Eschenbach
January 12, 2019 4:34 am

Well, that's the way you know it's merchant-of-alarm marketing shit from parties like Global Warming Art.

It really existed and produced talking points, or kilo-talking-points, for political Wikipedia authors.

The site claimed to have been founded by Robert A. Rohde.

Menicholas
Reply to  Hugs
January 12, 2019 10:35 pm

Great info.
Interesting that the info must be accessed via the wayback machine.
One might think that some people deemed it necessary, at some point, to cover their tracks.

ScottR
January 11, 2019 7:13 pm

Willis, off topic, but seeing your world map of the Argo float locations reminded me of an idea I had at the time that some of the wreckage from the MH370 jet was first found around Reunion Island in the Indian Ocean. I thought that all you’d need to do is find an Argo float near the wreckage and then see where it was at the time the plane went down. I’m sure some of the searchers would have had the same idea, but then they never found the plane. Is it even possible to extract this information from the Argo records?

Gordon Dressler
Reply to  ScottR
January 11, 2019 8:58 pm

The idea of using an Argo float to backtrack the drift of floating wreckage on the ocean’s surface will not work because almost all Argo floats spend most of their time drifting at a submerged “parking depth” of ~1000 m.

According to http://www.argo.net/ , most Argo floats operate on a typical total cycle duration of ~10 days, with ~ 6 hours spent doing a detailed profile while ascending from 2000 m depth to ocean surface, another 6-12 hours floating on the ocean surface to transmit data via satellite link, another ~6 hours spent submerging to a depth of about 1000 m, and around 9 days spent drifting at ~1000m without any vertical profiling. Note: I could not find the time required to move down to 2000 m from the 1000 m parking depth, but this move by itself may require an additional 4-6 hours.

Ocean currents at 1000 m depth almost certainly differ greatly from sea surface currents.

Menicholas
January 11, 2019 7:14 pm

There are nearly 4000 of them in service now, but that number was much lower in prior years.
The first units went in in 1999, and by the year 2000 there were about a hundred.
Over the next seven years a few hundred were added every year, and the 3000 target level was reached sometime in 2007.
One might also wonder about the testing and calibration phase and how often each unit is recalibrated?
Ever? Are they sure they all go to the proper depth?

Anyway, in 2009 there was a ten-year meeting, and discussion of how to improve the distribution was one topic.
So it must be assumed that the distribution has not been as good as it is now for many of those years.
Here is a site which gives a map and a count, and it is interactive:
http://wo.jcommops.org/cgi-bin/WebObjects/Argo

January 11, 2019 7:25 pm

An animation of the roiling SST corroborates that the claimed accuracy is bogus. http://www.ospo.noaa.gov/Products/ocean/sst/anomaly/anim.html

A. Patterson Moore
January 11, 2019 7:57 pm

Thanks, Willis. Your BS detector is finely tuned. The calculated error bars in Zeke’s graph are a joke. If they really wanted to know, they would run some experiments to find out. Put a couple of hundred floats in one 50,000 square mile grid cell, widely dispersed. Your error bar is the difference in their measurements. Chances are, we are talking error bars in tens of degrees, not tenths of a degree.

Red94ViperRT10
Reply to  A. Patterson Moore
January 14, 2019 6:59 am

That’s not error, that’s standard deviation, σ.

Menicholas
January 11, 2019 8:01 pm

Hmm, if x amount of energy is equal to three one thousandths of a degree averaged out over the whole ocean, it seems to me that one might consider this in many ways to decide if it passes any possible credibility test.
Let's say that instead of the gain in temperature being dispersed evenly throughout the ocean down to 2000 meters, we instead had a situation where the entire ocean stayed exactly the same temp, except that an area of three one-thousandths of the surface warmed by one degree and down to 200 meters.
Same amount of heat, no? (check me on that…it is late and I was up early)
Three one-thousandths of the ocean surface is, if I figured it right and roughly speaking, about a square 1000 km on a side (361,000,000 sq km x 3 / 1000).

This is an area roughly 1.4 times the size of the country of Chile (Chile ≈ 756,000 sq km).
IOW…small.
A tiny slice of the Humboldt Current, which runs from the Southern Ocean up the west side of South America offshore of Chile, and is an upwelling current.
OK?
Consider that…all of the zettajoules they are talking about is like keeping the entire part they are measuring exactly the same temperature, and raising one tiny sliver of the Pacific Ocean by 1 degree! (If I have it figured correctly, proportionally…again, check me on that)

Now consider that the top 2000 meters is a small fraction of the ocean.
How small?
Here is a link to a map of Earth with bathymetric color coding of the ocean. 2000 meters is near the middle of the color scale. There are giant areas three to four times that deep…most of the ocean in fact. And considerable portions over 5 times that deep.
Plus…and this is interesting, huge areas that are nowhere close to 2000 meters!
All of the continental shelves. Considerable areas. Bigger in the aggregate by many times over than 1 million sq km.
The last thought I had…why report a result in joules, for water temp changes over a global ocean?
Well, it sounds like a lot of heat.
But it also seems to me that if they made a scary map, it would not look very scary with the warming spots graduated in thousandths of a degree, no matter what they did.
Here is that world atlas with bathymetry readings:

http://wo.jcommops.org/cgi-bin/WebObjects/Argo

Conclusion: The more you think about this, look at maps, and consider that they are measuring a small fraction of the ocean, which varies tremendously in temp from pole to pole and top to bottom, the weaker this all looks.
Hard to believe, in fact, that the signal is even as big as the noise, and hard to argue, no matter how you slice it, that they are measuring anything real at all rather than just water moving around, which is very non-uniform in temp.
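A quick check of that equivalence, assuming the warmed patch extends over the same 2,000 m depth as the layer being averaged (the ~361 million sq km ocean surface area is the only other input):

```python
import math

ocean_area_km2 = 3.61e8   # total ocean surface area, sq km
fraction       = 0.003    # fraction of the surface warmed by 1 C in the thought experiment

patch_km2  = ocean_area_km2 * fraction
patch_side = math.sqrt(patch_km2)

print(f"Patch area: ~{patch_km2:,.0f} sq km")
print(f"Equivalent square: ~{patch_side:,.0f} km on a side")

# Heat is mass * specific heat * delta-T; with equal depth and properties,
# mass scales with area, so 0.003 C over the whole area equals 1 C over 0.003 of it.
whole_layer = 1.0 * 0.003        # (relative area) * (temperature change)
small_patch = fraction * 1.0
print(f"Heat ratio (should be 1): {small_patch / whole_layer:.2f}")
```

That works out to roughly 1.1 million sq km, a square a little over 1,000 km on a side, carrying the same heat as 0.003 C spread through the entire top-2,000 m layer.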

Red94ViperRT10
Reply to  Menicholas
January 11, 2019 9:02 pm

DuckDuckGo finds the average depth of the world’s oceans at 3,796 m or 3,730 m, depending on whose study you select, so the top 2,000 m should be slightly more than half that.

Menicholas
Reply to  Red94ViperRT10
January 11, 2019 10:06 pm

Seems I attached the wrong link. Here is the ocean bathymetry data.
Large areas of the ocean are very shallow, far under 2000 meters.
And most of the rest is abyssal plain, far deeper than 2000 meters.
Large parts of the Southern Ocean are not covered. The Arctic has no coverage.
Having said that, I see that I may have been somewhat imprecise to say that the top 2000 meters is a small fraction of the total ocean volume.
It is, using the average number you gave without checking it myself, just under half of the total volume.
My mistake. Thank you for pointing that out.
One error I made was neglecting that the map is not an equal area projection.
But I do not think this invalidates my observation that they are claiming an increase in total ocean heat without measuring anything like the entire water column.
Do you agree with my point that one million square km raised by one degree is what we are talking about here? And that this is a tiny slice of the total?
And that, intuitively, this demonstrates that the headline claim by Zeke and whoever compiled the number he quoted is almost surely unjustifiable?
IOW…they have no idea from this data what the total heat content of the ocean is, or how much it may or may not have changed by?
It is not even a rounding error, as near as I can tell.
And that is not even taking into account that the numbers may be massaged or invented, recalling the earlier finding by the same people that the oceans were cooling, not warming, which they then "fixed" so that it agreed with the alarmist claim that sea level rise is due to warming of the ocean as a whole?
http://planetolog.com/maps/map-world/big/bathymetric-world-map.jpg

Menicholas
Reply to  Menicholas
January 11, 2019 10:50 pm

Darn, did it again.
Those 200s should each be 2000 up top.

Red94ViperRT10
Reply to  Menicholas
January 12, 2019 9:22 am

Yes, I like your depiction, you could hold the entire ocean constant and warm just a small piece of it by 1ºC and get the same change in OHC.

Red94ViperRT10
Reply to  Menicholas
January 12, 2019 7:33 pm

BTW, given that large parts of the ocean are less than 2,000 meters deep, does this mean the Argo buoys, if located in one of those areas, would submerge until they hit bottom and then begin their ascent while recording temperatures? If that is the case, then it reduces the percentage of the ocean the buoys completely leave out.

Menicholas
Reply to  Red94ViperRT10
January 12, 2019 10:47 pm

I do not know the answer to that question, but over the past 24 hours of reading various sources and previous articles I have learned that the current practice of descending to 2000 meters is a recent change, made sometime in the last five years. Since the 3,000-unit coverage goal was reached in 2007, the distribution and sampling protocol have been changed several times.
So, whatever is now the case in terms of the volume of water being, theoretically at least, sampled by the probes, was not true in the past. It used to be that far less of the total ocean volume was sampled.
Which makes their time series and the claimed uncertainty bar in past years, even more dubious.
Which in turn makes any claims of unprecedented ocean heat content even more meaningless.
And for consistency, let's just keep in mind that whatever they are saying their data indicates was not what they were finding several years back, before they completely altered it. Altered data from people who change any result they do not like is not data.
There is little reason to suppose it correlates in any fashion to objective reality.

Joel O'Bryan
January 11, 2019 8:03 pm

” The 95% confidence interval for the results varied from 005°C to 0.1°C.”

A missing decimal in 0.05?

Joel O'Bryan
January 11, 2019 8:17 pm

“In closing, please don’t rag on Zeke about this. He’s one of the good guys,…”

When he is spewing BS climate science nonsense, when he should know better, he SHOULD be “ragged on.”

“We can now say with confidence…” – Zeke

I wouldn't buy a used car from that guy, much less let him inform government policy that wants to restrict my liberties and make energy unaffordable in the name of faux alarmism, just so he can keep getting DOE intramural grants and promotions in a laboratory chock-full of rent-seekers.

Mike Borgelt
Reply to  Willis Eschenbach
January 12, 2019 2:00 am

Yeah, these folks are either clueless and incompetent or just plain crooks.
Sorry, but I've had it with this mendacity and/or stupidity. You are giving these people FAR too much credence. Ridicule is our most powerful weapon. See Alinsky's Rules.

Phil
January 11, 2019 8:19 pm

Temperature is an intensive variable. The average of two intensive data points is not straightforward. What is the average of a container A at a temperature of 80°C and a container B at a temperature of 20°C? The formula (t(A) + t(B)) / 2 is not necessarily applicable to finding the average of two intensive data points. The average temperature of the containers (if they were poured into another container C that is a perfect insulator and allowed to come to equilibrium) cannot be calculated without knowing the sizes of both containers. If container A is a pasta pot and container B is a thimble, the average temperature of the two containers would be very close to 80°C and NOT 50°C.
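A minimal sketch of that point, with made-up container sizes; the specific heat cancels when it is the same on both sides:

# Equilibrium ("average") temperature of two mixed water masses depends on the
# masses involved, not just on the two temperatures being averaged.
def mixed_temperature(mass_a_kg, temp_a_c, mass_b_kg, temp_b_c):
    return (mass_a_kg * temp_a_c + mass_b_kg * temp_b_c) / (mass_a_kg + mass_b_kg)

# Pasta pot (roughly 6 kg of water at 80 C) poured together with a thimble (about 1 g at 20 C):
print(mixed_temperature(6.0, 80.0, 0.001, 20.0))   # ~79.99 C, nowhere near 50 C
# Only equal masses recover the naive (t(A) + t(B)) / 2:
print(mixed_temperature(1.0, 80.0, 1.0, 20.0))     # 50.0 C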

It is well known that there are warm and cold currents in the oceans. In order to obtain an estimate of the average temperature of the oceans, the temperatures of the warm and cold currents would need to be measured, along with the amount of water in each current, and then that would have to be combined with a measurement of the temperature of the non-current parts of the ocean and a measurement of the amount of water in each non-current part (a tropical sector would be warmer than one at high latitudes). There isn't enough data to even begin to estimate the heat content of the ocean, except by ignoring the existence of ocean currents, and that is just for starters.

Mr. Hausfather's assertions of uncertainty completely ignore the fact that temperature is an intensive variable and that each buoy is not measuring the state of the oceans as a whole. Since each buoy is not measuring the state of the oceans as a whole, there is no basis to claim a statistical miracle. There is no way to tell whether a particular buoy reading was taken within an ocean current, where the temperature may be several whole degrees C different from that in an area of the ocean a short distance away that is not part of a current. That alone represents a systematic uncertainty that makes all of his calculations meaningless.

Phil
Reply to  Phil
January 11, 2019 8:42 pm

I got carried away by Willis’ conversion to temperature. The conversion to joules of the Argo data is an attempt to avoid the criticism of averaging intensive variables. However, the conversion to joules masks the uncertainty in the conversion as an estimate has to be made of the volume of ocean water represented by each Argo data point. The uncertainty of the estimate of the volume of ocean water represented by each temperature data point is not disclosed and is effectively assumed to be zero. Therefore the gist of my comment I think still applies. You have to watch the little red ball very carefully when this shell game is being played. It is very easy to get distracted.
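For what it is worth, here is a rough sketch of how that hidden volume term would propagate if it were carried through. The percentages are purely hypothetical and the error sources are assumed independent:

import math

# Heat content is estimated as Q = rho * V * c * dT, so for independent errors the
# relative uncertainty in Q combines the relative uncertainties of V and dT in quadrature.
def relative_uncertainty_q(rel_err_volume, rel_err_temp):
    return math.sqrt(rel_err_volume**2 + rel_err_temp**2)

# Hypothetical: 5% uncertainty in the volume each float "represents", 10% in its temperature anomaly.
print(relative_uncertainty_q(0.05, 0.10))   # ~0.11, i.e. about 11% on the joules
# Treating the represented volume as exactly known hides its contribution entirely:
print(relative_uncertainty_q(0.00, 0.10))   # 0.10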

John Shotsky
January 11, 2019 8:23 pm

Measurements are never accurate. They are ALWAYS estimates. On a digital thermometer that shows 70.1, 70.2 and 70.4, you cannot average to a temperature of 70.23. Why? Because you can NEVER average measurements to a precision better than the lowest resolution of any measurement in the series. Why? Because you don't know whether the thermometers were about to click up to 70.2, 70.3 and/or 70.3 – or vice versa. You don't KNOW the threshold inside the thermometer. That (new) average would result in 70.26. But wait – the temperature did not change. All that changed is our guess at what each thermometer was about to click to… all three 'could' click one way or the other without the temperature actually changing.

And if one thermometer read in whole degrees and the rest in tenths, the average would have to be in whole degrees – because you don't know if that one is about to click up or down a full tick. So ITS resolution defines the resolution for the series.

There is a huge difference between measurements (estimates) and counts. Counts might be the number of people in a stadium. Each count is exact, so the average can be to 6 decimals – not a problem, because you can average counts, but not measurements. As an instrument designer for many years, this was drilled into me. My BS detector goes off any time ANYONE talks about measurements with results in sub-tenths of a degree. It is meaningless unless both the resolution of the measuring device AND its error band are known.
To refresh, look up significant digits in measurement. The last digit is always an estimate, and you cannot average estimates.
So, ALL of the above and original charting is meaningless. Follow the math laws to ascertain valid results.
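To put a number on the resolution point, here is a neutral little sketch in Python. The half-interval rounding convention is an assumption; real displays may truncate instead:

# A displayed reading of 70.1 on a 0.1-degree instrument only tells you the true
# value lies somewhere in a bin around it (taken here as +/- half the resolution).
RESOLUTION = 0.1

def reading_interval(reading, resolution=RESOLUTION):
    return (reading - resolution / 2, reading + resolution / 2)

readings = [70.1, 70.2, 70.4]
low = sum(reading_interval(r)[0] for r in readings) / len(readings)
high = sum(reading_interval(r)[1] for r in readings) / len(readings)
print(f"nominal average: {sum(readings) / len(readings):.4f}")
print(f"range consistent with the displays: {low:.4f} to {high:.4f}")
# -> 70.2333 nominal, but the true average could be anywhere from 70.1833 to 70.2833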

Rob_Dawg
January 11, 2019 8:28 pm

Willis mentions:
> Looks pretty dense-packed in this graphic, doesn’t it? Maybe not a couple dozen thermometers per swimming pool, but dense

This is a great idea. All the big schools have swimming pools. A diving pool would be even better. Instrument the heck out of one for temperature over spring break. I'd be willing to bet there are instantaneous variations well outside any instrument error. I'd also bet that no matter how good the instruments, many of them will have drifted a bit by the time the kids get back from Daytona Beach.

Phodges
January 11, 2019 8:47 pm

Does the increase in atmospheric CO2 produce enough additional W/m² to heat that volume of water?

Joel O'Bryan
January 11, 2019 8:53 pm

I will say this about why I think the climatists use OHC in zettajoules rather than the recorded temps.
The reason is that one (the latter) is a whole bunch of disparate methods and instrumental records, while the OHC is a combined estimate of all those temps. Thus, converting it all to OHC, they think they can get away with stringing together different records into one data set. Sort of Mike's Nature trick for ocean temp records.

Björn
January 11, 2019 9:12 pm

Willis, I think you made a small error when calculating the coverage of the Argo floats. You state there is one float per 133,000 square km. But 3,835 x 133,000 is a bit north of 510 million km². That number is the total surface area of the earth. The oceans cover approximately 70.8% of the earth's surface, which equals 362 million square km, give or take. That gives me 362,000,000 / 3,835 ≈ 94,394 km² as the base area of the box covered, so the base side length is ca. 307 km instead of the 365 you get. I do not this affects your argument much, and do not know if it has any bearing on your calculations, but just in case it does, correct and update if needed.
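A quick sketch reproducing that coverage arithmetic; the Earth surface area and ocean fraction are the usual round numbers, not taken from the post:

import math

EARTH_SURFACE_KM2 = 510.1e6   # total surface area of the Earth
OCEAN_FRACTION = 0.708        # fraction covered by ocean
N_FLOATS = 3835

ocean_area = EARTH_SURFACE_KM2 * OCEAN_FRACTION   # ~361 million km^2
area_per_float = ocean_area / N_FLOATS            # ~94,000 km^2 per float
side = math.sqrt(area_per_float)                  # ~307 km box side

print(f"ocean area: {ocean_area / 1e6:.0f} million km^2")
print(f"area per float: {area_per_float:,.0f} km^2")
print(f"equivalent square: {side:.0f} km on a side")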

geoff@large
Reply to  Willis Eschenbach
January 13, 2019 10:42 pm

Hi Willis,

Great work. Small mistake. “…in fact, that’s only one Argo float for every 93,500 square km (36,000 square miles) of ocean. That’s a box that’s 300 km (190 miles) on a side and two km (1.2 miles) deep … containing one thermometer”.

Just for the record since your comment will be quoted – I used the global ocean volume from NOAA (https://web.archive.org/web/20150311032757/http://ngdc.noaa.gov/mgg/global/etopo1_ocean_volumes.html) of 1.335 billion km3 (of course not squared). If I divide by the number of Argo bathymeters, that gives 348,110 km3 for each Argo (gross). Dividing that by the average depth of 3700 meters and multiplying by the measured depth of 2000 meters I get 188,167 km3. Dividing by 2 km gives a box on the sides of 306 km each side, each box 94,083 km3. Practically the same as your calculation, only with your typo of squared corrected to cubed. I'm sure you've done it correctly before and were just too busy pointing out the ridiculousness of what is being proposed to have been accurately measured. And just to use your favorite comparison, that's 19.12 Lake Michigans (the lake being 4920 km3).

Since the true average depth of the ocean is not known (last I checked we have mapped about 10% of the ocean floor) this figure could be off by a few percent either way, but your point remains the same.

geoff@large
Reply to  geoff@large
January 13, 2019 11:21 pm

Oops, I did it myself. Each box being measured by one Argo float is 188,167 km3, with the box being 306km by 306km (93,636 km2) times 2km deep. So each Argo float is supposed to measure the equivalent of 38.24 Lake Michigans.

geoff@large
Reply to  geoff@large
January 14, 2019 12:22 am

So for the record:

Using the global ocean volume from NOAA (https://web.archive.org/web/20150311032757/http://ngdc.noaa.gov/mgg/global/etopo1_ocean_volumes.html), which is 1.335 billion km3 (of course not squared). Dividing by the number of Argo bathymeters, that gives 348,110 km3 for each Argo (gross). Dividing that by the average depth of 3700 meters and multiplying by the measured depth of 2000 meters I get 188,167 km3. Dividing by 2 km gives a box 306 km on each side (so 94,083 km2) for the same total of 188,167 km3.

For comparison, that’s 38.24 Lake Michigans (the lake being 4920 km3).
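For anyone who wants to rerun those per-float numbers, here is a minimal sketch; every input is taken from the comments above rather than independently verified:

OCEAN_VOLUME_KM3 = 1.335e9    # NOAA ETOPO1 global ocean volume
N_FLOATS = 3835
AVG_DEPTH_KM = 3.7            # average ocean depth used above
PROFILE_DEPTH_KM = 2.0        # depth actually profiled by a float
LAKE_MICHIGAN_KM3 = 4920.0

gross_per_float = OCEAN_VOLUME_KM3 / N_FLOATS                            # ~348,110 km^3
sampled_per_float = gross_per_float * (PROFILE_DEPTH_KM / AVG_DEPTH_KM)  # ~188,000 km^3
base_area = sampled_per_float / PROFILE_DEPTH_KM                         # ~94,000 km^2
side = base_area ** 0.5                                                  # ~307 km

print(f"sampled volume per float: {sampled_per_float:,.0f} km^3")
print(f"base of the box: {side:.0f} km x {side:.0f} km")
print(f"about {sampled_per_float / LAKE_MICHIGAN_KM3:.1f} Lake Michigans per float")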

Björn
Reply to  Björn
January 11, 2019 9:30 pm

argh, meant to write "I do not think this affects…" instead of the actual "I do not this affects…" at one place in the comment above; did not review before posting.

Clyde Spencer
January 11, 2019 9:12 pm

Willis,
You said, “And that means if we want to get one more decimal in our error, we have to have a hundred times the number of data points.” I think that this is a best case scenario. It has long been the practice of land surveyors to take multiple readings of an angle turned to improve the precision. However, the circumstances are the same operator using the same instrument on a fixed value. The assumption is that the only variance is random and normally distributed.

However, when measuring a variable, such as temperatures, one is dealing with both the range in temperatures and the random error related to the instrumentation. The range in temperature is probably at least one or two orders of magnitude larger than the random error of measurement.

I would ask why it is common in most (if not all) sciences other than climatology and oceanography to state uncertainty as +/- 2 standard deviations rather than as one standard error of the mean.
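To illustrate the distinction on synthetic data (the temperature range and instrument noise below are invented for the example):

import random
import statistics

random.seed(1)
# A wide natural spread of "ocean" temperatures, each measured with a tiny instrument error.
true_temps = [random.uniform(2.0, 28.0) for _ in range(3835)]
measured = [t + random.gauss(0.0, 0.002) for t in true_temps]

sd = statistics.stdev(measured)           # spread of the individual values
sem = sd / len(measured) ** 0.5           # standard error of the mean
print(f"standard deviation of the sample: {sd:.3f} C")
print(f"standard error of the mean:       {sem:.4f} C")
# The SEM looks tiny even though individual values differ by tens of degrees;
# whether it is the right uncertainty to quote depends entirely on the sampling assumptions.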

Red94ViperRT10
January 11, 2019 9:13 pm

Willis,

"Now, there are 33,713 1°x1° gridcells with ocean data. … And there are 3,825 Argo floats. On average some 5% of them are in a common gridcell. So the Argo floats are sampling on the order of ten percent of the gridcells…"

(At this point I must resist the urge to say, "It's worse than we thought!" But it is.) You left out the part where the buoys take measurements at different depths. So the cells to be measured (I haven't seen the data, so how are the depths divided? …how many depths are there?) each have multiple depths, let's call them cubes (though I'm pretty sure they're not), times those 33,713 gridcells, times 12 months. And we only have 3,825 floats to measure all of that. I'd do the arithmetic myself but I don't have the number of depths; I'm thinking it comes out to way less than 10% coverage, maybe even <1%. When doing field sampling of equipment I was told to observe at least 10% of the total population. We haven't got there yet, so I don't think we can conclude anything meaningful from the data points we have.
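For reference, the gridcell arithmetic quoted from Willis works out roughly like this; the depth and month dimensions raised above are deliberately left out of the sketch:

N_GRIDCELLS = 33713      # 1x1 degree ocean gridcells with data (from the post)
N_FLOATS = 3825          # Argo floats, as quoted
SHARED_FRACTION = 0.05   # floats sitting in a gridcell that already has one, as quoted

occupied_cells = N_FLOATS * (1 - SHARED_FRACTION)
print(f"gridcells with at least one float: about {occupied_cells:,.0f}")
print(f"coverage: about {occupied_cells / N_GRIDCELLS:.1%} of the ocean gridcells")
# -> roughly 10-11%, i.e. about 90% of the 1x1 degree ocean cells unsampled in a given month.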

angech
January 11, 2019 9:22 pm

“In closing, please don’t rag on Zeke about this. He’s one of the good guys,”
Like Rosenstein I guess.
He is unfailingly polite, unlike me.
He produces the “data”.
But he pushes the global warming agenda severely.
He often pops up on a triumvirate with Mosher and Nick when an inconvenient truth emerges to their belief system.
He is honest in the data he states, but as with USHCN in the past the real messages are hidden in what he does not say.
He has clearly stated in the past that the warming record automatically updates and devalues past historical data. He has said that nearly half of the USHCN stations no longer presented data as of 4 years ago. No one listens.
As stated the actual purported change in temperature of the upper 2000 m is extremely small. He knows that no one will care about a change of 0.03C plus the large error bars but still goes all out to push it as hard as possible.
As a Princess Bride comment he will appreciate: shame on you, Zeke.

angech
January 11, 2019 9:25 pm

“90% of the 1°x1° ocean gridcells are not sampled. Just sayin …”
Data is data, we can only use what we have.
The problems with Argo data are shifting positions of the buoys, breakdown of the buoys, unreliability of the thermometers used.
Just have to take this into account with the error range.

January 11, 2019 9:47 pm

One of our Australian politicians had a saying as to how he worked politics: "Whatever it takes."

The Warmers Lobby obviously think the same.

To hell with data and accuracy, it's the final result that counts. And what is that? Certainly it has nothing to do with saving the Planet.

It's just a grab for power. True, both the Russians and now the Chinese tried Communism, but they both failed. We will learn from their errors and end up with a perfect worldwide system.

MJE

SMS
January 11, 2019 9:59 pm

Why not determine the increase in ocean heat content using tide gauge results? Expansion of the oceans due to warming and cyclic melting of Greenland are the two components that provide most of the answer.

Looking back on tide gauge results suggests that the ocean warming implied by Zeke is normal and has existed since the last ice age ended, and most probably cycled through the other ice ages as well.

Zeke is just cherry picking.

GregK
Reply to  SMS
January 12, 2019 4:05 am

Note also that the very deep ocean is cooling
https://www.sciencedaily.com/releases/2019/01/190104121426.htm
[I think that this was discussed recently]

And a thought….would not an increase in sea ice lead to a rise in sea level?
Maybe not a large increase but the density of sea ice is less than the water it floats in and the ice must displace a volume equal to its weight.

Hugs
Reply to  GregK
January 12, 2019 5:01 am

Some deep ocean is cooling. On the whole we really don't know. But remember attribution rule one: cooling is always meaningless and natural, whereas warming is alarming and man-made. Even El Niños.

MeanOnSunday
January 11, 2019 10:24 pm

Your simulation seems very generous, as you are taking the monthly averages as fixed, known quantities. But they themselves are only estimates, with errors arising from variation over time during the month, positional variation within the grid square, etc. If you look at the data where you have multiple floats in the same grid square in the same month, you could get a crude approximation of this variability within a grid-month. Then in your Monte Carlo process you would generate your fixed average plus a random component with the appropriate variance.
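A rough sketch of that suggested tweak, with placeholder numbers for the cell means and the within-cell spread:

import random
import statistics

random.seed(0)
N_CELLS = 1000                 # hypothetical number of sampled gridcells
WITHIN_CELL_SD = 0.3           # assumed within-gridcell, within-month spread, deg C
true_means = [random.uniform(2.0, 28.0) for _ in range(N_CELLS)]

def one_draw():
    # One Monte Carlo realisation: each cell's "observed" value is its monthly mean plus noise.
    return sum(m + random.gauss(0.0, WITHIN_CELL_SD) for m in true_means) / N_CELLS

draws = [one_draw() for _ in range(500)]
print(f"extra spread on the global mean from within-cell variability: {statistics.stdev(draws):.4f} C")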

The idea of any kind of accuracy or precision before the last 15 years just seems ridiculous. Making adjustments to 1/100th of a degree for how much the water temperature changed while a guy dragged up water in his canvas bucket and sat for a minute waiting for a mercury thermometer to stabilize? When those measurements were taken, the magnitude of the potential errors was well understood. Now we have so-called scientists who perform statistical analysis as if the data had no sampling error, and as if they can find conversion factors between different measurement techniques with incredible precision.

J Mac
January 11, 2019 10:38 pm

Willis,
Minor tweak: “As the number of observations goes up, the error bar decreases by one divided (by?) the square root of the number of observations. “

Excellent re-analysis of Hausfather’s flawed assertions!

Stephen Rasey
January 11, 2019 10:48 pm

Willis, you mentioned about 2 years ago that Zeke was a "good smart guy", yet he still hadn't answered your question about the sawtooth record where the scalpel removes the recalibration information.

Willis Eschenbach January 30, 2017 at 7:58 pm
Sadly, Stephen, that question still isn’t answered. I saw Zeke Hausfather at the recent AGU meeting and he said they were looking at the issue … however, given that that has been the answer since June 2014, I have to confess that I figured his statement would sell at a significant discount from full retail price …
It’s too bad, because both Zeke and Mosher are good smart guys … does make a man wonder.
w.

Has this “good guy” ever answered you on this fundamental problem at the core of BEST analysis?

See this June 12, 2017 at 11:03 am comment for a traceback of the discussion.

Prjindigo
January 11, 2019 10:51 pm

You need to increase that “100 times as many data points” to “100 plus the inverse of the error margin times as many data points” because you must overcome the error to begin with.

Prjindigo
January 11, 2019 10:53 pm

oops… +”times the inverse of the error of individual units allowed over lifetime” as well. Sorry.

James McCown
January 11, 2019 11:36 pm

Besides the fact that the 350 zettajoule increase corresponds to less than one tenth of one degree Celsius, there is an additional question:

Have Trenberth and Hausfather come up with a satisfactory explanation of how that heat got sucked into the oceans in the first place?

Michael Hammer
January 11, 2019 11:37 pm

When it comes to measuring temperature I can speak from some experience. I was very heavily involved with development of a scientific instrument product, part of which required precise measurement of temperature inside a 10mm * 10mm cuvette. We of course used the best thermal sensor around – a high precision platinum resistance thermometer. Now remember we needed to GUARANTEE our measurement accuracy. So what did we achieve? +- 0.1C. For the next product upgrade I came up with some novel circuitry which was able to reduce that to +- 0.01C for the sensor – not necessarily for the contents of a cuvette. For example, whether the contents of the cuvette are stirred or not makes far more than 0.01C difference – indeed more than 0.1C difference. That circuit concept was significant enough to warrant a patent application. This is over a volume of 1 cubic centimeter in a cuvette that is reasonably well insulated from the environment! Trying to get 0.01C in something the size of a swimming pool let alone an ocean is pretty much a joke!

steven mosher
Reply to  Willis Eschenbach
January 12, 2019 5:32 am

psst.
it's not a measurement accuracy.

you might be fooling yourself

Menicholas
Reply to  Willis Eschenbach
January 12, 2019 10:56 pm

Scientific haiku!
I think we have a new nickname for S.M.

Curious from Cleathropes
January 12, 2019 12:17 am

Hi Willis,

Interesting article. Clearly the oceans are heating which is consistent with the expected warming from increased levels of CO2. I wonder how easy / difficult it would be to translate the observed level from this data to surface temperature warming? Specifically to what this implies with respect to the expected warming from a doubling of CO2.

Happy New Year and all the best!

Geoff Sherrington
January 12, 2019 12:26 am

Willis,
Australia has a government body named the National Measurement Institute. (Years ago, a neighbor was a scientist there, useful to talk with between his work to restore his 1932 Bentley). In the coming week I shall attempt to obtain information from them concerning the accuracy of measurement of water temperature under their controlled conditions.
Technology has advanced since I owned a chemistry laboratory, but I do recall problems in measuring water temperatures to +/- 0.5 deg C. Apart from the electronics, the instrumentation and the special rooms with controlled atmospheres, there is the problem of making a measurement in water representative of the whole volume of water. Mixing is a remedy, but mixing creates heat. That is but one complication.

Readers here would help the effort by contacting their own country reps.
BTW, our NMI is running a course next month.
https://shop.measurement.gov.au/collections/physical-metrology-training/products/introduction-to-estimating-measurement-uncertainty?variant=4291206184991
Geoff.

cerescokid
Reply to  Geoff Sherrington
January 12, 2019 4:15 am

Geoff

As a result of a recent post on historical OHC, I dug up a report on the HMS Challenger 1873-76 Expedition that I found fascinating. In particular, their efforts to determine temperatures in the abyss. That section starts on page 84. I was surprised at their ingenuity in trying to accurately measure those temperatures. They even have a section identifying the weaknesses in the measuring devices.

I thought you might enjoy reading it and mention it on your visit with NMI.

http://19thcenturyscience.org/HMSC/HMSC-Reports/1885-Narrative/htm/doc.html

Roy W. Spencer
Reply to  Geoff Sherrington
January 12, 2019 4:27 am

As I mention below, the Argo floats use platinum resistance thermometers, which depending on design and application have an absolute accuracy of 0.001 to 0.01 deg. C, with high stability. The problem is making a measurement representative of the ocean water around the float.

Clyde Spencer
Reply to  Roy W. Spencer
January 12, 2019 3:27 pm

Roy
Did you mean “accuracy,” “precision,” or both?

Steven Mosher
January 12, 2019 1:59 am

wow.

Now I understand the ad homs against Mann or me.

But Zeke?

I had hoped folks would lay off the personal attacks, as Tim Ball suggested.

cerescokid
Reply to  Steven Mosher
January 12, 2019 3:56 am

Steve

Forget the personal stuff. Who has the better case, Willis or Zeke?

Alan Millar
Reply to  Steven Mosher
January 12, 2019 10:09 am

Good of you to point it out, Steve, but it is a very small subset of the commenters on here.

In the interest of fairness, perhaps you would like to go on the 'alarmist' sites and tell us what proportion of dog's abuse there is of people like Willis, Spencer, and Curry when their names crop up.

Playing the well-known game of "is it higher or lower", perhaps the most obvious answer ever, I would say.

nobodysknowledge
January 12, 2019 3:42 am

The 8 ZJ increase in ocean heat was the same as in 2014, when it was asserted that energy equal to 4 Hiroshima atomic bombs per second was going into the oceans. Now there is a rebirth of the atomic bomb analogy: "They say the oceans are absorbing around 90 percent of the excess energy caused by greenhouse gas emissions and by their estimates, since 1871, it's added up to about "436 x 10^21 Joules." Since most of us don't speak math, The Guardian decided the "1.5 atomic bomb explosions per-second" would be a suitable analogy to digest after crunching the numbers." https://www.theinertia.com/environment/oxford-research-oceans-warm-at-the-equivalent-of-1-5-atomic-bomb-blasts-per-second/
I think the trends are much more interesting than which year wins the heat contest.
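For what it's worth, the quoted analogy is at least arithmetically consistent under the usual 15-kiloton assumption for a Hiroshima-sized bomb; a back-of-envelope sketch:

SECONDS_PER_YEAR = 3.156e7
YEARS = 2018 - 1871                   # the period quoted in the article
TOTAL_JOULES = 436e21                 # 436 x 10^21 J, as quoted
HIROSHIMA_JOULES = 15_000 * 4.184e9   # 15 kt of TNT at 4.184 GJ per tonne (assumed yield)

watts = TOTAL_JOULES / (YEARS * SECONDS_PER_YEAR)
bombs_per_second = watts / HIROSHIMA_JOULES
print(f"average heating rate: {watts:.2e} W")
print(f"Hiroshima-sized bombs per second: {bombs_per_second:.2f}")   # ~1.5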

Roy W. Spencer
January 12, 2019 4:20 am

The Argo floats use platinum resistance thermometers (PRTs), which depending on application and design are accurate to 0.001 to 0.01 deg. C, and very stable. Between 2005 and 2017, the 0-2000m global average temperature has risen about 0.04 deg. C, and I doubt the error in this due to Argo float accuracy is as large as 0.01 deg. C. As Willis points out, spatial sampling is probably the largest source of error. These are point sources of temperature measurement, as opposed to the satellite microwave measurements (also calibrated with on-board PRTs), which are volumetric, with each footprint on the Earth representing about 1,000 cubic km of air rather than a tiny point. I think the main uncertainty, as Willis alludes to, is back before the Argo float era. I really don't believe we know to a few hundredths of a degree the average temperature of the upper half of the ocean in 1990, 1980, 1970, or any other year before the Argo floats.

Hugs
Reply to  Roy W. Spencer
January 12, 2019 5:06 am

Oh, but we can model them and then use circular reasoning to claim the unprecedented.

I'm sorry, I just wish this were a joke. But it is clear there is no accurate knowledge of old temps; it is all a modelled guess that works backwards from the 'basic physics'.

Roy W. Spencer
Reply to  Hugs
January 12, 2019 5:54 am

Yup.

Stephen Rasey
Reply to  Roy W. Spencer
January 12, 2019 10:37 am

What is the record of “adjustment” and “removal” of Argo sensor data? Have Argo floats been removed from service because they are “reading too cold”?

geoff@large
Reply to  Stephen Rasey
January 13, 2019 10:57 pm

Hi Stephen, There are loads of Argo adjustments. Just to give one example see https://www.researchgate.net/profile/Taiyo_Kobayashi/publication/228863360_Argo_float_pressure_offset_adjustment_recommendations/links/0c960515944df8b5ba000000.pdf (pdf) where they say “We strongly suggest the adjustment of all known pressure drifts in Argo data. These adjustments will improve the consistency and accuracy of the hydrographic dataset obtained by Argo Program. Even small errors should be corrected because the potential impact of a small bias in Argo data pressures from uncorrected pressure sensor drifts could be quite significant for global ocean heat content anomaly estimates. In the worst case, a bias could introduce an artifact comparable in magnitude to ocean heat content changes estimated for the later half of the 20th century”.

Yep, the errors could be as large as all the ocean heat estimated for the 2nd half of the 20th century. And that's only one adjustment.

geoff@large
Reply to  geoff@large
January 15, 2019 12:42 am

Here’s a recent overview of the biases and corrections https://journals.ametsoc.org/doi/10.1175/JTECH-D-17-0122.1

Andrew West
January 12, 2019 4:53 am

Great post, even I can understand it (and my mathematical abilities are highly atrophied). I completely agree about not ragging on Zeke. Maybe he and the other authors will challenge your view, but assuming this largely survives, it's another in a growing list of symptoms regarding the enterprise of science generally (and some areas in particular, of which climate science is one). No system can be perfect, but this seems to me a very basic issue which has gotten through (the system should not just be one man's science, and so subject, as we all are, to one man's mistakes). Unfortunately, there has been a lot of press off the back of this already.

January 12, 2019 5:49 am

Hate to disappoint. There is no man made global warming. I could not find it.
If the globe is heating [at some places] it is due to natural reasons.
http://breadonthewater.co.za/2019/01/06/does-man-made-climate-change-exist/

January 12, 2019 6:12 am

There are some other aspects to this false alarm. In particular, Cheng et al. relies partly on Resplandy et al. which Nic Lewis critiqued at Climate Etc. My post is

https://rclutz.wordpress.com/2019/01/11/scare-of-the-day-ocean-heat-content/

January 12, 2019 6:34 am

Anyway, if man made global warming really existed,
e.g. due to more GHG being released into the atmosphere,
would you not say that the increase of observed warming of each place on earth must be the same?
[all CO2 and other GHG’s are mixed and spread 100% into the atmosphere?]

I could not find any warming here where I live. In fact, minimum T seems to be dropping…

https://www.dropbox.com/s/tps2cd4kuds8o6g/SUBMISSION%20by%20Henry%20Pool.docx?dl=0

John Shotsky
Reply to  henryp
January 12, 2019 7:06 am

I often see the claim that CO2 is ‘well mixed’ into the atmosphere, but I don’t think so. Why?
Because ALL of the CO2 is emitted from the surface.
And ALL of the CO2 is absorbed at the surface.
And the atmosphere is densest at the surface.
Thus, CO2 MUST be highest near the surface.
Granted, it will mix through convection, wind, etc, but it cannot be true that the percentage of CO2 is the same everywhere in the atmosphere because CO2 is ‘well mixed’. CO2 comes out of tailpipes – isn’t it higher near that tailpipe than at 1 meter from the tailpipe? 2 meters? 1 km?

Reply to  John Shotsky
January 12, 2019 7:50 am

John

They measure the CO2 at several places around the earth and although they [the stations] differ a tiny bit from one another, it seems the rate of increase is undeniably the same,
in the case of CO2 due to both human emissions and the warming of the oceans, i.e.
HCO3- (there are gigatons of bicarbonates in the oceans) + heat => CO2 + OH-

Obviously, in any vessel {earth?} , all CO2 would eventually diffuse and mix to the same concentration
as per the relevant gas law.

Density may make a difference from top to bottom but the ratio of all gases in the atmosphere stays the same, no matter what altitude.

Prove me wrong?

Macha
Reply to  John Shotsky
January 12, 2019 3:18 pm

True. CO2 is also denser than air. Some studies have shown much higher concentrations and daily changes above crop acreages. Can't find the link, but I recall it may have been at the jonova.com.au website.

John Shotsky
Reply to  henryp
January 12, 2019 8:58 am

There is a constant exchange of CO2 at the interface of the entire earth at all times. When CO2 is released by the oceans, it is obviously more dense at that interface than at higher elevations, where it is more dispersed. Why do you suppose they measure CO2 at the tops of mountains? Because they want to measure the more well mixed atmosphere. There are CO2 domes over cities, that is well known. And, CO2 is heavier than ‘air’ regardless of its ability to ‘disperse’. It can suffocate you in higher concentrations, which would not be possible if it were ‘well mixed’ at all times.
It’s too bad they don’t include CO2 % on weather balloons…properly calibrated, it would be at a higher percentage where it is emitted than it would be ‘at altitude’, where it is naturally dispersed.