The Weather and Climate of World War II

By Andy May

For those who prefer it, this post has been translated into German here.

In my last post I discussed the evolution from the raw ICOADS sea surface temperature (SST) data to the final ERSST and HadSST SST anomalies. These anomalies are compared in figure 1. The ICOADS anomalies are generated by subtracting the 1961-1990 mean from the final raw simple yearly means. This subtraction is done last, after the simple global mean SST has been computed for every year.

The ERSST and HadSST values start out as anomalies. That is, they are created grid cell by grid cell before processing starts. Obviously, the measurements in each cell are normally from different vessels in different years or months, but the measurements are turned into anomalies by subtracting the 1961-1990 mean for each cell from each measurement captured in the same cell for any specific month. This happens before any processing or corrections are performed. Since the cells may be as large as 12,300 sq. km, this is of dubious value, but that is how it is done.
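The per-cell anomaly construction described above can be sketched with synthetic numbers (a hypothetical illustration; the array shapes and values are invented, not taken from ERSST or HadSST):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic absolute SSTs on a grid: (years, months, cells).
years = np.arange(1931, 1991)
sst = 15 + 5 * rng.standard_normal((years.size, 12, 4))

# Per-cell, per-month 1961-1990 climatology, as the text describes.
base = (years >= 1961) & (years <= 1990)
climatology = sst[base].mean(axis=0)          # shape (12, 4)

# Anomalies are formed cell by cell BEFORE any further processing.
anomalies = sst - climatology                  # broadcasts over years

# Sanity check: over the base period each cell/month anomaly averages ~0.
print(np.abs(anomalies[base].mean(axis=0)).max())
```

Each cell/month series is centered on its own 1961-1990 mean, so the anomalies are near zero over the base period by construction.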

On land, with fixed weather stations, anomalies make more sense. The elevation of each weather station is different, but an individual station will often have occupied the same location through the entire 1961-1990 period, often with the same or similar equipment. Thus, making an anomaly at the start of processing, by subtracting each station's 1961-1990 monthly mean from its monthly value, is logical. On the ocean, where every measurement in a given month is at about the same elevation but comes from a different buoy or ship with different equipment, it makes less sense.

Figure 1. A comparison of the ICOADS 3 simple mean sea surface temperature (SST) in green, converted into an anomaly, with the heavily processed “final” SST anomalies from ERSST 5 (NOAA, orange) and HadSST 4.1 (Hadley Centre, heavy black line).

This post is about the anomaly in the ICOADS data during the World War II era, from 1939 to 1946, as marked in figure 1. It is very noticeable in the raw ICOADS data but disappears in the two final reconstructions shown. Many known problems occurred during the war. Shipping lanes changed due to the presence of submarine “wolf packs,” SSTs were increasingly measured inside engine water intakes rather than with buckets dipped in the ocean, and for those ships still using buckets the type of bucket often changed. These problems are apparent in the ICOADS data, but do they account for the entire radical correction shown in figure 1? The peak raw data anomaly is in 1944 (+2.14°C), yet the 1944 ERSST value is 0.091°C; thus the correction in 1944 is over 2°C. Is this realistic? The total warming since 1900 is estimated to be only about one degree; how is a two-degree correction for an entire year justified? ERSST and HadSST corrected the data, but how much confidence can we have in the corrections? What else was going on at the time?

The Climatic Conditions during World War II

Climatically there was a lot going on during the war. We are fortunate that Stefan Brönnimann of the University of Bern has dug out and digitized a very large database of meteorological data from Germany, the German-occupied areas, Sweden, the United States, the Soviet Union, and the UK, and tried to draw some conclusions about the climate of the Northern Hemisphere during the war. What is presented in this post is mostly from papers he published from 2003 to 2005 (Brönnimann, 2003; Brönnimann, Luterbacher, & Staehelin, 2004; Brönnimann & Luterbacher, 2004b; Brönnimann, 2005).

Using the data they collected, Brönnimann and colleagues built monthly maps of surface conditions, upper-atmospheric temperature, and geopotential height. Brönnimann and Luterbacher reconstructed the upper atmosphere for the period 1939 to 1944. They found evidence of a weak and disturbed Northern Hemisphere winter polar vortex. This resulted in anomalously high winter temperatures in the upper atmosphere and at the surface in Alaska, Canada, and Greenland, and very cold winter temperatures in Europe. An example map is shown in figure 2 for January 1942 at an altitude of 500 hPa (around 18,000 feet or 5,500 meters).

Figure 2. Example map of the January 1942 temperature anomaly at 500 hPa (5,500 meters). It is a polar projection of the Northern Hemisphere, and the center is the North Pole. Source: after (Brönnimann & Luterbacher, 2004b).

Brönnimann and colleagues believe that the unusual weather during World War II was related to the very strong and persistent El Niño that occurred at the time. In figure 3 we show the Niño 3.4 index, the AMO and the PDO with World War II marked as “WWII.”

While the world as a whole was unusually warm on average during the war years, Europe, northern Siberia, and the central North Pacific suffered through three bitterly cold winters from late 1939 to 1942. The Southern Hemisphere was not spared: sea surface temperatures in the southern mid-latitudes were unusually cool, and Australia suffered a prolonged drought from 1937 to 1945 (Hegerl, Brönnimann, Schurer, & Cowan, 2018).

Figure 3. The AMO, the PDO and the Niño 3.4 indices with World War II marked. Data source: NOAA.

In figure 3 the prominent World War II El Niño is very clear, and we can see that the AMO and PDO were positive. The North Atlantic Oscillation is not shown, but it was strongly negative during this period (see figure 4 for a plot of the similar Iceland-to-Aleutian sea level pressure anomaly). There is also evidence of high column ozone in both the Arctic and the mid-latitudes, a weak winter polar vortex, and frequent stratospheric warmings. As Brönnimann and colleagues report, “At the Earth’s surface and in the troposphere, the period 1940–42 represents an extreme climatic anomaly of hemispheric to global extent.”

Figure 4 compares the Niño 3.4 temperature to central, northern, and eastern European temperatures, the Iceland-to-Aleutian sea level pressure difference, the 100-mbar geopotential height difference between the poles and the mid-latitudes, and the total ozone measured at Arosa, Switzerland. All of these measurements show a distinct anomaly during World War II.

Figure 4. A comparison of the Niño 3.4 temperature anomaly to European surface temperature, the Iceland-to-Aleutian sea-level pressure anomaly, the 100-mbar geopotential height, and total ozone in Switzerland. The yellow shading covers roughly 1937 to 1945. Source: after (Brönnimann, 2005).

Figure 4 shows that, analyzed in the context of the 20th century, 1940-1942 stands out as a unique climatic anomaly. Thus, it seems odd that the final ERSST and HadSST records shown in figure 1 simply carry on through 1940-1942 on the same trend as before, as if nothing unusual were happening. They do show a hint of the 1946 cliff seen in the raw ICOADS data, but it is very subdued.

I do not question the problems in the raw data; they are firmly documented. I do, however, question the corrections applied by the Hadley Centre and NOAA. The corrected data appear to be too consistent with the periods before and after the giant World War II El Niño event. I would expect some of the anomaly seen in the raw data to survive the correction process.

Works Cited

Brönnimann, S. (2003). A historical upper air-data set for the 1939–44 period. International Journal of Climatology, 23(7), 769-791. doi:10.1002/joc.914

Brönnimann, S. (2005). The Global Climate Anomaly in 1940-1942. Weather, 60(12).

Brönnimann, S., & Luterbacher, J. (2004b). Reconstructing Northern Hemisphere upper-level fields during World War II. Climate Dynamics, 22, 499-510. doi:10.1007/s00382-004-0391-3

Brönnimann, S., Luterbacher, J., & Staehelin, J. (2004). Extreme climate of the global troposphere and stratosphere in 1940–42 related to El Niño. Nature, 431, 971–974. doi:10.1038/nature02982

Hegerl, G. C., Brönnimann, S., Schurer, A., & Cowan, T. (2018). The early 20th century warming: Anomalies, causes, and consequences. WIREs Climate Change, 9(4). doi:10.1002/wcc.522


100 Comments
April 17, 2025 2:10 pm

Very interesting.

The enormous spike on the chart during WWII was always so far beyond the typical variation that one or more anomalies almost certainly had to affect the results. (In my non-expert opinion.)

Michael S. Kelly
Reply to  pillageidiot
April 17, 2025 2:22 pm

I always thought that spike was due to two days in Japan in 1945, when the air temperature reached 100 million degrees for 10 microseconds each…

KevinM
Reply to  Michael S. Kelly
April 17, 2025 3:59 pm

Location might matter too. Island hopping campaign might have favored warmer waters.

Reply to  Michael S. Kelly
April 18, 2025 12:18 am

Wouldn’t that spike be in 1945, not 1943/1944?

Reply to  Michael S. Kelly
April 18, 2025 2:06 am

No thermometer would have registered that spike. 😉

Michael S. Kelly
Reply to  Michael S. Kelly
April 18, 2025 11:27 pm

Eight downvotes. What, too soon?

April 17, 2025 2:14 pm

Does anyone know if the Kuwait oil field fires (1991) showed up in any temperature records?

I vividly remember the billowing, dense black smoke.

No clue if that smoke absorbed/reflected sunlight differently than typical clouds. Or even if the massive event on a human scale, was still just too small to show up on a global scale?

hdhoese
Reply to  pillageidiot
April 17, 2025 6:11 pm

I’ve wondered about WWII, which must have seen the largest amount of oil spilled in history, including on coral reefs. Add to that the natural seepage, which was considerable in some places, such as where a German U-boat torpedoed a ship at the mouth of the Mississippi River but missed and blew up the jetty. Currents and density differences there are a mess. Wiggins, M. 1995. Torpedoes in the Gulf: Galveston and the U-Boats, 1942-1943. Texas A&M Univ. Press. 265 pp. Based on what I’ve read and the veterans I’ve talked to, including merchant marine, I doubt there was much concern about accurate temperatures; the concern was staying warm and surviving.

As for the Kuwait oil fire I was asked by a student if it was true that they would destroy the climate. I recall deferring some but pointing out it would be difficult to get into the stratosphere. Texas oil field firefighters shut them down rather quickly. 1990s was when the modern end of the world panic started.

DarrinB
Reply to  pillageidiot
April 19, 2025 9:54 am

Wish it had, but no. Intake air temperature in the gulf was 100+ degrees during the day for the first gulf war. If we had followed heat stress monitoring like we were supposed to, stay time in the engine room was a whopping 2 hours. Heat stress monitoring kicked in at >100 degrees, so us watch standers refused to record any temperature over 100. FYI, the watch station I mainly stood would record up to 130F once stepping out of the direct ventilation air stream. That’s with 6 complete air exchanges per hour in the space.

If you’re curious why we refused…We had a 3 man watch rotation for each watch station or what we called a five and dime, 5 hours on watch 10 hours off (one 4 hour watch to keep it in the 24 hour day). Absolutely no one wanted to go down to 2 hours on, 4 hours off watch. Talk about getting no sleep! Also we were pretty sure command would implement the watch rotation but consider off watch time as still time to be down in the plant working because regs did not specifically call out off watch time meant off work time. Our work center was in the steam plant so getting off watch wouldn’t mean squat for escaping the heat.

April 17, 2025 2:55 pm

The winter of 1944-45 was also brutal in Europe.

There was a great deal of increased war shipping in the equatorial Pacific.

A happy little debunker
April 17, 2025 3:00 pm

I couldn’t help but notice the increased readings coincide with the increased numbers of ships (warships) as WW2 continued, only to rapidly fall as ships were decommissioned … starting around 1945.

Scarecrow Repair
April 17, 2025 3:14 pm

It was all them bombers. Flying loaded to Germany, releasing all that explosive pressure in Germany, empty flying back, setting some polar vortex, gotta be.

Sparta Nova 4
Reply to  Scarecrow Repair
April 18, 2025 8:13 am

Humor – a difficult concept.
— Lt. Saavik

Mr.
April 17, 2025 3:47 pm

I’m thinking that folks generally in 1939 – 1945 had more to worry about than whether the temperature was a poofteenth higher than it was 30 years ago, and ditto the sea levels.

KevinM
Reply to  Mr.
April 17, 2025 4:01 pm

+”a poofteenth”

Sparta Nova 4
Reply to  Mr.
April 18, 2025 8:15 am

There are images from the Battle of the Bulge and the German invasion of Russia (Barbarossa) that are striking. Soldiers frozen standing after being hit with bullets.

Seems it might have been a wee bit cool during the winter back then.

Nick Stokes
April 17, 2025 3:51 pm

The ICOADS anomalies are generated by subtracting the 1961-1990 mean value from all the final raw simple yearly means.”

Lesson umpteen in the boring series of trying to get Andy May to understand anomalies. This is not how you calculate them. You always subtract from each monthly reading the mean for that time and place.

The reason is that raw temperatures are not homogeneous. There are hot and cold places, and hot and cold times. So in taking an average it really matters how you sample. If some cold places have missing values, the average will go up, when it shouldn’t. And there were lots of missing values in WWII, which is why you get that crazy behaviour. It’s just bad maths.

If you subtract out the expected value for each time and place, there aren’t hot and cold places any more. You get a much more robust result, which is not nearly so sensitive to missing values. That is why HADSST and ERSST get reasonable behaviour in these hard times. It isn’t some special adjustment. It is just correct computation of the average anomaly.
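The missing-value argument in this comment can be illustrated numerically (a toy sketch with invented climatologies, not actual ICOADS processing):

```python
import numpy as np

rng = np.random.default_rng(42)

# Two "cells": a warm tropical one (climate 28 C) and a cold polar one (2 C).
E = np.array([28.0, 2.0])                        # climatology per cell
A = 0.1 * rng.standard_normal((120, 2))          # true anomalies, mean ~0
T = E + A                                        # raw temperatures

# Full coverage: both averaging methods agree on "no change".
raw_full, anom_full = T.mean(), A.mean()

# Wartime-style gap: the cold cell goes unreported for half the record.
mask = np.ones_like(T, dtype=bool)
mask[:60, 1] = False                             # cold cell missing early on
raw_gappy = T[mask].mean()                       # biased warm: oversamples E=28
anom_gappy = (T - E)[mask].mean()                # still near zero

print(raw_gappy - raw_full)                      # jumps by several degrees
print(anom_gappy - anom_full)                    # barely moves
```

Dropping the cold cell shifts the raw-temperature average by several degrees while the anomaly average is essentially unchanged, which is the sensitivity to missing values being described.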

Nick Stokes
Reply to  Andy May
April 17, 2025 4:24 pm

Andy,
In para 2 you correctly explain what ERSST and HADSST do. In para 3 you give spurious reasons for not doing it. You just don’t seem to understand about homogeneity. And as usual, you just get very wrong results.

Nick Stokes
Reply to  Andy May
April 17, 2025 5:07 pm

Yes, it is spurious. To get an anomaly, you subtract from each T the expected value. It doesn’t matter much how you get it, but an average of historic temperatures is pretty good. But you must take out the expected value, so the anomaly that you average is distributed about zero.

Each T is made up of T=E+A, where E is the climate value for that place and time, and A the anomaly. What you do just gives you an average of the E values where T was recorded that month, with all the vagaries of latitude and season. The variability of that swamps the average of A, without those vagaries.

You can actually test that. If you calculate the average as you did, but replacing each T with E (the 1961-90 mean or whatever), you’ll get the same spurious result that you do. But you have taken out all the weather information, which resides in A. Your result just reflects the interaction of missing values with climate E.
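The “replace each T with E” test proposed here is easy to run on toy data (a hypothetical sketch; the cell climatologies and coverage pattern are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Four cell climatologies spanning warm and cold regions.
E = np.array([28.0, 20.0, 10.0, 2.0])
A = 0.1 * rng.standard_normal((240, 4))          # weather signal, mean ~0
T = E + A

# Coverage that drifts: the cold cells drop out mid-record.
mask = np.ones_like(T, dtype=bool)
mask[80:160, 2:] = False

# Average the raw field, then the same field with the weather stripped (T -> E).
raw_series = np.array([T[i][mask[i]].mean() for i in range(240)])
e_only_series = np.array([E[mask[i]].mean() for i in range(240)])

# The spurious mid-record jump appears even with no weather signal at all.
print(raw_series[100] - raw_series[0])
print(e_only_series[100] - e_only_series[0])     # exactly 9.0
```

The climate-only series reproduces essentially the entire jump in the raw series, showing that the jump reflects the interaction of missing values with the climatology E, not the weather in A.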

Nick Stokes
Reply to  Andy May
April 17, 2025 5:44 pm

Andy,
We went through all this back here (more here).
1. You get the anomaly by subtracting the expected value, and that can be done in various reasonable ways. But you are subtracting 0, which is in no way anything like the expected value.
2-3. You are not talking about temperature trends; you are talking about averages over space and time.

There is no masking. Again, you have a field of raw:
T=E+A
where E is the climate for that month and place; all the weather variability is in A. But in averaging T you average a sample of E values, depending on where T was recorded. Because E has big variation, that dominates the result (again, you can test by omitting A) in a way that doesn’t return what you want.

You can see this in your top fig. Before and especially during WWII you get results all over the place. After WWII, when there is much less missing data, your results tend back into agreement with the correct calculation of HADSST and ERSST.

Reply to  Nick Stokes
April 18, 2025 4:47 am

There is no masking. Again, you have a field of raw:
T=E+A”

As usual, you just can’t seem to get it right.

The equation should be:

T +/- u_t = (E +/- u_e) + (A +/- u_a)
where u is the measurement uncertainty

This becomes A +/- u_a = (T +/- u_t) – (E +/- u_e)

from this you get

A = T – E and
u_a = u_e + u_t (uncertainties always add)

Thus the measurement uncertainty of the anomaly is GREATER THAN the measurement uncertainty of either the estimated long term average or the current short term average (i.e. a monthly average).

If you want the most accuracy you should use the raw data as it is measured. Changing to an anomaly DECREASES your measurement accuracy.

Shifting the stated values of a set of data along the x-axis to center it on zero changes neither the variance nor the standard deviation of the sample means of the distribution. Thus using the anomaly actually provides no benefit from the point of view of statistical analysis of the raw data; it only increases the measurement uncertainty.
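The shift-invariance claim in the last paragraph is easy to verify (a quick sketch with arbitrary synthetic data):

```python
import numpy as np

rng = np.random.default_rng(7)
data = 15 + 2 * rng.standard_normal(500)   # "absolute" temperatures
anom = data - data.mean()                  # same data centered on zero

# A constant shift changes neither the variance nor the standard deviation.
print(np.isclose(data.var(), anom.var()))  # True
print(np.isclose(data.std(), anom.std()))  # True
```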

Now come back and tell us that all measurement uncertainty is random, Gaussian, and cancels.

Reply to  Nick Stokes
April 18, 2025 5:02 am

From your preceding post :

To get an anomaly, you subtract from each T the expected value.

Each T is made up of T=E+A, where E is the climate value for that place and time, and A the anomaly. What you do just gives you an average of the E values

From the post I clicked “Reply” on :

You get the anomaly by subtracting the expected value

… you have a field of raw T=E+A where E is the climate for that month and place; all the weather variability is in A. But in averaging T you average a sample of E values

Your “expected”, or “E”, values are monthly averages over an arbitrarily selected 30-year (or longer, e.g. NCEI uses “the 20th century”) period.

Any “average of (a sample of) E values” that is a multiple of 12 months will be a constant.

.

“Mr” gave a good response in words, but I am a more “visual” person.

The BEST GMST reanalysis datasets can be downloaded from the following link :
https://berkeleyearth.org/data/

Expand the “Global Time Series Data” section, then download the “Monthly Global Average Temperature” text file for both “Air” and “Water” versions of their algorithm.

From the header section of that file :

% As Earth’s land is not distributed symmetrically about the equator, there
% exists a mean seasonality to the global average temperature.
%
% Using air temperature above sea ice:
%
% Estimated Jan 1951-Dec 1980 monthly absolute temperature:
%       Jan   Feb   Mar   Apr   May   Jun   Jul   Aug   Sep   Oct   Nov   Dec
%     12.23 12.44 13.06 13.97 14.95 15.67 15.95 15.79 15.19 14.26 13.24 12.49
% +/-  0.03  0.02  0.02  0.03  0.03  0.03  0.03  0.02  0.02  0.02  0.03  0.03

Highlighted are the monthly “expected / E” values for the BEST (Air) time series of numbers.

Attached is a plot of the “anomalies / A” values and the “expected / E” values for the entire dataset (January 1850 to December 2024 at the time of posting), as well as a “zoomed in” plot from January 1975, AKA “The Modern Warming Period”.

When you wrote phrases along the lines of “an average of the E values” above, what notion(s) were you attempting to convey ?

.
.

PS : I find that when I am “blocked” on an idea it is often productive to “take a break” and either have a cup of coffee (with some chocolate biscuits, obviously …) or work on a different subject for a while.

When coming back to the original problem, the process of “getting back up to speed” is frequently interrupted by “profanatory and self-derogatory remarks” (hat-tip E. E. “Doc” Smith) … e.g. “Mark, you are a [ multiple expletives deleted ] idiot !” … as the “obvious” thing that I was missing suddenly became clear to a brain that had been “reset” while thinking about other concepts.

I added a post yesterday, in a “Reply” to you (!), to a now (almost ?) defunct WUWT comments section :

URL : https://wattsupwiththat.com/2025/04/14/lowering-energy-costs-in-america/#comment-4062425

If you need a “distraction” please consider responding to that post to “reset / unblock” your brain, and then returning here to read through this sub-thread again.

[Attachment: BEST-Air_Composite]
Nick Stokes
Reply to  Mark BLR
April 18, 2025 10:51 pm

When you wrote phrases along the lines of “an average of the E values” above, what notion(s) were you attempting to convey ?”

For goodness sake, I have said it over and over. T, E and A are numbers for a grid cell/month. Everything must be done before you get to the aggregated figures that you quote from BEST. These are cell-level calculations – I cited the data source. I don’t think you have understood anything at all.

On your ps I have responded back there. Although I clearly stated that perplexity was using 2022 numbers, you quoted 2023 WA numbers in an attempt to contradict. The 2022 numbers match up very well.

Reply to  Nick Stokes
April 19, 2025 11:06 am

I don’t think you have understood anything at all.

I object to the “anything at all” qualifier, but you are essentially correct, I did not understand your “argument”.

That is why I decided to ask some questions, in order to seek clarification.

For goodness sake, I have said it over and over. T,E and A are numbers for a grid cell/month.

Re-reading my posts I noted that I had added the following highlighting to an extract from one of your posts :
“… you have a field of raw T=E+A where E is the climate for that month and place; all the weather variability is in A.”

With the benefit of hindsight my sub-conscious had picked up on that “detail”, but I was unable to clearly / coherently formulate my subsequent “request for clarification”.

I do not know if BEST calculates anomalies per weather station / grid cell first and then the global equivalent numbers I copied from their file header afterwards, or if they calculate the absolute monthly GMST values first before deciding what Reference Period to use.

In either case my point … or one of them, at least … is that your generic statements about “T, E and A” do not apply to that specific counter-example.

.

On your ps I have responded back there.

I have just come here after responding to your response.

… you quoted 2023 WA numbers in an attempt to contradict

Your inferences about my “motivations” are incorrect, but unlike far too many other posters here you took the time to dig out some actual data to show where my honest mistake had been made.

That effort, despite your obvious frustration with my lack of comprehension, is greatly appreciated.

Note that the final formula I ended up with for 2022 was :
“Renewables” = Wind + Solar + [ Hydro + Pumped Storage (PS) ] + Wood + Biomass + Geothermal

The attached screenshot may help you understand why I decided to add those particular elements to the mix in an effort to reduce the “residuals” as much as possible.

[Attachment: WA-OK-OR-states_2022-selected]
Reply to  Nick Stokes
April 18, 2025 6:56 am

From the post:”…subtracting the expected value…”

Where do you get this expected value?

Nick Stokes
Reply to  mkelly
April 18, 2025 10:52 pm

Wearily, by averaging over the years 1961-90. Andy’s choice – I would have used 1995-2024.

Mr.
Reply to  Andy May
April 17, 2025 6:07 pm

Andy, you’re getting sidetracked and complicating about how climate temps anomalies are arrived at.

The correct way is, as I think Nick is trying to explain –
you just subtract the “expected” number.

So to simplify –
you just make shit up.

(Peer reviewers in climate papers will never notice anyway, and even it one did, “The Cause” dictates that they just ignore it).

Reply to  Mr.
April 17, 2025 7:58 pm

you just make shit up.

Yep, that is what they have done.

The data just DOES NOT EXIST to get anything remotely real in the way of “ocean” temperature before ARGO was in place.

Reply to  bnice2000
April 18, 2025 4:02 am

That’s right, the data does not exist.

Don’t you love how they confidently predict temperatures when they have no data!

The whole Climate Change Scam is based on making stuff up.

Reply to  Tom Abbott
April 18, 2025 5:27 am

From a statistical analysis point of view, the precision with which these “average” temperatures are determined is “undefined”, even with the Argo floats. Too few samples over too long of a time covering too large of an area.

Reply to  bnice2000
April 18, 2025 5:26 am

As usual, the adherence to statistical analysis protocols is non-existent in climate science.

Andy has said that some of these measurements represent an area as large as 12,000 sqkm. Assume that one measurement per 1000 sqkm would be a representative population and would give a good idea of the actual average temperature of that 12,000 sqkm. Since we don’t have the population values in order to calculate a standard deviation for the population we can estimate the population standard deviation by using the sample standard deviation, i.e. a sample size of 1. Now to calculate the standard deviation estimate from a sample you divide by (n-1) – which means the standard deviation of the sample is undefined (divide by zero) and, therefore, the standard deviation of the population can’t be determined or estimated. If the standard deviation of either the population or of the sample can’t be determined then you can’t calculate the SEM of the average – meaning you have no idea if that one sample of size 1 represents the actual area or not.

But Nick and climate science just assume that an undefined SEM is equivalent to an SEM of zero – meaning that sample of 1 measurement is a 100% accurate estimate for the entire area.

A high school sophomore taking AP Statistics would catch this and question the legitimacy of the analysis.

BTW, on average Argo floats sample about 100,000 sqkm of the oceans. (Density is higher in the NH than in the SH, so this is just an average.) I’m not sure of the time it takes to cover a sample area, but the floats only report every 10 days or so and I assume it takes a number of months to fully cover the area, so large areas still have a paucity of measurements. This means that, as above, how well the data actually determines “average” temperatures of the ocean is pretty much undefined – but climate science equates undefined with a zero SEM which, in turn, means a highly accurate estimate!
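The divide-by-(n−1) point about a sample of size 1 can be seen directly in Python’s standard library (an illustration of the statistical point, not of any dataset’s actual processing):

```python
import statistics

# One reading standing in for an entire grid cell: a sample of size 1.
sample = [18.4]

# Bessel's correction divides by (n - 1); with n = 1 the sample standard
# deviation -- and hence any SEM built from it -- is undefined.
try:
    statistics.stdev(sample)
except statistics.StatisticsError as err:
    print("stdev undefined for n = 1:", err)
```

`statistics.stdev` refuses a single data point for exactly the reason given above: no spread estimate is possible, so no SEM can be computed.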

Reply to  Mr.
April 18, 2025 3:15 am

I was going to try to comment on the idea of the expected number but now I don’t have to.

Reply to  Nick Stokes
April 18, 2025 7:00 am

Each T is made up of T=E+A, where E is the climate value for that place and time, and A the anomaly.

Let’s discuss measurement uncertainty for a minute using this equation. Rewriting it we end up with:

A = T – E

Now, if both T and E are measurements with a probability distribution, they have both a mean and an uncertainty value.

Looking at NIST TN 1900 as an example (note that it is not a complete assessment of uncertainty but is useable for an example), the monthly average has a standard deviation of 4.1°C and a standard deviation of the mean of 0.872°C. From my investigations, a 30-year baseline has a standard deviation of ~2.96°C and a standard deviation of the mean of ~0.55°C.

These are both random variables with probability distributions. When subtracting, “T – E”, the variances add. This gives a combined standard deviation of

√[(4.1)² + (2.96)²] ≈ 5.06°C

Converting to a standard deviation of the mean by dividing by √30 gives ±0.92°C.

Since these are not repeatable measurements of the same thing, nor are they using the same instruments, operators, etc., one should use the standard deviations and not the standard deviation of the mean.

Using error bars of even ±0.92◦C would obviate any substantial conclusions about what the actual anomaly values are.
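The propagation above can be checked directly (the 4.1°C and 2.96°C inputs are the figures quoted in this comment, not values derived from any dataset):

```python
import math

s_monthly = 4.1      # NIST TN 1900 monthly-average standard deviation, deg C
s_baseline = 2.96    # assumed 30-year baseline standard deviation, deg C

# For independent quantities, variances add when differencing T - E.
s_anom = math.sqrt(s_monthly**2 + s_baseline**2)

# Standard deviation of the mean for n = 30.
sem = s_anom / math.sqrt(30)

print(round(s_anom, 2))   # 5.06
print(round(sem, 2))      # 0.92
```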

Reply to  Andy May
April 18, 2025 3:12 am

raw temperatures are not homogeneous.

There are hot and cold places, and hot and cold times.

If some cold places have missing values, the average will go up, when it shouldn’t.

And there were lots of missing values in WWII, which is why you get that crazy behavior.

This is what sounds spurious to me. Then again, I’m not a mathematician.

Michael Flynn
Reply to  Nick Stokes
April 17, 2025 4:22 pm

Nick, the Earth is losing energy, about 44 TW.

That’s defined as cooling. You believe slow cooling is really increasing temperatures. You must be mad.

MarkW
Reply to  Michael Flynn
April 17, 2025 6:55 pm

44TW, even if correct, is tiny compared to the size of the Earth.
Secondly, thanks to the fact that the Earth’s core is radioactive, the Earth is generating heat almost as fast as it is losing it.

Michael Flynn
Reply to  MarkW
April 17, 2025 7:06 pm

Mark, the fact that the Earth is cooling means that it is getting colder. Very slowly.

You don’t seem to be disputing anything I wrote, just trying to imply that slow cooling really means “getting hotter”.

As you rightly point out, the radiogenic heat is not quite enough to replace the ongoing heat loss.

Or did you really mean to say that you think that adding CO2 to air makes it hotter? That would be rather silly, wouldn’t it?

leefor
Reply to  Nick Stokes
April 17, 2025 7:21 pm

So Nick, tell us about the Argo buoys being continually in the same time and place, as well as the ships. 😉

ERSST is computation of the anomaly? LOL.

Nick Stokes
Reply to  leefor
April 17, 2025 7:29 pm

They don’t need to be in the same place. Andy starts with an average temperature for a grid cell for a month. What you need is a climate estimate for that grid cell/month. So you use the 1961-90 average or whatever. But you subtract that estimate before you do anything else.

Reply to  Nick Stokes
April 17, 2025 8:00 pm

Before ARGO there is basically ZERO coherent whole-of-ocean data… period !

It is all meaningless fakery.

leefor
Reply to  Nick Stokes
April 17, 2025 9:29 pm

But the grid cells at 12300 sq km? That is not fit for the porpoise. 😉

Nick Stokes
Reply to  leefor
April 17, 2025 9:49 pm

The purpose is to get a global average. Not a problem.

leefor
Reply to  Nick Stokes
April 17, 2025 10:58 pm

A global average where the long term data is at the very least suspect. Sparse data cannot be infilled. 😉

Reply to  leefor
April 18, 2025 3:18 am

Sparse data cannot be infilled.

Boy, do you have a lot to learn about climate science!

Reply to  leefor
April 18, 2025 5:35 am

Or homogenized or interpolated.

Reply to  Nick Stokes
April 18, 2025 1:40 am

Probably better than the area of some surface sites.

Still a complete and absolute joke.

Reply to  Nick Stokes
April 17, 2025 7:55 pm

There are very few readings in the Southern hemisphere or even the NH before about 1960.

Any values before then, or even after then, are pure guesswork, and basically meaningless.

[Attachment: ocean-temp-coverage]
Reply to  bnice2000
April 18, 2025 4:08 am

Alarmist Climate Science is practically all guesswork.

It’s not good science.

Izaak Walton
Reply to  Andy May
April 18, 2025 7:46 am

so if you don’t think that we can know the global surface temperature before the 1990s does that mean that you believe in the medieval warm period?

Reply to  Izaak Walton
April 18, 2025 1:04 pm

There is no doubt from biological evidence that the MWP was at least as warm as now.

Evidence from all around the world is available.

Reply to  Nick Stokes
April 18, 2025 3:07 am

The reason is that raw temperatures are not homogeneous. There are hot and cold places, and hot and cold times. So in taking an average it really matters how you sample. If some cold places have missing values, the average will go up, when it shouldn’t. And there were lots of missing values in WWII, which is why you get that crazy behaviour.

Well, Nick, you’re right about one thing: “It’s just bad maths.”

Bob
April 17, 2025 6:17 pm

I have no confidence in corrections. Give us the data as measured. If you think there are problems with the measurements, tell us, explain why you think there are problems, and what caused them. Don’t fool around with the measurements and let people believe your corrections represent what you measured; that is dishonest and not helpful.

April 17, 2025 9:24 pm

The winter was Germany’s biggest enemy in the war with Russia.

J K
April 17, 2025 9:35 pm

Grok’s evaluation of this post. Either grok is biased or Andy May is hiding things deliberately.

Strengths

  • Data-Centric Approach: The article focuses on empirical data (ICOADS, ERSST, HadSST) and provides a figure to compare anomalies, making it accessible for readers interested in raw data analysis.
  • Historical Contextualization: It effectively highlights the unique challenges of WWII-era data collection, such as daytime observations and reduced trade ship activity, which could explain anomalies.
  • Engagement with Scientific Literature: Referencing Hegerl et al. (2018) adds a layer of credibility by tying the discussion to peer-reviewed research.

Weaknesses

  • Skeptical Bias: The article is hosted on Watts Up With That?, a blog known for promoting climate change denial and opposing scientific consensus. This context suggests a potential bias toward questioning mainstream climate data without fully exploring alternative explanations.
  • Lack of Detailed Explanation: The article does not deeply explore why the WWII anomaly might be smoothed in ERSST and HadSST (e.g., specific statistical methods or corrections for wartime biases). It hints at methodological issues but lacks technical depth.
  • Limited Counterarguments: It does not address potential reasons for data processing differences, such as corrections for known biases (e.g., ship engine intake measurements) or improvements in modern datasets.
  • Speculative Tone: The implication that the anomaly’s absence in official records is suspicious lacks robust evidence, relying on a single commenter’s observation and the author’s interpretation.

Broader Context

  • Watts Up With That? Reputation: The blog is described as a leading climate change denial platform, with contributors like Christopher Monckton and Fred Singer, and has been criticized for inaccuracy and politicized commentary. Its involvement in the Climategate controversy further underscores its skeptical stance.
  • WWII Warm Anomaly: Other sources confirm the WWII warm anomaly as a known phenomenon, potentially due to changes in measurement practices (e.g., daytime observations) and sparse data. The article’s focus on this anomaly aligns with scientific interest but frames it in a way that questions data integrity.
  • Data Reliability: The article’s critique of SST data processing echoes broader debates about historical climate data, where adjustments are made to account for measurement biases (e.g., urban heat islands, instrument changes). However, mainstream science, as supported by NOAA and IPCC reports, finds these adjustments generally reliable.

Potential Counterpoints

  • Data Corrections: ERSST and HadSST datasets undergo rigorous corrections to account for biases, such as wartime measurement shifts or ship-based data inconsistencies. The smoothing of the WWII anomaly could reflect these corrections rather than data manipulation.
  • Scientific Consensus: The IPCC and other bodies maintain that global warming trends are robust despite historical data challenges, and the WWII anomaly does not undermine long-term trends.
  • Alternative Explanations: The anomaly could result from environmental factors (e.g., reduced aerosol cooling during wartime) or data artifacts, which the article does not explore.

TL;DR: The article highlights a WWII-era SST spike in raw ICOADS data, missing in ERSST and HadSST, and questions data processing methods. It provides historical context but is limited by a skeptical bias, lack of technical depth, and failure to address data correction rationales. Hosted on a climate denial blog, it aligns with critiques of mainstream climate science.

R_G
Reply to  J K
April 17, 2025 10:27 pm

J K is hallucinating again.

Reply to  J K
April 17, 2025 10:56 pm

GROK as a source of anything!
Is GROK an AI? What literature was used for its learning? Clearly, literature that claims WattsUpWithThat is a denier website.

Reply to  J K
April 18, 2025 12:26 am

Weaknesses

Skeptical Bias: The article is hosted on Watts Up With That?, a blog known for promoting climate change denial and opposing scientific consensus. This context suggests a potential bias toward questioning mainstream climate data without fully exploring alternative explanations.

The very fact that Grok states WUWT promotes “climate change denial” suggests Grok has an inherent bias itself.

Now go back and ask Grok to repeat the analysis without the bias.

J K
Reply to  Redge
April 18, 2025 4:56 am

I will do that next time

Reply to  J K
April 18, 2025 6:24 am

There shouldn’t be a “next time.”

Reply to  J K
April 18, 2025 7:21 am

Sure you will.

The same as, next time you’ll think for yourself.

Reply to  J K
April 18, 2025 1:42 am

Relying on Artificial Intelligence scraping the farcical anti-science of the web ??.

But where is your own intelligence?

paul courtney
Reply to  bnice2000
April 18, 2025 5:04 am

Mr. 2000: He evidently chose not to use it when he got the desired result. Using JK’s “analysis”, his comment is posted at a wrong-think website (WUWT, ick) and therefore is wrong. But he put alotta lipstick on it!

Reply to  J K
April 18, 2025 4:13 am

The article is hosted on Watts Up With That?, a blog known for promoting climate change denial and opposing scientific consensus.

The blog is described as a leading climate change denial platform …

If a true transcript then this just shows the (possibly sub-conscious ?) biases of Grok’s programmers.

These things about WUWT are “known” by whom ?

Those “descriptions” were written by whom ?

On the subject of “scientific consensus” one of the best rebuttals was that of Michael Crichton :

Historically, the claim of consensus has been the first refuge of scoundrels; it is a way to avoid debate by claiming that the matter is already settled. Whenever you hear the consensus of scientists agrees on something or other, reach for your wallet, because you’re being had. The greatest scientists in history are great precisely because they broke with the consensus. There is no such thing as consensus science. If it’s consensus, it isn’t science. If it’s science, it isn’t consensus. Period.

.

In a WUWT article / comment : 2 + 2 = 4

Grok : WUWT is a blog known for promoting climate change denial …

J K : Mathematicians ! You have to rewrite the laws of integer arithmetic ! ! !

Mathematicians (startled) : Errrrrrr … why, exactly ?

J K : Because WUWT is a denialist blog ! ! !

Mathematicians (+ World + Dog) : [ … backs away slowly … ]

.

Some time later, in a second WUWT article / comment : E = mc²

Grok : WUWT is a blog known for promoting climate change denial …

J K : Physicists ! You have to find an alternative to Special Relativity ! ! !

Physicists (startled) : Errrrrrr … why, exactly ?

J K : Because WUWT is a denialist blog ! ! !

Physicists (+ World + Dog) : [ … backs away slowly … ]

.

Please try again, but using your own thoughts and analytical skills this time.

Reply to  Mark BLR
April 18, 2025 4:52 am

Please try again, but using your own thoughts and analytical skills this time.

But being told what to think and mindlessly parroting liberal talking points is so much easier. Now they even have “AI” to even prepare all their responses for them.

Reply to  J K
April 18, 2025 4:13 am

“Watts Up With That? Reputation: The blog is described as a leading climate change denial platform”

Described by whom? Does Grok have a source? Grok seems to have been fed climate alarmists propaganda and accepts it as truth.

Get brainwashed much, Grok? Your handlers are lying to you.

Sparta Nova 4
Reply to  Tom Abbott
April 18, 2025 8:23 am

Describe by Wiki.
’nuff said.

paul courtney
Reply to  J K
April 18, 2025 5:16 am

Mr. K: So which is it, grok biased or the author is hiding things? Didn’t you ask? The answer would be just as fascinating as your above comment. Unfortunately, I must discredit your comment due to reading it at this site. Right?
Reading your comment is a mistake I won’t make again.

cementafriend
April 17, 2025 10:53 pm

Kreutz here (Kreutz, W., “Kohlensäure Gehalt der unteren Luftschichten in Abhängigkeit von Witterungsfaktoren,” Angewandte Botanik, vol. 2, 1941, pp. 89- ) measured several climate indicators (temperature, wind, precipitation, CO2, radiation) several times every day for 1.5 years. From his graphs it is clear that temperature precedes CO2, and that CO2 levels were similar to the present-day level (about 400 ppm).
This paper by Dr Massen and Ernst-Georg Beck (“Accurate estimation of CO2 background level from near ground measurements at non-mixed environments”) looks at the CO2 results of Kreutz and measurements by others to show how wind influences results, and how to obtain an accurate measurement. The latter paper is available at ResearchGate: https://www.researchgate.net/publication/234004309.
Ocean surface temperatures clearly have a measurement problem. E-G Beck also looked at the relation between OST and CO2. As well as I remember, atmospheric CO2 lags OST changes by up to 5 years. There is a longer lag in ice core data, even though there are doubts about the CO2 in ice cores. There is no evidence that CO2 has any effect on atmospheric temperature or on surface temperature.

Izaak Walton
April 17, 2025 11:14 pm

The peak raw data anomaly is in 1944 (+2.14°C), yet the 1944 ERSST value is 0.091°C, thus the correction in 1944 is over 2°C. Is this realistic?

Yes. There is no physical mechanism that could warm the oceans by 2 degrees over a period of a couple of years without also warming the land temperatures by a much larger amount, given the relative heat capacities of water and air. Also, the temperature anomaly dropped by 2 degrees in less than a year in 1946, and again the question has to be: where did that energy go? That temperature anomaly is too big, and over too short a time period, to be physically reasonable.

Reply to  Andy May
April 18, 2025 5:34 am

“1) the correction procedure is ad-hoc and useless. 2) The raw data is so bad it shouldn’t be used.”

100%

“My personal opinion? Only back to the early 1990s.”

Even after the 90’s the data is so sparse that it is statistically unsound. The Argo floats each cover an average area of around 100,000 sq km and take several months to do so. Jamming this data into one data set and pretending it is a representative sample is literally garbage.

April 18, 2025 1:57 am

SSTs near Scotland show something similar.

[image: Sea-temp-Scotland]
April 18, 2025 2:04 am

Many places in the NH, the only hemisphere where any widespread measurements exist, USED to have a similar pattern with a strong peak in or around 1940.

E.g., several places in the Arctic.

[image: arctic_temp]
April 18, 2025 2:05 am

Maximum temperatures in the USA do the same thing.

[image: 1940s-us-temp]
April 18, 2025 4:25 am

Somebody should ask one or more of these AI to tell us how Phil Jones got his sea surface temperature data.

Phil Jones wouldn’t let us look at his methods because he said someone might criticize them, but maybe an AI can figure out how he bastardized the temperature record after the end of the Little Ice Age in 1850.

Phil Jones bastardization of the instrument-era temperature record erased the past warm high spots in the 1880’s and 1930’s, and erased the significant cooling of the 1970’s (Ice Age Cometh!), in his efforts to make the temperature record track the CO2 increases, thus making it appear that CO2 is causing the temperatures to get hotter and hotter and hotter and shows today as the hottest time in human history. Made to order for Climate Alarmists.

And it’s all a BIG LIE. It’s not any warmer today than it was in the recent past, and we have actual evidence to prove this in the regional, historic, original temperature records from around the world, which all show it was just as warm in the recent past as it is today.

This being the case, there is no evidence that CO2 has affected Earth’s temperatures, because it is no warmer today, with more CO2 in the air, than it was in the recent past, with less CO2 in the air. Therefore, CO2 has had no visible effect on the Earth’s temperatures.

This is what Phil Jones wanted to hide with his bastardization of the instrument-era temperature record. He wanted to hide the fact that CO2 cannot be shown to be heating up the planet.

sherro01
April 18, 2025 5:08 am

Andy,
One might expect large land masses to have a wartime hot blip if the oceans were showing one around them
A quick look at Australian land heatwave data shows a little evidence for 1939 and/or 1940 being years with very hot 10-day hottest heatwaves in 4 stations of the 10 I studied.
Adelaide, Brisbane, Melbourne and Sydney had quite hot 10-day heatwaves in one or two of these years, while stations at Alice Springs, Darwin, Hobart, Longreach, Cape Leeuwin and Perth did not show similar anomalous warming in wartime.
Viewed another way, the average daily Sydney Tmax temperature for 1856-2024 is 21.8C, while for 1939-1945 it is 21.6C. For Melbourne, 19.9C versus 19.8C. Overall, not much there to suggest a link to ocean temperatures.
This is no ideal way to study the matter. There is so much unexplained noise in the raw daily land data that all sorts of odd things could have happened, but cannot be linked to specific events. The same is the case for ocean temperatures.
Geoff S

sherro01
April 18, 2025 5:27 am

Andy,
After study of questions like this for 30 years, my only reasonable conclusion is that the available data prior to 1945 is unfit for the purpose.
I welcome arguments to the contrary.
Geoff S

Reply to  sherro01
April 18, 2025 6:30 am

Unfortunately for people like Nick and others, their argument is “it’s the data we have, so we have to use it,” regardless of whether it’s fit for purpose or not. Actually, it’s beneficial to their cause, because they can point out the questionable data and therefore justify any number of questionable adjustments until the data “confesses” what they want it to.

April 18, 2025 6:33 am

The Oceans Govern Climate is a website by Arnd Bernaerts, oceanographer and naval war historian.

Man influences the ocean governor by means of an expanding fleet of motorized propeller-driven ships. Naval warfare in the two World Wars provides the most dramatic examples of the climate effects.

http://oceansgovernclimate.com/wp-content/uploads/2017/02/20_1.jpg

Neither I nor Dr. Bernaerts claim that shipping and naval activity are the only factors driving climate fluctuations. But it is disturbing that so much attention and money is spent on a bit player CO2, when a much more plausible human influence on climate is ignored and not investigated.

https://rclutz.com/2017/02/18/ocean-climate-ripples/

jdj
April 18, 2025 6:47 am

May I ask whether anybody has considered the convoy effect, where the ships behind are sailing through the lead ships’ water, which has been churned and warmed by the screws, not to mention contaminated with oil and other debris such as exploded ship contents?

A useful exercise might be to analyse a convoy’s spatial distribution of readings and see if there is a gradient from leaders to laggards and how things change temporally as convoys speed up or slow down or mill about.
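jdj’s suggested exercise is straightforward to sketch. With hypothetical per-ship readings (position 0 = lead ship, increasing astern; the numbers below are invented for illustration), a least-squares slope tests for a leader-to-laggard gradient:

```python
import numpy as np

# Hypothetical convoy SST readings, deg C, ordered from lead ship astern.
position = np.arange(8, dtype=float)
sst = np.array([14.1, 14.2, 14.2, 14.4, 14.3, 14.5, 14.6, 14.6])

# Least-squares gradient in deg C per ship interval; a positive slope
# would be consistent with laggards reading churned, warmed water.
slope, intercept = np.polyfit(position, sst, 1)
print(f"gradient: {slope:+.3f} deg C per position")
```

Repeating the fit for the same convoy at different speeds would address the temporal part of the question.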

Nick Stokes
April 18, 2025 7:57 am

As I noted above, if you average raw temperatures, as Andy has done, you will get spurious results, which reflect not the weather, but the interaction of missing values with long term climate variation, both spatial and seasonal. If there are missing values in a month in cold places, the result is warmer, for example.

A test is to do that bad arithmetic, and then repeat it replacing the T data at each grid/month by a 1961-90 (say) average. Now there is no weather information. A grid cell in June has the same value for each year, except where T was missing.

So I did that with one of the datasets Andy is using – the HadSST actual. It is the orange curve in the graph here, from the predecessor to this post:

[image]

Here is my calc, with the average T calculated in the same way in black. Then the average calculated with the data held constant at average values, as described above, in red:

[image]

The pattern is the same, including the WWII spike. But the red curve has no weather. It just has fixed T with the same set of missing values. The spike at WWII occurs because of missing values in colder waters.
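Nick’s red-curve test can be reproduced on synthetic data. The sketch below (Python, with an invented climatology and an invented wartime coverage mask, not the actual HadSST grid) holds every cell at its fixed climatological value — no weather signal at all — and raw averaging still produces a wartime spike, driven entirely by missing cold cells:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_years = 100, 30

# Fixed climatology per cell: polar (0 C) through tropical (28 C).
climatology = np.linspace(0.0, 28.0, n_cells)

# Every year holds the same climatological values: zero weather signal.
data = np.tile(climatology, (n_years, 1))

# Coverage mask: in "wartime" years 15-20 most cold cells stop reporting.
mask = np.ones_like(data, dtype=bool)
mask[15:21, : n_cells // 2] = rng.random((6, n_cells // 2)) < 0.1

# Raw yearly mean over whatever cells reported that year.
raw_mean = np.array([data[y, mask[y]].mean() for y in range(n_years)])

print(raw_mean[:15].mean())    # full-coverage years: the true 14 C
print(raw_mean[15:21].mean())  # wartime years: spuriously warmer
```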

Nick Stokes
Reply to  Andy May
April 18, 2025 2:38 pm

“Your exercise shows that you can get two wildly different results depending upon the sample chosen.”

Yes, with bad arithmetic.

Again, you can write T as C+A, where C is the climate (say 1961-90 average for that month and grid) and A is the anomaly, which contains all the information you want about varying weather. The average of T is the sum of the averages of C and A. Now C is constant over years, so its average should be constant. But with missing data you are effectively sampling, and with C being so inhomogeneous (tropics, poles etc), that gives very variable results, which are telling you about your process rather than the state of the Earth. And this exercise returns the average of T and of C, and shows that the process result dominates. What you want to know is the average of A.

To get that, you have to identify C and subtract it, before averaging. That is what ERSST etc are doing, in your example, and free of the wild swings that you have. Those swings are just from the attempt to average C (where the average of C should be constant).

Reply to  Nick Stokes
April 18, 2025 2:55 pm

Temperature is not climate. If it was then Las Vegas and Miami would have the same climate.

“Now C is constant over years, so its average should be constant.”

What are the dimensions of “C”? Degrees celsius?

Nick Stokes
Reply to  Tim Gorman
April 18, 2025 3:10 pm

T, C and A have the same units, necessarily.

Nick Stokes
Reply to  Andy May
April 18, 2025 7:57 pm

“Your assumptions overreach your data”

You just have no idea about the arithmetic. Of course we know C. It is the mean over 30 years of the grid average. You could choose 1995-2024 if you like. With 30 years of observation it is ridiculous to say we have no idea of the climate average.

The basic point is that you are averaging T=C+A, and C is constant over years for any grid cell. But C averaged, as in my exercise (red curve) and as you do it, is far from constant. And the error made in that bad calculation dominates your calculation and appears all the way through. It is just a very bad way to average C because of the missing values, and there need not be any: you can fill the missing values of C because of its constancy.

A proper way to average T would be to average C and A separately, and add. A you could do by your method, but with C you could infill and get a correctly constant result over years. Much better, but not very interesting, which is why people just talk about the anomaly average, which is where the information is. But you could do it.

Of course A should be infilled too, but that is another story. Failure to infill A isn’t as harmful, because of its homogeneity.
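The decomposition Nick describes can be sketched on the same kind of synthetic grid (Python, invented climatology and anomalies, not real SST data): averaging T raw inherits the sampling error in C, while averaging the anomalies A over available cells and adding back the exact full-grid mean of C does not:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_years = 100, 30

climatology = np.linspace(0.0, 28.0, n_cells)       # C: fixed per cell
anomaly = rng.normal(0.0, 0.3, (n_years, n_cells))  # A: small, homogeneous
temps = climatology + anomaly                       # T = C + A

# Wartime coverage: cold half of the grid missing in years 15-20.
mask = np.ones_like(temps, dtype=bool)
mask[15:21, : n_cells // 2] = False

# Bad arithmetic: average T over available cells; sampling of C dominates.
raw = np.array([temps[y, mask[y]].mean() for y in range(n_years)])

# Better: average A over available cells, then add the exact mean of C.
anom = np.array([(temps[y, mask[y]] - climatology[mask[y]]).mean()
                 for y in range(n_years)])
fixed = anom + climatology.mean()

print(raw[15:21].mean() - raw[:15].mean())      # large spurious jump
print(fixed[15:21].mean() - fixed[:15].mean())  # near zero
```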

Nick Stokes
Reply to  Andy May
April 19, 2025 4:43 am

Andy, it is just a matter of arithmetic, and your objections are spurious. Back to basics – T is in my case a 72x36x12x175 array. That is, lon, lat, month, year. You can split it any way you like into C and A, so that T=C+A. If you apply an averaging process to C and A separately, and add, you will get the same as averaging T by the same process.

But we can apply some constraints. C should be constant over years. Then you don’t need missing values any more; it is defined constant and can be exactly infilled. So how to choose constants for C? Just to make A small. Then the error of averaging A with NAs will be much less than averaging T. Any reasonable approximation to the mean of T over years assigned to C will have that property. The A’s have a spread of a degree or so, and the error must be a fraction of that. Choosing a good mean will lead to better cancellation, but that is icing on the cake.

Note that I haven’t said anything about what anomalies should really mean, although by the definition they contain all the weather information, including AMO etc. I’ve just presented it as a better way to average T. It really is just arithmetic.

Here it is graphically. As before, black is T averaged your way. Red is C averaged your way, and the difference is the average of A. Green is the correctly infilled average of C. It is periodic over a year, because months are done separately. The blue curve is the sum of the correct average of C and the average of A = T-C. Note that it is very like the ERSST average in your graph, including being about 2C cooler. Your average was boosted because cold places have a lot of missing values. ERSST didn’t do any fancy adjustments. They just averaged C in the obvious way.

[image]

The red, blue and black curves have 60 month smoothing. The green is not smoothed, to emphasise the seasonality of the climatology.

Reply to  Nick Stokes
April 18, 2025 5:17 pm

“because of missing values in colder waters”

The whole ocean is 97%+ “missing values” before maybe the 1980s, or maybe even before 2005 and ARGO.

It is total idiocy to think anyone can calculate a realistic or meaningful “global mean” from it.

Nick Stokes
Reply to  Andy May
April 18, 2025 10:55 pm

“1961-1990”
Those years were your choice. You can use 1995-2024, say.

observa
April 18, 2025 5:36 pm

Well it was presaging WW2-
Black Friday bushfires – Wikipedia
Which meant the nation had to put off establishing a serious rural firefighting response until after the war-
Black Friday 1939 | CFA (Country Fire Authority)

We learn by experience and as resources permit although some are curiously slow on the uptake it seems-
Electric vehicles are a DANGER to shipping | MGUY Australia