UAH v6.1 Global Temperature Update for April, 2026: +0.39 deg. C

From Dr. Roy Spencer’s Global Warming Blog

by Roy W. Spencer, Ph. D.

This month I’m adding plots for USA48 and Canada, too.

The Version 6.1 global average lower tropospheric temperature (LT) anomaly for April, 2026 was +0.39 deg. C, a departure from the 1991-2020 mean that has remained statistically unchanged for four months now.

The Version 6.1 global area-averaged linear temperature trend (January 1979 through April 2026) remains at +0.16 deg. C/decade (+0.22 C/decade over land, +0.13 C/decade over oceans).
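For readers who want to reproduce a figure like this, the trend is an ordinary least-squares slope of the monthly anomaly series, rescaled to degrees per decade. A minimal sketch with synthetic data (not the actual UAH series):

```python
# Sketch: ordinary least-squares trend of a monthly anomaly series,
# expressed in deg C per decade. Synthetic data only, not UAH data.
def trend_per_decade(anomalies):
    """anomalies: list of monthly values in time order."""
    n = len(anomalies)
    # time axis in decades (120 months per decade)
    t = [i / 120.0 for i in range(n)]
    t_mean = sum(t) / n
    a_mean = sum(anomalies) / n
    cov = sum((ti - t_mean) * (ai - a_mean) for ti, ai in zip(t, anomalies))
    var = sum((ti - t_mean) ** 2 for ti in t)
    return cov / var

# Example: a series warming at exactly 0.16 C/decade,
# Jan 1979 .. Apr 2026 is 568 months
series = [0.16 * (i / 120.0) for i in range(568)]
print(round(trend_per_decade(series), 2))  # -> 0.16
```

The real series is noisy, of course; the least-squares machinery is the same either way.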

The following table lists various regional Version 6.1 LT departures from the 30-year (1991-2020) average for the last 28 months (record highs are in red). Note I’ve added Canada to the table this month, by request (although WordPress won’t allow me to add September 2024 for some reason). The warmest April in Canada was in 2010 (+2.61 deg. C), while the warmest anomaly out of all months was in January 1981 (+3.75 deg. C).

Year Mon   Globe  NHem.  SHem.  Tropics  USA48  Arctic  Aust.  Canada
2024 Jan   +0.80  +1.02  +0.57  +1.20    -0.19  +0.40   +1.12  +0.97
2024 Feb   +0.88  +0.94  +0.81  +1.16    +1.31  +0.85   +1.16  +2.45
2024 Mar   +0.88  +0.96  +0.80  +1.25    +0.22  +1.05   +1.34  +1.12
2024 Apr   +0.94  +1.12  +0.76  +1.15    +0.86  +0.88   +0.54  +1.39
2024 May   +0.77  +0.77  +0.78  +1.20    +0.04  +0.20   +0.52  +0.67
2024 June  +0.69  +0.78  +0.60  +0.85    +1.36  +0.63   +0.91  +0.19
2024 July  +0.73  +0.86  +0.61  +0.96    +0.44  +0.56   -0.07  +1.15
2024 Aug   +0.75  +0.81  +0.69  +0.74    +0.40  +0.88   +1.75  +1.36
2024 Sep   +0.81  +1.04  +0.58  +0.82    +1.31  +1.48   +0.98
2024 Oct   +0.75  +0.89  +0.60  +0.63    +1.89  +0.81   +1.09  +0.89
2024 Nov   +0.64  +0.87  +0.40  +0.53    +1.11  +0.79   +1.00  +1.61
2024 Dec   +0.61  +0.75  +0.47  +0.52    +1.41  +1.12   +1.54  +1.65
2025 Jan   +0.45  +0.70  +0.21  +0.24    -1.07  +0.74   +0.48  +1.04
2025 Feb   +0.50  +0.55  +0.45  +0.26    +1.03  +2.10   +0.87  -0.35
2025 Mar   +0.57  +0.73  +0.41  +0.40    +1.24  +1.23   +1.20  +0.80
2025 Apr   +0.61  +0.76  +0.46  +0.36    +0.81  +0.85   +1.21  +0.45
2025 May   +0.50  +0.45  +0.55  +0.30    +0.15  +0.75   +0.98  +0.81
2025 June  +0.48  +0.48  +0.47  +0.30    +0.80  +0.05   +0.39  -0.22
2025 July  +0.36  +0.49  +0.23  +0.45    +0.32  +0.40   +0.53  -0.23
2025 Aug   +0.39  +0.39  +0.39  +0.16    -0.06  +0.82   +0.11  +0.62
2025 Sep   +0.53  +0.56  +0.49  +0.35    +0.38  +0.77   +0.30  +2.44
2025 Oct   +0.53  +0.52  +0.55  +0.24    +1.12  +1.42   +1.67  +2.59
2025 Nov   +0.43  +0.59  +0.27  +0.24    +1.32  +0.78   +0.36  +1.47
2025 Dec   +0.30  +0.45  +0.15  +0.19    +2.10  +0.32   +0.37  -1.86
2026 Jan   +0.35  +0.51  +0.19  +0.09    +0.30  +1.40   +0.95  +1.17
2026 Feb   +0.39  +0.54  +0.23  +0.03    +1.91  -0.48   +0.73  +0.32
2026 Mar   +0.38  +0.33  +0.42  +0.07    +3.74  -0.48   +1.14  -3.17
2026 Apr   +0.39  +0.43  +0.34  +0.23    +1.20  +0.30   +0.70  -0.89
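Each entry above is a departure from that calendar month's 1991-2020 mean. As a sketch of what that computation looks like (illustrative numbers, not UAH data):

```python
# Sketch: an anomaly is this month's value minus the mean of the same
# calendar month over the 1991-2020 reference period. Illustrative only.
def anomaly(value, baseline_values):
    """value: this month's temperature;
    baseline_values: the same calendar month's values over 1991-2020."""
    baseline_mean = sum(baseline_values) / len(baseline_values)
    return value - baseline_mean

# E.g. if the April 1991-2020 mean were 15.00 C and April 2026 read 15.39 C:
print(round(anomaly(15.39, [15.00] * 30), 2))  # -> 0.39
```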

Time Series Plots for USA48 and Canada

Starting this month I will include time series graphs for USA48 and Canada, in addition to the usual global plot. Note that for the previous month (March) the record warmth in USA48 (+3.74 deg. C) was in stark contrast to the coldest March in Canada in the 48-year satellite record (-3.17 deg. C).

The full UAH Global Temperature Report, along with the LT global gridpoint anomaly map for April, 2026 and a more detailed analysis by John Christy, should be available within the next several days here.

The monthly anomalies for various regions for the four deep layers we monitor from satellites will be available in the next several days at the following locations:

Lower Troposphere

Mid-Troposphere

Tropopause

Lower Stratosphere

Sparta Nova 4
May 7, 2026 2:04 pm

Why are we still looking at average global temperatures?

Reply to  Sparta Nova 4
May 8, 2026 7:33 am

Because it allows us to determine how quickly our planet is recovering from the ice age 10,000 years ago…now that only vestigial masses of ice remain, mostly in polar regions.

bdgwx
Reply to  Sparta Nova 4
May 9, 2026 6:17 am

It is so that we can test the hypothesis that the global average temperature will increase/decrease.

Mr.
May 7, 2026 2:15 pm

“global”

Definition:
Pertaining to the whole of something

Which means that the “global temperature” must pertain to my street, and all the other streets around the globe.

So it’s currently “15C and Sunny” everywhere around the planet, just as it is at my location?

Here’s an idea – why don’t we start using more precise, accurate descriptors for such numeric constructs as this, say something like –

“formulaic constructs of selected temps readings at different times from around the world”?

Make what you will of them.

Nick Stokes
May 7, 2026 2:59 pm

The measured surface temperature has been rather stagnant too. The anomaly average (base 1961-90) for April was 1.01°C, only slightly down from March 1.025°C. Here is the graph of the last four years (different base years):

[image]

Really not much down from the 2024 record warm.

David Wojick
Reply to  Nick Stokes
May 7, 2026 5:06 pm

These are not measurements. They are the output of questionable statistical algorithms using local heat contaminated convenience samples as input. Junk science personified.

Mr.
Reply to  David Wojick
May 7, 2026 5:15 pm

Yep.
It’s real X-Files inspired stuff –
“I want to believe!”

Nick Stokes
Reply to  David Wojick
May 7, 2026 5:48 pm

I showed above a graph of the major surface indices, along with satellite UAH and RSS, over the last 4 years on a common anomaly base, 1981-2010. Satellite and surface are all very close. UAH is a little lower, but then, RSS is higher.
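Putting several series "on a common anomaly base", as described here, just means subtracting each series' own mean over the shared base period. A minimal sketch (illustrative values only):

```python
# Sketch: shift an anomaly series onto a new base period by subtracting
# the series' mean over that period. Illustrative only.
def rebase(series, years, new_base=(1981, 2010)):
    base_vals = [a for a, y in zip(series, years)
                 if new_base[0] <= y <= new_base[1]]
    offset = sum(base_vals) / len(base_vals)
    return [a - offset for a in series]

# Toy example: three annual anomalies expressed on some older base
print(rebase([1.0, 2.0, 3.0], [1980, 1990, 2020]))  # -> [-1.0, 0.0, 1.0]
```

Once every dataset is rebased this way, their curves can be overlaid and compared directly.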

Reply to  Nick Stokes
May 8, 2026 7:21 am

And that means what? Show it for the period beginning in 1900.

Reply to  Mark Whitney
May 8, 2026 3:24 pm

Show satellite data starting 1900?

Mr.
Reply to  TheFinalNail
May 8, 2026 6:20 pm

Sure, why not?
Just make some numbers up.

Reply to  TheFinalNail
May 9, 2026 6:37 am

Reading difficulties? “…major surface indices,…”

Perhaps you are unaware that history extends to before 1981.

bdgwx
Reply to  Mark Whitney
May 9, 2026 7:55 am

Not for satellite data. Nick’s graph is a comparison of surface and satellite datasets. It’s not possible to extend that comparison back to 1900.

Reply to  bdgwx
May 9, 2026 8:04 am

Yes, but that’s not my point. I asked what significance the (short) time series has, meaning in the context of the subject of climate, which is, after all, the whole purpose of the discussion here.
It is possible to include the entire temperature history instead of choosing one small bit, ostensibly for the purpose of supporting claims of human interference in the system and Nick’s claim of 2024 being a “record” in terms of warmth.

bdgwx
Reply to  Mark Whitney
May 9, 2026 8:24 am

Nick isn’t claiming human interference here. He is claiming that the surface and satellite datasets are similar. He is essentially testing the hypothesis that the surface datasets will show something substantially different than satellite datasets. As you can see the hypothesis is false.

I’m not suggesting we shouldn’t discuss temperatures back to 1900. We should. But that’s a different conversation and one in which we’ll have to forgo consideration of satellite data since it didn’t exist that far back.

bdgwx
Reply to  David Wojick
May 9, 2026 6:38 am

These are not measurements.

[JCGM GUM-6:2020] on Developing and using measurement models says it is.

Reply to  bdgwx
May 9, 2026 6:54 am

Only for repeatability conditions — the same quantity measured with the same apparatus!

Reply to  karlomonte
May 9, 2026 10:01 am

The same quantity here is the regional weather, which several independent thermometers are measuring.

Reply to  Eclang
May 9, 2026 10:49 am

To which region do you refer?

Air temperature measurements are a time series, there is exactly one opportunity to measure a single temperature at a single location when it is gone forever.

Reply to  karlomonte
May 9, 2026 1:33 pm

Most regions. Exceptions are remote areas of the planet (e.g., the High Arctic and Antarctic, the Indian Ocean, etc.)

Air temperature measurements are a time series, there is exactly one opportunity to measure a single temperature at a single location when it is gone forever.”

But even at a single location, multiple thermometers can measure temperature simultaneously, giving you repeated observations rather than just one opportunity.

Reply to  Eclang
May 9, 2026 3:46 pm

But even at a single location, multiple thermometers can measure temperature simultaneously, giving you repeated observations rather than just one opportunity.

Two thermometers, maybe, doubling the cost of the instrumentation for very little gain (other than failure redundancy, perhaps).

And still, the next instant of time, that temperature is gone forever.

Usually it's just the opposite — Fake Data inserted into nonexistent sites.

Reply to  karlomonte
May 10, 2026 5:54 am

The measurement uncertainty of each thermometer STILL ADDS. Uncertainty *always* adds for multiple measurement devices. You simply cannot reduce uncertainty by using different measurement devices.

Uncertainty only cancels when measuring the same thing multiple times under the same environment using the same device – and even then certain conditions have to be confirmed before assuming cancellation, such as a Gaussian distribution of measurement values.

bdgwx
Reply to  karlomonte
May 9, 2026 10:42 am

Saying it doesn’t make it true. The irony is that the majority of the examples in the GUM are of measurements that are themselves dependent upon measurements of different things. And the majority of those require the use of a different apparatus. Literally the first example (pg. 3) in the GUM is combining measurements of two different things to compute a third measurement. And the second example (pg. 4) combines measurements requiring the use of a different apparatus. The examples expand from there and get arbitrarily more complex. So your statement here is patently wrong.

Reply to  bdgwx
May 9, 2026 10:53 am

This is just your usual sophistry and abuse of the GUM to justify ignoring variance.

And you continue to refuse to understand or acknowledge that an air temperature can only be measured once, averaging multiple measurements of the same quantity is impossible.

And again, the average formula is NOT a “measurement model”, your need for it to be so notwithstanding.

bdgwx
Reply to  karlomonte
May 9, 2026 12:47 pm

This is just your usual sophistry and abuse of the GUM to justify ignoring variance.

It’s me pointing out that your statements are contradictory to what the GUM actually says.

And you continue to refuse to understand or acknowledge that an air temperature can only be measured once, averaging multiple measurements of the same quantity is impossible.

The reason I refuse to acknowledge your position here is because it's demonstrably wrong according to the GUM.

And again, the average formula is NOT a “measurement model”, your need for it to be so notwithstanding.

Which you keep saying. Show me exactly where the GUM prohibits y = f(x1, x2, …, xN) = Σ[x: {1 to N}] / N.

Don’t deflect. Don’t divert. Don’t post irrelevant content from the GUM that has nothing to do with the topic.

Post the exact text that says that equation, or the operations used in that equation, are prohibited when defining y.
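For readers following the notation: the equation under dispute, y = f(x1, …, xN) = Σxi/N, together with the GUM's law of propagation of uncertainty for uncorrelated inputs, can be written out as follows. This is a neutral illustration of the formulas themselves, not a ruling on whether they apply here:

```python
import math

# The disputed model, written out:
#   y = f(x1, ..., xN) = (x1 + ... + xN) / N
# and the GUM law of propagation for uncorrelated inputs:
#   u(y)^2 = sum_i (df/dxi)^2 * u(xi)^2, where df/dxi = 1/N for a mean.
def mean_model(x):
    return sum(x) / len(x)

def propagated_u(u):
    """u[i]: standard uncertainty of each input; returns u(y)."""
    n = len(u)
    return math.sqrt(sum(ui ** 2 for ui in u)) / n

print(mean_model([10.0, 12.0, 14.0]))     # -> 12.0
print(round(propagated_u([0.5] * 4), 2))  # -> 0.25
```

Whether the inputs really are uncorrelated, and whether the mean is a legitimate measurand, is exactly what the thread is arguing about.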

Reply to  bdgwx
May 9, 2026 4:01 pm

3.3.2 In practice, there are many possible sources of uncertainty in a measurement, including: 

a through i, all ignored by climatology, and especially:

j — variations in repeated observations of the measurand under apparently identical conditions. 

You ignore variations under differing conditions!

You have never explained how you get a Type A uncertainty from measurements of differing conditions!

You have never shown why it is acceptable to ignore the uncertainty contributions of your fav “measurement model”:

y = f(x1, x2, …, xN) = Σ[x: {1 to N}] / N.

4.1.3 is not your friend, either:

4.1.3 The set of input quantities X1, X2, …, XN may be categorized as:

⎯ quantities whose values and uncertainties are directly determined in the current measurement. These values and uncertainties may be obtained from, for example, a single observation, repeated observations, or judgement based on experience, and may involve the determination of corrections to instrument readings and corrections for influence quantities, such as ambient temperature, barometric pressure, and humidity;

⎯ quantities whose values and uncertainties are brought into the measurement from external sources, such as quantities associated with calibrated measurement standards, certified reference materials, and reference data obtained from handbooks. 

You just skip ahead to 4.1.4 and hope it applies to your abuse of the GUM and glomming together a bazillion different thermometers, then ignore 4.1.5.

bdgwx
Reply to  karlomonte
May 9, 2026 6:36 pm

None of 4.1.3, 4.1.4, or 4.1.5 say that you cannot define y = f(x1, x2, …, xN) = Σ[x: {1 to N}] / N. Nor do they insinuate, imply, or even hint that it is prohibited.

The rest of your comment has no bearing whatsoever on anything I said. You’re just randomly picking sections of the GUM to deflect and divert away from the fact that nowhere in any of the GUM documents does it prohibit a measurement model from computing an average of the input quantities.

Reply to  bdgwx
May 10, 2026 5:48 am

None of 4.1.3, 4.1.4, or 4.1.5 say that you cannot define y = f(x1, x2, …, xN) = Σ[x: {1 to N}] / N.”

The GUM defines that this ONLY APPLIES TO A SINGLE MEASURAND!

“4.1.1 In most cases, a measurand Y”

Measurand – singular!

4.1.2 “The input quantities X1, X2, …, XN upon which the output quantity Y

Y – singular measurand!

“4.1.4 An estimate of the measurand Y, denoted by y, is obtained from Equation (1) using input estimates x1, x2, …, xN for the values of the N quantities X1, X2, …, XN.”

measurand – SINGULAR

x1, x2, etc are MEASUREMENT estimates for the Singular measurand.

The average is *NOT* a measurement. It is *NOT* an input quantity for anything. It is *NOT* a measurement of multiple different things.

You have *NEVER* understood what the values X/x and Y/y stand for in the GUM. It’s because you keep cherry picking from the GUM instead of actually reading it for meaning and context!

Again, x1/x2/etc are input quantity ESTIMATES for a singular measurand. You are trying to extend it to mean a Y value estimate for *multiple* measurands.

READ THE GUM. STOP CHERRY PICKING!

Reply to  bdgwx
May 10, 2026 7:27 am

You’re just randomly picking sections of the GUM to deflect and divert away

Projection mode?

from the fact that nowhere in any of the GUM documents does it prohibit a measurement model from computing an average of the input quantities.

The GUM is a standard for EXPRESSING uncertainty, it is not immune to GIGO abuse — Garbage In, Garbage Out.

Reply to  karlomonte
May 10, 2026 11:20 am

from the fact that nowhere in any of the GUM documents does it prohibit a measurement model from computing an average of the input quantities.

Actually it does.

From JCGM 200-2012:

2.48 measurement model (model of measurement, model): mathematical relation among all quantities known to be involved in a measurement.

NOTE 1 A general form of a measurement model is the equation h(Y, X1, …, Xn) = 0, where Y, the output quantity in the measurement model, is the measurand, the quantity value of which is to be inferred from information about input quantities in the measurement model X1, …, Xn.

NOTE 2 In more complex cases where there are two or more output quantities in a measurement model, the measurement model consists of more than one equation.

A mathematical relation among all quantities involved in a measurement.

What is the mathematical relation between the data points in a mean? There is none. Each data point is independent from the others. If this wasn’t true then one would need to include a covariance factor in the uncertainty. I have never seen climate science do that.

Reply to  Jim Gorman
May 10, 2026 2:59 pm

What is the mathematical relation between the data points in a mean?

Y = (X1 + X2 + … + Xn) / n

Reply to  Bellman
May 11, 2026 10:04 am

What is the mathematical relation between the data points in a mean?”

Y = (X1 + X2 + … + Xn) / n

What is the physical relationship between X1 and X2 and Xn that allows you to divide each by “n”, the same counting number (not a constant)?

Reply to  Jim Gorman
May 11, 2026 10:37 am

Its the Magic Number!

Reply to  Jim Gorman
May 11, 2026 10:50 am

Who says there has to be any relationship between the different Xs?

Most likely, though, they are related by coming from a particular population, but that has no relevance to the question of propagating measurement uncertainties.

The divide by n is simply what you do to get an average. I’ve no idea why you think you need permission to do it.

Reply to  Bellman
May 11, 2026 11:32 am

“Its the Magic Number!”

Sure is. Any other number would be wrong.

Reply to  karlomonte
May 10, 2026 6:04 am

The “different” things are actually different properties of a SINGULAR measurand. bdgwx simply can’t seem to get that into his head.

He has *never* taken the time to understand that Y/y and X/x are defined as in the GUM. Cherry picking champion!

Reply to  bdgwx
May 10, 2026 6:02 am

Saying it doesn’t make it true. The irony is that the majority of the examples in the GUM are of measurements that are themselves dependent upon measurements of different things.”

Malarkey! You are equivocating again! “Different things” are PROPERTIES OF THE SAME MEASURAND. Such as the length and width of a table top! Two different properties used to calculate the area OF A SINGULAR measurand.

If you measure the length and width of two table tops and get two values, Y1 and Y2, for the areas of each, then if you average those two values YOU ADD THE UNCERTAINTIES OF EACH together to get a total uncertainty. You do *NOT* average the uncertainties – they ADD!

The uncertainties of Y1 and Y2 will either be Type A uncertainties determined by multiple measurements of each individual property added in root-sum-square or a Type B uncertainty.

AGAIN, read the GUM for meaning and context. STOP CHERRYPICKING!
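To make the table-top example concrete: for a single table top with area A = L × W and uncorrelated length/width uncertainties, the root-sum-square combination the comment describes looks like the following (hypothetical dimensions; a sketch only):

```python
import math

# Combined standard uncertainty of a rectangular area A = L * W,
# with uncorrelated length and width uncertainties. For a product,
# the *relative* uncertainties combine in root-sum-square.
def area_uncertainty(L, uL, W, uW):
    A = L * W
    rel = math.sqrt((uL / L) ** 2 + (uW / W) ** 2)
    return A, A * rel

# Hypothetical table top: 2.000 m +/- 0.005 m by 1.000 m +/- 0.005 m
A, uA = area_uncertainty(2.000, 0.005, 1.000, 0.005)
print(round(A, 3), round(uA, 4))  # -> 2.0 0.0112
```

How the uncertainties of two such areas combine when the areas are averaged is exactly what the thread disputes; the sketch covers only the single-measurand case that both sides accept.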

Reply to  Nick Stokes
May 7, 2026 5:18 pm

The record caused by a very strong El Nino.

…. with zero evidence of any human causation except for the highly suspect urban surface sites the measurements were made at.

Reply to  bnice2000
May 7, 2026 8:56 pm

Yet USCRN continues to warm faster than the adjusted nClimDiv…

Reply to  TheFinalNail
May 7, 2026 11:50 pm

It is mathematical idiocy to compare trends between real and fake data!

Reply to  bnice2000
May 8, 2026 3:25 pm

Oh yeah, forgot. NOAA are faking nClimDiv in order to make it warm as fast as USCRN.

Reply to  bnice2000
May 9, 2026 10:05 am

You would know all about idiocy.

Bob B.
Reply to  bnice2000
May 8, 2026 4:00 am

The Hunga Tonga eruption likely contributed as well. But in Nick’s world, CO2 is the driver of everything.

Reply to  Bob B.
May 8, 2026 3:26 pm

The Hunga Tonga eruption likely contributed as well. 

And the evidence for this is….?

Reply to  Nick Stokes
May 8, 2026 7:11 am

Let’s see it from 1930.

Nick Stokes
Reply to  Mark Whitney
May 8, 2026 1:57 pm

OK, but it needs a 12-month running average to reduce the noise:

[image]

Reply to  Nick Stokes
May 8, 2026 10:29 pm

Really not much down from the 2024 record warm.

Reverting to the long term warming average. It’ll be interesting to see if it continues to fall below that long term average for a while.

May 7, 2026 3:25 pm

The global anomalies for 2026 have started off remarkably consistent, with only 0.04C separating all four months. Each month has been either the 5th or 6th warmest.

Ten warmest Aprils since 1978:

Year Anomaly
 2024 0.94
 1998 0.62
 2016 0.61
 2025 0.61
 2026 0.39
 2019 0.32
 2020 0.26
 2022 0.26
 2005 0.20
 2010 0.20
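Rankings like these are just a sort-and-index over same-calendar-month anomalies. A minimal sketch using the April values listed above:

```python
# Rank one month's anomaly among all same-calendar-month values
# (1 = warmest). Values below are the ten April anomalies quoted above.
def rank(value, all_values):
    return sorted(all_values, reverse=True).index(value) + 1

aprils = [0.94, 0.62, 0.61, 0.61, 0.39, 0.32, 0.26, 0.26, 0.20, 0.20]
print(rank(0.39, aprils))  # -> 5
```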
Reply to  Bellman
May 7, 2026 3:29 pm

Here’s my unofficial version of the UAH map.

[image]
Reply to  Bellman
May 7, 2026 3:33 pm

Still a work in progress, but here is my estimate of anomalies for each country, starting with the 10 warmest anomalies.

Kyrgyzstan 2.41
Japan 2.34
Switzerland 2.26
Kazakhstan 2.24
Norway 2.11
France 2.09
Tajikistan 2.04
Luxembourg 1.98
Spain 1.95
Portugal 1.80

and the 10 coldest anomalies:

Belarus -1.80
Ukraine -1.72
Moldova -1.27
Lithuania -1.26
Latvia -1.19
Libya -0.94
Canada -0.89
Syria -0.89
Georgia -0.88
N. Cyprus -0.79

Of course, smaller countries are more likely to have more extreme anomalies, so these rankings don’t mean much.

Reply to  Bellman
May 7, 2026 4:58 pm

In contrast to last month, no individual country had a record hot or cold April.

France and Italy both had their 2nd warmest April in the UAH data set, and several countries had their 3rd warmest – including the USA (the entire country, not just USA48).

Puerto Rico had its equal 5th coldest April, and Belarus its 6th coldest.

Reply to  Bellman
May 8, 2026 9:43 am

But, say Canada at 50-60 degree latitude, covers nearly 1/4 of the planet circumferentially and it is -0.89….so probably more than balances Luxembourg at 1.98….just kidding…

Nick Stokes
Reply to  Bellman
May 7, 2026 3:58 pm

Here is my corresponding map of the surface temperatures:

[image]

Reply to  Nick Stokes
May 7, 2026 5:21 pm

Nick using a reference period that was the COLDEST since 1900..

The period of the GREAT GLOBAL COOLING SCARE

Not fooling anyone.

And GHCN is a totally bogus and unscientific FANTASY fabrication anyway.

Mr.
Reply to  Nick Stokes
May 7, 2026 5:31 pm

AAwww, that’s so pretty, Nick.

(you could lose the black splotches though, I reckon)

How about just a kinda mauve & teal shaded combo?

Nick Stokes
Reply to  Mr.
May 7, 2026 5:56 pm

Do you like this better?

[image]

Reply to  Mr.
May 8, 2026 12:35 am

Nick ate all the blue and green crayons.

Reply to  Nick Stokes
May 7, 2026 10:55 pm

We are doomed !

The-end-si-near
Reply to  Nick Stokes
May 8, 2026 5:31 am

The eastern part of the Pacific has shown cooling over the last 30 years. Your graph indicates warming. I see no white showing that any of the Pacific is cooling.

Reply to  Jim Gorman
May 8, 2026 6:14 am

You do understand that a single month is not a trend?

Here’s the UAH trends over the last 30 years. There’s only a small patch of the pacific that’s shown a negative trend.

[image]
Reply to  Bellman
May 8, 2026 6:15 am

And here’s the same over the last 45 years.

[image]
Reply to  Bellman
May 8, 2026 1:18 pm

1979 was the COLDEST year since the much warmer 1930s-40s.

Thank goodness the planet started to warm out of the Great Global Cooling Scare.

Reply to  bnice2000
May 8, 2026 3:38 pm

1979 was the COLDEST year since the much warmer 1930s-40s.

Nope. In the 1970s only 1977 was warmer, globally, than 1979; 1973 was a tie. The only years warmer than 1979 globally since 1930 were 1941 and 1944.

No year has been colder than 1979 globally since 1985.

Reply to  Bellman
May 7, 2026 5:26 pm

Pretty nondescript, isn’t it. 🙂

All that white area must be caused by CO2 !! 😉

Mr.
Reply to  Bellman
May 7, 2026 5:18 pm

You sure it’s 0.04 degrees, not 0.0324 degrees?
Could be the difference between humanity surviving or disappearing forever.

Phillip Chalmers
May 7, 2026 4:28 pm

Could Dr Spencer, whose corpus is authentic data and not propaganda, please present measurements in absolute units as part of his output.
This “anomaly” nonsense obscures any of our attempts to talk facts, figures and trends to our nearest and dearest and the wider community.
Specifically, what rate of change and the direction of change of the surface temperature of the northern hemisphere and southern hemisphere is becoming apparent. I am looking for data which confirms or refutes the current GSM cycle as “cooling” and whether this one is really going to be a minimum comparable to previous quite cold spells like the Maunder Minimum.

Reply to  Phillip Chalmers
May 7, 2026 8:59 pm

The trend in the absolute data is identical to the trend in the anomaly data. Do you understand what anomalies are?

Reply to  TheFinalNail
May 8, 2026 4:23 am

You seem to be knowledgeable enough to criticize others. Why don’t you post the experimental standard deviation of the means, the k factor used, and the degrees of freedom for the measurement uncertainty in these temperatures? Include both the anomalies and the absolute temperature uncertainties.

Reply to  Jim Gorman
May 8, 2026 3:40 pm

Why doesn’t Roy Spencer post the standard deviation of the means and the monthly measurement uncertainties? It’s UAH’s data, not mine.

May 7, 2026 4:36 pm

To Roy W. Spencer, Ph. D., I do appreciate you releasing these monthly UAH data plots and data tables to the public . . . but PLEASE, PLEASE, PLEASE insert a disclaimer to the effect that “reporting a satellite-derived temperature anomaly average to a precision of +/- 0.01 C for any specified geographic area and/or any averaged time interval is a MATHEMATICAL artifact only and is not scientifically justified.”

Reply to  ToldYouSo
May 7, 2026 9:05 pm

Funny how UAH data becomes “scientifically unjustified” when it confirms continued warming. For years here it was championed as the ‘gold standard’. See Monckton et al. Not a word about ‘mathematical artifacts’ when a minute several-year cooling trend could be scratched out of it.

Reply to  TheFinalNail
May 7, 2026 11:46 pm

UAH does not confirm continued warming

It confirms warming ONLY at strong El Nino events.

There is no “CO2 warming” signature at all in the UAH data.

Reply to  bnice2000
May 8, 2026 7:25 am

“UAH does not confirm continued warming”

Apparently you missed this statement by Dr. Spencer that is immediately below the first graph, UAH GLAT temperature history, in his above article:

“The Version 6.1 global area-averaged linear temperature trend (January 1979 through April 2026) remains at +0.16 deg. C/decade (+0.22 C/decade over land, +0.13 C/decade over oceans).
(my bold emphasis added)

That would be . . . let’s see . . . yeah, 47+ years of “continued warming”.

Reading comprehension 101.

Victor
Reply to  ToldYouSo
May 8, 2026 10:17 am

What is the linear temperature trend from January 2024 to the present?
Is it a new temperature trend from January 2024?

Reply to  Victor
May 8, 2026 3:43 pm

Yeah, let’s start the data at the peak of the last El Nino warming and then claim there has been ‘global cooling’!

I think even the regulars here have cottoned on to the silliness of this – eventually!

Victor
Reply to  TheFinalNail
May 8, 2026 8:26 pm

How is the Earth’s decreasing average temperature from 2024 Apr +0.94 to 2026 Apr +0.39 explained in the Earth’s energy balance diagram?
0.39-0.94=-0.55 degrees Celsius.

Where has the temperature decrease of -0.55 degrees Celsius gone in the Earth’s energy balance diagram?

When the Earth’s average temperature is decreasing, there is less incoming energy and more outgoing energy in the Earth’s energy balance diagram.

Earth Energy Imbalance: The Sun versus CO2

https://wattsupwiththat.com/2026/04/24/earth-energy-imbalance-the-sun-versus-co2/

bdgwx
Reply to  Victor
May 9, 2026 7:47 am

Where has the temperature decrease of -0.55 degrees Celsius gone in the Earth’s energy balance diagram?

It isn’t captured in most of the energy balance models per se, because they typically use long averaging periods. However, if we were to apply the same 3-layer model concept to only the period from 2024/04 to 2026/04, it would be captured as a decrease of the components going into layer 2 (the atmosphere), an increase of the components coming out of it, or a combination thereof.

When the Earth’s average temperature is decreasing, there is less incoming energy and more outgoing energy in the Earth’s energy balance diagram.

Yes and no. The incoming and outgoing energy at layer 3 (TOA) could be unchanged while layer 1 (surface) and layer 2 (atmosphere) adjust to reduce the net flow of energy in layer 2. I will say layer 3 typically does see a net gain/loss when responding to El Niño/La Niña.

Victor
Reply to  bdgwx
May 9, 2026 11:54 am

Do you have any examples of El Niño/La Niña causing similar global temperature decreases in the past?

There is a weak correlation between the SC25 MGII index and the Earth’s mean temperature. Is SC25 the cause of the Earth’s decreasing mean temperature?

Solar activity increased through SC25, coinciding with a period of elevated temperatures, but daily correlations are weak/noisy.

[image]

bdgwx
Reply to  Victor
May 9, 2026 1:09 pm

Do you have any examples of El Niño/La Niña causing similar global temperature decreases in the past?

1998, 2010, and 2016 have magnitudes similar to 2024.

Is SC25 the cause of the Earth’s decreasing mean temperature?

Yes; partly anyway. My model says the SC25 contribution peaked at +0.09 C in 2025/04 and is now at +0.03 C in 2026/04 for an influence of -0.06 C.

Reply to  ToldYouSo
May 8, 2026 1:15 pm

Perhaps you don’t understand that the warming has ONLY occurred at very specific times, i.e., at El Nino events.

You can put a “linear trend” through anything… it doesn’t mean the data is actually linear.

Mathematical comprehension 101!

[image]
Reply to  bnice2000
May 8, 2026 3:46 pm

Perhaps you don’t understand that the ENSO system is an ‘oscillation’ (what the ‘O’ in the acronym stands for).

Oscillations do not produce heat; they just move heat around within the system. Otherwise it would be a ‘forcing’, not an oscillation… (ENSF??)

Reply to  TheFinalNail
May 8, 2026 4:04 am

Funny how you attribute the opinion of one person to everyone who visits WUWT.

Stereotype much?

Reply to  TheFinalNail
May 8, 2026 4:26 am

Funny how UAH data becomes “scientifically unjustified”

Scientifically justified measurements include things like uncertainty budgets, propagated uncertainty, standard deviations, k factors, degrees of freedom. Would you take a medicine or allow a treatment that did not disclose these items in the studies that were done for their approval? Same thing, different verse.

Reply to  Jim Gorman
May 8, 2026 7:34 am

“Would you take a medicine or allow a treatment that did not disclose . . .”

You mean something like what happened with the mRNA vaccines against COVID-19 administered to 250+ million American citizens (on advice of the FDA, CDC, NIAID (Dr. Fauci), and White House)?

ROTFLMAO!

Reply to  ToldYouSo
May 8, 2026 8:35 am

You didn’t answer the question.

Reply to  Jim Gorman
May 8, 2026 10:18 am

Answer to your (rhetorical?) question: Hell, no.

But then again, things like uncertainty budgets, propagated uncertainty, standard deviations, k factors, and degrees of freedom were never obtained, let alone disclosed, from “studies” done in advance of the rushed-to-implementation mRNA vaccines for COVID-19.

Now, please answer my direct question to you asked in my previous response above.

Reply to  TheFinalNail
May 8, 2026 7:17 am

I did not comment that UAH GLAT data trending is “scientifically unjustified”.

I specifically pointed out that reporting UAH temperature anomaly data to two decimal places is scientifically unjustified.

Reading comprehension 101.

bdgwx
Reply to  ToldYouSo
May 9, 2026 7:29 am

I specifically pointed out that reporting UAH temperature anomaly data to two decimal places is scientifically unjustified.

That’s actually not true.

According to [JCGM 100:2008] you report the measurement with the same number of digits as the uncertainty. UAH reports their uncertainty with 2 significant digits [Christy et al. 2006]. And their use of 2 significant digits for the uncertainty is consistent with rules set forth in the GUM.
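The reporting convention cited here (round the value so its last digit lines up with a 2-significant-digit uncertainty) can be sketched as follows. This is an illustration of the rule, not UAH's actual code:

```python
import math

# Sketch: round a value so its least significant digit matches an
# uncertainty stated to 2 significant digits (a common GUM-style
# reporting convention).
def report(value, u):
    # decimal exponent of the uncertainty's leading digit
    exp = math.floor(math.log10(abs(u)))
    digits = -(exp - 1)  # keep 2 significant digits of u
    return round(value, digits), round(u, digits)

print(report(0.39214, 0.013))  # -> (0.392, 0.013)
```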

Reply to  bdgwx
May 9, 2026 8:37 am

From your second paragraph:

“. . . you report the measurement . . .”

You obviously have confused a “measurement” with the clearly-stated-by-UAH-reporting of averages of measurements. The two are not considered to be the same by most people familiar with data handling.

bdgwx
Reply to  ToldYouSo
May 9, 2026 8:47 am

[JCGM GUM-6:2020] considers averages of measurements to be a measurement itself.

Reply to  bdgwx
May 9, 2026 10:55 am

See above, please stop abusing the GUM.

Reply to  bdgwx
May 9, 2026 2:45 pm

Which is an excellent reason to not put too much credibility toward that document!

For those involved with practical data analysis and reporting, it is common knowledge that AVERAGING a set of measurements of a physical parameter obtained over a range of time can—indeed almost always does—increase the uncertainty of the average value compared to the uncertainty of any single measurement value within that data set. This is caused by unavoidable factors such as instrumentation (calibration) drift, irreproducibility of consecutive measurements made at high precision, and inherently stochastic temporal and/or spatial and/or amplitude variations in the parameter that is being measured empirically.

There are good reasons to not blindly follow JCGM GUM, especially since GUM is self-defined as a “Guide”, not a rule book.

bdgwx
Reply to  ToldYouSo
May 9, 2026 4:53 pm

Which is an excellent reason to not put too much credibility toward that document!

You don’t think the Joint Committee for Guides in Metrology chaired by the International Bureau of Weights and Measures with partners including NIST, UKAS, ISO, etc. is a credible source?

For those involved with practical data analysis and reporting, it is common knowledge that AVERAGING a set of measurements of a physical parameter obtained over a range of time can—indeed almost always does—increase the uncertainty of the average value compared to the uncertainty of any single measurement value within that data set.

That’s just patently false. According to the law of propagation of uncertainty it is mathematically impossible for averaging to yield an uncertainty higher than that of the individual values. And in almost all cases it will be lower.

If you want I can walk you through the mathematical proof.

You can also verify this with the NIST uncertainty machine.
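For the uncorrelated case that the propagation formula covers, the claim is easy to check with a quick Monte Carlo sketch (illustrative numbers only; this assumes independent, identically distributed errors, which is exactly the assumption disputed elsewhere in this thread):

```python
import numpy as np

rng = np.random.default_rng(42)

# n independent measurements of the same quantity, each with
# standard uncertainty u = 0.5 (illustrative numbers)
n, u, true_value = 10, 0.5, 20.0
samples = rng.normal(true_value, u, size=(100_000, n))

u_single = samples[:, 0].std()        # spread of one measurement: ~0.5
u_mean = samples.mean(axis=1).std()   # spread of the 10-value mean: ~0.5/sqrt(10)

print(u_single, u_mean)
```

For uncorrelated inputs the mean’s uncertainty comes out near 0.5/√10 ≈ 0.16; correlated errors (e.g. a shared calibration drift) would not shrink this way.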

Reply to  bdgwx
May 10, 2026 7:30 am

And in almost all cases it will be lower.

Something you desperately need to be true.

It isn’t.

Reply to  bdgwx
May 10, 2026 7:50 am

“According to the law of propagation of uncertainty it is mathematically impossible for averaging to yield an uncertainty higher than that of the individual values. And in almost all cases it will be lower.”

That statement is visually falsified by the first UAH plot of GLAT at the top of the above article. Look at the apparently random data scatter (approximating “uncertainty”) in the red curve designated as the “running, centered 13-month average” . . . it’s on the order of +/- 0.05 C, whereas UAH asserts uncertainty of any individual measurement by them (even acknowledging that such “measurement” is an average obtained over one month) is +/- 0.01 C.

“Test all things; hold fast what is good”
— The Bible, 1 Thessalonians 5:21

bdgwx
Reply to  ToldYouSo
May 12, 2026 3:46 am

You think UAH falsifies the law of propagation of uncertainty?

Reply to  bdgwx
May 10, 2026 11:56 am

[JCGM GUM-6:2020] considers averages of measurements to be a measurement itself.

No, it considers the average of a number of observations of the same input quantity to be an estimate of a measurand. It explicitly defines the series of observations of the input quantity as a set of data in a random variable. The random variable has both a mean and variance.

The observations of a single measurand are not scaled, they exist as an actual estimate of the measurand’s physical property. Remember, you are defining x1, x2, xn as unique values divided by a constant. That requires each observation of a unique input quantity to be scaled by a constant.

The GUM, all of them, define X1 to be the “true value” of an input quantity. Then they define x1 to be the estimate of the value of that input quantity. If you want x1 to actually be x1/n, then you must scale each and every observation by n. Finding the mean of x1 as described in JCGM 100-2008 Section 4.2 requires the use of the actual measurement of the input quantity, not a scaled value.

X1 cannot simultaneously be x1 and x1/n. I would love to see that done in a published paper.

Different input quantities (each being a separate measurand) may be combined using a functional relationship that controls the output value using those inputs. This is your stumbling block, you must define the input quantity’s value and the process for determining it prior to making measurements. If you want them scaled you must define why and how they are to be scaled before the first measurement is taken.

Reply to  Jim Gorman
May 10, 2026 5:01 pm

The GUM, all of them, define X1 to be the “true value” of an input quantity. Then they define x1 to be the estimate of the value of that input quantity. If you want x1 to actually be x1/n, then you must scale each and every observation by n.

X1 can not simultaneously be x1 and x1/n.

You really don’t understand how functions work, do you.

Reply to  Bellman
May 11, 2026 5:24 am

And you don’t understand what X and x are. They are not measurements of different things. They are measurements of THE SAME THING.

Reply to  Tim Gorman
May 11, 2026 11:54 am

X is a random variable representing the probability of all possible measurements; x is a realization from the variable, that is, it’s a measurement. They also use X to be a measurand. Or as Jim puts it, a “true value” of the measurand, though I doubt the GUM says that.

Reply to  Bellman
May 11, 2026 8:26 am

You really don’t understand how functions work, do you.

You need to reread the GUM. “X1, X2, Xn” are input quantities defined in the measurement model. They are not the estimated values obtained by observations.

The estimated values of each input quantity obtained by actual physical measurements are designated as x1, x2, xn.

From the GUM.

4.1.4 An estimate of the measurand Y, denoted by y, is obtained from Equation (1) using input estimates x1, x2, …, xN for the values of the N quantities X1, X2, …, XN. Thus the output estimate y, which is the result of the measurement, is given by f(x1, x2, …, xN ).

In other words,

Y = f(X1, X2, …, XN). ( The model)

y = f(x1, x2, …, xN). (The estimated)

XN and xn are each independent input quantities having independent evaluations.

Since they are independent, each must be scaled independently. In other words,

Y = f(X1/N, X2/N …, XN/N). (model)

y = f(x1/N, x2/N, …, xN/N). (estimated)

Please EXPLAIN why each physically observed measurement is scaled in the measurement model by the number of input quantities.

I can not remember any measurement model in physics, chemistry, electronics, etc. of a physical measurand ever being scaled by the count of the number of input quantities. The GUM does not show any example of where individual input quantities are scaled using the number of input quantities.

Perhaps you can find one.

Reply to  Jim Gorman
May 11, 2026 9:28 am

“You need to reread the GUM.”

I don’t, because I understand it, as well as how functions work. Your claim was that a function that scaled an input value meant that the input value had to exist in two states. That’s not how functions work. The inputs are unchanged. A function such as y = x/2 does not mean you make x half the size, and therefore x is simultaneously two values. It means that the value of the function is equal to half of x.

Reply to  Bellman
May 11, 2026 3:15 pm

Your claim was that a function that scaled an input value meant that the input value had to exist in two states.

That’s not how functions work

X1, X2, etc. are input quantities and the function defines the relationship between those and the output.

x1, x2, etc. are the actual “estimates” obtained that are observed while making actual physical measurements.

A function such as y = x/2 does not mean you make x half the size, and therefore x is simultaneously two values. It means that the value of the function is equal to half of x.

y = x1 divided by 2 requires the measurement model to be Y = X1/2. You can’t say Y = X1 = x1/2

That means you must have a physical reason when creating the measurement model that defines why you are dividing a physical thermometer reading, a length, a mass, etc., by 2 (or any counting number). Tell us what physical reason exists for dividing each measurement by “n”.

A physical functional relationship is defined through experimentation, not by just saying, well, I can create any old function that I please to give me what I want. Your measurement model must have a statement that you have experimented and determined that your function provides an output that is found in the real physical world and reproducible.

Lastly, you and bdgwx think you can reduce uncertainty by a factor of “n”. You can not. x1 has an uncertainty based upon the GUM.

You would like to say that (1/n)(u(xₙ)/n) is the uncertainty. It is not. Since you are dividing, you must use relative uncertainties. That makes each term:

(1/n) · u(xₙ)/(xₙ/n) = (1/n) · n · u(xₙ)/xₙ = u(xₙ)/xₙ

and your counting number disappears. All for nothing.

Reply to  Jim Gorman
May 11, 2026 4:14 pm

y = x1 divided by 2 requires the measurement model to be Y = X1/2. You can’t say Y = X1 = x1/2

Yes. I don’t know why you think that’s a problem.

That means you must have a physical reason when creating the measurement model that defines why you are dividing a physical thermometer reading, a length, a mass, etc., by 2 (or any counting number).

In this case it’s because we are averaging. In another case it might be because we measured a diameter but want the radius.

Tell us what physical reason exists for dividing each measurement by “n”.

Because that’s how you make an average.

A physical functional relationship is defined through experimentation, not by just saying, well, I can create any old function that I please to give me what I want.

So you won’t accept that the volume of a cylinder is πR²H, unless it’s been determined through experimentation? Using a function to give you what you want is what the measurement model is all about.

Lastly, you and bdgwx think you can reduce uncertainty by a factor of “n”. You can not. x1 has an uncertainty based upon the GUM.

When all else fails, just stamp your feet and cry “you can’t do that”.

You would like to say that (1/n)(u(xₙ)/n) is the uncertainty.

No it isn’t. The uncertainty of a single input scaled by 1/n is (1/n)u(xₙ). Your obsession with getting a simple formula wrong in so many different ways is baffling.

“That makes each term:
(1/n) · u(xₙ)/(xₙ/n) = (1/n) · n · u(xₙ)/xₙ = u(xₙ)/xₙ”

As I said, you just don’t understand how functions work. At this point you just seem to be typing randomly. I’m guessing you are making the same mistake as Tim last time, and thinking equation 10 is using relative uncertainties.

Reply to  ToldYouSo
May 8, 2026 8:45 am

The real problem is that this is a time series. Has seasonality been dealt with? Has auto-correlation been dealt with? Has a SARIMA model been used to remove these factors?

Has the fact that diurnal ranges have changed due to night temps increasing created a change in standard deviations that results in a false trend?

Reply to  Jim Gorman
May 8, 2026 10:32 am

Please direct those questions to the person/party responsible for analyzing and reporting the UAH satellite temperature data.

Personally, I don’t see that seasonality, auto-correlation, SARIMA (especially since it is used for forecasting), and diurnal range changes have any meaningful bearing on the plotted data points and associated LS regression curve fit over the 47+ years of UAH satellite data as presented in the first graph in the above article.

Is this an attempt to throw some out-of-left-field concerns “against the wall” to see what sticks?

Reply to  ToldYouSo
May 8, 2026 11:13 am

So you think there is no auto-correlation involved? How about seasonal changes.

Do you believe that UAH doesn’t put daytime and nighttime temperatures together in determining an average?

You say SARIMA is used for forecasting. That what regressions in climate science are used for also.
Linear trends from regressions take nothing into account.

Remember, linear regression is designed to show linearity between an independent variable and a dependent variable such that an equation of y = mx + b will be a functional relationship allowing the calculation of a value for “y” using a value for “x”.

Linear regressions for a time series do not provide a functional relationship. The linear regression will never predict when a change like ENSO will occur, nor its effects.

The other thing SARIMA does is provide a proper procedure for decaying shocks such as ENSO over a short time. Otherwise, a shock will continue being propagated into future values.

Reply to  Jim Gorman
May 8, 2026 12:24 pm

“How about seasonal changes.”

Since the UAH graph in question is a plot of monthly, global-averaged data, and since the seasons in Earth’s northern hemisphere are 180-degrees out of phase with those in the southern hemisphere, you’ll have to tell me exactly HOW and WHY seasonal changes should be “adjusted” in the plotted data points.

“Linear trends from regressions take nothing into account.”

Hah! A least-squares linear regression does indeed take the degree of data scatter into account, by the degree-of-fit of the linear trend (y=mx + b) to the full data set, commonly known by anyone familiar with the mathematical procedure for doing such as the “coefficient of determination”, assigned the symbol R^2 or r^2, and varying in value from zero to 1.000.

R^2 is a statistical measure in least squares regression that indicates how well the regression model fits the observed data. That is important to know.
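A minimal sketch of that degree-of-fit computation, using synthetic data rather than the UAH series (illustrative slope and noise):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly series: a linear trend plus random scatter
x = np.arange(120, dtype=float)                  # months
y = 0.0013 * x + rng.normal(0.0, 0.1, x.size)    # ~0.16 C/decade + noise

m, b = np.polyfit(x, y, 1)           # least-squares fit of y = mx + b
y_hat = m * x + b
ss_res = np.sum((y - y_hat) ** 2)    # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot           # coefficient of determination, 0..1

print(m, r2)
```

A small r2 with a nonzero slope means the trend explains only a little of the scatter, which is exactly the distinction being argued here.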

bdgwx
Reply to  ToldYouSo
May 9, 2026 7:23 am

The global average temperature does have seasonality to it. Here are UAH’s monthly anomaly baselines.

Month  1991-2020 (K)
1      263.18
2      263.27
3      263.43
4      263.84
5      264.45
6      265.10
7      265.42
8      265.23
9      264.64
10     263.95
11     263.41
12     263.19

The reason UAH (and others) remove the seasonality component is to isolate the changes in temperature not caused by Earth’s orbital cycle. The way they remove the seasonality component is by subtracting the above baselines from the measured value for the same month pairs.

Reply to  bdgwx
May 9, 2026 3:17 pm

“The reason UAH (and others) remove the seasonality component is to isolate the changes in temperature not caused by Earth’s orbital cycle.”

“. . . isolate the changes not caused by Earth’s orbital cycle.” Huh? Do you really think that adjusting the monthly data using a single “baseline” value for each month—each value being an average obtained over a 29-year span—really does that??? Doing that necessarily assumes those 29 years of past “seasonality” is an accurate representation of the “seasonality” of Earth for the last 6 or so years as well as going forward. Problematic, that.

Also, changes in solar insolation due to the 11-year (more or less) sunspot cycle are not caused by Earth’s orbital cycle and are not properly normalized by using anomaly baselines of 29-year averages. And, should or should not the anomaly baselines themselves be “adjusted” to remove ENSO and AMOC variations, both of which are not caused by Earth’s orbital cycle?

Reply to  ToldYouSo
May 9, 2026 3:34 pm

The whole use of “anomalies” is based on assuming that the anomalies don’t inherit the uncertainty of the component values.

If the current values have variance (and they *do* have such) then using a CONSTANT to scale the current values into ANOMALIES is a linear transformation of the current data using a constant. The variance of the current data is inherited by the anomaly, i.e. the standard deviation of the original data and the anomaly data is the *same*.

Climate science is based on the assumption that anomalies have less uncertainty (i.e. variance) than the absolute values. All they have are “smaller” absolute values – but the standard deviation stays the same. Meaning the variance stays the same and so does the uncertainty of the data.

Climate “science” is just riddled with such assumptions.

bdgwx
Reply to  ToldYouSo
May 9, 2026 4:41 pm

Do you really think that adjusting the monthly data using a single “baseline” value for each month—each value being an average obtained over a 29-year span—really does that???

First…it’s not a single baseline. It’s 12; one for each month of the year. Second…it’s a 30-year average. Anyway, the answer to your question is yes.

And, should or should not the anomaly baselines themselves be “adjusted” to remove ENSO and AMOC variations, both of which are not caused by Earth’s orbital cycle?

Of course not. Variations caused by ENSO and/or AMOC would be something we would want to see in the data.

Reply to  ToldYouSo
May 9, 2026 7:36 pm

If it would help, here’s a comparison of monthly anomalies and absolute temperatures for UAH. They are both on the same scale, but absolute temperatures have been shifted.

I see clear seasonality in one, and not the other.

[Image: 20260509wuwt3]
Reply to  Bellman
May 10, 2026 8:04 am

Looking at the variation in the maximums of your right-hand graph compared to the variation of maximums of your left-hand graph—and doing likewise comparisons for the minimums of both graphs—I see clear seasonality in both graphs.

Reply to  ToldYouSo
May 10, 2026 4:22 pm

I see clear seasonality in both graphs.

Then you must have very imaginative eyes.

Decomposing the two time series suggests for temperatures a seasonal variation of almost ±1°C, but for anomalies, less than ±0.02°C. Any seasonality in the anomalies is just noise.

Here’s the decomposition for anomalies.

[Image: 20260511wuwt1]
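Since the decomposition itself is in the attached images, here is a sketch of the classical month-by-month decomposition step on synthetic data (assumed: a ±1 annual cycle plus a slight trend and noise, not the UAH series):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 20-year monthly series: +/-1 annual cycle, slight trend, noise
t = np.arange(240)
month = t % 12
series = np.sin(2 * np.pi * t / 12) + 0.001 * t + rng.normal(0, 0.05, t.size)

# Classical seasonal component: each calendar month's mean minus overall mean
seasonal = np.array([series[month == m].mean() for m in range(12)]) - series.mean()

amplitude = np.ptp(seasonal) / 2      # recovers roughly the +/-1 cycle
anomalies = series - seasonal[month]  # subtracting it leaves trend + noise

print(amplitude)
```

The anomaly series built this way has, by construction, essentially no remaining calendar-month offsets, which is the contrast the two plots show.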
Reply to  Bellman
May 10, 2026 4:31 pm

And here’s the decomposition for temperatures. Compare the scales for seasonality.

[Image: 20260511wuwt2]
Reply to  Bellman
May 11, 2026 5:43 am

They both have the same relative uncertainty.

50 +/- 1 has the same relative uncertainty as 1 +/- .02. You don’t give the actual means but I’ll bet they are close to 50 and 1.

Both have a relative uncertainty of 2%.

If seasonal variation appears in the absolute temperature, then it will also appear in the anomaly.

This can be seen by comparing the relative uncertainties. You can’t get rid of the relative uncertainty using a linear transformation by a constant.

Reply to  Tim Gorman
May 11, 2026 10:00 am

What uncertainty? We are talking about seasonality here, not uncertainty. And I’ve no idea where you get values of 50 or 1 here.

“If seasonal variation appears in the absolute temperature, then it will also appear in the anomaly.”

Do you see the same seasonal variation in the graphs? If you do, may I suggest you get your eyes tested.

Again you seem to be confusing seasonal variation with natural variation that might change slightly on a seasonal basis.

“You can’t get rid of the relative uncertainty using a linear transformation by a constant.”

Again, we are not talking about uncertainty, but seasonality.

Reply to  Bellman
May 11, 2026 7:45 am

Decomposing the two time series suggests for temperatures a seasonal variation of almost ±1°C, but for anomalies, less than ±0.02°C.

You are merely modifying the original value by reducing its absolute value.

The ±1°C is the variance in the data. It is determined from the data points in a random variable. Similarly, the baseline has a variance determined from the data points in another random variable.

Subtracting the means of two random variables requires the variances to add. Consequently, the value of the anomaly’s variance is > ±1°C.

You have just thrown that inherited variance away into the trash and instead found the variance of very small numbers and declared that to be the total variance.

Reply to  Jim Gorman
May 11, 2026 9:20 am

“You are merely modifying the original value by reducing its absolute value.”

Yes, that’s the point. Reducing the seasonal variation.

“The ±1°C is the variance in the data.”

It isn’t.

“Subtracting the means of two random variables requires the variances to add.”

This discussion is pointless unless you state what variance you are talking about. You and Tim keep switching the argument. The variance in the base value is irrelevant to the variance in the monthly anomalies because, as Tim keeps pointing out, you are just subtracting a constant.

If you mean uncertainty, then yes, the uncertainty of the anomaly is the uncertainty of the monthly temperature plus the uncertainty of the base value added in quadrature. But the monthly variance is not uncertainty. The variance does contribute to the uncertainty of the trend, but that’s already present in the anomalies.

“Consequently, the value of the anomaly’s variance is > ±1°C.”

No idea how you get that, but as you refuse to show your workings, I doubt you understand it either. Again, talk to Tim. Even he understands that subtracting a constant doesn’t change the variance.

“…found the variance of very small numbers and declared that to be be the total variance.”

You don’t need to worry about what’s been thrown away. The variance of the data is the variance of the data. But regardless, the seasonal component is not the variance; it’s the monthly seasonal offset that can be extracted from the data.

Reply to  Jim Gorman
May 8, 2026 11:08 am

“Has seasonality been dealt with?”

What seasonality? You keep saying this every month, but never give any indication you understand what it means. If you disagree with Spencer’s linear regression, explain how you would do it differently.

And why did you never raise any of these objections all the time you were praising Monckton’s pause “analysis”? All you ever did was attack me for pointing out the uncertainty in the trends.

Reply to  Bellman
May 8, 2026 1:24 pm

What seasonality?

Seasonality can introduce spurious trends due to variance changes, that is, difference in winter vs summer in different hemispheres. Trends reflect a change in variance rather than an actual increase.

It is why ag science was the group to show that nighttime temperatures were increasing rather than climate science. Climate science ignored the actual reason by concentrating on a daily average and regressions that showed increasing temperature. CAGW!

explain how you would do it differently.

A linear regression is used to develop a functional relationship between an independent variable (x) and a dependent variable (y). This functional relationship can then be used to predict values not collected in the original data set. As it stands right now, any linear regression of temperature has two problems.

One, it does not predict when step changes will occur, nor why nor does it predict when pauses will occur.

Two, extending the linear regression into the future results in temperatures that are unrealistic. Remember, CO2 is not the independent variable, time is. So as time increases far into the future so will temperature.

Why do you think nowhere in climate science does anyone map CO2 vs temperature and do a linear regression? You have the data, why don’t you do the regression?

I have done this on a number of stations while teaching myself Python. SARIMA stands for Seasonal AutoRegressive Integrated Moving Average. It has been developed for time series analysis. It allows one to have a statistically constant baseline for a model. One can then develop variables that provide the actual values and apply them to the base. I assure you that CO2 concentrations alone do not give adequate projections throughout the complete time line.
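One core step of that approach, seasonal differencing, can be sketched on synthetic data (not actual station data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly series: annual cycle + drift + noise (illustrative)
t = np.arange(240)
y = np.sin(2 * np.pi * t / 12) + 0.002 * t + rng.normal(0, 0.05, t.size)

# Seasonal differencing, the "S" step of a SARIMA model:
# y_t - y_{t-12} cancels a stable annual cycle exactly,
# leaving only the year-on-year change (drift + noise)
dy = y[12:] - y[:-12]

print(dy.mean())   # close to 12 * 0.002 = 0.024 per year
```

A full SARIMA fit would then model the autoregressive and moving-average structure of the differenced series; this only illustrates why the differencing removes the seasonal cycle.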

This started me down the path of determining how the intermediate land/oceans warm from the sun, and then release their energy into the atmosphere. Using averages quickly leads you astray when doing the thermodynamic analysis. For land, the soil temperature distribution is something that is not amenable to arithmetic averaging, and I’m sure the atmosphere is no different.

I seldom look at temperature trends anymore. The measurement uncertainty is never given appropriately; UHI contamination is rife, along with poor station siting. Data modification in station data, while well intentioned, still poisons the well, and daily averages have already been proven useless in determining the reason for increasing values.

Reply to  Jim Gorman
May 8, 2026 3:48 pm

Seasonality can introduce spurious trends due to variance changes, that is, difference in winter vs summer in different hemispheres. 

The very thing that anomalies remove.

Reply to  TheFinalNail
May 9, 2026 3:53 am

“The very thing that anomalies remove.”

Anomalies remove NOTHING. They only scale the absolute values which is meaningless.

Creating an anomaly is nothing more than a linear transformation using a constant. The standard deviation (variance) stays exactly the same even if the absolute values are scaled. The variance (standard deviation squared) is the metric used for uncertainty. If the variance doesn’t change then neither does the uncertainty of the data being scaled.
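The narrow claim that subtracting a single constant leaves the standard deviation unchanged checks out numerically (illustrative numbers, not temperature data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative "absolute" values and a single constant baseline
absolute = rng.normal(15.0, 2.0, 10_000)
anomalies = absolute - 14.8   # linear transformation by one constant

# The shift moves the mean but leaves the spread untouched
print(absolute.std(), anomalies.std())
```

Note that real anomaly series subtract a different baseline per calendar month, so the pooled month-to-month spread of the series does change even though each single month’s variance is only shifted.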

If the variance is the metric for uncertainty then the mean needs to be a *weighted* mean reflecting the differing uncertainty associated with each. Since winter temperatures typically have a higher variance they also have a larger uncertainty. Their percentage contribution to the mean should be less than the percentage contribution from temps (i.e. summer) with smaller variance.

This is just BASIC statistical analysis – which climate science *always* ignores.

Reply to  TheFinalNail
May 9, 2026 6:43 am

The very thing that anomalies remove.

If you KNOW that, then you should be able to show how anomalies remove seasonality using mathematics.

You show no references to support your assertion. That makes your assertion worthless.

If you knew what you are talking about, you would mention the variances associated with the mean values that make up anomalies. Do those variances between seasons result in spurious trends? Show how you know if that is true or not. Remember, seasons are not correlated between the NH and SH.

From: TSCS_Week5_Trends.pdf

  • In order to do time series analysis we need to distinguish the trend and stationary components, and the appropriate method depends on whether the trend is deterministic or stochastic.
  • If a trend is stochastic, we difference the data to isolate the stationary component. The process is difference-stationary.
  • In the case of a random walk with drift, we have

E(∆yₜ) = E(a₀ + εₜ) = a₀
var(∆yₜ) = E(∆yₜ − a₀)² = E(εₜ)² = σ²
cov(∆yₜ, ∆yₜ₋ₛ) = E(εₜ εₜ₋ₛ) = 0

  • If the trend is deterministic, to isolate the stationary component, we detrend the data by regressing {yt} on a high-order polynomial function of time.
  • The order of the polynomial can be determined by t-tests and F-tests as well as AIC and SBC measures of fit

From: Linear Regression in Time Series: Sources of Spurious Regression | Towards Data Science

It is especially problematic in economics and finance, where many key variables exhibit autocorrelation or serial correlation between adjacent values, particularly if the sampling interval is small, such as a week or a month, leading to misleading conclusions if not handled correctly. 

I hate to be the bearer of bad news, but temperatures are auto-correlated. The temperature of yesterday is a good predictor of today’s temperature, which is a good predictor of tomorrow’s temperature.
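A toy illustration of that persistence, using a synthetic AR(1) series rather than real temperatures (the persistence parameter phi = 0.8 is an assumption for the example):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy AR(1) "temperature" series: each day mostly persists into the next
n, phi = 5000, 0.8
x = np.zeros(n)
for i in range(1, n):
    x[i] = phi * x[i - 1] + rng.normal()

# Sample lag-1 autocorrelation: how well today predicts tomorrow
r1 = np.corrcoef(x[:-1], x[1:])[0, 1]

print(r1)   # close to phi = 0.8
```

Autocorrelation like this is what inflates the apparent significance of ordinary least-squares trends on time series.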

Reply to  Jim Gorman
May 8, 2026 5:19 pm

Seasonality can introduce spurious trends

I asked what seasonality are you talking about, not what effect seasonality can have. I know it can introduce spurious trends – if you remember I tried to explain that to Monckton when he used CET monthly temperatures.

But the point is we are talking about anomalies, not temperatures. There is no noticeable seasonality.

Trends reflect a change in variance rather than an actual increase.

How do you think that works? How does a change in variance change a trend?

It is why ag science was the group to show that nighttime temperatures were increasing rather than climate science.

You’re just making stuff up now aren’t you? Regardless, what has that got to do with seasonality?

A linear regression is used to develop a functional relationship between an independent variable (x) and a dependent variable (y).

Linear regression does not give you a functional relationship. There is always an epsilon term that is not predicted by the independent variables.

And you are again avoiding the question. I asked you explain what analysis you would do on the data.

One, it does not predict when step changes will occur, nor why nor does it predict when pauses will occur.

Why would you expect it to? If you have change points, you don’t have a linear regression over the entire data set. You need to use other techniques to estimate where such a change has happened.

Two, extending the linear regression into the future results in temperatures that are unrealistic.

Which is why you should not extend a linear trend into the future. It’s linear regression 101, that you should not extend the regression outside the data range.

Remember, CO2 is not the independent variable, time is.

CO2 can easily be an independent variable. I’ve shown this to you many times.

Why do you think nowhere in climate science does anyone map CO2 vs temperature and do a linear regression?

I keep showing you my graphs doing just that. E.g.

[Image: 20260509wuwt1]
Reply to  Bellman
May 8, 2026 5:21 pm

I assure you that CO2 concentrations alone do not give adequate projections throughout the complete time line.

You don’t need to “assure” me, just show your workings.

Reply to  Bellman
May 9, 2026 12:12 pm

You don’t need to “assure” me, just show your workings.

You are the one with the trends, show us your predictions. You do know that is the purpose of a regression model for a physical phenomenon. It shouldn’t take much to plug in a doubling of CO2 from 330 ppm (log₂ (330) = 8.4) to 660 ppm (log₂ (660) = 9.4). What is the resultant prediction from your regression equation?

Reply to  Jim Gorman
May 9, 2026 5:06 pm

You are the one with the trends, show us your predictions.

Spencer’s the one giving the linear trends, and Monckton used to when he was posting. The only linear regression I’ve made in these comments was my global map of trends, produced to test your claim about cooling in the Eastern Pacific.

You do know that is the purpose of a regression model for a physical phenomenon.

Nope, I do not know that. In fact, I think the opposite. You should not project a linear regression too far outside its data range.

Reply to  Bellman
May 8, 2026 5:39 pm

Two, extending the linear regression into the future results in temperatures that are unrealistic.

I think your problem is you only see the role of linear regression as predicting what will happen in the future, which is not something you should really be doing.

The main point of linear regression is to identify if correlations exist between different variables, and in the case of a time series, to identify if something has been changing over time, and if so how quickly.

You might be able to use a regression to make a rough estimate of what will happen in the future – but that’s only valid if you can assume that the trend will continue, and that depends on what the reasons are for the change. If you want to estimate how the climate will change over the next century, you have to look at climate models, not just assume a linear trend will continue indefinitely.

Reply to  Bellman
May 9, 2026 12:28 pm

I think your problem is you only see the role of linear regression as predicting what will happen in the future, which is not something you should really be doing.

From: Linear regression | Definition, Formula, & Facts | Britannica

A primary use of the estimated regression equation is to predict the value of the dependent variable when values for the independent variables are given. 

In physical science, experiments are run to gather data that occurs in the dependent variable when changes in the independent variable are made.

Here you are equating CO2 ppm concentration to the independent variable and the anomaly to the dependent variable. This isn’t a statistical analysis, it is determining the physical functional relationship between the two variables. If the relationship is a viable one, then predictions can be made of what the anomaly will be at various CO2 concentrations.

Finally, to be honest, this whole focus on anomalies is worthless. It tells no one what is actually happening at any point on the globe or in time. The whole process is being done to attempt to convince folks that there are places on the globe warming to such an extent that cooler places do not offset the warming. If you want to do something worthwhile, show us WHERE this warming is occurring on a constant basis. Greenland? Antarctica? Africa? Or is well mixed CO2 just wandering around warming here and there until the entire globe is warmer. I’m very interested in where here and there actually are.

Reply to  Jim Gorman
May 9, 2026 5:26 pm

A primary use of the estimated regression equation is to predict the value of the dependent variable when values for the independent variables are given. 

You still don’t get that prediction does not mean predicting the future. It’s always dangerous to predict outside the data range. Look at the example in your source:

For instance, given a patient with a stress test score of 60, the predicted blood pressure is 0.49(60) + 42.3 = 71.7.

That’s in relation to a graph giving a stress test range of 50 – 100, i.e. within the range of the data.

I also think you misunderstand the word “prediction” in this context. The predicted value is not the actual value you will get. It’s what you would expect to get on average. Using this you can compare the actual result against the predicted result. You can use the prediction to see if someone’s blood pressure is unusually high or low.

In physical science, experiments are run to gather data that occurs in the dependent variable when changes in the independent variable are made.

But in this case we are using time as an independent variable, simply to see if temperatures are increasing or decreasing significantly, and if so by how much. Once again, you are hung up on one use of a regression and dismiss any other use as “not physical science”. You have a very blinkered view of science.

This isn’t a statistical analysis, it is determining the physical functional relationship between the two variables.

I’ve no idea how you think you can do that without a statistical analysis. And again, there is not a functional relationship between CO2 and temperature.

If the relationship is a viable one, then predictions can be made of what the anomaly will be at various CO2 concentrations.

How? You keep demanding the impossible, then complaining when nobody does it for you. There are any number of factors that will determine the global temperature in a given year, and the relationships between them are not likely to be linear. If you think it’s possible to predict the exact global temperature for a given year, then why don’t you demonstrate how?

The whole process is being done to attempt to convince folks that there are places on the globe warming to such an extent that cooler places do not offset the warming.

What are you on now? Again, if you have a problem with the UAH data set, take your complaints to Spencer and Christy, before they retire.

If you want to do something worthwhile, show us WHERE this warming is occurring on a constant basis.

Did you notice my global anomaly maps, or the trend maps?

Reply to  Bellman
May 9, 2026 3:56 am

“But the point is we are talking about anomalies, not temperatures. There is no noticeable seasonality.”

This has been pointed out to you multiple times and you just ignore it: Anomalies are linear transformations by a constant. The anomalies inherit the EXACT VARIANCE of the parent.

If temperatures have variances based on seasonality then so will the anomalies generated by linear transformation using a constant.

Reply to  Tim Gorman
May 9, 2026 4:40 am

I don’t ignore what you “point out”, I explain why it’s wrong or irrelevant.

First, we are talking about seasonality, not variance.

Second, you need to state what variance you are talking about. If you mean the variance in monthly values across the year, then anomalies definitely have less variance than temperatures. That’s because you are not transforming by a constant. The value you subtract in January will be different to the value you subtract in July. The variance you see in temperatures across the year is caused by seasonality. Using anomalies removes that seasonality.

If you mean the variance for a specific time of year across the years, then yes, there will be the same variance for all Decembers whether you use anomalies or temperatures. This variance may be different for different months, although the difference isn’t large on a global scale. But the question is, so what? What specific problem do you think that causes, and how could it result in a spurious trend?

Please give a specific example, rather than your usual hand waving.
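A concrete version of the two cases described above can be run in a few lines. This is synthetic data: the 10 C seasonal amplitude and unit noise variance are made-up illustrations, not any real station or satellite record.

```python
import numpy as np

rng = np.random.default_rng(1)
n_years = 30
month = np.tile(np.arange(12), n_years)
seasonal = 10 * np.cos(2 * np.pi * month / 12)   # assumed 10 C seasonal cycle
temps = 15 + seasonal + rng.normal(0, 1, month.size)

# anomaly = temperature minus that calendar month's own long-term mean
clim = np.array([temps[month == m].mean() for m in range(12)])
anoms = temps - clim[month]

# Case 1: variance across ALL months of the year - anomalies are far smaller
print(temps.var(), anoms.var())          # ~51 vs ~1: seasonality removed
# Case 2: variance of ONE calendar month across the years - unchanged,
# because within a single month you really do subtract a constant
print(temps[month == 0].var(), anoms[month == 0].var())
```

Both claims hold at once: taking anomalies removes the variance due to the seasonal cycle, while leaving each individual calendar month’s year-to-year variance exactly as it was.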

Reply to  Bellman
May 9, 2026 3:24 pm

“If you mean the variance in monthly values across the year, then anomalies definitely have less variance than temperatures.”

The variance of each monthly value *should* inherit the added variance of the component daily values: Var_total = Var_1 + Var_2 + … + Var_N

Anomalies obtained from the monthly values and their inherited variances will have the exact same variance.

For the umpteenth time, variance is the uncertainty metric. Uncertainty adds, especially when different things are being measured. You, and climate science as well *ALWAYS* assume that uncertainty is random, Gaussian, and cancels so that monthly values don’t inherit the summed uncertainties of the component daily values! The variance (i.e. the uncertainty) is *NOT* the standard deviation of the monthly values, it is the added variances of the monthly values.

YOU CAN’T JUST IGNORE THE VARIANCE OF THE BASE MEASUREMENTS. You continually say that you don’t assume that measurement uncertainty is random, Gaussian, and cancels but you just proved that you *DO* assume just that! EVERY SINGLE TIME!

“That’s because you are not transforming by a constant.”

You *are* transforming by a constant. The mean value used to form the anomaly *is* a constant. It is used in forming every single anomaly. You may change that constant from decade-to-decade or from one 30-year mean to a different 30-year mean, but the *same* value is used to scale every single measurement over the time period covered. It is *not* a variable. If it were, then it would have to be quoted with an uncertainty, and the variance of the anomaly would be the sum of the uncertainty of the mean value plus the uncertainty of the current temperature. You are caught between the rock and the hard place you always wind up at when trying to justify the belief that anomalies have less uncertainty than the components used to calculate the anomaly.

Reply to  Tim Gorman
May 9, 2026 5:47 pm

“The variance of each monthly value *should* inherit the added variance of the component daily values: Var_total = Var_1 + Var_2 + … + Var_N”

OK, so you don’t understand what a variance is. You’ve demonstrated this enough times, you don’t have to keep repeating it.

What you are describing is what happens when you add multiple random variables. The average of multiple random variables is not a sum. The Var_average will be Var_total / N².
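The difference between the variance of a sum and the variance of a mean is easy to check numerically. The daily variance of 4 and N = 30 below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 30                                        # "days" in a month
trials = 200_000
daily = rng.normal(0, 2, size=(trials, N))    # each day has variance 4

# Variances ADD for a SUM of independent variables: Var_total ~ N * 4 = 120
print(daily.sum(axis=1).var())
# But the MEAN divides the sum by N, which divides the variance by N^2:
# Var_mean ~ N * 4 / N**2 = 4 / 30 ~ 0.133
print(daily.mean(axis=1).var())
```

The additive rule and the 1/N² rule are the same fact seen before and after the division by N; quoting only the first half gives the variance of the wrong quantity.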

Anomalies obtained from the monthly values and their inherited variances will have the exact same variance.

Did you read my comment? That’s what I said, if you are talking about the variance of values for a specific month across the years. But that’s not the same as the variance of all monthly values across the years.

For the umpteenth time, variance is the uncertainty metric.

It isn’t really, it’s standard deviation that’s the metric for uncertainty. But this has nothing to do with the discussion, which was about accounting for seasonality in a linear regression.

random, Gaussian, and cancels

You’ve lost the argument, and the plot, again.

You *are* transforming by a constant.

You think you subtract the same value for January as you do for July?

Reply to  Bellman
May 10, 2026 6:34 am

“The average of multiple random variables is not a sum”

You *have* to add the values of the random variables in order to calculate the average!

“Did you read my comment? That’s what I said, if you are talking about the variance of values for a specific month across the years. But that’s not the same as the variance of all monthly values across the years.”

Each month is linearly transformed by a constant. That constant does not have to be the same for each different month. That does *NOT* change the fact that the anomaly created from the linear transformation inherits the uncertainty of the constant *and* the current data. The standard deviation remains the same no matter how much you scale the absolute value with a constant value.

“It isn’t really, it’s standard deviation that’s the metric for uncertainty.”

Malarkey! Some day you and bdgwx NEED TO READ THE GUM FOR MEANING AND CONTEXT.

From the GUM:
——————————————
3.3.5 The estimated variance u² characterizing an uncertainty component obtained from a Type A evaluation is calculated from series of repeated observations and is the familiar statistically estimated variance s² (see 4.2).
—————————————-(bolding mine, tpg)

The variance of a Gaussian distribution is related to the peakedness index: pi = max-value/sqrt(variance). The larger the variance, the lower the peakedness index. The “hump” around the average value gets flattened as the variance increases. That means the values surrounding the mean get closer to the mean value, which raises the uncertainty of the mean.

This has been explained to you multiple times. Yet you just continue to ignore it. Why?

READ THE GUM FOR MEANING AND CONTEXT. STOP CHERRY PICKING.

“You’ve lost the argument, and the plot, again.”

As usual, you just refuse to actually learn the basics of metrology.

  • Linear transformation of a distribution using a constant does *NOT* change the standard deviation of the distribution
  • Variance *is* the metric for uncertainty
  • Measurement uncertainty ADDS when multiple measurands are involved; it does *not* cancel
  • You can’t create a larger system by combining the intensive properties of multiple measurands
  • READ THE GUM FOR MEANING AND CONTEXT. STOP CHERRY PICKING.

Reply to  Tim Gorman
May 10, 2026 8:57 am

“You *have* to add the values of the random variables in order to calculate the average!”

And then you divide by N, and that means dividing the variance by N². Your inability to remember or understand this basic fact is your problem. All I can do is keep pointing it out.

“Each month is linearly transformed by a constant. That constant does not have to be the same for each different month.”

Yes, that’s what I keep telling you. As I keep saying, these conversations would be far less painful if you just explained what variance you are talking about.

“The standard deviation remains the same no matter how much you scale the absolute value with a constant value.”

Again, what standard deviation? We were talking about removing seasonality by taking anomalies. That implies that the variance of the monthly anomalies is less than that for temperatures, because you’ve removed the variance caused by seasonality.

That’s completely different from the variance you see when looking at a specific calendar month and how much it varies year to year.

You can cherry pick parts of the GUM, and sometimes they suggest variance as a measure of uncertainty. But the primary unit is the standard uncertainty, which is a standard deviation. IMHO, variance is a poor value to use for uncertainty, given it’s expressed in physically meaningless units and gives no real indication of the size of the uncertainty.

The variance of a Gaussian distribution is related to the peakedness index. pi = max-value/sqrt(variance).

This has been explained to you multiple times. Yet you just continue to ignore it. Why?

This is the first time I can remember you talking about a peakedness index. Your equation makes no sense. Could you provide a reference? As far as I can see the index was only proposed last year.

In this paper, we propose a new measure for quantifying peakedness, named the “peakedness index”. The proposed index is defined as the ratio of the maximum density (or peak density) of the distribution to its continuous informity, where “continuous informity” is a concept from the newly developed theory of informity.

https://www.preprints.org/manuscript/202509.2604
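For what it’s worth, the uncontroversial part of the claim, that a Gaussian’s hump flattens as its variance grows, is easy to verify: the maximum density of a normal distribution is 1/(σ√(2π)), so the peak falls as the standard deviation rises. Note this is *not* the “peakedness index” of the preprint linked above, which uses a different denominator (“continuous informity”); this is just the textbook peak-density formula.

```python
import numpy as np

# Peak density of a Gaussian N(mu, sigma^2) is 1 / (sigma * sqrt(2*pi)):
# the larger the variance, the flatter the hump around the mean.
for sigma in (0.5, 1.0, 2.0):
    peak = 1 / (sigma * np.sqrt(2 * np.pi))
    print(f"sigma={sigma}: peak density = {peak:.4f}")
```

The formula also shows why the argument is Gaussian-specific: for other shapes (e.g. a rectangular distribution, peak density 1/(b−a)) the relationship between spread and peak height is different.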

Reply to  Bellman
May 11, 2026 3:38 am

“And then you divide by N, and that means dividing the variance by N². Your inability to remember or understand this basic fact is your problem. All I can do is keep pointing it out.”

If you can’t add the data so that it represents a larger system then any average you calculate from the data has no physical meaning. Trending the average over time only tracks the movement of the non-physical average – meaning the trend is also useless in describing the real world.



Reply to  Bellman
May 11, 2026 3:42 am

“Yes, that’s what I keep telling you. As I keep saying, these conversations would be far less painful if you just explained what variance you are talking about.”

THE VARIANCE OF THE MEASUREMENT DATA!

Jeesh!

It is *still* using a CONSTANT to perform a linear transformation, even if the constant for March is different than the constant for December. That means that the variance of March data is the same for both the absolute values and for the anomaly.

Anomalies do *NOT* reduce measurement uncertainty – i.e. variance of the measurement data.

I’m not surprised you can’t accept that simple fact.

Reply to  Bellman
May 11, 2026 3:46 am

“Again, what standard deviation? We were talking about removing seasonality by taking anomalies.”

If the variance of the temperature data is impacted by the seasonality – AND IT *IS* AFFECTED BY SEASONALITY – then anomalies do *not* remove seasonality because the anomaly has the same variance as the absolute values.

For the umpteenth time, linear transformation by a constant does *NOT* reduce variance. The linearly transformed data inherits the exact same variance of the original data.

Reply to  Tim Gorman
May 11, 2026 5:38 pm

Then you are not talking about seasonality. That’s a change in the mean, not in variance. You are talking about heteroscedasticity. You can fix this by doing a weighted regression, but it makes no significant difference.

When I tried it in a back of the envelope way, it just knocked about 0.001°C / decade from the trend, down to 0.155°C / decade compared to the unweighted 0.156°C / decade.
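A back-of-the-envelope version of that weighted fit can be sketched as below. This is synthetic data: the 47-year monthly series, the “noisier winter months” pattern, and the 0.016 C/yr trend are all assumptions for illustration, not the UAH data or the actual calculation quoted above.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.arange(0, 47, 1 / 12)                              # ~47 years, monthly
sigma = np.where(np.arange(x.size) % 12 < 3, 0.3, 0.1)    # noisier "winter" months
y = 0.016 * x + rng.normal(0, sigma)                      # heteroscedastic series

ols = np.polyfit(x, y, 1)[0]                 # ordinary least squares slope
wls = np.polyfit(x, y, 1, w=1 / sigma)[0]    # weighted: weight = 1/sigma
print(f"OLS {ols * 10:.3f}, WLS {wls * 10:.3f} C/decade")  # nearly identical
```

Because the noisy months are spread evenly through the record rather than concentrated at one end, down-weighting them barely moves the slope, which is consistent with the ~0.001 C/decade difference reported above.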

Reply to  Bellman
May 11, 2026 3:52 am

“That’s completely different from the variance you see when looking at a specific calendar month and how much it varies year to year.”

The variance of the month’s data *IS* related to seasonality. Colder temps (e.g. winter) have higher variance than warmer temps (e.g. summer). How many times has this been pointed out to you? Why do you continue to ignore that simple fact?

The variance of the year-to-year average data is separate from the variance of the individual average data element. All you are doing is ignoring the variance the monthly average inherits from the individual data elements.

It’s typical of YOU and climate science to just throw variance away – it’s part and parcel of the meme of “all measurement uncertainty is random, Gaussian, and cancels” – you know, the meme you continuously profess to not follow? But which colors each and every assertion you make!

Reply to  Bellman
May 11, 2026 4:01 am

“You can cherry pick parts of the GUM, and sometimes they suggest variance as a measure of uncertainty.”

“SOMETIMES”?????

The definition of measurement uncertainty is DEFINED as the variance! It’s consistent throughout the entire document!

“But the primary unit is the standard uncertainty, which is a standard deviation.”

JUDAS H. PRIEST!

SD = √Variance!!!!

GUM:
—————————-
3.3.5 The estimated variance u² characterizing an uncertainty component obtained from a Type A evaluation is calculated from series of repeated observations and is the familiar statistically estimated variance s² (see 4.2). The estimated standard deviation (C.2.12, C.2.21, C.3.3) u, the positive square root of u², is thus u = s and for convenience is sometimes called a Type A standard uncertainty.
—————————–(bolding mine, tpg)

FIRST you find the variance. THEN you find the standard deviation!

Reply to  Tim Gorman
May 11, 2026 5:06 am

The definition of measurement uncertainty is DEFINED as the variance! It’s consistent throughout the entire document!

The GUM definition of uncertainty

B.2.18

uncertainty (of measurement)

parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand

NOTE 1 The parameter may be, for example, a standard deviation (or a given multiple of it), or the half-width of an interval having a stated level of confidence

Reply to  Bellman
May 11, 2026 5:24 am

FIRST you find the variance. THEN you find the standard deviation!

Why do you think you “then find the standard deviation”? You claim, standard deviation is not the measure of uncertainty, so why go to the trouble of taking a square root for something you are never going to use?

Reply to  Bellman
May 11, 2026 6:08 am

YOU CAN’T KNOW THE SD WITHOUT FIRST KNOWING THE VARIANCE!

You are simply playing word games. Section 3 of the GUM is the BASIC CONCEPTS section. I gave you the definition of the basic concept of uncertainty.

——————————
3.3.5 The estimated variance u² characterizing an uncertainty component obtained from a Type A evaluation is calculated from series of repeated observations and is the familiar statistically estimated variance s² (see 4.2). The estimated standard deviation (C.2.12, C.2.21, C.3.3) u, the positive square root of u², is thus u = s and for convenience is sometimes called a Type A standard uncertainty.
—————————–(bolding mine, tpg)

Variance *is* the metric characterizing the uncertainty from a Type A evaluation.

If you go to Appx C, the section on Basic Statistical Terms and Concepts you’ll find:

————————
C.2.20
variance
a measure of dispersion, which is the sum of the squared deviations of observations from their average divided by one less than the number of observations
——————-(bolding mine, tpg)

 so why go to the trouble of taking a square root for something you are never going to use?”

Nor have I ever claimed that the standard deviation is *NOT* a measure of uncertainty. Variance, however, IS *THE* metric for uncertainty. It is a basic statistical rule. When combining random variables you do *NOT* add standard deviations, you add variances. The GUM specifies this for adding measurement uncertainty in Eq 11 where you calculate the sum of u^2(y), not the sum of u(y).

One of the main reasons for using standard deviation in expressing uncertainty is that it shows a plus/minus interval. Using a plus/minus interval allows the specification of a non-symmetric interval which variance doesn’t allow. So if you want a standard expression that is general, the standard deviation works. That does *NOT* mean that variance is not the overall metric for uncertainty.

Reply to  Tim Gorman
May 11, 2026 9:04 am

“YOU CAN’T KNOW THE SD WITHOUT FIRST KNOWING THE VARIANCE!”

This is such a childish argument. You said that variance was the measure of uncertainty. I said standard deviation is the preferred measure. And now that you realise you were wrong, you rant and twist the argument as being about how standard deviation is calculated.

Of course standard deviation is the root of variance. Variance is an intermediate step; it exists because you square to get rid of negative differences. But in itself it isn’t a useful measure of deviation, because it’s the average of the squared differences and has squared units. That is why you take the square root: to get a value that is a useful metric of deviation, or uncertainty.

“Variance, however, IS *THE* metric for uncertainty.”

Nothing you quote says that.

“where you calculate the sum of u^2(y), not the sum of u(y).”

Why do you think it’s written u² and not variance? Because standard uncertainty is a standard deviation. Yes, the convenience of variance is that when adding random variables you have to add the squares of the standard deviations and take the square root, which can easily be written as adding variances, but that doesn’t make the variance a useful value. It’s exactly the same as using the Pythagorean theorem. You are adding the square areas of the sides of a triangle to get the area of the square on the hypotenuse, but the practical use is to take the square root to get the length of the hypotenuse.

“One of the main reasons for using standard deviation in expressing uncertainty is that it shows a plus/minus interval.”

Huh. Any number can be turned into a ± interval. It’s just that such an interval is meaningful for a standard deviation as it reflects the actual dispersion of values.

“Using a plus/minus interval allows the specification of a non-symmetric interval which variance doesn’t allow.’

Not this insanity again. Sorry, you just don’t understand how this works. I’m not rehashing your delusions about negative standard deviations again.

Reply to  Bellman
May 11, 2026 5:47 am

“NOTE 1 The parameter may be, for example, a standard deviation”

But FIRST you must find the variance. You don’t calculate standard deviation directly!

Reply to  Tim Gorman
May 11, 2026 11:46 am

Much like an average. You usually calculate the sum and then divide by n. It doesn’t mean the sum or the variance is a meaningful or useful quantity, it’s the end result that has meaning.

Reply to  Bellman
May 11, 2026 4:12 am

“This is the first time I can remember you talking about a peakedness index. Your equation makes no sense. Could you provide a reference? As far as I can see the index was only proposed last year.”

I have told you OVER AND OVER AND OVER AND OVER ……

As the variance grows the hump around the average gets smaller – meaning the values surrounding the average get closer and closer to the average value – a metric for the measurement uncertainty surrounding the average value. In other words, the PEAKEDNESS of the hump goes down. I’ve even posted images demonstrating this to you in threads on the subject.

The peakedness index is just a method for quantifying what I’ve told you OVER AND OVER AND OVER AND OVER ….. which you have stubbornly refused to accept, just saying that the variance is *NOT* a metric for uncertainty.

The peakedness index is nothing more than quantifying what has been discussed in the GUM for YEARS! It is a concept that you just can’t seem to grasp or accept!

Reply to  Tim Gorman
May 11, 2026 5:58 am

So you don’t know what “peakedness” means.

What you say about SD is just wrong. As usual you are assuming that all distributions are Gaussian. Try applying your logic to a rectangular distribution. Does peakedness increase as the standard deviation decreases?

Reply to  Bellman
May 11, 2026 6:23 am

I do *NOT* assume all distributions are Gaussian! Why do you think I keep saying that the 5-number statistical descriptor is what climate science should use?

Variance does *NOT* require a Gaussian distribution to be valid.

A rectangular distribution with a constant value has a variance of 0. Thus the peakedness index of a rectangular distribution would be undefined (division by 0).

Reply to  Tim Gorman
May 11, 2026 7:21 am

That is why the GUM defines the standard uncertainty to be

(b – a) / √3.

Reply to  Jim Gorman
May 11, 2026 9:52 am

That should be

(b – a) / 2√3

It’s just the standard equation for the SD of a rectangular distribution. Not sure what that has to do with peakedness.
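The (b − a)/2√3 figure is just the standard deviation of a uniform (rectangular) distribution on [a, b], which a quick simulation confirms. The endpoints below are arbitrary.

```python
import numpy as np

a, b = 2.0, 8.0
analytic = (b - a) / (2 * np.sqrt(3))    # SD of a rectangular distribution

rng = np.random.default_rng(4)
empirical = rng.uniform(a, b, 1_000_000).std()
print(analytic, empirical)               # both ~1.732
```

This is why the GUM assigns a standard uncertainty of half-width/√3 to a quantity known only to lie within a stated interval: (b − a)/2 is the half-width, and dividing by √3 gives the SD of the implied rectangular distribution.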

Reply to  Tim Gorman
May 11, 2026 9:38 am

“I do *NOT* assume all distributions are Gaussian!”

\sarc You claim that, but you keep making that assumption all the time.

“Variance does *NOT* require a Gaussian distribution to be valid.”

I never said it did. But your assumption that you can tell something about “the hump” from variance does. Deviation tells you nothing about the shape of the distribution.

“A rectangular distribution with a constant value has a variance of 0.”

That’s just a point. What about a distribution with a greater than zero variance?

Reply to  Bellman
May 9, 2026 12:07 pm

How do you think that works? How does a change in variance change a trend?

Since you obviously won’t take my word for it. From here: TSCS_Week5_Trends.pdf

  • Time series processes with trends are non-stationary. The mean, variance, or both are a function of time.
  • We need to properly account for trends in dynamic processes in order to test hypotheses with time series data

When the variance of a time series changes over time — for example, due to structural breaks, regime shifts, or non-stationarity — it can distort statistical relationships and lead to spurious trends that appear meaningful but are not causally linked.

ENSO, which is a dramatic shock, can propagate into the trend indefinitely if not properly accounted for. Do your regressions take this into account?

Linear regression does not give you a functional relationship. There is always an epsilon term that is not predicted by the independent variables.

Now you are talking statistics instead of physical science. That is one of the main problems in climate science. The primary use of the regression is to predict the value of the dependent variable when values for the independent variables are given.

Your graph is a mess. The x-axis is not log CO2. It is log₂ CO2. This gives values of concentration on the x-axis from about 330 to 415. You also didn’t include the regression formula. The graph is useless as a forecasting tool. Show us the regression equation you came up with and show what the predicted anomaly will be at 660 ppm, a doubling of CO2.

Reply to  Jim Gorman
May 9, 2026 5:01 pm

Since you obviously won’t take my word for it.

Nullius in verba.

You need to explain what you mean and preferably demonstrate it. I’m not even saying you are wrong, but unless you can show your workings as far as the UAH trends Spencer states, then how can you say there is a serious problem with them?

But all you do is quote more stuff that has nothing to do with the claim. Your slide show says nothing about seasonality, let alone how different variances over the year can affect linear regression.

If I have time I’ll try to do some analysis on this, but it’s really up to you to demonstrate what you think the correct trend should be. You keep claiming to be an expert on this, yet never do anything than say that Dr Spencer doesn’t understand what he’s doing.

Now you are talking statistics instead of physical science.

Of course we are talking about statistics. What do you think linear regression is? I take it you think you should have a deterministic rather than a stochastic model. In which case I can only echo the charge leveled at me, you don’t live in the real world.

“Your graph is a mess.”

Sorry. I produced it in a hurry, to counter your lies. I’d sooner have a messy graph than your non-existent ones.

The graph is useless as a forecasting tool.

That’s not its purpose.

Show us the regression equation you came up with and show what the predicted anomaly will be at 660 ppm, a doubling of CO2.

And again, you should not use a regression outside the data range.

For the record the regression is 2.2 ± 0.47 °C / log2(CO2). With a 2σ uncertainty interval. Though note this is not corrected for auto-regression so it should probably be a bit larger.

The prediction for 660ppm of CO2 would be 1.8 ± 0.5 °C. That’s a prediction interval, not a confidence interval. But, as I said, it’s a mistake to take this as an actual prediction.
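One arithmetic point worth making explicit: with a slope quoted per log₂(CO2), the warming implied by one doubling is the slope itself, since log₂ of the concentration rises by exactly 1. (The 1.8 °C predicted anomaly above also depends on the fitted intercept, which isn’t quoted here, so it can’t be reproduced from the slope alone.)

```python
import math

m = 2.2  # deg C per log2(CO2), the regression slope quoted above
# one doubling, e.g. 330 ppm -> 660 ppm, changes log2(CO2) by exactly 1
delta = m * (math.log2(660) - math.log2(330))
print(f"{delta:.2f} C per doubling")
```

This is just a unit conversion of the quoted slope, not an endorsement of extrapolating the fit to 660 ppm, for the reasons already given.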

Reply to  Bellman
May 10, 2026 8:34 am

You need to explain what you mean and preferably demonstrate it. I’m not even saying you are wrong, but unless you can show your workings as far as the UAH trends Spencer states, then how can you say there is a serious problem with them?

I gave you resources to read. If you won’t take the time to read them and attempt to understand them, then it isn’t worth my time to try and explain it to you.

Benice has made it plain to you that most of the increase in UAH is due to ENSO. It does exist.

UAH’s lower‑troposphere (TLT) product is known to retain ENSO‑related shocks unusually strongly, producing step‑like increases that decay more slowly than in other satellite or reanalysis datasets. This behavior is linked to how UAH performs diurnal drift correction, synthetic channel construction, and satellite merging — not to a physical requirement of the climate system.

If you wish to know more, I suggest you delve into the UAH processing.

And again, you should not use a regression outside the data range.

So your graph is useless in determining what the future holds. That is what many of us have been trying to tell you. Cherry picking early 20th century temperatures that are closely related to the Little Ice Age to start a trend will not tell you what the future holds regardless of when it ends.

Reply to  Jim Gorman
May 10, 2026 11:13 am

I gave you resources to read.

Typical deflection. Your resource is a slideshow from a lecture that makes no mention of seasonality, or anything else you claimed. And I keep telling you, if you want to be taken seriously, you need to be able to explain this in your own words, and give us your analysis. How are you interpreting the trend for UAH? How does it differ from the value Spencer gives in this article?

Benice has made it plain to you that most of the increase in UAH is due to ENSO. It does exist.

If you mean bnice2000, why do you think that troll has demonstrated anything? You are the one claiming linear regression is wrong for UAH, yet now you want to rely on arbitrary short term linear regression to claim that ENSO is causing multi-decade warming. This has not been demonstrated statistically, and is physically implausible.

And this is yet more deflection. The discussion was about seasonality, not ENSO.

If you wish to know more, I suggest you delve into the UAH processing.

Or maybe we should just stop using UAH.

So your graph is useless in determining what the future holds.

Not completely useless, but that’s not the purpose of a linear regression. Best you can say about future projections is that, if the linear trend is valid, and if UAH is an accurate measure of global temperature, and if it continues to be linear as CO2 increases, and if there are no confounding factors, then it might be a reasonable estimate.

Cherry picking early 20th century temperatures…

Huh. The data goes back to 1979. That is not early 20th century.

Reply to  Bellman
May 11, 2026 9:47 am

Your resource is a slideshow from a lecture that makes no mention of seasonality, or anything else you claimed.

Best you can say about future projections is that, if the linear trend is valid, and if UAH is an accurate measure of global temperature, and if it continues to be linear as CO2 increases, and if there are no confounding factors, then it might be a reasonable estimate.

Sorry you are so averse to learning new things. The resource discusses spurious trends. These are real and can’t be dismissed.

For trend analysis here are some items to consider.

  • Remove the annual cycle to avoid aliasing
  • Account for autocorrelation because ENSO introduces persistence
  • ENSO years can create apparent “steps” in tropospheric datasets due to shocks not decaying
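The first item is easy to sketch. As a minimal illustration (Python on synthetic data, not the UAH processing itself), removing the annual cycle means subtracting each calendar month’s own mean:

```python
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(120)                         # 10 years of monthly data
cycle = 3.0 * np.sin(2 * np.pi * months / 12)   # annual cycle, 3 C amplitude
temps = 15.0 + 0.002 * months + cycle + rng.normal(0.0, 0.2, 120)

# Subtract each calendar month's own mean to form anomalies
clim = np.array([temps[months % 12 == m].mean() for m in range(12)])
anoms = temps - clim[months % 12]

# The anomaly series no longer carries the 3 C seasonal swing
print(anoms.std(), temps.std())
```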

You keep criticizing but seldom show references. Show some resources that refute the assertion that regressions on current temperature data sets include spurious trends.

Let me point out that regressing average temperatures produces spurious trends because Tmax and Tmin have distinct trends of their own. It leads one to say both are increasing when that is not true.

Reply to  Jim Gorman
May 11, 2026 11:06 am

“Sorry you are so averse to learning new things.”

You couldn’t imagine how many new things I’ve learned during the course of these arguments. It’s the only reason it’s worth continuing.

“The resource discusses spurious trends.”

But not seasonality, which is what you were claiming.

“Remove the annual cycle to avoid aliasing.”

Done that. Anomalies, remember.

“Account for autocorrelation because ENSO introduces persistence”

Ideally yes, but this generally increases uncertainty, not the trend. Remember all the times I had to point out the uncertainty in your pauses? Factoring in autocorrelation is why the uncertainties are so large, but it doesn’t stop the trend being flat.

But it’s a complicated procedure and there is no one correct method. In the past I used the figures given by the trend calculator, which result in the UAH uncertainty being roughly 4 times larger than the default value. But I suspect this might be too large.

Most of the time I prefer to use annual averages, which remove some but not all autocorrelation. That also, of course, removes any seasonality.
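A rough illustration of that point, using synthetic AR(1) noise as a stand-in for monthly data (Python sketch, not the UAH series itself):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 12 * 40                                   # 40 years of monthly values
noise = np.zeros(n)
for t in range(1, n):                         # AR(1): each month persists
    noise[t] = 0.7 * noise[t - 1] + rng.normal(0.0, 0.1)

annual = noise.reshape(-1, 12).mean(axis=1)   # annual averages

def lag1(x):
    """Sample lag-1 autocorrelation."""
    x = x - x.mean()
    return float((x[:-1] * x[1:]).sum() / (x * x).sum())

# Annual averaging removes much, but not all, of the autocorrelation
print(lag1(noise), lag1(annual))
```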

“ENSO years can create apparent “steps” in tropospheric datasets due to shocks not decaying”

That’s your assertion and I see no evidence for it. You have done nothing to establish these step changes statistically, but even if you had, it wouldn’t counter the real-world physical reasons why this is implausible.

Reply to  Bellman
May 11, 2026 11:30 am

“You keep criticizing but seldom show references”

Strange, I could have sworn this started with you criticizing the way linear regression was used in this article, and then refusing to explain how you would do it differently.

“Show some resources that refute the assertion that regressions on current temperature data sets do not include spurious trends from data.”

You want me to prove a negative. I’m not even claiming the simple regression is correct. The more you learn the more ways of doing things there are and the more possible issues. But if you think there is something fundamentally wrong with a specific use, then you need to show your own workings. Do your own regression using whatever method you want, and then we can compare it with the trend given by Spencer, or any simple software package.

“Let me point out that regressing average temperatures produces spurious trends because Tmax and Tmin have distinct trends of their own.”

I’m not sure you understand what “spurious” means. A correct trend of a specific quantity is still a correct trend. The trend of average temperatures is a correct trend of average temperatures. A correct trend of lower troposphere satellite data is a correct trend of that data.

“It leads one to say both are increasing when that is not true.”

It doesn’t. The change in average temperatures does not imply the same change is happening in both max and min. But in fact both min and max are increasing, so your point is moot.

Victor
Reply to  Jim Gorman
May 8, 2026 11:58 am

Create a linear temperature trend for each decade from January 1979 to the present. Will the temperature trend be the same for each decade?

Do the temperature trends for each decade tell more than a temperature trend for the entire period?

Reply to  Victor
May 8, 2026 3:56 pm

Do the temperature trends for each decade tell more than a temperature trend for the entire period?

No. A period of ‘climatology’ is based on at least 30 years of continuous data, according to the WMO.

Every global temperature data set we have, surface or satellite, shows statistically significant warming over the past 30 years.

(‘Statistically significant’ meaning >95% probability that the observed warming trend would not have occurred in a randomly fluctuating climate system over a 30-year period. The chances of it not being a true warming trend are the same as the chances of tossing a coin 30 times and it coming up heads on ~29 of the tosses. The warming is real.)

Victor
Reply to  TheFinalNail
May 8, 2026 8:04 pm

Is the trend the same for the period 1979-1989 compared to the period 2009-2019?
When comparing 2 periods, you can see whether the trend is decreasing or increasing.

Reply to  Victor
May 9, 2026 3:58 am

Better yet, compare it to 1915-1945.

Reply to  Victor
May 9, 2026 1:03 pm

Create a linear temperature trend for each decade 

Temperature trends are a joke. The trends are all against time, auto-correlated, and contaminated by seasonality. If you want a stationary series with constant mean and variance, time series analysis is the proper method to eliminate these problems. Then one can create a model that allows different variables to be assessed as to how they affect the trend.

Bellman continues to show regressions with a linear equation that continues indefinitely. Doing so ignores cooling (as over the last few months), pauses, and step changes at El Nino occurrences.

If you really want to get down into the weeds, start using Tmax and Tmin from every day separately. Tavg = (Tmin + Tmax)/2 is a joke. They have different means, variances, and functional relationships.

Here is a good graph showing the relationship between sun, soil, and atmosphere. Just looking at temperature will tell you nothing.
comment image
Ag studies have shown that daytime temperatures are well within the tolerances of most grain crops in the U.S. Night temperatures have increased such that last and first frosts allow for longer growing seasons. Grain harvests worldwide confirm that global temperatures are not a problem.

Averaging temperatures, especially from far-flung places, is not scientific. The NH and SH have reverse seasons; do you think their variances are different? Where do those disappear to in the calculation of a global temperature?

Reply to  Jim Gorman
May 11, 2026 4:30 pm

OK, as Jim won’t show his seasonal analysis, I might as well have a stab. I’ll use the method of treating each month as a factor in the linear regression.

First let’s compare the linear trend without accounting for seasonality.

For anomalies I get

lm(formula = Anomaly ~ Time, data = .)

Residuals:
     Min       1Q   Median       3Q      Max 
-0.50432 -0.12224 -0.01685  0.10373  0.71156 

Coefficients:
              Estimate Std. Error t value Pr(>|t|)    
(Intercept) -3.116e+01  1.187e+00  -26.25   <2e-16 ***
Time         1.555e-02  5.928e-04   26.23   <2e-16 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.1936 on 567 degrees of freedom
Multiple R-squared:  0.5482,	Adjusted R-squared:  0.5474 
F-statistic: 687.9 on 1 and 567 DF,  p-value: < 2.2e-16

The trend is 0.1555°C / decade. Consistent with Spencer’s quoted value of 0.16.

(Whine about the number of decimal places all you want, I want to see any difference between the methods.)

Now for absolute temperatures.

lm(formula = Temperature ~ Time, data = .)

Residuals:
    Min      1Q  Median      3Q     Max 
-1.3641 -0.7519 -0.2041  0.7699  1.8004 

Coefficients:
             Estimate Std. Error t value Pr(>|t|)    
(Intercept) 2.333e+02  5.111e+00  45.638  < 2e-16 ***
Time        1.537e-02  2.552e-03   6.022  3.1e-09 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.8334 on 567 degrees of freedom
Multiple R-squared:  0.06012,	Adjusted R-squared:  0.05846 
F-statistic: 36.27 on 1 and 567 DF,  p-value: 3.097e-09

The trend is slightly lower at 0.1537°C / decade. This is because of the seasonal variation, especially as the time series ends during the colder months. Also note that the R² value is just 0.058, as most of the variation is in the seasonal fluctuations.

Reply to  Bellman
May 11, 2026 4:37 pm

So now the seasonal model, used with anomalies.

lm(formula = Anomaly ~ Time + factor(Month), data = .)

Residuals:
     Min       1Q   Median       3Q      Max 
-0.48861 -0.12165 -0.01403  0.10233  0.70243 

Coefficients:
                  Estimate Std. Error t value Pr(>|t|)    
(Intercept)     -3.117e+01  1.198e+00 -26.009   <2e-16 ***
Time             1.555e-02  5.983e-04  25.985   <2e-16 ***
factor(Month)2  -6.229e-03  3.987e-02  -0.156    0.876    
factor(Month)3   1.102e-02  3.987e-02   0.276    0.782    
factor(Month)4   1.860e-02  3.987e-02   0.466    0.641    
factor(Month)5   1.626e-02  4.008e-02   0.406    0.685    
factor(Month)6   4.826e-03  4.008e-02   0.120    0.904    
factor(Month)7   1.338e-02  4.008e-02   0.334    0.739    
factor(Month)8   1.481e-02  4.008e-02   0.369    0.712    
factor(Month)9   7.668e-03  4.008e-02   0.191    0.848    
factor(Month)10  1.141e-02  4.008e-02   0.285    0.776    
factor(Month)11  9.947e-03  4.008e-02   0.248    0.804    
factor(Month)12  1.225e-02  3.987e-02   0.307    0.759    
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.1953 on 556 degrees of freedom
Multiple R-squared:  0.5487,	Adjusted R-squared:  0.539 
F-statistic: 56.34 on 12 and 556 DF,  p-value: < 2.2e-16

The trend is 0.1555°C / decade, identical to 4 significant figures to the standard model.

The estimated values for each month are the offsets relative to January. They are all very small, less than 0.02°C, show little pattern, and none is statistically significant.

Here’s the graph using this regression. You can see a slight wobble in the trend line, but it’s essentially the same as the standard regression.

20260512wuwt1
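For anyone without R, the same month-as-factor idea can be sketched with plain least squares (illustrative Python on synthetic data; the numbers are made up, only the method matches):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 12 * 40
t = np.arange(n) / 120.0                         # time in decades
month = np.arange(n) % 12
season = 1.5 * np.sin(2 * np.pi * month / 12)    # seasonal cycle
y = 0.16 * t + season + rng.normal(0.0, 0.2, n)  # trend of 0.16 C/decade

# Design matrix: intercept, time, and 11 month dummies (January = baseline)
dummies = [(month == m).astype(float) for m in range(1, 12)]
X = np.column_stack([np.ones(n), t] + dummies)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta[1])   # fitted trend in C/decade, close to the true 0.16
```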
Reply to  Bellman
May 11, 2026 4:46 pm

Now for temperatures using the seasonal model.

lm(formula = Temperature ~ Time + factor(Month), data = .)

Residuals:
     Min       1Q   Median       3Q      Max 
-0.48855 -0.12152 -0.01206  0.10228  0.70240 

Coefficients:
                 Estimate Std. Error t value Pr(>|t|)    
(Intercept)     2.320e+02  1.198e+00 193.617  < 2e-16 ***
Time            1.554e-02  5.983e-04  25.976  < 2e-16 ***
factor(Month)2  8.251e-02  3.987e-02   2.069    0.039 *  
factor(Month)3  2.587e-01  3.987e-02   6.490 1.91e-10 ***
factor(Month)4  6.821e-01  3.987e-02  17.108  < 2e-16 ***
factor(Month)5  1.285e+00  4.008e-02  32.059  < 2e-16 ***
factor(Month)6  1.924e+00  4.008e-02  48.003  < 2e-16 ***
factor(Month)7  2.252e+00  4.008e-02  56.178  < 2e-16 ***
factor(Month)8  2.069e+00  4.008e-02  51.625  < 2e-16 ***
factor(Month)9  1.465e+00  4.008e-02  36.555  < 2e-16 ***
factor(Month)10 7.758e-01  4.008e-02  19.354  < 2e-16 ***
factor(Month)11 2.356e-01  4.008e-02   5.878 7.15e-09 ***
factor(Month)12 2.384e-02  3.987e-02   0.598    0.550    
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.1953 on 556 degrees of freedom
Multiple R-squared:  0.9494,	Adjusted R-squared:  0.9483 
F-statistic: 868.8 on 12 and 556 DF,  p-value: < 2.2e-16

The trend is 0.1554°C / decade, virtually identical to that of the anomalies. Compare this with the original trend for temperature of 0.1537°C / decade, and you can see it has removed the spurious change caused by seasonality.

The monthly offsets compared to January now reflect the seasonal variation, going up to 2.3°C for July, and all but December are significant.

The R² is now 0.95, but that’s because the majority of the variation is caused by the seasonal cycle.

This should be clear in the graph.

20260512wuwt2
Reply to  Bellman
May 11, 2026 5:24 pm

OK, as Jim won’t show his seasonal analysis, I might as well have a stab. I’ll use the method of treating each month as a factor in the linear regression.

I have shown you this before.
comment image
This graph was made using the ADIFF command in an Excel stats package. It reduces auto-correlation by first differences of 1-lag and removes seasonality of 12 month periodicity.

Do you see a trend?

That is the baseline trend you should be starting with.

  • Can you determine a value of CO2 that gives a proper trend on top of this?
  • How about methane?
  • How about clouds?
  • How about seasonal changes in insolation?
  • How about a refined period for ENSO rather than allowing the shock to remain?
  • How about soil and ocean effects on temperature.
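For what it’s worth, what those two differencing steps do can be shown on a toy series (Python sketch; ADIFF itself is an Excel add-in command, so this only mimics the operations described):

```python
import numpy as np

t = np.arange(240)
y = 0.0013 * t + 2.0 * np.sin(2 * np.pi * t / 12)  # trend + 12-month cycle

d1 = np.diff(y)             # lag-1 first difference: linear trend -> constant
d12 = d1[12:] - d1[:-12]    # lag-12 difference: removes the 12-month cycle

# With no noise added, nothing is left at all
print(np.abs(d12).max())
```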

You say a linear regression is not useful for making predictions. It is only useful for displaying a line of best fit for what has occurred in the past. Whoopee!

A time series analysis will allow one to derive multiple variables that can be used to evaluate what causes temperature to change.

It is one reason I have started researching how insolation is processed through soil and then to the atmosphere. Let’s see you do that with your regression.

Reply to  Jim Gorman
May 11, 2026 6:20 pm

This graph was made using the ADIFF command in an Excel stats package. It reduces auto-correlation by first differences of 1-lag and removes seasonality of 12 month periodicity.

Do you see a trend?

The main point of taking differences is to remove the trend.
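A two-line check of that claim (Python, deterministic toy example):

```python
import numpy as np

y = 10.0 + 0.016 * np.arange(120)   # a pure linear trend
d1 = np.diff(y)                     # first difference
print(d1.min(), d1.max())           # the trend becomes a constant ~0.016
```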

Reply to  Bellman
May 11, 2026 7:12 pm

The main point of taking differences is to remove the trend.

No, that is not the case. It removes linear trends that may very well be due to auto-correlation, that is, where yesterday’s temperature influences today’s temperature.

Here is a web page from Michael Mann’s university:

https://online.stat.psu.edu/stat462/node/188/

To emphasize that we have measured values over time, we use “t” as a subscript rather than the usual “i”, i.e., y_t means y measured in time period t. An autoregressive model is when a value from a time series is regressed on previous values from that same time series, for example, y_t on y_{t-1}:

y_t = β_0 + β_1 y_{t-1} + ε_t

In this regression model, the response variable in the previous time period has become the predictor and the errors have our usual assumptions about errors in a simple linear regression model.
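The quoted model takes only a few lines to fit (illustrative Python on simulated data, using the notation from the PSU page):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
y = np.zeros(n)
for t in range(1, n):                 # simulate y_t = 0.5 + 0.6*y_{t-1} + e_t
    y[t] = 0.5 + 0.6 * y[t - 1] + rng.normal(0.0, 0.1)

# Regress y_t on y_{t-1} by ordinary least squares
X = np.column_stack([np.ones(n - 1), y[:-1]])
b0, b1 = np.linalg.lstsq(X, y[1:], rcond=None)[0]
print(b0, b1)   # estimates near the true 0.5 and 0.6
```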

As I mentioned earlier, and you obviously missed, this gives a baseline with constant statistical parameters. You can then use that baseline to add changes contributed by other variables, CO2 for example. It allows one to create a model that more accurately makes forecasts.

If you are confident that your linear regression accurately predicts the change in temperature, then you can easily add this to the baseline trend and obtain a forecast of future temperatures. Then you can compare it to actual values each month in the future. Good luck.

Reply to  Jim Gorman
May 12, 2026 4:03 am

“No, that is not the case.”

The primary goal of differencing is to remove trends and seasonality from the data, which can obscure the underlying patterns and make the time series unpredictable. By applying differencing, analysts can stabilize the mean of a time series by removing the changes in the level of a series, thus allowing the model to focus on other patterns, such as cyclical or irregular fluctuations.

https://milvus.io/ai-quick-reference/what-is-differencing-in-time-series-and-why-is-it-used

Many time series exhibit trends or seasonality, making their statistical properties like mean and variance change over time. This non-stationarity poses a challenge for standard forecasting models like ARMA, which assume stationarity. Fortunately, a common and effective technique to transform non-stationary data into stationary data is differencing.

https://apxml.com/courses/time-series-analysis-forecasting/chapter-2-decomposition-stationarity/differencing-for-stationarity

bdgwx
Reply to  ToldYouSo
May 9, 2026 6:47 am

He actually reports it to 0.001 C.

Reply to  bdgwx
May 9, 2026 7:22 am

Why not 0.001 mK?

bdgwx
Reply to  karlomonte
May 9, 2026 8:43 am

I don’t know. You’ll have to ask Dr. Spencer.

David Wojick
May 7, 2026 5:11 pm

If it follows the pattern since the beginning it should cool a bit more then oscillate around a constant value a little warmer than before this super El Niño. Might take 3-4 years to know.

All the warming to date looks like residual heat from super El Niños. No warming from GHG increases at all.

Reply to  David Wojick
May 8, 2026 4:29 am

This is known as a “shock” in a time series. It is carried throughout following averages. The decay of a shock requires specific treatment of the data to remove. Time series analysis is in short supply in climate science.

Reply to  Jim Gorman
May 8, 2026 12:54 pm

“This is known as a “shock” in a time series. It is carried throughout following averages. The decay of a shock requires specific treatment of the data to remove.”

Please note the red line in the topmost graph of UAH data that is labeled “Running, centered 13-month average”.

Any “shock” data in the UAH plotted time-series data that does not fall within that moving 13-month time window DOES NOT get “carried forward”, nor does it affect that “running centered average” trend line more than 14 months before or more than 14 months after the interval of that “shock”.

In this case, the “specific treatment” to remove the “shock” from “following averages” is to allow time to pass. /sarc
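To illustrate with a sketch (Python): a single-month spike passed through a centered 13-month running mean influences the smoothed curve only within six months either side:

```python
import numpy as np

x = np.zeros(60)
x[30] = 1.0                               # a one-month "shock"
kernel = np.ones(13) / 13.0               # centered 13-month running mean
smooth = np.convolve(x, kernel, mode="same")

nz = np.nonzero(smooth)[0]
print(nz.min(), nz.max())                 # affected only from month 24 to 36
```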

bdgwx
Reply to  David Wojick
May 9, 2026 6:34 am

All the warming to date looks like residual heat from super El Niños.

The average ENSO state since 1979 is 0. [1]

Anyway, do you have a prediction of when you think El Ninos (and thus the warming) will stop?

David Wojick
May 7, 2026 5:14 pm

I keep seeing predictions of an imminent super El Niño which is strange since we are just coming out of one.

Reply to  David Wojick
May 7, 2026 5:24 pm

The climate scaremongers DESPERATELY want another El Nino.

They KNOW it is the only cause of any warming, and a totally natural solar forced occurrence….

.. but still need it so they can continue to PRETEND that human CO2 is the cause.

Nick Stokes
Reply to  David Wojick
May 7, 2026 6:43 pm

From the WUWT reference page, here is a plot of T at depth 55m, along the equator. Time on the y axis – present time at bottom. Something is going on:

comment image

Reply to  Nick Stokes
May 7, 2026 11:48 pm

This chart is absolute confirmation that the warming is not caused by human-released CO2.

Reply to  bnice2000
May 8, 2026 6:57 pm

Well . . confirmation over the charted period of start-May 2025 to end-April 2026 . . . one year.

Reply to  Nick Stokes
May 8, 2026 7:52 am

“Something is going on”

Yes, it’s called natural variations in ocean temperature, particularly along Earth’s equator on the east side of the Pacific Ocean (~70 to 100 W longitude).

Interesting that you only focused on presenting data over the relatively narrow time period of May 2025 to end-April 2026.

According to https://ggweather.com/enso/roni.htm there was a weak La Niña (decline in ocean near-surface temperatures) in 2025-2026.

Nick Stokes
Reply to  ToldYouSo
May 8, 2026 1:49 pm

“Interesting that you only focused on presenting data”

It isn’t my graph. It is what is displayed on the WUWT Reference Page

Reply to  Nick Stokes
May 8, 2026 6:38 pm

A distinction without a difference.

It is YOU that presented it in YOUR post with YOUR statement that “Something is going on”.

That statement is not on the WUWT Reference Page.

Reply to  David Wojick
May 7, 2026 7:02 pm

predictions of an imminent super El Niño … strange

If you’re seriously interested in this question, please study this report, especially the slides #10-12, 15* & (for ‘projections’ fwd) #25*:
http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/lanina/enso_evolution-status-fcsts-web.pdf
And, at your own risk, the up-to-date (labeled 6th of May) projections are presented graphically in the midst of too much more at …
https://arctic-news.blogspot.com/
Use this phrase to search for the (2) graphs:

Forecasts indicate … upcoming El Niño threatens to become a monster within months.

bdgwx
Reply to  David Wojick
May 9, 2026 6:29 am

We’re coming out of a La Nina. [1]

May 7, 2026 5:50 pm

The most recent, moderate El Niño event occurred from summer 2023 through early spring 2024, with peak impacts around the end of 2023. This El Niño was officially declared to have ended by April 2024, with a weak La Niña period beginning in late 2024 and continuing into 2026. (see https://ggweather.com/enso/roni.htm )

These facts are in excellent agreement with the UAH graph of GLAT posted in the above article.

The Hunga-Tonga submarine volcano is claimed to have injected “massive amounts” of water vapor into the stratosphere in mid-January 2022 . . . H-T water vapor, where are you now and where were you from January 2022 through, oh, May 2023???

Reply to  ToldYouSo
May 8, 2026 9:10 am

I have a graphic of the water vapour.

h2o_MLS_vLAT_qbo_75S-75N_26hPa
Reply to  SteveT
May 8, 2026 10:48 am

Excellent!

That contour plot (time on the x-axis) indicates that there is still a significant amount of excess water vapor, at least at the 26 hPa pressure altitude, from 75 S to 75 N latitudes, compared to pre-January 2022 levels. Despite this, the UAH plot of GLAT shows that global average lower atmospheric temperature has returned to being right at the 47+ year historic linear temperature trend line (January 1979 through April 2026).

HT water vapor . . . if you’re really up there in the stratosphere, you’re really not doing what many claimed you were capable of!

Intelligent Dasein
Reply to  ToldYouSo
May 8, 2026 6:05 pm

The Hunga-Tongan volcano worshippers are currently the biggest embarrassment to climate skepticism that exists. They spent decades arguing that a 200ppm increase in CO2 had nothing to do with global temperatures, only to turn around and insist that a 1ppm increase in H2O was driving the highest temperature spike in the satellite record. They need to be excised from the movement. History should not be kind to these people.

observa
May 7, 2026 8:13 pm

Ahhh stuff the global boiling the natives are getting hot under the collar again-
Brussels mulls scrapping methane fines amid energy crisis – leak

May 9, 2026 4:20 am

I have to interject here that all of this discussion is nothing more than mathematical masturbation.

  1. Temperature is an intensive property
  2. Intensive properties of multiple systems cannot be added to create a larger system.

These truths indicate that adding local and regional temperature systems to create a larger “global” system is decidedly non-physical. In fact, using local temperatures to create a larger regional system is also non-physical. The microclimate at Point A is different from the microclimate at Point B: different systems with different intensive properties. Adding the temperatures of the two points to create a mean tells you nothing physical; it does not create a larger system whose intensive temperature property can be determined.

Yes, you can use the data to create a MATHEMATICAL MEAN but you are *NOT* creating a physically meaningful statistical descriptor of a larger system.

Statistical descriptors are *NOT* measurements. They are mathematical attributes of a data set that are useful in understanding the data set. But if the data set is not attributable to a physical system then they tell you nothing about reality.

Tracking a non-physical mathematical mean is an attractive nuisance to which climate science tries to assign physical meaning. An attractive nuisance is a legal doctrine concerning hazardous conditions that are attractive to children. That is a perfect description of the “global average temperature”.

Reply to  Tim Gorman
May 9, 2026 7:17 am

“Yes, you can use the data to create a MATHEMATICAL MEAN but you are *NOT* creating a physically meaningful statistical descriptor of a larger system.”

Keep thinking, you are almost getting there.

The mean is not a physical thing. That’s true for intensive or extensive properties. The average weight of a person is not the combined weight of all people. The average temperature of a person does not have to be the physical temperature of all people combined. But that doesn’t mean that the average is useless. The average tells you something about the population and it allows you to distinguish between different populations.

If you want a physically meaningful description of an extensive property of an entire population, you want the sum of all the properties.

If you want a physically meaningful property of a combined population, the sum will not give you that. The average may or may not give you that property, depending on exactly what and how you are averaging, but at the least an average will be closer to that physical property than the sum.

Reply to  Bellman
May 9, 2026 1:55 pm

The mean is not a physical thing.

Bull hockey. You are dealing with measurements of a physical property. The mean is the best estimate if the distribution of measurements varies randomly. See GUM 4.2.1. However, the dispersion of observations about the mean, the standard deviation, is an integral part of the measurement. See GUM 4.2.2.

If you want to pretend the mean of measurements has no physical meaning, then you are free to do so. Just don’t try to tell folks that you are discussing measurements like a global temperature. If it isn’t physical, then it isn’t a temperature. It is just a number you are playing around with that has no meaning in the real world.

Reply to  Jim Gorman
May 9, 2026 5:36 pm

“The mean is the best estimate if the distribution of measurements varies randomly.”

I think you are confusing yourself again. You are talking about the average of multiple measurements of the same thing. I was talking about the average of different things.

If you want to pretend the mean of measurements has no physical meaning

I said it wasn’t a physical thing, not that it didn’t have a physical meaning. I really think you are tying yourself in knots with all these phrases. The claim Tim is making is that the average isn’t a physical property of the collection of all things. The average weight of a person is not the combined weight of all people.

If it isn’t physical, then it isn’t a temperature.

That’s what I’m saying. The average temperature is not a temperature. It’s the average of all temperatures.

It is just a number you are playing around with that has no meaning in the real world.

And there’s your confusion again. Assuming that if a value isn’t a physical thing, it can have no meaning.

Reply to  Bellman
May 10, 2026 5:30 am

I said it wasn’t a physical thing, not that it didn’t have a physical meaning. I really think you are tying yourself in knots with all these phrases. The claim Tim is making is that the average isn’t a physical property of the collection of all things. The average weight of a person is not the combined weight of all people.

That is *NOT* what I am saying at all.

I am saying that if you have a large enough scale you can put 10 people on the scale and get a total weight. You can then find the average of that total weight.

I am saying that if you put 10 rocks on a thermometer, each at 20C, you will *NOT* get a total of 200C! If you can’t get a total value for the larger system then you cannot get a meaningful, physical mean for the larger system.

The difference is that one is an extensive property, weight, and the other is an intensive property, temperature.

A statistician or mathematician will say you can average ANYTHING and the average is meaningful. It is only meaningful in that it describes the distribution of the values you include in the data set. That doesn’t mean you can deduce anything meaningful in the real world from that statistical descriptor.

You *can* average the weight of measurand1, the length of measurand2, the temperature of measurand3, the speed of measurand4, and get a value. So what? It just tells you the average of the values you include in the data set. What use can you make of that average? If you track the average of those properties of the multiple measurands over time what can you discern from the change in the average of the values in the data set? Jeesshhh, the values of the components can change while the average stays the same!

Averaging temperatures of multiple measurands is exactly the same. Changes in the average for the phantom larger system tells you exactly nothing about the real world. It’s why climate science kept saying the world was going to burn up from higher temperatures and failed to recognize that the higher average was due to expanding growing seasons! Using mid-range values as an “average” intensive property is idiotic, averaging averages of intensive properties is even worse!

Reply to  Tim Gorman
May 10, 2026 10:25 am

That is *NOT* what I am saying at all.

What you said was:

Yes, you can use the data to create a MATHEMATICAL MEAN but you are *NOT* creating a physically meaningful statistical descriptor of a larger system.

Perhaps if you calmed down and tried to define your terms, we wouldn’t keep having these disagreements. What exactly do you mean by “physically meaningless”?

My point is that there are two different concepts here – one the average as a meaningful statistical descriptor of the population (the larger system), and that is a useful value to know for a variety of reasons. But it does not necessarily describe a physical property of the larger system. In the case of extensive properties, it never does. The average mass of a person is not the mass of the population.

In the case of intensive properties, the mean is a closer approximation to a physical property of the larger system, and may be a more exact description of that property if properly weighted. But that is not generally the point of the average. The point is exactly the same as for extensive properties, to have a value that describes part of the population.
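The “properly weighted” case has a concrete physical instance: mixing two parcels of the same fluid (Python sketch, assuming equal specific heats and no heat loss):

```python
m1, t1 = 2.0, 10.0    # kg, deg C
m2, t2 = 1.0, 40.0

# Equilibrium temperature of the mixture: the mass-weighted mean
t_mix = (m1 * t1 + m2 * t2) / (m1 + m2)

# The unweighted mean of the two readings would be 25.0; the weighted
# mean is what a thermometer in the mixed parcel would actually read
print(t_mix)   # 20.0
```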

I am saying that if you put 10 rocks on a thermometer, each at 20C, you will *NOT* get a total of 200C! If you can’t get a total value for the larger system then you cannot get a meaningful, physical mean for the larger system.

But you don’t need to measure the larger system in one go. You can just add the values. And it doesn’t matter if that sum is not physically meaningful; the average will still be as useful as it is for the extensive property. Really, this idea that every step of an equation has to be physically meaningful for the end result to be meaningful is just nonsense.

A statistician or mathematician will say you can average ANYTHING and the average is meaningful.

They shouldn’t. But as usual you think that all statisticians and mathematicians are idiots who don’t understand their own research, and all scientists who use statistics are idiots. You really need to consider that maybe you are the one who is misunderstanding things.

You *can* average the weight of measurand1, the length of measurand2, the temperature of measurand3, the speed of measurand4, and get a value.

You cannot. An average has to be of things with the same units.

Averaging temperatures of multiple measurands is exactly the same.

Exactly the same, in the sense it’s completely different. Do words have no meaning for you?

Changes in the average for the phantom larger system tells you exactly nothing about the real world.

You think a significant change in average temperature can occur without there being any change in the larger system?

…failed to recognize that the higher average was due to expanding growing seasons

I’m not sure you understand cause and effect here. Growing seasons only change becasue temperatures are changing. The changing average indicates that temperatures are changing. You’ve just demonstrated that the average is not completely meaningless.

Reply to  Bellman
May 11, 2026 4:39 am

Perhaps if you calmed down and tried to define your terms, we wouldn’t keep having these disagreements. What exactly do you mean by “physically meaningless”?”

Put the bottle away. I explained in DETAIL what “physically meaningless” is.

If you put 10 rocks, whose temperatures range consecutively from 10C to 19C, on a thermometer you do *NOT* get a 145C reading on the thermometer. Thus the average of 14.5C is PHYSICALLY MEANINGLESS. The collection of rocks simply can’t be combined into a larger system since they are intensive properties.

The average value obtained from the temperature at 0000GMT on Pikes Peak and the temperature at 0000GMT in Colorado Springs is PHYSICALLY MEANINGLESS. The average value obtained from the temperature at 0000GMT in Kansas City and the temperature at 0000GMT in Miami is PHYSICALLY MEANINGLESS. Neither of these examples can be combined into a larger system that can be averaged.

Even worse is the climate science belief that you can combine disparate intensive property values into a data set and it will always come out to be Gaussian – i.e. a distribution that the mean actually describes – instead of being a distribution that needs at least the 5-number statistical description to characterize the distribution.

Reply to  Bellman
May 11, 2026 4:51 am

My point is that there are two different concepts here – one the average as a meaningful statistical descriptor of the population (the larger system), and that is a useful value to know for a variety of reasons. But it does not necessarily describe a physical property of the larger system. In the case of extensive properties, it never does. The average mass of a person is not the mass of the population.”

You have totally confused yourself!

The average mass of a GROUP of people is not the mass of each individual. SO WHAT! The mean mass value *is* a physically meaningful value for that group of people – because MASS is an EXTENSIVE property that *can* be combined into a larger system. It *DOES* describe a physical property of the larger system!

But temperature, an intensive property, can *NOT* be combined into a larger system. Thus the mean value is not physically meaningful, it does *NOT* describe a physical property of the larger system.

A physically meaningless mean *is* useless in describing physical reality. You can’t say that the mean of a set of disparate temperatures describes anything physical – its why the “global average temperature” is physically meaningless.

The entire concept based on using temperature is nothing more than the idea of TRADITION. Temperature was the only data available in the earlier centuries so it’s good enough for us! Tevye’s view in “Fiddler on the Roof”! The sad thing is that temperature is *NOT* a good metric for climate! Using a mid-range daily temperature value as the metric is even worse!

Reply to  Tim Gorman
May 11, 2026 5:51 pm

The average mass of a GROUP of people is not the mass of each individual.

What do you mean by the average mass of a group of people?

The mean mass value *is* a physically meaningful value for that group of people – because MASS is an EXTENSIVE property that *can* be combined into a larger system. It *DOES* describe a physical property of the larger system!

What are you on about? How does the average mass of a person, or a group of people, describe a physical property of a larger system? It was only a few weeks ago you were arguing the exact opposite. That the average mass of an asteroid told you NOTHING about the mass of all asteroids.

A physically meaningless mean *is* useless in describing physical reality.”

Say what you mean by “physically meaningful”. An average of a physical property has meaning. It can be used to identify differences in populations for a start.

You can’t say that the mean of a set of disparate temperatures describes anything physical – its why the “global average temperature” is physically meaningless.

All you keep stating is your personal belief as if it had any meaning. Just because you can’t understand how an average of surface temperature can be used, doesn’t make it meaningless.

Temperature was the only data available in the earlier centuries so it’s good enough for us!

Yes, if you want to compare temperatures with the past, you have to use temperatures. Do you not use temperatures in your modern world? Do weather forecasters never tell you what the temperature will be? Do doctors never take your temperature because it’s old-fashioned? Do you live on Laputa?

Reply to  Bellman
May 11, 2026 4:57 am

In the case of intensive properties, the mean is a closer approximation to a physical property of the larger system, and may be a more exact description of that property if properly weighted.”

Your lack of knowledge of physical science is sad.

You cannot create a larger system from diverse intensive property values. You *can* do so with extensive property values. It’s the very definition of intensive vs extensive properties!

The only way your conclusion could be true is if a rock at 10C and a second rock at 20C, both placed on a thermometer at the same time, would cause the thermometer to indicate 30C.

I suggest you try it sometime and see what the thermometer shows. Let us know when you do the experiment what you find.

Reply to  Tim Gorman
May 11, 2026 5:58 pm

You cannot create a larger system from diverse intensive property values. You *can* do so with extensive property values. It’s the very definition of intensive vs extensive properties!

That’s not the definition.

The only way your conclusion could be true is if a rock at 10C and a second rock at 20C, both placed on a thermometer at the same time, would cause the thermometer to indicate 30C.

Could you please keep banging your head against a wall until you understand the difference between a sum and an average. And understand that temperatures do not start at freezing point.

My contention is that your two rocks would register something between 10 and 20°C, assuming you have a thermometer you could place things on. Possibly close to 15°C, if the rocks are of similar size and material.

But my further contention is that it doesn’t matter, because the average is not a physical property, but an average of physical properties.
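To put numbers on that contention, the usual lumped energy balance makes the equilibrium a heat-capacity-weighted mean (a sketch only; the masses and specific heats below are made-up values, not measurements of any actual rocks):

```python
# Equilibrium temperature of two bodies in thermal contact (lumped model:
# uniform internal temperatures, no heat lost to the surroundings).
# Energy conservation: m1*c1*T1 + m2*c2*T2 = (m1*c1 + m2*c2)*T_eq
def equilibrium_temp(m1, c1, t1, m2, c2, t2):
    """Heat-capacity-weighted mean of the two temperatures (deg C)."""
    return (m1 * c1 * t1 + m2 * c2 * t2) / (m1 * c1 + m2 * c2)

# Two similar rocks (equal mass, same material) at 10 C and 20 C:
print(equilibrium_temp(1.0, 1.0, 10.0, 1.0, 1.0, 20.0))  # 15.0

# A much smaller rock at 10 C against a larger one at 20 C: nearer 20 C.
print(equilibrium_temp(0.1, 1.0, 10.0, 1.0, 1.0, 20.0))  # ~19.1
```

With equal masses and the same material the weights cancel and you get the plain arithmetic mean; otherwise you don’t, which is the whole caveat about "similar size and material".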

Reply to  Bellman
May 11, 2026 6:15 pm

My contention is that your two rocks would register something between 10 and 20°C, assuming you have a thermometer you could place things on. Possibly close to 15°C, if the rocks are of similar size and material.

In other words you are guessing. Show us the math behind your guess of how the energy is redistributed between the two rocks. Give us the EXACT requirements for your guess to be true.

Averaging intensive properties of distinct objects is only correct when the measurand is explicitly defined as a ratio of extensive quantities that can be aggregated. Otherwise, the average is not a property of anything.

This is the exact reason climate metrics, station networks, and heterogeneous measurements must be handled with extreme care.

Reply to  Jim Gorman
May 11, 2026 6:57 pm

In other words you are guessing.

Yes, because I don’t happen to have two rocks of different temperature on my person at the moment. But if you think the two rocks can have a combined temperature greater than 20°C or less than 10°C I’d like to see your experimental evidence.

Reply to  Bellman
May 11, 2026 5:04 am

But you don’t need to measure the larger system in one go. You can just add the values. “

“They shouldn’t.”

You need to pick one and stick with it. You just contradicted yourself!

And it doesn’t matter if that sum is not physically meaningful, the average will still be as useful as it is for the extensive property. Really, this idea that every step of an equation has to be physically meaningful for the end result to be meaningful is just nonsense.”

If the sum is not physically meaningful then the average won’t be either.

If ANY step in an equation is non-physical, then the result of the equation will be non-physical as well. For Pete’s Sake, did you actually read this before posting it?

Your claim seems to be: The number of angels multiplied by the area occupied by an angel tells you the size of the head of a pin!

Good luck with that!

Reply to  Tim Gorman
May 11, 2026 6:28 am

“Really, this idea that every step of an equation has to be physically meaningful for the end result to be meaningful is just nonsense.”

If the sum is not physically meaningful then the average won’t be either.

/BLINK/ — did he really write this?!

Well folks, you heard it here first — pressure and volume do not need to be physically meaningful quantities in PV = nRT!

Only a “numbers is numbers” person would make such a claim.

It ranks up there with his claim that division with the Magic Number N magically reduces measurement uncertainty.

Your claim seems to be: The number of angels multiplied by the area occupied by an angel tells you the size of the head of a pin!

!!

Reply to  karlomonte
May 11, 2026 11:42 am

“If the sum is not physically meaningful then the average won’t be either.

/BLINK/ — did he really write this?!”

No, that’s Tim’s quote. I’m saying the opposite: not every step of an equation has to be “physically meaningful” for the end result to be meaningful.

“pressure and volume do not need to be physically meaningful quantities in PV = nRT!”

Learn some logic.

“It ranks up there with his claim that division with the Magic Number N magically reduces measurement uncertainty.”

To a primitive person, I’m guessing basic maths must seem like magic.

Reply to  Bellman
May 11, 2026 12:12 pm

No, that’s Tim’s quote. I’m saying the opposite: not every step of an equation has to be “physically meaningful” for the end result to be meaningful.

Oops, try again:

https://wattsupwiththat.com/2026/05/07/uah-v6-1-global-temperature-update-for-april-2026-0-39-deg-c/#comment-4193564

Really, this idea that every step of an equation has to be physically meaningful for the end result to be meaningful is just nonsense.

As always, you fly in circles.

Reply to  karlomonte
May 11, 2026 12:58 pm

Sarcasm fail.

You quoted Tim.

If the sum is not physically meaningful then the average won’t be either.

Reply to  Bellman
May 11, 2026 2:06 pm

I quoted Tim, who quoted YOU. Play again?

Much easier to read Tim’s replies than your unreadable tomes.

Reply to  karlomonte
May 11, 2026 4:18 pm

Nope, it’s all Tim’s

https://wattsupwiththat.com/2026/05/07/uah-v6-1-global-temperature-update-for-april-2026-0-39-deg-c/#comment-4193821

What I said was

And it doesn’t matter if that sum is not physically meaningful, the average will still be as useful as it is for the extensive property. Really, this idea that every step of an equation has to be physically meaningful for the end result to be meaningful is just nonsense.

It’s Tim who said

If the sum is not physically meaningful then the average won’t be either.

Which, I agree, is a dumb thing to say.

Do I have to use the sarcasm tag again?

Reply to  Bellman
May 11, 2026 6:02 pm

It’s Tim who said

“If the sum is not physically meaningful then the average won’t be either.”

Which, I agree, is a dumb thing to say.

It is not dumb. If the sum of intensive values does not give a physically meaningful value, dividing that meaningless value by a counting number will not result in a magical change to a number that is physically meaningful.

Intensive properties don’t add — so their averages are usually meaningless. Temperature, pressure, density, concentration, pH, etc. are not additive.
If you take:

– Object A at 300 K
– Object B at 400 K

The arithmetic mean (350 K) is not the temperature of any physical system unless very special constraints hold.

This is the same reason you can’t average:

– Pressures of two sealed tanks
– Humidities of two air parcels
– pH values of two solutions
– Wind speeds from two stations

The arithmetic mean has no physical interpretation.

An average can be correct only if the measurand is a ratio of extensive quantities.

An intensive property can only be averaged when the average corresponds to a ratio of extensive quantities that can be meaningfully summed.

Examples are:

  • Mass‑weighted average temperature because internal energy U = m×c×T is extensive.
  • Volume‑weighted average density because mass and volume are extensive.
  • Mole‑fraction‑weighted concentration because moles are extensive.

In these cases, the “average” is not really an average of intensives — it’s a ratio of sums of extensives.
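A sketch of that ratio-of-extensives construction (all masses, specific heats, and temperatures below are made-up numbers; constant specific heats are assumed):

```python
# An "average temperature" that IS a ratio of summed extensive quantities:
# total internal energy (sum of m*c*T) divided by total heat capacity
# (sum of m*c). Both numerator and denominator are extensive, so they add.
def weighted_mean_temp(parcels):
    """parcels: iterable of (mass, specific_heat, temperature_K) tuples."""
    energy = sum(m * c * t for m, c, t in parcels)    # extensive: adds
    capacity = sum(m * c for m, c, t in parcels)      # extensive: adds
    return energy / capacity

parcels = [(2.0, 1.0, 300.0), (1.0, 1.0, 400.0)]
unweighted = sum(t for _, _, t in parcels) / len(parcels)   # 350.0
weighted = weighted_mean_temp(parcels)                      # 1000/3 ~ 333.3
```

With equal masses and specific heats the weighted result collapses to the ordinary arithmetic mean; the moment the parcels differ, the two numbers diverge.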

When the two objects are distinct, averaging is almost always invalid. If objects differ in:

  • mass
  • heat capacity
  • volume
  • composition
  • geometry
  • exposure
  • representativeness

then the arithmetic mean of their intensive properties is not a property of any real or hypothetical combined system.

This is the situation in most real-world measurement problems.

Reply to  Jim Gorman
May 11, 2026 6:49 pm

If the sum of intensive values does not give a physically meaningful value, dividing that meaningless value by a counting number will not result in a magical change to a number that is physically meaningful.

Just keep repeating that, maybe it will become true. And could you please define “physically meaningful”.

If you don’t want to add up all values to get an average, will you accept that a median value could be meaningful?

And what if you calculate the mean without first adding up all intensive values?

Do you allow that the difference between two intensive values can be meaningful? What about the sum of two differences?

The arithmetic mean (350 K) is not the temperature of any physical system

And I keep telling you the average does not have to be the property of a physical system to be useful. Averages are generally used for statistical analysis. This is true whether the property averaged is extensive or intensive.

Try your argument with an extensive property – two objects, one with a mass of 300g the other with a mass of 400g. The average of 350g is not the mass of any physical system.

Reply to  Bellman
May 11, 2026 6:08 pm

Do whatever floats your barge.

Of all the troll-persons inhabiting WUWT, you are the most disingenuous which is why I refuse to debate anything with you.

You talk in circles, your typing is atrocious, you backpedal like crazy when your bizarre ideas are exposed, and you have a bad case of LWS.

Reply to  Tim Gorman
May 11, 2026 7:11 am

Temperature is a measure of the average kinetic energy of a substance’s molecules. It says nothing about the latent heat content. Averaging intensive values is generally not allowed.

One can always work out assumptions under which, in a unique example, averaging may work. That does not invalidate the general rule that an intensive quantity must first be converted to an extensive one.

Even under assumptions that would allow direct averaging, following the general rule, that is, conversion to extensive values, will still achieve the correct answer.

Reply to  Tim Gorman
May 11, 2026 6:12 pm

You need to pick one and stick with it. You just contradicted yourself!

It’s only a contradiction if you are incapable of reading.

Statement 1 – you don’t have to weigh multiple rocks in one go, you can weigh each one separately and add the results.

Statement 2 – A competent statistician should not say you can average anything and always get a meaningful result.

I really don’t want to get inside your head to figure out why you think those two statements are contradictory.

If the sum is not physically meaningful then the average won’t be either.

Argument by endless repetition is very boring. Do you make the same claim about standard deviation? If variance is meaningless then the standard deviation must also be meaningless.

If ANY step in an equation is non-physical, then the result of the equation will be non-physical as well.

So say you want to calculate kinetic energy. I see the equation is 1/2 mv². Velocity squared must be a meaningless concept, so you conclude that kinetic energy is non-physical. And by the same logic e = mc² must be meaningless, unless you can describe what the square of the speed of light looks like.
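For what it’s worth, in the standard work-energy derivation (constant mass, starting from rest) the square shows up only as an intermediate of an integral:

```latex
W = \int F \,\mathrm{d}x
  = \int m \frac{\mathrm{d}v}{\mathrm{d}t}\, v \,\mathrm{d}t
  = m \int_{0}^{v} u \,\mathrm{d}u
  = \tfrac{1}{2} m v^{2}
```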

Reply to  Bellman
May 11, 2026 6:41 pm

So say you want to calculate kinetic energy. I see the equation is 1/2 mv². Velocity squared must be a meaningless concept, so you conclude that kinetic energy is non-physical. And by the same logic e = mc² must be meaningless, unless you can describe what the square of the speed of light looks like.

Your brain has exploded. You have obviously never taken a calculus based higher level physics class.

1/2 mv² is a common functional relationship that is reproducible in experiments. The 1/2 is a physical constant (not a counting number, think π) determined through experiment. It is not an average! You would know this if you had taken a physics lab in college. The formula accurately predicts correct values every time. Its predictions are not meaningless. You should be happy that is the case, as safety requirements for autos rely on its fundamental predictions.

Same with e = mc². It is not an average, it is a functional relationship that has been experimentally verified. https://www.nist.gov/news-events/news/2005/12/einstein-was-right-again-experiments-confirm-e-mc2

Reply to  Jim Gorman
May 11, 2026 6:53 pm

“1/2 mv² is a common functional relationship that is reproducible in experiments. The 1/2 is a physical constant (not a counting number, think π) determined thru experiment.”

Read what I said. I’m not arguing about the 1/2, I’m arguing about squaring velocity.

The formula accurately predicts correct values every time.

Almost as if you don’t need every part of the equation to be meaningful in order for the result to be meaningful.

Same with e = mc². It is not an average”

Again, why would you think it’s an average. The question is what physically meaningful property is the square of the speed of light?

Reply to  Bellman
May 11, 2026 5:08 am

But as usual you think that all statisticians and mathematicians are idiots who don’t understand their own research, and all scientists who use statistics are idiots. “

That’s *NOT* what I’ve asserted at all. Statisticians and mathematicians that live in “statistical world” don’t understand their own research. It’s like statistical textbooks that have examples of averaging measurements without including the measurement uncertainty associated with the measurements in the results. It’s from living in a blackboard statistical world instead of reality.

YOU are a perfect example. Thinking that a non-physical average is physically meaningful!



Reply to  Bellman
May 11, 2026 5:12 am

“You cannot. An average has to be of things with the same units.”

Why? If you can’t create a larger system from things with different units then how is that different from trying to create a larger system from things with the same units that can’t be combined into a larger system?

“Exactly the same, in the sense it’s completely different. Do words have no meaning for you?”

Apparently the words “intensive” and “extensive” have no meaning for you. They are completely different as well! But you seem to think that you can create a larger system from each simply by adding the values.

“You think a significant change in average temperature can occur without there being any change in the larger system?”

A perfect example of the fact that you think you can create a larger system from intensive properties of different things.

That’s what this all boils down to. The statisticians meme that you can average anything and have it be physically meaningful.

Reply to  Tim Gorman
May 11, 2026 12:09 pm

“If you can’t create a larger system from things with different units then how is that different from trying to create a larger system from things with teh same units that can’t be combined into a larger system?”

Try to read my comments. I didn’t say you cannot create a system from different units. I said you couldn’t average things with different units.

“Apparently the words “intensive” and “extensive” have no meaning for you.”

I’ve explained the difference enough times. But as is your usual distraction technique any attempt to explain why your claim is wrong results in you claiming I don’t understand the distinction.

Extensive and intensive properties are different. It does not mean you cannot average both.

“A perfect example of the fact that you think you can create a larger system from intensive properties of different things.”

Temperatures usually have the same units. You can average them.

“The statisticians meme that you can average anything and have it be physically meaningful.”

You really need to define “physically meaningful”. The average is not a physical thing; it is related to physical properties of individual elements of the population, and it is meaningful. This is true regardless of properties being intensive or extensive. Do you regard the average weight of a person as physically meaningful? If you do, why would you not regard the average temperature of a person as physically meaningful?

Reply to  Bellman
May 11, 2026 5:20 am

“I’m not sure you understand cause and effect here. Growing seasons only change becasue temperatures are changing. The changing average indicates that temperatures are changing. You’ve just demonstrated that the average is not completely meaningless.”

Talk about not understanding cause and effect. A higher average caused by increasing maximum temperatures has a totally different effect than if the higher average is caused by increased minimum temperatures.

What has climate science been preaching for 25 years or longer? That the higher average will cause crop failures, declining polar bear populations, disappearance of polar ice, massive starvation and migration of humans and animals, etc.

It’s exactly the same as trying to assign an “effect” to your temperature trends. If the trend line is physically meaningless then you can’t assign a physical effect. And the trend of a physically meaningless average *is* itself meaningless.

Subtracting two dates to get a growing season is 1. *NOT* an average, and 2. *NOT* a use of intensive properties.

Reply to  Tim Gorman
May 11, 2026 12:00 pm

“A higher average caused by increasing maximum temperatures has a totally different effect than if the higher average is caused by increased minimum temperatures.”

Lots of things are possible. The point is that changing temperatures can cause changes to the average and growing season. Changes in the average or growing season do not cause changes in temperature. But it was just a joke about the way you worded your comment. Not worth going into a lengthy argument about it.

bdgwx
Reply to  Tim Gorman
May 9, 2026 8:09 am

Statistical descriptors are *NOT* measurements.

As I’ve shown you before that position is not consistent with the GUM. The GUM states that an average can be a measure of a measurand. See [JCGM GUM-6:2020] section 5.7, E.2.2, and 11.10.4 for relevancy.

Reply to  bdgwx
May 9, 2026 11:20 am

5.7: The fitness for purpose of a measurement model can encompass considerations made before measurement. 

You conveniently skipped 5.8:

5.8 When developing a measurement model, the ranges of possible values of the input quantities and output quantities should be considered. “

Ranges, especially of input quantities, are routinely ignored and swept under the rug in climatology.

And you skipped 11.7 Models for time series, which has an example for a temperature bath.

11.10 Model selection and model uncertainty 

You skipped an important part in 11.10.4:

Furthermore, the uncertainty surrounding model selection (model uncertainty) should be evaluated and propagated to the results recognizing the several alternatives [30, 36], and become just another contribution to the uncertainty to be associated with the estimate of the measurand. 

This is never done in climatology.

Section E is titled: Cause-and-effect analysis.

E.2.2 says nothing about averaging different quantities and different measurement systems.

Cherry picking, as usual, to get the answer you want and need.

bdgwx
Reply to  karlomonte
May 9, 2026 1:00 pm

There isn’t a single thing in your response that challenges the notion that measurements can be averages of other quantities. In fact, the only relevant thing in your response was a mention of section 11.7 which…wait for it…contains an example of averaging temperature.

Reply to  bdgwx
May 9, 2026 3:49 pm

Each and every section speaks of measuring the properties of a SINGULAR measurand. Not a large system consisting of multiple measurands, which you are trying to characterize by using an intensive property of the multiple measurands.

Someday you REALLY need to read the GUM for meaning and context and stop cherry picking bits and pieces that you think confirm your misconceptions.

Measuring the temperature of a water bath 10 times and finding the average value is finding the best estimate of the temperature OF THAT SINGLE WATER BATH!

bdgwx
Reply to  Tim Gorman
May 9, 2026 5:01 pm

Each and every section speaks of measuring the properties of a SINGULAR measurand. Not a large system consisting of multiple measurands, which you are trying to characterize by using an intensive property of the multiple measurands.

And I’m the one whose reading comprehension skills are lacking?

Measuring the temperature of a water bath 10 times and finding the average value is finding the best estimate of the temperature OF THAT SINGLE WATER BATH!

Says the guy who thinks it is impossible to average temperature…

Reply to  bdgwx
May 10, 2026 3:55 am

It’s impossible to average intensive properties of multiple measurands!

What is it that you don’t understand about that truth?

In order to create a physical average, you must be able to add the individual component’s property together to build a larger system that can be characterized. You can’t do that with intensive properties. Temperatures do not add into a larger system. 20C + 20C ≠ 40C.

What is it that you don’t understand about that?

When you are averaging the measurements of a single measurand you are averaging the MEASUREMENT values to get a better estimate of the measurand property; you are not averaging the intensive property itself.

What is it that you do not understand about that?

Reply to  bdgwx
May 9, 2026 4:08 pm

Thank you for confirming that you indeed ignore most everything in the GUM by cherry picking bits here-and-there.

Reply to  karlomonte
May 10, 2026 4:57 am

How hard is it to understand that you can’t add two rocks at 20C each together to get a temperature of 40C?

Tracking a non-physical statistical descriptor over time doesn’t describe reality, all it does is track the statistical descriptor of the non-physical data set over time. That doesn’t mean you are tracking a physical property. If you aren’t tracking a physical property, then you can’t make any judgements about reality.

Reply to  Tim Gorman
May 10, 2026 7:13 am

Numbers is numbers!

The truth is, he doesn’t care that the invention of the term “forcing”, denominated in the ludicrous units of Watts per meter squared is meaningless.

The truth is, he doesn’t care one bit that Spencer’s reporting of microwave radiance data averages to 1 mK is meaningless.

And the truth is, deep down he just wishes measurement uncertainty had never been developed.

It is all too inconvenient to his warmunist worldview.

Reply to  karlomonte
May 11, 2026 3:34 am

Numbers is numbers!”

Yep. Statistical World is *not* congruent with Real World unless you are a statistician instead of a physical scientist.

“And the truth is, deep down he just wishes measurement uncertainty had never been developed.”

Exactly!

Reply to  bdgwx
May 10, 2026 4:24 am

You are incorrect. A measurement is a result of evaluating a measurement model that is a functional relationship.

An average (mean) is not a functional relationship. It is a statistic, not a mapping. A functional relationship is a rule y=f(x) that assigns one output to each input or group of unique inputs. A mean is a summary operator applied to a set of values. It does not map inputs to outputs in the sense required for a function describing a physical or mathematical relationship.

You are confusing a statistical formula with a functional relationship that can predict output values from different inputs. An average cannot define the relationship between input quantities.

A functional relationship must connect variables, not data points. A mean is a property of a dataset, not a law of nature. You can say “the mean is a function of the dataset,” but that is a set‑function, not a functional relationship between physical quantities.

When you write the mean as f(x1, x2, …, xn) = x1/n + x2/n + … + xn/n you are declaring x1/n as a unique observation of an input quantity. The same for x2/n. You might explain why scaling an actual measurement like x1 or x2 by n is a functional relationship related to the other input quantities. I am sorry, but you are trying to cram a simple statistical formula into a functional relationship.

Here is what the GUM says:

4.1.1 In most cases, a measurand Y is not measured directly, but is determined from N other quantities X1, X2, …, XN through a functional relationship f :

Each “input quantity”, X1, X2, …, XN, is a unique and standalone physical quantity that is physically measured. X1 is estimated by the value of x1, which itself may have multiple measurements of the same thing to determine its estimated value. Its measured value is not dependent on any other variable in the functional relationship like x1/n would be.

Scaling a measured value must have a reason for doing so. Otherwise, one could make the argument that the height of a single rafter is 6 ft/10 because there are 10 of them. Or that a temperature at a given station, on a given day, at a given time is T/n because I have n different measurements from n different stations, on different days, and even at different times.
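A minimal sketch of that distinction (the function names and numbers are purely illustrative, not from any standard):

```python
# Two different roles for "f": a measurement model maps distinct input
# quantities to an output quantity, while a mean is a set-function that
# summarizes a dataset of values of ONE quantity.
def density(mass_g, volume_cm3):
    """Measurement model y = f(X1, X2): density from mass and volume."""
    return mass_g / volume_cm3

def mean(values):
    """Set-function: a summary statistic of a dataset, not a physical law."""
    return sum(values) / len(values)

rho = density(10.0, 4.0)                   # 2.5 g/cm^3
best_estimate = mean([19.9, 20.1, 20.0])   # repeated readings of one measurand
```

The first function connects different physical quantities; the second only summarizes a set of numbers, whatever they happen to be.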

Reply to  bdgwx
May 9, 2026 3:46 pm

The GUM states that an average can be a measure of a measurand”

As usual, your reading comprehension skills are sadly lacking.

The word MEASURAND is singular – meaning ONE MEASURAND. The implication being that you are measuring the INTENSIVE VALUE of one single system, not a larger system consisting of multiple smaller systems.

bdgwx
Reply to  Tim Gorman
May 9, 2026 5:08 pm

As usual, your reading comprehension skills are sadly lacking.

This epitomizes why I rarely engage with you. I cite sections of the GUM that unequivocally say an average can be a measurand and your knee-jerk reaction is to then accuse me of lacking comprehension. Anyway, maybe your grievance isn’t with me, but with the GUM since you are constantly challenging what it says.

Reply to  bdgwx
May 10, 2026 5:12 am

The issue isn’t averaging measurements of a single measurand to determine the best estimate of its property.

The issue is averaging measurements of an intensive property for MULTIPLE measurands to develop a characteristic intensive property for the larger system consisting of the multiple measurands.

All you are doing here is equivocation.

Merriam-Webster: equivocation – deliberate evasiveness in wording : the use of ambiguous or equivocal language

You keep trying to convince everyone that if you can average measurements of a single measurand to get a “best estimate” of the intensive property of that SINGLE measurand, then you can also average the measurements of the intensive property of MULTIPLE measurands to get an average value for the intensive property across a larger system consisting of all of the measurands.

You are being deliberately evasive in wording by trying to make the issue into averaging multiple measurements of a single measurand when the issue is multiple measurements of multiple measurands.

You can only average the properties of multiple measurands if you can combine those individual measurands and properties into a larger system. You cannot do that with intensive properties like temperature.

What is it that you can’t seem to understand about something so simple?

Do you *really* believe that if you have two rocks in your hand at 20C each that you are holding 40C in your hand?
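To make the two cases in this exchange concrete, here is a minimal Python sketch (all numbers invented): averaging repeated readings of ONE rock, versus averaging one reading each from a hundred DIFFERENT rocks. Both are arithmetic means, but they answer different questions.

```python
import random
random.seed(42)

# Case 1: n repeated measurements of ONE measurand (a single rock at 20 C).
# The mean is the best estimate of that rock's temperature; the scatter
# (sigma = 0.5 C here, assumed) is instrument noise.
true_temp = 20.0
readings = [true_temp + random.gauss(0, 0.5) for _ in range(100)]
best_estimate = sum(readings) / len(readings)

# Case 2: one measurement each of MANY measurands (100 different rocks
# between 10 C and 30 C).  The mean is a statistic describing the ensemble;
# it is not the temperature of any single physical system, and the SUM of
# the values ("40 C in your hand") has no physical meaning for an
# intensive property.
rock_temps = [random.uniform(10, 30) for _ in range(100)]
ensemble_mean = sum(rock_temps) / len(rock_temps)

print(round(best_estimate, 2), round(ensemble_mean, 2))
```

Whether the Case 2 mean is a "measurand" in the GUM's sense is exactly the point in dispute above; the code only shows that the two operations are computationally identical but conceptually distinct.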

Reply to  Tim Gorman
May 9, 2026 3:28 pm

“I have to interject here that all of this discussion is nothing more than mathematical masturbation.

Temperature is an intensive property”

Funny that I can physically sense temperature (that parameter receiving prominent, repeated mention in discussions under this article), whereas I simply cannot physically sense mathematics. Oh well.

/sarc

Reply to  ToldYouSo
May 9, 2026 3:44 pm

Climate science believes that you can add the temperature of a cubic meter of air on Pikes Peak to that of a cubic meter of air at Colorado Springs and form a larger system whose temperature can be averaged and used to describe the climate of the area encompassing both sites.

mass1 + mass2 = mass_total
temperature1 + temperature2 ≠ temperature_total

If I have you close your eyes and I put a 10g rock and a 20g rock in your right hand and two rocks of 15g in your left hand can you tell the difference in the weights you are having to hold up with each hand?

If I put a rock at 10C and a rock at 12C in your right hand and two rocks at 11C in your left hand does each hand experience a total of 22C? Can you say that the temperature of the rocks in each hand total to 22C?
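The rock example above can be sketched numerically. A hedged Python illustration (masses, temperatures, and the equal-specific-heat assumption are all mine): extensive quantities like mass add directly, while the temperature reached when bodies equilibrate is the heat-capacity-weighted mean, never the sum.

```python
# Extensive vs. intensive, using the two-rocks-in-hand example.
# Assumption: both rocks have the same specific heat c (arbitrary units).

def equilibrium_temp(masses_g, temps_c, specific_heat=1.0):
    """Final temperature when bodies equilibrate by contact:
    T_eq = sum(m_i * c * T_i) / sum(m_i * c)."""
    heat = sum(m * specific_heat * t for m, t in zip(masses_g, temps_c))
    capacity = sum(m * specific_heat for m in masses_g)
    return heat / capacity

masses = [10.0, 20.0]   # mass IS additive: the hand holds 30 g
temps = [10.0, 12.0]    # temperature is NOT: "22 C total" is meaningless

total_mass = sum(masses)                # 30.0 g
t_eq = equilibrium_temp(masses, temps)  # (10*10 + 20*12) / 30 = 11.33 C
print(total_mass, round(t_eq, 2))
```

Note that even the legitimate combination of the two temperatures requires the extensive quantities (mass, heat capacity) as weights; the bare sum 10 + 12 appears nowhere in the physics.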

Reply to  Tim Gorman
May 10, 2026 8:08 am

You mistake me for being a scientific measuring instrument.

bdgwx
May 9, 2026 6:59 am

The new Monckton Pause extends to 39 months starting in 2023/02. The average of this pause is 0.55 C. The previous Monckton Pause started in 2014/06. It lasted 107 months and had an average of 0.21 C. That makes this pause 0.34 C higher than the previous one.

+0.156 ± 0.038 C/decade (k=2) is the trend from 1979/01 to 2026/04, covering 568 values.

+0.026 ± 0.010 C/decade² (k=2) is the acceleration of the trend.
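A hedged sketch of how such a trend figure is typically computed: ordinary least squares of monthly anomalies against time, with the slope converted to degrees per decade. The series below is synthetic (NOT the actual UAH data), built to have exactly a 0.16 C/decade slope; the quoted ± values would additionally require the regression standard error, usually adjusted for autocorrelation.

```python
# Hedged sketch: linear trend in C/decade from monthly anomalies via OLS.
# The input series is synthetic, constructed for illustration only.

def ols_slope(y):
    """Least-squares slope of y against index 0..n-1 (units of y per month)."""
    n = len(y)
    xbar = (n - 1) / 2.0
    ybar = sum(y) / n
    num = sum((i - xbar) * (yi - ybar) for i, yi in enumerate(y))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

# 568 monthly values (1979/01 to 2026/04) warming at exactly 0.16 C/decade.
months = 568
series = [0.16 / 120.0 * i for i in range(months)]  # 120 months per decade

trend_per_decade = ols_slope(series) * 120.0
print(round(trend_per_decade, 3))  # -> 0.16 by construction
```

The quoted acceleration figure would come from the quadratic coefficient of a second-order fit to the same series, which is zero for this synthetic example.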

bdgwx
Reply to  bdgwx
May 9, 2026 7:12 am

[image: graph]

Reply to  bdgwx
May 10, 2026 4:26 am

The value of the “pauses” is that they show increasing CO2 had no discernible effect on temperature during the pause. They indicate that natural variation has a much larger effect on temperature than the single variable of CO2 does.

(edited to correct)

bdgwx
Reply to  Jim Gorman
May 10, 2026 7:13 am

As I’ve told you before, this is the reduction fallacy. Ironically, it is this fallacy that provided the impetus for me to develop the graph above.

Reply to  bdgwx
May 10, 2026 7:53 am

The “reduction fallacy” is when someone tries to explain a complex system entirely as a function of one of its parts – i.e. that climate is determined by temperature. Which is what your graph attempts to do. Heal thyself, physician.