UAH Global Temperature Update for January, 2023: -0.04 deg. C

From Dr. Roy Spencer’s Global Warming Blog

by Roy W. Spencer, Ph.D.

The Version 6 global average lower tropospheric temperature (LT) anomaly for January 2023 was -0.04 deg. C departure from the 1991-2020 mean. This is down from the December 2022 anomaly of +0.05 deg. C.

The linear warming trend since January, 1979 now stands at +0.13 C/decade (+0.11 C/decade over the global-averaged oceans, and +0.18 C/decade over global-averaged land).
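That +0.13 C/decade figure is an ordinary least-squares slope fitted to the monthly anomaly series. As a minimal sketch in pure Python (a hypothetical noise-free series stands in for the actual UAH data file):

```python
def ols_slope(t, y):
    """Ordinary least-squares slope of y against t."""
    n = len(t)
    tbar, ybar = sum(t) / n, sum(y) / n
    num = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
    den = sum((ti - tbar) ** 2 for ti in t)
    return num / den

# Months from Jan 1979 through Jan 2023, expressed in decades (120 months each)
decades = [m / 120.0 for m in range(529)]
# Hypothetical anomalies rising at exactly +0.13 C/decade (illustration only)
anoms = [0.13 * d - 0.3 for d in decades]

trend = ols_slope(decades, anoms)  # slope in C per decade
```

On the real data the residual scatter around the fitted line is large, which is what the arguments in the comment thread below are about.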

Various regional LT departures from the 30-year (1991-2020) average for the last 13 months are:

YEAR MO   GLOBE  NHEM.  SHEM.  TROPIC USA48  ARCTIC AUST
2022 Jan  +0.03  +0.06  -0.00  -0.23  -0.13  +0.68  +0.10
2022 Feb  -0.00  +0.01  -0.01  -0.24  -0.04  -0.30  -0.50
2022 Mar  +0.15  +0.27  +0.03  -0.07  +0.22  +0.74  +0.02
2022 Apr  +0.26  +0.35  +0.18  -0.04  -0.26  +0.45  +0.61
2022 May  +0.17  +0.25  +0.10  +0.01  +0.59  +0.23  +0.20
2022 Jun  +0.06  +0.08  +0.05  -0.36  +0.46  +0.33  +0.11
2022 Jul  +0.36  +0.37  +0.35  +0.13  +0.84  +0.55  +0.65
2022 Aug  +0.28  +0.31  +0.24  -0.03  +0.60  +0.50  -0.00
2022 Sep  +0.24  +0.43  +0.06  +0.03  +0.88  +0.69  -0.28
2022 Oct  +0.32  +0.43  +0.21  +0.04  +0.16  +0.93  +0.04
2022 Nov  +0.17  +0.21  +0.13  -0.16  -0.51  +0.51  -0.56
2022 Dec  +0.05  +0.13  -0.03  -0.35  -0.21  +0.80  -0.38
2023 Jan  -0.04  +0.05  -0.14  -0.38  +0.12  -0.12  -0.50

The full UAH Global Temperature Report, along with the LT global gridpoint anomaly image for January, 2023 should be available within the next several days here.

The global and regional monthly anomalies for the various atmospheric layers we monitor should be available in the next few days at the following locations:

Lower Troposphere:

http://vortex.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt

Mid-Troposphere:

http://vortex.nsstc.uah.edu/data/msu/v6.0/tmt/uahncdc_mt_6.0.txt

Tropopause:

http://vortex.nsstc.uah.edu/data/msu/v6.0/ttp/uahncdc_tp_6.0.txt

Lower Stratosphere:

http://vortex.nsstc.uah.edu/data/msu/v6.0/tls/uahncdc_ls_6.0.txt


443 Comments
February 1, 2023 10:47 pm

Ja. Past few years I am battling to get my pool up to temperature. It is getting cooler here. In South Africa.

https://breadonthewater.co.za/2022/08/02/global-warming-how-and-where/

strativarius
February 1, 2023 11:51 pm

Don’t tell the BBC…

“Why have there been no named winter storms this year?”

https://www.bbc.co.uk/news/uk-64454569

rah
Reply to  strativarius
February 2, 2023 12:04 am

That will likely change in a week or so.

strativarius
Reply to  rah
February 2, 2023 12:06 am

Then again it might not

rah
Reply to  strativarius
February 2, 2023 12:25 am

Have you checked the models? What we’re getting here is set to visit Europe in 8-9 days.

strativarius
Reply to  rah
February 2, 2023 12:30 am

I checked my model Avro Lancaster – no change there

rah
Reply to  strativarius
February 2, 2023 12:36 am

Guess it didn’t belong to 617 Squadron.

strativarius
Reply to  rah
February 2, 2023 12:49 am

626 – RAF Wickenby

Reply to  rah
February 2, 2023 4:01 am

This is what we are getting here:

https://earth.nullschool.net/#current/wind/isobaric/500hPa/overlay=temp/orthographic=-55.64,48.92,264/loc=-80.524,62.634

The coldest air in the arctic is marked, and is currently affecting the northern and northeastern parts of the U.S., along with large areas of Canada, and is working its way east.

The UK is south of the jet stream at the moment, keeping the weather mild there, but that cold air over the U.S. and Canada is headed the UK’s way.

Mr David Guy-Johnson
Reply to  Tom Abbott
February 2, 2023 4:29 am

Not according to any of the reputable medium-range models. Which is disappointing, because I was hoping for a cold winter and to rub some warmist noses in the snow.

Mr David Guy-Johnson
Reply to  rah
February 2, 2023 4:27 am

I’ve checked and they don’t forecast any untoward weather for Europe. For western Europe they’re forecasting that a weak anticyclone will dominate for most of the next 10 days, so there won’t be any strong winds and no named storms.

Reply to  strativarius
February 2, 2023 1:14 am

Have you ever witnessed such a belch of BaffleGab, WeaselWords and Coulds&Maybes in your life?
They haven’t a fugging clue and are grasping at straws, even ones that are on The Other Side of the Earth in Latitude, Longitude and (effectively) Altitude.

When is any serious number of people gonna get off their backsides and call out this shyte for what it is?

sherro01
Reply to  strativarius
February 2, 2023 5:54 am

And don’t tell the Australian leftist governments, Federal and State.
By the Monckton method, with UAH monthly data updated to include January 2023, the “pause” in warming over Australia’s lower troposphere is now 10 years and 9 months.
There has been slight cooling from May 2012.
Hypothetical: if this pause lasts 5 more years, there will be no Australian school children who have been exposed to global warming in their home country, Australia.
Why are they taught that global warming is an existential threat?
Geoff S
http://www.geoffstuff.com/uahfeb2023.jpg

Reply to  strativarius
February 2, 2023 6:18 am

I’m sure they will start naming the various polar vortexes that come down from the Arctic soon enough

Reply to  strativarius
February 5, 2023 11:14 am

There were no named winter storms because they didn’t feel like naming any. It’s not like hurricanes that have a set standard.

rah
February 1, 2023 11:55 pm

Well, that isn’t what was heard. What was heard was that January was one of the warmest on record for the US.

But February sure has started out with the deep freeze door wide open here. And it looks like all the models are showing Europe is in for their own blast of Arctic air starting late next week.

Mr David Guy-Johnson
Reply to  rah
February 2, 2023 12:03 am

I’m not sure where you get that about Europe from. There is actually zero sign of any untoward cold weather in Western Europe in the next fortnight according to all of the medium range forecasts, or at least the reputable ones. A shame because I enjoy a bit of snow and ice.

Philip Mulholland
Reply to  Mr David Guy-Johnson
February 2, 2023 12:17 am

I suggest that you follow Ventusky
See 12 Feb 23

Reply to  Mr David Guy-Johnson
February 2, 2023 1:40 am

They are forecasting snow in the U.K. by the middle of this month, maybe.

Editor
Reply to  JohnC
February 2, 2023 12:34 pm

Today’s BoM (Bureau of Meteorology) forecast for Thredbo, NSW, is “Cloudy. High (70%) chance of snow showers.”. And it’s mid-summer in Australia. On 31 Jan, a place I visited in Robertson, NSW (between Canberra and Sydney) had a log fire burning mid-afternoon – and they needed it. Hey, weather isn’t climate, and the Snowies do get summer snow, but an awful lot of places seem to be getting a bit of cold right now.

Reply to  Mike Jonas
February 2, 2023 8:37 pm

G’Day Mike,

A spot of history. Late April, 1970, we were in Canberra. The snow was down to 4,000 feet. We hightailed it to Brisbane.

Reply to  JohnC
February 2, 2023 12:13 pm

I prefer the one that is right. In Ohio, the 5-day forecasts vary daily, and are often wrong 12-hours before. This is fairly flat country, unlike when I lived in Vermont in the late-60s and expected turbulence in the air masses passing over the Green Mountains to produce low predictability. Quite frankly, with the advent of Doppler Radar, geosynchronous weather satellites, and computer models, I would expect much better precipitation predictions. It is my subjective impression that about the only thing that can be depended on are the temperature forecasts. However, just using the historical averages would probably be nearly as good.

renbutler
Reply to  rah
February 2, 2023 4:47 am

In my part of the Midwest, it was the ninth warmest January ever. But my tiny corner of the world is a mere speck in the larger climate. Looks like the southern hemisphere was quite cool — which some people tend to appreciate in the (southern) summertime. I know I like cooler summers myself.

Coeur de Lion
February 2, 2023 12:42 am

Berlin minus four overnight Saturday week. Note that our gradually warming planet first reached this anomaly in 1983. So that’s 40 years of no warming.

Reply to  Coeur de Lion
February 2, 2023 1:32 am

So that’s 40 years of no warming.

That’s not quite how trends are calculated.

Linear warming trend since that first -0.04C anomaly (April 1983) is +0.14C per decade, according to UAH.

trend.png
Ron Long
Reply to  TheFinalNail
February 2, 2023 2:05 am

“That’s not quite how trends are calculated.” How do you calculate that the January global temperature (Dr. Roy Spencer) is 0.04 deg C under the 1991 to 2020 average, any way other than: in 32 years it has gotten a little colder/less warm?

Reply to  Ron Long
February 2, 2023 3:59 am

By linear regression. The exact same method used by Roy Spencer to arrive at his +0.13C per decade warming value.

Reply to  TheFinalNail
February 2, 2023 4:29 am

Explain why linear regression is appropriate from a statistical point of view. Does the series exhibit well behaved residuals, in accordance with the underlying assumptions behind OLS estimation (i.e. normally distributed, homoscedastic, with no substantive evidence of autocorrelation)? Is a linear model really appropriate? The Fourier analysis strongly suggests not.

Reply to  It doesnot add up
February 2, 2023 4:39 am

Explain why linear regression is appropriate from a statisical point of view.

Because it’s the method used by UAH and Roy Spencer, and it’s their data set we are discussing here. Perhaps you should direct your comments to Dr Spencer on his blog?

Reply to  TheFinalNail
February 2, 2023 4:54 am

*YOU* were asked the question and you bailed. Meaning you have no idea if the method of analysis is proper or not.

So you just fall back on the argumentative fallacy of Appeal to Authority.

If you can’t support *your* assertions then don’t make them.

Reply to  Tim Gorman
February 2, 2023 5:23 am

Let me get this straight. UAH posts its monthly update and uses linear regression to estimate the rate of warming in its global LT data set, +0.13C per decade. I now have to justify and explain UAH’s method of trend calculation? As I said before, if you don’t agree with it, take it up with UAH rather than me.

By the way, what do you make of Coeur de Lion’s approach of selecting a monthly anomaly from 40 years ago that happens to be the same as Jan 2023 and concluding from that that there has been no warming? You didn’t comment on it.

Is that a better statistical method than linear regression? Maybe you should pass that method on to Dr Spencer?

Reply to  TheFinalNail
February 2, 2023 12:05 pm

Look at this graph again.

comment image

Do temps below the baseline indicate warming to you?

This is a time series, and the way it is forecast is important!

What would you say if Dr. Christy used this graph and simply said there has been no CONSISTENT trend to warming?

Reply to  Jim Gorman
February 2, 2023 1:59 pm

What special significance do you attach to the base line? It’s just the 1991 – 2020 average. Would there be more warming if you stuck to using the 1981 – 2010 baseline?

Don’t you think the fact that there is more red in recent years and a lot more blue last century is some indication of warming?

Reply to  Bellman
February 3, 2023 9:40 am

No there would not be more warming. The temperatures would still revert to the mean. You just don’t get that natural variation is in control. You can trend what you want, but your trends NEVER show a reversion to the mean! Maybe you can rethink your analysis.

Reply to  Jim Gorman
February 3, 2023 11:12 am

The temperatures would still revert to the mean.

What mean? The old or the new base line, or an earlier one?

You just don’t get that natural variation is in control.

If that’s correct we should expect at some point to see temperatures averaging around the same as they were in the past. So far this has not happened. One month that is below a baseline centered on the temperatures 15 years ago does not suggest that anything has reverted to the mean yet.

Maybe you can rethink your analysis.

I will do, when there is some evidence to suggest a change in the trend.

Reply to  TheFinalNail
February 2, 2023 5:06 pm

Attached is a graph of the 5min data from my Vantage Vue weather station for August, 2019.

Which would you use as part of the baseline to calculate anomalies from? The median? The average? The mid-range?

No matter which one you pick, you are going to introduce an uncertainty into the anomaly calculation; it could be as much as 3.0 F.

Reply to  Tim Gorman
February 2, 2023 5:06 pm

forgot the graph

august_2019.png
Reply to  Tim Gorman
February 3, 2023 4:54 am

Hmmm… no answer. Didn’t really expect one, I guess. It just shows the folly of trying to come up with an accurate baseline to work from that allows accurate anomalies in the hundredths digit! But that doesn’t stop the climate warmers!

Reply to  TheFinalNail
February 2, 2023 6:57 am

Dr Spencer presents a centred 13 month moving average as a simple way of deseasonalising the data. He does not attempt to display a linear trend line. You did that.
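A centred 13-month moving average is simple to sketch: a 13-month window largely cancels a 12-month seasonal cycle. A toy illustration (not Dr Spencer’s actual code):

```python
import math

def centered_moving_average(x, window=13):
    """Centred moving average; None where the full window doesn't fit."""
    half = window // 2
    out = []
    for i in range(len(x)):
        if i < half or i + half >= len(x):
            out.append(None)
        else:
            out.append(sum(x[i - half:i + half + 1]) / window)
    return out

# A pure 12-month seasonal cycle is attenuated roughly 13-fold by the window
seasonal = [math.sin(2 * math.pi * m / 12) for m in range(48)]
smoothed = centered_moving_average(seasonal)
```

The 13th month in each window is the only one that doesn’t cancel, so the residual seasonal amplitude is about 1/13 of the original.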

Reply to  It doesnot add up
February 2, 2023 9:17 am

He does not attempt to display a linear trend line.

The linear warming trend since January, 1979 now stands at +0.13 C/decade (+0.11 C/decade over the global-averaged oceans, and +0.18 C/decade over global-averaged land).

And if you want to see an example of actually drawing a trend line with no challenge, here’s Monckton from last month.

comment image

https://wattsupwiththat.com/2023/01/04/the-new-pause-lengthens-100-months-with-no-warming-at-all

Reply to  Bellman
February 2, 2023 11:18 am

Monckton is not Spencer. Monckton majors on back-casting to find the longest period from the present where the estimated slope is zero. Now at 100 months…

As he notes:

The least-squares method was recommended by Professor Jones of the University of East Anglia as a reasonable method of showing the trend on stochastic temperature data.

However, that is plainly false, as any consideration of the historic climate should tell you. The climate does not follow a linear temperature trend. Monckton is in effect jibing at Prof Jones for his lack of knowledge of statistics.

After all, a hockey stick is not a straight line.
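The back-casting described above can be sketched as a scan over trailing windows: find the longest stretch ending at the present whose least-squares slope is non-positive. This is only an illustration on synthetic data, not Monckton’s actual calculation:

```python
def ols_slope(y):
    """OLS slope of y against its sample index."""
    n = len(y)
    tbar = (n - 1) / 2
    ybar = sum(y) / n
    num = sum((t - tbar) * (yi - ybar) for t, yi in enumerate(y))
    den = sum((t - tbar) ** 2 for t in range(n))
    return num / den

def longest_flat_tail(y):
    """Length of the longest trailing window whose OLS slope is <= 0."""
    best = 0
    for start in range(len(y) - 1):
        if ols_slope(y[start:]) <= 0:
            best = max(best, len(y) - start)
    return best

# Synthetic series: 100 rising months followed by 100 gently falling months
series = [0.01 * t for t in range(100)] + [1.0 - 0.001 * t for t in range(100)]
pause = longest_flat_tail(series)  # months in the trailing "pause"
```

Note that the whole series still has a positive OLS slope even while a long trailing "pause" exists, which is exactly the tension the thread is arguing about.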

Reply to  Bellman
February 2, 2023 12:19 pm

Less than 50% of the variance is explained/predicted by the regression line. Would you use similar statistics to buy stocks and expect to make money?

Reply to  Clyde Spencer
February 2, 2023 1:47 pm

I wouldn’t have a clue how to predict stock markets, but the fact that up to half of the monthly variation can be explained by a simple linear trend is a good indication to me that the trend is real.

Reply to  Clyde Spencer
February 3, 2023 6:18 am

“Less than 50% of the variance is explained/predicted by the regression line.”

The R^2 means nada here. If you detrended the data to zero, the R^2 would go away, but the standard error of the (flat) trend would be identical to that found from the current one.
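That claim checks out numerically: subtracting the fitted slope drives R² to zero while leaving the slope’s standard error untouched, because the residuals are unchanged. A sketch with synthetic data (not the UAH series):

```python
import math, random

def ols(t, y):
    """Return (slope, standard error of slope, R^2) for y ~ a + b*t."""
    n = len(t)
    tbar, ybar = sum(t) / n, sum(y) / n
    stt = sum((ti - tbar) ** 2 for ti in t)
    b = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / stt
    a = ybar - b * tbar
    sse = sum((yi - (a + b * ti)) ** 2 for ti, yi in zip(t, y))
    sst = sum((yi - ybar) ** 2 for yi in y)
    se_b = math.sqrt(sse / (n - 2) / stt)
    return b, se_b, 1 - sse / sst

random.seed(3)
t = list(range(200))
y = [0.01 * ti + random.gauss(0, 0.5) for ti in t]

b1, se1, r2_1 = ols(t, y)
detrended = [yi - b1 * ti for ti, yi in zip(t, y)]  # subtract the fitted slope
b2, se2, r2_2 = ols(t, detrended)                   # slope ~0, R^2 ~0, same SE
```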

Reply to  It doesnot add up
February 2, 2023 9:35 am

Dr Spencer declares the linear trend in every update. His data set contains a linear trend for every single region covered by the data. It’s at the bottom of every column.

Why he doesn’t add a linear trend line to his charts is an interesting question. Perhaps he doesn’t like the look of it?

Reply to  TheFinalNail
February 2, 2023 9:54 am

He always used to add a high order polynomial to his charts “for entertainment purposes only”.

comment image

https://www.drroyspencer.com/2012/07/uah-global-temperature-update-for-june-2012-0-37-deg-c/

bdgwx
Reply to  It doesnot add up
February 2, 2023 11:37 am

The data is already deseasonalized (at least to some extent). Remember, there are actually 12 distinct baselines from which the UAH TLT anomalies are constructed; one for each month.
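The per-calendar-month baseline idea can be sketched like this (toy data; the real UAH baselines are 1991-2020 means computed separately for each month):

```python
from collections import defaultdict

def monthly_anomalies(series):
    """series: list of (year, month, temp). Computes each anomaly against a
    separate baseline mean for its own calendar month."""
    by_month = defaultdict(list)
    for _, month, temp in series:
        by_month[month].append(temp)
    baseline = {m: sum(v) / len(v) for m, v in by_month.items()}
    return [(y, m, t - baseline[m]) for y, m, t in series]

# Toy data: a strong seasonal cycle plus a 0.1 C rise in the second year
data = [(2001, 1, 0.0), (2001, 7, 10.0), (2002, 1, 0.1), (2002, 7, 10.1)]
anoms = monthly_anomalies(data)  # seasonal cycle removed, rise preserved
```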

Reply to  TheFinalNail
February 2, 2023 12:00 pm

They use it only because it compares to all the other joker climate alarmists.

Reply to  It doesnot add up
February 2, 2023 4:52 am

Yep. What does a linear regression of sinusoidal data tell you?

Reply to  Tim Gorman
February 2, 2023 9:30 am

A good rule of thumb is that if a linear curve fit provides no predictability over the graph’s x-axis, you shouldn’t draw it on the graph. Unfortunately, climateers love this particular transgression which allows them to extrapolate normal fluctuations into crises, thereby seemingly increasing their worth to society, and securing continued paychecks.

bdgwx
Reply to  DMacKenzie
February 2, 2023 12:03 pm

DMacKenzie said: “A good rule of thumb is that if a linear curve fit provides no predictability over the graph’s x-axis, you shouldn’t draw it on the graph.”

The model Tx = slope[X:1=>(N-1)] * (N-1) + intercept[X:1=>(N-1)] has an RMSE of 0.176 C. That’s not too bad considering Christy et al. 2003 report the uncertainty of the observations as ±0.1 C (1σ). For comparison the model Tx = -0.24 + [1.3*log2(CO2)] + [0.14*ONIlag4] + [0.32*AMOlag2] + [-2.7*AODvolcanic] has an RMSE of 0.121 C. We can actually get a hair more skill by incorporating auto-correlation into both. Anyway, as you can see OLS has predictive power and it is actually reasonably skillful.

Reply to  bdgwx
February 2, 2023 5:04 pm

Read this site about linear regression.

2.1 – What is Simple Linear Regression? | STAT 462 (psu.edu)

In regards to linear regression this document has some good information.

In general, to be absolutely predictive there must be a functional relationship (gosh, there is that term again) between the independent and dependent variables.

In a statistical relationship, there still must be a relationship between the predictor variable and the response variable although it may not be perfect as with a functional relationship.

In essence, when plotting temperature versus time, it is difficult to prove that time is a good predictor variable. Pauses immediately bring into question whether time has any direct relationship to temperature.

If you want to prove that GHG’s and more specifically CO2 are good predictors of temperature by using linear regression, then you should use those as the variable that is the predictor of a response.

I can tell you have never been responsible for large budgets of people, revenue and expenses. I can’t tell you how many times someone has come to me with a linear regression encompassing several past years and exclaimed that it would predict the upcoming year. When you start to ask about the underlying factors like wage growth, productivity changes, rate increases, etc., they had no answers. Time IS NOT a good predictor variable when something doesn’t directly depend on it.

The linear regressions being touted are really trying to replace time series analysis. This data really needs to be looked at using tools that have been developed for time series analysis if that is what you are really looking for. You will still find out that time is not a factor in predicting temperature even after doing all the transformations to remove seasonality, make the trend stationary, etc.

You are going to end up with something like the attached graph which plainly shows that CO2 and temperature are not very well correlated.

Reply to  Jim Gorman
February 2, 2023 5:10 pm

Forgot the image.

CO2 vs UAH temps.png
Reply to  It doesnot add up
February 2, 2023 7:14 am

“Does the series exhibit well behaved residuals, in accordance with the underlying assumptions behind OLS estimation?”

It seems to. The correlation between their residuals and their best fit normally distributed equivalents is ~0.992. And they look like they fit as well. Please provide your Fourier Analysis.



uah6 from 0183 to present.png
Reply to  bigoilbob
February 2, 2023 11:40 am

Here:

https://www.woodfortrees.org/plot/uah6/plot/uah6/fourier/low-pass:5/inverse-fourier

The residuals from linear regression of the UAH6 data show a very high degree of positive serial autocorrelation. The Durbin-Watson statistic is ~0.752, which is highly significant – way beyond a 1% criterion for a sample of over 500 readings. So OLS is NOT valid for the data.
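The woodfortrees `fourier / low-pass:5 / inverse-fourier` operation linked above amounts to zeroing all but the lowest few DFT bins and transforming back. A self-contained sketch of the idea using a naive DFT on a toy signal (not the UAH series):

```python
import cmath, math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def low_pass(x, keep):
    """Zero every DFT bin above `keep` cycles per record (and its mirror)."""
    X = dft(x)
    n = len(X)
    return idft([X[k] if (k <= keep or k >= n - keep) else 0 for k in range(n)])

n = 128
slow = [math.sin(2 * math.pi * 3 * t / n) for t in range(n)]         # 3 cycles
fast = [0.5 * math.sin(2 * math.pi * 40 * t / n) for t in range(n)]  # 40 cycles
smoothed = low_pass([s + f for s, f in zip(slow, fast)], keep=5)     # keeps slow only
```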

Reply to  It doesnot add up
February 2, 2023 11:57 am

Your WFT does not show the calculation of your DW statistic. Since this is your value, would you please show us from whence it came?

And FYI, we certainly know that there are cyclical events, which would tend to autocorrelate. But we also know that the underlying trend of the repeated cycles is undeniably up.

Reply to  bigoilbob
February 2, 2023 1:44 pm

Check it yourself. Take the sum of the squares of the residuals as the denominator, and the sum of the squares of the differences between successive residuals, (e(t) - e(t-1))^2, as the numerator. Divide, and bob’s your answer… You can verify the formula here:

https://www.statology.org/durbin-watson-test/

For MY information there are cyclical events? LOL. That’s exactly why I conducted Fourier analysis, which is designed to elucidate the amplitudes of different frequencies and relative phases of cycles in the data.

We do not know anything about the future trend. See Sir Alan Cairncross.
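For reference, the Durbin-Watson statistic divides the sum of squared successive residual differences by the sum of squared residuals; values near 2 indicate no lag-1 autocorrelation, values near 0 strong positive autocorrelation. A sketch on synthetic residuals (the ~0.752 quoted above comes from the actual UAH regression):

```python
def durbin_watson(resid):
    """DW = sum((e[t] - e[t-1])^2) / sum(e[t]^2)."""
    num = sum((resid[t] - resid[t - 1]) ** 2 for t in range(1, len(resid)))
    den = sum(e * e for e in resid)
    return num / den

# Strongly positively autocorrelated residuals: DW far below 2
persistent = [0.9 ** t for t in range(200)]
# Sign-alternating residuals: DW near 4 (negative autocorrelation)
alternating = [(-1.0) ** t for t in range(200)]
```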

Reply to  It doesnot add up
February 3, 2023 5:54 am

From your link:

“When this assumption is violated, the standard errors of the coefficients in a regression model are likely to be underestimated which means predictor variables are more likely to be deemed statistically significant when they’re actually not.”

Italics mine.

  1. Indications of autocorrelation are likely to be found in an upward trend with cyclicity. Both the cyclicity and the positivity of the trend itself would tend to present those indications, without necessarily degrading the underlying statistical durability of that upward trend.
  2. Both the OLS evaluation of the basic data being evaluated here, and of its Fourier series, return standard errors that are a tiny fraction of the magnitude of the positive trend. Those standard errors could be several times larger and still show a statistically durable, high, positive trend.
Reply to  bigoilbob
February 3, 2023 6:47 am

Except that when we actually use the previous period reading as a variable, we find that it effectively eliminates a linear trend as an explanatory variable. The linear trend is not a good model. The autocorrelation model is much better (though still leaving a lot of holes).

As I pointed out, the DW statistic was strongly diagnostic of a high degree of positive serial autocorrelation. So the first step is to estimate an AR1 model.

AR1 UAH6.png
Nick Stokes
Reply to  It doesnot add up
February 5, 2023 1:33 am

“Except that when we actually use the previous period reading as a variable, we find that it effectively eliminates a linear trend as an explanatory variable.”

I don’t think you know how to do AR(1) regression. You regress against both the previous value and time. The trend is virtually unchanged.

Here is how to do it in R:

comment image
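Since the R code above is only an image, here is a rough stdlib-Python sketch of the same idea: regress each value on both time and the previous value, solving the 3x3 normal equations directly. The data and coefficient names are my own illustrative choices, not Stokes’s actual script:

```python
import random

def ar1_trend_fit(y):
    """OLS fit of y[t] = a + b*t + c*y[t-1]; returns (a, b, c).
    Solves the 3x3 normal equations by Gaussian elimination."""
    rows = [[1.0, float(t), y[t - 1]] for t in range(1, len(y))]
    targ = [y[t] for t in range(1, len(y))]
    M = [[sum(r[i] * r[j] for r in rows) for j in range(3)]
         + [sum(r[i] * yt for r, yt in zip(rows, targ))] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    beta = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        beta[r] = (M[r][3] - sum(M[r][c] * beta[c] for c in range(r + 1, 3))) / M[r][r]
    return tuple(beta)

# Synthetic data generated by y[t] = 0.01*t + 0.5*y[t-1] + noise
random.seed(42)
y = [0.0]
for t in range(1, 600):
    y.append(0.01 * t + 0.5 * y[-1] + random.gauss(0, 0.1))

a, b, c = ar1_trend_fit(y)  # roughly recovers b = 0.01 and c = 0.5
```

The point of the exercise: the time coefficient b survives alongside the lagged term, i.e. adding the AR(1) term does not eliminate the trend.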

Reply to  It doesnot add up
February 2, 2023 12:10 pm

Without OLS, trendology is totally bankrupt.

Reply to  bigoilbob
February 2, 2023 12:10 pm

See Bellman’s post above of Dr. Christy’s sinusoid curve. Which has a better fit, a linear regression or the polynomial one?

Are either one better than this one?

comment image

Reply to  Jim Gorman
February 2, 2023 12:34 pm

I don’t know. But his OLS fit of the post-Jan 1983 UAH6 data is not as good as the fit to the cyclical Fourier data of that same data by “It doesnot add up”.

Data – 1.40 degC/century with a standard error of 0.07degC/century

Fourier data – 1.32 degC/century with a standard error of 0.04 degC/century

If you do this a few times you will notice that these are terrific fits. If you consider the ramifications of the plot I posted separately of how unlikely the trend is to be much different than its expected value, the scales should fall from your eyes.

Reply to  Jim Gorman
February 2, 2023 1:03 pm

It was Roy Spencer’s graph.

A polynomial will always have a better fit the higher the order, simply because there are more degrees of freedom. That doesn’t mean it’s a better indicator of an underlying trend. How good would that polynomial have been at predicting the next 10 years?

Here, by the way, is what the 4th order polynomial fit looks like now.

20230202wuwt4.png
Reply to  Bellman
February 3, 2023 5:44 am

“A polynomial will always have a better fit the higher the order, simply because there are more degrees of freedom.”

A key point. It is how the deniersphere tries to widgit its way into believing that CC is “natural”.

Reply to  Bellman
February 3, 2023 8:31 am

The purpose of statistical tests of model parameters is to establish whether they do offer a valid model. Adding more polynomial terms to a regression offers no such guarantee.

Reply to  It doesnot add up
February 3, 2023 10:07 am

And nor does adding multiple sine waves.

Reply to  Bellman
February 3, 2023 10:55 am

I didn’t. I subtracted them, only showing the lower harmonics.

A Fourier Transform is not a regression.

Reply to  Bellman
February 3, 2023 2:18 pm

An FT is a mapping from one domain to another. It is not a regression. How many times do you have to be told that before you actually go learn something about FT’s?

Reply to  bigoilbob
February 2, 2023 12:15 pm

The residuals are also not homoscedastic. Another reason not to trust an OLS linear model.

Reply to  It doesnot add up
February 2, 2023 12:42 pm

I agree. The residuals go up and down, but exhibit a quite normal distribution. That, not how uniform they are, is the name of the game.

Reply to  bigoilbob
February 2, 2023 2:21 pm

NOPE. All the criteria have to be met to justify the use of OLS.

Reply to  It doesnot add up
February 3, 2023 5:37 am

NOPE. Please read up on both autocorrelation and heteroscedasticity.

When there are signs of autocorrelation, that in itself does not disqualify the trend identification. Both cyclical trends within the larger positive trend and the positivity of the trend itself would tend to show up in a DW test of autocorrelation, even with the upward trend still statistically durable.

Please find any link to the use of the phrase “have to” or a synonymous one, w.r.t. autocorrelation of an OLS trend….

W.r.t. heteroscedasticity, in this case it is caused by normal cyclicity from ENSO, volcanism, etc. The mere fact that there is variability in the variance of the residuals is not, in and of itself, a reason that the underlying trend is invalid.

Reply to  bigoilbob
February 3, 2023 6:51 am

Please read up on autocorrelation and heteroscedasticity.

I studied this at degree level, so I don’t need you trying to act as grandma.

Now look at the results of an AR1 model and tell me what is left by way of a trend.

AR1 UAH6.png
Reply to  It doesnot add up
February 3, 2023 6:57 am

“I studied this at degree level, so I don’t need you trying to act as grandma.”

I gave you reasons to catch up on these topics, complete with links. I should have guessed that all I would get in return would be a fact free appeal to authority.

As for your cartoon, no accompanying data, as is the custom here. But I’m guessing that if you did provide it, the trend would be as durable as what I provided.

Reply to  bigoilbob
February 3, 2023 7:39 am

My chart shows the UAH6 data and the predicted values from an AR1 model. I have explained that this entails using the previous period value as the variable to regress. Do you need more spoon-feeding? I have quoted the regression results already.

Nullius in verba – you have the tools to verify for yourself. Assuming you know how to use them.

Reply to  It doesnot add up
February 3, 2023 7:58 am

Ah, the usual deflection when substituting a cartoon for actual data. No, it’s not a request to spoon-feed, just asking you to do the minimum amount of manning up.

Folks, Idau would rather **** a **** (maybe not a problem for him) than provide data that he claims to have, but that might actually get used. It channels a bit salesman from my past. It used to be the custom to get a bottle of Crown Royal for using certain brand-name drill bits. But one pusher never got any. When I asked why, the bit peddler looked at me with a grin and said “Why, he’d drink it!”.

Reply to  bigoilbob
February 3, 2023 11:53 am

YOU have the data. If you’ve not downloaded them they are available from UAH. I have reported my results and methods. You can verify them. Do you want a post with 530 lines of data showing the variables, predicted values and residuals so you can check your spreadsheet works?

Reply to  It doesnot add up
February 4, 2023 5:18 am

If it is merely the UAH data, it has already been evaluated.

bdgwx
Reply to  It doesnot add up
February 3, 2023 8:23 am

I’ve not explored AR models yet, so I’m interested in learning more. One thing that interests me specifically is predicting the next value Xn using only the set {X1, …, Xn-1}. So I’m curious: what is the RMSE of your AR1 model in predicting value Xn using only X1 to Xn-1 as inputs for the training?

For example, Xn = average(X1:Xn-1) has an RMSE of 0.20 C. Xn = slope(X1:Xn-1) * (n-1) + intercept(X1:Xn-1) has an RMSE of 0.18 C. Xn = Xn-1 has an RMSE of 0.13 C. What is the RMSE of Xn = ar1next(X1:Xn-1) when applied to the UAH TLT values?
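The walk-forward comparison described above can be sketched generically: predict x[n] from x[:n] only, square the errors, average. On a persistent synthetic series, the Xn = Xn-1 rule beats the running mean, mirroring the ordering of the RMSE figures quoted (synthetic data, not the UAH values):

```python
import math, random

def walk_forward_rmse(x, predict):
    """One-step-ahead RMSE: predict x[n] from x[:n] only, for every n."""
    errs = [(predict(x[:n]) - x[n]) ** 2 for n in range(2, len(x))]
    return math.sqrt(sum(errs) / len(errs))

# Persistent (AR1-like) synthetic series
random.seed(1)
x = [0.0]
for _ in range(500):
    x.append(0.8 * x[-1] + random.gauss(0, 0.1))

mean_rmse = walk_forward_rmse(x, lambda h: sum(h) / len(h))  # Xn = average
persist_rmse = walk_forward_rmse(x, lambda h: h[-1])         # Xn = Xn-1
```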

Reply to  It doesnot add up
February 2, 2023 1:57 pm

The data are all there on the UAH website, so why don’t you show us what the trend should look like?

Seems like no more than a couple of hours work for one so wise.

Please forgive my snarky tone (I’ve had the Canada Revenue Agency on the phone and feel the need to lash out at someone), but this is a serious ask. You’re not the first to deplore least-squares trends as inappropriate, but I can’t recall seeing the “proper” statistics applied to a temperature-time trend.

You could also respond to his lordship, who is going to post another verbose chapter of “the New Pause Lengthens” within the next few days, and tell him what he did wrong, because it looks as though he uses least squares to establish his trends. That might provoke an entertaining discussion. I always look forward to learning new words!

Reply to  Smart Rock
February 2, 2023 3:33 pm

I already posted a Fourier filtered analysis which is much more appropriate than the linear trend.

Reply to  It doesnot add up
February 3, 2023 5:23 am

The Fourier trend itself is increasing at essentially the same rate as the original.

Reply to  bigoilbob
February 3, 2023 6:38 am

Fourier transforms don’t have trends.

See the attached picture: a cosine wave in the time domain on the left, and its representation in the frequency domain in the middle. How do you trend the frequency components in the frequency domain?

cosine_functions.png
Reply to  Tim Gorman
February 3, 2023 6:42 am

“fourier transforms don’t have trends.”

This one sure did. But I’m happy to revert to the original data evaluation, which had a higher trend and a still relatively minuscule standard error.

Your cosine waves are flat. They don’t have to be, and in the case of temperature seasonality (one of many examples) they never are.

To answer your question: you trend them the same way you trend any data, while trying to match cycles as best you can, and by losing the cyclicity component if possible. The trend gets judged on its statistical durability, no matter what.

Reply to  bigoilbob
February 3, 2023 8:13 am

No, it did not have a trend. Try reading and understanding more about Fourier transforms. They can be used to model e.g. a sawtooth function. If you narrow your view to an ascending section of the graph you will con yourself into thinking it is modelling a rising trend. Yet the full curve has no trend at all. It is a sawtooth.

Reply to  It doesnot add up
February 3, 2023 8:32 am

So, “sawtooth” and other cyclical functions can not have trends? Whoa! In what world….?

And again, I have trended the actual data, and it’s even more pronounced than the fourier.

How’z that provision of your Big Foot ARIMA data coming? To be fair, I can understand why you wish to withhold it….

Reply to  bigoilbob
February 3, 2023 12:56 pm

They can have *short* term trends based on how the sinusoids combine.

What you get out of a FT is *NOT* statistical data, it is frequency components. You map the time domain data into the frequency domain. There is no trending in the frequency domain!

Reply to  Tim Gorman
February 4, 2023 5:33 am

The FT data has a y component of degC and an x component of time. But for some Big Foot reason we can’t trend d(degC)/dt.

But no problem in any case. I already trended the actual data. It trends slightly higher than the FT, with a relatively tiny standard error of that trend. Upon request I’ll happily provide the plot of probability of that trend being less than a table of values, with those values ranging from expected to zero.

Reply to  bigoilbob
February 3, 2023 12:55 pm

trend this one. It has El Nino’s, La Nina’s, heating, cooling, etc.

and by losing the cyclicity component,”

In other words, just ignore that which makes the temperatures what they are.

sinusoid1.png
Reply to  Tim Gorman
February 4, 2023 5:26 am

There are enough cycles to trend it with or without “losing its cyclicity component”. Eyeballing seems to indicate that the cycles are regular and repeating, and that the overall trend is flat, unlike the 01/83 to present UAH6 data. So, is there a pony under there?

Separately, Watts Up with the WUWT poster penchant for providing cartoons instead of actual data? Could it be that you and the rest are afraid of it? Nah! They’d be talkin’ about that!

Reply to  It doesnot add up
February 3, 2023 7:40 am

And I’ve now added an AR1 model.
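For readers unfamiliar with AR(1) fitting, a minimal sketch on simulated data (the coefficient 0.6 and noise level are invented for illustration, not estimated from UAH):

```python
import numpy as np

rng = np.random.default_rng(42)
phi_true, n = 0.6, 528            # invented AR(1) coefficient; ~44 years monthly

# Simulate AR(1) noise: e[i] = phi * e[i-1] + white noise
e = np.zeros(n)
for i in range(1, n):
    e[i] = phi_true * e[i - 1] + rng.normal(0, 0.1)

# Estimate phi by regressing each value on its predecessor (no intercept)
x, y = e[:-1], e[1:]
phi_hat = float(x @ y / (x @ x))   # should recover roughly 0.6
```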

Reply to  Smart Rock
February 2, 2023 3:58 pm

How do you trend cyclical data?

Monckton only looks at a small portion of the data. At the very top or bottom of a cycle the derivative is 0, i.e. horizontal. He is basically trying to identify an inflection point and show that a linear trend will not work to do that. Along with that goes the forecasting truism that you must weight current data more heavily than past data if you want to know where things are actually going.

How does an OLS do that? An OLS is truly only good for the interval over which it is calculated. Extending it is a fool’s mission. Yet that is what the climate models basically do. Take a close look at them. After a few years into the future they all become linear trend lines, y = mx + b. “m” varies among them but they all have an “m” value. I might believe such a forecast for postal rates, not for climate, however.

Reply to  Tim Gorman
February 2, 2023 4:14 pm

Monckton only looks at a small portion of the data.

He’s looking at the entire 44 years of UAH to say the rate of warming is 0.134°C / decade. He’s drawn linear trends on the entire HadCRUT data set. He keeps going on about the trend over 40 years of CET, whilst ignoring the fact it really is cyclic.

Along with that goes the forecasting truism that you must weight current data more heavily than past data if you want to know where things are actually going.

Something he never does. Something you’ve never done. Show us what the forecast is when you weigh current data more than past data. Then explain how you determined the weighting and demonstrate how good your method was using past data.

An OLS is truly only good for the interval over which it is calculated.

All this pointless discussion has been about what has been happening over the given interval. It’s been about the claim that there has been no warming the past 40 years – past tense, not about predicting the future.

Extending it is a fools mission. Yet that is what the climate models basically do.

If that was all climate models did, they would all agree and there would be no talk about how the actual trend was less than predicted.
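For concreteness, one textbook way to weight recent data more heavily is exponentially weighted least squares. A sketch on synthetic data (the 0.13 C/decade slope, noise level, and one-decade half-life are all arbitrary choices for illustration, not a recommendation):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(528) / 120                    # time in decades, monthly steps
y = 0.13 * t + rng.normal(0, 0.15, t.size)  # synthetic anomalies, 0.13 C/decade

# Exponential recency weights: weight halves every decade into the past
half_life = 1.0                              # decades (arbitrary choice)
w = 0.5 ** ((t[-1] - t) / half_life)

slope_ols = np.polyfit(t, y, 1)[0]           # equal weights
# np.polyfit applies w to unsquared residuals, so pass sqrt of desired weights
slope_wls = np.polyfit(t, y, 1, w=np.sqrt(w))[0]
```

With a clean linear signal both estimates land near the true slope; the weighted one simply has a larger standard error because it effectively uses less of the record.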

Reply to  Bellman
February 2, 2023 4:47 pm

“He’s looking at the entire 44 years of UAH to say the rate of warming is 0.134°C / decade”

Not when he is calculating the length of the pause!

“Something he never does”

That’s the whole point of calculating the length of the pause. It is based on the most recent data!

“Something you’ve never done. Show us what the forecast is when you weigh current data more than past data.”

Look at his pause calculation! He gives everything earlier than that a weight of ZERO!

“Then explain how you determined the weighting and demonstrate how good your method was using past data.”

I did this already in another thread. I provided links to two different web sites on how to do it and started the calculations for you. As usual, you just blew it all off.

“All this pointless discussion has been about what has been happening over the given interval.”

Meaning you have absolutely no understanding what a Fourier Transform analysis actually is. Why am I not surprised?

“If that was all climate models did, they would all agree and there would be no talk about how the actual trend was less than predicted.”

They don’t agree because they all come up with a different value of “m” for their y = mx+b projections. And none of their values of “m” match the actual reality we live in! Do the words “they are all running too hot” mean anything to you?

Reply to  Tim Gorman
February 2, 2023 5:15 pm

Not when he is calculating the length of the pause!

Which is why I was not talking about the length of the so called pause, but all the times he’s used a linear trend over the entire data, whether it’s appropriate or not.

That’s the whole point of calculating the length of the pause. It is based on the most recent data!

You were talking about weighing recent data more than older data. That is not the same as ignoring all data before a certain data and giving equal weight to all data after that date.

Look at his pause calculation! He gives everything earlier than that a weight of ZERO!

And everything after that a weight of ONE. This is a very silly thing to do, if you are claiming you want to produce a weighted trend. Especially when you only choose the start date to get the trend you want. Another way of saying this is you are ignoring all data you don’t want, and only looking at the data you do want.

I did this already in another thread. I provided links to two different web sites on how to do it and started the calculations for you. As usual, you just blew it all off.

And the lies keep coming. You showed me a method for a weighted regression, I showed what happens when you used it. I did not blow it off. I’m still waiting for you to do the work and demonstrate what happens when you apply your preferred method.

Meaning you have absolutely no understanding what a Fourier Transform analysis actually is. Why am I not surprised?

We were talking about linear regression, not a Fourier Transform. But why am I not surprised you throw another red herring along the path. Does your Fourier Transform allow you to answer the question, has there been warming over the last 40 years?

They don’t agree because they all come up with a different value of “m” for their y = mx+b projections.

Your claim was all they are doing is projecting the current trend into the future. Why would there be any disagreement. We know what m is for any given data set.

And none of their values of “m” match the actual reality we live in!

Make your mind up. First yopu complain they just use the current warming trend, now you complain the models don’t agree with the current warming trend.

Reply to  Bellman
February 3, 2023 4:09 am

“Which is why I was not talking about the length of the so called pause,”

In other words you are deflecting because you don’t have an answer. You never do.

“You were talking about weighing recent data more than older data.”

Again, using the most recent data while setting past data to a weight of zero *IS* weighting current data more than older data!

“That is not the same as ignoring all data before a certain data and giving equal weight to all data after that date.”

Did you actually read this before posting it? It’s not ignoring it, it is giving it a weight of zero!

“And the lies keep coming. You showed me a method for a weighted regression, I showed what happens when you used it. I did not blow it off. I’m still waiting for you to do the work and demonstrate what happens when you apply your preferred method.”

Malarkey! You showed NOTHING. You just did as you usually do and just said it was wrong!

Monckton has already done the work. And you won’t accept it. You just continue to say it’s wrong!

“We were talking about linear regression, not a Fourier Transform. But why am I not surprised you throw another red herring along the path. Does your Fourier Transform allow you to answer the question, has there been warming over the last 40 years?”

In other words you aren’t keeping up. Surprise, surprise! “It does not add up” posted a FT. It’s not a red herring. It shows the frequency components. It is *NOT* a linear trend line.

There has been warming and cooling since the Earth was formed. Yet you want to ignore everything except your lifeline that CO2 is going to turn the Earth into a cinder!

“Your claim was all they are doing is projecting the current trend into the future. Why would there be any disagreement. We know what m is for any given data set.”

Again, did you read this before you posted it? The climate models CREATE the future data set. And the equation they wind up with is a linear trend line, y = mx + b. And they all come up with different values for “m” which is different from the “m” for actual observations! Again, that is why people are beginning to recognize the models are all “running too hot” – except for you I guess!

“First you complain they just use the current warming trend, now you complain the models don’t agree with the current warming trend.”

Shut it down troll! You can’t even recognize the difference between extending a linear regression of temperature observations and creating a future prediction based on models. You are just embarrassing yourself!

Reply to  Tim Gorman
February 3, 2023 12:42 pm

“In other words you are deflecting because you don’t have an answer. You never do.”

No, I was trying to keep on topic. You claimed he only ever looked at a small portion of the time series, and I pointed out he does show the trend over the whole series. This was in relation to the fact that people are now insisting that it’s meaningless to look at the trend over the entire series.

Did you actually read this before posting it? It’s not ignoring it, it is giving it a weight of zero!

Did you? Giving something a weight of zero is to ignore it.

Malarky! You showed NOTHNING. You just did as you usually do and just said it was wrong!

Then provide a reference. I can’t keep up with your stream of nonsense. Which site did you ask me to look at, how do you want me to weight the values, and what result do you get?

Monckton has already done the work. And you won’t accept it. You just continue to say its wrong!

Because I think it’s wrong, or at least misleading. You talked about weighing more recent data higher than older data, but all Monckton does is weigh the data he wants as 1 and the data he doesn’t want as zero. Claiming this is some sophisticated statistical technique when it’s just cherry-picking the start you want is a distraction.

Yet you want to ignore everything except your lifeline that CO2 is going to turn the Earth into a cinder!

More strawman fallacies.

The climate models CREATE the future data set. And the equation they wind up with is a linear trend line, y = mx +b. And they all come up with different values for “m” which is different that “m” for actual observations!

And you keep moving the goalposts. The statement I was disagreeing with was you saying

An OLS is truly only good for the interval over which it is calculated. Extending it is a fools mission. Yet that is what the climate models basically do.

You claim the models basically extend the trend into the future, and when I call you out on it you come up with a different claim.

Reply to  Bellman
February 3, 2023 2:42 pm

“Because I think it’s wrong, or at least misleading.”

You’ve never once run a business where maintaining inventory was a requirement for a profitable business.

————————————

An OLS is truly only good for the interval over which it is calculated. Extending it is a fools mission. Yet that is what the climate models basically do.

You claim the models basically extend the trend into the future, and when I call you out on it you come up with a different claim.

————————————————-

No moving of goal posts. Only your inability to understand a simple truism. Depending on an extended linear trend line, especially in a changing environment, simply doesn’t work. It might work in geology, where you have a fixed, unchanging medium over a long period of time, but weather and climate isn’t geology!

Reply to  Tim Gorman
February 3, 2023 5:14 pm

You’ve never once run a business where maintaining inventory was a requirement for a profitable business.

How many different jobs do I need to have had before I’m allowed to point out the flaws in Monckton’s pause analysis?

Reply to  Bellman
February 5, 2023 8:06 am

You need to show that you have been accountable for making decisions that affect your employment and the profitability of a business you are involved with. In other words, some financial accountability.

Linear trends ONLY SHOW what has happened between the end points. Using them to forecast either the past or the future is fruitless unless you know what factors are going to change in the future.

If you don’t believe me, tell us what your linear trend says occurred in the past. If it is accurate to the future, it damned well better be accurate to the past beyond the beginning of the trend also.

Reply to  Jim Gorman
February 5, 2023 9:30 am

What an utterly bizarre piece of ad hominem logic. Only people who have handled a company’s finances are allowed to explain some basic statistics?

“Linear trends ONLY SHOW what has happened between the end points.”

Which is all I’ve been using them for. But I don’t think that’s ALL you can use them for.

Reply to  Bellman
February 6, 2023 7:58 am

You didn’t answer my question!

Does your trend accurately reflect the past beyond what you have shown?

If it doesn’t then you have cherry-picked your starting point haven’t you?

This is one reason OLS is truly questionable on non-stationary time series!

Reply to  Jim Gorman
February 6, 2023 2:49 pm

Does your trend accurately reflect the past beyond what you have shown?

No, and I’ve never claimed it did. As always you keep inventing arguments you think I’m making and then shooting them down. A straw man.

If it doesn’t then you have cherry-picked your starting point haven’t you?

In this case the starting point is the first point of the data, December 1978. Difficult to see how that can be a cherry-pick. I could have easily picked later points that showed faster rates of warming.

If you were looking at a longer data set, e.g. HadCRUT, then a linear trend would not be appropriate. You need to try to find a model that fits the data better, and be honest about it: not, for example, choosing the start point that will maximize the rate of warming, but preferably using some analysis to identify a change point.

This one reason OLS is truly questionable on non-stationary time series!

You still don’t get stationary data, do you. If data has a trend it is not stationary. There is no rule that says you can’t find a trend in non-stationary data; it would be a pointless exercise to find a trend in stationary data.
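The distinction is easy to see numerically; a sketch with two synthetic series (invented slope and noise levels):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
t = np.arange(n)

stationary = rng.normal(0, 1, n)              # no trend: stationary noise
trending = 0.01 * t + rng.normal(0, 1, n)     # non-stationary: drifting mean

slope_stat = np.polyfit(t, stationary, 1)[0]  # ~0: pointless to fit
slope_trend = np.polyfit(t, trending, 1)[0]   # recovers ~0.01 per step
```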

Reply to  Tim Gorman
February 2, 2023 5:40 pm

They don’t agree because they all come up with a different value of “m” for their y = mx+b projections. And none of their values of “m” match the actual reality we live in!

And the ECS numbers they get out of the models are not much more than the CO2 concentrations versus time the operators assume will happen.

Reply to  karlomonte
February 3, 2023 4:10 am

Exactly!

Reply to  Smart Rock
February 3, 2023 8:00 am

“The data are all there on the UAH website, so why don’t you show us what the trend should look like?
Seems like no more than a couple of hours work for one so wise.”

It’s 5 minutes worth of “work”, trend statistics and odds of the trend being qualitatively incorrect included. And we know why he won’t provide it…

Reply to  TheFinalNail
February 2, 2023 4:34 am

A warming rate of all of +0.13C per decade. Wow, should I panic?

Reply to  Graemethecat
February 2, 2023 4:42 am

You can panic if you like, or not. I was just pointing out that this is the linear warming trend in UAH, as opposed to Coeur de Lion, who believed that because there was a previous anomaly value of -0.04C 40 years ago that meant there had been no warming in between.

Reply to  TheFinalNail
February 2, 2023 3:51 am

A trend is a trend is a trend. But the question is, will it bend? Will it alter its course through some unforeseen force and come to a premature end?

Sir Alec Cairncross, Economist

UAH inverse-fourier.png
Reply to  It doesnot add up
February 2, 2023 4:01 am

Like I said above, linear regression is the method used by Roy Spencer to arrive at UAH’s overall warming trend of +0.13C per decade. Not sure what you’ve used in the above chart.

Reply to  TheFinalNail
February 2, 2023 4:05 am

The steps are written out: Fourier transform, low pass filter, inverse Fourier transform.

It removes the high frequency oscillations to reveal the underlying low frequency oscillations of the trend, which ends up where we started, and heading down…
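The three steps as described can be sketched in a few lines of numpy (a synthetic series with an arbitrary cutoff; not a reproduction of the attached chart):

```python
import numpy as np

def fourier_lowpass(y, keep):
    """FFT, zero everything above the `keep` lowest bins, inverse FFT."""
    spec = np.fft.rfft(y)
    spec[keep:] = 0                 # discard high-frequency components
    return np.fft.irfft(spec, n=len(y))

t = np.arange(528)                                  # ~44 years, monthly
slow = 0.3 * np.sin(2 * np.pi * t / 264)            # low-frequency oscillation
fast = 0.2 * np.sin(2 * np.pi * t / 6)              # high-frequency "noise"
smooth = fourier_lowpass(slow + fast, keep=5)       # recovers the slow part
```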

Reply to  It doesnot add up
February 2, 2023 4:11 am

Perhaps best to stick to the method used by Spencer and UAH?

Reply to  TheFinalNail
February 2, 2023 4:30 am

Perhaps best to take a course in statistics.

Reply to  It doesnot add up
February 2, 2023 4:44 am

I’ll leave it to you to pass that on to Dr Spencer. It seems UAH have been getting their trend calculation method wrong for decades. You can now enlighten them.

Reply to  TheFinalNail
February 2, 2023 4:57 am

*YOU* are the one advocating for the linear trend analysis. *YOU* should be able to support your advocacy.

All you are doing is throwing out the argumentative fallacy of Appeal to Authority. If you can’t support *YOUR* assertion then don’t make the assertion.

Reply to  Tim Gorman
February 2, 2023 5:13 am

I’m not advocating anything. I’m just following the trend calculation method used by UAH, since it’s their data set under discussion. If you disagree with their method then it seems to me it should be Dr Spencer and UAH you should be objecting to, rather than me.

Ron Long
Reply to  TheFinalNail
February 2, 2023 7:00 am

TheFinalNail, my initial question was a very simple one, “how do YOU calculate…”, because I like to view data as simply as possible, at least initially, to see if there is some common-sense element, which my question exposes.

Reply to  Ron Long
February 2, 2023 9:38 am

I use the LINEST function on Excel, as I imagine Dr Spencer does.

Reply to  TheFinalNail
February 2, 2023 7:00 am

No you aren’t. Dr Spencer clearly thinks a 13-month centred moving average, as presented on his charts, is much more informative. It picks out aperiodic anomalies like major El Niños, while removing the noise from seasonal factors.
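For anyone wanting to try this at home, a minimal sketch of a 13-month centred moving average with equal weights (synthetic data; Dr Spencer’s exact weighting scheme may differ):

```python
import numpy as np

def centred_ma(y, window=13):
    """Centred moving average; returns only the fully-covered region."""
    return np.convolve(y, np.ones(window) / window, mode="valid")

# Synthetic monthly anomalies: 12-month seasonal cycle plus a small drift
t = np.arange(120)
anoms = np.sin(2 * np.pi * t / 12) + 0.01 * t
smooth = centred_ma(anoms)          # length 120 - 13 + 1 = 108
```

A 13-month window nearly cancels a 12-month cycle, leaving the slow drift visible.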

Reply to  It doesnot add up
February 2, 2023 9:39 am

There’s nothing wrong with a centred moving average. And there’s nothing wrong with a linear trend. The two aren’t mutually exclusive.

Reply to  TheFinalNail
February 2, 2023 3:49 pm

The linear trend is trying to impose a preconception on the data – that the linear trend exists, and will persist. There isn’t good evidence for that – either in the history or in the lack of skill of climate modelling. The moving average allows the underlying noise to be smoothed out, and leaves a lot of room for interpretation of what is happening. That is a much better approach.

Go back to Feynman.

In general, we look for a new law by the following process: First we guess it; then we compute the consequences of the guess to see what would be implied if this law that we guessed is right; then we compare the result of the computation to nature, with experiment or experience, compare it directly with observation, to see if it works. If it disagrees with experiment, it is wrong.

We’re still looking for new laws of climate science. Linear temperature trends are failed guesses. They are wrong. Moving average data is a much better place to start understanding climate drivers.

Reply to  It doesnot add up
February 2, 2023 4:29 pm

What is the linear trend line for sin(x+a) + sin(y+b) + sin(z+c) + …?

bdgwx
Reply to  It doesnot add up
February 2, 2023 11:31 am

Does he think it is more informative? I ask because the linear trend is included in the data file, but the 13m centered average is not.

Reply to  bdgwx
February 2, 2023 2:47 pm

All that the slopes reveal is a crude estimate of the relative changes in the different series over the period of the data. So we can easily pick out that the Arctic has been warming, while Antarctic has not.

The data files are provided with the intent that anyone who wants to can analyse the data. Unlike UEA. He presents the moving-average deseasonalisation for discussion in his blog. It is not controversial, and displays no other modelling: it follows the data whither it wanders.

bdgwx
Reply to  It doesnot add up
February 2, 2023 4:06 pm

I’m not saying a 13m centered is controversial. It’s not. But neither is a linear regression or many of the other statistical techniques.

Reply to  bdgwx
February 2, 2023 5:24 pm

I have explained that linear trends are imposed as a subliminal attempt to pretend that the trend is inexorable, even though it is quite clear to anyone who is statistically competent that they do not model the data. That is why they are controversial: they are propaganda. Even when used by Monckton.

bdgwx
Reply to  It doesnot add up
February 2, 2023 6:41 pm

OLR is no more or less a subliminal message than a 13m centered average or the countless other statistical techniques used ubiquitously in nearly all disciplines of science. And it does model data. That does not mean it is the best model but it is a model and can be used as an estimator for the data nonetheless. It can even be used to predict the next data point. I showed above that a simple OLR model had an RMSE of 0.18 C in predicting monthly UAH TLT anomalies. Does it model the data perfectly? Nope. Is it the best model? Nope. But it does model the data. That is indisputable.
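The kind of out-of-sample check described here can be sketched as follows (synthetic data; the 0.18 C figure quoted above is bdgwx’s, not reproduced by this toy example):

```python
import numpy as np

def one_step_rmse(y, warmup=120):
    """RMSE of predicting each point from an OLS fit to all prior points."""
    t = np.arange(len(y))
    errs = []
    for i in range(warmup, len(y)):
        m, b = np.polyfit(t[:i], y[:i], 1)    # fit on history only
        errs.append(y[i] - (m * t[i] + b))    # out-of-sample error
    return float(np.sqrt(np.mean(np.square(errs))))

rng = np.random.default_rng(3)
y = 0.001 * np.arange(528) + rng.normal(0, 0.1, 528)  # synthetic anomalies
rmse = one_step_rmse(y)    # close to the noise level of the series
```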

Reply to  It doesnot add up
February 3, 2023 5:10 am

How anyone can look at the UAH graph (I have attached and modified the graph to show how pulses occur) and get the idea that a linear trend will capture what is happening is beyond me.

uah temperature graph modified.jpg
Reply to  Jim Gorman
February 3, 2023 10:10 am

The purpose of a linear trend is not to capture every up and down. The point is to try to identify a possible trend that is happening amid the variations.

And how anyone can look at that graph and not realise that the fact that nearly all the left hand side is blue and nearly all the red is on the right, and not realise that suggests a warming trend, is beyond me.

Reply to  Bellman
February 3, 2023 2:20 pm

It only suggests a warming trend if, like you accuse Monckton of doing, you PICK a date that has only warming after it!

Reply to  Tim Gorman
February 3, 2023 5:44 pm

You can pick any start date up to August 2014 and it shows warming. December 1978 is the best value to pick as it covers the whole range with no bias in its selection. Warming of 0.13°C / decade.

I could easily pick start dates that show faster rates of warming, and I could “find” the start date that gives me the longest period where warming is twice as fast as the overall warming rate – i.e. “Since July 2007 the warming rate has been running at double the claimed overall warming rate. That means that for the past 15 years and 7 months, that’s 187 months, the warming rate has been equivalent to 0.27°C / decade.”

But I’d be sure to point out this means nothing, and is based on cherry-picking, I mean finding, the best start point to make that claim.

Reply to  Tim Gorman
February 2, 2023 5:52 am

*YOU* are the one advocating for the linear trend analysis. *YOU* should be able to support your advocacy.

Will you ask Monckton to support his use of OLS trends next time he talks about the pause or uses the trend to claim the warming is less than predicted?

Reply to  Bellman
February 2, 2023 9:53 am

No. You can ask him again and save everyone else the trouble.

Reply to  doonman
February 2, 2023 10:01 am

I’m not the one saying linear trends are meaningless.

I do question why he reports the trend with no confidence interval, especially when talking about his pause, and I challenge him when he talks about a linear trend over the last 150 years, when the trend clearly isn’t linear.

bdgwx
Reply to  doonman
February 2, 2023 11:13 am

doonman said: “No. You can ask him again and save everyone else the trouble.”

Is it safe for us to conclude then that you are okay with linear regression when it outputs < 0 C/decade, but are critical of it when it outputs > 0 C/decade?

Reply to  bdgwx
February 2, 2023 12:52 pm

Dude, linear regression on a periodic function is not worth the paper it is drawn on, regardless of what it shows. You are asking an ill-posed question. Try to ask a question that is appropriate to the subject.

Reply to  Jim Gorman
February 3, 2023 6:11 am

“Dude, linear regression on a periodic function is not worth the paper it is drawn on regardless of what it shows.”

Not just wrong, but unsupportable. I outed you in a different post on this. Since this is your convenient claim, please provide any linked support for it.

Reply to  bigoilbob
February 3, 2023 8:50 am

Be so kind as to explain which trend line is the correct one! Or, any other trend line if you so wish.

trend a periodic function.jpg
Reply to  Jim Gorman
February 4, 2023 5:16 am

The flat one I suppose. Using the same evaluative process that gives us the statistically durable increasing trend for the UAH6 1/83 to present data.

Reply to  bigoilbob
February 3, 2023 12:42 pm

trend this:

And this is a simple sinusoid.

sinusoid.png
Reply to  Tim Gorman
February 4, 2023 5:20 am

I agree that it is not the cyclical, increasing UAH6 01/83 to present data.

Reply to  Bellman
February 2, 2023 12:07 pm

Monckton’s justification is published every time: Dr Phil Jones claims it is OK.

The least-squares method was recommended by Professor Jones of the University of East Anglia as a reasonable method of showing the trend on stochastic temperature data.

However, that is plainly false, as any consideration of the historic climate should tell you. The climate does not follow a linear temperature trend. Monckton is in effect jibing Prof Jones for his lack of knowledge of statistics.

After all, a hockey stick is not a straight line.

Reply to  It doesnot add up
February 2, 2023 1:13 pm

What an odd justification. Why would you care if one person said it was OK?

Reply to  Bellman
February 2, 2023 3:52 pm

It is done in mockery of a supposedly highly respected climate scientist – or one whose reputation never quite recovered from Climategate.

Reply to  It doesnot add up
February 2, 2023 4:02 pm

Because Dr Phil Jones is the only person to have ever used linear regression?

If you think Monckton is just using OLS in jest, do you also agree that all his claims that the rate of warming is much less than models said is also just a joke?

Reply to  Bellman
February 2, 2023 5:26 pm

No, they appear to be based on fact. Moreover, fact that is accepted by many climate scientists who fear their work is undermined by their models running too hot.

sherro01
Reply to  It doesnot add up
February 2, 2023 3:41 pm

Some of us Aussies like Warwick Hughes were asking Phil Jones pointed questions about his bad handling of data from about 1990. The Climategate publicity popularised his weaknesses in 2008/9, nearly 20 years later.
In a nutshell, it is fair game to tease Phil Jones because of his horrible contributions to public mistrust of Science in general.
No matter what Phil opined, it is not good science to use ordinary least squares regressions for data that diverge much from linearity. By using OLS on UAH data, you imply belief in a linear mechanism over time that affects air temperature. I do not think that many scientists have assumed that temperatures change with time in a linear manner. That would mean that a warming trend would never end.
I use OLS graphs now and then, but usually try to state their main limitations. They have become widespread in climate work. If I did not add the straight line, some readers would. That is not good, confusing science with eye candy.
Geoff S

Reply to  sherro01
February 3, 2023 4:17 am

“That would mean that a warming trend would never end.”

Which is what the models imply. Let alone AOC and Greta, who think the Earth will become a burnt-out cinder by 2100.

Reply to  Bellman
February 2, 2023 12:15 pm

Like it or not, a piecewise analysis of a waveform does give a valid answer for the piece being analyzed. If it didn’t, calculus would be a waste of time and effort. We would be using statistics to do all the math in the world!

Reply to  Jim Gorman
February 2, 2023 1:16 pm

Then do it. You keep on saying there are better methods, but never say what they show. I’ve done piecewise evaluations on the UAH data and it shows the best place for a change is 2012 with a sharp uptick in the rate of warming, but I doubt a case could be made that this is significantly better than a single linear trend.
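A crude version of such a change-point search, for illustration only (synthetic data with a known break; not the analysis referred to above):

```python
import numpy as np

def best_breakpoint(t, y, margin=24):
    """Return the split index minimising the combined SSE of two OLS fits."""
    best_sse, best_k = np.inf, None
    for k in range(margin, len(t) - margin):
        sse = 0.0
        for seg in (slice(None, k), slice(k, None)):
            coef = np.polyfit(t[seg], y[seg], 1)
            r = y[seg] - np.polyval(coef, t[seg])
            sse += float(r @ r)
        if sse < best_sse:
            best_sse, best_k = sse, k
    return best_k

rng = np.random.default_rng(4)
t = np.arange(400.0)
y = np.where(t < 200, 0.0, 0.02 * (t - 200)) + rng.normal(0, 0.1, 400)
k = best_breakpoint(t, y)      # lands near the true break at index 200
```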

Reply to  Bellman
February 2, 2023 12:30 pm

Monckton is using OLS regression to demonstrate that there is no statistically significant linear trend over the stated time interval.

Reply to  Clyde Spencer
February 2, 2023 1:18 pm

No he isn’t. Where has he ever said that the trend over the last 44 years is not significant? I don’t think he’s ever done any significance test or even quoted a confidence interval.

Reply to  Bellman
February 2, 2023 4:33 pm

Monckton is *not* using the trend line to predict the future, only to track the past over a limited period of time.

You *can* draw a linear trend line for sin(x) where x is from 0 to π/4 for instance. It’s not going to be a perfect match, especially at the ends, but if you limit the segment of the sine wave you are looking at you will at least be close.

But you can’t do the same thing from 0 to π, let alone over a full cycle from 0 to 2π.
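The contrast can be quantified by the R² of the fitted line over each interval; a quick sketch:

```python
import numpy as np

def linear_fit_r2(x):
    """R-squared of an OLS line fitted to sin(x) over the given points."""
    y = np.sin(x)
    m, b = np.polyfit(x, y, 1)
    resid = y - (m * x + b)
    return 1 - resid.var() / y.var()

r2_segment = linear_fit_r2(np.linspace(0, np.pi / 4, 200))   # nearly linear
r2_cycles = linear_fit_r2(np.linspace(0, 8 * np.pi, 1600))   # four full cycles
```

Over the short segment the line explains almost all the variance; over whole cycles it explains almost none.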

Reply to  Tim Gorman
February 2, 2023 4:41 pm

Nobody should be using trend lines to predict the future, certainly not the long term future. This whole argument, I repeat, is not about predicting the future, but answering the question, has there been warming over the last 40 years.

Reply to  Bellman
February 2, 2023 4:50 pm

There has been both warming and cooling over the life of the planet. And you are worried about just the past 40 years?

History didn’t start when you were born, you know, right? Far too many of the younger generation today don’t seem to understand that.

Reply to  Tim Gorman
February 2, 2023 5:19 pm

And you are worried about just the past 40 years?

Not particularly, no. I’m just trying to establish an answer to the question, has there or has there not been warming over the last 40 years?

History didn’t start when you were born, you know, right? Far too many of the younger generation today don’t seem to understand that.

It’s been a long time since anyone suggested I was part of the younger generation.

Reply to  Bellman
February 2, 2023 5:33 pm

If there is dispute about the degree of warming it centers on poor measurement methodologies and the constant reinvention of the data to cool the past and warm the present. The satellite record is more secure – though not perfect – in that respect.

However, the desire to show a significant degree of warming in the recent past has nothing to do with an innocent enquiry into history and a lot to do with framing of climate derived policies. The use of inappropriate techniques, and the careful choice of starting point are all part of that picture.

Reply to  It doesnot add up
February 3, 2023 3:44 am

The use of inappropriate techniques, and the careful choice of starting point are all part of that picture.

This entire thread starts because someone claims that on the basis of two carefully selected months of UAH data there has been no warming over 40 years.

It’s now resulted in people defending Monckton’s careful choice of a starting date to claim there has been a pause in warming.

Reply to  Bellman
February 3, 2023 4:22 am

This entire thread starts because someone claims that on the basis of two carefully selected months of UAH data there has been no warming over 40 years.”

No one is claiming that. Your paranoia is showing again.

“It’s now resulted in people defending Monckton’s careful choice of a starting date to claim there has been a pause in warming.”

And, for the umpteenth time, Monckton didn’t *PICK* a start date. He *found* a start date. You are one obsessed stalker! Like a stalker of a woman, you’ve built an edifice in your mind that Monckton *picks* a date and you just can’t let it go.

Reply to  Tim Gorman
February 3, 2023 5:21 am

No one is claiming that. Your paranoia is showing again.

Literally the first comment in the thread

Note that our gradually warming planet first reached this anomaly in 1983. So that’s 40 years of no warming.

https://wattsupwiththat.com/2023/02/01/uah-global-temperature-update-for-january-2023-0-04-deg-c/#comment-3674481

And, for the umpteenth time, Monckton didn’t *PICK* a start date. He *found* a start date.

I don’t care how umpteen times you say that. It’s just your weird mental block that means you have to distinguish between “finding” and “picking”. It makes no difference. The objection is that Monckton “finds” the date that will give him the longest possible “pause” and chooses that date to be the start of his trend line. It is a bad statistical technique.

Like a stalker of a woman you’ve built an edifice in your mind that Monckton *picks* a date and you just can’t let it go.”

Really offensive.

Reply to  Bellman
February 3, 2023 6:27 am

The objection is that Monckton “finds” the date that will give him the longest possible “pause” and chooses that date to be the start of his trend line. It is a bad statistical technique.”

Simply unfreakingbelievable!

He is LOOKING for the length of the pause. *YOU* want to claim there is no pause so you keep accusing him of doing something that he isn’t doing.

It’s a PERFECTLY LEGITIMATE STATISTICAL TECHNIQUE!

Really offensive.”

Sometimes the truth hurts. You can’t distinguish between finding something and picking something. You’ve already admitted that you think someone backtracking from a dead deer to FIND where he was shot is PICKING the spot where the deer was shot. It’s like you can’t distinguish between the present and the past!

Reply to  Tim Gorman
February 3, 2023 12:55 pm

It’s a PERFECTLY LEGITIMATE STATISTICAL TECHNIQUE!

Stop shouting, and provide some reference. Where does any text claim that looking back at every possible start point until you find the result you want is a legitimate statistical technique?

To claim you know the length of a pause requires you to first establish there is a pause. That requires you provide statistical evidence for the existence of a pause, not simply look for periods when the trend is flat.
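For what it’s worth, the procedure being argued over can be written down in a few lines: scan forward through every possible start month and return the earliest one whose trend to the end of the record is non-positive. A sketch on synthetic data (not UAH; `find_pause_start` is an illustrative name):

```python
import numpy as np

def find_pause_start(series, min_len=12, tol=1e-12):
    """Earliest index i such that the OLS slope of series[i:] is
    non-positive (within tol) and the segment has >= min_len points."""
    n = len(series)
    t = np.arange(n)
    for i in range(n - min_len + 1):
        slope = np.polyfit(t[i:], series[i:], 1)[0]
        if slope <= tol:
            return i
    return None

# Synthetic anomalies: 100 months of steady warming, then 50 flat months.
data = np.concatenate([np.linspace(0.0, 0.5, 101)[:-1], np.full(50, 0.5)])
print(find_pause_start(data))  # 100: the scan lands exactly on the flat tail
```

Whether this counts as "finding" or "picking" is exactly the dispute above; the code only shows what the mechanical procedure does.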

Reply to  Bellman
February 3, 2023 4:39 am

Monckton’s “choice” of starting date is no such thing. It is derived from a pure statistical criterion. If we get another major El Nino his technique will rapidly show that there is no current “pause” until some time after it has subsided. That is of course what happened in 2016. Fourier analysis is more robust in not being side tracked by an El Nino spike.

Reply to  It doesnot add up
February 3, 2023 5:43 am

It is derived from a pure statistical criterion

the criterion being to find the start date that will give you the longest possible non-positive linear trend. I’m not sure I would regard that as pure statistics.

Fourier analysis is more robust in not being side tracked by an El Nino spike.”

Or any other form of curve fitting – high-degree polynomials, splines, Loess, etc. They can all show the ups and downs, but that doesn’t mean they are actually describing what is happening. They are all just giving the best fit to the existing data.

Reply to  Bellman
February 3, 2023 6:29 am

“Or any other form of curve fitting”

AGAIN – FOURIER TRANSFORMS ARE *NOT* CURVE FITTING.

The frequency domain and the time domain are different domains! See the attached graph. There is *no* curve fitting.

You are still embarrassing yourself.

Reply to  Tim Gorman
February 3, 2023 7:12 am

“Or any other form of curve fitting”

AGAIN – FOURIER TRANSFORMS ARE *NOT* CURVE FITTING.

It seems that climastrology has its own esoteric definition of the term “transform”.

Beyond incredible.

Reply to  karlomonte
February 3, 2023 12:59 pm

It’s all got to be statistics all the time. There is no such thing as a frequency domain that you can map a time domain function into!

Reply to  Tim Gorman
February 3, 2023 12:59 pm

What do you think a frequency is if it doesn’t involve time? The only difference between the frequency domain and the time domain is the figures you use. A frequency still maps to the time domain, and there’s no point in doing this analysis if you don’t think it describes the shape of the time series.

Reply to  Bellman
February 3, 2023 6:35 am

here is the graph showing a cosine function in the time domain and in the frequency domain.

[attached: cosine_functions.png]
Eng_Ian
Reply to  It doesnot add up
February 2, 2023 1:13 pm

By performing a low pass filter you are removing data.

Imagine if you set the low-frequency cutoff to 1 cycle per century (since the record is much shorter than that): you would effectively eliminate ALL cyclic data and be left with a straight line.

Similarly, if you used a HIGH pass filter, set for one month then you would have no linear data at all, it would all be sine waves from end to end.

When you exclude data, you exclude details that are most probably worthwhile. Would you listen to music with a low-pass cutoff of, say, 5 kHz?

Reply to  Eng_Ian
February 2, 2023 5:21 pm

I am not throwing away data: the high-frequency harmonics are, by construction, the difference between the actual series and the low-frequency elements, and I show both in the chart. Linear trend estimates exclude almost all the data. The Fourier analysis is designed to use frequency analysis to clarify how the very noisy data is influenced on different timescales. High-frequency components are usually associated with discontinuities: think of the Fourier analysis of a square wave. So when we look at the full spectral analysis we find strong high-frequency components that help describe the El Niño spikes. Those clearly have a different, and reasonably well understood, basis. So what are we left with? It’s not a linear trend. More medium/high-frequency fuzziness and oscillation, and perhaps some slower-moving elements that may be easier to understand if we have an idea of what we are looking for.
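The "by construction" point can be made concrete: split a series into its lowest Fourier harmonics and a remainder, and the two parts sum back to the original exactly. A sketch with NumPy's FFT on a synthetic monthly series (the cutoff of 10 harmonics is illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 528                                    # ~44 years of monthly values
t = np.arange(n)
series = 0.1 * np.sin(2 * np.pi * t / 120) + 0.05 * rng.standard_normal(n)

# Keep the lowest-frequency harmonics, zero out the rest...
coeffs = np.fft.rfft(series)
low_coeffs = coeffs.copy()
low_coeffs[10:] = 0                        # illustrative cutoff: 10 harmonics
low = np.fft.irfft(low_coeffs, n)

# ...and the high-frequency part is simply what is left over.
high = series - low

# Nothing has been thrown away: the two parts rebuild the series exactly.
assert np.allclose(low + high, series)
```

The low-frequency part captures the slow 120-month cycle; the remainder holds the month-to-month noise, and neither is discarded.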

Reply to  It doesnot add up
February 3, 2023 4:12 am

may be easier to understand if we have an idea of what we are looking for.”

Are the CAGW advocates actually looking for anything?

Reply to  Tim Gorman
February 3, 2023 4:40 am

Catastrophe.

bdgwx
Reply to  It doesnot add up
February 2, 2023 5:58 am

Do you think the top is in?

Reply to  bdgwx
February 2, 2023 7:08 am

The reliability of trend estimates is always flakiest at the ends of the data. Clearly the 2016 El Niño is going to be a local peak in the long run. In the very long run the earth will eventually get very hot as the sun goes through its red giant stage, so it is not a global peak.

bdgwx
Reply to  It doesnot add up
February 2, 2023 7:28 am

I have your post bookmarked. What will your response be to challenges of your prediction if the local peak in 2016 is eclipsed?

Reply to  bdgwx
February 2, 2023 12:12 pm

It’s already a local peak. Not eclipsed for at least 6 years. I am not making predictions that I can’t justify. We do know that the sun will become a red giant, eventually engulfing the earth. Climate projections over the next century? I see no evidence that we have good models, so unlike Al Gore or David Wadhams I’m not going to make a stupid guess.

bdgwx
Reply to  It doesnot add up
February 2, 2023 12:22 pm

Oh, got it. You’re just saying it is a local peak on yearly time scales, but not necessarily on decadal. I get that it will almost certainly be warmer as the Sun (like all main sequence stars) continues to brighten. But the current brightening rate is like 1% every 120 million years and far beyond the decadal and centennial scales we typically discuss here.

Reply to  It doesnot add up
February 2, 2023 11:10 am

The trend from your Fourier analysis is not significantly different from FinalNail’s. His was ~1.40 degC/century, with a standard error of ~0.07 degC/century. Yours is ~1.32 degC/century, with a standard error of ~0.04 degC/century. Yes, I can see the sinus rhythm, and also how it trends, per Mr. Gorman’s question. I’ll post it.

Since the Fourier analysis improves the statistical durability of that trend, let’s see how hard it would be for that trend to be down instead of up. Not lookin’ that promising for you…

[attached: Last UAH6 Plot.png]
Reply to  bigoilbob
February 2, 2023 11:11 am

Here’s the Fourier output

[attached: uah6 fourier trend from 011983 on.png]
Reply to  bigoilbob
February 2, 2023 3:13 pm

It is quite inappropriate to draw a trend through the Fourier transform. If you look at the fundamental harmonic, which has the biggest amplitude, it is headed down. The phases of some other harmonics are now also in sympathy with the downward movement. But the analysis tells us nothing about what happens outside the data period.

It is important to understand that Fourier analysis says very little about the future. By decomposing to the frequency domain it allows us to consider whether there are factors that we can identify that account for the observed frequencies. If we cannot find such factors, then all it is telling us is that the mathematical technique deconstructs the data analysed into the given sum of sinusoids.

Reply to  It doesnot add up
February 2, 2023 4:25 pm

Far too many of the people supporting CAGW are statisticians only. Trend lines are their bread and butter. They don’t understand that an FT gives you *frequency components*, not linear data points. It goes hand in hand with not understanding that cyclical processes can’t be characterized with trend lines, especially trend lines extending into the future.

What is the trend line for a sine wave?

Reply to  Tim Gorman
February 3, 2023 6:05 am

What is the trend line for a sine wave?”

Any cyclical wave can have an upward, a flat, or a declining underlying trend. Just the seasonality of upward trending temp data shows us that.

Folks, the selectivity of the trends some here like v. those they find inconvenient is telling. The best example is the pimping of the current GAT “pause” while deflecting from the much more statistically/physically significant rise, many times longer.

Reply to  bigoilbob
February 3, 2023 6:33 am

Who are these alleged “folks”, blob?

Reply to  bigoilbob
February 3, 2023 8:26 am

So define the trend of a sine wave, as you were asked.

Look at it over -π/4 to π/4 and you will conclude it has a positive slope. Look at it over 3π/4 to 5π/4 and you will conclude it has a negative slope. Or not.

Reply to  It doesnot add up
February 3, 2023 8:37 am

The phasing of your Fourier data appeared to have ended about when it began. FYI, it is cyclical, but not regular or repetitive. Its biggest characteristic is that it is increasing with time. And no, I won’t cherry pick it…

Reply to  bigoilbob
February 3, 2023 11:45 am

Of course you can have a function y(t)=αt + sin(βt+φ)

Which you can decompose into a linear trend αt and a cycle sin(βt+φ). But the cycle has no trend, and the linear trend has no cycle. The point of the AR1 analysis is that it confirms that the data show no sign of a trend, and the point of the Fourier analysis is that we find the important frequencies are either low or high. High frequencies are associated with discontinuities and spikes. The low frequencies therefore display the behaviour once those are accounted for.
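Numerically, the decomposition behaves as described: an OLS fit to y(t) = αt + sin(βt + φ) recovers α almost exactly once the window spans whole cycles, leaving the trendless cycle as the residual. A sketch with illustrative parameter values:

```python
import numpy as np

# y(t) = alpha*t + sin(beta*t + phi): linear trend plus trendless cycle.
alpha, beta, phi = 0.5, 1.0, 0.3           # illustrative values
t = np.linspace(0, 20 * np.pi, 2001)       # ten full cycles of the sine
y = alpha * t + np.sin(beta * t + phi)

# OLS pulls out the linear component; the residual is the cycle itself.
slope, intercept = np.polyfit(t, y, 1)
cycle = y - (slope * t + intercept)

print(round(slope, 2))                     # ~0.5: the cycle adds no trend
```

Note the caveat: over a window that cuts a cycle mid-phase, the same fit would smear part of the cycle into the estimated trend.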

Reply to  bigoilbob
February 4, 2023 4:52 am

You just love being hoist on your own petard don’t you?

Attached is the *long term* data you should be looking at as posted by Dr. Vinos.

What so many CAGW advocates *always* conveniently forget is that in its earliest formation periods there was a *lot* of CO2 in the atmosphere. A *lot* of it became oil and coal deposits. A *lot* of it became sedimentary rock. If CO2 actually “traps heat”, none of this would ever have happened. The earth would have remained a cinder throughout its entire history.

[attached: image_2023-02-04_064928835.png]
Reply to  It doesnot add up
February 2, 2023 6:12 pm

Let’s continue to run trend lines against time even when time is NOT a variable that predicts anything. Moving into the frequency domain will allow a closer look at factors that are truly part of the climate.

It is refreshing to have your to-the-point analysis of the limited data available.

Reply to  It doesnot add up
February 3, 2023 5:59 am

The fact that the Fourier series is last headed down tells us nada.

The trend of any data series tells us nada about what happens after. The name of the game is to identify those trends and use them to guide further inquiry. This is what the deniersphere attempts to avoid by statistically invalid whining about these strong trends….

Reply to  bigoilbob
February 3, 2023 6:35 am

deniersphere” — HAHAHAHAHAHAH

blob lets his true colors show, just another Klimate Change Kook.

Reply to  bigoilbob
February 3, 2023 9:08 am

A complete inversion of the truth. The statistically invalid trends are linear trends concocted from data that does not conform to the requirements for making linear trend estimates. In the course of this thread, I have shown that the UAH6 data do not conform to the requirements for making linear trend estimates. I have shown that if you construct an AR1 model of the data there is no trend left. I have shown that Fourier analysis reveals far more about the nature of the data, and helps to isolate the effects of the well known El Nino phenomenon. That process also suggests that the data are for now headed down. It says nothing about what is likely to happen more than a few months hence.

It is the catastrophists who wish to rely on linear trends where there are none, and who ignore what the data are revealing. If Newton had tried to convert everything to linear trends he would never have discovered the laws of gravitation.

ResourceGuy
Reply to  TheFinalNail
February 2, 2023 5:40 am

Maybe you don’t know what conditions were like at the start of the record.

Reply to  TheFinalNail
February 2, 2023 11:52 am

When are people going to learn that trend lines on high-variance data aren’t useful, especially linear ones?

Look at the graph at the link below and tell folks why temps continue to return to a base line. At the next 30 year baseline change what do you think this graph will show?

I’ll bet it continues to show up/down and down/up excursions.

[attached image]

Reply to  TheFinalNail
February 2, 2023 1:34 pm

What if we started our graph in March, 1998, instead of when all the doomsayers start it, near the end of the coldest period in the last century? So it would essentially show cooling of some sort in the last 25 years out of billions rather than a warming trend in the last 44 years.

Hey, but 0.14C per decade over a billion years is 14 million degrees of warming! Which would make our planet literally burn. Greta was right! OMG! Science!

Reply to  Coeur de Lion
February 2, 2023 5:43 am

Note that our gradually warming planet first reached this anomaly in 1983. So that’s 40 years of no warming.

And in 1984 it was -0.67°C, whereas three months ago the anomaly was +0.32°C, so that’s almost 1 degree of warming in 40 years.

Maybe best not to rely on picking single months to make a claim.

Reply to  Bellman
February 2, 2023 9:57 am

But today it’s -.04 deg and you haven’t explained what caused the cooling. It must be increased CO2 in the atmosphere, right?

Reply to  doonman
February 2, 2023 10:12 am

Why do I have to explain anything?

There’s a clear upward trend, but with a lot of variation in each monthly value. The variation can be for a number of reasons, most noticeably ENSO conditions, but I don’t have to explain every single monthly value to see the underlying trend, and that clearly temperatures at present are not the same as they were in the 80s. The fact that people are getting excited over a single month which would have been one of the warmest in the 80s is a clear sign that temperatures now are on the whole warmer.

Reply to  Bellman
February 2, 2023 12:36 pm

Is there a clear upward trend still?

Reply to  It doesnot add up
February 2, 2023 1:32 pm

From December 1978 to January 2023? Yes. Rate of 0.13°C / decade with a 2 sigma confidence of ± 0.05.

Has anything happened in the last few years to indicate the trend may no longer be upwards? Not that I can see. There was a big jump in temperatures since 2015 which increased the overall rate of warming a bit; temperatures since then seem to be following the trend, sometimes above it, currently below it, but nothing to indicate a significant change in the rate of warming.

Will this continue indefinitely? Couldn’t say, but I prefer to see some actual evidence for a change before assuming it must be happening.
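For readers wanting to reproduce this kind of number, the textbook OLS slope and naive 2-sigma interval can be computed as below on synthetic monthly anomalies with a built-in 0.13 °C/decade trend. This is not the UAH calculation itself, and a real analysis would widen the interval to account for autocorrelation:

```python
import numpy as np

rng = np.random.default_rng(7)
months = np.arange(530)                    # roughly Dec 1978 .. Jan 2023
true_trend = 0.13 / 120                    # 0.13 degC/decade, in degC/month
anoms = true_trend * months + 0.2 * rng.standard_normal(months.size)

# OLS slope plus the textbook 2-sigma interval (no autocorrelation
# correction, which would widen it considerably for real climate data).
slope, intercept = np.polyfit(months, anoms, 1)
resid = anoms - (slope * months + intercept)
se = np.sqrt((resid @ resid) / (months.size - 2)
             / np.sum((months - months.mean()) ** 2))

per_decade = slope * 120                   # convert back to degC/decade
ci = 2 * se * 120
print(f"trend: {per_decade:+.2f} +/- {ci:.2f} degC/decade")
```

On this synthetic series the recovered trend sits near the true 0.13 °C/decade; the debate above is about whether this naive interval is wide enough for real, autocorrelated data.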

Reply to  Bellman
February 2, 2023 5:44 pm

I have already explained why your confidence limit is nonsense: it is based on assumptions that do not hold. The Fourier analysis shows there is a significant chance that we are now in a cooling phase. It also clearly illustrates the “pause”. Analysis based on assuming linear trends is clearly flawed. Now, I am not claiming that I magically have the perfect climate model to explain it all. But it really is not difficult to show that what we have at present is junk. “If it disagrees with experiment, it’s wrong.”

Reply to  It doesnot add up
February 3, 2023 4:04 am

I have already explained why your confidence limit is nonsense

Then say what you think the confidence interval should be.

it is based on assumptions that do not hold.

What assumptions do you think I’m making? My estimated interval was based on the correction for autocorrelation used by the Skeptical Science Trend Calculator. If you have a method that will increase the interval further, let me know. But the problem is, the larger the interval, the less likely you’re going to be able to show a meaningful change in the rate of warming.

The Fourier analysis shows there is a significant chance that we are now in a cooling phase. It also clearly illustrates the “pause”.

What are the confidence intervals on your Fourier analysis? My problem with all the talk of Fourier analysis is that statistically all you are doing is an exercise in curve fitting. The danger is you are over-fitting. It’s not good enough to say that you can fit a curve to the existing data that shows a pause. You need to show that the pause could have been predicted with data prior to the pause.

Analysis based on assuming linear trends is clearly flawed.

Standard adage: all linear models are wrong, but they can be useful. As yet, I’ve seen no convincing evidence that a linear rate of warming is seriously flawed. It may not be correct, it may change in the future, but to date there’s no clear indication that there is anything other than a linear trend with noise.

Reply to  Bellman
February 3, 2023 4:45 am

What are the confidence intervals on your Fourier analysis”

A Fourier Transform is *NOT* a statistical analysis with a confidence limit. Didn’t you even bother to go look up what a FT is?

From wikipedia: “The Fourier transform of a function is a complex-valued function representing the complex sinusoids that comprise the original function. For each frequency, the magnitude (absolute value) of the complex value represents the amplitude of a constituent complex sinusoid with that frequency, and the argument of the complex value represents that complex sinusoid’s phase offset. If a frequency is not present, the transform has a value of 0 for that frequency. ”

My problem with all the talk of Fourier analysis is that statistically all you are doing is an exercise in curve fitting.”

Again, the FT is *NOT* a statistical tool. There is no curve fitting.

“As yet, I’ve yet to see any convincing evidence that a linear rate of warming is seriously flawed.”

Willfully blind. Go look at the recent thread started by Andy May. I’ve attached a graph from the talk by Dr. Javier Vinós. If you can’t see the cyclical nature of both the temperature and CO2 then you *are* willfully blind.

[attached: image_2023-02-03_064438758.png]
Reply to  Tim Gorman
February 3, 2023 5:53 am

Willfully blind. Go look at the recent thread started by Andy May. I’ve attached a graph from the talk by Dr. Javier Vinós. If you can’t see the cyclical nature of both the temperature and CO2 then you *are* willfully blind.

We are not talking about changes over millions of years, just the last 40. And no, I’m not sure what cycles you are seeing in the graph. A cycle would imply that at some point you return to an original value. Both your lines are dropping on the whole throughout the last 50 million years. There are ups and downs along the way, and that may in part be due to natural cycles, but there is no obvious repeating, predictable cycle that appears to be controlling either over that time period.

Reply to  Bellman
February 3, 2023 6:41 am

In other words “don’t confuse me with the facts”!

Reply to  Tim Gorman
February 3, 2023 1:00 pm

Good comeback. I will try to follow your request.

Reply to  Tim Gorman
February 3, 2023 5:59 am

A Fourier Transform is *NOT* a statistical analysis with a confidence limit.

Which is the problem. Even if temperature changes are caused by multiple sine waves, you still have to show that what you’ve got from your analysis is not just noise.

There is no curve fitting.

But that’s exactly what you are trying to do.

The Fourier transform of a function is a complex-valued function representing the complex sinusoids that comprise the original function.

What do you think the “original function” is? You are trying to identify it by looking at the actual temperatures, and that means finding the composition of multiple sine waves that best fit the data.

Reply to  Bellman
February 3, 2023 7:03 am

Which is the problem. Even if temperature changes are caused by multiple sine waves, you still have to show that what you’ve got from your analysis is not just noise.”

Noise, especially white noise, shows up as a flat line in a Fourier transform. White noise is basically a combination of an infinite number of frequencies of the same magnitude. Other types of noise with different magnitudes and bandwidths still show up across the spectrum involved. If the noise magnitude and signal magnitude are very close together it can be almost impossible to separate them. That doesn’t seem to be the case with the analysis of temperature and CO2.
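This is easy to verify numerically: the magnitude spectrum of white noise has no dominant bin, while a sinusoid added to the same noise stands out as a spike. A sketch (the 64-cycle frequency is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4096
noise = rng.standard_normal(n)
signal = np.sin(2 * np.pi * 64 * np.arange(n) / n)   # 64 cycles in the record

spec_noise = np.abs(np.fft.rfft(noise))
spec_both = np.abs(np.fft.rfft(noise + signal))

# White noise alone: no bin dominates, so the spectrum is roughly flat.
# Add the sinusoid: bin 64 towers over everything else.
print(np.argmax(spec_both))                          # 64
print(spec_noise.max() / spec_noise.mean())          # small ratio: flat-ish
```

The separation only gets hard when the signal amplitude drops toward the per-bin noise level, which is the caveat in the comment above.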

“What do you think the “original function” is? You are trying to identify it by looking at the actual temperatures, and that means finding the composition of multiple sine waves that best fit the data.”

The original function is a periodic function made up of sinusoids – which is what the true historical temperature record is. Again, there is NO CURVE FITTING.

The FT is basically defined as F{g(t)} = ∫ g(t) e^(−j2πft) dt

It is a MAPPING, not a curve fitting.
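A minimal demonstration of the mapping: a pure 5 Hz cosine in the time domain shows up as a single spike at the 5 Hz bin in the frequency domain (assuming one second of data sampled 256 times):

```python
import numpy as np

n = 256
t = np.arange(n) / n                      # one second, 256 samples
signal = np.cos(2 * np.pi * 5 * t)        # 5 Hz cosine in the time domain

spectrum = np.abs(np.fft.rfft(signal))    # magnitude of each frequency bin
freqs = np.fft.rfftfreq(n, d=1 / n)       # bin centres in Hz

peak = freqs[np.argmax(spectrum)]
print(peak)  # 5.0 -- all the energy sits in the 5 Hz bin
```

Every sample in the time domain contributes to every bin in the frequency domain: a mapping between domains, not a curve fitted to the samples.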

Reply to  Tim Gorman
February 3, 2023 9:05 am

These guys should give up anything that has “oscillation” or “cycle” in the description. ENSO, AMO, SUNSPOT CYCLE, PDO, SEASONS, and on and on. These periodic functions apparently have nothing to do with climate. The mapping from the time domain to the frequency domain appears to be a nonstarter amongst them! No wonder CO2 is the easy choice to dwell upon.

Reply to  Bellman
February 3, 2023 9:19 am

Fourier analysis is not a problem. It is a way to help look for features that may be important in determining the behaviour we observe. If they chime, we have a winner. If not, at least we looked.

Reply to  Bellman
February 3, 2023 4:55 am

Having identified serial correlation, if we include the previous period’s reading as an explanatory variable we find that it dominates the regression, with a coefficient of 0.754. The linear time trend falls to 0.0333 C per decade as a central estimate, and borders on being statistically indistinguishable from zero.
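The regression described can be sketched as follows on synthetic data: an AR(1) series with no underlying trend, regressed on a constant, its own lagged value, and time. The 0.754 and 0.0333 figures above are the commenter's UAH results; the numbers here are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
phi = 0.75                                 # true AR(1) coefficient

# Synthetic AR(1) series with NO underlying time trend.
y = np.zeros(n)
for i in range(1, n):
    y[i] = phi * y[i - 1] + rng.standard_normal()

# Regress y_t on a constant, the lagged value y_{t-1}, and time t.
t = np.arange(1, n)
X = np.column_stack([np.ones(n - 1), y[:-1], t])
const, ar_coef, trend = np.linalg.lstsq(X, y[1:], rcond=None)[0]

print(round(ar_coef, 2))   # close to 0.75: the lagged term dominates
print(round(trend, 4))     # close to 0: no spurious time trend appears
```

Whether the small residual trend in the real data is meaningful is exactly what the Dickey-Fuller discussion below this comment is about.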

Reply to  It doesnot add up
February 3, 2023 5:48 am

I’m not sure you understand how autocorrelation works. It doesn’t cause a trend, unless the coefficient is 1.

Reply to  Bellman
February 3, 2023 9:13 am

Precisely. Since the coefficient of the lagged variable is significantly different from 1 (Dickey Fuller unit root test) the analysis finds NO trend. It is mean reverting.

I’m not sure you understand that.

Reply to  It doesnot add up
February 3, 2023 10:04 am

Which is why the fact that temperatures have not reverted to the mean is a good indication that there is a trend.

Reply to  Bellman
February 3, 2023 10:50 am

No. It just means that we are still part way through a cycle.

Reply to  It doesnot add up
February 3, 2023 11:03 am

Which is a completely different claim from the one about most of the rise being caused by autocorrelation.

Reply to  Bellman
February 3, 2023 4:48 pm

No. Autocorrelation just means that you cannot get reliable estimates of a trend using OLS. Simple example: the real data is a parabola – say, the trajectory of a golf ball driven on the moon. A linear trend calculated through the data will chop through it, with sections where the residuals are of the same sign at each end, and the opposite sign in the middle. That is, the residuals are highly autocorrelated. That tells us that the linear model is wrong. We should be looking for some sort of curved function. We get further clues if we slide our data window and find that the linear slope changes accordingly. The linear trend has no reliable estimate, because there isn’t one.
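The parabola example is easy to reproduce: fit a straight line through t² and the residuals come out positive at both ends and negative in the middle, the classic autocorrelated-residual signature of a misspecified linear model:

```python
import numpy as np

t = np.linspace(0, 1, 101)
y = t ** 2                                 # the "parabola": height vs. time

slope, intercept = np.polyfit(t, y, 1)
residuals = y - (slope * t + intercept)

# Same sign at both ends, opposite sign in the middle:
print(residuals[0] > 0, residuals[-1] > 0, residuals[50] < 0)  # True True True
```

Long runs of same-sign residuals are what autocorrelation diagnostics (e.g. Durbin-Watson) pick up, and they flag that the straight line is the wrong model.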

Reply to  It doesnot add up
February 3, 2023 5:07 pm

A linear trend crossing a parabola is clearly wrong. That’s not because of autocorrelation, it’s just that the parabola is not linear.

We get further clues if we slide our data window to find that the linear slope changes accordingly.

That will happen with no autocorrelation as well, it’s just that autocorrelation increases the uncertainty of the trend so you will get more of a change.

Reply to  Bellman
February 5, 2023 8:12 am

Let’s see, I believe the IPCC said something along the lines of “long-term prediction of future climate states is not possible because the climate system is a coupled non-linear chaotic system.”

And, what did you say about a linear trend? “it’s just that the parabola is not linear.”

How do you reconcile these two disparate statements and use of linear trends?

Reply to  Jim Gorman
February 5, 2023 9:11 am

Why do you think those two statements need reconciling? A linear trend is not a good fit for something that isn’t linear. You’d have to give me some context for the IPCC paraphrase, but it doesn’t in any way contradict the idea that a linear trend is not a good fit for a parabola. I’m really not sure what point you think you are trying to make.

Reply to  Bellman
February 6, 2023 8:11 am

Geez, you said that a linear trend IS NOT a good fit for something that is not linear.

The IPCC has said climate is non-linear. Why do you then insist that your linear regressions mean anything?

I’ll give you something from my business experience. If your linear trend depends on the starting and ending points, then you need to make it stationary before proceeding.

Reply to  Jim Gorman
February 6, 2023 2:52 pm

I’ve not claimed the linear trend means anything other than there appears to be a linear trend over the last 40+ years, and that this is a useful way to show there has been warming over that period.

bdgwx
Reply to  doonman
February 2, 2023 11:11 am

We know what caused the cooling. ΔE < 0 in the TLT layer. Remember, the 1LOT says that ΔE = Σ[Ein_x, 1, N] – Σ[Eout_y, 1, N]. CO2 is but one among many of the Ein_x and Eout_y components. It is possible for Enet_CO2 > 0 simultaneously with Enet_others < 0 and ΔE < 0. Remember, the 1LOT is the law of conservation of energy. It says in no uncertain terms that it is a violation of the law to ignore the energy flows of the non-CO2 components.

wh
February 2, 2023 12:43 am

We’re still in a La Niña so this isn’t surprising. I think it will still be cooler for a couple more months, but then it will get interesting. ENSO-neutral conditions are expected soon and we’ll finally be able to get a true sense of where the global temperature is without the influence of either La Niña or El Niño. I’m hoping for a return to the old pause. There needs to be no global warming for this whole agenda to end.

rah
Reply to  wh
February 2, 2023 1:28 am

Yea, but this La Nina has been weak, not far from neutral, for months now. Any way one cuts it, it is clear that natural variability rules over the magic molecule.

Reply to  rah
February 2, 2023 1:44 am

…this La Nina has been weak, not far from neutral, for months now.

Not according to NOAA.

The index has been at or around -1.0 for most of the past year. Latest value (Oct-Nov-Dec 2022) is -0.9. The NOAA La Niña threshold value is -0.5.

rah
Reply to  TheFinalNail
February 2, 2023 2:17 am

This one is not even in the top 8 since 1950 according to NOAA.

Reply to  rah
February 2, 2023 4:08 am

A couple of posts back you were claiming that the current ENSO value was “not far from neutral”. Now you seem to be saying that it’s the 9th coolest La Niña since 1950. That’s quite a switch.

rah
Reply to  TheFinalNail
February 2, 2023 6:04 am

It has not been far from neutral relatively.

Reply to  rah
February 2, 2023 9:42 am

It has not been far from neutral relatively.

Well, it sort of has. Declaring El Niño or La Niña conditions is, by definition, a relative departure from neutral.

Richard M
Reply to  TheFinalNail
February 2, 2023 5:54 am

Keep in mind that NOAA changes the baseline every 5 years. As a result this La Nina is based on a higher baseline than previous ones. This has increased the index by several tenths of a degree. On a flat baseline it’s even weaker.

Reply to  Richard M
February 2, 2023 9:43 am

Yes Richard, but if it didn’t do that then the value would become meaningless, because the sea surface temperature in the region is rising over the long term.

bdgwx
Reply to  Richard M
February 2, 2023 11:04 am

If it were based on a flat baseline then it would no longer be an oscillation. Remember, it is detrended specifically so that it can represent the El Niño Southern Oscillation.

wh
Reply to  TheFinalNail
February 2, 2023 9:33 am

TheFinalNail

In any case, the reason it’s been so elevated lately despite La Niña is extra-tropical warmth in the Northern Hemisphere. If you really think that is being caused by greenhouse gases, I don’t know what to tell you. I think it’s just easy to blame it on GHGs. I’m convinced at this point that the rise in global temperature has been mostly dominated by the oceans.

bdgwx
Reply to  wh
February 2, 2023 10:58 am

Well, GHGs block more of the Eout than the Ein. And the 1LOT says that ΔE = Ein – Eout. So if ΔEout < ΔEin then ΔE > 0. And because ΔT = ΔE/(c·m), then ΔT > 0 as well. Of course, if you don’t accept the 1LOT, or that polyatomic gas species impede the transmission of terrestrial radiation more than solar radiation, then you likely won’t accept that GHGs have an influence.

BTW…how do you think the oceans got warmer?
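The bookkeeping described in the comment above can be sketched numerically. This is a minimal illustration with made-up numbers (the mass, specific heat, and energy gain are arbitrary, not climate data):

```python
# A net energy gain dE spread through a mass m of water with specific
# heat c raises its temperature by dT = dE / (c * m).
c = 4186.0    # J/(kg K), specific heat of water (approximate)
m = 1000.0    # kg, hypothetical parcel of ocean water
dE = 4.186e6  # J, hypothetical net energy gain (Ein - Eout > 0)

dT = dE / (c * m)
print(dT)     # 1.0 K: a positive imbalance implies warming
```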

wh
Reply to  bdgwx
February 2, 2023 1:33 pm

I’m convinced that the culprits responsible for the ocean warming are underwater volcanoes and the AMO. Looking at the graph, you see a slight cooling take place through the early 21st century, but then the El Niño comes and ever since then we’ve been stuck on a new baseline; but maybe, just maybe, we’re going back down to the old pause. A good hypothesis for the El Niño is underwater volcanic eruptions. I don’t think GHGs have no effect, but I’m not very convinced that they’ve been the main culprit. All of what you’re saying is probably true. It’s just a question of whether it’s the main culprit or not.

Here is a video by Wyss Yim from Hong Kong explaining more of that and its effects around the world.

https://youtu.be/OlTlMXR_tSw

bdgwx
Reply to  wh
February 2, 2023 9:05 pm

Which underwater volcanoes are responsible for the warming?

Where is the AMO getting the energy to warm the planet?

sherro01
Reply to  wh
February 2, 2023 3:54 pm

Walter,
We are still in a La Nina so this (cooling) isn’t surprising.
OR:
We are still cooling so it is not surprising to have a La Nina.
(Some definitions of La Nina involve cooling temperatures near Equatorial Pacific Ocean).
…………..
Temporal cause and effect is often confused in climate talk.
So many more mechanisms remain to be understood, while pop climate relies on CO2 control knob gossip.
Geoff S

rah
February 2, 2023 12:44 am

Yesterday morning at 06:40 I started the big truck for a run up to Grand Rapids, MI and back. It was 6 deg. F. It was the kind of cold that is more bitter and bites harder than the temperature indicated though there was little wind.

Took 20 minutes idling before the truck got warm enough inside so that my Samsara tablet would work. The driver can’t move the truck until the logs can be accessed and the log and dispatch information is on that tab.

Reply to  rah
February 2, 2023 12:39 pm

The driver can’t move the truck until the logs can be accessed and the log and dispatch information is on that tab.

I’m sure that you are anxiously awaiting the day that all trucks are battery powered. 🙂

rah
Reply to  Clyde Spencer
February 2, 2023 1:38 pm

I am retiring this year!

February 2, 2023 1:45 am

I mentioned this data on the Facebook page of a certain group here in the U.K. and the response was, what does tropospheric temperature have to do with temperatures lower down, or words to that effect, clearly thinking there are impenetrable barriers between the different layers of the atmosphere.

rah
Reply to  JohnC
February 2, 2023 2:11 am

We live in the lower troposphere.

Rod Evans
Reply to  rah
February 2, 2023 2:54 am

That is true, but you must remember the Alarmists are mostly an underground movement. It is pretty near impossible to get through their impenetrable brains. 🙂

Richard M
Reply to  JohnC
February 2, 2023 5:57 am

Ask them if they think the lapse rate is changing. It scientifically links all the temperatures in the troposphere.

Wim Rost
February 2, 2023 2:31 am

Good to see the spreadsheet in the text. One line could be added: the trend per region. People should be reminded of the large regional differences resulting from natural variation.

The absence of warming in the Antarctic and the opposite, the large warming of the Arctic, show that there cannot be a general cause for ‘average warming’. They also show there is no global warming. Natural variation and natural causes are in play.

Reply to  Wim Rost
February 2, 2023 4:19 am

You can check the trend per region in the UAH data. It’s the bottom value in each column.

And also show there is no global warming.

The global value is shown as +0.13C per decade warming; so it’s hard to see how you reach the conclusion that there is no global warming.

Wim Rost
Reply to  TheFinalNail
February 2, 2023 3:28 pm

The Final Nail:”You can check the trend per region in the UAH data. It’s the bottom value in each column.”

WR: Of course I know. The trend is 0.01 degrees per decade (!) for Antarctica and 0.25 degrees per decade for the Arctic. And there are still people who think that those numbers reflect a warming that is global.

Reply to  Wim Rost
February 2, 2023 12:46 pm

It should be noted that there is an increase in the seasonal range of CO2 from the South Pole to the North Pole. Despite the claim of CO2 being “well-mixed,” the range is quite asymmetrical.

Wim Rost
Reply to  Clyde Spencer
February 2, 2023 3:46 pm

Clyde Spencer: Despite the claim of CO2 being “well-mixed,” the range is quite asymmetrical.

WR: If you refer to maps or videos that use intense colors to suggest there is “quite an asymmetry”, you probably forgot to study the map legend. For example, in the video below the legend shows minimum and maximum values of respectively 390 and 408 ppm. Not “quite an asymmetry”.

https://www.google.com/search?sxsrf=AJOqlzVvUquRAOFzmnMuYpmAwu4FFxgXpg%3A1675380118013&lei=lkXcY_8xtI327w_s-aTgDw&q=nasa%20co2%20time%20lapse&ved=2ahUKEwj_1LCm_ff8AhW0hv0HHew8CfwQsKwBKAB6BAhaEAE&biw=1356&bih=705&dpr=2.5#fpstate=ive&vld=cid:ca2eb288,vid:TrQzbXc6LVE

February 2, 2023 3:51 am

I wonder if the alarmists are getting nervous yet?

El Nino is their only hope now.

El Ninos occurred during the cooling that took place from the 1940s to the 1980s. The El Ninos didn’t prevent the temperatures from cooling at that time.

Not much to hang your hat on.

Reply to  Tom Abbott
February 2, 2023 5:05 am

I wonder if the alarmists are getting nervous yet?

I wouldn’t have thought so.

According to NOAA, this current ‘double-dip’ La Nina began in the 3-month period centered on August 2020.

The global warming rate in UAH up to July 2020 was +0.14C per decade. After more than 2 years of near-continuous La Niña conditions, it has only fallen back to +0.13C per decade. (These are ‘best estimate’ trends; error margins would easily overlap.)

With ENSO neutral and possibly even El Niño conditions forecast to return over the next several months, it seems any further significant decline is unlikely. So not nervous, no.
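For reference, a warming-rate figure like “+0.13 C per decade” is just an ordinary least-squares slope fitted to the monthly anomalies, rescaled from per-month to per-decade. A minimal sketch on a synthetic series (an exact 0.13 C/decade ramp, not the actual UAH data):

```python
# Ordinary least-squares slope, then rescale months -> decades.
def ols_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

months = list(range(12 * 44))             # Jan 1979 .. Dec 2022
anoms = [0.13 / 120 * m for m in months]  # synthetic: 0.13 C per 120 months

trend_per_decade = ols_slope(months, anoms) * 120
print(round(trend_per_decade, 2))         # 0.13
```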

Richard M
Reply to  TheFinalNail
February 2, 2023 6:02 am

As you’ve already been informed above, a linear trend is useless when dealing with natural cycles. I do find it interesting you keep going back to this feeble point. It demonstrates exactly what Tom Abbott stated.

Not much to hang your hat on.

Reply to  Richard M
February 2, 2023 9:31 am

As you’ve already been advised above, tell that to Roy Spencer and UAH who use linear trends to calculate their warming rates.

Reply to  TheFinalNail
February 2, 2023 10:03 am

I think they are getting nervous. Repeating appeals to authority is repeating logical fallacies. And I don’t think he has a hat.

bdgwx
Reply to  doonman
February 2, 2023 10:55 am

I don’t think Dr. Spencer is getting nervous. He understands that the energy fluxes into and out of the TLT layer are highly variable even over multi year periods. He has stated that he expects the UAH TLT anomaly to continue to increase over the long term.

Reply to  doonman
February 2, 2023 12:28 pm

Au contraire, I think, all hat, no cattle.

Reply to  Tom Abbott
February 2, 2023 5:55 am

I wonder if the alarmists are getting nervous yet?

You mean the alarmists who post here seeing a new ice age behind every cold snap? Yes, I imagine they are, but they are always nervous.

Reply to  Bellman
February 2, 2023 12:48 pm

And boiling of the oceans after every heat wave?

Reply to  Clyde Spencer
February 2, 2023 1:43 pm

Not seen any article make that claim here. If anyone thinks the oceans are going to boil they are wrong.

Reply to  Bellman
February 2, 2023 5:30 pm

People are dying. Whole ecosystems are collapsing. The GBR is going to die. We are on the verge of mass extinction.

Reply to  Mike
February 3, 2023 1:09 pm

Could you point to the WUWT post that makes these claims? And then to the one claiming oceans will boil.

bdgwx
Reply to  Clyde Spencer
February 2, 2023 9:04 pm

The Simpson–Nakajima limit says that it is not possible for the oceans to boil.

Reply to  bdgwx
February 3, 2023 8:12 am

True, I’ll trust. Sheldoney. Smile emoji hopefully unnecessary…..

I’ll look this up.

Katy trail e-bike riding Sunday, me and the missus. Which my computer says should be in good weather. But we’re bringing Cal wine in our picnic basket. We tried St. James Missouri wine decades ago, while students at Mo. School of Mines. Never again…..

bdgwx
Reply to  bigoilbob
February 3, 2023 8:59 am

Goldblatt & Watson 2012 is a good reference. In a nutshell a runaway greenhouse effect is not possible on Earth because we are too far away from the Sun.

February 2, 2023 5:34 am

Very average. Equal 20th warmest January out of 45.

Coldest January since 2012, and 5th coldest this century. Last century this would have been 3rd warmest.

The Monckton pause now starts in August 2014.

Richard M
Reply to  Bellman
February 2, 2023 6:04 am

When you consider the low sea ice levels at both poles, it becomes more telling. Clearly, there is more solar energy getting to the surface, especially in Antarctica.

Reply to  Bellman
February 2, 2023 6:15 am

For what it’s worth (very little with just one month of data), my simple statistical model is forecasting UAH at 0.071 ± 0.182°C for 2023.

That would put it around 11th or 12th warmest, but with a huge range of uncertainty. It’s unlikely to be a top 5 finish.

Note. This is a very simple statistical extrapolation from a single month. It does not look at factors such as ENSO.

[attached image: 202301UAH6forc.png]
Reply to  Bellman
February 2, 2023 7:37 am

How can your simple model get a yearly forecast from a single month’s worth of data?

It reminds me of the famous Mark Twain quote from “Life on the Mississippi.”

There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.

Reply to  Javier Vinós
February 2, 2023 9:29 am

It’s essentially just a linear regression of previous Januarys against annual values, along with a time factor. As I said, it isn’t very precise after just one month; there’s a huge range of possible values.

There is something fascinating about science.

This has nothing to do with science, it’s pure statistics. I have been running this for years, over different data sets, and whilst I wouldn’t make any claims for its accuracy, the final result is generally within the 95% prediction interval.
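The procedure described here can be sketched as a simple regression with a prediction interval. The data below are synthetic (a seeded random stand-in for 44 January/annual pairs) and the t critical value is hard-coded at roughly 2.02 for ~42 degrees of freedom; this illustrates the method, not Bellman’s actual model:

```python
import math
import random

# Synthetic stand-in for 44 Januaries and annual means.
# Assumed relationship: annual ~ 0.6 * January + 0.05, plus noise.
random.seed(1)
jan = [random.gauss(0.0, 0.15) for _ in range(44)]
ann = [0.6 * x + 0.05 + random.gauss(0.0, 0.08) for x in jan]

# Ordinary least-squares fit of annual on January.
n = len(jan)
mx, my = sum(jan) / n, sum(ann) / n
sxx = sum((x - mx) ** 2 for x in jan)
b = sum((x - mx) * (y - my) for x, y in zip(jan, ann)) / sxx
a = my - b * mx

# Residual standard error and a 95% prediction interval for a new x0.
resid = [y - (a + b * x) for x, y in zip(jan, ann)]
s = math.sqrt(sum(r * r for r in resid) / (n - 2))
x0 = -0.04      # e.g. a January anomaly like 2023's
t_crit = 2.02   # approx. 97.5th percentile of Student's t, 42 df
half = t_crit * s * math.sqrt(1 + 1 / n + (x0 - mx) ** 2 / sxx)
pred = a + b * x0
print(f"forecast {pred:.3f} +/- {half:.3f}")
```

The wide interval after one month comes from the `1 +` term: a single new observation carries the full residual scatter, not just the uncertainty in the fitted line.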

Reply to  Bellman
February 2, 2023 9:41 am

For example, here’s the forecasts for UAH last year. The red line is the forecast based on data up to that month, the blue line shows the final annual average and the grey zone is the 95% prediction interval.

The forecasts started off quite low, due to the colder start to the year, but at no point was the final value outside of the 95% prediction interval.

[attached image: 202212UAH6forcbymonth.png]
Reply to  Bellman
February 2, 2023 9:46 am

To get some idea of why it’s possible, here’s the scatter plot of the January anomaly, compared to the Annual average anomaly. Clearly there’s some sort of a relationship.

[attached image: 20230202wuwt3.png]
Reply to  Bellman
February 2, 2023 10:23 am

The quote was literal. I know what you are doing is statistics. But essentially with only one month, it is a measure of the autocorrelation in the data. The temperature anomaly for one month or one year depends a lot on the previous one.

Given that ENSO should be turning to neutral soon and a possible Niño before year’s end, I would bet your method is giving a low estimate as it did last year.

2023 is likely to be warmer than 2022. If it is not, that would be something interesting and might indicate a long pause is in the making.

Beta Blocker
Reply to  Javier Vinós
February 2, 2023 11:43 am

Javier: “2023 is likely to be warmer than 2022. If it is not, that would be something interesting and might indicate a long pause is in the making.”

Or not. It doesn’t matter. Net Zero anti-carbon energy policies will be pushed hard by western governments between now and 2050 regardless of what kinds of GMT trends emerge in the next twenty-five years. Up, down, flat, sideways — whatever.

bdgwx
Reply to  Javier Vinós
February 2, 2023 11:49 am

Yes, you should be able to gain skill in forecasting the 2023 annual anomaly by incorporating ENSO data. By exploiting the 4-month lag we can skillfully predict the Feb, Mar, and Apr anomalies. And by incorporating an 8-month forecast of ENSO (which are readily available) we could reasonably predict the remaining months as well, albeit with lower skill. Note, however, the “spring forecast barrier” with ENSO means the ENSO values we use from the global circulation models have only marginally better skill than persistence/climatological techniques, so there’s still going to be a lot of uncertainty. When I get time I’ll see if I can improve upon Bellman’s work. But considering that Christy et al. 2003 report the uncertainty on annual observations as ±0.15 C (2σ), that puts a hard clamp on any improvement.
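The 4-month-lag idea above can be illustrated with synthetic series: if a temperature series lags an ENSO-like index by 4 months, a lagged-correlation scan recovers that lag. The sine waves here are stand-ins, not real indices:

```python
import math

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    dx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    dy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return num / (dx * dy)

# ENSO-like index and a temperature series built to lag it by 4 months.
enso = [math.sin(2 * math.pi * t / 48) for t in range(240)]
temp = [math.sin(2 * math.pi * (t - 4) / 48) for t in range(240)]

# Scan lags 0..8: correlate temp at month t with the index at t - lag.
best = max(range(9), key=lambda lag: corr(enso[:240 - lag], temp[lag:]))
print(best)  # 4
```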

Reply to  bdgwx
February 2, 2023 12:10 pm

The spring forecast barrier is only for those that ignore the effect of the solar cycle on ENSO. By taking that into account, and the time since the last El Niño, the skill in ENSO forecasting can be significantly improved.

For example, the probability of a Niño in the 2023-24 winter is above 90%. I don’t expect a big Niño, though. Warm water volume is small despite 3 years of Niña.

[attached image]

bdgwx
Reply to  Javier Vinós
February 2, 2023 12:18 pm

Oh ok. Let me take baby steps first. I’ll see if I can incorporate the solar cycle effect on ENSO later. Either way, I’m busy tonight so it’ll be tomorrow at the earliest before I have a predictive model ready to go.

wh
Reply to  Javier Vinós
February 2, 2023 5:12 pm

Interesting. Looks like it’ll be a 2009/2010 like situation. Do El Ninos release heat? Correct me if I’m wrong but maybe that means it will just cool us long term and we’ll go back to the old pause after that.

bdgwx
Reply to  wh
February 2, 2023 8:47 pm

Yes. El Niños are associated with increased energy fluxes from the ocean to the atmosphere. No, El Niños do not cool the planet long term. They primarily result in energy moving from the ocean reservoir to the atmosphere reservoir. The only thing that will cool the planet long term is for the current positive planetary energy imbalance to go negative.

Reply to  Bellman
February 2, 2023 1:17 pm

Here is part of the problem with the anomalies – they are all within the measurement uncertainty of average values as calculated using the NIST TN1900 document!

Tell you what, take the absolute temps from one location of your choice from one of the databases for January. Tell us what the measurement uncertainty of the absolute temps are that you calculate using the method outlined.

Then using 2022 monthly absolute temps from any location you like, calculate the measurement uncertainty for an annual average.

Lastly, compute the measurement uncertainty for the baseline being used.

I’ll be very interested in your findings and if they match mine.

Reply to  Jim Gorman
February 2, 2023 1:38 pm

Here is part of the problem with the anomalies

I couldn’t care less what you think the problem with UAH data is at this point. It’s irrelevant to my forecast, as I’m only trying to forecast what UAH will claim the annual average was. Will this be a good estimate of the true anomaly? Probably not given how far out of whack UAH is with the rest of the data sets. But it’s irrelevant to predicting what UAH will say.

Reply to  Bellman
February 2, 2023 3:32 pm

In other words, just ignore the inconvenient parts, right?

Reply to  Tim Gorman
February 2, 2023 3:57 pm

Wrong.

Reply to  Bellman
February 2, 2023 4:03 pm

Have you never wondered why climate scientists and those who follow them never, ever, tell folks what the measurement uncertainty of their anomaly calculations is? Would it surprise you that annual averages have somewhere between ±9 and ±13 °F of measurement uncertainty using NIST calculations for monthly temperatures? That gives annual average temperatures around 55 ±13 °F. If the averages have that kind of uncertainty, it is a joke to talk about the anomaly values you are posting.

Reply to  Jim Gorman
February 2, 2023 4:25 pm

I’ve asked repeatedly why UAH does not publish an uncertainty analysis for the current version. But everyone seems to accept it as the only valid data set. As I keep pointing out, most other sets I know of do publish uncertainty analyses. You might not like it, but you can’t just pretend it doesn’t exist.

Would it surprise you that annual averages have somewhere between ±9 and ±13 °F of measurement uncertainty using NIST calculations for monthly temperatures?

Nothing that sprung from your imagination would surprise me. It would be astonishing if it were true. But as you’ve spent the last 2 years demonstrating your complete lack of understanding of even the most basic maths, I wouldn’t take it on trust if you said 1 + 1 is 2.

If the averages have that kind of uncertainty, it is a joke to talk about the anomaly values you are posting.

If it were remotely true, the entire work of Spencer and Christy would be meaningless. There would be no reason for WUWT to favour his meaningless figures every month, and Monckton’s entire claim of a pause would be more meaningless than it already is.

Reply to  Bellman
February 2, 2023 4:41 pm

Go study Possolo’s methods in TN1900. That’s where those figures come from.

And it *IS* why so much of the climate work today is meaningless!

You and most of the CAGW crowd ALWAYS assume that all error cancels and all stated values are 100% accurate, and that the standard deviation of the sample means is the uncertainty of the average calculated from the sample means.

It lets everyone calculate averages down to the hundredths digit.

It’s part and parcel of assuming that the average uncertainty is the uncertainty of the average.

Is Possolo wrong in TN1900? Show your work in disproving his methods.

Reply to  Tim Gorman
February 2, 2023 5:00 pm

Go study Possolo’s methods in TN1900. That’s where those figures come from.

Citation required. Show your workings, or show where it says the annual uncertainty is ±13°F.

You and most of the CAGW crowd ALWAYS assume that all error cancels

Not engaging with this today, but for new readers, Tim keeps repeating these lies about me, and always ignores my explanations why I do not assume that all error cancel, or any of the other things he keeps asserting I believe.

It’s part and parcel of assuming that the average uncertainty is the uncertainty of the average.

More meaningless verbiage he repeats at every opportunity. I’m not sure he even understands what the words mean at this point, but it’s another thing I’ve repeatedly demonstrated is not true.

Is Possolo wrong in TN1900? Show your work in disproving his methods.

Difficult to do, if you won’t explain how you’ve reached your conclusions. I doubt he is wrong, I’m pretty sure on past form you have completely misunderstood something, but all you have to do is show your workings, so we can all see.

Reply to  Bellman
February 2, 2023 5:50 pm

First, the citation as required.

Possolo, A. (2015), Simple Guide for Evaluating and Expressing the Uncertainty of NIST Measurement Results, Technical Note (NIST TN), National Institute of Standards and Technology, Gaithersburg, MD, [online], https://doi.org/10.6028/NIST.TN.1900 (Accessed February 2, 2023).

You can download the pdf from this site.

https://doi.org/10.6028/NIST.TN.1900

From the document:

“For example, proceeding as in the GUM (4.2.3, 4.4.3, G.3.2), the average of the m = 22 daily readings is t̄ = 25.6 °C, and the standard deviation is s = 4.1 °C. Therefore, the standard uncertainty associated with the average is u(τ) = s/√m = 0.872 °C. The coverage factor for 95 % coverage probability is k = 2.08, which is the 97.5th percentile of Student’s t distribution with 21 degrees of freedom. In this conformity, the shortest 95 % coverage interval is t̄ ± ks/√m = (23.8 °C, 27.4 °C).”

Please see the image for a sample annual average from Topeka Forbes Air Force Base in 1953. You should note that the expanded uncertainties of the daily, weekly and annual averages have all pretty much varied between 9 and 13 degrees F, from 1953 to 2022. This is a large standard uncertainty of the mean and obviates any anomalies calculated to the thousandths of degrees.
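The arithmetic in the quoted TN1900 example can be checked directly from its published summary numbers (this reproduces only the quoted figures, not the full NIST analysis):

```python
import math

# TN1900 Example E2 summary values: m = 22 daily readings with
# mean 25.6 C and standard deviation 4.1 C; coverage factor k = 2.08
# from Student's t with 21 degrees of freedom.
m = 22
tbar = 25.6
s = 4.1
k = 2.08

u = s / math.sqrt(m)           # standard uncertainty of the average
lo, hi = tbar - k * u, tbar + k * u
print(f"u = {u:.3f} C, 95% interval = ({lo:.1f} C, {hi:.1f} C)")
```

Running this recovers the quoted u ≈ 0.87 °C and the (23.8 °C, 27.4 °C) coverage interval.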

[attached image: monthly example of average.jpg]
Reply to  Jim Gorman
February 2, 2023 6:28 pm

Which is the same old example you constantly misunderstand.

Nothing in your quote is about measurement uncertainty. It is simply applying the standard error of the mean to a sample of 22 values, and applying a coverage factor to give a 95% confidence interval.

The standard deviation of those 22 daily maximums is 4.1°C, and the sample size is 22, hence the SEM is 4.1 / √22 = 0.87°C. With a coverage factor of 2.08, this gives a 95% uncertainty interval of ±1.8°C.

You should note that the daily, weekly and annual averages have all pretty much varied between 9 to 13 degrees F, from 1953 to 2022.

Which has nothing to do with your claimed measurement uncertainty of the global average, or even of the individual station.

I’ve no idea what the image is claiming. It’s calculating a confidence interval based on just 10 unspecified values. It then claims that an annual average is based on a sample size of 12. This is nonsense. If you have 12 monthly values, you do not have 12 individual measurements taken randomly around the year. You have 12 monthly values that make up the entire year. The only uncertainty is from the uncertainty of the individual monthly values.

Even more bizarrely it claims that if you average n stations the sample size is not n, but 12.

Really, if this is what NIST is teaching you, they are doing an appalling job.

Reply to  Bellman
February 2, 2023 8:03 pm

You didn’t read the document did you? This example IS done after declaring the average as a MEASUREMENT. Do you understand the reference to the GUM?

This is the NIST recommendation for calculating standard uncertainty of the mean for a monthly temperature data set.

I originally thought the Technical Note was hokum but I have spent time studying it very closely. I have always been concerned about the lack of any interest in the data distributions and determining the appropriate statistical parameters of the distributions. This document details the reasoning behind the assumptions being made.

“””””Which has nothing to do with your claimed measurement uncertainty of the global average, or even of the individual station.”””””

You will have to up your game beyond making assertions with no backup references. Using station annual averages to calculate a global figure will not reduce this standard uncertainty of the mean.

The ten “made-up” temps are 80 to 89. They are merely an example I started with.

Look closely at the table. That is calculated from downloaded data from NOAA and uses the exact steps as the TN 1900 does. You will need to do a better job refuting it if you want to have any hope of being considered knowledgeable.

“””””Even more bizarrely it claims that if you average n stations the sample size is not n, but 12.”””””

Ha, ha, ha. Show ONE reference where the NUMBER OF SAMPLES is used rather than the SIZE of SAMPLES. If you ask, I can give you 10 references that say otherwise.

The appropriate equation is:

SEM = SD / √N

Reply to  Jim Gorman
February 3, 2023 3:53 am

He won’t refute it. He’ll just continue to say its wrong.

Reply to  Jim Gorman
February 3, 2023 10:24 am

I don’t have to refute anything. It’s your extraordinary claim, you need to justify it.

But it’s good that you seem to be finally accepting what I’ve said all along: the measurement uncertainty of individual instruments isn’t usually the cause of uncertainty in the global average; it’s uncertainty caused by sampling. And if you are now accepting, as I’ve said all along, that the SEM is the uncertainty of an average, or at least part of it, then all the better.

The problem now is that you still don’t understand what any of these terms mean, and rely on your own misunderstanding of a simple example to justify ridiculous, impossible levels of uncertainty. You really need to consider what a global annual anomaly uncertainty of ±7°C would mean. You are claiming this is a 95% uncertainty interval, which means that around 1 in 20 times the measured value should differ from the true value by more than that. Yet the entire range of all UAH annual data varies by less than 1°C. UAH differs from other annual data sets by a few tenths of a degree. Just where does all this uncertainty manifest itself? How is it possible to see changes of a few tenths of a degree and identify them with ENSO conditions, or volcanic activity, when there is a 14°C uncertainty range for each year?

Reply to  Bellman
February 3, 2023 10:40 am

Continued.

So where is the problem with your annual analysis? First, you are looking at 12 monthly values and treating them as if they were random values taken from the year. Your calculation for the SEM would be correct if that were the case, but the monthly measurements are not made randomly; there are 12 months and you have the value of each. The SEM assumes you might have got 2 or 3 measurements from, say, January, and maybe none from July.

Secondly, when you treat the 12 months as a time series, you are using the standard deviation of all the monthly values. But that is not random variation; it’s a very specific, predictable seasonal effect. You have non-stationary data and a non-random sample. It doesn’t work. The obvious way of avoiding this is to seasonally adjust the monthly temperatures, e.g. by turning them into anomalies. If you do that, you have a much smaller standard deviation that actually represents the random variation over the year.

Thirdly, I’m not sure it makes any sense to be talking about the uncertainty in the annual values in this way. What is the population you are sampling from? If you only want to know what the average annual temperature was at this particular station in this particular year, the average of the 12 monthly values tells you that exactly. The average is what it is, and there is no sampling uncertainty. The only uncertainty is from any uncertainty in the monthly values, which might be down to measurement errors or missing values (as in the NIST example).

Fourthly, you were talking about the uncertainty in the global average, not one station. This is where the true sampling takes place, hundreds of different stations being averaged into an estimate of the global average. The more stations the larger the sample size and the smaller the SEM. But, as I also keep saying, any actual uncertainty analysis is more complicated. Stations are not randomly distributed, the average has to be weighted against geography, and many adjustments have to be made.
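The seasonal-adjustment point above can be illustrated with synthetic data: a monthly series dominated by an annual cycle has a large standard deviation, while its anomalies (departures from each month’s long-term mean) retain only the much smaller random part. All numbers here are made up:

```python
import math
import random

random.seed(0)
years, noise_sd = 30, 0.5

# Synthetic monthly temperatures: a 10-degree-amplitude seasonal
# cycle plus small random noise, for 30 years.
seasonal = [10 * math.sin(2 * math.pi * m / 12) for m in range(12)]
data = [[seasonal[m] + random.gauss(0.0, noise_sd) for m in range(12)]
        for _ in range(years)]

def sd(xs):
    mu = sum(xs) / len(xs)
    return math.sqrt(sum((x - mu) ** 2 for x in xs) / (len(xs) - 1))

# Raw values vs anomalies relative to each month's long-term mean.
raw = [t for year in data for t in year]
clim = [sum(year[m] for year in data) / years for m in range(12)]
anom = [year[m] - clim[m] for year in data for m in range(12)]

print(round(sd(raw), 1), round(sd(anom), 2))  # seasonal cycle dominates raw
```

Here the raw standard deviation is around 7 degrees while the anomaly standard deviation is close to the 0.5-degree noise level, which is the point of working with anomalies.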

Reply to  Bellman
February 3, 2023 10:44 am

Ha, ha, ha. Show ONE reference where the NUMBER OF SAMPLES is used rather than the SIZE of SAMPLES. If you ask, I can give you 10 references that say otherwise.

As so often you don’t understand what a sample is. You can look at the measurements that go into making an average monthly value as a sample, but as I’ve said it isn’t really a random sample. You can look at 1000 different stations and their average monthly value as a sample for that months global temperature. This isn’t about the “NUMBER OF SAMPLES”, it’s about the size of the sample you are looking at.

Reply to  Bellman
February 3, 2023 2:32 pm

You *still* haven’t bothered to actually read TN1900, have you? You have no idea what assumptions Possolo made to justify what he did.

You just keep falling back on the old “all error is random and cancels, and all stated values are 100% accurate”. All the while trying to tell us that you don’t assume that!

Yet the entire range of all UAH annual data varies by less than 1°C.”

And you *still* haven’t figured out that the uncertainties of the values used to calculate the anomalies carries over to the anomalies. You want to “scale” everything, including the uncertainties. It just doesn’t work that way.

“How it possible to see changes of a few tenths of a degree and identify them with ENSO conditions, or volcanic activity, when there is a 14°C uncertainty range for each year?”

There IS NO WAY! What do you think everyone has been trying to teach you for two years?

Reply to  Tim Gorman
February 3, 2023 5:31 pm

You have no idea what assumptions Possolo made to justify what he did.

Why don’t you spell out which assumptions you think are relevant, rather than relying on insults and innuendos?

You just keep falling back on the old “all error is random and cancels, and all stated values are 100% accurate”. All the while trying to tell us that you don’t assume that!

Yet you now accept that the SEM is the uncertainty of the mean, which requires all the errors to be random, and only works because some errors will cancel. And the example, and Jim’s garbled version of it, assume that all stated values are 100% accurate, to the same extent as any calculation of the SEM does.

And you *still* haven’t figured out that the uncertainties of the values used to calculate the anomalies carries over to the anomalies.

Which has nothing to do with what I was saying. My question remains: if the annual global anomaly figures have a 95% uncertainty range of ±7°C, how can you get annual values that never deviate by more than 1°C?

There IS NO WAY!

And yet it’s self evident just by looking at the graph. Why do you think the largest annual values occurred during strong El Niños. Why do you think Dr Spencer used to draw attention to Mt Pinatubo cooling on his graphs?

Reply to  Bellman
February 5, 2023 7:08 am

I gave a link to get the paper and read it. I tire of having to teach you what you can easily learn yourself if you take the time to do so.

I will include a rather lengthy list of entries in the paper so you won’t need to strain in finding pertinent information.

To summarize, the paper describes using an observation equation made up of the temperature readings. The mean temperature “τ” (the declared measurand) defines the mean of a probability distribution from which the readings surrounding “τ” are drawn. A Student’s t distribution is assumed, and the standard uncertainty is expanded to give a 95% confidence interval.

If you want to continue to complain, please refer to the sections I have included that you say are incorrect. Remember, your argument isn’t with me, it is with NIST. You need to do more than simply declare the result is incorrect!

“(1) Measurand & Measurement Model. Define the measurand (property intended to be measured, §2), and formulate the measurement model (§4) that relates the value of the measurand (output) to the values of inputs (quantitative or qualitative) that determine or influence its value. Measurement models may be:

Measurement equations (§6) that express the measurand as a function of inputs for which estimates and uncertainty evaluations are available (Example E3);

Observation equations (§7) that express the measurand as a function of the parameters of the probability distributions of the inputs (Examples E2 and E14).”

“(4) Measurement Result. Provide an estimate of the measurand and report an evaluation of the associated uncertainty, comprising one or more of the following (§8):

Standard uncertainty (for scalar measurands), or an analogous summary of the dispersion of values that are attributable to the measurand (for non-scalar measurands);

Coverage region: set of possible values for the measurand that, with specified probability, is believed to include the true value of the measurand;

Probability distribution for the value of the measurand, characterized either analytically (exactly or approximately) or by a suitably large sample drawn from it.”

“The standard uncertainty that is often associated with such average as estimate of μ equals s∕√n, where s denotes the standard deviation of the observations. However, it is common knowledge that, especially for small sample sizes, s∕√n is a rather unreliable evaluation of u(μ) because there is considerable uncertainty associated with s as estimate of σ.”

“The answer, in this case, with the additional assumption that the observations are like a sample from a Gaussian distribution, is that a (suitably rescaled and shifted) Student’s t distribution shortcuts that staircase (Mosteller and Tukey, 1977, 1A) and in fact captures all the shades of uncertainty under consideration, thus fully characterizing the uncertainty associated with the average as estimate of the true mean.”

“(4b) An observation equation (or, statistical model) expresses the measurand as a known function of the parameters of the probability distribution of the inputs.”

“(i) Additive Measurement Error Model. Each observation x = g(y)+ E is the sum of a known function g of the true value y of the measurand and of a random variable E that represents measurement error (3e). The measurement errors corresponding to different observations may be correlated (Example E20) or uncorrelated (Examples E2 and E14), and they may be Gaussian (Example E2) or not (Examples E22 and E14).”

“(iii) Student’s t, Laplace, and hyperbolic distributions are suitable candidates for situations where large deviations from the center of the distribution are more likely than under a Gaussian model.”

“(7a) Observation equations are typically called for when multiple observations of the value of the same property are made under conditions of repeatability (VIM 2.20), or when multiple measurements are made of the same measurand (for example, in an interlaboratory study), and the goal is to combine those observations or these measurement results.

EXAMPLES: Examples E2, E20, and E14 involve multiple observations made under conditions of repeatability. In Examples E12, E10, and E21, the same measurand has been measured by different laboratories or by different methods.”

“(7d) … EXAMPLES: Examples E2, E14, E17, E18, E20 and E27 illustrate maximum likelihood estimation and the corresponding evaluation of measurement uncertainty.”

“The equation, ti = r+Ei, that links the data to the measurand, together with the assumptions made about the quantities that figure in it, is the observation equation. The measurand τ is a parameter (the mean in this case) of the probability distribution being entertained for the observations.”

Reply to  Jim Gorman
February 5, 2023 2:02 pm

If you want to continue to complain, please refer to the sections I have included that you say are incorrect. Remember, your argument isn’t with me, it is with NIST.

How many more times? I am not saying anything you quote is incorrect. I’m pointing out your interpretation is wrong. I’ve gone through all the obvious reasons why what you are trying to do is wrong and cannot possibly be correct. Endlessly cut and pasting some authority does you no good if you don’t understand what it means.

If you want to convince the rest of the world that the uncertainty really is ±7°C, you have to demonstrate why what you are doing is correct, not just claim that it’s what NIST would do.

Reply to  Bellman
February 5, 2023 3:48 pm

I know exactly what it means but you obviously don’t.

I do exactly what the NIST document does. I declare the mean “τ” of the Tavg for the month to be the measurand.

I find the standard deviation of the data surrounding the mean using the traditional formula for “s”.

Then divide “s” by the sqrt of the number of days with temperatures to obtain “u(τ)”.

“u(τ)” is then expanded by reading into the “t” table at 97.5% and (n-1) degrees of freedom. That is usually around k = 2.2.

Then multiply u(τ) by the k factor to obtain the expanded uncertainty at a 95% confidence level.
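The steps above can be sketched in a few lines of Python. The n, mean, and s values here are illustrative, chosen to be consistent with the ±1.8 °C NIST Tmax figure quoted elsewhere in this thread; the t factor comes from a standard t-table.

```python
import math

# Sketch of the TN1900-style procedure described above.
# Illustrative inputs: 22 daily Tmax readings with mean 25.6 C and
# sample standard deviation 4.1 C (the numbers behind the ~+/-1.8 C
# figure discussed in this thread).
n = 22
tau = 25.6                 # mean of the readings: the declared measurand (C)
s = 4.1                    # sample standard deviation of the readings (C)

u = s / math.sqrt(n)       # standard uncertainty u(tau) = s / sqrt(n)
k = 2.08                   # Student's t, 97.5%, n - 1 = 21 degrees of freedom
U = k * u                  # expanded uncertainty at ~95% coverage

print(f"tau = {tau} C, u(tau) = {u:.2f} C, U = +/-{U:.1f} C")
```

With these inputs u(τ) ≈ 0.87 °C and the expanded uncertainty is about ±1.8 °C.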

It really doesn’t matter a lot if all the monthly Tmin’s and Tmax’s are used or if the Tavg’s are used. The uncertainty is still very large.

I know it bothers you to admit that the uncertainty in the distribution of temperatures is so very large. I can’t help that. The large values do point out why NIST decided to declare that measurement uncertainty of the individual temperatures is negligible.

It does rather show how what the anomalies are really worth doesn’t it?

Reply to  Jim Gorman
February 5, 2023 4:17 pm

I know exactly what it means but you obviously don’t.

And yet you get a value which is impossible, and never question that you might be wrong. Think about it. Do you think global annual temperatures have ever varied over the last few hundred years by 14°C? That’s an ice-age-level change. Do even individual locations vary annually by more than a degree or so?

The uncertainty in maximum values from your NIST example is only ±1.8°C, and that’s based on only having a third of the monthly data. That’s for one month. How on earth do you think averaging 12 of those monthly values could increase the uncertainty by a factor of 4?

Reply to  Bellman
February 5, 2023 4:39 pm

Then divide “s” by the sqrt of the number of days with temperatures to obtain “u(τ)”.

There’s one problem. You are dividing by the number of months, not the number of days. Each of your monthly values is based on multiple daily values, hopefully a month’s worth. But even assuming each month only has 22 days’ worth of data, you still have over 260 individual measurements, not 12. Dividing by sqrt 260 will produce a much smaller value than dividing by sqrt 12. In fact your SEM is going to be reduced by a factor of 5, and of course you no longer need to use the Student’s t distribution; it will be as close to normal as makes no difference.

In your example you have a standard deviation of 10°C, so 10 / √260 = 0.62, and the 95% confidence interval is around ±1.2°C.

If you had all 365 daily values this drops to ±1.0°C.
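That arithmetic, spelled out as a quick sketch (s = 10 °C is the standard deviation quoted above, and 1.96 is the normal-approximation 95% factor, reasonable at these sample sizes):

```python
import math

# SEM = s / sqrt(n) for the two sample sizes discussed above,
# with a normal-approximation 95% interval of 1.96 * SEM.
s = 10.0                     # standard deviation of daily values (C)
for n in (260, 365):
    sem = s / math.sqrt(n)
    print(f"n = {n}: SEM = {sem:.2f} C, 95% CI = +/-{1.96 * sem:.1f} C")
```

This reproduces the ±1.2 °C (n = 260) and ±1.0 °C (n = 365) figures above.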

But that still ignores the fact that you do not have a random sample of values from across the year, and you are taking samples without replacement.

And of course, you want anomalies, not temperatures. Calculate the anomaly for each month, and its standard deviation is not going to be 10°C.

Reply to  Bellman
February 5, 2023 5:13 pm

Oh, and of course, that’s just one station. You will actually be averaging many stations to get the global average.

Reply to  Bellman
February 6, 2023 7:22 am

And here we go again. Averaging reduces uncertainty. You just can’t let that go, can you?

Reply to  Bellman
February 5, 2023 5:25 pm

Where did you get 260? Each day has one Tmax and one Tmin. Over thirty days that is 60 measurements.

Did you not read what I said?

“It really doesn’t matter a lot if all the monthly Tmin’s and Tmax’s are used or if the Tavg’s are used. The uncertainty is still very large.”

You haven’t even bothered to find out what the variance, standard deviation, and mean for a month is have you? Here is an image of a web based standard deviation with data for 1953 in Topeka, Kansas.

Do you think those monthly averages are out of line? By the way, there are only twelve data points so n=12, not 260.

sample standard deviation.jpg
Reply to  Jim Gorman
February 5, 2023 5:28 pm

Here is a copy of the monthly data in 1953 in Topeka. You’ll notice that the uncertainty of the monthly mean is not as large as the annual. Do you think that the variance in a month is smaller than what one would see over 4 seasons in a year?

sample monthly standard deviations.jpg
Reply to  Jim Gorman
February 6, 2023 3:21 pm

You’ll notice that the uncertainty of the monthly mean is not as large as the annual.

Only because you don’t know how to calculate the uncertainty. Simple logic should tell you it’s not possible for the average of 12 things to have more measurement uncertainty than the individual things. But as I say, the actual uncertainty you are calculating is not the measurement uncertainty but that which you would get from a random sample of 12 monthly values.

Reply to  Bellman
February 6, 2023 8:57 am

“Dividing by sqrt 260 will produce a much smaller value than dividing by sqrt 12.”

Are you innumerate? Each of the monthly variances has already been divided by the number of days in the month. Dividing again will certainly lower the number, but the result is meaningless.

Look at the variance formula and the uncertainty of the mean for a sample.

s^2 = (1/(n-1))Σ(Xi – Xbar)^2, SD =√(s^2)

u(τ) = SD / √n (standard uncertainty)

What do you think “n” is in this case? I’ll give you a hint, the number of days.

Why do you think dividing AGAIN by the √260 (days) is necessary? What are you trying to calculate and what formula are you using?

It looks like you are trying to calculate an SEM of a SEM!

As I said if you showed your calculations you might, just maybe, find some errors.

You and many climate scientists just can’t help yourselves. Find any way to lower the uncertainty whether it is mathematically correct or not.

Reply to  Jim Gorman
February 6, 2023 2:55 pm

Each of the monthly variances have already been divided by the number of days in the month. Dividing again will certainly lower the number, but the result is meaningless.

I’m saying what would happen if instead of using the monthly values you used daily values to calculate the average. You should get the same average, why do you think you should get different uncertainties?

Reply to  Bellman
February 6, 2023 4:31 am

The NIST example only looked at Tmax therefore the uncertainty only takes into account the variation in those temperatures. Do you think adding Tmins wouldn’t expand the variance of the data?

I am sure you have not looked at the data or done your own calculations. Tell us what the variance of Tavg generally is. Pick some example sites and do the calculations. Heck, I bet your local TV station can show that information for a few days in the past. Pull up a weather app and get the forecast for Tmax and Tmin. Then check the variance in an online calculator.

When you create a random variable by averaging measurements, you must use the variance of that variable in further calculations.

You just confirmed that you still don’t understand uncertainty. This is exactly what Dr. Frank had to put up with when he did his analysis on GCM’s.

Uncertainty only tells you how wide the interval is that surrounds the mean, and it defines where the mean may actually lie.

Your response only shows that you have not even processed temperature data yourself.

Lastly, do you think anomalies carry the same variance as the random variables that create them? If you do, then you must show the added variances of the two numbers you subtract. If you don’t think the variances add, then you must show references as to why they don’t.

Reply to  Jim Gorman
February 6, 2023 3:07 pm

I was only using daily values as a way of showing how the SEM can be applied. It doesn’t really matter if you are looking at average daily or maximum daily values.

And yes, I probably should have taken into account the fact that daily values across the year will have more variance than monthly values, but it was only an estimate to show how treating monthly values as if they were random makes no sense.

This is exactly what Dr. Frank had to put up with when he did his analysis on GCM’s.

Yet not even he claimed the uncertainty of a global anomaly was ±7°C.

Your response only shows that you have not even processed temperature data yourself.

Where do you think all my graphs come from?

Lastly, do you think anomalies carry the same variance as the random variables that create them?

Depends what variance you are talking about. They certainly don’t have the same variance as your monthly values, because that variance is coming from seasonal changes.

If you do, then you must show the added variances of the two numbers you subtract.

Why don’t you try it with your station data. Work out a 30 year average for each month, then subtract it from your single year data. Then use the same equation for the SEM, standard deviation divided by root 12. Remember, when you used that formula for your monthly values we were ignoring any uncertainty in the individual monthly values, just taking them as stated values. All your uncertainty came from the seasonal variation.
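The exercise suggested above can be sketched like this. All the numbers are invented placeholders (not real station data), just to show the mechanics: subtract a 30-year baseline from each month of one year, then apply the same SEM formula to the anomalies.

```python
import math
import statistics

# Hypothetical single-year monthly means and a hypothetical 30-year
# baseline for the same months (deg C). Placeholder values only.
monthly = [-1.0, 1.5, 7.0, 13.0, 18.5, 24.0, 27.0, 26.0, 21.0, 14.0, 7.0, 1.0]
baseline = [-0.6, 1.2, 6.5, 13.3, 18.1, 23.6, 27.3, 25.7, 21.2, 13.6, 6.5, 0.7]

# Anomalies remove the seasonal cycle; then SEM = SD / sqrt(12) as before.
anomalies = [m - b for m, b in zip(monthly, baseline)]
sd = statistics.stdev(anomalies)
sem = sd / math.sqrt(len(anomalies))
print(f"anomaly SD = {sd:.2f} C, SEM = {sem:.2f} C")
```

The point of the sketch: once the seasonal cycle is subtracted, the standard deviation is a fraction of a degree rather than the ~10 °C of the raw monthly means.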

Reply to  Bellman
February 6, 2023 6:25 am

“And yet you get a value which is impossible”

Why is it impossible? It’s from the same method Possolo used. Are you saying Possolo’s method is wrong?

Can you *show* why it is wrong?

An inconvenient truth is *still* the truth!

“Do you think global annual temperatures ever vary over the last few hundred years by 14°C?”

It’s quite likely! Colder temperatures have wider variances than hotter temperatures! Why is this so hard for you to believe?

The difference between -30C and -20C is ten degrees but it doesn’t affect the ice sheet very much!

” Do even individual locations vary annually by more than a degree or so?”

Did you read this before posting it?

Here are the median annual temperature values at my location:

year  median  diff yr-yr  diff from 2019  annual std dev
2019  55.8    —           —               21
2020  56.6    +0.8        +0.8            19.3
2021  58.7    +2.1        +2.9            20
2022  57.7    -1.0        +1.9            21.6

Even the standard deviation of the annual temperature profile can change by more than one degree.

“That’s for one month. How on earth do you think averaging 12 of those monthly values could increase the uncertainty by a factor of 4?”

Because uncertainties add? Averaging does *NOT* decrease uncertainty, no matter how much you wish it would.

Reply to  Tim Gorman
February 6, 2023 7:07 am

If you add random variable means to obtain an average, then you also add the variances to find an average uncertainty. Unless there is a large difference in variances you basically end up with about what you started with. One must also remember that u(τ) is like a scaled Standard Deviation (/√n), so you must square it to get u(τ)^2 before you add them.
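The squaring-before-adding step mentioned here can be sketched as standard uncertainty propagation for an average of m independent quantities. The u_i values below are invented placeholders, not real monthly uncertainties.

```python
import math

# Combine standard uncertainties u_i of m independent monthly means into
# the uncertainty of their average: square each, add, take the root, and
# divide by m (the average itself carries a 1/m factor).
u_monthly = [0.8, 0.9, 1.1, 0.7, 0.8, 1.0, 0.9, 0.8, 1.2, 0.9, 0.7, 1.0]

m = len(u_monthly)
u_avg = math.sqrt(sum(u * u for u in u_monthly)) / m
print(f"u of the {m}-month average = {u_avg:.2f}")
```

Note that because of the 1/m factor, the combined uncertainty of the average comes out smaller than the individual u_i, which is the crux of the disagreement in this thread.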

Reply to  Tim Gorman
February 6, 2023 3:16 pm

It’s quite likely! Colder temperatures have wider variances than hotter temperatures! Why is this so hard for you to believe?

Why do you keep assuming I don’t believe things that are patently true? This has nothing to do with you claiming that annual global temperatures might vary by 14°C.

Here are the median annual temperature values at my location:

Four years differing by what I assume is 1.6°C, i.e. a degree or so.

Reply to  Bellman
February 3, 2023 10:11 am

Still waiting for a response and a reference about what “n” is when doing sampling. Have you not come across a reference yet?

Here is a good one. Notice the reference to the sample size and not the number of samples.

Statistics Notes: Standard deviations and standard errors – PMC (nih.gov)

“The standard error of the sample mean depends on both the standard deviation and the sample size, by the simple relation SE = SD/√(sample size). The standard error falls as the sample size increases, as the extent of chance variation is reduced—this idea underlies the sample size calculation for a controlled trial, for example. By contrast the standard deviation will not tend to change as we increase the size of our sample.”

“So, if we want to say how widely scattered some measurements are, we use the standard deviation. If we want to indicate the uncertainty around the estimate of the mean measurement, we quote the standard error of the mean. The standard error is most useful as a means of calculating a confidence interval. For a large sample, a 95% confidence interval is obtained as the values 1.96×SE either side of the mean. We will discuss confidence intervals in more detail in a subsequent Statistics Note. The standard error is also used to calculate P values in many circumstances.”

Ok, I’ve started. Let’s see your references that say you divide by the sqrt of the number of samples.

Reply to  Jim Gorman
February 3, 2023 10:50 am

Still waiting for a response and a reference about what “n” is when doing sampling. Have you no come across a reference yet?

Have some patience. I’ve been compared to a predatory stalker, just because I question Monckton’s methods, yet whenever I log in I always find hundreds of comments addressed to me, insulting me, and demanding answers to all sorts of nonsense. And if I haven’t got round to every one within a few hours I get more insults and the claim that I don’t know the answer.

You’re lucky I had to cancel an appointment today, otherwise you would have had to wait till tomorrow before I could answer all your polite questions.

Reply to  Jim Gorman
February 3, 2023 11:00 am

Notice the reference to the sample size and not the number of samples.

Your ability to finally agree with what I’ve been trying to tell you for years, whilst making it seem like it’s something I need to have explained to me, is quite extraordinary.

When have I ever said that you look at anything other than sample size?

Let’s see your references that say you divide by the sqrt of the number of samples.

Why would you do that? If you are talking about averaging across both time and space, you can either look at each station as a single monthly value, in which case each station is an item in the sample of all stations, or you could look at the monthly average as being the combination of the 30 daily values from each station. This would give you a single sample that is 30 times bigger than the first, but in each case you still only have one sample.

Just as when you took the 12 monthly averages and treated each as a single item in a sample of size 12. You were not treating this as 12 separate samples each of size 30. But you could have treated the annual value as 365 daily values, that is a sample of size 365.

Reply to  Bellman
February 3, 2023 8:23 am

“I’ve asked repeatedly why UAH does not publish an uncertainty analysis for the current version.”

A good, valid question. But even the most ridiculous estimates of it (I see some eyes burning out there) won’t degrade the statistical durability of the trends now under discussion. We are talking nearly 500 data points, clearly steering us up…

Reply to  Bellman
February 2, 2023 12:54 pm

… and whilst I wouldn’t make any claims for its accuracy, …

Reminds me of a former colleague who was teaching a bone-head math class and upon being told by a student that he had made an arithmetic error on the chalk board, promptly turned and said, “I may not be very accurate, but I’m fast.”

Reply to  Bellman
February 3, 2023 1:08 pm

The Monckton pause now starts in August 2014.

Seems I was wrong. Monckton is stating the pause still starts in September. Not sure if this is an oversight on Monckton’s part, or if there is some slight difference in the data we are using. By my calculation the rate of change from August 2014 to January 2023 is -0.001°C / decade. Very close to zero, but not positive.
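The trend figure quoted above is just the ordinary least-squares slope of the monthly anomalies against time, scaled to °C per decade. A minimal sketch (the real −0.001 °C/decade value needs the actual UAH series from August 2014 to January 2023; the toy series below only checks the arithmetic):

```python
# Ordinary least-squares slope of monthly anomalies vs. time,
# returned in deg C per decade.
def trend_per_decade(anomalies):
    n = len(anomalies)
    xs = [i / 12.0 for i in range(n)]                      # time in years
    xbar = sum(xs) / n
    ybar = sum(anomalies) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, anomalies)) \
            / sum((x - xbar) ** 2 for x in xs)             # deg C per year
    return slope * 10.0                                    # deg C per decade

print(trend_per_decade([0.0, 0.1, 0.2, 0.3]))  # toy series rising 0.1 C/month
```

Feeding in the UAH monthly anomalies for the chosen start month would reproduce (or check) the quoted pause trend.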

Reply to  Bellman
February 4, 2023 5:45 am

My mother in Independent Living goes similarly sideways from time to time….

ResourceGuy
February 2, 2023 5:39 am

Give it another five years to add some curvilinear down pattern instead of a linear segment with noise. That won’t make a difference for the woke indoctrinated and their benefactors, but the rest of us will learn something.

ResourceGuy
February 2, 2023 6:04 am

Free the polynomials!

Reply to  ResourceGuy
February 2, 2023 12:58 pm

Let them emigrate to Pangea.

February 2, 2023 6:12 am

Dr. Spencer,

Please add a graph showing the 100-plus subjective computer temperature “predictions” and UPDATED objective satellite and balloon temperatures.

rah
February 2, 2023 6:20 am

Well, Phil saw his shadow. He almost certainly has got it right this year. A few years ago I saw a statistical comparison of the accuracy of Phil versus weather models, and it was a tie as I recall.

But obviously that furry four-legged critter that likes to live in a hole in the ground is far more accurate than the climate models and the likes of Hansen, Gore, Mann, etc.

Beta Blocker
Reply to  rah
February 2, 2023 12:52 pm

Mr. rah, is Punxsutawney Phil properly credentialed as a statistically-trained groundhog with education in climate science? Are his predictions peer-reviewed by other statistically-trained groundhogs of similar or better reputation in the world of groundhog climate science?

February 2, 2023 6:31 am

The monthly Trendology Clown Car Circus has arrived right on schedule.

Reply to  karlomonte
February 2, 2023 6:49 am

Nope. Still waiting for Monckton to show up.

He’ll be able to confirm for you that the trend is 0.133°C / decade, and is negative if you start in August 2014.

Reply to  Bellman
February 2, 2023 7:52 am

“I’ll pass” — Willis E.

Not going to play in your clown car circus…

Norman Page
February 2, 2023 7:00 am

Excerpts From http://climatesense-norpag.blogspot.com/

“The IPCC and UNFCCC post-modern science establishment’s “consensus” is that a modelled future increase in CO2 levels is the main threat to human civilization. This is an egregious error of scientific judgement. A Millennial Solar “Activity” Peak in 1991 correlates with the Millennial Temperature Peak at 2003/4 with a 12/13 year delay because of the thermal inertia of the oceans. Earth has now entered a general cooling trend which will last for the next 700+/- years.
Because of the areal distribution and variability in the energy density of energy resources and the varying per capita use of energy in different countries, international power relationships have been transformed. The global free trade system and global supply chains have been disrupted.

Additionally, the world’s richest and most easily accessible key mineral deposits were mined first, and the lower quality resources which remain in the 21st century are distributed without regard to national boundaries and demand. As population grows, inflation inevitably skyrockets. War between states and violent conflicts between tribes and religious groups within states are multiplying.

2 The Millennial Temperature Cycle Peak.
Latest Data (1) https://www.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt
Global    Temp Data 2003/12 Anomaly +0.26 : 2023/01 Anomaly -0.04 Net cooling for 19 years
NH        Temp Data 2004/01 Anomaly +0.37 : 2023/01 Anomaly +0.05 Net cooling for 19 years
SH        Temp Data 2003/11 Anomaly +0.21 : 2023/01 Anomaly -0.14 Net cooling for 19 years
Tropics   Temp Data 2004/01 Anomaly +0.22 : 2023/01 Anomaly -0.38 Net cooling for 19 years
USA 48    Temp Data 2004/03 Anomaly +1.32 : 2023/01 Anomaly +0.12 Net cooling for 19 years
Arctic    Temp Data 2003/10 Anomaly +0.93 : 2023/01 Anomaly -0.72 Net cooling for 19 years
Australia Temp Data 2004/02 Anomaly +0.80 : 2023/01 Anomaly -0.50 Net cooling for 19 years
Earth’s climate is the result of resonances and beats between the phases of natural cyclic processes of varying wavelengths and amplitudes. At all scales, including the scale of the solar planetary system, sub-sets of oscillating systems develop synchronous behaviors which then produce changing patterns of periodicities in time and space in the emergent temperature data. The periodicities pertinent to current estimates of future global temperature change fall into two main categories:
a) The orbital long wave Milankovitch eccentricity, obliquity and precession cycles. These control the glacial and interglacial periodicities and the amplitudes of the corresponding global temperature cycles. 
b)  Solar activity cycles with multi-millennial, millennial, centennial and decadal time scales. 
The most prominent solar activity and temperature cycles are: Schwabe, 11 +/- years; Hale, 22 +/- years; 3 × the Jupiter/Saturn lap cycle, 60 years +/-; Gleissberg, 88 +/-; de Vries, 210 years +/-; Millennial, 960-1020 +/-. (2) …
5. CO2 -Temperature and Climate.

The whole COP Net Zero meme is founded on the flawed assumptions and algorithms which produced the IPCC- UNFCCC model forecasts of coming dangerous temperature increases.
The “consensus” IPCC models make the fundamental error of ignoring the long-term decline in solar activity and temperature following the Millennial Solar Activity Turning Point and activity peak which was reached in 1990/91, as shown in Figure 1.
The amount of CO2 in the atmosphere is 0.058% by weight. That is one 1,720th of the whole. It is inconceivable thermodynamically that such a tiny tail could wag so big a dog. (13)
Stallinga 2020 (14) concludes: “The atmosphere is close to thermodynamic equilibrium and based on that we … find that the alleged greenhouse effect cannot explain the empirical data—orders of magnitude are missing. … Henry’s Law—outgassing of oceans—easily can explain all observed phenomena.” CO2 levels follow temperature changes. CO2 is the dependent variable and there is no calculable consistent relationship between the two. The uncertainties and wide range of outcomes of model calculations of climate radiative forcing (RF) arise from the improbable basic assumption that anthropogenic CO2 is the major controller of global temperatures.

Miskolczi 2014 (15) in “The greenhouse effect and the Infrared Radiative Structure of the Earth’s Atmosphere” says: “The stability and natural fluctuations of the global average surface temperature of the heterogeneous system are ultimately determined by the phase changes of water.”

Also see Aleksandr Zhitomirskiy 2022, Absorption of heat and the greenhouse gas effect, https://independent.academia.edu/AleksandrZhitomirskiy (16), which says:

“The molar heat capacities of the main greenhouse and non-greenhouse gases are of the same order of magnitude. Given the low concentration of greenhouse gases in the atmosphere, their contribution to temperature change is below the measurement error. It seems that the role of various gases in the absorption of heat by the atmosphere is determined not by the ability of the gas to absorb infrared radiation, but by its heat capacity and concentration.”

Zaichun Zhu et al. 2016 (17) in Greening of the Earth and its drivers report “a persistent and widespread increase of growing season integrated Leaf Area Index (greening) over 25% to 50% of the global vegetated area from 1982–2009. … CO2 fertilization effects explain 70% of the observed greening trend.”

Policies which limit CO2 emissions, or even worse sequester CO2 in quixotic CCS green-washing schemes, would decrease agricultural food production and are antithetical to the goals of feeding the increasing population and bringing people out of poverty.
 
The tropical rain forests and tropical oceans are the main source of the atmosphere’s water vapor and the rainfall essential to life and agriculture on land. Potable and agricultural water supplies are now stretched to their limits in many areas because of the demographics of global population increase. Temperature limits and targets as set in the Paris Accords to ameliorate future temperatures are completely useless when formulating policies relative to adaptation to the actual real world problems. These require more local inputs for particular regional ecosystems delineated by coastlines, major river basins and mountain range limited intra-continental divides.”

wh
Reply to  Norman Page
February 2, 2023 9:37 am

You guys have been saying we’re due for cooling for decades now.

February 2, 2023 7:46 am

Only 5 months with a negative anomaly in the past 100 months. January temperature was truly a rare occurrence in the monthly temperature distribution.

Norman Page
Reply to  Javier Vinós
February 2, 2023 8:00 am

Javier See Figs 1,2,and 3 from  http://climatesense-norpag.blogspot.com/

Fig 1 Correlation of the last 5 Oulu neutron cycles and trends with the Hadsst3 temperature trends and the 300 mb Specific Humidity. (5,6)

The Oulu Cosmic Ray count in Fig. 1C shows the decrease in solar activity since the 1991/92 Millennial Solar Activity Turning Point and peak. There is a significant secular drop to a lower solar activity base level post-2007 +/- and a new solar activity minimum late in 2009. In Figure 1 short-term temperature spikes are colored orange and are closely correlated to El Niños. The hadsst3gl temperature anomaly at 2037 is forecast to be +0.05.

https://blogger.googleusercontent.com/img/a/AVvXsEjEbcj5Rk2czupOsD4PnxjTI-dNoIAxcMG7yKIGiTboHkXgmlF-HR1m87NYfqMPtiJwwLrIvGpQBvedJLU9dgcqsm-EV63Xuz7VyuiLjy7aqL2p6NaMD9mt9TOO-iDEeT_GIcBDpyAFUkX5-gJwoywFuphiM6-20iV3lXEUvLpz1Ln0mdmiRqpfyAR_3w=w631-h410
Fig.2 Northern Hemisphere 2000 year temperature reconstruction and a Millennial Temperature Turning Point. (MTTP). (7)
Because of the data quality, record length and methods used, the NH Christiansen et al 2012 series was selected as the “type reconstruction” to represent the NH trends. The de Vries, Maunder, Sporer and Wolf minima are noted. Important volcanic cooling events are marked with a V.  An MTTP occurs at about 990. The Millennial cycles are asymmetric with a 700+/- year down-leg and a 300 +/- year up-leg.

https://blogger.googleusercontent.com/img/a/AVvXsEi_Nb8FScfi4vU39WRnt85Ua6CSHHmuRkp9lCLYNQqXLsAyLu5iOnnIgr2Jj32od9RAOhpiG6MHcdUVBUb-3_ATOd7Wue73fogFWBHPBRMEu2gsmZ7wn5ZEuLlxxCCjK2T47UdKng6Wbt36Igdf659FOWPLsuAlR19WeFoGaXjbVZn6qctZd6RnLFj1gQ=w446-h516

Fig 3  The NRLTSI2 Solar Activity – CET Relationship 1600- Present (8,9,10)
In Fig.3 the Roth & Joos Cosmogenic Index (CI) is used as the emergent proxy for the solar activity driver of the resulting emergent global and NH temperature data.
The effect on observed emergent behaviors, i.e. global temperature trends, of the combined effect of these solar and GCR drivers will vary non-linearly depending on the particular phases of the eccentricity, obliquity and precession orbital cycles at any particular time.
Figure 3 shows an increase in CI of about 2 W/m^2 from the Maunder minimum to the 1991 activity peak. This increase, together with the other solar “activity” variations, modulates the earth’s temperature and albedo via the GCR flux and varying cloud cover.

bdgwx
February 2, 2023 8:26 am

The 2022 data points are in. Here is the global average temperature since 1979 from 8 datasets. To better visualize the long divergence of the datasets I have normalized them all to their OLS y-intercept. In other words, they all start in 1979 at the starting point of their own trend lines.

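The normalization described above (shifting each series so it starts at the y-intercept of its own least-squares trend line) might be sketched like this; the function name is my own, not from any published code:

```python
# Shift a series so its trend line's y-intercept sits at zero, i.e. all
# series "start at the starting point of their own trend lines".
def normalize_to_trend_intercept(series):
    n = len(series)
    xbar = (n - 1) / 2.0
    ybar = sum(series) / n
    slope = sum((i - xbar) * (y - ybar) for i, y in enumerate(series)) \
            / sum((i - xbar) ** 2 for i in range(n))
    intercept = ybar - slope * xbar          # trend value at the first point
    return [y - intercept for y in series]

print(normalize_to_trend_intercept([1.0, 2.0, 3.0]))  # purely linear toy series
```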

Reply to  bdgwx
February 2, 2023 1:14 pm

It looks to me that the CMIP5 temps are only reasonably good prior to about 1998. Which is surprising, because the models were tuned to data prior to 2005. However, they seem to run too warm, which is not news.

bdgwx
Reply to  Clyde Spencer
February 2, 2023 2:15 pm

It’s 0.2 C of error relative to the composite. I think that is a lot, considering that the Hansen and IPCC predictions (from 1988 and 1990 respectively) for the scenario closest to the one humans actually chose were closer. But compared to the predictions of Easterbrook, Archibald, Zharkova, Bastardi, and many others I’ve seen mentioned on WUWT it is reasonably good. At least it correctly predicted the direction of change. Nonetheless I expect better. I’m obviously doubly disappointed that CMIP6 not only did not improve the skill over this period but actually got slightly worse.

Reply to  bdgwx
February 2, 2023 5:37 pm

“composite.”

Lol.

bdgwx
Reply to  Mike
February 2, 2023 8:54 pm

My apologies. I thought “composite” would be self-evident to most. The “composite” in my graph is (UAH + RSS + NOAA + HadCRUT + GISS + BEST + ERA + RATPAC) / 8.

Reply to  bdgwx
February 3, 2023 5:48 pm

Lol

bdgwx
Reply to  Mike
February 4, 2023 6:58 am

Admittedly I probably should have used the term ensemble since that is what most others are using. I’ll be sure to get that corrected. Thanks for bringing it to my attention.

Reply to  bdgwx
February 2, 2023 1:22 pm

Now show the measurement uncertainties as done in the NIST TN1900 Ex. 2.
Remember, when you subtract random variables the variances add, so when calculating an anomaly, you don’t get to reduce the variance.
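The variance-addition point can be checked with a quick Monte Carlo sketch. The numbers here are synthetic, with nothing to do with any real station record; the point is only that for independent random variables Var(X − Y) = Var(X) + Var(Y):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
obs = rng.normal(15.0, 0.5, n)       # observed temps, variance 0.25
baseline = rng.normal(14.2, 0.5, n)  # baseline estimate, variance 0.25
anomaly = obs - baseline             # anomaly = observation minus baseline

# Variances ADD under subtraction of independent variables:
print(round(anomaly.var(), 2))  # ~0.5, i.e. 0.25 + 0.25, not 0.25
```

In other words, under these independence assumptions the anomaly inherits the uncertainty of both the observation and the baseline, rather than being more certain than either.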

Reply to  bdgwx
February 2, 2023 2:14 pm

The Pause was flat. This is actually worse as the trend since 2016 is down. It is hard to defend that the Earth’s energy imbalance is still positive when the planet is cooling.

bdgwx
Reply to  Javier Vinós
February 2, 2023 3:55 pm

The “planet” is not cooling though. See Cheng et al 2023 regarding the updated energy increase in the ocean which accounts for about 90% of the uptake of excess energy. What has cooled from its peak in 2016 is the atmosphere and specifically the UAH TLT layer. That is but a small part of the climate system of Earth though.

Reply to  bdgwx
February 2, 2023 11:21 pm

The net flux of energy is positive from the ocean to the atmosphere. The ocean gets its energy from the Sun. The atmosphere can only reduce the net flux; it cannot warm the ocean because the net flux is never negative. Energy from the Sun is on average constant. The ocean is always resisting climate change. When the atmosphere is cooling, the ocean releases more heat, cooling itself. When the atmosphere is warming, the ocean releases less heat, warming itself. It is the atmosphere that does climate change, while the ocean resists it. The ocean always provides inertia to climate change.

Since 2016 the atmosphere has been cooling because it is losing more energy at the top of the atmosphere than it is getting from the ocean. The ocean then releases more heat to a colder atmosphere. Once it is releasing more heat than it is getting from the Sun, it will cool too. The planet cannot warm for long if its atmosphere is cooling. Climate change is driven by the atmosphere and surface, not the ocean.

bdgwx
Reply to  Javier Vinós
February 3, 2023 6:06 am

The ocean is not releasing more energy than it gets from the Sun though.

[Cheng et al. 2023]

comment image

And from [Schuckmann et al. 2020] who analyzed all heat reservoirs including the hydrosphere, cryosphere, atmosphere, and land we know the planetary energy imbalance is about +0.8 W/m2.

The atmospheric temperature is not going to decrease or even stabilize until the imbalance is <= 0 W/m2. It will be highly variable, with many ups and downs on shorter timescales, but over longer timescales it will continue to trend upward as long as the imbalance is > 0 W/m2.

Reply to  bdgwx
February 3, 2023 6:37 am

land we know the planetary energy imbalance is about +0.8 W/m2″ — ignoring the fact that uncertainties for radiometric quantities are an order of magnitude greater.

Reply to  bdgwx
February 3, 2023 10:55 am

According to Steven Dewitte et al. 2019, since about 2000 the Earth’s energy imbalance and OHC time derivative are decreasing over time.

comment image

This is consistent with the planet warming more slowly, a necessary first step to the planet not warming.

bdgwx
Reply to  Javier Vinós
February 3, 2023 11:59 am

Warming more slowly does not mean cooling. Mathematically: d²T/dt² < 0 does not imply dT/dt < 0; the second derivative can be negative while the first derivative stays positive. Similarly, just because the EEI trend has been -0.15 W/m2 per decade since 2000 does not mean that the EEI itself wasn’t still +0.8 W/m2 at the end of 2018.
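The distinction between "warming more slowly" and "cooling" can be checked numerically. A minimal sketch, using a made-up decelerating temperature curve (a logarithm), shows the first derivative staying positive everywhere while the second derivative stays negative everywhere:

```python
import numpy as np

# Toy temperature curve that rises ever more slowly but never falls:
t = np.linspace(1.0, 20.0, 200)
T = np.log(t)              # rising but decelerating
dT = np.gradient(T, t)     # numerical first derivative
d2T = np.gradient(dT, t)   # numerical second derivative

print(bool(np.all(dT > 0)))   # True: still "warming" at every point
print(bool(np.all(d2T < 0)))  # True: decelerating at every point
```

The curve and the timescale are arbitrary; the only point is that a negative second derivative coexists with a positive first derivative.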

February 2, 2023 8:44 am

models are getting farther and farther off

lol it was already this bad in 2018

temps.png
February 2, 2023 8:54 am

Interestingly, in the area I live, January was unusually warm this year.

Just another data point in the fact that weather is not climate.

Reply to  Sailorcurt
February 2, 2023 9:44 am

Unusually wet here in Auckland. We’ve had some really bad flooding, resulting in damage to houses, slips etc.
Our leaders are _not_ telling us this is weather – it seems that almost all of them are running the line that ‘this’ (the awful weather and its consequences) is climate change, and it’s only going to get worse. They don’t need any proof that this bad weather is the result of climate change – the bad weather itself is all the proof they need. Interestingly, the really good summer we had a couple of years back when it didn’t rain at all for a few months was also proof of climate change.
Our government has actually produced some predictions about how much they expect the weather to change in Auckland over the century, and they predict somewhere between a -2% and +2% change in rainfall (so they are not even certain of the sign of the change). It would seem that I’m the only person who has ever bothered to read them.

Simon
Reply to  Chris Nisbet
February 2, 2023 11:24 pm

Truth is the records for rainfall in Auckland were not broken… they were smashed by such large numbers it was mind blowing. It is entirely reasonable when a weather event happens like this that people look for possible causes/answers.

Reply to  Simon
February 2, 2023 11:37 pm

I don’t think anybody is looking for answers – ‘climate change’ dunnit, apparently.

Simon
Reply to  Chris Nisbet
February 3, 2023 10:17 am

No, the smart ones look for answers, others just deny.

February 2, 2023 8:58 am

Still, the entire modern temperature record is one thin page out of an encyclopedia-sized paleoclimate history that lacks precision anyway. What scares me is that what might be a completely natural temporary blip upward over the next decade could be seen as proof positive of CAGW and further hasten these green policies and the obsession with net zero.

February 2, 2023 10:12 am

The question nobody asks:

Did CO2 concentration in the atmosphere decline before the latest observed results were in? If not, then how did the increased trapped heat escape?

bdgwx
Reply to  doonman
February 2, 2023 10:51 am

doonman said: “Did CO2 concentration in the atmosphere decline before the latest observed results were in?”

No.

doonman said: “If not, then how did the increased trapped heat escape?”

The 1st law of thermodynamics. ΔE = Ein – Eout. CO2, like all polyatomic molecules, impedes the transmission of terrestrial energy more than solar energy. That means at TOA ΔEout < ΔEin which means ΔE > 0.

And it is important to point out that CO2 is not the only thing modulating Ein and Eout at either the TLT or TOA layers of the atmosphere. This is one of the biggest confusions I see here on WUWT. That is, a lot of people here erroneously think CO2 and only CO2 can modulate Ein and Eout, and are then confused when the UAH TLT temperature declines from one month to the next even though CO2 concentration increased.

To help you understand what is going on consider a hypothetical system being acted upon by only two actors A and B. Let Ea_in = sin(x), Ea_out = cos(x), Eb_in = 0.2*x, and Eb_out = 0.1*x. The 1LOT says ΔE = (0.2*x – 0.1*x) + (sin(x) – cos(x)). You can plot this on desmos.com or your favorite graphing calculator. Notice that actor B provides the small but persistent long term trend whereas actor A provides the large but variable short term fluctuations. Understand that simple concept first and then we can start extending it to include numerous arbitrarily complex persistent and cyclic actors like what the TLT layer of the atmosphere has in the real world. Remember, make sure you understand the simple concept first before jumping into the far more complex real world. If you cannot understand what effect actors A and B are having in the simple idealized scenario then there is no way you are going to understand the far more complex real world. If you have questions, ask.
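The two-actor toy above can be evaluated directly from its definitions, since ΔE = Ein − Eout = (sin x + 0.2x) − (cos x + 0.1x). A minimal sketch (the function name is my own):

```python
import math

# Toy two-actor system from the comment:
#   actor A: Ea_in = sin(x), Ea_out = cos(x)  (bounded oscillation)
#   actor B: Eb_in = 0.2*x,  Eb_out = 0.1*x   (persistent trend)
def delta_E(x):
    e_in = math.sin(x) + 0.2 * x
    e_out = math.cos(x) + 0.1 * x
    return e_in - e_out  # = 0.1*x + sin(x) - cos(x)

# The oscillating term is bounded by sqrt(2), so the 0.1*x trend
# dominates for large x even though short spans can go either way.
for x in (0, 10, 50, 100):
    print(x, round(delta_E(x), 3))
```

Sampling a few points shows the intended behavior: ΔE wobbles locally (it is even negative at x = 0) while drifting upward over the long run because the oscillation is bounded and the trend term is not.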

Richard M
Reply to  bdgwx
February 2, 2023 12:52 pm

CO2 does impede Eout right up until saturation (all surface energy is already absorbed). Of course, humanity has only experienced saturation so there’s never been a situation where CO2 produces any additional warming.

bdgwx
Reply to  Richard M
February 2, 2023 1:42 pm

I’m not talking about Eout at the surface. I’m also not sure it is worth discussing that kind of detail until it has been said the simple idealized case is understood.

Reply to  Richard M
February 2, 2023 2:05 pm

There is no point of saturation. The more CO2, the higher the altitude of emission. There is no saturation at the top of the atmosphere, not even on Venus. If more is added, the top of the atmosphere just goes higher.

Richard M
Reply to  Javier Vinós
February 2, 2023 2:49 pm

I’m talking about saturation of surface energy emissions which are the only ones that matter.

CO2 does impede more energy flow within the atmosphere as the concentration increases, but it also emits more energy. This is due to Kirchhoff’s Law. The two balance out and you get a consistent flow of energy independent of CO2 concentration. The emission height does not change.

Reply to  Richard M
February 2, 2023 11:29 pm

CO2 molecules always emit the exact same number of photons they receive. There is no impeding. All the IR thermal radiation is emitted toward space. Adding CO2 changes things only momentarily, except for the height of emission.

Reply to  Javier Vinós
February 3, 2023 6:17 am

Can’t CO2 also lose its excited state (i.e. the absorbed photon) through kinetic collisions with other molecules?

Reply to  Tim Gorman
February 3, 2023 6:38 am

In solid state physics these take the form of phonons.

Reply to  Tim Gorman
February 3, 2023 10:59 am

Yes, thermalization. It is more important the lower you are in the atmosphere. The IR ends up being emitted by a different molecule, and all the IR thermal radiation is emitted toward space the same.

Reply to  Javier Vinós
February 3, 2023 12:45 pm

That was my point. If heat is “trapped” we wouldn’t be here today.

Reply to  bdgwx
February 2, 2023 1:16 pm

The 1st law of thermodynamics. ΔE = Ein – Eout.

With a time delay.

bdgwx
Reply to  Clyde Spencer
February 2, 2023 1:47 pm

The 1LOT does not have a time delay. ΔE = Ein – Eout at every moment of time.

Reply to  bdgwx
February 2, 2023 4:17 pm

The rate of heat transmission is:

q = -kA(dT/dL) BTU/hr

where A is the normal surface area, dT is the temperature difference, and dL is the path length; k is the thermal conductivity of the medium.

You keep wanting to describe the situation where temperatures do not vary with time. That is *never* the case for the atmosphere. Time thus becomes a required factor.

bdgwx
Reply to  Tim Gorman
February 2, 2023 8:42 pm

As if challenging the 1LOT isn’t absurd enough you take it to a whole new level by doing so using the conductive heat transfer equation which, unsurprisingly to most, is dependent upon the 1LOT. I accept that I can’t stop you from challenging the 1LOT. But I neither have the time nor the motivation to argue with you about it.

Reply to  bdgwx
February 3, 2023 5:34 am

ROFL! The equation for heat transfer does *NOT* challenge the 1LOT. It just shows that there is a TIME COMPONENT which you, for some unknown reason, want to ignore!

You apparently totally missed the entire thread talking about time constants for PRT probes and how the time constant can be used to make the PRT probe emulate the response of an LIG thermometer.

Heat transfer *does* have a time component. That’s why it’s important to understand the thermal conductivity constants of various materials, and why, when you heat one end of a steel rod, the far end doesn’t immediately heat up.

ΔE has a TIME COMPONENT. You can only cram in as much energy over time as the thermal conductivity allows.

As has been pointed out over and over you are not an engineer, you are a statistician. Not everything in the world is time independent. Statistics don’t work well on time dependent functions.

Reply to  Clyde Spencer
February 2, 2023 3:03 pm

You betray your lack of education concerning thermodynamics. Nothing is EVER stable; all processes have time-varying characteristics that require integral calculus to describe accurately. Declaring ΔE as a simple linear equation of Ein – Eout is using linear equations to describe a much more complex system. Worse is using averages to try to prove what is happening.

As Clyde said, there are time delays in the system that must be accounted for in describing the system. The RATE of heat being introduced when the sun is shining far exceeds the RATE of heat being radiated or there would never be any heating at all. As bodies store heat, they will increase in temperature. Their internal characteristics will determine how fast that absorbed heat is radiated.

Reply to  bdgwx
February 2, 2023 1:59 pm

CO2, like all polyatomic molecules, impedes the transmission of terrestrial energy more than solar energy.

CO2 does not impede anything. It just radiates from a higher altitude in an atmosphere with a positive lapse rate. In Antarctica, where the lapse rate is negative because the ground is colder than the atmosphere, additional CO2 cools instead of warming, despite doing exactly the same.

bdgwx
Reply to  Javier Vinós
February 2, 2023 3:32 pm

I stand by what I said. CO2, like all polyatomic molecules, impedes the transmission of radiation and thus energy. The countless non-dispersive infrared, cavity ring-down spectroscopy, and other spectroscopy and radiometer instruments prove this unequivocally. It is an indisputable fact that CO2 absorbs radiation at various frequencies and thus impedes the transmission of the energy that radiation carries.

The fact that the effective height at which the radiation is emitted changes in no way contradicts the fact that CO2 and other polyatomic gas species impede the transmission of energy. In fact it is precisely because it impedes the transmission of energy that it reduces the net amount of energy the surface receives from the atmosphere in Antarctica thus resulting in a cooling influence there sometimes.

I’ll repeat: CO2 impedes the transmission of energy. That is an indisputable fact.

Richard M
Reply to  bdgwx
February 2, 2023 6:00 pm

It appears you are in denial of Kirchhoff’s Law. Yes, CO2 absorbs (impedes) the flow of energy in specific IR bands. It also emits energy. You have to consider both which it appears you have not thought through.

bdgwx
Reply to  Richard M
February 2, 2023 6:26 pm

I fully accept Kirchhoff’s Law. What I don’t accept is an insinuation that it implies polyatomic molecules do not impede the transmission of energy, which countless NDIR, CRDS, and other spectroscopy and radiometer instruments indisputably show. What Kirchhoff’s Law says is that a body in equilibrium emits as much as it absorbs. It does NOT say that it will always emit with the same trajectory in which it absorbs. Specifically, for gas species in the atmosphere the surface radiation absorbed is always directed upward, but the radiation emitted is randomly distributed between upward and downward trajectories.

Richard M
Reply to  bdgwx
February 2, 2023 8:17 pm

Wasn’t trying to insinuate there is no absorption. Just making sure we are dealing with the full complement of events.

Now for the next step. As you stated the radiation is emitted randomly. However, all radiation events have an upward or downward component. If we average out all the emissions we can come out with an average upward and average downward event.

And if we then average those events, we end up with one average photon emission. Of course, it is upward due to the reduction in density as one moves upward.

There you have it. We can replace all photon emissions with an average emission. In reality you need to do this layer by layer.

Yes, it’s a little more complicated, but not much. Now you can replace all photons emissions and think about the effect.

So what happens when you double CO2? You get more absorption, which shortens both the upward and downward path lengths. The average of those averages is then reduced a bit. However, you also get more emissions, so the total energy will increase. Yes, exactly in line with Kirchhoff’s Law.

It turns out both of the changes are log functions. They cancel out. The rate of energy flux moving upward stays the same.

IOW, while it is true that more CO2 will cause a slowdown in upward energy flux, Kirchhoff’s Law demands that the total energy flux increases. This happens in a manner that compensates for the slowdown.

This is why the average emission height does not change and there is no warming effect from a well mixed GHG after surface absorption reaches saturation.

bdgwx
Reply to  Richard M
February 3, 2023 5:58 am

I don’t think we are ready for the next step. We still have people challenging the 1LOT. There is no way I’m going to convince doonman that 1) CO2 traps heat and 2) CO2 is not the only thing affecting ΔE if the people he may be appealing to don’t concede that the 1LOT as it is stated in the literature is an unassailable law of physics.

Reply to  bdgwx
February 3, 2023 7:05 am

CO2 does not TRAP heat. If it did the earth would have become a cinder long ago and we would never have had any ice ages.

As usual, you just ignore the time component of heat transfer.

Richard M
Reply to  bdgwx
February 3, 2023 9:11 am

Looks like the typical alarmist deflection. It appears you aren’t ready to admit CO2 concentration is independent of energy flow through the atmosphere.

bdgwx
Reply to  Richard M
February 3, 2023 9:58 am

I never said CO2 concentration is dependent on the energy flows in the atmosphere. Nobody had even mentioned it until you brought it up. You can make up all the absurd arguments you want; just don’t pin your arguments on someone else. And don’t think the irony is lost on me: you say I’m deflecting while simultaneously bringing up the dependence of CO2 concentration on atmospheric energy flows (which I don’t support) in a subthread where I’m trying to explain to doonman that the 1LOT requires considering all actors influencing Ein and Eout, not CO2 alone.

Reply to  bdgwx
February 2, 2023 11:43 pm

CO2 impedes the transmission of energy. That is an undisputable fact

Every photon absorbed by a molecule of CO2 is re-emitted eventually. Transmission of energy continues, just in a different direction. That is an indisputable fact. Since the planet continues emitting the energy it receives from the Sun, the only thing that changes is the height of emission, and the surface temperature readjusts to that change.

If the lapse rate was negative, more CO2 would cool the surface, because the number of photons is proportional to the temperature, not to the number of CO2 molecules.

Reply to  Javier Vinós
February 3, 2023 4:12 am

It is refreshing to read your commentary.

I am always confused about why it is hard for people to understand that a molecule falls back to a stable state after being excited by the absorption of energy. Neither atoms nor molecules stay excited; the extra energy is constantly removed by several means. Entropy rules!

Richard M
Reply to  Javier Vinós
February 3, 2023 5:27 am

Every photon absorbed by a molecule of CO2 is re-emitted eventually.

While this is true, the actual energy spends some time in the atmosphere as kinetic energy between the absorption and reemission. In fact, that piece of energy may combine with other pieces of energy and be emitted from a completely different place.

The next reemission by that CO2 molecule is highly likely to be a completely different piece of energy. Does this matter? Actually yes if you are trying to follow the flow of energy through the atmosphere.

It turns out a radiation model can’t really see the actual flow of energy without factoring in convective/conductive energy movement.

the number of photons is proportional to the temperature

Again true, but not the whole story. It is also proportional to the emissivity, which does factor in the number of CO2 molecules.

Reply to  Richard M
February 3, 2023 6:05 am

If a body behaves like a black body then emission does not depend on the nature of its molecules. If it does not behave like a black body, then it can emit more or less depending on the nature of its molecules and other factors.

But if the body is not changing its temperature, all the energy received is returned.

Ozone is a good example. It absorbs high-energy photons, getting very hot. A small part of that energy is transferred through collisions to N2 and O2 molecules, which are much more abundant, but the stratosphere is very rarefied. Most of the energy is emitted as IR that can warm CO2 molecules in the stratosphere but usually goes out, either up or down.

As a result of this, the ozone layer is warmer, but once in equilibrium, all the energy continues its trip, part of it in a different direction. Once the ozone layer is not warming further it does not absorb any more net energy. While the UV disappears, it is replaced by IR for the same amount of energy.

CO2 does the same as ozone, except it absorbs IR and emits IR.

Richard M
Reply to  Javier Vinós
February 3, 2023 9:02 am

The atmosphere does not act like a blackbody. The emissivity changes as CO2 concentration changes.

JCM
Reply to  Richard M
February 3, 2023 8:08 am

The system of interest is the bulk atmosphere, not only trace gases.

For the thermodynamic bulk atmosphere, the change in radiative emittance is equal to the change in radiative absorptance.

Change in radiative absorptance is proportional to change in transmissivity.

Logically, reduced transmissivity, such as increasing IR active trace gases, increases absorptance. Increased absorptance results in increased emittance.

Increased emittance is equal to the reduced transmittance. Internal energy is unchanged.

Outgoing Longwave Radiation = Transmitted power through the atmosphere + Emitted power from the atmosphere.

In the bulk thermodynamic atmosphere, the sum of these terms does not change with introduction of more internal IR active gases.

The only change is the proportions of transmitted flux through the atmosphere, and the emitted flux from the atmosphere. The sum is unchanging.

In development of thermodynamics, the observer witnesses transmitted flux and emitted flux from outside the system.

Therefore, the partitioning represents net flux out of the system. Thermodynamics becomes misleading when observing flux internally.

The only way to increase internal energy is by increasing power input (solar), else we have a violation of thermodynamic principles.

Rearranging internal power dissipation, such as swapping transmitted flux for emitted flux, does not change energy content. Nor does changing the proportion of energy in different internal reservoirs.

The reservoirs of interest are the kinetic energy of the ocean and the kinetic energy of the atmosphere. In the atmosphere, the energy can be further partitioned into gravitational potential energy.

Richard M
Reply to  JCM
February 3, 2023 9:05 am

The system of interest is the bulk atmosphere, not only trace gases.

Exactly. That’s why radiation models don’t provide answers. They are only looking at part of the system.

JCM
Reply to  Richard M
February 3, 2023 9:33 am

If one is looking for a thermal storage term, one must only look to the existence of the atmosphere itself.

Here we have roughly 5.1480 × 10^18 kg of mass participating in the internal energy of the atmosphere. This mass exists in turbulent motion in the gravity field of the system.

Conductance & convection represents total turbulent flux from the surface into atmosphere (net).

At the surface, by definition, net radiation = total turbulent flux (non radiative power). This power is represented in the same units of radiation power, so there is much confusion.

Importantly, internal energy of atmosphere Ua, the variable of interest for climates, is the sum of kinetic energy and potential energy. Ua = E_k + E_p.

No atmosphere, no kinetic energy of suspended fluid mass.

In the simplest way, the kinetic energy represents the mechanical motions of atmospheric molecules. As we know, the molecules have many modes of mechanical motion, but perhaps the least understood is the translational motion.

During the day, while total turbulent flux is directed upwards, translational motion shifts particles higher in the atmosphere. The boundary layer expands.

Ua is shifted to a relative higher proportion of potential energy, and lesser proportion of kinetic energy.

This potential energy has no radiative property whatsoever. No emittance. Here we have the storage term. Thermal energy is stored as gravitation potential energy, with no T^4 radiative properties to play with.

During the night, while total turbulent flux is directed downwards, we have a conversion of potential energy back to kinetic energy. The boundary layer contracts.

During day, kinetic energy -> potential energy.
During night, potential energy -> kinetic energy.

While this is only a local example with diurnal variation, the same principles apply in the hemispheres, seasonality, and the global circulation. Kinetic energy is borne mostly at lower latitude surface, translated aloft, converted to potential energy, eventually to arrive back at the surface as kinetic energy somewhere else, perhaps poleward.

The translational motions of atmospheric mass provides the mystery storage term. The thermal energy storage as potential energy cannot be accounted in radiation schemes. So it leads to many imaginary tales in radiation accounting.

JCM
Reply to  JCM
February 3, 2023 10:36 am

Importantly, now increasing complexity, latent flux bypasses near-surface kinetic energy to transmit to some height, releasing sensible heat in condensation.

This is the dynamic process which is free to vary in intensity and spatial structure/organization. Thermal energy is stored in the vapor phase to be emitted later-on in condensation.

The condensed, suspended atmospheric liquid and solid surfaces emit full-spectrum IR, unlike narrow-band gases.

The average emission height here is in the range of 2-3 km, well below the virtually calculated effective radiation height: (33 K)/(6.5 K/km) ≈ 5 km. 5 km is WRONG.

The vast bulk of OLR, as observed in spectral lines, is occurring at the triple-point phase transition of water, not at some imaginary 5 km altitude at -18 C. Peak emission bands are around 273.16 K, or roughly 2.3 km: (288 K - 273.16 K)/(6.5 K/km) ≈ 2.3 km.

While some is condensed at the surface while total turbulent flux is directed downwards, at night (frost/dew), the majority is condensed at altitude (somewhere, at some time, at some height). Bulk emission at 2.3km.

The vast variable water content of atmosphere, and phase transitions, is the primary regulatory mechanism. This is how entropy is maximized. This is how Kirchhoff’s law operates. This is the path of least resistance for Le Chatelier’s principle to manifest.
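The two height estimates in this comment reduce to simple lapse-rate arithmetic, which is easy to verify (I use 273.16 K, the triple point of water, and the comment's own 6.5 K/km and 288 K figures):

```python
# Lapse-rate arithmetic behind the two emission-height estimates above.
lapse = 6.5            # K/km, standard tropospheric lapse rate
surface_T = 288.0      # K, mean surface temperature
triple_point = 273.16  # K, triple point of water

effective_height = 33.0 / lapse  # 33 K greenhouse effect / lapse rate
condensation_height = (surface_T - triple_point) / lapse

print(round(effective_height, 1))     # 5.1 km
print(round(condensation_height, 1))  # 2.3 km
```

Whether 5 km or 2.3 km is the physically meaningful emission height is the point under dispute; the arithmetic itself is not.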

Reply to  JCM
February 3, 2023 11:08 am

I agree. The radiative paradigm is incomplete. The entire climate has been reduced to the top of the atmosphere and this cannot be correct. Climate is the result of energy transfer through the climate system, and meridional transport is a central part of it because on average the energy exits the climate system at a higher latitude than it enters.

JCM
Reply to  Javier Vinós
February 3, 2023 11:21 am

Climate is the result of energy transfer through the climate system

Precisely – this is not occurring at the speed of light (radiation), but at the (variable) speed of fluid dynamic motion and bio-hydrological cycling.

The reductionist quest to refine equilibrium climate sensitivity to CO2 is completely destroying the field.

Reply to  JCM
February 3, 2023 11:11 am

Let me add a small twist. The kinetic energy of a gas is directly proportional to temperature. The distribution of molecular energies in a gas is given by the Maxwell–Boltzmann distribution. At higher temperatures, more molecules will have higher energies. High-energy molecules are thermally capable of emitting a photon, which slows and cools them.
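The "more molecules have higher energies at higher temperature" point can be illustrated with the Boltzmann factor exp(-E/kT), which sets the relative population above an energy threshold. The threshold energy here is an arbitrary illustrative number, not a real CO2 transition energy:

```python
import math

k_B = 1.380649e-23  # J/K, Boltzmann constant
E = 1.0e-20         # J, assumed illustrative energy threshold

def boltzmann_factor(T):
    """Relative weight of states at energy E for temperature T (K)."""
    return math.exp(-E / (k_B * T))

print(boltzmann_factor(250.0))  # colder air: smaller factor
print(boltzmann_factor(300.0))  # warmer air: larger factor
```

The factor grows monotonically with T, so a larger fraction of molecules sits at emission-capable energies in warmer gas, which is the sense in which emission tracks temperature.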

JCM
Reply to  It doesnot add up
February 3, 2023 12:21 pm

True, but in the open atmosphere during the day it is more likely for increasing kinetic energy to cause a particle to rise in altitude due to decreasing density with height. However, it is true it is not a perfectly efficient process, and there are dissipation processes occurring.

The density profile is critical to the energy storage potential. Kinetic energy is transformed to potential energy.

This storage potential can be defined by:

delta potential energy = mass of atmosphere x gravity x height of particle translation.

Say for 100m average vertical translation of boundary layer bulk density.

Work = Force x Displacement = Joules

5 x 10^18 kg x 9.8 m/s^2 x 100 m ≈ 5 x 10^21 Joules of storage change.

We can see that, by an average translation of only 100m of the bulk atmosphere, vast potential energy is stored during the diurnal cycle. It is returned at night, keeping us comfortable.

Solar energy is stored in the elevated mass of the atmosphere.
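The order-of-magnitude figure above checks out with the stated inputs (the 100 m average translation is the comment's assumption, not a measured value):

```python
# Potential-energy change from lifting the bulk atmosphere ~100 m,
# using the figures quoted in the comment above.
m_atm = 5.148e18  # kg, approximate mass of the atmosphere
g = 9.8           # m/s^2
dh = 100.0        # m, assumed average vertical translation

delta_PE = m_atm * g * dh   # kg * m/s^2 * m = joules
print(f"{delta_PE:.2e} J")  # ~5.05e+21 J, i.e. ~5 x 10^21 J as stated
```

The units multiply out directly to joules, so the estimate stands or falls entirely on the assumed 100 m translation.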

bdgwx
Reply to  Javier Vinós
February 3, 2023 5:51 am

Javier Vinos said: “Every photon absorbed by a molecule of CO2 is re-emitted eventually. Transmission of energy continues, just in a different direction. That is an indisputable fact.”

Exactly. That means it impedes the transmission of energy. For example, let’s say X J of energy is emitted from one end of a cuvette carried by 4.3 um photons. The more CO2 you put in the cuvette, the fewer photons are received at the other end. That means there will be Y J of energy received at the other side, where Y < X. And because Y < X, the transmission of the emitted energy was necessarily impeded. This experiment is effectively run billions or even trillions of times per day given all of the NDIR and CRDS instruments deployed in the world. And this happens not only at 4.3 um, but at 15 um and other minor active bands as well.
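The cuvette behavior described here is Beer–Lambert exponential attenuation; a minimal sketch, where the absorption coefficient `k` and the path length are made-up illustrative values rather than instrument parameters:

```python
import math

def transmitted(E_in, concentration, path_cm, k=0.5):
    """Energy exiting the cuvette after Beer-Lambert attenuation.
    k (absorption per unit concentration per cm) is an assumed value."""
    return E_in * math.exp(-k * concentration * path_cm)

E = 100.0  # joules of 4.3 um radiation entering the cuvette, say
for c in (0.0, 1.0, 2.0, 4.0):
    print(c, round(transmitted(E, c, path_cm=1.0), 2))
# More absorber in the path -> less energy received at the far end (Y < X).
```

This is the relationship NDIR and CRDS instruments invert: they measure the transmitted fraction and solve for the concentration.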

Javier Vinos said: “Since the planet continues emitting the energy it receives from the Sun, the only thing that changes is the height of emission, and the surface temperature readjusts to that change.”

Yep. But that in no way invalidates the fact that CO2 impedes the transmission of energy.

Javier Vinos said: “If the lapse rate was negative, more CO2 would cool the surface, because the number of photons is proportional to the temperature, not to the number of CO2 molecules.”

Yep. And the reason it cools the surface is because it is impeding the transmission of energy. Remember, the 1LOT says ΔE = Ein – Eout. In a negative lapse rate scenario CO2 is impeding more of the surface Ein than the Eout. That’s why ΔE < 0 and ultimately ΔT < 0.

Richard M
Reply to  bdgwx
February 3, 2023 9:16 am

That means it impedes the transmission of energy.

Once again with the half truths. I agree but you are then going off the rails. 

You are ignoring the fact more CO2 increases the volume of energy moving upward in a gravitation field.

Reply to  Richard M
February 3, 2023 9:30 am

You are ruining the concept that more CO2 TRAPS more heat and ultimately will burn us to a crisp!

Richard M
Reply to  Jim Gorman
February 3, 2023 9:43 am

Yeah, it’s a thankless job. Even many skeptics believe this falsehood.

bdgwx
Reply to  Richard M
February 3, 2023 10:03 am

First, the fact that CO2 (and other polyatomic molecules) impede the transmission of energy carried by radiation in the terrestrial spectrum is not a half truth. It is the absolute truth and gets validated millions, billions, or maybe even trillions of times every single day with the countless NDIR, CRDS, and other spectroscopy/radiometer instruments.

Second, I’m not ignoring the fact that this barrier to energy transport caused by CO2 (and other polyatomic molecules) has consequences, including but not limited to raising the effective emission height of radiation in the terrestrial spectrum. But we can’t even begin to talk about that added complication until everyone accepts that 1) the 1LOT says that energy is conserved not some of the time, but all of the time and 2) that CO2 (and other polyatomic molecules) impede the transmission of energy carried by radiation in the terrestrial spectrum.

Richard M
Reply to  bdgwx
February 3, 2023 10:19 am

Your “First” statement is false. Well-mixed GHGs such as CO2 both slow (impede) and increase energy flow. You are ignoring the increase. Sure, you can test the “impede” part in a nice little experiment, but that does not apply to bulk energy flow in our atmosphere, where both processes are active.

It’s amazingly simple. The “impede” part comes from increased absorptivity. Increased absorptivity implies increased emissivity due to Kirchhoff’s Law. The increased emissivity means an increase in the bulk energy flow.

The result is no change in the net energy flux, no change in the emission height and no warming.

bdgwx
Reply to  Richard M
February 3, 2023 2:35 pm

Richard M said: “Your “First” statement is false”

No. It most definitely is not false.

Richard M said: “Well mixed GHGs such as CO2. both slow (impede) and increase energy flow.”

So let me get this straight. Your first sentence declares the statement “CO2 impedes energy transmission” false, and then your second sentence says it impedes energy flow while at the same time saying it increases energy flow.

Which is it? Does CO2 cause ΔEout < 0 from the system or not?

Richard M said: “Sure, you can test the “impede” part in a nice little experiment,”

Yeah, it absolutely can be tested. You can do it yourself even. Most instruments that measure CO2 (and other gas species) concentrations exploit the fact that these gases impede the transmission of energy. They are cheap and plentiful.
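The relation such instruments exploit is the Beer-Lambert law. A hedged sketch, with made-up round numbers rather than real CO2 line data:

```python
import math

def transmitted_fraction(cross_section, number_density, path_length):
    """Beer-Lambert transmission through a uniform gas column:
    I/I0 = exp(-sigma * n * L). An NDIR cell inverts this relation to
    infer gas concentration from the energy reaching its detector."""
    return math.exp(-cross_section * number_density * path_length)

# Illustrative values only (cross-section in m^2, density in m^-3, path in m):
# doubling the absorber density lowers the transmitted fraction, i.e.
# less energy reaches the far end of the cell.
t1 = transmitted_fraction(1e-22, 1e23, 0.1)  # optical depth 1.0
t2 = transmitted_fraction(1e-22, 2e23, 0.1)  # optical depth 2.0
print(round(t1, 3), round(t2, 3), t2 < t1)  # 0.368 0.135 True
```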

Richard M said: “but that does not apply to bulk energy flow in our atmosphere where both processes are active.”

Like hell it doesn’t. CO2 does not magically stop impeding the transmission of energy just because the atmosphere has a path length of many kilometers as opposed to millimeters.

Richard M said: “It’s amazingly simple. The “impede” part comes from increased absorptivity.”

No, it’s not simple. Yes, it comes partly from increased absorptivity. But that’s only part of the story. The other part is that some of the energy thermalizes with the environment instead of escaping the system, and the rest gets reradiated in a random direction, with only a fraction of it on an escape trajectory from the system. In this manner ΔEout < 0, and thus energy is impeded from egressing the system until a new steady state is achieved.
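The “only a fraction on an escape trajectory” point can be sketched with one absorbing layer that re-emits isotropically (half up, half down). This is a toy illustration, not a full radiative-transfer calculation, and it deliberately ignores thermalization and multiple passes:

```python
def escaped_fraction(absorptivity):
    """Fraction of upward surface radiation that escapes past a single
    layer absorbing a fraction `absorptivity` of it, when the absorbed
    part is re-emitted isotropically (half up, half down)."""
    passed_through = 1 - absorptivity      # never intercepted
    reemitted_up = absorptivity / 2        # intercepted, but re-emitted upward
    return passed_through + reemitted_up

print(escaped_fraction(0.0))               # 1.0: no absorber, nothing impeded
print(round(escaped_fraction(0.8), 3))     # 0.6: most absorbed, only part re-escapes
```

Any absorptivity above zero gives an escaped fraction below one, which is the ΔEout < 0 being argued about.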

Richard M said: “The result is no change in the net energy flux, no change in the emission height and no warming.”

Hold on. It sounds like you’re trying to jump right into the exact nature of how CO2 alters the energy flows in the atmosphere. If you cannot accept that CO2 impedes the transmission of energy you are not going to be any more successful at analyzing the far more complex atmosphere than you are at understanding the simpler or even idealized scenarios.

Richard M
Reply to  bdgwx
February 3, 2023 6:43 pm

Which is it? Does CO2 cause ΔEout < 0 from the system or not?

CO2 both impedes energy flow (absorption) and increases the number of photons (emission). Higher absorptivity from more CO2 impedes energy flow, and higher emissivity creates more photons. The net result is constant energy out.

The other part of the story is that the some of the energy thermalizes with the environment

It is true that the energy thermalizes almost every time it is absorbed. In fact, you can ignore the times it doesn’t because they seldom occur. This is why you can’t just deal with radiation. After the energy is thermalized it can move around via conduction and convection.

Like hell it doesn’t. CO2 does not magically stop impeding the transmission of energy

I’ve never stated that more CO2 stops impeding energy. It continues, and at a higher rate; however, it also increases the emission of more photons. The two effects cancel out, keeping the “transmission of energy” constant.

Here’s an analogy. Doubling CO2 is like doubling the lanes of a highway while adding speed bumps that cut the speed in half. The increased absorptivity is the speed bumps; the increased emissivity is what adds the lanes.

bdgwx
Reply to  Richard M
February 3, 2023 8:46 pm

Richard M said: “The net result is constant energy out.”

Try to explain how a NDIR works without invoking ΔEout < 0 at the end of the cuvette and ΔEin < 0 at the thermopile.

Try to explain how a space based radiometer detects low and high concentrations of water vapor in the atmosphere without invoking ΔEout < 0 at TOA and ΔEin < 0 at the radiometer.

Richard M
Reply to  bdgwx
February 3, 2023 9:16 pm

The original question applied to the atmosphere. Quit deflecting. No one cares about your test equipment.

Reply to  bdgwx
February 3, 2023 11:14 am

It is only so under your narrow view of measuring at a certain band from a certain position. As all the energy goes through regardless of how much CO2, there is no impedance.

bdgwx
Reply to  Javier Vinós
February 3, 2023 2:45 pm

JV said: “It is only so under your narrow view of measuring at a certain band from a certain position. As all the energy goes through regardless of how much CO2, there is no impedance.”

That’s just patently false. Not all of the energy goes through when CO2 (or other gas species) are present. And the more CO2 (and other gas species) you have, the bigger the change in ΔEout from the system. This is true at all scales including the atmosphere, but for now I’m only trying to convince you this is true for the simpler scenarios, like that of CO2 (and other gas species) in a small cuvette. If I cannot convince you that CO2 (and other gas species) impede the transmission of energy in simpler scenarios, then there’s no way I’m going to convince you that it is true for the far more complex ones as well.

I will repeat. CO2 (and other gas species) impede the transmission of energy carried by radiation in the terrestrial spectrum. This gets proved ad-nauseum millions, billions, and maybe even trillions of times every single day from the countless NDIR, CRDS, and other spectroscopy/radiometer instruments in existence.

Let me see if I can convince you this way. See if you can explain how a NDIR instrument works without invoking ΔEout < 0 and ΔEin < 0 at the interface between the CO2 bulk and the thermopile, respectively.

Reply to  bdgwx
February 5, 2023 8:23 am

Tell you what, look at the attached image. The y-axis is basically in mW, a measurement of energy per second. Why don’t you go back and see how this graph has changed over the last 50-60 years? As CO2 has grown, the energy leaving the earth should have gone up or down, and it should show up in these graphs. Does it?

[attachment: emmision temps of the earth.jpg]
Nick Stokes
Reply to  Javier Vinós
February 2, 2023 7:57 pm

“In Antarctica, where the lapse rate is negative because the ground is colder than the atmosphere, additional CO2 cools instead of warming, despite doing exactly the same.”

Yes. The reason is that the air is warming the surface, via a downward IR flux (warm to cold). CO2 impedes that, and so reduces the warming. Therefore it has cooling effect.

Reply to  Nick Stokes
February 3, 2023 6:52 am

Wait a minute.

The air is warmer than the ground. Yet air doesn’t radiate IR; only GHGs consisting of polyatomic molecules do, according to GHG theory.

There is little H2O in the air due to the air’s temperature. So no IR from water vapor.

CO2 impedes downward IR flux from what, itself?

Your statement is so illogical it makes not only little sense but no sense at all!

Nick Stokes
Reply to  Jim Gorman
February 4, 2023 1:59 am

“CO2 impedes downward IR flux from what, itself?”

Yes.

Reply to  Nick Stokes
February 5, 2023 5:15 am

And, where does that flux originate? CO2 can’t generate heat by itself. The ground is colder so if heat only flows from hot to cold, where does the heat originate?

Reply to  bdgwx
February 2, 2023 2:53 pm

You continually misrepresent what is being said here. Climate alarmists jump on CO2 like it was a snake. If CO2 IS NOT the control knob for temperature, then why are you and other alarmists on the bandwagon for eliminating coal plants and ICE engines?

Do like the rest of us and exclaim that the identification of CO2 as the boogie man is totally absurd.

bdgwx
Reply to  bdgwx
February 3, 2023 10:17 am

Well, this spiraled out of control. I’ve got two people telling me that energy is conserved only after a period of time has elapsed, and two other people telling me that CO2 does not impede the transmission of energy. Folks, if we cannot all accept that CO2 impedes the transmission of energy, or even just that the law of conservation of energy holds all of the time, then it is going to be very difficult to convince doonman that CO2 traps energy and that it is not the only thing that modulates the internal energy of the climate system, especially if he chooses to appeal to the “authorities” rejecting these well-established concepts.

Richard M
Reply to  bdgwx
February 3, 2023 2:03 pm

I explained all to you and yet you continue to deny Kirchhoff’s Law.

The “impede” part comes from increased absorptivity.
Increased absorptivity implies increased emissivity due to Kirchhoff’s Law.
The increased emissivity means an increase in the bulk energy flow.

As a result the energy flux is constant and no warming can occur. All we see is continued deflection. Is that in your job description?

Reply to  Richard M
February 3, 2023 2:27 pm

Yes, it is.

bdgwx
Reply to  Richard M
February 3, 2023 2:56 pm

As I said before, I fully accept Kirchhoff’s Law. What I do not accept is the insinuation that it prohibits CO2 (and other gas species) from impeding the transmission of energy carried by terrestrial radiation, because it says no such thing. You call that deflection. I call that reality.

And I’ll repeat again and again if I have to. The fact remains that CO2 (and other gas species) modulate Eout from body A and Ein into body B when CO2 (and other gas species) is at the interface between A and B. That is the whole basis of how thermopiles and radiometers work. They work precisely because CO2 (and other gas species) impede the transmission of energy. And it works at all scales, whether a small millimeter-long cuvette in a NDIR or the kilometers-long path from the surface to a space-based radiometer.

Richard M
Reply to  bdgwx
February 3, 2023 5:37 pm

It is exactly what Miskolczi 2010 found from 60 years of NOAA data. Kirchhoff’s Law directly leads to radiation exchange equilibrium throughout the atmosphere. Just another way of describing what I explained above.

 What I do not accept is an insinuation that it prohibits CO2 (and other gas species) from impeding the transmission of energy carried by terrestrial radiation because it says no such thing.

Your denial of physical laws is quite amusing. No one said anything about prohibiting CO2 from impeding energy flow. What was said is that CO2 ALSO will increase the emission of energy throughout the atmosphere. While each individual unit of energy takes longer to pass upward to space, there are more of them. The two effects cancel out.

Why is it you feel the need to deny the obvious effects of Kirchhoff’s Law?

bdgwx
Reply to  Richard M
February 3, 2023 8:38 pm

Richard M said: “Your denial of physical laws is quite amusing. No one said anything about prohibiting CO2 from impeding energy flow.”

Javier Vinos said: “CO2 does not impede anything.”

Then I said: “I stand by what I said.”

Then you said: “It appears you are in denial of Kirchhoff’s Law.”

Richard M said: “The two effects cancel out.”

We aren’t discussing the atmosphere yet. As I keep saying if you are not willing to accept the simpler idealized scenarios there’s no way you’re going to do any better with the far more complex atmosphere.

Richard M said: “Why is it you feel the need to deny the obvious effects of Kirchhoff’s Law?”

I already responded to your loaded question here. If you want to ask something like “why do you not accept my belief that Kirchhoff’s Law contradicts your statement that CO2 impedes the transmission of energy, and calls into question how all of those NDIR, CRDS, and various other spectroscopy/radiometer instruments work,” then just ask it that way, instead of posing it as a loaded question designed to make it look like I deny Kirchhoff’s Law, which couldn’t be further from the truth. Here, I’ll save you the hassle of another post. The answer: because Kirchhoff’s Law does not say that.

Richard M
Reply to  bdgwx
February 3, 2023 9:15 pm

We aren’t discussing the atmosphere yet. 

I am. So was the comment you responded to. Energy flow through the atmosphere works differently than in test equipment. There’s this thing called gravity that comes into play. In fact, it is the driver of the energy flow. Discussing equipment tells you nothing about what happens in the atmosphere. You were deflecting yet again and I’m pulling you back.

Guess what? CO2 impedes energy in both. However, you don’t get the net upward flux unless you are in a gravitational field. You would just get an equal flux in both directions. No one cares on a climate blog.

So, let’s get back to the atmosphere. The effect of Kirchhoff’s Law is to keep the total energy flow constant independent of CO2 concentration. Do you deny this?

Reply to  bdgwx
February 4, 2023 1:00 pm

From:

What is Kirchhoff’s Law of Thermal Radiation – Definition (thermal-engineering.org)

“This law must be also valid in order to satisfy the Second Law of Thermodynamics. As was written, all bodies above absolute zero temperature radiate some heat. Two objects radiate heat toward each other. But what if a colder object with high emissivity radiates toward a hotter object with very low emissivity? This seems to violate the Second Law of Thermodynamics, which states that heat cannot spontaneously flow from cold system to hot system without external work being performed on the system. … Therefore, whenever the cool body is radiating heat to the hot body, the hot body must also be radiating heat to the cool body. Moreover, the hot body will radiate more energy than cold body. The case of different emissivities is solved by the Kirchhoff’s Law of thermal radiation, which states that object with low emissivity have also low absorptivity. As a result, heat cannot spontaneously flow from cold system to hot system and the second law is still satisfied.”

Reply to  bdgwx
February 3, 2023 2:26 pm

“I’ve got two people telling me that energy is conserved only after a period of time has elapsed.”

What happens to the far end of a metal rod when you heat the near end?

Is ΔE a function of time along the rod? Why do you just absolutely refuse to join the real world of physical science?

Beta Blocker
February 2, 2023 10:24 am

The argument is often made that global mean temperature (GMT) means nothing by itself as a general indicator of the current state of the earth’s climate system. As this argument goes, GMT is therefore useless by itself as a predictor for how the climate system might be evolving in the future.

With that context as background, I have four simply-stated questions:

(1) Is it possible at the current state of science to estimate what the earth’s global mean temperature (GMT) might have been 21,000 years ago at the height of the last glacial?

(2) If it’s possible to do that, was the earth’s GMT at the height of the last glacial either lower, the same, or higher than it is today?

(3) Is it possible to estimate what the pattern of regional temperatures might have been 21,000 years ago for ocean and for land at the height of the last glacial?

(4) Starting at the height of the last glacial 21,000 years ago, and for each thousand-year interval between then and now, is it possible to estimate what kinds of changes might have occurred in the pattern of regional temperatures, both for ocean and for land?

OK, the last question isn’t so simple. But you get my drift.

Milo
Reply to  Beta Blocker
February 2, 2023 5:30 pm

Estimates for GASTA during the LGM range from four to 10 degrees C colder than now.

roaddog
February 2, 2023 11:44 am

All I know is it’s colder here this winter than we’ve seen in six or seven years. My personal, lived experience is all I am really concerned about; and any warming is welcome.

Ireneusz Palmowski
February 2, 2023 12:18 pm

SSW brings arctic air over the US. Temperatures in C.