UAH Global Temperature Update for November 2022: +0.17 deg. C

From Dr. Roy Spencer’s Global Warming Blog

December 6th, 2022 by Roy W. Spencer, Ph.D.

Sorry for the late posting of the global temperature update; I’ve been busy responding to reviewers of one of our papers for publication.

The Version 6 global average lower tropospheric temperature (LT) anomaly for November 2022 was a +0.17 deg. C departure from the 1991-2020 mean. This is down from the October anomaly of +0.32 deg. C.

The linear warming trend since January, 1979 now stands at +0.13 C/decade (+0.12 C/decade over the global-averaged oceans, and +0.18 C/decade over global-averaged land).
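The trend figures above are ordinary least-squares fits to the monthly anomalies. A minimal sketch of that calculation, using a synthetic series rather than the actual UAH data:

```python
# OLS trend of a monthly anomaly series, in deg C per decade.
def decadal_trend(anoms):
    n = len(anoms)
    t = [i / 120.0 for i in range(n)]  # time in decades (120 months)
    mt = sum(t) / n
    ma = sum(anoms) / n
    num = sum((ti - mt) * (ai - ma) for ti, ai in zip(t, anoms))
    den = sum((ti - mt) ** 2 for ti in t)
    return num / den

# Synthetic check: a series that warms at exactly 0.13 C/decade
# over the 527 months from January 1979 through November 2022.
series = [0.13 * (i / 120.0) for i in range(527)]
print(round(decadal_trend(series), 2))  # 0.13
```

Run against the real LT series, the same fit yields the quoted +0.13 C/decade.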

Various regional LT departures from the 30-year (1991-2020) average for the last 22 months are:

YEAR MO   GLOBE  NHEM.  SHEM.  TROPIC  USA48  ARCTIC  AUST
2021 Jan  +0.13  +0.34  -0.09  -0.08   +0.36  +0.50   -0.52
2021 Feb  +0.20  +0.32  +0.08  -0.14   -0.65  +0.07   -0.27
2021 Mar  -0.00  +0.13  -0.13  -0.28   +0.60  -0.78   -0.79
2021 Apr  -0.05  +0.06  -0.15  -0.27   -0.01  +0.02   +0.29
2021 May  +0.08  +0.14  +0.03  +0.07   -0.41  -0.04   +0.02
2021 Jun  -0.01  +0.31  -0.32  -0.14   +1.44  +0.64   -0.76
2021 Jul  +0.20  +0.34  +0.07  +0.13   +0.58  +0.43   +0.80
2021 Aug  +0.17  +0.27  +0.08  +0.07   +0.33  +0.83   -0.02
2021 Sep  +0.26  +0.19  +0.33  +0.09   +0.67  +0.02   +0.37
2021 Oct  +0.37  +0.46  +0.28  +0.33   +0.84  +0.64   +0.07
2021 Nov  +0.09  +0.12  +0.06  +0.14   +0.50  -0.42   -0.29
2021 Dec  +0.21  +0.27  +0.15  +0.04   +1.63  +0.01   -0.06
2022 Jan  +0.03  +0.06  -0.00  -0.23   -0.13  +0.68   +0.10
2022 Feb  -0.00  +0.01  -0.02  -0.24   -0.04  -0.30   -0.50
2022 Mar  +0.15  +0.27  +0.02  -0.07   +0.22  +0.74   +0.02
2022 Apr  +0.26  +0.35  +0.18  -0.04   -0.26  +0.45   +0.61
2022 May  +0.17  +0.25  +0.10  +0.01   +0.59  +0.23   +0.19
2022 Jun  +0.06  +0.08  +0.04  -0.36   +0.46  +0.33   +0.11
2022 Jul  +0.36  +0.37  +0.35  +0.13   +0.84  +0.56   +0.65
2022 Aug  +0.28  +0.32  +0.24  -0.03   +0.60  +0.50   -0.00
2022 Sep  +0.24  +0.43  +0.06  +0.03   +0.88  +0.69   -0.28
2022 Oct  +0.32  +0.43  +0.21  +0.04   +0.16  +0.93   +0.04
2022 Nov  +0.17  +0.21  +0.12  -0.16   -0.51  +0.51   -0.56

The full UAH Global Temperature Report, along with the LT global gridpoint anomaly image for November, 2022 should be available within the next several days here.

The global and regional monthly anomalies for the various atmospheric layers we monitor should be available in the next few days at the following locations:

Lower Troposphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt
Mid-Troposphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tmt/uahncdc_mt_6.0.txt
Tropopause: http://vortex.nsstc.uah.edu/data/msu/v6.0/ttp/uahncdc_tp_6.0.txt
Lower Stratosphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tls/uahncdc_ls_6.0.txt
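For readers who want to compute with these files: they are plain whitespace-delimited text. The sketch below (with a hypothetical helper name, `parse_uah`) assumes Year and Mo in the first two columns and the global anomaly in the third; check that assumption against the actual file header before relying on it.

```python
# Parse (year, month, global anomaly) records from UAH-style text.
def parse_uah(text):
    records = []
    for line in text.splitlines():
        parts = line.split()
        if len(parts) < 3:
            continue  # too short to be a data row
        try:
            year, month = int(parts[0]), int(parts[1])
            globe = float(parts[2])
        except ValueError:
            continue  # header or trailer line
        if 1 <= month <= 12:
            records.append((year, month, globe))
    return records

# Hypothetical snippet in the assumed layout:
sample = """Year Mo Globe
2022 10 0.32
2022 11 0.17
Trend 0.13"""
print(parse_uah(sample))  # [(2022, 10, 0.32), (2022, 11, 0.17)]
```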

279 Comments
walterr070
December 6, 2022 2:12 pm

”As noted in the last few reports, the extratropical warmth, especially in the NH, during this multi-year La Niña episode has been a remarkable feature that has kept the global average near or above zero since commencing in late 2020 and is consistent with an overall long-term upward trend in global temperature.”

What is hypothesized to be causing this extratropical warmth he is referring to? This is a key question that should be explored more. Unfortunately, the stubbornness of climate “science” in focusing solely on GHGs is setting back our knowledge on this. It’s probably safe to say that we barely know any more about this than we did in the 1970s.

rbabcock
Reply to  walterr070
December 6, 2022 4:01 pm

I would go with undersea volcano heating of the oceans north of Hawaii. The source of the warm water is either the tropics (which has been shut down with the La Niña) or heat from beneath, so I submit it is the latter.

RickWill
Reply to  walterr070
December 6, 2022 4:19 pm

It’s probably safe to say that we barely know any more about this than we did in the 1970s.

There are more measurements but less understanding. Climate science is now a religion where beliefs prevail over facts and evidence only comes from models. Anything that disagrees with the models is automatically wrong because it does not fit the belief system.

The phrase “upward trend in global temperature” is inaccurate. The average is going up, but there are regions that are cooling or have no trend.

Javier Vinós
Reply to  walterr070
December 7, 2022 1:44 am

Extra-tropical warmth means more energy transported by the atmosphere to the extratropics. The Sun puts almost 2/3 of its energy into the tropics; the rest of the planet has, most of the time, a negative energy balance at the top of the atmosphere.

During La Niña, oceanic transport increases at the expense of atmospheric transport. Since oceanic transport is less efficient, this means the surface cools but the climate system gains energy. The “remarkable feature” means that during these three Niñas atmospheric transport has remained more elevated than usual. This is related to low solar activity and a weaker polar vortex. Although it means less surface cooling, the net effect is feeding into the pause.

A longer explanation of the energy transport effect on climate and its modulation by solar activity is in chapters 10 and 11 of my book.

This figure from my book explains the relationship between ENSO and the solar cycle. We are about to finish phase V, increasing solar activity Niña-frequent, and should enter phase I, high solar activity Niño-frequent, before moving to the highly variable phase II.

comment image

david_t
Reply to  Javier Vinós
December 8, 2022 2:12 pm

Hi Javier, I stumbled on your comment here and decided to read your work on the winter gatekeeper hypothesis. I have read many articles on the indirect effects of the sun on climate, but I have never seen a hypothesis like yours that puts all the pieces together. I don’t know what to say other than that I am a bit blown away! It is remarkable that nothing of this exists in “The modern theory of climate change,” as you call it. I have so many questions, but let me just ask you one: can you estimate the uncertainty of the IPCC models when predicting the average global temperature anomaly for the year 2100, given that they haven’t taken your hypothesis into account when modeling? Thank you, and very impressive work!

Javier Vinós
Reply to  david_t
December 9, 2022 6:31 am

Hi David. I don’t know how to estimate the uncertainty of GCM model predictions. It essentially depends on the amount of warming due to the increase in CO2. If it is one-third, for example, then the models have no skill as the uncertainty would be much higher than the predicted warming.

karlomonte
Reply to  Javier Vinós
December 9, 2022 10:40 am

There is Pat Frank’s paper:

Frank P (2019) Propagation of Error and the Reliability of Global Air Temperature Projections. Front. Earth Sci. 7:223. doi: 10.3389/feart.2019.00223

https://doi.org/10.3389/feart.2019.00223

david_t
Reply to  Javier Vinós
December 11, 2022 2:45 pm

Ok, I understand. I look forward to reading your book.

Richard M
Reply to  walterr070
December 7, 2022 1:08 pm

There was a reduction in clouds starting in 2014 likely due to the PDO going positive. See Dubal/Vahrenholt 2021.

comment image

This led to an increase in solar energy reaching the surface especially in the tropical east Pacific. The solar energy warmed the oceans which have been releasing some of that energy into the atmosphere over the past 8 years.

This year the Tonga eruption injected vast amounts of water vapor into the Stratosphere. As this slowly falls back to Earth there’s been an increase in upper Tropospheric water vapor and a related increase in the greenhouse effect.

comment image

Notice the little upward movement during 2022 at 9 km.

David Dibbell
December 6, 2022 2:19 pm

The USA48 anomaly for November is reported at -0.51C. It did seem a bit chilly.

MarkH
Reply to  David Dibbell
December 6, 2022 2:29 pm

Anomalies for Australia roughly check out. Mild(ish) Winters and relatively cold and wet Summers. November has been particularly cold here (save for maybe one day that got above 30C), coldest November in almost 50 years this year.

Bellman
December 6, 2022 2:38 pm

The 6th warmest November in the UAH record.

Still looks like 2022 will finish close to 2010, but I think it’s more likely it will be just below, making 2022 the 7th warmest year. (December will need to be greater than 0.27°C to beat 2010.)

This will mean that all eight years since 2015 are in the top 10.

The start date for the Monckton pause remains unchanged, so the pause grows by another month.

MarkW
Reply to  Bellman
December 6, 2022 3:47 pm

In other words, no warming since 2012, despite continued increases in CO2 levels.
Beyond that, the warming that started around 1850, 100 years before significant increases in CO2 levels, continues.

In summation.
Ho hum.

Bellman
Reply to  MarkW
December 6, 2022 4:25 pm

Warming at the rate of 0.27°C / decade since 2012. Ho hum.

Mike
Reply to  Bellman
December 6, 2022 5:30 pm

If you take a rock from a shady place and put it in the sun, then put it in the fridge, then put it in the sun then put it in the fridge then place it in the shade. Has it warmed or has it cooled?
Big picture dude. Big picture.

bdgwx
Reply to  Mike
December 6, 2022 6:01 pm

I don’t think it is unreasonable to use OLR as a big picture analysis technique to answer the warming/cooling question. But I’m open to alternatives. If you could provide an objective test I’ll run it against the UAH data and your rock data to answer the question.

MarkW
Reply to  bdgwx
December 6, 2022 8:46 pm

Over the last 5000 years, the earth has cooled by several degrees C.

bdgwx
Reply to  MarkW
December 7, 2022 9:56 am

What does that have to do with UAH or the rock?

MarkW
Reply to  Bellman
December 6, 2022 8:45 pm

During the 70’s, the rate was negative. Same with the 40’s.
Only a warmunist is stupid enough to take a short term trend and declare that it is proof of anything.

Bellman
Reply to  MarkW
December 7, 2022 1:17 am

Be sure to point this out to Monckton if he starts talking about an 8 year 2 month pause.

HotScot
Reply to  Bellman
December 7, 2022 2:58 am

The planet has cooled considerably over 570 million years.

Your point is?

Carbon_Dioxide_Geological_4600mya_.jpeg
Brian
Reply to  Bellman
December 7, 2022 6:31 am

Monckton isn’t the one who is claiming we should be observing 0.1 C/decade warming. We only need show we are not observing it to be correct in our skepticism.

Bellman
Reply to  Brian
December 7, 2022 7:40 am

You have an odd idea of skepticism. In statistics the idea is to be skeptical of the thing that seems to confirm your hypothesis, accepting it only if there is sufficient evidence that it’s unlikely to have happened by chance.

A zero trend over a short period should not convince a skeptic of anything. The uncertainty of the trend is such that it is entirely possible it is the result of chance.

karlomonte
Reply to  Bellman
December 7, 2022 8:25 am

The uncertainty of the trend is such 

You don’t even understand the difference between uncertainty and error, yet here you are lecturing on the subject.

Bellman
Reply to  karlomonte
December 7, 2022 12:15 pm

Call it what you will. It’s the confidence interval for the slope, an indication of the range within which it is reasonable to assume the true slope may lie. You need to know it if you want to claim your estimated slope is significantly different from another.
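For reference, the confidence interval being discussed comes from the standard error of an OLS slope. A minimal sketch, ignoring autocorrelation (which in practice widens the interval for temperature series):

```python
import math

def slope_with_se(y):
    """OLS slope of y against its index, plus the slope's standard error."""
    n = len(y)
    x = range(n)
    mx = (n - 1) / 2.0
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    s2 = sum(r * r for r in resid) / (n - 2)  # residual variance
    return b, math.sqrt(s2 / sxx)

b, se = slope_with_se([0.0, 0.1, 0.2, 0.3, 0.4])
# For this noise-free line: slope 0.1, standard error essentially zero.
```

With real, noisy anomalies, an approximate 95% interval is b ± 2·se; when that interval straddles zero, the trend is statistically indistinguishable from both a pause and modest warming, which is the sense in which a short flat trend proves little.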

Tim Gorman
Reply to  Bellman
December 8, 2022 10:06 am

If you don’t consider the uncertainty intervals of the data then you don’t *KNOW* the slope. The slope could be positive/negative/zero.

doonman
Reply to  Bellman
December 7, 2022 1:36 pm

Strange that back radiation would cease as the molecule that causes it increases. That is the hypothesis, isn’t it?

Brian
Reply to  Bellman
December 8, 2022 8:38 am

Wrong. You should be skeptical of confirmation bias and all prejudicial bias. You propose a false dilemma. And if I say I am observing a trend that is counter to the prevailing hypothesis, however long it might be, it is still a valid means of observing and analyzing the data if I am willing to extrapolate to a wider field of data. If Monckton were willing, he should use whatever yardstick the longest pause provides, and use that to analyze longer periods in history and make predictions on the future.

bdgwx
Reply to  Brian
December 8, 2022 10:44 am

Brian said: “If Monckton were willing, he should use whatever yardstick the longest pause provides, and use that to analyze longer periods in history and make predictions on the future.”

The issue isn’t that he does it. I, Bellman, and many others use OLR as well. The issue is that Monckton and many other contrarians here ignore or even reject the exact same method when performed on other date ranges like 2011/01 to 2022/11.

That is the impetus of my question below. Do you accept the Monckton method or not? Do you accept it as a valid analysis technique for any time period or not?

bdgwx
Reply to  bdgwx
December 8, 2022 10:55 am

BTW…I find it mildly ironic that Monckton gets a free pass to use (and possibly abuse) OLR all he wants, but if Bellman or I use OLR or even use the exact same technique Monckton uses himself we are summarily criticized. I even had two posters here already accuse me of Monckton’s “sophistry” as if I’m somehow responsible for it. How messed up is that?

karlomonte
Reply to  bdgwx
December 8, 2022 11:38 am

Stop whining.

bdgwx
Reply to  Brian
December 7, 2022 9:55 am

To draw that conclusion you’re going to have to accept the result from the Monckton method from 2014/10 to 2022/12. Are you equally willing to accept its result from 2011/01 to 2022/12 as well?

Brian
Reply to  bdgwx
December 8, 2022 8:28 am

No, I can reject your hypothesis without proposing my own or anyone else’s. That’s allowed.

bdgwx
Reply to  Brian
December 8, 2022 10:40 am

Let me make sure I’m understanding your position so I’m not putting words in your mouth. You accept the Monckton method when performed on the period 2014/10 to 2022/11, but you don’t accept the same method when performed on the period 2011/01 to 2022/11?

BTW…I’m not presenting a hypothesis here. I’m responding to your statement “We only need show we are not observing it to be correct in our skepticism.”

Tim Gorman
Reply to  bdgwx
December 8, 2022 2:41 pm

Neither you nor Bellman will accept that Monckton is not cherry-picking an interval! He is *finding* an interval.

bellman and I went through this already. According to him if you find a dead deer and backtrack his blood trail to find where he was shot then you are cherry picking the place where the deer was shot! It’s as idiotic of an accusation as I’ve ever heard.

You are making the same allegation – that Monckton is cherry picking an interval instead of *finding* an interval that matches what he is looking for.

Give it a rest. It’s not obvious you understand the term “cherry picking”.

AlanJ
Reply to  MarkW
December 7, 2022 8:51 am

Absence of an obvious surface warming trend does not mean the planet is not warming – unforced internal natural variability within the climate system is fairly large on timescales of less than 30 years and is capable of obscuring the long-term gradual warming. Ocean heat content is still increasing unabated:

https://climate.nasa.gov/vital-signs/ocean-warming/

So the planet is taking in more energy than it is releasing back to space. The “Monckton Pause” is most likely a trend in random variability and does not reflect a change in the trend. In fact, if you plot the linear trends starting at the present day and increasing by one month back in time, you can see that the trendline doesn’t stabilize until sometime between 2010-2015. In fact there are several times when the Monckton pause is actually a Monckton warming, and Mr. Monckton has to ignore those periods to say his mathematical calculation is valid:

comment image

Mike
Reply to  AlanJ
December 7, 2022 2:47 pm

 Ocean heat content is still increasing unabated:”
Garbage. How is the ocean warmed? From above or below?
If above, the ocean is not heating because the atmosphere is not heating.
Any observed increase in ocean temp at the moment is an artifact of previous atmospheric warming.

angech
Reply to  Bellman
December 6, 2022 4:13 pm

Bellman.
Not sure where you are getting your ideas from.

1.The start date for the new [Monckton?] pause does not remain unchanged.
Each new month of data starts from the end of that new month.
That is how calculating a change in trend at the present time is done.

2 The pause should grow by several months.
It will be interesting to see what Christopher posts.
The reason for the growth by more than 1 month is the marked fall from the previous level.
This means the other end of the pause will also occur at an earlier date.
Nick could explain this to you if you bothered to ask him.

3 Other data sets also show the pause if you are interested and have a larger fall this month
than usual.
2022 has been approximately the 6th warmest year on most of the other charts, with a
prediction of finishing in the top 10.
A big fall like November, however, will possibly move it to 8th on most of them,
lower than UAH.
Another low month in December could even push it towards 10th.
Consistent with Arctic sea ice recent results and PIOMAS.


Bellman
Reply to  angech
December 6, 2022 4:51 pm

The start date for the new [Monckton?] pause does not remain unchanged.

Show us your workings. I make the start date as October 2014, the same as it was last month.

Each new month of data starts from the end of that new month.
That is how calculating a change in trend at the present time is done.

I’m not talking about when this month’s data started. I’m talking about when the pause starts. That is, the furthest back you can go and still have a non-positive trend.

The pause should grow by several months.
It will be interesting to see what Christopher posts.

I predict he will post that it grew by one month. To 8 years and 2 months.

The reason for the growth by more than 1 month is the marked fall from the previous level.
This means the other end of the pause will also occur at an earlier date.
Nick could explain this to you if you bothered to ask him.

I base my prediction on looking at the actual data. I have no reason to ask Nick or anyone about it. I’ve been predicting when he will say the pause starts for years, going back to the old one. It’s a trivial exercise.

Other data sets also show the pause if you are interested and have a larger fall this month
than usual.

Maybe, but Monckton only uses UAH, the slowest warming data set for his pause.

Another low month in December could even push it towards 10th lowest.

I doubt it. Tenth warmest year in UAH is 2002 at 0.08°C. For 2022 to equal this would require a December anomaly of -1.08°C. That’s more than half a degree colder than any other December in the UAH record.
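Bellman’s December figure can be checked against the 2022 monthly values in the table at the top of the post:

```python
# Check the claim: for 2022's annual mean to equal 2002's 0.08 C,
# what December anomaly would be needed?
jan_to_nov_2022 = [0.03, -0.00, 0.15, 0.26, 0.17, 0.06,
                   0.36, 0.28, 0.24, 0.32, 0.17]  # GLOBE column, 2022
target_annual_mean = 0.08  # 2002, tenth warmest year in UAH
december_needed = 12 * target_annual_mean - sum(jan_to_nov_2022)
print(round(december_needed, 2))  # -1.08, matching Bellman's figure
```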

bdgwx
Reply to  angech
December 6, 2022 4:54 pm

Bellman is right. The start date remains unchanged at 2014/10 therefore the Monckton Pause extends by one month. If December comes in at +0.10 C or lower (give or take) the start date will back up to 2014/09 and the pause will extend by two months.

Tim Gorman
Reply to  bdgwx
December 6, 2022 6:13 pm

You just basically said they are both right. Nice.

MarkW
Reply to  Tim Gorman
December 6, 2022 8:47 pm

Sophistry is pretty much the only mental skill that most warmunists have mastered.

bdgwx
Reply to  MarkW
December 7, 2022 9:49 am

If there is sophistry here it is from Monckton. It is his method; not mine.

angech
Reply to  bdgwx
December 7, 2022 6:44 pm

Great sophistry, bdgwx! Appreciated. Bellman is wrong.
The UAH pause extended by 4 months apparently.
Apologies accepted [if due]

bdgwx
Reply to  angech
December 8, 2022 5:31 am

The Monckton Pause extended by 4 months from the October update to the November update?

angech
Reply to  angech
December 7, 2022 5:20 am

The start date for the new [Monckton?] pause does not remain unchanged.
Show us your workings. I make the start date as October 2014, the same as it was last month.

Bellman,
When discussing a pause in data, and even more pertinently a lengthening in a pause over time, the length of the pause being assessed is from the current date back to a date, to be determined, where the anomalies balance out.

[bdgwx December 6, 2022 5:34 pm It is defined as the longest consecutive period ending on the most recent month where the OLR is <= 0 C/decade.]

Confusion obviously arises from using the terminology starting date in two different senses.
Perhaps I should have said “The current date for the new [Monckton?] pause does not remain unchanged.”

Regardless there has been a significant fall in the last months temperature.
This means that you are claiming the pause has extended by one month despite your statement that the start date is the same.

Logic will tell you that for the “start” date for this longer pause to be the same as that of the current pause, this could only be true if the reading for this November was exactly the same as that for October. The only way to extend a flat line.
Your workings are wrong.

” Tenth warmest year in UAH is 2002 at 0.08°C. For 2022 to equal this would require a December anomaly of -1.08°C”
“This will mean that all eight years since 2015 are in the top 10.”
Are you saying that 2022 will be only +0.17C if 8th warmest?
That does not seem a worry does it?

bdgwx
Reply to  angech
December 7, 2022 1:52 pm

It would have made more sense if you had said the end date for the pause has changed.

Bellman
Reply to  angech
December 8, 2022 3:04 am

“When discussing a pause in data and even more pertinently a lengthening in a pause over time the length of the pause being assessed is from the current date back to a date to be determined where there anomalies balance out”

I was talking about the Monckton pause, as defined by Monckton. It has nothing to do with anomalies balancing out, whatever that means. It’s simply choosing the start date that will give you the longest negative trend to the present.
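That definition can be applied mechanically. A sketch under the stated definition (the earliest start month whose OLS trend to the latest month is non-positive), shown here on toy data rather than the UAH series:

```python
def ols_trend(y):
    """OLS slope of y against its index (units per month)."""
    n = len(y)
    mx = (n - 1) / 2.0
    my = sum(y) / n
    num = sum((i - mx) * (yi - my) for i, yi in enumerate(y))
    den = sum((i - mx) ** 2 for i in range(n))
    return num / den

def pause_start(anoms):
    """Index of the earliest start month with a non-positive trend to
    the end of the series; None if no such window exists."""
    for start in range(len(anoms) - 1):
        if ols_trend(anoms[start:]) <= 0:
            return start  # earliest qualifying start = longest pause
    return None

# Toy series: warming early, then flat-to-falling at the end.
series = [-0.4, -0.2, 0.0, 0.4, 0.3, 0.2, 0.1, 0.0]
print(pause_start(series))  # 2: the "pause" spans the last six months
```

Note how a low final value can pull the qualifying start date earlier, which is why a cold month can extend the pause by more than one month.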

Mike
Reply to  Bellman
December 6, 2022 4:45 pm

”The 6th warmest November in the UAH record.”

And yet only 0.2 degrees warmer than 64 years ago.
Oh the horror!
Soon it will be 0.0 degrees warmer than 64 years ago (2 ”climate data points”)
Ergo, no global warming. Isn’t that lovely? Hmmmm?

Bellman
Reply to  Mike
December 6, 2022 4:52 pm

There isn’t any UAH data from 64 years ago.

Mike
Reply to  Bellman
December 6, 2022 5:24 pm

There is balloon data from 64 years ago. It agrees with satellite. There has been no net global warming in your lifetime. Isn’t that nice?

Bellman
Reply to  Mike
December 6, 2022 5:39 pm

Could you provide a link to the balloon data set you are using, the one that shows average temperatures in 1958 were close to the 1991-2020 average.

[edited because I’d misread the initial comment.]

Mike
Reply to  Bellman
December 6, 2022 5:52 pm

Who’s talking about the 1991 to 2020 average? I said the 1958 temps were the same as in 2001, and now it is only a matter of time (possibly a month or two) until the temps will AGAIN be the same as in 1958. No net global warming in 64 years.
When will we be able to say there is or will be permanent warming? Next year for sure?
When will we be able to say there is or will be permanent warming? Next year for sure?

bdgwx
Reply to  Mike
December 6, 2022 6:16 pm

Mike said: “ I said the 1958 temps were the same as in 2001″

Can you post a link to the dataset showing that?

Mike
Reply to  bdgwx
December 6, 2022 9:06 pm

I have a million times. It is just as good now as it was then. If it was good enough data for the late Prof. Bob Carter to prove that no – or almost no – warming has occurred in your lifetime, it should be good enough for you. Is this data too old for you? Not complicated enough? Not enough precision? Do you want to argue over 10ths of a degree?
Isn’t it pretty obvious to you that carbon dioxide has done squat to the climate so far?

radiosonde.JPG
Javier Vinós
Reply to  Mike
December 6, 2022 11:32 pm

That is the HadAT paper from Met Office Hadley Center.

https://www.metoffice.gov.uk/hadobs/hadat/HadAT_paper.pdf

The atmosphere at 500 hPa was as warm in the late 50s as in the mid-90s. The mid-60s to mid-70s was a cold period.

Tim Gorman
Reply to  Mike
December 7, 2022 5:46 am

Your graph shows what most climate scientists ignore totally. The true value of each data point lies somewhere within the error bar. It could be significantly higher or lower than the stated value.

That means that just about any trend you want to draw can be realized, positive/negative/zero. You simply cannot tell what the actual trend is when considering the uncertainty of the data points.

Yet in climate science the uncertainties are assumed to all cancel and the stated values are taken as 100% accurate. By adjusting the stated values, any desired trend can be established with supposedly 100% accuracy, allowing changes out to the hundredths digit to be discerned, even though the uncertainty of each data point on the graph appears to be about +/- 0.5C.

How do you discern differences in the hundredths digit when the uncertainty is in the tenths digit?

bdgwx
Reply to  Mike
December 7, 2022 9:40 am

Mike said: “I have a million times.”

That’s HadAT2 at 500mb, which I’ve responded to you about a million times.

Mike said: ” If it was good enough data for the late Proff. Bob Carter to prove than no – or almost no warming has occurred in your lifetime it should be good enough for you.”

HadAT2 is good enough for me. It shows +0.13 C/decade of warming from 1958 to 2001 at 700mb. That is a lot more than the “no warming” you claimed.

Mike said: “Is this data too old for you?”

Yes. But that doesn’t mean it isn’t useful.

Mike said: “Not complicated enough? Not enough precision? Do you want to argue over 10ths of a degree?”

I have no issues with it. Do you?

Mike said: “Isn’t it pretty obvious to you that carbon dioxide has done squat to the climate so far?”

Not according to your own preferred dataset.

BTW…during the overlap period from 1979 to 2012 it shows +0.15 C/decade of warming whereas UAH only shows +0.11 C/decade.

Does the fact that your own preferred dataset show significantly more warming than UAH cause you to now question it?

Mike
Reply to  bdgwx
December 7, 2022 2:56 pm

”Does the fact that your own preferred dataset show significantly more warming than UAH cause you to now question it?”
No. I see no NET warming anywhere (although it is ALWAYS either ”warming” or ”cooling”) given that surface temperature (pre adjustment) show it was as warm in the 30s/40s as it is now.

bdgwx
Reply to  Mike
December 7, 2022 3:56 pm

Mike said: No

Is it safe to conclude that you agree that your preferred dataset shows a trend of +0.13 C/decade from 1958 to 2001 at 700mb?

Mike said: I see no NET warming anywhere

The only way that is consistent with your preferred dataset is if you have a different definition of “warming”. In the context of OLR upon a temperature time series, scientists define warming as a trend greater than 0 C/decade.

Mike said: given that surface temperature (pre adjustment) show it was as warm in the 30s/40s as it is now.

Can you post a link to the dataset (both adjusted and unadjusted time series) showing that?

Mike
Reply to  bdgwx
December 7, 2022 10:22 pm

”given that surface temperature (pre adjustment) show it was as warm in the 30s/40s as it is now.
Can you post a link to the dataset (both adjusted and unadjusted time series) showing that?”

20s 30s 40s.JPG
Mike
Reply to  bdgwx
December 7, 2022 10:22 pm

2

30s40s mark Fife.JPG
Mike
Reply to  bdgwx
December 7, 2022 10:23 pm

3

30s40s1.JPG
Mike
Reply to  bdgwx
December 7, 2022 10:25 pm

4

30s40s3.JPG
Mike
Reply to  bdgwx
December 7, 2022 10:26 pm

5

30s40s4.JPG
Mike
Reply to  bdgwx
December 7, 2022 10:26 pm

6

30s40s5.JPG
Mike
Reply to  bdgwx
December 7, 2022 10:27 pm

7

30s40s6.JPG
Mike
Reply to  bdgwx
December 7, 2022 10:27 pm

8

30s40s8.JPG
bdgwx
Reply to  Mike
December 8, 2022 5:30 am

8 graphs. Not a single one of them is even of the global average temperature.

Here is a graph with multiple global average temperature time series.

comment image

And here is a graph showing adjusted vs unadjusted. As you can see, not only is it significantly warmer now than in the 30s/40s, but the adjustments actually remove some of the warming relative to the unadjusted data.

comment image

[Hausfather 2017]

Jim Gorman
Reply to  bdgwx
December 8, 2022 8:00 am

None of these are Global Average Temperature, only global temperature ANOMALY.

Tim Gorman
Reply to  bdgwx
December 8, 2022 10:17 am

What are the uncertainty intervals associated with each data point used on this graph?

karlomonte
Reply to  Tim Gorman
December 8, 2022 11:40 am

The question he will never answer.

Mike
Reply to  bdgwx
December 8, 2022 4:28 pm

If you actually believe that data from 6 sources could be in such lockstep as your graph above shows, without collusion, you are in an even worse state than I imagined.
Also, the 2001 value is 0.2 degrees out of whack. That graph is fraud and that you don’t recognize it is bewildering.

bdgwx
Reply to  Mike
December 8, 2022 5:33 pm

Let me get this straight, because I don’t want to be accused of putting words in your mouth. Are you seriously accusing the creators of your own preferred dataset of fraud?

Mike
Reply to  bdgwx
December 8, 2022 10:54 pm

I’ll answer your question about YOUR preferred graph with yet another graph.
So, Iceland, and on the other side of the Earth, Cape Town, show the 40s to be more or less the same temperatures as the 2000s (what a coincidence!), and your graph shows 6 different sources (all EXACTLY the same) around 1 degree C warmer. What is wrong with this picture?

30s40s9.JPG
bdgwx
Reply to  Mike
December 9, 2022 6:54 am

First…I didn’t ask a question about the graph I posted.

Second…you said “That graph is fraud”. That graph contains data from the Hadley Center, which is the creator of HadAT2. Unless you tell me otherwise, I have no choice but to accept that you think your own preferred dataset, which BTW shows more warming than UAH, was created by an institution engaging in fraud.

Third, Iceland is not global.

Jim Gorman
Reply to  bdgwx
December 9, 2022 10:24 am

I’ll keep saying it until you give a cogent answer.

It is simple math and logic. The adjustments are done on monthly averages supposedly BECAUSE THEY ARE WRONG!

Yet the actual temperature readings are never changed!

Are the temperature readings correct? If so, then how do the averages come out wrong?

If the temperature readings are NOT correct then the whole shebang should be thrown out should they not?

Mike
Reply to  bdgwx
December 9, 2022 9:04 pm

Do you believe that the Iceland, Cape Town, US, Greenland, and Antarctica data which show 40s warmth equal to the 2000s are outliers and that your graph represents the true global picture? 0.5 to 0.8 degrees higher? Is that what you are saying?

bdgwx
Reply to  Mike
December 11, 2022 5:30 am

No, I’m not saying that individual station time series are outliers. I’m saying they aren’t global. They are only components that go into a global average. Because the warming of the planet isn’t homogeneous, you’ll see a lot of variation in the warming rate from station to station. Some stations are even cooling.

AlanJ
Reply to  Mike
December 7, 2022 9:59 am

If you believe the HadAT2 balloon data are the best available representation of global mean temperature change, then you believe that the world is warming faster than surface temperature indexes like NASA’s GISTEMP say it is:

comment image

That is a comparison of the near-surface HadAT2, instead of the temps at tens of km, with NASA GISTEMP.

Jim Gorman
Reply to  AlanJ
December 9, 2022 10:42 am

The problem with this graph is that it is not long enough to capture all the vagaries of the different cycles and oscillations on the earth.

Regressions on cyclic data prove nothing. They are not predictive.

bigoilbob
Reply to  Mike
December 9, 2022 6:26 am

Interesting plot, but you didn’t post the data for it, per bdgwx’s request. I can find the mean monthly values, as did bd, but I would be interested in the 5 and 95% values as well. I looked around and searched separately and came up short. If it is posted separately, and I’m missing it, forgive me, I tried. So please indulge me and (re)post the data including the 1000 realization error bars.

bigoilbob
Reply to  bigoilbob
December 9, 2022 3:11 pm

Thanks, but you can belay that request. I think I have what I need here. The data is a little balky, but usable.

https://www.metoffice.gov.uk/hadobs/hadat/uncertainty/global_timeseries.txt

bdgwx
Reply to  bigoilbob
December 9, 2022 5:01 pm
bigoilbob
Reply to  bdgwx
December 9, 2022 5:18 pm

Thx, indeed it is. I saw you post it previously. I wanted to see each of the 1000 realizations for each of the different pressures. I’m guessing that by either evaluating results from the batch of realizations from earlier to later, or by bootstrapping random samples from each of the error bars, the trend is firm. The latter would result in the highest standard error for the trend.
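The bootstrap idea described here can be sketched as follows: perturb each monthly value within its stated error bar, refit the trend each time, and take the spread of the refitted slopes as the trend’s standard error. The series and 1-sigma values below are synthetic placeholders, not the HadAT2 data:

```python
import random

random.seed(0)

def ols_slope(y):
    """Least-squares slope of y against its index."""
    n = len(y)
    x = list(range(n))
    xb, yb = sum(x) / n, sum(y) / n
    cov = sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y))
    return cov / sum((xi - xb) ** 2 for xi in x)

# Synthetic 10-year monthly series with a slope of 0.01 per month,
# each month carrying an assumed 1-sigma error bar of 0.1.
values = [0.01 * i for i in range(120)]
sigmas = [0.1] * 120

# Bootstrap: jitter each month within its error bar and refit the slope.
slopes = [ols_slope([v + random.gauss(0.0, s) for v, s in zip(values, sigmas)])
          for _ in range(1000)]

mean_slope = sum(slopes) / len(slopes)
se = (sum((s - mean_slope) ** 2 for s in slopes) / (len(slopes) - 1)) ** 0.5
print(f"slope ≈ {mean_slope:.4f}, bootstrap standard error ≈ {se:.5f}")
```

The standard error that comes out is only as tight as the assumption that the published error bars are independent from month to month; correlated errors would widen it.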

Bellman
Reply to  Mike
December 6, 2022 6:37 pm

The current UAH anomaly is 0.17 above the 1991-2020 average. You claimed it is only 0.2 warmer than the temperature in 1958, which implies the temperature then was 0.03 below the 1991-2020 average.

bdgwx
Reply to  Mike
December 6, 2022 5:56 pm

Most satellite datasets start in ~1979. Here is the comparison of 3 balloon datasets (RATPAC, RICH, RAOBCORE) and 4 satellite (UAH, RSS, STAR, and UW).

comment image

Mark BLR
Reply to  Mike
December 7, 2022 4:34 am

There is balloon data from 64 years ago. It agrees with satellite

… during the overlap period since 1979, yes it does.

There has been no net global warming in your lifetime.

This part is incorrect.

The increase in the RATPAC anomaly in the “Lower Troposphere” region, 300-850 mb, from the lows of the early 1970s to “now” is on the order of 1°C.

RATPAC-MSU_TLT_1958-2021.png
Mark BLR
Reply to  Mark BLR
December 7, 2022 4:44 am

Bellman : Could you provide a link to the balloon data set you are using, the one that shows average temperatures in 1958 were close to the 1991-2020 average.

bdgwx : Can you post a link to the dataset showing that?

RATPAC data from the following URL (FTP site) :
ftp://ftp.ncdc.noaa.gov/pub/data/ratpac/

bdgwx
Reply to  Mark BLR
December 7, 2022 9:41 am

It was actually HadAT2, found here.

Mike
Reply to  Mark BLR
December 7, 2022 10:30 pm

”This part is incorrect.
The increase in the RATPAC anomaly in the “Lower Troposphere” region, 300-850 mb, from the lows of the early 1970s to “now” is on the order of 1°C.”

Ok in MY lifetime then.

John Tillman
Reply to  Bellman
December 6, 2022 6:54 pm

Earth cooling since Feb 2016 and flat since 1998. How is that possible under steadily rising plant food in the air?

Bellman
Reply to  John Tillman
December 6, 2022 7:02 pm

1998 starts with a massive El Niño, but still the trend since the start of ’98 is 0.11°C / decade. Not much different from the overall trend, and a rise of around 0.26°C in total.

If by “since” you mean just after 1998, then the trend is 0.16°C / decade.
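A trend figure like the 0.11°C/decade above is just an ordinary least-squares slope fitted to the monthly anomalies and rescaled to decades. A minimal sketch, using a synthetic series rather than the actual UAH data:

```python
# Minimal sketch: OLS trend in degC/decade from monthly anomalies.
# The anomaly series here is synthetic (illustrative only), not UAH data.

def trend_per_decade(anomalies):
    """Ordinary least-squares slope of a monthly series, in degC/decade."""
    n = len(anomalies)
    t = [i / 12.0 for i in range(n)]          # time in years
    t_bar = sum(t) / n
    y_bar = sum(anomalies) / n
    cov = sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, anomalies))
    var = sum((ti - t_bar) ** 2 for ti in t)
    return 10.0 * cov / var                   # degC/year -> degC/decade

# A series warming at exactly 0.011 degC/year should recover 0.11 degC/decade.
synthetic = [0.011 * (i / 12.0) for i in range(300)]   # 25 years of months
print(round(trend_per_decade(synthetic), 3))           # -> 0.11
```

On a noisy real series the fitted slope is sensitive to the chosen start month, which is exactly why start points like an El Niño year are argued over in this thread.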

bdgwx
Reply to  John Tillman
December 6, 2022 7:26 pm

The atmosphere has a relatively low capacity to store heat. As a result, even small perturbations in the ingress and egress flows of energy cause large changes in temperature. The atmospheric temperature is highly variable as a result. As you can see in the graph I posted below, this ebb and flow of atmospheric temperature is not inconsistent with “steadily rising plant food in the air”.

AGW is Not Science
Reply to  John Tillman
December 7, 2022 4:18 am

Simple; because rising plant food in the air does not drive the Earth’s temperature. Just like always.

But you can expect Bellman, bdwx, Stokes and the like to come up with the most contorted pseudo “logic” to continue to willfully refuse to recognize such “inconvenient” facts.

For them it will always be some variation of “start in a cold period” (beginning of the satellite record, which is near the end of the “Ice Age Cometh” Global Cooling scare, or “pre-industrial”, aka “The Little Ice Age”) and “end at a warm period” (take your pick of “recent” dates during the current, thankfully, warm period), and blame it on CO2. Rinse and repeat.

bdgwx
Reply to  AGW is Not Science
December 7, 2022 11:04 am

For them it will always be some variation of “start in a cold period”

Nah. For me it is the realization that there are multiple agents modulating the ingress and egress of energy to/from the atmosphere, and that the atmosphere has a relatively low capacity to store heat, so the variability of the temperature is relatively high. CO2 is but one among many factors that modulate the energy flows. And since CO2’s modulation is small but persistently positive, it is only noticeable over long periods of time. Other agents create a lot of noise and variability, especially over short periods like months and years.

doonman
Reply to  bdgwx
December 7, 2022 1:50 pm

So when does the catastrophe happen?

bdgwx
Reply to  doonman
December 8, 2022 5:45 am

I don’t know. I don’t even know what “catastrophe” means in this context. And every time I ask someone on WUWT I get a different answer. Maybe if you could define “catastrophe” I could attempt an answer.

Mike
Reply to  bdgwx
December 7, 2022 3:03 pm

”And since CO2’s modulation is small but persistently positive it is only noticeable over long periods of time.”
How long?

bdgwx
Reply to  Mike
December 7, 2022 6:06 pm

Based on the information provided in Christy et al. 2003 and the current trend then it would take about 25 years to notice the warming at monthly time scales or about 20 years at annual time scales.

John Tillman
Reply to  bdgwx
December 7, 2022 6:10 pm

We’re already at 20 years after 2003, but the world has been cooling for seven years, with no sign of letting up.

bdgwx
Reply to  John Tillman
December 7, 2022 6:37 pm

The trend over the last 7 years is -0.05 C/decade. The trend over the last 12 years is +0.32 C/decade. So while it has been cooling for 7 years, it has warmed at 6x that rate over the last 12 years.

Tim Gorman
Reply to  bdgwx
December 8, 2022 4:55 am

And what has it done over a period twice that long? 1998 to 2022?

Bellman
Reply to  Tim Gorman
December 8, 2022 5:27 am

I said this earlier: the trend since the start of 1998 is 0.11°C/decade. Not much different from the long-term trend, despite starting with a massive El Niño.

Tim Gorman
Reply to  John Tillman
December 8, 2022 4:54 am

We are actually at 25 years since 1998.

Bellman
Reply to  AGW is Not Science
December 8, 2022 3:08 am

Gosh, starting at the start of the only data set people here accept and finishing at the present date.

sherro01
Reply to  Bellman
December 6, 2022 8:55 pm

Bellman,
What uncertainties are you putting around these annual averages?
How far apart do they have to be before you label them as hotter or cooler than another?
Geoff S

Bellman
Reply to  sherro01
December 7, 2022 3:38 am

I’m not putting any uncertainties on the values. I’m just reporting the stated values from Spencer, who also never mentions uncertainties.

The exact ordering is not very relevant, it’s just a bit of fun. But regardless of the uncertainty it is clear that the trend is up, and there have been more hot years recently, whatever the exact ordering.

Geoff Sherrington
Reply to  Bellman
December 8, 2022 11:43 pm

Bellman,
Propaganda is never fun.
Geoff S

Bellman
Reply to  Geoff Sherrington
December 9, 2022 3:48 am

I don’t know. Monckton’s pause nonsense can be quite fun.

bdgwx
Reply to  Geoff Sherrington
December 9, 2022 9:15 am

I suspect Bellman feels the same way I do. That is, the majority of WUWT commenters speak favorably of UAH and the Monckton Method when it shows a pause. But when I point out that the same data using the same method also shows warming, suddenly the dataset, the method, and/or myself are criticized. I have one guy here promoting HadAT2 because it shows “the 1958 temps were the same as in 2001″, but when I point out that it also shows more warming than UAH from 1979 to 2001 (the overlap period), and that the other products the creator provides also show warming near the surface, all of a sudden he accuses the creator of fraud. I even have two commenters above blaming me for Monckton’s “sophistry” simply because I pointed out that the Monckton Pause starts at 2014/10. And, of course, pointing out that the Monckton method shows current warming of +0.3 C/decade since 2011 is met with dismissive rejection or statements about how the method I’m using (Monckton’s) is inappropriate. So yeah…the propaganda is fun.

Tim Gorman
Reply to  sherro01
December 7, 2022 4:29 am

I knew he wouldn’t answer you. He never does. If he was in charge of sizing the payload of multiple individual satellites on a rocket he would just add up all the stated values and not worry about the uncertainties of the individual satellites. He would assume that all uncertainties are random, symmetrical, and therefore cancel out.

If the rocket couldn’t reach orbit it couldn’t be because the payload was overweight.

A *real* engineer would propagate the individual uncertainties onto the total weight and make sure the total payload size was within limits *including* the uncertainties.

Bellman
Reply to  Tim Gorman
December 7, 2022 4:54 am

This constant lying is getting beyond tedious. Try to address the argument not the person.

karlomonte
Reply to  Bellman
December 7, 2022 7:21 am

Who is the liar here? Underneath all weaseling and posturing, YOU are the person who whines (a lot) when it is pointed out that the impossibly small total uncertainty numbers quoted for these GAT “trends” and “anomalies” are in fact, impossibly small.

Bellman
Reply to  karlomonte
December 7, 2022 8:13 am

As I used to say to Carlo, I’m not the one you need to convince about the problems with UAH. If you want to be taken seriously with your +/- 1.4°C uncertainties in UAH, the people you need to convince are:

1. Drs. Spencer and Christy.
2. The guardians of WUWT who routinely publish UAH with no mention of uncertainty.
3. Lord Monckton, who routinely uses UAH data to prove spurious points about the low rate of warming.
4. Anyone who takes the pause seriously, when it’s based exclusively on this highly uncertain data.

karlomonte
Reply to  Bellman
December 7, 2022 8:29 am

You still can’t comprehend the difference between error and uncertainty, and just as Tim says, assume errors are random, allowing you to ignore them.

Bellman
Reply to  karlomonte
December 7, 2022 9:39 am

Just like the old Carlo. You could try to establish your claim with the people who matter, or you could just throw out ad hominems from the sidelines. Guess which one he chooses.

karlomonte
Reply to  Bellman
December 7, 2022 10:16 am

As I wrote, you whine a lot.

Bellman
Reply to  karlomonte
December 7, 2022 12:18 pm

Yet you’re so obsessed with me you have to put some fatuous one-liner after my every comment, rather than devoting your time to persuading those who matter that your uncertainty estimate is correct.

karlomonte
Reply to  Bellman
December 7, 2022 2:10 pm

Request DENIED. Again.

Bellman
Reply to  karlomonte
December 7, 2022 3:06 pm

It wasn’t a request, just an observation. Feel free to ignore it and continue to play the troll.

Tim Gorman
Reply to  Bellman
December 8, 2022 5:00 am

This is a false appeal to authority. Did Galileo have to convince the establishment that his opinion was right in order to be correct?

The truth is the truth – period. Something you fight against every single day.

Bellman
Reply to  Tim Gorman
December 8, 2022 5:38 am

Nonsense. I’m not saying Dr Spencer is the authority, I disagree with a lot of things he says. I’m just saying that if you think you have found a massive flaw in his work you should at least check with him rather than arguing with me about it.

Maybe he’ll be able to explain why you are wrong, maybe he’ll accept the impossibility of using satellites to get an accurate temperature record and give up his life’s work. Maybe he’ll refuse to accept the “truth” coming from Carl or Karl. But at least it would advance the argument rather than having some random troll shouting from the sidelines.

karlomonte
Reply to  Bellman
December 8, 2022 7:03 am

You are so worried about it, YOU go do it, mr anti-troll (rolls eyes).

Tim Gorman
Reply to  Bellman
December 8, 2022 9:45 am

Why should I have to check with anyone to lay out the problem with an analysis? Peer reviewers are expected to give an independent, unbiased analysis of a paper without contacting the author to get their views (for anything more than a possible typo or such). Then it is up to the author to answer the issues raised.

If Spencer wants to answer why all uncertainties of individual measurements are assumed to cancel, he is quite able to do so.

bdgwx
Reply to  Bellman
December 7, 2022 10:58 am

Bellman said: “If you want to be taken seriously with your +/- 1.4°C uncertainties in UAH, the people you need to convince are”

It seems odd (assuming the claim above is true) that the simple estimation models you and I developed can predict next month’s UAH anomaly with significantly lower uncertainty than the claimed measurement uncertainty of 1.4 C for UAH itself.

karlomonte
Reply to  bdgwx
December 7, 2022 11:09 am

claimed measurement uncertainty of 1.4 C for UAH itself

It was not a “claim”, merely an estimate based on a typical ±0.5°C value for single, individual temperature measurements.

You still don’t understand that RMSE calculations require true values that don’t exist.

Bellman
Reply to  bdgwx
December 7, 2022 12:23 pm

Or indeed how UAH could agree so closely each month with all the surface data sets.

I suspect that what he might mean is there is a systematic error that changes over time, resulting in an error in the trend. Something like that would explain the difference in trend between UAH and other data sets, but I don’t see how interpreting that as an uncertainty in the monthly anomalies makes any sense.

karlomonte
Reply to  Bellman
December 7, 2022 2:12 pm

I suspect that what he might mean is there is a systematic error that changes over time, resulting in an error in the trend.

And you STILL don’t understand the difference. Around and around the belcurveman hamster wheel spins…

Tim Gorman
Reply to  karlomonte
December 8, 2022 5:19 am

The daily mid-range values have *at least* an uncertainty of +/- 0.7C. Thus each and every “average” calculated from those values *has* to have a higher uncertainty than that. Uncertainties of different things add – always. You cannot decrease those uncertainties by dividing by a constant. Constants have no uncertainty. Nor does it matter whether a variable or constant is in the numerator or the denominator; when calculating uncertainty, the contributions of both numerator and denominator components *add*.

The only thing that decreases is how closely a sample mean approaches the population mean. But that “closeness” is not the accuracy of the mean. That can only be determined by propagating the individual uncertainties of the individual measurements onto that mean. If you have the entire population your calculation of the average will have no standard deviation, it will be zero. That does *NOT* mean that the average is 100% accurate.

But that is what both bellman and bdgwx both claim for temperature.

karlomonte
Reply to  Tim Gorman
December 8, 2022 6:42 am

They call me a “troll” but the only cogent answer he can compose to your detailed explanations is to push the downvote button and accuse you of lying.

Bellman
Reply to  karlomonte
December 8, 2022 9:56 am

I call you a troll when you engage in troll like activity. Such as responding to my every comment with some attention seeking insulting one liner. When you do post something of interest I try to engage.

And I haven’t downvoted anyone in this post. I rarely downvote anyone, only when someone has said something particularly offensive or annoying. I don’t understand why anyone cares about the votes. Mine almost immediately get blitzed with downvotes regardless of what I’ve said. I take it as a compliment.

karlomonte
Reply to  Bellman
December 8, 2022 11:49 am

As you’ve been told many times, attempting to educate you has proven quite impossible, and I refuse to waste any more of my time doing so.

You need to provide the receipts to show that I “respond[] to my every comment with some attention seeking insulting one liner.”

And stop whining.

Bellman
Reply to  karlomonte
December 8, 2022 2:06 pm

Stop trolling.

karlomonte
Reply to  Bellman
December 8, 2022 2:28 pm

Get a new dictionary.

Jim Gorman
Reply to  Tim Gorman
December 8, 2022 8:14 am

” If you have the entire population your calculation of the average will have no standard deviation, it will be zero. That does *NOT* mean that the average is 100% accurate.”

If you have the entire population your calculation of the average will have no standard deviation OF THE MEAN, it will be zero. That does *NOT* mean that the average is 100% accurate.

Corrected it for you.

The mean of the population is not an estimated value subject to sampling error. Sampling always has errors unless the population is totally normal and all samples are IID.

One can ponder how much sampling error there is by using 2 readings from a 24-hour continuous phenomenon with variation all over the earth.
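The two-readings-a-day question can be illustrated numerically. The curve below is a made-up asymmetric diurnal cycle (an assumption, not station data); the point is only that the (Tmax+Tmin)/2 midrange need not equal the true 24-hour mean:

```python
import math

# Illustrative diurnal cycle (synthetic, not real station data): a base
# sinusoid plus a harmonic that skews the curve, sampled every minute.
def temp(hour):
    return 15 + 8 * math.sin(math.pi * (hour - 9) / 12) \
              + 2 * math.sin(math.pi * hour / 6)

samples = [temp(m / 60.0) for m in range(24 * 60)]
true_mean = sum(samples) / len(samples)          # integrated daily mean
midrange = (max(samples) + min(samples)) / 2     # the Tmax/Tmin estimate

print(round(true_mean, 2), round(midrange, 2))   # -> 15.0 17.0
```

For this particular made-up curve the midrange runs 2 degrees warm of the true mean; a differently skewed curve would bias it the other way. The offset is systematic for a given shape of day, which is the sampling question the comment raises.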

Tim Gorman
Reply to  Bellman
December 8, 2022 5:10 am

The surface data sets don’t consider uncertainty. They do what you do – assume that all stated values are 100% accurate and the only uncertainty is the standard deviation of the stated values.

It’s been pointed out to you over and over and over, and to bdgwx as well, that if you include the uncertainty intervals along with the stated values on the graphs you can get just about any trend line you wish, positive/negative/zero. You didn’t believe it when it was shown with graphs and you don’t believe it today.

Bellman
Reply to  Tim Gorman
December 8, 2022 5:41 am

You literally just joined in a discussion I was having with Pat Frank about how the surface data sets interpret measurement error. You obviously weren’t paying attention to it. It’s a complete lie that they assume all stated values are 100% accurate.

karlomonte
Reply to  Bellman
December 8, 2022 6:44 am

Who are “they”, mr anti-troll?

And you STILL don’t understand that error is not uncertainty.

Bellman
Reply to  karlomonte
December 8, 2022 9:58 am

“They” refers to what Tim was saying: the surface datasets, which I take to mean those who run them.

karlomonte
Reply to  Bellman
December 8, 2022 11:54 am

Do any of them report uncertainty intervals? — Of course not, so by default they are assuming 100% accurate values.

And you STILL don’t understand that error is not uncertainty.

Bellman
Reply to  karlomonte
December 8, 2022 2:21 pm

Do any of them report uncertainty intervals? — Of course not, so by default they are assuming 100% accurate values.

Have you heard of this thing called the internet?

https://data.giss.nasa.gov/gistemp/uncertainty/

https://www.metoffice.gov.uk/hadobs/hadcrut5/data/current/download.html

https://berkeley-earth-temperature.s3.us-west-1.amazonaws.com/Global/Land_and_Ocean_complete.txt

karlomonte
Reply to  Bellman
December 8, 2022 2:32 pm

Have you heard of this thing called the internet?

https://data.giss.nasa.gov/gistemp/uncertainty/

You forgot to look under the hood, from link #1, which claims impossibly small milli-Kelvin “uncertainty” for their averaging of averages:

95% confidence interval calculated in our study

BZZZZZZT.

Another fail, they don’t understand the subject any better than you.

Bellman
Reply to  karlomonte
December 8, 2022 2:52 pm

And the scraping sound you hear is that of trolls dragging goal posts around.

karlo / carlo says with complete confidence that none of the data sets report uncertainty intervals, which proved they assumed all data was 100% accurate. When presented with simple evidence that his claim was wrong, he whines that the real issue is he doesn’t like the size of the confidence intervals, so they don’t really count.

karlomonte
Reply to  Bellman
December 8, 2022 2:59 pm

That website is signed by Gavin Schmidt, which should have been a huge clue for you. Those values are as absurd as the UAH claims.

“95% confidence intervals” are NOT propagated uncertainty.

Screenshot 2022-12-08 at 3.35.33 PM.png
Bellman
Reply to  karlomonte
December 8, 2022 4:26 pm

Stop digging. You made a claim that no data set was published with uncertainty intervals. That was flat out wrong, so now you are in full deflection mode.

karlomonte
Reply to  Bellman
December 8, 2022 5:26 pm

You made a claim that no data set was published with uncertainty intervals.

Deflection? Rather projection by you.

As Tim explained in detail, those Gavin Schmidt green thingies are NOT uncertainty intervals.

Bellman
Reply to  karlomonte
December 9, 2022 4:09 am

It’s your choice – you can keep digging, or you can admit that it was untrue to say that no data set comes with uncertainty intervals and that they claim their results are 100% accurate.

It doesn’t matter if you agree with the intervals or not, it’s simply wrong to claim that they claim 100% accuracy.

karlomonte
Reply to  Bellman
December 9, 2022 4:56 am

Those are not intervals and error is not uncertainty, you dolt.

Again and again you demonstrate your ignorance and ineptness.

What is the color of the sky in bellcurveman world?

Tim Gorman
Reply to  karlomonte
December 8, 2022 3:25 pm

If you read the GISS document they specifically ignore measurement uncertainty and use the variance of the anomaly as the uncertainty. It’s like none of these people understand metrology at all!

Tim Gorman
Reply to  Bellman
December 8, 2022 3:24 pm

giss document.

“The treatment of missing land surface data is a major distinction between products. Since monthly temperature anomalies are strongly correlated in space, spatial interpolation methods can be used to infill sections of missing data.”

Anomalies are *NOT* strongly correlated in space. Hubbard and Lin showed that. They came to the conclusion that any calibration corrections have to be done on a station-by-station basis, not on a regional basis. That means the anomalies calculated from the measurements of the stations cannot be strongly correlated either.

See the attached picture of temps in the Kansas area for the evening of 12-8. There is simply no way anomalies calculated from a statewide average can be strongly correlated, not when GISS uses a 1200km radius!

This is an unwarranted assumption by GISS that is not justified in their paper using any actual data at all.

“The averaging step calculates the regional and global time series from the interpolated subbox records.”

Nowhere in the document is it mentioned that GISS propagates the individual measurement uncertainties onto the average they arrive at. They do the typical thing that you do – assume all measurements are 100% accurate because all uncertainty is random and cancels.

It is simply unbelievable to me that no one ever calls them out on this.

“Since the global mean calculation in GISTEMP aggregates from small subboxes to the 80 equal-area boxes, the coarse model grid approach has considerable value in quantifying the large-scale sampling uncertainty in the approach assuming that the model is capturing sufficient statistical structure of the underlying fine-scale global temperature anomaly field.”

Once again, they are ignoring the uncertainty of the actual measurements by assuming all individual uncertainties cancel and that a statistical analysis of the stated values gives an accurate picture of the uncertainty of the average.

“In this formulation, 𝜖(t) is a random variable that represents the total uncertainty in our estimate of the annual mean temperature anomaly. Assuming that our estimation procedure is unbiased (an assumption we will revisit), the expected value E[𝜖(t)] = 0 for all years t. The uncertainty in our calculation of the global mean anomaly is then defined as E(t) = Var(𝜖(t)).”

This is common in climate science. Ignore the uncertainties of the individual measurements by assuming they all cancel and then finding the variance of the stated values and call it “uncertainty”.

This is *only* valid if you have multiple measurements of the same thing.

I have no doubt the other links will show exactly the same thing. I’ve looked at too many climate study documents that do *exactly* this – assume measurement uncertainty all cancels.

karlomonte
Reply to  Tim Gorman
December 8, 2022 3:34 pm

the coarse model grid approach has considerable value in quantifying the large-scale sampling uncertainty

Now I see where he got this term from, thanks. As I thought, it isn’t from metrology at all, but from climatology.

Tim Gorman
Reply to  karlomonte
December 8, 2022 4:14 pm

As usual he is just cherry-picking without understanding what he is actually quoting. He didn’t bother reading the GISS document at all! It was obvious in just the first few pages that they were ignoring measurement uncertainty. It was proved when they equated the uncertainty with the variance of the stated average values. They even said they assumed that the uncertainty of a year’s average is zero!

You have it pegged. They all think that averaging reduces measurement uncertainty.

I’m done in this thread trying to teach him – he is unteachable. He’s a troll, pure and plain.

karlomonte
Reply to  Tim Gorman
December 8, 2022 5:27 pm

Yep. And he projects his failings onto others.

Unskilled and Unaware.

karlomonte
Reply to  Tim Gorman
December 8, 2022 6:27 pm

He didn’t bother reading the GISS document at all! It was obvious in just the first few pages that they were ignoring measurement uncertainty.

Just like he didn’t bother reading the GUM, or Pat Frank’s paper, or any of CMoB’s posts. He just skims for anything that might prop up his climatology biases.

Jim Gorman
Reply to  Bellman
December 8, 2022 7:39 am

If “they” don’t consider them 100% accurate, then why is an uncertainty interval for each measurement not shown? IOW, a standard deviation!

Why is there no Standard Deviation shown for any average?

The only conclusion is that all measurement readings have no uncertainty and even averages have no SD.

karlomonte
Reply to  Jim Gorman
December 8, 2022 8:08 am

Exactly correct.

Bellman
Reply to  Jim Gorman
December 8, 2022 10:03 am

Most datasets have published uncertainty intervals. UAH being an exception.

Tim Gorman
Reply to  Bellman
December 8, 2022 10:20 am

And most of those uncertainty intervals are far too small considering the accuracy of the underlying data. They, like you, ignore the uncertainty of the data and just use the standard error of the mean as the accuracy of the mean when it isn’t!

karlomonte
Reply to  Tim Gorman
December 8, 2022 2:40 pm

By claiming “uncertainty” numbers less than ±0.5K for their averages of averages, NASA GISS is as lost in the weeds as bellcurveman. They obviously also believe that averaging and subtracting baseline averages “reduces error”. This is beyond absurd (also note the “confidence interval” claim):

Screenshot 2022-12-08 at 3.35.33 PM.png
Bellman
Reply to  karlomonte
December 8, 2022 2:57 pm

Why limit your fantasy to ±0.5K?

According to Tim you have to add all the uncertainties, so 1 instrument making 700 measurements, each with a systematic uncertainty of ±0.5K, will have an annual uncertainty of ±350K. A minimum of 1000 such instruments will result in a global annual uncertainty of ±350,000K.

karlomonte
Reply to  Bellman
December 8, 2022 3:16 pm

And now the bellcurveman hamster wheel circles back around to where “averaging reduces uncertainty.”

Get a grip, PDQ. You haven’t grasped a single word that Tim has taken the time to write.

Bellman
Reply to  karlomonte
December 8, 2022 4:28 pm

More deflection. Tim’s point isn’t that averaging doesn’t reduce uncertainty, it’s that it increases uncertainty. As always you try to defend Tim, but fall short of actually agreeing with him.

karlomonte
Reply to  Bellman
December 8, 2022 5:30 pm

Tim’s point isn’t that averaging doesn’t reduce uncertainty, it’s that it increases uncertainty.

The variances ADD, which is EXACTLY what he has been trying to implant into your neutronium-dense skull…

Bellman
Reply to  karlomonte
December 9, 2022 4:05 am

I’m not sure if you are agreeing with Tim or not. Are you saying that you agree that the measurement uncertainties will increase when propagated to an average?

Variances add when adding random variables, but as I keep saying, when you scale a random variable by a constant you have to scale the variance by the square of that constant. And that means when you divide the variable by N, the variance is divided by N^2.

Could you clarify if you agree with that, or Tim’s claim that you never divide the variance by anything? Preferably without recourse to more schoolyard insults.
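The variance-scaling claim can at least be checked numerically for the purely random, independent case. This sketch (synthetic Gaussian errors, which is itself the assumption the thread is disputing for real thermometers) only demonstrates the textbook Var(mean) = sigma^2/N result; it says nothing about systematic error:

```python
import random

random.seed(42)

SIGMA, N, TRIALS = 0.5, 100, 20000

# Each trial: average N independent errors drawn from Normal(0, SIGMA).
means = [sum(random.gauss(0.0, SIGMA) for _ in range(N)) / N
         for _ in range(TRIALS)]

mu = sum(means) / TRIALS
var = sum((m - mu) ** 2 for m in means) / (TRIALS - 1)

# Empirical variance of the mean should sit near sigma^2 / N = 0.0025.
print(f"empirical Var(mean) = {var:.5f}, sigma^2/N = {SIGMA**2 / N:.5f}")
```

If the N errors shared a common systematic component instead, that component would survive the averaging untouched, which is the other half of the argument.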

karlomonte
Reply to  Bellman
December 9, 2022 4:59 am

I see you still can’t understand that I refuse to play your games with futile efforts to educate you.

Bellman
Reply to  karlomonte
December 9, 2022 6:57 am

I know you won’t or can’t answer. That’s the point. Others can draw their own conclusions from your refusal to answer.

karlomonte
Reply to  Bellman
December 9, 2022 8:44 am

Tim has stated it a lot better than I:

The only way they can get uncertainty intervals down into the hundredths and thousandths of a degree is by ignoring the uncertainty of the individual components. THE ONLY WAY.

You can *NOT* decrease uncertainty by averaging. You simply can’t. Trying to discern temperature differences in the hundredths digit by averaging, when the underlying data is only accurate to the tenths digit (or even the units digit), is an impossibility.

It truly is that simple.

Ask me if I care about what the alleged “others” think.

Bellman
Reply to  karlomonte
December 9, 2022 8:57 am

Not the question I’m asking. Do you agree with Tim that the uncertainty of the average increases with sample size?

karlomonte
Reply to  Bellman
December 9, 2022 9:14 am

Zzzzzzzzzzzzzzzz………

Jim Gorman
Reply to  karlomonte
December 9, 2022 10:17 am

That is the issue.

From a Johns Hopkins Univ Chem Lab.

https://www2.chem21labs.com/labfiles/jhu_significant_figures.pdf

9. When determining the mean and standard deviation based on repeated measurements
• The mean cannot be more accurate than the original measurements.

karlomonte
Reply to  Jim Gorman
December 9, 2022 10:29 am

There it is, in black & white. Only in climatology are rules like these allowed to be sent out the pitot tube.

Tim Gorman
Reply to  Bellman
December 8, 2022 3:35 pm

Do *you* know how much of the +/- 0.5K is attributable to systematic uncertainty and how much to random error?

so 1 instrument making 700 measurements each”

Of the same thing or different things? If you have two boards and measure each with the same tape measure that is off by 1″ what will the length of the two boards combined be? x +/- 2″.

Why is this so hard for you to understand? It’s as simple an example as you will ever get. Join the two boards end to end and you will *NOT* get an uncertainty of 1″ or 1″/2 or anything else! If you plan on using those two boards to span a distance, you won’t know if it is 2″ too long, 2″ too short, or somewhere in between!

It will be the same with three boards or 50 boards or 1000 boards! The uncertainty adds!
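The board example can be sketched numerically. This is a minimal illustration only, assuming (as the comment does) a tape measure with a fixed +1″ systematic error; all names and values here are hypothetical:

```python
# Sketch of the two-board example: a tape measure with a fixed
# systematic error of +1 inch on every measurement.
# All values are illustrative, not anyone's published method.
true_lengths = [96.0, 96.0]          # two 8-ft boards, true lengths in inches
bias = 1.0                           # tape reads 1 inch long, every time

measured = [length + bias for length in true_lengths]
measured_total = sum(measured)       # boards joined end to end
true_total = sum(true_lengths)

# The systematic errors are fully correlated, so they add linearly:
# the combined error is 2 inches, not 1 inch and not 1/sqrt(2) inch.
print(measured_total - true_total)   # 2.0
```

With a purely systematic error the individual errors are fully correlated, which is why they add linearly here rather than in quadrature.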

Bellman
Reply to  Tim Gorman
December 8, 2022 4:36 pm

Of the same thing or different things?

I asked that same question a long time ago. I was repeatedly told that it was impossible for a thermometer to ever read the same thing twice as temperature is always changing. You keep making the point that the uncertainty of the average of maximum and minimum temperatures has to be greater than either of the individual measurements. So, yes, I assume the 700 measurements are of different things.

Why is this so hard for you to understand?

It isn’t. What’s hard to understand is why you think the uncertainty of the sum is the same as the uncertainty of the average.

karlomonte
Reply to  Bellman
December 8, 2022 6:11 pm

You’re unhinged, no wonder climatology has so little connection to reality.

Bellman
Reply to  Tim Gorman
December 8, 2022 4:41 pm

Do *you* know how much of the +/- 0.5K is attributable to systematic uncertainty and how much to random error?

I’m assuming it’s all systematic as that’s the only way it makes sense for Pat Frank to claim that if individual measurements have an uncertainty of ±0.5 than the global annual average uncertainty will also be ±0.5.

karlomonte
Reply to  Bellman
December 8, 2022 6:05 pm

Try reading his paper with some comprehension.

Jim Gorman
Reply to  Bellman
December 8, 2022 12:03 pm

Do you show them when you quote a value? If not, are you being accurate?

Bellman
Reply to  Jim Gorman
December 8, 2022 2:30 pm

Depends on what I’m doing. Mostly I’m only quoting UAH data, which is the only one considered accurate here but also has no quoted uncertainty. If all I’m doing is saying this is the 6th warmest November or whatever, I don’t, because it’s just a bit of fun.

I do often quote the uncertainty in trends, because that’s something that is relevant, especially when people are cherry-picking a pause, and fail to mention the uncertainty.

karlomonte
Reply to  Bellman
December 8, 2022 2:53 pm

UAH data which is the only one considered accurate here

Strawman alert—where is the official WUWT statement to this effect?

I do often quote the uncertainty in trends

No you don’t, you don’t understand that error is not uncertainty, and you believe that averaging reduces this quantity that you don’t comprehend.

Bellman
Reply to  karlomonte
December 8, 2022 4:23 pm

So now the trolling reaches the point of demanding everything be taken absolutely literally. No, I doubt there’s an official statement to the effect that all data sets apart from UAH are considered inaccurate.

But it is the only one that is published here on a monthly basis. I’m sure if you look through the archive you can find numerous articles attacking the “tampering” made to other data sets. I’ve been told in comments that UAH is the only data set that both sides trust. Lord Monckton publishes frequent “pause” articles always based on UAH, with no mention of uncertainty, and if he mentions HadCRUT at all it’s prefaced by comments about how it’s been tampered with. And it wasn’t that long ago that anyone questioning the accuracy of satellite data was accused of being a satellite unbeliever.

karlomonte
Reply to  Bellman
December 8, 2022 6:14 pm

bellcurveman is reduced to screeching “troll!” “troll!” “troll!” over and over.

And you STILL don’t understand CMoB’s methods after years and years.

Do you even read what he writes?

Bellman
Reply to  karlomonte
December 9, 2022 3:56 am

I’m not a troll whines someone who thinks “bellcurveman” is an effective insult.

And you STILL don’t understand CMoB’s methods after years and years.

I beg to differ. I’ve been following his methods for years; I follow his methods to the letter and get exactly the same results as him every time. I’ve yet to hear any convincing explanation from anyone else about why I’m wrong, let alone a demonstration of how using their preferred method would produce the correct result.

Instead all I’m given is inane analogies about tracking deer, or that the secret is that Monckton works backwards and the end is the beginning. Still, there’s now an opportunity for you to give your explanation of what Monckton’s method really is, and tell us when you think the pause should start this month. (Or you could just say I’m playing mind tricks on you, and that I’m incapable of being educated.)

karlomonte
Reply to  Bellman
December 9, 2022 4:59 am

Return of bellcurveman the whiner…

Jim Gorman
Reply to  Bellman
December 9, 2022 9:55 am

A temperature reading is a singular occurrence. There are no multiple readings of the same measurand that can be used to develop a distribution that one can use to determine the cancellation of random errors. There are NO RANDOM ERRORS in a single reading. That doesn’t rule out measurement errors, just that there is no way to reduce them.

Similarly, systematic errors can occur through drift, screen or positioning problems, etc. Hubbard and Lin showed that one could not predict these statistically and that visits to calibrate and test individual stations were the only accurate method of evaluating systematic errors.

Consequently, there is only one way to determine a combined standard uncertainty, use the uncertainty that is specified by the manufacturer or other supervisory entity.

Here are what the NWS and NOAA specify for ASOS, CRN, and MMTS stations

https://www.weather.gov/asos/ ASOS MAX ERROR = ±1.8°F

https://www.ncei.noaa.gov/access/crn/documentation.html CRN ±0.3°C

https://www.weather.gov/coop/Standards MMTS ±1.0°F

One cannot simply blow these off by handwaving and claiming that they disappear through the magic of averaging.

A “mean” is a statistical descriptor of the central tendency of a distribution of data points. A “mean” is not a measurement. The GUM, JCGM 100:2008, has definitions of what a measurement of a measurand is. It must be a physical attribute that can be physically measured. Note, it can be a measurement derived from other individual physical measurements, but regardless, it requires physical measurements.

An arithmetic “mean” is not a physical measurement. It is not a physical attribute of a measurand. So what is an arithmetic “mean” and when is it useful? A mean is PART of a group of statistical descriptors that describe the shape and form of a distribution of data points. One of the important descriptors other than a mean is the Standard Deviation (SD) of the distribution. This describes the “range” of the data surrounding the mean, i.e. how spread out it is. Other descriptors are kurtosis and skewness.

There is one other calculation that gets tossed around, and that is the Standard Error of the sample Mean (SEM), or Standard Error as it is sometimes called. The SEM is not a statistical descriptor of a distribution. It is a statistic describing an INTERVAL around the estimated mean which is derived from a sample mean distribution. The interval tells you where the actual population mean may lie. It does not tell how accurate or precise a mean value is.

So what is the measurement uncertainty of a mean of a distribution of single measurements? If all the measurements carry the same combined standard uncertainty it isn’t hard to calculate. The mean will carry the same uncertainty. For example, a daily mean uncertainty will be ±1.0°F. A monthly mean will carry a ±1.0°F uncertainty. An annual mean will carry a ±1.0°F uncertainty. A 30 year baseline mean will carry a ±1.0°F uncertainty.

Even when subtracting a baseline mean value from another mean, the result will carry that ±1.0°F uncertainty. That is what makes anomalies of 0.001 so ludicrous when using land measurements. Anything inside the uncertainty interval is statistically insignificant. That is the one reason scientists are so eager to wave away measurement uncertainty and to incorrectly use the SEM as the only uncertainty even when that is not its purpose. Even Dr. Possolo in the NIST TN 1900 document used an expanded computation and ended up with a ±1.8°C uncertainty.
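The random-versus-systematic distinction in the comment above can be illustrated with a quick simulation. This is a sketch only: it assumes the systematic component is a fixed offset shared by every reading (the fully correlated case), and all the numbers are made up:

```python
import random

random.seed(42)
N = 10_000                # number of independent readings
true_value = 15.0         # degrees, arbitrary
sigma_random = 0.5        # standard deviation of the random error component
bias = 0.5                # systematic offset, identical for every reading

readings = [true_value + bias + random.gauss(0, sigma_random) for _ in range(N)]
mean = sum(readings) / N

# The random component of the mean shrinks roughly as 1/sqrt(N),
# but the shared systematic offset survives averaging untouched.
print(round(mean - true_value, 2))   # close to 0.5, i.e. the bias
```

The simulation separates the two behaviours: averaging beats down the random scatter but cannot touch a bias common to all readings, which is the scenario the comment describes.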

I haven’t even touched on what the Standard Deviation is used for. But I am sure one reason you never see one is the annual averaging of both winter and summer temps together. That would make anomalies disappear into the woodwork, which is why they are never quoted. I don’t believe for a minute that mathematicians are so blinded that they don’t know a mean is meaningless without knowing the Standard Deviation of the distribution that was used to calculate the mean.

Last edited 1 month ago by Jim Gorman
karlomonte
Reply to  Jim Gorman
December 9, 2022 10:18 am

Including the standard deviations would boost the uncertainty intervals to tens of degrees, not thousandths!

Jim Gorman
Reply to  karlomonte
December 9, 2022 10:33 am

Which is precisely the point!

The whole concept is a scam to further funding. I can’t believe real statisticians have not stood up and said wait a minute! Averages of averages of averages mixed in with some sampling theory has never been shown to me to give a true answer, especially when variances are totally ignored.

UAH is the only one I don’t really know about. I just don’t know enough to judge their conversion from irradiance to temperature. That they agree with radiosondes is encouraging.

karlomonte
Reply to  Jim Gorman
December 9, 2022 10:58 am

The conversion is complex and non-trivial; there is a flowchart in an RSS paper for their procedure that is hard to even read. Corrections have to be made for orbit drift, detector degradation, detector view angle, and then more adjustments are made because the 20-odd NOAA satellites used since 1979 don’t always overlap cleanly. It is all done in what I suspect are mountains of FORTRAN.

But the UAH could easily report more details of the results such as the number of points in the monthly means and the standard deviations.

Tim Gorman
Reply to  Bellman
December 8, 2022 9:50 am

The only way they can get uncertainty intervals down into the hundredths and thousandths of a degree is by ignoring the uncertainty of the individual components. THE ONLY WAY.

You can *NOT* decrease uncertainty by averaging. You simply can’t. Trying to discern temperature differences in the hundredths digit by averaging when the underlying data is only accurate to the tenths digit (or even the units digit) is an impossibility.

It truly is that simple.

Tim Gorman
Reply to  bdgwx
December 8, 2022 5:01 am

Which *still* doesn’t prove that your model is anything but curve matching to data points that have large uncertainty which is ignored.

Richard M
Reply to  Bellman
December 7, 2022 1:22 pm

A little longer view tells the real story. After the AMO went positive in 1997 the trend started up. But was flattened because the PDO went negative in 2006. When the PDO flipped back into a positive phase there was again lots of warming. Since then we’ve been cooling.

https://woodfortrees.org/plot/uah6/from:1997/to/plot/uah6/from:1997/to:2014.5/trend/plot/uah6/from:2015.5/to/trend/plot/uah6/from:2014.5/to:2015.5/trend

Forrest Gardener
Reply to  Bellman
December 7, 2022 3:23 pm

The 6th warmest November in the UAH record, and the hottest this year.

There. Fixed it for you.

Nick Stokes
December 6, 2022 2:38 pm

My surface temperature calculation also showed a cool November, coolest month since Feb 2021. It was down 0.217°C since October. Main cool spots were western N America and Australia. Western Europe through temperate latitudes to China were quite warm.


Chris Hanley
Reply to  Nick Stokes
December 6, 2022 3:25 pm

A lot of red, orange and yellow there; it is worth noting that the anomaly base period 1951–1980 was a period when the planet was allegedly heading to another ice age.

RickWill
Reply to  Chris Hanley
December 6, 2022 4:41 pm

allegedly heading to another ice age.

The wording is imprecise but otherwise accurate.

To be accurate, better wording would be “the current interglacial is at its point of termination”.

There will eventually be dispute over what represents the termination. The earliest date is the reversal of the northern hemisphere solar input due to orbital changes about 1000 years ago. Or it could be defined as the permafrost line advancing southward again, which is yet to happen. Or maybe the sea level declining, which will be a long time after it is obvious that snow is accumulating again.

It is apparent few people appreciate that snowfall is an energy-intensive process. Deposition of 1 tonne of snow on land requires the same heat release as burning 100 kg of coal. The snow actually warms the land. That heat went into an ocean months earlier to liberate the water that eventually ends up as snow on land. More snow cannot be achieved without the oceans getting warmer. Enough snow to overtake melting requires a lot warmer oceans. Most of the northern oceans will reach the 30C temperature limit before the ice accumulation peaks in 8000 years. So the NH warming has a loooong way to go.
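The “tonne of snow ≈ 100 kg of coal” figure can be checked with textbook values, if one assumes the relevant heat is the latent heat of deposition (water vapour going directly to ice, about 2.83 MJ/kg) and a coal heating value of roughly 28 MJ/kg. Both numbers are my assumptions for the check, not from the comment:

```python
# Rough energy-balance check of the snow-vs-coal comparison.
# Illustrative textbook values, not from the original comment:
L_deposition = 2.83e6    # J/kg, latent heat of deposition (vapour -> ice)
coal_energy = 28e6       # J/kg, approximate heating value of coal

heat_per_tonne_snow = 1000 * L_deposition        # J released per tonne of snow
equivalent_coal_kg = heat_per_tonne_snow / coal_energy

print(round(equivalent_coal_kg))   # roughly 100 kg of coal per tonne of snow
```

With these values the comparison comes out close to the stated 100 kg; note it only works for vapour deposited as snow, not for liquid water merely freezing (whose latent heat is about 0.33 MJ/kg).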

So far only Greenland and Iceland are obviously gaining ice extent but the rest of the northern land masses are not far behind.

Last edited 1 month ago by RickWill
walterr070
Reply to  RickWill
December 6, 2022 6:37 pm

Rick,

I thought your theory was that snowfall would increase in the NH over the next 1000 or so years. Wouldn’t that imply that there would be cooling?

RickWill
Reply to  walterr070
December 6, 2022 8:41 pm

Wouldn’t that imply that there would be cooling?

Snow falls because it is transporting heat from the oceans. Every tonne of snowfall requires the equivalent heat loss to space as burning 100 kg of coal. It makes the place warmer than it otherwise would be. Right now that is clearly evident over the NH – per attached. Land winter temperatures are rising rapidly, Greenland being the most significant.

So warming oceans in summer result in increasing advection of heat to land in winter and some of that heat is released by water solidification over the land that falls as snow.

Screen Shot 2022-11-07 at 11.10.13 am.png
angech
Reply to  Nick Stokes
December 6, 2022 3:56 pm

Thanks Nick. Timely posting and correlation

Ozonebust
Reply to  Nick Stokes
December 6, 2022 6:41 pm

Hi Nick
Could you please provide a value for the temperature increase compared to the 1850 baseline.
From my estimation it is about 0.7C.
Thank you

Nick Stokes
Reply to  Ozonebust
December 6, 2022 7:19 pm

My TempLS only goes back to 1901. The rise since then has been 1.09°C.

Hadcrut 5 says that the Earth cooled 0.09°C between 1850 and 1900.

Ozonebust
Reply to  Nick Stokes
December 6, 2022 8:09 pm

Thanks Nick.
So a round number of plus 1C is a good estimate.
Regards

bdgwx
Reply to  Ozonebust
December 7, 2022 1:55 pm

Berkeley Earth is in the 1 C ballpark as well.

Jim Gorman
Reply to  bdgwx
December 7, 2022 5:00 pm

Berkeley Earth doesn’t even look at the actual temperature measurements. They simply start changing monthly averages and never do anything about changing the actual temperature measurements.

In essence, they are saying the actual measurements are correct, but we can fiddle with the average values created by using those correct temperatures and just create what we want to see!

Mathematically, if the averages are incorrect, then the measurements that are used in the calculations are incorrect also. Normally when incorrect data is discovered, the data is thrown away. But, not in climate science!

Javier Vinós
Reply to  Nick Stokes
December 7, 2022 2:00 am

We did have a warm November in Spain, which was great considering heating costs are through the roof. Too bad December is being cold.

Spain is the 14th economy in the world by nominal GDP, yet it has considerable energy poverty. According to Eurostat, 14% of Spanish homes declared not being able to sufficiently heat their houses in 2021. This year, energy is a lot more expensive and people have less money due to high inflation. I’m afraid winter excessive mortality is going to be high this year.

JCM
December 6, 2022 3:01 pm

A problem, of relative significance in my view, is that energy budget diagrams and 1D models which assume a 288K surface emission temperature have erred due to simplifying assumptions.

Notably, the surface of the Earth, is at an average pressure level of 984 millibars, or an average elevation above sea level of 246m. Assuming a 6.5K per km lapse rate, this translates to a global average surface temperature of 288K – 0.246(6.5K) = 286.4K. Thus, the first order estimate of upward LW emission is 381.5 W m-2.
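The arithmetic above can be reproduced with the Stefan-Boltzmann law, keeping the same simplifying assumption of unit emissivity that the comment (and the budget diagrams it criticizes) uses:

```python
# Reproducing the first-order emission estimate from the comment above.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

# Lapse-rate adjustment: 288 K at sea level, mean surface elevation 246 m,
# lapse rate 6.5 K/km, exactly as stated in the comment.
T_surface = 288.0 - 0.246 * 6.5          # = 286.4 K

# First-order upward LW emission, assuming emissivity = 1.
emission = SIGMA * T_surface ** 4
print(round(emission, 1))   # 381.5 W/m^2
```

This recovers the 381.5 W m-2 figure quoted above; dropping the elevation adjustment (T = 288 K) gives the ~390 W m-2 value used in the standard diagrams.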

When then quantifying energy budgets it is sensible to use a value of approximately 381.5 W m-2 upward emission, as opposed to commonly used values in the range of 398.

When subtracting windows, such as the NASA diagram of 40.1, we see the net LW exchange at the surface is nearly net zero, where 381.5 – 40.1 = 341.4 net emission into atmosphere, and downward LW radiation is indicated to be 340.3. This correction to energy budgets seriously limits the LW influence upon surface temperature, where net LW flux is approximately zero.

Thus, we must understand more clearly the influence of solar absorbed, solar reflected, and most particularly their inextricable connection to convection of non radiative fluxes.

The surface of the Earth, of course, is not at an average of sea level pressure of 1013 millibars, but rather closer to 984. The 288K value for which we all take for granted is, of course, invalid for global average 1D model parameterizations.


Crispin in Val Quentin
Reply to  JCM
December 6, 2022 3:50 pm

JCM

I am wondering if the 163.3 W absorbed by the surface is based on heating the ground, or heating the ground plus energy chemically converted to biomass through photosynthesis. By my rough calculations, the gross biomass net production rate requires about 62 W per sq m over the whole sun-facing surface, 24/7. Is that considered in the “net absorbed” (0.6 W) figure? It is a large fraction of the total insolation. It is not converted to heat, it is absorbed chemically. The “greening of the earth” maps demonstrate that there is a huge amount of energy being absorbed by biomass, not just the CO2 that gets all the attention.

JCM
Reply to  Crispin in Val Quentin
December 6, 2022 4:18 pm

The net absorbed 0.6, or imbalance, is measured at TOA, as a diagnostic procedure with available data. The surface properties (beyond albedo), and associated surface budgets, are deemed too complicated to model. No attempt is made to quantify the perturbation to surface flux properties such as photosynthesis, respiration, and water cycles (evapotranspiration and total turbulent flux). Those who operate the radiometers, and those trained in astrophysics, have an apparent bias towards radiative concepts. Of this, I am quite certain. This conceptualization has polluted our discussion of climates.

AGW is Not Science
Reply to  JCM
December 7, 2022 4:44 am

No pun intended! 😀

RickWill
Reply to  JCM
December 6, 2022 5:12 pm

have erred due to simplifying assumptions.

Ya think?

What I have come to appreciate is that there is a widely held belief that Earth was in energy balance in 1850 and at temperature equilibrium. That demonstrates a massive misunderstanding of thermal processes in Earth’s climate system.

In 2019, the solar EMR over the SH temperate zone oceans had a swing from min to max of 372W/m^2 resulting in a temperature swing of 4.3C. For comparison the range over the NH temperate zone oceans had 312W/m^2 difference min to max resulting in a temperature swing of 8.5C. The surface temperature response of the northern temperate zone oceans is 2.3 times that of SH temperate oceans for any given variation in solar EMR.

The SH reached peak sunlight about 1000 years ago around the same time the NH reached its minimum. You do not need to be real clever to appreciate that the global average temperature is going to increase.

And the fact that that image still shows back radiation means the creators have no clue on the nature of EMR. It is ELECTROMAGNETIC radiation. It exists in THE electromagnetic field. Energy cannot travel against the field potential.

Climate models are comprehensive in their slaughter of physics. They are make-believe, made up fairy tales. Clouds do not occur through modelled physical processes but are the result of parameters. Heat does not get into deep oceans through a modelled physical process but via the magic of parameters.

What the modellers do not understand is that Earth’s climate takes no notice of the parameters they use. It will do what it has always done – change. And an increase in a tiny trace gas is not going to have ANY impact.

Human civilisation will know a lot more about earth’s climate in due course as the northern oceans hit 30C across most of the surface and the ice stacks up year after year for thousands of years.

Last edited 1 month ago by RickWill
E. Schaffer
Reply to  JCM
December 7, 2022 5:06 am

What makes you believe surface elevation would make a difference? I am pretty certain the average global temperature is based on actual measured temperatures. It just makes no sense otherwise.

What is not measured, but rather just assumed, is a surface emissivity of 1. And this we know is rather 0.91. So it is not 390W/m2, but only 355W/m2.

https://greenhousedefect.com/what-is-the-surface-emissivity-of-earth

JCM
Reply to  E. Schaffer
December 7, 2022 6:09 am

288K is measured where? and by whom? Who is assuming what?

Pierrehumbert gives us his value of 286K in his 2008 book, with a preindustrial of 285K.

http://jvarekamp.web.wesleyan.edu/CO2/ClimateVol1.pdf

Notably, Hartmann in 1994 reminds that “the downward longwave from the atmosphere and the emission from the surface are both relatively large and tend to offset each other”. Sounds like a net-zero statement to me, over a sufficiently long and wide averaging time and space.

http://meteo.edu.vn/remoclic/Bai_giang/KHH_and_KHVN/Physical_Climatology.pdf

I am less interested in the specific offsetting values, but rather that they (the so-called upward and downward long waves within the atmosphere at whatever layer of interest) do appear rather similar. The net is always upward due to losses to space at the speed of light. Within the atmosphere, however, net is closer to zero.

Hartmann also notes that due to emissivity, the uncertainty of surface flux is in the order of 5%, and temperature, being at the 4th root, represents about 1%. So give or take 20 W m-2 and 2 K, respectively. Not much has changed since then.

It looks to me that the balloons are giving the 288K value in 1D representations at an altitude of 0 m.

Last edited 1 month ago by JCM
E. Schaffer
Reply to  JCM
December 7, 2022 7:25 am

Notably, Hartmann in 1994 reminds that “the downward longwave from the atmosphere and the emission from the surface are both relatively large and tend to offset each other”. Sounds like a net-zero statement to me, over a sufficiently long and wide averaging time and space.

Nothing to be curious about. Back- and forth radiation always tend to cancel each other out. It is just one of the reasons why the GHE has nothing to do with “back radiation”. Next to many others..

mkelly
Reply to  JCM
December 7, 2022 7:00 am

JCM, I am curious about the 169.9 emitted by the atmosphere shown on diagram. What gases are doing this?

JCM
Reply to  mkelly
December 7, 2022 7:59 am

Irrespective of atmospheric composition, the value 169.9 is almost precisely 1/2 of the near surface available LW budget 340.

It is either a coincidence, or the LW regime has (practically) no freedom to force the system, and is fundamentally constrained. The budget of this heat is dominated by solar available and enthalpy flux.

The clear-sky value of 169.9 is simply the vertically integrated window losses. The vertical distribution of energy (mass flux, and enthalpy of latent and tangible heat) is dominated by thermodynamics. All sky emission profiles are dominated by the phase changes of water.

One can go on about atmospheric composition perturbations imposing upon the extreme wings of the windows in clear sky, but it must be practically negligible. Read Happer et al. Clear sky represents perhaps 33% of “the sky”.

We should then have more interest to understand what really changes the distribution of heat, both vertically and horizontally, IMO. A far more interesting and challenging problem than commonly posed. Changes to enthalpy flux, pattern effects, lapse rates, and atmospheric dynamics cannot possibly be simply a feedback to LW ‘forcing’. It is rather the so-called available heat which is responding to the myriad of fascinating factors. This inverts the whole question.

Last edited 1 month ago by JCM
bdgwx
Reply to  mkelly
December 7, 2022 1:58 pm

All of the polyatomic gas species will contribute to the 169.9 W/m2 value.

Crispin in Val Quentin
December 6, 2022 3:41 pm

It was -36 C at 9 AM this morning in Val Quentin, Alberta and the temperature peaked for the day 4 hours later at -25, after which it dropped again.

The average for December is -9 C (-13 to -5 daily range). It is only the 6th Dec so it doesn’t look too good so far, being 23 C (41 F) below the monthly average low.

Winter is still two weeks away!

I believe the nearby town of Stony Plain and the Edmonton Airport were ~2 deg colder, with a wind chill around -50. Alberta is doing a yeoman’s job of pulling down the global temperature. Well done.

Alastair Brickell
Reply to  Crispin in Val Quentin
December 6, 2022 4:28 pm

Yes, that’s a bit chilly, even for Alberta in early December. But Crispin, what’s the temperature doing in Waterloo? I’m sure you used to keep track of that!

Crispin in Val Quentin
Reply to  Alastair Brickell
December 8, 2022 5:16 pm

Waterloo is having a dry year, and an average temperature. There is a report produced monthly by Waterloo University that used to be based on measurements near the university, but when a warmist took over it was transferred to the Waterloo Airport which is in nearby Breslau, east of Kitchener. You can subscribe to the emailed monthly reports.

The university data showed clearly that Waterloo (uptown) had not warmed in 100 years so that station has been abandoned. The airport has a shorter record and is more compliant.

Because each month was used to create an artificial alarm signal based on the month’s values being above or below the “long term average”, I wrote to them asking that they include a Sigma 1 or 2 line so we could see whether or not the results were within the normal range, and therefore nothing to worry about at all. I wrote it in a sciency way, meaning: if you don’t add them, I am going to call you out for being unscientific.

And they did! There is still a subtle bias towards global warming alarm in each text portion, but it has calmed down a lot when the data is clearly within the expectable range. The baseline is (of course) the latter half of the cooling spell 1940-1978.

bdgwx
December 6, 2022 4:28 pm

My model which uses the previous month’s value predicted +0.21 C for November. The actual value came in at +0.17 C. The prediction for December is +0.15 C. I’m expecting the Monckton Pause to continue into next year.


JCM
Reply to  bdgwx
December 6, 2022 4:38 pm

I can fit a coefficient to the area of soil desiccation and get the same result as CO2. If you want to get fancy, coastal algal blooms fed by excess nutrient runoff, which raise SST 1-2K, might add to the fit.

bdgwx
Reply to  JCM
December 6, 2022 4:42 pm

Can you post your model and a link to the monthly soil desiccation data you are using? I’d like to test your model and see if it has better skill than a RMSE of 0.11 C.

JCM
Reply to  bdgwx
December 6, 2022 4:49 pm

There is no data to support the claim, thus it is ignored. It is rather inconvenient to collect such data, you see, compared to CO2. Many will use this as an excuse to exclude such matters. However, as a first approximation consider the following diagram for conceptual discussion. To date, it is estimated 90% of wetlands are gone within the developed envelope, with top soils certainly more closely resembling mineral rockflour today, compared to rich in organics. Note the resemblance of the curve to CO2. https://ourworldindata.org/grapher/land-use-over-the-long-term

bdgwx
Reply to  JCM
December 6, 2022 5:02 pm

I’m confused. You said “I can fit a coefficient to the area of soil desiccation and get the same result as CO2.” Now you say “There is no data to support the claim, thus it is ignored.”

Do you have a model or not?

Does your model have an RMSE <= 0.11 C or not?

JCM
Reply to  bdgwx
December 6, 2022 5:04 pm

not this again. You are not engaging in scientific discourse.

bdgwx
Reply to  JCM
December 6, 2022 5:19 pm

My scientific discourse is 1) the model I presented above and 2) the request for the methods and data for the soil desiccation model you mentioned. My model is clearly documented above. The formula is T = -0.25 + [1.4 * log2(CO2lag2)] + [0.10 * ENSOlag4] + [-4.0 * AODvolcanic] + [0.35 * UAHlag1]. The model is tuned using recursive descent to optimize the parameters. The CO2 data is here. The ENSO data is here. And the volcanic AOD data is here. That’s everything you need to replicate the model. I’m now asking that you reciprocate and provide the necessary materials to replicate your model.
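For anyone wanting to replicate it, the stated formula translates directly into code. This is a sketch only: the coefficients are the ones given above, but the input values in the example call are placeholders, and treating the lag subscripts as months-of-lag applied to each input series is my reading of the notation:

```python
from math import log2

def predict_anomaly(co2_lag2, enso_lag4, aod_volcanic, uah_lag1):
    """The formula as stated in the comment above.

    Arguments are the lagged input values: CO2 lagged 2 months, ENSO
    lagged 4 months, volcanic AOD, and the previous month's UAH anomaly
    (my reading of the subscripts). Coefficients are as given.
    """
    return (-0.25
            + 1.4 * log2(co2_lag2)
            + 0.10 * enso_lag4
            - 4.0 * aod_volcanic
            + 0.35 * uah_lag1)

# Placeholder inputs, purely to show the call shape; the units and
# baseline of the CO2 term are not specified in the comment, so real
# use requires the exact input series linked there.
print(predict_anomaly(co2_lag2=416.0, enso_lag4=-1.0,
                      aod_volcanic=0.0, uah_lag1=0.32))
```

Note that without knowing the units or normalization of the CO2 input, the absolute value returned here is not meaningful; only the structure of the model is shown.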

Jim Gorman
Reply to  bdgwx
December 7, 2022 12:29 pm

The problem with your “model” is that it is not predictive! It is simple curve fitting, and at the first change you will need to change your coefficients to match. Holy cow, it sounds like a GCM!

To show how it is predictive, you need to show a dimensional analysis of how your components are converted into temperature and how the coefficients were derived. For example, “-0.25” needs to have units of temperature in order to calculate a physical value for “T”. Your “1.4 • log2(CO2lag2)” also needs a way to be converted from CO2 ppm to “T”, along with the other components.

Tim Gorman
Reply to  Jim Gorman
December 8, 2022 10:08 am

It’s like saying the road was straight for the past 100 miles so it will be straight for the next 100 miles. Better not go to sleep at the wheel!

Peta of Newark
Reply to  bdgwx
December 6, 2022 5:54 pm

To understand something, you have to ‘go there, do that, get down & dirty’.

IOW: Build your own model.
The figures you need are all out there and that ‘world in data’ image is an epic start.

The inputs you need are:

  • Total globe area: 5e14 square metres
  • Total water area: (start at 70% of the above)
  • Area of ice: 10%
  • Area desert: 10%
  • Area of green: 10%
  • Albedo of water: 1.0
  • Albedo Fresh new Ice: 0.8
  • Albedo old ice: 0.4
  • Albedo greenery: 0.4
  • Albedo wet ploughed land: 0.1
  • Albedo dry land: 0.15
  • Emissivity water: 0.95
  • Emissivity air: between 0.03 and 0.01
  • Emissivity land/soil: (There’s a question – how does it relate with albedo?)

Just run through that lot and see how the absorbed solar input has changed.
Turning it into temperature is a whole other ball-park and as JCM says, nobody wants to know.
Because, in a nutshell, all they want to know about is radiation, and especially, they pass the buck to The Emperor’s costumier to assert that Temperature = Energy and you’re A Really Stupid Person if you don’t understand why.

Another nice model to build, Excel is your friend, is to divvy the globe into slices of Latitude.
Then ask it to work out some sines and cosines concerning the amount of solar energy landing in each slice.
Then do some Stefan calculations on the power you calculate and then, Joy Of Excel, work out some averages.
Then tweak a few things, get an idea of how sensitive your average is to albedo, solar power, emissivity

Also get yourself a datalogger or two….

If all else fails, visit Wunderground.
Something that intrigues me is how air temperatures rise so quickly immediately at sun-rise.
For the northern hemisphere, where obviously most stations are, and for dates between the spring/fall equinoxes, from the very instant of sunrise air temps rise at about 3 degC per hour.
Very quickly getting warmer than your datalogger buried in the soil (rockflour).
So straight off that gives a lie to the GHGE

But using numbers for air density and heat capacity tells you that (near the surface where thermometers are) the air is absorbing about 0.5 watts per cubic metre – from the very instant the sun shows even just a glimmer.
But that is per ‘cubic metre’ – not per ‘square metre’

IOW: how many cubic metres of air are there above anyone’s head, on average?
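The watts-per-cubic-metre arithmetic above is just q = ρ·cp·ΔT/Δt. A minimal sketch with textbook sea-level values (the density and heat capacity are standard assumptions, not figures from the comment); a 3 degC/hour warming rate gives roughly 1 W/m³, the same ballpark as the ~0.5 W/m³ quoted, with the exact value depending on the layer and rate assumed:

```python
# Volumetric heating rate implied by a near-surface warming rate:
#   q = rho * cp * dT/dt
rho = 1.2            # kg/m^3, air density near sea level (assumed)
cp = 1005.0          # J/(kg K), specific heat of air at constant pressure
dTdt = 3.0 / 3600.0  # 3 degC per hour, from the comment, in K/s

q = rho * cp * dTdt  # watts per cubic metre
print(f"q = {q:.2f} W/m^3")
```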

You’ll quickly see what a complete trainwreck this thing is, simply based on the mahoooosive assumptions made about albedo and emissivity, and that you cannot just ‘simply average’ something that varies with the fourth power of something else – as Temperature and Power do.

bdgwx
Reply to  Peta of Newark
December 6, 2022 6:14 pm

PofN said: “IOW: Build your own model.”

That’s what I did.

PofN said: “Turning it into temperature is a whole other ball-park and as JCM says, nobody wants to know.”

Yeah, it’s hard. I and many others want to know.

PofN said: “You’ll quickly see what a complete trainwreck this thing is”

I don’t think an RMSE of 0.11 C is a train wreck especially considering that Christy et al. 2003 report that the uncertainty of UAH itself is about 0.10 C. If their assessment of the uncertainty is correct then that means there is only 0.01 C of skill left on the table.

karlomonte
Reply to  bdgwx
December 6, 2022 6:20 pm

 If their assessment of the uncertainty is correct 

It’s not, way too low for myriad reasons.

Ozonebust
Reply to  bdgwx
December 6, 2022 7:02 pm

bdgwx
The model does not have skill, it is you that is supposed to have skill, or any other estimator or predictor of future anomalies.

bdgwx
Reply to  Ozonebust
December 6, 2022 7:20 pm

The estimator or predictor being discussed here is T = -0.25 + [1.4 * log2(CO2lag2)] + [0.10 * ENSOlag4] + [-4.0 * AODvolcanic] + [0.35 * UAHlag1]. When I test the estimator or predictor I get an RMSE of 0.11 C.

Would you mind posting an estimator or predictor of the UAH monthly anomalies that you believe does have skill? I’d like to test it and see what the RMSE is. Also, what is the threshold of RMSE, or any objective metric, that discriminates between skillful and unskillful?
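For reference, the estimator as written can be coded directly, along with the RMSE metric being debated. The sample numbers below are purely hypothetical placeholders, not real series values; note that taken literally (log2 of raw ppm), the CO2 term alone yields ~12 C, which illustrates why some baseline or unit convention must be implicit in the coefficients.

```python
import math

def estimate(co2_lag2, enso_lag4, aod_volc, uah_lag1):
    # bdgwx's stated form, coefficients verbatim from the comment.
    # As written, log2 of raw ppm gives anomalies of ~12 C, so in practice
    # the CO2 term must be relative to a baseline; none is stated here.
    return (-0.25 + 1.4 * math.log2(co2_lag2)
            + 0.10 * enso_lag4 - 4.0 * aod_volc + 0.35 * uah_lag1)

def rmse(predicted, observed):
    # Root-mean-square error: the skill metric quoted as 0.11 C.
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(observed))

# Purely hypothetical numbers, only to exercise the functions:
pred = [0.20, 0.31, 0.15]
obs  = [0.17, 0.32, 0.28]
print(f"RMSE = {rmse(pred, obs):.3f} C")
```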

Jim Gorman
Reply to  bdgwx
December 7, 2022 12:42 pm

You can’t get an uncertainty at all if your components are not physical. You are simply curve fitting some values and claiming they give a physical prediction. If “-0.25” is a measurement of “T”, with an uncertainty, where does it originate from?

This is also a perfect example of when RSS would be used along with relative uncertainty for each component if they were actual measurements.

RMSE is usually a measure of how well a regression line fits data points. Beware trying to extrapolate to the future doing this! Regressions are not predictive of multivariate time series.

karlomonte
Reply to  Jim Gorman
December 7, 2022 2:47 pm

By only predicting a single month into the future, he fools himself into thinking this basic problem doesn’t apply.

Tim Gorman
Reply to  Jim Gorman
December 8, 2022 10:11 am

RMSE taken from only the stated values is useless when uncertainty in the stated values exists. This is just one more example of claiming that uncertainty is random noise that will cancel out, so you can assume all the stated values are 100% accurate!

Mike
Reply to  bdgwx
December 6, 2022 4:53 pm

“I’m expecting the Monckton Pause to continue”

You mean the pause?

bdgwx
Reply to  Mike
December 6, 2022 4:58 pm

I definitely mean the Monckton Pause.

Mike
Reply to  bdgwx
December 6, 2022 5:28 pm

Meaning?

bdgwx
Reply to  Mike
December 6, 2022 5:34 pm

It is defined as the longest consecutive period ending on the most recent month over which the OLS trend is <= 0 C/decade. The period currently runs from 2014/10 through 2022/11. I’m expecting the period to extend into 2023.
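A minimal sketch of how such a trailing zero-trend window can be found, assuming an ordinary least-squares trend and a synthetic anomaly series (the real calculation would use the UAH monthly data):

```python
import numpy as np

def pause_length(anomalies):
    """Longest trailing window, ending at the last month, whose
    least-squares trend is <= 0.  Scanning start months from the
    earliest forward, the first qualifying start gives the longest window."""
    n = len(anomalies)
    t = np.arange(n)
    for start in range(n - 1):
        slope = np.polyfit(t[start:], anomalies[start:], 1)[0]
        if slope <= 0:
            return n - start
    return 0

# Synthetic series: warming in the first half, slight decline in the second.
series = np.concatenate([np.linspace(0.0, 0.4, 48),
                         0.4 - np.linspace(0.0, 0.1, 48)])
print(pause_length(series), "months")
```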

Mike
Reply to  bdgwx
December 6, 2022 5:43 pm

So you mean the pause.

bdgwx
Reply to  Mike
December 6, 2022 5:52 pm

I don’t know. How are you defining “pause”? I ask because some people define it based on the longest period where Tnow – Tpast <= 0 C.

Mike
Reply to  bdgwx
December 6, 2022 6:00 pm

“How are you defining ‘pause’?”

verb: stop, cease, halt, discontinue, break off, take a break, take a breath, adjourn, desist, rest, hold back, wait, delay, hesitate, hang back, pull up, mark time, falter, waver, let up, take a breather

noun: stop, cessation, break, halt, stoppage, standstill, interruption, check, lull, respite, stay, breathing space, discontinuation, discontinuance, hiatus, gap, lapse (of time), interlude, intermission, interval, entr’acte, adjournment, suspension, moratorium, interregnum, rest, time out, stopover, delay, hold-up, wait, hesitation, beat, caesura, let-up, breather

bdgwx
Reply to  Mike
December 7, 2022 9:21 am

Then no, when I say “Monckton Pause” it does not mean the same thing you are talking about.

Javier Vinós
Reply to  bdgwx
December 7, 2022 12:00 am

I’m expecting the Monckton Pause to continue into next year.

It will definitely extend into next year. The question is whether the 2023-24 winter will be a Niño winter. A Niño is due, as we always get one every 3-5 years. A weak Niño would allow the pause to extend a couple of years more. A 10-year pause is not off the table. No doubt the emission pledges are having an effect.

John Shewchuk
December 6, 2022 5:24 pm

And again … CO2 under-performs. I want an IPCC tax refund.

pillageidiot
Reply to  John Shewchuk
December 6, 2022 6:29 pm

They broke it, you bought it.

No returns!

Jeff L
December 6, 2022 5:24 pm

I have seen in some comments here (rbabcock) (and by some others elsewhere who should know better) claims that heat flow from inside the Earth may be responsible for changes in temperatures – both regionally and globally. As a geophysicist, I can say this makes no sense at all.

Here’s some basic math:
Earth Heat flow : 47 TW (plus or minus depending on whose research you look at)
Earth surface area : 5.096E+14 m^2

Average Earth Heat flow at surface : 0.0922 w/m^2 (give or take – do the math of heat flow divided by surface area)

The bottom line is that this is a trivial number when put into any ocean-atmosphere energy balance diagram … a rounding error below our ability to even measure accurately. And that is why it is never included.

And this is a worldwide average so it already accounts for high heat flows in areas such as mid ocean ridges or active volcanos. Consider that any “new” volcanic activity which could add to net heat flow at the surface is a trivially small area compared to the overall size of the Earth.

For example, the big island of Hawaii is ~1.0432e+10 m^2, which is ~0.002% of the Earth’s total area. So, for argument’s sake, say there were an area of new volcanic activity the size of the big island of Hawaii that had somehow evaded geophysical detection (a poor assumption, of course), and the heat flow from this zone was 10,000 times greater than the worldwide average (I am being excessively aggressive here, to be clear). What would be the net effect on average global heat flow from the earth?

The Math :
922 w/m^2 over 0.002% of the earth = 0.019 w/m^2 average increase
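The arithmetic above can be checked in a few lines, using only the figures stated in the comment:

```python
# Checking the heat-flow arithmetic with the comment's own inputs.
heat_flow = 47e12          # W, total geothermal heat flow
earth_area = 5.096e14      # m^2, Earth surface area
hawaii_area = 1.0432e10    # m^2, big island of Hawaii

background = heat_flow / earth_area   # worldwide-average flux, W/m^2
frac = hawaii_area / earth_area       # fraction of Earth's surface
hotspot = 10_000 * background         # the 10,000x enhanced zone, W/m^2
added = hotspot * frac                # extra global-average flux, W/m^2

print(f"background = {background:.4f} W/m^2")  # ~0.0922
print(f"hotspot    = {hotspot:.0f} W/m^2")     # ~922
print(f"added      = {added:.3f} W/m^2")       # ~0.019
```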

Trivial again when looking at any ocean-atmosphere energy balance diagram.

As the saying goes : Do the math
Changes in earth heat flow don’t compute when it comes to surface air temps.

bdgwx
Reply to  Jeff L
December 6, 2022 5:48 pm

Agreed. I usually round it off to 0.1 W/m2. Most of that is primordial and radiogenic. I believe only about 0.005 W/m2 is from lunar tidal dissipation. Anyway, the planetary energy imbalance is about 0.9 W/m2 [von Schuckmann et al. 2020]. So extending your example, we’d need nearly 50 of those 10,000x enhanced volcanic zones. Even if that kind of activity could have miraculously gone unnoticed, it would not explain the cooling stratosphere.
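The “nearly 50” figure follows directly from dividing the stated imbalance by the per-zone contribution, both values as given in the thread:

```python
# How many Hawaii-sized 10,000x zones would be needed to supply the
# ~0.9 W/m^2 planetary energy imbalance, extending the example above.
imbalance = 0.9    # W/m^2, planetary energy imbalance (as cited)
per_zone = 0.019   # W/m^2 added by one 10,000x enhanced zone (as computed)

zones = imbalance / per_zone
print(f"{zones:.0f} zones needed")  # ~47, i.e. "nearly 50"
```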

pillageidiot
Reply to  Jeff L
December 6, 2022 6:35 pm

The Math :
922 w/m^2 over 0.002% of the earth = 0.019 w/m^2 average increase

I don’t get your math.

You think an active subsea lava flow would contribute heat flow at a rate of 922w/m^2?

I believe you are in error by several orders of magnitude.

Steve Keohane
Reply to  Jeff L
December 7, 2022 5:37 am

Using the numbers from my 1971 CRC handbook, I came up with 0.08 W/m^2, pretty close.

ResourceGuy
December 6, 2022 6:35 pm

The AMO is headed down in cyclical form and is pulling the global number with it. More to come, if we’re allowed to think in nonlinear terms.

Ozonebust