UAH Global Temperature Update for January, 2022: +0.03 deg. C.

From Dr. Roy Spencer’s Weather Blog

February 2nd, 2022 by Roy W. Spencer, Ph.D.

The Version 6.0 global average lower tropospheric temperature (LT) anomaly for January, 2022 was +0.03 deg. C, down from the December, 2021 value of +0.21 deg. C.

The linear warming trend since January, 1979 now stands at +0.13 C/decade (+0.12 C/decade over the global-averaged oceans, and +0.18 C/decade over global-averaged land).

Various regional LT departures from the 30-year (1991-2020) average for the last 13 months are:

YEAR MO GLOBE NHEM. SHEM. TROPIC USA48 ARCTIC AUST 
2021 01 0.12 0.34 -0.09 -0.08 0.36 0.50 -0.52
2021 02 0.20 0.32 0.08 -0.14 -0.65 0.07 -0.27
2021 03 -0.01 0.13 -0.14 -0.29 0.59 -0.78 -0.79
2021 04 -0.05 0.05 -0.15 -0.28 -0.02 0.02 0.29
2021 05 0.08 0.14 0.03 0.06 -0.41 -0.04 0.02
2021 06 -0.01 0.31 -0.32 -0.14 1.44 0.63 -0.76
2021 07 0.20 0.33 0.07 0.13 0.58 0.43 0.80
2021 08 0.17 0.27 0.08 0.07 0.33 0.83 -0.02
2021 09 0.25 0.18 0.33 0.09 0.67 0.02 0.37
2021 10 0.37 0.46 0.27 0.33 0.84 0.63 0.06
2021 11 0.08 0.11 0.06 0.14 0.50 -0.42 -0.29
2021 12 0.21 0.27 0.15 0.03 1.63 0.01 -0.06
2022 01 0.03 0.06 0.00 -0.24 -0.13 0.68 0.09
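For readers who want to work with these numbers, here is a minimal Python sketch (not UAH's own code) that parses the 13-month table above and checks a couple of summary values:

```python
# Parse the whitespace table above; columns are YEAR, MO, then the
# regional anomaly columns (GLOBE first) in deg C.
table = """\
2021 01 0.12 0.34 -0.09 -0.08 0.36 0.50 -0.52
2021 02 0.20 0.32 0.08 -0.14 -0.65 0.07 -0.27
2021 03 -0.01 0.13 -0.14 -0.29 0.59 -0.78 -0.79
2021 04 -0.05 0.05 -0.15 -0.28 -0.02 0.02 0.29
2021 05 0.08 0.14 0.03 0.06 -0.41 -0.04 0.02
2021 06 -0.01 0.31 -0.32 -0.14 1.44 0.63 -0.76
2021 07 0.20 0.33 0.07 0.13 0.58 0.43 0.80
2021 08 0.17 0.27 0.08 0.07 0.33 0.83 -0.02
2021 09 0.25 0.18 0.33 0.09 0.67 0.02 0.37
2021 10 0.37 0.46 0.27 0.33 0.84 0.63 0.06
2021 11 0.08 0.11 0.06 0.14 0.50 -0.42 -0.29
2021 12 0.21 0.27 0.15 0.03 1.63 0.01 -0.06
2022 01 0.03 0.06 0.00 -0.24 -0.13 0.68 0.09
"""

rows = [line.split() for line in table.strip().splitlines()]
globe = {(int(r[0]), int(r[1])): float(r[2]) for r in rows}

# January 2022 global anomaly, and the 2021 calendar-year mean
jan_2022 = globe[(2022, 1)]
mean_2021 = sum(v for (y, m), v in globe.items() if y == 2021) / 12
print(jan_2022, round(mean_2021, 3))  # 0.03 0.134
```

The 2021 calendar-year mean of the GLOBE column works out to about +0.13 deg C, consistent with the long-term trend figures quoted above.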

The full UAH Global Temperature Report, along with the LT global gridpoint anomaly image for January, 2022 should be available within the next several days here.

The global and regional monthly anomalies for the various atmospheric layers we monitor should be available in the next few days at the following locations:

Lower Troposphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt
Mid-Troposphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tmt/uahncdc_mt_6.0.txt
Tropopause: http://vortex.nsstc.uah.edu/data/msu/v6.0/ttp/uahncdc_tp_6.0.txt
Lower Stratosphere: http://vortex.nsstc.uah.edu/data/msu/v6.0/tls/uahncdc_ls_6.0.txt
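The linked files are plain whitespace tables (year, month, then regional anomaly columns), so a trend like the +0.13 C/decade quoted above can be recomputed with an ordinary least-squares fit. A minimal sketch, not the official UAH processing, demonstrated on a synthetic series rather than the real file:

```python
# Ordinary least-squares trend in deg C per decade from
# (decimal_year, anomaly) pairs.
def decadal_trend(years, anoms):
    n = len(years)
    my = sum(years) / n
    ma = sum(anoms) / n
    slope = (sum((y - my) * (a - ma) for y, a in zip(years, anoms))
             / sum((y - my) ** 2 for y in years))
    return 10.0 * slope  # per year -> per decade

# Synthetic check: a monthly series warming at exactly 0.013 deg C/year
t = [1979 + i / 12 for i in range(516)]
a = [0.013 * (y - 1979) for y in t]
print(round(decadal_trend(t, a), 3))  # 0.13
```

On the real LT file one would read the Year, Mo, and Globe columns into the two lists before calling the function.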

261 Comments
Greg61
February 3, 2022 6:09 pm

We’re all going to die. Yawn

Bob Tisdale (@bobtisdale)
Editor
Reply to  Greg61
February 3, 2022 6:13 pm

You beat me to it, Greg61. Though I would’ve begun with “Oh no, +0.03 deg C” to reinforce the absurdity.

Regards,
Bob

Eric Simpson
Reply to  Greg61
February 3, 2022 7:27 pm

We’re all going to die because over the last 24 years we’ve had .. 1/10th of a degree of warming.

The 2017 peak is 1/10th of a degree higher than the 1998 peak. And the 13-month avg in 2017 is just ~ .06 degrees hotter than ’98.

At some point they’ve got to fish or cut bait, or whatever the saying is, because their constant out of control high-pitched fear-mongering is getting obviously ridiculous.

Last edited 3 months ago by Eric Simpson
Alan the Brit
Reply to  Eric Simpson
February 4, 2022 12:20 am

It’s much more frightening than that. As I’ve said before, the average global temperature has risen a massive, terrifying, unprecedented 1.1 degrees over the last 150 years, making that a staggering 1/7000th of a degree Celsius year on year!!! We’re all doomed!!! 😉

Ozonebust
Reply to  Alan the Brit
February 4, 2022 12:54 pm

Alan
Remind me again when the little ice age ended.

Leonard Weinstein
Reply to  Alan the Brit
February 6, 2022 3:06 pm

The number is 0.007 degrees C/year which is 1/136th not 1/7000.

meiggs
Reply to  Eric Simpson
February 4, 2022 1:59 pm

Does not matter they have a voice and you don’t

ResourceGuy
Reply to  Greg61
February 4, 2022 6:14 am

The attacks on UAH data and methods should begin in earnest in a few days with Biden and Kerry in agreement with lots of press coverage. Special teams of climate communicators are being activated at this moment.

observa
Reply to  Greg61
February 4, 2022 6:50 am

Have we reached 12 o’clock on the doomsday clock yet? These matchsticks under the eyelids are getting a tad uncomfortable, and I’m getting lockjaw.

ResourceGuy
Reply to  observa
February 4, 2022 8:46 am

The clock is on pause during the Biden years despite war in Europe and Asia.

Rick K
February 3, 2022 6:20 pm

Hmmm… lower than 1988 when it all started. Interesting.

Derg
Reply to  Rick K
February 3, 2022 7:05 pm

As warm as the 20s?

Mike
Reply to  Derg
February 3, 2022 10:39 pm

It was warmer when Julius Caesar got stabbed in the back. The climate was real nice back then

Simon
Reply to  Derg
February 4, 2022 12:07 pm

That’s clever of you Derg. I didn’t know Dr Spencer’s UAH work went back to the 20’s.

tygrus
Reply to  Rick K
February 3, 2022 7:36 pm

The 0.03C is not absolute, not based on 100yrs ago, but is relative to a recent warm period, 1991-2020. UAH reports a trend of <0.14C/10yrs (previous months) & >0.13C/10yrs (this month) over ~43yrs = ~0.57C, which does appear to be significant.
If we are in the bottom of a cycle, the current temperature is about 0.35C to 0.45C warmer than the bottom of ~1989, & ~1993. While this could represent a slight warming trend (0.13C/10yrs), we don’t have the historical data using the same satellites & sensors before late 1978. There are many positive & negative drivers of climate so +/-0.5C natural variation cannot be ignored. But this isn’t a hockey stick to bash us over the head with.

Any cooling will now be blamed on Volcanoes, nature & our non-CO2 pollution to somehow “adjust” their climate models to suit the actual observations. I find it strange that some people are so quick at blaming warming on humans (not nature) but they think any cooling is a natural anomaly or wrong data (never blaming CO2).

Many in North America & Europe don’t want it to be any colder than the coldest days/nights they’ve had over the past 18 months. Cities frozen, electricity grids failing due to cold storms, vineyards needing heaters to stop frost destroying crops, lizards falling out of trees stone cold, ships stuck in ice. Individual weather events don’t represent the climate averages, but it’s very hard to have significant warming with so much cooling over long periods of time.

Bindidon
Reply to  tygrus
February 4, 2022 9:36 am

You write:

1) ” Any cooling will now be blamed on Volcanoes… ”

Firstly, let me add La Niña. Is that not a well known cause for cooling?

Secondly: on what else than volcanoes would you blame the increases in the lower stratosphere and the decreases in the lower troposphere which happened together in 1982 and 1991?

[image]

The cooling generated by El Chichon was so heavy that it completely erased the 1982 El Niño out of UAH’s data – despite the fact that this 1982 event is among the strongest ENSO signals of the last 100 years.

[image]

*
2) ” Individual weather events don’t represent the climate averages but it’s very hard to have significant warming with so much cooling over long periods of time. ”

Very interesting to read, especially for me living in Northeastern Germany.

Our last winter deserving the name was in 2010. Since then, a few cm of snow, and finito.

Even in 2021/22! A year which was, for us, much cooler than those before. Since mid November, the coldest nights have been at -5 °C, apart from a single night’s drop to -12 °C on Dec 26!

Shall I mention

  • the lawn still growing in December last year
  • flowers that bloom far too early, as they have for years
  • migratory birds more and more staying here or returning unexpectedly early…

*
I have more and more the impression that on this blog (which I’ve read for over 10 years), cold weather patterns get quickly identified with naturally cooling climate, whereas hints of warming patterns lasting over longer times are automatically downgraded to… weather.

*
Do we have significant warming? No sé (I don’t know).

I made my own evaluation of UAH’s grid cell trend data

[image]

because I wanted to compare it with their original graph.

We see that apart from a small hot spot near the South Pole, everything happens above 30 °N.

Doesn’t look horribly dramatic, but I’m not a specialist in the domain.

Bindidon
Reply to  Bindidon
February 4, 2022 9:44 am

Source for UAH grid cell trends

https://www.nsstc.uah.edu/data/msu/v6.0/tlt/

Original graph

[image]

Jtom
Reply to  Bindidon
February 4, 2022 1:20 pm

Out of curiosity, I went on AccuWeather to check out what this past December and January looked like for Germany. I selected the city of Kassel, close to the center of the country. That site shows that the actual measured temperatures in that city were well below normal.

Do you claim that site is wrong, or that the center of the country is not representative?
https://www.accuweather.com/en/de/kassel/34117/january-weather/168717?year=2022

Change the month to December and year to 2021 to see that month’s data.

Bindidon
Reply to  Jtom
February 4, 2022 1:44 pm

Jtom

I repeat:

” Since mid November, coldest nights at -5 °C, apart from one single night drop down to -12 °C on Dec 26! ”

Here are, for the period mid Oct 21 till end Jan 22, the night temperatures for the corner where I live

[image]

and here are those near Kassel

[image]

Similar temperatures. But I don’t know what is usual in Kassel.

Apart from that drop, it’s a lot above normal in comparison with what we had 30 years ago (as said, 2010 was our last real winter).

I was not discussing what X or Y considers to be above or below the norm for places A or B.

Bindidon
Reply to  Jtom
February 4, 2022 10:25 pm

With the two graphs added below, I hope you better understand how things look here in comparison with the past, and why ‘well below normal’ sounds a bit academic to me; though a guy who generates anomalies himself out of absolute data, wrt some reference period, knows exactly what it means.

GHCN daily stations in Berlin, Germany

1881-2021

[image]

1971-2021

[image]

*
Here is a sorted list of the monthly anomalies since 1951

1956 2: -11.04
1963 1: -9.26
1986 2: -9.12
1987 1: -8.18
1954 2: -7.30
1963 2: -7.30
1969 12: -7.14
2010 12: -6.66
1985 1: -5.94
2013 3: -5.33
2010 1: -5.31
1970 1: -5.30
2006 1: -4.81
1980 1: -4.59
1985 2: -4.45

At position 81

2021 4: -2.39 (yes, April!)
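A coldest-first ranking like the list above is straightforward to reproduce once the monthly anomalies are in hand; a small sketch, repeating only a few of the values shown (the full station data is not reproduced here):

```python
# (year, month, anomaly in deg C) tuples, a subset of the list above
anoms = [
    (1956, 2, -11.04),
    (2010, 12, -6.66),
    (1963, 1, -9.26),
    (2021, 4, -2.39),
]

# Sort ascending by anomaly: coldest month first
ranked = sorted(anoms, key=lambda r: r[2])
print(ranked[0])  # (1956, 2, -11.04)
```

Position in the full sorted list (e.g. April 2021 at rank 81) would just be `ranked.index(...) + 1` over the complete series.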

Climate believer
Reply to  Bindidon
February 4, 2022 2:15 pm

Rye grass needs ~6°C, but hardier meadow grasses and fescues will often grow slowly at ~4°C.

Migration of birds is an evolutionary adaptation to increase survival rates, birds have been adapting for a long time against climate, weather and food shortages.

The DWD graph does confirm your anecdotal evidence that the average temp for January 2022 in Potsdam, Brandenburg (if that’s about where you are) is a bit warmer than the 1981-2010 average.

Spring, however, seems to show it’s doing the opposite, and that’s when CO2 is increasing in the atmosphere.

If the winter of 2010 is your reference for a real winter (average T -1.95°C) then you would have to go back to 1996 to find a colder one.

The average January temperature did not drop below -2°C between 1895 and 1924, nearly 30 years; I’m presuming that was natural.

[image: Potsdam monthly mean.png]
Bindidon
Reply to  Climate believer
February 4, 2022 10:36 pm

Thanks for trying to teach me about how weather and climate behave here where I live!

When I write ‘as said, 2010 was our last real winter’, I mean that as it should be understood by normal people.

FYI, I was born in 1951, and perfectly recall February 1956, January 1963, etc etc.

A look at my second reply to Jtom might help you much more than you did ‘help’ me.

I have hourly DWD and METEOSTAT data on disk, thanks.

*
Could you spare me your CO2 blah blah? I’m not at all interested in such a poor, superficial discussion concerning that stuff.

Climate believer
Reply to  Bindidon
February 5, 2022 1:57 am

I’m not trying to teach you anything, I was just verifying with data what you were saying, is that verboten in Bindidon world?

How incredibly arrogant, but seeing the way you reply to people I can see arrogance is one of your personality traits.

Just remember one thing: this comment section is for everybody, and maybe other people might learn something. This isn’t all about you; get over yourself.

As for 2010, it is you that categorized it as “real winter”, a term very subjective and totally unscientific, I just put a figure to it so I could compare with other years.

I presume this video of heavy snow falling in Berlin in January 2017 is also not considered a “real winter” by normal people?

https://www.youtube.com/watch?v=afx2r12IhTg

Last February 2021 was probably not a “real winter” either according to the all knowing Bindidon.

https://www.dw.com/en/heavy-snowfall-paralyzes-northern-and-central-germany/a-56483641

Bindidon
Reply to  Climate believer
February 5, 2022 3:13 am

Of course: this site is for everybody.

With the difference that I would never write such teachy stuff as

Rye grass needs ~6°C,but hardier meadow grasses and fescues will often grow slowly at ~4°C.

Migration of birds is an evolutionary adaptation to increase survival rates, birds have been adapting for a long time against climate, weather and food shortages.

Jesus.

As for Jan 17 and Feb 21, Climbel: that was nothing in comparison with Feb 10, with respect to the corner where I live.

It’s good not to look through the wrong end of the telescope.

And by the way, I’m old enough to know the difference between a winter when I have to scoop snow off the sidewalk in front of our house with a big shovel every day, and a winter when I occasionally shove a few chunks of snow onto the street.

You can call me arrogant as long as you want. I’m immune to such claims.

Climate believer
Reply to  Bindidon
February 5, 2022 5:47 am

Yes, arrogant people usually are.

Bellman
Reply to  Rick K
February 4, 2022 4:28 am

January 1988 was 0.00°C.

aussiecol
Reply to  Bellman
February 4, 2022 5:24 am

LOL

Carlo, Monte
Reply to  aussiecol
February 4, 2022 8:04 am

So according to UAH and bellcurveman, the world temperature has risen by 0.03°C since 1988.

HORRORS! RUN AWAY!

Bellman
Reply to  Carlo, Monte
February 4, 2022 3:54 pm

Yes, that’s exactly what the UAH is saying. If you ignore all other months and compare an average cool January in recent years with one of the warmest Januaries from 33 years ago, there has only been 0.03°C of global warming. I suggest you alert the scientific community straight away with this vital information.

Bindidon
Reply to  Bellman
February 4, 2022 10:51 am

Oh Noes…

” So according to UAH and bellcurveman, the world temperature has risen by 0.03°C since 1988. ”

That is the typical nonsense produced all the time by the Monte-Carlo genius…

No, Monte-Carlo: the temperature did not rise by 0.03 °C.

Every child above 7 would understand that a warming is not determined by simply, trivially drawing a line above a time series’ plot.

*
According to the data published by UAH

https://www.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt

including the newest data for January 2022, the linear trend for January 1988 – January 2022 is: 0.14 (± 0.01) °C per decade.

That means that, according to UAH, the global temperature has increased by 0.476 °C, i.e. a bit less than 0.5 °C, in these 34 years.
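The arithmetic here is just the trend multiplied by the elapsed time; a one-line check:

```python
# 0.14 deg C/decade sustained over the 34 years Jan 1988 - Jan 2022
trend_per_decade = 0.14
years = 34
warming = trend_per_decade * years / 10.0
print(round(warming, 3))  # 0.476
```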

Nothing to hurl about! But… it is as it is, and not as you ‘see’ it.

Simon
Reply to  Bindidon
February 4, 2022 12:20 pm

Every child above 7 would understand that a warming is not determined by simply, trivially drawing a line above a time series’ plot.”
I think I see the problem here….

Matt Kiro
Reply to  Bindidon
February 4, 2022 3:48 pm

“Every child above 7 would understand that a warming is not determined by simply, trivially drawing a line above a time series’ plot.”

Except that is exactly what they are teaching 7 yr olds. See this increasing rate? That is never going to stop!!

Simon
Reply to  Matt Kiro
February 4, 2022 6:19 pm

Umm no they are teaching them “if we act responsibly …. it will stop.”

Deon Botha-Richards
Reply to  Simon
February 5, 2022 3:42 am

Except it won’t.

Firstly not all CO2 is man made. In fact only a small percentage of the increase is man made. https://wattsupwiththat.com/2022/01/16/how-much-manmade-co2-is-in-the-atmosphere-really/

Secondly, even if we achieved the full Paris Accord targets, the effect on the 2100 increase in temperature would be a mere 0.1° less than if we follow RCP8.5 (seriously worst case scenario) model projections. Or put another way, we can have practically zero impact on future temperature regardless of what we do, and thus teaching 7 year olds that we can is at best misleading and more accurately utter lies.

beng135
Reply to  Simon
February 6, 2022 9:38 am

Oh right, simple-simon-says. The temperature depends on how we act. Unless we do what we’re told by the gaia worshippers, Gaia will punish us by turning up the thermostat (which doesn’t make sense for punishment because warmer has always been better). Time to cancel a few virgins into the volcano?

Scissor
February 3, 2022 6:23 pm

I just knew it. When I was clearing snow for my garbage bin this morning @ -12 F, it felt at least a tenth of a degree cooler.

Alan Robertson
Reply to  Scissor
February 3, 2022 8:07 pm

Oklahoma City just broke two 100-year records for snowfall for the date on Feb. 2 and Feb. 3.
Last year in February, the city had a debilitating storm and many lost power for days, which was unfortunate as just the previous November, we had a terrible storm which brought the city to a standstill, with many streets impassable and many people without power (I went 18 days w/out electricity).
In the past dozen or so years, Oklahoma has experienced all-time record cold temps, probably the worst blizzard on record (Christmas 2009?) and many cold weather events.
All of this is just weather, of course and shouldn’t detract from the fact that we’re destroying the planet by trying to stay warm. And breathing.
Maybe it’s our Great Grand kids who just won’t know what snow is.

Scissor
Reply to  Alan Robertson
February 3, 2022 8:14 pm

Staying warm and breathing are good things and beat the alternatives.

Bindidon
Reply to  Alan Robertson
February 4, 2022 11:12 am

Yeah.

Indeed, it’s colder and colder in North America.

And in Europe, it’s getting warmer and warmer.

I haven’t collected GHCN raw daily data for Europe in a while, but here is a rather typical example – Germany:

[image]

And yes: Bavarian and Austrian Great Grand Kids very possibly still will know in 30 years what snow is.

But if, even after such a cool year as 2021, the following winter shows nearly no snow around Berlin in Northern Germany right up to now, will ours still know it?

No sé (who knows)!

rbabcock
Reply to  Bindidon
February 4, 2022 1:34 pm

I wouldn’t get too fixated on the rise in temps in Europe over the past few decades. It’s all dependent on what’s happening in the North Atlantic and the Gulf Stream feeding it. The AMO has been warm and has been declining for over a decade now. Based on its long history, it will continue down and the subsequent land temperatures will drop along with it. Nothing happens fast in the oceans, but changes do happen.

beng135
Reply to  Bindidon
February 6, 2022 9:48 am

Bindi, your graphs are just scaring the heck out of us. Oh, what we will ever do? Can anyone help us now?

ex-KaliforniaKook
Reply to  Alan Robertson
February 4, 2022 6:47 pm

We are all so different. I moved to Nevada to experience cold weather. I love it. My friends in So Cal and Scottsdale AZ HATE cold, and REALLY HATE snow, which I enjoy shoveling. Took me two weeks to clear the driveway.

December broke the Reno snowfall record. Lovely. January broke the low precipitation record (ZERO precipitation). My heart would be broken, except the temps have stayed low enough we still have some 2′ drifts around the house and ice on the rear deck (mountain top living).

Of course, I worry that propane will eventually be outlawed. It’s up over one dollar (US) from a year ago. My whole house generator will be worthless, and by then we’ll be largely using renewables in NV (i.e. we’ll be freezing half to death for weeks at a time).

We’re all going to die, so we’re good with it!

Stephen Wilde
February 3, 2022 6:34 pm

A good chance of a drop to the negative in a month or so.
Then we have to wait a few years to see if this leads to the first downward step in temperatures since the AGW farrago began.
There have been two upward steps since the mid 20th century cooling period.
Such steps up and down being a result of solar induced cloudiness changes which cause alterations in the proportion of solar energy able to enter the oceans.

Reply to  Stephen Wilde
February 3, 2022 6:49 pm

I notice that the new 30 year average makes it easier to drop below the zero line. My comments are best considered in relation to a 60 year period.

Dave Fair
Reply to  Stephen Wilde
February 4, 2022 9:33 am

Any period of less than about 70 years simply reflects part of a cycle. Over most of the last 43 years we have been in an upward trend of the approximately 70-year temperature cycle. Additionally, a 0.13 C/decade trend on the warm side of the cycle is unalarming. Count me unimpressed with such a small positive temperature trend in the face of a steadily increasing trend of atmospheric CO2 concentrations. I assume that would put a pretty firm upper bound on the number for the Transient Climate Response (TCR) range of estimates.

A reduced rate of warming seems to be at least part of the reason for “CAGW” morphing into “Climate Disaster” under the CliSciFi regime. There is always bad weather somewhere around the globe with which to stampede the sheeple towards the socialist nirvana du jour. The only limitation on warmunists’ desire to control is OPM. Individual Western nations are nearing that limit to a lesser or greater extent. Let’s Go Brandon!

herb
Reply to  Stephen Wilde
February 4, 2022 4:58 am

Just here, they said it was one of the cloudiest winters on record.

stinkerp
February 3, 2022 7:05 pm

Thank you Dr. Spencer for your extraordinary work to produce a long term global temperature measurement so we can all see with our own eyes what’s happening. The UAH data is by far the best resource we have for refuting the hyperbolic claims of the alarmists. We get excited about big downturns but it doesn’t mean much other than to remind us that natural variation dominates. It’s the long term trend that matters.

The trend in the next two decades will be interesting to watch. It’ll have to increase dramatically to catch up with model projections. The longer it remains moderate, the more steeply it will have to accelerate to match the models. At what point do the alarmists admit that the models are wrong? Those more grounded in science are already quietly ignoring RCP8.5, saying its projections of CO2 are unrealistically high. Which is really funny and a disingenuous deflection, because atmospheric CO2 continues to increase in line with RCP8.5 but temperatures and sea level rise are stubbornly refusing to go along with its predictions, and they know it, meaning the models are wrong for every CO2 scenario (8.5, 6.0, 4.5, 2.6) and always have been. You would think, being scientists, that they would revise their models, but the dogma is strong in them and they still cling to their invalidated assumptions about CO2 radiative forcing because catastrophic warming just has to be true, gosh darn it.

Stephen W
Reply to  stinkerp
February 3, 2022 9:14 pm

The models are wrong.
They should be adjusted continually.

I could model a forecast for tomorrow, and as new data came in I would continually adjust the model.
I would never stick with the original forecast I made a day ago when new data refutes it.

Who are these so called modellers that aren’t adjusting their models on a regular basis?

guidoLaMoto
Reply to  Stephen W
February 4, 2022 2:47 am

Scientific theories should not be judged as right or wrong, but useful or not useful. Newton’s gravity is wrong compared to Einstein’s, but more useful….From the meteorology/climatology standpoint, AGW is not useful. From the political POV, it has proven to be quite useful.

Juan Slayton
Reply to  guidoLaMoto
February 4, 2022 6:36 am

Scientific theories should not be judged as right or wrong, but useful or not useful.
Been reading William James, huh? : > )
But utilitarianism is not an adequate epistemology for everyday life, much less for empirical investigation.

Tom
Reply to  Stephen W
February 4, 2022 4:42 am

The CATASTROPHIC Anthropogenic Global Warming gravy train is supported by billions of dollars of mostly government funding. The only evidence for this comes from the models. They simply can’t be adjusted.

Matt Kiro
Reply to  Stephen W
February 4, 2022 4:12 pm

Oh they are revising their models. But until they stop using CO2 as the main control of temperature they will continue to be wrong.

Tom Abbott
Reply to  stinkerp
February 4, 2022 4:46 am

“At what point do the alarmists admit that the models are wrong? Those more grounded in science are already quietly ignoring RCP8.5, saying its projections of CO2 are unrealistically high. Which is really funny and a disingenuous deflection because atmospheric CO2 continues to increase in line with RCP8.5 but temperatures and sea level rise are stubbornly refusing to go along with its predictions and they know it, meaning the models are wrong for every CO2 scenario (8.5, 6.0, 4.5, 2.6) and always have been.”

Excellent question.

CO2 does continue to increase in line with RCP8.5, but the temperatures are currently cooling.

The alarmists are hoping the warming will continue. We shall see. Let UAH be our guide.

Carlo, Monte
Reply to  Tom Abbott
February 4, 2022 8:05 am

The alarmists are hoping the warming will continue.

Exactly, they will do or say anything to keep the curves afloat.

Doonman
Reply to  Tom Abbott
February 4, 2022 11:05 am

Remember, it was Phil Jones that stated there would need to be a pause in warming for 15 years “before they became worried”.

Worried about what? That their gravy train would end?

It is impossible to be worried about “climate tipping points” and the lack of “climate tipping points” simultaneously.

Jtom
Reply to  stinkerp
February 4, 2022 1:36 pm

It looks like the UK may be the first to see all this defeated. They have been subjected to a variety of onerous rules and regulations to counter Covid, all based on doomsday models projecting massive deaths. The models have been wrong at every step in modeling all the mutations of the virus. On top of that, those onerous rules were completely ignored by those in power.

Now they are faced with massive new taxes and energy costs because of policies based on yet more models. And they realize that those in power won’t be the least inconvenienced by the higher costs. It’s nice to be able to vote yourself a raise.

The people are not happy. I am waiting to see if a Trump-like politician steps up and offers saner solutions than those being pushed by the Labour Party and Carrie’s, sorry, Boris’s Tory Party.

The experts may never admit their models were wrong, but the people know it.

Deon Botha-Richards
Reply to  stinkerp
February 5, 2022 3:56 am

All the predictions, and the alarmist reactions to current weather events extrapolated to future climate, assume RCP8.5. Reality is rather different. Despite all the claims of business as usual being RCP8.5, it is in fact RCP 2.6. The original target and best case scenario.

That’s not a fact that can be disclosed by activists and politicians agitating for extreme change, because it would negate any changes. So to solve the problem the IPCC has now created a lower scenario so that they can make a new target to justify reductions in CO2 output.

SAMURAI
February 3, 2022 7:24 pm

Factoring in the 5~6 month lag between La Niña cooling and UAH 6 global temp anomaly cooling, March’s temp anomaly will likely be around -0.1C~-0.2C and will mark the end of the current La Niña cycle.

UAH6 temp anomalies will increase after March as a new El Niño cycle develops later this year, and will generally trend upwards well through to the middle of 2023.

It’s very difficult to predict how strong the developing El Niño cycle will be, but given that the 30-year PDO cool cycle seems to have already started, it’s likely to be another moderate El Niño cycle.

There is a pretty good chance that the next La Niña cycle starting from around the end of 2023 will be a strong one as we haven’t had a strong La Niña cycle since 2010, and there is usually at least one strong La Niña every 10 years, so we’re “overdue”…. (Yeah, I know that’s not how statistics work, but still..)
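The lag between ENSO and the UAH anomalies described above can be estimated from data by finding the shift that maximizes the correlation between an ENSO index and the temperature series. A self-contained sketch on synthetic series (the 5-month lag is built in for the check, not derived from real data):

```python
import math

# Pearson correlation of two equal-length series
def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

# Lag (in months) at which the index best leads the temperatures:
# correlate temps[t] against index[t - lag] for each candidate lag
def best_lag(index, temps, max_lag=12):
    scores = {}
    for lag in range(max_lag + 1):
        scores[lag] = corr(index[: len(index) - lag], temps[lag:])
    return max(scores, key=scores.get)

# Synthetic stand-ins: temps follow the ENSO index with a 5-month lag
enso = [math.sin(i / 7.0) for i in range(120)]
index = enso[5:]
temps = [0.5 * enso[t - 5] for t in range(5, 120)]
print(best_lag(index, temps))  # 5
```

On real data one would use a monthly ONI or MEI series as the index and the UAH global LT anomalies as the temperature series.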

Bindidon
Reply to  SAMURAI
February 4, 2022 3:55 am

Holá SAMURAI-san,

This is a thoroughly modernized discourse you are offering us here.
Moderate appreciation of ENSO, no longer GSM blah blah…

Much appreciated.

bdgwx
Reply to  SAMURAI
February 4, 2022 6:57 am

That’s a pretty substantial reduction from your previous predictions in which you said -0.3 C on the 1981-2010 baseline [1] which is -0.44 C on the new 1991-2020 baseline. Why the reduction?
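For anyone following the baseline conversion here: switching reference periods just shifts every anomaly by a constant offset (the difference between the two baseline means). A sketch with a hypothetical helper (not a UAH function), taking the roughly +0.14 °C offset implied by bdgwx's figures as given:

```python
# Re-anchor an anomaly to a new reference period by subtracting the
# baseline offset; a warmer new baseline lowers the anomaly.
def rebase(anom_old, offset):
    return anom_old - offset

# -0.30 C on the 1981-2010 baseline, with an assumed +0.14 C offset
# to the warmer 1991-2020 baseline
print(round(rebase(-0.30, 0.14), 2))  # -0.44
```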

rbabcock
Reply to  SAMURAI
February 4, 2022 1:40 pm

Don’t be so sure the La Niña will disappear too quickly. The Australian BOM tracks the tropical Pacific waters to depth and it’s still pretty cold out to 160W. Maybe it lasts at least until fall before it moves to La Nada. http://www.bom.gov.au/climate/enso/#tabs=Pacific-Ocean&pacific=Sea-sub%E2%80%93surface

SAMURAI
Reply to  rbabcock
February 5, 2022 8:33 am

rbabcock-san:

According to Pacific equatorial sub-surface ocean temperature anomalies, the current La Niña cycle is quickly dissipating and warm deep-water currents are quickly moving east, as occurs when a new El Niño cycle is developing:

(Page12 of attached ENSO Report)

https://www.cpc.ncep.noaa.gov/products/analysis_monitoring/lanina/enso_evolution-status-fcsts-web.pdf

“She’s dead, Jim.”

rbabcock
Reply to  SAMURAI
February 5, 2022 4:10 pm

Did you not even read through the report? Page 25: The CFS.v2 ensemble mean (black dashed line) predicts La Niña to continue into autumn 2022. Then to La Nada. An El Niño is probably a year away and current indications say mild if it develops.

Jim Gorman
Reply to  rbabcock
February 6, 2022 3:53 am

Just too easy to cherry pick statements that agree with one’s confirmation bias.

Richard M
February 3, 2022 8:10 pm

We are still feeling the effects of the cloud changes seen over the past 25 years. They have thinned allowing in more short wave (aka solar) energy. This energy has warmed the oceans and oceans continue to release it slowly back into the atmosphere. Until the clouds thicken back up we will not see any long term cooling.

BTW, looking at the atmosphere as a multi-layer onion provides insight as to why CO2 increases cannot provide more warming.

When you add CO2 it radiates more energy in all directions. That includes upward. If you believe downwelling energy warms the Earth, then the increased upwelling energy should cool the planet.

But neither is true because the amount of upward energy flux is independent of the amount of CO2. The actual amount of energy flow is based on the total energy available and is moderated by the gravitational field. Since neither of these changes with added CO2, you get no warming.

To understand this think of 3 layers of the atmosphere constantly absorbing and radiating energy through CO2 molecules.

The middle layer is radiating more energy than the upper layer because it is denser. It is radiating less energy than the bottom layer because it is less dense. Same is true for absorption.

Hence, the middle layer absorbs more energy from the layer below than from the layer above. However it radiates equally in both directions. This means there’s a net positive movement upward and a net negative change downward.

This is how energy moves through the atmosphere.

If you replace some O2 with CO2, you get more absorption and radiation in all 3 layers. However, the difference between the layers is unchanged, since that difference is driven by density, which is unchanged. The flow remains the same.

Even though more energy is radiated downward from any given layer, that is met with an even larger flow upward from the layer below it. The difference is based on the difference in density.

This is not at all simple. It takes most people some time to grasp what is going on. It helps to first consider what would happen without gravity. Then all the flows are equal and you quickly realize there would be no energy movement at all when you increase CO2.

I realize the actual flow in the atmosphere is not this well constrained. Statistically, all those other complexities vanish.

Finally, it is rather amazing that the upward movement of energy through the atmosphere is driven by gravity. Not at all intuitive. While there exists downwelling IR, the net energy flux is always upward. Hence, it cannot warm the planet.

Bindidon
Reply to  Richard M
February 4, 2022 3:51 am

” When you add CO2 it radiates more energy in all directions. ”

Typical nonsense.

If CO2 (and yeah! in the very first position, water vapor) molecules weren’t present in the atmosphere, then all the terrestrial IR would immediately radiate to space, instead of being caught by these nice little molecules.

And where CO2 catches IR (at up to 50 km altitude), this IR very certainly is not due to heat generated by convection!

What now concerns the real effect of that half of IR reradiated down to surface by H2O and CO2… hmmmh. That’s a myth imho.

Richard M
Reply to  Bindidon
February 4, 2022 6:02 am

Denial of science is common among the climate cult. Are you claiming that Kirchhoff’s Law does not apply to CO2? It definitely appears that way.

The half of the energy radiated down is always less than the amount radiated upward, due to the density differences I described. That means the net flow is always upward and ends up in space.

There is one small error in my description which I insert to see if the person reading it makes any attempt to understand this science. You failed.

Bob boder
Reply to  Bindidon
February 4, 2022 6:09 am

Bindidon
With no GHGs, how does the atmosphere cool? It will warm through conduction with the surface, but how does it cool?

Jim Gorman
Reply to  Bindidon
February 4, 2022 8:07 am

That “half” you talk about ignores the actual phenomenon that is occurring. Describing how it actually proceeds requires calculus, showing how the diffusion of the radiation occurs. It is not a simple half-goes-up-and-half-goes-down. The lapse rate and the change of density with altitude determine how much actually goes down and how much goes up!

Bindidon
Reply to  Bindidon
February 4, 2022 10:28 am

It seems to me that none of the people who replied understood what I wrote:

” What now concerns the real effect of that half of IR reradiated down to surface by H2O and CO2… hmmmh. That’s a myth imho.

Every 12-year-old child would have read that correctly.

The replies remind me of those Spanish corridas, with the toros looking only at the red muleta.

Thanks a lot, very good.

Richard M
Reply to  Bindidon
February 4, 2022 12:30 pm

Since you obviously did not understand the comment you responded to, it’s not surprising that you are out in left field. The comment and replies below answer all of your points.

The downwelling IR does not warm the surface due to the effects of the EBL. Once you realize that energy primarily remains in the atmosphere, then you only have a net upward flow of energy similar to a conduit with the flow driven by the differences in density as you move upward.

You are capable of understanding. It’s not intuitive, but it’s also not overly complicated once the main points are understood.

Bindidon
Reply to  Richard M
February 4, 2022 1:50 pm

For the very last time:

” ” What now concerns the real effect of that half of IR reradiated down to surface by H2O and CO2… hmmmh. That’s a myth imho.

Didn’t you understand that simple sentence, man?

I’m telling that I do not believe that downwelling radiation coming from reemission by H2O or CO2 has any warming effect.

Do you have problems understanding

“That’s a myth imho.” ???

Jesus, are you pseudoskeptics boring!

MarkW
Reply to  Richard M
February 4, 2022 7:38 am

CO2 doesn’t radiate more in all directions.
Consider this. If there is 1 unit of energy radiating upward and it is captured by a CO2 molecule, and half of that energy is then radiated downwards, there will only be half a unit left to radiate upwards. The total amount of energy being radiated upwards decreases while the amount being radiated downwards increases.

Richard M
Reply to  MarkW
February 4, 2022 10:23 am

I was simply referring to the random nature of CO2 radiation. In addition, once energy is absorbed it is no longer radiating upwards or downwards. In the case of CO2 the surface signal is extinguished within 10 meters. No more surface originated radiation going upwards. All you have is CO2 radiated energy going upward (downward) between layers as in my description. Study it closely, you will find that the net radiation flows upward based on the gravitational force.

But what about the back radiation from that first 10 meters of absorption? Doesn’t it warm the surface? It turns out to be irrelevant. Since it is within the equilibrium boundary layer (EBL) of the atmosphere, any imbalance is quickly removed via conduction. The energy that was supposed to warm the planet is returned to the atmosphere as part of massive equilibrium energy transfers.

Conduction may be slow across large distances, but those do not exist in the EBL. Conduction also utilizes all atmospheric gases, and kinetic collisions at the surface occur a thousand times faster than radiation events. As a result, any downwelling energy to the surface from within the EBL disturbs the equilibrium, and conduction works to restore it quickly.

This effectively means all we have are the layers I mentioned previously. Energy effectively starts flowing upward with slow losses to space as the density decreases.

I realize this destroys the belief that CO2 generates 3.7 W/m2 of warming. The 1.1 C warming that it could create almost entirely disappears. We now have an explanation for the results of Seim/Olsen 2020.

https://www.scirp.org/journal/paperinformation.aspx?paperid=99608

Jim Gorman
Reply to  Richard M
February 4, 2022 11:55 am

This is a complicated physical phenomenon that I don’t think people understand completely. This is an example only; don’t look at the numbers, just look at the concept. First, emissivity = absorptivity. If emissivity is based on temperature, then so is absorption of radiation. If you had two CO2 molecules, one at the surface at 300 K and one at 10,000 ft at 250 K (because of the lapse rate), the surface molecule would emit based upon (300 K)^4, but the one at 250 K could not absorb all that radiation. What happens to the rest? On out to space.

Now I’m not stupid. I know what happens is vastly more complicated because of continuous temperature change and continuous radiation values. They are not linear and require calculus to describe the actual gradients. That is one reason why radiation diagrams using linear algebra to describe what is going on are so much foofaraw.

Read carefully what Richard M has written. It is a good attempt at describing a complicated atmosphere.
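The two-molecule comparison in the comment above can be put into numbers with the Stefan-Boltzmann T⁴ relation (a sketch only: real CO2 molecules emit in discrete spectral bands, not as blackbodies, so the blackbody figures are purely illustrative of the temperature dependence being discussed):

```python
# Illustrative T^4 scaling for the two temperatures in the comment above.
# Real CO2 emission is band-limited, not blackbody; this only shows the
# temperature dependence being argued about.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(t_kelvin):
    """Total blackbody emission in W/m^2 at temperature t_kelvin."""
    return SIGMA * t_kelvin ** 4

surface = blackbody_flux(300.0)  # blackbody benchmark for the 300 K molecule
aloft = blackbody_flux(250.0)    # blackbody benchmark for the 250 K molecule
ratio = surface / aloft          # (300/250)^4

print(f"300 K: {surface:.1f} W/m^2")  # ~459.3
print(f"250 K: {aloft:.1f} W/m^2")    # ~221.5
print(f"ratio: {ratio:.2f}")          # ~2.07
```

The warm emitter radiates (300/250)⁴ ≈ 2.07 times as much, which is the gap the comment says cannot all be absorbed aloft.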

Tim Gorman
Reply to  MarkW
February 4, 2022 2:43 pm

Radiation is not a bullet. It is an electromagnetic signal that radiates in all directions, just like a radio signal from an isotropic antenna. That wave front is made up of energy which we quantify as “photons”. The energy in any specific direction of that wave front follows the inverse-square law. As the spherical wave front gets larger the energy contained in each incremental piece of the wave front gets less. Just like a radio signal gets weaker as you move away from the antenna.

I would also add this observation: if GHGs in the atmosphere intercept LWIR going up and then re-radiate it, why don’t the GHGs in the atmosphere intercept the part of the wave front headed downward toward the earth? It would seem logical that by the time that wave front reaches earth, a certain portion of it would have already been intercepted. And part of that back radiation would then again be sent away from earth, with only a fraction left to actually continue toward the earth. What will be left by the time that wave front reaches earth? If that original backward wave gets intercepted in the atmosphere 100 times and half of it is lost each time, then the part that reaches the earth will be X/2^100, vanishingly smaller than what started out.

My conclusion from this? The atmosphere gets heated much more than the earth does from back radiation. That really calls into question using atmospheric temperature as a proxy for surface temperature. It’s probably *not* a linear relationship, yet every attempt to depict radiation and the earth ignores this.

Anti_griff
Reply to  Richard M
February 4, 2022 8:01 am

Some people divide IR into IR-A, IR-B, and IR-C, with IR-A being the shorter IR wavelengths. They say that incoming IR from the sun does not match outgoing IR from the earth in frequency. This is supposed to be key to how there is a small net warming from IR. The CO2 allows X amount of IR to hit the earth’s surface, but only allows X-minus-a-small-amount to be radiated back to space, due to the differences in frequencies.

Jim Gorman
Reply to  Richard M
February 4, 2022 8:01 am

I am not sure about the effect of gravity on radiation, but it does affect the density and temperature, i.e., the lapse rate. As you move up, there are fewer and fewer molecules and they are also colder.

If you take a slice of atmosphere that has 10 CO2 molecules, then the next slice up has only 9 CO2 molecules and is cooler. If the 10 radiate all at once, the nine only intercept part of the radiation, and some goes on upward. Also, since the 9 are cooler, they do not radiate as much in the downward direction as was sent from below. The net is always upward.

This ignores conduction, both from the earth to all air molecules and from CO2 to other molecules. That conduction is what warms the air.

Richard M
Reply to  Jim Gorman
February 4, 2022 10:32 am

Exactly, this is precisely what I was describing. Gravity affects the density and it is the difference in density that leads to more net upward radiation between layers.

Essentially, CO2 creates an energy conduit moving energy from the high density surface to the low density upper atmosphere. The conduit narrows and energy leaks out (toward space) as you go.

DMacKenzie
February 3, 2022 8:24 pm

To go a bit extreme, a 2% increase in absolute temperature (call that 5 C) causes an 8% increase in emitted IR energy… and the solar constant remains, hmmm, constant… There’s just not enough solar energy available for doubled CO2 to absorb to make the planet much warmer than it already is, at least without some sort of fairly big albedo change. Maybe one more degree. So, as they say on Wall Street, “if something can’t continue rising… it will stop.”
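The 2%-to-8% arithmetic above is just the Stefan-Boltzmann T⁴ scaling; a quick check (the 288 K figure is an assumed, illustrative mean surface temperature):

```python
# A 2% rise in absolute temperature raises blackbody emission by
# (1.02)^4 - 1, i.e. a bit over 8%, matching the figure quoted above.

t_increase = 0.02                          # +2% in absolute temperature
flux_increase = (1 + t_increase) ** 4 - 1  # fractional change in emitted IR

print(f"{flux_increase * 100:.1f}% more emitted IR")         # 8.2%
print(f"2% of a 288 K surface is {288 * t_increase:.1f} K")  # 5.8 K
```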

jorgekafkazar
Reply to  DMacKenzie
February 3, 2022 10:06 pm

It’s harder to push the earth’s temperature distribution curve further to the right than to the left. There’s the T⁴ term that favors the other direction.
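The asymmetry being described follows from the T⁴ dependence: the emission gained by warming a blackbody a few degrees exceeds the emission lost by cooling it the same amount. A small illustration, assuming an arbitrary 288 K baseline and treating the surface as a blackbody:

```python
# T^4 response is asymmetric: warming 5 K raises emission by more than
# cooling 5 K lowers it, so excursions to the warm side are resisted harder.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
T0 = 288.0              # illustrative baseline temperature, K

warm_gain = SIGMA * ((T0 + 5) ** 4 - T0 ** 4)  # extra emission when 5 K warmer
cool_loss = SIGMA * (T0 ** 4 - (T0 - 5) ** 4)  # lost emission when 5 K cooler

print(f"+5 K: emission up   {warm_gain:.1f} W/m^2")
print(f"-5 K: emission down {cool_loss:.1f} W/m^2")
print(f"warm-side response is {warm_gain / cool_loss:.3f}x the cool-side one")
```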

Denise
February 3, 2022 8:25 pm
Maybe I am a tad older than most of you. I distinctly remember back in the late 1970s when global cooling was the massive scare. So if we are only looking back to the 70s for an increase in warming, not the dirty 30s, where are we now? Probably about normal. We have record cold for all of January in Ottawa right now. I cannot remember a colder January ever; at least 8 degrees Celsius below average most days.
Bindidon
Reply to  Denise
February 4, 2022 3:40 am

Denise, I’m not sure I’m so much younger than you. I perfectly remember our winters here in Europe in 1956, 1963, 1979. They were by far not as cold as what North America has experienced ‘since evah’ during all its winters, but 11 °C below the norm: that’s a lot too, isn’t it?

*
The coldest records in GHCN daily for OTTAWA:

CA006105976 54-41 1933 12 29 -38.9
CA006106100 54-41 1957 1 15 -38.9
CA006105976 54-41 1933 12 30 -38.3
CA006105976 54-41 1934 2 17 -38.3
CA006105976 54-41 1925 1 19 -37.8
CA006105976 54-41 1943 2 15 -37.8
CA006106090 54-41 1957 1 15 -37.8
CA006105976 54-41 1934 2 8 -37.2
CA006105976 54-41 1943 2 16 -37.2
CA006105976 54-41 1957 1 15 -37.2

*
But to remember cold moments, you just need to go back three years ago, and look at corners not so terribly far from yours, e.g. Cotton, Minnesota in January 2019:

USC00211840 54-35 2019 1 27 -48.9
USC00211840 54-35 2019 1 31 -47.2
USC00211840 54-35 1965 1 14 -45.6
USC00211840 54-35 1996 1 20 -45.6
USC00211840 54-35 1982 1 17 -45.0
USC00211840 54-35 1967 1 18 -44.4
USC00211840 54-35 1972 1 15 -44.4
USC00211840 54-35 1996 1 31 -44.4
USC00211840 54-35 1965 1 29 -43.3
USC00211840 54-35 1996 1 21 -43.3

There is, as it seems, always a corner near us which is way colder than where we live.

bdgwx
Reply to  Denise
February 4, 2022 7:54 am

Hi Denise. The cooling scare in the 1970s was a narrative mainly driven by the media. A lot of it was based on the research of Reid Bryson and his “human volcano” theory, in which he thought aerosol forcing would overpower greenhouse gas forcing. He wasn’t completely wrong, as there was a period between 1950 and 1980 in which anthropogenic aerosol forcing really did match or even exceed anthropogenic greenhouse gas forcing. But by 1980 pollution was reined in and GHG forcing began outpacing aerosol forcing. In fact, aerosol forcing started to decline some. Anyway, most scientists in the 1970s were convinced that the Earth would continue to warm [1]


Mr.
Reply to  bdgwx
February 4, 2022 10:54 am

From memory, James Hansen (the “Father Of Global Warming”) led the panic about an imminent ice age in the ’70s.

Seems that particular bout of “science” wasn’t “settled” enough to last into the ’80s.

(Smart guy Jim Hansen – he sees when one trough is drying up and another is beginning to fill, and time to get your nose in before the crowd piles in.)

bdgwx
Reply to  Mr.
February 4, 2022 11:25 am

I don’t think that’s right. I think what you are referring to is the Rasool 1971 publication. Although Rasool 1971 used some of Hansen’s research, Hansen was not an author on the publication. It’s important to note that Rasool was only exploring possibilities related to hypothetically large increases in aerosol forcing, on the order of 4x or more what they were at the time. BTW, notice that Rasool cites Bryson in that publication. Anyway, Hansen, AFAIK, never predicted an ice age.

Mr.
Reply to  bdgwx
February 4, 2022 12:51 pm

Dr. S. I. Rasool of the National Aeronautics and Space Administration?

Mr. Rasool came to his chilling conclusions by resorting in part to a new computer program developed by Mr. Hansen that studied clouds above Venus.

So is there any record of Jimmy Hansen pooh poohing his buddy at GISS’ paper?

bdgwx
Reply to  Mr.
February 4, 2022 6:01 pm

No. There is no record of Hansen “pooh poohing” Rasool or Schneider. In fact, he said the Rasool 1971 publication was a useful scientific paper.

BTW, Rasool & Schneider may be right. Note that in 1970 the RF of aerosols was -1.5 W/m2; 4x that would be -6 W/m2. A 3.5 C change would only require a climate sensitivity of about 0.6 C per W/m2. Today you could argue that R & S were too conservative with their 3.5 C estimate.

Danley Wolfe
February 3, 2022 8:44 pm

Already saw this on Roy Spencer’s feed. You posted Spencer’s complete website entry… TMI.

Denise
Reply to  Danley Wolfe
February 3, 2022 8:51 pm

Who is Roy Spencer? I am an older lady just stating what I believe. Please give me a link; maybe he agrees with me. No need to be rude!

Bindidon
Reply to  Denise
February 4, 2022 3:57 am

When the indentation level of a comment is the same as that of yours, it is very unlikely to be a reply to what you wrote.

Charles Rotter(@jeeztheadmin)
Admin
Reply to  Danley Wolfe
February 3, 2022 9:02 pm

OMG really? You saw it somewhere else? I’m soooo sorry.

Denise
Reply to  Charles Rotter
February 3, 2022 9:05 pm

Is this meant to be rude to me, or is it the reply to me? Hard to distinguish from your vague comment.

Climate believer
Reply to  Denise
February 4, 2022 12:18 am

Reply to Denise: it can be confusing sometimes, but Danley Wolfe wasn’t replying to you; he was just making an uninformed comment, and got a sarcastic reply from Charles Rotter, the admin of the site.

When someone replies to you, your name will be at the top of their comment, and the comment box will be offset slightly to the right.

Try not to take comments personally; life’s too short.

Dr Roy Spencer is the author of the article, his website is here:

From Dr. Roy Spencer’s Weather Blog

Tom Abbott
Reply to  Climate believer
February 4, 2022 4:54 am

Good, helpful comment.

Sunsettommy(@sunsetmpoutlookcom)
Editor
Reply to  Denise
February 4, 2022 7:02 am

He is responding to Danley Wolfe, as shown under his name where it states “Reply to Danley Wolfe”.

No one is picking on you; relax and enjoy the reading here.

MarkW
Reply to  Denise
February 4, 2022 7:40 am

It’s sarcasm. Perhaps you were so busy being offended you couldn’t recognize it.

zee
February 3, 2022 9:48 pm

love it keep posting best things thanks for sharing love it

Afterthought
February 3, 2022 10:25 pm

Literally nothing out of the ordinary is happening at all (except the rise of an evil New Religion!)

billtoo
Reply to  Afterthought
February 4, 2022 5:54 am

the fireworm approaches

kybill
February 3, 2022 10:33 pm

We don’t need to argue about 0.X deg. We need to accept that the earth is getting warmer. However, the minuscule increase is not an existential threat. This is what we should sing together: the world is not in danger. As Tucker would say, “Sit down and shut up.”

We all have periods when the temperature is hotter/warmer than whenever. Here along the Ohio River our “plant hardiness zone” was changed 20? years ago. We moved up to a warmer zone. This doesn’t prove/disprove anything. I plant the same plants and have the same bugs and viruses.

We are not facing an existential threat.

Tony C
Reply to  kybill
February 3, 2022 10:45 pm

I miss the old days, when the river Thames in London was frozen three feet deep and the ground at the Valley Forge encampment was also frozen three feet deep…. Those were the days……

Carlo, Monte
Reply to  Tony C
February 4, 2022 8:10 am

All that New Jersey swamp, frozen solid.

Mike
February 3, 2022 10:45 pm

Everybody…. that’s ”climate change” right there. Have a good look then run away screaming.

Jade Goat
February 3, 2022 10:48 pm

0.13 degrees per decade! So 1.3 degrees in a century’s time! Woop-de-DO, warmists!

Dear warmists – explain THIS to me. Say the temperature is (on average) 16 degrees Celsius in my country. If it is 18 degrees in 100 years time, *how will that kill anyone?*
It WON’T!

Even in a hot place like (say) India – say it’s (on average) 35 Celsius now. In 100 years that’s 37 Celsius. Big deal!

Oh, but “melting Antarctica” they say! Gee, even if that were true, we can *move!*
We’re not helpless! Yes, it would be massively expensive but *people can adapt!*

Oh, but “water shortages”, they say. It is true that *some places* (best example being Southern California taking water from the Colorado River) need to wise-up with their water usage but their problems are caused by *massive waste of water* – not by the climate.
Oh, and India is FAR ahead of the West when it comes to conserving water (with their water-harvesting schemes).

Newsflash – If it were warming, that means more evaporation. Evaporation *cools* – that’s why a breeze on your sweating body feels so good. It also means *more rain*.
It’s a self-correcting system!

As for desertification, one of the main causes of that is foolish land use – grazing too intensively and chopping down every tree in sight.
I note that with increasing CO2 levels the *Sahel is greening*. Oh, and we had record wheat harvests a couple of years ago – oh, that nasty CO2!

Climate believer
Reply to  Jade Goat
February 4, 2022 12:30 am

“Even in a hot place like (say) India – say it’s (on average) 35 Celsius now. In 100 years that’s 37 Celsius. Big deal!”

Most of the warming is not global, as they like to say, but heavily weighted toward the upper northern hemisphere.

India will be fine.

Derg
Reply to  Climate believer
February 4, 2022 1:53 am

I live there and I am freezing.

Climate believer
Reply to  Derg
February 4, 2022 2:33 am

lol 🙂

Bindidon
Reply to  Derg
February 4, 2022 5:53 am

I live in Northeastern Germoney; the last winter deserving that name was in 2010.

I’m not freezing at all anymore, how good.

The only disturbing factor is that the cold has been replaced by permanent wind, due to atmospheric lows that for years have been running down every day from the Northern Atlantic.

With such bad news for Coolistas, I will hardly get upvoted 🙂 🙂

Bob boder
Reply to  Bindidon
February 4, 2022 6:12 am

Get-money? Is that a slip?

Bob boder
Reply to  Bob boder
February 4, 2022 6:13 am

Ger-money, ugh auto correct.

Derg
Reply to  Bob boder
February 4, 2022 6:54 am

They both work 🙂

Derg
Reply to  Bindidon
February 4, 2022 6:55 am

No kidding, it’s so cold here I will become a climate refugee tomorrow as I travel to FL.

Max K
Reply to  Bindidon
February 4, 2022 3:19 pm

I live in the north of the Netherlands and I totally agree: we haven’t seen any serious winter since 2010. Only lots of grey, overclouded skies and winds from the southwest that deliver relatively hot air masses. Even now, in a year with record-low solar activity, no frost up till now. Yet for at least the last 100 years, low solar activity covaried very strongly with low temperatures and lots of great ice-skating events. I almost forgot what snow looks like….

OK, one positive thing about this is that I’m not going into bankruptcy as our leftish green liberal government has made energy prices skyrocket, as Obama promised.

Disputin
Reply to  Climate believer
February 4, 2022 5:54 am

BTW CB, how many thermometers are there in that red blob? Or have “they” simply spread out the few there are?

I agree, India (and everywhere else) will be fine.

Climate believer
Reply to  Disputin
February 4, 2022 10:21 am

Well, the map is based on NOAA’s GHCN-m v4, which takes monthly mean temperatures from 25,000 ‘thermometers’ across the globe. I don’t have the metadata on their exact locations.

I think there are only about 10 permanent stations in the Arctic; somebody will correct me if I’m wrong.

Talking of the Arctic…

MarkW
Reply to  Jade Goat
February 4, 2022 7:42 am

The wetter the environment, the less impact CO2 has.
The tropics will see the least warming from CO2, assuming they see any.

James Schrumpf
Reply to  MarkW
February 4, 2022 9:38 am

Wouldn’t that make sense, because the tropics are much more humid than the polar regions, and H2O vapor is 20X the greenhouse gas CO2 is?

Alan the Brit
February 4, 2022 12:25 am

I am waiting for another Kapitan Khlebnikov eco-tourist boat trip to demonstrate to all the selected passengers the horror of the disappearing ice in the Arctic. The last such ship got stuck for several weeks without rescue in ice that was supposed not to be there, the ice being so thick that icebreakers couldn’t reach the stranded vessel!!!

Matthew Sykes
February 4, 2022 3:11 am

Popcorn is open; waiting to see what a negative AMO does. If temps fall back towards 1970s levels, CO2 is a busted flush. If they stay level (hopefully), CO2 is having some effect, though not a dramatic one.

Any suggestion of an alarming rise from CO2 is already dead; we know that. The lack of warming can’t be accounted for; the Cabal admitted it. So the whole panic is over anyway.

Bellman
February 4, 2022 4:42 am

Some trivia:

Only the 17th-warmest January, or 28th-coldest, out of 44 years of UAH data.

The coldest January since 2012.

Only one January in the 20th century was warmer.

Derg
Reply to  Bellman
February 4, 2022 6:04 am

Warmest evah

Bellman
Reply to  Derg
February 4, 2022 6:37 am

I see your maths skills are as good as your spelling.

No, 17th-warmest in 44 years is not warmest ever; it’s 17th-warmest. Hope you can understand the difference, or do you want me to draw another picture?

Derg
Reply to  Bellman
February 4, 2022 6:53 am

Hahaha are you going to draw us another hockey stick?

You and Mann are clown-shows. I am wondering if you are related to the Russia colluuuusion Simon who posts on here.

MarkW
Reply to  Bellman
February 4, 2022 7:45 am

Call me back when you have a couple hundred years of data.
Declaring warmest or coolest ever from 44 years of data is a fool’s errand, and you are just the fool to attempt it.

Bellman
Reply to  MarkW
February 4, 2022 3:57 pm

Can you actually read? I said it was the 17th warmest January out of 44 years.

Maybe you are thinking of Carlo, Monte, who thinks UAH is so bad that every month is statistically tied for the warmest ever.

Carlo, Monte
Reply to  Bellman
February 4, 2022 4:54 pm

Stop whining.

Bellman
Reply to  Carlo, Monte
February 4, 2022 5:12 pm

Brilliant come back. I can only admire the originality of your wit.

Pat from kerbob
Reply to  Bellman
February 6, 2022 6:27 pm

You are right
It’s getting colder
Very bad
Thanks for highlighting the end of the AGW scam

Carlo, Monte
Reply to  Derg
February 4, 2022 8:13 am

He luvs his noisy grafs…

Bellman
Reply to  Carlo, Monte
February 4, 2022 3:58 pm

Thanks, I do like them – but ggplot does most of the work.

DMacKenzie
Reply to  Bellman
February 4, 2022 6:28 am

And as of this date, we have more Arctic sea ice than we have had for a dozen years. I remember everyone saying that was “the canary in the coal mine” only a few years ago, when those death-spiral Arctic ice graphs were “exhibit A” in an alarmist’s proof of gorebull warming.

https://nsidc.org/arcticseaicenews/charctic-interactive-sea-ice-graph/

Bindidon
Reply to  DMacKenzie
February 4, 2022 7:56 am

DMacKenzie

You are right; when sorting NSIDC’s G2035 absolute data for the January months since the year 2000, we see this as top 10:

2003 1: 14.39 (Mkm²)
2002 1: 14.27
2000 1: 14.22
2001 1: 14.20
2004 1: 14.03
2009 1: 13.91
2008 1: 13.89
2022 1: 13.88
2010 1: 13.74
2012 1: 13.73

But it is not incorrect to have a global view, taking the situation in both the Arctic and the Antarctic into account.

Here is the global top 20 for the January months:

2015 1: 20.45 (Mkm²)
2008 1: 20.30
2003 1: 20.16
2014 1: 19.98
2009 1: 19.62
2004 1: 19.62
2001 1: 19.43
2012 1: 19.38
2013 1: 19.24
2002 1: 19.01
2000 1: 18.97
2010 1: 18.70
2005 1: 18.41
2007 1: 18.37
2020 1: 18.16
2016 1: 18.16
2021 1: 18.15
2011 1: 17.97
2022 1: 17.74
2006 1: 17.63
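Rankings like the two lists above are just a descending sort of monthly values. As a sketch, the Arctic January top 10 can be reproduced from the (year, extent) pairs quoted in the first list (values as quoted in the comment, not freshly pulled from NSIDC):

```python
# Reproduce the Arctic January top-10 ranking by sorting (year, extent)
# pairs in descending order of extent (Mkm^2). The values are the ones
# quoted in the comment above.

january_extent = {
    2000: 14.22, 2001: 14.20, 2002: 14.27, 2003: 14.39, 2004: 14.03,
    2008: 13.89, 2009: 13.91, 2010: 13.74, 2012: 13.73, 2022: 13.88,
}

top10 = sorted(january_extent.items(), key=lambda kv: kv[1], reverse=True)

for year, extent in top10:
    print(f"{year} 1: {extent:.2f}")
```

The sort puts 2003 first and 2012 last, matching the ordering listed above.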

Here is a graph showing the absolute values for the Globe as time series:


Right on, we have a recent recovery since 2019! No need for even more salt-free water in the oceans, especially in the Northwestern Atlantic.

But… such a recovery we had already for 2012-2015, and for 2008-2011.

A look at recent years in departure form (wrt mean of 1981-2010) might be of interest:


{ I have reinstated 2012; it is the plot in dotted black. Most people don’t know that 2012 was a very icy year, because only its melting season has obtained attention. }

MarkW
Reply to  Bellman
February 4, 2022 7:44 am

Once again the warmies misuse statistics in a desperate effort to remain relevant.

Bindidon
Reply to  MarkW
February 4, 2022 8:00 am

The same is pretty well valid for the Coolistas, isn’t it?

Derg
Reply to  Bindidon
February 4, 2022 8:10 am

No kidding…maybe the tiny warming is due to all the concrete and asphalt 🤔

Simon
Reply to  Derg
February 4, 2022 12:35 pm

No kidding…maybe the tiny warming is due to all the concrete and asphalt”
Haha… given the fastest warming place on the planet is the Arctic…. I don’t think so….

Derg
Reply to  Simon
February 4, 2022 12:45 pm

Hey it’s the Russia colluuuusion clown. I had to laugh when Trudeau said Russia was behind the trucker protest. I seriously laughed out loud and said to myself where have I heard that before 🤔

Carlo, Monte
Reply to  Derg
February 4, 2022 2:08 pm

And then the Canuck press dutifully picked up the Russian nonsense and ran with it.

Simon
Reply to  Derg
February 4, 2022 6:34 pm

So, no answer to my pointing out how silly your comment was re warming, because there ain’t no concrete (worth talking about) in the Arctic. Just “Duh!!!! Russian collusion. Duh!!! Trudeau. Duh Duh Duh!!!!!!” If nothing else, you are consistent. But hey, you don’t need to know anything about science if you are Derg; just copy and paste irrelevant stuff to divert from the fact you have no answers when questioned.

Derg
Reply to  Simon
February 5, 2022 12:12 am

Hey you were the Russian colluuuusion guy and it cracks me up how Trudeau does the exact same thing.

You are a clown show and should be ashamed of yourself. You can’t talk science because you are a liar. The world is beginning to see what pieces of 💩 you type of people are.

Simon
Reply to  Derg
February 5, 2022 1:07 am

Once again no come back just put downs. What a guy.

Derg
Reply to  Simon
February 5, 2022 1:16 am

Because you are a liar. Russia colluuuusion indeed.

Pat from kerbob
Reply to  Bindidon
February 6, 2022 6:29 pm

I’m unaware of “coolistas” here?
There are warmistas for sure.

The rest are just normal people who can read

Bellman
Reply to  MarkW
February 5, 2022 6:33 am

What on earth are you on about? I stated where this January fell in relation to other Januaries according to UAH. It should be obvious that being 17 out of 44 is not particularly warm. It’s a pretty average month at least compared with recent years.

If I wanted to demonstrate warming, I could have simply pointed to the head posting where it states that the warming trend over the last 43 years is 0.13°C / decade, and that amounts to 0.56°C of warming over the satellite era.

If I really wanted to misuse statistics in order to exaggerate the warming, I would have gone down the Monckton path and cherry-picked the start date to show a period of faster warming. E.g., despite the double-dip La Niña, UAH shows that over the last 14 years and 8 months the globe has warmed by 0.44°C, a warming rate equivalent to 3.0°C / century.
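The two figures quoted in this comment are simple rate-times-duration arithmetic, which can be checked directly (using only the numbers stated above):

```python
# Check the arithmetic in the comment: a 0.13 C/decade trend over the
# 43-year satellite era, and 0.44 C over 14 years 8 months expressed as
# a per-century rate.

trend_per_decade = 0.13  # C/decade, from the head posting
years = 43
total_warming = trend_per_decade * years / 10
print(f"{total_warming:.2f} C over {years} years")  # 0.56 C

warming = 0.44           # C, over the cherry-picked window
span_years = 14 + 8 / 12 # 14 years 8 months
rate_per_century = warming / span_years * 100
print(f"{rate_per_century:.1f} C/century")          # 3.0
```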

Carlo, Monte
Reply to  Bellman
February 5, 2022 7:19 am

the Monckton path and cherry picked

Same old, same old, tired worn-out carpet, anything to keep the rise alive.

angech
Reply to  Bellman
February 5, 2022 6:30 pm

Carlo, Monte That is just so fair.

Bellman
If I really wanted to misuse statistics in order to exaggerate the warming.

You do. All the time. Why complain about accurate representation?

If I really wanted to misuse statistics in order to exaggerate the warming I would have cherry picked the start date of UAH to show a length of faster warming.

It is blindingly obvious that if you start on top of a sine curve and misrepresent it as being the average that it will always go down from that point.
Seeing that you never admit this fact but take it as gospel gives you no credibility at all.
Admit faking it and grow up statistically

Bellman
Reply to  angech
February 5, 2022 7:18 pm

It is blindingly obvious that if you start on top of a sine curve and misrepresent it as being the average that it will always go down from that point.
Seeing that you never admit this fact but take it as gospel gives you no credibility at all.

Why would I deny that? But if you are suggesting UAH data shows we are at the top of a sine wave, you need to show that. When will we see this fall? How long before there is a statistically significant drop in temperatures? At present the best fit looks to be a straight line.

If the evidence changes I’ll admit it, but this continuous wishful thinking, looking at each La Niña as proof that we’ve reached the top and cooling must surely follow, is just getting embarrassing.

I’ve been following these claims for decades. “There’s no evidence of warming in the 90s”; in 2009 Monckton was claiming that “there’s been rapid global cooling for 7 years”; then the month-by-month growing pause up to 2016. Every time there’s a chorus of skeptics claiming that this is proof that we’re at the top of a sine wave and any time now temperatures will start plummeting – but so far it has not happened.

Carlo, Monte
Reply to  Bellman
February 5, 2022 8:13 pm

Anything to keep the rise alive == bellcurveman.

Jim Gorman
Reply to  Bellman
February 6, 2022 3:47 am

Here is the proof. Run your trend out 1000 years. Is your trend accurately telling you we are going to achieve that? If not, why not, and when will your trend turn negative?

Are we never going to reenter another glaciation? If so, temps are going to have to fall.

These are questions that a linear regression over a short time, geologically speaking, needs to answer. But, so sorry, linear regression can never follow cycles. A linear regression must have a non-changing slope. Therefore, it can never follow a phenomenon that varies in time.

Bellman
Reply to  Jim Gorman
February 6, 2022 6:50 pm

You have an odd idea of what constitutes proof.

Why would I run the trend out to 1000 years? I’m not even extending out past the current date.

If not, why not, and when will your trend turn negative?

That’s the question I was asking. I keep being told we are at the top of a sine wave and any time soon it will turn negative, but so far no evidence that it is actually happening.

Are we never going to reenter another glaciation? If so, temps are going to have to fall.

Obviously. But that’s a long way from saying temperatures are currently falling.

These are questions that linear regression of a short time, geologically speaking, need to answer.

Why? There are a lot of reasons for linear regression, but predicting what will happen outside their limits is not a good one. If you want to predict what will happen in the next 100 let alone 1000 years, you really don’t want to just extend the current trend line. You need things like physical models.

But, so sorry, linear regression can never follow cycles. A linear regression must have a non-changing slope.

Not true. You can fit all sorts of curves using linear regression. But as always, the more variables, the more risk of overfitting.
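A sketch of that point with synthetic data: “linear” regression is linear in the coefficients, not the predictors, so a sinusoid can be fitted by ordinary least squares (the period is assumed known here, which sidesteps the genuinely hard part of cycle-hunting):

```python
import math
import numpy as np

# Fit intercept + trend + a sinusoid of known period by ordinary
# least squares: the model is linear in its coefficients.
rng = np.random.default_rng(0)
t = np.arange(240.0)   # 20 years of monthly data
period = 120.0         # assumed 10-year cycle
y = 0.5 * np.sin(2 * math.pi * t / period) + rng.normal(0, 0.1, t.size)

# Design matrix: intercept, linear trend, and the sine/cosine pair.
X = np.column_stack([
    np.ones_like(t),
    t,
    np.sin(2 * math.pi * t / period),
    np.cos(2 * math.pi * t / period),
])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
amplitude = math.hypot(coef[2], coef[3])
print(f"recovered cycle amplitude: {amplitude:.2f} (true value 0.5)")
```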

Tim Gorman
Reply to  Bellman
February 6, 2022 3:25 pm

I have yet to see a Fourier analysis of the temperature curve over several millennia, or a wavelet analysis either. Either would break out any cycles there are to be found in the temperature record. A Fourier analysis won’t tell you much about the phasing or time relationships of the cycles; a wavelet analysis doesn’t identify frequencies as easily, but it does give you the time relationships. Look up a wavelet analysis of a speech sample some time.

Until I see a widely published analysis like this I take all linear trends of temperature with a grain of salt.

Think of it this way. The temperature curve is time dependent and generates a multi-modal data distribution. You can’t take a multi-modal data set and jam it all into a single average without losing all of the modality associated with the actual data. And when you lose that modality you also lose the time dependence associated with the temperatures. A wavelet analysis of the overall data set would give you at least some of the time dependence back (i.e. the seasonal time dependence). A Fourier analysis would give you the frequencies of the cyclical changes, e.g. 1 cycle/year for the seasonal time dependence.

From this you could *see* the relationships of starting and ending periods. If there are long term cycles, e.g. like La Nina/El Nino, then it would show up and you could avoid biasing the trend line from the choice of start/end date.
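As a toy version of the Fourier half of that idea (synthetic data only; a real analysis would need the actual multi-millennial record, and the wavelet half would need something like the PyWavelets library):

```python
import numpy as np

# Synthetic monthly series: an annual cycle plus noise. The FFT peak
# recovers the cycle's frequency with no curve fitting at all.
rng = np.random.default_rng(1)
n = 480  # 40 years of monthly values
t = np.arange(n)
y = 1.0 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, n)

spectrum = np.abs(np.fft.rfft(y - y.mean()))
freqs = np.fft.rfftfreq(n, d=1.0)  # cycles per month
peak = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {peak:.4f} cycles/month "
      f"(~{1 / peak:.0f}-month period)")
```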

Tim Gorman
Reply to  Bellman
February 7, 2022 6:03 am

“Why would I deny that. But if you are suggesting UAH data shows we are at the top of a sine wave, you need to show that.”

UAH is *NOT* a time series of measured temperatures. It is a calculated metric based on the satellite traveling around the earth and reading radiance as it travels. It does *NOT* measure Tmax or Tmin and has no real relationship to climate. It is, once again, a multi-modal, time-dependent snapshot of “something”.

As a metric it is useful for measuring gross trends in that metric. It is not really all that useful in determining climate at any point on the earth, let alone a global climate. It is far different from surface-based temperatures.

“…in 2009 Monckton’s claiming that “there’s been rapid global cooling for 7 years”, the month by month growing pause up to 2016. Every time there’s a chorus of skeptics claiming that this is proof that we’re at the top of a sine wave and any time now temperature will start plummeting – but so far it has not happened.”

There are none so blind as those who will not see. Monckton’s claim is that there has been no warming, not that we are cooling. If CO2 is the thermostat and CO2 keeps going up but the temperature does not then that thermostat is broken. If the thermostat (CO2) is broken then all the models that depend on the thermostat are broken as well.

If you are truly open-minded on this then go here and read this *FOR MEANING*: https://www.nature.com/articles/s41598-018-25212-2

This paper was written by agriculture scientists. Their investigations have *real* world consequences that can be measured and validated.

“On average, FFF has been occurring later (by 5.4 days century⁻¹), and LSF has been occurring earlier (by 6.9 days century⁻¹), resulting in the average lengthening of the CGS (by 12.7 days century⁻¹).”

[tpg – FFF is first fall frost, LSF is last spring frost, and CGS is climatological growing season]

“We developed relationships between county-level crop yields vs. agroclimate changes and found that all crops (maize, soybean, sorghum, spring wheat, winter wheat, and cotton) responded positively to a lengthened CGS, while responding negatively to increase in GDD, except cotton.” [tpg – GDD is growing degree days]

“The annual AGDD trends (deviation from mean annual AGDD) in time domain (1900–2014) are presented on a national scale in Fig. 3. The deviation was initially close to zero, which rose to a positive maximum in 1939 and thereby started declining into negative deviations until the end of the study period in 2014. The national time series is derived from observed data at 1218 sites, and presents the national changes in AGDD, but it should be acknowledged that the constituent sites show highly variable trends and relying on a national series can conceal regional variations.”
[tpg – red in the attached image is annual GDD]

So what do we know from this? CGS is increasing while GDD is decreasing. Longer growing seasons with less heat accumulation.

My conclusion? Minimum temps are going up thus positively moving the first and last frost day while max temps are stagnant to moderating thus not increasing heat accumulation.

Admittedly this is for the continental US but this gives us a good picture of what is happening with a large percentage of the globe. You can find the same relationship with heating/cooling degree day values around the world. Fewer heating degree-days and stagnant to moderating cooling degree-days.

Are you now going to tell us that all of this data, which runs counter to the Earth turning into a cinder, is wrong and these ag scientists don’t have a clue as to what they are talking about?

image_2022-02-07_075757.png
Bellman
Reply to  Tim Gorman
February 7, 2022 1:53 pm

There are none so blind as those who will not see. Monckton’s claim is that there has been no warming, not that we are cooling.

Take that lesson to heart. I specifically said it was Monckton’s claims from 2009 when he was constantly claiming 7 years of global cooling. Here for example, or here.

Seven and a half years’ global cooling at 4.3 F° (2.4 C°) / century



Screenshot 2022-02-07 215216.png
aussiecol
February 4, 2022 5:23 am

+0.03 degrees, just wow.

Carlo, Monte
Reply to  aussiecol
February 4, 2022 8:17 am

According to bell curve bellman, 1988 was +zero degrees; this means 0.0088K per decade or 0.088K per century.

We’re all gonna die!

Bellman
Reply to  Carlo, Monte
February 5, 2022 6:35 am

We’re all gonna die!

Correct.

Pat from kerbob
Reply to  Bellman
February 6, 2022 6:25 pm

You first with any luck.
There is a very real psychological phenomenon whereby people who are convinced they are going to die find some way to make it true.

So should we be sending the rubberized banana truck to pick you up?
For you own safety?

Bellman
Reply to  Pat from kerbob
February 6, 2022 7:06 pm

You first with any luck.

Very likely.

…people who are convinced they are going to die find some way to make it true

So you think you are never going to die?

Maybe my not very subtle joke was lost on you. Every one dies eventually.

February 4, 2022 5:35 am

Thanks Roy for the update. Doesn’t look too scary to me!

I like the bigger dot for the latest reading…makes it easier to see what’s happened this month.

ResourceGuy
February 4, 2022 6:09 am

The real climate emergency is locking in mandates, regulation, tariffs, and tax increases before the oceans reveal the fraud and before Edward Markey laments “Who could have known.”

David Anderson
February 4, 2022 7:46 am

Another graph I saw on these pages said no warming for the past 7 years. What do I believe?

Bindidon
Reply to  David Anderson
February 4, 2022 8:05 am

The Third Viscount has a {sarc} very selective view {/sarc} over UAH’s data.

The more you look at the history of a time series, the more it tells you.

Derg
Reply to  Bindidon
February 5, 2022 2:28 am

Yawn…CO2 climbs and temperature stays the same. CO2 is a control knob 😉

Ted
Reply to  David Anderson
February 5, 2022 4:11 am

Different data sets. The pause is in the HADCrut data, that post acknowledges warming in the UAH data aside from the last few months.

Ireneusz Palmowski
February 4, 2022 7:58 am

The Arctic air mass is cut off in the southern US and will remain there longer.

Steve Oregon
February 4, 2022 8:10 am

The “warming” is the “warming”. Not much.
But I am most disturbed by the deceitful government policymakers who use the purposefully deceptive words “address” and “mitigate” climate change.
Rank and file climate crusaders all presume that means doing something about it when it does not. Query any of them to provide any science showing carbon policies altering the climate and they clam up or parrot the usual cliches.
IMO more attention (criticism) needs to be aimed at the absence of any atmospheric, climate or weather changes possible from the worthless “addressing” and “mitigation”.

As most regulars here are aware, there is widespread recognition that all of the implemented and proposed “addressing” and “mitigation” is worthless as far as anything climate or weather is concerned.
Example https://ballotpedia.org/Fact_check/Would_the_Clean_Power_Plan_mitigate_climate_change

Excerpt: The excuse…

“The value of this …[mitigation] …is measured in showing strong domestic action which can actually trigger global action to address what is necessary…if we don’t take action domestically, we will never get started.”

That is nothing but another fatally flawed presumption. It’s an empty claim that will never even be measured. Who will ever show, and how, that global action is addressing what is necessary?

No one will, ever.
We’ll all be dead, decades will pass and the asinine mitigation will never trigger or address anything necessary.

Without demands for science and politicians to measure and demonstrate positive climate impact the climate crusade remains the biggest fraud in human history.

Ireneusz Palmowski
February 4, 2022 9:05 am

There was a record temperature drop above the 65th parallel in the lower stratosphere.

Ouluman
February 4, 2022 9:34 am

Well, all I can say is where I live in Finland it was -23C two days ago, -16C yesterday, -7C today with 30 cm of snow, and -1C in a few days. Normal stuff. Wtf is 1/10th of 1C in 10 years going to do? We often have a 30C difference in temps within 36 hours.

bdgwx
Reply to  Ouluman
February 4, 2022 10:57 am

The global average temperature does not change 30 C in 36 hours.

AlexBerlin
Reply to  bdgwx
February 4, 2022 11:31 am

No, but any system able to survive quick 30 C changes will brush off a slow 3 C change without even noticing. Especially as it is going in the right direction, away from freezing temperatures. Ice kills. Less ice = better life on Earth.

bdgwx
Reply to  AlexBerlin
February 4, 2022 7:05 pm

Has Earth had a period where there has been a quick 30 C change in the global average temperature?

Pat from kerbob
Reply to  bdgwx
February 6, 2022 6:19 pm

Always the idiotic word games with you people, assuming we are as stupid as your people.

bdgwx
Reply to  Pat from kerbob
February 7, 2022 9:10 am

It’s a fair question. The insinuation by Ouluman and AlexBerlin is that because it is normal and survivable for specific locations to have 30 C changes, then it must be normal and survivable when the global average temperature changes by 30 C as well. I want to know when the global average temperature last changed by 30 C and to be presented with evidence that this was a survivable period. Perhaps you could answer the question?

Tim Gorman
Reply to  bdgwx
February 7, 2022 3:53 pm

What is the difference between temps above the Arctic Circle in winter and summer temps in Dubai?

People survive in both extremes. Some regions will prosper and some won’t. But that doesn’t mean everyone on the globe will die!

bdgwx
Reply to  Tim Gorman
February 7, 2022 4:18 pm

I believe the average high in Dubai in August is around 41 C and the average low at Summit Camp in January is -48 C.

Has the global average temperature ever changed by 89 C between August and January?

Do you think people would be able to survive if the global average changed by 89 C?

Last edited 3 months ago by bdgwx
Tim Gorman
Reply to  bdgwx
February 8, 2022 4:57 am

You missed the point entirely. The global average doesn’t determine the climate at any specific point on the globe. If the climate above the Arctic Circle changed by +89C then people living there would still survive. They would just have to start living like those in Dubai! It’s why the “global average” is so meaningless. That average tells you nothing. You have to take the local climate into consideration! The global average only tells you that there are areas that average less than the global average and locations that average higher than it. Not every place will have a climate equal to the “global average”.

bdgwx
Reply to  Tim Gorman
February 8, 2022 7:28 am

I think it may be you that missed the point. Ouluman and AlexBerlin are insinuating that because the temperature changes by 30 C in their backyards without a significant impediment to survival, it must be true that the global average temperature can change by 30 C without any impediment to survival as well. You upped the ante to 89 C. Do you think a global average temperature change of 89 C is possible? If it is possible, do you think it would happen without any significant impediments placed on survivability? What about a smaller 30 C change?

Tim Gorman
Reply to  bdgwx
February 8, 2022 11:26 am

I didn’t miss the point at all. If their back yard is above the Arctic Circle then it doesn’t matter if the temp change is +30C or +90C. People living there will still survive.

If you look at the climate models, their output has a positive trend forever. They are merely linear equations of y = mx + b. Such a projection will, sooner or later, reach +89C. Are you implying the models are (gasp) wrong?

Again, there will be climates below the +89C growth and above it. Those local climates that are below the +89C growth will certainly include some that will allow continued survival.

Will life change? Of course. So what? Life has been and always will be a struggle for survival. My grandparents several times removed struggled to survive the climate on the plains of Kansas in the 1800’s. Life wasn’t very pleasant. I’m sure our ancestors that crossed into North America so long ago had to struggle against the climate as well.

bdgwx
Reply to  Tim Gorman
February 8, 2022 11:51 am

You really don’t think survivability will be impaired if the global average temperature increased 89 C, from 15 C to 104 C? Really?

Tim Gorman
Reply to  bdgwx
February 8, 2022 2:37 pm

The survivability of the human race won’t change. It survives pretty well in Dubai. If northern Alaska changes to a climate of Dubai I’m pretty sure those in Alaska will continue to survive.

Why do *you* think every one on Earth would die? There are really only two choices – 1. The human race would die out or 2. the human race would survive.

Pick one.

Last edited 3 months ago by Tim Gorman
bdgwx
Reply to  Tim Gorman
February 8, 2022 3:19 pm

Again…that’s a 104 C…the global average. Are you absolutely sure the survivability of the human race won’t change if the average surface temperature of Earth was 104 C? Are there any second thoughts here or are you sticking with it?

Jim Gorman
Reply to  bdgwx
February 8, 2022 12:50 pm

“Do you think a global average temperature change of 89 C is possible?”

Every day on the globe. When India is 40C and Antarctica is -50C that breadth of change occurs on earth.

Can a GAT of -89 occur? If it does we’ll be in a full blown ice house earth and who will care?

bdgwx
Reply to  Jim Gorman
February 8, 2022 3:11 pm

bdgwx said: “Do you think a global average temperature change of 89 C is possible?”

JG said: “Every day on the globe.”

Wow.

Jim Gorman
Reply to  bdgwx
February 4, 2022 12:03 pm

You miss the point. Did large numbers of mammals die because of the major change in temperature? Why do you think large numbers of mammals will die because of a 2 C change over 100 years?

Derg
Reply to  Jim Gorman
February 5, 2022 2:32 am

Exactly. CO2 is life.

Derg
Reply to  bdgwx
February 5, 2022 2:31 am

Lol…CO2 control knob indeed.

Pat from kerbob
Reply to  Ouluman
February 6, 2022 6:22 pm

Exactly
Since basically all of the observed warming has been at high latitudes (like where you and I reside), in winters and overnight, the 1-1.5-2-2.5-3C etc. increase is entirely beneficial and leads to fewer dead humans.
Which I continue to suspect is the real problem.

CO2isLife
February 4, 2022 2:15 pm

A better way to look at the satellite data is to select the data set that minimizes the Urban Heat Island Effect, Water Vapor and Albedo. To do that, download the South Pole Data and chart each month. What you will find is there has been no warming since the start of the data set. Now, how can CO2 increase 25 to 30% and yet have no impact on temperatures? What then is causing the warming? The only thing that warms the oceans. That is VISIBLE Radiation, not LWIR. Do we have evidence more warming visible radiation is reaching the oceans? You bet. Cloud cover has been decreasing over the oceans explaining the warming of the oceans, and that has nothing to do with CO2. Nothing.

bdgwx
Reply to  CO2isLife
February 5, 2022 7:00 am

What happens when you consider the NoPol?

What happens when you consider the negative lapse rate in SoPol?

What happens when you consider the ocean portion?

What happens when you consider all of the other factors that modulate the planetary energy imbalance?

Why did clouds change?

Jim Gorman
Reply to  bdgwx
February 5, 2022 8:45 am

Why not answer the question that was asked by co2?

Deon Botha-Richards
February 5, 2022 3:27 am

What does the zero line represent? Is that the temperature from “preindustrial time”?

bdgwx
Reply to  Deon Botha-Richards
February 5, 2022 8:13 am

It’s the 1991-2020 average.

Carlo, Monte
Reply to  Deon Botha-Richards
February 5, 2022 8:15 am

No, it’s an average over some arbitrary time period, the details of which are not openly provided; they are probably somewhere in other documentation, but I can’t point to them. The baseline average is subtracted from the individual monthly averages to produce the “anomaly” numbers (a very deceptive term IMO that is commonly used in climate science).

Bellman
Reply to  Carlo, Monte
February 5, 2022 7:26 pm

No, it’s an average over some arbitrary time period, the details of which are not openly provided

It’s explained in the head posting and mentioned in the y-axis label of the graph: the baseline is 1991-2020. This was much discussed when Dr Spencer changed the base period from 1981-2010. I’m really not sure why you think this is being hidden.
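For what it’s worth, the anomaly arithmetic itself is trivial: each calendar month gets its own 1991-2020 mean, which is then subtracted from that month’s value. A toy sketch (invented numbers, and two baseline years instead of thirty):

```python
def monthly_anomalies(temps_by_year, base_years):
    """temps_by_year: {year: [12 monthly means]}. Subtract each calendar
    month's baseline mean, removing the seasonal cycle and the offset."""
    baseline = [sum(temps_by_year[y][m] for y in base_years) / len(base_years)
                for m in range(12)]
    return {y: [t - baseline[m] for m, t in enumerate(row)]
            for y, row in temps_by_year.items()}

# Toy data: two "baseline" years and one later, uniformly warmer year.
data = {
    1991: [10.0, 11.0, 13.0, 15.0, 18.0, 21.0, 23.0, 22.0, 19.0, 16.0, 12.0, 10.0],
    1992: [10.2, 11.2, 13.2, 15.2, 18.2, 21.2, 23.2, 22.2, 19.2, 16.2, 12.2, 10.2],
    2022: [10.4, 11.4, 13.4, 15.4, 18.4, 21.4, 23.4, 22.4, 19.4, 16.4, 12.4, 10.4],
}
anoms = monthly_anomalies(data, [1991, 1992])
print([round(a, 2) for a in anoms[2022]])  # every month ~ +0.3
```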

Carlo, Monte
Reply to  Bellman
February 5, 2022 8:11 pm

More whining.

Gordon A. Dressler
February 6, 2022 11:42 am

The IPCC’s stated goal (speaking for all of humanity, of course) is for mankind to limit global warming to a global temperature increase of 1.5 °C or less above “pre-industrial times”. In turn, the IPCC defines “pre-industrial times” to be the period of 1850-1900. (Ref: https://www.ipcc.ch/site/assets/uploads/sites/2/2018/12/SR15_FAQ_Low_Res.pdf ).

Interestingly, the IPCC apparently goes out of its way to avoid stating what the absolute global temperature was at any time in the period of 1850-1900.

Furthermore, the IPCC’s timeframe for limiting the 1.5 °C increase is given variously, and confusingly, as “in the next several decades”, “by 2040”, “by 2050”, and “by the end of this century”. What’s one to make of this?

Anyway, the global temperature for 1850-1890 has been independently stated to be “roughly 13.6 °C” (ref: https://history.aip.org/climate/timeline.htm ) whereas the global temperature for 1880 to 1900 has been independently stated to be 13.7 °C (ref: https://www.currentresults.com/Environment-Facts/changes-in-earth-temperature.php ).

The 2020 global surface temperature (averaged across land and ocean) was 1.2 °C (2.1 °F) warmer than the pre-industrial period (1880-1900) (ref: https://www.climate.gov/news-features/understanding-climate/climate-change-global-temperature ).

Dr. Spencer’s data presented in the above article leads to a linearized global (i.e., land and sea) warming trend of +0.13 C/decade averaged over the last 43 years. Therefore, combining the foregoing data, we can reasonably predict that the point of reaching 1.5 °C of warming from “pre-industrial times” might occur in (1.5-1.2)/0.13 = 2.3 decades from 2020, or equivalently around year 2043.
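That arithmetic can be checked directly (the inputs are the figures quoted above; the linear extrapolation is of course the stated assumption, not a forecast):

```python
# Back-of-envelope: years until the 1.5 C threshold at the stated rate.
warming_so_far = 1.2   # C above 1880-1900, per the cited 2020 figure
target = 1.5           # C, the IPCC threshold
rate = 0.13            # C/decade, the UAH trend since 1979

decades_left = (target - warming_so_far) / rate
year_reached = 2020 + 10 * decades_left
print(f"{decades_left:.1f} decades -> around year {year_reached:.0f}")
```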

While there is absolutely NOTHING mankind can do to change the trend revealed by Dr. Spencer’s analysis of UAH satellite-based global temperature data in, say, the next 20 years, I am confident that a current confluence of natural factors will result in global cooling actually occurring over the next century or so.

In fact, the transition from long-term global warming to long-term global cooling might be revealed in the article’s graph of UAH satellite data for the interval of 2017-2021.

Let’s stay tuned! . . . next little ice age, here we (may) come.


John Boland
Reply to  Gordon A. Dressler
February 6, 2022 7:23 pm

No clue what to think. I am a skeptic, a denier of science apparently. There does appear to be upward pressure on temperature trends right now, so it’s possible the AGW crowd is right. In any case, if we are that close to the 1.5C rise I would bet we get there. So what does that mean? I have no idea. I don’t see a case for CO2 causation, I just don’t see it. I don’t see a case for cooling either. All I see is a non-linear system just doing what it does.

MarkMcD
February 6, 2022 2:05 pm

You know what I’d like to see?

I’d like to see all the data that has been used for version 6 to produce the recent record to be redone using version 5.

See, in 2015, version 6 came online and the entire temp record took an upwards jump. So I want to see what the record might be if we went back to when UAH temps were a closer match to the radiosondes.

Mind you, I am having issues finding the radiosonde data from the past couple of years – or rather the raw data appears to be available for download but I can’t find anywhere it has been mapped into human-readable form.

Coincidence?

Bellman
Reply to  MarkMcD
February 7, 2022 6:19 am

I’m not sure if this is meant to be a joke. But for the record, temperatures did not take an “upward jump” in version 6; quite the opposite.

Graph showing annual anomalies compared with 1981-2010 base period in °C.

20220207wuwt1.png
Last edited 3 months ago by Bellman
James Schrumpf
February 7, 2022 9:57 am

Why do these data set owners keep pretending that the number out there in the hundredths place has any significance? I know from college physics that you might keep that digit while doing all the calculations, but once you reach the end you have to drop off the insignificant digits.

bdgwx
Reply to  James Schrumpf
February 7, 2022 10:15 am

They aren’t pretending those digits are significant, at least in terms of uncertainty. See Christy et al. 2003 for details. I believe there are several reasons why they include extra digits. The public can perform their own calculations on the data with less rounding error. It allows the public more visibility into how each monthly update changes past data. In some cases this may be by request; for example, I heard that BEST had originally provided 2 decimal places, but the public requested more digits. I think one solution that would satisfy all parties is to provide two data files: one using significant-figures rules and one that contains all IEEE 754 digits.
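A toy example of the rounding-error point (the values are made up):

```python
# Averaging pre-rounded values injects an error that the full-precision
# file avoids; extra published digits let users do their own arithmetic
# without that artifact.
values = [0.034, 0.127, 0.213, 0.046, 0.088, 0.171]

mean_full = sum(values) / len(values)
mean_rounded = sum(round(v, 2) for v in values) / len(values)

print(f"full precision: {mean_full:.4f}")
print(f"pre-rounded:    {mean_rounded:.4f}")
```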

Last edited 3 months ago by bdgwx
James Schrumpf
Reply to  bdgwx
February 7, 2022 1:08 pm

Haven’t we gone through this many times already? The standard deviation and uncertainty calculations only apply if one has several measurements of the same thing. Taking a month’s worth of temperatures from one station is not taking several measurements of ONE thing.

The entire point of the multiple measurements improving the accuracy of the mean of the measurements is that the length of the board has a single true value which is being estimated. Thirty separate temperatures have no such true value to estimate, so the techniques used to improve the measurement of one true value are meaningless.

bdgwx
Reply to  James Schrumpf
February 7, 2022 1:34 pm

Yes. We have gone through this quite a bit already. Not only does uncertainty propagate through a combining function (like an average) the same way regardless of whether the input measurements are of the same thing or not, it works the same way even if the input measurements are of entirely different types with entirely different units. See the Guide to the Expression of Uncertainty in Measurement. You can also prove this out with the NIST uncertainty calculator. It is an indisputable fact: the uncertainty of an average is less than the uncertainty of the individual measurements on which the average is based.

That’s not all that relevant to your post, though. Regardless of how uncertainty propagates through the UAH process, they aren’t pretending that the thousandths or even hundredths place is significant in terms of uncertainty. And I still think this whole issue gets resolved if they would just post 2 files: one with significant-figures rules applied and one with all IEEE 754 digits.
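A Monte Carlo sketch of that claim, under the assumption of independent random errors (which is of course the assumption in dispute here):

```python
import random

# Average n measurements of n *different* things, each with standard
# uncertainty u (independent, random), and the spread of the average
# comes out near u / sqrt(n).
random.seed(0)
u = 0.5        # standard uncertainty of each individual measurement
n = 25         # measurements combined into each average
trials = 5000

true_values = [random.uniform(0.0, 30.0) for _ in range(n)]
means = []
for _ in range(trials):
    measured = [t + random.gauss(0.0, u) for t in true_values]
    means.append(sum(measured) / n)

true_mean = sum(true_values) / n
spread = (sum((m - true_mean) ** 2 for m in means) / trials) ** 0.5
print(f"spread of the average: {spread:.3f}  (u/sqrt(n) = {u / n ** 0.5:.3f})")
```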

Tim Gorman
Reply to  bdgwx
February 7, 2022 3:31 pm

“Not only does uncertainty propagate through a combining function (like an average) the same regardless of whether the input measurements are of the same thing or not”

I’m sorry, this is just wrong. Measuring the same thing can build a measurement database with random errors surrounding a true value. Those random errors, if they are truly Gaussian (and this isn’t always the case), will tend to cancel with equal numbers of negative errors and positive errors.

When you measure different things, especially using different measurement devices, you do *NOT* build a measurement database with random errors surrounding a true value. In this case the errors do not cancel completely if at all. Nor is there a “true value” represented by the mean.

The *process* of propagating the uncertainty may be similar in both cases but they do *NOT* provide the same descriptive value. Consider multiple measurements of just two items using different measurement devices for each item, one item being twice the length of the other. You wind up with a bi-modal distribution with an average that will be somewhere in the gap between the two modes and which does not describe a “true value”. There is no way to guarantee that the random measurement errors of mode 1 will cancel the random measurement errors of mode 2. And the average accurately describes neither of the two modes, let alone their combination.

Yet this is *exactly* what mathematicians and climate scientists do when combining temperatures in the northern hemisphere with ones from the southern hemisphere. And the excuse that they use anomalies doesn’t hold water either, since the range of temps during winter (which determines the mid-range value) is different from the range of values during summer, and this affects the anomalies as well as the average temps. I have yet to see anyone try to find a “global average temp” using any weighting to account for the different temp ranges between the hemispheres.
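The two-item example is easy to simulate (the lengths and measurement noise are invented):

```python
import random

# Two modes (one item twice the length of the other), measured with
# random error: the combined mean falls in the gap and describes
# neither mode.
random.seed(3)
mode_a = [6.0 + random.gauss(0.0, 0.05) for _ in range(500)]
mode_b = [12.0 + random.gauss(0.0, 0.05) for _ in range(500)]

combined = mode_a + mode_b
mean = sum(combined) / len(combined)
print(f"combined mean: {mean:.2f}")  # ~9, a length neither item has
```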

bigoilbob
Reply to  Tim Gorman
February 8, 2022 9:10 am

“When you measure different things, especially using different measurement devices, you do *NOT* build a measurement database with random errors surrounding a true value.”

If the average of each device converges on the “true value”, with repeated measurements, indeed they do. There is no rule that they must not be “different” or even have gaussian distributions (a bi-modal distribution that converges on the “true value” with multiple measurements is just fine).

AGAIN, my world, oil and gas reservoir, production, and economic modeling, is awash with geological and rheological measurement methods that are 1. greatly different from each other, 2. with greatly different “random errors”, 3. used together in the larger evaluations.

What specific amygdala overamp triggers your flight reflex when presented with these immutable facts?

Tim Gorman
Reply to  bigoilbob
February 8, 2022 11:40 am

“If the average of each device converges on the “true value”, with repeated measurements, indeed they do.”

How can the average of each device converge on a “true value” when you are measuring different things? Will a 100% accurate tape measure measuring a 10′ 2″x4″ and a 6′ 2″x4″ board give you something that converges on a “true value”?

If the average from a 100% accurate measuring device doesn’t converge on a “true value” then how can a measuring device with uncertainty do so?

“There is no rule that they must not be “different” or even have gaussian distributions (a bi-modal distribution that converges on the “true value” with multiple measurements is just fine).”

Temperature measurements from different locations ARE different. Temperature measurements at the same location using different thermometers will give different results.

How do you get a bi-modal distribution to converge on a “true value” when the average doesn’t describe either mode? Your definition of a true value is apparently just the average, like most climate scientists and mathematicians.

“1. greatly different from each other, 2. with greatly different “random errors”, 3. used together in the larger evaluations.”

Then you are not using basic statistical descriptors. The average of greatly different measurements can’t describe a “true value”. Greatly different “random errors” combined with systematic errors just grow the uncertainty, which means the risk associated with the evaluation goes up as well. You’ll have to explain how you use them in larger evaluations.

I assure you, when I went in to pitch a capital project and had to admit that our measurements of peak usage were greatly different and that the uncertainty of our measurements varied greatly I wouldn’t be allowed back in the conference room again.

Jim Gorman
Reply to  bigoilbob
February 8, 2022 12:00 pm

“If the average of each device converges on the “true value”, with repeated measurements, indeed they do. There is no rule that they must not be “different” or even have gaussian distributions (a bi-modal distribution that converges on the “true value” with multiple measurements is just fine).”

You are so full of crap it just isn’t funny.

Where do you think random errors originate? The measurand or the measuring device?

Answer – the measuring device and the process of making a measurement with that device.

If you have two devices measuring different things, you must first prove that the “true value” of each set of independent measurements is the same BEFORE you can assume the average of each is the true value.

If you have two devices measuring two different things and have a bimodal distribution the mean of the two IS NOT the true value of either measurand.

Bimodal distributions generally occur due to two devices measuring the same thing. It is highly unlikely that the mean of the two measurements would be the true value however. It is more likely that that one or both of the instruments is in need of calibration.

Jim Gorman
Reply to  bdgwx
February 7, 2022 4:49 pm

Uncertainty is not the issue when dealing with Significant Digits Rules (SDR). SDR were developed to ensure that extraneous information is not added to measurements by performing mathematical calculations. The information contained in a measurement is shown by the resolution to which the measurement is made. The resolution determines the number of Significant Digits that can be reported.

Any information added to a measurement by extending the measured digits is nothing more than writing fiction. It is creating unreliable numbers.

Using the IEEE 754 spec on floating point is not a reason for anything. In fact this spec has problems in rounding decimal numbers accurately and requires great care in programming to reduce errors.

This whole reference by you just indicates that you have no appreciation for what physical measurements are and how they should be treated. To recommend something that doesn’t follow the rules followed by all physical scientists, engineers, machinists and others just indicates your lack of dedication to correctly portraying measurements.

Carlo, Monte
Reply to  bdgwx
February 7, 2022 9:14 pm

Clownish nonsense.

James Schrumpf
Reply to  bdgwx
February 8, 2022 9:10 am

“It is an indisputable fact. The uncertainty of an average is less than the uncertainty of the individual measurements on which the average is based.”

I don’t think anyone is arguing that point per se, it’s the application of the method that I think is wrong. The way I see it, the above is only applicable if there is a “true value” you’re trying to approach. A board’s length is its “true value”. What’s the “true value” that a thousand different temperature measurements at a thousand different locations is trying to approach?

With the board, we’re not trying to approach the average (or mean) of all the measurements, we’re trying to determine its actual length. If we took a thousand different boards with a range of 100-200 mm and measured each of them once, then took all those measurements and calculated the standard deviation and the uncertainty of the mean, what do we actually now know about all those boards?

Sure, we’ve got a mean, and an SD, and the uncertainty in the mean. What do those values actually tell us about the boards, or an individual board?

At our thousand weather stations, a month’s worth of daily measurements are taken to get a monthly average for a station. There’s already a 30-year baseline of average temps for each month of a station’s history, so the monthly average for a station is subtracted from the baseline value for that month at that station to get the anomaly for that month and year for that station.

The 30-year average for a month is our Xavg. Each current month’s average is our X. The X is subtracted from the Xavg to get the deltaX (the anomaly). All the anomalies for each month of a year are averaged to get the “annual average anomaly” for that station, and SDs and the uncertainty in the mean are calculated and so forth.
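The station-anomaly procedure just described can be sketched with made-up numbers (all values hypothetical; note this sketch uses the conventional sign, monthly minus baseline):

```python
import statistics

# Hypothetical station values, for illustration only.
baseline_jan = 1.8        # Xavg: the station's 30-year (1991-2020) January mean
monthly_avg_jan = 2.1     # X: this January's average of the daily readings

# The anomaly (deltaX) for this station-month, monthly minus baseline.
anomaly_jan = monthly_avg_jan - baseline_jan

# Twelve monthly anomalies (made-up values) averaged into the station's
# "annual average anomaly".
monthly_anomalies = [0.3, -0.1, 0.2, 0.0, 0.4, -0.2,
                     0.1, 0.3, -0.3, 0.2, 0.1, 0.0]
annual_anomaly = statistics.mean(monthly_anomalies)

print(round(anomaly_jan, 2), round(annual_anomaly, 3))
```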

But I don’t think that is sum(Xavg − X) over all station-months with a common Xavg; it’s sum(Xavg1 − X1) with n = 1, repeated twelve times. Each station has its own Xavg to subtract from its monthly average X. Instead of adding the distance of each measurement from a common mean value, the distance of each station from its own mean is summed.

How does that help? Instead of knowing that each anomaly is some distance from a common value, we have anomalies that are each some distance from its own personal mean point.

I wish I could draw this, but it’s as if, instead of the anomalies being shots clustered around a single bulls-eye, with each anomaly measured from where it hit to that bulls-eye, there are bulls-eyes all over the target paper, and each anomaly is measured from its own personal bulls-eye, without knowing how far from the real bulls-eye each of the others is.

It just doesn’t seem like it’s giving any useful information.

Tim Gorman
Reply to  James Schrumpf
February 8, 2022 11:57 am

“How does that help? Instead of knowing that each anomaly is some distance from a common value, we have anomalies that are each some distance from its own personal mean point.”

You nailed it!

Climate is the overall absolute temperature profile at a location, not its anomaly. Fairbanks, AK and Miami can have the same anomaly but *vastly* different climates.

Attached is a graph showing average growing season length (orange) and growing degree-days (red) for the US. Note carefully that while the growing season length (the number of days between last-spring-frost and first-fall-frost) is going up, the growing degree-days (a measure of heat accumulation) are going down!

If max temps were going up you would expect heat accumulation (GDD) to go up as well. And that is what the climate scientists are trying to get us to believe. Tmax is growing more and more every day and soon Earth will be nothing but a cinder with nothing growing.

But the actual data from agricultural scientists, whose job depends on accurate, reproducible results, says otherwise. Growing season length is increasing because minimum temps are going up causing last spring frost to move earlier and first fall frost to move later.

Admittedly this is a national average, and different locations and regions will see different results, so you can’t just project the national average to any and all places in the US. But that is the *exact* same problem you have with the “global average temperature”. You can’t project it to any specific location or region. Which also implies that a one-solution-fits-all approach is bad policy. Solutions have to be tailored to fit the problem at local and regional levels.

Tim Gorman
Reply to  Tim Gorman
February 8, 2022 11:58 am

I forgot the graph.

gdd_avg.png
bigoilbob
Reply to  James Schrumpf
February 8, 2022 1:41 pm

“What’s the “true value” that a thousand different temperature measurements at a thousand different locations is trying to approach?”

Whatever the proper spatial interpolation, with normal statistical rules governing the sum of the variances from both the interpolation and the varying error distributions of the measuring instruments and techniques, arrives at. With an appropriate error band of its own.

What gets lost – especially to the Gormans – is that we are measuring one thing – temperature. Different places, different instruments and methods, with different distributions around the “true value”, at different times if trending is evaluated, but all temperature. We have known the statistical bases for doing this, and then spatially interpolating it with proper error aggregation, for decades. And now we have the computing HP to do it without empirical short cuts.

I’m sorry the results don’t agree with your prejudgments. But not sorry that you and the Gormans don’t have the backup of anyone with actual statistical training, even in this fawning forum…

James Schrumpf
Reply to  bigoilbob
February 8, 2022 3:23 pm

“What’s the “true value” that a thousand different temperature measurements at a thousand different locations is trying to approach?”

“Whatever the proper spatial interpolation, with normal statistical rules governing the sum of the variances from both the interpolation and the varying error distributions of the measuring instruments and techniques, arrives at. With an appropriate error band of its own.”

It appears that you just said there IS no true value but whatever the calculations arrive at.

If that’s so, then the result is meaningless. To go back to the board example, that is saying the true length of the board doesn’t exist until we measure all one thousand different boards and suitably process the results and then proclaim the result as the “true value” of the length of the board, even if no board of that length existed in the population.

Your description sounds more like the statistics applied for figuring public opinion. There are no measurements, only numbers; if you ask 1200 people who they would vote for in a Presidential election, there are no units. Out of the 1200 so many answer this way, so many another. It’s a pure tally of the vote.

That doesn’t seem to be the way to handle physical measurements at a location. The first step in determining a standard deviation is to subtract the mean from each individual measurement. What do you do when you have a thousand different means and a thousand different measurements? Instead of sqrt( sum((X − Xmean)^2) / n ) with n = 1000, you have
sqrt( (X1 − X1mean)^2 + (X2 − X2mean)^2 + (X3 − X3mean)^2 + … + (Xn − Xnmean)^2 ).

How can those possibly relate to give a rational, logical answer?

bdgwx
Reply to  James Schrumpf
February 8, 2022 4:52 pm

I wonder if there is a real world application of averages that might resonate better with you. There are so many examples that could be considered. What about image analysis, especially in the context of astronomical research? I was thinking it is a decent analog because there is a grid of pixels where each pixel has a value assigned to it representing the brightness of that pixel, not unlike how we can represent the Earth as a grid of cells where each cell has a value assigned to it representing the temperature of that cell. If you want to determine the brightness of an astronomical object you can spatially average the pixels just like you would spatially average the cells of Earth to determine the average temperature. There are other interesting similarities between the two that I don’t want to get into yet. I was just thinking that if you can be convinced (if you’re not already) that astronomers can determine the various properties of astronomical objects through image analysis techniques, including spatial averages, then it might make the concept of a global average temperature of Earth (or any planet) more intuitive.

Jim Gorman
Reply to  bdgwx
February 8, 2022 5:36 pm

Not the same thing. You are describing a static image that you are measuring. It would be more appropriate if you said pixels or groups of pixels were missing and you used a clone tool to fill them in with nearby pixels. And/or if you took pixels from images from a large number of different telescopes, averaged them, and then said you obtained a more accurate and precise image by averaging them all together.

Go back to the start of even using “anomalies” for some purpose. Primarily it was to show that the warming of the globe coincided with the growth of CO2. That ignored seasons, hemisphere winter/summer differences, and the fact that as you go back in time there is less and less coverage of the globe. Now that the connection between temp and CO2 is becoming smaller and smaller, the processes used to try to combine land and sea temps to show this connection are also becoming less and less important.

bigoilbob
Reply to  James Schrumpf
February 8, 2022 5:02 pm

“It appears that you just said there IS no true value but whatever the calculations arrive at.”

There is one, but we non-deities don’t know it. But we have a best expected value and its associated error band. It’s what you always end up with when doing technical evaluations with distributed inputs, and almost always with deterministic inputs.

James Schrumpf
Reply to  bigoilbob
February 8, 2022 6:45 pm

What’s the best expected value for the average temperature of the Earth, and why do you think so?

bigoilbob
Reply to  James Schrumpf
February 8, 2022 7:09 pm

It is a dreaded calculated value, of course. I.e., it’s the properly spatially interpolated mean, or average. It varies with time and with the spatial interpolation technique used (there is more than one). It should come with the aggregation of the distributed uncertainties of the instrumentation and processes used for that time period, the residuals from the spatial interpolations, and so on. See the BEST temp data, for example.

But what we are largely looking for are the trends over physically/statistically significant time periods, and their standard errors. These have proven to be quite disturbingly durable w.r.t. ACC. So much so that when the doubters get cornered, they invariably change the subject…

Carlo, Monte
Reply to  bigoilbob
February 8, 2022 9:20 pm

You’re a liar, blob.

bdgwx
Reply to  James Schrumpf
February 9, 2022 5:57 am

I don’t think there is a best or optimum temperature for Earth.

James Schrumpf
Reply to  bdgwx
February 9, 2022 8:20 am

Here’s how I see the problem under discussion. In a probability analysis there are no units. If there are no units there is no uncertainty except the statistical ones.

But these are measurements with units, and that is the uncertainty getting tossed aside in the probability analysis.

All those measurements are in hundredths of a degree C, which is ridiculous on its face before modern instrumentation, but that’s not the problem either, as digits can be removed. The problem is if those measurements were taken on a thermometer with one-degree increments, all those measurements have an uncertainty of +/- 0.5 degrees C. Modern thermometers are probably around +/- 0.05 degrees C.

When I look at a temperature series like 10.1,11.1, 9.8, 7.4, 10.5, I see 10.1+/- 0.05, 11.1 +/- 0.05, etc. When the mean is calculated, the uncertainty goes along:

(10.1 + 11.1 + 9.8 + 7.4 + 10.5) / 5 = 9.8
(0.05 + 0.05 + 0.05 + 0.05 + 0.05) / 5 = 0.05

The mean is 9.8 C +/- 0.05 C

They carry along in the standard deviation. Raising an uncertainty to a power multiplies the uncertainty by the power.

(10.1 +/- 0.05 C – 9.8 +/- 0.05 ) ^2 = 0.09 +/- 0.1 C

Completing the calculation:

(11.1 +/- 0.05 C – 9.8 +/- 0.05 ) ^2 = 1.69 +/- 0.1 C
(9.8 +/- 0.05 C – 9.8 +/- 0.05 ) ^2 = 0.0 +/- 0.1 C
(7.4 +/- 0.05 C – 9.8 +/- 0.05 ) ^2 = 5.76 +/- 0.1 C
(10.5 +/- 0.05 C – 9.8 +/- 0.05 ) ^2 = 0.49 +/- 0.1 C

Standard deviation = 1.6 +/- 0.3 C

If I did all the maths right, that’s the correct result. Question is, would a mathematician or statistician carry that +/- 0.3 C uncertainty along?

bdgwx
Reply to  James Schrumpf
February 9, 2022 9:06 am

JS said: “ If there are no units there is no uncertainty except the statistical ones.”

It turns out that uncertainty is assessed the same regardless of the units of measure or even if there are units at all. The GUM has an example of determining combined uncertainty when the combining function results in a unitless value.

JS said: “(10.1 + 11.1 + 9.8 + 7.4 + 10.5) / 5 = 9.8
(0.05 + 0.05 + 0.05 + 0.05 + 0.05) / 5 = 0.05
The mean is 9.8 C +/- 0.05 C”

That’s not how uncertainty propagates though. Using GUM [1] equation 10, Taylor [2] equations 3.16, 3.18, or 3.47, or the NIST [3] monte carlo method all say the mean is 9.78 ± 0.02 C. That’s 5 different methods all giving the exact same answer. The general formula for the propagation of uncertainty through a combining function that produces an average is u(Tavg) = u(T) / sqrt(N) when all Ti elements have the same individual uncertainty u(T) and there are N elements. It doesn’t even matter what the distribution of u(T) is. It could be gaussian, uniform, etc. It works out all the same.

The UAH uncertainty is significantly more complicated to assess. The spot measurement uncertainty is about ± 1 K. And even though there are 9504 cells in the grid mesh, the uncertainty of the average of the grid mesh isn’t 1/sqrt(9504). This is because there is a non-zero correlation between grid cells due to the way the satellites view the Earth. The degrees of freedom of the grid mesh turns out to be 26, so the uncertainty is closer to 1/sqrt(26 – 1) = 0.2 for a monthly average. Christy et al. 2003 provides a lot of details regarding the uncertainty of the UAH dataset.

James Schrumpf
Reply to  bdgwx
February 9, 2022 12:49 pm

“It turns out that uncertainty is assessed the same regardless of the units of measure or even if there are units at all. The GUM has an example of determining combined uncertainty when the combining function results in a unitless value.”

That Equation 10 looks pretty complicated to use on adding 5 simple measurements. You sure a simpler method won’t work?

I’m suspecting we’re talking about two different things. I’m talking about measurement error. As in, if I read a thermometer marked in degrees, any reading I take is going to have a measurement uncertainty of +/- 0.5 degrees.

I think you are talking about statistical uncertainties, the standard deviation and the uncertainty in the mean.

Could you indulge me and show how you would propagate the measurement error in my example above?

Don’t forget to show your work!

bdgwx
Reply to  James Schrumpf
February 9, 2022 2:29 pm

JS said: “That Equation 10 looks pretty complicated to use on adding 5 simple measurements. You sure a simpler method won’t work?”

It’s not terrible. The most confusing part for those who aren’t familiar with calculus notation is the partial derivative of the combining function wrt the inputs. For a function that computes the average, the partial derivative is 1/N, since changing an input by 1 unit changes the output by 1/N units. There are easier methods, but GUM 10 is generic enough that it can handle arbitrarily complex output functions.

JS said: “I’m suspecting we’re talking about two different things. I’m talking about measurement error. As in, if I read a thermometer marked in degrees, any reading I take is going to have a measurement uncertainty of +/- 0.5 degrees.”

Ah…got it. The ±0.5 figure here is the read uncertainty due to the instrument reporting in increments of 1. That changes things a bit actually. Because that is a uniform distribution with the read error between -0.5 and +0.5, the standard deviation works out to 0.289. For -0.05 and +0.05 the standard deviation works out to 0.0289. That’s the figure we’ll need to plug into the various combined uncertainty equations for the example below.

JS said: “Could you indulge me and show how you would propagate the measurement error in my example above?”

Absolutely. Note that given my new understanding that the ±0.05 figure was the bounds of a uniform distribution and not the standard uncertainty the answer is going to be a bit different than what I gave you previously. The equivalent standard uncertainty for ±0.05 is ±0.0289. Here is GUM equation 10 solved both numerically and algebraically using ±0.0289.

GUM 10 – numerical method

x_1 = 10.1, u(x_1) = 0.0289
x_2 = 11.1, u(x_2) = 0.0289
x_3 = 9.8, u(x_3) = 0.0289
x_4 = 7.4, u(x_4) = 0.0289
x_5 = 10.5, u(x_5) = 0.0289

y = f = (x_1+x_2+x_3+x_4+x_5)/5
y = 9.78

∂f/∂x_1 = 0.2
∂f/∂x_2 = 0.2
∂f/∂x_3 = 0.2
∂f/∂x_4 = 0.2
∂f/∂x_5 = 0.2

u(y)^2 = Σ[(∂f/∂x_i)^2 * u(x_i)^2, 1, 5]

u(y)^2 = 0.2^2*0.0289^2 + 0.2^2*0.0289^2 + 0.2^2*0.0289^2 + 0.2^2*0.0289^2 + 0.2^2*0.0289^2

u(y)^2 = 0.0000334 * 5 = 0.000167

u(y) = sqrt(0.000167)

u(y) = 0.0129

GUM 10 – algebraic method

u^2(y) = Σ[(∂f/∂T_i)^2 * u^2(T_i), 1, N]

Let…

y = f = Σ[T_i, 1, N] / N

Therefore…

∂f/∂T_i = 1/N for all T_i

And then it follows that…

u^2(y) = Σ[(1/N)^2 * u^2(T_i), 1, N]

And when u^2(T_i) is the same for all T_i then…

u^2(y) = ((1/N)^2 * u^2(T)) * N

u^2(y) = N * 1/N^2 * u^2(T)

u^2(y) = 1/N * u^2(T)

u^2(y) = u^2(T) / N

u(y) = sqrt[u^2(T) / N]

u(y) = u(T) / sqrt(N) = 0.0289 / sqrt(5) = 0.0129

NIST monte carlo method [1]

x0 is uniform between 10.05 and 10.15
x1 is uniform between 11.05 and 11.15
x2 is uniform between 9.75 and 9.85
x3 is uniform between 7.35 and 7.45
x4 is uniform between 10.45 and 10.55

y = (x0+x1+x2+x3+x4)/5 = 9.78

u(y) = 0.0129
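A quick Monte Carlo cross-check of these numbers, along the lines of the NIST method (a sketch, not the NIST calculator itself):

```python
import random
import statistics

random.seed(0)

# Monte Carlo sketch: each of the 5 readings gets a uniform read error
# of +/-0.05, and we look at the spread of the resulting 5-value means.
readings = [10.1, 11.1, 9.8, 7.4, 10.5]
trials = 200_000

means = []
for _ in range(trials):
    perturbed = [r + random.uniform(-0.05, 0.05) for r in readings]
    means.append(sum(perturbed) / len(perturbed))

avg = statistics.mean(means)        # expect ~9.78
u_mean = statistics.pstdev(means)   # expect ~0.0289/sqrt(5) = 0.0129

print(round(avg, 2), round(u_mean, 4))
```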

James Schrumpf
Reply to  bdgwx
February 9, 2022 5:56 pm

“For -0.05 and +0.05 the standard deviation works out to 0.0289.”

How? Please show me that calculation. My maths work out to 0.05.

bdgwx
Reply to  James Schrumpf
February 9, 2022 6:13 pm

The formula for the variance and standard deviation of a uniform distribution is as follows.

σ^2 = (b-a)^2/12

σ = sqrt[(b-a)^2/12]

So for a uniform distribution with endpoints a = -0.05 and b = 0.05 we have the following.

σ = sqrt[(0.05 + 0.05)^2/12] = 0.0289.

The NIST uncertainty calculator will calculate the SD of any distribution as well. Leave everything at the defaults except change x0 to a uniform distribution with left and right endpoints specified as -0.05 and +0.05.
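The same formula takes only a couple of lines to check (a minimal sketch):

```python
import math

# SD of a uniform distribution on [a, b]: sqrt((b - a)^2 / 12).
def uniform_sd(a, b):
    return math.sqrt((b - a) ** 2 / 12)

# Read error of +/-0.5 (1-degree markings) and +/-0.05 (0.1-degree markings).
sd_half = uniform_sd(-0.5, 0.5)         # ~0.2887
sd_twentieth = uniform_sd(-0.05, 0.05)  # ~0.0289

print(round(sd_half, 4), round(sd_twentieth, 4))
```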

James Schrumpf
Reply to  bdgwx
February 9, 2022 8:23 pm

This is not a uniform distribution; it’s an uncertainty in a measurement. The LSU Physics Dept. lab web page on Uncertainties and Error Propagation says this:

Example
w = (4.52 ± 0.02) cm,
x = ( 2.0 ± 0.2) cm,
y = (3.0 ± 0.6) cm.
Find z = x + y – w and its uncertainty.

z = x + y – w = 2.0 + 3.0 – 4.5 = 0.5 cm
Dz = sqrt(0.02^2 + 0.2^2 + 0.6^2)
= sqrt(0.0004 + 0.04 + 0.36)
= sqrt(0.4004) = 0.6
z = 0.5 +/- 0.6 cm

This isn’t wrong. But it’s not what you’re doing.
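For what it’s worth, the quoted LSU arithmetic can be checked directly (a sketch; note the LSU page rounds 4.52 to 4.5 when reporting z = 0.5):

```python
import math

# Root-sum-square propagation for z = x + y - w, using the LSU numbers.
w, u_w = 4.52, 0.02
x, u_x = 2.0, 0.2
y, u_y = 3.0, 0.6

z = x + y - w                              # 0.48 (LSU rounds w and reports 0.5)
u_z = math.sqrt(u_w**2 + u_x**2 + u_y**2)  # sqrt(0.4004) ~ 0.63, reported as 0.6

print(round(z, 2), round(u_z, 2))
```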

bdgwx
Reply to  James Schrumpf
February 9, 2022 9:36 pm

You described it as being the result of the limitation in the markings on the instrument being only in units of 0.1. That means there is equal probability of the true value being anywhere in the range -0.05 to +0.05 of what you read from the instrument. For example, -0.01 is just as likely as + 0.01 or any other value in that range. That is a uniform distribution. If I’ve misunderstood the meaning of your 0.05 uncertainty figure then no big deal. Just tell me what it means and I’ll redo the calculation.

For the new example I get the same answer as LSU using both GUM equation 10 and the monte carlo method. BTW…GUM equation 10 reduces to the formula LSU used for the propagation of uncertainty through an output function that only contains addition and subtraction. This is the well known root-sum-square formula and can actually be derived from GUM equation 10. In fact I’ve derived it in a few posts here already.

James Schrumpf
Reply to  bdgwx
February 10, 2022 3:00 am

“-0.01 is just as likely as +0.01 or any other value in that range. That is a uniform distribution.”

Is it? Sounds to me like it’s saying “any measurement is only accurate to +/- 0.05 C”. There’s no distributed sample there, it’s just a statement of accuracy.

On the gripping hand, this is a snip of a histogram of a randomly selected GHCN Monthly site with 30 years of good monthly data, from 1992-2021. The mean of the data is 9.6 C. What kind of a distribution would that be called?

temp_histogram.png
bdgwx
Reply to  James Schrumpf
February 10, 2022 6:07 am

JS said: “Is it?”

Yes. If the instrument only displays units of 0.1 then the true value could be any value within 0.05 of what is displayed. It is uniform because the true value would not exhibit any preference for a specific digit at the 2nd decimal place. In other words all digits in the 2nd decimal place are equally likely…10% for each.

You should be able to convince yourself of this easily in Excel. In 100 cells of column A enter “=RAND()”. In column B enter “=ROUND(A1, 1)” and repeat for each Ax value in column A. In column C enter “=B1 – A1” and repeat for each Ax and Bx pair in columns A and B. Finally, in another cell enter “=STDEV.P(C1:C100)”. You’ll get a value very close to 0.0289.
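The same experiment can be sketched outside Excel (Python here, with a larger sample so the estimate settles down; the 0.0289 is the SD of the uniform rounding error):

```python
import random
import statistics

random.seed(1)

# Python version of the Excel experiment: draw uniform randoms, round to
# 1 decimal place, and look at the spread of the rounding errors.
n = 100_000  # more than 100 cells, so the estimate settles down
values = [random.random() for _ in range(n)]
errors = [round(v, 1) - v for v in values]

sd_err = statistics.pstdev(errors)  # expect ~0.0289
print(round(sd_err, 4))
```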

JS said: “There’s no distributed sample there, it’s just a statement of accuracy.”

I’m not sure what you mean by “accuracy” here. Technically, and per ISO 5725, accuracy describes the bias of every measurement. It essentially shifts the error distribution to the left or right. The markings on the instrument or the display of the value reported by the instrument do not influence the accuracy in any way. Saying an instrument can only report or be read in units of 0.1 does not in any way describe the accuracy of the instrument. It only describes a limitation of the precision of the instrument. Again, I’m using formal ISO 5725 language here.

JS said: “What kind of a distribution would that be called?”

That is an arbitrary and asymmetric distribution. It does not fit the common types: normal, uniform, triangular, exponential, weibull, etc. But, and this is important, it is still a distribution and still has a standard deviation. I’m not sure of the relevance to the discussion here because it is not an error distribution. That histogram of Tavg does not describe an uncertainty or the dispersion of values that could reasonably be attributed to a thing being measured.

bigoilbob
Reply to  bdgwx
February 10, 2022 7:12 am

You seem to think that you can teach the basic ground rules to the hard kernel of the miswired who seem to have their flight reflexes triggered whenever they are exposed to them. Your many earnest, imaginative attempts are truly admirable, but do you see the pattern yet?

  1. Initial engagement.
  2. Deflection, a la, “different instruments”, “different places”, “different times”.
  3. True denial of stat 101 concepts.
  4. Wholesale subject change. https://wattsupwiththat.com/2022/02/03/uah-global-temperature-update-for-january-2022-0-03-deg-c/#comment-3450532

Mr. Schrumpf at least seems interested in learning, if still blocked. Your insights like “That is an arbitrary and asymmetric distribution. It does not fit the common types: normal, uniform, triangular, exponential, weibull, etc. But, and this is important, it is still a distribution and still has a standard deviation.” might be helpful.

I too get pulled back in from time to time, so who am I to talk. I admire your patience. I read your local posts once in a while, even though I will never invest the years that you and Nick have put in to become intimate with the fundamentals and specifics of temp reconstructions.

bdgwx
Reply to  bigoilbob
February 10, 2022 10:04 am

I am the eternal optimist for sure!

I do wish we could get past the denial that an average has lower uncertainty than the individual measurements that go into it.

If we could get past the denial of simple statistical principles and stop the seemingly endless barrage of strawmen that accompanies these discussions, we might actually be able to discuss legitimate concerns with the accuracy, precision, and uncertainty of the UAH anomaly values. I’m thinking of things like the one-size-fits-all TLT weighting function, the possibility that the cooling stratosphere is contaminating the TLT values, possible systematic biases affecting the trend, and many other topics that are far more productive and valuable.

Carlo, Monte
Reply to  bdgwx
February 10, 2022 12:37 pm

“I do wish we could get past the denial that an average has lower uncertainty than the individual measurements that go into it.”

I do wish you’d take your lies back to wherever they came from.

James Schrumpf
Reply to  bdgwx
February 10, 2022 5:58 pm

“I do wish we could get past the denial that an average has lower uncertainty than the individual measurements that go into it.”

Speaking probability-wise, it’s obvious that the average of several measurements of our beloved board is most likely more accurate than any individual measurement. That doesn’t rule out the possibility of one or more of the measurements being smack dead on the true value (though we can’t know that), while the average/mean is off a few hairs.

“If we could get past the denial of simple statistical principles and stop the seemingly endless barrage of strawmen that accompanies these discussions”

I don’t see the simple principles being protested. It’s the complicated ones giving me doubts.

I’ve just been looking at a paper put out by NIST, Technical Note 1900 “Simple Guide for Evaluating and Expressing the Uncertainty of NIST Measurement Results”. Starting on page 24 are several examples of measurements and the calculation of uncertainty.

“The equation, ti = r+Ei, that links the data to the measurand, together with the assumptions
made about the quantities that figure in it, is the observation equation. The measurand r is
a parameter (the mean in this case) of the probability distribution being entertained for the
observations.

“Adoption of this model still does not imply that r should be estimated by the average of
the observations — some additional criterion is needed. In this case, several well-known
and widely used criteria do lead to the average as “optimal” choice in one sense or another:
these include maximum likelihood, some forms of Bayesian estimation, and minimum mean
squared error.”

Example 2 is “proceeding as in the GUM (4.2.3, 4.4.3, G.3.2), the average of the m = 22 daily readings is t̄ = 25.6 °C, and the standard deviation is s = 4.1 °C. Therefore, the standard uncertainty associated with the average is u(r) = s/√m = 0.872 °C. The coverage factor for 95% coverage probability is k = 2.08, which is the 97.5th percentile of Student’s t distribution with 21 degrees of freedom. In this conformity, the shortest 95% coverage interval is t̄ ± ks/√m = (23.8 °C, 27.4 °C).”

NIST’s values for s, the average temp, and the standard uncertainty are the same as Excel calculated them.

This exercise is exactly the same thing we do with GHCN Daily temperatures every month to get the GHCN Monthly summaries. It seems as though NIST agrees with the simple approach here.
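For what it’s worth, the quoted NIST numbers reproduce in a few lines of Python. The summary statistics are taken straight from the quote above; the k = 2.08 coverage factor is hardcoded rather than computed from the t distribution:

```python
import math

# Summary statistics quoted from NIST TN1900 Example 2
m = 22        # number of daily Tmax readings
t_bar = 25.6  # average, deg C
s = 4.1       # standard deviation, deg C
k = 2.08      # 97.5th percentile of Student's t with 21 d.o.f.

u = s / math.sqrt(m)                   # standard uncertainty of the average
lo, hi = t_bar - k * u, t_bar + k * u  # shortest 95% coverage interval
print(f"u = {u:.3f} C, interval = ({lo:.1f} C, {hi:.1f} C)")
# u ≈ 0.87 C, interval ≈ (23.8 C, 27.4 C)
```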

NIST_surface_temps.png
Jim Gorman
Reply to  James Schrumpf
February 10, 2022 7:41 pm

A few things.

1)
Remember the GUM is based on a single measurand, not multiple ones. Even the references (Sec. 4) that discuss combining multiple measurements assume a function that relates them to a single final value; in other words, something like L × W × H = Volume. An average (mean) is not a function that builds to a final value. A mean is a statistical parameter of a distribution, not a function.

2)
“s/sqrt n” is what is known as SEM. It is the interval within which the estimated mean may lie based on the size of a sample distribution. It is based on the sample forming a normal distribution. It is actually based on sampling theory.

It is not measurement uncertainty. It is an assessment of how accurate an estimated mean obtained from a small number of sample measurements may be. It is a sample statistic used to estimate statistical parameters. This is certainly a type of uncertainty, but it doesn’t replace an assessment of measurement uncertainty.

The actual measurement uncertainty assessment uses root-sum-square calculation.

3)
Using a coverage factor gives an “expanded” uncertainty. It is certainly something that could be used and you show it will result in a much larger uncertainty value.

4)
You cannot divide by √N (say, 9000 stations) and claim you have increased both the accuracy and the precision of data recorded prior to 1980. It just isn’t possible. Integer data must be used consistently as integers. You cannot use a baseline from data newer than 1990, which has 1 or 2 decimals, to convert integer data into a similar resolution. You immediately lose all the uncertainty information that comes with having integer data. Too much of the discussion here is deflected into anomaly uncertainty and ignores the real data uncertainty.

bdgwx
Reply to  James Schrumpf
February 11, 2022 7:48 am

JS said: “This exercise is exactly the same thing we do with GNCN Daily temperatures every month to get the GHCN Monthly summaries.”

Yeah, at least for monthly station summaries. Note that NIST uses a type A evaluation of uncertainty in the example. They could have done a type B evaluation as well by taking the assessed uncertainty of each Tmax measurement and combining them via an output function that computes an average. Both type A and type B are acceptable per the available literature. They often provide different results, so it is often preferred to use both.

Since NIST uses the type A method let’s do the same thing on the UAH monthly grid. You can download the gridded data here. There are 9504 values. The standard deviation for January 2022 is 12.45 K which means the uncertainty on the average is 12.45 / sqrt(9504) = 0.13 K using the same type A method that NIST used in their example. Interestingly this is not significantly different from the type B method with full propagation through the gridding and spatial averaging steps used by Christy et al. 2003. Their result is 0.10 K for a monthly average.
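The same type A recipe can be sketched in Python on a stand-in for the grid. The values below are random placeholders generated with the stated 12.45 K standard deviation, not the actual UAH data:

```python
import math
import random
import statistics

# Hypothetical stand-in for the 9504 gridded monthly values:
# random draws with a standard deviation near the quoted 12.45 K
grid = [random.gauss(0.0, 12.45) for _ in range(9504)]

s = statistics.stdev(grid)    # sample standard deviation
u = s / math.sqrt(len(grid))  # type A uncertainty of the average
print(round(u, 2))  # ≈ 0.13
```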

James Schrumpf
Reply to  bdgwx
February 11, 2022 11:11 am

The NIST method calculates a 25.6 C mean, a 4.1 C standard deviation, and a 0.87 C SEM for the month of May 2012. The year’s measurements are completed, and now it’s time to calculate the mean for the year. We’ve got 12 means, 12 standard deviations, and 12 SEMs.

How is all this uncertainty handled?

bdgwx
Reply to  James Schrumpf
February 11, 2022 12:47 pm

You have two choices at this point. Continue the propagation using the type B method or do another type A analysis separately. I’ll do both types with the official Washington D.C. reporting station for the year 2012.

Type A Monthly (N={29-31})

Jan: 9.6 ± 1.0
Feb: 11.3 ± 0.8
Mar: 19.1 ± 1.0
Apr: 20.1 ± 0.9
May: 26.7 ± 0.5
Jun: 29.9 ± 0.9
Jul: 33.9 ± 0.7
Aug: 32.0 ± 0.5
Sep: 27.1 ± 0.6
Oct: 20.5 ± 0.9
Nov: 12.7 ± 0.7
Dec: 11.3 ± 0.8

Type A Annual from Monthly (N=12)

2012: 21.2 ± 2.4

Type A Annual from Daily (N=366)

2012: 21.2 ± 0.5

Type B Annual from Monthly Type A

2012: 21.2 ± 0.1

Type B Full Propagation assuming ± 0.3 obs

Jan: 9.6 ± 0.1
Feb: 11.3 ± 0.1
Mar: 19.1 ± 0.1
Apr: 20.1 ± 0.1
May: 26.7 ± 0.1
Jun: 29.9 ± 0.1
Jul: 33.9 ± 0.1
Aug: 32.0 ± 0.1
Sep: 27.1 ± 0.1
Oct: 20.5 ± 0.1
Nov: 12.7 ± 0.1
Dec: 11.3 ± 0.1

2012: 21.18 ± 0.01

The ± 0.3 figure is based on Hubbard & Lin 2003.

Note that this assumes zero correlation. In reality the monthly ± 0.1 and annual ± 0.01 will be far too low of an estimate.
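Under that zero-correlation assumption, propagating equal observation uncertainties through a plain average reduces to dividing by √N. A minimal sketch; the ±0.3 input is the figure cited just above, everything else is generic:

```python
import math

def u_of_mean(u_obs, n):
    """Standard uncertainty of the average of n uncorrelated
    measurements, each with standard uncertainty u_obs: the
    root-sum-square of n equal terms, divided by n."""
    return math.sqrt(n * u_obs**2) / n  # == u_obs / sqrt(n)

print(u_of_mean(0.3, 31))   # a 31-day month: ≈ 0.054
print(u_of_mean(0.3, 366))  # the 366-day year 2012: ≈ 0.016
```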

bigoilbob
Reply to  James Schrumpf
February 11, 2022 5:15 pm

“How is all this uncertainty handled?”

A good question, hopefully asked for elucidation. I know that it can be rigorously evaluated, and I know how I would do it. My process would honor DOM weighting and any correlation between the monthly standard deviations. But bdgwx and Nick Stokes have spent years on the step by step specifics. I hope that one of them responds…

Jim Gorman
Reply to  bdgwx
February 11, 2022 2:14 pm

Dividing 12.45 by √9504 at best only tells you the interval within which the mean may lie. IOW, the Standard Error of the sample Mean, the SEM. The SEM is not a gauge of the uncertainty of a measurement unless there is a single measurand. It is a statistic of a sample distribution; it is not a statistical parameter of a population.

The formula for SEM is:

SEM = σ/√N where,

SEM is the Standard Error of the sample Mean
σ is the standard deviation of a population
N is the sample size

Look at what you are doing here.

First, by using sigma (SD), you are defining your data as an entire population.

Second, you then declare the data to be a group of 9504 samples so that you can divide σ by a large number, when in actuality each sample has a size of 12 (i.e., an average of 12 months), and that should be the value of “N”.

These don’t go together. Your data is either a group of 9504 samples or it is the entire population; one or the other, it can’t be both. You are doing what many, many scientists do.

Read this document that NCBI felt appropriate for their website.

Standard Error | What It Is, Why It Matters, and How to Calculate (scribbr.com)

and another:

Basics of Estimating Measurement Uncertainty (nih.gov)

and from:

Standard Error | What It Is, Why It Matters, and How to Calculate (scribbr.com)

“The standard error of the mean, or simply standard error, indicates how different the population mean is likely to be from a sample mean. It tells you how much the sample mean would vary if you were to repeat a study using new samples from within a single population.”

Read this, especially the part about temperatures.

Standard Deviation Calculator

Carlo, Monte
Reply to  bigoilbob
February 10, 2022 12:36 pm

blob the Idiot.

James Schrumpf
Reply to  bdgwx
February 10, 2022 10:24 am

I’m not sure what you mean by “accuracy” here. Technically, and per ISO 5725, accuracy describes the bias of every measurement. It essentially shifts the error distribution to the left or right. The markings on the instrument or the display of the value reported by the instrument do not influence the accuracy in any way. Saying an instrument can only report or be read in units of 0.1 does not in any way describe the accuracy of the instrument. It only describes a limitation of the precision of the instrument. Again, I’m using formal ISO 5725 language here.

I did use the wrong term; I meant precision. Here are the specs for the temperature sensor used by NOAA in the USCRN stations, which I believe are their highest-quality stations:

Type: Platinum 1000 ohm ± 0.04% at 0°C (per IEC-751, Class A accuracy)

yadda yadda yadda

Accuracy: ±0.04% over full range

I’m presuming these stations are fully automatic and humans do not pop in to read these thermometers, so if it says the temp is 15.33 C, it’s reading those hundredth-place temperatures with no measurement uncertainty due to a human squinting at a thermometer marked in whole digits and trying to guess whether that is 23.6 or 23.7 C.

When I was taking physics we were taught that the relative error (expressed as a percentage) was calculated as dX/X. Zero point zero four percent is 0.0004 as a decimal, so does that mean with this instrument, if it reads the daytime high as 23.6 C, that the uncertainty in that measurement is ±23.6 × 0.0004, i.e., 23.6 ± 0.009 C? If so, what happens when it measures 0.00 C? Is the accuracy meaningless?

I performed your Excel experiment, and it was as you said, though I don’t see what it’s telling me about anything. Take a bunch of randomly generated 5-6 decimal place numbers, subtract a rounded version of each from itself, and calculate the standard deviation of the resulting column of numbers. What does that tell me?

My first thought was that perhaps the random generator algorithm included a built-in standard deviation of approximately that number. So I used 100 temperature measurements from one of the NOAA GHCN-Monthly files and performed the same exercise. The results were about 8.7% smaller than for the random numbers.

Is that significant? I don’t know. I don’t know what the point of the exercise was.

I read this paper online: J. Chem. Educ. 2020, 97, 5, 1491–1494, and while I didn’t understand all of it, this was very clear:

Monte Carlo simulations for uncertainty propagation take as inputs the uncertainty distribution for each variable and an equation for the calculation of a desired quantity. The desired quantity is then calculated by randomly drawing from the specified uncertainty distributions of the input variables. This calculation is then repeated many times (often 10^6 or greater) with new random drawings each time.
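The quoted procedure can be sketched for the simple case of averaging temperatures. Everything here is hypothetical: 30 made-up readings, each assumed to carry a normal ±0.3 C standard uncertainty:

```python
import random
import statistics

# Hypothetical inputs: 30 daily temperature readings
true_temps = [random.uniform(10.0, 30.0) for _ in range(30)]

# Monte Carlo propagation: redraw each input from its uncertainty
# distribution, recompute the average, and repeat many times
draws = []
for _ in range(20_000):
    perturbed = [t + random.gauss(0.0, 0.3) for t in true_temps]
    draws.append(statistics.fmean(perturbed))

# The spread of the recomputed averages is the propagated uncertainty
print(statistics.pstdev(draws))  # ≈ 0.3 / sqrt(30) ≈ 0.055
```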

Question. How does this repeated sampling and recalculating apply to a mass of temperature measurements?

bdgwx
Reply to  James Schrumpf
February 10, 2022 11:47 am

JS said: ” If so, what happens when it measures 0.00 C? Is the accuracy meaningless?”

PT1000 instruments work because temperature alters the electrical characteristics of the material such that its resistance increases with increasing temperature. The ±0.04% figure you see is for the uncertainty of the ohm measurement by the data logger. It is given as a percent of the full scale range (FSR) of ohms over a temperature range of, I believe, -25 to +50 C. This ohm range is about 290 ohms, so 0.0004 * 290 = 0.12 ohms. And since there are about 3.9 ohms per C, that would translate into a temperature uncertainty of about 0.03 C. However, I did look up the specific data logger for USCRN. Some use the CR23X, which has a resistance uncertainty of about ±0.02%, translating into a theoretical temperature uncertainty of about half that. However, due to other factors, Hubbard et al. 2005 concluded that it could be as high as ±0.33 C. So to answer your question: at 0 C the uncertainty is probably on the order of ±0.3 C for a typical USCRN station.

JS said: “What does that tell me?”

What that tells you is:

1) The error distribution when you truncate or round digits is uniform.

2) The error distribution when you truncate or round digits has a standard deviation given by 0.289 / D where D is the number of decimal places you keep.

For example, it is common for temperature instruments to only display 1 digit after the decimal place. That means the read uncertainty is uniform between -0.05 and +0.05 with an SD of 0.0289 C. So if the display says 15.1 C then the measured value could fall between 15.05 and 15.15 C. You just don’t know what it is because the display only gives you 1 digit after the decimal place.

It is also important to note that the read uncertainty is different from the measurement uncertainty. For example, let’s consider the CR23X data logger. It has a published measurement uncertainty of ±0.2 C. Now let’s say you log the values to Excel but with only 1 digit after the decimal place…just because. That injects a read uncertainty of ±0.03 (remember the 0.0289 SD). The combined uncertainty of your dataset per GUM equation 10 or using the simpler root sum square formula is sqrt(0.2^2 + 0.03^2) = 0.202 or 0.2. Notice how the CR23X measurement error dominates the combined uncertainty such that the read uncertainty is negligible.
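The root-sum-square combination mentioned here is a one-liner; a sketch using the numbers from the paragraph above:

```python
import math

def combine_rss(*components):
    """Combine independent standard-uncertainty components by
    root-sum-square (the uncorrelated case of GUM equation 10)."""
    return math.sqrt(sum(c * c for c in components))

# CR23X logger uncertainty (0.2) plus the 1-decimal read
# uncertainty (SD 0.0289): the logger term dominates
print(round(combine_rss(0.2, 0.0289), 3))  # 0.202
```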

Jim Gorman
Reply to  bdgwx
February 10, 2022 12:12 pm

You need to decipher how temps can be quoted to the 1/1000ths when the uncertainty means you’ll never know if 0.002 is correct or not.

Show JS how averaging different things allows one to increase the resolution of the measurements. How do you go from a resolution of 0.1 to 0.001 via arithmetic averaging?

Carlo, Monte
Reply to  Jim Gorman
February 10, 2022 12:41 pm

Via mental willpower that it be so.

bdgwx
Reply to  Jim Gorman
February 10, 2022 1:54 pm

bdgwx said: “2) The error distribution when you truncate or round digits has a standard deviation given by 0.289 / D where D is the number of decimal places you keep.”

Yikes. I just noticed that I butchered that. It should be 0.289 / 10^D, where D is the number of decimal places kept (for 1 decimal place, σ = 0.0289, matching the example above).

BTW…here is the derivation of that using the variance formula for a uniform distribution, with the rounding error uniform on (a, b) = (-0.5/10^D, +0.5/10^D):

σ^2 = 1/12 * (b – a)^2

σ^2 = 1/12 * (0.5/10^D + 0.5/10^D)^2

σ^2 = 1/12 * (1/10^D)^2

σ = sqrt[1/12 * (1/10^D)^2]

σ = sqrt[1/12] * 1/10^D

σ = 0.289 / 10^D
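That width/√12 result is easy to check empirically; a sketch that simulates rounding errors (sample size and value range are arbitrary choices):

```python
import math
import random
import statistics

def rounding_error_sd(decimals, n=200_000):
    """Empirical SD of the error introduced by rounding random
    values to the given number of decimal places."""
    errs = [(x := random.uniform(0.0, 100.0)) - round(x, decimals)
            for _ in range(n)]
    return statistics.pstdev(errs)

for d in (1, 2):
    print(d, rounding_error_sd(d), 10**-d / math.sqrt(12))
# the empirical and theoretical SDs agree: ~0.0289 and ~0.00289
```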

Carlo, Monte
Reply to  bdgwx
February 10, 2022 12:39 pm

As usual, your rants about uncertainty are nonsense.

Jim Gorman
Reply to  James Schrumpf
February 10, 2022 12:04 pm

Here is a screenshot of a section from the GUM that covers what you say.

Please note that a function for combining different measurements is needed. Using a statistical tool like averaging does not, by itself, constitute a valid method for dealing with uncertainty.

Finding an SEM, Standard Error of the Sample Mean, only tells you how accurate your estimated mean calculation may be for the distribution that you have. It does not deal with the propagation of the uncertainty inherent in each and every measurement used to calculate that mean and SEM.

Capture+_2022-02-10-13-53-20.png
Jim Gorman
Reply to  James Schrumpf
February 10, 2022 8:40 am

Don’t get too caught up in the minutiae of uncertainty in the measurements. The real issue is taking old data recorded in integers and expanding the information available from those measurements to claim far more significant digits.

If proper scientific data analysis were being done, anomaly baselines would only have two significant digits and the corresponding anomalies would be integers as well. Error bars would be at least ±0.5 for these anomalies.

Averaging measurements, whether of the same thing or of different things, simply cannot increase the resolution of the measuring device.

Jim Gorman
Reply to  James Schrumpf
February 10, 2022 1:17 pm

Part of the problem is that it is not a uniform distribution, especially for human readings. It only becomes uniform as one reaches the limit of resolution. For example, a reading of 96 1/4 easily becomes 96, and a reading of 96.75 easily becomes 97. Only around 96.5 does the reading really become uniform.

The uncertainty arises when 96 is recorded. You don’t know the next digit and can never know it, i.e., it is uncertain. Does that make the uncertainty a uniform distribution? Not really.

A normal distribution, which is what is assumed when measuring the same thing multiple times with the same instrument, gives you an SD = 0.5 for an interval of ±0.5.

Jim Gorman
Reply to  James Schrumpf
February 10, 2022 12:23 pm

You are right, it isn’t wrong. Gave you a plus to cancel the negative.

I like your use of matching the uncertainty decimal place to the resolution of the measurement.

Jim Gorman