New WUWT Global Temperature Feature: Anomaly vs. Real-World Temperature

One of the most frightening aspects of global warming, aka “climate change,” is the set of graphs produced from temperature data for public consumption and trumpeted by an unquestioning and compliant media. When it comes to measuring climate, the temperature differences over the last century are so small that, to be visible at all, the data must be highly magnified using the temperature anomaly method.

The most often cited global temperature anomaly graph is from the NASA Goddard Institute for Space Studies (GISS), showing yearly average temperatures since 1880, as seen in Figure 1 below.

Figure 1: Land-ocean temperature index, 1880 to present, with base period 1951-1980. The solid black line is the global annual mean and the solid red line is the five-year lowess smooth. The gray shading represents the total (LSAT and SST) annual uncertainty at a 95% confidence interval and is available for download. [More information on the updated uncertainty model can be found here: Lenssen et al. (2019).] Source: https://data.giss.nasa.gov/gistemp/graphs_v4/

To the untrained and uninitiated (i.e. the general public) it looks like Earth’s temperature is on a trajectory for a hot and terrible future.

Sometimes media outlets, such as the daily-doom newspaper known as The Guardian, will take that data and make their own graphs, making them look even steeper and scarier, such as their highly statistically amplified graph from their 2019 article, seen in Figure 2.

Figure 2. Headline and graphical depiction of global temperature by The Guardian in 2019. Note the graph was amplified by using a different baseline for anomaly comparison. NASA GISS uses 1951-1980 as the baseline, whereas The Guardian used 1850-1900, amplifying the positive anomalies in the near present, because 1850-1900 was a cooler period in Earth’s temperature history.

The article was written by the ever-alarmed and always unreliable Damian Carrington; it is no wonder some children think they have no future due to “climate change”.

But in the real world, people don’t experience climate as yearly or monthly temperature anomalies; they experience weather on a day-to-day basis, where one day may be abnormally warm and another might be abnormally cold. Sometimes new records are set on such days. This is normal, but such records are often portrayed by the media as evidence of “climate change” when in fact they are nothing more than natural variations of Earth’s atmosphere and weather systems. In fact, it is doubtful humans would even notice the mild warming we’ve had in the last century at all, given that the human body often can’t tell the difference between 57°F and 58°F in any given moment, much less over the long term.

Essentially, what we know as climate change is nothing more than a man-made statistical construct. You can’t go outside and hold an instrument in the air and say “I’m measuring the climate.” Climate is always about averages of temperature over time. It’s a spreadsheet of data where daily high and low temperatures are turned into monthly averages, and monthly averages are turned into yearly averages, and yearly averages are turned into graphs spanning a century.
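As a concrete illustration of that aggregation chain, here is a minimal sketch (not any agency’s actual pipeline) that turns synthetic daily highs and lows into monthly averages and a yearly average. All station values below are made up for demonstration purposes only.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("2022-01-01", "2022-12-31", freq="D")

# Hypothetical station data: a seasonal cycle plus day-to-day weather noise (deg F).
seasonal = 57.2 + 22.0 * np.sin(2 * np.pi * (days.dayofyear - 100) / 365.25)
tmax = seasonal + 8 + rng.normal(0, 4, len(days))
tmin = seasonal - 8 + rng.normal(0, 4, len(days))

daily = pd.DataFrame({"tmax": tmax, "tmin": tmin}, index=days)
daily["tmean"] = (daily["tmax"] + daily["tmin"]) / 2   # conventional daily "average"

monthly = daily["tmean"].resample("MS").mean()          # daily -> monthly averages
annual = monthly.mean()                                 # monthly -> one yearly number

print(monthly.round(1))
print(f"Annual mean: {annual:.1f} F")
```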

But such graphs, used in press releases to the media and broadcast to the public, don’t really tell the story of the data honestly. They omit a huge amount of background information, such as the fact that in the last 40 years we’ve had a series of El Niño weather events that have warmed the Earth; for example, in 1983, 1998, and 2016. The two biggest El Niño events are shown coinciding with temperature increases in Figure 3.

Figure 3. GISTEMP global temperature data, as a 12-month running average (anomalies relative to the first 30 years). Source: RealClimate.org

These graphs also don’t tell you that many of the global surface temperature measurements are heavily polluted with Urban Heat Island (UHI) and local heat-sink siting effects that bias temperatures upward, such as the wholesale corruption of climate monitoring stations I documented in 2022, where 96% of the stations surveyed don’t even meet published standards for accurate climate observations. In essence – garbage in, garbage out.

But, all that aside, the main issue is how the data is portrayed in the media, such as The Guardian example shown in Figure 2.

To that end, I have prepared a new regular feature on WUWT that will appear on the right sidebar, alongside the long-running monthly temperature graphs from the state-of-the-art (not polluted or corrupted) NOAA-operated U.S. Climate Reference Network and the University of Alabama in Huntsville (UAH) satellite-derived global temperature record.

Screen Capture of WUWT showing new feature on the sidebar.

I’m utilizing the NASA Goddard Institute for Space Studies GISTEMP global dataset. The difference is simply this – I show both the absolute (measured) and the anomaly (statistically magnified) versions of the global temperature. This is accomplished by reversing the procedure outlined in UCAR’s How to Measure Global Average Temperature in Five Easy Steps.

In this calculation, the “normal” temperature of the Earth is assumed to be 57.2°F, and that value is simply added to the anomaly temperature reported by NASA GISS to obtain the absolute temperature. The basis of this number comes from NASA GISS itself, from their FAQ page as seen in August 2016 and captured by the Wayback Machine.
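For anyone who wants to reproduce the arithmetic, here is a minimal sketch of the reconstruction described above. One caveat is my own assumption: GISTEMP publishes its anomalies in °C, and converting a temperature *difference* to °F uses only the 1.8 scale factor (no +32 offset) before adding the 57.2°F “normal.”

```python
BASELINE_F = 57.2  # the "normal" global temperature quoted from the GISS FAQ

def anomaly_c_to_absolute_f(anomaly_c: float) -> float:
    """Turn a GISTEMP anomaly (deg C) into an estimated absolute temperature (deg F)."""
    anomaly_f = anomaly_c * 1.8        # scale the difference; no 32 F offset for differences
    return BASELINE_F + anomaly_f

print(anomaly_c_to_absolute_f(1.0))    # a +1.0 C anomaly year -> 59.0 F absolute
```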

Figure 4. Screen capture of the NASA GISS FAQ’s page from August 24, 2016 – Source: Wayback Machine

Of course GISS removed it from that page, as seen today, because they don’t want people doing exactly what I’m doing now – providing the absolute temperature data in a non-scary graphical presentation, done at the scale at which humans experience Earth’s temperature where they live. For that I’ve chosen a temperature range of -20°F to +120°F, which is representative of winter low temperatures near the Arctic Circle and summer high temperatures in many populated deserts, such as in the Middle East.

Figure 5: NASA GISTEMP Data, plotted as a temperature anomaly, using a “normal” temperature of 57.2°F
Figure 6: NASA GISTEMP Data, plotted as absolute temperature, using a “normal” temperature of 57.2°F

Can you tell which graph visually represents a “climate crisis” and which one doesn’t?

Feel free to check my work – the Excel spreadsheet and the calculations are here:

To create the graphs above in Figures 5 and 6, I used the data from the Excel Sheet imported into the graphing program DPlot.

Note: some typos in this article were fixed and some clarifications added within about 30 minutes of publication. -Anthony

Nick Stokes
Reply to  Anthony Watts
March 12, 2023 2:34 pm

Well, since you asked 🙂 what is the difference here? Seems to me it is the same plot with 57.2 added to the numbers on the y-axis.

Nick Stokes
Reply to  Anthony Watts
March 12, 2023 4:04 pm

Nope. But you can do this to anything. Here is the US national debt:

[image: chart of the US national debt]

See? Nothing to worry about.

Rud Istvan
Reply to  Nick Stokes
March 12, 2023 4:38 pm

Nick, I have a degree in economics. We were taught to scale meaningfully. Your posted scale goes from 0 to 500 trillion, when the federal debt is now on the order of 30 trillion. NOT a meaningful scale.
AW did not pull the same obvious trick you just did.

Nick Stokes
Reply to  Rud Istvan
March 12, 2023 4:47 pm

“We were taught to scale meaningfully.”
Exactly. You were taught to use a scale that was most informative. One that gives the best resolution while still getting all the data on the page. That is what GISS and everyone else does.

The AW trick was to relate it to what we “experience”. But famously, we don’t experience global average temperature. During the last glaciation, the global temperature dropped to about 45F. People have “experienced” much colder than that. But their experience of the glaciation was much more severe than that number suggests.

Editor
Reply to  Nick Stokes
March 12, 2023 6:18 pm

Nick, repeating my comment to Anthony: So, we should use a scale something like the OSHA recommended temperature range for offices in the US:

[image: office_temp_scale.png – OSHA recommended office temperature range]
Nick Stokes
Reply to  Kip Hansen
March 12, 2023 7:00 pm

Kip,
The thing is that humans do not feel the global average temperature. As AW says, you can’t go out and measure it with a thermometer. What counts is what fluctuations signify. And as I said, a glaciation has to be a significant fluctuation, with bad implications for NY real estate. But the average was 45F, easily within our normal experience.

It’s a bit like the doctor finding your temp is 104F, and you say, no worries, it is within the OSHA range.

Rick C
Reply to  Nick Stokes
March 12, 2023 8:52 pm

Nick: Humans don’t feel temperature anomalies of less than 2C either. In fact I defy anyone to determine the temperature, indoors or out, to an accuracy of +/- 3C by how it feels. We need thermometers to be able to tell what the temperature really is.

But we all experience a temperature change of around 10C on a daily basis and most of us experience a range of about 50C over the course of a year. Without a vast network of weather stations and thousands of people recording and processing data we’d have no idea if there was any trend in long-term temperatures. We can’t feel it and we can’t observe any effects on the environment that are outside of our sense of normal weather variability. A 1.5 to 2 C warming over a century – please! – not scary to me or anyone else except followers of the Catastrophic Climate Change Cult (CCCC).

Jim Gorman
Reply to  Rick C
March 14, 2023 4:41 am

This is exactly the position I have reached. I have seen too many local temperature graphs with little to no warming to think that we are truly seeing these kinds of increases worldwide. One only has to read all the headlines that trumpet “so and so place is warming faster than the global average” to recognize the propaganda that global anomalies produce.

My preliminary looks, using the NIST TN 1900, Example 2 procedure for examining Tmax and Tmin, lead me to believe that local values for these, analyzed separately, are the only valid way to decide whether climate alarmism is correct or not. My guess is that “statistical” trickery is being used, willfully or not, to see things that aren’t there.

My guess is that this was originally done by unaware climate scientists that just assumed simple arithmetic averages would show something valid, and then proceeded to average averages over and over and saw what they wanted.
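For readers unfamiliar with it, the TN 1900 Example 2 approach mentioned above, as I understand it, treats a month of daily readings as a sample and reports the mean together with a Student-t expanded uncertainty. A rough sketch of that calculation, using hypothetical Tmax values purely for illustration:

```python
import numpy as np
from scipy import stats

# Hypothetical daily Tmax readings for one month (deg C) -- illustration only.
tmax = np.array([22.1, 23.4, 21.8, 24.0, 22.9, 23.7, 21.5, 22.6, 24.3, 23.1, 22.0,
                 23.8, 22.4, 21.9, 23.3, 24.1, 22.7, 23.0, 21.6, 22.8, 23.5, 24.2])

n = tmax.size
mean = tmax.mean()
s = tmax.std(ddof=1)              # experimental standard deviation of the readings
u = s / np.sqrt(n)                # standard uncertainty of the monthly mean
k = stats.t.ppf(0.975, df=n - 1)  # ~95 % coverage factor from Student's t

print(f"monthly mean Tmax = {mean:.2f} C, expanded uncertainty (95%) = +/- {k*u:.2f} C")
```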

Reply to  Jim Gorman
March 14, 2023 7:16 pm

you haven’t reviewed 40,000 local graphs. i have
My guess is that this was originally done by unaware climate scientists that just assumed simple arithmetic averages would show something valid, and then proceeded to average averages over and over and saw what they wanted.

bad guess!!!! you don’t even know what an average is
or how to estimate one.
go ahead show your math

Reply to  Rick C
March 14, 2023 7:13 pm

 In fact I defy anyone to determine the temperature, indoors or out, to an accuracy of +/- 3C by how it feels. We need thermometers to be able to tell what the temperature really is.

i can definitely tell the difference between 26 and 27C.

so can plants and animals

animals don’t need thermometers
https://www.nationalgeographic.com/science/article/climate-change-species-migration-disease

neither do plants

https://www.jstor.org/stable/27856854

Editor
Reply to  Nick Stokes
March 13, 2023 7:02 am

Nick ==> I quite agree, the scale must be very pragmatic — have a very specific practical basis. The practical basis for human body core temperature is based on long-term medical study (despite the exact value of “normal” still being controversial) — the range between death from “too low” and death from “too hot” — low is below 80°F (organ systems beginning to fail), high is over 108°F…want to use that scale? That’s a 28°F range.

Office temperature designated by OSHA has a 5°C comfort range — 68°F to 77°F. That’s 9°F. Want to use that range?

The range NOT to use is the auto-scale range of various stats and spreadsheet programs. Such scales have NO pragmatic reality — only a numerical relationship of highest-lowest spread.

Bellman
Reply to  Kip Hansen
March 13, 2023 10:06 am

Office temperature designated by OSHA has a 5°C comfort range — 68°F to 77°F. That’s 9°F. Want to use that range?

That’s 20 – 25°C.

By that logic every year on earth has been well below the minimum acceptable office temperature of 20°C.

Looking through NOAA’s time series for the states, the only location on the mainland of the USA which is habitable by that standard is Florida, which usually has annual mean temperatures above 20°C.

Tim Gorman
Reply to  Bellman
March 13, 2023 11:16 am

Did you *really* think about this before posting it?

Bellman
Reply to  Tim Gorman
March 13, 2023 11:43 am

No, it was entirely randomly generated. What part did you disagree with?

Tim Gorman
Reply to  Bellman
March 13, 2023 12:57 pm

Do you wear clothes outside? If you do then why do you do so?

Bellman
Reply to  Tim Gorman
March 13, 2023 2:45 pm

If necessary. The same in an office. Do you have a point?

KevinM
Reply to  Bellman
March 13, 2023 3:23 pm

Florida was made habitable by the air conditioner.

Matthew Bergin
Reply to  KevinM
March 15, 2023 11:03 am

Think again. People lived in Florida long before air-conditioning was ever conceived.

KevinM
Reply to  Matthew Bergin
March 15, 2023 3:55 pm

The area was inhabited when Spaniards first arrived, so no AC. I don’t think (aka opinion) FLA would be thickly settled by USA citizens without nearly universal AC. Google search – yes the reference period contains USA’s baby boom generation:
In 1950, the population here was 2.7 million. By 1960, with some air conditioning, the population increased to 4.9 million.
AC Market penetration is about 85%, meaning that if you can afford AC in FLA today, then you probably have it. TG seemed to be trying to get to the point that humans can adapt to a wide range of climates by using technology. It seemed to be taking a long time to get there.

Matthew Bergin
Reply to  KevinM
March 15, 2023 4:03 pm

The funny thing is Florida’s temp is pretty close to the point where humans can live naked. Probably accounts for all the swimsuits😉

Tim Gorman
Reply to  Bellman
March 13, 2023 3:39 pm

You didn’t answer as to why you wear clothes.

That lies at the base of the whole point.

Bellman
Reply to  Tim Gorman
March 13, 2023 4:26 pm

I find your obsession of what I’m wearing a bit disturbing.

If you have a point, state it. Otherwise I’m just going to assume you are obsessed with the thought of me naked.

Tim Gorman
Reply to  Bellman
March 14, 2023 5:43 am

Your assumption says more about *you* than it does about me!

You are just avoiding answering because you know what the answer has to be.

Bellman
Reply to  Tim Gorman
March 14, 2023 7:56 am

Still no idea what point you are trying to make, and if I have to guess you’ll just say I’m missing the point. Your inability to just say what point you are trying to make speaks volumes. I suspect if you did just come out with it, it would be something completely inane, so much better to rely on these cryptic questions.

Tim Gorman
Reply to  Bellman
March 14, 2023 7:57 am

Malarky. You know what the point is, you just don’t want to admit it. How many flannel shirts do people living in FL keep in their closet?

Bellman
Reply to  Tim Gorman
March 14, 2023 1:51 pm

Yet you still won’t state your argument. Just keep asking more meaningless questions.

I can only assume your asinine argument is that an ice-age wouldn’t be a bad thing because you can just put on more clothes.

Tim Gorman
Reply to  Bellman
March 14, 2023 4:10 pm

You won’t answer the question. Why?

And if it warms then take clothes off.

Humans survive above the Arctic Circle and have for thousands of years. Humans survive in the Middle East and have for thousands of years.

How much wheat do you suppose the Inuit grow?

Bellman
Reply to  Tim Gorman
March 14, 2023 5:13 pm

Because they are stupid questions that have no relevance to the point.

You ask if I wear clothes. I said I usually did.
You ask why I wear clothes. That’s such a trite question it isn’t worth answering. But if you insist I answer it, because people tend to object if I don’t. Because they offer some protection. Because they keep me warm on a really cold day. Because they keep me dry when it’s raining. Because I need pockets to keep things in. Is that enough answers for you, or do you want to discuss the colour of my underwear next?

You ask how many shirts someone living in Florida has. I have no idea. Do you have the statistics? Would you accept an average, or would you complain that, say, nobody actually has 2.3 shirts, or whatever?

And if it warms then take clothes off.

There’s an obvious limit to how far that will get you. You can’t get more naked than naked.

Humans survive above the Arctic Circle and have for thousands of years.
Humans survive in the Middle East and have for thousands of years.

Humans survived for thousands of years without electricity.
Humans survived for thousands of years without cars.
Humans survived for thousands of years without antibiotics.
Humans survived for thousands of years without fossil fuels.

How much wheat do you suppose the Inuit grow?

Are there 8 billion Inuit?

Tim Gorman
Reply to  Bellman
March 15, 2023 4:42 am

“You ask why I wear clothes. That’s such a trite question it isn’t worth answering. But if you insist I answer it, because people tend to object if I don’t. Because they offer some protection. Because they keep me warm on a really cold day. Because they keep me dry when it’s raining.”

In other words you ADAPT to your climate. There isn’t any reason for everyone to live in FL because it has an average temp above 20C.

“You ask how many shirts someone living in Florida has.”

That is *not* what I asked. Your reading skills (or lack thereof) are showing again.

“There’s an obvious limit to how far that will get you. You can’t get more naked than naked.”

Your lack of knowledge about the real world is showing again. What do Bedouins wear?

“Are there 8 billion Inuit?”

Red herring. If the Inuit can survive for thousands of years in their climate why couldn’t people in SD do the same?

Bellman
Reply to  Tim Gorman
March 15, 2023 10:30 am

In other words you ADAPT to your climate.

You keep arguing with shadows. I’ve already said people can adapt to their climate. Wearing clothes is a very limited way of doing it.

There isn’t any reason for everyone to live in FL because it has an average temp above 20C.

The point I was trying to make is that there is a big difference between average global annual temperatures, or in this case just annual averages, and what may be considered necessary at a single point in time.

I assume Texas or New Mexico are considered warm states, yet the annual average temperature is considered too cold for an office. You think that just means everyone in Texas should wrap up warm, rather than consider that you can’t compare the average with everyday expectations.

That is *not* what I asked. Your reading skills (or lack thereof) are showing again.

You ask facile questions, then object when I don’t treat them seriously. What you actually asked was

How many flannel shirts do people living in FL keep in their closet?

I simplified that to

You ask how many shirts someone living in Florida has.

And you attack me because I omitted the words “flannel” and “closet”. And you wonder why I decline to always answer your dumb questions.

Your lack of knowledge about the real world is showing again. What do Bedouins wear?

I was responding to your, and only your, statement:

And if it warms then take clothes off.

Again, I am not responding to your every point as if it was worth a detailed response.

Red herring. If the Inuit can survive for thousands of years in their climate why couldn’t people in SD do the same?.

No worries then. Everyone in SD can give up eating bread and survive by hunting seals.

Tim Gorman
Reply to  Bellman
March 15, 2023 1:19 pm

“The point I was trying to make is that there is a big difference between average global annual temperatures, or in this case just annual averages, and what may be considered necessary at a single point in time.”

Climate is *NOT* the average, not the annual average, not the monthly average, not even the daily average!

Climate is the *entire* temperature profile. Two different locations with the same monthly and annual average temperature can have different climates.

You think that just means everyone in Texas should wrap up warm, rather than consider that you can’t compare the average with everyday expectations.

So now you are back to being a psychic again? You *know* what I think?

You just proved my entire point. You *can’t* tell daily expectations from the average. The average tells you NOTHING about the variance.

If the average doesn’t tell you anything then what good is it?

“And you attack me because I omitted the words “flannel” and “closet”. And you wonder why I decline to always answer your dumb questions.”

ROFL! Why did you omit the word “flannel”? The question wasn’t dumb, your answer was because you deliberately misquoted me!

“No worries then. Everyone in SD can give up eating bread and survive by hunting seals.”

Yep. Or they can move somewhere else and adapt to that location! Why don’t the Inuit move?

Bellman
Reply to  Tim Gorman
March 15, 2023 4:25 pm

Climate is *NOT* the average, not the annual average, not the monthly average, not even the daily average!

Define it how you want. I didn’t say “climate” I said the global average temperature. A change in that of 2°C will have big effects on the planet.

Two different locations with the same monthly and annual average temperature can have different climates.

Indeed. As always you keep saying things like this as if it has any relevance to what we were discussing.

So now you are back to being a psychic again? You *know* what I think?

Make your fracking mind up. First you refuse to explain what point you are trying to make, insisting I must know what you are getting at. Then when I try to guess at what your argument is, you insist that it’s impossible for me to actually know it.

You just proved my entire point. You *can’t* tell daily expectations from the average.

And there you go, winning over another strawman.

ROFL! Why did you omit the word “flannel”? The question wasn’t dumb, your answer was because you deliberately misquoted me!

And you are back to assuming I can fathom out what goes on inside your head. I don’t know how many shirts an average person has, let alone what they are made of.

At the risk of being accused of *knowing* what you think again: I’m guessing your point is that “flannel” shirts are considered warm and so are less likely to be worn in a hot part of the world. And that in some way you think this is a telling point with regard to office temperatures, compared with global averages.

Tim Gorman
Reply to  Bellman
March 16, 2023 9:52 am

Define it how you want. I didn’t say “climate” I said the global average temperature. A change in that of 2°C will have big effects on the planet.”

How do you know that from the global average temperature? Do you know the variance that goes along with that average? Do you know where this is going to come from? The Arctic? Central Africa? Australia? Are some areas going to go up and some go down? Which ones will go up and which ones will go down?

If you don’t know that then how can you judge what the “big effects” will be?

tg: Two different locations with the same monthly and annual average temperature can have different climates.

“Indeed. As always you keep saying things like this as if it has any relevance to what we were discussing.”

It has *everything* to do with it! Jeeesh!

The average annual temp in Miami is 83F and in Las Vegas it is 80F. Hardly any different in annual average. Yet vastly different climates due to humidity (i.e. to enthalpy as opposed to temperature) and temperature variance.

The average summer high is 89F in Miami. The average summer high in Las Vegas is 104F. The average winter low is 60F in Miami. The average winter low in Las Vegas is 37F. Vastly different variances, 29F for Miami and 67F for Las Vegas. BUT BOTH HAVE ALMOST THE SAME AVERAGE ANNUAL TEMP!
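(For what it’s worth, the general point that an annual mean by itself says nothing about the spread is easy to demonstrate with synthetic data; the two “locations” below are invented for illustration, not actual Miami or Las Vegas figures.)

```python
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(365)

# Two made-up locations sharing the same annual mean but very different swings (deg F).
mild  = 74 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, 365)
harsh = 74 + 30 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 6, 365)

for name, t in [("mild", mild), ("harsh", harsh)]:
    print(f"{name}: mean={t.mean():.1f}F  min={t.min():.1f}F  max={t.max():.1f}F  std={t.std(ddof=1):.1f}F")
```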

You just epitomize the typical climate alarmist! Belief that 2C increase in the GAT will have big effects yet just dismiss the variance of the temperatures that are combined to calculate that anomaly.

It’s religion. Not science.

And there you go, winning over another strawman.”

ROFL!! You can’t tell daily expectations from the monthly or annual average and you consider that to be a strawman argument! And yet you think average annual temperatures define the climate! Unfreakingbelievable.

“And you are back to assuming I can fathom out what goes on inside your head.”

The issue is that you purposefully misquoted me. And you are still doing it!

I’m guessing your point is that “flannel” shirts are considered warm and so are less likely to be worn in a hot part of the world.”

Have you lived every day of your life in a basement?

Bellman
Reply to  Tim Gorman
March 17, 2023 6:28 am

How do you know that from the global average temperature? Do you know the variance that goes along with that average? Do you know where this is going to come from? The Arctic? Central Africa? Australia? Are some areas going to go up and some go down? Which ones will go up and which ones will go down?

I couldn’t say. But I am confident that unless all your ducks fall in just the right places, a drop of 2°C in global annual average temperatures over a period of time will have very big effects.

Things can change even if the average stays the same. The converse is not true.

Tim Gorman
Reply to  Bellman
March 17, 2023 10:53 am

I couldn’t say.”

If you knew the variances associated with all the averages used to calculate other averages you *could* say. But you aren’t interested in knowing that, are you?

But I am confident that unless all your ducks fall in just the right places, a drop of 2°C in global annual average temperatures over a period of time will have very big effects.”

You couldn’t say but you are *sure*. Cognitive dissonance at its finest.

What if the drop only occurs at the equator?

“Things can change even if the average stays the same. The converse is not true.”

But WHAT “things” change? If you only know the average then you have no idea of what things are changing. Meaning you have to guess – a subjective process at best, producing nothing but confirmation bias.

BTW, you are confusing average with median. If you have skewed distributions the average can change while the median stays the same. That’s the problem with using (Tmax+Tmin)/2 as an average – it assumes a Gaussian distribution of temperatures. But in the general case its only a median. But then, EVERYTHING is Gaussian, isn’t it?

Bellman
Reply to  Tim Gorman
March 18, 2023 7:44 am

“If you knew the variances associated with all the averages used to calculate other averages you *could* say.”

No you couldn’t. Either the variances stay the same across each location or they change as the temperatures change. But in either case that won’t allow you to predict by how much any location changes. Nor will it allow you to predict what other climatic changes happen as a result of the drop in average temperatures.

You couldn’t say but you are *sure*. Cognitive dissonance at its finest.

You ignored the clause starting “unless”.

What if the drop only occurs at the equator?

Then it will have to be a pretty big drop. If global temperatures fall by 2.5°C, and say 20% of the globe centered on the equator is the only place to change, that means it’s cooled by 12.5°C. With land temperatures probably changing much more.

It’s also very improbable, given that cooling is more likely to affect the higher latitudes. The cold of the 1690s was mostly felt in northern Europe and the US, not at the equator.

If you only know the average then you have no idea of what things are changing.

That’s my point. You cannot compare a change in average global temperature, with the experience of working in an office. You cannot assume that a 2.5°C drop in global temperature is OK because you could live with a drop of 2.5°C in an office.

Meaning you have to guess – a subjective process at best, producing nothing but confirmation bias.

Says someone who guesses that all the cooling might just happen at the equator, so everyone’s OK. The point is, if you don’t know what changes will happen with such a drop, then it should worry you.

Tim Gorman
Reply to  Bellman
March 18, 2023 10:47 am

No you couldn’t. Either the variances stay the same across each location or they change as the temperatures change.”

Nope, The variance of the daytime temp is different from the nighttime temp. So they don’t stay the same even at the same location!

But in either case that won’t allow you to predict by how much any location changes.”

Variance is a measure of the uncertainty of a distribution. The wider the variance, the more uncertain the expected value is. So if you know the variance you *can* predict by how much a location will change.

I thought you were a statistician?

“Then it will have to be a pretty big drop.”

How do you KNOW that isn’t the case? The equator is where the most solar insolation occurs. That is where you would expect to see the biggest temperature change.

“It’s also very improbable, given that cooling is more likely to affect the higher latitudes. The cold of the 1690s was mostly felt in northern Europe and the US, not at the equator.”

And was that because there was less CO2 in the atmosphere in the higher latitudes?

You cannot assume that a 2.5°C drop in global temperature is OK because you could live with a drop of 2.5°C in an office.”

Why *can’t* you compare them? Office temps are supposed to be set in the range where the human race is supposed to survive.

You can’t say that a 2.5C drop in global temp will be catastrophic because the metric doesn’t allow you to know the inputs to that average!

Bellman
Reply to  Tim Gorman
March 18, 2023 12:32 pm

Nope, The variance of the daytime temp is different from the nighttime temp.”

I swear sometimes I really think you must be a not very good machine learning chat bot. You just keep repeating things like this regardless of their relevance to the discussion.

Variance is a measure of the uncertainty of a distribution. The wider the variance, the more uncertain the expected value is. So if you know the variance you *can* predict by how much a location will change.

What nonsense. How does knowing the variance of temperatures in the current climate at a particular location allow you to predict how much it will change if the global average changes?

I thought you were a statistician?

Then you’ve forgotten all the times I’ve pointed out I’m not a statistician.

How do you KNOW that isn’t the case?

I don’t. That’s why I used the word “if”.

You can’t say that a 2.5C drop in global temp will be catastrophic because the metric doesn’t allow you to know the inputs to that average!

Take it up with everyone who comes on here claiming it will be a disaster if we return to mini ice age conditions. Take it up with Monckton who insists a 1 degree rise in temperature can only be for the good.

Tim Gorman
Reply to  Bellman
March 18, 2023 2:18 pm

I swear sometimes I really think you must be a not very good machine learning chat bot. You just keep repeating things like this regardless of their relevance to the discussion.”

The variance in average daily temps is 89-61F for Miami and 104-37F for Las Vegas. Yet the average monthly high temp is almost the same for both.

Only you would think that variance in temperature is not relevant to discussions having to do with climate.

“What nonsense. How does knowing the variance of temperatures in the current climate at a particular location allow you to predict how much it will change if the global average changes?”

Look at the stats for Miami and Las Vegas. The temp variance for Miami is from warm to slightly below warm. For Las Vegas it is from very hot to very cold.

No one is going to notice a 2.5C change in the temperatures at Las Vegas. They *might* notice it in Miami but its doubtful.

Only you would think that variance in temperature tells you nothing about climate change.

“Then you’ve forgotten all the times I’ve pointed out I’m not a statistician.”

You aren’t a physical scientist either. So what are you doing on here pretending to be both? Just being a troll?

Minimum temps going up 2.5C will *not* boil the oceans, it won’t melt ice where the tmax temps are more than that below freezing, and it won’t kill off the food supply.

Yet you can’t tell from the global average temperature what is happening. That’s why the variance associated with the GAT is so important. And it’s why climate science refuses to consider it, it would be so wide no one would worry about the average!

Bellman
Reply to  Tim Gorman
March 18, 2023 5:34 pm

Only you would think that variance in temperature tells you nothing about climate change.

Stop twisting.

The question was – if the world cools by 2.5°C, can you predict which parts of the world will be cooling the most, which cooling by the average, which not cooling at all, possibly which will actually warm? I say, knowing the current variance at any location will not allow you to do that. You just spout your usual gibberish.

Just being a troll?

Says someone who admitted to lying just to get people to hoist themselves by their own petard.

Minimum temps going up 2.5C will *not* boil the oceans

*** STRAW MAN ALERT ***

We were talking about cooling not warming, and nobody – certainly not me – thinks a rise of 2.5°C will “boil oceans”.

it won’t melt ice where the tmax temps are more than that below freezing, and it won’t kill off the food supply.

Depends on where the biggest rises are.

Remember, you can’t tell that just by knowing what the average rise is.

That’s why the variance associated with the GAT is so important. And it’s why climate science refuses to consider it, it would be so wide no one would worry about the average!

You realize that if the average changes and the variance doesn’t, that the whole range of temperatures will shift? I think you ignore how much has to change to get a small change in the overall average. Even in a single location people notice the difference in a year with a just a few degrees difference in the average.

Your variances are the seasonal variance. Nobody expects a 2.5°C change to mean summers are colder than winters used to be. But it means that on average every day of winter has to be 2.5°C colder, every day of summer 2.5°C colder. Or you have prolonged periods which are much colder than that interspersed with more average temperatures. Either way, you will notice it.

Tim Gorman
Reply to  Bellman
March 19, 2023 6:13 am

I say, knowing the current variance at any location will not allow you to do that. You just spout your usual gibberish.”

It’s no wonder you think the GAT with no associated variance has meaning for climate change!

“We were talking about cooling not warming, and nobody – certainly not me – thinks a rise of 2.5°C will “boil oceans”.”

Does that mean you think a drop of 2.5C will freeze the oceans?

Depends on where the biggest rises are.”

That’s the whole point! And you are just now figuring that out?

“You realize that if the average changes and the variance doesn’t, that the whole range of temperatures will shift?”

Really? How does the range of temperatures shift without the variance shifting? Variance is based on (X_i – average). If the average changes then the variance changes. If the range of temperatures changes, i.e. X_i values, then the variance changes.

Did you think about this before you posted it?

Your variances are the seasonal variance.”

My base variance is the DAILY temperatures. Those need to be propagated through the whole process of average -> average -> average in order to get to a GAT average.



Bellman
Reply to  Tim Gorman
March 19, 2023 2:34 pm

Does that mean you think a drop of 2.5C will freeze the oceans?

Idiotic question.

How does the range of temperatures shift without the variance shifting?

The mean can change, the variance remains the same. You should understand this. You keep going on about Gaussian distributions.

Variance is based on (X_i – average). If the average changes then the variance changes. If the range of temperatures changes, i.e. X_i values, then the variance changes.

Seriously? You don’t get this?

If every value in the distribution changes by the same amount, say everything cools by 2°C, then X_i becomes X_i – 2 and the average becomes average – 2. So (X_i – average) becomes (X_i – 2) – (average – 2) = (X_i – average), i.e. it is unchanged.

Of course, it’s entirely possible that the distribution doesn’t change at the same rate, in which case you get a change in the variance, and / or it becomes more or less skewed.
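A quick numerical check of that claim (arbitrary illustrative values):

```python
import numpy as np

x = np.array([3.0, 7.0, 12.0, 20.0, 25.0])   # arbitrary values
shifted = x - 2.0                             # "everything cools by 2"

print(x.mean(), shifted.mean())               # means differ by exactly 2
print(x.var(ddof=1), shifted.var(ddof=1))     # variances are identical
```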

My base variance is the DAILY temperatures.

I assume you mean daily temperatures throughout the year. The range you quoted was “The variance in average daily temps is 89-61F for Miami and 104-37F for Las Vegas.” And I assume you don’t actually mean variance.

Those need to be propagated through the whole process of average -> average -> average in order to get to a GAT average.

Why do you want to propagate the variance when taking an average? The temperature on a given day is the temperature on that day. Why would knowing the range of all temperatures during the year change that?

Bellman
Reply to  Tim Gorman
March 18, 2023 8:12 am

BTW, you are confusing average with median.

Not this nonsense again. We went through it all a couple of weeks ago. Have you forgotten already.

If you have skewed distributions the average can change while the median stays the same.

Which has nothing to do with anything we’ve been talking about.

That’s the problem with using (Tmax+Tmin)/2 as an average – it assumes a Gaussian distribution of temperatures.

It does not assume anything of the sort. The distribution of temperatures during a day is almost never going to be Gaussian. Assuming anything like a sine wave means that the distribution will be U-shaped, with most of the temperatures being closer to the max or minimum than the average.

If you want to think of the daily mean temperature based on the average of max and min as representing the actual mean temperature, then you would need the distribution to be symmetrical. It does not need to be Gaussian. But for most cases I don’t care if it’s not the exact mean; it’s just a convenient way of getting a representative daily temperature when that is the only data you have available.
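A small sketch of that point, using a pure sine-wave “day” with made-up numbers: the sampled temperatures pile up near the max and the min (a U-shaped histogram), yet (Tmax+Tmin)/2 matches the true mean because the curve is symmetric.

```python
import numpy as np

t = np.linspace(0, 24, 24 * 60, endpoint=False)        # minutes through one day
temp = 15 + 8 * np.sin(2 * np.pi * t / 24)              # symmetric daily cycle (deg C)

true_mean = temp.mean()
midrange = (temp.max() + temp.min()) / 2                 # the (Tmax+Tmin)/2 convention

counts, _ = np.histogram(temp, bins=10)
print("true mean:", round(true_mean, 2), " (Tmax+Tmin)/2:", round(midrange, 2))
print("histogram counts (note the heavy end bins):", counts)
```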

But in the general case its only a median.

As I said last time, if the distribution is skewed it’s more likely that (Tmax+Tmin)/2 will be closer to the mean than it will be to the median temperature.

But then, EVERYTHING is Gaussian, isn’t it?

I’ve really no idea why you would think that. I’ve given you numerous examples of non-Gaussian distributions, but for some reason you think that all distributions must be Gaussian.

Have you learnt yet what a Gaussian distribution actually is? I remember a little while ago you seemed to think that any symmetric distribution was Gaussian.

Tim Gorman
Reply to  Bellman
March 18, 2023 11:44 am

Not this nonsense again. We went through it all a couple of weeks ago. Have you forgotten already.”

Yes, we did. And you still can’t accept that (Tmax+Tmin)/2 is ALWAYS a median value. It is only an average value if you have a Gaussian distribution. Nor can you accept that the median value of a combination of a sine wave distribution and an exponential decay distribution is *NOT* an average, it is a median.

“Which has nothing to do with anything we’ve been talking about.”

Of course it does. *YOU* claimed the average couldn’t change. Now you are trying to ignore you said that.

“It does not assume anything of the sort. The distribution of temperatures during a day is almost never going to be Gaussian.”

Then the average is *NEVER* going to be the median.

“Assuming anything like a sine wave means that the distribution will be U-shaped, with most of the temperatures being closer to the max or minimum than the average.”

But the nighttime temperature is an exponential decay where the mean is 1/λ and the median is ln(2)/λ. They are not the same so it is a skewed distribution. When you combine them the average will *NOT* equal the median.
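For the exponential distribution itself, the mean and median quoted above do check out numerically (whether nighttime cooling actually follows an exponential distribution is a separate question):

```python
import numpy as np

lam = 0.5
rng = np.random.default_rng(2)
samples = rng.exponential(scale=1 / lam, size=1_000_000)

print("sample mean:  ", samples.mean(),     "  expected 1/lambda:    ", 1 / lam)
print("sample median:", np.median(samples), "  expected ln(2)/lambda:", np.log(2) / lam)
```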

But for most cases I don’t care if it’s not the exact mean; it’s just a convenient way of getting a representative daily temperature when that is the only data you have available.”

We *know* you don’t care if your representation is actually physically realistic or not. You never have!

“when that is the only data you have available.”

We’ve had the data for over 20 years. Why hasn’t climate science changed over to using the more realistic metric of degree-days?

Because tradition? Appeal to tradition: believing something is right just because it’s been done that way for a really long time.

Assumes: 1) The old way of thinking was proven correct when introduced, i.e. since the old way of thinking was prevalent, it was necessarily correct; 2) The past justifications for the tradition are still valid at present.

When is climate science going to change? If not now, then when?

As I said last time, if the distribution is skewed it’s more likely that (Tmax+Tmin)/2 will be closer to the mean than it will be to the median temperature.”

Malarky! The median is *always* the median! (Tmax+Tmin)/2 IS ALWAYS THE MEDIAN. It can’t be closer to the mean.

“but for some reason you think that all distributions must be Gaussian.”

OMG! *I* am the one that has been trying , for at least two years, to educate you on the fact that not all distributions are Gaussian. Multiple single measurements of different things is almost guaranteed to not be Gaussian and, after two years, YOU STILL DON’T BELIEVE THAT!



Bellman
Reply to  Tim Gorman
March 18, 2023 12:41 pm

And you still can’t accept that (Tmax+Tmin)/2 is ALWAYS a median value.

I really can’t figure out at this point if you are deliberately lying or suffering from some cognitive decline.

No. I said that (Tmax+Tmin)/2 is the median. When you have just two values the median and the mean are identical. What I can’t understand is why you are so insistent it has to be called the median and not the mean.

It is only an average value if you have a Gaussian distribution.

Nonsense squared.

Nor can you accept that the median value of a combination of a sine wave distribution and an exponential decay distribution is *NOT* an average, it is a median.

Of course I accept that a median is a median. (It’s semantics whether you consider a median to be a type of average – some sources do others don’t).

But we don’t know what the true median is anymore than we know the true mean when we only have two values. By all means, try to estimate the true mean or median based on a solar model, and then take it up with all those who insist that infilling is always wrong.

Bellman
Reply to  Tim Gorman
March 18, 2023 2:33 pm

Because tradition?

Because you want to see what’s been happening over more than 20 years.

And I don’t have to take being called a traditionalist from someone who still uses feet and inches.

Malarky! The median is *always* the median!

And the mean is always the mean. Go on. Demonstrate that the median is yet another word you like to throw about without knowing what it means.

(Tmax+Tmin)/2 IS ALWAYS THE MEDIAN.

The MEDIAN of what? It’s trivially the median of the two values, but you seem to want it to have a deeper meaning.

OMG! *I* am the one that has been trying , for at least two years, to educate you on the fact that not all distributions are Gaussian.

He he. Someone doesn’t like the taste of their own medicine. Yes. I know you keep trying to “educate me” that not all distributions are Gaussian. For some reason you repeatedly do it no matter how many times I tell you that not all distributions are Gaussian.

Multiple single measurements of different things is almost guaranteed to not be Gaussian and, after two years, YOU STILL DON’T BELIEVE THAT!

That depends on the distribution of the different things you are measuring. Random values from a Gaussian distribution will have a Gaussian distribution. Random values from a uniform distribution will have a uniform distribution.

But none of this matters because there is no requirement that the distribution be Gaussian. That’s just your fantasy.

Tim Gorman
Reply to  Bellman
March 19, 2023 5:15 am

Because you want to see what’s been happening over more than 20 years.”

ROFL! So you should never start using the new process. That just ensures that you’ll *never* have the benefits of the better method!

Again, “Tradition”, to quote Tevye. Never change is the battle cry.

You didn’t even bother to read about the Appeal to Tradition, did you? “The past justifications for the tradition are still valid at present.”

And the mean is always the mean. Go on. Demonstrate that the median is yet another word you like to throw about without knowing what it means.”

(Tmax+Tmin)/2 is the median. It never changes even if the daytime and nighttime distributions change. Therefore it can’t tell you what is going on all by itself. It’s the same with the average. The average *can* change without changing the median. So the average by itself isn’t sufficient to tell what is going on.

You just love arguing blue is green, don’t you? A true troll.

“The MEDIAN of what? It’s trivially the median of the two values, but you seem to want it to have a deeper meaning.”

*I* am not the one that wants it to have a deeper meaning! You haven’t read a single thing I’ve said, have you? Put down the bottle! Why do you think I keep telling you that (Tmax+Tmin)/2 is a median and is a piss-poor way to model the climate profile? You are the one that wants the median to have a deeper meaning so it can be used to model the Global Average Temperature. It should be renamed the GLOBAL MEDIAN TEMPERATURE because it has nothing to do with the “average” temperature.

Bellman
Reply to  Tim Gorman
March 19, 2023 2:45 pm

ROFL! So you should never start using the new process.

You can use your new process whenever you want. But if you want to compare current temperatures with those in the 1930s it won’t be much use.

Again, “Tradition”

Yet every time there’s the slightest change in the methodology people here scream “fraud” and start pointing to 25 year old charts to show the temperatures they prefer.

(Tmax+Tmin)/2 is the median.

It is. But the question is why you think it’s not the mean.

It never changes even if the daytime and nighttime distributions change.

Unless that change in the distributions causes the min or max to change.

The average *can* change without changing the median.

The average of what? Are you talking about the average of max and min, or the actual daily distribution? I’m sure it’s accidental but you do keep bringing in these ambiguities.

If you mean the average of max and min, you are wrong. The mean and median will always be in lockstep, because they are the same thing. If you mean the “true” daily mean and median, then it’s possible but also the median could change without the mean changing.

Why do you think I keep telling you that (Tmax+Tmin)/2 is a median

I don’t know. That’s why I keep asking you to explain yourself. But as usual all I get is patronizing insults and rants.

Tim Gorman
Reply to  Bellman
March 19, 2023 5:30 am

no matter how many times I tell you that not all distributions are Gaussian.”

EVERY TIME you assume that all measurement uncertainty cancels you are assuming that all distributions are Gaussian. You keep denying it but you do it EVERY SINGLE TIME.

“That depends on the distribution of the different things you are measuring.”

See! You are doing it again! Assuming that measurements of different things give you a Gaussian distribution. If you measure one horse out of each of the multiplicity of horse species you will *NOT* get a Gaussian distribution. If you measure the crankshaft journal diameter on a 327 cu. in. V8 engine, on a 409 cu. in. V8 engine, a Ford 351 cu. in. V8, etc. you will *NOT* get a Gaussian distribution. If you measure the diameter of Big Boy, Early Girl, Beefeater, and etc. tomatoes you won’t get a Gaussian distribution.

Why do you think the measurements of different things will give you a Gaussian distribution? Even the daily temperature profile is not Gaussian or symmetrical.

“But none of this matters because there is no requirement that the distribution be Gaussian. That’s just your fantasy.”

No, not *MY* fantasy. Taylor, Bevington, and Possolo *all* say that. None of their books, notes, or papers show how to handle measurement uncertainty for a skewed distribution using just the average value of the measurements as a “true value”.

*YOU* are the only one that is adamant about assuming that all measurements of different things are Gaussian and the average is a “true value”.

bdgwx
Reply to  Tim Gorman
March 19, 2023 1:14 pm

TG said: “No, not *MY* fantasy. Taylor, Bevington, and Possolo *all* say that.”

Show me where Taylor, Bevington, and Possolo say that a distribution has to be Gaussian for the law of propagation of uncertainty to hold. I expect 3 different links with 3 exact page numbers. Stay focused and stay on topic. Don’t deflect. Don’t divert.

Jim Gorman
Reply to  bdgwx
March 20, 2023 6:53 pm

Tim has been busy so I’ll give you an answer. First, you are creating a strawman argument and taking his statement out of context. Here are the pertinent statements.

“EVERY TIME you assume that all measurement uncertainty cancels you are assuming that all distributions are Gaussian. You keep denying it but you do it EVERY SINGLE TIME.”

“Why do you think the measurements of different things will give you a Gaussian distribution? Even the daily temperature profile is not Gaussian or symmetrical.”

Neither of these mention the propagation of uncertainty. They are about achieving the true value of a measurand such as the current temperature at a given point at a given time. Not one climate temperature exists that has a distribution of multiple measurements of the same thing with the same device. Not even ASOS provides this.

This from the GUM:

B.2.15

repeatability (of results of measurements)

closeness of the agreement between the results of successive measurements of the same measurand carried out under the same conditions of measurement

NOTE 1 These conditions are called repeatability conditions.

NOTE 2 Repeatability conditions include:

— the same measurement procedure

— the same observer

— the same measuring instrument, used under the same conditions

— the same location

— repetition over a short period of time.

NOTE 3 Repeatability may be expressed quantitatively in terms of the dispersion characteristics of the results.

[VIM:1993, definition 3.6]”

B.2.17

experimental standard deviation

for a series of n measurements of the same measurand, the quantity s(q_k) characterizing the dispersion of the results and given by the formula:

(formula not copied)

q_k being the result of the k-th measurement and q̄ being the arithmetic mean of the n results considered

NOTE 1 Considering the series of n values as a sample of a distribution, q̄ is an unbiased estimate of the mean μ_q, and s²(q_k) is an unbiased estimate of the variance σ² of that distribution.

NOTE 2 The expression s(q_k)/√n is an estimate of the standard deviation of the distribution of q̄ and is called the experimental standard deviation of the mean.

NOTE 3 “Experimental standard deviation of the mean” is sometimes incorrectly called standard error of the mean.

NOTE 4 Adapted from VIM:1993, definition 3.8.

B.2.21

random error

result of a measurement minus the mean that would result from an infinite number of measurements of the same measurand carried out under repeatability conditions

NOTE 1 Random error is equal to error minus systematic error.

NOTE 2 Because only a finite number of measurements can be made, it is possible to determine only an estimate of random error.

[VIM:1993, definition 3.13]

Note here that experimental standard deviation still requires using the same measurand.

From TN 1900:

“Questions are often asked about whether it is meaningful to qualify uncertainty evaluations with uncertainties of a higher order, or whether uncertainty evaluations already incorporate all levels of uncertainty. A typical example concerns the average of n observations obtained under conditions of repeatability and modeled as outcomes of independent random variables with the same mean μ and the same standard deviation σ, both unknown a priori.

The standard uncertainty that is often associated with such average as estimate of μ equals s/√n, where s denotes the standard deviation of the observations. However, it is common knowledge that, especially for small sample sizes, s/√n is a rather unreliable evaluation of u(μ) because there is considerable uncertainty associated with s as estimate of σ. But then should we not be compelled to consider the uncertainty of that uncertainty evaluation, and so on ad infinitum, as if climbing “a long staircase from the near foreground to the misty heights” (Mosteller and Tukey, 1977, Page 2)?

The answer, in this case, with the additional assumption that the observations are like a sample from a Gaussian distribution, is that a (suitably rescaled and shifted) Student’s t distribution shortcuts that staircase (Mosteller and Tukey, 1977, 1A) and in fact captures all the shades of uncertainty under consideration, thus fully characterizing the uncertainty associated with the average as estimate of the true mean. Interestingly, this shortcut to that infinite regress is obtained under both frequentist (sampling-theoretic) and Bayesian paradigms for statistical inference.”

“3 Measurement uncertainty is the doubt about the true value of the measurand that remains after making a measurement. Measurement uncertainty is described fully and quantitatively by a probability distribution on the set of values of the measurand. At a minimum, it may be described summarily and approximately by a quantitative indication of the dispersion (or scatter) of such distribution.”

“(iii) Student’s t, Laplace, and hyperbolic distributions are suitable candidates for situations where large deviations from the center of the distribution are more likely than under a Gaussian model.”

“(7d) The statistical methods preferred in applications involving observation equations are likelihood-based, including maximum likelihood estimation and Bayesian procedures (DeGroot and Schervish, 2011; Wasserman, 2004), but ad hoc methods may be employed for special purposes (Example E22).

EXAMPLES: Examples E2, E14, E17, E18, E20 and E27 illustrate maximum likelihood estimation and the corresponding evaluation of measurement uncertainty.”

Lastly, look at the image from Taylor. If “errors” are not Gaussian and are skewed, the “errors” will not cancel out and the “true value” will not be an accurate portrayal of the appropriate measurement.

[image: Taylor Gaussian Distribution.jpg]
Bellman
Reply to  Jim Gorman
March 21, 2023 7:02 am

If you want to avoid misunderstandings about your claims, you need to explain what you are talking about better.

Tim’s assertion was that if you assume all measurement uncertainty cancels you are implying all distributions are Gaussian. What do you mean by “all” measurement uncertainties cancel? What distribution are you talking about? The distribution of measurement errors, or the population of things you are measuring?

In neither case is there any requirement for any distribution to be Gaussian in order for some of the uncertainties to cancel when you average multiple things, and none of your quotes suggest that. It can be useful to assume a Gaussian for some cases such as hypothesis testing, but that does not mean having non-Gaussian distributions results in errors not cancelling when taking an average.

I do suspect the problem is as simple as not understanding what a Gaussian distribution is. In a past discussion it did seem that Tim thought Gaussian just meant symmetrical.

As to your three over long quotes:

The GUM one doesn’t mention Gaussian once

Taylor is only saying that it’s reasonable to assume the result of a measurement based on multiple errors is likely to be Gaussian. It does not require the individual errors to be Gaussian. Indeed, he’s explicitly assuming they are not Gaussian: each is either + or – a fixed value.

TN 1900 is the only one that requires the assumption of a Gaussian distribution, and that’s in the context of using a Student-t distribution. This is correct, but again does not mean that a small sample from a non-Gaussian distribution will not have some cancellations. It’s about what shape the sampling distribution will have.

BTW, you might have taken note of this when you were using a Student-t distribution to calculate the uncertainty of the annual temperature, or of the daily average temperature based on max and min values.

Jim Gorman
Reply to  Bellman
March 21, 2023 6:47 pm

You’re full of it. Here is a university discussion on skewed distributions.

web.ma.utexas.edu/users/mks/statmistake

Some remarks from the web site:

“””””But if a distribution is skewed, then the mean is usually not in the middle”””””

“””””A better measure of the center for this distribution would be the median”””””

“””””But many common statistical techniques are not valid for strongly skewed distributions.”””””

“””””Indeed, if you know a distribution is normal, then knowing its mean and standard deviation tells you exactly which normal distribution you have.”””””

“””””For a normal distribution, the standard deviation is a very appropriate measure of variability (or spread) of the distribution.”””””

“””””But for skewed distributions, the standard deviation gives no information on the asymmetry. It is better to use the first and third quartiles4, since these will give some sense of the asymmetry of the distribution.”””””

This site recommends using quartile (five number) regression to better see what is occurring. You have been told this before.

Look, if you end up with a skewed distribution after taking multiple measurements of the same thing, then canceling uncertainty is the least of your problems.

Ultimately, a skewed distribution can not predict a “true value”! If you think otherwise then you need to show references backing up your position.

I have repeatedly shown references for my position. You have shown NONE. Guess you can’t find any!

Bellman
Reply to  Jim Gorman
March 21, 2023 8:36 pm

You’re full of it.

I’ll take that as a compliment.

Some remarks from the web site

Not one of them says that uncertainties don’t cancel if you don’t have a Gaussian distribution. Again, non-Gaussian does not mean the distribution is necessarily skewed.

But if a distribution is skewed, then the mean is usually not in the middle

That’s pretty much the definition of a skewed distribution.

Indeed, if you know a distribution is normal, then knowing its mean and standard deviation tells you exactly which normal distribution you have.

Indeed. That’s why the CLT is so useful.

But for skewed distributions, the standard deviation gives no information on the asymmetry.

Very observant, standard deviation is not a measure of skewness.

I have repeatedly shown references for my position. You have shown NONE. Guess you can’t find any!

What position? The claim was that a distribution had to be Gaussian for cancellation to occur.

EVERY TIME you assume that all measurement uncertainty cancels you are assuming that all distributions are Gaussian. You keep denying it but you do it EVERY SINGLE TIME.

Your response is to point to facts that have zero to do with the requirement that distributions be Gaussian.

Jim Gorman
Reply to  Bellman
March 22, 2023 9:38 am

You are trying to make a straw man to win an argument. Okay, make some triangular and some uniform and others Gaussian. You must still assume that all errors cancel, EVEN SYSTEMATIC. You still are dancing around the problem.

Find this book and download it.

“Measurement Uncertainty: A Reintroduction”

https://nrc-publications.canada.ca/eng/view/object/?id=1bfd93be-dba3-42ee-b1c8-180dcd3b3c61

I have included a page as an image. The page discusses finding the uncertainty of a simple functional relationship. Since you insist that an average is a functional relationship, let’s find the uncertainty using the info on this page.

Equation definitions —

TmA => monthly average of daily averages.

Tavg_n => daily Tavg_1 … Tavg_n

Tavg_n uses daily averages from Topeka Forbes for January 1953.

Uncertainty of measurements = ±1°F

TmA = (Σ(Tavg_1 + … + Tavg_n)) / n

“n” disappears in the following equation because it is a defined number with no uncertainty just as the book shows for “π”

(u(TmA) / TmA)^2 = (u(Tavg_1) / Tavg_1)^2 + … + (u(Tavg_n) / Tavg_n)^2

u(TmA) = TmA • √[(u(Tavg_1) / Tavg_1)^2 + … + (u(Tavg_n) / Tavg_n)^2]

u(TmA) = 33.9 • √{(1 / 34)^2 + … + (1 / 44)^2}

u(TmA) = 33.9 • √ 0.039 = ±6.7°F

Topeka Forbes January 1953 Average Temperature is:

33.9 ± 6.7° F
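For reference, a short Python sketch of the arithmetic as written above; the daily averages are hypothetical stand-ins for the Topeka Forbes values, so the numbers will differ somewhat from 33.9 ± 6.7.

import numpy as np

# 31 hypothetical January daily averages (°F); not the actual Topeka Forbes record
tavg = np.array([34, 28, 31, 40, 44, 36, 30, 27, 33, 38, 41, 35, 29, 32, 37, 39,
                 26, 31, 34, 36, 42, 33, 30, 28, 35, 37, 40, 32, 31, 34, 38], dtype=float)
u_daily = 1.0                                  # stated uncertainty of each daily average, ±1°F

TmA = tavg.mean()                              # monthly average of the daily averages
rel = np.sqrt(np.sum((u_daily / tavg) ** 2))   # quadrature sum of the relative uncertainties, as written above
print(f"TmA = {TmA:.1f} ± {TmA * rel:.1f} °F")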

You have been shown example after example and reference after reference. You continuously post without including any references to support your position. The Internet has a plethora of statistics books and university based statistics information. In the future if you do not post a reference to support a claim and assumptions, you will only get a 3 word response — SHOW A REFERENCE.

[Attached image: PSX_20230322_091437.jpg]
Bellman
Reply to  Jim Gorman
March 22, 2023 11:07 am

You are trying to make a straw man to win an argument.

Do you understand the meaning of the word? The claim that
I assume all distributions are Gaussian is exactly what Tim has been saying for years. I repeatedly suggested he doesn’t understand what Gaussian means, but now I’m accused of making a straw man argument when I point out he’s wrong about Gaussian distributions.

You must still assume that all errors cancel, EVEN SYSTEMATIC.

Again, that ambiguity of the word “all”. I keep asking what you mean by all errors cancel, but just get the usual platitudes thrown back.

I absolutely do not believe that systematic errors cancel. That’s the very definition of systematic error. But however many times I point this out, it gets lost in the repeated lie about “Gaussian” distributions.

As I tried to point out to Tim months ago, the issue is not about the shape of the distribution or even whether it’s skewed. It’s about what its mean is. If you are talking about measurement errors, then if the mean of their distribution is zero, all errors are random and will tend to cancel as the number of measurements increases. If the mean is not zero, then repeated measurements will tend to cancel out towards that mean. Hence you are left with a systematic error equal to the mean of the distribution.

This is why the continued lies about assuming all distributions are Gaussian are so distracting. A systematic error can occur with a Gaussian distribution, and there may be no systematic error with a non-Gaussian or even skewed distribution. The only thing that matters is what the mean of the distribution is.

Jim Gorman
Reply to  Bellman
March 22, 2023 4:53 pm

Show a reference . Your claims need support.

Bellman
Reply to  Jim Gorman
March 22, 2023 6:14 pm

Which claim? I keep showing references, but you just ignore them, or change the subject, or claim I’m cherry-picking, or ignoring assumptions, or the reference can only be used in some exact way.

Bellman
Reply to  Jim Gorman
March 22, 2023 6:32 pm

A reference to my claim that non-Gaussian, including skewed, distributions will have errors that cancel can be found in just about any reference to the Central Limit Theorem, e.g.

https://en.wikipedia.org/wiki/Central_limit_theorem

See attached diagram.

Whatever the form of the population distribution, the sampling distribution tends to a Gaussian, and its dispersion is given by the central limit theorem.

For large enough n, the distribution of X_bar_n gets arbitrarily close to the normal distribution with mean μ and variance σ^2 / n.

The usefulness of the theorem is that the distribution of √n(X_bar_n − μ) approaches normality regardless of the shape of the distribution of the individual X_i.
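A minimal NumPy simulation of the behaviour described in those quotes; the strongly skewed exponential population is an assumed example, not taken from the Wikipedia page.

import numpy as np

rng = np.random.default_rng(0)
n, trials = 30, 100_000

# Exponential population: right-skewed, mean 1, standard deviation 1 (not Gaussian)
samples = rng.exponential(scale=1.0, size=(trials, n))
means = samples.mean(axis=1)

print(f"mean of sample means    = {means.mean():.3f}   (population mean = 1)")
print(f"std dev of sample means = {means.std():.3f}")
print(f"sigma / sqrt(n)         = {1 / np.sqrt(n):.3f}")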

[Attached image: Screenshot 2023-03-23 at 01-24-42 Central limit theorem - Wikipedia.png]
Jim Gorman
Reply to  Bellman
March 23, 2023 11:26 am

Why, why, why do you want to always use the CLT as proof of anything?

1st, read this:

“””””In probability theory, the central limit theorem (CLT) establishes that, in many situations, for identically distributed independent samples, the standardized sample mean tends towards the standard normal distribution even if the original variables themselves are not normally distributed.”””””

See that part that says “identically distributed independent samples”. Are the samples you use INDEPENDENT? Daily Tmax & Tmin are correlated at better than 0.90. They are NOT Independent! Tavg is not an accepted transform to convert from dependent random variables to independent random variables.

Have you plotted all the “samples” to determine if they have identical distribution? I’ll guarantee that there will be differences as the seasons change. Prove me wrong with data.

2nd, you realize that the CLT only deals with estimating the mean, right?

Read this closely from the wiki page.

“””””For large enough n, the distribution of X_bar_n gets arbitrarily close to the normal distribution with mean μ and variance σ^2/n.”””””

What does this tell you?

That you may infer from the CLT statistic of μ that the population mean is also μ.

However, the CLT statistic of “s^2” (σ^2/n) does not infer that the population variance is “s^2”. In fact the inference is that the population variance is σ^2 = s^2 • n!

Basically, the CLT is used to obtain statistics that can be used to infer the population descriptors of μ and σ. The population parameters do not change. As “n” increases the variance of the samples gets smaller and smaller. That DOES NOT mean the population variance also gets smaller.

Remember, you need to show independence and identical distribution using data when trying to show either the LLN or CLT assumptions are met.
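A short simulation (Python; the Gaussian population is assumed purely for illustration) of the relationship being described here: the variance of the sample mean shrinks as σ^2/n, while the population variance stays fixed.

import numpy as np

rng = np.random.default_rng(1)
sigma = 10.0                                   # assumed population standard deviation

for n in (10, 100, 1000):
    samples = rng.normal(0.0, sigma, size=(10_000, n))
    means = samples.mean(axis=1)
    print(f"n = {n:4d}  Var(sample mean) = {means.var():7.3f}  "
          f"sigma^2/n = {sigma**2 / n:7.3f}  population variance = {sigma**2:.0f}")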

Bellman
Reply to  Jim Gorman
March 23, 2023 1:01 pm

Again, the futility of providing references to people who don’t understand what they say, and will just move the goal posts.

Why, why, why do you want to always use the CLT as proof of anything?

Your claim was that only Gaussian distributions have errors that cancel. You don’t really define what you mean by that, but insist I provide a reference for my claim that Gaussian has nothing to do with errors cancelling. I say the CLT which specifically says that non-Gaussian distributions will have errors that cancel when you take a sample is a reference to why distributions do not have to be Gaussian to cancel.

So the response is to move the goal posts.

Are the samples you use INDEPENDENT?

Yes. My hypothetical samples from a non-Gaussian distribution are INDEPENDENT. Even if they are not, it does not mean that no cancellation occurs.

Daily Tmax & Tmin are correlated at better than 0.90. They are NOT Independent!

That has nothing to do with the claim. I was the one pointing out to you that Tmax and Tmin are not a random sample from the distribution of temperatures during the day, when you were trying to use the CLT to provide ridiculously large uncertainty intervals.

Regardless, the fact that there is a correlation between tmax and tmin is irrelevant if what you are trying to do is find the average of the two.

Have you plotted all the “samples” to determine if they have identical distribution?

What samples? If you take a random sample from a fixed population, by definition they are identically distributed, because each has the distribution of the population. What has this to do with whether the distribution is Gaussian or not?

Bellman
Reply to  Bellman
March 23, 2023 1:24 pm

Continued.

2nd, you realize that the CLT only deals with estimating the mean, right?”

Or the sum. But it’s the mean we are interested in. If you aren’t talking about the mean when you ask about errors cancelling, what are you claiming?

Read this closely from the wiki page.

You mean the part I quoted to you?

“””””For large enough n, the distribution of X_bar_n gets arbitrarily close to the normal distribution with mean μ and variance σ^2/n.”””””

What does this tell you?

It tells me that as sample size increases the sampling distribution of the mean tends to a normal distribution with increasingly small standard deviation. The last part implies that cancellation of errors is occurring.

That you may infer from the CLT statistic of μ that the population mean is also μ.

Gibberish. What do you mean by the CLT statistic? The mean of a sample is more likely to be closer to the actual population as sample size increases, but it will never be guaranteed to be equal to the population mean unless the sample size is infinite.

However, the CLT statistic of “s^2” (σ^2/n) does not infer that the population variance is “s^2”.

Gibberish squared. There is nothing in the CLT that is intended to tell you what the population variance is. The CLT tells you that if you know what the population variance is, or more conveniently the standard deviation, then you can say what the standard deviation / variance of the sampling distribution is. If you don’t know the population standard deviation you have to infer it from the standard deviation of the sample.

In fact the inference is that the population variance is σ^2 = s^2 • n!

Wrong in at least two ways. First s^2 is the variance of the sample, not the SEM^2. Second, you are getting the logic backwards. You start with the standard deviation and work out the SEM.

Basically, the CLT is used to obtain statistics that can be used to infer the population descriptors of μ and σ.

You really don’t understand how to use the CLT, do you?

The population parameters do not change.

Hopefully not.

As “n” increases the variance of the samples gets smaller and smaller.

You mean the variance of the sampling distribution. Not the variance of the sample. Sample variance should tend towards population variance.

That DOES NOT mean the population variance also gets smaller.

Duh! You do have this ability to make the most obvious truisms sound like some major revelation. The population is the population. It doesn’t matter what size sample you take of it, it remains unchanged.

Tim Gorman
Reply to  Bellman
March 23, 2023 3:31 pm

The last part implies that cancellation of errors is occurring.”

It does *NOT* imply that at all! It only means you are approaching the population mean. It says *NOTHING* about the uncertainty of that mean.

Errors do not determine the mean, the mean is determined from the stated values, not the uncertainty values.

If you had *every* member of the population you could calculate the population average without any samples. That does *NOT* imply that the population mean has no uncertainty. The uncertainty of the mean is determined by the propagation of the uncertainties associated with the population members.

The CLT says NOTHING about the uncertainty of the mean – period, exclamation point.

Bellman
Reply to  Tim Gorman
March 23, 2023 4:42 pm

It does *NOT* imply that at all! It only means you are approaching the population mean.

How do you get closer to the population mean without errors cancelling?

The error of any individual value is its distance from the population mean. The error of a sample is the distance of the sample mean from the population mean. If a sample of 100 is likely to be closer to the mean than a sample of 10, then that can only be because some of the errors cancelled each other.

Errors do not determine the mean, the mean is determined from the stated values, not the uncertainty values.

You keep confusing things. We can either be talking about random samples from a population, or in your preferred case, random measurement errors around the true value of something. In either case random errors will cancel when you take a sample, whether that’s random samples from a distribution or random measurements of the same thing, or a combination of the two. In either case the CLT applies irrespective of the population / measurement error distribution. Cancellation of errors does not imply a Gaussian distribution.

Jim Gorman
Reply to  Bellman
March 23, 2023 5:52 pm

And around and around and around we go.

Why do you think this is measurement error? It is not by any stretch of the imagination!

Do you not understand that this is a standard deviation interval of the sampling distribution that defines where the true mean may lie? As you increase the samples the interval most assuredly converges on the population mean.

It has nothing to do with decreasing the “measurement error”. I know this has been told to you many times before.

Remember, that
σ_population = σ_sample • √n
You don’t gain anything as to the population mean or the population standard deviation from increasing samples. The measurement uncertainty remains in the measured data, not the sampling distribution.

If this is all you got, you are way overdone!

Bellman
Reply to  Jim Gorman
March 23, 2023 7:15 pm

And around and around and around we go.

Yes, because every time I try to explain something, you try to turn the argument around. This whole discussion was about the claim that you could only get cancellation of errors with a Gaussian distribution. You’ve now turned it into a discussion about your lack of understanding of the CLT and just about everything else.

Just specify what your claim actually is. Define your terms. Stop making stuff up. Otherwise this discussion will keep going round and I’ll remain happy to see you continuing to blow yourselves up with your own petards.

Why do you think this is measurement error?

I specifically said we could be talking about either. You never specify which distribution you are talking about or which errors, but it works either with a non Gaussian population, or non-Gaussian measurement errors. The CLT can be used for either.

Do you not understand that this is a standard deviation interval of the sampling distribution that defines where the true mean may lay?

Strictly speaking it defines the distribution of the sample average. Remember the probability of the true mean lying within a given interval is either 1 or 0. Unless we are getting Bayesian.

As you increase the samples the interval most assuredly converges on the population mean.

As you increase the sample size the interval tends to converge on the population mean.

It has nothing to do with decreasing the “measurement error”.

Unless the population is the distribution of all possible measurement errors – i.e. measurement uncertainty.

I know this has been told to you many times before.

Only by people who don’t understand what they are talking about.

Remember, that
σ_population = σ_sample • √n

What is σ_sample in this upside down equation? It should be the standard error of the mean, but you keep confusing it with the standard deviation of the sample.

You don’t gain anything as to the population mean or the population standard deviation from increasing samples.

Again, what do you mean by increasing samples? You keep using this ambiguous language. You get a better, less uncertain, estimate for the population mean by increasing sample size. But you keep confusing this with the concept of having a large number of different samples.

The measurement uncertainty remains in the measured data, not the sampling distribution.

But the measurement uncertainty is part of the sampling distribution.

Tim Gorman
Reply to  Bellman
March 24, 2023 6:09 am

This whole discussion was about the claim that you could only get cancellation of errors with a Gaussian distribution.”

Can you whine any louder? You’ve already been told that Gaussian is just a substitute for trying to list out all the symmetrical distributions!

And not all symmetrical distributions even meet the criteria for cancellation.

If you are trying to win the argument that there are other symmetrical distributions than Gaussian then YOU WIN! So what?

Exactly what does that have to do with anything?

Uncertainty intervals HAVE NO DISTRIBUTION. Only stated values have distributions. Uncertainty defines an interval in which the true value can lie – PERIOD. There is nothing that says it is more likely that the true value lies closer to the mean! That is assuming that uncertainty is *not* an unknown! That you know the probability associated with each and every possible value in the interval – and you don’t!

Take the uniform distribution such as from rolling a six-sided die. It is symmetrical around the mean but every value has the same probability. So what is the uncertainty interval? +/- 2.5? Is it more likely that you will get a 3 or 4 than a 6 since the 3 and 4 are closer to the mean? Do the uncertainties cancel?

“Just specify what your claim actually is.”

More whining. Where do I send the tiny violin and crying towels?

The CLT can be used for either.”

The CLT works for finding the mean of the stated values of a population. It says NOTHING about uncertainty. You *always* circle back around to assuming all measurement uncertainty is SYMMETRICAL and cancels! ALWAYS!

Remember the probability of the true mean lying within a given interval is either 1 or 0. Unless we are getting Bayesian.”

So what? You simply don’t know what the true value *IS*. Period. That’s why you can never get to 100% accuracy with measurements!

As you increase the sample size the interval tends to converge on the population mean.”

But that population mean can be as UNCERTAIN AS ALL GET OUT! The population mean is only based on the stated values. If those stated values are inaccurate then the MEAN WILL BE INACCURATE. How close you get to the mean does *NOT* define the uncertainty of the mean!

Once again we find you circling back to the assumption that all measurement uncertainty cancels and the stated values define the uncertainty of the mean. EVERY SINGLE TIME!

“Unless the population is the distribution of all possible measurement errors – i.e. measurement uncertainty.”

And here we are again! All measurement uncertainty cancels and the stated values become the measurement uncertainty. No matter how many times you deny you make this assumption it just stands out in EVERY THING YOU POST!

But the measurement uncertainty is part of the sampling distribution.”

NO, it isn’t. How do you find the mean of an unknown? The measurement uncertainty is an INTERVAL. How do you find the mean of multiple intervals? What is the mean of +/- 2.5 and +/- 6?

You can find the direct addition of the intervals – but that isn’t a mean. You can find the root-sum-square of the intervals but that isn’t a mean either.

And here we are again! All measurement uncertainty cancels and the stated values become the measurement uncertainty. No matter how many times you deny you make this assumption it just stands out in EVERY THING YOU POST!

Bellman
Reply to  Tim Gorman
March 24, 2023 7:04 am

You’ve already been told that Gaussian is just a substitute for trying to list out all the symmetrical distributions!

So all this time you’ve been accusing me of assuming that all distributions were Gaussian, you were lying? Every time I pointed out that I didn’t believe all distributions were Gaussian and you told me I did, you were lying.

Honestly!? How do you hope to be taken seriously if you a) just make up your definitions, and then b) yell at people for not understanding your made-up definitions?

I suggested several months ago that the confusion was that you didn’t understand what a Gaussian distribution was, and confused it with any symmetrical distribution. Yet you’ve just repeated in the months following that I believe that all distributions are Gaussian.

Tim Gorman
Reply to  Bellman
March 24, 2023 5:46 am

How do you get closer to the population mean without errors cancelling?”

Assume you have the ENTIRE POPULATION. An infinite number of measurements of the form “stated value +/- uncertainty”.

What is the mean calculated from? The stated values or the uncertainty?

How does calculating the mean of the population using the stated values do *anything* to the uncertainty part of the measurement? The uncertainty part has nothing to do with the mean.

All you have is the mean of the population. The uncertainty of each measurement must *still* be propagated onto the mean in order to find the uncertainty of the mean!

The uncertainty of that mean is *NOT* the average uncertainty!

In fact, do the calculation. If you have an infinite number of measurements then what is the total uncertainty?

u(q) = sqrt[ infinity * u(x)^2]

What is the uncertainty? What is the average uncertainty?

u(q)_avg = sqrt[ infinity * u(x)^2 / infinity^2 ] What in Pete’s name is that?

The error of any individual value is its distance from the population mean.”

How many times does it have to be repeated for you to finally memorize it? Uncertainty is *NOT* error. Even the GUM says this. You have yet to internalize that simple truth and it lies at the heart of your total misunderstanding of what metrology is all about.

You keep on denying it but it shows up in every single thing you post – you assume all measurement uncertainty is random and Gaussian.

I keep telling you that uncertainty doesn’t have a distribution. It is an interval within which the true value can lie. There is *NOTHING* that says the true value is probably closer to the stated value than not UNLESS you assume the uncertainty is Gaussian and random!

It is *only* when you assume that the uncertainty is random, Gaussian, and cancels so that you can use the stated values to determine uncertainty that you can assume that measurements are closer to the true value than not. And even this only applies when you are measuring the same thing multiple times using the same device under conditions of repeatability.

When you are measuring different things one time each there is no “true value”. It’s like trying to combine weight measurements of cats with dogs. What is the “true value” you are going to find? It doesn’t make any sense. It’s like combining weight measurements of cucumbers and pumpkins. Again – it doesn’t make any sense. Combining temperature measurements from different locations with different variances is no different. You are combining different things and trying to find a “true value”. It doesn’t make any sense. And anomalies don’t help! Anomalies carry the same problem as the absolute temperatures – different variances. Winter temps have wider variances than summer temps. So do the anomalies.

It’s what makes the GAT unfit for purpose. You would be better off just assigning a plus sign or minus sign to the daily mid-range value if it is higher or lower than the average. Then at the end of the month just add up all the signs and see if you have more pluses or more minuses. Then go around the globe and add up all the local pluses and minuses and see what you get. Stop with trying to find a GAT out to the hundredths digit. The uncertainty associated with the GAT is so large you simply don’t know what it is.

Jim Gorman
Reply to  Tim Gorman
March 24, 2023 6:14 am

Excellent.

Anomalies are the addition/subtraction of two random variables. The variances of the monthly average and the baseline average should be ADDED.

Ha! But first, you need to calculate them!
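A small sketch (Python; the variances are hypothetical and the two averages are treated as independent, which is an assumption) of the variance-addition rule for the difference of two random variables:

import numpy as np

rng = np.random.default_rng(2)
N = 200_000

monthly  = rng.normal(15.0, 0.5, N)    # hypothetical monthly average, sd 0.5
baseline = rng.normal(14.2, 0.2, N)    # hypothetical baseline average, sd 0.2
anomaly  = monthly - baseline

print(f"Var(monthly) + Var(baseline) = {0.5**2 + 0.2**2:.3f}")
print(f"Var(anomaly) from simulation = {anomaly.var():.3f}")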

Bellman
Reply to  Jim Gorman
March 22, 2023 11:35 am

Find this book and download it.

Not again. We’ve already been through this example months ago. It doesn’t matter how many times you find books that you think show that measurement uncertainty increases with sample size, it always comes back to you simply not understanding the maths.

“n” disappears in the following equation because it is a defined number with no uncertainty just as the book shows for “π”

And here’s where it happens. “n” does not disappear. It doesn’t matter if you are using the general “partial differential” equations or the rules for division derived from it, you always end up dividing the uncertainty by “n” to get the uncertainty of the average.

π doesn’t disappear from that equation either, clumsy wording notwithstanding. π doesn’t appear as a term on the RHS because its uncertainty is zero. But it’s there on the LHS, hiding in V. Take π out of the equation for V, and the uncertainty of V would be different.

This is where your misunderstanding leads:

“Topeka Forbes January 1953 Average Temperature is:
33.9 ± 6.7° F”

You are claiming that the uncertainty of the monthly average of January measurements can somehow be almost 7 times larger than the uncertainty of any individual day. That makes no sense. The only way for the average to have an error of 6.7 is for every day to have an average error of 6.7, but you’ve already said that the uncertainty for an individual day is 1.

Jim Gorman
Reply to  Bellman
March 22, 2023 4:55 pm

You need references.

Bellman
Reply to  Jim Gorman
March 22, 2023 6:10 pm

References for what? Which particular observation did you think you couldn’t work out for yourself?

That V = HπR^2?

That you have to multiply the RHS of the equation by V to get the uncertainty of V?

You provided the reference yourself.

That in general if you divide a quantity by an exact value you have to divide the uncertainty by that value to get the derived uncertainty? I’ve continuously provided the exact formula from Taylor. I’ll attach it again in case you didn’t spot it two comments below.

That you can also derive the same result by using the general equation? See GUM Equation 10.

That it makes no sense for the uncertainty of an average to be 7 times bigger than any individual measurement? Well, I doubt I can find a reference for that as it’s so obvious.

[Attached image: Screenshot 2021-08-19 224911.png]
Jim Gorman
Reply to  Bellman
March 23, 2023 9:26 am

I will only explain this one time where you are incorrect. At least you showed a reference that can be discussed. Thank you.

You need to stop cherry picking formulas while not understanding what they mean.

The rule you picked is a special rule used when ONE measurement is used multiple times. Remember, multiplying by a number is basically adding, i.e., if B=3, then δx_total = 3•δx = δx + δx + δx! I don’t think this is really what you want.

C = πd, d = 2r or as in the example in the book, t = (1/200) • T

In other words, THE SAME MEASUREMENT IS USED MULTIPLE TIMES IN A CALCULATION. THEREFORE, THE ERROR IS ALSO ADDITIVE BY THE NUMBER OF TIMES THE MEASUREMENT IS USED.

Lastly, this was shown prior to his use of quadrature. So this is only useful to determine the absolute maximum uncertainty in a functional relationship.

If you had read just a little further in Taylor, you would have seen the exact same formula as used in Possolo/Meija when Taylor begins addressing quadrature.

Now let’s look at an average.

q = x1/n + x2/n + … + xn/n

Now, the fractional uncertainty in “q” => δq/q,

and, the fractional uncertainty of x1/n => (δx1/x1) + (δn/n),
and, of xn/n => (δxn/xn) + (δn/n),

based upon the rule of products and quotients.

Using the fact that n is a defined number with no uncertainty,
then we see,

(δn/n) = 0/n = 0

Therefore we calculate using the rule of products/quotients in quadrature,

δq/q = sqrt[{δx1/x1 + 0}^2 + … + {δxn/xn + 0}^2]

or,

δq = q • sqrt[(δx1/x1)^2 + … + (δxn/xn)^2]

This is the same as Possolo/Meija shows. One must be willing to accept that a defined number has no uncertainty and does not CONTRIBUTE to the total uncertainty, EVER.

Please don’t try to convince me that you can multiply a single value of uncertainty such as ±1° by 1/30 to obtain an overall uncertainty. You first need to multiply ±1 by 30 (days in a month) and then divide by 30 to get an average. Guess what you will get?

Tim Gorman
Reply to  Jim Gorman
March 23, 2023 11:20 am

You are wasting your time. I’ve been over this and over this with bellman and he *never* gets it. He truly believes the average uncertainty is the uncertainty of the average. He’ll never believe otherwise. It’s part of the climate alarmists’ religious dogma. You *can* reduce uncertainty by averaging!

–back to painting–

Bellman
Reply to  Tim Gorman
March 23, 2023 11:37 am

If you want me to get something – maybe try explaining your case rather than just repeating meaningless lies, such as “He truly believes the average uncertainty is the uncertainty of the average.”

I honestly have no idea why you think that. Take the example of 30 days each with a random measurement of ±1°F. The average uncertainty would be 1 * 30 / 30 = 1. I do not think ±1 would be the uncertainty of the average. I think the uncertainty of the average (assuming independence and no systematic errors) would be 1 / √30, about ±0.18°F. I think the average uncertainty would be the worst case, where all the uncertainty was systematic.

You can complain all you want about not being able to convince me that the true uncertainty of the average should be 1 * √30 = ± 5.5°F. But if all you are going to do is repeat the same lies, like you’ve got them on speed dial, don’t be surprised if you continue to waste your time.
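A Monte Carlo sketch (Python) of the figures in this comment, treating the ±1°F as the standard deviation of independent random daily errors (an assumption):

import numpy as np

rng = np.random.default_rng(3)
n, trials = 30, 100_000

errors = rng.normal(0.0, 1.0, size=(trials, n))   # 30 independent daily errors, sd 1°F
avg_error = errors.mean(axis=1)                   # error of each simulated 30-day average

print(f"spread of the average's error = {avg_error.std():.3f}")   # close to 1/sqrt(30)
print(f"1 / sqrt(30)                  = {1 / np.sqrt(30):.3f}")
print("average uncertainty = 1.0 (the fully systematic, worst case)")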

Tim Gorman
Reply to  Bellman
March 23, 2023 3:37 pm

 I think the uncertainty of the average (assuming independence and no systematic errors) would be 1 / √30, about ±0.18°F.”

NO! That would be the average uncertainty. 0.18 * 30 is the uncertainty of the average = +/- 5.4!

When you divide by n you are finding an average! Plain and simple. It doesn’t matter if you are dividing the total uncertainty by the number of members to get an average uncertainty or if you are dividing the sum of the stated values by n in order to find the average value of the members of the population.



Jim Gorman
Reply to  Tim Gorman
March 24, 2023 10:19 am

It is why an average is worthless. An average uncertainty means each element has that uncertainty. We know that can’t be true or you would have no cancelation and every element would have the same ±u_c(y).

The other thing that is easy to forget is that what is actually being analyzed by statistics is a frequency distribution and not just absolute values. It is why a Gaussian distribution is important. The mean has the highest frequency of occurrence. A “+1” doesn’t just cancel a “-1” that occurs ten times. Cancelation only occurs if ten “+1” offset ten “-1”. The frequencies are what determines the distribution.

This is why a skewed distribution doesn’t allow cancelation around the mean. A right skewed distribution may have 20 “-1s” but only 7 “+1s”. It may have 15 “-2s” and 4 “+2s”. Heck, the mean may not have the largest number of occurrences! What is the true value in that case?

It is why averaging measurements of different things is so impossible to analyze. Take Tavg. Say 80 and 60. If the distributions surrounding each temperature are not Gaussian, how in the world could the distributions ever offset each other? Could the frequency of two 79’s offset the frequency of two 59’s? It is why one must assume Gaussian distributions where all uncertainty cancels and you have two temps with no error!

The standard deviation of 80 and 60 is ±14. Why is that? Using the formula for a Gaussian distribution this is what is necessary to have the mean (70) have the highest probability of occurring and still have the two values fall on the Gaussian curve. I know bellman doesn’t believe this but it is true.

What is worse is that Tmax and Tmin have vastly different distributions with vastly different frequencies.

It is one reason why analyzing Tmax and Tmin is necessary.

I’ll see if I can work this out in excel.

Bellman
Reply to  Jim Gorman
March 24, 2023 10:57 am

This is why a skewed distribution doesn’t allow cancelation around the mean.”

Except it does. The CLT works with skewed distributions. It’s just a question of balance. A skewed distribution might have more values to the left of its mean, but the values on the right will be bigger, and that means any sample will tend to the mean of the distribution. Imagine a distribution with five -1’s and just one +5. Its mean is zero, and any random value is five times as likely to be negative as positive, but when you get a positive it’s five times bigger. Collect enough values and the average is likely to be close to 0.
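A quick numerical check (Python) of the five −1s and one +5 example:

import numpy as np

rng = np.random.default_rng(4)
population = np.array([-1, -1, -1, -1, -1, 5])   # strongly skewed, mean exactly 0

for n in (10, 100, 10_000):
    sample = rng.choice(population, size=n)
    print(f"n = {n:6d}  sample mean = {sample.mean():+.3f}")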

Jim Gorman
Reply to  Bellman
March 24, 2023 3:54 pm

Why do you think the sample distribution and CLT statistics override the statistical parameters of the original distribution?

The CLT is useful to obtain “statistics” that allow inferences to be made about a population. That is all it does. The sample distribution DOES NOT replace the original distributions. Nor does it replace the mean and variance of the population.

Give it up. Tn1900 outlines a procedure that meets all the requirements.

“””””The {Ei} capture three sources of uncertainty: natural variability of temperature from day to day, variability attributable to differences in the time of day when the thermometer was read, and the components of uncertainty associated with the calibration of the thermometer and with reading the scale inscribed on the thermometer.”””””

These are all that is needed to capture the main sources of uncertainty.

Bellman
Reply to  Jim Gorman
March 24, 2023 5:09 pm

Why do you think the sample distribution and CLT statistics override the statistical parameters of the original distribution.

Not sure what point you are trying to make. The original distribution remains unchanged, but the sampling distribution is not the same. That doesn’t mean they override the population, they are two different things.

The CLT is useful to obtain “statistics” that allow inferences to be made about a population.

The inference being how much confidence you have in the sample mean as representing the population mean.

That is all it does.

Yes, but that “all” is fundamental.

The sample distribution DOES NOT replace the original distributions. Nor does it replace the mean and variance of the population.

Of course it doesn’t. It doesn’t tie your bootlaces or wipe your nose. We could go on listing all the things the CLT does not do all night. I fail to see the relevance.

Jim Gorman
Reply to  Bellman
March 25, 2023 8:45 am

Fundamentally the CLT assumptions are not met by the method used to assess an average temperature. Tavg is made up of two correlated numbers from differing distributions. Therefore, any calculations made from them are based on correlated values, that is, they are not independent. They are not from identical distributions, consequently, any calculations based on them can not be considered to be IID.

Read this carefully from the GUM.

2.2.3 The formal definition of the term “uncertainty of measurement” developed for use in this Guide and in
the VIM [6] (VIM:1993, definition 3.9) is as follows:

uncertainty (of measurement)
parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand

NOTE 1 The parameter may be, for example, a standard deviation (or a given multiple of it), or the half-width of an interval having a stated level of confidence.

NOTE 2 Uncertainty of measurement comprises, in general, many components. Some of these components may be evaluated from the statistical distribution of the results of series of measurements and can be characterized by experimental standard deviations. The other components, which also can be characterized by standard deviations, are evaluated from assumed probability distributions based on experience or other information.

Note 2 above describes perfectly the steps from TN1900 and from examples in Possolo/Meija’s book.

Again, from the GUM

B.2.17
experimental standard deviation
for a series of n measurements of the same measurand, the quantity s(qk) characterizing the dispersion of the results and given by the formula:

s(q_k) = sqrt( Σ(q_i − q_bar)^2 / (n − 1) )

q_k being the result of the kth measurement and q_bar being the arithmetic mean of the n results considered

NOTE 1 Considering the series of n values as a sample of a distribution, q_bar is an unbiased estimate of the mean μ_q, and s^2(q_k) is an unbiased estimate of the variance σ^2 of that distribution.

NOTE 2 The expression s(q_k) / √n is an estimate of the standard deviation of the distribution of q_bar and is called the experimental standard deviation of the mean.

NOTE 3 “Experimental standard deviation of the mean” is sometimes incorrectly called standard error of the mean.

NOTE 4 Adapted from VIM:1993, definition 3.8.
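B.2.17 in code form, as a minimal sketch (Python; the repeated readings of one measurand are hypothetical):

import numpy as np

q = np.array([22.3, 22.1, 22.6, 22.4, 22.2, 22.5, 22.3])   # hypothetical series of n readings
n = len(q)

s_qk = np.sqrt(np.sum((q - q.mean())**2) / (n - 1))   # experimental standard deviation, B.2.17
s_mean = s_qk / np.sqrt(n)                            # experimental standard deviation of the mean, NOTE 2

print(f"q_bar = {q.mean():.3f}   s(q_k) = {s_qk:.3f}   s(q_k)/sqrt(n) = {s_mean:.3f}")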

Sections 4.2.2 and 4.2.3 provide the basis for TN1900.

The phrase from 4.2.2, “The individual observations qk differ in value because of random variations in the influence quantities” says it all. “Random variations in the influence quantities”. When the measurand is “monthly Tmax”, this fits perfectly, like it or not.

Again, from the GUM:

B.2.10
influence quantity
quantity that is not the measurand but that affects the result of the measurement

EXAMPLE 1 Temperature of a micrometer used to measure length.

EXAMPLE 2 Frequency in the measurement of the amplitude of an alternating electric potential difference.

EXAMPLE 3 Bilirubin concentration in the measurement of haemoglobin concentration in a sample of human blood
plasma.

[VIM:1993, definition 2.7]

Guide Comment: The definition of influence quantity is understood to include values associated with
measurement standards, reference materials, and reference data upon which the result of a measurement may depend, as well as phenomena such as short-term measuring instrument fluctuations and quantities such as ambient temperature, barometric pressure and humidity.

See that phrase, “ambient temperature”! Well guess what Tmax and Tmin are?

Bellman
Reply to  Jim Gorman
March 25, 2023 5:19 pm

Fundamentally the CLT assumptions are not met by the method used to assess an average temperature.

Yes, that’s what I was trying to explain when you were trying to estimate the uncertainty. Max and Min are not random samples from the daily distribution of temperatures. The CLT is about random samples. It’s just that simple. The point though is that not being a random sample max and min are a better estimate of the average temperature than two random temperatures would be.

Tavg is made up of two correlated numbers from differing distributions.

You want the two variables to be correlated. If they weren’t they would not be an indication of the average. You hope that a hot day will have hotter max and mins than a cold day.

Bellman
Reply to  Jim Gorman
March 25, 2023 5:28 pm

Note 2 above describes perfectly the steps from TN1900 and from examples in Possolo/Meija’s book.”

Yes. If you can consider a mean as a measurand, and each random variable taken from it a measurement, then you can describe it as that. It’s just that you were adamant a little while ago that means were not measurands and weren’t subject to measurement theory.

See that phrase, “ambient temperature”! Well guess what Tmax and Tmin are?

That’s not how I read it. Tmax isn’t an influence on Tmax, it is the measurement. Ambient temperature is an influence quantity when it influences the measurement – e.g. by changing the length of an object being measured.

Influence quantities on Tmax would be clouds, wind direction, air pressure etc.

Bellman
Reply to  Tim Gorman
March 24, 2023 10:44 am

NO! That would be the average uncertainty.

How can 0.18 be the average uncertainty when the premise was that each individual uncertainty was 1? If every uncertainty is 1 then by definition the average uncertainty is 1.

I assume you do have a point, it’s just that you can’t express yourself correctly. Just as continually making claims about Gaussian distributions when you actually mean symmetrical ones, I assume “average uncertainty” has some meaning to you that is different to what the words mean. Maybe if you took a deep breath and thought about what you want to say, rather than typing the same identical phrases over and over, we could get somewhere and stop having these unending discussions.

When you divide by n you are finding an average!

An average of what?

If you divide a measurement in inches by 12 to get the result in feet, are you averaging the inches? It might be in some literal sense what you are doing, but it’s much easier to think of it as a scaling. That’s really all you are doing when you divide the uncertainty of the sum by the number of elements. The average is 1/nth the size of the sum, so naturally the uncertainty of the average is 1/nth the uncertainty of the sum. You are not in any meaningful sense redistributing the uncertainty of the sum equally amongst all the measurements.

It doesn’t matter if you are dividing the total uncertainty by the number of members to get an average uncertainty

And again, that is not what you are doing. It’s the same problem with your choice of language. The total uncertainty is not the same as the uncertainty of the total.

Bellman
Reply to  Jim Gorman
March 23, 2023 11:25 am

And this is why providing references is pointless when people are so determined to misunderstand what they say.

The rule you picked is a special rule used when ONE measurement is used multiple times.

It is not. It’s specifically about multiplying one measurement by an exact value. Specific examples given are multiplying the diameter by π, and dividing the height of a stack of papers by the number of sheets. It is not about using the same measurement multiple times (though it could be used for that purpose).

Remember, multiplying by a number is basically adding, i.e., if B=3, then δx_total = 3•δx = δx + δx + δx! I don’t think this is really what you want.

Not if you’ve progressed beyond junior school maths. Multiplying by π is not adding something 3.14… times, and multiplying by 1/200 is not adding it 1/200th of a time.

Lastly, this was shown prior to his use of quadrature.

Indeed, but as Taylor notes, it’s irrelevant in this case. Irrelevant because you are only adding zero to a term. √(x^2 + 0^2) = √(x^2) = x.

If you had read just a little further in Taylor, you would have seen the exact same formula as used in Possolo/Meija when Taylor begins addressing quadrature.

That is not the formula. What is being used in Possolo is equation 10 from the GUM. You can derive all the rules used here from it – but as we’ve seen before that assumes you understand the calculus.

Now let’s look at an average.

Therefore we calculate using the rule of products/quotients in quadrature,
δq/q = sqrt[{δx1/x1 + 0}^2 + … + {δxn/xn + 0}^2]

No, no, no. You are making the same mistake Tim made: mixing up addition and division. Your equation would be correct if you were multiplying all the terms rather than adding them.

There are two different rules for propagating uncertainty. One using absolute uncertainties, the other relative uncertainties. You have to do each separately and convert as appropriate.

Work out the uncertainty of the sum using the rule for addition, then work out the uncertainty of that sum divided by n using the rules for division (or just use the special rule). Finally convert the relative uncertainty back into an absolute uncertainty.

It’s not difficult, and it’s painful to watch you two tie yourselves in knots trying to avoid the simple result that uncertainty of random errors will decrease with sample size not increase. You can approach it multiple ways and you always get the same result, whether you use these rules, or Equation 10, or the Central Limit Theorem. Even Kip Hansen explained why you have to divide the uncertainty of the sum by the number of elements when taking an average.

δq = q • sqrt[(δx1/x1)^2 + … + (δxn/xn)^2]
This is the same as Possolo/Meija shows.

The example you quoted from Possolo is not the uncertainty of an average, it’s the uncertainty of the volume of a cylinder. There is no addition, just multiplication.

Really, you can’t work these things out just by looking at something you think might be similar. Look at the actual equations. Apply them correctly. You will get the same result.

One must be willing to accept that a defined number has no uncertainty and does not CONTRIBUTE to the total uncertainty, EVER.

Then explain why the defined number appears in the Taylor special case.

Please don’t try to convince me that you can multiply a single value of uncertainty such as ±1° by 1/30 to obtain an overall uncertainty.

I won’t. You are completely un-convinceable. You have to be prepared to accept the possibility you might be wrong in order to learn anything.

You first need to multiply ±1 by 30 ( days in a month) and then divide by 30 to get an average. Guess what you will get?

You get ±1. A lot less than the ±6.7 you were claiming. But you are now ignoring the rules of quadrature, just as Kip does. ±1 is what you get if your measurement errors are completely dependent, say if, as at the start of your comment, you are only making one measurement and adding it to itself 30 times, or if all your errors were caused by a systematic error.

If there is some random uncertainty in the daily measurements, the uncertainty of the exact average of the 30 days will be less, due to random cancellation.
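The two-step propagation described here, as a minimal sketch (Python, assuming 30 independent daily standard uncertainties of ±1):

import numpy as np

u_daily = np.full(30, 1.0)            # 30 independent standard uncertainties of ±1

u_sum = np.sqrt(np.sum(u_daily**2))   # uncertainty of the sum, added in quadrature
u_avg = u_sum / 30                    # divide by the exact number n to get the average's uncertainty

print(f"u(sum) = {u_sum:.2f}   u(average) = {u_avg:.3f}")   # about 5.48 and 0.183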

Tim Gorman
Reply to  Bellman
March 23, 2023 2:46 pm

It is not. It’s specifically about multiplying one measurement by an exact value”

After two years you *STILL* don’t get it.

On Page 78 under “Principal Definitions and Equations of Chapter 3”

Measured quantity times an exact number:

If B is known exactly and q = Bx
then
ẟq = |B|ẟx

or equivalently ẟq/q = ẟx/x

The relative uncertainty for x is *NOT* (1/B)(ẟx/x), it is just ẟx/x

The relative uncertainties for q and x are the same. B doesn’t factor into the uncertainty at all.

Taylor even says that on Page 54 – “Because ẟB = 0, this implies that

ẟq/q = ẟx/x.

We’ve been over this and over this ad infinitum.

The relative uncertainty of an average is just like Taylor says:

ẟq_avg/q_avg = ẟx/x, NOT (1/n)(ẟx/x)

When you want to know ẟq_avg, then ẟq_avg = (q_avg) * (ẟx/x)

THIS IS THE AVERAGE UNCERTAINTY. It is *NOT* the uncertainty of the average.

When will you learn this?

Bellman
Reply to  Tim Gorman
March 23, 2023 3:50 pm

After two years you *STILL* don’t get it.

Get what? That you’re an idiot or a troll, or probably both? I think I get that. I don’t point out your mistakes because I think you will suddenly realize you are wrong – I know that will never happen. I do it for my own interest, and in the hope that someone reading these comments won’t be fooled.

Measured quantity times an exact number”

Which is exactly what I said. It does not mean it only applies to adding the same measurement multiple times, as Jim was saying. It means multiplying a measurement by an exact value. The clue is in the heading.

The relative uncertainty for x is *NOT* (1/B)(ẟx/x), it is just ẟx/x

How observant of you.

The relative uncertainties for q and x are the same.

Again, well observed. It’s almost as if it’s correct that

ẟq/q = ẟx/x.

Now all you have to do is figure out the consequence of the two being in the same proportion.

B doesn’t factor into the uncertainty at all.

Unless, I know this might seem crazy, but hear me out, what if q = Bx? What do you think the consequence for ẟq is if q is B times bigger than x, but ẟq has to be in the same ratio to q as ẟx is to x?

“We’ve been over this and over this ad infinitum.

Yet you still fail to see the evidence of your own eyes, even when Taylor spells it out for you in the special case. You even typed it yourself

ẟq = |B|ẟx

Yet you still insist that B doesn’t factor into the uncertainty of q.

When you want to know ẟq_avg the ẟq_avg = (q_avg) * (ẟx/x)

And what is q_avg equal to? Knowing that, can you simplify the RHS so it includes B?

THIS IS THE AVERAGE UNCERTAINTY.

It’s the “average” of ẟx, and ẟx is the uncertainty of the sum. The uncertainty of the sum is not the sum of the uncertainties. Hence you are wrong.

It is *NOT* the uncertainty of the average. “

Did you mean to type that, especially in bold? You are saying ẟq_avg is not the uncertainty of the average? Then what do you think q_avg is? You said ẟq_avg/q_avg is the relative uncertainty of the average. That would imply q_avg is the average, and ẟq_avg is the absolute uncertainty of the average. But now you insist it is *NOT* that.

When will you learn this?

You realise there’s a logical flaw whenever you say this. If what you are saying is wrong then why would I want to learn it?

Tim Gorman
Reply to  Bellman
March 23, 2023 4:03 pm

Unless, I know this might seem crazy, but hear me out, what if q = Bx? What do you think the consequence for ẟq is if q is B times bigger than x, but has to be in the same ratio to q as ẟx is to x?”

You didn’t even bother to go look at Taylor, did you? Go look at Page 78. I gave you exactly what it says.

FOR q = Bx the uncertainty is ”

—————————————-
ẟq = |B|ẟx
or equivalently ẟq/q = ẟx/x
—————————————-

Again, for the umpteenth time:

If you have 100 sheets of paper where the uncertainty of each sheet is ẟx, the total uncertainty of the stack is |B|ẟx.

That is ẟsheet1 + ẟsheet2 + … + ẟsheetB

It is a sum of all of the uncertainties.

It is *NOT* ẟsheet1/n + ẟsheet2/n + … + ẟsheetB/n

This equation is [ẟsheet1 + ẟsheet2 + … + ẟsheetB] /n

That is the AVERAGE UNCERTAINTY, not the uncertainty of the average.

Bellman
Reply to  Tim Gorman
March 23, 2023 4:30 pm

You keep quoting exactly what I’m saying and then claiming I’m not reading it.

If you have 100 sheets of paper where the uncertainty of each sheet is ẟx, the total uncertainty of the stack is |B|ẟx.

Correct, but only if you are just going to measure one sheet and multiply that measured value by 100 to get the height of the stack of 100 sheets. Why on earth you would do that I don’t know.

The better option is the one Taylor describes. Measure the stack of 100 sheets and then divide the height by 100 to get the thickness of a single sheet, then divide the uncertainty of the measurement of the stack by 100 to get the uncertainty of your calculation for the thickness of a single sheet. That’s what Taylor describes, but for some reason you can’t accept it as you only think multiplying is the same as repeated addition.

That is ẟsheet1 + ẟsheet2 + … + ẟsheetB

But that’s only true if you make just one measurement and add it to itself 100 times. Or multiply by 100. If you measure each sheet separately then you have 100 independent measurements and you can add the uncertainties in quadrature.

This equation is [ẟsheet1 + ẟsheet2 + … + ẟsheetB] /n

That is the AVERAGE UNCERTAINTY, not the uncertainty of the average.

Yes, that would be the average uncertainty. But that’s not the case if you add in quadrature.

You can’t have it both ways. You can keep saying I believe that errors cancel, and at the same time claim I always want to find the average uncertainty. If errors don’t cancel (i.e. they are all dependent) then the average uncertainty is the uncertainty of the average. If they do cancel the uncertainty of the average will be less than the average uncertainty.
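A numeric sketch (Python, hypothetical numbers) of the two approaches in this exchange: dividing one stack measurement by an exact 100, versus summing 100 independent sheet measurements in quadrature.

# Approach 1: measure the whole stack once and divide by the exact number of sheets
stack_height, u_stack = 50.0, 0.5            # mm, hypothetical measurement and uncertainty
t_sheet = stack_height / 100
u_sheet = u_stack / 100                      # the uncertainty is divided by the same exact value
print(f"per-sheet thickness = {t_sheet:.3f} ± {u_sheet:.4f} mm")

# Approach 2: measure each of the 100 sheets independently (each ±0.5 mm, hypothetical)
# and sum them; independent uncertainties add in quadrature
u_each = 0.5
u_stack_from_sheets = (100 ** 0.5) * u_each
print(f"stack height from 100 independent measurements: ± {u_stack_from_sheets:.1f} mm")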

Tim Gorman
Reply to  Bellman
March 24, 2023 4:46 am

Correct, but only if you are just going to measure one sheet and multiply that measured value by 100 to get the height of the stack of 100 sheets. Why on earth you would do that I don’t know.”

Are you physically incapable of reading? Taylor’s whole example was about measuring the entire stack and then dividing by 100 to find each individual uncertainty. NO AVERAGING. My guess is that you can’t even state off the top of your head the assumptions Taylor specifically states in the example!

It just gets more and more obvious over the past two years that you simply can’t relate to the physical real world in any way, shape, or form. Suppose you have a pallet of cement bricks you are going to stack to form a retaining wall. Are you going to stack them all up and try to measure the whole stack? Or are you going to measure 1 brick and multiply by the number of bricks? What is the total uncertainty you would expect to obtain from measuring just one and multiplying?

“The better option is the one Taylor describes. Measure the stack of 100 sheets and then divide the height by 100 to get the thickness of a single sheet, then divide the uncertainty of the measurement of the stack by 100 to get the uncertainty of your calculation for the thickness of a single sheet. “

See above! What if you have bricks instead of paper? Do you *ever* try to relate anything to the real world?

“But that’s only true if you make just one measurement and add it to itself 100 times. Or multiply by 100. If you measure each sheet separately then you have 100 independent measurements and you can add the uncertainties in quadrature.”

What if you have a stack of bricks? A stack of antenna mast sections? Can you relate in any way, shape, or form to these real world examples? Can you even imagine why you would want to know the uncertainty of each?

“Yes, that would be the average uncertainty. But that’s not the case if you add in quadrature.”

Your poor math skills are showing again!

sqrt[ u(x1)^2/n^2 + u(x2)^2/n^2 + … + u(xn)^2/n^2 ] =
sqrt{ [u(x1)^2 + u(x2)^2 + … + u(xn)^2] / n^2 } =
(1/n) * sqrt[ u(x1)^2 + u(x2)^2 + … + u(xn)^2 ]
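A quick numeric check of that algebra, with made-up uncertainties (the values 0.3, 0.5, 0.4, 0.6 are only for illustration):

import math

u = [0.3, 0.5, 0.4, 0.6]   # hypothetical individual uncertainties
n = len(u)

lhs = math.sqrt(sum((ui / n)**2 for ui in u))    # sqrt of the sum of (u_i/n)^2
rhs = math.sqrt(sum(ui**2 for ui in u)) / n      # (1/n) * RSS of the u_i
print(lhs, rhs)   # both ~0.232, as the algebra says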

You are *still* finding an average! It just depends if you are adding the uncertainties directly or in quadrature as to the form the average takes.

(RSS/n) is an average! It assigns the exact same uncertainty to each individual element regardless of what the actual uncertainty of the element is.

(Sum/n) is an average! It assigns the exact same uncertainty to each individual element regardless of what the actual uncertainty of the element is.

You are finding an AVERAGE UNCERTAINTY in each case! The issue is that the average uncertainty is *NOT* the uncertainty of the average, especially when you are measuring multiple things one time each using different measuring devices.

Write this out 1000 times: Average uncertainty is not the uncertainty of the average. It probably still won’t sink in but at least you will have tried.

[hint: in the equation (RSS/n) exactly what does RSS describe?]

Bellman
Reply to  Tim Gorman
March 24, 2023 6:20 am

Are you physically incapable of reading? Taylor’s whole example was about measuring the entire stack and then dividing by 100 to find each individual uncertainty.

I was replying to Jim’s example.

If you have 100 sheets of paper where the uncertainty of each sheet is ẟx, the total uncertainty of the stack is |B|ẟx.

Your bad faith quoting what I said out of context is very tedious.

And I said nothing about averaging in that example. It’s just an illustration of why, if you divide a measurement by an exact value you also divide the uncertainty by that exact value.

Tim Gorman
Reply to  Bellman
March 24, 2023 6:54 am

“Your bad faith quoting what I said out of context is very tedious.”

More whining!

You are saying that B can’t equal (1/n) as in finding an average?

Dividing a measurement, I.E. A STATED VALUE, by some number is *NOT* the same thing as dividing the uncertainty by that same number.

Again, you are trying to reduce uncertainty by finding an average uncertainty value. You can’t reduce uncertainty by averaging. Average uncertainty is *NOT* the uncertainty of the average!

If I use 10 boards, all of different lengths and measured using different tape measures, to build a beam spanning a basement, my uncertainty in how long that beam will actually be is *NOT* the average uncertainty. It is the total uncertainty as determined by direct addition or RSS of *all* the uncertainties of the boards.

Why is this so hard for you to understand?

Bellman
Reply to  Tim Gorman
March 24, 2023 7:31 am

More whining!

Welcome to Gorman-speak, where Gaussian distributions don’t have to be Gaussian, where a sum is the same as an average, and where pointing out when you’ve been lied about is whining.

You are saying that B can’t equal (1/n) as in finding an average?

Nope, I’m saying the exact opposite. That B can be 1/n, or any exact number.

Dividing a measurement, I.E. A STATED VALUE, by some number is *NOT* the same thing as dividing the uncertainty by that same number.

Indeed it’s not. You do however want to divide the uncertainty by the same value as you divide the measurement to get the correct uncertainty.

Again, you are trying to reduce uncertainty by finding an average uncertainty value.

Get another lie, this one has become threadbare.

“If I use 10 boards, all of different lengths and measured using different tape measures, to build a beam spanning a basement, my uncertainty in how long that beam will actually be is *NOT* the average uncertainty.”

You are not averaging anything there. You don’t want to know the uncertainty of the average, you want to know the uncertainty of the sum. The fact even after all these years you still can’t tell the difference is why I tend to get a little irked when you insist on denigrating my “real world” skills.

Why is this so hard for you to understand?

The hard part is why you think this has any relevance to the discussion.

Jim Gorman
Reply to  Bellman
March 24, 2023 8:13 am

You are a troll. You will get no more from me unless it deals with discussing TN1900.

Tim Gorman
Reply to  Bellman
March 25, 2023 3:58 am

“Welcome to Gorman-speak, where Gaussian distributions don’t have to be Gaussian, where a sum is the same as an average, and where pointing out when you’ve been lied about is whining.”

More whining. All you are doing is avoiding the real issue. Not all distributions are symmetrical where measurement uncertainty cancels. You simply can’t accept that so all you can do is whine.

“Indeed it’s not. You do however want to divide the uncertainty by the same value as you divide the measurement to get the correct uncertainty.”

Only if you are wanting to find the average uncertainty instead of the uncertainty of the average.

“Get another lie, this one has become threadbare.”

(a + b + c … + z) / 26 *IS* an average no matter how much you want to deny it. It doesn’t matter if a, b, c, etc are stated values or uncertainty intervals. When you divide by the number of members in the data set you are finding an average value.

The average uncertainty is *NOT* the uncertainty of the average.

You still can’t accept that simple fact, can you?

“You are not averaging anything there. You don’t want to know the uncertainty of the average, you want to know the uncertainty of the sum. The fact even after all these years you still can’t tell the difference is why I tend to get a little irked when you insist on denigrating my “real world” skills.”

Your “uncertainty of the average” IS how close you are to the average, NOT how uncertain the value of the average is!

How close you are to the population mean is useless if the population mean is inaccurate!

How do you determine how inaccurate the population mean might be? By finding the average uncertainty of the members in the population? Or by propagating the uncertainty of the members to the population average?

If *every* member in the population is off by 2 then how far off will the population mean be? You can calculate the mean out to however many digits you want, just state how uncertain that value will be! Can you?

I know this example has been given to you before but, as usual, you ignored it.

Population stated values = 2, 4, 6, 8, 10, 12 ==> Avg = 7
Systematic bias of +2 = 4, 6, 8, 10, 12, 14 ==> Avg = 9
Systematic bias of -2 = 0, 2, 4, 6, 8, 10 ==> Avg = 5

You can calculate the average of the stated values EXACTLY, your uncertainty of the mean is ZERO, i.e. how close you are to the population mean.

BUT, when you consider the uncertainty of the members the value of the mean is 7 +/- 2!
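The same example, worked through as a short sketch (the +/-2 bias is the one stated above):

stated = [2, 4, 6, 8, 10, 12]
n = len(stated)

mean_stated = sum(stated) / n                   # 7
mean_plus = sum(v + 2 for v in stated) / n      # 9: every member off by +2
mean_minus = sum(v - 2 for v in stated) / n     # 5: every member off by -2

print(mean_stated, mean_plus, mean_minus)   # a common +/-2 offset shifts the mean by +/-2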

You want to drive how close you are to the population mean, I.E. the standard error of the mean, to zero while ignoring the uncertainty of the data set members. This is why you ALWAYS circle back to assuming the measurement uncertainty always cancels – so the standard error of the mean can be used to define the uncertainty of the mean. You always deny this but you do it EVERY SINGLE TIME.

THE STANDARD ERROR OF THE MEAN IS NOT THE UNCERTAINTY OF THE MEAN!!

Until you can get this simple fact of metrology into your head there is simply no use in discussing this with you. I am missing out on money making time trying to explain it to you.

You just continue living in your alternate reality where all measurement uncertainty cancels. I’ll go live in the real world.

Bellman
Reply to  Tim Gorman
March 25, 2023 6:10 pm

I am missing out on money making time trying to explain it to you.

To use your favorite childish insult, stop whining. Nobody is forcing you to write these interminable repetitive tracts.

If you want to avoid wasting your time you could try engaging with what I actually say, rather than what you want to believe I say.

Not all distributions are symmetrical where measurement uncertainty cancels. You simply can’t accept that so all you can do is whine.

I fully accept that not all distributions are symmetrical. But you still seem to think that it’s necessary for a distribution to be symmetrical for errors to cancel. That’s wrong, as the CLT proves. The sample mean of just about any distribution will tend to the mean of that distribution as the sample size grows.

As always you are vague about what distribution you are talking about. If you are talking about measurement uncertainty, then I assume you mean the distribution of measurement errors. But as you also insist that measurement uncertainty doesn’t have a probability distribution, maybe you mean something else.

Assuming you do mean the distribution of errors, then as I’ve explained before, what matters isn’t the symmetry of the distribution, it’s the mean. If the mean is zero, then with an infinite number of measurements all the errors will cancel. If it isn’t zero then you have a systematic error, and an infinite number of measurements will tend to that error.
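A small simulation of that claim, using a deliberately skewed (non-Gaussian, non-symmetric) error distribution; the exponential shape and the +0.3 offset are assumptions chosen only for illustration:

import random

random.seed(1)

def skewed_error():
    # Exponential errors shifted to have mean zero: heavily skewed, but zero-mean.
    return random.expovariate(1.0) - 1.0

n = 100_000
mean_err = sum(skewed_error() for _ in range(n)) / n
print(mean_err)   # close to 0 despite the asymmetry

# Add a systematic offset of +0.3 and the average of the errors tends to +0.3 instead.
mean_biased = sum(skewed_error() + 0.3 for _ in range(n)) / n
print(mean_biased)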

(a + b + c … + z) / 26 *IS* an average no matter how much you want to deny it.

I don’t deny it. What I deny is that √(a² + b² + c² … + z²) / 26 is an average. It’s certainly not an average of the 26 values.

The average uncertainty is *NOT* the uncertainty of the average.
You still can’t accept that simple fact, can you?

Apart from the fact I keep telling you they are not the same thing. Is that what you mean by not accepting it?

Your “uncertainty of the average” IS how close you are to the average, NOT how uncertain the value of the average is!

But you don’t know how close you are to the average, that’s why it’s uncertain. What uncertainty, e.g. a confidence interval does is give you an indication of how close you are likely to be.

How close you are to the population mean is useless if the population mean is inaccurate!

I can guess what you are trying to say, but as written this is nonsense. The population mean is the thing you are measuring, it has no inaccuracy. Your measurements of the mean might be inaccurate, but it isn’t the population that is wrong, it’s your measurements.

By finding the average uncertainty of the members in the population?

Again, assuming you mean the uncertainty of the measurements, not of the members, obviously you don’t do this. The assumption is that errors will cancel, the average uncertainty is assuming they don’t. The only time average uncertainty makes sense is if you assume the only uncertainty is caused by a systematic error.

Or by propagating the uncertainty of the members to the population average?

You can do that, and that’s what we’ve been arguing for the past few years. But that is only going to give you the measurement uncertainty. The, usually, much bigger source of uncertainty is that from sampling.

If *every* member in the population is off by 2 then how far off will the population mean be?

Again, assuming you mean every measurement, then you have a systematic error and the measurement uncertainty of the mean will be off by 2.

I know this example has been given to you before but, as usual, you ignored it.

Either that, or you forgot my response. I’ve told you on many occasions that systematic errors are not reduced by averaging. That’s pretty much the definition of a systematic error.

This is why you ALWAYS circle back to assuming the measurement uncertainty always cancels – so the standard error of the mean can be used to define the uncertainty of the mean. You always deny this but you do it EVERY SINGLE TIME.”

Apart from all the times when we’ve discussed systematic errors, along with systematic biases in the sampling.

Nobody thinks that the standard error of the mean is the only factor in a real world sampling. But you keep ignoring the fact that when you do have random errors, they will get cancelled. And you keep bringing up systematic errors to distract from your own mistakes.

All of this started with you saying that if you had 100 temperatures, each with a random uncertainty of ±0.5°C, then the uncertainty of the average would be ±5°C. As far as I can tell you still believe that, as you still keep insisting you don’t divide the uncertainty of the sum by the number of values.

Yet, even if the 0.5 uncertainties were nothing but systematic errors, the average would still only have an uncertainty of ±0.5°C. Moreover, your argument was that the uncertainty of the sum would be ±5°C, which only makes sense if you were assuming that there was no systematic error.
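The numbers in dispute here, laid out as a short sketch; it assumes the 0.5°C uncertainties are either purely random and independent or purely systematic, the two limiting cases discussed above:

import math

n, u = 100, 0.5   # 100 readings, each +/-0.5 C

u_sum = math.sqrt(n) * u        # 5.0 C for the SUM, if the errors are random and independent
u_mean = u / math.sqrt(n)       # 0.05 C for the AVERAGE, same assumption
u_mean_systematic = u           # 0.5 C for the average, if the 0.5 C is entirely systematic

print(u_sum, u_mean, u_mean_systematic)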

Trying to point out why your maths is wrong, under your own assumptions, does not mean that I believe there are no complicating factors, such as systematic errors; it is simply an attempt to explain why your maths is wrong.

Bellman
Reply to  Tim Gorman
March 24, 2023 1:52 pm

What if you have bricks instead of paper?

What if your arms were made of cheese? The example we were discussing was paper not bricks. Paper exists in the real world just as much as bricks do.

The maths is the same regardless of whether you are measuring bricks or paper, it’s just the practicalities that change. The example of a stack of paper is good because it’s very difficult to measure a single sheet of paper without expensive equipment. A stack of 100 bricks isn’t such a good example as it’s more difficult to measure the stack and easier to measure individual bricks. But regardless, if you wanted an accurate measurement for an individual brick, stacking a number of them and dividing the height by the number of bricks will give you a more accurate measurement than just measuring one brick, assuming your measuring device has the same accuracy.

In contrast measuring just one brick and multiplying it by the number of bricks to get the height of the stack will be less accurate as any error in your one measurement will be multiplied by the number of bricks. If you measure it with ±5mm uncertainty, then the estimate of a stack of 100 bricks will have an uncertainty of ±500mm, (± 0.5m).
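The two strategies above as a short sketch, using the hypothetical +/-5 mm device from the comment:

n, u_device = 100, 5   # 100 bricks, +/-5 mm measuring device

# (a) measure one brick and multiply by 100: the single error is multiplied too
u_stack_from_one = n * u_device        # +/-500 mm on the stack height

# (b) measure the whole stack once and divide by 100 to get one brick
u_brick_from_stack = u_device / n      # +/-0.05 mm on a single brick

print(u_stack_from_one, u_brick_from_stack)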

Your poor math skills are showing again!

For someone living in a glasshouse, you sure like to throw a lot of stones.

(1/n) * sqrt[ u(x1)^2 + u(x2)^2 + … + u(xn)^2]

You are *still* finding an average!

An average of what? If you had 50 6′ boards and 50 8′ boards, would you regard that formula as giving you the average length of the board? Hint, this would give you an average of about 0.7′.
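Checking that hint with a two-line sketch (the 50 six-foot and 50 eight-foot boards are the ones named above):

import math

boards = [6.0] * 50 + [8.0] * 50
rss_over_n = math.sqrt(sum(b**2 for b in boards)) / len(boards)
print(rss_over_n)   # ~0.707 ft, nowhere near the 7 ft average length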

(RSS/n) is an average! It assigns the exact same uncertainty to each individual element regardless of what the actual uncertainty of the element is.

You already know the individual uncertainties, that’s how you calculated the RSS. In what world does averaging result in each actual element having on average a much smaller uncertainty? In what way does the uncertainty of the average assign the same value to each element? It’s the uncertainty of the combined average, nothing to do with each element.

Write this out 1000 times: Average uncertainty is not the uncertainty of the average.

Write this out three times – “I agree”.

Bellman
Reply to  Jim Gorman
March 22, 2023 11:40 am

Since you insist that an average is a functional relationship

That’s only the case where you are talking about an exact average. I.e. you have 30 different values and you want to know what their average is. More usually you are taking 30 values as a random sample from a population and that is not a functional relationship. A different random sample will give you a different average.

Whether you consider the average temperature for January 1953 to be an exact average of those 31 days, or a random sample from all possible 31 daily values, is an interesting question. It’s the question of why TN1900 uses the random variation of the daily maximums as a random sample and ignores any measurement uncertainty.

Jim Gorman
Reply to  Bellman
March 24, 2023 5:51 am

Why do you think the variance of the averaged data is important? I have tried to emphasize it, yet no one, including you, addresses that fact. The variance of an average is needed to define the shape of a normal or Gaussian distribution.

That is why the method in NIST TN1900 is so important. It provides a method that follows the GUM and defines the appropriate interval in which the mean may lie.

If you want to divide measurement uncertainty by n^2 so you get an average error in the hundredths or thousandths decimal place, go right ahead; it just makes the combined measurement uncertainty less and less important when compared to the expanded uncertainty calculated from the actual data.

I for one will be using the NIST recommended procedure. If you want to argue about their recommendation I suggest you start a dialog with them.

Bellman
Reply to  Jim Gorman
March 24, 2023 6:53 am

Why do you think the variance of the averaged data is important?

Which variance are you talking about? It’s all important, but it’s difficult to try to explain to you why when you keep twisting everything. Are you talking about the variance of the population, or the variance of the sampling distribution?

You need the former to estimate the latter. The latter is important as it indicates the uncertainty of the average. (Though really it’s the standard deviation, not the variance, that is used.)

That is why the method in NIST TN1900 is so important.

You mean the method you think is wrong, and are only using to “hoist others on their own petard”?

You still don’t seem to get that the method used in Example 2 of TN1900 is the method I’ve been (in different contexts) trying to explain to you. The one you claim is wrong. The one which involves dividing the standard deviation of the measurements by the square root of the sample size.

You want to divide measurement uncertainty by n^2

No I do not. I want, under the right circumstances, to divide it by √n. I assume you understand the difference between the square root and the square.

so you get an average error in the hundredths or thousandths decimal place

No. I want to do it because it’s the correct way. And you are very unlikely to get uncertainties in the thousandths place, unless you have very precise measurements to start with. Averaging multiple measurement uncertainties will only get you so far, as the precision only increases with the square root of the sample size, and the smaller it gets the more effect any small systematic error will have. Bevington, I think, describes this well.

it just makes the combined measurement uncertainty less and less important when compared to the expanded uncertainty calculated from the actual data.

What are you on about now? The expanded uncertainty is just the combined standard uncertainty with a coverage factor. The smaller the standard uncertainty, the smaller the expanded uncertainty.

I for one will be using the NIST recommended procedure.

Try not to hoist yourself on your own petard.

Jim Gorman
Reply to  Bellman
March 24, 2023 8:09 am

So you agree, TN1900 is a correct way to find u_c(y).

My problem with TN1900 is not its method, but the assumption that measurement uncertainty is negligible. However, since you insist measurement uncertainty is so small as to allow u_c(y) to support averages with 2 and 3 decimal places, your only alternative is to use experimental uncertainty as recommended by both TN1900 and the GUM.

As stated in TN1900 which references the GUM 4.2.3 where:

σ^2(q_bar) = σ^2 / n and

s^2(q_bar) = s^2(q_k) / n, and

s(q_bar) = s(q_k) / √n

Then s(q_bar) is expanded by a coverage factor to achieve a confidence interval.
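A minimal sketch of that calculation, using made-up daily maximum temperatures rather than the NIST data; the coverage factor shown is the Student-t value for 14 degrees of freedom at roughly 95 %:

import math
import statistics

# Hypothetical daily maximums (deg C), not the TN1900 Example 2 data.
t_max = [24.25, 25.50, 23.75, 26.00, 25.25, 24.75, 26.50, 25.00,
         24.50, 25.75, 23.50, 26.25, 25.50, 24.00, 25.25]

n = len(t_max)
q_bar = statistics.mean(t_max)       # the monthly average
s = statistics.stdev(t_max)          # s(q_k), the day-to-day scatter
sem = s / math.sqrt(n)               # s(q_bar) = s(q_k) / sqrt(n)
k = 2.145                            # Student-t, 14 degrees of freedom, ~95 % coverage

print(f"{q_bar:.2f} +/- {k * sem:.2f} C (expanded uncertainty)")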

It appears to calculate an expanded combined uncertainty that subsumes any of your estimates and includes:

“The {Ei} capture three sources of uncertainty: natural variability of temperature from day to day, variability attributable to differences in the time of day when the thermometer was read, and the components of uncertainty associated with the calibration of the thermometer and with reading the scale inscribed on the thermometer.”

You aren’t going to like the resulting uncertainty intervals, which show that anomalies, as currently calculated, are meaningless.

Bellman
Reply to  Jim Gorman
March 24, 2023 9:06 am

So you agree, TN1900 is a correct way to find u_c(y).

Up to a point. For a start they don’t say it’s the correct way. You can always find different and possibly better ways of analyzing any data.

My main issue is they don’t really define what they want to measure, and what the uncertainty represents. Are you interested in the actual monthly average, and how certain you are about it, or in a more abstract conceptual average, which might have been different if the temperatures had been different? The approach in the example is the latter: each day is treated like a random temperature around a supposed underlying average, and the uncertainty in the example reflects that. I don’t think this is wrong, and it’s the correct way to handle questions such as whether this month was significantly warmer or colder than the previous one. But if you are only interested in what the actual temperature was, then it would make more sense to just propagate the measurement uncertainty.

The problem with the example is they only look at a month with a large proportion of its days missing. It’s not clear if the SEM used is meant to reflect the uncertainty of the missing days. The problem would be more obvious if they had all 31 days.

But in general, as an illustration that you can use the standard error of the mean to estimate the uncertainty of a mean it’s fine.

My problem with TN1900, is not its method, but the assumption that measurement uncertainty is negligible.

I don’t think it does that. What they say is that calibration issues are assumed to be negligible. Random uncertainty caused by reading the instrument (and I assume that includes rounding to the nearest 0.25°C) will still be present in each day’s error. But it will be negligible simply because it is small compared to the range of errors caused by natural variability. When you look at the standard deviation, that can only be the result of all errors. You can’t pretend the measurement errors are not part of the total error.

However, since you insist measurement uncertainty is so small as to allow u_c(y) to support averages with 2 and 3 decimal places!

I don’t claim that, or at least I don’t think the size of the measurement uncertainty is the reason for any specific number of decimal places. I agree with the standard practice of calculating the uncertainty to a reasonable number of places and reporting the result to the same number of decimal places. (I also think it’s better to include too many than too few digits, especially if the result will be used in other calculations.) But the size of the uncertainty has much more to do with the sampling errors than the measurement errors.

As stated in TN1900 which references the GUM 4.2.3 where

And I still don’t understand why you are so against dividing standard deviation by the root of sample size, yet say it’s the correct method here.

It appears to calculate an expanded combined uncertainty that subsumes any of your estimates

I’m not sure what you mean by my estimates. I’ve made no attempt to estimate any temperature uncertainty. I mainly just point out why your methods are wrong.

Again, given the concept of looking at natural daily variability as a random error, then I don’t think the estimate they give is unreasonable. But that’s the estimate for one month (with a lot of missing data) at one station.

You aren’t going to like the resulting uncertainty intervals that show anomalies as being calculated are meaningless.

How do you come to the conclusion that this result shows all anomalies are meaningless?

Jim Gorman
Reply to  Bellman
March 24, 2023 4:30 pm

If you have a problem with a NIST recommendation I suggest you do a better job of describing the problems that you have and discuss them with NIST.

I can assure you that what you are currently spouting will not get you very far with them. They will want both math and evidence to support your claims of their Technical Note being incorrect.

Bellman
Reply to  Jim Gorman
March 24, 2023 5:01 pm

Why don’t you ask Tim to do that? He’s the one that says he and you have a problem with that example to trick “alarmists” into hoisting themselves with their own petards.

As I said, I don’t have a problem with the method, just an observation about what is being treated as uncertainty in this instance.

Tim Gorman
Reply to  Bellman
March 24, 2023 9:12 am

“Are you talking about the variance of the population, or the variance of the sampling distribution?”

It shouldn’t matter. The samples should have the same distribution as the population. If they don’t then somewhere the assumption of iid is being violated. If the samples are not iid then you can get spurious trends.

If you don’t have iid then your precious CLT won’t work correctly.

“The latter is important as it indicates the uncertainty of the average.”

This is really a mis-naming of what it is. It is *not* the uncertainty of the average, it is the standard deviation of the sample means. It is a measure of how closely you are estimating the population mean. It is *NOT* finding the uncertainty of the mean.

I know “uncertainty of the mean” is common parlance in statistics but it is as misleading as all get out. It is more correctly called the Standard Deviation of the Sample Means. Use of that term would eliminate a lot of confusion – confusion which you display in extreme.

Uncertainty of the mean is the propagated uncertainty of the measurements making up the data set. It’s why you can have a very precisely calculated mean that is very close to the population mean while being inaccurate as all get out!

Again, if you have the entire population, with each member given as “stated value +/- uncertainty”, the uncertainty of the mean is *NOT* zero. The mean is just the population mean – period: the average of the stated values. It is the propagation of the “uncertainty” portion of the members that determines the uncertainty of the mean.

You need to unlearn your use of the term “uncertainty of the mean” and start using the “standard deviation of the sample means”.
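The two quantities being argued about, side by side in a short sketch with made-up numbers (six stated values, each carrying a hypothetical +/-0.5 measurement uncertainty); the sketch computes both and takes no side on which one to report:

import math
import statistics

values = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2]   # hypothetical stated values
u_meas = 0.5                                    # hypothetical +/-0.5 on each member
n = len(values)

sem = statistics.stdev(values) / math.sqrt(n)   # standard deviation of the sample means
u_random = u_meas / math.sqrt(n)                # propagated, if the errors are independent and random
u_systematic = u_meas                           # propagated, if the 0.5 is a common bias

print(sem, u_random, u_systematic)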

“The one which involves dividing the standard deviation of the measurements by the square root of the sample size.”

Why do you NEVER state the assumptions Possolo uses in TN1900? You *ONLY* do this when you can assume that all systematic bias is zero and the uncertainty is random, symmetrical, and cancels. Possolo specifically states these assumptions in his example. This allows him to assume that all measurements are of the same thing taken by the same device and the variation of the stated values determines the uncertainty of the mean.

It is the very same set of assumptions you always make even though you deny it vociferously. All measurement uncertainty for you cancels and only the stated values are used for analysis. In this case the standard deviation of the sample means is considered to be the uncertainty of the mean. But you *NEVER* justify those assumptions. You won’t even state them explicitly because you know it would invalidate your analysis.

Bellman
Reply to  Tim Gorman
March 24, 2023 10:22 am

It shouldn’t matter. The samples should have the same distribution as the population.

I asked about the sampling distribution. Not the distribution of a sample.

Bellman
Reply to  Jim Gorman
March 22, 2023 11:42 am

“SHOW A REFERENCE.”

Screenshot 2021-09-18 221252.png
Bellman
Reply to  Tim Gorman
March 19, 2023 3:12 pm

EVERY TIME you assume that all measurement uncertainty cancels you are assuming that all distributions are Gaussian.

Your need to justify all these lies to yourself is quite telling.

1) I do not assume that all measurement uncertainties cancel. I assume that there is some cancellation, because that’s what you would expect with random errors. But not that random errors ever completely cancel. That’s the whole point of the sum of uncertainties propagating as sqrt(N) * uncertainty (see the sketch below). There’s some cancellation, so they don’t grow linearly with N, but they still grow, just at a slower rate. And when you take the average this becomes uncertainty / sqrt(N), which means the uncertainty reduces due to cancellations, but never completely.

2) And how many more times does this have to be explained? The cancelling of errors does not depend in any way on the distribution being Gaussian. There are times when it helps to know or assume a distribution is Gaussian in order to simplify the maths. But in any case random errors, whatever the distribution, tend to cancel.
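A short simulation of the scaling claimed in point 1, assuming uniform +/-0.5 errors purely for illustration: the spread of a sum of N independent random errors grows like sqrt(N), and the spread of the average shrinks like 1/sqrt(N).

import math
import random
import statistics

random.seed(2)
N, trials = 100, 20_000

sums = [sum(random.uniform(-0.5, 0.5) for _ in range(N)) for _ in range(trials)]
sigma = 0.5 / math.sqrt(3)   # standard deviation of a single uniform(-0.5, 0.5) error

print(statistics.pstdev(sums), math.sqrt(N) * sigma)                    # both ~2.89
print(statistics.pstdev([s / N for s in sums]), sigma / math.sqrt(N))   # both ~0.029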

See! You are doing it again!

What I was “doing again” was responding to your claim that

Multiple single measurements of different things is almost guaranteed to not be Gaussian and, after two years, YOU STILL DON’T BELIEVE THAT!

It’s an obvious point, but if you take multiple values from a distribution, then the values will tend to have the same distribution as the parent distribution. Hence my point that if the population is Gaussian, the multiple measurements you take will also be Gaussian.

Maybe you mean the measurement errors will not be Gaussian, but that again would depend on the distribution of your measurement uncertainties. You really need to take a breath and explain what exactly you mean in these exercises, rather than just ranting and jumping on that shift key.

Assuming that measurements of different things give you a Gaussian distribution.

If the original distribution is Gaussian. You missed that important bit.

If you measure one horse out of the multiplicity of horse species you will *NOT* get a Gaussian distribution.

Of course not. I’ll just have one value. But you specifically said you were measuring different things, not one thing.

Even the daily temperature profile is not Gaussian or symmetrical.

As I keep pointing out to you.

Taylor, Bevington, and Possolo *all* say that.

Context please.

None of their books, notes, or papers show how to handle measurement uncertainty for a skewed distribution using just the average value of the measurements as a “true value”.

Again, you need to be clearer about what you are trying to do here. Is it the measurements that have a skewed distribution or the population?

*YOU* are the only one that is adamant about assuming that all measurements of different things are Gaussian and the average is a “true value”.

As in I’m adamant that you do not have to assume all measurements of different things are Gaussian.

When I say the average is the “true value” it’s in the context of saying that if you want to know what the average is, then that average is the true value you are trying to measure. Not that your sample or any specific average you obtain is the true value.

John Dilks
Reply to  Bellman
March 20, 2023 8:21 am

A drop of 2C is much more dangerous than a rise of 2C. We can adapt to either, but I would rather adapt to the rise.

PCman999
Reply to  Tim Gorman
March 14, 2023 5:17 pm

His post is fine – it further ridicules the idea that an extra 1.5°C by the end of the century will cause the end of the world.

Imho, 1.5C is just a good start!

Even the IPCC’s own scientists have said that any temp increase will be mostly in the poles and hardly anything at the equator – and other commentators have added that it’s the nightly lows that will be affected the most, not the daily highs that most alarmists cry about.

KevinM
Reply to  PCman999
March 15, 2023 4:01 pm

PCman999 makes probably the most important comment in the thread here – only a little warmer, and only where/when a little warmer might feel good.

Reply to  Bellman
March 13, 2023 4:48 pm

Office temperature designated by OSHA has a 5°C comfort range — 68°F to 77°F. That’s 11°F. Want to use that range?
That’s 20 – 25°C.

indoor comfort with low humidity

you are misusing the OSHA data

Gilbert K. Arnold
Reply to  Steven Mosher
March 14, 2023 7:44 pm

Steven: I suggest you check your math… 77 minus 68 does not equal 11.

Editor
Reply to  Bellman
March 14, 2023 1:03 pm

bellman ==> Yes, right on….the Earth needs to warm up a bit more — it still doesn’t even quite meet the NASA standard for average global temperature for an Earth-like planet. (15-16°C)

Last edited 15 days ago by Kip Hansen
Bellman
Reply to  Kip Hansen
March 14, 2023 1:53 pm

Do you have a source for that definition? It seems odd that NASA are saying the Earth isn’t an Earth like planet.

Editor
Reply to  Bellman
March 15, 2023 7:16 am

Bellman ==> NASA generally states two temperatures, 15°C and on occasion 16°C, depending on which NASA page you are looking at. William Borucki of NASA’s Ames Research Center, the principal investigator of NASA’s Kepler space telescope, once used 21°C as “a very nice temperature”.

The Earth, according to the GAST believers, is running about 14.85°C, depending on the day and the amount of fiddling. In any case, it is still a little short of 15°C, and over 1°C from 16°C.

This wiki page references NASA’s source for the 15°C and you can find many more.

Or you can read my essay.

Bellman
Reply to  Kip Hansen
March 15, 2023 7:46 am

NASA generally states two temperatures, 15°C and on occasion, 16°C, depending on which NASA page you are looking at.

Still looking for a link to any of these pages. I can’t find any reference on any of the links you do provide that says that anything below 15°C is considered to not be earth-like.

Your own essay only says

One of those specifications for an Earth-like planet is an appropriate average surface temperature, usually said to be 15 °C.

And elsewhere you simply say that 15°C is the ideal temperature. That’s very different from setting a minimum requirement.

Again, it seems nonsense to suggest that Earth-like doesn’t apply to the Earth, when that’s the very definition of Earth-like. It would be like arguing that Belgium is too small to be considered about the size of Belgium.

dbakerber
Reply to  Bellman
March 14, 2023 1:45 pm

I suggest you check your math. 77-68=?

Bellman
Reply to  dbakerber
March 14, 2023 2:16 pm

The difference between Fahrenheit and Celsius.

Gilbert K. Arnold
Reply to  Bellman
March 14, 2023 7:39 pm

Bellman… the last time I took arithmetic, 77 minus 68 = 9… 5°C = 9°F; the difference between °C and °F has nothing to do with it.

Bellman
Reply to  Gilbert K. Arnold
March 14, 2023 7:56 pm

77°F = 25°C
68°F = 20°C

The range is 5°C, just as Kip Hansen said at the start.

But I do see that Kip Hansen quoted a range of 11°F, which is wrong. You really need to complain to him, rather than me. I’ve only been using Celsius.
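For reference, a two-line check of the conversions being argued about (nothing here beyond the standard °F to °C formula):

def f_to_c(f):
    return (f - 32) * 5 / 9

print(f_to_c(68), f_to_c(77))              # 20.0 C and 25.0 C
print(77 - 68, f_to_c(77) - f_to_c(68))    # a 9 F span is a 5 C span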

Reply to  Bellman
March 14, 2023 7:20 pm

Office temperature designated by OSHA has a 5°C comfort range — 68°F to 77°F. That’s 11°F. Want to use that range?

no.

indoor comfort at low humidity has nothing to do with climate change

to be pragmatic, try outside temperature

please https://bfi.uchicago.edu/wp-content/uploads/UCH-110116_IndianManufacturingResearchSummary_v04.pdf

Reply to  Kip Hansen
March 13, 2023 10:13 am

An ordinary red line mercury thermometer is what almost everyone uses to find out the temperature at home. Put 142 of them in a row and you have AW’s new chart.

Simple and easy to understand.

Small changes in temperature look small on the chart.

Temperature changes likely to be too small to be felt at home or outdoors are barely visible.

The C. or F. degree absolute temperature charts are honest charts that do not mislead people.
The K. degree absolute temperature chart would mislead people because it looks like a straight line.

All three (C. F. and K. degree) absolute temperature charts are at the link below:

Honest Climate Science and Energy Blog: Global warming and CO2 levels with honest charts

That leftists HATE these charts is further evidence they are great.

Last edited 17 days ago by Richard Greene
Reply to  Kip Hansen
March 13, 2023 2:53 pm

pragmatic for growing crops is different than pragmatic for office workers

AGAIN various approaches have been MEASURED!!

http://euclid.psych.yorku.ca/www/psy6135/papers/ClevelandMcGill1984.pdf

example. show people both charts and ask them how much the temperature has changed
absolute charts fail utterly.

next up what does ANSI say

Nick Stokes
Reply to  Kip Hansen
March 13, 2023 5:30 pm

Kip,
Want to use that range?
No. You are persistently missing the point. The global average is a calculated indicator. It can’t be compared with what humans experience; no-one experiences it. The question is, what does it indicate.

In the last glaciation, GAT went down by about 11°F. That is actually the range you suggest. But of course it had far more effect than we would find if the temperature changed by that amount one day.

The GAT is an average over time and space, so is very stable. It takes a lot to move it, and so if it does move, it means a lot has happened. Even just averaging over time has the same stabilising effect. NYC has an average temperature of 55.8°F. Atlanta has an average of 63.6°F. The difference of 7.8°F is well within your scale. But in terms of human perception alone, those are very different climates. You can’t grow peanuts in NY.

Tim Gorman
Reply to  Nick Stokes
March 14, 2023 5:56 am

“No. You are persistently missing the point. The global average is a calculated indicator. It can’t be compared with what humans experience; no-one experiences it. The question is, what does it indicate.”

This is like saying average miles/gallon in your car should be measured in hundredths. It takes a lot to change your average mpg so you should be very interested in that change in the hundredths digit.

No one cares about that increment in mpg. It’s going to be what it is and unless the change is significant enough to impact your driving it isn’t important, at least to most people.

If no one experiences it then does it exist? If a tree falls in the forest does it make a sound? (hint: what is “sound”)

TimTheToolMan
Reply to  Nick Stokes
March 15, 2023 3:46 am

Nick writes “The global average is a calculated indicator. It can’t be compared with what humans experience; no-one experiences it.”

OK, so do the graph for a local region. It’s going to look quite similar for many regions, and now the graph does have meaning and people will directly experience the change.

TimTheToolMan
Reply to  Nick Stokes
March 15, 2023 12:21 am

Nick writes “What counts is what fluctuations signify. “

When about half the fluctuations come from the TOBS adjustment, you know you’re on shaky ground wrt imminent catastrophe prediction.

Nick Stokes
Reply to  TimTheToolMan
March 15, 2023 1:48 am

There is no TOBS adjustment. USHCN was US only, and became obsolete nine years ago.
I calculate the global average using GHCN unadjusted, and get almost identical results.

TimTheToolMan
Reply to  Nick Stokes
March 15, 2023 4:27 am

Nick writes “USHCN was US only”

You mean where the people live and “feel” climate change?

Let’s be sure of what’s being claimed here: are you saying there are no TOBS adjustments in the various data sets used to calculate the GAT?

Do you have that comparison on your website?

TimTheToolMan
Reply to  Nick Stokes
March 15, 2023 5:01 am

Never mind, I found the info on your website.

Before 1970 the adjustments “cool the past”, by up to 0.05°C. However, on a land basis, that is up to 0.2°C.

Reply to  Kip Hansen
March 13, 2023 7:20 am

Luckily, researchers have already done lots of studies on what visual cues work and what sucks, so you don’t have to start from scratch. Most notable is perhaps William S. Cleveland and Robert McGill’s paper Graphical Perception: Theory, Experimentation, and Application to the Development of Graphical Methods [pdf] from the September 1984 edition of the Journal of the American Statistical Association. I won’t rehash the whole paper, but the findings of most interest here is a ranked list of how well people decode visual cues.

In his text Visualizing Data, William Cleveland demonstrates how the aspect ratio of a line chart can affect an analyst’s perception of trends in the data. Cleveland proposes an optimization technique for computing the aspect ratio such that the average absolute orientation of line segments in the chart is equal to 45 degrees. This technique, called banking to 45 degrees, is designed to maximize the discriminability of the orientations of the line segments in the chart. In this paper, we revisit this classic result and describe two new extensions. First, we propose alternate optimization criteria designed to further improve the visual perception of line segment orientations. Second, we develop multi-scale banking, a technique that combines spectral analysis with banking to 45 degrees. Our technique automatically identifies trends at various frequency scales and then generates a banked chart for each of these scales. We demonstrate the utility of our techniques in a range of visualization tools and analysis examples.
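A rough sketch of the “banking to 45 degrees” criterion described in that abstract: pick the chart’s height-to-width ratio so that the average absolute orientation of the line segments is 45 degrees. The implementation below (a simple bisection search) is a minimal illustration of the idea, not code from Cleveland or the paper.

import math

def mean_orientation_deg(slopes, k):
    # Average absolute on-screen orientation (degrees) when data slopes are scaled by k.
    return sum(math.degrees(math.atan(abs(s) * k)) for s in slopes) / len(slopes)

def bank_to_45(xs, ys):
    # Return a height:width ratio that banks the chart's segments to an average of 45 degrees.
    pts = list(zip(xs, ys))
    slopes = [(y2 - y1) / (x2 - x1) for (x1, y1), (x2, y2) in zip(pts, pts[1:]) if x2 != x1]
    lo, hi = 1e-9, 1e9
    for _ in range(200):                 # bisection on k: mean orientation rises with k
        mid = math.sqrt(lo * hi)
        if mean_orientation_deg(slopes, mid) < 45:
            lo = mid
        else:
            hi = mid
    k = math.sqrt(lo * hi)
    # k converts data slopes to screen slopes:
    # screen_slope = data_slope * (height / y_range) / (width / x_range)
    return k * (max(ys) - min(ys)) / (max(xs) - min(xs))

years = list(range(1880, 2021))
anoms = [0.01 * (y - 1880) for y in years]   # a clean 0.01/year rise, for illustration
print(bank_to_45(years, anoms))              # ~1.0: a straight diagonal banks to 45 on a square chart

Real, noisy anomaly series have much steeper segment-to-segment slopes than the overall trend, which is why banking them to 45 degrees typically produces a short, wide chart.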

Reply to  Steven Mosher
March 13, 2023 10:23 am

That sounds like Ph.D.-style claptrap

Reply to  Richard Greene
March 13, 2023 2:37 pm

sounds like insecure name calling

sskinner
Reply to  Steven Mosher
March 13, 2023 11:35 am

“…….We demonstrate the utility of our techniques in a range of visualization tools and analysis examples”

Or we could just create a graph with the correct x,y scale.

Reply to  sskinner
March 13, 2023 2:40 pm

the correct scale is what they studied.

hint: it’s not a zeroed scale

sskinner
Reply to  Steven Mosher
March 13, 2023 5:13 pm

Too much study and education can be an obstacle to learning.
“Today’s scientists have substituted mathematics for experiments, and they wander off through equation after equation, and eventually build a structure which has no relation to reality.” – N. Tesla

Tim Gorman
Reply to  sskinner
March 14, 2023 5:49 am

Wow! Nice quote!

Models as data? what a joke.

Bellman
Reply to  sskinner
March 14, 2023 8:33 am

By “today’s scientists” I think he means people like Einstein. He was very much against the idea of relativity and atomic theory.

sskinner
Reply to  Bellman
March 14, 2023 12:48 pm

That is interesting, although it seems very relevant to today, especially with any ‘research’ to do with climate or Covid. Here is Richard Feynman – “We live in an unscientific age in which almost all the buffeting of communications and television-words, books, and so on-are unscientific. As a result, there is a considerable amount of intellectual tyranny in the name of science.”
And Einstein said “Blind belief in authority is the greatest enemy of truth”

Bellman
Reply to  Bellman
March 14, 2023 2:00 pm
michael hart
Reply to  Steven Mosher
March 14, 2023 6:13 am

Boring. tl, dnr.

Reply to  Kip Hansen
March 13, 2023 7:26 am

Kip, take a minute and google “cleveland graphics slope”

the ability to convey information with a chart has been studied empirically.

bank to 45 is the rule amongst graphics professionals.

OSHA is NOT an authority here. Bad appeal to the wrong authority.

2 strikes in 1 comment.

Editor
Reply to  Steven Mosher
March 13, 2023 8:27 am

Mosher ==> Cleveland’s basic advice on scale is:

Make important differences large enough to perceive

That implies that barely significant differences need to be minimized — or at least shown to a scale that allows the readers to understand the relative importance.

Thus, my graph fits the bill rather well.

If the temperature difference is unnoticeable or insignificant to humans in daily settings, then the graph should show that.

Editor
Reply to  Kip Hansen
March 13, 2023 10:25 am

Mosher ==> The Cleveland advice makes it very clear that scale depends on purpose.

If your purpose is solely scientific data analysis, say in a scientific paper, then follow their advice.

But for communicating information to the general public, using Cleveland’s “bank to 45” is exactly what Huff points out is a propagandist’s dream. Every tiny change in anything, up or down, can be made to look dramatic.

Certainly not fit for truth telling.

Reply to  Kip Hansen
March 13, 2023 2:36 pm

again, Kip, you can refer to me as Steve or Mr Mosher.

the PURPOSE of the data is to show the CHANGE

to make that readable immediately:

1. use anomalies
2. zero your graph

your version lies because it hides the incline and assumes that indoor temperature, which can be controlled, is what matters for things like farming, outside work, and animal migration.

your choice DESTROYS the slope information and hides the incline.

Kip’s nature trick. great, you’re plagiarizing Mann now.

if you want to argue with Cleveland, cite some science

Editor
Reply to  Steven Mosher
March 13, 2023 4:00 pm

Mosher ==> Your anomaly-only version hugely exaggerates the increase, which, in terms of life on Earth, or Earth climate in general, is rather small and insignificant.

Read Huff for your reference for graphs for public consumption.

Last edited 17 days ago by Kip Hansen
Tim Gorman
Reply to  Kip Hansen
March 14, 2023 4:57 am

The mountains of Arkansas look huge until you visit Denver. It’s all a matter of perspective. Creating a false perspective doesn’t lead to understanding, it leads to misunderstanding.

Nick Stokes
Reply to  Tim Gorman
March 15, 2023 2:06 am

So do you insist that people are only allowed to view the mountains of Colorado from Arkansas, lest they be alarmed?

bdgwx
Reply to  Kip Hansen
March 14, 2023 7:54 am

Kip, your 5 C recommendation is the difference between glacial and interglacial eras. And 1 C captures the range of the global temperature from the MWP to the LIA. So I don’t think these changes are small and insignificant.

Last edited 16 days ago by bdgwx
Editor
Reply to  bdgwx
March 14, 2023 4:58 pm

bdgwx ==> We have very little idea what the “thermometer” temperatures were in the MWP or in the Little Ice Age in degrees. And as you know, I am not a fan of the “averages” used to produce such things as “Global” anything.

We know about the MWP mostly from historic accounts, not thermometer measurement. Same for the LIA. And most certainly, not temperatures as Global Averages in degrees.

We know approximately NOTHING about thermometer measurements (degrees) of GAST during Glacial Periods and Interglacials in the past.

We can know that temperatures were lower during Glacials and warmed up from Glacials to Interglacials.

We do NOT know what those temperatures were in any measure of degrees.

bdgwx
Reply to  Kip Hansen
March 14, 2023 6:49 pm

We do know. [Kaufman et al. 2020].

Editor
Reply to  bdgwx
March 15, 2023 7:21 am

bdgwx ==> Interesting, but guesses is guesses — not measurement.

bdgwx
Reply to  Kip Hansen
March 15, 2023 7:45 am

Kaufman et al. 2020 didn’t guess. They took actual measurements of temperature proxies and aggregated them into a global average temperature.

MCourtney
Reply to  Kip Hansen
March 13, 2023 7:35 am

Kip Hansen, I applaud this idea.
A rationale for the scale based on human comfort. It makes sense.

But while that’s useful for health questions, there are other questions too.
Effects on wildlife, agriculture, weather (humidity), weather (windspeed)…

The scale shown ought to be relevant to the question you are asking.
Your graph is. But the question you are answering always needs to be stated.

Editor
Reply to  MCourtney
March 13, 2023 8:31 am

MCourtney ==> There is not, and cannot be, a single answer, or a single question.

There is absolutely no evidence presented by the IPCC (or anyone else) that a 1°C change in “global temperature’ (which we do not and probably cannot know) is important or significant to life on Earth. At best, observations swing both ways — the warmer present is better for almost everything, bad for some in some special instances.

Tim Gorman
Reply to  Kip Hansen
March 13, 2023 9:38 am

The global AVERAGE temperature hides what is going on. There is no way to use the average to determine if Tmax is going up, down, or sideways and the same thing applies to Tmin.

I’ve seen study after study claiming “global warming” is going to kill the food supply because of the assumption that it is Tmax that is going to go up by 3C or even more.

Yet the GAT doesn’t tell you that! In fact, most ag studies show that growing seasons are lengthening. That’s due to Tmin going up, not Tmax.

“At best, observations swing both ways — the warmer present is better for almost everything”

First you have to know if the present is actually warming!

Reply to  Tim Gorman
March 13, 2023 10:32 am

Local temperature data tell you what you need to know. No one lives in the average global climate.

Or you can follow my two predictions if your local data are unavailable: (Nobel Prizes pending)

(1) The climate will get warmer, unless it gets colder, and

(2) Winters will be cold, and summers will be warm

Global climate statistics would be useful when they include TMAX and TMIN trends, warming by latitude and warming by season or month of the year. But those details are top secret. No one must know that colder nations are having warmer winters since the 1970s — those facts would not scare anyone. Climate change scaremongering is worthless if it does not scare people.

Last edited 17 days ago by Richard Greene
KevinM
Reply to  Richard Greene
March 13, 2023 3:36 pm

All that Canadian and Russian/Siberian tundra turning into farmland. It would be tough for a Texan wheat farmer but imagine the farmer in Saskatchewan. Yet they vote opposite those interests. In both cases, Texas and Saskatchewan, “other factors” are more important when they consider the role of government on their personal situations.

Last edited 17 days ago by KevinM
Reply to  Tim Gorman
March 13, 2023 2:21 pm

the global average hides nothing!!

Tim Gorman
Reply to  Steven Mosher
March 13, 2023 3:45 pm

Really? What is the variance associated with the distribution that results in that average?

Editor
Reply to  Steven Mosher
March 13, 2023 4:06 pm

Mosher ==> The “Global Average” graph depends on how it is shown and to what scale.

If shown on the scale of Earth temperatures experienced on the average day, it is trivial. If the scale is Annual Global High and Low, it is still trivial.

Using the Cleveland “bank to 45” recommendation guarantees a 45° slant for even trivial changes and trends, and is meant to make them look significant.

Huff – How to Lie with Statistics.

Reply to  Steven Mosher
March 14, 2023 1:29 am

The global average temperature “hides nothing” EXCEPT:

More warming in the six coldest months of the year, including the Arctic, than in the six warmest months of the year

More warming at the higher, colder latitudes of the Northern Hemisphere

No warming of Antarctica

More warming at night (TMIN), than in the afternoon (TMAX)

Do you still want to continue lying by claiming a single global average temperature, that not one person lives in, hides nothing?

A single average temperature hides EXACTLY what leftist climate scaremongers like you WANT to hide. Because the average temperature change details I listed above would REDUCE the unjustified fear of the future climate that gullible fools like you have been conned to believe.

So the general public will NEVER get historical temperature details beyond a simple global average temperature.

An always wrong prediction of the future climate works to scare people. … Hearing about warmer winter nights in Siberia would not scare anyone.

Scaring people is the ONLY goal of the climate change religion. Scaring people about an imaginary boogeyman to control the private sector and increase government powers, while reducing personal freedoms.

And you, Mr. Masher, are on the WRONG side of the climate change propaganda. Shame on you.

Last edited 16 days ago by Richard Greene
DWM
Reply to  Richard Greene
March 14, 2023 12:50 pm

We use global average temperature and global average solar radiation and global average albedo to estimate energy budgets and the GHE. Of course not measured to two decimal places.

Reply to  Kip Hansen
March 13, 2023 10:28 am

+1 degree C. is a nothingburger
+1.49 degrees C. is a nothingburger too
Over +1.5 degrees C. and we are in BIG trouble
Over +2.0 degrees C. and we all die

That is a summary of modern climate science.
It is in all the newspapers.

By the way, no one knows the global average temperature in 1880 or 1850. The claimed averages could have a margin of error of +/- 1 degree C.

This post is serious,
not satire.

Tim Gorman
Reply to  Richard Greene
March 13, 2023 11:18 am

You pretty much nailed it!

Reply to  Richard Greene
March 13, 2023 2:30 pm

+1 degree C. is a nothingburger
+1.49 degrees C. is a nothingburger too
Over +1.5 degrees C. and we are in BIG trouble
Over +2.0 degrees C. and we all die
That is a summary of modern climate science.
It is in all the newspapers.

this is a classic strawman. 2C is not a kill zone demarcation

you cannot debunk arguments unless you can restate them accurately to the satisfaction of those who actually believe the arguments.

its in the newspapers!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

bad sourcing freshman mistake.

here
https://www.ipcc.ch/site/assets/uploads/sites/2/2022/06/SR15_Chapter_3_LR.pdf

KevinM
Reply to  Steven Mosher
March 13, 2023 4:01 pm

I’ve read the linked pdf, which seems to say that higher/lower extremes are more important than a static higher mean for the scenarios we’re thinking about. The authors seem unwilling to commit to a catastrophic conclusion.

Select quotes:
“The strongest warming of hot extremes is projected to occur in central and eastern North America, central and southern Europe, the Mediterranean region”
If someone wanted to write a definition of where money lives, the quoted sentence would be a good start

“Limiting global warming to 1.5°C would limit risks of increases in heavy precipitation events on a global scale and in several regions compared to conditions at 2°C global warming”
Are precipitation events bad? Rain hurts outdoor pick-up soccer, but helps green growing things.

“A smaller sea level rise could mean that up to 10.4 million fewer people (based on the 2010 global population and assuming no adaptation) would be exposed to the impacts of sea level rise globally in 2100”
As soon as I see year 2100, I stop worrying. I understand the change would compound yearly, but if flood risk were to crush beachfront property values, I’d be a buyer not a seller. Even given the land might be gone in 70 years, I know I will certainly be gone in 70 years, so whatever.

I could keep going with quotes and comments. You get the idea. Earth is so big that I don’t much fear my smart, healthy, educated American kids will fail to find a nice place on it. I’d rather walk outside and find it warmer than colder, and I live in a very warm region. I wish people would stop coming here, they’re going to use up the water!

Sweet Old Bob
Reply to  KevinM
March 14, 2023 4:19 pm

“The strongest warming of hot extremes is projected to occur in central and eastern North America….”

except that those hot extremes have been falling for 80 years or so.

See EPA heat wave index ….

Reply to  Steven Mosher
March 14, 2023 1:35 am

It’s called sarcasm, Mr. Masher.
Written to amuse people and exaggerate a point.
You have no sense of humor
Although many of your comments here are hilarious.
A typical leftist.

sskinner
Reply to  Kip Hansen
March 13, 2023 1:19 pm

And, it’s not possible to describe or deduce any of the Earth’s varied climates or where they are from a single global average temperature. And, there isn’t a single Earth climate.

KevinM
Reply to  sskinner
March 13, 2023 4:11 pm

Mosher’s link to recent environmentalist text agrees that there isn’t a single Earth climate.
I think you might agree with more than you realize.

Reply to  MCourtney
March 13, 2023 2:19 pm

“The scale shown ought to be relevant to the question you are asking.”

the question is: how much has the temperature increased?
how much has the climate changed?

1. use anomalies
2. zero-base your chart so you can read it on sight

human comfort?
inside or outside?
dressed or naked?
calm or windy?

why not “crop comfort” or “working-outside comfort”?

Jim Gorman
Reply to  Steven Mosher
March 14, 2023 5:29 am

“the question is: how much has the temperature increased?”

“how much has the climate changed?”

These are subjective choices, not science. An increase of 0.5C is unlikely to result in a change in any of the defined climate zones. Consequently, one must answer the question of whether it is appropriate to emphasize that small of a change.

Here is an image of broad climate zone definitions. Where do you see a 1°C rise causing changes?

Jim Gorman
Reply to  Jim Gorman
March 14, 2023 6:28 am

Here is the image.

[Attached image: climate zones.jpeg]
Editor
Reply to  Jim Gorman
March 14, 2023 1:21 pm

Jim ==> That is, of course, (and I’m sure you know) a vastly over-simplified version of climate zones.

The one usually used for most purposes is the Köppen-Geiger climate zone map:

[Attached image: Koppen-map.jpg]
Editor
Reply to  Steven Mosher
March 14, 2023 1:15 pm

Mosher ==> Now you’re babbling. The GAST anomaly does not show “how much the climate has changed.”

Even local annual anomalies do not show how much local climates have changed. They only show how much the questionably useful statistical average “annual temperature” has changed, which may have produced ZERO climatically important changes.

The change matters only to those who demand we define climate change as any change in average surface temperature on scales of years — annual average temperatures.

Such an idea is entirely unscientific.

Tim Gorman
Reply to  Kip Hansen
March 14, 2023 4:00 pm

Climate is based on the entire temperature profile. That is lost when using averages to calculate anomalies. Therefore anomalies can’t tell you anything about climate, be they local, regional, or global.

GAST is a metric, but what it is a metric for is not obvious at all.

Reply to  Kip Hansen
March 13, 2023 9:12 am

here.

https://priceonomics.com/how-william-cleveland-turned-data-visualization/

OSHA is the wrong authority to appeal to.

Please ask before posting about stuff you don’t understand.

Nansar07
Reply to  Steven Mosher
March 13, 2023 12:43 pm

Wow, three Moshisms and counting

Reply to  Nansar07
March 13, 2023 2:15 pm

of course you ignore the bad appeal to authority

where did you learn to argue?

folks who have studied empirically how BEST to convey information would never say to scale a graph to destroy the trend information

it’s TONY’s nature trick

bobclose
Reply to  Kip Hansen
March 14, 2023 4:24 pm

I disagree with this temp range, as the data is only just 1C above or below the chosen datum, so expand that to 3C total and it will look better and be more meaningful.

Reply to  Kip Hansen
March 14, 2023 7:07 pm

Office temperatures? That’s stupid.
Why not office CO2?

How about office humidity?

TallDave
Reply to  Nick Stokes
March 13, 2023 9:17 am

“But famously, we don’t experience global average temperature”

like hundreds of millions of others, I lived every temperature on the scale this year

it’s the human-relevant scale

but yes a local livability scale would be vastly superior

but you won’t like the proportion of the Earth that actually wants lower temperatures

KevinM
Reply to  TallDave
March 13, 2023 4:13 pm

People don’t get rich and move North very often.

Editor
Reply to  Nick Stokes
March 14, 2023 5:31 pm

Nick ==> So, you say we should use a vertical scale of at least 14°F (7°C or so), the difference between your offered Ice Age figure and the present? That’s about the scale I used (5°C). I would be glad to produce the graph with a wider scale (7°C); it will look even a little less alarming at that scale.

Nick Stokes
Reply to  Kip Hansen
March 15, 2023 1:59 am

No,
I’m saying you should draw a graph in order to inform, not make some juvenile point. You should use the range of the observed GAST. That tells people what happened. If you are graphing from 1850 to now, you don’t need to make provision for some glaciation.

Editor
Reply to  Nick Stokes
March 15, 2023 7:26 am

Nick ==> If your only purpose is to show how the artificial metric, Global Average Surface Temperature, changed, then that’s fine.

But if you want to inform the general public what that might mean for them in their real lives, and for their children and grandchildren, then you have to put it in some form, with some scale, that will be meaningful to them.

Thus — Read Huff. Or read Cleveland and read his quotes on Huff.

Nick Stokes
Reply to  Kip Hansen
March 16, 2023 1:35 am

“But if you want to inform the general public what that might mean for them in their real lives”

How does a red rectangle do that?

bdgwx
Reply to  Rud Istvan
March 12, 2023 7:17 pm

Rud Istvan: “We were taught to scale meaningfully.”

Bingo!

Rud Istvan: “AW did not pull the same obvious trick you just did.”

Is -20 F or 120 F a meaningful global average temperature?

I ask because 120°F is 92 standard deviations above the max and -20°F is 115 standard deviations below the min. Have you ever created a chart of economic data with the y-axis scaled to such drastic extremes? Can you provide an example where you did this?
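For anyone who wants to check that kind of figure, here is a short Python sketch of the arithmetic: how many standard deviations the proposed y-axis limits sit away from the plotted series. The annual GAT values below are placeholders invented for illustration, not the series bdgwx used, so the results will not reproduce 92 and 115 exactly; only the method is the point.

import numpy as np

# Hypothetical annual global-average temperatures in degrees F (illustration only).
gat = np.array([56.9, 57.0, 57.2, 57.4, 57.6, 57.9, 58.1])
sigma = gat.std(ddof=1)  # spread of the plotted series

upper_bound, lower_bound = 120.0, -20.0  # proposed y-axis limits, degrees F
sigmas_above_max = (upper_bound - gat.max()) / sigma
sigmas_below_min = (gat.min() - lower_bound) / sigma

print(f"sigma = {sigma:.2f} F")
print(f"{sigmas_above_max:.0f} sigma above the max, {sigmas_below_min:.0f} sigma below the min")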

Ed Reid
Reply to  bdgwx
March 13, 2023 6:08 am

The range of -20F to +120F is a reasonable range for global temperatures, from which the global average is calculated.

bdgwx
Reply to  Ed Reid
March 13, 2023 6:26 am

Can you tell me the last time the global average temperature was close to -20 F or +120 F?

Can you post a link to other examples where published graphs use a y-axis range of 115σ below the min and 92σ above the max?

Matt Kiro
Reply to  bdgwx
March 13, 2023 1:03 pm

The point isn’t that the global average temperature was ever close to those values; however, humans regularly live in places where the temperature reaches those values at the height of summer or winter.

bdgwx
Reply to  Matt Kiro
March 13, 2023 2:00 pm

Matt Kiro said: “The point isn’t that the global average temperature was ever close to those values,”

Then don’t base the y-axis on something that has never come close to happening.

Matt Kiro said: “however, humans regularly live in places where the temperature reaches those values at the height of summer or winter”

Which would be a fair point if the graph were of the temperature at a specific spot on Earth that did exhibit that kind of temperature range. But it’s not. It’s a graph of the global average temperature.

Editor
Reply to  bdgwx
March 14, 2023 1:30 pm

bdgwx ==> How about the scale of temperatures in Death Valley, California, or any of the weather stations in the American West High Desert? Would that satisfy you?

Editor
Reply to  Kip Hansen
March 14, 2023 1:33 pm

That’s a high of 134°F (56.7°C) on July 10, 1913, down to a low of 15°F (−9°C) on January 2, 1913.

 https://en.wikipedia.org/wiki/Death_Valley#Climate

bdgwx
Reply to  Kip Hansen
March 14, 2023 3:22 pm

KP said: “How about the scale of temperatures In Death Valley, California or any of the weather stations in the American West High Desert? Would that satisfy you?”

If you were graphing the temperature at Death Valley then absolutely. I’d even expand the y-axis by 1σ or perhaps even 2σ above and below the range.

If you were graphing the global average temperature then absolutely not.
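The padding rule described here is easy to state in code. A minimal sketch, assuming made-up monthly means rather than actual Death Valley station data:

import numpy as np

def padded_ylim(series, k=2):
    """Return (low, high): the observed range padded by k standard deviations."""
    s = np.asarray(series, dtype=float)
    sigma = s.std(ddof=1)
    return s.min() - k * sigma, s.max() + k * sigma

# Placeholder monthly mean highs in degrees F, not real station data.
monthly_means_f = [52, 60, 71, 85, 100, 110, 116, 114, 106, 88, 68, 53]
print(padded_ylim(monthly_means_f, k=1))
print(padded_ylim(monthly_means_f, k=2))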

Editor
Reply to  bdgwx
March 15, 2023 7:30 am

bdgwx ==> We (Anthony and myself, anyway) are not just graphing GAST. We are trying to make a graphic representation of the changes in GAST that will communicate those changes to the general public in a way they can understand, and that conveys the true practical magnitude of that change.

Again, read Huff or read Cleveland on Huff and the importance of the purpose of a graph (or any representation of a statistic/metric) in how it should be scaled.

bdgwx
Reply to  Kip Hansen
March 15, 2023 7:43 am

Despite having an issue with your justification, I don’t actually have an issue with your choice of y-axis bounds. Like I said, 5°C at least frames the GAT in the context of what was typical over the last 1 million years.

I have 3 major issues with Anthony Watts’ graph though.

  1. It’s not even right. You can’t just add 52.7°F to every data point; it doesn’t work that way (see the sketch below).
  2. It looks like the GAT is oscillating rapidly between 0°F and 59°F.
  3. It uses unreasonable y-axis bounds that are not even remotely typical of anything humans experience.
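To make the disagreement concrete, here is a minimal plotting sketch. It uses a synthetic anomaly series and adds an assumed 57.2°F baseline as a constant offset, which is exactly the conversion step disputed in point 1, then draws the same data under three different y-axis choices:

import numpy as np
import matplotlib.pyplot as plt

# Synthetic anomaly series (degrees F), for illustration only.
years = np.arange(1880, 2024)
rng = np.random.default_rng(0)
anomaly_f = 0.012 * (years - 1880) + rng.normal(0, 0.15, years.size)
absolute_f = anomaly_f + 57.2  # naive anomaly-to-absolute conversion (the disputed step)

fig, axes = plt.subplots(1, 3, figsize=(12, 3))

axes[0].plot(years, anomaly_f)
axes[0].set_title("Anomaly, data-driven limits")

axes[1].plot(years, absolute_f)
axes[1].set_ylim(absolute_f.min() - 1, absolute_f.max() + 1)
axes[1].set_title("Absolute, tight limits")

axes[2].plot(years, absolute_f)
axes[2].set_ylim(-20, 120)
axes[2].set_title("Absolute, -20 to 120 F limits")

plt.tight_layout()
plt.show()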

KevinM
Reply to  Matt Kiro
March 13, 2023 4:15 pm

Thanks MK.

ClimatePerson
Reply to  bdgwx
March 13, 2023 4:05 pm

He didn’t claim that global avg temps were close to -20 or +120F. Just that the average falls between those extremes. Which the last time I checked is pretty much true.

bdgwx
Reply to  ClimatePerson
March 13, 2023 5:25 pm

ClimatePerson said: “He didn’t claim that global avg temps were close to -20 or +120F.”

Actually, the graph shows the GAT between 0°F and 59°F, so it could be reasonably argued that it is at least close to -20°F. Let’s ignore that fact for a moment, though. If the global average temperature isn’t close to -20°F or +120°F, then why choose those as the lower and upper bounds for the y-axis?

ClimatePerson said: “Which the last time I checked is pretty much true.”

It’s also true that the global average temperature lies between -459 F and 212 F. That doesn’t make it a good choice for the bounds of the y-axis though.

ClimatePerson
Reply to  bdgwx
March 13, 2023 6:57 pm

My point is that “he didn’t claim that global avg temps were close to -20 or +120F”. 57.2°F falls pretty much in the middle of that range. Mr. Watts’ choice of upper and lower bounds for the y-axis makes sense to me, because they represent a range of temps that are actual extremes that the globe experiences. We don’t experience -459F and 212F in our daily existence so it would not be appropriate to choose such a range. It would be meaningless. In contrast, Mr. Stokes’ choice of upper and lower limits of national debt does not represent a reasonable range because our national debt is many factors less than what his y-axis depicts. You are free to prepare your own graph with any y-axis you wish.

bdgwx
Reply to  ClimatePerson
March 13, 2023 7:52 pm

ClimatePerson said: “Mr. Watts’ choice of upper and lower bounds for the y-axis makes sense to me, because they represent a range of temps that are actual extremes that the globe experiences.”

No they aren’t. The global average temperature does not come anywhere close to -20 F or +120 F.

ClimatePerson said: “We don’t experience -459F and 212F in our daily existence so it would not be appropriate to choose such a range.”

What we experience is irrelevant. It’s not a graph of the temperature in your backyard. It is a graph of the global average temperature. Nobody “experiences” the global average temperature.

ClimatePerson said: “Mr. Stokes’ choice of upper and lower limits of national debt does not represent a reasonable range because our national debt is many factors less than what his y-axis depicts.”

Bingo! It is similar to what I did with the Greenland temperature proxy. By surreptitiously expanding the y-axis, you can hide details like the fact that the Earth goes through glacial cycles.

Jim Gorman
Reply to  bdgwx
March 14, 2023 5:48 am

Your attention to the GAT is misplaced. You are being used to propagate propaganda. If no place on earth experiences the GAT anomaly on an ongoing basis, then it can’t possibly be useful for indicating what people actually experience.

As electricity becomes less reliable and more and more expensive, people WILL start asking what is going on. They are going to ask for proof that their local “climate” has been experiencing untenable temperatures. The GAT anomaly is not going to satisfy that need. Climate scientists should be preparing that information right now to prevent angry societies from exacting retribution. That is where your attention should be focused.

Nick Stokes
Reply to  ClimatePerson
March 13, 2023 9:10 pm

“In contrast, Mr. Stokes’ choice of upper and lower limits of national debt does not represent a reasonable range because our national debt is many factors less than what his y-axis depicts”

I agree. The range should be linked to the range over which the quantity varies. True for GAT also.

Editor
Reply to  Nick Stokes
March 14, 2023 1:39 pm

Nick ==> Perfectly true for a science paper but totally false for a graph for public consumption. Tiny, tiny changes can be made to look HUGE for a quantity that varies very little over time.

Thus fooling the public into thinking that the change is BIG not TINY.

ref: Huff, “How to Lie with Statistics” (free .pdf)

This book is quoted extensively in Cleveland, btw.

Nick Stokes
Reply to  Kip Hansen
March 15, 2023 2:01 am

This is a very elitist view. Let them eat crap.

The public has a right to be straightforwardly informed. They can handle it.