Numbers – Tricky, Tricky Numbers: Part 1

Guest Opinion by Kip Hansen – 26 July 2022

I beg your forbearance if I step on the toes of your ideology or scientific viewpoint with my essay today.  But I hope to hold your attention long enough to make a point so important that it affects almost all empirical knowledge in our modern world.  The point is so simple, yet so scientifically profound, that it might even sound silly to casual readers:

Numbers are just Numbers

~ ~ ~

The most famous and illustrative example (albeit fictional) is Douglas Adams's "Answer to the Ultimate Question of Life, the Universe, and Everything", as calculated by an enormous supercomputer named Deep Thought: "42".

This is precisely my whole point here today, with a rather long afterword on why this is important enough to mention here at a science blog.  Readers who already understand why "Numbers are Just Numbers" is so profoundly true, and what its significance is for modern science, can move on and read about (boring) climate change topics.

[ Warning: This is not an easy essay – it is a short dissertation on the scientific philosophy of numbers and their use in modern science, with some cautions, and it will extend to at least two parts. ]

~ ~ ~

From the book “The Science of Measurement:  Historical Survey” by Herbert Klein we get the following quotes:

“…the tools and techniques of measurement provide the most useful bridge between the everyday worlds of the layman and of the specialists in science.”

“Non-scientists may be similarly impressed to discover that units of measurement – for length, area, volume, time duration, weight, and all the rest – are essentials of science.”

“[This work] …should prove serviceable to professionals in science, but its main purpose is to make outsiders realize that in their daily lives and concerns they too are involved in the activities and ideas classified as metrology, the science of measurement – a subdivision of science that underlies and assists all others.”

And what is metrology?  “Metrology is the scientific study of measurement.”

And what is measurement?  “Measurement is the quantification of attributes of an object or event, which can be used to compare with other objects or events.”

And what is quantification? “…quantification is the act of counting and measuring that maps human sense observations and experiences into quantities. Quantification in this sense is fundamental to the scientific method.”

And what is counting?  "Counting is the process of determining the number of elements of a finite set of objects" – such as an object's physical attributes.

~ ~ ~

All measurement is, at its most basic, simply counting – the number of beans, coins, stars, inches, lightyears…(and all other units of measurement).  The result of counting is a number – the number of “elements” – of the things counted.

And what is a number? “A number is a mathematical object used to count, measure, and label. The original examples are the natural numbers 1, 2, 3, 4, and so forth.”

~ ~ ~

So, the basic activity of Science is counting or measuring – a specific type of counting against pre-established, internationally agreed-upon units of some quality or property, such as temperature or weight or length or foot-pounds and many, many more.  There are a lot of different methods of measuring different things, with vastly different tools, at a wide variety of scales.  Nonetheless, they are all really just types of counting.

Alas, when we (or you or they) count, the result is a number – which is nothing more than a mathematical object – "A mathematical object is an abstract concept arising in mathematics".  The counted number alone, of course, is not a thing at all – only an abstract concept – until it is clearly stated as a number of whatevers: number of peaches, number of inches of 2×4 board, number of monarch butterflies at any given moment, number of any of the SI units under the International System of Units for the various physical properties of something.

Numbers can be tricky… just because they are numbers – like 1, 2, 85, 400 million, 3.432 – some think we can just willy-nilly apply all types of mathematical processes to them: add them up, subtract them, multiply them, add them up and divide them into averages, and/or average them spatially with various methods of distance weighting and kriging – all of this meant to produce physically meaningful results.
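A minimal sketch of the point (made-up quantities, plain Python): the arithmetic grinds away happily on bare numbers, with no idea whether the operation makes any physical sense.

```python
from statistics import mean

# Three "numbers", stripped of what they counted:
readings = [72.4,    # degrees F at some airport
            10,      # monarch butterflies in a garden
            3.432]   # metres of lumber

# Python, like any calculator, averages them without complaint --
# the units, and therefore the physical meaning, live only in our heads.
print(mean(readings))   # 28.6106... -- 28.6 of what, exactly?
```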

To make matters worse, statisticians often think that they can then take those numbers churned out by all the above processes and wring out even more meaningful results not otherwise visible to the human mind. 

But does all that Mathematica-ing produce physically meaningful results?

While some interesting things can be done with numbers and statistics, many fields of modern science have often gone far down the slippery slope of reification of numbers – often creating whole fields' worth of non-physical data – like Global Average Surface Temperature, an entirely imaginary, non-physical number.  Similarly, modern oceanic scientists have created the imaginary concept of Eustatic Sea Level – a "would have been" not-really-a-physical-level level.

SIDE BAR:  "Reification is when you think of or treat something abstract as a physical thing."  Remember, these numbers are mathematical abstracts.

In a two-decade-old BMJ article, it is acknowledged that:

“Many people only respect evidence about clinical practice [think also: biology, climate science, geology, psychology, ad infinitum ]  that is couched in the highly abstract language of graphs and statistical tables, which are themselves visualisations of abstract relations pertaining among types of numbers, themselves again abstractions about ordinary phenomena.” [ source ]

Thus, we find ourselves reading articles and essays and journal papers filled with abstractions about abstractions about ordinary phenomena.

In practice, we call these abstractions numbers or data sets or even "the data", and then we (or somebody) make them into various visual presentations – charts and graphs and pretty pictures – intended to sell their favorite hypothesis or to refute your favorite hypothesis.

~ ~ ~

Preview of Part 2:

In Part 2, I will consider why it is that

“One cannot average temperatures.”

Really.

# # # # #

Author’s Comment:

I have been accused in the past of not liking numbers, of hating numbers, of not understanding mathematics, of not understanding statistics, and of being a general math-o-phobe.  This is not true – not only do I love the beauty and certainty of mathematics, but I am also a true pragmatist:

“one who judges by consequences rather than by antecedents.”

I am not a real fan of blind trust in so-called experts – experts, in my opinion, have to be able to show their work in the real world.

I am well aware of the many problems of modern scientific research, including that all fields of science are returning a lot of questionable results – even in closely monitored fields like medicine.  It is my belief that much of this is caused by "too much math, not enough thinking" or the reification of mathematical and statistical results.

“I could be wrong now, but I don’t think so.”

(h/t Randy Newman)

Free-for-all on this topic in comments.

Thanks for reading.

# # # # #

276 Comments

Scissor
July 26, 2022 2:09 pm

I wonder how many people like Bell’s palsy.

Scissor
Reply to  Kip Hansen
July 26, 2022 2:13 pm

A lot of things are. Unexpected. Cheers

Scissor
Reply to  Kip Hansen
July 26, 2022 2:35 pm

I like your articles, not in a Bell's palsy sense, but in the way they prompt one to think about things from a different perspective. They're very enjoyable to read.

Doug Huffman
Reply to  Kip Hansen
July 27, 2022 3:58 am

Karl Popper listed surprise as a good indicator of truth in his "Logic of Scientific Discovery".

Redge
Reply to  Scissor
July 26, 2022 9:57 pm

I get that after too much of this:

[image: Bell's blended Scotch whisky, 70 cl]
2hotel9
July 26, 2022 2:16 pm

Statistics can be made to say whatever the person presenting wants them to say. Computer models say precisely what they are programmed to say by those who program them. Beginning to see a pattern in climate catastrophics?

Old Cocky
Reply to  Kip Hansen
July 26, 2022 3:13 pm

Kip, did you get my email on that particular topic?

Old Cocky
Reply to  Kip Hansen
July 26, 2022 8:14 pm

Third time lucky?

Old Cocky
Reply to  Kip Hansen
July 27, 2022 3:22 pm

Cool. I’m looking forward to your thoughts on it.

Steve Z
Reply to  2hotel9
July 26, 2022 2:34 pm

Mark Twain wrote that there are “lies, damned lies, and statistics”. He passed away in 1910, long before anyone worried about global warming, but what would he have said about today’s number manipulators trying to scare people away from keeping warm?

David Blenkinsop
Reply to  Steve Z
July 27, 2022 1:37 am

I’ve read that it was really Dr. Roger Revelle who took the established (established but not really proven) theory of greenhouse gas warming, and turned it into something alarming, starting in the 1950’s:

http://ruby.fgcu.edu/courses/twimberley/EnviroPhilo/Coleman.pdf

Before Revelle’s work, hardly anyone would have thought that a bit of warming was of any big concern, it seems. Of course, getting alarmed must have been an idea whose time had come, being seized upon eventually then by everyone from government officials to NASA administrators, scientist’s unions or associations, etc.

As to the general abuse of number concepts, it really is a bit mysterious how empiricism — the ability to actually handle measurements properly and verify either the usefulness or the uselessness of theories thereby — has fallen by the wayside in so many areas of publicly consequential science these days.

Josh Scandlen
Reply to  David Blenkinsop
July 27, 2022 6:46 pm

Interesting. A few years back, I read D.S. Halacy's "Fuel Cells: Power for Tomorrow", written in 1964 I think, where he mentioned CO2 and global warming, and I always wondered when the scare tactics actually began.
https://amzn.to/3zdJUtd

Retired_Engineer_Jim
Reply to  2hotel9
July 26, 2022 2:36 pm

Except for true AI algorithms, which can go off and do whatever they want.

writing observer
Reply to  Retired_Engineer_Jim
July 26, 2022 10:05 pm

Which do not exist – true AI algorithms, that is. Today’s “AI” does exactly what it is programmed to do. Takes input numbers, manipulates them in some manner, and outputs numbers.

Whatever a computer does can be replicated by a human with paper and pencil. “AI” is only quantitatively different; i.e., that hypothetical human cannot perform the task in anywhere close to the time that the computer can. (In the case of supercomputer runs, beyond the theoretical heat death of the universe.)

Richard Page
Reply to  writing observer
July 27, 2022 9:07 am

Today's 'AI', despite its name, is little more than a Simulated Intelligence program: with preset parameters and data sets, these are sophisticated programs but a universe away from what a true AI would be.

Ferd Flintson
Reply to  Retired_Engineer_Jim
July 26, 2022 11:31 pm

Al Gore rhythms have been out of sync for years.

Richard Greene
Reply to  Retired_Engineer_Jim
July 27, 2022 12:14 am

How about AlGoreithms?

Joao Martins
Reply to  Richard Greene
July 27, 2022 2:34 am

You spoke my mind. I was going to make the same suggestion, word for word.

Gary Kerkin
Reply to  2hotel9
July 26, 2022 3:49 pm

That depends on the intellectual honesty of the programmer!

2hotel9
Reply to  Gary Kerkin
July 27, 2022 4:21 am

Looking at the "product" being produced, their intellectual honesty is zero.

Surrr
Reply to  2hotel9
July 26, 2022 4:28 pm

Or the size of the taxpayer-funded research grants for the "right" type of statistics that governments agree with.

Robert B
Reply to  2hotel9
July 27, 2022 12:20 am
  1. Think of a number.
  2. Multiply it by 3.
  3. Add 6.
  4. Divide this number by 3.
  5. Subtract the number from Step 1 from the answer in Step 4.

The answer is 2.
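The trick works for any starting number. Writing the steps out with $x$ for the number chosen in Step 1:

$$\frac{3x + 6}{3} - x = (x + 2) - x = 2$$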

DaveW
Reply to  Robert B
July 27, 2022 2:06 am

Works for 1000 and 1001 and I think I’m quitting now while I can still do it in my head.

Joao Martins
Reply to  Robert B
July 27, 2022 2:37 am

Good algoreithm. Unfortunately, one cannot make money from it…

eyesonu
Reply to  Robert B
July 29, 2022 7:28 am

Complex mathematical / algebraic formula where: 2 = a+2-a

roaddog
July 26, 2022 2:16 pm

Too much math, no thinking at all.

“I made the formula say what I desire, and objective reality has no bearing.” – Modern Expert

Mac
Reply to  Kip Hansen
July 26, 2022 4:05 pm

When you suggested in a previous article that modelers should get together in a room, I thought of Monty Python's argument sketch; my model is better than yours… no it isn't; yes it is… no it isn't…
Same for statisticians… similar room, more arguments about which numbers are correct.

Drake
Reply to  Kip Hansen
July 26, 2022 7:27 pm

But the only “good” model is the Russian one, and the Russians are probably not allowed in the room.

Richard Greene
Reply to  Drake
July 27, 2022 3:22 am

The Russian model is “bad” — too close to being accurate, and not scary enough. Climate computer not wanted. Or we would already have them.

roaddog
Reply to  Kip Hansen
July 26, 2022 7:02 pm

If they ever had them. Popularity is endemic among the liberal insanitii.

LdB
Reply to  roaddog
July 26, 2022 7:49 pm

This is climate science; you should always use alternative maths.

Reply to  LdB
July 26, 2022 10:14 pm

22,000 upvotes!

Zig Zag Wanderer
Reply to  LdB
July 26, 2022 10:35 pm

I think the correct figure is $20,002,000

DaveW
Reply to  LdB
July 27, 2022 2:17 am

Pretty good. Now do genders. (Correct answer is 3, even in English).

Ben Vorlich
Reply to  DaveW
July 27, 2022 4:47 am

No, there's way more than three, whatever maths you use – in the UK anyway.

Richard Page
Reply to  Ben Vorlich
July 27, 2022 9:11 am

3. They keep relabelling the same 3 over and over again in slightly different ways until you end up with hundreds that all bear a remarkable similarity to the 3 they started with.

Alba
Reply to  roaddog
July 27, 2022 3:59 am

Douglas Murray in his book, "The War on the West", has a section where he describes the attempt to 'deconstruct' mathematics by proponents of critical race theory. One aspect of this was to question whether 2 + 2 = 4. "Others claimed that it was obvious that 2 + 2 cannot equal 4 and gave a variety of reasons. These included, but were not limited to, claims that 2 + 2 = 4 is part of a 'hegemonic narrative', that the people who make such narratives should not get to decide what is true, that 2 + 2 should equal whatever people want it to equal, and that making such a definitive statement excludes other ways of knowing. One PhD candidate took to social media to declare that 'the idea of 2 + 2 equalling 4 is cultural and because of western imperialism/colonialism, we think of it as the only way of knowing.'" (Page 198)

George Daddis
Reply to  Alba
July 27, 2022 6:13 am

If I recall, one of the examples to prove that 2 + 2 does not equal 4 is:
Imagine a factory that made 2 complete items and 1/2 of another. They recorded 2 completed items. They did the same the next day. Traditional math would conclude they completed 4 items, but when you combine the 2 half items you find there are actually 5 completed items! (I kid you not. I believe this example came from California.)

Richard Page
Reply to  George Daddis
July 27, 2022 9:13 am

But surely that then would be 2.5 + 2.5 = 5, not 2 + 2 = 4, wouldn't it?

roaddog
Reply to  Richard Page
July 27, 2022 9:27 pm

Math is hard.

atticman
July 26, 2022 2:26 pm

The perils of mis-using statistics in an argument were highlighted by Winston Churchill when he said of an opponent during a Parliamentary debate, "The honourable gentleman uses statistics the way a drunk uses a lamp-post: more for support than for illumination."

The best definition of confirmation bias I’ve ever come across…

James Snook
July 26, 2022 2:35 pm

Kip. Typo worth correcting: reification

James Snook
Reply to  Kip Hansen
July 27, 2022 6:36 am

Sorry Kip. I thought that you meant to type Deification. Probably appropriate in the context of the use by alarmists of such constructions as Global Average Surface Temperature.

Mike Maguire
July 26, 2022 2:43 pm

Great topic Kip!
Numbers of Zeros in a Million, Billion, Trillion, and More
https://www.thoughtco.com/zeros-in-million-billion-trillion-2312346

Numbers Bigger Than a Trillion: The digit zero plays an important role as you count very large numbers. It helps track these multiples of 10 because the larger the number is, the more zeroes are needed.

Millions, Billions, and Trillions: How Can We Think About Really Large Numbers?
https://www.thoughtco.com/millions-billions-and-trillions-3126163

++++++++++++++++++++++++
How do you express large numbers?
https://www.geeksforgeeks.org/how-do-you-express-large-numbers/

+++++++++++++++++++++++++++

So this number, 240,000,000,000, which is 240 billion, could be written as 2.4 × 10 to the 11th power because the decimal point moves 11 places.
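A quick way to sanity-check such conversions, as a throwaway Python sketch:

```python
# The exponent in scientific notation counts how many places the
# decimal point moves -- for a whole number, the digits after the
# leading digit.
n = 240_000_000_000     # 240 billion, written out
print(f"{n:.1e}")       # 2.4e+11, i.e. 2.4 x 10 to the 11th
print(len(str(n)) - 1)  # 11
```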

Rud Istvan
July 26, 2022 2:55 pm

Kip, your post part one on numbers reminded me of a lesson learned long ago in college about economics. It was the main takeaway from a one semester course taught by John Kenneth Galbraith. I took it not so much for the subject matter (artificial demand creation via advertising based on his book ‘The Affluent Society’ on same) but because he was a truly funny professor. Example: he was debating Paul Samuelson of MIT, and claimed something that Samuelson disagreed with. So Samuelson asked Galbraith how he knew this was so? Galbraith responded, ‘Because I am the greatest living expert on it.’ Which brought the house down. But to the lesson.

Galbraith ran the Office of Price Administration (price fixing during WW2 shortages) for FDR. He learned a mental trick for summing a long column of numbers to within about 10% (answer, lots of rounding) in order to more quickly reach OPA decisions. A student (not me) asked him whether that was good enough for such important decisions. Galbraith replied, ‘If within ten percent isn’t good enough to answer the question, then you have the wrong question.’

Good enough for government work—and much else in life. Just get into the ballpark. It suffices for most things.

Gary Kerkin
Reply to  Rud Istvan
July 26, 2022 3:56 pm

It was once said that if you asked a panel of 5 economists a question you would get 6 answers because JK Galbraith always changed his mind.

Rud Istvan
Reply to  Gary Kerkin
July 26, 2022 5:11 pm

Nah. As I learned it, if you asked 3 economists you would get six answers… on the one hand, on the other hand. 3×2=6. Same basic joke.

John in Oz
Reply to  Rud Istvan
July 26, 2022 6:22 pm

Apparently you can just average all of the answers to get the correct one, as is done with climate model runs

Retired_Engineer_Jim
Reply to  Rud Istvan
July 26, 2022 10:04 pm

Are these the same economists who predicted 18 of the last 3 recessions?

Reply to  Retired_Engineer_Jim
July 27, 2022 3:25 am

Stock investors predict too many recessions.
Economists, as a group, never predict recessions.

Zig Zag Wanderer
Reply to  Rud Istvan
July 26, 2022 10:38 pm

This is why we need economists with only one hand

Richard Greene
Reply to  Rud Istvan
July 27, 2022 3:24 am

As a group, US economists have never predicted a recession.
Not once.
That's not a joke – it's true.
They are about to do it again.

Reply to  Richard Greene
July 27, 2022 1:20 pm

But, if all the economists in the world were laid end to end, they still wouldn't reach a conclusion.

Auto.

Josh Scandlen
Reply to  Rud Istvan
July 27, 2022 6:50 pm

Which president was it that wanted the one-armed economist, because economists always say "on the other hand…"?
Truman?

Mr.
Reply to  Rud Istvan
July 26, 2022 4:07 pm

Before sophisticated computer systems were widely in use by businesses (1960s), my early career work was in accounting / auditing production materials usage manually.

So we used a “practiced eye” to assess “first-pass fit” of manually compiled production reports numbers –
know approximately what reported total numbers should be,
add up whole tens or hundreds or thousands entries in your head,
compare to expected total,
if in the ballpark, call it “first-pass fit”,
then move on to next reports.

I’ve been using the “first-pass fit” technique for many & varied tasks all my life now.

Trying to teach the grandkids to use it now.

Mr.
Reply to  Kip Hansen
July 26, 2022 7:00 pm

My lord have they complicated elementary math lessons.
It’s a major exercise now to arrive at 6 x 8 = 48.

Ben Vorlich
Reply to  Mr.
July 27, 2022 5:05 am

I feel your pain, I'm also a grandchild maths tutor. One was struggling with multiplication of large decimal numbers – 123.456 × 543.987 type calculations. In the end I said I don't care what you've been taught, we're doing it the way I learnt. The only problem was getting him to write neatly.

It’s important to be able to do mental calculations of approximate answers so you can say that doesn’t look right when using a calculator.

I went to a school in a little village. In the Co-op there was an assistant who was the most impressive I've ever met at mentally adding up £sd. People used to leave their shopping lists and collect the messages (shopping) later. He could do the calculation by running his pencil up the column of numbers in just a few seconds; this involved halves, 12s and 20s, correct every time. I'm still impressed 65 years later.

roaddog
Reply to  Rud Istvan
July 26, 2022 7:04 pm

Absolutely. The pursuit of precision, in order to corrupt it, is a pandemic all its own.

Retired_Engineer_Jim
Reply to  roaddog
July 26, 2022 10:05 pm

Is there a vaccine? Maybe we need another Warp Speed effort.

meab
Reply to  Rud Istvan
July 27, 2022 5:24 pm

The First Law of Economics: For every economist, there exists an equal and opposite economist.

William
July 26, 2022 2:55 pm

The numbers I think are most important are the actual amount of CO2 in the atmosphere – something less than four one-hundredths of one percent – and the amount of that which is naturally occurring – between 95 and 97%, which we can't do anything about. So the amount of CO2 in the atmosphere from man's industrial and transportation activity is minuscule, and you would have to abandon all common sense to believe that a tiny amount of CO2 has any effect on the earth's temperature or climate.

Ronald Havelock
Reply to  William
July 26, 2022 3:30 pm

Tiny! Tiny! And even the amount of projected warming is tiny (1-2 degrees C in 100 years). Even the tiny, were it not so tiny, would have no meaningfully negative effect on the planet or people. On top of all that, the term "climate change" has no meaning!

Johne Morton
Reply to  William
July 26, 2022 3:56 pm

CO2 does have an effect, just a very small one that decreases logarithmically. After exceeding pre-industrial levels, say beyond 290 ppm, CO2's ability to trap additional heat (longwave IR) drops dramatically. Think of adding layers of clothing on a cold day – the first couple of layers do a lot, the third helps a little more, and beyond that it does little to add additional warmth, but eventually makes you look like the Michelin Man. The whopper of GHGs is good old H2O, which itself varies massively with seasonal changes and very slight blips in global average temperatures (as best we can measure that)…

John in Oz
Reply to  Johne Morton
July 26, 2022 6:25 pm

Similarly, heat water. Once it reaches 100C (at sea level) it will not get hotter no matter how much more heat you apply.

Reply to  John in Oz
July 26, 2022 10:34 pm

Irrelevant comparison
CO2 does not add heat
It impedes heat from escaping into the infinite heat sink of space

Zig Zag Wanderer
Reply to  John in Oz
July 26, 2022 10:40 pm

I dunno. Pressure cookers work perfectly well at sea level…

Peta of Newark
Reply to  Zig Zag Wanderer
July 27, 2022 1:39 am

So do steam engines and steam turbines – if, and only if, you have any fuel for their boilers…

Alasdair
Reply to  John in Oz
July 27, 2022 4:50 am

Yes. And heat an ocean. Once it reaches 30C (at sea level obviously) it will not get hotter no matter how long you apply the heat.

A good thing really, as it means, with the oceans being some 72% of the Earth's surface, that we can never get runaway global warming.

Richard Greene
Reply to  William
July 26, 2022 10:32 pm

Complete nonsense.
About one third of the current 420 ppm of CO2 is manmade.
CO2 did not increase about +50% from 1850 naturally.
nature is still ABSORBING CO2, not a net CO2 emitter.
You are spouting nonsense.

Alasdair
Reply to  Richard Greene
July 27, 2022 5:02 am

You are the one spouting nonsense, Richard. You seem to soak up the propaganda like blotting paper.

Ben Vorlich
Reply to  Richard Greene
July 27, 2022 5:08 am

But is it a problem? Surely the problems begin when we stop adding CO2 and nature is used to extracting what we're adding. The decline will be more rapid than the increase.

Jim Gorman
Reply to  Richard Greene
July 29, 2022 10:08 am

I should ask how you know that for sure. Are you absolutely sure that all sinks and all sources have been found and accurately accounted for? Have all cycles that affect the sinks and sources been found and accounted for?

If there is even a 10% error in the amount sinks take up or in the amount of non-human caused CO2, your assertion would be wrong.

Reply to  William
July 27, 2022 1:31 pm

William,
Indeed.
So, to the nearest one tenth of one percent, there is Zero CO2 in the atmosphere.
Yet our economy, and our society, is being destroyed with that as an excuse.

Auto.

Vuk
July 26, 2022 3:06 pm

Scientist and inventor Nikola Tesla was fascinated by the number 3.
I believe that the number 3 had religious connotations for Tesla; his father was an Orthodox Christian priest, and Orthodox Christians (as I am) cross with 3 fingers (Father, Son & Holy Spirit), and many churches and graves feature the triple cross.
Tesla turned 'Father, Son & Holy Spirit' into 'energy, frequency & vibration', the basis of his understanding of the universe's existence.
He invented three-phase alternating current generators and motors, still in worldwide use.
It is sad he would only stay in a hotel room containing no. 3, walked 3 times around the block, always used 3 napkins with his meals, and all sorts of other nonsense.

Rud Istvan
Reply to  Kip Hansen
July 26, 2022 5:44 pm

Just an observation. Tesla has been a bit mythologized. He did understand AC versus DC. He did understand the implications of higher frequency AC (but not its negative implications). And some of his high frequency AC ideas proved later (with better mathematics) probably not very practical energetically. Always remember, lightning is a DC Helmholtz layer TStorm effect having NOTHING to do with Tesla.

Reply to  Rud Istvan
July 26, 2022 10:06 pm

Rud, my man, I never associated Tesla with lightning, but now the idea will not leave my mind!
Kudos for the most obscure joke ever. Or is it? I have no cookin' clue.

Old Man Winter
July 26, 2022 3:09 pm

Nobama identifying as 42!

[image: bo42.jpg]
Bruce Ploetz
Reply to  Old Man Winter
July 26, 2022 6:16 pm

Of course, the significance of the "number" 42 has nothing to do with math. Decimal 42 is the code for the symbol * in ASCII, also known as the wild card character. That can stand for anything and everything. As in cd c:\ then del *.* – bend over and kiss your hard drive goodbye.
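Easy to verify at a Python prompt:

```python
print(chr(42), ord("*"))  # * 42 -- decimal 42 is the ASCII asterisk, the wildcard
```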

Mr.
Reply to  Bruce Ploetz
July 26, 2022 7:47 pm

Don’t even disclose or tell teenagers about DOS commands.

You know what havoc could be inflicted on pc’s all over the world.

Old Man Winter
Reply to  Mr.
July 26, 2022 8:20 pm

As time went on, less & less of the "really good stuff" became accessible to the end user (EU). This made life as a phone tech much easier, as the most common fix was: Replace EU!

Retired_Engineer_Jim
Reply to  Mr.
July 26, 2022 10:09 pm

It’s OK – they all use AppleDOS, or Unix.

Reply to  Mr.
July 26, 2022 10:23 pm

Assuming that they can find the command prompt. It's buried so far down when you use just mouse clicks that it's effectively not there. (That's why I stick with the "classic" UI on Windows. On the wife's machine, you can't just right-click the Start button and select Run. I fumble around every time.)

Yirgach
Reply to  Bruce Ploetz
July 27, 2022 9:03 am

I think Douglas Adams was the first one to use 42 as a joke.

roaddog
Reply to  Old Man Winter
July 26, 2022 7:06 pm

Objective Reality says:
42 x 0 = 0.

Old Man Winter
Reply to  roaddog
July 26, 2022 7:53 pm

Because he’s a sociopathic narcissist, His Zeroness still identifies as 42!

Old Man Winter
July 26, 2022 3:14 pm

Numbers are tricky & can take an unexpected turn!

[image: liminfin.jpg]
Bob boder
Reply to  Old Man Winter
July 26, 2022 4:40 pm

Lol

Rud Istvan
Reply to  Bob boder
July 26, 2022 5:21 pm

Always a problem as you take any lemma close to its defined zero, no matter how defined. Why more students should learn more math.

John in Oz
Reply to  Old Man Winter
July 26, 2022 6:28 pm

If the student took this to be a visual problem rather than a mathematical one, then he/she/it/pumpkin produced a correct answer.

Old Man Winter
Reply to  John in Oz
July 26, 2022 8:02 pm

While it looked like sheer genius treating numbers as pure symbols, it was more than likely an act of desperation where she got lucky. Been there, done that!

gdt
Reply to  Old Man Winter
July 26, 2022 9:04 pm

I always liked this one

[image: blondeanswer.jpg]
JimH in CA
Reply to  gdt
July 26, 2022 10:28 pm

It's the very useful '3-4-5' triangle.
Carpenters use this to make a 90-degree angled cut, or measurement.

Drake
Reply to  JimH in CA
July 27, 2022 8:47 am

Laying out a house foundation, 3/4/5 gets it square.
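The carpenter's trick is just the Pythagorean theorem with its smallest whole-number solution; a two-line check in Python:

```python
import math

# If the diagonal across a 3-by-4 corner measures exactly 5,
# the corner is a true right angle: 3^2 + 4^2 = 5^2.
print(math.hypot(3, 4))     # 5.0
print(3**2 + 4**2 == 5**2)  # True
```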

DaveW
Reply to  gdt
July 27, 2022 2:21 am

See the Alternative Maths video that LdB links to though.

drh
Reply to  Old Man Winter
July 27, 2022 9:47 am

Here's another one: 16/64 -> cancel the 6's and you get 1/4. New math!!
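The "cancellation" only works by coincidence. A short brute-force sketch (hypothetical, just for fun) finds every two-digit fraction where it does:

```python
from fractions import Fraction

# Find fractions (10a+b)/(10b+c) where "cancelling" the shared digit b
# accidentally gives the right answer a/c.
for a in range(1, 10):
    for b in range(1, 10):
        for c in range(1, 10):
            if a != c and Fraction(10*a + b, 10*b + c) == Fraction(a, c):
                print(f"{10*a + b}/{10*b + c} = {a}/{c}")
# Prints: 16/64 = 1/4, 19/95 = 1/5, 26/65 = 2/5, 49/98 = 4/8
```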

Peta of Newark
Reply to  Old Man Winter
July 27, 2022 1:54 am

So perfectly describes why Climate Scientists get away with calling climate a 'non-linear system'.
Such is their maths – childlike scribblings given meaning.

A true non-linear system contains a singularity – the answer can be anything you want it to be = classically 42 or the ‘wild card’

Thus enter ‘reification’ = a lovely word to describe Magical Thinking = the process where (chronically chemically) depressed people brain-wash themselves.
iow: They apply the MSDOS “del *.*” command to their own minds

It's very easily done – so easy in fact that creatures with only one brain cell do it.
They do it because they are desperately hungry. In that starvation state they start eating whatever they can and reification kicks in – they then truly believe that 'the wrong thing' is actually good for them.
Hopeless ("Oh, I can handle it") alcoholics being the perfect example – convinced that booze is keeping them alive. (In really advanced states it actually is – suddenly stopping will kill.)

2 nice examples being swarming locusts and John Kerry (Thank Fug there’s only one of him – or is there?)
Biden is not in the race = his braincell count is the divisor in the above equation

edit to PS
You do see the significant fail of Climate Science – as the author here states.
Climate Science gives reality to something that is not real.
i.e. Temperature.

Temperature is a dimensionless quantity – it has no tangible or palpable reality.

It can represent reality, but you have to carefully describe the object that you are taking the temperature of – you have to attach some real tangible things (metres, kilograms, seconds) to the thing you are recording the temp of.

Thus Climate Science is one humongous lie – a lie by omission in that it never defines the ‘dimensions’ of what it’s recording the temp of.

The most basic omission is that the water content is never mentioned.

But to do so requires an admission that water controls climate
So simple, even a child could understand.
(Now do we see how deep the doo-doo we’re now in)

Clarky of Oz
Reply to  Old Man Winter
July 27, 2022 7:29 pm

Good one.

We had a "science teacher" in high school who asked everyone in the class to bring in some ice cubes wrapped in a towel. She needed dry ice for an experiment. True story. A little later my father went to the headmaster and had her sacked after another episode.

Neville
July 26, 2022 3:26 pm

Well Kip, is Dr Rosling wrong when he claims he used 120,000 data points to display the countries of the world from 1810 to 2010?
Of course he has used UN data, and this optimistic 5 minute video wrecks all the alarmists' arguments about a climate EMERGENCY or CRISIS or even Biden's so-called EXISTENTIAL threat.
And Willis Eschenbach's "Where's the Emergency" article also requires a lot of numbers to test all of their alarmist claims. And he does this point by point.
And Dr Christy also tested their data point by point and came to a similar conclusion.
AGAIN, here's Dr Rosling's video. Any comments, anyone?

Old Man Winter
Reply to  Neville
July 26, 2022 3:58 pm

Spectacular use of graphics by Dr Rosling to prove an important fact. It's too bad my teachers didn't have something like this when I was younger.

Rud Istvan
July 26, 2022 3:27 pm

Kip, separate comment based on your observation ‘too much math, not enough thinking.’

When I was learning math modeling and econometrics at my University, there were no Mathematica packages, no PCs with Excels and Rs. The University's Aikens Computer Center housed an IBM 340 with ferrite core memory, so max allowable program RAM was 250 kilobytes. (So I got an A for my discrete step Harvard Square traffic jam simulation model NOT because I could run it (needed more than 250kb to simulate the 7 streets feeding the Square at rush hour with each byte a vehicle in time and space) but rather because the professor was impressed with the hundreds of lines of Fortran code modeling the problem.) Much of what I had to learn was based on thinking a lot, then a little – but hard to do – backup math to make sure you were in the ballpark.

These days, with all the available PC programs, numbers produced by ‘math’ are ‘easy’, while thinking is not only still ‘hard’, but ‘unproductive’ since it takes time. Plus, not that many Uni educated types these days can think at all, let alone critically. I offer AOC as exhibit A. Which IMO results in a lot of the ‘publish or perish’ irreproducible junk statistical science in fields as diverse as medicine and climate that you and others allude to.

Sparko
Reply to  Rud Istvan
July 26, 2022 3:46 pm

Take their computers away, and they would be absolutely lost.
I suspect a large number of these junk papers are generated by chucking numbers into random stats packages until they get something that looks interesting, and then trying to work backwards to justify it.

Rud Istvan
Reply to  Kip Hansen
July 26, 2022 4:55 pm

Kip, totally agree. Have often wondered, what would ‘science’ now be if we had to go back to hand threaded three wire ferrite cores. Of course, an idle speculation. I would miss many modern miracles like this iPad. But we do not adequately recognize the ‘easy but wrong’ side effects.

John in Oz
Reply to  Kip Hansen
July 26, 2022 6:37 pm

You really want us to believe ‘for my wife’ is the reason you buy them?

Geoff Sherrington
Reply to  Kip Hansen
July 26, 2022 7:42 pm

Kip,
In 1976 I flew (economy) with colleague Albert Berkavicius from Sydney to Los Angeles. His primary purpose was to remove and replace an identified faulty ferrite core in the board he carried, which from memory had 256 of them. We were allowed to watch the procedure, the taking out and putting back of the 4 thin wires through the loops. It worked. This story helps to show the value then placed in the emerging beast called the computer.
Geoff S

Rick C
Reply to  Rud Istvan
July 26, 2022 4:44 pm

Rud: We must be about the same age. When I learned to program in FORTRAN the hard part was done with a pencil and paper – creating a detailed flow chart of the steps required to solve the problem. Once you had figured that out, writing the code to execute the steps was easy. It was then just a matter of compiling and running to obtain the error codes that would lead to discovery of typos and logic errors – aka "debugging". Of course, each run meant submitting your deck of punch cards to the computing center and waiting sometimes hours to get the printout of results, which was often something like "syntax error line 455". Played a lot of Bridge with other CS students in the break room while waiting.

Rud Istvan
Reply to  Rick C
July 26, 2022 5:29 pm

My thesis card punch decks always used a big magic marker top X. Because they almost never ran the first time due to IBM CTL goofs, and almost always came back with the card sequence goofed. Ah, memories.

John Hultquist
Reply to  Rick C
July 26, 2022 9:39 pm

Early versions of that period, I think:
FORTRAN, or FORTRAN II D, or FORTRAN IV

Geoff Sherrington
Reply to  Rud Istvan
July 26, 2022 5:35 pm

Rud,
Ten out of ten. Your comments resonate strongly with my experiences. My intro to computing was to write a perpetual calendar in machine language. Others in the class did much better, so I made a decision right then to leave programming to those with a bent for it, to free up my time for thinking about solutions, including those responding to computing. Thanks. Geoff S

Greg Bacon
Reply to  Rud Istvan
July 26, 2022 8:00 pm
gdt
Reply to  Greg Bacon
July 26, 2022 9:23 pm

Likely meant a 360 which was the machine of the 1960s. Definitely had core memory

Thallstd
July 26, 2022 3:39 pm

I look forward to part 2. As I was reading part 1, before reaching the end, I thought about the claim I've often seen here that a global average temperature is a useless measure. Is it that the number itself is really useless? (What other way would we have to know if we are in a "global" warming or cooling trend and whether it is likely to be beneficial or catastrophic?) Or is it that the means of arriving at said number are dubious/suspect/insufficient? I hope to find some answers in part 2.

Carlo, Monte
Reply to  Thallstd
July 26, 2022 3:59 pm

Think about it this way, reduce the question down to two different locations: what does it mean to average all the temperatures from Cut Bank, Montana with Rio de Janeiro?

Rud Istvan
Reply to  Thallstd
July 26, 2022 4:01 pm

A partial answer, IMO. GAST is meaningless. But the GAST anomaly is not (e.g. UAH anomaly). The problem with anomalies is that they hide the very large absolute divergence in IPCC climate modeled temperatures. See essay ‘Models all the way down’ in ebook Blowing Smoke for an illustrated and referenced example of this problem.

Tim Gorman
Reply to  Thallstd
July 26, 2022 4:29 pm

Climate is the entire temperature profile at a location. Every time you take an average you lose part of the data needed to evaluate the climate. In calculating the daily mid-range value you lose data about the temperature profile, since multiple different minimum/maximum temps can give the same mid-range value – you have lost what happened in reality. When you then find a monthly average using those averages you lose more data; different daily mid-range values can result in the same monthly average, so how do you tell what is happening in reality? Then when you average monthly averages to get an annual average you lose even more data; you don't know what happened in reality, since different monthly averages can result in the same annual average value. Now average all those averages one more time on a global basis, and while you come up with a number, what does it actually mean in reality? There is no place on the globe where you can find and measure that value. What does it actually tell you?

If you can’t measure it then does it really exist?

This doesn’t even get into propagating the uncertainties of the initial measurements through each average calculation.
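A minimal sketch of the information loss Tim describes, with made-up temperatures: two very different days collapse to the same daily mid-range value.

```python
# Two very different temperature days (degrees C)...
mild_day    = {"min": 15.0, "max": 25.0}
extreme_day = {"min":  5.0, "max": 35.0}

# ...produce the identical daily mid-range value:
for day in (mild_day, extreme_day):
    print((day["min"] + day["max"]) / 2)  # 20.0 both times

# The difference between a 10-degree and a 30-degree daily swing -- the
# actual climate information -- is gone, and each further averaging step
# (monthly, annual, global) repeats the same loss.
```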

gdt
Reply to  Thallstd
July 26, 2022 9:15 pm

The issue is that temperature is not a good indicator of heat. Enthalpy (energy content – SI unit Joules per kilogram) is the correct measure.

By illustration 40C and 20% humidity has an enthalpy of 550 J/kg. 32C and 80% humidity is 2080 J/kg so is actually much “hotter”.

Maybe Kip will address this in part 2
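For anyone who wants to put numbers on gdt's comparison, here is a rough sketch (mine, not from the comment) using the Magnus approximation for saturation vapour pressure and the usual moist-air enthalpy formula, with sea-level pressure assumed. It yields kJ per kg of dry air and somewhat different figures from those quoted above, but the same ordering: the warm, humid air carries more energy than the hot, dry air.

```python
import math

P = 1013.25  # assumed total pressure, hPa (sea level)

def enthalpy_kj_per_kg(temp_c: float, rh: float) -> float:
    """Approximate specific enthalpy of moist air, kJ per kg of dry air."""
    e_sat = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))  # Magnus, hPa
    e = rh * e_sat                  # actual vapour pressure
    w = 0.622 * e / (P - e)         # humidity ratio, kg water per kg dry air
    # sensible heat of the dry air + latent and sensible heat of the vapour
    return 1.006 * temp_c + w * (2501.0 + 1.86 * temp_c)

print(round(enthalpy_kj_per_kg(40, 0.20), 1))  # hot and dry    -> about 64
print(round(enthalpy_kj_per_kg(32, 0.80), 1))  # warm and humid -> about 94
```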

Zig Zag Wanderer
Reply to  gdt
July 26, 2022 10:51 pm

This is my biggest problem with atmospheric temperature measurements of any kind.

What we really need is a calculation of the total heat energy in the entire global climate system, ie the oceans and the atmosphere. To be absolutely correct, we should also include a certain amount of the ground too, since that is affected by the climate.

If we can get all that, accurate to 100th of a joule, on at least an hourly, and preferably per minute, basis, for the entire planet, I’ll start taking the numbers seriously.

Tim Gorman
Reply to  gdt
July 28, 2022 8:35 am

Pressure also figures into this, so temperatures taken at different elevations create different enthalpies as well. You very seldom see temperatures go up when a cold front passes through, while pressure changes significantly.

Clarky of Oz
Reply to  Thallstd
July 27, 2022 7:43 pm

I think a global average temperature is about as useful as the average of all the house numbers in my street. However I would be happy to be corrected.

I have no formal science of mathematical qualifications.

Carlo, Monte
July 26, 2022 3:50 pm

Excellent, Kip!

“the data” — is an oxymoron because the word is plural!

Chris Hanley
Reply to  Carlo, Monte
July 26, 2022 4:56 pm

WUWT is a blog not a specialized scientific journal: “In modern non-scientific use, however, it is generally not treated as a plural. Instead, it is treated as a mass noun, similar to a word like information, which takes a singular verb. Sentences such as data was collected over a number of years are now widely accepted in standard English” (Oxford).

Clyde Spencer
Reply to  Kip Hansen
July 26, 2022 8:31 pm

But it is probably not a wise move in that one has sacrificed a word (datum) and ended up with less precision by having to use one word to describe two sets — a set with one entry and sets with many entries. What advantage is provided by collapsing two similar words into a single word, other than not having to worry about verb-noun agreement? However, one can always decide not to worry about noun-verb agreement — if they don’t mind people viewing them as illiterate.

JeffC
Reply to  Kip Hansen
July 27, 2022 2:23 am

The trouble is that definitions change out of ignorance and misuse.

Carlo, Monte
Reply to  Chris Hanley
July 26, 2022 6:03 pm

Still looks wrong from here, even if the Oxford eggheads hath decreed it not!

DaveW
Reply to  Chris Hanley
July 27, 2022 2:30 am

It is interesting to me how many ‘irregular’ (in the sense of based on Greek or Latin) singulars are disappearing from the language. You never see bacterium, larva, phenomenon any more, it is always bacteria, larvae, phenomena. I used to struggle against this trend, sad pedant that I am, but I wonder if it isn’t just the normal evolution of language, rather than a lowering of standards.

Bellman
Reply to  Carlo, Monte
July 27, 2022 5:56 am

Why do you think “the” can’t be used with plurals?

Bellman
Reply to  Carlo, Monte
July 27, 2022 6:13 am

Even if you did object to data being used as a singular, it wouldn’t make it an oxymoron, just bad grammar.

Neville
July 26, 2022 4:10 pm

Does anyone understand the numbers that prompted the Biden donkey to declare that we have an EXISTENTIAL threat?
He even claims we can “feel the threat in our bones”, whatever that means?
Just watch this silly nonsense and fear for our future and ask yourself how we’ve fallen for this lunacy?



Carlo, Monte
Reply to  Neville
July 27, 2022 6:25 am

It’s the Twilight Zone.

Carlo, Monte
Reply to  Kip Hansen
July 27, 2022 11:31 am

“Help me, Mr. Wizard!”

mario lento
July 26, 2022 4:10 pm

I've always had a problem putting meaning behind temperature, because without knowing the constituents behind the temperature (the relative or absolute humidity, pressure, and other particles), the number is not close to being precisely anything of value! Further averaging this poorly defined number goes down the rabbit hole of "you lost me already".

Mario Lento
Reply to  Kip Hansen
July 26, 2022 4:46 pm

Yes, that is correct. Delta temperature is used by many to "falsely" prove the earth is trapping energy. That is a sleight-of-hand argument.

So I posit that measuring temperature without knowing the constituents of the air being measured is meaningless precisely because the amount of energy is not known simply by a temperature measurement without considering the other components I mentioned.

And I need to explain: when I wrote in my previous post "you lost me already", I was not referring to the author of this nice post. I was speaking generally to those others who don't know the difference between heat and temperature.

mario lento
Reply to  Kip Hansen
July 27, 2022 8:51 am

I figured that engineering units assigned to numbers would be a natural segue 🙂

markl
July 26, 2022 5:13 pm

With you so far, looking forward to part 2.

Steven Candy
July 26, 2022 5:46 pm

You do come across as a bit of a statistics-a-phobe. "All Models Are Wrong, Some are Useful" – George Box, 1976. Fair (agenda-free) statistical analyses, including summary statistics, empirical model-based summarisations, and theoretical model validations, are an essential part of science. These all should consider the appropriateness of the data, its measurement, measurement errors, and adequacy for the research question being addressed; and all subsequent uses of the data should be closely scrutinised, peer reviewed (unfortunately a very fallible system), and subject to re-analysis (requiring all data and code to be freely available). Otherwise, what is your alternative? The biggest hindrance is narrative-biased peer review. Fully transparent peer review would help.

Steven Candy
Reply to  Kip Hansen
July 27, 2022 6:31 am

Kip that’s fine but “we must not throw the baby out with the bathwater” as the saying goes. As a PhD-level applied statistician with over 40 years experience in statistical modelling/analysis mostly in forestry/fisheries/ecology I have seen many inappropriate to flat-out wrong analyses. A thing I get a lot of satisfaction out of is in helping collaborating subject-area researchers clarify and convert their research questions into valid and best-practice statistical models and inferences. cheers Steve

Richard Page
Reply to  Steven Candy
July 27, 2022 9:32 am

Mathematical analyses can be useful in supporting research work and formatting data into easily visible summaries. However what we are seeing currently is that the mathematical analysis IS the research work and actual data is becoming more and more irrelevant. It’s a disturbing trend.

Old Cocky
Reply to  Kip Hansen
July 27, 2022 2:44 pm

Somebody (Roman M, I think) pointed out at Climate Audit ages ago that there actually is a statistically correct way to handle significance in such data mining analyses.

Re: the students pushing the ANOVA button. Easily accessible powerful statistical analysis programs* allow people with no conceptual grasp of the area to throw numbers into the machine and produce something. They then think that particular something is important because it’s statistics and the computer said so 🙁

[*] it’s rather a stretch to include ANOVA here, but it’s the concept, not the detail…

Geoff Sherrington
July 26, 2022 5:47 pm

Kip,
Looking forward to your next part, which might be numbered 2.
We are all shaped by a past life of selecting what we like and what we were taught, and sometimes disagreeing with others.
I was taught that in Earth Sciences, at least, a derived number is incomplete unless it has another number expressing its uncertainty. Not all agree. I have been trying for 6 years to get our BOM to disclose the uncertainty that they place on routine daily temperature measurements. Still not there; I have some of their estimates for the instruments, but nothing yet that includes the setting, like screen errors.
Without proper uncertainties, one cannot claim a temperature as a record because it might be just random noise. This goes straight to your comments about numbers measuring something but not being that something.
Let’s push for more uncertainty estimates, done by the book. Geoff S

Clyde Spencer
Reply to  Geoff Sherrington
July 26, 2022 8:36 pm

Geoff, don’t you realize that climatologists have to take the equivalent of the Hippocratic Oath — “First, do no harm.” It is called the Hypocrite Oath and is — “First, never admit uncertainty.”

Jim Gorman
Reply to  Kip Hansen
July 27, 2022 11:13 am

I agree. Plus, the use of SEM as uncertainty is totally wrong. SEM is statistical error, not measurement uncertainty. In other words, it tells you how closely the sample mean estimates the population mean; it tells you nothing of measurement uncertainty.

Most metrology teaches that if you have a small number of samples, like 10 experiments, the uncertainty is best expressed by Standard Deviation of the 10 experiments. This is actually the Standard Deviation of sample Means (SEM). You can’t divide the SEM by the √N because you get a worthless number.
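For reference, the textbook relationship the thread is arguing over, sketched in a few lines of Python with made-up sample values:

```python
import math
import statistics

sample = [5.98, 6.01, 6.03, 5.97, 6.02, 5.99, 6.04, 5.96, 6.00, 6.01]

sd = statistics.stdev(sample)       # spread of the individual values
sem = sd / math.sqrt(len(sample))   # textbook standard error of the mean

print(f"mean = {statistics.mean(sample):.3f}")
print(f"SD   = {sd:.4f}")   # describes the data themselves
print(f"SEM  = {sem:.4f}")  # describes only the precision of the estimated mean
```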

Carlo, Monte
Reply to  Jim Gorman
July 27, 2022 11:35 am

And for temperature, you get exactly one chance for the measurement, then it is gone forever. √N = 1.

Bellman
Reply to  Jim Gorman
July 27, 2022 3:13 pm

You can’t divide the SEM by the √N because you get a worthless number.

Why would you want to divide SEM by √N? You divide the standard deviation by √N to get the SEM.

Most metrology teaches that if you have a small number of samples, like 10 experiments, the uncertainty is best expressed by Standard Deviation of the 10 experiments.

What do you mean by “experiments” in this case?

Jim Gorman
Reply to  Bellman
July 27, 2022 6:12 pm

Experiments can be anything. Samples of a production run, a chemical reaction, you name it.

How many folks on here have said that to find the uncertainty of the Global Average Temperature, you divide the Standard Deviation by the √N? Besides that not being uncertainty, you are dealing with samples. Every daily average is of samples. Every monthly average is of samples. Every annual average is of samples. That is all there is – Samples. You can't divide by √10,000 stations and get a number that means anything.

Kip Hansen has started a series on numbers. I suggest you go there and begin to learn what you don’t know about metrology.

Bellman
Reply to  Jim Gorman
July 27, 2022 6:52 pm

But there’s a difference between an experiment that is taking just one value, and an experiment that results in a sample of values.

How many folks on here have said that to find the uncertainty of the Global Average Temperature, you divide the Standard Deviation by the √N?

Nobody as far as I’m aware. I’ve always said that the uncertainty of a global average is a complicated process. All I’ve said is that is the standard way to calculate the SEM and the SEM is a measure of the uncertainty of the mean, with a lot of caveats.

Kip Hansen has started a series on numbers.

Thanks, but based on his comments I’m not sure he understands averaging better than you.

Jim Gorman
Reply to  Bellman
July 28, 2022 6:38 am

Wake up. Read studies and see what they do.

The NIH has even recognized the problem.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2959222/#

"However, many authors incorrectly use SEM as a descriptive statistics to summarize the variability in their data because it is less than the SD, implying incorrectly that their measurements are more precise."

Bellman
Reply to  Jim Gorman
July 28, 2022 7:40 am

You keep linking this.

Yes, SEM is not a descriptive statistic, it's inferential. If you use the SEM and imply it's the SD you are wrong. If you use the SD when you mean the SEM you are wrong. And none of this means you can estimate the uncertainty of a global anomaly average simply by dividing the SD by √N.

Geoff Sherrington
Reply to  Kip Hansen
July 27, 2022 9:28 pm

Yes, Kip,
I know that you have pressed for better uncertainty analysis from the beginning. I did not mean to imply that you were lax; I simply accepted that both you and I and most readers knew it.
Geoff S

tygrus
July 26, 2022 6:09 pm

You can use temperature to estimate the energy level of a known substance when the volume & pressure remain the same (see its "Specific heat capacity"). BUT …

  • It becomes more complicated when a change of state is involved or the substance is not pure. A 1K rise of 1kg of sea water IS NOT the same change of energy as 1K rise of 1kg of air. A 1K change of ice != 1K change of water != 1K change of H2O gas (in terms of energy change).
  • Air & sea water are mixtures, their composition varies based on altitude & location, pressure/volume in the area being measured can vary hour to hour, pressure/volume varies from location to location, affected by weather & it affects weather.

So averaging temperatures is not a measurement of energy when volume, pressure, and/or composition vary per location & over time, as occurs in meteorology (then aggregated over time as climatology).
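A small sketch putting rough numbers on the first bullet, using standard approximate specific heat capacities and Q = m · c · ΔT:

```python
# Approximate specific heat capacities, J/(kg*K)
c = {"sea water": 3990.0, "fresh water": 4186.0, "ice": 2108.0, "air": 1005.0}

m, dT = 1.0, 1.0  # 1 kg warmed by 1 K
for substance, cap in c.items():
    print(f"1 K rise of 1 kg of {substance}: {m * cap * dT:.0f} J")

# Identical temperature changes correspond to very different energy
# changes -- which is why an average of temperatures is not energy.
```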

tommyboy
July 26, 2022 6:13 pm

We can distill the dissertation.
“Lies, Damn Lies and statistics”

goldminor
July 26, 2022 6:32 pm

Excellent post.

Steven Candy
July 26, 2022 7:21 pm

"like Global Average Surface Temperature, an entirely imaginary nonphysical number." In principle, a Global Average Surface Temperature based on a census of every defined surface unit and time interval – to be theoretical in the extreme, let's say every m² and every second – and then averaged over all areal units in space and over a contiguous set of time units (say a year), is a valid statistic of a valid measurement (say instrumental) that approximates a physical reality. The problem is not in the theory but in the historical sampling realisations of the population of census values. Dealing with highly unbalanced sampling over space and time, time trend analysis of means, the uncertainty of the time trend, and attempting to adjust for temporal and/or spatial confounders – this is my effort in modelling an analogous set of very unbalanced data for long-term trend in the mean and its uncertainty (https://journalarrb.com/index.php/ARRB/article/view/30460)

Steven Candy
Reply to  Kip Hansen
July 27, 2022 6:15 pm

Kip, so here in southern Tasmania, when the maximum daily air temperature can be about 10 degC this time of year, if I was a poikilotherm (like those Tasmanians to be wary of – the tiger snake) I would curl up on a rock in the sun and not do much else. So air temperature and internal body temperature have a strong relationship, with body heat content obviously functionally dependent on air temperature (with some insolation warming of the rock I am curled up on), not the other way around. I modelled coleopteran development in my PhD (a poikilothermic order), and a thermal sum with a lower temperature threshold is very well established as an excellent predictor of insect development through immature stages (larval instars). "They should be counting HEAT content" – good luck with that. I wouldn't have been able to do my PhD if I had to put tiny-weeny temperature probes in all those thousands of gum leaf beetles and weigh each one individually to estimate their heat content. Have you ever done any field-based research?

Steven Candy
Reply to  Steven Candy
July 27, 2022 7:54 pm

I forgot to mention the insolation warming of the black tiger snake itself. One contemporary PhD I communicated with did glue small temperature probes on the underside of the abdomen of a few gum leaf beetles to estimate the additional effect of insolation, but a probe inside the beetle would have been ideal – then again they, like us, would have found it difficult to go about their business happily with a metal spike up their rear end! :-}

Greg Bacon
July 26, 2022 7:29 pm

"scientific philosophy of numbers"
..
No such thing, Kip. Posting a word salad is dumb.

Clyde Spencer
Reply to  Greg Bacon
July 26, 2022 8:39 pm

And, I don’t think much of your Caesar salad — even with bacon.

Last edited 14 days ago by Clyde Spencer
Paul Redfern
July 26, 2022 8:43 pm

THE NUMBERS ARE MEANINGLESS, BUT THE TRENDS ARE IMPORTANT.
— LARRY BURGGRAF
https://twitter.com/gaussianquotes/status/778646351022260224

Richard Page
Reply to  Kip Hansen
July 27, 2022 9:37 am

Now that’s what you might call a facepalm moment. The quote given by Paul Redfern may be quite accurate but it still makes no sense as it stands. Try inserting the word “alone” between “numbers” and the first “are”; makes slightly more sense then.

Jim Gorman
Reply to  Kip Hansen
July 27, 2022 11:31 am

I've used this example before and I'll make it short. I make shafts that are supposed to be 6″ ±0.01″. My cutter is broken and it makes shafts between 5″ – 7″. I make 1000 of them and by golly the mean is 6″ and the Standard Deviation / √N is a couple of hundredths of an inch. Will my customer be happy? Does the average have any meaning? How about the SEM, is it meaningful?
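A quick R simulation of this scenario (assuming, purely for illustration, that the broken cutter produces sizes uniformly between 5″ and 7″):

set.seed(42)
shafts <- runif(1000, min = 5, max = 7)   # the broken cutter: anywhere from 5 to 7 inches
mean(shafts)                   # ~6: the mean looks perfect
sd(shafts)                     # ~0.58: the spread is enormous
sd(shafts) / sqrt(1000)        # ~0.018: the SEM is tiny
mean(abs(shafts - 6) <= 0.01)  # ~0.01: only about 1% meet the 6 +/- 0.01 inch spec

The SEM shrinks with N no matter how bad the individual shafts are; only the SD speaks to the tolerance question.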

Bellman
Reply to  Jim Gorman
July 27, 2022 3:06 pm

And as always your examples are ones where the SEM is not useful, and you then conclude it can never be useful.

If all your shafts have to be within a certain tolerance then you need to look at the standard deviation, not the SEM. If on the other hand it didn't matter what the exact size of your shafts was, but it was important that the average was close to 6″, then you might want to take samples of a sufficient size to alert you to a change in the average. Then you need to know the SEM.

The main reason for wanting to know the SEM is in hypothesis testing, not in building things all of the same size. You want to know if two samples are from the same population, if one treatment is better than another, if a chemical is increasing the chance of getting ill. You can't do that by assuming everything is the same; you have to take samples, and you have to know what the expected error of the mean will be.

Jim Gorman
Reply to  Bellman
July 27, 2022 6:02 pm

As usual you are out standing in the cold. Here are some references.

https://www.investopedia.com/ask/answers/042415/what-difference-between-standard-error-means-and-standard-deviation.asp

“researchers should remember that the calculations for SD and SEM include different statistical inferences, each of them with its own meaning. SD is the dispersion of data in a normal distribution. In other words, SD indicates how accurately the mean represents sample data.

However, the meaning of SEM includes statistical inference based on the sampling distribution. SEM is the SD of the theoretical distribution of the sample means (the sampling distribution).” (bold by me.)

Read this carefully. The SEM (Standard Error of the sample Mean) is the SD (Standard Deviation) of the sample means distribution.

Did you get that? A test of a small number of times is a SAMPLE! The mean of that sample is the average value. The SD of that sample is the SEM.

Look on youtube for standard error of the mean. There are literally hundreds of videos that explain this.

Bellman
Reply to  Jim Gorman
July 27, 2022 6:46 pm

Every time I try to explain why you might want to know the SEM, you give me some link that explains the difference between standard error of the mean and the standard deviation of a sample. Thanks, but I’m fully aware of the difference.

“Did you get that?”

Yes. But you still don’t, as your next statement makes clear.

A test of a small number of times is a SAMPLE! The mean of that sample is the average value. The SD of that sample is the SEM.

The SD of a sample is not the SEM.

As so often you seem to get hung up on the odd phrasing. “Sampling distribution” does not mean the distribution of a sample. It means the distribution of all possible samples.

https://en.wikipedia.org/wiki/Sampling_distribution

If an arbitrarily large number of samples, each involving multiple observations (data points), were separately used in order to compute one value of a statistic (such as, for example, the sample mean or sample variance) for each sample, then the sampling distribution is the probability distribution of the values that the statistic takes on.

See the words “arbitrarily large number of samples”, not one sample.
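A short R simulation of that definition, with a large-but-finite number of samples standing in for “arbitrarily large” (the population here is assumed, purely for illustration):

set.seed(1)
pop <- rnorm(1e6, mean = 10, sd = 5)   # an assumed population
n <- 25                                # size of each sample
sample_means <- replicate(10000, mean(sample(pop, n)))
sd(sample_means)    # ~1.0: SD of the (approximated) sampling distribution of the mean
sd(pop) / sqrt(n)   # ~1.0: matches sigma / sqrt(n), i.e. the SEM

The SD of the 10,000 sample means matches σ/√n, not the SD of any single sample.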

Steven Candy
Reply to  Jim Gorman
July 27, 2022 10:04 pm

“SD is the dispersion of data in a normal distribution.” Wrong: the SD is a sample statistic which when squared along with sample mean are for a normal distribution the joint sufficient statistics for the population variance and mean, respectively. The maximum likelihood estimate of the population variance is slightly biased because it divides the sum of squares by N and not N-1.
“SD indicates how accurately the mean represents sample data”: wrong; the SEM estimates the accuracy of the mean as an estimate of the population mean.
“SEM is the SD of the theoretical distribution of the sample means (the sampling distribution)”: correct if you change to “SEM when squared is an estimate of the variance of …”
“A test of a small number of times is a SAMPLE”: what a mangled sentence; what is it trying to say?
“The SD of that sample is the SEM.” what sample? the SEM is the SD of the sample divided by √N; or alternatively you could use bootstrap resampling to draw a size-M sample of means and estimate the SEM from that sample of means as their SD, but DO NOT divide this SD by the square root of M.

Don't go to https://www.investopedia.com/ask/answers/ whatever; go to statistics textbooks or talk to a bona fide statistician.
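A minimal R sketch of the bootstrap route described above, on an assumed toy sample (the numbers are invented for illustration):

set.seed(2)
x <- rnorm(50, mean = 20, sd = 4)   # a single sample of N = 50
sd(x) / sqrt(length(x))             # classical SEM estimate: sample SD / sqrt(N)
boot_means <- replicate(10000, mean(sample(x, replace = TRUE)))
sd(boot_means)   # bootstrap SEM: the SD of the resampled means, NOT divided again by sqrt(M)

Both routes give roughly the same answer; the bootstrap just makes the sampling distribution concrete.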

Steven Candy
Reply to  Steven Candy
July 28, 2022 12:40 am

These standard results assume equal-probability random sampling (or simple random sampling), so that all units in the population have an equal chance of being selected and selection is random. There are other unequal-probability random sampling schemes I have used, such as list sampling and response-biased sampling, which require these sampling probabilities to be incorporated in estimation using, for example, Horvitz-Thompson estimation of the mean.
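A toy R illustration of Horvitz-Thompson estimation under unequal-probability sampling (the population and inclusion probabilities are invented for the example):

set.seed(3)
N <- 1000
y <- rgamma(N, shape = 2, scale = 10)   # an invented population
p <- pmin(1, 0.05 * y / mean(y))        # inclusion probability roughly proportional to size
s <- which(runif(N) < p)                # Poisson sampling with those probabilities
mean(y[s])             # naive sample mean: biased upward, since big units are over-sampled
sum(y[s] / p[s]) / N   # Horvitz-Thompson estimate of the population mean
mean(y)                # true population mean, for comparison

Weighting each sampled unit by 1/p removes the bias the unequal selection probabilities would otherwise introduce.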

Last edited 12 days ago by Steven Candy
Tim Gorman
Reply to  Steven Candy
July 28, 2022 9:15 am

“Wrong: the SD is a sample statistic which when squared along with sample mean are for a normal distribution the joint sufficient statistics for the population variance and mean, respectively.”

How does this apply to temperature measurements, which are *NOT* normally distributed? E.g. northern hemisphere temps mixed with southern hemisphere temps – at least a bimodal distribution. Or summer temps mixed with winter temps, when each has a different variance (e.g. SH vs NH temps).

Since temperatures are multiple measurements of different things using different devices, there is no guarantee you will get a normal distribution. In a bi-modal distribution, what do the mean and standard deviation really tell you?

Steven Candy
Reply to  Tim Gorman
July 29, 2022 12:59 am

One can resort to the Central Limit Theorem that asymptotically the sampling distribution of the mean approaches a normal distribution as the sample size N increases (https://en.wikipedia.org/wiki/Central_limit_theorem). However, I would suggest in modelling the long term trend in average global surface temperature that one combines fixed effects (e.g. latitude, altitude, land, ocean, sinusoidal function of days since winter solstice, albedo etc) and random effects (e.g. weather station, or grid square for oceans, days etc) in a linear mixed model and calculates averages across levels of both fixed (except year) and random effects (converting continuous fixed effects to ordered categorical variables). You could assume a normal distribution for temperature since given the above fixed effects it is the residual distribution about the local mean that we need to model. The uncertainty could be modelled using Markov Chain Monte Carlo estimation/sampling (see my paper for something like this approach: https://journalarrb.com/index.php/ARRB/article/view/30460). The fixed effects take out the bimodality you mention since these modes are modelled by the fixed effects. I would also build in different error variances given the type of measurement (eg weather station versus remote sensing). The problem becomes a more complex modelling problem when weather stations come and go and there are confounding effects, like urban heat island effects on the trend for a station, to name a couple of issues.
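As a rough sketch of the kind of fixed-plus-random-effects structure being described, here is what it might look like in lme4 (every variable name below is a hypothetical placeholder; the linked paper uses MCMC machinery rather than this function):

library(lme4)
# temp_df and all variables below are hypothetical placeholders, not a real dataset
fit <- lmer(temperature ~ latitude + altitude + land_ocean +
              sin(2 * pi * days_since_solstice / 365.25) +
              (1 | station) + (1 | day_f),
            data = temp_df)
summary(fit)   # fixed effects absorb the geographic/seasonal structure (the "modes");
               # random effects capture station-to-station and day-to-day variation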

Tim Gorman
Reply to  Steven Candy
July 29, 2022 5:43 am

“One can resort to the Central Limit Theorem that asymptotically the sampling distribution of the mean approaches a normal distribution as the sample size N increases”

All that does is decrease the interval in which the calculated mean might lie. It does not represent the uncertainty of that mean in any way. Each measured data element (i.e. temperature) should be given as “stated value +/- measurement uncertainty”. You *must* include the propagated measurement uncertainty along with each sample mean. Those sample mean uncertainties must be propagated onto the average of the sample means if you truly want the uncertainty of the mean.

” You could assume a normal distribution for temperature since given the above fixed effects it is the residual distribution about the local mean that we need to model. The uncertainty could be modelled using Markov Chain Monte Carlo estimation/sampling (see my paper for something like this approach:”

You simply cannot assume a normal distribution for temperature. That would only occur if you are measuring the same thing multiple times. Temperatures are multiple measurements of different things using different devices. There is simply no way you can assume a normal distribution of the values you gather. This also means that Monte Carlo estimation will not work, that method is also based on generating random, normal distributions of values. Since you are only taking single measurements of an object there isn’t a random, normal distribution from which to evaluate uncertainty for each measurement.

For a global average temperature you must combine northern hemisphere temps with southern hemisphere temps. For the same month, one will be measuring summer temps in one hemisphere and winter temps in the other. This actually results in at least a bi-modal distribution. Not only that, winter temperatures usually have a higher variance than summer temps. Combining independent data sets with different variances is not the simple exercise of just jamming the data together and finding an average.

“The fixed effects take out the bimodality you mention since these modes are modelled by the fixed effects.”

How do you do this? If temps in the SH range from 0C to 10C and in the NH from 10C to 20C how do fixed effects remove such a bimodality?

Climate is the entire temperature profile. Every time you take an average you lose data; you no longer have the entire temperature profile. A mid-range daily temperature can be 15C. Can *YOU* tell what minimum and maximum temperatures generated that value? I can't. When you then average 30 daily mid-range values to get a monthly average of 12C, can you tell what daily mid-range values generated that average? I can't. When you then average 12 monthly averages to get an annual average of 8C, can you tell what values generated that average? I can't.

If you can't follow the temperature profile throughout all the averaging, then what do you really know about the global climate? Using anomalies doesn't help – they just further mask the actual temperature profiles!

Steven Candy
Reply to  Tim Gorman
July 31, 2022 6:17 pm

“How do you do this?”
Read my paper, and if you can understand the statistical methods come back with some informed criticism; otherwise we are arguing from the base level of your misconceptions, due to your lack of practical experience in such modelling, as is evident from your postings. I haven't got time for this; I have paid statistical consulting work to get on with, and another applied statistical methods paper just submitted took up the last month of my semi-retirement time.

Steven Candy
Reply to  Tim Gorman
July 31, 2022 7:55 pm

“How do you do this? If temps in the SH range from 0C to 10C and in the NH from 10C to 20C how do fixed effects remove such a bimodality?” This does it for me; I've got better things to do. Your language is so imprecise and your arguments even counter-intuitive. I live in the SH, in the southernmost capital city in Australia, and I can assure you the “temp” (do you mean temperature as displayed on the BOM site?) ranges well above 10C and sometimes below 0C. Today, mid-winter, the forecast maximum is 13C. What on earth (literally) do you mean by “temps in the SH range from 0C to 10C”? Do you mean “average annual mean daily temperature (AAMDT)”? If so then somewhere between Macquarie Island and Davis station in coastal Antarctica (which I have visited on a research cruise with AAD in the 2009/10 Austral summer) would fit, with an AAMDT of 0C! BTW you get a daily mean temperature by integrating the 24hr diurnal temperature profile.

Tim Gorman
Reply to  Steven Candy
August 1, 2022 8:09 am

And now we devolve into nitpicking, an argumentative fallacy. The temps I used were meant to signify the difference between seasons in the hemispheres, something you nitpicked in order to create a red herring to argue with.

Two issues you still need to address:

  1. how do you resolve the bimodality?
  2. how do you resolve the different variances between summer temps and winter temps?

I don't really expect you to answer, because climate alarmists ignore things like this.

But please stop using argumentative fallacies to avoid answering. If you don't wish to answer or can't answer, then just say so.

Carlo, Monte
Reply to  Steven Candy
July 29, 2022 7:18 am

average global surface temperature

is a fictitious quantity that cannot represent climate.

Tim Gorman
Reply to  Carlo, Monte
July 29, 2022 1:08 pm

It’s not even a temperature measurement let alone a climate measurement!

Jim Gorman
Reply to  Steven Candy
July 29, 2022 12:00 pm

“SD is the dispersion of data in a normal distribution.” 

“Wrong: the SD is a sample statistic which when squared along with sample mean are for a normal distribution the joint sufficient statistics for the population variance and mean, respectively.”

Your definition in no way proves mine wrong. In fact, I can give multiple references if you so wish. However, basically SD tells you the dispersion of data in a normal distribution. 10 ± 5 describes a much different normal distribution than 10 ± 25. It doesn't even matter how many data points you have. You know that ~68% is within one SD, etc.

You are mistaken in this statement.

“SEM is the SD of the theoretical distribution of the sample means (the sampling distribution)”: correct if you change to “SEM when squared is an estimate of the variance of …”

The Standard Error of the sample Means (SEM) IS the standard deviation of the distribution made from the averages of all samples of the population.

Here you are totally wrong. You make the same mistake that many scientists and mathematicians make.

“The SD of that sample is the SEM.” what sample? the SEM is the SD of the sample divided by √N”

The SEM is the standard deviation of a sample distribution. Now, you may make many “samples” of a population and find the mean of each of those samples to create a distribution with many points, and then calculate the SD of that sample means distribution. However, at that point you have an “estimated mean” and a standard deviation of the sample means (SEM). The SEM tells you an interval surrounding the estimated mean where the population mean may lie.

The relation between the SEM and the population SD is:

SEM = SD / √N

Your other mistake is that the √N is NOT the number of samples, it is the size of the samples.

Your statement:

Dont go to https://www.investopedia.com/ask/answers/ whatever go to statistics text books or talk to a bona fide statistician”

is an insult to the information on this site. You need to provide references or letters you have sent to their site describing their errors. Personally, I would not slander a site like this that has been there for a long time and whose information is similar to many other references.

You need to provide a “textbook” reference to support your assertion that their information is wrong.

Here are two references from the National Institutes of Health that show bona fide statisticians do not understand the issues involved.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1255808/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2959222/#

Inappropriate use of standard error of the mean when reporting variability of study samples: a critical evaluation of four selected journals of obstetrics and gynecology – PubMed (nih.gov)

Check this site out. Be sure to multiply the sampling distribution standard deviation (SEM) by the square root of the sample size and see if you don't get the population SD. That will verify the above equation. I've included an image of one example I ran. This verifies the Central Limit Theorem, by the way.

Sampling Distributions (onlinestatbook.com)

[attached image: psx_20211019_082116.jpg]
Jim Gorman
Reply to  Jim Gorman
July 29, 2022 4:34 pm

Since you wanted a textbook reference, here is one.

Introduction to Statistics
Online Edition

Primary author and editor: David M. Lane (Rice University)
Other authors: David Scott, Mikki Hebl, Rudy Guerra, and Dan Osherson (Rice University), and Heidi Zimmer (University of Houston, Downtown Campus)

Page 306

“The most common measure of how much sample means differ from each other is the standard deviation of the sampling distribution of the mean. This standard deviation is called the standard error of the mean.”

Page 307

“The variance of the sampling distribution of the mean is computed as follows:

σ_mean^2 = σ^2 / N (formula typed by me)

That is, the variance of the sampling distribution of the mean is the population variance divided by N, the sample size (the number of scores used to compute a mean).”

Here is another online textbook.

Statistical Thinking for the 21st Century
Copyright 2019 Russell A. Poldrack

Chapter 7.3

“For the mean, we do this using a quantity called the standard error of the mean (SEM), which one can think of as the standard deviation of the sampling distribution of the mean. To compute the standard error of the mean for our sample, we divide the estimated standard deviation by the square root of the sample size:”


SEM = σ / √N (equation typed by me)

The important part of these is that the standard deviation of the sample means IS NOT divided again by √N to obtain a better number.

In fact, if the temperature data is not of the entire population, and it is not, it must be considered a sample.

I would give you a reference from my Engineering Statistics book but it disappeared many moons ago during one move or another.

If these don’t suffice let me know and I’ll get more.

Bellman
Reply to  Jim Gorman
July 29, 2022 5:27 pm

You keep giving texts that explain what people are trying to tell you. The problem is you keep giving the impression that you don’t understand the difference between the standard deviation of a sampling distribution, and the standard deviation of a sample.

Maybe you do understand it but are not expressing yourself very well; but I've had similar arguments with you over the months and you keep making the same statements.

You said:

A test of a small number of times is a SAMPLE! The mean of that sample is the average value. The SD of that sample is the SEM.

Maybe you meant the SD of the sampling distribution. But that’s not how it reads, and when anyone tries to correct you, you say they don’t know what they’re talking about.

Then you claim that many mathematicians and scientists don’t understand it either. Before making such a bold statement you really need to tell us exactly what you mean, because it might just be the mathematicians understand it better than you.

Tim Gorman
Reply to  Bellman
July 30, 2022 4:12 am

None of you so-called statisticians seem to understand this at ALL! The standard deviation of the sample means is *NOT* the measurement uncertainty of that calculated mean. The uncertainty of that mean calculated from the sample means is the measurement uncertainty associated with each individual element in each sample.

Population: “mean of the stated values” +/- propagated measurement uncertainty
Sample 1: “mean of the stated values” +/- propagated measurement uncertainty
Sample 2: “mean of the stated values” +/- propagated measurement uncertainty
Sample 3: “mean of the stated values” +/- propagated measurement uncertainty

The standard deviation of the sample means only gives you an interval in which the mean of the population stated values might lie.

The actual measurement uncertainty of the mean calculated from the sample means is the propagated measurement uncertainty of the sample means.

The SEM is *NOT* the accuracy of the mean of the sample means. It is *NOT* the accuracy of population mean.

For some reason it seems that most statisticians want to use the SEM as a metric of measurement accuracy. IT IS NOT!

I don't care how many samples you take or how large the samples are. The accuracy of any mean calculated from individual elements is the propagated measurement uncertainty of the individual elements.

You can’t lessen measurement uncertainty using averages or anomalies when you are combining individual, random, independent measurements of different things using different devices. You can’t even do it if you have multiple measurements of the same thing using the same device unless you can *prove* that no systematic uncertainty exists in the measuring device.

Even if every square inch of the Earth was covered in thermometers, the mean calculated from those thermometers would still have a measurement uncertainty factor associated with it. I don’t care how many samples of how many elements you would pull from that population, the standard deviation of the means of those samples would *still* not be a metric of the accuracy of the mean thus calculated. The ONLY metric of the accuracy would be the measurement uncertainty propagated from those individual thermometers.

I can’t find a single set of temperature data that includes proper measurement uncertainty with each element. I can’t find a single analysis of any of the measurement data that actually propagates and shows the measurement uncertainty from combining all those pieces of temperature measurements. THEY ALL USE THE STANDARD DEVIATION OF THE SAMPLE MEANS AS THE ACCURACY OF THE MEAN!

It means each and every one of them either just ignores measurement uncertainty or assumes it all just cancels. Whether they do so out of ignorance or intention, I can’t judge. I suspect ignorance is a major problem because so few statistics textbooks cover how to handle measurement uncertainty. All the textbooks just assume that all stated values are 100% accurate! In the real world that simply does not cut it!

Bellman
Reply to  Tim Gorman
July 30, 2022 3:21 pm

None of you so-called statisticians seem to understand this at ALL!

I’m not a statistician “so-called” or otherwise. But your hubris keeps growing. You really need to consider that statisticians understand statistics better than you.

The standard deviation of the sample means is *NOT* the measurement uncertainty of that calculated mean.

I’m not saying it’s the “measurement uncertainty”, I’m saying it’s a type of uncertainty, and generally more important than that from measurement uncertainty.

Population: “mean of the stated values” +/- propagated measurement uncertainty

I’m not sure how a population can have measurement uncertainty, unless you mean the population of all possible measurements. In that case I’d expect this to converge to the actual mean plus any systematic measurement error.

Sample 1: “mean of the stated values” +/- propagated measurement uncertainty
Sample 2: “mean of the stated values” +/- propagated measurement uncertainty
Sample 3: “mean of the stated values” +/- propagated measurement uncertainty

As with Jim, I don’t know why you keep wanting to take multiple samples.

The standard deviation of the sample means only gives you an interval in which the mean of the population stated values might lie.

Which is what you want if you are talking about the uncertainty of your sample mean.

The actual measurement uncertainty of the mean calculated from the sample means is the propagated measurement uncertainty of the sample means.

If you want, but as I keep saying, this will generally be small compared with the actual uncertainty caused by the random sampling. (Of course, you will disagree, because you don't understand how to propagate those measurement uncertainties in a mean properly.)

The SEM is *NOT* the accuracy of the mean of the sample means. It is *NOT* the accuracy of population mean.

You need to read some of the articles Jim keeps posting. They disagree.

Standard error gives the accuracy of a sample mean by measuring the sample-to-sample variability of the sample means.

https://www.investopedia.com/ask/answers/042415/what-difference-between-standard-error-means-and-standard-deviation.asp

Bellman
Reply to  Tim Gorman
July 30, 2022 3:38 pm

I can’t find a single set of temperature data that includes proper measurement uncertainty with each element. I can’t find a single analysis of any of the measurement data that actually propagates and shows the measurement uncertainty from combining all those pieces of temperature measurements. THEY ALL USE THE STANDARD DEVIATION OF THE SAMPLE MEANS AS THE ACCURACY OF THE MEAN!

Could you provide a link to one of these? Just using the SEM to estimate the uncertainty in a global anomaly calculation would be wrong. It's not a random sample.

Here’s the description for HadCRUT5:

Both forms of the dataset are presented as an ensemble of 200 dataset realisations that sample the distribution of uncertainty. For the non-infilled data set, the ensemble represents uncertainties in methods used to account for changes in SST measurement practices, homogenisation of land station records and the potential impacts of urbanisation. The ensemble generated from the statistical analysis includes these uncertainties as well as uncertainty arising from measurement error, under-sampling at a grid cell level and uncertainty in the statistical reconstruction.

https://www.metoffice.gov.uk/hadobs/hadcrut5/

Steven Candy
Reply to  Tim Gorman
August 1, 2022 12:45 am

“It means each and every one of them either just ignores measurement uncertainty or assumes it all just cancels.”

It is considered, when it's of consequence, when the modelling is done by bona fide statisticians. It is often minimal when averaging across a set of predictions, because measurement error occurs at the lowest sampling level and is additive to the sampling error at that level. Also, measurement error variance is most often a small component of the sum of the variances of these two independent errors, and if it's not, find a better measuring instrument.

“However, correctly incorporating all of (i) binomial sampling errors, (ii) biological errors (i.e., overdispersion), and (iii) errors in variables is not possible using linear regression.”
Candy, S.G. (2002). Empirical Binomial Sampling Plans: Model Calibration and Testing Using Williams' Method III for Generalized Linear Models With Overdispersion. Journal of Agricultural, Biological, and Environmental Statistics, 7(3), 373–388. DOI: 10.1198/108571102302

See also: Carroll, R. J., Ruppert, D., and Stefanski, L. A. (1995), Measurement Error in Nonlinear Models, London: Chapman and Hall.

Tim Gorman
Reply to  Steven Candy
August 1, 2022 8:19 am

“Also, measurement error variance is most often a small component of the sum of the variances of these two independent errors, and if it's not, find a better measuring instrument.”

But the measurement uncertainty is NOT a small component of the differences you are attempting to identify. Therefore you can’t just ignore the measurement uncertainty.

You are just throwing up word salad as an excuse.

Steven Candy
Reply to  Tim Gorman
August 1, 2022 6:59 pm

If you do not explicitly incorporate measurement error in the error model for the response variable (eg temperature, see “Ln_Den_st” below) it is subsumed into the “units”-level error term. If you explicitly include measurement error variance as a known quantity by including it as a prior with almost zero variance then you get the same result in terms of parameter estimates with just more detail on units-level variance components. This is how I did it in R in the paper I quoted where an extra prior could be included for measurement error variance if this variance or an estimate were known a priori in the same way as “G3 = list(V =1.0, fix=1)” below (see Supplementary Material in https://www.researchgate.net/publication/357063946_Long-term_Trend_in_Mean_Density_of_Antarctic_Krill_Euphausia_superba_Uncertain)

prior1 <- list(G = list(G1 = list(V = diag(2), nu = 0.002),
                        G2 = list(V = 1, nu = 0.002),
                        G3 = list(V = 1.0, fix = 1)),
               R = list(V = diag(2), nu = 0.002))

m5d.1 <- MCMCglmm(Ln_Den_st ~ North60_f + t_SEASON.CENT + North60_f:t_SEASON.CENT,
                  random = ~ us(1 + I(t_SEASON.CENT)):t_cell_f + t_SEASON_f + idh(SE):units,
                  rcov = ~ idh(North60_f):units, data = dataSxC,
                  nitt = 130000, thin = 100, burnin = 30000,
                  prior = prior1, family = "gaussian", pr = TRUE, verbose = FALSE)

summary(m5d.1)

So ignoring measurement error in the response variable is not a big deal in terms of modelling, unlike measurement errors in predictor variables, where it can introduce bias in regression parameters; hence those references I gave earlier (eg see my paper: DOI: 10.1198/108571102302). However, a relatively high level of measurement error in the response variable does not lead to good predictive accuracy for the underlying physical quantity that is implicitly being modelled.

You can label the above as a “word salad” if you like to retain your delusion that you know enough about statistical theory to be able to understand actual research-level applications.

Last edited 8 days ago by Steven Candy
Jim Gorman
Reply to  Bellman
July 30, 2022 7:09 am

I showed you two links from the National Institutes of Health (NIH) that discuss how scientists, and the mathematicians that assist them, DO NOT use appropriate statistics to describe their studies. Climate scientists and their mathematicians are no different.

Here they are again.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1255808/
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2959222/#

Each of these contain other references that discuss the statistical errors made in interpreting sample data.

“Maybe you meant the SD of the sampling distribution.”

That is exactly what I meant. If you understood the statistical error definition you would know that:

SEM =
Standard Error of the sample Means =
Standard Deviation of the sample means distribution.

Samples versus populations is a pretty simple distinction. Temperature stations provide temperatures at distinct points and not the entire population of every point on the earth. Consequently, they are samples, not the entire population.

Worse, most historical temperatures are samples of a continuous function which don’t meet Nyquist requirements for sampling.

Here is an example. I perform an experiment 5 times using the same solution, the same pipet, the same beaker, the same heater, etc. I get 5 values.

2.0, 2.8, 3.1, 2.3, 2.5

The mean is 2.5 and the sample standard deviation is 0.4.

Now let's discuss. Do you divide by (N-1) or by (N) to find the standard deviation? You need to decide: is this a population (N) or is it a sample (N-1)? Obviously, it can't be the entire population, since many, many more experiments would need to be run to obtain anything like an entire population.

What is the mean? Is it the population mean or a sample mean? Since the data is a sample, the mean is of a sample. Therefore, the standard deviation of a sample mean is the SEM. You don't divide by the √N since you don't have the population standard deviation you have the sample mean standard deviation.

This is what the NIH documents and other references are trying to get across. When you have a sample mean, be it from one sample or many, the standard deviation of that sample IS THE SEM. You don’t divide by the number of samples to decrease the SEM even further.

Now we haven’t even addressed uncertainty. Each of the numbers in the sample should be:

2.0 ± 0.2, 2.8 ± 0.2, 3.1 ± 0.2, 2.3 ± 0.2, 2.5 ± 0.2

How do you think that affects the uncertainty of the mean? How does it affect the standard deviation?

Last edited 10 days ago by Jim Gorman
Bellman
Reply to  Jim Gorman
July 30, 2022 2:24 pm

I showed you two links from the Nations Institute of Health (NIH) that discusses how scientists and the mathematicians that assist them DO NOT use appropriate statistics to describe their studies. Climate scientists and their mathematicians are no different.

You keep spamming the same links, which don't say what you think they do. The claim is that some papers use SEM when they should be using SD, or fail to indicate which they are using. I can't comment on the accuracy of these claims because they don't provide any examples. But it does not mean the SEM is not a meaningful statistic.

From one of your articles:

So, if we want to say how widely scattered some measurements are, we use the standard deviation. If we want to indicate the uncertainty around the estimate of the mean measurement, we quote the standard error of the mean. The standard error is most useful as a means of calculating a confidence interval. For a large sample, a 95% confidence interval is obtained as the values 1.96×SE either side of the mean. We will discuss confidence intervals in more detail in a subsequent Statistics Note. The standard error is also used to calculate P values in many circumstances.

Note that SEM is here identified with the uncertainty of the estimated mean – something you and Tim keep saying it isn’t.

Bellman
Reply to  Jim Gorman
July 30, 2022 2:54 pm

That is exactly what I meant.

Good, then maybe we are making progress. But you keep writing in a confused style, and keep implying you mean the opposite, as you do later in your comment.

Here is an example. I perform an experiment 5 times using the same solution, the same pipet, the same beaker, the same heater, etc. I get 5 values.

2.0, 2.8, 3.1, 2.3, 2.5

You seem to be describing a sample of individual values here. You’ve repeated the experiment 5 times and got a distinct value each time.

The mean is 2.5 and the sample standard deviation is 0.4.

The mean is 2.54 and the sample standard deviation is 0.43, but if you want to prematurely truncate them, go ahead.

In answer to your other trivial questions.

You divide by (N – 1) = 4.

It’s a sample not a population.

The mean is 2.54.

It’s a sample mean.

Therefore, the standard deviation of a sample mean is the SEM.

Oh dear. You keep wanting to lecture everyone, but then you keep saying something like this.

You don’t divide by the √N since you don’t have the population standard deviation you have the sample mean standard deviation.

You still divide the sample standard deviation by √N to get the SEM estimate. The sample standard deviation is the best estimate of the population standard deviation. And what do you mean by “the sample mean standard deviation”? You don’t have a sample of means, you have a sample of values.

This is what the NIH documents and other references are trying to get across.

None of them say anything of the sort, and if they do, could you actually provide a quote? You keep failing to understand any of these documents.

When you have a sample mean, be it from one sample or many, the standard deviation of that sample IS THE SEM.”

And this is where you are getting completely confused. There are two possibilities.

1) You have 1 sample. This is of a certain size. You know the mean of that sample, which is a best estimate of the population mean, and you know the sample standard deviation, which is an estimate of the population standard deviation. You estimate the SEM by dividing this SD by √N.

2) You have an infinite set of samples, all of the same size, and you calculate the standard deviation of the means of these samples, and we call this standard deviation the SEM. This is what’s meant by a sampling distribution. But in reality this is a concept, not something you would do except in simulations.

You seem to think that you can take a small sample of samples and work out the SEM from them. That’s possible, but makes little practical sense as you could just combine all the samples into one bigger sample and get a more accurate estimate of the mean.

But now you seem to think you can do that with just one sample, and somehow use the standard deviation of that one sample as if it were a sampling distribution. As I say, I think you are getting very confused. You should really read the articles you link to:

The standard error of the sample mean depends on both the standard deviation and the sample size, by the simple relation SE = SD/√(sample size). The standard error falls as the sample size increases, as the extent of chance variation is reduced—this idea underlies the sample size calculation for a controlled trial, for example. By contrast the standard deviation will not tend to change as we increase the size of our sample.

And note, here SD is the sample standard deviation.

Bellman
Reply to  Jim Gorman
July 30, 2022 3:07 pm

Each of the numbers in the sample should be: 2.0 ± 0.2, 2.8 ± 0.2, 3.1 ± 0.2, 2.3 ± 0.2, 2.5 ± 0.2″

I don’t think there’s any “should be”, but let’s try it your way.

“How do you think that affects the uncertainty of the mean? How does it affect the standard deviation?

You don’t specify what your coverage factor is for the measurement uncertainty. I’ll assume it’s 2, so the standard uncertainty is 0.1.

Assuming independence the measurement uncertainty propagated to the mean, is 0.1 / √5, around 0.045.

The standard error of the mean for the stated values is 0.43 / √5, approximately 0.192.

Combining these two values we get √(0.192^2 + 0.045^2), about 0.197.
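That arithmetic checks out in R (a direct transcription of the numbers above, with the coverage factor of 2 as assumed):

x <- c(2.0, 2.8, 3.1, 2.3, 2.5)
u_meas <- 0.2 / 2                    # standard uncertainty, assuming coverage factor 2
u_mean <- u_meas / sqrt(length(x))   # ~0.045: measurement uncertainty propagated to the mean
sem <- sd(x) / sqrt(length(x))       # ~0.192: SEM of the stated values
sqrt(sem^2 + u_mean^2)               # ~0.197: the two combined in quadrature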

Steven Candy
Reply to  Jim Gorman
July 31, 2022 6:00 pm

Part of the problem here is the lack of mathematical notation, so that causes confusion with conflicting implicit definitions of sample estimates and population quantities that are being estimated. I was using SD to mean, as is standard, the usual sample estimate of the square root of the population variance σ^2.

“we divide the estimated standard deviation by the square root of the sample size:
SEM = σ / √N (equation typed by me)

The σ should have a “^” over it to show that it is an estimate. This is the problem with duelling statistical “theory” in this type of forum.

BTW I did not slander the “answers” section of your “investor” web page; I simply recommended textbooks or other validated references. I use Wikipedia myself and check descriptions there, which I have found to be accurate.

July 26, 2022 10:24 pm

While numbers are interesting, since numbers represent data, numbers are not that important for leftist politics and climate change scaremongering:

Numbers are not required. Data are not required.

(There are no data for the future climate, just beliefs)

All that is required is effective propaganda.

Coming from government bureaucrat “scientists”

hired to make scary climate forecasts.

Many Americans are victims of the Appeal to Authority

logical fallacy — almost every leftist, and a few Republicans too. As a result, a majority of Americans believe in a coming climate crisis, as predicted by government bureaucrat “scientists” for the past 50 years.

Climate scaremongering could have been done without numbers. Unpleasant weather events can be demonized without any numbers. …

My “climate rap” continues here:
Honest global warming chart Blog: Climate Rap: Does the belief in a coming climate crisis require any numbers? (elonionbloggle.blogspot.com)

Andy Espersen
July 26, 2022 10:33 pm

Kip, the question that imposes itself after all this is whether democratic governments are really suited for governance in our modern world. The fact is that the United States, generally assumed to represent the cream of democracy, is proving completely unable to comprehend the truly scientific importance of your philosophy. It bases its policies on what it perceives as science, but which is actually only meaningless strings of numbers.
 
Looking back in history, old-fashioned governments never used to govern, to legislate, following “scientific viewpoints”. They always based their decisions, their enacted legislation, only on what they thought represented what was best and most beneficial for the citizens who had elected them to govern – science be damned.

David Long
Reply to  Andy Espersen
July 26, 2022 11:41 pm

Where did you ever get the idea that historical governments were so benevolent?

Andy Espersen
Reply to  David Long
July 27, 2022 3:13 am

Please note, David Long, that I was referring exclusively to democratic governments – the whole idea of which is to deliver to their citizens what they need, crave, and are happiest with.

In world history, of course, democracy is an atypical form of government – but also the much more common authoritarian rulers (or ruling families) mostly tried to please their peoples. If they didn’t, people would soon rebel and somehow find another strongman to follow.

My question is whether democracies will survive the tyranny they are exerting on citizens at present. Will there soon be “a Caesar crossing the Rubicon” to give us the sort of governance we want?

Last edited 13 days ago by Andy Espersen
Reply to  Andy Espersen
July 27, 2022 12:23 am

“actually only meaningless strings of numbers.”

There are no real numbers for the future climate
No data.
Just unproven theories that include a few wildly guessed “numbers”
that few people even know, but government bureaucrat scientists
“say so” (about a coming climate crisis), and that’s good enough for them.

The claim that the average temperature has increased by 2 degrees C
since the late 1880s does not support the predictions of climate doom.
The predictions are for a much faster rate of warming that has never happened,
in spite of 50 years of such predictions.

Richard Page
Reply to  Andy Espersen
July 27, 2022 9:46 am

I think you need to separate governments controlled by those that truly represent their constituents from those of career politicians who seem to represent only themselves and some desperate need to cling to power. Being a politician should never have been a career path; there should be a requirement for experience in other fields, away from politics, first.

Steve Case
July 27, 2022 1:44 am

Kip, neither you nor anybody in the comments mentioned the IPCC’s Global Warming Potential numbers. You know, CH4 is 86 times more powerful than CO2 at trapping heat, N2O is 265 times more powerful, and CClF3 is 13,900 times more powerful than CO2 at trapping heat.

And nowhere will you find an answer to how much warming those compounds will actually run up global temperatures over the next 100 years.

David Blenkinsop
Reply to  Steve Case
July 27, 2022 1:59 am

Now the scary part is if someone comes up with a compound so bad that way, that a single escaped molecule would raise the whole planet's temperature by about 25 degrees C or so!

(Probably it would flip the Earth’s magnetic field, triple the hurricanes, and cause all the volcanoes to go off as well).

Mark BLR
Reply to  Steve Case
July 27, 2022 4:54 am

… the IPCC’s Global Warming Potential numbers. You know, CH4 is 86 times more powerful than CO2 at trapping heat, N2O is 265 times more powerful

You clearly have not seen the most recent AR6 (WG-I) report.

N2O has been bumped up to a GWP(-20) of 273.

“Fossil” methane molecules (e.g. leaking from a pipeline) now have a GWP of 82.5.

“Non-fossil” CH4 (e.g. from melting permafrost), however, now has a GWP of 79.7.

Ain’t “(Climate) Science” wonderful ?

[attached screenshot: Screenshot_AR6-Final_Table-7-15_CH4.png]
Clyde Spencer
Reply to  Mark BLR
July 27, 2022 11:07 am

How does the atmosphere, or rather the up-welling IR, know the difference between “fossil” and “non-fossil methane”?

Mark BLR
Reply to  Clyde Spencer
July 28, 2022 3:23 am

How does the atmosphere, or rather the up-welling IR, know the difference between “fossil” and “non-fossil methane”?

The only thing the IPCC says about that particular “minor detail” is the bald assertion (on page 1017) :

Contributions to CO2 formation are included for methane depending on whether or not the source originates from fossil carbon, thus methane from fossil fuel sources has slightly higher emissions metric values than that from non-fossil sources.

I would not presume to advance a conjecture of my own in this specific domain.

Last edited 12 days ago by Mark BLR
Clyde Spencer
Reply to  Steve Case
July 27, 2022 11:03 am

I think that part of the explanation may lie in the MSM promoting the reduction of such emissions to give the impression of actually doing something. Even methane, which COP26 made a priority, will have a negligible impact for realistic reductions since most is natural. The last thing that they want is for the public to realize that new laws and tax dollars will have almost no measurable impact, even with the high-end estimates of warming potential and possible reductions.

Robert B
July 27, 2022 2:42 am

In theory (wishful thinking?), averaging temperatures should give you something meaningful, like averaging concentrations of a desired mineral in samples of an ore body. But you only get a good estimate, from the average, of the amount of mineral you will retrieve after digging up the whole ore body if you take so many samples that you might as well keep going and dig up the rest.

Bespoke fitting of polynomials to far fewer ore samples can give you something worth averaging, which is what the mining industry does successfully enough to get a useful estimate of the amount of mineral they will obtain. So when I criticized HadSST for being ridiculously dodgy, I was informed that they use kriging, like they have used successfully in mining for decades, and asked if I don't believe in kriging. My reply was that I don't believe in salami science – Is kriging. Is good!

“Concentration” is the name of a number of different intensive properties that are the quotient of an extensive property of one component divided by an extensive property of the whole sample. In the case of mining, the mass of the mineral over the mass of the sample of ore. Those who have done chemistry will be familiar with amount, n, (the proper name of the extensive property measured in moles) of a solute over volume of the solution, as well as a number of other intensive properties under the banner of “concentration”.

It's assumed that the same can be done with temperature, treating it as a simple intensive property, the quotient of the heat energy put into the sample divided by the heat capacity of the sample, except that the basic definition of heat capacity is

C = ΔQ / ΔT (in the limit as ΔT → 0)

“The value of this parameter usually varies considerably depending on the starting temperature T of the object and the pressure P applied to it. In particular, it typically varies dramatically with phase transitions such as melting or vaporization (see enthalpy of fusion and enthalpy of vaporization).”

If the above doesn't get published properly, here is a link: https://en.wikipedia.org/wiki/Heat_capacity#Basic_definition

And it's a property of the probe at a particular time that is influenced by its immediate surroundings. It's not a property of a 10 km × 10 km × 10 km volume of the atmosphere. The average of the minimum and maximum temperatures is definitely not an intensive property like concentration. Each measurement is influenced by a different sample of the atmosphere as the weather system goes through: convection, heat going in and out, and changes in heat capacity as water evaporates or condenses.

To cut to the chase, kriging is far from a straightforward method to make a collection of measurements mean something when you average the values derived from them, and even then those measurements are simple intensive properties from carefully planned sampling, something the temperature record isn't.

Kriging might give an average that is within 1% of the tonnage of the mineral retrieved for each tonne of ore dug up, and the miners would be stoked with the analysis. One percent of 300 K is 3°C, or three times the warming in the past century. Even 1% of 30°C, the spread of sea-surface temperatures, is 0.3°C. This is about how much of the warming since 1950 is consistent with extra warming above the natural variation that started before 1950 – according to the GTA that I'm claiming is worthless.

AND then you need to consider how much the temperature record is not suitable for such analysis.

It’s mind boggling that governments accept it as evidence to destroy their constituents’ standard of living.

Jim Gorman
Reply to  Robert B
July 27, 2022 11:52 am

From:

http://geofaculty.uwyo.edu/yzhang/files/Geosta1.pdf

“As stressed by Journel, ‘that there are no accepted universal algorithm for determining a variogram/covariance model, that cross-validation is no guarantee that an estimation procedure will produce good estimates at unsampled locations, that kriging needs not be the most appropriate estimation method, and that the most consequential decisions of any geostatistical study are made early in the exploratory data analysis’.”

Kriging, although complicated, is no guarantee that there are “good estimates at unsampled locations”.

Rod Evans
July 27, 2022 2:54 am

Quote
“I am well aware of the many problems of scientific modern research, including that all fields of science are returning a lot of questionable results – even in fields closely monitored like medicine. It is my belief that much of this is functionalized by ‘too much math, not enough thinking’ or the reification of mathematical and statistical results.”
I think you are being too generous there, Kip. My view is increasingly coming round to the idea that too many scientific studies depend on returning the answer the funding body needs them to return. The inducement being: there will be more funding if you do the ‘right’ thing.
I may be becoming cynical as well as sceptical in my old age….. 🙂

July 27, 2022 3:48 am

Medicine is not a falsifiable science, but an art and a technology depending on statistical verification for validation. Falsifiability is the demarcation boundary of science from nonsense. Beware the Black Swan. Eschew ad-hockery. Medicine is a corrupt mountain of hockery. Healers mine this mountain for nuggets of advice.

False claims of knowledge – lies – as subjective naive priors cannot reduce the entropic space to truth.

Pflashgordon
July 27, 2022 5:31 am

Kip, you are striking on some critical points. Great Part 1!

My former graduate advisor and scientific mentor is a brilliant, world-class scientist and mathematician. He is also eminently practical and pragmatic. A former farmer, a soil physicist and micro-meteorologist, even at age 80+ he still runs circles around corporate defense attorneys as an expert witness in environmental litigation. His passions are many, from getting drunk drivers off the road to refurbishing old housing with his own hands to benefit elders in need.

He taught me early on that you have to check your fine-tuned calculations and statistics against your own “back of the envelope” scaling and your own reasoning. Do your results make sense in the real world?

We can all err, but so-called climate “experts” cannot dismissively blow off skeptics as not being scientists, or as scientists/engineers of the wrong kind. Many lay people, including many who frequent WUWT, have enough sense of the world and its processes to call “foul” when they read or hear the nonsense that often comes out of academia, and even more so from politically motivated government officials and activist NGOs.

D Boss
July 27, 2022 6:28 am

Like your slant here! I grew up in a town with a couple of universities. One had a highly accredited Engineering section, and we were often pitted against the “Pure Math” section. Of course there were Humanities too, but both real sections considered the Humanities full of fruit cups and nut bars.

The Pure Maths people derided the Engineering folks as alcoholics and jocks, and we derided them simply as “Matholes”….

I stand to this day considering those who view maths as God to be “matholes”, devoid of a proper sense of reality.

Yes, math is a tool. Yes, it is essential and useful for navigating reality. But we must never lose sight of the fact that it is a symbolic representation of reality, not actual reality!

And Engineering has consequences – so you can't lie and get away with it for long. But math people can lie with numbers, often devoid of consequences for their pontifications.

Clyde Spencer
Reply to  Kip Hansen
July 27, 2022 11:15 am

And engineers may lose their license to practice if a public engineering project fails. I don’t know of any similar motivation hanging over the head of mathematicians.

drh
Reply to  Kip Hansen
July 28, 2022 11:02 am

Reminds me of a joke:
An engineer, a physicist and a mathematician are locked up in a jail. The guard comes around and gives them lunch in the form of an unopened can of soup, and they have to figure out how to get it open with nothing but their wits if they want to eat.

The engineer bashes the crap out of the can and it opens almost immediately.

The physicist takes a bit longer to make some calculations but gets his can open too.

The mathematician is sitting in the corner with the closed can in front of him saying over and over “Assume the can is open. Assume the can is open….”

Jim Gorman
Reply to  D Boss
July 27, 2022 11:56 am

^100

Odds On
July 27, 2022 6:55 am

I always believed the adage that “if you interrogate the numbers for long enough, you can get them to admit to anything”.

hiskorr
July 27, 2022 7:03 am

Thank you! Thank you! For at least two decades I have been claiming that the “numbers” representing “Global Average Surface Temperature” have the same utility as the “number” representing the “average phone number” in a phone book.

griff
July 27, 2022 7:08 am

Here's some numbers: the UK's highest temperature record, set only 3 years before, was beaten in 43 places in one day, with 5 of those places recording a 40C-plus temperature.

Rod Evans
Reply to  griff
July 27, 2022 7:43 am

That's weather for you. It is brilliant at generating records. Record hot, record cold, record highs and record lows. Life would be monotonous without weather, that's for certain…..

Last edited 13 days ago by Rod Evans
Richard Page
Reply to  griff
July 27, 2022 9:51 am

So? I fail to see the significance of a hot building, runway or other structure on climatic trends. Enlighten me.

Clyde Spencer
Reply to  griff
July 27, 2022 11:31 am

Earth has been warming for 12,000 years. A reasonable extrapolation would be that it will continue to warm. Be worried — very worried — if it starts to cool rapidly.

Jim Gorman
July 27, 2022 7:54 am

I would respectfully say that there is a difference between counting numbers and measurement numbers. Counting numbers are unique, infinite-precision integers: things like the quantity of fruit, yes/no answers, fingers, etc. They are different from measurements, which have uncertainty.

Example: I have 5 pieces of fruit. There are no subdivisions; there are exactly five. Now I divide them in two. One group has two and one half pieces and the other group has two and one half pieces. Or do they? Maybe one group has two and 0.48 pieces and the other has two and 0.52 pieces. You now have uncertainty involved in the measurement of the division.

I’ll leave you with two things. The first is a statement from Washington Univ in St. Louis from their chemistry department. It isn’t available on the web any longer, but I saved a copy that I read quite often.

“By using significant figures, we can show how precise a number is. If we express a number beyond the place to which we have actually measured (and are therefore certain of), we compromise the integrity of what this number is representing. It is important after learning and understanding significant figures to use them properly throughout your scientific career.” (bold by me)

This is important when assessing the quality of anomalies created from temperatures that are recorded as integers, or even to the nearest one-tenth. Are anomaly values out to the one-hundredth or one-thousandth, computed from values that haven't been measured that accurately, worth the time and effort to calculate?
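A small R experiment on that question (the temperatures are synthetic, invented purely to show the size of the rounding effect):

set.seed(4)
err <- replicate(10000, {
  t <- rnorm(30, mean = 15, sd = 5)   # synthetic 'true' daily temperatures for one month
  mean(round(t)) - mean(t)            # error in the monthly mean from integer recording
})
sd(err)   # ~0.05 C: a monthly anomaly quoted to 0.001 C is well below this resolution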

Another excellent explanation of resolution and uncertainty is at a youtube location.

Uncertainties – Physics A-level & GCSE – YouTube