Fun with Trends

Brief Note by Kip Hansen – 17 August 2022

I have been doing research for other people’s book projects (I do not write books).  One of the topics I looked at recently was the USCRN (noaa.gov), self-described as: “The U.S. Climate Reference Network (USCRN) is a systematic and sustained network of climate monitoring stations with sites across the conterminous U.S., Alaska, and Hawaii. These stations use high-quality instruments to measure temperature, precipitation, wind speed, soil conditions, and more.”

A main temperature data product produced by USCRN is the Average Temperature Anomaly for the entire network over its full length of record, about 17 years.  It is shown, kept up to date, here at WUWT in the Reference Pages section as “Surface Temperature, U.S. Climate Reference Network, 2005 to present,” where it looks like this:

Now, a lot of people would like to jump in and start figuring out trend lines and telling us that the US Average Temperature Anomaly is either “going up” or “going down” and how quickly it is doing so.

But let’s start with a more pragmatic approach and ask first:  “What do we see here?” 

I suggest the following:

1.  What is the range over the time period presented (2005-2022)?

          Highest to lowest, the range is about 11 °F (6 °C).  This range represents not a rise or fall of the metric but rather its variability (natural or forced).  Look at the difference between the high in late 2005 and the low in early 2021.  If this graph had been unlabeled, I would have identified it as semi-chaotic. 

2.  Is the anomaly visually going up or down?

          Well, for me, it was hard to say.  Oddly, the anomaly seems to run a bit above “0” – which tells us that the base period for the anomaly must come from some other time span.   And it does: USCRN uses a 1981-2010 base period for “0” when figuring these anomalies, so the base period is not inside the time range of this particular time-series data set. 

We can, however, ask Excel to tell us, mathematically, what the trend is over the whole time period.

There, now you know.  Or do you?  MS Excel says that the USCRN Average Temperature Anomaly is trending up, quite a bit: about 1 °F (0.6 °C) over 17 years.
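For anyone who wants to reproduce this outside of Excel: the trendline is just an ordinary least-squares fit. A minimal sketch in Python, using synthetic monthly anomalies as a stand-in (these are NOT the actual USCRN values, only something with the same flavor):

```python
import numpy as np

# Synthetic stand-in for ~17 years of monthly anomalies: large
# month-to-month variability with a small drift. Illustrative only.
rng = np.random.default_rng(42)
months = np.arange(17 * 12)                                   # 2005..2022, monthly
anomaly = 0.005 * months + rng.normal(0.0, 2.0, months.size)  # deg F

# Least-squares linear trend -- the same thing Excel's "Add Trendline" does:
slope, intercept = np.polyfit(months, anomaly, 1)             # deg F per month
total_change = slope * months.size
print(f"trend: {slope * 120:+.2f} F/decade, {total_change:+.2f} F over 17 yr")
```

The fitted slope is all Excel reports; the scatter around the line, which is the striking feature of the USCRN chart, never appears in that single number.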

~ ~ ~

Now comes the FUN!

I’ve arbitrarily picked five-year time increments, as they are about 1/3 of the whole period.  Three five-year trends (the last one slightly longer), all down-trending, add up to one up-trending graph when placed end to end in date order.
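The effect is easy to reproduce with made-up numbers. Here three consecutive segments each trend downward, yet the series as a whole trends upward, because each segment starts higher than the previous one ended (an illustrative construction, not the USCRN data):

```python
import numpy as np

# Three consecutive segments, each trending DOWN at -0.3 per year, with an
# upward step between segments. Made-up numbers for illustration.
t = np.arange(17.0)                       # years 0..16
seg = np.minimum(t // 5, 2).astype(int)   # segments 0-4, 5-9, 10-16
y = 2.0 * seg - 0.3 * (t - 5.0 * seg)     # step up, then drift down

for k in range(3):
    m = seg == k
    s, _ = np.polyfit(t[m], y[m], 1)
    print(f"segment {k}: slope {s:+.2f}/yr")   # all -0.30
s_all, _ = np.polyfit(t, y, 1)
print(f"whole series: slope {s_all:+.2f}/yr")  # positive (about +0.26)
```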

Lessons We Might Learn:

a.  Don’t use short time periods when determining trends in a time series.  Trends are always sensitive to start and end dates.
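The start-and-end-date sensitivity can be shown with a pure cycle that has no underlying trend at all: the fitted “trend” of a sliding 10-year window flips sign depending only on where the window starts (synthetic example):

```python
import numpy as np

# A pure 10-year cycle, monthly sampling, NO underlying trend. The fitted
# trend of a sliding 10-year window changes sign with the start date.
t = np.arange(0.0, 17.0, 1 / 12)
y = np.sin(2 * np.pi * t / 10.0)

slopes = {}
for start in range(0, 7):
    m = (t >= start) & (t < start + 10)
    s, _ = np.polyfit(t[m], y[m], 1)
    slopes[start] = s
print({k: f"{v:+.3f}" for k, v in slopes.items()})   # both signs appear
```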

b.  This phenomenon is somewhat akin to Simpson’s Paradox, “a phenomenon in probability and statistics in which a trend appears in several groups of data but disappears or reverses when the groups are combined.” 

“In his 2021 book Shape: The Hidden Geometry of Information, Biology, Strategy, Democracy and Everything Else, Jordan Ellenberg argues that Simpson’s paradox is misnamed:”

“Paradox” isn’t the right name for it, really, because there’s no contradiction involved, just two different ways to think about the same data. … The lesson of Simpson’s paradox isn’t really to tell us which viewpoint to take but to insist that we keep both the parts and the whole in mind at once.”  [ source ]
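For readers who have not met Simpson’s paradox before, the standard kidney-stone-treatment figures often used to illustrate it show the reversal directly: treatment A does better in both subgroups, yet B does better in the pooled totals:

```python
# The standard kidney-stone-treatment figures (successes, trials), often
# used to illustrate Simpson's paradox: A wins in BOTH subgroups, yet B
# wins when the groups are pooled.
groups = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}
rates = {name: {k: g[k][0] / g[k][1] for k in "AB"} for name, g in groups.items()}
for name, r in rates.items():
    print(f"{name}: A {r['A']:.2f} vs B {r['B']:.2f}")   # A wins each group

# Pool the raw counts, then compare success rates again:
tot = {k: tuple(map(sum, zip(*(g[k] for g in groups.values())))) for k in "AB"}
pooled = {k: s / n for k, (s, n) in tot.items()}
print("pooled:", {k: f"{v:.2f}" for k, v in pooled.items()})  # B wins pooled
```

The reversal happens because the groups are of very different sizes and difficulty, exactly the keep-both-the-parts-and-the-whole-in-mind lesson Ellenberg draws.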

c.  It does bring to mind other data sets whose trend (or even trend sign) changes when looked at over differing time lengths — sea level rise comes to mind, with the short satellite record claimed to be double the century-long tide-gauge SLR rate. 

d.  Why look at trends that are obviously not reliable over different time scales?   This is a philosophical question.  Can a longer trend be real if all the shorter components of the trend have the opposite sign?  Can three shorter down-trends add up to a longer up-trend that has applicability in the Real World?  Or is it just an artifact of the time scale chosen?  Or is the opposite true?  Are three shorter down-trends real if they add up to an up-trend? (When I say “real” I do not mean just mathematically correct – but physically correct.)

e.  Are we dealing with a Simpson’s-like aberration here?  Is there something important to learn from this?  Both views are valid but seem improbable.

f.  Or, is what we see here just a matter of attempting to force a short highly variable data set to have a real world trend?   Are we fooling ourselves with the interpretation of the USCRN Average Temperature Anomaly as having an upward trend – when the physical reality is that this rather short data set is better described as simply “highly variable”?

# # # # #

Author’s Comment:

I hope some readers will find this Brief Note interesting and that it might lead to deeper thought than “the average and its trend have to be correct – they are simply maths”. 

Many metrics of CliSci are viewed on an artificially assigned time scale of “since the beginning of the modern Industrial Era,” usually interpreted as the late 19th century, roughly 1860 to 1890.  Judith Curry, in her recent interview at Mind and Matter, suggests that this is literally “short-sighted” and that for many metrics a much longer time period should be considered. 

I hope I have time to keep up with your comments; I try to answer all that are addressed expressly to me. 

Thanks for reading.

# # # # #

August 17, 2022 6:06 am

It seems fair to include all data in the trend unless there is a discussed reason not to.
The only graph that matters, if any do, is all of the data.
We can always find other things to discuss if we ignore some data for some unstated reason. But we shouldn’t do that.

kzb
Reply to  Kip Hansen
August 17, 2022 7:35 am

The start date is not really arbitrary. It is supposedly the start of significant fossil fuel use, the idea being to correlate (previously buried) CO2 emissions with temperature. So there is a good reason why the start date is where it is.

MarkW
Reply to  kzb
August 17, 2022 7:51 am

Except that start date is also during the coldest period of the last 10,000 years.
Of course the world has warmed up since then, and most, perhaps all of that warming would have occurred even if man hadn’t burned any fossil fuels.
Heck, the world still hasn’t returned to the average temperature for the last 10,000 years.

Reply to  kzb
August 17, 2022 10:09 am

Can you tell us about anything LESS ARBITRARY than the statement of (my stress) “significant fossil fuel use”? … and that statement implying that CO2 emissions are the sole cause of temperature changes?

Reply to  kzb
August 17, 2022 10:23 am

According to some sources, the Industrial Revolution began in 1760.

MarkW
Reply to  kzb
August 17, 2022 10:25 am

Fossil fuel use in 1860 was not “significant”.
It didn’t become “significant” until sometime after 1950.

Reply to  MarkW
August 18, 2022 1:53 am

In SE Australia, most rural temperature readings show a cooling from about 1900 (the Federation drought) to the late 1970s, then a warming trend. Sydney shows a warming from 1950, but I put that down to the immigration growth in the cities after WWII and the associated UHI effect (all capital cities attracted migrants after WWII, and this would explain a lot of the warming in the cities). The cool PDO (~1944-1978) may have had an effect on the cooling in the 1970s. The recent La Nina most likely affected last year’s Summer, which dropped up to 3 degrees C from the year before. I also think that from the 50s, country weather stations were moved to the airports, also contributing to an increasing trend.

Also, as an exercise, I graphed the temperatures (mean max Summer) of 3 central England stations (CET (comp), Sheffield & Oxford) and compared them to the mean max Summer temperatures of 3 remote Scotland stations (Lerwick, Stornoway & Tiree). The 3 England stations trended at 1.1C per century, whereas the Scotland stations trended at only 0.2C per century. What I have seen through my research is that the UHI effect is not adequately accounted for in any adjustment.

[Attached image: UK remote vs urban 3 vs 3.PNG]
Reply to  John B
August 18, 2022 1:56 am

Make of it what you will.

[Attached image: UK Scotland 5 stations Slide.JPG]
Carlo, Monte
Reply to  John B
August 18, 2022 5:48 am

Very interesting to see.

Clyde Spencer
Reply to  kzb
August 17, 2022 11:04 am

No matter how well justified or rationalized, one has to be concerned about spurious correlations. Correlations may have predictive value, but they may have little or no value for establishing causation, particularly when time is the independent variable.

Reply to  Clyde Spencer
August 17, 2022 3:57 pm

They may also have *no* predictive value. In a time series, giving equal weight to past history as you give to current history is a recipe for forecasting disaster. Unless you can define a functional causal relationship for the values of the time series you really don’t have an idea of what is going to happen in the future.

Ask anyone who has tried to forecast where a new telephone central office or a new grocery store should be located. If you just use population growth data over time, with no weighting, as a predictor, your prediction is quite likely to be wrong. Your long-term trend may indicate growth while the more recent data indicates movement away from the long-term trend (e.g. conversion from landlines to cell phones, population movement away from high-density areas).

Focusing solely on CO2 as the main driver for the functional relationship of temperature and time is missing the forest for the trees!

Clyde Spencer
Reply to  Tim Gorman
August 17, 2022 4:59 pm

Yes; implicit in the typical auto-correlation of time series is that the correlation tends to break down the farther back in time one goes. Note that I said, “may have predictive value.”

Reply to  kzb
August 17, 2022 11:33 pm

The start date is not really arbitrary. 

Of course, the start date is arbitrary.

Why not start with the MWP, the Roman Warm Period or the Minoan Warm Period?

Or would using the Holocene Climate Optimum as the starting point show the current warming period is nothing new?

Ron Long
Reply to  Kip Hansen
August 17, 2022 8:23 am

Thanks for this posting and reply. As a geologist who specializes in detecting gold deposit trends, I am certain not only that all data should be studied, but that the investigator must update the trend interpretation every time new and relevant data appears. However, after studying all the data, not all of it needs to be included in updated thinking; some data needs to be set aside (but not destroyed).

Reply to  Kip Hansen
August 17, 2022 8:28 am

Kip: as a geologist, it is ingrained in me to consider larger time frames. A favorite example in the climate sphere concerns a single dead rooted tree at Tuktoyaktuk on the NW Canada Arctic coast that was dated at 5000 yrs ago:


This is a white spruce, currently about 100 km north of the modern tree line and another couple of hundred km north of today’s living white spruce (yes, the very same species) of this size, where the average temperature would be some 6 to 8°C warmer than Tuk. Reckoning Arctic amplification at double the global average T at the time suggests it was 3 to 4°C warmer globally 5000 yrs ago, and no runaway anything or Crisis at all. It would have been a nice quiet forest with pleasant bird noises and occasional soughing wind through the treetops.

MarkW
Reply to  M Courtney
August 17, 2022 7:49 am

The problem with that is that the end points could contain anomalies.

Clyde Spencer
Reply to  MarkW
August 17, 2022 11:09 am

And, the slope(s) should always be presented with uncertainties and statistical significance, which is a rarity in climatology.
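The slope uncertainty asked for above is straightforward to compute; a sketch with numpy on synthetic data, using the usual OLS standard-error formula:

```python
import numpy as np

# OLS slope plus its standard error, so a trend can be quoted as
# "slope +/- uncertainty" instead of a bare number (synthetic data).
def slope_with_se(x, y):
    n = x.size
    b, a = np.polyfit(x, y, 1)                   # slope, intercept
    resid = y - (a + b * x)
    sxx = np.sum((x - x.mean()) ** 2)
    se = np.sqrt(resid @ resid / (n - 2) / sxx)  # standard error of the slope
    return b, se

rng = np.random.default_rng(1)
x = np.arange(200.0)
y = 0.01 * x + rng.normal(0.0, 2.0, x.size)      # true slope 0.01 + noise
b, se = slope_with_se(x, y)
print(f"slope = {b:.4f} +/- {2 * se:.4f} (rough 95% interval)")
```

When the 95% interval straddles zero, the honest statement is “no detectable trend,” not a bare slope.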

Reply to  Clyde Spencer
August 17, 2022 4:00 pm

You just nailed the problem with *all* temperature data graphs. No uncertainties presented – meaning you don’t know if the data lies within the uncertainty interval or not, thus you simply don’t know what the true trend is.

All temperatures should be given as “stated value +/- uncertainty” and then the uncertainty should be propagated into the results using standard procedures.
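One such standard procedure for propagating a stated “value +/- uncertainty” into a trend is Monte Carlo: perturb each reading within its uncertainty, refit, and look at the spread of the resulting slopes. A sketch with an assumed per-reading uncertainty of 0.5 (synthetic data, purely illustrative):

```python
import numpy as np

# Monte Carlo propagation of a per-reading measurement uncertainty u into
# the trend: perturb, refit, repeat, and report the spread of slopes.
rng = np.random.default_rng(0)
x = np.arange(204.0)                       # 17 years of monthly readings
measured = 0.005 * x + rng.normal(0, 0.5, x.size)

u = 0.5                                    # assumed per-reading uncertainty
slopes = []
for _ in range(2000):
    perturbed = measured + rng.normal(0, u, x.size)
    slopes.append(np.polyfit(x, perturbed, 1)[0])
slopes = np.asarray(slopes)
print(f"slope = {slopes.mean():.4f} +/- {slopes.std():.4f} per month")
```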

Richard M
Reply to  M Courtney
August 17, 2022 7:59 am

I disagree. Sometimes you already know “all the data” will be influenced by a known factor. All you will see is the effect of that factor. Hence, it would be much more informative to try to eliminate the influence of that factor.

In this case we know the PDO went negative in 2006 and turned positive in 2014. Hence, we would expect a positive trend based only on the influence of the PDO. In this case the only really meaningful data is what happened prior to and after the PDO switch. We were shown the 2015-2022 data. It’s a start. The 2006-2014 data would also be informative.

Truthbknown
Reply to  M Courtney
August 17, 2022 9:21 am

That is the “science” part of it! They manipulate the data to brainwash idiots!

ferdberple
Reply to  M Courtney
August 17, 2022 4:52 pm

If changing the endpoints changes the trend, then what you have is not a trend.

What you have is a curve fit of degree 1 with no predictive power.

Scissor
August 17, 2022 6:07 am

From one month to the next, global T commonly jumps by a third of a degree. A handful more jumps up than down over a decade or even more so over a century is probably insignificant from a statistical point of view.

Scissor
Reply to  Kip Hansen
August 17, 2022 10:17 am

Yes, that’s a key question.

I know at least one physicist who holds the opinion that a number of contributors to the global climate system behave as nonlinear chaotic subsystems, which are difficult to model.

Scissor
Reply to  Kip Hansen
August 17, 2022 2:31 pm

There you go.

Jeff Alberts
Reply to  Kip Hansen
August 17, 2022 3:27 pm

Haven’t they re-defined that already? To fit their agenda, like everything else?

bdgwx
Reply to  Jeff Alberts
August 17, 2022 8:02 pm

Jeff Alberts said: “Haven’t they re-defined that already? To fit their agenda, like everything else?”

No. They’ve always maintained that predictions of exact states (meaning exact properties at exact times at exact locations) are nearly impossible in the long range. In fact, the data suggest that useful skill in predicting exact states will always be limited to about 15 days regardless of advancements in modeling. After that, predictions tend to focus on the movement of the attractors of the chaotic system, and so only the average state (meaning the average of the properties over large spatial and temporal domains) is possible.

Reply to  bdgwx
August 18, 2022 12:30 pm

This is from the book Confessions of a Climate Scientist by Mototaka Nakamura.

“That is, the selection of the parameter values is a poor engineering process to “make the model appear to work” rather than a scientific process. The models are “tuned” by tinkering around with values of various parameters until the best compromise is obtained. I used to do it myself. It is a necessary and unavoidable procedure and is not a problem so long as the user is aware of its ramifications and is honest about them. But it is a serious and fatal flaw if it is used for climate forecasting/prediction purposes.”

This is from a researcher who worked with, and did research using, GCMs.

August 17, 2022 6:12 am

There are endless ways to have fun–or maybe it’s make mischief–with trends. One that I always point out is the way in which climate-textbook author Raymond Pierrehumbert used sea-level trends to accuse Unsettled author Steven Koonin of cherry picking.

[Attached image: Fig 4.png]
GuyFromBerlin
Reply to  Joe Born
August 17, 2022 6:26 am

Looks like very improperly done low-passing to me! There are so many better algorithms for this kind of thing – one might look into A/V processing techniques. If you mutilated an audio or video signal (which in its digital form is just the same thing as any other time series of values; it doesn’t matter whether what you sample is sea levels, temperatures, movements of a microphone diaphragm, or brightnesses of a row of pixels) in the way I regularly see done to climate-related “signals”, your eyes and ears would tell you in no time that THIS DOESN’T WORK, because the resulting sound and image would be distorted beyond recognition. Lowpassing without ringing, rippling and phase/time distortion isn’t easy, and certainly not doable with sophomore mathematics like fitting trend lines or trailing averages. No wonder people see everything and nothing in the resulting graphs (just as you might, with enough imagination, hear anything from Beethoven to the noise of Niagara Falls in audio “processed” in such crude ways…)

Reply to  GuyFromBerlin
August 17, 2022 8:40 am

Dr. Pierrehumbert based his argument on trends, so in this case plotting trend values was the appropriate approach.

Yes, I often use Gaussian or binomial filters when I’m looking at temperature time series, but temperature series aren’t audio signals, and, esthetics aside, I haven’t been able to come up with a satisfactory argument for the proposition that in the case of temperature series such filters are necessarily superior to, say, rectangular filters.

Reply to  Joe Born
August 17, 2022 10:12 am

Joe, the problem with using rectangular filters is exemplified by this graph from my post “Sunny Spots Along The Parana River“.


It shows the normalized sunspot anomaly, along with the 11-year rectangular “boxcar” filter of the same data. The correlation is horrible, R^2 is 0.01, and many peaks are converted into troughs and vice versa.

So yes, almost ANY filter would be preferable to a boxcar filter in this case.

w.
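The point above is easy to verify numerically: a moving average whose window equals the cycle period nearly annihilates the cycle. With an idealized 11-year sinusoid standing in for the sunspot cycle:

```python
import numpy as np

# An 11-year boxcar applied to an idealized 11-year cycle: the moving
# average wipes the cycle out almost entirely, because each window spans
# exactly one full period (and off-period components get phase-mangled).
years = np.arange(0.0, 66.0, 1.0)          # annual samples
cycle = np.sin(2 * np.pi * years / 11.0)   # idealized 11-year "sunspot" cycle

kernel = np.ones(11) / 11.0                # 11-year boxcar
smoothed = np.convolve(cycle, kernel, mode="valid")
print(f"input amplitude  {np.max(np.abs(cycle)):.3f}")
print(f"output amplitude {np.max(np.abs(smoothed)):.2e}")  # essentially zero
```

Real sunspot cycles are not exactly 11 years long, so the output is not exactly zero; what survives is a near-random residue, consistent with the R² of 0.01 reported above.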

Reply to  Willis Eschenbach
August 17, 2022 11:01 am

Yes, I understand that the filtered signal doesn’t look like the original signal. But preferable implies some criterion; what do you want the filtered signal to tell you? If the question were what the eleven-year average is, then the rectangular filter would be precisely what you want.

True, such a filter is usually undesirable in signal processing because it leaves what the guys I’ve dealt with seem to call “side lobes” in the resultant spectrum. I’m told that this is bad for, e.g. frequency-division multiplexing.

But I’m also told that the zeroes in such a filter’s Fourier transform (a sinc function) may enable you to suppress unwanted periodicity in the original signal.

Hey, you probably see something I’m missing; I don’t profess to be a filter expert. I made my living as a lawyer, not an engineer. But different horses for different courses is the impression I took from guys who seemed to know this stuff.
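The “zeroes in such a filter’s Fourier transform” can be made concrete: an 11-sample boxcar has frequency-response nulls at exactly 1/11, 2/11, … cycles per sample, which is both what lets a well-placed boxcar deliberately suppress a known periodicity and what guts a cycle near the window length (sketch):

```python
import numpy as np

# Frequency response of an 11-sample boxcar: the sinc-shaped |H(f)| has
# exact nulls at multiples of 1/11 cycles per sample.
kernel = np.zeros(132)
kernel[:11] = 1.0 / 11.0                   # 11-sample boxcar, zero-padded
H = np.abs(np.fft.rfft(kernel))
freqs = np.fft.rfftfreq(132)               # cycles per sample

i_null = np.argmin(np.abs(freqs - 1 / 11)) # the bin at an 11-sample period
print(f"|H| at DC: {H[0]:.3f},  |H| at 1/11 cycles/sample: {H[i_null]:.2e}")
```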

Reply to  Joe Born
August 17, 2022 11:49 am

The only useful information conveyed by the analysis is that the sunspot cycle is approximately 11 years long, so you get an extreme moiré effect through the filter. Think of filming a propeller at a frame rate that is almost an exact multiple of its rotation frequency.

Reply to  Joe Born
August 17, 2022 10:53 am

Filters are primarily used to eliminate real “noise”. What is the noise you are hoping to eliminate, and what is the signal you are hoping to isolate?

Is there something other than temperature you are hoping to see?

Reply to  GuyFromBerlin
August 17, 2022 9:27 am

OMG, how appropriate. Someone who knows more than simple averaging. Looking for a “signal” in data that IS THE SIGNAL is confirmation bias of the worst kind.

Reply to  Kip Hansen
August 17, 2022 8:36 am

As I keep pointing out to those who think Monckton’s pauses and rapid cooling periods are relevant: even Monckton points out that short-term trends are bogus.

But just because some trends are cherry-picked to make a point doesn’t mean that all trends are meaningless, and there are ways of telling one from another. Is it significant? Does it represent a significant change? Does the short-term trend meet up with the end of the previous change, or does it create an unrealistic discontinuity?

Reply to  Bellman
August 17, 2022 9:06 am

The trend is not bogus; the value of the trend is. But, as you say, Monckton points that out, meaning that you agree with him. But then you keep arguing with him about showing the trend, which can only mean you dislike Monckton and not the trend.

Reply to  Bellman
August 17, 2022 10:56 am

The trend Monckton is using is to show that CO2 does not control temperature. That is all. The longer the pause the less likely that growing CO2 is even a factor!

Reply to  Jim Gorman
August 17, 2022 4:24 pm

Could you point to a single example where Monckton says that?

bdgwx
Reply to  Jim Gorman
August 17, 2022 7:18 pm

I don’t know that Monckton is saying that. But I find it ironic that several of us on here who have actually attempted to model the UAH temperature using CO2 as a parameter expect the pause to continue for several more months.

Clyde Spencer
Reply to  Bellman
August 17, 2022 11:36 am

The alarmist thesis is that CO2 is the dominant driver of temperature; therefore, one would expect a high correlation between the measured CO2 increase and the temperature. “Monckton’s Pauses” clearly demonstrate the ineffectiveness of monotonic annual increases of CO2, whose effect is easily over-powered by other factors.

During the July 1991 total solar eclipse that I observed at Cabo San Lucas (Baja), during the 6+ minutes of totality, I measured a drop in air temperature of about 1 deg F per minute. Obviously, there are things that affect global temperature (such as clouds and aerosols) that have more influence than CO2.

bdgwx
Reply to  Clyde Spencer
August 17, 2022 12:09 pm

The Monckton Pause appears to be in line with expectations to me.


Derg
Reply to  bdgwx
August 17, 2022 1:55 pm

We just need one more model and then the science will be completely settled.

Meanwhile China doesn’t give 2 runny sh1ts and uses fossil fuels like their lives depend on it.

Hashbang
Reply to  Derg
August 18, 2022 1:34 am

Which it does and so do ours.

Reply to  bdgwx
August 17, 2022 2:04 pm

So you think CO2 is THE CONTROL KNOB, along with whoever wrote the model? Look at the difference between the model runs without CO2 (wo/CO2) and with CO2 (w/CO2)!

Looks like tuning to me. How does the projection from 5 years ago compare to what this shows now? What did they change to show cooling?

bdgwx
Reply to  Jim Gorman
August 17, 2022 2:27 pm

No. I don’t think CO2 is THE control knob. The data is not consistent with that hypothesis. However, the data is consistent with it being A control knob.

Derg
Reply to  bdgwx
August 17, 2022 3:39 pm

A control knob 😉

That was funny

Reply to  bdgwx
August 17, 2022 4:09 pm

How do you know it isn’t an “effect” from other control knobs?

bdgwx
Reply to  Tim Gorman
August 17, 2022 6:43 pm

That model does not tell you if CO2 is driven by other control knobs. What the model does show is that increasing CO2 is not inconsistent with the Monckton Pause.

Derg
Reply to  bdgwx
August 18, 2022 6:05 am

It shows that the models are wrong.

Carlo, Monte
Reply to  Derg
August 18, 2022 6:23 am

bgwxyz has a big audio soundboard of control knobs…

bdgwx
Reply to  Derg
August 18, 2022 6:56 am

Can you post the objective criteria by which you claim the model I posted above is wrong?

Clyde Spencer
Reply to  bdgwx
August 17, 2022 5:16 pm

Like playing an electronic synthesizer.

If CO2 isn’t THE control knob, why is there so much effort and money being spent on controlling it?

Clyde Spencer
Reply to  bdgwx
August 17, 2022 5:05 pm

Note that CO2 is just one of 4 explicit parameters. The other parameters are composites of several different physical measurements, chief among them being temperature.

The divergence supposedly due to CO2 doesn’t show up until about 1994. Mauna Loa measurements have been taken since 1958, and CO2 has supposedly been impacting temperature since the late 18th century. It is weak support for your claim.

bdgwx
Reply to  Clyde Spencer
August 17, 2022 6:39 pm

My claim is that the Monckton Pause is not inconsistent with CO2 being a control knob. It is certainly inconsistent with CO2 being the control knob though.

Sorry, I don’t engage in policy discussions. I’m only interested in the science.

Derg
Reply to  bdgwx
August 18, 2022 6:06 am

Lol the science 😉

Kinda like the gene therapy shot science?

Clyde Spencer
Reply to  bdgwx
August 18, 2022 11:25 am

I didn’t bring up any issues of policy. It looks like an attempt at deflection on your part rather than a response to the scientific facts I pointed out that you had wrong.

bdgwx
Reply to  Clyde Spencer
August 18, 2022 12:43 pm

CS said: “I didn’t bring up any issues of policy.”

You asked: “If CO2 isn’t THE control knob, why is there so much effort and money being spent on controlling it?” That is a question that directly relates to public policy. I don’t participate in those kinds of discussions.

Reply to  Clyde Spencer
August 17, 2022 4:29 pm

The alarmist thesis is that CO2 is the dominant driver of temperature; therefore, one would expect that there would be a high correlation between the measured CO2 increase and the temperature

If true, it’s the dominant driver of temperature on the scale of the next century or so. That doesn’t mean that over a few years other factors, such as ENSO, don’t have a bigger effect.

The correlation between CO2 and temperature has not changed over this “pause”. If anything, it’s gotten stronger.

Obviously, there are things that affect global temperature (such as clouds and aerosols) that have more influence than CO2.

Well, yes, turning off the sun will have a bigger effect than CO2. But what happened after the eclipse? Did temperatures remain 6°F lower, or did they eventually return to equilibrium?

Clyde Spencer
Reply to  Bellman
August 17, 2022 5:21 pm

After the roosters quit crowing, the sun decided to grant their plea, and returned the air temperature to what it had been.

The point of the story is how quickly air temperatures respond to insolation. It doesn’t take 7 years to restore a perturbation.

TheLastDemocrat
Reply to  Clyde Spencer
August 17, 2022 4:46 pm

This statement from CS is very important:
“the ineffectiveness of monotonic annual increases of CO2 and that it is easily over-powered by other factors.”

We are trying to see if one thing co-varies with another. If one data set is a monotonic function, you cannot theoretically carry out a covariance analysis (“covariance” with a small “c”).

We are trying to examine whether “planet temp” covaries with “co2.” A more complete way to say this is: if co2 goes up, does planet temp go up, and if co2 goes down, does planet temp go down?”

We can ask that. But to answer it, we have to have both ups and downs in “planet temp,” and also in “co2.” – This is an “assumption” that must be met.

For our examinations of “does planet temp go up or down as co2 goes up or down,” we NEVER have co2 going down!

So, these time-based analyses of the relation between co2 and planet temp are not valid.

The Mauna Loa record goes back to 1958 and has been monotonic the whole time.

If we trust ice core estimates of co2 back to the last Ice Age, then we have a chance at data to suit the analysis. And (proxy-based) temps and co2 trends do not really co-vary at all. So the temp-co2 relation hypothesis is dead before it starts. The co2 influence upon planet temp is so modest, if it exists, that it cannot be discerned out of the other, much greater influences – not even with PCA as was done in MBH98.

https://gml.noaa.gov/ccgg/trends/
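The monotonic-confounder point above is easy to demonstrate: two series that both merely rise with time correlate strongly even when they are generated independently, so “both went up together” alone proves very little (illustrative random data):

```python
import numpy as np

# Two independently generated, steadily rising series: despite having
# nothing to do with each other, they correlate almost perfectly,
# because "time going up" dominates both.
rng = np.random.default_rng(7)
n = 120
a = np.cumsum(np.abs(rng.normal(1, 0.5, n)))   # any rising series...
b = np.cumsum(np.abs(rng.normal(1, 0.5, n)))   # ...and another, unrelated

r = np.corrcoef(a, b)[0, 1]
print(f"correlation of two unrelated rising series: r = {r:.3f}")
```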

Reply to  TheLastDemocrat
August 17, 2022 5:17 pm

+100

Clyde Spencer
Reply to  TheLastDemocrat
August 17, 2022 5:42 pm

we NEVER have co2 going down!

Actually, we do, every Summer and Fall. It did during 2020, with the April decline being -18% of anthro’ emissions compared to 2019. However, the 2020 seasonal variations in CO2 were indistinguishable from those of 2019 and 2021. All of those years fall within the “Monckton Pause.”

One can unequivocally say that a decline in CO2 does not result in the immediate response seen with either an eclipse, thick clouds passing overhead, or the sometime effect of dense contrails.

https://wattsupwiththat.com/2021/06/11/contribution-of-anthropogenic-co2-emissions-to-changes-in-atmospheric-concentrations/

https://wattsupwiththat.com/2022/03/22/anthropogenic-co2-and-the-expected-results-from-eliminating-it/

Derg
Reply to  Bellman
August 17, 2022 1:53 pm

No kidding, we had the 4th coldest April ever.

AGW is Not Science
Reply to  Kip Hansen
August 17, 2022 11:46 am

I often think of the “pre industrial” start point for the ‘climate’ nonsense as the ultimate cherry pick, one which allows continual “trend” obfuscation.

Just think of a roller coaster rising vs. falling measured relative to its start point. As it descends the first hill, at any point one could argue that the “long term trend” is “still up,” even as the coaster is going down.

This will be the climate fascist playbook until the inevitable decline of “average temperature,” as meaningless as that is, can no longer be hidden by “adjustments.”

Reply to  Kip Hansen
August 17, 2022 11:56 am

A trend is a trend, is a trend, but the question is, will it bend? Will it alter its course, through some unforeseen force and come to a premature end?

~Sir Alec Cairncross, economist

August 17, 2022 6:24 am

Trends have only 2 ways to go – up or down. They might be totally flat, but the chances of that are infinitely small with any random walk. Predicting that the climate should warm because of CO2 naturally has a 50% chance of materializing. It is not the most compelling evidence.

A number of GCM results show that surface warming produced by contrails is between 0.2 and 0.3 °C/decade

And then you have this largely ignored issue. We have something that explains the 5-decade warming trend, and it is NOT CO2. Moreover, if we take a closer look at the CO2 hypothesis, it is easy to see it is not even working, as overlaps were not considered in climate sensitivity estimates. Including overlaps, ECS simply collapses and turns negligible. Something even MODTRAN shows.


I can only recommend paying more attention to the contrail issue.

https://greenhousedefect.com/contrails-a-forcing-to-be-reckoned-with

Sandwood
August 17, 2022 6:28 am

Applying linear trend lines to cyclical data only works if the start and end points are at the same point in the cycle. When the system has multiple or irregular cycles, as in this case, or unknown cycles, an applied linear trend line can never be correct.
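This rule in numbers: fit a line to a pure cycle. Over whole cycles (start and end at the same point of the cycle; a cosine is used so the endpoints match) the slope is near zero; over a partial cycle a spurious “trend” appears (synthetic example):

```python
import numpy as np

# A pure cycle with period 10 and no underlying trend. "Whole cycles"
# spans exactly two periods peak-to-peak; "partial cycle" stops at 1.7
# periods, so the window endpoints sit at different phases.
results = {}
for label, t_end in (("whole cycles", 20.0), ("partial cycle", 17.0)):
    t = np.arange(0.0, t_end, 0.05)
    y = np.cos(2 * np.pi * t / 10.0)
    s, _ = np.polyfit(t, y, 1)
    results[label] = s
    print(f"{label}: fitted slope {s:+.4f}")
```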

Reply to  Sandwood
August 17, 2022 6:55 am

This is my big gripe. The parts are pretty much cyclical – orbital, sun, currents, precession, etc. They all combine into the complex we call climate.

Trying to do a linear regression on a cyclical phenomenon leads to HUGE uncertainty levels.

Clyde Spencer
Reply to  Sandwood
August 17, 2022 11:40 am

And, with multiple cyclical drivers with different periods and phases, one ends up with constructive and destructive interference, meaning that only by Fourier decomposition can anything be deduced about long-term trends.
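A sketch of that Fourier-decomposition idea: with two interfering cycles plus a small linear trend, the periodogram of the detrended series picks out the dominant period, which can then be removed before judging any long-term trend (synthetic data):

```python
import numpy as np

# Two interfering cycles (periods 32 and 50 samples) plus a small linear
# trend. Detrend, take the periodogram, and read off the dominant period.
n = 512
t = np.arange(n, dtype=float)
y = 0.002 * t + np.sin(2 * np.pi * t / 32) + 0.5 * np.sin(2 * np.pi * t / 50)

detrended = y - np.polyval(np.polyfit(t, y, 1), t)
spec = np.abs(np.fft.rfft(detrended))
freqs = np.fft.rfftfreq(n)
peak_period = 1 / freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin
print(f"dominant period detected: {peak_period:.1f} samples")
```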

Reply to  Kip Hansen
August 17, 2022 5:01 pm

It is not real. From the other thread on averaging intensive variables, the global average temperature just doesn’t relate well to enthalpy. I’ve seen too many projections from people I trust, for places like Iceland, Greenland, Japan, the U.S., and South Africa, where individual stations show no warming. If CO2 is well mixed, then everywhere should be warming, at least at a constant rate, even if the rates differ.

For every place like the image I took from another thread, there must be another place that has a lot of warming. Why do none of the CAGW folks here ever post any local graphs that show really high warming? They would need one, isolated from UHI, with, what, at least 3 degrees of warming? Where does that exist using raw station data?

Hachihojima_2022_07 mean temps.png
Clyde Spencer
Reply to  Kip Hansen
August 17, 2022 5:47 pm

You typed an “r” for an “f” again. 🙂

Ten percent is a relatively small change, but it can be measured and be meaningful with high-precision data. The problem is that it is not uncommon to observe an uncertainty of +/-50% in climatology measurements, when they even bother to cite the uncertainty.

Lit
August 17, 2022 6:28 am

Just do average temperature in Kelvin plotted for 100 years. You won’t even see a trend.

Reply to  Lit
August 17, 2022 6:58 am

Go back and read what his purpose is. When you find a deterministic functional relationship for CO2 vs temp, post it.

Reply to  Jim Gorman
August 17, 2022 7:34 am

Like this SST one?

Screenshot_20220623-085546_Drive.jpg
Carlo, Monte
Reply to  Macha
August 17, 2022 7:51 am

Substitute U.S. Postal rates for sea surface temperature…

TheLastDemocrat
Reply to  Macha
August 17, 2022 4:53 pm

Macha – the vertical axis is not CO2; it is “time.”
Prove me wrong.

Plotting this way does not get rid of the confounder of time.

Plot Mauna Loa x time. Same graph.

As Del Monte Carlo says: you can also substitute postal rate.

There is no up-and-down in time; time just goes up and up. Also, no up-and-down in postal rate; it just goes up, and pretty monotonically.

August 17, 2022 6:38 am

a. Don’t use short time periods when determining trends in a time series. Trends are always sensitive to start and end dates.

Lord Monckton, take note!

Reply to  Kip Hansen
August 17, 2022 7:21 am

Are you referring to his “The Pause is now X year long”?

Actually, he has it down to the month, at the moment. A trend is still a trend, whether you count backwards or forwards from or to a particular point. So your advice still applies to Lord M’s monthly UAH ‘pause’ updates: too short a period and highly dependent on start and end dates.

Geoff Sherrington
Reply to  TheFinalNail
August 17, 2022 7:44 am

TFN,
I use this method because it helps detect a change in trend.
Like, Australia UAH LT shows no linear least squares fit warming for the last 10 years.
If we are in a transition from warming to cooling, we would expect this.
It indicates when the transition started, so we can examine what causative variables might have changed then.
There is no suggestion that coming temperatures will rise, fall or stay flat – the data are incapable of predicting the future, they merely observe the past. Geoff S

Clyde Spencer
Reply to  Geoff Sherrington
August 17, 2022 11:48 am

Time-series are often auto-correlated, so at least short-term predictions are more reliable than random guesses.

Reply to  Kip Hansen
August 17, 2022 8:59 am

Quite right. I have had no reluctance to criticize Lord Monckton’s ravings on other subjects, but his pauses are what they are, and he (largely) leaves us to make of them what we will.

To me they militate against the proposition that natural variation is minor.

Derg
Reply to  Joe Born
August 17, 2022 1:58 pm

He uses their math to show the folly of their theory.

Reply to  Kip Hansen
August 17, 2022 9:08 am

No, really, a PAUSE is a temporary phenomenon — thus it must have a start and an end (even if it is ‘to-date’) and is expressly only part of some greater whole (in time).

Except he still keeps talking about the old pause, and that ended sometime after the new one began.

His claim is only that rise in temperature (UAH) has temporarily ceased rising since such-and-such a date.

UAH warming rate from 1979 to the start of pause was 0.11°C / decade.

UAH warming rate from 1979 to current date is 0.13°C / decade.

I’m not sure what the evidence is that warming has temporarily ceased.

Reply to  Bellman
August 17, 2022 10:39 am

We were told that if a pause occurred and lasted over 15 years, then all the models were wrong, because models never predicted a pause that long.

When the pause lasted longer than required we got excuses. The models were still claimed to be valid even after alarmists said they should not be.

Clyde Spencer
Reply to  Bellman
August 17, 2022 11:54 am

“Except he still keeps talking about the old pause, …”

If your car engine has suddenly stopped on more than one occasion, would you conclude that there was some problem that needed to be discovered and fixed to increase the reliability?

The hiatus is an indicator that, because there is not a close coupling between CO2 and temperature, the models are defective and are unreliable for periods of several years running.

bdgwx
Reply to  Clyde Spencer
August 17, 2022 2:22 pm

Several months ago I tested the hypothesis that models do not predict long pauses like the current one.

Using the CMIP5 model data from the KNMI Explorer and the Monckton method the data suggests that about 21% of the months should be included in a pause lasting at least 90 months. The UAH observations showed 26%. That’s not a bad prediction.

Using my trivial 3 component model above and the Monckton method, the data suggests that about 26% of the months should be included in a pause lasting at least 90 months. Again, the observation is 26%. The model is in excellent agreement with the observations.
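The “Monckton method” referred to here can be sketched as follows (a hedged toy version, as the method is usually described in these threads: the longest span ending at the latest month whose least-squares trend is zero or negative; the exact criterion in the original articles may differ, and the function name is mine):

```python
import numpy as np

def pause_length(anoms: np.ndarray) -> int:
    """Longest span ending at the latest point with a non-positive OLS trend."""
    n = len(anoms)
    for start in range(n - 1):                 # earliest start first
        window = anoms[start:]
        slope = np.polyfit(np.arange(len(window)), window, 1)[0]
        if slope <= 0:
            return len(window)                 # first hit = longest qualifying span
    return 0                                    # no span with non-positive trend

print(pause_length(np.arange(120.0)))          # steadily rising series -> 0
print(pause_length(-np.arange(120.0)))         # steadily falling series -> 120
```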

Reply to  bdgwx
August 17, 2022 4:18 pm

The problem is that the models turn into y = mx + b equations after a few years. Run ’em for 100 years, 200 years, or 10,000 years – the models will predict that the earth *will* turn into a cinder, no more ice ages. No pauses, no bending of the projection, no nothing. *That* is not a good model.

TheLastDemocrat
Reply to  Tim Gorman
August 17, 2022 4:59 pm

This is true, and at least partly because relations between “forcings” have to include geometric functions – tipping points. Water heats and heats and heats – and then turns into steam.

If you do not model at least one “tipping point” correctly, one influence is going to be exaggerated, and it is only a matter of time until it runs crazy.

So, you add bounds to at least one of the relations. Well, then you either run off the chart another way, or are limited to an upper and lower bound forever, always to be hit and never exceeded.

Good luck.

bdgwx
Reply to  Tim Gorman
August 17, 2022 6:47 pm

Which one of the CMIP5 members shows Earth turning into a cinder in 100, 200, or 10000 years? Can you post a link to the data you are looking at so that I can download and verify it?

Reply to  bdgwx
August 18, 2022 6:18 am

How about this graph: (see attached)

Do you see any bending in the 8.5 or 6.0 scenarios? Even the 4.5 scenario shows continual growth, although it does show the slope changing in the out years. The only one that shows growth stopping is the 2.6 scenario, the one no one believes will ever happen (see India and China for examples).

cimp5_graph_2100.png
Carlo, Monte
Reply to  Tim Gorman
August 18, 2022 6:27 am

He lives in De Nile.

bdgwx
Reply to  Tim Gorman
August 18, 2022 7:29 am

None of those show the Earth turning into a cinder.

bdgwx
Reply to  bdgwx
August 19, 2022 10:40 am

I don’t really have much choice at this point but to dismiss the claim that CMIP5 shows the Earth turning into a cinder.

Reply to  bdgwx
August 19, 2022 11:15 am

When they continue going up with no bound then the only conclusion is that the earth *WILL* turn into a cinder. That’s what the “consensus” climate scientists are predicting. We will soon reach the tipping point where we can’t stop the warming and it will kill us all! Just ask John Kerry!

bdgwx
Reply to  Tim Gorman
August 19, 2022 12:40 pm

TG said: “When they continue going up with no bound then the only conclusion is that the earth *WILL* turn into a cinder.”

Where do you see that any of the CMIP5 models show the global average temperature continuing to go up indefinitely to the point that Earth turns into a cinder? All I’m asking for is a link or some kind of reference here. I’ll give you one last chance here. After that I don’t have any other option but to dismiss the claim.

BTW…how would you even be able to assess whether the Earth turns into a cinder using the global average temperature anyway? According to Essex et al. it does not actually exist.

Reply to  bdgwx
August 20, 2022 1:23 pm

I gave you a graph. In the out years both RCP 8.5 and 4.5 are nothing more than y = mx + b linear trends. The termination of the runs is determined by computing limits more than by the trend ending.

“After that I don’t have any other option but to dismiss the claim.”

That’s one of your main tactics. Argument by Dismissal, an argumentative fallacy. You simply can’t show where the 8.5 and 4.5 runs bend down in any way, shape, or form. So you just dismiss any argument that says the trend lines from the models will continue indefinitely until the oceans boil!

“BTW…how would even be able to assess whether the Earth turns into a cinder using the global average temperature anyway? According to Essex et al. it does not actually exist.”

It doesn’t exist. That doesn’t keep the climate models from saying it does and using it to forecast the end of the world.

Clyde Spencer
Reply to  bdgwx
August 17, 2022 5:59 pm

“Using my trivial 3 component model above …”

I count 4 variables plus a constant.

Not knowing the details of how you did your analysis, I can’t provide an informed comment.

bdgwx
Reply to  Clyde Spencer
August 17, 2022 6:34 pm

Yes. Typo. 4.

I performed the analysis using the Monckton method.

Reply to  Clyde Spencer
August 17, 2022 4:15 pm

+100!

Jeff Alberts
Reply to  Bellman
August 17, 2022 5:47 pm

At those rates, in ten decades, you still wouldn’t notice a change in temperature.

Richard M
Reply to  TheFinalNail
August 17, 2022 8:11 am

The pauses continue to be a big problem for the climate cult. It’s now pretty obvious the breakpoint was due to the PDO phase change in 2014. When you look before and after that obvious warming influence you see nothing. The only possible conclusion is that we’ve gone 25 years with no evidence for man-made warming.

https://woodfortrees.org/plot/uah6/from:1997.5/to/plot/uah6/from:1997.5/to:2014.5/trend/plot/uah6/from:2015.5/to/trend/plot/uah6/from:2014.5/to:2015.5/trend

Reply to  Richard M
August 17, 2022 8:47 am

You mean if we leave out the periods of natural warming but keep in the periods of natural cooling we can get a cooler temperature series? Never thought of that.

Reply to  TheFinalNail
August 17, 2022 8:43 am

Why don’t you post the reason for examining this short trend so everyone can see you are in error!

CO2 is not a control knob for temperature if temps don’t increase along with increasing CO2.

Reply to  Jim Gorman
August 17, 2022 9:10 am

Why do you always want to ignore the uncertainty in these trend lines?

Clyde Spencer
Reply to  Bellman
August 17, 2022 11:56 am

Funny that you would accuse Jim of that! The uncertainty is taken into account when determining the statistical significance.

Carlo, Monte
Reply to  Clyde Spencer
August 17, 2022 3:04 pm

No kidding, this is a troll comment if there ever was one.

Reply to  Clyde Spencer
August 17, 2022 3:20 pm

Yes, ironic. Jim, Tim and Carlo have spent years trying to claim that UAH data is completely unreliable because uncertainty increases the more observations you make, that it’s impossible to know what any trend is, and now won’t even accept that it’s possible to average temperature.

Yet they will then insist that a carefully selected flat trend line covering a few years of highly variable data is somehow so certain you can use it to prove anything you want. In the last discussion I was being told that there is zero uncertainty in the trend, whilst at the same time being told the true trend over the 7-year period could be at least ±3°C / decade.

Carlo, Monte
Reply to  Bellman
August 17, 2022 3:42 pm

“Last discussion I was being told that there is zero uncertainty in the trend”

You’re insane.

Reply to  Carlo, Monte
August 17, 2022 4:27 pm

Yep!

Reply to  Bellman
August 17, 2022 4:26 pm

In other words you get pissed when someone else uses the data you look at as religious dogma in order to disprove the CAGW theory!

ROFL!!

Reply to  Tim Gorman
August 17, 2022 5:31 pm

I don’t know how many times I have to tell you this before it penetrates your understanding, but I do not regard UAH data as perfect. I don’t regard any data as perfect, but UAH least of all. The only reason I go along with UAH data here is because until recently it was the only data set regarded as acceptable.

Just because I don’t accept the ±1.5°C uncertainty you want to pluck out of the air, does not mean I regard Dr Spencer’s work as religious dogma.

Reply to  Bellman
August 18, 2022 5:56 am

“Just because I don’t accept the ±1.5°C uncertainty you want to pluck out of the air, does not mean I regard Dr Spencer’s work as religious dogma.”

The only data I accept as minimally usable is degree-day data for cooling and heating. That data is actually in use by professionals to size HVAC systems and by agricultural scientists to advise farmers. The users of those HVAC systems and of the ag advice would quickly abandon the professionals and the ag scientists if they were continuously wrong. Climate scientists have no real users except governments, who don’t care if the climate scientists are wrong as long as their predictions lead to increased power in the hands of the politicians.

Even HDD/CDD has problems. It doesn’t tell you the enthalpy. It’s why you could have the same HDD/CDD values in Phoenix and Miami on the same day. Humidity and pressure aren’t considered. Until the climate scientists are forced to use actual physical attributes that can be compared I will remain doubtful of their conclusions. That doesn’t mean I won’t use their own data against them.

bdgwx
Reply to  Tim Gorman
August 18, 2022 6:55 am

TG said: “The only data I accept as minimally usable is degree-day data for cooling and heating.”

Back in Kip’s previous article you said the average temperature is meaningless and useless and even advocated for the position of them being non-existent. Yet you find HDD and CDD usable despite them being dependent on a meaningless, useless, and non-existent metric?

Reply to  Tim Gorman
August 18, 2022 11:53 am

“The only data I accept as minimally usable is degree-day data for cooling and heating.”

But they are based on the same temperature readings as any of the land based global sets. Do you use CDDs and HDDs as a global average or just for specific locations?

Reply to  Bellman
August 19, 2022 5:33 am

“But they are based on the same temperature readings as any of the land based global sets. Do you use CDDs and HDDs as a global average or just for specific locations?”

HDD and CDD are degree-days, not degrees. Still, I do *NOT* add them up to use them as a global average. Have you *EVER* seen me do that? Be honest!

What I do is to use them to evaluate if HDD/CDD is going up or down at a location. I can then assign them a cooling/warming metric for both minimum and maximum temps and can compare those metrics over a region. I try to use stations that are not located at an airport or in the middle of an urban heat complex.

If I find a region (e.g. central Peru) that shows mostly cooling/heating then I can ask “where is the offsetting region that would cause global warming?”. If I find a 0.5C cooling slope then there needs to be a 1.0C warming slope somewhere to offset it if we are to believe in “global” warming.

There is absolutely no reason why climate scientists could not do exactly the same. More complicated? Sure! So what? More believable result? Sure!

Doing minimums (HDD) and maximums (CDD) gives FAR more knowledge than the hokey, unphysical, global mid-range temperature average. I suspect that is one reason why climate science refuses to abandon its current method. It would make it easier to hold their feet to the fire over predictions of the future!

Carlo, Monte
Reply to  Tim Gorman
August 19, 2022 6:47 am

Can you imagine trying to make a model to spit out these numbers instead of a single global average?

Reply to  Carlo, Monte
August 19, 2022 2:20 pm

“Can you imagine trying to make a model to spit out these numbers instead of a single global average?”

Actually I can. But it wouldn’t fit the agenda!

bdgwx
Reply to  Tim Gorman
August 19, 2022 9:11 am

TG said: “What I do is to use them to evaluate if HDD/CDD is going up or down at a location.”

According to you HDD/CDD are dependent on a useless and meaningless quantity. So how exactly are you finding meaning and use in them?

Carlo, Monte
Reply to  bdgwx
August 19, 2022 10:27 am

“According to you HDD/CDD are dependent on a useless and meaningless quantity.”

A lie disguising a strawman.

bdgwx
Reply to  Carlo, Monte
August 19, 2022 11:50 am

CM said: “A lie disguising a strawman.”

If I’m mistaken then so be it. I assure you it is not intentional. My statement is based on the comments in Kip’s previous article in which TG advocated for the position that the average of temperature observations is meaningless and useless. Is that not TG’s position?

TG, what say you? Is your position that the average of temperature observations is meaningless and useless or not?

Reply to  bdgwx
August 19, 2022 2:24 pm

“TG, what say you? Is your position that the average of temperature observations is meaningless and useless or not?”

You don’t even know what the dimensions of HDD and CDD are, do you?

They are *NOT* averages. They are an integration of the temperature curve – i.e. they are the area between the temperature curve and a base-temperature line. They are an AREA, not a “degree”.

bdgwx
Reply to  Tim Gorman
August 19, 2022 7:39 pm

TG said: “They are *NOT* averages.”

I never said they were averages. I said they depend on temperature averages. Specifically the formula is HDD = 65 F – (Tmin + Tmax) / 2 or CDD = (Tmin + Tmax) / 2 – 65 F. [1]
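As a sketch of the formulas bdgwx quotes (65 °F base), with one addition of mine: the usual convention clamps a day’s degree days at zero, which the formulas as quoted omit. Function names are illustrative only.

```python
BASE_F = 65.0  # conventional base temperature in deg F

def hdd_mean(tmin_f, tmax_f):
    """Heating degree days for one day, mean-temperature method.
    The max(0, ...) clamp is the usual convention (never negative)."""
    return max(0.0, BASE_F - (tmin_f + tmax_f) / 2)

def cdd_mean(tmin_f, tmax_f):
    """Cooling degree days for one day, mean-temperature method."""
    return max(0.0, (tmin_f + tmax_f) / 2 - BASE_F)

print(hdd_mean(30, 50))  # mean 40 F -> 25.0 HDD
print(cdd_mean(70, 90))  # mean 80 F -> 15.0 CDD
```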

Carlo, Monte
Reply to  bdgwx
August 19, 2022 9:12 pm

bgwxyz throws another wad at the wall, hoping it will stick.

Carlo, Monte
Reply to  bdgwx
August 19, 2022 2:35 pm

You think if you can find just a single example of averaging temperatures then the entire Essex paper must be wrong, so you go around throwing up nonsense like this hoping it will stick.

Reply to  Carlo, Monte
August 19, 2022 2:22 pm

Yep.

Reply to  bdgwx
August 19, 2022 2:22 pm

“According to you HDD/CDD are dependent on a useless and meaningless quantity. So how exactly are you finding meaning and use in them?”

You *still* haven’t learned what an HDD/CDD is, have you? You *really* need to go learn about them. If they were meaningless then HVAC professionals would not be using them to size installations.

bdgwx
Reply to  Tim Gorman
August 19, 2022 7:36 pm

TG said: “You *still* haven’t learned what an HDD/CDD is, have you?”

I learned about degrees days long ago. According to the National Weather Service degree days are “the difference between the daily temperature mean, (high temperature plus low temperature divided by two) and 65°F.”


Reply to  bdgwx
August 21, 2022 4:50 am

“According to the National Weather Service degree days are ‘the difference between the daily temperature mean, (high temperature plus low temperature divided by two) and 65°F.’”

That is the *old* way of doing it — and it is wrong, wrong, wrong. I’m not surprised to see the government still using an old, wrong definition.

(Tmax + Tmin)/2 is *NOT* the daily temperature mean. It never has been. It’s just a convenient relationship to use with no regard to its physical accuracy.

The mean of the daytime temp is 0.63 Tmax. The mean of the nighttime temp is ln(2)/k, where k is the decay factor. So what is the actual mean of the daily temperature profile? It is *NOT* (Tmax + Tmin)/2!

The process used today is to integrate the temperature curve since most useful measurement stations record in minutes if not more often. This finds the actual area under the temperature curve in degree-days.

That is where the .63 factor comes from for the day and the ln(2)/k comes from for the night.

If you don’t have a quasi-smooth curve for either the day or night you can still do a numeric integration.

In essence you are using the argumentative fallacy of Appeal to Tradition. Stop it.
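The integration approach Tim describes above can be sketched like this (a toy version of my own, assuming frequent within-day readings and a 65 °F base; the trapezoid sum and names are illustrative, not from any cited source):

```python
import numpy as np

BASE_F = 65.0  # assumed base temperature

def hdd_integrated(times_h, temps_f):
    """Heating degree days from within-day readings.

    times_h: reading times in hours since midnight (0..24)
    temps_f: temperatures (deg F) at those times
    Only the portion below the base temperature contributes.
    """
    deficit = np.clip(BASE_F - np.asarray(temps_f, float), 0.0, None)
    widths = np.diff(times_h)                            # hours per interval
    area = np.sum(widths * (deficit[1:] + deficit[:-1]) / 2)  # trapezoid rule
    return float(area) / 24.0                            # degree-hours -> degree-days

# Sanity check: constant 55 F all day is a 10 F deficit for 24 h,
# i.e. 10 degree-days.
times = np.linspace(0.0, 24.0, 289)                      # 5-minute readings
print(hdd_integrated(times, np.full(289, 55.0)))         # -> 10.0
```

Unlike the mean method, this picks up a cold night followed by a warm afternoon even when (Tmin + Tmax)/2 lands exactly on the base temperature.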

bdgwx
Reply to  Tim Gorman
August 21, 2022 12:57 pm

Can you post a link to the literature 1) showing the new method and 2) showing that NWS retrospectively analyzed the data and replaced the old values with the new values in the official reporting? I’m very interested in reviewing the materials you can provide.

Reply to  bdgwx
August 21, 2022 6:23 pm

go here: https://www.degreedays.net/calculation

“Some other sources (particularly generalist weather-data providers for whom degree days are not a focus) are still using outdated approximation methods that either simplistically pretend that within-day temperature variations don’t matter, or they try to estimate them from daily average/maximum/minimum temperatures. We discuss these approximation methods in more detail further below, but, in a nutshell: the Integration Method is best because it uses detailed within-day temperature readings to account for within-day temperature variations accurately.”

Like I told you, the government still uses the old method of finding the mid-range temp (Tmax+Tmin)/2 and comparing that with the set-point temperature. I don’t know of any reputable engineer who still does that.

go here: https://www.researchgate.net/publication/299495512_An_integral_model_to_calculate_the_growing_degree-days_and_heat_units_a_spreadsheet_application

go here (this site only offers the integration method): https://energy-data.io/energy-tools/degree-day-calculator-free/

go here: https://www.researchgate.net/publication/262557717_Comparison_of_methodologies_for_degree-day_estimation_using_numerical_methods

I shouldn’t have to do this research for you. Having to do so only indicates to me that all you are interested in on this subject is being a troll.

bdgwx
Reply to  Tim Gorman
August 22, 2022 5:17 am

TG said: “go here: https://www.degreedays.net/calculation”

Yep. That’s what I thought you were going to reference.

The integration technique still uses average temperatures.

“Calculate the average number of degrees by which the temperature was below the base temperature over the calculated time period (1). In this simple example this is always the base temperature minus the average of the two recorded temperatures.”

Never mind that each temperature observation is itself an average over a 5-minute period. See the ASOS User Guide for details. In other words, all methods use an average temperature in some way, shape, or form, whether you understood that or not.

Carlo, Monte
Reply to  bdgwx
August 22, 2022 6:25 am

So why did you not just come right out and say what you found on the internet, and instead try to lay what you think is a trap, Mr. Disingenuous?

WOOHOO! bgwxyz takes another victory lap!

As usual, you are about as genuine as a three-dollar bill.

And your “nevermind”? This is averaging data from ONE instrument at ONE location. Does your hallowed ASOS guide tell you how to propagate the measurement uncertainty in this procedure?

Try again to Keep the Rise Alive.

bdgwx
Reply to  Carlo, Monte
August 22, 2022 8:00 am

The point is this. I don’t think you, Jim, Tim, and Kip are as committed to the proclamation that averaging temperatures is meaningless and useless as you let on. After all, we’re being told that HDD/CDD are meaningful and useful despite them being dependent on averaging temperatures. And it’s not just HDD/CDD. We’re being told that the particular soil moisture product used in the comments here is meaningful and useful enough to determine droughts despite it being dependent on averaging temperatures. It is a contradictory position. I’ll even argue that it is a hypocritical position, since we are all being told that we shouldn’t do it, yet the contrarians here not only do it themselves but defend its use vigorously.

Carlo, Monte
Reply to  bdgwx
August 22, 2022 8:23 am

You label anything you please “averaging”! Go lecture someone else.

bdgwx
Reply to  Carlo, Monte
August 22, 2022 8:44 am

CM said: “You label anything you please “averaging”! Go lecture someone else.”

Not at all. I only label the operation Σ[x_i, 1, n] / n as “averaging” and nothing more.

On the contrary, it is Essex et al. that were discussing how there are an infinite number of ways to average a set of values only one of which fits the canonical formula above. So you tell me…who’s labeling anything they please as “averaging”?

Carlo, Monte
Reply to  bdgwx
August 22, 2022 9:38 am

Not at all. I only label the operation Σ[x_i, 1, n] / n as “averaging” and nothing more.

Liar, go read the GUM H3 again. YOU!

bdgwx
Reply to  Carlo, Monte
August 22, 2022 10:11 am

bdgwx said: “I only label the operation Σ[x_i, 1, n] / n as “averaging” and nothing more.”

CM said: “Liar, go read the GUM H3 again. YOU!”

I did as you requested. It still says t_bar =  Σ[t_k, 1, n] / n which is the same as the last time I read it and the time before that.

Carlo, Monte
Reply to  bdgwx
August 22, 2022 10:28 am

Went to see the captain, strangest I could find
Laid my proposition down, laid it on the line
I won’t slave for beggar’s pay, likewise gold and jewels
But I would slave to learn the way to sink your ship of fools

Ship of fools sail away from me
It was later than I thought when I first believed you
But now I cannot share your laughter, ship of fools

The bottles stand as empty now, as they were filled before
Time there was and plenty, but from that cup no more
Though I could not caution all, I still might warn a few
Don’t lend your hand to raise no flag atop no ship of fools

Ship of fools sail away from me
It was later than I thought when I first believed you
But now I cannot share your laughter, ship of fools

Reply to  bdgwx
August 22, 2022 10:50 am

Here is the deal. Many of us have been calling the GAT meaningless for a long, long time. Do you think we made this decision out of the clear blue sky? Most of us have scientific educations and have applied them in our careers to solve problems. We don’t just jump when someone pontificates without having some mathematics to back up their assertions.

My first recognition was when no one was ever quoting a standard deviation to go along with the distributions they were using to calculate a mean. Why? Still don’t have an answer, not even from you! Why? I would never have passed statistics if I had done that!

My second recognition was that only temperatures were being averaged. Temperatures NEVER tell you about the amount of heat in a substance, i.e., enthalpy. The amount of HEAT tells you the energy held by a substance. Mechanical and electrical engineers spend many hours in class learning about this. How do you think a steam boiler in a power plant or a heating plant is evaluated to determine how much energy can be extracted when turning a turbine or driving a process? Trial and error? Guess and by golly? Dude, this is all calculated before physical design is started. These are not a joke. Very, very high temperatures and pressures are involved. Personal and equipment safety is paramount.

My third recognition was that unreasonable uncertainties were being quoted based upon measurements that were AT BEST taken to an uncertainty of ±0.5 degrees. The predominant figure quoted was actually the SEM divided by √N. This computation is not defined in statistics. What is divided by √N is the POPULATION standard deviation, not the standard deviation of the sample mean.

My fourth recognition was that sampling theory was not being applied correctly. Review the requirements of the different sampling theories and see if they are being met.

Wise up and take some calculus-based physics and thermodynamics classes, and follow up with metrology classes.

Reply to  bdgwx
August 22, 2022 1:06 pm

“Afterall, we’re being told that HDD/CDD are meaningful and useful despite them being dependent on averaging temperatures.”

That is the OLD method, called the Mean Temperature Method, not the new method called the Integration method. If you had bothered to read the entire link I gave you rather than trying to cherry pick the first thing you came across you would realize that!

The Integration Method DOES NOT AVERAGE ANYTHING. I don’t know why you and bellman are so intent on defining integration as averaging. IT IS NOT!

“droughts despite it being dependent on averaging temperatures.”

Droughts are dependent on temperature? Droughts are typically defined by the amount of water in the soil (an extensive property) at specified depths, plus the amount of surface water in lakes, ponds, streams, rivers, etc. (again an extensive property), as well as precipitation, i.e. the volume of rain, again an extensive property.

Do you and bellman live in the land beyond the looking glass?

“yet the contrarians here not only do it themselves but defend their use vigorously.”

You are looking through a fun house mirror! You can’t even define reality (e.g. droughts) correctly!

Reply to  Carlo, Monte
August 22, 2022 12:59 pm

You nailed it!

Carlo, Monte
Reply to  Tim Gorman
August 22, 2022 1:33 pm

I cannot for the life of me understand how they can willy-nilly ignore non-trivial details and context. This is the domain of religion and pseudoscience, not science and engineering.

Reply to  bdgwx
August 22, 2022 8:48 am

You do realize that “at a distance” is vastly different from “an average over 5 minutes at the same place and with the same device”, right?

Go find an actual physics or thermodynamics reference for averaging intensive measurements of different objects at a distance to obtain a real physical measurement.

The GUM does not deal with this separation of physical quantities and qualities. It was never intended to. It was written for people trying to convey uncertainty in MEASUREMENTS, not for what appropriate measurements are and how they should be combined.

I know you are trying to destroy the belief that intensive values cannot generally be averaged to obtain a real physical quantity. To do so you need to deal with physics, chemistry, and thermodynamics to find references that deal with these issues. Uncertainty in MEASUREMENTS is not the appropriate choice of subject matter.

Carlo, Monte
Reply to  Jim Gorman
August 22, 2022 9:39 am

He relies on sophistry instead.

bdgwx
Reply to  Jim Gorman
August 22, 2022 10:59 am

JG said: “Go find an actual physics or thermodynamic reference for averaging intensive measurements of different objects at a distance to obtain a real physical measurement.”

I did as you suggested. I pulled out my Mesoscale Meteorology book by Markowski and Richardson and Dynamic Meteorology by Holton and Hakim. Both refer to average temperature and averages of various other intensive properties. It’s been a while since I’ve opened these up. I hadn’t realized just how prolifically average temperatures are used in them until now. BTW…these are great references if you want to learn more about the kinematic and thermodynamic nature of the atmosphere.

Reply to  bdgwx
August 22, 2022 1:24 pm

My guess is that those books speak to averaging the temperature in a single object that is in equilibrium, e.g. a wall between the inside and outside. They are finding the average value of the gradient between two points.

This was part of an entire piece by Essex: averaging two different, independent objects to find a value for a third independent object, i.e. the average of the gradient in one of your house walls, the gradient in one of your neighbor’s house walls, and the surface temperature of the asphalt on your nearest street!

None of those temperatures can act at a distance to determine either of the other two. So the average of the three is meaningless.

Yet you, for some reason, want to keep claiming that such an average *is* meaningful while never actually trying to explain what it is useful for! You can’t explain it because the average you find doesn’t exist!

Carlo, Monte
Reply to  Tim Gorman
August 22, 2022 1:38 pm

Or a triple-point bath inside a laboratory controlled to less than ±0.1°C.

Without these meaningless global air temperature averages, they are totally and completely bankrupt.

bdgwx
Reply to  Tim Gorman
August 22, 2022 2:03 pm

TG said: “My guess is that those books speak to the averaging the temperature in a single object that is in equilibrium.”

They are in reference to volumes of the atmosphere. And I’m sure you already know that no volume of the atmosphere is ever in equilibrium which is why we have weather. And it’s not just temperature these books are referring to. Average vorticity, average density, average pressure, average divergence, and various other intensive properties are used throughout.

Reply to  bdgwx
August 22, 2022 3:05 pm

I’ll give you just one instance. Vorticity is a calculated value based on the mean angular displacement and center of mass. In other words it is *exactly* what you continue to do – conflate a calculated intensive value derived from extensive properties with a measured intensive property!

In none of the examples you give do you measure the intensive value and then average it. You measure extensive values and then calculate an intensive property. Density requires averaging mass and volume, two extensive values that can be measured! You don’t measure density and then average the values! What kind of probe would you use to measure density anyway?

Reply to  Tim Gorman
August 23, 2022 9:39 am

You are just stomping on his belief that a GAT has some meaning. Temperature is not the appropriate measure to be using for determining the energy contained in the atmosphere. It only measures translational energy and totally ignores latent energy.

bdgwx
Reply to  Jim Gorman
August 23, 2022 11:49 am

JG said: “Temperature is not the appropriate measure to be using for determining the energy contained in the atmosphere.”

Deflection and diversion. The claim is that averages of intensive properties, regardless of context, are meaningless and useless, up to and including the claim that they don’t even exist, and whether trends based on those values have meaning and usefulness as well. Nobody is challenging the idea that temperature only provides limited insight into the energy content of a body, or even discussing the relation between temperature and heat/energy at all.

Reply to  bdgwx
August 23, 2022 7:52 am

Do you think meteorology texts have good thought out math justifications for averaging temps at a distance? Do your books talk about averaging Sacramento and Rochester temps for a purpose?

I did reference physics and chemistry and thermodynamics on purpose. These subjects have detailed math justification for what is stated.

You are basically trying to justify averaging temps for something like an average temp at a location or small region. It is OK to average temps if all you are trying to find is average temperature. However, it is NOT OK to use that average temp as a proxy for the amount of heat (enthalpy) in two different locations or objects! If you are dealing with the sun’s ENERGY, you must use enthalpy to track the total energy in the system.

Have you had any calculus based science like physics, chemistry, or thermodynamics?

bdgwx
Reply to  Jim Gorman
August 23, 2022 2:02 pm

JG said: “Do you think meteorology texts have good thought out math justifications for averaging temps at a distance?”

Yes. They have good thought out math justification for averaging other intensive properties as well.

Reply to  bdgwx
August 23, 2022 4:31 pm

Show us the math. You can surely scan the pages that have the mathematical derivations of how temperature averages can determine the enthalpy in the atmosphere.

Reply to  bdgwx
August 22, 2022 12:24 pm

"The integration technique still uses average temperatures."

You are as bad as bellman! Integration is not AVERAGING!

"In this simple example this is always the base temperature minus the average of the two recorded temperatures."

This is *NOT* the integration method. It is the older method known as the Mean Temperature Method, not the Integration Method!

If you would read on down the page it shows using the integration method.

As usual with you and bellman, you are cherry picking something you think will prove a point you have made but you fail to understand the entire context of what you are cherry picking from!

"Nevermind that each temperature observation is itself an average over a 5 minute period."

Not from my weather station. Nor does it need to be from any newer, digital weather station. You are still trying to use the Argument to Tradition fallacy.

from your link:

“Once each minute the ACU calculates the 5-minute average ambient temperature and dew point temperature from the 1-minute average observations (provided at least 4 valid 1-minute averages are available). These 5-minute averages are rounded to the nearest degree Fahrenheit, converted to the nearest 0.1 degree Celsius, and reported once each minute as the 5-minute average ambient and dew point temperatures” (bolding mine, tg)

Please note that the readings are rounded to the nearest Fahrenheit degree and then converted to the nearest 0.1 °C. That’s truly a violation of the rules for stating values! 1 °F is about 0.5 °C, not 0.1 °C. This algorithm severely overstates the precision of the data!
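The quantization described above can be illustrated with a short sketch. The rounding procedure follows the quoted text (round to whole °F, convert, report to 0.1 °C); the input values are hypothetical:

```python
def reported_celsius(t_f):
    """Round to the nearest whole degree F, then convert and round to
    0.1 C, per the quoted reporting procedure."""
    t_c = (round(t_f) - 32) * 5 / 9
    return round(t_c, 1)

# Even though the output is stated to 0.1 C, successive reportable
# values are about 0.5-0.6 C apart:
reportable = sorted({reported_celsius(f) for f in range(60, 66)})
print(reportable)  # [15.6, 16.1, 16.7, 17.2, 17.8, 18.3]
```

The gaps between reportable values show the true step size of the data, whatever decimal place it is expressed to.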

"In other words, all methods use an average temperature in some way shape or form whether you understood that or not."

I would also note the uncertainty of the measurements. For the range of −58 °F to +122 °F the uncertainty is ±1.8 °F (±1.0 °C).

With that kind of an uncertainty any error from averaging will be totally masked by the measurement uncertainty itself.

The one-minute average is obtained from 10-second measurements (6 per minute). Each measurement is from approximately the same measurand. This simply isn’t like trying to find an average between two different locations. While not perfect, this is about 30 measurements per 5-minute average. This is as close to 30 measurements of the same measurand as one can get.

Stop trying to cherry pick things without getting the full context. You only lower further your already low credibility!

bdgwx
Reply to  Tim Gorman
August 22, 2022 1:58 pm

TG said: “You are as bad as bellman! Integration is not AVERAGING!”

I didn’t say it was. I said their integration technique uses temperature averages. BTW…don’t conflate their integration technique with proper calculus integrals.

TG said: “This is *NOT* the integration method.”

That is what they call the “Integration Method”. It is documented under the heading How do you calculate degree days using the Integration Method? It definitely involves averaging temperatures. Note that it is different from what they call the Mean Temperature Method which also involves averaging temperatures.

TG said: "While not perfect, this is about 30 measurements per 5-minute average. This is as close to 30 measurements of the same measurand as one can get."

It’s still an average.

Reply to  bdgwx
August 22, 2022 3:19 pm

"BTW…don’t conflate their integration technique with proper calculus integrals."

Your knowledge of calculus is just as bad as bellman’s. You’ve never heard of numerical integration I guess. I’m not surprised!

“That is what they call the “Integration Method”. It is documented under the heading How do you calculate degree days using the Integration Method?”

You STILL haven’t bothered to read the link I gave you! The link has another link to a page that explains how they calculate their degree-day value. It is *NOT* an average!

From the link:
———————————————————–
Although the Integration Method is the most accurate degree-day-calculation method, and the one that we use, many other sources (especially non-specialist sources) calculate degree days using an approximation method based on daily summary data (some or all of daily average, maximum, and minimum temperatures).

There are three popular methods for approximating degree days from daily temperature data:
The first two are both known as the “Mean Temperature Method”. They approximate the degree days for each day using the average temperature for the day. For HDD they subtract the average temperature from the base temperature (taking zero if the average temperature is greater than the base temperature on that day). For CDD they subtract the base temperature from the average temperature (taking zero if the base temperature is greater than the average temperature).
This sounds like one method, but it’s actually two because there are two commonly-used ways to calculate the average temperature for each day, both of which give slightly different results:

  1. The average temperature is taken to be half way between the maximum and the minimum temperatures on each day.
  2. The average temperature is calculated from detailed readings taken throughout each day (this method gives a more accurate average temperature).

The third is the Met Office Method – a set of equations that aim to approximate the Integration Method using daily max/min temperatures only. They work on the assumption that temperatures follow a partial sine-curve pattern between their daily maximum and minimum values.
Another approximation method deserves mentioning: The Enhanced Met Office Method. This is based on the original Met Office Method, but, whilst the original Met Office Method approximates the daily average temperature by taking the midpoint between the daily maximum and minimum temperatures, the Enhanced Met Office Method uses the real daily average temperature (calculated from temperature readings taken throughout the day). The daily maximum and minimum temperatures also play a role in approximating within-day temperature variations, as they do for the Met Office Method.
————————————————————
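For concreteness, the two "Mean Temperature Method" variants quoted above can be sketched as follows (the base temperature and hourly readings are hypothetical; HDD is shown, CDD is the mirror image):

```python
def hdd_midpoint(t_max, t_min, base=15.5):
    """Variant 1: daily average taken as the midpoint of max and min."""
    return max(base - (t_max + t_min) / 2, 0.0)

def hdd_detailed(readings, base=15.5):
    """Variant 2: daily average from detailed readings through the day."""
    return max(base - sum(readings) / len(readings), 0.0)

# Hypothetical hourly temperatures (deg C) for one cool day:
hourly = [4, 3, 3, 2, 2, 3, 5, 7, 9, 11, 12, 13,
          13, 14, 13, 12, 10, 8, 7, 6, 5, 5, 4, 4]
print(hdd_midpoint(max(hourly), min(hourly)))  # 7.5
print(round(hdd_detailed(hourly), 2))          # 8.21
```

As the quoted text says, the two variants give slightly different results because the midpoint of max/min is only an approximation of the detailed daily average.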

"It’s still an average."

OF THE SAME MEASURAND, or at least as close as you can get!

Carlo, Monte
Reply to  Tim Gorman
August 22, 2022 3:33 pm

"Your knowledge of calculus is just as bad as bellman’s. You’ve never heard of numerical integration I guess. I’m not surprised!"

They just pile the lies higher and higher.

Reply to  Carlo, Monte
August 23, 2022 2:52 pm

You’ll notice that the question about what calculus based physics, chemistry, and thermodynamic classes they have goes unanswered.

Carlo, Monte
Reply to  Jim Gorman
August 23, 2022 4:34 pm

Oh yes, this is quite telling. Might as well be a physical geography background, which back in the day counted toward arts and humanities credits.

Just like how he avoids the issue of how a laboratory triple-point bath can be identical to averaging Timbuktu and Kalamazoo air temperatures.

bdgwx
Reply to  Tim Gorman
August 22, 2022 5:16 pm

TG said: “You STILL haven’t bothered to read the link I gave you!”

Yes I read it. That’s how I know their method uses an average temperature. They state it explicitly and even give examples. They not only use an average once or even twice but three separate times! First, they rely on the observations from the automated stations, which are themselves averages. Second, they take those detailed observations and group them into "consecutive pairs" and take the average of all the pairs. Third, they multiply the difference between that average and the reference by the time representing the pairs, which as Bellman astutely recognized is effectively averaging; just a slightly different order than the canonical formula.

TG said: “From the link:”

What follows here are the alternative methods to their "Integration Method". You did not post their method.

Reply to  bdgwx
August 24, 2022 3:42 am

The link to their method appears on the link I gave you. This is just proof that you did not *READ* the link. You just cherry picked something with a glance and, as usual, it was wrong.

Reply to  bdgwx
August 24, 2022 6:17 am

"First, they rely on the observations from the automated stations which are themselves averages."

Multiple observations of the SAME THING, or at least as close as you can get! I’ve given you the answer on this after you referred to NWS standards. Of course you blew that off!

“Second, they take those detailed observations and group them into “consecutive pairs” and take the average of all the pairs.”

They actually INTERPOLATE a middle value. And, as I pointed out in another message, when interpolating/averaging between points while doing numerical integration, you get the *MOST ACCURATE* calculation of the area under the curve. And I am positive you have no clue as to why that is!

"Third, they multiply the difference between that average and the reference by the time representing the pairs which as Bellman astutely recognized is effectively averaging; just a slightly different order than the canonical formula."

It is *NOT* averaging! Neither you nor he understands calculus at all! Averaging gives you degrees/time, not degree-time! You can’t even do simple dimensional analysis that 8th graders can do! You split the time period into intervals, be they minutes, hours, seconds, or whatever.

∫sin(x) dx

sin(x) in degrees multiplied by dx in hours, minutes, seconds, etc.

This way you wind up with degree-time units. If your period is 24 hours then your intervals could be in hours, or minutes, or seconds. NOT 1/24 hours. Again, that would give you degree/hour!

You and bellman like to portray yourselves as very good mathematicians. You are nothing of the sort! Nor do either of you have any relationship to the real world! Neither of you understand that when calibrating a thermometer in a thermal bath that you don’t average the water temp across a number of thermal baths to determine the average temperature to be used as the calibration temperature of the thermal bath you are using in the calibration routine. You take multiple measurements of the SAME thermal bath and use the average of those!

Reply to  Tim Gorman
August 22, 2022 2:29 pm

"You are as bad as bellman! Integration is not AVERAGING!"

Have you still not understood that taking 24 temperature readings through the day, multiplying them each by 1/24 of a day, and adding them together is no different to adding the 24 values and then dividing them by 24?
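The arithmetic identity described above can be checked directly (the readings are hypothetical random values; this only demonstrates that the two orderings of the operations agree, nothing about temperature physics):

```python
import random

random.seed(0)
temps = [random.uniform(-5.0, 25.0) for _ in range(24)]  # hypothetical hourly readings

# Multiply each reading by 1/24 of a day, then add...
weighted_sum = sum(t * (1 / 24) for t in temps)
# ...versus add the 24 values, then divide by 24:
mean = sum(temps) / 24

assert abs(weighted_sum - mean) < 1e-9  # same number either way
```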

Reply to  Bellman
August 23, 2022 6:19 am

"Have you still not understood that taking 24 temperature readings through the day, multiplying them each by 1/24 of a day, and adding them together is no different to adding the 24 values and then dividing them by 24?"

This is *NOT* how you do integration!

Sampling the temperatures is to create a temperature signal, just like sampling in a software defined radio (SDR) does for radio signals or a digital signal generator does for creating a sine wave.

That curve is then integrated. The intervals are *NOT* 1/24. Each interval is an interval of 1. You have 24 samples, that’s all. You are trying to build the averaging into the calculation. By doing so you wind up with a dimension of degree instead of degree-day or degree-hour (or whatever) for the final value! Learn some basic dimensional analysis for what you try to do. If the dimensions don’t match then you have done something wrong!

Interval 1 (of size 1/24) is a width of one. Interval 2 is a width of 1. Interval 3 is a width of one. Etc. The height of Interval 1 is sin(π/24). The height of Interval 2 is sin(2π/24). Etc.

That’s for numerical integration. If the temperature profile is at least quasi-sinusoidal then just do the standard integration:

2∫Asin(x)dx from 0 to π/2 => 2A where A is the peak value of the sine wave.

Please note carefully that is *NOT* the same as the sum of the values of the sine wave over an interval of π.

2Σsin(x) from 0 to π/2 = 3.5. Divide that by the interval of π and you get 0.56, much different than the value of 0.637 you get when dividing the area under the curve by π.

The dimensions don’t even match with the two. The sum gives you a dimension of “degree”. Divide that by radians and you get degree/rad. The integration gives you degree-rad and when you divide by radians you get a dimension of degree. Degree/rad is not the same thing as degree!
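For reference, the dimensional point can be checked numerically with a sketch (not either commenter's exact calculation): in a Riemann sum each sample of sin(x) is multiplied by the interval width Δx, so the sum carries degree·radian units, and dividing the resulting area by the interval length π recovers the mean value 2/π ≈ 0.637.

```python
import math

n = 24
dx = math.pi / n  # interval width in radians

# Midpoint Riemann sum: each sample of sin(x) is weighted by dx,
# so the sum approximates the area under the curve on [0, pi].
area = sum(math.sin((i + 0.5) * dx) * dx for i in range(n))

mean_value = area / math.pi  # area / interval length = average height

assert abs(area - 2.0) < 0.01            # exact integral is 2
assert abs(mean_value - 2 / math.pi) < 0.01  # ~0.637
```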

I keep telling you that you *need* to take a first year engineering calculus course. But for some reason you just want to continue down the path of trying to prove something that is wrong.

Reply to  Tim Gorman
August 23, 2022 6:52 am

"This is *NOT* how you do integration!"

Correct. To do integration you have to take the sample width to the limit of zero. But it’s an approximation, and it is what your preferred source calls the "integration method". I think they add a few bells and whistles which don’t make much difference.

"Learn some basic dimensional analysis for what you try to do. If the dimensions don’t match then you have done something wrong!"

And what happens to your dimensional analysis when you take your sum of degree days and divide by days?

"Interval 1 (of size 1/24) is a width of one."

One what?

"The height of Interval 1 is sin(π/24). The height of Interval 2 is sin(2π/24). Etc."

You need to take the proverbial remedial class in basic trigonometry.

"That’s for numerical integration. If the temperature profile is at least quasi-sinusoidal then just do the standard integration"

We keep going through this. You complain that multiple samples can never be exactly correct, but then are happy to approximate values based on just two readings and an assumption of a sine wave.

Which you then screw up because you still cannot see why daily temperatures are not defined just by the maximum temperature, and daily temperatures do not have a mid-point of zero.

"2Σsin(x) from 0 to π/2 = 3.5."

What do you mean by sum here? What are you summing?

"The dimensions don’t even match with the two. The sum gives you a dimension of “degree”. Divide that by radians and you get degree/rad."

If you are just adding temperatures then you are not going to divide by radians. You are just dividing by a dimensionless count. If you are doing an approximate integral you are not just adding degrees, you are adding degree * whatever the x axis is, in this case you are adding degree*radians. Then you divide by radians to get degrees.

"I keep telling you that you *need* to take a first year engineering calculus course. But for some reason you just want to continue down the path of trying to prove something that is wrong."

Sorry, but if your understanding is what comes from “engineering calculus” I think I’ll stick to my degree courses in analysis.

Reply to  Bellman
August 24, 2022 3:56 am

"Correct. To do integration you have to take the sample width to the limit of zero. But it’s an approximation, and it is what your preferred source calls the “integration method”. I think they add a few bells and whistles which don’t make much difference."

Just like bdgwx you have not *READ* the link I gave you. You glanced at it and left with no actual understanding of what they do.

Just like bdgwx, you have not heard of numerical integration either. Let alone doing piecewise numerical integration.

"And what happens to your dimensional analysis when you take your sum of degree days and divide by days?"

You get degrees. So what? Degrees is not what you are looking for!

"One what?"

One INTERVAL. Just what I said in my post!

“We keep going through this. You complain that multiple samples can never be exactly correct, but then are happy to approximate values based on just two readings and an assumption of a sine wave.”

We’ve been through this! You propagate the uncertainty through to the result. I tutored you earlier this year on how to calculate the uncertainty of a sine wave. The method is right there in Taylor’s book and examples. But you *STILL* have not actually studied Taylor or done the examples. You just keep on trying to cherry pick pieces without understanding what you are cherry picking.

Do we need to go over how to calculate the uncertainty associated with a sine wave again?

Reply to  Tim Gorman
August 24, 2022 2:43 pm

What link? What bit do you want me to read? Why do you think I don’t understand integration, yet you do?

The uncertainty of a sine wave depends on the assumption that what you are measuring is a sine wave.

Reply to  Bellman
August 24, 2022 3:03 pm

"What link? What bit do you want me to read? Why do you think I don’t understand integration, yet you do?"

The link to degreedays.net. But you won’t go read it for meaning. I know you won’t.

“The uncertainty of a sine wave depend on the assumption that what you are measuring is a sine wave.”

What is sin(x) = sin(L)sin(δ) + cos(L)cos(δ)cos(h)?

That *is* the path the sun follows across the sky. It *is* a sine wave. It is the first order driver of daytime temperature. There are numerous secondary drivers which I have shown as ⱷ(a,b,…). Many of these secondary drivers are functions themselves. Some may modulate the maximum temperature in a day (e.g. overcast skies) and some may modulate the temperature in other ways (e.g. a low pressure cold front moving through). But all of these changes can be accounted for by moving to numerical integration if needed.
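The solar-elevation formula quoted above can be evaluated directly as a sketch (here L is latitude, δ the solar declination, and h the hour angle; the example values are hypothetical):

```python
import math

def solar_elevation_deg(lat_deg, decl_deg, hour_angle_deg):
    """sin(alpha) = sin(L)sin(delta) + cos(L)cos(delta)cos(h)"""
    L, d, h = (math.radians(v) for v in (lat_deg, decl_deg, hour_angle_deg))
    s = math.sin(L) * math.sin(d) + math.cos(L) * math.cos(d) * math.cos(h)
    return math.degrees(math.asin(s))

# At solar noon (h = 0) the elevation is 90 - |latitude - declination|:
print(round(solar_elevation_deg(40.0, 20.0, 0.0), 6))  # 70.0
```

As h sweeps through the day the elevation rises and falls sinusoidally, which is the point being argued here; turning elevation into temperature requires the additional factors discussed in the thread.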

The temperatures still have to be measured and that implies uncertainty. My guess is that you STILL haven’t worked out the examples in Taylor’s book on how to do this for functions. You never will. Knowledge just isn’t important enough to you for you to actually learn anything.

Reply to  Tim Gorman
August 24, 2022 3:22 pm

"That *is* the path the sun follows across the sky."

But unless you live in a vacuum it is not a measure of temperature.

"My guess is that you STILL haven’t worked out the examples in Taylor’s book on how to do this for functions."

That’s a book about error. UNCERTAINTY IS NOT ERROR.

Reply to  Bellman
August 25, 2022 5:59 am

"But unless you live in a vacuum it is not a measure of temperature."

NO KIDDING! It *is* the first order driver of temperature! Everything else is a secondary driver.

You simply can’t admit that using sin(x + ⱷ) *is* a direct, and meaningful simplification for daytime temperature, can you?

Like usual, you are going to throw all kinds of crap against the wall hoping something will stick in order to try and refute it.

"That’s a book about error. UNCERTAINTY IS NOT ERROR."

I’m going to offer you up a link in the faint, faint hope you will read it for meaning. I know you won’t but hope springs eternal.

https://sepmetrologie.com/en/2021/08/02/what-is-the-difference-between-error-and-uncertainty/

"Unlike errors, measurement uncertainty is the quantification of the doubt, which is obtained from the result of a measurement. The uncertainty value has the list of components coming from systematic and random effects on previous measurements, due to elements that are calculated by a series of statistical distributions, of the measurement values."

Reply to  Tim Gorman
August 25, 2022 6:56 am

"You simply can’t admit that using sin(x + ⱷ) *is* a direct, and meaningful simplification for daytime temperature, can you?"

Not if I want to keep my sanity I can’t.

I’m sure we must be talking at cross purposes, and what you are saying means something to you, but you are just very bad at explaining it, and refuse to admit you might possibly have made a simple mistake.

Maybe if you could draw a diagram to show what you think sin(x + ⱷ) looks like for a given value of ⱷ, it would be clearer.

Reply to  Bellman
August 25, 2022 10:28 am

"Not if I want to keep my sanity I can’t."

==> Not if I want to keep my insanity I can’t.

There, I fixed it for you!

“I’m sure we must be talking at cross purposes, and what you are saying means something to you, but you are just very bad at explaining it, and refuse to admit you might possibly have made a simple mistake.”

I can’t do any more than

  1. give you the equation for the height of the sun in the sky
  2. tell you that the height of the sun in the sky is the primary driver of temperature on the Earth during the day.
  3. tell you that the factors determining the height of the sun are not time delays but describe the orbital relationships between the sun and the earth.

I can’t make you believe this no matter what I give you.

There are multiple other secondary drivers of the temperature that modulate the result of the primary driver.

"Maybe if you could draw a diagram to show what you think sin(x + ⱷ) looks like for a given value of ⱷ, it would be clearer."

I’ll attach one but you won’t believe it or understand it. You’ll just deny it also. It will be a waste of time.

[attached image: image_2022-08-25_122741556.png]

Reply to  Tim Gorman
August 25, 2022 2:04 pm

"Not if I want to keep my insanity I can’t."

Slow hand clap.

  1. give you the equation for the height of the sun in the sky
  2. tell you that the height of the sun in the sky is the primary driver of temperature on the Earth during the day.
  3. tell you that the factors determining the height of the sun are not time delays but describe the orbital relationships between the sun and the earth.

The problem is even your claim that the height of the sun is the primary driver of temperature is wrong. The effects of the sun are cumulative. During the day, temperatures are not generally hottest at noon, but several hours afterwards. During the year the summer solstice does not mark the warmest part of summer. Despite the sun getting lower each day, temperatures keep building.

Then there’s the length of the day. Days are longer in summer and that has a bigger effect than the sun being a bit higher in the sky.

Just knowing how high the sun is at any point, is not going to tell you how hot it is, even on average.

And then there are all those other pesky factors such as different air masses, direction of wind, cloud cover etc.

"I can’t make you believe this no matter what I give you."

You can’t make me believe that 2 + 2 = 5. It doesn’t mean I’m insane.

"I’ll attach one but you won’t believe it or understand it. You’ll just deny it also. It will be a waste of time."

What is x in that diagram, what is ⱷ?

Reply to  Bellman
August 26, 2022 7:13 am

"The problem is even your claim that the height of the sun is the primary driver of temperature is wrong. The effects of the sun are cumulative. During the day, temperatures are not generally hottest at noon, but several hours afterwards. During the year the summer solstice does not mark the warmest part of summer. Despite the sun getting lower each day, temperatures keep building."

You are describing thermal inertia! That’s why it’s hottest in the afternoon!

So what? That doesn’t change the fact that the daytime temp is a sinusoid!

Have you *ever* taken a calculus based physics or thermodynamic course? Does the phrase “normal to the plane” mean *anything* to you? The sun dumps the largest amount of heat into the earth when its rays are perpendicular (normal) to the plane. That occurs when the sun is directly overhead. That heat doesn’t immediately appear as temperature. But the temperature will rise to a max and then decrease based on the sine of the angle of the sun dumping heat into the earth.

It doesn’t keep the temperature from being a sine wave related to the path of the sun. It’s just one more factor that has to be considered in the functions “x” and “ⱷ” in sin(x + ⱷ).

"Then there’s the length of the day. Days are longer in summer and that has a bigger effect than the sun being a bit higher in the sky."

Of course it does impact the length of the day! What does that have to do with the path of the sun being a sinusoid? You’ve never actually read the site you linked to for the height of the sun and understood the equations there, have you? As usual, you just cherry picked a site with the operative equation and threw it out hoping people would believe you actually understood what was in it!

"Just knowing how high the sun is at any point, is not going to tell you how hot it is, even on average."

Who ever said it would? That’s why “x” and “ⱷ” are functions and not numbers!

"And then there are all those other pesky factors such as different air masses, direction of wind, cloud cover etc."

Which, if you will go back, I listed as factors in the function "ⱷ".

And you left out terrain, geography, elevation, and atmospheric pressure, all of which I have already listed out! Terrain is itself a function of land use, surface water, sub-soil moisture, vegetation, etc. Clouds are a function of humidity, pressure, wind, aerosols, etc. Did you think you were coming up with something original?

"What is x in that diagram, what is ⱷ?"

You are kidding me, right? In the formula for the height of the sun you can’t tell from the equation
 sin(x) = sin(L)sin(δ) + cos(L)cos(δ)cos(h)
what “x” is and what “ⱷ” is?

It’s useless trying to teach you anything!

Reply to  Tim Gorman
August 26, 2022 7:13 pm

Ah ha. I think I’ve finally figured out what you are trying to say. Your x in sin(x), which you were originally calling t, is the height of the sun. That makes more sense.

Sorry if I missed any explanation of that amidst all your yelling and insults.

Reply to  Bellman
August 27, 2022 5:35 am

"Ah ha. I think I’ve finally figured out what you are trying to say. Your x in sin(x), which you were originally calling t, is the height of the sun. That makes more sense."

No! As I’ve told you multiple times, "x" is a function all of its own. Sun height is only a part of it! Height, in radians/degrees or whatever, is not temperature. There have to be other factors involved to come up with temperature. Again, the sun is the primary driver of daytime temps, its path is a sinusoid and that is why daytime temps typically look like a sinusoid. I’m not sure how to quantify all the factors in "x" ==> f_x(a,b,c,…) but solar insolation would be the start. Not being able to quantify the factors doesn’t keep one from using the function f_x(a,b,c,…) in sin(x), or if you like sin[f_x(a,b,c,…)]. sin(x) is just far easier to write or type.

Reply to  Tim Gorman
August 28, 2022 3:32 pm

That’s a relief. I was worried I’d misjudged you.

Reply to  Bellman
August 24, 2022 4:01 am

"What do you mean by sum here? What are you summing?"

You keep wanting to call integration "averaging". An average is a sum divided by the number of components. So here is the sum of the values of a normalized sine wave!

"If you are just adding temperatures then you are not going to divide by radians. You are just dividing by a dimensionless count."

Really? And you couldn’t figure out the intervals? Using an interval of 1/24 hrs gives you a dimension of 1/hrs. Now you are saying the intervals should be dimensionless? It’s no wonder you can’t figure out what is going on!

"Sorry, but if your understanding is what comes from “engineering calculus” I think I’ll stick to my degree courses in analysis."

There was *NEVER* any doubt you were going to stay in your little box of understanding. You just never learn anything!

bdgwx
Reply to  Tim Gorman
August 23, 2022 6:57 am

TG said: “This is *NOT* how you do integration!”

Then you need to have that discussion with them. Because that’s exactly how they say they are doing their “Integration Method”. I also don’t understand why you referred us to them if you disapprove of their method.

TG said: “The intervals are *NOT* 1/24”

That is literally their first example.

"Calculate the time (in days) over which the temperature was below the base temperature. In this simple example this is always an hour (1/24 days)."

They are using average temperature whether you realized it or not.

I can say the same thing about the GUM JCGM 6:2020 document you wanted me to look at as well. It not only averages temperatures, but says it can be done to “abate the extent of the uncertainty.”

Let me summarize where we are at this point.

Kip’s article and defense of it tells us that averaging intensive properties like temperature is meaningless, useless, and nonexistent.

We are told that a particular soil moisture product can be used to assess drought even though it depends on averaging temperature.

We are told that HDD/CDD are useful even though they depend on averaging temperature.

Furthermore, we are told that we must use the GUM to assess uncertainty even though it says averaging temperature is appropriate if you want to “abate the extent of the uncertainty”.

Reply to  bdgwx
August 24, 2022 4:42 am

“Then you need to have that discussion with them. Because that’s exactly how they say they are doing their “Integration Method”. I also don’t understand why you referred us to them if you disapprove of their method.”

You *still* have not bothered to actually read the link! Their integration method is *NOT* the Mean Temperature Method. You are as bad as bellman in just scanning something to find something to cherry pick.

“That is literally their first example.”

You don’t understand integration any more than bellman! Do I need to draw you a picture of how you do numerical integration?

go here: https://www.r-bloggers.com/2014/01/rectangular-integration-a-k-a-the-midpoint-rule/

From the link: “If your integrand cannot be evaluated at the midpoints of your intervals, you can modify rectangular integration to use the function’s value at either the left boundary or right boundary as the rectangle’s height. These 2 values can be taken directly from the function values, but the resulting approximation is not as good as using the midpoint rule.” (bolding mine, tg)

Can you tell us why the mid-point rule is more accurate? I doubt it!
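The difference is easy to demonstrate numerically. A minimal Python sketch (the linked post uses R) comparing midpoint and left-boundary rectangular integration of sin(x) over [0, π], whose exact integral is 2:

```python
import math

def rect_integrate(f, a, b, n, rule="midpoint"):
    """Rectangular (Riemann) integration of f over [a, b] with n equal intervals."""
    h = (b - a) / n
    if rule == "midpoint":
        xs = [a + (i + 0.5) * h for i in range(n)]  # midpoint of each interval
    elif rule == "left":
        xs = [a + i * h for i in range(n)]          # left boundary of each interval
    else:
        xs = [a + (i + 1) * h for i in range(n)]    # right boundary of each interval
    return h * sum(f(x) for x in xs)

exact = 2.0  # the integral of sin(x) from 0 to pi
mid = rect_integrate(math.sin, 0.0, math.pi, 24, "midpoint")
left = rect_integrate(math.sin, 0.0, math.pi, 24, "left")
print(abs(mid - exact), abs(left - exact))  # the midpoint error is the smaller one
```

For a smooth integrand the composite midpoint rule’s leading error term is half that of the trapezoid rule (and here the left-boundary sum happens to coincide with the trapezoid sum, since sin vanishes at both endpoints), so the midpoint estimate lands closer to the exact value.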

“They are using average temperature whether you realized it or not.”

No, they aren’t!

“I can say the same thing about the GUM JCGM 6:2020 document you wanted me to look at as well. It not only averages temperatures, but says it can be done to ‘abate the extent of the uncertainty.’”

This is averaging multiple measurements of a measurand (singular). It is not integrating and it is *NOT* averaging measurements of different measurands!

“Kip’s article and defense of it tells us that averaging intensive properties like temperature is meaningless, useless, and non-existent.”

You keep missing the details. You can’t average properties that are not additive, that can’t be divided, or that don’t act at a distance. You have to read for UNDERSTANDING, not just to cherry-pick pieces!

Averaging extensive properties in order to calculate an intensive value is *NOT* averaging intensive values!

“We are told that a particular soil moisture product can be used to assess drought even though it depends on averaging temperature.”

Again, give us the functional relationship between soil moisture and temperature. I can’t find one. Most measurements are volume/volume, i.e., volume of water per volume of material. Nothing about temperature.

“We are told that HDD/CDD are useful even though they depend on averaging temperature.”

The integration method of finding HDD/CDD does *NOT* depend on averaging temperature. Again, you don’t understand calculus at all! You especially don’t understand how to do numerical integration. You are a perfect example of a climate scientist!

“Furthermore, we are told that we must use the GUM to assess uncertainty even though it says averaging temperature is appropriate if you want to ‘abate the extent of the uncertainty’.”

For at least the fifth time – when you have multiple measurements of the SAME THING which generate a normal distribution around a true value (assuming insignificant systematic uncertainty) you can average the measurements in order to find the true value.

STUDY TAYLOR FOR UNDERSTANDING!

Don’t just use Taylor for cherry picking. Do the examples!

Carlo, Monte
Reply to  Tim Gorman
August 23, 2022 12:56 pm

“This is *NOT* how you do integration!”

Absolutely incredible, almost impossible to believe anyone could be this far out.

bdgwx
Reply to  Carlo, Monte
August 23, 2022 1:26 pm

CM said: “Absolutely incredible, almost impossible to believe anyone could be this far out.”

And yet it seems like Tim isn’t understanding that the degreedays.net “Integration Method” isn’t using proper calculus integration. All they’re doing is grouping temperature observations into pairs, averaging them, multiplying the averages by the amount of time represented by the pairs, and summing the result. Let me repeat that. They are averaging temperature observations and found enough meaning in that to then use the value in further steps. Never mind that each individual temperature observation they are using is itself an average of more fine-grained temperature observations.
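For what it’s worth, the pair-average-multiply-sum procedure described above amounts to the trapezoidal rule. A minimal Python sketch with hypothetical hourly readings and a hypothetical 65 °F base (degreedays.net’s actual data handling may well differ):

```python
# Heating degree days via the pair-average-multiply-sum procedure described
# above (i.e., the trapezoidal rule). All numbers here are hypothetical.
BASE = 65.0                                                 # base temperature, deg F
hourly = [60, 58, 57, 59, 63, 66, 70, 72, 71, 68, 64, 61]   # hourly readings, deg F

hdd = 0.0
for t1, t2 in zip(hourly, hourly[1:]):
    pair_avg = (t1 + t2) / 2.0           # average each adjacent pair of readings
    deficit = max(BASE - pair_avg, 0.0)  # only hours below the base contribute
    hdd += deficit * (1.0 / 24.0)        # each pair spans one hour = 1/24 day
print(hdd)  # about 1.15 HDD accumulated over these eleven hours
```

Clipping each pair average at the base before summing is what distinguishes this from simply averaging the whole day’s temperature and differencing once.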

Carlo, Monte
Reply to  bdgwx
August 23, 2022 2:16 pm

Nice word salad covering your pack of lies and half-truths.

Reply to  Tim Gorman
August 19, 2022 11:59 am

But this whole debate is about whether it makes sense to look at trends of US or global temperature if you don’t think the average temperatures have any physical meaning. If you can’t average degree-days across the US or globe, then the question is still why you think the pause is a reality.

Reply to  Bellman
August 19, 2022 2:26 pm

“But this whole debate is whether it makes sense to look at trends of US or global temperature if you don’t think the average temperatures have any physical meaning.”

INTEGRALS ARE NOT AVERAGES!

“If you can’t average degree-days across the US or globe, then the question is still why you think the pause is a reality.”

Integrals are *NOT* averages. And where did I talk about averaging degree-days?

Like I keep telling you, go take a remedial calculus course! Take bdgwx with you!

Reply to  Tim Gorman
August 19, 2022 2:57 pm

You are arguing with the voices in your head again. I said nothing about integrals.

I’m asking how you can use CDDs or whatever to get an indication of what is happening either globally or in the US without averaging them.

This thread all starts with me saying it’s ironic that you et al. want to accept the idea of a pause, despite it being based on data you say is meaningless, either because it’s an average temperature, which is impossible, or because satellite data is so uncertain you can’t even tell if it’s warming or not.

Your response is to say the only temperature data you accept is CDDs based on surface thermometer data. And all I’m trying to understand is how you could use that data to confirm the pause, without averaging them.

Reply to  Bellman
August 19, 2022 3:10 pm

“You are arguing with the voices in your head again. I said nothing about integrals.”

You didn’t?

bellman “If you can’t average degree-days across the US or globe, then the question is still why you think the pause is a reality.”

“I’m asking how you can use CDDs or whatever to get an indication of what is happening either globally or in the US without averaging them.”

I’ve given you this twice. Learn to read.

“This thread all starts with me saying it’s ironic that you et al. want to accept the idea of a pause, despite it being based on data you say is meaningless both because it’s an average temperature which is impossible, or because satellite data is so uncertain you can’t even tell if it’s warming or not.”

As I’ve told you multiple times, I don’t trust *any* temperature data set that is based on averages. That doesn’t mean I can’t point out what those data sets show when others are using them.



Reply to  Bellman
August 17, 2022 4:30 pm

You do understand that UAH uses an entirely different method to judge temperatures, don’t you? My problem with UAH uncertainty is that there are things in the atmosphere that can bias the readings in different directions, increasing the stated uncertainty.

Don’t try to lecture me on uncertainty. I have experience in making precision measurements and know how hard it is to achieve truly accurate ones with little uncertainty. You certainly can’t get it making one reading on different things.

Reply to  Jim Gorman
August 17, 2022 5:13 pm

Yet it’s the UAH data, and the zero uncertainty pause trend you use to claim “CO2 is not a control knob for temperature if temps don’t increase along with increasing CO2.”

bdgwx
Reply to  Jim Gorman
August 17, 2022 7:13 pm

I’m trying to figure out why uncertainty and trends in the UAH data matter to you so much if you don’t think the global average temperature even exists. How can a non-existent metric have an uncertainty or trend anyway? And so there is no confusion, let it be known that I’m asking the question from the devil’s advocate point of view here.

Carlo, Monte
Reply to  bdgwx
August 17, 2022 8:56 pm

“How can a non existent metric have an uncertainty or trend anyway?”

Silly question—air temperatures are MEASUREMENTS (except the fraudulent made-up data of course).

bdgwx
Reply to  Carlo, Monte
August 18, 2022 4:10 am

So the global average temperature is a measurement that does not exist?

Carlo, Monte
Reply to  bdgwx
August 18, 2022 5:49 am

That you continue to deny the reality of propagation of measurement uncertainty is no surprise.

Reply to  bdgwx
August 18, 2022 6:39 am

Go read the other thread again about averaging intensive measurements. Does the fact that temperature misses heat in the atmosphere and its location bother you?

Reply to  bdgwx
August 18, 2022 4:20 pm

Where can one go to measure the global average temperature?

Reply to  bdgwx
August 18, 2022 5:34 am

Do you understand the difference in thermometer readings vs what UAH measures?

Just one example is the spatial coverage.

Reply to  Jim Gorman
August 18, 2022 5:42 am

The claim is that you cannot average temperatures because they are an intensive property – no buts. This has nothing to do with the way you measure temperatures or in the spatial coverage.

Carlo, Monte
Reply to  Bellman
August 18, 2022 5:53 am

You can add up a column of numbers and divide by N, just like you can divide standard deviation by root-N.

Just because it is possible doesn’t indicate they have any basis in reality.

But you must fulfill your mission in life to Keep the Rise Alive, at any cost.

The ends justify any means, the watermelon credo.

Reply to  Bellman
August 18, 2022 4:23 pm

Temperature is determined by the micro-climate at a point. It is not determined by the temperature at a distant point. Given that micro-climates are *NOT* the same everywhere, an “average” temperature is meaningless. The temperature in Phoenix and the temperature in Miami can’t be averaged to find an “average” temp somewhere in between. It doesn’t work for the north/south side of the Kansas River valley. It doesn’t work for *anywhere*.

Reply to  bdgwx
August 18, 2022 4:19 pm

“I’m trying to figure out why uncertainty and trends in the UAH data matter”

Guess you don’t understand the role of a sceptic. Use their own data as an argument against their conclusions!

“How can a non existent metric have an uncertainty or trend anyway?”

The uncertainty is part of the measurement. There is nothing that keeps a temperature measurement from having an uncertainty interval associated with it. It is the uncertainty that propagates into the uncertainty of the temperature trend.

bdgwx
Reply to  Tim Gorman
August 18, 2022 6:16 pm

So you believe an average temperature does not exist while simultaneously believing that its uncertainty can be assessed?

Reply to  bdgwx
August 19, 2022 6:28 am

Actually, the uncertainty of a measurement is neither intensive nor extensive. It is a description of the measurement itself. Ask yourself this: is the measurement itself used in uncertainty calculations, or just the possible variance of the measurement? I think you will find uncertainty doesn’t originate with what you are measuring, only with the instrument doing the measuring.

Carlo, Monte
Reply to  Jim Gorman
August 19, 2022 6:48 am

They still think uncertainty drops out by subtracting the baseline, so they don’t care.

bdgwx
Reply to  Jim Gorman
August 19, 2022 9:07 am

Yeah, so it sounds to me like you believe we can assess the uncertainty of non-existent quantities? Correct?

Reply to  bdgwx
August 19, 2022 10:13 am

If it doesn’t exist then how can you measure it?

Where do you go to measure the “global average temperature”?

I asked that of you yesterday, the 18th, and you didn’t reply. Will you answer today?

bdgwx
Reply to  Tim Gorman
August 19, 2022 11:44 am

It’s a good question. There are others that are similar in nature. Where do you go to measure the temperature of the CMB? Where do you go to measure the mass of Earth? Where do you go to measure the volume of the ocean? Where do you go to measure the total solar irradiance? Where do you go to measure countless other quantities of a similar nature?

I don’t think you necessarily need to “go” anywhere to measure something. Nor do I think you need to probe every single atom of the body to “measure” it. One of the great things about science is that we can estimate quantities of bodies with incomplete information and from an observational perspective that is far away in many cases.

Anyway, let me play the devil’s advocate here and assume some quantities (like an average of intrinsic property) do not exist. How do you assess the uncertainty of these quantities if they do not even exist?

Reply to  bdgwx
August 19, 2022 1:39 pm

“Where do you go to measure the temperature of the CMB?”

You don’t go anywhere. You use a radio telescope to measure the EM radiation, which is pretty much uniform as far as the background is concerned. You then calculate the temperature of that signal, you don’t measure it!

“Where do you go to measure the mass of Earth?”

You don’t. You estimate it based on its impacts on the other planets and their influence on the earth. Mass creates a distortion in the space-time continuum known as gravity. You can estimate mass from the effects of that gravity. It’s also why there is always an uncertainty quoted with the mass!

“Where do you go to measure the total solar irradiance?”

You don’t go measure it. You measure the solar irradiance at a point and then calculate the total.

“Where do you go to measure countless other quantities of a similar nature?”

You don’t. You measure what you can measure and then use functional relationships to calculate them. It’s why you can’t measure temperature at a point and calculate the temperature of the earth. No functional relationship exists to do that!

“I don’t think you necessarily need to “go” anywhere to measure something.”

Really? Observations are what validate or invalidate theories. If you don’t go somewhere to make observations then you really aren’t doing validation or invalidation of anything.

“Nor do I think you need to probe every single atom of the body to “measure” it.”

Who is claiming that you have to?

“One of the great things about science is that we can estimate quantities of bodies with incomplete information and from an observational perspective that is far away in many cases.”

Estimates imply uncertainty. Why do we never see measurement uncertainties propagated forward from temperature measurements?

Incomplete data means you are guessing. Estimates require calculating from known values even if you aren’t quite sure what the functional relationship is. We used to teach Boy Scouts how to estimate the height of a tree – but it required a measurement of something, even if it was units of “thumbs” or “steps”.

“How do you assess the uncertainty of these quantities if they do not even exist?”

Ah! Another trick question! If the quantity doesn’t exist then how does it have an uncertainty? Having an uncertainty implies that it *does* exist! How would a non-existing global average temperature have an uncertainty?

The issue here is that a skeptic trying to question the applicability of something must work with that something. If that something is a global average temperature, then trying to show how the uncertainties associated with it would mask any differential being looked for is certainly an acceptable path to take.

bdgwx
Reply to  Tim Gorman
August 19, 2022 2:22 pm

TG said: “Why do we never see measurement uncertainties propagated forward from temperature measurements?”

I have seen it done. I’ve even posted several publications doing just that.

TG said: “Really?”

Yes. We’ve learned a lot about countless objects without going to them.

TG said: “If the quantity doesn’t exist then how does it have an uncertainty?”

That’s my question.

TG said: “Having an uncertainty implies that it *does* exist!”

That’s what I was thinking.

TG said: “How would a non-existing global average temperature have an uncertainty?”

That’s my question.

Reply to  bdgwx
August 20, 2022 6:10 am

“I have seen it done. I’ve even posted several publications doing just that.”

No, you haven’t. Everything I’ve seen from you is focused on the standard deviation of the sample means. That is *NOT* propagating measurement uncertainty. The standard deviation of the sample means simply tells you nothing about the accuracy of the sample means.

“Yes. We’ve learned a lot about countless objects without going to them.”

Not from direct measurements of intensive attributes!

“That’s my question.”

A non-existent value has no uncertainty. You are trying to conflate an average temperature which doesn’t exist with a direct measurement of temperature at a location between two points.

That direct measurement *can* have a measurement uncertainty.

“That’s what I was thinking.”

The average temp doesn’t exist. It doesn’t have a measurement uncertainty since it can’t be measured. The temp at an intermediate point can be measured and will have a measurement uncertainty.

You are still trying to conflate the two.

“That’s my question.”

Since it doesn’t exist it can’t have a measurement uncertainty. Insofar as it is an average of measurements, the individual measurement uncertainties should be propagated into the average value that is calculated. That uncertainty will, sooner or later with enough measurement stations, overwhelm the average, thus showing that it is impossible to actually calculate it.

You can run but you can’t hide. The global average temperature is meaningless and useless.

bdgwx
Reply to  Tim Gorman
August 20, 2022 6:43 am

TG said: “Since it doesn’t exist it can’t have a measurement uncertainty.”

That was my thinking too. Perhaps the next time someone who believes the global average temperature does not exist yet still posts an uncertainty estimate for it, they can be informed of the mistake?

Reply to  bdgwx
August 20, 2022 1:47 pm

It would be far better to educate them in how to propagate the measurement uncertainty forward from the measurements. Then they could see that the average makes no sense even if you *could* average it.

Carlo, Monte
Reply to  bdgwx
August 19, 2022 11:07 am

It sounds to me like you don’t understand ANYTHING about uncertainty.

bdgwx
Reply to  Carlo, Monte
August 19, 2022 12:36 pm

I’m not foolish enough to claim I understand EVERYTHING about uncertainty, but I do claim I know SOMETHING about it. For example, I know you can propagate the uncertainty of individual measurements that have gone through a combining function using GUM equation 10.

Carlo, Monte
Reply to  bdgwx
August 19, 2022 2:44 pm

The ONLY reason you do is because of your false belief that you can use it to average uncertainties away.

As Jim has tried to tell you countless times, Eq. 10 is for a functional relationship between an output quantity and the input quantities needed to calculate it. A mindless average operation doesn’t qualify!

This is the entire extent of your uncertainty knowledge, and it is 1000% wrong.

Reply to  bdgwx
August 19, 2022 3:04 pm

“For example, I know you can propagate the uncertainty of individual measurements that have gone through a combining function using GUM equation 10.”

That is *ONLY* useful for where you have multiple measurements of the SAME thing that generate something like a Gaussian distribution!

From the GUM:

“The standard uncertainty of y, where y is the estimate of the measurand Y”

Please note carefully (this has been pointed out to you MULTIPLE times over the past year) the word “measurand“.

It doesn’t say measurands.

If you look at Fig 1 it says: “Figure 1 — Graphical illustration of evaluating the standard uncertainty of an input quantity from repeated observations ”

“AN INPUT QUANTITY”. Not input quantities!

I’ve attached a screenshot of Figure 1. Please note that it lays out a GAUSSIAN distribution!

The odds of getting a Gaussian distribution from 1000’s of temperature measurement stations is so low as to be zero!

gum_fig_1.png
bdgwx
Reply to  Tim Gorman
August 19, 2022 7:26 pm

TG said: “That is *ONLY* useful for where you have multiple measurements of the SAME thing that generate something like a Gaussian distribution!”

And yet the GUM provides examples of the individual measurements being of different things. In fact, the measurements in their examples don’t even have the same units in many cases. If you believe they have made a mistake perhaps you can contact them and lobby to have the GUM changed. When you get it changed let me know and I’ll change my position on this matter.

Carlo, Monte
Reply to  bdgwx
August 19, 2022 9:08 pm

Idiot, you just dig the hole you are in deeper and deeper with nonsense like this.

Unskilled and Unaware

Reply to  Carlo, Monte
August 20, 2022 1:18 pm

He never looks up to see how deep the hole is!

Reply to  bdgwx
August 20, 2022 6:24 am

“And yet the GUM provides examples of the individual measurements being of different things.”

I can’t find it associated with Eq. 10. If you can then please post a quote or at least a link to it!

“In fact, the measurements in their examples don’t even have the same units in many cases.”

As C,M has pointed out, that is for when you are trying to calculate a functional relationship. The dimensions always finally work out to the dimension of the quantity you are trying to find the uncertainty of. Each of those component values must have their uncertainty propagated from the measurements of those component values into the functional relationship.

E.g. finding current in an electrical circuit. You can measure voltage and resistance, each with a valid set of measurements of the same thing, and calculate the current along with its associated propagated uncertainty. The dimensions of voltage and resistance work out to amperes!
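To make that circuit example concrete, here is what it looks like worked through GUM equation (10) for uncorrelated inputs, u(I)² = (∂I/∂V)²u(V)² + (∂I/∂R)²u(R)². A minimal sketch with made-up values and uncertainties:

```python
import math

# Made-up measured values and standard uncertainties, for illustration only.
V, u_V = 12.0, 0.05   # voltage estimate (V) and its standard uncertainty
R, u_R = 4.0, 0.02    # resistance estimate (ohm) and its standard uncertainty

I = V / R             # functional relationship: I = V / R  ->  3.0 A

# GUM eq. (10), uncorrelated inputs: combine via sensitivity coefficients.
dI_dV = 1.0 / R       # partial derivative of I with respect to V
dI_dR = -V / R**2     # partial derivative of I with respect to R
u_I = math.sqrt((dI_dV * u_V) ** 2 + (dI_dR * u_R) ** 2)
print(I, u_I)         # 3.0 A, with u(I) a little under 0.02 A
```

The volts and ohms in the sensitivity terms cancel down to amperes, which is exactly the dimensional bookkeeping being argued about here.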

“If you believe they have made a mistake perhaps you can contact them and lobby to have the GUM changed. “

They haven’t made a mistake. *YOU* just don’t understand what they are saying!

Reply to  Tim Gorman
August 20, 2022 5:06 pm

You have a penchant for understatement.

bdgwx
Reply to  Tim Gorman
August 21, 2022 3:35 pm

TG said: “I can’t find it associated with Eq. 10. If you can then please post a quote or at least a link to it!”

According to the verbiage on pg. 19 for equation 10, f is the function given in equation 1 and described on pg. 8 as Y = f(X_1, X_2, …, X_n). There is no requirement stated that the X inputs must be of the same thing. And, in fact, the example given on pg. 9 is of X’s of different things. The verbiage also indicates that the X’s can themselves be dependent on other quantities.

Reply to  bdgwx
August 21, 2022 4:51 pm

“…function given in equation 1 and described on pg. 8 as Y = f(X_1, X_2, …, X_n). There is no requirement stated that the X inputs must be of the same thing.”

It is true that there is no requirement that the input variables be the same thing. However, the dimensional analysis of the inputs must provide the correct output value.

Functional relationships require exact descriptions of the physical units needed. You can’t just throw some things in a pot and come up with a functional relationship. If you’ve never done dimensional analysis, you have never taken a physical science course.

Reply to  bdgwx
August 21, 2022 7:54 pm

“According to the verbiage on pg. 19 for equation 10 f is the function given in equation 1 and described on pg. 8 as Y = f(X_1, X_2, …, X_n). There is no requirement stated that the X inputs must be of the same thing. And, in fact, the example given on pg. 9 is of X’s of different things. The verbiage also indicates that the X’s can themselves be dependent on other”

You are as bad as bellman at cherry-picking stuff you simply don’t understand, hoping it will bolster your argument.

From the wording around equation 1:

“Many measurements are modelled by a real functional relationship f between N real valued input quantities X1 , . . . , XN and a single real-valued output quantity (or measurand) Y in the form

Y = f (X1 , . . . , XN ).”

The term measurand is singular! It is not multiple. The X quantities are associated with ONE MEASURAND!

You are trying to turn the equation into a multivariate one and the same page says: “The measurement model can be multivariate where there is more than one measurand, denoted by Y1 , . . . , Ym;”

I would also point you to

———————————————–
3.8 multivariate measurement model (multivariate model)

Measurement model in which there is any number of output quantities.

NOTE 1 The general form of a multivariate measurement model is the equations
h1(Y1, . . . , Ym, X1, . . . , XN) = 0, . . . , hm(Y1, . . . , Ym, X1, . . . , XN) = 0,
where Y1, . . . , Ym, the output quantities, m in number, in the multivariate measurement model, constitute the measurand, the quantity values of which are to be inferred from information about input quantities X1, . . . , XN in the multivariate measurement model.
—————————————————

Stop trying to cherry pick from the GUM so you can say that it handles multiple measurements of different things the same way it handles multiple measurements of the same thing. The first has multiple measurands, the second has one measurand and that is what the GUM addresses!

bdgwx
Reply to  Tim Gorman
August 22, 2022 5:40 am

TG said: “The term measurand is singlular! It is not multiple. The X quantities are associated with ONE MEASURAND!”

First, the function f accepts multiple measurands X_1 through X_N. That is plural as in many. And the example given on pg. 9 is of measurands (plural) of different things.

Second, section 4.1.2 states “The input quantities X_1, X_2, …, X_N upon which the output quantity Y depends may themselves be viewed as measurands and may themselves depend on other quantities, including corrections and correction factors for systematic effects, thereby leading to a complicated functional relationship f that may never be written down explicitly.”

Third, the term “measurand” is defined exactly as “particular quantity subject to measurement”, where “measurement” is defined exactly as “set of operations having the object of determining a value of a quantity”.

Fourth, most of the examples given in the GUM are of functions that accept inputs of different things. One notable exception is that of H.3 which…wait for it…not only sums temperatures but averages them too. Clearly the GUM thinks averaging temperatures is meaningful and useful.

Carlo, Monte
Reply to  bdgwx
August 22, 2022 6:41 am

“Fourth, most of the examples given in the GUM are of functions that accept inputs of different things. One notable exception is that of H.3 which…wait for it…not only sums temperatures but averages them too. Clearly the GUM thinks averaging temperatures is meaningful and useful.”

mr. disingenuous counts another coup!

Clearly you did not understand what you were reading, while engrossed in your headlong desperate search for any example you think vindicates your nonsense.

Opening the GUM, going to example H3; the TITLE says:

“H.3 Calibration of a thermometer”

Did it ever dawn inside your trendologist brain that this might be a leeeeeetle different? This is for a calibration in a lab!

Continuing:

“This example illustrates the use of the method of least squares to obtain a linear calibration curve and how the parameters of the fit, the intercept and slope, and their estimated variances and covariance, are used to obtain from the curve the value and standard uncertainty of a predicted correction.”

Oh look, nothing about averaging.

“H.3.1 The measurement problem

“A thermometer is calibrated by comparing n = 11 temperature readings t_k of the thermometer, each having negligible uncertainty, with corresponding known reference temperatures t_R, k in the temperature range 21 °C to 27 °C to obtain the corrections b_k = t_R, k − t_k to the readings. The measured corrections b_k and measured temperatures t_k are the input quantities of the evaluation. A linear calibration curve

“b(t) = y_1 + y_2(t – T_0) H.12

“is fitted to the measured corrections and temperatures by the method of least squares. The parameters y_1 and y_2, which are respectively the intercept and slope of the calibration curve, are the two measurands or output quantities to be determined.”

Absolutely NOTHING in here about any kind of averaging; this is a comparison of the output of an thermometer being calibrated against a calibration reference. Completely UNLIKE air temperature averaging in the context of global warming.
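For reference, the fit the H.3 text quoted above describes is an ordinary least-squares straight line. A minimal sketch of that structure with made-up calibration data (these are not the GUM’s actual H.3 numbers):

```python
# Ordinary least-squares fit of b(t) = y1 + y2*(t - t0), the form of the H.3
# calibration curve. The readings and corrections below are made up.
t0 = 20.0                                             # reference temperature, deg C
temps = [21.5, 22.5, 23.5, 24.5, 25.5, 26.5]          # thermometer readings t_k
corrs = [-0.17, -0.16, -0.14, -0.12, -0.11, -0.10]    # measured corrections b_k

n = len(temps)
x = [t - t0 for t in temps]
xbar = sum(x) / n
bbar = sum(corrs) / n
# Closed-form OLS estimates of slope and intercept:
sxx = sum((xi - xbar) ** 2 for xi in x)
sxb = sum((xi - xbar) * (bi - bbar) for xi, bi in zip(x, corrs))
y2 = sxb / sxx            # slope of the calibration curve
y1 = bbar - y2 * xbar     # intercept of the calibration curve
print(y1, y2)
```

The GUM then derives the variances and covariance of y1 and y2 from the fit residuals; that step is omitted in this sketch.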

Try again, mr. disingenuous, another fail. Your paragraph quoted above is a LIE.

bdgwx
Reply to  Carlo, Monte
August 22, 2022 7:48 am

CM said: “Your paragraph quoted above is a LIE.”

I stand by what I said. Not only do they sum temperatures, but they average temperatures as well and use the result (see H.16a). Just because you didn’t notice or ignored it doesn’t make it any less true. Clearly they think averaging temperatures is meaningful and useful.

Carlo, Monte
Reply to  bdgwx
August 22, 2022 8:02 am

A characteristic of hardened liars is that when confronted with their falsehoods, they double-down and deny with additional lies: “It’s not me in that video!”

The text prior to H.16a says NOTHING about averaging! It is about removing correlation between the slope and the intercept.

I repeat:

Completely UNLIKE air temperature averaging in the context of global warming.

Are there ANY mathematical operations that you DON’T label as “averaging”?

bdgwx
Reply to  Carlo, Monte
August 22, 2022 8:36 am

CM said: “The text prior to H.16a says NOTHING about averaging!”

You don’t think t_bar = Σ[t_k, 1, n] / n is an average?

CM said: “Completely UNLIKE air temperature averaging in the context of global warming.”

What difference does that make? We were told an average temperature was meaningless and useless. In fact, we were told that averaging any intensive property is meaningless and useless. Until now no one has said anything about that proclamation being only in the context of global warming.

Carlo, Monte
Reply to  bdgwx
August 22, 2022 9:40 am

We were told an average temperature was meaningless and useless.

GO READ THE PAPER, LIAR!

H.3.5 Elimination of the correlation between the slope and intercept

Ignore context and claim victory!

bdgwx
Reply to  Carlo, Monte
August 22, 2022 12:03 pm

First, it’s not about winning. It’s about learning.

Second, I’m not the one who’s ignoring context here. It was Kip’s blanket assertion that averaging intensive properties is meaningless, useless, and non-existent under all contexts, and the knee-jerk contrarian defense of it. I’m one of the few on here who challenged it.

Carlo, Monte
Reply to  bdgwx
August 22, 2022 1:41 pm

It’s about learning.

Another irony overload.

Reply to  Carlo, Monte
August 22, 2022 3:06 pm

yep.

Reply to  bdgwx
August 22, 2022 1:13 pm

“We were told an average temperature was meaningless and useless.”

You still can’t get it straight! Calibrating a LIG thermometer is actually determining the volume of liquid in a tube at specified external conditions. What is volume?

Calibrating a thermistor is actually determining the current through a device at specified external conditions. What is current?

This is entirely different than trying to find an average temperature between location A and location B when the temperatures at each location cannot act at a distance to determine the temperature at any intermediate point. The temperatures at each point, A/B/C are determined by the factors existing at each point, not by factors at the other two points.

Some things like volume and current you can average, some things like temperature you can not average. You just can’t seem to get that into your head!

Carlo, Monte
Reply to  Tim Gorman
August 22, 2022 1:27 pm

Some things like volume and current you can average, some things like temperature you can not average. You just can’t seem to get that into your head!

No, he refuses to allow reality to penetrate into his head.

Reply to  Carlo, Monte
August 22, 2022 2:59 pm

I swear that the two of them have lived in their parents’ basements all of their lives. They’ve never soldered anything, they’ve never tried to build a stud wall, they’ve never created a beam to span a distance, they’ve never built a set of stairs and had to figure out tread and rise, they’ve never used a micrometer on a crankshaft journal, they’ve never gapped a sparkplug, they’ve never thought about how their house thermostat works, they’ve never installed a pre-hung door, they’ve never had to analyze the hit pattern on a target from a rifle, they’ve never tried to actually physically find an average temperature between an interior wall in their house and the asphalt on their street, they’ve never had to figure out the welding settings (acetylene or stick) in order to weld different thicknesses of steel, and just about any other example you can think of.

Carlo, Monte
Reply to  Tim Gorman
August 22, 2022 3:35 pm

Nor performed a real uncertainty analysis.

But they can lecture experienced professionals about their deep knowledge of measurement uncertainty acquired by typing stuff into google.

Reply to  Carlo, Monte
August 23, 2022 6:27 am

And then cherry picking stuff they don’t actually understand.

Reply to  Carlo, Monte
August 23, 2022 2:50 pm

Nor ordered a bronze bearing while deciding whether to order 3/1000 or 2/1000 oversize. Is the journal round enough not to freeze with one or the other? No uncertainty involved here, just go measure another journal and average the readings. ROFLMA

Reply to  bdgwx
August 23, 2022 7:25 am

You need classes in physical science! Learn what acting at a distance means. Learn what a field is.

The paper does not say you can’t average temperature EVER! If you have a block and apply heat there will be a temperature differential both in time and distance from the source. You can calculate an average temp after you have equilibrium but it will be meaningless. Conduction in many materials is not linear.

Carlo, Monte
Reply to  Jim Gorman
August 23, 2022 12:57 pm

Learn what a field is.

Bullseye!

bdgwx
Reply to  Jim Gorman
August 23, 2022 2:06 pm

JG said: “The paper does not say you can’t average temperature EVER!”

One cannot average temperatures.

Reply to  bdgwx
August 23, 2022 4:28 pm

One can not average disparate temperatures that are used to determine something else. As I said before, you could average the start and ending temperatures of a given object. What do you think it would tell you? Do you think it would be more accurate than measuring the time and temperature at equilibrium? The only use I can think of is validating a gradient equation to see if it is accurate or not. Do you have a gradient for the global field of temperature? What is it? All I’ve seen is an average, which is useless.

Reply to  bdgwx
August 22, 2022 12:58 pm

“First, the function f accepts multiple measurands X_1 through X_N.”

NO! I’ve already pointed this out to you! Here is what the GUM says:

“…XN and a single real-valued output quantity (or measurand) Y in the form” (bolding mine, tg)

X_1 through X_N are MEASUREMENTS not different measurands!

“And the example given on pg. 9 is of measurands (plural) of different things.”

——————————————-
Page 9:

Simple theoretical model of mass measurement

The mass m of a weight is measured with a spring balance. The relationship between the restoring force F exerted by the spring and m at equilibrium is F = mg, where g is the acceleration due
to gravity. Hooke’s law relating F to the extension X of the spring and the spring constant k is F = kX. Thus, a theoretical measurement model relating mass to extension, spring constant and acceleration due to gravity is

m =(k/g)X
———————————————-

This example is used for calculating the value of a functional relationship! The uncertainty of the output (m) is the uncertainty propagated from the components on the right side. Each component on the right side has its own uncertainty.
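As an aside, the propagation described here for m = (k/g)X can be sketched numerically. The partial derivatives come straight from the quoted model; the input values and uncertainties below are invented purely for illustration:

```python
import math

def combined_uncertainty_m(k, u_k, g, u_g, X, u_X):
    """Law of propagation of uncertainty (GUM eq. 10) applied to m = (k/g)*X:
    sum in quadrature of (partial derivative * input uncertainty) for each input."""
    dm_dk = X / g              # sensitivity to the spring constant
    dm_dg = -k * X / g ** 2    # sensitivity to local gravity
    dm_dX = k / g              # sensitivity to the measured extension
    return math.sqrt((dm_dk * u_k) ** 2 + (dm_dg * u_g) ** 2 + (dm_dX * u_X) ** 2)

# Invented inputs: k = 250 N/m +/- 1, g = 9.81 m/s^2 +/- 0.01, X = 0.039 m +/- 0.0005
m = (250.0 / 9.81) * 0.039
u_m = combined_uncertainty_m(250.0, 1.0, 9.81, 0.01, 0.039, 0.0005)
```

With these invented numbers the extension term dominates the combined uncertainty; the balance shifts with the inputs. Each component on the right side is handled separately, exactly as the comment says.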

There is *NO* functional relationship between a load of boards of random length and an output “Y”. There is no functional relationship defined between the temperature at location A and the temperature at location B, at least not a simple one.

An average is *NOT* a functional relationship.

I’ll never understand how someone can think that the average value of multiple, random measurands can tell you anything about what the next measurand will be. It’s like trying to guess what the next card out of a deck will be when the deck has an infinite number of cards, all of which are different!

Carlo, Monte
Reply to  Tim Gorman
August 22, 2022 1:25 pm

These persons remind me of the Cheyenne contraries who rode horses facing backward and washed in dirt.

bdgwx
Reply to  Tim Gorman
August 22, 2022 4:45 pm

TG said: “NO! I’ve already pointed this out to you! Here is what the GUM says:
“XN and a single real-valued output quantity (or measurand) Y in the form” (bolding mine, tg)”

First, when I refer to the GUM I am talking about the canonical GUM document, JCGM 100:2008. It clearly states that the function f used in equation 10 accepts input quantities that themselves can be referred to as measurands.

Second, the verbiage you are quoting comes from JCGM 6:2020 which is titled Part 6: Developing and using measurement models. There is nothing wrong with that. Just know it is a different document.

Third, the verbiage you are quoting is clearly describing the output of the function as a measurand. It is not inconsistent with the verbiage in 100:2008 describing the inputs as measurands as well.

Fourth, if you really want to go here and start citing JCGM 6:2020 then understand that it says you can abate the extent of the uncertainty by averaging values in a timeseries. This is discussed in regards to temperatures nonetheless. Do you really want to introduce this document into the discussion?

TG said: “X_1 through X_N are MEASUREMENTS not different measurands!”

Here is what the GUM says on pg. 9.

The input quantities X1, X2, …, XN upon which the output quantity Y depends may themselves be viewed as measurands and may themselves depend on other quantities, including corrections and correction factors for systematic effects, thereby leading to a complicated functional relationship f that may never be written down explicitly.

Reply to  bdgwx
August 23, 2022 6:49 am

“It clearly states that the function f used in equation 10 accepts input quantities that themselves can be referred to as measurands.”

Each and every factor on the right side of the equation is a separate measurand all of its own, be it spring extension, gravity, or whatever. You define the uncertainty of each measurand separately through separate multiple measurements of each measurand. There is no such thing as doing multiple measurements of different things in order to define each measurand. You don’t measure the extension of different springs in order to try and find an “average X” to use in the functional relationship. That simply makes no physical sense at all! You make multiple measurements of a single spring in order to get a value to insert into the functional relationship to determine the mass.

“Just know it is a different document”

A non sequitur. It has nothing to do with the issue at hand.

“Third, the verbiage you are quoting is clearly describing the output of the function as a measurand. It is not inconsistent with the verbiage in 100:2008 describing the inputs as measurands as well.”

The uncertainty of each measurand has to be determined separately, e.g. using Eq 10.

There is no averaging multiple measurements of different measurands.

Which is what you are trying to support. Such as measuring the extension factor of multiple different springs in order to get an “average extension factor” to use in the functional relationship to determine the mass of a specific spring. Or in trying to define an average board length by measuring multiple different boards so the average can be used in a functional relationship such as for calculating the torque from a lever arm. That average isn’t useful for the calculation in a specific situation!

“The input quantities X1, X2, …, XN upon which the output quantity Y depends may themselves be viewed as measurands …”

But those measurands are not AVERAGED together! Each contributes separately to the quantity Y! Each measurand is handled separately!

m = (k/g)X

You have multiple measurands. Yet they are not averaged together to get m! Each is determined separately and its uncertainty propagated onto m. Where are you getting the understanding that X1, X2, …, Xn are *averaged*?

Reply to  Tim Gorman
August 23, 2022 8:36 am

I will guarantee that this person has never taken a calculus-based physical science course and done the labs required for it.

“Where are you getting the understanding that X1, X2, …, Xn are *averaged*?”

Best question yet!

bdgwx
Reply to  Jim Gorman
August 23, 2022 9:45 am

JG said: “Where are you getting the understanding that X1, X2, …, Xn are *averaged*?”

I never said they were. What I said is that they are measurands and can be of different things. The GUM even provides examples where X1, X2, …, XN are different things.

And because I can already see where this is going make sure you are not conflating the inputs X1, X2, …, XN with the output of Y = f(X1, X2, …, XN) which is also a measurand. The only requirement of f is that it be a function that operates on real numbers and produces a single real number as the output. There is nothing in either JCGM 100:2008 or JCGM 6:2020 that says f cannot compute the average of the inputs. In fact JCGM 100:2008 says f can be so arbitrarily complex that it cannot even be stated explicitly. It can be anything you want it to be.

Reply to  bdgwx
August 23, 2022 4:57 pm

“I never said they were. What I said is that they are measurands and can be of different things. The GUM even provides examples where X1, X2, …, XN are different things.”

Of course they are different things! Gravity is different from extension is different from mass, etc. Voltage is different from current is different from resistance.

But you don’t AVERAGE THEM ALL TOGETHER!

“There is nothing in either JCGM 100:2008 or JCGM 6:2020 that says f cannot compute the average of the inputs.”

The GUM provides NO MENTION OF AVERAGING DIFFERENT THINGS in any way, shape, or form. It only talks about Y being the output of a functional relationship with components (X1, X2, …, Xn). It says nothing about averaging X1, X2, …, Xn together!

“In fact JCGM 100:2008 says f can be so arbitrarily complex that it cannot even be stated explicitly. It can be anything you want it to be.”

Using this logic it’s possible to say that averaging the size of a grapefruit can be used to determine temperature!

Face it, you are still trying to defend an indefensible position. The GUM simply doesn’t say what you think it says!

bdgwx
Reply to  Tim Gorman
August 23, 2022 10:27 am

TG said: “Each and every factor on the right side of the equation is a separate measurand all of its own, be it spring extension, gravity, or whatever.”

Yes. Finally we are getting somewhere. The function f accepts multiple measurands. Each input is its own measurand. And they don’t have to be of the same thing.

TG said: “There is no averaging multiple measurements of different measurands.”

Where does the GUM say the function f cannot be Σ[X_i, 1, N] / N? Which document number and which page number? Be precise.
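For what it’s worth, if f is taken to be the arithmetic mean — the very point in dispute in this thread — equation 10 applies mechanically: every partial derivative is 1/N, so the input uncertainties enter in quadrature scaled by 1/N. The sketch below shows only that arithmetic, with invented numbers; it takes no position on whether the resulting average is physically meaningful:

```python
import math

def mean_and_propagated_u(values, uncertainties):
    """Treat f as the arithmetic mean Y = sum(X_i)/N and apply the GUM eq. 10
    quadrature rule; since df/dX_i = 1/N for every input, each uncertainty
    contributes (u_i/N)^2 to the combined variance."""
    n = len(values)
    y = sum(values) / n
    u_y = math.sqrt(sum((u / n) ** 2 for u in uncertainties))
    return y, u_y

# Invented example: four readings, each with an invented uncertainty of 0.5
y, u_y = mean_and_propagated_u([20.1, 19.8, 20.4, 20.0], [0.5] * 4)
```

With equal input uncertainties u, the formula collapses to u/sqrt(N) — 0.25 here — which is the "abatement" bdgwx refers to; whether that number means anything for disparate measurands is exactly what the two sides disagree about.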

Reply to  bdgwx
August 23, 2022 4:59 pm

“Where does the GUM say the function f cannot be Σ[X_i, 1, N] / N? Which document number and which page number? Be precise.”

How do I prove a negative? *YOU* prove that it says that!



Reply to  bdgwx
August 23, 2022 7:12 am

“Fourth, if you really want to go here and start citing JCGM 6:2020 then understand that it says you can abate the extent of the uncertainty by averaging values in a timeseries. This is discussed in regards to temperatures nonetheless. Do you really want to introduce this document into the discussion?”

I assume you are speaking of Section 11.7, Models for Time Series.

The section starts off with this:

“11.7.1 Observations made repeatedly of the same phenomenon or object, over time or along a transect in space, often equispaced in time or equidistant in space, tend to be interrelated, the more so the greater their proximity. The models used to study such series of observations involve correlated random variables” (bolding mine, tg)

Please note carefully that the section specifically states that THE SAME PHENOMENON OR OBJECT is being discussed. Not multiple objects or phenomena being averaged.

Note in the example concerning the temperature of a thermal bath it says:

“The readings of temperature listed in Table 8 and depicted in Figure 8 were taken every minute with a thermocouple immersed in a thermal bath during a period of 100 min.”

Please note the use of the word “a”. “A” thermocouple, the same measuring device. “A” thermal bath, not multiple thermal baths.

So you are getting multiple measurements of the same thing using the same device. YES YOU CAN AVERAGE THOSE! And get something useful about that one, singular thermal bath. You cannot use multiple thermocouples in multiple thermal baths to get a useful average temperature to describe a non-existent thermal bath.

You are *still* trying to defend the proposition that you can get multiple measurements of different things. I suspect you are still trying to defend the proposition that you can also minimize uncertainty by averaging the uncertainty of different things.

Both are wrong. And you just keep on banging your head against the wall because you don’t want to admit it!

bdgwx
Reply to  Tim Gorman
August 23, 2022 7:31 am

Finally we are getting somewhere. It is unequivocal. You CAN average temperature and get something useful, namely the abatement of uncertainty, which I only bring up because that has been a long drawn-out debate as well.

And make sure you read JCGM 6:2020 section 11.7 carefully. Nowhere does it say you cannot average temperature over a spatial domain. In fact, it implies that you can. All it says is that the closer in time and space the measurements are made the more they are interrelated and require correlated analysis. That is their point regarding proximity both temporally and spatially. It is a point Bellman and I have made on numerous occasions.

Let me summarize. Your own source says in no uncertain terms that 1) temperatures can be averaged and 2) averaging abates the extent of uncertainty.

BTW…thank you for bringing JCGM 6:2020 to my attention. That should settle the debate once and for all now.

Reply to  bdgwx
August 23, 2022 8:49 am

What is the temperature average for? Temperature! Do you see any mention of heat? Is temp being used in a thermodynamic scenario?

You obviously have a background in meteorology. Have you ever studied the subject of thermodynamics that requires calculus? Do you think that a big portion of the sun’s energy is converted into latent heat? How does temperature measure latent heat?

bdgwx
Reply to  Jim Gorman
August 23, 2022 9:55 am

JG said: “Do you see any mention of heat? Is temp being used in a thermodynamic scenario?”

Deflection and diversion. We aren’t even discussing the relationship between temperature and heat here. We are discussing whether an average temperature is meaningful and useful and, by extension, whether trends of the average temperature are meaningful and useful.

And the new piece of evidence added to this conversation by TG (GUM document JCGM 6:2020) not only says temperatures can be averaged, but says it can be done to “abate the extent of the uncertainty”.

Reply to  bdgwx
August 24, 2022 7:14 am

“You CAN average temperature and get something useful, namely the abatement of uncertainty, which I only bring up because that has been a long drawn-out debate as well.”

You can average temperature measurements OF THE SAME THING. If the multiple measurements of the same thing generate a random distribution free of systematic bias then you can assume that the random error cancels and just average the measurements to find a “true value”.

You can *NOT* average multiple measurements of different things and get a useful average.

Say you have one thermometer, one calibrated probe, and 50 different thermal baths.

In order to calibrate that thermometer would you measure all 50 thermal baths, find their average, and use *that* average temp to calibrate the thermometer?

Or would you make 50 measurements of the single thermal bath the thermometer is sitting in and use the average of those 50 measurements?

One technique is averaging single measurements of multiple things, the other is averaging multiple measurements of the same thing.

Do you not see the difference?
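The contrast drawn in this comment can be illustrated with a small simulation — all values invented: repeated readings of one bath average down the reading noise toward that bath’s true temperature, while single readings of fifty different baths average to a number that describes none of them:

```python
import random
import statistics

random.seed(42)
TRUE_BATH = 60.0   # one hypothetical thermal bath held at 60 degrees
U_READ = 0.5       # invented per-reading noise of a single thermometer

# Case 1: 50 readings of the SAME bath -- averaging beats down reading noise.
same_bath = [TRUE_BATH + random.gauss(0, U_READ) for _ in range(50)]
mean_same = statistics.mean(same_bath)

# Case 2: one reading each of 50 DIFFERENT baths set anywhere from 20 to 90.
bath_temps = [random.uniform(20, 90) for _ in range(50)]
one_each = [t + random.gauss(0, U_READ) for t in bath_temps]
mean_each = statistics.mean(one_each)   # summarizes the spread, not any one bath
```

mean_same converges on the single bath’s true temperature as readings accumulate; mean_each merely summarizes how the bath settings were chosen and is of no use for calibrating against any particular bath — which is the distinction being argued here.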

Reply to  bdgwx
August 24, 2022 7:19 am

“That is their point regarding proximity both temporally and spatially. It is a point Bellman and I have made on numerous occasions.”

In other words, the GUM says you have to get as close as you can to MEASURING THE SAME THING MULTIPLE TIMES!

Just how spatially interrelated are the temperatures in Denver and Kansas City? In Phoenix and Miami? In Des Moines, IA and Mexico City?

The fact is that they are not interrelated at all! Yet we see the climate scientists averaging them to determine a “global average temperature”!

You seem to be your own worst enemy in trying to defend averaging temperatures for climate science!

Carlo, Monte
Reply to  bdgwx
August 19, 2022 9:10 pm

A succinct statement that illustrates your total bankruptcy of understanding about uncertainty.

Clyde Spencer
Reply to  Bellman
August 17, 2022 6:59 pm

My recollection is that a p-test was performed to determine the statistical significance. The null hypothesis that there was no trend (trend slope = 0) could not be rejected at p < 0.05; therefore, the null hypothesis was accepted.

There was no claim, other than by you, that a lack of statistically significant warming can “prove anything.” People on here can and do make many contradictory claims. Painting all of us with the same broad brush bespeaks of desperation or intellectual sloppiness. If you want to be taken seriously, I suggest that you name and quote the exact words supporting your claim.

Reply to  Clyde Spencer
August 18, 2022 5:14 am

The null hypothesis that there was no trend (trend slope = 0) could not be rejected at p < 0.05; therefore, the null hypothesis was accepted.

Which is a great example of how not to do a significance test. You do not accept the null-hypothesis. You either reject it or you cannot reject it. Not rejecting the null-hypothesis does not mean it is true, it just means that if it were true there would be a greater than 5% chance of seeing the results you did.

If you want to show there has been significant (in the statistical sense) warming, you can use a null hypothesis of zero warming, and see if the data is sufficient to reject it. If you can’t reject the null hypothesis it’s either because it’s true, or because the data is insufficient. The solution can then be to look at more data, e.g. a longer time period.

If you want to show there has been no warming, that’s a difficult thing to show statistically. My suggestion is that you can have a null hypothesis saying the warming rate is above some pre-defined small value. But simpler is to change the claim from zero warming to a slowdown. Then the null hypothesis is that the warming rate continued at its previous rate. If you can show that there is a low p-value to support that, i.e. that it is very unlikely you would have seen the “pause” if the warming rate had continued at its previous rate, then you have evidence (not proof) that there has been a slowdown.

Of course, so far there is insufficient evidence to make that claim; the standard error of the pause is just too big.
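The reject / fail-to-reject logic described above can be sketched with invented synthetic series (not real temperature data): compute the OLS slope against a time index and its t-statistic against a null of zero trend.

```python
import math
import random

def slope_t_stat(y):
    """OLS slope of y against its time index, plus the slope's t-statistic.
    Roughly, |t| > 2 lets us reject a null hypothesis of zero slope at p < 0.05;
    |t| < 2 means we CANNOT reject it -- which is not proof the slope is zero."""
    n = len(y)
    x = range(n)
    xbar = (n - 1) / 2
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    a = ybar - b * xbar
    resid_ss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    se_b = math.sqrt(resid_ss / (n - 2) / sxx)
    return b, b / se_b

random.seed(0)
noise_only = [random.gauss(0, 1) for _ in range(60)]            # no underlying trend
with_trend = [0.1 * i + random.gauss(0, 1) for i in range(60)]  # trend buried in noise

b_noise, t_noise = slope_t_stat(noise_only)
b_trend, t_trend = slope_t_stat(with_trend)
```

With the trended series the estimated slope sits near 0.1 per step and |t| is large, so the zero-trend null is rejected; with pure noise the test typically fails to reject — which, as the comment says, does not “accept” a zero trend.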

Reply to  Bellman
August 18, 2022 5:25 am

Continued.

Even if you could establish that the pause period was significantly different to the previous rate of warming, I still think there would be major issues with the Monckton method.

For one, it ignores the discontinuity in the trends. The fact that the pause starts around 0.2°C warmer than the end of the previous trend means that you have a lower warming rate but higher temperatures. The previous trend can easily pass through all the values, and if there’s a question, it’s why the temperatures are warmer than would be expected.

And then, you can’t ignore the cherry-picking nature of the start date. One of the assumptions of any significance test is that the data has been randomly selected. A selection process that involves looking back through all the data until you find the optimum start date to show zero trend, biases the result.

Reply to  Bellman
August 18, 2022 3:41 pm

There is no discontinuity. The anomaly is the anomaly. CoM didn’t create a new anomaly that caused a discontinuity.

“And then, you can’t ignore the cherry-picking nature of the start date.”

You just can’t keep from contradicting yourself, can you? About finding the length of a drought you said: “I would try to find an exact starting point where it started.”

EXACTLY what CoM does in trying to find the length of the pause!



Carlo, Monte
Reply to  Tim Gorman
August 18, 2022 3:54 pm

The old Department of Double Standards Department…

Reply to  Tim Gorman
August 18, 2022 4:28 pm

It is what I say it is! Who said that?

Reply to  Tim Gorman
August 18, 2022 4:57 pm

“There is no discontinuity.”

[attached image: 20220709wuwt1.png]
Carlo, Monte
Reply to  Bellman
August 18, 2022 6:12 pm

ANY midpoint you pick will have a disconnect between two regression fits because the data points are DIFFERENT.

Reply to  Carlo, Monte
August 18, 2022 6:53 pm

Tell that to Gorman, he’s the one who keeps insisting there is no discontinuity.

You’re wrong of course, but I expect you know that.

Carlo, Monte
Reply to  Bellman
August 19, 2022 5:36 am

Time and again you have demonstrated that these trend lines are the only way you view reality, and have to lie and obfuscate to cover it up.

Do you need me to demonstrate?

Reply to  Carlo, Monte
August 19, 2022 8:03 am

Feel free to demonstrate it, just as I am free to ignore your delusions.

Reply to  Carlo, Monte
August 19, 2022 10:44 am

Can you imagine buying a stock with this variability (volatility) based on an increasing trend that shows unlimited growth? Would the ups wipe out the downs?

How about selling a device to control a rocket motor whose output had this kind of variability?

Oh! It is just noise!

Carlo, Monte
Reply to  Jim Gorman
August 19, 2022 10:55 am

“No way this could happen! We designed it to respond to the trend line!”

Reply to  Jim Gorman
August 19, 2022 12:02 pm

Financial markets do not behave in the same way as global temperatures. But I and others do invest in the markets on the assumption that whilst in the short term markets go up and down, there’s a reasonable assumption that they will go up in the long term.

Carlo, Monte
Reply to  Bellman
August 19, 2022 2:45 pm

WHOOOOOOSH

Reply to  Bellman
August 19, 2022 10:06 am

There is *NO* discontinuity in the data.

If you plot the uncertainty points (plus and minus) and use them for a linear regression the trend lines formed by those data points will not meet nor will they meet the trend line for the stated values.

Does that mean that those uncertainty point trend lines represent discontinuities?

NOPE!

A discontinuity is typically defined as a point where the data on one side goes to a different value in the limit than the data on the other side. You can also have a discontinuity in a transfer function where you have a pole (i.e. where the denominator goes to zero), which drives the output to infinity.

Please note carefully the use of the word “data”.

Once again, you simply don’t know what you are talking about. You’ve cherry picked the word “discontinuity” and are trying to use it to bolster your argument without actually understanding what it means.

Carlo, Monte
Reply to  Tim Gorman
August 19, 2022 10:29 am

Or a step function.

He thinks the trend lines are “data”.

Reply to  Tim Gorman
August 19, 2022 12:08 pm

There is *NO* discontinuity in the data.

I’m not talking about discontinuity in the data. But in fact the data, being monthly averages, is entirely discontinuous.

The point is that Monckton’s choice of change point results in a discontinuity in the trends.

Reply to  Bellman
August 19, 2022 1:49 pm

“The point is that Monckton’s choice of change point results in a discontinuity in the trends.”

There is no discontinuity in trend lines! Trend lines are based on the data used to generate them. There is no restriction anywhere that I can find that says all trend lines must meet at a common point!

Carlo, Monte
Reply to  Tim Gorman
August 19, 2022 2:49 pm

In the general case, they won’t meet because the data used to calculate the two lines are different, and the regression operation ONLY considers the points given to it! It can’t be expected to be valid outside of the fitting range.

Reply to  Carlo, Monte
August 19, 2022 4:14 pm

If you want to have a realistic continuous change in trends, one option is to look for change points that actually do get close to meeting up. Or you can constrain the trend lines to be continuous, e.g. using linear splines.

I’ve done the second before, and selected the change date that produces the best fit, somewhere in 2012. As it happens, 2012 also gives you a reasonably good continuous fit even without the constraints.

I’m not saying that actually means there was a change in 2012, I doubt there is any significant difference between this and a linear trend across the whole data set, but it is to my eyes more convincing than Monckton’s.

[attached image: 20220819wuwt1.png]
Carlo, Monte
Reply to  Bellman
August 19, 2022 9:06 pm

Again, you are confusing and conflating your hallowed trend lines with real data.

Reply to  Bellman
August 20, 2022 6:35 am

“using linear splines.”

That is nothing more than a piecewise linear analysis! There is nothing anywhere that I can find that says it is the only valid way to interpret data!

As I’ve pointed out to you many times, I have experience in determining where to place new telephone central offices. Many times when looking at demographic data the past trend and current trend simply didn’t match up. There was no requirement that we do a piecewise linear analysis, it would provide nothing extra that we could use. You simply didn’t want to put a new central office where the trend over the past five years indicated a decline in population even if the past 40 years indicated a population growth!

“but it is to my eyes more convincing than Monckton’s”

Which we have proven is meaningless in the real world. You just like pretty graphs, even if they don’t tell you about current reality.

Reply to  Tim Gorman
August 20, 2022 1:49 pm

There is nothing anywhere that I can find that says that it the only valid way to interpret data!

There are an infinite number of valid ways of interpreting data. That doesn’t mean they are all equally good.

Many times when looking at demographic data the past trend and current trend simply didn’t match up.

Would you expect demographic data to be continuous?

You simply didn’t want to put a new central office where the trend over the past five years indicated a decline in population even if the past 40 years indicated a population growth!

What if 4 years ago there had been a temporary population explosion due to a construction project, and now the declining population was still above where it had been 6 years ago?

Reply to  Carlo, Monte
August 20, 2022 6:36 am

He likes “pretty” graphs, whether they tell you something useful or not.

Reply to  Tim Gorman
August 19, 2022 3:18 pm

There is no discontinuity in trend lines! Trend lines are based on the data used to generate them. There is no restriction anywhere that I can find that says all trend lines must meet at a common point!

If the two trend lines do not meet a common point, then by definition they are not continuous.

Carlo, Monte
Reply to  Bellman
August 19, 2022 3:31 pm

If the two trend lines do not meet a common point, then by definition they are not continuous.

But this DOES NOT imply there is a discontinuity in the data points themselves!

Reply to  Bellman
August 20, 2022 7:11 am

So what? Where is it written that the trend lines for two different data sets have to be continuous?

Reply to  Tim Gorman
August 20, 2022 1:13 pm

So what?

The so what is you’ve spent the last few months aggressively insisting there was no discontinuity.

Where is it written that the trend lines for two different data sets have to be continuous?

Nobody says they have to be continuous. I’ve said there are times when it’s important to spot potential discontinuities. But in this case it doesn’t make sense and suggests the “pause” is spurious.

You keep dodging the problem. If the pause and the discontinuity are genuine, it implies there was a decade or more’s worth of warming happening instantaneously in 2014. If you think the pause requires attention, then that sudden warming requires more. Despite almost 8 years of a pause, temperatures are still only about where we would have expected them to be if the previous warming trend had continued without pause.

If you want to see in writing the idea of continuous trends being used for changes in trend, look at the article you kept insisting was exactly the same technique that Monckton used.

https://wattsupwiththat.com/2022/05/14/un-the-world-is-going-to-end-kink-analysis-says-otherwise/

Assume that a change point (a “kink point”) may exist in the data set. For each point in the data set (termed a “candidate kink point”), split the data set at that point (with the candidate split point present in both data sets that are produced), and run regressions on the data before and after the candidate kink point, constraining the junction between the two line segments to be continuous.
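The quoted “kink” procedure can be sketched in a few lines of pure Python: for each candidate change point, fit a hinge model that is continuous at the kink by construction, and keep the candidate with the smallest residual sum of squares. This is a minimal sketch of that idea, not the article’s actual code; the synthetic series, with a bend deliberately placed at x = 10, is invented for illustration.

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]
    return x

def best_kink(x, y):
    """Try every interior point as the kink k in the continuous hinge model
    y = a + b*x + c*max(x - k, 0), fitted by least squares via the normal
    equations; return (sse, k, a, b, c) for the k with the smallest residual
    sum of squares. Continuity at the kink is guaranteed by the hinge form."""
    best = None
    for k in x[1:-1]:
        cols = [[1.0, xi, max(xi - k, 0.0)] for xi in x]
        A = [[sum(r[i] * r[j] for r in cols) for j in range(3)] for i in range(3)]
        rhs = [sum(r[i] * yi for r, yi in zip(cols, y)) for i in range(3)]
        a, b, c = solve3(A, rhs)
        sse = sum((yi - (a + b * xi + c * max(xi - k, 0.0))) ** 2
                  for xi, yi in zip(x, y))
        if best is None or sse < best[0]:
            best = (sse, k, a, b, c)
    return best

# Invented series: slope 1.0 per step, flattening to 0.2 per step at x = 10.
xs = [float(i) for i in range(21)]
ys = [xi if xi <= 10 else 10.0 + 0.2 * (xi - 10.0) for xi in xs]
sse, k, a, b, c = best_kink(xs, ys)
```

On this noiseless series the search recovers the planted kink at x = 10 with slope 1.0 before and 0.2 after (c = -0.8); on noisy real data the minimum-SSE kink is only an estimate, which is why the two sides can disagree about where, or whether, a change point exists.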

Reply to  Bellman
August 20, 2022 2:16 pm

“The so what is you’ve spent the last few months aggressively insisting there was no discontinuity.”
“Nobody says they have to be continuous.”

Cognitive dissonance at its finest! You claim that not being continuous is a problem and then state that they don’t have to be continuous.

“But in this case it doesn’t make sense and suggests the ‘pause’ is spurious.”

It’s right there in front of your face. Your cognitive dissonance just won’t let you see it!

“If the pause and the discontinuity are genuine, it implies there was a decade or mores worth of warming happening instantaneously in 2014.”

It doesn’t imply that at all! Now you’ve moved on to claiming that the different trend lines imply that the data has a discontinuity. It doesn’t imply that at all!

“Despite almost 8 years of a pause, temperatures are still only about where we would have expected them to be if the previous warming trend had continued without pause.”

NO. They are *NOT* where we would have expected them to be! We have seen COOLING since 2016, not warming. The warming trend suggests the exact opposite!

“If you want to see in writing the idea of continuous trends being used for changes in trend, look at the article you kept insisting was exactly the same technique that Monckton used.”

Stop lying! I never said they were the same technique! I said they found the same point where the trend changed!

You are so lost in the weeds trying to defend the indefensible that your delusions have taken over!

Reply to  Tim Gorman
August 20, 2022 3:19 pm

Not worth going round in circles again at this time, but this needs a response.

Stop lying! I never said they were the same technique! I said they found the same point where the trend changed!

You say they are not the same technique but find the same change point. Your exact words at the time were

Have you read the thread about “kink” analysis for understanding yet? Monckton is merely finding a “kink” in the data!

Now, why don’t you just go ahead and tell us that the newly described “kink” analysis is illegitimate as well?

and when I suggested he was not doing the same thing you replied

Of course that is what he is doing! He’s finding where the derivative of the data is changing slope. Your argument is nothing more than the old “Argument by Dismissal” fallacy you are so fond of employing!

Only when I asked you to demonstrate how Monckton was using the “kink” method, you retorted

I didn’t say he is using the kink algorithm. I said he is doing the same thing. Finding a kink point!

So, sorry if I’m confused, but I still don’t understand how you can claim that Monckton is doing the same thing as the “kink” analysis, but not using the same technique.

This might not matter if he did have an algorithm that always found the same point as the “kink” method, but it doesn’t. You are wrong. The “kink” analysis finds the best point in 2012, and Monckton finds a point in 2014. The kink analysis finds a kink which results in a faster warming rate lasting 10 years (despite the fact that 80% of the period was supposedly in a pause).

Reply to  Bellman
August 21, 2022 6:59 am

You simply cannot read.

“Of course that is what he is doing! He’s finding where the derivative of the data is changing slope.”

Finding the same thing is *NOT* the same as using the same method. You simply don’t seem to be able to make the distinction.

Finding a point on the prairie with a compass and a map works just as well as navigating to the same coordinates with a GPS. They are totally different methods that wind up giving the same result.

Using the kink algorithm and finding the beginning of a slope change is exactly the same. You wind up at the same destination using different methods.

Why are such simple concepts so hard for you to grasp?

Reply to  Tim Gorman
August 21, 2022 12:37 pm

I’ll repeat. He is not finding the same thing. His results are not the same as the results using the kink method.

Reply to  Bellman
August 21, 2022 4:24 pm

Of course he’s finding the same thing. You just don’t want to admit it.

Reply to  Tim Gorman
August 21, 2022 4:35 pm

So in your world February 2012 is the same thing as September 2014?

Reply to  Bellman
August 19, 2022 9:54 am

Do you see a discontinuity in the anomaly values? I don’t.

The fact that the trend lines don’t meet isn’t a discontinuity in the data. It’s merely a different way of analyzing the data.

The end of one trend and the beginning of another do *NOT* have to meet. That isn’t a discontinuity!

Reply to  Tim Gorman
August 19, 2022 12:14 pm

Do you see a discontinuity in the anomaly values?

Yes, but that’s unavoidable and nothing to do with the discontinuity in the trends.

The fact that the trend lines don’t meet isn’t a discontinuity in the data. It’s merely a different way of analyzing the data.

The claim is that the trend line represents a smoothed value of the data. The assumption is that when Monckton draws a flat trend line starting in September 2014, there is an underlying reality, e.g. that the world hasn’t on average warmed for the last 8 years. If the trend line doesn’t start where the previous one left off, either the underlying reality shows a system break happening in September 2014, which caused a spontaneous large increase in temperature, or, more likely, your choice of change point is wrong.

Reply to  Bellman
August 19, 2022 1:55 pm

“The claim is that the trend line represents a smoothed value of the data.”

“of the data USED!”. There, fixed it for you!

“The assumption is that when Monckton draws a flat trend line starting in September 2014, there is an underlying reality, e.g. that the world hasn’t on average warmed for the last 8 years.”

You are *still* trying to push the restriction that all trend lines *must* meet at a common point! That’s a restriction *YOU* have developed – it is *NOT* a common rule!

The trend line for the positive slope of a sine wave even has a different sign than the trend line of the negative slope of the sine wave. For you that would define a “discontinuity”. Do you understand how insane that sounds?

“If the trend line doesn’t start where the previous one left off, either the underlying reality shows a system break happening in September 2014, which caused a spontaneous large increase in temperature, or, more likely, your choice of change point is wrong.”

Malarky! All it shows is that you are analyzing the data differently!

You didn’t address my point concerning a trend line through the positive uncertainty points and the negative uncertainty points. Neither of those lines will ever meet. Does that indicate some kind of discontinuity in the trend lines?

Don’t run away. Answer the question!



Carlo, Monte
Reply to  Tim Gorman
August 19, 2022 2:51 pm

Christopher NEVER plots regression lines for the data prior to his start point!

Reply to  Carlo, Monte
August 20, 2022 6:39 am

“Christopher NEVER plots regression lines for the data prior to his start point!”

So what? Go ask your local hardware store owner if he is more interested in shovel sales from 40 years ago or is more interested in shovel sales over the past year!

We’ve had this discussion before. If you are going to use linear trend lines to forecast a natural process then you had better use the most recent data, not data from 40 years ago.

Why do you just keep going around and around in circles beating the same dead horse each time? You just never learn!

Reply to  Tim Gorman
August 20, 2022 7:19 am

I suspect Carlo was being sarcastic. Monckton does sometimes show regressions for the entire data set. What he doesn’t often do is show the pauses in the context of previous trends.

Carlo, Monte
Reply to  Bellman
August 20, 2022 7:28 am

Wrong—he never does this segment fitting stuff at all.

Reply to  Carlo, Monte
August 20, 2022 9:28 am

Then I apologise for trying to defend you.

Here he is doing just that.

https://wattsupwiththat.com/2021/08/02/the-new-pause-lengthens-again/

Carlo, Monte
Reply to  Bellman
August 20, 2022 9:39 am

You had to go back to 2021 to find this—ok fine, whatever:

s/never/rarely/

Is there ANYTHING on WUWT for which you don’t keep detailed files?

Reply to  Carlo, Monte
August 20, 2022 1:00 pm

Gosh, all the way back to last year. Of course he rarely likes to show the context explicitly in graph form, but he often shows graphs of the entire period, just without the pause shown in context. And it’s not like he doesn’t say all the time that the pause is like a staircase.

Is there ANYTHING on WUWT for which you don’t keep detailed files?

It’s the internet. There are ways of finding information. If you think this is something Monckton wants to hide, then you better ensure all old WUWT articles are deleted.

Carlo, Monte
Reply to  Bellman
August 20, 2022 2:59 pm

If you think this is something Monckton wants to hide,

Another leap into the darkness—how in the name of Phil did you come to this nutty idea?!??

then you better ensure all old WUWT articles are deleted.

More pompous idiocy.

Here is the real truth: CMoB lives rent-free inside your skull.

Reply to  Carlo, Monte
August 20, 2022 3:23 pm

Says the person, who every time I quote something more than a few months old, thinks I’m keeping extensive records on all my “enemies”.

Reply to  Tim Gorman
August 19, 2022 3:15 pm

Of course it’s the data used, how could it represent unused data?

You are *still* trying to push the restriction that all trend lines *must* meet at a common point!

I’m not saying that. Just that it’s generally a bad analysis if they don’t. This depends on whether you assume the data you are using is continuous. If this was measuring temperatures in a mechanical system, identifying structural breaks like this can be useful, as it could indicate a failure in the system.

The trend line for the positive slope of a sine wave even has a different sign than the trend line of the negative slope of the sine wave. For you that would define a “discontinuity”.

No, I expect the trend lines for the positive and negative slope of a sine wave to be continuous.

Do you understand how insane that sounds?

You’re the one saying it.

You didn’t address my point concerning a trend line through the positive uncertainty points and the negative uncertainty points. Neither of those lines will ever meet. Does that indicate some kind of discontinuity in the trend lines?

No. You have two separate trend lines covering the same time period. So as you would say, not a functional relationship.

Carlo, Monte
Reply to  Bellman
August 19, 2022 3:34 pm

Just that it’s generally a bad analysis if they don’t.

WTH does this mean? The data is bad? The regression programming has an error?

Reply to  Carlo, Monte
August 19, 2022 4:24 pm

There was a clue in the word “analysis”. You’ve analysed the data and come to a conclusion that is physically highly implausible if not impossible.

Carlo, Monte
Reply to  Bellman
August 19, 2022 9:04 pm

More nonsense word salad, trying to cover your previous nonsense about discontinuities and regression fits.

Reply to  Bellman
August 20, 2022 7:10 am

“I’m not saying that. Just that it’s generally a bad analysis if they don’t.”

That’s *YOUR* opinion from out of your small, small world view. Cue Shakespeare: “More things in heaven and earth ….”

“This depends on whether you assume the data you are using is continuous.”

It doesn’t depend on that at all! It depends on the data you choose to analyze.

“No, I expect the trend lines for the positive and negative slope of a sine wave to be continuous.”

But they are *NOT* the same for each. One goes up and one goes down.

“No. You have two separate trend lines covering the same time period. So as you would say, not a functional relationship.”

So the trend lines don’t have to meet for some analyses but they do for others?

ROFL!!!

The two trend lines are analyzing two different sets of data. Just like CoM does. He is analyzing a different set of data!

Reply to  Tim Gorman
August 20, 2022 1:35 pm

That’s *YOUR* opinion…

Of course it’s my opinion. Why would I be arguing it if it wasn’t?

“More things in heaven and earth ….”

You’re comparing the idea you can have discontinuous trends with a belief in ghosts?

But they are *NOT* the same for each. One goes up and one goes down.

Which has nothing to do with them being continuous or not. I would ask if you understood what the word meant, but you’ve given a correct definition, so I don’t see why you have a problem with this.

So the trend lines don’t have to meet for some analyses but they do for others?

You were describing having two different trend lines across the entire data. One through “negative uncertainty points” and one through he positive. I don’t know what you mean by uncertainty points, or why you want to do this. But regardless, they are two separate trend lines – two separate functions. Not one discontinuous function.

The two trend lines are analyzing two different sets of data. Just like CoM does.

Who is CoM? Viscount Monckton is splitting the same data set into two partitions. They are not two different sets of data: they are all cut from the same cloth, and some data keeps moving from one set to the other. Unless you think something catastrophic happened in whatever month the pause now starts in, there is no reason why the trends should not be continuous.

Reply to  Bellman
August 20, 2022 2:21 pm

“You’re comparing the idea you can have discontinuous trends with a belief in ghosts?”

I am defending that trend lines based on different data sets don’t have to be continuous. Something you agreed with and then forgot I guess!

“You were describing having two different trend lines across the entire data. One through “negative uncertainty points” and one through he positive. I don’t know what you mean by uncertainty points, or why you want to do this. But regardless, they are two separate trend lines – two separate functions. Not one discontinuous function.”

Measurement values are “stated value +/- uncertainty”. That means there are three data points you can graph from each measurement: the stated value, the stated value plus the uncertainty, and the stated value minus the uncertainty.

What is so hard to understand about this?

Each set of those points is a *different* data set and the trend lines never meet, they are not continuous.

CoM uses a subset of the overall data to find his pause length – i.e. it is a different data set than the overall data set. Thus the trend lines don’t have to ever meet and they don’t have to be continuous.

Again, why is this so hard for you to understand?
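For what it’s worth, the three data sets described here differ only by a constant offset, so ordinary least squares gives three parallel lines that never meet. A quick check with made-up numbers (the data and the uncertainty value are illustrative, not from any temperature record):

```python
import numpy as np

def ols_line(x, y):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

x = np.arange(10.0)
stated = 0.5 * x + np.sin(x)   # made-up "stated values"
u = 0.5                        # made-up constant uncertainty

slope_mid, int_mid = ols_line(x, stated)       # stated values
slope_hi,  int_hi  = ols_line(x, stated + u)   # stated value + uncertainty
slope_lo,  int_lo  = ols_line(x, stated - u)   # stated value - uncertainty
```

Adding a constant to every y value changes only the intercept of the fit, so the three lines share one slope and stay exactly 2u apart.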

Reply to  Tim Gorman
August 20, 2022 2:49 pm

Each set of those points is a *different* data set and the trend lines never meet, they are not continuous.

You are describing three different functions. Each continuous. It makes no sense to ask if a function is continuous with a different function.

“CoM uses a subset of the overall data to find his pause length – i.e. it is a different data set than the overall data set.”

These sets are meant to represent the real world to some extent. You can’t just remove a part of the time line and say it exists in a different reality.

Again, why is this so hard for you to understand?

I understand your argument, I just find it hard to believe a reasonable person would not see the problem. You are saying you can treat the world from 1979 – August 2014 as if it had one temperature trend, and then the world from September 2014 to present as having another. And not worrying about why the start of the second trend is around 0.2°C warmer than the end of the previous trend.

Carlo, Monte
Reply to  Bellman
August 20, 2022 3:04 pm

I understand

You understand very little, witness the increasing incoherence in these word salads you generate, trying to Keep the Rise Alive. This is one of my favs:

One through “negative uncertainty points” and one through he positive.

Reply to  Carlo, Monte
August 20, 2022 3:21 pm

I was quoting Tim. Or are you confused by that minor typo?

Reply to  Bellman
August 21, 2022 6:51 am

“You are describing three different functions. Each continuous. It makes no sense to ask if a function is continuous with a different function.”

YES! And how is a function defined? It’s not just that there is one y value for every x value. You also have to define the domains of x and y! If the domains are different then the functions are different and there is no reason why they have to be continuous with each other!

y = +sqrt(x) + 10
y = -sqrt(x) - 10

They are two entirely different functions, the ranges of y are different, and the two graphs will never meet! But “x” is the same in both!
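A quick numeric check of that claim: the vertical gap between the two curves is 2√x + 20, which never closes for any x in the shared domain x ≥ 0.

```python
import math

def gap(x):
    """Vertical distance between y = +sqrt(x) + 10 and y = -sqrt(x) - 10."""
    return (math.sqrt(x) + 10) - (-math.sqrt(x) - 10)   # = 2*sqrt(x) + 20
```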

“These sets are meant to represent the real world to some extent. You can’t just remove a part of the time line and say it exists in a different reality.”

So what? Of course you can look at part of the time line. *YOU* are looking at part of the temperature time line! UAH didn’t exist at t = –∞. It’s why the trend line of an exponential decay for t_small is different from the trend line for t_large. Both have a place in analyzing exponential decay in an electrical circuit. Same for atmospheric temperature.

“You are saying you can treat the world from 1979 – August 2014 as if it had one temperature trend, and then the world from September 2014 to present as having another.”

Absolutely you can do this! Why do you think that is a problem? It’s what forecasters actually do all the time! Would you order shovel inventory based on the trend line from 1979 – 8/2014 or based on the trend line from 9/14 to present? There are multiple factors that affect shovel usage over time just like there are multiple factors that affect temperature changes over time. You *have* to be aware of what is going on today if you are going to make good judgements.

“And not worrying about why the start of the second trend is around 0.2°C warmer than the end of the previous trend.”

The data is continuous. Where you start the trend line doesn’t make it discontinuous. You are trying to say that the trend line determines the data values – you *really* are lost in the woods!

Carlo, Monte
Reply to  Bellman
August 20, 2022 3:15 pm

The claim is that the trend line represents a smoothed value of the data.

Who makes this absurd alleged claim? You devote your entire life to these holy trends yet you don’t even know what they are!

Regression smooths nothing! It is a minimization procedure, this is all!

Reply to  Carlo, Monte
August 21, 2022 6:52 am

“Regression smooths nothing! It is a minimization procedure, this is all!”

Yep. A distinction that seems to get lost much of the time!

Carlo, Monte
Reply to  Tim Gorman
August 21, 2022 7:21 am

And going back to residuals, instead of looking at what they might indicate, they go to great lengths to eliminate them with lots of fancy smoothing functions!

Reply to  Carlo, Monte
August 21, 2022 12:41 pm

A linear regression or any smoothing method does not eliminate residuals. Residuals are by definition the difference between the data point and trend line. You can’t have residuals if you don’t have a trend line.
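For concreteness, residuals as defined here, data point minus trend line, with made-up numbers:

```python
import numpy as np

# Made-up data lying roughly on the line y = x
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.1, 0.9, 2.2, 2.8])

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)   # data point minus trend line
```

Nothing is eliminated: every data point keeps a residual, and for a least-squares fit with an intercept the residuals merely sum to (numerically) zero.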

Carlo, Monte
Reply to  Bellman
August 21, 2022 6:21 pm

Another WHOOOSH

Reply to  Carlo, Monte
August 21, 2022 7:00 pm

You don’t have to describe each of your discharges here.

Reply to  Carlo, Monte
August 21, 2022 7:37 am

It’s a straight line. How much smoother do you want it?

Carlo, Monte
Reply to  Bellman
August 21, 2022 7:40 am

What is “it”?

Reply to  Tim Gorman
August 18, 2022 4:59 pm

You just can’t keep from contradicting yourself, can you? About finding the length of a drought you said: “I would try to find an exact starting point where it started.”

And I corrected myself. It should have been clear from the context that I’d intended to say “I would[n’t] try to find an exact starting point where it started.”

Reply to  Bellman
August 19, 2022 10:07 am

And I corrected myself. It should have been clear from the context that I’d intended to say “I would[n’t] try to find an exact starting point where it started.”

I’m sorry but it was quite CLEAR what you meant from the context of the post!

You can’t weasel out of it now! You got caught, fair and square.

Reply to  Tim Gorman
August 19, 2022 12:22 pm

I’m sorry but it was quite CLEAR what you meant from the context of the post!

This is becoming quite absurd.

First – here’s the entire context: (some highlights added)

The first thing I’d probably look for is an indication that rainfall had decreased. If I found that rainfall had been increasing over the period, I probably wouldn’t claim it was a drought.

If there were signs of a drought, I would try to find an exact starting point where it started. Droughts only start after months of low levels of rain, deciding when to declare a drought depends on what water levels are like and that can have multiple causes.

I’m not sure what the point would be of looking back until you found some point where you could claim the drought started, unless you could identify a specific change that caused it.

Second – you are engaging in exactly the argumentative fallacies you keep accusing me of. Distractions and nit-picking. The start point of a drought is not the same as the start point of a pause. Even if I believed it was possible to identify an exact start date of a drought, that would not be the same as finding the start date of something as vague as a pause identified by uncertain trends, over data you think is meaningless.

Carlo, Monte
Reply to  Bellman
August 19, 2022 2:53 pm

Neutronium density.

Reply to  Bellman
August 19, 2022 3:20 pm

“I would try to find”

EXACTLY what you accuse CoM of doing as a fault in his methodology.

“Distractions and nit-picking”

No nit picking. Just pointing out that you can’t even be consistent.

You denigrate CoM for doing what you would do!

“The start point of a drought is not the same as the start point of a pause.”

It isn’t? Why isn’t it?

“exact start date”

And now we are back to whining about “exact” again. What does “exact” have to do with it? And if you *do* find an exact start point would the trend line from that start point going forward meet up with the extension of the past data or would there be a “discontinuity”?

Reply to  Tim Gorman
August 19, 2022 3:39 pm

This is just getting pathetic.

If you think finding the start date of a drought can be done using Monckton’s method, you tell me how. Then explain how your method can produce a different start date for the drought every month. Then explain how the current drought started a year before the last one ended. Then explain how, despite being in a drought, we are getting more rain than ever.

Carlo, Monte
Reply to  Bellman
August 19, 2022 9:00 pm

This is just getting pathetic.

Yes, you are pathetic.

Reply to  Bellman
August 21, 2022 4:58 am

“If you think finding the start date of a drought can be done using Monckton’s method, you tell me how. Then explain how your method can produce a different start date for the drought every month. Then explain how the current drought started a year before the last one ended. Then explain how, despite being in a drought, we are getting more rain than ever.”

You seem to think that physical processes with something like temperature can be perfectly fit to an analysis algorithm. Nothing could be further from the truth. The biosphere is a non-linear, chaotic system. Even the IPCC recognizes it as such. That means that start dates and end dates of such a system can be “fuzzy”.

If you base your process on determining when the residuals change enough to justify beginning a new trend then that word “enough” is subjective. It always will be. It will *never* be exact.

Surface moisture has significant variation. Subsoil moisture at depth changes far more slowly. You can get a lot of rain on the surface but that may not significantly impact the subsoil moisture – which is what most plants depend on, not surface moisture.

Reply to  Tim Gorman
August 21, 2022 1:17 pm

You seem to think that physical processes with something like temperature can be perfectly fit to an analysis algorithm.

Then you are not seeing my argument. I don’t think you can ever have a perfect fit. That’s why I keep saying you need to consider the uncertainty in the trend.

“If you base your process on determining when the residuals change enough to justify beginning a new trend then that word “enough” is subjective.”

You have yet to explain how you would determine that, let alone explain why you think that is what Monckton is doing.

It always will be. It will *never* be exact.

Which is one reason why Monckton posting each month to tell everyone exactly when the pause now starts from is pointless.

Reply to  Bellman
August 21, 2022 6:27 pm

“Then you are not seeing my argument. I don’t think you can ever have a perfect fit. That’s why I keep saying you need to consider the uncertainty in the trend.”

Residuals are not uncertainty. Never have been, never will be.

“Which is one reason why Monckton posting each month to tell everyone exactly when the pause now starts from is pointless.”

Pointless to you. You don’t want to believe the long term trend can ever change. Argument to Tradition is all you have. Is your middle name Tevye?

bdgwx
Reply to  Tim Gorman
August 19, 2022 12:33 pm

TG said: “I’m sorry but it was quite CLEAR what you meant from the context of the post!
You can’t weasel out of it now! You got caught, fair and square.”

Given Bellman’s posting history I thought it was rather obvious what was meant. It was so obvious I didn’t even notice the typo.

I do frequently notice typos on here though. I almost always ignore them because I’ve become reasonably skilled at identifying obvious typos and inferring intent. I’m more interested in furthering the essence and spirit of the discussion than in singling out trivial and irrelevant “gotchas”. That’s just me. To each his own I guess!

Reply to  Clyde Spencer
August 18, 2022 5:36 am

“There was no claim, other than by you, that a lack of statistically significant warming can ‘prove’ anything.

If you want to be taken seriously, I suggest that you name and quote the exact words supporting your claim.”

In this comment section:

“Monckton’s Pauses” clearly demonstrate the ineffectiveness of monotonic annual increases of CO2 and that it is easily over-powered by other factors.

Clyde Spencer

The trend Monckton is using is to show that CO2 does not control temperature. That is all. The longer the pause the less likely that growing CO2 is even a factor!

Jim Gorman

The pause MIGHT indicate a change from a warming trend to a cooling trend of longer duration as Lord Monckton wrote.

Also, it invokes a need for a detailed mechanism to explain the pause when CO2 continues its steady upward trend. Any ideas about this disconnect?

Geoff Sherrington

Carlo, Monte
Reply to  Bellman
August 18, 2022 5:55 am

bellcurveman digs into his files of enemies’ quotes…

Reply to  Carlo, Monte
August 18, 2022 7:47 am

You’re inane.

Reply to  Bellman
August 18, 2022 3:43 pm

I don’t see anything in any of these quotes that mention statistical significance. They all just point out that trending one factor is incorrect science.

Your delusions are peeking out again.

Reply to  Clyde Spencer
August 17, 2022 4:23 pm

If the uncertainty interval is wider than the differences you are attempting to identify with the trend line then you cannot assign a statistical significance.

If your data is “stated value +/- uncertainty” how do you do a best-fit analysis for any kind of a trend line, linear regression or other?

Clyde Spencer
Reply to  Tim Gorman
August 17, 2022 7:10 pm

Calculate the regression line for the nominal values. Assuming that all the sampled time-series data have approximately the same uncertainty, an approximate uncertainty envelope can be calculated by finding the slope between the minimum first end-point and the maximum last end-point, and then for the maximum first end-point and minimum last end-point, using the 95% uncertainty values. It can be refined by truncating the initial end-points and re-doing the calculations.
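A minimal sketch of the endpoint-envelope idea described here, assuming (as stated) a single constant 95% uncertainty u for every point; data values are illustrative only, and the refinement step of truncating end-points is omitted:

```python
import numpy as np

def envelope_slopes(x, y, u):
    """Nominal OLS slope plus the endpoint-envelope slopes.

    Assumes every point carries the same 95% uncertainty u.
    Max slope: first point at its minimum, last point at its maximum;
    min slope: first point at its maximum, last point at its minimum.
    """
    dx = x[-1] - x[0]
    slope_max = ((y[-1] + u) - (y[0] - u)) / dx
    slope_min = ((y[-1] - u) - (y[0] + u)) / dx
    nominal = np.polyfit(x, y, 1)[0]
    return slope_min, nominal, slope_max
```

If the envelope straddles zero, no sign for the trend can be claimed at that uncertainty level.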

Reply to  Clyde Spencer
August 18, 2022 3:45 pm

Yep. I posted this method somewhere else in the thread. It’s not a difficult task, just time consuming. NOAA and NASA have computers that could do it in hours if not quicker.

You have to ask yourself why the climate scientists don’t do this.

bdgwx
Reply to  Clyde Spencer
August 17, 2022 7:43 pm

Clyde, we’re being told that the Monckton Pause proves that CO2 cannot be a control knob while at the same time being told that the data UAH provides does not even exist since they average an intensive property. How can there be an uncertainty on the trend or even a trend at all of the global average temperature if that metric does not even exist? And how can you possibly draw any conclusions from a non existent value anyway?

Reply to  bdgwx
August 18, 2022 5:37 am

You are missing the point of this essay. It is hard to address one issue without addressing two. Why don’t you stay on subject and address the issue of trending first.

Derg
Reply to  Bellman
August 17, 2022 2:00 pm

Why does CO2 keep rising faster 😉

Reply to  Bellman
August 17, 2022 4:00 pm

You are absolutely hilarious. Dude, I grew up making measurements on engines both gas and diesel, hydraulic pumps, being a carpenter, etc. I knew geometry before I even went to high school. I knew what it was to make runout measurements, eccentricity measurements, etc. I knew how to read a micrometer by the 5th grade. I know what it takes to get an average measurement by making multiple readings of the same thing.

I KNOW when only one reading is made twice a day of different things, there is a large uncertainty in each measurement. The average of those two measurements only increases the uncertainty of that average. That should carry through clear to the anomaly readings.

Your wanting to include only regression error doesn’t begin to address the uncertainty in each data point. Almost every text you examine on regressions assumes the data have no uncertainty built in. Why don’t you post one that says how to do a regression when each data point has its own uncertainty, like 1±0.5.

Why don’t you stick to the issue being discussed. Go ahead and assume each data point is accurate with no uncertainty and address the issues posted in the article.
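For reference, regression where each point carries its own uncertainty, like 1±0.5, does appear in the measurement literature as weighted least squares with inverse-variance weights. A minimal sketch, treating each quoted ± value as a standard uncertainty (an assumption on my part):

```python
import numpy as np

def wls_line(x, y, sigma):
    """Straight-line fit where each point carries uncertainty sigma[i].

    Uses inverse-variance weights w = 1/sigma^2, the standard
    weighted-least-squares treatment for data like "1 +/- 0.5".
    """
    x, y = np.asarray(x), np.asarray(y)
    w = 1.0 / np.asarray(sigma) ** 2
    W, Sx, Sy = w.sum(), (w * x).sum(), (w * y).sum()
    Sxx, Sxy = (w * x * x).sum(), (w * x * y).sum()
    delta = W * Sxx - Sx ** 2
    slope = (W * Sxy - Sx * Sy) / delta
    intercept = (Sxx * Sy - Sx * Sxy) / delta
    slope_se = np.sqrt(W / delta)   # standard uncertainty of the slope
    return slope, intercept, slope_se
```

Note that the slope uncertainty returned here comes from the stated per-point uncertainties, not from the scatter of the residuals, which is exactly the distinction being argued over in this thread.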

Geoff Sherrington
Reply to  Jim Gorman
August 17, 2022 10:11 pm

I have almost finished writing an article for WUWT that shows what should be done if you follow the BIPM Guide to Uncertainty in Measurement, plus what is being done by Australian authorities and plus how to move to conformity. It is lengthy, it is technical and it allows the interested reader to work a way through the present abundant confusion. Should appear in 10 days or so.
I have no horse in this race other than a will to learn from past wisdom and to get it right going forwards. So much of what appears here is just wrong, but thankfully the moderators put it up to show what needs improving.
Geoff S

Reply to  Geoff Sherrington
August 18, 2022 5:39 am

Good to know. Looking forward to it.

Carlo, Monte
Reply to  Geoff Sherrington
August 18, 2022 5:56 am

Excellent!

Expect a lot of noise from the trendologists.

Reply to  Geoff Sherrington
August 18, 2022 9:02 am

I’ll be very interested in reading it.

Reply to  Bellman
August 17, 2022 4:21 pm

“Why do you always want to ignore the uncertainty in these trend lines?”

ROFL!!! Now you argue that the uncertainty in the temperature readings makes it impossible to define a “true” trend line. When just a month ago you were arguing that the temps were 100% accurate and so the long term trend line is accurate and is an accurate predictor of the future!

Why have you now changed your mind?

Reply to  Tim Gorman
August 17, 2022 5:27 pm

No. Try to remember what I’ve already tried to explain to you. The uncertainties in the trend are caused by variability in the data. It doesn’t matter how much of this is due to measurement errors and how much is due to natural variation in the earth’s temperature; it causes “uncertainty” in the trend line.

You can never know what the true trend line is, unless all the data exactly fits it. But the more data you have, and the stronger the signal compared with the noise, the less uncertainty there is in the trend.
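That claim matches the usual OLS expression for the standard error of a fitted slope, se(b) = s / sqrt(sum((x - mean(x))**2)), where s is the residual standard deviation: more data widens the x spread and a stronger signal relative to the noise shrinks s, both reducing the uncertainty. A minimal illustration (data values are made up):

```python
import numpy as np

def slope_with_se(x, y):
    """OLS slope and the standard error of that slope."""
    n = len(x)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    s2 = np.sum(resid ** 2) / (n - 2)              # residual variance
    se = np.sqrt(s2 / np.sum((x - np.mean(x)) ** 2))
    return slope, se
```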

I have never argued that the temps are 100% accurate. I especially don’t think UAH data is 100% accurate. I do think there’s enough data to confirm that there has been long term warming since the 1970s, but there will still be uncertainty as to the range of that warming.

And I have never claimed the trend is an accurate predictor of the future. I think it would be absurd to just assume the current rate of warming will continue over the next century.

Why have you now changed your mind?

Changing one’s mind is not something to be ashamed of, but in this case I don’t think I have. It’s just your inability to understand what I’m saying that makes you think that.

Carlo, Monte
Reply to  Bellman
August 17, 2022 8:52 pm

Nice word salad, blob should be proud.

Reply to  Bellman
August 18, 2022 5:41 am

Why do you keep wanting to use the term “noise”? I don’t think you know what the term means. How can you have noise when it is all temperature? Perhaps you mean variability?

Reply to  Jim Gorman
August 18, 2022 11:49 am

I’m just using it as shorthand for the idea of variability, errors, or whatever, hiding the trend. It’s a common enough concept, e.g.

https://www.techtarget.com/whatis/definition/statistical-noise

Statistical noise is unexplained variability within a data sample. The term noise, in this context, came from signal processing where it was used to refer to unwanted electrical or electromagnetic energy that degrades the quality of signals and data. The presence of noise means that the results of sampling might not be duplicated if the process were repeated.

Reply to  Bellman
August 18, 2022 3:48 pm

“where it was used to refer to unwanted electrical or electromagnetic energy that degrades the quality of signals and data.”

Since when does an unwanted signal or energy appear in the temperature record?

You tried this same quote at least a year ago. It’s no more true for the temperature record today than it was back then.

If variation shows up in the measurement data then it is TRUE variation, not noise!

Reply to  Tim Gorman
August 18, 2022 4:41 pm

What bit of “used to” don’t you understand?

I don’t care what you call it. Variable data or noisy data, the point is the same. You are just indulging in what you call the logical fallacy of nit-picking.

Carlo, Monte
Reply to  Bellman
August 18, 2022 6:14 pm

Irony explosion.

Reply to  Bellman
August 19, 2022 1:18 pm

“What bit of “used to” don’t you understand?”

The operative words in the full quote are: “ unwanted electrical or electromagnetic energy that degrades the quality of signals and data.”

I asked what unwanted energy appears in the temperature record. Of course you can’t answer.

“I don’t care what you call it. Variable data or noisy data, the point is the same. You are just indulging in what you call the logical fallacy of nit-picking.”

The amplitude of a sine wave varies. Is that “variable data”? Is it “noisy data”? Should you throw the sine wave away because it is variable?

There isn’t any nitpicking here. You are, as usual, showing an absolute ignorance concerning the real world. You can’t even distinguish between variation and noise. You’ve never used a spectrum analyzer, have you? You can download an audio one to your computer and analyze the output of a radio going to a speaker. You’ll see a lot of “noise” at the base display but you’ll also see a *lot* of variation in the amplitude of the actual signal. Where does that base noise appear in the temperature data?

Reply to  Tim Gorman
August 19, 2022 3:31 pm

The operative words in the full quote are…

The full sentence is

The term noise, in this context, came from signal processing where it was used to refer to unwanted electrical or electromagnetic energy that degrades the quality of signals and data.

“It comes from…”, “where it was used to”. Try to put those words into context and see that it is saying that the term statistical noise is borrowed as a concept from another field, and may not have exactly the same meaning.

And I’ll leave it there, before we go any further down the distraction fallacy rabbit hole. What you call it is irrelevant.

Reply to  Bellman
August 20, 2022 10:36 am

Whoever wrote that is obviously a whippersnapper and has no experience with analog radio signals. NOISE has been around since radio began listening to EM waves. In fact, the very early spark gap transmitters basically made wideband noise that could be keyed on and off using Morse Code.

I got my first amateur radio license in 1963. I am fully aware of signal to noise ratios in analog transmission of all kinds. Most folks don’t even know that most long distance calls of any distance in the U.S. for many years were carried on Single Sideband Radio channels multiplexed onto a single carrier. You have no idea the work that went into designing and implementing those, and the multiplexed signals carried on coaxial cable for shorter distances. Read this for a short history.

Telephone Transmission – Engineering and Technology History Wiki (ethw.org)

Signal to noise ratio is nothing new to any electrical engineer working with either analog or digital signals. Go to your car and turn the radio to AM and move in between stations. Hear that hissing? That is noise. Do it when there is lots of lightning around. That is real noise! It wipes out the signal you are trying to listen to.

Now tell me that temperature data has “noise” included in it! Where does it come from? I will assure you that what you see is variance in the signal, not noise.

Reply to  Bellman
August 21, 2022 3:55 am

“It comes from…”, “where it was used to”. Try to put those words into context and see that it is saying that the term statistical noise is borrowed as a concept from another field, and may not have exactly the same meaning.”

What statistical noise do you think appears in a temperature measurement?

Reply to  Bellman
August 18, 2022 5:43 am

“The uncertainties in the trend are caused by variability in the data. It doesn’t matter how much of this is due to measurement errors, and how much is due to natural variation in the earth’s temperature, it causes “uncertainty” in the trend line.”

Variation is *NOT* uncertainty! Variation only determines the best-fit metric for the linear trend line. It does *nothing* to quantify the uncertainty associated with the data itself.

Even a thermometer with significant systemic bias will show measured temperature variations, they just won’t be accurate! Precision is *NOT* accuracy!

“You can never know what the true trend line is, unless all the data exactly fits it. But the more data you have, and the stronger the signal compared with the noise, the less uncertainty there is in the trend.”

The best-fit metric will allow you to get as close as you can. If that metric is high then you might want to consider if the data is fit for the purpose you are using it for.

Variation is *NOT* noise. My spectrum analyzer will show lots of variation in the strength of a received signal on the amateur radio bands. That variation is *NOT* noise, it is just variation in the signal from changes in the atmosphere and ionosphere, quite similar to what causes variation in temperature. Trying to filter out that variation just degrades the actual signal. You wind up losing data you need!

I often wonder what climate scientists think noise is. How do you get *noise* in a measured value? Are they really talking about uncertainty? Are they talking about an interfering signal? Are they talking about best-fit metrics? Are they talking about variation? It’s simply not obvious. Three of these are not noise. If there *is* an interfering signal then what is it? How can you just dismiss it as noise?

“I do think there’s enough data to confirm that there has been long term warming since the 1970s, but there will still be uncertainty as to the range of that warming.”

What kind of warming? Catastrophic warming that will turn the earth into a cinder? Or beneficial warming that will give us more food and fewer deaths from cold? You *have* to look at this on a holistic basis; that was Freeman Dyson’s problem with the climate models. You can’t look at just one thing, especially an average of an intensive property, and say it’s bad. That’s just assuming you *KNOW* for certain what that one thing should be to maximize benefits for humankind and the planet.

“And I have never claimed the trend is an accurate predictor of the future. I think it would be absurd to just assume the current rate of warming will continue over the next century.”

The climate models do assume just that. So does the IPCC and most governments. It’s what is driving the push to ban fossil fuels (can’t wait to see what they make computer keyboards out of when there isn’t any oil!) So are you now claiming you don’t believe what the climate models say? Welcome to the ranks of the skeptics!

“It’s just your inability to understand what I’m saying that makes you think that.”

I’ve seen you say things like “I don’t care” and “pauses are insignificant”. It’s obvious you’ve changed your views at least a little bit – until you start spouting garbage like variation is uncertainty.

Carlo, Monte
Reply to  Tim Gorman
August 18, 2022 6:05 am

He lost me after the 1st paragraph, couldn’t read any farther.

Variation is *NOT* uncertainty!

Variation is *NOT* noise.

I’ve seen you say things like “I don’t care” and “pauses are insignificant”. It’s obvious you’ve changed your views at least a little bit – until you start spouting garbage like variation is uncertainty.

Same old nonsense, he hasn’t learned anything.

Reply to  Tim Gorman
August 18, 2022 2:26 pm

Variation is *NOT* uncertainty!

Variability in the data is the reason the trend line is uncertain.

Variation only determines the best-fit metric for the linear trend line.

You keep stating that, I’ve still no idea what you think it means. The linear trend is the best fit, for a specific metric, to the data.

“It does *nothing* to quantify the uncertainty associated with the data itself.”

The standard error, i.e. the uncertainty of the trend, is quantified by the variation in the data. It’s very basic statistics. Maybe you should join me in one of these remedial classes you are so keen on.

The best-fit metric will allow you to get as close as you can. If that metric is high then you might want to consider if the data is fit for the purpose you are using it for.

That’s the point of the calculations for the standard error.

What kind of warming?

The kind that goes from cooler to hotter.

Catastrophic warming that will turn the earth into a cinder?

Very unlikely.

Or beneficial warming that will give us more food and fewer deaths from cold?

Couldn’t tell you, and it’s irrelevant to the discussion.

You can’t look at just one thing, especially an average of an intensive property, and say it’s bad.

I’m not saying it’s good or bad, just that it appears to be happening. Are you saying you’ll accept the statistical evidence of warming if the warming is good, but reject it if it’s bad?

The climate models do assume just that.

Climate models are not based on projecting current trends into the future.

I’ve seen you say things like “I don’t care” and “pauses are insignificant”.

And what was the context for those remarks?

Carlo, Monte
Reply to  Bellman
August 18, 2022 3:29 pm

You’re spouting the usual nonsense, again.

Reply to  Bellman
August 18, 2022 4:15 pm

“Variability in the data is the reason the trend line is uncertain.”

Same song, second verse! Residuals between the stated measurement values and the trend line are *NOT* uncertainty! They are not a measure of uncertainty! The residuals only tell you how well your trend line fits the data!

“You keep stating that, I’ve still no idea what you think it means. The linear trend is the best fit, for a specific metric, to the data.”

It’s not a matter of what I think, it’s a matter of what you don’t understand.

Linear regression is done by minimizing the residuals between the stated data measurement and the trend line. You work the trend line residuals until you’ve gotten the best-fit trend line.

from the internet:
-“linear regression finds the line that minimizes the total squared residuals”
-“the residuals express the difference between the data on the line and the actual data so the values of the residuals will show how well the residuals represent the data.”
-“Linear regression finds the line of best fit line through your data by searching for the regression coefficient (B1) that minimizes the total error (e) of the model. While you can perform a linear regression by hand, this is a tedious process, so most people use statistical programs to help them quickly analyze the data.”

None of this has anything to do with measurement uncertainty. Variation only means that not all data points will lie directly on the trend line.
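Those quoted definitions are easy to check numerically. A minimal Python sketch (made-up data) showing that the least-squares line really does minimize the total squared residuals, so nudging the fitted slope in either direction can only increase that total:

```python
import numpy as np

# Checking the quoted claim numerically (made-up data): the fitted
# least-squares line minimizes the total squared residuals, so nudging
# the fitted slope either way can only increase that total.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 3.0, x.size)

slope, intercept = np.polyfit(x, y, 1)   # ordinary least squares

def ssr(b1, b0):
    """Sum of squared residuals for the line y = b1*x + b0."""
    r = y - (b1 * x + b0)
    return float(r @ r)

best = ssr(slope, intercept)
print(best, ssr(slope + 0.1, intercept), ssr(slope - 0.1, intercept))
```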

“The standard error, i.e. the uncertainty of the trend, is quantified by the variation in the data. It’s very basic statistics. Maybe you should join me in one of these remedial classes you are so keen on.”

See above. Far too many people refer to the residuals as “standard error”. THEY ARE NOT ERRORS! As you say they indicate variation in the data points. VARIATION IS NOT ERROR! Not everything fits neatly to a linear trend line – just look overhead at the clouds sometime!

“Couldn’t tell you, and it’s irrelevant to the discussion.”

That’s right! And neither can the climate models. And neither can the global average temperature. All the data you need to determine what is beneficial and what is not beneficial is lost when you calculate your first daily mid-range temperature.

If all this climate model nonsense can’t inform you so you can make an informed judgement on future action then it is meaningless and worthless. You apparently know this because you admit it can’t inform you but you refuse to admit it!

“I’m not saying it’s good or bad, just that it appears to be happening. Are you saying you’ll accept the statistical evidence of warming if the warming is good, but reject it if it’s bad?”

You don’t know what is happening! You just admitted it in the sentence above! “Couldn’t tell you”.

Stop trying to tell us that it means something when you can’t tell us what it means!

“Climate models are not based on projecting current trends into the future.”

Have you ever heard the term “hindcasting”, especially in association with validating the climate models? What do you suppose that causes to happen?

Why do you keep trying to expound on things you are apparently ignorant of? As usual, you are just cherry picking something someone said hoping it will stick to the wall.

Reply to  Tim Gorman
August 18, 2022 4:53 pm

Same song, second verse!

Same reality.

Residuals between the stated measurement values and the trend line are *NOT* uncertainty! They are not a measure of uncertainty!

It isn’t. But you need to know their variance to calculate the standard error.

Linear regression is done by minimizing the residuals between the stated data measurement and the trend line.

That’s what I was saying. And minimizing to a specific metric, usually least squares.

You work the trend line residuals until you’ve gotten the best-fit trend line.

You can do it in a single calculation, if that’s what you mean by working it.

None of this has anything to do with measurement uncertainty.

Of course not.

Variation only means that not all data points will lie directly on the trend line.

Now find all the quotes explaining how to calculate the standard error.

“Far too many people refer to the residuals as “standard error”.”

Who?

THEY ARE NOT ERRORS!

Stop shouting. But correct, they are not errors, but residuals. Error only refers to the deviation from the true trend line; the deviations from the estimated trend are called residuals.

You still seem to be failing to grasp that I’m not talking about the deviation in the variables, but the standard error in the trend line.

Carlo, Monte
Reply to  Bellman
August 18, 2022 6:19 pm

You still seem to be failing to grasp, that I’m not talking about the deviation in the variables, but the standard error in the trend line.

It is YOU who has the failure problem:

“standard error” =/= measurement uncertainty

“standard error” =/= “uncertainty of the trend line”

Reply to  Carlo, Monte
August 18, 2022 6:57 pm

At the risk of giving you the attention you crave, I did not say the standard error is the measurement uncertainty of the trend line, I said it’s the uncertainty of the trend line.

I’ve no idea why you think the standard error is not an indicator of the uncertainty of the trend line, nor do I care.

Reply to  Bellman
August 19, 2022 4:36 am

Here is a question you need to answer for yourself. What do you hope to gain from a time versus temperature trend line? Is it to be able to forecast the future? Is temperature a cyclical phenomenon? Can a linear regression tell much about a cyclical, high variance phenomenon? Can you determine the highs or lows from the regression line?

You do realize that linear regression is designed to evaluate the functional relationship between two related variables, right? Do temperature and time have a functional relationship? If they do, is it cyclical? If it is, then you need some trig functions to relate them, not a linear equation.

Carlo, Monte
Reply to  Jim Gorman
August 19, 2022 5:30 am

You do realize that linear regression is designed to evaluate the functional relationship between two related variables, right?

If he does realize it, he doesn’t care, nothing matters except the “trends”.

Reply to  Jim Gorman
August 19, 2022 12:33 pm

What do you hope to gain from a time versus temperature trend line?

To see if temperature is changing over time.

Is it to be able to forecast the future?

Not really. That’s not to say you cannot use it in some ways. Such as saying that if the warming trend continued at the current rate it would result in so much warming over the next century.
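As back-of-envelope arithmetic only (the rate here is purely hypothetical, and as said above this is not a forecast), such a statement is just rate times number of decades:

```python
# Back-of-envelope only: a hypothetical trend of 0.13 degrees/decade,
# naively extrapolated (which, as noted above, is NOT a forecast),
# accumulates over a century as rate * number of decades.
rate_per_decade = 0.13          # hypothetical, degrees C per decade
decades = 10                    # one century
century_change = rate_per_decade * decades
print(century_change)           # roughly 1.3
```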

Is temperature a cyclical phenomenon?

Define your terms. It is on some time scales, it may or may not be on others.

Can a linear regression tell much about a cyclical, high variance phenomenon?

Yes.

Can you determine the highs or lows from the regression line?

Define your terms. Do you mean minimum and maximum daily values, or the range of monthly or annual values?

You do realize that linear regression is designed to evaluate the functional relationship between two related variables, right?

As always with you I don’t know what you mean by “functional relationship”. Temperature against time is a functional relationship as there can only be one value per time. If you mean the linear function, that is also a functional relationship. But the output of a linear regression model is the predicted value plus an error term, so that is not a functional relationship.

If it is, then you need some trig functions to relate it, not a linear equation.

So you keep claiming, but so far you haven’t demonstrated what this trig function is.

Reply to  Bellman
August 19, 2022 2:09 pm

“As always with you I don’t know what you mean by “functional relationship”.”

It means T = at where t is time and T is temperature.

What is a? Is a T/t?

How can that be if T is a variable not dependent on time but on something else? Is that something else time dependent?

Should the functional relationship be T = az where z is a function of t, z(t)?

Reply to  Tim Gorman
August 19, 2022 3:07 pm

I was asking what Jim thought a functional relationship was, not you. But let’s complicate this further.

It means T = at where t is time and T is temperature.

That would be an example of a functional relationship between T and t.

What is a? Is a T/t

How should I know? It’s your example. If a is a constant then yes, by definition a = T / t.

How can that be if T is a variable not dependent on time but on something else?

It’s your example. Obviously in this example T is a variable depending on t. That doesn’t mean it couldn’t be dependent on other things, it just requires the other things to also be dependent on t. e.g. if x = bt, then T = (a/b)x.

Should the functional relationship be T = az where z is a function of t, z(t)

Could be. I’ve still no idea what you are getting at or how this is an answer to the question what do you think a functional relationship is.

Carlo, Monte
Reply to  Bellman
August 19, 2022 3:42 pm

Could be. I’ve still no idea what you are getting at or how this is an answer to the question what do you think a functional relationship is.

More obfuscation, no one can be this far out to lunch.

As you’ve been told many times, V = I * R is a functional relationship, but the clues have been unable to penetrate the neutronium.

I predict the solar system will collapse into a singularity soon.

Reply to  Carlo, Monte
August 19, 2022 4:22 pm

If I ask you, and I didn’t, what you think functional relationship means, I am not asking you to point to an example of a functional relationship. I’m trying to figure out if you understand what you are talking about.

Carlo, Monte
Reply to  Bellman
August 19, 2022 8:57 pm

You are either being deliberately coy trying to hide an agenda or are completely daft.

Go back to basic math and learn what a function is, obviously it didn’t stick the first time around.

Reply to  Bellman
August 20, 2022 5:01 pm

MC gave you V=I*R, a well-known functional relationship relating voltage, current, and resistance. There are others.

E=mc^2
PV=nRT
A=L*W
V=L*W*H
A=pi*r^2
C=pi*D
T=∂U/∂S

Shall I go on?

“I’m trying to figure out if you understand what you are talking about.”

A functional relationship is one that relates independent variables such that the relation provides a deterministic value.

Mathematically a function is defined as one output for any one input (one input is any combination of variables). Look at my examples. None of them give duplicate answers for any combination of inputs.

Now, if you want a functional relationship between temperature and time, then you need to define a function where time is an independent variable such that any value of time provides a deterministic value for temperature.

I can assure you that you will not find one. If such a function existed, we would no longer need Global Climate Models.

To develop one, I have started investigating how to look at seasons. GAT is hosed because it uses calendar years which hose up the seasons. If there is any time relationship it should be based on the sun moving between hemispheres.

Reply to  Jim Gorman
August 20, 2022 5:58 pm

Good. You do understand what a functional relationship is, we are talking in mathematical terms, and your comment is all good up to this point.

Now, if you want a functional relationship between temperature and time, then you need to define a function where time is an independent variable such that any value of time provides a deterministic value for temperature.

I’m still not sure exactly what you want here. Do you mean the actual temperature in this universe at a specific time? Because that will be a functional relationship, there can only be one average at any point in time. But if you mean a relationship that holds true for all possible measurements or all possible worlds, then that isn’t going to be functional, and I don’t know why you require it.

You say that “linear regression is designed to evaluate the functional relationship between two related variables”, but I’m still not sure if you mean in the model or in the data. Any linear model will be a functional relationship, but that does not mean the data has to have a functional relationship between the dependent and independent variables. It’s not usually the case with statistics. Suppose you are looking at the relationship between the score from a revision test and an exam. Lots of pupils could get the same score at revision, but different scores in the exam. It is not a functional relationship, but you can still use linear regression to investigate the relationship between the variables.

You will never have a model for global temperature that can predict an exact value for any given time, especially if time is your only independent variable. There are unlimited different things that will affect the temperature at any given time, and it’s impossible to include them all in a model. This is true for any statistical model; the equation is always the model plus a random error term. None of this makes the models useless.

Carlo, Monte
Reply to  Bellman
August 20, 2022 9:31 pm

I’m still not sure exactly what you want here.

You say that “linear regression is designed to evaluate the functional relationship between two related variables”, but I’m still not sure if you mean in the model or in the data.

Are you really this daft? Or is this just another smokescreen covering your real agenda?

Variable One: TIME
Variable Two: TEMPERATURE

There are no other variables!

You will never have a model for global temperature that can predict an exact value for any given time, especially if time is your only independent variable.

Then why do you plot T versus t as a linear function?!??

Don’t you get it?

Sheesh.

I know I’ll regret another futile attempt at troll education, but for others who might be reading:

Unknown resistor calibration:

Apply voltage to resistor, measure voltage across resistor, current through resistor

Vary voltage, repeat measurements

Plot voltage versus current

Perform linear regression through origin

Because V = I * R, the slope of the regression line is the calibrated resistance; problem solved

Now I know this will be real hard for you, but think of all this air temperature trendology as an equivalent calibration by analogy. See if you can sort it out.

For extra credit, perform a complete uncertainty analysis by propagating from the measurements to obtain the uncertainty of the resistor calibration value.

Show your work, no handwaving allowed.
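For anyone who wants to try it, a minimal Python sketch of the recipe above (simulated readings; the resistance value and noise level here are invented): sweep the voltage, record (I, V) pairs, then regress through the origin so the slope of V = I * R is the calibrated resistance.

```python
import numpy as np

# Minimal sketch of the calibration recipe above, with simulated
# readings (the resistance and noise level are invented):
# vary the voltage, record (I, V) pairs, then regress through the
# origin so the slope of V = I * R gives the calibrated resistance.
rng = np.random.default_rng(2)
R_true = 100.0                                  # ohms (hypothetical)
I = np.linspace(0.01, 0.10, 10)                 # amps, the swept current
V = R_true * I + rng.normal(0.0, 0.05, I.size)  # noisy voltmeter readings

# Least squares with the intercept forced to zero: R = sum(I*V)/sum(I*I)
R_est = float(np.sum(I * V) / np.sum(I * I))
print(R_est)   # close to 100 ohms
```

Propagating the measurement uncertainties through to R_est is left, as above, as the extra-credit exercise.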

Reply to  Carlo, Monte
August 21, 2022 7:40 am

Has it occurred to you that global surface temperature might not be as simple as voltage versus current?

Reply to  Bellman
August 21, 2022 8:23 am

“Has it occurred to you that global surface temperature might not be as simple as voltage versus current?”

No kidding? That’s why I told you that temperature is a function of lots of things. e.g. φ(distance,elevation, humidity, terrain, geography, etc)!

Which of course you denied was true!

As usual, you are saying what you need to say in the moment.

Reply to  Tim Gorman
August 21, 2022 12:56 pm

No kidding? That’s why I told you that temperature is a function of lots of things. e.g. φ(distance,elevation, humidity, terrain, geography, etc)!

Which of course you denied was true!

I can’t help you with your own delusions. You’ll never see it, but you keep doing this. Making up your own stories about what you wanted me to say, then throwing them back at me as if it was something I said.

I did not, and have never, denied that there are multiple things affecting temperature. What I disagreed with you about that φ function, was your claim that daily temperature could be approximated by a function like sin(t + φ).

My objection is that in your equation φ is just changing the phase of the sin function. You are saying the only effect all those confounding factors are having is to move the maximum each day backwards or forwards in time.

Reply to  Bellman
August 21, 2022 6:06 pm

“Making up your own stories about what you wanted me to say, then throwing them back at me as if it was something I said.

I did not, and have never, denied that there are multiple things affecting temperature. What I disagreed with you about that φ function, was your claim that daily temperature could be approximated by a function like sin(t + φ).”

In other words you really don’t believe in multiple things determining temperature.

“What I disagreed with you about that φ function, was your claim that daily temperature could be approximated by a function like sin(t + φ).”

“My objection is that in your equation φ is just changing the phase of the sin function.”

And exactly what do you think the phase of a sin function is?

My guess is that you don’t even realize that sin(x+y) is *NOT* sin(x) + sin(y)!

sin(x+y) = sin(x)cos(y) + sin(y)cos(x). That’s not just a simple phase shift in time.

Let’s just analyze latitude as it applies to temperature during daylight hours. At the equator the temperature is a direct function of sin(t). sin(0) = 0, sin(pi/2) = 1, sin(pi) = 0. As you move north in latitude the temperature is max at the equator and (assume) zero at the poles. That is the definition of a cos function, cos(latitude). So we have sin(t)cos(latitude) as the function for T.

  • sin(t)cos(latitude) = [sin(t + latitude) + sin(t − latitude)]/2

where latitude is measured in radians from 0 to pi/2.
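The identity itself, sin(t)cos(lat) = [sin(t + lat) + sin(t − lat)]/2, holds for any t and lat and is easy to spot-check numerically:

```python
import math

# Spot check of the product-to-sum identity used above:
# sin(t)*cos(lat) == (sin(t + lat) + sin(t - lat)) / 2
max_err = 0.0
for t in (0.3, 1.2, 2.9):
    for lat in (0.0, 0.7, 1.5):
        lhs = math.sin(t) * math.cos(lat)
        rhs = (math.sin(t + lat) + math.sin(t - lat)) / 2.0
        max_err = max(max_err, abs(lhs - rhs))
print(max_err)   # essentially zero
```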

You should be able to tell pretty quickly from this that φ is not just a phase angle as you apparently think of it.

I tried to use the function sin(x + φ(a,b,c,d,….)) as a simple demonstration that temperature is a function of many variables.

And the correlation between sin(x) and sin(x + φ) *IS* cos(φ). Thus as φ drives the next location to be more and more separated from the origin (e.g. distance, elevation, terrain, clouds, geography, etc) the lower the correlation between the locations becomes.
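That correlation claim can be verified numerically: sampled uniformly over a full period, the correlation between sin(x) and a phase-shifted copy works out to cos of the shift (Python sketch):

```python
import numpy as np

# Verifying the correlation claim above: sampled uniformly over a full
# period, the correlation between sin(x) and sin(x + phi) is cos(phi).
x = np.linspace(0.0, 2.0 * np.pi, 10000, endpoint=False)
phis = (0.0, 0.5, 1.0, 2.0)
corrs = [float(np.corrcoef(np.sin(x), np.sin(x + p))[0, 1]) for p in phis]
print(corrs)   # matches cos(0.0), cos(0.5), cos(1.0), cos(2.0)
```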

It’s a far easier concept to understand than trying to break down sin(t)cos(a)cos(b)cos(c)……

If (a,b,c,d, …..) are all equal to zero radians then the two locations are the same, same latitude, same longitude, same elevation, same terrain, etc. and sin(t) = sin(t). But in reality none of these factors can be zero in the real world. There will always be some degree of separation.

sin(t)cos(φ) is a simple linear mix. If (as the IPCC says) the biosphere is a non-linear, chaotic function then the mixing complicates things very, very quickly. I’m not even going to get into how all the mixing products get generated. It’s not pretty. I would refer you to Single-Sideband Systems and Circuits, Sabin & Schoenike, Section 5.4. Or perhaps: https://en.wikipedia.org/wiki/Intermodulation

“You are saying the only effect all those confounding factors are having is to move the maximum each day backwards or forwards in time.”

Like I said earlier. You simply don’t understand. This has nothing to do with moving signals back and forth in time (although time lags/leads for each component complicate the mixing products even *more* since this represents non-linear mixing).

You keep spouting off on things you don’t understand and, as usual, I’m sure you will come back and tell me I am full of it. I’m sure you won’t understand any of this math and it is so far outside your box you’ll just ignore it and keep spouting your religious dogma!

Reply to  Tim Gorman
August 21, 2022 6:41 pm

In other words you really don’t believe in multiple things determining temperature.

It’s like talking to a brick wall.

My guess is that you don’t even realize that sin(x+y) is *NOT* sin(x) + sin(y)!

You’re a terrible guesser.

sin(x+y) = sin(x)cos(y) + sin(y)cos(x). That’s not just a simple phase shift in time.

If y is constant it is.

So we have sin(t)cos(latitude) as the function for T.

But your function isn’t sin(t)cos(latitude), it’s sin(t + latitude).

You should be able to tell pretty quickly from this that φ is not just a phase angle as you apparently think of it.

It isn’t if you change the formula. Now you are saying latitude just reduces the amplitude of sin(t). Which means days are colder the further north you go, but nights get warmer.

Like I said earlier. You simply don’t understand.

Correct. I don’t understand what you think you are trying to prove with all this nonsense.

I’m sure you won’t understand any of this math and it is so far outside your box you’ll just ignore it and keep spouting your religious dogma!

That’s because I don’t understand this fantasy maths you keep using. I don’t understand how

sin(t + φ) = sin(t)cos(φ),

especially when you’ve already said, correctly, that it equals sin(t)cos(φ) + sin(φ)cos(t).

Reply to  Bellman
August 22, 2022 8:22 am

tg: “sin(x+y) = sin(x)cos(y) + sin(y)cos(x). That’s not just a simple phase shift in time.””

“If y is constant it is.”

What does “y” being constant have to do with it being a phase shift in time? You are still lost in your little, confined box!

“But your function isn’t sin(t)cos(latitude), it’s sin(t + latitude).”

If you had read on you would know why I used sin(x+y). It was to make it simpler for you to understand. Guess that failed!

“It isn’t if you change the formula. Now you are saying latitude just reduces the amplitude of sin(t). Which means days are colder the further north you go, but nights get warmer.” (bolding mine, tg)

So what does that have to do with “y” being a phase shift in time?

The equation for nighttime is *NOT* a sine wave. How many times does that need to be pointed out to you? It is an exponential decay. That doesn’t mean that things like latitude, elevation, etc. don’t play a role in the decay factor, but you wind up with a totally different equation.

e^(jx) = cos(x) + jsin(x).

You are more than welcome to try and expand this for the case where x is a function of multiple factors: x = f(a,b,c,d,…)
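The nighttime exponential decay described above is commonly modeled with Newton’s law of cooling; a minimal sketch, with illustrative placeholder values (in practice the decay constant k would fold in latitude, elevation, humidity, etc.):

```python
import math

def night_temp(temp_at_sunset, ambient_floor, k, hours_after_sunset):
    """Newton's law of cooling: T(t) = T_floor + (T0 - T_floor) * exp(-k*t).
    Not a sine wave -- a monotonic exponential decay toward a floor."""
    return ambient_floor + (temp_at_sunset - ambient_floor) * math.exp(-k * hours_after_sunset)

# Illustrative values: 25 C at sunset, decaying toward a 10 C floor, k = 0.2/hr
temps = [round(night_temp(25.0, 10.0, 0.2, h), 1) for h in range(0, 12, 3)]
print(temps)  # monotonically decreasing toward 10 C
```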

“That’s because I don’t understand this fantasy maths you keep using. I don’t understand how”

It’s not fantasy math. You are just looking for a way to rationalize your cognitive dissonance. You want to be able to say that you believe temperature relies on a lot of things but when you are shown a way to characterize it all you can do is deny how the factors combine – in essence saying that other factors can’t affect temperature.

I’m sure that when I point out that all of these factors form a multi-dimensional vector space and they all interact as vectors you’ll deny that too.

Once again, I tire of trying to explain to you how you must handle all the factors associated with temperatures. I explained to you why I used the simplification of sin(x+y) and you just dismissed it. When I tried to show how you handle the factors as vectors, you dismissed that as well. You can’t simply say that they interact and then dismiss anything that tries to explain to you shows how they interact. That’s just exhibiting a closed mind – which is your only true forte!

I’m out. Nitpick someone else and display your ignorance of how to apply math to the real world somewhere else.

Carlo, Monte
Reply to  Tim Gorman
August 22, 2022 8:29 am

I’m out. Nitpick someone else and display your ignorance of how to apply math to the real world somewhere else.

These guys are liars, plain and simple.

Reply to  Tim Gorman
August 22, 2022 11:19 am

To sum up:

Tim Gorman says that we can estimate daily temperature with the equation sin(t + φ).

I say that doesn’t make sense and suggest what he really means is more like φ sin(t) or sin(t) + φ.

He says I’m confused and calls me a troll.

He then concludes that because I don’t agree with the sin(t + φ), it can only be because I don’t agree that anything but the sun can be influencing temperature.

Later he realizes that sin(t + φ) is nonsense, but pretends it was an attempt to simplify φ sin(t) for my benefit, because he’s convinced himself that I don’t understand trigonometry.

Then he takes the fact that I didn’t realize that sin(t + φ) was meant to be a simplification of φ sin(t) to mean that I’m an idiot who needs more education. (The fact that there is no rational sense in which sin(t + φ) is a simplification of φ sin(t) doesn’t seem to matter.)

He claims victory and says he won’t bother discussing it any further.

Reply to  Bellman
August 22, 2022 2:27 pm

I say that doesn’t make sense and suggest what he really means is more like φ sin(t) or sin(t) + φ.”

Yeah, that makes a lot of sense. At t = pi/2 then T = φ. And at t = 0 or pi, then T = 0.

Yep, that makes a lot of sense.

“He says I’m confused and calls me a troll.”

You are a troll when you make suggestions like this.

“He then concludes that because I don’t agree with the sin(t + φ), it can only be because I don’t agree that anything but the sun can be influencing temperature.”

I told you this was a simplification to explain how other factors affect temperature. But, of course, you couldn’t even follow the more detailed explanation I gave you and just blew it off.

Typical.

“Later he realizes that sin(t + φ) is nonsense, but pretends it was an attempt to simplify φ sin(t) for my benefit, because he’s convinced himself that I don’t understand trigonometry.”

That’s correct! You don’t understand trig or vector calculus or you wouldn’t have suggested T = φ sin(t). As I pointed out, and which you totally ignored, you have to treat all this as a multi-dimensional vector field where the other factors must be added vectorially to the temperature vector.

“Then he takes the fact that I didn’t realize that sin(t + φ) was meant to be a simplification of φ sin(t) to mean that I’m an idiot who needs more education”

You *are* an idiot. Simplification of sin(t+ⱷ) is *NOT* ⱷsin(t), it is:

sin(x+ⱷ) = sin(x)cos(ⱷ) + sin(ⱷ)cos(x)

I gave you this trig identity in my message. And here you are saying you understand trig but you can’t even recognize a trig identity when you see one!

 (The fact that there is no rational sense in which sin(t + φ) is a simplification of φ sin(t) doesn’t seem to matter).”

*YOU* are the one that came up with ⱷsin(t)! Don’t attribute it to me! *YOU* are the one that thinks temperature should be zero at the equator (ⱷsin(t))!

I pointed out in my message that for latitude the formula would be

sin(t)cos(latitude) = [ sin(x+latitude) + sin(x–latitude) ] / 2

My guess is that you *still* don’t understand what that identity is telling you!

You also need to note that I only used time as the variable in one place and it was a typo. I used “x” in all of my equations. “x” is itself a function, one of the independent variables would be time but there are other independent variables as well.

You’ll never understand any of this. My guess is that you can’t even figure out all the mixing products generated by a non-linear biosphere from the inputs. Since the driving inputs are sinusoidal, all the mixing products will be sinusoidal and their combinations will result in all different kinds of orders from 2nd to infinity.

Reply to  Tim Gorman
August 22, 2022 3:40 pm

Hello, I thought you’d promised to stop bothering me.

Yeah, that makes a lot of sense. At t = pi/2 then T = φ. And at t = 0 or pi, then T = 0.

What do you think happens to your sin(t + φ)? Or your cos(φ)sin(t)?

You are the one who keeps insisting that daily temperature can be modeled with a sine wave centered on zero, that the mean daytime temperature is 0.63TMax.

I told you this was a simplification to explain how other factors affect temperature.

How is turning all the factors other than time into a phase shift a simplification of anything? I told you it was confused.

But, of course, you couldn’t even follow the more detailed explanation I gave you and just blew it off.

I’m not under any contract to give your ramblings more than the attention they deserve. We’ve been arguing for at least 2 years, and every time you try to define daily temperature as a sine curve it’s obvious you haven’t thought anything through. No matter how many times I’ve tried to explain why this is wrong you still insist on this 0.63TMax nonsense, and tell me I don’t understand basic calculus.

You *are* an idiot. Simplification of sin(t+ⱷ) is *NOT* ⱷsin(t)

It’s literally what you said a few comments ago

That is the definition of a cos function, cos(latitude). So we have sin(t)cos(latitude) as the function for T.


sin(x+ⱷ) = sin(x)cos(ⱷ) + sin(ⱷ)cos(x)

And here you are saying you understand trig but you can’t even recognize a trig identity when you see one!

Yes that’s the correct identity. And if you use that you are just back to sin(x+ⱷ), a phase shift.

*YOU* are the one that came up with ⱷsin(t)! Don’t attribute it to me! *YOU* are the one that thinks temperature should be zero at the equator (ⱷsin(t)!

You don’t even understand your own equations, or are being overly literal with mine. You said sin(t)cos(latitude). I’m just replacing cos(latitude) with a constant ⱷ, for a given latitude, just as you used ⱷ in sin(t + ⱷ). ⱷ = cos(latitude). Does that make it clearer?

I pointed out in my message that for latitude the formula would be

sin(t)cos(latitude) = [ sin(x+latitude) + sin(x–latitude) ] / 2

True, but that’s just another way of writing sin(t)cos(latitude). You are still just scaling the sin function, nothing to do with sin(t + ⱷ).

You also need to note that I only used time as the variable in one place and it was a typo. I used “x” in all of my equations. “x” is itself a function, one of the independent variables would be time but there are other independent variables as well.

Then that just makes everything twice as confusing. So now you have two different sets of independent variables controlling a sine function? So if x doesn’t go from 0 to 2pi over a day, what does it do?

You’ll never understand any of this.

You are correct, I doubt I ever will. The question you have to ask yourself is, is this because everything you say is nuts, or is it because you haven’t explained it properly?

Carlo, Monte
Reply to  Bellman
August 22, 2022 4:00 pm

is this because everything you say is nuts

Another irony overload.

Reply to  Bellman
August 23, 2022 4:45 pm

What do you think happens to your sin(t + φ)? Or your cos(φ)sin(t)?”

Tell me when sin(t+ ⱷ) goes to zero.
You didn’t actually calculate the mixing products of cos(ⱷ)sin(t), did you?

You are the one who keeps insisting that daily temperature can be modeled with a sine wave centered on zero, that the mean daytime temperature is 0.63TMax.”

Really? You still haven’t learned calculus yet have you? sin(x) is a normalized curve. You *do* know about normalization, don’t you? Like normalizing a normal distribution to center on zero?

“How is turning all the factors other than time into a phase shift a simplification of anything? I told you it was confused.”

And I’ve told you at least three times now that ⱷ doesn’t have to signify a phase shift! You don’t know any more about vector algebra than you know about calculus.

We’ve been arguing for at least 2 years, and every time you try to define daily temperature as a sine curve it’s obvious you haven’t thought anything through.”

Oh, malarky! You don’t know *anything* about the subject yet are willing to expound on it.

The actual path of the sun across the sky is truly complicated. It is described as sin(x) = sin(L)sin(ẟ) + cos(L)cos(ẟ)cos(h)
where L is local latitude, h is the hour angle, and ẟ is the declination. I’m not going to get into all of this but it’s used in solar panel engineering. I just decided to use sin(x) as a useful simplification. L is what I described as latitude. And the path of the sun is the driver of the daytime temperature and it doesn’t matter if you believe it or not, it’s still the truth.
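The elevation-angle formula quoted here is the standard one from solar engineering; a small sketch evaluating it (the 39°N latitude and solstice declination are just example inputs):

```python
import math

def solar_elevation_deg(latitude_deg, declination_deg, hour_angle_deg):
    """sin(alpha) = sin(L)sin(d) + cos(L)cos(d)cos(h), solved for alpha."""
    L = math.radians(latitude_deg)
    d = math.radians(declination_deg)
    h = math.radians(hour_angle_deg)
    sin_a = math.sin(L) * math.sin(d) + math.cos(L) * math.cos(d) * math.cos(h)
    return math.degrees(math.asin(sin_a))

# At solar noon (h = 0) the formula collapses to cos(L - d), i.e. 90 - (L - d):
print(round(solar_elevation_deg(39.0, 23.44, 0.0), 2))  # 74.44
```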

And, of course, here you are, denigrating something you know nothing about. Typical!

tg: “You *are* an idiot. Simplification of sin(t+ⱷ) is *NOT* ⱷsin(t)”

It’s literally what you said a few comments ago

tg: “That is the definition of a cos function, cos(latitude). So we have sin(t)cos(latitude) as the function for T.”

You can’t even differentiate between sin(t)cos(latitude) and ⱷsin(t)?

Since when did I say ⱷ = cos(latitude)? That was *YOUR* statement, not mine!

True, but that’s just another way of saying sin(t)cos(latitude), You are still just scaling the sin function, nothing to do with sin(t + ⱷ).

I TOLD YOU IN NO UNCERTAIN TERMS THAT sin(x+ⱷ) WAS A SIMPLIFICATION TO MAKE THINGS EASIER TO UNDERSTAND.

And, as usual, you just blew it off so you could continue your troll activities.

Then that just makes everything twice as confusing”

Only to you! ONLY TO YOU!

Like I said above, sin(x) is actually a function of the entire elliptical orbit and its tilt with respect to the sun.

sin(x) = sin(L)sin(ẟ) + cos(L)cos(ẟ)cos(h)

My guess is that you will make ABSOLUTELY NO ATTEMPT to understand this and will just continue to complain that using sin(x + ⱷ) is somehow totally wrong.

is this because everything you say is nuts, or is it because you haven’t explained it properly?”

It’s because you understand NOTHING about the real world. Get Lost stubborn mule!

Reply to  Tim Gorman
August 23, 2022 6:06 pm

It’s far too late in the day to be rehashing all this nonsense, especially when it’s degenerating to all caps and bold fonts.

I’ll just comment on this point (volume turned down by me)

I told you in no uncertain terms that sin(x+ⱷ) was a simplification to make things easier to understand.

Writing an incorrect formula neither makes it simpler nor easier to understand. The whole misunderstanding, if that’s what it is, is because of that supposed simplification. There is no way that that simplified equation means anything other than that you are shifting the phase by ⱷ. You even told me that’s what it meant, when I’d incorrectly said it was going to change the frequency.

Your lack of calculus knowledge is showing again! ⱷ is a PHASE DIFFERENCE.

https://wattsupwiththat.com/2022/08/09/numbers-tricky-tricky-numbers-part-2/#comment-3576018

Reply to  Bellman
August 24, 2022 2:48 pm

Writing an incorrect formula, neither makes it simpler nor easier to understand. “

I gave you the actual formula for the path of the sun which is the direct driver for daytime temp and, as usual, YOU BLEW IT OFF!

 sin(x) = sin(L)sin(ẟ) + cos(L)cos(ẟ)cos(h)

I simplified that to sin(x + ⱷ).

And, again, you’ve just blown that off!

There is no way that that simplified equation means anything other than you are shifting the phase by ⱷ”

You have some kind of a fixation on ⱷ being a phase shift! It does *NOT* have to be a phase shift! It can be anything that modifies the value of the function. Declination is not a phase shift but it *does* change the position of the sun in the sky. Why do you insist on declination being a phase shift?

Your lack of calculus knowledge is showing again! ⱷ is a PHASE DIFFERENCE.”

I did a search on this and cannot find this anywhere in the thread. Where did you come up with it?

Reply to  Tim Gorman
August 24, 2022 3:30 pm

I simplified that to sin(x + ⱷ).
And, again, you’ve just blown that off!

Yes, because it’s nonsense.

You have some kind of a fixation on ⱷ being a phase shift! It does *NOT* have to be a phase shift!

You are the one who said it was a phase difference. You even wrote it in capitals.

That’s what adding something to the x in a sin function does, it shifts the phase.

I did a search on this and cannot find this anywhere in the thread. Where did you come up with it?

I gave you the actual link, I assume you didn’t think to click on it. Here it is again.

https://wattsupwiththat.com/2022/08/09/numbers-tricky-tricky-numbers-part-2/#comment-3576018

Reply to  Bellman
August 25, 2022 6:07 am

Yes, because it’s nonsense.”

But you can’t show, in any possible way, how the following equation is nonsense, can you? You can’t even bring yourself to quote it!

sin(x) = sin(L)sin(ẟ) + cos(L)cos(ẟ)cos(h)

You won’t even admit that the path of the sun drives the daytime temperature!

That’s what adding something to the x in a sin function does, it shifts the phase.”

How does declination shift the “phase” of the sun? Does it delay it? Does it advance it? How is it a “phase shift”?

You won’t answer any of these questions because it would force you to admit that you are wrong!

“That’s what adding something to the x in a sin function does, it shifts the phase.”

I also said: “All of these factors, and probably others, affect the relationship between one point on the surface and a different point. A phase shift is not just a shift in time but a change in relationship.”

You keep wanting to define “phase shift” as only being a shift in time! It isn’t. But you’ll never admit it because it would mean you were wrong!

Reply to  Tim Gorman
August 25, 2022 6:49 am

This is just getting pathetic. I said your “simplification” of sin(x + ⱷ), not your formula for the height of the sun.

I said you could only assume the height of the sun “drives” temperature if you thought we live in a vacuum.

If that was the formula for temperature we would have the same temperature at sunrise in the summer as we have in the winter.

How does declination shift the “phase” of the sun? Does it delay it? Does it advance it? How is it a “phase shift”?

Firstly, you were talking about latitude (theta in your equation), not declination(delta).
Secondly, it doesn’t. But your equation said it did. That’s why I said it was nonsense.

You keep wanting to define “phase shift” as only being a shift in time! It isn’t.

I’ve still no idea what you think x is in your equation if it’s not time. You are trying to say it describes temperature during the day, but somehow isn’t dependent on time.

Reply to  Bellman
August 25, 2022 10:20 am

This is just getting pathetic. I said your “simplification” of sin(x + ⱷ), not your formula for the height of the sun.”

So the height of the sun is not a driver of the daytime temperature? Then why does it get warmer as the sun rises in the sky?

“I said you could only assume the height of the sun “drives” temperature if you thought we live in a vacuum.”

Again, so the height of the sun does *NOT* drive the daytime temperature? It doesn’t get warmer as the sun rises and then cools off as the sun sets?

If that was the formula for temperature we would have the same temperature at sunrise in the summer as we have in the winter.”

You have *NOT* bothered to look up the formula I provided, have you? You never do your studying, you just depend on your religious dogma.

The temperature at sunrise is dependent on the exponential decay during the night, not on the sun shining at night.

Is the height of the sun in the sky the same in both summer and winter?

As usual, the term declination means nothing to you and you refuse to learn anything about it!

Firstly, you were talking about latitude (theta in your equation), not declination(delta).
Secondly, it doesn’t. But your equation said it did. That’s why I said it was nonsense.”

I tried to use latitude as a FACTOR impacting the temperature. And here you are trying to focus on what greek letter I used for each?

Like MC has said: “WHOOOSH!”. It just went right over your head!

My equation doesn’t say that it delays or advances the sun. That is *YOU* being stuck in your little mental box thinking that “ⱷ” has to be time delay!

You just blocked out my TRUE statement of “A phase shift is not just a shift in time but a change in relationship.”

If you actually bothered to study the equation I gave you it would be apparent that the variables involved are *NOT* time shifts, but a change in the physical relationship between the earth and the sun.

sin(x) = sin(L)sin(ẟ) + cos(L)cos(ẟ)cos(h)

You simply don’t have a clue about this even after I told you what the variables L, ẟ, and h are. All you know how to do is just keep repeating “Nuh uh! You are wrong!”

Reply to  Tim Gorman
August 25, 2022 1:46 pm

So the height of the sun is not a driver of the daytime temperature?

This is just pointless. You are clearly not interested in arguing in good faith. You even quote the part where I say I’m talking about your sin(x + ⱷ) equation, but you keep wanting to pretend this in some way means I’m saying the sun does not drive temperature.

For the last time, to anyone daft enough to be following this, sin(x + ⱷ) is not an equation describing the passage of the sun, let alone its relation with temperature. Tim obviously made a simple mistake, I expect he meant to put the ⱷ outside the bracket, but for some reason he is incapable of admitting he made a mistake, so instead has to twist everything I say to convince himself I must be wrong.

You have *NOT* bothered to look up the formula I provided

Ad hominem number 1. Of course, I’ve looked at the formula, it’s a standard one for calculating the height of the sun. E.g. the one given here. As usual you ignore my point that it does not directly relate to temperature.

You never do your studying, you just depend on your religious dogma.

Ad hominem number 2. Every time I suggest what you say might be wrong, it must be because of some supposed dogma, rather than say, you are wrong about something.

Is the height of the sun in the sky the same in both summer and winter?

It is at sunrise and sunset. The height of the sun during winter is the same as the height of the sun during summer at some time of the day. There is a lot more to temperature than the height of the sun.

As usual, the term declination means nothing to you and you refuse to learn anything about it!

Ad hominem number 3.

I tried to use latitude as a FACTOR impacting the temperature. And here you are trying to focus on what greek letter I used for each?

I said nothing about which character you used, I was merely pointing out the difference between declination and latitude in the equation. To be honest I wasn’t looking at the letters you used but the ones used in the site I linked to above. I should have said you were using L for latitude. But it’s another distraction from my point.

Like MC has said: “WHOOOSH!”. It just went right over your head!

Ad hominem number 4.

My equation doesn’t say that it delays or advances the sun. That is *YOU* being stuck in your little mental box thinking that “ⱷ” has to be time delay!

Ad hominem number 5.

Your equation is sin(x + ⱷ). There is only one thing adding a value to the parameter of the sin function does, and that’s to shift it. The fact you are now having to pretend that x is not time just raises more questions.

You just blocked out my TRUE statement of “A phase shift is not just a shift in time but a change in relationship.”

It’s a change in relationship if you mean it shifts the sine wave to the left or right.

If you actually bothered to study the equation I gave you it would be apparent that the variables involved are *NOT* time shifts, but a change in the physical relationship between the earth and the sun.

sin(x + ⱷ) does not describe that. Your correct equation does, but sin(x + ⱷ) is just wrong.

All you know how to do is just keep repeating “Nuh uh! You are wrong!”

I’ve explained why it’s wrong. I’ve asked you to justify why you think it’s right. I’ve asked you to draw the graph, I’ve asked you to explain what you think x is.

Reply to  Bellman
August 26, 2022 5:58 am

This is just pointless. You are clearly not interested in arguing in good faith. You even quote the part where I say I’m talking about your sin(x + ⱷ) equation, but you keep wanting to pretend this in some way means I’m saying the sun does not drive temperature.”

*YOU* are the one that says sin(x+ⱷ) isn’t a good simplification. So who isn’t arguing in good faith? I’ve given you the derivation and it is mathematically sound.

For the last time, to anyone daft enough to be following this, sin(x + ⱷ) is not an equation describing the passage of the sun, let alone its relation with temperature.”

Of course it is an equation describing the passage of the sun. I’ve even given you a graphic showing how it works! And you just blow it off. Tell me again who isn’t arguing in good faith? Show me how the equation is wrong. All you are doing is using the Argument by Dismissal fallacy!

“Tim obviously made a simple mistake, I expect he meant to put the ⱷ outside the bracket,”

Nope. If you will actually do some research you’ll find that my equation is everywhere!

go here: https://www.pveducation.org/pvcdrom/properties-of-sunlight/elevation-angle

You’ll have to read the entire thing for meaning, no cherry picking, to understand.

The path of the sun *IS* the primary driver for daytime temp. You can deny that all you want, you’ll just continue to show your ignorance.

but for some reason he is incapable of admitting he made a mistake, so instead has to twist everything I say to convince himself I must be wrong.”

Projection anyone?

you keep wanting to pretend this in some way means I’m saying the sun does not drive temperature.”

“As usual you ignore my point that it does not directly relate to temperature.”

Cognitive dissonance. You deny that you don’t believe the sun drives temperature and then you turn around and confirm that you don’t believe the sun drives the temperature!

It is at sunrise and sunset. The height of the sun during winter is the same as the height of the sun during summer at some time of the day. There is a lot more to temperature than the height of the sun.”

ROFL!!! When it’s dark the sun is at the same height in winter and summer! Hello Captain Obvious!

And there *is* a lot more to temperature than the height of the sun! As I keep trying to tell you!

That’s why in sin(x+ⱷ), x is a function, it is not just time. It’s why ⱷ is a function, it is not just time.

Just as sin(ɑ) is a function of various sinusoids, so is temperature. Relative humidity is at least partially a function of temperature since warmer air can hold more moisture. Latitude is a sinusoid function as well. The further north you go the cooler the weather. Just as you can subsume the cos(ẟ) factor into sin(ɑ) you can subsume latitude and humidity into sin(x+ⱷ).

The problem is that you just can’t admit that for some reason. You can’t even do the simplest dimensional analysis. You want to use ⱷsin(x). If sin(x) is in degrees of temperature, what do you get when you multiply sin(x) by latitude? You don’t get degrees of temperature, you get degrees of temperature times degrees of latitude. What do you get when you multiply degrees of temperature times kilometers of distance? You certainly don’t get degrees, you get degree-km.

Your formula truly doesn’t pass muster. If you want to define “ⱷ” as a function then what function do you use to relate latitude and humidity to degrees of temperature? A sinusoid? What do you get when you multiply sinusoids? Have you ever bothered to go look up the associated identities?

sin(ⱷ)sin(x) = [ cos(ⱷ-x) – cos(ⱷ+x) ] /2

ANOTHER SET OF SINUSOIDS. Just like
sin(x) = sin(L)sin(ẟ) + cos(L)cos(ẟ)cos(h) where a set of sinusoids gets subsumed into sin(x).

Are you now going to claim that the formula
sin(x) = sin(L)sin(ẟ) + cos(L)cos(ẟ)cos(h)
is wrong? That it doesn’t give the path of the sun in the sky?
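For what it’s worth, both identities invoked in this exchange check out numerically (a quick sketch; the sample angles are arbitrary):

```python
import math

x, phi = 1.1, 0.3  # arbitrary test angles in radians

# Product-to-sum identity: sin(phi)sin(x) = [cos(phi - x) - cos(phi + x)] / 2
lhs = math.sin(phi) * math.sin(x)
rhs = (math.cos(phi - x) - math.cos(phi + x)) / 2
print(abs(lhs - rhs) < 1e-12)  # True

# Solar elevation formula sin(L)sin(d) + cos(L)cos(d)cos(h)
# collapses to cos(L - d) at h = 0 (example latitude/declination)
L, d = math.radians(39.0), math.radians(23.44)
noon = math.sin(L) * math.sin(d) + math.cos(L) * math.cos(d)
print(abs(noon - math.cos(L - d)) < 1e-12)  # True
```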



Reply to  Bellman
August 26, 2022 6:33 am

Ad hominem number 2. Every time I suggest what you say might be wrong, it must be because of some supposed dogma, rather than say, you are wrong about something.”

You never actually refute anything. You just use the Argument by Dismissal fallacy. The typical tactic of a religious believer. Just keep on spouting religious dogma.

“I said nothing about which character you used, I was merely pointing out the difference between declaration and latitude in the equation. To be honest I wasn’t looking at the letters you used but the ones used in the site I linked to above. I should have said you were using L for latitude. But it’s another distraction from my point.”

But you said the temperature depends on a lot of factors. And here you are, once again, trying to disprove that!

Latitude gets subsumed into sin(x) AS ANOTHER SINUSOID. Yet you keep arguing that it shouldn’t be! In essence saying that
sin(x) = sin(L)sin(ẟ) + cos(L)cos(ẟ)cos(h)
simply can’t be correct!

You quoted this: My equation doesn’t say that it delays or advances the sun. That is *YOU* being stuck in your little mental box thinking that “ⱷ” has to be time delay!

But you never refuted it! Why is that?

“Your equation is sin(x + ⱷ). There is only one thing adding a value to the parameter of the sin function does, and that’s to shift it. The fact you are now having to pretend that x is not time just raises more questions.”

Shift it in time as you claim? Or shift it in value? I have *ALWAYS* said “x” is a function all of its own! Why do you think I didn’t use the typical variable representation of “t”?

Is the dimension for declination in the sun height equation given in units of time? Does it shift the value of x in sin(x)?

You are now grasping at straws in trying to rationalize that you can’t be wrong! Give it up!

“It’s a change in relationship if you mean it shifts the sine wave to the left or right.”

Left or right in time? Or in value? Again, tell me how declination shifts sin(x) left or right. Does it shift sin(x) in time or in value?

sin(x + ⱷ) does not describe that. Your correct equation does, but sin(x + ⱷ) is just wrong.”

But you just can’t quite figure out how to show it is wrong, can you? “x” is a function that is the primary driver for temperature and “ⱷ” is a function that modulates the driving function.

And your only *TRUE* objection to the formula is that I came up with it! And for some reason you just *have* to prove that it can’t be correct! Let go your anger! It will only drive you further into irrationality.

I’ve explained why it’s wrong. I’ve asked you to justify why you think it’s right. I’ve asked you to draw the graph, I’ve asked you to explain what you think x is.”

You haven’t explained why it is wrong. And I *have* justified why it is right. I’ve led you down the path to understanding the path the sun follows in the sky. And you keep denying that it is the primary driver of daytime temps on the Earth. I’ve shown how that formula subsumes several sinusoidal functions into one sin(x) function and you keep saying that it can’t be done!

Not once have you shown how the path of the sun is *NOT* the primary driver of temperature on the Earth. Not once have you shown that subsuming sinusoids into one simplification is wrong and you never will for doing so would require you to say that
sin(x) = sin(L)sin(ẟ) + cos(L)cos(ẟ)cos(h)
can’t be correct either!

I’ve attached a graph of the past weeks temperatures, humidity, and wind from my weather station. The gray areas are nighttime. The daytime temps are a *very* close approximation of a sinusoid, i.e. sin(x+ⱷ). Humidity is also a very close approximation to a sinusoid and so is wind speed. Thus subsuming all of this into sin(x+ⱷ) is perfectly legitimate.

But you’ll never admit that! You’ll just tell me that my graphs can’t be correct and that you can’t create one sinusoidal function out of multiple sinusoids!

image_2022-08-26_082951533.png
Carlo, Monte
Reply to  Bellman
August 21, 2022 8:27 am

Then how do you know with any certainty that air temperature versus time is linear?

What is the functional dependence of this linear slope?

Reply to  Carlo, Monte
August 21, 2022 1:00 pm

If you ever read what I said, rather than trying to spam every comment section with witty one liners, you would know I’ve always said the relationship between air temperature and time is almost certainly not linear.

It’s just that linear is often a reasonable first guess. Start with the assumption it’s linear then see if there is a compelling reason to use a different model.

Reply to  Bellman
August 21, 2022 4:20 pm

How can a reasonable first guess be a linear relationship?

When you *KNOW* that the Earth spins, that the Earth is not vertical to the sun, and that the Earth’s orbit around the sun is not a circle, how can your first guess possibly be linear? All of these are described by sinusoids or combinations of sinusoids. This doesn’t even consider the fact the Earth is not a sphere, or cyclical things like ice ages, ocean phases, sunspots, etc.

If sin was linear then sin(x+y) would equal sin(x) + sin(y).

But it doesn’t. sin(x+y) = sin(x)cos(y) + sin(y)cos(x)

Reply to  Tim Gorman
August 21, 2022 4:51 pm

They are anomalies. They remove the seasonal variation.

If sin was linear then sin(x+y) would equal sin(x) + sin(y).

What do you mean by linear? Of course sin isn’t a linear function, but it can be used in a linear regression.

You are describing additive functions, not linear ones. Consider f(x) = x + 1. By your definition that wouldn’t be linear.

Reply to  Bellman
August 21, 2022 5:06 pm

Which season has the highest variance? Which season has the lowest variance? Do you think this might cause a bias that is never corrected?

Here is an example. If you are capable of thinking about gradients, which nighttime has the most cooling, summer or winter. Can this bias the averages and therefore the anomalies?

Reply to  Jim Gorman
August 21, 2022 7:26 pm

Which season has the highest variance? Which season has the lowest variance? Do you think this might cause a bias that is never corrected?

Why keep asking me, rather than doing your own homework?

Anyway, just for you the standard deviation of seasonal UAH data, gives winter as the highest and summer as the lowest. Which is to be expected.

Winter: 0.25°C
Spring: 0.24°C
Summer: 0.22°C
Autumn: 0.23°C

I doubt this has much of an effect on the overall trend, but if you want to make sure the easiest way is to just use annual averages.

Reply to  Bellman
August 22, 2022 11:36 am

I doubt this has much of an effect on the overall trend, but if you want to make sure the easiest way is to just use annual averages.”

ROFL!!! Like annual averages don’t use data from seasons with different variances!



Carlo, Monte
Reply to  Tim Gorman
August 22, 2022 1:42 pm

And different hemispheres!

“Let’s average winter and summer!”

“Yeah! This makes sense to me!”

Reply to  Carlo, Monte
August 22, 2022 3:08 pm

They both think multi-modal distributions can be adequately described using the mean. And forget the different standard deviations or variances of the modal distributions!

Carlo, Monte
Reply to  Tim Gorman
August 22, 2022 3:25 pm

While claiming to be using statistics!

Then they go apoplectic if anyone dares to point out their cherished, hallowed, holy trend lines are no longer going up.

Reply to  Carlo, Monte
August 23, 2022 6:23 am

Yep!

Reply to  Bellman
August 21, 2022 6:35 pm

Consider f(x) = x + 1. By your definition that wouldn’t be linear.”

Malarky!

Of course sin isn’t a linear function, but it can be used in a linear regression.”

If the sin function isn’t linear then how do you do a linear regression on it?

Your cognitive dissonance is just truly amazing to see!

Reply to  Tim Gorman
August 21, 2022 6:51 pm

If the sin function isn’t linear then how do you do a linear regression on it?

The same way you do it for any non-linear function. Transform the independent variables.

Here’s a quick example, using a sine wave and some random noise. The blue line is the trend line based on the formula y = sin(x).

Of course it helps if you know the frequency and phase in advance.

[attached image: 20220821wuwt1.png]

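Bellman’s `y ~ sin(x)` fit can be reproduced with a short sketch (my own reconstruction in Python, with an invented noise level and sample size, not the original script). The point at issue is that the model y = b0 + b1·sin(x) is linear *in the parameters*, so ordinary least squares applies once the regressor is transformed:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data: a sine wave plus random noise (assumed values; the
# original comment did not state its noise level or sample size).
x = np.linspace(0, 4 * np.pi, 200)
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

# "Transform the independent variable": regress y on sin(x) instead of x.
# The model y = b0 + b1*sin(x) is linear in b0 and b1, so ordinary least
# squares solves it directly -- this is what the R formula y ~ sin(x) does.
X = np.column_stack([np.ones_like(x), np.sin(x)])
(b0, b1), *_ = np.linalg.lstsq(X, y, rcond=None)

# b0 should land near 0 and b1 near 1, the true generating values.
```

As the comment notes, this only works because the frequency and phase are known in advance; only the coefficients being estimated enter linearly.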
Reply to  Bellman
August 22, 2022 8:42 am

Here’s a quick example, using a sine wave and some random noise. The blue line is the trend line based on the formula y = sin(x).”

But temperature is *NOT* noise! Nor did you show a linear regression of the sine wave!

You said earlier: “The trend line isn’t trying to predict specific values, it’s a prediction of the average value.”

How is the sine wave you show any kind of an average value?

Basically all you are now saying is that the linear regression trend line of a sine wave is the sine wave itself. Wow! What a revelation! What is the prediction of the average value of a sine wave?

So, once again, we see you saying exactly what you need to say in the moment.

Carlo, Monte
Reply to  Tim Gorman
August 22, 2022 9:44 am

They will claim anything they need to be an “average”!

Reply to  Tim Gorman
August 22, 2022 11:25 am

Sorry, I can’t help you any further. The blue line was a linear regression. I just used the formula y ~ sin(x). The linear regression is what it is, despite the noise I added to the data. If you don’t agree, feel free to try it with your own stats package.

Reply to  Bellman
August 22, 2022 2:43 pm

Sorry, I can’t help you any further. The blue line was a linear regression. I just used the formula y ~ sin(x). The linear regression is what it is, despite the noise I added to the data. If you don’t agree, feel free to try it with your own stats package.”

The blue line is a SINUSOIDAL REGRESSION, not a linear regression. Go look up sinusoidal regression!



Reply to  Tim Gorman
August 22, 2022 3:57 pm

In this case it’s just a linear regression.

Reply to  Bellman
August 23, 2022 4:50 pm

sinusoidal regression:

Adjust values of A, B, C, and D in the equation y = A*sin(B(x-C))+D to make a sinusoidal curve fit a given set of randomly generated data.”

linear regression:

Y = B0+B1X

Totally different equations. My guess is that you plugged your data into a regression calculator and it came up with your sinusoidal regression and you didn’t even realize what was happening!

Reply to  Tim Gorman
August 23, 2022 5:41 pm

As I kept telling you, the formula was y = sin(x). That is why it can be solved with a linear regression.

Yes, if you want to regress on extra parameters you need non-linear regression.

My guess is that you plugged your data into a regression calculator and it came up with your sinusoidal regression and you didn’t even realize what was happening!

Stop guessing. You are really bad at it. I used a linear regression. I explained how I did it.
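The distinction Bellman draws can be made concrete (a sketch of mine, assuming SciPy is available; the data and starting guesses are invented). In the full four-parameter model y = A*sin(B(x−C))+D quoted earlier, B and C sit inside the sine, so the fit is non-linear in those parameters and needs an iterative solver:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Synthetic data generated from the four-parameter model, plus noise
# (invented values for illustration only).
x = np.linspace(0, 4 * np.pi, 200)
y = 2.0 * np.sin(x - 0.5) + 3.0 + rng.normal(scale=0.3, size=x.size)

# y = A*sin(B*(x - C)) + D: A and D enter linearly, but B and C sit
# inside the sine, so estimating all four requires non-linear least
# squares rather than an ordinary linear regression.
def model(x, A, B, C, D):
    return A * np.sin(B * (x - C)) + D

# Non-linear fitting needs starting guesses and can fail to converge
# if they are poor.
popt, _ = curve_fit(model, x, y, p0=[1.0, 1.0, 0.0, 0.0])
A, B, C, D = popt
```

By contrast, when B and C are known in advance (the `y ~ sin(x)` case) the problem collapses back to a linear regression.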

Reply to  Bellman
August 21, 2022 11:38 am

Do you think the Ideal Gas Law is simple? It was developed by experiment, without computers. It is a linear function, as graphing shows.

Do you have a function like f(t) = T?

Carlo, Monte
Reply to  Jim Gorman
August 20, 2022 9:33 pm

And by averaging both hemispheres, the GAT is adding two out-of-phase time series!

Reply to  Carlo, Monte
August 21, 2022 11:44 am

Hurray! I don’t think these guys recognize that a sine or cosine, or even a combination, is only part of the functional description. You also need relationships for the amplitude.

This isn’t the place to give classes. I tire of doing so without compensation!

Carlo, Monte
Reply to  Bellman
August 20, 2022 9:36 pm

If I ask you, and I didn’t

The snooty bellcurveman shows himself…

I’m trying to figure out if you understand what you are talking about.

By not asking me?

Reply to  Bellman
August 20, 2022 6:55 am

I was asking what Jim thought a functional relationship was, not you. But let’s complicate this further.”

Judas H. Priest! You are posting in a public forum! ANYONE can post a reply to you!

“That doesn’t mean it couldn’t be dependent on other things, it just requires the other things to also be dependent on t. e.g. if x = bt, then T = (a/b)x.”

Now you are starting to get it! Why is it always so hard to drag you into understanding?

You said: “Temperature against time is a functional relationship as there can only be one value per time”

It’s FAR more complicated than that!

“Could be. I’ve still no idea what you are getting at or how this is an answer to the question what do you think a functional relationship is.”

You said: “To see if temperature is changing over time.”

Temperature *always* changes over time, even at the North Pole. If you are going to define a functional relationship then you need to define *all* the components of the functional relationship. That would include all those cyclical components that are combinations of sine waves. It’s pretty simple to show that most cyclical components are negligible on a daily basis; it’s not so easy to show on a weekly, monthly, or annual basis.

Yet the climate modelers try to define the functional relationship of temperature as T = a(CO2). Pretty much a linear relationship – which is what their model outputs wind up being. It’s also pretty obvious that such a functional relationship leaves out far too many significant factors. Even your buddy bdgwx is finally starting to come around on this issue!

Reply to  Tim Gorman
August 20, 2022 1:38 pm

You are posting in a public forum! ANYONE can post a reply to you!

Indeed, but as I was trying to figure out what Jim thought a functional relationship meant, you failing to say what you think it means doesn’t answer the question.

Carlo, Monte
Reply to  Bellman
August 20, 2022 3:07 pm

Do you seriously believe that Tim would give you a different answer?

Or are you trying to play another round of stump the professor?

See if you can sort it out, this is a real toughie:

Y = f(X1,..Xn)

Carlo, Monte
Reply to  Bellman
August 19, 2022 5:28 am

So now “measurement uncertainty of the trend line” is not the same as “uncertainty of the trend line”?

Nonsense semantic sleight-of-hand without a shred of mathematical rigor.

The truth is you need the true uncertainty of these fits to be as small as possible for political reasons, so you sweep it all under the rug, and then whine like this when called out.

Reply to  Bellman
August 19, 2022 1:45 pm

I’ve no idea why you think the standard error is not an indicator of the uncertainty of the trend line, nor do I care.”

Because the “standard error” is *NOT* an uncertainty. It’s not even an error. It is a set of residuals generated from trying to fit a linear line to data that doesn’t sit on a linear line! The residuals only measure the best-fit metric for that linear line.

The term “uncertainty” implies that there might exist another line that fits better. If you’ve done a good job of finding the best slope of the line then there isn’t any line that fits better.



Reply to  Tim Gorman
August 19, 2022 3:25 pm

Because the “standard error” is *NOT* an uncertainty. It’s not even an error.

It can be seen as a measure of the uncertainty in the trend line. I’ve quoted your own books using the term “uncertainty” to describe it. Standard error is not error; it’s the standard deviation of the sampling distribution. It’s telling you the likely range of errors of the parameter caused by sampling. In this case the parameter is the slope of the trend.

It is a set of residuals generated from trying to fit a linear line to data that doesn’t sit on a linear line!

No it isn’t. How can a single value be a “set of residuals”?

The residuals only measure the best-fit metric for that linear line.

You keep repeating that line as if it means something.

The term “uncertainty” implies that there might exist another line that fits better.

Correct, that is the point. But to be clear, by better we mean better to the population rather than the sample. That may be the point you are missing. The uncertainty comes from the fact we are only working with an imperfect, random sample.

Carlo, Monte
Reply to  Bellman
August 19, 2022 3:54 pm

The uncertainty comes from the fact we are only working with an imperfect, random sample.

THERE IS NO RANDOM SAMPLING INVOLVED WITH GLOBAL AIR TEMPERATURE MEASUREMENTS.

You are fooling yourself.

Reply to  Carlo, Monte
August 19, 2022 4:20 pm

Stop shouting, and think.

I am not talking about how the monthly averages are derived, I’m talking about the random sampling that goes into choosing the data points.

In a time series this is conceptually a little more difficult, as there can only be one sample, but the principle is the same. Different patterns of highs and lows could have led to very different trends.

Carlo, Monte
Reply to  Bellman
August 19, 2022 8:55 pm

NONE OF THIS IS RANDOM SAMPLING FROM A SINGLE POPULATION! Why is this so hard for you to understand?

Reply to  Carlo, Monte
August 21, 2022 5:16 am

All he knows is what he’s cherry picked from statistical textbooks that never address real world issues. Everything is always from the same population and everything is 100% accurate!

Reply to  Carlo, Monte
August 21, 2022 6:05 am

Shouting doesn’t help your argument.

I am not talking about the sampling that comes from station measurements. I am talking about the randomness that comes from all the monthly averages being used to calculate the trend.

The monthly average values do not lie exactly on the trend line. Hence we treat them as a random sample. The idea is that if we could rerun the experiment and get a different set of monthly values, they would be different due to their random nature and the calculated trend line would be different. The standard error is an estimate of the standard deviation of all possible calculated trend lines.

You could quibble about whether the monthly deviations are really random, and that it’s impossible to go back in time to find out. But it doesn’t matter, because you can still treat the values as if they were random and use the same maths.
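The claim that the standard error estimates the standard deviation of all possible calculated trend lines can be checked with a small simulation (a sketch with invented trend and noise values, not anyone’s actual temperature data): rerun the “experiment” many times with fresh random deviations and compare the spread of the fitted slopes to the analytic standard error.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 60                          # e.g. five years of monthly values (assumed)
x = np.arange(n, dtype=float)
true_slope, sigma = 0.02, 0.5   # invented trend and noise level

# Analytic standard error of an OLS slope when the noise sigma is known:
# SE = sigma / sqrt(sum((x - xbar)^2))
se_analytic = sigma / np.sqrt(np.sum((x - x.mean()) ** 2))

# "Rerun the experiment": same underlying trend, fresh random deviations
# each time; refit the trend line and collect the slopes.
slopes = [
    np.polyfit(x, true_slope * x + rng.normal(scale=sigma, size=n), 1)[0]
    for _ in range(2000)
]

# The spread of the fitted slopes should match the analytic standard error.
spread = np.std(slopes)
```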

Carlo, Monte
Reply to  Bellman
August 21, 2022 6:35 am

I am not talking about the sampling that comes from station measurements. I am talking about the randomness that comes from all the monthly averages being used to calculate the trend.

So the gnomes are sneaking in at night and adding “randomness”?

But it doesn’t matter because you can still treat the values as if they were random and use the same maths.

A claim made without any justification. The reality: it is too hard to do anything else. Why is it that none of you trendology lot EVER looks at residuals? Is this how you use “the same maths”, by ignoring said maths?

If these averaged air temperatures are random, as you claim, then from WHERE does the assumed linear time-temperature relationship arise?

Reply to  Carlo, Monte
August 21, 2022 7:27 am

He just doesn’t get it. If it doesn’t fit into one of the examples in his statistics textbook then BY PETE he’ll *make* it fit, no matter what he has to ignore to do so!

Reply to  Bellman
August 21, 2022 7:26 am

The idea is that if we could rerun the experiment and get a different set of monthly values they would be different due to their random nature and the calculated trend line would be different. “

How do you go back in time and rerun the experiment? If you can’t do that then you are running an experiment on a different measurand. Why would you expect the same result when you are measuring a different thing?

“The monthly average values do not lie exactly on the trend line. Hence we treat them as a random sample.”

That’s because of natural variation, not sampling error.

“The standard error is an estimate of the standard deviation of all possible calculated trend lines.”

What is a temperature measurement a “sample” of?

“You could quibble about whether the monthly deviations are really random, and that it’s impossible to go back in time to find out.”

That’s not a quibble. That’s an indisputable fact that makes a huge difference in the analysis!

 But it doesn’t matter because you can still treat the values as if they were random and use the same maths.”

Variation of temperature over time is not randomness. It is a measurement of a physical process. Each and every measurement is a separate population shown as “stated value +/- uncertainty”.

Reply to  Tim Gorman
August 21, 2022 12:49 pm

How do you go back in time and rerun the experiment?

Read what I wrote, rather than trying to score cheap points – “if you could rerun the experiment”

Why would you expect the same result when you are measuring a different thing?

You don’t. You expect a different result. Hence the uncertainty.

That’s not a quibble. That’s an indisputable fact that makes a huge difference in the analysis!

he said, quibbling.

Reply to  Bellman
August 21, 2022 4:33 pm

Read what I wrote, rather than trying to score cheap points – “if you could rerun the experiment””

If you can’t rerun it then why assume you can?

You don’t. You expect a different result. Hence the uncertainty.”

No, the uncertainty comes from the fact that no measurement is perfect, not that you expect a different result. When you measure 2″x4″x6′ board A and then measure 2″x4″x8′ board B, the different results are because the two boards are not the same length; they are different things. That has nothing to do with the uncertainty of the measurements. When you measure temperature A at time t1 and temperature B at time t2 you are measuring different things. That has nothing to do with determining the uncertainty of measurement!

It’s why you can’t determine an expectation for what your next measurement of temperature at time t3 will be by using (Tmax + Tmin)/2 as the “average” temperature!

Reply to  Bellman
August 21, 2022 5:14 am

When you are using the sample as (Tmax + Tmin)/2 then you have multiple combinations of Tmax and Tmin that will give the same answer. How is that a functional relationship?

Temperature measurements are not a random “sample”. They are independent measurements of different things. As has been pointed out to you many times.

I’ve advocated several times for moving away from using Tmax and Tmin (which occur at different times each day) to using common times, e.g. 0000 GMT and 1200 GMT. That would give you a common sample point for all measurements. It wouldn’t help with the multi-modal problem of combining temps from the two hemispheres. But it would give a metric much more comparable to data sets like UAH.

Reply to  Bellman
August 21, 2022 3:52 am

It can be seen as a measure of the uncertainty in the trend line. “

Just as the standard deviation of the sample means is called an error when it is not any such thing, calling the residuals “errors” is a misnomer, and it doesn’t matter if it is in common usage or not. Both are called such by statisticians who have no actual understanding of metrology. The proof of that statement is the fact that *you* think that both are actual errors.

Residuals between data points and the linear regression line are *not* errors; they are a metric that indicates the data is not perfectly linear in nature. That is not error. Data that is not perfectly linear simply can’t be perfectly described by a straight line. Large residuals indicate some other kind of regression should be used, perhaps even piecewise linear regression. The charge then, to those generating the data, is to determine if the data is valid or if it actually indicates that variation from linear is just part of the system.

An example would be a controller that allows both overshoot and undershoot in order to simplify the controlling system, using hysteresis instead of a complicated PID controller. Output data from a system with such a hysteresis controller will never lie on a perfectly straight line but will, instead, oscillate around such a line. Think about the thermostat in your house. I’ll guarantee you it has hysteresis, with the temperature oscillating around the set temperature. This hysteresis is not “error”; it is just the way the system is designed!

No it isn’t. How can a single value be a “set of residuals”.”

You are kidding, right? You can’t even see the “s” on the end of words? A “set of residuals”?

If you have a single point then how do you generate a linear line? You must have at least two data points to generate a line! Can you see the “s” on the end of “points“?

You keep repeating that line as if it means something.”

It *does* tell you something. It tells you how far the data is from being perfectly linear!

Correct, that is the point.”

That just means that *YOU* are the error, not the residuals. You did a poor job of finding the best-fit line!

But to be clear, by better we mean better to the population rather than the sample. That may be the point you are missing. The uncertainty comes from the fact we are only working with an imperfect, random sample.”

If your sample doesn’t represent the population then why are you using it? You are grasping for straws. The best-fit to a set of data *is* the best-fit. It doesn’t matter if it is a sample or the entire population. The residuals only tell you if the data is perfectly linear or not. You seem to be trying to say that all data from a population should be perfectly linear and if the data in a sample is not perfectly linear then the sample data doesn’t describe the population. That’s just a bunch of hooey and you should know that!

Carlo, Monte
Reply to  Tim Gorman
August 21, 2022 7:10 am

Just as the standard deviation of the sample means is called an error when it is not any such thing, calling the residuals “errors” is a misnomer and it doesn’t matter if it is in common usage or not.

The GUM even has a paragraph or two about usage of “standard error”—I was scoffed at when I tried to tell them it is considered obsolete language in metrology.

Both are called such by statisticians that have no actual understanding of metrology. The proof of that statement is the fact that *you* think that both are actual errors

A perfect example is that steven candy person who blew through WUWT recently; when confronted with real uncertainty, he scoffed and insisted he was the authority with his long statistics resume and large hat size.

If your sample doesn’t represent the population then why are you using it? You are grasping for straws. The best-fit to a set of data *is* the best-fit. It doesn’t matter if it is a sample or the entire population. The residuals only tell you if the data is perfectly linear or not. You seem to be trying to say that all data from a population should be perfectly linear and if the data in a sample is not perfectly linear then the sample data doesn’t describe the population. That’s just a bunch of hooey and you should know that!

He claims with handwaving that “randomness” in air temperature datasets is identical to taking random samples from a fixed population!

Reply to  Carlo, Monte
August 21, 2022 7:31 am

The GUM even has a paragraph or two about usage of “standard error”—I was scoffed at when I tried to tell them it is considered obsolete language in metrology.”

Was one of them named Tevye?

“A perfect example is that steven candy person who blew through WUWT recently; when confronted with real uncertainty, he scoffed and insisted he was the authority with his long statistics resume and large hat size.”

His nickname should be Tevye.

“He claims with handwaving that “randomness” in air temperature datasets is identical to taking random samples from a fixed population!”

Again, he’s committed to making the temperature of the globe fit into an example in his statistics textbook. It’s just more of his cherry picking things without understanding them!

Carlo, Monte
Reply to  Tim Gorman
August 21, 2022 7:39 am

Was one of them named Tevye?

No, this name must be from before my time. Pretty sure it was bellman.

The next time they start going on about standard error, I should dig out the GUM quote.

Reply to  Carlo, Monte
August 21, 2022 8:16 am

The main character in Fiddler on the Roof. He is big on TRADITION!

Carlo, Monte
Reply to  Tim Gorman
August 21, 2022 8:21 am

“ah, I see!” said blind man…

Reply to  Tim Gorman
August 21, 2022 2:12 pm

I’d better repeat this entire stream of consciousness, as this reply won’t appear until well below the original comment:

Just as the standard deviation of the sample means is called an error when it is not any such thing, calling the residuals “errors” is a misnomer, and it doesn’t matter if it is in common usage or not. Both are called such by statisticians who have no actual understanding of metrology. The proof of that statement is the fact that *you* think that both are actual errors.

Residuals between data points and the linear regression line are *not* errors; they are a metric that indicates the data is not perfectly linear in nature. That is not error. Data that is not perfectly linear simply can’t be perfectly described by a straight line. Large residuals indicate some other kind of regression should be used, perhaps even piecewise linear regression. The charge then, to those generating the data, is to determine if the data is valid or if it actually indicates that variation from linear is just part of the system.

An example would be a controller that allows both overshoot and undershoot in order to simplify the controlling system, using hysteresis instead of a complicated PID controller. Output data from a system with such a hysteresis controller will never lie on a perfectly straight line but will, instead, oscillate around such a line. Think about the thermostat in your house. I’ll guarantee you it has hysteresis, with the temperature oscillating around the set temperature. This hysteresis is not “error”; it is just the way the system is designed!

So many misconceptions and nit picking. For some reason a few here find the word “error” triggering. But it’s usual in maths and science to use words in ways that don’t agree with the everyday meaning.

Tim is confused here on a number of points though. He says that residuals are called errors. They are not. They are called residuals, hence the name. Errors refer to the difference between a data point and the true, but unknown, mean or trend, whilst a residual is the difference between the data point and the estimated mean or trend.

For some reason he doesn’t like the term standard error of the mean. He says it is not an actual error. Several points here. The standard error of the mean, or the standard error of anything, is not an error. It’s an estimate of what the average error is. Just as the standard deviation is not a deviation, it’s an estimate of the average deviation.

The distinction between the phrases “standard error” and “standard deviation” has been around since the end of the 19th century, and whilst they could be used interchangeably, I feel it’s useful to keep to the accepted usages for a number of reasons. For one, the terms standard deviation of a sample and standard error of the mean are already frequently confused with each other, and calling the “standard error of the mean” the “standard deviation of the mean” only adds to the confusion.

Moreover, standard errors are describing actual errors. They describe the likely error in the estimate of a statistical parameter. If you estimate that a mean is 99 and the true mean is 100, that’s an error both in statistical and any other terms.

Tim seems to think that because metrology doesn’t like the term error then nobody else should be allowed to use the word. But, when we are talking about standard errors we are talking about statistics, not metrology.

Carlo, Monte
Reply to  Bellman
August 21, 2022 3:49 pm

So in your abject ignorance, you reject metrology—stamp a big ‘L’ on your forehead.

Attempting to provide you with clues remains a fool’s errand.

Reply to  Carlo, Monte
August 21, 2022 4:18 pm

No. But I can see why an idiot might leap to that assumption.

Reply to  Bellman
August 21, 2022 6:41 pm

So many misconceptions and nit picking. For some reason a few here find the word “error” triggering. But it’s usual in maths and science to use words in ways that don’t agree with the everyday meaning.”

Of course you can’t actually show any misconceptions or nit picking.

It’s *NOT* common in science or engineering to use words in ways that don’t match reality! That’s a quick way to lose a reputation!

” He says that residuals are called errors. They are not. They are called residuals, hence the name. Errors only refer to the difference between a data point and the true, but unknown, mean or trend, whilst a regression is the difference between the data point and the estimated mean or trend.”

Here is what you said: “The standard errors, i.e. the uncertainty of the trend, is quantified by the variation in the data. It’s very basic statistics. Maybe you should join me in one of these remedial classes you are so keen on.”

It is the variations of the data that determine the residuals. *YOU* are the one that said those are the “standard errors”, not me!

You have no idea of what you are talking about and that’s why you get so lost and contradict yourself ALL THE TIME!



Reply to  Tim Gorman
August 21, 2022 7:07 pm

It’s *NOT* common in science or engineering to use words in ways that don’t match reality!

Strawman. I said “don’t agree with the everyday meaning”, not “reality”.

It is the variations of the data that determine the residuals. *YOU* are the one that said those are the “standard errors”, not me!

No I didn’t. I said the variation in the data is used to quantify the standard error of the trend. Not that the residuals are the standard errors.

Reply to  Bellman
August 21, 2022 7:58 pm

No I didn’t. I said the variation in the data is used to quantify the standard error of the trend. Not that the residuals are the standard errors.”

The variation in the data determines THE RESIDUALS! And you *did* say the following!

“The standard errors, i.e. the uncertainty of the trend, is quantified by the variation in the data. It’s very basic statistics. Maybe you should join me in one of these remedial classes you are so keen on.””

Give it a rest! You can’t even admit to what you said when it is quoted back to you!

Reply to  Tim Gorman
August 22, 2022 7:38 am

This constant nit-picking over the exact words I’ve used in trying to explain some simple concept to you is getting really tedious. If you want a more authoritative explanation of the details, try reading a book on statistics. Or look it up on the internet, or read your own books on error propagation, or the GUM (Example H3). They all, hopefully, give the same formula, in different forms, for the standard error of the slope, or the uncertainty, as it’s called in the metrology sources.

Here’s one such example from

https://www.statology.org/standard-error-of-regression-slope/

See how you need to know both the variance in the residuals and the variance in the independent variable, along with the sample size, to calculate the standard error of the regression slope.

[attached image: Screenshot 2022-08-22 153406.png]

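The formula in that screenshot can be written out directly (my own sketch of the standard textbook formula, not statology’s code). Note that it uses both the residuals about the fitted line and the spread of the independent variable:

```python
import numpy as np

def slope_standard_error(x, y):
    """Standard error of an OLS slope:
    SE = sqrt(SSE / (n - 2)) / sqrt(sum((x - mean(x))^2)),
    where SSE is the sum of squared residuals about the fitted line."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = x.size
    slope, intercept = np.polyfit(x, y, 1)      # ordinary least-squares fit
    residuals = y - (slope * x + intercept)     # deviations from the fit
    sse = np.sum(residuals ** 2)
    sxx = np.sum((x - x.mean()) ** 2)           # spread of the x values
    return np.sqrt(sse / (n - 2)) / np.sqrt(sxx)

# Data that sits exactly on a line has zero residuals, hence zero
# standard error for the slope; noisier data gives a larger one.
se_exact = slope_standard_error([1, 2, 3, 4], [2, 4, 6, 8])
se_noisy = slope_standard_error([1, 2, 3, 4], [2, 4.1, 5.9, 8.2])
```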
Reply to  Bellman
August 22, 2022 7:41 am

What may be confusing you is that the standard deviation of the residuals is often called “the standard error of the regression”. Which I agree is confusing, and not to be confused with the standard error of the regression slope, which is what I mean by the uncertainty of the slope.

Carlo, Monte
Reply to  Bellman
August 22, 2022 8:21 am

This quantity is NOT UNCERTAINTY!

Carlo, Monte
Reply to  Bellman
August 22, 2022 8:05 am

or the GUM (Example H3).

LIAR!

This example has NOTHING to do with averaging. You and bdgwx just tossed it up against the wall and ran with it, and didn’t bother to read it!

bdgwx
Reply to  Carlo, Monte
August 22, 2022 9:04 am

CM said: “This example has NOTHING to do with averaging. You and bdgwx just tossed it up against the wall and ran with it, and didn’t bother to read it!”

Hold on. I never said or implied that H3 was an example focused on averaging. I said (above) it is an example that uses an average temperature. Specifically, it uses the average temperature to make a salient point in section H.3.5. Clearly the GUM has no issue with averaging a set of temperature values and found it meaningful enough to use it.

Carlo, Monte
Reply to  bdgwx
August 22, 2022 9:49 am

And now here you are doubling down on your lie—H3 is about calibration using a least-squares fit! In a laboratory!

t0 is the temperature of the controlled-temperature environment; this is NOT a time-series! H3.5 is ONLY about removing correlation!

It has nothing in common with averaging air temperatures for GLOBAL WARMING, LIAR!

bdgwx
Reply to  Carlo, Monte
August 22, 2022 10:03 am

CM said: “It has nothing in common with averaging air temperatures for GLOBAL WARMING”

I never said it did. What I said is that the GUM averages temperatures and finds it meaningful enough to use it. Remember, we were told that averaging temperature regardless of context was meaningless and useless. In fact, we were told averaging any intensive property is meaningless and useless. Yet here we are with another example of meaning and use being applied to averaging temperatures. And it comes from a document you told me I had to use to assess uncertainty.

Carlo, Monte
Reply to  bdgwx
August 22, 2022 10:34 am

I never said it did.

Of course you did, by implication.

You didn’t post the context of the thermometer calibration, I DID.

See if you can sort this out:

It is an average of multiple measurements of the calibration bath WHICH IS CONSTANT.

This has NOTHING in common with averaging air temperatures from Timbuktu and Kalamazoo.

FOOL.

I leave you to your delusions and lies.

bdgwx
Reply to  Carlo, Monte
August 22, 2022 11:41 am

CM said: “You didn’t post the context of the thermometer calibration, I DID”

If you’re going to be pedantic about it then I’m compelled to point out that the GUM provides the context. It wasn’t either you or I. But I’m the one who pointed out that example in the first place. Though, I see Bellman pointed it out for a completely different reason which, as best I can tell, you confused with the reason I pointed it out. Either way it’s moot; my point stands. The authors of the GUM believe the average temperature is meaningful enough to use it to make a salient point. And you’re the one who told me I had to use the GUM in the first place. So I don’t think you’re as convinced about this arbitrary rule about averaging intensive properties as you let on.

Carlo, Monte
Reply to  bdgwx
August 22, 2022 3:28 pm

Fool, once again you run away from the difference between a temperature-controlled bath and air temperatures in diverse locations.

How do you manage to feed and dress yourself in the morning?

Reply to  Carlo, Monte
August 23, 2022 6:24 am

I often ask myself that. At the very least who picks out the clothes they are going to wear?

Reply to  Carlo, Monte
August 22, 2022 10:57 am

From anyone but you I might take offense at being called a liar.

I am not talking about averaging. I’m talking about the uncertainty in a linear regression, exactly what the GUM does in H3.

Carlo, Monte
Reply to  Bellman
August 22, 2022 11:18 am

Another liar who ignores context to cover his $$.

Reply to  Carlo, Monte
August 22, 2022 11:20 am

What context? And leave my donkey out of it.

Carlo, Monte
Reply to  Bellman
August 22, 2022 11:22 am

Fool, liar, go sort it out for yourself (not possible, I know).

Reply to  Bellman
August 22, 2022 11:34 am

“This constant nit-picking over the exact words I’ve used in trying to explain some simple concept to you is getting really tedious. If you want a more authoritative explanation of the details, try reading a book on statistics. Or look it up on the internet, or read your own books on error propagation, or the GUM (Example H3). They all, hopefully, give the same formula, in different forms, for the standard error of the slope, or the uncertainty as it’s called in the metrology sources.”

So, in other words, rules for thee but not for me is your meme. The rules don’t apply to you. “When I use a word,” Humpty Dumpty said in rather a scornful tone, “it means just what I choose it to mean — neither more nor less.”
I guess we should start calling you Humpty or perhaps Dumpty.

As I continue to point out – the term “standard error” is used by statisticians all over the place when it is very seldom an actual “error”. In the case of linear regression it is a metric for the best-fit. The equation you posted is not a measure of “error”, it is a measure of variation, and variation is neither error nor uncertainty.

Call it what it is.

Reply to  Tim Gorman
August 23, 2022 10:49 am

The equation you posted is not a measure of “error”, it is a measure of variation, and variation is neither error or uncertainty.

It is not a measure of actual variation. It’s an estimate of the variation that would occur if you could keep running the experiment and getting different results, and that is a measure both of error and uncertainty. The GUM and Taylor both call it uncertainty, statisticians prefer the term standard error of the regression slope, but it is still reflecting the uncertainty you have in the value of the stated parameters.
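
The quantity being argued over here, the standard error of a fitted slope (the formula of GUM Example H3), can be sketched in a few lines. This is a minimal illustration with made-up data; the variable names and numbers are mine, not from any commenter:

```python
import numpy as np

# Sketch: standard error (uncertainty) of a fitted slope.
# Hypothetical data: a known trend of 0.5 per step plus random noise.
rng = np.random.default_rng(0)
x = np.arange(20, dtype=float)
y = 0.5 * x + rng.normal(0.0, 2.0, size=x.size)

n = x.size
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

# s^2 = sum of squared residuals / (n - 2) degrees of freedom
s = np.sqrt(np.sum(residuals**2) / (n - 2))
# Standard error of the slope: s / sqrt(sum((x - mean(x))^2))
se_slope = s / np.sqrt(np.sum((x - x.mean()) ** 2))
print(slope, se_slope)
```

Whatever one chooses to call `se_slope` — an “error” or an “uncertainty” — it is the same number: an estimate of how much the fitted slope would scatter if the noisy data could be regenerated.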

Carlo, Monte
Reply to  Bellman
August 23, 2022 1:03 pm

More idiotic lies.

Reply to  Carlo, Monte
August 23, 2022 2:18 pm

More random spitting

If you think there’s something wrong with what I’d said, you could point it out. But much easier for you just to make vague accusations.

Carlo, Monte
Reply to  Bellman
August 23, 2022 2:20 pm

Go find someone who cares about your whining, laddie.

Reply to  Carlo, Monte
August 23, 2022 2:53 pm

You care about me enough to call me a liar, but then go into a sulk when challenged.

Carlo, Monte
Reply to  Bellman
August 23, 2022 4:31 pm

Sulk? HAHAHAHAHAHAH

You’ve been informed many many times I’m done with attempting to educate you, but this is just another fact that cannot penetrate the neutronium.

Reply to  Carlo, Monte
August 23, 2022 4:51 pm

Stop whining.

Carlo, Monte
Reply to  Bellman
August 23, 2022 6:05 pm

Another fail, unable to differentiate laughter from whining.

No surprise.

Reply to  Bellman
August 24, 2022 1:05 pm

ERROR IS NOT UNCERTAINTY!!!

How many times have you been told to memorize that meme?

And yet you just continue to ignore it!

Running the *same* experiment over and over is measuring the SAME thing over and over.

The residuals are *NOT* a measure of the uncertainty in the stated values. They are a metric for finding the best-fit regression line. The uncertainty of the stated values is the uncertainty of the stated values. You seem to think the regression line *IS* the true value and therefore the residual between the regression line and the stated values are some measure of error or uncertainty. The regression line is *NOT* a true value.

Reply to  Tim Gorman
August 24, 2022 3:16 pm

Stop thinking in slogans and learn to think for yourself.

What do you mean when you say error is not uncertainty? Do you mean that the thing we call uncertainty is not exactly the same thing as the thing we call error, or do you mean that errors are irrelevant to the concept of uncertainty?

Running the *same* experiment over and over is measuring the SAME thing over and over.

And getting a different result each time. Hence each result has an error, hence you have uncertainty. (Note, this is not measurement uncertainty, unless you allow that calculating a linear trend is a type of measurement.)

The residuals are NOT* a measure of the uncertainty in the stated values.

They are a measure of the uncertainty in the prediction of individual values. But that’s not what I’m talking about.

They are a metric for finding the best-fit regression line

That’s the first step in the operation, finding the best fit line. But then we want to know the uncertainty of that line.

“You seem to think the regression line *IS* the true value…”

No I do not. If I did I wouldn’t be interested in the uncertainty of the line.

“and therefore the residual between the regression line and the stated values are some measure of error or uncertainty.”

As I said above, they are but that’s not the uncertainty I’m talking about. I’m talking about the uncertainty of the slope of the regression line.

Reply to  Bellman
August 25, 2022 5:48 am

“What do you mean when you say error is not uncertainty?”

Error is a quantifiable quantity. Uncertainty is an unknown. Period, exclamation point!

“Do you mean that thing we call uncertainty is not exactly same thing as that thing we call error, or do you mean that errors are irrelevant to the concept of uncertainty?”

Error is not uncertainty. What is so ambiguous about that statement?

“And getting a different result each time. Hence each result has an error, hence you have uncertainty. (Note, this is not measurement uncertainty, unless you allow that calculating a linear trend is a type of measurement.)”

But it is *NOT* measuring different things! It is not measuring the diameter of grapefruit to determine temperature.

Each result for the same thing is a combination of systematic bias and random variation. That is *NOT* true when measuring different things.

The residuals between stated values and a trend line are not “errors”. They are a metric for best-fit. It would appear that much of your problem in understanding measurement uncertainty and its propagation is due to not understanding the meaning of “error” in context. That’s why so many sources today shy away from even using the term “error” in association with measurement!

“That’s the first step in the operation, finding the best fit line. But then we want to know the uncertainty of that line.”

See what I mean? There is no “uncertainty”. Each residual is 100% accurate. There is a best-fit metric. There is no uncertainty.

“No I do not. If I did I wouldn’t be interested in the uncertainty of the line.”

You *KNOW* the best-fit metric. That is not uncertainty. It is 100% accurate! You are actually saying you want to know the best-fit metric but won’t admit it!



Reply to  Tim Gorman
August 25, 2022 3:20 pm

Watch this video. It covers what the SEM (Standard Error of the sample Mean) is, how it is derived and what it means.

Statistics 101: Standard Error of the Mean – YouTube

Reply to  Tim Gorman
August 28, 2022 3:36 pm

Error is a quantifiable quantity. Uncertainty is an unknown, Period, exclamation point!

Uncertainty is quantifiable, that’s the whole point of the GUM.

Reply to  Bellman
August 28, 2022 3:48 pm

Only when it consists of a distribution that is identically distributed. E.g. multiple measurements of the same thing using the same instrument. That’s *all* the GUM addresses.

It simply isn’t applicable to multiple measurements of different things using different devices – which is what temperature at different locations and times are!

Reply to  Bellman
August 21, 2022 6:56 pm

“The standard error of the mean, or the standard error of anything, is not an error.”

Then why do you call it “standard error” if it isn’t an error! OMG, cognitive dissonance at its finest!

“It’s an estimate of what the average error is”

You just said that it is not an error so why call it an “error”?

” calling the “standard error of the mean” the “standard deviation of the mean”, only adds to the confusion.”

Standard error of the mean *IS* the standard deviation of the sample means!

“Moreover, standard errors are describing actual errors. They describe the likely error in the estimate of a statistical parameter. If you estimate that a mean is 99 and the true mean is 100, that’s an error both in statistical and any other terms.”

How do you know the “true mean”?

“Tim seems to think that because metrology doesn’t like the term error then nobody else should be allowed to use the word.”

The problem is that the term “error” is so ambiguous. If you are finding the standard deviation of a set of data, be it a set of sample means or something else, then call it what it is. Using the term “error” leaves it open to interpretation – which you are showing in spades!
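
For what it is worth, the claim both sides are circling — that the standard error of the mean (sd/√n) matches the standard deviation of the distribution of sample means — can be checked numerically. This is my own sketch with a deliberately non-normal, made-up population, not any commenter’s calculation:

```python
import numpy as np

# Sketch: SEM = population_sd / sqrt(n) versus the observed standard
# deviation of many sample means, using a non-normal (exponential)
# population.
rng = np.random.default_rng(42)
population = rng.exponential(scale=10.0, size=100_000)

n = 50
sample_means = np.array(
    [rng.choice(population, size=n).mean() for _ in range(2000)]
)
sd_of_means = sample_means.std()
sem = population.std() / np.sqrt(n)
print(sd_of_means, sem)
```

The two numbers come out close, which is all the SEM formula claims; it says nothing by itself about whether the underlying mean is physically meaningful.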

Carlo, Monte
Reply to  Tim Gorman
August 21, 2022 9:09 pm

Shhhhh! Don’t tell him that a lot of the people on the GUM committee have statistics backgrounds!

Reply to  Carlo, Monte
August 22, 2022 7:44 am

Committee may be the operative word. The more I look into it the more the GUM seems to be a horse designed by a committee. 30 years on and people are still trying to figure out what it’s trying to say.

Carlo, Monte
Reply to  Bellman
August 22, 2022 8:22 am

Only clueless rubes such as yourself who seem to be unable to read and understand.

Reply to  Carlo, Monte
August 22, 2022 3:00 pm

Here for example are some clueless rubes, referencing lots of other clueless papers.

https://link.springer.com/article/10.1007/s00769-022-01508-9

To address this challenge, the GUM states that it adopted an operational view of measurement that focuses on the observables, rather than the unknowable true values and errors [1, E.5.4]. More specifically, the GUM states that it focuses on “the measurement result and its evaluated uncertainty rather than on the unknowable quantities “true” value and error” [1, E.5.1]. Further, it states that it does not use the term “true value” because the expressions “value of a measurand” and “true value of a measurand” are viewed as equivalent [1, 3.1.1 NOTE]. The GUM also states that its approach “makes any mention of error entirely unnecessary” [1, E.5.4]. These statements in the GUM have received wide attention in the literature. However, controversies persist on their interpretations

The literature is divided on whether the GUM has avoided the concepts true value and error, or only the terms “true value” and “error”. Some insist that the GUM discarded the concepts true value and error. Others insist that the GUM discouraged only the use of the term “true value”, and simply used “value” in place of “true value”. Some have further suggested that avoiding the expressions “true value” and “error” creates confusion with no clear benefit because true value and error are indispensable concepts in metrology. Following such a view, recent authoritative publications use the expressions “true value” and “error” in association with measurement uncertainty. This difference in the views towards true value and uncertainty is far-reaching for metrology. Unfortunately, the diverging views are not widely discussed and are becoming a source of confusion.

Carlo, Monte
Reply to  Bellman
August 22, 2022 3:29 pm

YOU are confused, fool!

“To address this challenge, the GUM states that it adopted an operational view of measurement that focuses on the observables, rather than the unknowable true values and errors”

This is absolutely correct, but I’m quite confident that in the alternate reality which you inhabit, true values and errors are knowable.

Reply to  Carlo, Monte
August 22, 2022 3:42 pm

It’s not my paper.

Carlo, Monte
Reply to  Bellman
August 22, 2022 3:57 pm

Where did I indicate such?

More lies and your confusion remains intact.

Reply to  Bellman
August 23, 2022 6:23 am

“Further, it states that it does not use the term “true value” because the expressions “value of a measurand” and “true value of a measurand” are viewed as equivalent [1, 3.1.1 NOTE].”

Please note carefully the use of the word MEASURAND and not MEASURANDS!

We have, once again, the GUM clarifying that it’s talking about a single measurand with multiple measurements of it.

Reply to  Tim Gorman
August 23, 2022 6:34 am

Why are you arguing about this with me? You seem to have suddenly decided to reignite discussions from months ago, or are confusing me with bdgwx. All I’ve talked about in this is the nature of linear regression, and the meaning of error.

If you do want to go down this route with me, all you are showing is your complete lack of logic. I jokingly asked you what the correct plural of measurand is, and you are just showing me what the singular is. I know full well that the correct plural is measurands, and I don’t know why you think it’s worth arguing about. A quick search of the GUM shows the plural occurring 16 times. I don’t know why you think plurals are not allowed.

bdgwx
Reply to  Bellman
August 23, 2022 1:40 pm

I have no idea why you got roped into that either.

FWIW…that all got started when I was accused of not understanding anything about uncertainty. I then said that while I don’t know everything I do at least know that you can use GUM equation 10 to assess the uncertainty of the output of the function f that combines multiple measurands (plural) even when those measurands (plural) are of different things. I even pointed out that the GUM literally calls the inputs to that function measurands (plural) and has numerous examples of them being of different things.

BTW…and only tangentially related…thanks to Tim I just discovered that GUM document JCGM 6:2020 advocates for averaging measurements to “abate the extent of the uncertainty”, and it did so in the context of temperature measurements no less, even when those measurements have an element of temporal and/or spatial extent and even when those measurements have an element of interrelatedness (correlation). If that isn’t the smoking gun to finally put that whole debate to rest, then Tim (and presumably Jim, Carlo Monte, and others) will never be convinced.
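
Since GUM equation 10 keeps coming up in this exchange: for uncorrelated inputs it reads u_c(y)² = Σᵢ (∂f/∂xᵢ)² u(xᵢ)². A minimal sketch of what it yields when f is a plain average — my illustration only, taking no side on whether the inputs may be different things:

```python
import numpy as np

# GUM (JCGM 100:2008) eq. 10 for uncorrelated inputs:
#   u_c(y)^2 = sum_i (df/dx_i)^2 * u(x_i)^2
# For f = (x_1 + ... + x_N) / N every sensitivity coefficient df/dx_i
# is 1/N, so u_c = sqrt(sum(u_i^2)) / N.
def combined_uncertainty_of_mean(u):
    u = np.asarray(u, dtype=float)
    return float(np.sqrt(np.sum((u / u.size) ** 2)))

# Four inputs, each with standard uncertainty 0.5 -> u_c = 0.25
u_c = combined_uncertainty_of_mean([0.5, 0.5, 0.5, 0.5])
print(u_c)
```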

Carlo, Monte
Reply to  bdgwx
August 23, 2022 2:19 pm

Nice whineage, and glad to have made it into your list-o-hate (although next time I need to be ranked much higher than LAST).

Fail.

Reply to  bdgwx
August 23, 2022 2:45 pm

Thanks for the link. I’m afraid I haven’t been paying as much attention to your side of the discussion as I should.

At a cursory glance this volume of the GUM does seem a lot clearer about what’s allowed. I notice they mention Bayesian models, and suggest you can talk about a probability interval for the measurand, something Carlo keeps objecting to.

Carlo, Monte
Reply to  Bellman
August 23, 2022 4:28 pm

and suggest you can talk about a probability interval for the measurand, something Carlo keeps objecting to.

Idiot! You still don’t understand (and never will), as witness this latest gem of extrapolation.

Reply to  Carlo, Monte
August 23, 2022 4:34 pm

These guys have no idea what they are reading. It is sad.

As I asked, do they think they’ve discovered something new that other climate mathematicians and climate scientists don’t know about. Maybe so, and a Nobel prize is in the offing!

Reply to  bdgwx
August 23, 2022 4:22 pm

Boy, you guys are great at cherry picking. Read this:

“11.7 Models for time series 11.7.1 Observations made repeatedly of the same phenomenon or object, over time or along a transect in space, often equispaced in time or equidistant in space, tend to be interrelated, the more so the greater their proximity. The models used to study such series of observations involve correlated random variables [23, 134].” (bold by me)

Read the next section too.

Now ask yourself why all the mathematicians and climate scientists, who must be very educated based on your remarks, have not used ARIMA models to determine a GAT.

Let us know why this hasn’t been used.

Let us know how New Delhi and Toronto are interrelated. I want to see your answer.

As to your reference, notice the need for a priori information. Have you figured out yet what the standard deviation for GAT is?

Reply to  Jim Gorman
August 23, 2022 4:37 pm

Now ask yourself why all the mathematicians and climate scientists, who must be very educated based on your remarks, have not used ARIMA models to determine a GAT.

ARIMA models are used for auto-correlated time series. I’ve repeatedly said you need to correct the standard error for auto correlation. This is done in, say, the Skeptical Science Trend Calculator.
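
One common form of that correction can be sketched as follows. I am assuming the usual AR(1) adjustment via an effective sample size, n_eff = n(1 − r₁)/(1 + r₁); this is my own illustration, not the Skeptical Science calculator’s actual code:

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a series."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

# Simulate an AR(1) series with coefficient 0.6 so the correction matters.
rng = np.random.default_rng(0)
n = 500
noise = rng.normal(size=n)
series = np.empty(n)
series[0] = noise[0]
for t in range(1, n):
    series[t] = 0.6 * series[t - 1] + noise[t]

r1 = lag1_autocorr(series)
n_eff = n * (1 - r1) / (1 + r1)   # effective sample size
# A standard error computed assuming independence would then be
# inflated by roughly sqrt(n / n_eff).
inflation = np.sqrt(n / n_eff)
print(r1, n_eff, inflation)
```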

Reply to  Bellman
August 24, 2022 7:07 am

“I jokingly asked you what the correct plural of measurand is, and you are just showing me what the singular is.”

The plural is irrelevant. The GUM only speaks to averaging measurements of the same thing, the same measurand (singular)!

“A quick search of the GUM shows the plural occurring 16 times.”

But it *NEVER* mentions averaging multiple measurands. It only mentions using multiple measurands to calculate a functional relationship. The value of each of those measurands stands alone – no averaging.

Reply to  Tim Gorman
August 24, 2022 3:06 pm

You seem to think that if the GUM doesn’t mention something, then it isn’t possible. I assume the authors assume their readers are intelligent enough to know how to apply the methods to any operation.

Reply to  Bellman
August 25, 2022 5:38 am

You can’t apply the same process to multiple measurements of different things as you can to multiple measurements of the same thing.

Only *YOU* and bdgwx believe that you can! It requires each of you to even ignore the statistics rules that you so adore!

Reply to  Tim Gorman
August 25, 2022 7:31 am

It’s me, bdgwx, and just about every statistician over the last few hundred years who thinks you can.

The maths is the same. You are just averaging random variables. The variance is the same regardless of whether the random variables are the result of random errors when measuring the same thing repeatedly, or taking samples from a random variable defined by a population; the maths works exactly the same way.

Reply to  Bellman
August 25, 2022 1:20 pm

“The variance is the same regardless of whether the random variables are the result of random errors when measuring the same thing repeatedly, or taking samples from a random variable defined by a population; the maths works exactly the same way.”

No, the math for uncertainty does *NOT* work the same for temperature. Multiple measurements of the same thing are all related. Their average and standard deviation give a high expectation for what the next measurement will be. That is simply *not* the case where the measurements are totally random and unrelated. The average and standard deviation are not even proper statistical descriptors for such a population because it is more than likely to *NOT* be a normal distribution or even an identically distributed distribution. Samples of such a population will *NOT* help. While the deviation of the sample means from a non-normal population may converge on a value, there is nothing that says that average either actually exists or is meaningful.

Now you are showing your total ignorance of statistical principles. There is a *reason* why the 5-number statistical description exists – it is for describing non-normal populations. And it does *NOT* include either the mean or the standard deviation.

Reply to  Tim Gorman
August 25, 2022 5:15 pm

No, the math for uncertainty does *NOT* work the same for temperature.

Firstly, yes it does. And secondly, here you go moving the goalposts again. Your statement was not about temperatures specifically, but measuring “different things”.

Multiple measurements of the same thing are all related. Their average and standard deviation give a high expectation for what the next measurement will be. That is simply *not* the case where the measurements are totally random and unrelated.

No idea why you think this matters. The standard deviation of different things from a population will probably be bigger than the SD from measurement errors of the same thing. It matters not to the mathematics.

The average and standard deviation are not even proper statistical descriptors for such a population because it is more than likely to *NOT* be a normal distribution or even an identically distributed distribution.

And there’s your religious belief that statistics only works with normal distributions. Have you actually read the books you keep going on about? I’m pretty sure some of them explain how to calculate the mean and standard deviation for different distributions.

Also, I see you still don’t understand what identically distributed means. If you are taking truly random values from a population distribution they will be identically distributed.

Also, there’s no reason to suppose measurement errors are normally distributed.

While the deviation of the sample means from a non-normal population may converge on a value, there is nothing that says that average either actually exists or is meaningful.

And you still haven’t come to terms with the idea that an average does not have to “actually exist” to be meaningful or useful. Consider a binomial distribution. It’s based on random samples from a Bernoulli process. Values are either 0 or 1. The mean of the binomial distribution is unlikely to be 0 or 1, but it will tell you something about the Bernoulli process.
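
The binomial example above can be made concrete in a few lines (my sketch, with an arbitrarily chosen p):

```python
import numpy as np

# Draws from a Bernoulli process are only ever 0 or 1, yet their mean
# estimates p, a value no single draw can take.
rng = np.random.default_rng(1)
p = 0.3
draws = rng.binomial(1, p, size=10_000)

assert draws.min() >= 0 and draws.max() <= 1  # every outcome is 0 or 1
mean = draws.mean()
print(mean)
```

The printed mean is close to 0.3 even though no individual draw can “actually be” 0.3 — the sense in which an average need not exist in the data to be informative.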

Reply to  Bellman
August 26, 2022 9:00 am

“Firstly, yes it does. And secondly, here you go moving the goalposts again. Your statement was not about temperatures specifically, but measuring “different things”.”

Temperature measurements at different times and different places *ARE* different things!

“No idea why you think this matters. The standard deviation of different things from a population will probably be bigger than the SD from measurement errors of the same thing. It matters not to the mathematics.”

But a distribution of measurements of different things can’t be assumed to be normal. If it isn’t normal then why is standard deviation a proper statistical descriptor to use? All my textbooks say you should use the 5-number descriptor: minimum, first quartile, median, third quartile, and maximum. I don’t see standard deviation or mean anywhere in there!

And of course if the standard deviation of one population is larger then it *does* matter.

“And there’s your religious belief that statistics only works with normal distributions.”

Standard deviations are only applicable to distributions where the components are independent and identically distributed. That is usually mentioned in association with normal distributions. It could also apply to rectangular, triangular, u-shaped, and others. It does *NOT* apply to distributions that are not independent or are not identically distributed – which include temperature measurements of different things!

“Also, I see you still don’t understand what identically distributed means. If you are taking truly random values from a population distribution they will be identically distributed.”

Here is an example:

————————————————————-
“As a second example of a non-iid sample, suppose we observe daily rainfall amounts at 50 locations in Texas, for every day during May of 2019. For several reasons, it is not clear what the population should be here. First, note that there will only ever be one May of 2019, and that is the sample. To construct a population that this sample may generalize to, imagine that we are actually interested in rainfall throughout Texas, but we only have the ability to collect data at 50 locations. Also, suppose that we are actually interested in the rainfall in Texas in May of any year, not only in May of 2019. Specifically, suppose that our idealized population involves choosing a random location in Texas, and a random day within May from the past 100 years, and obtaining the rainfall on the given day and at the given location. Does our sample, consisting of 50 arbitrarily chosen locations that are then observed for every day in May of 2019 represent this target population in an iid manner? Almost certainly it does not. First, May of 2019 may have been an unusually wet or dry year compared to other years in the population. Second, some of the locations where we collected data may have been so close together that if it is raining in one location it is almost certain to be raining in the other (this is sometimes called “twinning”). If we have strong or perfect twinning, only one meaningful unit of information is provided by the two twinned locations.”
————————————————-

Temperature measurements are no different than the rainfall in the example. Sampling arbitrarily chosen locations, even at random, doesn’t represent the population in an iid manner.

If all the temperature data is considered to be a population instead of a sample the same thing applies, it won’t be an iid population.

Here is another:
————————————————————
“Suppose you have a learning module for predicting the click-thru-rate of online-ads, the distribution of query terms coming from the users are changing during the year dependent on seasonal trending. The query terms in summer and in Christmas season should have different distribution.”
———————————————————–

This is no different than temperatures taken during the summer and winter. They will have different distributions and their combination will not be identically distributed.

Conceptually, each temperature measurement is a random variable all of its own with a variance of v. So each measurement represents its own subset. Since each of the variables will have its own variance the subsets are not the same and no sample can therefore be identically distributed.

*YOU* need to decide if temperature measurements are samples or if they are a population. You switch back and forth between the two with no regard for the difference.

“And you still haven’t come to terms with the idea that an average does not have to “actually exist” to be meaningful or useful.”

By definition an average that doesn’t exist isn’t useful or meaningful in reality. You can’t roll a 3.5 on a six sided die. The average of a 6′ board and an 8′ board doesn’t exist if those are the only lengths you have in your pile.

“Consider a binomial distribution. It’s based on random samples from a Bernoulli process. Values are either 0 or 1. The mean of the binomial distribution is unlikely to be 0 or 1, but it will tell you something about the Bernoulli process.”

You can’t even get this one correct. A Bernoulli process or a binomial distribution is a DISCRETE probability distribution. No uncertainty in the outcomes! Therefore no uncertainty associated with the data population.

Leave it to a statistician that was *never* trained in uncertainty to think that multiple measurements of different things with different uncertainties is similar to a binomial distribution.

BTW, how *do* you roll a .5 on a two-sided coin? The .5 is the probability of getting one or the other, it is *NOT* an average of the rolls!
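
For reference, the 5-number summary invoked above is simple to compute; here is my own sketch with made-up data:

```python
import numpy as np

# Five-number summary: minimum, first quartile, median, third
# quartile, maximum -- the descriptor suggested for non-normal data.
data = np.array([1.2, 3.4, 2.2, 5.6, 4.1, 2.9, 3.3, 6.0])
five_num = np.percentile(data, [0, 25, 50, 75, 100])
print(five_num)
```

Note that nothing stops one from also computing the mean and standard deviation of such data; the dispute above is over whether those numbers are good descriptors, not whether they exist.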

Reply to  Tim Gorman
August 28, 2022 3:30 pm

Not going through all this at this late stage, but

You can’t even get this one correct. A Bernoulli process or a binomial distribution is a DISCRETE probability distribution. No uncertainty in the outcomes! Therefore no uncertainty associated with the data population.

Of course there’s uncertainty in a Bernoulli process. It’s random. You just don’t know if the next number will be 0 or 1. You still haven’t understood that uncertainty is not just about measurement error, it’s about any random process.

Leave it to a statistician that was *never* trained in uncertainty to think that multiple measurements of different things with different uncertainties is similar to a binomial distribution.

Leave it to someone who was never trained in logic to leap to such a weird strawman. You can see exactly what I was replying to and why I introduced this example. It was this from you

While the deviation of the sample means from a non-normal population may converge on a value, there is nothing that says that average either actually exists or is meaningful.

The average of a binomial distribution does not “exist” in the process, but is very useful.

Reply to  Bellman
August 28, 2022 3:46 pm

“Of course there’s uncertainty in a Bernoulli process. It’s random.”

Give me a break! There is no uncertainty in the possible outcomes! Is there any uncertainty in what a six-sided die shows on top after a roll?

“You still haven’t understood that uncertainty is not just about measurement error, it’s about any random process.”

Malarky! Probability isn’t MEASUREMENT uncertainty! The topic of every thread on WUWT is about measuring temperature!

This is just one more use of the argumentative fallacy of Equivocation that you are so enamored of!

“You can see exactly what I was replying to and why I introduced this example. It was this from you”

The only strawman here is you trying to equate a Bernoulli process with measurement uncertainty!

“The average of a binomial distribution does not “exist” in the process, but is very useful.”

Not when it comes to physical measurements with uncertainty.

Reply to  Tim Gorman
August 29, 2022 6:30 pm

I was pointing out why you were wrong when you said

While the deviation of the sample means from a non-normal population may converge on a value, there is nothing that says that average either actually exists or is meaningful.

Nothing there about measurement uncertainty.

Reply to  Bellman
August 30, 2022 5:32 am

The standard deviation of the sample means is not measurement uncertainty.

Reply to  Bellman
August 26, 2022 4:33 am

Listen to this presentation. Pay careful attention to the later part that discusses the distribution of the samples. He shows that they are all the same and therefore the SEM applies equally to all. This is known as IID, Independent and Identically Distributed samples.

Have you checked to see if each station random variable is IID all through all the averaging from daily to monthly to annual to global? If they are not, do you know how to compensate for this?

Reply to  Jim Gorman
August 26, 2022 12:19 pm

you forgot the link

Reply to  Tim Gorman
August 26, 2022 12:28 pm
Reply to  Jim Gorman
August 26, 2022 12:28 pm
Reply to  Bellman
August 22, 2022 11:38 am

The word “measurand” is *NOT* hard to understand. It’s singular not plural. And that word underlies *everything* in the GUM.

If only you and bdgwx could stop trying to put an “s” on the end of measurand.

bdgwx
Reply to  Tim Gorman
August 22, 2022 12:59 pm

TG said: “If only you and bdgwx could stop trying to put an “s” on the end of measurand.”

The “s” at the end of measurands comes from the GUM…literally. See section 4.1.2 regarding the function f which combines measurands.

The input quantities X1, X2, …, XN upon which the output quantity Y depends may themselves be viewed as measurands and may themselves depend on other quantities, including corrections and correction factors for systematic effects, thereby leading to a complicated functional relationship f that may never be written down explicitly.

Reply to  bdgwx
August 22, 2022 2:51 pm

“The “s” at the end of measurands comes from the GUM…literally. See section 4.1.2 regarding the function f which combines measurands.”

I gave you the quote that shows MEASURAND!

You then skipped to the section that is calculating a functional relationship using multiple factors.

Each of those multiple factors is a separate measurand. Each separate measurand gets handled in the fashion described for Eq 1. You determine the uncertainty of each factor separately using Eq 10! Those separate uncertainties get propagated exactly as Taylor explains in Rule 3.18!

You can argue all you want that independent, random objects can be combined into a data set and treated the same way as multiple measurements of the same thing but it is just a delusional stance to take. That is *NOT* what the Gum says.
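The propagation rule referenced above (GUM Eq. 10, i.e. Taylor's quadrature rule for independent inputs) can be sketched as follows; the rectangle-area measurand and all of the numbers are invented for illustration:

```python
import math

# Sketch of combined standard uncertainty per GUM Eq. 10 for a function
# of independent inputs. The measurand y = f(L, W) = L * W and all
# values are invented for illustration.
L_len, u_L = 2.00, 0.01  # length in metres, standard uncertainty
W_len, u_W = 3.00, 0.02  # width in metres, standard uncertainty

# Sensitivity coefficients: df/dL = W, df/dW = L.
# Combined standard uncertainty: u_c^2 = (df/dL * u_L)^2 + (df/dW * u_W)^2
area = L_len * W_len
u_area = math.sqrt((W_len * u_L) ** 2 + (L_len * u_W) ** 2)

print(area, round(u_area, 3))  # 6.0 0.05
```

Each input here is its own measurand with its own standard uncertainty, which is the reading of 4.1.2 both sides are appealing to.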

bdgwx
Reply to  Tim Gorman
August 22, 2022 4:55 pm

TG said: “I gave you the quote that shows MEASURAND!”

Which is in reference to the output of the function. Your quote also comes from JCGM 6:2020, not JCGM 100:2008. Note that JCGM 100:2008 is the canonical GUM document we’ve been discussing all along.

TG said: “You can argue all you want that independent, random objects can be combined into a data set and treated the same way as multiple measurements of the same thing but it is just a delusional stance to take. That is *NOT* what the Gum says.”

I can argue that because it is what the GUM (both the canonical JCGM 100:2008 and the measurement model document JCGM 6:2020) says.

BTW…JCGM 6:2020 not only discusses averaging temperature, but it does so in the context of abating the extent of the uncertainty. I don’t think you’re going to like JCGM 6:2020 any more than JCGM 100:2008.

Reply to  Tim Gorman
August 22, 2022 2:23 pm

You’re arguing with the voices in your head again. I don’t think I’ve used the word “measurand” throughout these comments.

Out of interest, what is the correct plural of measurand then?

Carlo, Monte
Reply to  Bellman
August 22, 2022 3:30 pm

You’re arguing with the voices in your head again.

You’re a liar.

Reply to  Tim Gorman
August 22, 2022 11:04 am

Then why do you call it “standard error” if it isn’t an error!

Why do you call it “standard deviation” if it isn’t a deviation?

Standard error of the mean *IS* the standard deviation of the sample means!

The trouble is that “standard deviation” is usually considered a descriptive statistic, and the standard error of the mean is not a descriptive statistic.

The problem is that the term “error” is so ambiguous.

So is “deviation”. So is “uncertainty”.

Carlo, Monte
Reply to  Bellman
August 22, 2022 11:19 am

So is “uncertainty”.

Liar, only to pseudoscience frauds such as yourself.

Reply to  Carlo, Monte
August 19, 2022 1:41 pm

“It is YOU who has the failure problem:
“standard error” =/= measurement uncertainty
“standard error” =/= “uncertainty of the trend line””

This is such a *simple* concept I don’t know why anyone has a problem accepting it!

If bellman was a mule I’d be looking for the old 2×4 in the barn!

Carlo, Monte
Reply to  Tim Gorman
August 19, 2022 3:08 pm

So far applying the proverbial “clue by four” has been an exercise in futility.

Reply to  Carlo, Monte
August 20, 2022 6:56 am

I haven’t heard that one for a LONG time! You are giving away your age!

Carlo, Monte
Reply to  Tim Gorman
August 20, 2022 7:09 am

Heh. Guilty as charged officer.

bdgwx
Reply to  Jim Gorman
August 17, 2022 2:48 pm

So for variable X to be a control knob for variable Y it is mandatory for Y to move in lock-step with X?

What if variable Y does not exist like would be the case if you believed the average of an intensive property does not exist?

Reply to  bdgwx
August 17, 2022 3:23 pm

Look at the graph you posted that has with CO2 and without CO2. With that graph do you really want to ask if CO2 is not the control knob?

bdgwx
Reply to  Jim Gorman
August 17, 2022 6:55 pm

JG said: “With that graph do you really want to ask if CO2 is not the control knob?”

No. I’m not asking that. Science figured out that CO2 is not THE control knob back in the 1800s. I have no interest in wasting time on a hypothesis that was long ago falsified.

Reply to  bdgwx
August 18, 2022 7:53 am

‘Science figured out that CO2 is not THE control knob back the 1800’s.’

You might want to pass this along to the IPCC et al.

bdgwx
Reply to  Frank from NoVA
August 18, 2022 8:35 am

Frank from NoVA said: “You might want to pass this along to the IPCC et al.”

No need. The IPCC is already aware of it. A significant percentage of their assessment reports is dedicated to summarizing the current understanding of all the factors in play and in what proportions they contribute to temperature changes.

Reply to  bdgwx
August 18, 2022 11:40 am

‘The IPCC is already aware of it.’

Good to know. How about Messieurs Lacis and Schmidt?

Carlo, Monte
Reply to  bdgwx
August 18, 2022 3:31 pm

This is propaganda, the IPCC just handwaves paragraph after paragraph about “uncertainties”.

If they think they know these alleged proportions, then they are even more confused than you are.

Reply to  bdgwx
August 17, 2022 4:35 pm

If CO2 is not *the* control knob then why should we believe the climate models which are so dependent on CO2 being *the* control knob?

What other control knobs could there be? Clouds? The climate models do a terrible job with those – they don’t even have the data to do curve fitting to the clouds!

Water vapor and humidity? The CAGW advocates oscillate between claiming global WV is going up and is going down. The climate models don’t even seem to integrate global greening into their calculations.

CoM’s method shows that there *are* other control knobs that have a larger impact than CO2 – yet ALL we ever hear about from the IPCC and national governments is how we need to cut CO2.

Mr.
Reply to  Tim Gorman
August 17, 2022 5:38 pm

IPCC was set up as a one-trick donkey from the get-go.

Why anyone would take any notice of anything they say beggars rational minds.

bdgwx
Reply to  Tim Gorman
August 17, 2022 6:53 pm

TG said: “If CO2 is not *the* control knob then why should we believe the climate models which are so dependent on CO2 being *the* control knob?”

I’m not aware of even a single CMIP model that is dependent on CO2 being THE control knob. Can you post a link to literature showing one of the CMIP models that predicts the global average temperature moving in lock-step with CO2 and only CO2? I’d like to review the material and verify your claim if you don’t mind.

Reply to  bdgwx
August 18, 2022 5:44 am

Exactly what are the different CO2 projections supposed to mean if CO2 is not the control knob?

Carlo, Monte
Reply to  bdgwx
August 18, 2022 6:06 am

Go read Pat Frank’s paper, mr. disingenuous.

Reply to  bdgwx
August 18, 2022 3:35 pm

Wasn’t it you, or maybe bellman that pointed out the correlation between CO2 rise and the climate model predictions?

Reply to  TheFinalNail
August 17, 2022 9:26 am

Monckton of Baloney keeps talking about a short term trend which starts near the large El Nino heat release in late 2015 and early 2016. What he fails to do is to present the context of the long term uptrend from 1979. He also fails to persuade people that a short term trend HAS ANY ABILITY TO PREDICT THE CLIMATE THAT FOLLOWS IT.

A flat trend for seven years could be a transition away from the prior warming trend since 1975, or just a random variation in a complex climate system with many variables affecting the temperature. Monckton implies “his” trend is important by focusing on it, without knowing if that is true. I say he is data mining with a bias — avoiding any mention of the 43 year UAH trend.

Reply to  Kip Hansen
August 17, 2022 11:30 am

Yet, many here claim that what he is really trying to show is that CO2 does not control temperature.

I think writing an essay every month, just to inform people how long the UAH trend has been flat is rather pointless, especially if you make no mention of the confidence in that flatness.

What Monckton actually claims is that the purpose is to illustrate to people who wouldn’t otherwise understand what a trend is, that the UAH trend is lower than predicted.

Reply to  Bellman
August 17, 2022 4:04 pm

What you are really yapping about is the uncertainty in THE FUTURE TEMPERATURES. You can critique it all you want, but only the future will determine what is right or wrong. Wait your turn and you may be able to say “I TOLD YOU SO”!

Reply to  Jim Gorman
August 17, 2022 5:16 pm

What has the pause got to do with future temperatures?

Geoff Sherrington
Reply to  Bellman
August 18, 2022 3:47 am

The pause MIGHT indicate a change from a warming trend to a cooling trend of longer duration as Lord Monckton wrote.
Also, it invokes a need for a detailed mechanism to explain the pause when CO2 continues its steady upward trend. Any ideas about this disconnect?
Geoff S

bdgwx
Reply to  Geoff Sherrington
August 18, 2022 4:19 am

ENSO and AMO can explain almost all of the variability in the UAH TLT anomalies. The model T = 1.7*log2(CO2) + 0.12*ENSOlag5 + 0.16*AMOlag3 – 5.0*AODvolcanic has an RMSE of 0.13 C since 1979. Note that UAH’s claimed standard uncertainty is 0.10 C. Anyway, the heat flux from the ocean is an ideal candidate as the dominant mechanism for the pause. Most of us who have modeled the UAH TLT anomalies using CO2 as one of the forcing mechanisms expect the pause to continue for several more months at least.
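The model stated above can be written out directly. The coefficients are the ones given in the comment; the intercept and the example inputs below are invented placeholders, not values from the thread:

```python
import math

# The regression model as stated in the comment. Coefficients are from
# the comment; the intercept and the example inputs are invented
# placeholders for illustration.
def model_anomaly(co2_ppm, enso_lag5, amo_lag3, aod_volcanic, intercept=0.0):
    return (1.7 * math.log2(co2_ppm)
            + 0.12 * enso_lag5
            + 0.16 * amo_lag3
            - 5.0 * aod_volcanic
            + intercept)

# With the other inputs held fixed, doubling CO2 adds 1.7 C in this model:
t1 = model_anomaly(280, 0.0, 0.0, 0.0)
t2 = model_anomaly(560, 0.0, 0.0, 0.0)
print(round(t2 - t1, 6))  # 1.7
```

Note the log2 term makes 1.7 the implied sensitivity per CO2 doubling in this particular fit, which is part of what the subsequent replies dispute.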

Carlo, Monte
Reply to  bdgwx
August 18, 2022 6:07 am

Note that UAH’s claimed standard uncertainty is 0.10 C.

Still parroting this milli-Kelvin nonsense.

bdgwx
Reply to  Carlo, Monte
August 18, 2022 6:46 am

bdgwx said: Note that UAH’s claimed standard uncertainty is 0.10 C.

CM said: Still parroting this milli-Kelvin nonsense?

0.10 C is 100 milli-Kelvins. The formula is 1 mK = 1 K / 1000.

In other words 0.10 C is not the same thing as a milli-Kelvin. It is literally 2 orders of magnitude larger.
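The unit arithmetic at issue, spelled out (nothing here beyond the conversion already stated: 1 mK = 1 K / 1000, and a temperature interval of 0.10 C equals 0.10 K):

```python
# The conversion under dispute: a 0.10 C interval is 0.10 K, which is
# 100 mK, not 1 mK.
uncertainty_K = 0.10
uncertainty_mK = uncertainty_K * 1000  # 100 mK

print(round(uncertainty_mK))  # 100
```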

Carlo, Monte
Reply to  bdgwx
August 18, 2022 6:55 am

The formula is 1 mK = 1 K / 1000.

Pin a bright shiny star on your chest!

In other words 0.10 C is not the same thing as a milli-Kelvin. It is literally 2 orders of magnitude larger.

Hey mr. disingenuous, where did I equate 1 with 100?

NOWHERE.

But using milli-Kelvins never ceases to amuse with all the noise it generates.

bdgwx
Reply to  Carlo, Monte
August 18, 2022 7:25 am

CM said: “Hey mr. disingenous, where did I equate 1 with 100?”

I said: Note that UAH’s claimed standard uncertainty is 0.10 C.

You said: Still parroting this milli-Kelvin nonsense.

CM said: But using milli-Kelvins never ceases to amuse with all the noise it generates.

0.10 C still does not equal a milli-Kelvin. I don’t know how to make that any more clear. You’re either going to understand it or not. I don’t have the time or motivation at this point to continue to defend the formula 1 mK = 1 K / 1000 and try to explain it to you further.

Carlo, Monte
Reply to  bdgwx
August 18, 2022 7:41 am

Try again, mr. disingenuous, YOU STILL IGNORE THE POINT about these impossibly small “uncertainty” numbers.

You remain sans even one clue about the subject.

Reply to  Carlo, Monte
August 18, 2022 9:11 am

And exactly how did they reach this “standard uncertainty”?

From NIST:

———————————————————
Standard Uncertainty and Relative Standard Uncertainty
Definitions
The standard uncertainty u(y) of a measurement result is the estimated standard deviation of y.

  • The relative standard uncertainty ur(y) of a measurement result is defined by ur(y) = u(y)/|y|, where y is not equal to 0.

Meaning of uncertainty
If the probability distribution characterized by the measurement result and its standard uncertainty u(y) is approximately normal (Gaussian), and u(y) is a reliable estimate of the standard deviation of y, then the interval y – u(y) to y + u(y) is expected to encompass approximately 68 % of the distribution of values that could reasonably be attributed to the value of the quantity Y of which y is an estimate. This implies that it is believed with an approximate level of confidence of 68 % that Y is greater than or equal to y – u(y), and is less than or equal to y + u(y), which is commonly written as Y = y ± u(y).
———————————————————-

Once again we see “standard uncertainty” defined in the terms of a normal distribution. This only happens if you are measuring the same thing multiple times, preferably using the same measurement device.

Please explain how multiple measurements of different things can result in a normal distribution.

I’m not going to hold my breath waiting. You’ve never provided an explanation of this before and I highly doubt you will this time.
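The NIST definitions quoted above can be sketched numerically. The repeat readings below are invented, and taking u(y) as the standard deviation of the mean is one common case (repeated measurements of one quantity), not the only possibility:

```python
import statistics

# Numerical sketch of the quoted NIST definitions, using invented
# repeat readings of a single quantity with one instrument.
readings = [20.03, 19.98, 20.01, 20.05, 19.97, 20.02, 19.99, 20.04]

y = statistics.mean(readings)                            # measurement result
u_y = statistics.stdev(readings) / len(readings) ** 0.5  # standard uncertainty u(y)
ur_y = u_y / abs(y)                                      # relative standard uncertainty ur(y)

# The 68 % interval from the quote: y - u(y) to y + u(y)
lo, hi = y - u_y, y + u_y
print(round(y, 4), round(u_y, 4), round(ur_y, 6))
```

Whether this machinery transfers to single measurements of many different things is precisely the question being argued here; the sketch only illustrates the definitions as NIST states them.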

Carlo, Monte
Reply to  Tim Gorman
August 18, 2022 3:42 pm

And exactly how did they reach this “standard uncertainty”?

I have no idea, the only analysis I’ve seen is a comparison of “trends” between UAH and radiosondes. Nothing about U(T).

Reply to  bdgwx
August 18, 2022 6:26 am

You may have something. You do realize that models don’t do a good job with oceans and currents.

Reply to  bdgwx
August 18, 2022 9:06 am

Then why doesn’t the IPCC have a scenario for ENSO and AMO changes instead of only GHG’s?

Reply to  Kip Hansen
August 17, 2022 1:45 pm

An article about a short term trend can persuade people that such a trend is important. It is especially biased when the longer term trend from 1979, using the same dataset, is not mentioned.

Reply to  Richard Greene
August 17, 2022 5:13 pm

Unless you have a deterministic, invariant, functional relationship between CO2 and temperature, then you simply can’t weight past data as heavily as recent data. No one does that, not the owner of the corner hardware store trying to forecast how many shovels to stock, a telephone company siting a new central office, a long term investor in the stock market, or any of a myriad other things. What happened yesterday or a week ago is a far better predictor than what happened 40 years ago.

That’s why short term, recent trends *must* be tracked as well as long term trends.

Past performance is not a predictor of future performance. Buyer beware.

Tom Abbott
Reply to  Richard Greene
August 18, 2022 6:00 am

What about that cooling trend from the Early Twentieth Century to 1979? Shouldn’t that part of the trend be included?

Remember: Unmodified, regional temperature charts from all around the world show it was just as warm in the Early Twentieth Century as it is today, so the period from 1979 to the present is just one part of the cycle, and the present is not any more significant temperature-wise than in the past.

The temperatures warm for a few decades and then they cool for a few decades and then they warm again for a few decades with the highs and the lows staying within certain bounds, at least since the Little Ice Age.

Choosing only 1979 to the present is cherry-picking and distorts reality.

Here’s the U.S. temperature profile:

[chart: U.S. temperature profile]

And here are about 300 charts from around the world that show similar temperature profiles to the U.S. profile, where it was just as warm in the Early Twentieth Century as it is today.

http://notrickszone.com/2017/06/16/almost-300-graphs-undermine-claims-of-unprecedented-global-scale-modern-warmth/#sthash.neDvp33z.hWRS8nJ5.dpbs

bdgwx
Reply to  Tom Abbott
August 18, 2022 6:43 am

Tom, how did the time-of-observation bias, instrument/shelter change bias, etc. affect that graph?

Carlo, Monte
Reply to  bdgwx
August 18, 2022 6:56 am

The Adjustor Fraud Crew wants to take over here…

Reply to  bdgwx
August 19, 2022 10:24 am

The difference between the 30’s temps and the 2000 temps is more than 0.5C. Propagated measurement uncertainties could easily make that either 0C or 1C.

I thought you claimed that annual means would hide any uncertainties from tob bias, instrument/shelter change, bias, etc since there would be so many data points in the record?

Are you now saying that the number of observing stations does *not* smooth out biases?

bdgwx
Reply to  Tim Gorman
August 19, 2022 12:27 pm

TG said: “I thought you claimed that annual means would hide any uncertainties from tob bias, instrument/shelter change, bias, etc since there would be so many data points in the record?”

I didn’t say that.

TG said: “Are you now saying that the number of observing stations does *not* smooth out biases?”

I’ve always been saying that systematic biases like the time-of-observation bias, instrument/shelter change bias, etc. bias the aggregation of station observations into spatial averages. I’m in the minority here on this one since most of the other WUWT participants believe systematic biases like these either do not exist or can be safely ignored.

Carlo, Monte
Reply to  bdgwx
August 19, 2022 3:12 pm

since most of the other WUWT participants believe systematic biases like these either do not exist or can be safely ignored.

Another strawman—you believe “adjustments” to “systematic biases” are something other than fraudulent data manipulation. From here you have somehow extrapolated to this ridiculous statement.

Reply to  bdgwx
August 19, 2022 3:24 pm

“I didn’t say that.”

Then you think averaging doesn’t hide uncertainties? Pick one and stick with it!

“I’ve always been saying that systematic biases like the time-of-observation bias, instrument/shelter change bias, etc. bias the aggregation of station observations into spatial averages.”

And yet you are unwilling to admit that those uncertainties are large enough in an average of numerous independent measurement devices to mask any kind of a difference?

Or are you now willing to admit that the uncertainties add (perhaps directly or by root-sum-square) up to a level that it overwhelms the average of anomalies?
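The two addition rules mentioned above (direct summation vs root-sum-square) diverge sharply as the number of terms grows; the per-station value of 0.5 C and the station count below are invented for illustration:

```python
import math

# Direct (worst-case) summation vs root-sum-square combination of
# per-station uncertainties. The 0.5 C value and the count of 100
# stations are invented for illustration.
u_stations = [0.5] * 100  # 100 stations, 0.5 C standard uncertainty each

direct_sum = sum(u_stations)                      # grows like n
rss = math.sqrt(sum(u ** 2 for u in u_stations))  # grows like sqrt(n)

print(direct_sum, rss)  # 50.0 5.0
```

Which rule is appropriate (and whether either shrinks further when averaging) depends on whether the errors are independent and random or systematic, which is the unresolved point of this exchange.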

Dave Andrews
Reply to  Tom Abbott
August 18, 2022 7:48 am

Physical proof of early 20th century warming is that the coalport in Spitsbergen (Svalbard) was only accessible for three months in the years before 1920 but by the late 1930s was accessible for over seven months of the year.

Lamb ‘Climate, History and the Modern World’

Reply to  Kip Hansen
August 17, 2022 9:03 am

“…like "It hasn’t rained in three months and four days." This type of statement ends when the rains come (or the trend changes).”

A difference is that the last day it rained has a fixed objective date. You know the last day it rained the day after. The start date doesn’t keep changing like the pause start date does.

Another difference is that, if the question is when was the date warming stopped, you can easily see that warming didn’t stop in October 2014 or whenever, the warming trend has continued since then and if anything grown stronger.

Derg
Reply to  Bellman
August 17, 2022 2:01 pm

Lol…CO2 keeps rising and rising while sea levels slowly rise at the same pace they always have.

Reply to  Bellman
August 17, 2022 5:16 pm

When you are measuring the length of a drought when do you start? In the present and work backwards? Or do you cherry pick a start date?

Reply to  Tim Gorman
August 17, 2022 5:42 pm

The first thing I’d probably look for is an indication that rainfall had decreased. If I found that rainfall had been increasing over the period, I probably wouldn’t claim it was a drought.

If there were signs of a drought, I would try to find an exact starting point where it started. Droughts only start after months of low rainfall; deciding when to declare a drought depends on what water levels are like, and that can have multiple causes.

I’m not sure what the point would be of looking back until you found some point where you could claim the drought started, unless you could identify a specific change that caused it.

Reply to  Bellman
August 18, 2022 6:00 am

“If there were signs of a drought, I would try to find an exact starting point where it started.” (bolding mine, tg)

ROFL! Exactly what CoM does! Yet you keep claiming he is cherry picking his start point!

“multiple causes”

And temperature changes don’t have multiple causes? Especially when changing from increasing to stagnant?

“unless you could identify a specific change that caused it.”

How many climate scientists are trying to identify specific changes that have caused the past pause and the current one – other than just dismissing them out of hand?

Reply to  Tim Gorman
August 18, 2022 7:54 am

Arg! My mistake! Obviously I meant to say, I wouldn’t try to find an exact starting point. I think the rest of my comment makes that clear.

Reply to  Bellman
August 18, 2022 5:01 pm

So you would try to find an INEXACT starting date?

What’s the difference?

Reply to  Tim Gorman
August 18, 2022 5:19 pm

It’s a daft analogy, and I’ve still no idea what you are trying to prove. I do not know how you would determine the start date of a drought. Actually I do, it would be whenever the appropriate services declared a drought, but I suspect that’s not what you are asking.

How would you determine the exact date a drought “started”?

Reply to  Bellman
August 19, 2022 8:18 am

“It’s a daft analogy, and I’ve still no idea what you are trying to prove. I do not know how you would determine the start date of a drought. Actually I do, it would be whenever the appropriate services declared a drought, but I suspect that’s not what you are asking.
How would you determine the exact date a drought “started”?”

You are dodging the issue. I figured that’s what would happen. You said you would FIND the start date. Yet you call that “cherry picking” when CoM does it.

ROFL!!!

Old Cocky
Reply to  Tim Gorman
August 18, 2022 1:03 am

Deciding when a drought started is rather arbitrary and depends on where and when you are – there isn’t a consistent definition of drought.

One definition is that you are in drought after 3 months without effective rainfall, where effective rainfall is approximately the measured rainfall minus 3 times the pan evaporation.
But that’s only one definition. Some others are x days without rain, or y days below a percentage of average rainfall for the time of year.
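The first definition above can be sketched as a simple check; the three-month threshold and the 3× pan-evaporation factor follow the comment, while the monthly values are invented:

```python
# Sketch of the first drought definition above: effective rainfall
# approximated as measured rainfall minus 3x pan evaporation, drought
# declared after 3 consecutive months without effective rainfall.
# All monthly values are invented.
def effective_rainfall(rain_mm, pan_evap_mm):
    return rain_mm - 3 * pan_evap_mm

def in_drought(monthly_rain, monthly_pan_evap, months_required=3):
    run = 0  # current streak of months with no effective rainfall
    for rain, evap in zip(monthly_rain, monthly_pan_evap):
        run = run + 1 if effective_rainfall(rain, evap) <= 0 else 0
    return run >= months_required

rain = [120, 80, 15, 10, 5]   # mm per month
evap = [30, 25, 40, 45, 50]   # pan evaporation, mm per month
print(in_drought(rain, evap))  # True: the last three months had no effective rainfall
```

As the comment notes, swapping in one of the other definitions (x days without rain, y days below a rainfall percentage) would just change the test inside the loop.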

Reply to  Old Cocky
August 18, 2022 6:08 am

I’ve always used the soil moisture level, especially the depth. When the moisture drops down below the roots of the vegetation you are in a drought. It may be a short one but it may also be a long one. That’s why native prairie grasses here in the central plains have root systems that can go down eight feet or more – so the plant can find moisture during droughts.

I’ve attached a map from drought.gov that is an indicator I like to look at.

cpc-leaky-bucket-daily-soil-moisture-percentiles-(1.6m)-08-18-2022.png
bdgwx
Reply to  Tim Gorman
August 18, 2022 6:41 am

Soil moisture is intensive since if you partition a unit of soil the parts will have the same value as the whole. And since no sample contains the exact same amount of water at every point within the sample any value you propose for the moisture of that sample cannot exist per the logic you advocated for in Kip’s previous article. So I’m trying to figure out why you now think moisture can meaningfully quantify a drought if you don’t think an average moisture of a sample is meaningful or even exists in the first place.

Carlo, Monte
Reply to  bdgwx
August 18, 2022 6:46 am

mr. disingenuous strikes again—that map is NOT a spatial average!

bdgwx
Reply to  Carlo, Monte
August 18, 2022 7:40 am

So if I pick a pixel in say western Kansas and I take a sample of soil encompassing the entire pixel every single microscopic spot within that sample will have the exact same moisture percentile value?

Carlo, Monte
Reply to  bdgwx
August 18, 2022 7:46 am

Idiot.

Reply to  bdgwx
August 19, 2022 8:53 am

“So if I pick a pixel in say western Kansas and I take a sample of soil encompassing the entire pixel every single microscopic spot within that sample will have the exact same moisture percentile value?”

No!

Why would you expect that to be the case? Soil moisture at depth is collected at measurement stations. I suspect that data is collected somewhere.

Again, this map is not some “global soil moisture” calculation. If the climate scientists would do something like measure the radiance of 25km x 25km areas and use it to calculate a temperature and then create a map of values they would get a lot closer to having a useful product, just like the soil moisture map I posted. (UAH anyone?)

But then the models would have to do the same thing. Forecast max and min temps on a 25km x 25km basis. You would then have a map of where max temps would be going up/down and min temps would be going up/down. What are the odds of that happening?

Reply to  bdgwx
August 19, 2022 10:25 am

1st, that sample would not be at thermodynamic equilibrium at all points. 2nd, do you know what a gradient is? Objects not at equilibrium have gradients.

Take a cubic meter of soil. There is a gradient of energy going into it since the sun, clouds, etc. move. There is another gradient for the heat being distributed throughout, a gradient of heat into adjoining soil, and lastly a gradient of heat leaving into the atmosphere.

I had a similar problem at university. We had to solve a problem involving a pump shack on the Alaskan pipeline. You had gradients from the oil in the pipe, the pump and motor, through the insulated floor, heat radiating out from the legs and down the legs, out the floor and walls. The problem was to calculate how tall and big the legs needed to be to keep the temperature of the legs at ground level at the normal ground temperature. You didn’t want the shack to sink into the frozen soil, you see.

The climate is no less complicated although climate scientists and mathematicians like to think so because averages simplify the calculations. The problem is that averages also hide the complexities. AGW folks have persuaded a lot of people that averages can give the answers, but that simply isn’t the case. If you and Bellman and others continue to promote averages and trending time vs temperature with linear regression you will never come close to understanding what is really happening.

You will become as the scientists of old promoting the earth as the center of the universe.

Reply to  Carlo, Monte
August 19, 2022 8:22 am

BINGO!

Reply to  bdgwx
August 19, 2022 8:22 am

bdgwx:

So what? There’s no “global average soil moisture” value being calculated! No one has said you can’t determine an intensive value at a location. The map I posted is no different than a topographical map showing elevation.

For some reason you have a problem recognizing the difference between measuring a sample and averaging a whole bunch of samples!

Why is that?

bdgwx
Reply to  Tim Gorman
August 19, 2022 9:04 am

So what does the value of the pixel represent if not some spatial aggregate of all the soil underneath it?

Carlo, Monte
Reply to  bdgwx
August 19, 2022 10:33 am

Are you really this dense?

Or is this just an act?

bdgwx
Reply to  Carlo, Monte
August 19, 2022 10:51 am

I am interested in hearing TG out on this. If you understand how boiling down an area not in equilibrium to a single value for soil moisture is meaningful and useful enough to draw conclusions regarding drought even though it is an intrinsic property, you can explain it to me as well if you like. Maybe averaging intrinsic properties is okay if TG does it, but no one else? Or maybe it’s just averaging that is forbidden and other aggregate functions are allowed? Maybe spatial infilling is okay with soil moisture, but nothing else? I have a lot of questions about how TG’s rules work and why the graphic of soil moisture is okay, but other seemingly similar graphics are not.

Carlo, Monte
Reply to  bdgwx
August 19, 2022 11:02 am

Are there any depths to which you will not descend to defend the honor of these global average air temperature constructs?

Instead of implying that Tim is somehow a hypocrite, maybe you should instead find out how that map was measured and made.

bdgwx
Reply to  Carlo, Monte
August 19, 2022 12:20 pm

Yes, I’m really hoping TG will explain it all to us. I’m definitely interested in how a single value for a pixel can represent an entire swath of soil that almost certainly does not have the same moisture percentile (intrinsic property) at every infinitesimal spot given the rules that were hashed out in Kip’s previous article.

Reply to  bdgwx
August 19, 2022 2:38 pm

“I’m definitely interested in how a single value for a pixel can represent an entire swath of soil that almost certainly does not have the same moisture percentile (intrinsic property)”

Again, water volume is an extensive value. It is essentially mass.

You *can* average mass.

Carlo, Monte
Reply to  Tim Gorman
August 19, 2022 3:44 pm

He’ll just ignore this and move on to the next strawman.

Carlo, Monte
Reply to  bdgwx
August 19, 2022 3:14 pm

Yet you ignore the problems with the UAH sampling procedure?!??

Anything to Keep the Rise Alive.

Reply to  bdgwx
August 19, 2022 2:37 pm

“even though it is an intrinsic property”

Soil moisture is a measure of the volume of water in an area, that is an EXTENSIVE measure. Temperature is not an extensive value, it is an intensive one. You can average extensive values but you can’t average intensive ones.

Water can act at a distance – it moves, it diffuses. If you take a cubic foot of soil and calculate the water in it and then cut that cube in half you will have half the amount of water. That is not an intensive attribute, it is an extensive one!

What makes you think the volume of water is an intensive attribute?

Reply to  bdgwx
August 19, 2022 10:55 am

Have you ever heard the term resolution? You are just making our argument about uncertainty, do you realize that?

bdgwx
Reply to  Jim Gorman
August 19, 2022 12:17 pm

JG said: “Have you ever heard the term resolution?”

Yes.

JG said: “You are just making our argument about uncertainty, do you realize that?”

No.

Reply to  Jim Gorman
August 19, 2022 2:43 pm

“Have you ever heard the term resolution? You are just making our argument about uncertainty, do you realize that?”

He has *never* understood uncertainty. Why would he start now?

Satellites measure a 25km x 25km area, the aperture of the sensor determines minimum resolution.

But some of the data comes from surface measuring stations. Since water is an extensive value you can average those measures to get a value. Water does act at a distance but that distance is certainly limited. But river, pond, lake, and stream water is included. That can be remote sensed pretty easily.

Carlo, Monte
Reply to  Tim Gorman
August 19, 2022 4:03 pm

When I looked into the NOAA satellite MSU sampling, I was a bit shocked to learn how poor it really is. The grids are not equal-area but are instead equal-angle (2.5 x 2.5 degrees). This means the grid area along the equator is almost 10x larger than the polar grids. And because of the polar satellite orbits, polar grids can end up being sampled 3x per day, while at the equator 3 days can elapse without sampling an individual grid. They never state how many times each grid was sampled during a given month.

This is called the “global temperature”.
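The cosine effect described above can be checked directly: the area of a fixed lat-lon cell is proportional to sin(lat_top) − sin(lat_bottom). Comparing the equatorial 2.5° band with the 82.5°–85° band (one reading of "polar" that matches the almost-10x figure; other polar bands give larger ratios) gives roughly 9x:

```python
import math

# Area of a fixed-width lat-lon band is proportional to
# sin(lat_top) - sin(lat_bottom). The 82.5-85 degree comparison band is
# an assumption chosen to match the "almost 10x" figure above.
def band_area(lat_lo_deg, lat_hi_deg):
    return math.sin(math.radians(lat_hi_deg)) - math.sin(math.radians(lat_lo_deg))

equator = band_area(0.0, 2.5)
polar = band_area(82.5, 85.0)

print(round(equator / polar, 1))  # ~9.2
```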

Reply to  Carlo, Monte
August 20, 2022 6:59 am

I had never looked into that! Even if they do take it into consideration in their conversion algorithm, they need to state what they are doing and how they are doing it!

Thanks!

Carlo, Monte
Reply to  Tim Gorman
August 20, 2022 7:22 am

Here is the graph of grid pixel area versus latitude (your basic cosine):

UAH pixel area.jpeg
Carlo, Monte
Reply to  Carlo, Monte
August 20, 2022 7:25 am

Note this is mid-latitudes:

jgrd16826 fig 3.jpg
Reply to  bdgwx
August 19, 2022 2:33 pm

“So what does the value of the pixel represent if not some spatial aggregate of all the soil underneath it?”

Volume of water is an extensive value. Why can’t you average it?

bdgwx
Reply to  Tim Gorman
August 19, 2022 7:18 pm

TG said: “Volume of water is an extensive value. Why can’t you average it?”

I think you can average volume. I think you can average any extensive property. I also think you can average any intensive property. But that is moot since the graphic is showing soil moisture percentile which is an intensive property.

BTW…I suspect it is going to come as a surprise that this particular soil moisture product requires not only summing temperatures but averaging them too. Even when the soil moisture is expressed in volumetric terms (not so in this case) it still requires summing and averaging temperature. You can find details regarding the leaky bucket model in Huang et al. 1995.

Reply to  bdgwx
August 20, 2022 7:03 am

“I also think you can average any intensive property”

The difference is that the average of the extensive property gives you a usable value (with restrictions) while the average of the intensive property does not.

“But that is moot since the graphic is showing soil moisture percentile which is an intensive property.”

Which is calculated from an extensive property. That is perfectly legitimate.

“BTW…I suspect it is going to come as a surprise that this particular soil moisture product requires not only summing temperatures but averaging them too”

Huh? Soil moisture depends on temperature? Then how can soil in Kansas be the same as in Alabama when each have a different temperature?

How does the volume of water in a stream depend on temperature? Or in a river? Or in a lake? Are you trying to use temp as a proxy for season and/or latitude?

bdgwx
Reply to  Tim Gorman
August 20, 2022 2:20 pm

TG said: “Which is calculated from an extensive property. That is perfectly legitimate.”

It is calculated from extensive properties AND intensive properties. Does that change your conclusion that it is legitimate?

TG said: “Huh? Soil moisture depends on temperature?”

Yes. Well, at least the soil moisture product you selected anyway.

TG said: “Then how can soil in Kansas be the same as in Alabama when each have a different temperature?”

First, it is a percentile metric based on climatological averages. Second, temperature is only one component of the model.

TG said: “How does the volume of water in a stream depend on temperature? Or in a river? Or in a lake?”

I don’t think it does; at least not in this particular model.

TG said: “Are you trying to use temp as a proxy for season and/or latitude?”

I’m not using temperature for anything here. It’s not my model. The model used to create the graphic uses temperature as part of the evapotranspiration term. The literature says it was chosen because it serves as a parameter of net radiation.

Reply to  Kip Hansen
August 17, 2022 10:07 am

He is not even implying that the whole is the same as the part

He’s not SAYING it, but to my eyes, he is definitely IMPLYING something like it, as long as he doesn’t attach a disclaimer that says something like “I’m a peer of the realm with a mathematical bent and too much time on my hands, so don’t read anything into this, it’s just me playing with numbers while I wait for the butler to bring me another brandy”.

If he’s not trying to send a message that the warming trend is slowing down (which it does seem to be), then I don’t see what is the point of his monthly “pause” posting. He certainly doesn’t need them to be a lede for an exhibition of his wit, his bottomless vocabulary, his personal knowledge of the Thatcher years, and his command of dead languages. Those are always welcome anyway.

Those of us occupying the scientific high ground need to be really rigorous about scientific observations. We are surrounded by a seething mob of enemies who will stop at nothing to discredit us.

His Lordship isn’t alone. There’s a video of Willie Soon debunking the Great Pacific Garbage Patch, in which he makes the statement that polyethylene is the same as ethylene (or was it polypropylene = propylene?). I strongly suspect that he spoke in haste and would regret it, being a very erudite chap (albeit with a rather offhand delivery) – too late, it’s out there and could be used against him.

And a smattering of other examples that I can’t think of without further reflection.

Reply to  Smart Rock
August 17, 2022 4:12 pm

The problem is that climate science assumes that the “CO2 vs time” and “Temperature vs time” correlations somehow justify the conclusion that CO2 CAUSES TEMPERATURE. Any variance from this correlation absolutely ruins the consensus that CO2 causes temperature. The greenies will have a hemorrhage if this happens.

Geoff Sherrington
Reply to  Smart Rock
August 18, 2022 3:54 am

Lord Monckton usually entreats people to use or learn more math and usually gives a working example for people to critique.
What is wrong with that ?
Details of feedback theory are not everyone’s cup of tea, so people who are sparse on it, like I am, should be careful about commenting. I don’t enter the feedback discussion. I do not dismiss it. Geoff S

Clyde Spencer
Reply to  Kip Hansen
August 17, 2022 11:44 am

“No Accidents for 666 days.”

MarkW
Reply to  TheFinalNail
August 17, 2022 7:59 am

It’s really sad how desperate climate alarmists are, to find something relevant to say.
Lord Monckton is only telling us how long it’s been since there was a positive trend.
He is saying nothing whatsoever regarding trends that may or may not be in the total data set.
Completely different things.

Reply to  MarkW
August 17, 2022 3:16 pm

And Kip is pointing out how irrelevant it is to do so.

Clyde Spencer
Reply to  Kip Hansen
August 17, 2022 7:16 pm

… when a time series goes flat after a steady rise (or fall) …

It derives from the observation that something like a sine wave goes through a minimum with zero slope; its slope then increases, passes through a point of inflection, and decreases until the curve reaches its peak, again at zero slope. The often unstated assumption is that one is dealing with a function similar to a sine wave.

Reply to  Clyde Spencer
August 18, 2022 4:25 pm

At the very least a cyclical phenomenon. Most of which can be broken down into a series of sine waves.

Old Man Winter
Reply to  TheFinalNail
August 17, 2022 11:21 am

About 15 yrs ago, skeptics noted that global Ts stopped rising, which was counter to Algore’s claim that was also based on a short-term trend and was, as you noted, “always sensitive to start and end dates”, which means it was also bogus. For that, they were vilified by The Team™, as it undermined their claim of rising Ts and continuously forced them to “move the goalposts” to longer periods of time for a trend to be considered “valid”, thus exposing the fraud that you recognize in using short-term trends. It was never meant to be the basis for establishing a cooling-trend claim, although as Geoff S pointed out, it could identify that a trend shift MAY have occurred, which would require more rigorous analysis!

1850THad.jpg
Old Man Winter
Reply to  Old Man Winter
August 17, 2022 1:15 pm

The most fun was watching at least 66 Team™ members beclowning themselves by coming up with lame excuses for The Pause. My favorite is Trenberth’s “dog-ate-my-homework” excuse claiming the heat was at the bottom of the ocean! Bwwwaaaaahhhhh!!!!!!

https://wattsupwiththat.com/list-of-excuses-for-the-pause-in-global-warming/

https://www.climatedepot.com/2014/11/20/its-official-there-are-now-66-excuses-for-temp-pause-updated-list-of-66-excuses-for-the-18-26-year-pause-in-global-warming/

Pauseexc.jpg
Derg
Reply to  TheFinalNail
August 17, 2022 1:57 pm

He gets under your skin using your math 😉

You have a version of TDS I’d say.

Carlo, Monte
Reply to  Derg
August 17, 2022 3:08 pm

All the trendologists crawl out of the woodwork when a Monckton post appears, vainly trying to Keep the Rise Alive.

Old Cocky
Reply to  TheFinalNail
August 17, 2022 4:05 pm

That actually seems to be the object of his exercise.

With an implicit nod to the amount of noise in the time series data.

August 17, 2022 6:40 am

‘…“the average and its trend have to be correct – they are simply maths”.’

Kip, I may be biased, but I think I know who you had in mind here.

Coach Springer
August 17, 2022 6:42 am

So, there’s a pause. And there isn’t a pause. Depending on whether you’re a believer.

Clyde Spencer
Reply to  Kip Hansen
August 17, 2022 12:03 pm

“The Pause” is simply an aspect of a larger phenomenon. That is a step-wise or saw-tooth pattern of temperature increase. The question that needs to be answered is what creates the saw-tooth pattern when the annual CO2 increase is relatively smooth (seasonal variations ignored).

Lance Wallace
August 17, 2022 6:57 am

It’s interesting to note that in some years (2005, 2009, 2010, 2015….) the yearly average minimum temperature reported exceeds the maximum.

Lance Wallace
August 17, 2022 7:12 am

The monthly trend (210 months) for the maxima is +0.000551, with a standard error of 0.000263, giving a t-value of just over 2 and a significance level of 0.04, so technically significant.

The monthly trend (210 months) for the minima is +0.000641, with a standard error of 0.000301, also giving a t-value of just over 2 and a significance level of 0.04, so technically significant.

Clearly a fragile significance level; it might become non-significant with one additional month.
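
If anyone wants to check the arithmetic, the t-values follow directly from slope over standard error; a sketch using the normal approximation for the two-sided p-value (close enough at roughly 208 degrees of freedom):

```python
import math

def t_and_p(slope, stderr):
    """Return the t-statistic and a two-sided p-value via the normal
    approximation (adequate for ~200+ degrees of freedom)."""
    t = slope / stderr
    p = math.erfc(abs(t) / math.sqrt(2))
    return t, p

t_max, p_max = t_and_p(0.000551, 0.000263)  # maxima, per the figures above
t_min, p_min = t_and_p(0.000641, 0.000301)  # minima
print(f"maxima: t = {t_max:.2f}, p = {p_max:.3f}")
print(f"minima: t = {t_min:.2f}, p = {p_min:.3f}")
```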

Lance Wallace
August 17, 2022 7:20 am

Over the same 210 months, the Clim Div global slopes for the maxima and minima were smaller than the US CRN slopes (although still positive), the standard errors were 2-3 times the mean values of the slopes and clearly non-significant (p-value >0.6). (the Global Pause still holding).

Clyde Spencer
Reply to  Kip Hansen
August 17, 2022 12:06 pm

Two times the square-root of the base-length of the Great Pyramid equals the slope of the sides, or some such claim.

Rud Istvan
August 17, 2022 7:23 am

Nicely done.
USCRN will eventually be useful for climate trends. Right now it just shows the general problem with the regular surface stations that Anthony has now demonstrated in detail twice, 2008 and 2022.

Clyde Spencer
Reply to  Kip Hansen
August 17, 2022 12:09 pm

Many decades ago I came to the conclusion that to answer philosophical questions, it is imperative to carefully define all the critical words.

In this case, I think that what is necessary is to carefully define what question one is trying to answer, and over what time scale.

Rud Istvan
Reply to  Kip Hansen
August 17, 2022 2:20 pm

Kip, back in my math modeling and stats/econometrics days I would not have dared even think to try it. Especially not on such a short 17-year data set. Double especially not with so much max/min variance. My profs would have flunked me rather than graduating me summa cum laude.

Reply to  Kip Hansen
August 17, 2022 2:46 pm

One of the problems is not being able to discern what is actually rising and where. It is another problem that needs to be addressed. Tmax could be doing anything along with Tmin doing anything. I suspect 99% of average people if asked would say Tmax is going to make us burn up. I think the propaganda has not bothered to educate them accurately.

Carlo, Monte
Reply to  Jim Gorman
August 17, 2022 3:11 pm

They always avoid answering questions such as these.

Reply to  Jim Gorman
August 17, 2022 3:24 pm

It’s USCRN data, all publicly available. It records temperature and a whole load of other data at 5 minute intervals. There’s nothing stopping you analyzing maximum temperatures.

Reply to  Kip Hansen
August 17, 2022 4:57 pm

Thanks.

According to my quick analysis there’s not much difference between them as far as overall trends are concerned.

Mean = 0.35°C / decade
Max = 0.38°C / decade
Min = 0.32°C / decade

Clyde Spencer
Reply to  Bellman
August 17, 2022 7:28 pm

There’s nothing stopping you analyzing maximum temperatures.

Been there done that! See Fig. 1 and 2:

https://wattsupwiththat.com/2015/08/11/an-analysis-of-best-data-for-the-question-is-earth-warming-or-cooling/

The point that Jim was making is that stating an average doesn’t provide information on the proportional change in Tmin and Tmax. It is unclear whether providing only the average of mid-range values is done purposely to hide what is actually happening, or it is simple incompetence.

bdgwx
Reply to  Clyde Spencer
August 17, 2022 8:18 pm

It goes beyond that. It is my understanding that both of the Gormans support the notion that average temperatures are non existent and have no meaning whatsoever. In other words, if the average temperature on days A and B is -10 C and 40 C respectively you cannot say that day A was colder than day B based on that data. In fact, you’ve already abused the data even to claim it was -10 C and 40 C on those days in the first place, according to their school of thought. This was all hashed out in Kip’s previous article.

Reply to  bdgwx
August 18, 2022 9:47 am

You miss the two big points. A temperature at one point can not affect the temperature at a distance. Also temperature can not provide the enthalpy at a point.

If you want to allocate the sun’s energy properly you need a measurement or calculation that shows the enthalpy at each location. No one is saying that you can’t average the numbers that are associated with temperatures. Numbers are numbers. No one disputes that.

Intensive measurements are not amenable to averaging though. Temperature is a quality, not a quantity. Can you average green, white, and blue? What is the average? Someone mentioned that you could average frequencies. Do you think the perceived color of the averaged frequency is the color you see?

How about density? If you have blocks of gold, Bakelite, and carbon fiber and compute an average density does the average tell you anything?

Here is a map of from a TV station showing temperatures around northeast Kansas. Two questions. How many combinations of three locations in a straight line are there? How many of those have the center point equal to the average of the outer two?

What is the global average humidity? Can it be used to compute a global enthalpy?

Reply to  bdgwx
August 18, 2022 4:28 pm

 average temperatures are non existent and have no meaning whatsoever. In other words, if the average temperature on days A and B is -10 C and 40 C respectively you cannot say that day A was colder than day B based on that data. “

Is 40C in Phoenix hotter or colder than 40C in Miami?

Why is one dry heat and the other wet heat?

Reply to  Tim Gorman
August 18, 2022 5:15 pm

Is 40C in Phoenix hotter or colder than 40C in Miami?

By the zeroth law of thermodynamics they are equally hot.

They might not feel the same, but they have the same amount of “hotness”.

Reply to  Bellman
August 18, 2022 5:46 pm

You have no thermodynamic training, I am now sure of it.

The zeroth law says:

If Ta = Tb and Tb = Tc, then Ta = Tc

Read this:

https://byjus.com/jee/zeroth-law-of-thermodynamics/

This is basically the Transitive Property of Equality.

Note: It applies only to sensible energy that can be directly measured; latent heat and total enthalpy are not considered.

Reply to  Jim Gorman
August 18, 2022 6:49 pm

You have no thermodynamic training, I am now sure of it.

I don’t know how you figured that out. Couldn’t have been all the times I’ve said I know little to nothing about thermodynamics.

This is basically the Transitive Property of Equality.

Yes, that’s what I thought it meant.

The question was about which was hotter 40°C in Miami or 40°C in Phoenix. Unless I’m missing something hotness is defined by temperature, not heat or latent or otherwise, or enthalpy.

Reply to  Bellman
August 19, 2022 5:32 am

In all my life I have never heard of temperature being called “hotness”. Scientifically it is a measure of the translational (movement) kinetic energy in an object or a gas. Latent heat is the non-translational movement of an atom or molecule, e.g., rotation, vibration, or orbital, but it is still thermodynamic heat.

Reply to  Jim Gorman
August 19, 2022 7:53 am

My understanding (usual disclaimers about a little learning) is that temperature can be defined either in terms of average kinetic energy or in terms of thermodynamics, and that the Kelvin is defined kinetically. But when I mentioned the kinetic definition last time, Kip was very clear that I was wrong, that it was only a “sorta” definition and the only correct one was that given by the Encyclopedia Britannica.

Regardless of the way it is defined, all sources I can find describe it as a measure of “hotness”. E.g.

“Temperature is a physical quantity that expresses the hotness of matter or radiation.”

https://en.m.wikipedia.org/wiki/Temperature

And

“temperature, measure of hotness or coldness expressed in terms of any of several arbitrary scales and indicating the direction in which heat energy will spontaneously flow”

https://www.britannica.com/science/temperature

Reply to  Bellman
August 19, 2022 12:06 pm

“temperature, measure of hotness or coldness expressed in terms of any of several arbitrary scales and indicating the direction in which heat energy will spontaneously flow””

Note carefully that this definition does not quantify how much energy will spontaneously flow! That is a matter of heat content or enthalpy.

bdgwx
Reply to  Bellman
August 19, 2022 12:15 pm

I’ve always understood temperature to define hotness or coldness as well. And that for two bodies A and B with Ta > Tb then we say body A is hotter (or warmer) than body B. A corollary to this is that the words warming (or warmed) and cooling (or cooled) mean ΔT > 0 and ΔT < 0 respectively.

Reply to  Bellman
August 19, 2022 12:03 pm

They do *NOT* have the same enthalpy or heat content. I simply don’t understand why that is so hard to get across.

I take a knitting needle and a wooden toothpick and heat them both to 100C in boiling water. I then drop each onto my vinyl countertop next to the stovetop. Which one melts further into the countertop? Which one gets me the biggest lecture from wifey?

Carlo, Monte
August 17, 2022 7:46 am

Kip—there is a lot of information missing from air temperature anomaly graphs.

First, as you point out, there is a 6°C range of all (monthly) values; as a result the linear regression also produces a standard deviation versus time. These are never shown. I also suspect that the standard deviations may be higher for the 5-year segments.

Second, a histogram plot of the regression residuals will very likely have a strong Gaussian shape, indicating most of the variation is random; more information that is never seen.

Third, the average for each monthly point also has a range and standard deviation, more information that is not revealed. Considering these are from the entire USA for entire single months, the ranges will be a lot larger than 6°C.

Fourth, the same holds for however the anomaly baseline average was constructed.

At a minimum, these standard deviations should be combined with root-sum-square and plotted with the data. The regression lines will look tiny in comparison.

Lastly, properly propagated measurement uncertainties from the instrumental data are never computed, which are a lot larger than a few tenths of a degree. When combined with the standard deviations from the averaging and applied to a regression graph as error bars, the regression lines can fall anywhere between the uncertainty limits.

These are fundamental issues that air temperature trendology refuses to consider because they make it impossible to determine even the slope of regression lines with any reliability, and thus subsume declarations of runaway global meltdowns into noise.
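
The root-sum-square combination is one line of arithmetic; a minimal sketch with purely hypothetical component standard deviations (the actual magnitudes would have to come from the data):

```python
import math

def root_sum_square(sigmas):
    """Combine independent standard deviations in quadrature."""
    return math.sqrt(sum(s * s for s in sigmas))

# Hypothetical components, in degrees C (illustrative values only):
monthly_spread = 2.0  # spread of station values within a month
baseline = 0.5        # uncertainty in the anomaly baseline average
instrument = 0.3      # propagated instrumental uncertainty

total = root_sum_square([monthly_spread, baseline, instrument])
print(f"combined sigma: {total:.2f} C")  # dominated by the largest component
```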

Carlo, Monte
Reply to  Carlo, Monte
August 17, 2022 8:13 am

I also suspect that the standard deviations may be higher for the 5-year segments.

As more points are accumulated and added to the end, absent any drastic changes, the standard deviation of the regression will go down. However, this isn’t true of the individual points; their standard deviations remain the same.

Reply to  Carlo, Monte
August 17, 2022 9:17 am

Hear, hear. Some reasonable expectations that would ordinarily be required of scientific endeavors! Sadly, much of climate science is simply trying to fit a linear regression line to numbers that are considered to have no uncertainty at all.

Carlo, Monte
Reply to  Jim Gorman
August 17, 2022 3:18 pm

This is why I call it trendology, it isn’t studying climate! And the din of whining that usually ensues when someone dares to question the party lines can be deafening.

Clyde Spencer
Reply to  Carlo, Monte
August 17, 2022 12:11 pm

I have never been a fan of trying to answer a complex question with a single number.

Carlo, Monte
Reply to  Clyde Spencer
August 17, 2022 3:35 pm

I am reminded of the story that someone posted recently about the US Air Force’s “average pilot” from the 1940s which was used to design cockpit ergonomics—when compared with real people, it turned out that zero pilots could match even just three of the average body measurements. The average pilot was a ghost.

Clyde Spencer
Reply to  Carlo, Monte
August 17, 2022 7:32 pm

Similar to the remark attributed to Dixie Lee Ray that the average person had one breast, one testicle, and one ovary. Averages are overused and misused.

Geoff Sherrington
Reply to  Clyde Spencer
August 18, 2022 4:28 am

CM,
Not even 42? Geoff S

Geoff Sherrington
Reply to  Carlo, Monte
August 18, 2022 4:27 am

Carlo Monte,
I made about 20 histograms today, from “raw” Australian BOM temperature data. Not one remotely resembled a proper Normal Distribution. As an example, from the Fahrenheit recording days at hot stations, there is often a prominent bin around 37.8 C, which is converted from 100 F. Commonly, there is a skew favouring more hot days. There is any number of sub-distributions combining because different weather systems sweep over the station each year. Maybe each synoptic pattern contributes a different distribution.
The Central Limit Theorem suggests that all combine into a neat final Bell curve, but I am not finding that.
Geoff S

Carlo, Monte
Reply to  Geoff Sherrington
August 18, 2022 6:13 am

Geoff, I believe this—combining air temperature data from a group of locations will never be anything like standard statistics random sampling from a fixed population. Despite this, many try to treat them as such.

Reply to  Geoff Sherrington
August 18, 2022 6:19 am

The CLT has certain assumptions built in. One is sample size. When you only take one measurement is that the sample size? Maybe one could use only all the Tmax’s from everywhere in a year and sample that distribution with a sample size of 50 to see what happens.

The real problem starts with anomalies. UAH shows temperatures moving around a baseline, up then down, down then up. Do anomalies follow? Plus, using fractional temperatures prior to 1980 is just magic masquerading as science.

Reply to  Geoff Sherrington
August 19, 2022 9:06 am

I simply don’t see how a distorted sine wave (slightly distorted?) in the daytime combined with an exponential decay at night could possibly generate a normal distribution of temperatures. That means that a pseudo-average (actually a mid-range value) from those daily temps can’t magically create a normal distribution either, no matter how many people claim that it can!

If you don’t have a distribution that is at least somewhat symmetric then you don’t get cancellation around a mean, i.e. a true value, and the central limit theorem goes out the window.
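
That daily shape is easy to simulate. The sketch below builds a crude day (half-sine daytime, exponential decay at night; all the numbers are hypothetical, and the step at 6:00 stands in for rapid morning warm-up) and compares the true time-average to the mid-range value:

```python
import math

# Crude hypothetical daily profile, times in hours: half-sine from 6:00
# to 18:00 peaking at 25 C, then exponential decay overnight toward 5 C.
def temperature(t):
    if 6 <= t < 18:
        return 10 + 15 * math.sin(math.pi * (t - 6) / 12)
    dt = (t - 18) % 24  # hours elapsed since 18:00
    return 5 + 5 * math.exp(-0.3 * dt)

samples = [temperature(i / 12) for i in range(24 * 12)]  # every 5 minutes
true_mean = sum(samples) / len(samples)
midrange = (max(samples) + min(samples)) / 2  # the (Tmax+Tmin)/2 "average"

print(f"true mean: {true_mean:.2f} C, midrange: {midrange:.2f} C")
```

The mid-range value overshoots the true time-average by a couple of degrees here precisely because the distribution of temperatures over the day is asymmetric.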

August 17, 2022 7:51 am

Fun with Trends
__________________

And their projections / extrapolations / predictions and derivatives, which would add to the fun.

And yes, it does bring to mind the relatively short and ever changing satellite sea level record. Entrance of the Gladiators would be appropriate.

Rud Istvan
Reply to  Steve Case
August 17, 2022 2:27 pm

Wrote about satellite altimetry and sea level rise several times here before. It never ceases to amaze me that their newest, best published inherent precision of 3.4 cm is claimed by NASA to yield 0.1 mm precision via mere repetition. But NASA does claim it. Obviously not the same NASA that got guys to the moon and back.

Reply to  Rud Istvan
August 17, 2022 2:51 pm

Just think if their burns were off by one percent. Wonder how far they would have missed their trajectory on reentry? Yet a drifting satellite in space can create measurements of 1/10000th of a meter? What do you think a master machinist would say?

MarkW
August 17, 2022 7:54 am

Another trick is when examining data that has repeating patterns in it.
The classic example is a sine wave. Unless you make sure that your sample contains only whole waveforms, you are guaranteed to get trends, even if there is no actual trend.
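
This is easy to demonstrate with an ordinary least-squares fit to a pure cosine, which has no trend by construction (a sketch; the cycle counts are arbitrary):

```python
import math

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

def cosine_trend(cycles, n=1000):
    """Fit a line to a trendless cosine sampled over `cycles` periods."""
    xs = [cycles * 2 * math.pi * i / (n - 1) for i in range(n)]
    return ols_slope(xs, [math.cos(x) for x in xs])

print(cosine_trend(2.0))   # whole cycles: slope is essentially zero
print(cosine_trend(1.25))  # partial cycle: a spurious positive slope appears
```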

Reply to  Kip Hansen
August 17, 2022 12:13 pm

I’ll raise you a few Doodson numbers…

Rud Istvan
Reply to  Kip Hansen
August 17, 2022 2:29 pm

Actually 3 sine waves, including the 18.6-year lunar cycle.

Reply to  Rud Istvan
August 17, 2022 5:02 pm

Actually quite a few more. I looked at the harmonics for Seattle reported by NOAA. Half a dozen had amplitudes of over 6 inches, and they listed 37 out of some 400+ that have been analysed.

https://tidesandcurrents.noaa.gov/harcon.html?id=9447130

Philip
August 17, 2022 8:37 am

Looking at the first figure, the graph appears to start at 1°F.
However, the trend line calculation appears to start at 0°F and finish at 1°F, giving a trend of 1°F.

But if your trend calculation had started from the actual reading of 1°F, the trend would be zero.

Try using something (anything) but excel for mathematical calculations.

August 17, 2022 8:38 am

Are we allowed to discuss this, or does it fall because of the injunction, “one cannot average temperatures”?

Mr.
Reply to  Bellman
August 17, 2022 8:58 am

Well, you can average anything you like.
The rational question though, is –

what practical, real-world use / application does the averaged number serve?

Clyde Spencer
Reply to  Bellman
August 17, 2022 12:20 pm

If it is made clear that the recorded temperatures are what are commonly encountered with terrestrial materials, and the concern is the physiological effects on living organisms and chemical reactions, then by all means measure and collate temperatures.

It is probably more physically meaningful to present a binned modal temperature than an arithmetic mean with superfluous digits. Particularly so because temperature distributions tend to be skewed with long tails on the cold side.

Reply to  Bellman
August 17, 2022 12:38 pm

In case you can’t tell, trending is a different subject from how the data is arrived at.

Reply to  Jim Gorman
August 17, 2022 4:40 pm

What’s the point of looking at a trend if you think the data is physically meaningless?

Clyde Spencer
Reply to  Bellman
August 17, 2022 7:36 pm

You might ask that question of the journalists that faithfully post articles about “the third highest temperature in the last decade,” or “the 12th smallest ice coverage since the 12th of Never.”

Reply to  Bellman
August 18, 2022 4:57 am

It’s not meaningless for a specific location if uncertainty propagation is followed. An aggregate average? Not meaningful at all. What does an aggregate average between Phoenix and Miami tell you about climate? Or Pikes Peak and Denver? Or Nevada, US and Peru?

If you must aggregate, then calculate the slopes of the maximum temps and the slopes of the minimum temps at each location. Assign + or – to each slope of the maximums and each slope of the minimums. Then add up all the pluses and minuses for the max temps and all the pluses and minuses for the min temps.

Then you’ll know if you have a greater number of locations with increasing or decreasing max temps and the same for min temps.

*THEN* go back and calculate what the possible slopes are for each location using the uncertainty of the stated values. I.e. from stated value + uncertainty for the first data point to stated value – uncertainty for the final data point. And vice versa. Then show those plots for the maximum and minimum values. Aggregate them using pluses and minuses.

You’ll finally have enough data to make an informed judgement about what might be happening – or what might *not* be happening.

This isn’t complicated to do; computers the size of what NOAA and NASA have should be able to do it in hours at most. The biggest problem with doing this is that the CAGW crowd is afraid of what it would show!
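
For what it’s worth, the sign-count tally described above is only a few lines of code; a sketch with hypothetical per-station Tmax slopes (degrees per decade, invented for illustration):

```python
# Hypothetical per-station Tmax slopes in degrees per decade:
tmax_slopes = [0.12, -0.05, 0.30, 0.02, -0.18, 0.07, -0.01, 0.25]

# Tally the signs rather than averaging the magnitudes.
rising = sum(1 for s in tmax_slopes if s > 0)
falling = sum(1 for s in tmax_slopes if s < 0)

print(f"{rising} stations with rising Tmax, {falling} with falling Tmax")
```

The same tally, run separately on Tmin slopes and on the uncertainty-bracketed slopes, would give the comparison described above.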

Carlo, Monte
Reply to  Tim Gorman
August 18, 2022 6:16 am

Combining arctic with tropic, Northern hemisphere with Southern hemisphere, etc. is just silly.

Reply to  Tim Gorman
August 18, 2022 2:47 pm

It’s not meaningless for a specific location if uncertainty propagation is followed.

We are talking about trends for the average US temperature. If you think the US is not one specific location, then by the Essex claim, the average temperature is meaningless and any trend is meaningless.

Then you’ll know if you have a greater number of locations with increasing or decreasing max temps and the same for min temps.

That’s a far less meaningful average, than just averaging temperatures.

I.e. from stated value + uncertainty for the first data point to stated value – uncertainty for the final data point.

Hate to say this, but have you considered taking a remedial statistics class.

This isn’t complicated to do

Then do it.

The biggest problem with doing this is that the CAGW crowd is afraid of what it would show!

So you do it.

Carlo, Monte
Reply to  Bellman
August 18, 2022 3:46 pm

Hate to say this, but have you considered taking a remedial statistics class.

*snort*—says the fool who thinks averaging temperature measurements reduces uncertainty.

Reply to  Bellman
August 19, 2022 11:09 am

We are talking about trends for the average US temperature. If you think the US is not one specific location, then by the Essex claim, the average temperature is meaningless and any trend is meaningless.”

You *STILL* have not internalized how the real world works. There is no average US temperature. Temperature depends on too many factors to just be simply averaged. The US is *NOT* one specific location!

“That’s a far less meaningful average, than just averaging temperatures.”

Malarky. I didn’t say *ANYTHING* about taking averages. I said you add up the pluses and the minuses and see which dominates!

Can you get *ANYTHING* right?

Hate to say this, but have you considered taking a remedial statistics class.”

You *still* don’t understand uncertainty. I have given you graph after graph showing the space between the uncertainty boundaries blacked out. In other words, you can’t define a specific trend as the *true trend*. Any trend that fits inside that interval is possible. All I am saying is that you can develop a trend spectrum that fits in that interval. Do the start-max to end-min and the start-min to end-max. Those should bracket enough possibilities to allow an informed judgement to be formed about what the data is showing.

You just can’t get out of that little box you live in, can you?

Then do it.”

You can’t even tell the difference between complicated and time consuming! Do you live in the real world at all?

Reply to  Tim Gorman
August 19, 2022 12:53 pm

You *STILL* have not internalized how the real world works.

That’s why I prefer arguing on this fantasy one.

There is no average US temperature.

Hence why I’m questioning what the point of this article is, and why people think the pause has any meaning.

Malarky. I didn’t say *ANYTHING* about taking averages.

As usual you think that if you haven’t used the word it means you are not using them. You want to look at the trend of multiple places and then see if there are more pluses than minuses. That’s averaging. I don’t care what method you use to aggregate the pluses and minuses, you are deriving a single figure that represents the overall property of the whole. Either you are adding them all up and dividing by the number of locations, or you are just adding them up. Both will give you the same issues, just one won’t change as the number of stations change.

I have given you graph after graph showing the space between the uncertainty boundaries blacked out.

Then stop it. It’s meaningless.

In other words you can’t define a specific trend as the *true trend*.

Hence why I say there’s uncertainty in the trend.

Any trend that fits inside that interval is possible.

Any trend that doesn’t fit in the interval is also possible. Anything is possible. It’s just not likely.

You just can’t get out of that little box you live in, can you?

And you can’t stop with these tedious ad hominems, can you?

Reply to  Bellman
August 20, 2022 1:33 pm

“You want to look at the trend of multiple places and then see if there are more pluses than minuses. That’s averaging.”

really? where is the division by the number of pluses and minuses. You can’t even get this one correct!

“I don’t care what method you use to aggregate the pluses and minuses, you are deriving a single figure that represents the overall property of the whole.”

That’s still not averaging. And it’s obvious that you don’t care. You are a troll. All you are doing is throwing out crap in order to get answers. It apparently makes you feel like a big man!

“Either you are adding them all up and dividing by the number of locations, or you are just adding them up. “

Adding them up is not taking an average!

“Both will give you the same issues, just one won’t change as the number of stations changes.”

No, you don’t get the same issues. That’s just more garbage you are trying to use in your trolling.

Then stop it. It’s meaningless.

No, to a real physical scientist or engineer it tells one that you simply don’t know! That’s what uncertainty is, the unknown!

“Hence why I say there’s uncertainty in the trend.”

It hasn’t got ANYTHING to do with uncertainty in the trend. It has nothing to do with the residuals between the stated values and linear regression line! You haven’t learned anything about uncertainty over the past two years. It’s obvious that you never will!

“Any trend that doesn’t fit in the interval is also possible. Anything is possible. It’s just not likely.”

That *is* true. Uncertainty intervals are chosen to be the most likely to contain the true value. But nothing is perfect. It doesn’t mean you can just ignore uncertainty!

“And you can’t stop with these tedious ad hominems, can you?”

If you don’t like your inanities being pointed out then stop being inane and actually learn something instead of just cherry picking things you don’t understand but which you think back you up.

Reply to  Tim Gorman
August 20, 2022 1:58 pm

really? where is the division by the number of pluses and minuses. You can’t even get this one correct!

It’s your idea, you tell me. If you are saying all you are going to do is add up all the pluses and minuses, then that’s making it seem even less useful. What happens if your stations aren’t evenly located? What happens if you later increase the number of stations?

Adding them up is not taking an average!

What’s the difference? You seem to have this odd idea that it’s fine to add a bunch of numbers together to get a meaningful result, but somehow dividing by a constant makes the figure useless.

Carlo, Monte
Reply to  Tim Gorman
August 20, 2022 9:13 pm

That’s still not averaging. And it’s obvious that you don’t care. You are a troll. All you are doing is throwing out crap in order to get answers. It apparently makes you feel like a big man!

BINGO!

August 17, 2022 8:43 am

“Now, a lot of people would like to jump in and start figuring out trend lines…”

I’m guilty of that, but only in response to people claiming that the trend is flat or suggestions that USCRN somehow disproves the idea of global warming. In reality, though, the short length and high variability over a small region of the planet make it impossible to tell if the rise at the moment is significant or down to chance.

Clyde Spencer
Reply to  Bellman
August 17, 2022 12:22 pm

… makes it impossible to tell if the rise at the moment is significant or down to chance.

Why does atmospheric CO2 not show a similar high variability if it is all chance?

Reply to  Bellman
August 17, 2022 12:41 pm

Have you ever considered that some cycles are shorter and more irregular than others? Do you think that could cause spurious trends to be seen, both long and short?

Do pauses indicate that other factors are at work than CO2?

Jeff Alberts
Reply to  Bellman
August 17, 2022 6:03 pm

You’re not going to find global anything, by using averages. Doesn’t matter if it’s temps, anomalies, or whatever. Any change will give one a false sense of “globality”.

bdgwx
Reply to  Jeff Alberts
August 17, 2022 7:05 pm

Over in Kip’s other article the contrarians decided that any body not in equilibrium cannot have a temperature, average or otherwise. Like…it literally does not exist. The Earth could be colder than Pluto for all they know. The CMB could be warmer than Earth for all they know. They certainly don’t think the average temperature of Earth is 289 K or the CMB is 2.7 K because those values are useless and meaningless.

Geoff Sherrington
Reply to  bdgwx
August 18, 2022 4:34 am

I do not think those claims are so.
Temperature, after all, is classed as one of the 7 primary variables that alone or combined can be used as units for everything we can measure to date. Geoff S

Reply to  Geoff Sherrington
August 18, 2022 6:07 am

The question is can you use temperature in an average? Temperature is an intensive variable that does not measure heat, only translational kinetic energy. The big variable is humidity, whose latent heat is not measured by temperature.

If climate science would admit they are only looking at temperature differences and not enthalpy it would make matters simpler. However, they could not then use temperature as a proxy for heat.

Reply to  Jim Gorman
August 18, 2022 6:26 am

+100!

Carlo, Monte
Reply to  Jim Gorman
August 18, 2022 6:35 am

Much like using power as a proxy for energy?

Reply to  Jim Gorman
August 18, 2022 8:08 am

“The question is can you use temperature in an average?”

This article does use the average, so clearly the answer is yes.

“Temperature is an intensive variable that does not measure heat…”

Heat is a description of energy transfer. It doesn’t measure hotness. I think temperature is a better measure of how hot the air is than heat.

Reply to  Bellman
August 18, 2022 1:07 pm

“Heat is a description of energy transfer. It doesn’t measure hotness. I think temperature is a better measure of how hot the air is than heat.”

Try using enthalpy as a measure of heat. You might want to read about it before making assertions. Does temperature measure all of the sun’s energy regardless of the humidity?

Reply to  Jim Gorman
August 18, 2022 2:37 pm

Enthalpy is not a measure of heat. You keep insisting that I understand thermodynamics before I can comment on averaging temperatures, then don’t even seem to understand the meaning of the word heat.

By all means, determine the total enthalpy of the earth if you need to understand the full extent of warming. But I expect the average temperature is a lot easier to calculate, given it can be measured directly, and as I suggested last time, there probably isn’t much of a difference in practice as far as the surface is concerned. Humidity makes very little difference according to Tim’s figures. By my calculation less than 10% difference between the most extreme conditions.

And, if I’m understanding it correctly, the warmer the air gets the greater the specific heat capacity, so if anything the temperature increase will be less than the enthalpy increase.

Carlo, Monte
Reply to  Bellman
August 18, 2022 3:48 pm

More hand-waving word salad.

Reply to  Bellman
August 18, 2022 5:24 pm

“Enthalpy is not a measure of heat.”

https://byjus.com/chemistry/what-is-enthalpy/
“Enthalpy is the measurement of energy in a thermodynamic system. The quantity of enthalpy equals to the total content of heat of a system, equivalent to the system’s internal energy plus the product of volume and pressure.”

https://www.thermal-engineering.org/what-is-enthalpy-definition/
“In thermodynamics, the enthalpy is a measurement of energy in a thermodynamic system. Enthalpy is equivalent to the total heat content of a system.”

How many more definitions do you need?

“Humidity makes very little difference according to Tim’s figures”

Really? Where do my figures state that? I told you how to calculate enthalpy using the steam tables. What do you think the steam tables are other than a measure of absolute humidity?

You can’t get *anything* right!

“But I expect the average temperature is a lot easier to calculate, given it can be measured directly, and as I suggested last time, there probably isn’t much of a difference in practice as far as the surface is concerned.”

What’s so hard about calculating absolute humidity? My weather station does it! Most modern temperature stations do.

You refuse to say why it’s called dry heat in Phoenix and wet heat in Miami. There *is* a reason why!

“And, if I’m understanding it correctly, the warmer the air gets the greater the specific heat capacity, so if anything the temperature increase will be less than the enthalpy increase.”

Once again,

h = h_a + H*h_g

where h is enthalpy, i.e. heat. h_a is the specific enthalpy of dry air, c_pa * T, where c_pa = 1.006 kJ/kg·C. H is the humidity ratio, m_v/m_a. h_g is the specific enthalpy of water vapor from the steam tables.

Break this down further: h = c_pa*T + (m_v/m_a)*h_g.

Enthalpy will go up *faster* than temperature because there is a water vapor factor added to the enthalpy change and not just the temperature increase.
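The formula h = c_pa*T + H*h_g above can be put into a minimal numeric sketch. Rather than reading h_g from the steam tables, the common linear fit h_g ≈ 2501 + 1.86·T kJ/kg is used here as an approximation, and the humidity ratios are illustrative guesses:

```python
# Specific enthalpy of moist air, per kg of dry air (kJ/kg dry air),
# following h = c_pa*T + H*h_g from the comment above.
# h_g is approximated by the linear fit 2501 + 1.86*T (kJ/kg), a common
# stand-in for the steam tables near ordinary ambient temperatures.

C_PA = 1.006  # kJ/(kg*K), specific heat of dry air at constant pressure

def moist_air_enthalpy(T_c, humidity_ratio):
    """T_c in deg C; humidity_ratio = m_v/m_a (kg vapor per kg dry air)."""
    h_g = 2501.0 + 1.86 * T_c           # specific enthalpy of water vapor
    return C_PA * T_c + humidity_ratio * h_g

# Same 40 C air temperature, very different moisture loads (guessed values):
dry = moist_air_enthalpy(40.0, 0.005)   # dry, Phoenix-like air
wet = moist_air_enthalpy(40.0, 0.020)   # humid, Miami-like air
print(dry, wet)  # the humid parcel carries far more energy per kg of dry air
```

At equal temperature the vapor term dominates the difference, which is the point being argued: temperature alone does not fix the energy content of moist air.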

You are not understanding very much about the real world correctly. Why do you insist on showing that?

Reply to  Tim Gorman
August 18, 2022 5:56 pm

Previous experience on matters I do know about has taught me to avoid byjus.

My limited understanding is that heat content is an obsolete term in thermodynamics, with the term enthalpy now being the correct term.

https://en.wikipedia.org/wiki/Heat

Since many processes do take place at constant atmospheric pressure, the enthalpy is sometimes given the misleading name of ‘heat content’ or heat function, while it actually depends strongly on the energies of covalent bonds and intermolecular forces.

The generic meaning of “heat”, even in classical thermodynamics, is just “thermal energy”. Since the 1920s, it has been recommended practice to use enthalpy to refer to the “heat content at constant pressure”, and to thermal energy when “heat” in the general sense is intended, while “heat” is reserved for the very specific context of the transfer of thermal energy between two systems.

So to be clear, when Jim says

Try using enthalpy as a measure of heat.

he doesn’t mean heat in the current sense, but as thermal energy?

Reply to  Bellman
August 20, 2022 5:01 am

How much “heat” you can transfer is determined by the enthalpy of the material – i.e. its heat content.

It’s like using a No. 1 tip on an acetylene torch as opposed to a rosebud tip. The flame from each has the same temperature but the amount of heat that can be transferred is vastly different. It’s why a steel needle at 100F can melt a bigger hole in a vinyl countertop than a wooden toothpick at 100F. The steel has a larger heat content (enthalpy) and the heat transfer is much larger.

Reply to  Tim Gorman
August 20, 2022 2:36 pm

How much “heat” you can transfer is determined by the enthalpy of the material – i.e. its heat content.

Could you provide a reference. As I say, my understanding of thermodynamics is pretty limited, but I thought that heat was governed by the temperature and heat capacity.

As a thought experiment, if you have something cold but massive like the atmosphere at 5°C, and introduce something small and hot, say an iron bar heated to 80°C, I would expect heat to transfer from the bar into the atmosphere, despite the atmosphere having far more enthalpy.

It’s why a steel needle at 100F can melt a bigger hole in a vinyl countertop than a wooden toothpick at 100F. The steel has a larger heat content (enthalpy) and the heat transfer is much larger.

Again, I thought that was down to heat capacity, not the enthalpy. (Although in that case the enthalpy would depend on the temperature and heat capacity of the objects).

What happens if you repeat this experiment, but the countertop is also at 100F. Now there is no heat transfer, despite big differences in enthalpy.

Reply to  Bellman
August 21, 2022 6:33 am

“Could you provide a reference.”

I can provide you a real world example! I have two soldering irons, one small one that I use on SMD circuit boards and such. I have a larger one that I use for soldering up radiators and for installing PL-259 connectors onto coax. They both reach about the same temperature, 700F to 800F. The small one doesn’t carry enough heat to solder radiators and the large one will burn the circuit board in no time.

While they have the same temperature the amount of heat that each can transfer is very, very different. The difference is the enthalpy, i.e. heat content, of each.
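A rough way to put numbers on the two-iron comparison: the usable energy is what a tip can give up before cooling below the solder’s melting point. The tip masses, temperatures, and copper specific heat below are illustrative guesses, not measurements of any real iron:

```python
# Usable stored energy of two soldering-iron tips at the same temperature:
# the joules each can deliver before dropping below the solder melting
# point. Masses and temperatures are illustrative guesses.

C_COPPER = 0.385  # J/(g*K), specific heat of a copper tip

def usable_joules(mass_g, tip_temp_c, solder_melt_c=185.0):
    """Energy (J) deliverable before the tip falls below solder melt."""
    return mass_g * C_COPPER * (tip_temp_c - solder_melt_c)

small = usable_joules(5.0, 400.0)    # small SMD iron, ~5 g tip (guess)
large = usable_joules(250.0, 400.0)  # big radiator iron, ~250 g tip (guess)
print(small, large)  # same temperature, ~50x the deliverable energy
```

Same temperature, very different deliverable energy, simply because the stored energy scales with mass times specific heat.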

“As a thought experiment, if you have something cold but massive like the atmosphere at 5°C, and introduce something small and hot, say an iron bar heated to 80°C, I would expect heat to transfer from the bar into the atmosphere, despite the atmosphere having far more enthalpy.”

Temperature only gives you the direction of the heat flow. It can’t define how much heat actually gets transferred. The iron bar doesn’t carry enough heat content, i.e. enthalpy, to have a significant impact on the atmosphere. Just like the small soldering iron doesn’t have enough heat content to solder a car radiator.

Again, I thought that was down to heat capacity, not the enthalpy. (Although in that case the enthalpy would depend on the temperature and heat capacity of the objects).”

Go ahead. Think about what you just said. What is heat capacity? Are you speaking of the specific heat constant for a substance? Isn’t it basically how good of a heat conductor it is? How is that related to heat content? The vinyl will offer the same heat capacity to both objects. But one will cause more melting than the other. Why is that?

“What happens if you repeat this experiment, but the countertop is also at 100F. Now there is no heat transfer, despite big differences in enthalpy.”

Again, you are talking about the direction of heat flow, not the amount. If there is no heat flow there is no heat exchange and the heat content of the objects won’t matter. You are trying to conflate two different things. They *are* two different things.

Reply to  Tim Gorman
August 21, 2022 2:25 pm

I can provide you a real world example!

A reference would be more useful. You are good at finding examples that confirm your beliefs, but to really test them you need to look for counter-examples.

What I’m trying to understand is what you mean when you say

How much “heat” you can transfer is determined by the enthalpy of the material – i.e. its heat content.

How do you determine the heat from the enthalpy? If you have two objects and you know the enthalpy of each, how do you determine the heat transfer just from that. Unless you also know their temperatures you cannot even tell if there will be any heat transfer or in what direction it will go.

Then what determines how much heat is transferred are the temperatures, masses and specific heat capacities. I would imagine that it’s possible to work out the total enthalpy from that, but it doesn’t feel like the enthalpy is determining the amount of heat transfer directly.

Think about what you just said. What is heat capacity? Are you speaking of the specific heat constant for a substance?

No, that would be specific heat capacity. I was talking about that multiplied by the mass.

The vinyl will offer the same heat capacity to both objects. But one will cause more melting than the other.

Because it depends on the heat capacity of both the objects and the vinyl.

Reply to  Bellman
August 21, 2022 7:32 pm

“A reference would be more useful. You are good at finding examples that confirm your beliefs, but to really test them you need to look for counter-examples.”

If you had *any* real world experience in the reality of a working man you would understand this totally. Lacking that try here:

https://www.hellopractical.com/how-hot-does-a-soldering-iron-get/

“This may seem like enough to solder alloys with a melting point of 180°C–190°C, but the lower the wattage, the harder it is for a soldering iron to maintain its temperature”

It’s harder for it to maintain its temperature because it doesn’t have enough heat content!

“How do you determine the heat from the enthalpy? If you have two objects and you know the enthalpy of each, how do you determine the heat transfer just from that. Unless you also know their temperatures you cannot even tell if there will be any heat transfer or in what direction it will go.”

Work = ΔH. It takes *work* to heat up metal and solder so the soldering process can be accomplished. If your tool can’t provide the amount of ΔH required (i.e. the amount of work needed) then it is lacking in heat content.

Like I said: “Temperature only gives you the direction of the heat flow. It can’t define how much heat actually gets transferred.” It isn’t just a matter of knowing the temperature, you also have to know that you have enough heat content to accomplish the work that needs to be done!

You simply do not listen!

“Then what determines how much heat is transferred are the temperatures, masses and specific heat capacities. I would imagine that it’s possible to work out the total enthalpy from that, but it doesn’t feel like the enthalpy is determining the amount of heat transfer directly.”

Like I said, you just don’t listen. It is truly getting tiresome. I’m not being paid to tutor you in basic science.

Temperatures determine heat flow direction. Heat content is proportional to mass. Specific heat tells you how easy it is to heat a substance (i.e. copper heats easier than asbestos, specific heat is in joules/gram).

Enthalpy is heat content. You *have* to have enough heat content (enthalpy) to perform the amount of work that is required to accomplish what you are doing.

If you want to raise the temperature of something 1 degree, and specific heat * mass gives you the joules required to do that, then you better have that many joules of energy (heat content = enthalpy) available to perform that amount of work!

“No, that would be specific heat capacity. I was talking about that multiplied by the mass.”

specific heat in joules/gram times mass in grams gives you how many joules are required to raise a substance 1 degree. The unit for enthalpy is joules!

“Because it depends on the heat capacity of both the objects and the vinyl.”

It depends on the number of joules available to do the work required. The needle has more joules of energy in it and can do more work – i.e. melting of the vinyl!

This is all I will post on this. Not only do you need a basic course in calculus you need a basic course in general science that covers basic thermodynamics. If you meet the age requirement see if a local junior college will let you audit a couple of courses.

I have lawnmowers to fix and home projects that need done. I can’t spend hours teaching you the basics of either calc or thermo!

Reply to  Tim Gorman
August 22, 2022 6:12 pm

Same old same old. “You can’t understand thermodynamics if you’ve never wielded a soldering iron in anger.”

As I’ve said, I really don’t know much about thermodynamics, and this could have been an opportunity for Tim to surprise me, by actually providing what I asked for, either a reference to support his claim that heat transfer is determined by enthalpy, or an equation, or even a more detailed explanation of what he means by that.

I’m assuming that this claim is in relation to his claim that 40°C in Miami was hotter than 40°C in Phoenix, because of the greater enthalpy in the atmosphere. His claim that heat transfer is determined by the enthalpy seems to imply there will be a much greater heat transfer in Miami because there is a lot more enthalpy in the atmosphere.

To me, this does not seem correct. What determines the amount of heat transfer is the relationship between the two temperatures, masses and specific heat capacities. Given that the atmosphere in either city is massive, the amount of heat transferred to, say, heat up a bucket of cold water is going to be pretty much the same in either case. A bucket of ice water left in the atmosphere in Miami will eventually heat up to 40°C and have next to no effect on the air temperature, and the same will be true in Phoenix, and in each case the heat transfer will be next to identical.

Reply to  Bellman
August 22, 2022 6:42 pm

Work = ΔH. It takes *work* to heat up metal and solder so the soldering process can be accomplished.

But that’s the change in enthalpy, not the initial amount.

Temperatures determine heat flow direction.

But it also to an extent controls the amount of heat transfer. Objects can only be heated or cooled to the temperature of the other object they are in contact with. In the case of the bucket of water left in the 40°C atmosphere: if it starts at 10°C it will receive 30 times as much energy as it would if it started at 39°C.

Heat content is proportional to mass.

If by “heat content” you mean enthalpy, it’s proportional to mass, specific heat capacity and temperature in Kelvin. Plus pressure times volume.

Specific heat tells you how easy it is to heat a substance (i.e. copper heats easier than asbestos, specific heat is in joules/gram).

You missed off temperature. The SI units of specific heat capacity are J/kg/K.

It depends on the number of joules available to do the work required.

“It” being “The vinyl will offer the same heat capacity to both objects. But one will cause more melting than the other.”

But the number of joules in each object depends on their mass, specific heat capacity and temperature. You could calculate how much energy will be transferred based on the temperature and enthalpy of the two objects, but as far as I can tell it’s generally much easier to do it based on heat capacity and temperature, because they are the properties that directly determine the amount of heat transfer.

Reply to  Bellman
August 24, 2022 6:58 am

“But that’s the change in enthalpy, not the initial amount.”

So what? If you had done *any* studying on this subject you would have found out that you can’t measure enthalpy directly. You *can* measure heat transfer – which *is* related to enthalpy.

“But it also to an extent controls the amount of heat transfer.”

I think you are confusing RATE of heat transfer with AMOUNT of heat transfer. A higher temperature differential will give a higher *rate* of heat transfer. But it doesn’t determine the amount!

“If by “heat content” you mean enthalpy, it’s proportional to mass, specific heat capacity and temperature in Kelvin. Plus pressure times volume.”

Now, if you had even an inkling of what you just posted actually means we wouldn’t be talking about it!

“You missed of temperature. The SI units of specific heat capacity are J/kg/K.”

Specific heat capacity is how many joules it takes to heat a substance one degree K. You then multiply by kg to get the number of joules. The /K simply doesn’t mean what you think it does. It is not the temperature of a specific object!

“But the number of joules in each object depends on their mass, specific heat capacity and temperature.”

AND HOW MUCH WORK HAS BEEN DONE ON THE SUBSTANCE. What do you think PV gives you?

You need to sit down and actually STUDY this subject instead of just cherry picking things you think support you looking like you know what you are talking about! Just like you do with calculus!

Reply to  Tim Gorman
August 24, 2022 3:04 pm

If you had done *any* studying on this subject you would have found out that you can’t measure enthalpy directly.

Which is the point. You claim heat transfer is directly related to enthalpy, but you can calculate heat transfer just from temperature, mass and specific heat capacity, all of which you can measure.

I think you are confusing RATE of heat transfer with AMOUNT of heat transfer. A higher temperature differential will give a higher *rate* of heat transfer. But it doesn’t determine the amount!

No, I haven’t even thought about the rate. It’s just my understanding that you can calculate the equilibrium temperature knowing temperature and heat capacity. No need to know the enthalpy directly.
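The equilibrium-temperature calculation referred to here can be sketched as follows, assuming no losses and no phase change; the masses, temperatures, and heat capacities are illustrative, including the iron-bar-in-cold-air example from earlier in the thread:

```python
# Equilibrium temperature of two bodies in thermal contact, assuming
# no losses and no phase change: energy lost by one equals energy
# gained by the other. All numbers are illustrative.

def equilibrium_temp(m1, c1, t1, m2, c2, t2):
    """Temperatures weighted by heat capacity m*c (any consistent units)."""
    return (m1 * c1 * t1 + m2 * c2 * t2) / (m1 * c1 + m2 * c2)

# Hot iron bar (2 kg, c = 0.45 kJ/kg/K, 80 C) in a huge mass of 5 C air
# (1e6 kg, c = 1.006 kJ/kg/K): the air barely moves, the bar ends near 5 C.
t_eq = equilibrium_temp(2.0, 0.45, 80.0, 1e6, 1.006, 5.0)
heat_from_bar = 2.0 * 0.45 * (80.0 - t_eq)  # kJ given up by the bar
print(t_eq, heat_from_bar)
```

The heat given up by the bar depends on the temperatures and heat capacities, not on the (enormous) total enthalpy of the surrounding air, which is the claim being made here.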

The /K simply doesn’t mean what you think it does. It is not the temperature of a specific object!

I didn’t say it was, just that it’s part of the unit of specific heat capacity.

Reply to  Bellman
August 25, 2022 5:36 am

“Which is the point. You claim heat transfer is directly related to enthalpy, but you can calculate heat transfer just from temperature mass and specific heat capacity, all of which you can measure.”

What do you think you are saying? You still can’t differentiate between “rate” and “amount”! TOTAL heat transfer *is* directly related to enthalpy! You simply cannot transfer more joules than an object has available. That’s why a small soldering iron can’t be used on a radiator but a large one can. The enthalpy of each is vastly different!

“No, I haven”t even thought about the rate.”

That is quite obvious. You *never* think.

“It’s just my understanding that you can calculate the equilibrium temperature knowing temperature and heat capacity.”

OMG! You *still* haven’t bothered to look up the definition of heat capacity, have you?

from wikipedia: “Heat capacity or thermal capacity is a physical property of matter, defined as the amount of heat to be supplied to an object to produce a unit change in its temperature. The SI unit of heat capacity is joule per kelvin (J/K). Heat capacity is an extensive property.”

Reply to  Tim Gorman
August 25, 2022 7:39 am

TOTAL heat transfer *is* directly related to enthalpy!

All I’m asking you to do is define what you mean by “directly related to”.

Do you mean proportional to? Do you mean if you only knew the enthalpy of two objects you could calculate the total heat transfer? Or what?

That is quite obvious. You *never* think.

Weak.

OMG! You *still* haven’t bothered to look up the definition of heat capacity, have you?

Yes I have, and you quoting the definition does nothing to explain what you think you mean.

Reply to  Bellman
August 25, 2022 2:08 pm

“All I’m asking you to do is define what you mean by “directly related to”.”

Small soldering iron vs large soldering iron. What is so hard about this that you can’t understand it? You can transfer more heat to a radiator from a large soldering iron than from a small one!

“Do you mean proportional to? Do you mean if you only knew the enthalpy of two objects you could calculate the total heat transfer? Or what?”

Total heat transfer is ambiguous to begin with! Take an auto radiator. How much heat must be transferred to it in order for it to take solder? That’s a time function as well as a heat function. I could dump a lot of heat into the radiator from a small soldering iron if I leave it on there long enough and keep pumping heat into it – and never get the radiator hot enough to take solder. Or I can take a large soldering iron, heat it with an acetylene torch, and then apply it to the radiator and quickly solder it.

Can you understand the difference? Think about it!

Reply to  Bellman
August 24, 2022 6:49 am

Same old same old. “You can’t understand thermodynamics if you’ve never wielded a soldering iron in anger.”

Science is used to explain the real world. The real world you have no actual experience in. If you did have some real world experience you would understand what heat content (enthalpy) actually is.

“As I’ve said, I really don’t know much about thermodynamics, and this could have been an opportunity for Tim to surprise me, by actually providing what I asked for, either a reference to support his claim that heat transfer is determined by enthalpy, or an equation, or even a more detailed explanation of what he means by that.”

I *DID* provide a reference. One that you apparently blew off! No one can teach you if you aren’t willing to actually do the work to understand! I referred you to the steam tables which you apparently blew off as well.

Once again, enthalpy is an extensive property that depends on the size or amount of substance in an object. The change in enthalpy is the amount of heat transferred. Enthalpy is measured in joules. So is work. Think about that a minute. How much work can a 90lb weakling do to a football blocking sled compared to a 260lb, juiced linebacker? It’s the same with heat! A small soldering iron (at the same temp as a large iron) simply can’t do the same amount of work as a large soldering iron. It can’t transfer as many joules because it doesn’t have as many to transfer!

The formula you are looking for is H = U + PV. It’s all over the internet. You are whining that others refuse to do your research for you. And then when they *do* give you references you just blow them off and never bother to do any actual studying!
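As a sanity check on the formula H = U + PV cited here: for an ideal gas it collapses to m·c_p·T, since U = m·c_v·T and PV = m·R·T. A quick numeric check, with the ideal-gas treatment of dry air as the assumption:

```python
# For an ideal gas, H = U + PV reduces to m*c_p*T, because
# U = m*c_v*T and PV = m*R_specific*T, with c_p = c_v + R_specific.
# Dry-air constants; treating air as an ideal gas is the assumption.

CV = 718.0   # J/(kg*K), specific heat at constant volume
R = 287.0    # J/(kg*K), specific gas constant for dry air
CP = CV + R  # 1005 J/(kg*K), specific heat at constant pressure

m, T = 1.0, 300.0      # 1 kg of air at 300 K
U = m * CV * T         # internal energy
PV = m * R * T         # pressure-volume term from the ideal gas law
H = U + PV
print(H, m * CP * T)   # both terms agree: H = m*c_p*T
```

This is why c_p (not c_v) appears in the moist-air enthalpy formula discussed earlier in the thread.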

“I’m assuming that this claim is in relation to his claim that 40°C in Miami was hotter than 40°C in Phoenix, because of the greater enthalpy in the atmosphere. His claim that heat transfer is determined by the enthalpy seems to imply there will be a much greater heat transfer in Miami because there is a lot more enthalpy in the atmosphere.”

You never, not once, stop to actually relate anything in the physical world in the proper manner. I have two parcels of atmosphere, one dry and the other wet. Which has more mass? Which takes more work to move it up in the atmosphere? Based on the amount of work required, which has the higher enthalpy? Which has more joules to expend in creating a massive thunderstorm?

“To me, this does not seem correct. What determines the amount of heat transfer is the relationship between the two temperatures, masses and specific heat capacities.”

Have you EVER bothered to look up the dimensions of any of these? Do you have even the slightest of clues as to what specific heat capacity is?

I gave you this already but, as usual, you just blew it off!

It’s joules/kg needed to raise the temperature of a substance one degree K.

YOU CAN ONLY TRANSFER AS MANY JOULES OF ENERGY AS A SUBSTANCE HAS! That is related to mass (J/kg). The mass of a small soldering iron is less than the mass of a large soldering iron. The small soldering iron simply doesn’t have as many joules to transfer as a large one! It’s a 90lb weakling vs a 260lb linebacker!

It’s just *SO* damn frustrating trying to tell you anything. You just won’t learn.

Reply to  Tim Gorman
August 24, 2022 2:57 pm

Once again, enthalpy is an extensive property that depends on the size or amount of substance in an object.

It depends on a lot of things aside from mass – specific heat capacity, temperature, pressure and volume.

A small soldering iron (at the same temp as a large iron) simply can’t do the same amount of work as a large soldering iron.

Yes because of their masses. But we were talking about air mass. That’s virtually unlimited. My point is that the amount of heat needed to warm an object up to air temperature is the same regardless of the enthalpy of the air.

YOU CAN ONLY TRANSFER AS MANY JOULES OF ENERGY AS A SUBSTANCE HAS!

No need to shout. It’s correct but irrelevant. The air isn’t going to run out of Joules trying to heat a bucket of water.

Reply to  Bellman
August 24, 2022 3:10 pm

You’ll never learn.

What is the dimension for specific heat capacity?

Why does enthalpy depend on temperature? (hint: temperature vs temperature change)

Volume depends on what?

“Yes because of their masses. But we were talking about air mass. That’s virtually unlimited.”

Air mass is unlimited? A cubic foot of air can weigh anything? Tons?

“My point is that the amount of heat needed to warm an object up to air temperature is the same regardless of the enthalpy of the air.”

Really? You think the same number of joules is required to heat a small soldering iron and a large soldering iron to the same temperature? Think about that for a minute!

“No need to shout. It’s correct but irrelevant. The air isn’t going to run out of Joules trying to heat a bucket of water.”

Unbelievable. You are unteachable.

Reply to  Tim Gorman
August 24, 2022 3:37 pm

What is the dimension for specific heat capacity?

I’ve already told you and you can easily find it on the internet. It’s J/kg/K

Why does enthalpy depend on temperature?

I’m not sure what you mean by “depend”.

To get an object up to a specific temperature from absolute zero requires specific heat capacity * mass * temperature, in addition to the work needed in increasing volume.

Air mass is unlimited?

You failed to read the word “virtually”.

A cubic foot of air can weigh anything? Tons?

No, I’m talking about the air mass over Miami or wherever.

Really? You think the same number of joules is required to heat a small soldering iron and a large soldering iron to the same temperature?

No. I’m talking about the amount of energy needed to heat the same mass of the same substance from the same initial temperature to the temperature of the air. My point is that the amount of enthalpy in the air is almost entirely irrelevant to the amount of heat needed to do that.

Reply to  Bellman
August 18, 2022 5:03 pm

“Heat is a description of energy transfer. It doesn’t measure hotness. I think temperature is a better measure of how hot the air is than heat.”

OMG! Where in Pete’s name did you get this idea?

It is *heat* that makes it hotter in Miami than in Phoenix for the very same temperature! What do you suppose causes that?

Reply to  Tim Gorman
August 18, 2022 5:45 pm

OMG! Where in Pete’s name did you get this idea?

What idea? That heat is a description of energy transfer or that it doesn’t measure hotness?

I got it from various sources, including

https://en.wikipedia.org/wiki/Heat

In thermodynamics, heat is energy in transfer to or from a thermodynamic system, by mechanisms other than thermodynamic work or transfer of matter (e.g. conduction, radiation, and friction).

According to Denbigh (1981), the property of hotness is a concern of thermodynamics that should be defined without reference to the concept of heat. Consideration of hotness leads to the concept of empirical temperature. All physical systems are capable of heating or cooling others. With reference to hotness, the comparative terms hotter and colder are defined by the rule that heat flows from the hotter body to the colder.

As I’ve said before, I know little of thermodynamics, so if I am misunderstanding something, please explain.

It is *heat* that makes it hotter in Miami than in Phoenix for the very same temperature! What do you suppose causes that?

Could you explain exactly what these different heat flows are?

As far as I can see, the only reason it feels hotter in Miami even though they have the same temperature is if, say, it’s more humid, so you can’t cool down so quickly through sweating. Or maybe it’s windier in Phoenix, which helps your skin cool down.

Reply to  Bellman
August 19, 2022 10:49 am

“What idea? That heat is a description of energy transfer or that it doesn’t measure hotness?”

You keep confusing heat, i.e. energy being transferred, with heat content. Even today enthalpy is still described as heat content.

Read closer!

Reply to  Tim Gorman
August 19, 2022 11:54 am

I’m not the one confusing heat with hotness.

Reply to  Bellman
August 19, 2022 1:02 pm

No, you’ve used the argumentative fallacy of Equivocation to change the subject by offering up a red herring.

The issue isn’t heat transfer; the issue is heat content and the amount of heat transferred.

A parcel of moist air takes more work to lift than does a parcel of dry air; the water vapor adds mass. A parcel of moist air also carries more heat content (enthalpy) with it, because the total heat content is the sum of the dry-air and water-vapor contributions. The moist air carries with it latent heat content as well as sensible heat content.

Water is a common working medium in thermal power plants (think steam engine). The steam tables allow engineers to easily determine the enthalpy, volume, pressure, and all kinds of attributes for steam. And since water vapor *is* steam, that’s why the steam tables work for the atmosphere as well.

The steam tables allow you to determine the heat energy carrying capacity of the medium. And that changes with absolute humidity.
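The enthalpy difference being argued over can be estimated with the standard psychrometric approximation for moist-air enthalpy. A sketch with textbook coefficients and illustrative conditions of my choosing (35 °C air, dry vs. a humidity ratio of 20 g/kg):

```python
# Standard moist-air enthalpy approximation, per kg of DRY air, T in deg C:
#   h = cp_da * T + w * (h_fg + cp_wv * T)
CP_DRY = 1.006   # kJ/(kg*K), dry air
CP_VAP = 1.86    # kJ/(kg*K), water vapor
H_FG = 2501.0    # kJ/kg, latent heat of vaporization at 0 C

def moist_air_enthalpy(t_c, w):
    """w is the humidity ratio, kg of water vapor per kg of dry air."""
    return CP_DRY * t_c + w * (H_FG + CP_VAP * t_c)

h_dry = moist_air_enthalpy(35.0, 0.000)     # bone-dry 35 C air
h_humid = moist_air_enthalpy(35.0, 0.020)   # muggy 35 C air, w = 20 g/kg
# Same temperature, but the humid parcel carries far more (mostly latent) heat content.
```

With these numbers the humid parcel's enthalpy is more than double the dry parcel's at the identical temperature, which is the substance of the Miami-vs-Phoenix comparison.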

Reply to  Bellman
August 19, 2022 1:06 pm

As far as I can see, the only reason it feels hotter in Miami even thought they have the same temperature, is if say it’s more humid, so you can’t cool down so quickly through sweating. Or maybe it’s windier in Phoenix which helps your skin cool down.”

That’s only part of it! Why do you suppose storms in Miami can be far more energetic than in Phoenix? It’s a function of how much heat energy a parcel of air can have in a storm – and that is based on its enthalpy – which, in turn is affected by the absolute humidity!

It’s not just the cold air/hot air interface that determines the power in the storm. It’s also the heat energy available in the atmosphere.

bdgwx
Reply to  Geoff Sherrington
August 18, 2022 6:33 am

I don’t disagree. I’m just saying that if the body is not in thermal equilibrium the contrarians don’t think it even has a temperature. Literally…the phrase “does not exist” is being used.

Carlo, Monte
Reply to  bdgwx
August 18, 2022 6:48 am

Who are these “contrarians” that you refer to again and again, mr. disingenuous?

Reply to  bdgwx
August 19, 2022 10:50 am

“I don’t disagree. I’m just saying that if the body is not in thermal equilibrium the contrarians don’t think it even has a temperature. Literally…the phrase “does not exist” is being used.”

Stop whining and post some actual quotes to show this!

bdgwx
Reply to  Tim Gorman
August 19, 2022 12:10 pm

TG said: “Stop whining and post some actual quotes to show this!”

Essex et al. 2007

We argue herein that these disputes have their root in attempting to estimate a physical quantity that does not actually exist, and hence there is no prospect of resolution on scientific grounds.

The context of equilibrium comes from section 2.1 covered on pgs. 5-7.

Consequently, the average is not a temperature anywhere in the system, which contradicts the proposition that the average is a temperature.

Reply to  bdgwx
August 19, 2022 3:14 pm

bdgwx,

So you decided to quote ONE contrarian? When you said it was the “contrarians”?

“We argue herein that these disputes have their root in attempting to estimate a physical quantity that does not actually exist, and hence there is no prospect of resolution on scientific grounds.”

Do you see the word “equilibrium” anywhere in this quote?

“Consequently, the average is not a temperature anywhere in the system, which contradicts the proposition that the average is a temperature.”

Do you see the word “equilibrium” anywhere in this quote?

You posted a lie about what people are saying. Admit it and move on. Stop whining.

bdgwx
Reply to  Tim Gorman
August 19, 2022 6:48 pm

TG said: “Do you see the word “equilibrium” anywhere in this quote?”

That’s on page 2.

But an average of temperature data sampled from a nonequilibrium field is not a temperature.

And of course the concept of equilibrium was the crux of the point and quote from pg. 6.

Reply to  bdgwx
August 20, 2022 7:21 am

“That’s on page 2.”

Then why didn’t you include it?

Here is an excerpt:

“While that statistic is nothing more than an average over temperatures, it is regarded as the temperature, as if an average over temperatures is actually a temperature itself, and as if the out-of-equilibrium climate system has only one temperature. But an average of temperature data sampled from a non-equilibrium field is not a temperature. Moreover, it hardly needs stating that the Earth does not have just one temperature. It is not in global thermodynamic equilibrium – neither within itself nor with its surroundings.” (bolding mine, tg)

Which is exactly OPPOSITE of what you were trying to imply!

“And of course the concept of equilibrium was the crux of the point and quote from pg. 6.”

In the context that an object that is not in equilibrium has no average temperature!

bdgwx
Reply to  Tim Gorman
August 20, 2022 5:54 pm

TG said: “Which is exactly OPPOSITE of what you were trying to imply!”

My point is that if the body is not in thermal equilibrium the contrarians don’t think it even has a temperature. Literally…the phrase “does not exist” is being used.

Are you saying that you think Essex et al. are saying that a body not in equilibrium does, in fact, have a temperature? Where do they say that?

Reply to  bdgwx
August 21, 2022 7:07 am

I repeat: “In the context that an object that is not in equilibrium has no average temperature!”



Carlo, Monte
Reply to  Tim Gorman
August 21, 2022 7:26 am

Amazing how “average temperature” morphs into “temperature”!

Reply to  Carlo, Monte
August 21, 2022 3:57 pm

yep!

Carlo, Monte
Reply to  bdgwx
August 19, 2022 3:20 pm

Fess up time, here is what you REALLY don’t like about the paper:

Physical, mathematical, and observational grounds are employed to show that there is no physically meaningful global temperature for the Earth in the context of the issue of global warming.

Reply to  bdgwx
August 18, 2022 6:25 am

“Over in Kip’s other article the contrarians decided that any body not in equilibrium cannot have a temperature, average or otherwise.”

If that’s what you took from the thread then you didn’t read the Essex paper at all and you ignored most of the comments.

What you were told was that the S-B equation only gives a valid answer for objects in thermal equilibrium, and that since temperature involves a lot of local extensive attributes, there is no legitimate average between two measuring stations. The extensive attributes at Station1 can be different from those at Station2, even with minimal distance between them. That’s a result of temperature not being able to act at a distance: the temperature at one station cannot force the temperature at another.

CMB is a CALCULATED intensive value derived from the average of an extensive value; it is not a measured value. It is the intensive attribute of an extensive attribute, not an average of intensive attributes.

You do your credibility significant damage when you put strawman arguments in others’ mouths that they didn’t actually say.

Carlo, Monte
Reply to  Tim Gorman
August 18, 2022 6:49 am

This guy has the credibility of a three-dollar bill.

Reply to  bdgwx
August 18, 2022 7:58 am

To be fair, the Essex paper does say we can tell the Earth is hotter than Pluto and colder than the Sun. It just requires you to know that the hottest spot on Pluto is colder than the coldest place on Earth.

bdgwx
Reply to  Bellman
August 18, 2022 8:04 am

Got it. So I wonder what Essex et al. would argue for the Earth and Moon in regards to which one is hotter/colder? Ya know…because…the hottest spot on the Moon is still warmer than the coldest spot on Earth, and yet the hottest spot on Earth is still warmer than the coldest spot on the Moon.

Reply to  bdgwx
August 19, 2022 10:52 am

What the Essex paper says is that there is no place between the Earth and the moon that exists at the average temperature of the Earth and the moon!

If you think there is then tell us where we can go to find that average temp somewhere out in space.

Reply to  Tim Gorman
August 19, 2022 11:52 am

The question wasn’t what the average of the moon and the Earth was. It’s whether you can say that the moon is colder than the Earth.

Reply to  Bellman
August 19, 2022 3:06 pm

Which side of the moon?

bdgwx
Reply to  Tim Gorman
August 19, 2022 12:03 pm

Which one (Earth or Moon) is warmer and what rule do you use to make the conclusion?

Reply to  bdgwx
August 19, 2022 3:06 pm

Is the moon in temperature equilibrium? Which side of the moon are you speaking of?

bdgwx
Reply to  Tim Gorman
August 19, 2022 6:43 pm

TG said: “Is the moon in temperature equilibrium?”

No. If it helps you can use the Diviner data to answer the question.

TG said: “Which side of the moon are you speaking of?”

All of it. The Moon is body A and the Earth is body B. Is A or B hotter/warmer?

August 17, 2022 8:46 am

Three five-year trends (the last one slightly longer), all down-trending, add up to one up-trending graph when placed end to end in date order.

Which is exactly the problem with all these cherry-picked periods. You can always find short-term flat or negative trends which nonetheless add up to a warming trend. To see why, you have to put them all on the same graph and see how much of a gap there is between the end of one and the start of the next.
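The point that several within-segment downtrends can concatenate into an overall uptrend is easy to demonstrate with synthetic data (a minimal sketch, assuming NumPy; the numbers are my own invention, not USCRN data):

```python
import numpy as np

def ols_slope(x, y):
    """Slope of an ordinary least-squares straight-line fit."""
    return np.polyfit(x, y, 1)[0]

# Three consecutive 5-point segments, each drifting DOWN within itself,
# but each starting a full step higher than the previous one started.
def segment(offset):
    return offset - 0.1 * np.arange(5.0)   # slope -0.1 inside the segment

y = np.concatenate([segment(0.0), segment(1.0), segment(2.0)])
x = np.arange(len(y), dtype=float)

piece_slopes = [ols_slope(x[i:i + 5], y[i:i + 5]) for i in (0, 5, 10)]
overall_slope = ols_slope(x, y)
# Every piece trends down, yet the series as a whole trends up,
# because of the jumps between the end of one piece and the start of the next.
```

The gaps between segments carry the rise; looking only inside each segment hides it, which is why the segments have to be plotted on one common graph.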

August 17, 2022 8:49 am

Speaking of trends, when was it decided that governments should look at trends and take steps to alter them? Isn’t that more of a personal and professional responsibility?

It seems to me that the only trend important to government operations is the one that charts the value of the treasury. And since we can now borrow from ourselves by creating fiat money out of thin air that never is repaid, even that trend is no longer important.

When deciding that trends must be changed, value judgements must first be made. That is where the arguments arise. But after the decisions on value are made, the performance trend must also be examined. That is where the rubber hits the road and where the effort is meaningful. If there is no performance, then the value is also meaningless.

kwinterkorn
August 17, 2022 8:57 am

It seems to me that all trend discussions should mention “error of measurement” for the discussed data.

Any “trend” that falls within the error bars is

  1. small, maybe unimportant
  2. in doubt.

If climate alarmists cannot show trends that exceed the appropriate error bars, then probably the Earth is not going to die any time soon.

Terry
August 17, 2022 9:01 am

Same problem occurs in accounting, where one year’s profit or loss is not representative of how the company is doing long term.

MarkW
Reply to  Kip Hansen
August 17, 2022 10:36 am

If you want to live a long and happy life, the worst thing you can do is check the value of your investments every day.

Reply to  Terry
August 17, 2022 9:39 am

Not true.
Consider a start-up company that finally earns a profit after losing money for years: that news is significant even though it is only for one year.

Consider a growth company that suddenly has a bad year and loses money for the first time in years: that news is significant, especially if it cannot be blamed on a recession, even though it is only for one year.

If your stock portfolio value declines, you have lost wealth; it does not matter whether you sell or not. If it is higher a year later, you have gained wealth.

MarkW
Reply to  Richard Greene
August 17, 2022 10:37 am

The news is significant, however it is also quite meaningless.

Reply to  MarkW
August 17, 2022 1:51 pm

You are no investor!

Reply to  Terry
August 17, 2022 12:45 pm

That is why the smart traders look at financial statements and marketing data, i.e., FUNDAMENTALS, and not trends.

Reply to  Jim Gorman
August 17, 2022 1:53 pm

In fact, traders look at trends more than fundamentals, and investors look at fundamentals more than trends. The best investors look at both, whether they admit it or not.

Truthbknown
August 17, 2022 9:20 am

Bill Gates needs you all to DIE so he can have a pristine planet! Shut up and do it slaves!

Rick C
August 17, 2022 9:21 am

Kip: Interesting example of the tendency of people to see trends they think are significant in essentially noisy data. I’d suggest that whenever you apply a linear regression, you obtain the R^2 value or its root, the correlation coefficient. Very low R^2 values, say under 0.3, mean the “trend” observed is meaningless. I ran an ordinary linear regression on the 2005-2015 USCRN data and the R^2 was 0.007 (CC = 0.08). At that level the sign of the trend is irrelevant.
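The check Rick C describes can be reproduced on synthetic data (pure noise standing in for the anomaly series; this is an illustration, not the actual USCRN data):

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.arange(120, dtype=float)            # e.g. 120 monthly values
y = rng.normal(0.0, 1.0, size=x.size)      # pure noise, no underlying trend

slope, intercept = np.polyfit(x, y, 1)     # OLS always returns *some* slope
residuals = y - (slope * x + intercept)
r_squared = 1.0 - residuals.var() / y.var()
# r_squared lands near zero on noise, so whatever sign `slope` happens to
# have carries essentially no information about a real trend.
```

The fit never refuses to produce a slope; the R² value is what tells you whether that slope is worth discussing.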

Rick C
Reply to  Kip Hansen
August 17, 2022 1:41 pm

No, because when there is an actual cause and effect the trend will be clearly evident in a strong correlation, say R^2 > 0.9. Of course the correlation could be spurious, which is why we say correlation does not prove causation. But the inverse statement, that lack of correlation disproves causation, is true.

Of course, time-series regressions do not relate an independent variable to a dependent variable other than time. They assume a change in some unidentified independent variable. To judge whether CO2 causes warming, regress temperature against CO2.

Rick C
Reply to  Kip Hansen
August 17, 2022 6:46 pm

Kip ==> With something like average global temperature there are so many possible causes for change that it is quite a hopeless task to disentangle them. In engineering we use multi-factor experimental design to figure out the effects of multiple variables on a process outcome. But to do this we must be able to adjust and control each relevant variable, run the process with preselected levels of each variable, then precisely measure the process result. It’s not something we can do with weather. The cli-sci world tries to do this with computer models, but there is no way to validate them, so it’s no surprise that they are wrong in their predictions so often.

Clyde Spencer
Reply to  Rick C
August 17, 2022 12:31 pm

An R^2 value of 0.5 means that only 50% of the dependent variable variance can be explained by the variance of the independent variable. That means in predicting the value of something, you will usually be wrong, with the error as likely to be too high as it is too low.

If you run a casino, you might be able to eke out a profit with an R^2 of 0.51, but it will be small unless the volume is very high.

Reply to  Clyde Spencer
August 17, 2022 4:45 pm

“That means in predicting the value of something, you will usually be wrong, with the error as likely to be too high as it is too low.”

I’m sure we’ve had this discussion before, but you seem to think that an r^2 value of 0.5 means there’s an equal chance a value will be above the prediction or below it. In fact, unless your r^2 value is 1, you will always have as much chance of the value being below as above the trend line. The trend line isn’t trying to predict specific values; it’s a prediction of the average value.
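The claim that points scatter evenly above and below the fitted line, whatever the r² is, follows from the fact that OLS residuals (with an intercept) sum to zero. A quick sketch on synthetic data of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(200, dtype=float)
y = 0.01 * x + rng.normal(0.0, 1.0, size=x.size)   # weak trend, lots of noise

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)
n_above = int(np.sum(residuals > 0))
n_below = int(np.sum(residuals < 0))
# Residuals of an OLS fit with an intercept sum to (numerically) zero,
# so the data points fall above and below the line in roughly equal numbers,
# no matter how small the r^2 of the fit is.
```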

Clyde Spencer
Reply to  Bellman
August 17, 2022 7:53 pm

OK, fair enough that the way a regression line is calculated you will always have about half the points above the regression line and half below.

However, look at it this way: with an R^2 value of 1.000…, all the points will fall on a 45 deg line, a perfect 1:1 correspondence between predicted and observed values. With an R^2 value of 0, few if any will fall on the line, and/or the slope will be zero. As the R^2 value decreases from 1.000…, the error grows from zero to a value so large that it has no practical predictive value. The question should be, “At what point does the correlation coefficient cease to have any utility?” I’m of the opinion that a correlation coefficient (R) below about 0.7 has little practical predictive value. It suggests that the independent variable has some influence, which allows a prediction of the sign of the slope, but that there are probably one or more other unidentified variables that will be at least as useful. Hence the need for multivariate analysis. A correlation matrix will be more informative than a single number.
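The link between R² and prediction error can be stated as an identity: the residual standard deviation of an OLS fit is sd(y)·sqrt(1 − R²), so the error grows smoothly toward the full spread of the data as R² falls toward zero. A sketch with synthetic data constructed so the true R² is about 0.5:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=5000)
y = x + rng.normal(size=x.size)            # signal and noise of equal variance,
                                           # so the true R^2 is 0.5

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)
r_squared = 1.0 - residuals.var() / y.var()

# Identity check: residual spread equals sd(y) * sqrt(1 - R^2).
resid_sd = residuals.std()
implied_sd = y.std() * np.sqrt(1.0 - r_squared)
```

At R² = 0.5 the typical prediction error is still about 71% of the raw spread of y, which is the quantitative content of the "R below about 0.7 has little predictive value" rule of thumb.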

Reply to  Clyde Spencer
August 18, 2022 12:54 pm

For a dependent variable with wide variations against time, such as temperature, a linear regression will understate the max temps and overstate the minimums. That’s life, and no amount of math can change it. Throw in cyclic behavior, and exactly what are you obtaining?
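The closing point about cyclic data can be shown directly: a straight-line fit through a purely seasonal series sits far below its peaks and far above its troughs (synthetic cycle of my own construction, not real temperatures):

```python
import numpy as np

x = np.arange(365, dtype=float)
y = 10.0 * np.cos(2.0 * np.pi * x / 365.0)   # pure +/-10 seasonal cycle

slope, intercept = np.polyfit(x, y, 1)
fit = slope * x + intercept
# The fitted line hugs the mean of the cycle and never gets near the extremes.
gap_at_peak = y.max() - fit.max()       # how far the line falls short of the highs
gap_at_trough = fit.min() - y.min()     # how far it overshoots the lows
```

The line summarizes the average level while throwing away the amplitude, which is exactly the understated-max, overstated-min behavior described above.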