A sensational science paper has blown holes in alarmist claims that global temperatures are surging. Just-published results in Nature show “limited evidence” for a warming surge. “In most surface temperature time series, no change in the warming rate beyond the 1970s is detected despite the record-breaking temperatures observed in 2023,” the paper says. Written by an international group of mathematicians and scientists, it is unlikely to be acknowledged in the mainstream media, where general hysteria reigns over the anomalous 2023 experience. As we have seen, constant misinformation is published to scare the general public, exemplified by climate comedy-turn Jim ‘jail the deniers’ Dale forecasting almost daily Armageddon and exhorting people to “join up the dots”.
In science, one swallow does not make a summer, and in climate science it is impossible to show a trend by picking on short periods or individual weather events. This paper is an excellent piece of climate science work since it takes the long statistical view and challenges the two-a-penny clickbait alarmists looking for a headline on the BBC. The Intergovernmental Panel on Climate Change is a biased body, but it understands the importance of long-term climate trends, stating, much to the chagrin of Net Zero-promoting activists, that it can find little or no human involvement in most extreme weather events, either in the past or in the likely immediate future. But these findings, along with the paper on the warming trend, are inconvenient to those promoting the unproven claim that humans control the climate thermostat by utilising hydrocarbons.
The paper is highly technical, and mathematically inclined readers can study the full workings in the open-access publication. It notes that global temperature datasets fluctuate due to short-term variability, and this often creates the appearance of surges and slowdowns in warming. It is important to consider random noise caused by natural variation when investigating the recent pauses in temperature and the more recent “alleged warming acceleration”, it adds. In fact there have been a number of plausible explanations given for the recent spike, with attention focused on the massive Hunga Tonga submarine volcano adding 13% extra water vapour to the stratosphere, a strong El Niño, and even the reduction in atmospheric particulates caused by recent changes in shipping fuel. The mathematicians tested several “changepoints” and found that “a warming surge could not be reliably detected any time after 1970”.
While the focus was on whether there had been a continued acceleration in the rate of global warming, it was recognised how unusual the surface temperature anomalies were in 2023. Indeed they were, and it was widely argued that this showed the climate was breaking down, or in the silly words of the UN chief Antonio Guterres that the planet was “boiling”. Last year’s hysterics were useful for short-term alarmism, but they helped destroy the ‘settled’ science around CO2. If human-caused CO2 is responsible for the rise, why did the temperature pause from 1998-2012 when atmospheric levels of the gas were on the up? Does alarmism on the BBC and most other mainstream media only apply when temperatures spike upwards for a few months?
One of the key conclusions in the paper arises from considering two time series – 1970-2023 and 2013-2023. This of course includes the early 1970s, when global cooling fears were all the rage and average temperatures were falling. Estimated temperature trends were 0.019°C per year for the first segment and 0.029°C per year for the second, which includes last year’s spike. This 0.029°C estimated slope “falls far short” of the increase needed to point to a change in the warming trend in the recent past. This is because of short-term variability in the U.K. Met Office HadCRUT global database since 1970 and “uncertainty” over the 2012 changepoint. That uncertainty arises over whether 2012, the end of the pause, was a year marking an important change in the longer time series. “The HadCRUT record is simply not long enough for the surge to be statistically detectable at this time,” they note.
Cliff Mass is Professor of Atmospheric Science at the University of Washington. He has a golden rule of weather extremes: “The more extreme a climate or weather record is, the greater the contribution of natural variability, and the smaller the contribution of human-caused global warming.”
The mathematicians used changepoint statistical techniques, which are designed to identify structural changes over time. Four global mean surface temperature records over 1850-2023 were used, including HadCRUT. This of course is problematic, since there is substantial evidence that these datasets hype the warming trend by their careless treatment of urban heat corruptions – the fact that urban areas become warmer through ongoing development. In addition, substantial retrospective adjustments are made, often cooling the past and warming the near present to increase the ‘trend’. Despite writing copiously about the 1998-2012 ‘pause’, the Met Office has now removed it from its own record by adding 30% retrospective warming. Perhaps the Met Office need not have worried, with the mathematicians noting that the pause was “not unusual” given the level of short-term variability present in the data. But these datasets are the best we have, and nobody doubts that the planet has warmed a small amount over the last 200 years since the lifting of the Little Ice Age. For want of anything better, using these datasets for scientific analysis is fair, although it could be argued that overall warming is probably less than this paper suggests.
Chris Morrison is the Daily Sceptic’s Environment Editor.
The paper does not dispute that it has been warming rapidly. They just claim, using their model, that you can’t be sure that warming has been accelerating. It’s always harder to establish a second derivative numerically.
I’m doubtful about their model, though. One offering is a discontinuous piecewise model, which is unphysical. It implies infinite acceleration at the breaks. I don’t see that they have accounted for this properly. They also do a continuous model, but I am not sure how the results are harmonised.
They were concerned that 2023 did not really fit with their modelling. They would have a lot more cause for concern had they included 2024.
This paper is junk science anyway, because it uses the highly tainted and maladjusted and meaningless surface data.
The ONLY warming in the last 45 years in the UAH data has come at major El Nino events.
There was essentially no warming from 1980-1997,
no warming from 2001-2015,
and cooling from 2017-2023.4.
Not only is there no sign of any human caused warming in the last 45 years,
…but there is certainly no sign of any “acceleration”, just a strong 2023 El Nino, probably enhanced and extended by the water vapour from the Hunga Tonga eruption.
Glad you admit that claims using models are very suspect.
You have just destroyed the whole AGW / climate change malarkey, which is based on “models”…
…, exposing it for the rancid non-science propaganda that it is.
This paper is most certainly NOT “junk science”. It is a detailed statistical analysis of the same available weather data that the warmunists routinely abuse to make their claims of accelerating warming. This study does not claim to confirm or deny the data – it simply blows up the claim of accelerating warming.
The quality of the data is entirely another matter.
“This paper is junk science”
BeNasty
Your comments would have to improve by 97% just to reach the level of junk science.
Yawn !!
A stupid comment from Mr zero science.
‘Allo Nick. Still ploughing that lonely furrow?
Great posting, strativarius. The alarming point is that current CO2 is as low as it was during the great Permian die-off. Not good.
I think, Ron, that for a lot of the… let’s call them dark greens, a die-off is what they seem to desire.
strat,
The Dark Greens seem completely oblivious to the CO2 trend, which would have levels falling below 150ppm sometime in the next few glacial periods! That would mean not only the end of humanity, but of virtually ALL life on Earth!
Anyone that suicidal should be treated very gently, and NEVER allowed anywhere near the levers of power!
The fruit flies living in my plastic composter from Costco would agree.
Dude!!!! C’mon. You are trying to hide from the clear yappings of you and the warmunists over the decades, with their claims of ACCELERATING warming … which, according to their ideology of “settled science”, correlates exactly with the known accelerating increase in atmospheric CO2.
If there is no acceleration in warming then that utterly destroys their hypothesis that accelerating CO2 controls warming.
Not only that, but if there is no accelerating warming, then ipso facto, there is no global warming “crisis” – a “crisis” that demands immediate, emergency, radical “solutions” that are clearly harmful to human welfare.
Ever familiarize yourself with the Roman god Janus, usually represented by two heads facing in opposite directions? His symbolic representations are used as a symbol of duplicity and two-facedness (as in “two-faced liar”). That is you, and your fellow warmunists.
The paper certainly says there was huge acceleration. Here is their default model:
It says there was a huge acceleration in about 1970, and then linear warming thereafter. They say you can look at other models, but can’t statistically reject this one-change null. I think that is just statistics ignoring physics. There is no physical reason to prefer such an abrupt change. It is just imposed by the rules they used, which pile all the acceleration into one breakpoint.
But warming is warming. They put it at about 2°C/century, post-1970. Even without acceleration, that needs attention.
Models!!!!
The last refuge of the [scientific] scoundrel.
That is what the “sensational science paper has blown holes in alarmist claims” uses.
Alarmists, Attributionists etc etc etc have nothing without a model.
Show me the problem… if you can.
Strictly speaking, the graph above is a “model.” However, it would be more interesting to me if, instead of two time series, the CO2 concentration were plotted against the temperatures showing the correlation and the r^2 value. Have you seen tabular data that could be converted easily to xy values?
Not a model, which is used to predict future behavior, dude. This is a regression analysis of existing data.
Some scientist you are, not understanding the difference.
Which means it was looking for a rate of a rate. Nick missed this detail of the study entirely as he never mentioned it.
“he never mentioned it”
My first para in this thread said:
“The paper does not dispute that it has been warming rapidly. They just claim, using their model, that you can’t be sure that warming has been accelerating. It’s always harder to establish a second derivative numerically.”
So you admit it is just gibberish based on junk-data.
Yes, Nick… we know that!
Warming rapidly? In what world is it warming rapidly?
It is a statistical model. From the abstract:
‘We use changepoint models”
Nick doesn’t say anything about their models predicting “future behaviour”.
There are such things such as statistical models
The models referred to by him are (from the paper, which contains 89 references to their “models”)…
“Changepoint models: Our work entails fitting several changepoint time series models that partition the GMST into regimes with similar trends using piecewise linear regression models. This work is most concerned with changes in the trend of the series.”
They are statistical models performed on existing data, as is plainly stated and is obvious when the data they used is shown to be from the 4 major GMST series (NOAA, Hadcrut, NASA, Berkeley).
I would suggest with a high degree of confidence that Nick merely assumed (never a good idea here admittedly) that denizens were capable of following the discussion without (as here) the trigger word “model” bringing the old cognitive dissonance thing into play.
FYI: climate modelling projections use physics and not statistics.
“climate modelling projections use physics”
BS.. they mostly use pseudo-science and fake data.
ROTFLMAO.
You must have a different definition of parameters being physics based than I do!
Yes, as compared to models which presuppose an ECS. Big difference, and Nick was equating the two.
This commonly held misconception is unfortunately why climate “science” has persisted for so long. For example, clouds are effectively modelled statistically, and that alone is enough to destroy the models’ ability to project.
Exactly 100%. He doesn’t understand the difference because he doesn’t want to understand the difference.
And there you have it straight from the horse’s mouth. “The HadCRUT record is simply not long enough for the surge to be statistically detectable at this time,” they note.
It uses the FAKE alarmist urban data to show there is no acceleration even in the massively corrupted surface data.
Get over it, Nick.
Try to understand the difference between a model based on actual data, facts and measurement and models based on an unknown ECS. For Christ’s sake!
Ha ha ha ha ha ha ha ha NASA ha ha ha ha ha
NASA took a wrong turn going after Muslim outreach, but now they have tampons in men’s restrooms.
Great for nose bleeds !!!
University of Michigan Health disagrees.
https://www.uofmhealth.org/conditions-treatments/ear-nose-throat/nosebleeds#:~:text=Do%20not%20pack%20the%20nose,can%20make%20the%20bleeding%20worse.
Here are some suggested uses: https://www.businessinsider.com/tampon-hacks-men-women-save-your-life-2015-8
So Kamala’s running mate can be used to stop nosebleeds? Is that what your second link is saying? (It’s behind a paywall.)
Loved Trump’s comment about TT at the Al Smith dinner…
Something like..
“I used to think men couldn’t get pregnant… then I met Tim Walz”
They might have some utility in staunching the bleeding from being pierced by a saber in a duel.
Look at these graphs closely from about 1950 to ~1975. Do you see any cooling in this range of time? If not, then something is askew in the analysis. Either that, or the 1970 cooling scare was based upon funky data. Which is it?
There was no 1970 cooling scare.
How many headlines do you need to see from that time, dude?
From:
https://longreads.com/2017/04/13/in-1975-newsweek-predicted-a-new-ice-age-were-still-living-with-the-consequences/
Reporters didn’t make this up from nothing. Climate scientists originated the idea that it might happen, using the same data as is used today.
Someone in Newsweek once wrote an article. There was another in Time, although more ambivalent.
And did the Newsweek journalist just write a science fiction article?
Dude, I lived this. It was fueled by “science” being done in universities using guesses. JUST LIKE NOW!
Yes. Here is what the author, Peter Gwynne, said in 2014:
“While the hypotheses described in that original story seemed right at the time,” Gwynne explained, “climate scientists now know that they were seriously incomplete. Our climate is warming — not cooling, as the original story suggested.” Put simply, he said, climate science evolved and advanced, resulting in new knowledge.
(One of the climate scientists whose research Gwynne cited in the Newsweek article long maintained that some aspects of the story were basically correct. But George Kukla, who had observed the satellite photos showing increased snow cover over North America during the early 1970s, denied believing that a sustained period of significant cooling had been imminent. He told an interviewer in 2007 that “none of us expected uninterrupted continuation of the trend.” Instead, he viewed the warming that followed as a cyclical and mostly naturally occurring prelude to the start of a cool-down that will become apparent in the middle of this century.)
Gwynne admitted that although his article accurately captured threads of meteorological thinking from the 1970s, “I didn’t tell the full story back then.” He left out the suggestive, but not then conclusive, evidence of CO2 increases in the atmosphere. He could not have possibly known that initiatives to reduce air pollution would quickly erase the blip of cooler temperatures in North America and help send temperatures up.
Gwynne also said he was over-enthusiastic in writing his Newsweek article and incorrectly suggested a connection between global cooling and severe weather in the U.S. — an unjustified leap. “I also predicted a forthcoming impact of global cooling on the world’s food production that had scant research to back it,” he wrote.”
The very first sentence is the operative one — “seemed right at the time”! But it turned out that “science” made wrong conclusions at the time.
There is no guarantee that the same thing isn’t happening right now, is there? If “science” now knew how climate works, it would be making accurate PREDICTIONS. Ain’t happening!
No, the operative part is
“Gwynne admitted that although his article accurately captured threads of meteorological thinking from the 1970s, “I didn’t tell the full story back then.” He left out the suggestive, but not then conclusive, evidence of CO2 increases in the atmosphere.”
You are mixing up what occurred at what time.
Of course! That’s how the Arctic ice reached its peak. Because it was warming! Lol.
You have just contradicted yourself. You claimed that there was no cooling scare in the 1970s, then you posted the Gwynne article, which proves you wrong.
You have deliberately confused the existence of the cooling scare, which was provably real, with its physical reality.
Slippery chap, aren’t you?
More rampant DENIAL from Nick…
Even his fellow scammers were in on it…
plenty more as well.
Nick is a mal-informed troll. !!
No, Stokes is a mendacious troll. He knows he is lying.
Another one just because I can…
“There was no 1970 cooling scare.”
Now we all KNOW that you are an out and out LIAR !!!
Well, one would have to agree with that, wouldn’t one?
Yep.
Sophistry Stokes.
Even the universities were saying it…
Wow.
My memory must be faulty then. I clearly remember the forthcoming Ice Age being widely discussed in British newspapers and on television in the early 1970s. I even gave a presentation on it in school in 1974.
Why do you lie about something so easily checked?
It was COOLING from the 1940s into the late 1970s, as is well shown.
1970s Global Cooling Alarmism
LINK
Plus a long list of published papers showing that the COOLING was real:
285 Papers 70s Cooling 1
LINK
What a miserable liar you are!
Dude!!!
Do you even remember what you wrote in your comment at the top of this thread? Sheesh, NOT.
Let me paste your opening sentences:
Plus, from your top-level post:
In the “Supplementary Information” file for Beaulieu et al they include a couple of graphs showing continuous and discontinuous results for both “a global AR(1) structure” and “a global AR(4) structure” (whatever the hell they are …), which have 2 or 3 breakpoints per dataset.
Attached is a copy of “Supplementary Figure 1”, which shows the AR(1) case (see “Supplementary Figure 3” for the AR(4) case).
The “interesting” option here is the continuous HadCRUT (V5) one, which shows approximately the same warming rate for the 1910-to-1940s leg as the post-1970s one, although the latter has lasted (almost) twice as long as the former.
Note that Supplementary Table 1 contains “Results (p-values) of the Fisher-Gallagher test for independence applied to residuals of the different model fits and series up to 20 lags”.
I have no idea whether (or not) that can help “harmonise” the results.
“they include a couple of graphs showing continuous and discontinuous results”
Yes. The single breakpoint (continuous or discontinuous) is their null hypothesis. It fits the results reasonably well. The point of their statistical testing is that they say no model with more breaks can be shown to reject their null (one break) hypothesis.
What is wrong with this is that a null hypothesis is supposed to be the default which most simply explains the data. But a breakpoint doesn’t explain the data. We don’t know what caused the breakpoint, and don’t really believe it happened.
Sort of, kind of.
They were only concerned with the 1970 – 2023 period, which doesn’t exhibit any breakpoints. The multi-segment regression does show pre-1970 breakpoints for some data sets, which is in line with simple visual inspection.
For whatever reason, the post-1970 period was arbitrarily broken into pre-2012 (or 2010, depending), which gives way too short a second segment (a 40-year segment plus a 10-year one). 2012 may have been selected by visual inspection.
It does seem odd that the full 1850 – 2023 period was used when the period of interest is 1970 – 2023. I could understand showing it to provide some context, but it’s irrelevant otherwise.
The slope pre-1970 is different to the slope post-1970, but that’s not really relevant to what they were testing.
Essentially, they couldn’t identify a breakpoint post-1970, which is why they couldn’t reject their null hypothesis (the slope 1970-2012 is the same as 2012-2023).
“We don’t know what caused the breakpoint, and don’t really believe it happened.”
We don’t have to know what caused it, that’s not what statistical analysis is for.
As for whether it happened, under multi-segment regression in the Figure MarkBLR posted, the NASA and Berkeley data sets retain a single breakpoint, NOAA has 2 and HadCRUT has 3. That is one indicator that they come from different populations. So, it happened in the NASA and Berkeley data sets, but not in NOAA or HadCRUT.
To me, this is a red flag that data changes have been made differently by each organization and all are guessing.
It may just be different site selection for different periods.
One would require the raw data sets to be able to adequately compare them. It shouldn’t be all that difficult to write a Perl or Python script to do it, provided the site references are unchanged.
“The slope pre-1970 is different to the slope post-1970, but that’s not really relevant to what they were testing.”
Yes, it is. The effect of their model is to push any post-1970 acceleration into the 1970 event.
“We don’t have to know what caused it, that’s not what statistical analysis is for.”
It’s useless without a cause. As I said above, they are using the one breakpoint as a null hypothesis. But a null hypothesis is supposed to be the default explanation of the data. It’s no use if it doesn’t explain.
Breakpoint models are popular in econometrics, where change often does happen all at once. A war, a plague, a change of government. But in physical science, if you postulate such a change, you’d expect an observed cause.
You can think of their model as a model of acceleration. We know, more or less, the integral (difference between trend then and now), and it’s positive. So when did it happen? What was it as a function of time? These guys say, well, it could have all happened in 1970 (zero elsewhere). You can’t disprove that statistically. But you can’t disprove umpteen scenarios where it happened in a more distributed way, and they do have the merit of being more believable. And of course, that means more recent acceleration.
“So, it happened in the NASA and Berkeley data sets, but not in NOAA or HadCRUT.”
Or it didn’t happen at all (far more likely).
No acceleration even in the FAKED surface data.
Get over it, Nick !!
The FACT is that there is no sign of any human caused warming in the last 46 years of the UAH data.
Anything you yabber gibberishly on about, using GISS, Had, et al, based on surface data, is totally meaningless.
The multi-segment regression pulls the NOAA change point back to about 1995. NASA, Berkeley and HadCRUT retain their final break around 1970.
There is no point looking for a cause if nothing happened. The purpose of statistical analysis is to figure out if there is any there there.
The null hypothesis is just the one you want to falsify. It can be whatever you like, but is traditionally “nothing to see here” (no change). It has statistical explanatory power, but says nothing about the cause.
Yep, is there a statistically significant difference? You might have something you think should make a difference, but if there is no significant difference, it didn’t matter.
You observe a difference, which might be attributed to the proposed cause. That’s why the dependent is plotted against the independent (proposed cause).
When is “then”?
They did the multi-segment regression to give greater granularity, which in the NOAA and HadCRUT data sets shows acceleration prior to 1970, and in the NOAA case, again about 1995.
There has been a consistent rate of increase since 1850? Yeah, nah.
This is a key statement and should be bolded and italicized. Time series of measurements, when time is not part of the cause, simply cannot allow one to discern a CAUSE.
Determining a cause can only be done by plotting the dependent variable against an independent variable that is part of the functional relationship that determines a value of the dependent variable.
You can’t “explain” something as complex as the earth’s climate simply like that. But we could speculate it was due to a regime shift.
You and Nick have both been caught out by the word “explain”.
In statistical analysis (hence, hypothesis testing), “explanatory power” just means “how well the hypothesis fits the observations”.
The object of the exercise is to formulate an alternative hypothesis which fits the data (“explains”) better than the null hypothesis (which you also formulate). If the null hypothesis can’t be rejected, you stuffed up!
Selecting between alternative hypotheses follows on from that, and tends to require ever more, ever higher-resolution data, utilising ever more expensive instruments and experimental design.
In this case Nick is using “explain” in the sense of having a physical cause. He said
He’s mentioned previously that a breakpoint for accelerated warming means infinite change. I’m not sure why in this case; usually he’s better than that. But at any rate, I don’t believe he’s using it in a statistical sense. YMMV.
Yes, I think you’re right.
In calculus a “breakpoint” is where a function is not continuous. I.e. the limit of the function is different when approached from the left compared to the right. That really doesn’t apply with temperature. If it does then something is wrong with the data or the data analysis.
From JCGM 200:2012
That’s why in physical science, when influence quantities change due to a physical change in the measuring system, the previous data is declared no longer fit for purpose, the data set is closed, and a new one is started. Climate science has said “we know how to fix the old data to agree with the new data” and then proceeds to guess how to change the data from the past.
One need only look at climdev cooling to meet CRN to see the effect. Rather than simply saying we now have more accurate data and discarding the prior data, the attitude is: let’s change the past data, no one will know or care.
“In calculus a “breakpoint” is where a function is not continuous.”
Not by most definitions I’ve seen. A breakpoint is where a piecewise function changes, it does not have to be discontinuous.
“That really doesn’t apply with temperature. If it does then something is wrong with the data or the data analysis.”
You mean that thing I kept trying to point out to you about the “pause”? The analysis is wrong as it results in a discontinuity.
From Limits (An Introduction)
You do have a talent for finding the most elementary introduction to maths, and still misunderstanding it.
Yes, you can describe a discontinuity as a “break” if you want to avoid long words. But that isn’t defining a “breakpoint”.
Here’s a couple of random examples:
https://en.wikipedia.org/wiki/Piecewise_linear_function
https://www.ibm.com/docs/en/cofz/12.10.0?topic=example-what-is-piecewise-linear-function
“The piecewise linear function in Figure 1 has three breakpoints.”
Using your definition of a breakpoint a sine wave has breakpoints at the maximum and minimum values since that is where the slope of the function changes.
What you have described here is a PIECEWISE function, not a continuous function describing temperature. Piecewise functions can have breakpoints where the function is not “smooth” at the breakpoints.
Once again, TEMPERATURE IS NOT A PIECEWISE FUNCTION. You are doing nothing but throwing up a red herring argument in order to see you name in the thread.
Once again, if an analysis of temperature requires a piecewise function to describe it then the data being used is bad or the analysis is bad. The temperature curve here on earth is made up of a daytime sinusoid and a nighttime exponential decay. Neither of these is or requires a piecewise function in order to describe it.
It’s not my definition. It’s that used by IBM, Wikipedia, and multiple other sources.
But, no. A sine wave does not have any breakpoints. It is not a piecewise function, it can be defined across the domain by a single function. And the slope changes continuously across the domain, not just at the maximum and minimum points. I suspect you are confusing a change in the slope with a change in the sign of the slope.
“What you have described here is a PIECEWISE function”
Yes – that’s the point. Breakpoints, changepoints, knots, or whatever you call them are things that happen in a piecewise function. They may result in a discontinuity, and they will often result in a discontinuity in the first derivative.
“…not a continuous function describing temperature.”
They can describe a continuous function in temperature.
“Piecewise functions can have breakpoints where the function is not “smooth” at the breakpoints.”
But not necessarily discontinuous. Why do you have to turn every small correction into such a drama? You said something wrong, I corrected you. That’s all.
“Once again, TEMPERATURE IS NOT A PIECEWISE FUNCTION”
You should know by now that virtually everything you write in all caps is false. A function is used to describe changes in temperature. Sometimes the best function to use will be a piecewise function. Say you have a room which remains a constant temperature, and then you turn the heating on. What do you think the function will look like?
“Once again, if an analysis of temperature requires a piecewise function to describe it then the data being used is bad or the analysis is bad.”
Nonsense on stilts. How many times have you insisted that a daily temperature cycle will follow one function during the day time and another during the night? In fact you say it here as well.
“The temperature curve here on earth is made up of a daytime sinusoid and a nighttime exponential decay.”
And what do you think you are describing if not a piecewise function?
“Neither of these is or requires a piecewise function in order to describe it.”
Do you actually listen to yourself? You’ve just said there are two different functions, and then say that this does not result in a piecewise function.
“Yes – that’s the point. breakpoints, changepoints, knots, or whatever you call them are things that happen in a piecewise function. They may result in a discontinuity, and they will often result in a discontinuity in the first derivative.”
Temperature is *NOT* a piecewise function. It is a smooth, continuous function. The slope may change significantly, but there is never a discontinuity.
“But not necessarily discontinuous. Why do you have to turn every small correction into such a drama? You said something wrong, I corrected you. That’s all.”
You didn’t correct anything. You made an incorrect assertion.
“You should know by now that virtually everything you write in all caps is false. A function is used to describe changes in temperature. Sometimes the best function to use will be a piecewise function. Say you have a room which remains a constant temperature, and then you turn the heating on. What do you think the function will look like?”
At no point in the temperature curve will the limit of the function describing the curve be different on the left side of any point than it is on the right side of the point. The conductive temperature gradient between the heat source (say a steam-driven radiator) and the far side of the room will be related to dt/dL. At no point will that derivative become infinite, thus no discontinuity. The same thing applies to convection transfer or radiative transfer of heat.
The fact that your data points are discrete does *NOT* mean that the temperature curve is discrete as well. It’s why statistics don’t define the physical world; statistical descriptors only provide a means to look at your data, but you must still understand what the data represents!
“Nonsense on stilts. How many times have you insisted that a daily temperature cycle will follow one function during the day time and another during the night? In fact you say it here as well.”
You are *still* showing the bias of a statistician with no actual understanding of physical science. Does combining two sine waves of different frequency, sin(a) + sin(b), result in a function that has discontinuities? The exponential decay of temperature occurs 24 HOURS PER DAY. Even when the sun is shining. The higher the sun drives the temperature the faster the earth radiates the heat away. That rate of radiation still defines an exponential decay (or more likely a polynomial decay). Does sin(a) + e^(-b) result in a function that is discontinuous?
“And what do you think you are describing if not a piecewise function?”
Like I said, the actual function is sin(a) + e^(-b). The sine wave dominates during the day and the exponential decay dominates at night. That does *NOT* make a piecewise function.
“Do you actually listen to yourself? You’ve just said there are two different functions, and then say that this does not result in a piecewise function.”
Your lack of knowledge concerning physical processes is showing as usual. Do you *really* believe the earth doesn’t radiate away heat during the day?
“Temperature is *NOT* a piecewise function. It is a smooth, continuous function. ”
Temperature is not a function. Functions can be used to represent temperature. Those functions can operate at many different levels, over different time frames, and can model different aspects of temperature. And often a piecewise function is a good option. You might be able to avoid a piecewise function if you can add lots of other parameters, but for most purposes a simpler function is all that is required.
“That doesn’t mean the slope may change significantly but there is never a discontinuity.”
You keep saying that as if it wasn’t my original point. A piecewise linear regression of global temperatures that results in a discontinuity is at least suspect.
“You made an incorrect assertion.”
My assertion was that breakpoints do not have to be discontinuous. I backed that up with two sources. The fact that you still can’t accept you were wrong, is your problem.
“At no point in the temperature curve will the limit of the function describing the curve be different on the left side of any point than it is on the right side of the point.”
This is turning into your usual tactic of just lying about what I said. The example of turning on a radiator is not going to produce a discontinuity – it can be represented by a piecewise function.
“The fact that your data points are discrete does *NOT* mean that the temperature curve is discrete as well.”
Who said anything about discrete data points. Please stop arguing with the voices in your head, and try to comprehend the point. In this hypothetical simple example, the room is at a constant temperature before a certain point in time, say k. At time k a radiator is turned on and the room starts to heat up. It does not matter what function describes this warming – what matters is it is a different function after k than it is before. Hence a continuous piecewise function, with a breakpoint at k.
“Does combining two sine waves of different frequency, sin(a) + sin(b), result in a function that has discontinuities?”
No. And adding two sine waves is not a piecewise function. We were talking about the different functions for day and night.
“Like I said, the actual function is sin(a) + e(-b).”
Only if you think the sun causes cooling during the night. As I said, you may be able to create a complex function to model a simplistic daily temperature cycle, but that doesn’t make the piecewise approximation invalid. And any function you propose has to account for the difference between the sun being above and below the horizon.
“At time k a radiator is turned on and the room starts to heat up. It does not matter what function describes this warming – what matters is it is a different function after k than it is before. Hence a continuous piecewise function, with a breakpoint at k.”
The functional relationship REMAINS THE SAME. It is continuous with no breakpoint.
First off, you cannot instantaneously change the temperature of the radiator. It is itself a continuous function, and its temperature is related to the heat transfer into the media circulating through the radiator.
Second, since you cannot instantaneously change the temperature of the radiator you cannot instantaneously change the temperature of the media surrounding the radiator.
NO BREAKPOINT. At any point in the timeline the limit at that point will be the same no matter which direction you approach the point from.
“No. And adding two sine waves is not a piecewise function. We were talking about the different functions for day and night.”
You do *NOT* have different functions for day and night. You have the same functional relationship. The values of the independent variables change over time. That is *NOT* a piecewise function. y = f(a) + f(b) does not describe different functions based on the values of f(a) and f(b).
“Only if you think the sun causes cooling during the night. “
Do you *really* enjoy showing your total lack of understanding of the physical world? First, the sun *does* warm the portion of the earth in darkness via heat transfer (convection, conduction, and radiation) in the atmosphere. The amount of that warming at night is small compared to the heat loss due to radiation. Second, where did you get the belief that the contribution from the sun can go negative, i.e. cooling? There is nothing in the function sin(a) + e^b that indicates that sin(a) will go negative.
I tried to make the example so simple that even you could understand it, thus the sin(a) + e^b. The actual fact is that the function is ever so much more complicated than that. Yet you can’t even grasp the most simple example. You’ll never understand the almost infinitely more complicated actual functional relationship.
I suspect the reason Tim thinks he is never wrong is that everyone realizes it’s pointless trying to correct him. A simple correction of what’s meant by a breakpoint results in endless distractions, without a single attempt to address the point. He’s still talking about breakpoints as if they mean a discontinuity. He still thinks that if a function is continuous it means there are no breakpoints. E.g.:
“The functional relationship REMAINS THE SAME. It is continuous with no breakpoint.”
Let’s try again. Say, Temp(t) = 10 if t < 0, 10 + t if t >= 0.
That function is continuous, including at 0, but it has a breakpoint at 0.
If this is a simple function representing temperature it does not require any “instantaneous” change in temperature.
“There is nothing in the function sin(a) + e^b that indicates that sin(a) will go negative.”
There is nothing to indicate anything in your equation. You haven’t defined a or b, or given any indication of the domain. If sin(a) is never negative, then presumably 0 <= a <= pi, and so can’t represent time, so what does it represent?
“I tried to make the example so simple that even you could understand it, thus the sin(a) + e^b.”
You made it so simple as to be meaningless. This isn’t a surprise as you keep on doing this, claiming you can predict daily temperature changes using just sine waves, yet somehow you never give an actual working equation.
“The actual fact is that the function is ever so much more complicated that that.”
So write it down, and plot it, and we’ll see if it works. It will be fun seeing how you account for the different behaviour of the sun when it’s above and below the horizon without using a piecewise function.
Regardless, it still doesn’t mean that you cannot describe the daily temperature using a piecewise function. The point, which you inevitably miss, is that any function is an attempt to model a much more complicated reality. A piecewise function is often the simplest way of doing that, and arguing that there is a much more complicated function which is slightly more accurate does not mean the piecewise function disappears. It’s still describing a functional relationship between time and temperature.
“There is nothing to indicate anything in your equation.”
It’s a TEACHING TOOL you dunce! As usual you adamantly refuse to learn *anything*, no matter how simply it is presented!
The point is that at any point “c” the limit as x->c will be the same whether you approach it from the left or the right. THERE IS NO SINGULARITY.
“You made it so simple as to be meaningless.”
It’s only meaningless because you refuse to LEARN anything, absolutely *anything*!
“This isn’t a surprise as you keep on doing this, claiming you can predict daily temperature changes using just sine waves, yet somehow never you never give an actual working equation.”
e^ix = cos(x) + i sin(x). TWO SINE WAVES
e^-ix = cos(x) - i sin(x) TWO SINE WAVES
You may think you know statistics (which you really don’t) but your knowledge of actual math is at a high school freshman algebra level.
“So write it down, and plot it, and we’ll see if it works.”
I’ve given you example functions in the past. You even argued about the solar azimuth and elevation being a part of the temperature curve!
f(T) = f(sun) + f(geography) + f(humidity) + f(terrain) + f(pressure) + f(wind) + …
This is *NOT* a piecewise function in any way, shape, or form. It is a combination of various independent variables which are themselves functions of other independent variables.
“Regardless, it still doesn’t mean that you cannot describe the daily temperature using a piecewise function.”
Trying to model a physical process using a non-physical representation does nothing but confirm the Feynman quote about yourself being the easiest person to fool. For instance, if you do not include the fact that radiative heat loss based on T^4 occurs even during the daylight hours then you have eliminated a major component that limits maximum temperature during the day. That’s part of the problem of using an “average” radiative heat loss based on some kind of measured value from a satellite. How is that “average” calculated? What is the “average” value of an exponential that is being “forced” by the sun over time? It’s not nearly as simple as you would think. The rate of heat loss at any point in time is related to 1/e^(k/T), i.e. e^(-k/T). So what is the total heat loss during the day, since T has two drivers, radiative heat loss and radiative heat gain during the day?
Don’t forget that conduction/convection also works over the entire day right along with radiation.
I already pointed that out. It’s why the contribution from the sun’s insolation never actually goes to zero even on the dark side. Even the re-radiation from CO2 in the atmosphere has a small portion wind up on the dark side, as part of it always heads toward the horizon, gets reabsorbed, and then re-radiated toward the next distant horizon.
You’ve been told before that you need to take some basic calculus courses.
A breakpoint in calculus is where the limit of f(c-) is different from f(c+) as the point is approached. It’s typically called a singularity and not a breakpoint. The values of f(c-) and f(c+) can be the same at the point in question but they do not have to be. If they are not the same then you have a discontinuity. If you need two different functions to define a curve, i.e. a piecewise function, then it is highly likely that at the point of intersection f(c-) = lim(x→c, x<c) f(x) and f(c+) = lim(x→c, x>c) f(x) are different.
When you condition a continuous function into discrete steps by digitization, you will likely find numerous points where f(c-) and f(c+) have different limits as x approaches c. As I have pointed out to you multiple times: a sine wave generated digitally will have a singularity at every step point. When you digitize the temperature curve in the same manner, whatever the step size might be, you will also wind up with multiple singularity points. All the piecewise statistical analysis of the temperature curve does is identify the singularities produced from digitizing the temperature curve. That does *NOT* mean that the physical temperature process is a piecewise function with singularities; that is just a phantom produced from the digitization.
You continue to labor under the misconception that statistical descriptors of discrete values generated by digitizing a non-digital signal somehow represent the actual physical process itself. Those statistical descriptors only describe your discrete points; they simply cannot tell you the functional relationship at play in reality. Someday you should look up “Nyquist limit” and actually try to understand the concept it describes.
From a signal sampling point of view, trying to represent a daily temperature curve using the mid-point value as a single point of representation is a loser, since it is not sufficient to reconstruct the daily temperature signal. Using 30/31 of these data points to represent a single monthly value is even worse, since it doesn’t allow reproducing the actual temperature signal over the entire month. All this digitization does is lose information and generate phantom statistical descriptors based on useless discrete data. This could all be fixed by using degree-day data, but climate science adamantly refuses to do this.
And I wonder why? We now have lots of data from ASOS and CRN. Why are there no studies on how integrated (degree-day) temperatures compare with traditional temperatures?
“You’ve been told before that you need to take some basic calculus courses.”
And I’ve told you, these patronizing ad hominems do you no favor. I suspect I’ve done more advanced calculus courses than you, but that makes no difference to the argument.
“A breakpoint in calculus is where the limit of f(c-) is different than f(c+) as a point is approached.”
That’s a definition of a discontinuity (and not the full definition), not a breakpoint. If you disagree, produce a reference.
“It’s typically called a singularity and not a breakpoint.”
So now you accept it isn’t usually called a breakpoint. But you are still wrong. A piecewise function does not have to have any singularities. As I keep saying, it can be continuous at all points. A continuous piecewise function may have a singularity in its derivative, but that is not considered a singularity in the function.
https://en.wikipedia.org/wiki/Singularity_(mathematics)
“then it is highly likely that at the point of intersection f(c-) = lim(x→c, x<c) f(x) and f(c+) = lim(x→c, x>c) f(x) are different.”
You are going to have to define your probabilities for “highly likely”. If you are talking about all possible functions, then I would guess that nearly all will have discontinuities. But if you limit it to useful functions, I would imagine rather more are continuous, if they are modelling real-world phenomena.
Regardless, I’ll take the admission that you were wrong to claim that “all” piecewise functions have singularities.
“When you condition a continuous function into discrete steps by digitization…”
And there’s your problem. You only seem to understand piecewise functions as a method for digitization. But that’s not what’s being talked about in the cases I’m talking about. Especially using change point analysis in linear regression. Linear regression is the opposite of digitization, taking a set of discrete points and smoothing them into a continuous function.
“I suspect I’ve done more advanced calculus courses than you,”
Taking a course is meaningless. You apparently can’t learn.
“That’s a definition of a discontinuity (and not the full definition), not a breakpoint. If you disagree, produce a reference.”
It can be at a discontinuity but does not have to be.
“So now you accept it isn’t usually called a breakpoint. But you are still wrong. A piecewise function does not have to have any singularities. As I keep saying it can be continuous at all points. A continuous piecewise function, may have a singularity in it’s derivative, but that is not considered a singularity in the function.”
You *do* need to take a basic calculus course and comprehend the course material.
The singularity in a function is identified by the limit from the left and the right being different. Example: y = sqrt(z). While continuous, it is not analytic.
If you don’t believe me then perhaps you’ll believe Wikipedia:
Tim still pretends his academic credentials mean he’s right about everything.
=============================================
“It can be at a discontinuity but does not have to be.”
This is in response to the sentence
Wrong. It’s one of the main definitions of continuity that a function is continuous at a point c if, and only if, (1) f(c) is defined, and (2) the limit of f(x) as x approaches c exists and equals f(c).
That second point has to be true for limits from the left and right if defined. If lim x -> c- ≠ lim x -> c+, then by definition it is discontinuous at c.
===============================================
“The singularity in a function is identified by the limit from the left and the right being different. Example: y = sqrt(z). While continuous, it is not analytic.”
You need to define your domain. If you mean complex, as implied by your use of “z”, then sqrt(z) is not continuous at 0.
If you mean ℝ+, then yes it’s continuous at 0, but not analytic. But that doesn’t mean it’s a singularity. As I said before, for real functions, a singularity means a discontinuity in the function, not in the derivative.
================================================
“from Wikipedia: “Piecewise defined functions (functions given by different formulae in different regions) are typically not analytic where the pieces meet.”
Yes, the very article I quoted to you at the start. A breakpoint does not have to be discontinuous. They may, but not always, be non-analytic at the breakpoints.
You attempted to justify Stokes’s assertion that if the linear regression of decadal segment A didn’t meet the linear regression of the following decadal segment B then a discontinuity existed in the global temperature average. It was a bullshit justification.
It is still bullshit for the reasons I pointed out to you. The fact that the linear regression lines, i.e. the slopes of the temperature curve, did not meet did *not* mean a discontinuity in the temperature curve.
Now here you are trying to repeat back to me exactly what I initially pointed out to you and trying to claim it as *your* initial assertion. More bullshit.
Tell it to someone that you can fool. I *know* that you have absolutely no understanding of either calculus or physical science.
“You attempted to justify Stokes assertion…”
This had nothing to do with Stokes. I merely pointed out an error in your understanding about breakpoints. You as usual dragged the argument in every direction, rather than accept you were wrong.
“Tell it to someone that you can fool. I *know* that you have absolutely no understanding of either calculus or physical science.”
Try to argue the facts, not just claim you have a better education than me.
“Let’s try again. Say, Temp(t) = 10 if t < 0, 10 + t if t >= 0.”
Show me a daily temperature curve where this happens. In the real world temperature is either going down from heat loss or going up from heat addition. In the natural world the entropy of a system increases unless it is affected by a second system whose entropy increases by causing the decrease of entropy in the first system.
Your assertion is nothing more than a red herring argumentative fallacy you are using to keep from having to address the real world. Typical.
“That function is continuous, including at 0, but it has a breakpoint at 0.”
But that function only exists in your unreal statistical world.
At the atomic/molecular level one could argue that breakpoints exist at each absorption of additional energy. But that is not measurable at the micro level. Given that a volume of air has a considerable number of atoms/molecules, the average KE will change gradually as additional energy is absorbed by more and more of the atoms/molecules.
The statisticians amongst us need to learn what a gradient describing a phenomenon is and what it means for temperature. It is like a plane descending. To have a break point, one would need to invent teleportation.
Averaging small samples to get a trend is a sure way to encounter false break points. It is why uncertainty grows, always.
Far too many statisticians don’t realize that all they are doing when performing piecewise linear regression analysis of a part of a digitized data set is finding the first derivative of that part of the curve. They are, in essence, determining the slope of the curve at that section of the curve.
That truly tells you little about what the total curve is doing. Trying to extend the first derivative of that part of the curve into the future is projecting the first derivative and not the actual curve itself. It’s like taking the derivative of a sine curve around the point π/4 radians and projecting it into the future, assuming it will remain at that value. It’s a meaningless projection; it tells you nothing about what the curve itself is going to do.
There is no doubt that the temperature curve at any point on the earth is cyclical over time, both for short time periods (daily, weekly, monthly, seasonally, decadally) and for long time periods (centuries, millennia). Projecting the slope of a linear regression of temperature data for a specific 30-year period into the next 30-year period (or even the next 100-year period) is only projecting that the first derivative of the temperature curve for that base period will remain the same for the future. Without a functional relationship describing the actual curve itself, it is the height of folly to assume that the slope of the actual curve will remain the same into the future. Yet that *is* what the climate models appear to do – assume the slope of the temperature curve will remain the same forever. It’s why Pat Frank can duplicate the models using a linear equation.
“Show me a daily temperature curve where this happens.”
Good grief! It was a simple equation to demonstrate a piecewise function that is continuous. “It’s a TEACHING TOOL you dunce!”
“But that function only exists is your unreal statistical world. ”
No functions exist in the real world. They are constructs intended to explain aspects of the real world. They can do that to varying degrees of accuracy and usefulness.
Functions *do* exist in the real world. Look at Gauss’ Law sometime. It’s the same for micro and macro systems; it’s the same even for quantum mechanics.
Again, piecewise linear function relationships used to model non-piecewise physical processes are of, at best, LIMITED usefulness, especially when used to try and forecast future states of the process. This is especially true when the piecewise functions are nothing more than estimations of the slope of the functional curve at limited points on the functional curve.
When the data set being examined is a set of measurements of the intensive properties of different things all jammed together with no weighting for either variance or time then piecewise linear regression of parts of the data is even more unfit for the purpose of describing reality, including future reality.
“You mean that thing I kept trying to point out to you about the “pause”? The analysis is wrong as it results in a discontinuity.”
Here’s an old graph showing how much of a discontinuity you get with the pause.
This is the kind of analysis that is *BAD*. There is no discontinuity in the temperature data but only in the piecewise linear regression analysis being done. It’s just one more part of the poor science of “climate science” which results from statisticians ignoring physical reality when analyzing the data.
Yes that’s the point. It’s demonstrating why Monckton’s Pause is the result of *BAD* analysis. It implies an implausible discontinuous step change in global temperatures.
Maybe you should explain this to bnice2000, who keeps claiming that all the warming is the result of step changes caused by El Niños.
Malarkey! Your assertion, if true, means it would be impossible to analyze a sine wave over a portion of the cycle including the max or min values since you would see a change in the slope of the curve, i.e. what you term as a discontinuous breakpoint.
Again, discontinuous means the limit of the function at a specific point is different when approached from opposite sides. That is *not* true of a sine wave and it is *not* true of the temperature curve, not even around a La Nina or El Nino occurrence.
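For reference, the jump-discontinuity case being described, stated formally (this is the standard textbook definition, nothing specific to this thread):

```latex
% f has a jump discontinuity at a when the one-sided limits disagree
% (or one of them fails to exist):
\lim_{x \to a^{-}} f(x) \;\neq\; \lim_{x \to a^{+}} f(x)
% A sine wave fails this test nowhere: for every a,
% \lim_{x \to a} \sin x = \sin a .
```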
A digital representation of a sine wave *is* a discontinuous set of step changes, i.e. a piecewise function. It might *look* continuous if displayed on a device with insufficient granularity but it is truly a piecewise function where the limit at each step is different on each side of the step.
This only highlights the problem with climate science not understanding what it is doing in using gross granularity (e.g. annual averages generated from mid-point daily temps) to do statistical analysis of the data. Climate science uses a digital representation of the temperature curve, which can make it look like it has step changes when that is physically impossible; not even a major volcanic eruption can cause a discontinuity in the temperature.
CoM’s analysis has nothing to do with a “discontinuity”, no more than saying the slope of a sine wave is positive over a portion of the curve and is negative over a different portion of the curve.
This only highlights the fact that the climate models are *NOT* functional relationships but only data-matching algorithms. If they truly defined a functional relationship between temperature and all its independent variables there would be no discontinuities to be found. That does *not* mean that the slope of the functional relationship at one point can’t be different from the slope at a different point.
“Your assertion, if true, means it would be impossible to analyze a sine wave over a portion of the cycle including the max or min values since you would see a change in the slope of the curve, i.e. what you term as a discontinuous breakpoint.”
Barmy – a sine wave is not discontinuous, and even if it were, you could still analyze it. You see a change in the slope of a sine wave everywhere; it’s a continuously changing slope. Its derivative is cos, and that should be a clue.
And my “assertion” was simply agreeing with Tim when he said it was bad analysis to have piecewise regressions with discontinuities when modeling global temperatures.
Not worth the time responding to the rest of his rant. It’s obvious he doesn’t understand what he’s arguing, and it’s all just an attempt to distract from the simple point that breakpoints do not have to be discontinuous.
“And my “assertion” was simply agreeing with Tim when he said it was bad analysis to have piecewise regressions with discontinuities when modeling global temperatures.”
Again, “my” assertion was simply to agree with Tim that suggesting there was a discontinuity in global temperatures suggested bad analysis.
It has nothing to do with Monckton’s Pause. The “pause” has to do with the lack of temperature following the increase in CO2.
Temperature itself IS NOT a piecewise function with discontinuities. Temperature is a continuous function that has periodic measurements made of it.
Trends of averages of averages of averages DO HAVE points that appear as piecewise functions. You should ask yourself why are trends discontinuous when they are used to explain a continuous function. Maybe a time series doesn’t a linear relationship between time and temperature.
Here is a primer. https://www.statology.org/linear-regression-assumptions/
Are all the prerequisites met?
Trying to discuss trends of temperature against a sequence count is useless in determining a cause of temperature variation.
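On the “are all the prerequisites met” question, here is a minimal sketch of checking just one of them, independence of residuals, on synthetic data (red noise is assumed as the failure mode; no real temperature series is involved):

```python
# Fit an OLS trend to an autocorrelated series, then inspect the lag-1
# autocorrelation of the residuals, which is the assumption time series
# most often violate.
import numpy as np

rng = np.random.default_rng(0)

# Build an AR(1) noise series: each point remembers 80% of the previous one.
n = 600
noise = np.zeros(n)
for t in range(1, n):
    noise[t] = 0.8 * noise[t - 1] + rng.normal(0, 0.1)

t_axis = np.arange(n)
y = 0.002 * t_axis + noise          # small linear trend plus red noise

# Ordinary least squares trend, then lag-1 autocorrelation of the residuals.
slope, intercept = np.polyfit(t_axis, y, 1)
resid = y - (slope * t_axis + intercept)
r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
print(f"fitted slope: {slope:.4f}, lag-1 residual autocorrelation: {r1:.2f}")
# r1 well above zero means the independence assumption fails, and the naive
# OLS uncertainty on the slope will be understated.
```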
“It has nothing to do with Monckton’s Pause.”
It’s literally Monckton’s pause.
“The “pause” has to do with the lack of temperature following the increase in CO2.”
Yet not even Monckton claims such nonsense. What he kept telling you was that the purpose was to demonstrate the difference in the rise and run of the steps.
“Temperature itself IS NOT a piecewise function with discontinuities. Temperature is a continuous function that has periodic measurements made of it.”
See my responses to the same nonsense coming from your brother.
“You should ask yourself why are trends discontinuous when they are used to explain a continuous function.”
Because you are choosing an inappropriate start point, and not constraining the regression to be continuous. You are then hiding the discontinuity by only showing the after graph, rather than the before and after slopes.
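To picture the “constrain the regression to be continuous” point, a minimal sketch on synthetic data (the breakpoint location and all numbers are hypothetical; this is not Monckton’s method or any real series):

```python
# Two independent segment fits generally disagree at the breakpoint, which
# is the "jump" under discussion; a hinge-basis fit is continuous there by
# construction.
import numpy as np

rng = np.random.default_rng(1)
x = np.arange(100.0)
c = 60.0                                   # assumed breakpoint location
y = 0.01 * x + 0.02 * np.maximum(x - c, 0) + rng.normal(0, 0.1, x.size)

# (a) Two unconstrained fits, one per side: the usual "pause" picture.
s1, b1 = np.polyfit(x[x < c], y[x < c], 1)
s2, b2 = np.polyfit(x[x >= c], y[x >= c], 1)
jump = (s2 * c + b2) - (s1 * c + b1)
print(f"independent fits: implied jump at x={c:g} is {jump:.3f}")

# (b) Continuous piecewise fit: y ~ 1, x, max(x - c, 0). The hinge column
# is zero at x = c, so the two segments are forced to meet there.
X = np.column_stack([np.ones_like(x), x, np.maximum(x - c, 0)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"continuous fit: slope before {coef[1]:.3f}, after {coef[1] + coef[2]:.3f}")
```

The design point is that the hinge basis changes the slope at the breakpoint without ever permitting a step in the fitted value, which is exactly the constraint being argued over here.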
“Maybe a time series doesn’t a linear relationship between time and temperature.”
I’m sure you’ve missed something out there, but don’t want to speculate as to what it is.
“Yet not even Monckton claims such nonsense. What he kept telling you was that the purpose was to demonstrate the difference in the rise and run of the steps.”
You have *NEVER* understood what CoM was pointing out. As Jim tried to tell you he was demonstrating that CO2 is *not* physically related to rising temps. It is a SPURIOUS correlation.
“Because you are choosing an inappropriate start point,”
No, it’s because you are trying to represent a continuous physical phenomenon using discrete values. As I pointed out this is why a digital representation of a sine wave can never be perfect. It would require an infinite number of discrete points to do so – think Δx in calculus where Δx -> 0 in the limit.
What is being trended is not temperature; it is points in time. Showing the points with a connecting line is wrong. The points should be shown as a scatter plot, with each point standing alone. This is one reason uncertainty in the data is misunderstood and misrepresented.
Temperature is *NOT* a piecewise function. It is a continuous function driven by a functional relationship. If an analysis of temperature requires a piecewise function then something is wrong with the data or with the analysis.
All you’ve done here is throw up an irrelevant red herring.
“Temperature is *NOT* a piecewise function”
Temperature is not a function of any sort. Functions can be used to model changes in temperature, and that can include piecewise functions. No function will be a perfect representation, unless you have a god like omniscience.
A function, by definition, describes a functional relationship.
“Temperature is not a function of any sort.”
Temperature has a functional relationship with its independent variables. Therefore temperature *is* a function. Temp = f(sun, clouds, humidity, geography, terrain, pressure, ….). In other words A FUNCTION.
“Functions can be used to model changes in temperature, and that can include piecewise functions.”
Representing a continuous function using piecewise functions is nothing more than the lack of knowledge of the person doing the representation.
” No function will be a perfect representation”
Of course it is. The function y = sin(x) IS a perfect representation of a sine wave. Trying to define that function using discrete points will never be perfect. But the function *is*.
You have the typical statistician’s worldview that statistical descriptors define the physical world. They don’t. They never will. They are a tool to use to increase knowledge of physical processes but they are *not* the functional relationship of the process itself. Piecewise analysis using regression analyses of parts of a function cannot itself generate or even identify discontinuities in the function. All that piecewise analysis can do is tell you what is happening with that piece, it can’t tell you what is happening with the functional relationship.
The perfect example of this is you thinking that the sinusoidal daytime curve is separate from the nighttime exponential decay, thus resulting in two piecewise functions for the daily temperature. That’s a statistician’s worldview, not a physical scientist’s worldview, which understands that exponential decay occurs during daylight hours as well as nighttime hours and that the functional relationship for temperature is the addition of the two. In fact, that daytime exponential decay (related to T^4) is a prime factor limiting what Tmax can be. No amount of statistical analysis of daily mid-point temperatures can ever identify that part of the actual physical process going on in the biosphere, meaning that no model based on the statistical analysis of daily mid-point temperatures can ever match reality.
Absorption/Emission or Heating/Cooling are each separate processes.
The earth absorbs even while it is also cooling. Planck’s Theory of Heat Radiation covers all this. Thermodynamic texts cover this also from the standpoint of conduction and convection.
The ability to deal with gradients as vectors is necessary to see how the physical processes are continuous.
“The function y = sin(x) IS a perfect representation of a sine wave.”
Well done. Yes a function perfectly represents itself. The question is, can it perfectly represent something like temperature over time?
“Representing a continuous function using piecewise functions is nothing more than the lack of knowledge of the person doing the representation.”
Again, unless you are omniscient you will always lack knowledge about the subject. You really need to remember that models are always wrong but can be useful.
“You have the typical statistician’s worldview that statistical descriptors define the physical world.”
Any statistician claiming that is wrong – “all models …” etc.
“Piecewise analysis using regression analyses of parts of a function cannot itself generate or even identify discontinuities in the function.”
What are you rambling about now? The point of change point analysis is to identify possible points of change. This may or may not tell you something about what has happened, or how a process behaves. They can be described by a piecewise function, that function is a model. They will never describe the entirety of reality, any more than any function will (except this god-like function you believe exists.)
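For what changepoint analysis actually does in practice, a minimal sketch on synthetic data (my own illustration of the flavour of test involved, not the paper’s code, which is in the open-access publication):

```python
# Grid-search a single changepoint for a continuous two-segment fit, then
# ask, via BIC, whether the extra parameters beat a single straight line.
import numpy as np

rng = np.random.default_rng(2)
n = 150
x = np.arange(float(n))
y = 0.005 * x + rng.normal(0, 0.15, n)     # pure trend + noise, no real break

def bic(y, yhat, k):
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

# One-line model.
s, b = np.polyfit(x, y, 1)
bic_line = bic(y, s * x + b, k=2)

# Best continuous two-segment model over candidate breakpoints.
best = np.inf
for c in range(10, n - 10):
    X = np.column_stack([np.ones(n), x, np.maximum(x - c, 0)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    best = min(best, bic(y, X @ coef, k=4))  # 3 coefficients + breakpoint

print(f"BIC, single line: {bic_line:.1f}; best changepoint model: {best:.1f}")
# On noise like this the changepoint model usually fails to win; a "possible
# point of change" is only a candidate until it survives a test of this kind.
```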
“The perfect example of this is you thinking that the sinusoidal daytime curve is separate from the nighttime exponential decay thus resulting in two piecewise functions for the daily temperature.”
You were the one who claimed that – not me. I kept suggesting your sine-wave-by-day, exponential-decay-by-night picture was too simplistic.
“The question is, can it perfectly represent something like temperature over time?”
I gave you the equation to use for the sun’s insolation at least twice.
For any point in time
dq = [cos(Θ_1)cos(Θ_2) dA_1 dA_2 / (2πr^2)] · σ(T_1^4 – T_2^4)
Since the sun moves, Θ_1 and Θ_2 are themselves sine functions of time. So *YES*, sine waves can represent something like temperature over time.
“Again, unless you are omniscient you will always lack knowledge about the subject. You really need to remember that models are always wrong but can be useful”
The path of the sun in the sky is known to as many digits as the calculators can handle. How perfect do you need?
“Any statistician claiming that is wrong “all models …” etc.”
And yet that is exactly what you claim when you try to claim that statistical descriptors of the earth’s sampled temperatures indicate the presence of “breakpoints” in the temperature curve!
“What are you rambling about now? The point of change point analysis is to identify possible points of change.”
Statistical descriptors of sampled temperatures can only identify “phantom” points of possible change, especially since the samples are not of sufficient granularity even to meet the Nyquist criterion for sampling.
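The Nyquist point fits in a few lines (the 25-hour sampling interval is deliberately artificial, chosen only to make the aliasing obvious):

```python
# A pure 1 cycle/day signal, sampled once every 25 hours (well under the
# 2 samples/cycle Nyquist rate), shows up in the samples as a slow ~25-day
# oscillation that does not exist in the underlying signal.
import numpy as np

hours = np.arange(0, 24 * 200, 25)          # one sample every 25 hours
signal = np.sin(2 * np.pi * hours / 24.0)   # true period: 24 hours

# Estimate the apparent period from zero crossings of the sampled series.
crossings = np.where(np.diff(np.sign(signal)) != 0)[0]
apparent_period_days = 2 * np.mean(np.diff(hours[crossings])) / 24.0
print(f"true period: 1.0 day, apparent period: {apparent_period_days:.1f} days")
```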
“You were the one who claimed that – not me.”
I did *NOT* claim any such thing. I *said* that the daytime temps are sinusoidal and the nighttime temps are exponential decay. That’s because the daytime temps are overwhelmed by the sun’s sinusoidal forcing and the nighttime temps are overwhelmed by the exponential decay. That is *NOT* saying that the earth’s temperature is a piecewise function. What is y = sin(x) + cos(x)? That is not a piecewise function but at x = π/2 the contribution to y from sin(x) overwhelms the contribution from cos(x). And vice versa for x = π.
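For reference, the closed form of that example (a standard identity, nothing specific to this thread):

```latex
% The sum is one continuous function, even where one term dominates:
\sin x + \cos x \;=\; \sqrt{2}\,\sin\!\left(x + \tfrac{\pi}{4}\right)
% At x = \pi/2 the sine term contributes 1 and the cosine term 0; near
% x = \pi the roles reverse. Neither regime makes the sum piecewise.
```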
You can’t even admit that the variance of the data in the temp data sets is a direct metric for the uncertainty of the average value of the data. It’s a certainty that you’ll never accept that the sample data is not sufficient to identify breakpoints in the temperature curve at *any* level, be it daily, weekly, monthly, annually, or decadally.
Which doesn’t make sense if CO2 is the cause and is expected to be the cause of any acceleration.
Magical molecules that cause opposite effects at the same time have never made any sense. The 18-year statistical pause in global warming that has now disappeared from HadCRUT is proof of that. Both cannot have happened.
The missing heat cannot be “Hiding in the oceans” and non existent at the same time, but there you go.
Indeed. Statisticians think they have explained something better if they can pack the acceleration into just one breakpoint. So that is their null hypothesis. But there is nothing physical about it. It isn’t a better explanation.
More waffling gibberish from Nick.
True in one facet though…
There is nothing physically “real” about the surface data.
The null hypothesis is the one you want to reject. The alternative hypothesis is the one you think is a better explanation.
Warming looks to happen with El Ninos too. It’s all breakpoints, not the steady increase that might be expected from a constant forcing. In reality nothing about AGW is well understood.
I have often wondered why no one has taken the time to study when automated stations were implemented worldwide and how this might have affected temperature readings. The US began using ASOS stations in the 1980s, about the same time the breakpoint is postulated to occur. I don’t know about the rest of the world. UHI could have grown pretty quickly as the population in cities has grown. Lots of things could be causing high rates of change.
You’ve got some splainin’ to do, Lucy.
“huge acceleration in about 1970″
As it is based on mostly urban data… and better rural data is forced to confirm…
No evidence it is anything but urban warming.
Straws, grasping at!
“The paper certainly says there was huge acceleration.”
It does not! It acknowledges an abrupt change in the slope of the linear trend in 1970. They said specifically, “…, a warming surge could not be reliably detected anytime after 1970.” Additionally, they said, “While it is still possible there was a change in the warming rate starting in 2013, the HadCRUT record is simply not long enough for the surge to be statistically detectable at this time.” In other words, they have ‘pulled the rug out from under’ those claiming that the GMST change recently accelerated. The implication is that the basis for the claim of acceleration is subjective, based on spurious correlations observed over a short time span. There is no robust statistical support for the claim, based on their model.
+100
“an abrupt change in the slope of the linear trend in 1970″
An abrupt change is a huge acceleration. Infinite, in fact.
It’s the old business of statistical testing. Many scenarios can explain the data. They have one (all happened in 1970) which they say explains as well as the others and involves no more recent acceleration. That doesn’t disprove the others.
DENIAL of urban warming
The first and last refuge of the climate scammer.
But not impossible. The problem with time series, which these are, is that there is no information about the underlying causes.
Cyclical phenomena can meet in phase and cause something like a “pole” in EE parlance where feedback goes wild. Using regressions against time will NEVER lead one to the complexity of the underlying variables that cause temperature variation.
However, that was 54 years ago, not recently!
The acceleration is not infinite. As a first-order estimate, it is approximately the GMST change over a short unit of time, not equal to zero, centered on the 1970 ‘knickpoint.’
See particularly the paragraph titled “Instantaneous rates of change” and the following paragraph, “Formal definition of the derivative” at the link below:
https://www.britannica.com/science/analysis-mathematics/Calculus#ref218270
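The two positions are closer than they look. Stated as a first-order sketch (symbols mine, the one-year transition width purely hypothetical):

```latex
% If the trend slope changes from m_1 to m_2 over a transition interval
% \Delta t, the implied acceleration (second derivative) is, to first order,
a \;\approx\; \frac{m_2 - m_1}{\Delta t}
% An idealized kink takes \Delta t \to 0, which is the "infinite" reading
% above; any finite transition width (e.g. \Delta t = 1 yr, a hypothetical
% value) gives the finite first-order estimate described in this comment.
```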
A concise statement that reveals you put all your eggs into the ruler basket, and cannot see past it.
The crisis is actually in the “science” that calls itself “climate science” and those that promulgate this scam.
The corruption of data, and the use of junk data…
The fake models.
The activists pretending to be scientists.
Nick is in on all of it.
“Even without acceleration, that needs attention.”
Why? 2°C would put us about on a par with the MWP, which still leaves us adrift of the Roman Warm Period, and it still doesn’t change the fact that there is no scientific evidence supporting the hypothesis that changes to atmospheric CO2 concentration cause changes to atmospheric temperatures.
Which means that by 2070 there is a reasonable chance that the current warm period might have reached another peak or equally that it might already have peaked and started a downward trend.
You are brainwashed
Why is the process of filling a seemingly functional brain with lies, half truths and other assorted filth, called washing!?
Surely our “modern” education system is more akin to mental gang rape than learning!
brainfilthing
It should be called brain-stuffing
Not really. IMO Stokes is deeply emotionally invested in the CAGW hoax, to the point where he simply cannot countenance any possibility he is wrong as this would do terrible violence to his very self-image.
Stokes apparently used to work at CSIRO.
Would not be surprised if he had input into the abortive and pathetic CSIRO climate models.
Trying to protect a dumb legacy and keep friendships with those fellow collaborators still at CSIRO.
dingdingding.
“”[this showed] the climate was breaking down””
Am I alone in thinking this use of language as absolutely meaningless, clever wordplay, but nonetheless, meaningless?
Breaking down? Is this the new expression for [unwanted, undesired] changes that occur quite naturally? Need one say that the climate is always changing? How do they fit their utopian state of climate stasis into that? What exactly do they mean by breakdown and who is saying it?
“”Climate breakdown is the most recent term for what was previously known as global warming (that the average temperature of the earth is continuing to rise) or climate change (that the global climate has changed in an unprecedented manner). The phrase climate breakdown was made a journalistic standard by The Guardian after the recent IPCC report.
…
What climate breakdown, as a term, encompasses is a change of some kind””
https://poeticearthmonth.com/what-is-climate-breakdown/
For journalist read hacktivist. In reality, such as it is, Net Zero is an imaginary means of preventing the climate from changing – as it always has.
Three words spring to mind: arrogance, hubris and boneheadedness.
Looks to me like a bunch of trendologists gone on a rampage..
… totally unaware that they are working with meaningless maladjusted surface data.
The only “recent” surge has been from the 2023 El Nino.
ps… Given the massive urbanisation affecting surface station data, especially after 1970, it would be strange if the fabricated surface temperatures were not increasing.
But this is NOT warming caused by CO2 emissions.
I suspect that most of the surface sites that are going to be badly affected by urban and siting issues may have already been affected, and that the non-affected sites may now be reasonably stable…
(apart from class 4 and 5 sites intentionally created by the UK Met Office, BoM, etc.)
Time will tell.
Looks to me like a bunch of theologians gone on a rampage..
I’m enjoying the silence and warmth of my electric blanket on the cold dark morning.
Yep. Only 6% of UK Met Office sites are rated class 1 (pristine), whilst almost 80% are rated class 4 or 5, the two lowest classes.
Indeed over 8 in 10 of the 113 measuring sites established in the last 30 years are class 4 or 5.
Class 5 is defined by the WMO as “a site whereby nearby obstacles create an inappropriate environment for a meteorological measurement that is intended to be representative of a wide area”
And one needs to realize that the uncertainty values the WMO specify for each class are ADDITIONAL uncertainty that is to be added to the base uncertainty.
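A minimal sketch of what that combination looks like, assuming a base instrument uncertainty and an illustrative class-4 additional figure (both numbers are assumptions, and whether the terms combine in quadrature or add linearly is itself a judgment call, so both are shown):

```python
# Combine a base instrument uncertainty with a siting-class additional
# uncertainty. Quadrature assumes independent effects (per the GUM);
# linear addition is the conservative worst case. Values are illustrative.
import math

u_instrument = 0.5   # assumed base instrument uncertainty, deg C
u_siting = 2.0       # assumed additional uncertainty for a class 4 site, deg C

u_quadrature = math.sqrt(u_instrument**2 + u_siting**2)
u_linear = u_instrument + u_siting
print(f"combined (quadrature): +/-{u_quadrature:.2f} C, (linear): +/-{u_linear:.2f} C")
```

Either way, the siting term dominates the instrument term, which is the point being made about the class 4 and 5 sites.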
Maladjusted or not, it is what the alarmists use. They have demonstrated that the preferred data set does not support the claim(s) of an acceleration of GMST.
The difference between ArtStudent™ descriptions and scientific descriptions is that the former use emotions, the latter, numbers.
How do you feel….
Climate (noun): the long-term weather pattern in a region, typically averaged over 30 years.
Breakdown (noun): the act or process of failing to function or continue.
So ‘climate breakdown’ must mean the weather has stopped fluctuating (i.e. not changing).
I am still trying to figure out why these poor deranged people seem to pine for a return to the Little Ice Age. That would indeed be a breakdown, not of climate but of the pampered society they represent.
We have been having an above “normal” October here along the Wasatch Front, with a brief preview of winter this week. I find pleasant weather to be, well, pleasant. Sure beats the below “normal” October we had a few years back!
The change from “global cooling” to “global warming” was an admission of error.
The change from “global warming” to “climate change” was a marketing move.
The change from “climate change” to “climate breakdown” is another marketing move.
The science that’s being practiced (poorly) here is marketing.
Remember, Kamala Harris did not receive one vote in any presidential primary election. That is what the data shows.
However, 100% of her supporters experienced “political climate change” (marketing) in less than two months when she received her party’s nomination from all the party delegates.
Deranged people think, say and do deranged things.
“Remember, Kamala Harris did not receive one vote in any presidential primary election.”
Isn’t that how democracy is supposed to work?
As we all suspected.
Can we all have a refund of the useless green taxes we have all paid for years, that made no difference?
Real climate reparations.
Compensation for all that global grifting
What a joke!
Was he signing a deal for another new airport??
Or for a new 5 star resort for the climate cabal to stay at.
It’s all a needle in a haystack, and the haystack won’t stay put. Any attribution of a long-term warming trend to human emissions of GHGs, in effect, is a claim that a tiny trend of warming – say 0.15°C per decade (UAH), or 0.015°C per year – can be isolated for study from the annual cycle of about 3.8°C of warming and cooling taken as a global average. Such a claim is not valid in any physically real sense of cause and effect, in my view. I like this ClimateReanalyzer web page, which uses the parameter for T2m from the ERA5 reanalysis model. You can go to the Arctic, Antarctic, tropics, NH, and SH to get a better idea about what is driving the global average up or down. Look at the tropics for 2024 vs 2023 against the background of all the other squiggles for past years.
https://climatereanalyzer.org/clim/t2_daily/?dm_id=world
It’s 2024, and I still need a new pair of slippers.
Go out and get yourself a couple of raccoons.
I can’t see why this would be “sensational”. I’ve been pointing out here for years that the best assumption Is a linear rate of warming since the 70s. You will always need more than a few years to establish an actual change in the rate of warming. Until then there is no significant evidence for acceleration, deceleration, or a pause.
Have you pointed out that – in accordance with reality – there is no problem?
“I’ve been pointing out here for years that the best assumption Is a linear rate of warming since the 70s.”
Which of course is load of total balderdash based on junk science surface fabrications.
And the reason for the increase in the surface data is urbanisation…
NOT CO2
All that urbanization happening in the oceans.
Show us where the measurements for the oceans were made in 1978.
The topic is the surface data , not some other fakery from the GISS stable.
It’s UAH data. The measurements were made from space.
Therefore it would be IMPOSSIBLE to claim there is accelerating sea level rise would it not?
If it was just as warm in the Early Twentieth Century as it is today, then where is the warming?
The temporary warming from the 1970s is just bringing the temperatures up to where they were in the Early Twentieth Century.
The official temperature records have been bastardized by climate change political activists and don’t show the warmth of the Early Twentieth Century, because showing that it was just as warm then as it is today would destroy their human-caused climate change narrative.
So the climate change activists lie about the temperature record. Any studies done using these bastardized temperature records are necessarily bogus and do not represent reality.
My melon plants were killed by an early freeze a couple of days ago. That’s about two weeks ahead of the usual first freeze around here. Hottest year evah!, huh? Not around here, it isn’t.
You have melon plants? I love melon plants. Mine are seedlings as we speak. Gold on Gold variety.
Don’t mistake one of those melons for your head!
Do you ever have a nice thought, or is your default just plain nasty?
Your default is just plain DUMB. Your only option.
What Variety?
I’m going to assume you are asking seriously. Gold on gold. They are a yellow melon.
Actually gold in gold. See them near the end…
What of the Hunga Tonga effect?
It is indeed interesting that climate alarmists would try to make the word “skeptic” a pejorative! Anyone with a basic understanding of science, and the scientific method, would know that skepticism is as foundational as careful observation. Calling someone a skeptic is, in reality, claiming that they are the BETTER scientist or observer!
Food for thought. How much of the observed temperature increase in the late 70s and early 80s was due to many nations drastically reducing air pollution, especially particulates? How is this bad?
From the paper.
What is the uncertainty in these values? ±0.0005?
The values are so far below what was measured that it simply is not believable that ANYONE can tease measurement values like these out of the data. ASOS uncertainty is ±1.8°F (±1.0°C), and I assume LIGs (liquid-in-glass thermometers) were no better, while even CRN stations have an uncertainty of ±0.3°C (±0.54°F).
Mathematicians and scientists that dismiss and disregard the propagation of these uncertainties throughout their calculations are counting the number of angels on the head of a pin and are trying to convince everyone they have encountered a method of doing so.
According to Nick Stokes, you are an “uncertainty crank” if you ask questions like this one.
According to Nick and his peeps, NIST and ISO are stupid agencies that are only concerned with complicating measurements, their meaning, and their use.
The probity, provenance and presentation / prosecution of temps “data” are all shockingly compromised.
What do we put this situation down to?
Malfeasance? Perfidy?
Ignorance! The inability to recognize that you don’t know what you don’t know is operative amongst the warmist mathematicians. These folks demonstrate NO knowledge of higher-level physical science training, where accuracy, precision, and uncertainty rule all measurements, from the size of a 10-penny nail to the speed of light!
The people (“scientists”) who throw out numbers like these as meteorological temperatures have clearly never actually read a real thermometer.
All statistical numbers have infinite precision and any time statistical numbers are averaged the significant figures go from aleph null to aleph sub-one. 🙂
“The values are so far below what was measured, it simply is not believable”
For heaven’s sake, look at the units. They are describing trends in °C/year. ASOS etc are in °C. Different.
0.029 °C/year is 29 °C/millennium. Same quantity. Should be able to measure 29 °C accurately?
But they aren’t “measuring” 29 C.
They’re “constructing” 29 C.
?
Units.
You’re only going to confuse him!
Is “him” me or Nick?
Him is Nick. Must have clicked on the wrong Reply button!
Good grief, Nick, you have no conception of how the °C is derived, do you? That is just a stupid comment.
Do you think temperature has a time component when you read a thermometer? Do you think there is an SI unit of °C/time? What device do you use to measure °C/time?
The SI unit of °C has an uncertainty because it is derived from measurements that have uncertainty. That uncertainty in each measurement must be PROPAGATED through following calculations. Where do you think the description of “propagate” originated in metrology and why?
Your attempt to dismiss that fact illustrates your ignorance of how physical science is done when determining physical quantity values.
So if you take the slope of the UAH data, say 0.8 °C ± 0.05 over 45 years, that would be 0.018 ± 0.001 °C/year.
“29 °C/millenium.”
Only a complete mathematical moron extrapolates units like that.
It’s just an alternative expression of the rate of change. It’s not a meaningful time scale, but it’s not really extrapolation.
It’s like expressing velocity in parsecs per fortnight.
A rate of change is more properly stated as “Δ°C/time”. But the value of Δ°C is still a derived value, from separate measurements with uncertainty that should be propagated and included in the statement of the value!
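Putting this subthread’s arithmetic in one place, a minimal sketch using the UAH numbers quoted above (the exact-divisor propagation rule is the assumption here):

```python
# (1) A trend stated as delta-T over delta-t carries a propagated
# uncertainty; (2) changing the time unit rescales the number without
# adding information. Values follow the earlier example (0.8 +/- 0.05 C
# over 45 years).
dT, u_dT = 0.8, 0.05     # temperature change and its uncertainty, deg C
dt = 45.0                # elapsed time, years (taken as exact here)

trend = dT / dt
u_trend = u_dT / dt      # exact-divisor propagation: u(a/c) = u(a)/c
print(f"trend: {trend:.4f} +/- {u_trend:.4f} C/yr")
print(f"     = {10 * trend:.3f} +/- {10 * u_trend:.3f} C/decade")
print(f"     = {1000 * trend:.1f} +/- {1000 * u_trend:.1f} C/millennium")
```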
The title of this article is a lie
When looking at 10-year averages, manmade CO2 emissions and the global average temperature have had a VERY strong positive correlation.
The past ten years, viewed in UAH, have had +0.4°C of warming per decade, faster than all prior decades since 1974.
Each decade (1974 to 1984, 1984 to 1994, etc.) using UAH when available, has had higher average atmospheric CO2 increases and higher average temperatures.
Only fools deny the obvious connection between manmade CO2 emissions, the atmospheric CO2 levels and global warming. Unfortunately, there are a few fools who comment here and deny all three facts of life.
You are assuming that it’s the increase in CO2 levels that drives global temperature?
The ‘global average temperature’ construct is arrant nonsense.
There’s another “fact of life” for you.
Yes mindless child.
The rate of CO2 increase follows the ocean atmospheric temperatures (see chart)
The warming in UAH happens ONLY at El Nino events…
… and not even you are stupid enough to say that CO2 causes El Ninos.
(or are you are that stupid?.. hard to tell just how stupid you really are !)
You still have ZERO empirical scientific evidence of any warming by atmospheric CO2
“The past ten years, when viewing UAH, has had a +0.4 warming per decade, “
Using two large El Nino events.. that is truly dumb.
You continue to make yourself look like a complete and utter fool.
You have zero “FACTS” that you can back up with any science whatsoever..
…. just your usual mindless AGW-cultist bluster.
You keep making a FOOL of yourself whenever asked for evidence.
Here are some facts for you… evidence shown to you elsewhere in prior major posts… but which you DENY.
**Human CO2 is only about 4% of natural CO2 flux… so nearly all of it disappears rapidly
**There is no isotopic evidence of human CO2 remaining in the atmosphere
**Rate of CO2 change follows ocean atmospheric temperatures
**There is no empirical scientific evidence that atmospheric CO2 causes warming (you have proven that yourself)
**There is absolutely zero evidence of any human caused warming in the UAH satellite data (again, you have been ask to show evidence that there is.. and have failed)
All your petty and arrogant bluster is totally meaningless without any actual real science to back it up.
Be sure to come visit us when the temp falls back to the baseline, and don’t forget to mention the “connection with co2” then. Lol.
“The cure for boredom is curiosity. There is no cure for curiosity.” — Dorothy Parker
Note that in this time-zone right now it is a cloudy Sunday afternoon, with nothing worth watching on the television.
Even before starting to read your post I could see that you had not attached a graph, and there were no links, AKA “red text, often but not necessarily starting with http”, in your post.
I am neither psychic nor telepathic.
Unlike you, I am not omniscient, and I therefore have no idea whatsoever of what you were “looking at” (/ thinking of) when you were typing your post.
OK, you were “looking at” UAH [ TLT, V6, by inference on my part ], and its rates of warming, rather than “global average (surface) temperature” (GAST) anomalies …
1) GAST datasets are available back to 1850, even GISS goes back to 1880. Why pick on UAH, which starts in December 1978 ?
2) “Each decade”, after using the term “10 years averages” ?!? GAST (and UAH) datasets come in monthly resolution. Reducing that to 10-year (/ 120-month) “bins” is going to lose a lot of information.
3) You’ve switched back to “average temperatures” again. Are you “looking at” graphs of “temperature anomalies” or “rates of warming” ? … Never mind, I’ll assume the former.
Unfortunately (for you) I am a “visual” person, and find I can only really “understand / appreciate / get” a timeseries of numbers by plotting it on a graph and then “just looking at” it for a while.
NB : This doesn’t always work, but it does seem to be a required step for my brain to truly “learn” something.
Despite everything, my curiosity was piqued by your post on this cloudy Sunday afternoon, and as a result I generated the attached graph, using data from UAH, HadCRUT (as a representative pre-1979 GAST dataset) and the GCP (Global Carbon Project, for annual CO2 emissions).
Do you really think that using the “10-year binned” numbers — the asterisks in the graph below — is somehow “more insightful” than the alternatives ?
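To put a number on how much the binning throws away, a minimal sketch on a synthetic monthly series (the trend, wiggle, and noise levels are all invented; this is not HadCRUT or UAH):

```python
# 540 monthly values collapse to 4 usable 10-year bins, and all sub-decadal
# structure is gone before any "correlation" is computed on them.
import numpy as np

rng = np.random.default_rng(3)
months = 540                               # 45 years of monthly data
t = np.arange(months)
y = 0.0015 * t + 0.2 * np.sin(2 * np.pi * t / 44.0) + rng.normal(0, 0.1, months)

bins = y[: (months // 120) * 120].reshape(-1, 120).mean(axis=1)
print(f"monthly points: {months}, 10-year bins: {bins.size}")
# A handful of binned points will correlate strongly with almost any
# monotone series, which is the objection to "10 year averages" raised above.
```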
PS :
Just because something is “obvious” to you (or me) does not mean that it is “necessarily” true.
See : The Flat Earth Society.
Just because something is not “obvious” does not automatically mean that it is false.
See : Quantum Mechanics, both Special and General Relativity, …
At first look, one would assume that CO2 and temperature are tied together, with one causing the other. But the limits of time series interrupt this simple conclusion.
It is likely there is an underlying cause that results in the simultaneous growth of both. That is why correlation is not proof of causation.
The “gold standard” for climate science is often considered to be the IPCC WG-I assessment report, the latest (AR6, “Final / Approved”) version of which was released in May 2021.
From the “Technical Summary” chapter of that document, section TS.3.2.1, “Equilibrium Climate Sensitivity, Transient Climate Response, and Transient Climate Response to Cumulative Carbon-dioxide Emissions”, on page 94, starts with :
NB : For more detail see section 5.5.1, “Transient Climate Response to Cumulative Emissions of Carbon Dioxide (TCRE)”, on pages 742 to 749.
OK, let’s take that as an “assume true for the sake of the argument” axiom, and see if there are any “unexpected” results.
I discovered that the Global Carbon Project (GCP) had released their initial estimate of anthropogenic CO2 emissions for 2023 in March this year, and I updated the attached graph accordingly at that time.
Notes
– For the first and third segments, the “global warming” didn’t even have the correct sign.
– For the second segment, the actual warming was more than three times the “expected” warming (due to TCRE).
– The slope of the TCRE curve has only matched the slope of the fourth segment since 2005/2010. 15 to 20 years of data does not equal “climate”.
– The actual GMST curve (HadCRUT5 in my example) kept going outside the IPCC’s “confidence interval” until 1976, i.e. it has only remained entirely within that range since 1977.
.
The actual situation is nowhere near as “obvious” as you have been (mis-)led to believe.
Climate Break Down
Leftists Jumping Up and Down
Junk science Triple Crown
Freedom Clamp Down
Capitalism Knock Down
You remind me of a sad, dumb kid I knew when we were at school who would eat dirt as long as we promised to be friends with him.
I also do not see any significant warming here in South Africa.
https://breadonthewater.co.za/2024/05/28/no-change-in-temperature-in-south-africa-for-more-than-45-years
I can’t see any justification for fiddling with recorded temperatures. I also can’t see any justification for comparing traditional thermometers with satellite or the new measuring devices. Any adjustment is an opportunity for corruption.
There is no justification possible. Would anyone allow physicists searching for a new particle to adjust past data to show that it existed at an earlier time?
Adjustments after the fact are NOT corrections. Corrections are made at the time of reading, based on calibration information, not years later based on some screwy idea that homogenization will reduce uncertainty. If you don’t KNOW the adjustment value, then guessing is all that is occurring.
Well said Jim. 🙂
“Hottest year, Evah!!!!”