The Met Office is Unable to Name the Sites Providing ‘Estimated’ Temperature Data For its 103 Non-Existent Stations

From THE DAILY SCEPTIC

by Chris Morrison

Last year the UK Met Office was shown to be inventing long-term temperature data at 103 non-existent weather stations. It was claimed in a later risible ‘fact check’ that the data were estimated from nearby well-correlated neighbouring stations. Citizen super sleuth Ray Sanders issued a number of Freedom of Information (FOI) requests to learn the identity of these correlating sites but has been told that the information is not held by the Met Office. So the invented figures for the non-existent sites are supposedly provided by stations that the Met Office claims it cannot identify and are presumably not recorded in its copious computer storage and archive.

Mr Sanders is understandably unimpressed with the explanation that this vital identifying information is not retained, writing: “Is the general public just supposed to ‘believe’ the Met Office without any workings out evident? To me, and every single scientist who has ever lived, it is imperative to show the data used – ANYTHING LESS IS NOT VALID. No Verifiable Data Source = No Credibility = no better than Fiction.”

Until recently, the Met Office showed weather averages including temperature for over 300 stations stretching back at least 30 years. The data identified individual stations and single location coordinates, but when 103 were found not to exist the Met Office hastily rewrote the title of the database to suggest that the figures arose from a wider local area.

Following the change, Sanders sought FOI guidance about Scole, a temperature weather station in Norfolk that operated for only nine years, from 1971 to 1980. Type Scole into the new ‘location’ database and it is identified as one of five sites that are the “nearest climate stations to Scole”. Sixty years of average data are given, including 10 years before Scole was actually established. This itself is odd, since the Met Office justifies ‘estimating’ data for closed stations as preserving the long-term usability of the data. It would appear a stretch to use this explanation to justify preserving 1960s data from a station that did not open until 1971. Sanders made a simple request and asked the Met Office to reveal the names of the weather stations used in compiling the climate average data for Scole from 1990 to 2020. If the Met Office was unable to supply the full list, he made it as easy as possible and asked for the name of the last station supplying data.

The astonishing claim that the Met Office was unable to help because the information was not held was followed by an explanation that “the specific stations used in regressive analysis each month are not an output from the process”. The unimpressed Sanders observes that the Met Office archives billions of numbers and data items but does not seem to keep a record of its workings out. “So they have no proof whatsoever of how their climate averages were compiled,” he observes.
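The “regressive analysis” the Met Office invokes is, in outline, ordinary regression-based gap filling. The sketch below is hypothetical: the station values are invented, and the Met Office’s actual monthly algorithm is not documented per station, which is the point of the FOI requests.

```python
# Hypothetical sketch of regression-based gap filling ("estimates obtained by
# regression relationships with well-correlated neighbouring sites").
# All station values below are invented for illustration.

def fit_line(xs, ys):
    """Ordinary least squares fit y = a + b*x over the overlap period."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def fill_gaps(target, neighbour):
    """Replace None entries in `target` with regression estimates taken
    from the co-located months of `neighbour`."""
    overlap = [(nv, tv) for nv, tv in zip(neighbour, target) if tv is not None]
    a, b = fit_line([nv for nv, _ in overlap], [tv for _, tv in overlap])
    return [tv if tv is not None else round(a + b * nv, 2)
            for nv, tv in zip(neighbour, target)]

# Scole-style example: the gap in month 3 becomes an estimate, not a measurement.
neighbour = [4.1, 6.0, 8.2, 11.5, 14.3]
target = [3.9, 5.7, None, 11.1, 13.8]
print(fill_gaps(target, neighbour))
```

The estimate is only as good as the correlation and the neighbour’s own record, which is why knowing which neighbours were actually used matters.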

Sanders also sought similar details about another ‘zombie’ site, namely Manby in Lincolnshire. This actually closed for temperature readings in 1974, but again 60-year averages are currently available. Sanders was intrigued by this site since the CEDA archive that collects Met Office data showed it was still open, a claim also made in an earlier FOI disclosure by the state meteorologist. Again, Manby is identified as the nearest climate station when its name is searched on the climate averages site. But the Met Office’s Weather Observations Website shows it is closed, and Sanders notes the Met Office has since confirmed that to him. It has been 50 years since an actual temperature reading was taken at Manby, but as with Scole, the Met Office under an FOI request is unable to name any of the ‘well-correlated’ sites supposedly providing data.

It is difficult to understand why the Met Office cannot answer a simple question seeking guidance on where temperature readings were taken. Presumably they would be obtained from the five nearest ‘stations’ identified when a location is entered into the climate averages database. But as the Daily Sceptic has reported in the past, there might be problems with this approach. Cawood in the West Riding of Yorkshire is a pristine class 1 site designated by the World Meteorological Organisation as providing an uncorrupted air temperature reading over a large surrounding area (nearly 80% of Met Office sites are in junk classes 4 and 5, with ‘uncertainties’ of 2C and 5C respectively). Cawood has good temperature recordings going back to 1959. But no rolling 30-year average for Cawood is provided. Instead, the Met Office flags data from five other sites, four of which don’t exist, with the fifth located 27 miles away at an elevation 163 metres higher. Even worse, the location of Norwich brings up five nearby stations, including Scole, none of which exist.

As the Daily Sceptic has noted in the past, the Met Office has only itself to blame for the often trenchant criticism it receives on social media about its temperature collecting operations. It does a fine job of forecasting weather, but activist elements in its operation have weaponised inaccurate temperature recordings to promote the politicised Net Zero fantasy.

Recently, the chief scientist at the Met Office, Professor Stephen Belcher, called for Net Zero “to stabilise the climate”, claiming he saw “more extreme weather” in the Met’s observations. In the UK, he suggested that between 2014 and 2023 the number of days recording 28C had doubled, while those over 30C had tripled, compared with 1961-1990. A more extreme weather trend is not something that the Intergovernmental Panel on Climate Change has seen, while observations about more recent hot days might ring truer if they were not based on the increasingly urban heat-ravaged Met Office databases.

And Ray Sanders’s take? “We are regularly told in the mainstream media, particularly the BBC, that we are entering an existential ‘climate emergency’, so how is it nobody wants to discuss the obviously fictional data that is being manipulated to support this ‘argument’?”

Chris Morrison is the Daily Sceptic’s Environment Editor.

Discover more from Watts Up With That?

281 Comments
strativarius
May 13, 2025 2:19 am

The names are:

Bogus 1, 2, 3, n…

There was a time when people got fired for fiddling and extrapolating, now they get promoted.

Why I returned to the Met Office

…every day I’m reminded of how lucky I am to work for an institution with such a storied past.

Looking ahead, I’m genuinely excited about what the future holds, especially with emerging trends in AI, machine learning, and cloud computing.”
https://careers.metoffice.gov.uk/stories/why-i-returned-met-office-innovation-and-partnership

Number crunching Bogus data.

Reply to  strativarius
May 13, 2025 3:02 am

No one should be allowed to gush about their excitement at the prospect of AI without being able to explain (a) what they mean by “AI”, and (b) how their notion of AI does the thing they think it does.

Anyone who attempts to give those two explanations will either stop or reveal themself to be an imbecile.

As for “machine learning”, that is the unfamiliar new name for a lot of stuff we’ve been using for years. And who the heck cares about where their computing is done? I guess if you’re still in primary school it could be exciting—I’d probably have been excited by it when I was about 10…

strativarius
Reply to  quelgeek
May 13, 2025 3:12 am

In truth AI is exceedingly fast pattern matching. Intelligent it is not.

Reply to  strativarius
May 13, 2025 3:33 am

Automated Idiocy.

strativarius
Reply to  AGW is Not Science
May 13, 2025 3:41 am

The way it was programmed.

Reply to  strativarius
May 13, 2025 8:26 am

And yet nations of the world are in a race to be the best in AI.

Bryan A
Reply to  strativarius
May 13, 2025 8:29 am

Never trust Mannipulated data sources.
Likely these zombie datasets are averaged from other zombie data sites

Reply to  strativarius
May 13, 2025 3:45 am

In truth AI is exceedingly fast pattern matching

Correct. Although the actual patterns being matched are often unknown.

Reply to  Zig Zag Wanderer
May 13, 2025 8:27 am

There are relevant pages on Wikipedia?

Bryan A
Reply to  Zig Zag Wanderer
May 13, 2025 8:31 am

Often schizophrenics can recognise nonexistent patterns and insist they exist

Mr.
Reply to  strativarius
May 13, 2025 7:07 am

It just scrapes all of the Wikipedia pages at once very quickly so you don’t have to go to every relevant page one by one.

Bryan A
Reply to  Mr.
May 13, 2025 8:33 am

Sounds very trustworthy. Editable.

Reply to  strativarius
May 13, 2025 9:17 am

I am not at all sure of this.
If you read “Game Changer” by Sadler (you need to be able to follow the games at quite a high level, 2000+ rating minimum) you will find Alpha Zero making moves which seem simply incomprehensible, but which some moves later turn out to have rested on what, if it were a human playing, we would call deep insight into the position.
Now, Alpha Zero isn’t programmed in any usual sense to do this; it learned chess itself. It also can’t be doing pattern recognition in any usual sense – it is playing like this in situations it has never seen before and does not have a database of. It’s doing it with a game where the number of possible moves is huge, far beyond any program’s power either to exhaustively analyze the positions or to hold them in a huge database.

People will say this is just a program looking to you like a human player of great ability. Well, maybe. But Stockfish, which Alpha Zero wiped the floor with, is recognizable, and so are its limits.

The claim that it’s just pattern recognition seems in the end to rest on defining what it does in that way. It’s not a claim that is testable by observation.

So what do I think it’s doing? I don’t know for sure, but my starting hypothesis is that it is doing something very much akin to ordinary human thought, at a very high level.

A possible counter-example would be its weaker performance and its defects in Go. It did outperform a very strong human player at Go, but it turned out later to have some rather basic vulnerabilities. I don’t know why. If it has any of these in chess I haven’t heard of them.

LLMs are probably just something akin to pattern recognition and extraction and synthesis, in some vague sense of the expression. But I am not at all sure that the kind of AI which Alpha Zero represents is. And of course it will have moved on since then; that match with Stockfish was several years back.

Sparta Nova 4
Reply to  michel
May 15, 2025 7:46 am

All that chess program is doing is, starting from the current board position, running every possible move and counter-move, then calculating which has the highest probability of success, or eliminating moves with the lowest probability of success, probably both.

It operates as an optimization search algorithm. In this sense it replicates the thought processes of a chess player.
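The exhaustive move-and-counter-move search described here is classical minimax. A toy sketch on an invented game tree (whether Alpha Zero actually works this way is precisely what this thread disputes):

```python
# Toy minimax: leaves are payoffs, lists are positions with child moves.
# This illustrates the classical exhaustive search the comment describes,
# not a claim about how Alpha Zero actually works.

def minimax(node, maximizing=True):
    """Return the value of a position under best play by both sides."""
    if not isinstance(node, list):
        return node  # a leaf: the position's payoff
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

print(minimax([[3, 5], [2, [9, 1]]]))
```

Real chess engines prune this search heavily; a literally exhaustive version is only feasible on tiny trees like this one.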

Reply to  quelgeek
May 13, 2025 7:26 am

AI has no current means of discriminating between “truth” and “falsehood”, other than by appealing to the current consensus viewpoint as revealed by Internet-sourced information/publications. And we all know how misleading a “consensus viewpoint” can be (reference “flat Earth”, Newton, Copernicus, Einstein, Planck, Heisenberg, and numerous other great thinkers).

AI has NO independent means of performing experimental testing of any given hypothesis or meme, which is an integral part of The Scientific Method that humans have found to be so useful in expanding science-based truth.

The intelligence of AI is far more artificial than most people recognize.

Reply to  ToldYouSo
May 14, 2025 8:35 am

This isn’t true. Alpha Zero plays moves that in the end result in either a win or a loss. It doesn’t rely on any consensus about anything.

Reply to  michel
May 14, 2025 11:46 am

And how, pray tell, does an AI playing games like chess, shogi, or Go (e.g., Alpha Zero) involve it making decisions about truth or falsehood?

P.S. Making a “false” move in a game, as well as in life, is often called cheating . . . be careful how far you want to run with your assertion, because I believe the consensus of humans is that solo “games” and multiple-player “games” should not involve cheating.

Sparta Nova 4
Reply to  michel
May 15, 2025 7:48 am

Perhaps, or perhaps it reviews all of the recorded games and uses that to create a best case consensus.

Reply to  strativarius
May 13, 2025 3:32 am

Yes with AI they can not only make up imaginary “data” faster, but it provides another ready excuse for the complete lack of transparency – the “proprietary” AI algorithms.

Ron Long
Reply to  strativarius
May 13, 2025 5:14 am

Strativarius, I am pretty sure I saw a preview of this type of alternate reality on the old Rod Serling show “The Twilight Zone”. It didn’t end well.

strativarius
Reply to  Ron Long
May 13, 2025 5:46 am

I couldn’t help noticing that Shatner, Nimoy et al featured in The Scary Door and The Outer Limits…

SF wasn’t just plucked out of the air, but it failed to chart the speed of progress – or dystopian advance. Take your pick.

Reply to  strativarius
May 13, 2025 7:57 am

Computing how clouds work would be quite a trick.

May 13, 2025 3:29 am

Oh, everyone knows the three stations they use for this “data.”

They’re called

Their;

Nether; and

Region

strativarius
Reply to  AGW is Not Science
May 13, 2025 3:42 am

Where the Sun don’t shine.

May 13, 2025 3:46 am

Come on Nick. Tell us all why we’re wrong!

rovingbroker
May 13, 2025 4:06 am

In the UK, he suggested that between 2014-2023 the number of days recording 28C had doubled, while those over 30C had tripled compared to 1961-1990.

When I read statements like these, I assume that the writer is hiding something. For example, “doubled” could mean they went from one instance to two instances over nine or ten years, and “tripled” could mean they went from one to three instances over 29 years. Hardly earth-shaking.

Professor Stephen Belcher, Show Me the Data!

rovingbroker
May 13, 2025 4:15 am

“So they have no proof whatsoever of how their climate averages were compiled,” he observes.

It’s a religion.

SxyxS
Reply to  rovingbroker
May 13, 2025 1:22 pm

The stations are as real as the climate crisis.

Every single station is a prophet, the models are the prophecies,
CO2 is Satan and the climate crisis is the Apocalypse.
Trump is Paul turned into Saul.
And the Carbon Tax is the indulgence you pay to get rid of your sins.

May 13, 2025 4:26 am

The fact that a government agency is allowed to manufacture data is mind blowing. What else gets manufactured to justify “problem solving”?

Scientific data are physically measured. Unmeasured, manufactured numbers are exactly what so many scientific endeavours are uncovering as they grapple with the replication crisis.

It is fantasy! It is fiction! There is no other way to put it. The people both doing it and allowing it should be dismissed for falsifying official records!

strativarius
Reply to  Jim Gorman
May 13, 2025 5:35 am

The people both doing it and allowing it should be dismissed for falsifying official records!

But they know best…. /sarc

Scissor
Reply to  Jim Gorman
May 13, 2025 6:00 am

Please stay 6 feet away.

Ed Zuiderwijk
Reply to  Jim Gorman
May 13, 2025 6:02 am

There was a time when you got fired for data fiddling. Methinks it may be time to get fired at.

Reply to  Jim Gorman
May 13, 2025 7:34 am

manufacture data

They didn’t manufacture it. Pls read their answer. Sanders and the OP either had problems understanding it or they pretended to; it doesn’t matter. You, as an engineer, a knowledgeable, literate person (/s), are expected to understand these things.

Reply to  nyolci
May 13, 2025 9:33 am

If it wasn’t physically measured, then IT WAS MANUFACTURED. You can claim that the process that manufactured the temperature arrived at an accurate value, but to do that you also need the evidence to prove it. That evidence seems to be missing.

As an engineer, I would never falsely claim I made a physical measurement when I did not. Depending on the risk I MIGHT interpolate between actual physical measurements, but I would always mark it as such and show what physical measurements were used. In other words document the evidence for future review.

Reply to  Jim Gorman
May 13, 2025 10:25 am

Resident troll nyolci’s relation to physical reality and truth is tenuous at best.

Reply to  Jim Gorman
May 13, 2025 2:13 pm

That evidence seems to be missing.

Read their answer. You will see what they actually claim, and the evidence. And when you actually understand that, then you’ll be able to assert things about it. Until then you’re just bsing, just as most of you deniers here do.

Reply to  nyolci
May 14, 2025 12:45 am

I’ve read their answer, and it’s perfectly clear the Met Office is making up temperatures, and you are lying again (no surprise there).

Reply to  Graemethecat
May 14, 2025 5:00 am

I’ve read their answer, and it’s perfectly clear

It’s perfectly clear that you didn’t understand it.

Sparta Nova 4
Reply to  nyolci
May 14, 2025 12:11 pm

I have a bridge for sale. Only slightly used.

Sparta Nova 4
Reply to  nyolci
May 14, 2025 12:07 pm

Deniers.
You lose all credibility with that.

None of us deny weather is variable.
None of us deny climate is a running average of weather.
None of us deny the running average, therefore climate, changes.

We argue for science. CO2 has a miniscule effect on the temperature and weather of this planet. We do not deny there is a human element.

What we argue vehemently against is radicalization of civilization that does not benefit human beings.

Reply to  Sparta Nova 4
May 14, 2025 11:01 pm

None of us deny weather is variable.

You deny science, big boy 🙂

Sparta Nova 4
Reply to  nyolci
May 15, 2025 7:52 am

Your definition of science fails the sniff test. Activism, now that passes.

I take personal offense to your comment.

Reply to  Sparta Nova 4
May 15, 2025 8:37 am

Your definition of science fails the sniff test

What is my definition of science? 🙂 I love when you deniers are bsing.

Sparta Nova 4
Reply to  nyolci
May 15, 2025 11:45 am

I love it when an alarmist uses insults to try to silence people all the while avoiding answering a simple question.

I continue to take personal offense with your personal abuse.

Reply to  Sparta Nova 4
May 16, 2025 4:08 am

I love it when an alarmist uses insults to try to silence people

Wish I could silence idiots like you who pollute the internet and make even the simplest debate turn into an unhinged bs festival. Unfortunately, idiots like you have the biggest loudspeakers nowadays.

Sparta Nova 4
Reply to  nyolci
May 16, 2025 10:23 am

So speaks the Princess Flame War.

Reply to  nyolci
May 13, 2025 10:47 am

That is all you got is a cooked up reply with nothing of substance in it, why do you bother coming here if all you have is non existent data to offer……

Snicker.

Sparta Nova 4
Reply to  Sunsettommy
May 16, 2025 10:22 am

Princess Flame War comes here for attention and amusement. She scores points every time she gets someone to respond.

Ray Sanders
Reply to  nyolci
May 13, 2025 1:34 pm

Are you running a brain in for an idiot?

Reply to  Ray Sanders
May 13, 2025 2:22 pm

Are you running a brain in for an idiot?

Oh, and here we have the chief cretin. Nice to meet you. I don’t know what is worse, if you pretend not to understand the answer, or if you really don’t understand it. I assume the first but then I have to assume that you consciously take part in this propaganda process, just like very likely Watts and the McCretins. So you do it for pay. If not, then you are just dumb, just like Gorman and the rest here. So which one are you? Please specify it in your answer. Thx.

Sparta Nova 4
Reply to  nyolci
May 14, 2025 12:12 pm

If you believe you understood that nonsensical “explanation,” then I feel sorry for you.

Reply to  nyolci
May 13, 2025 4:56 pm

It is scientific misconduct. What you are saying is that in a medical efficacy study, if you don’t have enough patients, you can create homogenies of pseudo-people for the study to prove the effect. Of course, that is how we got where we are with vaccines and autism, and endocrine disrupters.

Reply to  g3ellis
May 13, 2025 10:48 pm

you can create homogenies of pseudo-people for the study to prove the effect

Good god… Okay, this is a misunderstanding on multiple levels. First of all, this is not even for proving anything, this is just an FYI-style thing, a courtesy from the Met Office. But this is not even what the claimed problem here is. That is about some supposed problem with the way the homogenization itself had been done. The thing is that the MetO is using a complicated algorithm (described in a paper), so for a certain location and for a longer duration the series has a non-trivial dependence on the series of nearby stations, and even the series of the location itself if that existed (even just for a limited period). The algorithm doesn’t record this; this information is irrelevant for the purpose of the site. In other words, they just can’t tell this without long calculations. And the FOI is about information that is actually at hand. BTW the FOI request was obviously malicious.

Reply to  nyolci
May 14, 2025 12:53 am

The thing is that the MetO is using a complicated algorithm (described in a paper), so for a certain location and for a longer duration the series has a non-trivial dependence on the series of nearby stations, and even the series of the location itself if that existed (even just for a limited period). The algorithm doesn’t record this; this information is irrelevant for the purpose of the site. In other words, they just can’t tell this without long calculations.

You don’t need complicated algorithms or long calculations to create a temperature series. All you need are calibrated thermometers in known locations and honest record-keeping.

Thanks for confirming the accusations of malfeasance.

Reply to  Graemethecat
May 14, 2025 5:03 am

All you need are calibrated thermometers in known locations

The point of contention was exactly the fact that there was no calibrated thermometer at a certain location during a very long period in the past, and they wanted to calculate an (approximate) temperature for the location. (I cannot wrap my head around how you deniers organically fail to understand extremely simple things.)

Reply to  nyolci
May 14, 2025 5:56 am

they wanted to calculate an (approximate) temperature for the location.

And what is the uncertainty in an approximate temperature?

Was that uncertainty propagated into an anomaly calculation?

Reply to  Jim Gorman
May 14, 2025 11:53 pm

And what is the uncertainty in an approximate temperature?

Why do you think they don’t know that? Anyway, this wasn’t even the question. They questioned the need for calculating a temperature instead of measuring it. I just pointed out that you couldn’t measure temperatures directly in the past if there had not been a primary measurement at the location and time in question. You have to approximate that value if you can’t measure it.

Was that uncertainty propagated into an anomaly calculation?

Yes. For each and every calculation the result is the average, the uncertainty and the weight (area and time period). A “tuple” of these values. When you use that average further, these variables are used (“propagated”) to those calculations, contributing to that average (resulting in another “tuple”). The whole thing is extremely simple, the only (literally the only, and accidentally a very reasonable) assumption is independence of measurements. Uncertainties don’t have to be Gaussian, don’t have to be of the same distribution etc.
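The propagation described here, combining (value, uncertainty, weight) tuples under an independence assumption, can be sketched in a few lines. The function and numbers are illustrative only, not the Met Office's actual code.

```python
import math

# Minimal sketch: each measurement is a (value, uncertainty, weight) tuple,
# combined under the stated assumption of independent errors.
# Numbers are invented for illustration.

def combine(measurements):
    """Combine (value, uncertainty, weight) tuples into one such tuple.
    For independent errors, variances add in quadrature with squared weights."""
    total_w = sum(w for _, _, w in measurements)
    mean = sum(v * w for v, _, w in measurements) / total_w
    sigma = math.sqrt(sum((u * w) ** 2 for _, u, w in measurements)) / total_w
    return mean, sigma, total_w

print(combine([(10.0, 0.5, 2.0), (12.0, 0.5, 1.0)]))
```

Note how, as the comment says, nothing here requires the uncertainties to be Gaussian or identically distributed; only independence is assumed.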

Reply to  nyolci
May 14, 2025 6:36 am

The point of contention was exactly the fact that there was no calibrated thermometer at a certain location during a very long period in the past, and they wanted to calculate an (approximate) temperature for the location.

You measure a temperature, you don’t “calculate” it. Why would anyone want an “approximate” temperature anyway?

This is called “making things up”.

Reply to  Graemethecat
May 14, 2025 9:45 am

You measure a temperature, you don’t “calculate” it.

For that matter, even this is false, but I know your intention. (You don’t measure temperature, nowadays you measure voltage that is mostly proportional to the temp in a range. With LiG, you actually measure distance that is mostly proportional to the temperature 🙂 )

Why would anyone want an “approximate” temperature anyway?

Hm, what was the temperature in your backyard on the 17th of May, 1989? Do you have a primary measurement for that? Or we only have some stations that are reasonably close?

Reply to  nyolci
May 14, 2025 10:27 am

I have no idea what the temperature was that day. However, I can tell you that the temperature was not the same everywhere in my backyard, and and that I would never use that approximate temperature to calculate an average temperature to 3 decimal places.

Reply to  Graemethecat
May 14, 2025 1:17 pm

I can tell you that the temperature was not the same everywhere in my backyard

Good boy. You are at least trying. Okay, when you hear in the weather report that currently the temp in town X is T, do you call them to tell them it can vary?

Reply to  nyolci
May 14, 2025 10:21 pm

Actually, yes. Ever heard the word “microclimate”?

Reply to  Graemethecat
May 14, 2025 11:03 pm

Actually, yes

Good on you 🙂

Sparta Nova 4
Reply to  nyolci
May 14, 2025 12:16 pm

If I were claiming I had that temperature at that date, there would be a measurement, mercury thermometer likely, and a written record.

Claiming stations where none exist and publishing data for those non-existent stations is criminal.

Reply to  Sparta Nova 4
May 14, 2025 1:18 pm

If I were claiming I had that temperature at that date, there would be a measurement, mercury thermometer likely, and a written record.

Is there? Is it readily available?

Claiming stations where none exist and publishing data for those non-existent stations is criminal.

They don’t claim that.

Reply to  nyolci
May 15, 2025 3:55 am

Claiming stations where none exist and publishing data for those non-existent stations is criminal.

They don’t claim that.

They admit exactly that:

“In order to provide advice and assistance, the long term record is based on observations data at the location, where it is available; any data gaps in the monthly data from this station are filled with estimates obtained by regression relationships with a number of well-correlated neighbouring sites using carefully designed scientific techniques”.

Reply to  Graemethecat
May 15, 2025 5:34 am

They admit exactly that:

Well, they admit exactly the opposite. “any data gaps in the monthly data from this station are filled with estimates” This is not a claim for a station’s existence, this is exactly the opposite. Okay, you are nearing it, you need to take those few steps in the process of comprehension. I can see the progress.

Reply to  nyolci
May 15, 2025 6:55 am

well-correlated neighbouring sites”

Correlation is not causation. Nor does correlation imply equal. Unless you are a climate scientist.

Reply to  Tim Gorman
May 15, 2025 7:08 am

Correlation is not causation.

No one claimed causation. And in this specific case, it’s irrelevant, we are interested in correlation. Tim, we all know thinking is not one of your strengths. Regardless, I ask you (again) to think before you post to avoid ridicule.

Reply to  nyolci
May 15, 2025 7:37 am

You left out the other statement. Correlation does not imply equal!

If correlation does not imply equal then how do you justify substituting a value from one location for one at a different location?

Correlation doesn’t even imply that the slope of the regression lines for two different data sets are the same. Thus it’s not possible to just assume that the anomalies will be the same!

Yet climate science assumes both, that correlation means equal for both temperature and anomaly.

Reply to  Tim Gorman
May 15, 2025 8:42 am

If correlation does not imply equal then how do you justify substituting a value from one location for one at a different location?

If correlation is established, that justifies substitution. Regardless of whether there is a cause-effect relationship in any direction. (Very likely the correlation at both points is the result of a common and complicated cause.)

Yet climate science assumes both, that correlation means equal for both temperature and anomaly.

Huh, another prospective motto for this site 🙂 The amount of nonsense you can come up with is staggering.

Reply to  nyolci
May 15, 2025 10:31 am

If correlation is established, that justifies substitution.”

You just stated the same garbage again.

“Regardless of whether there is a cause-effect relationship in any direction. (Very likely the correlation at both points is the result of a common and complicated cause.)”

Why can’t you address the fact that correlation does *NOT* mean equal?

Correlation does not mean equal either in absolute value or in anomaly.

I’ll ask again (and again and again until you address it):

If correlation does not imply equal then how do you justify substituting a value from one location for one at a different location?”

Reply to  Tim Gorman
May 16, 2025 4:09 am

Why can’t you address the fact that correlation does *NOT* mean equal?

Who claimed it meant equality? Correlation, if demonstrated, means a strong connection, regardless of its causes, you idiot.

Reply to  nyolci
May 16, 2025 4:47 am

Correlation, if demonstrated, means a strong connection, regardless of its causes, you idiot.”

Per capita use of margarine is highly correlated with the divorce rate in Maine. You are claiming that there is a strong connection between the two?

You *have* to claim equality if you want to substitute one value in place of another. That’s supposedly the whole idea of “homogenization”, that you can substitute a value in one data set for a value in a different data set. By claiming that homogenization of temperatures is a valid methodology you have to accept the implicit but unstated assumption that the substituted value is equal to the value that should be found in the primary data set.

Reply to  Tim Gorman
May 16, 2025 7:00 am

the substituted value is equal to the value that should be found in the primary data set.

Otherwise, one is creating new information.

Reply to  Jim Gorman
May 17, 2025 4:39 am

Otherwise, one is creating new information.

See above. Interpolation etc. is not substitution, but it’s not “new” information.

Reply to  nyolci
May 17, 2025 4:57 am

Interpolation is a method given to researchers to use in their research. Interpolation is not allowed in government provided official MEASUREMENTS. That is creating information that does not exist as a physical measurement and should not be portrayed as such with a station identification.

If the government wants to provide a separate database of interpolated data with geographic identification rather than station identification, that is their prerogative. But it should be identified as manufactured information and it is not physically measured data.

Reply to  Tim Gorman
May 17, 2025 4:38 am

I’m glad you’ve given up on bsing about causation.

You *have* to claim equality if you want to substitute one value in place of another.

No one wants to “substitute”, you genius. Interpolation etc. is not substitution.

Reply to  nyolci
May 17, 2025 5:18 am

More BS. Interpolating a data point at Location1 in order to use it for a data point at Location2 *is* a direct substitution. In order for it to be valid, the data point interpolated at Location1 *has* to be equal to the missing data point at Location2, or you’ve created a garbage data point at Location2.

You are in a hole. Stop digging it deeper.

Sparta Nova 4
Reply to  nyolci
May 15, 2025 11:49 am

So you admit they manufacture data.

Reply to  Sparta Nova 4
May 15, 2025 3:20 pm

So you admit they manufacture data guesses.

Fixed it for you. LOL

Sparta Nova 4
Reply to  Graemethecat
May 16, 2025 12:31 pm

That was the update after the missing climate stations were identified and the FOIAs needed an answer.

Sparta Nova 4
Reply to  nyolci
May 15, 2025 11:48 am

Is there? Is it readily available?

You really do not read. That, or your reading comprehension is vacuous.

So you deflect and amuse yourself by spewing insults and carrying on like you are the ultimate answer to life the universe and everything.

Reply to  Sparta Nova 4
May 16, 2025 4:11 am

You really do not read.

Okay, I said if you didn’t have a primary record for a certain data point, you have to interpolate somehow. You said, in your reply, that what if you had. This is a good demonstration for when the question doesn’t make sense.

Sparta Nova 4
Reply to  nyolci
May 16, 2025 10:20 am

You really do not read.

Okay, I said if you didn’t have a primary record for a certain data point, you have to interpolate somehow. You said, in your reply, that what if you had. This is a good demonstration for when the question doesn’t make sense.

What I said is this:
“If I were claiming I had that temperature at that date, there would be a measurement, mercury thermometer likely, and a written record.”

You are spinning it. I did not say in my reply what if you had.

Such nonsense from The Princess Flame War.

Reply to  Sparta Nova 4
May 17, 2025 4:42 am

“If I were claiming I had that temperature at that date, there would be a measurement, mercury thermometer likely, and a written record.”

I really don’t understand your problem. If you have a primary measurement, good on you. But we don’t have this luxury for most of the surface of the Earth. So if we want to know the temp there at a certain time, we have to use all the data we have from the neighboring stations.

Reply to  nyolci
May 17, 2025 4:51 am

If you have a primary measurement, good on you. But we don’t have this luxury for most of the surface of the Earth.

No, no you don’t. If an individual researcher wants to use some method of creating information to “fill in” what his algorithm requires, then they need to document the procedure and show what has been manufactured.

For a government agency to pronounce temperature data that does not exist as the “official” data is misleading at best. To hide the fact that it is being done is not acceptable under any circumstance. You can not rationalize it away.

Reply to  nyolci
May 17, 2025 5:24 am

If the factors determining the value of interest at Location1 are not the same as the factors determining the value of interest at Location2 then you’ve just created a crap data point for Location2.

if datapoint = a * b * c

and

(a1)(b1)(c1) ≠ (a2)(b2)(c2) then datapoint1 ≠ datapoint2

Then substituting datapoint1 for datapoint2 creates a garbage data point at location2.

Sparta Nova 4
Reply to  nyolci
May 14, 2025 12:14 pm

So, back to the “manufactured” data.

Sparta Nova 4
Reply to  nyolci
May 15, 2025 7:58 am

MET listed stations, some 103 of 300, that do not exist.
MET claimed temperature data from those sites. They lied or they misrepresented.
Had they marked those sites as calculated, rather than pretending the figures were real measurement data, we would not be having this conversation.
Had they responded to the requests with transparency, we would not be having this conversation.
The MET behavior begs the question: what are they hiding?

Reply to  Sparta Nova 4
May 15, 2025 8:27 am

Why did they think they needed 300 samples? Wouldn’t 200 be sufficient? Just drop the 103. It shouldn’t change either the average value or the standard deviation of the sample means. If it does then their sampling is garbage to begin with!
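
Tim’s claim can be sketched in Python with synthetic station values (the 300 stations, the 10 ± 3 distribution, and the seed are all illustrative assumptions, not Met Office data): if the 300 values really are a fair sample, dropping a random 103 of them barely moves the average.

```python
import random
import statistics

random.seed(7)

# Synthetic stand-in for 300 station averages (illustrative values only)
stations = [random.gauss(10.0, 3.0) for _ in range(300)]

# Drop a random 103 stations, as suggested above, keeping 197
subset = random.sample(stations, 300 - 103)

mean_all = statistics.mean(stations)
mean_subset = statistics.mean(subset)
print(mean_all, mean_subset)  # nearly identical if the sampling is sound
```

If dropping the 103 moved the average materially, that would itself flag a sampling problem, which is the argument being made.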

Sparta Nova 4
Reply to  Tim Gorman
May 16, 2025 12:33 pm

Questions abound, Tim.
One good question concerns the rapid updates to the website when these discrepancies were noted.

Reply to  nyolci
May 14, 2025 3:53 am
  1. It is becoming more and more apparent to those critical of climate science that homogenization of temperatures from surrounding locations does nothing but spread measurement uncertainty around. It is a garbage methodology. It’s why Hubbard and Lin found in 2002 that you cannot apply regional adjustments to temperatures because they are so dependent on the individual microclimates. Temperature adjustments *must* be done on a station-by-station basis and must be determined by comparison with a calibrated measurement device whose calibration is done on-site.
  2. Homogenization is based on the garbage memes that averaging increases accuracy and resolution and that all measurement uncertainty is random, Gaussian, and cancels.
  3. Homogenization is based on the garbage meme that temperatures are an extensive property that is correlated between locations based solely on distance and no other factor. It winds up with the idiocy that the temperatures on the east side of a mountain can be “homogenized” to a station on the west side of a mountain based solely on the distance between the latitude/longitudes. It ignores the statistical reality that correlation doesn’t mean things are equal, just that the slope of a linear regression line on the data is the same.

I could go on, but this should be sufficient to show that “manufacturing” physically meaningful temperature data is impossible. It only serves to defraud those who think that the temperature data is based on actual physical measurement.

Reply to  Tim Gorman
May 14, 2025 5:15 am

Tim, not again…

does nothing but spread measurement uncertainty around

What the MetO does is not homogenization. It’s just an informative website. This is your first misunderstanding in a long row. The homogenization you refer to has to be done otherwise the missing data point introduces unwanted artifacts. If you cannot understand this, you’re lost in a debate.

Homogenization is based on the garbage memes that averaging increases accuracy and resolution and that all measurement uncertainty is random, Gaussian, and cancels.

This is so bad it has to be put as a motto to this site. Homogenization is not based on the square root law (which I guess you wanted to refer to). No one has ever claimed that averaging increases accuracy for the individual measurement. It increases accuracy for the average. I don’t want to delve into resolution, that’s another can of worms. Measurement uncertainty per se is essentially a random variable, so it’s random. No one says it’s Gaussian, and this is not a requirement. As for canceling, the only (and obviously met) requirement is the independence of measurements.
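
nyolci’s claim about the average can be illustrated with a small simulation, a sketch under the contested assumptions (independent, zero-mean, here Gaussian noise; the function name and all numbers are illustrative):

```python
import random
import statistics

random.seed(42)

def spread_of_average(n, trials=2000, true_value=10.0, noise_sd=0.5):
    """Standard deviation of the average of n noisy readings, estimated
    by repeating the whole experiment `trials` times."""
    means = [
        statistics.mean(random.gauss(true_value, noise_sd) for _ in range(n))
        for _ in range(trials)
    ]
    return statistics.stdev(means)

# The spread of the average falls roughly as 1/sqrt(n) -- but only because
# the simulated errors are independent with zero mean, which is exactly
# the assumption the Gormans dispute for real thermometers.
se_1 = spread_of_average(1)
se_100 = spread_of_average(100)
print(se_1, se_100)
```

The simulation shows the claim as stated; whether real instrument errors satisfy the independence and zero-mean conditions is the point actually in dispute in this thread.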

Homogenization is based on the garbage meme that temperatures are an extensive property

This strange obsession of deniers with this and their persistent inability to understand these extremely simple things is astonishing. No, temperatures are not extensive, and no one has ever claimed that. When you average temperatures, you (usually implicitly) convert them to an extensive quantity and then convert them back to an intensive one with weighting. As for surface temperature with similar terrain all along, simple area weighting of the average will do this trick. I have calculated this to you and Jim at least 3 times already, and you still don’t get it. Congratulations.

Reply to  nyolci
May 14, 2025 5:53 am

What the MetO does is not homogenization. It’s just an informative website.

Yet you said in a prior post;

But this is not even what the claimed problem here is. That is about some supposed problem with the way the homogenization itself had been done. 

Which is it, no homogenization or some homogenization?

As to an informative website, has this informative information been used in published scientific papers?

If yes, did the authors know that they shouldn’t be using the information for scientific purposes?

Reply to  Jim Gorman
May 14, 2025 7:38 am

Yet you said in a prior post;

Yeah, my wording was sloppy, sorry. Technically the two are the same, or at least extremely similar, but this is just an informative website, nothing else. But anyway, I would like to remind you (just as I emphasized in the quoted part) of the fact that the supposed problem here was not even “homogenization” per se but the way it had been done.

As to an informative website, has this informative information been used in published scientific papers?

No.

If yes, did the authors know that they shouldn’t be using the information for scientific purposes?

Why the fcuk do you think they didn’t know?

Reply to  nyolci
May 14, 2025 6:21 am

“What the MetO does is not homogenization.”

Bullshite! When they substitute temps from one location for another location that is the very definition of “homogenization”. Pairwise homogenization is comparing temperatures with neighboring stations. If there is nothing at the location to use for comparison then the neighboring station reading is substituted.

“The homogenization you refer to has to be done otherwise the missing data point introduces unwanted artifacts.”

More bullshite! In *real* science, data points are typically determined by interpolation from surrounding data points. When there are no surrounding data points there can be no interpolation. It is substituting data points from other data sets that introduces unwanted artifacts!

“Homogenization is not based on the square root law (which I guess you wanted to refer to).”

I didn’t say it was. It doesn’t matter one iota what algorithm is used to determine the value to substitute, IT REMAINS A GUESS. Guesses have in-built uncertainty. It is simply unavoidable.

“No one has ever claimed that averaging increases accuracy for the individual measurement.”

Stop putting words in my mouth. How do you average ONE measurement? I didn’t claim anything about averaging one measurement. The climate meme is that averaging MULTIPLE measurements reduces inaccuracy and increases resolution! That’s the only way temperatures in the hundredths digit can be determined from measurements in the units digit!

“Measurement uncertainty per se is essentially a random variable, so it’s random.”

More bullshite! Very few physical measurement devices drift randomly. Most drift in physical measurement devices is caused by heat over time. Heat typically causes an expansion of whatever material is involved, be it the substrate on an integrated chip or the glass in a LIG thermometer. Expansion causes calibration drift in ONE DIRECTION, not in random directions. As usual, you are doing nothing here but demonstrating the fact that you have ZERO experience in the real world of measurements!

“When you average temperatures, you (usually implicitly) convert them to an extensive quantity and then convert them back to an intensive one with weighting.”

How in Pete’s name do you convert an intensive property to an extensive property while retaining the same dimensional description? A mid-range daily temperature has the same dimension as each component – degree of temperature. Averaging mid-range temperatures results in a value that has the dimension of DEGREE OF TEMPERATURE. Temperature is an intensive property. You can’t change that. You have to do something that changes the dimension.

In addition, since temperature is *NOT* an inherent extensive property converting it into one means you have to introduce multiple conversion factors. Things like humidity, pressure, elevation, geography, terrain, wind, etc. That conversion quantity is called ENTHALPY.

Tell us all where climate science has started to use ENTHALPY. Please provide a reference we can all find on the internet.

This type of conversion also introduces a problem with converting it back to a temperature since all of the other factors ARE TIME SENSITIVE. Even if the enthalpy at time T0 is the same as at T1 that doesn’t mean that humidity and pressure remain constant. So the temperature at time T0 can’t be directly compared to the temperature at time T1, only the extensive quantity of enthalpy can be directly compared.

I’ll repeat. You are just showing your lack of experience in the real world of measurements with everything you post!

“As for surface temperature with similar terrain all along, simple area weighting of the average will do this trick.”

More bullshite! What WEIGHTING? How do you weight an AVERAGE? You would weight the components, not the average. You can’t even get this one right!

“I have calculated this to you and Jim at least 3 times already, and you still don’t get it.”

You’ve never done a single calculation that I remember. Just idiotic words – like you’ve posted here. You’ve not shown how you convert temperature to an extensive value using real-world calculations. You’ve not shown how you weight an average with real numbers. You’ve just posted a word salad that makes no sense in the real world we actually live in!

Reply to  Tim Gorman
May 14, 2025 8:03 am

When they substitute temps from one location for another location that is the very definition of “homogenization”

Okay then, let’s call it “homogenization”. It doesn’t change anything.

If there is nothing at the location to use for comparison then the neighboring station reading is substituted.

No, this is substantially complicated. They used the method described in Parry and Hollis (2005).

When there are no surrounding data points there can be no interpolation. It is substituting data points from other data sets that introduces unwanted artifacts!

When you interpolate from “surrounding data points”, you “substitute data points from other data sets” as per definition, you genius 🙂

Very few physical measurement devices drift randomly.

No one was talking about drift, you fokkin genius. Each and every measurement has its deviation from the actual value. This deviation has multiple elements that may change with time (drift) but the non systematic parts are (universally modeled as) a random variable. Two consecutive measurements of the same quantity will differ due to this.

The climate meme is that averaging MULTIPLE measurements reduces inaccuracy

Oh, okay. Jim was known to be bsing about increasing the accuracy of individual measurements. But what you call a meme here is not a meme: averaging multiple measurements gives you a value that approximates the true average much more faithfully than any individual measurement approximates its own true value. If you don’t understand this simple thing you’re lost in this debate.

How in Pete’s name do you convert an intensive property to an extensive property while retaining the same dimensional description?

No one claimed it would be the same “dimensional description”.

have to introduce multiple conversion factors.

Good god… Temperature is internal energy per molecule per degrees of freedom. So if you multiply it with degrees of freedom (well known) and mass, you have internal energy, an extensive property. If you calculate the sum and then divide again, you get the average temperature for the whole thing. This is so fokkin simple… BTW, most factors cancel out, so under normal circumstances area weighting gives you an extremely good approximation.
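
A minimal sketch of the conversion nyolci describes, assuming an ideal diatomic gas and two hypothetical air parcels (the function name and all values are illustrative, not from either commenter): temperatures are converted to an extensive quantity (internal energy), summed, and converted back.

```python
R = 8.314  # universal gas constant, J/(mol*K)
F = 5      # degrees of freedom for a diatomic gas (assumption)

def mean_temp_via_energy(parcels):
    """parcels: list of (moles, temperature_K) pairs of the same gas.
    Convert each temperature to internal energy (extensive), sum,
    then convert the total back to a temperature (intensive)."""
    total_energy = sum((F / 2) * R * n * t for n, t in parcels)
    total_heat_capacity = sum((F / 2) * R * n for n, _ in parcels)
    return total_energy / total_heat_capacity

# Two parcels: 2 mol at 280 K and 1 mol at 310 K
print(mean_temp_via_energy([(2.0, 280.0), (1.0, 310.0)]))  # 290.0
```

The constants (F/2)·R cancel, so this reduces to a mole-weighted average, which is the “weighting” being argued about; the objection raised later in the thread is that for moist air the relevant factors do not all cancel.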

Tell us all where climate science has started to use ENTHALPY.

Always used it. This calculation is essentially enthalpy (or proportional to that by a constant factor). This is a good illustration how off you are in these things.

How do you weight an AVERAGE?

Try google “weighted average” 🙂

You’ve never done a single calculation that I remember.

You have memory problems then, old MoeFoe 🙂

Reply to  nyolci
May 14, 2025 9:07 am

Wrong again. If you want to calculate the internal energy of an ideal gas you must multiply the temperature in Kelvin by the universal gas constant R, by half the degrees of freedom, and by the number of moles, NOT MASS.

Thanks for demonstrating your ignorance.

Reply to  Graemethecat
May 14, 2025 9:47 am

number of moles, NOT MASS.

Yeah. But if you use a material-specific constant, you can use mass, too. And for that matter, if the material stays the same, mass is just as good as mole number. What you get is still an extensive quantity, proportional to internal energy.

Reply to  nyolci
May 14, 2025 10:46 am

Good god… Temperature is internal energy per molecule per degrees of freedom. So if you multiply it with degrees of freedom (well known) and mass, you have internal energy, an extensive property.

Not true in the least. Temperature is an indication of the kinetic energy (sensible) portion of the internal energy of a molecule. Temperature IS NOT a proxy for the latent energy (latent heat) that is also part of the internal energy not sensible. The latent heat of water is large and is never examined when temperature is used as a proxy.

Reply to  Jim Gorman
May 14, 2025 1:04 pm

Temperature IS NOT a proxy for the latent energy (latent heat) that is also part of the internal energy not sensible

Good god, not again… https://en.wikipedia.org/wiki/Internal_energy#Internal_energy_of_the_ideal_gas U = CvNT, where U is internal energy, Cv is the molar heat capacity at constant volume (a constant), and N is the mole number. Why the fcuk do you go to a fight without, not simply serious preparation, but any preparation at all?

Reply to  nyolci
May 15, 2025 6:46 am

Unlike water, an ideal gas does not undergo phase changes, by definition.

Reply to  Graemethecat
May 15, 2025 7:03 am

for nyolci there is no water vapor in the atmosphere – thus it can be treated as an ideal gas which has no latent heat.

He’s never heard of the steam tables.

Reply to  Tim Gorman
May 15, 2025 7:10 am

for nyolci there is no water vapor in the atmosphere

Tim, again, comprehension. No one has claimed this (except, of course, for you). The claim was that water vapor was just gas. This claim is true.

Reply to  nyolci
May 15, 2025 7:41 am

 No one has claimed this”

Did you not give this site as a reference?

https://en.wikipedia.org/wiki/Internal_energy#Internal_energy_of_the_ideal_gas

Do you see the last few words? “Internal_energy_of_the_ideal_gas”

If you aren’t claiming the atmosphere is an ideal gas then why did you provide a reference to an article on an ideal gas?

Water vapor is *NOT* an ideal gas. Did you think you could fool us by claiming you said water vapor was just gas?

Reply to  Tim Gorman
May 15, 2025 8:46 am

Do you see the last few words? “Internal_energy_of_the_ideal_gas”

Oh, so this is the new subject of your masturbation 🙂 Ideal gas is a very good model in practice and it simplifies a lot of stuff here in these debates. But you shouldn’t think that its usage in debates like this invalidates the argument. Of course, in science they use the approximation that is appropriate.

Reply to  nyolci
May 15, 2025 10:38 am

“Ideal gas is a very good model in practice and it simplifies a lot of stuff here in these debates.”

The atmosphere is *NOT* an ideal gas. So an ideal gas is *NOT* a good model in practice applicable to the atmosphere which is the main topic of discussion in these debates.

It’s this kind of garbage that makes climate science into garbage science.

“Of course, in science they use the approximation that is appropriate.”

Trying to approximate the atmosphere as an ideal gas with no latent heat is *not* appropriate.

Sparta Nova 4
Reply to  nyolci
May 15, 2025 11:51 am

Wiki. bwahahaha

Reply to  nyolci
May 14, 2025 11:02 am

Always used it. This calculation is essentially enthalpy (or proportional to that by a constant factor). This is a good illustration how off you are in these things

So the fabricated temperatures are really a heat index temperature calculated using humidity?

Most recorded temperatures do not factor humidity into their temperature records, therefore enthalpy is never accounted for.

Reply to  Jim Gorman
May 14, 2025 1:11 pm

Most recorded temperatures do not factor humidity into their temperature records, therefore enthalpy is never accounted for.

I think we should frame this and put it at the top of this site, to show the ignorance and self-confidence of the denierfolk here. https://en.wikipedia.org/wiki/Enthalpy Enthalpy is U+PV, where U is internal energy, and that, at least for ideal gases (a very good approximation), is CvNT. PV is NRT. So we have Enthalpy = N*T*constant. This is extensive, and it stays extensive whatever constant we use. The constant cancels out anyway during averaging.
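
The identity nyolci is invoking can be checked numerically, under the ideal-gas assumption (the values of n, T and the diatomic Cv are illustrative, chosen only for the check):

```python
R = 8.314          # universal gas constant, J/(mol*K)
Cv = 2.5 * R       # molar heat capacity at constant volume, diatomic gas
n, T = 1.5, 300.0  # moles and kelvin, arbitrary illustrative values

U = Cv * n * T     # internal energy of an ideal gas
PV = n * R * T     # ideal-gas law
H = U + PV         # enthalpy

# H collapses to n * T * (Cv + R): extensive, and proportional to n*T
print(H, n * (Cv + R) * T)
```

Whether this idealisation is adequate for moist air is exactly what the rest of the thread disputes.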

Reply to  nyolci
May 15, 2025 6:58 am

Most recorded temperatures do not factor humidity into their temperature records, therefore enthalpy is never accounted for.

You didn’t refute my point at all.

Recorded temperatures do not factor humidity into their records.

Find me a database that has Tmax, Tmin, or Tavg shown as calculated by using enthalpy. Better yet find a global land anomaly ΔT that includes enthalpy.

Reply to  Jim Gorman
May 15, 2025 7:13 am

Find me a database that has Tmax, Tmin, or Tavg shown as calculated by using enthalpy.

Tavg is always calculated using enthalpy like quantities, as I have shown you at least 4 times already. Try to comprehend it.

Reply to  nyolci
May 15, 2025 7:20 am

I asked for a temperature database that has recorded temperatures using enthalpy.

You can’t find one can you?

That means none of the global anomaly ΔT temperatures use it either.

Reply to  Jim Gorman
May 15, 2025 8:48 am

I asked for a temperature database that has recorded temperatures using enthalpy.

Its usage is implicit.

Reply to  nyolci
May 15, 2025 11:01 am

“Its usage is implicit.”

More garbage. If it were “implicit”, i.e. understood but not stated, then it would be obvious that you can’t average temperature because it can’t be determined directly from just the enthalpy value.

Reply to  Tim Gorman
May 16, 2025 4:18 am

If it were “implicit”, i.e. understood but not stated, then it would be obvious that you can’t average temperature because it can’t be determined directly from just the enthalpy value.

Again, this is so bad, it’s good. I’m inclined just to leave it here as it is. Just a hint: here “implicit” doesn’t mean what you think it means. “Implicit” means that when you weight the average, that is essentially the conversion to some extensive quantity (but already mathematically simplified, not spelled out full). I know you won’t be able to understand it but it has to be stated anyway.

Reply to  nyolci
May 16, 2025 5:37 am

“Just a hint: here “implicit” doesn’t mean what you think it means. “Implicit” means that when you weight the average, that is essentially the conversion to some extensive quantity (but already mathematically simplified, not spelled out full). I know you won’t be able to understand it but it has to be stated anyway.”

More BS.

Again, you can’t directly determine temperature from enthalpy because other variable factors exist, especially humidity. A problem you continually fail to acknowledge or provide an answer for. No amount of “weighting the average” can account for not knowing the other factors.

One more time:

latent heat is m[ (cpw * t) + hwe]

“m” (i.e. mass) is an integral part of calculating enthalpy. If you don’t know the mass then you can’t calculate “t” and no amount of “weighting” can fix that problem. And you can’t know “m” unless you know the humidity.
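
Tim’s point can be illustrated with the standard psychrometric approximation for moist-air enthalpy (the coefficients are the usual textbook values; the two sample states are invented for illustration): two air samples can have nearly the same enthalpy at quite different temperatures, so enthalpy alone does not determine temperature without the humidity.

```python
def moist_air_enthalpy(t_c, w):
    """Approximate specific enthalpy of moist air, kJ per kg of dry air.
    t_c: dry-bulb temperature in Celsius; w: humidity ratio
    (kg vapor per kg dry air). Textbook psychrometric approximation."""
    return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

h_warm_dry = moist_air_enthalpy(30.0, 0.005)    # warm, fairly dry air
h_cool_humid = moist_air_enthalpy(17.5, 0.010)  # cooler, more humid air
print(h_warm_dry, h_cool_humid)  # nearly equal enthalpies, 12.5 C apart
```

The 2501 term is the latent heat of vaporisation of water at 0 C in kJ/kg, which is the “m × latent heat” contribution being argued about: change w and the same enthalpy maps to a different temperature.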

Reply to  nyolci
May 15, 2025 7:21 am

All you’ve shown is how to calculate the enthalpy of an ideal gas! The atmosphere is *NOT* an ideal gas. An ideal gas, by definition, has no latent heat involved.

As usual, the only thing you’ve shown us is how little you understand the real world.

Reply to  Tim Gorman
May 15, 2025 8:50 am

An ideal gas, by definition, has no latent heat involved.

Yeah. Latent heat comes with phase transitions. And during averaging, there’s no mixing, there are no phase transitions.

Reply to  nyolci
May 15, 2025 11:18 am

latent heat is m[ (cpw * t) + hwe]

Latent heat *does* come from the fact that water can have phase transitions. The *amount* of latent heat, however, is directly dependent on the mass of water vapor involved. The only way to average latent heat is to *mix* the water vapor so you get a total mass. Otherwise the “average” is physically meaningless.

It’s like trying to find the “average” height of a herd of mixed Shetland ponies and quarter horses. Yes, you can calculate an average. And yes, the average is physically meaningless, since you won’t find a single member of the herd that is the average height. It’s a multi-modal distribution.

So is latent heat if you don’t mix the components.

It’s all part of the climate science meme of “numbers is just numbers”. You can therefore average anything and whether it means anything physically is irrelevant! It’s a perversion of the use of statistical DESCRIPTORS. An average is a statistical descriptor, it is *NOT* a measurement. The average, variance, skewness, kurtosis, quartiles, etc are all DESCRIPTORS of a distribution but they are *NOT* the distribution themselves. They are not part of the data set. It’s like saying you have brown hair. The color brown is not the hair itself, it is a DESCRIPTOR of the hair. The map is not the territory.

Reply to  Tim Gorman
May 16, 2025 4:21 am

The *amount* of latent heat, however, is directly dependent on the mass of water vapor involved.

Wonderful ignorance in basic thermodynamic things. Latent heat is not a property of a state. It’s a property of a reaction (state change). It is irrelevant for averaging, we use “snapshots” there, there’s no state change.

Reply to  nyolci
May 16, 2025 6:50 am

“Wonderful ignorance in basic thermodynamic things. Latent heat is not a property of a state. It’s a property of a reaction (state change). It is irrelevant for averaging, we use “snapshots” there, there’s no state change.”

Of course latent heat is a property of a state. Otherwise the equation for latent heat (m[ (cpw * t) + hwe]) makes no sense at all!

Can you show where m[ (cpw * t) + hwe] is somehow incorrect for calculating latent heat?

Latent heat exists in the atmosphere. That it comes from the evaporation of liquid water doesn’t make it not a part of the state of the atmosphere. The amount of latent heat in a volume of air depends on the mass of water vapor in that volume of air and thus it is not a constant value throughout the atmosphere since the humidity of the atmosphere is not a constant.

Reply to  Tim Gorman
May 17, 2025 4:50 am

Latent heat exists in the atmosphere. That it comes from the evaporation of liquid

Evaporation is a reaction, a state change; temperature is a quantity of a state, a snapshot in time. Latent heat is associated with a reaction, a change in state, not with a state itself. It is incredible that this is the fourth round (or so) and you are unable to understand this. Water vapor and aerosols (liquid water) etc. themselves don’t have latent heat per se. When they change phase, latent heat is emitted or absorbed. Water etc. change the number of degrees of freedom, and this is the only relevant factor for temperature calculations (like averaging). But, for that matter, this change is not that great, so even ignoring it gives us a very good approximation. But scientists take this into account in more detailed studies.

Reply to  nyolci
May 17, 2025 5:18 am

Water vapor and aerosols (liquid water) etc. themselves don’t have latent heat per se.

Don’t have latent heat per se. Exactly where did the term latent heat originate, dummy?

The TOTAL internal energy of a molecule is a sum of kinetic energy (sensible temperature) and potential energy (latent heat).

I don’t know what you think water vapor emits when it changes back to a liquid. It is the latent energy that it carried with it.

If you can find a resource that describes what you are asserting, POST IT HERE, I want to see it.

Reply to  nyolci
May 17, 2025 5:28 am

“Evaporation is a reaction”

The issue isn’t the constant. The issue is the constant times the mass involved in order to get the total amount of heat generated!

The AMOUNT of heat generated *IS* a property of a volume of air.

You are doing nothing but arguing a red herring in order to avoid having to admit that latent heat exists. All so you can assume the atmosphere is nothing but dry air!

You are in a hole. Stop digging!

Reply to  nyolci
May 15, 2025 7:43 am

“enthalpy like quantities”

ROFL!!



Reply to  Jim Gorman
May 15, 2025 8:47 am

So the fabricated temperatures are really a heat index temperature calculated using humidity?

Okay, so we can agree, that temperatures can be averaged with proper weighting, right? Your new problem is humidity, apparently, right? Bad news, for averaging, humidity doesn’t really change anything.

Reply to  nyolci
May 15, 2025 10:57 am

“Okay, so we can agree, that temperatures can be averaged with proper weighting, right?”

NO! Temperatures can’t be averaged. That is more climate science garbage. Temperature is not an extensive value. You can’t just assume that the humidity, wind, pressure, etc. are the same from moment to moment, let alone from location to location.

There is no way to *weight* temperature to make it extensive.

Heat transfer in the atmosphere can change temperature, pressure, or volume – or any combination of the three. There is no way to *weight* temperature to account for this.

“Your new problem is humidity, apparently, right? Bad news, for averaging, humidity doesn’t really change anything.”

Humidity is a ratio so is an intensive property. But the *mass* of water vapor, part of the latent heat equation, can be determined directly from the specific humidity and mass *is* an extensive property. So you can average the mass of water vapor. You can *NOT* determine temperature directly from enthalpy, temperature is not a ratio per unit anything associated with the atmosphere.

Reply to  nyolci
May 14, 2025 1:11 pm

“No, this is substantially complicated. They used the method described in Parry and Hollis (2005).”

I’m not going to pay for the paper. I will note that the abstract makes *NO* mention of humidity as a variable in any of the interpolation methodology. Humidity is a primary factor in determining temperature of moist air.

The abstract also says: “are incorporated either through normalization with regard to the 1961–90 average climate, or as independent variables in the regression.”

Just how was the 1961-1990 average determined? By “climate” are they speaking of temperature? Where would the pressure, humidity, upper air wind velocity, etc come from in 1961? These values would be needed to make the factors into “independent variables” in the regression *or* they would be just guesses – i.e. primary elements of uncertainty in the results.

“When you interpolate from “surrounding data points”, you “substitute data points from other data sets” as per definition, you genius”

Did you read this before you posted it? If I know the temperature at location L1 was 10deg at time T0 and was 11deg at time T0 + 1sec, I can “interpolate” the temperature at T0 + 0.5sec from the “surrounding” data in the data set. I can *NOT* take the temperature at L2 at T0 and T1 and use it to interpolate the data value at L1 at T0 + 0.5sec. L2 data is *NOT* surrounding data, it is independent data.
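
Tim’s worked example, sketched (the values are his illustrative 10 deg and 11 deg readings): interpolating between surrounding points of the same series is ordinary linear interpolation.

```python
# One station's record: temperature 10 deg at t=0 s and 11 deg at t=1 s
t0, t1 = 0.0, 1.0
temp0, temp1 = 10.0, 11.0

# Linear interpolation at t = 0.5 s from surrounding points of the SAME series
t_mid = 0.5
temp_mid = temp0 + (t_mid - t0) / (t1 - t0) * (temp1 - temp0)
print(temp_mid)  # 10.5
```

A reading from a different location is not a “surrounding point” of this series, which is the distinction being drawn.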

“No one was talking about drift, you fokkin genius.”

Total uncertainty = systematic uncertainty + random uncertainty. The problem is that if you can’t isolate the random uncertainty element then you can’t tell what the systematic uncertainty is. If you can’t isolate the systematic uncertainty (e.g. calibration drift) then you can’t tell what the random uncertainty is.

Both John Taylor and Phillip Bevington, in their tomes on measurement and uncertainty, state that systematic uncertainty can *NOT* be analyzed using statistical methodology. Therefore the random uncertainty can’t be isolated and cancelled either – BECAUSE YOU DON’T KNOW WHAT IT IS!

This is *further* complicated by the fact that even random data can be significantly skewed. Significantly skewed random effects can have asymmetric uncertainties that do not cancel, e.g. -1,+3.

You are trying to defend the climate science meme that all measurement uncertainty is random (it isn’t), Gaussian (it isn’t), and cancels (it doesn’t).

non systematic parts are (universally modeled as) a random variable”

See what I mean? The implicit but unstated assumption you are making here is that random data is always Gaussian and cancels. You *must* justify that assumption before using it – and climate science (including you) never bother to justify the assumption!

Two consecutive measurements of the same quantity will differ due to this.”

And they can vary asymmetrically. Assuming they will cancel should be justified explicitly but never is when it comes to climate science.
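The point about skewed errors can be illustrated with a small simulation. The shifted-exponential error distribution and all the numbers here are illustrative assumptions, not anything from the thread: averaging shrinks the spread of the mean, but a skewed error distribution whose mean is not zero leaves an offset that no amount of averaging removes.

```python
import random

random.seed(42)

# Skewed measurement errors: exponential with mean 1.0, shifted so the
# most likely error is negative but the mean error is +0.5 -- asymmetric,
# like the -1/+3 example above.
def skewed_error():
    return random.expovariate(1.0) - 0.5   # mean +0.5, skewed right

# Average many readings of a true value of 20.0:
n = 100_000
avg = sum(20.0 + skewed_error() for _ in range(n)) / n

# The spread of the average shrinks with n, but the +0.5 offset
# (the part that does not cancel) remains in the average.
print(avg)
```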



Reply to  Tim Gorman
May 14, 2025 3:37 pm

I’m not going to pay for the paper. I will note that the abstract makes *NO* mention of humidity as a variable in any of the interpolation methodology. Humidity is a primary factor in determining temperature of moist air.”

Tim, here is the paper:

http://www.rengy.org/uploadfile/file/中文版/资源/文献/2004/Development%20of%20a%20new%20set%20of%20long-term%20climate%20averages%20for%20the%20UK%20%20.pdf

Reply to  crocodile
May 15, 2025 7:18 am

Thank you!

From the paper:

“It is recognized that there are some notable omissions, such as wind speed, wind direction, humidity, visibility, solar radiation, snow depth and days of snow falling. It is expected that some or all of these
variables will be addressed in future projects.”

How in Pete’s name do they leave out humidity which is a direct factor in enthalpy and therefore in temperature? How do they leave out solar radiation which is also a direct factor in enthalpy and therefore in temperature?

Much of the difference between coastal and inland stations as far as temperature is concerned are based on humidity, wind speed, and wind direction. How can they quantify any difference if they don’t consider these factors?

“The correlation coefficient is adjusted based on the length of overlapping record between the stations. Linear regression is used to generate the estimates, and a number of neighbours are used to reduce
random errors.”

How do they reduce systematic errors such as calibration drift? All this methodology does is spread around measurement uncertainty from one station to the next. Nor are they actually reducing “random error”. They are just increasing the number of highly correlated data points to make things look better!
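The neighbour-regression infilling described in the quoted passage can be sketched as follows. The station series, noise level, and gap positions are all invented for illustration; this only shows the mechanics of fitting on the overlap and predicting the gaps, not the Met Office's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly means: a target station with gaps, and a
# well-correlated neighbour with a complete record.
neighbour = 10.0 + 5.0 * np.sin(np.linspace(0.0, 6.0, 24))
target = 0.9 * neighbour + 1.5 + rng.normal(0.0, 0.2, 24)
target[5:9] = np.nan                     # four missing months

# Fit a linear regression on the overlapping (non-missing) months...
ok = ~np.isnan(target)
slope, intercept = np.polyfit(neighbour[ok], target[ok], 1)

# ...and fill the gaps with the regression estimates.
filled = target.copy()
filled[~ok] = slope * neighbour[~ok] + intercept
```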

Reply to  Tim Gorman
May 15, 2025 8:54 am

How in Pete’s name do they leave out humidity which is a direct factor in enthalpy and therefore in temperature?

humidity is not a factor in enthalpy (except for the slightly different constant). Humidity is a factor in a reaction (like mixing), and only if there are phase transitions. Temperature measurements can be regarded as point-like w/r/t time.

How do they leave out solar radiation which is also a direct factor in enthalpy

Jesus Christ…

Reply to  nyolci
May 15, 2025 9:08 am

humidity is not a factor in enthalpy

You are full of crap.

What is the specific heat of water? How much energy can water vapor in a cubic meter of air at 50% relative humidity absorb as compared to CO2?

As an “expert” you should have these right at hand.

Reply to  Jim Gorman
May 16, 2025 4:25 am

How much energy can water vapor in a cubic meter of air at 50% relative humidity absorb as compared to CO2?

Doesn’t matter. That’s for a state change or reaction. We are talking about timewise pointlike things (temperatures at certain times). Water vapor, etc. only affects the constants (essentially the nr of degrees of freedom), and even that is not a great change, but anyway it can be handled.

Reply to  nyolci
May 16, 2025 7:06 am

We are talking about timewise pointlike things (temperatures at certain times).

Timewise! Really? The last I knew, time-dependent variables require a gradient function to describe them.

All you are describing is a value at an infinitesimally small point in time. Have you ever heard of integration?

Reply to  nyolci
May 16, 2025 7:07 am

Again, latent heat

m[ (cpw * t) + hwe]

You keep ignoring the “m” piece of that equation!

” We are talking about timewise pointlike things (temperatures at certain times). Water vapor, etc. only affects the constants (essentially the nr of degrees of freedom), and even that is not a great change, but anyway it can be handled.”

Can you read that equation for latent heat at all?

The amount of water vapor in a volume of air is a VARIABLE, it is not a constant. The amount of water vapor in a volume of air, a variable in both space and time, determines how much latent heat exists in that volume of air at any point in time and any location in space.

You can’t just assume the atmosphere is dry air!

Reply to  Tim Gorman
May 16, 2025 7:21 am

But climate science has to make the assumption that it is all dry air. The movement of heat from one location to another by latent heat is anathema to the use of temperature alone.

Reply to  nyolci
May 16, 2025 7:16 am

That’s for a state change or reaction.

Read this page.

If a body both absorbs and emits heat, then it must also store the energy associated with that heat. That potential energy is important in the transfer of “heat”.

Reply to  nyolci
May 15, 2025 11:23 am

humidity is not a factor in enthalpy”

Don’t you ever get tired of making idiotic assertions like this?

enthalpy of air = enthalpy of dry air + enthalpy of water vapor

The enthalpy of water vapor is directly dependent on the mass of water vapor which is, in turn, directly related to the humidity ratio!

The humidity DETERMINES HOW MUCH WATER VAPOR YOU HAVE!

That means it *IS* a factor in enthalpy.

Temperature measurements can be regarded as point-like w/r/t time.”

You are your own worst enemy. That means that at each point in time you are measuring A DIFFERENT THING. How do you average different things and get a physically meaningful value? Does the average height of a herd of mixed Shetland ponies and quarter horses have a physical reality?

Reply to  Tim Gorman
May 16, 2025 4:48 am

enthalpy of air = enthalpy of dry air + enthalpy of water vapor

Yeah. And the enthalpy of water vapor has a different constant. So there’s a slight modification for the combined constant. That’s all. There are no state changes here, no actual mixing.

That means that at each point in time you are measuring A DIFFERENT THING

Please first understand the meaning of this. It means there are no reactions.

Reply to  nyolci
May 16, 2025 7:09 am

And the enthalpy of water vapor has a different constant.”

No, the vaporization energy of water is a constant. But the enthalpy of a volume of air is dependent on the AMOUNT OF WATER VAPOR in that volume of air. The vaporization energy and the amount of water vapor are two different things entirely!

Reply to  Tim Gorman
May 14, 2025 11:22 pm

A lot of bollocks 🙂

The problem is that if you can’t isolate the random uncertainty element then you can’t tell what the systematic uncertainty is

Oh, the next step 🙂 First you cry out loud that the uncertainty is ignored. When we point out that it’s not true, you come up with some bs that even the uncertainty (or at least the random element of it) cannot be known. Can you please confirm that we can measure things at all? 🙂

Significantly skewed random effects can have asymmetric uncertainties that do not cancel, e.g. -1,+3.

Wrong. The only requirement for this is independence. And they don’t “cancel” per se. The uncertainty decreases for the average.

random data is always Gaussian and cancels.

Again NO, and please understand this at last. This is your meme. The one single requirement is, again, independence. And, again, they don’t cancel each other completely. They get reduced.

Reply to  nyolci
May 15, 2025 6:34 am

 First you cry out loud that the uncertainty is ignored. When we point out that it’s not true, you come up with some bs that even the uncertainty (or at least the random element of it) cannot be known.”

It’s not BS. It is exactly what both Taylor and Bevington SPECIFICALLY point out in their tomes on measurement uncertainty.

Taylor: “Error in a scientific measurement means the inevitable uncertainty that attends all measurements.”

Taylor: “The best you can do is to ensure that errors are as small as reasonably possible and to have a reliable estimate of how large they are.”

Taylor: “For now, error is used exclusively in the sense of uncertainty, and the two words are used interchangeably.”

Bevington: “Because, in general, we shall not be able to quote the actual error in a result, we must develop a consistent method for determining and quoting the estimated error.”

Can you please confirm that we can measure things at all?”

You have *NEVER* understood the concept of measurement uncertainty even after multiple attempts by people to explain it to you. You remain willfully ignorant on the subject, the worst kind of ignorance there is.

“Wrong. The only requirement for this is independence. And they don’t “cancel” per se. The uncertainty decreases for the average.”

Independence does *NOT* guarantee a Gaussian distribution. And uncertainty always adds when you are measuring different things with different instruments under different conditions, e.g. temperatures. The uncertainty never decreases, it ALWAYS ADDS. Thus when you are “averaging” temperatures their uncertainties ADD. Dividing by the number of data points only creates an “average uncertainty”. It is not the uncertainty of the average.

The “average uncertainty” is meaningless in the real world. It is just like saying the average length of two 2″x4″ boards, one 8′ long and the other 4′ long, is 6′. If you are trying to span a 6′ gap based on the average length being 6′ then one board will be way too short! You’ll wind up in the same position by using the average measurement uncertainty.

In fact, it can be DANGEROUS if it involves public safety! If you use the average shear strength of I-beams to design a bridge span it means that some of the I-beams will have a shear strength LESS than the average, thus creating a dangerous situation.

You don’t even understand the difference between repeatability and reproducibility in measurements which apply to measuring the same quantity let alone when you are measuring different quantities.

This is your meme.”

No, this is YOUR meme. Yours and climate science. It allows you to totally ignore the measurement uncertainty associated with even the daily mid-range temperature let alone the global average temperature.

You won’t even admit that the measurement uncertainty of a daily mid-range temperature for a typical measurement station (+/- 0.3C) is between 0.4C and 0.6C. That totally blows out of the water the ability to define an anomaly in the hundredths digit! And that’s just for one day at one location!

Reply to  Tim Gorman
May 15, 2025 7:42 am

Independence does *NOT* guarantee a Gaussian distribution

Jesus fokkin christ… Again, for the square root law (and the bit more complicated general form) the only requirement is independence.
For that matter, you’ve pulled a big one in this post 🙂 Such a heap of bs…

Dividing by the number of data points only creates an “average uncertainty”. It is not the uncertainty of the average. The “average uncertainty” is meaningless in the real world.

While this is just a glimpse into your confused and convoluted thinking, I have a question. There’s an instrument, your husband has written about it. Now its readings are actually the average of 6 “raw” readings taken at 10 sec intervals. Tell me please what the measurement uncertainty of this instrument is for the normal reading. You can assume literally anything for the raw readings (that you don’t see in practice). There must be something for the average, otherwise we can’t say anything about the measurement uncertainty of this instrument.

Reply to  nyolci
May 15, 2025 8:24 am

There’s an instrument, your husband has written about it. Now its readings are actually the average of 6 “raw” readings taken at 10 sec intervals. “

If this instrument is measuring temperatures at 6 different times then it is measuring six different things. You are using the same instrument but you are not measuring the same quantity.

You can assume literally anything for the raw readings (that you don’t see in practice). There must be something for the average, otherwise we can’t say anything about the measurement uncertainty of this instrument.”

You STILL don’t understand measurement uncertainty. You just remain willfully ignorant!

First off you have to state the measurements properly.

The measurement at time t0 would be T0 +/- measurement uncertainty.
The measurement at time t1 would be T1 +/- measurement uncertainty
….
The measurement at time t5 would be T5 +/- measurement uncertainty.

T0 to T5 are all different quantities.

Let’s average just the first two measurements and assume that u0 = u1 = u.

The average of the stated values becomes (T0 + T1)/2. But the range of possible values for the SUM becomes (T0 – u + T1 – u) = (T0 + T1 – 2u) to (T0 + u + T1 + u) = (T0 + T1 + 2u). So the uncertainty of the sum has grown from +/- u to +/- 2u.

So your measurement value for the average should be given as

(T0 + T1)/2 +/- u

The worst-case measurement uncertainty of the average is still the full per-measurement uncertainty u. Averaging has NOT reduced it. The *average uncertainty* is likewise 2u/2 = u. Thus the average uncertainty is *NOT* the uncertainty of the average, and neither one ever drops below u.

Every time you add another data point the measurement uncertainty of the sum will go up while the worst-case uncertainty of the average never falls below the per-measurement uncertainty.

You CAN NOT REDUCE the measurement uncertainty in a data set by averaging. It is truly just that simple. The only time it reduces to zero is if you have a truly random and Gaussian distribution for the measurement uncertainty. If you think there is a partial cancellation then the accepted method of propagating the measurement uncertainty is using root-sum-square. But even root-sum-square addition GROWS the measurement uncertainty!
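For concreteness, the two propagation rules being argued over in this thread, worst-case interval addition versus root-sum-square, work out as follows for a two-reading average. The numeric values are assumed for illustration; this is arithmetic, not an endorsement of either side:

```python
import math

u = 0.5            # per-reading uncertainty (assumed value)
T0, T1 = 10.2, 11.4

mean = (T0 + T1) / 2

# Worst-case (interval) propagation: the uncertainties of the SUM add
# linearly to 2u; dividing the sum by 2 divides its half-width too.
u_sum = 2 * u                 # sum: +/- 2u
u_mean_worst = u_sum / 2      # mean: +/- u (no reduction below u)

# Root-sum-square propagation (valid only if the errors are
# independent): the mean's uncertainty shrinks by 1/sqrt(2).
u_mean_rss = math.sqrt(u**2 + u**2) / 2   # = u / sqrt(2)
```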

Reply to  Tim Gorman
May 15, 2025 9:00 am

Huh, this is perfect. It’s so bad, it’s perfect. But this is a true gem even here:

You CAN NOT REDUCE the measurement uncertainty in a data set by averaging. It is truly just that simple. The only time it reduces to zero is if you have a truly random and Gaussian distribution for the measurement uncertainty. If you think there is a partial cancellation then the accepted method of propagating the measurement uncertainty is using root-sum-square.

So you cannot reduce but you can make it null 🙂 But then you can “partially cancel” it with the square root law. Tim, you’re a genius, you can hold three contradictory things in your head without blinking.

Reply to  nyolci
May 15, 2025 11:27 am

So you cannot reduce but you can make it null”

Your reading comprehension skills are atrocious.

How do you make measurements of different things truly random and Gaussian?

I gave you the quotes from Taylor and Bevington pointing out that *no* measurements are truly random and Gaussian. Therefore you can never get a null value for measurement uncertainty. You can make it small but you can never make it zero.



Reply to  nyolci
May 15, 2025 12:35 pm

But then you can “partially cancel” it with the square root law.

More bull crap.

Read the attached carefully. It comes from Dr. Taylor’s book, An Introduction to Error Analysis, 2nd Edition.

[attached image: excerpt from Dr. Taylor’s book]

Note the following requirements for using the divide by √N rule when dealing with measurements.

Dr. Taylor says “we imagine performing a sequence of experiments in each of which make N measurements and compute the averages. We now want to find the distribution of these many determinations of the average of N measurements”

What Dr.Taylor is describing is many samples of the same thing and each sample will have N measurements. He also assumes x1, …, xn are all measurements of the same quantity x, so their widths are all the same and equal sigma x. In other words, the samples are IID.

Temperature averaging meets none of these requirements. You and others like bdgwx and Bellman really don’t understand metrology because you just cherry pick and try to rationalize using sampling theory and traditional statistics. Measurement uncertainty does not use sampling theory to sample a population. Measurement uncertainty uses probability distributions to provide an internationally accepted method of DESCRIBING how certain measurements are.

In statistics one can assume that the data is 100% accurate and therefore no special treatment is required in the calculations. The means of the multiple samples are also 100% accurate. Any error results only from sampling error. Therefore sigma/√N gives an accurate view of the estimated mean’s reliability. This is what Dr. Taylor also assumes in his derivation of the “square root rule” as you call it.

NIST TN 1900 creates assumptions that do the same thing. You’ll notice bdgwx never answered my questions about the assumptions used in this EXAMPLE from NIST.

NIST says a better model is one that allows for temporal correlations to be incorporated. They assume calibration uncertainty was negligible (resolution). And, they assumed no other significant instances of uncertainty occur. In other words, the readings are 100% accurate. Note, they say the error terms are assumed to be independent random variables (Dr. Taylor’s multiple experiments) with the same Gaussian distribution with mean 0 (ZERO) and the same standard deviation. Read Dr. Taylor’s derivation again to see the relevance.

The reliability of the mean calculation does not inform one of the measurement uncertainty as defined in the GUM. Measurement uncertainty is the dispersion of the observed measurements attributed to the measurand. There is no way that the decreasing interval of the standard deviation of the mean can be used to erase this dispersion or to make it smaller. The GUM, NIST, ISO, etc., all require that the standard uncertainty of the mean also declare the degrees of freedom used to calculate it. That is so the dispersion of the observations can be calculated by “SDOM × √n”. This is seldom disclosed in climate science.
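The “SDOM × √n” relation mentioned above is simple to check numerically; the observations below are invented purely for illustration:

```python
import math
import statistics

# Hypothetical repeated observations of one quantity:
obs = [20.1, 19.8, 20.4, 20.0, 19.9, 20.3]

s = statistics.stdev(obs)            # dispersion of the observations
sdom = s / math.sqrt(len(obs))       # standard deviation of the mean

# Multiplying the SDOM back by sqrt(n) recovers the dispersion:
recovered = sdom * math.sqrt(len(obs))
```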

Reply to  Jim Gorman
May 16, 2025 4:58 am

Temperature averaging meets none of these requirements

Long story short, you don’t understand much from this. All the above are just assumptions for the current example. These are not preconditions. Furthermore, you very often confuse similar (or similarly sounding) things like independence of measurement or temporal independence (of a variable with itself).

rationalize using sampling theory and traditional statistics.

While “traditional statistics” is just an application of probability theory, I have never “used” statistics here, to an extent that you accused me of being too mathematical. Furthermore, in the past you specifically rejected probabilistic description as non-applicable. Especially w/r/t uncertainty (which is just a random variable, NB. this is term from Probability Theory).

Reply to  nyolci
May 16, 2025 6:54 am

Long story short, you don’t understand much from this. All the above are just assumptions for the current example. These are not preconditions.

THESE ARE PRECONDITIONS FOR DR. TAYLOR’S DERIVATION TO HOLD! Any mathematician should be able to understand that. However, you obviously don’t.

Show the math that refutes this derivation or a reference that shows a refutation. Otherwise, it remains true and your assertion is worthless.

Go look up measurement uncertainty in NIST’s Statistical Handbook and see what preconditions are required.

I have never “used” statistics here, to an extent that you accused me of being too mathematical.

Right, the “square root rule” is something a non-mathematician would know.

Remember, a random variable doesn’t create a probability distribution. A random variable is a container that holds the results of an experiment. Those results will define the probability distribution.

In essence, to use the “square root rule” in measurement uncertainty, each input value should be an independent random variable. Read NIST TN 1900 Ex. 2 again and find out how they make assumptions that treat each measurement as an IID sample.

Reply to  Jim Gorman
May 16, 2025 7:19 am

Especially w/r/t uncertainty (which is just a random variable,”

Again we see the meme that all measurement uncertainty is random, Gaussian, and cancels.

Random variables can be asymmetric, skewed, multi-modal, and decidedly non-Gaussian. But that ruins the ability to assume it all cancels out!

Reply to  nyolci
May 14, 2025 1:24 pm

 But what you call a meme here is not a meme, averaging multiple measurements gives you a value that is a much more faithful approximation of the true average than the individual measurements w/r/t the true values.”

Bullshite! If I take a data set where all data is inaccurate the average will inherit that inaccuracy. Averaging won’t reduce it in any way, shape, or form. Once again, you are implicitly assuming that the measurements will result in a totally random and Gaussian distribution where you can assume cancellation. Can you EXPLICITLY justify that assumption for temperature measurements where you are measuring different things using different measurement devices? Please show us how you do that!

If you don’t understand this simple thing you’re lost in this debate.”

You simply don’t understand metrology at all. You can’t even tell the difference between measuring the same thing multiple times under the same environment using the same measurement device and making single measurements of different things under different environments using different measurement devices.

Your ignorance on the subject just shines right through in everything you assert!

“No one claimed it would be the same “dimensional description”.”

So you think you can have intensive and extensive values with the same dimensionality describing the same thing?

You aren’t showing *any* understanding of the basic gas laws at all. The pressure exerted by those molecules depends on the volume in which they exist. It’s the simple PV = nRT. These factors do NOT “cancel” out! And when it comes to moist air it’s even more complicated.

You don’t even understand enthalpy of moist air. The enthalpy is the enthalpy of dry air + the enthalpy of the water vapor. And the enthalpy contributed by the water vapor is a function of pressure and volume. What do you think steam tables were developed for?

And calculations? Where is your calculation of the enthalpy of moist air at 10000 feet of altitude?

Reply to  Tim Gorman
May 14, 2025 11:35 pm

These factors do NOT “cancel” out!

I don’t think you get what “cancels out” means. Okay, slowly again. X1 = N1*T1*c where “c” is a constant, N1 is the amount of air in one area, T1 is its temperature, and X1 is an extensive quantity. X2 = N2*T2*c. Their temperature, taken as a whole, is (X1+X2)/(c(N1+N2)). “c” cancels. The result is (N1T1+N2T2)/(N1+N2), a weighted average of the temperatures. The weights can be (and are) approximated very well. And you can see that picking a specific “c” here is not even needed.

The enthalpy is the enthalpy of dry air + the enthalpy of the water vapor.

Good god… in Thermodynamics, dry air and water vapor are just gases with known (and the same) nr of degrees of freedom. As an illustration, https://en.wikipedia.org/wiki/Enthalpy does not mention humidity, mind you. Humidity is kinda special because of the phase transition that it may produce. But for averaging, where there is no actual mixing, it is irrelevant.

Where is your calculation of the enthalpy of moist air at 10000 feet of altitude?

En = U + PV. U = CvNT, PV = RNT, so En = (Cv+R)NT. N is the number of moles (molecules divided by 6.022×10^23). Cv and R are constants. See? This simple it is.
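The cancellation of the constant c claimed in the comment above is easy to verify numerically. The quantities N, T, and c below are arbitrary illustrative values; any choice of c yields the same weighted mean:

```python
# X_i = N_i * T_i * c is extensive; the combined temperature is
# (X1 + X2) / (c * (N1 + N2)), and c drops out of the result.
def combined_temp(N1, T1, N2, T2, c):
    X1, X2 = N1 * T1 * c, N2 * T2 * c
    return (X1 + X2) / (c * (N1 + N2))

# Same weighted mean, (2*10 + 6*14)/8 = 13.0, for any constant c:
a = combined_temp(2.0, 10.0, 6.0, 14.0, c=1.0)
b = combined_temp(2.0, 10.0, 6.0, 14.0, c=717.0)
```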

Reply to  nyolci
May 15, 2025 5:57 am

“(N1T1+N2T2)/(N1+N2)”

The c only cancels if it is a constant! If N1 and N2 are different then c can’t be the same for both. Your “c” is actually Cv, the specific heat for a constant volume. Perhaps you should go read up on specific heat, especially as it relates to the atmosphere where pressure changes? Again, learn how to read the steam tables.

“nr of degrees of freedom”

The degrees of freedom have to do with the rotational and vibration modes of the molecules, not the humidity of a volume of air. The nr of degrees of freedom change as the temperature changes, e.g. the degrees of freedom are different for the troposphere and the stratosphere for CO2. Stop using chatgpt and learn the basics.

“As an illustration, https://en.wikipedia.org/wiki/Enthalpy does not mention humidity”

So what? This just shows that you actually don’t know the subject and are just cherry picking quotes that you think support your assertions.

For moist air h = ha + hw

h = total enthalpy
ha = sensible heat
hw = latent heat

ha = cpa * t
hw = m [ (cpw * t) + hwe ]

cpa = specific heat of air at constant pressure
cpw = specific heat of water vapor at constant pressure
hwe = evaporation heat of water
m = mass of water vapor per kg of dry air, i.e. humidity dependent

expanding:

h = (cpa * t) + m[ (cpw * t) + hwe]

“Humidity is kinda special because of the phase transition that it may produce.”

Humidity is kinda special because it determines the mass of water vapor involved – which determines the latent heat which is a factor in the total enthalpy!


” See? This simple it is.”

Only because you have no idea about finding sensible and latent heat and their contribution to the total enthalpy of moist air.

The enthalpy of the atmosphere is a constantly changing factor even at a single location and volume, let alone globally. It is wildly chaotic and non-linear which, as usual, means the climate models have to parameterize the factor using some kind of a guess at an average value. It’s part of the reason why even the IPCC recognizes climate as a chaotic, non-linear process!
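For reference, the moist-air enthalpy formula quoted earlier in the thread, h = cpa·t + m[(cpw·t) + hwe], can be evaluated directly. The constants below are approximate standard psychrometric textbook values and the inputs are invented, not taken from the thread:

```python
# h = cpa*t + w*(cpw*t + hwe) -- moist-air enthalpy per kg of dry air.
# Approximate standard psychrometric constants (assumed here):
CPA = 1.006    # kJ/(kg K), dry air at constant pressure
CPW = 1.86     # kJ/(kg K), water vapour at constant pressure
HWE = 2501.0   # kJ/kg, evaporation heat of water at 0 C

def moist_air_enthalpy(t_c, w):
    """t_c: temperature in deg C; w: humidity ratio, kg vapour / kg dry air."""
    return CPA * t_c + w * (CPW * t_c + HWE)

# Same temperature, different humidity -> very different enthalpy:
dry = moist_air_enthalpy(25.0, 0.002)
humid = moist_air_enthalpy(25.0, 0.015)
```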

Reply to  Tim Gorman
May 15, 2025 6:52 am

Nice!

Reply to  Tim Gorman
May 15, 2025 8:09 am

The c only cancels if it is a constant! If N1 and N2 are different then c can’t be the same for both. Your “c” is actually Cv

So “c” is a constant ‘cos Cv is a constant, right 🙂 Actually, c is not Cv but it doesn’t even matter here. For our discussion, this is just a constant, and the resultant quantity is extensive. BTW, this is from wiki:

The enthalpy of an ideal gas is independent of its pressure or volume, and depends only on its temperature, which correlates to its thermal energy. Real gases at common temperatures and pressures often closely approximate this behavior, which simplifies practical thermodynamic design and analysis.

Please keep this in mind.

The nr of degrees of freedom change as the temperature changes

This is, characteristically, wrong. Or mostly wrong. Degrees of freedom depend mostly on the phase (or phases) of the matter, and on the matter itself (e.g. helium has fewer, O2 has more in its gas form). This is why humidity may be important for air; water may change phases during reactions. Furthermore, the constants in the CxNT formula may be a bit different for different bodies of air because of the presence of liquid and a bit different materials. But the most important point here is that averaging is not real mixing, so we don’t have to deal with the phase changes that may really complicate the picture. We may have to play around with the constants that are a bit different (while just using one constant is a very good approximation itself).

Stop using chatgpt and learn the basics.

I’ve never used chatgpt or any other ai thing (except for the github copilot).

Reply to  nyolci
May 15, 2025 8:37 am

 Cv is a constant, right”

But the volume of a parcel of air changes as it rises! Therefore Cv, which is based on a constant volume, is not a constant for that parcel of air!

“Degrees of freedom depends mostly on the phase (or phases) of the matter, and on the matter itself “

Word salad, pure and plain!

An ideal gas, which is what you continue to reference HAS NO PHASE CHANGE!

“water may change phases during reactions”

Meaning your continued reference to an ideal gas is simply garbage. And now you are trying to back away from that!

“Furthermore, the constants in the CxNT formula may be a bit different for different bodies of air because of the presence of liquid and a bit different materials.”

What in the hell do you think everyone has been trying to tell you? Moist air has latent heat. An ideal gas does not. But your entire argument has been based on an ideal gas!

” But the most important point here is that averaging is not real mixing, so we don’t have to deal with the phase changes that may really complicate the picture.”

You can’t average an intensive property. Thus all your word salad on the enthalpy of an ideal gas!

Did you not read *anything* I’ve posted? Have you gone to look up the steam tables yet? In essence you are trying to justify finding enthalpy using the ideal gas law – when the atmosphere is *NOT* an ideal gas!

“We may have to play around the constants that are a bit different”

It’s not just the “constants”! It’s the mass of water vapor that determines the latent heat! Again, did you not read anything I posted?

 m[ (cpw * t) + hwe]

What in Pete’s name do you think “m” is? I defined it for you! “m is the mass of water vapor, i.e. humidity dependent”

Reply to  Tim Gorman
May 16, 2025 5:00 am

But the volume of a parcel of air changes as it rises!

When we average temperatures, we use the state at hand, from a point in time, no reactions, no rising air. A temperature applies to a certain point in time. You somehow cannot comprehend this.

Reply to  nyolci
May 16, 2025 5:56 am

, we use the state at hand, from a point in time, no reactions, no rising air.

As is typical in climate science, you don’t bother to write a gradient equation for the changing environment. Averages can tell you anything when you pick the point you want to emphasize. That isn’t how nature works.

Sparta Nova 4
Reply to  nyolci
May 16, 2025 10:07 am

Why are you using Cv? The correct parameter is Cp.

That aside, Cv and Cp are not constant. They change with pressure and temperature.

Sparta Nova 4
Reply to  Tim Gorman
May 16, 2025 12:34 pm

Root Sum Squared (RSS) is what I believe the reference is to.

Reply to  Sparta Nova 4
May 16, 2025 2:52 pm

RSS is used to calculate a combined uncertainty “u꜀” from a number of different input quantities, each with their own uncertainty. That is why an uncertainty budget is useful.
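A minimal sketch of combining an uncertainty budget by RSS, as described in the comment above; the budget entries are placeholders, not a real station’s budget:

```python
import math

# Illustrative uncertainty budget for a temperature reading
# (placeholder values, for demonstration only):
budget = {
    "sensor calibration": 0.20,   # deg C
    "resolution":         0.05,
    "siting/exposure":    0.20,
    "data logger":        0.10,
}

# Combined standard uncertainty u_c by root-sum-square:
u_c = math.sqrt(sum(u**2 for u in budget.values()))
```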

Sparta Nova 4
Reply to  nyolci
May 14, 2025 12:18 pm

Averaging does not increase accuracy of the average.

Sheesh.

Reply to  Tim Gorman
May 14, 2025 6:30 am

Claiming that air temperature averages have tiny “uncertainty” through invoking sigma/root(N), the essence of “all measurement uncertainty is random, Gaussian, and cancels”, is also Fake Data. It is another aspect of the fraud.

Reply to  karlomonte
May 14, 2025 11:12 am

Since the standard deviation of the mean is determined by dividing the standard deviation by the square root of the sample size, you can recover the SD by multiplying by that same quantity.

It is interesting how little variation there is in global temperatures, even anomalies.

Newminster
Reply to  nyolci
May 14, 2025 9:00 am

But with all this verbiage you still haven’t explained how the Met Office can provide figures for sites that do not exist

Reply to  Newminster
May 14, 2025 9:48 am

Met Office can provide figures for sites that do not exist

see Perry and Hollis (2005). The link to the Sanders idiot’s site is in the article, and the link to the MetO answer is there; you can see the reference.

Reply to  nyolci
May 14, 2025 1:26 pm

You realize this is nothing more than the argumentative fallacy of a False Appeal to Authority? It’s nothing more than name dropping. *SHOW* us how it’s done, preferably using data we all have access to. BE SPECIFIC.

Reply to  Tim Gorman
May 14, 2025 11:07 pm

False Appeal to Authority?

It wasn’t even an “appeal to authority” (even if I take you seriously, which is hard). There was a persistent question about “making up” things (inventing). I just pointed out that these temperatures weren’t “manufactured” arbitrarily; they are the result of a complicated interpolation process described in the paper.

Sparta Nova 4
Reply to  nyolci
May 16, 2025 10:05 am

The only poster using the word arbitrarily is you, the Princess of Flame Wars.

The temperatures were manufactured.
The significance is, there is no means provided to validate the calculations.

What was written in the paper is basic handwaving.

Sparta Nova 4
Reply to  nyolci
May 14, 2025 12:04 pm

Reread what? I reread the article. I read all the links.

State their answer.

Reply to  Sparta Nova 4
May 14, 2025 1:14 pm

State their answer.

Oh, so you didn’t understand that. Good boy. Re-read the paragraph where they reference Perry and Hollis (2005). That’s the key. Read it slowly, look up all the words in the dictionary even if you think you know them. Maybe you don’t.

Reply to  nyolci
May 15, 2025 1:56 am

“In order to provide advice and assistance, the long-term record is based on observations data at the location, where it is available; any data gaps in the monthly data from this station are filled with estimates obtained by regression relationships with a number of well-correlated neighbouring sites using carefully designed scientific techniques”.

Couldn’t be clearer: the Met Office is interpolating temperatures from non-existent stations.

Do the words “data gaps” mean anything to you?

Reply to  Graemethecat
May 15, 2025 2:48 am

Couldn’t be clearer,

Wrong again. You’ve failed to understand this not very complicated thing. Try harder!

Reply to  nyolci
May 15, 2025 3:52 am

Why should I take anything you say seriously? You have been exposed several times as a liar.

Reply to  Graemethecat
May 15, 2025 5:36 am

You have been exposed several times as a liar.

This is what you want to think but I see how butthurt you are.

Sparta Nova 4
Reply to  Graemethecat
May 15, 2025 8:07 am

Never engage in a battle of wits with the unarmed. They never know when they have lost.

Reply to  nyolci
May 15, 2025 5:55 am

Questions from the document you posted.

  1. Isn’t it hard to find a correlation when the station you are using has no data to correlate with?
  2. Why do the correlated stations change? Shouldn’t they be constant?
  3. Why has the important evidence of what stations were used been allowed to be “disappeared”? This is critical evidence for traceability purposes. To stand and point fingers in opposite directions to show it wasn’t their job is specious at best. It points to a terrible nonchalance toward scientific data.

Reply to  Jim Gorman
May 15, 2025 7:43 am

Questions from the document you posted.

Read Perry and Hollis (2005). You will have some answers. Read what they reference. etc.

Why has the important evidence of what stations were used allowed to be “disappeared?

It didn’t disappear. This is an automated calculation. It wasn’t output. The whole purpose of this is just to be informative; it’s not even a “scientific result” (though the calculation has the necessary rigor).

Reply to  nyolci
May 18, 2025 9:42 am

Me thinks you work for the Met Office. There can be no other explanation for your rants.

Rick C
Reply to  Jim Gorman
May 13, 2025 8:48 am

How is this not a major scientific scandal akin to “Climategate”? Temperatures in the MET office dataset from non-existent stations should be expunged completely and any and all publications that included analysis that included these bogus values should be retracted.

The leadership of the agency that perpetrated the fraud should be terminated and prosecuted if possible.

When I managed a major independent laboratory we occasionally found a fault with equipment or procedure after a report had been issued. We had to immediately withdraw the report and either refund the fees or redo the project at our expense. It would have been a firing offense to ignore the problem or try to cover it up.

Reply to  Rick C
May 13, 2025 9:34 am

Rick, exactly on point.

Sparta Nova 4
Reply to  Rick C
May 15, 2025 8:10 am

That goes well beyond your lab.
Calibration.
In my case acceptance and qualification testing.
It is clear the path from data to results cannot be obstructed and must be direct, or validation and verification are impossible.

Mistakes happen. In your lab, same as me in engineering, if we make a mistake, we own it, admit it, and correct.

I see none of that over the past 5 decades of this climate modelling fiasco.

Dave Andrews
Reply to  Jim Gorman
May 13, 2025 8:56 am

The UK Civil Service is very practiced at making the information match the Minister’s desired outcome 🙂

Reply to  Dave Andrews
May 13, 2025 1:26 pm

UK Government Ministers are preprogrammed to accept everything a Civil Servant tells them.
This was very clear in the evidence to the Post Office Limited Horizon Inquiry.

SxyxS
Reply to  Jim Gorman
May 13, 2025 1:32 pm

Nothing mindblowing.
The government and its agencies have manufactured data to justify wars,
conspired at the highest levels with other governments and agencies to bomb another country (Iraq), co-conspired with the media to perpetuate the lie (100% support by MSM),
and killed a whistleblower (Dr Kelly).
They have withheld data about mass child rape (Rotherham, Rothington, Newcastle and a dozen other cities), called truth-tellers racist conspiracy theorists, and eventually marginalized child rape with the euphemism “grooming”.

There are signs all over the place for decades.
It’s just business as usual.
And keep in mind: A parasite with a degree will do anything to keep their job.

Reply to  Jim Gorman
May 13, 2025 2:00 pm

Same MO during Covid

MrGrimNasty
May 13, 2025 6:09 am

It’s quite possible a computer program selects what data to use, in which case it is entirely plausible that they cannot answer the question simply.

Try asking for the process specification or the code instead.

strativarius
Reply to  MrGrimNasty
May 13, 2025 6:24 am

Programmed accordingly….

Reply to  MrGrimNasty
May 13, 2025 1:27 pm

Good luck with that

Ray Sanders
Reply to  MrGrimNasty
May 13, 2025 1:36 pm

Mr GM. You have clearly not improved your perception of reality since you came off NALOPKT

SxyxS
Reply to  MrGrimNasty
May 13, 2025 1:50 pm

lol – no excuse can be too cheap for guys like you?

Expert 1:
“How can we avoid accountability and fool the average GrimNastyZombie?”
Expert 2:
“Let’s outsource the process to a program or AI. That way we can play our fake game forever.
And if things get exposed, it’s AI’s fault and we’re the innocent victims with the best of intentions.”
Expert 1:
“But if we use this excuse for the 100th time GrimNastyZombie might get angry?!”
Expert 2:
“No. We used this strategy already 300 times and GrimNastyZombie is not only perfectly fine with it.
He actually wants such an excuse from us.
No matter what.
He will not only eat and promote the lie.
He will even invent new lies to protect us and his delusion.”

Ray Sanders
Reply to  MrGrimNasty
May 13, 2025 2:17 pm

Nonsense. I understand the process perfectly well which is why I asked for the inputs NOT the outputs. How difficult is that to understand?

Jeff Alberts
Reply to  MrGrimNasty
May 13, 2025 5:41 pm

“Try asking for the process specification or the code instead.”

And you think they would have complied??

Reply to  MrGrimNasty
May 13, 2025 10:51 pm

it is entirely plausible that they cannot answer the question simply

This is essentially what the MetO claimed in its reply. FOI is about information at hand. This particular piece of info had not been recorded; they would have had to do a lot of calculations.

Try asking for the process specification or the code instead.

The MetO actually gave that information in the reply. Perry and Hollis (2005).
Summary: good old Sanders is either an idiot or he maliciously “misunderstands” not too complicated things.

Sparta Nova 4
Reply to  nyolci
May 14, 2025 12:26 pm

Your problem is you accept as credible based on the logic fallacy of authority.
You dismiss the attempts to independently verify and validate that which is published.

Therefore, your point of view is soundly unscientific.

You are in fact a Science Denier.

Reply to  Sparta Nova 4
May 14, 2025 1:21 pm

credible based on the logic fallacy of authority.

But they are the authority, mind you. Should I accept as credible an idiot in a blog?

Sparta Nova 4
Reply to  nyolci
May 15, 2025 8:14 am

They have falsified data and you are accepting the always tell the absolute truth?

In my work, independent verification and validation are vital.

You are clueless. You accept anything you are told that matches your self-deceptions.

Sparta Nova 4
Reply to  Sparta Nova 4
May 15, 2025 12:06 pm

typo: “they always tell”

So I am an idiot in addition to a denier. Cool

Sparta Nova 4
Reply to  nyolci
May 16, 2025 9:58 am

Richard Feynman: “Science is the belief in the ignorance of experts.”

Reply to  nyolci
May 14, 2025 3:30 pm

Perry and Hollis excludes stations based solely on the size of regression residuals. That is the difference between the predicted value and the observed one. But this assumes that the observed value is accurate by default. Most UK stations do not meet WMO Class 1 siting standards as emphasized by WUWT for several years now. 

The paper also relies on some other questionable assumptions. It applies a single set of regression coefficients across the entire UK to generate the 1 km x 1 km grid estimates. This assumes geographic predictors like proximity to the sea have the same effect on temperature everywhere, which clearly ignores regional dynamics. 

Being near the sea may moderate temperatures differently in the southeast compared to the northeast, yet the model treats these influences uniformly.

The accuracy of the gridded maps is evaluated using RMSE. But again, this is calculated using withheld data from the same corrupted stations, just like the regression residuals used for station exclusion. If the underlying observations are systematically corrupted, then the validation exercise simply measures consistency with flawed data, not accuracy relative to the true climate signal.

Also, the validation metric itself is questionable. The authors withhold 6 years of monthly data from 20 stations, but monthly averages smooth over any potential physically meaningful errors.

For example, if the regression underestimates lapse rate cooling by 1C during windy days, and there are only five such days in a month, that error is mostly smoothed in the monthly mean.
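The smoothing effect described in that example is simple arithmetic; a sketch with the same hypothetical numbers (5 windy days, a 1 C error, a 30-day month):

```python
# Hypothetical month: the regression is exact on calm days but
# underestimates lapse-rate cooling by 1 C on 5 windy days.
days_in_month = 30
windy_days = 5
daily_error = 1.0  # degrees C, on the windy days only

# The monthly mean dilutes the error by the fraction of affected days.
monthly_mean_error = windy_days * daily_error / days_in_month
print(round(monthly_mean_error, 3))  # 0.167
```

So a physically meaningful 1 C daily error shows up as under 0.2 C in the monthly mean, which is why monthly RMSE is a weak test of daily-scale accuracy.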

Reply to  crocodile
May 15, 2025 2:50 am

So can we put away the “primary criticism” that the MetO makes up stations and data? Because now you criticize how they interpolate. This latter problem may be a good candidate for a paper. Go try, submit one.

Reply to  nyolci
May 15, 2025 7:08 am

No. The fact remains that it is manufactured information with no traceability whatsoever.

Just because the location WAS a physical station is no excuse for propagating its ghost existence. If its ghostly existence is vital and necessary for the public, then it should have a physical existence.

Truthfully, following this logic any location anywhere could be chosen, even places where no station ever existed.

Reply to  Jim Gorman
May 15, 2025 7:32 am

If you have a sufficient number of samples then dropping one sample should make little difference in the variance of the data and little difference in the standard deviation of the sample means. In other words there simply isn’t any justification for making stuff up!

Climate science tries to justify making stuff up because “we need long records”. No, you do *NOT* need long records. All you need is sufficient samples at any point in time to generate an acceptable standard deviation of the sample means.

I’ve even seen it stated that if you don’t have data for a location and you don’t infill it with something then you are assuming that location is the average value. SO WHAT? When you calculate an average of a number of samples you are basically assuming that average value applies everywhere – i.e. every sample is the same value. Otherwise the average is meaningless and useless!

If your sample size is so small that dropping one value out of the data set significantly changes the average or the standard deviation of the sample means then your sampling is garbage to begin with!
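The point about dropping one sample can be checked numerically; this is a sketch with synthetic data (sample size and values are illustrative only):

```python
import math
import random

def sample_sd(xs):
    """Sample standard deviation (n-1 denominator)."""
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

random.seed(1)
# Synthetic sample standing in for a large set of temperature readings.
data = [random.gauss(10.0, 3.0) for _ in range(1000)]

sd_full = sample_sd(data)
sd_drop = sample_sd(data[1:])  # drop a single sample

# With 1000 samples, removing one barely moves the standard deviation.
print(abs(sd_full - sd_drop) < 0.05)  # True
```

With adequate sample sizes the statistics are robust to a missing value, which is the argument that infilling is unnecessary in the first place.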

Reply to  nyolci
May 15, 2025 7:24 am

Interpolation means you have data points to interpolate between. Substituting data interpolated from a different data set into the one under scrutiny is *not* acceptable. You may as well just pull it out of your backside.

Again, correlation is not causation. Correlation does not imply equal. Unless you are the MetO.

Sparta Nova 4
Reply to  nyolci
May 15, 2025 8:15 am

You miss the point. Those are interrelated.

Reply to  nyolci
May 15, 2025 12:43 pm

“Go try, submit one.”

From what I’ve observed over the years in the climate blogosphere, publishing contrarian papers often comes at a professional cost. Errors are dissected with unusual intensity, arguments are misrepresented, and the researchers themselves are targeted rather than just their work.

It’s no coincidence that most publishing skeptics are either retired, in senior roles, or outside academia entirely. They do it once they have little left to lose.

I prefer to offer criticism anonymously. After all, WUWT is structured as a form of public peer review.

If someone finds my input useful, great. If not, that’s fine too. I’m not willing to ruin my life over a paper that’s unlikely to change minds.

Sparta Nova 4
Reply to  crocodile
May 16, 2025 9:57 am

“Errors are dissected with unusual intensity, arguments are misrepresented, and the researchers themselves are targeted rather than just their work.”

You just described nyolci’s methodology.

Tom Halla
May 13, 2025 6:11 am

POOMA, except they spell it arses?

May 13, 2025 7:13 am

Summary of the above article:

From the UK MET Office: “What, you want factual information . . . sorry, we don’t deal in that.”

May 13, 2025 7:55 am

Anticyclonish

As I was going up a stair
The Met – a temp that wasn’t there
It wasn’t there again today
Oh how I wish it’d go away!

with apologies to William Hughes Mearns – Antigonish.

taxed
May 13, 2025 8:05 am

Any claims about maximum temps from electronic thermometers housed in Stevenson screens during fine weather need to be treated with caution, because during fine weather the warming of the Stevenson screen by the sun can cause the temperature readings to be overstated by 1C to 3C relative to the real shade temperature. As has been made clear to me since I have been recording temperatures with a LIG thermometer in open shade.
Today has been typical of what happens during such weather conditions.
Here in Scunthorpe as of 3.50pm I have recorded a temperature of 18.6C, yet the weather stations on Hatfield Moor and Thorne Moor are recording 21.2C and 19.9C.
While the Met Office weather station at Waddington is recording around 21C and a weather station in Lincoln is showing a temp of 20.7C.
This is what constantly happens whenever there is fine sunny weather.

Doug S
Reply to  taxed
May 13, 2025 8:22 am

Interesting. Over the years the Climate experts have said that using temperature anomalies solves the problem you’re identifying. I haven’t heard anomalies used as an argument lately, perhaps they’ve given up on that?

taxed
Reply to  Doug S
May 13, 2025 9:04 am

By having all of the thermometers housed in Stevenson screens outside in the sunshine, this overstated warming of daytime temperatures is no longer an anomaly; rather, it becomes the benchmark for claimed daily maximum temperatures.
What’s needed is a major rethink on this matter. There needs to be on-site testing comparing the temperature readings of an electronic thermometer in a Stevenson screen alongside an electronic thermometer in open shade. It’s my view this is the only way it will be shown just what an issue this has become. The Stevenson screen may have done the job back in the late 19th century, but that’s no longer the case with modern electronic temperature recording.

Randle Dewees
Reply to  taxed
May 13, 2025 11:16 am

I’d go further on this. The comparison reference should be an electronic thermometer in a Stevenson screen in open shade.

There can be considerable radiance from a sunlit scene into a shaded area.

taxed
Reply to  Randle Dewees
May 13, 2025 1:15 pm

Yes, I would agree with that.
These screens need to be in the shade due to the advancements in temperature recording.

Reply to  Doug S
May 14, 2025 3:34 am

Anomalies don’t fix anything. You use (Tmax+Tmin)/2 to find a daily mid-range temperature. Those daily mid-range temperatures are then averaged to get a monthly “average” temperature. Those monthly “average” temperatures are then used to create a baseline “average” temperature. Then the baseline is subtracted from the current daily mid-range temperatures to get an “anomaly”. If Tmax or Tmin is incorrect at the start then that error propagates through the whole chain of “averaging” and the anomalies wind up being inaccurate.

Climate science is based on the assumption that averaging always increases accuracy. It’s garbage, pure and utter garbage.

Climate science meme 1: All measurement uncertainty is random, Gaussian, and cancels.
Climate science meme 2: Numbers is just numbers, no need for significant digits when measuring physical phenomena.
Climate science meme 3: Averaging can increase resolution of measurements.
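The propagation chain described above can be sketched numerically. This is a toy example with made-up constant temperatures: a systematic +1 C error in Tmax that is present in the current month but absent from the baseline period survives into the anomaly:

```python
# Sketch of the chain: Tmax/Tmin -> daily mid-range -> monthly mean ->
# anomaly against a baseline. All temperature values are illustrative.

def monthly_mean_midrange(tmax, tmin):
    """Monthly mean of daily (Tmax+Tmin)/2 mid-range temperatures."""
    return sum((hi + lo) / 2 for hi, lo in zip(tmax, tmin)) / len(tmax)

days = 30
tmax_true = [20.0] * days
tmin_true = [10.0] * days
baseline = monthly_mean_midrange(tmax_true, tmin_true)   # 15.0

tmax_biased = [t + 1.0 for t in tmax_true]               # +1 C Tmax error
current = monthly_mean_midrange(tmax_biased, tmin_true)  # 15.5

anomaly = current - baseline
print(anomaly)  # 0.5
```

Averaging only reduces random scatter; a systematic error like this passes straight through to the anomaly undiminished.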

Reply to  taxed
May 13, 2025 10:35 am

Are the Stevenson screens made of whitewashed wood, or of more modern materials? Is it possible that there is a difference in the temperatures read depending upon the material used to make the screen?

Sparta Nova 4
Reply to  Retired_Engineer_Jim
May 13, 2025 11:41 am

It used to be whitewash. In the US some years ago they changed to latex. One study estimated as much as a 1.5C error was introduced with that change.

Reply to  Sparta Nova 4
May 13, 2025 4:51 pm

And then add the digital thermistor data sampling rate to the equation.

taxed
Reply to  Retired_Engineer_Jim
May 13, 2025 1:25 pm

Yes, replacing the large wooden screens with smaller screens made of plastic has made the matter worse, due to the smaller screen being warmed more quickly by the sun. But also, over time these plastic screens lose their bright white colour and become more of a creamy white, which causes them to warm up more quickly in sunlight.

Reply to  Retired_Engineer_Jim
May 14, 2025 4:02 am

No matter what the station enclosure is made of aging will introduce measurement uncertainty. Even with modern plastics the level of exposure to direct sun will change the reflectivity of the enclosure material thus changing the accuracy of the measurements. Different stations with different sun exposure will change differently thus causing an asymmetric spread of measurement uncertainty. Plastics simply don’t get *more* reflective with age. Neither does any other material. This makes the climate science meme that all measurement uncertainty is random, Gaussian, and cancels nothing more than garbage. You simply can’t “average” that asymmetric measurement uncertainty out of existence. Especially not at the resolution of tenths or hundredths of a degree Celsius.

Anthony Banton
Reply to  taxed
May 13, 2025 10:57 am

I’ve explained this to you before.
It is the same synoptic situation … an easterly off a cold North Sea. Places to the west of you will be warmer by dint of a longer land track after the air comes in from the Lincs coast.

The Lincoln stations (Waddington, Cranwell) are further south and the air affecting them had a shorter sea track and was quicker to warm once overland.
Conversely, places further north need a longer land track to reach a similar temp.

Try to think of the met setup before jumping to erroneous conclusions.


“This is what constantly happens whenever there is fine sunny weather.”

No it doesn’t !

The air over the UK is rarely homogeneous, and quite often over England too.

Sparta Nova 4
Reply to  Anthony Banton
May 13, 2025 11:40 am

And yet they average temperatures.

MrGrimNasty
Reply to  Anthony Banton
May 13, 2025 12:04 pm

Taxed is beyond reason.
Facts do not penetrate.
He’s single-handedly discovered a major error dontcha know.

taxed
Reply to  MrGrimNasty
May 13, 2025 1:45 pm

The fact that you are getting so rattled by my posts has convinced me that I am onto something here.
So trot on

taxed
Reply to  Anthony Banton
May 13, 2025 1:42 pm

My statement is based on long-term study, not one day’s weather.
Every time there is fine sunny weather my LIG thermometer records lower daytime temps than the local weather stations. It does not matter what the wind direction is or what time of the year it is. There is a constant overstating of the daytime temperature readings.
It’s been constant enough to convince me that housing modern electronic thermometers in Stevenson screens outside in the sunshine is a seriously flawed way of trying to obtain a true recording of the shade temperature.

Reply to  Anthony Banton
May 13, 2025 1:58 pm

“Try to think of the met setup”

You do know that, because of the incompetence of past Met employees, a very large proportion of UK surface sites are totally unfit for the purpose of “climate”, being totally contaminated by urban effects and bad site placement.

So incompetent are these past Met officers that even those sites installed in the last couple of decades are mostly class 4 and 5!

Ray Sanders
Reply to  taxed
May 13, 2025 1:41 pm

Yes, which is why I put in the FOI re artificially aspirated screens. They refused to respond but then said, as additional information, that they did not use them anywhere. Wait for my follow-ups on that. I now have access to comparative data from some non-Met Office artificially aspirated screens sited directly alongside MO ones.

real bob boder
Reply to  taxed
May 13, 2025 5:09 pm

I have pointed out several times that RTDs over time all drift in the same direction. Guess which direction that is. They need to be calibrated regularly.

Reply to  real bob boder
May 14, 2025 4:17 am

Which, in turn, brings up the issue of how the calibration is done outside of a calibration lab.

Sparta Nova 4
Reply to  Tim Gorman
May 14, 2025 12:33 pm

There is a way to approximate the calibration.
Get a freshly calibrated device and take it on site.
Make a series of measurements using the field and the cal units side by side.
That is something that was once done when the change over from mercury to electronic thermometers was happening.

It’s not perfect, but it is better than guessing.

The other choice, obviously, is to remove the sensor, take it to the cal lab, then reinstall it. The downside is measurement loss during cal.

Reply to  Sparta Nova 4
May 14, 2025 1:30 pm

You still don’t know the calibration drift for past data. Measuring the calibration drift today doesn’t mean the same calibration drift existed a year ago. It could have been more or less a year ago, based solely on the external microclimate. You are probably correct that this is the best “possible” way, but it simply doesn’t remove the measurement uncertainty in the data.

Sparta Nova 4
Reply to  Tim Gorman
May 15, 2025 8:21 am

That is true. It only gives you a present time benchmark.
It would need to be done periodically to get an estimate of drift.
You are correct. The measurement uncertainty is not addressed by this simple procedure.

May 13, 2025 8:27 am

So the UK is going Net Zero and bankrupting its economy based on fake data that they know is fake because they faked it. WTF? When can we expect the barricades and guillotines?

Westfieldmike
May 13, 2025 8:44 am

One temperature station is in Kew Gardens London, behind a public toilet on a concrete base surrounded by metal cabinets.

Reply to  Westfieldmike
May 14, 2025 10:08 am

Completely false. I’ve just visited Kew, and it’s nowhere near any toilets; it’s in the middle of a grassy area with no concrete or metal cabinets to be seen.

Reply to  Bellman
May 14, 2025 5:04 pm

Here’s a photo:

Sparta Nova 4
Reply to  Bellman
May 15, 2025 8:24 am

Not intended as a challenge, but did you search everywhere in the vicinity?
I do not know the area. I process by analysis of alternatives. Is it possible there are 2 stations in the area?
On the other hand, I do not trust the media. If a publication stated what MD contends, then it is anecdotal, not definitive, until independent verification is conducted.

Reply to  Sparta Nova 4
May 15, 2025 9:32 am

So telling that you want me to prove a negative, yet don’t ask M. Dack to provide any evidence for his absurd claim.

No. I didn’t search behind every public toilet for hidden weather stations, as I didn’t want to get arrested. I think it requires a high level of conspiracy thinking to suppose the Met Office have a well-sited station which they ignore in favor of a station hidden behind a public toilet.

Besides, I think the claim comes from a story about a recent record being rejected by the Met Office, because of temporary toilets in place for VE celebrations.

Sparta Nova 4
Reply to  Bellman
May 15, 2025 12:04 pm

I was NOT asking or telling you to prove a negative or anything else.
I asked a yes/no question.

Your last statement matches my last 2 statements.

ResourceGuy
May 13, 2025 11:00 am

Make UK Great Again, without all the deception and advocacy science cheating.

ResourceGuy
May 13, 2025 11:02 am

It’s an updated excuse from the 1990s of ‘the computer did it’.

Anthony Banton
May 13, 2025 11:18 am

Here we go again – if it’s not Homewood it’s Morrison, wasting energy in outrage over the MetO doing something that is not climate compatible.

The data is not for investigating climate or to provide exactitude to any casually interested party.
It’s for people who would like to know, as closely as possible, what the weather was on a particular day at a particular location, and NOT to insert into a global climate GMST series (as if it would make any difference even so).

This place seems to think that the MetO is there purely to provide 101% verifiable data for the likes of Homewood/Morrison, as if they are in any way important. The person/s and the data required.
Its primary remit is to provide the UK public with any weather information of interest … and it’s NOT at all related to verified climate-quality data.
That is about a small part of the MetO’s undertakings.
Sorry.
That’s why more than one “location” is on a beach … because people go there and want to know conditions.

That anyone with a single brain cell of intelligence would have grokked from what the MetO say that it is of scientific (i.e. integratable into long-term climate series) quality and use is beyond me…. oh, wait!

From the MetO….

“These maps enable you to view maps of monthly, seasonal and annual averages for the UK. The maps are based on the 1km resolution HadUK-Grid dataset derived from station data.
*Locations displayed in this map may not be those from which observations are made. Data will be displayed from the closest available climate station, which may be a short distance from the chosen location. We are working to improve the visualisation of data as part of this map. 
Where stations are currently closed in this dataset, well-correlated observations from other nearby stations are used to help inform latest long-term average figures in order to preserve the long-term usability of the data. Similar peer-reviewed scientific methods are used by meteorological organisations around the world to maintain the continuity of long-term datasets.”

Also, this oft-regurgitated myth gives the faithful something to exercise their anger on…

“(nearly 80% of Met Office sites are in junk classes 4 and 5 with ‘uncertainties’ of 2C and 5C respectively).”

Necessarily, as is explained by the UKMO:
https://www.metoffice.gov.uk/weather/learn-about/how-forecasts-are-made/observations/observation-site-classification

“WMO Siting Classifications were designed with reference to a wide range of global environments and the higher classes can be difficult to achieve in the more-densely populated and higher latitude UK. For example, the criteria for a Class 1 rating for temperature suits wide open flat areas with little or no human influenced land use and high amounts of continuous sunshine reaching the screen all year around, however, these conditions are relatively rare in the UK. Mid and higher latitude sites will, additionally, receive more shading from low sun angles than some other stations globally, so shading will most commonly result in a higher CIMO classification – most Stevenson Screens in the UK are class 3 or 4 for temperature as a result but continue to produce valid high-quality data. WMO guidance does, in fact, not preclude use of Class 5 temperature sites – the WMO classification simply informs the data user of the geographical scale of a site’s representativity of the surrounding environment – the smaller the siting class, the higher the representativeness of the measurement for a wide area……”

And no, it’s not feasible to tow the UK to the west of Iberia, or to depopulate it and raze trees and buildings, to accommodate the myth…..


Reply to  Anthony Banton
May 13, 2025 12:21 pm

Anthony, please stop trolling.

Ray Sanders
Reply to  Anthony Banton
May 13, 2025 1:51 pm

Bollocks

Reply to  Ray Sanders
May 13, 2025 2:58 pm

Ant is saying that Met office like using temperature data which is totally unrepresentative of the surrounding area. 😉

Reply to  bnice2000
May 13, 2025 3:55 pm

Yes, in what sense does a weather station surrounded by human modified landscapes or subject to low sun angles produce high quality data?

If the surrounding conditions remain stable, the station may capture trends and anomalies over time, but those measurements still fail to represent the broader region’s true atmospheric state. The data might be precise, but it isn’t accurate.

Reply to  Anthony Banton
May 13, 2025 2:02 pm

Thanks for confirming what we all know, that the Met Office should STFU about Climate Change and stick to forecasting the weather.

Sparta Nova 4
Reply to  Graemethecat
May 15, 2025 8:26 am

+10

Reply to  Anthony Banton
May 13, 2025 2:04 pm

Please stop talking about Met surface sites as if they are relevant to anything to do with climate.

Because of the incompetence of the Met and its officers, a large number of those surface sites are in a totally unfit for purpose state of disrepair.

“(nearly 80% of Met Office sites are in junk classes 4 and 5 with ‘uncertainties’ of 2C and 5C respectively).”

At least you got that correct.

Accepting data that the Met themselves state is mostly from class 4 and 5 sites shows that it may not actually be total incompetence, but a deliberate attempt to manufacture fake warming.

Reply to  bnice2000
May 13, 2025 4:16 pm

Let me add that those uncertainties are added to the regular uncertainty of the station.

For example, if they match U.S. ASOS at ±1°C, the total uncertainty would be 3°C and 6°C. Six degrees for God’s sake!
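The arithmetic above can be sketched as follows. This is a minimal illustration, not the Met Office's method: the comment adds the uncertainties linearly (a worst-case bound), whereas the conventional rule for independent uncertainty sources is root-sum-square, which gives slightly smaller totals. The ±1°C instrument figure and the ±2°C/±5°C siting figures are taken from the comments above and are illustrative.

```python
import math

def combine_linear(*uncertainties):
    # Worst-case bound: uncertainties simply add.
    return sum(uncertainties)

def combine_rss(*uncertainties):
    # Root-sum-square: the usual rule for independent error sources.
    return math.sqrt(sum(u * u for u in uncertainties))

instrument = 1.0        # +/-1 C instrument uncertainty (ASOS-type, illustrative)
class4, class5 = 2.0, 5.0  # WMO siting-class uncertainties in C

print(combine_linear(instrument, class4))          # 3.0 (the figure quoted above)
print(combine_linear(instrument, class5))          # 6.0
print(round(combine_rss(instrument, class4), 2))   # 2.24
print(round(combine_rss(instrument, class5), 2))   # 5.1
```

Either way, the siting-class term dominates: whether combined linearly or in quadrature, a class 5 site's uncertainty swamps the instrument's own specification.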

Reply to  Jim Gorman
May 13, 2025 4:42 pm

And that will not be a +/- thing.

Most of these sites are likely to err very much on the “hot” side.

A very skewed uncertainty.

And remember, a large proportion of their sites will have that highly skewed large uncertainty…

It’s ridiculous that the Met Office and its workers let their sites get into such an appalling state.

And even more ridiculous that they still pretend the data is anything but junk.

Sparta Nova 4
Reply to  bnice2000
May 14, 2025 12:36 pm

But, but, but it is the longest continuing temperature record in the world!
/s

Jeff Alberts
Reply to  Anthony Banton
May 13, 2025 6:07 pm

“Locations displayed in this map may not be those from which observations are made. Data will be displayed from the closest available climate station, which may be a short distance from the chosen location.”

Define “short distance”. I’ve seen differences as high as 27F from two places 13 miles apart.

Sparta Nova 4
Reply to  Jeff Alberts
May 15, 2025 8:27 am

That is perfect for a 1 km grid, eh?
/s

Reply to  Anthony Banton
May 14, 2025 4:13 am

“…however, these conditions are relatively rare in the UK. Mid and higher latitude sites will, additionally, receive more shading from low sun angles than some other stations globally, so shading will most commonly result in a higher CIMO classification – most Stevenson Screens in the UK are class 3 or 4 for temperature as a result but continue to produce valid high-quality data.”

This is nothing more than a rationalization that is basically “inaccurate data can be high-quality data”.

It’s garbage!

Sparta Nova 4
Reply to  Anthony Banton
May 14, 2025 12:35 pm

Except they do.

Sparta Nova 4
Reply to  Anthony Banton
May 15, 2025 8:25 am

Funny how they call them “climate stations” not “weather stations.”

Chew on that for a bit.

cgh
May 13, 2025 12:19 pm

Why not? The Met Office is simply ensuring that it will never be contradicted by its own made-up data.

We should not be surprised. Climatologists like Mikey have entire careers based on just making stuff up. MBH98, anyone? The most useful statistical tricks for the global warmers are the methods which will take any sort of data and produce whatever results you want. Mikey and his Nature Trick showed everyone how it’s done.

May 13, 2025 12:26 pm

Data is observed and then recorded. The Met Office needs retraining on what observation means. Data is either acceptable for purpose or is discarded as useless. People who cite Met Office data in their studies should retract.

Sparta Nova 4
Reply to  doonman
May 15, 2025 8:29 am

If unacceptable, it still needs to be retained, but with the appropriate annotation including why it is not acceptable.

KevinM
May 13, 2025 1:00 pm

“ris·i·ble
adjective
such as to provoke laughter.”
Been meaning to look it up, I can’t be the only one.

KevinM
May 13, 2025 1:04 pm

“Until recently, the Met Office showed weather averages including temperature for over 300 stations stretching back at least 30 years. The data identified individual stations and single location coordinates, but when 103 were found not to exist”

300-103 = 197

I wonder about the other 197.

A British version of AW needs to run a surface station project to survey them. I wonder how many have been visited since installation?

Bob
May 13, 2025 2:06 pm

We all know what the problem is with MET, it is their leadership. If I had the authority I would request the Met Office employees to list the people responsible for the shenanigans taking place there. If a sufficient number don’t respond I would fire 5% of the employees from the top on down. If there is no change I would fire the next 5% and so on.

Reply to  Bob
May 14, 2025 11:37 am

IIRC, Anthony Banton claims to be a former Met Office manager.

Reply to  karlomonte
May 16, 2025 4:43 am

That would explain a lot of the Met Office’s problems.

sherro01
May 13, 2025 3:46 pm

Where are the references to text books and serious papers that allow conventional statistics about error and uncertainty to be used on made-up numbers?
Surely, stats can only be applied in commerce (as opposed to academic research) when based on properly measured actual observations. Are made-up numbers illegal if not used properly? Geoff S

Reply to  sherro01
May 13, 2025 5:04 pm

Anthony Banton endorsed the Met Office upthread, suggesting that weather stations affected by altered landscapes and sunlight angles somehow still provide high quality data.

One has to wonder what definition of “high quality” he is working with.

From Google AI:

A high-quality measurement is accurate, precise, repeatable, and consistent, providing reliable data that can be used to make informed decisions or track progress. It ensures that the data is reliable and trustworthy, making it useful for analysis and decision-making. 

That explanation is easily accessible. I just typed “what is a high quality measurement?” into the Google search bar.

Reply to  crocodile
May 13, 2025 5:38 pm

Those are the qualities of measurement that I was trained in during my engineering degree.

Sparta Nova 4
Reply to  Jim Gorman
May 15, 2025 8:31 am

Those are the qualities of measurement that I have used daily for 50 years and counting.

Reply to  crocodile
May 13, 2025 7:14 pm

Most Met data is only useful for “climate” propaganda.

Built-in warming at a large proportion of Met weather sites.

DStayer
May 13, 2025 4:09 pm

They don’t want to discuss it because it might reveal the absolute fraud the MET office has been involved in!

sherro01
Reply to  DStayer
May 13, 2025 10:15 pm

DStayer,
What is holding you back? Before I retired, if we found prima facie evidence for fraud by anyone who seriously affected our operations, we would go straight to the best lawyers we could find.
Have you been threatened?
Are you not too sure of yourself?
You have to have a good deal of self-confidence, as in being called a big head, to succeed this way.

Geoff S

High Treason
May 15, 2025 2:05 am

A bit of reverse psychology should be used on them. Ask them what they would think of someone trying to sell them a product, claiming it to be thoroughly tested by several laboratories, when actually only one or two tests had been performed. Would they still buy the product knowing the claimed tests were a lie?

Sparta Nova 4
Reply to  High Treason
May 15, 2025 8:32 am

Like Chinese inverters?

Sparta Nova 4
May 15, 2025 12:01 pm

I wish to officially protest nyolci.

Personal attacks.
Use of “denier” in almost every post and intended to encompass everyone who visits WUWT.
This is akin to racial profiling and is unacceptable.

Then there is the cherry-picking of snippets that nyolci responds to while ignoring the full comment.
This is apparently done so he/she can be amused by deflecting and detracting from legitimate discussion.

For the rest of you, do not feed the trolls.
nyolci is not contributing anything worthwhile and is apparently only engaging to get a response for entertainment only.

nyolci makes Plato’s Sophists look like rank amateurs.
The sophistry is strong in this one.

Reply to  Sparta Nova 4
May 16, 2025 4:45 am

Nick Stokes and Simon run nyolci close for mendacity and sophistry.
