Massive Cover-up Launched by U.K. Met Office to Hide its 103 Non-Existent Temperature Measuring Stations

From THE DAILY SCEPTIC

by Chris Morrison

Last month, the Daily Sceptic highlighted the practice at the U.K. Met Office of inventing temperature averages from over 100 non-existent measuring stations. Helpfully, the Met Office went so far as to supply coordinates, elevations and purposes of the imaginary sites. Following massive interest across social media and frequent reposting of the Daily Sceptic article, the Met Office has amended its ludicrous claims. The move has not been announced in public, needless to say, since drawing attention to this would open a Pandora’s box and run the risk of subjecting all the Met Office temperature claims to wider scrutiny. Instead, the Met Office has discreetly renamed its “U.K. climate averages” page as “Location-specific long-term averages”.

Significant modifications have been made to the new page, designed no doubt to quash suspicions that the Met Office has been making the figures up as it went along. The original suggestion that selecting a climate station can provide a 30-year average from 1991-2020 has been replaced with the explanation that the page “is designed to display locations that provide even geographical coverage of the U.K., but it is not reflective of every weather station that has existed or the current Met Office observation network”. On the new page the locations are still referred to as “climate stations”, but the details of exactly where they are have been omitted.

The cynical might note that the Met Office has solved its problem of inventing data from non-existent stations by suggesting that the figures now arise from “locations” which may or may not bear any relation to stations that once existed, or indeed exist today. If this is a reasonable interpretation of the matter, it might suggest that the affair is far from closed.

Again we are obliged to the diligent citizen journalist Ray Sanders for drawing our attention to the unannounced Met Office changes and providing a link to the previous averages page on the Wayback Machine. The sleuthing Sanders has been on the case for some time, having discovered that three named stations near where he lives – Dungeness, Folkestone and Dover – did not exist. The claimed coordinates for Dover placed the station in the water off the local beach, as shown by a Google Earth photo.

As a result, Sanders discovered through a Freedom of Information request that 103 of the 302 sites marked on the climate averages listing – over a third of the total – no longer existed. Subsequently, Sanders sought further information about the methodology used to supply data for both Folkestone and Dover. In reply, the Met Office said it was unable to supply details of the observing sites requested “as this is not recorded information”. It did, however, disclose that for non-existent stations “we use regression analysis to create a model of the relationship between each station and others in the network”. This generates an estimate for each month when the station is not operating. Each “estimate” is said to be based on data from six other stations, chosen because they are “well correlated” with the target station.
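For readers curious what such an infill might look like, here is a minimal sketch of a six-station regression of the kind the FOI reply describes. It is an assumption-laden illustration, not the Met Office’s method: the data are synthetic, and the actual model, weighting and software used by the Met Office have not been published.

```python
# Illustrative sketch only: ordinary least squares infill of a closed
# station's monthly mean from six "well correlated" neighbours, per the
# general method described in the FOI reply. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(42)

# 120 months of synthetic monthly-mean temperatures (deg C) for six
# neighbour stations during a period when the target station still reported.
neighbours = rng.normal(10.0, 3.0, size=(120, 6))
target = neighbours.mean(axis=1) + rng.normal(0.0, 0.5, size=120)

# Fit target = b0 + b1*n1 + ... + b6*n6 over the overlap period.
X = np.column_stack([np.ones(len(target)), neighbours])
coef, *_ = np.linalg.lstsq(X, target, rcond=None)

# After closure, "report" an estimate from the neighbours' latest readings.
latest = rng.normal(10.0, 3.0, size=6)
estimate = coef[0] + latest @ coef[1:]
print(f"Estimated monthly mean for the closed station: {estimate:.1f} degC")
```

Note that the output is a modelled value, not a measurement – which is precisely the point of contention here.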

In the case of Dover, the nearest ‘station’ is seven miles away at non-existent Folkestone, followed by Manston, which is 15 miles distant. By “well correlated” perhaps the Met Office means they are in the same county of Kent. No matter, computer models are on hand to guide the way.

Ray Sanders had sent details of his findings to the new Labour science minister Peter Kyle MP, and the recent Met Office changes may have been prompted by a discreet political push. At the time, Sanders asked: “How would any reasonable observer know that the data was not real and was simply ‘made up’ by a Government agency?” He called for an open declaration of the likely inaccuracies of existing published data “to avoid other institutions and researchers using unreliable data and reaching erroneous conclusions”.

The Met Office also runs an historical data section where a number of sites with long records of temperature are identified. Lowestoft closed in 2010 and since then the figures have been estimated. The stations at Nairn Druim, Paisley and Newton Rigg have also closed but are still reporting estimated monthly data. “Why would any scientific organisation feel the need to publish what can only be described as fiction?” asks Sanders.

The original Braemar station in Aberdeenshire has recorded temperature data since Victorian times. Due to its distinctive topography – it is surrounded by high mountains – it recorded the U.K.’s coldest temperature of -27.2°C in both 1895 and 1982. In summer, the temperature can soar as the heat stays trapped. A new site, some distance from the original, was set up in 2005 and, in common with Met Office procedure, was labelled Braemar 2 to reflect both the distance and the climatological differences. In the historical data section of the Met’s website, Braemar 2 is shown supplying data back to 1959. “For reasons I find difficult to understand, the Met Office has chosen to highlight a spurious merging of two notably different data sets for an illogically defined period that fails to represent either site,” observes Sanders.

The recent changes made by the Met Office to its climate averages pages show that the state-funded operation is fully aware of the growing interest in its entire temperature recording business. This interest has grown because the Met Office is fully committed to using its data to promote the Net Zero political fantasy. But it is silent on the biggest concern that has been raised of late, namely the promotion of temperatures, accurate to one hundredth of a degree centigrade, obtained from a nationwide network where nearly eight out of 10 stations are so poorly sited they have internationally recognised ‘uncertainties’ as high as 5°C.

Chris Morrison is the Daily Sceptic’s Environment Editor.

Comments
observa
December 9, 2024 6:23 pm

“we use regression analysis to create a model of the relationship between each station and others in the network”.

They are the very model of a major modern modeller!
They do love their relativism and contextualisation now don’t they?

December 10, 2024 12:03 am

Claiming a measured absolute global surface temperature is meaningless. Interpolations do not help. Climate models produce a 2-4°C range of absolute temperatures, a spread that is obscured by the use of temperature anomalies. Specifying tenths of degrees in anomalies of temperature is equally meaningless. There is a global temperature, in principle, but we cannot measure it for obvious reasons. The agencies pretend to do so anyway. It puts dinner on the table.

beanleft
December 10, 2024 5:11 am

As student employees for a major oil company back in the early days of the Clean Air Act, some of us were required to climb the refinery’s 100+ ft. furnace stacks to measure and record flow, pressure and temperature. Some of the brighter students quickly realized that 1) the numbers never changed by much, and 2) nobody cared anyway. So most of the data from the coldest days were recorded in a local restaurant while enjoying a hot breakfast.

I have been a data skeptic ever since, and preach healthy skepticism to everyone I meet.

Ray Sanders
Reply to  beanleft
December 10, 2024 5:45 am

Nine fully able people in a room, plus a 10th with one leg amputated. The average number of legs per person in the room is 1.9, and yet nine people are above “average”, having two!

Reply to  Ray Sanders
December 10, 2024 9:23 am

Are legs an intensive or extensive property?

It would appear to be intensive, just like temperature.

John Hultquist
December 10, 2024 7:53 am

TIP – it is no wonder temperature is such an insoluble problem.
From the WSJ: …according to a global test of adult know-how,
… the least-educated American workers between the ages of 16 and 65 are less able to make inferences from a section of text, manipulate fractions or apply spatial reasoning.
When it comes to basic skills such as creating a complex travel itinerary, reading a thermometer or finding information from a website, American workers are falling behind those in other rich countries.

My bold. I have no idea when or how I learned about temperature.  

Reply to  John Hultquist
December 10, 2024 8:30 am

My first experience reading a thermometer was watching the temperature gauge my father added to a Farmall A tractor.

Robert B
December 11, 2024 2:16 am

“But it is silent on the biggest concern that has been raised of late, namely the promotion of temperatures, accurate to one hundredth of a degree centigrade, obtained from a nationwide network where nearly eight out of 10 stations are so poorly sited they have internationally recognised ‘uncertainties’ as high as 5°C.”

Temperature is an intensive property, like concentration. Both are quotients of two extensive properties of the same body; the extensive properties depend on the total quantity of the body, so their quotient, the intensive property, does not. An intensive property is also required to be constant throughout the body.

You can get an average intensive value if you measure both extensive properties of the body, or have sampled it so much that you have the totals of the two extensive properties. You do not get a measure 100 times more precise by sampling 10,000 times than you would by measuring the totals of the two extensive properties. Measuring in 10,000 parcels gets you a random error 100 times greater. You also introduce an error from the sample sizes not being perfectly even (even if weighted), not to mention that global temperature measurements are like a smaller sampling of the samples.

An average from sampling well short of everything is meaningless. You can fit functions to what you have to create a complete profile and average an infinite number of samples, but that carries extra associated errors, both systematic and random.

When the intensive property is constant, each sample will have a single true value, and the average of your measurements becomes useful for getting a more precise measurement: e.g. 10,000 measurements should give you an average that is within 1/100 of the spread of individual measurements from the true value, as is the case with extensive properties like height.
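A toy simulation makes the constant-property case concrete. The numbers below are assumptions for illustration only (a true value of 13.0 and a single-reading scatter of 3.0): averaging N repeated readings of one fixed value shrinks the random error roughly as 1/√N, which is exactly what cannot be claimed when each sample is a different parcel.

```python
# Toy check of the constant-property case described above: averaging N
# repeated readings of ONE fixed value shrinks the random error roughly
# as 1/sqrt(N). All numbers are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(7)
true_value = 13.0   # the constant intensive property
sigma = 3.0         # random error of a single reading

for n in (1, 100, 10_000):
    readings = rng.normal(true_value, sigma, size=n)
    err = abs(readings.mean() - true_value)
    print(f"N = {n:>6}: mean = {readings.mean():7.3f}, "
          f"error = {err:.3f}, expected ~ {sigma / np.sqrt(n):.3f}")
```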

When you look up the average salinity of the oceans, NOAA will report that it’s about 35 ppt, mostly between 33 and 37 ppt. They don’t call it an average. But when it comes to how many Hiroshima bombs’ worth of energy the oceans have accumulated, a 0.001 K increase in average temperature is significant.

Huge problems even before you consider eight out of 10 stations are so poor.

Reply to  Robert B
December 11, 2024 9:42 am

“You do not get a measure 100 times more precise by sampling 10,000 times than you would by measuring the totals of the two extensive properties.”

You do not get an answer with a higher resolution than what was actually measured with the lowest-resolution device. That is the whole reason significant-figure rules were developed. If the average of integer measurements is 13.25, it is not 13.0; the average is “13”, i.e., no decimals.

You may measure THE SAME THING repeatedly under repeatable conditions 10,000 times and get a standard deviation of the mean equal to 3.0/√10,000 = 3.0/100 = 0.030 (assuming a single-reading standard deviation of 3.0). The measurement becomes 13 ±0.03. An engineering cost study would show that doing ~20 measurements would suffice to give an uncertainty of the mean to one decimal place, i.e., 0.7, which gives 13 ±0.7. Regardless of how many integer readings you make, the stated average value is 13.
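The arithmetic quoted above can be checked directly. Assuming, as the comment does, a single-reading standard deviation of 3.0, the standard error of the mean is σ/√N:

```python
# Checking the figures quoted above: sigma / sqrt(N) with sigma = 3.0
# (an assumed single-reading standard deviation, per the comment).
import math

sigma = 3.0
for n in (20, 10_000):
    sem = sigma / math.sqrt(n)
    print(f"N = {n:>6}: 13 +/- {sem:.2f}")

# Output:
# N =     20: 13 +/- 0.67   (~0.7, one decimal place)
# N =  10000: 13 +/- 0.03
```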

Be careful of online AIs and calculators. They don’t always follow the rules correctly.

“Measuring in 10,000 parcels gets you a random error 100 times greater.”

There are folks here who don’t, or won’t, understand this.

High Treason
December 11, 2024 12:30 pm

If it turns out to be correct that fake weather stations have been putting out fake data (to fit a certain political narrative), this is a massive political scandal. It would mean that ALL the trillions spent on windmills etc. were fraudulently obtained. The taxpayers were being deceived. The big questions: Who orchestrated it? Why (what benefits were obtained, and by whom)? When was the fraud conceived and implemented? What was done – how many other aspects of the narrative are fraudulent? How did they get the fraud implemented? How did anyone allow it to happen?
All of these questions need to be nutted out to avoid such a massive fraud in the future. As it is, this is a massive scandal – someone would have been bribed, blackmailed, intimidated or coerced to allow the fraud to start, which then became a self-perpetuating myth.
This sort of mass fraud has happened in the past – childhood fables are a warning about this:
Chicken Little – note, it was a wise king. Most rulers are greedy or power-hungry. They would have imposed new taxes and restricted freedoms. Isn’t this what we have seen with the “climate crisis” and COVID “vaccines”?
The Emperor’s New Clothes – pretty obvious, or it should be. Note, the treasury was cleaned out. Who would pay? The hapless taxpayer – and they were not even properly consulted about the whole thing.
The Lamb and the Wolf – tyrants always use the feeblest of excuses to justify their tyranny.
The Man, the Boy and the Donkey – blindly trust others instead of following your instincts and you will find yourself in deep trouble. If you think about it, they were so poor they had to sell their means of production just to get a bit of money to survive – they lost both, as well as a full day.
The Pied Piper – follow along with an absurd but appealing narrative and your future is gone.
These are messages from the past that are being played out now. We should pursue this massive fraud and not let up until the tide of mass deception turns on the perpetrators. It is not as though it was an accident – there can never be an excuse for false science.

Corrigenda
December 12, 2024 2:10 pm

This surely is scientific fraud? When will it stop, and when will the world be told that there is no significant human- or animal-triggered climate change?