Last month, the Daily Sceptic highlighted the practice at the U.K. Met Office of inventing temperature averages from over 100 non-existent measuring stations. Helpfully, the Met Office went so far as to supply coordinates, elevations and purposes of the imaginary sites. Following massive interest across social media and frequent reposting of the Daily Sceptic article, the Met Office has amended its ludicrous claims. The move has not been announced in public, needless to say, since drawing attention to this would open a Pandora’s box and run the risk of subjecting all the Met Office temperature claims to wider scrutiny. Instead, the Met Office has discreetly renamed its “U.K. climate averages” page as “Location-specific long-term averages”.
Significant modifications have been made to the new page, designed no doubt to quash suspicions that the Met Office has been making the figures up as it went along. The original suggestion that selecting a climate station can provide a 30-year average from 1991-2020 has been replaced with the explanation that the page “is designed to display locations that provide even geographical coverage of the U.K., but it is not reflective of every weather station that has existed or the current Met Office observation network”. Under the new page the locations are still referred to as “climate stations” but the details of where they are, exactly, have been omitted.
The cynical might note that the Met Office has solved its problem of inventing data from non-existent stations by suggesting that they now arise from “locations” which may or may not bear any relation to stations that once existed, or indeed exist today. If this is a reasonable interpretation of the matter, it might suggest that the affair is far from closed.
Again we are obliged to the diligent citizen journalist Ray Sanders for drawing our attention to the unannounced Met Office changes and providing a link to the previous averages page on the Wayback Machine. The sleuthing Sanders has been on the case for some time, having discovered that three named stations near where he lives, namely Dungeness, Folkestone and Dover, did not exist. The claimed co-ordinates for Dover placed the station in the water on the local beach as shown by the Google Earth photo below.

As a result, Sanders discovered from a freedom of information request that 103 of the 302 sites marked on the climate averages listing – over a third of the total – no longer existed. Subsequently, Sanders sought further information about the methodology used to supply data for both Folkestone and Dover. In reply, the Met Office said it was unable to supply details of the observing sites requested “as this is not recorded information”. It did however disclose that for non-existent stations “we use regression analysis to create a model of the relationship between each station and others in the network”. This generates an estimate for each month when the station is not operating. Each “estimate” is said to be based on data from six other stations, chosen because they are “well correlated” with the target station.
In the case of Dover, the nearest ‘station’ is seven miles away at non-existent Folkestone followed by Manston which is 15 miles distant. By “well correlated” perhaps the Met Office means they are in the same county of Kent. No matter, computer models are on hand to guide the way.
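For readers curious what that regression infilling looks like in practice, here is a minimal sketch on entirely synthetic numbers. Nothing below comes from the Met Office or any real station, and the code makes no claim to reproduce the Met Office's actual model; it only illustrates the generic least-squares technique the FOI reply describes: fit the target station against its "well correlated" neighbours over the overlap period, then predict the months it no longer records.

```python
import numpy as np

rng = np.random.default_rng(0)

# Entirely synthetic illustration: six hypothetical "donor" stations
# and one target station, monthly means over 30 years.
months = 360
regional = 12.0 + 8.0 * np.sin(2 * np.pi * np.arange(months) / 12)
donors = np.column_stack(
    [regional + rng.normal(0, 0.5, months) for _ in range(6)]
)
target = regional + rng.normal(0, 0.5, months)

# Pretend the target station closed after 20 years (240 months).
n_obs = 240
X = np.column_stack([np.ones(n_obs), donors[:n_obs]])  # intercept + donors
coef, *_ = np.linalg.lstsq(X, target[:n_obs], rcond=None)

# "Infill" the missing decade from the donors alone.
X_miss = np.column_stack([np.ones(months - n_obs), donors[n_obs:]])
estimates = X_miss @ coef

# Compare against the (here known) withheld truth.
rmse = np.sqrt(np.mean((estimates - target[n_obs:]) ** 2))
print(f"RMSE of infilled monthly means: {rmse:.2f} C")
```

Note that the infilled values inherit whatever the donors cannot explain: in this toy case the error is roughly the target's own independent noise, and with real stations the estimates would also carry any siting or urbanisation biases the donors share.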
Ray Sanders had sent details of his findings to the new Labour science minister Peter Kyle MP and the recent Met Office changes may have been prompted by a discreet political push. At the time, Sanders asked: “How would any reasonable observer know that the data was not real and was simply ‘made up’ by a Government agency?” He called for an open declaration of likely inaccuracies of existing published data “to avoid other institutions and researchers using unreliable data and reaching erroneous conclusions”.
The Met Office also runs an historical data section where a number of sites with long records of temperature are identified. Lowestoft closed in 2010 and since then the figures have been estimated. The stations at Nairn Druim, Paisley and Newton Rigg have also closed but are still reporting estimated monthly data. “Why would any scientific organisation feel the need to publish what can only be described as fiction?” asks Sanders.
The original Braemar station in Aberdeenshire has recorded temperature data since Victorian times. Due to its interesting topography surrounded by high mountains, it recorded the U.K.’s coldest temperature of -27.2°C in both 1895 and 1982. In summer, the temperature can soar as the heat stays trapped. A new site, some distance from the original, was set up in 2005 and in common with Met Office procedure was labelled Braemar 2 to reflect both distance and climatological differences. In the historical data section of the Met’s website, Braemar 2 is shown supplying data back to 1959. “For reasons I find difficult to understand, the Met Office has chosen to highlight a spurious merging of two notably different data sets for an illogically defined period that fails to represent either site,” observes Sanders.
The recent changes made by the Met Office to its climate average pages show that the state-funded operation is fully aware of the growing interest in its entire temperature recording business. This interest has grown because the Met Office is fully committed to using its data to promote the Net Zero political fantasy. But it is silent on the biggest concern that has been raised of late, namely the promotion of temperatures, accurate to one hundredth of a degree centigrade, obtained from a nationwide network where nearly eight out of 10 stations are so poorly sited they have internationally-recognised ‘uncertainties’ as high as 5°C.
Chris Morrison is the Daily Sceptic’s Environment Editor.
I suggest a policy where the paychecks of the UK Met Office employees are marked as “delivered” to the nearest one of these weather stations.
And if nobody is there to receive it, they can leave it with the station next door.
or, simply stop paying into 1/3 of the met office employees’ retirement plan & wages. Run a regression analysis to average what would have been paid based on the other ‘similar’ employees, and then pay them that calculated average.
See if they think it is a reasonable thing to do … see if they further research things to check if the calculated averages are beneficial or detrimental to them as single/individual data points.
Anyone that does look into it then gets fired for not doing their job, and then lying about why it was O.K. to not do their job, in the first place.
Brilliant idea!!
We have a similar problem here in the U.S. with over 380 ghost stations … those that no longer exist and yet NOAA continues to fabricate their temperature data. Belle Glade, FL is just one of them. My video provides more information about data altering and fabrications … https://www.youtube.com/watch?v=cF16lDtSVrU
I sort of can understand how this “estimation” thing could get started. Let’s say a met-tech says to the boss: Mrs. O’Leary from East Mayberry didn’t report today. Shall I put in zeros? And the boss says: That would mess up the averages unless we fix the computer code. Just make an estimate using West Mayberry info.
As time goes by, more folks like Mrs. O’Leary stop calling in because of sickness or death. (I know one. It took about two years for the weather folks to find a place and install equipment, about 1.75 miles away.)
. . . and years later the “infilling” got out of control.
[I’m in central Washington State.]
Altered and/or fabricated data is fake data — not real.
Have you ever used a bathymetric chart, a topographic map with contour lines, or looked at isobars for air pressure? Almost all “fake” data. 🤔
How do you think a bathymetric chart or contour map is constructed? By actually measuring depths or heights and connecting the points of equal value.
There is a fundamental difference between interpolating between measured data points and inventing data points. The first practice is legitimate, the second is fraud. If you have problems understanding this, perhaps you shouldn’t be commenting here.
By your reasoning there should not be lines connecting those data points.
Read my post again.
Have you ever drawn a graph? If so, did you connect the dots?
Hey John,
Those are models, created from the data.
If the data is not reasonably collected (miss a break line, or a low point, or a high point), then the contour lines are off.
Had a contractor decide to do their own construction staking … they missed a few important points and brought in and placed another 4,000 yards of material. A few points off a vertical curve and a $16,000 over-bill….
The problem with the ‘global temp’ collection scheme is that NO ONE knows what specific data points are important and which ones can be interpolated. At this point in time, it seems that the people who claim ‘too much warmth’ are assuming that we can get by with very little actual real data.
The problem there is a ghost network; NOAA hasn’t published a USHCN average for more than 10 years. So a network that doesn’t exist has stations that don’t exist. Spooky!
NOAA updates the USHCN data on a daily basis via its public URL … https://www.ncei.noaa.gov/pub/data/ushcn/v2.5/
So if you want to find a non-existent station (flagged of course with an E) you can go look in that data directory. But the NOAA hasn’t used USHCN for any published US climate maps or averages for over ten years!
Correct. They don’t use the “USHCN” designation, but they use that data in other designated datasets — for the 1218 stations are the ONLY long-term historic data available. That’s why they keep using our tax monies to alter and fabricate this data. They obviously have fooled you.
The stations that are still reporting are part of GHCN V4, and used by GISS and NOAA, along with thousands of other US stations, and in the same way. They don’t use the USHCN filnet etc, which was interpolation needed only to deal with the USHCN requirement to average absolute temperatures. So they have no requirement for 1218 stations.
They now publish the pristine station homogenised ClimDiv to match USCRN.
They don’t publish USHCN because it would be way too embarrassing and would show just how much of their fakery comes from urban warming.
You know that USHCN 2.5 still exists. The data and link have been given to you MANY times.
That means you are DELIBERATELY LYING !!
You keep LYING, Nick. I corrected you about 5 years ago on this, showing that NOAA is still running the program.
I said
“NOAA hasn’t published a USHCN average for more than 10 years.”
The way to refute that is to show where NOAA published a USHCN average. You can’t do that.
They haven’t published other data they generate either.
You think if it isn’t “published” it means it doesn’t exist, man you are failing badly.
We need more fact checks like this, and easy enough to do with temperature. Audit all temp collection sites and publicly announce the results!
I do that on X … https://x.com/search?q=%23noaaghoststation&src=typed_query
Caught out making changes to ‘hide’ the indefensible. Not a good look.
Somebody made an idiotic fuss about wording and so they improved their wording.
You ALWAYS make an idiotic fuss about wording.
It is your only mode of operation.
You are well aware that much of the UK data is FAKE and CORRUPTED, just as it is with BoM and USHCN.
Why do you always think you can get away with being stupidly disingenuous?!
Nick, that “somebody” was …me. Can you please explain why you feel it is scientifically acceptable to fabricate data for non-existent sites?
Ray
I always appreciate your contributions to NALOPKT
Thanks erm…Glen!
It is an estimate. They undertake to estimate climate variables for your location. They can’t give you a measurement there.
It’s stuff people want to know.
What are the error bars on these “estimations”? Interested people want to know.
Mechanical response here. Estimations do not have error bars. The estimator may choose to give a range.
Why do you bother eh?
He is here to harvest negative votes……
Yup. My employer pays me $1 for every down vote. So keep them coming, please! 🙂
Thanks!
Translation: climate practitioners are exempt from data handling rules that other rigorous sciences live by.
It is certainly helpful to know what does not exist.
“fabricate data”
It is not measurement data. It is a climate normal (30 years). They do not claim to have measured it with an instrument (they cannot). They estimate it by calculation using the information available. People find that useful.
Are you saying that the “locations” only display the climate normal for that “location”?
It seems to me Sanders has shown that the Met Office provides daily temperatures for the “locations”.
Do you have a link for that?
Link for what? I asked if you are saying the “locations” only show the climate normal?
You said:
“the Met Office provides daily temperatures for the “locations”
I asked for a link. I doubt it is true.
People find it fabrication.
This is just off the planet. Look at the head picture. It shows Europe covered with isobars, each representing a number. Such maps have been standard for at least a century. Do you imagine each number is backed by a barometer reading? No, they are estimated, in old days drawn by hand. They are not “fabricated”. They are the meteorologist trying to tell the public what is happening. And that is very useful.
Yes. People who want to manipulate perception and profit from it no doubt find fabricated “data” very useful.
So you are saying that NO scientific papers use these non-measured temperatures for any purpose?
If these guesses find their way into scientific papers without adequate assessment of their uncertainty, then the science is tainted. Why do you think physical scientists and engineers are fanatical about measurements? Why bother having metrology science at all?
Caught out.
Yes.
And where is BBC Verify?
You know, the BBC’s own Holy Guarantor of Unimpeachable Whatever the Crazy Lefties think is Right?
Surely they would be all over this if it was related to Trump . . . but, for some, they prove their inbuilt bias by remaining silent.
Auto
Hi Auto, when I publish these views on my LinkedIn profile, rather than the Met Office and BBC trying to ascertain the truth, you actually find people with “qualifications” in “ethical hacking” crawling all over it. Guess I should be flattered, but it gets a bit irksome after a while. Here is an example: he won’t talk to me or answer inquiries but is happy to try and infiltrate my profile.
https://www.linkedin.com/in/angus-bruce-341645129/?originalSubdomain=uk
Very strange that an operational meteorologist doesn’t list meteorology among his skills…
This new kind of data needs a name
Pseudo data
Synthetic data
Data lite
Plastic data
Fairy data, I think this one is the best.
Everyone has to believe or Tinkerbell dies.
How about out and out lies?
Idealised data
Virtual data
not physically existing as such but made by software to appear to do so
PS This actually applies to the whole CAGW
Convenient data?
Phantom data
Ghost data
Don’t-got-no data
Here are more USHCN ghost stations … https://x.com/search?q=%23noaaghoststation&src=typed_query
Fiddled figures.
How about “Trans” data?
It is not what it appears to be, that is, real data.
Close enough for government data.
It’s hard to get a “xxx data” name that drives home the criminality. “Invented data”, possibly. But really Sparta Nova 4’s suggestion here is probably the only really accurate name for it:
out and out lies
non-measured data
Fake Data
Non data.
Or, if they want a weekend vibe, Satur data.
Or, if it only makes temperatures seem higher, Fry data.
It has a name. Propaganda.
Excellent Chris. Government has clearly shown it is incapable of managing the MET. It should be privatized, and aggressively held accountable for every action. Government cannot be trusted for the simple reason that it is self-regulated. This is why outfits like Exxon are not self-regulated.
“aggressively held accountable”
By?
There are laws against lying and cheating; private firms and individuals are far more likely to be held accountable than governments or individuals in government.
If it was privatised, we would have no right to FOI requests. That data and how the data were produced would become commercial secrets. We would never have found out what was going on.
That would be part of it going private. Private outfits get audited all the time, write it in the contract.
They trust the models more than the data, since they have no data and the models give the temps they want.
I’ve seen this more times than I can remember. They seem to think that since the models produce data that is used by the models to predict the data, everything is fine.
The Met Office, a government organ, is fully complicit in the man made warming criminal scam.
The Met Office
Ho, ho, bleedin’ ho.
“It’s critical to keep in mind that this data, although fabricated, confirms what we wanted to see, so it’s sorta true.” h/t Dan Rather, formerly of CBS “News”.
Is that a real quote?
He read it on the internet!
But it is exactly how the AGW-cultist like you operate, Nick !
Not a real quote, although that’s probably what Dan Rather was thinking.
Dan lost his job at CBS News after his lie about George W. Bush’s military service became public.
Oh, how the Mighty have fallen!
You brought it on yourself, Dan. Good riddance.
Thanks. I wasn’t sure whether he said that out loud, but I do remember he got fired for the GWB bit, so thought it must have been that bad.
“But it is silent on the biggest concern that has been raised of late, namely the promotion of temperatures, accurate to one hundredth of a degree centigrade, obtained from a nationwide network where nearly eight out of 10 stations are so poorly sited they have internationally-recognised ‘uncertainties’ as high as 5°C.”
UKMet historical temperature record reports uncertainties smaller than the lower limit of resolution of the LiG thermometers, and ignores the profound uncertainties arising from systematic measurement error.
LiG Metrology, Correlated Error, and the Integrity of the Global Surface Air-Temperature Record
I used to teach ecology classes about microclimate. Lived on a rare ~10 foot hill in Louisiana with a small coulee at the bottom that ran down to where the 1927 flood reached. Didn’t take much to get degrees of difference with a calibrated mercury thermometer. Decimals no. Once vehicles got thermometers it seemed logical that we need lots more stations, especially in the ocean. Everybody knows about the old adage about lies and statistics now with computer models added. This just came out, but only a small number of papers seem to be tolerated showing all these statistical follies. Preprint, not reviewed
https://doi.org/10.1038/d41586-024-03996-w
Bad bar charts are pervasive across biology
Scientific ‘shorthand’ could be introducing data distortions into published papers. By Amanda Heidt. One third of 3,400 papers in 2023 with bar charts had a problem.
“The pair found that 88% of the papers contained at least one bar chart; of those papers, 29% had a bar chart with some form of data distortion.”
Actually Pat, it is much, much worse than that. Ready for this? The UK Met Office claims field reading accuracy down to one hundred-thousandth of a degree, i.e., the fifth decimal place. No, I ain’t joking; here’s the proof.
https://tallbloke.wordpress.com/2024/09/16/cavendish-dcnn-3122-anatomy-of-an-ongoing-challenge-crop-circling/
Outrageous! I would like to know the thermometer being used and I would like to see the calibration document for it.
Even USCRN stations claim an uncertainty of 0.3°C.
Liars.
I wonder if gathering their temperature readings hurts the Met’s hemorrhoids.
It is easy to have data fit The Narrative if one just makes sh!t up.
If you average a non-existent sensor many times, does it become more accurate?
If you average it enough times, does it come into existence?
The supernatural properties of air temperature averaging are not to be questioned.
Nick Stokes seems to think so.
Just one more example of the poxy probity of climate “data”.
How can the Met Office make any analysis of changes in weather and climate if it does not know where its weather stations are sited? Their pronouncements simply have to be scientifically fraudulent.
Here is another Met Office conundrum. They supply data to formulate “Average Heating Days” statistics for the DESNZ. So… how do you average 17 numbers? Easy: double 4 of them, add the remaining 17 and divide by 21. Think I am joking?? https://tallbloke.wordpress.com/2024/10/30/leconfield-wmo-03382-doubled-dubious-data-despite-a-new-solar-farm/
Story tip
oops “add the remaining 13” or maybe it is 17 who knows!
The King of Peru
(Who was Emperor too)
Had a sort of a rhyme
Which was useful to know…
Oh, whenever the Emperor
Got into a temper, or
Felt himself awkward and shy,
He would whisper and whisper,
Until he felt crisper,
This odd little rhyme to the sky:
Eight eights are eighty-one;
Multiply by seven.
If it’s more,
Carry four,
And take away eleven.
Nine nines are sixty-four;
Multiply by three.
When it’s done,
Carry one,
And then it’s time for tea.
A A Milne
If the Dover met station is really located at the pin on the aerial photo, it would read unusually high on sunny summer days due to radiant heat reflected from the sand.
Met Office: “We don’t need no bloody thermometer–it’s getting warmer because we told you so!”
Some teenage surfers report feeling the change as it happened.
“Massive Cover-up Launched by U.K. Met Office to Hide its 103 Non-Existent Temperature Measuring Stations”
How hard can it be to hide a non-existent station?
Of course, all that this breathless rant has at its base is that the MO has reworded its description of a facility to be even clearer. The facility is one that will give you an estimate of climate variables at an arbitrary point. The facility uses data from some nearby point. That is all you can expect from an instant-response facility. Sometimes it uses a closer point that is itself a composite of nearby points. That is actually better, because it can put more effort into forming that composite.
Well said Kamala.
Facility- based on the word facile.
Facilities like that should eliminate homelessness.
Fundamentally dishonest people always expose themselves on stories like these.
This is definitely a case where the government has been caught manipulating data to advance a political narrative. An honest person who believes in the green religion would be shocked.
The honest man, tho’ e’er sae poor,
Is king o’ men for a’ that.
The man o’ independent mind,
He looks an’ laughs at a’ that.
Word salad alert! Can we call out the international cooperation by Government-run weather forecasters? In Australia it is no different: badly sited stations and homogenisation of temperatures between stations hundreds of kilometres apart. The problems have been documented numerous times by various contributors, with no valid justification in response from our BOM.
“How hard can it be to hide a non-existent station?”
Thanks for letting us know that YOU WORSHIP FAKE DATA.
We have known that for a long time.
They got sprung, and are now trying to hide their fakery.
Nick, you are simultaneously both ignorant and arrogant. I am the original researcher and author of most of this work. Here is a little secret: I am rather well qualified and know what I am talking about. And you are?
… rather well wualified and I know what I am talking about.
Even your spellchecker knows you’re full of sh!t.
… In your pig’s eye!
Agreed, Nick, but what is the purpose of these inferred temperatures?
Why are they continued?
What would suffer if they simply discontinued use of inferred temperatures?
Geoff S
Geoff,
It’s a user facility to let you know the best the MO can tell you about your location. It doesn’t have a station there, so it can’t be exact. Mostly they tell you the nearest comparable station (altitude etc) but sometimes it makes sense to pre-form a composite at some nearby location. Sometimes this will be a continuation of a station.
Are you after a gold medal in being a moron? The phoney figures are quoted to the second decimal place. You haven’t read any of them have you?
Try this one
https://tallbloke.wordpress.com/2024/09/04/dungeness-wmo-03888-and-the-103-missing-met-stations-mystery/
When challenged the Met Office could NOT even show HOW they faked the numbers
It really doesn’t change the fact that they are guesses whether they are made by “eyeball” or by using a made up computer algorithm created by a programmer.
Science isn’t done by guessing. It is why a physical hypothesis requires a mathematical proposition to test using MEASURED DATA. Using non-measured data can’t prove anything.
It doesn’t have a station there, so it can’t be exact.
You realize you just verified the fact that the uncertainty of the made up data has a huge value.
If a class 5 station has an uncertainty of ±5 degrees, what must an unreal station have?
FFS:
It isn’t meant to be exact.
The data is not for investigating climate or to provide exactitude to any casually interested party.
It’s for people who would like to know, as close as is possible, what the weather was on a particular day at a particular location, and NOT to insert into a global climate GMST series (as if it would make any difference even so).
This place seems to think that the MetO is there purely to provide 101% verifiable data for the likes of Homewood/Morrison, as if they, or the data they require, are in any way important.
You are asserting that none of the non-measured data is ever used in any scientific studies. I can find no paper that asserts they have removed non-measured data from their calculations.
Why don’t you show us sources that confirm this data hasn’t found its way into the computations of global anomalies. You might include sources that remove non-measured data from U.S. stations also.
“You are asserting that none of the non-measured data is ever used in any scientific studies.”
If they were, then the authors would have been laughed out of their profession.
Mind here, they would have been lauded for *discovering* fraud.
That anyone with a single brain cell of intelligence would have grokked, from what the MetO says, that it is of scientific (i.e., integratable into a long-term series) quality and use is beyond me.
No matter how hard I try, I can only surmise it is the phenomenon exhibited during the Donald’s last foray into *democracy*
QAnon anyone?
The leaving of the senses behind when extreme partisanship takes over common sense.
“These maps enable you to view maps of monthly, seasonal and annual averages for the UK. The maps are based on the 1km resolution HadUK-Grid dataset derived from station data.
*Locations displayed in this map may not be those from which observations are made. Data will be displayed from the closest available climate station, which may be a short distance from the chosen location. We are working to improve the visualisation of data as part of this map.
Where stations are currently closed in this dataset, well-correlated observations from other nearby stations are used to help inform latest long-term average figures in order to preserve the long-term usability of the data. Similar peer-reviewed scientific methods are used by meteorological organisations around the world to maintain the continuity of long-term datasets.”
Word salad without one iota of evidence.
This just confirms my admonition that long term records are the primary reason for creating non-measured data.
Why are long term records more important than scientific rigor?
If other stations are “ok” to use to guess at a temperature, then there is absolutely no reason to keep making up data.
One of the real reasons is to keep the number of stations up so divide by “n” keeps uncertainty small. What a statistical joke.
Tell us if the average temperature is created from a collection of samples of a population or if the average temperature is created from a population. If there are thousands of samples, what is the size of each sample? If the average temperature is of a population, why divide by √n anyway?
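The √n question in the comment above can be made concrete. Under the textbook assumption of independent, identically distributed random errors in repeated readings of the same quantity, the spread of the mean does shrink as 1/√n; a systematic bias, such as a siting error, is untouched by averaging. A minimal sketch on invented numbers (no real station data involved):

```python
import numpy as np

rng = np.random.default_rng(1)

true_temp = 15.0  # hypothetical "true" value, degrees C

# Textbook case: n independent, identically distributed readings of the
# same quantity; the spread of their mean shrinks roughly as 1/sqrt(n).
spreads = {}
for n in (1, 100, 10000):
    means = [rng.normal(true_temp, 0.5, n).mean() for _ in range(2000)]
    spreads[n] = float(np.std(means))
    print(f"n = {n:5d}  spread of the mean = {spreads[n]:.3f}")

# A systematic (e.g. siting) bias is untouched by averaging, no matter
# how many readings are taken.
biased_mean = rng.normal(true_temp + 2.0, 0.5, 10000).mean()
print(f"mean with a +2.0 C systematic bias: {biased_mean:.2f}")
```

Which of the two regimes a real station network is in, independent random errors or shared systematic ones, is exactly what the siting debate is about; the 1/√n argument only applies to the first.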
And a trip in leftwing-marxist bozoland.
I would love to hear the scientific justification for “composite” measurements; fabricated out of thin air with no correlation whatsoever to actual measurements, no matter how supposedly educated the guess from sites located an arbitrary distance from the phantom location. What’s wrong with using just the actual measurements? Here’s what: the observational data from hundreds of stations doesn’t all cover the time period being used as the range, so merging the data to determine a global trend is impossible.
Apparently no one in the climate pseudoscience cabal has considered that generating long term “global” trends by combining station data from stations that don’t all have data for the entire time range is a really bad idea. Why? Either because they’re remarkably stupid, or because they can manipulate the made up data to say anything they want and make it seem plausible by claiming it correlates with “nearby” stations. Of course the whole exercise is crazy, but that’s the basis of all the climate pseudoscience: crazy. It’s not based on observational data, but on manipulations of observational data, or sometimes entirely fabricated from models of what they think CO2 and methane do to temperatures.
How about identifying trends for each station from its own data then analyzing the trends of all those individual stations to see if a regional or global pattern emerges and can be quantified? Nah. Too scientific. Gridding data spatially and temporally is so much more easily manipulated to produce the desired result.
There is only one reason for human-created non-measured data – long records. Why long records? So the possibility of spurious trends can be shrugged away.
If I would have “homogenized” financial data where I worked, I would probably be doing this from prison.
This is being done a lot. On X there are a multiplicity of folks doing exactly that. Guess what? Little to no growth in station after station all over the globe.
Using locally derived anomalies in a global average is a poor practice. When calculating uncertainties, relative values are used, basically percentage. Temperature anomaly averages should be done the same way. An anomaly of 1°C at 1°C is 100%. An anomaly of 1°C at 30°C is 3%. Why are these all weighted equally?
If something doesn’t exist, can you really hide it?
What branch of philosophy does that question belong in?
If you are happy with data from some nearby point, will you accept the electricity bill from a nearby hospital?
“How hard can it be to hide a non-existent station?”
That’s not what they are doing. They are hiding the old webpage that exposed them for using non-existent stations and are creating a new one.
Climate Alarmists can’t escape from the Wayback Machine.
So the next time the MET Office says it’s the “hottest day evah!” in the future, the logical question to ask is: How do *you* know? Based on what?
Measured or estimated?
That cover-up will succeed, I’m afraid.
Paul, I do not share your pessimism. So far since late August I have published over 90 weather station reviews and have another 300 to go. I am not giving up. Along the way I am exposing numerous Met Office failings, data manipulation and aspects that have tangible and immediate effects in all manner of strange areas on our society. The Daily Sceptic articles are derived from my research. Read this one as an example from which the above was drawn along with previous postings.
https://tallbloke.wordpress.com/2024/12/07/braemar-no-2-dcnn-1216-he-who-controls-the-past-controls-the-future-he-who-controls-the-present-controls-the-past/
It will not be easy but things will change.
I appreciate your optimism. 😊
I wonder what the personal training records look like for those whose job it is to make up the data? Surely there is formal training for this essential role.
At some point the historical adjustments required to make the past colder will become too silly to allow “this time” to be “the warmest time ever”.
As of January 1, 2024, more than 2.3 billion iPhones have been sold worldwide. Imagine if each device had a thermometer and a way to determine its location.
My comment makes less sense without connecting thought process. Maybe the MET office scheme of estimating data represents forward thinking that will allow chart makers to maintain an upward trendline by eliminating measurements? Also using 100ths of degrees would allow 100 upward steps to amount to only 1 degree, so who could argue that the number is inaccurate?
I’m sure lots of people wanted to have an iPhone in their hot little hands…
The people who run the MET office are so corrupt it’s not useful anymore. Not sure it can ever be fixed.
There is common wording over the years about inferring temperatures from other weather stations that are well correlated.
The big problem is that mathematical correlation, using well-known methods like Pearson or Spearman, does not produce an absolute value. In one field of science a correlation coefficient of (say) 0.9000 has a significance that is rather different in another field. In general, in earth science, “high” correlations are comparatively rare because the accuracy of measurements is low and the incidence of confounding variables is high. (The LIG thermometer in a Stevenson screen has turned out to be a cantankerous little beastie, compared say to a micrometer in metal machining).
It gets worse. We often calculate correlation coefficients between temperatures at 2 or more stations. With the options of observations from seconds apart to averaged annual temperatures to compare, it is no surprise that correlation coefficients vary widely with the sampling rate. Then, there is a detrending option to first remove seasonal trends when comparing daily, weekly, monthly observations.
There is no way to select the “best” sampling frequency or to detrend or not, based on calculated correlation coefficients. Current practice is near fraudulent when researchers simply eyeball a correlation coefficient and state a satisfactory “high” coefficient. There is no absolute high or low to guide acceptance.
Yet, this is precisely what is done by those who claim that (for a common example) there is high correlation between atmospheric CO2 and global temperature estimates. I have run many sets of numbers behind this seminal, foundational claim and find it riddled with uncertainties. As have others, who must marvel at the lack of hard science. Geoff S
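Geoff’s point about sampling rate and detrending is easy to demonstrate on synthetic data. In the sketch below (invented numbers only, no real stations), two hypothetical “stations” share a seasonal cycle but have independent day-to-day weather noise: the raw daily series correlate strongly because the shared cycle dominates, yet once that cycle is removed the correlation collapses, so the reported coefficient depends heavily on whether and how one detrends:

```python
import numpy as np

rng = np.random.default_rng(42)
days = np.arange(365 * 10)  # ten years of daily values

# Two hypothetical stations: a shared seasonal cycle of 10 C amplitude
# plus independent day-to-day noise (all numbers invented).
season = 10.0 * np.sin(2 * np.pi * days / 365.25)
station_a = season + rng.normal(0, 3, days.size)
station_b = season + rng.normal(0, 3, days.size)

def pearson(x, y):
    return np.corrcoef(x, y)[0, 1]

# Raw daily series: the shared seasonal cycle inflates the coefficient.
r_raw = pearson(station_a, station_b)

# After removing the seasonal cycle only independent noise remains,
# and the correlation collapses towards zero.
r_detrended = pearson(station_a - season, station_b - season)

print(f"Pearson r, raw daily series: {r_raw:.2f}")
print(f"Pearson r, deseasonalised:   {r_detrended:.2f}")
```

The same pair of stations can thus be reported as “well correlated” or as essentially uncorrelated depending only on the analyst’s preprocessing choices, which is precisely the lack of an absolute threshold Geoff describes.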
Climatology is a liberal art, not a quantitative physical science.
True words!
nothing to see here folks, move along…….
Who the hell do you actually think you are? If you have nothing to say then simply shut up.
Ray, I think he was referring to the scene from “Naked Gun”(?) where things are blowing up in the background and Leslie Nielsen is saying that.
I think this is another nice example of “denier on denier violence”.