A team of climatologists is studying how to minimize errors in observed climate trends

Experts in climatology from the Universitat Rovira i Virgili, the State Meteorology Agency and the University of Bonn (Germany) make headway in identifying the most reliable methods that help to correct these biases

UNIVERSITAT ROVIRA I VIRGILI

Research News

IMAGE: Javier Sigró and Manola Brunet. Credit: URV

The instrumental climate record is the cultural heritage of humankind, the result of the diligent work of many generations of people all over the world. However, changes in the way temperature is measured, as well as in the environment in which weather stations are located, can produce spurious trends. An international study carried out by researchers from the Universitat Rovira i Virgili (URV), the State Meteorology Agency (AEMET) and the University of Bonn (Germany) has succeeded in identifying the most reliable methods for correcting these trends. These “homogenization methods” are a key step in converting the enormous effort made by observers into reliable data about climate change. The results of this research, funded by the Spanish Ministry of Economy and Competitiveness, have been published in the Journal of Climate of the American Meteorological Society.

Climate observations can often be traced back more than a century, even before there were cars and electricity. These long periods of time mean that it is practically impossible to maintain the same measuring conditions over the years. The most common problem is the growth of cities around urban weather stations. We know that cities are getting warmer and warmer because of the thermal properties of urban surfaces and the reduction of evapotranspiration surfaces. To verify this, it is sufficient to compare urban stations with nearby rural stations. Although less known, similar problems are caused by the expansion of irrigated crops around observatories.
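
As a purely hypothetical illustration of that urban-rural comparison, the Python sketch below fabricates one rural and one urban series; the UHI growth rate and noise levels are invented for illustration, not taken from the study.

```python
# Toy urban-vs-rural comparison: an urban series with a growing UHI
# component shows a steeper trend than a nearby rural series.
import numpy as np

rng = np.random.default_rng(7)
months = np.arange(480)                      # 40 years of monthly data
rural = 12 + rng.normal(0, 0.5, 480)         # stable rural station
urban = rural + 0.002 * months               # creeping urban heat island

def trend_per_decade(y: np.ndarray) -> float:
    return float(np.polyfit(months, y, 1)[0] * 120)   # degrees per decade

print(f"rural: {trend_per_decade(rural):+.3f}  urban: {trend_per_decade(urban):+.3f}")
# urban exceeds rural by ~0.24 degrees per decade: the spurious UHI signal
```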

The other most common reason for biases in observed data is that weather stations have been relocated, often because of reorganisations of the observation networks. “A typical organisational change consisted of weather stations, which used to be in cities, being transferred to newly built airports which needed observations and predictions,” explains Victor Venema, a climatologist from Bonn and one of the authors of the study. “The weather station in Bonn used to be in a field in the village of Poppelsdorf, which is now a part of the city and, after it had been moved several times, it is now at the Cologne-Bonn airport,” he says.

As far as the robust estimation of global trends is concerned, the most important changes are technological ones, which are typically made simultaneously across an entire observation network. “At the moment we are in the middle of a period of generalised automation of the observation networks,” says Venema.

The computer programs that can be used for the automatic homogenisation of climate time series are the result of several years of development. They operate by comparing nearby stations and looking for changes that take place in only one of them, unlike genuine climate changes, which affect all of them.
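
To give a flavour of that neighbour-comparison principle, here is a minimal Python sketch; it is not the algorithm of any tested package (ACMANT, Climatol, MASH, PHA and RHtests are all far more sophisticated), and every name and number in it is an illustrative assumption.

```python
# Illustrative sketch of relative homogenization (not a production method):
# a spurious jump in one station stands out in its difference series
# against a neighbour, because the shared regional climate signal cancels.
import numpy as np

def detect_break(candidate: np.ndarray, neighbor: np.ndarray) -> int:
    """Crude changepoint scan: return the index splitting the difference
    series into two segments with the most dissimilar means."""
    diff = candidate - neighbor              # common climate signal cancels here
    best_idx, best_score = -1, 0.0
    for k in range(12, len(diff) - 12):      # skip the unstable edges
        left, right = diff[:k], diff[k:]
        pooled = np.sqrt((left.var() + right.var()) / 2) + 1e-12
        score = abs(left.mean() - right.mean()) / pooled
        if score > best_score:
            best_idx, best_score = k, score
    return best_idx

rng = np.random.default_rng(0)
climate = np.cumsum(rng.normal(0, 0.05, 600))    # regional signal, shared by both
neighbor = climate + rng.normal(0, 0.3, 600)     # local weather noise
candidate = climate + rng.normal(0, 0.3, 600)
candidate[350:] += 1.0                           # spurious jump, e.g. a relocation
print(detect_break(candidate, neighbor))         # ~350: finds the inserted break
```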

To examine these homogenisation methods, the research team built a test bed into which they incorporated a set of simulated data that reliably imitated the sets of observed climate data, including the biases mentioned above. The spurious changes in the simulated data are therefore known exactly, and it can be determined how well the various homogenisation methods correct them.
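
For readers who want the flavour of such a benchmark, the hypothetical sketch below builds a synthetic network and corrupts it with breaks at known positions; the break counts, jump sizes and noise levels are illustrative assumptions, not values from the study.

```python
# Illustrative benchmark construction: simulate a "true" network, then
# corrupt each station with step changes at known dates. Because the
# truth is known, a homogenization method's residual error is measurable.
import numpy as np

rng = np.random.default_rng(1)

def make_network(n_stations: int = 10, n_months: int = 1200):
    regional = np.cumsum(rng.normal(0, 0.03, n_months))       # shared climate signal
    truth = regional + rng.normal(0, 0.4, (n_stations, n_months))
    corrupted = truth.copy()
    breaks = []                                               # ground truth for scoring
    for s in range(n_stations):
        for t in rng.integers(60, n_months - 60, rng.poisson(2)):
            size = rng.normal(0, 0.8)                         # jump size in degrees
            corrupted[s, t:] += size                          # step change from t onward
            breaks.append((s, int(t), float(size)))
    return truth, corrupted, breaks

truth, corrupted, breaks = make_network()
print(f"{len(breaks)} inserted breaks; first: {breaks[0]}")
```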

The test data sets generated were more diverse than those of previous studies, just as real networks of stations are diverse because of differences in how they are used. The researchers reproduced networks with widely varying station densities, because in a dense network it is easier to identify a small spurious change at one station. The test data set used in this project was also much larger than in previous studies (a total of 1,900 weather stations were analysed), which enabled the scientists to accurately determine the differences between the main automatic homogenisation methods developed by research groups in Europe and America. Because of the large size of the test data set, only the automated homogenisation methods could be tested.

The research group discovered that it is much more difficult to improve the estimated mean climate signal of an observation network than to improve the accuracy of the time series of each individual station.

In the resulting classification, the methods of homogenisation proposed by URV and AEMET were better than the others. The method developed at the URV’s C3 Centre for Climate Change (Vila-seca, Tarragona) by the Hungarian climatologist Peter Domonkos proved to be the best at homogenising both the series from individual stations and the mean series from the regional network. The AEMET method, developed by the researcher José A. Guijarro, was very close behind.

The homogenisation method developed by the National Oceanic and Atmospheric Administration of the United States (NOAA) was best at detecting and minimising systematic errors in trends from many weather stations, especially when these biases were produced simultaneously and affected many stations on similar dates. This method was designed to homogenise data sets from stations the world over where the main concern is the reliable estimation of global trends.
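
A small toy demo of why simultaneous network-wide biases are the hard case for neighbour comparisons: if every station jumps on the same date, the jump cancels out of every difference series, so a purely relative method cannot see it without extra information. The numbers below are invented for illustration.

```python
# Toy demo: a break hitting every station at once cancels in the
# station-minus-neighbours difference series, so purely relative
# comparisons are blind to it without additional information.
import numpy as np

rng = np.random.default_rng(4)
n_stations, n_months, t_break = 5, 240, 120
network = rng.normal(0, 0.3, (n_stations, n_months))
network[:, t_break:] += 0.5                      # same jump, same date, all stations

diff = network[0] - network[1:].mean(axis=0)     # candidate minus neighbour mean
shift = diff[t_break:].mean() - diff[:t_break].mean()
print(f"shift visible in difference series: {shift:+.3f}")   # ~0: the jump cancels
```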

The results of this study have demonstrated the value of large test data sets. “It is another reason why automatic homogenisation methods are important: they can be tested more easily and this helps in their development,” explains Peter Domonkos, who started his career as a meteorological observer and is now writing a book on the homogenisation of climate time series.

“The study shows the importance of very dense station networks in making homogenisation methods more robust and efficient and, therefore, in calculating observed trends more accurately,” says the researcher Manola Brunet, director of the URV’s C3, visiting member of the Climate Research Unit of the University of East Anglia, Norwich, United Kingdom, and vice-president of the World Meteorological Organisation’s Commission for Weather, Climate, Water and Related Environmental Services & Applications.

“Unfortunately, much more climate data still has to be digitalised for even better homogenisation and quality control,” she concludes.

For his part, the researcher Javier Sigró, also from the C3, points out that homogenisation is often just the first step “that allows us to go to the archives and check what happened with those observations affected by spurious changes. Improving the methods of homogenisation means that we can do this much more efficiently.”

“The results of the project can help users to choose the method most suited to their needs and developers to improve their software because its strong and weak points are revealed. This will enable more improvement in the future,” says José A. Guijarro from the State Meteorology Agency of the Balearic Islands and co-author of the study.

Previous studies of a similar kind have shown that the homogenisation methods that were designed to detect multiple biases simultaneously were clearly better than those that identify artificial spurious changes one by one. “Curiously, our study did not confirm this. It may be more an issue of using methods that have been accurately fitted and tested,” says Victor Venema from the University of Bonn.

The experts are sure that the accuracy of the homogenisation methods will improve even more. “Nevertheless, we must not forget that climate observations that are spatially more dense and of high quality are the cornerstone of what we know about climate variability,” concludes Peter Domonkos.

###

89 Comments
Chris Nisbet
February 4, 2021 10:19 am

I think that what they’re saying is that we should expect more adjustments to the data in the near future.

Editor
Reply to  Chris Nisbet
February 4, 2021 11:30 am

Agreed, Chris. And as history has shown, the adjustments always increase the warming trend so that the data come closer to matching the models.

Regards,
Bob

PS: The adjustments have reached the point of silliness.

Vuk
Reply to  Chris Nisbet
February 4, 2021 12:27 pm

It looks like those are consuming too much food and personally contributing to the forthcoming climate change catastrophe.

n.n
Reply to  Vuk
February 4, 2021 1:28 pm

Vegetables. Vegans, including cows, are sources of methane, a first-order forcing of [catastrophic] [anthropogenic] global warming. Humans, without the ability to optimally process fibrous matter, are probably a greater source of this pollutant.

Zig Zag Wanderer
Reply to  n.n
February 4, 2021 2:44 pm

All animal life is completely ‘carbon’ neutral. It’s the way the carbon cycle works.

Peta of Newark
Reply to  Vuk
February 4, 2021 3:19 pm

beautiful

ResourceGuy
February 4, 2021 10:22 am

Do they possess a signed photo or autographed book by Michael Mann? We need to check that first.

Robert of Texas
February 4, 2021 10:23 am

“(1)The most common problem is the growth of cities around urban weather stations. We know that cities are getting warmer and warmer …(2) Although less known, similar problems are caused by the expansion of irrigated crops around observatories…(3)The other most common reason for biases in observed data is that weather stations have been relocated…”

1) If urban areas are growing around many nearby stations, how do you correct for that using homogenization?

2) I have yet to see an instance where irrigation is causing incorrect temperatures – temperature stations tend to be built around over time, not irrigated around over time. I wonder how big an issue this is?

3) The relocation of a station seems to be the easiest to correct for *if* the change was logged. More difficult I think is the changing of instruments which respond differently to a range of temperatures.

Most of this is easy to demonstrate using a set of well maintained rural stations – they will show that there is a large heating bias in urban temperature measurements. Harder is proving what causes them and in what proportion. Heat-sinks, wind barriers, albedo, loss of evaporation all play a part.

If you cannot quantify these effects, then homogenization is just a bunch of guesswork math for hiding the mess.

Peter W
Reply to  Robert of Texas
February 4, 2021 10:37 am

Also needed is an appropriate worldwide diversification of measuring stations, to compensate for more localized variations such as El Niños.

a_scientist
Reply to  Robert of Texas
February 4, 2021 11:15 am

I think we need a world wide system like the USCRN stations that need no adjustments or homogenization.

Pat from kerbob
Reply to  Robert of Texas
February 4, 2021 11:30 am

Irrigation, spraying water around, reduces temp by evaporation

meab
Reply to  Pat from kerbob
February 4, 2021 12:04 pm

It also increases overnight temps by increasing humidity.

MarkMcd
Reply to  Pat from kerbob
February 4, 2021 4:06 pm

That might be an interesting experiment. Because water vapour, unlike CO2, is a heat trap. It actually IS a GHG.

So would irrigation actually reduce temps or would it warm the surrounding air as it traps IR for a while?

MJB
Reply to  Robert of Texas
February 4, 2021 12:02 pm

To your second point, there’s been a fair bit written/studied regarding the effect of irrigation on temperatures. WUWT covered it here, for example, in 2010:

https://wattsupwiththat.com/2010/09/12/christy-on-irrigation-and-regional-temperature-effects/

RelPerm
Reply to  MJB
February 4, 2021 8:38 pm

Interesting case study for California. Is the Phoenix, Arizona area also an example of irrigation causing nighttime warming?

MarkMcd
Reply to  Robert of Texas
February 4, 2021 4:02 pm

3) The relocation of a station seems to be the easiest to correct for *if* the change was logged. More difficult I think is the changing of instruments which respond differently to a range of temperatures.

Logging isn’t the only issue. They also need to run instrument changes side by side for a period of time to show exactly HOW the new instruments compare. Only then can they ‘adjust’ the new data.

Unlike the BoM in Australia, which didn’t bother running parallel measurements and then just adjusted the old data. They also went from standard measurements over a decent interval to new ‘instantaneous’ max/min temps – a passing truck (or plane at airports) could register a new max temp 10° above the actual max temp for the day and it gets counted.

Krishna Gans
February 4, 2021 10:26 am

Of course, there must be a reason to homogenise data; it can’t be that natural causes lead to different data even in a dense network and that the measured temps are, for example, correct.
So if they test, they have to look at every station and search for the reasons for the “divergence”.
Airport data have to be eliminated for reasons.

Reply to  Krishna Gans
February 4, 2021 11:10 am

https://pbs.twimg.com/card_img/1357316433361969162/iiZyy7WF?format=png&name=small
Example

A cold front blowing east to west for several days: will the northern stations be homogenised to warm, or the southern ones to cold?

Tim Gorman
February 4, 2021 10:28 am

“a set of simulated data that reliably imitated the sets of observed climate data”

So they generated a data set that confirmed what they wanted? Unbelievable.

paranoid goy
Reply to  Tim Gorman
February 4, 2021 11:25 pm

The headline says it all: Minimizing errors in OBSERVATION. How strongly do you feel about ‘correcting’ the readings you took off a, say, thermometer? Whereas a 2020 thermometer might be more precise than one from 1920, it is not more accurate. Do these twerps know the difference? This is pure sciencery.

Jim Gorman
Reply to  paranoid goy
February 6, 2021 6:02 am

I’m beginning to think none of these “climate scientists” have ever taken a lab class that required any knowledge of uncertainty, precision, accuracy, or significant digits. I am also sure none of them have a clue what the GUM is or what it is for. I know none of them have any training in metrology like normal engineers and physical scientists.

Reply to  Jim Gorman
February 9, 2021 12:34 pm

Here in Spain, weather observatories measure precipitation with an error range of ±0.3 mm, but report it with two digits after the decimal point. Nuff said.

Garland Lowe
February 4, 2021 10:35 am

automatic homogenisation = computer programmers will determine the observed temperature.

Krishna Gans
Reply to  Garland Lowe
February 4, 2021 1:03 pm

GIGO

another ian
Reply to  Krishna Gans
February 4, 2021 1:28 pm

They use super computers so

GI S(uper) GO

Jeff Alberts
Reply to  Garland Lowe
February 4, 2021 3:44 pm

If by homogenization they mean averaging everything together to present a single number for “global temperature”, they need to just stop it. It’s completely meaningless. Have any of these “scientists” ever heard of Intensive Properties?

Reply to  Jeff Alberts
February 4, 2021 11:34 pm

Intensive properties? Never heard of that. Watching a breeze blow over a high cliff, forming a small cloud that rains down on half a cornfield while the rest of the landscape stands in baking sunlight? A common experience when working highsites and radio masts, where you spend lots of time watching cloud formations. A skill you by necessity acquire after once not climbing down fast enough before lightning strikes the tower.

Graemethecat
Reply to  Jeff Alberts
February 5, 2021 10:05 am

I find it incredible that so-called climate “scientists” are unaware of this.

Averaging intensive properties is about as meaningful as averaging telephone numbers.

Ron Long
February 4, 2021 10:42 am

This at first appears to be a bunch of nonsense. The only way to view a long-term temperature record, to utilize it for a climate change comment, is to segregate out all weather stations that have moved, are engulfed by UHI effects, or are in dramatic conflict with nearby rural stations. If you take a glass of milk (which is essentially white) and pour in some chocolate (which is a strong brown color) and homogenize it by stirring, it is something other than either starting product. Same with long-term weather/temperature data – these two researchers are not going to come up with the right answer unless they filter out the problem data, not homogenize it.

DonK31
Reply to  Ron Long
February 4, 2021 11:11 am

In the US, it’s called USCRN. The reason why those stations are sited where they are is to avoid contamination by UHI, roads, airports, or the need to re-site to avoid those problems.

Pat from kerbob
Reply to  DonK31
February 4, 2021 11:28 am

I think it has been documented here how bad that siting is

Paul Johnson
Reply to  DonK31
February 4, 2021 11:40 am

Unfortunately, the uncontaminated data USCRN provided was incompatible with Obama policy, so the system has not been expanded since 2008.

Clyde Spencer
Reply to  Ron Long
February 4, 2021 12:30 pm

It strikes me that the process being described is one of looking for outliers and replacing them with the average (weighted?) of surrounding stations. There are some purists who maintain that outliers in a data set should never be deleted or replaced unless it can be demonstrated that there is a measurement error and the cause can be identified and quantified. Some moderates recommend deletion and/or replacement if the apparent outlier is more than 2 or 3 standard deviations from the mean, even if the source of the error is not identified. Either approach tends to work well for invariant properties such as the bulk density of a sample or multiple measurements of the diameter of a ball bearing.
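
For what it is worth, a bare-bones sketch of the k-standard-deviation screening described above might look as follows; real quality-control procedures are considerably more careful, and all values here are illustrative.

```python
# Bare-bones k-sigma screening: flag a station value as an outlier when
# it lies more than k standard deviations from the mean of its
# neighbours at the same time step. Illustrative only.
import numpy as np

def flag_outliers(station: np.ndarray, neighbors: np.ndarray, k: float = 4.0) -> np.ndarray:
    """neighbors has shape (n_neighbors, n_times); returns a boolean mask."""
    mu = neighbors.mean(axis=0)
    sigma = neighbors.std(axis=0) + 1e-12        # avoid division by zero
    return np.abs(station - mu) > k * sigma

rng = np.random.default_rng(2)
neighbors = rng.normal(15.0, 1.0, (8, 100))
station = rng.normal(15.0, 1.0, 100)
station[40] += 6.0                               # an error? or a real microclimate?
print(np.where(flag_outliers(station, neighbors))[0])   # -> [40]
```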

However, when dealing with temperatures, which vary with elevation, transient cloud cover, windiness, local rain storms, the position of cold fronts, etc., there is a risk that removing data actually distorts the calculated mean. It is generally acknowledged that there are unique microclimates around Earth. They are more likely to be outliers. Thus, homogenization may suppress the contribution from these areas.

Homogenization is questionable as a data-processing practice, and provides an opportunity for mischief. In any event, the original data always should be archived, which doesn’t seem to be the case.

It is surprising that for something as important as climate, which bears on numerous social questions, there is so little rigor in the mathematical processing of the data. The classic example is the ‘novel’ treatment used by Mann to create his ‘hockey stick,’ which will generate similar forms from noise.

Reply to  Clyde Spencer
February 4, 2021 1:08 pm

Every outlier has to be proven to be an error before being homogenised.

observa
Reply to  Ron Long
February 5, 2021 12:06 am

You’re too polite with the chocolate milk analogy. I consume a lot of outlier foods so would they like the outcome of my homogenization? Visit the sewage works if they’re interested in averages.

Mark Pawelek
February 4, 2021 10:44 am

“The homogenisation method developed by the National Oceanic and Atmospheric Administration of the United States (NOAA) was best at detecting and minimising systematic errors in trends from many weather stations, especially when these biases were produced simultaneously and affected many stations on similar dates.”

I can’t understand the concept of “systematic errors in trends from many weather stations”, especially when “produced simultaneously”. Don’t these stations use modern scientific instruments to measure things? If there’s a possibility of error, shouldn’t they use duplicate, or triplicate instruments? The year is 2020, not 1820; we’ve learnt a bit about instrumentation over the past 4 centuries of empirically-based science. Why do I get the impression NASA are so very keen to make massive alterations to raw data?

Retired_Engineer_Jim
Reply to  Mark Pawelek
February 4, 2021 11:11 am

I believe that they are trying to develop a homogenization method to handle the historical data (from non-automated reporting stations) as well as current data (from automated reporting stations).

They have identified reasons for “bad” data, so why not just eliminate data from urban and airport reporting stations, to start, and see if they even need homogenization?

Retired_Engineer_Jim
Reply to  Retired_Engineer_Jim
February 4, 2021 11:12 am

Sorry, this wasn’t a comment to Mark Pawelek’s comment – I got my indentations wrong. Shame on me.

Jim Gorman
Reply to  Mark Pawelek
February 4, 2021 1:07 pm

These people need to go back to school and learn some metrology and time series trending. First, “systematic errors in trends” makes no sense whatsoever. Please, someone send them a copy of the GUM. Systematic errors are biases IN A GIVEN DEVICE, not in a trend. It mainly affects absolute temperatures, not anomalies. So much of climate work today is done with anomalies that it makes even less sense, unless you are trying to find a “perceived” trend in an anomaly.

These folks don’t seem to have a clue about errors both random and systematic or uncertainty. I don’t know how they are going to make scientific judgements about these issues.

Trending is another facet I’m not sure they understand. One of the big issues is the length of time for individual stations. When you have stations being moved, changed, added, removed, etc., you really hose up trends. The best example is stock market indexes. They pick companies that are well capitalized, well run, and appear viable for the long-term future. Why? Can you imagine trying to develop a trend for an index portfolio like the Dow Jones or NASDAQ that has companies disappear on a regular basis and be replaced by companies whose stock prices and associated growth rates are totally different? A trend of the “index” would mean nothing.

Temperature trends are no different. You need long term stability. You need the various stations to have consistent means and similar variances.

My recommendation would be to get totally away from a Global Average Temperature. Its usefulness is only for propaganda and not for anything else. Local and regional trends make more sense for everything that needs to be done. If you have 7 continents all with 1 degree per decade increase in temperature you should have one solution. If you have SH with cooling and the NH with warming you probably need an entirely different solution.

Mr.
Reply to  Jim Gorman
February 4, 2021 2:30 pm

How rational of you Jim.

(don’t bother checking your inbox for that invitation to address the upcoming CoP in Glasgow though. If you do get one, it will just be so the attendees can put you in stocks and belt you with hockey sticks)

fred250
Reply to  Jim Gorman
February 4, 2021 7:26 pm

“I don’t know how they are going to make scientific judgements about these issues.”

.

And what makes you think they have any intention whatsoever, to do that ?

writing observer
Reply to  Jim Gorman
February 4, 2021 10:00 pm

Hate to break it to you, but the DJI does just that – companies are removed and added on a regular basis. NASDAQ is a somewhat better measure of the overall market – but not just the stock market; it includes non-stock things such as American Depository Receipts (ADRs), Real Estate Investment Trusts (REITs), and limited partnership interests.

That being said, the analogy doesn’t work anyway. None of the indexes are intended to measure the trend of the market value – they measure the trend of the market *perception*.

fred250
Reply to  writing observer
February 5, 2021 2:40 am

And they don’t go around “ADJUSTING” past data.

That would land them in JAIL for FRAUD.!!

Jim Gorman
Reply to  writing observer
February 5, 2021 9:44 am

I never said anything about the “market”, only the specific index. You simply cannot get an accurate trend on an index fund with companies being added and removed willy-nilly. An accurate trend requires stability and stationarity in the individual parts. Just imagine what a trend would do if an index fund had added GameStop when it reached 500 and taken it out when it fell below 100. It distorts trends, like it or not.

Climate believer
February 4, 2021 10:47 am

My money’s on they’ll find it much worse than they thought… just a hunch.

Pat from kerbob
February 4, 2021 11:26 am

They are looking to the NOAA algorithms as the gold standard and, as has been shown in many recent posts, that method increases readings in lockstep with CO2.

So they are admitting to fraud in advance?

Paul Johnson
Reply to  Pat from kerbob
February 4, 2021 11:31 am

Is this the excuse NOAA needs to terminate its project on the Urban Heat Island effect on existing stations?

Dave Fair
Reply to  Pat from kerbob
February 4, 2021 1:09 pm

This is very confusing. In the discussion, they say their individual-station approach is superior to NOAA’s blanket approach, giving more accurate results. CliSci gobbly-gook.

Pat from kerbob
February 4, 2021 11:31 am

Roy Spencer recent post showed that the increase in population is directly related to population growth near stations

NOAA seems to be adjusting opposite to that
Over time all adjustments should be linear but down not up?

Joel O'Bryan
Reply to  Pat from kerbob
February 4, 2021 12:28 pm

edit is your friend.

fred250
Reply to  Pat from kerbob
February 4, 2021 9:09 pm

“the increase in population is directly related to population growth”

.

100% correlation in fact !! 🙂 R² = 1…..

….. just like temperature adjustments vs CO₂

Steve Richards
February 4, 2021 11:32 am

Here is the abstract for the article:
“The aim of time series homogenization is to remove non-climatic effects, such as changes in station location, instrumentation, observation practices, etc., from observed data. Statistical homogenization usually reduces the non-climatic effects, but does not remove them completely. In the Spanish MULTITEST project, the efficiencies of automatic homogenization methods were tested on large benchmark datasets of a wide range of statistical properties. In this study, test results for 9 versions, based on 5 homogenization methods (ACMANT, Climatol, MASH, PHA and RHtests) are presented and evaluated. The tests were executed with 12 synthetic/surrogate monthly temperature test datasets containing 100 to 500 networks with 5 to 40 time series in each. Residual centred root mean square errors and residual trend biases were calculated both for individual station series and for network mean series.
The results show that a larger fraction of the non-climatic biases can be removed from station series than from network-mean series. The largest error reduction is found for the long-term linear trends of individual time series in datasets with a high signal-to-noise ratio (SNR), there the mean residual error is only 14 – 36% of the raw data error. When the SNR is low, most of the results still indicate error reductions, although with smaller ratios than for large SNR. Generally, ACMANT gave the most accurate homogenization results. In the accuracy of individual time series ACMANT is closely followed by Climatol, while for the accurate calculation of mean climatic trends over large geographical regions both PHA and ACMANT are recommended.”
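
For concreteness, the two scores the abstract names could be sketched roughly as below; this is a simplified reading of “residual centred root mean square error” and “residual trend bias” (homogenised output compared against the known truth of a benchmark), not the paper’s exact definitions.

```python
# Sketch of two benchmark scores: centred RMSE (error after removing each
# series' own mean) and trend bias (difference in fitted linear trends),
# both computed between homogenized output and the known true series.
import numpy as np

def centred_rmse(homog: np.ndarray, truth: np.ndarray) -> float:
    resid = (homog - homog.mean()) - (truth - truth.mean())
    return float(np.sqrt((resid ** 2).mean()))

def trend_bias(homog: np.ndarray, truth: np.ndarray) -> float:
    t = np.arange(len(truth))
    return float(np.polyfit(t, homog, 1)[0] - np.polyfit(t, truth, 1)[0])

rng = np.random.default_rng(3)
truth = np.linspace(0.0, 1.0, 1200) + rng.normal(0, 0.3, 1200)  # monthly, 100 years
homog = truth + 0.2                    # leftover constant offset (no centred error)
homog[600:] += 0.1                     # leftover mid-series step (biases the trend)
print(round(centred_rmse(homog, truth), 3))          # ~0.05
print(round(trend_bias(homog, truth) * 120, 4))      # ~0.015 degrees per decade
```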

It appears they are *JUST* worried about how to determine if your method of homogenization has produced the least errors. As such, sounds good.

Shame that the paper is still not available.

Jim Gorman
Reply to  Steve Richards
February 4, 2021 3:47 pm

I’ve heard this argument from folks on Twitter. Signal to Noise is a totally different thing than what these folks are talking about. First, noise is an extraneous signal that IS NOT part of the information you are dealing with. What these folks are calling noise is the VARIANCE of the real signal. It is not noise, IT IS THE SIGNAL ITSELF. They consider anything that “hides” the mean value as noise. Consequently, sampling, averaging, and smoothing are employed to obtain a simple, unique graph that has little variance. This is simply misinformation and outright torture of data to get something that allows you to portray your massaged data as the truth. It is not truth.

commieBob
February 4, 2021 11:32 am

Homogenization is a four letter word.

Solar Mutant Ninjaneer
Reply to  commieBob
February 4, 2021 12:13 pm

If you substitute the word “falsification” for “homogenization”, this article makes sense.

D. J. Hawkins
Reply to  commieBob
February 4, 2021 1:46 pm

Homogenization is the attempt to determine, after you’ve dropped a turd in a gallon of ice cream, how much more ice cream you have to add before the mixture is edible.

fred250
Reply to  D. J. Hawkins
February 4, 2021 9:11 pm

Yet the AGW apostles just lap it up !

Graemethecat
Reply to  D. J. Hawkins
February 5, 2021 10:10 am

Thanks. I was eating ice cream when I read that. Your analogy just spoiled my supper.

D. J. Hawkins
Reply to  Graemethecat
February 5, 2021 1:18 pm

Sorry about that. Next time I’ll add a “Revolting Analogy” warning to any posts that need them.

Jean Parisot
February 4, 2021 11:56 am

“Nevertheless, we must not forget that climate observations that are spatially more dense and of high quality are the cornerstone of what we know about climate variability,”

Why would density matter? Wouldn’t wider distribution into the unmeasured environment improve our understanding?

Smart Rock
February 4, 2021 12:10 pm

I don’t trust them at all. Especially when one of them is visiting at the Heart of Darkness CRU.

It looks (to this jaded and suspicious commenter) as if now, as CMIP6 models are being published with ever-higher warming rates, the “instrumental record” needs some serious upward adjustment to avoid deviating too far from the model outputs and making the models look bad. That, of course, will be in addition to the upward adjustments already made to post-1960 records (plus the downward adjustments of pre-1960 data). What better way to do this than homogenization, where we never get to see the actual code. And they make it all sound so reasonable, so objective and so impartial.

No, I don’t trust them. They need those UHI-influenced records to make sure that homogenization works in the right direction. We can’t have the warming rate start to decline!

G Mawer
February 4, 2021 12:53 pm

Decades ago I was an engineering student. Unfortunately I did not finish the degree but I remember one thing from day one of mechanical drawing = KISS.

Keep It Simple Stupid 🙂

If the goal is to determine if the global temperature is changing the KISS method should be to look at the globe as a whole, not as a collection of locations seeing varying conditions.

When I look at the “Blue Marble” I realize that all of those variables exist on the planet simultaneously. The big ones average out. The sun shines on half of the planet at a time, ALWAYS, moment by moment. When it is summer in the north it is winter in the south. There are always storms on the planet and there are always clear skies somewhere. All transitions occur simultaneously at any and all times!

So, what if we pick a time, a moment, and take a snap shot of the “globe” by recording ALL stations at once. Take it for what it is, plot it over time and see what change shows up. No adjustments or homogenization method needed!

For best accuracy I would pick 2 times a day. One snap shot taken when the continents in the east face the sun. The second snap shot when the continents in the west face the sun.

Too much KISSing ??

HenryP
February 4, 2021 1:07 pm

https://breadonthewater.co.za/2021/01/26/am-i-a-climate-denier-denialist/

Yes.

There is a trend. It flies in the face of AGW

Reply to  HenryP
February 4, 2021 3:23 pm

How often will you post that?

Vuk
February 4, 2021 1:21 pm

James Hansen has a go at the British PM Boris Johnson
“It would be easy to achieve this latter ignominy and humiliation – just continue with the plan to open a new coal mine in Cumbria in contemptuous disregard of the future of young people and nature.”
https://www.bbc.co.uk/news/science-environment-55923731

Chris Hanley
February 4, 2021 1:56 pm

Berkeley Earth apply their magic wand or ‘regional expectation’ logarithmic that miraculously converts falling or zero long-term trends to rising trends that approximate their Global Land trend.
There seems a certain circularity in the process.
Three long-term stations around here, dating back to the 1880s and at the time very isolated, come to mind: Cape Otway Lighthouse, Cape Schanck Lighthouse, Wilsons Promontory Lighthouse.

Chris Hanley
Reply to  Chris Hanley
February 4, 2021 2:38 pm

algorithm, I must have been operating on automatic pilot.

Mr.
February 4, 2021 2:06 pm

So, to borrow from Groucho –

“these are my thermometer readings, but if you don’t like them, I have others . . .”

Peta of Newark
February 4, 2021 3:42 pm

Let’s liken a proper Temperature Record to something very complex and fragile that takes a lot of painstaking work to assemble.

A Faberge Egg perhaps

What has happened over the decades, is that couldn’t-give-a-toss incompetents, looking to serve their own ends (e.g. airport users) got involved in its construction and maintenance.

As far as The Egg goes, it has been dropped from a great height onto a concrete road then repeatedly run over by a fleet of 45 tonne artics.

While no record of its construction was kept, no photos or drawings, no nuffink.
All there is, are the broken and scattered bits.

Thus it gets equally very sad but also very worrying, for us all.
Because these folks here, dazzled by their own brilliance, buoyed up by good intentions and political correctness, actually do believe they can reconstruct The Egg.

Without a plan or blueprint of its construction or even a pencil sketch drawing of what it looked like.
Thus, it can only ever come out as exactly what they think it should look like.
That they don’t realise that, yet dig themselves ever deeper into a hole by imagining that A Computer will guide them.
That’s the sad bit

The worrying bit is ‘How many more of these folks are out there’
What if, me or you finished up in a Court-of-Law and these, dazzled by themselves folks, were not only the prosecution but also the judge and jury.

What kinda trial would you get?

It gets ever more worser than a really worser thing because, your defence (contrarians, skeptics, deneirs) in your trial, start your defence by agreeing with the prosecution – that CO2 IS a Green Gas Gaga House Gas.

Your own defence says that, in some ‘Natural Variation’ or other unknown way, you are Guilty As Sin!
w.t.f.

Just that alone will have the prosecution reaching for the shampers & looking forward to a big victory celebration party – even before the trial really started

Vuk nailed it here, Mr Trump knew/knows it, folks who managed all 397 days of Dry January know it and thus, know how to fix it………….

Mr. Lee
February 4, 2021 4:22 pm

In other words, sites like this are beginning to give “homogenization” a bad reputation, so this is damage control to prop up the fraud by giving it a veneer of scientific rigor.

February 4, 2021 5:14 pm

I did an analysis of the problems with conventional homogenization in the essay “When Data Isn’t” in the ebook Blowing Smoke, published late 2014. I also contributed a guest post here a few years back using the CRN1 Surface Station sites.
Bottom line of that long ago guest post here was:
Homogenization V2 does a reasonable job for good urban stations
Does a mediocre job for good suburban stations
Does a horrible job for rural pristine stations—9 of 10 were warmed.

So this article is NOT new news.

Richard M
February 4, 2021 5:26 pm

Such a waste of time. You don’t need surface stations going forward to determine the temperature of the planet. We have satellites which agree almost perfectly with the ocean measurements that are most reliable. Between the two we have all that we need for climate studies.

https://woodfortrees.org/plot/uah6/from:1979/to/plot/uah6/from:1979/to/trend/plot/hadsst3gl/from:1979/to/offset:-0.2/plot/hadsst3gl/from:1979/to/offset:-0.2/trend

Geoff Sherrington
February 4, 2021 7:05 pm

Question. How can I measure the accuracy of my adjustments of historical data?
Answer: There is absolutely no way to measure your accuracy.
Reason. You cannot re-create the original measurement conditions. You have no reference point that allows you to say that your adjustment is too high or too low, negative or positive.
Future. The most you can do is to reject past observations that have reason to be rejected.
Homogenization in the now-popular way has no firm basis in science. It is little more than contrived guesswork.
Geoff S

Joel O'Bryan
February 4, 2021 7:29 pm

A urologist, a proctologist, and climatologist walked into a bar…
They shuffle over to bar and sit down, while making some small talk among themselves.
Barkeep comes up and asks, “What’ll it be gents?”

Urologist says, “Gimme a pint o’ your cheapest beer.”
Barkeep looks quizzical, but goes to the tap and fills him a glass of Coors Light and slides the glass of bubbly, pale yellow liquid to him.

The Urologist takes the glass of yellow liquid and just up-ends it to his lips, emptying the entire pint in one long chug, finishing with a BELCH and saying, “Ah!!!.. Thanks! I just wanted to be reminded that just because it looks like piss, sometimes it isn’t.”

Barkeep gives an amused look, but quickly moves his gaze to the Proctologist and asks, “And you sir?”

Proctologist says, “Gimme a Mud-slide, and make it with your best vodka, please.”
Barkeep gives that quizzical look again and goes to make the MudSlide cocktail as ordered. He digs around the cabinet to find the Grey Goose vodka, finishes making the cocktail with a garnish of chocolate shavings, and slides it over to the Proctologist.

The Proctologist takes the stemmed glass of dark brown liquid and proceeds to upend the contents onto his lips. As soon as he finishes, he proclaims to everyone in the bar, “Ah, something that looks like diarrhea, but tastes great, and made with the best ingredients.”

Barkeep gives an amused look but, as before, quickly moves his gaze to the Climatologist and asks, “And finally you sir? What be your pleasure?”

The Climatologist says, “Gimme a glass of real piss and blend it in your blender with a pile of fresh dog poo from your dog. I’ll pay you a thousand dollars.”

Barkeep looks on aghast at the crude, disgusting request, then composes himself and matter-of-factly states, “Sir, I cannot do that. Serving that would violate my county health license, endanger your health, defile my bar, and it’s just morally wrong!”

The Climatologist looks around, quite confused, and then says, ….

pochas94
February 4, 2021 8:46 pm

Climate is the Goose that lays the Golden Eggs. Keeping it alive depends on their ability to tell convincing fairy tales.

Patrick MJD
February 4, 2021 8:51 pm

They are German, so their “results” will be “correct”.

observa
February 4, 2021 11:44 pm

Personally I find this settled sciencey stuff all a bit unsettling-
Scientists Discover an Immense, Unknown Hydrocarbon Cycle Hiding in The Oceans (msn.com)

February 5, 2021 1:10 am

Altering recorded results, in any way, is not Science, it is advocacy. Wow….

RickWill
February 5, 2021 2:02 am

If they eliminate measurement error then they eliminate Climate Change.

Earth’s surface temperature is thermostatically controlled. The only thing that changes is the way the temperature is measured.

All the Climate Change in Australia can be sheeted home to instrument changes, instrument moves, bigger aircraft and more frequent air services, population density, commercial/industrial/domestic energy intensity, data homogenisation, and certainly other factors.

Mickey Reno
February 5, 2021 5:24 am

If I could be so bold as to paraphrase a Dilbert cartoon: Dogbert was hired as an efficiency expert to find all the unproductive University of East Anglia climate scientists who could be laid off as a cost saving measure. Some time later, one of the scientists worried about losing his job asked Dogbert if he had finished his investigation, and if so, could he see a copy of Dogbert’s final report. “But this report is just a copy of the UEA faculty directory,” complained the scientist, to which Dogbert answered, “Yes, finding that saved me a lot of time.”

S. K. Dodsland
February 5, 2021 7:25 am

Is this just another ploy to further the man-caused climate change agenda?

“The homogenisation method developed by the National Oceanic and Atmospheric Administration of the United States (NOAA) was best at detecting and minimising systematic errors in trends from many weather stations, especially when these biases were produced simultaneously and affected many stations on similar dates. This method was designed to homogenise data sets from stations the world over where the main concern is the reliable estimation of global trends.”

Tony Heller of realclimatescience.com has documented how NASA/NOAA alter data and cannot be trusted.

https://realclimatescience.com/2020/10/new-video-alterations-to-the-us-temperature-record-part-one/
https://realclimatescience.com/2020/10/new-video-alterations-to-the-us-temperature-record-part-2/
https://newtube.app/user/TonyHeller/xEyXN2e (temperature alterations part 3)
https://realclimatescience.com/2020/10/new-video-how-the-us-temperature-record-is-being-altered-part-3/

Gerald Machnee
February 5, 2021 8:38 am

***The homogenisation method developed by the National Oceanic and Atmospheric Administration of the United States (NOAA) was best at detecting and minimising systematic errors in trends from many weather stations, especially when these biases were produced simultaneously and affected many stations on similar dates. This method was designed to homogenise data sets from stations the world over where the main concern is the reliable estimation of global trends.***

Translation: The 1930s were biased high simultaneously, so they had to be homogenized downwards. Today there are simultaneous cold biases, so they are homogenized upward using the scientific CO2 increase.

Ferdberple
February 5, 2021 8:39 am

You cannot correct a station using average data from surrounding stations, because weather is not linear, it is chaotic.

The underlying assumption is wrong. Garbage in garbage out.

S. K. Dodsland
February 5, 2021 8:50 am

Why did you eliminate my post?

S. K. Dodsland
Reply to  S. K. Dodsland
February 5, 2021 10:36 pm

Thank you for posting my comment and for having objectivity. There are at least 2 sides to every theory.

Paul Penrose
February 5, 2021 9:08 am

So, when measuring a property of a large object, higher spatial density (resolution) of measurements is better than lower – wow, what a groundbreaking conclusion.

accordionsrule
February 5, 2021 11:11 am

In homogenization, UHI-affected stations will always win. Being in the largest metropolitan areas and airports, they are older, more continuous, better maintained, and spatially more dense. They are considered “high quality,” therefore are the “cornerstone” and are used to infill the other stations. Their steady incline is smooth, not sharp, thus the warming trend is deemed not “spurious.”
From what I’ve read, adjusting for UHI effects is the last step. Since all the previous steps served to lift nearby, possibly more pristine stations into conformity with the “cornerstone” stations, there is only a tiny UHI signal left to adjust for.

Tom Abbott
February 5, 2021 5:48 pm

From the article: “These “homogenization methods” are a key step in converting the enormous effort made by observers into reliable data about climate change.”

There they go implying that temperature readings from the past are not reliable.

First of all, the temperature readings from the past are the only readings we have, and second of all, putting the readings through a computer is not going to improve their accuracy.

The original handwritten temperature readings are the readings we should use to plan our future. These temperature readings tell us we have nothing to fear from CO2.

That’s why alarmists want to manipulate the data, because it doesn’t fit their narrative in its native form, so they put it through their computers and make it fit their narrative.

Stick to the actual temperature readings if you want to dwell in reality.

S. K. Dodsland
February 5, 2021 10:19 pm

Is this what is considered homogenization?

“If the corrections fixed known problems in the instruments, that would help accuracy. But they are statistical. They make the station measurements smoother when mapped and they smooth over discontinuities. In my opinion, NOAA has overdone it. TOB, PHA, infilling and gridding are overkill. This is easily seen in Figure 7 and by comparing Figure 3 to Figure 6 or Figure 5. Does the final trend in Figure 3 more closely resemble the measurements (Figure 6) or the net corrections in Figure 5? The century slope of the data is 0.25°, the corrections add 0.35° to this and the “climatological gridding algorithm” adds 0.9°! It is worth saying again, the type of statistical operations we are discussing do nothing to improve the accuracy of the National Temperature Index, and they probably reduce it.”

https://wattsupwiththat.com/2020/11/24/the-u-s-national-temperature-index-is-it-based-on-data-or-corrections/
