A team of climatologists is studying how to minimize errors in observed climate trends

Climatologists from the Universitat Rovira i Virgili, the State Meteorology Agency and the University of Bonn (Germany) make headway in identifying the most reliable methods for correcting such biases

UNIVERSITAT ROVIRA I VIRGILI

Research News

IMAGE: Javier Sigró and Manola Brunet. Credit: URV

The instrumental climate record is the cultural heritage of humankind, the result of the diligent work of many generations of people all over the world. However, changes in the way temperature is measured, as well as in the environment in which weather stations are located, can produce spurious trends. An international study carried out by researchers from the Universitat Rovira i Virgili (URV), the State Meteorology Agency (AEMET) and the University of Bonn (Germany) has succeeded in identifying the most reliable methods for correcting these trends. These “homogenisation methods” are a key step in converting the enormous effort made by observers into reliable data about climate change. The results of this research, funded by the Spanish Ministry of Economy and Competitiveness, have been published in the Journal of Climate of the American Meteorological Society.

Climate observations can often be traced back more than a century, even before there were cars and electricity. These long periods of time mean that it is practically impossible to maintain the same measuring conditions over the years. The most common problem is the growth of cities around urban weather stations. We know that cities are getting warmer and warmer because of the thermal properties of urban surfaces and the reduction of evapotranspiration surfaces. To verify this, it is sufficient to compare urban stations with nearby rural stations. Although less well known, similar problems are caused by the expansion of irrigated crops around observatories.
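The urban–rural comparison described above can be sketched in a few lines. The following is a toy illustration, not the study's method: the station series, the size of the urban heat island trend and the noise levels are all invented for the example. It fits linear trends to a synthetic rural and a synthetic urban series; the excess urban slope is the spurious warming an urban heat island would add to the record.

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1950, 2021)
regional = 0.01 * (years - 1950)  # assumed 0.01 °C/yr of real regional warming

# Rural station: regional signal plus measurement noise.
rural = regional + rng.normal(0, 0.2, years.size)
# Urban station: the same signal plus an extra (spurious) trend as the city grows.
urban = regional + 0.015 * (years - 1950) + rng.normal(0, 0.2, years.size)

# np.polyfit returns [slope, intercept] for degree 1.
rural_trend = np.polyfit(years, rural, 1)[0]
urban_trend = np.polyfit(years, urban, 1)[0]
print(f"rural: {rural_trend:.3f} °C/yr, urban: {urban_trend:.3f} °C/yr")
```

The difference between the two fitted slopes is an estimate of the urban heat island contribution, which is exactly the kind of non-climatic trend homogenisation tries to remove.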

The other most common reason for biases in observed data is that weather stations have been relocated, among other reasons, because of changes in the observation networks. “A typical organisational change consisted of weather stations, which used to be in cities, being transferred to newly built airports which needed observations and predictions,” explains Victor Venema, a climatologist from Bonn and one of the authors of the study. “The weather station in Bonn used to be in a field in the village of Poppelsdorf, which is now a part of the city and, after it had been moved several times, it is now in the Cologne-Bonn airport,” he says.

As far as the robust estimation of global trends is concerned, the most important changes are technological ones, made simultaneously across an observation network. “At the moment we are in the middle of a period of generalised automation of the observation networks,” says Venema.

The computer programs that can be used for the automatic homogenisation of climate time series data are the result of several years of development. They operate by comparing stations that are near to each other and looking for changes that only take place in one of them, unlike climate changes, which affect them all.
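The pairwise logic described above can be sketched as follows. This is a minimal illustration of the general idea, not any of the homogenisation packages compared in the study; the changepoint statistic and all numbers are invented for the example. In the difference series between two neighbouring stations the shared regional climate signal cancels out, so a step that appears in only one station stands out clearly.

```python
import numpy as np

def detect_break(diff_series):
    """Return the index that best splits the difference series into
    two segments with different means (a simple changepoint search)."""
    n = len(diff_series)
    best_idx, best_score = None, 0.0
    for k in range(10, n - 10):  # keep both segments reasonably long
        left, right = diff_series[:k], diff_series[k:]
        # Score: absolute mean shift, scaled by the segment sizes.
        score = abs(left.mean() - right.mean()) * np.sqrt(k * (n - k) / n)
        if score > best_score:
            best_idx, best_score = k, score
    return best_idx, best_score

rng = np.random.default_rng(0)
n = 100  # years of annual mean temperature
climate = np.cumsum(rng.normal(0, 0.1, n))       # shared regional signal
candidate = climate + rng.normal(0, 0.3, n)
neighbour = climate + rng.normal(0, 0.3, n)
candidate[60:] += 1.0  # spurious 1 °C jump, e.g. a station relocation

# The shared climate signal cancels in the difference series,
# leaving the artificial step visible against the local noise.
diff = candidate - neighbour
idx, score = detect_break(diff)
print(idx)  # index of the detected break, close to the inserted jump
```

Operational methods refine this idea in many ways (multiple neighbours, multiple breaks, significance testing), but the cancellation of the common climate signal in the difference series is the core of all relative homogenisation.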

To examine these homogenisation methods, the research team generated a benchmark incorporating a set of simulated data that reliably imitated observed climate data sets, including the biases mentioned. Because the inserted spurious changes are known exactly, the researchers could determine how well the various homogenisation methods correct them.

The test data sets generated were more diverse than those in previous studies, reflecting the diversity of real station networks and the differences in how they are used. The researchers reproduced networks with highly varied densities of stations, because in a dense network it is easier to identify a small spurious change at one station. The test data set used in this project was also much larger than in previous studies (a total of 1,900 weather stations were analysed), which enabled the scientists to accurately determine the differences between the main automatic homogenisation methods developed by research groups in Europe and America. Because of the large size of the test data set, only automated homogenisation methods could be tested.

The research group discovered that it is much more difficult to improve the estimated mean climate signal of an observation network than to improve the accuracy of the time series of each individual station.

In the resulting classification, the methods of homogenisation proposed by URV and AEMET were better than the others. The method developed at the URV’s C3 Centre for Climate Change (Vila-seca, Tarragona) by the Hungarian climatologist Peter Domonkos proved to be the best at homogenising both the series from individual stations and the mean series from the regional network. The AEMET method, developed by the researcher José A. Guijarro, was very close behind.

The homogenisation method developed by the National Oceanic and Atmospheric Administration of the United States (NOAA) was best at detecting and minimising systematic errors in trends from many weather stations, especially when these biases were produced simultaneously and affected many stations on similar dates. This method was designed to homogenise data sets from stations the world over where the main concern is the reliable estimation of global trends.

The results of this study have demonstrated the value of large test data sets. “It is another reason why automatic homogenisation methods are important: they can be tested more easily and this helps in their development,” explains Peter Domonkos, who started his career as a meteorological observer and is now writing a book on the homogenisation of climate time series.

“The study shows the importance of very dense station networks in making homogenisation methods more robust and efficient and, therefore, in calculating observed trends more accurately,” says the researcher Manola Brunet, director of the URV’s C3, visiting member of the Climate Research Unit of the University of East Anglia, Norwich, United Kingdom, and vice-president of the World Meteorological Organisation’s Commission for Weather, Climate, Water and Related Environmental Services & Applications.

“Unfortunately, much more climate data still has to be digitalised for even better homogenisation and quality control,” she concludes.

For his part, the researcher Javier Sigró, also from the C3, points out that homogenisation is often just the first step “that allows us to go to the archives and check what happened with those observations affected by spurious changes. Improving the methods of homogenisation means that we can do this much more efficiently.”

“The results of the project can help users choose the method best suited to their needs, and help developers improve their software because its strong and weak points are revealed. This will enable further improvement in the future,” says José A. Guijarro from the State Meteorology Agency of the Balearic Islands and co-author of the study.

Previous studies of a similar kind have shown that homogenisation methods designed to detect multiple biases simultaneously were clearly better than those that identify spurious changes one by one. “Curiously, our study did not confirm this. It may be more an issue of using methods that have been accurately fitted and tested,” says Victor Venema from the University of Bonn.

The experts are sure that the accuracy of the homogenisation methods will improve even more. “Nevertheless, we must not forget that climate observations that are spatially more dense and of high quality are the cornerstone of what we know about climate variability,” concludes Peter Domonkos.

###



89 Comments
February 4, 2021 7:29 pm

A urologist, a proctologist, and climatologist walked into a bar…
They shuffle over to the bar and sit down, while making some small talk among themselves.
Barkeep comes up and asks, “What’ll it be gents?”

Urologist says, “Gimme a pint o’ your cheapest beer.”
Barkeep looks quizzical, but goes to the tap, fills a glass with Coors Light and slides the glass of bubbly, pale yellow liquid to him.

The Urologist takes the glass of yellow liquid, up-ends it to his lips and empties the entire pint in one long chug, finishing with a BELCH and saying, “Ah!!!.. Thanks! I just wanted to be reminded that just because it looks like piss, sometimes it isn’t.”

Barkeep gives an amused look, but quickly moves his gaze to the Proctologist and asks, “And you sir?”

Proctologist says, “Gimme a Mud-slide, and make it with your best vodka, please.”
Barkeep gives that quizzical look again and goes to make the Mud-Slide cocktail as ordered. He digs around the cabinet to find the Grey Goose vodka, finishes making the cocktail with a garnish of chocolate shavings, and slides it over to the Proctologist.

The Proctologist takes the stemmed glass of dark brown liquid and proceeds to upend the contents to his lips. As soon as he finishes he proclaims to everyone in the bar, “Ah! That's something that looks like diarrhea, but tastes great, and made with the best ingredients.”

Barkeep gives an amused look, but as before, he quickly moves his gaze to the Climatologist and asks, “And finally you sir? What be your pleasure?”

The Climatologist says, “Gimme a glass of real piss and blend it in your blender with a pile of fresh dog poo from your dog. I’ll pay you a thousand dollars.”

Barkeep looks on aghast at the crude, disgusting request, then composes himself and matter-of-factly states, “Sir, I cannot do that. Serving that would violate my county health license, endanger your health, defile my bar, and it’s just morally wrong!”

The Climatologist looks around, quite confused, and then says, ….

pochas94
February 4, 2021 8:46 pm

Climate is the Goose that lays the Golden Eggs. Keeping it alive depends on their ability to tell convincing fairy tales.

Patrick MJD
February 4, 2021 8:51 pm

They are German, so their “results” will be “correct”.

observa
February 4, 2021 11:44 pm

Personally I find this settled sciencey stuff all a bit unsettling-
Scientists Discover an Immense, Unknown Hydrocarbon Cycle Hiding in The Oceans (msn.com)

February 5, 2021 1:10 am

Altering recorded results, in any way, is not Science, it is advocacy. Wow….

February 5, 2021 2:02 am

If they eliminate measurement error then they eliminate Climate Change.

Earth’s surface temperature is thermostatically controlled. The only thing that changes is the way the temperature is measured.

All the Climate Change in Australia can be sheeted home to instrument changes, instrument moves, bigger aircraft and more frequent air services, population density, commercial/industrial/domestic energy intensity, data homogenisation and certainly others.

Mickey Reno
February 5, 2021 5:24 am

If I could be so bold as to paraphrase a Dilbert cartoon: Dogbert was hired as an efficiency expert to find all the unproductive University of East Anglia climate scientists who could be laid off as a cost saving measure. Some time later, one of the scientists worried about losing his job asked Dogbert if he had finished his investigation, and if so, could he see a copy of Dogbert’s final report. “But this report is just a copy of the UEA faculty directory,” complained the scientist, to which Dogbert answered, “Yes, finding that saved me a lot of time.”

S. K. Dodsland
February 5, 2021 7:25 am

Is this just another ploy to further the man-caused climate change agenda?

The homogenisation method developed by the National Oceanic and Atmospheric Administration of the United States (NOAA) was best at detecting and minimising systematic errors in trends from many weather stations, especially when these biases were produced simultaneously and affected many stations on similar dates. This method was designed to homogenise data sets from stations the world over where the main concern is the reliable estimation of global trends.

Tony Heller of realclimatescience.com has documented how NASA/NOAA alter data and cannot be trusted.

https://realclimatescience.com/2020/10/new-video-alterations-to-the-us-temperature-record-part-one/
https://realclimatescience.com/2020/10/new-video-alterations-to-the-us-temperature-record-part-2/
https://newtube.app/user/TonyHeller/xEyXN2e (temperature alterations part 3)
https://realclimatescience.com/2020/10/new-video-how-the-us-temperature-record-is-being-altered-part-3/

Gerald Machnee
February 5, 2021 8:38 am

***The homogenisation method developed by the National Oceanic and Atmospheric Administration of the United States (NOAA) was best at detecting and minimising systematic errors in trends from many weather stations, especially when these biases were produced simultaneously and affected many stations on similar dates. This method was designed to homogenise data sets from stations the world over where the main concern is the reliable estimation of global trends.***

Translation: The 1930’s were biased high simultaneously, so they had to be homogenized downwards. Today there are simultaneous cold biases, so they are homogenized upward using the scientific CO2 increase.

Ferdberple
February 5, 2021 8:39 am

You cannot correct a station using average data from surrounding stations, because weather is not linear, it is chaotic.

The underlying assumption is wrong. Garbage in garbage out.

S. K. Dodsland
February 5, 2021 8:50 am

Why did you eliminate my post?

S. K. Dodsland
Reply to  S. K. Dodsland
February 5, 2021 10:36 pm

Thank you for posting my comment and for your objectivity. There are at least two sides to every theory.

Paul Penrose
February 5, 2021 9:08 am

So, when measuring the property of a large object, higher spatial density (resolution) of the measurements is better than lower – wow, what a groundbreaking conclusion.

accordionsrule
February 5, 2021 11:11 am

In homogenization, UHI-affected stations will always win. Being in the largest metropolitan areas and at airports, they are older, more continuous, better maintained, and spatially more dense. They are considered “high quality,” therefore are the “cornerstone” and used to infill the other stations. Their steady incline is smooth, not sharp, thus the warming trend is deemed not “spurious.”
From what I’ve read, adjusting for UHI effects is the last step. Since all the other previous steps served to lift nearby, possibly more pristine stations into conformity with the “cornerstone” stations, there is only a tiny UHI signal left to adjust for.

Tom Abbott
February 5, 2021 5:48 pm

From the article: “These “homogenization methods” are a key step in converting the enormous effort made by observers into reliable data about climate change.”

There they go implying that temperature readings from the past are not reliable.

First of all, the temperature readings from the past are the only readings we have, and second of all, putting the readings through a computer is not going to improve their accuracy.

The original handwritten temperature readings are the readings we should use to plan our future. These temperature readings tell us we have nothing to fear from CO2.

That’s why alarmists want to manipulate the data, because it doesn’t fit their narrative in its native form, so they put it through their computers and make it fit their narrative.

Stick to the actual temperature readings if you want to dwell in reality.

S. K. Dodsland
February 5, 2021 10:19 pm

Is this what is considered homogenization?

If the corrections fixed known problems in the instruments, that would help accuracy. But they are statistical. They make the station measurements smoother when mapped and they smooth over discontinuities. In my opinion, NOAA has overdone it. TOB, PHA, infilling and gridding are overkill. This is easily seen in Figure 7 and by comparing Figure 3 to Figure 6 or Figure 5. Does the final trend in Figure 3 more closely resemble the measurements (Figure 6) or the net corrections in Figure 5? The century slope of the data is 0.25°, the corrections add 0.35° to this and the “climatological gridding algorithm” adds 0.9°! It is worth saying again, the type of statistical operations we are discussing do nothing to improve the accuracy of the National Temperature Index, and they probably reduce it.

https://wattsupwiththat.com/2020/11/24/the-u-s-national-temperature-index-is-it-based-on-data-or-corrections/