I’m designing an experiment to illustrate how ‘homogenization’ creates a false climate trend – suggestions welcome

Those of you that have been with WUWT for a few years know that I often like to do hands-on experiments to illustrate and counter some of the most ridiculous climate change claims made on both sides of the aisle. On the alarmist side, you may remember this one:

Al Gore and Bill Nye FAIL at doing a simple CO2 experiment

Replicating Al Gore’s Climate 101 video experiment (from the 24 hour Gore-a-thon) shows that his “high school physics” could never work as advertised

Unfortunately, YouTube has switched off the video, but I’m going to try getting it posted elsewhere such as on Rumble. The graphs of temperature measurements and other images are still there.

Despite the fact that I proved beyond a shadow of a doubt that the experiment was not only fatally flawed, but actually FAKED, they are still using it as propaganda today on Al Gore’s web page.

They never took it down. Schmucks.

So along those lines, like Willis often does, I’ve been thinking about the recent paper published in Atmosphere by some of our brothers-in-arms in climate skepticism (Willie Soon, the Connollys, etc.),

Evaluation of the Homogenization Adjustments Applied to European Temperature Records in the Global Historical Climatology Network Dataset

Abstract

The widely used Global Historical Climatology Network (GHCN) monthly temperature dataset is available in two formats—non-homogenized and homogenized. Since 2011, this homogenized dataset has been updated almost daily by applying the “Pairwise Homogenization Algorithm” (PHA) to the non-homogenized datasets. Previous studies found that the PHA can perform well at correcting synthetic time series when certain artificial biases are introduced. However, its performance with real world data has been less well studied. Therefore, the homogenized GHCN datasets (Version 3 and 4) were downloaded almost daily over a 10-year period (2011–2021) yielding 3689 different updates to the datasets. The different breakpoints identified were analyzed for a set of stations from 24 European countries for which station history metadata were available. A remarkable inconsistency in the identified breakpoints (and hence adjustments applied) was revealed. Of the adjustments applied for GHCN Version 4, 64% (61% for Version 3) were identified on less than 25% of runs, while only 16% of the adjustments (21% for Version 3) were identified consistently for more than 75% of the runs. The consistency of PHA adjustments improved when the breakpoints corresponded to documented station history metadata events. However, only 19% of the breakpoints (18% for Version 3) were associated with a documented event within 1 year, and 67% (69% for Version 3) were not associated with any documented event. Therefore, while the PHA remains a useful tool in the community’s homogenization toolbox, many of the PHA adjustments applied to the homogenized GHCN dataset may have been spurious. Using station metadata to assess the reliability of PHA adjustments might potentially help to identify some of these spurious adjustments.

In a nutshell, they conclude that the homogenization process introduces artificial biases into the long-term temperature record. This is something I surmised over 10 years ago with the USHCN, and published at AGU 2015 with this graph, showing how the final homogenized product is so much warmer than stations that have not been encroached upon by urbanization and artificial surfaces such as asphalt, concrete, and buildings. By my analysis, almost 90% of the entire USHCN network is out of compliance with siting standards, and thus suffers from spurious effects of nearby heat sources and sinks.

In the new paper, here is a relevant paragraph that speaks to the graph I published in 2015 at AGU:

As a result, the more breakpoints are adjusted for each record, the more the trends of that record will tend to converge towards the trends of its neighbors. Initially, this might appear desirable since the trends of the homogenized records will be more homogeneous (arguably one of the main goals of “homogenization”), and therefore some have objected to this criticism [41]. However, if multiple neighbors are systemically affected by similar long-term non-climatic biases, then the homogenized trends will tend to converge towards the averages of the station network (including systemic biases), rather than towards the true climatic trends of the region.

The key phrase is “multiple neighbors,” i.e., nearby stations.

Back on August 1, 2009, I created an analogy to this issue with homogenization by using bowls of dirty water. If the cleanest water (a good station, properly sited) is homogenized with nearby stations that have varying degrees of turbidity due to dirt in the water, with 5 being the worst, homogenization effectively mixes the clean and dirty water, and you end up with a data point for the station labeled “?” that has some level of turbidity, but is not clear. Basically a blend of clean and dirty data, resulting in muddy water, or muddled data.

In homogenization, the data is weighted against the nearby neighbors within a radius. And so a station that might start out as a “1” data-wise might end up getting polluted with the data of nearby stations and end up with a new value, say weighted at “2.5”.

In the map below, applying a homogenization smoothing that weights nearby stations by distance, what would you imagine the turbidity values of the stations with question marks would be? And how close would these two values be for the east coast station in question and the west coast station in question? Each would be closer to a smoothed center-average value based on the neighboring stations.

Of course, this isn’t the actual method, just a visual analogy. But it is essentially what this new paper says is happening to the temperature data.
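For the programmers in the audience, the analogy can be put in a few lines of code. To be clear, this is only the bowls-of-water analogy in code, not the actual Pairwise Homogenization Algorithm, and the station values, distances, and weights are all made up:

```python
# Illustrative sketch only -- NOT the actual PHA. It shows how
# distance-weighted averaging pulls a clean station's value toward its
# dirtier neighbors. Station "quality" values: 1 = clean, 5 = worst.

def homogenized_value(target, neighbors):
    """Blend a target station with neighbors, weighting each by 1/distance.

    target:    (value, self_weight)
    neighbors: list of (value, distance) pairs
    """
    value, self_weight = target
    weights = [1.0 / d for _, d in neighbors]
    total = self_weight + sum(weights)
    blended = (value * self_weight +
               sum(v * w for (v, _), w in zip(neighbors, weights))) / total
    return blended

# A clean "1" station surrounded by dirtier neighbors at various distances:
clean_station = (1.0, 1.0)                       # value 1, unit self-weight
dirty_neighbors = [(4.0, 1.0), (3.0, 2.0), (5.0, 2.0)]
print(homogenized_value(clean_station, dirty_neighbors))  # -> 3.0
```

Run it and the pristine “1” station comes out as a 3.0, polluted by its neighbors: the muddy-water result.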

And it isn’t just me and this new paper saying this; back in 2012 I reported on another paper that said the same thing.

New paper blames about half of global warming on weather station data homogenization

Authors Steirou and Koutsoyiannis, after taking homogenization errors into account, find global warming over the past century was only about one-half [0.42°C] of that claimed by the IPCC [0.7-0.8°C].

Here’s the part I really like: for 67% of the weather stations examined, questionable adjustments were made to raw data that resulted in:

“increased positive trends, decreased negative trends, or changed negative trends to positive,” whereas “the expected proportions would be 1/2 (50%).”
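Just to put a number on how lopsided that 67% is: under the expected 50/50 split, the chance of seeing 67% or more of stations adjusted in the warming direction follows the binomial distribution. The station count of 100 below is purely a placeholder for illustration; the paper’s actual sample size may differ:

```python
# Rough plausibility check (not from the paper): if warming-direction and
# cooling-direction adjustments were equally likely (the expected 50/50),
# how surprising is a 67% warming-direction share? n = 100 stations is an
# assumed number, used only for illustration.
from math import comb

def binomial_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n = 100   # hypothetical number of stations examined
k = 67    # 67% adjusted in the warming direction
print(f"P(>= {k} of {n} under a fair 50/50 split) = {binomial_tail(n, k):.2e}")
```

With 100 stations the tail probability comes out well under 0.1% – far too lopsided to be chance.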

And…

“homogenization practices used until today are mainly statistical, not well justified by experiments, and are rarely supported by metadata. It can be argued that they often lead to false results: natural features of hydroclimatic times series are regarded as errors and are adjusted.”

So, from my viewpoint, it is pretty clear that homogenization is adding spurious climate warming where there actually isn’t a true climate signal. Instead, it is picking up the urbanization effect, which leads to warming of the average temperature, and adding it to the climate signal.

Steve McIntyre concurs in a post, writing:

Finally, when reference information from nearby stations was used, artifacts at neighbor stations tend to cause adjustment errors: the “bad neighbor” problem. In this case, after adjustment, climate signals became more similar at nearby stations even when the average bias over the whole network was not reduced.

So, I want to design an experiment to simulate and illustrate the “bad neighbor” problem with weather stations and create a video for it.

I’m thinking of the following:

  1. Use the turbidity analogy in some way, perhaps using red and blue food coloring rather than a suspended particulate, which will settle out. This is purely for visualization.
  2. Use actual temperature, by creating temperature-controlled vials of water at varying temperatures.
  3. Mix the contents of the vials, and measure the resultant turbidity/color change and the resultant temperature of the mix.

The trick is how to create individual temperature controlled vials of water and maintain that temperature. Some lab equipment, some tubing and some pumps will be needed.
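For predicting what the mixed vials should read, no lab equipment is needed: with the same fluid in every vial (so equal specific heats) and no heat lost during mixing, the mixture temperature is just the mass-weighted average of the vial temperatures. All the numbers below are made up:

```python
# A check on what mixing should give: for vials of the same fluid, the
# mixed temperature is the mass-weighted mean of the vial temperatures
# (equal specific heats, no heat loss assumed).

def mix_temperature(vials):
    """vials: list of (mass_grams, temp_C) for the same fluid."""
    total_mass = sum(m for m, _ in vials)
    return sum(m * t for m, t in vials) / total_mass

# e.g. one 'clean' cool vial swamped by warmer neighbors:
vials = [(50, 15.0), (50, 20.0), (50, 22.0), (50, 23.0)]
print(mix_temperature(vials))  # -> 20.0
```

Comparing that predicted value against the thermometer reading of the actual mix would also make a nice sanity check on the apparatus.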

Again purely for visual effect, I may create a map of the USA or the world, place the vials within it, and use that to visualize the results and measure the results.

I welcome a discussion of ideas on how to do this accurately and convincingly.

Tom Halla
February 23, 2022 1:13 pm

I do not think that would be a good analogy for “correction” errors. This is a case of a “non-blind” reporting, where the observer has expectations as to the outcome that influence the reported outcome. It is a matter of bad research design to not take bias into account, whether conscious or not.
Anthony Watts’ earlier work actually reporting on the siting of reporting stations, as to how they are affected by UHI, is more likely to show results.

Janice Moore
Reply to  Tom Halla
February 23, 2022 2:09 pm

Tom. The author of the above analogies IS Anthony Watts. 🤨

😀

You make a good general point, but it would be helpful if you told us what “that [analogy]” is (the one you don’t like) so we can really understand and discuss what you are saying….

TallDave
February 23, 2022 1:38 pm

don’t worry, any warming from homogenization is negated by people getting up later in the old days

David Sulik
February 23, 2022 1:46 pm

Use temperatures only from dairy farms and then homogenize to infer temperatures of places in between, or outlying.

Janice Moore
Reply to  David Sulik
February 23, 2022 2:11 pm

😄

Reply to  David Sulik
February 23, 2022 2:11 pm

I only know that milk often is homogenized, which makes sense….

Mike Jonas(@egrey1)
Editor
Reply to  David Sulik
February 23, 2022 2:25 pm

Similar suggestion: Remove all urban and UHI-prone sites (eg, airports) and apply homogenisation using only the remaining sites. Compare results for those stations with their results when homogenisation uses all sites. Then do the same for only the sites that were removed. You could get a bit more sophisticated and divide sites into more than two groups, but I think just two groups could give interesting results.

Ebor
Reply to  David Sulik
February 23, 2022 9:54 pm

It’s the dairy-air that’ll get ya…

Fred Hubler
February 23, 2022 1:49 pm

Review how the Argo buoy sea surface temperatures were homogenized with known faulty sst data measured from ships.

Gator
February 23, 2022 1:50 pm

If you do not remove UHI from your data, you are incorporating it into your data. I’ve been saying this for many years. A child should be able to understand that this raises temperatures across the board. What’s wrong with these people?

George W Childs
Reply to  Gator
February 23, 2022 4:49 pm

They can’t handle the truth.

MarkW
Reply to  George W Childs
February 23, 2022 5:08 pm

They have no interest in the truth.

Ebor
Reply to  MarkW
February 23, 2022 9:55 pm

Sod the truth!

Frank the Norwegian
Reply to  MarkW
February 24, 2022 3:35 am

But many of them do have a vested interest in the non-truth

Danley Wolfe
February 23, 2022 2:02 pm

I am having a bad neighbor problem right now. Seriously, I can’t find much credence in how the climate scientists avoid scientific standards and rigor in much of the published research, especially in using averages of selected groups of climate models. If we know that none of them are truly representative, what is the good / value in showing graphs with wide ranges of predictions? If we know the models are deficient, then showing a spread of models, all of which are deficient, is not only deficient but intellectually deficient.

RickWill
Reply to  Danley Wolfe
February 23, 2022 4:38 pm

what is the good / value in showing graphs with wide ranges of predictions.

You should take lessons in consensus science; then you would understand.

When I challenged Australia’s CSIRO on the 38C SST they were predicting in the NINO4 region by 2300 they replied that their latest model is only predicting to 2100 and its prediction is middle of the road of all the climate models.

Climate modellers are consensus scientists. They no longer care about the validity of their prediction, as long as it is in line with others. The consensus result has to be the average. Trying to determine which model produces the best result is not woke.

CSIRO_Nino4.png
February 23, 2022 2:33 pm

Data fabrication is even worse … https://www.youtube.com/watch?v=hs-K_tadveI

David Dibbell
February 23, 2022 2:35 pm

I don’t have anything to suggest about the visualization exercise. Not sure that will be persuasive.

But I will offer these plots for the USHCN monthly Tavg data for the month of December from 1895 to 2021. For the list of 1,218 stations, updated data files for raw, tob (time-of-observation adjusted) and FLs (final adjusted data after the pairwise homogenization adjustment) are available from NOAA here. https://www.ncei.noaa.gov/pub/data/ushcn/v2.5/ The latest compressed files have all the data for all the stations for all periods.

These plots at the links below give the mean of all reported data by year from the list of 1,218 USHCN stations (i.e., contiguous U.S.). Missing values are ignored when calculating the mean. Then the differences by year for tob-less-raw, FLs-less-tob, and FLs-less-raw are given. The number of actual reported values in the raw data and the percentage of values flagged “E” (Estimated) in the FLs data are also shown.

Key points:
Whatever trend for the contiguous US is being reported looks like it is driven by the adjustment processing, not by the raw data.
I realize that a straight mean of raw values does not represent a true climatic trend, as there is no attempt to area-weight or otherwise establish representative coverage.
The justification or technical validity of the time-of-observation and pairwise homogeneity adjustments is not what is being questioned here – these plots just show the bulk results over time, for whatever questions that may raise.
The recent years show a steep decline in the number of raw values reported, and the percentage of Estimated values flagged in the final (FLs) data has risen rapidly.
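The bookkeeping described above can be sketched in a few lines. This is not NOAA’s file format or processing code, just the missing-value handling and the FLs-less-raw difference, with invented numbers:

```python
# Sketch of the bookkeeping described above (not NOAA's format or code):
# a mean over stations that skips missing values, then the
# adjusted-minus-raw difference for the year. Values are invented.

def yearly_mean(values):
    """Mean of reported values, ignoring None (missing)."""
    present = [v for v in values if v is not None]
    return sum(present) / len(present) if present else None

raw_by_station = [2.1, None, 1.9, 2.0]   # one station missing in raw
fls_by_station = [2.4, 2.2, 2.3, 2.3]    # final adjusted, all infilled

raw_mean = yearly_mean(raw_by_station)
fls_mean = yearly_mean(fls_by_station)
print(round(fls_mean - raw_mean, 2))     # adjustment signal for the year
```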


Finally, this is just for the month of December as an illustration.

Jeff Schmucker
Reply to  David Dibbell
February 23, 2022 2:58 pm

You might want to look into Thermo chromatic dyes. Think old “mood rings”.

Tom.1
February 23, 2022 2:45 pm

Bill Nye’s crappy explanation notwithstanding, is it not true that a jar with CO2 in it will warm up faster than the jar that has no CO2 in it when each is exposed to an external source of IR?

Richard Page
Reply to  Tom.1
February 23, 2022 3:05 pm

Glass jars? Most IR wavelengths are blocked by glass – the glass in the jars should heat up equally and warm the contents equally by conduction. Unless you change the material of the container to something that doesn’t block IR, your methodology is completely flawed and the experiment is worthless.

rhs
Reply to  Richard Page
February 23, 2022 3:16 pm

This is my whole objection to the Eustice Crutch experiment which AGW is based on.

Tom.1
Reply to  Richard Page
February 23, 2022 6:02 pm

Can you also dismiss this? The Greenhouse Gas Demo – YouTube

Richard Page
Reply to  Tom.1
February 24, 2022 4:59 am

Polyethylene terephthalate (soda bottle plastic) also blocks and absorbs IR wavelengths similarly to glass – I would have used a form of polycarbonate instead.
Frankly all this ‘experiment’ shows is the difference between the relative density of CO2 and air, nothing whatsoever to do with the IR absorption of CO2.
By the way – which thermometer was in which bottle? Neither were labelled or marked and, again, we’re left wondering how this guy managed to get the results that he did?
If you want a better ‘experiment’ – look around the internet; there are hundreds of crappy ones like this, but occasionally you will find a good one you can use.

AGW is Not Science
Reply to  Richard Page
February 24, 2022 10:01 am

I have always thought that the experiment purporting to show an “effect” of CO2 was invalid.

Part of that is because by adding CO2 to one container and not the other means you are adding density to one vs. the other, which means it should heat up more by that fact alone – in order to be valid, the experiment should add compressed air (with the same CO2 as ambient) into the “non-CO2” container to ensure the content density of each container is equal, thereby removing that as a confounding variable.

The material of the container and whether infrared radiation can penetrate it is, of course, another massive issue with the experiment.

Right-Handed Shark
Reply to  Tom.1
February 24, 2022 2:34 pm

Seriously?

David Blenkinsop
Reply to  Tom.1
February 23, 2022 3:34 pm

I believe that the major point of Anthony Watts’ debunk of this video is that this looked-for effect doesn’t happen, at least not reliably enough for the original video to have been an honest demonstration. If the effect doesn’t work reliably for every honest experimenter, then it’s invalid, not replicable.

Having said that, note that something even more to the point is that the atmosphere isn’t a sealed jar, so there’s no lid there, no blockage of convection, no block on turbulent mixing at the top, etc. So unless they can stick a planetary gravity field in a jar, the whole greenhouse effect concept doesn’t have a desktop demo like this anyway.

Tom.1
Reply to  David Blenkinsop
February 23, 2022 6:10 pm

I think the point of the experiment was to show that CO2 is warmed more by IR radiation than most of the other gases that make up the atmosphere (mostly O2 and N2). This seems to me to be a basic scientific fact. A properly done experiment would show this. I’m not saying Nye’s experiment, if it can be called that, was properly done.

Alexy Scherbakoff
Reply to  Tom.1
February 23, 2022 8:54 pm

When a CO2 molecule absorbs energy it becomes more massive, heavier. E=mc^2. In absorbing it becomes heavier and slows down somewhat – something to do with mass and momentum. A slower molecule is ‘cooler’. When it gets rid of the energy it speeds up again. This little dance continues and ends with no resultant heating. That was a simplified version of things.

AGW is Not Science
Reply to  David Blenkinsop
February 24, 2022 10:07 am

Having said that, note that something even more to the point is that the atmosphere isn’t a sealed jar, so there’s no lid there, no blockage of convection, no block on turbulent mixing at the top, etc. So unless they can stick a planetary gravity field in a jar, the whole greenhouse effect concept doesn’t have a desktop demo like this anyway.

Exactly. At best, even if the experiment was valid in every way, all it would show is a purely hypothetical effect of atmospheric CO2 on temperature, based on its implicit assumption of all other things held equal.

Here in reality, “all other things” are most certainly NOT “held equal,” the “feedbacks” are negative, offsetting feedbacks, and the actual, as opposed to hypothetical, effect of atmospheric CO2 on the Earth’s temperature cannot be distinguished from ZERO.

Which is what observations support. No impact whatsoever.

Reply to  Tom.1
February 23, 2022 3:46 pm

Nye’s entire experiment was upside down, or perhaps more correctly, inside out. In the Earth’s GHE the source of LW that gets absorbed by CO2 and re-radiated is the earth. So LW that would have escaped the earth system instead gets sent back to earth. In Nye’s experiment, the LW source was OUTSIDE the system instead of in the middle of it! So when it hit the CO2 layer, it would have been absorbed and re-radiated with some of it being ejected from the system. In other words, the most likely outcome was that the jar with CO2 would warm slower.

If I recall, that’s exactly what Anthony’s experiment showed, and is a testament to Nye’s complete lack of knowledge about the GHE. He designed an experiment that, if successful, could only show the opposite of what he wanted to show, and so when confronted with results that didn’t show what he expected, he faked them, because he just didn’t know enough to design an experiment that actually replicated the GHE.

Last edited 3 months ago by davidmhoffer
Rick C
Reply to  Tom.1
February 23, 2022 4:06 pm

Aside from the fact that glass is opaque to IR, CO2 is heavier than air – molecular weight of CO2 = 44, air (O2 & N2) ~29. Also CO2 has a much higher specific heat than air. Thus, if the same amount of energy is added to both containers, the air would warm up more than the CO2.

Richard Page
Reply to  Rick C
February 24, 2022 5:44 am

Ah, wait a second – the density of CO2 means that it’s slower to heat up, and it has a higher specific heat value, but it also retains that heat better than air? So if enough heat was added by conduction and enough time was allowed (given the endothermic properties of the Alka-Seltzer reaction and the density of CO2), the CO2 bottle might get slightly warmer than the air, mightn’t it? Fake results, as they’re due to conduction and the various properties of the bottles/jars, the water, the air, the Alka-Seltzer and, not least, the resultant CO2 – but not IR absorption.

RayB
Reply to  Tom.1
February 23, 2022 5:42 pm

A simple black body experiment with different insulators between the black body and the sensor would give better results.

Tom.1
Reply to  Tom.1
February 23, 2022 6:05 pm

To everyone who dismisses Nye’s poor experiment (where he probably did not adhere to anything resembling scientific procedures), I think that the CO2 should warm up faster when exposed to IR, because quite simply it absorbs IR wavelengths readily and will therefore be warmed more than non IR absorbing gases. That is the basis for the greenhouse effect.

Reply to  Tom.1
February 23, 2022 9:30 pm

That is not the basis for the GHE. The basis for the GHE is that CO2 absorbs LW and re-radiates it, or else gives up the energy to other molecules in the atmosphere via collision. This LW is generated from INSIDE the system when the earth absorbs SW and converts it to longwave. Nye’s experiment reversed this, putting the longwave source OUTSIDE the system.

The GHE has nothing to do with how much CO2 warms from LW and everything to do with it re-radiating or otherwise giving up the energy it just absorbed. See my comment upthread.

Carlo, Monte
Reply to  davidmhoffer
February 24, 2022 5:01 am

In addition, sunlight contains very little energy with wavelengths longer than 4um.

David Blenkinsop
Reply to  Tom.1
February 23, 2022 11:24 pm

I’m sure that there is a lot of laboratory data to support general Greenhouse effect principles, like absorption of specific wavelengths by certain gases, whether CO2, CH4, H2O, or whatever. This in turn goes into the most basic integrative computer models, like MODTRAN, to start to give some idea as to how much a column of atmosphere ought to be warming. So GHE theory is based on data *plus* unavoidable assumptions about how to integrate things mathematically. This is quite a different matter from trying to confirm a GHE effect directly, in a closed jar, without a mathematical model, which is what we’re referring to here!

It is this latter thing, *directly* confirming the ‘closed jar’ Greenhouse effect, that we’re questioning here, both as something unverified *and* as likely to be irrelevant too, if it’s the open, ‘no top’ situation that we’re really interested in.

AGW is Not Science
Reply to  David Blenkinsop
February 24, 2022 12:12 pm

It’s not even that “open top” situation. It’s the effect as a part of the Earth’s atmosphere, subject to all of the processes and feedbacks inherent in that system. They fixate on the hypothetical effect, when what they should be focused on is the real world effect when all processes and feedbacks are applied, which can only be determined by observation of the real world. The hypothetical effect of CO2 on temperature is based on the inherent assumption all other things held equal, a situation that has never existed, does not exist today, and never will exist.

At best all the blather about the “enhanced greenhouse effect,” i.e., the notion that adding CO2 to the atmosphere will raise the Earth’s temperature, is an academic exercise with no real world application. It should certainly NOT be used as a basis for ‘policy.’

The real world says quite plainly that atmospheric CO2 levels are not the “driver” of the Earth’s temperature.

The real world says CO2’s impact is not able to be differentiated from zero, because that is what observations support.

Jim Gorman
Reply to  Tom.1
March 1, 2022 7:24 am

The problem is that when CO2 absorbs energy it moves to an excited state. It will then reradiate that energy soon thereafter, or it will transfer the energy to another molecule (N2/O2). When this happens it cools back to its original temperature. If there were N2/O2 in the bottle, you might see some warming.

commieBob
February 23, 2022 2:52 pm

The planet is very close to being in a radiation balance. (Even if you accept the alarmists’ numbers it’s easily within 1%.)

Let’s assume, for ease of arithmetic, that radiation from the surface is the only thing we have to worry about.

Let’s also assume that the planet is a disk with one side permanently facing the sun.

Let’s assume further that there are two possible conditions:

1 – One version of the disk does not distribute heat over its surface. The side that faces away from the sun is at 0 Kelvin.

2 – The other version evenly distributes heat. We will take its temperature as 279K.

Because radiated heat is proportional to T^4, the sunward side of the first disk will be at 331K. The average temperature will be 165K.

So, just changing the heat distribution changes the average temperature from 165K to 279K.

It seems that, the more evenly the heat is distributed, the higher will be the average temperature.

So, if the process of homogenization artificially evens out the average temperature, one of two things will happen:

1 – The radiation balance will be upset.

2 – The average temperature will be too high.

As far as I can tell, temperature distribution is a big deal, and it is mostly ignored. By artificially evening out the temperature distribution, homogenization will give a wrong (too high) average temperature.
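A quick check of the arithmetic above (T^4 radiation, with both disks emitting the same total power):

```python
# Two disks radiating the same total power (emission goes as T^4 per unit
# area): one uniform at 279 K, one with all emission from the sunlit side
# and 0 K on the dark side.

uniform_T = 279.0
# Same total emission from half the radiating area => T^4 must double:
hot_side_T = uniform_T * 2 ** 0.25
average_T = (hot_side_T + 0.0) / 2

print(round(hot_side_T, 1))   # -> 331.8 : the "331K" sunlit side
print(round(average_T, 1))    # -> 165.9 : the "165K" average, vs 279 K even
```

Same energy out in both cases, yet the average temperature differs by over 110 K purely from how the heat is distributed.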

Sparko
Reply to  commieBob
February 23, 2022 3:34 pm

Temperature is an intensive quantity; unlike energy, it isn’t conserved. You can average temperatures, but the average has physical meaning only when the materials being averaged have the same specific heat capacity.

commieBob
Reply to  Sparko
February 23, 2022 7:57 pm

That’s fine but the design brief was:

I’m designing an experiment to illustrate how ‘homogenization’ creates a false climate trend – suggestions welcome

🙂

Alexy Scherbakoff
Reply to  commieBob
February 23, 2022 8:35 pm

I don’t understand the pathological desire for there to be a ‘balance’. There has never been a balance. We have gone from full-on glaciation to interglacials. Where’s the balance?

commieBob
Reply to  Alexy Scherbakoff
February 24, 2022 3:46 am

It’s not a desire, it’s an observation.

The amount of incoming radiant energy from the sun is basically the same as the amount of energy the Earth radiates (and reflects) into the depths of outer space.

For the purposes of my thought experiment above we can treat the balance as perfect.

Rud Istvan
February 23, 2022 2:59 pm

Some suggestions from a guy with 13 issued patents (admittedly, I designed some unique experiments but had others run them).

  1. Use blue and yellow food dye to make ‘green’. Blue is ‘cold’ without UHI, Yellow is ‘hot’ with UHI. Green is the result on which GND is based.
  2. If the turbidity analogy is apt, don’t go for temperatures also. Much more difficult to control and easier to criticize. KISS principle. You will get criticized, so hold second round temperatures back.
  3. Do the experiment several times over at different radii representing different distances from city centers. So by definition also with different numbers of ‘stations’. This last suggestion is motivated by a guest post here some years ago looking qualitatively at your Surface Stations CRN1 only. There were three usable urban out of 4, and (IIRC) 10 suburban/rural. Homogenization did reduce UHI, cooling the pristine urban stations, but increased temp trends in all but one suburban/rural. After posting this comment, I will go find the post for you and provide it as a subcomment.
Rud Istvan
Reply to  Rud Istvan
February 23, 2022 3:17 pm

Had to go to main computer archive, as could not remember post title:
How Good is GISS, guest post 08/03/2015. My, time flies. Analytic memory was correct. 4 urban, ten suburban/rural, all Surface Stations project CRN1. The guest post also hyperlinked to the then Surface Stations data I extracted and used for the qualitative (Mark 1 eyeball) trend analysis.

Janice Moore
February 23, 2022 3:08 pm
  1. I hope a synthetic chemist volunteers a “recipe.” E.g., add a bit of this… another bit of this…. and instead of Kool-whip, you have toothpaste.

2. Take a baritone voice and homogenize it with a soprano to get a yucky blend.

3. Mix types of pop (remember doing that at Royal Fork or some other all-you-can-eat buffet as a kid lol) to get a “weird” tasting pop.

— Your idea of red-blue food coloring is good, but, make it yellow + blue slime that is too green (truth is blue) — here’s a recipe for “Slime” https://www.letsroam.com/explorer/easy-slime-recipe/

4. Speed up (hotter) music so it is hilarious but WRONG.
Normal speed “You’re Welcome”
https://www.youtube.com/watch?v=79DijItQXMM&ab_channel=DisneyMusicVEVO
(published on YouTube)

Speeded up “You’re Welcome” (1.5x, 1.75x, 2x)

Janice Moore
February 23, 2022 3:12 pm

My comment just *poof* disappeared😲

So, splitting it into two:

PART 1
1.  I hope a synthetic chemist volunteers a “recipe.” E.g., add a bit of this… another bit of this…… and instead of Kool-whip, you have toothpaste.
 
2. Take a baritone voice and homogenize it with a soprano to get a yucky blend.
 
3. Mix types of pop (remember doing that at Royal Fork or some other all-you-can-eat buffet as a kid lol) to get a “weird” tasting pop.
 
— Your idea of red-blue food coloring is good, but, make it yellow + blue slime that is too green (truth is blue) — here’s a recipe for “Slime” https://www.letsroam.com/explorer/easy-slime-recipe/

Janice Moore
February 23, 2022 3:13 pm

PART II

Speed up (= hotter temp.) music so it is hilarious but WRONG.
Normal speed “You’re Welcome”




(published on YouTube)
 
Speeded up “You’re Welcome” (1.5x, 1.75x, 2x)



randomengineer
February 23, 2022 3:13 pm

Who is this intended for? The homogenization problem seems very similar to how image convolutions work (e.g. image smoothing), which is typically a 3×3 array operating pixel by pixel. One can misuse a convolution and end up with a result that is nothing like the original.
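In miniature, with a hypothetical 3×3 mean filter (not any particular homogenization code):

```python
# The image-smoothing parallel: a 3x3 box blur replaces each pixel with
# the average of its neighborhood, so a single bright outlier is smeared
# into its neighbors -- much like a station trend converging toward the
# neighborhood mean.

def box_blur(img):
    """3x3 mean filter on a 2D list; interior pixels only."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1)
                            for dx in (-1, 0, 1)) / 9.0
    return out

img = [[0, 0, 0],
       [0, 9, 0],    # one 'hot' outlier pixel
       [0, 0, 0]]
print(box_blur(img)[1][1])   # -> 1.0 : the outlier is pulled to the mean
```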

Janice Moore
Reply to  randomengineer
February 23, 2022 3:15 pm

The average 5th grader. 🙂

Thomas
Reply to  Janice Moore
February 23, 2022 3:51 pm

In other words, the average American.

Mike Dubrasich
Reply to  Janice Moore
February 23, 2022 4:32 pm

In that case, I suggest using chocolate milk.

H.R.
Reply to  Mike Dubrasich
February 23, 2022 5:47 pm

Actually not a bad idea, given my understanding of what Anthony is trying to present in a visual… milk with different concentrations of chocolate syrup, say from 0% syrup to 100% syrup.

*sigh* but then there is the temperature thing involving two dissimilar materials.

But throwing some things out there is what we are being asked to do, and chocolate milk may set someone off on a better tack, just as Janice posited ‘slime’ in different colors.

👍👍 Mike

Rich Davis
Reply to  H.R.
February 24, 2022 2:02 am

Milk and chocolate sauce is a winner I think, because…

1) Humorous reference to homogenization which is most people’s only association with the word. (It is a subtle ridicule that may help click bait)

2) pouring in chocolate sauce to various glasses of milk is easy to associate with various amounts of UHI and allows you to call attention to the artificial warming in a visible way—talk about sources of urban heat as you pour more or less chocolate sauce and stir it up

3) It should be easy to see a contrast between pure milk and chocolate milk.

4) There’s no technical challenge of controlling or measuring temperature for people to nitpick.

5) Kids will enjoy replicating the experiment at home. Good safe and delicious fun! Get participation and reinforce learning.

Richard Page
Reply to  Rich Davis
February 24, 2022 5:51 am

I’m nitpicking as well but is it an experiment ‘to see what happens if….’ or a demonstration ‘to show what happens when….?’ The different approaches would require radically different ideas really. Reading Anthony’s post and the replies I’m getting quite mixed messages as to which approach is being taken.

Janice Moore
Reply to  Richard Page
February 24, 2022 10:11 am

I think, Mr. Page, given that homogenization easily results in inaccurate data, that it is “what happens when.”

Janice Moore
Reply to  Rich Davis
February 24, 2022 10:09 am

You make good points, Mr. Davis. I think chocolate milk isn’t the best way to demonstrate the UNSAVORY outcome of homogenization of surface temperatures, however.

If > chocolate = > temperature….. the more chocolate the better😋

If > blandness (lack of chocolate) = > temperature…., not a powerful way to get the point across.

Try:
1) accurate data = excellent chocolate milk;
2) homogenized = chocolate milk with kale blended in 😝

PaulID
Reply to  Janice Moore
February 24, 2022 10:36 am

So about 5 grades higher than the average alarmist?

tygrus
February 23, 2022 3:21 pm

Ignore 5 yrs of a station’s output & replace it with estimates based on the data processing techniques (e.g. homogenization) they normally use. Compare these estimates with the real data you put aside, noting the average error of the daily values (differences).
Now repeat the process for all stations, one at a time.

What happens if we do this with random selections of stations eg. 1%, 2%, 5%, 10%, 20% ?
What is the error range as we increase the number of stations being estimated?

If the stations with lower trends were more trusted, how would this change the result?
If the stations with higher trends were more trusted, how would this change the result?
How does the current result compare to those 2 extreme biases?
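The hold-out test described above could be sketched like this (a toy neighbour-mean infill with invented numbers, not the PHA itself):

```python
# Withhold one station, "infill" it from the mean of its neighbours, and
# record the average daily error against the real values that were set aside.

def infill_error(stations, held_out):
    """stations: dict name -> list of daily temps; held_out: station name.
    Returns the mean absolute error of a neighbour-mean estimate."""
    real = stations[held_out]
    neighbours = [v for k, v in stations.items() if k != held_out]
    errors = []
    for d in range(len(real)):
        estimate = sum(s[d] for s in neighbours) / len(neighbours)
        errors.append(abs(estimate - real[d]))
    return sum(errors) / len(real)

data = {
    "A": [20.0, 21.0, 19.5],
    "B": [22.0, 23.0, 21.5],   # systematically warmer than its neighbours
    "C": [18.0, 19.0, 17.5],
}
# Hold out station B and see how well its neighbours reproduce it:
print(round(infill_error(data, "B"), 2))  # 3.0
```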

Repeating my comment from earlier regarding Onslow Airport (WA, Australia) 13 Jan 2022.
https://wattsupwiththat.com/2022/01/26/australias-broken-temperature-record-part-1/#comment-3440486

The latest Onslow obs was at the Onslow Airport. Checking the BOM records shows they previously had another station in town. The airport read ~0.7C hotter than the town in the early decades (1941-1970), and ~1.4C hotter towards the last decades of the overlap (1991-2020). I question the significance of the news.

See January mean of daily max.

http://www.bom.gov.au/climate/averages/tables/cw_005017.shtml

http://www.bom.gov.au/climate/averages/tables/cw_005016.shtml

The crude comparison above may not exactly quantify a fixed offset or bias. But it is worth further investigation of raw data & the differences need better explanations.

tygrus
Reply to  tygrus
February 23, 2022 7:11 pm

The weather stations in the Penrith & Richmond (NSW) areas have each occupied at least 3 locations. These 2 towns should have similar climates except for the tracking of some clouds: similar low altitude, same river, same mountains to the west. The early locations were rural / small country town, surrounded by irrigated farms with no water restrictions. Now the Penrith weather station is north-west of a much larger suburban city (but near the Penrith lakes, the river, and open space with some buildings). The Richmond weather station is now at the RAAF base (airport), with less urban development compared to Penrith. Both now limit outdoor water use during the day, less area has crops, and less irrigation is used.

You would think the 2 locations would have a very similar climate if they had no human activity. But now the Penrith measurements are typically 0C to 2.5C warmer than Richmond. The average would probably be about 1C warmer. It depends greatly on UHI, wind direction/speed, cloud position & storm paths.

Remember, the temperature (peak summer or coldest winter day) can vary 10C within 50km from the Sydney coast to the western suburbs, then gets colder over the next 50 to 100km as you go up the mountains or into the southern highlands. How do they average these for grid cells of 250km?

Gunga Din
February 23, 2022 3:34 pm

As far as a turbidity visual, there are a number of companies that sell turbidity standards for calibrating turbidity meters. Most are liquids that are inverted several times before use. They’re not cheap but they are accurate. (The turbidity shows up as a white color.) The standards are often made with formazine. It will cloud the water but tends to take a longer time to settle out.
Since you are only looking for a visual illustration, you could buy one very high NTU standard (say, 4000 NTUs) and add different amounts to the same volume of clear water.
If you could get a clear map to put over a light table, then place your different NTU containers around the location to be “homogenized”. Have an empty container for the missing site. Draw off the same volume of turbid water from each of the surrounding “sites”.
Camera angle would likely need to change to get the effect.
(If you have access to a turbidity meter, you could label each value.)

Gunga Din
Reply to  Gunga Din
February 23, 2022 3:46 pm

PS You could put a magnet stirrer under to eliminate the settling issues.

Janice Moore
February 23, 2022 3:34 pm

Also — show what a tiny mistake in plotting a plane’s or a ship’s course can make …. only .5 degrees off and……… after days of traveling….. uh, oh! You missed Hawaii…. you are out of fuel in the middle of the Pacific Ocean……. GAME OVER.

Andrew Partington.
February 23, 2022 3:37 pm

I would love to see a comparison of the digital thermometers used in Australia since 1998 which have 1 second resolution with the mercury thermometers used before then that have about a minute resolution. This must have a big impact on BOM stats.

Gunga Din
February 23, 2022 3:38 pm

As far as using actual temperatures of water goes, perhaps aquarium heaters/thermometers?

Reply to  Gunga Din
February 23, 2022 8:30 pm

Portable sous vide cooking units made today are pretty good at keeping liquids at constant temperature for long periods of time and they are reasonably priced.

Gunga Din
February 23, 2022 3:57 pm

A thought about data collection.
As I understand it, homogenizing temps involves taking surrounding “official” sites and extrapolating the temperature for a large area where there are no “official” sites.
In the stores I’ve noticed home temperature/weather stations that can upload the values to a website. (Of course that gives no clue as to proper siting.)
But if those home kits’ values in one of those large homogenized areas vary greatly …?

Nick Stokes
February 23, 2022 4:01 pm

The experiment that cries out to be done is to calculate the global average anomaly history with and without the adjustments, to see what difference it makes. I’ve done it, and the difference is small.

davidmhoffer
Reply to  Nick Stokes
February 23, 2022 4:04 pm

Define “small” Nick.

Clyde Spencer
Reply to  davidmhoffer
February 23, 2022 4:55 pm

While you’re at it Nick, you might mention which way the error goes.

MarkW
Reply to  Nick Stokes
February 23, 2022 5:28 pm

Look at the squirrel over there.

Alexy Scherbakoff
Reply to  Nick Stokes
February 23, 2022 8:27 pm

Adjustments are made on a regional basis. When you go global you ‘smear’ everything. Of course, there is little difference. Suddenly you get a new baseline.

AC Osborn
Reply to  Nick Stokes
February 24, 2022 2:38 am

Do you remember the post on here called On ‘denying’ Hockey Sticks, USHCN data, and all that – part 2
Look at the chart USHCN Temperatures Raw 5-yr Smooth

https://wattsupwiththat.com/2014/06/26/on-denying-hockey-sticks-ushcn-data-and-all-that-part-2/

It bears no resemblance to a currently plotted chart or the other charts Zeke posted, which have now unfortunately disappeared.

I have stored NCDC & Hadcrut charts that also bear no resemblance between old and new data, but I can’t post the Excel sheet on here.

Gunga Din
Reply to  Nick Stokes
February 24, 2022 2:31 pm

Question:
Did you use “TheWayBackMachine” to get the without adjustment values or what a current website now says are the without adjustment values?

Nick Stokes
Reply to  Gunga Din
February 24, 2022 4:48 pm

GHCN publishes the full unadjusted record here, updated daily. It is the qcu file, the qcf is adjusted. That is the primary source. From the discussion here, people do understandably get the impression that they only produce adjusted data.

Gunga Din
Reply to  Nick Stokes
February 24, 2022 7:51 pm

Try putting that site address in the search of TheWayBackMachine.

Nick Stokes
Reply to  Gunga Din
February 25, 2022 1:34 am

You don’t need to, and it would be of no use. Each day they post an updated version of the entire history (unadjusted and adjusted). That overwrites previous versions.

Gunga Din
Reply to  Nick Stokes
February 26, 2022 5:49 am

TheWayBackMachine may have histories before they were overwritten.

Gunga Din
Reply to  Gunga Din
February 26, 2022 2:07 pm
VJones(@diggingintheclay)
Editor
February 23, 2022 4:11 pm

Anthony, you might remember I spent some time exploring the temperature record a decade ago, and homogenisation really bugged me. While I don’t have a suggestion for an experiment, the effect of what is homogenised is likely to be important. And I don’t just mean rural vs urban. Some insight here – https://diggingintheclay.wordpress.com/2012/10/29/the-trouble-with-anomalies-part-2/ Actually obvious, but challenging to prove an effect.

Dr. Jimmy Vigo
February 23, 2022 4:16 pm

Hi, I’m a PhD MS chemist, always here to help. Experimental design is the most solid foundation of a study. The immediate technical issue that comes to my mind is how representative your model is with respect to the real system you are trying to mimic and draw conclusions about. Statistics play the role of making sure that the experiment is objective and can support valid conclusions based on confidence intervals. One problem with measuring climate from the point of view of temperature is that this reference has too many contributing variables, so it is too hard to distinguish the pieces of heat from the many contributors, including CO2. It is scientifically more adequate to study an issue from a reference point that has the minimum of contributions, so that correlations become correspondences, in which there’s more confidence to affirm a cause and effect. This is why scientists are saying that it would be better to study the climate change issue not from the temperature point of view, but from pressures and volumes. The thermodynamics of temperature for gases is established by the ideal gas law PV = nRT, so data on pressure and volume can be used to indirectly study temperature changes.
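The ideal-gas-law point can be illustrated directly (a minimal sketch using standard room-condition values, not from the comment):

```python
# With pressure P, volume V, and moles n known, temperature follows from
# T = PV / (nR), the rearranged ideal gas law.

R = 8.314  # gas constant, J/(mol*K)

def temperature(p_pascal, v_m3, n_mol):
    """Ideal-gas temperature in kelvin from pressure, volume, and moles."""
    return p_pascal * v_m3 / (n_mol * R)

# 1 mol at 101325 Pa occupying 24.465 L (about room conditions):
print(round(temperature(101325, 0.024465, 1.0)))  # 298
```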

A recent article here was talking about the issue of the heat-transmission mechanism called convection, used by gas molecules to transmit heat in very low amounts, and the ability of CO2 to heat up the atmosphere. Another recent article mentioned that there’s no scientific proof that temperature is a function of CO2 concentration; no one knows if that relationship is linear, logarithmic, quadratic, exponential, power series, …, and that affects a lot about the real role of CO2 in the atmosphere.

Simulating experimentally in a lab something about the atmosphere is actually nearly impossible for modern science. Notice that we are attempting to extrapolate the science of the minuscule into the macro atmosphere. This transference requires major adjustments to the existing theory of gases to account for effects greater than the quantum effects on which thermodynamics is founded, contributions that are not accounted for in the formulas. The adjustments are mostly unknown, so using the science of thermodynamics as it is wouldn’t even be enough to really understand the behavior of the atmosphere. The classification of the atmosphere under chaos theory speaks for itself: the indeterminism of random events makes future predictions impossible.

In addition, any experiment attempting to mimic something related to CO2, warming, and the earth would have to make sure that the ratio of gases is kept at about 78% N2, 21% O2, 0.90% water vapor, 0.04% CO2; better experiments must include cycles of carbon transformations with the atmosphere, soils and oceans. The earth is a living equilibrium of constant physico-chemical changes of a level of articulation beyond comprehension. Whoever can overcome all of these issues, and a lot more, to come up with real scientific answers deserves a PhD.

Another recent article discussed a question that I happened to have been asking for a long time: no one knows the macro equivalent of the thermodynamic concept of heat capacity for the atmosphere, meaning that we don’t know how much heat the atmosphere can take before showing a change of 1 degree. This would say a lot about the possibility of any gas changing its overall temperature. In addition, the exact amount of heat that individual molecules of CO2 can absorb and transmit is not clear.

Furthermore, experiments would have to assess issues such as the quantum effects of molecular modes of vibration attained by molecules of CO2 as a consequence of absorbing IR energy; this does not account for heat as in temperature.

As a researcher I advise being careful with experiments, because experiments can be easily flawed and even manipulated. If anyone is attempting to do experiments and share results, I strongly advise going the standard route of designing the experiment, running it, getting data, drawing conclusions and publishing in a peer-reviewed journal. Until then it is only a hypothesis/an idea.

I will read more of your experiment and Gore’s; I didn’t finish it. Always here to help; I could take some time to check out your experimental design. Good luck with it.
Thanks.

JBVigo, PhD

Alexy Scherbakoff
Reply to  Dr. Jimmy Vigo
February 23, 2022 8:06 pm

I don’t think the experiment/demonstration is about reproducing the physical effects of homogenisation. I think Anthony is looking for an analogy. Analogies are limited in scope.
He’s probably looking for a WAH !!! factor that immediately makes a person think.

cbean
February 23, 2022 4:16 pm

Y’all ‘Climate Deniers’ gonna get us all killed: Massive explosion on far side of the sun could have been catastrophic for Earth https://freerepublic.com/focus/f-chat/4040192/posts

Clyde Spencer
February 23, 2022 4:45 pm

Anthony,
I would suggest taking known historical weather station data, randomly deleting a few stations, and seeing if the homogenization algorithms can recover the deleted data. This might be tried for areas with different topographic relief, areas with different humidity (enthalpy), different seasons (storminess), and areas with different variances.

I would be surprised if you would get agreement to the same precision as the original data. However, that in itself would be instructive as it would give insight on how homogenization might degrade the reliability of the station data, when the purpose is to improve the database.

Tim Gorman
Reply to  Clyde Spencer
February 24, 2022 12:08 pm

I agree with this. Points to consider:

  1. The uncertainty of the surrounding stations must be introduced somehow.
  2. actual temps vary over time. The temp at station 1 at time t0 won’t be the same as the temp at station 2 at time t0. Homogenization assumes the temps *will* be the same regardless of the time/temp difference, terrain difference, humidity, etc. You can attempt to just use Tmax but even Tmax happens at a different point in time for each station and lots of things can happen in that time interval (e.g. cloud cover, pressure, etc) – difficult to account for.

When you homogenize the temps the uncertainties in each station will combine, probably by root-sum-square, making the final uncertainty of the sum used in the average higher than the uncertainty of each individual homogenization station. Not much of any way to reduce that.
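The root-sum-square combination described above can be sketched as follows (assuming independent errors; the station uncertainties are invented for illustration):

```python
# Independent uncertainties combine in quadrature (root-sum-square), so the
# uncertainty of the sum of neighbour stations exceeds that of any one input.
import math

def rss(uncertainties):
    """Root-sum-square of individual station uncertainties (deg C)."""
    return math.sqrt(sum(u * u for u in uncertainties))

# Five neighbour stations, each good to +/-0.5 C: the sum carries ~1.12 C
# of combined uncertainty.
print(round(rss([0.5] * 5), 3))  # 1.118
```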

Frank S.
February 23, 2022 4:46 pm

May I suggest Spell Check be used when typing your title, Einstein? There’s no such word as “homgenization”.

PaulID
Reply to  Frank S.
February 24, 2022 10:40 am

Never worked anywhere dairies exist, have you? Homogenization is all over the place there. And interestingly, if it doesn’t exist, why doesn’t my spellchecker flag it?

Frank S.
Reply to  PaulID
February 24, 2022 11:26 am

My posting (dated 2/23) appeared when the misspelled word appeared in the title. Now, (as of 2/24) that misspelling has been mysteriously corrected. Way ahead of you, Pauli!

Gunga Din
Reply to  Frank S.
February 24, 2022 2:59 pm

Some people use programs that “translate” the spoken word into text. I don’t know if such programs include “spell check”.
I believe Anthony uses such a program. (Dragon Speak?)
Have you ever watched an old movie on, say, YouTube and turned on “closed caption” where it’s done “on the fly”? The results can be amusing, especially if the program is trying to caption a Scottish accent in an old Sherlock Holmes movie!

Generally, authors of posts appreciate typos being pointed out, and address them.
But the “Einstein” crack was way out of line.

RayB
February 23, 2022 5:38 pm

The problem in using dirt is that the bigger particles will settle to the bottom faster, and the lighter ones will also settle, just much more slowly. Food coloring is better, but the turbidity is only linear at higher concentrations. Moreover, the temperature will affect the absorption maxima, which is not a linear relationship either. The standard material used to calibrate turbidimeters is formazine. It gives a better linear relationship and can be opaque at high concentrations.

I would suggest you look into fluorescence. The fluorescence of compounds is affected by concentration and temperature. Fluorescein is an easy and good molecule to work with. You can play with these parameters to give you the best effect.

The Dark Lord
February 23, 2022 5:40 pm

Colored water won’t work. When you mix, you’ll be taking away some liquid from the dirty water to mix it with the clean, thus reducing the amount of dirty water. The fraudsters aren’t doing that: they are keeping the dirty value and changing the clean value, so the weight of the dirty never goes down.

Kemaris
February 23, 2022 5:48 pm

Might be worthwhile (if no one has done this yet) to pull the USHCN data and assign quality factors of 1 through 5 to the monitoring stations based on Anthony’s earlier siting analysis. Do a graph of the data, and the resulting trend over time, using only quality 1 (best) stations, then 1 and 2, then 1, 2, and 3, etc. This would graphically demonstrate how the trend changes as lower and lower quality data is allowed to intrude.
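The incremental-quality idea could be sketched like this (a toy ordinary-least-squares trend over invented station data, not the real USHCN record):

```python
# Compute the linear trend using only class-1 stations, then classes 1-3,
# and watch the trend change as lower-quality stations are admitted.

def linear_trend(years, temps):
    """Ordinary least-squares slope (deg C per year)."""
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    num = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = [2000, 2005, 2010, 2015, 2020]
# (quality class, yearly temps) -- invented numbers:
stations = [
    (1, [15.0, 15.0, 15.1, 15.1, 15.1]),   # well-sited: nearly flat
    (3, [15.0, 15.2, 15.4, 15.6, 15.8]),   # poorly sited: warming
]
for max_class in (1, 3):
    subset = [t for c, t in stations if c <= max_class]
    mean_temps = [sum(vals) / len(vals) for vals in zip(*subset)]
    print(max_class, round(linear_trend(years, mean_temps), 4))
```

Admitting the poorly sited station roughly quadruples the apparent trend in this toy case.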

Eric Worrall(@eworrall1)
Admin
February 23, 2022 5:57 pm

I suspect homogenisation is amplifying urban heating.

Rural stations are more likely than city stations to suffer a break, but city stations also suffer breaks.

So city urban heating lifts rural stations when they break, and growing urbanisation around rural stations lifts city stations when they break, resulting in an amplified uplift caused by UHI growing at different rates in different regions.

Alexy Scherbakoff
Reply to  Eric Worrall
February 23, 2022 7:55 pm

I’ve never been certain about how homogenisation works. Does it lower the hotspots while raising the cool spots? Or does it just raise the cool spots without touching the hotspots?

Lil-Mike
February 23, 2022 5:58 pm

I would use vertical bar graphs indicating measured and homogenized data across a longitude line, such as San Francisco, Sacramento, Reno.

Of course, to any Californian, we see that San Francisco temperatures are moderated by proximity to the global oceanic heat sink, Sacramento on the other hand is in a very dry valley merely 3′ above sea level, yet 90 miles from the sea. Reno on the other hand sits at about 4,400′ elevation.

On any given summer day, San Francisco will be in the 60s, Sacramento above 100, Reno in the 80s. All for different climatic reasons.

Does Sacramento data pull up San Francisco and Reno? Does SF & Reno pull down Sac? Or perhaps these things “just are” … with apologies to Thoreau.

H.R.
February 23, 2022 6:44 pm

The problem I’ve been wrestling with is the containers. If they are open, as they are heated to the different desired temperatures, I’m not sure what the effects of the varying rates of evaporation might be. Maybe significant, maybe not.

So, spitballing here, how about filled-to-the-top closed containers with the desired colored water? A tube at the top allows the colored water to be forced out in proportion to the heat applied. Apply sufficient heat to each container such that more colored solution is dumped into the (?) container from the warmer containers.

Run the tubes to the (?) container.

I’d think you’d have to account for the lengths of the tubes as either another variable or if the heat loss in the run of the tube is negligible, the tube lengths could be longer or shorter to represent the varying distances from the (?) jar.

Already I’m seeing details I’ve left out. Of course there’s a bit of calculating of the thermal expansion coefficients, tube inside diameters, lengths and whatnot.

Those aquarium heaters may be useful for getting the filled jars to desired nice, warm, different temperatures. There would need to be a valve on each tube to hold in the contents of the various jars. I’d think the valves would need a mechanism to allow them to all be opened at once when the jar temperatures stabilized so you’d get instantaneous ‘homogenization’ into the (?) jar, as that is what occurs mathematically with temperature data homogenization.

That’s my ball of yarn for others to pick apart or add a few wraps.

Alexy Scherbakoff
Reply to  H.R.
February 23, 2022 7:49 pm

Good try. Based on the complexity of your design, I would simply add a tesla coil emitting sparks for added effect.
I am joking. At least you have come up with something.
You have come up with an idea but realise its shortcomings.

H.R.
Reply to  Alexy Scherbakoff
February 23, 2022 8:10 pm

Thanks, Alex. Yes, just spitballing, but enough of it and someone is bound to see the path forward.

Alexy Scherbakoff
February 23, 2022 6:48 pm

The box is painted matt black inside.
Dimensions of box- essentially a cube about one foot on a side.
The mirror should be a sheet of mirror plastic (easy to cut to whatever size you want and also not fragile).
LEDs should be white and adjusted for intensity.
The membrane should be a reflective type and stretched enough to be uniform. It could possibly be a white elastic sheet.
The video camera is a small variety that is connected to a laptop/PC.

Pressing on the membrane with your finger should make a dark spot appear on the screen. Greater or lesser pressure would make the spot larger or smaller.
The analogy is not the pressure applied but the size and intensity of the dot and how it can be reduced by homogenising with the background. Several pressure points at different ‘intensities’ can be introduced simultaneously.

The apparatus is not fragile and the components can be easily replaced/retensioned. Filters and solutions with pumps can get very complicated.


LdB
February 23, 2022 7:21 pm

Almost any distribution that is “non-normal” requires special handling.

This came up when our mate Nick Stokes was blending temperatures over land and water cells on one of his stupid blog pages.

Alexy Scherbakoff
Reply to  LdB
February 23, 2022 8:15 pm

Don’t mention Old Nick. When you mention Old Nick then Old Nick comes.
Paraphrasing a Chinese saying.

Jim Gorman
Reply to  LdB
March 1, 2022 8:13 am

I would second this. From Tmax/Tmin to seasonal differences to hemispheric differences, there are large temperature differences that result in non-normal distributions. Anomalies attempt to remove the variance between absolute temperatures so that averaging can be used and “trends” can be discerned. Statistically this removes so much information that any calculated metric is of questionable value.

A way to show this is to carefully make trends at various stations in various regions. For every station that has little warming, or even cooling, over the last 150 years, there must also be a station with twice the warming to reach the average that is portrayed in the GAT.

I have become convinced that this is only possible by using UHI stations that have considerable warming.

Julian Flood
February 23, 2022 8:29 pm

Anthony, humanity is carrying out major climate experiments of AGW already and no-one’s looking.
Please, please look at the way enclosed bodies of water are warming at double or treble the rate expected. The Sea of Marmara is a perfect example.
In articles at the blog TCW Defending Freedom I suggest the oil, surfactant and lipid smoothing of the water surface is lowering albedo, reducing evaporation and preventing the production of salt aerosols by wave suppression.
You could start by repeating Benjamin Franklin’s oil drop experiment on Mount Pond. All you need is a lake and 5ml of olive oil.
Latest blog post is titled Cold Comfort.

A search will find an image of a smooth on FEN Lake at UEA which you will find amusing, right under their noses.

JF

Julian Flood
Reply to  Julian Flood
February 23, 2022 8:45 pm

Anthony, closer to home at WUWT you will find an entry to the competition you ran some time ago which covers the territory, with references. Since then the role of oleaginous plankton has been emphasised by Marmara’s outbreak of mucilage caused by diatom blooms. I believe that a couple of the Great Lakes are warming at a high rate.

Remember Tom Wigley’s ‘Why the blip?’ Sunk oil carriers during WWII.

I’ve seen a smooth over a hundred miles across, from abeam Porto to a couple of hundred miles short of Madeira: tens of thousands of square miles. Franklin should be alive this day.

Any non-CO2 contribution to AGW reduces the need to trash civilisation. This mechanism explains a lot about the anomalies in the one control knob theory and can be addressed by reducing oil surfactant and sewage dumping into rivers and the oceans.

JF

Julian Flood
Reply to  Julian Flood
February 24, 2022 12:20 am

see https://en.wikipedia.org/wiki/UEA_Broad, top image, for the UEA smooth.

JF

Joel O'Bryan(@joelobryan)
February 23, 2022 9:41 pm

Anthony,
Ross McKitrick is your expert go-to source on how to do this. As you know, he has studied all the paleoclimate quackery statistical methods the ‘consensus’ has used to create hockey sticks.

February 23, 2022 10:16 pm

Anthony, can you point me to an up to date compilation of classifications (1-5) for California weather stations? I am conducting some research on the 102 California stations with data extending back at least 90 years.

JonasW
February 23, 2022 10:43 pm

Homogenization includes two different corrections:

1- Correction for breakpoints – e.g. when a station is moved. That is corrected by PHA.

2- Correction for trends – e.g. UHI effect, or that trees have grown around the station.

Both those corrections make sense if they are applied in a correct way, but they open possibilities if the adjuster wants to “prove” global warming.

Some years ago I found a paper describing NOAA´s homogenization method. Unfortunately I have not been able to find it again.

What really struck me was that they included a “regional climate trend” in their method.

In short it worked like this: the model included a climate trend saying that the temperature should increase by X °C/year. If a station did not show that increase, there was something wrong with the measurement. The value should have been higher (due to the climate trend).

The algorithm works in such a way that it keeps today’s value unchanged. Consequently, the historical values are adjusted downwards.

I have tried hard to find a present detailed description of NOAA´s homogenization algorithm in order to see if they really use the “climate trend”-correction. Since I have not found the description, I do not really know if they apply this today.

Tony Heller has done some very interesting work in which he shows that the corrections are proportional to the increase in atmospheric carbon dioxide.

One can suspect that the algorithm includes a “climate trend” that is assumed to be proportional to atmospheric CO2. If that is the case it can explain most of the so called “adjustments”.

My question is thus: does NOAA use a “climate trend” correction in its homogenization algorithm?
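The adjustment scheme described above could be sketched as follows (a toy reading of this comment's description, emphatically NOT NOAA's actual code):

```python
# Assume a fixed "climate trend", keep today's value unchanged, and shift
# earlier values down so the series exhibits that trend.

def backdate_adjust(temps, trend_per_step):
    """Shift each value down by trend_per_step for every step it lies
    before the final (present-day) value, which is left unchanged."""
    n = len(temps)
    return [t - trend_per_step * (n - 1 - i) for i, t in enumerate(temps)]

flat = [15.0, 15.0, 15.0, 15.0]        # a station that shows no warming
adjusted = backdate_adjust(flat, 0.1)  # impose 0.1 C per step
print([round(v, 1) for v in adjusted])  # [14.7, 14.8, 14.9, 15.0]
```

The raw series was flat; the adjusted series now shows warming, all of it introduced before the present day.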

Ben Vorlich
February 23, 2022 11:11 pm

My own experience of UHI, mainly from rural France, is that it affects even small villages and hamlets of a few hundred people. In areas like the East Midlands of England where towns and villages have little or no open spaces between them then it is much harder to detect without the aid of a thermometer.
The other thing about UHI is it is less noticeable on windy days/nights.
I think that virtually the whole of England is affected by UHI, as is Central Scotland, and South Wales. The Benelux countries will have similar issues.
How you model the wind related contamination I don’t know.

michel
February 24, 2022 1:04 am

Don’t think your proposed experiment will show much, if anything. You have to work with the real-world data, and show that there is an effect in it. What you’re proposing is argument by analogy, which rarely convinces and never proves anything anyway.

Could you just iteratively plot, for each station, what its temperature is with and without homogenization?

The proposition you are trying to falsify is this: with homogenization, the mean temperature of the set of stations is higher than the mean of the same set of stations with no homogenization applied.

So if you take homogenization off one station at a time, and see what the results are, noting the temp of each station for which it’s been removed as you do so, you should be able to see whether the mean of the set rises or falls.

Maybe I am missing something. In any case, I think the most important first step is to state clearly in quantitative terms what the proposition is that you are seeking to falsify.

Don’t even think you need to do it one at a time, do you? Why not just take the entire set and take the mean with and without homogenization? Is it higher or lower? It should be quite simple.

If I understand the argument, you are saying that when you take 10 stations raw scores, and homogenize, the average temperature of the ten rises. This should be quite easy to check without any physical experiment.

Or have I missed something?
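The whole-set check could be sketched in a few lines (all numbers invented for illustration):

```python
# Compare the mean of raw station values with the mean after adjustment.

raw = {"A": 14.8, "B": 15.1, "C": 15.0}
homogenized = {"A": 15.0, "B": 15.1, "C": 15.2}  # hypothetical adjustments

mean_raw = sum(raw.values()) / len(raw)
mean_hom = sum(homogenized.values()) / len(homogenized)
print(round(mean_hom - mean_raw, 3))  # 0.133 -- adjustments added warmth
```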

Tim Gorman
Reply to  michel
February 24, 2022 12:13 pm

If you don’t take the uncertainties of the measurements into account, then you won’t be able to tell if the mean of the set rises or falls by an amount outside the propagated uncertainty.

Rich Davis
February 24, 2022 2:16 am

With all the different ideas, it occurs to me that another persuasive factor can be to see the same point made by a series of different analogies. Picking two (or even three) ideas could be more powerful than just one.

This also allows for a kid-friendly chocolate milk experiment and a more elaborate instrumented experiment that "confirms our simple experiment."

February 24, 2022 3:15 am

A fundamental problem with homogenization is the rate at which the correlation decays with distance from the source station. For simplicity it is assumed that a circle can be drawn around the measured point, with the fall-off in correlation depending only on distance from the centre. However, in the real world this is demonstrably not so!

For example, Australian stations correlate poorly east or west of each other but are more strongly correlated north or south.

The only way to approach reality is to allow for this anisotropy and the best way to do that is to measure the covariance of temperature in the real world via observations of the shape of its atmospheric fields and by mapping the real terrain!

Anthony, you could make a compelling but simple, perhaps animated, demonstration on paper with a number of points marking stations with circles or ellipses around them, comparing the result when the correlation decay is circular versus elliptical. This would demonstrate the error in homogenised data created by assuming that temperature fields are isotropic*, i.e. that the correlation decay is constant in all directions.

In reality atmospheric fields are rarely isotropic, and indeed the maintenance of westerly flow in the southern extratropics against frictional dissipation is only possible due to the northwest-southeast elongation of transient eddy activity (Peixoto and Oort 1993)**.

*Hansen and Lebedeff (1987) assume the isotropy of the covariance of temperature!
**Jones, DA & Trewin, Blair. (2000). The spatial structure of monthly temperature anomalies over Australia. 

Last edited 3 months ago by Scott Wilmot Bennett
Tim Gorman
Reply to  Scott Wilmot Bennett
February 24, 2022 12:29 pm

The daily temperature profile is driven by the rotation of the earth, which determines the angle of the sun. The profile is close to sin(t) during the day. It's not quite a sine at night, but it's close.

If you take two locations separated east-west you have two profiles, sin(t) and sin(t + a). The correlation of those two curves is cos(a). By my calculation that works out to about 50 miles for a correlation factor of 0.8, and anything less than 0.8 I would consider poor correlation.

The north/south factor is also a sine function, because the angle of incidence of solar insolation changes as you move away from the equator. That's probably a cos(a) function: at the equator you get maximum insolation, i.e. cos(0), and as you go north or south the insolation goes down. The difference between two locations would be cos(a) and cos(a + b). I haven't calculated that correlation factor, maybe sin(b)?

This is just distance. Of course for sin(t + a) “a” is a function of several factors such as elevation, humidity, terrain, distance, etc.
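The claimed identity, that the correlation of sin(t) and sin(t + a) over a full period equals cos(a), is easy to verify numerically (the translation of the phase lag into miles is a separate step not attempted here):

```python
import numpy as np

# Numerical check that corr(sin(t), sin(t + a)) = cos(a) over a full period.
t = np.linspace(0, 2 * np.pi, 100_000, endpoint=False)
for a in (0.0, 0.2, np.pi / 4, np.pi / 2):
    r = np.corrcoef(np.sin(t), np.sin(t + a))[0, 1]
    print(f"a = {a:5.3f} rad   corr = {r:+.4f}   cos(a) = {np.cos(a):+.4f}")

# The phase lag at which correlation drops to 0.8:
a_08 = np.arccos(0.8)
print(f"corr = 0.8 at a = {a_08:.4f} rad ({np.degrees(a_08):.1f} deg)")
```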

Joao Martins
February 24, 2022 3:35 am

Turbidity mixing and color mixing cannot be measured in the same way. Take turbidity as a measure of light passing through something: you measure total light. Mixing different colors, on the other hand, requires measurements at different wavelengths, and that introduces too much confusion and too many explanations. My bet would be on something creating "shades of gray", or of color within a narrow range of wavelengths, such that the increase in gray or color can be directly related to an increase (or a decrease) of temperature.

H.R.
Reply to  Joao Martins
February 24, 2022 4:36 am

Another vote for chocolate milk. 😜

Gunga Din
Reply to  Joao Martins
February 24, 2022 3:25 pm

If you mix different liquid colors, you’re going to end up with black eventually.
If color is used for the visual analogy, it will need to be the same color with different amounts added, to produce stronger or lighter shades that show the differences visually. (Communicate the point "at a glance".)
What Anthony didn't tell us is what resources he has available.

H.R.
Reply to  Gunga Din
February 24, 2022 5:21 pm

And yet another vote for chocolate milk. 😜😜

Gunga Din
Reply to  H.R.
February 26, 2022 4:29 pm

I wondered about your votes for chocolate milk, so I did a Ctrl-F and put in "chocolate milk".
I seem to have missed a number of comments about it.
Yummm …er … Hmmm … if the desired visual is to show that the empty “vial” in the center is filled in with some from the surrounding vials, I don’t think chocolate milk will show enough of a contrast. (Unless one or more of the vials was straight chocolate syrup.) 😎 😎

tygrus
February 24, 2022 3:50 am

Which adjustments do you expect to follow a bell curve of +/- changes with a net-zero result?
Which adjustments do you expect to show an almost linear upward trend?
How likely would you expect the adjustments to be responsible for the majority of the calculated temperature rise?
How much of the calculated temperature has been infilled/adjusted using bogus climate models that give circular validation (predict a rise, so the infill/adjustments enforce that assumption and "verify" the model — circular reasoning)?

Antonio
February 24, 2022 4:26 am

I think a good visual analogy might be sanding a wooden part that has some functionality.
Homogenizing can be likened to sanding out the roughness of the surface of a wooden part. The roughness would be "unwanted noise on the signal"; we just prefer our part to have a smooth surface.
But sanding should not change the functional shape of the part, because then the part loses its functional meaning and no longer works. If our part has a relief which is functional, like the relief in a key that opens a lock, or teeth that mesh with another part to transmit movement or rotation, then that relief is significant, and sanding can erode the meaningful shape of the part.

So homogenization, like sanding, can eliminate the meaning of the signals we have.

In the case of the temperature record, the stations contain at least two meaningful signals. One is the global temperature evolution (which exists and depends on different factors, including solar irradiation as well as CO2, CH4 and H2O vapour), aka Global Warming — sorry, Climate Change. But there is also a strong signal linked to urbanization-induced warming (which obviously differs between rural and city areas), and that includes heating, air conditioning, and industrial heat sources.

Homogenization that "kills" the urbanization warming signal detracts from the dataset's value and changes the average. When homogenizing, the average should be preserved, as well as local signals, and the algorithm should only chase noise generated by, for instance, changes in sensors or station moves (very careful here).

Thanks for your work. I appreciate very much your sensible approach to these issues.

pklinge
February 24, 2022 7:04 am

The need for the PHA comes from the bad idea of averaging long continuous temperature (anomaly) time series. You should refute the continuous time series method by using a method that does not rely on station time series and thus does not need homogenization.

It goes like this: calculate the regional area-weighted average temperature each month, building a new triangular grid for the region each month. That takes care of missing temperature measurements, so you don't have to worry about continuity at each measurement location. The result is the best estimate of the average temperature of the studied region for each month, giving you a continuous regional temperature time series without any homogenization.
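A minimal sketch of that monthly re-triangulation idea, using a flat-plane Delaunay triangulation over whichever stations reported. The station coordinates and readings are invented for illustration; real use would need spherical geometry and a better within-triangle interpolant:

```python
import numpy as np
from scipy.spatial import Delaunay

def regional_monthly_mean(lonlat, temps):
    """Area-weighted mean over a fresh triangulation of whichever stations
    reported this month (NaN = missing). Flat-plane sketch only."""
    ok = ~np.isnan(temps)
    pts, vals = lonlat[ok], temps[ok]
    tri = Delaunay(pts)
    total_area = weighted_sum = 0.0
    for simplex in tri.simplices:
        a, b, c = pts[simplex]
        # Triangle area via the 2-D cross product.
        area = 0.5 * abs((b[0] - a[0]) * (c[1] - a[1])
                         - (b[1] - a[1]) * (c[0] - a[0]))
        weighted_sum += area * vals[simplex].mean()
        total_area += area
    return weighted_sum / total_area

# Hypothetical station network and one month of readings (one missing).
rng = np.random.default_rng(0)
lonlat = rng.uniform(0.0, 10.0, size=(12, 2))
temps = rng.normal(15.0, 3.0, size=12)
temps[4] = np.nan                     # station 4 did not report this month
print(f"regional mean this month: {regional_monthly_mean(lonlat, temps):.2f}")
```

Because the grid is rebuilt from the reporting stations each month, a station dropping out changes the weights, not the method — no infilling or breakpoint adjustment is needed.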

menace
February 24, 2022 8:30 am

Homogenization should include site quality ratings and high rated sites should be used to influence low rated site readings and not vice versa.

If homogenization is done blindly and there are more low-quality sites than high-quality sites, the reverse happens: the high-quality sites' adjusted data are corrupted, while the low-quality sites receive much smaller corrections simply because they are in the majority.

Chart 1 clearly shows this. The corrected data tracks the low-quality stations, and the high-quality stations are labeled as the bald-faced liars.

Curiously, the adjusted data trend is even a bit higher than what the low quality stations indicate. (Those who write the homogenization code have also put their finger on the scales?)
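A toy illustration of that quality-weighting idea. All numbers are synthetic, and the "pull low sites toward the high-quality reference" scheme is just the suggestion above, not anything the GHCN actually implements:

```python
import numpy as np

rng = np.random.default_rng(1)
decades = np.arange(10)
true_trend = 0.1                                  # deg C/decade signal to recover

# 3 high-quality sites: signal plus small noise.
high = true_trend * decades + rng.normal(0, 0.05, (3, 10))
# 7 low-quality sites: same signal plus a spurious extra warming (e.g. UHI).
low = true_trend * decades + 0.3 * decades + rng.normal(0, 0.05, (7, 10))

blind = np.vstack([high, low]).mean(axis=0)       # blind average: majority wins
reference = high.mean(axis=0)                     # high-quality reference
corrected_low = low - (low.mean(axis=0) - reference)  # pull low sites to ref
quality = np.vstack([high, corrected_low]).mean(axis=0)

def trend(series):
    """Least-squares slope in deg C/decade."""
    return np.polyfit(decades, series, 1)[0]

print(f"true trend:           {true_trend:.3f}")
print(f"blind-average trend:  {trend(blind):.3f}")
print(f"quality-guided trend: {trend(quality):.3f}")
```

The blind average inherits most of the majority's spurious warming, while the quality-guided version recovers roughly the true trend — which is the point of the comment.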

ferdberple(@ferdberple)
February 24, 2022 2:15 pm

The GHG hypothesis is wrong if it says adding CO2 MUST raise average temperature. That is a simple matter to prove false mathematically.

There is a 4th power relation between radiation and temperature but a linear relation between average temperature and actual temperature.

Thus by changing the distribution of temperature while decreasing the average temperature you can actually increase the radiation!!!! Completely opposite to GHG theory.

This follows from the Hölder inequality for sums, for those seeking a formal proof.

The missing ingredient is variance. By reducing the variance pair-wise homogenization maintains the average temperature but reduces calculated outgoing radiation. However, if one then corrects for radiation, since energy must be conserved, you must increase the average temperature to maintain the same radiative balance.

At first look, pairwise homogenization looks like a way to increase average temperatures by manipulating variance while maintaining the same radiative balance.

A neat trick, whoever figured it out. Of course it is mathematical trickery (nonsense), but the layman would never catch it.

Last edited 3 months ago by ferdberple
ferdberple(@ferdberple)
February 24, 2022 2:38 pm

Take two identical objects. Temp 0K and 10K. Average temp 5K. Radiation is proportional to 0^4 + 10^4 = 10,000.

Now homogenize these 2 objects to 5K each. Average temp remains 5K. But radiation is proportional to 5^4 + 5^4 = 1250.

But hang on. Homogenization has resulted in a loss of energy, which violates the fundamental requirement that energy be conserved.

So homogenization requires that we raise the average temperature so as not to violate the conservation of energy.

How much? It turns out we must raise the average temperature from 5K to about 8.4K during homogenization to avoid creating or destroying energy.
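The arithmetic checks out:

```python
# ferdberple's numbers: two identical blackbody objects at 0 K and 10 K.
T1, T2 = 0.0, 10.0
radiation = T1 ** 4 + T2 ** 4      # proportional units; sigma and area cancel
homog = 5.0 ** 4 + 5.0 ** 4        # both objects homogenized to the 5 K mean

# Temperature both objects must share to emit the ORIGINAL radiation:
T_equiv = (radiation / 2) ** 0.25

print(f"radiation before homogenization: {radiation:.0f}")
print(f"radiation after homogenization:  {homog:.0f}")
print(f"temperature needed to conserve energy: {T_equiv:.2f} K")
```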

tygrus
Reply to  ferdberple
February 24, 2022 3:29 pm

I would have thought the error was in their calculation of radiative forcing when it is based on average temperatures instead of smaller grid squares or actual values. If you underestimate the longwave radiation leaving Earth, your model will run hot, among other problems.
The actual energy content of a volume of space/matter depends on many factors. For gases it includes the mixture of gases and the water content, each with its own heat capacity; pressure changes temperature without changing energy content per kg (though higher temperatures will increase radiation and convection); and higher pressure means more matter in the same volume, so more energy per volume. Temperature is not a universal scale or measure of energy.

tygrus
February 24, 2022 8:51 pm

The amazing power of compounding rounding errors…
https://xkcd.com/2585/

Dale Truman
February 25, 2022 7:49 am

Using the same dataset, you can get the same results using multiple regression analysis. And because there are too many input variables, there is too much error. Each attempt to add "information" also adds "error." The simpler the better, as usual. It's nice to have the PHA as a tool, but only with more or less homoscedastic data. "Which in our case we have not got." ("Naming of Parts" by Henry Reed)

Ossqss
February 25, 2022 4:34 pm

My thought is to use some sort of colored sand and distribution arrangement similar to the vials, but different. Perhaps using some type of directional advection as a mixing agent on positioned flat recessed plates similar to the US graphic above.

Rich Lentz(@usurbrain)
February 25, 2022 5:35 pm

Homework:

Pick 40 random numbers that are greater than 50 and less than 100. Determine their total [2,947] and average [73.6750].
Pick 60 random numbers that are greater than 50 and less than 100. Determine their total [4,855] and average [80.9167].
The average of 73.6750 and 80.9167 is 77.2958.
However, the average of ALL 100 numbers is 78.0200. Note the roughly 0.7 degree difference.

The reason I said numbers above 50 was to set a "bias." (You would get different answers without that bias, but not as large.) Just as there is a bias in all temperatures: some areas of the Earth never get above 20 °C (about the highest temperature ever recorded in Antarctica), and the lowest reading at the Equator is rarely below 10 °C. This establishes a bias in the temperatures averaged in these groups. Collecting all sea surface temperatures, taking their average, and then averaging just the land and sea averages would be like my example above.

Think of placing a pure sine wave of 5 volts on a bias voltage of 10 volts. The average is 10 volts (the average of a pure sine wave is ZERO); change the bias voltage and you change the new average voltage. While the temperatures collected may look like a sine wave, they are not; they are somewhat random. Thus you get a random, fluctuating global temperature.
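The homework effect is simply unweighted versus weighted averaging of unequal-sized groups, and it reproduces in a few lines (fresh random draws, so the exact figures will differ from the bracketed values above):

```python
import random

# Reproduce the homework with fresh draws: 40 vs 60 numbers in (50, 100).
random.seed(7)
group_a = [random.uniform(50, 100) for _ in range(40)]
group_b = [random.uniform(50, 100) for _ in range(60)]

avg_a = sum(group_a) / len(group_a)
avg_b = sum(group_b) / len(group_b)
avg_of_avgs = (avg_a + avg_b) / 2          # unweighted average of averages
true_avg = sum(group_a + group_b) / 100    # average of all 100 numbers

print(f"average of group A (40 numbers): {avg_a:.4f}")
print(f"average of group B (60 numbers): {avg_b:.4f}")
print(f"average of the two averages:     {avg_of_avgs:.4f}")
print(f"average of all 100 numbers:      {true_avg:.4f}")
```

The gap between the last two lines is exactly 0.1 × (avg_b − avg_a) here: it grows with the group-size imbalance and the difference between the group means, which is why the "bias" in each group matters.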

Tim Gorman
Reply to  Rich Lentz
February 26, 2022 6:39 am

Good example. Don’t expect to change the minds of all the CAGW advocates on here.

Andrew Partington.
February 25, 2022 7:25 pm

Kerang Victoria is mentioned on the BOM site as an example of how adjustments are done. http://www.bom.gov.au/climate/data/acorn-sat/
(See the tab at the bottom of the page for an example of the adjustment process.)

They did not take the urban heat island effect into account when deciding that past temperatures were too high because of a siting change.
What they should have done is gradually adjust temperatures in line with the gradually growing built-up area, i.e. the more buildings, the higher the measured temperature.
