Mind over Math: Throwing Out the Numbers

Opinion by Kip Hansen

Over the years there has been a lot of discussion about the power of numbers both to inform us and misinform us regarding the world around us.

In fact, some versions of the definition of science include the idea that all science is based on measurement – on numbers.  This idea is wrong from the start and extremely unfortunate.  Science – the effort to understand the physical world around us – must start with ideas – not numbers.  Ideas are strung together into hypotheses and those hypotheses are tested first against existing knowledge through the application of logic and critical thinking. 

Of course, many ideas can be confirmed or rejected based on measurements.  However, measurements (and here I specifically mean measurements turned into numbers) in today’s world are often misrepresented, misconstrued and miscommunicated.  With the advent of digital computers and their associated software, calculation and statistical analysis have become far too easy and seem to have replaced logic and critical thinking, even basic reasoning.  Data sets of numbers are an almost irresistible temptation to many of our colleagues in the various fields of science – they just can’t restrain themselves – they must dive in with Mathematica and other tools of digital analysis, making new sets of numbers and new visual representations of those numbers.  They seldom stop there – they must add their opinions into the data set visualizations as various trend lines and other ideas that are not part of the data set at all.

In science today, data sets are often confused with the real world.  We must acknowledge that data sets are nothing but collections of numbers – hopefully carefully collected, labelled, accurate and precise – and that they hopefully represent information about some real attribute(s) of some object or phenomenon of interest.

This is not always the case. For instance, the data set(s) known as Global Mean Sea Level do not refer to the actual surface height of the global seas or their mean value, but rather to a concept of what that height would be under non-existent conditions, such as “if the ocean basins had not expanded” and “if water storage on land had not increased”, as this quote illustrates:  “Eustatic (global) sea level refers to the sea level change of the volume of Earth’s oceans. This is not a physical level but instead represents the sea level if all of the water in the oceans were contained in a single basin.” [ my emphasis – kh; source ]

Regular readers of this blog or any other source of science news and science discussion, including the leading journals of science, are well aware of the problem.  Because some data set exists –> some scientist(s) dig in with statistical analysis of some portion of the data set in an attempt to find a publishable result –> they always find such a result.  The fact that the result is not significant in the real world (for instance, not a Minimal Clinically Important Difference), that the result is trivial, that the result is vanishingly small with error bars that all include zero, that the result is ephemeral, that the result adds nothing to our accumulated knowledge base, that the result depends on the pre-existing bias of the researcher or his field of research, that the result has no applicable reality, or that the result is true only in a very limited academic sort of way – all these are ignored or under-emphasized in the resulting journal paper.  The journal paper is then churned into click-bait by the Science Mass Media and presented as newly discovered truth.

Again, I know that there is a lot of good science being done, and some of it does result from careful, considered analyses of good data sets about a topic of interest.  But there is far too much of the other sort, as described in the previous paragraph.

So, all that said, let’s look at a recent example and see what one might gain by looking at a data set from a new perspective – one which allows us to throw out the numbers and, by doing so, arrive at a more pragmatic understanding of it all.

Some clever people in the UK have realized that lots of old archival records exist about the tides in UK harbours, and that these records might lengthen the data sets of Mean Sea Level (MSL) in those places for those time periods.  It is an interesting topic – made all the rage by the alarming warnings about Sea Level Rise issued by all sorts of advocacy groups and misguided scientists over the last 30 years.  I wrote a brief report here on historic EU tide gauge data back in 2019. [ I also found an older UK record, albeit of temperature at Greenwich Observatory, and thought it important enough to post here a few years ago. ]

The resulting paper, an interim report of an ongoing project, appeared recently as “Changes in mean sea level around Great Britain over the past 200 years” by P. Hogarth, D.T. Pugh, C.W. Hughes, S.D.P. Williams in the journal Progress in Oceanography.  The paper self-describes in the abstract as:

“We systematically assimilate a wide range of historical sea level data from around the coast of Great Britain, much of it previously unpublished, into a single comprehensive framework. We show that this greatly increased dataset allows the construction of a robust and extended Mean Sea Level curve for Great Britain covering a period of more than two centuries, and confirms that the 19th century trend was much weaker than that in the 20th century and beyond.”

In plain language, what they have done is transcribe old tide journals – recordings of high and low tides, their magnitudes and timings – from various places around Great Britain mostly from the early 1800s, applied a bunch of analyses and averaging and such, then patched that data onto the modern averages of MSL for Great Britain from the Permanent Service for Mean Sea Level (PSMSL). All-in-all a good and valuable effort.

Here’s an example of the type of old records from which data has been extracted.

(Correction: While modern tide measurements in the UK are in meters, the readings above, given the date, may have been in feet. The current tidal range at Wick, Scotland is 1.5 to 2 meters, which would compare favorably with the above reading of 5 feet. Many sites in the UK do have modern tidal ranges of 3-7 meters. h/t reader “Old England”)

Their main result is this chart:

Our own Willis Eschenbach re-graphs and throws stats at the data set from this study in an interesting piece here at WUWT.  Note that the full data set is available only by viewing the original journal article here and scrolling down to the heading for Appendix A.  Just under that heading is a “Download All supplemental files” link (which somehow cannot be copied – you have to go to the page and click on it, which downloads a .zip file).

In this essay I will evaluate this data set using only the mind – meaning critical thinking, prior knowledge, and logic – and not math, leading up to throwing out the numbers – the individual measurements (which in this case are averaged averages of averages…).

Things to know about this data set before looking at anything about this paper – especially the numbers:

  •  The data set is claimed to be an “extended Mean Sea Level curve”.  It is not that, but rather a graph of the annual average of the annual-average Relative Mean Sea Levels from 174 different sites on Great Britain (GB) and Ireland – some of the data points are modern (20th and 21st century) data and some are historic (19th century).
  • The recorded numbers were taken by instruments of various types used over the last 200 years.  The older data were recorded by hand from tide staffs.

More modern records, mostly from the Permanent Service for Mean Sea Level (PSMSL), result from increasingly accurate float devices in stilling wells.  The most recent are probably “Air Acoustic sensors in protective wells”.

While the chart from the paper shows “error bars”, they are error bars for the averages – not anything like original measurement error/uncertainty.   Tide staff readings, marked in half-meter increments, probably have an original measurement error of at least 2/10ths of a meter (20 cm, as in +/- 10 cm), even +/- 20 cm.  The most modern, up-to-date, precise acoustic tide gauge has a per-the-technical-specs error range of +/- 2 cm.

  • A brief survey of tidal ranges for harbours in GB reveals that a rough average of tidal range (high tide to low tide) is around 5 meters or 16 feet. Some can be 7 meters and more.

In the real world, this looks like this:

One of the historic record sites used in the paper under discussion is a few miles from Port Isaac.  Port Isaac is the filming location for the famous Doc Martin TV series.

  • Most of the data (there are a few exceptions) before 1920 depend on an average of fewer than 5 data points – compared to data from 1960 onward, which is based on 30 to 40 data points.  Thus, uncertainty increases from two sources as we move back in time:  original measurement error alone increases by at least a factor of ten, and the sample count shrinks by roughly a factor of ten, again multiplying uncertainty (see the sketch just after this list).
  • None of the tide data has been corrected in any way for Vertical Land Movement of the tide gauges themselves.  This means we can only use the data for judging Local Relative Sea Level, as we cannot separate vertical movement (up and down) of the land from the rise or fall of the sea. Results are thus not applicable to Global Sea Level or its mean.
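Here is the sketch referred to above – a back-of-envelope illustration (mine, not the paper’s) of how those two factors combine under the textbook assumption of independent random errors. Whether instrument error can be averaged down at all is disputed in the comments below; this only illustrates the arithmetic of the claim.

```python
# A rough sketch (not from the paper) of the two uncertainty sources listed
# above, under the textbook assumption that site errors are independent and
# random. All numbers are the essay's rough figures, not the paper's.
import math

def mean_uncertainty(instrument_error_cm: float, n_sites: int) -> float:
    """Standard uncertainty of an n-site average, if errors were independent."""
    return instrument_error_cm / math.sqrt(n_sites)

modern = mean_uncertainty(2.0, 36)    # ~ +/-2 cm acoustic gauges, 30-40 sites
historic = mean_uncertainty(20.0, 4)  # ~ +/-20 cm tide staffs, fewer than 5 sites

print(f"modern:   +/-{modern:.2f} cm")         # ~ +/-0.33 cm
print(f"historic: +/-{historic:.2f} cm")       # ~ +/-10 cm
print(f"ratio:    ~{historic / modern:.0f}x")  # ~30x wider going back in time
```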

I am going to show the process by which one might use a philosophical or strictly mental approach – applying prior knowledge, critical thinking and logic in place of math or statistics – to evaluate the data set using the points above, gaining perspective and reducing the provided chart to ONLY the things that it can really tell us reliably.

I do this in the following slide show of eleven simple images:

I hope the slide show came off as self-explanatory – but I’ve misjudged readers before.  So, I’ll give my justifications slide by slide below:

Slide 1:  As given in the original paper.

Slide 2:  I add (and later remove) a reminder of scale, for U.S. readers.  The 300 mm of rise is about 12 inches.

Slide 3:  I remove the unnecessary and misleading Red Line Trend.  The paper’s authors ignore the sparseness of the data before 1920 and ignore their own data by not including the data points before 1830 – instead drawing a trend line that seems to depend on a single outlying data point around 1817 (the first open blue circle on the graph) while disregarding the other early data points that are in line with the 20th century data.   The Red Line is – to me and maybe you – viewpoint biased: trend lines drawn on graphs are always opinions, not part of the data.  All of us are capable of looking at the graph without it.

Slide 4:  I have added – as a grey underlay – something closer to the real minimum uncertainty in the tide measurements themselves.  I keep it as small as possible – only +/- 10 cm on the left and the NOAA-spec’d +/- 2 cm for the most modern data.  It is my opinion that the uncertainty is far greater, especially before the 1950s.  ALL of the data points fall within the band of uncertainty, thus we can ignore all those little squiggles – the little ups and downs.  Only the overall slope of the more reliable grey band, inside which we are pretty sure the real values lie, can be depended upon.
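For readers who want to reproduce the effect, here is a minimal plotting sketch of this underlay idea. The data series is a synthetic placeholder, not the paper’s values; only the band half-widths (+/- 10 cm fading to +/- 2 cm) follow the text above.

```python
# Sketch of the Slide 4 technique: underlay a minimum-measurement-uncertainty
# band beneath the plotted points. The RMSL series here is synthetic; only the
# band half-widths (100 mm down to 20 mm) follow the essay's assumptions.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
years = np.arange(1810, 2021)
rmsl = 1.5 * (years - 1810) + rng.normal(0, 20, years.size)  # fake data, mm

half_width = np.linspace(100, 20, years.size)  # +/-10 cm early, +/-2 cm modern

fig, ax = plt.subplots()
ax.fill_between(years, rmsl - half_width, rmsl + half_width,
                color="0.85", label="minimum measurement uncertainty")
ax.plot(years, rmsl, ".", ms=3, color="tab:blue", label="annual RMSL (synthetic)")
ax.set_xlabel("Year")
ax.set_ylabel("RMSL (mm)")
ax.legend()
plt.show()
```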

Slide 5 & 6: Trigger Warning #1 and #2.

Slide 7 & 8:  I remove the numerical data points, leaving only the more reliable data range seen as the grey area.  The data range fades on the left side (earlier in time) as the data becomes more and more sparse and less and less dependable.  I say in Slide 8 “almost correct view” because, like almost all programmatically created graphs, the scale is automatically set – without rhyme or reason – to about 120% of the graphed data range.

Slide 9 & 10:  Here the vertical scale of the graph is corrected to be approximately the average Tidal Range — the average range of Relative Sea Level – for GB harbours.  This is about 5 meters or 16 feet.  This allows us to see the relative importance of the change in RMSL against our daily experience in the real world where this change takes place. (See the photos of Port Isaac in the body of the essay above.)

Slide 11:  The final result of using Knowledge, Critical Thinking and Logic to derive the reliable pragmatic finding of the paper: “Changes in mean sea level around Great Britain over the past 200 years”.  The original findings are not nothing . . . but they are not what was claimed, which was “…that the 19th century trend was much weaker than that in the 20th century and beyond.”

By Throwing Out the Numbers, and evaluating the data first as information, comparing it to existing knowledge, critically considering each aspect, applying simple logic and proper perspective, we can arrive at a more pragmatic understanding of the whole. 

Bottom Line:

Hogarth et al., by adding historical tide records to modern records, have extended our understanding of long-term (two-century scale) Relative Sea Levels for GB’s harbours, showing that average RMSL in those harbours has been rising steadily and more-or-less evenly since the early 1800s, resulting in an overall 200-year rise of about 12-16 inches.  In many locations in the UK this is a trivial amount when compared to the normal daily tidal range.

This result is far easier to see when we Throw Out the Numbers.

# # # # #

Author’s Comment:

Many readers here may disagree with my approach above.  Truthfully, I have just written out the sequential mental steps that I commonly use, when sitting in my easy chair,  to gain a pragmatic perspective on the many data sets and graphs of those data sets that come my way in my daily walk through the ongoing outpouring of scientific results that I see in the journals and the popular science press.

I think it is a grand idea to capture all those tide readings from the old records and that the recovered data will be very valuable.  In this case, when looked at with a pragmatic eye, it confirms that — contrary to the published finding — nothing unusual has happened with relative sea levels in the UK over the last two hundred years and nothing unusual is happening now.  A good scientific confirmation. 

It is a mistake to reify the data – to pretend that all those over-precise numbers and their little wiggles and squiggles are the thing itself: in this case, actual changes in mean relative height of the sea surface at the millimetric scale.   In the reification, they have fooled themselves into calculating accelerations (speeding ups and slowing downs) and “trends” that just don’t exist in the real world. 

The combined overall result, once we have thrown out the numbers, is perfectly sound and probably reliable. 

Address your comments to “Kip…” if speaking to me.  In general, when replying to or addressing a specific person, prefacing your comment with the name of that person makes the Comments Section more readable.

# # # # #

Richard (the cynical one)
March 7, 2021 10:15 pm

Figures might not lie, but liars sure can figure.

Gordo
March 7, 2021 10:23 pm

Kip,

I spent half of today looking at a probabilistic framework for a resource estimate I am working on, concatenated the error bars – and ruefully retired from the field. I will provide an opinion, but as ever, am very loath to provide absolutes. Your article is a wonderful illustration of the fallacy of conflating precision with accuracy, and applying it to the natural world. Thanks.

davidf
Reply to  Gordo
March 7, 2021 11:26 pm

Oh, hell yes, very apt

Alan M
Reply to  Gordo
March 8, 2021 3:57 am

Gordo, I work in the same field and couldn’t agree more

Tombstone Gabby
Reply to  Kip Hansen
March 9, 2021 11:43 am

Now how did that saying go? Something like: “Some researchers use statistics like a drunk uses a lamppost, not to illuminate but to lean on.”

Steve Case
March 7, 2021 10:45 pm

So it looks like sea level has been rising right along for as long as it’s been measured. I can’t say that that’s exactly new news. Over a foot higher in 200 years. The question is, “Why?” Is it actually more water? Or is it a function of tectonic changes to the oceanic basins? More water would point to the ice caps, Greenland and Antarctica, losing ice. Where else would it come from? In any case, we are told in the popular press that it’s because of CO2, and that doesn’t pass any kind of logical thought experiment. But we have governments around the world that have taken that idea as gospel and are acting on it, and that’s the problem.

tonyb
Editor
Reply to  Steve Case
March 7, 2021 11:44 pm

Kip

Gordon Manley, who originally compiled CET, reckoned the glaciers started melting (this time round) from around 1750, but not in a linear fashion.

There have been several high water stands, of which the Roman one around 300 AD and 600 AD, the one around 1300 AD and another around 1600 AD are most noticeable; they coincide with known warm periods.

We have known low water marks around 800 AD and 1700 AD, so in general we can see when ice formed, taking water out of the system, and when it melted, putting it back in again.

We have been witnessing a slow and steady melt from the coldest period of the modern Holocene and must expect that to continue.

I wrote an article on sea level rise in Britain, carried here:

Historic variations in sea levels. Part 1: From the Holocene to Romans | Climate Etc. (judithcurry.com)

“The North Sea had a nasty little jump between 350 and 550AD, flooding the coasts of northern Europe with an extra 2 feet of water and sending its inhabitants — folk known as Angles and Saxons — fleeing (although “conquering” might be the better word) into ill-prepared Roman territories. At the start of this rise, the areas we know as the Fens were a well-settled part of Roman Britain ruled from the town of Duroliponte (Cambridge) by its native people, the Christianized Romano-Celtic Iceni. Then the sea level rose, and history’s curtain went down for two centuries.”

We can identify a number of these jumps and the subsequent partial falling back. In that context the sea level changes in Britain over the last two centuries are not extraordinary, and are confounded by land rising, falling or remaining static due to the glacier effect.

tonyb

Ron Long
Reply to  tonyb
March 8, 2021 1:22 am

tonyb, I like your comments about the sea level in the composite Roman Warm Period. I am reminded of a History Channel presentation about searching for the sea port needed to get enough food into Rome to sustain the population. They went down to the current Mediterranean Coast and looked around. Nothing. Then they noticed an unusual lake, lots of straight sides, back from the coast and several meters higher. Turns out it was the sea port, complete with numbered tie-up posts.

Jimb
Reply to  Ron Long
March 8, 2021 10:16 am

And the Neanderthals lived in caves that are now very much under water.

Tom Abbott
Reply to  Ron Long
March 9, 2021 6:56 am

I saw that tv program. It was a good show, and another demonstration that the sea level was higher in Roman times than it is now.

Roger Knights
Reply to  Steve Case
March 8, 2021 1:49 am

“Where else would it come from?”

Some came from our drilling for and release of “fossil water.”

Steve Case
Reply to  Roger Knights
March 8, 2021 4:48 am

Some came from our drilling for and release of “fossil water.”

Yes, some would come from that. On a search I found this from “Mother Jones”, and it says 12 mm (half an inch) since 1900. Call it 5-10% of sea level rise over the last 100 years, assuming the amount pumped during the 19th century was minimal. And that’s significant, but not as important as the associated lowering of water tables world-wide.

Shanghai Dan
Reply to  Steve Case
March 8, 2021 11:58 am

Which is why we should plant a few more nuclear power plants here in California, use that to power our grid, and use the intermittent wind and solar to run a bunch of desalination plants – we can use that water locally and even feed back to the Colorado river basin, and beyond.

Robert W Turner
Reply to  Steve Case
March 8, 2021 6:46 am

Isostatic post-glacial rebound is still occurring around the Arctic Ocean and likely accounts for the gradual increase in eustasy in the past 8,000 years.
https://www.pc.gc.ca/en/lhn-nhs/mb/prince/decouvrir-discover/decouvrir-discover4

There is some subsidence south of the high latitudes that are uplifting but they do not completely compensate for the rebound.

guidoLaMoto
Reply to  Steve Case
March 8, 2021 6:59 am

Where else would it come from?…It’s obvious. GW is killing off the sponges in the coral reefs, so there’s not as many to soak up the water.

Jimb
Reply to  guidoLaMoto
March 8, 2021 10:19 am

Chuckle

Steve Case
Reply to  Kip Hansen
March 8, 2021 8:53 am

Kip, You’re right, considering your post, half an inch over 100 years isn’t significant or important.

Smart Rock
Reply to  Steve Case
March 8, 2021 9:51 am

The biggest cause of sea level rise (during a period of stable ice caps) is probably thermal expansion of the oceans. The coefficient of thermal expansion of sea water varies with temperature, pressure and salinity. Using a very rough average coefficient of expansion of 0.00015/°C, and mean ocean depth of 3600 metres, it looks like a 1°C rise in overall ocean temperature would lead to about 50 cm of global SLR.

A 1°C rise in overall ocean temperature is a lot of heat, and might require many centuries of atmospheric warming.
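As a quick arithmetic check of Smart Rock’s back-of-envelope figures above (a sketch using his assumed coefficient and mean depth, which are rough values, not authoritative ones):

```python
# Reproducing the comment's estimate with its own stated round numbers.
alpha = 0.00015   # /degC, rough mean thermal expansion coefficient of seawater
depth_m = 3600    # assumed mean ocean depth, m
dT = 1.0          # degC rise in overall ocean temperature

rise_cm = alpha * depth_m * dT * 100
print(f"~{rise_cm:.0f} cm of SLR per degC")  # ~54 cm, i.e. "about 50 cm"
```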

Eric Harpham
Reply to  Smart Rock
March 8, 2021 11:53 am

How about the water displaced by all of the ships and boats that the human race has built in the last 200 years? I wonder what rise in sea level that accounts for.

Reply to  Eric Harpham
March 8, 2021 3:02 pm

Eric,
There are roughly 80,000 sea-going ships.
If, on average, each displaces 100,000 tonnes [of seawater] (I think that’s a very high estimate, perhaps three times the actual number, but let’s run with it!) – that’s 8,000 million tonnes of seawater. That’s about eight cubic kilometres of seawater [roughly; assuming a density of 1, not 1.025].
The oceans have an area of roughly 400 million square kilometres.
That leads to a rise of one part in fifty million of a kilometre – which I calculate to be 0.02 mm.
That, I suggest, is negligible.
The oceans are very big.

Be safe,
Auto
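Auto’s arithmetic above checks out, as this sketch using his stated round numbers shows (his ocean-area figure is generous – the commonly cited value is nearer 361 million km² – but that does not change the conclusion):

```python
# Reproducing the comment's estimate, using its own round numbers.
ships = 80_000
tonnes_each = 100_000            # deliberately high, as the commenter notes
km3 = ships * tonnes_each / 1e9  # tonnes ~ m^3 at density ~1 -> km^3; ~8 km^3
ocean_area_km2 = 400e6           # the comment's round figure (actual ~361e6)

rise_km = km3 / ocean_area_km2
print(f"{km3:.0f} km^3 spread over the oceans ~ {rise_km * 1e6:.2f} mm")  # ~0.02 mm
```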

Steve Taylor
Reply to  Eric Harpham
March 10, 2021 1:15 am

Every kid that visits a stony beach throws stones into the water. That must have an effect as well.

Crispin Pemberton-Pigott
Reply to  Steve Case
March 11, 2021 5:31 am

Steve:

Melting of land ice in the far north, plus thermal expansion of the oceans. That is where the additional water is coming from. Globally there is less ice than there was 200 years ago. There is still more than there was 6000 years ago. We might be heading back to that (thankfully) warmer world but there is no guarantee.

Living outside the habitable zone requires heating in winter or we would soon die of exposure. As the habitable zone expands poleward, we can live with much less energy. When small, safe and efficient nuclear reactors are available it won’t matter but at present any warming is to be appreciated. This is particularly true in terms of the food supply.

Steve Case
Reply to  Crispin Pemberton-Pigott
March 12, 2021 2:01 pm

The far north may be losing ice, but it’s not melting. Thermal expansion is local. If the Pacific Ocean warms up and expands, the water in New York harbor will still be the same temperature and same level, i.e., it won’t be affected.

StephenP
March 7, 2021 11:19 pm

A sunken forest was found in the Solent, off the Hampshire, UK coast, about 4 metres down, with carbon dating of the timbers to about 6,400 BC.
The researchers attribute it to sea level rise, but whether it is the sea level rising or the land level falling they don’t say.

http://www.megalithic.co.uk/article.php?sid=2146410702

IIRC there has been isostatic rebound since the last ice age

http://www.dur.ac.uk/news/newsitem/?itemno=8805

Ian W
Reply to  StephenP
March 8, 2021 1:46 pm

StephenP
That date would fit with the flooding of Doggerland – a more disruptive event than a slow steady sea level rise. A first geological Brexit!

As ice melted at the end of the last glacial period of the current ice age, sea levels rose and the land began to tilt in an isostatic adjustment as the huge weight of ice lessened. Doggerland eventually became submerged, cutting off what was previously the British peninsula from the European mainland by around 6500 BC. The Dogger Bank, an upland area of Doggerland, remained an island until at least 5000 BC

https://www.ancientpages.com/2020/12/02/catastrophic-final-flooding-of-doggerland-by-the-storegga-tsunami-new-study-results/

Meab
March 7, 2021 11:35 pm

Kip.

It would be even more illuminating to make a new final plot that shows the entire daily tidal range over time, both the high and low tide: a thick (~5m) region rising very slowly bounded by a shaded uncertainty band on both the top and bottom of the range.

tonyb
Editor
Reply to  Meab
March 8, 2021 12:46 am

I live in Devon and we have quite a high tidal range. It is said the Romans came to grief on these southern shores as being from the Mediterranean they were not used to tides of any great variance.

Nor are our tourists, who often complain that there was far less beach than when they came last year, or far more beach – and what had happened to the water their youngsters had swum in?

Try to explain to them that according to the state of the lunar and annual cycle, we have high high tides, low low tides, low high tides and high low tides and sometimes barely any tides at all!

tonyb

Chaswarnertoo
Reply to  tonyb
March 8, 2021 12:56 am

We too have had complaints about the lack of sand on one of the highest tides of the year on our local beach. Seems like it was our fault. 🤣

tonyb
Editor
Reply to  Chaswarnertoo
March 8, 2021 1:21 am

We have a beach chalet, so can observe the reality of a hot day, loads of tourists and an incoming tide at 9AM which will mean the tourists will have no beach to sit on. The cafe will get lots of complaints about that!

tonyb

Ben Vorlich
Reply to  tonyb
March 8, 2021 2:43 am

Not sure I buy into that theory. In 56 BC, Caesar defeated the Veneti in a naval battle and took most of northwest Gaul. A walled French town on the coast of Brittany, Saint-Malo has the highest tides in Europe, with water that can rise 13 m over the course of six hours. Julius Caesar made incursions into southern Britain in 55 and 54 BC. So Roman naval commanders would be familiar with tides in the English Channel.

Rhs
Reply to  Ben Vorlich
March 8, 2021 6:21 am

Tides within the channel have an effect on navigation rather than a holiday on the beach. Tide flows of several meters around GB are unheard-of within most of the Mediterranean.

It doesn't add up...
Reply to  Ben Vorlich
March 8, 2021 7:54 am

An account of some of the tide related troubles Caesar faced on his first landing in Britain

https://www.historytoday.com/archive/julius-caesar%E2%80%99s-first-landing-britain

Ben Vorlich
Reply to  It doesn't add up...
March 8, 2021 9:20 am

Your link is more about storms, storm surges and an unusual tide. An ancient Briton’s Kamikaze.
I suggest that rather than sailing round from the Mediterranean to Southern England, Caesar used Gaulish craft crewed at least in part by Gaulish sailors, or ships built in the Pas de Calais. In which case they’d be familiar with tidal conditions in the Channel. Cross-Channel trade was well established in pre-Roman Britain.

Meab
Reply to  Kip Hansen
March 8, 2021 9:38 am

Kip,

It would actually require less math, but that’s ok. Just plot the actual data over time (adjusted for isostatic rebound – a linear adjustment) for a harbor. The froth at the top and bottom of the tidal range automatically gives a visual indication of the error. Relocation of the gauge would show up as a step change and could be left uncorrected without compromising the value of the plot, but it would be easy to find data from a place where the gauge never moved.

“they are meant to tell ships how much water is under their hulls in the harbour so they don’t run aground.”

There are other reasons equally (or even more) important for recording these old tidal measurements. One was to be able to predict the harbor’s tidal ebbs and floods. Older ships almost all had less draft than modern ships, so they typically had no problems with harbor depth as long as they could avoid known shallow areas. However, they didn’t have any side thrusters and were often underpowered, making maneuvering at times of high tidal flows difficult. Many ships would enter the harbor at times of slack flow and then stay moored for several complete tidal cycles.

Meab
Reply to  Kip Hansen
March 8, 2021 5:30 pm

Kip,

“tide gauges and tidal charts tell those fishermen how close to the pub at the top of the harbour they can pull their boats in to shore before touching bottom.”

Tide charts (or tide tables) along with a depth chart, yes, tide gauges (by themselves) no. In some places, you can get into trouble well before you can even see a tide gauge.

I was referring to the reason tide gauges were recorded in the first place. The times and heights of high and low tide couldn’t be calculated from first principles, as each harbor is unique. Only when actual data had been recorded for a long enough period could future tides be predicted. They were continuously recorded, in great part, to confirm the predictions and adjust them for long-term (decadal) trends. Few people cared about sea level rise.

Izaak Walton
March 7, 2021 11:55 pm

Kip,
The authors of the paper state that “the RSL data is adjusted for the ongoing different post glacial rebound rates around the British Isles (Emery and Aubrey, 1985; Peltier and Tushingham, 1989; Rennie and Hansom, 2011; Whitehouse, 2018) using the Peltier ICE-6G_C (VM5a) GIA model data, which includes the effect on measured sea level via both VLM and gravitational effects. The GIA adjustments for each precise site location in mm/yr (column 19 of Table 4 in the appendix)”, which seems to suggest that your claim that the data is unadjusted for vertical land movement is wrong.

Your error bars are also wildly off. While your estimates might be correct for a single measurement, the authors only included data for sites where there was at least a two week long set of measurements meaning that the error gets reduced by 1/sqrt[n] for each individual site, and then you can further reduce the error by averaging different sites from around the UK.

Clyde Spencer
Reply to  Izaak Walton
March 8, 2021 9:25 am

Walton
You erroneously claimed, “… you can further reduce the error by averaging different sites from around the UK.”

If one is measuring something that has random variation arising from the measurement instrument or observer, then precision can be increased by one significant figure by taking 100 measurements of a static variable. Taking multiple measurements of something that changes with time only allows one to obtain a better estimate of the range of the variable, and to calculate the mean and standard deviation of the sample of measurements for the time interval selected.

Averaging different sites is not justification for claiming increased accuracy or precision for the country. Instead, what one is doing is a form of low-pass filter that attenuates the peaks and valleys in the tides because different tidal basins have different amplitudes and different times of maxima. Water currents, wind, and high/low pressure systems can cause water to pile up along the coasts, which is unrelated to either sea level rise or subsidence. Furthermore, tides have decadal changes that need Fourier analysis to identify.

The credibility of your claims leaves a lot to be desired!
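A small sketch of the static-versus-changing distinction Clyde Spencer draws above: repeated noisy readings of a fixed quantity converge on that quantity, while repeated readings of a moving one only estimate its time-average. All numbers here are illustrative.

```python
# Repeated measurements of a *static* quantity tighten the estimate of that
# quantity; repeated measurements of a *changing* quantity only estimate its
# time-average, not any instantaneous level. Illustrative numbers only.
import numpy as np

rng = np.random.default_rng(1)
noise = lambda n: rng.normal(0, 0.1, n)   # instrument noise, sigma = 0.1

static_truth = 5.0
static_est = (static_truth + noise(100)).mean()   # converges on 5.0

t = np.linspace(0, 1, 100)
tide = np.sin(2 * np.pi * t)                      # the thing itself moves
tide_est = (tide + noise(100)).mean()             # ~0: a mean, not a level

print(f"static:  {static_est:.3f} (true value 5.0)")
print(f"moving:  {tide_est:.3f} (no single 'true' level to recover)")
```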

Meab
Reply to  Izaak Walton
March 8, 2021 9:59 am

The error only gets reduced by 1/sqrt(n) if the n measurements are randomly distributed, which they clearly aren’t. Consider a large set of measurements (n_total) taken under m different prevailing wind conditions (wind affects the sea level). All the random errors of the measurements taken under each of the individual m wind conditions can be expected to average out with error 1/sqrt(n_m), but the total error including wind variability won’t average down as 1/sqrt(n_total) if the wind conditions introduce correlations between the n_total measurements.
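A Monte Carlo sketch of this point, with illustrative numbers: when part of the error is shared across measurements, the scatter of the mean stops shrinking at the shared component instead of falling as 1/sqrt(n_total).

```python
# A shared (correlated) error component does not average away as 1/sqrt(n).
import numpy as np

rng = np.random.default_rng(0)
n, trials = 1000, 5000
sigma = 10.0

# Purely independent errors: the mean's scatter shrinks as 1/sqrt(n).
indep_means = rng.normal(0, sigma, (trials, n)).mean(axis=1)

# Same total sigma, but half the variance is one shared offset per trial
# (think: the prevailing wind piling water against the coast that day).
shared = rng.normal(0, sigma / np.sqrt(2), (trials, 1))
mixed_means = (shared + rng.normal(0, sigma / np.sqrt(2), (trials, n))).mean(axis=1)

print(f"independent: scatter of the mean ~ {indep_means.std():.2f}")  # ~0.32
print(f"correlated:  scatter of the mean ~ {mixed_means.std():.2f}")  # ~7.1, stuck
```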

Carlo, Monte
Reply to  Izaak Walton
March 8, 2021 10:04 am

You still have not grasped simple statistics: n^-0.5 applies ONLY for multiple measurements of the SAME quantity. Averaging a time series does NOT qualify because the measurand is constantly changing.

Instead the standard deviation is a function of the variance of EACH data point.

Paul Penrose
Reply to  Izaak Walton
March 8, 2021 10:25 am

Izaak,
What Kip calls “error bars” for the devices used to measure sea level, is under a column entitled “Accuracy” in the NOAA documents he linked to. Accuracy is limited by the physical constraints of the devices themselves and can’t be improved by any kind of mathematical process. This is because the quoted accuracy (like +/- 10mm) represents a region around the theoretical correct value that is unknown. Since it is literally unknown, we don’t even know what the distribution of the (true) values are inside it. This kind of “error” carries forward completely in a computation, so Kip’s error bars are, if anything, conservative.

Paul Penrose
Reply to  Kip Hansen
March 9, 2021 9:39 am

Kip,
I do like “uncertainty range” better, although “error range” is not wrong from a holistic view. When you are trying to measure something, anything that produces a reading different from the (theoretical) true value could generally be considered “error”. However I like to distinguish between the different sources of error because some can be reduced, like random noise, and others are irreducible, like physical instrument accuracy (uncertainty). Even then, just considering random noise, you still need to characterize the noise so that you can properly reduce it. Most noise is not actually Gaussian, and often is auto-correlated, making a simple averaging function inappropriate. There are ways of handling different kinds of noise, but those techniques are more complex and often unknown to people outside the statistics community.

This is far from a simple subject.

Tim Gorman
Reply to  Izaak Walton
March 8, 2021 12:14 pm

“at least a two week long set of measurements meaning that the error gets reduced by 1/sqrt[n] for each individual site”

Only if you can stop time and take a bunch of measurements of the same thing! Otherwise the error of each measurement adds by root sum square when you combine them.
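For reference, the root-sum-square rule Tim Gorman refers to, in a generic two-line sketch (standard propagation for a sum of independently measured quantities, not anyone’s specific code):

```python
# Root-sum-square combination of uncertainties for y = a + b + ...
import math

def rss(*uncertainties: float) -> float:
    """Combined uncertainty of a sum of independent measurements."""
    return math.sqrt(sum(u * u for u in uncertainties))

# e.g. combining Tmax +/-0.5C and Tmin +/-0.5C, as debated later in this thread:
print(f"+/-{rss(0.5, 0.5):.2f} C")  # +/-0.71 C, the ~0.7 figure quoted below
```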



Old England
March 8, 2021 12:40 am

Looking at the picture of the record page from Wick, Scotland: although undated, the text refers to records from the 1800s. The caption, however, states this tide data was in meters. I am confused and find this hard to believe, as the UK did not begin to use metric units until forced to by the EU in the 1970s.

In Cornwall, where I lived as a child in the 1950s-60s near Newquay, tide heights were always in feet and inches and always had been. Meters were introduced by Napoleon Bonaparte and enforced upon the parts of Europe conquered by the French.

Old England
Reply to  Kip Hansen
March 8, 2021 2:14 pm

Kip, thanks for taking the trouble to reply. Appreciated.

There are some harbours in both the UK and France that were in use in the 15th-17th centuries but are now so far away from the sea (sea levels fell, or isostatic rebound?) that they are no longer harbours. I seem to recall, but could be wrong, that Montreuil in France was one of them.

Roger
Reply to  Old England
March 8, 2021 9:39 am

Later this month a tidal range of 7.75 meters is predicted here on the south coast. That’s 25 feet for the traditionalists. Why were the Wick tide times converted into decimal and rounded to the nearest six minutes?

tonyb
Editor
March 8, 2021 12:49 am

Kip

Do you have access to historic Arctic area tide tables?

The strain on sea ice -causing cracking-due to tidal range is something I saw on a video. If tides are getting larger it might help to explain arctic ice disintegration over the last few decades.

tonyb

lackawaxen123
Reply to  tonyb
March 8, 2021 10:19 am

pretty sure oceans don’t have tides … shorelines of oceans do …

dodgy geezer
March 8, 2021 12:49 am

I can see little point in producing a comprehensively researched and well argued piece suggesting that climate change is not as dangerous as it is thought.

Such items will not be read by true believers in climate change, will not be taken into account by scientists and politicians working in the climate field, and will be suppressed by any mass media outlet. The work will be nugatory.

I do not know how to address this mass madness which has gripped society for the past 15-odd years, but I do know that telling the truth in as even-handed a way as possible is completely ineffectual and useless. No one will listen.

Robert W Turner
Reply to  dodgy geezer
March 8, 2021 6:52 am

I completely agree. I have been trying for years to get a single person to explain how the Quantum Theory (Law) of Radiation doesn’t completely refute the greenhouse gas back radiation hypothesis to no avail. This goes for the CO2 cult and lukewarmers.

nyolci
March 8, 2021 1:05 am

Kip, this was fun, an almost parody level representation of the usual science denier bs-ing. Especially the part when you compressed the y axis, well, boy, that’s a recurring theme in parodying deniers. Thanks, this was fun! 🙂

Dave Yaussy
Reply to  nyolci
March 8, 2021 2:47 am

I assume you’ve seen the graphs used by warmists to make changes of tenths of degrees over the past 30 years look impressively scary. In fact it’s the premier trick in their book. I wouldn’t be throwing stones

nyolci
Reply to  Dave Yaussy
March 8, 2021 5:29 am

I assume you’ve seen the graphs used by warmists

You mean scientists?

to make changes of tenths of degrees over the past 30 years look impressively scary.

Hm, the rise is actually 0.9C in the past 30 years. But this is not why it’s scary. The scary thing is that this rise is extremely quick on geological time scales and still accelerating. This is why you should listen to the scientists instead of the bullshitting like the article above. FYI one consequence is sea level rise, and here we talk about 10 centimeters in the past 30 years. This is also quick and accelerating.
Just as a side note, the fact that something looks very small to the layperson doesn’t mean it’s insignificant. A “discontinuity” of 0.1mm in your neck would be lethal, don’t try that.

Tim Gorman
Reply to  nyolci
March 8, 2021 5:56 am

ny,

As usual you are assuming that the global average temperature is real and meaningful. Since most of the measuring stations around the world have an uncertainty range of +/- 0.5C the combined uncertainty from adding all these together will be significantly higher – meaning your 0.9C increase would be well inside the uncertainty range and is therefore meaningless. You simply can’t know if it is 0.9C or something else entirely!

You simply didn’t even understand what this article was actually getting at, did you? Did you read it at all?

nyolci
Reply to  Tim Gorman
March 8, 2021 10:18 am

As usual you are assuming that the global average temperature is real and meaningful

I don’t assume anything. I just report what scientists say. FYI they usually use anomalies.

Since most of the measuring stations around the world have an uncertainty range of +/- 0.5C

Huh, you started to use the measurement unit at last! Progress!

the combined uncertainty from adding all these together will be significantly higher – meaning your 0.9C increase would be well inside the uncertainty range and is therefore meaningless.

Well, how large a change would meet your approval then? 🙂 Another (rhetorical) question: if combined measurements increase the uncertainty of the combined value, are we better off with less number of measurements? Ad absurdum with a single measurement? Another (rhetorical) question: should I continue to illustrate your ignorance? Ah well, a bonus question: how can scientists measure gravitational waves that have effects in the scale of the diameter of a proton if we don’t even have any direct method to measure distances that are orders of magnitude greater than the diameter of the proton? (NB: gravitational waves are detected by detecting the length change they cause.)

Tim Gorman
Reply to  nyolci
March 8, 2021 11:59 am

“I don’t assume anything. I just report what scientists say. FYI they usually use anomalies.”

How do you judge anything to be correct? And *what* scientists do you report on? Just the AGW ones or the *real* scientists as well?

“Well, how large a change would meet your approval then?”

By the time you get through combining 1000 independent measurements of different things a 30C change would probably be outside the uncertainty range. Anything smaller would just be opinion, not science.

“are we better off with less number of measurements?”

If your measuring devices have an uncertainty that overwhelms what you are measuring then you are better off *not* combining the measurements into some kind of pseudo-database that supposedly means something.

Just adding Tmax and Tmin together from a device that has +/- 0.5C uncertainty gives you a +/- 0.7C uncertainty in the mid-range value that you calculated. That by itself is almost enough to overwhelm the temperature changes you are trying to identify.

And using anomalies doesn’t help. The uncertainty travels right along with the anomaly. If your anomaly is in the +/- 0.7C range then you start off with an anomaly you cannot prove to be the true value.

“Ad absurdum with a single measurement?”

Which means you know nothing of physical science tenets.

“how can scientists measure gravitational waves that have effects in the scale of the diameter of a proton if we don’t even have any direct method to measure distances that are orders of magnitude greater than the diameter of the proton?”

By taking multiple measurements of the same thing using the same device in order to build a database of values with random error that can be reduced using the standard deviation of the mean (assuming your measurements form a Gaussian probability distribution). If that same “thing” isn’t a direct measurement of what you want to know then it needs to be an “other thing” that allows you to calculate what you want to know about the “thing”.

But this situation is *not* the same situation as taking multiple measurements of different things that does *NOT* give you a Gaussian probability distribution of random error.

nyolci
Reply to  Tim Gorman
March 9, 2021 10:39 am

How do you judge anything to be correct?

Good question, and science is all about this.

Just the AGW ones or the *real* scientists as well?

Whether you like it or not, nowadays only AGW scientists are the real scientists and vice versa. Perhaps in the 90s you could legitimately claim doubt but the last 20 years’ progress in science leave no room for any doubt now.

By the time you get through combining 1000 independent measurements of different things a 30C change

And you wonder why deniers are treated as laughingstock.

Just adding Tmax and Tmin together from a device that has +/- 0.5C uncertainty gives you a +/- 0.7C uncertainty

Now if we just disregard the fact that you extremely rarely add Tmax and Tmin together, even you yourself say (without realizing) that “uncertainty” decreases with this simple addition. See? Variance increases with the factor of sqrt(2) if you add two (independent) variables. That is basically how you get 0.7 here. If you average them, that’s a division by sqrt(2), just to clarify why this seeming increase is actually a decrease.

Which means you know nothing of physical science tenets.

Hello, I was just paraphrasing you 🙂

By taking multiple measurements of the same thing

What same thing? 🙂 This is getting comical. We are talking about a fcukin wave that is in the frequency range of seismic activity and most human factors (like transportation). How can you take multiple measurements here that you can’t do with temperature?

If that same “thing” isn’t a direct measurement of what you want to know then it needs to be an “other thing” that allows you to calculate what you want to know about the “thing”.

Huh, I didn’t know that 🙂 Okay, joking aside, my advice is to read about it. They have been planning this thing from the late 80s, there are plenty of very good popular scientific articles.

give you a Gaussian probability distribution of random error.

Just a side note. Not all distributions are Gaussian. Okay, adding multiple independent variables of the same arbitrary distribution results in a very good approximation of a Gaussian distribution. But on average you can’t assume a Gaussian automatically.

Tim Gorman
Reply to  nyolci
March 9, 2021 1:19 pm

“Good question, and science is all about this.”

Not climate science.

“Whether you like it or not, nowadays only AGW scientists are the real scientists and vice versa.”

Real scientists that live in the physical world and not inside a computer understand uncertainty. Climate scientists just ignore it and pretend everything is 100% accurate.

“And you wonder why deniers are treated as laughingstock.”

See what I mean? I just stated the truth. And you just attempt to use the argumentative fallacy of Argument by Dismissal to rebut it.

“Now if we just disregard the fact that you extremely rarely add Tmax and Tmin together, even you yourself say (without realizing) that “uncertainty” decreases with this simple addition.”

Doesn’t matter if you add Tmax and Tmin or subtract them (actually adding a negative number), the uncertainty still grows by root sum square. Not only that but you apparently can’t even read. Since when is +/- 0.7C less than +/- 0.5C?

“Variance increases with the factor of sqrt(2) if you add two (independent) variables.”

What do you think the term root-sum-square is? And uncertainty is not variance. Variance is associated with a probability distribution, uncertainty has no probability distribution.

“If you average them, that’s a division by sqrt(2), just to clarify why this seeming increase is actually a decrease.”

You don’t average uncertainty. Uncertainty adds as root-sum-square. For y +/- w = (a +/- u) +/- (b +/- v), where u, v and w are the uncertainties:

w^2 = u^2 + v^2

The number of terms has no uncertainty associated with it. It’s a constant. So it can’t contribute to uncertainty, either by decreasing it or increasing it.

“What same thing?”

A metal bar, the height of an inclined plane, the mass of an unknown cube, the side of a table, and on and on and on.

“How can you take multiple measurements here that you can’t do with temperature?”

One is counting, the other is measuring. Your ignorance of physical science tenets is showing.

“They have been planning this thing from the late 80s, there are plenty of very good popular scientific articles.”

And almost all of them ignore uncertainty. They all assume 100% accurate temperature measurements and calculate no propagation of uncertainty throughout their study methods.

It’s not my fault that most of the popular scientific articles are concerned with MODELS today and are prepared by mathematicians and computer programmers who know nothing of measurement uncertainty and how to propagate it correctly. They assume that multiple independent measurements of different things can be combined into a probability distribution that can then be analyzed as if the data values are a random variable with a probability distribution.

“Not all distributions are Gaussian.”

Of course they aren’t. Do you even know the simplest way to determine if it *might* be Gaussian or not? I doubt it. Just like most people today don’t know how to tell if a foundation form for pouring concrete is square!

“Okay, adding multiple independent variables of the same arbitrary distribution …”

Independent measurements of different things don’t have an arbitrary distribution.

I made three measurements of three different things on my desk just now. M1 = 6 +/- 0.5, M2 = 4.5 +/- 0.5, and M3 = 8 +/- 0.5.

What arbitrary distribution do each of these three independent measurements have?

Brian Jackson
Reply to  Tim Gorman
March 9, 2021 1:25 pm

Tim, the LIGO instruments measure gravitational waves in realtime. They don’t take multiple measurements to detect them. The merging of a pair of neutron stars or black holes will generate these waves, and when they pass the LIGO, the “wiggle” of less than the diameter of a proton shows up as a squiggly line on an oscilloscope.

Tim Gorman
Reply to  Brian Jackson
March 9, 2021 7:33 pm

What’s your point? Thermometers measure temperatures in real time also. But that point in time happens once and it is never, ever seen again.

With gravity waves you are measuring extensive characteristics. With temperature you are measuring an intensive characteristic. Two vastly different things.

fred250
Reply to  nyolci
March 8, 2021 1:18 pm

“should I continue to illustrate your ignorance?”.


You are doing a really great job of illustrating YOUR gross ignorance and manic anti-science brain-washing.

Its all you ever manage to do.

nyolci
Reply to  fred250
March 9, 2021 10:41 am

Its all you ever manage to do.

I must’ve hit a nerve 🙂

fred250
Reply to  nyolci
March 9, 2021 11:45 am

Your own elbow bone.

Noted, still no actual science from you

Your ignorance just flows !

Robert W Turner
Reply to  nyolci
March 8, 2021 6:56 am

Naaa, this has got to be an ironic parody of climate cultists, right?

nyolci
Reply to  Robert W Turner
March 8, 2021 10:19 am

Naaa, this has got to be an ironic parody of climate cultists, right?

Yesofcourse! But please scale up the y axis for me, this is the local habit, right?

fred250
Reply to  nyolci
March 8, 2021 1:10 pm

Yawn, the empty IGNORANCE that is the d’nyholist falls on its face in its own BS.. yet again !!

The slop-stick comedy act has not improved. !

fred250
Reply to  nyolci
March 8, 2021 1:20 pm

Vertical axis covers the tidal range

Sorry your incompetent child-mind cannot cope with that concept.

live in your fetid d’nyholism, its all you can do.

fred250
Reply to  nyolci
March 8, 2021 1:16 pm

“rise is actually 0.9C in the past 30 years”


More complete BS from the d’nyholist.

fred250
Reply to  nyolci
March 8, 2021 1:40 pm

“and here we talk about 10 centimeters in the past 30 years”


ROFLMAO !! Its also Utter and complete BS !!!

Does not show up anywhere in stable tide gauges.

Sea level rise remains at a NON-accelerating about 2mm/year or less.

In many places FAR LESS

eg Sydney


Port Kembla


You really are displaying your brain-washed anti-science IGNORANCE today, d’nyholist !!

Ian W
Reply to  nyolci
March 8, 2021 2:11 pm

nyolci

The heat content of the top few meters of the oceans exceeds that of the entire atmosphere. A change, let’s say, of 2C in 2 m air temperature is such a small amount of heat in kilojoules/kilogram that it would not be noticed if added to the ocean surface layer. Remember >70% of the Earth’s surface is water.

Yet you are claiming concern over a trivial increase in heat content of the atmosphere.

nyolci
Reply to  Ian W
March 9, 2021 10:01 am

Yet you are claiming concern over a trivial increase in heat content of the atmosphere.

No. First of all, it is science that claims any concern, not me. I just repeat what scientists say. Secondly, I’m not claiming concern over heat content per se but temperature. See? You got even this simple thing wrong. That’s why you need to read scientists. To avoid these simplistic errors.

fred250
Reply to  nyolci
March 9, 2021 11:48 am

Still NO EVIDENCE at all that the slight but highly beneficial warming out of the coldest period in 10,000 years has any human component except urban warming.

I’m glad that your manifest ignorance and gullibility means you feel MANIC PANIC about such a small change in temperatures.

It makes you look like a truly deluded cretin.

nyolci
Reply to  Kip Hansen
March 8, 2021 9:54 am

It would have been better if you had understood

Exactly… just kidding 😉 I perfectly understand that you don’t understand these things. Regarding the specifics, the paper of course took into account the vertical land movement, and you deniers have a hard time understanding statistical procedures, but this is nothing unexpected, science deniers are called “science deniers” for a reason.

Carlo, Monte
Reply to  nyolci
March 8, 2021 10:05 am

Where did you learn to think?

nyolci
Reply to  Carlo, Monte
March 8, 2021 11:06 am

Where did you learn to think?

You wanna learn it too, right? Good! A step in the right direction. At last. Well, this story started at the hospital where I was born…

fred250
Reply to  nyolci
March 8, 2021 1:10 pm

poss d’nyholist has never had a rational thought in its life

All just brain-washed anti-science.

Tim Gorman
Reply to  nyolci
March 8, 2021 11:44 am

Statistical methods should not ignore uncertainty in data, but they usually do, especially in climate science. It’s not the deniers who have a hard time understanding statistical procedures; it is the AGW proponents who have a hard time understanding that stated values aren’t 100% accurate.

nyolci
Reply to  Tim Gorman
March 8, 2021 2:34 pm

it is the AGW proponents who have a hard time understanding that stated values aren’t 100% accurate.

Tim, for the hundredth time, scientists (of course) never claim 100% accuracy for measurements. It is your misconception (or misunderstanding) that they don’t take it into account in statistical analysis.

Tim Gorman
Reply to  nyolci
March 8, 2021 2:46 pm

Of course scientists, especially climate scientists ignore uncertainty in their measurements, meaning they assume the 100% accuracy of the stated values.

If they didn’t ignore uncertainty and assume 100% accuracy then they would know that their global average temperature using anomalies is overwhelmed by the uncertainty propagated with their average calculations.

And again you quote “statistical analysis”. You can’t analyze independent measurements of different things using statistical analysis because they don’t represent a probability distribution.

What is the probability of Station A reading 20C and Station B reading 7C?

If you can’t determine the probability of each of these happening then how do they form part of a probability distribution?

nyolci
Reply to  Tim Gorman
March 9, 2021 10:06 am

Of course scientists, especially climate scientists ignore uncertainty in their measurements

yeah, of course. That’s why they put error bars in graphs 🙂

You can’t analyze independent measurements of different things

These are not “different things” 🙂 Again, you’re advertising your ignorance. See below:

What is the probability of Station A reading 20C and Station B reading 7C?

Well, how about the readings of Station A in a time series? When corrected for season (compare Jan 5 to Jan 5) this is a probability distribution whether you like it or not. Yes, there are differences, and with climate change the Jan 5 distribution itself changes. And of course what I describe here is extremely simplified. But this is it.

fred250
Reply to  nyolci
March 9, 2021 11:50 am

What a load of ARRANT NONSENSE. !!

You have surpassed yourself for idiocy , d’nyholist

Tim Gorman
Reply to  nyolci
March 9, 2021 1:32 pm

Who puts error bars on their graphs? Ever seen one for UAH? Ever seen one for a climate model?

“These are not ‘different things’ 🙂 Again, you’re advertising your ignorance. See below:”

Of course they are different things. Do you think a temperature station in Hays, KS is measuring the same thing as one in Denver, CO?

Do you think a measurement station is measuring the same piece of atmosphere when it is measuring Tmax as when it is measuring Tmin?

“Well, how about the readings of Station A in a time series? When corrected for season (compare Jan 5 to Jan 5) this is a probability distribution whether you like it or not.”

First, you didn’t answer my question. And if you think a temperature station is measuring the very same piece of atmosphere on Jan 5, 2019 as it is on Jan 5, 2020 then you are very, very lost. These are two separate, totally independent measurements.

Tmax and Tmin are *NOT* part of a probability distribution. And they don’t become part of a probability distribution when you find their mid-range. And they don’t become part of a probability distribution when you take two different mid-range values from different years. You simply can’t take separate independent values, jam them together, and call them a probability distribution.

Tmax and Tmin are points on a roughly sinusoidal curve, i.e. the daily temperature profile. And a sine wave is *NOT* a Gaussian probability distribution. Its average value is not even the mid-range value between Tmax and Tmin.

If you would be honest and try to answer my question maybe the light would start to glow dimly.

What is the probability of Station A reading 20C and Station B reading 7C?

fred250
Reply to  nyolci
March 8, 2021 11:33 pm

How would an ignorant twit like you know what your so-called climate scientists claim?

You have never shown one iota of scientific comprehension in any post you have ever made.

David Kamakaris
Reply to  nyolci
March 8, 2021 2:27 pm

OMG, you again?

Nyolci calls us deniers. Yet he/she/it denies that it was much warmer in the recent, not-so-recent, and distant past, denies that whatever natural forcings caused past warming could possibly be responsible for the modest warming since the 70’s, and denies that sea level rise has not accelerated.

Keep it up, Denier nyolhist. Al Gore, Michael Mann, as well as the rest of the indulgent hypocrites appreciate your denialism.

nyolci
Reply to  David Kamakaris
March 9, 2021 10:10 am

OMG, you again?

Oh, yes!

denies that whatever natural forcings that caused past warming could possibly be responsible for the modest warming since the 70’s, denies that sea level rise has not accelerated.

No. Scientists do that. I’m just telling you about the results of science.

David Kamakaris
Reply to  nyolci
March 9, 2021 12:16 pm

Haven’t seen a bit of science out of you, so yet another ridiculous comment that shows you swallowed the climate change Kool-Aid.

But go ahead and answer the questions Fred has been asking you over and over again if you think you can.
1… Do you have any empirical scientific evidence for warming by atmospheric CO2?

2… In what ways has the global climate changed in the last 50 years that can be scientifically proven to be of human-released CO2 causation?

Perhaps you could address my question as well.

Last edited 1 month ago by David Kamakaris
fred250
Reply to  nyolci
March 10, 2021 10:23 pm

No, you are CLUELESS about what actual real scientists say.

You make crap up from media nonsense.

Its all you are capable of doing.

fred250
Reply to  nyolci
March 9, 2021 1:00 am

“and you deniers”

.

The only d’nyholist here is YOU, a yabbering zero-science fool.

Tell us what we “deny” that you can produce solid scientific evidence for.

Let’s start with the very basics, shall we.

1… Do you have any empirical scientific evidence for warming by atmospheric CO2?

2… In what ways has the global climate changed in the last 50 years that can be scientifically proven to be of human-released CO2 causation?

I will remind you yet again that mindless yabbering, as is your normal meme, is NOT evidence.

Watching you fall flat on your face into your own BS…. as you always do.

Last edited 1 month ago by fred250
Paul Penrose
Reply to  nyolci
March 8, 2021 10:30 am

nyolci,
What a bunch of BS from you! He shows the y axis in a fully expanded form, and then for context, ALSO shows it against the much larger tidal range. Nothing deceptive about that. So, do you have any valid, specific criticisms, or are you here just to throw crap against the wall hoping something will stick?

nyolci
Reply to  Paul Penrose
March 8, 2021 11:38 am

then for context, ALSO shows it against the much larger tidal range

What a funny guy you are! You know that “context” in the specific context of this study is the famous “showing-it-little” context.

Nothing deceptive about that.

Well, it depends on the audience. It’s not deceptive for WUWT clowns like you; they are already long lost, you can’t deceive them further. It’s not deceptive for scientifically literate people; they just laugh at these amateurish attempts. Unfortunately the third category, the majority of the people, is susceptible to this bullshitting. This is why it’s important to challenge denier-bs here.

So, do you have any valid, specific criticisms

You’re not lucky ‘cos the original is a scientific article, so even if I didn’t have any specific criticism I could automatically dismiss a science denier’s convoluted and ridiculous writing. But I do have specific criticism. This whole Kip-shit is superficial and clumsy. The vertical land movement, for example, is specifically addressed in the original article. The “tidal range” is irrelevant for sea level rise, regardless of its range. I don’t even understand why the hell he is pushing this thing. These values are corrected for tidal range. Actually, apart from a whole mountain of bullshit about numbers, Kip’s crap is very thin in specifics. He mentions the increasing uncertainty of measurements as we go back in time, and the decreasing number of actual data points. Yep, tough shit. The original never claimed it was an exact reconstruction. It reconstructed what it could from historical sources together with the calculated error band. Which, in turn, corresponds very well with (i.e. confirms) other scientific findings, like temperature reconstructions.

fred250
Reply to  nyolci
March 8, 2021 1:24 pm

Wow, more MINDLESS d’nyholist yabbering

DENIAL of the fact that sea level rise is TINY and inconsequential against the tidal range.

Not one tiny bit of rational thought or science in the whole mindless rant.

Poor ineffective nonce!

fred250
Reply to  nyolci
March 8, 2021 1:29 pm

“This is why it’s important to challenge denier-bs here.”

.

But it’s NOT something you have ever been able to do.

You do not have the capability or any science to be able to do so.

Come on, yabbering idiot, what do we “deny” that you can actually produce real scientific proof for?

So far you remain totally and absolutely EMPTY, a NULL, a VOID.

You are like a mindless chihuahua yapping behind a 6ft fence.

TOTALLY INCONSEQUENTIAL.

David Kamakaris
Reply to  nyolci
March 8, 2021 3:00 pm

ROFLMFAO! Keep it up, Denier. You’re serving your lost cause well.

nyolci
Reply to  David Kamakaris
March 9, 2021 10:12 am

You’re serving your lost cause well.

Well, perhaps it’s not that lost. You know climate change already has some very negative effects and people have started to notice this. Back in the good old days (for you deniers) people didn’t give a shit ‘cos they didn’t feel it. So the days of propaganda and endless bullshitting are over for you.

fred250
Reply to  nyolci
March 9, 2021 11:53 am

“You know climate change already has some very negative effects”

More unsubstantiated BS from the d’nyholist.

Again, what do we “deny” that you have solid scientific proof for?

Your comments are as meaningful as saying we deny the Grimm Bros’ fairy-tales.

The days of AGW propaganda and BS will never end while there are gullible anti-science twats like you to lap it up and regurgitate it.

Brian Jackson
Reply to  fred250
March 9, 2021 1:39 pm

80 people died in Texas from the “record cold.” I would call that “negative effects.”

fred250
Reply to  Brian Jackson
March 10, 2021 10:24 pm

global warming caused the “record cold”

Whatever you want to “believe”, moron!

David Kamakaris
Reply to  nyolci
March 9, 2021 12:13 pm

Yes you are that lost, Denier. But go buy yourself some carbon offsets, Loser. You’ll feel better in the morning. I’m sure you will deny that they’re a totally useless exercise in virtue signaling. But then again, it’s you, a world class DENIER.

Tim Gorman
Reply to  nyolci
March 9, 2021 1:44 pm

What “very negative effects”?

Do you mean lower heating bills? Do you mean record grain harvests for most years over the past 20 years? Do you mean fewer major hurricanes and tornadoes over the past 20 years? Do you mean significant growth in the polar bear population? Do you mean deer population explosions from better forage (300K deer in 1930 and 30M today)? Do you mean the feral hog population explosion over the past 30 years from several thousand to almost 9 million today, primarily because of growth in forage?

As usual, you don’t really have a good grasp on what you are complaining about, do you?

Paul Penrose
Reply to  nyolci
March 9, 2021 9:44 am

It is obviously impossible to have a rational discussion with you, so I’ll not waste any more time trying.

nyolci
Reply to  Paul Penrose
March 9, 2021 10:07 am

so I’ll not waste any more time trying

🙂 Another one bites the dust

fred250
Reply to  nyolci
March 9, 2021 11:57 am

Oh dear, yet another who sees through your EMPTY EVIDENCE-FREE BS

And you think it is a victory, proving that you KNOW that all you are doing is mindless trolling.

Arguing with a scientifically-illiterate barking-mad chihuahua, like you, is very low on most people’s time wasting list.

Brian Jackson
Reply to  fred250
March 9, 2021 1:46 pm

Love how the long-term WUWT residents describe someone professing facts as trolling.
.
PS: “scientifically-illiterate barking-mad chihuahua, like you,”
.
Someone once said that when you resort to name calling you have lost.
.
https://twitter.com/wattsupwiththat/status/406298804950798336

David Kamakaris
Reply to  Brian Jackson
March 9, 2021 2:32 pm

Normally I would agree with you, but when some mindless troll calls people a denier on this blog, that mindless troll flushed that rule straight down the commode.

Carlo, Monte
Reply to  Brian Jackson
March 9, 2021 8:45 pm

Like nholitz resorts to name calling?

fred250
Reply to  Brian Jackson
March 10, 2021 10:25 pm

Yet you are STILL TOTALLY DEVOID OF ANY SCIENCE

you are scientifically illiterate

you are still yapping mindlessly like a demented chihuahua!

STOP COMPLAINING ABOUT THE FACTS!

It’s pathetic to say the least!

Last edited 1 month ago by fred250
fred250
Reply to  nyolci
March 8, 2021 1:08 pm

And the science d’nyholist yaps mindlessly yet again.

ZERO CONTENT. just empty blather

Vertical axis is perfectly acceptable. It covers the range of the natural daily tides.

It puts the TINY RATE OF SEA LEVEL CHANGE IN PERSPECTIVE.

Sorry if that confuses your tiny incompetent little mind.

Last edited 1 month ago by fred250
Peta of Newark
March 8, 2021 1:41 am

I think I’ve sussed it.
2 parts/components/possibilities

Firstly I do so love that ‘yellow’ picture of the acoustic tide gauge.
THAT is how measurements of Global Average Surface Temperature should be made – by putting or burying the sensor in the object/substance being measured.
I say that that is The Primary Fail of Climate Science – the thermometers are all in the wrong place
If you want the ‘Temperature of the Surface of the Earth’, go there and measure it.

Do not assume that the temp of The Soil/Surface or (listen up Roy Spencer) The Water is always the same as the air above it.
So simple to check, and it is not done.

1) Serious question to which I can only guess (Willis is our man here)
= How does a tide move up a beach, especially comparing a long shallow beach with a short steep beach – in the ultimate extreme, a cliff face
Would an incoming tide roll further (read: higher) up the shallow beach?
Thus leading the Casual Observer to ‘see’ a higher than usual tide.
Something in my head says ‘yes it would’ Don’t ask why.

The Beach I’m alluding to here is The Continental Shelf (TCS).
Thus, if TCS was getting longer and shallower, would tides come further (higher) ashore?
We all know what made/makes and continues to grow TCS – it is all that muddy brown water we see on the news whenever it rains anywhere.
It did not used to be like that.
Insane as it seems, flood water should not be brown/orange/red/yellow/black – barring of course something really catastrophic like earthquake, volcano or forest fire

In the really simple case, is that mud ‘filling in the ocean’ while simultaneously decreasing the average height of The Land?
There’s your Sea Level Rise straight off

2) The answer to this query is out there already in ENSO-land
What we need to know is..
Are the tide heights in the Western Pacific measurably different during the 2 distinct phases of ENSO?
viz: La Nina vs El Nino
Surely they must be higher during La Nina, because of the large pool of warm water being accumulated there. Isn’t that what is, or what powers, the El Nino – that warm pool collapses and floods back eastward over the Pacific?

Now then, that pool of warm water (high tides yes/no) is created by warm trade/prevailing winds sweeping up sun-warmed water.

What if those winds got a ‘head start’?
What if they started as really hot winds blowing off the land and then carried on over the ocean?
The air in the winds is not going to have much (any) heating effect on the water, but the hot & dry air would clear the sky as it sets off across the ocean, allowing more sun to get into the water.

Can you see where I’m going..
We already have ENSO and we have the hot dry winds- The Santa Ana or Diablo winds.
We have an experiment running already.

me being me and you being you knowing me, you know what’s coming next.
Soil Erosion
As if it wasn’t part of Theory (1) above
Gawd, what am I like?

Thus, idea #2 goes like this:

  1. Deforestation & tillage create more offshore warm winds
  2. These warm winds let more sunlight into the water = extra heating
  3. Does this extra heat cause the water to expand and seemingly raise sea-level?
  4. Does this extra warm water create bigger and or more frequent ENSO events?
  5. Do these enhanced ENSO type events cause an apparent or actual sea-level rise where the pools of warm water are accumulating, before they collapse and the whole thing repeats

There was/is a musically annotated YouTube video out there showing sea-level rises at myriad different places around the World – yes, some went up, but about the same number went down.
That little ENSO theory explains it beautifully doncha think

Bob Tisdale – over to you..
(see, I did try to pay attention)

Last edited 1 month ago by Peta of Newark
Smart Rock
Reply to  Kip Hansen
March 8, 2021 10:18 am

Kip – I think Peta lives at Newark, Nottinghamshire. He moved from Cumbria to be near the sugar refinery they have there (that’s a joke btw).

Neither unsafe, nor easy to get lost in!

I got lost in Newark NJ late at night once, but it was 1970 and probably less hazardous than yours. A very stressful experience all the same.

Steve Reddish
Reply to  Peta of Newark
March 8, 2021 8:37 am

Peta,
“Do not assume that the temp of The Soil/Surface or (listen up Roy Spencer) The Water is always the same as the air above it”

I don’t believe Roy Spencer, or anyone else measuring/studying the atmosphere, has made that claim. When meteorologists speak of the “surface temperature” they are referring to the “air temperature at or near the surface” – usually several feet above the surface, as that matters to people. Air temperature right at the surface matters to bugs. Temperature of the soil matters to gardeners.

Ian W
Reply to  Kip Hansen
March 9, 2021 6:08 am

>>>>Kip

As is usual in ‘climate “science”’ there is no clarity about what they are measuring. This may be (probably is) due to a poor understanding of physics. Atmospheric temperature is an intensive variable – more than one variable affects atmospheric temperature. This makes ‘average’ atmospheric temperature a nonsense. The intent is to measure the ‘energy content’ of a volume of the atmosphere. This means that the enthalpy of that volume must also be determined – which depends on its specific heat and hence on its water vapor content (humidity).

A volume of air at 2 meters in a misty Louisiana bayou after an afternoon thundershower at 75F and 100% humidity has roughly twice the energy content of a similar volume of air in an Arizona desert at 100F but at almost zero humidity. The ‘average heat content’ in kilojoules per kilogram can be calculated and is a real value. But average temperature is meaningless.

The problem with Climate ‘Science’ is that many of its values are ‘colloquial’ rather than metrics used in classical physics. What do climate ‘scientists’ mean by ‘hotter’? What units are they measuring hotter with? What do climate scientists mean by ‘warming’? Unless they are really clear on the metrics that they use, the climate ‘scientists’ have no basis for claiming that something is dangerous. It is very easy to fall into this ‘colloquial’ trap.

For example, if the average atmospheric humidity drops, the atmospheric temperature will go up. Is that dangerous? It is the same amount of heat energy in the same volume of atmosphere, just less water vapor. Carbon dioxide is said to ‘trap heat’ and then scatter the heat as radiated infrared; some will return to the surface and be re-emitted. So the metric for the ‘trapping of heat’ would be atmospheric ‘heat content’ in units of kilojoules per kilogram, NOT temperature. It is unfortunate that all the long historic observations have been temperature, but if they also recorded humidity (wet bulb temperature) then the enthalpy can be calculated and the heat content derived.

Continually using the incorrect metrics is the reason for the quotes in Climate ‘Science’. Then we see the angels-on-a-pin arguments about statistical precision based on the incorrect metrics – and these are just a waste of time. It really doesn’t matter how many decimal places you report the wrong metric to – it is the WRONG metric and proves nothing.

Tim Gorman
Reply to  Ian W
March 9, 2021 8:27 am

Can the satellites measure humidity the way they do radiance? If not then their results aren’t very useful either.

March 8, 2021 2:03 am

Kip,

allows the construction of a robust and extended Mean Sea Level curve

The Guardian ran an article about the Atlantic Meridional Overturning Circulation. Some “progressive” outlets picked it up. The problem was (is): the Atlantic Meridional Overturning Circulation is losing its strength; it is weakening. A story in Nature Geoscience, told by scientists, who were told so by their models.
It immediately started a new round of self-flagellation among the alarmists. One commenter wrote: Gaia will find a way to purge the human virus that has made it sick. It will run a fever for a millennium if needed … Another one wrote: the earth will shake us off like fleas from a dog.

What is weakening?
They – the Guardian, and other “progressive” outlets – do not tell us.
Nature Geoscience is also not very helpful: “the giant ocean circulation is relevant for weather patterns …” so these patterns could change. I assume they do not think about changing summertime and wintertime, but something like more hurricanes, winter storms etc.

There is little observational data: measurements started in 2004.
The article – as other articles dealing with this subject – is loaded with “likely linked to” “suggests” “could imply” “is associated with” “possible consequences”.
The research is called independent, but it uses all the proxies used by the climate models (and that includes tree rings!), the same points of departure, and output (e.g. estimates) derived from other models.

This utterance is my favourite:

While the individual proxy data is imperfect in representing the AMOC evolution, the combination of them revealed a robust picture …

commieBob
March 8, 2021 2:32 am

You can do a real study and collect real data and not falsify the data in any way and still get bogus results that look statistically significant.

Here’s a dirty little science secret: If you measure a large number of things about a small number of people, you are almost guaranteed to get a “statistically significant” result. Our study included 18 different measurements—weight, cholesterol, sodium, blood protein levels, sleep quality, well-being, etc.—from 15 people. (One subject was dropped.) That study design is a recipe for false positives.

link

Here’s a link to an excellent cartoon that explains data dredging which is also known as p-hacking.

If you torture the data long enough, it will confess to something. At that point you stop and publish. Do not investigate further, because the result you published will evaporate. It reminds me of the advice given to a tabloid writer: do not fact-check yourself out of a good story.

In light of the above, it’s no surprise at all that we have a replication crisis.

Ben Vorlich
March 8, 2021 2:33 am

This article has reminded me of something. At the beginning of lockdowns I was looking for things to keep the mind active. I came across Zooniverse.

The Zooniverse is the world’s largest and most popular platform for people-powered research. This research is made possible by volunteers — more than a million people around the world who come together to assist professional researchers. Our goal is to enable research that would not be possible, or practical, otherwise. Zooniverse research results in new discoveries, datasets useful to the wider research community,

Current front page project
https://www.zooniverse.org/projects/psmsl/uk-tides

one that appears to be complete for the moment
https://www.zooniverse.org/projects/edh/weather-rescue

How you obtain data from these projects, I’m not sure.

Steve Case
Reply to  Ben Vorlich
March 8, 2021 5:49 am

Ben, your 2nd link above about rescuing weather data reminds me of Dr. Phil Jones and his Climate Research Unit:

Global Warming ate my data
We’ve lost the numbers: CRU responds to FOIA requests
theregister.com

Steve Case
Reply to  Kip Hansen
March 8, 2021 8:56 am

Kip, what kind of quality control is there on volunteerism? I’m sure mistakes are made.

Ben Vorlich
Reply to  Steve Case
March 8, 2021 9:26 am

When I worked at getting paper records into a database, two people entered the same records independently. The numbers were compared. This way it was unlikely that the same mistake would be made on the same record. Omissions in either, as well as typos, could all be eliminated.
Hopefully this is standard practice.

2hotel9
March 8, 2021 3:41 am

If the math don’t fit you must acquit! Johnnie Cochran is spinning in his grave, but I had to say it.

Steve E.
March 8, 2021 4:13 am

For the truly clueless in the MSM, might I suggest ADDING one red line to the plot? It should go vertical at 1950 to represent the magic date when CO2 took over, causing all things.

Brian Jackson
March 8, 2021 4:47 am

This is a perfect example of starting with a pre-determined conclusion, then cherry picking the data you need to prove that conclusion.

Brian Jackson
Reply to  Kip Hansen
March 8, 2021 10:05 am

Your opinion doesn’t give you the authority to discard data you don’t like. That is the essence of “cherry picking.”

Brian Jackson
Reply to  Kip Hansen
March 8, 2021 4:27 pm

“because it impedes a more pragmatic understanding of the whole from a lived-experience” … in other words, you don’t understand the math, so you want to ditch it.
.
.
“non-computational perspective to understand some data.” What BS. The data is numerical; please tell me how you are going to “understand” this data without mathematical summarizations (AKA statistics). Are you planning on using an Ouija board?
.
Can you please provide me a mathematical proof that the “average of an average of an average” is not correct?
..
Do you really understand the difference between accuracy and precision? From your comment it is obvious that you do not. Standard Error is your friend when working with statistical estimators of a physical value. Do you know that these statistical methodologies underpin Quantum Physics? In Statistical/Quantum Mechanics it is blasphemy to “throw the numbers out” simply because the math is beyond your comprehension.
.
For a real world example of your logic, Newton’s laws of motion could not explain the precession of Mercury’s orbit. You would throw out the data for Mercury’s orbit for pragmatic reasons.
.
Good luck doing real science….too bad when it gets “hard” you give up and throw in the towel, you will fail as a scientist.

Brian Jackson
Reply to  Brian Jackson
March 8, 2021 4:38 pm

Average of averages:
.
.
A = {3,7,2,4,8,6} sum(A)=30 avr(A)=30/6=5
B = {9,2,9,4,5,7} sum(B)=36 avr(B)=36/6=6
(avr(A)+avr(B))/2 = (5+6)/2 = 5.5
sum(A+B)=66  avr(A+B) = 66/12 = 5.5

Get it?

Brian Jackson
Reply to  Brian Jackson
March 8, 2021 4:39 pm

Please provide a counterexample to prove that an average of averages is not correct.

Tim Gorman
Reply to  Brian Jackson
March 8, 2021 5:33 pm

You are *assuming* that whoever takes the average of A=30 and B=36 KNOWS that the underlying data sets have 6 members each that have been combined into one data set of 12 members.

If you just give me A=30 and B=36 then I would tell you the average is 33.

Brian Jackson
Reply to  Tim Gorman
March 8, 2021 5:54 pm

Do you understand what a “dataset” is?…..if you question the number of entries in a dataset, you are in worse shape than Kip when it comes to math.

Tim Gorman
Reply to  Brian Jackson
March 8, 2021 6:31 pm

Brian,

Stop dissembling. You started with two data sets each with a population of 6. Then you moved to a data set that has a population size of two and an average of 33.

When you combine two data sets of population size 6 into one data set of population size 12 then you no longer have any 30 and 36 means. You no longer have an A data set and a B data set.

If you want to keep the A data set and the B data set with means of 30 and 36, respectively, then the average of the means is 33.

You need to learn consistency in your math!

Last edited 1 month ago by Tim Gorman
Brian Jackson
Reply to  Tim Gorman
March 9, 2021 12:23 pm

“If you want to keep the A data set and the B data set with means of 30 and 36”
.
Tim, 30 and 36 are the sums of datasets A and B. How can you misinterpret sum(A)=30?

Tim Gorman
Reply to  Brian Jackson
March 9, 2021 2:11 pm

You missed the point entirely! If you want to keep two data sets then be consistent with your math and keep two data sets.

If you want one data set then stick with one data set!

You can’t do both!

Jim Masterson
Reply to  Brian Jackson
March 9, 2021 11:17 am

>>
Please provide a counterexample to prove that an average of averages is not correct.
<<

A = {3,7,2,4,8,6,8,6,8,6} sum(A)= 58 avr(A)=58/10=5.8
B = {9,2,9,4,5,7} sum(B)= 36 avr(B)=36/6=6
(avr(A)+avr(B))/2 = (5.8+6)/2 = 5.9
sum(A+B)=94  avr(A+B) = 94/16 = 5.875

You picked a special case where the number of elements in each set are the same. In general, that may not be the case. For example, averaging the grades in a school first by classroom and then averaging the classrooms to get a school average will not necessarily obtain the same result as averaging all the grades by the school together. The one exception would be if each classroom had the same number of students.

Jim

Brian Jackson
Reply to  Jim Masterson
March 9, 2021 12:33 pm

Jim, you are correct, but it is not a “special case.” When you combine two time series, you have to have the same number of elements in each, or you are not averaging across identical intervals of time.

Brian Jackson
Reply to  Brian Jackson
March 9, 2021 12:48 pm

For example, if you have two datasets with a year’s worth of daily high temperatures, both of them will have 365 data points (leap years acknowledged/corrected for).

Reply to  Brian Jackson
March 9, 2021 3:29 pm

>>
. . . both of them will have 365 data points (acknowledging/corrected for leap years.)
<<

You should have quit while you thought you were ahead. Different-length years invalidate your original argument. However, that’s not how climatologists average temperatures–by the year. First they average the daily temperatures to get the monthly average–so they can add a bogus tenth of a degree of precision. Then they average the monthly temperatures to get the yearly average–so they can add a bogus hundredth of a degree of precision. (Thanks to the Romans, our months–hollow and full–have different numbers of days.) And February doesn’t even have thirty days.

Oh, by the way, temperature measured by a thermometer, or something that acts like a thermometer, is an intensive thermodynamic property. Intensive properties can’t be averaged–it’s nonsense (despite what Mr. Mosher says about colors).

Jim

Tim Gorman
Reply to  Brian Jackson
March 8, 2021 5:29 pm

“The data is numerical; please tell me how you are going to ‘understand’ this data without mathematical summarizations (AKA statistics)?”

What statistics do you use on a time series?

“Standard Error is your friend when working with statistical estimators of a physical value. “

Standard Error only works when the underlying data is random and part of a probability distribution, usually Gaussian. How does data from different measurement devices get combined into being a random variable with a Gaussian probability distribution? Especially when these measurements are part of a non-stationary time series?

Quantum mechanical measurements *are* random and usually fit a Gaussian probability distribution.

“In Statistical/Quantum Mechanics it is blasphemy to “throw the numbers out” simply because the math is beyond your comprehension.”

Like most so-called scientists in the climate sector, the math seems to be beyond *your* comprehension. You simply cannot understand that not everything fits the definition of a random variable with a Gaussian distribution. Nor do you even attempt to convert time series into a stationary configuration. If you can calculate a standard deviation of the mean and generate a trend line, you just ignore everything else!

Brian Jackson
Reply to  Tim Gorman
March 8, 2021 5:56 pm

“What statistics do you use on a time series?” Basic stuff, like averages, standard deviation, and linear regression.

Tim Gorman
Reply to  Brian Jackson
March 8, 2021 6:46 pm

When analyzing time series they must be stationary.

  1. Constant mean
  2. Constant variance
  3. Constant covariance between periods of identical distance

I am not your teacher. Go google: time series stationary
Learn something

Brian Jackson
Reply to  Tim Gorman
March 9, 2021 10:55 am

Why should I when you don’t understand BASIC stat? (i.e. random sampling)

Tim Gorman
Reply to  Brian Jackson
March 9, 2021 1:56 pm

ROFL! Why should you learn something? I don’t know. Maybe because you won’t look so foolish?

Brian Jackson
Reply to  Tim Gorman
March 8, 2021 5:57 pm

“…and is part of a probability distribution” ….. correct, it’s called random sampling.

Tim Gorman
Reply to  Brian Jackson
March 8, 2021 6:49 pm

Sorry, when you don’t have a probability distribution you can’t do random sampling.

If you measure ten different boards with ten different tape measures how does this become a random variable? How do the measurements create a probability distribution?

You wind up with ten independent data sets of population size one. How do you random sample anything?

Brian Jackson
Reply to  Tim Gorman
March 9, 2021 10:52 am

“Sorry, when you don’t have a probability distribution you can’t do random sampling.”
.
Not true.
.
https://en.wikipedia.org/wiki/Sampling_(statistics)
.
One can discover the underlying distribution with a random sample of the target population. You do not need to have any information about the distribution prior to taking a random sample of that population.
.
The problem with your measuring boards is that you have not indicated what the population is, how the ten were selected. In each of the ten instances, a single measurement is not random if the population is one board.

Tim Gorman
Reply to  Brian Jackson
March 9, 2021 1:55 pm

“Successful statistical practice is based on focused problem definition. In sampling, this includes defining the ‘population’ from which our sample is drawn.”

Maybe you should read what you link to.

If you don’t have a probability distribution then you don’t have a population from which to sample either.

Nor can you compare a *counting* situation with a measurement uncertainty. There is no uncertainty when you are counting characteristics. Black hair is black hair and not black hair +/- u.

When all you have are a multiplicity of separate, independent measurements you simply don’t have a population you can randomly sample.

“The problem with your measuring boards is that you have not indicated what the population is, how the ten were selected.”

Now you’re beginning to get the idea!

“In each of the ten instances, a single measurement is not random if the population is one board.”

And what if those ten measurements were temperature measurements at different stations?

Those measurements wouldn’t be random either, right? So how do you combine them into a Gaussian distribution which requires you to assume that they *are* random?

Brian Jackson
Reply to  Tim Gorman
March 8, 2021 6:00 pm

“How does data from different measurement devices get combined into being a random variable with a Gaussian probability distribution? ”

All measurements from any measurement device are randomly distributed with a mean and standard deviation. You can find these values in the specifications for the device from the manufacturer.

Last edited 1 month ago by Brian Jackson
Tim Gorman
Reply to  Brian Jackson
March 8, 2021 6:53 pm

You do *NOT* have the SAME MEASUREMENT DEVICES.

Why is this so hard to understand?

“data from around the coast of Great Britain”

Did someone take the same measurement device all around the coast every day, install it in the same manner, and take measurements?

———————————–
“How does data from different measurement devices”
“All measurements from any measurement device”
————————————

How is it possible you can’t tell the difference between these two phrases?

Brian Jackson
Reply to  Tim Gorman
March 9, 2021 12:35 pm

Each measurement device measures the same quantity, such as mass, temperature, or length, and is calibrated to a known standard.

Tim Gorman
Reply to  Brian Jackson
March 9, 2021 2:16 pm

They are not calibrated when they have been in the field for more than 24 hours. They become *field* stations with an uncertainty value! Actually it probably takes less than 24 hours. Unless the temperature, humidity, etc. are exactly the same at the field location as they were in the calibration lab, the station becomes uncalibrated upon installation! And each field station will have an uncertainty interval!

Have you *ever* done field work at all?

Brian Jackson
Reply to  Tim Gorman
March 8, 2021 6:07 pm

“Especially when these measurements are part of a non-stationery time series?”
.
The measurements are Gaussian. You combine Gaussian measurements with other Gaussian measurements to get a set of Gaussian measurements.

Tim Gorman
Reply to  Brian Jackson
March 8, 2021 6:59 pm

How can they be Gaussian when they are independent measurements?

To be a probability distribution you need to be measuring the same measurand with the same measuring device. This rule is violated when you are using measurements from different locations (i.e. different measurands) using different measurement devices.

Even if you are using the same measurement device you still have to make sure the distribution is Gaussian. You can’t just assume that it is.

The data from a sensor measuring the diameter of a wire being pulled through a die can easily generate a skewed, non-Gaussian probability distribution because of wear of the die and wear on the sensor. Do you even know how to test for a skewed probability distribution?

Brian Jackson
Reply to  Tim Gorman
March 8, 2021 6:12 pm

Mr Gorman, for your information *ANY* set of data has a calculable mean and standard deviation. If the data is a time series, then a trend line can also be calculated. The underlying distribution – whether it is normal, uniform, geometric, or other – is not relevant to these calculations.

Brian Jackson
Reply to  Brian Jackson
March 8, 2021 6:18 pm

Mr. Gorman, you and Kip Hansen must have the same aversion to mathematics.
.
Don’t you understand that when you are doing random sampling, the distribution of the sample mean is known to be Gaussian (the central limit theorem)?

fred250
Reply to  Brian Jackson
March 8, 2021 6:49 pm

Brianless has only the most rudimentary understanding of basic mathematics.

Has “heard” a few words while sitting at the back of the class, but never bothered getting any understanding or insight.

And likes to SHOW HIS IGNORANCE at every post he makes.

It’s quite hilarious to watch 🙂

Last edited 1 month ago by fred250
Brian Jackson
Reply to  fred250
March 9, 2021 12:37 pm

Please quote any error I have made, thank you in advance for your inconsideration.

fred250
Reply to  Brian Jackson
March 8, 2021 7:01 pm

Brianless.

If you record a temperature of 16C, it is equally likely to be 16.1C as to be 16.3C, and most certainly will not be 16.8C unless you are totally incompetent.

i.e. NOT normally distributed about each discrete data point.

Try NOT to be mathematically ignorant all your life!!

Brian Jackson
Reply to  fred250
March 9, 2021 2:04 pm

Not true. If you measure 16, it is less likely to be 16.3 than 16.1 because the measurement error of the instrument is normally distributed. https://people.astro.umass.edu/~schloerb/ph281/Lectures/NormalDistribution/NormalDistribution.pdf

Tim Gorman
Reply to  Brian Jackson
March 9, 2021 7:42 pm

Nope. Measurement uncertainty has *NO* probability distribution associated with it. The interval means the true value can be anywhere in the interval. I guess you could say the true value has 100% probability of being the true value and all other values in the interval have 0% chance of being the true value but if you don’t know what the true value is those probabilities are meaningless.

With ONE measurement, i.e. a recorded value of 16, there is no probability distribution to describe the uncertainty. Error only enters into the discussion if you make MULTIPLE measurements and that is impossible to do with temperature. Once you take a reading the atmosphere that generated that reading is gone, it lives in the past where it is unreachable. No multiple measurements.

Tim Gorman
Reply to  Brian Jackson
March 8, 2021 7:08 pm

You don’t even know how to determine if a probability distribution is Gaussian, do you?

Brian Jackson
Reply to  Tim Gorman
March 9, 2021 2:07 pm

Plot the data…..the bell curve will be obvious.

Tim Gorman
Reply to  Brian Jackson
March 9, 2021 7:44 pm

Plot what data? You only have ONE piece of data. It’s just a pencil point on a graph.

fred250
Reply to  Brian Jackson
March 8, 2021 9:28 pm

Come on Brianless.

If I have 100 temperatures recorded as 16ºC, are these distributed as a normal Gaussian curve?

If so what is the standard deviation ?

Brian Jackson
Reply to  fred250
March 9, 2021 2:09 pm

Not Gaussian; the average is 16 and the variance is zero.

fred250
Reply to  Brian Jackson
March 10, 2021 10:30 pm

“variance is zero”

ROFLMAO

You have just shown how UTTERLY CLUELESS you really are about basic measurement…

Thanks for confirming the FACT that you are TOTALLY IGNORANT!!

Your slapstick comedy non-act couldn’t get any more hilarious! 🙂

Tim Gorman
Reply to  Brian Jackson
March 8, 2021 7:07 pm

“Mr Gorman, for your information *ANY* set of data has a calculable mean, standard deviation.”

Only if you violate the restriction that the data come from a random variable that creates a probability distribution.

You seem to be saying I can develop a data set by gathering independent data (say – the population of Baltimore, the height of Mount Everest, the distance between New York City and Los Angeles, and the weight of an ounce of gold) and come up with a calculable mean and standard deviation. Yeah, you can calculate such values. If you think that mean and standard deviation means anything then you are lost.

Trend lines of time series that are not stationary are as meaningless as the data set of independent data described above.

Like I said, you can calculate anything you want. That doesn’t mean that what you calculate has any meaning in the real world!

Brian Jackson
Reply to  Tim Gorman
March 9, 2021 2:14 pm

If you add the population of Baltimore to the height of Mt. Everest, what are the units of the sum?

Tim Gorman
Reply to  Brian Jackson
March 9, 2021 7:45 pm

“If you add the population of Baltimore to the height of Mt. Everest, what are the units of the sum?”

You tell me! You’re the one who said you could turn separate, independent measurements into a probability distribution, not me.

Mr.
Reply to  Brian Jackson
March 8, 2021 8:25 am

You mean like one Yamal bristlecone pine tree?

Paul Penrose
Reply to  Brian Jackson
March 8, 2021 10:33 am

Brian,
I’m sorry you didn’t understand it.

Brian Jackson
Reply to  Paul Penrose
March 8, 2021 11:07 am

“new perspective which allows us to throw out the numbers ” ….. translated, “I don’t like this data, but I like that data.”
.
“critical thinking, prior knowledge, and logic – and not math, leading up to throwing out the numbers” ….. translated, “I like the dark red cherries, and not the bright red ones.”
.
What kind of cherries do you like?

fred250
Reply to  Brian Jackson
March 8, 2021 1:42 pm

Poor brianless one. Presents NOTHING to counter Kip’s analysis.

Just more mindless yabbering.

Quite hilarious if it weren’t so pathetic.

Brian Jackson
Reply to  Kip Hansen
March 8, 2021 4:41 pm

Nice to see that someone who finds error with your post is considered a “troll.”

fred250
Reply to  Brian Jackson
March 8, 2021 7:03 pm

Poor brianless, you are basically just a mathematically INEPT CLOWN.

You have NOT found any errors, except in your own ignorance and lack of comprehension!!!

Lrp
Reply to  Brian Jackson
March 8, 2021 10:43 am

Do you have a different set of data?

DHR
March 8, 2021 5:19 am

For a very long perspective refer to “A Search for Scale in Sea-Level Studies” by Curtis Larson and Inga Clark of the USGS, Journal of Coastal Research 22/4/788-800, July 2006. Larson and Clark conclude from examination of peat bogs and sediments that relative sea level around the Chesapeake Bay in the US has been rising at about 1.5 or 2 mm/yr for the past 6,000 years, with lots of ups and downs.

Carlo, Monte
March 8, 2021 5:32 am

Adding the range bar gives an excellent illustration, well done.

Nick Schroeder
March 8, 2021 5:49 am

Ideas:

How it does not work.

By reflecting away 30% of the incoming solar radiation the albedo, which would not exist w/o the atmosphere and its so-called GreenHouse Gases, makes the earth cooler than it would be without the atmos/GHGs much like that reflective panel propped up on the car dash. Remove the atmos/GHGs and the earth becomes much like the moon, an arid, barren rock with a 0.1 albedo, 20% more kJ/h, hot^3 on the lit side, cold^3 on the dark.
If this is correct, the Radiative GreenHouse Effect theory fails.

For the GHGs to warm the surface with downwelling energy as advertised they must absorb/trap/delay/intercept “extra” energy from somewhere in the atmospheric system. According to RGHE theory the source of that “extra” upwelling energy is the surface radiating as a near ideal Black Body. As demonstrated by experiment the surface cannot radiate BB because of cooling by the non-radiative heat transfer processes of the contiguous atmospheric molecules.
If this is correct, RGHE theory fails.

How it does work.

To move fluid through a hydraulic resistance requires a pressure difference.
To move current through an electrical resistance requires a voltage difference.
To move heat through a thermal resistance requires a temperature difference. (Q=UAdT)
Physics be physics.

The complex thermal resistance (R=1/U) of the atmosphere (esp albedo, net Q) is responsible for the temperature difference (dT=Tsurf-Ttoa) between the warm terrestrial surface and the cold edge of space (32 km).
And that heat transfer process involves the kinetic energy of ALL of the atmospheric molecules not just 0.04% of them.

DMacKenzie
Reply to  Nick Schroeder
March 8, 2021 7:15 am

Your heat transfer equations are common approximations for heat conduction and forced convection, where radiative heat transfer is too low to be relevant. But our planet circles the Sun in the largest vacuum bottle we know of, so radiative heat leaving the TOA matters. And radiative heat transfer is a function of T to the fourth power, not T to the first power. And there are sufficient water and CO2 molecules in the atmosphere to absorb a high proportion of the infra-red photons emitted by the surface towards outer space.
So by all means, use Q=UAdT to calculate how much insulation your house needs, but don’t use it for calculating how much heat will be transmitted to your television by your IR remote, at least not without taking into account the relative humidity between you and your TV.

Last edited 1 month ago by DMacKenzie
Mohatdebos
March 8, 2021 5:55 am

Much of data science and analytics that is being taught at universities does not seem to distinguish between real causation and spurious correlations. I was presented with numerous models based on data analytics that could be used to forecast sales when I was the Chief Economist at a large American company. It was interesting that the modelers generally could not explain why variable X would cause variable Y to change, only that their data showed it did. I rejected these models, to my CEO’s dismay. I understand the company has one of the largest data analytics groups now that I am gone.

Lance Flake
March 8, 2021 6:28 am

My engineering physics professor in college was truly old school. His rule for homework and tests was that if you showed all your work on the formulas and units, with no numbers used, you could get a 90% grade. That class was where I learned to get the fundamental ideas correct first and worry about the exact answer last, which has served me quite well in my engineering career.

Tim Gorman
Reply to  Lance Flake
March 8, 2021 11:35 am

Wow! That’s how I was taught as well! The prof didn’t care very much if you misread your slide rule and/or screwed up your addition/subtraction/multiplication/division – *IF* you did it the right way! He said people can *always* check your work and find your screwups *if* you showed your work.

Robert W Turner
March 8, 2021 6:40 am

I think your point about computers making computation easier and slowly replacing logic, reasoning, and common sense cannot be overstated. There is some Stanford math professor making YouTube videos that are confusing people with rife sophistry, i.e. claiming that 60/5(7-5) = 24 rather than = 6, because that is the answer you get when you put it into a calculator – simply ignoring that basic calculators don’t parse 5(7-5) as a single grouped term as they should. So yes, academia is literally at the point where century-old syntactical rules of math are being replaced with outputs dictated by limitations in basic programming.

P.S. Climate cultists and lukewarmers would do well to understand how the Law of Conservation of Momentum applies to radiative transfer in gases. IR-active gases are cooled by incident IR just as much as they are warmed by it. Einstein using the Principle of Relativity in conjunction with measurements of light to create the Theory of Relativity, which then led to QM, is the perfect example of how ideas using logic and reasoning are the heart and soul of science.
http://web.ihep.su/dbserv/compas/src/einstein17/eng.pdf

Clyde Spencer
Reply to  Robert W Turner
March 8, 2021 9:33 am

It is all about precedence in order of calculation! If one leaves parentheses off of a formula, they cannot expect to get the right answer.

Robert W Turner
Reply to  Clyde Spencer
March 8, 2021 10:32 am

At least we were taught that when using a calculator, you need to use brackets around the entire numerator and denominator because the calculator cannot parse the complete numerator and denominator like a human can. So now it’s backwards, and we need to unnecessarily write out the understood parentheses around the numerator and denominator so that the NPCs, err I mean the nu-mathematicians, can input it into their calculators without having to think for themselves – this does not surprise me in the least since the world is standing on its head.

Paul Penrose
Reply to  Robert W Turner
March 8, 2021 10:35 am

The calculator on my phone has () keys, so it gets it right. Also, if you put that into google, it gets the correct answer as well.

James Schrumpf
Reply to  Paul Penrose
March 8, 2021 7:13 pm

Yeah, well, when I enter 60/5(7-5) on my calculator, using parens, I get 24 because it’s the correct answer.

60/5(7-5) = 60/5*2, and since one does multiplication and division in the order they occur, that’s 60/5 = 12 and 12*2 = 24.

Since when is 5(7-5) anything other than shorthand for 5*(7-5)? If I don’t use parens on my calculator, 60/5*7-5 = 79, because 60/5 = 12, 12*7 = 84, and 84-5 = 79.

Am I on Facebook?

fred250
Reply to  James Schrumpf
March 8, 2021 11:49 pm

5(7-5) means “5 lots of (7-5)”, i.e. = 10.

It effectively brackets the 5 with the (7-5)

Tell me, what is the value of…..

60 / 5(x-5) when x = 7 ?

Or is basic algebra beyond you ?

Last edited 1 month ago by fred250
Tim Gorman
Reply to  James Schrumpf
March 9, 2021 7:42 am

From wikipedia:

For example, the manuscript submission instructions for the Physical Review journals state that multiplication is of higher precedence than division with a slash,[22] and this is also the convention observed in prominent physics textbooks such as the Course of Theoretical Physics by Landau and Lifshitz and the Feynman Lectures on Physics.[d]

This means 60/5*2 = 60/10 = 6.

This is also how I learned the rules in my engineering courses in college.

Of course in high school we never heard of a forward slash. No computers. Very few typewriters. When we *did* use a typewriter we always used an underline between numerator and denominator and a half-line space to type the result. How many here even remember trying to use a typewriter for equations?

Editor
March 8, 2021 7:57 am

Thanks, Kip, an interesting post.

One point of note. You say:

While the data set is claimed to be an “extended Mean Sea Level curve”, it is not that, but rather a graph of the annual average of annual average Relative Mean Sea Levels from 174 different sites on Great Britain (GB) and Ireland – some of the data points are modern (20th and 21st century) data and some are historic (19th century).

In fact, in the paper they spend a lot of time discussing the “leveling” (or “levelling” in the UK spelling) of the data to remove vertical ground motion, using both GPS and standard surveying techniques. As a result, they are not measuring the relative sea levels as you say.

It is indeed as they claim, an “extended Mean Sea Level curve” for the UK.

Best regards,

w.

Granum Salis
March 8, 2021 7:58 am

I used to find the top of the sea 43 meters down from my house but now it’s just 1,331.7 km down from Jason-3 (+/- 12km).

Granum Salis
Reply to  Kip Hansen
March 8, 2021 9:27 am

To be fair, that’s perigee/apogee. They claim accuracy to 1 cm.

The satellite knows where it is because it measures its distance from the sea.

Ray in SC
Reply to  Granum Salis
March 8, 2021 1:34 pm

Granum,
That leads to some circular reasoning. If the altitude of the satellite is calibrated based on sea level, how can the satellite then be able to determine the relative change in sea level?

Granum Salis
Reply to  Ray in SC
March 8, 2021 5:52 pm

That is true, and they go round in a sort of circle every 112.42 minutes.
There is a cute minutephysics video on youtube on measuring sea level.

yirgach
March 8, 2021 8:05 am

Keeping with the Doc Martin meme, you could say that “throwing out the numbers” keeps you from Going Bodmin.

Clyde Spencer
March 8, 2021 8:58 am

Kip
I am seriously disappointed. You promised to get rid of all those pesky numbers. Yet there they were, like gum stuck to the sole of one’s shoe.

Seriously, I think that it would be more accurate to say that what you have done is remove the detail — not being able to see the forest for the trees. In remote sensing, this is sometimes referred to as a synoptic view, or a broad overview. That is, distilling all the detail into a synopsis of the essential information. You accomplished your goal with slide 9 by putting all those pesky numbers into perspective. Your tide interpretation is not unlike the frenetic agitation about the small increase in annual temperatures, which is a fraction of the seasonal changes, or the minute changes claimed for the marine pH, which also is a fraction of the diurnal and seasonal changes, or what happens minute by minute in upwelling zones.

Good show!

Steve Z
March 8, 2021 9:19 am

Trying to track changes in “mean sea level” over hundreds of years definitely has an extremely low signal-to-noise ratio, where the “noise” is the twice-daily fluctuation between low and high tide, with a secondary monthly cycle of the phases of the moon. The amplitude of the tidal cycle (water elevation difference between high and low tide) tends to be highest during the full moon, slightly lower at the new moon, with minima at the first and last quarter (sun and moon about 90 degrees apart in the sky). The amplitude of the tidal fluctuation is also higher at the equinoxes (gravitational forces from the sun and moon aligned in the same plane) than at the solstices (gravitational force from the sun offset from the plane of the moon’s orbit).

So, anyone trying to track changes in “mean sea level” has to filter out one cycle with a period of about 12.4 hours, another cycle with a period of about 29 days, and another cycle with a period of about 183 days, all of which have much higher amplitudes than the long-term trend.

Also, for a place such as Port Isaac shown in the photos, what would be of interest to any sailors wanting to enter or leave the harbor is the water depth at the entrance to the harbor–will the bottom of their keels be above the sand or not?

At low tide, with no water in the harbor, that water depth would be recorded as zero–all boats are grounded. But is the “sea level” at low tide actually negative–below the level of the sand at the entrance to the harbor? The person reading the tide gauge at low tide in the 1800’s (or even at present) would probably write down zero, and wouldn’t care how far the water’s edge was from the entrance to the harbor, although that would introduce a bias into the sea level at low tide.

Regarding Steve Case’s comment below:

“But we have governments around the world that have taken that idea as gospel and are acting on it, and that’s the problem.”

Suppose that, for sake of argument, sea levels are increasing at a rate of 2 to 3 mm per year, or about 8 to 12 inches per century. The problem is that “governments around the world” have tried to impose a massively expensive global “solution” that would create poverty everywhere (abandoning the use of fossil fuels) to what is essentially a local problem, limited to land areas at elevations less than a few feet above high tide.

The better, cheaper solution is for LOCAL governments in areas threatened by sea level rise to build sea walls over the next decades to protect themselves against a slowly rising sea level, and allow people living at higher elevations to continue normal life.

Joel O’Bryan
March 8, 2021 9:21 am

The global Elite billionaires, for some odd reason, appear to be ignoring alarmist SLR claims again:

https://www.foxnews.com/travel/bahamas-biggest-private-island-auctioned-off

The Bahamian island is supposed to be underwater in 9 years, yet it is set to be sold for tens of millions. Maybe they know the SLR alarmists’ claims are a scam? Just a thought. We’ll have to wait and see if a bidding war breaks out for this private Bahamian island.

Jimb
March 8, 2021 10:11 am

Back in the ‘ops Jim Hansen warned of rising sea levels. The Major Deegan would be inundated by now. Uh… Nope.

Jimb
Reply to  Jimb
March 8, 2021 10:12 am

That should read “‘90s”.

lackawaxen123
March 8, 2021 10:11 am

I have worked on Wall Street for 35 years and have a rule I’ve used that may apply …

Just because you think you can measure something doesn’t mean you are able to find anything of value in those measurements …

  • be very sure of the accuracy of your measurements
  • be very clear about the costs and potential rewards of doing this measurement
  • always in context and always consider the scale of the problem …

many companies on Wall Street spend huge sums trying to measure best execution … after all, who doesn’t want to know if the million shares of stock were purchased at a good price …
Of course, the Portfolio Manager buying the million shares of stock doesn’t really care if it was purchased at $10.25 or $10.27 … he/she thinks the stock is going to $20 in a year … if he/she is right the difference in yield between 10.25 and 10.27 is .38 % … 95.12195% vs 94.74196% return …
(the costs are of course never taken from the Portfolio Managers bonus pool … if it was there would be a lot less spent on this …)

Duncan MacKenzie
March 8, 2021 10:35 am

Ideas are strung together into hypotheses and those hypotheses are tested first against existing knowledge through the application of logic and critical thinking. 

That’s a definition of religion, not science.

Science differs from religion in that it involves confirmation of ideas by measurement.

Gary Pearse
March 8, 2021 11:49 am

Kip
“…computers and their associated software, calculation and statistical analysis are far too easy and seem to have replaced both logic and critical thinking, even basic reasoning.”

This is a huge statement that speaks proverbial volumes. The larger implication is that making things easier has opened the door, or perhaps the floodgate, to people who are not particularly thinkers at all, into science, most notably politically agenda-ized science. The ‘science’ is used in the corrupted sociological-science way as a hammer to change direction and exert control on society.

Steve McIntyre’s famous assessment of the ‘mainstream’ climate scientists comes to mind (they’d be lucky in an earlier generation to become high school science teachers). Or Phil Jones’ admission in Climategate that he never learned to use Excel!

Tim Gorman
Reply to  Kip Hansen
March 8, 2021 1:41 pm

Kip,

He got at least one wrong. The GAST does *not* allow us to say the day side of the planet is warmer than the night side. If we had a Dayside Global Average Surface Temp *and* a Nightside Global Average Surface Temperature then we might possibly be able to say the day side is warmer than the night side. But that would *also* mean everyone could see that it is the night side temperatures (usually when Tmin occurs) that are going up and not the day side temperatures (usually when Tmax occurs). Of course this would shoot down their entire CAGW religion and dry up most of their research money. So you’ll never see them convert to doing this. You won’t see the models converted to showing daytime/nighttime average temps either.

And I also disagree that the GAST has a precise physical meaning. If you can’t go someplace and measure it then it doesn’t have a precise meaning. It is a calculated result with all kinds of questionable calculations being used – which means there is no precise physical meaning.

And of course you are correct. We don’t need the homogenized, interpolated, infilled, and altered databases to know any of this!