Climate Alarmists Respond to the Global Warming Pause

Essay by Eric Worrall

The oceans swallowed my global warming? Desperate butt covering from alarmists who are facing increasingly embarrassing questions about the failure of the world to end.

14 July 2022  16:41

Factcheck: No, global warming has not ‘paused’ over the past eight years

A decade ago, many in the climate community were fixated on an apparent “pause” in rising global surface temperatures. So many studies were published on the so-called “hiatus” that scientists joked that the journal Nature Climate Change should be renamed Nature Hiatus.

However, after a decade or so of slower-than-average warming, rapid temperature rise returned in 2015-16 and global temperatures have since remained quite warm. The last eight years are the warmest eight years since records began in the mid-1800s.

While the hiatus debate generated a lot of useful research on short-term temperature variability, it is clear now that it was a small variation on a relentlessly upward trend in temperatures.

But nearly a decade later, talk of a “pause” has re-emerged among climate sceptics, with columnist Melanie Phillips claiming in the Times this week that, “contrary to the dogma which holds that a rise in carbon dioxide inescapably heats up the atmosphere, global temperature has embarrassingly flatlined for more than seven years even as CO2 levels have risen”.

This falsehood appears to be sourced from a blog post by long-time climate sceptic Christopher Monckton, which claims to highlight the lack of a trend in global temperatures over the past eight years.

In a rebuttal letter to the Times, Prof Richard Betts – head of climate impacts research at the Met Office Hadley Centre and University of Exeter – points out that it is “fully expected that there will be peaks of particularly high temperatures followed by a few less hot years before the next new record year”.

In fact, the last eight years have been unusually warm – even warmer than expected given the long-term rate of temperature increases – with global temperatures exceeding 1.2C above pre-industrial levels. The temperature record is replete with short-term periods of slower or more rapid warming than average, driven by natural variability on top of the warming from human emissions of CO2 and other greenhouse gases. 

There is no evidence that the past eight years were in any way unusual and the hype around – and obvious end of – the prior “pause” should provide a cautionary tale about overinterpreting year-to-year variability today.

Human-emitted greenhouse gases trap extra heat in the atmosphere. While some of this heat warms the Earth’s surface, the vast majority – around 93% – goes into the oceans. Only 1% or so accumulates in the atmosphere and the remainder ends up warming the land and melting ice.

Most years set a new record for ocean heat content, reflecting the continued trapping of heat by greenhouse gases in the atmosphere. The figure below shows the annual OHC estimates between 1950 and present for both the upper 700m (light blue) and 700m-2000m (dark blue) depths of the ocean.

Read more: https://www.carbonbrief.org/factcheck-no-global-warming-has-not-paused-over-the-past-eight-years/

Lord Monckton apparently stirred the hive by publishing a few articles on the growing pause, like this article from three weeks ago.

His article on the last 6 years is entertaining because, well, where’s the warming? Wasn’t there supposed to be a hockey stick or something? Oh yeah, it disappeared into the ocean depths, allegedly.

Over the last 172 years, since 1850, temperatures have risen a little. Except for that period between the 1940s and 1970s, when the drop in global temperature prompted climate scientists like Stephen Schneider to suggest we should use nuclear reactors to melt the polar ice, to prevent an ice age. Schneider later claimed he’d made a mistake, and went on to become a global warming activist.

But the context doesn’t stop at 1850.

Looking before 1850, there were notable warm periods during the last few thousand years, like the medieval warm period, Roman Warm Period and Minoan Warm Period, which look suspiciously like our current modern warm period, except back then people didn’t drive automobiles.

Going back further, 9000-5000 years ago, during the Holocene Optimum, the sea level was around 2m higher than today, so it was probably pretty warm back then as well.

20,000 years ago, much of the world was covered by massive ice sheets.

Three million years ago, the world was so warm Antarctica was mostly ice free – until the onset of the Quaternary glaciation, which we are still enduring today. To put the Quaternary Glaciation into context, the Quaternary is one of only five comparable great cold periods which have been identified over the last two billion years.

55 million years ago was the Palaeocene–Eocene Thermal Maximum, an extremely warm period of such abundance that our primate ancestors spread throughout much of the world.

When you take a more complete look at the context, rather than the limited 172-year / 0.0000086% slice of climate history Carbon Brief seems to want you to focus on, there is nothing unusually warm about today’s global temperatures. Even if further global warming does occur, if those little primate ancestors with walnut-sized brains could manage to thrive in the Palaeocene–Eocene Thermal Maximum, I’m pretty sure we could figure out how to cope with a small fraction of the warming they enjoyed.


471 Comments
MGC
July 21, 2022 1:31 pm

Sorry to be so blunt, but what a shameful disgrace of an article.

The article opens with that tired old, long dismissed misrepresentation – the Easterbrook Greenland ice core graph – which is improperly labelled and disingenuously tries to make it look like it includes the contemporary instrumental record, when it really doesn’t.

Easterbrook’s wrong (again): http://hot-topic.co.nz/easterbrooks-wrong-again/

We then go on from there to be treated to a plethora of other typical “skeptical” echo chamber talking point misrepresentations and outright falsehoods. For example:

1- it [warming] disappeared into the ocean depths, allegedly.

There is nothing “allegedly” about ocean warming. Measurements from thousands of Argo ocean floats worldwide clearly confirm a continuing trend of significant heat accumulation in the oceans.

https://www.ncei.noaa.gov/access/global-ocean-heat-content/

It can only be a matter of time until the next major El Nino releases that heat to the atmosphere, at which time the latest surface warming “pause” will end, and the folly of the “skeptics” who want to try to pretend away the anthropogenic warming trend with yammering about “pauses” will (yet again) be exposed, just like last time.

2- “like the medieval warm period, Roman Warm Period and Minoan Warm Period, which look suspiciously like our current modern warm period”

Uh, no, those periods look nothing like our current warming. Research demonstrates that the current warming is far, far more global in scope than any of those periods:

Neukom, et al Nature 2019

“Here we use global palaeoclimate reconstructions for the past 2,000 years, and find no evidence for preindustrial globally coherent cold and warm epochs. In contrast, we find that the warmest period of the past two millennia occurred during the twentieth century for more than 98 per cent of the globe. This provides strong evidence that anthropogenic global warming is not only unparalleled in terms of absolute temperatures, but also unprecedented in spatial consistency within the context of the past 2,000 years.”

3- “I’m pretty sure we could figure out how to cope with the warming.”

Yes, we will ultimately figure out how to “cope”. The real question, though, is this: what will “coping” cost? Most economic analyses conclude that in the long run, taking little if any action to limit warming is likely the costliest and the riskiest alternative.

Reply to  MGC
July 21, 2022 2:27 pm

The Argo floats have a +/- 0.5C uncertainty. That’s wider than the difference trying to be identified. So how do we know the deep ocean is warming? In any case the Argo floats only go down (I think) to about 2000ft. The *deep* ocean is far deeper than that.

The paleoclimate reconstructions many times just totally ignore the social clues that are available. 2000 years ago the Romans were all over Europe. It wasn’t till later that the climate turned colder and they drew in. The Mongols didn’t expand throughout Asia during a *cold* period but during a warm period. Native Americans 2000 years ago were more populous than in later periods. Same with the Mayans. These peoples would not have developed and expanded during cold climates. I’m sure there are lots of other examples.

MGC
Reply to  Tim Gorman
July 21, 2022 5:01 pm

Here’s Gorman once again pretending that the Argo measurements are “too uncertain” when of course this is totally not the case. And the Argo floats go down to 2000 m, not 2000 ft.

re: “The paleoclimate reconstructions many times just totally ignore the social clues, blah blah blah … “

And how comically ironic is it to hear Gorman first whine about measurement uncertainty, but then mention “social clues” as if they could somehow provide “better” measurement uncertainty than actual temperature proxies. Such a notion is, of course, pure nonsense. Another laughably ridiculous Gormanian “skeptical” excuse.

Reply to  MGC
July 21, 2022 5:31 pm

“Here’s Gorman once again pretending that the Argo measurements are “too uncertain” when of course this is totally not the case.”

Sorry bud, it *IS* the case. The resolution of the sensor is something like +/- 0.001C.

But the sensor is not the float. The calibration of the sensor is dependent on the rate of water flow past the sensor, the pH of the water, and probably several other things I have forgotten. When the Argo floats were initially calibrated after being in the field the FLOATS were found to have an uncertainty of +/- 0.5C. Of course that is within the tolerance the Federal Meteorological Handbook No. 1 specifies for temperature measuring devices – +/- 0.6C.

Sensor resolution is *NOT* measurement device uncertainty. That is true for *any* measuring device, be it a digital voltmeter or a thermometer.

“then mention “social clues” as if they could somehow provide “better” measurement uncertainty”

And now you are doing what you usually wind up doing when you are shown to be wrong. You put words in people’s mouth to create a strawman and then argue against the strawman.

The use of social clues is only a true/false indicator concerning whether past history was warm or cold. You can use those to validate reconstructions. For example, just how warm did it need to be for the Vikings to grow crops and raise livestock on Greenland? If a reconstruction says Greenland has always been too cold for that then the reconstruction has a problem!

You are your own worst enemy on here. Are you surprised that no one believes anything you assert any more?

MGC
Reply to  Tim Gorman
July 22, 2022 5:21 am

Same tired old Gormanian nonsense, over and over and over again. The uncertainty of the mean value of 4000 measurements is far smaller than the uncertainty of any single measurement. Continuing to pretend otherwise remains nothing but the epitome of willful ignorance.
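For what it’s worth, the textbook relation being appealed to here – that the spread of the mean of n independent, identically distributed measurements shrinks roughly as 1/sqrt(n) – can be sketched with a toy simulation (an editor’s illustration with invented numbers, not Argo data; whether the independence assumption holds for field instruments is exactly what the two sides go on to dispute):

```python
import random
import statistics

random.seed(42)
TRUE_VALUE = 15.0   # hypothetical "true" temperature, invented for illustration
SIGMA = 0.5         # assumed standard deviation of a single measurement

def survey_mean(n):
    """Mean of n independent measurements, each with Gaussian noise."""
    return statistics.fmean(random.gauss(TRUE_VALUE, SIGMA) for _ in range(n))

# Observed spread of many repeated survey means, for different sample sizes n.
spread = {}
for n in (1, 100, 4000):
    means = [survey_mean(n) for _ in range(200)]
    spread[n] = statistics.stdev(means)
    # The observed spread tracks the iid prediction SIGMA / sqrt(n).
    print(n, round(spread[n], 4), round(SIGMA / n ** 0.5, 4))
```

None of this settles whether Argo errors are independent and random; it only illustrates the statistical law each side is invoking.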

re: “The use of social clues is only a true/false indicator”

Therefore the temperature proxies themselves are a far, far, far, far, FAR better indicator, which was my point. But because of ideological bias, you blindly ignore what those proxies tell us and want to trust the say so of an indicator that gives some totally vague hot or cold true/false signal instead.

Unbelievably ridiculous. But what else is new. Unbelievably ridiculous is business-as-usual for a Gorman.

Reply to  MGC
July 22, 2022 8:15 am

The crux is his obsessive use of “uncertainty” as a magical, unknowable parameter. He describes its distribution when he amazingly claims that it is “off” by a certain amount, and that there’s no possibility of intermediate points where it could land. But he balks at using even that distribution quantification when pressed. Finally, both J and T retreat into misty tales about their “real world experiences” when pressed with the facts.

For me, he channels my retired union electrician BIL. He drank on the job for his 30+ years of union protected employment at our local utility. Now, after a few, his fav stories are about how he saved the asses of “Those college boys” at work. Fueled no doubt by the fact that his 3 boys wanted to be engineers but were, you know, “High Hopes, Low SAT’s”.

rah
Reply to  bigoilbob
July 22, 2022 11:10 am

LOL! So you’re saying that all the adjustments to weather station records by NOAA are bull shit. Thank you. But you’re really late to the party. We already knew that.

Reply to  rah
July 22, 2022 11:28 am

Wut? WTF did THAT come from?

Reply to  bigoilbob
July 22, 2022 4:20 pm

“The crux is his obsessive use of “uncertainty” as a magical, unknowable parameter.”

You just keep showing how little you understand of this. The uncertainty interval is not a magical, unknowable parameter. The TRUE VALUE is what is unknowable. It exists somewhere in the interval but it is UNKNOWN where! The whole purpose of the uncertainty interval is to show that you simply can’t ever have a perfect measurement!

“He describes its distribution when he amazingly claims that it is “off” by a certain amount, and that there’s no possibility of intermediate points where it could land.”

Again, more BS from an idiot. The true value has a probability of 1 of being the true value. All the other values in the interval have a 0 probability of being the true value. There can be only ONE true value. You just don’t know what it is!

“Finally, both J and T retreat into misty tales about their “real world experiences” when pressed with the facts.”

Not a single real-world example, based on experience, was ever refuted by you. And now you have just retreated to using the argumentative fallacy of Argument by Dismissal as a refutation.

And then you have to resort to ad hominems because you have nothing to actually offer in refutation of anything. Typical.

Carlo, Monte
Reply to  Tim Gorman
July 23, 2022 6:48 am

blob is another who is confuzzled about how uncertainty is not error.

MarkW
Reply to  bigoilbob
July 22, 2022 8:47 pm

Uncertainty is unknowable when you can’t define what all the possible errors are.
Your belief that you can decide what the answer should be, then manipulate the data until it gives you that answer, is unscientific, but par for the course in climate alarmism.

Carlo, Monte
Reply to  bigoilbob
July 23, 2022 6:42 am

“The crux is his obsessive use of “uncertainty” as a magical, unknowable parameter.”

blob weighs in and shows how clueless his heat-addled brain is.

Reply to  MGC
July 22, 2022 10:52 am

Tell me something. Have you ever held a job where the measurements you take and their uncertainty (think tolerances) made the difference in whether you kept your job or not? Think machinist or tool and die maker.

Have you ever had a job where your measurements had to meet legal requirements for accuracy, precision, and uncertainty? Have you performed engineering tasks that required a Professional Engineer to sign off on? Have you ever worked in a Certified Laboratory?

What was the job(s)?

If you have never had to meet forecasts, payroll, and other regulated requirements based upon your measurements then you simply have no room to criticize anyone. You won’t even use your name to identify yourself. That’s a good indication of just how much your criticism is worth.

Reply to  Jim Gorman
July 22, 2022 11:34 am

We’ve been thru this particular “real world experience” BS whine over and over, ad nauseam. There’s NOTHING about NCM operation that disagrees with centuries old statistical laws.

BTW, I AM a professional engineer. Precision, accuracy, uncertainty are our stock in trade. And NONE of these parameters is inconsonant with the Engineering Statistics 101 rules of the road that you seem to be hysterically blind to.

Reply to  bigoilbob
July 22, 2022 4:38 pm

*MY* engineering training, BSEE, power and nuclear, emphasized the need to consider uncertainty in *everything*!

If you were truly a professional engineer then you would be far more experienced with uncertainty. When building infrastructure you *have* to consider uncertainties, be they in ordering fish plates to join struts, ordering conduit between service panels, or even in designing rise/tread in stairwells.

My guess is that you really don’t even know what molding in a house is for!

Reply to  bigoilbob
July 23, 2022 5:46 am

If you are a professional engineer, what is your degree and from what university?

What courses have you had in metrology that dealt with measurement uncertainty? What textbook did you use?

Here is a photo of my degree. Let’s see yours!

Please note, I do not hide behind an anonymous nomenclature. I am proud of my accomplishments and have no problem with folks knowing about me.

my degree.jpg
Reply to  Jim Gorman
July 23, 2022 7:30 am

“If you are a professional engineer, what is your degree and from what university?”

BS, Petroleum Engineering, Missouri School of Mines, 1981. (Admittedly, only slightly higher rated than yours.)
MS, Petroleum Engineering, University of Southern California (Drilling Emphasis), 1995.

But a degree doesn’t make you a “Professional Engineer.” Search for Oklahoma Professional Engineer #14428. 1985, by both references and examination (unlike Texas). But this surrenders my full name, so please honor my nom de WUWT here. BTW, where’s yours?

Reply to  bigoilbob
July 24, 2022 5:43 am

No, but being a Professional Engineer means you are responsible for the final approval of items that DO REQUIRE proper uncertainty calculations for life and safety purposes.

You should have intimate knowledge of measurement uncertainty, yet you have indicated little ability to understand how it applies to measurements of different things with different devices. I can only assume you have had little to no training in metrology at all.

I was lucky to have worked in the old Bell Telephone system and received much training developed by Deming and Shewhart at Bell Telephone Laboratories in Statistical Process Control (SPC). Part of this training was learning about uncertainty in measurement and how it can affect whether end products meet nominal specifications. They would be appalled at the lack of disciplined statistical analysis in the treatment of measurement data in climate science.

MGC
Reply to  Jim Gorman
July 23, 2022 7:46 am

Well, that explains a lot. The University of Kansas is not even in the top 100 engineering schools in the country, LOL.

The academic training of all those scientists and engineers that Gorman ridiculously pretends are “wrong” is orders of magnitude better.

Reply to  MGC
July 24, 2022 5:58 am

Just FYI, we had professors who wrote textbooks used in a number of different universities and they consulted in industry. Two of my professors worked at Bell Labs in the development of tunnel diodes. How about you? Have your professors had this kind of experience?

Funny how you can’t or won’t show your education since it seems so important to you. BTW, KU had one of the few nuclear reactors run by a university when I went to school. Don’t denigrate what you know nothing about!

And, somehow, the mathematics required haven’t changed a whole lot over the years. Maxwell’s EM equations, Planck’s heat radiation, and thermodynamics all still rely on the same tried and true EXPERIMENTALLY derived mathematics. Can you say the same about General Circulation Models?

Reply to  MGC
July 24, 2022 9:10 am

“Well, that explains a lot. The University of Kansas is not even in the top 100 engineering schools in the country, LOL.”

This is the argumentative fallacy known as Poisoning the Well. Why am I not surprised to see you using it.

KU was a leading research location for Satellite Remote Sensing in the 70’s and 80’s. They even had their own nuclear research building. Many engineering schools were not this advanced.

Carlo, Monte
Reply to  bigoilbob
July 23, 2022 6:49 am

bluff and bluster keeps blob inflated.

Reply to  MGC
July 22, 2022 4:08 pm

“Same tired old Gormanian nonsense, over and over and over again. The uncertainty of the mean value of 4000 measurements is far smaller than the uncertainty of any single measurement. Continuing to pretend otherwise remains nothing but the epitome of willful ignorance.”

Nope. The uncertainty of that mean is *NOT* an uncertainty. It is actually the standard deviation of the sample means. If those sample means are inaccurate then so is the mean calculated from them. The standard deviation of the sample means is meaningless when it comes to accuracy.

Uncertainty does *NOT* cancel unless you have a Gaussian distribution, and with multiple measurements of different things using different devices that simply can’t be assumed.

If you have a distribution of temperatures like (20F +/- 1F, 21F +/- .5F, 19F +/- 2F, 18F +/- 1F, 70F +/- .6F, 71F +/- .8F, 80F +/- 1.1F, 77F +/- 1.2F, 16F +/- .9F) and you pull samples from that population what happens to the uncertainty of the mean you calculate from those stated values?

You can assert that all those uncertainties of the individual temps will cancel but they won’t. The mean you calculate won’t even exist in reality. It will be a meaningless number.

Like all statisticians you have been trained to only look at stated values of a distribution, which implies that all the stated values are 100% certain. In other words no uncertainty at all in the stated values.

It just doesn’t work that way in the real world!
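As an editor’s illustration (not code from either commenter), the two competing treatments of a mixed set like the one above can be put side by side in a few lines of Python: the spread of the stated values themselves, versus root-sum-square propagation of the stated +/- values onto the mean:

```python
import math
import statistics

# Gorman's example: stated temperature values (F) with their +/- uncertainties.
temps = [(20, 1.0), (21, 0.5), (19, 2.0), (18, 1.0), (70, 0.6),
         (71, 0.8), (80, 1.1), (77, 1.2), (16, 0.9)]

values = [v for v, _ in temps]
n = len(values)

mean = statistics.fmean(values)        # mean of a bimodal winter/summer mix
spread = statistics.pstdev(values)     # spread of the stated values themselves
# Root-sum-square propagation of the +/- values onto the mean,
# valid only IF the uncertainties are independent and random:
u_mean = math.sqrt(sum(u * u for _, u in temps)) / n

print(f"mean {mean:.1f}, spread {spread:.1f}, propagated u {u_mean:.2f}")
```

The crux of the thread is visible in the numbers: the mean of such a mix describes no real temperature, and whether the +/- values may be RSS-combined (so they largely divide away) or must be carried whole is precisely what the two commenters never agree on.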

Carlo, Monte
Reply to  Tim Gorman
July 23, 2022 6:51 am

It is really unfortunate someone in the misty past came up with the term “standard error” and it has stuck.

Reply to  Carlo, Monte
July 23, 2022 2:03 pm

Would you prefer “probable error”?

Reply to  Carlo, Monte
July 24, 2022 8:34 am

I’m sure you just hate those inconvenient, well defined, standard statistical terms.

https://www.investopedia.com/ask/answers/042415/what-difference-between-standard-error-means-and-standard-deviation.asp

Reply to  bigoilbob
July 24, 2022 10:33 am

“I’m sure you just hate those inconvenient, well defined, standard statistical terms.”

From your link:

“Standard error of the mean (SEM) measures how far the sample mean (average) of the data is likely to be from the true population mean.”

If your data elements have random, independent uncertainty intervals then how do you know the “true population mean”? The uncertainties of the individual data elements propagate onto the mean – so your mean should be quoted as “stated value +/- total uncertainty”.

SEM is only accurate if there is no uncertainty in the population mean. And this is what statisticians like you and the rest of the climate “clique” do – IGNORE THE UNCERTAINTY OF THE POPULATION MEAN. Just erase it. Pretend like there is no uncertainty in the measurements!

Reply to  Tim Gorman
July 24, 2022 10:53 am

“If your data elements have random, independent uncertainty intervals then how do you know the “true population mean”?”

OMFG. What a load of self serving, undocumented BS. To say that a data set, even with both distributed x and y data, has no “true population mean”, or no true trend for that matter, is total RME material. Yes, they will have increased, easily calculated, standard deviations/standard errors, but their expected values are indeed “true”.

“SEM is only accurate if there is no uncertainty in the population mean. And this is what statisticians like you and the rest of the climate “clique” do – IGNORE THE UNCERTAINTY OF THE POPULATION MEAN.”

Er, no. Might want to look back at yesterday’s comment by me to Mr. Carlo. I did just that with available data that I linked to. As climate scientists do every day…..

Reply to  bigoilbob
July 24, 2022 3:34 pm

“OMFG. What a load of self serving, undocumented BS. To say that a data set, even with both distributed x and y data, has no “true population mean”, or no true trend for that matter, is total RME material. Yes, they will have increased, easily calculated, standard deviations/standard errors, but their expected values are indeed “true”.”

In other words you don’t understand anything about non-Gaussian distributions at all, let alone distributions whose data elements have uncertainty.

If you do not have at least an identically distributed distribution then the mean and standard deviation are *NOT* the proper statistical description of the distribution.

Why is this so hard to understand? In this case you *have* to use the 5-number description: minimum, first quartile, median, third quartile, and maximum. You might even have to use a different method of statistically describing the distribution.

Suppose you have a distribution of (0,0,1,2,63,61,27,13).

Is this a Gaussian distribution? You would say that the mean is 21, and the standard deviation is 25. Exactly what do you think that tells you about the distribution? Does the mean even exist? Can you use it for anything?

Now, give each an uncertainty: 0 +/- 0.5, 0 +/- 0.5, 1 +/- 0.25, 2 +/- 0.75, 13 +/- 0.5, 27 +/- 1, 61 +/- 0.25, and 63 +/- 0.25.

What is the uncertainty of the mean value = 21? Is it zero? No uncertainty at all? Can you just ignore the uncertainties like you always want to do?

“Er, no. Might want to look back at yesterday’s comment by me to Mr. Carlo. I did just that with available data that I linked to. As climate scientists do every day…..”

All you are doing is demonstrating you have a Statistics 101 understanding of metrology, nothing more.
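The five-number description mentioned above is easy to compute. A minimal editor’s sketch for this example set, using the simple median-split quartile convention (other quartile conventions shift Q1 and Q3 slightly):

```python
import statistics

# Gorman's example distribution
data = sorted([0, 0, 1, 2, 63, 61, 27, 13])
n = len(data)

# Median-split convention: quartiles are the medians of each half.
lower, upper = data[:n // 2], data[(n + 1) // 2:]
five = {
    "min": data[0],
    "Q1": statistics.median(lower),
    "median": statistics.median(data),
    "Q3": statistics.median(upper),
    "max": data[-1],
}
print(five)

# For contrast, the mean/SD pair that is argued to be misleading here:
print(round(statistics.fmean(data), 2), round(statistics.pstdev(data), 2))
```

The population standard deviation of this set comes out to about 25.27, which appears to be where the “25.27” figure quoted later in the exchange comes from.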

Reply to  Tim Gorman
July 24, 2022 7:02 pm

“You would say that the mean is 21, and the standard deviation is 25. Exactly what do you think that tells you about the distribution?”

It tells me that the parameters are distributed. For example, it could be the initial production rates of a step out drilling program. And since future drilling might be expected to have the same distribution of outcomes, the IP rates would indeed tend towards Gaussian. Hint: non-Gaussian data is still valuable.

“Does the mean even exist?”

Yes,

“Can you use it for anything?”

If you have reason to believe that the data is representative, or can be projected in any way, you can use it for future decision making.

“Now, give each an uncertainty: 0 +/- 0.5, 0 +/- 0.5, 1 +/- 0.25, 2 +/- 0.75, 13 +/- 0.5, 27 +/- 1, 61 +/- 0.25, and 63 +/- 0.25.
What is the uncertainty of the mean value = 21?”

To 4 significant figures I am getting an increase from the “certain” data that was 25.27 to 25.28. I.e., none.

“Is it zero?”

No.

“No uncertainty at all?”

See above.

“Can you just ignore the uncertainties like you always want to do?”

What mean, “you”, Kemosabie? I’m the one considering them, quantitatively….

Reply to  bigoilbob
July 25, 2022 7:47 am

“It tells me that the parameters are distributed.”

Oh, brother! In other words it tells you nothing!

“For example, it could be the initial production rates of a step out drilling program. And since future drilling might be expected to have the same distribution of outcomes, the IP rates would indeed tend towards gaussian. Hint: No gaussian data is still valuable.”

It could also be southern hemisphere data mixed with northern hemisphere data! So tell me what the standard deviation of the mean implies! Not some hand-waving about “well, it could happen again”.

“Yes,”

If those are temperatures then where does the mean exist in reality? You can calculate a mean value but what does it tell you about reality?

“If you have reason to believe that the data is representative, or can be projected in any way, you can use it for future decision making.”

If those are temperatures then what kind of decision making can be done from them?

“To 4 significant figures I am getting an increase from the “certain” data that was 25.27 to 25.28. I.e., none.”

Meaning you have absolutely *NO* idea of how to handle uncertainty! The mean value is 21. What does 25.27 and 25.28 have to do with anything? I assume you meant 21.27 and 21.28. I am also assuming you think the uncertainty is 0.01. In addition, the significant figure size here is two, not four.

If q_avg = x(total)/n then you get 21 for the average using significant figures.

The uncertainty of the average is:

u(q)/q = u(x1)/x1 + u(x2)/x2 + … + u(x8)/x8 + u(n)/n

I picked bad stated values when I chose zero since you can’t calculate the relative uncertainty with a value of zero, so we’ll just shift everything by adding a 1 to the stated values.

1 ± 0.5, 1 ± 0.5, 2 ± 0.25, 3 ± 0.75, 14 ± 0.5, 28 ± 1, 62 ± 0.25, 64 ± 0.25

So:

u(q_avg)/q_avg = 0.5/1 + 0.5/1 + 0.25/2 + 0.75/3 + 0.5/14 + 1/28 + .25/62 + .25/64 = 1.5

u(q_avg)/q_avg = 1.5

I’ll leave it to you to do the root-sum-square method (hint: should be about 0.8). u(q_avg) = 21 * 0.8 = 17

So your average (mean) will be 21 +/- 17. In other words your average value is worthless. It would range from 4 to 38!

AVERAGE UNCERTAINTY IS *NOT* UNCERTAINTY OF THE AVERAGE.

Uncertainty of the average here is +/- 17. It is *NOT* +/- 0.19.
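All of the competing figures in this exchange can be reproduced from the shifted data. Here is an editor’s sketch of the arithmetic; it reproduces both parties’ numbers but does not settle which propagation rule is appropriate, which is the actual point in dispute:

```python
import math

# Shifted example from the thread: (stated value, +/- uncertainty) pairs
data = [(1, 0.5), (1, 0.5), (2, 0.25), (3, 0.75),
        (14, 0.5), (28, 1.0), (62, 0.25), (64, 0.25)]

n = len(data)
mean = sum(v for v, _ in data) / n

# Gorman's rule: add the relative uncertainties linearly.
rel_linear = sum(u / v for v, u in data)
# His root-sum-square variant of the same relative terms.
rel_rss = math.sqrt(sum((u / v) ** 2 for v, u in data))
# The conventional rule for the mean of independent measurements:
# u(mean) = sqrt(sum of u_i^2) / n
u_conventional = math.sqrt(sum(u * u for _, u in data)) / n

print(round(rel_linear, 2))    # ~1.45, the "1.5" in the thread
print(round(rel_rss, 2))       # ~0.76, the "0.8"
print(round(mean, 1), round(rel_rss * mean, 1))   # the "21 +/- 17" claim
print(round(u_conventional, 2))                   # ~0.2, near the "+/- 0.19"
```

The “+/- 17” comes from multiplying the RSS of the relative uncertainties by the mean; the “+/- 0.19” figure appears to correspond to u_conventional, where independent uncertainties divide down by n. The disagreement is over which of these treatments the stated instrument uncertainties deserve.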

If this distribution was a sample of a larger population it wouldn’t matter. If you have five samples and their means are:

20 +/- 1
19 +/- 2
18 +/- 0.5
17 +/- 0.75
19 +/- 0.4

The standard deviation of the sample means is 1 and the mean is 18.6. But that is not the uncertainty of that 18.6 mean. The uncertainty of the mean (using root-sum-square) is +/- 0.13.

So the mean of the sample means should be given as 18.6 +/- 0.1.

In this example the uncertainty worked out to be less than the standard deviation of the sample means. Many times, however, this is *not* the case. The uncertainty of the average of the sample means will remain the propagated uncertainty from the sample means, not the standard deviation of the sample means.

“What mean, “you”, Kemosabie? I’m the one considering them, quantitatively….”

Nope. You don’t even know how to propagate uncertainty! So how can you be considering them?

Reply to  Tim Gorman
July 25, 2022 8:36 am

“So your average (mean) will be 21 +/- 17. In other words your average value is worthless. It would range from 4 to 38!”

Your arithmetic is from Bizarro world. NO idea where these values came from. But stepping back, while my (correct) standard deviation of the average is even higher than yours, what makes it “worthless”? You seem hysterically blinded to the fact that highly distributed data can still be valuable. We oilfield trash would be kicking rocks otherwise.

“What does 25.27 and 25.28 have to do with anything?”

It represents the change in the standard deviation of the average, by including the uncertainties in the individual data points. None, in practice.

“I am also assuming you think the uncertainty is 0.01. In addition, the significant figure size here is two, not four.”

No, it is the increase in it, from including the uncertainty of the data. I used 4 sig figs as a bone throw to you. I wanted to show you that the standard deviation of the average did change. Just by very, very, very little.

“The standard deviation of the sample means is 1 and the mean is 18.6. But that is not the uncertainty of that 18.6 mean. The uncertainty of the mean (using root-sum-square) is +/- 0.13.
So the mean of the sample means should be given as 18.6 +/- 0.1.”

Lost In Space as usual. The “mean of the sample”, including evaluation of the individual data uncertainties is 18.6 +/- 1.4. Yes, up some from 18.6 +/- 1.0. This is because the uncertainties in the individual data points are relatively larger, compared to the expected value sum of variance, for this set of values, compared to your first.

Again get thee to a community college. Audit Engineering Statistics 101, transferable to an actual engineering school later. All it would cost is a few audit fees, gas, time (it’s obvious that you’re not busy), and a used book. The scales would then fall from your eyes, and like Rush, if he had been able to complete his 4th try at drug rehab, you would have your epiphany…

Reply to  bigoilbob
July 25, 2022 9:56 am

“Your arithmetic is from Bizarro world NO idea where these values came from.”

THEY ARE RIGHT THERE IN THE POST!

The relative uncertainty is 0.8. 0.8 x 21 = 17!

You are much like bellman. Neither of you can do basic algebra! And you are a professional engineer?

“It represents the change in the standard deviation of the average, by including the uncertainties in the individual data points. None, in practice.”

The change in standard deviation is *NOT* the uncertainty! Just like bellman all you know how to do is ignore actual measurement uncertainty!

“No, it is the increase in it, from including the uncertainty of the data”

Standard deviation is *NOT* measurement uncertainty.
The change in standard deviation is not measurement uncertainty. Where in Pete’s name did you learn this concept?

tg: “The standard deviation of the sample means is 1 and the mean is 18.6. But that is not the uncertainty of that 18.6 mean. The uncertainty of the mean (using root-sum-square) is +/- 0.13.

So the mean of the sample means should be given as 18.6 +/- 0.1.”

“Lost In Space as usual. The “mean of the sample”, including evaluation of the individual data uncertainties is 18.6 +/- 1.4. Yes, up some from 18.6 +/- 1.0.”

The mean of what sample? I gave you the means of FIVE samples and calculated the standard deviation of the stated values of those sample means. I then calculated the root-sum-square value of the uncertainties. I have no idea where you came up with +/- 1.4!
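For what it’s worth, the 18.6 +/- 0.13 figure above is reproduced by one plausible reading: five sample means, each assumed (hypothetically; the individual uncertainties are not restated in this comment) to carry about +/- 0.3, combined by root-sum-square and divided by the number of means.

```python
import math

n = 5          # number of sample means
u_each = 0.3   # assumed +/- on each sample mean (hypothetical)

# Root-sum-square of the five uncertainties, divided by n:
u_mean = math.sqrt(n * u_each ** 2) / n
print(round(u_mean, 2))
```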

“Again get thee to a community college. Audit Engineering Statistics 101, transferable to an actual engineering school later. All it would cost is a few audit fees, gas, time (it’s obvious that you’re not busy), and a used book. The scales would then fall from your eyes, and like Rush, if he had been able to complete his 4th try at drug rehab, you would have your epiphany…”

The only one here that needs training seems to be you. You should have had to study under the grad student who taught my EE 101 lab, the lab where eight of us students each built separate amplifiers, took one measurement of each, and averaged the eight results together to get our final answer. WE ALL FAILED THAT EXERCISE!

And I’m sure you have not got even the faintest of ideas as to why!

MGC
Reply to  Tim Gorman
July 23, 2022 7:39 am

As already stated more than once now, if Gorman’s “too much uncertainty” claims were really true, then the published measurement values would be varying all over the map from month to month. The simple fact that they don’t refutes entirely Gorman’s handwaving nonsense.

Gorman’s “too much uncertainty” claims are every bit as comically ridiculous as claiming that a gun doesn’t have the accuracy to hit a target, even after a shooter just used it to hit the bull’s eye five times in a row.

What a joke. And these so-called “skeptics” like Gorman still wonder why they are not taken seriously by the scientific community. SMH in disbelief.

Reply to  MGC
July 24, 2022 3:38 pm

“Gorman’s “too much uncertainty” claims are every bit as comically ridiculous as claiming that a gun doesn’t have the accuracy to hit a target, even after a shooter just used it to hit the bull’s eye five times in a row.”

You just keep on demonstrating your ignorance. A bullseye five times in a row demonstrates accuracy AND precision. That is small uncertainty. But unless those five are a one-hole result there is *still* some uncertainty in where the next shot will hit. You cannot estimate where that will be by ignoring the uncertainty. My guess is that you can’t even enumerate all of the uncertainty factors in such an attempt!

MGC
Reply to  Tim Gorman
July 25, 2022 10:20 am

This Gormanian nonsense just grows ever more and more ridiculous and ever more tiresome.

Of course there is still some uncertainty in where the next shot will hit. Duh. But, as demonstrated by the data from the previous shots, all hitting the bull’s eye, that uncertainty is quite small. Nor is that small uncertainty “ignored” as Gorman wants to try to pretend.

The uncertainty of the mean ocean heat content data is similar. The values do not vary all over the place from month to month as Gorman’s “too much uncertainty” claims would require in order to be “correct”.

Gorman can handwave, stomp his feet, and pretend away as much as he wants, but the actual month by month data itself nevertheless totally refutes his utterly ridiculous “too much uncertainty” claims.

Reply to  MGC
July 25, 2022 12:22 pm

“Of course there is still some uncertainty in where the next shot will hit.”

How do you calculate it for the future when you ignore it in the data you have?

“But, as demonstrated by the data from the previous shots, all hitting the bull’s eye, that uncertainty is quite small.”

Being small doesn’t mean you can ignore it – unless it is you, bellman, or a climate scientist!

“The uncertainty of the mean ocean heat content data is similar. The values do not vary all over the place from month to month as Gorman’s “too much uncertainty” claims would require in order to be “correct”.” (bolding mine, tg)

Ahhh…. And now we get into the argumentative fallacy of Equivocation – change the definition of what we are discussing. When did the subject become JUST THE OCEAN TEMPERATURE?

The ocean temp can vary from something like -2C to 30C. And they *do* vary from month to month although the variation is not uncertainty. The uncertainty is in the measurement, not the stated value. Remember, measurements should be given as “stated value +/- uncertainty”. We are speaking of the uncertainty part of the measurement, not the time-related variation in the stated value!

“but the actual month by month data itself nevertheless totally refutes his utterly ridiculous “too much uncertainty” claims.”

No, it doesn’t. You STILL confuse variation in the stated value with the uncertainty associated with the measurement of the stated value. You can whine and cry all you want about me but you *do* deny measurement uncertainty exists, just like bellman and the climate scientists!

MGC
Reply to  Tim Gorman
July 25, 2022 3:52 pm

Gorman’s comments have once again become way beyond ridiculous. But what else is new.

No one is “ignoring” uncertainty in the shooting example. Gorman imagining that this is the case is an utter absurdity. But what else is new.

re: “You confuse variation in the stated value with the uncertainty associated with the measurement of the stated value.”

False. But what else is new. The changes of the mean ocean heat content from month to month consist of some variation in the stated value and some uncertainty with the measurement of the stated value.

One can always put a worst case upper bound on the uncertainty of the measurements by assuming that all of the month to month changes are due to measurement variation.

The fact that these variations are tightly distributed relative to the magnitude of the decades long trend, leading to a highly statistically significant p-value < 0.0001 for the increasing ocean heat content trend, totally refutes Gorman’s utterly laughable “too much uncertainty” grasping at straws falsehoods.
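The kind of significance test being invoked here can be sketched on synthetic numbers (invented; not the actual ocean heat content series): an ordinary least-squares slope, its standard error, and the resulting t-statistic.

```python
import math

x = list(range(24))  # 24 "months"
# trend of 0.05 per month plus alternating +/- 0.2 "noise"
y = [0.05 * xi + ((-1) ** xi) * 0.2 for xi in x]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
intercept = ybar - slope * xbar

# Residual variance and the slope's standard error:
resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
s2 = sum(r ** 2 for r in resid) / (n - 2)
se_slope = math.sqrt(s2 / sxx)
t_stat = slope / se_slope
print(round(slope, 4), round(t_stat, 1))
```

A t-statistic this large yields a very small p-value against the no-trend hypothesis, which is the shape of the argument above; whether the stated measurement uncertainties belong inside that calculation is exactly what the two sides are disputing.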

MGC
Reply to  Tim Gorman
July 25, 2022 4:18 pm

An even more comical exposition of Gorman’s utterly ludicrous nonsense is this:

Gorman makes believe that researchers are not following the correct statistical methods as laid out in his sacred Taylor textbook; however, if one bothers to read the publications by the researchers who have actually done the ocean heat content analysis, one finds that the methods they have used are in fact taken right out of the Taylor textbook and are referenced as such.

One can surmise that Gorman has, of course, never done any such reading of the actual research, preferring to remain instead in his shameful cesspool of willful ignorance.

“Hoist by your own petard” is once again the phrase that comes to mind here.

Reply to  Tim Gorman
July 24, 2022 8:23 am

“Uncertainty does *NOT* cancel unless you have a Gaussian distribution and multiple measurements of different things using different devices simply can’t be assumed in such a situation.”

Documentation, please. But thanks for accidentally admitting that uncertainties are distributed.

W.r.t. SPC. It is an application of the same statistical laws that have been evolved for centuries. There is no daylight between it and the statistical evaluations you decry, fact free, in climate science.

Reply to  bigoilbob
July 24, 2022 10:27 am

Documentation? You mean you can’t look at the temperature record and see that it is not a Gaussian distribution?

Start with Pielke, 2007, “Documentation of Uncertainties ….”

See also Hubbard, Lin, 2007, “On the USCRN Temperature System”

Hubbard and Lin have several studies on temperature measurement uncertainty. One notes that even the micro-climate below the measurement station affects its readings, e.g. fescue grass vs Kentucky bluegrass, sand vs bare earth, etc. It concludes that any adjustment factor for a station must be done on an individual station basis, not on a grid basis.

“W.r.t. SPC. It is an application of the same statistical laws that have been evolved for centuries.”

Derived by statisticians who never once use a data set where each element is shown as a “stated value +/- uncertainty interval”. Only as a stated value, i.e. the uncertainty interval is assumed to be zero or to always cancel.

Here is the common definition for uncertainty in SPC:

“Measurement uncertainty: in simplistic terms in dimensional metrology, it can be said to be “A non-negative parameter characterising the dispersion of the values attributed to a measured quantity”. This potential uncertainty has a probabilistic basis that reflects incomplete knowledge of the measured quantity. Here, all measurements are subject to degree of uncertainty, and a measured value is only complete if it is accompanied by a “Statement of the associated uncertainty”. Relative uncertainty is the term obtained from the actual measurement uncertainty divided by the measured value.”

Note carefully that it speaks to a “measured quantity”, i.e. a single object being measured. NOT multiple measured quantities of different objects.

When you measure maximum and minimum temperature you have measured TWO DIFFERENT THINGS, one time each. Each measurement will have an uncertainty interval that is independent of the other. One measurement cannot give you a normal distribution or *any* kind of distribution that can be used to cancel uncertainty for either measurement. And the independent uncertainty of one measurement cannot cancel the independent uncertainty of the other measurement. You might get *partial* cancellation but you need to be able to show that this is the case rather than just assuming it.

Perhaps an example will help explain this.

You pull a sample plate with drilled holes from process1 at time1 and measure the diameter of the holes. You do the same thing for process2 at time1.

Now you come back 24 hours later and do the same thing at time2.

The diameter of the holes from process1 will have changed (can you guess why and what the effect is?).

The diameter of the holes in process2 will have changed as well.

Will the change in process1 tell you what the change in process2 is?

Ans: NO. They are independent objects being acted upon by independent processes. The uncertainty of one is independent of the other. You can’t assume that you can adjust the equipment in process2 by the same amount as for process1. NO CANCELLATION.

It’s the same for min/max temps. You can’t assume that the uncertainties in each form a random distribution that cancels.

I simply cannot understand how any engineer being held responsible for results in the field can’t understand this.
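The min/max point can be put in numbers. For a single Tmax and a single Tmin reading, each with an assumed (purely illustrative) +/- 0.5 C, standard propagation for independent errors gives the uncertainty of their average; it shrinks somewhat, but it does not cancel.

```python
import math

u_max = 0.5  # assumed +/- on the single Tmax reading (illustrative)
u_min = 0.5  # assumed +/- on the single Tmin reading (illustrative)

# Propagated uncertainty of (Tmax + Tmin) / 2, treating the two
# single-shot errors as independent:
u_avg = math.sqrt(u_max ** 2 + u_min ** 2) / 2
print(round(u_avg, 2))
```

Even under the most favorable assumption of full independence, +/- 0.5 only drops to about +/- 0.35; any correlated (systematic) component would not reduce at all.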

Reply to  Tim Gorman
July 24, 2022 12:56 pm

Once again, you’re providing a fairy tale instead of actual documentation. But to answer your irrelevant question, drill bits wear.
 
And also once again, all of the processes used in SPC are drawn from classical statistical theory.
 
“It’s the same for min/max temps. You can’t assume that the uncertainties in each form a random distribution that cancels.”
 
Which is why we don’t. With increasing data, the uncertainties of the averages and trends are minimized. Depending on the mix of distributions for the distributed input data, and the standard errors of their expected values, this may or may not follow exactly from the rule that the sum of the variance equals the variance of the sum. But the uncertainties are always minimized.
 
AGAIN, since it is your claim that the statistical laws do not apply, it is your responsibility to tell us when they don’t.

Reply to  bigoilbob
July 24, 2022 3:44 pm

“Once again, you’re providing a fairy tale instead of actual documentation. But to answer your irrelevant question, drill bits wear.”

In other words you are too illiterate to even find the documentation I gave you on the internet. Why am I not surprised?

“And also once again, all of the processes used in SPC are drawn from classical statistical theory.”

And SPC assumes you are measuring the same thing multiple times. You can’t even address the situation where you are measuring multiple things one time – WHICH IS WHAT TEMP MEASUREMENTS ARE!

Why can’t you address how to handle temperature measurements that have uncertainty?

“With increasing data, the uncertainties of the averages and trends are minimized.”

Only if they are multiple measurements of the same thing! Why can’t you get outside of your narrow box where you assume that all uncertainty cancels?

All you have to do is take minimum and maximum temperatures, measurements of different things, and show how the uncertainties of each cancel when you calculate their average. Show us explicitly how the uncertainties cancel. Address both random effects and systematic effects.

Put up or shut up!

Carlo, Monte
Reply to  MGC
July 23, 2022 6:41 am

The uncertainty of the mean value of 4000 measurements is far smaller than the uncertainty of any single measurement.

Another climate astrologer talking through his hat and showing his lack of clothing.

Reply to  Carlo, Monte
July 24, 2022 3:46 pm

He can’t even show explicitly how the uncertainties of a daily minimum and maximum temperature cancel. And yet he expects that to happen with any number of temperature measurements.

It’s an article of faith he learned in Statistics 101 at college, perhaps even Statistics 101 for Business majors!

MarkW
Reply to  Tim Gorman
July 21, 2022 5:07 pm

To the accuracy limitations of the probes themselves, you have to add the accuracy limitations caused by a grossly inadequate number of probes.

Reply to  MarkW
July 22, 2022 11:38 am

You don’t “add” them, but you do consider them. They add to the sum of variance in averaging and trending evaluations. Thankfully, even the most overinflated estimates of old timey measurement error, when included in regional or global temperature or sea level trends, over physically/statistically significant time periods, make very little difference to the standard errors of those trends.

Carlo, Monte
Reply to  bigoilbob
July 23, 2022 6:52 am

“consider them” — HAHAHAHAHAHAHAHAHAH

Keep painting yourself into the corner blob, it is hilarious.

Reply to  Carlo, Monte
July 23, 2022 4:48 pm

“consider them” — HAHAHAHAHAHAHAHAHAH”

I told you exactly how you would consider them. The sum of the variance is a part of both the evaluation of the standard deviation of an average and that of the standard error of a trend. The sum of the variance of the distributed data point errors is simply added to that of the “expected value” averaging or trending evaluation (depending on what you are doing) and the required parameter is then calculated.
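One reading of that recipe, sketched on invented numbers: add the variance contributed by the individual data-point uncertainties to the ordinary variance of the average, then take the square root.

```python
import math
import statistics

data = [18.2, 19.1, 17.8, 18.9, 19.4, 18.0]  # invented readings
u = [0.5] * len(data)                        # assumed +/- on each point

n = len(data)
var_of_avg = statistics.variance(data) / n      # spread of the data
var_from_u = sum(ui ** 2 for ui in u) / n ** 2  # data-point error term

sd_plain = math.sqrt(var_of_avg)
sd_with_u = math.sqrt(var_of_avg + var_from_u)
print(round(sd_plain, 3), round(sd_with_u, 3))
```

How much the second number exceeds the first depends entirely on the size of the assumed per-point uncertainties, which is precisely the quantity in dispute in this thread.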

Don’t believe me? Download this data.

https://www.epa.gov/climate-indicators/climate-change-indicators-sea-level

Now, from 1980-2013, for the pre-sat data, calculate the acceleration, and its standard error. You should get:

0.00586 +/- 0.000661 in*yr^2

Now, do the same thing including the provided annual standard deviations. You get:

0.00586 +/-0.000871 in*yr^2

You have just increased your chance that there is either no or negative acceleration from

4.07E-19 to a whopping

5.65E-12

Tell ’em what they won, Monty.

But of course you have NO idea how to do any of this, do ya’…
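The mechanics of the exercise (though not the EPA numbers themselves, which would have to be downloaded) can be sketched with synthetic data: fit a quadratic, take its leading coefficient as the “acceleration”, and read that coefficient’s standard error off the covariance matrix that numpy’s polyfit returns.

```python
import numpy as np

t = np.arange(34, dtype=float)            # stand-in for years 1980-2013
noise = 0.3 * (-1.0) ** np.arange(34)     # deterministic stand-in scatter
level = 0.005 * t ** 2 + 0.1 * t + noise  # synthetic "sea level"

coef, cov = np.polyfit(t, level, 2, cov=True)
accel = float(coef[0])                # quadratic coefficient
se_accel = float(np.sqrt(cov[0, 0]))  # its standard error
print(round(accel, 4), round(se_accel, 4))
```

One way to fold per-year standard deviations into the fit is polyfit’s `w` argument (weights of 1/sigma), which is the sort of step being described above; the exact EPA figures quoted are not reproduced here.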

Reply to  bigoilbob
July 23, 2022 5:50 pm

Correction:

yr^2 s/b yr^-2

A killer error, and I regret my carelessness. Especially when responding to someone as sharp as Mr. Carlo


At “Mines” schools, units rule. And so they should. Almost anyone with fundamental engineering skills who is not Pat Frank recognizes this.

Reply to  bigoilbob
July 24, 2022 9:05 am

“The sum off the variance is a part of both the evaluations of the standard deviation of an average and of the standard error of a trend.”

This *ONLY* applies to a distribution that is normal or at least identically distributed around a mean.

You have yet to explain how measurements of different things like temperature using different measurement devices can result in a normal distribution of both the stated values and the uncertainty.

Until you can do this nothing you say means anything.

Time to put up or shut up!

Show how maximum or minimum temperatures across just the US will result in a normal distribution or admit you have absolutely no understanding of metrology and uncertainty.

Reply to  Tim Gorman
July 24, 2022 9:16 am

“This *ONLY* applies to a distribution that is normal or at least identically distributed around a mean.”

Ah, the motorized goal posts. Instead of trying in vain to back up your fact free assertion, you just modify it slightly. Let’s cut to the chase. Since this is your vague, moveable claim, it’s up to you to document exactly which data sets containing which kinds of distributions are not part of the law that variance of the sum is the sum of the variance. FYI, I can start. Both for trending and averaging, correlated data has a smaller standard error and standard deviation than non correlated data.

Now, time to document….

Reply to  bigoilbob
July 30, 2022 11:29 am

This entire forum is about climate – i.e. temperatures.

The data sets are made up of individual, random, independent variables; that means individual measurements of different things using different measurement devices. There is simply no way to guarantee a distribution that is normal or identically distributed around a mean.

There is *NO* moving of the goal posts. Standard deviation and mean are statistical descriptors that *only* apply to normal distributions. They don’t apply to skewed distributions. They don’t apply to multi-modal distributions. For those kinds of distributions the 5-number description should be used: minimum, first quartile, median, third quartile, maximum.

Since the temperature data sets are *NOT* normal distributions the use of the mean and the standard deviation is just a crutch used by those who simply want to ignore statistical standards.
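A minimal illustration of the 5-number description (values invented and deliberately skewed):

```python
import statistics

data = sorted([12, 13, 13, 14, 15, 15, 16, 18, 22, 29, 41])

# min, Q1, median, Q3, max:
q = statistics.quantiles(data, n=4, method="inclusive")
five_number = (data[0], q[0], q[1], q[2], data[-1])
print(five_number)

# For comparison, the mean gets dragged toward the long right tail:
print(round(statistics.mean(data), 1))
```

The median (15) sits well below the mean here, which is the sense in which mean-plus-standard-deviation misdescribes a skewed sample.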

“It’s up to you to document exactly which data sets containing which kinds of distributions are not part of the law that variance of the sum is the sum of the variance.”

Do you even realize what you have said here? I doubt it. Since each individual, independent, random temperature in the data set has a variance denoted by its measurement uncertainty, you are saying that the variance of the sum is the sum of the variances of all the individual data set members, i.e. the sum of the measurement uncertainties.

That is *EXACTLY * what we’ve been trying to tell you from the start. You simply cannot ignore the measurement uncertainties. All those uncertainties add up when you combine them!

Thanks for finally recognizing reality!

Dave Andrews
Reply to  Tim Gorman
July 22, 2022 9:59 am

Plus each Argo float represents an area of ocean the size of Portugal. Would we take a single land temperature and say that represented all of Portugal?

Reply to  Dave Andrews
July 23, 2022 6:06 am

CAGW alarmists would!

MarkW
Reply to  MGC
July 21, 2022 5:04 pm

And here comes MGC to trot out all of the standard alarmist lies.

As to ocean temperatures, even 1,000 Argo probes fall short by a factor of at least 100 of the number needed to come close to an accurate reading of the entire ocean’s temperature.
Beyond that, the claimed warming for the entire ocean is only about 0.003C. This with a woefully inadequate number of probes with an accuracy of only 0.5C out of the lab. After several years at sea, the accuracy has decreased by an unknowable amount.

As always, the trolls are absolutely incapable of comparing like to like. They proclaim that since worldwide temperature recordings are only available in the modern era, this proves that prior to the modern era it is impossible for there to have been a worldwide climate phenomenon. Regardless, every proxy record available shows the existence of the MWP, but since we don’t have proxies for every spot on the planet, the warming supposedly couldn’t have been worldwide.

So far the coping has cost nothing. The warming we have seen so far is 100% beneficial, and will be for at least the next 2 or 3C of warming. The historical record proves that.

MGC
Reply to  MarkW
July 22, 2022 8:12 am

Here’s MarkW with another sadly typical torrent of “skeptical” falsehoods.

The claim “accuracy of only 0.5C out of the lab” is wildly incorrect.

Argo Data 1999–2019: Two Million Temperature-Salinity Profiles and Subsurface Velocity Observations From a Global Array of Profiling Floats
Frontiers in Marine Science Sept 2020

“The accuracies of the float data have been assessed by comparison with high-quality shipboard measurements, and are concluded to be 0.002°C for temperature, 2.4 dbar for pressure, and 0.01 PSS-78 for salinity.”

https://www.frontiersin.org/articles/10.3389/fmars.2020.00700/full

If there were genuinely the large degree of measurement uncertainty as these intentionally blind nay-saying “skeptics” claim, then the published mean ocean heat content values would be varying all over the place from month to month. The fact that they are not proves that these juvenile “too much uncertainty” ankle biting claims are, of course, completely false.

“warming for the entire ocean is only about 0.003C”

No surprise, totally wrong again. As usual. Too low by over a full order of magnitude.

MarkW
Reply to  MGC
July 22, 2022 9:25 am

Even carefully calibrated lab instruments in highly controlled environments have trouble measuring temperatures to 0.01C, much less 0.001C.
You really will believe any lie, so long as it supports your religious convictions.

Reply to  MGC
July 22, 2022 4:48 pm

Go look up Hadfield, 2007.

“The mean RMS difference across the whole section for the Argo based estimate (using the original OI parameters) is 0.58°C.”

Far too many so-called “scientists” today mistake sensor resolution for uncertainty. It is the entire float that causes the uncertainty, not the sensor resolution.

MGC
Reply to  Tim Gorman
July 23, 2022 8:39 am

How comical.

Hadfield 2007 was published 15 years ago, at a time when the full deployment of the Argo flotilla had not yet even been achieved.

Typical Gormanian fail.

Hadfield 2007 was one of the first research studies to initially investigate how useful the Argo data would be once full deployment was achieved.

Gorman, of course, blindly ignores the conclusions of this initial exploratory study:

“sampling of the temperature field at the Argo resolution results in a level of uncertainty … that is sufficiently small that it should allow investigations of variability in this region, on these timescales.”

What part of “sufficiently small uncertainty” is Gorman unable to comprehend?

Another 15 years of operation of the full Argo flotilla, and the analysis of the full flotilla data, post Hadfield 2007, have only further enhanced those initial findings.

Gorman’s ankle biting nay-saying continues to be nothing but ludicrous nonsense. His babble remains every bit as ridiculous as claiming that a gun “doesn’t have the accuracy to hit a target” even after a shooter used it to hit the bull’s eye five times in a row. Pure willfully ignorant garbage.

MGC
Reply to  MarkW
July 22, 2022 9:53 am

Oh, by the way, here are a couple other MarkW misrepresentations and falsehoods:

“every proxy record available shows the existence of the MWP”

Very doubtful that this is actually true, but moreover, the proxies that do show a “MWP” show that it occurred at widely different times, sometimes centuries apart, at different places around the globe. But of course so called “skeptics” like MarkW just disingenuously ignore that information, because it doesn’t fit the nay-saying “skeptical” agenda.

The claim “the warming we have seen so far is 100% beneficial” is also ridiculously false. Below is just one example of many that could be cited:

Sea Level Rise Caused an Extra $8 Billion in Losses During Hurricane Sandy
https://e360.yale.edu/digest/sea-level-rise-caused-an-extra-8-billion-in-losses-during-hurricane-sandy

And there will be many, many more examples like this, and worse, in the coming decades, as the planet continues to warm and sea levels continue to rise.

Mr.
Reply to  MGC
July 21, 2022 5:32 pm

MGC, I told you once before Scooter that any global warming and sea level rise that will have any consequential effects on habitation will take decades / generations to have any real impacts.

If remedial actions are required to ports, infrastructure, housing etc, that will provide much-needed work activity and economic boost for the future working age generations.

Just as WW2 did for those 1950s & 60s generations with all the re-building that was required.

Next time, however, there won’t tragically be millions of people killed in wars to create a huge work-making opportunity. Global warming is a doddle to deal with compared to 1,000 plane bombing raids.

If the world is lucky, global warming will also open up huge new areas for agriculture to feed a hungry world.

MGC
Reply to  Mr.
July 22, 2022 8:23 am

Here’s Mr. ridiculously imagining that things like putting much of the state of Florida underwater, which is a highly likely eventual outcome under “business-as-usual” practices, would somehow represent an “economic boost”.

And one has to be truly delusional, if not downright sick, to imagine that calamities like heinous world war destruction are “lucky” circumstances for providing “much-needed work activity and economic boost for the future working age generations”.

Words cannot even begin to describe the disgust felt for Mr.’s truly contemptible comments.

MarkW
Reply to  MGC
July 22, 2022 9:27 am

So an 8 inch increase in sea level over the next 100 years is going to put Florida under water?

Is there any lie so ludicrous that you won’t support it?

MGC
Reply to  MarkW
July 22, 2022 10:59 am

And here we go with yet *another* round of utterly tragic MarkW falsehoods.

MarkW has been shown more than once before that current global sea level rise rate is around 3.5 mm/yr. (AVISO, NOAA, NASA, CSIRO) That’s well over a foot in the next 100 years, not just his false 8 inches claim.

The rise rate has also been increasing, though so-called “skeptics” like MarkW also pretend away (i.e. simply lie about) that information, too.

Moreover, the rise rate along the U.S. east coast is much greater than the worldwide average. For example, Boston, NYC, and Miami have been seeing rise rates of 5-7 mm/yr. this century (Permanent Service for Mean Sea Level). That’s around two feet per 100 years.

Lastly, this sea level rise won’t just magically vanish at the end of this century. It will continue for centuries more.

People who have a sense of integrity don’t leave a looming issue like that for centuries of future generations to deal with. But apparently MarkW is not one of those kinds of people.

MarkW
Reply to  MGC
July 22, 2022 8:53 pm

As usual, MGC only takes the data set that shows what he wants to believe. But even if his false claims were for once accurate, that still doesn’t work out to Florida being underwater.

What is it about alarmists and their need to believe any catastrophic forecast they’re told to believe?

MGC
Reply to  MarkW
July 23, 2022 9:15 am

“only takes the data set that shows what he wants to believe”

Such utterly ludicrous drivel. Four different global datasets were referenced. They all show the same thing.

“that still doesn’t work out to Florida being underwater.”

More woefully intentional “skeptical” ignorance.

Unified Sea Level Rise Projection Southeast Florida:

“In the short term, sea level rise is projected to be 10 to 17 inches by 2040 and 21 to 54 inches by 2070 (above the 2000 mean sea level in Key West, Florida). In the long term, sea level rise is projected to be 40 to 136 inches by 2120.”

William Wilson
July 21, 2022 1:40 pm

The Trick is now showing on Netflix. Subscription cancel? Required viewing.

July 21, 2022 1:42 pm

I’m confused.
Is the missing heat hiding in the ocean again or a tree ring?

Richard Page
Reply to  Gunga Din
July 21, 2022 3:34 pm

I thought it was supposed to be in pine cones?

July 21, 2022 1:45 pm

From the above so-called “fact check” by http://www.carbonbrief.org:
“Human-emitted greenhouse gases trap extra heat in the atmosphere. While some of this heat warms the Earth’s surface, the vast majority – around 93% – goes into the oceans.”

So sophomoric, those two back-to-back sentences . . . so wrong on both counts.

1) Greenhouse gases do not “trap” heat energy in the atmosphere. Instead, they briefly intercept LWIR radiated off Earth’s surfaces and then promptly redistribute that “extra” energy to the rest of the gases in the atmosphere (predominantly nitrogen and oxygen), which are then free to both convect to TOA and to radiate that energy directly to space. This is why there is a current balance (“equilibrium”) between Earth’s incoming solar energy and Earth’s outgoing radiation energy. If greenhouse gases continuously “trapped” heat energy, Earth’s land and surface temperatures would reach the boiling point of water.

2) There is no credible evidence that 93% of the supposedly trapped (see #1 above) incoming solar energy passes into Earth’s oceans. If it did, the oceans would exhibit a significantly higher warming trend over time than currently measured.
“Based on time series of global ocean heat content variability calculated from Argo temperature measurements in the 10-1500m depth layer. The average [ocean] global warming rate accounts for 0.54±0.1 Wm-2 during the years 2005-2010.” (source: https://climatedataguide.ucar.edu/climate-data/ocean-heat-content-10-1500m-depth-based-argo ).
But there is this:
“The 2 degrees Celsius global temperature increase limit translates to a radiant energy increase of 2.5 watts per square meter.” (source: https://www.nsf.gov/news/news_summ.jsp?cntn_id=116862 ). Over the last 50 years, Earth’s average global (land and ocean) temperature has increased by about 0.9 °C, so the ratioed radiant energy increase would be about (0.9/2.0)*2.5 = 1.13 W/m^2. In comparing this to the preceding quoted Argo data, we see that only 0.54/1.13 = 0.48 = 48% of the “trapped” incoming solar energy passes into Earth’s oceans, NOT the 93% asserted by the carbonbrief.org folks.
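The back-of-envelope arithmetic in the paragraph above, collected in one place (using the quoted 0.54 W/m^2 Argo figure and the 2.5 W/m^2-per-2-C scaling):

```python
# Scale the 2.5 W/m^2 (for 2 C) figure to the observed ~0.9 C warming:
ratioed_forcing = (0.9 / 2.0) * 2.5   # ~1.13 W/m^2 as used in the text

# Fraction of that implied by the Argo-derived 0.54 W/m^2 ocean uptake:
ocean_fraction = 0.54 / ratioed_forcing
print(round(ratioed_forcing, 3), round(ocean_fraction, 2))
```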

Eric Worrall summed up the above far better than I can:
“Desperate butt covering from alarmists who are facing increasingly embarrassing questions about the failure of the world to end.”

Tom Abbott
Reply to  Gordon A. Dressler
July 22, 2022 4:46 am

One of the more effective methods to attack the alarmist argument is to point out that every dire prediction they make about the Earth’s weather has failed to materialize. They are wrong every time. Going by the polls, many people seem to have stopped listening to them as only about one to three percent of those polled recently give a high priority to climate change.

Fool me once, shame on you. Fool me twice, shame on me.

July 21, 2022 1:55 pm

“Except for that period between the 1940s to 1970s, when the drop in global temperature triggered climate scientists like Stephen Schneider to suggest we should use nuclear reactors to melt the polar ice, to prevent an ice age.”

He does not suggest that we should in that video:

Can we do these things? Yes. But will they make things better? I’m not sure. We can’t predict with any certainty what’s happening to our own climatic future. How can we come and intervene then in that ignorance? You could melt the ice-caps; what would that do to the coastal cities? The cure could be worse than the disease. Would that be better or worse than the risk of an ice age?

Tom Abbott
Reply to  Bellman
July 22, 2022 5:02 am

“We can’t predict with any certainty what’s happening to our own climatic future. How can we come and intervene then in that ignorance?”

Nothing has changed since that was first said. How can we intervene in our current ignorance?

Fools rush in where Angels fear to tread.

We have a lot of fools in positions of power today and they are rushing in for various selfish or delusional reasons, causing much unnecessary pain for the rest of the world, directly or indirectly.

About the only cure for this CO2 delusion is a significant temperature downtrend. That is a distinct possibility going by past temperature history.

Reply to  Tom Abbott
July 22, 2022 5:49 am

So you accept this article was being misleading when it said that Stephen Schneider was arguing we should melt the polar ice caps?

Your claim that nothing has changed over the last 45 years is not one I recognize.

Dan W
July 21, 2022 2:06 pm

There is then and there is now. The questions with regards to now are:
1) Is CO2 a greenhouse gas? IOW, does a significant increase of it in the air make things hotter?
2) Has there actually been a significant increase in CO2? Measure it.
3) Does the increase from vehicle, industrial and energy emissions match what we see in the air?
4) Does there being other causes for warming in the past somehow invalidate warming due to greenhouse gases?

gbaikie
July 21, 2022 2:24 pm

The Holocene is roughly following the pattern of recent interglacial periods, though it’s different
in that it looks like the Holocene has been significantly cooler.
Which might be related to Earth being the coldest it has been in the last tens of millions of years; in climbing out of this very cold period, it had a hiccup. And the hiccup was so serious that one might question whether the Holocene is actually an interglacial period. But we chose to call it an interglacial period despite sea levels rising only about 2 meters higher than present sea levels,
and it’s thought the last interglacial was 5 to 9 meters higher than current sea levels.
And the ocean during the Eemian interglacial period is thought to have had an average
temperature of 4 C or warmer.
And currently our ocean averages about 3.5 C.
The Holocene interglacial period has been gradually cooling, as all other interglacial periods did. This gradual cooling of the Holocene has been occurring for over 5000 years.
More than 5000 years ago, sea levels were higher, the Sahara Desert was mostly grassland and forests, and we had an ice-free Arctic Ocean. And having an ice-free polar sea allowed the great northern forests to have trees which today are just frozen stumps.
Or, our great northern forest was a larger forest. And the dry Sahara Desert was in a period called
the African humid period: https://en.wikipedia.org/wiki/African_humid_period

Stevek
July 21, 2022 2:40 pm

One good argument is to say AGW science does not follow the gold standard of science. The gold standard is the double-blind study. With Covid we heard about double-blind studies of drugs to fight it. We were told to wait until the double-blind study came out before putting faith in a drug.

Yet with the AGW hypothesis there is no double-blind study, simply because there is only one Earth. There is then a hypocrisy in any scientist who puts full faith in AGW but also believes the double-blind study is the gold standard of science. Scientists can’t have it both ways. If they have full faith in AGW then logically they deny the gold standard of a double-blind study.

Tom Abbott
July 21, 2022 3:02 pm

From the article: “However, after a decade or so of slower-than-average warming, rapid temperature rise returned in 2015-16 and global temperatures have since remained quite warm.”

The truth is the global temperatures have cooled quite a bit, down by 0.6C from the 2016, 21st century highpoint (the year 1998 was just as warm).

Here’s the evidence:

[image not reproduced]

Alarmists should explain why more CO2 is going into the air every day, yet the global temperatures have cooled 0.6C. That doesn’t seem to fit the alarmist claim that more CO2 in the atmosphere means higher temperatures.

Tom Abbott
July 21, 2022 3:15 pm

From the article: “In fact, the last eight years have been unusually warm – even warmer than expected given the long-term rate of temperature increases – with global temperatures exceeding 1.2C above pre-industrial levels.”

What the author is talking about is the temperature highpoint of 2016, which NASA Climate claims was 1.2C warmer than their average, and NOAA claims was 1.1C warmer than their average.

Whichever figure you care to use, current global temperatures are 0.6C cooler than 2016.

If we keep cooling like this latest cooling trend, the current slight warming trend is going to turn into a cooling trend. It will be fun seeing the alarmists try to explain that, if it happens.

The real temperature profile of the globe shows the temperatures warm for a few decades and then they cool for a few decades and then they warm again, all the while staying within certain bounds high and low. At least, this is the case since the Little Ice Age, going by the unmodified, written historical temperature record.

So the temperatures have warmed for a few decades during the satellite era (1979 to the present), have hit a high point, and now are cooling, perhaps beginning decades of cooling. Wouldn’t that be a kick in the head for the alarmists.

I guess I ought to put this in here for perspective:

[image not reproduced]

spren
July 21, 2022 3:15 pm

Down-welling IR energy can only penetrate the ocean surface to the thickness of a dime. UV penetrates the surface to a depth of several hundred feet before it is converted into IR. How exactly does the atmosphere warm the oceans when it is clear that oceans transport stored heat from the sun and transfer that heat into the atmosphere? Freaking charlatan liars.

damp
July 21, 2022 3:18 pm

The problem (for the Chicken Littles) is not the Pauses, but the holy computer models’ utter failure to predict the Pauses. They should not be allowed to bait-and-switch like this.

July 21, 2022 3:19 pm

Attached is the temperature in the tropics from climate4you GlobalTemperatures. Has there been a temperature hiatus in the tropics from 1998 to now?

Tropic temperature.png

Cosmic
July 21, 2022 3:48 pm

Does not matter with Marxists in charge. None of what you wrote matters 1 iota. To you, me and scientifically adept folks like you and the rest of us, of course it matters but not to marxist socialist commies of the democrat party. Not 1 single bit.

MarkW2
July 21, 2022 4:00 pm

The point is that if the models really were accurate, these pauses would show up. But they don’t. Any idiot can build a model with two increasing variables and any statistician who knows their stuff will tell you that such a relationship is meaningless.

This is the real reason the pause matters because it provides a great way to test the models; and they clearly fail the test.

H B
July 21, 2022 4:17 pm

“Three million years ago, the world was so warm Antarctica was mostly ice free”
Was it? The Arctic may have been, but Antarctica?

Bob
July 21, 2022 4:59 pm

Can someone help me with the graph? Why do the numbers on the right side of the graph get larger as they go down? Or are those dashes minus signs? If they are minus signs, what would zero represent?

July 21, 2022 5:08 pm

Notice that the Minoan, Roman, Medieval and the present time are all more or less equally spaced, lending evidence to the theory that today’s warmth is entirely natural.

I like to post this on facebook pages pushing Al Gore’s climate scam:
There is NO CLIMATE CRISIS!

THE CLIMATE HAS ALWAYS CHANGED!

5000 years ago, there was the Egyptian 1st Unified Kingdom warm period.
4400 years ago, there was the Egyptian Old Kingdom warm period.
3000 years ago, there was the Minoan warm period. It was warmer than now WITHOUT fossil fuels.
Then 1000 years later, there was the Roman warm period. It was warmer than now WITHOUT fossil fuels.
Then 1000 years later, there was the Medieval warm period. It was warmer than now WITHOUT fossil fuels. 1000 years later came our current warm period.

You are claiming that whatever caused those earlier warm periods suddenly quit causing warm periods, only to be replaced by man’s CO2 emission, perfectly in time for the cycle of warmth every 1000 years to stay on schedule. Not very believable.
 
The entire climate scam crumbles on this one observation because it shows that there is nothing unusual about today’s temperature and ALL claims of unusual climate are based on claims of excess warmth caused by man’s CO2.

Evidence that the Roman & Medieval warm periods were global: 
http://www.debunkingclimate.com/warm_periods.html
Evidence that those warm periods actually occurred:   
http://www.debunkingclimate.com/climatehistory.html

Much more evidence on climate:  
http://www.debunkingclimate.com

Even the IPCC debunks climate alarmism: http://www.debunkingclimate.com/ipcc_says.html

Feel free to disagree by showing actual evidence that man’s CO2 is causing serious global warming. (Or show your unwillingness to learn by posting a laughter emoji.)

July 21, 2022 5:53 pm

I’m thinking the drought and the warming trend in the Southwestern US has more to do with the PDO than CAGW. Once the PDO descends into a negative phase we should cool dramatically with precipitation also increasing. This is connected to solar activity. Any thoughts on this? My theory is that this past warming trend is driven by the PDO driven by changes in solar activity.

Reply to  John
July 22, 2022 2:30 pm

The US southwest has been a desert/semi-arid desert for thousands of years. I wouldn’t expect a *lot* of precipitation increase. Cooling maybe but cool air doesn’t generate the rain that warmer air does.

Geoff Sherrington
July 21, 2022 5:53 pm

There are many claims that the Earth has warmed over the last 150 years. There is always an uncertainty term to consider. If the 95% uncertainty is such that temperatures are good only to plus or minus half a degree C, then warming of claimed 1 degree C in 150 years is all inside the uncertainty limits. It has a probability of not being there at all.
Clearly, a great deal of attention needs to go into uncertainty estimates.
After 6 years of asking our Australian authorities I have finally got an answer that the routine historical daily temperatures have 95% confidence envelopes in the range of +/- 0.1 to 0.4 C., depending on several factors like type of thermometer, years of deployment etc.
These are unrealistic, IMHO. Temperatures from the 1850s have a lot more uncertainty than that, particularly from lack of global coverage, which is not considered in those official estimates.
What uncertainty figures are realistic?
Have your national or local authorities published estimates? What are they?
This uncertainty factor is one of the biggest criticisms of allegations of global warming. Geoff S

MarkW
Reply to  Geoff Sherrington
July 22, 2022 9:32 am

Prior to the age of digital thermometers, all readings were rounded to the nearest degree.
This adds a rounding error of up to +/- 0.5C on top of the confidence interval of the thermometer itself.
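
The effect of rounding to the nearest whole degree can be illustrated with a short sketch (the readings below are invented for illustration, not real station data):

```python
# Rounding a reading to the nearest whole degree shifts it by up to
# half a degree in either direction -- hypothetical readings only.
true_temps = [21.5, 18.2, 25.49, 30.0, 12.51]  # invented "true" values, C
recorded = [round(t) for t in true_temps]      # what the logbook keeps

errors = [r - t for r, t in zip(recorded, true_temps)]
worst = max(abs(e) for e in errors)
print(f"recorded: {recorded}")                 # [22, 18, 25, 30, 13]
print(f"largest rounding error: {worst} C")    # 0.5 C
```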

Reply to  Geoff Sherrington
July 22, 2022 11:17 am

Let’s also not forget that before min/max LIG thermometers, readings were taken manually at times of day that might not capture the actual high or low temps. This increases the uncertainty tremendously. Anytime I see anomalies quoted to the hundredths or thousandths place prior to WWII, I question the ability of the scientists in analyzing uncertainties. Too many times they want to “create” a Standard Error of the Mean (SEM) and treat that as the uncertainty. They have no clue about what they are doing.

Reply to  Jim Gorman
July 22, 2022 1:41 pm

min/max mechanical thermometers today can have an uncertainty of +/- 4F from the manufacturer. It is doubtful that uncertainty gets *better* when it is placed in the field.

Reply to  Tim Gorman
July 23, 2022 7:38 pm

+/-4F? The brochure I read said half that. You might very well be right, but would you please provide documentation?

Reply to  bigoilbob
July 24, 2022 9:07 am

I’m not your research assistant. Do your own research. I’m sure what you saw was +/- 2C, not +/- 2F.

Reply to  Tim Gorman
July 24, 2022 9:20 am

“I’m sure what you saw was +/- 2C, not +/- 2F.”

Perhaps. But per the rules of the road, your claim, your responsibility to back it up. Or, for the umpteenth time, per Chris Hitchens:

“That which can be asserted without evidence, can be dismissed without evidence.”

Reply to  Geoff Sherrington
July 22, 2022 1:39 pm

“the routine historical daily temperatures have 95% confidence envelopes in the range of +/- 0.1 to 0.4 C, depending on several factors like type of thermometer, years of deployment etc.”

These *are* unrealistic. Even today’s modern Argo floats have an uncertainty of +/- 0.5C. The US Federal Meteorology Handbook No. 1 specifies that federal temperature measurement devices only have to meet a +/- 0.6C uncertainty.

The best LIG min/max thermometers *today* have uncertainties greater than this, as much as +/- 4F (+/- 2C).

Too many people today assume the sensor resolution capability is the uncertainty of the device. That appears to be what you were quoted. RESOLUTION IS NOT UNCERTAINTY. You can easily have a high resolution device whose reading is very inaccurate with a large uncertainty. In fact, the higher the resolution of the sensor the easier it is for the whole device to be inaccurate because of component drift in the device.



Art
July 21, 2022 7:55 pm

The last eight years are the warmest eight years since records began in the mid-1800s… except for the 1930s.

Craig from Oz
July 21, 2022 8:54 pm

I love the way the authors of this document attempt to claim there is no pause by… saying there is a pause.

In fact, the last eight years have been unusually warm

Not actually the question. The question was is it continuing to get warmer.

In simple terms it is the difference between velocity and acceleration. Not really an advanced topic.

One wonders if these authors don’t actually understand the difference, or if they do and are deliberately attempting to word-salad the readers into trusting them.

Reply to  Craig from Oz
July 22, 2022 2:58 pm

It’s even worse than this. The *anomalies* are unusually large. But there is no way to judge what has caused the anomalies to grow. Is it growth in minimum temps affecting the annual averages? Is it growth in the maximum temps affecting the annual averages? (annual average – long-term average = anomaly) Or is it a combination of both?

You can’t tell from just looking at the anomaly by itself. But the CAGW advocates *ALWAYS* assume the anomaly growth is from higher and higher max temps that are going to turn the earth into a cinder – *ALWAYS*!

Where is the catastrophe if it is minimum temps going up?
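
The ambiguity described above can be sketched numerically; all temperatures here are invented purely for illustration:

```python
# Two hypothetical years with identical anomalies but different causes,
# using the definition given above: anomaly = annual avg - long-term avg.
baseline_min, baseline_max = 10.0, 20.0
long_term_avg = (baseline_min + baseline_max) / 2        # 15.0 C

# Year A: only the minimum temperature rises by 2 C
year_a_avg = ((baseline_min + 2.0) + baseline_max) / 2   # 16.0 C
# Year B: only the maximum temperature rises by 2 C
year_b_avg = (baseline_min + (baseline_max + 2.0)) / 2   # 16.0 C

anomaly_a = year_a_avg - long_term_avg
anomaly_b = year_b_avg - long_term_avg
print(anomaly_a, anomaly_b)        # 1.0 1.0 -- indistinguishable
```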

July 21, 2022 9:17 pm

How does the infrared heat trapped by atmospheric CO2 get into the oceans? It’s not from radiation, as the oceans are opaque to longwave IR. So it must be convection or conduction. Yet there is no explanation for that. Saying it goes into the oceans is silly without a mechanism.

MarkW
Reply to  Doonman
July 22, 2022 9:34 am

The infrared does not put heat into the oceans. It makes it harder for the heat that short wave radiation is putting into the oceans, to get out.

Reply to  MarkW
July 22, 2022 3:01 pm

What absorbs the longwave IR at the land surface? Silica and gypsum, two of the most common substances on land, don’t respond much to IR at CO2’s wavelengths. Neither does plant life or fresh water.

MarkW
Reply to  Tim Gorman
July 22, 2022 8:57 pm

Why fixate on just long wave? Both long wave and shortwave radiation heat land surfaces. And just as with water, the rate at which the heat comes out of the land depends on the temperature difference between the land and the air.
The warmer the air, the slower the heat comes out of the land.

Reply to  Doonman
July 23, 2022 4:37 am

The warmer atmosphere heats the surface layer of the oceans. The warmer surface layer impedes heat loss from lower layers of the ocean.

The main source of ocean heat is sunlight. Additionally, clouds, water vapor, and greenhouse gases emit heat that they have absorbed, and some of that heat energy enters the ocean. Waves, tides, and currents constantly mix the ocean, moving heat from warmer to cooler latitudes and to deeper levels.

Covering more than 70% of Earth’s surface, our global ocean has a very high heat capacity. It has absorbed 90% of the warming that has occurred in recent decades due to increasing greenhouse gases, and the top few meters of the ocean store as much heat as Earth’s entire atmosphere.

RoHa
July 21, 2022 10:12 pm

Melanie Phillips is insane, but Christopher Monckton isn’t (even if he did carry water for Thatcher in his callow youth) and can be taken seriously.

RoHa
July 21, 2022 10:13 pm

And as I’ve said before, all the Missing Heat (TM) has collected at the bottom of the sea, and one day soon it will suddenly rise to the surface and go rampaging across Tokyo. You mark my words.

H.R.
Reply to  RoHa
July 22, 2022 7:57 am

No worries. Mothra will deal with it. Mothra needs something down there to keep warm, so that heat isn’t going anywhere without Mothra putting up a struggle.

(Anybody got a better explanation?)