Guest essay by Larry Hamlin
In a clear discrediting of NOAA's and the media's recent overhyped and flawed claim that "July 2021 was the hottest month ever recorded" (hype promoted by NOAA's climate alarmist "scientists"), updated data from all major global temperature anomaly measurement systems (including NOAA itself, as discussed below) show that NOAA's claim was exaggerated, deceptive and distorted.

The 4 other major global temperature measurement systems, the satellite systems UAH and RSS and the surface systems GISS and HadCRUT, reveal that NOAA was an isolated outlier in making its exaggerated claim, a claim ridiculously overhyped by the climate alarmist media, as demonstrated by the headline and picture in the article above by the AP's decades-long biased climate alarmist activist Seth Borenstein.
The combined land and sea global surface temperature monthly anomaly data are available for each of the 5 major global temperature measurement systems (HadCRUT5, UAH6.LT, GISSlo, RSS TLT V4 and NOAAlo), as discussed (with links) below.
The UAH, RSS, GISS and HadCRUT global temperature monthly anomaly measurement systems showed that the highest July occurred in years 1998, 2020, 2019 and 2019 respectively and not year 2021 as claimed by NOAA.
Furthermore, NOAA's "July hottest month ever" claim was both exaggerated and deceptive because it rested on a trivial and minuscule 0.01 degrees C margin above the prior July NOAA peak monthly anomaly measurements, which occurred in 2020, 2019 and 2016.
The NOAA July 2021 global monthly temperature anomaly measurement carries a 95% confidence interval (accuracy range) of +/- 0.19 C, which is nearly 20 times greater than the minuscule 0.01 degrees C anomaly difference between July 2021 and July 2020, 2019 and 2016, meaning that the difference between these July anomaly measurements is scientifically insignificant and unnoteworthy.
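For readers who want to check this arithmetic themselves, here is a minimal sketch (Python, using only the figures quoted above):

```python
# Minimal sketch of the significance arithmetic described above.
# The 0.01 C gap between July 2021 and the prior July peaks is compared
# against NOAA's reported 95% confidence interval of +/-0.19 C.

ci_95 = 0.19   # NOAA's reported 95% confidence interval (deg C)
gap = 0.01     # July 2021 minus the July 2020/2019/2016 anomalies (deg C)

print(f"CI is {ci_95 / gap:.0f}x larger than the gap")  # -> 19x
print("distinguishable" if gap > ci_95 else "statistically indistinguishable")
```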
Further adding to NOAA's and the media's deception over the July 2021 "hottest month ever" hype is the fact that this week (9/14/21) NOAA reduced its July 2021 anomaly value by 0.01 degrees C as part of its August 2021 global temperature anomaly system update. That means July 2021 is not the "hottest month ever" but is tied with July 2019, with the July 2020 and 2016 anomalies just 0.01 degrees C lower.
Where are the climate alarmist media headlines announcing NOAA’s embarrassing reduction in its prior reported July 2021 temperature anomaly “hottest month ever” hype and acknowledging this change in the public press? Don’t hold your breath waiting for the NOAA and media alarmist correction announcement.
The highest peak global monthly temperature anomaly for all 5 temperature measurement systems including the UAH, RSS, GISS, HadCRUT and NOAA measurement systems occurred over 5 years ago in year 2016 during the months of February and March.
More significantly, the media's ignorant and misguided "July hottest month" exaggeration deliberately distorted the global monthly temperature anomaly record by concealing the fact that global monthly temperature anomalies have been declining since the 2016 peak, as clearly reflected in all 5 measurement systems' data records, shown below in the UAH, RSS, HadCRUT, GISS and NOAA records respectively.


The graph below shows the HadCRUT4 data. HadCRUT5 data has about 14% higher values; the February 2016 peak anomaly for HadCRUT5 is 1.22 C versus about 1.1 C for HadCRUT4.



Of course, there will be no blaring news headlines or climate science ignorant (yet incredibly arrogant) TV broadcasters in the biased climate alarmist media acknowledging the flawed hype of the "July 2021 hottest month ever" scam. It was nothing but politically motivated climate "science" alarmist propaganda, consistent with the usual alarmism and media shenanigans built upon exaggeration, deception and distortion.
The declining global monthly temperature anomaly data trends for all 5 major temperature measurement systems over the last 5+ years as shown above clearly establish that there is no climate emergency.
Additionally, the U.S. and EU, which have been driving the UN IPCC climate alarmism political campaign for over 30 years, have now completely lost the ability to control global energy and emissions outcomes through the IPCC's flawed climate model contrived schemes.
In 1990, the year of the first UN IPCC climate report, the world's developed nations, led by the U.S. and EU, accounted for nearly 58% of all global energy use and 55% of all global emissions. But that dominance in global energy use and emissions changed dramatically and completely disappeared over the following 15 years.
The world's developing nations, led by China and India, took command of total global energy use in 2007 (exceeding 50% of all global energy use) after coming to dominate total global emissions in 2003 (exceeding 50% of global emissions).
In 2020 the developing nations accounted for 61% of all global energy use and two thirds of all global emissions, and they are clearly on a path to increase these commanding percentages further. The developing nations have no interest in crippling their economies by kowtowing to the western nations' flawed, model-driven climate alarmism political propaganda campaign, and they have announced to the world that they are fully committed to increased use of coal and other fossil fuels.
In 2020 the developing nations consumed 82% of all global coal, with China alone consuming 54% of the world's coal. China was the only nation in the world that increased both energy use and emissions in pandemic year 2020.
The U.S. and EU have not contributed to the increasing level of global emissions over the last 15 years. In fact, these nations reduced emissions during this period by many billions of metric tons. Yet global emissions have continued to climb ever higher by many more billions of tons, driven exclusively by the unstoppable growth of fossil fuel energy use in the world's developing nations.
Assertions by U.S. and EU politicians that massively costly, horrendously onerous and bureaucratically driven reductions of emissions will "fight climate change," along with bizarre claims of supporting a "net zero" future, are ludicrous, disingenuous and nothing less than fraudulent schemes.
It’s time for the developed nations to stop their scientifically incompetent, globally irrelevant, real world inept and purely politically driven flawed climate model alarmist propaganda campaign.
It is typical of the media to hype individual events during a cool year. If this were an El Niño year and the average global temperature through July was a record, they would have hyped that instead. Since the 2021 global temperature is well below any possible record, they instead hype one month. This type of cherry picking is actually even more dishonest than the question of whether it was or was not a record.
It is hilarious to see the anti-science trolls try to justify the headlines when the bigger problem is the obvious cherry picking and ignoring that this is a cooler year. The lack of honesty by the media is disgusting and anyone who is interested in science would admit it. Not our trolls however. They once again proved to the world they have no interest in science.
NOAA uses surface weather observation locations for global temperature (and other) data – see attached image. Note the data density of US stations. When creating daily or monthly global averages, does anyone know if NOAA performs a common average (regardless of location), or is an area-weighting function used to account for spatial distribution differences?
It’s area weighted on a 5×5 lat/lon grid. I’ve never heard of any dataset using a common average. I don’t even know how that would be possible.
https://journals.ametsoc.org/view/journals/bams/93/11/bams-d-11-00241.1.xml
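As a rough illustration of the idea (not NOAA's actual code), a cosine-of-latitude weighting over a 5×5 grid looks something like this; the gridded anomaly values here are random placeholders:

```python
import numpy as np

# Rough illustration of area weighting on a 5x5 degree lat/lon grid.
# Cell area scales with cos(latitude), so a polar cell counts far less
# than an equatorial one. Real datasets also handle empty cells.

lats = np.arange(-87.5, 90.0, 5.0)    # 36 cell-center latitudes
lons = np.arange(-177.5, 180.0, 5.0)  # 72 cell-center longitudes
anoms = np.random.default_rng(0).normal(size=(lats.size, lons.size))

weights = np.broadcast_to(np.cos(np.radians(lats))[:, None], anoms.shape)
area_weighted = np.average(anoms, weights=weights)
common_average = anoms.mean()         # the naive mean: overweights the poles

print(area_weighted, common_average)
```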
Of course it’s possible. Just take the average of available data reporting stations. Problem is, in high-density regions, NOAA does not use all the data and even subjectively removes some. They’ve turned a simple procedure into a complex process.
That's just a trivial average though. The goal is to determine the global mean. I don't know how you would project a trivial average onto a spherical shape and have it adequately resemble the spherical mean. The procedure is complex for a bunch of reasons. And it's not just NOAA using a complex process. It's everyone. In fact, NOAA's process is among the simplest.
And I'm not sure what you mean by "NOAA does not use all of the data and even subjectively removes some". To be frank, I'm surprised you say it, because you didn't know how NOAA does the averaging, so I question how you could possibly know one way or another on the point above. And I say that with due respect, because you're a smart guy who is clearly very knowledgeable in the atmospheric sciences, especially sounding analysis. I've even emailed you to get support on RAOB. You were prompt and super helpful.
“NOAA does not use all of the data and even subjectively removes some” is explained in their well documented temperature processing literature … https://www.ncdc.noaa.gov/monitoring-references/faq/temperature-monitoring.php Even though NOAA engages in complex temperature data processing, I just have one simple question … why, on average, are the USHCN “raw” data temperatures cooled prior to 2008 and why are they warmed after 2008?
NOAA does not use USHCN for their global temperature dataset. They use the GHCN repository. The stations that comprise USHCN are contained within GHCN as well. In fact, USHCN is just a subset of GHCN. One of the design goals of USHCN was to be as consistent as possible. That means they try to keep the same 1219 stations. Stations are constantly being commissioned in the US. These stations are omitted because they don’t comply with the stated goals of USHCN. There have been occasions when a station of long record gets decommissioned. In that case I believe they replace it with a nearby station to keep the station count as close to 1219 as possible. Note that recently commissioned stations are added to the GHCN repository though. In fact, stations are added to GHCN constantly. This includes station records that are years and even decades old as those records get digitized. Digitization projects are still ongoing.
The webpage you linked to explains the adjustments. The most onerous is the time-of-observation bias. The bias is a result of the gradual shift from PM to AM observations. The bias propagates into analyzed trends so it must be corrected. Station moves, instrument changes, etc. also contaminate the record with biases as well. These biases are corrected with pairwise homogenization which has no subjective elements to it though it does use the HOMR database for clues regarding the bias inducing changepoints.
And although USHCN isn’t used for NOAA’s global mean temperature dataset the USCRN dataset confirms that USHCN is a pretty accurate depiction of US temperatures and that the pairwise homogenization corrections are doing their jobs. Actually, USCRN indicates that USHCN is still biased a bit too low though the difference is relatively small.
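For anyone curious what "pairwise" means in practice, here is a toy sketch of the core idea (NOT the actual PHA, which is far more involved; see Menne 2009):

```python
import numpy as np

# Toy sketch of pairwise changepoint detection. Subtracting a neighboring
# station cancels the shared climate signal, leaving an artificial step
# (e.g. a station move) visible in the difference series.

rng = np.random.default_rng(0)
climate = np.cumsum(rng.normal(0, 0.05, 240))  # shared regional signal, 20 yrs monthly
neighbor = climate + rng.normal(0, 0.1, 240)
station = climate + rng.normal(0, 0.1, 240)
station[120:] += 0.5                           # hypothetical 0.5 C step at month 120

diff = station - neighbor                      # climate signal cancels here
# Crude breakpoint search: find the split point with the largest mean shift.
shifts = [abs(diff[:k].mean() - diff[k:].mean()) for k in range(12, 229)]
bp = 12 + int(np.argmax(shifts))
offset = diff[bp:].mean() - diff[:bp].mean()
print(bp, round(offset, 2))                    # ~120 and ~0.5
```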
Why were the USHCN temperatures cooled prior to 2008 and warmed after 2008?
What are the impacts of any UHI adjustments, CONUS and global?
I just have one simple question … why, on average, are the USHCN "raw" data temperatures cooled prior to 2008 and why are they warmed after 2008?
I think bdgwx covered this (multiply-asked) question better than I could, but I do have a question for you, if I may?
Why do you care about USHCN?
Despite the name it ceased being the US dataset of record over 7 years ago. Its 1200-odd stations were superseded by the 10,000-odd of nClimDiv.
“The USCRN serves, as its name and original intent imply, as a reference network for operational estimates of national-scale temperature. NCDC builds its current operational contiguous U.S. (CONUS) temperature from a divisional dataset based on 5-km resolution gridded temperature data. This dataset, called nClimDiv, replaced the previous operational dataset, the U.S. Historical Climatology Network (USHCN), in March 2014.
Compared to USHCN, nClimDiv uses a much larger set of stations—over 10,000—and a different computational approach known as climatologically aided interpolation, which helps address topographic variability. ”
https://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/background
Similarly GISTEMP stopped using USHCN in 2011. Although it is maintained as a legacy product, as far as I am aware no significant published dataset continues to use it.
So why the burning curiosity?
Why were the raw USHCN temperatures cooled prior to 2008 and warmed after 2008?
It’s been answered multiple times on this thread on other threads, by Nick Stokes on Moyhu, and on their website.
That many denizens refuse to read and understand enough to overcome their ideologically motivated conspiracy ideation is not going to be changed by doing so again.
That's your problem, and it's seemingly not going to be fixed by the scientific logic behind homogenisation.
Six times – the same question about an obsolete dataset that nobody uses.
And yet no reply to why he wants to know.
Weird.
I understand how that works for milk — and then the entire milk batch comes out the same temperature — but not for USHCN data. But doesn’t anyone know why the raw USHCN temperatures are cooled prior to 2008 and warmed after 2008?
The adjustments are explained on your own link. Did you read it?
I’m not sure I’m understanding your statement that USHCN is cooled prior and warmed after 2008. Where are you seeing that?
Here are some publications I want you to wade through.
Smith 2005 (10.1175/JCLI3362.1) – NOAA GlobalTemp methods
Smith 2008 (10.1175/2007JCLI2100.1) – NOAA GlobalTemp methods
Menne 2009 (10.1175/2008JCLI2263.1) – Pairwise Homogenization Algorithm
Hausfather 2016 (10.1002/2015GL067640) – Evaluation of PHA via USCRN overlap period
See Figure 1 … the “final-raw” black line. Why are the older data made colder and the newer data made warmer? https://wattsupwiththat.com/2020/11/03/recent-ushcn-final-v-raw-temperature-differences/
I can't see that they did!
“Figure 5. Impact of adjustments on U.S. temperatures relative to the 1900-1910 period, following the approach used in creating the old USHCN v1 adjustment plot.”
http://berkeleyearth.org/wp-content/uploads/2015/04/Figure5.png
From: https://cdiac.ess-dive.lbl.gov/epubs/ndp/ushcn/monthly_doc.html
“Table 1. Differences between USHCN version 2.0 and version 2.5”
“VERSION 2.0: The temperature data were last homogenized by the PHA algorithm in May 2008. Since May 2008, more recent data have been added to the homogenized database using the monthly values computed from GHCN-Daily (but without re-homogenizing the dataset).”
“VERSION 2.5: The raw database is routinely reconstructed using the latest version of GHCN-Daily, usually each day. The full period of record monthly values are re-homogenized whenever the raw database is re-constructed (usually once per day)”
See Figure 1 … the “final-raw” black line. Why are the older data made colder and the newer data made warmer? https://wattsupwiththat.com/2020/11/03/recent-ushcn-final-v-raw-temperature-differences/
I don't see your beef about 2008 specifically, other than that they introduced V2.0 then … irrelevant, as it has been defunct since 2014.
Read the contributions by Steven Mosher and Nick Stokes on that thread … they are the only ones who know what they are talking about.
Of course the USHCN data is defunct, that’s why the data is updated and modified annually with our tax monies. I have about 1 million questions — and they increase each year. But let’s take one at a time. Let’s start with USH Station #17366, Selma, AL. The mean July 1920 “raw” temperature was 32.55 and the altered temperature was 33.23 C. Explain the reason for that change.
What did your own link say about this?
Make sure you read the comments in that WUWT blog post. There are some really good ones that explain what is going on.
Reminder…this is all moot because NOAA GlobalTemp does NOT use USHCN.
Of course they’re not used, that’s why the USHCN data is updated and modified annually with our tax monies. I have about 1 million questions — and they increase each year. But let’s take one at a time. Let’s start with USH Station #17366, Selma, AL. The mean July 1920 “raw” temperature was 32.55 and the altered temperature was 33.23 C. Explain the reason for that change.
Again…station moves, instrument changes, and time-of-observation changes. There are 13 documented changepoints for this station. It is very likely there were many undocumented changepoints as was common especially prior to WWII. The difference between unadjusted and adjusted is the result of the PHA processing. Read the literature I linked to above. Side note…where are you seeing 32.55 and 33.23? I’m seeing 27.39 vs 27.84 in GHCN-M.
Now you’ve done it. You got ahead of me and prematurely exposed the other 1 million questions. So now, I have to ask you 2 questions about the same station. Remember, my initial question (and all my comments) have been about the USHCN data — which is the core long-term temperature history of the US. So first, explain how the USHCN data was altered, and then second, explain how that same station reflects different data for the GHCN-M file.
1) Pairwise Homogenization Algorithm – Menne 2009.
2) I think I see the problem. You didn’t pull tavg; you pulled tmax. I just verified that USHCN matches GHCN-M for station 17366.
BTW…USHCN isn’t the core long term history of the US. It is only a long term history of the US. nClimDiv is the core long term history of the US. USHCN has a maximum of 1220 stations by design. nClimDiv has about 10000. Note that nClimDiv and USHCN-adj are nearly identical. Doubly note that USCRN is nearly identical suggesting that PHA is an effective bias correction technique.
Question…what are the 6 general steps of the PHA?
Have a look at temperature.global for another data set, no warming for 5 years.
Thanks for the note. The comment … “Temperature.Global calculates the current global temperature of the Earth. It uses unadjusted surface temperatures.” is especially refreshing — since it supports Tony’s work — and better reflects the UAH satellite data. Glad I brought my snow shovel to Florida.
They use area weighting, but my bitch is that a degree C in the Arctic does not have the energy associated with a degree C in the Tropics. It's the humidity, not the temperature, as the old saying goes.
And … they don’t use all the data reporting stations for the averages. I want a refund of my tax monies.
They use all of the stations in the GHCN-M repository which I believe is more than 27,000 the last time I looked.
The claim of hottest month ever is based on precision in measurements that does NOT EXIST; it is not possible to assign a single temperature to the entire earth with the precision they claim = they are LYING to us
“It is not possible to assign a single temperature of the entire earth.” Period. End of sentence.
The UAH, RSS, GISS and HadCRUT global temperature monthly anomaly measurement systems showed that the highest July occurred in years 1998, 2020, 2019 and 2019 respectively and not year 2021 as claimed by NOAA.
Why is 2019 repeated?
They probably meant 2016. After all, 2016 was the “hottest year evah!”.
ZEKE 2014 Changing the Past?
Diligent observers of NCDC's temperature record have noted that many of the values change by small amounts on a daily basis. This includes not only recent temperatures but those in the distant past as well, and has created some confusion about why, exactly, the recorded temperatures in 1917 should change day-to-day. The explanation is relatively straightforward. NCDC assumes that the current set of instruments recording temperature is accurate, so any time of observation changes or PHA-adjustments are done relative to current temperatures. Because breakpoints are detected through pair-wise comparisons, new data coming in may slightly change the magnitude of recent adjustments by providing a more comprehensive difference series between neighboring stations.
When breakpoints are removed, the entire record prior to the breakpoint is adjusted up or down depending on the size and direction of the breakpoint. This means that slight modifications of recent breakpoints will impact all past temperatures at the station in question through a constant offset. The alternative to this would be to assume that the original data is accurate, and adjust any new data relative to the old data (e.g. adjust everything in front of breakpoints rather than behind them). From the perspective of calculating trends over time, these two approaches are identical, and it's not clear that there is necessarily a preferred option.
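A quick numerical check of Zeke's claim that the two conventions give identical trends, using a synthetic series with one hypothetical breakpoint (not real station data):

```python
import numpy as np

# Adjusting everything *behind* the breakpoint (align past to present,
# NCDC's convention) or everything *ahead* of it (align present to past)
# yields series that differ only by a constant, so fitted trends match.

t = np.arange(100)
series = 0.01 * t + np.random.default_rng(1).normal(0, 0.1, 100)
series[60:] += 0.3                        # +0.3 step at t = 60

adj_behind = series.copy(); adj_behind[:60] += 0.3
adj_ahead  = series.copy(); adj_ahead[60:] -= 0.3

print(np.polyfit(t, adj_behind, 1)[0])    # same slope...
print(np.polyfit(t, adj_ahead, 1)[0])     # ...as this one
```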
Nick Stokes used to give a valuable monthly temperature update [on sabbatical at present].
He would also give a James Hansen update in his links.
The GISS V4 land/ocean temperature anomaly was 0.85°C in June 2021, up from 0.79°C in May. This small rise is similar to the 0.01°C increase (now 0.03°C) reported for TempLS. Jim Hansen’s report is here.
What I found in Hansen's report was that the July anomalies are the lowest of all the months. So even if a record was broken, it actually represents less of an anomaly increase than for other months.
We are still on track for a much lower year than in recent years? Equal to the highest at the moment, with 4 months that we can only hope get lower.
No one can keep up a fraud of global warming if we do get cool years.
Conversely, if it were to keep going up??
JMA differs with me and actually gives a figure for the month to date!
Amazing.
The monthly anomaly of the global average surface temperature in August 2021 (i.e. the average of the near-surface air temperature over land and the SST) was +0.27°C above the 1991-2020 average (+0.81°C above the 20th century average), and was the 4th warmest since 1891
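For anyone puzzled by the two numbers, converting an anomaly between baselines is just a constant offset equal to the difference between the baseline means, as this minimal sketch using JMA's quoted figures shows:

```python
# Converting an anomaly between baselines is a constant shift. The offset
# below is *implied* by JMA's two quoted figures, not independently sourced.

anom_recent_base = 0.27            # August 2021 vs the 1991-2020 average (deg C)
baseline_offset = 0.81 - 0.27      # implied (1991-2020 mean) - (20th-century mean)
print(anom_recent_base + baseline_offset)   # 0.81, JMA's other quoted figure
```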
So when a scientist starts to write up a new paper with temperature as one variable, does he use the version from today, yesterday, a month ago, a year ago?
Also, the scientist who wrote a paper 20 years ago, using versions available then, needs to either recalculate using the latest version, or retract the original because the temperature data were wrong.
Silly effects arise when people choose to modify basic scientific concepts that have stood the tests of time: inventing home-made ways to calculate uncertainty, using temperature anomalies instead of real, measured temperatures, and applying subjective adjustments like TOBS.
It just comes through as crooked science, or anti-science.
Geoff S
“It just comes through as crooked science, or anti-science.”
It does to me.
What I also find annoying is when location names are changed to be politically correct. When I’m looking to find an old mine, I can never be certain it is the correct location. The probability is high for most name changes, but there is no certainty.
It always amazes me that scientists use simple regression to try to find a "trend" in temperature data.
Everything I have researched tells me that temperature is surely a periodic function composed of a number of underlying oscillations. This cries out for some kind of frequency/time-based analysis rather than computing averages and finding a trend via statistics.
People talk about cherry picking, but that is what you get when you try to choose a starting point on a periodic function. Do you start at the bottom of a cycle or the top of a cycle? Should you be using annual averages or monthly or daily temps in order to identify the cycle?
I don't think many scientists or mathematicians have ever studied periodic functions and done Fourier or wavelet analysis of complex periodic functions. The people who have predominantly studied this are engineers and physicists. Is it a coincidence that you see so few of these people as authors of climate papers?
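For what it's worth, here is a sketch of what such an analysis looks like on synthetic data with two hypothetical cycles (not real temperatures):

```python
import numpy as np

# An FFT of a synthetic monthly series with two superposed cycles (4-year
# and 50-year, chosen to land on exact frequency bins) recovers the periods
# directly, where a linear fit would only report a "trend".

n = 1200                                        # 100 years of monthly values
t = np.arange(n) / 12.0                         # time in years
signal = (0.3 * np.sin(2 * np.pi * t / 4.0)     # ENSO-like 4-yr cycle
          + 0.1 * np.sin(2 * np.pi * t / 50.0)  # slow 50-yr oscillation
          + np.random.default_rng(2).normal(0, 0.1, n))

spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(n, d=1/12.0)            # cycles per year
top2 = freqs[np.argsort(spectrum)[-2:]]
print(sorted(1.0 / top2))                       # recovered periods: [4.0, 50.0]
```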
https://xkcd.com/793/
The hubris here is almost unbelievable. Dunning-Kruger.
What is your background in metrology?
None. I have no academic credentials in either metrology or climate science. I am but a layman in these matters and must defer to those smarter than I for knowledge and understanding.
The interesting thing about the comment section for this article was a tag team of three or four relatively skillful commenters completely hijacking the comment section by doing pretty much everything except address and refute the main point of the article. They sounded as if they were working their way around to it, but really it was just a bombardment of non sequiturs and red herrings. Curious.
But they couldn’t have done it nearly so effectively without the participation of those being led astray.
I suppose you could have jumped in early and pointed out to those being led astray what the proper response should have been. I don’t have a lot of respect for Monday Morning Quarterbacks.
“everything except address and refute the main point of the article”
Disagree. The main point of the article was that NOAA’s announcement of a record hot month in July was ‘rejected’ by ‘all 5 major datasets’
One objection is that one of those 5 had not reported, and the author plotted the graph for June apparently without noticing. A sloppy error that does not speak well of the author's attention to detail or of proof-reading by the site (if any).
But the principal objection is that Hamlin does not seem to have grasped the concept of an anomaly. NOAA reported the hottest month because it was the hottest in absolute terms, that is, baseline + anomaly (just pointing out this basic truth is enough to earn massive downvotes, so it goes). The anomaly measures the difference from the average for that month, so when Hamlin plots anomaly graphs or quotes larger anomalies from March and February he achieves no more than embarrassing himself.
NOAA is a surface dataset, Hadley July data is not yet available, the NASA data showed July 2021 joint warmest with 2019. After the August update NOAA show the same thing. UAH and RSS have different warmest months, but they are not measuring the same quantity as NOAA.
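To make the anomaly-versus-absolute point concrete, here is an illustrative sketch; the baseline values are approximate 20th-century monthly means, used here purely for illustration:

```python
# Illustrative sketch of the anomaly-vs-absolute point. Baselines are
# approximate/hypothetical round figures, not authoritative climatology.

baseline = {"Feb": 12.1, "Jul": 15.8}            # deg C, approximate climatology
anomaly  = {"Feb 2016": 1.2, "Jul 2021": 0.93}   # deg C, anomalies vs baseline

feb_2016 = baseline["Feb"] + anomaly["Feb 2016"]  # ~13.3 C absolute
jul_2021 = baseline["Jul"] + anomaly["Jul 2021"]  # ~16.7 C absolute

# A February with a much larger anomaly is still far cooler in absolute
# terms than a July with a modest one -- which is all "hottest month" means.
print(feb_2016, jul_2021)
```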
The '5 datasets' have changed nothing of significance.
Hamlin is misunderstanding a lot of things here. One thing not mentioned yet in the comment section is his claim that the "95% confidence level (accuracy range)" is ±0.19C for July 2021. He arrives at this by taking the high frequency, low frequency, and bias variances of 0.004934, 0.000002, and 0.004130 respectively, as reported for that month, and transforming them via 2*sqrt(Vh+Vl+Vb). NOAA's variance reporting is different from the other datasets'. Vose et al. 2012 say it best: the error variances, when presented as confidence intervals, are "a broad depiction of historical noise rather than as a precise time series of exact error estimates." This makes sense, since the NOAA reconstruction employs a decadal low-frequency analysis, with the high-frequency component being the residual of the detrended anomalies. That's not to say that the ±0.19C confidence interval is incorrect; it just needs to be interpreted with context. Arguez 2013 has a pretty good summary of how to calculate the probability of the rankings and a more applicable uncertainty estimate for the task at hand. If Hamlin is aware of any of this he isn't letting on in his article.
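For the record, the ±0.19C figure is easy to reproduce from those three variance components:

```python
from math import sqrt

# Reproducing the +/-0.19 C confidence interval from the three variance
# components quoted above for July 2021.

vh, vl, vb = 0.004934, 0.000002, 0.004130  # high-freq, low-freq, bias variances
print(round(2 * sqrt(vh + vl + vb), 2))    # 0.19
```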
You describe variance reporting using high-frequency, low-frequency, and bias components. I'm sorry, but these are not frequency components; they are statistical results derived from periodic functions. The term "variance" is a dead giveaway as to the type of analysis.
I know you want to sound educated, but you are simply a mathematician with an apparent lack of knowledge about time and frequency analysis. That type of analysis would involve trig functions, not probability and variance.
Another giveaway is your reference to "noise". Noise in a signal is extraneous information that is not signal but that, when demodulated, looks like a signal component. Temperature data does NOT have noise components; temperature data IS the signal. What you describe as noise is nothing more than the variance obscuring the "trend" you are looking for. Spoken by a real mathematician, not a physical scientist.
Would you mind explaining NOAA’s reconstruction method and how those variances are calculated and what they mean in your own words?
We are all dumber after this Thread.
If you think about it, the earth "floats" in a vast void and infinite heat sink (space) with an ultimate temperature of 3 kelvin above absolute zero, and the only true heat source, apart from man's measly input, is the sun.
And in 4.5 billion years of existence there is no evidence of a runaway greenhouse effect ever occurring (from the biological stages of the earth through today), which is what many climate scientists fear.
This is despite vast and numerous sources of CO2 over the ages fluctuating atmospheric levels across a wide range.
Taking this into account, isn't it far more likely that the earth will "lean" toward cooling rather than heating? The sun only heats, ignoring the northern and southern parts of the globe (say 10 degrees below both), about 40% of the earth's surface, less clouds; the other 60% is just radiating heat into the cold of space.
As an example, the Sahara desert can reach 40-45 °C during the day but drop to -5 °C at night with no clouds and low humidity.
The Sun’s luminosity increases by 1% every 120 million years (Gough 1981).
It is widely accepted among climate scientists that a runaway greenhouse is not possible on Earth due to the Simpson–Nakajima limit. A moist greenhouse may be technically achievable, but it would require substantial forcing likely beyond what is anthropogenically possible. (Goldblatt 2012).
It is widely accepted among climate scientists that a runaway greenhouse is not possible on Earth
That certainly isn’t the impression that’s given to the general public by the media. I’ve seen concerns from “scientists” expressing exactly that fear “if we don’t fix it now”.
Along with the “tipping point” fearmongering.
I've seen the media and others hype the hypothesis as well. It's pretty clear that the hype comes with a lack of understanding of the radiation limits in effect on Earth. BTW…I meant to say both the Komabayashi–Ingersoll and Simpson–Nakajima limits in my post above. The KI limit is about 75C. I don't know exactly what the SN limit is, but I believe it is a bit lower than that. Even then, I believe those require a completely GHG-saturated atmosphere, which obviously isn't realistic. The evidence seems to suggest that 50C is probably the upper limit even in the most wildly unlikely scenarios, which is obviously nowhere close to a runaway greenhouse effect. If someone knows of another figure for the upper bound on Earth's temperature please post it.
“The evidence seems to suggest that 50C is probably the upper limit even in the most wildly unlikely scenarios which obviously is no where close to a runaway greenhouse effect.”
I think if THAT were what’s being communicated, it would be a lot easier to discuss among the general public. But then, they couldn’t keep people in panic mode, could they?
How much relief do you think the climate “stressed” (as posted here recently) would feel knowing that? How much despair would be relieved?
Some comments noted that the HadCRUT5 measurement system has not yet updated its official data record for July 2021. HadCRUT5 is now two months behind the other 4 global monthly temperature anomaly systems which have reported monthly anomaly data for both July and August 2021.
The HadCRUT5 monthly global temperature anomaly data records for years 2020 and 2021 are shown below.
Each of the HadCRUT5 years 2020 and 2021 starts with January then February, March etc.
2020 1.069, 1.113, 1.094, 1.063, 0.908, 0.825, 0.816, 0.801, 0.867, 0.811, 1.013, 0.693 Average 0.923
2021 0.701, 0.565, 0.726, 0.760, 0.706, 0.712 (January through June)
Every monthly 2021 HadCRUT5 global temperature anomaly is significantly below the corresponding monthly 2020 anomaly, as the recorded data show.
This consistent pattern supports the expectation that the HadCRUT5 July 2021 anomaly will likely also be below the July 2020 value, and even further below the HadCRUT5 peak July global temperature anomaly of 0.857 set in 2019.
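A quick check of the figures quoted above, using the monthly values listed (a minimal sketch in Python):

```python
# Quick check of the HadCRUT5 figures quoted above.

h2020 = [1.069, 1.113, 1.094, 1.063, 0.908, 0.825,
         0.816, 0.801, 0.867, 0.811, 1.013, 0.693]
h2021 = [0.701, 0.565, 0.726, 0.760, 0.706, 0.712]   # Jan-Jun only

print(round(sum(h2020) / len(h2020), 3))  # 0.923, the quoted 2020 average
print(round(sum(h2021) / len(h2021), 3))  # 0.695, the 2021 average to date
print(all(a < b for a, b in zip(h2021, h2020)))  # True: each 2021 month is
                                                 # below the same month in 2020
```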