Top Ten Reasons to Shut Down NASA’s Climate Change Shop Known as GISS

For decades, the NASA Goddard Institute for Space Studies (GISS) has projected itself as a sentinel of Earth’s climate future. But its transformation from a space science institute into a climate policy echo chamber represents a textbook example of mission drift. Founded for planetary studies, GISS has long since abandoned its original purpose and embraced speculative climate modeling and media-driven narratives—often built on data more adjusted than measured.

It’s time to be honest: GISS should be closed.

Here are ten good reasons why.


1. GISS Abandoned Its Original Mission

GISS was established in 1961 to support NASA’s planetary science efforts—specifically analyzing satellite data and studying planetary atmospheres. This made sense during the era of Apollo and planetary exploration. But today, GISS has become a climate modeling hub pushing speculative scenarios about Earth’s future, often far removed from observational reality. The pivot from space to climate was not a logical expansion—it was bureaucratic repurposing to fit political trends, and perhaps to save the organization from irrelevance and funding cuts once the Apollo missions were over.

2. Duplication of Effort and Bureaucratic Bloat

The U.S. and the world already have multiple groups dedicated to tracking Earth’s climate—NOAA’s NCEI, the UK’s Hadley Centre and CRU (producers of HadCRUT), Berkeley Earth, and UAH. GISS’s primary offering, the GISTEMP dataset, merely reprocesses NOAA’s Global Historical Climatology Network (GHCN) data. This is redundancy masquerading as innovation. There’s no compelling justification for maintaining a separate NASA-funded entity to do what others already do—except perhaps to keep a particular narrative alive.

3. They Add a “Special Sauce” to NOAA’s Raw Data

GISS doesn’t collect its own raw temperature data—it relies on NOAA’s GHCN. But then it massages that data using its own proprietary adjustments. These adjustments frequently increase recent temperatures and decrease older temperatures, thereby inflating long-term warming trends. This isn’t transparency; it’s alchemy. When the same data goes through different filters and always comes out “hotter,” we should be asking tough questions.

4. GISS Uses an Old, Outdated Temperature Baseline to Juice the Alarm

One of the lesser-known tricks in GISS’s toolkit is its use of a 1951–1980 baseline to calculate temperature anomalies. This baseline includes some of the coldest decades of the 20th century, particularly the 1970s—a period marked by widespread cooling concerns. By anchoring temperature anomalies to this chilly benchmark, GISS makes today’s anomalies appear artificially warm.

Contrast this with NOAA and the University of Alabama in Huntsville (UAH), which use more recent baselines (like 1991–2020) that better reflect modern climatology. If GISS used the same baseline, their charts wouldn’t look nearly as alarming. Plus, they tend to use “hotter” colors in the global maps they produce. This is a visual sleight of hand—technically correct, but intentionally misleading. And it’s exactly the kind of misrepresentation that undermines public trust in climate science.
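The arithmetic behind the baseline argument is easy to check. Below is a minimal Python sketch using a synthetic temperature series (the numbers are invented for illustration, not GISTEMP data), showing that switching from a 1951–1980 to a 1991–2020 baseline shifts every anomaly by a constant but leaves the warming trend untouched:

```python
# Synthetic illustration (invented numbers, not GISTEMP data): how the
# choice of baseline period shifts anomaly values without changing the trend.

def anomalies(temps, years, base_start, base_end):
    """Anomaly = temperature minus the mean over the baseline years."""
    base = [t for t, y in zip(temps, years) if base_start <= y <= base_end]
    baseline_mean = sum(base) / len(base)
    return [t - baseline_mean for t in temps]

years = list(range(1951, 2021))
# A steady, made-up 0.02 degC/yr rise starting from 14.0 degC in 1951.
temps = [14.0 + 0.02 * (y - 1951) for y in years]

a_cold = anomalies(temps, years, 1951, 1980)  # 1951-1980 baseline (GISS-style)
a_warm = anomalies(temps, years, 1991, 2020)  # 1991-2020 baseline (NOAA/UAH-style)

# Every anomaly shifts by the same constant offset...
offset = a_cold[0] - a_warm[0]
assert all(abs((c - w) - offset) < 1e-9 for c, w in zip(a_cold, a_warm))

# ...so year-to-year changes (the trend) are identical under both baselines.
assert all(
    abs((a_cold[i + 1] - a_cold[i]) - (a_warm[i + 1] - a_warm[i])) < 1e-9
    for i in range(len(years) - 1)
)

print(round(a_cold[-1], 2), round(a_warm[-1], 2), round(offset, 2))
```

In other words, the baseline choice changes where zero sits on the chart, not the slope of the line: the complaint here is about presentation, not about the trend itself.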

5. Opaque Adjustment Processes

The so-called “homogenization” process at GISS is more black box than method. While the code is open source, how, why, and where the data are adjusted is poorly documented and has not been independently replicated. Stations with long, reliable temperature histories are frequently “corrected” in ways that flatten past warmth and enhance recent trends. This isn’t merely correcting data—it’s rewriting it.

6. Contaminated Data from NOAA’s Station Network

GISS uses NOAA’s surface station data, but that data has systemic flaws. My 2009 and 2022 studies of the nation’s weather stations demonstrated that over 90% of NOAA’s stations fail the agency’s own siting standards, typically sitting too close to artificial heat sources like asphalt and air conditioner vents. GISS not only accepts this flawed input but compounds the problem by applying additional adjustments. The result? Garbage in, propaganda out.

7. From Science to Activism: GISS’s Politicized Leadership

Former director Dr. James Hansen infamously turned GISS into a platform for climate activism. His 1988 Senate testimony is often credited with launching the modern climate scare, but even then, he and his sponsor had to amp up the alarm with some heated stagecraft in the Senate hearing room—and his models have missed the mark ever since. Under his tenure and beyond, GISS has increasingly acted as an advocacy shop, with researchers stepping into media roles, climate protests, and policy debates rather than quietly letting the data speak for themselves.

8. Alarmism Masquerading as Science

NASA GISS leads the charge every year in announcing the “hottest year ever,” often based on differences so small they fall within the margin of error. Other datasets—like UAH’s satellite record—don’t always agree, but that doesn’t stop the press releases. What matters to GISS is the headline, not the nuance. That’s not science; that’s marketing. Meanwhile, they remain “baffled” by the record heat of 2023 yet show little interest in finding its cause. They suffer from confirmation bias.

9. Dysfunction, Low Morale, and Disconnection from NASA’s Core Mission

According to a recent CNN report, GISS is in “absolute sh*tshow” mode, with demoralized staff and no clear direction following proposed budget cuts. Even NASA admits it plans to end GISS as a standalone entity. When the agency itself is phasing you out, maybe it’s time to pack up the models and go home.

Meanwhile, space exploration missions are being shelved while GISS continues to siphon off funding. This is a betrayal of NASA’s original charter. The agency should be launching missions to Mars and beyond—not fiddling with spreadsheets to make the 1930s look cooler.

10. The Climate Community Doesn’t Need GISS Anymore

With multiple datasets available—satellite-based, balloon-based, ground-based, international and private—GISS is no longer indispensable. Its role as a check-and-balance in climate science is compromised by its activism, questionable methods, and redundancy. The scientific community would benefit from one less politicized voice distorting the record.

Conclusion: End the Era of GISS Distortion

Shutting down GISS isn’t anti-science. It’s pro-accountability. Even the Inspector General’s office agrees: it identified $1.63 million in questionable GISS expenditures since 2012.

It’s time to retire this Cold War-era artifact of climate modeling. GISS has become a monument to adjustment-driven narrative building. Its adherence to outdated baselines, inscrutable processes, and a relentless pursuit of alarming outcomes betrays its scientific mandate.

NASA should archive the GISS data and then return to what it does best: exploring other worlds, not endlessly reinterpreting data from this one. We need clarity, not overcooked temperatures. If GISTEMP is so important, set up an automated process that ingests GHCN data and publishes the result to a NOAA webpage. The code is open source and written in Python; what can now run on a single high-powered desktop PC does not require an entire government department.
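For illustration, here is a minimal sketch of the kind of automated reduction described above. The station names, values, and CSV layout are invented, and the real GISTEMP code does far more (gridding, homogenization, land/ocean merging); this only shows the core step of area-weighting station anomalies into a single global number:

```python
# Minimal sketch (invented data): read GHCN-style station anomalies,
# area-weight by latitude, emit a global mean. Not the actual GISTEMP code.
import csv
import io
import math

SAMPLE = """station,lat,anomaly_c
STN_A,60.0,1.2
STN_B,30.0,0.8
STN_C,0.0,0.5
STN_D,-45.0,0.6
"""

def global_mean(csv_text):
    """Cosine-of-latitude weighting approximates the area each station 'covers'."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    weights = [math.cos(math.radians(float(r["lat"]))) for r in rows]
    values = [float(r["anomaly_c"]) for r in rows]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

print(f"global mean anomaly: {global_mean(SAMPLE):.3f} degC")
```

A script like this, scheduled against the published GHCN files, is essentially the “automated process” the paragraph above has in mind.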

It’s time to close GISS.



234 Comments
June 12, 2025 10:54 am

Nice shot. The hornets will arrive any moment now…

altipueri
June 12, 2025 10:56 am

Is unadulterated data still available or has it been “lost” so no one can see what happened?

cgh
Reply to  altipueri
June 12, 2025 2:14 pm

Canada had its historic data but has omitted it from the model results. You can read the then-Minister’s excuses here.
GOLDSTEIN: Feds scrapped 100 years of data on climate change | Toronto Sun

Nick Stokes
Reply to  altipueri
June 12, 2025 2:44 pm

Of course it is. It is published as GHCN unadjusted, and is as the national Met Offices reported it.

leefor
Reply to  Nick Stokes
June 12, 2025 8:23 pm

So what is the definition of “the mean”? How many readings per day?

Reply to  leefor
June 13, 2025 5:16 am

It isn’t a mean. It is a mid-range value, i.e. the median. You only need to know the maximum value and the minimum value to find the median. But the median tells you almost nothing when it comes to the temperature profile or the climate.

Sparta Nova 4
Reply to  Tim Gorman
June 13, 2025 7:33 am

The median tells more than the mean.
The median is the point where half the data population is above and half below.
The mean is merely the midpoint between high and low and ignores the profile.

bdgwx
Reply to  Sparta Nova 4
June 13, 2025 8:43 am

The median tells more than the mean.

Maybe. It depends on the context. In some cases a mean is more useful.

BTW…you cannot determine the median just from the maximum and minimum values alone.

Reply to  bdgwx
June 13, 2025 10:06 am

We are discussing temperatures. The value of (Tmax + Tmin)/2 appears to be accepted by climate science as an “average” value when of course it isn’t. It would be if the daily temp profile was a sine wave but it isn’t. For a continuous function it is more of a median.

Sparta Nova 4
Reply to  bdgwx
June 13, 2025 12:03 pm

BTW, I never said one could determine the median just from the maximum and minimum values alone.

Please read before you post.

bdgwx
Reply to  Sparta Nova 4
June 13, 2025 2:11 pm

BTW, I never said one could determine the median just from the maximum and minimum values alone.

And I never said you did. I mention it because Tim Gorman said it and I wanted to make sure you were aware of it, because the subterfuge the Gormans sometimes employ can be very subtle and it has fooled a number of people who have a kneejerk reaction to believe everything they say.

BTW…if the daily temperature profile were truly a sine wave then the mean and median could be the same.

BTW2…for a continuous function the mean value theorem for integrals says that there exists an average of the function that can be multiplied by the domain to get the same answer as the integration. The practicality of this is that for a continuous function of a physical quantity we can use the average as a way of avoiding a full integration. That’s not something you can do with a median. In that respect the mean is more useful (tells you more) than the median.

BTW3…I encourage you to look at the 5 minute METARs of some of the ASOS stations and experiment with which statistical method best represents the average temperature. Is it (Tmin+Tmax)/2 or is it the median? The results may surprise you. Or maybe not. I don’t know.

BTW4…I’m not suggesting you are disagreeing with BTWs 1-3. I’m just pointing them out.

Reply to  bdgwx
June 13, 2025 5:07 pm

For the daily temp profile, day = sine/night = exponential, there is no easy way to determine the “average”. It is *not* (Tmax + Tmin)/2.

If you have 5 min data then do a degree-day sum. Trend those. The average is physically meaningless for an intensive property.
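The day/night profile described above can be sketched with a toy numerical model (half-sine daytime, exponential nighttime decay toward a floor; all parameters are invented for illustration, not measured values), comparing the mid-range (Tmax + Tmin)/2 against the true time-integrated mean:

```python
# Toy day/night temperature model (all parameters invented): half-sine
# daytime warming, exponential nighttime cooling toward a floor. Compares
# the mid-range (Tmax + Tmin)/2 against the time-integrated mean.
import math

T_BASE, AMP, T_FLOOR, TAU = 15.0, 15.0, 5.0, 6.0  # degC, degC, degC, hours

def temp(h):
    """Temperature at hour h of a 24-hour day."""
    if h <= 12.0:  # daytime: half sine, peaking at h = 6
        return T_BASE + AMP * math.sin(math.pi * h / 12.0)
    # nighttime: exponential decay from T_BASE toward T_FLOOR
    return T_FLOOR + (T_BASE - T_FLOOR) * math.exp(-(h - 12.0) / TAU)

n = 240000
samples = [temp(24.0 * (i + 0.5) / n) for i in range(n)]
true_mean = sum(samples) / n                   # time-integrated average
midrange = (max(samples) + min(samples)) / 2   # the (Tmax + Tmin)/2 stand-in

# In this toy profile the mid-range overstates the integrated mean.
assert midrange > true_mean
print(round(true_mean, 2), round(midrange, 2))
```

Whether real station data behaves like this toy profile is exactly what the degree-day suggestion above would test against 5-minute observations.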

Reply to  bdgwx
June 13, 2025 5:47 pm

BTW…if the daily temperature profile were truly a sine wave then the mean and median could be the same.

That isn’t true. The median of a sine occurs at the peak, i.e., at π/2. It is Aₚₑₐₖ. That gives equal area above and below the median. You’ll need to look up a sine probability function.

The mean of a sine is calculated as

0.637Aₚₑₐₖ

(1 / (π – 0)) ∫(0, pi, (sin(x)), x)

You can find this at any electrical site discussing Vmean.

bdgwx
Reply to  Jim Gorman
June 14, 2025 6:31 am

That isn’t true. The median of a sine occurs at the peak, i.e., at π/2. It is Aₚₑₐₖ.

The mean of a sine is calculated as

0.637Aₚₑₐₖ

I’ll give you credit here for responding to something I actually said. That doesn’t happen very often. And as I’ve said before I don’t mind engaging with you as long as you are actually responding to something I’ve said.

But my word…do you have no shame? That’s a serious question. Do you literally have no shame when you make statements like these?

Reply to  bdgwx
June 14, 2025 11:57 am

But my word…do you have no shame?

Nice dance with hand waving included.

Why didn’t you address the issue of Tavg being an incorrect computation.

Tday_mean s/b ~ (0.637 × Tmax)
Tnight_mean s/b ~ (T₀ / e)

bdgwx
Reply to  Jim Gorman
June 14, 2025 3:10 pm

Nice dance with hand waving included.

Hardly. Anyone who has taken even an introductory high school level course involving trigonometry knows that 1) the peak of a sine wave is NOT the median and 2) the mean of a sine wave is NOT 0.637 * peak; at least not generally (it’s only true for a specific partial segment of a sine wave).

Reply to  bdgwx
June 14, 2025 6:14 pm

the peak of a sine wave is NOT the median; the mean of a sine wave is NOT 0.637 * peak;

Tell you what dude, your assertions are incomplete without references to support them.

When discussing Tmax, we are discussing a probability distribution defined as a sine from 0 to π. The center value of a sine whose interval is 0 to π occurs at π/2, i.e., the peak. That is the median, the point where equal values occur both below and above the median.

I showed you the formula for finding the mean of a sine function.

(1 / (π – 0)) ∫(0, pi, (sin(x)), x)

The general formula for the mean of a function over an interval of “a” to “b” is:

[1 ÷ (b – a)] ∫ₐᵇ sin(x) dx

Why don’t you refute the math by showing your own calculation? Just claiming it isn’t, is not much of a proof.



bdgwx
Reply to  Jim Gorman
June 14, 2025 7:35 pm

The general formula for the mean of a function over an interval of “a” to “b” is:[1 ÷ (b – a)] ∫ₐᵇ sin(x) dx

Why don’t you refute the math by showing your own calculation?

Here is a proof by contradiction.

a=0
b=2π
f(x) = sin(x)

1/(b-a) * ∫(a,b,sin(x)dx) = 0

0.637 * max(sin(x) {x:a to b}) = 0.637

Since 0.637 does not equal 0 we have proven that 0.637 * max(sin(x)) != mean(sin(x)).

When discussing Tmax, we are discussing a probability distribution defined as a sine from 0 to π.

I’m responding to your statement “The mean of a sine is calculated as
0.637Aₚₑₐₖ”. If you want to amend your statement to include the constraint that the domain is a=0 and b=π then that’s great. Just understand that your original statement without the constraint is wrong.

The center value of a sine whose interval is 0 to π occurs at π/2, i.e., the peak. That is the median, the point where equal values occur both below and above the median.

That is not what a median is. The median is the value in which half of the values are lower and half are higher. median(sin(x) {x: 0 to π}) is sqrt(2)/2 while max(sin(x) {x:0 to 2π}) = 1. median(sin(x) {x:0 to 2π}) is 0 while max(sin(x) {x: 0 to 2π}) = 1. It is never the case that the median = max for sin(x).

Here are some notable examples of the mean, median, and max for sin(x) over specific domains of x.

For the domain {x:0 to 2π) mean(sin(x)) = 0 and median(sin(x)) = 0 and max(sin(x)) = 1.

For the domain {x:0 to π} then mean(sin(x)) = 0.637 and median(sin(x)) = 0.707 and max(sin(x)) = 1.

For the domain {x:(1/4)π to (3/4)π} then mean(sin(x)) = 0.900 and median(sin(x)) = 0.924 and max(sin(x)) = 1.

For the domain {x:(1/8)π to (3/8)π} then mean(sin(x)) = 0.689 and median(sin(x)) = 0.708 and max(sin(x)) = 0.924.

As you can see the relationship between mean, median, and max of sin(x) is complex.
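For readers who want to check these numbers rather than take either side’s word, a short Python sketch that densely samples sin(x) reproduces the values quoted above for the half-cycle and full-cycle domains:

```python
# Numerical check (dense sampling, pure stdlib) of the mean and median of
# sin(x) over the two domains discussed above.
import math

def mean_median_sin(a, b, n=200001):
    """Sample sin(x) at n evenly spaced points in [a, b]."""
    xs = [a + (b - a) * i / (n - 1) for i in range(n)]
    vals = sorted(math.sin(x) for x in xs)
    return sum(vals) / n, vals[n // 2]

# Half cycle, x in [0, pi]: mean -> 2/pi ~ 0.637, median -> sqrt(2)/2 ~ 0.707
m_half, med_half = mean_median_sin(0.0, math.pi)
assert abs(m_half - 2 / math.pi) < 1e-3
assert abs(med_half - math.sqrt(2) / 2) < 1e-3

# Full cycle, x in [0, 2*pi]: mean -> 0 and median -> 0
m_full, med_full = mean_median_sin(0.0, 2 * math.pi)
assert abs(m_full) < 1e-3
assert abs(med_full) < 1e-3
```

This bears out the half-cycle figures (0.637 and 0.707) while confirming that both collapse to zero over a full cycle, consistent with the table above.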

Reply to  bdgwx
June 15, 2025 6:40 am

Here is a proof by contradiction.

a=0

b=2π

f(x) = sin(x)

1/(b-a) * ∫(a,b,sin(x)dx) = 0

As I showed you in a previous post, the correct interval is from 0 (zero) to π. That is one half of a complete cycle. In other words, the sun comes up, then the sun goes down.

Here is a screen capture from Symbolab.


For the domain {x:0 to 2π) mean(sin(x)) = 0 and median(sin(x)) = 0 and max(sin(x)) = 1.

Of course this is true, but we are not dealing with a complete sine cycle are we. We are dealing with one-half of a complete cycle, i.e., 0 to π.

For the domain {x:0 to π} then mean(sin(x)) = 0.637 and median(sin(x)) = 0.707 and max(sin(x)) = 1.

The median of sin(x) is NOT 0.707. 0.707 is the RMS (Root Mean Square) value of a sine. That is the equivalent value of a DC voltage across a load. A simple examination of a half cycle of a sine will show one that (0.707 x Vₚₑₐₖ) does not have the same area above the 0.707 value as below the 0.707 value.

Here is a document discussing the mean and median of functions. You should read it carefully. It says:

One simple result of equally dividing the sets of function values above and below t is that, for any f monotonic on [a, b],

fmed = f((a+b) / 2).

When the interval is again a = 0 to b = π, then

fmed = sin((0 + π) / 2) = 1.

This shows that the median is at the peak value of one-half cycle of the sine, i.e., π/2.

Reply to  Jim Gorman
June 15, 2025 7:33 am

This misunderstanding of very basic definite integration is some of the most bizarre stuff he’s ever posted.

bdgwx
Reply to  karlomonte
June 15, 2025 11:21 am

This misunderstanding of very basic definite integration is some of the most bizarre stuff he’s ever posted.

Make sure you tell the whole world that bdgwx has the gall to challenge the Gormans’ claims that mean(sin(x)) = 0.637 * max(sin(x)) and that median(sin(x)) = max(sin(x)).

I want this spread far and wide. Use whatever colorful language you feel is best. I just don’t want it to ever be said that I support these absurd claims.

Reply to  bdgwx
June 15, 2025 4:16 pm

And while they are doing that also spread the news that bdgwx thinks sunrise to sunset constitutes a full cycle of a sine wave!

bdgwx
Reply to  Tim Gorman
June 15, 2025 6:05 pm

And while they are doing that also spread the news that bdgwx thinks sunrise to sunset constitutes a full cycle of a sine wave!

This is the first I’m hearing about the sunrise to sunset temperature pattern being a full sine wave cycle. That is definitely something I don’t want to be associated with.

Reply to  bdgwx
June 16, 2025 3:49 am

You said: “There is no “correct” interval for a sine wave. And the interval from 0 to π is only half of a single wave cycle anyway. One complete cycle is from 0 to 2π.”

The discussion is about the average value of daytime temperatures. It’s been that from the beginning in this sub-thread.

When you interjected that the integration of daytime temps should go from 0 to 2π, that means you think that the insolation from the sun during the day is a full sine wave.

Like I said, you seemingly can’t relate to the physical world.

bdgwx
Reply to  Tim Gorman
June 16, 2025 8:02 am

When you interjected that the integration of daytime temps should go from 0 to 2Π

I didn’t say that.

What I said was a challenge to your statement “The median of a sine occurs at the peak, i.e., at π/2. It is Aₚₑₐₖ. The mean of a sine is calculated as 0.637Aₚₑₐₖ”

And I stand behind this challenge because there are, in fact, two incorrect claims in this statement.

The median of sine does NOT occur at π/2 and it is NOT Aₚₑₐₖ.

The mean of a sine wave is NOT calculated as 0.637Aₚₑₐₖ.

Reply to  bdgwx
June 16, 2025 11:05 am

You didn’t say:

“BTW…if the daily temperature profile were truly a sine wave then the mean and median could be the same.”

“There is no “correct” interval for a sine wave. And the interval from 0 to π is only half of a single wave cycle anyway. One complete cycle is from 0 to 2π.”

“You can’t just say that mean(sin(x)) = 0.637 * max(sin(x)) because that’s not right. Well…it’s not right generally. It only works for the specific case when the domain is x:{0 to π}.”

“And I stand behind this challenge because there are, in fact, two incorrect claims in this statement.” (referencing the claim that the average value of daytime temps is .637Tmax and the median is Tmax)

You tried to hijack the discussion. You are still trying.

Reply to  bdgwx
June 15, 2025 6:23 pm

Do you not understand that under clear skies the daily insolation profile is a half-cycle sinusoid?

Reply to  karlomonte
June 16, 2025 11:05 am

He either has no understanding of reality or he’s deliberately trying to hijack the discussion.

Reply to  Tim Gorman
June 16, 2025 11:34 am

Yes.

bdgwx
Reply to  Jim Gorman
June 15, 2025 7:49 am

As I showed you in a previous post, the correct interval is from 0 (zero) to π.

There is no “correct” interval for a sine wave. And the interval from 0 to π is only half of a single wave cycle anyway. One complete cycle is from 0 to 2π.

The median of sin(x) is NOT 0.707.

For the interval 0 to π It most certainly is.

0.707 is the RMS (Root Mean Square) value of a sine.

Yes. I know. It is also interesting to point out that the RMS of sin(x) over the domains x:{0 to (1/2)π}, x:{0 to π}, x:{0 to (3/2)π}, and x:{0 to 2π} are all 0.707.

One simple result of equally dividing the sets of function values above and below t is that, for any f monotonic on [ab]

fmed = f((a+b) / 2). 

That formula only works for monotonic functions. f(x) = sin(x) is not monotonic over the domain x:{0 to π} so this formula doesn’t work. Of course your source is careful to point out the monotonic constraint.

You should read it carefully

Don’t think the irony of you lecturing me about reading something carefully when you ignored an incredibly important clause in your source went unnoticed.

The shows that the median is at the peak value of one-half cycle of the sine, i.e., π/2.

Like I said above your formula does not work because f(x) = sin(x) is not monotonic over x:{0 to π}.

And BTW…this doesn’t even pass the sniff test anyway. Do you really think the median of sin(x) from 0 to π is 1…ya know…the value that represents the max or the right hand most value in the distribution? Really?


bdgwx
Reply to  bdgwx
June 15, 2025 7:51 am

And here is the distribution for sin(x) over the domain x:{0 to 2π}.

As can be clearly seen the median is NOT the peak of 1, but instead the middle value 0.


bdgwx
Reply to  bdgwx
June 15, 2025 8:33 am

Let’s discuss the median of sin(x) some more. As JG’s source says the formula M = f((a+b)/2) works when f(x) is monotonic. Let’s examine the monotonic segments of f(x) = sin(x) over x:{0 to π}.

The 1st monotonic segment is from x:{0 to π/2}. That gives us M = f((0+π/2)/2) = f(π/4) = 0.707 for this segment.

The 2nd monotonic segment is from x:{π/2 to π}. That gives us M = f((π/2+π)/2) = f(3π/4) = 0.707 for this segment.

Both segments separately have a median of 0.707 and since our segments have the same domain size (π/2) that means the median of the whole domain x:{0 to π} is also 0.707.

bdgwx
Reply to  bdgwx
June 15, 2025 11:57 am

More fun with the mean and median of sin(x)…

This conversation has piqued my interest in the behavior of the mean and median of the sine wave. I decided to plot the ratio of mean(f(x)) / median(f(x)) where f(x) = sin(x) with the domain being x:{0 to a} as a approaches 2π to see how they relate to each other.


And here is the plot zoomed in to the domain x:{0 to π}.


Reply to  bdgwx
June 15, 2025 4:19 pm

Daytime temps represent 0 to pi, not 0 to 2pi.

bdgwx
Reply to  Tim Gorman
June 15, 2025 6:19 pm

Daytime temps represent 0 to pi, not 0 to 2pi.

I don’t think that is an unreasonable way to model temperature. But if you’re going to use sine to model temperatures you’re going to have to use formulas for mean(sin(x)) and median(sin(x)) that are actually correct.

You can’t just say that mean(sin(x)) = 0.637 * max(sin(x)) because that’s not right. Well…it’s not right generally. It only works for the specific case when the domain is x:{0 to π}. And the formula he gave for median(sin(x)) is never right, not even for a specific case.

Reply to  bdgwx
June 15, 2025 8:23 am

“There is no ‘correct’ interval for a sine wave. And the interval from 0 to π is only half of a single wave cycle anyway. One complete cycle is from 0 to 2π.”

As Jim keeps telling you the daytime temperature curve is *NOT* a full cycle of a sine wave.

The sun comes up == 0 radians
The sun is overhead == π/2 radians
The sun sets == π radians

I.e. *HALF* of a full cycle.

WHY DO YOU INSIST ON IGNORING PHYSICAL REALITY?

That seems to be a common malady with climate science.

bdgwx
Reply to  Tim Gorman
June 15, 2025 11:14 am

As Jim keeps telling you the daytime temperature curve is *NOT* a full cycle of a sine wave.

I never said the daytime temperature curve was a full cycle of a sine wave.

WHY DO YOU INSIST ON IGNORING PHYSICAL REALITY?

Let me get this straight…your brother said mean(sin(x)) = 0.637 * max(sin(x)) and that median(sin(x)) = max(sin(x)) and I’m the one ignoring physical reality? Seriously?

Reply to  bdgwx
June 15, 2025 4:13 pm

He said the mean of a half cycle is 0.637Tmax. That is correct.

He said the median of a half cycle is Tmax. That is also correct.

You then went off on a tangent talking about a *full cycle*. A red herring non sequitur.

And now you are just babbling about your red herring.

You were asked to show the math for the average of a half cycle of a sine wave. And you gave a total fail. You apparently still think sunrise to sunset is a full cycle of a sine wave. Typical climate science garbage.

bdgwx
Reply to  Tim Gorman
June 15, 2025 6:14 pm

He said the mean of a half cycle is 0.637Tmax. That is correct.

This is what he said…

“The mean of a sine is calculated as 0.637Aₚₑₐₖ”

That is not correct.

As I said above, if he just misspoke then it would be really easy for him to say “my bad, that was a typo, I meant for the interval 0 to π.” And it would have been no big deal. I typo all of the time.

He said the median of a half cycle is Tmax. That is also correct.

Yeah. He did say that. The exact statement was “The median of a sine occurs at the peak, i.e., at π/2. It is Aₚₑₐₖ.”

That is not correct.

And based on the number of times it was said I doubt this was a simple typo.

Reply to  bdgwx
June 16, 2025 3:52 am

Again, he made it CLEAR that he was talking about the temperature profile of earth, not some blackboard full sine wave cycle.

Your reading comprehension skills are atrocious. That seems to be endemic with “climate scientists”.

The only apology should come from you for being unable to understand simple English text.

bdgwx
Reply to  Tim Gorman
June 16, 2025 7:57 am

Again, he made it CLEAR that he was talking about the temperature profile of earth, not some blackboard full sine wave cycle.

I didn’t think so. The ironic thing is that when I read his statement “The mean of a sine is calculated as 0.637Aₚₑₐₖ” my first thought was that he was probably talking about the specific case where the domain was x:{0 to π}. But then he said “The median of a sine occurs at the peak, i.e., at π/2. It is Aₚₑₐₖ.” and this made me question his entire understanding of what a median is. I then remembered all of the other times you guys conflated sums with averages and I suddenly wasn’t so sure that he was thinking of only the domain x:{0 to π} for the mean.

The only apology should come from you for being unable to understand simple English text.

Oh…so it’s my fault Jim provided incorrect formulas for both the mean and median because I’m “unable to understand simple English text”. That’s some grade A gaslighting right there.

Reply to  bdgwx
June 16, 2025 10:54 am

This entire forum is about TEMPERATURE OF THE EARTH’S BIOSPHERE.

The average value of the daytime temperature (assuming a sine wave insolation) *is* .637Tmax. The median *is* Tmax.

You can argue till you are blue in the face, it won’t change reality.

*YOU* are the one that sidetracked the discussion by offering up the red herring of a “full sine wave cycle”. Either you don’t understand reality or you were trying to deliberately confuse the issue. You need to apologize for one or the other – pick one.

bdgwx
Reply to  Tim Gorman
June 16, 2025 1:16 pm

You need to apologize for one or the other – pick one.

So now you want me to apologize for showing that the formulas Jim gave for mean(sin(x)) and median(sin(x)) were wrong?

Not going to happen. His formulas were wrong. That is indisputable and unequivocal. I’m not going to apologize for explaining why they are wrong.

You can argue till you are blue in the face, it won’t change reality.

Yeah. You’ve made that abundantly clear given this discussion and our previous discussions of your other math mistakes over the years. I guess I keep having these discussions that go nowhere with you guys because I naively think one day the epiphany will finally hit.

Reply to  bdgwx
June 17, 2025 5:38 am

You are *still* refusing to admit that you got the integration limits wrong because you didn’t understand that daytime temps are only a half-cycle of a sine wave.

The reality *is* that the average value of a half-cycle of a sine wave is .637Tmax and the median of a half-cycle of a sine wave is Tmax.

You can’t even seem to understand that it is root-mean-square that gives .707 values. My guess is that you can’t even figure why that is.

The mid-range value overstates both the average of the daytime and nighttime average temps and the average of the daytime and nighttime RMS temps. That *should* be a clue to climate science that it is using an obsolete metric for daily “average” temperature.

Reply to  Tim Gorman
June 17, 2025 6:05 am

And he is now pathetically trying to invoke the GUM to justify the Fake Data practices of climatology, which are nowhere in the document.

He still cannot (or refuses to) understand that air temperature adjustments requires knowledge of true values — the reality is they are nothing but guesses and gut feelings.

Reply to  bdgwx
June 16, 2025 6:18 am

Yeah. He did say that. The exact statement was “The median of a sine occurs at the peak, i.e., at π/2. It is Aₚₑₐₖ.”

I see I forgot to include my reference. It is here.

Wallin.pdf

This is from a course at Whitman College. Read Section 3 about using measures. It basically defines the median as the point where the area under the curve is similar both above and below the median.

The integral from 0 to π/2 of a sine is equal to 1. The integral from π/2 to π of a sine is equal to 1. Consequently, the median of a sine from 0 to π is π/2 since that point has equal areas below and above that point as I said from the start.

bdgwx
Reply to  Jim Gorman
June 16, 2025 9:12 am

It basically defines the median as the point where the area under the curve is similar both above and below the median.

It absolutely does NOT say that.

Here is what it actually says.

Theorem 3: t= tm = fmed is the unique value that minimizes A(t) where A(t) = integral[abs(f(x) – t) dx, a, b].

Try it out. Let f(x) = sin(x) so A(t) = integral[abs(sin(x) – t) dx, a, b].

When a=0, b=π, and t=1 then A(t) = 1.1415.

When a=0, b=π, and t=0.707 then A(t) = 0.8284

Note that t=1 occurs at f(π/2) and t=0.707 occurs at f(π/4).

So the median t=0.707 of sin(x) over the domain x:{0 to π} occurs at x=π/4.

I’m glad you mentioned this source because it actually contains the theorem I used to double-check t = 0.707 as being the median for sin(x) over x:{0 to π}. It’s also how I created the mean/median ratio plots above.

Consequently, the median of a sine from 0 to π is π/2 since that point has equal areas below and above that point as I said from the start.

As I said the point separating two intervals whose integrals are the same is NOT where the median occurs.

And as I said this doesn’t even pass an intuitive sniff test since f(π/2) = 1 when f(x) = sin(x). That is max(sin(x)). max(sin(x)) never equals median(sin(x)) over any definite interval. And that fact should actually be so intuitive that anyone who has taken an introductory course on trigonometry would understand.
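The area function from Theorem 3 can also be checked numerically. This sketch evaluates A(t) over [0, π] with a midpoint rule and confirms the two A(t) values quoted earlier in the thread, and that t ≈ 0.707 beats t = 1:

```python
# Numerical check of the area function A(t) = integral over [0, pi] of
# |sin(x) - t| dx from Theorem 3, using a midpoint rule.
import math

def area(t, n=100000):
    """Midpoint-rule approximation of the integral of |sin(x) - t| on [0, pi]."""
    a, b = 0.0, math.pi
    h = (b - a) / n
    return sum(abs(math.sin(a + (i + 0.5) * h) - t) for i in range(n)) * h

t_med = math.sqrt(2) / 2  # ~0.707, the claimed median

assert abs(area(1.0) - (math.pi - 2)) < 1e-3   # ~1.1416, as quoted in the thread
assert abs(area(t_med) - 0.8284) < 1e-3        # as quoted in the thread

# t_med minimizes A(t): nearby values and the peak value all do worse.
assert area(t_med) < area(t_med + 0.05)
assert area(t_med) < area(t_med - 0.05)
assert area(t_med) < area(1.0)
```

Since the minimizer of A(t) is the median of the function’s values, this supports 0.707 over the peak value 1 for the half-cycle domain.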

Reply to  bdgwx
June 16, 2025 2:07 pm

As I said the point separating two intervals whose integrals are the same is NOT where the median occurs.

From https://www.whitman.edu/documents/academics/majors/mathematics/2016/Wallin.pdf

Now, for f defined on [a, b], define the sets

Above(t) = f⁻¹((t, ∞)) = {x ∈ (a, b) | f(x) > t}

Below(t) = f⁻¹((−∞, t)) = {x ∈ (a, b) | f(x) < t}.

In the example of the sine function, Above(t) is the two intervals identified by the orange lines. When we discuss the set of x values for which f(x) ≥ t, we are discussing the set Above(t).

Theorem 3. The median value fmed exists for any continuous function f on a bounded interval. Furthermore, if f is absolutely integrable on the interval, then t = tm = fmed is the unique parameter value that minimizes the area function A(t).

This absolute value function shows that, if t ≠ fmed, moving t closer to fmed decreases the integral A(t) = ∫ₐᵇ |f(x) − t| dx by subtracting more area than it adds. The horizontal, red line is y = t, the blue section of the x axis is the set Below(t), and the orange section of the x axis is the set Above(t). The dark shaded regions directly above and below y = t represent the area removed from A(t) by increasing or decreasing t by a small increment, respectively. Note that, until Above(t) and Below(t) each represent half the interval [a, b], we can continue to decrease A(t), so A(t) is not at a minimum.

You’ll need to be able to use Geogebra to set this up as the links no longer work.

As I said earlier, this whole discussion is mathterbation of the worst kind. Tavg, Tmidpoint, and Tmedian are worthless measures. These calculations provide no insight as to what is occurring either globally, regionally, or locally.

That is what you need to be addressing!

bdgwx
Reply to  Jim Gorman
June 15, 2025 8:38 am

Here is a screen capture from Symbolab.

I missed this nugget. Over here your brother told me the CASes I use (Symbolab being one of them) were wrong.

Reply to  bdgwx
June 16, 2025 5:36 am

CASes I use (Symbolab being one of them) were wrong.

Sitting here this morning going through all the posts about mean, median, and mode, it’s clear this is basically dancing around what is important. All this caterwauling about the math of a sine wave is very misplaced.

Tavg, Tmean, Tmedian, Tmidpoint all miss the mark about discussing temperature profiles. Why? Because any kind of central moment calculation lumps everything together into one common number that tells you nothing about what is actually occurring with the temperature profile.

From the day I started reading this site, one of my questions was, if the globe is warming, exactly what was warming. Daytime temps, nighttime temps, both temps, where, when, what rate? Simply looking at a single number tells one absolutely nothing about how the number was arrived at and how uncertain it is.

The real issue should be what are daytime temperatures doing and what are nighttime temperatures doing. Using an average of these two does not allow one to make an accurate conclusion of what is actually occurring because they have different profiles. For some reason, climate science just keeps on trucking using a central moment that quite honestly tells you nothing.

If you want to know what Tmax is doing, then look at Tmax. If you want to know what Tmin is doing, then look at Tmin. Anything else is just beating around the bush hoping something shows up. Changing past temperatures so one can create a long record out of thin air, homogenizing so one can have a less “noisy” signal, and infilling areas with no measurements simply create nothing of value to anyone.

bdgwx
Reply to  Jim Gorman
June 16, 2025 3:28 pm

Let’s review how this ridiculous discussion unfolded, in chronological order.

Tim: You only need to know the maximum value and the minimum value to find the median.

This is algebra mistake #1 in this article. The median cannot be determined from min and max alone. I tell Sparta Nova this.

Tim: The value of (Tmax + Tmin)/2 appears to be accepted by climate science as an *average* value when of course it isn’t. It would be if the daily temp profile was a sine wave but it isn’t.

Here is the first reference to the full domain of a sine wave x:{0 to 2π} since mean(sin(x)) = (max(sin(x)) + min(sin(x))) / 2 only over the domain x:{0 to 2π}. I tell Sparta Nova that if the daily temp profile was a sine wave (Tim’s words; not mine) then the mean and median would be the same.

Tim: For the daily temp profile, day = sine/night = exponential

Possibly understanding how my statements to Sparta Nova undermine his claim, Tim moves the goal post from the daily temperature profile being a sine wave to only the day portion being a sine wave and the night portion being an exponential decay.

Jim: The median of a sine occurs at the peak, i.e., at π/2. It is Aₚₑₐₖ. That gives equal area value above and below the median. You’ll need to look up a sine probability function.

This was said when the context was still that of the daily temperature profile being a sine wave. Remember, Tim originally said (Tmax+Tmin)/2 would be the average if the daily temperature profile were a sine wave which only occurs over the domain x:{0 to 2π}…a full cycle.

This is also the start of algebra mistakes #2 and #3, because median(sin(x)) ≠ max(sin(x)) and mean(sin(x)) ≠ 0.637 * max(sin(x)) in general.
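The two special cases being argued over can be checked directly. A minimal sketch (Python with NumPy; uniform sampling of x stands in for the continuous domain, and the sample count is arbitrary) comparing the mean and median of sin(x) over a half cycle versus a full cycle:

```python
import numpy as np

def mean_and_median(a, b, n=100_001):
    # Sample x uniformly on [a, b]; the median of the y-values is the level
    # that splits the domain 50/50, i.e. the median of the function.
    y = np.sin(np.linspace(a, b, n))
    return y.mean(), np.median(y)

half = mean_and_median(0, np.pi)       # half cycle, x:{0 to π}
full = mean_and_median(0, 2 * np.pi)   # full cycle, x:{0 to 2π}

print(half)  # mean ≈ 0.637 (= 2/π), median ≈ 0.707 (= sin(π/4)); neither equals max = 1
print(full)  # mean ≈ 0, median ≈ 0; here mean = median = (max + min)/2
```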

This is when the gaslighting starts as well. From here Tim shifts the narrative, from him originally referencing the 0 to 2π interval (because of the [Tmax+Tmin]/2 statement), onto me, possibly because I’ve just called out Jim for mistakes #2 and #3. Meanwhile Jim is still vigorously defending mistake #3. All the while I’ve decided to let mistake #1 die even though it is pretty egregious as well.

And still not a single acknowledgement of mistakes #1, #2, or #3. There wasn’t even a “it was a typo” explanation. So if Tim and Jim realize the mistakes then they are doing everything they possibly can to convince me otherwise.

Reply to  bdgwx
June 17, 2025 6:51 am

“This is algebra mistake #1 in this article. The median cannot be determined from min and max alone. I tell Sparta Nova this.”

Again, you are ignoring the fact that we are discussing TEMPERATURE IN THE BIOSPHERE, specifically daytime temperatures!

This is your red herring #1!

“Here is the first reference to the full domain of a sine wave x:{0 to 2π}”

Again, you are ignoring the fact that the discussion is about TEMPERATURES IN THE BIOSPHERE, specifically daytime temperatures.

This is your red herring #2.

“Tim moves the goal post from the daily temperature profile being a sine wave to only the day portion being a sine wave and the night portion being an exponential decay.”

There is no goal post moving on my part. The daytime temp is a sinusoid and it is 1/2 of a full cycle. Again, you are ignoring the fact that the discussion is about TEMPERATURES IN THE BIOSPHERE, specifically daytime temperatures.

This is your red herring #3.

“This was said when the context was still that of the daily temperature profile being a sine wave”

No, this was *YOUR* lack of understanding that only the daytime temperature is a sinusoid, and only 1/2 of a full cycle!

You started off not understanding that the daytime temp is 1/2 of a sine wave and that nighttime temperature is a decaying exponential. Typical of climate science!

You are *STILL* trying to sluff off that you started out with either a total lack of understanding of the daily temperature profile or with a conscious attempt to hijack the sub-thread.

You *still* can’t accept that the median daytime temp is Tmax. You *still* can’t accept that the average daytime temp is .637Tmax. You *still* can’t accept that the mid-range temperature is *NOT* the average daily temperature.

You are a troll, pure and plain.

bdgwx
Reply to  Tim Gorman
June 17, 2025 9:35 am

Again, you are ignoring the fact that we are discussing TEMPERATURE IN THE BIOSPHERE, specifically daytime temperatures!

The discussion you started was of the daily temperature profile including Tmin and Tmax being a sine wave. This occurred at June 13, 2025 10:06 am.

There is no goal post moving on my part.

The goal post at June 13, 2025 10:06 am was the daily temperature profile including Tmax and Tmin being a sine wave.

This changed at June 13, 2025 5:07 pm to only the daytime temperature being a sine wave.

No, this was *YOUR* lack of understanding that only the daytime temperature is a sinusoid, and only 1/2 of a full cycle!

My post saying that the mean and median would be the same if the daily temperature profile were a sine wave occurred at June 13, 2025 2:11 pm. Note that 2:11pm is between 10:06am and 5:07pm and thus occurs before you moved the goal post.

You started off not understanding that the daytime temp is 1/2 of a sine wave and that nighttime temperature is a decaying exponential.

There is no misunderstanding on my part. As I said, modeling the daytime profile as a 1/2 sine wave and the nighttime profile as an exponential decay isn’t unreasonable.

You *still* can’t accept that the median daytime temp is Tmax.

Correct. And I will never accept that because it is an absurdly stupid claim.

I mean think about it. Do you really think that 50% of the daytime temperatures are above Tmax? Really?

You *still* can’t accept that the average daytime temp is .637Tmax

As I’ve already said, I accept the special case in which mean(sin(x)) = 0.637*max(sin(x)) over the domain x:{0 to π}. What I will never accept is that this formula is correct generally, which is what I challenged. I stand by my challenge.

You *still* can’t accept that the mid-range temperature is *NOT* the average daily temperature.

Again…that depends. If you truly believe your statement “The value of (Tmax + Tmin)/2 appears to be accepted by climate science as an *average* value when of course it isn’t. It would be if the daily temp profile was a sine wave but it isn’t.” then it is absolutely true that the median and mean are the same. Note that I’m not endorsing your statement that the daily temperature profile is a sine wave. I’m just stating the consequence of the statement.

If you want to know my position on how things really are, that’s easy. I’ll just tell you. I don’t think the mean and median of the real daily temperature profile are the same. In fact, this is why I told Sparta Nova to do the experiment so that he would understand that there is a difference.

Reply to  Nick Stokes
June 13, 2025 3:15 am

“Of course it (unadulterated data) is. It is published as GHCN unadjusted,”

This is a blatant lie.

GHCN data is not original temperature data. It is adjusted to fit the CO2-Crisis narrative.

The original, historic temperature data shows that GHCN and all the rest of the bastardized databases are BIG LIES, meant to scare the public into submission.

I found the Physical Review Letters website. The one where NOAA did a study in 1987 that said it is no warmer today than it was in the Early Twentieth Century, which itself debunks all these Temperature Data Mannipulators’ claims about the historic temperature records.

I’ll have to look through a lot of webpages to find the particular article, and when I find it, I will post a link to it every time someone claims GHCN is unadjusted data.

Original, historic temperature profiles look NOTHING like the bogus, bastardized temperature profile of the Dishonest Hockey Stick global chart.

The only temperature data the Data Mannipulators had to work with was the regional surface temperature charts from around the world, all of which show NO Hockey Stick temperature profile.

So how do you get a scary Hockey Stick profile out of temperature data that has no Hockey Stick profile? Answer: You Cheat and Lie.

Nick continues this lie.

Reply to  Tom Abbott
June 13, 2025 7:04 am

And averages of mid-point air temperatures and non-existent virtual ocean data are not “the climate.”

bdgwx
Reply to  altipueri
June 12, 2025 4:18 pm

Yes. It is available here.

Reply to  bdgwx
June 13, 2025 3:17 am

What would you guys do without bogus temperature data? Answer: You would be OUT OF BUSINESS!

bdgwx
Reply to  Tom Abbott
June 13, 2025 4:26 am

Let’s assume for the moment that the entire world is in on NOAA’s grand conspiracy to alter the data even in the qcu (unadjusted) file and that this conspiracy has been hidden the whole time just from you. How is it that you are always claiming that the global average temperature has not increased?

cgh
Reply to  bdgwx
June 13, 2025 7:12 am

The entire world is not necessary. This is a rather small group largely within American think-tanks and assorted universities. Since this relatively tiny group reviews all of each other’s work and provides most of the input into IPCC assessment reports, it’s rather easy to coordinate all this and give it a veneer of widespread acceptance.

The problem for you is that no one believes that “man behind the curtain” any more. The Climategate Emails over a decade ago laid out, by the conspirators in their own words, how they would subvert and corrupt whatever scientific institutions and publications they had not already converted to the cause.

bdgwx
Reply to  cgh
June 13, 2025 8:32 am

This is a rather small group largely within American think-tanks and assorted universities.

We’re talking about tens of thousands of land stations. Each one is maintained by different people. I bet there are at least hundreds of thousands if not over a million people involved just in reporting the observations. And this doesn’t count the ocean observations or the processing of the data by many other people with a wide variety of interests. When it is all said and done there have to be millions or tens of millions of people at least in on the conspiracy spanning hundreds of years.

How is it possible that not a single person spilled the beans on the conspiracy?

Why would the original conspirators back in the 1800’s or maybe even 1700’s even think to alter their data to begin with? What was their motivation?

I can tell you that I’ve personally never been approached to join the conspiracy. I wonder what the selection criteria are and why they didn’t approach me.

cgh
Reply to  bdgwx
June 13, 2025 10:25 am

“I can tell you that I’ve personally never been approached to join the conspiracy. I wonder what the selection criteria are and why they didn’t approach me.”

Not a new phenomenon. Vladimir Lenin described most people trapped by an ideology as “useful idiots.” That you were not approached is merely an indication that you weren’t either necessary or needed by them.

And they did spill the beans. The Climategate Emails are filled with them confessing their own actions to distort and misrepresent science and to manipulate scientific publications, peer review and societies.

Your statement about thousands of land based stations is irrelevant. The data is systematically altered by GISS by methodology which Schmidt & Co. conceal. It doesn’t matter how many stations are included; the output result is still garbage because of the alterations.

bdgwx
Reply to  cgh
June 13, 2025 11:32 am

Maybe I’m too skeptical to be solicited. Or maybe I already am in on the conspiracy and I’m using reverse psychology on you to deflect attention away from me. /s

Reply to  bdgwx
June 15, 2025 7:11 am

Increased since when? Based on what data? The whole world has not been measuring temperature for very long, nor has that measurement been consistent or reliable for any significant period of time, such that any conclusions can be reached. Averaging such a thing is a nonsense exercise.

It can be reasonably asserted that there has been a general warming since the coolest period of the last 10,000 years, the Little Ice Age. Other than that, it is all noise.

bdgwx
Reply to  Mark Whitney
June 15, 2025 8:55 am

I’m not sure I understand what you are asking. What has increased? Averaging what is nonsense?

Reply to  bdgwx
June 15, 2025 5:27 pm

Averaging unsuitable data over selective time frames. I am asking what you are insisting the significance of such an exercise is. What meaning do you actually think it has?

Reply to  Mark Whitney
June 16, 2025 4:04 am

It has no meaning. It’s based on several assumptions:

  1. You can average an intensive property like temperature.
  2. Averaging can increase resolution and decrease measurement uncertainty.
  3. Temperature is a perfect proxy for heat.
  4. You can ignore the different variances of random variables when adding them to calculate an average instead of weighting the components based on their variances.
  5. All measurement uncertainty is random, Gaussian, and cancels.
  6. The sampling error (i.e. the SEM) is the measurement uncertainty of the average.
  7. Numbers is just numbers, including measurement uncertainty, so you can calculate a measurement average out to as many decimal places as you want without worrying about resolution or significant digits.

I could go on. But I’ll end with an example of climate science thinking – the temperature at the peak of a mountain is the average of a measurement station on the east side of the mountain well below the peak and a measurement station on the west side of the mountain well below the peak. Think about it.
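The distinction that point 5 in the list above turns on, random error averaging down while systematic error does not, is easy to demonstrate. A toy Monte Carlo sketch (Python; the 20 °C true value, 0.3 °C offset, and 0.5 °C noise figures are all invented for illustration):

```python
import random

random.seed(42)
N = 10_000
true_temp = 20.0  # hypothetical true value, °C

# Zero-mean Gaussian reading error: its average shrinks toward zero as N grows.
readings = [true_temp + random.gauss(0.0, 0.5) for _ in range(N)]

# A systematic offset (e.g. a miscalibrated sensor) shifts every reading
# identically, so no amount of averaging removes it.
biased = [r + 0.3 for r in readings]

print(sum(readings) / N)  # ≈ 20.0: the random error largely cancels
print(sum(biased) / N)    # ≈ 20.3: the 0.3 °C systematic error survives intact
```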

Reply to  Tim Gorman
June 16, 2025 4:11 am

Indeed. I don’t think bdgwx understands such nuance and keeps clinging to the Frisbee like a dog-ma.

bdgwx
Reply to  Mark Whitney
June 16, 2025 7:47 am

What is it specifically you think I don’t understand?

Reply to  bdgwx
June 16, 2025 10:36 am

That daytime is from sunrise to sunset. Either that or you think sunrise to sunset consists of a full cycle of a sine wave.

Pick one. It shows you don’t understand reality at all.

bdgwx
Reply to  Mark Whitney
June 16, 2025 7:49 am

This subthread is about Tom’s claim that all temperature data is bogus.

What data do you think is unsuitable?

What is your concern with averaging?

Are you asking me what meaning I think temperature data has?

Reply to  bdgwx
June 16, 2025 10:41 am
  1. Temperature is an intensive property.
  2. Temperature is *not* a good proxy for heat.
  3. Temperatures in the NH and SH have different variances.
  4. The temperature at the peak of a mountain is *NOT* the average of a measuring station on the east side of the mountain and one on the west side of the mountain.
  5. Systematic measurement uncertainty cannot be extrapolated back in time.

The data has obviously been manipulated. That alone makes it unsuitable. The facts above show that it is not fit for the purpose it is being used for.

Reply to  bdgwx
June 17, 2025 4:26 am

I think you need to do quite a bit of catching up on the material presented here. Begin with Anthony’s evaluation of US surface station integrity. Consider the history of surface station coverage. None of the data from those limited and poorly sited sources is suitable for the purpose of determining the actual surface temperature on either a spatial or temporal scale, and averaging garbage data results in the average of garbage. It is absolutely worthless for determining any long-term trends.
Add to that the manipulation of that data and the effects of surface changes, including urbanization. It tells us little about climate and certainly is no basis for claims of crisis or of CO2 radiative effects.

All in all, it will require more of you than simple one-sentence shots like the global temperature has or has not increased. Time frames and data quality are not trivial parameters.

bdgwx
Reply to  Mark Whitney
June 17, 2025 5:10 am

Right. So it sounds like you too are questioning Tom’s ability to declare that there has been no warming. In fact, if all of the data is unsuitable for the purpose of determining long-term trends, then we cannot eliminate the possibility that the warming is FAR greater than what scientists are telling us. After all, it has already been shown that these scientists have adjusted away a significant portion of the warming as compared to the raw data. Can you help me explain this to Tom?

Reply to  bdgwx
June 17, 2025 5:21 am

You have a remarkable ability to distort what people actually say. What has been shown is that adjustments to data have cooled the past and warmed the present, such that any warming is likely exaggerated. In addition, what surface warming that has been detected is likely only reflective of the point of measurement, and all “global” assertions are at best questionable. Tom is quite correct in his analysis. Your dog with a frisbee approach is quaint.

Reply to  Mark Whitney
June 17, 2025 7:39 am

Don’t you know that in climate science all measurement uncertainty is random, Gaussian, and cancels? Thus the global average temperature can be calculated to any number of decimal places you wish to use. It doesn’t matter what adjustments are made – they all work to decrease measurement uncertainty.

bdgwx
Reply to  Mark Whitney
June 17, 2025 9:17 am

What has been shown is that adjustments to data have cooled the past and warmed the present such that any warming is likely exaggerated.

That is not correct. The adjustments to data have actually warmed the past such that the overall warming is attenuated relative to the raw data.


[Hausfather 2017]

In addition, what surface warming that has been detected is likely only reflective of the point of measurement, and all “global” assertions are at best questionable.

If the global average temperature is questionable at best then why is Tom insisting that it is no warmer today than in the past?

Reply to  bdgwx
June 18, 2025 4:40 am

Do you even understand that there was no global data in 1850, or even in 1930-1950? Most of the planet has zero data prior to the mid-1950s. US data, though much distorted by siting issues and manipulation, indicates that it was warmer in the late 1930s. NASA GISS has repeatedly doctored the USHCN data to cool that period and to warm the present.

Proxies and historical accounts offer a general indication that it was as warm or warmer 900-700 years ago during the Medieval Warm Period, and cooler 600-200 years ago during the Little Ice Age. It was almost certainly cooler in the 1960s and ’70s when claims of a returning ice age prevailed. Temperatures have been up and down for the last 30 years, though urban effects have certainly distorted the record, as have drop-outs in rural and high altitude/high latitude monitoring stations.
Various proxies confirm that most of the period known as the Holocene was warmer than today.

So, again, I ask you what you are referring to when you say there has been warming. Since when? How much? What significance do you think it has? Tom is correct that it is no warmer today than it has been in the past. Such is the nature of generalities.

bdgwx
Reply to  Mark Whitney
June 18, 2025 3:05 pm

Do you even understand that there was no global data in 1850, or even in 1930-1950?

There was data. It just wasn’t as much as there is today. That’s why the uncertainty on global average temperature generally increases the further back in time you go.

US data, though much distorted by siting issues and manipulation, indicates that it was warmer in the late 1930s.

Actually…US data indicates that it is much warmer today.

NASA GISS has repeatedly doctored the USHCN data to cool that period and to warm the present.

GISS does not doctor data. That is a myth.

And the only adjustment GISS makes is for the urban heat island effect.

What you might be referring to is the pairwise homogenization algorithm executed on the QCF file from GHCN-M. The PHA finds and corrects for the biases caused by changepoints in the record.
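For readers unfamiliar with the idea, here is a toy sketch of the principle behind pairwise homogenization. This is not NOAA's actual PHA code; the station series, break size, and detection rule are all invented for illustration. Differencing a station against a neighbor cancels the shared climate signal, leaving a station-specific step (e.g. from a station move) exposed:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # invented: 120 monthly anomalies, °C

shared = np.cumsum(rng.normal(0, 0.3, n)) * 0.05  # regional signal both stations see
target = shared + rng.normal(0, 0.1, n)
neighbor = shared + rng.normal(0, 0.1, n)
target[60:] += 0.8  # simulated station move adds a step bias at month 60

diff = target - neighbor  # shared climate cancels; the step stands out

# Crude changepoint detection: pick the split maximizing the jump in segment means
splits = range(12, n - 12)
k = max(splits, key=lambda i: abs(diff[:i].mean() - diff[i:].mean()))
bias = diff[k:].mean() - diff[:k].mean()

print(k)     # near 60, the true break month
print(bias)  # near +0.8 °C, the simulated bias a homogenization step would remove
```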

Proxies and historical accounts offer a general indication that it was as warm or warmer 900-700 years ago during the Medieval Warm Period

The MWP refers to the climatic period of the North Atlantic region and especially England from 1100-1350.

During this period the global average temperature was lower than it is today.

cooler 600-200 years ago during the Little Ice Age

The peak was about 300-200 years ago. Unlike the MWP the LIA was a globally synchronous climatic era.

It was almost certainly cooler in the 1960s and ’70s when claims of a returning ice age prevailed.

It was cooler than today, but warmer than the LIA. Claims of a returning ice age were primarily a media story. In the scientific literature the 60s and 70s were the era when the consilience of evidence began suggesting that further warming was going to happen with high confidence.

So, again, I ask you what you are referring to when you say there has been warming. Since when?

The global average temperature since 1850.

How much?

About 1.5 C.

What significance do you think it has?

That is a complex question I don’t think I’ve ever been asked. There are a lot of directions this could go depending on the context you were thinking of. If you could clarify a bit more about what you’re asking I could probably give a shot at answering.

Tom is correct that it is no warmer today than it has been in the past.

Sadly he is not. What he doesn’t tell you is that the graphs he posts 1) include the biases and errors known to exist in the historical record as a result of station moves, instrument/shelter changes, and time-of-observation changes and 2) are only for the United States, representing a mere 2% of the globe.

I’ve discussed this with him before, so he knows what he is posting is wrong. That makes his posts blatant disinformation as opposed to innocent misinformation.

Reply to  bdgwx
June 18, 2025 4:18 pm

include the biases and errors known to exist in the historical record as a result of station moves, instrument/shelter changes, and time-of-observation changes 

Fraudulent guesses and gut feelings, AKA Fake Data. Only in climatology is data “adjusted” so that it matches what the practitioners believe it should be.

It is you who is posting blatant disinformation.

Reply to  karlomonte
June 19, 2025 7:29 am

Yep. Biases and errors may be known to exist in the historical record and may be identifiable but they can *NOT* be quantified. Climate science can’t even tell you today what the temperature was yesterday at 2PM based solely on knowing the temperature at 2PM today. But they can tell you what the temperature was 100 years ago?

Reply to  bdgwx
June 19, 2025 5:31 am

There was no data at all for most of the globe prior to the mid-20th century. Remember, this planet’s surface is 70% ocean, and even on land vast areas and even entire continents have very little to no reliable measurement until quite recently. Anthony has demonstrated that the current USHCN is not in compliance with NOAA standards. The data for the GHCN is certainly no better and likely far worse with regard to siting and reliability.
None that existed in the past was precise to the tenth of a degree used to make claims of record heat now. None of it is suitable for making the comparisons used to justify the alarmist narrative or even to obtain more than a crude result.

GISS has arbitrarily adjusted the data on more than one occasion, always resulting in an increased heat bias for the current period. It is not a myth.

We know it was warmer in the past because retreating ice has exposed the remains of forests and human activity where none could exist under current conditions. Other proxies confirm that there is nothing remarkable about the present warmth. Certainly, no data at all supports claims of crisis.

The claimed climate crisis is indeed a media story fuelled by a political agenda. I was there in the 1960s and 70s, and the ice age hype was taught in school and presented as credible in various venues. I never heard a peep about warming.
Indeed, the speculation about climate has flipped on several occasions since Arrhenius and his contemporaries first speculated about the radiative effects of gases.

bdgwx
Reply to  Mark Whitney
June 19, 2025 6:28 am

There was no data at all for most of the globe prior to the mid-20th century. Remember, this planet’s surface is 70% ocean

Ships existed before the mid-20th century. The World Ocean Database contains observations back to the 1700’s.

and even on land vast areas and even entire continents have very little to no reliable measurement until quite recently.

Right. That’s why the uncertainty on the global average temperature is higher in the past.

Anthony has demonstrated that the current USHCN is not in compliance with NOAA standards.

He did not demonstrate that. At least he did not demonstrate that it cannot be used to determine the temperature. What he demonstrated is that many stations have siting problems. But everyone already knows that. That’s why PHA (or similar homogenization) techniques exist.

GISS has arbitrarily adjusted the data on more than one occasion, always resulting in an increased heat bias for the current period. It is not a myth.

The source code is here. Can you tell me which line in the source code arbitrarily adjusts data?

We know it was warmer in the past because retreating ice has exposed the remains of forests and human activity where none could exist under current conditions.

It was warmer in the past. No one is challenging that. In fact, it was scientists who presented the evidence showing that. That’s why we know it was warmer in the past. However, the evidence from scientists also tells us that globally it is warmer today than during most of the Holocene.

The claimed climate crisis is indeed a media story fuelled by a political agenda. I was there in the 1960s and 70s, and the ice age hype was taught in school and presented as credible in various venues.

I know. That’s what I said.

I never heard a peep about warming.

That’s probably because you were not reading the scientific literature.

Reply to  bdgwx
June 19, 2025 7:36 am

“Ships existed before the mid-20th century. The World Ocean Database contains observations back to the 1700’s.”

With what measurement error? Was the data accurate to a hundredth of a degree?

“That’s why the uncertainty on the global average temperature is higher in the past.”

Is that uncertainty greater than the differences (anomalies) that are being used to identify trends? If so, how can the differences actually be KNOWN to a hundredth of a degree?

“At least he did not demonstrate that it cannot be used to determine the temperature”

He has demonstrated that it can’t be accurate to the hundredth of a degree!

“That’s why PHA (or similar homogenization) techniques exist.”

Which do nothing but spread measurement uncertainty around.

“It was warmer in the past.”

“tell us that globally that it is warmer today than as compared to most of the Holocene.”

Did you read this before you posted it?

Exactly what PAST was warmer? If it wasn’t warmer globally then where was it warmer?

Reply to  bdgwx
June 19, 2025 7:42 am

Right. That’s why the uncertainty on the global average temperature is higher in the past.

Total nonsense: there is no measurement uncertainty if there are no measurements!

Reply to  bdgwx
June 17, 2025 7:36 am

“we cannot eliminate the possibility that the warming is FAR greater than what scientists are telling us”

Once again we see a comment like this from someone that seems to have *no* actual touch with reality. Continued record grain harvests year after year are one factor militating against such a possibility. The greening of the earth also militates against it. The fact that hardiness zones have seen almost no movement militates against such a possibility. The facts that polar bears have not gone extinct, that sea level rise has not accelerated, and that the poles are not ice free all militate against this possibility.

We can’t eliminate the possibility that Jesus returned to Earth yesterday, we can’t eliminate the possibility that all the stars at the center of the galaxy went out yesterday, we can’t eliminate the possibility that the earth will be hit by a giant asteroid tomorrow. But reality militates against all three. Do *YOU* believe they might actually be true?

Reply to  Mark Whitney
June 17, 2025 7:25 am

Temperature is a crappy metric for climate. Las Vegas and Miami have similar temperatures but far different climates. Yet climate science lumps those temperatures into its “global average temperature” as if they actually tell you something about climate.

Reply to  Mark Whitney
June 16, 2025 3:54 am

100%. Then couple that with the climate model outputs that show a linear increase in temperatures for the future – with no limit.

Reply to  bdgwx
June 13, 2025 7:49 am

Yes. It is available here.

So-called unadjusted data is available at this URL. However, only Tavg is available for most of the history.

The documentation shows Tmax and Tmin were not included until mid to late 2016.

Even more concerning from a data accuracy standpoint is this definition of the Tavg values.

VALUE: monthly value (MISSING=-9999). Temperature values are in hundredths of a degree Celsius, but are expressed as whole integers (e.g. divide by 100.0 to get whole degrees Celsius).

Temperature values in the hundredths of a °C? Even USCRN temperatures have only a resolution of 0.1°C. That is far from one-hundredth of a degree.
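Taken at face value, the VALUE definition quoted above decodes like this; a minimal sketch (the function name is mine, not from the GHCN documentation):

```python
# Decode GHCN-M monthly values per the documentation quoted above:
# whole integers in hundredths of a degree Celsius, -9999 = missing.
MISSING = -9999

def decode_ghcn_value(raw: int):
    """Return degrees Celsius, or None for a missing month."""
    if raw == MISSING:
        return None
    return raw / 100.0

print(decode_ghcn_value(2013))   # 20.13
print(decode_ghcn_value(-9999))  # None
```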

I know you have been adamant that I am incorrect in my statements that averaging can neither decrease uncertainty nor increase resolution. Here are some answers from Copilot, Grok, and ChatGPT to this common question. These AIs give similar responses for digital measurements.

can averaging temperature measurements add resolution beyond what was read on an LIG thermometer

Copilot

For a liquid-in-glass (LIG) thermometer, averaging multiple readings cannot increase resolution beyond what is visually readable. Here’s why:

Key Limitations

  • Resolution is fixed by the smallest scale division (e.g., 0.1°C or 0.5°C).
  • Human interpolation between marks is subjective and limited.
  • No digital quantization noise to smooth out, unlike electronic sensors.

What Averaging Can Do

  • Reduce random reading errors (e.g., parallax, observer judgment).
  • Improve confidence in the mean temperature over time.
  • Help detect trends in temperature changes.

What Averaging Cannot Do

  • Increase the thermometer’s fundamental resolution.
  • Extract finer temperature variations beyond the scale markings.

ChatGPT

In the case of a liquid-in-glass (LIG) thermometer, averaging cannot add resolution beyond what you can visually read. Here’s why:

LIG thermometers have analog resolution limited by:

The smallest division on the scale (e.g., 0.5°C or 0.1°C).

The observer’s ability to interpolate between marks (usually ± half the smallest division).

Averaging multiple readings can:

Reduce random reading errors (like parallax or observer judgment), leading to better precision in repeated measurements.

Improve confidence in the mean value of a stable temperature.

But it can’t increase actual resolution, because:

The original data is already limited by your eye and the physical markings.

There is no digital noise or quantization to average out, as in electronic sensors.

Grok

Averaging temperature measurements from a liquid-in-glass (LIG) thermometer can improve the precision of the mean value but cannot increase the resolution beyond the physical markings of the thermometer. Here’s a detailed explanation specific to LIG thermometers:

Why Averaging Improves Precision but Not Resolution

Resolution of an LIG Thermometer: The resolution is determined by the physical markings on the glass (e.g., 0.1°C or 1°C increments). If the thermometer has markings every 0.1°C, you cannot reliably measure changes smaller than 0.1°C, even with averaging, because the resolution is limited by the physical scale and human ability to read it.

Precision through Averaging: Taking multiple readings and averaging them reduces random errors due to human judgment (e.g., estimating the liquid level between markings) or slight variations in reading conditions (e.g., parallax errors). This results in a more stable mean value, increasing confidence in the estimate.

Example: If you read 20.1°C, 20.2°C, 20.1°C, and 20.3°C on an LIG thermometer with 0.1°C markings, the mean is (20.1 + 20.2 + 20.1 + 20.3) / 4 = 20.175°C. This mean appears precise, but the true resolution remains 0.1°C because the thermometer cannot physically resolve finer increments.

No Digital Noise: Unlike electronic sensors, where averaging can reduce quantization noise or electrical fluctuations, LIG thermometers have no digital component. Errors are primarily from human reading variability or environmental factors, which averaging can only partially mitigate.

Why Resolution Cannot Increase

Physical Limitation: The smallest measurable increment is set by the thermometer’s scale (e.g., 0.1°C or 1°C). Even if you estimate the liquid level between markings (e.g., “halfway between 20.1°C and 20.2°C”), this is a subjective judgment, not a true increase in resolution. Averaging such estimates doesn’t create new information beyond the physical markings.

Human Perception: The human eye cannot reliably resolve positions finer than the marked intervals, even with tools like a magnifying glass, which only aids in reading the existing scale more clearly.
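Grok’s worked example above is easy to reproduce; a minimal sketch using the numbers from the example:

```python
# Grok's worked example above, reproduced: four readings from a
# thermometer marked in 0.1 degC increments.
readings = [20.1, 20.2, 20.1, 20.3]

mean = sum(readings) / len(readings)
print(round(mean, 3))  # 20.175 -- extra digits, but no extra resolution

# Reported at the instrument's own 0.1 degC resolution:
print(round(mean, 1))  # 20.2
```

The mean carries extra decimal places, but each input is still only known to the nearest marked division, which is the point all three AI answers make.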

Denis
June 12, 2025 10:56 am

Our Aeronautics and Space group (NASA) should stick with aeronautics and space. Our Atmosphere and Ocean group (NOAA) can handle atmosphere and ocean (i.e. climate) issues.

Scissor
Reply to  Denis
June 12, 2025 11:37 am

Was Muslim outreach successful?

Reply to  Scissor
June 13, 2025 3:20 am

Well, I think NASA sent a bunch of Pakistani kids to Space Camp. I think that was about the extent of it.

Reply to  Scissor
June 13, 2025 11:14 am

The Muslim population of the US keeps going up. So it musta was.

Reply to  Denis
June 12, 2025 11:52 am

NASA has been concentrating pretty much on space. Less aeronautical research being done than in the past.

Editor
June 12, 2025 11:06 am

Well Done, Anthony!

Tom Halla
June 12, 2025 11:12 am

It is redundant. It is easier legally to eliminate a government operation than to reform it.

Bruce Cobb
June 12, 2025 11:56 am

#11. James Hansen. Yes, he’s no longer there, but he held the fort there for over 30 years, pushing his warmunist pseudoscientific propaganda.

oeman50
Reply to  Bruce Cobb
June 13, 2025 6:20 am

What will Gavin do when it shuts down?

I know! He can get a costume and continue the clown show at kids’ birthday parties.

KevinM
June 12, 2025 12:02 pm

If jobs using STEM skills to convert data into frightening color scaled infographics are eliminated at the same time the middle class forces its kids to max out STEM scores, what happens to those kids in a few years? In a glass-half-full world they go invent the next Internet, I hope.

OweninGA
Reply to  KevinM
June 12, 2025 4:31 pm

That’s easy…they will do something productive in STEM and it will be a win-win for everyone.

KevinM
Reply to  OweninGA
June 13, 2025 10:57 am

I hope you are right. They’re going to have to invent their own tech sector to do it – we’ve been busily exporting their grandparents’ tech sector.

Reply to  KevinM
June 13, 2025 11:16 am

Between offshoring and programs such as H1-B, which import lower-cost foreign labor, our kids don’t have a chance with STEM. Recent grads can’t get interviews. In a few years, AI will make it worse.

Giving_Cat
June 12, 2025 12:03 pm

A supposed scientific agency that does no science? The answer is obvious.

I can only hope that upon dissolution the “proprietary” algorithms can be exposed to sunlight.

cgh
Reply to  Giving_Cat
June 12, 2025 2:25 pm

It’s entirely appropriate that GISS has this in its ground floor New York building.
Tom’s Restaurant – Wikipedia

A comedy about “nothing” and an agency that does nothing. They seem completely suited to each other. All of the characters in Seinfeld are dysfunctional dimwits; the implication for GISS seems obvious.

Reply to  cgh
June 12, 2025 4:41 pm

Unfortunately, GISS doesn’t “do nothing”.

Jeff Alberts
Reply to  Retired_Engineer_Jim
June 12, 2025 5:10 pm

Right. If they did actually do nothing, we wouldn’t care, except for the money they spend.

KevinM
Reply to  Giving_Cat
June 13, 2025 11:01 am

For those under 30 – “proprietary algorithm” is usually a euphemism for the work of a person with a strong enough persona that coworkers say “okay” while thinking “I wouldn’t do it that way”.

June 12, 2025 12:13 pm

If only I could “adjust” my bank accounts like GISS has the data.
I’d have been a multimillionaire years ago!
Reality?
I’d just post-date my checks until I’m long gone.
(It sort of works for climate models and climate predictions.)

Sparta Nova 4
June 12, 2025 12:15 pm

Someone paid attention in Power Point 101.
Red is bad.
Green is good.
At least on charts.
Risk charts also include yellow.

June 12, 2025 12:24 pm

NASA should archive the GISS data . . .

____________________________________________________________

GISS doesn’t archive their Monthly Land Ocean Temperature Index (LOTI).
You need to save it to your files during the monthly window that it appears. There’s always the Internet Archive’s Wayback Machine, but for some inexplicable reason GISS has the Wayback Machine blocked. “Access Denied” is typical of what you find on a search for past LOTI releases.

bdgwx
Reply to  Steve Case
June 12, 2025 4:24 pm

It works for me. I can see the LOTI back to 2005 using the Wayback Machine.

Reply to  bdgwx
June 12, 2025 9:59 pm

You can’t get the entire 12-month record for 2005; FEB MAR APR MAY JUN JUL are missing in your link. Different pathways in the Wayback Machine will sometimes pay off; for example, I have LOTI for JAN of 2005 in my collection but your link shows JAN as missing. From 2000 through 2004 there would have been 60 reports; I have only 23 of them. Since FEB of 2015 my collection is complete.

bdgwx
Reply to  Steve Case
June 13, 2025 4:23 am

That’s typical of any site WM archives. It is a limitation of the WM itself. GISS isn’t blocking the WM’s access to the GISTEMP file.

Reply to  bdgwx
June 13, 2025 7:39 am

I assume GISS is somehow blocking the Wayback Machine (WM). If not GISS, then who? Why does a link appear for the GISS LOTI file you’re looking for, and when you click on it, up comes “Forbidden” or “Access Denied”?

bdgwx
Reply to  Steve Case
June 13, 2025 8:13 am

It doesn’t give me either “Forbidden” or “Access Denied” when I open the LOTI file in WM.

Reply to  bdgwx
June 13, 2025 10:44 am

You didn’t try to get a complete record from 1997, when LOTI apparently started. Maybe Google AI will give it a whirl:

When did GISSTEMP publish their first Land Ocean Temperature Index (LOTI)? 
NASA GISS Data (.gov)
https://data.giss.nasa.gov › gistemp › history
GISTEMP results were first made available over the web in 1995, and some versions are retrievable from the “Wayback Machine” website as far back as 1997.
Missing: (LOTI ‎| Show results with: (LOTI

Google said some are retrievable and didn’t say a darn thing about the Access Denied & Forbidden answers that come up for those that are missing.

bdgwx
Reply to  Steve Case
June 13, 2025 2:00 pm

You didn’t try to get a complete record from 1997, when LOTI apparently started.

I’ve tried and have been unsuccessful.

When did GISSTEMP publish their first Land Ocean Temperature Index (LOTI)? 

[Hansen et al. 1987].

GISTEMP results were first made available over the web in 1995, and some versions are retrievable from the “Wayback Machine” website as far back as 1997.

Missing: (LOTI ‎| Show results with: (LOTI

Ok. So if we can figure out the url GISS used for the LOTI file then maybe we can look it up.

I too would like to see the progression of the LOTI between 1987 and 2005 so this is an interesting effort.

Reply to  Steve Case
June 13, 2025 8:38 am

One thing that’s happened which may or may not explain some of the issues is when they changed US government websites from, say, “.com” to “.gov”. (That matters if you search the Wayback Machine using the web address.)

bdgwx
Reply to  Gunga Din
June 13, 2025 10:39 am

Exactly. URLs change over time. The LOTI file URL has changed several times over the years. You have to know which URL was in use at specific points in time for the WM to be useful. This is why I’m able to go back to at least 2005. There’s still a few years’ gap between when the WM went online and 2005. This may be due to me not knowing what the LOTI file URL was prior to 2005, or the WM might not have been crawling it. I don’t know.

I will say in defense of what Steve is suggesting that it is not uncommon for websites to block automated crawlers. It’s not that they are necessarily blocking the WM specifically; it’s just that the WM, being an automated crawler, gets lumped into the block by default. I don’t see any evidence that there was a broad, long-lasting block, at least with the LOTI file.

Reply to  Gunga Din
June 13, 2025 10:46 am

Thanks, doesn’t explain the Access Denied & Forbidden messages that appear.

Reply to  Steve Case
June 14, 2025 4:15 am

Thanks, doesn’t explain the Access Denied & Forbidden messages that appear.

Like “bdgwx” above, I have also “randomly” come across this problem with The Wayback Machine (WM) snapshots.

Start from the WM “calendar” of snapshots for the original LOTI location for the year 2011 :

https://web.archive.org/web/20111001000000*/https://data.giss.nasa.gov/gistemp/tabledata/GLB.Ts+dSST.txt

This covers both “version 1” (from August 2005 to April 2011) and “version 2” (May to October 2011 only) of the GISTEMP website’s “LOTI file” contents.

NB : For versions 3 (12/2011 to 7/2019) and 4 (since 6/2019) replace “…/tabledata/…” with “…/tabledata_v3/…” and “…/tabledata_v4/…” at the WM website.

Scroll down and “hover” your mouse over the blue circle on the 5th of December, then click on the “12:12:04” link in the box that appears (copied below).

https://web.archive.org/web/20111205121204/https://data.giss.nasa.gov/gistemp/tabledata/GLB.Ts+dSST.txt

For me that’s a file with “sources: GHCN-v2 1880-10/2011 + SST: 1880-11/1981 HadISST1 12/1981-10/2011 Reynolds v2″ in the header lines …

Back in the 2011 WM tab, scroll back up then click in the “2012” year box.

NB : The notes at the bottom of that “calendar” page include :

Orange indicates that the URL was not found (4xx).

Green indicates redirects (3xx).

Hover over the first blue circle, on the 4th of February, then select the (blue text) “01:45:26” option (copied below).

https://web.archive.org/web/20120204014526/https://data.giss.nasa.gov/gistemp/tabledata/GLB.Ts+dSST.txt

For me that tab header displays “403 Forbidden” (instead of “Wayback Machine”), and contains the text :

Forbidden

You don’t have permission to access /gistemp/tabledata/GLB.Ts+dSST.txt on this server.

– – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Apache/2.2.21 (Unix) Server at data.giss.nasa.gov Port 80

.

With “blue” indicators / flags the WM server thinks that it has a valid “snapshot” in its archives.

My assumption, which is probably wrong in its details, is that exactly which “error code” the WM server receives when it attempts to actually recover that “snapshot” determines whether it displays “Forbidden” or “Access Denied”.
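For anyone wanting to repeat the exercise, the snapshot links above follow a simple pattern; a sketch (the timestamp format and the version-to-path mapping are taken from this comment’s notes and should be treated as assumptions):

```python
# Rebuild Wayback Machine snapshot URLs for the GISS LOTI file, using
# the pattern in the links above. The version-to-path mapping below is
# taken from the NB notes in this comment, not from GISS documentation.
WM_PREFIX = "https://web.archive.org/web"
LOTI_PATHS = {
    1: "tabledata",     # Aug 2005 - Apr 2011
    2: "tabledata",     # May - Oct 2011 (same location as v1)
    3: "tabledata_v3",  # Dec 2011 - Jul 2019
    4: "tabledata_v4",  # since Jun 2019
}

def loti_snapshot_url(timestamp: str, version: int) -> str:
    """timestamp is YYYYMMDDhhmmss, as in the WM links quoted above."""
    path = LOTI_PATHS[version]
    return (f"{WM_PREFIX}/{timestamp}/"
            f"https://data.giss.nasa.gov/gistemp/{path}/GLB.Ts+dSST.txt")

# Reproduces the 5 December 2011 link quoted above:
print(loti_snapshot_url("20111205121204", 1))
```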

bdgwx
Reply to  Mark BLR
June 14, 2025 7:03 am

Yeah…I also suspect that the NASA sites, like many sites, have intrusion prevention systems and DDoS prevention systems that may occasionally get triggered by automated crawlers. It’s possible when these get triggered that they are returning HTTP 403.

Reply to  bdgwx
June 15, 2025 1:29 pm

The only problem with that is, as I understand how the Wayback Machine works, it doesn’t archive the web address itself but rather what was on that site at a previous time.
IE https://web.archive.org/web/20250000000000*/http:/www.erh.noaa.gov/iln/cmhrec.htm
Comparing, say, 2002 records with, say, 2012 records will show changes that have nothing to do with “anomalies”. They are simple high and low records.
And many dates’ records have been changed. (i.e. A newer record high is lower than the older record high, etc.)

Nick Stokes
Reply to  Steve Case
June 12, 2025 11:39 pm

“GISS doesn’t archive their Monthly Land Ocean Temperature Index (LOTI).”
Just untrue. The table of values back to 1880 is online here. The link is easily accessible on the GISS site.

Reply to  Nick Stokes
June 13, 2025 5:17 am

Hi Nick, yes, that’s the current LOTI updated through MAY. If you want to see what LOTI looked like updated through APR, you won’t find it on any of the GISTEMP pages. You can find the APR update on the Wayback Machine, but you can’t find them all. Eventually your Wayback Machine searches come up missing, or “Access Denied” or “Forbidden”.
You might ask, “Why want them all?” The answer is, “The numbers change all the way back to JAN 1880”.

The current report for MAY says:

Year  Jan
1880  -20

Last Month’s report for APR said

Year  Jan
1880  -18

That’s right: GISTEMP changes the data all the way back to 1880, and they do it on a regular basis. Why don’t you email Gavin Schmidt and ask him why that is? This WUWT post is pinned to the top for a few days, so you could report back with what he says.

 

bdgwx
Reply to  Steve Case
June 13, 2025 8:20 am

That’s right: GISTEMP changes the data all the way back to 1880, and they do it on a regular basis.

As I’ve explained before, all of the monthly values in the LOTI will change on every monthly update. You are actually undercounting the changes because you are only using the digits provided in the published LOTI file. If you change the gio.py source code file to show more digits and run the code locally, you’ll see what I mean.

The reason for continuous changes is primarily that older observations are still being digitized and uploaded to GHCN and ERSST. This will likely continue for decades. In many cases it is a laborious and time-consuming process that still requires people to manually enter the data. Anyway, even a single new observation can cause a change in all monthly values because the baseline changes. The change may be small, but it is there nonetheless.

Why don’t you email Gavin Schmidt and ask him why that is? 

Because there’s no need. We already know what is happening.

Reply to  bdgwx
June 13, 2025 10:22 am

Yes, the changes are small, but they form a pattern: Each plot below represents the difference between what the J-D AnnMean was in 2011 and what it was in 2021. The pattern can’t be denied.

Reply to  Steve Case
June 13, 2025 10:23 am

My image didn’t show up )-:

Maybe you can see it HERE

bdgwx
Reply to  Steve Case
June 14, 2025 7:08 am

The pattern can’t be denied.

What conclusion do you draw from it?

Reply to  bdgwx
June 14, 2025 8:45 am

Quite a while back I emailed GISTEMP to ask why 1880 data was being altered. Here’s the answer I got:

Your main concern seems to be why data from 1880 get affected by the addition of 2018 January data and a few late reports from the end of 2017. To illustrate that, assume that a station moves or gets a new instrument that is placed in a different location than the old one, so that the measured temperatures are now e.g. about half a degree higher than before. To make the temperature series for that station consistent, you will either have to lower all new readings by that amount or to increase the old readings once and for all by half a degree. The second option is preferred, because you can use future readings as they are, rather than having to remember to change them. However, it has the consequence that such a change impacts all the old data back to the beginning of the station record.

Remembering to change new data or adjust old data once and for all
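The adjustment GISTEMP describes in that reply can be sketched as follows; the station values and the 0.5 °C offset are invented for illustration:

```python
# A sketch of the adjustment described in the GISTEMP reply: when a
# station move introduces a known step (here +0.5 degC), shift every
# reading before the breakpoint once, so future readings are used as-is.
# The series and offset are invented for illustration.
def adjust_before_breakpoint(series, breakpoint_idx, offset):
    """Add `offset` to all values before `breakpoint_idx`."""
    return [v + offset if i < breakpoint_idx else v
            for i, v in enumerate(series)]

series = [14.0, 14.1, 14.6, 14.7]  # +0.5 degC step after index 1
print(adjust_before_breakpoint(series, 2, 0.5))  # [14.5, 14.6, 14.6, 14.7]
```

This is why, as the reply says, a single station change ripples all the way back to the start of that station’s record.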

I didn’t ask why that adjustment occurs frequently (it does) nor why it forms a pattern (it does that too).

When you observe something odd, like your algorithm producing a pattern you wouldn’t expect, you should investigate why. I don’t know if GISTEMP has attempted to do that or not. If they have, I don’t know that they’ve published anything about it.

So I can’t form any conclusions and have no way to do so. BUT the media needs to follow up, and they don’t. They turn a blind eye. Certainly reporters have heard the “cool the past and warm the recent” refrain. They aren’t deaf to it, but ignoring something is another story. And that’s the issue.

Reply to  Steve Case
June 14, 2025 11:22 am

To make the temperature series for that station consistent, you will either have to lower all new readings by that amount or to increase the old readings once and for all by half a degree.

“To make the temperature series for that station consistent,”? Why is this needed?

There is only one reason, to artificially create a long record so one can divide the standard deviation of a “long record” by √(1200+ months) in order to calculate an extremely small standard deviation of the mean!

No other scientific endeavor allows such a manipulation of data. Only climate science gets away with this.
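The shrinkage being objected to is simple arithmetic; a sketch, where the 2.0 °C spread is an illustrative assumption, not a measured value:

```python
import math

# The shrinkage the comment describes: treat a stitched 100-year station
# record as n independent monthly values and divide the standard
# deviation by sqrt(n). The 2.0 degC spread is illustrative, not measured.
sigma = 2.0           # assumed std. dev. of monthly values, degC
n = 1200              # 100 years x 12 months
sem = sigma / math.sqrt(n)
print(round(sem, 3))  # 0.058 -- an "extremely small" standard error
```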

bdgwx
Reply to  Steve Case
June 14, 2025 11:42 am

Berkeley Earth had those same questions, so they did their own independent analysis, which confirmed that GISTEMP’s analysis was consistent with their own. This was an interesting result at the time because BEST used a novel technique which allowed them to analyze the data without making adjustments to correct for biases and errors in the station records.

Reply to  bdgwx
June 14, 2025 6:33 pm

Berkeley Earth … blah … blah … blah
_______________________________________

Nice word salad or tar baby, use whatever metaphor you want; I’m not going to continue with a discussion that goes nowhere.

Reply to  bdgwx
June 15, 2025 2:16 pm

It’s fun to analyze this stuff. The Then (1997) and Now (2019) graph below, from 5 years ago, is probably still valid and self-explanatory.



Reply to  Steve Case
June 15, 2025 2:18 pm

Once again my image didn’t show up

LOTI-1997-and-2019-Trends

Reply to  bdgwx
June 13, 2025 11:13 am

Yes. We already know what’s happening.

bdgwx
Reply to  doonman
June 13, 2025 2:22 pm

Great. Maybe you can write an article on WUWT explaining how old observations are digitized and uploaded years or even decades after they were originally produced, and that this causes old values in datasets like GISTEMP to change. People might be more willing to listen if it came from you. I don’t know.

Reply to  Steve Case
June 15, 2025 3:46 am

That’s right GISSTEMP changes the data all the way back to 1880 and they do it on a regular basis.

While every month’s new LOTI file will include +/-1 (x0.01°C) changes all the way back to (January) 1880, the cumulative changes appear to be concentrated in their “Version 3”, which according to the Wayback Machine (WM) ran from December 2011 to mid-2019.

.

Notes on attached graphic

– WM files only go back to 2005

– “Version 2” only existed from May to October 2011, and all those files were virtually identical to the last “Version 1” file. I therefore combined the changes from 2005 to end-2011 into a single “Version 1/2” set.

– The largest cumulative changes over a single version occurred during V3, sometime between 2012 and mid-2019

– The largest average delta over any consecutive 12-months (the filtering I did to get the “smoothed” curves) was +/- 0.15°C

– The last V3 file (in mid-2019) had systematically added ~0.6-0.7°C to the values from 1980 to 2011 that were computed for the first V3 file (data to December 2011, released in January 2012).

– The transition from (mostly GHCN ?) V3 to V4 data, combined with whatever changes to the GISS algorithm / methodology merited the version increment, added another ~0.3°C to the values in the last V3 file from 2005 to mid-2019

– I have no idea “how” or “why” the changes graphed below occurred, I am simply showing the end-result of the changes made to the GISS LOTI file over time with the available WM data

GISS-LOTI_Trends-plus-deltas_1
bdgwx
Reply to  Mark BLR
June 15, 2025 8:49 am

That is really interesting analysis and plots. Thanks for doing that.

One of the changes from v3 to v4 was the switch from GHCNv3 to GHCNv4. This increased the station count from ~7000 to ~25000.

It is not unreasonable to expect large differences between GISTEMPv3 and GISTEMPv4 as a result of the significant jump in station counts.

It is my understanding that digitization efforts are slowing down as more and more of the projects to digitize old records are completing.

I think there are still a lot of military observations that are classified though. As governments work to declassify these materials we can, hopefully, expect more past observations to get digitized and uploaded to GHCN.

bdgwx
Reply to  bdgwx
June 15, 2025 8:52 am

Here is an example of one of the ongoing digitization efforts. This project is enlisting high school students. That’s pretty cool!

[Manara et al. 2025]

John Hultquist
June 12, 2025 1:01 pm

I have read Trump’s team has proposed reducing or removing 60+ government entities. Is GISS on that list? If not, this post should be forwarded to Amy Gleason, the person leading the Department of Government Efficiency (DOGE).

June 12, 2025 1:22 pm

They should have ditched GISS in the ’80s.

mleskovarsocalrrcom
June 12, 2025 1:27 pm

Another government agency with the same mission creep that started affecting all of them thanks to Obama. The heads of those agencies were given the order to fall in line with the AGW narrative by virtue signaling support or find another job.

Reply to  mleskovarsocalrrcom
June 13, 2025 11:20 am

Every agency should have a charter, explicitly defining its mission and the agency’s expiration date. Ten years, max.

June 12, 2025 1:28 pm

From the article:”…temperature histories are frequently “corrected”…”

This is the thing that has bothered me from the beginning. I can understand one correction for a geographic change or height change, but one only.

Cheating is wrong and corrections are cheating.

I am sure Nick will agree with all corrections.

June 12, 2025 2:31 pm

Contrast this with NOAA and the University of Alabama in Huntsville (UAH)

Over the past ~20-years the warming rate in UAH has been more or less identical to that of GISS (+0.29 and +0.30C per decade respectively since 2005).

Is UAH cooking the books too?

bdgwx
Reply to  TheFinalNail
June 12, 2025 3:29 pm

It is also important to point out that Dr. Spencer and Dr. Christy were confident that 20 years was enough of a timeframe to conclude whether the atmosphere was warming or not when, based on their inaugural analysis, they originally declared it wasn’t; that conclusion was later determined (and they conceded) to be wrong. So if 20 years is enough (I’m not saying it is) then it can be argued that the warming has accelerated.

Anyway, for those curious here are the adjustments UAH has made over the years to their methodology.

Year / Version / Effect / Description / Citation

Adjustment 1: 1992 : A : unknown effect : simple bias correction : [Spencer & Christy 1992]

Adjustment 2: 1994 : B : -0.03 C/decade : linear diurnal drift : [Christy et al. 1995]

Adjustment 3: 1997 : C : +0.03 C/decade : removal of residual annual cycle related to hot
target variations : [Christy et al. 1998]

Adjustment 4: 1998 : D : +0.10 C/decade : orbital decay : [Christy et al. 2000]

Adjustment 5: 1998 : D : -0.07 C/decade : removal of dependence on time variations of hot target temperature : [Christy et al. 2000]

Adjustment 6: 2003 : 5.0 : +0.008 C/decade : non-linear diurnal drift : [Christy et al. 2003]

Adjustment 7: 2004 : 5.1 : -0.004 C/decade : data criteria acceptance : [Karl et al. 2006]

Adjustment 8: 2005 : 5.2 : +0.035 C/decade : diurnal drift : [Spencer et al. 2006]

Adjustment 9: 2017 : 6.0 : -0.03 C/decade : new method : [Spencer et al. 2017] [open]

Adjustment 10: 2024 : 6.1 : -0.01 C/decade : NOAA19 drift : [Spencer 2024]

And here is how they infill their grid in their own words.

Next, each Tbl was binned into the appropriate 2.5° grid square according to the latitudes and longitudes provided with the raw data by NESDIS. This was done on a daily basis for each satellite separately and for ascending and descending satellite passes separately. At the end of the day, a simple (1/distance) interpolation was performed to empty 2.5° grid squares for ascending and descending grid fields separately with the nearest nonzero grid data, in the east and west directions out to a maximum distance of 15 grids. These interpolated ascending and descending Tbl fields were then averaged together to provide a single daily Tbl field for each satellite. At the end of the month, a time interpolation (±2 days) was performed to any remaining empty grid squares. The daily fields were then averaged together to produce monthly grid fields.

15 grids representing 2.5° of longitude each at the equator is 4175 km. Compare this to GISTEMP which only interpolates to a maximum of 1200 km. And GISTEMP does not perform any temporal interpolation.
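The infilling described in the quoted UAH text can be sketched in one dimension as follows. Details the quote does not specify, such as how single-sided gaps are handled, are my assumptions:

```python
# A minimal 1-D sketch of the infilling described in the quoted UAH text:
# each empty longitude bin is filled from the nearest non-empty neighbor
# to the east and to the west (searched out to 15 cells), weighted by
# 1/distance. Handling of single-sided gaps is my guess; the quote
# does not say.
MAX_REACH = 15  # maximum search distance, in grid cells

def infill_row(row):
    """row: list of floats with None marking empty grid cells."""
    filled = list(row)
    for i, v in enumerate(row):
        if v is not None:
            continue
        neighbors = []  # (distance, value) of nearest data east and west
        for step in (1, -1):
            for d in range(1, MAX_REACH + 1):
                j = i + step * d
                if 0 <= j < len(row) and row[j] is not None:
                    neighbors.append((d, row[j]))
                    break
        if neighbors:  # 1/distance weighted average
            filled[i] = (sum(val / d for d, val in neighbors)
                         / sum(1.0 / d for d, _ in neighbors))
    return filled

print(infill_row([250.0, None, None, 253.0]))  # [250.0, 251.0, 252.0, 253.0]
```

Note that a 1/distance weighting between two bracketing neighbors reduces to linear interpolation, as the example output shows.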

Jeff Alberts
Reply to  bdgwx
June 12, 2025 5:12 pm

There is no “the atmosphere” the same way there is no single “Climate”. Averaging everything into one line on a chart tells us absolutely nothing.

bdgwx
Reply to  Jeff Alberts
June 12, 2025 5:15 pm

I’m going to let you pick this fight with Anthony Watts on your own.

Reply to  bdgwx
June 13, 2025 8:26 am

bdgwx,

I find it quite revealing that you took the time to list the nine numerical rate “corrections” UAH made to their data handling “methodology” over a period of 30 years, but then failed to note that the arithmetic sum of those corrections is a net 0.029 C/decade, or 0.3 C over 100 years. Pretty insignificant.

And thank you for posting the details of how UAH bins, averages, and infills data for empty grid squares. I don’t see a problem with their technique . . . you?

Finally, you posted:

“15 grids representing 2.5° of longitude each at the equator is 4175 km. Compare this to GISTEMP which only interpolates to a maximum of 1200 km. “

but you failed to understand the UAH statement ” . . . a simple (1/distance) interpolation was performed to empty 2.5° grid squares . . . separately with the nearest nonzero grid data . . ” {my bold emphasis added}
which equates to interpolation for a single grid of missing data at the equator being only over a distance of (2.5/360)*40,075 = 278 km, which is much finer than what you assert for GISTEMP interpolation maximum distance (1200 km).

But perhaps you believe that UAH satellites frequently have 2.5° data bins that are empty (missing measurement data) except for the 15th ones away from the zenith point in both east and west directions . . . LOL.
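The two distances being argued over here follow directly from the numbers in the comments; a quick check:

```python
# The two distances in dispute: the width of one 2.5-degree grid cell
# at the equator, and the 15-cell maximum interpolation reach quoted
# from UAH. Circumference figure is the one used in the comment above.
EARTH_CIRCUMFERENCE_KM = 40_075

cell_km = (2.5 / 360) * EARTH_CIRCUMFERENCE_KM
print(round(cell_km))       # 278 -- one grid cell, as computed above
print(round(15 * cell_km))  # 4174 -- the "4175 km" figure, within rounding
```

Both numbers are correct; the disagreement is over whether the relevant distance is one cell (278 km) or the maximum search reach (15 cells, ~4175 km).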

bdgwx
Reply to  ToldYouSo
June 13, 2025 10:28 am

but then failed to note that the arithmetic sum of those corrections is a net 0.029 C/decade

Sometimes I do mention it.

And thank you posting the details of how UAH bins, averages, and infills data for empty grid squares. I don’t see a problem with their technique . . . you?

As I’ve said before I don’t have an issue with infilling grid cells especially when a strategy using locality is used.

but you failed to understand the UAH statement

I understand what it means.

which equates to interpolation for a single grid of missing data at the equator being only over a distance of (2.5/360)*40,075 = 278 km

278 km is the dimension of one grid cell. The interpolation goes out to a maximum of 15 grid cells. The interpolation must also be performed in the temporal dimension because the number of the empty cells is so large.

But perhaps you believe that UAH satellites frequently have 2.5° data bins that are empty except for the 15th ones away from the zenith point in both east and west directions

Yes. I do. I don’t really have a choice according to [Spencer et al. 1990]

comment image

Reply to  bdgwx
June 13, 2025 11:29 am

“Yes. I do. I don’t really have a choice according to [Spencer et al. 1990]”

You carelessly post a figure referenced to 1990 (!) and with a label clearly stating “two days of Earth coverage from a single MSU (NOAA-7)”.

You need to get up to date!

The University of Alabama, Huntsville (UAH) currently utilizes data from multiple satellites, primarily those in the Defense Meteorological Satellite Program (DMSP), three of which (F-16, F-17, and F-18) are currently active, to measure Earth’s lower atmospheric temperature. These DMSP satellites carry instruments like the Special Sensor Microwave/Imager (SSM/I), Special Sensor Microwave/Sounder (SSM/T), and the Microwave Sounding Units (MSU) that are aboard the NOAA-11 to NOAA-14 satellites, which are also used by UAH.

To simplify it for you (obviously necessary): “That was then, this is now”. 

IOW, you do have a choice, even though you don’t want to accept/admit that fact.

bdgwx
Reply to  ToldYouSo
June 13, 2025 1:49 pm

You carelessly post

That’s some grade A gaslighting right there. Since when is citing UAH publications in regards to discussions involving UAH considered careless?

The University of Alabama, Huntsville (UAH) currently utilizes data from multiple satellites

I know. NOAA-7 is one of them.

The University of Alabama, Huntsville (UAH) currently utilizes data from multiple satellites, primarily those in the Defense Meteorological Satellite Program (DMSP), three of which (F-16, F-17, and F-18) are currently active

That is patently false. DMSP F-16, F-17, and F-18 are not used by UAH.

If you actually meant NOAA-16, NOAA-17, and NOAA-18 then yes those were used by UAH but of those 3 none are used by UAH currently. In fact, 16 and 17 are no longer even operational.

comment image

To simplify it for you (obviously necessary): “That was then, this is now”. 

Just because NOAA-19 has been in use since ~2010 does not mean that NOAA-7 was not in use in the 1980s. In fact, there was a period of time in the UAH record in which NOAA-7 is the only source of data.

IOW, you do have a choice, even though you don’t want to accept/admit that fact.

Do you think UAH is doing something other than what Dr. Spencer, Dr. Christy, and Dr. Grody documented?

bdgwx
Reply to  bdgwx
June 13, 2025 9:17 pm

I just saw the news that NOAA-18 was decommissioned on June 6th and the AMSU data feed from NOAA-19 is getting turned off on June 16th. It is my understanding that this shouldn’t impact UAH since they stopped using those satellites. We still have the European Union satellites providing data to UAH so hopefully they can keep providing monthly updates at least for a few more years.

Reply to  bdgwx
June 14, 2025 9:26 am

I can’t find any reference that cites UAH using data from “European Union satellites” in calculating its monthly atmospheric temperature data sets and plots.

bdgwx
Reply to  ToldYouSo
June 14, 2025 11:24 am

I can’t find any reference that cites UAH using data from “European Union satellites” in calculating its monthly atmospheric temperature data sets and plots.

I already posted the references. Here they are again.

[Spencer et al. 2017] [open]

[Spencer 2024]

Reply to  bdgwx
June 14, 2025 9:20 am

“That’s some grade A gaslighting right there.”

Well, then my apologies . . . allow me to correct my statement thusly:
“You carelessly intentionally post a figure referenced to 1990 (!) and with a label clearly stating “two days of Earth coverage from a single MSU (NOAA-7)”.

On to the more significant aspects of your last comments:

“I know. NOAA-7 is one of them.”

No, NOAA-7 was one of them. As the graph that you yourself posted shows, UAH stopped using its data in 1985.

“That is patently false. DMSP F-16, F-17, and F-18 are not used by UAH.”

Well, please see the attached screen-grab of what Google’s AI has to say on the subject effective as of today. In my prior comments, I clearly stated “The University of Alabama, Huntsville (UAH) currently utilizes data from multiple satellites . . .”, and the query to Google also asked about UAH using (utilizing) data from satellites.

Perhaps you want to claim that Google is hallucinating here? Well, there is the following information indicating that is not the case:

“In December 1972, DMSP data was declassified and made available to the civil/scientific community. The USAF maintains an operational constellation of two near-polar, sun-synchronous satellites . . . Data is also stored using on-board recorders for transmission to and processing by the Air Force Global Weather Central (AFGWC) . . . AFGWC also sends the entire data stream to the National Geophysical Data Center (NGDC) . . .”
— from https://ghrc.nsstc.nasa.gov/uso/source_docs/dmsp_f12.html , webpage title “Defense Meteorological Satellite Program (DMSP) Satellite F12 Source/Platform”, under the webpage logo citing “One of NASA’s DAACs, a collaboration between MSFC and the University of Alabama at Huntsville” {my bold emphasis added}

“Datasets and related data products and services are provided by the NASA Global Hydrometeorology Resource Center (GHRC) DAAC, managed by the NASA Earth Science Data and Information System (ESDIS) project. The GHRC DAAC is one of the Earth Observing System Data and Information System (EOSDIS) Distributed Active Archive Centers (DAACs), part of the ESDIS project. NASA data are freely accessible . . .”
and
“Examples of dataset citations: . . . Data set available online from the NASA Global Hydrometeorology Resource Center DAAC, Huntsville, Alabama, U.S.A. doi: https://dx.doi.org/10.5067/MEASURES/DMSP-F11/SSMI/DATA302
— both extracts above from https://ghrc.nsstc.nasa.gov/uso/citation.html , my bold emphasis added

“RSS SSMIS OCEAN PRODUCT GRIDS DAILY FROM DMSP F16 NETCDF V7”
and
“NASA MEASURES Precipitation Ensemble based on SSMIS DMSP F18 NASA PPS L1C V05 Tbs 1-orbit L2 Swath 12x12km V1 (PRECIP_SSMIS_F18) at GES DISC
— examples of searchable data sets available at https://search.earthdata.nasa.gov/search/granules , my bold emphasis added

Thus, I can only conclude that you have confused “using data from” with “managing sensing satellites” . . . big difference!

Finally,

“Do you think UAH is doing something other than what Dr. Spencer, Dr. Christy, and Dr. Grody documented?”

I have no idea at all what those named individuals have documented, publicly or privately . . . you?

UAH_Satellites
bdgwx
Reply to  ToldYouSo
June 14, 2025 11:36 am

No, NOAA-7 was on of them. As the graph that you yourself posted shows, UAH stopped using its data in 1985.

Just to be clear . . . UAH still uses NOAA-7 for data prior to 1985. For example, the plot you see in UAH’s monthly updates today uses NOAA-7 and all of the other MSU-capable satellites shown in the plot by Dr. Spencer. It’s just that when they drift too much or are otherwise nonoperational, UAH stops using data from those satellites from that point on. But the data provided previously is still being used.

Perhaps you want to claim that Google is hallucinating here?

No I don’t. Google’s AI response in the picture you posted isn’t hallucinating here. It is consistent with UAH publications.

Thus, I can only conclude that you have confused “using data from” with “managing sensing satellites” . . . big difference!

None of your preferred sources say that UAH uses F-16, F-17, or F-18.

I have no idea at all what those named individuals have documented, publicly or privately . . . you?

Let me get this straight . . . you have no idea what Dr. Spencer, Dr. Christy, or Dr. Grody have said regarding how UAH works? Is this what you’re telling me and anyone else interested in this discussion?

you?

Yes. I’ve read their publications if this is what you’re asking.

Reply to  bdgwx
June 14, 2025 5:06 pm

“None of your preferred sources say that UAH uses F-16, F-17, or F-18.”

Once again a careless intentional post made without consideration of truth.
See attached screen-grab of Google AI response to the question, a “preferred source” (hah!) that I used above.

Furthermore,
“The Defense Meteorological Satellite Program (DMSP) is a long-term meteorological program of the United States Department of Defense (DoD) and the National Oceanic and Atmospheric Administration (NOAA) . . . There are three satellites currently operational in the series, DMSP F-16, DMSP F-17, and DMSP F-18.
“Block 5D-3 satellites each carry six sensors to provide atmospheric, gravitational, land, and snow and ice measurements on a daily basis . . . the Special Sensor Microwave Imager Sounder (SSM/IS) is a multi-purpose imaging microwave radiometer which measures the thermal microwave radiation of the earth, with applications in global measurements of air temperature profiles, humidity profiles and other atmospheric measurements.
“NOAA/NESDIS-NSIDC (National Snow and Ice Data Center) and NOAA-NGDC (National Geophysical Data Center) in Boulder, CO have established a collection of digital satellite imagery acquired from the DMSP series of the US Air Force under NOAA-NESDIS contract. Hence, DMSP imagery data are available for the general user community (among them the SSM/I, SSM/T and SSM/T-2 sensor data). These data are prepared from a global, digital intensity file used operationally by the Air Force in forecasting and are subsequently archived at NGDC [DMSP imagery are archived after operational use (usually 45 to 60 days)]. A digital archive of DMSP data has been operational at NGDC since April 1992 (NGDC receives two 5 GByte tapes in compressed format every day). Archival services are continually upgraded.”
https://directory.eoportal.org/satellite-missions/dmsp-block-5d#references , © 2025 {my bold emphasis added}

UAH_Use_DMSP_Satellite_Data
bdgwx
Reply to  ToldYouSo
June 14, 2025 7:48 pm

See attached screen-grab of Google AI response to the question, a “preferred source” (hah!) that I used above.

That response is different and is inconsistent with the methodology published by Dr. Spencer and Dr. Christy.

Note that F-16 is not the same satellite as NOAA-16. F-16 was launched into orbit on October 13, 2003 while NOAA-16 was launched into orbit on September 21, 2000.

It is similarly true for F-17/18 and NOAA-17/18 as well.

The hallucination here may be a result of the fact that both F-16 and NOAA-16 were brain children of the DMSP and use the same technology. The main difference is that the F series satellites are military while the NOAA series satellites are civilian.

Reply to  bdgwx
June 15, 2025 8:15 am

Just hopeless. Goodbye.

MarkW
Reply to  TheFinalNail
June 12, 2025 3:38 pm

What matters is the last 50 years.

Reply to  MarkW
June 15, 2025 9:20 am

50 years now!?

A couple of years ago, less than 10 years was plenty for Monckton.

It’s a moveable feast, isn’t it?

Reply to  TheFinalNail
June 13, 2025 8:58 am

“Over the past ~20-years the warming rate in UAH has been more or less identical to that of GISS (+0.29 and +0.30C per decade respectively since 2005).”

TOTALLY FALSE.

“The Version 6.1 global area-averaged linear temperature trend (January 1979 through May 2025) remains at +0.15 deg-C/decade (+0.22 C/decade over land, +0.13 C/decade over oceans).”
— Roy W. Spencer, https://wattsupwiththat.com/2025/06/05/uah-v6-1-global-temperature-update-for-may-2025-0-50-deg-c/

To paraphrase, “You can fool some of the people all of the time . . . but you can’t fool all WUWT readers even some of the time.”

bdgwx
Reply to  ToldYouSo
June 13, 2025 10:32 am

TOTALLY FALSE.

TFN is correct.

To paraphrase, “You can fool some of the people all of the time . . . but you can’t fool all WUWT readers even some of the time.”

You got fooled by not paying attention to the details. Dr. Spencer’s statement is for the period 1979/01 to 2025/05. TFN’s statement is for the period starting in 2005. The trend from 1979 to 2025 is different from (and lower than) the trend from 2005 to 2025, which might be an indication that the warming has accelerated in the most recent 2 decades.

Reply to  bdgwx
June 13, 2025 11:42 am

TFN is incorrect, and no, I did not get fooled by “not paying attention to the details”.

Dr. Spencer refers to a linear curve fit for about 46 years of data. That curve fit is applicable for the last 20 years (2005 to 2025). It appears to be YOU who wants to cherry-pick just the last twenty years of data, especially with the anomalous spike-up in temperature from mid-2023 to present that creates a different slope.

You offer no argument for why the mid-2023-to-present spike in GLAT is to be believed as a real change in long-term slope.

However, I can accept the fact that you believe TFN’s assertion . . . as referenced by my specific reference to not fooling “all WUWT readers even some of the time.”

bdgwx
Reply to  ToldYouSo
June 13, 2025 1:28 pm

Dr. Spencer refers to a linear curve fit for about 46 years of data. That curve fit is applicable for the last 20 years (2005 to 2025).

That’s just patently false. You can prove this out for yourself in Excel. Download the data from here and put it in column A in your spreadsheet. In cell B1 enter the formula “=@LINEST(A1:A558) * 120”. In cell B2 enter the formula “=@LINEST(A314:A558) * 120”. You get +0.15 C.decade-1 and +0.30 C.decade-1 respectively like what TFN said.
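[For readers without Excel handy, the spreadsheet check above can be sketched in Python. This assumes, as described, one monthly anomaly value per row; the series below is synthetic, purely to demonstrate that the ×120 factor converts LINEST’s per-month slope into degrees per decade.]

```python
import numpy as np

def trend_per_decade(series):
    """Least-squares slope of a monthly series, scaled to C/decade.

    Mirrors the spreadsheet formula =LINEST(range) * 120: LINEST
    against default x-values 1, 2, 3, ... yields a per-month slope,
    and 120 months = 1 decade.
    """
    x = np.arange(1, len(series) + 1)
    slope = np.polyfit(x, series, 1)[0]   # per-month slope
    return slope * 120

# Synthetic check: a series warming at exactly 0.2 C/decade.
months = np.arange(558)
fake = 0.2 / 120 * months
full = trend_per_decade(fake)            # whole record (rows 1-558)
recent = trend_per_decade(fake[313:])    # rows 314-558, as in cell B2
print(round(full, 3), round(recent, 3))  # 0.2 0.2
```

With the actual UAH anomaly file in place of the synthetic series, the full-record and 2005-onward slopes will of course differ, which is the point in dispute.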

It appears to be YOU that wants to cherry pick just that last twenty years of data, especially with the anomalous spike-up in temperature from mid-2023 to present that will create a different slope.

I’m not the one picking 20 years here. Dr. Spencer and Dr. Christy picked that duration as being enough to conclude whether there is warming or not.

You offer no argument for why the mid-2023-to-present spike in GLAT is to be believed as a real change in long-term slope.

I wasn’t asked to. And I’m not sure I would defend that argument even if I was.

However, I can accept the face that you believe TFN’s assertion

Math isn’t something you get to decide whether you want to believe it or not. It would be better to say I accept it because that’s what the math says.

Reply to  bdgwx
June 14, 2025 10:00 am

Gee . . . do you not understand that a linear curve fit over a given set of time-dependent data (e.g., Dr. Spencer’s cited January 1979 through May 2025 data set of GLAT, v6.1) is mathematically applicable to any time interval within that data set . . . be it a single year, a five-year interval, or even the interval from 2005 to the present?

Doing an Excel spreadsheet for a subset of data within the cited total span of data is nothing more than “cherry picking”, and a sophomoric misunderstanding of what a dataset-wide linear regression analysis means.

For example, do you really expect anyone to believe that a linear curve fit from January 2023 to January 2025 is representative of anything realistic? Hint for you: it would yield a GLAT rise rate of about 2.4 C/decade, more than ten times the cited linear fit slope! YIKES, the sky is falling . . . /sarc.

“I’m not the one picking 20 years here. Dr. Spencer and Dr. Christy picked that duration as being enough to conclude whether there is warming or not.”

You say that, but I note that, since the last revision to the UAH baseline period for establishing the temperature anomalies, Dr. Spencer has been consistent in his posts (most recently here on WUWT at https://wattsupwiththat.com/2025/06/05/uah-v6-1-global-temperature-update-for-may-2025-0-50-deg-c/ ) in noting that the “global area-averaged linear temperature trend (January 1979 through May 2025) remains at +0.15 deg. C/decade.” That’s 46 years, not 20 years.

On this point, both NASA and NOAA agree that climate is weather over a specified geographical location averaged over a period of 30 years or longer. I am sure that Dr. Spencer and Dr. Christy agree with this position, thereby calling into doubt your assertion that they “picked 20 years” as being sufficient to document a warming climate, as opposed to momentarily warming weather.

bdgwx
June 12, 2025 3:20 pm

These adjustments frequently increase recent temperatures and decrease older temperatures, thereby inflating long-term warming trends.

It’s actually the opposite. The corrections actually reduce the overall warming trend as compared to the uncorrected data. The corrections increase older temperatures.

comment image

[Hausfather 2017]

bdgwx
Reply to  Anthony Watts
June 12, 2025 5:56 pm

Anthony, thank you for taking the time to respond.

I’m not anonymous. My name is Brian Gideon, which I happily provide when people ask. My username is my initials plus wx, which is the ITU-R M.1172 shorthand for “weather”.

You can see in the source code (specifically fetch.py and gio.py) that GISTEMP uses the QCF GHCNm file (available here) and per the documentation this is the adjusted data.

You can also see that GISTEMP uses the netcdf ERSST files (available here), though they are converted into the proprietary SBBX file by the included Fortran tool. And per [Huang et al. 2017] this dataset already includes the adjustments.

So as you can see, GISTEMP does not currently perform its own adjustments, with one important exception. That is the step2.py file, which performs the urban adjustment that reduces the trend for land stations with high brightness values relative to the data in the QCF file. GISTEMP does not perform the pairwise homogenization algorithm (or a similar method) for homogenizing the land stations. That is done by GHCN.

Note that GISTEMP and NOAAGlobalTemp have very similar temperature trajectories because they both use GHCNm-qcf and ERSST. The minor differences you see are primarily the result of the gridding and infilling strategies. So the graph I posted above showing the adjustments is representative of GISTEMP as well.

I’ll also point out that Berkeley Earth does their own independent analysis without using adjustments and gets a result very close to GISTEMP, NOAAGlobalTemp, and others. BEST refers to their approach as the “scalpel” method, which splits the station record into distinct timeseries (effectively treating them as if they were separate stations) as opposed to homogenizing (or adjusting/correcting) the station record. [Rohde et al. 2013]

Thanks again for taking the time to respond.

Reply to  bdgwx
June 13, 2025 3:39 am

“I’ll also point out that Berkeley Earth does their own independent analysis without using adjustments and gets a result very close to GISTEMP, NOAAGlobalTemp, and others.”

All of which is bogus data, that does not resemble the historic temperature records from around the world. The legitimate, written, historic, regional temperature records do NOT show the same temperature profile as all those bogus data bases you mention. The original, historic temperature data shows it was just as warm in the recent past as it is today. There is no bogus Hockey Stick “hotter and hotter and hotter” temperature profile in the original data.

Since all those databases supposedly derived their data from the historic, regional temperature data, which is the only data available to researchers, the question is: How does one get a “hotter and hotter and hotter” Hockey Stick temperature profile out of temperature data that has no Hockey Stick profile?

Answer: You Lie and Cheat and bastardize the data to look completely different from the historic, written records.

Bastardizing the temperature record is the biggest, most expensive Science Fraud in human history.

bdgwx
Reply to  Tom Abbott
June 13, 2025 11:28 am

All of which is bogus data

The original, historic temperature data shows it was just as warm in the recent past as it is today.

In one sentence you’re telling me the data is bogus while claiming in another that this data shows that it was just as warm in the past as it is today. What gives?

Mr.
Reply to  bdgwx
June 13, 2025 8:05 am

Wow.
With all that cosmetic surgery being done to the poor old temperature records, they must come out looking like the data version of Maxine Waters.

John Power
Reply to  bdgwx
June 14, 2025 3:06 pm

“I’ll also point out that Berkeley Earth does their own independent analysis without using adjustments and gets a result very close to GISTEMP, NOAAGlobalTemp, and others.”
 
If making adjustments produces a result that is not significantly different to that produced (by Berkeley Earth) without making them, I can’t see any scientific value in anyone making the adjustments at all. Can you?

bdgwx
Reply to  John Power
June 14, 2025 5:00 pm

Can you?

Absolutely. If your methodology requires accurate data and you know that data is biased, then you must correct the bias; otherwise you are guilty of scientific misconduct or unethical behavior at best, or perhaps liable to criminal prosecution at worst.

Note that Berkeley Earth’s “scalpel” technique was novel in that it didn’t require the correction to be applied.

Reply to  bdgwx
June 14, 2025 5:21 pm

If your methodology requires accurate data and you know that data is biased then you must correct the bias otherwise you are guilty of scientific misconduct or unethical behavior at best or perhaps criminal prosecution at worst.

Define very precisely what your definition of “bias” actually is.

You throw out a term like “accuracy” without showing how you know when a recorded measurement is inaccurate.

To make a judgement that measurements are inaccurate and/or biased requires very detailed procedures, especially when dealing with measurements from a century ago.

John Power
Reply to  bdgwx
June 16, 2025 6:14 pm

You seem to have missed the point of my question, BG. It was specifically about the scientific value of the adjustments, not their perceived ethical or legal implications as you appear to have imagined. 
 
However, since you have raised these issues, let me say that whereas you hold the view that it could be scientifically unethical for the data-producers not to make the adjustments, I hold the opposite view that it could be scientifically unethical for the data-producers to make them, because it automatically degrades the true information-content of the whole data-set by introducing false, inauthentic data of human origin into it and thereby increasing its total entropy in accordance with the 2nd Law of Thermodynamics.

bdgwx
Reply to  John Power
June 16, 2025 8:05 pm

It was specifically about the scientific value of the adjustments

I believe the scientific value is adequately described in the GUM [JCGM 100:2008]. A correction is a value added algebraically to the uncorrected result of a measurement to compensate for systematic error and is equal to the negative of the estimated systematic error. And since error is the result of a measurement minus a true value of the measurand, that means a correction necessarily results in a measurement that more closely agrees with the true value. That is its scientific value . . . better agreement with the true value.
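[The GUM bookkeeping described here reduces to sign conventions, which a two-line sketch makes concrete; the numbers below are illustrative only, not from any real station record.]

```python
# GUM conventions: error = measurement - true value, and
# correction = -(estimated systematic error), so
# corrected result = measurement + correction.
measurement = 15.7          # illustrative raw reading, deg C
estimated_sys_error = 0.4   # illustrative bias estimate, deg C

correction = -estimated_sys_error
corrected = measurement + correction
print(round(corrected, 1))  # 15.3
```

The entire dispute in the replies that follow is over whether `estimated_sys_error` can be quantified at all for measurements made decades ago, not over this arithmetic.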

I hold the opposite view that it could be scientifically unethical for the data-producers to make them, because it automatically degrades the true information-content of the whole data-set by introducing false, inauthentic data of human origin into it

The GUM disagrees. It says a prudent person will apply an appropriate correction and that a measurement model function should include corrections. There is a lot of discussion of corrections contained throughout. There is even an example that focuses on the application of a correction to temperature measurements. The interesting thing about the example is that the correction being applied is “predicted”.

and thereby increasing its total entropy in accordance with the 2nd Law of Thermodynamics.

I’m not sure what this means so I cannot comment on it. Instead I’ll leave you with what I hope is a thought-provoking question.

You are collecting measurements related to some phenomenon which enables you to more effectively make a life-or-death decision related to future occurrences of that phenomenon. You discover that your measurements have a systematic error of E. Do you discard the measurements, make your decisions based on the raw measurement values, or make your decisions based on the corrected measurement values?

I know what I would do. I’d make my decisions based on the corrected measurement values because I’m certainly not going to be the cause of someone dying because of a dogmatic belief that corrections are never justified.

Reply to  bdgwx
June 17, 2025 7:19 am

I believe the scientific value is adequately described in the GUM

Who is gaslighting now?

The GUM disagrees. It says a prudent person will apply an appropriate correction

As you and the climatologists were not responsible for making the air temperature measurements back in the 1930s, you and they have no legitimate basis on which to apply your gut feelings about what the historic data should be. IOW, the GUM is referring to corrections that are part of a measurement procedure, not something that is done generations later.

You are grasping at straws and cherry picking for anything to support your pseudoscience.

Reply to  karlomonte
June 17, 2025 10:32 am

bdgwx shows his lack of real world experience once again.

A machinist can get a good estimate of the systematic bias in his micrometer through his measurement procedure – i.e. using a calibrated gage block to measure the offset both before and after making his measurement. A jeweler can do something similar by using a calibrated weight on his scale both before and after measuring his product.

It’s called QUANTIFYING the systematic uncertainty.

How in Pete’s name does bdgwx do the quantification of systematic uncertainty for a measurement station located in East Ambrosia, Texas? How does he do that for a measurement made at that station in 1960?

Answer? > He doesn’t. He makes a subjective guess based on what he wants it to be.

Reply to  bdgwx
June 17, 2025 7:21 am

A correction is a value added algebraically to the uncorrected result of a measurement to compensate for systematic error and is equal to the negative of the estimated systematic error”

As usual you have absolutely ZERO understanding of the GUM. Here is what it says. I have bolded the applicable part which you ignore.

“3.2.3 Systematic error, like random error, cannot be eliminated but it too can often be reduced. If a systematic error arises from a recognized effect of an influence quantity on a measurement result, hereafter termed a systematic effect, the effect can be quantified and, if it is significant in size relative to the required accuracy of the measurement, a correction (B.2.23) or correction factor (B.2.24) can be applied to compensate for the effect. It is assumed that, after correction, the expectation or expected value of the error arising from a systematic effect is zero.”

Very few of the systematic bias effects in temperature measurements can be quantified. This usually requires performing a calibration run both prior to and after the measurements are taken.

In 2.2.3 of the GUM it states: “NOTE 3 It is understood that the result of the measurement is the best estimate of the value of the measurand, and that all components of uncertainty, including those arising from systematic effects, such as components associated with corrections and reference standards, contribute to the dispersion.”

Contributing to the dispersion means an increase in the measurement uncertainty.

And since error is the result of a measurement minus a true value of the measurand that means a correction necessarily results in a measurement that more closely agrees with the true value.”

You *still* refuse to understand that you can *NEVER* know the true value so it is impossible to quantify the “error”.

As stated in Annex B of the GUM:

B.2.3
true value (of a quantity)
value consistent with the definition of a given particular quantity
NOTE 1 This is a value that would be obtained by a perfect measurement.
NOTE 2 True values are by nature indeterminate.”

You discover that your measurements have a systematic error of E. Do you discard the measurements, make your decisions based on the raw measurement values, or make your decisions based on the corrected measurement values?”

How do you discover that a measuring station has a systematic error of E? How do you quantify the value of E? Calibration drift is a VARIABLE, it is not a constant over time or across different instruments.

What you are trying to justify is GUESSING at an adjustment. Guesses ADD measurement uncertainty, they don’t decrease it!

———————————————–
The proper way to handle this is as a Type B measurement uncertainty which gets included in the measurement uncertainty budget and is propagated into any result that uses the measurement.
———————————————-

From the GUM:
“4.3.2 The proper use of the pool of available information for a Type B evaluation of standard uncertainty calls for insight based on experience and general knowledge, and is a skill that can be learned with practice. It should be recognized that a Type B evaluation of standard uncertainty can be as reliable as a Type A evaluation, especially in a measurement situation where a Type A evaluation is based on a comparatively small number of statistically independent observations.”

If you are GUESSING at a systematic bias, then the proper way to do it is to include it in the measurement uncertainty and *NOT* to change the actual measurement data.
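[The alternative this comment advocates, carrying an unquantified systematic effect as a Type B uncertainty component instead of altering the data, amounts to a root-sum-square combination of independent standard uncertainties. A sketch with made-up numbers:]

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square (quadrature) combination of independent
    standard-uncertainty components, following the GUM's
    law-of-propagation form for a measurand that is a simple
    sum of its inputs."""
    return math.sqrt(sum(u**2 for u in components))

# Illustrative numbers only (not from any real station):
u_type_a = 0.2   # deg C, statistical scatter of repeated readings
u_type_b = 0.5   # deg C, assumed bound on an unquantified systematic effect

u_c = combined_standard_uncertainty([u_type_a, u_type_b])
print(round(u_c, 3))   # ~0.539 deg C
```

On this approach the recorded value is left untouched and the suspected bias widens the uncertainty interval instead.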

Reply to  bdgwx
June 17, 2025 8:28 am

A correction is a value added algebraically to the uncorrected result of a measurement to compensate for systematic error and is equal to the negative of the estimated systematic error. And since error is the result of a measurement minus a true value of the measurand that means a correction necessarily results in a measurement that more closely agrees with the true value. That is it’s scientific value…better agreement with the true value.

No wonder climate science is lost in the wilderness.

And since error is the result of a measurement minus a true value

The true value mentioned in the GUM is the value THAT SHOULD HAVE BEEN READ. It is not a value meant to create a splice point that makes a long record.

YOU DONT KNOW THE TRUE VALUE of any measurement made decades ago. There is simply no way to obtain that information. Any proposed “correction” is not based on scientifically obtained measurements at the time of the reading.

Let’s examine what the GUM says about corrections.

3.2.3 Systematic error, like random error, cannot be eliminated but it too can often be reduced. If a systematic error arises from a recognized effect of an influence quantity on a measurement result, hereafter termed a systematic effect, the effect can be quantified and, if it is significant in size relative to the required accuracy of the measurement, a correction (B.2.23 ) or correction factor (B.2.24 ) can be applied to compensate for the effect. It is assumed that, after correction, the expectation or expected value of the error arising from a systematic effect is zero.

What is an “influence quantity”? Again, from the GUM.

B.2.10 influence quantity

quantity that is not the measurand but that affects the result of the measurement

You have not described the influence quantity(s), and the errors they cause, that you have used to determine the values of the correction you would apply.

You have also ignored at least two of my requests to accurately define the “bias” correction that needs to applied and the data used to determine its value.

John Power
Reply to  bdgwx
June 20, 2025 4:35 pm

Re. the ‘scientific value of the adjustments’ you say: I believe the scientific value is adequately described in the GUM [JCGM 100:2008].

I thought you might. That document is 137 pages long! I didn’t think you would expect me to read the whole thing just to find its description of the term ‘scientific value’, so I did a word-search for ‘scientific value’ on the web-site’s own search-engine, which gave me zero returns – nothing; zilch. So it seems to me that, far from describing the ‘scientific value’ of the adjustments ‘adequately’, the GUM doesn’t describe it at all!

FYI, since I was the one who used the phrase ‘scientific value’ originally, I actually didn’t need the GUM or anyone else, including you, to provide me with an adequate description of what I meant by it. I think you will find that my own description of it, which you can read below if you’re interested in knowing it, will prove perfectly adequate for the purposes of this discussion.

A correction is a value added algebraically to the uncorrected result of a measurement to compensate for systematic error…

I was talking about adjustments, BG, not corrections. Do you not know the difference? I would draw you a Venn diagram to explain it visually, but I don’t know how to post pictures in my Comments here.

….That is its scientific value…better agreement with the true value.

To you, I accept that it is.  But how can you test the ‘corrected’ value’s agreement with the true value to know whether or not it really is in better agreement with the true value without knowing what the true value is or was to begin with? It appears to me that there could be some circular reasoning going on here.

“I hold the opposite view that it could be scientifically unethical for the data-producers to make them, because it automatically degrades the true information-content of the whole data-set by introducing false, inauthentic data of human origin into it”

The GUM disagrees. It says a prudent person…

I don’t care what the GUM disagrees with. I am having this conversation with you, not the GUM.

Now, about the phrase ‘scientific value’….
To me – and this is the sense in which I used the phrase in my original question (here) – the scientific value of any process or activity is the amount of new information that it adds to our existing state of knowledge about the object of interest which, in the present case, is Earth’s global mean surface temperature (GMST). 

So if your statement is true that the Berkeley Earth GMST data-set, which has been produced without data-adjustments, nevertheless produces results that are “very close” to the results produced by other data-set providers that do include adjustments (i.e. NOAA, GISS, etc.), then it appears (on the face of it) that the Berkeley Earth methodology ultimately produces approximately the same amount of information without using adjustments, which I think implies that the amount of information contributed by the adjustments is effectively zero. Zero contribution of information implies zero scientific value, to my way of seeing it.
                                                             
“and thereby increasing its total entropy in accordance with the 2nd Law of Thermodynamics.”
I’m not sure what this means so I cannot comment on it.

That’s a pity, as the 2nd Law of Thermodynamics has fundamental implications for the thermal behaviour of Earth’s climate system and for the behaviour of information in all other kinds of systems too. I could explain some of those implications for you if you’re interested, but not right here and now as it would make this comment very much longer and would take too long for me to write.

Instead I’ll leave you with what I hope is a thought-provoking question.

Huh? You must think I want to have my thoughts provoked. I assure you I don’t. A quiet mind, unruffled by unnecessary thoughts, is a peaceful mind in which the truth can be known with certainty.

You are collecting measurements related to some phenomenon which enables you to more effectively make a life-or-death decision related to….
….Do you discard the measurements, make your decisions based on the raw measurement values, or make your decisions based on the corrected measurement values?

Your question contains too much vagueness and ambiguity to permit a definite, unequivocal answer, so I’ll have to leave it unanswered, I’m afraid.

I’m certainly not going to be the cause of someone dying because of a dogmatic belief that corrections are never justified.

I’ve never suggested that you should.

Reply to  John Power
June 17, 2025 6:54 am

The proper way to handle the issue is to include any “supposed” bias in the measurement uncertainty of the data. The proper way is *NOT* to guess at some adjustment which can’t actually be quantified and is only a guess.

Reply to  John Power
June 15, 2025 9:16 am

If you have thousands of measurement data points then changing even as many as 10% in a decimal place is not going to usually change your average value significantly. Making “adjustments” will typically be a losing proposition in that it results in falsified data. If the adjustments *do* result in changing the average significantly then the data should be abandoned as not fit for purpose. This is why “infilling” and homogenization of temperatures is truly a joke. If it results in changing the average significantly then your data is garbage to begin with!

John Power
Reply to  Tim Gorman
June 17, 2025 10:16 am

All good points, Tim! Well said!
 
I believe you’re right that making ‘adjustments’ to the raw data (especially by ‘infilling’ and ‘homogenization’) would tend to change the average. I think it would also tend to change the standard deviation as well.
 
Perhaps the combination of these two commonplace statistical parameters would give us a viable objective means of measuring the extent to which the adjustments have changed the basic characteristics of the original data-sets?

Reply to  John Power
June 17, 2025 10:45 am

You’ll never get such a thing. How often have you seen anything associated with a climate science paper (or even a comment) mention the variance of any data set? Climate science doesn’t even recognize that warm temperatures have a smaller variance than cool temperatures and therefore a weighting factor should be applied to each – but to calculate the weighting factor you need to have the variances, which are never calculated.

Climate science treats averages as DATA, *measurement* data, instead of just one of a variety of statistical descriptors. Statistical descriptors are *not* measurement data. When you combine northern hemisphere and southern hemisphere temperature data for February, (as an example) in order to get a “global average” for February what do you suppose you will get? You will get a multi-modal distribution of temperatures. What does the average value of a multi-modal distribution tell you when it is the only statistical descriptor provided?

Climate science always uses the excuse “the anomalies” make everything ok – while never mentioning that the anomalies inherit the variances of the absolute temps. Meaning that the anomalies will generate a multi-modal set of data as well!

It’s all crap from the very beginning and it never gets any better with more and more averaging.

bdgwx
Reply to  John Power
June 17, 2025 12:30 pm

I believe you’re right that making ‘adjustments’ to the raw data (especially by ‘infilling’ and ‘homogenization’) would tend to change the average. I think it would also tend to change the standard deviation as well.

It does. Here is a data denial demonstration showing what happens when you infill.

A: {1, 3, 3, 4, 5, 7, 7, 8, 9, 9}

B: {1, 3, ?, 4, 5, ?, 7, 8, 9, 9}

mean(A) = 5.6

mean(B) = 5.75

infill(B) = {1, 3, (3+4)/2, 4, 5, (5+7)/2, 7, 8, 9, 9}

mean(infill(B)) = 5.55

stdev(A) = 2.797
stdev(B) = 2.964
stdev(infill(B)) = 2.713

A is a sample of “true” values.

B is a subsample of A with 2 values withheld.

infill(B) uses a local regression strategy (not unlike kriging) to infill the missing values.

Infilling definitely changes both the average and stdev of B.

However notice that infill(B) is a closer match to A than the unadjusted set B.

The real world is far more complex, but what this demonstration does is falsify the hypothesis that infilling is universally bad.
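The arithmetic in the demonstration is easy to reproduce; here is a minimal Python sketch, using the neighbour-average infill exactly as written in the comment:

```python
from statistics import mean, stdev

# "True" sample A, and subsample B with two values withheld.
A = [1, 3, 3, 4, 5, 7, 7, 8, 9, 9]
B = [1, 3, 4, 5, 7, 8, 9, 9]

# Infill each missing value with the average of its two neighbours,
# as written in the comment: (3+4)/2 and (5+7)/2.
B_filled = [1, 3, (3 + 4) / 2, 4, 5, (5 + 7) / 2, 7, 8, 9, 9]

print(mean(A), stdev(A))                # 5.6, ~2.797
print(mean(B), stdev(B))                # 5.75, ~2.964
print(mean(B_filled), stdev(B_filled))  # 5.55, ~2.713
```

With this infill the mean and the standard deviation of the infilled set both land closer to A’s values than the gappy set B’s do, which is the point being made; note that the infilled standard deviation works out to about 2.71.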

Perhaps the combination of these two commonplace statistical parameters would give us a viable objective means of measuring the extent to which the adjustments have changed the basic characteristics of the original data-sets?

Yes they do. That is not to say that there aren’t other ways to quantify the effect of adjustments and/or determine their usefulness.

BTW…if you want to do this infilling data denial experiment with actual station data so that you can see what happens in a realistic scenario let me know.

Reply to  bdgwx
June 17, 2025 12:59 pm

A is a sample of “true” values.

Of which you have ZERO knowledge.

Only guesses and gut feelings.

Reply to  bdgwx
June 17, 2025 4:15 pm

BTW…if you want to do this infilling data denial experiment with actual station data so that you can see what happens in a realistic scenario let me know.

Here is a screenshot of a TV station’s weather view of current temperatures.

Tell everyone what your algorithm predicts the temperature under the pink dot will be.


I’ve asked you this before and you failed to answer. If you believe in your math, it should be an easy determination.

There are a number of other issues with your example. Let’s assume these are temp values, maybe °C in winter somewhere.

For Ex A

Significant digits / resolution uncertainty

  • The average should be shown with 1 significant figure

Uncertainty calculation

  • Using NIST TN 1900
  • u = 0.9
  • U = 2.262 × 0.9 = 2 (DF = 9)

Final value

  • 6 ±2

That makes the interval where the actual value may lie:

  • 4 to 8

The uncertainties of the other two calculations do not change anything.

Your math gyrations are meaningless with this large value of uncertainty.

The upshot? Throw the data away, it is not fit for purpose.
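The NIST TN 1900-style calculation above can be sketched in a few lines of Python: `u` is the standard deviation of the mean of Example A’s ten values, and 2.262 is the Student-t coverage factor for 95 % coverage with 9 degrees of freedom, as in the comment:

```python
from math import sqrt
from statistics import mean, stdev

data = [1, 3, 3, 4, 5, 7, 7, 8, 9, 9]  # Example A, read as 10 repeated readings

n = len(data)
x_bar = mean(data)             # 5.6
u = stdev(data) / sqrt(n)      # standard uncertainty of the mean, ~0.88
k = 2.262                      # t(97.5 %, DF = 9) coverage factor
U = k * u                      # expanded uncertainty, ~2.0

print(f"{round(x_bar)} ± {round(U)}")  # 6 ± 2, i.e. the interval 4 to 8
```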

June 12, 2025 4:18 pm

When you produce nothing but pointless BS you should be wondering how long can it go on.

How many “scientists” actually believe trace amounts of CO2 can warm Earth through radiative forcing!

No scientist could believe such nonsense.

Reply to  RickWill
June 12, 2025 5:21 pm

Scientists can calculate IR band absorption. You should try Modtran. About 1 degree per CO2 doubling, IIRC. (1 degree more surface temp emits 5.35 more watts of IR).
Scientists believe GW by CO2 cuz they can calculate it. However, real-world data shows discrepancies from the calculations, because many surface readings change by a degree in a few minutes and vary among measurement stations close to each other, across weather fronts, under rain clouds, etc. But Modtran is approximately correct.
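As a back-of-envelope check on the commenter’s figures (assuming a mean surface temperature near 288 K, and the conventional simplified forcing expression 5.35·ln(C/C0) W/m², neither of which is stated explicitly in the comment):

```python
from math import log

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T = 288.0                # assumed mean surface temperature, K

# Sensitivity of blackbody emission to temperature: d(sigma*T^4)/dT = 4*sigma*T^3
dF_dT = 4 * SIGMA * T**3          # ~5.4 W/m^2 per kelvin of surface warming

# Simplified CO2 forcing for a doubling: 5.35 * ln(2)
dF_2xCO2 = 5.35 * log(2)          # ~3.7 W/m^2

# No-feedback warming implied by a doubling
print(dF_dT, dF_2xCO2, dF_2xCO2 / dF_dT)  # ~5.42, ~3.71, ~0.68 K
```

This reproduces the two numbers in the comment: about 5.4 extra watts of surface IR per degree, and "about 1 degree" (before feedbacks) per CO2 doubling.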

Gavin Schmidt of GISS has written this:
“If, for instance, CO2 concentrations are doubled, then the absorption would increase by 4 W/m2, but once the water vapor and clouds react, the absorption increases by almost 20 W/m2 — demonstrating that (in the GISS climate model, at least) the “feedbacks” are amplifying the effects of the initial radiative forcing from CO2 alone.” This is NOT CORRECT, due to the failure to include the albedo-caused reduction of incoming sunlight, but it is picked up as “the general consensus” by AI chatbots…and AI has not got enough “I” to find the discrepancy…a real problem for climate-change truth…

Reply to  DMacKenzie
June 12, 2025 7:21 pm

Scientists can calculate IR band absorption.

Wacko.

If only it had any relationship to how Earth’s radiation balance and temperature are controlled.

You do not need to be a rocket scientist to appreciate what controls Earth’s energy balance. Just go outside on any sunny day and walk in and out of shade. Then look at what clouds do.

Unless you have a deterministic basis for the process of cloud formation you are playing silly games. And that is all GISS has done since it started climate modelling.

NASA need look no further than JPL to understand why Earth has been warming up since 1700.
https://wattsupwiththat.com/2025/05/04/high-resolution-earth-orbital-precession-relative-to-climate-weather/

Reply to  RickWill
June 12, 2025 10:39 pm

to paraphrase you….no scientist believes CO2 forcing can raise the temperature

That is just unscientifically silly beyond belief. You’re an electrical engineer, you can read some background info on IR absorption of gases and catch on.
If you assume planetary albedo is 0.3 (which I have already stated isn’t locally correct due to cloud albedo) then greenhouse gases warm the planet by roughly 33 C. Although CO2 is 400 ppm and water is much higher, H2O is only a few ppm 10 km up, plus CO2 has a much larger molecular cross section for IR, something like 25 times, with the net result that CO2 is responsible for about 1/3 of that 33 C greenhouse effect IIRC. Programs like Hitran can calculate that….saying it’s wacko is just going to cause a loss of respect for your otherwise reasonable precession calcs and “cloudy is cool” statements….
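A quick check of the “roughly 33 C” figure, assuming a solar constant of about 1361 W/m² and an observed mean surface temperature of 288 K (both assumptions; the comment gives only the albedo):

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0              # assumed solar constant, W/m^2
ALBEDO = 0.3             # planetary albedo, as in the comment
T_SURFACE = 288.0        # assumed observed mean surface temperature, K

# Radiative balance: S0*(1 - a)/4 = sigma * Te^4
T_eff = (S0 * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25  # ~255 K

print(T_eff, T_SURFACE - T_eff)  # ~254.6 K effective, ~33 K greenhouse effect
```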

Reply to  DMacKenzie
June 13, 2025 5:31 am

You need to start looking at ice and its conduction, absorption, transmission and reflection – whether on the water, on the ground or in the atmosphere. All the big climate shifts are driven by ice. The inherent stability of Earth’s climate is driven by ice.

Understand ice and you are at the beginning of the journey in understanding Earth’s climate.

Modtran is a dinky toy looking at irrelevant carp. It does not even begin to cope with what ice does in the atmosphere.

Reply to  RickWill
June 13, 2025 7:21 am

Ice is just solid water, with a continued reduced vapor pressure at lower temperature. Clouds are liquid water and ice crystals. One degree adds 7% more water molecules to the air above the water surface. Water covers 70% of the Earth’s surface as ocean, 10% more as ice and snow….so water controls Earth’s climate, yes…but we’re talking about whether doubling CO2 has enough influence on atmospheric IR absorption to affect the temperature by 0.3% above its already 285 K above outer space. Turns out it does, if you also include the extra water vapor that results from the 0.3% warmer temp….
Your point should be that this isn’t any crisis worth spending tax $ on, other than nuclear power plants.

Reply to  RickWill
June 13, 2025 9:26 am

The inherent stability of Earth’s climate is driven by ice.

Understand ice and you are at the beginning of the journey in understanding Earth’s climate.

I think not.

The enthalpy of fusion of water ice (solid <—> liquid) is 333.6 kJ/kg. In comparison, the enthalpy of vaporization of water (liquid <—> vapor) is 2260 kJ/kg, about seven times greater.

Additionally, currently only about 2% of the world’s total mass of water (all phases) is present as ice.

Considering the hydrologic cycle as it occurs on average over the oceans and land surfaces and in the troposphere of Earth (especially as clouds), it is liquid water and its evaporation and condensation that establish Earth’s climate and its long-term stability.

“The great tragedy of Science – the slaying of a beautiful hypothesis by an ugly fact.”
— Thomas Huxley
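The “about seven times” comparison above is straightforward to verify from the figures quoted:

```python
H_FUSION = 333.6         # kJ/kg, enthalpy of fusion of water ice (from the comment)
H_VAPORIZATION = 2260.0  # kJ/kg, enthalpy of vaporization of water (from the comment)

ratio = H_VAPORIZATION / H_FUSION
print(round(ratio, 1))   # 6.8, i.e. "about seven times greater"
```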

Sparta Nova 4
Reply to  DMacKenzie
June 13, 2025 7:53 am

I am a rocket scientist and systems engineer. 50 years experience.
One of the positions I held was electro-optical infra red sensors.
You are wrong.
I refuse to engage in a debate with a religious zealot so I will not.

Michael Flynn
Reply to  Sparta Nova 4
June 14, 2025 2:43 am

Sparta, if you agree that the surface was originally molten, what facts lead you to think that the surface has not cooled to its present temperature from a much hotter one?

Experiment (since John Tyndall) and experience show that adding CO2 to air does not make it hotter. Four and a half billion years of continuous sunlight has not prevented the surface from cooling.

Surface temperatures due to unconcentrated sunlight, or the lack thereof, vary between around 90 C and -90 C.

All thermometer temperatures can be explained without the need for supernatural influences, such as the impossible-to-actually-describe GHE!

Sorry, but why abandon science in favour of religion?

Sparta Nova 4
Reply to  Michael Flynn
June 16, 2025 7:08 am

You have read my comments before. It is incredible that you think I am abandoning science in favor of religion, especially since:

“I refuse to engage in a debate with a religious zealot “

Reply to  Sparta Nova 4
June 14, 2025 2:59 pm

I also have many years’ experience in detection of water vapor and CO2 in natural gas streams – IR, UV, chilled mirror, and chemical sensors – related to calculating performance of heat transfer and mass absorption equipment. And just what numbers have I given that are wrong? Show your calcs, rocket scientist…cuz you’re not on trajectory…in fact, what is it that you think I am so clearly wrong about? It’s really not clear whether you think CO2 raises the temperature by “a lot” or “not much”.

Sparta Nova 4
Reply to  DMacKenzie
June 16, 2025 7:07 am

Responding to:

plus CO2 has a much larger molecular cross section for IR, something like 25 times, with the net result that CO2 is responsible for about 1/3 of that 33 C greenhouse effect IIRC

June 12, 2025 4:56 pm

The #1 reason to close GISS is not among the 10 items in this article, although the reasons given are valid.
The #1 reason is that Gavin Schmidt, the director of GISS, is a climate activist and a political animal, not a climate scientist. Schmidt made this perfectly clear when I listened to his lecture on 21 September, 2017 (https://www.studlife.com/news/2017/09/21/wash-u-hosts-climate-change-panel-continues-push-toward-increased-sustainability-on-campus).
Schmidt unequivocally asserted that climate has not changed, except due to human activity, in the past century. That seems an important scientific statement, leaving no doubt in the minds of the audience. More than 200 other people heard him assert this repeatedly while pointing at large, impressive graphics. I may have been the ONLY person wondering if he had taken leave of his senses.
Then, I understood what he had said. His game was the old ‘change of playing field‘ game.
‘Climate change’ isn’t the same as ‘climate variability’. He had changed the playing field!
Climate change is defined to be the climate variability due to humans! This means Schmidt’s statement was correct, but completely circular. What Schmidt said was: ANY ‘climate change’, however small, whether positive or negative, is by definition, due to humans.
One does not need a large institute to make that assertion. One may define what one likes; circular arguments remain circular, but the taxpayer should not pay for it.

Jeff Alberts
Reply to  whsmith@wustl.edu
June 12, 2025 5:15 pm

Darn. Thought you were done with your bolding narcissism.

Malcolm Chapman
Reply to  Jeff Alberts
June 15, 2025 8:17 am

Thanks to WHS for the link to Gavin’s tautology – interesting and to the point of this thread. But I agree with you about being too bold.

Bob
June 12, 2025 5:51 pm

I would give them 60 days to show that their work is the gold standard and is what the country should operate with. If they haven’t made their case in sixty days they can use the next 30 days to clear out their offices.

cgh
Reply to  Bob
June 12, 2025 7:18 pm

Why give Gavin & Co. that much time? They have piled up decades of modeling in a host of papers published widely. They have been one of the principal sources of so-called research used by IPCC. That’s more than sufficient evidence to have them terminated now. Three decades of publishing useless garbage is more than sufficient. Nothing is gained by sparing them for another 60 days.

Reply to  cgh
June 13, 2025 3:56 am

Gavin should tell us where the water vapor “hot spot” is.

You see, the Climate Alarmists can’t have a climate catastrophe unless there is a water vapor “hot spot” which is supposedly created by CO2. It’s not the CO2 that causes the overheating, it is the water vapor “hot spot” feedback that causes the temperatures to go out of control, according to the Climate Alarmists.

But there is no “hot spot”! It’s missing! More and more CO2 goes into the air, yet there is no water vapor “hot spot”. We can’t have a climate crisis without a water vapor “hot spot”.

Where’s the water vapor “hot spot” Gavin? Your Climate Crisis theory is missing a piece. An essential piece. A theory-killing piece.

Reply to  Bob
June 13, 2025 3:49 am

Their work does not represent reality, so it is not the “gold standard”, although I guess it could be called the gold standard for temperature data bastardization.

The bogus, bastardized, “hotter and hotter and hotter” Hockey Stick chart does not represent reality.

It was just as hot in the recent past, according to the original temperature records. The bogus Hockey Stick chart does not show this. The bogus Hockey Stick chart is lying to us.

June 12, 2025 6:28 pm

GISS was established in 1961 to support NASA’s planetary science efforts

James Hansen worked at GISS and specialized in studying the atmosphere of Venus, which is 96.5% carbon dioxide, and made a computer model of it. Earth’s atmosphere is 0.04% carbon dioxide, so naturally he made the leap of logic that Venus once had an atmosphere like Earth’s, but stupid humans burned fossil fuels and caused a runaway positive feedback cycle that turned Venus into a smothering hothouse. He created a computer model to “prove” his hypothesis, demonstrating that scientists often make the most ludicrous logical leaps, and are immune to correction once they grab hold of an idea with religious zeal that they can get taxpayer funding for. Hansen is the godfather of global warming alarmism.

abolition man
Reply to  stinkerp
June 13, 2025 3:49 am

James Hansen wanted to play himself in a 1950’s style sci-fi movie adventure. He was hoping the mandatory blonde bombshell assistant would be a Jayne Mansfield look-a-like; instead he got a Twiggy impersonator, and went straight to DVD!

TBeholder
Reply to  stinkerp
June 13, 2025 6:36 am

Stuff like this makes me wonder how many people “working for” the circus are blatantly punking it.

June 12, 2025 9:32 pm

it was bureaucratic repurposing to fit political trends

Typo “trends” is spelt G-O-A-L-S

June 12, 2025 9:37 pm

Plus, they tend to use “hotter” colors in the global maps they produce. This is a visual sleight of hand—technically correct, but intentionally misleading.

It’s the same with “ocean acidification”. Technically correct but intentionally misleading.

I had this argument with Richard Black, formerly the BBC’s climate churnalist

2hotel9
June 13, 2025 5:01 am

Number one reason? Telling lies about climate and weather does not advance America’s space exploration program.

June 13, 2025 5:24 am

GISS (climate) was never indispensable.
