We’ve been busy.
As the proprietor of the “World’s Most Viewed Climate Related Website,” I have a mandate to keep that title. Today, two new permanent additions to the WUWT set of resources are online. Both are unique, both are exclusive, and both are factually based. Even better, both will irritate climate alarmists.
#1 Earth’s Real-Time Temperature
The first is a really simple widget added to the right sidebar, seen below:

Whaaaat! you say? Real-Time Earth’s temperature – impossible! Only certified climate scientists can do that. Well, um….no. We can do it, and we did. Working with a friend of ours who runs a website https://temperature.global/ (who shall remain nameless) and our resident tech wizard, Eric Worrall, we have made use of their API (specially modified at my request) to display the real-time temperature of the Earth. Data has been recorded back to 2015, as you can see in the graph below. It will continue going forward.



How does it work? It is pretty simple really. Thanks to the Internet, all sorts of global temperature data in near real-time (hourly) is available.
Data sources – click links for details:
- NOAA global METARs 2015-current
- NDBC global buoy reports 2015-current
- MADIS Mesonet Data, NOAA OMOs
All that data is gathered hourly, combined into a gridded average, and displayed.
In this calculation, the “normal” temperature of the Earth is assumed to be 57.2°F; the computed global average is simply compared against that baseline to obtain the absolute deviation from “normal” at that moment.
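For readers who want the gist in code, here is a minimal sketch of hourly gridded averaging against a fixed baseline. The grid size, station coordinates, readings, and function names are all invented for illustration; the actual temperature.global code is not public.

```python
# A toy gridded average: stations are bucketed into lat/lon cells, each
# cell is averaged, then the cells are averaged, so a dense cluster of
# stations cannot dominate the global figure. Illustrative only.
import math
from collections import defaultdict

BASELINE_F = 57.2  # the "normal" global temperature assumed in the article

def grid_cell(lat, lon, cell_deg=5.0):
    """Bucket a station into a lat/lon grid cell."""
    return (math.floor(lat / cell_deg), math.floor(lon / cell_deg))

def global_mean_f(readings, cell_deg=5.0):
    """Average stations within each cell first, then average the cells."""
    cells = defaultdict(list)
    for lat, lon, temp_f in readings:
        cells[grid_cell(lat, lon, cell_deg)].append(temp_f)
    cell_means = [sum(v) / len(v) for v in cells.values()]
    return sum(cell_means) / len(cell_means)

# Three hypothetical hourly reports: (lat, lon, temp_F). The two New
# York-area stations land in the same cell and are averaged together
# before they meet the Sydney-area station.
obs = [(40.7, -74.0, 55.0), (40.8, -74.1, 57.0), (-33.9, 151.2, 64.0)]
mean = global_mean_f(obs)
deviation = mean - BASELINE_F
```

Without the gridding step, the two clustered stations would count twice against the lone southern one; with it, each region gets one vote.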
The basis of this number comes from NASA GISS itself, from their FAQ page as seen in August 2016 as captured by the Wayback Machine.



Of course, they’ve removed it now, because well, they don’t want people like me doing this stuff. Only certified climate scientists can do that stuff. /sarc.
#2 Failed Predictions Timeline
We wanted a “one stop shop” that would display all of the bogus predictions (and outcomes) about climate, energy, and the human condition in one easy-to-use package.
This has been a work in progress over several months between myself, Charles Rotter, and Eric Worrall. At least two previous versions have been thrown out and redone (thank you Eric for your patience). We think we’ve got it right.
See it here: https://wattsupwiththat.com/failed-prediction-timeline/
Screencap below.



Note the search features. You can filter by topic, by the person who made the prediction, and by year; the results update accordingly.
Most importantly, each entry comes with an “outcome” section. See below for an example.



Try it out at https://wattsupwiththat.com/failed-prediction-timeline/ and please let us know in comments what you think, and whether we missed anything.
The global warming people are really going to hate you! Actual facts!
“going”?
Sadly, the misanthropic Greens already hate Anthony and anyone else who questions the agenda.
1 – Actually, they hate everyone and think we should all die (example). The left prefers theory and abstraction to reality. They have ablated the part of their brain that comprehends context and common sense (I blame the education system). As such, they fall into conditions that resemble schizophrenia and psychopathology (link). That’s what it takes to actually wish death on their fellow humans.
2 – I realize Anthony provided a link in the story above, but I can’t otherwise find a way to navigate to the failed predictions page.
We cannot be doing this, as we all died long ago.
I, for one, disappeared in a cloud of blue smoke that was washed away under 30′ of rising oceans before being burnt by raging wildfires.
Luxury!.. And you try to tell that to the young people of today..
Tears in my eyes. So true
When I were a kid, we used to ‘ave to write on slates, and in a ‘igh wind it wa’ terrible t’ number o’ kids used to fall off that roof.
He ha ha ROTFLMAO!
Sorry that you “disappeared.” Hopefully you will re-surface soon.
Global warming turned me into a newt.
(But I got better).
(h/t John Cleese in Monty Python and the Holy Grail)
More tears in my eyes. Needed the laugh
The very warmest congratulations to Anthony, Charles, Eric and the team for two first-class initiatives.
The real-time temperature updates will be very widely referred to. And the record will show that the extremists’ predictions are simply not coming to pass.
It would be a good idea to plot the least-squares linear-regression trend on the data since 2015 in real time.
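The suggested trend line is an ordinary least-squares fit, which can be sketched in a few lines of plain Python. The hourly deviations below are invented for illustration; this is not the live temperature.global feed.

```python
# Ordinary least-squares slope, computed directly from the textbook
# formula: slope = cov(x, y) / var(x). Data are made up.

def ols_slope(xs, ys):
    """Least-squares slope of ys against xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

hours = list(range(24 * 365))                 # one year of hourly samples
temps = [0.5 + 0.0001 * h for h in hours]     # pretend deviations, deg F

slope_per_hour = ols_slope(hours, temps)
slope_per_decade = slope_per_hour * 24 * 365.25 * 10
```

Run against the real hourly archive since 2015, the same function would give the live trend Lord Monckton asks for.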
The list of failed predictions will also be very useful. Perhaps the most important omission is the original 0.3 K/decade business-as-usual global-warming prediction made by IPCC (1990). The UAH outturn since then is only 0.136 K/decade.
Well done to all concerned! Two more first-class widgets at WUWT.
The UAH rate will likely decline over the next couple of decades, as their record coincides roughly with both the AMO and PDO entering their decades-long warm phases. Both are indicating that a shift to their cool phases is starting.
I think we may be in for one last intense El Niño before the cool phases of both the AMO and PDO kick in. When they do, we could then be in for the longest Pause ever – perhaps 25 years.
Or “stop”?
Mind you the alarmists will still claim it’s CO2 wot dunnit
Indeed. The Holocene will eventually end and we’ll all be buried under a mile of ice. On the up side, the oceans will drop four hundred feet and we’ll be able to drive between Alaska and Siberia. Maybe we can even bring back the woolly mammoths. 🙂
The alarmists will claim that the heating has stopped because of their sterling efforts to force us to net zero.
The old joke: A man on a London bus was tearing up a newspaper and throwing pieces of paper out of the window. The conductor asked why he was doing that. “To keep the elephants away” was the reply. “But there aren’t any elephants in central London” says the conductor. “Yes, effective isn’t it?” says the man (alarmist).
Christopher,
Caution please with linear trend lines.
The short ones show weather when climate is the context.
They invite the eye to forecast the future, which is invalid.
Apart from that, I join with you to congratulate Anthony, Charles and Eric.
Geoff S
Sherro01 is perhaps unaware of the following facts:
Christopher,
This WUWT blog sometimes calls me sherro01 from my email address and sometimes Geoff Sherrington, my real name. You and I met in Melbourne.
Essentially, we have similar purpose on these matters.
Often, I use the linear least squares fit on temperature/time graphs to assist people to interpret trends on a quick reading. I do not like this approach, but no alternative seems much better.
What I wrote here was brief but hard to argue with. WUWT advertises itself as a Climate Blog.
BTW, I was corresponding with Prof Phil Jones as early as March 2006, before Climategate (longer backstory linked), showing I was fairly well aware of past relevant work.
Cheers Geoff S
https://joannenova.com.au/2012/01/that-famous-email-explained-and-the-first-volunteer-global-warming-skeptic/
How about a section for correct predictions?
I predict that 20,000 years from now that Canada will be covered in ice.
I predict that the next governor of California will mandate things 10 years after he is out of office.
I predict that all the US wind and solar equipment currently part of the US mix will not be producing any power by 2040.
I predict the US will not build a museum of bad ideas to put broken PV panels, wind turbines, BEV, and HFCV in so that future high school students can learn from the mistakes we make over and over.
No open ocean surface will sustain a temperature of more than 30C over an annual cycle.
“Open” means 500nm from any land. The 30C is rounded to the nearest whole number so under 30.5C.
There is hot and dry, and hot and humid. Also cold and dry. Humans have adapted to both for thousands of years.
The grid operator predicts how many will die if the grid goes down on an extreme day.
Twice I have been in the control room of a large nuke plant for a scheduled shutdown on such a day when it was the last plant preventing a blackout.
I do not know if the grid operator was right, because we waited to shut down.
I can also tell you who will tell you those nuke plants are not needed.
I can also tell you about blackouts over the last 30 years. In one case a million people were without power. Kit P was not there because his job went when the nuke plant closed.
In two other cases, the control systems rode out the event stopping the spread of the cascading failure. This was good fortune for those not among the millions that lost power.
One of the lessons learned is to require that new steam plants keep running, providing house loads, when the grid goes away. I was the lead author for the design guide on this issue for our new nuke design.
I can predict what kind of steam plant will provide the energy you need on what fossil fuels you do not have.
There will be a new snowfall record somewhere on the globe every year for the next 10,000 years.
The failed predictions timeline will be especially useful for applied ridicule. A few of my favorites predating the new WUWT timeline:
In 1990, Hansen gave a published interview. He looked out his office window and said that in 20 years’ time, the East Side Manhattan parkway would be under water from sea-level-rise acceleration. Not even close.
In 2000, the UK Met Office’s David Viner got published saying that UK children would soon not know snow. Not even close.
In 2015, EIA figured that onshore wind and CCGT had reached about LCOE parity at about $90/MWh. EIA ‘only’ made a few basic mistakes. In 2016 I published essay ‘True Cost of Wind’ over at Judith’s, simply correcting the major EIA factual errors. In fact, the 2015 LCOE of CCGT was about $58/MWh while onshore wind (excluding subsidies) was about $146/MWh. Not even close to parity.
I will observe that a few of the smarter alarmists have by now learned to make their predictions vaguer and further into the future. Most haven’t—because a vague distant future isn’t alarming.
The alarmists will say these failed predictions weren’t made by “real climate scientists”. 😉
Very hard for alarmists to disavow Jim Hansen and David Viner, both official high priests of their warmunist religion.
I thought the Hansen East Side Manhattan parkway prediction had been confirmed as 40 years?
Not that the number of decades makes any difference
I found the original article. It said 20. When 20 years passed, Hansen then announced he had said 40 and the journalist had misquoted him. The journalist responded: NOPE.
Besides, we are now 33 years in and Manhattan’s East Side Drive is still just fine, still over a meter above high tide along the East River.
Yep, Hansen later backpedaled on his prediction, and of course alarmists completely accept his backpedal, despite the truth staring them in the face.
After all, everyone knows it’s so common to allow someone’s “misquote” of you to stand unchallenged for 20 years.
The book “the coming storm” is where Bob Reiss made the original quotation:”When I interviewed James Hansen I asked him to speculate on what the view outside his office window could look like in 40 years with doubled CO2. I’d been trying to think of a way to discuss the greenhouse effect in a way that would make sense to average readers. I wasn’t asking for hard scientific studies. It wasn’t an academic interview. It was a discussion with a kind and thoughtful man who answered the question.”
Thanks, Anthony, CTM, and Eric for the two new WUWT features. I especially enjoyed the failed prognostications.
Regards,
Bob
It would be very easy to add a juxtaposed WUWT successful alarmist predictions timeline, as there aren’t any that I am aware of.
Two fun climate extinction examples, both from essay No Bodies in ebook Blowing Smoke:
The Queensland white ring tailed lemur was said to have gone extinct (a big deal in the AUS MSM) until several were subsequently spotted. The underlying basic problem was that Queensland ring tailed lemurs come in two color variations, the common chocolate and the uncommon white. Not two species, as presumed.
The Costa Rican golden toad was said to have gone extinct (AR4). It is in fact extinct, having succumbed to fungal chytridiomycosis brought to its small range on Brilliante Ridge in the Costa Rican national park by tourists. Nothing to do with climate.
Two fun climate examples.
Wadhams said the summer Arctic would be ice free by 2014.
USNPS had visitor center signage for many years (taken down winter 2020) saying Glacier National Park would be glacier free by 2020.
I saw this yesterday
‘Extinct’ lion spotted in Chad’s Sena Oura National Park after almost 20 years
https://abcnews.go.com/International/extinct-lion-spotted-chads-sena-oura-national-park/story?id=98752700
There’s a district of Derby, where I live, called Chaddesden, known locally as “Chad,” so I often get stories about the African nation suggested by various searches. Some interesting, some not so.
Rud,
IIRC, the lemur is native to Madagascar alone.
The nick-name lemur has been informally given to one or more possums here.
Geoff S
True. I went back to my book notes. It was the white lemuroid ringtail possum that supposedly went extinct in Queensland.
“The Costa Rican golden toad was said to have gone extinct (AR4).”
As I read it at the time, the fungus was brought in by toad researchers taking stock of small populations, wearing their unwashed boots from one habitat to another. Because it was caused by the researchers, they brought out a new protocol for cleaning their gear and implemented it worldwide.
At the time there was an alarming claim (completely unfounded) that “increasing temperatures” had driven them to extinction. No one apologized for that lie.
The researchers also caused another toad species to go extinct in Australia for the same reason. I do not recall the name, or whether some have been found since. Someone else can advise…
Failed Predictions Timeline
Genius, pure Genius.
After smiling through pg. 1 of F.P.T., I clicked the first right red arrow to advance to pg. 2 – it does advance, but the content fails to load.
No more smiles……… ;>(
Story Tip:
Start a page that links to all the weather stations showing no warming. Here is a good start. Link
Story Tip:
Start a page debunking all the extreme climate claims. Here is a start. Link
Story Tip:
Start a page with the top arguments against CO2 being the cause of the warming. Here is a good start. Link
OT… but to cherry-pick the current date… we have more Arctic ice cover on this date than we have had since 2014.
https://nsidc.org/arcticseaicenews/charctic-interactive-sea-ice-graph/
also on WUWT header page.
Doesn’t agree with DMI graph at all.
http://polarportal.dk/fileadmin/polarportal/sea/SICE_curve_extent_LA_EN_20230421.png
Are they measuring the same thing? I look at DMI regularly and it measures
Modeled ice thickness and volume
No mention of Area.
Both are showing the measure of Extent! Extent differs from Area: Area = Extent − voids/holes.
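The Extent/Area distinction can be sketched in a few lines. The cell sizes and concentrations below are invented; the 15% cutoff follows the published NSIDC convention, under which a grid cell counts fully toward Extent if its ice concentration is at least 15%, while Area weights each cell by its actual concentration.

```python
# Toy illustration of sea-ice Extent vs. Area over three grid cells.
cells = [
    # (cell_area_km2, ice_concentration 0..1) -- made-up values
    (625.0, 0.90),
    (625.0, 0.40),
    (625.0, 0.10),   # below the 15% cutoff: excluded entirely
]

extent = sum(a for a, c in cells if c >= 0.15)      # whole cells
area = sum(a * c for a, c in cells if c >= 0.15)    # concentration-weighted

# Area <= Extent always; the gap is the open water inside the ice pack.
```

This is why the two metrics can diverge after a storm breaks up the pack: Extent can hold steady while Area drops.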
rah, that graph only goes back to 2018…
I was referring to the current extent shown. At the time I posted my reply the NSIDC graph showed the current extent well below the bottom edge of two standard deviations while the DMI graph showed current extent right at the bottom edge of the shaded area denoting two standard deviations.
Now the NSIDC graph has been updated and agrees with the DMI graph on the current extent.
But for me the bottom line is that it is all BS as far as using sea ice as an indicator of temperatures. Every metric concerning sea ice, be it area, extent, volume, or thickness, is greatly affected by wind and wave action. For that reason sea ice is a very poor proxy for either air or water temperatures when we are actually monitoring those temperatures. Why make such a big deal about a proxy when the very things it is supposed to be an indicator of are being directly measured?
On the 19th and 20th of April, the NSIDC data I have says more than in 1989.
The maximum this year was about the same as the maximum in 1974.
The only thing about Arctic sea ice that is disappearing is… THE DISAPPEARANCE OF Arctic sea ice!
Excellent. So what have you been doing in your spare time?
Good idea, but largely already covered here:
https://extinctionclock.org/
Doesn’t hurt to have more sources though.
‘Failed Predictions’ is excellent, but a couple of searches by Topic gave duplicate finds – the same failed prediction appeared under two different dates. A search by ‘Predicted by’ was OK.
Data – the best disinfectant
Until the Climate Fascists get their hands on it…
Marvelous! It is now official that the “climate” of the “earth” is now 57.56! (Much more accurate than a mere 57.2.) Meanwhile, the six-thousand-plus temperatures actually measured ranged roughly from -40C to +40C.
A great bibliography of something that doesn’t exist. Global warming doesn’t exist, which is why they now call it “climate change,” while trying to make us believe there’s no difference between climate and weather. And nothing will stop them from making profits on their investments, especially not the truth. But the risk is theirs, not ours, and we should not pay for it.
I still think that every variable you can use to measure weather/climate is still within previously experienced extremes, so it cannot be changing in the sense used by climate extremists.
Thanks so much Anthony and all connected with the upgrades. Lots of work, but I’m certain it will turn out to have been well worth it.
All that data is gathered hourly, put into a gridded average, computed and displayed. It is compared to the “normal” baseline temperature of 57.2°F.
You can’t simply average temperature.
For latitude: METAR reports are not uniformly distributed across latitude: that’s WHY we have to use anomalies.
The NOAA baseline is wrong; rather than correct it, they removed it, because they don’t have a method that can average temperatures.
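Whatever one thinks of the baseline, the reason anomalies are preferred can be shown with a toy example. The station names, baselines, and readings below are invented; this is not any dataset’s actual method.

```python
# With an uneven station mix, averaging absolute temps shifts when a
# station drops out, while averaging each station's deviation from its
# own "normal" does not. All numbers are made up.

baselines = {"tropical": 80.0, "arctic": 0.0}   # per-station normals, deg F

def mean_absolute(report):
    return sum(report.values()) / len(report)

def mean_anomaly(report):
    return sum(t - baselines[s] for s, t in report.items()) / len(report)

full = {"tropical": 81.0, "arctic": 1.0}        # both stations warmed 1 F
partial = {"tropical": 81.0}                    # arctic station went silent

# The absolute mean jumps by 40 F when the arctic station drops out,
# but the anomaly mean reads +1 F either way.
```

The same logic applies when METARs cluster at low latitudes: anomalies make the answer insensitive to which stations happen to report in a given hour.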
There is also the diurnal factor. What does it even mean to average over an hour when half the world is in darkness and half in light?
This site has been around for a while, anonymously, and has never explained its methods.
Mr Stokes has, as usual, allowed his sullen prejudice to prevent him from realizing that all the temperature datasets necessarily average night-time and daytime temperatures. His criticism, as so very often, is not of the skepticism he is handsomely paid to attack but of the official narrative. If he does not think that daytime and night-time anomalies can be averaged, let him address himself to the keepers of all the official datasets and tell them they have no basis in reality for their averaging and therefore, those datasets provide no basis for alarm about the amplitude or rate of global warming.
“all the temperature datasets necessarily average night-time and daytime temperatures”
Not the surface datasets. They average each site daily max and daily min.
Mr Stokes is, as ever, disingenuous. The terrestrial datasets issue a monthly global mean surface temperature. He discredits himself by his pointless, scientifically ill-founded carping.
Perhaps it’s just that he doesn’t understand that, except for the polar regions, Tmin is nearly always a nighttime measurement.
They issue a global average of anomalies Daily max and min, and the mean of the two. They usually take the mean before global averaging, but the order doesn’t matter. The important point is that there is no mixing of times of day.
Both Mosher and Stokes whine about the methodology, but miss the most important point.
The number they come up with at temperature.global, 57.56°F, is very close to what Gavin Schmidt says is the average global temperature baseline: 57.2°F.
If the methodology used was wrong, the number wouldn’t even be close.
The question you SHOULD BE ASKING IS: Why does Gavin Schmidt need to hide that baseline number now by removing it from the web page?
So, with all the respect it is due, take your whining and shove it up the bodily orifice of your choice.
“Why does Gavin Schmidt need to hide that baseline number now by removing it from the web page?”
The number dates from long before Gavin’s time as head. But the context is a page saying, rightly, that you should never ever average absolute temperatures. It says that you should average anomalies. The older version said that if you were really desperate for an absolute temperature to quote, you could get a value from GCMs to add to the anomaly average. Gavin may think this is poor advice. I’m surprised you have so much greater faith in GCMs.
But to show what is really said in that GISS quote, here it is with proper emphasis:
The documentation of this new index is so poor that one can’t say whether the authors are following the GISS advice by averaging anomalies and then adding a climatology, or not. But if they are, then the reason for the coincidence is probably that they are using the GISS base.
No Nick, you are misunderstanding this again. The reason to use anomaly (a very poor name) is that some areas have negative numbers due to the temperature scale in use. Really, the use of the Kelvin scale would be wise, but as the public has no knowledge of it, or of why we have it or use it, it is not generally used.
Personally, I am much more concerned that mixing measurements with one time constant with another (i.e., electronic and glass) is the biggest problem we face. Really we should be campaigning to have the electronic ones modified to have the glass time constants, so that a proper comparison between historic and modern values can be made in the future. The present electronic ones need to be discarded completely. Unfortunately it is not a simple constant that is involved in correcting one against the other. Comparing apples and oranges is foolish, except when the distortion is in the direction one wants!
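The time-constant point can be illustrated with a toy first-order lag model. The tau values below are assumptions for illustration, not measured properties of any real instrument.

```python
# Each sensor is modelled as a first-order lag: it relaxes toward the
# air temperature with its own time constant. A brief warm gust is
# caught almost fully by a fast probe but smoothed away by a slow one,
# so the two instruments record different daily maxima.

def response(air_series, tau_s, dt=1.0, start=20.0):
    """Discrete first-order lag filter over an air-temperature series."""
    t = start
    out = []
    alpha = dt / (tau_s + dt)
    for air in air_series:
        t += alpha * (air - t)
        out.append(t)
    return out

# A 60-second gust of 25 C air over a 20 C background, sampled each second
air = [20.0] * 60 + [25.0] * 60 + [20.0] * 180

fast = response(air, tau_s=10.0)    # quick electronic probe (assumed tau)
slow = response(air, tau_s=80.0)    # sluggish glass thermometer (assumed tau)

# max(fast) > max(slow): same weather, different recorded Tmax.
```

This is exactly the apples-and-oranges problem: a Tmax record switching from glass to electronic sensors can drift for instrumental reasons alone.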
Is it feasible to insert the tip of a Platinum resistance sensor into the mercury base of a normal mercury thermometer? Or slight variations on this theme?
Geoff S
A good engineer should be able to come up with a mass having the same thermal inertia as a standard MIG thermometer. Why that wasn’t done, I don’t know.
It is done.
Nick,
A device has been made and put into operation. Some scientists are interested in whether it works and whether its results support alarmism in the global warming narrative (pardon my immature snigger).
What is wrong with separate checks of whether the device works well or poorly? I have not seen adequate official disclosure of results. Colleague Chris Gillham has produced interesting data analysis showing there is indeed a problem demanding explanation. Would you like me to put the findings into easy words for you, in case you read it and found it too hard to contemplate?
http://www.waclimate.net/aws-corruption.html
Geoff S
If you want to be a pessimist, use anomalies…if you want to be an optimist, use absolute temperatures. Which gives a more human-realistic view?
1). weather stations report in real thermometer readings, not anomalies.
2). pilots ask for airport temperatures in thermometer readings, not anomalies.
3). before dog-walking, I ask my wife what temperature the thermometer says, not “what’s the anomaly dear?”
When someone uses “anomalies”, they may well be statistically correct but are emphasizing a trend using numbers disguised by a couple of extra levels of interpretation.
I think the real-time global average temperature is a great addition. Unfortunately that temperature is commonly used to estimate the total energy radiated from the surface, and that gives an incorrect value. A more accurate temperature for that purpose could be calculated by averaging the gridded power radiated at each of the station locations and then calculating the temperature associated with that average power. Or just averaging the T^4 values and then taking the 4th root. Such a number should probably be called T(eff), since it would be greater than the average T(global) value.
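A toy check of the T(eff) idea, with invented Kelvin temperatures: because radiated power goes as T^4, the “radiative” average always comes out at or above the plain arithmetic mean.

```python
# Average the fourth powers of absolute (Kelvin) temperatures, then take
# the fourth root. Values below are made up for illustration.

temps_k = [233.0, 273.0, 288.0, 303.0]   # a cold pole, mid-latitudes, tropics

t_mean = sum(temps_k) / len(temps_k)
t_eff = (sum(t**4 for t in temps_k) / len(temps_k)) ** 0.25

# t_eff >= t_mean, with equality only if every temperature is identical;
# the wider the spread of temperatures, the bigger the gap.
```

So using the plain mean T in a T^4 radiation formula systematically understates the radiated power, which is the commenter’s point.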
What does averaging Tmax and Tmin mean when they come from different distributions AND are highly correlated? That contaminates every additional calculation of an average. Why are variances not propagated at each step that averages an average? Why do anomalies not carry the variance of the baseline and monthly temperature used to calculate them?
Look at my post above. Do you really think you can combine the average in the Sahara Desert with the average from Fargo, ND and get a meaningful number without considering the variance of the two temps?
Using an SEM might tell you how accurately you have calculated the population mean, but the variance tells you how much the numbers are spread out around that mean. Heck, even winter and summer have different variances, but you just average them like there is no difference in the distributions.
Aren’t you really trying to say that the further you get from the actual measurements, the less real information is displayed in the statistical calculations?
hiskorr,
Mostly the stats we see on this topic involve a sample from a larger population. The sample has to be reflective of the population. Tmax and Tmin would probably be defined as different populations, not to be carelessly mixed together. Geoff S
To a certain extent. But the variance gives a better idea of how broad the range is. We not only don’t get a proper variance, but no confidence interval either.
The numbers being quoted are, in general, the standard error of the mean. This is a statistic that gets smaller with increasing data points. Why? Because it assumes a normal distribution of sample means and a decreasing standard deviation of those sample means.
What does it let one infer? That the sample mean gets closer and closer to the real population mean. In other words, it is an estimated value that gets more accurate as you collect more data.
What can you infer about the variance of the population? Nothing directly. One can recover an estimated population variance by multiplying the squared standard error of the mean by the number of data points.
Ironic, isn’t it? You divide the sample sum by the number of data points to get an estimated mean, but multiply the squared standard error of the mean by the number of data points to get an estimated variance!
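The SEM-versus-variance point is easy to demonstrate with made-up data and a fixed seed: as the sample grows, the standard error of the mean shrinks, but the spread of the data itself does not.

```python
# Standard error of the mean (SEM) shrinks with sample size; the sample
# standard deviation does not. "Daily temps" here are invented draws
# from a normal distribution centred on the article's 57.2 F baseline.
import random
import statistics

random.seed(1)  # deterministic for reproducibility

def sem(xs):
    """Standard error of the mean: s / sqrt(n)."""
    return statistics.stdev(xs) / len(xs) ** 0.5

small = [random.gauss(57.2, 15.0) for _ in range(30)]
large = [random.gauss(57.2, 15.0) for _ in range(30000)]

# The SEM collapses as n grows, while the standard deviation of the
# data stays near the true spread of 15 F.
```

Quoting the tiny SEM while never mentioning the 15-degree spread is exactly the complaint being made above.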
Here is an article from NIST.
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1255808/
Following significant-digit rules does not “lose” information; it maintains the resolution of measurement. Averaging without paying strict attention to variance does lose information. Lots of information!
That would actually seem to be a better “global” temperature. I’ve discussed this with others before. Everybody reads at the same GMT times every day. With automated stations this should be possible. What would give a better representation 2 times a day or 4 – 6 – 8 times a day? Summarize them separately or average them! It is up to the researcher.
Something like the CRN stations.
You know what. No one lives at the official Global Average Temperature as calculated using your procedures. So it is a fake number to begin with. It is full of spurious trends both from changing data sets and funky averages. The change is the ultimate purpose, not the accuracy.
Simple averages using a common baseline can show changes. Are they accurate? Is the GAT accurate? Heck, why do you even need a calculated baseline? Just assign a number and go with it.
You talk about altitudes. How about humidity? The Sahara and Fargo, ND can have the same average temperature. Are those really comparable when determining the climate? How about the range at both locations?
How about precision of measurements? Show me one other physical science that allows digits of precision to be determined by averaging rather than by what was actually measured. Adding additional decimal digits by using simple arithmetic averages is creating information out of thin air. Do you really support that?
I suggest using 4Pi for all baselines. It has the advantage of as many significant figures as desired. 🙂
First of all, the lapse rate is not always the same in every place, and there is no lapse rate under the ground to validate against. Therefore, you are suggesting an impossibility. Making an estimate for the correction would increase the uncertainty considerably, and possibly to an unknowable amount.
Next, the point of providing an average is that the whole ‘hue and cry’ about climate is the claim that the warming will make life unbearable. Warming the record artificially by correcting to sea level would make it appear warmer than what people are actually experiencing. The temperatures at ground level, where plants and most animals live, are the environment that humans actually experience when outdoors. Thus, it is perfectly reasonable to provide the average temperature experienced by things living on the surface.
Sorry, but I am sooooo struggling with this Global Temperature thingy.
The more I think on it, the more it reveals how petty, stupid and trivial this entire climate thing is.
OK, why am I bursting this bubble?
Simple: How representative of The Globe are these thermometers?
How would you even start to qualify that?
Let’s take it to the limit, to the trivial extreme, and suggest that peta installs 67,000 thermometers in a back-yard garden near Newark, Notts and connects them to the interweb.
The cry goes up: What a fantastic resource, peta will save the world.
<peta blushes> Yes OK, that might be a nice thing to do, assuming it was really in any trouble, but do all those thermometers mean that peta is recording The Global Average Temperature?
see my point….
This entire climate thing is The Biggest Crock Of Shyte there ever was, and ‘most everybody is entranced by the bl00dy thing.
What is that in degrees centigrade? Most of the world works on the metric system.
Grade-school level to convert it. C = (F -32) * 5/9
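As a quick check of that formula, applied to the article’s 57.2°F baseline (the helper name `f_to_c` is just for illustration):

```python
# Fahrenheit to Celsius, exactly as the grade-school formula says.
def f_to_c(f):
    return (f - 32.0) * 5.0 / 9.0

# The 57.2 F "normal" baseline works out to exactly 14 C.
```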
IT IS RIGHT NEXT TO IT ON THE WIDGET.
Some years ago, the centigrade scale was renamed to honor Anders Celsius. The change was made in 1948.
You can always tell an old fart, but you can’t tell him much.
Much of the USA works with the metric system. Wine bottles have been 750 ml since 1979.
What won’t change in the USA is the use of feet, yards, and miles for land distance and area measurements. Look at the grid near Pocahontas, Iowa to see why. Back off to an eye elevation of about 20 miles.
All that data is gathered hourly, put into a gridded average, computed and displayed. It is compared to the “normal” baseline temperature of 57.2°F.
But the NASA figure is for land only;
your data is land and buoy data.
You can’t average land air temps and marine water temps.
Mr Mosher has very little, if any, scientific knowledge. Of course one may average land and ocean temperatures, since the air temperature of the boundary layer directly above the ocean and the sea temperature of the surface stratum of the ocean are close to one another. One may just as easily average Kelvins of temperature from land and ocean as one may average quantities of apples and oranges if one wants to plot quantities of fruit. Mathematics is far more capable than Mr Mosher.
Yes, you CAN average them. But the result is physically meaningless. Intensive properties.
That depends entirely on what use you are going to make of the average. Anthony wants a quick, simple-to-understand number for long-term comparison; fair enough. If I were going to do complex thermodynamics on the number, then no, the situations are rather different. But you need to remember, we happily do this for land temperatures with completely different physical environments: hot and wet, hot and dry, frozen, British, cities, forests, agriculture, etc. All those averaged together are not really any measure of proper real land temperature, or even of air temperature slightly above the land. In many cases this is the best we can do, but it is a great deal better than the terrible computer models that represent nothing that is real, complete or even physically correct. What happened to convection heat transfer, for example? Insistence on energy radiation from cold to hot? Not at all physical (physics as understood).
Mr Alberts seems to think that global mean temperatures are of no value. However, they do provide a general indication of whether and to what extent temperatures across the globe are rising or falling. As Edmund Burke used to say, “There is no knowledge that is not valuable.”
I must politely disagree. It is important that the metric being calculated is both accurate and representative
To date, I don’t think the current global temperatures being touted are either.
As a trained engineer I am dismayed at climate scientists portraying anomalies that are far, far below the resolution of the measurements. Trending these values is abhorrent.
‘It is important that the metric being calculated is both accurate and representative’
Why?
Why? Because of science, dude. Being inaccurate means no one knows what is actually occurring. Being representative means people can know how they will be directly affected.
Mosher, some days I think you’ve lost it. GISTEMP provides an example.
Look at the divergence since 1980. Prior to that, pretty good correlation. After 1980, well I have an explanation for that but you don’t want to hear it.
There is a reason temperature is called intensive. That has been ignored throughout climate science. You want an accurate measurement you can average? Go back where you can and get the humidity, altitude, etc. and calculate enthalpy. That will give you an adequate measurement. Temperature is nothing more than a terrible proxy for heat. Global Average Temperature is nothing more than an inadequate metric for something that means nothing.
What Anthony is showing is no better AND no worse than your cherished GAT. It is a metric that people can possibly use to gauge what is changing.
You want to know why climate change is low on everyone’s list of concerns? Stop putting out a Tavg metric. When you see a 57°F average and the day/night is 30/10°F or 93/82°F it is meaningless. Start putting out separate Tmax and Tmin temperatures that people can relate to. That is no different than what you see on every app on your phone.
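The enthalpy point above is worth making concrete. A common psychrometric approximation gives moist-air specific enthalpy as h ≈ 1.006·T + w·(2501 + 1.86·T) kJ per kg of dry air, with T in °C and w the humidity ratio. A minimal sketch (function name and sample humidity values are mine, for illustration only):

```python
def moist_air_enthalpy(t_c, w):
    """Moist-air specific enthalpy, kJ per kg of dry air, via the
    standard psychrometric approximation h = 1.006*T + w*(2501 + 1.86*T),
    with T in deg C and w the humidity ratio (kg vapor / kg dry air)."""
    return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

# Two sites at the SAME 25 C air temperature but different humidity
# carry very different heat content, which a temperature average hides.
dry_site = moist_air_enthalpy(25.0, 0.002)    # e.g. desert air
humid_site = moist_air_enthalpy(25.0, 0.018)  # e.g. tropical air
```

The two sites average to the same temperature, yet the humid one holds more than twice the enthalpy, which is the commenter's point about temperature being a poor proxy for heat.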
All the more reason for truncating the superfluous digits and presenting it as an approximate average of thermometer readings, to the same precision as the thermometers.
Except the historical data that are used to calculate the base temperature used to calculate the anomalies.
Isn’t that what Karl did when he ‘disproved’ the hiatus?
Back in the 1980s the International Standard Atmosphere (ISA) was 15C/59F and the pressure 1013 mb. This standard was used by aviation across the planet.
Today the ISA is still 15C/59F, while the pressure has increased to 1013.25 mb.
Am I mistaken? According to my understanding of the lapse rate, if the pressure were 1013 mb then the temperature would be lower than 15C/59F.
So the planet hasn’t actually warmed, or am I missing something? However, if the “normal” temperature of the Earth is assumed to be 57.2°F, then this is colder than the standard atmosphere.
What a great idea to see all the failed predictions one after the other! I love it.
Facts? We don’t need no stinky facts. We have our narrative. Ignore the facts (the wizard of Oz behind the curtain).
…and Michael Mann (in front of the curtain).
Here is an Australian summary of failed predictions by the Saltbush Club.
Geoff S
https://mailchi.mp/093a98a205a8/happy-earth-day-4j6imn4q1z-195551?e=b4fa0c6183
There is no global temperature.
Jeff A,
Every object has a temperature.
It reflects its atomic composition.
You are quibbling about how a given object is defined.
So long as people are using ‘climate change’ and ‘global temperature’ to influence your future and mine, we have a duty to argue about their terms.
I dislike the global temperature concept, the anomaly concept, the disregard for UAH, I hate invalid predictions and projections and I will continue to rubbish the poor standard of science that is used for political manipulation.
Geoff S
“So long as people are using ‘climate change’ and ‘global temperature’ to influence your future and mine, we have a duty to argue about their terms”
—Yes, yes and yes…If it’s all about the temperature of a planet — then give them the REAL temperature of the planet… And keep doing it. Year after year after year…
I’m not quibbling. No amount of temperature readings from all over the planet, averaged in some way, will tell us anything meaningful. Period.
But it is possible, and useful, to take an average and note the trend.
It depends what is being averaged
Temperatures, not so much.
No, because it gives the false impression that all temperatures on Earth are moving in unison.
With respect, no. Averaging shows only what the highest or lowest numbers tell us. Shows us nothing about the middle, or that some places have cooled or remained static over the instrumental record.
You are right that calculating an ‘average’ doesn’t give the true average temperature of the air mass. What one gets is the average of the station readings, which is a first-order approximation.
Excellent on both counts! I have long wondered why most arguing against the climate change cult have not been referring to Temperature.Global, as this is a pretty darn good rendering of actual surface temps and the trends against a 30 year average.
And the failed prediction list is also great!
May I suggest another “list” for the brainwashed masses to contemplate: Do a list of all the products and materials that are generated from hydrocarbons/fossil fuels. I think the list starts at some 6,000 but is likely more.
The hard core cultists you are never going to influence, but the average person, who is caught up in the propaganda trap, might think twice about the “solutions” to the fake climate change hoax when they realize how much of their everyday lives will be negatively affected if even a small percentage of this kind of list is banned, or priced out of reach.
Am I correct in assuming that these temperature readings have not been “adjusted” and are, therefore, “facts” and not “estimates”?
I see the Global Temperature on the sidebar but I don’t see the Failed Predictions.
Is it supposed to be there?
(I’m on a Desktop.)
If anyone else couldn’t find it, CTM told me it’s not on the right sidebar but under the “Reference Pages” on the top bar.
If I may make a suggestion, the Failed Predictions widget needs a running total at the top, so that the impacts can be intuitively assessed.
Another suggestion is to have a “Successful Predictions” widget. It also would need a running total, for the same reason.
Great additions!
I have always been bothered by the way temperature data is treated. I suppose I am biased by my training (Ph.D. econometrician).
The land temperature data station by station is a terrific panel data set. Aggregating after homogenizing makes no sense to answer the question at hand: is the “climate” a stationary process? Given the data, we are really asking about the properties of temperature time series at each individual station. For all kinds of reasons, there is no reason to assume that the processes driving temperature are the same across stations.
Yes, there are reasons that particular station data needs attention due to discontinuities of various types. Data changes should be documented station by station and all primary data saved so the “changed” data can be properly audited.
Here is how I would begin my analysis. I would deseasonalize the station data (taking the 12th difference of monthly data). Yes, you lose 12 months of data, but you never have to deal with a sliding averaging window, which is a really silly way of dealing with seasonality. With the seasonally adjusted data in hand, I would test for a simple time trend in each station data set. Why aggregate?
I have done this with the CA station data in the CRN. There are 7 stations in CA in the CRN (when I last checked, which was a year or so ago). None of the 7 stations has a significant time trend. Hmmmmm. Why do we continually hear Gov. Newsom claim CA temperatures are rising dangerously?
My guess is that very few stations in the CRN have a significantly positive time trend in temperatures. Are there regional differences? Why not test, in a cross-sectional time-series model, whether CO2 has explanatory power through time on station temperature after accounting for all the station-specific variables (latitude, longitude, elevation, air pressure, humidity, etc.)? There are lots of tools that are more appropriate for analyzing the raw data than the current procedures used in much of climate “science”.
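The two steps described above (12th-difference deseasonalization, then a trend test) can be sketched roughly as follows. Note that 12th differencing turns a linear trend in the levels into a nonzero mean of the year-over-year differences, so that mean is what this sketch tests; the function and demo series are hypothetical, and a real analysis would also account for autocorrelation:

```python
import numpy as np

def annual_change_test(monthly, z=1.96):
    """Deseasonalize a monthly series by 12th differencing, then test
    for a trend. A linear trend in the levels becomes a nonzero MEAN
    of the year-over-year differences, so that mean is tested against
    zero. A sketch only: real work would handle autocorrelation."""
    y = np.asarray(monthly, dtype=float)
    d = y[12:] - y[:-12]                  # seasonality cancels out
    mean_d = d.mean()                     # estimated change per 12 months
    se = d.std(ddof=1) / np.sqrt(len(d))
    return mean_d, abs(mean_d) > z * se

# Hypothetical demo: a pure seasonal cycle, with and without a trend
t = np.arange(360.0)                      # 30 years of monthly data
seasonal = 10 + 5 * np.sin(2 * np.pi * t / 12)
d_flat, sig_flat = annual_change_test(seasonal)             # mean ~ 0
d_warm, sig_warm = annual_change_test(seasonal + 0.01 * t)  # ~0.12/yr
```

The flat series yields no detectable change, while the trending one is flagged, without any sliding-window smoothing.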
Oh boy, have you hit the nail!
It only takes a few stations with no warming around the globe to show something is eerily wrong with the less-than-rigorous treatment of temperature data.
You can’t conclude what is causing changing data until time trends are stationary.
Stations that have changes should have their current records stopped and a new record started. There is no guarantee that new instruments will have the same systematic or random uncertainties, or that those values won’t change over time.
This will destroy the ability to have “long records,” but to me the data’s value is conserved, and so-called “corrections and homogenization” will not be needed.
It is a shame that more statistics people who are familiar with the physical sciences have not spoken up on some of these issues.
Nelson wrote “For all kinds of reasons, there is no reason to assume that the processes driving temperature are the same across stations.”
Yes.
The official Australian temperature data sets have some material for the comparison of stations placed close to each other. At less than 10 km separation, it is not unusual to find raw differences of 0.3-0.6 deg C on average, though there are also cases of good agreement.
While those differences seem small, they are not far from the warming claimed to be global, as in the climate-change existential crisis: about 1 deg C per century.
The first question is, how can you tell which readings to use?
The second question is, is it valid to adjust station temperatures based on other nearby stations, up to 1,500 km away?
The problem is an official reluctance to face these questions and give an answer. Ignorance is bliss.
Geoff S
The timeline doesn’t seem to be in strict chronological order.
You have indeed been busy
this is great
Not sure if it could be done, but it would be amusing to see the predictions appear on the home page as they came to pass. Or at least when they were supposed to come to pass!
The failed predictions page is a much needed resource that can be used and understood by people from all walks of life.
In a similar, but simpler vein, I have often wondered about a list of “years to save the planet” to illustrate how this kind of stuff is always ongoing, and always wrong.
Thanks Anthony and others! A great addition.
There is another category that I would like– Economic Costs and Benefits— the real cost of “green” and “warming” for the total effort to date, and against each major mandate–In a way that is understandable to laypersons.
Many of the articles showing the missing costs have been in WUWT already.
For example:
EV costs, including all the CO2 credits sold/bought, rebates, etc., against the effect of EVs on global temperature!
The cost of wind and solar with all the storage, running natural-gas supplementation, etc., for reliable 24/7 generation, equal in reliability to “non-green” generation.
The cost of urban/suburban/rural “electrification,” again against the resulting change in world temperature.
The cost of new mines, roadways, transmission and distribution lines, and “green” generation under the various mandates, again against the resulting change in world temperature.
The lost opportunities of expanded manufacturing and employment.
The cost and lost land use of massive disposal of replaced cars, refrigerators, gas furnaces and stoves.
The goal should be “Understandable at a glance” (for non-scientists/engineers..)
I particularly like the idea of showing global temperature change on a human scale: say, the change over time compared with the distance from the top of the head to the bottom of the feet, with normal day-to-night ranges.
I think we can add the policy cost impacts to include today’s inflation costs, at least in part if not total.
The “economic cost and benefit” category could also include the cost of mandates that do NOT have full-time replacement generation (“green” with full makeup and fuel supplies, O&M, etc., on site) BEFORE the shutdown of coal and nuclear facilities, while maintaining 24/7 electric service with historic reserve/reliability margins.
The Failed Prediction Timeline’s Hansen entry is unclear about units, F or C/K. The “outcome” refers to F when UAH is in C/K; pretty sure HadCRUT is C/K as well. Also, the baseline reference period should be specified (e.g., 1981-2010).
The point remains correct, despite the unfortunate errors and omissions
Anthony, bad news…
On my iPad browser, what you call the “right sidebar” is actually at the end of the loaded page, about a dozen scrolls down, where most people will have their attention diverted to an article before they ever see it….I think it’s good enough to be in the header bar…..I also think the degrees C font should be the same size as the degrees F font as a gesture of universal thermometric brotherhood. (/s:-)
The dates in the titles of items for the one titled “New Ice Age Will Cause Droughts and Affect Grain-exporting Countries 2023-04-28” onwards are all future dates. It’s not clear from the texts what those dates refer to.
SWEET!
Add a check box to turn on a line for the ENSO Index on the Global Temperature chart.
Story tip.
Speaking of Earth Day, I see that Peter Singer wants everyone to cut their consumption of meat in half to save the climate. I also see that he will be traveling from Princeton, N.J. to New York, Washington, California, London, and Australia between May 26 and July 23 to peddle his smugness and hypocrisy.
My research shows that his share of CO2 emissions from the transportation will be 15,000 pounds, and that a pound of steak “emits” 36 pounds of CO2. I do love my steak, but I don’t think I’ll be eating 417 pounds of it in two months.
Think any of his enraptured audiences will notice his tiresome hypocrisy? LOL
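For what it’s worth, the arithmetic above checks out, taking the commenter’s own figures (15,000 lb of travel CO2, 36 lb of CO2 per pound of steak) at face value:

```python
# Commenter's figures, taken at face value for this back-of-envelope check
travel_co2_lb = 15_000        # estimated CO2 from the speaking tour
co2_per_lb_steak = 36         # claimed CO2 emitted per pound of steak

# pounds of steak with the same claimed footprint as the travel
equivalent_steak_lb = travel_co2_lb / co2_per_lb_steak   # about 417
```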
Here is a not-quite-climate one from Australia’s peak science body, the CSIRO….
https://www.smh.com.au/national/petrol-could-cost-8-a-litre-by-2018-20080711-3dc1.html
For reference, the average price in 2008 was $1.424, so they were predicting roughly a 460% increase. In fact the price in 2018 was $1.443, a 1.3% increase.
Close, but no cigar
KW,
Ask CSIRO to express petrol pump prices in anomaly form, by subtracting the average price 1991-2020.
Then recommend that consumers pay the anomaly price.
Geoff S
You must add the word “Anomaly” where you have the text:
Global Temperature:March 2023 | 0.20°C (0.36°F)
As you know, 0.2 degrees C is 32.36 F as an absolute temperature, not the 0.36 you appear to be claiming. The text on the UAH page that you take the link from uses the word Anomaly further down and is unambiguous. However, by not also including that word on your page, you create an easy ‘gotcha’ that lets your work be unfairly ridiculed.
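The distinction here is that an anomaly is a temperature difference, which converts to Fahrenheit by the 9/5 scale factor alone, while an absolute temperature also needs the 32-degree offset. A quick sketch (function names are mine):

```python
def c_anomaly_to_f(dc):
    # a temperature DIFFERENCE (anomaly) scales by 9/5 only, no offset
    return dc * 9 / 5

def c_absolute_to_f(c):
    # an ABSOLUTE temperature needs the 32-degree offset as well
    return c * 9 / 5 + 32

anomaly_f = round(c_anomaly_to_f(0.20), 2)    # 0.36, the widget's pairing
absolute_f = round(c_absolute_to_f(0.20), 2)  # 32.36
```

So the 0.20°C / 0.36°F pairing is only consistent if both numbers are anomalies, which is the commenter's point about labeling.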
Nice!!
The failed prediction timeline is phenomenal – very useful. Thanks!
The world temp, on the other hand, is both pretty great and misleading. Maybe re-label it as what it is (averaged surface-station reported temps)? To do this for real you’d need five satellites hanging about 120K out to measure radiated energy over decades, and until someone does that, any “world temp” number is just a guess, and not a particularly good one at that.
Bravo Anthony!
How about a one page selection of comments from the UN (and others) explaining why they want to make man-made CO2 the scapegoat and that it has nothing to do with climate change.
All they want is to (gently – frogs in slowly warming water) destroy capitalism.
Mr. Watts,
On April 9, I mailed the following note, along with the subject article, to you at 3008 Cohassett Rd, Chico CA:
“The subject of failed climate predictions frequently comes up, with comments citing maybe two or three examples. The Epoch Times has just posted a list of about forty (devoting 3 full pages, plus a short article on their front page). I thought you might enjoy reading their review.”
The postal service just returned my mailing with the notation “insufficient address.” I’m not sure why it takes them 2 weeks to decide they can’t deliver a letter. But I still think you would find the article interesting if you can provide an appropriate address.
Suggest adding a category, with magnitudes, for every climate action, showing the modeled reduction in world temperature and the all-in cost to achieve it.
Category Temperature Effect (+/-) Cost …with footnotes on sources, etc
Electric Vehicles (USA, passenger only), CO2 reduction (model), World Temperature Reduction (modeled), Cost $ .
EV batteries: worldwide (China, etc)
Grid batteries:
e.g., CO2 credits (bought, sold), rebates (taxpayer), …
e.g., efficiency mandates:
refrigerators, LEDs, water heaters, heat pumps, insulation…
Wind/solar generation (with full make-up generation running and on standby to provide 24/7 reliability and local availability),
etc.
Outstanding effort, but I have some constructive criticisms of the “Failed Prediction Timeline.” I hope you will take them seriously.
I will continue with criticisms as I go through. Again, these are intended in a constructive way. I think your compilation is utterly magnificent, and want only for it to be more useful.
Oops. The Ehrlich article from the Redmond Daily Facts appears twice on p. 3, not p. 2. My mistake.
The Washington Post article “U.S. Scientist Sees New Ice Age Coming,” appears on p. 1 and p. 10.
An article from Salon appears on p. 9 and p. 10
An article about people having to wear gas masks appears on p. 1 and p. 10.
An article about the river fish dying appears on p. 1 and p. 10
An article titled “The Cooling” appears on p. 2 and p. 10
An article titled, “Prepare for long, hot summers” appears on p. 3 and p. 8
An article from the Guardian about giving up meat appears on p. 5 and p. 11
An article about the threat to the Maldives appears on p. 3, p. 4, and p. 9
An article from the Time magazine archives appears on p. 2 and p. 10
Letter from Brown University appears on p. 2 and p. 11.
I think there might be more repetitions, but my eyes were getting tired.
I want to recommend “Failed Prediction Timeline” on social media, but it’s kind of a mess. Some copy editing mistakes, deficient linking, and LOTS of repetition. These problems are easily fixed. I hope you will do so. No sense in going to all the trouble only to have a few easily repaired “gotchas” make it unshareable.
Excellent info for use debating the warmistas. I would like to see one other chart to challenge the warmistas with. How about adding the chart of the paleontology record of temperature vs CO2 over the millennia. That is a powerful picture with which to challenge their favorite trope that it has never been so warm and it is the fault of CO2.
Story Tip: Create an Opportunity Cost of the Green Economy
1) The benefits are hugely speculative and highly destructive to the environment. Nothing has changed in the trend of CO2
2) The costs are real and undeniable
3) $5 Trillion or 20% of GDP can build a huge number of roads, bridges, hospitals, cancer research, etc etc.
Put the cost of fighting climate change in terms the people can understand.
The Great Energy Deception: The Truth Behind the $5 Trillion Renewable Energy Scam
https://internationalman.com/articles/the-great-energy-deception-the-truth-behind-the-5-trillion-renewable-energy-scam/
So what you say about temperature.global is not correct. It does not give you a real-time global temperature, and it is not a gridded average. It is a running annual mean updated every hour or so, and the mean is a simple average of station readings with no grid, Tony Heller style. It claims to use a 30-year baseline, but it doesn’t have 30 years of data, so it uses an estimate from NASA instead.
I had some correspondence with the person who runs the site, who goes by TG, and he confirmed what I’m saying here. I summarized what I was able to find out from him and a bit of my own analysis here.
https://woodromances.blogspot.com/2022/02/the-marketing-of-alt-data-at.html
TG explained to me, “The[sic] is no gridding or weighing of data…. There are many organizations that already so[sic] this, like NOAA and NASA. Our project just takes the statistical mean of all available surface data. The intention is to get a different look of[sic] the data without manipulating it at all.”
TG also explained to me, “The data functions are algorithms that create the global mean. It calls the database for the last 12M of data. Some data functions also serve as an API for users to embed the data on their own webpages.” TG will only report a 12-month running average. The reason why is clear if you look closely at the data.
I asked for the actual data instead of annual means, and TG refused to give me that data. But using the data on the website and the information TG gave me, I was able to reconstruct what should be the monthly mean temperatures to produce the running 12 month averages that they report. See the graph of what should be their monthly values below. I describe the method I used to reconstruct this in the blogpost.
Do you really want to promote this on your blog? This is little better than a random number generator.
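For the curious, the reconstruction described above follows directly from the definition of a 12-month running mean: if R_t = mean(m_{t−11}…m_t), then m_t = m_{t−12} + 12·(R_t − R_{t−1}), so the monthly values can be recovered given an assumed first year. A sketch with hypothetical names and synthetic data, not TG’s actual code:

```python
def monthly_from_running_mean(running, seed_year):
    """Recover monthly values m_t from 12-month running means R_t.
    Since R_t = mean(m_{t-11}..m_t), it follows that
    m_t = m_{t-12} + 12*(R_t - R_{t-1}).
    `seed_year` is the assumed first 12 monthly values the recursion
    needs. A sketch of the method described above, not TG's code."""
    m = list(seed_year)
    for i in range(1, len(running)):
        m.append(m[-12] + 12 * (running[i] - running[i - 1]))
    return m

# Round-trip check on synthetic data: trend plus a seasonal cycle
true = [0.1 * i + (i % 12) for i in range(48)]
running = [sum(true[i:i + 12]) / 12 for i in range(len(true) - 11)]
recovered = monthly_from_running_mean(running, true[:12])
```

The round-trip recovers the synthetic monthly series exactly, which is why a seed year plus the published running means suffices for the analysis the blogpost describes.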
This is just a different assessment. If warming is global, i.e., not localized, then area weighting is meaningless. In other words, all stations should be showing warming regardless of where it is.
If warming IS NOT global, but localized, then climate science is causing global fear when it is not warranted!
Baselines should not affect the overall trend. A baseline will only cause the scaling factor to be different. You could even use a temperature of 0°C and only the values of the differences will change. The trend will not. The trend will just move up and down in relative values.
Climate science wants to claim accuracy by using per-month station baselines to arrive at an “accurate” change for each station. The true inaccuracy can be debated elsewhere, but by averaging all anomalies, local and regional information is destroyed. You cannot evaluate where warming is occurring!
Don’t you find it funny that no climate scientist has ever posted a regional or local analysis of temperature? Several folks are starting to do this and can’t find the warming when looking station by station! Have you done that analysis?
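The baseline point made above is easy to demonstrate numerically: subtracting a constant shifts an anomaly series up or down but leaves its least-squares slope unchanged. A minimal sketch with made-up numbers:

```python
import numpy as np

t = np.arange(120.0)                      # hypothetical: 10 years, monthly
temps = 14.0 + 0.001 * t                  # absolute temps, slight trend

def ols_slope(y, x):
    # least-squares slope of y against x
    return np.polyfit(x, y, 1)[0]

slope_abs = ols_slope(temps, t)           # trend of absolute temps
slope_b1 = ols_slope(temps - 14.0, t)     # anomalies vs a 14.0 C baseline
slope_b2 = ols_slope(temps - 0.0, t)      # "baseline" of 0 C
# all three slopes agree: the baseline only shifts the level
```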
It’s not just a different assessment. It’s complete incompetence, and what this blogpost said about it is objectively false. There’s no gridding here – it’s just a simple average of thermometers, and you’re not given real-time global temperatures. You’re given a 12-month running mean that is updated frequently.
The reason why they will not disclose their data or even their identities is that their method is completely wrong. You can tell because, according to TG’s own numbers, global temperatures swing by some 55 C within a few months of every year. There’s no way that can possibly be true, so TG covers this up by supplying only a 12-month running mean.
I didn’t say that their baseline affects the trend. Why would you think I said or implied that? What I said is that TG claims to compare their values to a 30 year baseline, but they don’t have a 30 year baseline. So that’s made up. Their 30 year baseline comes from NASA’s dataset, which means it’s arbitrary with respect to TG.
I don’t even find it truthful. Look at NASA’s website. They give you access to global, national, regional, and local temperature time series. And it’s not hard to find.
If you believe NASA’s data, that is on you. A NOAA scientist just validated UAH. Makes you wonder about the other databases, doesn’t it?
Here is what I’m learning. Most of the databases trumpet their “accuracy” based on an SEM computation, yet their calculation of it is hosed. They do not publish an experimental uncertainty based on the data, because it is much higher.
My investigations are finding locations with little warming. I have yet to find locations with sufficiently significant warming to justify the warming being trumpeted. Maybe you can post some, since climate science seems to ignore such.
Whatever your feelings about NASA’s data, the temperature.global website is a glorified random number generator. It doesn’t tell you anything meaningful about global temperatures.
They are all hosed. UAH comes closest. Look at this image. Why do you think no one states experimental uncertainty as determined by the actual data. Practically all the anomalies lie within the experimental uncertainty. This is just one site. There are many more.
You may not like the running average, but it is one way to look at the data. Ask yourself why it does what it does. Here is my take: if CO2 is well mixed, then everywhere should experience the same growth. Why doesn’t this graph show it?
It shouldn’t matter if one “grid” has 20 stations and the next one has none, they should all be increasing, right?
Your graph is too small to read, nor have you supplied enough information about it for me to replicate it. The CIs for GMST datasets are published in the literature, and sometimes they are printed on the graphs. In recent decades, the 95% CIs are about 0.05 C for most GMST datasets, gradually increasing to about 0.15 C in the late 19th century.
That of course is not an argument for using the random number generator called temperature.global. The post here says that this dataset is gridded, but it is not. It does not even post real-time temperatures. It posts a 12-month running average that gets updated frequently. And the reason is clear: their monthly data varies by as much as 50 C over the course of a year, which is absurd.
This is wrong:
No, that’s objectively false. Different parts of the globe receive different amounts of energy from the Sun, and the thermal properties of land and water differ, such that land warms faster than the oceans. Warming affects albedo in the Arctic as sea ice decreases, so warming in the Arctic is more rapid than elsewhere. And we have this thing called wind that blows air around, distributing warmer air to different parts of the globe. We have ENSO, which changes the rate at which heat escapes the ocean, etc.
Tell you what, here is a link for some Japanese locations that show no warming.
Now to get 1.5C of warming, find a site that has 3.0C of warming over the same time frame. I can get more sites later.
Tokyo’s Coolest September In Over 30 Years…Hachijojima No Warming In 107 Years…Latest Forecast: Sharp La Niña! – Watts Up With That?
You are just making your task more complicated. If not all parts of the earth receive the same amount of insolation (and I do agree with that), then you should have no problem finding plenty of sites that offset locations with little or no warming. If you can’t find sites with 3 and 4 degrees of warming, you might want to ask yourself what these trends are doing.