And What Have the Average Temperatures of Earth’s Surfaces Been Recently in Absolute Terms, Not Anomalies?
The answers may surprise you.
THIS POST HAS BEEN UPDATED. The update is near the end of the post.
Recently, the Intergovernmental Panel on Climate Change (IPCC) and the U.S. Global Change Research Program (USGCRP) have cranked up their alarmist propaganda, with the IPCC now pushing the goal of limiting global warming to 1.5 deg C above preindustrial global surface temperatures.
That, of course, raises the title question, What Was Earth’s Preindustrial Global Mean Surface Temperature, In Absolute Terms Not Anomalies, Supposed to Be?
Four years ago, in the post On the Elusive Absolute Global Mean Surface Temperature – A Model-Data Comparison (WattsUpWithThat cross post is here), we illustrated and discussed the wide (3-deg C) span in the climate model simulations of global surface temperatures on an absolute, not anomaly, basis. Figure 1 below is Figure 5 from that post. In that post, we started the graphs in the year 1880 because the GISS Land-Ocean Temperature Index (LOTI) and NOAA NCDC (now NOAA NCEI) data started that year.
Figure 1
However, the spreadsheets I prepared for that post had the climate model hindcast outputs extending back only to their common start year of 1861. (I say common start year of 1861 because the outputs of some models stored in the CMIP5 archive start in 1850 while others begin in 1861.) Since the IPCC-defined preindustrial period begins in 1850, I couldn’t reuse the climate model outputs stored in that spreadsheet for this post.
Note: The IPCC’s new definition of preindustrial, as stated in their Changes to the Underlying Scientific-Technical Assessment to ensure consistency with the approved Summary for Policymakers:
The reference period 1850-1900 is used to approximate pre-industrial global mean surface temperature (GMST).
It’s odd that the IPCC selected those years when not all the climate models used in their 5th assessment report (those stored in the CMIP5 archive) for simulations of past and future climates extend back to 1850. Some only extended back to 1861. Then again, no one expects the IPCC to be logical because they’re a political, not scientific, entity.
Luckily, the ensemble members that meet the criteria of this post do extend back to 1850. So we’ll use the ensemble member outputs for the full IPCC-defined preindustrial period of 1850 to 1900 for this post.
ACCORDING TO THE CMIP5-ARCHIVED CLIMATE MODELS THERE’S A WIDE RANGE OF SIMULATED PREINDUSTRIAL GLOBAL MEAN SURFACE TEMPERATURES
The source of the outputs of the climate model simulations of global mean surface temperature used in this post is the KNMI Climate Explorer. Specifically, as a pre-qualifier, I used the outputs of the simulations of Surface Air Temperatures (TAS) from 90S-90N from the 81 individual ensemble members. From those, I identified the ensemble member with the warmest global mean surface temperature for the preindustrial period and the ensemble member with the coolest global mean surface temperature for the same period. For those who wish to confirm my results, the coolest (lowest average absolute GMST for the period of 1850-1900) is identified as IPSL-CM5A-LR EM-3 at the KNMI Climate Explorer, and the warmest (highest average absolute GMST for the period of 1850-1900) is identified there as GISS-E2-H p3. The average global mean surface temperatures for the other 79 ensemble members during preindustrial times rest between the averages of the two ensemble members.
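For those who want to reproduce this ranking from downloaded data, a minimal Python sketch of the period-averaging and min/max selection might look like the following. The data structure and the numbers in it are hypothetical stand-ins for illustration, not actual KNMI Climate Explorer output.

```python
# Sketch: find the ensemble members with the highest and lowest
# 1850-1900 mean GMST. Assumes each member's annual global-mean TAS
# (deg C) has already been downloaded from the KNMI Climate Explorer
# into a dict of {member_name: {year: temp}}; the structure and the
# toy values below are hypothetical, not real model output.

def preindustrial_mean(series, start=1850, end=1900):
    """Average of the annual values over the inclusive year range."""
    vals = [t for yr, t in series.items() if start <= yr <= end]
    return sum(vals) / len(vals)

def rank_members(members):
    """Return (coolest, warmest, period means) by 1850-1900 average."""
    means = {name: preindustrial_mean(s) for name, s in members.items()}
    return min(means, key=means.get), max(means, key=means.get), means

members = {
    "IPSL-CM5A-LR EM-3": {1850: 12.1, 1875: 12.0, 1900: 12.0},
    "GISS-E2-H p3":      {1850: 15.0, 1875: 15.1, 1900: 15.0},
}
coolest, warmest, means = rank_members(members)
```

With the real 81-member download, the same two calls would identify the coolest and warmest members directly.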
The outputs of the simulations of global mean surface temperature from those two (the warmest and coolest absolute temperatures) ensemble members for the preindustrial period of 1850-1900 are illustrated in Figure 2, along with their respective period-average global mean surface temperatures for the IPCC-defined preindustrial period of 1850 to 1900 (dashed lines).
Figure 2
As noted at the bottom of the illustration, The Scientists Behind the CMIP5-Archived Models (Used By the IPCC for AR5) Obviously Believe Earth’s Preindustrial Average Surface Temperature Should Be Somewhere Between 12.0 Deg C and 15.0 Deg C. The modelers at the Goddard Institute for Space Studies (GISS) and at the Institut Pierre-Simon Laplace (IPSL) would NOT have archived those models if they hadn’t believed they were of value.
AND WHAT HAS BEEN THE AVERAGE GLOBAL MEAN SURFACE TEMPERATURE FOR THE PAST 30 YEARS?
On their data page here, Berkeley Earth lists the factor (14.186 deg C) that is to be added to their annual global mean (land plus ocean, with air temperature over sea ice) surface temperature anomaly data to convert it to absolute global mean surface temperature values. On the other hand, after discussing why it’s so difficult to determine the global mean surface temperature in absolute terms, GISS states on an FAQ webpage here:
For the global mean, the most trusted models produce a value of roughly 14°C, i.e. 57.2°F, but it may easily be anywhere between 56 and 58°F and regionally, let alone locally, the situation is even worse.
[SEE THE UPDATE NEAR THE END OF THE POST.]
So, for the purpose of this very simple illustration and comparison, and for the discussions it will generate, I’ve added 14 deg C to the annual GISS LOTI data available here, and added 14.186 deg C to the annual Berkeley Earth data. I then compared them to the 12.04 deg C to 15.05 deg C range of hindcast preindustrial global mean surface temperatures from the climate model ensemble members discussed earlier. See Figure 3. Not too surprisingly, the Berkeley Earth and GISS global mean surface temperatures, in absolute form, are very similar, with only a 0.1 deg C difference during the most recent 30 years.
Figure 3
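The anomaly-to-absolute conversion used for Figure 3 is just a constant offset per dataset. A minimal Python sketch using the two offsets quoted in the post (the anomaly values themselves are invented for illustration):

```python
# Sketch: convert annual GMST anomalies (deg C) to absolute values
# using the offsets quoted in the post: 14.0 deg C for GISS LOTI
# (1951-1980 base period) and 14.186 deg C for Berkeley Earth.
# The anomaly values below are invented for illustration only.

GISS_OFFSET = 14.0       # deg C, GISS "best estimate" for 1951-1980
BEST_OFFSET = 14.186     # deg C, Berkeley Earth's published factor

def to_absolute(anomalies, offset):
    """Add the dataset's base-period mean to each annual anomaly."""
    return {yr: anom + offset for yr, anom in anomalies.items()}

giss_abs = to_absolute({2016: 1.02, 2017: 0.92}, GISS_OFFSET)
best_abs = to_absolute({2016: 0.90, 2017: 0.85}, BEST_OFFSET)
# giss_abs[2016] -> ~15.02 deg C
```

Note that the uncertainty GISS attaches to its 14.0 deg C figure (anywhere between 56 and 58 deg F) carries straight through to every absolute value produced this way.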
As is plainly visible and as noted at the bottom of the illustration, The Most-Recent 30-Year Averages of Observed Global Surface Temperatures Have Not Yet Reached the Warmest of the Modeled Preindustrial Global Temperatures!
In other words, based on 30-year climatological averages, recent global mean absolute surface temperatures have yet to reach the high end of the range of preindustrial global mean surface temperatures as hindcast by the climate models used by the IPCC for their 5th Assessment Report. It could, therefore, be argued that recent global mean surface temperatures are still within the realm of preindustrial natural variability, regardless of the 1 deg C that the Earth’s surfaces have risen since pre-industrial times.
If you look closely at Figure 3, you can see that the Berkeley Earth data recently peaked in 2016 slightly above the highest of the simulated preindustrial average temperatures…with the 2014/15/16 El Niño-caused “record high” of 15.1 deg C. On the other hand, the GISS data peaked just below that mark in 2016, at a “record high” of 15.0 deg C.
[sarc on.] Oh, heaven forbid! Based on the Berkeley Earth data, Earth’s global mean surface temperatures are now starting to run a teeny bit higher than the range of pre-industrial average temperatures. What are we to do, other than sit back and enjoy the vacation from cooler temperatures? [sarc off]
One might suspect that a simple comparison like this is one of the real reasons why the primary suppliers of global mean surface temperature data do not furnish their data in absolute form. Thank you, Berkeley Earth, for sparking my interest. We discussed another possible reason in the post Do Doomsters Know How Much Global Surface Temperatures Cycle Annually?
DEAR TROLLS, IF YOU LINK THE FOLLOWING WEBPAGE FROM REALCLIMATE TO SUPPORT YOUR ARGUMENT, SOME READERS WILL LAUGH AT YOU
In an apparent response to the 4-year-old post linked earlier (here) and to Willis Eschenbach’s post CMIP5 Model Temperature Results in Excel at WUWT that followed a month later, there was a discussion at RealClimate in 2014 of modeled absolute global surface temperatures, authored by Gavin Schmidt, the head of the Goddard Institute for Space Studies (GISS). Gavin’s post is Absolute temperatures and relative anomalies. (Archived copy is here, just in case.)
The following are a couple of quotes from Gavin’s post. First (my boldface):
Second, the absolute value of the global mean temperature in a free-running coupled climate model is an emergent property of the simulation. It therefore has a spread of values across the multi-model ensemble. Showing the models’ anomalies then makes the coherence of the transient responses clearer. However, the variations in the averages of the model GMT values are quite wide, and indeed, are larger than the changes seen over the last century, and so whether this matters needs to be assessed.
Needs to be assessed? A 3-deg C span in modeled global mean surface temperatures, which is “larger than the [1-deg C] changes seen over the last century”, hadn’t been assessed by 2014? Oy vey! As shown in the graph generated by the KNMI Climate Explorer in Figure 4, a similar 3-deg C spread in modeled absolute global surface temperatures existed in the models archived in CMIP3, which was used by the IPCC for their 4th Assessment Report published way back in 2007.
Figure 4
Note: For those questioning the coordinates listed in the above graph, the “-180-180E -90-90N” reads 180W to 180E, 90S to 90N. [End note.]
Second quote from Gavin’s post (my boldface):
Most scientific discussions implicitly assume that these differences aren’t important i.e. the changes in temperature are robust to errors in the base GMT value, which is true, and perhaps more importantly, are focussed on the change of temperature anyway, since that is what impacts will be tied to. To be clear, no particular absolute global temperature provides a risk to society, it is the change in temperature compared to what we’ve been used to that matters.
Implicitly assume! I love that. Makes me laugh. Such is climate science.
And I really enjoy, “To be clear, no particular absolute global temperature provides a risk to society, it is the change in temperature compared to what we’ve been used to that matters.” Maybe the IPCC, the USGCRP and mainstream media should take that quote to heart. Hmmm, I feel a new post forming as I write this. I’ll most likely post it in a few days.
And if those two quotes look familiar, I presented them in the December 2014 post Interesting Post at RealClimate about Modeled Absolute Global Surface Temperatures (WattsUpWithThat cross post is here.)
As with all my model-data presentations, I suspect there will be complaints by the usual whiners.
That’s it for this post. Have fun in the comments and enjoy the rest of your day.
UPDATE REGARDING THE 14.0 DEG C ABSOLUTE TEMPERATURE FACTOR ADDED TO THE GISS ANOMALIES
The GISS FAQ page (linked here), and the quote from it presented earlier in the post, list no specific time period for the 14.0 deg C factor to be added to the GISS LOTI data to produce global temperatures in absolute form. (Thanks, blogger Dee at WUWT.) The 14.0 deg C factor used to be (past tense) listed at the bottom of the GISS LOTI data page, where it was referenced to the period GISS uses for anomalies, 1951-1980. But the 14.0 deg C (57.2 deg F) factor is no longer listed on that GISS data page; GISS removed it for some reason. Archived versions of that webpage still exist (example here). There you’ll find:
Best estimate for absolute global mean for 1951-1980 is 14.0 deg-C or 57.2 deg-F, so add that to the temperature change if you want to use an absolute scale (this note applies to global annual means only, J-D and D-N !)
[End Update.]
STANDARD CLOSING REQUEST
Please purchase my recently published ebooks. As many of you know, this year I published two ebooks that are available through Amazon in Kindle format:
- Dad, Why Are You A Global Warming Denier? (For an overview, the blog post that introduced it is here.)
- Dad, Is Climate Getting Worse in the United States? (See the blog post here for an overview.)
To those of you who have purchased them, thank you. To those of you who will purchase them, thank you, too.
Regards,
Bob
If the GAT is a range, then the anomalies are a range. It can be no other way. Anomalies have the same errors as the GAT.
It seems Gavin is saying all change is evil. Darwin is hating on him as we speak.
There isn’t a GAT. But if you calculated average surface temperatures, the value you get depends on the time of year, from a low in January to a peak in July. The January “GAT” will be about 2.5C lower than July’s, despite January coinciding with the closest point Earth’s orbit gets to the sun. How can there be a GAT when it varies by time of year?
As posted on Bob’s site, Gavin Schmidt’s explanations are nonsense. He states that (average?) temperatures are an emergent phenomenon of the individual models. To get a 3 degree C difference in average temperatures, the different models must be using entirely different physics. Ocean evaporation is entirely different at + or – 3 degrees.
Sheri and Bob: Some change is “evil”. The last ice age was apparently about 6 degC colder than today – whatever today’s temperature might be. There were 120 m of SLR associated with the end of the last ice age. As continental ice sheets have retreated towards the poles, 20 m of SLR per degC seems an overestimate for the future, but it is a good reminder that future change won’t be trivial. Hopefully it will be slower than the alarmists predict, allowing more time for adaptation.
A good fraction of the nation’s corn is grown in the Iowa corn belt, which extends about 100 miles from north to south. Is that because the optimum temperature range occurs between these latitudes? In that case, even a 1 degC change is important. More likely, some combination of temperature, rainfall, soil and agricultural experience is responsible for the Iowa corn belt. What happens to that corn belt when the temperature that characterized it has moved into southern Canada? Obviously adaptation. Those adapting will call this change “evil”; as will those asked to exist without cheap fossil fuel.
The Iowa corn belt is not determined by temperature … it is determined by soil type. As a native Iowan, I can tell you we all learned in school that Iowa’s glacial-deposited “loess” soil, which is the world’s finest topsoil, existed within Iowa because that is where the retreating glacier dropped its load of material scraped off the land further north as the growing glacier expanded southward, until the warming started.
Go a bit further south into Missouri and the topsoils are much poorer, much thinner, and much rockier. Go a bit further north to Minnesota, and the topsoils are much poorer, much thinner, and much rockier. It is not absolute climate or temperature that made Iowa what it is.
CLIMATE CHANGE created Iowa’s cornbelt. A static climate would not have produced that lucky result.
All climate changes create winners and losers. But humans adapt, no matter the result.
Bob – Am I understanding from your graphs, that the models’ pre-industrial temp range has a spread on absolutes of 3°, but that there is also a similar (3°) spread in the actual GAT used at the time the models were run?
If so, why would that be?
mothcatcher, I believe you’ve misunderstood the post.
Regards,
Bob
Well, thanks for the dis, Bob. I know I’m not very bright! Let me put the question another way, so that whether I’m getting your point or not doesn’t come into it.
At the junction of hindcast and forecast, models have a run date.
– Is it true that the Global Ave temperature they use differs by as much as 3°? (The CMIP3 graph)?
– If so, where are they getting that first input from? Are they using different data?
I’m sorry if that is obvious to everyone else, but gimme a chance!
Completely agree with your question. I believe they frequently adjust all the models to some fixed point in the past so they look like they are in agreement. And then we can see their ‘anomalies from each other’ in the future.
But your question is: what data set are they using, what definition are they using, and why do they not address that they are different by 3 degrees from each other right off!
As Gavin Schmidt said, temperatures are an emergent phenomenon of the models. Therefore, they only publish anomalies to hide the 3 degree C difference.
I’m okay with anomalies – for most purposes they are certainly preferable. I just want to know how it is done. It seems to me that the models are solving many equations vast numbers of times during each run. Many of those equations will involve energy flows, which will involve not anomalies but the initial Kelvin temperature. After trillions of iterations, those initial inputs will be significant. Or am I just showing my ignorance again?
mothcatcher
You still misunderstand the very point and the meaning of the post.
Regardless…from my point of view.
If I am not wrong about this, it is about Gavin openly and clearly condemning himself to the utmost imbecility and stupidity by his own deeds, acts, and words coming out of his own mouth.
What I think this brilliant post by Bob puts out in the open is the very evidence that there is not much question about Gavin’s imbecility or stupidity anyhow; on top of it all, the real question is more about his total incompetence, or total malevolence, or totally devious deceptive intention, or something in between.
If I am not wrong, hopefully, what you ask about concerns the initial condition versus the outcome condition.
The range in the initial condition is ~3C in absolute temperature, as per the GCMs’ given simulation start points, but the range in the outcome condition of the GCM simulations is also ~3C in absolute temperature, where no one can ever claim or propose some kind of averaging of the GCM simulation outcome results under such a condition…well, unless one happens to be a stupid person named Gavin…
The very latest shining “star” in climatology, the very best shining “Madonna” or “drama queen” of the climate science of late.
There is no way to consider, dream of, or flirt with the idea of averaging GCM simulation outcome results under the given condition, unless one happens to be an imbecile or stupid in the context of basic science, the scientific method, and the basic maths that serve the basic structures of understanding.
I’m really sorry if this comment is harsh, heavy, or offensive…I really do apologize beforehand to Bob…if I have misunderstood the point.
If this is too much, or offensive in any way, please do not allow it to be published… (Mods)
Really sorry if I’m wrong about this one…but for what it could be worth!
cheers
mothcatcher and others,
As I understand it, models are initialized centuries before the start date of the simulations. They are run in a kind of “base mode,” and more components are added over time. This allows the models to reach an equilibrium by the time the full model is run. You can’t just add all the ocean data and processes, for instance, and go from there – the model has to incorporate them, have them functioning over time in conjunction with the atmosphere to get a good simulation.
There are a few different datasets used for initialization. I suppose there might be some divergence in temperature during the initialization process, too. The point is to get the emergent properties relatively stable (at equilibrium) before the “start date” of the full simulation.
The IPCC has always maintained that the mean of the model simulations is more accurate than any one simulation in the CMIPs. So, when the CMIP mean rises by 1 C over 100 years, that is the result of the comparison – an emergent property.
I don’t see a problem with this. Seems to me what Gavin Schmidt said makes sense – laugh at me all you want, but don’t call me a troll, Bob Tisdale.
mothcatcher; is your question ‘What was the Global Average Temperature used to initialize each of the hindcast model runs, and were they initialized identically, or with the 3 degree range?’ I have the same question and have not yet learned how to answer it myself when exploring all the info at the KNMI site. I suspect the answer is there somewhere!
Nice post, but isn’t fretting about anomalies more fun?
That is how they get statistical noise to claim CO2 causes global warming.
Historic analyses suggest that if CO2 is indeed the great big control knob, it wasn’t doing a very good job of controlling things in the 18th or 19th century, when all sorts of weather catastrophes were abundant at a time of much lower atmospheric CO2 concentrations.
http://www.pascalbonenfant.com/18c/geography/weather.html
The good thing about “anomalies” is that it is a big, scary word. If you called it by the more normal word “difference”, people wouldn’t run about like the sky is falling.
Also critical is the base period selected. Phil Jones didn’t want to move away from 61-90, partly because several thousand stations disappeared after then, but also because it guaranteed “global warming” when used as the base line for anomalies, containing some of the coldest years of the century.
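The effect of the base period can be checked with a few lines of arithmetic: changing the baseline shifts every anomaly by the same constant, making the whole series look warmer or cooler overall, while the year-to-year differences (the warming itself) are untouched. A minimal Python sketch with invented temperatures:

```python
# Sketch: the same absolute temperature series expressed as anomalies
# against two different base periods. All temperatures are invented
# for illustration; the years stand in for base-period choices.

temps = {1955: 13.8, 1975: 13.9, 1995: 14.2, 2015: 14.6}

def anomalies(series, base_years):
    """Anomalies relative to the mean over the chosen base years."""
    base = sum(series[y] for y in base_years) / len(base_years)
    return {y: round(t - base, 3) for y, t in series.items()}

cold_base = anomalies(temps, [1955, 1975])  # colder, earlier baseline
warm_base = anomalies(temps, [1995, 2015])  # warmer, later baseline

# Every anomaly shifts by the same constant between the two baselines;
# the trend and the year-to-year changes are identical either way.
shift = cold_base[2015] - warm_base[2015]
```

So a cold baseline makes recent anomalies look larger, but it cannot manufacture a trend that isn’t in the absolute data.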
The other trick is to define “Pre-Industrial” and they claim that they choose the period they do because no-one could read thermometers before then and there was no “fossil fueled industry”.
http://en.wikipedia.org/wiki/Industrial_Revolution
“The Industrial Revolution was a period from the 18th to the 19th century where major changes in agriculture, manufacturing, mining, and transport had a profound effect on the socio-economic and cultural conditions starting in the United Kingdom, then subsequently spreading throughout Europe, North America, and eventually the world.”
The period of time covered by the Industrial Revolution varies with different historians. Eric Hobsbawm held that it ‘broke out’ in Britain in the 1780s and was not fully felt until the 1830s or 1840s, while T. S. Ashton held that it occurred roughly between 1760 and 1830.
The improved steam engine invented by James Watt and patented in 1775 was initially mainly used for pumping out mines, but from the 1780s was applied to power machines. This enabled rapid development of efficient semi-automated factories on a previously unimaginable scale in places where waterpower was not available.
The major change in the metal industries during the era of the Industrial Revolution was the replacement of organic fuels based on wood with fossil fuel based on coal. Much of this happened somewhat before the Industrial Revolution, based on innovations by Sir Clement Clerke and others from 1678, using coal reverberatory furnaces known as cupolas. Abraham Darby…made great strides using coke to fuel his blast furnaces at Coalbrookdale in 1709.
Jevons wrote The Coal Question in 1865, examining the possibilities of Peak Coal because of the extensive usage for industry: https://editors.eol.org/eoearth/wiki/The_Coal_Question_(e-book)
“Since we began to develop the general use of coal, about a century ago, we have become accustomed to an almost yearly expansion of trade and employment.”
Pre-industrial coal must have been non-warming…
“it is the change in temperature compared to what we’ve been used to that matters.”
So it’s the rate of warming that causes tornadoes, forest fires, droughts, etc? Strange physics.
bobbyv,
If we are used to droughts, fires, floods, etc. happening in particular regions, we can adapt to them through infrastructure and management. Likewise, organisms can adapt to slowly-changing conditions, and ecosystems have long-term dynamic stability when climate changes slowly, with natural variability. It’s when there are rapid changes that we get into trouble. If, for instance, Californians were more used to the risk of catastrophic fire, they wouldn’t build and plan their settlements as they do, and there would be better land management to prevent fires from reaching urban areas.
The rate of change is much more important than how much of a change there is (to a point). An increase of 1 C over 200 years is very different from the same increase over 50 years.
I think that the most important question is: what is the optimum climate for our biosphere? We need to grow an increasing amount of grain crops. We need to grow plants for fiber like cotton, and we need to grow trees for lumber.
When I was a kid, my mom let me have ONE teaspoon of sugar in my koolaid. That was it! When I complained it wasn’t sweet enough, she said, “Stir what you got.”
The most important question is not, “What is the optimum climate for our biosphere?” The question is, “How do we intelligently work with the climate we have?”
The optimum climate for the biosphere in general would be as during the balmy, ice sheet-free Eocene Optimum, before the onset of global cooling some 49 Ma. We probably could not enjoy such an equable interval now, due to the arrangement of the continents.
The best conditions of the past 115,000 years, ie since the end of the Eemian interglacial, occurred during the Holocene Climatic Optimum, some 9000 to 5000 years ago, interrupted by the 8.2 Ka cold snap. Global average temperature was at least two degrees warmer than now. Summer Arctic sea ice extent was much lower than now. However, sea level was then above many of today’s coastal cities.
The second best climates were during the Egyptian (~4 Ka) and Minoan (~3 Ka) Warm Periods, the third best the Roman (~2 Ka) WP, the fourth the Medieval (~1 Ka) WP and the fifth now or in the mid-20th century.
In the long-term big picture, there seems to be a step down in temperature every 1000 years or so from each successive warm period as we get further into the interglacial. And each cooling period every 1000 years or so seems to get a little bit cooler, leading me to only one conclusion: that the interglacial is slowly winding down during this Great Precessional Winter.
Perhaps, in the scheme of things from a future historical perspective, the little bit of warming humans are causing through emissions and land use change served only to delay the inevitable cooling that is locked in over longer time scales. And who really knows what the solar cycle has in store. The last 30 years of minor beneficial warming is hardly enough time to gather any conclusive evidence of what is really going on over longer time scales.
Yes, each warming cycle of at least the past 3000 years has peaked at a lower level. The LIA might or might not have been as cold as the previous cool period, the Dark Ages CP.
IMO anthropogenic warming on a global scale won’t be a pimple on the posterior of the long-term cooling trend, headed into the next glaciation. However in future, with fusion power, we might be able to melt the snowfields which grow into ice sheets. Maybe with giant blow dryers deployed in Canada, Scotland, Scandinavia and Siberia.
Well John, we’ve already built a whole lot of giant fans. All we need to do is run them in reverse, and put big heating coils in front. Ta-da!
Earthling2,
I think the issue here is that what humans are doing is not part of the natural variation in climate, and has the potential to override the normal changes. If what we do is changing the normal direction of temperature change, how is that a “delay”? It seems like you are counting on forcings that overwhelm the action of ever greater atmospheric CO2 – how do you know they will?
I don’t know myself, I’m just saying that it may not be a safe assumption.
Kristi, CO2’s effect on temperatures is logarithmic; the more CO2 there is, the less impact an additional incremental amount will have on temperatures. Without the supposition of unrealistically large positive feedbacks, there is little to be concerned about with the responsibly projected future levels of CO2 in the atmosphere.
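The logarithmic relationship the comment above describes can be illustrated with the widely cited simplified forcing expression of Myhre et al. (1998), dF = 5.35 ln(C/C0) W/m^2. The formula is standard, but the concentrations below are round numbers chosen for illustration, not projections.

```python
import math

# Sketch of the commonly used simplified expression for CO2 radiative
# forcing, dF = 5.35 * ln(C/C0) W/m^2 (Myhre et al. 1998). Because the
# relationship is logarithmic, each successive doubling adds the same
# forcing, so equal ppm increments matter less and less at higher
# concentrations.

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W/m^2) relative to a baseline concentration."""
    return 5.35 * math.log(c_ppm / c0_ppm)

first_doubling = co2_forcing(560) - co2_forcing(280)    # 280 -> 560 ppm
second_doubling = co2_forcing(1120) - co2_forcing(560)  # 560 -> 1120 ppm
# Both doublings yield the same forcing, about 3.7 W/m^2.
```

Going from 280 to 560 ppm and from 560 to 1120 ppm produce identical forcing increments, which is exactly the diminishing-impact-per-ppm point being made.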
We might add to this overview that the Earth has cooled since the Eocene Optimum, to the point of entering an ice age around 2.6 mya. This is the overriding, but largely ignored, climate fact. The average person is led to believe the ice age ended ten or twelve thousand years ago, when it was only the glacial period that ended. Those who predict runaway warming are predicting the end of the ice age. What amazing power humanity has, or what hubris. If true, could it be that our burning of fossil fuels is part of Nature’s plan to return the Earth to a more salubrious state?
Bob,
” If true, could it be that our burning of fossil fuels is part of Nature’s plan to return the Earth to a more salubrious state?”
Nature’s plan? Since when does Nature have a plan?
Humanity does have amazing power over the Earth, even disregarding climate change. We turn vast forests into pasture, divert and dam major rivers, suck aquifers dry, blow off the tops of mountains, create earthquakes…we have the ability to kill every land-based organism on a continent, if not globally. I’d say we have amazing power.
It would be interesting to see where the members rank at 1850 vs 2100.
1850 is just pulled out of someone’s………..
Who’s to say the LIA didn’t end in 1900
Or it is just ending now, because we seem to actually be right about where we started before the Little Ice Age at this point. We have been recovering the whole time.
Yes, the English geologists and statisticians playing in the science field pulled it out of thin air because it seemed to give a good answer.
This is a point that I made to Lord Monckton when he argues that temperatures were in equilibrium in 1850.
Since it is almost universally accepted that the planet warmed between 1860 to 1880 and manmade CO2 was miniscule (according to proxies there was only a ~2 ppm increase in CO2), it is difficult to see how one can maintain that the LIA ended by 1850 rather than around 1880.
Further it is almost universally accepted that the planet warmed between 1910/1920 through to 1940 and once again manmade CO2 was minuscule such that it is strongly arguable that the LIA did not end before the highs of the late 1930s/early 1940s.
Yet further, if one looks around the globe, the date when the LIA ended is very uncertain, and since it appears that the globe is not warmer than the high of the MWP, there is a strong argument that the recovery from the LIA is not complete until temperatures match the highs of the MWP. If that argument is accepted, it would make the modern warming part of the recovery from the LIA.
Richard
Christopher Monckton can speak for himself, as you will well know, but in case he’s not monitoring this, I think you misrepresent his position “when he argues that temperatures were in equilibrium in 1850”.
That is the de facto position of his opponents. He is trying to deconstruct the logic of the warmist case regarding feedbacks, and finds it wanting – a position with which I strongly agree.
As you say, Lord Monckton can speak for himself, and no doubt in due course, he will update us on how his paper is getting on in the peer review process.
But the point I make is a point of principle, since if one wishes to argue that all or most of the ‘observed’ warming is due to natural variation, and not caused by manmade CO2, it is important to know whether the recovery from the LIA has fully taken place, and if so, when we fully rebounded from the LIA.
Lord Monckton argues that temperatures were in equilibrium for 80 years from 1850, ie., through to 1930. Skeptics like to argue that of the approximately 1 degC of warming since 1850, over half was of natural origin, ie the warming between 1860 and the late 1930s/1940. But Lord Monckton’s position rather destroys that argument, since he argues that there were no natural forcings during that period.
I consider it unfortunate that he took that position, since the precise temperature does not make much difference to the value for climate sensitivity assessed by Christopher Monckton. As a point of principle, it is quite clear that the temperature was not in equilibrium as at 1850 and that there were forcings then present that caused the temperature to rise through to 1880, and then through to 1940. Those forcings were not the increased radiative forcing caused by increased levels of CO2, since according to proxy evidence CO2 only increased by about 3 to 4 ppm through to 1880, and only by about 25 ppm through to 1940.
In fact, HADCRUT3 shows about 0.6 deg C of warming in just 3 to 4 months of 1850. There was substantial variation that year.
Of course the point is that we have no idea as to global temperatures back in 1850, and all the temperature reconstructions are not fit for scientific purpose. The errors in these reconstructions are so large that we cannot assess what is going on. Of course, temperature change in and of itself does not explain why temperatures are changing (if indeed they are changing).
Richard, if you say, ” The errors in these reconstructions are so large that we cannot assess what is going on.” then how do you even know the LIA occurred?
Because of historic evidence, both written records and the extent of glaciation. We also have proxy evidence on tree lines and fauna that confirms the existence of the LIA, as well as the ice core data, which again shows the LIA. But that said, what we lack is precise quantitative data; ie., we know that it was cold, and the approximate dates, but not precisely how cold it was.
What we know is that there is much decadal and multidecadal variability in temperatures, but we do not know whether the planet today is warmer than it was at the highs of the late 1930s/early 1940s, or for that matter warmer than it was in ~1880.
We ought to select a small number of good-quality stations which we know from historic data have not moved, have undergone no significant environmental change, have good practices of maintenance and record keeping, and have intact historic records covering the 1930s/1940s. We should then retrofit these stations with the same type of enclosure, painted with the same type of paint, and equipped with the same type of LIG thermometer as was used at the station in the 1930s, calibrated in the same manner as was used historically at that particular location. We should then take modern-day observations using the same TOB as was historically used at that station in the 1930s/1940s, and thereby get modern-day RAW data that can be directly compared with the historic 1930s/1940s RAW data for the station in question, without the need for any adjustment whatsoever. Simply make a like-for-like comparison.
We would not seek to create any hemispherical or global construct. We would not need to use infilling, kriging, homogenisation, etc. Just compare each station individually with its own historic RAW data.
If we selected about 100 such stations we would quickly ascertain whether there had been any significant change in temperature since the highs of the 1930s/1940s.
I suggest using the highs of the 1930s/1940s as a reference, since about 95% of all manmade CO2 emissions have taken place since then, such that a simple comparison with the 1930s/1940s data would quickly confirm whether CO2 may be a potential driver of change. If there has been no significant temperature change since the 1930s/1940s, then we can conclude that, at worst, CO2 is an insignificant driver of temperature change.
Hi, Richard.
I do hope that Lord Monckton does drop by (though maybe another thread is a better venue?) because you say –
“…Lord Monckton argues that temperatures were in equilibrium for 80 years from 1850, ie., through to 1930…”
I just don’t recall him arguing any such thing. He is merely pointing out that mainstream scientists are, in effect, making that assumption when they are deriving feedback numbers, whether they realise it or not. Lord M’s position is that those feedbacks are already operating at that “equilibrium” position, and therefore the derived numbers are incorrect. I can’t really critique his math but, given how the hydrological cycle works on this ocean-rich planet, I suspect that he is fundamentally correct. To me, it all fits nicely.
Thanks for your further comment.
If I understand him correctly, he argues that the feedbacks, whatever they may be, are already operating in 1850, and therefore form part of the forcing that brought about the temperatures at that time (whatever that temperature may have been; the precise temperature within a few degrees is all but immaterial to the point).
He argues that the feedbacks do not all suddenly start as at 1850, whereas climate science appears to consider that some feedbacks only start at that time. Thus in 1850 we had already had the forcing associated with the then concentration of CO2, plus the additional feedback associated with the then-current levels of water vapour in the atmosphere, which water vapour was in turn a by-product of the 1850 temperature; the oceans being already warm, and the temperature of the oceans giving rise to water vapour in the atmosphere.
That is a point of principle on which I think he is probably correct. The feedbacks began to operate as soon as the Earth acquired an atmosphere; they operate as soon as the forcing is above the background temperature of space. Thus, for example, as the sun grew in intensity from a quiet sun, the feedbacks in Earth’s atmosphere moved with the increased forcing as solar irradiance increased.
Richard
Many thanks for that.
Yes, that is my understanding also. It seems we are mostly agreed after all. Perhaps we are attaching too much significance to the word ‘equilibrium’.
Lord Monckton is really just challenging the derivation of feedback values. He isn’t claiming that other things will not be affecting the global temperature, and as far as I can see the question of whether we have yet fully emerged from the LIA, whilst most interesting, is not a direct challenge to his work. I would like to see ‘natural variation’ renamed in some way, as it seems to encompass just about everything that we can’t evaluate properly in the light of CO2 forcing, rather than an unknowable [chaotic] element. Natural variation shouldn’t be a get out clause for sceptics, any more than it should be used to explain failures in the CO2 projections.
In NJ around the Navesink River, where my ancestors made their home, ice boat races were held regularly on the river and bay area in the late 1890’s, but by 1910 the ice was no longer sufficient to support that activity.
What this really shows is that the records are so spotty and incomplete that worldwide “records” prior to satellites are themselves models. As the plausible range for temperatures includes the current temperature, achieving any sort of statistical significance would be more than a bit difficult.
Bob, this scam is so ropey that the statement you showed above, directly from GISS, can also be read to imply that they are speaking of current temperatures, because it is written in the present tense:
“For the global mean, the most trusted models produce a value of roughly 14°C, i.e. 57.2°F, but it may easily be anywhere between 56 and 58°F and regionally, let alone locally, the situation is even worse.”
It is indeed.
https://data.giss.nasa.gov/gistemp/faq/abs_temp.html
Dee, the GISS conversion factor of 14 deg C (referenced to the period of 1951-1980 that GISS uses for anomalies) used to be listed at the bottom of their LOTI data pages…
https://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt
…but no longer. I was able to find an archived version though, here:
https://archive.is/7sTCC
There they write:
“Best estimate for absolute global mean for 1951-1980 is 14.0 deg-C or 57.2 deg-F,
so add that to the temperature change if you want to use an absolute scale
(this note applies to global annual means only, J-D and D-N !)”
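That conversion is just a fixed offset. A minimal sketch, using the 14.0 deg C figure from the archived GISS note; the sample anomalies below are invented purely for illustration, not actual GISS values:

```python
GISS_BASE_ABS_C = 14.0  # GISS best-estimate absolute global mean for 1951-1980, deg C

def anomaly_to_absolute(anomaly_c, base_abs_c=GISS_BASE_ABS_C):
    """Convert a global annual-mean anomaly (deg C) to an absolute temperature, per the GISS note."""
    return base_abs_c + anomaly_c

# Illustrative anomalies only -- not actual GISS values:
for year, anom in [(1880, -0.20), (1965, 0.00), (2017, 0.90)]:
    print(year, f"{anomaly_to_absolute(anom):.2f} deg C")
```

As the note itself warns, this applies only to global annual means, not to regional or monthly values.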
I’ll update the post when I get the chance.
Cheers
Bob
Bob, the odd thing is, it didn’t use to be. The 1997/1998 report by NOAA quoted 1997 as being 62.45F or 16.92C, which was 0.42C above the 1961-1990 baseline.
See here
https://www.ncdc.noaa.gov/sotc/global/199713
And 1998 was even warmer at about 0.7C.
see
https://www.ncdc.noaa.gov/sotc/global/199813
A C Osborn, in the Global Climate Report – 2013, NOAA stated:
“The year 2013 ties with 2003 as the fourth warmest year globally since records began in 1880. The annual global combined land and ocean surface temperature was 0.62°C (1.12°F) above the 20th century average of 13.9°C (57.0°F).”
https://www.ncdc.noaa.gov/sotc/global/201313
Then last year they wrote in the 2017 annual report:
“The 2017 average global temperature across land and ocean surface areas was 0.84°C (1.51°F) above the 20th century average of 13.9°C (57.0°F), behind the record year 2016 (+0.94°C / +1.69°F) and 2015 (+0.90°C / +1.62°F; second warmest year on record) both influenced by a strong El Niño episode.”
The quotes you provided from 1997 and 1998 sound a little confused.
Regardless, I didn’t use the NOAA data in this post.
Cheers,
Bob
I suggest that you go and read them for yourself instead of using the word confused.
What it means is they drastically changed the baseline (which they state as an excuse), but did not change the 1997/8 anomalies.
Using GISS is using NOAA data, just modified even more.
In the 1980s, Dr. Hansen’s research stated that the average global temperature was 288 K.
https://www.google.ie/url?q=https://pdfs.semanticscholar.org/9b5a/43d1ae48b205218466e7285bf2f9e869dd37.pdf&sa=U&ved=2ahUKEwjtz4Ky9vTeAhWBJMAKHdhhCfcQFjAHegQIAxAB&usg=AOvVaw0n5Om2wigb7FO6D5Kl2vMu
In 2018, after decades of catastrophic man-made global warming, NASA still says the Earth’s average temperature is 288 K.
https://nssdc.gsfc.nasa.gov/planetary/factsheet/earthfact.html
Dee, I have pinched your post to put on my Facebook page to enrage my alarmist friends, who will, no doubt, have conniptions because they believe the IPCC implicitly and were overjoyed because they thought that the recent report by the US climate group caught President Trump out. They thought, wrongly, that the report was released on Black Friday so no one would notice it. I trust I have your blessing in this endeavour. K
PS I won’t post until I hear from you thanks.
Kevin McNeil, yes of course.
You can also tell them that the first conference of the WMO in 1979 stated that we are currently living in an abnormal and unusually cool period, with the “normal” temperature for the earth possibly being anywhere from 5 to 10°C warmer than the current period:
“Such analyses are sketchy and approximate, but they suggest that the present-day Earth’s surface (and that of the past two million years) are substantially cooler than has been usual in history. We live in an abnormal phase of a planetary climate that in most epochs permitted a largely ice-free surface. Nothing in the record suggests that we are about to climb back to the normal condition, which may well be 5 to 10 deg C warmer than present conditions.”
https://www.google.ie/url?q=https://library.wmo.int/pmb_ged/wmo_537_en.pdf&sa=U&ved=2ahUKEwjCtPjexvXeAhWMIsAKHUx6D5QQFjABegQICBAB&usg=AOvVaw2jg5Ewzo1630fON7SFR-z-
So basically they start with a spread of guesstimates from 100+ years before any of us were born, and compare them to guesstimates 50+ years after we will all be dead.
Throw their hands up and scream that all the dead people before us, and we ourselves are doomed,……
So what, we will all be dead, and anyone we ever knew will be dead.
Who gives a flying about people in 100+ years’ time.
No-one, that’s who.
Bob …. just to be a stickler for transparency, as I don’t think we on the skeptical side should engage in the types of dishonest manipulation of the alarmists …… IMO, you should have stayed with a “within model” comparison. Your post takes the model with the highest estimated PI temp and the model with the lowest PI temp, then compares an average to this range.
It would seem to me the more ethical comparison would be to center the warmest model and the coolest model on today’s average and then view the difference in the hindcast of the two models to attain the real estimated range of the two models. Statistically, I would think it invalid not to account for changes within model, i.e., comparing predicted changes within a model to average changes across observations. But really, you can’t compare an average of today to one model’s prediction of yesterday.
Though I would agree, your exercise demonstrates that any changes that may have occurred are still within the modeled natural variance, and as such, there can be no claim of unprecedented warming.
Dr. Deanster, you wrote, “It would seem to me the more ethical comparison …”
There is nothing unethical about my comparisons.
You are more than welcome to produce illustrations any way you like.
Regards,
Bob
1.5 deg C — why don’t more people look at this number, realizing how ridiculously small it is ?
And then realize that gloom and doom is projected from a piss-ant small number. I mean, it’s right there in their faces — all the hype, all the analysis, all the hand waving, all the warnings and cautions and encouragement to change-or-else … is based on a piss-ant number ! … whose legitimacy itself is in dire question !
[Snip. We’re fairly confident that this analogy doesn’t actually add to conversation. -mod]
Glorifying 1.5 deg C, thus, is obsessive beyond any measure of sanity.
Robert, global surfaces have already warmed 1.0 deg C so all of the hubbub is only about an additional 0.5 deg C.
Cheers,
Bob
Including at least 0.6C of “Adjustments” as per the NASA/NOAA description.
Mod, I trust your confidence. (^_^) … and now those who did not read your snip will suffer in curiosity — “What, oh what, did Kernodle write that was so moderation worthy?” This is a historic day — my first snip ever at WUWT. I shall mark it on the calendar, so that, from here on out, the 22nd of November will always be WUWT Snip Day. How should I celebrate — by getting a clue, maybe ? … yes, getting a clue, … when cake alone just won’t do.
I see also that I have been corrected by later commentators on the actual smallness of the number on which so many people are building gloom and doom. Thanks for the clarification.
[If you think that snip was noteworthy, you should’ve seen some of our others. Truly magnificent works of moderation! But, we appreciate your good cheer. -mod]
(Don’t lose that calendar, it might come in handy someday) MOD
Gee! Celebrating a 5 day error.
“Mod, I trust your confidence. (^_^) … and now those who did not read your snip will suffer in curiosity — “What, oh what, did Kernodle write that was so moderation worthy?”
That’s what I want to know! 🙂
(Too late) MOD
Bill Nye, the science guy, said that if it wasn’t for our CO2 pollution, a new glacial period would have begun. Apparently it is desirable to see the northern hemisphere covered under hundreds of meters of ice and snow? This CAGW death cult is a complete mystery to me, like an evil doomsday sect that can’t wait for the end of the world but advocates action to bring on the end faster. Otherwise intelligent people are slaves under this misanthropic groupthink religion. Our time is surely a new dark age.
Bjorn, do you have a link to a video or article where Bill Nye makes that claim? That would make a great topic for an article here at WUWT.
I can recall Jim Hansen saying that CO2 is extending the time until Earth goes to a new ice age. But I don’t recall Bill Nye saying anything similar.
Regards,
Bob
https://co2islife.wordpress.com/2017/04/29/bill-nye-the-science-guy-catastrophic-ice-age-averted-man-made-co2-saved-mankind/
The Anti-Science Guy apparently would prefer renewed glaciation to a slightly warmer world:
https://co2islife.wordpress.com/2017/04/29/bill-nye-is-not-the-right-guy-would-prefer-an-ice-age-over-the-current-warming/
Thanks, John. I feel a post writing itself. But let me check to see if it was covered here at WUWT while I was on hiatus.
Cheers,
Bob
I’d surely welcome such a post. Can’t speak for anyone else.
Bill Nye is just a joke machine that practically writes itself.
Yes. Gavin Schmidt does have one redeeming feature in that he really does hope to be taken seriously as a scientist one day. But Bill Nye not only trades on the image of the scientist as someone who is slightly touched, he is genuinely crackers. He has “issues” going well beyond global warming and science.
While Schmidt has made clear his contempt for out-there scientists who have lost the plot, like Peter Wadhams at Cambridge, it is the Bill Nyes of this world that are responsible for Schmidt having so little hair left.
I can’t imagine Nye saying that, he’s a climate alarmist by nature. However, the onset of an ice age is relatively gradual, it’s the exit which is fast. So we would have nothing much to worry about in the short term anyway.
Well, we would, especially now with 7.4 billion people on the planet. When there is just a minor hiccup in the weather now, the damage can be extensive both to agriculture and property. It isn’t the 2 km of ice sheets that are the problem; it is the first few years of summer frosts that wipe out a lot of agricultural crops. If we experience anything close to the average norm of the LIA lows, then many people are at risk because of widespread crop failures. There is no big issue with surviving any warming trend, but it becomes a nightmare in a significant cooling trend when you can’t grow crops sufficient to keep a proper inventory of foodstuffs.
This is very apparent to historians, who have noted that it is almost always a significant cooling trend, or some temporary event causing short-term cooling and chaos in the weather for even a few short years, such as a few significant volcanoes, that produces the upheavals we see in the historic record. While you are right about the general slow descent into an ice age taking thousands of years, it is the low temperature anomalies that come with the cyclic gradual downturn, as in a cooling trend like the LIA, or with a random chaotic event lasting a few years, that do the damage. We are only as safe as our last harvest, our food stores, and intact political systems to ensure an orderly transition back to normalcy.
Shortening the growing season in the temperate zone breadbaskets would indeed be disastrous. Famine would stalk the planet, probably in company with its fellow apocalyptic horsemen pestilence, war and societal collapse.
Many grain regions are already near the limit, helped by the Modern WP and more CO2, plus of course fossil fuel inputs. Canada, Europe, Russia, Ukraine, Manchuria, all vulnerable to fairly small changes in temperature and precipitation. Even the US as well. We have the advantage of reliance on corn, a C4 plant.
All,
Leif Svalgaard sent me a link to a Dutton/Brune Penn State METEO 300 chapter.
They quite clearly assume that the 0.3 albedo would remain even if the atmosphere were gone or if the atmosphere were 100 % nitrogen.
This is just flat ridiculous.
Without the atmosphere or with 100% nitrogen there would be no liquid water or water vapor, no vegetation, no clouds, no snow, no ice, no oceans and no longer a 0.3 albedo.
The sans atmosphere albedo would be much as Nikolov and Kramm suggest, a lunarific 0.12.
And the w/o atmosphere earth would be hotter not colder, a direct refutation of the greenhouse effect theory.
Nick S.
https://www.linkedin.com/feed/update/urn:li:activity:6466699347852611584
https://www.linkedin.com/feed/update/urn:li:activity:6457980707988922368
NIck S
I am having an off-list conversation over a period of weeks with 5 others including an astrophysicist. One of the participants can’t understand why my observation is not generating any discussion: that with an atmosphere containing no GHG’s, the temperature near the surface would be much higher than it is now, because it would continue to be heated by the surface and would have no way to cool radiatively. Initially at least, adding GHG’s cools the atmosphere.
When the hullabaloo is about GHG’s we should be comparing the 1.5m altitude temperature with varying amounts of GHG’s from say, 0% to 1%. That would be a meaningful demonstration of the putative greenhouse effect.
Such a comparison is never presented. The comparison is always between the current state and a naked planet. That makes no sense. See the Wikipedia entry on the subject – it is quite confused. The proposal is not that adding GHG’s and a lot more atmosphere causes changes; it is that GHG’s alone do. In that case, the baseline is the atmosphere with and without GHG’s, not “without any air”.
Thanks for the links.
The calculations on no atmosphere and no greenhouse gases are so ceteris paribus that there’s no point at all.
They got that part right: the average temperature of the Earth with no atmosphere is going to be about the same as the moon. The calculation for the Earth with an atmosphere and no GHG’s is missing from all “standard” explanations.
If adding GHG’s cools the atmosphere, even if it is only initially, then the explanation suffers from a conceptual error. I identify that error as forgetting that the moon and the BB temperature at the TOA are not meaningful to the GAST. Neither informs us how the temperature of the air near the ground is affected by GHG’s, which is what follows from their claim.
“…(in) an atmosphere containing no GHG’s, the temperature near the surface would be much higher than it is now, because it would continue to be heated by the surface and would have no way to cool radiatively. ……”
N2/O2 are transparent to LWIR.
Terrestrial LWIR therefore passes through without interference straight to space.
The Earth would still cool via radiation without GHG’s.
https://eesc.columbia.edu/courses/ees/climate/lectures/radiation_hays/
“Nitrogen absorbs only in the extreme ultraviolet of which there is very little in the Sun’s radiation. Oxygen absorbs more strongly than nitrogen and over a wider range of wavelengths in the ultraviolet.”
“The smaller molecules of oxygen and nitrogen absorb very short wavelengths of solar radiation while the larger molecules of water vapor and carbon dioxide absorb primarily longer infrared radiant energy.”
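For a sense of how much the assumed albedo matters to the standard “no atmosphere” calculation debated in this sub-thread, here is the textbook Stefan-Boltzmann effective-temperature balance. Note that this gives an effective (blackbody) temperature, not a near-surface air temperature, and the 0.12 lunar-like albedo is the commenters’ figure, not an established value for an airless Earth:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0             # total solar irradiance, W/m^2

def effective_temp_k(albedo):
    """Effective blackbody temperature of a sphere in radiative balance:
    S0 * (1 - a) / 4 = sigma * T^4, so T = (S0 * (1 - a) / (4 * sigma)) ** 0.25."""
    return (S0 * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

print(f"albedo 0.30: {effective_temp_k(0.30):.1f} K")  # the familiar ~255 K figure
print(f"albedo 0.12: {effective_temp_k(0.12):.1f} K")  # lunar-like albedo: ~270 K
```

The ~15 K gap between the two cases is the arithmetic behind the argument that keeping the 0.3 albedo while removing the atmosphere mixes assumptions.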
Hi Mr Tisdale
“What Was Earth’s Preindustrial Global Mean Surface Temperature, In Absolute Terms Not Anomalies…”
Sorry to nit-pick, but temperature in degrees C (Celsius, formerly centigrade) is a kind of anomaly too, since it varies in both + and – ranges.
Kelvin is the only absolute scale, since it has only a + and no – range.
You can set a graphic scale at any K number, but it will always be positive, since temperature is a reflection of the amount of energy in the system.
vukcevic, the term absolute is commonly used by the climate science community when discussing GMST not in anomaly form.
For examples, see here:
https://archive.is/8nMwz
And here:
https://www.ncdc.noaa.gov/monitoring-references/faq/anomalies.php
Thanks for wasting my time.
Adios,
Bob
Thanks for your reply, but if we are discussing science, then they are wrong on that one, as they are wrong on many other things.
In science absolute value is the magnitude of a real number without regard to its sign.
Absolute value of (-3) = absolute value of (+3) = 3.
Absolute values cannot be negative.
Absolutely, correct.
Absolutely, picking at nits.
Absolutely, you knew the exact meaning Bob absolutely intended when he incorporated the valid use of the word “absolute” in his blog.
Have an absolutely fabulous day!
Hi Doug,
Consider my comment to be absolutely anomalous.
Absolutely!
“it is the change in temperature compared to what we’ve been used to that matters.”
Hong Kong has an annual variation in temperature, from the coldest in winter to the hottest in summer, of just over 30C. The diurnal variation is anywhere up to 10C, and the geographic variation (in a relatively small area) at any one time is from 4C to 10C.
My question is: what is the temperature I am used to?
David, as you say, in the UK it can vary 15C overnight and change 20C in as little as a week.
We live with it all the time and would barely notice 1.5C, especially when most of the changes have been in the lowest temperatures and not the highest.
Which they fail to mention in the MSM propaganda.
I did this dance with Nick Stokes: use their own method on any section of what they are claiming as a baseline and watch what happens; it will likely report an anomaly on their own baseline. Now do a simple search on what a baseline is in hard science, not in pseudo junk science.
It’s not a baseline; it is a random biased choice of some number. Call it what it is. You can call it the CAGW preferred temperature of Earth, but what you cannot call it is a scientific baseline.
As I wrote over at Bob’s original article “such a small difference and it doesn’t even include decent UHI adjustments for the last 170 years of human expansion.”
I would love to see what the models would have produced if they didn’t already know the sort of answer they were looking for. I simply don’t believe that the temperatures are an “emergent property”; the models are tuned to produce a temperature that makes sense. Once you do that, the output simply cannot be called an emergent property.
And since these models cannot be checked against reality, what is the point? We don’t know the global average temperature, so all these models do is suggest what it might have been if the models are somewhat accurate.
You punch in a factor arising from the known radiative physics of CO2, assume a simple relationship via water vapour feedback, and off you go ….
What should the average temperature of Adelaide, South Australia be, given it varies by 46 degrees C over a year (I fibbed; it was really 46.1 deg C)?
http://www.weatherzone.com.au/climate/station.jsp?lt=site&lc=23090
What spatial and temporal temperature readings should I take to work it out for the benefit of residents? Even if I did, what use would it be to them, experiencing the max, min and in-between temps anyway? Would it be more relevant for visitors?
Would you believe that the Kent Town station was moved to the eastern side of the CBD in 1979 because that’s where the BoM moved to and it was convenient for manual readings? In the process they dismantled one of the longest-serving Stevenson Screens in the Southern Hemisphere, which was situated in the West Parklands on the seaward side of the city, where our weather patterns come from. The irony is that the Kent Town location is now automated anyway.
A global surface temperature of 13.9 to 14°C at the pre-industrial level, as defined subjectively by the IPCC, NOAA, and NASA, is meaningless.
Surface temperature changes with elevation above MSL and spatial location of latitude.
ICAO defines the surface temperature of the standard atmosphere at sea level as 15°C (59°F or 288.15 K). It is used universally in aviation.
This is what climate researchers should use as the surface temperature reference. But AGW climate fanatics won’t use it, because it would not show any warming, and a 1.5 degree increase would not ring the doomsday bell.
The IPCC is claiming that a rise of only 1.5 degrees will take us to the edge of disaster.
To my layman’s understanding, that would take us to roughly the temperature of Roman times. It may have escaped my attention, but I don’t recall any accounts of climate disasters in my Latin and History classes. Indeed, those times are generally viewed as a period of great advancement – such as the fabled vine-growing in the North of England.
To paraphrase Monty Python: “What did Roman weather do for us?”
Hey, someone linked your article on Reddit, and I replied with a rebuttal. Can you check it out, please?
https://www.reddit.com/r/Conservative/comments/a0ul8m/what_was_earths_preindustrial_global_mean_surface/eakxnsi/
“Indeed, those times are generally viewed as a period of great advancement”
Aye, maybe, but they were also the times of greatest atmospheric pollution.
Black lung was a big killer, along with many more respiratory diseases.
http://www.eolss.net/Sample-Chapters/C09/E6-156-15.pdf
What was the pre-industrial time of day?
What was the pre-industrial season of the year?
What was the pre-industrial state of the tide?
What was the pre-industrial phase of the moon?
The best-performing CMIP5 model was INMCM4. The latest version is INMCM5, preparing to qualify for CMIP6. There are descriptions of the improvements made and, as of last month, a new hindcast comparable to what is discussed above. As the image shows, seven model runs were averaged for comparison to HADCRUT4.

Figure 1. The 5-year mean GMST (K) anomaly with respect to 1850–1899 for HadCRUTv4 (thick solid black); model mean (thick solid red). Dashed thin lines represent data from individual model runs: 1 – purple, 2 – dark blue, 3 – blue, 4 – green, 5 – yellow, 6 – orange, 7 – magenta. In this and the next figures numbers on the time axis indicate the first year of the 5-year mean.
HADCRUT4 uses 12 monthly global averages for the period 1961 to 1990, with a resulting average annual mean of 13.97C.
More on INMCM5 at https://rclutz.wordpress.com/2018/10/22/2018-update-best-climate-model-inmcm5/
STUPID QUESTION: Does the phrase, “averaging model runs”, really make any sense ?
Isn’t an “average of model runs” just another absurdity ?
All of a sudden, it just seemed ridiculous.
False insight? or late-blooming glimmer of brilliance ?
Robert, when the runs are from the same model, each one is a scenario. The average is a better representation of the model’s behavior than a single run. Whether any of them will become reality is another question. The fact that the hindcast is realistic is encouraging. And as you say, HadCRUT4 is itself a proxy (estimate) of a fictitious GMT. But as R.G. Brown used to say, it’s the only game in town.
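For clarity on what “averaging the runs” means mechanically, a toy sketch; the numbers below are invented for illustration and bear no relation to actual INMCM5 output:

```python
# Toy ensemble: three "runs" of five annual anomalies each (made-up numbers).
runs = [
    [0.10, 0.12, 0.08, 0.15, 0.20],
    [0.05, 0.14, 0.11, 0.13, 0.18],
    [0.12, 0.09, 0.10, 0.17, 0.22],
]

# Ensemble mean: average across runs at each time step, keeping the time axis.
ensemble_mean = [sum(vals) / len(vals) for vals in zip(*runs)]
print([round(v, 3) for v in ensemble_mean])  # [0.09, 0.117, 0.097, 0.15, 0.2]
```

The averaging damps run-to-run noise while keeping the common (forced) signal, which is exactly the property being argued over in this sub-thread.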
Ron, as noted in the second citation, the different model runs were based on different assumptions, e.g. ~60 year oscillations, etc. Averaging model runs with differing assumptions gives nonsense.
From Volodin et al. 2018
“Seven model runs were started with different initial conditions obtained from long preindustrial run, where all external forcings were prescribed at the level of year 1850. The length of preindustrial run was several hundred years, so upper oceanic layer was adjusted to atmospheric model conditions, but it is not the case for the deep ocean. A small trend of model climate is visible because of deep ocean adjustment to upper oceanic and atmospheric conditions – a common situation for simulation of historical climate with present day climate models. The obvious reason for multiple integrations is to separate the role of natural variability and external forcing in climate changes. When data of seven model runs are consistent with each other, then one can expect that the phenomenon of interest is a manifestation of (or response to) an external forcing. If there is a noticeable difference between different model runs, then a role of natural variability is crucial. To estimate statistical significance of near surface temperature trend, t-test at 99% level was used. Variance of 5 year means was calculated from 1200 years of preindustrial run.”
IIRC, R.G. Brown made a post explaining why averaging the model runs was ridiculous and unscientific. I wish he still posted here. He was both objective and informative.
Regarding hindcasts: how can they be so accurate while the forecasts are completely inaccurate? On top of that, what the hindcasts are matching is not an actual thing; it’s a construct. And we have no idea what the global temperature was in 1850, certainly not to an accuracy of 0.1 C.
Reg, R.G. Brown’s explanation of climate models is reprinted here:
https://rclutz.wordpress.com/2015/06/11/climate-models-explained/
He explained:
“Each model one at a time can have the confidence interval produced by the spread in long-run trajectories produced by the perturbation of its initial conditions compared to the actual trajectory of the climate and turned into a p-value. The p-value is a measure of the probability of the truth of the null hypothesis — “This climate model is a perfect model in that its bundle of trajectories is a representation of the actual distribution of future climates”. This permits the estimation of the probability of getting our particular real climate given this distribution, and if the probability is low, especially if it is very low, we under ordinary circumstances would reject the huge bundle of assumptions tied up in as “the hypothesis” represented by the model itself and call the model “failed”, back to the drawing board.”
“One cannot do anything with the super-average of 36 odd non-independent grand average per-model results. To even try to apply statistics to this shotgun blast of assumptions one has to use something called the Bonferroni correction, which basically makes the p-value for failure of individual models in the shotgun blast much, much larger (because they have 36 chances to get it right, which means that even if all 36 are wrong pure chance can — no, probably will — make a bad model come out within a p = 0.05 cutoff as long as the models aren’t too wrong yet.”
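For what it's worth, the Bonferroni correction Brown mentions is simple to state: testing a family of m hypotheses at family-wise level alpha requires each individual test to clear alpha/m. A minimal sketch, where the model count of 36 comes from the quote and the example p-value is hypothetical:

```python
# Bonferroni correction: with m hypotheses tested at family-wise level alpha,
# each individual test must clear alpha / m.
def bonferroni_threshold(alpha: float, m: int) -> float:
    return alpha / m

alpha = 0.05  # desired family-wise error rate
m = 36        # number of models in the "shotgun blast"
per_model = bonferroni_threshold(alpha, m)  # ~0.00139

# A hypothetical model with p = 0.01 "passes" naively but fails after correction
p_value = 0.01
naive_pass = p_value < alpha          # True
corrected_pass = p_value < per_model  # False
```

This is the point of the quote: giving 36 models 36 chances to land inside a p = 0.05 cutoff makes a "pass" by at least one of them nearly inevitable, so the per-model bar must be raised.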
The first paragraph describes what INM is doing with a single model.
You have a point. Should we be using statistical methods at all on a sample of man-made computer programs, when the results depend on who wrote the models and how they programmed them?
Because climate is a complex adaptive system, its models are too. They do not produce predictable outputs even with only slightly varying inputs; a small change in the input parameters may produce very different results.
Consider a lottery machine. Physics is known but we can’t predict what balls will come out of it. After enough runs we can conjecture that we have only certain numbers in the machine (at least so far).
Of course we trust that climate model runs are not cherry-picked for IPCC reports. Could we add a few more Russian models to those reports?
I don’t think matching HadCRUT4 is a very good result in terms of reality. 😉
It almost certainly says the model is not working well at all.
BTW, for those interested in the pre-industrial baseline, there is this graph:
https://www.eea.europa.eu/data-and-maps/indicators/global-and-european-temperature/global-and-european-temperature-assessment-8/image_xlarge
It shows what the Russians are showing. When you take 1850-1899 as “pre-industrial baseline”, the rise is about 0.8K.
Ron, the model overshoots the cool periods to clip the tops of the warm periods. Please note the 21st Century overshoot.
Dave, wait till you see the other ones.
And the IPCC AR6 will use them with no blushing.
Ron Clutz
That’s handy, because it makes it easy to answer Bob’s initial question, “What was earth’s pre-industrial global mean surface temperature, in absolute terms not anomalies, supposed to be?” At least according to HadCRUT4, which starts in 1850.
If the IPCC’s reference period for pre-industrial temperatures is 1850-1900, then you just need to calculate the average annual anomaly in HadCRUT4 for that period and add it to 13.97C. The HadCRUT4 anomaly average for 1850-1900 amounts to -0.30C; so the absolute ‘pre-industrial’ temperature was 13.67C, according to HadCRUT4.
The latest annual temperature anomaly in HadCRUT4 (2017) is 0.68C, which needs to be added to 13.97C (the 1961-1990 base), giving 14.65C in absolute values. According to HadCRUT4 then, global surface temperatures in 2017 were about 1.0C warmer than they were on average during the period 1850-1900. They’ll be a little lower this year; around 0.9C warmer.
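The arithmetic above can be checked directly. The 13.97 C baseline and the anomaly values are the ones quoted in the comment:

```python
BASELINE_1961_1990 = 13.97  # HadCRUT4 absolute baseline (deg C), as quoted above

def to_absolute(anomaly: float) -> float:
    """Convert a HadCRUT4 anomaly (vs 1961-1990) to an absolute temperature."""
    return BASELINE_1961_1990 + anomaly

preindustrial = to_absolute(-0.30)  # 1850-1900 average -> 13.67 C
year_2017 = to_absolute(0.68)       # 2017 -> 14.65 C
warming = year_2017 - preindustrial # ~0.98 C, i.e. about 1.0 C
```

The conversion is a constant offset, so the warming figure is identical whether you compute it from anomalies or from the reconstructed absolute values.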
If I recall correctly, HadCRUT4 was a revision made to ‘better’ take account of ship measurements of sea temperature, introduced when the Team were beginning to get concerned about the pause.
The problem is that before 1920 there were almost no ship measurements of ocean data, particularly in the SH. In fact, Phil Jones in the Climategate emails was quite candid when he remarked that most of the data south of the tropics and north of the Antarctic is simply made up.
Given that, it is crazy to consider that we have global data going back to 1850, and we ought to limit consideration solely to the Northern Hemisphere, where there is better coverage.
If one is going to use a HADCRUT reconstruction, HADCRUT3 is likely to be better when considering the 19th century.
From your INMCM5 link (my caps):
“…When compared to the INMCM4 surface temperature climatology, the INMCM5 shows several improvements. Negative bias over continents is reduced mainly because of the increase in daily minimum temperature over land, which is achieved by TUNING the surface flux parameterization…”
How many times do people insist there’s no “tuning” of parameters in the climate models to the past? LOL.
“The 14.0 deg C factor used to be (past tense) listed at the bottom of the GISS LOTI data page and they were referenced to the period GISS uses for anomalies, 1951-1980. But the 14.0 deg C (57.2 deg F) factor is no longer listed on that GISS data page. “
That right there deserves a post of its own.
Hi Anthony.
What’s also curious is the Wayback Machine archive. The Wayback Machine won’t allow access to the archived GISS LOTI data in 2013, 2014 or 2015, but it will allow access to that data in 2017 and 2018. I had to retrieve a copy of the older data from ArchiveToday:
https://archive.is/7sTCC
BTW, you’ll love my next post on the two CMIP5 ensemble members used in this post, comparing them to the Berkeley Earth data on a monthly basis, in absolute form. What I found is remarkable. I couldn’t make it up.
Cheers,
Bob
I look forward to reading it.
Drive on, Bob Tisdale, drive on!
Thanks for all your fine work. I’m glad you’re back.
Err, lost me there, Bob. The variability of a model, the variability of nature, and the model spread are all different animals, so I guess you should not add them together there.
Your point, in short, could be: the models don’t agree on T, and their ±0.5% spread (roughly ±1.5 C in absolute terms) is larger than the warming deemed dangerous by some.
Also, Bob isn’t comparing recent global surface temperatures with ‘pre-industrial natural variability’ here; he’s comparing them to model outputs that included the pre-industrial period. Hind-cast models, not global surface temperature records.
Bob:
I did some rough scaling on your figure 1. As far as I could disentangle the different models, the difference in global average temperature between 1880 and 2018 varied from +1.1° to +1.9°C, and seemed to gravitate towards an average of +1.5°C.
Does this mean we’re already at the dreaded 1.5° tipping point, the point of no return? Does this mean there’s no point in worrying about it or blogging about it any more? Does it mean that the end of life on earth is absolutely guaranteed to happen in the next decade or so?
Or perhaps not.
I have been working with Earth’s energy balance. In my presentations, as well as in others, the LW radiation emitted by the Earth’s surface is about 396 W/m2. That corresponds to a blackbody temperature of about 16.0 degrees, whereas a temperature of 14 C gives only about 385 W/m2. There is a problem. Maybe I have been working too much on this issue, but I believe that 16 C is close to the right value. So, I believe.
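The commenter’s conversion can be reproduced with the Stefan-Boltzmann law, T = (E/sigma)^(1/4), assuming unit emissivity (a simplification, as the reply below notes):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_celsius(flux_w_m2: float) -> float:
    """Temperature (deg C) of an ideal blackbody emitting the given flux."""
    return (flux_w_m2 / SIGMA) ** 0.25 - 273.15

t_396 = blackbody_celsius(396.0)  # ~16.0 C, matching the comment
t_385 = blackbody_celsius(385.0)  # ~13.9 C, close to the 14 C figure
```

So the roughly 2 C gap between the 14 C surface estimate and the 396 W/m2 flux is real arithmetic, not a rounding issue; it hinges on the blackbody (emissivity = 1) assumption.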
But you cannot assume anything about a blackbody; the Earth is far from that! Also, there is a thermodynamic problem with the “black body” model; see http://vixra.org/pdf/1502.0007v2.pdf and a YouTube video showing the experiment to prove it!
Can we get away from models? What about the written records, which go back some 3000 years in, say, China? Water freezes and is observed. A long period of no rain is also recorded. Heat, while not actually measured, was noted, and an estimate can be made from what was written about it.
Church records exist in Europe prior to the MWP. Research can show a lot, far better than guesses via models.
MJE
Even though I wrote about models, the radiation value of 396 W/m2 has also been confirmed by direct global observations. If the value of 396 W/m2 is not correct, then the outgoing LW radiation value of 240 W/m2 is also difficult to explain.
Bob, there certainly doesn’t seem to be any CAGW to be found in your post. But Bob, what about the Vinther et al. 2006 Greenland study (note Jones and Briffa listed as co-authors), using instrumental data from the late 18th century until the then present day?
There certainly doesn’t seem to be any CAGW there either, if you look at Table 8 of the study. Anyway, what will happen when the AMO changes back to the cool phase, or has that already started in 2015, as some scientists have recently asked? Here’s the link. So Bob, what do you make of this very long Greenland instrumental record and the decade-by-decade Table 8? I can’t find any post-1950 warming that seems to point to any scary CAGW, but perhaps I’m missing something?
https://crudata.uea.ac.uk/cru/data/greenland/vintheretal2006.pdf
If everything’s made up of atoms in the periodic table.
https://www.youtube.com/watch?v=J2K3mAKr67U
and the electron Bohr model has been replaced by the electron cloud model, then how big is the electron cloud that surrounds and formed Earth? Electrons are what fills the space between atoms (air); if you put electrons under pressure they will create heat “T” as they resist being forced together.
https://www.youtube.com/watch?v=zYS9kdS56l8
I see the electrons as the ether, plasma, or whatever you care to label the soup that protons and neutrons (mass) float around in.
Only if you are prepared to compare apples with political history.
To be sure that no human-caused global warming has been included, one should go back 20k years to the coldest part of the last ice age, which is definitely “pre-industrial”. Of course, 1.2 degrees C above that value was exceeded more than 15k years ago, and yet the effect was catastrophic: the previous ice age ended.
“To be clear, no particular absolute global temperature provides a risk to society, it is the change in temperature compared to what we’ve been used to that matters.”
That makes everything OK then, because everyone who was alive in 1900 is dead anyway.
Global surface temperature 13.9 to 14°C as defined by IPCC NOAA NASA at pre-industrial level is meaningless.
Surface temperature changes with elevation above MSL and spatial location of latitude.
ICAO defined surface temperature for standard atmosphere at sea level as 15°C (59°F or 288.15K). Used in aviation universally.
This is what climate researchers should also use for surface temperature reference. But AGW Climate fanatics won’t use it because it will not show any warming and 1.5 degree increase will not ring doomsday bell for 2030 and beyond.
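The ICAO standard atmosphere the commenter references defines a 15 C sea-level temperature falling at 6.5 C per kilometre through the troposphere. A minimal sketch (the 1.6 km example altitude is mine, purely for illustration):

```python
ISA_SEA_LEVEL_C = 15.0     # ICAO standard sea-level temperature, deg C (288.15 K)
LAPSE_RATE_C_PER_KM = 6.5  # standard tropospheric lapse rate

def isa_temperature_c(altitude_km: float) -> float:
    """ICAO standard-atmosphere temperature below the tropopause (~11 km)."""
    if not 0.0 <= altitude_km <= 11.0:
        raise ValueError("valid for the troposphere only (0-11 km)")
    return ISA_SEA_LEVEL_C - LAPSE_RATE_C_PER_KM * altitude_km

sea_level = isa_temperature_c(0.0)  # 15.0 C
high_plain = isa_temperature_c(1.6) # ~4.6 C at ~1.6 km elevation
```

Note that the ISA is a reference profile for aviation, not a climatology: it fixes temperature by altitude and says nothing about latitude or season, which is one reason climate researchers do not use it as a surface-temperature baseline.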
Bob Tisdale:
You wrote “If you look closely at Figure 3, you can see that the Berkeley Earth Data recently peaked in 2016 slightly above the highest of the simulated pre-industrial average temperatures,…with the 2014/2015/2016 El Nino-caused “record high” of 15.1 deg. C.”
The cause of the El Nino was an approximately 30 Megaton drop in anthropogenic SO2 aerosol emissions between 2014 and 2016, which partially cleansed the atmosphere, allowing sunshine to strike the Earth’s surface with greater intensity, causing increased surface warming.
The very strong 1997-1998 El Nino was due to a reported 7.7 Megaton decrease in SO2 aerosol emissions between 1997 and 1998, due to Clean Air efforts.
The temperatures shown in your Figures 2 and 3 are simply reflective of changing levels of SO2 aerosols in the atmosphere, of either volcanic or anthropogenic origin.
The gradual rise in average global temperatures since 1975 has been due to decreased levels of anthropogenic SO2 aerosol emissions in the atmosphere as a result of Clean Air efforts, from a peak of 131 Megatons in 1975 to 101 Megatons in 2014 (and to about 71 Megatons in 2016).
This is on top of Earth’s natural recovery from the Little Ice Age, a warming of ~0.05 deg. C per decade from 1900 to the present.
Bob, it is obvious that the control knob for Earth’s temperatures is simply the amount of SO2 aerosols in the atmosphere. Add them, and it cools down. Decrease them, and it warms up.
Your thoughts on this?
I’ve also (recently) become curious about the effects of a reduced atmospheric aerosol load due to the Clean Air Act, etc. The attribution question is still very much in doubt, given the two competing physical processes (i.e., radiative-forced warming vs. aerosol-forced cooling). The aerosol factor is one of the tuning parameters in the models (as I understand it), and given the potential magnitude of the forcing, it seems like we might inadvertently be contributing to warming by cleaning up our atmosphere.
rip
ripshin:
Your statement that “we are inadvertently …contributing to warming by cleaning up our atmosphere” is absolutely correct.
In fact, it is not a “contribution to warming”, but the actual CAUSE of all of the anomalous warming that has occurred since 1975, apart from temporary effects due to volcanic activity.
Radiative-forced warming, if it exists, is so small that it is undetectable. On the other hand, the relationship between decreased SO2 aerosol levels and increased anomalous temperatures is ~.02 deg. C. of temp. rise for each net Megaton of decrease in global SO2 aerosol emissions. The same relationship exists for increased SO2 aerosol emissions, which causes cooling.
“Specifically, as a pre-qualifier, I used the outputs of the simulations of Surface Air Temperatures (TAS) from 90S-90N from the 81 individual ensemble members.”
First mistake.
Steven, I’m looking at this NOAA and NASA joint report for 2017:
https://www.google.ie/url?q=https://www.giss.nasa.gov/research/news/20180118/NOAA-NASA_Global_Analysis-2017.pdf&sa=U&ved=2ahUKEwjb-7mPg_feAhVN3KQKHT9mANQQFjAKegQIARAB&usg=AOvVaw0xln7gzRmIO9ZWkDAC-5tn
NASA states that:
“2017:
0.9°C / 1.6°F
above 1951-80
average”
But the average temperature isn’t supplied.
Therefore any claimed “departure” from it has no context and in isolation is akin to statistical noise.
NOAA provides the following similar statement in the same document:
0.84ºC / 1.51ºF above 1901-2000 average; 3rd warmest year of record
Again the average that is being referred to is not published, which renders contextualising any departure from it impossible.
It is clear that there is an unwillingness to publicly state what this average being written about is alleged to be.
At a time when we constantly hear appeals to urgently adopt measures (which, according to the UNIPCC, will themselves cause “unprecedented changes to all aspects of society”) in order to attempt to limit warming to less than a rise of 2°C above an obviously constantly hidden figure, do you agree that it doesn’t serve policy makers or the electorate to have a situation where it appears that nobody can say, with any accuracy, what the target global temperature we either want to attain, or to halt a 2°C departure from, actually is?
There needs to be a little more transparency in this science in order to make appeals to society to urgently adopt unprecedented and clearly unwanted changes and to get national agreements to implement the UN’s stated policy of “transforming the global economic model which has prevailed for the past 150 years”.
(In order to quickly achieve the goals stated above, democracy and sovereignty will clearly need to be transcended*, as the democratic mandate to do what is allegedly required to fix the problems being cited cannot currently be demonstrated in any democracy.
*Transcended by whom, or by what entity is the question.)
If it’s not your own area of interest, maybe someone else can share their thoughts?
Thanks, Dee.
“Preindustrial” is easily the most stupid word in the English language. Saying it is the same as saying “my IQ is the same as my shoe size”.
Have I missed something here? I have seen many quotes of a baseline temperature, but never noticed once where an error statement was included.
If the baseline is 15 ± 0.5 degrees, then does a 0.5 degree increase mean anything, really?
I guess I’m just bugged that scientists everywhere just ignore measurement errors and act like they don’t exist. Maybe that’s ok for scientists, but for engineers that just won’t fly!
In all the models I ever used as an electrical engineer, you had to include component tolerances, voltage and current measurement errors, etc. Do climate models not include any error variances in their calculations, or are all the inputs and algorithms so accurate that error statements are not needed?
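The engineer’s point about tolerances can be illustrated with standard uncertainty propagation: for independent measurements, uncertainties add in quadrature, so a ±0.5 degree baseline uncertainty swamps a 0.5 degree “signal”. The numbers below are the commenter’s hypothetical ones, not published uncertainties:

```python
import math

def combined_uncertainty(*sigmas: float) -> float:
    """Root-sum-square of independent 1-sigma uncertainties."""
    return math.sqrt(sum(s * s for s in sigmas))

baseline_sigma = 0.5  # hypothetical +-0.5 C on the 15 C baseline
current_sigma = 0.5   # same uncertainty assumed for today's value

delta = 0.5                                                        # claimed 0.5 C increase
sigma_delta = combined_uncertainty(baseline_sigma, current_sigma)  # ~0.71 C

# The claimed change is smaller than its own 1-sigma uncertainty,
# so it fails even a loose 2-sigma significance test
significant = delta > 2 * sigma_delta  # False
```

On these (assumed) tolerances, a 0.5 degree rise on a 15 ± 0.5 degree baseline is indistinguishable from measurement noise, which is exactly the commenter’s complaint.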
Mr. Tisdale, THANK YOU KINDLY for touching on a topic from the category I call, A Question Any Sane and Earnest Child Would Ask.
“Daddy, if there was one temperature for the whole Earth, what was it a couple hundred years ago? And what is it today?”
“Son, when you become smart you will live in a world of anomalies and won’t ask such things. Only trends matter. Measurement is a cloud of probabilities like mosquitoes. Statistics is full of error and the statisticians who won’t say how much make the most money. But first we must examine your precepts in order to find bad assumptions in logic. WHY did you ask about specific, actual temperatures?”
“I thought it would just be fun, to know.”
“Daddy, why do the continents fit together like a puzzle?”
“When you become smart you will pretend not to notice things like this, and wait until a proven mechanism is found by someone else. But you’re in luck…”
“It’s okay. I’ll wait.”
“Dad, if there is no wristwatch inside the Sun, why do the sunspots go tick-tock every 11 years? It’s like Jupiter is jumping up and down saying, Look at me! Look at me!”
“Don’t look, son! It does not match exactly. For all we know right now, the sunspots could be chasing Jupiter round and round.”
“Everything goes round and round. Maybe the sunspots are throwing stuff out that changes things or makes them slippery… like someone throwing up on the dance floor.”
“Dad, if an asteroid killed all the dinosaurs and cooked the Earth, how come my teacher said we don’t have to worry because it does not happen very often?”
“Your teacher is a compulsive gambler and needs help. Thinking something will not happen tomorrow because it has not happened for a long time is the same as thinking you will win just because you think you should win. And we don’t want to think about losing. Losing everything. In fire. All dead! Except small rodents in holes! All that time building missiles for years and years but not ONE missile was built to send up to meet an asteroid. The more you think about it the more stupid it gets! Oh no… I’m thinking of it now, the idiocy, the waste! Of time! Time! How much time DO we have?? I want to figure it out but we cannot figure it out. I cannot figure. What could *I* have done? Whispered in someone’s ear… a speech, a book? Is it too late? What if… we pretend it’s not too late, and start preparing now? It wouldn’t hurt to be wrong, would it…?”
“It’s okay Dad… my teacher just lied. Besides, it would be cool to be a rat in a hole again.”
The word ‘anomaly’ is used frequently by CAGW enthusiasts because it immediately draws up bad and loathsome connotations; ‘temperature anomalies’ is a preferred CAGW phrase. Bob has drawn attention to the wide spread in the models’ pre-industrial temperature range used for future anomaly forecasts. From the look of that, your guess as to the past and future temperatures on Earth would be as good as mine, and theirs too.
Silber’s an idiot with no knowledge of natural variability and the minuscule contribution of increased CO2 to the warming of the Earth, which is not a catastrophe in any event!