WMO Reasoning behind Two Sets of “Normals” a.k.a. Two Periods of Base Years for Anomalies

Most of us are familiar with the World Meteorological Organization (WMO)-recommended 30-year period for “normals”, which are also used as base years against which anomalies are calculated. Most, but not all, climate-related data are referenced to 30-year periods. Presently the “climatological standard normals” period is 1981-2010. These “climatological standard normals” are updated every ten years, after each year ending in zero. That is, the next period for “climatological standard normals” will be 1991-2020, so the shift to the new “climatological standard normals” will take place in a few years.
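In code terms, a climatological standard normal is just a per-calendar-month average over a 30-year window, recomputed each decade. Here is a minimal sketch of that idea; the temperature series is synthetic and purely illustrative, not any agency's actual data or method:

```python
# Sketch of computing WMO-style 30-year monthly normals (illustration only;
# the temperature series below is synthetic, not any agency's data).
import random

random.seed(0)

# Synthetic monthly mean temperatures for 1951-2020: {(year, month): deg C}
temps = {(y, m): 10.0 + 5.0 * (m - 6.5) / 5.5 + random.gauss(0, 0.5)
         for y in range(1951, 2021) for m in range(1, 13)}

def monthly_normals(temps, start, end):
    """One normal per calendar month: the mean over years start..end inclusive."""
    n_years = end - start + 1
    return {m: sum(temps[(y, m)] for y in range(start, end + 1)) / n_years
            for m in range(1, 13)}

normals_1981_2010 = monthly_normals(temps, 1981, 2010)  # current standard period
normals_1991_2020 = monthly_normals(temps, 1991, 2020)  # next decadal update
reference_normals = monthly_normals(temps, 1961, 1990)  # stable reference period
```

The decadal update is then nothing more than re-running the same averaging over the newer window.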

But were you aware that the WMO also has another recommended 30-year period for “normals”, against which anomalies are calculated? It’s used for the “reference standard normals” or “reference normals”. The WMO-recommended period for “reference normals” is 1961-1990. And as many of you know, of the primary suppliers of global mean surface temperature data, only the UKMO uses the 1961-1990 base years.

Basically, in simple terms, “climatological standard normals” are for what we might expect at a given time in a given location. On the other hand, the “reference normals” are for how things have changed since the reference period. That way politicians, activists, and eco-profiteers, etc., can whine about how climate has changed and ask you to pay for it…as if we should expect climate not to change.

We can find a simple and easy-to-understand discussion of the reasons for two sets of “normals” in the WMO document titled WMO Guidelines on the Calculation of Climate Normals (2017 edition), linked here, 725 KB .pdf. There they write (my boldface and brackets):

5. APPLICATION ASPECTS

5.1 Why calculate both climatological standard normals and reference normals?

As mentioned above, climate normals serve two major functions: as an implicit predictor of the conditions most likely to be experienced in the near future at any given location, and as a stable benchmark against which long-term changes in climate observations can be compared.

In a stable climate, these two purposes can both be served by a common reference period. However, as discussed in The Role of Climatological Normals in a Changing Climate (WMO, 2007), for elements where there is now a clear and consistent trend (most notably temperature), the predictive skill of climate normals is greatest if they are updated as frequently as possible. A 1981–2010 averaging period is much more likely to be representative of conditions in 2017 than the 1961–1990 period. On the other hand, there are clear benefits of using a stable benchmark as a reference point for long-term datasets, both in practical terms (not having to recalculate anomaly-based datasets every 10 years), and in terms of communication – an “above average” year does not suddenly become “below average” because of a change in reference period.

[Wouldn’t that be terrible? To have to list and show the global surface temperature anomaly for a recent year as being “below normal”, as if that were at all likely. See Figure 1 below.]

The quote continues:

As these two primary purposes of climate normals have become mutually inconsistent in terms of their requirements for a suitable averaging period, WMO has decided that both should be calculated (subject to availability of data). While the best predictive skill would be achieved from updating climatological standard normals every year, it is recognized that this would be impractical for many countries, and hence it has been decided that these should be updated every 10 years, with the next update due after the end of 2020.

Figure 1 includes Berkeley Earth annual global surface temperature anomalies from 1988 to 2017, calculated four ways: with the “climatological standard normals” base years of 1981-2010; with the “reference normals” base years of 1961-1990, which are used by the UK Met Office; with the base years of 1951-1980, which NASA GISS and Berkeley Earth have adopted as their standard base years; and, last but not least, with the base years of 1901-2000, which NOAA NCEI uses for their global surface temperature data.

Figure 1

As you can plainly see, the base years used for anomalies do not impact which year was warmest or coolest. The base years only impact how far above “normal” the supplier can claim a year has been. Not surprisingly, NOAA NCEI uses base years, 1901-2000, that provide the highest values above “normal”.

And that brings to mind something that will cause a slight shift in topics to global surface temperatures in absolute form:

“To be clear, no particular absolute global temperature provides a risk to society, it is the change in temperature compared to what we’ve been used to that matters.”

Of course, the above quote comes from Gavin Schmidt, who is the Director of the NASA Goddard Institute for Space Studies. It is from a 2014 post at the “Climate science from real climate scientists” blog RealClimate, and that quote comes from the blog post Absolute temperatures and relative anomalies (Archived here, just in case.). So as not to be accused of quoting Gavin out of context, I’ll present the full paragraph. The topic of discussion for the post was the wide span of absolute global mean temperatures [GMT, in the following quote] found in climate models. Gavin wrote (my boldface):

Most scientific discussions implicitly assume that these differences aren’t important i.e. the changes in temperature are robust to errors in the base GMT value, which is true, and perhaps more importantly, are focussed on the change of temperature anyway, since that is what impacts will be tied to. To be clear, no particular absolute global temperature provides a risk to society, it is the change in temperature compared to what we’ve been used to that matters.

Anyone with the slightest bit of common sense knows that, annually, the local ambient temperatures where they live vary much more than the 1-deg C change in global surface temperatures that the data show we’ve experienced since preindustrial times, and far more than the additional 0.5-deg C change in global mean surface temperatures the UN has set its sights on trying to prevent in the future.

So, to put things in perspective, as a simple example—for a well-known country—let’s plot the long-term monthly variations in surface temperatures, not anomalies, and compare them with global surface temperature anomalies, which is how global mean surface temperatures are normally presented. The country I’ve chosen for this example is China, the most populous country on Earth. The data we need are available from Berkeley Earth. Their monthly global mean land+ocean surface temperature anomaly data are here, and the near-surface land air temperature anomaly data for China are here, along with the all-important monthly surface temperature factors for converting the anomalies into absolute form. (Thank you, Berkeley Earth!) The comparison runs from January 1900 to August 2013, where the Berkeley Earth data for China end. See Figure 2. The annual variations in surface temperatures in China average 28.2-deg C for the period of 1951-1980, and those annual variations dwarf the long-term rise in global surface temperatures of about 1-deg C.
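For readers who want to reproduce this kind of conversion, the arithmetic is simply the anomaly plus the monthly climatology for that calendar month. The sketch below uses an invented monthly climatology with roughly China-like seasonality; the numbers are placeholders, not the actual Berkeley Earth values:

```python
# Sketch: converting an anomaly to an absolute temperature by adding the
# monthly climatology, as the Berkeley Earth files allow.  This climatology
# is an invented placeholder with China-like seasonality, not the real values.
climatology = {1: -5.0, 2: -2.0, 3: 4.0, 4: 11.0, 5: 17.0, 6: 21.0,
               7: 23.0, 8: 22.0, 9: 17.0, 10: 10.0, 11: 3.0, 12: -3.0}

def to_absolute(month, anomaly):
    """Absolute temperature (deg C) = anomaly + climatology for that month."""
    return anomaly + climatology[month]

# A +0.5 deg C anomaly in July corresponds to an absolute 23.5 deg C here:
july_abs = to_absolute(7, 0.5)

# The annual range implied by this climatology dwarfs a 0.5-1.0 deg C anomaly:
annual_range = max(climatology.values()) - min(climatology.values())  # 28.0
```

The annual range of the (invented) climatology is what the red-versus-blue comparison in Figure 2 is visualizing: a seasonal swing of tens of degrees against an anomaly of a degree or less.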

But, but, but, I can hardly see the changes in the red curve, the global surface temperature anomalies.

Exactly.

Figure 2

In other words, the gazillion people living in China have been used to annual variations in temperatures that are far, far greater than the wimpy little 1-deg C warming the Earth has experienced since the end of the pre-industrial period—far, far greater. Also, the average annual variation in monthly surface temperatures for China is more than 56 times the additional 0.5-deg C rise in global surface temperatures the UN is now pushing to avoid.

Hmmm, I really feel a series of posts coming on with lots of reference graphs.

NOTE REGARDING MY USE OF THE TERM ABSOLUTE TEMPERATURE

The term Absolute is commonly used by the climate science community when discussing Earth’s surface temperature when they aren’t using anomalies. See the quote from Gavin Schmidt above or refer to the FAQ webpages of the global surface temperature data suppliers.

[End note]

That’s it for now. Have fun in the comments and enjoy the rest of your day.

STANDARD CLOSING REQUEST, WITH AN ADDITION

Please purchase my recently published ebooks. As many of you know, this year I published 2 ebooks that are available through Amazon in Kindle format:

And please purchase the ebook by Anthony Watts et al. Climate Change: The Facts – 2017.

To those of you who have purchased them, thank you very much. To those of you who will purchase them, thank you, too.

Regards,

118 Comments
Robber
December 3, 2018 5:35 pm

You have nailed it Bob. I look forward to seeing similar charts for other countries that people can then send to their politicians. Daily highs and lows would illustrate even more catastrophic conditions. How do they survive in China with so many below freezing temperatures?

Zig Zag Wanderer
December 3, 2018 5:54 pm

The goalposts are moving around so often and in so many different directions, it’s no wonder the CAGW faithful sometimes score own goals.

commieBob
December 3, 2018 5:55 pm

… annual variations in temperatures that are far, far greater than the wimpy little 1-deg C warming the Earth has experienced since the end of the pre-industrial period—far, far greater.

The various warm periods (MWP, RWP, Holocene Optimum) were only a couple of degrees warmer than now. I suspect there is a problem comparing instrumental temperatures with proxies.

We have a pretty good idea of what was going on during the MWP because it is recorded history in Europe and Asia. The apparent difference of a couple of degrees made a huge difference, for the better, to the people living then.

Given the historical evidence that a warmer climate is better, it is gobsmacking that the alarmists get away with crying wolf.

commieBob
Reply to  Bob Tisdale
December 3, 2018 6:28 pm

Thanks. 🙂

People get lost in numbers and analysis when a simple observation points out that their work doesn’t pass the smell test. In this case, we have historical evidence for Europe, Asia, and North Africa, as well as for America as recorded by the Vikings who briefly settled Newfoundland. When I say historical, I use the word in its strict sense.

History (from Greek ἱστορία, historia, meaning “inquiry, knowledge acquired by investigation”) is the study of the past as it is described in written documents. link

The good news is that Dr. Mann’s misbegotten hockey stick contradicted my knowledge of history and turned me into a skeptic. Similarly, it is hard to get around the demonstrated fact that the world was a better place for civilization when it was a bit warmer.

Reply to  commieBob
December 4, 2018 5:20 am

Bingo!

John Endicott
Reply to  commieBob
December 4, 2018 12:53 pm

Exactly. To be a climate alarmist requires one to be ignorant of history.

RACookPE1978
Editor
Reply to  Bob Tisdale
December 3, 2018 6:45 pm

Bob Tisdale: it is irritating that our acronyms are almost, but not actually, reliable.
Minoan Warming Period. MWP, obviously.
Roman Warming Period. RWP, obviously.
Medieval Warming Period. MWP, logically.
Modern Warming Period. MWP, unfortunately also logical.

Shouldn’t we find some useful, memorable, mnemonic, alliterative alternative? Like Middle Warming Period?
Besides, I can never spell EITHER Minoan OR Medieval without the spellcheck-right-click-select-one delay!

Reply to  RACookPE1978
December 3, 2018 7:37 pm

Hmm. I don’t see any way to avoid the evil spell checker, but here are two options that at least refer to the same kinds of time periods:

Technological:
BRONZE Warming Period (BWP).
IRON Warming Period (IWP).
MEDIEVAL Warming Period (MWP).
ELECTRONIC Warming Period (EWP).

Civilizational:
MINOAN Warming Period (MWP).
ROMAN Warming Period (RWP).
CHRISTIAN Warming Period (CWP). (Can be interpreted as CHINESE Warming Period too, since that civilization was near its peak about the same time.)
EUROPEAN Warming Period (EWP). (Or, for this one: WESTERN Warming Period (WWP)).

Not that I think anyone will change over, either way.

Alan Tomalty
Reply to  Writing Observer
December 3, 2018 8:00 pm

CCWP – Climate computer warming period 1988- Present
CCFWP – Climate computer forecasted warming period – This warming period will last from Present- Climate Armageddon
PCCP – Post Climate computer period – This period will last forever because it will start after all western countries are bankrupt from the madness policies that fought the CCFWP. However, the Climate Armageddon will not have been caused by any warming but by the madness listed in the last sentence.

Chris Hanley
Reply to  commieBob
December 3, 2018 7:45 pm

Physical evidence that can be reliably radiocarbon-dated, like fossil oyster beds and fossil coral micro atolls, indicates that sea levels around the Holocene Optimum were 1m – 2m higher than now, and since the purported 1C temperature rise during the past century has caused a sea-level rise of ~250mm, a 1.5m rise would imply temperatures 4C – 6C higher than now.
http://notrickszone.com/2017/08/21/10000-to-5000-years-ago-global-sea-levels-were-3-meters-higher-temperatures-4-6-c-warmer/

December 3, 2018 6:01 pm

“But were you aware that the WMO also has another recommended 30-year period for “normals”, against which anomalies are calculated? It’s used for the “reference standard normals” or “reference normals”.”
There is a basic difference in the arithmetic here. As I mention over and over, the steps in anomaly are:
1. For each month/location, calculate over some base period (call it A) the normal (average)
2. Subtract the appropriate normal from each temperature data point to form anomalies
3. Spatially average the anomalies.
The order is important, and 3 is the big calculation. The result will be that the global mean over A will be zero (in fact, zero for each month, eg over the 30 Mays). There is after all that another option. From the global series you can subtract the means of some other period. That will shift the result overall to have zero mean in that period (call it B).

The choice of A is governed by practical considerations; NOAA use 1961-90 because it is the period when the greatest number of stations have data. GISS use 1951-80 because it is what they have always used, and is about as good. NOAA then produce modified results for periods B by subtracting global monthly means, where B might be 1901-2000 or 1981-2010. That is a trivial operation; anyone can do it with Excel. Here, for example, I show all the major temperature data sets relative to base B=1981-2010. And it does make sense to use a stable standard period for comparison.
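For illustration, the three steps outlined in the comment above can be sketched as follows. The two “stations” and their temperatures are invented, and the spatial average uses equal weights for simplicity (real datasets use area weighting):

```python
# Toy version of the three steps above, with two invented "stations":
# 1) per-station, per-month normals over base period A (1961-1990),
# 2) subtract the matching normal from each reading to get anomalies,
# 3) spatially average the anomalies (equal weights here for simplicity).
YEARS, MONTHS = range(1961, 2021), range(1, 13)
stations = {
    "north": {(y, m): 5.0 + 0.1 * m + 0.02 * (y - 1961) for y in YEARS for m in MONTHS},
    "south": {(y, m): 15.0 - 0.1 * m + 0.02 * (y - 1961) for y in YEARS for m in MONTHS},
}

def monthly_normals(series, start, end):
    n = end - start + 1
    return {m: sum(series[(y, m)] for y in range(start, end + 1)) / n for m in MONTHS}

def global_anomaly(stations, start, end):
    norms = {name: monthly_normals(s, start, end) for name, s in stations.items()}
    return {(y, m): sum(s[(y, m)] - norms[name][m] for name, s in stations.items())
                    / len(stations)
            for y in YEARS for m in MONTHS}

g = global_anomaly(stations, 1961, 1990)  # zero mean over 1961-1990 by construction

# Re-expressing against another period B (e.g. 1981-2010) is just a subtraction
# of B's global monthly mean -- the "trivial operation" noted above:
mean_b = sum(g[(y, m)] for y in range(1981, 2011) for m in MONTHS) / (30 * 12)
g_b = {k: v - mean_b for k, v in g.items()}  # zero mean over 1981-2010
```

Note that step 3, the spatial average, happens after anomalies are formed, and that switching from period A to period B never touches the station data again.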

RicDre
Reply to  Nick Stokes
December 3, 2018 6:39 pm

Mr. Stokes:

Thank you for explaining how anomalies are calculated and why one might use different base periods when calculating anomalies.
Since you have said in the past that you agree with Mr. Schmidt’s statement “To be clear, no particular absolute global temperature provides a risk to society, it is the change in temperature compared to what we’ve been used to that matters.”, I’d be interested to know your opinion on what makes those changes in temperature so risky when, as Mr. Tisdale shows, they are insignificant when compared to the annual temperature changes that people are used to.

RicDre
Reply to  Bob Tisdale
December 3, 2018 7:11 pm

Bob Tisdale:

Being a grumpy old retired computer programmer, I tend to be formal when I address people I have not met personally, but I will comply with your requested preference when addressing you. Also, I want to take this opportunity to thank you for all of the interesting and thought-provoking articles you post on WUWT. It’s articles like this one (and comments from people on both sides of the Global Warming issue) that make WUWT such an interesting site.

Solomon Green
Reply to  RicDre
December 4, 2018 1:42 am

As another grumpy very old man, I prefer Mr. Dee’s address. In the UK Mr. Tisdale’s research based contribution to the CAGW scam is worth infinitely more than a mere “Bob”.

Reply to  RicDre
December 3, 2018 7:02 pm

“I’d be interested to know your opinion on what makes those changes in temperature so risky”
I’ve outlined how a global anomaly average is calculated. It is important because the process can be reversed. If the anomaly average rises, local temperatures are likely to be higher than they were. You can’t use an average global (absolute) temperature in the same way. It works for anomalies because of their homogeneity.

So that is the key. There are places and times where heat is harmful, for various reasons. It may be that it damages crops, which have been planted in expectation of more reasonable temperatures. Or it may threaten health. And a higher anomaly average makes that danger greater. Where I live (Victoria) we worry a lot about forest fires. Our two worst days, when towns were burnt, were also our two hottest – 2009 at 46.4°C in Melbourne, and 1939 at 45.6°C. We get days approaching 45° reasonably often without the same disastrous results. There was another very bad day in 1983 at 43.8°C; that was at the tail end of a very dry summer. So very small increases in temperature, at the right time, can have disastrous results.

Timo Soren
Reply to  Nick Stokes
December 3, 2018 7:46 pm

Using anecdotes for science is rather poor. I have yet to find a single published result on how actual measured heat is bad, without some reference to fingerprint. Do you actually have an example in the last 30-50 years where the additional 1.5 degrees is bad? If so, please provide. One caveat: coral is not allowed (by my rules, as it is all over the place). However, I will give you 4 on my side:
crop production up, world wide greening up, insect/moth reproduction in Finland up, and lastly herring larval reproduction up with higher water temps.

But please give a published result. (no models, actual data.)

FYI,
coral: no
crop production:no
forest fires: no
droughts:no

But i will be happy to read a data based complete analysis on ANY place in the world that the conclusion is significant that warming has been harmful.

Alan Tomalty
Reply to  Nick Stokes
December 3, 2018 8:18 pm

Forest fires are not caused by heat. They are either arson, carelessness, electrical storms, or drought. There aren’t many forest fires in tropical zones where the moisture is high. Drought is the main problem for forest fires. I haven’t seen any crops damaged in the world because it got too hot. We don’t plant crops in deserts unless we have lots of irrigation. Crops don’t die in California because of high temperatures. Again, drought or frost kills crops. So far in the world’s history the number of people dying from heatstroke has gone down every year.

Mr. Stokes; you said “We get days approaching 45° reasonably often without the same disastrous results.”

The reason that the fires started was that the plant life was much drier. It wasn’t because of the 1C difference in temperature. AGAIN YOU DIDN’T ANSWER THE QUESTION. [snip. let’s not attack Nick in this manner, although I admit in another incarnation, I have tangled with him myself, so I am not without sin~ctm]

Reply to  Alan Tomalty
December 3, 2018 9:35 pm

“Forest fires are not caused by heat. They are either arson, carelessness, electrical storms, or drought.”
In our eucalypt forest, at least, the worst fires are on the hottest days. And yes, drought is a factor too, here often associated with heat. And that is universally understood. The weather for our disaster on Feb 7 2009 was well predicted. Here are just some of the warnings:

“CONSIDER these words: “I can’t stress this enough. I know that the chief fire officer has been out and he said it will be as bad as you can get and he’s not exaggerating.” So Premier John Brumby warned Victorians, on Friday, February 6 — the day before Black Saturday — of imminent extreme fire danger across the state. The Premier explained the level of extremity by saying that conditions were going to be even worse than those experienced in the Black Friday fires of 1939 and the Ash Wednesday fires of 1983. “It’s just as bad a day as you can imagine and on top of that the state is just tinder-dry,” he said. The warning was reported on The Age’s website on the day he uttered it, and again in the newspaper the next morning, under the heading “The Sun Rises on Our ‘Worst Day in History’ “. Other news media in Victoria carried similar reports.”

They were right. And they weren’t basing that on reports of unusual numbers of arsonists or carelessness. They know, as all Victorians do, what follows with extreme heat.

BoyfromTottenham
Reply to  Alan Tomalty
December 3, 2018 10:15 pm

Alan – and don’t forget overhead power lines and locomotives. I guess that ‘carelessness’ includes things like barbeques, hot vehicle exhaust pipes and welding sparks. I remember from many years ago when an airport fire service vehicle (a much loved Willy’s Jeep) caught fire at Cooma airport due to contact between very dry long grass and the exhaust pipe. ;-(

Doc Chuck
Reply to  Alan Tomalty
December 5, 2018 11:25 pm

Yup, it’s not all down to the daily high temperature. Both last year’s devastating ‘Thomas’ and this year’s ‘Woolsey’ (Malibu) wildfires in southern California were in the month of November, when the annual tall grasses had dried. Combine that with a several-day interior (Mojave desert sourced, very low humidity) offshore ‘Santa Ana’ wind event, and early suppression of any initial small fire with access to a wide area of that dry grass becomes extremely difficult. Indeed, the high winds initially grounded available helicopter and fixed-wing water/retardant-dropping aircraft until the expanses of those fires were nearly overwhelming.

Fortunately for those all around me, despite residing at the shore-side end of the major river valley that is the region’s major offshore wind chute, we also have a several mile buffer area of intensively cultivated green agricultural fields (strawberries and cabbages prominent among them) on that same river’s fertile coastal flood plain that now harbors no such expanse of dry grass between us and those nearby fire-prone grassy slopes, and so the only real threat hereabouts is airborne smoke/ash from such fires.

Barry Constant
Reply to  Nick Stokes
December 3, 2018 8:28 pm

Most of the warming in the anomaly is coming from increased lows – not increased highs. In places where there is little humidity the temperature can certainly get very hot, but the night will also cool quite a bit more.

Was the 1939 temperature the result of global warming or was that just the natural variability of a system for which we have a very small amount of actual measured data?

Reply to  Barry Constant
December 3, 2018 8:42 pm

“Was the 1939 temperature the result of global warming or was that just the natural variability of a system”
Natural variability is obviously a component. 2009 was our last big fire year in Victoria (2018 for Queensland). But rising mean anomaly puts the natural variation into a different range. I made a table here of the top 20 hot days in Melbourne, which includes the three megafires. Seven were in the last five years.

Michael Jankowski
Reply to  Barry Constant
December 4, 2018 11:31 am

2009 fire supposedly had 65 mph winds. Who cares if it was hot.

John F. Hultquist
Reply to  Nick Stokes
December 3, 2018 8:54 pm

Nick Stokes @ 7:02 pm

The temperatures you mention [ 46.4°C, 45.6°C, & 43.8°C ] are not high enough to ignite a fire. As you know, they are not even close.
Cliff Mass, on his November 20 post, explained why the high temperature, and global warming, were not the cause of the recent serious fire in California.
Here is the link:
https://cliffmass.blogspot.com/2018/11/was-global-warming-significant-factor.html

Search for ‘ Surface dry conditions ‘

I can see no reason why the explanation would not hold for AU.
A temperature of 30°C (86°F) during a dry season can produce serious fires.
So that is the key.

Reply to  John F. Hultquist
December 3, 2018 9:16 pm

John,
“The temperatures you mention [ 46.4°C, 45.6°C, & 43.8°C ] are not high enough to ignite a fire.”
True. But the fact is that it is on such days that we get the main fires. If the temperature is over 40 with a big N wind (as there usually is) it will be a day of total fire ban, and at least some fires are very likely. Ignition events which in milder weather would have little effect, just take off. The same is happening right now with the extreme heat in Queensland.

Pat Lane
Reply to  John F. Hultquist
December 3, 2018 10:13 pm

I fought the 2009 fire at Churchill in Victoria.
It was lit by an arsonist.

Michael Jankowski
Reply to  John F. Hultquist
December 4, 2018 4:06 pm

Gee Nick, it couldn’t be because Jan and Feb are the driest months and also the highest for evapotranspiration. Are you just playing dumb again?

Reply to  Nick Stokes
December 3, 2018 9:11 pm

Nothing you have mentioned as an advantage of using anomalies cannot equally be accomplished using absolute temperature, as far as I can see. If the crops will be wilted by anomaly, they will also be wilted by absolute temperature. If frail folks suffer from anomaly, they will also suffer from absolute temperature. Nobody uses anomaly to calculate vapor pressure in a fire.

The only reason to use anomaly is to cancel out the annoying complexity of daily and seasonal variation, yet this complexity may be the key.

Reply to  Gordon Lehman
December 3, 2018 9:21 pm

Gordon,
“Nothing you have mentioned as advantage of using anomaly cannot equally be accomplished using absolute temperature, as far as I can see.”
The issue is spatial average anomaly. If that is up 1° for your region, or even globally, it is then likely that your local temperature (and anomaly) will be higher than usual for that time. You can’t say the same for average absolute.

RicDre
Reply to  Nick Stokes
December 3, 2018 9:30 pm

Mr. Stokes,

I understand your point that small changes in temperature (along with other factors such as a very dry summer) can make a difference in the occurrence of forest fires, but that doesn’t answer the question I asked, which was whether, in your opinion, it is true, as Mr. Schmidt says, that “…no particular absolute global temperature provides a risk to society, it is the change in temperature compared to what we’ve been used to that matters.”. In other words, absolute temperatures of say 46.4C or 45.6C or 43.8C don’t matter unless they are a change from what we are used to.

Reply to  RicDre
December 3, 2018 9:46 pm

“In other words, absolute temperatures of say 46.4C or 45.6C or 43.8C don’t matter unless they are a change from what we are use to.”
Well, the fires matter, whether or not it’s a change. Although you have a point in that if we had such days regularly over a long time, we probably wouldn’t have a forest. Some would regret that.

To take a more mundane example: forty years ago, most table wines here came from the Barossa Valley or the Hunter Valley. They still produce, but the centre of gravity is moving south, including Tasmania and New Zealand. That isn’t a disaster; in fact a lot of people benefit. But it does have consequences for the people who invested in the Barossa. And it can’t go on forever. There’s a big gap before Antarctica becomes productive.

RicDre
Reply to  RicDre
December 3, 2018 10:36 pm

Mr. Stokes, you said: “Well, the fires matter, whether or not it’s a change…”

Are you saying that, at least in the case of the forest fires in Victoria, you don’t agree with Mr. Schmidt that “…no particular absolute global temperature provides a risk to society, it is the change in temperature compared to what we’ve been used to that matters.”?

Ozonebust
Reply to  RicDre
December 4, 2018 12:03 am

Nick
Those regions in New Zealand were always likely spots for grape growing; they just did not have the capital and know-how. Plus the market had not developed into the guzzling consumption that exists now. Plus development of various varietals for those conditions. Blue duck was the level of sophistication.
Nice try tho.
Regards

rich
Reply to  RicDre
December 4, 2018 12:36 am

Hi Nick,

See https://www.wineaustralia.com/getmedia/ba012fa8-3e20-44e9-a783-e8d137a2c74d/MI_SectorReport_Jan2018_F

2017 wine sector report.

The ‘warm’ regions of Australia, being SA, Murray Darling and NSW, produced 94% of Australian wine; the ‘moving south’ region of Tasmania produced a whopping 1% of the wine, and these are mostly cool-climate varieties. Hardly what I would call the massive change in the wine industry that you claim.

LdB
Reply to  Nick Stokes
December 3, 2018 11:09 pm

It is far more about the wind conditions and has very little to do with temperature. It’s generally just that the worst fire weather coincides with hot temperatures; the only real effect the temperature has is that it makes it hot for the firefighters.

For any catastrophic fire in Australia look at the wind speed that is all you ever really need to know. You can basically ignore the temperature and certainly a number of the catastrophic fires were on cool days fanned by high winds.

Reply to  LdB
December 3, 2018 11:53 pm

We get strong winds all year round. We get fires in summer.

LdB
Reply to  LdB
December 5, 2018 1:16 am

LOL … you only have dry fuel in Summer, think there might be a correlation 🙂

Reply to  LdB
December 5, 2018 1:45 pm

“dry fuel”
We get rain pretty evenly year-round. It’s hot in summer. Think there might be a correlation?

Steven Mosher
Reply to  Nick Stokes
December 4, 2018 12:21 am

Nick, nothing has changed here.
You will still be abused.
really no point in it.
you will stay technical, they will go personal.

DaveS
Reply to  Steven Mosher
December 4, 2018 4:54 am

Eh? What ‘abuse’? Apart from one snip in Alan Tomalty December 3, 2018 at 8:18 pm, scrolling up all I see is disagreement, with some folk providing evidence that they feel contradicts Nick Stokes’ comments. No-one, for example, has questioned his parentage.

Scott W Bennett
Reply to  Steven Mosher
December 4, 2018 5:08 am

The scare tactics about fire from Nick are completely absurd in the context of the reality of the climate history of Australian flora!

Many Australian species are so well adapted to fire (And flood) that many will not germinate without it.

Growing up Down Under before AGW was popular, the accepted wisdom from our scientists was that Australia had no long-term predictable climate patterns.

Near where I grew up on the Sunshine Coast there is a stand of Rose or Flooded Gums (Eucalyptus grandis) that in pre-industrial 1850 would have been on the boundary between wet sclerophyll and Eucalypt forest.

Captain James Cook might have seen them in 1770 as he sailed past naming the surrounding hills*, as the trees are over 500 years old.

These trees have become completely surrounded by rainforest in the last 150 years and no more seed will germinate in the wet and fireless conditions today! In fact it says on the sign beneath the trees:

No more seed will germinate after these have died until future climate change again allows fire into the forest.

As an example of the outstanding ignorance of alarmists talking up heat and fire and how little we really know, a story that came out of the Black Saturday fires of 2009 is revealing.

A survey in 2005 found that a rare species of gum which inhabits only two sites in Australia was failing to regenerate in the wild. The Buxton Silver Gum’s secrets were only revealed after the bushfires razed a reserve set up to protect them! The fires exposed the tree’s large underground lignotubers, which were previously unknown because the trees had only been examined above ground! Parks Victoria said that the discovery pointed to the trees’ longevity: “… its an indication that those trees may be hundreds of years old. The reserve also flooded for the first time in 14 years last September, creating ideal conditions for regeneration.”

It is now known that the species requires fire – and an extremely severe, hot one at that – and flood to survive. – Julie Flack, Parks Victoria

*…So these hills lay but a little way inland and not far from each other, they are very remarkable on account of there singlar form of elivation which very much resemble glass houses which occasioned my giving them that name… – Cook’s Journal: Daily Entries, 17 May 1770

Mike Bryant
Reply to  Steven Mosher
December 4, 2018 5:10 am

The comments were about Nick’s comments… not about Nick.

Giles Bointon
Reply to  Steven Mosher
December 4, 2018 5:34 am

Not seeing any’personal’ yet. How wonderful for Nick to have such a wise guardian angel looking out for him!

RicDre
Reply to  Steven Mosher
December 4, 2018 7:24 am

Mr. Mosher:

I believe I have been very respectful to Mr. Stokes, but if you can give me an example of where I have gone personal, I will apologize to Mr. Stokes. In any case, I am told that you are a very knowledgeable person yourself, so it would be very helpful to the discussion if, instead of trying to defend Mr. Stokes, who is quite capable of defending himself, you would use your knowledge to add to the discussion. For example, I would be very interested in knowing if you agree with Mr. Schmidt that “…no particular absolute global temperature provides a risk to society, it is the change in temperature compared to what we’ve been used to that matters.” and why you agree or disagree.

Reply to  Nick Stokes
December 4, 2018 5:48 am

What I don’t get is how the very small amount of an anomaly can be reasoned to be distributed over the entire globe in such a way that its even smaller amount causes any alarming consequences.

We go through all this rigor to calculate the anomaly. Okay, then why do we not go through some similar rigor to distribute it back over the whole globe to determine the possibility of its actual physical effect.

Show me how, say, a 0.5 C anomaly, figured globally, translates into an actual effect … globally.

Reply to  Robert Kernodle
December 4, 2018 12:59 pm

Robert, the anomaly is not distributed evenly around the globe; changes in anomalies increase with latitude. The image at the link below shows how anomaly changes (up and down) are dominated by the northern higher latitudes.

https://rclutz.wordpress.com/2017/01/28/temperature-misunderstandings/

Reply to  Robert Kernodle
December 4, 2018 1:04 pm

Correction: Should have said anomaly changes are dominated by northern continents.

Reply to  Robert Kernodle
December 4, 2018 1:16 pm

“Okay, then why do we not go through some similar rigor to distribute it back over the whole globe “
We don’t need to distribute it back. It was computed locally. Posts at WUWT frequently show maps of temperature. Providers, e.g. UAH, post them every month – here is GISS. I show them here in interactive style.

Reply to  Nick Stokes
December 4, 2018 3:53 pm

Nick Stokes:

I can’t see how an extra degree or two C of temperature makes the difference. Did the fires themselves contribute to the temperature?

Anthony Violi
Reply to  Ragnaar
December 5, 2018 4:23 am

Absolute garbage, Nick. Tell everyone about the green policy, pushed by the greenies after 2006, of not allowing all of the cattle country to be grazed. It was a tinderbox waiting to burn because of bad policies; it is well known by everyone in Victoria. Most devastating fires, like the QLD ones at the moment, are caused firstly by bad management of the land, and then by pyromaniacs who light fires on catastrophic fire days.

https://www.smh.com.au/environment/green-ideas-must-take-blame-for-deaths-20090211-84mk.html

Dr Bob
Reply to  Nick Stokes
December 4, 2018 9:29 pm

Nick Stokes, you are incorrect in stating that small increases in temperature can have disastrous results … the key element in a bad fire is the strength of the wind … that is what worries the CFA … a slightly hotter day than usual makes no difference …

Reply to  Bob Tisdale
December 3, 2018 7:19 pm

Bob,
“Did you bother to read the post beyond the second paragraph?”
Yes. But more to the point, I actually know how it is done. I do it myself. Although I was out of date on NOAA: 1961-90 was their period A for MLOST, but they have superseded that (June 2015, says the FAQ) with V4 and a period A of 1971-2000. It is described in detail here.
“We provide the NOAAGlobalTemp dataset as temperature anomalies, relative to a 1971–2000 monthly climatology, following the World Meteorological Organization convention. This is the dataset NOAA uses for global temperature monitoring.”
That is the set before global averaging.

I don’t think you bothered to read your NOAA quote. They said
“The national maps show temperature anomalies relative to the 1981–2010 base period. This period is used in order to comply…”
That’s what they show. It is a period B. It isn’t what they use for the anomaly average calc. And then they say
“the reference period is adjusted to the 20th Century average for conceptual simplicity (the period is more familiar to more people, and establishes a longer-term average). The adjustment does not change the shape of the time series or affect the trends within it”
That is the adjustment process I described above. Indeed it does not change shape or trends. They explain exactly what they are doing.
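Nick’s point that re-baselining only shifts the series can be sketched with toy numbers (everything below is invented; for brevity a single offset is used, whereas real conversions use one offset per calendar month):

```python
# Toy monthly anomaly series on base period A (all values invented).
anoms_base_a = [0.12, 0.18, 0.05, 0.22, 0.30, 0.27]

# Re-baselining to period B just adds a constant offset: the difference
# between the period-A and period-B climatological means.
offset_a_to_b = -0.15  # hypothetical: period B was 0.15 C warmer
anoms_base_b = [a + offset_a_to_b for a in anoms_base_a]

# The shape is untouched: successive differences (and hence any trend)
# are identical under either base period.
diffs_a = [round(y - x, 6) for x, y in zip(anoms_base_a, anoms_base_a[1:])]
diffs_b = [round(y - x, 6) for x, y in zip(anoms_base_b, anoms_base_b[1:])]
assert diffs_a == diffs_b
```

A constant shift cancels in every difference, which is why changing the base period can never change the shape of the curve or the trend.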

Reply to  Nick Stokes
December 3, 2018 7:43 pm

“That is a trivial operation; anyone can do it with Excel.”
To make that anomaly base conversion (when you have already done the spatial averaging) easier, I have a post here from 2015. It sets out, for each supplier, the numbers you need to add to convert between any of the commonly used periods. There is a different number for each month.
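The per-month conversion described above can be sketched like this (the climatology numbers are invented placeholders; the real per-supplier offsets are in the linked post):

```python
# Hypothetical monthly climatologies (deg C) on two base periods.
clim_1961_1990 = [13.1, 13.3, 13.9, 14.6, 15.3, 15.9,
                  16.2, 16.1, 15.6, 14.8, 14.0, 13.4]
clim_1981_2010 = [13.4, 13.6, 14.2, 14.9, 15.6, 16.2,
                  16.5, 16.4, 15.9, 15.1, 14.3, 13.7]

# One offset per calendar month: the anomaly wrt 1961-90 equals the
# anomaly wrt 1981-2010 plus (clim_1981_2010 - clim_1961_1990).
offsets = [b - a for a, b in zip(clim_1961_1990, clim_1981_2010)]

def to_1961_1990_base(anom_1981_2010, month):  # month: 0 = January
    """Re-express a 1981-2010-based anomaly on the 1961-1990 base."""
    return anom_1981_2010 + offsets[month]
```

With these made-up numbers, a January anomaly of 0.5 on the warmer 1981-2010 base becomes 0.8 on the cooler 1961-1990 base; the twelve offsets differ month by month, which is why a single annual constant is not enough.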

LdB
Reply to  Nick Stokes
December 3, 2018 11:37 pm

I agree it is off topic, but I would also like Nick to consider that spatial anomalies may not be a great idea; they come with their own set of problems. The ENSO oscillations basically move the patterns around, so a better approach would be to allow the whole template to move slightly and try to line up the best position of the major pattern (there are obvious limits on the amount it can move) before you attempt to run localized anomaly calcs. It still leaves the unsolvable problem that you can’t predict much more than a few years ahead, but you should be able to get a reasonable past-history match.

Reply to  LdB
December 3, 2018 11:58 pm

“I would also like Nick to consider spatial anomalies may not be a great idea”
They aren’t spatial anomalies. They are anomalies in time that are then being integrated spatially. NOAA make some use of EOFs, but I don’t think they would be included in what is called anomaly.

LdB
Reply to  Nick Stokes
December 5, 2018 1:20 am

Classic Nick Stokes defense: complete redirection to the stupid.

You did not deal at all with the fact that your weather/climate patterns oscillate from year to year 🙂

Michael Jankowski
Reply to  Nick Stokes
December 4, 2018 4:47 pm

“…The choice of A is governed by practical considerations; NOAA use 1961-90 because it is the period when the greatest number of stations have data. GISS use 1951-80 because it is what they have always used, and is about as good…”

(1) Picking the period for which the “greatest number of stations have data” is not a “practical consideration.”
(2) You frequently like to claim that using a large number of stations is not that important when dealing in anomalies. Now it’s apparently THE reason NOAA uses 1961-90 – although as you noted, 1951-80 is “about as good.”
(3) On what basis is 1951-80 “about as good?”
(4) What happens if/when it turns-out that there are other baseline periods which produce a lower uncertainty than 1961-90 and/or 1951-80?

Paramenter
Reply to  Nick Stokes
December 5, 2018 11:09 am

Hey Nick,

There is a basic difference in the arithmetic here. As I mention over and over, the steps in anomaly are:

I reckon it would be nice if you publish here, of course if WUWT Powers, Principalities, Thrones and Dominions allow, a short cheat-sheet how to do anomalies used in the ‘official’ climatology. That may disperse some confusion and may be helpful for some cross-validation work.
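For what it’s worth, the standard recipe can be sketched in a few lines (toy data and equal station weights, purely for illustration; real products grid the stations and area-weight the average):

```python
from statistics import mean

# A minimal sketch of the usual anomaly recipe:
#   1. per station, compute a per-calendar-month mean over the base years;
#   2. subtract that climatology from every monthly value;
#   3. average the resulting anomalies across stations.
# station -> {year: [12 monthly mean temps in deg C]} (invented values)
stations = {
    "A": {1991: [1.0] * 12, 1992: [1.2] * 12, 1993: [1.1] * 12},
    "B": {1991: [20.0] * 12, 1992: [20.3] * 12, 1993: [20.2] * 12},
}
base_years = (1991, 1992)  # hypothetical short base period

def climatology(record):
    """Per-calendar-month mean over the base years."""
    return [mean(record[y][m] for y in base_years) for m in range(12)]

def anomalies(record):
    clim = climatology(record)
    return {y: [t - c for t, c in zip(temps, clim)]
            for y, temps in record.items()}

an = {name: anomalies(rec) for name, rec in stations.items()}
jan_1993 = mean(an[name][1993][0] for name in stations)  # ~0.025 C
```

Note that the two stations sit 19 C apart in absolute terms, yet their anomalies are directly comparable; that is the whole point of step 2.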

markl
December 3, 2018 7:18 pm

Nothing to see here, move on. Just another goal post movement.

Chris Hanley
December 3, 2018 7:24 pm

Fig. 2 indicates that any change in temperature experienced by a person in their nineties in that part of the planet (within the presumably current borders of China) would have been in the average minimum temperature: although variable from year to year by maybe three or four degrees C, it changed only a degree or two C over their lifetime.

Chris Hanley
Reply to  Bob Tisdale
December 3, 2018 9:34 pm

Bob,
It was just a garbled observation, based on the absolute temperatures indicated on Fig. 2, relating to the comment by Gavin Schmidt and appreciating your point, viz. we’re used to much greater variations in temperature during a day or over an annual cycle than anyone will experience over a lifetime, AGW or no AGW.

nw sage
December 3, 2018 7:35 pm

From the quote, “…the predictive skill of climate normals…” strikes me as very odd.
Every single statistics class I ever took emphasized that statistics are NOT a predictive tool or measurement. They work ONLY with past data – already ‘cast in concrete’ – and can predict nothing. It seems those who use statistics – and ‘normals’, ‘averages’, variants, etc. – to somehow predict what might be true for tomorrow, next week or next century are skating on very thin ‘scientific ice’. Dressing it up in technobabble doesn’t change its nature.
It is true that we can expect tomorrow to have a temperature (because all ‘tomorrows’ always have had one), but its amount or value is necessarily a guess. Any so-called scientist who doesn’t realize this needs a refresher course in basic statistics and just what data can be used for.

RicDre
Reply to  nw sage
December 3, 2018 9:41 pm

nw sage:

I think your comment is nicely summarized by a quote commonly found on a Stock Prospectus: “Past performance is not an indicator of future results”

Rick C PE
December 3, 2018 7:58 pm

The warmest large US cities are Honolulu, Miami and Phoenix – all in the 23 – 25 C range for annual average. Most Midwest and Eastern cities have annual average temperatures in the 10 – 15 C range. The coldest cities – Bismarck, ND and Juneau, AK – are under 5 C. In most of these places (except Honolulu and Miami) the difference between average winter and summer temperatures is 20 to 25 C. All are populated by humans and support abundant vegetation and wildlife. How could anyone conclude that a gradual change in local average temperature of 2 – 3 C would somehow be devastating?

Only “sea level rise” could conceivably be a concern and I’m not going to get excited about it until arctic/antarctic average temperatures approach the freezing point of water.

JBom
December 3, 2018 8:48 pm

Psychopaths (e.g. W.M.O.) build “castles in the clouds in the air”, whereas sociopaths (U.N. HQNYC-Geneva-Vienna and E.U., e.g. Macron, Merkel, Tusk et al.) live in the “castles in the clouds in the air”.

Ha ha.

December 3, 2018 10:17 pm

The American city Minneapolis has an average temperature of 8 degrees Celsius.

The American city Denver has an average temperature of 10 degrees Celsius.

With 2 degrees Celsius of global warming, the average temperature of Minneapolis would increase to approximately 10 degrees Celsius.

Will this amount of global warming turn Minneapolis into a disaster?

How do the people of Denver cope with their current average temperature of 10 degrees Celsius?

There seems to be a logical error in the idea that it will be catastrophic if we exceed the 2 degrees Celsius temperature limit.

Am I missing something here?

LdB
Reply to  Sheldon Walker
December 3, 2018 11:59 pm

Farmers, animals etc in one area may be affected and have to move to different locations which may or may not be a problem depending if they can. However that problem exists with or without CAGW.

China for example has been drying out for centuries, and in places like Beijing the Yongding river hasn’t had water in it for 30 years and hasn’t flooded since 1958. Those sorts of trends and problems existed long before man got the blame via CO2, and people and animals just adapted because they didn’t know they had to run around and panic.

Steven Fraser
Reply to  Sheldon Walker
December 4, 2018 1:32 pm

Only if Minneapolis grew mountains as a result.

John Dowser
December 3, 2018 10:21 pm

This article does not address the consequences of a hypothetical rise of several degrees in average temperature for very large areas, or even near-globally. That is simply not related to any short-term cyclical, seasonal or random fluctuations, which all cancel themselves out over the given measurement time frame or have limited influence because they lack persistence.

It is abundantly documented how rapid changes in any climatic temperature can dramatically influence fauna, flora, ice surfaces, mud slopes, agriculture and so on. Does it need any reference?

The climate discussion, for a skeptic, is not about questioning all that; it is about whether we will get the steep changes projected in the various models. To imply that climatic changes of e.g. four degrees would not matter because of cold nights or seasonal flux sounds like comparing apples and oranges, then selling them as eggs. It is simply a matter of not being in the discussion, which of course, not surprisingly, Bob does not find himself to be in outside a small cocoon. It is surprising that readers, who often display smarts and the ability to boldly critique, are not opposing it a bit more. Perhaps it is time to clean up the nonsense among the climate realists? But it is not happening much, and that is worrying.

Dee
December 3, 2018 11:05 pm

To make matters worse, NOAA produces an annual global report which contains national weather “highlights” from different countries, presumably designed to drive home the message about “extreme weather” occurring globally.

The only thing is there is no context to this list of national highlights because NOAA rather awkwardly points out that

“Please note that different countries report anomalies with respect to different base periods.”

So the point of listing “extremes” from various countries, to drive home the message that every country is being affected simultaneously and in the same way by climate change, does not stand up when each country listed is being compared to a different base period.

Basically they are page-filling with irrelevant nonsense from different countries.

But, they do say that the 20th century global average is 15.5°C (59.9°F).

https://www.ncdc.noaa.gov/sotc/global/201806

Which leads to the following conundrum.

Moving to NASA, they explain their maps:

“They depict how much various regions of the world have warmed or cooled when compared with a base period of 1951-1980. (The global mean surface air temperature for that period was estimated to be 14°C (57°F), with an uncertainty of several tenths of a degree.) In other words, the maps show how much warmer or colder a region is compared to the norm for that region from 1951-1980.”

https://earthobservato…f-change/DecadalTemp

Do you see what I’m getting at?
NASA is claiming that 30 years at an average 14°C is a suitable baseline out of an observed 100 years at an average of 15.5°C calculated by NOAA.

If that’s the average for a hundred years, why is a cooler baseline of 30 years chosen if most years were warmer?

NASA explain that their baseline reflects normal or average temperature, and anyway, it starts when GISS started up and lots of people grew up during that period, which, frankly, are just silly reasons.

Geoff Sherrington
December 4, 2018 1:10 am

This whole habit of climate people to subtract a 30-year average from long term trends is wrong for a number of reasons.
One justification for it is to account for the altitude differences of stations being compared. This lapse rate for aviation is from Wiki … “As an average, the International Civil Aviation Organization (ICAO) defines an international standard atmosphere (ISA) with a temperature lapse rate of 6.49 K/km[15] (3.56 °F or 1.98 °C/1,000 ft) from sea level to 11 km (36,090 ft or 6.8 mi).”
However, there is a dry adiabat, a moist adiabat and an environmental adiabat, each with a different rate of T change with altitude. Therefore, it is hard to correct your run-of-the-mill temperatures for lapse rate because there is a choice. Those who promote the anomaly method rely on it to do some sort of correction for lapse rate, but it must carry a huge error that is usually not disclosed.
Instead, promoters like Nick Stokes show that their mathematical standard deviation from anomaly data is less than that of absolute data, up to 7 times smaller, with the uncorrected and un-caveated implication that the SD is lower because the uncertainty is less. It is not. Sadly, the gross uncertainty is the same, because those early thermometer readings have large errors that are NOT reducible by choosing to use the anomaly method. For example, there is an error involved in converting from older F to modern C scales, and this error is unaffected by adoption of the anomaly method, even though math calculations on anomalies might show an improvement. That is errors and numbers at work for the anomaly-lovers, not the physical reality.
When you start with absolute or raw temperature data and convert it to anomaly data by subtracting a reference period of 30 years or whatever, you are doing a sort of transform. Sometimes, it is convenient for data to undergo a transform, such as taking logarithms to compress data into more readable graphs. I posted to Nick Stokes on WUWT that you can reduce the SD of a string of temperatures by taking their square roots, another transform. In each case some convenience might arise, the SD might become smaller, but a trap has been sprung. The trap is that one cannot attempt to do the same statistics on the untransformed set as the transformed set. The results, like SD, mean different things once transformed. So it is with the anomaly method. You can do standard stats on anomaly data IF you are wise enough to know that some types of stats are going to cause trouble. Nick has demonstrated this with his artificial lowering of the SD.
Try as I can, I see no compelling reason to use anomaly data. There is another complication. Some data collectors like GISS adjust big lumps of data from time to time for reasons that seem plausible to them. Suppose they change historic data monthly. This means that there is a chance that the reference data (the 30-year part subtracted from the rest) will also change from month to month even though the same 30-year normal is used. So will all of the numbers. Picture the poor climate researcher who has just submitted a paper, only to find that the numbers changed last month.
People wiser than me like to use original data as much as possible, so that they can make their own changes to suit their research purposes, without having to unwind all of the adjustments previously made by the data collectors and their uncontrollable habit of wanting to change numbers as a passion. I had some emails with Phil Jones of UEA in the early 1990s about data used to claim global warming trends in Australia and China. He implied (though later implied the reverse) that he had lost the original absolute data and had kept only anomaly data. Life gets complicated, don’t it? Geoff

http://joannenova.com.au/?s=that+email+phil+jones+sherrington

Reply to  Geoff Sherrington
December 4, 2018 2:21 am

Geoff,
“Therefore, it is hard to correct your run of the mill temperatures for lapse rate because there is a choice. “
With anomalies you don’t need to think about lapse rate or any such causes (although I think BEST does). You just subtract a past average for that location. The altitude effect doesn’t change, so it is subtracted out.

“because those early thermometer readings have large errors”
Yet again, anomaly does not reduce instrument error. It reduces sampling error.

“reduce the SD of a string of temperatures by taking their square roots, another transform”
Anomaly is a linear operation.

“Try as I can, I see no compelling reason to use anomaly data.”
There is nothing else. Check the WUWT temperature page. For global average, it is all anomaly. UAH, RSS all anomaly. Try to imagine what an absolute average might mean. Most days at WUWT there is a discussion of some temperature anomaly plot. People seem to find that quite meaningful.
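The linearity point in dispute here can be illustrated with a toy record (all numbers invented): subtracting a per-month climatology shifts each calendar month by a constant, so year-on-year differences survive exactly, whereas a nonlinear transform such as a square root also shrinks the spread but distorts those differences.

```python
import math
from statistics import pstdev

# Toy record (numbers invented): one seasonal cycle, then the same
# cycle 0.5 C warmer the following year.
seasonal = [5, 7, 12, 18, 24, 28, 30, 29, 24, 17, 10, 6]
temps = [float(t) for t in seasonal] + [t + 0.5 for t in seasonal]

# Anomaly: subtract the per-calendar-month mean. This is an affine
# shift per month, so year-on-year differences are preserved exactly.
clim = [(temps[m] + temps[m + 12]) / 2 for m in range(12)]
anom = [t - clim[i % 12] for i, t in enumerate(temps)]
assert all(abs((anom[m + 12] - anom[m]) - 0.5) < 1e-9 for m in range(12))

# A nonlinear transform such as a square root also shrinks the spread,
# but it distorts the differences instead of preserving them.
roots = [math.sqrt(t) for t in temps]
assert pstdev(anom) < pstdev(roots) < pstdev(temps)
```

The anomaly SD is small because the seasonal cycle has been removed, not because differences have been altered; the square-root series shrinks the SD too, but the 0.5 C year-on-year signal no longer survives intact.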

Reply to  Nick Stokes
December 4, 2018 9:49 am

“Yet again, anomaly does not reduce instrument error. It reduces sampling error.”

Yet I never see any instrument errors quoted with anomaly data or anomaly graphs. Is it because the errors would dwarf the size of the anomalies?

Reply to  Jim Gorman
December 4, 2018 1:53 pm

“Is it because the errors would dwarf the size of the anomalies?”
No, it is because sampling error dwarfs the effect of instrument error on the average.

Michael Jankowski
Reply to  Nick Stokes
December 4, 2018 5:06 pm

“…There is nothing else. Check the WUWT temperature page. For global average, it is all anomaly. UAH, RSS all anomaly…”

NCDC/NOAA used to calculate a “global average.” See this from 1997, for example. https://www.ncdc.noaa.gov/sotc/global/199713

Jones et al did as recently as 1999 https://agupubs.onlinelibrary.wiley.com/doi/10.1029/1999RG900002

How did the best and brightest climate scientists fail to grasp such a simple concept less than two decades ago?

Reply to  Michael Jankowski
December 4, 2018 7:35 pm

“How did the best and brightest climate scientists fail to grasp…”
This century, they have mostly figured it out, and follow their own advice. It is just a dumb thing to do. Well, Jones was OK; he was just doing the best he could to integrate absolute temperature. But, as almost everyone since recognises, it isn’t very good, and adding an estimate of average absolute temp to the average anomaly is just dumb. Almost everyone – occasionally someone at NOAA lapses, and trouble ensues.

It actually isn’t so bad for regions like China. On a smaller scale the inhomogeneity of absolute temperature is more manageable, and the average starts to become a little bit meaningful. But only a little bit. Does anyone remember what the average temperature of China was? Don’t peek.

1sky1
Reply to  Geoff Sherrington
December 4, 2018 3:43 pm

Try as I can, I see no compelling reason to use anomaly data.

There is, of course, no compelling scientific reason to resort to anomalies if one is dealing with intact records uniformly covering a long period of observation. Indeed, the entire thermodynamic significance of temperature is lost thereby.

The sole reason for using them is to reduce the purely statistical effect of temporal and spatial gaps in data coverage. Anomalies relative to any reasonable base period vary spatially far less, even in a small region, than absolute temperatures. Thus the drop-out of data at various locations has a smaller impact upon their spatial average.

The onerous aspect of all this is the misbegotten notion that the temperature field is far more homogeneous than it actually is. Thus it seems to negate the basic requirement that a uniformly fixed set of locations needs to be measured over a long period of time to detect any secular change in global average temperatures. In reality, far from being superior in any scientific respect, average anomalies from a constantly changing set of locations simply mask the fact that the data for scientifically reliable such detection are not available.
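The drop-out effect described above can be illustrated with invented numbers: two nearby stations at different altitudes, both warming by the same amount, where losing one station wrecks the absolute average but barely touches the anomaly average.

```python
from statistics import mean

# Invented numbers: a valley station and a mountain station, both
# warming by the same 0.5 C between a base period and now.
base = {"valley": 15.0, "mountain": 5.0}
now = {"valley": 15.5, "mountain": 5.5}

# Absolute spatial average: if the mountain station drops out, the
# average jumps from 10.0 to 15.5 -- a spurious 5.5 C "warming".
avg_base = mean(base.values())     # 10.0
avg_now_dropout = now["valley"]    # 15.5, mountain record missing

# Anomaly spatial average: each station relative to its own baseline,
# so the drop-out leaves the answer essentially unchanged.
anoms = {s: now[s] - base[s] for s in now}
anom_avg_dropout = anoms["valley"]  # 0.5, same as the full average
```

This is the statistical convenience being weighed in the comment above: the anomaly average is robust to a changing station set precisely because the large, fixed spatial differences have been subtracted out.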

old construction worker
December 4, 2018 1:54 am

‘In a stable climate, these two purposes can both be served by a common reference period. However, as discussed in The Role of Climatological Normals in a Changing Climate (WMO, 2007),’ Question: What is the definition of a “stable climate”?

Dee
Reply to  old construction worker
December 4, 2018 6:51 am

A much, much warmer one (+5 to 10°C) was considered as “normal” by the WMO in its First World Climate Conference report:

“We live in an abnormal phase of a planetary climate that in most epochs permitted a largely ice-free surface.

Nothing in the record suggests that we are about to climb back to the normal condition which may well be 5 to 10 deg C warmer than present conditions.”

The expert opinion was that ice on earth is abnormal and that the earth’s normal climate is much warmer.

Contrast that with the current global warming propaganda which claims that if human activities weren’t warming the planet it would be naturally cooling.

Which leaves us with the following knot to untangle:

A) In 1979 the WMO claimed that the planet was in an abnormally cool phase and is up to 10°C cooler than what is normal for the planet.

B) Alarmists are now saying that it is abnormal for the planet to have warmed by a barely detectable amount since the last little ice age, and that it should actually be cooling. See skepticalscience for further information on that one……

C) Alarmists refute any suggestions that global cooling was predicted in the 70s, also from skepticalscience……..

D) This must mean that the scientific consensus in the 70s was that the planet should be much warmer because the alarmists say there was no consensus in the 70s that a new ice age was about to happen.

E) Now that it has warmed by almost 1.5° since 1750, the alarmists complain it should have been still cooling or that temperatures should have been static.

Seems you can make any AGW claim and be on to a winner.

The PDF file is large, and can be downloaded here at the WMO site

https://library.wmo.int/pmb_ged/wmo_537_en.pdf

Dee
Reply to  Bob Tisdale
December 4, 2018 11:43 am

You’re welcome Bob.

There’s a lot in it.

That conference has been described as the only “scientific” one, as opposed to the present-day UN equivalents, which are aimed at attracting “political” and “social” “scientists” who are more interested in deciding how best to rapidly impose unwanted and unprecedented changes upon every aspect of society, by rapidly transitioning off of fossil fuels to implement their #climatejustice agenda, than in improving their scientific understanding and knowledge.

Reply to  Dee
December 4, 2018 12:01 pm

“In 1979 the WMO claimed that the planet was in an abnormally cool phase… “
The WMO did not claim that. They staged a conference at which many scientists said many things, as they usually do. In this case, the words were in a paper by F Kenneth Hare.

But as usual, you need context. Dr Hare continued:
“Man’s entire evolution as a species took place during this cool phase of Earth history. The cooling began in early Tertiary times (about 50 million years back) culminating in the series of glacial and interglacial epochs of Quaternary times, from which we have not emerged. Our history as a labour-dividing, civilized society has taken place within one single interglacial epoch, the past 10 000 years. Though one cannot say deterministically that we are the product of this anomaly, it has profoundly influenced our behaviour, our economy, and perhaps even our physique. “

Dee
Reply to  Nick Stokes
December 4, 2018 3:45 pm

Nick?

Seriously?

The claim was published by the WMO. Therefore, saying the WMO didn’t claim it doesn’t make sense, unless you’re telling me that we have to treat the IPCC reports the same way?

That the opinions contained within IPCC reports are the opinions of their respective authors and that publication by the IPCC is not to be taken as an endorsement of those views?

Well, maybe you have a point, given that this is what the IPCC has previously “claimed”:

“In climate research and modelling, we should recognise that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.”

“The fact that the global mean temperature has increased since the late 19th century and that other trends have been observed does not necessarily mean that an anthropogenic effect on the climate system has been identified.

Climate has always varied on all time-scales, so the observed change may be natural. A more detailed analysis is required to provide evidence of a human impact.”

“No best estimate for equilibrium climate sensitivity can now be given because of lack of agreement on values across assessed lines of evidence and studies.”

Whoever wrote those paragraphs for the reports must be the most popular scientists at the IPCC get-togethers…..

Joking aside, it is not very different to what Dr. F Kenneth Hare and the WMO were concluding about Climatic Variability, (he being an Editor of the WMO Proceedings), back in ’79:

“Conclusion:

No simple conclusion emerges from this review of climatic variation and variability over the ages – not even from the most recent epochs, when abundant information is available.”

Apart from the following of course:

“Climate is a topic that lends itself to facile generalization and to warnings of impending disaster.

Such liberties are perhaps best avoided.”

Reply to  Dee
December 4, 2018 3:55 pm

“That the opinions contained within IPCC reports are the opinions of their respective authors”
IPCC reports are the product of much discussion. They cycle through drafts, have plenary sessions etc, all to decide on a text they can agree to.

When an organisation like WMO stages a conference, they do not usually vet the authors’ papers for conformance to WMO thinking. They probably just have an abstract refereed (by someone outside WMO) to make sure the work is of an adequate scientific standard. The author just turns up with his paper (at least, so it would have been in 1979).

But anyway, nothing Dr Hare said is in any way controversial. The WMO Conference did make a declaration, which I have set out in the next comment.

Reply to  Dee
December 4, 2018 2:31 pm

Although individual papers in a Conference are not statements by the WMO, the 1979 Conference did make a Declaration, headed “An Appeal to Nations”. It included:

“Nevertheless, we can say with some confidence that the burning of fossil fuels, deforestation, and changes of land use have increased the amount of carbon dioxide in the atmosphere by about 15 per cent during the last century and it is at present increasing by about 0.4 per cent per year. It is likely that an increase will continue in the future. Carbon dioxide plays a fundamental role in determining the temperature of the earth’s atmosphere, and it appears plausible that an increased amount of carbon dioxide in the atmosphere can contribute to a gradual warming of the lower atmosphere, especially at high latitudes. Patterns of change would be likely to affect the distribution of temperature, rainfall and other meteorological parameters, but the details of the changes are still poorly understood.

It is possible that some effects on a regional and global scale may be detectable before the end of this century and become significant before the middle of the next century. This time scale is similar to that required to redirect, if necessary, the operation of many aspects of the world economy, including agriculture and the production of energy. Since changes in climate may prove to be beneficial in some parts of the world and adverse in others, significant social and technological readjustments may be required.”

Ice Age scare?

Geoff Sherrington
Reply to  Nick Stokes
December 4, 2018 6:00 pm

Nick,
Where in the words you quote is any proof that “Carbon dioxide plays a fundamental role in determining the temperature of the earth’s atmosphere, and it appears plausible that an increased amount of carbon dioxide in the atmosphere can contribute to a gradual warming of the lower atmosphere, especially at high latitudes.”
Yes, I know I go on about this, but it is absolutely fundamental to the whole exercise to be able to demonstrate, not in the lab Arrhenius rudimentary style, but in the real atmosphere, that there are no compensatory effects to balance out the alleged GHG effect. You might ask for a suggested mechanism and I might answer ‘variations in cloud cover and timing’ but neither of us would be the wiser because more cloud work needs to be done.
It is really, really frustrating to see, as a scientist who dealt with hard numbers for a career, the use of guesses as inputs to outcomes now being used to try to shape the energy future of the world. To see the inability to measure a climate sensitivity, or to prove that it is not zero.

Why, with your background, are you not active in calling out other researchers who are using guesses as if they were data to support the Establishment line? Geoff.

Reply to  Geoff Sherrington
December 4, 2018 8:28 pm

Geoff,
“Where in the words you quote is any proof…”
Well, look at what Dr Hare actually said in 1979:
“Carbon dioxide plays a fundamental role in determining the temperature of the earth’s atmosphere”
Yes, it does. We know that going back to Tyndall. CO2 absorbs and emits in the IR range that carries a lot of outgoing energy through the atmosphere. And you can see a big bite in the outgoing spectrum in that IR range. Whatever else that is, it’s a major role.
“it appears plausible that an increased amount of carbon dioxide in the atmosphere can contribute to a gradual warming of the lower atmosphere”
Well, he’s not claiming proof. But many would. You seem to be demanding proof in situ, not in lab. I wonder if you demanded that uranium chemistry had to be all re-demonstrated in the ground, not just lab? It is, of course, a big thing to do an experiment on the atmosphere. But we’re doing one now. The prediction was that adding CO2 would cause warming. So we did it, and it warmed.

Dr Deanster
December 4, 2018 6:13 am

Excellent graph, Bob (the one showing the monthly anomalies for China vs the global anomaly). It would be cool to post graphs for several diverse areas, …. like the US, Canada, Brazil, Russia, Australia, Antarctica, …. and throw in the northern Pacific and southern Pacific for good measure, … just to hammer down the point that there is no such thing as “global” warming. The “global” meme is not relevant to each region.

As I’ve said all along, the “global” meme is driven by the made up Arctic anomaly, … and that anomaly is still well below freezing.

Crispin in Waterloo
December 4, 2018 6:22 am

Bob Tisdale

Some interesting news for you from the University of Waterloo Weather Station:
http://uwweatherstation.blogspot.com/

They have a monthly email with an analysis of the past month’s weather, comparing it with the history and it has the highest/lowest and so on as one might expect. The coordinator’s name is Frank Seglenieks.
weather@uwaterloo.ca

They had (emphasize had) a thin line plot of the “average temperature” for the most recent month based on the 1981-2010 average on their charts to show how the past month compared with the historical “average”.

I wrote to Frank and suggested that while it was interesting to know the numerical (daily) average for the month, it was difficult to know whether the temperature or rainfall “just experienced” was in any way unusual.

As you know, the CBC is touting every “above average” daytime temperature as some sort of catastrophe and “proof of global warming” while ignoring such comparisons for “below average” days and bloody cold nights. So be it. I decided to try something new as Frank doesn’t have an axe to grind.

So I wrote the following to him:

+++++++++++

I received your recent University of Waterloo Weather Station Monthly Summary – August 2018 with interest as always. I have a suggestion you might consider, though I realise it will take time to implement.

One of the things that is never clear about the weather is how “today” fits into the long history of previous measurements. As someone who frequently reports “numbers” from measurements it is distinctly odd that weather is compared with an “average” with no indication of what “average weather” is really like. In recent years there has been an obsession with numbers with high precision and no communication of the uncertainty and how the number fits into the greater scheme of things.

The average expectable weather in Waterloo is surely better described by a Mean and, for example, a Sigma 1 range encompassing 68% of the readings for a date, month or year. I give a common example: the long term average temperature (timeline unstated) for August is 24.8°C but there may never have been any August in which that was the average temperature. You get my point? The normal or typical temperature range could be reported, together with the average calculated for the current year. In that way the readership can automatically place the current values in a context that is meaningful to Waterloo.

If the “normal” range (defined someone [sic] in a footnote) is 24.8°C ±2.5°C with a P value it falls into line with health statistics reporting, for just one example, and the context is automatically included.

If one is very good with numbers, the P value might be obtained by using the other information you provided such as “the last time” the average was that value and so on, but it is not really how measurements and averages are correctly reported.

I would far rather read,

Average Daily High Temperature 27.1°C – Normal range 22.3°C to 28.3°C (σ1)

than

Average Daily High Temperature 27.1°C (Long term average 24.8°C)

for the simple reason that I have no idea from the latter what the normal range is for August. Similarly, being below average doesn’t mean anything much unless it is well below the average (normal) range.

What do you think?

+++++++++
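Crispin’s suggested reporting format is straightforward to compute. Here is a minimal sketch using made-up daily-high temperatures (illustrative numbers only, not actual Waterloo data):

```python
import statistics

# Hypothetical daily-high temperatures for one month (made-up numbers,
# not actual Waterloo observations)
daily_highs = [24.1, 26.3, 27.8, 23.5, 25.0, 28.2, 26.7, 24.9,
               22.8, 25.6, 27.1, 26.0, 24.4, 25.9, 27.5, 23.9]

mean = statistics.mean(daily_highs)
sigma = statistics.stdev(daily_highs)  # sample standard deviation

# Report the mean with a 1-sigma "normal range" covering roughly 68%
# of readings, instead of a bare long-term average
low, high = mean - sigma, mean + sigma
print(f"Average daily high {mean:.1f} degC - "
      f"normal range {low:.1f} degC to {high:.1f} degC (sigma 1)")
```

This is the “Normal range 22.3°C to 28.3°C (σ1)” style of report Crispin asked for, rather than a single long-term average with no context.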

To my delight, I received the November email with the “average line” replaced by a bright green broad line that was clearly based on the historical range. There is no explanation of what that range is, no Sigma number or explanation, but I think this is a victory for weather science. Instead of exclaiming that every single day is somehow an aberration, too high or too low compared with what we deserve from an unchanging climate, it starts the train of science rolling.

Not only that, it has the upper and lower historical limits – again, I do not know for what period (I understand the station is more than 100 years old).

So, full compliments to Frank and the University of Waterloo Weather Station.

The long term average is only useful to me if it is accompanied by a Sigma 1 or 2 range within which we expect the daily weather to probably appear.

It happens that the last half of October and all of November was 2 degrees C below average and well outside the “normal range” if you want to see it that way. That means something. That has value and impact. Being 0.2 degrees C low means nothing at all, literally, because it is well within the prevailing pattern.

If you subscribe to the email, you will see at least one bit of climate common sense on your screen. The rainfall for November was uplifted by two really wet days at the beginning of the month. Minus that, the rest was “average”. It is interesting to see that the average range for rainfall in the middle of November is much more variable than the beginning and end.

For those who have time to view the station, please write to Frank and compliment him on his effective weather science communication efforts.

DWR54
December 4, 2018 6:29 am

Everyone seems to agree that there’s effectively no difference between anomalies and ‘absolute’ temperatures when it comes to linear regression trend estimates. Regression can also be used to estimate total temperature change over time. Again, this will be virtually the same whether you use anomalies or ‘absolute’ temperatures.
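That invariance is easy to demonstrate with synthetic data: subtracting a constant baseline cannot change the slope of a linear fit. The series below is made up for illustration, not actual Chinese data:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2013)

# Synthetic 'absolute' annual temperatures: ~0.011 degC/yr trend plus noise
absolute = 6.0 + 0.011 * (years - 1900) + rng.normal(0, 0.3, years.size)

# Anomalies: the same data minus a fixed 30-year baseline mean
baseline = absolute[(years >= 1961) & (years <= 1990)].mean()
anomaly = absolute - baseline

slope_abs = np.polyfit(years, absolute, 1)[0]
slope_anom = np.polyfit(years, anomaly, 1)[0]

# Subtracting a constant only moves the intercept, never the slope
assert abs(slope_abs - slope_anom) < 1e-8
```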

In Bob’s example he uses Chinese data from Jan 1900 to Aug 2013. Stopping in August weights the ‘absolute’ data slightly warm, so figures quoted here are based on ‘absolute’ annual values 1900-2012. The rate of change over this period was 0.11 C per decade, which amounts to a total linear increase of 1.2C between 1900 and 2012.

Of course there are temperature variations between individual years, but what this is saying is that *on average*, annual temperatures in China lately are consistently around 1.2C warmer than they were around the early 1900s. To illustrate this: the average annual temperature in China over the 10 years starting in 1900 was 6.3C. Over the 10 years ending in 2012, the average annual temperature in China was 7.6C. That’s roughly a 20% increase in annual average temperature in just over a century.

Set in that context, I think it is likely that the long term temperature increase will have had some noticeable effects on China, over and above the normal annual variations of winter/summer extremes.

Philo
Reply to  DWR54
December 4, 2018 9:31 am

DWR54: ” That’s roughly a 20% increase in annual average temperature in just over a century.
Set in that context, I think it is likely that the long term temperature increase will have had some noticeable effects on China, over and above the normal annual variations of winter/summer extremes.”

The numbers you give are typical of a weather forecast. People want to know how the forecast compares with what was usual over the last few years (see the previous post by Crispin in Waterloo, December 4, 2018 at 6:22 am). In this case the 20% means nothing.

The weather/climate change is a result of the heat machine of the ocean/air/sun. What happens depends on energy transfer, not temperature. T is only one component. ∆T (delta T, or change in T) is what affects the climate and whether or not the weather changes are large or small. ∆T in K (change in degrees K) has to be used because that is what drives energy transfer. The climate system has tremendous power because it has a huge mass, mainly water. But even the mass of the atmosphere is huge, so small changes in temperature can have big effects such as hurricanes. Or you can think of what might happen if all the energy from the sun during 24 hrs was focused on 1 sq. meter for 1 second. The energy of an atomic bomb.

DWR54
Reply to  Philo
December 4, 2018 11:33 am

Philo

The numbers you give are typical of a weather forecast.

I disagree, Philo. The numbers are averages of annual temperatures from the same region over a 112 year period, with the opposite ends of that period described in terms of 10-year averages. There is no forecasting and it’s quite clear that there has been substantial warming in China over that time.

The weather/climate change is a result of the heat machine of the ocean/air/sun.

Not to mention greenhouse gases.

What happens depends on energy transfer, not temperature.

Indeed; but temperature change is a consequence of energy transfer.

michael hart
December 4, 2018 6:40 am

Similarly, why do they have so many, mostly awful, models in IPCC predictions, sorry, projections?
It allows them to select ones to make a case for many desired pre-determined endpoints: High enough to be scary, and low enough to be closer to reality.

DWR54
December 4, 2018 6:40 am

Re: ”That’s roughly a 20% increase in annual average temperature in just over a century.” Badly put. I mean relative to zero on the Celsius scale of course, not Kelvin. I’m not suggesting that China is 20% warmer than it was in 1900! It’s just that annual average temperatures are 1.2K warmer; but that’s still a considerable amount and it has got to be evident somewhere.

Geoff Sherrington
Reply to  DWR54
December 4, 2018 6:06 pm

DWR54,
Pleased you corrected that. I was just about to comment on its invalidity.
What is the basis for your claim that a 1.2K rise in annual average temperature (comparing 1900 with now) is a considerable amount?
To me it is so tiny that it can be neglected with impunity.
Have you been ‘persuaded’ by messages that it is considerable? Opinions, beliefs, guesses … the real test is what 1.2K change can do to life around us and on that I see immense confusion and little hard output of any value. Goodness, society cannot even decide if it is overall a benefit or a threat. Geoff

crosspatch
December 4, 2018 8:40 am

I would prefer to see what it looks like with a 60 year base period. This would capture an entire PDO cycle which at least for US climate is important. For example, 1926 to 1985 or 1936 to 1995.

mptc
December 4, 2018 11:46 am

A.G.W. is a ruse to get the people to buy (literally) into a carbon tax. Why? Because all the world governments are so in debt that the only way to stop the hemorrhaging is to institute a tax on every single man, woman and child on the face of the planet. And I’m skeptical that such a tax will even stop it. The governments of the world do not know how to cut spending. We the people continue to elect those that promise us the most out of the treasuries of the world governments (to paraphrase John Adams). The currencies of the world are not worth the paper they are printed on. So the proponents of A.G.W. should be honest with us and themselves.

Wally
December 4, 2018 1:00 pm

Do the reference standards have an error bar? I.e., +/- 0.5?

December 4, 2018 2:56 pm

Mr Layman here.
I don’t have the skills or access to data to do so, but do wonder what would be shown by changing the “30 year average” standard.
Why not a 60 year average as the standard? Or a “rolling” 60 year average? (Always the last 60 years rather than picking a past 60 year span.)
Or a 100 years?
And just what is the average of ALL the actual temperature measurements we have going back as far as they go?

Reply to  Gunga Din
December 4, 2018 3:23 pm

“do wonder what would be shown by changing the “30 year average” standard”
Nothing new. NOAA offers variously 1981-2010 or 1901-2000, mostly the latter. The adjustment doesn’t tell any detail about the history; it’s just a single number, or 12, one for each month. I’ve tabulated those numbers here.
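Nick’s point, that changing the base period only shifts the whole anomaly series by a single constant (or 12 monthly constants), can be sketched with a synthetic annual series:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1900, 2019)

# Synthetic annual temperatures: trend plus noise (illustrative only)
temps = 6.0 + 0.01 * (years - 1900) + rng.normal(0, 0.4, years.size)

# Anomalies against two different base periods
base_a = temps[(years >= 1981) & (years <= 2010)].mean()  # 1981-2010
base_b = temps[(years >= 1901) & (years <= 2000)].mean()  # 1901-2000
anom_a = temps - base_a
anom_b = temps - base_b

# The two anomaly series differ by one constant everywhere;
# no detail of the history changes
diff = anom_b - anom_a
assert np.allclose(diff, base_a - base_b)
```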

Reply to  Nick Stokes
December 5, 2018 1:22 pm

Thanks, Nick, but I wasn’t talking about using different 30 year periods but, rather 60 or 100 year periods.
As I understand it, 30 years was chosen as the standard back when we only had about 30 years of instrument records.

Reply to  Gunga Din
December 5, 2018 1:43 pm

1901-2000 is a 100 year period.

Geoff Sherrington
December 4, 2018 6:12 pm

Here is another problem with the anomaly method.
We often use ‘relative standard deviation’ to compare the variability of number strings. If a temperature series from the Tropics is expressed in degrees C and compared with another near the Poles, they are hard to compare because their means are different. Just a number thing, not a weather/climate issue.
So, we divide each standard deviation by the mean to level up the comparison.
Now, consider that we did not subtract a 30-year period from each data string. Suppose we subtracted the mean of all observations at a station. Our residual would have a mean of zero. Then, to calculate our RSD, we are faced with a division by zero in every case, so we cannot do it.
The larger point is that the length of the period subtracted has an influence on some forms of math or stats that might follow. Geoff.

Reply to  Geoff Sherrington
December 4, 2018 9:23 pm

Geoff,
“So, we divide each standard deviation by the mean to level up the comparison.”
No. You might subtract the mean. In any case (see your comment to DWR) you should never divide by a temperature in °C. It has an arbitrary offset. The result is supposed to be unitless, yet if you did the calc in F you would get a quite different result. If the temp happened to be 0°F you’d get a crazy result. But there is nothing special about 0F. The same is true for your RSD stuff.

You’ve then said how that happens with anomaly. But it happens with temperature on any scale, except K. Don’t do it.
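Nick’s objection is easy to check numerically: the same station data expressed in °C and °F give different “relative standard deviations”, so the ratio is not the scale-free quantity it appears to be. The monthly means below are made up for illustration:

```python
import statistics

# Hypothetical monthly mean temperatures for one station, degrees Celsius
temps_c = [14.2, 15.8, 18.1, 21.0, 24.3, 26.9,
           27.5, 26.8, 24.0, 20.1, 16.5, 14.0]
temps_f = [t * 9 / 5 + 32 for t in temps_c]  # identical data in Fahrenheit

def rsd(values):
    """Relative standard deviation: standard deviation divided by the mean."""
    return statistics.stdev(values) / statistics.mean(values)

rsd_c = rsd(temps_c)
rsd_f = rsd(temps_f)

# A genuinely unitless ratio would be the same on both scales; these differ,
# because the Celsius and Fahrenheit zero points are arbitrary offsets
assert rsd_c != rsd_f
```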

Geoff Sherrington
Reply to  Nick Stokes
December 4, 2018 9:47 pm

The demonstration showed that the anomaly method can render some traditional approaches unworkable or erroneous. I have seen far more data in non-climate fields using relative standard deviation than using this ‘anomaly’ method.
There have been many recommendations that climate research would benefit from the close involvement of statisticians. If that recommendation had been taken up, maybe we would now have fewer home made devices like this anomaly method, invented for climate work.
Nick, how does the industry cope with the about-monthly changes to the GISS record that can produce a new normal for each adjusted station each month? What is done by authors who base a paper on a superseded set?
Geoff

Reply to  Geoff Sherrington
December 4, 2018 10:39 pm

Geoff,
“maybe we would now have fewer home made devices like this anomaly method, invented for climate work”
There is nothing unfamiliar to statisticians in subtracting a mean. There is even a word for it – would you believe – demean.

Coping with GISS changes? The effect is tiny. That is for the adjustments in toto; the effect of the changes is even less. Far less than for satellite averages.

Paramenter
December 5, 2018 11:00 am

Most, but not all, climate-related data are referenced to 30-year periods. Presently the “climatological standard normals” period is 1981-2010.

I assumed that anomalies are calculated based on a moving 30-year average. So for the 2017 anomaly the procedure would be:

1. Calculate the 1987-2016 average (baseline)
2. Average 2017 temperatures and subtract the baseline from it
3. For 2018 do the same, but the baseline would now be 1988-2017

Looks like I was wrong; for ‘official’ anomaly calculations they use a ‘hardcoded’ timespan.
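The difference between the two schemes can be sketched with a synthetic station series. The function names and numbers below are illustrative only, not any official procedure:

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1951, 2019)

# Synthetic annual temperatures with a warming trend (illustrative only)
temps = 7.0 + 0.015 * (years - 1951) + rng.normal(0, 0.3, years.size)

def fixed_anomaly(year, base=(1981, 2010)):
    """Anomaly against a fixed ('hardcoded') 30-year base period."""
    mask = (years >= base[0]) & (years <= base[1])
    return temps[years == year][0] - temps[mask].mean()

def moving_anomaly(year):
    """Anomaly against the 30 years immediately preceding 'year'."""
    mask = (years >= year - 30) & (years <= year - 1)
    return temps[years == year][0] - temps[mask].mean()

# Under a warming trend the moving baseline itself warms over time,
# so the two definitions generally give different numbers for the same year
print(fixed_anomaly(2018), moving_anomaly(2018))
```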

December 5, 2018 2:08 pm

I followed the link but, unless I missed something, there is just a list of different 30 year periods using different data sets.
Wrong link or am I missing something?

Reply to  Gunga Din
December 5, 2018 4:26 pm

I was referring to NOAA, which uses 1901-2000 as its main display period for time series.

Reply to  Nick Stokes
December 7, 2018 6:56 pm

Nick, I was suggesting that the “normal” be based on something other than 30 years. 60 years would be a good start.
All the “anomalies” seem to be based on a 30 year average. Why not a 60 year average?