What Is A “Normal” Climate?

Guest Opinion: Dr. Tim Ball

There is a form of argument called reductio ad absurdum: if a position can be reduced to absurdity, it was absurd in the first place. It doesn’t work as well as it used to, because there is an embarrassment of absurdity in today’s world. However, some arguments are exposed by such an approach.

Al Gore’s fairy-tale movie, An Inconvenient Truth, claimed global temperature was “just right.” It is as Goldilocks said about the porridge: “Not too hot, not too cold, but just right.” The movie fully deserved its Oscar because it was a fairy tale produced in Hollywood, the land of make-believe. The great Goracle declares we must maintain this normal because the wicked witch CO2 threatens it. From whichever castle he currently occupies, he dictates that we maintain the status quo so he can continue his “normal” lifestyle, including profiting from selling the tale. He delights in referencing people from the past, such as Arrhenius, Callendar, or Roger Revelle, conveniently ignoring that they all lived through different “normals.” He also ignores the “normal” conditions his Ice Age ancestors enjoyed.

Gore wants to maintain his “normal” by reducing the amount of CO2 in the atmosphere to pre-industrial levels. The Intergovernmental Panel on Climate Change (IPCC) says this was 270 ppm. This is incorrect, but let’s assume it is true in order to consider the consequences of achieving that level. Assume also that the IPCC is correct in attributing to CO2 virtually all the increase in global temperatures since the nadir of the Little Ice Age, especially since 1950. The IPCC says current levels are 400 ppm, so presumably achieving pre-industrial levels requires a 130-ppm reduction. According to the science of the IPCC and fellow Nobel winner Al Gore, CO2 levels determine temperature, so this would mean a return to Little Ice Age conditions. A multitude of sources itemizes those conditions, particularly Jean Grove’s The Little Ice Age, a listing at CO2Science.org, and the reports of the Nongovernmental International Panel on Climate Change (NIPCC).

The IPCC and Gore consider only the temperature implications of CO2, but it is also essential to plant life, which in turn determines the oxygen levels essential to all life. How much vegetative loss would occur with a 130-ppm reduction? It is only a computer-model determination, but the abstract of Donohue et al. (pay-walled) explains:

Satellite observations reveal a greening of the globe over recent decades. The role in this greening of the “CO2 fertilization” effect—the enhancement of photosynthesis due to rising CO2 levels—is yet to be established. The direct CO2 effect on vegetation should be most clearly expressed in warm, arid environments where water is the dominant limit to vegetation growth. Using gas exchange theory, we predict that the 14% increase in atmospheric CO2 (1982–2010) led to a 5 to 10% increase in green foliage cover in warm, arid environments. Satellite observations, analyzed to remove the effect of variations in precipitation, show that cover across these environments has increased by 11%. Our results confirm that the anticipated CO2 fertilization effect is occurring alongside ongoing anthropogenic perturbations to the carbon cycle and that the fertilization effect is now a significant land surface process.

Again assuming they are correct, a 14% increase in CO2 resulted in an 11% increase in vegetation. What impact would a reduction of 130 ppm, approximately a 32% decrease, have? What would be the combined impact of reduced CO2 fertilization and reduced temperature? Grove and others showed the impact of temperature reduction, but not of CO2.
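The arithmetic behind these percentages can be checked in a few lines. This is a minimal back-of-envelope sketch using only the figures quoted above (400 ppm current, 270 ppm pre-industrial); it makes no claim about the underlying science.

```python
# Back-of-envelope check of the ppm figures quoted in the text.
current_ppm = 400        # IPCC-quoted current CO2 level
preindustrial_ppm = 270  # IPCC-quoted pre-industrial CO2 level

reduction_ppm = current_ppm - preindustrial_ppm    # absolute reduction required
reduction_pct = reduction_ppm / current_ppm * 100  # as a share of current levels

print(f"Required reduction: {reduction_ppm} ppm ({reduction_pct:.1f}%)")
```

The 130-ppm cut works out to 32.5% of current levels, which is where the “approximately a 32% decrease” in the text comes from.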

In the 1970s, when global cooling was the consensus, Martin Parry produced studies of the impact of cooling over the course of the Little Ice Age (Figure 1).


Figure 1

Figure 1 shows the county of Berwickshire in the Borders Region of the UK, with a high percentage of land lost from cultivation over the period. What was normal for the people living through those times? The answer is whatever they experienced.

The World Meteorological Organization (WMO) introduced the 30-year normal, purportedly to address this problem of what is normal or average for planning and other applications. As they explain,

“The Standard Climate Normals underpin many climate services and applications, including climatologies, and also comprise the reference period for the evaluation of anomalies in climate variability and change monitoring.”

A problem becomes evident when comparing historic records, such as those for the period 1781 to 1810, with a modern normal. Which modern normal would you use, the first one, 1931–1960, or the current one, 1981–2010? William Wright wrote a paper about the problem in which he

“…argued in favour of a dual normals standard. CCl-MG concurred with the conclusion that there is a need for making frequent updates in computing the normals for climate applications (prediction and climatology purposes), based on the need to base fundamental planning decisions on average and extreme climate conditions in non stationary climate conditions.”

And there is the rub, “non stationary climate conditions.”

There is also the problem of the adjustments endemic to all “official” data. The National Oceanic and Atmospheric Administration (NOAA) says,

“Several changes and additions have been incorporated into the 1981-2010 Normals. Monthly temperature and precipitation normals are based on underlying data values that have undergone additional quality control. Monthly temperatures have also been standardized to account for the effects of station moves, changes in instrumentation, etc.”


Presumably this means you cannot compare the results with those of earlier “normals”.

NOAA informs us that

“Normals are a large suite of data products that provide users with many tools to understand typical climate conditions for thousands of locations across the United States.”

No, they aren’t! They are only 30-year averages that add nothing to the understanding of typical climate conditions for any location. Since the 30-year average changes because of mechanisms that operate on longer-than-30-year timescales, they simply tell you the climate for that period. The problem is illustrated by the IPCC’s omission of Milankovitch mechanisms. As reported, Professor Lindzen observed in the recent APS workshop,

He also notes that the IPCC estimate of the man-made effect is about 2 Watts/m2 in AR5 and that is much smaller than the Milankovitch effect of 100 Watts/m2 at 65 degrees north, see Edvardson, et al.

Figure 2 shows the 100 Watts/m2 insolation variability at 65°N calculated by Berger in 1978 and discussed in my article “Important But Little Known ‘Earth’ Scientists.”


Figure 2: Variations in the amount of insolation (incoming solar radiation) at 65°N

Source: BERGER, A. 1978. Long-term variations of daily insolation and Quaternary climatic changes. J. Atmos. Sci. 35: 2362–2367.


While on a radio program, an IPCC modeler told me they omitted Milankovitch because they considered the time scale inappropriate.

Apparently, people are planning and making management decisions on the basis of these “normals.” NOAA reports,

In addition to weather and climate comparisons, Normals are utilized in seemingly countless applications across a variety of sectors. These include: regulation of power companies, energy load forecasting, crop selection and planting times, construction planning, building design, and many others.

They are assuming that these conditions will continue. It reminds me of a presentation by Michael Schlesinger at a conference in Edmonton on the future of climate on the Canadian Prairies. A bureaucrat said, “We are planning reforestation in parts of southern Alberta, and your data shows it as a desert in 50 years. How accurate is your prediction?” Schlesinger said about 50 percent. The bureaucrat replied, “My Minister wants 98 percent.”

It is no better today. Figure 3 is a map of 12-month precipitation forecast accuracy for Canada. It is less than 40% for 95% of Canada when compared against the 30-year normal for 1981–2010.


Figure 3

I understand 30 years was chosen because 30 is considered a statistically significant sample size (n) for any population (N). That is of no value for climate patterns and the mechanisms that create them, which operate over much longer periods. NOAA acknowledges this when they write,

In fact, when the widespread practice of computing Normals commenced in the 1930s, the generally-accepted notion of the climate was that underlying long-term averages of climate time series were constant.

That idea permeated and became the fundamental public understanding that climate is constant, making current changes appear unnatural. What also happened was that the 30-year normal became the average for radio and TV weather people. NOAA confirms this adoption:

Meteorologists and climatologists regularly use Normals for placing recent climate conditions into a historical context. NOAA’s Normals are commonly seen on local weather news segments for comparisons with the day’s weather conditions.

When media meteorologists say a weather variable is above average today, that average is usually only the 30-year one, not the entire weather-station record. This narrows the range and creates a distorted picture of how much climate varies. It enhances the effectiveness of Gore and others claiming that current weather is abnormal. The climate “normal” is now as distorted as Goldilocks Gore’s “normal.” It might work for porridge and climate fairy tales, but it doesn’t work for the actual climate. Claiming that normal is abnormal appears to be an absurdity.
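The point about baselines can be illustrated numerically. The sketch below uses an entirely synthetic, hypothetical 100-year station record (not real data) with a mild built-in trend, and shows how the same day’s reading produces a different “anomaly” depending on whether it is compared against the most recent 30-year normal or the full station record.

```python
import random

random.seed(42)

# Synthetic station record: 100 years of annual mean temperatures (deg C)
# with a mild warming trend plus noise. Purely illustrative, not real data.
record = [10.0 + 0.01 * yr + random.gauss(0, 0.5) for yr in range(100)]

full_record_mean = sum(record) / len(record)  # baseline over the whole record
normal_30yr = sum(record[-30:]) / 30          # "current" 30-year normal

today = record[-1]
print(f"Anomaly vs 30-year normal: {today - normal_30yr:+.2f} C")
print(f"Anomaly vs full record:    {today - full_record_mean:+.2f} C")
```

With any trend in the record, the 30-year normal sits closer to recent values than the full-record mean does, so the anomaly reported against it is smaller in magnitude; which baseline is used changes the picture the viewer gets.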

Harry Passfield
March 18, 2015 2:59 pm

What is ‘normal’? Freud et al would have a field day with Al Gore and his need for ‘normalcy’. For example: To many tribesmen in some African countries, living in a mud hut might be considered normal; Al Gore considers living in a huge mansion as normal. Of course, the thing is, he allows himself the affordable power that helps to define his life-style while denying the tribesman the same privilege. Something else that Freud et al would have fun with.

Bob Mount
March 18, 2015 3:09 pm

Once you change/move the goalposts (or baseline), you can never usefully compare the following results with those made earlier. You have changed the game.

Brian H
March 18, 2015 10:41 pm

As Johnny Horton observed, “When it’s springtime in Alaska, It’s 16-below!”

March 20, 2015 10:36 am

The climate was normal at 1:06pm on June 6, 1964 in New York City’s Central Park, and has not been normal since then.

This fact is based on my model, run by a REALLY BIG COMPUTER, so there is no doubt.

In fact, my confidence level is 105%, so even if I’m off by 5%, we’re still at 100%.

My climate model, by the way, accounts for a little-known but major cause of global warming: Al Gore.

When Mr. Gore gets a lot of face time in the media, as in the 1990’s, the average temperature of Earth goes up.

When Mr. Gore is mainly in the background, spending most of his time at all-you-can-eat buffets, as in the past 15 years, the average temperature goes down.

So, this is proof, beyond a shadow of a doubt, that global warming is caused by Al Gore’s hot air (and you thought his greatest contribution to the world was inventing the internet?)

http://www.elOnionBloggle.blogspot.com

Dawtgtomis (Steve Lochhaas from SIUE)
March 21, 2015 11:34 am

Could the current driver of global climate be considered as the sum of the interaction of all measurable forcings at the present juncture?

Dawtgtomis
Reply to  Dawtgtomis (Steve Lochhaas from SIUE)
March 21, 2015 11:59 am

Some forcings are cyclical and some are random, right? Some might even be induced by the presence of life on this planet. If our power to warm the planet is as immense as some believe, then this might prove to be a future advantage, as the next climate driver might very well cause falling temperatures. What is obtuse is to assume linearity from any single forcing and use it to shape world policy.

Dawtgtomis
Reply to  Dawtgtomis (Steve Lochhaas from SIUE)
March 21, 2015 12:27 pm

Have to strike “measurable” and replace with “known and unknown”