Guest essay by Dr. Tim Ball
The Intergovernmental Panel on Climate Change (IPCC) set climate research back thirty years, mostly by focusing world attention on CO2 and higher temperature. It was a classic misdirection that required planning. The IPCC was created for this purpose and pursued it relentlessly. Through the World Meteorological Organization (WMO) they controlled national weather offices so global climate policies and research funding were similarly directed.
The IPCC’s definition of climate change narrowed the focus to human causes, and they exacerbated the problem by ignoring, downgrading or misusing variables. The most important and critical of these was water in all its forms and functions. The obsession restricted attention to higher temperatures and increased CO2, which directed funding of impact analyses, whether economic or environmental, toward cost only instead of cost/benefit. Climate studies considered only temperature, usually and incorrectly attributing changes caused by precipitation to temperature. This practice was most evident in paleoclimate reconstructions, either done by IPCC participants or chosen for inclusion in the IPCC Reports.
It is almost a maxim that if the people at the Climatic Research Unit (CRU), who effectively controlled IPCC science, were looking at a topic it was because it posed a threat to their predetermined hypothesis.
Tom Wigley took over from Hubert Lamb as Director of the CRU, guided much of the early research, and then remained the major influence, as the leaked emails revealed. He completely redirected the CRU away from Lamb’s objective, which was the need for data before any understanding could occur:
“the first and greatest need was to establish the facts of the past record of the natural climate in times before any side effects of human activities could well be important.”
Lamb was at odds with Wigley, appears to have regretted hiring him, and wrote about the different direction Wigley took the Unit. He wrote,
“Professor Tom Wigley, was chiefly interested in the prospect of world climates being changed as a result of human activities, primarily through the burning of wood, coal, oil and gas reserves…”
That became the focus of the CRU and subsequently of the IPCC. It was a predetermined hypothesis that led to manipulated climate science. The leaked CRU emails disclose Wigley as the éminence grise to whom all his old pupils and colleagues at CRU turn for advice and direction.
A classic danger in climate research, and an early threat to claims of a human signal, was that they could be dismissed as artifacts of auto-correlation. The issue was identified in 1944 in Conrad’s classic Methods in Climatology. A 1999 article, The Autocorrelation Function and Human Influences on Climate by Tsonis and Elsner, commented on Wigley’s attempt to show that an apparent human influence was not merely an artifact of autocorrelation. They note,
This (Wigley’s) result is impressive, and there may indeed be a human influence on climate. However, the use of the autocorrelation function as a tool for such comparisons presents a problem. Climate models, whether forced or unforced, constitute dynamical systems. If these models faithfully represent the dynamics of the climate system, then a comparison between an observation and a model simulation should address whether or not these two results have the same dynamical foundation.
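To make the auto-correlation concern concrete, here is a minimal, illustrative sketch of my own (not from Tsonis and Elsner, and not Wigley’s method): a strongly autocorrelated “red noise” series, with no forcing at all, can show both an apparent trend and an autocorrelation function resembling a slowly varying climate record. All parameter values are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_series(n, phi, sigma):
    """Generate an AR(1) 'red noise' series: x[t] = phi * x[t-1] + noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x

def acf(x, max_lag):
    """Sample autocorrelation function out to max_lag."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom for k in range(max_lag + 1)])

n_years = 150
red = ar1_series(n_years, phi=0.9, sigma=0.1)   # persistent, unforced series
white = rng.normal(0.0, 0.1, n_years)           # uncorrelated noise for comparison

years = np.arange(n_years)
trend_red = np.polyfit(years, red, 1)[0] * 100  # degrees per century, purely from persistence
print(f"Apparent trend of unforced red noise: {trend_red:+.2f} deg/century")
print("ACF (lags 0-5), red noise  :", np.round(acf(red, 5), 2))
print("ACF (lags 0-5), white noise:", np.round(acf(white, 5), 2))
```

The point, as the quote above suggests, is that matching an observed record to a model on the basis of such statistics is only meaningful if the two share the same underlying dynamics.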
In Quantitative approaches in climate change ecology, Brown et al. identify the issues:
We provide a list of issues that need to be addressed to make inferences more defensible, including the consideration of (i) data limitations and the comparability of data sets; (ii) alternative mechanisms for change; (iii) appropriate response variables; (iv) a suitable model for the process under study; (v) temporal autocorrelation; (vi) spatial autocorrelation and patterns; and (vii) the reporting of rates of change. While the focus of our review was marine studies, these suggestions are equally applicable to terrestrial studies. Consideration of these suggestions will help advance global knowledge of climate impacts and understanding of the processes driving ecological change.
The two lead items in Brown et al.’s list for resolving problems of auto-correlation are also central to understanding the corruption and misdirection of the IPCC.
(i) data limitations.
As Lamb identified, lack of data was and remains the most serious limitation. The data are completely inadequate even for temperature, supposedly the best-measured variable. How can two major agencies, HadCRUT and GISS, produce such different results, supposedly from the same data set? Paul Homewood produced the following table comparing results from four data sources for the period 2002 to 2011.
GISS and UAH differ by 0.36°C, which is enormous in nine years. Compare it to the 0.6°C increase over 140 years, a change the 2001 IPCC claimed was dramatic and unnatural.
Data are even worse, spatially and temporally, for water in all its forms, especially precipitation. In a classic understatement, the 2007 IPCC Report says,
Difficulties in the measurement of precipitation remain an area of concern in quantifying the extent to which global- and regional-scale precipitation has changed.
They also concede that,
For models to simulate accurately the seasonally varying pattern of precipitation, they must correctly simulate a number of processes (e.g., evapotranspiration, condensation, transport) that are difficult to evaluate at a global scale.
The lack of data for all other weather variables is even worse than for temperature and precipitation. There is insufficient data to test inferences against auto-correlation.
(ii) alternative mechanisms for change.
Determining mechanisms and their implications is impossible without adequate data. Besides, we don’t understand most mechanisms now, so considering alternatives is difficult. Many mechanisms are identified, but many more remain unknown. Donald Rumsfeld’s quote is appropriate:
“… there are known knowns; there are things we know that we know. There are known unknowns; that is to say, there are things that we now know we don’t know. But there are also unknown unknowns – there are things we do not know we don’t know.”
Contradictions between results from different authorities, such as the temperature data, prove the point. The IPCC bypassed the problems with a limited definition that allowed them to ignore most mechanisms. Often the excuse was quite bizarre, such as this from Chapter 8 of the 2007 report.
Due to the computational cost associated with the requirement of a well-resolved stratosphere, the models employed for the current assessment do not generally include the QBO.
IPCC did what Einstein warned against. “Everything should be made as simple as possible, but not simpler.”
Beyond Auto-correlation?
Autocorrelation is a danger in climatology but what has happened in IPCC goes beyond. In major reconstructions of past climates, temperature series are created from data and processes that are primarily due to precipitation.
Dendroclimatology
Many of these reconstructions began as chronologies. Tree rings began as dendrochronology, an absolute dating method that assumes a new ring is created every year. The great age of the Bristlecone Pine made them valuable for this purpose at least. A. E. Douglass founded the discipline of dendrochronology in 1894 and later used tree rings to reconstruct solar cycles and precipitation; the latter became the purpose of all early climate reconstructions.
Available moisture explains most plant growth, as farmers and gardeners know. Köppen recognized this in his climate classification system, which requires classification first on precipitation (B climates) and then on temperature (A, C, and D climates).
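As an illustration of that ordering, here is a minimal sketch of how a Köppen-style classifier checks the precipitation-based B (arid) criterion before looking at temperature at all. The aridity threshold constants follow one commonly used modern formulation and are an assumption here; exact values vary between authors, and the polar (E) group is omitted for brevity.

```python
def koppen_major_group(mean_annual_temp_c, annual_precip_mm, precip_summer_fraction, coldest_month_c):
    """Return a Koppen major group letter, checking aridity (B) before temperature.

    Threshold constants follow one common modern formulation; treat them as
    illustrative assumptions. The polar E group is omitted for brevity.
    """
    # Aridity threshold depends on when the rain falls (summer- vs winter-concentrated).
    if precip_summer_fraction >= 0.7:
        p_threshold = 10 * (2 * mean_annual_temp_c + 28)
    elif precip_summer_fraction <= 0.3:
        p_threshold = 10 * (2 * mean_annual_temp_c)
    else:
        p_threshold = 10 * (2 * mean_annual_temp_c + 14)

    # Precipitation test first: B climates are defined by moisture, not warmth.
    if annual_precip_mm < p_threshold:
        return "B"
    # Only then do the temperature-based groups apply.
    if coldest_month_c >= 18:
        return "A"
    if coldest_month_c >= -3:
        return "C"
    return "D"

print(koppen_major_group(18, 250, 0.5, 8))   # dry enough -> "B", regardless of temperature
print(koppen_major_group(18, 900, 0.5, 8))   # wet enough -> temperature decides -> "C"
```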
Gross misuse of tree rings to argue that the Medieval Warm Period (MWP) didn’t exist was exposed as inappropriate statistical manipulation. The conclusion used in the 2001 IPCC Science Report claimed the tree rings (the effect) showed no increase in temperature (the cause). In reality, with climate change there is a change in all weather variables, hence the auto-correlation problem.
The degree of change to each variable is a function of the latitude as major weather mechanisms migrate toward or away from the poles. For example, during the Ice Ages the Polar climate region expanded primarily at the expense of the middle and low latitude climates, particularly in the desert zone, approximately between 15 and 30° latitude. The low latitude deserts become wet regions in what was traditionally called Pluvials. In the early days it was thought there was no evidence of the Ice Age in the tropical region associated with the Hadley Cell circulation.
Moisture is a controlling factor even in harsh temperature conditions at the tree line. Research at Churchill, Manitoba showed the major predictors of growth were rainfall in the Fall of the preceding year and winter snow amount.
The spruce tree in the photo (Figure 1) is at the tree line at Churchill. It is approximately 100 years old. The lower branches are larger and are on all sides because they are protected from desiccating winter winds by snow; above that powerful persistent arid northeast winds prevent branches growing. Local humor says you cut three trees and tie them together for a complete Christmas tree.
Tree growth, especially annual, is primarily about moisture not temperature. The amount of moisture required by the plant and the amount available both vary with wind speed. At the tree line the ability to trap snow is critical to survival. Small clumps or outliers exist beyond the tree limit as long as they trap snow. Similarly, an open area within the tree limit will remain treeless if denuded of snow by the wind.
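A simple way to test the claim that moisture, not temperature, drives ring width is a multiple regression of ring width on both candidate predictors. The sketch below uses synthetic, made-up data in which width is driven mostly by prior-autumn rain and winter snow, purely to show the form of the test; it is not a reconstruction of the Churchill results.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60  # years of synthetic data

# Made-up predictors (standardised units): these are NOT real observations.
autumn_rain = rng.normal(0, 1, n)
winter_snow = rng.normal(0, 1, n)
summer_temp = rng.normal(0, 1, n)

# Synthetic ring width dominated by the moisture terms, with a little noise.
ring_width = 1.0 + 0.6 * autumn_rain + 0.4 * winter_snow + 0.05 * summer_temp + rng.normal(0, 0.1, n)

# Ordinary least squares fit: width ~ rain + snow + temperature.
X = np.column_stack([np.ones(n), autumn_rain, winter_snow, summer_temp])
coef, *_ = np.linalg.lstsq(X, ring_width, rcond=None)
for name, c in zip(["intercept", "autumn rain", "winter snow", "summer temp"], coef):
    print(f"{name:12s}: {c:+.2f}")
```

In a fit like this, a reconstruction that reads ring width as temperature alone would be attributing the moisture coefficients to the wrong variable.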

Speleology (Stalactites/Stalagmites)
Stalactites (ceiling) and stalagmites (ground) are another example of precipitation-created features claimed to represent temperature. They are created by rainwater, a mild carbonic acid because of dissolved atmospheric CO2, which dissolves calcium as it filters through limestone. As the water drips from the ceiling, calcium deposits accumulate to create the stalactite. Where it hits the floor, more calcium accumulates to create a stalagmite. Growth of both features is a direct result of changes in precipitation at the surface.
Glacial Stratigraphy and Ice Cores
Seasonal or annual records in stratigraphic form are collectively called rhythmites. An early use of rhythmites in climate reconstruction was a specific form called varves and related to annual sedimentary layers in proglacial lakes. In 1910 Swedish scientist Gerard de Geer provided an important chronology for glacial sequences of the Holocene. The thickness of the sediment layer is a result of temperature, but also how much rain fell during the summer that changed the melt rate of the snow and ice.
Seasonal layers in a glacier often reflect temperature change, but are also modified by precipitation. Glacier movement is used as a measure of temperature change, but it is also about precipitation change. Thickness of each layer varies with the amount of snow. (Yes, droughts also occur in winter). When sufficient layers form to about 50 m depth the ice becomes plastic under pressure and flows. Ice is always flowing toward the snout within the glacier. Amount of advance or retreat of the glacier snout is as much about snow accumulation above the permanent snowline as temperature. A snout can advance or retreat without a change in temperature.
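A toy mass-balance sketch makes the point that a snout can advance with no temperature change at all, simply because accumulation increases. The numbers and the response factor below are arbitrary illustrative assumptions, not measurements or a glaciological model.

```python
def snout_change(accumulation, ablation, years, response_factor=0.5):
    """Crude toy model: net mass balance drives snout advance (+) or retreat (-).

    accumulation, ablation: metres water-equivalent per year (made-up units).
    response_factor: arbitrary scaling from net balance to snout movement.
    """
    net_balance = accumulation - ablation
    return response_factor * net_balance * years

# Same temperature (so the same ablation), different snowfall:
print(snout_change(accumulation=2.0, ablation=1.5, years=50))   # wetter: snout advances
print(snout_change(accumulation=1.2, ablation=1.5, years=50))   # drier: snout retreats
```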
Meltwater from a glacier is a function of temperature, but also precipitation. When rain falls on the glacier it increases the melt rate of snow and ice dramatically. This is likely a major explanation for the rapid melt and vast proglacial lakes associated with melt of the ice during the Holocene Optimum. Dynamics of a continental glacier are a slow build up as snow layers accumulate, followed by a relatively rapid melt as snow turns to rain.
The amount of CO2 in the ice crystals varies with the temperature of the water droplet and raindrop, just as the CO2 capacity of seawater varies with temperature. This means glacier meltwater has a higher concentration of CO2, and as it trickles down through the ice layers it modifies the air bubbles, as Jaworowski explained in his presentation to the US Senate Committee (March 2004).
This is because the ice cores do not fulfill the essential closed system criteria. One of them is a lack of liquid water in ice, which could dramatically change the chemical composition of the air bubbles trapped between the ice crystals. This criterion is not met, as even the coldest Antarctic ice (down to -73°C) contains liquid water[2]. More than 20 physico-chemical processes, mostly related to the presence of liquid water, contribute to the alteration of the original chemical composition of the air inclusions in polar ice[3].
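The temperature dependence of CO2 solubility in water can be sketched with Henry’s law plus a van ’t Hoff correction. The constants below are commonly quoted textbook values used only to illustrate the direction of the effect, namely that colder meltwater can hold more dissolved CO2; this is my illustration, not Jaworowski’s calculation.

```python
import math

def co2_solubility_mol_per_l(temp_c, pco2_atm=0.0004):
    """Dissolved CO2 (mol/L) from Henry's law with a van 't Hoff temperature correction.

    kH(25 C) ~ 0.034 mol/(L*atm) and a temperature coefficient of ~2400 K are
    commonly quoted textbook values; treat them as illustrative assumptions.
    """
    t_k = temp_c + 273.15
    k_h = 0.034 * math.exp(2400.0 * (1.0 / t_k - 1.0 / 298.15))
    return k_h * pco2_atm

for t in (0, 10, 25):
    print(f"{t:2d} C: {co2_solubility_mol_per_l(t):.2e} mol/L")   # colder water holds more CO2
```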
IPCC maintained focus on the Carbon Cycle, but the Water Cycle is more important, especially as it relates to the dynamics of change. Put a dehydrated rock in a chamber and vary the temperature as much as possible and little happens. Add a few drops of water and the breakdown (weathering) of the rock is dramatic. Any climate experiment or research that excludes water, such as the list of greenhouse gases in dry air, is meaningless. Water exists everywhere on the planet.
Precipitation occurs over the oceans, but we have virtually no measurements, so we cannot determine the diluting effect on the salinity and gaseous content of the critical surface layer. How much does precipitation as a 10 percent carbonic acid solution affect the CO2 measures of that layer? Snowmelt has an even higher CO2 concentration.
Wind speed and direction are major determinants of water distribution in the atmosphere and therefore across the world. Wind also alters the impact of temperature, as we know from wind chill and heat index measures. What is the effect of a small increase in regional, hemispheric or global wind speed on the weather and climate?
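As an illustration of how wind alters the felt impact of temperature, here is the widely used 2001 North American wind chill formula (temperature in °C, wind speed in km/h at 10 m). It is only defined for temperatures at or below about 10 °C and winds above roughly 5 km/h, and it is offered here as a familiar example rather than anything the IPCC uses.

```python
def wind_chill_c(temp_c, wind_kmh):
    """North American (2001) wind chill index; valid roughly for T <= 10 C, wind > 4.8 km/h."""
    v = wind_kmh ** 0.16
    return 13.12 + 0.6215 * temp_c - 11.37 * v + 0.3965 * temp_c * v

# Same air temperature, different wind: the "felt" temperature changes a lot.
print(round(wind_chill_c(-10, 10), 1))   # light wind  -> about -15 C
print(round(wind_chill_c(-10, 50), 1))   # strong wind -> about -22 C
```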
Atmospheric pressure varies with temperature, which determines the weight of the atmosphere pushing on the surface. How much do these changes affect sea level? We know the effect is considerable because of the storm surges that accompany intense low-pressure systems.
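The pressure part of that question can be bounded with the “inverse barometer” rule of thumb: local sea level rises roughly 1 cm for every hectopascal of pressure drop. A minimal sketch assuming that approximation:

```python
def inverse_barometer_rise_cm(pressure_drop_hpa):
    """Approximate static sea-level rise from a pressure drop: roughly 1 cm per hPa (rule of thumb)."""
    return 1.0 * pressure_drop_hpa

# A deep low of ~960 hPa against a ~1013 hPa background is a ~53 hPa drop,
# implying roughly half a metre of static surge before any wind effects are added.
print(inverse_barometer_rise_cm(1013 - 960), "cm")
```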
The list of variables unmeasured, unknown or excluded from official IPCC science invalidates their models and their claims. Water in all its forms and functions is the most egregious. It also illustrates the degree of auto-correlation confronting climate research and understanding. It appears Wigley and therefore the IPCC knew of the problems but chose to sidestep them by carefully directing the focus – a scientific sleight of hand.
###
Related articles
- Celebrated Physicist Calls IPCC Summary ‘Deeply Unscientific’
- A look at treemometers and tree ring growth
On the argument about anomalies – I go with John Kehr Misunderstanding of the Global Temperature Anomaly
Go read the rest of the article to see what he is talking about.
“Many Climate Reconstructions Incorrectly Attributed to Temperature Change”
“…but they exacerbated it by ignoring, downgrading or misusing variables. Most important and critical was water in all its forms and functions.”
Maybe the variable of temperature does not always apply; with latent heat the ‘humid air’ parcel can vary in ‘heat’ content without changing the temperature…?
IMO also some of the other variables are/are not temperature dependent, or they are like the Thermosphere where individual(?) molecules, gases etc… can be excited by the sun’s rays but not transfer heat to the lower levels of the atmosphere(?)
Just some questions/thoughts (however disjointed, deal with it) on a cool (30°F outside) Saturday morning…
Thanks for the interesting posts,postings,stories,essays,articles,papers,links and comments.
ps-i also think that there is a missing negative sign and that’s why in the long run things tend to be cold(glaciation/ice ages)
Genghis says:
December 27, 2013 at 6:01 pm
“Water in all its forms is obviously the most important climate factor.
The dry lapse rate is 9.8 C/km. The moist lapse rate is 5 C/km. What that means is that a moist atmosphere is warmer than a dry atmosphere. Increased levels of CO2 tend to increase the lapse rate thereby cooling the atmosphere, relative to a moist atmosphere.”
but we have to be sure to keep things straight. If the temperature in the radiating region at 10 km altitude is -50 deg C, then for a moist lapse rate of 5 C/km the surface would be at -50 + 5 * 10 = 0 C. For a dry lapse rate of 9.8 C/km the surface would be -50 + 9.8 * 10 = 48 C. So moisture in the atmosphere tends to warm the surface and it is the overwhelming effect, not CO2. Fortunately, since the oceans provide an inexhaustible supply of water and the mass of inert gas in the atmosphere is fixed, the carrying capacity of the atmosphere for water vapor is also approximately constant overall, although the convection cells do introduce day-to-day and seasonal variability.
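A minimal sketch of the arithmetic in the comment above: the surface temperature implied by a fixed radiating-level temperature, an assumed altitude, and a constant lapse rate (the values are the ones assumed in the comment, not measurements).

```python
def surface_temp_c(radiating_temp_c, radiating_alt_km, lapse_rate_c_per_km):
    """Surface temperature implied by a radiating-level temperature and a constant lapse rate."""
    return radiating_temp_c + lapse_rate_c_per_km * radiating_alt_km

print(surface_temp_c(-50, 10, 5.0))   # moist lapse rate ->  0 C
print(surface_temp_c(-50, 10, 9.8))   # dry lapse rate   -> 48 C
```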
Mosher, you must admit that the linear rate of warming has yet to prove robust against choices such as data set and end points. This is because the data is too noisy for robust outcomes and statements, and the noise far exceeds the buried, artificial OLS statistic. In classical statistical analysis, the climate OLS is a garbage number that would be thrown out by even a high-school-level statistics teacher. And you are fully aware of that. I dare you to say otherwise.
That and the ignored issue with tree ring temperature proxies that caused the ‘hide the decline’ ‘trick’ of some fame. They weren’t hiding a decline in temperatures; they were hiding the divergence between observed temperature and tree ring growth from 1960 onwards. Tree ring growth indicated that temperatures were falling rather than rising, so the good Dr appended the actual temps from ~1960 onwards to the tree ring proxies from the prior 1000 years or so; of course he neglected to note this in any of his explanatory information.
The fact that current tree ring growth does not correlate with current temperatures calls into question the entire field of dendroclimatology. If they aren’t a good proxy for current temps, why should they be considered a good proxy for past temps?
To put it in human terms if I were to calculate an ordinary least squares rate of improvement in oral reading rate based on a data string identical to global temperature and say to a parent that their child was improving at such a rate, I would be committing fraud. The spread of noisy data points from the artificial predicted rate would preclude me from saying such nonsense.
Unfortunately there are pockets of educational institutions that do it anyway, not from malice but from not understanding proper use and the limits of statistical analysts. Would this be the case with you?
“analysis”. I hate iPad autocorrect.
I said:
pochas says:
December 28, 2013 at 7:06 am
“moisture in the atmosphere tends to warm the surface”, and this is backwards. The effect of moisture in the atmosphere is to cool the surface relative to the radiating region.
Pamela Gray says: December 28, 2013 at 7:08 am
Pamela, I am not fully aware of that. And, although your comment was not addressed directly to me, are you able to explain why?
Pochas, glad you corrected that.
But note the bigger issue.
Humid air cools at the moist rate when rising; then, after the water vapour condenses out, it descends and warms at the faster dry rate.
Consider the effect of that descending warm air on surface temperatures globally when at any given moment 50% of the atmosphere is rising and 50% is falling.
A major omission in the global energy budget.
Gail Combs says: December 28, 2013 at 6:33 am
Lots of different factors effect the growth of plants and just saying it is temperature is idiotic.
Correct, here is an example from my experience:
I have some land in the county of Kent, England. The woodland part is populated mainly by silver birch. The flat part of the land was totally devastated by the hurricane of October 1987, while the part in a valley section survived intact. Now, 26 years later, some of the newly grown saplings in the clearings have caught up with much older trees from the valley part, only two hundred meters apart. If the two were used in some future dendrology study they would give totally different growth rates.
Non Nomen says:
December 28, 2013 at 7:19 am
Check for the “Bilderberg group”. Those are the powerful guys behind the curtains for several decades now. Here is a good overview:
http://bilderberg.org/bilder.htm
The connection between the Bilderberg group and Maurice Strong is via David Rockefeller as described here:
http://oathkeepers.org/oath/2012/05/25/henry-lamb-on-maurice-strong_un_bilderberg_agenda-21/
Non Nomen says:
December 28, 2013 at 7:19 am
Thanks. The wrongdoer Maurice Strong is known to me….. Something must have happened behind the scenes and I want to trace the guys responsible for that crap. Tar and feather
>>>>>>>>>>>>>>>>
You’re getting into ‘Conspiracy Theory’ country of course. That said, you might look into these 737 people who “accumulate 80% of the control over the value of all transnational corporations”: The Network of Global Corporate Control, and cross check against the 1001 Club who funded the World Wildlife Fund.
The groups are pretty incestuous with the same names cropping up in group after group.
1001 Club Incomplete membership list continually updated. Also see: At The Hand Of Man – The White Man’s Game – Prince Bernhard and the World Wildlife Fund (WWF) pp. 66-71
Also checkout THIRD WORLD TRAVELER
There are several pitfalls of OLS (and many other linear statistics) that, when it is applied carelessly, can make your conclusions non-robust. The following article is easy to read for most of us here and explains some of the limitations when OLS is applied to noisy data replete with many independent variables, such as we see in temperature products. The artificial nature of temperature products means that there are many independent variables to deal with, often to the point of making linear measures inappropriate.
http://www.arsa-conf.com/archive/?vid=1&aid=3&kid=60101-220&q=f1
MikeB, if your comment and question relates to educational data, the noisier the data spread the less reliable your prediction of future improvement will be. Why? It is often the case that noisy data has elements in it that you did not intend to measure. And these variables are often independent of your treatment.
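A minimal sketch of the endpoint and noise sensitivity being discussed here: fit OLS trends to the same noisy synthetic series over different start years and watch the slope swing around. The series is made up (a small underlying trend buried in much larger noise), purely to illustrate the behaviour, not to represent any real temperature product.

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1980, 2014)
# Synthetic series: small underlying trend (0.01/yr) buried in much larger noise.
series = 0.01 * (years - years[0]) + rng.normal(0, 0.15, len(years))

for start in (1980, 1990, 1998, 2002):
    mask = years >= start
    slope = np.polyfit(years[mask], series[mask], 1)[0]
    print(f"OLS trend from {start}: {slope * 10:+.2f} per decade")
```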
If the data doesn’t warm, adjust it so it does. This has been going on for both GISS and HADCRUT, with the differences between revisions being examples of it.
http://www.woodfortrees.org/plot/hadcrut3gl/from:2001/plot/hadcrut4gl/from:2001/plot/hadcrut3gl/from:2001/trend/plot/hadcrut4gl/from:2001/trend
The cooling shown in HADCRUT3 was reduced to almost flat in HADCRUT4, yet again.
USA temperatures from GISS 1999.
http://www.john-daly.com/usa-1999.gif
Notice how 1930s and 1940s temperatures were warmer than the 1990s for the USA.
Global temperatures from GISS 1999/2000
http://www.john-daly.com/giss2000.gif
A link to temperatures in regions all over the world up to 2003 using GISS.
http://www.john-daly.com/stations/stations.htm
A detailed look into surface temperatures.
http://www.john-daly.com/ges/surftmp/surftemp.htm
Vukcevic, not necessarily. Young and old trees can have similar growth differences during the same period of time if exposed to the same conditions. Good conditions can allow young trees to grow well relative to their own history. Those same good conditions can allow old trees to grow well relative to their own history.
Pamela Gray @ vukcevic
Good conditions can allow young trees to grow well relative to their own history. Those same good conditions can allow old trees to grow well relative to their own history.
I wish I understood what you meant to say. Oh, forget it..
Gail Combs says:
December 28, 2013 at 7:04 am
“On the argument about anomalies – I go with John Kehr Misunderstanding of the Global Temperature Anomaly.”
=========================================================================
Gail, thanks for reference. I did read the rest of the article, but I’m not sure that everyone will. Since the monthly anomalies are followed closely here, I think everyone should understand what they are.
The reference implies, but does not actually say so in as many words, that the anomaly for a given calendar month (say January) is calculated by subtracting a 30-year average of that specific calendar month’s temperatures (e.g. all the January temperatures from 1981 to 2010) from that month’s global temperature, not an average of all the monthly temperatures for 30 years. Of course, that is the way it has to be; otherwise in the NH, January anomalies would always be lower than June anomalies. I never gave the actual meaning much thought. Thanks for the comment.
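A minimal sketch of that calculation (my own illustration with made-up numbers, not the agencies’ code): build a per-calendar-month baseline from 1981-2010 and subtract it from each month’s value, so a January is always compared with other Januaries.

```python
import numpy as np

def monthly_anomalies(monthly_temps, years, base_start=1981, base_end=2010):
    """Anomalies relative to a per-calendar-month baseline.

    monthly_temps: array of shape (n_years, 12); years: array of length n_years.
    """
    monthly_temps = np.asarray(monthly_temps, dtype=float)
    in_base = (years >= base_start) & (years <= base_end)
    baseline = monthly_temps[in_base].mean(axis=0)   # one mean per calendar month
    return monthly_temps - baseline                  # Januaries compared with Januaries, etc.

# Tiny made-up example: 40 years of a seasonal cycle plus noise.
years = np.arange(1975, 2015)
rng = np.random.default_rng(3)
seasonal = 10 + 8 * np.sin(2 * np.pi * (np.arange(12) - 3) / 12)
temps = seasonal + rng.normal(0, 0.3, (len(years), 12))
anoms = monthly_anomalies(temps, years)
print(anoms.mean(axis=0).round(2))   # near zero for every calendar month, despite the seasonal cycle
```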
Notice the difference between HADCRUT3 and HADCRUT4 for the massive peak around 2008.
http://www.woodfortrees.org/plot/hadcrut3gl/from:2001/plot/hadcrut4gl/from:2001/plot/hadcrut3gl/from:2001/trend/plot/hadcrut4gl/from:2001/trend
The difference between the two was supposedly to include more Arctic data above 80N. So how can 0.43 percent of the planet’s surface above 82.5N warm up the global mean by nearly 0.25°C?
If the area used was 2% of the planet, it would take an anomaly 12.5°C above the HADCRUT3 data to match this 0.25°C positive anomaly globally. This gives scientific evidence that HADCRUT4 was adjusted not just for Arctic temperatures, but also to show less warming by changing data elsewhere.
Typo in post above should read around 2007.
Typo 2 – to show less COOLING changing data elsewhere.
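The back-of-the-envelope arithmetic a few comments above can be written out explicitly: the regional anomaly needed for a region of a given area fraction to move the global mean by a given amount is simply the global change divided by the area fraction. The area fractions used are the ones assumed in that comment.

```python
def required_regional_anomaly(global_change_c, area_fraction):
    """Regional anomaly needed for a region of the given area fraction to shift the global mean."""
    return global_change_c / area_fraction

print(required_regional_anomaly(0.25, 0.02))     # 2% of the surface -> 12.5 C, as stated above
print(required_regional_anomaly(0.25, 0.0043))   # 0.43% of the surface -> roughly 58 C
```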
Pamela Gray
Sorry, I did miss your elaboration whilst posting my question. The fact that data is noisy does not preclude using OLS, it just means that the resulting straight line fit may not be ‘statistically significant’. This would be the case with any trend over the last decade or so; I am not aware of anyone saying differently.
Even if the data points were completely random, the OLS technique would still produce a trend line and that trend line would of course be meaningless. So it is always important to retain some common sense and not be blinded by science or maths.
On the other hand, I would expect the rising trend throughout the 20th century to have some validity.
Any method which fits a straight line to a number of data points relies on the underlying assumption that the trend is in fact linear. If it is not, the technique is not valid. In respect of global temperature, it is questionable whether it is valid (one camp says that CO2 induced warming increases logarithmically and so should slow down and the other camp says it should accelerate away) – but it may be of some indicative use in some circumstances.
We are only using one dependent variable here – temperature (although it is affected by many different things).
Thanks for the link, it is a very nice summary.
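To illustrate the “common sense” point above: OLS will always return a trend line, even for pure random noise; the question is whether the slope is distinguishable from zero. A minimal sketch, assuming scipy is available, with made-up noise rather than any real temperature series:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
x = np.arange(120)                 # e.g. 120 months
y = rng.normal(0, 0.2, len(x))     # pure noise, no underlying trend

result = stats.linregress(x, y)
print(f"slope = {result.slope:+.4f} per step, p-value = {result.pvalue:.2f}")
# A slope is always produced; a large p-value says it is indistinguishable from zero.
```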
vukcevic says: December 28, 2013 at 8:44 am
I think Pam may be referring to this study:
NOTE: I skipped the Abstract because it seems to say what Pam was saying, which contradicts the body of the text.
There also seems to be a genetic component (or environmental or CO2?) At least I think that is what they are saying.
Vuc, you said, “If two were to be used in some future dendrology study they would give totally different growth rate.” My prediction is that they would be very similar if the conditions were the same. Each tree is measured for growth rate relative to itself in order to compensate for independent variables such as tree age. Your only other choice is to measure only trees of the same age with the same number of growth rings under the same conditions.
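A minimal sketch of what “measured relative to itself” looks like in practice: fit a smooth growth curve to each tree’s ring widths and divide by it to get a dimensionless ring-width index, so a young fast-growing tree and an old slower tree can be compared. The detrending choice here (a simple quadratic) and the ring widths are illustrative assumptions; dendrochronologists use a variety of curves and splines.

```python
import numpy as np

def ring_width_index(ring_widths):
    """Divide a tree's ring widths by a fitted growth curve (quadratic here) to get an index near 1."""
    ring_widths = np.asarray(ring_widths, dtype=float)
    age = np.arange(len(ring_widths))
    growth_curve = np.polyval(np.polyfit(age, ring_widths, 2), age)
    return ring_widths / growth_curve

# Made-up widths (mm) for a young, fast-growing tree and an old, slower tree:
young = np.array([2.0, 2.2, 2.1, 2.4, 2.6, 2.5])
old = np.array([0.9, 1.0, 0.95, 1.05, 1.15, 1.1])
print(ring_width_index(young).round(2))   # both come out as indices near 1,
print(ring_width_index(old).round(2))     # so they can be compared despite different absolute growth
```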