Why and How the IPCC Demonized CO2 with Manufactured Information

Guest essay by Dr. Tim Ball

Elaine Dewar spent several days with Maurice Strong at the UN and concluded in her book The Cloak of Green that Strong was using the U.N. as a platform to sell a global environmental crisis and the Global Governance Agenda. Strong conjectured about a small group of world leaders who decided the rich countries were the principal risk to the world. These countries refused to reduce their environmental impact. The leaders decided the only hope for the planet was the collapse of the industrialized nations, and that it was their responsibility to bring that about. Strong knew what to do: create a false problem with false science and use bureaucrats to bypass politicians, close industry down, and make developed countries pay.

Compare an industrialized nation to an internal combustion engine running on fossil fuel. You can stop the engine in two ways: cut off the fuel supply or plug the exhaust. Cutting off the fuel supply is a political minefield. People quickly notice as all prices, especially for food, increase. It's easier to show the exhaust is causing irreparable environmental damage. This is why CO2 became the exclusive focus of the Intergovernmental Panel on Climate Change (IPCC). Process and method were orchestrated to single out CO2 and show it was causing runaway global warming.

In the 1980s I warned Environment Canada employee Henry Hengeveld that convincing a politician of an idea is only part of the problem. Henry's career involved promoting CO2 as a problem. I explained that the bigger problem comes if you convince them and the claim is later proved wrong. You either admit your error or hide the truth. Environment Canada and the member nations of the IPCC chose to hide or obfuscate the truth.

1. The IPCC Definition of Climate Change Was the First Major Deception

People were deceived when the IPCC was created. Most believe it is a government commission of inquiry studying all climate change. The actual definition, from Article 1 of the United Nations Framework Convention on Climate Change (UNFCCC), limits them to only human causes:

a change of climate which is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and which is in addition to natural climate variability observed over comparable time periods.

In another deception, the definition used in the first three Reports (1990, 1995, 2001) was changed for the 2007 Report, where the new version appears only as a footnote in the Summary for Policymakers (SPM):

Climate change in IPCC usage refers to any change in climate over time, whether due to natural variability or as a result of human activity. This usage differs from that in the United Nations Framework Convention on Climate Change, where climate change refers to a change of climate that is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and that is in addition to natural climate variability observed over comparable time periods.

The broader definition was not actually applied, because the Reports are cumulative and including natural variability would have required starting over completely.

It is impossible to determine the human contribution to climate change if you don't know or understand natural (non-human) climate change. Professor Murry Salby showed that the human CO2 portion is of no consequence, and that variation in natural sources of CO2 explains almost all annual changes. He showed that a 5% variation in these sources is more than the total annual human production.
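A back-of-the-envelope check of Salby's point is possible with round carbon-cycle figures; the numbers below are commonly cited approximations I have assumed for illustration, not Salby's own:

```python
# Rough arithmetic behind the 5% claim, using assumed round numbers:
natural_sources_gtc = 200.0  # assumed combined ocean + land CO2 emissions, GtC/yr
human_emissions_gtc = 9.0    # assumed fossil fuel + cement emissions, GtC/yr (circa 2013)

five_percent_of_natural = 0.05 * natural_sources_gtc
print(f"5% of natural sources:  {five_percent_of_natural:.0f} GtC/yr")
print(f"Total human production: {human_emissions_gtc:.0f} GtC/yr")
print("5% variation exceeds human output:",
      five_percent_of_natural > human_emissions_gtc)
```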

2. The IPCC Set Out to Prove Rather Than Disprove the Hypothesis

To make the process appear scientific, a hypothesis was inferred based on the following assumptions:

• CO2 was a greenhouse gas (GHG) that slowed the escape of heat from the Earth.

• The heat was back-radiated to raise the global temperature.

• If CO2 increased, global temperature would rise.

• CO2 would increase because of expanding industrial activity.

• A global temperature rise was therefore inevitable.

To further assure the predetermined outcome, the IPCC set out to prove the hypothesis rather than disprove it, as scientific methodology requires. As Karl Popper said,

It is the rule which says that the other rules of scientific procedure must be designed in such a way that they do not protect any statement in science against falsification.

The consistent and overwhelming pattern of IPCC behaviour reveals misrepresentations of CO2. When an issue was raised by scientists performing their role as skeptics, instead of considering and testing its validity and efficacy, the IPCC worked to divert attention, even creating false explanations. The false answers succeeded because most people didn't know they were false.

3. CO2 Facts Unknown to Most but Problematic for the IPCC

Some basic facts about CO2 are unknown to most people and illustrate the discrepancies between IPCC claims and what science knows. (The percentages below are simple conversions from parts-per-million mixing ratios; a short conversion sketch follows Figure 2.)

• Natural levels of carbon dioxide (CO2) are less than 0.04% of the total atmosphere and 0.4% of the total GHGs. It is not the most important greenhouse gas.

• Water vapour is 95% of the GHGs by volume. It is by far the most important greenhouse gas.

• Methane (CH4) is the other natural GHG demonized by the IPCC. It is only 0.000175% of atmospheric gases and 0.036% of GHGs.

• Figure 1, from ABC News, shows the false information, which is achieved by considering a dry atmosphere.


Figure 1

• The percentages troubled the IPCC, so they amplified the importance of CO2 by estimating the "contribution" per unit (Figure 2). The range of estimates effectively makes the measures meaningless, unless you have a political agenda. Wikipedia acknowledges that "It is not possible to state that a certain gas causes an exact percentage of the greenhouse effect."


Figure 2 (Source: Wikipedia)
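The percentage figures in the bullets above follow from simple unit conversion, since 1% equals 10,000 ppm; a minimal sketch, with approximate present-day mixing ratios assumed:

```python
# Converting atmospheric mixing ratios (ppm) to percentages of the
# whole atmosphere; the ppm values are approximate assumptions.
gases_ppm = {"CO2": 400.0, "CH4": 1.75}

for name, ppm in gases_ppm.items():
    # 1% = 10,000 ppm
    print(f"{name}: {ppm} ppm = {ppm / 10_000:.6f}% of the atmosphere")
```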

4. Human CO2 Production Is Critical to the IPCC Objective, So They Control Production of the Information

Here is their explanation.

What is the role of the IPCC in Greenhouse Gas inventories and reporting to the UNFCCC?

A: The IPCC has generated a number of methodology reports on national greenhouse gas inventories with a view to providing internationally acceptable inventory methodologies. The IPCC accepts the responsibility to provide scientific and technical advice on specific questions related to those inventory methods and practices that are contained in these reports, or at the request of the UNFCCC in accordance with established IPCC procedures. The IPCC has set up the Task Force on Inventories (TFI) to run the National Greenhouse Gas Inventory Programme (NGGIP) to produce this methodological advice. Parties to the UNFCCC have agreed to use the IPCC Guidelines in reporting to the convention.

How does the IPCC produce its inventory Guidelines? Utilising IPCC procedures, nominated experts from around the world draft the reports that are then extensively reviewed twice before approval by the IPCC. This process ensures that the widest possible range of views are incorporated into the documents.

They control the entire process, from methodology, designation of technical advice, establishment of task forces, and guidelines for reporting, to nomination of the experts who produce the reports and final report approval. The figure they produce is a gross calculation, but it is estimated that about 50 percent of that amount is removed from the atmosphere again.

Regardless, if you don't know the natural sources and variability of CO2, you cannot know the human portion. It was claimed that the portion in the atmosphere from combustion of fossil fuels was known from the ratio of the carbon isotopes C13/C12. Roy Spencer showed this was not the case. In addition, they ignore natural burning of fossil fuels, including forest fires, long-burning coal seams and peat; as Hans Erren noted, fossil coal is buried wood. Spencer concluded,

If the C13/C12 relationship during NATURAL inter-annual variability is the same as that found for the trends, how can people claim that the trend signal is MANMADE??

The answer is, it was done to prove the hypothesis and further the deception.

5. Pressure For Urgent Political Action

Early IPCC Reports claimed the length of time CO2 remains in the atmosphere, its residence time, is very long. This implied it would continue to be a problem even with an immediate cessation of CO2 production. However, as Segalstad wrote,

Essenhigh (2009) points out that the IPCC (Intergovernmental Panel on Climate Change) in their first report (Houghton et al., 1990) gives an atmospheric CO2 residence time (lifetime) of 50-200 years [as a “rough estimate”]. This estimate is confusingly given as an adjustment time for a scenario with a given anthropogenic CO2 input, and ignores natural (sea and vegetation) CO2 flux rates. Such estimates are analytically invalid; and they are in conflict with the more correct explanation given elsewhere in the same IPCC report: “This means that on average it takes only a few years before a CO2 molecule in the atmosphere is taken up by plants or dissolved in the ocean.”
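The "few years" figure follows from simple turnover arithmetic: residence time is the atmospheric reservoir divided by the gross removal flux. A minimal sketch, assuming commonly cited round numbers rather than anything from Segalstad or Essenhigh:

```python
# Turnover (residence) time = reservoir size / gross removal flux.
atmospheric_carbon_gtc = 750.0   # assumed carbon content of atmospheric CO2, GtC
gross_uptake_gtc_per_yr = 150.0  # assumed uptake by oceans + vegetation, GtC/yr

residence_time = atmospheric_carbon_gtc / gross_uptake_gtc_per_yr
print(f"Residence time: about {residence_time:.0f} years")  # ~5 years
```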

6. Procedures to Hide Problems with IPCC Science and Heighten Alarmism

IPCC procedures and mechanisms were established to deceive. The IPCC has three Working Groups (WG). WGI produces the Physical Science Basis Report, which proves CO2 is the cause. WGII produces the Impacts, Adaptation and Vulnerability Report, which is based on the result of WGI. WGIII produces the Mitigation of Climate Change Report. WGII and WGIII accept WGI's claim that warming is inevitable. They state,

Five criteria that should be met by climate scenarios if they are to be useful for impact researchers and policy makers are suggested: Criterion 1: Consistency with global projections. They should be consistent with a broad range of global warming projections based on increased concentrations of greenhouse gases. This range is variously cited as 1.4°C to 5.8°C by 2100, or 1.5°C to 4.5°C for a doubling of atmospheric CO2 concentration (otherwise known as the “equilibrium climate sensitivity”).

They knew few would read or understand the Science Report, with its admission of serious limitations, so they deliberately delayed its release until after the Summary for Policymakers (SPM). As David Wojick explained,

Glaring omissions are only glaring to experts, so the "policymakers," including the press and the public, who read the SPM will not realize they are being told only one side of a story. But the scientists who drafted the SPM know the truth, as revealed by the sometimes artful way they conceal it.

What is systematically omitted from the SPM are precisely the uncertainties and positive counter evidence that might negate the human interference theory. Instead of assessing these objections, the Summary confidently asserts just those findings that support its case. In short, this is advocacy, not assessment.

An example of this SPM deception occurred with the 1995 Report. The 1990 Report and the drafted 1995 Science Report said there was no evidence of a human effect. Benjamin Santer, as lead author of Chapter 8, changed the 1995 text drafted by his fellow Chapter 8 authors, which said,

While some of the pattern-based studies discussed here have claimed detection of a significant climate change, no study to date has positively attributed all or part of the observed climate change to man-made causes.

to read,

The body of statistical evidence in chapter 8, when examined in the context of our physical understanding of the climate system, now points to a discernible human influence on the global climate.

The phrase "discernible human influence" became the headline, as planned.

With AR5 (2013) they compounded the deception by releasing the SPM and then releasing a correction to it. They got the headline they wanted. It is the same game as the difference between the exposure of problems in the WGI Science Report and their absence from the SPM. The media did not report the corrections, but the IPCC could now claim they had detailed the inadequacy of their work. It's not their fault that people don't understand.

7. Climate Sensitivity

Initially it was assumed that constantly increasing atmospheric CO2 created constantly increasing temperature. Then it was determined that the first few parts per million achieve most of the greenhouse capacity of CO2. Eschenbach graphed the reality (Figure 3).

Figure 3

It is like black paint on a window: the first coat blocks most of the sunlight coming through, and each subsequent coat blocks fractionally less.
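The "coats of paint" behaviour corresponds to a logarithmic relation. The widely used approximation for CO2 radiative forcing, ΔF = 5.35 ln(C/C0) W/m² (Myhre et al., 1998), has exactly this diminishing-returns shape, whatever one thinks of the sensitivity numbers built on it. A minimal sketch:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Myhre et al. (1998) logarithmic approximation for CO2
    radiative forcing relative to a 280 ppm baseline, in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each additional 40 ppm adds less forcing than the previous 40 ppm,
# the "coats of black paint" effect described above.
previous = 0.0
for c in (320, 360, 400, 440, 480):
    f = co2_forcing(c)
    print(f"{c} ppm: {f:5.2f} W/m^2 (increment {f - previous:4.2f})")
    previous = f
```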

There was immediate disagreement about the amount of climate sensitivity from a doubling or tripling of atmospheric CO2. Milloy produced a graph comparing three different sensitivity estimates (Figure 4).


Figure 4.

The IPCC created a positive feedback to keep temperatures rising. It claims CO2 causes a temperature increase that increases evaporation, and the added water vapour amplifies the temperature trend. Lindzen and Choi discredited this in their 2011 paper, which concluded, "The results imply that the models are exaggerating climate sensitivity."

Estimates of climate sensitivity have declined since then and gradually approach zero. A recent paper by Spencer claims the "…climate system is only about half as sensitive to increasing CO2 as previously believed."

8. The Ice Cores Were Critical but Seriously Flawed

The major assumption of the inferred IPCC hypothesis says a CO2 increase causes a temperature increase. After publication of Petit et al. in 1999, the Antarctic ice core record appeared as evidence in the 2001 Report (Figure 5).


Figure 5. Antarctic ice core record

Four years later, research showed the reverse: temperature increase preceded CO2 increase, contradicting the hypothesis. This was sidelined with the diversionary claim that the lag was between 80 and 800 years and insignificant. It was so troubling that Al Gore created deceptive imagery in his movie. Only a few experts noticed.

Actually, temperature change precedes CO2 change in every record, for any period or duration. Figure 6 shows a shorter record (1958-2009) of the relationship. If CO2 change follows temperature change in every record, why are all computer models programmed with the opposite relationship? (See the sketch after Figure 6.)


Figure 6. Lag time for a short record, 1958 to 2009.
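A simple way to estimate such a lead/lag is to slide one series against the other and find the shift with the highest correlation. A minimal sketch on synthetic data (not the actual temperature and CO2 records), where a built-in lag is recovered:

```python
import numpy as np

rng = np.random.default_rng(1)
n, true_lag = 600, 9                    # CO2 set to trail temperature by 9 steps
temp = np.cumsum(rng.normal(0, 1, n))   # synthetic "temperature" random walk
co2 = np.roll(temp, true_lag) + rng.normal(0, 0.1, n)  # lagged, noisy copy

def best_lag(x, y, max_lag=24):
    """Lag (in steps) at which y best correlates with x;
    positive means y trails x."""
    lags = list(range(-max_lag, max_lag + 1))
    corrs = [np.corrcoef(x[max_lag:-max_lag],
                         np.roll(y, -k)[max_lag:-max_lag])[0, 1]
             for k in lags]
    return lags[int(np.argmax(corrs))]

print("estimated lag:", best_lag(temp, co2))  # expect 9: CO2 trails temperature
```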

IPCC Needed Low Pre-Industrial CO2 Levels

A pre-industrial CO2 level lower than today's was critical to the IPCC hypothesis. It was like the need to eliminate the Medieval Warm Period, whose existence showed the world was not warmer today than ever before.

Ice cores are not the only source of pre-industrial CO2 levels. There are thousands of direct 19th-century measurements of atmospheric CO2, beginning in 1812. Scientists took precise measurements with calibrated instruments, as Ernst-Georg Beck thoroughly documented.

In a paper submitted to the US Senate Committee on Commerce, Science, and Transportation Hearing, Professor Zbigniew Jaworowski stated,

The basis of most of the IPCC conclusions on anthropogenic causes and on projections of climatic change is the assumption of low level of CO2 in the pre-industrial atmosphere. This assumption, based on glaciological studies, is false.[1]

Of equal importance, Jaworowski states,

The notion of low pre-industrial CO2 atmospheric level, based on such poor knowledge, became a widely accepted Holy Grail of climate warming models. The modelers ignored the evidence from direct measurements of CO2 in atmospheric air indicating that in 19th century its average concentration was 335 ppmv[11] (Figure 2). In Figure 2 encircled values show a biased selection of data used to demonstrate that in 19th century atmosphere the CO2 level was 292 ppmv[12]. A study of stomatal frequency in fossil leaves from Holocene lake deposits in Denmark, showing that 9400 years ago CO2 atmospheric level was 333 ppmv, and 9600 years ago 348 ppmv, falsify the concept of stabilized and low CO2 air concentration until the advent of industrial revolution [13].

There are other problems with the ice core record. It takes years for air to be trapped in the ice, so what is actually trapped and measured? Meltwater moving through the ice, especially when the ice is close to the surface, can contaminate the bubbles. Bacteria form in the ice, releasing gases even in 500,000-year-old ice at considerable depth (Detection, Recovery, Isolation and Characterization of Bacteria in Glacial Ice and Lake Vostok Accretion Ice, Brent C. Christner, 2002 Dissertation, Ohio State University). Pressure of overlying ice causes a change below 50 m, where brittle ice becomes plastic and begins to flow. The layers formed with each year of snowfall gradually disappear with increasing compression, so it requires a considerable depth of ice over a long period to obtain a single reading at depth. Jaworowski identified the problems of contamination and losses during the drilling and core recovery process.

Jaworowski's claim that the modellers ignored the 19th-century readings is incorrect. They knew about them, because T.M.L. Wigley introduced the 19th-century readings to the climate science community in 1983 (Wigley, T.M.L., 1983. "The pre-industrial carbon dioxide level." Climatic Change 5, 315-320). However, he cherry-picked from a wide range, eliminating only high readings, and 'created' a pre-industrial level of approximately 270 ppm. I suggest this is what influenced the modellers, because Wigley was working with them as Director of the Climatic Research Unit (CRU) at East Anglia. He preceded Phil Jones as Director and was the key person directing the machinations revealed by the leaked CRU emails.

Wigley was not the first to misuse the 19th-century data, but he did reintroduce it to the climate community. Guy Stewart Callendar, a British steam engineer, pushed the thesis that increasing CO2 was causing warming. He did what Wigley did, selecting only those readings that supported the hypothesis.

There are 90,000 samples from the 19th century, and the graph (Figure 7) shows those carefully selected by G. S. Callendar to achieve his estimate. It is clear he chose only low readings.


Figure 7. (After Jaworowski; trend lines added)

You can see the changes in slope and trend produced by the selected data compared to the entire record.

Ernst-Georg Beck confirmed Jaworowski's research. An article in Energy and Environment examined the readings in great detail and validated their findings. In his conclusion, Beck states,

Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC. Review of available literature raise the question if these authors have systematically discarded a large number of valid technical papers and older atmospheric CO2 determinations because they did not fit their hypothesis? Obviously they use only a few carefully selected values from the older literature, invariably choosing results that are consistent with the hypothesis of an induced rise of CO2 in air caused by the burning of fossil fuel.

The pre-industrial level was some 50 ppm higher than the level the IPCC claims.

Beck found,

Since 1812, the CO2 concentration in northern hemispheric air has fluctuated exhibiting three high level maxima around 1825, 1857 and 1942 the latter showing more than 400 ppm.

The challenge for the IPCC was to create a smooth transition from the ice core CO2 levels to the Mauna Loa levels. Beck shows how this was done but also shows how the 19th century readings had to be cherry-picked to fit with ice core and Mauna Loa data (Figure 8).


Figure 8

Variability is extremely important because the ice core record shows an exceptionally smooth curve, achieved by applying a 70-year smoothing average. Selecting and smoothing are also applied to the Mauna Loa data and all current atmospheric readings, which naturally vary by up to 600 ppm in the course of a day. Smoothing on the scale of the ice core record eliminates a great deal of information; consider the variability of temperature data over the last 70 years. Statistician William Briggs says you should never, ever smooth a time series. Elimination of the high readings prior to smoothing makes the losses greater. (A numerical sketch of what such smoothing hides follows the Keeling quotations below.) Beck explains how Charles Keeling established the Mauna Loa readings by using the lowest readings of the afternoon and ignoring natural sources. Beck presumes Keeling decided to avoid these low-level natural sources by establishing the station at 4,000 m up the volcano. As Beck notes,

"Mauna Loa does not represent the typical atmospheric CO2 on different global locations but is typical only for this volcano at a maritime location in about 4000 m altitude at that latitude." (Beck, 2008, "50 Years of Continuous Measurement of CO2 on Mauna Loa," Energy and Environment, Vol. 19, No. 7.)

Keeling's son operates Mauna Loa and, as Beck notes, "owns the global monopoly of calibration of all CO2 measurements." He is a co-author of the IPCC reports, which accept Mauna Loa and all other readings as representative of global levels.
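A numerical sketch of the smoothing point above: apply a 70-point moving average to a synthetic series with large swings (an assumed 60-year cycle plus noise, standing in for no particular real record) and watch the variability vanish:

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1000)
cycle = 300 + 20 * np.sin(2 * np.pi * years / 60)   # 60-year cycle, +/-20 ppm
raw = cycle + rng.normal(0, 10, years.size)         # plus year-to-year noise

window = 70
smoothed = np.convolve(raw, np.ones(window) / window, mode="valid")

print(f"raw range:      {raw.min():.0f} to {raw.max():.0f} ppm")
print(f"smoothed range: {smoothed.min():.0f} to {smoothed.max():.0f} ppm")
# A 70-year window flattens the 60-year cycle almost completely.
```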

As a climatologist, I know it is necessary to obtain as many independent verifications of data as possible. Stomata are small openings on leaves that vary in size directly with the amount of atmospheric CO2. They underscore the effects of smoothing and the artificially low readings of the ice cores. A comparison of a stomata record with the ice core record for a 2000-year period (9000-7000 BP) illustrates the issue (Figure 9).


Figure 9.

Stomata data show higher readings and more variability than the excessively smoothed ice core record. They align quantitatively with the 19th-century measurements, as Jaworowski and Beck assert. The average level for the ice core record shown is approximately 265 ppm, while it is approximately 300 ppm for the stomata record.

The pre-industrial CO2 level was marginally lower than current levels and likely within the error range. Neither that level nor the present reading of 400 ppm accepted by the IPCC is high relative to the geologic record. The entire output of computer climate models begins with the assumption that pre-industrial levels were measurably lower. Eliminating this assumption further undermines the claim that warming in the industrial era was due to the human addition of CO2 to the atmosphere. Combine this with their assumption that CO2 causes temperature increase, when all records show the opposite, and it is not surprising that IPCC predictions of temperature increase are consistently wrong.

The IPCC deception was premeditated under Maurice Strong's guidance to prove CO2 was causing global warming as a pretext for shutting down industrialized nations. They partially achieved their goal, as alternate energies and green-job economies attest. All this occurred while contradictory evidence mounted, because Nature refused to play along. CO2 increases as temperatures decline, which according to IPCC science cannot happen. Politicians must deal with the facts and abandon all policies based on claims that CO2 is a problem, especially those already causing damage.


Source: The Global Warming Policy Foundation: CCNet 14/10/13


[1] “Climate Change: Incorrect information on pre-industrial CO2.” Statement written for the Hearing before the US Senate Committee on Commerce, Science, and Transportation by Professor Zbigniew Jaworowski, March 19, 2004.


290 Comments
Gail Combs
November 14, 2013 11:19 am

Jquip says: November 13, 2013 at 7:13 pm
Waitwaitwait. We calibrate our CO2 instruments from a sensor parked on the side of an active volcano? Which pencil neck thought this was a good idea? …
>>>>>>>>>>>>>>>>
Oh, it is worse than that. Remember they also had a large sink, the ocean, to hand. It made cherry-picking very easy.

At the Mauna Loa Observatory the measurements were taken with a new infra-red (IR) absorbing instrumental method, never validated versus the accurate wet chemical techniques. Critique has also been directed to the analytical methodology and sampling error problems (Jaworowski et al., 1992 a; and Segalstad, 1996, for further references), and the fact that the results of the measurements were “edited” (Bacastow et al., 1985); large portions of raw data were rejected, leaving just a small fraction of the raw data subjected to averaging techniques (Pales & Keeling, 1965)….
Revelle foresaw the geochemical implications of the rise in atmospheric CO2 resulting from fossil fuel combustion, and he sought means to ensure that this ‘large scale geophysical experiment’, as he termed it, would be adequately documented as it occurred. During all stages of the present work Revelle was mentor, consultant, antagonist. He shared with us his broad knowledge of earth science and appreciation for the oceans and atmosphere as they really exist, and he inspired us to keep in sight the objectives which he had originally persuaded us to accept.” Is this the description of true, unbiased research?…
http://www.co2web.info/ESEF3VO2.pdf

And then, from Mauna Loa, step 4 of how they take measurements:

4. In keeping with the requirement that CO2 in background air should be steady, we apply a general “outlier rejection” step, in which we fit a curve to the preliminary daily means for each day calculated from the hours surviving step 1 and 2, and not including times with upslope winds. All hourly averages that are further than two standard deviations, calculated for every day, away from the fitted curve (“outliers”) are rejected. This step is iterated until no more rejections occur.
http://www.esrl.noaa.gov/gmd/ccgg/about/co2_measurements.html

This step is why I have major problems with the assumption that CO2 is well mixed in the atmosphere.
Graph of raw data link
Graph after climatologists get through with the data manipulating: link
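In code, the quoted "outlier rejection" step amounts to an iterated fit-and-reject loop; here is a minimal sketch on synthetic data (illustrative only, not NOAA's actual implementation, which works on daily means):

```python
import numpy as np

def iterative_rejection(t, y, degree=3, nsigma=2.0):
    """Fit a smooth curve, reject points more than nsigma standard
    deviations from it, and repeat until nothing more is rejected."""
    keep = np.ones(y.size, dtype=bool)
    while True:
        fit = np.polyval(np.polyfit(t[keep], y[keep], degree), t)
        resid = y - fit
        sigma = resid[keep].std()
        new_keep = keep & (np.abs(resid) <= nsigma * sigma)
        if new_keep.sum() == keep.sum():  # no more rejections
            return new_keep
        keep = new_keep

rng = np.random.default_rng(0)
t = np.arange(500.0)
y = 395 + 0.005 * t + rng.normal(0, 0.3, t.size)  # synthetic hourly CO2
y[[50, 200, 350]] += 4.0                          # "local contamination" spikes
kept = iterative_rejection(t, y)
print(f"rejected {np.count_nonzero(~kept)} of {y.size} hourly values")
```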

Gail Combs
November 14, 2013 11:29 am

J. Philip Peterson says:
November 13, 2013 at 7:30 pm
Is Maurice Strong still alive?
>>>>>>>>>>>>>>>>>>>>
He got caught with his hand in the cookie jar in the Oil-for-Food scandal and had to hightail it to China. Last I heard he was in Beijing as an advisor to the Chinese government and on the board of a U.S.-based engineering and construction firm, CH2M Hill.
He has a website that is active.

Gail Combs
November 14, 2013 11:37 am

Jquip says: November 13, 2013 at 8:27 pm
… The problem is calibrating other instruments from Mauna Loa. I don’t mean to call Dr. Ball into question, but the idea that this is the case is beyond absurd.
>>>>>>>>>>>>>>
Why would you say that?
Round robins and "matching" using a 'standard' are often done among labs owned by one corporation. Also, this is not the first time I have heard of Mauna Loa controlling the calibration. After all, it was the first lab and would be considered the 'expert.'

Gail Combs
November 14, 2013 11:44 am

A.D. Everard says:
>>>>>>>>>>>
Unfortunately the traitors are from within. Strong is a Canadian. He is aided by Al Gore and Bill Clinton (Americans), Tony Blair (UK Prime Minister), and Pascal Lamy (French)…
Lamy (Director-General of the WTO) made clear that the true goal is Global Governance, and that the goal was agreed on back in the 1930s after WWI and the Great Depression. Strong served on the Commission on Global Governance.

Mac the Knife
November 14, 2013 12:16 pm

Dr. Ball,
Thank you for this excellent summary! There is too much here to digest on my lunch break, so I'll have to return later for a more thorough 'read'. You have written it beautifully though, in terms that most laymen without the benefit of science or engineering degrees can understand.
I think this is a ‘bookmark and distribute widely’ summary!
MtK

Mac the Knife
November 14, 2013 12:24 pm

Konrad says:
November 14, 2013 at 2:32 am
In his exploration of "why", Dr. Tim Ball misses an important point: money. More precisely, money for the UN.
Konrad,
I agree. A news story is just breaking now about UN-IPCC feigning poverty while concealing fat slush funds.
MtK
UN carbon emissions reduction system awash in cash as it claims to face hard times.
EXCLUSIVE: The United Nations-administered cap-and-trade system for reducing greenhouse gases is sitting on a cash hoard of close to $200 million, even as it warns of hard times ahead that could impede its mission.
The cash cushion for the Geneva-based organization known as the Clean Development Mechanism, or CDM, amounts to more than 400 percent of the $45 million reserve that it considers a normal set-aside for rainy days, according to its recently published business plan for 2014-2015.

http://www.foxnews.com/world/2013/11/14/un-carbon-emissions-reduction-system-awash-in-cash-as-it-claims-to-face-hard/?intcmp=latestnews

Gail Combs
November 14, 2013 12:38 pm

Jquip says:
November 14, 2013 at 10:01 am
wayne: “Interesting, I always thought Keeling sent out a calibration gas sample of a known and accurate concentration which has its own set of problems. In that case if the calibration samples are off, everyone is off but since the calibration was deemed accurate no mother station adjustments would ever be made after the fact as they are assimilated each month.”
Yes, exactly. And assuming all the best integrity of Keeling, Keeling is calibrating the gas samples with his own equipment….
>>>>>>>>>>>>>>
Normally the calibration samples for a gas chromatograph or an IR instrument are made by taking pure samples and making up standards of various concentrations covering the range of interest. You would take a pure sample of CO2 and dilute it with dry nitrogen to 300 ppm, 325 ppm, 350 ppm, 375 ppm, 400 ppm, 425 ppm, 450 ppm, 475 ppm and 500 ppm, for example.
(You then hope like heck you didn’t screw-up)
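The arithmetic for those volumetric dilutions is straightforward; a minimal sketch assuming a hypothetical 10-litre mixing volume and ideal-gas behaviour (volume fraction equals mole fraction):

```python
# Volume of pure CO2 needed per standard: ppm is parts per million
# by volume, so required CO2 volume = total volume * ppm / 1e6.
targets_ppm = [300, 325, 350, 375, 400, 425, 450, 475, 500]
total_litres = 10.0  # hypothetical mixing volume, made up with dry nitrogen

for ppm in targets_ppm:
    co2_ml = total_litres * 1000.0 * ppm / 1_000_000
    print(f"{ppm} ppm standard: {co2_ml:.1f} mL pure CO2 in {total_litres:.0f} L")
```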

transport by Zeppelin
November 14, 2013 12:59 pm

“Strong was using the U.N. as a platform to sell a global environment crisis and the Global Governance Agenda.”
<<<
"In the early 1990's, shortly after the IPCC was organised, President Clinton's chief environmental scientist, Dr Robert Watson, told me that after he had helped get the production of Freon banned by the international community with the Montreal protocol, next on the list to be regulated was carbon dioxide. There was no mention of investigating the science behind the claim that global warming was man-made – only a specific policy outcome that the IPCC was going to support.
Dr Watson later became one of the IPCC's directors, from 1997 to 2002."
Roy Spencer

Jquip
November 14, 2013 1:01 pm

Gail Combs: “(You then hope like heck you didn’t screw-up)”
Yep. And this is the whole crux of it. For to know whether you did or didn't screw up, you need calibrated equipment. Remember: we have only one site for this. If we had numerous people doing numerous calibrations, then it reduces to a Byzantine generals problem and you crack open the statistics book.
But one source, one site, doesn't get you that. There are a multitude of problems in this. But it remains: it's measuring the aging process of Keeling's equipment. The only way for this not to be the case is to produce samples, and calibrations, from multiple sources.

John Whitman
November 14, 2013 1:04 pm

Why and How the IPCC Demonized CO2 with Manufactured Information
By guest essayist Tim Ball

– – – – – – – –
In his guest essay, Tim Ball is saying that Maurice Strong is the sufficient and necessary reason for what Ball describes as the 'why and how the IPCC demonized CO2 with manufactured information'.
I am not convinced by Tim Ball's essay. If the cause was Maurice Strong, what does that explain about the past approximately half century of Western Civilization? I think it explains nothing. I think Tim Ball's essay does not even try to explain it; he instead gives us a stereotyped villainous scapegoat (Maurice Strong) as merely necessary and sufficient.
Insufficient and unnecessary is my evaluation.
John

November 14, 2013 1:25 pm

Gail Combs: Most gas standards are started with gravimetric methods, which are more accurate than volumetric dilutions. Of course, getting it down to a few hundred ppm would be volumetric.

Ferdinand Engelbeen
November 14, 2013 1:25 pm

Wow! What a story!
Who needs enemies if your friends tell such stories… It seems that Lewandowsky may be somewhat right about conspiracy theories amongst sceptics?
To start with:
Keeling's son operates Mauna Loa and, as Beck notes, "owns the global monopoly of calibration of all CO2 measurements." He is a co-author of the IPCC reports, which accept Mauna Loa and all other readings as representative of global levels.
The calibration of CO2 levels in nitrogen (later in air), used to calibrate all equipment at all stations, was first done by C.D. Keeling in the late 1950s with a self-made manometric apparatus (he knew glassblowing) with an accuracy of 1:40,000, for the simple reason that the equipment available for CO2 measurements at that time was no better than +/-3% of full scale, or +/-10 ppmv CO2. With the calibration procedures for the new NDIR equipment it was possible to reach +/-0.1 ppmv in continuous measurements without much maintenance, reagents or manpower.
See the fascinating autobiography of C.D. Keeling on his struggle to maintain the funding of the measurements and his battles with the different administrations:
http://scrippsco2.ucsd.edu/publications/keeling_autobiography.pdf
The manometric calibrator was still in service at the Scripps institute until a few years ago, when it was retired after 50 years of service.
Meanwhile NOAA is now in charge of providing the necessary calibration gases to all laboratories which need them. These calibration gases are tested in several other laboratories, while other organisations use their own calibration gases, including Scripps, who weren't happy to give their monopoly away. NOAA also maintains the 10 "baseline" stations for CO2 and a lot of other gases, including Mauna Loa, besides some 70 other "background" stations maintained by other groups from other countries.
The shift from Scripps to NOAA means that Scripps still takes their own samples at Mauna Loa, measures them with their own calibration gases, and surely would be happy to catch NOAA in any fault. Despite that, their results don't differ by more than 0.2 ppm from those of NOAA.
Thus while there is only one official organisation which delivers the calibration gases, these are crossvalidated by several other laboratories of other organisations.
See further the calibration procedures as used at Mauna Loa and other stations:
http://www.esrl.noaa.gov/gmd/ccgg/about/co2_measurements.html
Then, Mauna Loa is not the official “global” CO2 level or trend. The “global” CO2 level is the average of several ground level stations, Mauna Loa not included, but that hardly makes any difference for the trend:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/co2_trends.jpg
But the Mauna Loa trend is often used, as it is the longest continuous record of CO2 (the South Pole started earlier, but has a gap of a few years).
So that is already a start…

Lil Fella from OZ
November 14, 2013 1:33 pm

Excellent article. Comment on Strong is spot on. World control! Terrible consideration when we have witnesed such corruption in attempt to uphold ‘climate change.’

Bob Greene
November 14, 2013 1:34 pm

Good article, Dr. Ball.
I've always been impressed by the thought that the global atmospheric concentration is based on one measuring station on an active volcano, with instruments that are not subject to any other calibration checks except that station's. That certainly solves the age-old problem of getting two instruments to produce exactly the same values and having some referee checks for accuracy and precision. Dadgum, I sure do wish I had that luxury when I was running a QC lab. On second thought, I don't, because I wanted some assurance that the numbers we put out were correct and reproducible within statistical error.
Now on to the amazing CO2 transport phenomenon by which the global atmospheric CO2 concentration is the same today as the measurement in Hawaii. Seems to go against my experience with mixing anything.

Tim Clark
November 14, 2013 1:43 pm

Now if there were only a specific or limited amount of energy available for absorption then I could agree with the above statement……
Sam, there is for CO2. Better do some homework on wavelengths.

BLACK PEARL
November 14, 2013 1:44 pm

Sherlock1 says:
November 14, 2013 at 6:32 am
Brilliant essay by Tim Ball…
Er – why isn’t it on every politician’s Christmas present list..?
######
Could send it to UK Climate Change / Energy Minister daveye@parliament.uk
Just in the interests of balance & education, of course

Ferdinand Engelbeen
November 14, 2013 1:56 pm

Gail Combs says:
November 14, 2013 at 11:19 am
This step is why I have major problems with the Assumption that CO2 is well mixed in the atmosphere.
Graph of raw data link
Graph after climatologists get through with the data manipulating: link

Gail, you tell outright lies. I have shown you several times that the two links you provide are from different stations: the noisy one is from Neuglobsow (as can be seen above the graph) near Berlin, Germany, in the midst of a forest and completely unsuitable for "background" measurements, as it is too close to huge sinks and sources. The other is from Mace Head, coastal Ireland, an actual "background" measuring station. Still, the latter shows the raw data.
Once can be a mistake, a second time you may have forgotten, but three times is deliberate.
For those interested, the raw hourly averaged data + stdv of Barrow, Mauna Loa, Samoa and the South Pole are still available at:
ftp://ftp.cmdl.noaa.gov/ccg/co2/in-situ/
You can compare them with the published “cleaned” data for the same stations:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/co2_mlo_spo_raw_select_2008.jpg
Look at the scale of the outliers: +4 ppmv for downslope winds from volcanic vents and -4 ppmv for upslope winds, slightly depleted by vegetation in the valley.
For the yearly trend and average it makes no difference whether you include or exclude the local disturbances. But as we are interested in "background" CO2, throw out the data which are clearly contaminated by local disturbances.

Climate agnostic
November 14, 2013 2:01 pm

I have more confidence in Fred Singer than in Tim Ball, who has no background in climate science other than as a promoter of "pseudo-skepticism", including conspiracy theories.
Here is what Singer has to say in an article, Tale of two Climate Hockeysticks, in American Thinker:
“A more detailed interpretation of the CO2 curve leads to these additional conclusions:
2. Various skeptics have suggested that CO2 levels were higher during the 19th century than they are today. There is nothing wrong per se with these old measurements — though they were performed by old-fashioned chemical methods rather than current infrared techniques. It just means that the data obtained were contaminated and were not representative of global concentrations of free-atmosphere CO2. Antarctic is reasonably free of contamination.
3. It is often claimed by skeptics that the human contribution to atmospheric CO2 (from fossil-fuel burning) is tiny — less than a percent. The data clearly show that the contribution is 400 minus 280 parts per million (ppm) — roughly 30% of the current concentration.
4. Extreme skeptics have often claimed that George Callender, the British pioneer of the global-warming story during the early 20th century, was hiding some higher CO2 values from ice cores that approached present values. This does not seem to be the case.
5. From time to time, skeptics have claimed that the CO2 increase was mainly due to global warming, which caused the release of dissolved CO2 from the ocean surface into the atmosphere. (A recent adherent of this hypothesis is Prof. Murry Salby in Australia.) However, the evidence appears to go against such an inverted causal relation. While this process may have been true during the ice ages, the isotope evidence seems to indicate that the human contribution from fossil-fuel burning clearly dominates during the last 100 years.
6. Finally, note that the temperature ‘blade’ starts around 1910, while CO2 starts its sharp upward climb around 1780AD.”
http://www.americanthinker.com/2013/08/a_tale_of_two_climate_hockeysticks.html#ixzz2h7ZyRVmo
You can very well stick to the facts and still be a skeptic.

more soylent green!
November 14, 2013 2:16 pm

Mauna Loa is considered an active volcano (http://volcano.oregonstate.edu/what-are-active-volcanoes-hawaii-and-what-their-status). “Active” does not mean it’s currently erupting like Kilauea.

Rosarugosa
November 14, 2013 2:25 pm

“…You can stop the engine in two ways; cut off the fuel supply or plug the exhaust…”
Well no, actually you can stop the engine in 4 ways; the other two are "stop the air supply" and "apply an overload".
A bit pedantic, I suppose, but a rotten analogy sort of seriously weakens the argument.

Jquip
November 14, 2013 2:39 pm

Ferdinand Engelbeen: “But as we are interested in “background” CO2, throw out the data which are clearly contaminated by local disturbances.”
Unfortunately, we don't have any reliable calibration of the instruments that measure that. So if we're throwing out unreliable data, out goes, apparently, every CO2 station.

Robert
November 14, 2013 2:56 pm

I don't understand why the push from the nuclear industry is overlooked. A very influential group of scientists (I believe one was Director of Central Intelligence for a year and is a chemistry professor now) jumped on the bandwagon in 1980, if not earlier. The group was founded by nuclear physicists in the 50s.
The move to electric cars was pushed in the 90s, and a 2006 documentary tried to convince us that Big Oil stopped the development of the electric car, when battery technology was always the issue. Why do it, when the sudden surge in demand for electricity would not be met by renewable energy sources? A sudden shift to electric cars would only mean more nuclear power plants would be needed.
And Graham from Sydney: Shakespeare couldn't spell his own name. Biggest imbecile that there ever was?

Ferdinand Engelbeen
November 14, 2013 3:00 pm

About the late Ernst Beck.
Please don't hold it against me that I don't agree to a large extent with the results of the late Ernst Beck. I had years of courteous discussions with him before his untimely death. So, while he can't defend his work anymore, I can't let it pass without comment.
The problem with most of the historical measurements is not the equipment (with a few exceptions), which in general was accurate to +/- 10 ppmv. The main problem was where the measurements were taken: in the middle of towns (Paris, Boston, …), in forests (diurnal variation of hundreds of ppm), under and in between the leaves of growing crops, …
Ernst Beck lumped all the data together: the good, the bad and the ugly. Thus his results were the average of 200 ppmv near leaves in the US and 600 ppmv on a mountain slope in Austria, without much quality control (some data were from equipment accurate to +/- 150 ppmv)… Here are a few days of modern data measured at Linden/Giessen, where a long series of historical data was taken around 1942:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/giessen_background.jpg
All data are raw data, thus including all the outliers at Mauna Loa like volcanic vents etc.
The historical samples were taken three times a day, which alone already gives a bias of +40 ppmv. See further:
http://www.ferdinand-engelbeen.be/klimaat/beck_data.html
A similar problem occurs with stomata data: stomata index data are measured on land plants, which reflect the average CO2 level of the previous growing season, thus with a local bias in CO2 level. The SI is calibrated against direct measurements, firn and ice cores over the past century. But nobody knows how the local bias changed over the previous centuries from land changes in the main wind direction. Even the main wind direction may have changed (MWP-LIA…). Thus if there is a discrepancy between ice core and stomata CO2 data, I prefer the ice core data for the most accurate average, while the stomata data may better reflect the variability over the short term.

Gail Combs
November 14, 2013 3:29 pm

John Whitman says: November 14, 2013 at 1:04 pm
In his guest essay Tim Ball is saying that Maurice Strong is the sufficient and necessary reason for what Ball describes as the ‘why and how the IPCC demonized CO2 with manufactured Information’.
I am not convinced by Tim Ball’s essay. In his finding that it was Maurice Strong….
>>>>>>>>>>>>>>>>>>>>>>>>>>>
Maurice Strong was a key player, but he certainly was not the 'kingpin.' He was just a well-rewarded, intelligent gofer.
The following three excerpts are not from 'conspiracy theorists' but from well-known and respected men…
The two-time Director-General of the World Trade Organization, Pascal Lamy, stated in an article:

The reality is that, so far, we have largely failed to articulate a clear and compelling vision of why a new global order matters — and where the world should be headed. Half a century ago, those who designed the post-war system — the United Nations, the Bretton Woods system, the General Agreement on Tariffs and Trade (GATT) — were deeply influenced by the shared lessons of history.
All had lived through the chaos of the 1930s — when turning inwards led to economic depression, nationalism and war. All, including the defeated powers, agreed that the road to peace lay with building a new international order — and an approach to international relations that questioned the Westphalian, sacrosanct principle of sovereignty…
http://www.theglobalist.com/pascal-lamy-whither-globalization/

So there is the goal stated plainly.
From a book by President Woodrow Wilson, The New Freedom (1913):
Since I entered politics, I have chiefly had men’s views confided to me privately. Some of the biggest men in the United States, in the field of commerce and manufacture, are afraid of somebody, are afraid of something. They know that there is a power somewhere so organized, so subtle, so watchful, so interlocked, so complete, so pervasive, that they had better not speak above their breath when they speak in condemnation of it.
They know that America is not a place of which it can be said, as it used to be, that a man may choose his own calling and pursue it just as far as his abilities enable him to pursue it; because to-day, if he enters certain fields, there are organizations which will use means against him that will prevent his building up a business which they do not want to have built up; organizations that will see to it that the ground is cut from under him and the markets shut against him…

I watched this happen to the corporation I was working for, which was headed by a 'Maverick.' All of a sudden, out of nowhere, a nationwide front-page environmental campaign against his company blew up in his face and just about bankrupted it. This was less than a month before the TV ads were to be aired on a joint venture between McDonald's, Sweetheart Plastic and Polysar announcing a new plant for recycling McDonald's post-consumer polystyrene dinnerware. It was even designed to use handicapped labor. In Massachusetts alone, five plants were closed as the result of a NH grade-school teacher's campaign against polystyrene. Now consider the trouble we have getting a word in edgewise about CAGW. Yeah, right, and I am going to believe some New Hampshire teacher has this much power? (The story was later changed to make some teenager the heroine.)
Finally, President Bill Clinton's mentor Carroll Quigley wrote a book:

The Anglo-American Establishment 1981
pg 3
One wintry afternoon in February 1891, three men were engaged in earnest conversation in London. From that conversation were to flow consequences of the greatest importance to the British Empire and to the world as a whole. For these men were organizing a secret society that was, for more than fifty years, to be one of the most important forces in the formulation and execution of British imperial and foreign policy.
The three men who were thus engaged were already well known in England. The leader was Cecil Rhodes, fabulously wealthy empire-builder and the most important person in South Africa. The second was William T. Stead, the most famous, and probably also the most sensational, journalist of the day. The third was Reginald Baliol Brett, later known as Lord Esher, friend and confidant of Queen Victoria, and later to be the most influential adviser of King Edward VII and King George V.
… the three drew up a plan of organization for their secret society and a list of original members. The plan of organization provided for an inner circle, to be known as “The Society of the Elect,” and an outer circle, to be known as “The Association of Helpers.” Within The Society of the Elect, the real power was to be exercised by the leader, and a “Junta of Three.” The leader was to be Rhodes, and the junta was to be Stead, Brett, and Alfred Milner. In accordance with this decision, Milner was added to the society by Stead …
The creation of this secret society was not a matter of a moment… Rhodes had been planning for this event for more than seventeen years. Stead had been introduced to the plan on 4 April 1889, and Brett had been told of it on 3 February 1890. Nor was the society thus founded an ephemeral thing, for, in modified form, it exists to this day. From 1891 to 1902, it was known to only a score of persons. During this period, Rhodes was leader, and Stead was the most influential member. From 1902 to 1925, Milner was leader, while Philip Kerr (Lord Lothian) and Lionel Curtis were probably the most important members. From 1925 to 1940, Kerr was leader, and since his death in 1940 this role has probably been played by Robert Henry Brand (now Lord Brand).
During this period of almost sixty years, this society has been called by various names. During the first decade or so it was called “the secret society of Cecil Rhodes” or “the dream of Cecil Rhodes.” In the second and third decades of its existence it was known as “Milner’s Kindergarten” (1901-1910) and as “the Round Table Group” (1910-1920). Since 1920 it has been called by various names…

Tough to call these men tin foil hat ‘Conspiracy Theorists’

Climate agnostic
November 14, 2013 3:32 pm

Bob Greene says:
November 14, 2013 at 1:34 pm
“I’ve always been impressed by the thought that the global atmospheric concentration is based on one measuring station on an active volcano on instruments that are not subject to any other calibration checks except that stations.”
There is more than one measuring station, and all of them show the same data. Please read:
http://www.esrl.noaa.gov/gmd/ccgg/insitu.html
