Why and How the IPCC Demonized CO2 with Manufactured Information

Guest essay by Dr. Tim Ball

Elaine Dewar spent several days with Maurice Strong at the UN and concluded in her book The Cloak of Green that Strong was using the U.N. as a platform to sell a global environment crisis and the Global Governance Agenda. Strong speculated about a small group of world leaders who decided the rich countries were the principal risk to the world. These countries refused to reduce their environmental impact. The leaders decided the only hope for the planet was the collapse of the industrialized nations, and that it was their responsibility to bring that about. Strong knew what to do: create a false problem with false science, and use bureaucrats to bypass politicians, close industry down, and make developed countries pay.

Compare an industrialized nation to an internal combustion engine running on fossil fuel. You can stop the engine in two ways: cut off the fuel supply or plug the exhaust. Cutting off the fuel supply is a political minefield. People quickly notice as all prices, especially food, increase. It is easier to show that the exhaust is causing irreparable environmental damage. This is why CO2 became the exclusive focus of the Intergovernmental Panel on Climate Change (IPCC). Process and method were orchestrated to single out CO2 and show it was causing runaway global warming.

In the 1980s I warned Environment Canada employee Henry Hengeveld that convincing a politician of an idea is only part of the problem. Henry’s career involved promoting CO2 as a problem. I explained that the bigger problem comes if you convince them and the claim is then proved wrong. You either admit your error or hide the truth. Environment Canada and the member nations of the IPCC chose to hide or obfuscate the truth.

1. IPCC Definition of Climate Change Was First Major Deception

People were deceived when the IPCC was created. Most believe it is a government commission of inquiry studying all climate change. The actual definition, from Article 1 of the United Nations Framework Convention on Climate Change (UNFCCC), limits them to only human causes:

a change of climate which is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and which is in addition to natural climate variability observed over considerable time periods.

In another deception, the definition used in the first three Reports (1990, 1995, 2001) was changed in the 2007 Report. The change appears only as a footnote in the Summary for Policymakers (SPM):

Climate change in IPCC usage refers to any change in climate over time, whether due to natural variability or as a result of human activity. This usage differs from that in the United Nations Framework Convention on Climate Change, where climate change refers to a change of climate that is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and that is in addition to natural climate variability observed over comparable time periods.

The broader definition was not used in practice because the Reports are cumulative, and including natural variability would have required starting over completely.

It is impossible to determine the human contribution to climate change if you don’t know or understand natural (non-human) climate change. Professor Murray Salby showed how the human CO2 portion is of no consequence, that variation in natural sources of CO2 explains almost all annual changes. He showed that a 5% variation in these sources is more than the total annual human production.
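Salby’s comparison can be sanity-checked with round numbers. A minimal sketch, assuming commonly cited textbook figures for gross carbon fluxes (gigatonnes of carbon per year); these values are illustrative and not taken from Salby:

```python
# Illustrative gross carbon-cycle fluxes in GtC/year (round textbook
# figures): ocean outgassing ~90, land respiration/decay ~120.
natural_sources = 90 + 120      # combined natural CO2 sources
human_emissions = 9             # fossil fuels and cement, roughly

variation = 0.05 * natural_sources   # a 5% swing in natural sources
print(round(variation, 2))           # 10.5 GtC/yr
print(variation > human_emissions)   # True: exceeds total human output
```

With these round figures a 5% swing in the natural sources is larger than total annual human production, which is the shape of Salby’s argument.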

2. The IPCC Infers and Proves Rather Than Disproves a Hypothesis

To make the process appear scientific, a hypothesis was inferred based on the assumptions that:

• CO2 was a greenhouse gas (GHG) that slowed the escape of heat from the Earth.

• the heat was back-radiated to raise the global temperature.

• if CO2 increased, global temperature would rise.

• CO2 would increase because of expanding industrial activity.

• the global temperature rise was inevitable.

To further assure the predetermined outcome, the IPCC set out to prove the hypothesis rather than to disprove it, as scientific methodology requires. As Karl Popper said,

It is the rule which says that the other rules of scientific procedure must be designed in such a way that they do not protect any statement in science against falsification.

The consistent and overwhelming pattern of IPCC behavior reveals misrepresentations of CO2. When an issue was raised by scientists performing their proper role as skeptics, instead of considering and testing its validity the IPCC worked to divert attention, even creating false explanations. The false answers succeeded because most people did not know they were false.

3. CO2 Facts Unknown to Most But Problematic to IPCC.

Some basic facts about CO2 are unknown to most people and illustrate the discrepancies and differences between IPCC claims and what science knows.

• Natural levels of carbon dioxide (CO2) are less than 0.04% of the total atmosphere and 0.4% of the total GHG. It is not the most important greenhouse gas.

• Water vapour is 95 percent of the GHG by volume. It is the most important greenhouse gas by far.

• Methane (CH4) is the other natural GHG demonized by the IPCC. It is only 0.000175 percent of atmospheric gases and 0.036 percent of GHG.

• Figure 1, from ABC News, shows the false information. The effect is achieved by considering a dry atmosphere.


Figure 1

• The percentages troubled the IPCC, so they amplified the importance of CO2 by estimating the “contribution” per unit (Figure 2). The range of estimates effectively makes the measures meaningless, unless you have a political agenda. Wikipedia acknowledges, “It is not possible to state that a certain gas causes an exact percentage of the greenhouse effect.”


Figure 2 (Source Wikipedia)
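The ppm-to-percent arithmetic behind these bullets is simple. A quick sketch using commonly cited round-number concentrations (illustrative values, not measurements from the essay):

```python
# Convert a trace-gas concentration in parts per million (ppm)
# to a percentage of the whole atmosphere (1% = 10,000 ppm).
def ppm_to_percent(ppm: float) -> float:
    return ppm / 10_000

co2_ppm = 400   # approximate modern CO2 level
ch4_ppm = 1.8   # approximate modern methane level

print(f"CO2: {ppm_to_percent(co2_ppm)}% of the atmosphere")   # 0.04%
print(f"CH4: {ppm_to_percent(ch4_ppm)}% of the atmosphere")
```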

4. Human CO2 Production Is Critical to the IPCC Objective, So They Control Production of the Information

Here is their explanation.

What is the role of the IPCC in Greenhouse Gas inventories and reporting to the UNFCCC?

A: The IPCC has generated a number of methodology reports on national greenhouse gas inventories with a view to providing internationally acceptable inventory methodologies. The IPCC accepts the responsibility to provide scientific and technical advice on specific questions related to those inventory methods and practices that are contained in these reports, or at the request of the UNFCCC in accordance with established IPCC procedures. The IPCC has set up the Task Force on Inventories (TFI) to run the National Greenhouse Gas Inventory Programme (NGGIP) to produce this methodological advice. Parties to the UNFCCC have agreed to use the IPCC Guidelines in reporting to the convention.

How does the IPCC produce its inventory Guidelines? Utilising IPCC procedures, nominated experts from around the world draft the reports that are then extensively reviewed twice before approval by the IPCC. This process ensures that the widest possible range of views are incorporated into the documents.

They control the entire process, from methodology, designation of technical advice, establishment of task forces, and guidelines for reporting, to nomination of the experts who produce the reports and final report approval. The figure they produce is a gross calculation; roughly 50% of that amount is estimated to be removed from the atmosphere by natural sinks.

Regardless, if you don’t know natural sources and variabilities of CO2 you cannot know the human portion. It was claimed the portion in the atmosphere from combustion of fossil fuels was known from the ratio of carbon isotopes C13/C12. Roy Spencer showed this was not the case. In addition, they ignore natural burning of fossil fuels including forest fires, long-burning coal seams and peat; as Hans Erren noted, fossil coal is buried wood. Spencer concluded,

If the C13/C12 relationship during NATURAL inter-annual variability is the same as that found for the trends, how can people claim that the trend signal is MANMADE??

The answer is, it was done to prove the hypothesis and further the deception.
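The isotope argument reduces to mixing arithmetic. A minimal two-source mass-balance sketch; the delta-13C signatures below are illustrative textbook values (fossil carbon is isotopically “light”), not data from Spencer’s analysis:

```python
# Flux-weighted mixing of carbon-isotope signatures (d13C, per mil).
# The atmosphere's d13C moves toward the weighted mean of its sources.
def mixture_d13c(flux_a, d13c_a, flux_b, d13c_b):
    return (flux_a * d13c_a + flux_b * d13c_b) / (flux_a + flux_b)

# Illustrative: 1 unit of fossil-fuel carbon (~ -28 per mil) mixing with
# 9 units from a natural source (~ -7 per mil, near the atmosphere's own).
print(mixture_d13c(1.0, -28.0, 9.0, -7.0))   # -9.1
```

Spencer’s point, in these terms, is that if natural sources can carry similarly light signatures, the weighted mean alone cannot identify which source produced a given trend.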

5. Pressure For Urgent Political Action

Early IPCC Reports claimed that CO2 remains in the atmosphere for a very long time. This implied it would continue to be a problem even with immediate cessation of CO2 production. However, as Segalstad wrote,

Essenhigh (2009) points out that the IPCC (Intergovernmental Panel on Climate Change) in their first report (Houghton et al., 1990) gives an atmospheric CO2 residence time (lifetime) of 50-200 years [as a “rough estimate”]. This estimate is confusingly given as an adjustment time for a scenario with a given anthropogenic CO2 input, and ignores natural (sea and vegetation) CO2 flux rates. Such estimates are analytically invalid; and they are in conflict with the more correct explanation given elsewhere in the same IPCC report: “This means that on average it takes only a few years before a CO2 molecule in the atmosphere is taken up by plants or dissolved in the ocean.”

6. Procedures to Hide Problems with IPCC Science And Heighten Alarmism.

IPCC procedures and mechanisms were established to deceive. The IPCC has three Working Groups (WG). WGI produces the Physical Science Basis Report, which claims to prove CO2 is the cause. WGII produces the Impacts, Adaptation and Vulnerability Report, which is based on the results of WGI. WGIII produces the Mitigation of Climate Change Report. WGII and WGIII accept WGI’s claim that warming is inevitable. They state,

Five criteria that should be met by climate scenarios if they are to be useful for impact researchers and policy makers are suggested: Criterion 1: Consistency with global projections. They should be consistent with a broad range of global warming projections based on increased concentrations of greenhouse gases. This range is variously cited as 1.4°C to 5.8°C by 2100, or 1.5°C to 4.5°C for a doubling of atmospheric CO2 concentration (otherwise known as the “equilibrium climate sensitivity”).

They knew few would read or understand the Science Report with its admission of serious limitations. They deliberately delayed its release until after the Summary for Policymakers (SPM). As David Wojick explained,

Glaring omissions are only glaring to experts, so the “policymakers,” including the press and the public, who read the SPM will not realize they are being told only one side of a story. But the scientists who drafted the SPM know the truth, as revealed by the sometimes artful way they conceal it.

What is systematically omitted from the SPM are precisely the uncertainties and positive counter evidence that might negate the human interference theory. Instead of assessing these objections, the Summary confidently asserts just those findings that support its case. In short, this is advocacy, not assessment.

An example of this SPM deception occurred with the 1995 Report. The 1990 Report and the drafted 1995 Science Report said there was no evidence of a human effect. Benjamin Santer, as lead author of Chapter 8, changed the text for the 1995 SPM drafted by his fellow Chapter 8 authors, which said,

While some of the pattern-based studies discussed here have claimed detection of a significant climate change, no study to date has positively attributed all or part of the climate change observed to man-made causes.

to read,

The body of statistical evidence in chapter 8, when examined in the context of our physical understanding of the climate system, now points to a discernible human influence on the global climate.

The phrase “discernible human influence” became the headline, as planned.

With AR5 (2013) they compounded the deception by releasing the SPM and then releasing a correction. They got the headline they wanted. It is the same game as the difference between the exposure of problems in the WGI Science Report and the SPM. The media did not report the corrections, but the IPCC could now claim they had detailed the inadequacy of their work. It’s not their fault that people don’t understand.

7. Climate Sensitivity

Initially it was assumed that constantly increasing atmospheric CO2 created constantly increasing temperature. Then it was determined that the first few parts per million achieve most of the greenhouse capacity of CO2. Eschenbach graphed the reality (Figure 3).

Figure 3

It is like black paint on a window. To block sunlight coming through a window, the first coat of black paint achieves most of the reduction. Each subsequent coat blocks fractionally less light.
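The “coats of paint” behavior is usually written as a logarithmic response. A sketch using the commonly cited simplified forcing expression (Myhre et al. 1998); this is the standard textbook form, not a formula taken from Eschenbach’s graph:

```python
import math

# Simplified CO2 radiative forcing: delta_F = 5.35 * ln(C / C0) W/m^2,
# so each doubling adds the same increment and each extra ppm matters less.
def co2_forcing(c_ppm: float, c0_ppm: float = 280.0) -> float:
    return 5.35 * math.log(c_ppm / c0_ppm)

for c in (280, 560, 1120):               # successive doublings
    print(c, round(co2_forcing(c), 2))   # increments of ~3.71 W/m^2 each
```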

There was immediate disagreement about the amount of climate sensitivity from doubling and tripling atmospheric CO2. Milloy produced a graph comparing three different sensitivity estimates (Figure 4).


Figure 4.

The IPCC created a positive feedback to keep temperatures rising. It claims CO2 causes a temperature increase that increases evaporation, and the added water vapour amplifies the temperature trend. Lindzen and Choi discredited this in their 2011 paper, which concluded, “The results imply that the models are exaggerating climate sensitivity.”
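The amplification being debated has the standard linear-feedback form. A sketch with placeholder numbers; the no-feedback response and feedback factors below are illustrative, not values from Lindzen and Choi:

```python
# Linear feedback: T = T0 / (1 - f), where T0 is the no-feedback
# temperature response and f is the net feedback factor (f -> 1 is runaway).
def amplified_response(t0: float, f: float) -> float:
    if f >= 1:
        raise ValueError("f >= 1 implies a runaway response")
    return t0 / (1 - f)

t0 = 1.1                                       # illustrative no-feedback warming
print(amplified_response(t0, 0.5))             # positive feedback: 2.2
print(round(amplified_response(t0, -0.5), 3))  # negative feedback damps it
```

In these terms, Lindzen and Choi’s argument is that the observed net feedback factor is smaller (or negative) compared with what the models assume.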

Estimates of climate sensitivity have declined since and gradually approach zero. A recent paper by Spencer claims the “…climate system is only about half as sensitive to increasing CO2 as previously believed.”

8. The Ice Cores Were Critical, But Seriously Flawed.

The major assumption of the inferred IPCC hypothesis says a CO2 increase causes a temperature increase. After publication in 1999 of Petit et al., the Antarctic ice core records appeared as evidence in the 2001 Report (Figure 5).


Figure 5. Antarctic ice core record

Four years later, research showed the reverse: temperature increase preceded CO2 increase, contradicting the hypothesis. It was sidelined with the diversionary claim that the lag was between 80 and 800 years and therefore insignificant. It was troubling enough that Al Gore created deceptive imagery in his movie. Only a few experts noticed.

Actually, temperature changes before CO2 change in every record for any period or duration. Figure 6 shows a shorter record (1958-2009) of the relationship. If CO2 change follows temperature change in every record, why are all computer models programmed with the opposite relationship?


Figure 6. Lag time for short record, 1958 to 2009.
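Lead/lag claims like the one in Figure 6 are usually tested with lagged correlation. A minimal sketch on synthetic data with a known 6-step lag (the series here are invented for illustration, not climate records):

```python
import numpy as np

# Estimate the lag between two series by maximizing the lagged correlation.
def best_lag(x, y, max_lag):
    """Return the lag (in samples) at which x, shifted back by `lag`,
    best correlates with y. Positive lag means x lags y."""
    def corr_at(lag):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag] if lag else y
        else:
            a, b = x[:lag], y[-lag:]
        return np.corrcoef(a, b)[0, 1]
    return max(range(-max_lag, max_lag + 1), key=corr_at)

rng = np.random.default_rng(42)
base = np.cumsum(rng.normal(size=506))   # random-walk "temperature"
temp = base[6:]                          # temperature series
co2 = base[:-6] + rng.normal(scale=0.05, size=500)  # same walk, 6 steps behind

print(best_lag(co2, temp, 24))           # recovers the built-in lag of 6
```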

9. IPCC Needed Low Pre-Industrial CO2 Levels

A pre-industrial CO2 level lower than today was critical to the IPCC hypothesis. It was like the need to eliminate the Medieval Warm Period because it showed the world was not warmer today than ever before.

Ice cores are not the only source of pre-industrial CO2 levels. There are thousands of 19th Century direct measures of atmospheric CO2 that began in 1812. Scientists took precise measurements with calibrated instruments as Ernst Beck thoroughly documented.

In a paper submitted to the US Senate Committee on Commerce, Science, and Transportation Hearing, Professor Zbigniew Jaworowski stated,

The basis of most of the IPCC conclusions on anthropogenic causes and on projections of climatic change is the assumption of low level of CO2 in the pre-industrial atmosphere. This assumption, based on glaciological studies, is false.[1]

Of equal importance Jaworowski states,

The notion of low pre-industrial CO2 atmospheric level, based on such poor knowledge, became a widely accepted Holy Grail of climate warming models. The modelers ignored the evidence from direct measurements of CO2 in atmospheric air indicating that in 19th century its average concentration was 335 ppmv[11] (Figure 2). In Figure 2 encircled values show a biased selection of data used to demonstrate that in 19th century atmosphere the CO2 level was 292 ppmv[12]. A study of stomatal frequency in fossil leaves from Holocene lake deposits in Denmark, showing that 9400 years ago CO2 atmospheric level was 333 ppmv, and 9600 years ago 348 ppmv, falsify the concept of stabilized and low CO2 air concentration until the advent of industrial revolution [13].

There are other problems with the ice core record. It takes years for air to be trapped in the ice, so what is actually trapped and measured? Meltwater moving through the ice, especially when the ice is close to the surface, can contaminate the bubbles. Bacteria form in the ice, releasing gases even in 500,000-year-old ice at considerable depth (Detection, Recovery, Isolation and Characterization of Bacteria in Glacial Ice and Lake Vostok Accretion Ice. Brent C. Christner, 2002 Dissertation, Ohio State University). Pressure of the overlying ice causes a change below 50 m, where brittle ice becomes plastic and begins to flow. The layers formed with each year of snowfall gradually disappear with increasing compression. It requires a considerable depth of ice over a long period to obtain a single reading at depth. Jaworowski identified the problems of contamination and losses during the drilling and core recovery process.

Jaworowski’s claim that the modellers ignored the 19th century readings is incorrect. They knew about them, because T. R. Wigley introduced the 19th century readings to the climate science community in 1983 (Wigley, T.M.L., 1983. “The pre-industrial carbon dioxide level.” Climatic Change 5, 315-320). However, he cherry-picked from a wide range, eliminating the high readings and ‘creating’ a pre-industrial level of approximately 270 ppm. I suggest this is what influenced the modellers, because Wigley was working with them as Director of the Climatic Research Unit (CRU) at East Anglia. He preceded Phil Jones as Director and was the key person directing the machinations revealed by the leaked CRU emails.

Wigley was not the first to misuse the 19th century data, but he did reintroduce it to the climate community. Guy Stewart Callendar, a British steam engineer, pushed the thesis that increasing CO2 was causing warming. He did what Wigley later did, selecting only those readings that supported the hypothesis.

There are 90,000 samples from the 19th century and the graph shows those carefully selected by G. S. Callendar to achieve his estimate. It is clear he chose only low readings.


Figure 7. (After Jaworowski; trend lines added)

You can see how the slope and trend of the selected data differ from those of the entire record.

Ernst-Georg Beck confirmed Jaworowski’s research. An article in Energy and Environment examined the readings in great detail and validated their findings. In his conclusion Beck states,

Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC. Review of available literature raise the question if these authors have systematically discarded a large number of valid technical papers and older atmospheric CO2 determinations because they did not fit their hypothesis? Obviously they use only a few carefully selected values from the older literature, invariably choosing results that are consistent with the hypothesis of an induced rise of CO2 in air caused by the burning of fossil fuel.

The pre-industrial level is some 50 ppm higher than the level claimed.

Beck found,

Since 1812, the CO2 concentration in northern hemispheric air has fluctuated, exhibiting three high-level maxima around 1825, 1857 and 1942, the latter showing more than 400 ppm.

The challenge for the IPCC was to create a smooth transition from the ice core CO2 levels to the Mauna Loa levels. Beck shows how this was done but also shows how the 19th century readings had to be cherry-picked to fit with ice core and Mauna Loa data (Figure 8).


Figure 8

Variability is extremely important because the ice core record shows an exceptionally smooth curve, achieved by applying a 70-year smoothing average. Selecting and smoothing are also applied to the Mauna Loa data and all current atmospheric readings, which naturally vary by up to 600 ppm in the course of a day. Smoothing on the scale of the ice core record eliminates a great deal of information. Consider the variability of temperature data for the last 70 years. Statistician William Briggs says you should never, ever smooth a time series. Elimination of high readings prior to the smoothing makes the losses greater. Beck explains how Charles Keeling established the Mauna Loa readings by using the lowest readings of the afternoon and ignoring natural sources. Beck presumes Keeling decided to avoid these low-level natural sources by establishing the station at 4000 m up the volcano. As Beck notes,

“Mauna Loa does not represent the typical atmospheric CO2 on different global locations but is typical only for this volcano at a maritime location in about 4000 m altitude at that latitude.” (Beck, 2008, “50 Years of Continuous Measurement of CO2 on Mauna Loa,” Energy and Environment, Vol. 19, No. 7.)

Keeling’s son operates Mauna Loa and, as Beck notes, “owns the global monopoly of calibration of all CO2 measurements.” He is a co-author of the IPCC reports, which accept Mauna Loa and all other readings as representative of global levels.
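The information loss from heavy smoothing described above is easy to demonstrate. A minimal sketch on a synthetic series (the trend and swing sizes are invented for illustration, not CO2 data): a roughly 70-point moving average wipes out large short-term swings, leaving only the slow trend.

```python
import statistics

# Centered moving average; edge windows are shortened rather than dropped.
def moving_average(series, window):
    half = window // 2
    return [
        statistics.mean(series[max(0, i - half): i + half + 1])
        for i in range(len(series))
    ]

years = range(200)
raw = [300 + 0.1 * t + 25 * (-1) ** t for t in years]  # trend + big swings
smooth = moving_average(raw, window=70)

print(round(max(raw) - min(raw), 1))        # raw range includes the swings
print(round(max(smooth) - min(smooth), 1))  # smoothed range is mostly trend
```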

As a climatologist I know it is necessary to obtain as many independent verifications of data as possible. Stomata are small openings on leaves whose density varies inversely with the amount of atmospheric CO2. They underscore the effects of smoothing and the artificially low readings of the ice cores. A comparison of a stomata record with the ice core record for a 2000-year period (9000 – 7000 BP) illustrates the issue (Figure 9).


Figure 9.

Stomata data show higher readings and variability than the excessively smoothed ice core record. They align quantitatively with the 19th century measurements as Jaworowski and Beck assert. The average level for the ice core record shown is approximately 265 ppm while it is approximately 300 ppm for the stomata record.

The pre-industrial CO2 level was marginally lower than current levels and likely within the error factor. Neither it, nor the present level of 400 ppm the IPCC cites, is high relative to the geologic record. The entire output of computer climate models begins with the assumption that pre-industrial levels were measurably lower. Eliminating that assumption further undermines the claim that warming in the industrial era was due to human addition of CO2 to the atmosphere. Combine this with the assumption that CO2 causes temperature increase, when all records show the opposite, and it is not surprising that IPCC predictions of temperature increase are consistently wrong.

The IPCC deception was premeditated under Maurice Strong’s guidance to prove CO2 was causing global warming as a pretext for shutting down industrialized nations. They partially achieved their goal, as alternative energies and green-job economies attest. All this occurred even as contradictory evidence mounted, because Nature refused to play along. CO2 increases as temperatures decline, which according to IPCC science cannot happen. Politicians must deal with the facts and abandon all policies based on claims that CO2 is a problem, especially those already causing damage.


Source: The Global Warming Policy Foundation: CCNet 14/10/13


[1] “Climate Change: Incorrect information on pre-industrial CO2.” Statement written for the Hearing before the US Senate Committee on Commerce, Science, and Transportation by Professor Zbigniew Jaworowski, March 19, 2004.

Sherlock1
November 14, 2013 6:32 am

Brilliant essay by Tim Ball…
Er – why isn’t it on every politician’s Christmas present list..?

Genghis
November 14, 2013 6:43 am

It might be helpful to remember the omissions too, like the warming was supposed to accelerate, the warming was supposed to create more CO2, the warming was supposed to create worse weather, the warming was supposed to accelerate the rising seas, etc. It is all about the acceleration.

Cheshirered
November 14, 2013 7:10 am

The whole AGW charade now beggars belief. Look at the scoreboard as things stand:
Long-held ice core data that flatly contradicts claimed cause and effect.
Pitiful CO2 sensitivity – with vast margins of error for good measure.
Completely failed computer models with zero predictive skills.
Missing heat.
No ‘pause’ predicted.
Etc etc…
And now the latest of a string of humiliations – an absolutely feeble attempt to re-write history to explain away the most humiliating failure of all – a totally unpredicted standstill.
‘Pathetic’ is too kind a word to explain this garbage.

beng
November 14, 2013 7:14 am

So, the warmunist’s strategy boils down to sticking a potato in the exhaust pipe…

ferd berple
November 14, 2013 7:24 am

RockyRoad says:
November 13, 2013 at 7:17 pm
So what’s in it for Maurice Strong?
==============
Kyoto, Molten metals, IPCC, Oil for Food, CCX. Same names, over and over.
http://www.canadafreepress.com/2005/cover120905.htm
http://old.nationalreview.com/rosett/rosett200601102128.asp
http://old.nationalreview.com/pdf/strong.pdf
http://www.canadafreepress.com/2005/strong-cheque.htm
http://www.canadafreepress.com/index.php/article/9629

michael hart
November 14, 2013 7:24 am

Not everyone is familiar with Maurice Strong. An introductory sentence or two would improve the article.

Dennis Ambler
November 14, 2013 7:32 am

For much more on the politics of the UN and the IPCC and some more background on Maurice Strong’s nurturing of the “Environmental Movement” and its NGO’s, check out:
“United Socialist Nations – Progress on Global Governance via Climate Change, Sustainable Development and Bio-Diversity”
http://scienceandpublicpolicy.org/originals/un_progress_governance_via_climate_change.html
Strong was responsible for the first Earth Summit in Stockholm, in 1972, The United Nations Conference on the Human Environment, (UNCHE), as its Secretary-General. He initiated and was a member of the Brundtland Report which led to Agenda 21.
“….in 1994, Maurice Strong, (Earth Council) and former Soviet leader Mikhail Gorbachev, (Green Cross International) relaunched the Earth Charter as a “civil society” initiative. As the architect of the United Nations Environment Program and the United Nations Development Program, (UNEP-UNDP), Strong had for many years co-ordinated and strengthened the integration of Non-Governmental Organisations, (NGO’s) into the UN environmental bodies.
In Geneva in 1973, he launched the “World Assembly of NGO’s concerned with the Global Environment”. He realised that for his ambitions of a UN world government to become reality he needed the vast networking opportunities offered by the NGO’s, now referred to as “Civil Society”.
By offering them involvement and a perception of power he brought them on board and certainly the UN could not have developed as far as it has without them. NGO’s are now involved in all UN bodies and are major contributors to the IPCC reports. They have helped to build this all-encompassing bureaucracy into the behemoth that it now is.”
COMMISSION ON GLOBAL GOVERNANCE, (CGG)
“The CGG was established in 1992, after Rio, at the suggestion of Willy Brandt, former West German chancellor and President of the Socialist International.
It recommended that “user fees” should be imposed on companies operating in the “global commons.” Such fees could be collected on international airline tickets, ocean shipping, deep-sea fishing, activities in Antarctica, geostationary satellite orbits, and electromagnetic spectrum. The main revenue stream would be carbon taxes, to be levied on all fossil fuels. “A carbon tax,” the report said, would yield very large revenues indeed.”
Sound familiar? Conspiracy theory? These things are all part of a Vast Nexus of Influence, http://eureferendum.blogspot.co.uk/2009/12/vast-nexus-of-influence.html
They seek global laws and global punishment:
http://www.inece.org/principles/PrinciplesHandbook_23sept09.pdf
“The International Network for Environmental Compliance and Enforcement (INECE) is a
partnership of more than 3,000 government and non-government enforcement and compliance
practitioners from more than 150 countries.”
Convention on Bio-Diversity – Nagoya Declaration
http://www.cbd.int/nagoya/outcomes/other/
Nagoya Declaration on Local Authorities and Biodiversity
From 24 to 26 October, 2010, parallel to COP 10, 679 participants including more than 240 mayors, governors and top local government executives met at the City Biodiversity Summit 2010 to exchange experiences on local biodiversity management and support the endorsement, by Parties, of a plan of action on sub-national and local governments.
Nagoya Declaration on Parliamentarians and Biodiversity
120 legislators from 38 Parties to the Convention participated in the Parliamentarians and Biodiversity Forum on October 25 and 26, 2010, co-organized by GLOBE International and its Japan chapter, and the Secretariat of the CBD
Just last week, this conference was held: http://biodiversity-l.iisd.org/events/interpol-unep-international-environmental-compliance-and-enforcement-conference/
Climate Change is simply the diversion.

ferd berple
November 14, 2013 7:39 am

Pat says:
November 13, 2013 at 11:11 pm
So if CO2 was making the sea acidic and killing the reef, why isn’t the CO2 being mixed by sea currents and “killing off” the surrounding reefs?
===============
because climate science doesn’t bother to study chemistry. the salt dissolved in the oceans is formally known in chemistry as a buffer. You cannot change a base (the oceans are a base, not an acid) to an acid without also driving the buffer out of solution. As you add CO2 to the oceans, salt rich rocks will form on the bottom of the oceans, which neutralizes the acid. This process will continue until you either run out of CO2 or run out of salt in the ocean. the enormous volume of dissolved salts makes the latter physically impossible.

Tom in Indy
November 14, 2013 7:53 am

Jan P Perlwitz says:
November 14, 2013 at 5:02 am
What a ridiculous nonsense by Ball.

Care to elaborate? Or are we supposed to take your word for it? Can you at least give us a hint as to why we should take your claim as expert opinion?

ferd berple
November 14, 2013 8:00 am

The enormous deposits of marine limestone worldwide are evidence of the physical process by which the oceans turn CO2 into rock, and thereby neutralize the ability of CO2 to acidify the oceans. limestone is fossilized CO2.
climate science observes that acid dissolves limestone, so they propose that adding CO2 to the oceans will dissolve limestone. Nothing could be further from the truth. Limestone contains CO2. Adding CO2 to the oceans must increase the precipitation of limestone, until such time as the oceans run out of calcium salts. Over billions of years, with CO2 levels much higher than present, that has never happened.

ferd berple
November 14, 2013 8:11 am

Jan P Perlwitz says:
November 14, 2013 at 5:02 am
Can you at least give us a hint as to why we should take your claim as expert opinion?
==========
Isn’t Perlwitz paid by NASA GISS? Isn’t NASA GISS the one that used billions in taxpayer dollars to invent the rocket ship that has taken generations of space travelers to visit the exotic planet called “earth”? Only to find that it was already inhabited by “earthlings”.
Didn’t this allow those pesky rocket scientists, with their oh so annoying attention to facts and detail, to be replaced by the much more reliable politically correct scientists? Didn’t a bunch of the ex-NASA rocket scientists write a letter to this effect?

wayne
November 14, 2013 8:12 am

Jquip: “The problem with such a condition is that you are not calibrating per se. You are not setting each level in a standard way to as fixed and unchanging a quantity as possible. You are synchronizing, much like you do with clocks. Pick one clock in your house and set all other clocks to the same value. “
So what you seem to be saying is that Keeling is synchronizing all of the CO2 measuring stations around the globe, not necessarily giving them a perfect CO2 calibrating sample of a known, accurate concentration. If every station is off, they can correct after the fact at Mauna Loa, which determines how far off everyone is from a fixed concentration that they “know” is correct. Just like your example of all clocks being off by the same amount, it doesn’t matter as long as they are all off by the same amount; the mother station determines the accurate point and adjusts accordingly.
Interesting, I always thought Keeling sent out a calibration gas sample of a known and accurate concentration which has its own set of problems. In that case if the calibration samples are off, everyone is off but since the calibration was deemed accurate no mother station adjustments would ever be made after the fact as they are assimilated each month.
I guess since they know from energy records how much co2 rises each month it becomes quite easy to determine who is correct and accurate. 😉
Now I wonder which it is, synchronization or calibaration?

November 14, 2013 8:33 am

Nice story, pity that you still cling to wrong CO2 diffusion physics.

sergeiMK
November 14, 2013 8:46 am

Dr Ball you say of the ice cores:
“It was sidelined with the diversionary claim that the lag was between 80 and 800 years and insignificant”
then you also say of CO2vs temp record
“The temperature can be seen to lead by 6 months”
How can it lead by 6 months and also by 80 to 800 years?
If temperature changes (looking at yearly data) can initiate a change in CO2 within 6 months, then why, a few hundred thousand years ago, was it so much more sluggish?
Perhaps the ocean plankton is the cause of your annual changes, as the phytoplankton change from using CO2 and producing O2 (during summer days) to using O2 and producing CO2 (during the long winter nights)?

G. Karst
November 14, 2013 9:03 am

Dr. Ball – you have performed a most excellent autopsy. However, the cadaver is still walking about and doesn’t seem to know it is a zombie. GK

Bruce Cobb
November 14, 2013 9:26 am

Jan P Perlwitz says:
November 14, 2013 at 5:02 am
What a ridiculous nonsense by Ball.
Which part threatened your demented CAGW worldview most? Inquiring minds wish to know.

November 14, 2013 9:33 am

If the IPCC were Pinocchio, its nose would stretch from here to the moon and back. Then we could cut it off, use it as biomass, and send it to Drax.

Gail Combs
November 14, 2013 9:44 am

What great timing Dr. Ball,
I was just having a go round on this subject at Towards a theory of climate By Christopher Monckton of Brenchley
And I referenced Dr. Jaworowski. I was told his “paper” linked to the Hearing before the US Senate Committee was not peer-reviewed. It was “…a 1997 rant from Jaworowski” link
And:
“…Gail, please let the late Jaworowski rest in peace, together with his ideas about ice cores. All of his objections were already refuted in 1996 by the drilling of 3 cores at Law Dome by Etheridge e.a. link
And:
in Response to someone else’s comment: “We know the method by which they were spliced together is also questionable.”
Came this comment:
“That is pure nonsense: the late Jaworowski, from whom this story comes, did look at the wrong column in Neftel’s Siple Dome ice core table: he used the column of the age of the ice instead of the average gas age to compare with the Mauna Loa data. But as far as I know most of the CO2 is in the gas phase, which is much younger than the ice at the same depth” link
(The adjusting of the age of the gas to match up with the Mauna Loa data was, of course, exactly what Dr. Jaworowski was objecting to, so that is a real whopper.)
Boy, do they dislike Dr. Jaworowski’s work, and the poor man is no longer here to defend himself. So I am glad someone who is much more familiar with Dr. Jaworowski and with Mr. Beck is around to defend their work. Thank you, Dr. Ball, for this article.

Jquip
November 14, 2013 10:01 am

wayne: “Interesting, I always thought Keeling sent out a calibration gas sample of a known and accurate concentration which has its own set of problems. In that case if the calibration samples are off, everyone is off but since the calibration was deemed accurate no mother station adjustments would ever be made after the fact as they are assimilated each month.”
Yes, exactly. And assuming all the best integrity of Keeling: Keeling is calibrating the gas samples with his own equipment, but as he’s the only calibrator, he can only calibrate his equipment with his own gas samples. If it were any other way, everyone would be able to calibrate against an independent mechanism. So it is purely synchronization in this case, which means that we cannot state any absolute value with certainty, or any anomaly value with certainty; whatever the trend over time does is a reflection not of “reality” outside Mauna Loa, or even at Mauna Loa, but a direct reflection of the speed and direction at which the equipment there drifts out of calibration.
In the absolute best case, with the most diligent and trustworthy actors, we are only measuring how the equipment is aging.
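The distinction wayne and Jquip are circling can be made concrete with a short Python sketch. This is only an illustration of the two procedures as described in the thread; all station names, ppm values, and the gas-standard numbers below are invented. Synchronizing forces every station to agree with one reference instrument, inheriting whatever bias that instrument carries, while calibrating corrects each station against an independently known standard.

```python
# Sketch of the difference raised above: synchronizing vs. calibrating
# CO2 analyzers. All station names and ppm values are invented.

def synchronize(readings, reference_station):
    """Set every station to agree with one chosen reference station.
    Relative agreement becomes perfect, but any bias in the reference
    instrument is silently inherited by the whole network."""
    reference = readings[reference_station]
    return {station: reference for station in readings}

def calibrate(readings, known_standard, measured_standard):
    """Correct each station against an independently known gas standard.
    measured_standard[s] is what station s reports when fed the standard,
    so the correction is traceable to the standard itself, not to
    another instrument."""
    return {s: v + (known_standard - measured_standard[s])
            for s, v in readings.items()}

stations = {"mauna_loa": 398.1, "barrow": 399.4, "south_pole": 397.2}
print(synchronize(stations, "mauna_loa"))   # every station now reads 398.1

measured = {"mauna_loa": 399.0, "barrow": 400.5, "south_pole": 398.4}
print(calibrate(stations, known_standard=400.0, measured_standard=measured))
```

The point of the sketch is wayne’s clock analogy: after `synchronize`, any drift in the reference instrument moves the whole network with it, whereas `calibrate` can only be as good as the independence and accuracy of the standard gas.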

Jimbo
November 14, 2013 10:18 am

michael hart says:
November 14, 2013 at 7:24 am
Not everyone is familiar with Maurice Strong. An introductory sentence or two would improve the article.

He is James Hansen, Peter Gleick, Michael Mann, Al Gore, WWF, Greenpeace, and the KOCH BROTHERS all rolled into one. He has been on of the main architects of the CAGW fraud from the start. His career has included a couple of stints at Big Oil.

Jimbo
November 14, 2013 10:18 am

Ooops!
He has been one of the main

November 14, 2013 10:32 am

– As ever, what Tim Ball says makes a lot of sense, and it’s consistent with the last 30 years of the temperature-vs-CO2 pattern. However, why doesn’t everyone believe this already?
OK, the outside world is controlled by the Green Fundamentalist five percent, so the media is censored, and you don’t keep your job and your research institute doesn’t get its grants unless you keep to the IPCC party line. But you’d expect more scientists to be stepping through cracks in the woodwork. Where are the rest of the informed dissenters?

Samuel C Cogar
November 14, 2013 10:35 am

How can anyone deny the existence of a Great CO2 Causing Anthropogenic Global Warming conspiracy after reading the above essay by Dr. Tim Ball?
Anyway, being the Devil’s Advocate that I am, and even with my limited knowledge of the physics involved, I still feel compelled to question the author’s commentary in Section 7, Climate Sensitivity.
[quoting article] “Initially it was assumed that constantly increasing atmospheric CO2 created constantly increasing temperature. Then it was determined that the first few parts per million achieved the greenhouse capacity of CO2. Eschenbach graphed the reality (Figure 3)”
How is it possible for “more to equal less”? The quantity of CO2 in the atmosphere has nothing to do with the quantity of thermal energy radiating through the atmosphere, and given that all CO2 molecules have the same physical attributes, how does one molecule limit another like molecule’s ability to absorb said energy?
Now if there were only a specific or limited amount of energy available for absorption, then I could agree with the above statement. But there is no such limit other than the number of hours of daylight.
[quoting article] “It (CO2) is like black paint on a window. To block sunlight coming through a window the first coat of black paint achieves most of the reduction. Subsequent coats reduce fractionally less light.”
IMHO, that was a really bad example for CO2, because H2O vapor will not block sunlight even at 40,000 ppm (4%). If one wants to compare cloud cover to black paint, then fine, but typically in the reverse order to the above. In actuality, at 400 ppm the CO2 molecules are so few and far between that one molecule of CO2 seldom ever “bumps” into another one.
I also have a problem with Milloy’s graph in Figure 4, as well as the cited commentary of Lindzen and Choi in their attempt to discredit the IPCC’s claims about CO2, but no problem with their conclusion: “The results imply that the models are exaggerating climate sensitivity.”
But I’ll save that problem for another day.
P.S.: One can actually trust the “fossilized stomata record” because those plants were actually there at the time and were “measuring” the CO2 ppm that was available for their use.
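For what it’s worth, the “black paint” diminishing-returns behavior being questioned here is usually expressed with the standard logarithmic approximation for CO2 radiative forcing, ΔF ≈ 5.35 ln(C/C₀) W/m², from Myhre et al. (1998). A minimal Python sketch (the 100 ppm steps and 280 ppm baseline are arbitrary illustrative choices) shows each equal increment of CO2 adding less forcing than the last:

```python
import math

def co2_forcing(c, c0):
    """Myhre et al. (1998) logarithmic approximation for CO2
    radiative forcing relative to concentration c0, in W/m^2."""
    return 5.35 * math.log(c / c0)

# Forcing added by successive equal 100 ppm steps above a 280 ppm
# baseline; each step contributes less than the one before it.
for low, high in [(280, 380), (380, 480), (480, 580)]:
    step = co2_forcing(high, 280.0) - co2_forcing(low, 280.0)
    print(f"{low} -> {high} ppm: +{step:.2f} W/m^2")
```

Under this approximation, a doubling from any starting concentration always adds the same 5.35 ln 2 ≈ 3.7 W/m², which is the quantitative form of the window-paint analogy.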

JaneHM
November 14, 2013 10:37 am

Calibration is a major issue. Unfortunately, the JAXA GOSAT satellite carbon dioxide measurements are also calibrated to ground-based measurements.

Terry Hoffman
November 14, 2013 11:14 am

A very good summary of the state of the ‘art.’ A lot of it, as we used to say in the field, is obvious to the casual observer. I checked out Al’s DVD on truth a few years back and wondered how the thing passed the smell test. Evidence of the state of math and science education in the US, I suppose.
What I wonder now is why we allow geologists and biologists, much less politicians, to interpret climate data and models when it’s clearly a physics issue. Opinions are fine, but comparing a guy who’s an expert on rocks with a radiation physicist should be noted at the end of the paper.