Guest essay by Dr. Tim Ball
Elaine Dewar spent several days with Maurice Strong at the UN and concluded in her book The Cloak of Green that "Strong was using the U.N. as a platform to sell a global environment crisis and the Global Governance Agenda." Strong conjectured about a small group of world leaders who decided the rich countries were "the principle risk to the world." These countries refused to reduce their environmental impact. The leaders decided the only hope for the planet was the collapse of the industrialized nations, and that it was their responsibility to bring it about. Strong knew what to do: create a false problem with false science, and use bureaucrats to bypass politicians, close industry down, and make developed countries pay.
Compare an industrialized nation to an internal combustion engine running on fossil fuel. You can stop the engine in two ways: cut off the fuel supply or plug the exhaust. Cutting off the fuel supply is a political minefield; people quickly notice as all prices, especially for food, increase. It is easier to show the exhaust is causing irreparable environmental damage. This is why CO2 became the exclusive focus of the Intergovernmental Panel on Climate Change (IPCC). Process and method were orchestrated to single out CO2 and show it was causing runaway global warming.
In the 1980s I warned Environment Canada employee Henry Hengeveld, whose career involved promoting CO2 as a problem, that convincing a politician of an idea is one problem. I explained that a bigger problem comes if you convince them and the claim is later proved wrong: you must either admit your error or hide the truth. Environment Canada and the member nations of the IPCC chose to hide or obfuscate the truth.
1. The IPCC Definition of Climate Change Was the First Major Deception
People were deceived from the moment the IPCC was created. Most believe it is a government commission of inquiry studying all climate change. The actual definition, from Article 1 of the United Nations Framework Convention on Climate Change (UNFCCC), limits the IPCC to human causes only:
“a change of climate which is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and which is in addition to natural climate variability observed over considerable time periods.”
In another deception, the definition used in the first three Reports (1990, 1995, 2001) was changed in the 2007 Report, where the new wording appears only as a footnote in the Summary for Policymakers (SPM):
“Climate change in IPCC usage refers to any change in climate over time, whether due to natural variability or as a result of human activity. This usage differs from that in the United Nations Framework Convention on Climate Change, where climate change refers to a change of climate that is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and that is in addition to natural climate variability observed over comparable time periods.”
The broader definition was not actually applied, because the Reports are cumulative and including natural variability would have required starting over completely.
It is impossible to determine the human contribution to climate change if you do not know or understand natural (non-human) climate change. Professor Murray Salby showed that the human CO2 portion is of no consequence and that variation in natural sources of CO2 explains almost all annual changes. He showed that a 5% variation in these natural sources exceeds the total annual human production.
2. The IPCC Infers and Proves, Rather than Disproves, a Hypothesis
To make the process appear scientific, a hypothesis was inferred based on the following assumptions:
• CO2 was a greenhouse gas (GHG) that slowed the escape of heat from the Earth.
• the heat was back-radiated to raise the global temperature.
• if CO2 increased, global temperature would rise.
• CO2 would increase because of expanding industrial activity.
• the global temperature rise was therefore inevitable.
To further assure the predetermined outcome, the IPCC set out to prove rather than disprove the hypothesis, even though scientific methodology requires attempts at disproof. As Karl Popper said,
It is the rule which says that the other rules of scientific procedure must be designed in such a way that they do not protect any statement in science against falsification.
The consistent and overwhelming pattern of IPCC behaviour reveals misrepresentations of CO2. When an issue was raised by scientists performing their proper role as skeptics, instead of considering and testing its validity the IPCC worked to divert attention, even creating false explanations. The false answers succeeded because most people did not know they were false.
3. CO2 Facts Unknown to Most But Problematic to the IPCC
Some basic facts about CO2 are unknown to most people and illustrate the discrepancies between IPCC claims and what science knows.
• Natural levels of carbon dioxide (CO2) are less than 0.04% of the total atmosphere and 0.4% of the total GHGs; it is not the most important greenhouse gas. (A short arithmetic sketch of these unit conversions follows Figure 2 below.)
• Water vapour is 95% of the GHGs by volume. It is by far the most important greenhouse gas.
• Methane (CH4) is the other natural GHG demonized by the IPCC. It is only 0.000175% of atmospheric gases and 0.036% of the GHGs.
• Figure 1, from ABC News, shows the false information. The misleading percentages are achieved by considering a dry atmosphere, that is, by leaving out water vapour.
Figure 1
• The percentages troubled the IPCC so they amplified the importance of CO2 by estimating the “contribution” per unit (Figure 2). The range of estimates effectively makes the measures meaningless, unless you have a political agenda. Wikipedia acknowledges “It is not possible to state that a certain gas causes an exact percentage of the greenhouse effect.”
Figure 2 (Source: Wikipedia)
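The bullet figures above are just unit conversions plus an assumption about water vapour. Here is a minimal arithmetic sketch, assuming roughly 400 ppm CO2 and 1.8 ppm CH4 and bracketing water vapour between about 1% and 4% by volume; these are illustrative assumptions, not figures taken from the essay's sources.

```python
# Minimal arithmetic sketch (assumed values): convert trace-gas concentrations
# in ppm by volume into percentage shares of the atmosphere and of the GHGs.
co2_ppm = 400.0      # commonly cited present-day CO2
ch4_ppm = 1.8        # commonly cited present-day CH4

def ppm_to_percent_of_atmosphere(ppm):
    return 100.0 * ppm / 1_000_000.0

print(f"CO2: {ppm_to_percent_of_atmosphere(co2_ppm):.3f}% of the atmosphere")   # ~0.04%
print(f"CH4: {ppm_to_percent_of_atmosphere(ch4_ppm):.5f}% of the atmosphere")   # ~0.00018%

# Shares of the greenhouse gases depend almost entirely on the assumed water
# vapour content, which varies from well under 1% to roughly 4% by volume.
for h2o_ppm in (10_000.0, 40_000.0):                  # 1% and 4% assumptions
    total_ghg = co2_ppm + ch4_ppm + h2o_ppm
    print(f"Assuming {h2o_ppm / 10_000:.0f}% water vapour: "
          f"H2O = {100 * h2o_ppm / total_ghg:.1f}%, "
          f"CO2 = {100 * co2_ppm / total_ghg:.2f}% of GHGs")
```

The first two lines reproduce the "share of the atmosphere" figures; the GHG shares show how strongly the answer depends on the assumed water-vapour content, which is why such percentages are always quoted with wide ranges.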
4. Human CO2 Production Is Critical to the IPCC Objective, So They Control Production of the Information
Here is their explanation.
What is the role of the IPCC in Greenhouse Gas inventories and reporting to the UNFCCC?
A: The IPCC has generated a number of methodology reports on national greenhouse gas inventories with a view to providing internationally acceptable inventory methodologies. The IPCC accepts the responsibility to provide scientific and technical advice on specific questions related to those inventory methods and practices that are contained in these reports, or at the request of the UNFCCC in accordance with established IPCC procedures. The IPCC has set up the Task Force on Inventories (TFI) to run the National Greenhouse Gas Inventory Programme (NGGIP) to produce this methodological advice. Parties to the UNFCCC have agreed to use the IPCC Guidelines in reporting to the convention.
How does the IPCC produce its inventory Guidelines? Utilising IPCC procedures, nominated experts from around the world draft the reports that are then extensively reviewed twice before approval by the IPCC. This process ensures that the widest possible range of views are incorporated into the documents.
They control the entire process, from methodology, designation of technical advice, and establishment of task forces, through guidelines for reporting and nomination of the experts who produce the reports, to final report approval. The figure they produce is a gross calculation, and it is estimated that roughly 50% of that amount is removed from the atmosphere.
Regardless, if you do not know the natural sources and variability of CO2, you cannot know the human portion. It was claimed that the portion of atmospheric CO2 from combustion of fossil fuels was identifiable from the ratio of carbon isotopes C13/C12. Roy Spencer showed this was not the case. In addition, they ignore natural burning of fossil fuels, including forest fires, long-burning coal seams and peat; as Hans Erren noted, fossil coal is buried wood. Spencer concluded,
If the C13/C12 relationship during NATURAL inter-annual variability is the same as that found for the trends, how can people claim that the trend signal is MANMADE??
The answer is, it was done to prove the hypothesis and further the deception.
5. Pressure For Urgent Political Action
Early IPCC Reports claimed that CO2 remains in the atmosphere for a very long time. This implied it would continue to be a problem even with an immediate cessation of CO2 production. However, as Segalstad wrote,
Essenhigh (2009) points out that the IPCC (Intergovernmental Panel on Climate Change) in their first report (Houghton et al., 1990) gives an atmospheric CO2 residence time (lifetime) of 50-200 years [as a “rough estimate”]. This estimate is confusingly given as an adjustment time for a scenario with a given anthropogenic CO2 input, and ignores natural (sea and vegetation) CO2 flux rates. Such estimates are analytically invalid; and they are in conflict with the more correct explanation given elsewhere in the same IPCC report: “This means that on average it takes only a few years before a CO2 molecule in the atmosphere is taken up by plants or dissolved in the ocean”.
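A back-of-the-envelope calculation shows why the "few years" figure in that last IPCC sentence is the natural bulk residence time. The reservoir and flux values below are rough, round-number assumptions for illustration, not figures taken from the essay or from Segalstad.

```python
# Minimal sketch: bulk residence time = atmospheric reservoir / gross annual flux.
# Assumed round numbers: ~800 GtC of CO2 (as carbon) in the atmosphere and a
# combined gross uptake by oceans and vegetation of roughly 200 GtC per year.

atmospheric_carbon_gtc = 800.0       # assumption, order of magnitude only
gross_uptake_gtc_per_year = 200.0    # assumption, oceans + vegetation combined

residence_time_years = atmospheric_carbon_gtc / gross_uptake_gtc_per_year
print(f"Bulk residence time: about {residence_time_years:.0f} years")   # ~4 years
```

The 50-200 year figure quoted from the same report is an adjustment time for the decay of a concentration perturbation, a different quantity from this bulk turnover time; that is the distinction Essenhigh and Segalstad are drawing.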
6. Procedures to Hide Problems with IPCC Science and Heighten Alarmism
IPCC procedures and mechanisms were established to deceive. The IPCC has three Working Groups (WG). WGI produces the Physical Science Basis Report, which proves CO2 is the cause. WGII produces the Impacts, Adaptation and Vulnerability Report, which is based on the results of WGI. WGIII produces the Mitigation of Climate Change Report. WGII and WGIII accept WGI's claim that warming is inevitable. They state,
Five criteria that should be met by climate scenarios if they are to be useful for impact researchers and policy makers are suggested: Criterion 1: Consistency with global projections. They should be consistent with a broad range of global warming projections based on increased concentrations of greenhouse gases. This range is variously cited as 1.4°C to 5.8°C by 2100, or 1.5°C to 4.5°C for a doubling of atmospheric CO2 concentration (otherwise known as the “equilibrium climate sensitivity”).
They knew few would read or understand the Science Report, with its admissions of serious limitations, so they deliberately delayed its release until after the Summary for Policymakers (SPM). As David Wojick explained,
Glaring omissions are only glaring to experts, so the “policymakers”—including the press and the public—who read the SPM will not realize they are being told only one side of a story. But the scientists who drafted the SPM know the truth, as revealed by the sometimes artful way they conceal it.
…
What is systematically omitted from the SPM are precisely the uncertainties and positive counter evidence that might negate the human interference theory. Instead of assessing these objections, the Summary confidently asserts just those findings that support its case. In short, this is advocacy, not assessment.
An example of this SPM deception occurred with the 1995 Report. The 1990 Report and the drafted 1995 Science Report said there was no evidence of a human effect. Benjamin Santer, as lead author of Chapter 8, changed the text drafted by his fellow authors, which said,
“While some of the pattern-based studies discussed here have claimed detection of a significant climate change, no study to date has positively attributed all or part of the climate change observed to man-made causes.”
to read,
“The body of statistical evidence in chapter 8, when examined in the context of our physical understanding of the climate system, now points to a discernible human influence on the global climate.”
The phrase “discernible human influence” became the headline as planned.
With AR5 (2013) they compounded the deception by releasing the SPM and then issuing a correction afterwards. They got the headline they wanted. It is the same game as with the gap between the problems admitted in the WGI Science Report and the confident SPM. The media did not report the corrections, but the IPCC could now claim it had detailed the inadequacy of its own work; it is not their fault that people don't understand.
7. Climate Sensitivity
Initially it was assumed that constantly increasing atmospheric CO2 created constantly increasing temperature. Then it was determined that the first few parts per million achieve most of the greenhouse capacity of CO2. Eschenbach graphed the reality (Figure 3).
Figure 3
It is like black paint on a window: the first coat blocks most of the sunlight coming through, and each subsequent coat blocks fractionally less additional light.
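The paint analogy can be put in rough numbers with the widely used simplified expression for CO2 forcing, ΔF ≈ 5.35 × ln(C/C0) W/m² (Myhre et al., 1998). This is only a minimal sketch of the logarithmic shape Eschenbach's Figure 3 illustrates, not the IPCC's full radiative-transfer calculation.

```python
import math

# Minimal sketch: simplified logarithmic CO2 forcing, dF = 5.35 * ln(C/C0) W/m^2
# (Myhre et al., 1998). Each successive 40 ppm increment adds less forcing than
# the one before -- the "extra coats of black paint" effect described above.

def forcing_w_per_m2(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

levels = [280, 320, 360, 400, 440, 480, 520, 560]
for lo, hi in zip(levels, levels[1:]):
    added = forcing_w_per_m2(hi) - forcing_w_per_m2(lo)
    print(f"{lo} -> {hi} ppm: added forcing {added:.3f} W/m^2")
```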
There was immediate disagreement about the amount of climate sensitivity resulting from a doubling or tripling of atmospheric CO2. Milloy produced a graph comparing three different sensitivity estimates (Figure 4).
Figure 4.
The IPCC created a positive feedback to keep temperatures rising. It claims that CO2-induced warming increases evaporation and that the added water vapour amplifies the temperature trend. Lindzen and Choi discredited this in their 2011 paper, which concluded that "The results imply that the models are exaggerating climate sensitivity."
Estimates of climate sensitivity have declined since and gradually approach zero. A recent paper by Spencer claims the "…climate system is only about half as sensitive to increasing CO2 as previously believed."
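How much warming one then attaches to a given CO2 level hinges on the assumed sensitivity, which is the spread Milloy's Figure 4 illustrates. Here is a minimal sketch that simply scales the logarithmic relationship by three assumed sensitivities spanning the 1.5-4.5 °C per doubling range quoted earlier; these are illustrative assumptions, not the output of any model.

```python
import math

# Minimal sketch: equilibrium warming at a given CO2 level under different
# assumed climate sensitivities, expressed as degrees C per doubling of CO2.
# dT = S * log2(C / C0), with C0 = 280 ppm taken as the pre-industrial baseline.

def warming_c(c_ppm, sensitivity_per_doubling, c0_ppm=280.0):
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

for s in (1.5, 3.0, 4.5):   # assumed sensitivities, deg C per doubling
    print(f"S = {s} C/doubling -> at 400 ppm: {warming_c(400, s):.2f} C, "
          f"at 560 ppm: {warming_c(560, s):.2f} C")
```

At a doubling (560 ppm) the three assumptions give 1.5 °C, 3.0 °C and 4.5 °C by construction; the point is how far apart the projections already are at 400 ppm.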
8. The Ice Cores Were Critical, But Seriously Flawed
The major assumption of the inferred IPCC hypothesis is that a CO2 increase causes a temperature increase. After the publication of Petit et al. in 1999, the Antarctic ice core records appeared as evidence in the 2001 Report (Figure 5).
Figure 5. Antarctic ice core record
Four years later, research showed the reverse: temperature increase preceded CO2 increase, contradicting the hypothesis. This was sidelined with the diversionary claim that the lag was between 80 and 800 years and therefore insignificant. It was so troubling that Al Gore created deceptive imagery in his movie. Only a few experts noticed.
In fact, temperature change precedes CO2 change in every record, for any period or duration. Figure 6 shows a shorter record (1958-2009) of the relationship. If CO2 change follows temperature change in every record, why are all computer models programmed with the opposite relationship?
Figure 6. Lag time for the short record, 1958 to 2009.
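For readers who want to test the lead-lag claim on a record such as the 1958-2009 series in Figure 6, a simple cross-correlation scan will do: shift one series against the other and find the lag that maximizes the correlation. The sketch below uses synthetic data purely to show the method; it is not the analysis behind Figure 6.

```python
import numpy as np

# Minimal sketch: find the lag (in samples) at which two series correlate best.
# Synthetic data only -- a "temperature" random walk and a lagged, noisy copy
# standing in for CO2. A positive best-fit lag means the second series lags the first.

rng = np.random.default_rng(0)
n, true_lag = 600, 9                                   # e.g. monthly data, 9-month lag
temp = np.cumsum(rng.normal(0.0, 0.1, n))              # smooth, wandering "temperature"
co2 = np.concatenate([np.full(true_lag, temp[0]), temp[:-true_lag]])
co2 = co2 + rng.normal(0.0, 0.05, n)                   # lagged copy plus noise

def lag_correlation(x, y, lag):
    """Correlation of x(t) with y(t + lag), using only the overlapping samples."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return np.corrcoef(x, y)[0, 1]

best_lag = max(range(-24, 25), key=lambda k: lag_correlation(temp, co2, k))
print(f"Best-fit lag: {best_lag} samples")             # recovers ~9 here
```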
IPCC Needed Low Pre-Industrial CO2 Levels
A pre-industrial CO2 level lower than today's was critical to the IPCC hypothesis. It was like the need to eliminate the Medieval Warm Period, which showed that the world is not warmer today than ever before.
Ice cores are not the only source of pre-industrial CO2 levels. There are thousands of direct 19th-century measurements of atmospheric CO2, beginning in 1812. Scientists took precise measurements with calibrated instruments, as Ernst Beck thoroughly documented.
In a paper submitted to the US Senate Committee on Commerce, Science, and Transportation Hearing, Professor Zbigniew Jaworowski stated,
“The basis of most of the IPCC conclusions on anthropogenic causes and on projections of climatic change is the assumption of low level of CO2 in the pre-industrial atmosphere. This assumption, based on glaciological studies, is false.”[1]
Of equal importance, Jaworowski states,
The notion of low pre-industrial CO2 atmospheric level, based on such poor knowledge, became a widely accepted Holy Grail of climate warming models. The modelers ignored the evidence from direct measurements of CO2 in atmospheric air indicating that in 19th century its average concentration was 335 ppmv[11] (Figure 2). In Figure 2 encircled values show a biased selection of data used to demonstrate that in 19th century atmosphere the CO2 level was 292 ppmv[12]. A study of stomatal frequency in fossil leaves from Holocene lake deposits in Denmark, showing that 9400 years ago CO2 atmospheric level was 333 ppmv, and 9600 years ago 348 ppmv, falsify the concept of stabilized and low CO2 air concentration until the advent of industrial revolution [13].
There are other problems with the ice core record. It takes years for air to be trapped in the ice, so what is actually trapped and measured? Meltwater moving through the ice, especially when the ice is close to the surface, can contaminate the bubbles. Bacteria form in the ice, releasing gases even in 500,000-year-old ice at considerable depth ("Detection, Recovery, Isolation and Characterization of Bacteria in Glacial Ice and Lake Vostok Accretion Ice," Brent C. Christner, 2002 dissertation, Ohio State University). Pressure from the overlying ice causes a change below 50 m: brittle ice becomes plastic and begins to flow. The layers formed with each year of snowfall gradually disappear with increasing compression, so it requires a considerable depth of ice, accumulated over a long period, to obtain a single reading at depth. Jaworowski also identified problems with contamination and losses during the drilling and core recovery process.
Jaworowski's claim that the modellers ignored the 19th-century readings is incorrect. They knew about them, because T. M. L. Wigley introduced the 19th-century readings to the climate science community in 1983 (Wigley, T.M.L., 1983, "The pre-industrial carbon dioxide level," Climatic Change 5, 315-320). However, he cherry-picked from a wide range, eliminating only the high readings and 'creating' a pre-industrial level of approximately 270 ppm. I suggest this is what influenced the modellers, because Wigley was working with them as Director of the Climatic Research Unit (CRU) at East Anglia. He preceded Phil Jones as Director and was the key person directing the machinations revealed by the leaked CRU emails.
Wigley was not the first to misuse the 19th-century data, but he did reintroduce it to the climate community. Guy Stewart Callendar, a British steam engineer, pushed the thesis that increasing CO2 was causing warming. He did what Wigley later did, selecting only those readings that supported the hypothesis.
There are 90,000 samples from the 19th century, and Figure 7 shows those carefully selected by G. S. Callendar to achieve his estimate. It is clear he chose only low readings.
Figure 7. (After Jaworowski; trend lines added)
You can see how the slope and trend of the selected data differ from those of the entire record.
Ernst-Georg Beck confirmed Jaworowski's research. An article in Energy and Environment examined the readings in great detail and validated these findings. In his conclusion Beck states,
Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC. Review of available literature raise the question if these authors have systematically discarded a large number of valid technical papers and older atmospheric CO2 determinations because they did not fit their hypothesis? Obviously they use only a few carefully selected values from the older literature, invariably choosing results that are consistent with the hypothesis of an induced rise of CO2 in air caused by the burning of fossil fuel.
The pre-industrial level is some 50 ppm higher than the level claimed.
Beck found,
“Since 1812, the CO2 concentration in northern hemispheric air has fluctuated exhibiting three high level maxima around 1825, 1857 and 1942 the latter showing more than 400 ppm.”
The challenge for the IPCC was to create a smooth transition from the ice core CO2 levels to the Mauna Loa levels. Beck shows how this was done but also shows how the 19th century readings had to be cherry-picked to fit with ice core and Mauna Loa data (Figure 8).
Figure 8
Variability is extremely important, because the ice core record's exceptionally smooth curve is achieved by applying a 70-year smoothing average. Selection and smoothing are also applied to the Mauna Loa data and to all current atmospheric readings, which naturally vary by up to 600 ppm in the course of a day. Smoothing on the scale applied to the ice core record eliminates a great deal of information; consider what it would do to the temperature variability of the last 70 years. Statistician William Briggs says you never, ever smooth a time series. Eliminating the high readings before smoothing makes the losses greater still. Beck explains how Charles Keeling established the Mauna Loa record by using the lowest readings of the afternoon and ignoring natural sources. Beck presumes Keeling decided to avoid these low-level natural sources by establishing the station at 4,000 m up the volcano. As Beck notes,
“Mauna Loa does not represent the typical atmospheric CO2 on different global locations but is typical only for this volcano at a maritime location in about 4000 m altitude at that latitude.” (Beck, 2008, “50 Years of Continuous Measurement of CO2 on Mauna Loa,” Energy and Environment, Vol. 19, No. 7.)
Keeling's son operates Mauna Loa and, as Beck notes, "owns the global monopoly of calibration of all CO2 measurements." He is a co-author of the IPCC reports, which accept Mauna Loa and all other readings as representative of global levels.
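To make the smoothing point concrete, here is a minimal sketch: a noisy series containing a brief spike is passed through a 70-point moving average (mirroring the ~70-year smoothing described above), and almost all of the spike disappears. The data are synthetic and purely illustrative.

```python
import numpy as np

# Minimal sketch: a 70-point moving average applied to a noisy series that
# contains a brief 10-point "spike", analogous to a short-lived CO2 excursion.
# The smoothed series retains almost none of the spike's amplitude.

rng = np.random.default_rng(1)
n = 500
series = 300 + rng.normal(0, 3, n)      # baseline ~300 with year-to-year noise
series[240:250] += 60                   # short 10-point excursion of +60

window = 70
kernel = np.ones(window) / window
smoothed = np.convolve(series, kernel, mode="valid")   # 70-point moving average

print(f"Raw peak above baseline:      {series.max() - 300:.1f}")    # ~60
print(f"Smoothed peak above baseline: {smoothed.max() - 300:.1f}")  # ~9
```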
As a climatologist, I know it is necessary to obtain as many independent verifications of data as possible. Stomata are small openings on leaves that vary in size directly with the amount of atmospheric CO2. They underscore the effects of smoothing and the artificially low readings of the ice cores. A comparison of a stomata record with the ice core record for a 2,000-year period (9000-7000 BP) illustrates the issue (Figure 9).
Figure 9.
Stomata data show higher readings and more variability than the excessively smoothed ice core record. They align quantitatively with the 19th-century measurements, as Jaworowski and Beck assert. The average level for the ice core record shown is approximately 265 ppm, while it is approximately 300 ppm for the stomata record.
The pre-industrial CO2 level was only marginally lower than current levels, and likely within the margin of error. Neither it, nor the present IPCC figure of 400 ppm, is high relative to the geologic record. The entire output of the computer climate models begins with the assumption that pre-industrial levels were measurably lower. Removing this assumption further undermines the claim that the warming of the industrial era was due to human addition of CO2 to the atmosphere. Combine this with the assumption that CO2 causes temperature increase, when every record shows the opposite, and it is not surprising that IPCC predictions of temperature increase are consistently wrong.
The IPCC deception was premeditated under Maurice Strong's guidance to prove that CO2 was causing global warming as a pretext for shutting down the industrialized nations. They partially achieved their goal, as alternative energies and the green-jobs economy attest. All this occurred while contradictory evidence mounted, because Nature refused to play along: CO2 increases as temperatures decline, which according to IPCC science cannot happen. Politicians must deal with the facts and abandon all policies based on claims that CO2 is a problem, especially those already causing damage.
Source: The Global Warming Policy Foundation: CCNet 14/10/13
[1] “Climate Change: Incorrect information on pre-industrial CO2,” statement written for the Hearing before the US Senate Committee on Commerce, Science, and Transportation by Professor Zbigniew Jaworowski, March 19, 2004
Jquip says:
November 14, 2013 at 2:39 pm
Unfortunately, we don’t have any reliable calibration of the instruments that measure that. So if we’re throwing out unreliable data, so goes — apparently — every CO2 station.
Sorry, but that is what the story wants you to believe; in reality, calibration mixtures are centrally made by NOAA, cross-validated by different labs of different organisations and then used by all stations. Three calibration gases are injected each hour, spanning the expected range of CO2. A fourth calibration gas, outside the range, is injected every 25 hours and compared to the three others.
Each station works on its own; there is no synchronizing between Mauna Loa and other stations, as that would be impossible: there is a lag in the CO2 increase between the NH and the SH, there is a lag with altitude, and the seasonal variation is much larger at ground level than at altitude in the NH and much smaller in the SH:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/month_2002_2004_4s.jpg
So there is no problem with throwing out outliers, which are specific to a station: volcanic vents affect the record only under downwind conditions from certain directions and cause huge variability in CO2 within an hour, so those readings are excluded from averaging; measurements with wind from the land side are excluded at coastal stations, etc…
In fact it is a luxury problem: even after throwing out 99% of the data, so much is left that one can still plot the trend (even from biweekly flask samples). Compare that to the 3 samples per day of the historical measurements.
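For readers unfamiliar with reference-gas calibration, here is a minimal sketch of the scheme described above, under the simplifying assumption of a linear detector response: three in-range calibration gases define the response line, and a fourth, out-of-range gas acts as an independent check. All numbers are invented for illustration.

```python
import numpy as np

# Minimal sketch (illustrative numbers only): fit a linear detector response from
# three in-range calibration gases, then use it to read back an unknown air sample
# and to check a fourth, out-of-range reference gas.

cal_ppm = np.array([360.0, 390.0, 420.0])        # assumed reference concentrations
cal_signal = np.array([0.721, 0.7815, 0.842])    # assumed detector readings (volts)

# Least-squares line: signal = slope * ppm + intercept
slope, intercept = np.polyfit(cal_ppm, cal_signal, 1)

def signal_to_ppm(signal):
    return (signal - intercept) / slope

sample_signal = 0.800                            # an unknown air sample
check_signal = 0.922                             # the out-of-range check gas
print(f"Sample reads as {signal_to_ppm(sample_signal):.1f} ppm")
print(f"Check gas reads as {signal_to_ppm(check_signal):.1f} ppm "
      f"(compare with its certified value)")
```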
Ferdinand: “Sorry, but that is what the story wants you to believe; in reality, calibration mixtures are centrally made by NOAA, cross-validated by different labs of different organisations and then used by all stations.”
So your claim is that it is false that Mauna Loa is the sole calibration source. And that there are no worries because NOAA is the sole calibration source. Then there are still two problems. The first is that Mauna Loa is under the NOAA umbrella: “NOAA doesn’t do it, NOAA does.” The second is that it’s still a single calibration source.
Ferdinand Engelbeen, you accuse Gail Combs of lying; well, I accuse you of putting forward false information (is that lying also?). It is clear that you have never measured CO2 with an instrument based on chemical means. They work by absorbing or adsorbing CO2 passed through a chemical such as KOH, in solution or through a column of solid granules. The absorbing or adsorbing medium can be analysed for the quantity of CO2 removed from the gas. The accuracy depends on the care taken in measuring the volume of gas and in the analyses. The absolute reading depends on the volume of gas passed through; it is possible to detect 1 ppm by passing through enough volume.
I corresponded with E-G Beck while he was alive. I have a lot of respect for his Germanic thoroughness and integrity, and I respect his daughter for maintaining his web site http://www.biomind.de/realCO2/realCO2-1.htm . There you will find details of the many scientists (some awarded Nobel prizes at a time when these had some meaning) who measured CO2 in the atmosphere. I am a humble chemical engineer with experience in heat transfer. I can say I have made many measurements of CO2, including in the atmosphere, and at higher levels than presently claimed for the average background level.
RE: Elaine Dewar spent several days with Maurice Strong at the UN and concluded in her book The Cloak of Green that, “Strong, the Father of Global Warming, was using the U.N. as a platform to sell a global environment crisis and the Global Governance Agenda.”
Strong, your ordinary billionaire socialist, once tried to sell the people of Colorado back their own water (he bought land on the state’s biggest aquifer and tried to claim ownership but was stopped), so he has moved on to selling air he doesn’t own. He is in China now that his daughter stole $20 million from the UN’s Food for Peace.
– – – – – – – –
Due to serious problems found in his earlier work, Popper, in a much later major work, revealed support for the logical positivist philosophy’s view of the basis of science.
Therefore, in another work years later than the one the quote above was taken from, Popper also said (Conjectures and Refutations [London: 1963], page 48),
Does that look like Feynmanian observation-based science, or does it look like ‘thinking makes the observation support it’? I think it is the latter, and I think it is the essence of the science being promoted in AR5 (and in previous ARs) by the intellects of the IPCC Bureau.
John
Personal Note => I remind everyone that Kant is fundamentally a supporter of a dual-reality / dual-epistemology philosophy; namely, there exists a true reality/knowledge that we cannot know and another reality/knowledge that we experience, which isn’t true reality/knowledge.
> The percentages troubled the IPCC so they amplified the importance of CO2 by estimating the “contribution” per unit (Figure 2). The range of estimates effectively makes the measures meaningless, unless you have a political agenda.
Tim Ball doesn’t understand the chart.
Those aren’t ranges, Darl. The bottom value is the proportion of the warming that would drop if you removed that greenhouse gas; the top is the proportion of the warming that would remain if you removed all the other greenhouse gases. The difference is the overlap between greenhouse gases.
You’re welcome.
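A small worked example with invented numbers may make that reading of the chart concrete:

```python
# Minimal sketch with invented numbers: how "remove this gas" and "remove all
# other gases" bracket a gas's contribution when absorption bands overlap.

drop_if_removed = 14.0   # % of effect lost if this gas alone is removed (lower bound)
alone_retained = 25.0    # % of effect left if all other gases are removed (upper bound)

overlap = alone_retained - drop_if_removed
print(f"Contribution attributable to this gas: {drop_if_removed:.0f}-{alone_retained:.0f}%")
print(f"Overlap shared with the other gases: {overlap:.0f} percentage points")
```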
For the entire picture of Maurice Strong’s Agenda 21 crap, and the realization that the UN, including our political establishment, is waging a war on our civilization (yes, that includes you) while we are expected to make polite and civil conversation with warmistas, see the link below.
I think we’re wasting our time debunking their crap and should start stocking up on “Tar & Feathers”
(and rope, the kind of rope that is thin enough to make a knot and thick enough not to damage the trunks of the trees). We all love trees, don’t we?
http://green-agenda.com
John Whitman: “Does that look like Feynmanian observation based science or does it look like ‘thinking makes the observation support it’? ”
” But in thinking that … we necessarily succeed in imposing them upon nature, he was wrong.”
Other way around. Popper is stating that we can impose our intellect on nature in the same way we impose ourselves on a girl we want to date. In both we’re hopeful, perhaps mad, but that doesn’t mean she’ll agree to it.
Jquip says:
November 14, 2013 at 4:04 pm
So your claim is that it is false that Mauna Loa is the sole calibration source. And that there are no worries because NOAA is the sole calibration source. Then there are still two problems. The first is that Mauna Loa is under the NOAA umbrella: “NOAA doesn’t do it, NOAA does.” The second is that it’s still a single calibration source.
Somebody must prepare the calibration mixtures. That is just as much the case for CO2 mixtures as for blood sugar tests or anything measurable in any lab. The CO2 mixtures are then cross-validated against mixtures prepared by other labs to be sure that the calibration gases are as accurate as claimed.
Even so, other laboratories prepare their own calibration gases and measure CO2 at a lot of other places. Scripps still makes its own calibration gases and takes its own (flask) samples at Mauna Loa, independent of NOAA. So there is a cross-validation of the main calibration source with other calibration sources, and for Mauna Loa a cross-validation of the values found with a different set of calibration gases…
cementafriend says:
November 14, 2013 at 5:39 pm
Ferdinand Engelbeen, you accuse Gail Combs of lying; well, I accuse you of putting forward false information (is that lying also?).
Lying is when you provide wrong information after having been warned that it is wrong. It is a word that I very seldom use, as many, myself included, make mistakes. But if one repeats the same wrong information, as Gail did after being warned three times that the information is wrong, then it is clearly lying. Except if Gail has a serious loss of memory (as I have sometimes…).
About Ernst Beck’s data: I had about three years of discussion with him and even spent a full day together at the home of Arthur Rörsch in Leiden (Netherlands) to discuss things out. I studied about everything he published on his website, especially the sources he quoted for the 1935-1945 period, the most recent CO2 “peak” according to his compilation.
Indeed some historical methods were quite accurate. But most analyses were not better than 3% of the measured value, thus not better than +/- 10 ppmv. Besides that, there is not the slightest information about the calibration of the real-life series, the preparation of the samples and reagents, the aging of the reagents, the skill of the people doing the analyses, etc…
But all that is not the main problem of the historical data. The main problem is where the measurements were taken. You can’t obtain reliable CO2 data from places in the middle of Paris, in the middle of a forest, in between growing agricultural crops, etc., with one or even with three samples a day.
Beck used all of them and then declared that there was a “peak” of some 80 ppmv around 1942.
It is possible (but highly improbable) to have an outburst of CO2 from the oceans equivalent to burning down 1/3rd of all land vegetation in seven years’ time. But it is impossible to absorb that again in 7 years’ time.
Further, there is not the slightest peak seen in other direct measurements (ice cores) or proxies (stomata data, coralline sponges) with sufficient resolution around 1942.
Thus while I do admire the tremendous amount of work done by the late Ernst Beck, I only can disagree with the results of his compilation.
Then about the late Jaworowski:
In a paper submitted to the US Senate Committee on Commerce, Science, and Transportation Hearing, Professor Zbigniew Jaworowski stated,
“The basis of most of the IPCC conclusions on anthropogenic causes and on projections of climatic change is the assumption of low level of CO2 in the pre-industrial atmosphere. This assumption, based on glaciological studies, is false.”
Take a look at the paper by Jaworowski (no matter if it was really sent to the Senate Committee):
http://www.warwickhughes.com/icecore/
There are two graphs which show that the CO2 data from the Siple Dome ice core were shifted an “arbitrary” 83 years to match the Mauna Loa CO2 data.
But there is nothing arbitrary about the shift: the average age of the gas bubbles is much younger than the age of the surrounding ice, for the simple reason that the pores of the snow/firn remain open for many years after the snow falls. That means air exchange/migration continues until the density of the firn is high enough to sufficiently close the pores and prevent further migration. A nice overview of that process is given at:
http://courses.washington.edu/proxies/GHG.pdf
Neftel calculated the (at that time rough) average age of the gas bubbles from the pore diameters and also saw that there was a remelt layer near 70 m depth. Remelt layers prevent further vertical migration, and thus the average gas age doesn’t get younger anymore compared to the ice age at the same depth. According to Neftel:
Based on porosity measurements the time lag between the mean age of the gas and the age of the ice was determined to be 95 yr and the duration of the close-off process to be 22 yr. These values are, of course, evaluated for one particular core representing the present situation (1983), assuming a homogeneous enclosure process and not taking into account the sealing effect of observed impermeable layers.
Further:
Because of the layers between 68 and 69 m.b.s., the air below is already completely isolated, about 7 m above the depth obtained assuming a homogeneous enclosure. Consequently, for this core, the difference between ice and mean gas age is only 80-85 yr instead of 95 yr as estimated previously.
I asked Jaworowski in a personal mail why he insisted on this “arbitrary” shift. According to him, there is no difference in age between the gas phase and ice phase in ice cores, as all ice remelts and closes the pores from the beginning.
Even if that only happened once in 83 years in the Siple Dome ice core…
Further, he says that one finds values that are too low in ice cores, because CO2 (by preference?) escapes through cracks in the ice after drilling, relaxation and transport. I asked him how one can have migration from 180-300 ppmv inside the ice towards 360-400 ppmv in outside air.
Never had an answer.
That closed the door for me…
Let him rest in peace, together with his ideas about ice cores…
See further:
http://www.ferdinand-engelbeen.be/klimaat/jaworowski.html
A slight deviation from the discussion, however: I am trying to find concrete information to support the claim that there is a “lag” between changes in CO2 concentration and temperature of some ~800 years. I used to have links, but I rebuilt my laptop recently and lost them all. Does anyone have any reliable information on this?
John Tiller says:
November 14, 2013 at 6:23 pm
“Strong, your ordinary billionaire socialist, once tried to sell the people of Colorado back their own water (he bought land on the state’s biggest aquifer and tried to claim ownership but was stopped), so he has moved on to selling air he doesn’t own. He is in China now that his daughter stole $20 million from the UN’s Food for Peace”
These seem to be very serious accusations, especially the last sentence. Can you enlarge on your statement so that the facts can be brought out into the open in an endeavour to shame these people, if that is ever possible?
Ferdinand Engelbeen – thank you for putting so much effort into countering Tim Ball’s points. I would welcome WUWT granting you space to put your own essay, so that we can clearly read your points.
Tim Clark says:
November 14, 2013 at 1:43 pm
[quoting Sam] “Now if there were only a specific or limited amount of energy available for absorption then I could agree with the above statement……”
“Sam, there is for CO2. Better do some homework on wavelengths.”
———————–
Are you now asserting that the wavelengths (energy) that CO2 is subject to cease to be emitted from their sources after the first 20± ppm of CO2 achieves its maximum Specific Heat Capacity?
Tim C, I did my homework, and for two reasons: first, to verify my comments before posting, and second, in anticipation that my comments would likely be questioned.
CO2 is not a singularity, but is only one of the many “players” upon the field of play (atmosphere), all of which are capable of absorbing and emitting thermal energy to one another, but with specific individual Rules that govern their aforesaid actions.
Anyway, Lindzen and Choi et al. state in the reference cited by the author, with emphasis on the boldfaced text:
—————————
“Within all current climate models, water vapor increases with increasing temperature so as to further inhibit infrared cooling. Clouds also change so that their visible reflectivity decreases, causing increased solar absorption and warming of the earth.”
“However, with the annual time scale, the signal of short-term feedback associated with water vapor and clouds can be contaminated by unknown timevarying radiative forcing in nature, and the feedbacks cannot be accurately diagnosed (Spencer, 2010).”
————————
That said, the way I see it is: “what is good for the goose (H2O vapor) is also good for the gander (CO2).”
Any increase in ppm has the potential of causing an increase in warming of the atmosphere.
And if the feedback (energy emissions) associated with water vapor cannot be accurately diagnosed because of the unknown time-varying radiative forcing in nature, then the feedback (energy emissions) associated with CO2 cannot be accurately diagnosed either, simply because they are both subject to the same explicit Rules.
And the following are statements which I copied from said “explicit Rules”, to wit:
—————————————————
Lecture 7
Absorption/emission by atmospheric gases. Solar, IR and microwave spectra of main atmospheric gases . Ref: http://www.heliosat3.de/e-learning/remote-sensing/Lec7.pdf
Review of main underlying physical principles of molecular absorption/emission:
1) The origins of absorption/emission lie in exchanges of energy between gas molecules and electromagnetic field.
2) In general, total energy of a molecule can be given as: E = Erot+ Evib+ Eel + Etr
[snip]
NOTE: The above dependence on pressure is very important because atmospheric pressure varies by an order of 3 from the surface to about 40 km.
• The Lorentz profile is fundamental in the radiative transfer in the lower atmosphere where the pressure is high.
• The collisions between like molecules (self-broadening) produce larger linewidths than do collisions between unlike molecules (foreign broadening). Because radiatively active gases have low concentrations, the foreign broadening often dominates in infrared radiative transfer.
————————————
I do not believe it is humanly possible for anyone to measure the warming effect of the lesser quantity (398 ppm) of gas (CO2) in a mixture of two different gases when the quantity of the greater volume of gas (H2O vapor) is constantly changing (10,000 ppm to 40,000 ppm) from hour to hour, day to day and week to week.
Especially when said greater volume of gas (H2O vapor) has a potentially 239.2 greater “warming” potential for said mixture than does the lesser volume of said gas (CO2) in said mixture.
Cheers
Folks, the entire edifice of CAGW is based on the one assumption that CO2 is well mixed in the atmosphere, and therefore that a single measurement at Mauna Loa or the poles is an accurate representation of global CO2. That is why Ferdinand defends this position at all costs. Without the acceptance of “CO2 is well mixed,” all the CO2 measurements are meaningless for showing there has been a global increase. This is why the work of Dr. Zbigniew Jaworowski and Ernst Beck has to be trashed: their work refutes the leg the Hoax stands on.
…………….
Ferdinand Engelbeen says:
November 15, 2013 at 2:31 am
……..Lying is when you provide wrong information after having been warned that it is wrong. …….But if one repeats the same wrong information, as Gail did after being warned three times that the information is wrong, then it is clearly lying. Except if Gail has a serious loss of memory (as I have sometimes…).
>>>>>>>>>>>>>
Ferdinand
WHO made YOU GOD? Who pronounces from on high who is and is not telling the truth?
You are an unknown on the internet who also claims someone like Zbigniew Jaworowski is wrong and should be buried along with all he has said. See link
You are also contradicting yourself. You constantly say that CO2 is well mixed in the atmosphere and then you say:
” I have shown you different times that the two links you provide are from different stations the noisy one is from Neuglobsow (as can be seen above the graph) near Berlin, Germany, midst of a forest and completely unsuitable for “background” measurements as too close to huge sinks and sources. The other is from Mace Head, coastal, Ireland, an actual “background” measuring station. “
Well, if CO2 is “Well Mixed” it should not matter WHERE the data came from, because WELL MIXED means uniform, and if it is NOT uniform it is NOT well mixed.
You really need to get your story straight. Personally I quit reading what you say a year or so ago and you are just about the only one here at WUWT I do not bother to waste my time reading.
Gail Combs says:
November 14, 2013 at 11:19 am
Graph of raw data
http://ds.data.jma.go.jp/gmd/wdcgg/cgi-bin/wdcgg/quick_plot.cgi?imagetype=png&dataid=200702142827
Graph after climatologists get through with the data manipulating:
http://ds.data.jma.go.jp/gmd/wdcgg/cgi-bin/wdcgg/quick_plot.cgi?imagetype=png&dataid=200906040013
=============
Looking at the two graphs, it appears CO2 measurements are not nearly as simple or as reliable as we are led to believe.
I’m always surprised when people dismiss other people’s points of view as “conspiracy theories.”
People get together all the time and reach agreements that are in their best interests. This is what people do. When these agreements go against other people’s interests, as they often do, they are rarely publicized. This is simply being prudent.
This goes on all over the world, 24 hours a day, 365 days a year: people making agreements to benefit themselves at other people’s expense, agreements that rarely reach the light of day because if they did, it would harm the interests of the people who made them. So they don’t talk about it, except on occasion, behind closed doors.
There is no conspiracy theory required in any of this. It is simply what people do: make deals to further their own interests at the expense of “others.” These agreements are necessarily kept secret, lest the “others” take action against the agreement.
Gail Combs says:
November 15, 2013 at 5:36 am
Gail, I have repeatedly told you that the data you compared are from two different stations, one in the middle of a forest, the other coastal with winds mostly from the SW over the ocean. Thus they have nothing to do with “cleaning” the data and are not comparable, and the forest-based station is as bad as 90% of the historical data collected by the late Ernst Beck.
I have repeatedly told you that one can find background data in over 95% of the atmosphere: over the oceans from sea level up to the stratosphere, and over land above the first few hundred meters, all within +/- 2% of full scale, despite a 20% exchange of all CO2 in the atmosphere over the seasons within a year.
Below a few hundred meters over land there are huge diurnal changes in sources and sinks, which are not leveled off if there is insufficient wind speed. Thus measuring CO2 near the ground over land is as problematic as measuring the air temperature over an asphalt parking lot or near a barbecue or an AC unit outlet. All skeptics know that that is wrong, yet some skeptics insist that one should include CO2 measurements from similarly problematic places. A little more consistency in skepticism would make skeptics a more reliable source of information in the eyes of the public (never mind the “warmers”)…
But if you don’t read what I wrote, then you can repeat whatever you want without lying: simply by ignoring what you don’t like to read…
ferd berple says:
November 15, 2013 at 5:39 am
Ferd, the data are not manipulated, the data are from two different stations, one in a forest (the noisy one), the other coastal with wind mostly coming in from the Atlantic Ocean. Both datasets are the raw data.
That is what Gail didn’t tell you and why I am quite angry about her manipulation…
So, which one would you choose to tell you what the “background” CO2 levels are?
stewgreen says:
November 15, 2013 at 4:18 am
Ferdinand Engelbeen – thank you for putting so much effort into countering Tim Ball’s points. I would welcome WUWT granting you space to put your own essay, so that we can clearly read your points.
I already had that honor three years ago:
http://wattsupwiththat.com/2010/08/05/why-the-co2-increase-is-man-made-part-1/
http://wattsupwiththat.com/2010/08/20/engelbeen-on-why-he-thinks-the-co2-increase-is-man-made-part-2/
http://wattsupwiththat.com/2010/09/16/engelbeen-on-why-he-thinks-the-co2-increase-is-man-made-part-3/
http://wattsupwiththat.com/2010/09/24/engelbeen-on-why-he-thinks-the-co2-increase-is-man-made-part-4/
It triggered hundreds of comments and the discussions still go on between a few diehards (myself included) up to today…
Is CO2 well mixed? What is the definition of well mixed?
Richard Sharpe says:
November 15, 2013 at 6:50 am
Is CO2 well mixed? What is the definition of well mixed?
More or less:
If a huge change in a gas or liquid at some point is rapidly dispersed in the rest of the gas or liquid.
In the case of ozone: it is not well mixed, as its destruction is faster than its equatorial-to-polar distribution over the stratosphere.
In the case of methane: it is well mixed for most of the troposphere, but not near and in the stratosphere.
In the case of CO2: it is well mixed in most of the atmosphere, as huge (seasonal) changes are leveled off to 10% of the change within a few months.
– – – – – – – – –
Gail Combs,
Thanks for engaging with my comment. A pleasure.
As you can read in my comment (quoted above in full for convenience), I do not think Tim Ball is (to use the terminology from your comment) a “tin foil hat ‘¢on$piracy Theorist’”. Regarding Elaine Dewar (the reporter who Tim Ball quotes regarding Maurice Strong’s global aspirations), I do not know if she has tendencies toward being a “tin foil hat ‘¢on$piracy Theorist’”, and likewise I do not know that about the men you discussed in your comment.
I think Tim Ball sees certain people with certain world views (philosophies) as inimical to his world view and to the world view informing the basis of Western culture, civilization and science. Most intellects do, as do I, although we essentially disagree on who they are, on our assessments of what their world views are, on the significance of the risk, and on his world views versus mine.
My criticism of Tim Ball’s essay (with its emphasis on Maurice Strong) is that it does not contain the discussion necessary or sufficient to explain the phenomena we see in the philosophy of science: phenomena seen for at least the last ~50 years (and probably for the last ~150+ years) in the practices of the various bodies, institutions and academies supporting the ideologies surrounding CAGW / AGW / GW.
I do greatly appreciate Tim Ball’s efforts to keep so many people so intensely focused on the problematic scientific processes and philosophies that have culminated in the IPCC. It is an immensely important topic. Thank you Tim Ball.
John
What the Al Gore fraud side is engaged in, in the current CO2 debate, is a “rear guard” action.
A head-fake to cover the retreat and movement of the “lie base” to a new attack method.
None of what they do is fact based, it is all media cover stories to keep the truth from seeing the light of day.
All the political contributions to get the liar Democrats elected are also spent to cover the truth with gold-covered lies and fraud.
November 2014 is the most important date in recent times for overthrowing the lie-based crime operation in our U.S. Senate and House.
Take Action Now.