Guest Opinion: Dr. Tim Ball –
[Note: Some parts of this essay rely on a series of chemical analyses of air samples of surface CO2 done by Ernst-Georg Beck. I consider the air samplings as having poor quality control and not necessarily representative of global CO2 levels at those times and locations. While the methods of chemical analysis Beck used might have been reasonably accurate, I believe the measurements suffer from location bias, were taken in atmospheric conditions that were not well mixed, and should be treated with skepticism. I offer this article for discussion, but I don’t endorse the Beck data. – Anthony]
The failed predictions (projections) of the Intergovernmental Panel on Climate Change (IPCC) are proof that there is something seriously wrong with the science. A useful analogy for what we are witnessing is coming upon a car wreck: what you see and what actually happened are hard to reconcile, and it takes a lot of measurement and deconstruction to reconstruct events. Deconstruction of the IPCC wreckage must begin with what they did before the crash, because those actions created the conditions for a self-inflicted one. I know some of this material is not new; I covered some of it myself. However, it is time to revisit it because more people are aware of what is going on and are now on the crash scene.
The IPCC and their proponents drew the map, built the roads, and designed the traffic signals, but they also designed, built and drove the car. They did not plan to crash and did everything to reach their destination. The problem developed because of the assumptions they made and the manipulation of the data needed to predetermine the result of the trip; a crash was inevitable.
What were the conditions they considered necessary to reach their destination? There are two distinct lists. The first is a list of the assumptions made for the scientific part of the AGW hypothesis. The second is a list of the starting conditions necessary for the political part of the AGW objective.
Scientific Assumptions
1. CO2 is a gas with effectively one-way properties that allows sunlight to enter the atmosphere but prevents heat from leaving. It supposedly functions like the glass in a greenhouse.
2. If atmospheric CO2 levels increase, the global temperature will increase.
3. Atmospheric levels of CO2 will increase because humans are adding more every year.
Political Assumptions
1. Global temperatures are the highest ever.
2. Global temperatures rose commensurate with the start of the Industrial Revolution.
3. CO2 levels are the highest ever.
4. CO2 levels were much lower before the Industrial Revolution.
5. CO2 levels continue to rise at a steady rate because of the annual contribution of humans.
Data Sources
Major objectives were to start with a low pre-industrial level of atmospheric CO2 and have a steady rise over the last 150 years. Data sources included the following:
1. Bubbles extracted from ice cores, but primarily the Antarctic record.
2. Stomata are the pores on a leaf through which plants exchange gases with the atmosphere. The size varies with atmospheric levels of CO2.
3. Approximately 90,000 instrumental readings from the 19th century. Measurements began in 1812 as science determined the chemistry of the atmosphere.
4. Modern instrumental readings primarily centered on the Mauna Loa record begun in 1958 by Charles Keeling as part of the International Geophysical Year (IGY).
5. The recently launched NASA Orbiting Carbon Observatory OCO2 satellite with the first published data of CO2 concentration for October 1 to November 11, 2014.
6. IPCC estimates of human production of CO2, known currently as Representative Concentration Pathways (RCP).
The first question is: what are the non-human sources and sinks of CO2? The answer is, we don’t know. All we have are very crude estimates of some of them, but no actual usable measures. Remember what the IPCC said in Box 2.1, Uncertainty in Observational Records:
The uncertainty in observational records encompasses instrumental/recording errors, effects of representation (e.g., exposure, observing frequency or timing), as well as effects due to physical changes in the instrumentation (such as station relocations or new satellites). All further processing steps (transmission, storage, gridding, interpolating, averaging) also have their own particular uncertainties. Because there is no unique, unambiguous, way to identify and account for non-climatic artefacts (sic) in the vast majority of records, there must be a degree of uncertainty as to how the climate system has changed.
It is important to note that they identify one exception because it is important to their narrative, but also for recreating the IPCC wreck.
The only exceptions are certain atmospheric composition and flux measurements whose measurements and uncertainties are rigorously tied through an unbroken chain to internationally recognized absolute measurement standards (e.g., the CO2 record at Mauna Loa; Keeling et al., 1976a).
The IPCC provide a bizarre and confusing diagram (Figure 1) that is more about creating the base scenario for their narrative than it is about providing clarification.
Figure 1
I don’t normally include the legend of a graph or diagram but, in this case, it is informative. Not that it provides clarification, but because it illustrates how little is known and how important it is to direct the focus on human production of CO2 over the Industrial Revolution period. This is not surprising since that is the definition of climate change they received in Article 1 of the United Nations Framework Convention on Climate Change (UNFCCC). If you drive like this, a crash is inevitable.
——————————-
Figure 6.1 | Simplified schematic of the global carbon cycle. Numbers represent reservoir mass, also called ‘carbon stocks’ in PgC (1 PgC = 10^15 gC) and annual carbon exchange fluxes (in PgC yr^-1). Black numbers and arrows indicate reservoir mass and exchange fluxes estimated for the time prior to the Industrial Era, about 1750 (see Section 6.1.1.1 for references). Fossil fuel reserves are from GEA (2006) and are consistent with numbers used by IPCC WGIII for future scenarios. The sediment storage is a sum of 150 PgC of the organic carbon in the mixed layer (Emerson and Hedges, 1988) and 1600 PgC of the deep-sea CaCO3 sediments available to neutralize fossil fuel CO2 (Archer et al., 1998). Red arrows and numbers indicate annual ‘anthropogenic’ fluxes averaged over the 2000–2009 time period. These fluxes are a perturbation of the carbon cycle during Industrial Era post 1750. These fluxes (red arrows) are: Fossil fuel and cement emissions of CO2 (Section 6.3.1), Net land use change (Section 6.3.2), and the Average atmospheric increase of CO2 in the atmosphere, also called ‘CO2 growth rate’ (Section 6.3). The uptake of anthropogenic CO2 by the ocean and by terrestrial ecosystems, often called ‘carbon sinks’ are the red arrows part of Net land flux and Net ocean flux. Red numbers in the reservoirs denote cumulative changes of anthropogenic carbon over the Industrial Period 1750–2011 (column 2 in Table 6.1). By convention, a positive cumulative change means that a reservoir has gained carbon since 1750. The cumulative change of anthropogenic carbon in the terrestrial reservoir is the sum of carbon cumulatively lost through land use change and carbon accumulated since 1750 in other ecosystems (Table 6.1). Note that the mass balance of the two ocean carbon stocks Surface ocean and Intermediate and deep ocean includes a yearly accumulation of anthropogenic carbon (not shown). Uncertainties are reported as 90% confidence intervals. Emission estimates and land and ocean sinks (in red) are from Table 6.1 in Section 6.3. The change of gross terrestrial fluxes (red arrows of Gross photosynthesis and Total respiration and fires) has been estimated from CMIP5 model results (Section 6.4). The change in air–sea exchange fluxes (red arrows of ocean atmosphere gas exchange) have been estimated from the difference in atmospheric partial pressure of CO2 since 1750 (Sarmiento and Gruber, 2006). Individual gross fluxes and their changes since the beginning of the Industrial Era have typical uncertainties of more than 20%, while their differences (Net land flux and Net ocean flux in the figure) are determined from independent measurements with a much higher accuracy (see Section 6.3). Therefore, to achieve an overall balance, the values of the more uncertain gross fluxes have been adjusted so that their difference matches the Net land flux and Net ocean flux estimates. Fluxes from volcanic eruptions, rock weathering (silicates and carbonates weathering reactions resulting into a small uptake of atmospheric CO2), export of carbon from soils to rivers, burial of carbon in freshwater lakes and reservoirs and transport of carbon by rivers to the ocean are all assumed to be pre-industrial fluxes, that is, unchanged during 1750–2011. Some recent studies (Section 6.3) indicate that this assumption is likely not verified, but global estimates of the Industrial Era perturbation of all these fluxes was not available from peer-reviewed literature.
The atmospheric inventories have been calculated using a conversion factor of 2.12 PgC per ppm (Prather et al., 2012).
—————————-
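That final conversion factor is one of the few numbers in the caption that can be checked from first principles. A minimal Python sketch of the arithmetic (the dry-air mass and molar masses below are my own round figures, not the IPCC’s):

```python
# Sanity check of the quoted 2.12 PgC-per-ppm conversion factor.
# Assumed round numbers (not from the IPCC text): dry-air mass and molar masses.

M_ATM = 5.13e18      # mass of the dry atmosphere, kg (approximate)
M_AIR = 28.97        # mean molar mass of dry air, g/mol
M_C = 12.01          # molar mass of carbon, g/mol

moles_air = (M_ATM * 1e3) / M_AIR          # total moles of dry air
moles_co2_per_ppm = moles_air * 1e-6       # 1 ppm = 1 micromole CO2 per mole of air
grams_c_per_ppm = moles_co2_per_ppm * M_C  # carbon mass per ppm, in grams
pgc_per_ppm = grams_c_per_ppm / 1e15       # 1 PgC = 10^15 gC

print(f"{pgc_per_ppm:.2f} PgC per ppm")    # ~2.13, close to the quoted 2.12
```

The small difference from 2.12 comes from the rounded atmospheric mass assumed here.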
This is likely one of the most remarkable examples of scientific obfuscation in history. Every number used is a crude estimate. The commentary says, in effect, that we know almost nothing, yet it claims certainty about human CO2 production in the Industrial Era. To my knowledge, there are no cohesive, comprehensive measures of CO2 exchanges for most of the land surfaces covered by various forests, and especially not for the grasslands. The grasslands illustrate the problem because, depending on the definition, their extent varies from 15 to 40 percent. The important point is that we have little idea about volumes or how they change over time. A supposedly knowledgeable group, the American Chemical Society, provides confirmation of this point. Of course, we know how professional societies were co-opted to support the IPCC positions. In an article titled “Greenhouse Gas Sources and Sinks” they present a diagram from the IPCC (Figure 2).
Figure 2
The text from the American Chemical Society, who presumably know about atmospheric chemistry, says:
The sources of the gases given in these brief summaries are the most important ones, but there are other minor sources as well. The details of the sinks (reactions) that remove the gases from the atmosphere are not included. The graphic for each gas (or class of gas) is from Figure 1, FAQ 7.1, IPCC, Assessment Report Four (2007), Chapter 7. Human-caused sources are shown in orange and natural sources and sinks in teal. Units are in grams (g) or metric tons (tonne: international symbol t = 10^3 kg = 10^6 g). Multiples used in the figures are: Gt (gigatonne) = 10^9 t = 10^15 g; Tg (teragram) = 10^12 g = 10^6 t; and Gg (gigagram) = 10^9 g = 10^3 t.
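To keep these prefixes straight, here is a small sketch (my own illustration, not from the ACS text). Note in particular that a gigatonne and a petagram are the same mass, which links the ACS units to the IPCC’s PgC:

```python
# Metric mass units from the ACS/IPCC figures, expressed in grams.
UNIT_IN_GRAMS = {
    "t":  1e6,    # tonne = 10^3 kg = 10^6 g
    "Gt": 1e15,   # gigatonne = 10^9 t = 10^15 g
    "Tg": 1e12,   # teragram = 10^6 t
    "Gg": 1e9,    # gigagram = 10^3 t
    "Pg": 1e15,   # petagram = 10^15 g, so 1 GtC = 1 PgC
}

def convert(value: float, from_unit: str, to_unit: str) -> float:
    """Convert a mass between any two of the units above."""
    return value * UNIT_IN_GRAMS[from_unit] / UNIT_IN_GRAMS[to_unit]

print(convert(1.0, "Gt", "Pg"))  # 1.0: gigatonnes and petagrams coincide
print(convert(1.0, "Gt", "Tg"))  # 1000.0
```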
As a professional group, surely they should know about the lack of knowledge about gases in the atmosphere, yet they promote the IPCC illusions as fact. There are few caveats or warnings of the scientific limitations, of the kind that even the IPCC includes in Box 2.1.
Creating A Smooth CO2 Curve
A major flaw of the hockey stick involved connecting a tree ring record, the handle, with an instrumental temperature record, the blade. It was done because the tree ring record declined, and that contradicted their hypothesis and political agenda. Ironically, a major challenge in climatology is to produce a continuous record from data gathered from different sources. H.H. Lamb spends the first part of his epic work, Climate: Present, Past and Future (1977), discussing the problems. He also provides a graph showing the length of possible climate time scales and the overlap problem (Figure 3). There are three areas: the instrumental or secular, the historic, and the biological and geologic.
Figure 3
Data from different sources had to be linked to create the continuous smooth curve of CO2 from pre-industrial levels through to the present. This involved three data sources: ice cores, 19th century instrumental readings, and the Mauna Loa record. Figure 4 shows Ernst-Georg Beck’s reconstruction of the three sources. If you remove the 19th century data, it is another example of a ‘hockey stick’. The ice core data is the handle, from a single source, an Antarctic core. The blade is the Mauna Loa instrumental measure. As the 2001 IPCC Working Group I Report notes,
“The concentration of CO2 in the atmosphere has risen from close to 280 parts per million (ppm) in 1800, at first slowly and then progressively faster to a value of 367 ppm in 1999, echoing the increasing pace of global agricultural and industrial development. This is known from numerous, well-replicated measurements of the composition of air bubbles trapped in Antarctic ice. Atmospheric CO2 concentrations have been measured directly with high precision since 1957; these measurements agree with ice-core measurements, and show a continuation of the increasing trend up to the present.”
These measurements are not well replicated and have many serious limitations. Some of these include:
1. It takes years for the bubble to be trapped in the ice. Which year does the final bubble represent?
2. As the ice gets thicker, it becomes impossible to distinguish the layers and, therefore, the relative dating sequence. Some say that at 2000 meters, 245 cm of ice is required to obtain a single sample; under that compression and melding, one bubble represents several thousand years.
3. Meltwater on the surface, which occurs every summer, moves down through the ice contaminating the bubbles. As Zbigniew Jaworowski said in his testimony to the US Senate,
“More than 20 physico-chemical processes, mostly related to the presence of liquid water, contribute to the alteration of the original chemical composition of the air inclusions in polar ice.”
4. A study by Christner (2002), titled “Detection, Recovery, Isolation and Characterization of Bacteria in Glacial Ice and Lake Vostok Accretion Ice,” found bacteria releasing gases at great depth, even in 500,000-year-old ice.
Figure 4
A deconstruction of these portions of the crash reveals how it was achieved.
Professor Zbigniew Jaworowski was attacked viciously during the latter years of his life because of his views on climate change and ice core data. As with all who are so attacked, it is a sure indication they were exposing the deliberate deceptions of the global warming political agenda. Here are Jaworowski’s credentials, as they accompanied his presentation to the US Senate Committee on Commerce, Science, and Transportation.
“I am a Professor at the Central Laboratory for Radiological Protection (CLOR) in Warsaw, Poland, a governmental institution, involved in environmental studies. CLOR has a “Special Liaison” relationship with the US National Council on Radiological Protection and Measurements (NCRP). In the past, for about ten years, CLOR closely cooperated with the US Environmental Protection Agency, in research on the influence of industry and nuclear explosions on pollution of the global environment and population. I published about 280 scientific papers, among them about 20 on climatic problems. I am the representative of Poland in the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), and in 1980 – 1982 I was the chairman of this Committee.
For the past 40 years I was involved in glacier studies, using snow and ice as a matrix for reconstruction of history of man-made pollution of the global atmosphere. A part of these studies was related to the climatic issues. Ice core records of CO2 have been widely used as a proof that, due to man’s activity the current atmospheric level of CO2 is about 25% higher than in the pre-industrial period. These records became the basic input parameters in the models of the global carbon cycle and a cornerstone of the man-made climatic warming hypothesis. These records do not represent the atmospheric reality, as I will try to demonstrate in my statement.”
There was nobody more qualified to comment on the ice core record and here is part of what he said to the Committee.
“The basis of most of the IPCC conclusions on anthropogenic causes and on projections of climatic change is the assumption of low level of CO2 in the pre-industrial atmosphere. This assumption, based on glaciological studies, is false.”
Of equal importance Jaworowski states,
The notion of low pre-industrial CO2 atmospheric level, based on such poor knowledge, became a widely accepted Holy Grail of climate warming models. The modelers ignored the evidence from direct measurements of CO2 in atmospheric air indicating that in 19th century its average concentration was 335 ppmv[11] (Figure 2). In Figure 2 encircled values show a biased selection of data used to demonstrate that in 19th century atmosphere the CO2 level was 292 ppmv[12]. A study of stomatal frequency in fossil leaves from Holocene lake deposits in Denmark, showing that 9400 years ago CO2 atmospheric level was 333 ppmv, and 9600 years ago 348 ppmv, falsify the concept of stabilized and low CO2 air concentration until the advent of industrial revolution [13].
Figure 5 shows the stomatal evidence of CO2 levels compared with the ice core data that Jaworowski references.
Figure 5
Apart from the higher overall average, notice the smoothness of the ice core curve, achieved partly by a 70-year smoothing average, an action that removes large amounts of information, especially the variability, as the stomata record shows.
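The effect of such smoothing is easy to demonstrate. A toy sketch on invented data (a slow trend plus decadal-scale swings; the numbers are mine, for illustration only), showing how a 70-year moving average all but erases the variability:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1000, 2000)

# Invented CO2-like series: slow trend + ~60-year swings + noise
co2 = (300 + 0.01 * (years - 1000)
       + 15 * np.sin(2 * np.pi * years / 60)
       + rng.normal(0, 5, years.size))

# 70-year moving average, the kind of smoothing applied to the ice core record
window = 70
smoothed = np.convolve(co2, np.ones(window) / window, mode="valid")

print(f"raw std:      {co2.std():.1f} ppm")      # keeps the decadal swings
print(f"smoothed std: {smoothed.std():.1f} ppm") # swings largely erased
```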
The other reference Jaworowski makes is to a graph (Figure 6) produced by the British steam engineer and early supporter of AGW, Guy Stewart Callendar.
Figure 6 (Trend lines added by the author.)
The dots represent measures of atmospheric CO2 taken during the 19th century by scientists using rigorous methods and well-documented instrumentation. The objective of the measures, begun in 1812, was not related to climate; it was to determine the constituent gases of the atmosphere. It continued the work of Joseph Priestley who, though not the first to discover oxygen, was the first to publish reports (1774). Figure 6 shows the samples that Callendar selected (cherry-picked) to claim a low pre-industrial level. Equally important, he changed the slope from a decreasing to an increasing trend. Figure 4 shows the same 19th century data plotted against the ice core and Mauna Loa curves.
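The kind of slope reversal described here is easy to reproduce in principle. A sketch on invented data, showing how selecting a subset of points can flip the sign of a fitted trend (this illustrates the mechanism only; it is not Callendar’s data):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.linspace(1860, 1900, 120)

# Invented scatter with a slightly declining underlying trend
co2 = 320 - 0.1 * (years - 1860) + rng.normal(0, 25, years.size)

full_slope = np.polyfit(years, co2, 1)[0]

# "Select" only low readings early on, but allow higher ones near the end
keep = (co2 < 300) | ((years > 1890) & (co2 < 320))
sel_slope = np.polyfit(years[keep], co2[keep], 1)[0]

print(f"all data:      {full_slope:+.3f} ppm/yr")
print(f"selected only: {sel_slope:+.3f} ppm/yr")  # selection can flip the sign
```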
Disclaimer: Ernst-Georg Beck sent me his preliminary work on the data, and we often communicated until his untimely death. I warned him about the attacks, but I know they exceeded anything he expected. They continue today, even though his work was meticulous, as his friend Edgar Gartner explained in his obituary.
“Due to his immense specialized knowledge and his methodical severity Ernst very promptly noticed numerous inconsistencies in the statements of the Intergovernmental Panel on Climate Change IPCC.” (Translation from the German)
The problem with Beck’s work was it identified why Callendar dealt with the data as he did. In the climate community, the threat was identified and dealt with by a 1983 paper “The pre-industrial carbon dioxide level” published by Tom Wigley, then Director of the Climatic Research Unit (CRU). I recall the impact because I ran graduate level seminars at the time on the significance of the paper.
Criticisms of the 19th century records are summarized with one comment: they were random. Yes, in most studies random sampling is more desirable and representative of reality than the pre-selected, pre-determined sampling at specific points and specific levels done today; the latter only works if you assume the gas is well mixed. One criticism is that Beck’s record shows high levels around 1942 compared to the Antarctic record. This is likely because CO2 is not well mixed, as the OCO2 and other records indicate, but also because most of the measurements were taken in Europe during the war. Besides, with the 70-year period required to enclose the Antarctic gas bubble, that record would only be showing up in 2012. The truth is there are no accurate measures of CO2 in 1942 other than the ones Beck used.
Another criticism says the locations, including the height at which measurements were taken, varied considerably. Of course; that’s the point. They were not narrowed and manipulated like the current Mauna Loa and similar records, which only provide measures at a few points chosen essentially to eliminate all natural influences. It is obvious from the preliminary OCO2 data, the stomata, and Beck’s record that great variability from day to day and region to region is the norm. Further proof is that they tried to eliminate all this natural variability in the ice core record and at Mauna Loa. When outgoing longwave radiation leaves the surface, it passes through the entire atmosphere. The CO2 effect operates throughout, not just in certain narrowly chosen spots at certain altitudes like the Mauna Loa measures. As Beck noted,
“Mauna Loa does not represent the typical atmospheric CO2 on different global locations but is typical only for this volcano at a maritime location in about 4000 m altitude at that latitude.”
Charles Keeling established the Mauna Loa station with equipment he patented. As Beck wrote, the family owns the global monopoly on all CO2 measurements. Keeling is credited with being the first to alert the world to AGW. As Wikipedia’s undoubtedly vetted entry notes,
Charles David Keeling (April 20, 1928 – June 20, 2005) was an American scientist whose recording of carbon dioxide at the Mauna Loa Observatory first alerted the world to the possibility of anthropogenic contribution to the greenhouse effect and global warming.
Keeling’s son, a co-author of IPCC Reports, continues to operate the facilities at Mauna Loa. The steady rise in the Keeling curve, as it is known, is troubling, especially considering the variability in the records not considered suitable for the IPCC story. How long will that trend continue? We know global temperatures rose until the satellite data produced a record independent of the IPCC. There is no independent CO2 record; the Keelings have the monopoly and are the official record for the IPCC.
As Beck explained,
Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC. Review of available literature raise the question if these authors have systematically discarded a large number of valid technical papers and older atmospheric CO2 determinations because they did not fit their hypothesis? Obviously they use only a few carefully selected values from the older literature, invariably choosing results that are consistent with the hypothesis of an induced rise of CO2 in air caused by the burning of fossil fuel.
Now they have the dilemma that the temperature has not increased for 19+ years, but CO2, according to Mauna Loa, continues its steady rise. How long before we see a reported decline in the Mauna Loa record to bring the data in line with the political message? Fortunately, thanks to the work of people like Jaworowski and Beck, it is too late for them to mitigate the damage from the slow-motion crash that is inevitably evolving. The hockey sticks of the entire team were broken in the crash.
Thank you for the thorough background on the measurements and who made them
“There is no independent CO2 record, the Keeling’s have the monopoly and are the official record for the IPCC.”
This is nonsense. Here are just some of the CO2 measuring sites around the world.
And yes, Beck’s collection is just measuring CO2 variations in highly populated and vegetated environments. Does anyone really believe, looking at Fig 4, that CO2 changed its behaviour so radically in 1957? The CO2 changes shown by Beck would, if global, involve totally unphysical massive carbon fluxes.
“…the Keeling’s have the monopoly and are the official record for the IPCC.”
This is the problem.
And another thing.
Take a look up there at the OCO2 annualized carbon dioxide concentration shown. Look at those colors.
Has the concentration in Antarctica always been that low? No wonder the official 70-year smoothed “pre-industrial” ice core concentration is so tractable, so congenial.
Now that’s picking a test site.
Yes, there are numerous sites measuring CO2 in the atmosphere, but Scripps (Keeling) is the source of primary reference standards for instrument calibration.
NOAA generates secondary reference standards (see following). http://www.esrl.noaa.gov/gmd/ccl/airstandard.html
These standards are likely biased low due to losses of CO2 during drying (coalescer and perchlorate filters). (Incidentally, industrial manufacturers have moved away from perchlorates, such as this, due to quality and safety issues.) If CO2 is biased low in the standard, this would have the effect of artificially inflating atmospheric measurements, and it wouldn’t easily be caught due to the circular reference of measurements.
Pictures from the esrl.noaa.gov site show a lack of attention and professionalism: for instance, a dog (?) riding in the back of the vehicle transporting cylinders, and a technician filling cylinders without personal protective equipment (he’s wearing a T-shirt and has no eye protection). It would appear that an external audit of NOAA is warranted, if only for improving safety.
R. Shearer,
The current WMO primary standards are under NOAA supervision, previously under Scripps, but Scripps (and the Japanese) still have their own standards and scales, which are intercalibrated. There may be losses from drying the outside air, but that has no effect, as the standards are made by adding doses of CO2, CH4 and different isotopes of C in CO2 to give the desired level. That mixture has a precision of 0.014 µmol/mol (one standard deviation) by the manometric method used for the calibrations.
Further, even if the level were a few ppmv higher or lower (which it isn’t), that doesn’t make any difference in the trends over the past 57 years…
Nothing I hate more than a map with no Key.
Nick, the difference in 1957 could have to do with calibration issues as much as anything. His instrument (the 1957 device) was not all that accurate. The knowledge now that CO2 is not well mixed is also a contributing factor to the discrepancy.
The chemical method was, as I recall, more accurate by 2 or 3 orders of magnitude at the time. I see no reason to think that the chemical methods were ‘wrong’. Further, this can easily be replicated using the same chemical methods and standard calibration gases. The old NDIR CO2 instrument is, I am sure, still around as well for recalibration of the record.
The easy dismissal of all chemical records is concerning. It is certainly not justified if the Keeling record is ‘the standard’. The instrument was simply not good enough to be ‘the standard’.
Using a combination of Nafion dryers and 0.01 micron filters it is easy now to get a clean, dry sample of air. It is routine in my work. I cannot anoint anyone’s historical work, but there are no solid grounds for criticizing the 1950’s chemical methods on the basis of a 1950’s NDIR machine.
Crispin (Just moved from Ulan Bator?),
The best performing chemical methods were accurate to +/- 3% or +/- 10 ppmv in the best circumstances. Not even enough to detect the seasonal changes at Barrow (+/- 8 ppmv)…
Keeling made a glass instrument (he was a skilled glass blower) to accurately measure CO2 in air gravimetrically, with an accuracy of 1:40,000, and used that instrument to calibrate all NDIR equipment and calibration gases. That means the precision of the measurements he performed was better than 0.1 ppmv, two orders better than the best performing chemical methods…
The “trick” that Keeling used, and that is still in use at all stations using NDIR, is frequently recalibrating the instrument with 2 (later 3 + 1) calibration gases, so that any drift of the instrument is accounted for. See:
http://www.esrl.noaa.gov/gmd/ccgg/about/co2_measurements.html
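The recalibration “trick” described above is, at bottom, ordinary two-point calibration. A minimal sketch of the idea (my simplification, not NOAA’s actual procedure; the gas values are invented):

```python
def calibrate(raw_sample, raw_ref, true_ref):
    """Two-point calibration: map raw instrument response onto the
    scale defined by two reference gases of known concentration."""
    (r_lo, r_hi), (t_lo, t_hi) = raw_ref, true_ref
    slope = (t_hi - t_lo) / (r_hi - r_lo)
    return t_lo + slope * (raw_sample - r_lo)

# Example: a drifted instrument reads reference gases of 330 and 400 ppmv
# as 335.2 and 406.1. A sample reading 380.0 is corrected:
print(calibrate(380.0, raw_ref=(335.2, 406.1), true_ref=(330.0, 400.0)))
# -> ~374.2 ppmv; drift is removed as long as the references bracket the sample
```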
The accuracy was not the main point of difference between the methods; the main problem is where the measurements were made: 90% of the historical measurements were over land, where one can find levels of 250 to 700 ppmv, depending on the time of day and on inversion / wind speed in the neighborhood of huge sources and sinks, especially vegetation, which respires CO2 at night and removes CO2 during the day.
All historical measurements made over land are essentially worthless for estimating the real “background” CO2 levels of the pre-1958 times. With one escape: if there are many measurements at high wind speed, these go asymptotically towards the real background levels, but the historical measurements had either too few points or still had huge ranges.
There were a few historical measurements made on board ships during ocean travels, and some were coastal with wind coming from the seaside. These are all around the ice core CO2 measurements for the same period…
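The high-wind-speed escape mentioned above can be made concrete: fit a constant background plus a contamination term that decays with wind speed, and read off the constant. A sketch with invented numbers (the spirit of the approach, not anyone’s actual code):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# Invented observations: local CO2 excess decays with wind speed toward background
wind = rng.uniform(0.5, 12, 200)                        # m/s
co2 = 315 + 60 * np.exp(-0.6 * wind) + rng.normal(0, 3, wind.size)

def model(w, background, excess, decay):
    return background + excess * np.exp(-decay * w)

params, _ = curve_fit(model, wind, co2, p0=(320, 50, 0.5))
print(f"estimated background: {params[0]:.1f} ppmv")    # recovers ~315
```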
Thanks for the reference document. Nice compendium.
Had the chance to go fishing today with an old friend who recently finished a year-long walkabout through East Germany and Poland.
I had read this and that about the WGBU and Schellnhuber. Also had known he is the gatekeeper to the new Pope concerning climate science.
What I didn’t know (as explained by my friend) was the depth of the belief among Schellnhuber and his circle that they are the beginnings of the Age of Transition. They believe they have a populist mandate to move the world towards sustainable living (a Rousseau-type belief). They also believe that accelerated growth will occur in underdeveloped nations now that extreme poverty is likely to be nil over the next 15 years. And finally, they believe (now with the Pope’s blessing and force) that the growth will be unlike the West’s: more controlled and sustainable.
Europe has its hands full.
Btw, he thinks generally speaking that East Germans and the Poles perceive the whole sustainability push as a con meant to control people. They see Western Europeans as too naive and soft to know any better.
Very correct appraisal IMO, Knute. We always have to be on our guard against the Rousseau zombie:
http://figures-of-speech.com/2015-09/general-will.htm
RS
That was a great read. Very much appreciate the link. I’ll refer to it often.
Another great read:
http://figures-of-speech.com/2015-09/faith-tests.htm
Pay attention to the moral at the bottom!
Yes – good read. Thanks.
An excellent read. Erudite, quite humorous and engaging.
Aldous Huxley foretold the future in Brave New World:
…by means of ever more effective methods of mind-manipulation, the democracies will change their nature; the quaint old forms—elections, parliaments, Supreme Courts and all the rest—will remain. The underlying substance will be a new kind of non-violent totalitarianism. All the traditional names, all the hallowed slogans will remain exactly what they were in the good old days. Democracy and freedom will be the theme of every broadcast and editorial—but Democracy and freedom in a strictly Pickwickian sense. Meanwhile the ruling oligarchy and its highly trained elite of soldiers, policemen, thought-manufacturers and mind-manipulators will quietly run the show as they see fit.
Yes, East Europeans recognize control when they see it. The era of Communism wasn’t that long ago.
He is absolutely correct. From half a century living under Soviet rule they are keen to the manipulations of that style of control, and they understand and see clearly the weakness of the West with its “soft”, naive liberals and their trust and acceptance of the toxic ideas that will kill them… in the end. Or shortly. Ref “stupid or traitorous” Obama.
Is Obama really so stupid? Perhaps. If not, the only alternative is that he is out to inflict as much damage to the U.S. as he can without being too obvious about it. Motive? Who are his friends from his past… !
Liberals refuse to believe that Darwin is driving the bus–and always has been. They actually think people are motivated by altruism, which is another word for the PC approval of their like-minded peers.
Liberalism is thinking with your heart rather than your brain. Its one major failing is its assumption of the ultimate goodness of mankind, when it only takes two weeks at Parris Island or Fort Jackson to prove how thin the veneer of civilization is on the human.
Writes Joe Crawford:
Well, it’s widely known among recruits that Drill Instructors (and Drill Sergeants) are not even classifiable as members of species H. sapiens, much less representative of the civilizations that issue their pay.
As I see it, you guys are doing the work our manipulators on high want done. Repeat the bizarro notion that “liberal” means illiberal group-thinker, and that the reactive mind’s heartless impulses are the heart thinking… The lingo mutilation game began long before the conjuring of imaginary people who deny climate arrived on the crime scene.
That (most) people feel discomfort when they witness suffering is not a bad thing; who wants to live in psychopath world? (I mean besides the hyper-wealthy psychopathic elite that I’m quite sure want exactly that.)
I advise; If it’s the way the TV talking heads speak, avoid speaking that way yourself, it’s probably mind-control BS.
Yup, JK, sometimes it’s hard not to slide into the same diatribe. Good catch. I offered the phrase ‘soft and naive’ from a fishing-time chat a friend had concerning East European views of the West. One man, one view … anecdotal.
On another note … someone here quoted Hoffer, and it inspired me to pick up the book The True Believer again. The clarity of the book is terrific. He discusses the seminal need of the TB (true believer) to be disillusioned with the status quo and empowered by a new vision. I know we don’t often quote Hitler, but Hoffer did when discussing when TB groups start to erode …
According to Hitler, the more “posts and offices a movement has to hand out, the more inferior stuff it will attract, and in the end these political hangers-on overwhelm a successful party in such number that the honest fighter of former days no longer recognizes the old movement…. When this happens, the ‘mission’ of such a movement is done for.”
Are we there yet ?
Is it imploding ?
Knute,
Poles are no less susceptible to the environmental cult of global warming than are west Europeans.
Percentage of population believing that “global warming” is a scary man-made threat:
* * * * * * * * * *
Australia – 54
Canada – 61
China – 58
Denmark – 49
Finland – 53
Germany – 59
France – 63
Italy – 65
Japan – 91
Netherlands – 44
Poland – 58
UK – 48
USA – 49
https://en.wikipedia.org/wiki/Climate_change_opinion_by_country
* * * * * * * * * *
Anecdotal reports from a friend may not accurately reflect the pulse of a nation.
Semi-socialist Norway, with nationalized oil and lots of money in the bank, appears to have the highest rate of skepticism in all of Europe.
KH
Thanks for the fallacy check. Indeed my friend’s survey is jaded as an anecdote and no doubt even more biased by his circle of travels. He stayed mostly with agrarian families and as he pointed out takes much of his general opinion from the elders who still remember the past. He did point out that much of the youth were eager to embrace the West, including a semi god admiration for the current president of the US.
Along the same vein of potential fallacy, I often take wiki numbers with a grain of salt because they are too easy to manipulate.
It is the ultimate irony that despite living in the most globally interconnected times in history, we struggle to validate the opinions of the masses.
Knute: “It is the ultimate irony that despite living in the most globally interconnected times in history, we struggle to validate the opinions of the masses.”
Great insightful observation. And it should come as no surprise that the masterful manipulators of information are using the Greatest Fooling Machine Ever Invented to its full advantage.
Thanks Piper
I had “a moment”.
Ack! Missed the close italics thingie!
If you were to plot the median latitude of these countries vs. the percentage of their population believing in AGW, I conjecture that a linear least squares fit to that scatter plot would have negative slope; and of course a very low goodness of fit.
In other words, the further the latitude from the equator, the less they are concerned about AGW – maybe because populations in colder climates wouldn’t mind warmer weather.
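Using the percentages listed above and rough central latitudes (the latitudes are my own approximations, not from the Wikipedia table), the conjecture is easy to test; the fitted slope does come out negative, with low goodness of fit:

```python
import numpy as np

# (country, % agreeing, approximate central latitude in degrees - my estimates)
data = [
    ("Australia", 54, 25), ("Canada", 61, 56), ("China", 58, 35),
    ("Denmark", 49, 56), ("Finland", 53, 64), ("Germany", 59, 51),
    ("France", 63, 46), ("Italy", 65, 42), ("Japan", 91, 36),
    ("Netherlands", 44, 52), ("Poland", 58, 52), ("UK", 48, 54),
    ("USA", 49, 38),
]
pct = np.array([p for _, p, _ in data], dtype=float)
lat = np.array([l for _, _, l in data], dtype=float)  # absolute latitude

slope, intercept = np.polyfit(lat, pct, 1)
r2 = np.corrcoef(lat, pct)[0, 1] ** 2
print(f"slope = {slope:+.2f} % per degree, r^2 = {r2:.2f}")
# negative slope, low r^2: consistent with the conjecture
```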
It matters not what people think about dangerous man-made global warming. All that matters is the truth as best we know it, and as we know from past experience, it takes only one scientist to prove all the others wrong. Sadly, our knowledge of the climate system has a very long way to go before it can be fully understood.
Einstein once said, “We still do not know one thousandth of one percent of what nature has revealed to us.” His statement holds true on the issue of climate change.
On man-made global warming, reliance is placed on climate models by the IPCC, models which have never been validated, and will never be able to simulate the climate system for obvious reasons. Climate models do not represent science. For that reason alone, the IPCC’s hypothesis should be treated with a pinch of salt. Instead, it’s been treated as the ‘gospel truth’ … the science is settled, the debate is over … it’s time for action. Ridiculous!
M
“It matters not what people think about dangerous man-made global warming. All that matters is the truth as best we know it, and as we know from past experience, it takes only one scientist to prove all the others wrong.”
Thanks M. The horse has long since left the barn concerning science and CAGW. Wish it were not the case, but that is reality.
CAGW is now a popularity contest intertwined with economic and social justice interests. In some countries like the US, CO2 has been codified as a pollutant.
While I think it’s good that good science continues to pound the table concerning a flawed conclusion, it will take more than that to bring the horse back.
Who is Hans Joachim Schellnhuber?
Thanks (well, maybe not). This guy is scary, and I doubt I’ll sleep better tonight. These master manipulators are truly frightening, and their lack of care for their fellow humans is appalling.
Interesting. Someone with his nose so deeply in the grant-trough that he would not know objectivity and honesty if it smacked him over the head.
That is the man, and his institute (PIK – the Potsdam-Institut für Klimafolgenforschung, or Potsdam Institute for Climate Impact Research), who advised Angela Merkel into the “Energiewende” disaster in Germany and advised the Pope for his latest paper to the church…
A real dangerous man…
Not long, Dr. Ball. In the mind of the Social Justice Warrior (SJW) “reality” is what suits the Narrative, and that is – ever and anon – malleable.
The measurements were not made by Beck. They were made by many people in many locations over a long period of time. The readings were done purely to measure the amount of CO2 in the atmosphere. Beck analyzed them in remarkable detail looking at the instruments and the method. I urge everyone to go and read Beck’s articles and judge the scientific thoroughness of his work and the degree to which they provide important information. One of the links takes you to six of his works.
http://www.biomind.de/realCO2/papers.htm
Another thing to remember is the extent to which Callendar and Wigley tried to downplay the record. In contradiction, they still used the record to confirm the low pre-industrial level of CO2 they wanted.
Dr. Ball,
Callendar used stringent a-priori criteria for including or excluding measurements, like “not used for agricultural purposes”. You may agree or not with these criteria – which would remove the two longest series used by Beck: Poonah (India) and Giessen (Germany), which are at the base of Beck’s 1942 “peak” – but he at least had criteria and quality checks, which Beck had not.
Callendar was not interested in downplaying any record; why would he? Many scientists at that time saw higher temperatures as beneficial. His compilation was confirmed decades later by the independent ice core record…
Ferdinand, Dr. Ball did explain the 1948 Callendar paper, which was cherry-picked from word one. As a steam engineer he had no reason to cherry-pick, but he did. He wrote an “off-expertise” paper (he wrote about a subject he knew zero about).
johnmarshall,
He was interested in CO2 just as we all are interested as “informed citizens”; no paper credential of “expertise” can change that. And “cherry picking” based on stringent a-priori criteria is not cherry picking, and is way better than lumping everything between 200 and 800 ppmv for the same year together and calling the average of that mess the “background” CO2 for that year. That doesn’t make any sense…
BTW, the trend compiled by Callendar was confirmed decades later by ice core measurements and several proxies; Beck’s impossibly huge 80 ppmv 1942 peak (worse if not smoothed) is not confirmed by any other measurement or proxy.
Dr. Ball
The huge unspoken and unproven Scientific and Political Assumption that is missing from your lists and that underpins this entire discussion is the claim that global warming is dangerously harmful. This unspoken assumption is rarely mentioned and rarely challenged and always assumed before the discussions even begin. It is the question that is always begged and never answered. It is the unspoken false premise that underpins the entire discussion of CO2.
Thus CO2 becomes a rabbit trail that sends us all off barking through the tall grass and up the wrong trees.
Thank you for your work.
http://realclimatescience.com/wp-content/uploads/2015/10/Image-48.png
Is it plagiarism to rebirth ?
(Sockpuppet. Comments deleted. ~mod.)
I am not a professional scientist; I’m a recently retired CFO, so I’ve dealt with my share of “questionable” numbers (unit cost calculations, benchmarks, project estimates, reserves, etc). As I’ve read and learned from WUWT over the past few years, I’ve definitely purged any naive thoughts that scientific numbers were somehow more “pristine”.
However, I seriously doubt the general public has faced this reality. One of the most effective arguments with my warmist friends (some of whom are NASA PhD physicists…) regarding “climate change” (or whatever we’re calling it this week) is the dawning realization that climate numbers have been manipulated.
I’ve seen hundreds of arctic ice charts, hundreds of hockey sticks, hundreds of temperature simulation charts for the 70+ climate models, and other such exhibits – all predicated upon the alleged warming. I’ve read many times on WUWT that the IPCC has (several times) “revised” the 1880-to-present day temperature data, BUT I HAVE NEVER SEEN A SIMPLE 1-PAGE CHART THAT ACTUALLY SHOWS THE MAGNITUDE OF ALL THESE ADJUSTMENTS (i.e.: a single chart from 1880 to present with each of the 5-10 “official” IPCC temperature data revisions).
The issue of corrupt data is a powerful concept in my collegial arguments with my more scientifically literate warmist friends (they even conceded “if the data has been manipulated, the case for warming alarmism may be overstated”).
I have unsuccessfully searched the web for this material. I feel strongly that it would be a huge step forward in this debate if one of the WUWT analytical gurus could produce and discuss this chart. This is a critical and missing piece of the argument.
“I’ve read many times on WUWT that the IPCC has (several times) “revised” the 1880-to-present day temperature data, BUT I HAVE NEVER SEEN A SIMPLE 1-PAGE CHART THAT ACTUALLY SHOWS THE MAGNITUDE OF ALL THESE ADJUSTMENTS”
The IPCC does not manage datasets, and does not adjust data. NOAA manages GHCN and publishes adjusted and unadjusted data. For each of its 7280 stations it produces a one-page chart showing all the adjustments. This is typical.
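For anyone wanting to build the one-page chart Chip asks for, the core of it is just the difference between the adjusted and unadjusted series. A minimal sketch, assuming you have downloaded one station’s two series as CSV files (the file and column names below are placeholders, not NOAA’s actual format):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Placeholder file/column names: substitute the actual GHCN exports you download.
raw = pd.read_csv("station_unadjusted.csv", index_col="year")["tavg"]
adj = pd.read_csv("station_adjusted.csv", index_col="year")["tavg"]

adjustment = (adj - raw).dropna()   # net adjustment applied in each year
print(adjustment.describe())        # magnitude and spread of the adjustments

adjustment.plot(title="Adjusted minus unadjusted, one station")
plt.ylabel("degrees C")
plt.show()
```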
Your “this” does not reflect the historic difference in the charts published in the ’80s and ’90s (not even close).
“I HAVE NEVER SEEN A SIMPLE 1-PAGE CHART THAT ACTUALLY SHOWS THE MAGNITUDE OF ALL THESE ADJUSTMENTS”
I’m sure that anyone with CRU’s budget, manpower, and access to CRU’s data and methods would have no trouble wading through the up to 6000 stations’ data and discovering the magnitude of the scam.
Tony Heller regularly shows his analyses of the NASA and NOAA temperature manipulations via their adjustments through the last several decades.
http://realclimatescience.com/
More specifically Tony Heller’s pages here http://realclimatescience.com/alterations-to-climate-data/
Chip, I would suggest that you ask this question at http://realclimatescience.com/ . Steve Goddard has a lot of graphs of adjustments, and perhaps one of them is what you seek.
Rich.
Paul Homewood has done extensive research into the manipulated NOAA and USHCN numbers as they attempt to ‘cool the past’. This is a link to just one of his blog page comparisons:
http://notalotofpeopleknowthat.wordpress.com/2014/03/25/cooling-the-past-in-nebraska/#comment-21147
This deceit is going on worldwide. Homewood has graphically displayed the data to show how obvious it is. This one post I linked to is one of several.
Visit Goddard’s site, Real Climate Science for evidence of manipulated data.
You can see the glaring effect of the adjustments in this simple plot. After moving in lock step for over 100 years, the NH temperature record suddenly diverges markedly from the SH.
The SH, on the other hand, is in agreement with the satellite data. Just about all of the supposed warming of the last 20 or so years comes from the obviously manipulated NH temperatures.
Anthony
Beck never performed any chemical analyses – and his first paper on the topic was published by me as then editor of AIG News. As Tim points out above, Beck collated all the published data in the scientific literature, but it helps to actually have read his papers.
Writes Louis Hissi:
In other words all of the referenced publications uttered by Beck had been review papers? Not reports of investigations (with close considerations of instrumentation, analytical methods, discrete limits of accuracy, and other confounding factors) but reports of reports, in which the work of multiple investigators had been aggregated as if one farmer’s pippins were pretty much precisely equal to another orchardman’s winesaps and a third guy’s crop of gala apples.
Not to mention the prospect of a few truckloads of oranges wandering into the juicing process.
Pardon me for getting all Sicilian here, but oy, gevalt!
Damn. I hate misspelling names. Sorry, Dr. Hissink.
Tucci78,
I have read and discussed all papers compiled by the late Ernst Beck with him prior to his untimely death, for the period 1935-1955, as that period shows a “peak” of ~80 ppmv around 1942 in his compilation, which doesn’t show up in any other measurement or proxy…
Indeed he lumped everything together: the good, the bad and the ugly… One data point in the US at 200 ppmv with 1000 data points with an average 415 ppmv (66 ppmv – one sigma!) in Europe. Methods sometimes known, many times even unknown. Accuracy sometimes known: the best around +/- 10 ppmv, the worst +/- 150 ppmv (that were the ugly data). Calibration: seldom mentioned, skill of the operators, fresh chemicals used? Never mentioned…
In the best cases, the standard deviation of the series and/or the range is given, which shows that there was a lot of local contamination…
I agree, Louis. The comment by Anthony is not warranted or helpful in understanding. Besides reading Beck’s papers and looking at his presentations, Anthony should at least read some of the original papers of those who made measurements. The paper by W. Kreutz (1941) is worth looking at because of the vast amount of data he acquired (there is a partial translation at https://cementafriend.wordpress.com/2013/01/08/co2-in-the-atmosphere/). It is unfortunate that Kreutz had little understanding of mathematics and statistics. He should have looked at some time lags, but these are obvious in his graphs (incoming radiation leads temperature, which in turn leads CO2 – at ground level and two higher levels). Kreutz measured wind speed and wind direction. Prof. Francis Massen and Ernst-Georg Beck used wind information to obtain better estimates of CO2 background levels (see the first paper here: http://www.biomind.de/realCO2/papers.htm). That is the sort of information which should be supplied about the measurements at Mauna Loa, where there is volcanic outpouring of CO2, to allow some estimates of reliability.
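The time-lag structure described here (radiation leading temperature, temperature leading CO2) can be quantified with a simple lagged-correlation scan. A sketch on invented series (not Kreutz’s data):

```python
import numpy as np

def best_lag(leader, follower, max_lag=50):
    """Lag (in samples) at which `follower` correlates best with `leader`;
    positive means `follower` lags behind `leader`."""
    lags = range(-max_lag, max_lag + 1)
    corrs = [np.corrcoef(leader[max(0, -k):len(leader) - max(0, k)],
                         follower[max(0, k):len(follower) - max(0, -k)])[0, 1]
             for k in lags]
    return list(lags)[int(np.argmax(corrs))]

# Invented demo: a CO2-like series lagging a temperature-like series by 7 steps
t = np.sin(np.linspace(0, 20, 500))
co2 = np.roll(t, 7) + np.random.default_rng(3).normal(0, 0.1, 500)
print(best_lag(t, co2))  # ~7
```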
2. Stomata are the pores on a leaf through which plants exchange gases with the atmosphere. The size varies with atmospheric levels of CO2.
Actually it’s the number of stomata that varies. Leaves will have more stomata when the CO2 is low. The stomata themselves open during the day and close at night to save water when photosynthesis isn’t going on.
In reality, both cell size and number change, so what is measured is the stomatal index: the number of stomata per unit area divided by the total number of stomata plus epidermal cells per unit area.
We have two measures of paleo levels of CO2, stomata and ice cores, that do not agree well. In essence, ice core levels are a low-pass filter that eliminates most variability in the record. Unlike ice core CO2 data, stomata CO2 data are difficult to obtain and calibrate, so most scientists have decided to ignore what the stomata data are saying.
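For concreteness, the stomatal index is conventionally computed as the number of stomata divided by stomata plus epidermal cells, times 100. A minimal sketch with invented counts:

```python
def stomatal_index(n_stomata: int, n_epidermal: int) -> float:
    """Conventional stomatal index: SI = S / (S + E) * 100."""
    return 100.0 * n_stomata / (n_stomata + n_epidermal)

# Invented counts from one field of view on a leaf cuticle
print(f"SI = {stomatal_index(42, 458):.1f}%")  # 8.4%; compared against a
# calibration curve, an SI value maps to an estimated CO2 concentration
```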
17 Oct: UK Telegraph: Christopher Booker: Met Office shown to be wrong by its own data
In recent years the organisation’s forecasts have become skewed by its obsession with global warming
Imagine if Michael Fish, our most famous weatherman, had been sacked by the BBC for writing a book accusing the world’s climate scientists of having “manipulated” their data to promote panic over global warming. Something similar made headlines in France last week when its “top TV weatherman”, Philippe Verdier, was taken off air by the state-owned France 2 channel for writing a book claiming that we have all been made “hostages to a planetary scandal over climate change”…
Before this scandale erupted, I planned to start this column by asking “what on earth is happening to our British weather”?…
In a series of recent posts on his Notalotofpeopleknowthat blog, Paul Homewood has been meticulously plotting the Met Office’s predictions against its own recorded data. In one, titled “Met Office forecasts contain a warming bias”, he compared all its running three-monthly forecasts for the first nine months of 2015, made on the basis of “observations, several numerical models and expert judgment”, with what actually happened…
Against its frequent claims that we can expect “a general increase in summer temperatures” thanks to “human influence on climate”, the Met Office’s own data show that, since 2006, summers have on average become cooler…
As remarkable as anything are the graphs on a guest post by Neil Catto, a former Met Office employee, who, as part of his scientific work, has plotted data from a representative sample of Met Office UK weather stations every day since 1998. On every one of his graphs recording temperature, rainfall and much else, the trend line over 18 years has been astonishingly consistent. Despite fluctuations, the overall trend has been flat. The general pattern of our weather has remained remarkably unchanged…
http://www.telegraph.co.uk/comment/11938459/Met-Office-shown-to-be-wrong-by-its-own-data.html
(Sockpuppet. Comments deleted. ~mod.)
Mr. Fish did not get the sack for predicting the weather in 1987 when Seven Oaks became One. That was one crazy night of weather.
Possibly too cryptic, Sevenoaks, Kent, UK and the Great Storm 28 years ago are the keys.
Much too cryptic.
Sevenoaks (one word) was a small town named after its seven very large and very old oaks. But after the great storm of ’87 that the Met Office failed to predict, six of them blew down. And so now Sevenoaks is known locally as Oneoak.
ralfellis,
It wasn’t much too cryptic.
I was aware of the Fish Storm (sorry) story and could handle the arithmetic pertaining to the oak trees.
>>It wasn’t cryptic. I was aware of the Fish Storm.
Yes, but this blog is mostly USA based. I doubt if many state-side would have any clue about a little English town called Sevenoaks.
Maybe someone could clarify for me, is the stomata variation a response of individual plants to short term environmental variation, or is it a long term effect involving descent with modification?
Juan Slayton:
You ask
Individual plants grow leaves with stomata suited for optimum photosynthesis at the average atmospheric CO2 concentration in the leaf growing season of the previous year.
Hence, laboratory (i.e. sealed greenhouse) exposures of such plants to controlled atmospheric CO2 concentrations enable calibration of leaf stomata for indication of atmospheric CO2 concentration.
I hope this helps.
Richard
I believe each plant sets stomata density based on the CO2 level it finds as it grows, so short term.
Juan Slayton:
Further to my attempt to help you, I commend this paper which is in language comprehensible to non-botanists.
It says
The “mature leaves” can only convey information on CO2 concentrations they have experienced and, therefore, your definition of “short-term” requires clear specification.
Simply, all “mature leaves” have the pore distribution they were instructed to develop by their elders as they grew. Hence, “mature leaves” indicate the average atmospheric CO2 concentration when they grew (i.e. the previous leaf growing season).
In one sense this is moot, because identification of the ages of leaves to individual years (e.g. by carbon dating) is problematic.
Richard
Richard, Mike: Thanx. Cleared that one up.
Juan,
It is indeed as Richard said: the stomata number or index (stomata cells / other cells) of new leaves is influenced by the CO2 levels in the previous growing season. In that way it is short-term, but if there are changes from season to season, one can use that to see a trend.
Stomata data are calibrated with the CO2 levels of the past century: up to 1960 with ice core CO2 data, later on with direct measurements (Mauna Loa):
http://www.ferdinand-engelbeen.be/klimaat/klim_img/stomata.jpg
Which BTW refutes Beck’s compilation: if there was a real 80 ppmv CO2 peak around 1942, the stomata data would show SI levels far below the scale around that year (that is around 305 ppmv on the CO2 scale)…
Besides the fact that stomata data are proxies, influenced not only by CO2 but also by rain/drought and nutrients, the main problem is the local bias. CO2 levels over land are on average higher than in the bulk of the atmosphere. For e.g. Giessen, that is over 40 ppmv higher than “background”. The calibration compensates for that over the past century, but there is no guarantee at all that the local bias didn’t change over time due to changes in land use over the centuries in the main wind direction. Even the main wind direction may have changed in certain periods (like the MWP-LIA-current warm period)…
Thus while stomata data have a better resolution than ice core data, their absolute value should be taken with a grain of salt…
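The calibration Ferdinand describes amounts to fitting SI against years of known CO2 and inverting the fit. A sketch with invented numbers (illustration only, not actual stomata data):

```python
import numpy as np

# Invented SI observations for years with known CO2
# (ice cores pre-1960, Mauna Loa after).
known_co2   = np.array([285, 295, 305, 315, 330, 350, 370])   # ppm
observed_si = np.array([8.3, 8.1, 7.9, 7.7, 7.4, 7.0, 6.6])  # %

slope, intercept = np.polyfit(known_co2, observed_si, 1)

def co2_from_si(si: float) -> float:
    """Invert the linear calibration to estimate CO2 from a measured SI."""
    return (si - intercept) / slope

print(f"SI 8.0% -> ~{co2_from_si(8.0):.0f} ppm")  # ~300 ppm with these numbers
```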
Hi from Oz. Great article and comments – thanks Dr Ball and readers. I have been following this and other skeptical web sites for about a decade, and I have never (to my knowledge) read that the Mauna Loa “lab” has exclusive global (?) rights to the measurement of atmospheric CO2! It triggered a few questions like: If Keeling has a patent on this method, which patent office issued it and when, and when does this patent expire? Is there no other possible way to measure CO2 than via the Keeling (patented) equipment / method? Do no scientists have doubts about the veracity of the Keeling method / equipment? I could go on, but you get the idea. Nullius in Verba!
There are other CO2 data sources, e.g. NOAA’s tall towers, but MLO is the only one followed by IPCC & MSM. IPCC AR5 TS.6 expressed uncertainty about the CO2 data over land. Preliminary OCO data does not look good for the CAGW crowd.
BoyfromTottenham,
Mauna Loa has no rights at all, but it is convenient to use its data, simply because it is the longest continuous series of CO2 measurements, although the measurements at the South Pole started a year earlier (they lack a few years of continuous measurements, but still had regular flask samples). It doesn’t make any difference if you take one of the other NOAA stations or one of the 60 other non-NOAA stations which measure CO2 as well as possible away from local contamination; all show the same CO2 trends over the years. See:
http://www.esrl.noaa.gov/gmd/ccgg/iadv/
For the rest, Dr. Ball shows some lively fantasy about what happens in the CO2 world, which I will respond to below…
“some lively fantasy”
An entertaining phrase. Makes me feel like someone is thwapping another in the forehead with a flick of the thumb and middle finger.
Delivered with a smile no less.
The planet is about to abruptly cool due to the abrupt solar cycle 24 change to the sun. There is now observational evidence of the start of the cooling mechanisms, which are latitude- and region-specific. When the planet cools, atmospheric CO2 levels will abruptly fall. The increase in atmospheric CO2 in the last 150 years, and particularly in the last 75 years, is not due to anthropogenic CO2 emissions.
There are dozens of peer-reviewed papers which support the assertion that the majority of the increase in atmospheric CO2 in the last 75 years is due to warming of the oceans and a mechanism that increases low-C13 emission from the deep earth.
CH4, 'natural gas', has C13 levels three to four times lower than atmospheric CO2 and the CO2 in biologically sequestered material.
There is no biological mechanism to explain where the super-low-C13 CH4 comes from, based on the late veneer theory of the atmosphere. The explanation for the super-low-C13 CH4 is that natural gas is CH4 extruded from the core of the earth as it solidifies. The super-high-pressure liquid CH4 breaks the mantle and is the cause of tectonic plate movement on the earth.
Comments:
1) There are two theories to explain why the planet is 70% covered with water and why hydrocarbon deposits on the surface of the planet have gradually increased over time.
2) Late in the formation history of the earth, a large Mars-size object struck the earth. This impact created the moon and removed the majority of the light volatile elements and light molecules (including water) from the mantle.
3) The late veneer theory hypothesizes that a bombardment of comets created a super atmosphere immediately following the big splat. The super atmosphere would have had roughly 50 times more pressure than the current atmosphere. The noble gases in the current atmosphere do not match those of comets, and there is no evidence in the geological record of the unique chemical bonds that would form in a super-high-pressure atmosphere. Those pushing the late veneer theory hand-wave the noble gas paradox away by assuming an ancient source of comets with different noble gas concentrations than current comets.
4) The geological record shows a gradual increase in surface hydrocarbons over time, which does not support the late veneer theory.
5) The second theory to explain the fact that the earth is 70% covered with water, and that the geological record shows a gradual increase in surface hydrocarbons, is the deep core CH4 hypothesis. This theory holds that super-high-pressure liquid CH4 is extruded from the core of the planet as it solidifies. It is known that there are light elements in the liquid core, based on wave-speed measurements of earthquakes; roughly 2% CH4 is required to explain those measurements. The size of the liquid core is roughly the same as the moon. The super-high-pressure extruded liquid CH4 provides the force to cause tectonic plate movement. The continents float on the mantle due to the liquid CH4 that has accumulated at the base of the continents at roughly 110 to 150 km. The earth's ocean floor is continually pushed under the continents; the oldest ocean floor on the earth is roughly 200 million years old. The subducted ocean floor carries with it CH4, a portion of which is left at the thrust plate at the margin of continents. This explains why there are bands of mountains at the edge of continents and why those bands extend hundreds of miles inland. The deep core CH4 hypothesis also explains why there are very high plateaus in some regions of the earth.
The deep core CH4 hypothesis explains why there are massive natural gas deposits in the Rocky Mountain range at very, very deep levels, up to 20,000 feet. There are very, very deep CH4 reservoirs throughout the world. As gas flows up, not down, there is no explanation as to why CH4 is found at the deepest levels, followed by oil, and finally, as one moves closer to the surface of the planet, black coal. Where is the biological source of the super-deep CH4?
An observation supporting the core-extruded-CH4 hypothesis is the fact that there is helium associated with oil fields. Oil fields are the only commercial source of helium.
The earth's helium is produced by radioactive decay of uranium and thorium. (The big splat removed all primordial helium from the earth.)
Helium gas emitted by radioactive decay cannot break the mantle and would hence remain in the mantle unless there were a mechanism to enable it to move to the surface of the planet. The super-high-pressure liquid CH4 dissolves heavy metals, creating concentrations of heavy metals higher in the mantle, where the liquid CH4 pressure decreases to the point that the CH4 can no longer carry the heavy metals.
The liquid CH4 breaks the mantle as it pushes higher, which provides a pathway for the helium gas to reach the oil fields. The deep core liquid CH4 is also the source of the oil in the oil fields, as well as the source of the CH4 in natural gas fields and of black coal. The late astrophysicist Thomas Gold provided more than 50 independent observations to support the assertion that fossil fuel is a myth, in his peer-reviewed papers and his book The Deep Hot Biosphere: The Myth of Fossil Fuels. The helium connection with oil fields is one of the observations that supports the deep core CH4 hypothesis.
There are unexplained cyclic changes in C13 in the paleo record, as well as massive deposits of ultra-low C13. Both observations support the assertion that there is an enormous deep-earth source of low C13. A large continual input of new CH4 into the biosphere requires large continual sinks of CO2.
C13 Paradox Last 20 years
Changes in atmospheric C13 levels in the Southern Hemisphere do not support the assertion that the recent rise in atmospheric CO2 is due to anthropogenic CO2 emission. C13 in the Southern Hemisphere remains the same for long periods (5 or 6 years) and then suddenly increases. As anthropogenic CO2 emission is roughly constant, C13 should change gradually if anthropogenic emission were the cause of the increase in atmospheric CO2. That is not what is observed.
Sources and sinks of CO2 Tom Quirk
http://icecap.us/images/uploads/EE20-1_Quirk_SS.pdf
It is clear that we do not yet properly understand the formation of the solar system and the planets. The recent flyby of Pluto doubly confirms this point.
At the moment, we are widely speculating, that is all. Whether we will ever possess enough data to answer the questions I do not know since the data required may well lie in other solar systems, and without being able to study a representative sample of other solar systems in various states of formation, we may never know. But of course, we should not give up trying to find the answer.
It seems the more we learn, the more we find out that there is EVEN more to learn.
Thank you William, the current thoughts about our fuel supplies are indeed bogus, for the deeper we drill the more we find. Very hard to explain buried forests and critters so far down.
Wayne, it might help if you got some education … say a geology course or two, and pick up a decent book on petroleum occurrences – for example Dynamics of Oil & Gas Accumulations by A. Perrodon, published by Elf Aquitaine.
Stewart – your harrumph is a throwaway line with no particular usefulness. If you can explain why Wayne is wrong, if you can explain the biological origins of deeply buried hydrocarbons, do so.
Where is the biological source of the super deep CH4?
the deeper we drill the more we find
==============================
Limestone, which is fossilized CO2 and calcium from the oceans, is carried into the earth's mantle by plate tectonics, along with water from the oceans. Under heat and pressure this will form hydrocarbons when reduced by iron (steam and iron produce hydrogen). Iron is the stable end product of both fusion and fission, and is present in large quantities within the earth. The excess calcium dissolves back into the oceans, waiting to be combined with CO2 to again form limestone.
Without this recycling of limestone back into hydrocarbons, all the carbon on earth necessary for life would eventually be turned into limestone and life would go extinct. Likely most of the hydrocarbons produced within the earth simply percolate upwards to the surface and are recycled by living organisms. Some get trapped by rock formations, which humans have learned to exploit as an energy source.
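For readers who want the implied chemistry spelled out, the chain would look something like the following. Each step is a known laboratory reaction (calcination, the iron-steam reaction, CO2 methanation); whether any of it actually runs in the mantle is exactly what is in dispute:

    % Illustrative reaction chain only -- not a claim about actual mantle chemistry.
    \[ \mathrm{CaCO_3} \rightarrow \mathrm{CaO} + \mathrm{CO_2} \quad \text{(limestone decomposes under heat)} \]
    \[ 3\,\mathrm{Fe} + 4\,\mathrm{H_2O} \rightarrow \mathrm{Fe_3O_4} + 4\,\mathrm{H_2} \quad \text{(steam over hot iron yields hydrogen)} \]
    \[ \mathrm{CO_2} + 4\,\mathrm{H_2} \rightarrow \mathrm{CH_4} + 2\,\mathrm{H_2O} \quad \text{(CO}_2\text{ reduced to methane)} \]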
William,
As I told you before, Tom Quirk's analysis is wrong, because he compared the seasonal δ13C changes in both hemispheres on a yearly-zeroed basis, where there is no detectable difference in lag if the real lag is + or – 12, 24, 36 months…
In reality there are lags in CO2 of 6 months between ground stations and altitude, of 12-24 months between the hemispheres, and of several years in δ13C trends. The latter is influenced not only by human emissions, but also by the huge variability in CO2 uptake by vegetation, due to e.g. El Niño. Vegetation uptake/release gives a strong opposite change in δ13C…
The real δ13C trends:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/d13c_trends.jpg
It needs some updating for recent years, but what is clear is that the main source of the low-13C CO2 is in the NH, and it is not caused by vegetation, as vegetation is an increasing sink for CO2; the earth is greening…
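For anyone who wants to see the aliasing problem concretely, here is a minimal sketch with synthetic data; it shows that once each year's mean is removed, whole-year lags become indistinguishable:

    # Sketch of the "yearly zeroed" aliasing problem. Synthetic data only.
    import numpy as np

    months = np.arange(240)                       # 20 years of monthly data
    seasonal = np.sin(2 * np.pi * months / 12.0)  # a pure 12-month cycle

    def yearly_zeroed(x):
        # Subtract each year's mean, as in a yearly-zeroed comparison.
        return np.concatenate([seg - seg.mean() for seg in np.split(x, len(x) // 12)])

    a = yearly_zeroed(seasonal)
    for lag in (0, 12, 24, 36):                   # whole-year lags, in months
        b = yearly_zeroed(np.roll(seasonal, lag))
        print(lag, round(np.corrcoef(a, b)[0, 1], 3))   # all print r = 1.0
    # Once the yearly means are removed, any lag that is a multiple of
    # 12 months is invisible -- the comparison cannot distinguish them.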
It is right to place a caveat on the Beck data, but that said, I consider that the data could have significance, and should not simply be dismissed.
I have often suggested that the analysis should be replicated today to see what results are obtained, i.e., air samples would be taken from the same locations, same season, same time of day etc., and analysed using the same equipment and methodology. If this were done, one would expect to get the same results plus the rise in CO2 since, say, 1950.
Of course, some places may have changed beyond recognition, but I understand that there were about 70,000 analyses in the Beck data, so it is likely that some sites would remain similar, and hence valid for comparison purposes.
I have also raised the point that the Beck data suggest that CO2 is far from well mixed at low altitude, and suggested that this may have an impact upon the AGW theory and/or its testability. If most of the back radiation comes from low altitude, then the fact that CO2 at low altitude is already circa 800 ppm in some places may give a way of measuring climate sensitivity (or at any rate climate sensitivity less the water vapour feedback).
If I can ever get my comments out of “permanent double-secret probation” with regard to moderation, it might be worthwhile for others reading here to consider richard verney‘s remark:
…in light of the fact that (per Dr. Hissink‘s information) Beck’s publications were all review papers aggregating the surface-level observations of many investigators.
Not just "about 70,000 analyses" but a large number of observers (not as yet tabulated in this discussion) conducting the measurements, with untold varieties of instruments and methods, and at various times of the year.
Under boatloads of different conditions in terms of latitude, altitude, ambient temperature, cloud cover (insolation), relative humidity, species of photosynthetic ground cover at sites of measurement, prevailing winds – Borjemoi! – you bloody well name it.
Without consistency in variables such as these, can it be said that what's gathered from Beck is really data at all, rather than nothing more than enticing intimations of what might be gotten through a more scrupulously structured approach?
The great virtue of Keeling’s observations at Mauna Loa is that at that single site, at very high altitude (dunno what the ground cover at the observatory is like, but it ain’t exactly a garden spot, meaning there ain’t no vegetable transpiration going on to any great extent) there’s the virtue of consistent admixture of the atmospheric components being measured.
Dr. Keeling never needed a “patent” on his claim to fame; he got it by way of location, location, location.
See my comment below about the historical and modern station at Giessen, Germany, where the modern station shows that the local data are simply unrelated to CO2 in the bulk of the atmosphere…
Further, using Modtran one can include 280 and 1000 ppmv in the first 1000 meters and calculate the difference in outgoing radiation. Less than 0.1 degree of warming at ground level is needed to overcome the difference. That means that the influence of the variability in CO2 levels in the first few hundred meters over land is negligible…
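A back-of-envelope cross-check of that figure, not a Modtran run; the 0.5 W/m2 flux difference below is an assumed, illustrative value:

    # How much surface warming offsets a small flux difference?
    # Linearized Stefan-Boltzmann: dF = 4 * sigma * T^3 * dT.
    sigma = 5.67e-8   # W/m^2/K^4
    T = 288.0         # K, typical mean surface temperature
    dF = 0.5          # W/m^2 -- ASSUMED illustrative difference in outgoing radiation
    dT = dF / (4 * sigma * T**3)
    print(round(dT, 2))  # ~0.09 K, consistent with "less than 0.1 degree"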
The main cause of the global warming scam is the academic belief that any explanation, no matter how bad it is, is better than no explanation. So if a sceptic says "we don't know what causes natural variation", and some greenspin eco-nutter says "it was man-made", people will readily accept those who give an explanation and reject those who rightly say the subject is far too complex.
From the post –
“When outgoing longwave radiation leaves the surface, it passes through the entire atmosphere.”
Golly gee!
What an amazing discovery! That probably explains how the Earth has managed to cool over the last four and a half billion years, and why the temperature drops after the Sun goes down. Wonders will never cease!
How many climatologists did it take to work that out? Or haven’t they woken up yet?
Good grief. Next thing someone will discover that the temperature drops in the shade.
Cheers.
“Next thing someone will discover that the temperature drops in the shade.”
I hope it’s that taking a sentence out of context is a really poor way to appraise someone’s lengthy explanation.
@Chip Javert,
“I’m a recently retired CFO, so I’ve dealt with my share of “questionable” numbers (unit cost calculations, benchmarks, project estimates, reserves, etc).”
Mr Javert, on your request for a one-page chart that "actually shows the magnitude of all these adjustments", here is my thought.
Your own description of what you have dealt with over the years, as in "unit cost calculations", "project estimates", "benchmarks", "reserves", shows me that even you as a CFO (I was a DOO, and as you definitely know, even in business the "climate" changes every day) never had a one-page solution, because it varied every day. So let's be honest.
And if you cannot find it on the account sheets, dealing with hard dollars and cents, how are you ever going to find it in the incredibly complex, ever-changing climate around us?
( I’d rather be a CFO or a DOO than a climate accountant btw).
"Unit cost calculations", "project estimates", "benchmarks" and "reserves" are exactly the kind of semantics the "warmists" use to confound and confront us. Every time people like Dr. Tim Ball, Dr. Soon and others put forward a paper, they get run over by personal attacks, and even lawsuits (like the RICO letter), and by the warping of our language. But for some reason they never answer our questions with a one-page letter, now do they?
Wrt the Met Office, here’s a little test. Spot four lies in the ‘Climate Change’ section of their website.
So, accepting that CO2 is not well mixed, is it also accepted that it is predictably mixed in a given area such as Mauna Loa? And if so, is any trend observed at that spot a reliable indicator of the rate of change of CO2 generally? This article doesn't discuss the consequences of predictably mixed vs. not well mixed CO2 concentrations. The point being: not well mixed does not suggest or imply that a local mixture is not predictable. The CO2 level at Mauna Loa may be average CO2 +/- N, where N is a constant.
dp,
Besides the seasonal changes, the variability of the raw data at Mauna Loa, including the infrequent volcanic vents and the depletion by upwind conditions from the valleys, is not more than 4 ppmv (1 sigma). At the South Pole it is even smaller. At many places used by Beck’s compilation, it was much higher: at one place with the longest series it was 66 ppmv (1 sigma) with a range from 200 to 700 ppmv…
All stations from near the North Pole to the South Pole show the same trends for yearly averages:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/co2_trends_1995_2004.jpg
More data at:
http://www.esrl.noaa.gov/gmd/dv/iadv/
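To see what those 1-sigma numbers mean in practice, here is a toy comparison with synthetic stand-in series (not real station records):

    # Toy comparison of "background" vs. contaminated-land-site variability.
    # Synthetic numbers chosen to mimic the sigmas quoted above.
    import numpy as np

    rng = np.random.default_rng(0)
    background = 400 + rng.normal(0, 4, 8760)   # Mauna-Loa-like: sigma ~ 4 ppmv
    land_site = 440 + rng.normal(0, 66, 8760)   # Giessen-like: sigma ~ 66 ppmv

    for name, series in (("background", background), ("land site", land_site)):
        print(name, round(series.mean()), round(series.std(), 1))
    # A yearly average of the background series is meaningful to a fraction
    # of a ppmv; the land site's mean is swamped by local sources and sinks.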
Recently published satellite data show that "well mixed" is wishful thinking. Not only is it not well mixed, but the mix signatures vary widely over time.
No, it does not: a 1% variation is 'well mixed'. It isn't 'perfectly mixed', but no one ever said that it was. When you have large localized fluxes, you will get fluctuations, particularly when you make measurements near those sources (as Beck did). Over the troposphere, turbulent mixing disperses the CO2 throughout the atmosphere down to the observed 1% fluctuation, on the timescale of the NH-SH transport.
I have seen the satellite data, and the entire scale of variation of CO2 is 15 ppmv out of a mean of 400 ppmv. That seems fairly well mixed to me.
Ben,
Here are the raw hourly and "cleaned" daily average data of Mauna Loa (tropical NH, 3,400 m height) and Samoa (tropical SH, near sea level) for 2008 at full scale:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/co2_raw_select_2008_fullscale.jpg
Seems quite well mixed to me…
Anthony Watts:
“[Note: Some parts of this essay rely on a series of air sample chemical analysis done by Georg Beck of CO2 at the surface. I consider the air samplings as having poor quality control, and not necessarily representative of global CO2 levels at those times and locations. While the methods of chemical analysis used by Beck might have been reasonably accurate, I believe the measurements suffer from a location bias, and in atmospheric conditions that were not well mixed, and should be taken with skepticism. I offer this article for discussion, but I don’t endorse the Beck data. – Anthony]”
Official measurements are made at a select 'FEW' locations. We are told this is to get the 'BACKGROUND' level. Sounds important, huh?
Consider that there are no well-mixed areas, just areas in flux with a range of values. Why measure CO2 at these select few areas?? We are certainly not getting any kind of idea of how much CO2 is actually in the atmosphere on any one day or at any one time.
Why aren't we measuring temps the same way??? Temps are being measured where we get the most anthropogenic contamination; CO2, allegedly, where we get the least, along with the lowest range. Even so, they must continually throw out out-of-range readings. WHY?????
You say not well mixed. I say the official data are taken in areas where CO2 is low. Words do make a difference. What is the AVERAGE??? Using these low areas gives the models another break. If they were run with the real CO2, their output would be even higher and even less physical. They are simply tuned to what is desired. They have no skill.
kuhnkat,
See my own compilation of where to measure CO2 at:
http://www.ferdinand-engelbeen.be/klimaat/co2_measurements.html#Where_to_measure
It doesn't make sense to measure CO2 over land, in about 5% of the atmospheric mass, where huge diurnal changes of hundreds of ppmv are present, while in 95% of the atmosphere there is not more than 4 ppmv difference between the North Pole and the South Pole for yearly averages, mostly because the source of the increase is mainly in the NH and the ITCZ hinders air exchange between the hemispheres…
Ferdinand,
I always enjoy reading your comments, and I enjoyed that link you posted of your thoughts on CO2 measurement. Perhaps you should do a post written for this site on that subject? I would enjoy seeing it.
~ Mark
I've wondered why we want to discount any periodic variation. Anthony discounts Beck because of local variation, and you say "It doesn't make sense to measure CO2 over land…". Actually, I think it becomes too much work to account for local variations. I spent many years getting stuff "well mixed" in much smaller volumes than the global atmosphere. It takes a lot of energy and effort to get small volumes well mixed; on a global scale it seems to be assumed that it happens instantaneously. If CO2 is well mixed and we don't need to look at local variations, why did we put up a satellite to measure local variations? And why doesn't it show the base assumption of a well-mixed atmosphere? In other disciplines this might be known as a glittering generality; preface it with "climate" and it seems to be hard science. What we get as hard science is one location, smoothed, giving out the climate-science version of the mystical number 666.
We use local variations when it suits us. We look at sweeping estimates of CO2 generation on a local level and add them to the measurement of global CO2. And we mount an international crisis over local "carbon sinks" whose CO2 measurements we supposedly can't trust because they are too variable?
Where would climate science be if it couldn't assume such instantaneous global mixing and smooth curves for the magic molecule that does that?
Bob Greene,
Actually, I think it becomes too much work to account for local variations.
It doesn't make sense to measure over land, as that is the equivalent of measuring temperature near an AC exhaust, over a hot asphalt parking lot, etc. If you have millions of temperature measurements, it may make sense to include these badly situated places, but it is better to choose places away from local contamination, no matter the type of measurement.
In the case of CO2, there is no reason at all to measure near huge local CO2 sources and sinks (it is done, but for measuring CO2 fluxes), which exist only over land, in less than 5% of the atmospheric mass. The only exceptions are mountain tops, deserts and ice deserts like Antarctica…
Well mixed doesn’t imply instant mixing, only that the residence time is (much) longer than the mixing time. In the case of CO2, some 20% of all CO2 in the atmosphere is swapped with CO2 from other reservoirs within a few months over the seasons. The resulting measurements show not more than +/- 2% changes of full scale over the seasons, +/- 0.5% of full scale for yearly averages… I call that well mixed…
The OCO-2 satellite is meant to measure CO2 fluxes (to punish the human sinners…); its absolute accuracy is +/- 1 ppmv, while ground measurements (NDIR) are better than +/- 0.2 ppmv…
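The residence-time-versus-mixing-time point can be put into a toy two-box model; the parameters below are assumed, illustrative values, not a real carbon-cycle model:

    # Two hemispheric boxes; a steady NH source; mixing drives them together.
    mix_tau = 1.0          # years, ASSUMED NH-SH mixing time
    emission_nh = 2.0      # ppmv/year, source placed entirely in the NH box
    nh, sh = 410.0, 405.0  # ppmv starting values (illustrative)

    dt = 0.01
    for _ in range(int(10 / dt)):        # run 10 model years
        flux = (nh - sh) / mix_tau       # interhemispheric exchange
        nh += (emission_nh - flux) * dt
        sh += flux * dt
    print(round(nh - sh, 2))  # settles at emission_nh * mix_tau / 2 = 1.0 ppmv
    # Both boxes rise together; only a small, stable NH-SH offset remains --
    # the "well mixed apart from a few ppmv" pattern seen in the station data.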
Mark Stoval,
The same discussions have been held many times, so we made a 4-part series 5 years ago in which the same items were discussed (with hundreds of reactions…):
http://wattsupwiththat.com/2010/08/05/why-the-co2-increase-is-man-made-part-1/
http://wattsupwiththat.com/2010/08/20/engelbeen-on-why-he-thinks-the-co2-increase-is-man-made-part-2/
http://wattsupwiththat.com/2010/09/16/engelbeen-on-why-he-thinks-the-co2-increase-is-man-made-part-3/
http://wattsupwiththat.com/2010/09/24/engelbeen-on-why-he-thinks-the-co2-increase-is-man-made-part-4/
The foregoing link is a compilation of these thoughts, together with the following one:
http://www.ferdinand-engelbeen.be/klimaat/co2_origin.html
kuhnkat,
I agree. The fact is that there should be people measuring CO2 levels all over the world day and night. Nearly every university in the world should have some department measuring the CO2 levels. I have several temperature measuring stations within a few miles of me and no CO2 measuring stations. Why not?
Heck, if the stuff is so important they should give me a daily CO2 measurement on the local news!
markstoval,
There is an essential difference between CO2 measurements and temperature measurements: temperature differs hugely from place to place and from near the surface to any height. CO2 is only badly mixed in the first few hundred meters over land (except at high wind speeds), but is within 1% of full scale in 95% of the atmosphere, anywhere over the oceans or from a few hundred meters over land up to 30 km height, apart from the seasonal changes, as some 20% of all CO2 in the atmosphere is exchanged with CO2 from other reservoirs (oceans and vegetation).
As the full CO2 column gives the IR absorption, the effect of the first few hundred meters over land is negligible: any change there, even a tripling or halving, has little influence on the overall absorption and on its effect on temperature before positive or negative feedbacks.
BTW, there are several tall towers in use which try to measure the CO2 fluxes into and out of a large area by measuring CO2 at different heights together with the vertical velocity. Here for Cabauw, The Netherlands:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/cabauw_day_week.jpg
As you can see, the largest changes are at ground level while at 200 m height the variability is a lot smaller.
Ferdinand
Your comment (Ferdinand Engelbeen October 18, 2015 at 5:48 am) seems a little bit like wishful thinking. My recollection of the OCO-2 data is that they suggested a variation between about 380 and about 415 ppm, so that is +/- 5%, not the 1% figure that you suggest. And whilst the full particulars of those data are not yet precisely known, it would appear that the OCO-2 data already include some smoothing, so we are not actually seeing the extent of local near-surface variability.
I consider that in the mid-to-upper atmosphere, particularly over the oceans, CO2 is reasonably well mixed. I accordingly agree with you that Mauna Loa gives a reasonable account of global atmospheric CO2 levels.
However, CO2 does not appear to be well mixed in the first 1000 metres, and I am far from convinced that this fact is properly taken into account.
But the real problem is the lack of knowledge and understanding of the carbon cycle. In particular the fact that we do not have sufficient data of each individual sink, and each individual source, and how each has changed over time. Looking at the net position is not sufficient.
Richard,
OCO-2 measures momentary data which are influenced by the seasons (+/- 8 ppmv at Barrow, +/- 4 ppmv at Mauna Loa and +/- 1 ppmv at the South Pole), plus local emissions and sinks, mainly over land. If you remove the seasonal variations by averaging over a year, the difference between Barrow and the South Pole is not more than 4 ppmv…
Indeed we don't know all the individual C fluxes, but the overall fluxes are roughly known, based on CO2 / δ13C / O2 movements, and we know the net result at the end of the year, after a full seasonal cycle: from +0.5 to +2.15 (+/- 1) ppmv/year over the past 55 years.
It doesn't make any difference if nature was a net sink through 100 GtC in and 104.5 GtC out, or 200 GtC in and 204.5 GtC out; all that counts is that it was 4.5 GtC more sink than source (in recent years), and that figure we have with reasonable accuracy…
The net increase of 70 ppmv over the same period is the only point of interest for any radiation imbalance, as far as that is not compensated by negative feedbacks…
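For anyone following along, the bookkeeping itself fits in a few lines (round, illustrative numbers in GtC/year, not an exact budget):

    # Conservation of carbon: increase = human emissions + natural net flux.
    human_emissions = 9.0   # GtC/yr, approximate recent fossil emissions
    atmos_increase = 4.5    # GtC/yr, observed rise in the atmosphere

    natural_net = atmos_increase - human_emissions
    print(natural_net)      # -4.5 GtC/yr: nature is a net SINK

    # The same net sink is consistent with any gross natural turnover:
    for gross_in in (100.0, 200.0):
        gross_out = gross_in - natural_net   # = gross_in + 4.5
        print(gross_in, "GtC in,", gross_out, "GtC out")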
"…all that counts is that it was 4.5 GtC more sink than source…"
Nonsense. This is, again, the horrifically misguided and widely debunked pseudo-mass balance argument. If you do not understand why it is so horrifically misguided, you should not be opining on the matter at all. It is not even remotely valid.
Bart,
So, you are back from vacation…
As we have had this discussion for years, here is a short reply for those not yet informed:
– The carbon mass balance must be obeyed in all circumstances, as no carbon can be destroyed or created (except traces of 14C – a different story).
– The mass balance shows that nature was a constant sink over the past 55 years.
– Human emissions, the increase in the atmosphere, and the net sinks all increased 4-fold in the past 55 years.
– The only way the natural cycle can be the cause of the increase is if the natural cycle increased 4-fold over the same period, in complete lockstep with human emissions.
– Such a 4-fold increase in the natural carbon cycle is not seen in any observation; in fact, it would violate all known observations…
Thus sorry Bart, it is you who don’t understand why all observations point to one source: human emissions…
(Sockpuppet. Comments deleted. ~mod.)
It’s a stupid argument, Ferdinand. You are really not technically qualified. You do not understand the basics of dynamically evolving systems.
[Snip. Sockpuppetry. ~mod.]
Bart,
I have enough practical knowledge of process dynamics to know that you are completely wrong: I have repeatedly shown that the only way the natural carbon cycle could overwhelm human emissions is if it increased 4-fold in the past 55 years, in complete lockstep with human emissions.
There is zero indication that the natural cycle increased at all; to the contrary, the average residence time slightly increased over time, which indicates a stable turnover within an increasing CO2 mass in the atmosphere…
Not a 3-fold or a 5-fold increase. If you can show that an increase in the total natural carbon cycle sufficiently different from 4-fold can induce a 4-fold increase of CO2 in the atmosphere and a 4-fold increase in net sink rate, together with a 4-fold increase in human emissions, then we may agree. Until now, I haven't seen such a calculation from your side…
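One way to see the "lockstep" point is a toy first-order model, dC/dt = E(t) - (C - C0)/tau, with assumed parameters; this is a sketch of the argument, not a calibrated carbon model:

    # Toy first-order sink model. All parameters are ASSUMED, illustrative.
    import math

    C0, tau = 280.0, 50.0    # ppmv equilibrium and adjustment time (assumed)
    C = 285.0                # starting CO2, ppmv
    E0, growth = 0.3, 0.025  # initial emissions (ppmv/yr) and ~2.5%/yr growth

    for year in range(1960, 2016):
        E = E0 * math.exp(growth * (year - 1960))  # grows ~4-fold over 55 yr
        sink = (C - C0) / tau                      # natural net sink
        C += E - sink
        if year % 11 == 0:
            print(year, round(C, 1), round(E, 2), round(sink, 2))
    # Emissions, the airborne increase and the net sink all scale up
    # together, while the gross natural turnover itself never changes.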
tim r.,
If that is aimed at me, first read the reasons why I am sure that human emissions are the cause of the increase:
http://www.ferdinand-engelbeen.be/klimaat/co2_origin.html
Official measurements are made at a select ‘FEW’ locations. We are told this is to get the ‘BACKGROUND’ level. Sounds important huh?
===============
By the same logic, we would only need a couple of stations measuring temperature for the whole world. That would certainly make the adjustments simpler.
Sorry Dr. Ball,
I did think that the ideas of both the late Dr. Jaworowski and the late Ernst Beck (may they rest in peace) were buried with them, but you still use them, even though they are completely debunked.
To begin with: the work of the late Ernst Beck. I had years of direct discussions with him, pointing to the problems with the historical data. Some measurements were extremely unreliable for the purpose he used them for, because the instrument was intended for measuring CO2 in exhaled air (~40,000 ppmv) and was calibrated using outside air: if the reading was between 200-500 ppmv (+/- 150 ppmv!), it was ready for use for that purpose. Beck used the calibration readings as true outside-air CO2…
After a lot of discussion, he dropped readings like that, but still included a lot of questionable data, where many were measured on land nearby huge sources and sinks.
I have looked at all his data in the period 1935-1950, as his compilation shows a huge peak (~80 ppmv) around 1942, which in itself is questionable, as that is the equivalent of burning down 1/3 of all land vegetation and regrowing it within a few years, for which there is not the slightest indication in any other proxy like stomata data or coralline sponges.
The problem with the 1942 "peak" is that it is completely based on two long series, both over land: Poonah, India and Giessen, Germany. The first series was from many measurements under, in between and over growing crops. That has zero value for knowing anything about "background" CO2 in the bulk of the atmosphere. The second series was from a semi-rural town. The diurnal changes are huge for any place within a few hundred meters over land, especially under inversion. Fortunately there is a modern station, measuring CO2 each half hour, a few km from the historical place. Its measurements show the following daily curves on a few summer days with inversion, compared to Barrow, Mauna Loa and the South Pole (all raw, uncorrected data):
http://www.ferdinand-engelbeen.be/klimaat/klim_img/giessen_background.jpg
The historical data were taken 3 times a day, of which 2 were taken on the flanks of the diurnal changes. The local bias for Giessen is already over 40 ppmv at the modern station. The variability of the historical Giessen data was 66 ppmv (1 sigma), that of the modern station around 30 ppmv, and that of Mauna Loa 4 ppmv…
Historical data taken at more representative places, like air over the oceans or coastal sites with wind from the sea, are all around the ice core CO2 data…
See further: http://www.ferdinand-engelbeen.be/klimaat/beck_data.html
Then the late Dr. Jaworowski.
In short: he was a specialist in radionuclides and their migration in ice cores. I didn't find anything showing that he made any investigation of CO2 in ice cores. He only raised, in 1992, a lot of objections about effects which may have influenced CO2 readings in ice cores, but almost all were answered by the work of Etheridge e.a. on three high-accumulation ice cores at Law Dome, reflected in his report of 1996:
http://onlinelibrary.wiley.com/doi/10.1029/95JD03410/pdf unfortunately behind a paywall…
What closed the door for me was his remark that CO2 readings in ice cores are too low because CO2 migrates from the inside to the open air through cracks caused by drilling and expansion of the core. As we measure 180-300 ppmv in the air bubbles of the ice cores, while the outside air was at 360-400 ppmv at measurement time, that is simply impossible; it is the other way around…
See further: http://www.ferdinand-engelbeen.be/klimaat/jaworowski.html
So Dr. Ball, it is of no purpose to use the historical compilation of the late Ernst Beck, or the remarks of Dr. Jaworowski, as the historical data don't show real background CO2 levels of those periods, since most were taken in the 5% of the atmosphere where most of the fast sources and sinks are present. Ice core data are far more reliable, be it with one drawback: the resolution, which gets worse the farther you go back in time.
Stomata data also have their problems, which are worse than those of the ice core data: if the CO2 average from the stomata proxy differs from the direct ice core data over a period longer than the ice core resolution, the stomata data are certainly wrong…
See further:
http://www.ferdinand-engelbeen.be/klimaat/co2_measurements.html#Where_to_measure
Thus sorry, Dr. Ball, this part of the AGW discussion is completely outdated. It can only be used by warmistas to point to the supposed stupidity of all skeptics, and it undermines the real valid points that skeptics have: the lack of recent warming and the real impact of CO2 on temperatures, whatever its source…
@Ferdinand Engelbeen,
You say: “…if the CO2 average from the stomata proxy differs from the ice core direct data over a period longer than the ice core resolution, the stomata data are certainly wrong…”
What would the ice core resolution be for core samples taken from depths equivalent to 5,000 years ago? How do you determine that two core samples from different depths are sufficiently separated so as not to be correlated through diffusion?
(Sockpuppet. Comments deleted. ~mod.)
willb02,
Ice core resolution depends mainly on the snow accumulation rate at the place where the ice is formed.
The highest resolution is at Law Dome, coastal, with 1.2 m ice-equivalent snowfall per year. That gives a resolution of less than a decade over the past 150 years, where the thick ice layers reach near bedrock. Other, more inland cores have only a few mm of ice-equivalent snowfall per year and resolutions of 600 years (Vostok) to 560 years (Dome C), going back in time some 420,000 years (Vostok) and 800,000 years (Dome C). So resolution and time span are more or less inversely correlated.
For more info about ice cores see: http://courses.washington.edu/proxies/GHG.pdf
The repeatability of CO2 measurements in one ice core at the same depth is better than 1.2 ppmv (1 sigma) and the difference between different ice cores for the same average gas age is less than 5 ppmv. For 5000 years ago we have several cores with different resolution available:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/antarctic_cores_010kyr.jpg
of which the best resolutions are from Siple Dome (20-25 years resolution) and Taylor Dome (~40 years resolution).
Laboratory measurements under high pressure differences were insufficient to detect any migration, so they used an indirect method: looking at the CO2 levels near melt layers in the (coastal) Siple Dome ice core.
These are enriched in CO2, partially caused by migration (although that is questionable). But anyway, the result was that any theoretical migration would broaden the resolution of the Siple Dome ice core at middle depth from 20 to 22 years. At full depth (~70,000 years back in time), it would be a doubling from 20 to 40 years, due to more time and thinner layers to migrate through. Altogether no big deal, and negligible in the much colder inland ice cores like Vostok and Dome C.
If there were appreciable migration, then the roughly 8 ppmv/°C difference between glacial and interglacial periods would fade for each interglacial further back in time, 100,000 years apart, which is not seen at all, not even after 800,000 years…
Thus if the stomata data show different CO2 levels than the ice cores some 5,000 or 10,000 years ago, over a period longer than e.g. the Taylor Dome resolution of ~40 years, then the stomata data should be recalibrated for that period… But they still have their value, as they show more rapid changes, especially with rapid temperature events like the Younger Dryas…
Ferdinand Engelbeen,
I appreciate your taking the time to answer my question.
In trying to estimate past atmospheric CO2 concentrations from ice cores, it seems to me there are a number of questions related to resolution that need to be answered:
1. How accurately can a particular depth in the ice core be sampled?
2. How accurately can a particular age be assigned to that sample?
3. What separation between depths is required to ensure that two distinct samples are uncorrelated?
I am concerned mainly with the 3rd question and the two factors that I believe determine the answer. One is the rate of snow accumulation, for which you’ve already provided a good explanation. The second is the extent of CO2 diffusion through the length of the ice core. The link you provided suggests that there are some uncertainties in the assumptions that go into the diffusion model. Here are a couple of relevant quotes:
“Diffusivity-porosity relationship is the most uncertain component of the gas diffusion model.”
“Gradual bubble close off during the transition from firn to ice leads to a smoothing of atmospheric concentrations over time.”
How much smoothing and over what time frame will be a major factor in determining the resolution of ice core CO2 estimates.
Ferdinand Engelbeen,
You say: “If there was some appreciable migration, then the about 8 ppmv/°C difference between glacial and interglacial periods would fade over time for each interglacial 100,000 years further back in time, which is not seen at all, not even after 800,000 years…”
Not so if migration ceased after 1,000 years.
Willb01:
Not so if migration ceased after 1,000 years.
Why would it cease? As long as there is a CO2 concentration difference (in general 100-120 ppmv over the past 800,000 years), migration should go on, if there is any migration at all. Except with some physical mechanism that blocks the pores (as far as there are any) after 1,000 years. But the only difference that would make is that the changes need more time; for the long-term cores, 1-2 samples at their 600- and 560-year resolutions…
If there were real migration, the problem would show in both the peaks and the valleys: every peak back in time must originally have been higher and higher to maintain the same ratio over a longer period of migration, and the valleys must originally have been lower, which is already problematic at 180 ppmv for a lot of C3 plants…
Allow me once again to salute Mr. Engelbeen’s patience in explaining what is known about historical carbon-dioxide concentration. I know little of the area myself, so I hope I’m not being deceived, but to me his contributions always have the ring of truth.
That said, I wonder if there’s a typo in “a huge peak (~80 ppmv) around 1942.”
Joe Born,
Thanks for the compliment…
The 1942 "peak" of 80-100 ppmv is measured relative to the "baseline" before and after the "peak", and comes from a more smoothed compilation of the same data as in Fig. 4 of Dr. Ball's story. Even the smoothed version is physically impossible: there is no source and sink in the world which can release and then take up 80-100 ppmv in some 5 years' time without being noticed in any other proxy or in ice core measurements.
The unsmoothed version in Fig. 4 shows a 120 ppmv drop in only 2 years' time, then an 80 ppmv increase in 2 (or 3) years, then again a drop of 120 ppmv in two years, which is absolutely impossible…
And all these huge variations suddenly end when the accurate measurements at Mauna Loa started… That should ring a bell about the reliability of the historical data…
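That bell can be rung quantitatively with a simple smoothing sketch, treating bubble close-off as a moving average with a window equal to the core resolution (a crude but common approximation; synthetic series below):

    # Would an ~80 ppmv, ~5-year spike survive ice core smoothing?
    import numpy as np

    years = np.arange(1000, 3000)                   # long baseline for padding
    co2 = np.full(years.size, 310.0)
    co2[(years >= 1940) & (years < 1945)] += 80.0   # the claimed 1942 excursion

    for resolution in (10, 40, 560):                # Law Dome, Taylor Dome, Dome C
        kernel = np.ones(resolution) / resolution
        residual = np.convolve(co2, kernel, mode="same").max() - 310.0
        print(resolution, "yr resolution:", round(residual, 1), "ppmv residual peak")
    # Even a 40-yr resolution core should retain a ~10 ppmv bump (80 * 5/40),
    # well above the few-ppmv spread between cores -- yet no core shows one.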
Willb01 (sorry, Willb02 was not for your brother…)
1. How accurately can a particular depth in the ice core be sampled?
2. How accurately can a particular age be assigned to that sample?
3. What separation between depths is required to ensure that two distinct samples are uncorrelated?
1. is easily answered: with the sublimation technique one needs only a few cm3 of ice, which can be taken circularly (after removing the outer layer) from the same depth of the ice core. The repeatability is better than 1.2 ppmv (1 sigma) for samples at the same depth.
2. is the most difficult to answer. The age of the ice is easier: counting layers, as long as these are thick enough. They have counted all 110,000 layers in the ice core of the Greenland summit… Where the layers get thinner, other means are used: electrical conductivity (winter and summer layers have different density and conductivity), radar, echo, certain historical events like ash layers from known huge volcanic eruptions, etc.
For the average gas age it is a lot more difficult. It is mainly based on a firn densification model, which gives an answer on the time needed to fully close all the air bubbles. That gives the average age of the enclosed air, the resolution, and the difference between the age of the surrounding ice and the average gas age at the same depth. The latter is the most uncertain part of the whole story and has been subject to several revisions for many ice cores…
Here too there are other ways to check the age distribution, using recent human chemicals like CFCs and the presence of cosmogenic nuclides…
On the other hand, precipitation, and thus the ice age - gas age difference, differs between glacial periods and interglacials…
See some overviews at: http://www.ncdc.noaa.gov/data-access/paleoclimatology-data/datasets/ice-core
3. In general I would say that the samples should be separated by at least the resolution, translated into a depth difference, but I have the impression that they use some oversampling…
Ferdinand Engelbeen,
Again, thank you for your very informative reply.
“In general I should say that the samples should be separated by at least the resolution, translated to depth difference, but I have the impression that they use some oversampling…”
I would say that too. And if it is not known whether oversampling has occurred for a particular time period, I think this sentence of yours is overstating your case:
“…if the CO2 average from the stomata proxy differs from the ice core direct data over a period longer than the ice core resolution, the stomata data are certainly wrong…”
since the ice core resolution for the period in question may not be known with any accuracy.
Willb01,
I think this sentence of yours is overstating your case
It depends on the trend in such a period: if there is little trend, a constant offset is a sure sign of a bias in the stomata data, while on the flanks of an increase or decrease, it may be due to timing errors in the ice core…
In most cases I have seen, the difference between stomata and ice core CO2 data is in rather flat periods.
Ferdinand Engelbeen
“Why would it cease?”
Well, it clearly does cease, and for the very reason you mention – because of the presence of the peaks and valleys in the ice core record. There also clearly is real migration occurring for the period of time between snowfall and the complete close-off of all diffusion pathways. The question is, how long does it take for that to happen?
willb01,
The migration indeed occurs in the gas phase before bubble closing, and slows with increasing density, thus decreasing pore diameter. There is a lot of time to migrate before full closing.
At the high-accumulation cores of Law Dome, it takes ~40 years for the bubbles to start closing, and a few (~8) years more before they are fully closed. The composition at the start-closing depth of ~72 meters is on average only 7 years younger than the atmosphere above. At full closing depth the average gas age is about 10 years more than that of the open atmosphere, as it is a mix of early- and late-closed bubbles. That makes the average gas age at a given depth ~30 years younger than the surrounding ice, with a resolution of less than a decade.
For Vostok, where the closing takes many centuries, the CO2 at start closing (~80 m depth) is also only ~10 years younger than in the atmosphere, but the ice is already thousands of years old… Because of the time needed between the first and last closures, the resolution broadens to several centuries, and the gas age - ice age difference is many thousands of years.
After full closing, there is no measurable migration through the ice: only a very small (theoretical) migration in relatively "warm" (coastal) ice cores, and none in the much colder inland cores. The only migration you see is thus in the gas phase, and it is incorporated in the width of the resolution.
Thanks Dr. Ball, very interesting. If Callendar was a steam engineer, he should have known better. It was early work with steam that hatched the laws of thermodynamics; yes folks, the laws that AGW/GHE violate.
John,
The difference between Beck and Callendar is that Callendar used very stringent a priori criteria to include or exclude data, while Beck lumped everything together: the good, the bad and the ugly. After a lot of discussion, he dropped the ugly data, but still included bad data, which should have been rejected as not representative of "global" CO2 levels in the atmosphere…
In whose opinion were the data "bad"? Beck was a scientist who knew his subject; Callendar was a steam engineer who cherry-picked the data to prove his biased thinking.
Have you actually read the Beck paper in question?
John,
I not only read all the published historical papers for the period 1935-1955, as that is the period of a huge "peak" in Beck's compilation (which doesn't exist in any ice core or other proxy); I also discussed them directly with E-G Beck until his untimely death, including a direct confrontation of the data and results at the home of Arthur Rörsch in The Netherlands.
See e.g. the discussion of the ocean “atmospheric” data here.
After long discussions with him, he dropped the data from Barrow (accuracy +/- 150 ppmv!) and Antarctica (+/- 300 ppmv for CO2, +/- 400 ppmv for O2). But others with sampling problems, and especially big local-contamination problems, remained in his compilation…
And why would Callendar have biased thinking? Many at that time saw higher temperatures as beneficial.
OMG not this Beck tripe again?
OMG not this Hans Erren tripe again?
Yes Hans,
Again and again, the same discussion, which has not the slightest interest and is even counterproductive in any discussion with warmistas…
Another thought provoking essay from Dr. Tim Ball. Thanks.
There are many reasons to believe that the CO2 level of pre-industrial times was not the value that the IPCC says it was. Just the fact that the political organization IPCC stated the values makes them suspect. What we do know is that CO2 levels have been much higher in the past without "destroying the earth". That goes hand in hand with the fact that temperatures have gone up in the distant past and then, after a long time lag, CO2 levels went up. (The thing that came after did not cause the thing that came before.)
We also know that the laws of thermodynamics tell us that CO2 is not the driver of the earth’s temperature. All in all, I normally believe the laws of thermodynamics over the rent-seekers, politically motivated “scientists” and railroad engineers of the IPCC.
~ Mark
markstoval,
My main response to Dr. Ball is still in moderation, but in short: the assumption by the late Dr. Jaworowski is false and was completely refuted by the work of Etheridge e.a. already in 1996. Dr. Jaworowski made statements which are simply impossible, like migration of CO2 from low levels (180-300 ppmv) in ice core bubbles via cracks in the ice to the outside air (at 360-400 ppmv), which is contrary to what any engineer can tell you…
Further refuted by other proxies like coralline sponges, which show the opposite movements of δ13C and CO2 changes in ocean surface waters and air, firn and ice cores:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/sponges.jpg
(Sockpuppet. Comments deleted. ~mod.)
IPCC AR5 Figure 1 is in petagrams of C, not CO2!! One gram of C makes 3.67 grams of CO2!!
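The factor is simply the ratio of the molar masses (rounded to whole g/mol):

    \[ \frac{m_{\mathrm{CO_2}}}{m_{\mathrm{C}}} = \frac{M_{\mathrm{CO_2}}}{M_{\mathrm{C}}} \approx \frac{44\ \mathrm{g/mol}}{12\ \mathrm{g/mol}} \approx 3.67 \]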