CO2 data might fit the IPCC hypothesis, but it doesn't fit reality

Opinion by Dr. Tim Ball

I have no data yet. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. – Arthur Conan Doyle. (Sherlock Holmes)

Create The Facts You Want.

In a comment about the WUWT article “The Record of recent Man-made CO2 emissions: 1965-2013”, Pamela Gray, graphically but pointedly, summarized the situation.

When will we finally truly do the math? The anthropogenic only portion of atmospheric CO2, let alone China’s portion, does not have the cojones necessary to make one single bit of “weather” do a damn thing different. Take out just the anthropogenic CO2 and rerun the past 30 years of weather. The exact same weather pattern variations would have occurred. Or maybe because of the random nature of weather we would have had it worse. Or it could have been much better. Now do something really ridiculous and take out just China’s portion. I know, the post isn’t meant to paint China as the bad guy. But. Really? Really? All this for something so tiny you can’t find it? Not even in a child’s balloon?

The only quibble I have is that, while the amount illustrates the futility of the claims, as Gray notes, the Intergovernmental Panel on Climate Change (IPCC) and Environmental Protection Agency (EPA) are focused on trends and attribution. It must have a human cause and be steadily increasing, or, as they prefer, getting worse.

Narrowing the Focus

It’s necessary to revisit criticisms of the CO2 levels created by the IPCC over the last several years. Nowadays, a measure of the accuracy of those criticisms is the vehemence of the personal attacks designed to divert attention from the science and evidence.

From its inception, the IPCC focused on human production of CO2. It began with the definition of climate change, provided by the UNFCCC, as only those changes caused by humans. The goal was to prove the hypothesis that an increase in atmospheric CO2 would cause warming. This required evidence that the level had increased since pre-industrial times and would increase each year because of human industrial activity. How long before they start reducing the rate of CO2 increase to make it fit the declining temperatures? They are running out of guesses, 30 at latest count, to explain the continued lack of temperature increase, now at 17 years and 10 months.

The IPCC makes the bizarre claim that up until 1950 human addition of CO2 was a minor driver of global temperature, but that after that, over 90 percent of the temperature increase is due to human CO2.

Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations.

 

The claim that a fractional increase in CO2 from human sources, which is naturally only 4 percent of all greenhouse gases, becomes the dominant factor in just a couple of years is incredulous. This claim comes from computer models, which are the only place in the world where a CO2 increase causes a temperature increase. It depends on human production and atmospheric levels increasing. It assumes temperature continues to increase, as all three IPCC scenario projections imply.

Their frustration is that they control the CO2 data, but after the University of Alabama in Huntsville (UAH) began producing satellite global temperature data, control of temperature data was curtailed. It didn’t stop them completely, as disclosures by McIntyre, Watts, Goddard, and the New Zealand Climate Science Coalition, among others, illustrated. They all showed adjustments designed to enhance and emphasize higher modern temperatures.

Now they’re confronted with T. H. Huxley’s challenge,

The Great Tragedy of Science – the slaying of a beautiful hypothesis by an ugly fact.

This article examines how the modern levels of atmospheric CO2 were determined and controlled to fit the hypothesis. They may fit a political agenda, but they don’t fit nature’s agenda.

New Deductive Method: Create the Facts to Fit the Theory

Farhad Manjoo asked in True Enough: Learning To Live In A Post-fact Society,

“Why has punditry lately overtaken news? Why do lies seem to linger so long in the cultural subconscious even after they’ve been thoroughly discredited? And why, when more people than ever before are documenting the truth with laptops and digital cameras, does fact-free spin and propaganda seem to work so well?”

Manjoo’s comments apply to society in general, but they are amplified in climate science because of the public’s differing abilities with scientific issues. A large majority is more easily deceived.

Manjoo argues that people create facts themselves or find someone to produce them. Creating data is the only option in climate science because, as the 1999 NRC Report found, there is virtually none. A response to the February 3, 1999 US National Research Council (NRC) Report on Climate Data said,

“Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.”

The situation is worse today. The number of stations used has been dramatically reduced, and records have been adjusted to lower historic temperature data, which increases the gradient of the record. A lack of data for the oceans was recently identified.

“Two of the world’s premiere ocean scientists from Harvard and MIT have addressed the data limitations that currently prevent the oceanographic community from resolving the differences among various estimates of changing ocean heat content.”

Oceans are critical to CO2 levels because of their large sink or source capacity.

The data necessary to create a viable determination of climate mechanisms, and thereby climate change, are completely inadequate. This applies especially to the structure of climate models. There is no data for at least 80 percent of the grids covering the globe, so they guess; it’s called parameterization. The 2007 IPCC Report notes,

Due to the limited resolutions of the models, many of these processes are not resolved adequately by the model grid and must therefore be parameterized. The differences between parameterizations are an important reason why climate model results differ.

Variable results occur because of inadequate data at the most basic level and subjective choices by the people involved.

The IPCC Produces The Human Production Numbers

In 2001, the IPCC Report identified 6.5 GtC (gigatons of carbon) from human sources. The figure rose to 7.5 GtC in the 2007 Report, and by 2010 it was 9.5 GtC. Where did they get these numbers? The answer is that the IPCC has them produced and then vets them. In the FAQ section they ask, “How does the IPCC produce its Inventory Guidelines?”

Utilizing IPCC procedures, nominated experts from around the world draft the reports that are then extensively reviewed twice before approval by the IPCC.

The emissions estimates were called the Special Report on Emissions Scenarios (SRES) until the 2013 Report, when they became Representative Concentration Pathways (RCP). In March 2001, John Daly reported Richard Lindzen referring to the SRES and the entire IPCC process, including SRES, as follows,

In a recent interview with James Glassman, Dr. Lindzen said that the latest report of the UN-IPCC (that he helped author), “was very much a children’s exercise of what might possibly happen” prepared by a “peculiar group” with “no technical competence.”

William Kininmonth, author of the insightful book “Climate Change: A Natural Hazard”, was formerly head of Australia’s National Climate Centre and its delegate to the WMO Commission for Climatology. He wrote the following in an email on the ClimateSceptics group page.

I was at first confused to see the RCP concept emerge in AR5. I have come to the conclusion that RCP is no more than a sleight of hand to confuse readers and hide absurdities in the previous approach.

You will recall that the previous carbon emission scenarios were supposed to be based on solid economic models. However, this basis was challenged by reputable economists and the IPCC economic modelling was left rather ragged and a huge question mark hanging over it.

I sense the RCP approach is to bypass the fraught economic modelling: prescribed radiation forcing pathways are fed into the climate models to give future temperature rise—if the radiation forcing plateaus at 8.5W/m2 sometime after 2100 then the global temperature rise will be 3C. But what does 8.5 W/m2 mean? Previously it was suggested that a doubling of CO2 would give a radiation forcing of 3.7 W/m2. To reach a radiation forcing of 7.4 W/m2 would thus require a doubling again—4 times CO2 concentration. Thus to follow RCP8.5 it is necessary for the atmospheric CO2 concentration equivalent to exceed 1120ppm after 2100.

We are left questioning the realism of a RCP 8.5 scenario. Is there any likelihood of the atmospheric CO2 reaching about 1120 ppm by 2100? IPCC has raised a straw man scenario to give a ‘dangerous’ global temperature rise of about 3C early in the 22nd century knowing full well that such a concentration has an extremely low probability of being achieved. But, of course, this is not explained to the politicians and policymakers. They are told of the dangerous outcome if the RCP8.5 is followed without being told of the low probability of it occurring.

One absurdity is replaced by another! Or have I missed something fundamental?[1]

No, nothing is missed! In reality, however, it doesn’t matter whether it changes anything; it achieves the goal of increasing CO2 and its supposed impact on global warming. The underpinning of IPCC climate science and economics depends on accurate data and knowledge of mechanisms, and that is not available.
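Kininmonth’s forcing arithmetic can be checked against the commonly cited logarithmic approximation for CO2 forcing, ΔF = 5.35 ln(C/C0). This is a minimal sketch, not anything from Kininmonth’s email itself; the 280 ppm pre-industrial baseline and the 5.35 coefficient are the usual assumed values:

```python
import math

def forcing(c_ppm, c0_ppm=280.0):
    """CO2 radiative forcing in W/m^2, logarithmic approximation."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def concentration(f_wm2, c0_ppm=280.0):
    """CO2 level in ppm implied by a given forcing (inverse of above)."""
    return c0_ppm * math.exp(f_wm2 / 5.35)

print(round(forcing(560), 2))     # one doubling: ~3.71 W/m^2
print(round(forcing(1120), 2))    # two doublings: ~7.42 W/m^2
print(round(concentration(8.5)))  # level implied by 8.5 W/m^2: ~1371 ppm
```

Under this formula, the full 8.5 W/m2 forcing implies a CO2-equivalent concentration well above the 1120 ppm (four times pre-industrial) figure Kininmonth cites, which is his point about the scenario’s implausibility.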

We know there was insufficient weather data on which to construct climate models and the situation deteriorated as they eliminated weather stations, ‘adjusted’ them and then cherry-picked data. We know knowledge of mechanisms is inadequate because the IPCC WGI Science Report says so.

Unfortunately, the total surface heat and water fluxes (see Supplementary Material, Figure S8.14) are not well observed.

or

For models to simulate accurately the seasonally varying pattern of precipitation, they must correctly simulate a number of processes (e.g., evapotranspiration, condensation, transport) that are difficult to evaluate at a global scale.

 

Two critical situations were central to control of atmospheric CO2 levels. We know Guy Stewart Callendar, a British steam engineer, cherry-picked the low readings from 90,000 19th-century atmospheric CO2 measures. This not only established a low pre-industrial level, but also altered the trend of atmospheric levels. (Figure 1)


Figure 1 (After Jaworowski; Trend lines added)

Callendar’s work was influential in the Gore-generated claims of human-induced CO2 increases. However, the most influential paper in the climate community, especially at CRU and the IPCC, was Tom Wigley’s 1983 paper, “The pre-industrial carbon dioxide level” (Climatic Change 5, 315–320). I held seminars in my graduate-level climate course on its validity and the selectivity used to establish a pre-industrial baseline.

I wrote an obituary on learning of Ernst-Georg Beck’s untimely death.

I was flattered when he asked me to review one of his early papers on the historic pattern of atmospheric CO2 and its relationship to global warming. I was struck by the precision, detail and perceptiveness of his work and urged its publication. I also warned him about the personal attacks and unscientific challenges he could expect. On 6 November 2009 he wrote to me, “In Germany the situation is comparable to the times of medieval inquisition.” Fortunately, he was not deterred. His friend Edgar Gartner explained Ernst’s contribution in his obituary: “Due to his immense specialized knowledge and his methodical severity Ernst very promptly noticed numerous inconsistencies in the statements of the Intergovernmental Panel on Climate Change (IPCC). He considered the warming of the earth’s atmosphere as a result of a rise of the carbon dioxide content of the air of approximately 0.03 to 0.04 percent as impossible. And he doubted that the curve of the CO2 increase noted on the Hawaiian volcano Mauna Loa since 1957/58 could be extrapolated linearly back to the 19th century.” (This is a translation from the German.)

Beck was the first to analyze the 19th-century data in detail. It was data collected in scientific attempts to measure precisely the amount of CO2 in the atmosphere. The effort began in 1812, triggered by Priestley’s work on atmospheric oxygen, and was part of the scientific effort to quantify all atmospheric gases. There was no immediate political motive. Beck did not cherry-pick the results, but examined the method, location and as much detail as possible for each measure, in complete contrast to what Callendar and Wigley did.

The IPCC had to show that,

· Increases in atmospheric CO2 caused temperature increase in the historic record.

· Current levels are unusually high relative to the historic record.

· Current levels are much higher than pre-industrial levels.

· The differences between pre-industrial and current atmospheric levels are due to human additions of CO2 to the atmosphere.

Beck’s work showed the fallacy of these claims and in so doing put a big target on his back.

Again from my obituary;

Ernst Georg Beck was a scholar and gentleman in every sense of the term. His friend wrote, “They tried to denounce Ernst Georg Beck on the Internet as a naive amateur and data counterfeiter. Unfortunately, Ernst could hardly defend himself in the last months because of his progressive illness.” His work, determination and ethics were all directed at answering questions in the skeptical method that is true science; the antithesis of the efforts of all those who challenged and tried to block or denigrate him.

The 19th-century CO2 measures are no less accurate than those for temperature; indeed, I would argue that Beck shows they are superior. So why, for example, are his assessments any less valid than those made for the early portions of the Central England Temperatures (CET)? I spoke at length with Hubert Lamb about the early portion of Manley’s CET reconstruction because the instruments, locations, measures, records and knowledge of the observers were comparable to those in the Hudson’s Bay Company record I was dealing with.

Once the pre-industrial level was created, it became necessary to ensure the new post-industrial CO2 trend continued. This was achieved when C. D. Keeling established the Mauna Loa CO2 measuring station. As Beck notes,

Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC.

Keeling’s son operates Mauna Loa and, as Beck notes, “owns the global monopoly of calibration of all CO2 measurements.” He is also a co-author of the IPCC reports, which accept Mauna Loa and all other readings as representative of global levels. So the IPCC controls the human production figures and the atmospheric CO2 levels, and both are constantly and consistently increasing.

This diverts attention from the real problem with the measures and claims. The fundamental IPCC objective is to identify human causes of global warming. You can only determine the human portion and contribution if you know natural levels and how much they vary, and we have only very crude estimates.

What Values Are Used for Each Component of the Carbon Cycle?

Dr. Dietrich Koelle is one of the few scientists to assess estimates of natural annual CO2 emissions.

Annual Carbon Dioxide Emissions, GtC per annum

1. Respiration (humans, animals, phytoplankton): 45 to 52

2. Ocean out-gassing (tropical areas): 90 to 100

3. Volcanic and other ground sources: 0.5 to 2

4. Ground bacteria, rotting and decay: 50 to 60

5. Forest cutting, forest fires: 1 to 3

6. Anthropogenic emissions, fossil fuels (2010): 9.5

TOTAL: 196 to 226.5

Source: Dr. Dietrich Koelle

The IPCC estimate of human production (item 6) for 2010 was 9.5 GtC, but that is gross production. One of the early issues in the push to ratify the Kyoto Protocol was an attempt to get US ratification. The US asked for carbon credits, primarily for CO2 removed through reforestation, so that a net figure would apply to its assessment as a developed nation. The request was denied. The reality is that the net figure better represents human impact. If we use net human production (item 6) at 5 GtC for 2010, it falls within the uncertainty range of the estimates for three natural sources, (1), (2), and (4).
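The table’s totals, and the relative size of the human contribution, can be verified by summing the quoted ranges. This is a quick sketch of the arithmetic; the percentage figures at the end are my own addition, not Koelle’s:

```python
# Koelle's estimated annual CO2 sources, in GtC per annum, as (low, high).
# The anthropogenic entry is the single 2010 value given in the table.
sources = {
    "respiration": (45.0, 52.0),
    "ocean_outgassing": (90.0, 100.0),
    "volcanic_and_ground": (0.5, 2.0),
    "soil_bacteria_decay": (50.0, 60.0),
    "forest_cutting_fires": (1.0, 3.0),
    "anthropogenic_2010": (9.5, 9.5),
}

total_low = sum(low for low, high in sources.values())
total_high = sum(high for low, high in sources.values())
print(total_low, total_high)  # 196.0 226.5, matching the table's TOTAL

# Human share of total emissions at either end of the natural estimates:
human = 9.5
print(round(100 * human / total_high, 1))  # 4.2 percent
print(round(100 * human / total_low, 1))   # 4.8 percent
```

Note that the 9.5 GtC human figure is smaller than the spread of uncertainty (7 to 10 GtC) in each of the three largest natural sources, which is the point the surrounding text makes.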

The Truth Will Out.

How much longer will the IPCC continue to produce CO2 data with trends to fit their hypothesis that temperature will continue to rise? How much longer before the public becomes aware of Gray’s colorful observation that, “The anthropogenic only portion of atmospheric CO2, let alone China’s portion, does not have the cojones necessary to make one single bit of “weather” do a damn thing different.” The almost 18-year leveling and slight reduction in global temperature is essentially impossible under IPCC assumptions. One claim has already been made that the hiatus doesn’t negate their science or projections, instead of acknowledging that the hiatus, along with their failed predictions, completely rejects their fear-mongering.

IPCC and EPA have already shown that being wrong or being caught doesn’t matter. The objective is the scary headline, enhanced by the constant claim it is getting worse at an increasing rate, and time is running out. Aldous Huxley said, “Facts do not cease to exist because they are ignored.” We must make sure they are real and not ignored.


[1] Reproduced with permission of William Kininmonth.


A typical Beck site was Giessen. Ferdinand E has a plot of the daily cycle here, with modern measurements every half hour. They vary during a day from 350 to 500 ppm. You can analyse as accurately as you like, but the answer will depend on what time you sample. This has nothing to do with global CO2.
Here is a plot of CO2 measured at Mauna Loa, and in ice cores, over the last thousand years, matched with tonnage of emissions and CO2 liberated by land clearing. It’s hard to say human emissions had nothing to do with the CO2 rise.

Sorry Dr. Ball, this is such a bunch of nonsense and misinterpretations that I don’t even know where to start.
CO2 emissions inventories are not done by the IPCC. The guidelines are made by the IPCC, but the inventories are made by the governments of each country based on production / use of fossil fuels.
The IPCC doesn’t control these figures, except if clear mistakes were made or clarifications are needed. But still others like oil giant BP give similar overviews.
That has nothing to do with the future scenarios used by the IPCC to test the different climate models for what “may” happen with climate under different emissions schemes.
Their frustration is they control the CO2 data
this is just nonsense: they don’t control the CO2 data, neither the human emissions nor the measurements. Or do you really think that they will curb the Mauna Loa and lots of other station data to accommodate the temperature “pause”? I suppose that the hundreds of people working in different organizations in different countries, all measuring CO2, wouldn’t appreciate that.
Beck did not cherry-pick the results, but examined the method, location and as much detail as possible for each measure, in complete contrast to what Callendar and Wigley did.
Again sorry, but that was the problem with the late Beck’s interpretation of the data: he didn’t cherry-pick the data, he simply lumped them all together: the good, the bad and the ugly. Guy Callendar had pre-defined criteria like “not done for agricultural purposes”. That would remove a lot of suspect data which were used by Beck: all series from Poonah (India) were taken under, above and in between growing crops, which has nothing to do with “background” CO2, but it is one of the two long series used by Beck, and it causes his “peak” in CO2 around 1942.
Simply said, a lot of data used by Beck and rejected by Callendar were taken over land near huge sources and sinks of CO2. That is the equivalent of temperature measurements on a hot asphalt roof.
Callendar was right, Beck was wrong: decades after Callendar, the measurements taken at better places (over the oceans, or at the seaside with wind from the oceans) all fall around the ice core data for the same time frame.
See further:
http://www.ferdinand-engelbeen.be/klimaat/beck_data.html
The 19th-century CO2 measures are no less accurate than those for temperature
The accuracy of most old wet methods was +/- 10 ppmv (several were much worse), hardly sufficient to see the seasonal variations or a trend in that period. That is why Keeling searched for more accurate methods, which were also far less labor- and maintenance-intensive.
owns the global monopoly of calibration of all CO2 measurements
Keeling Jr. owns nothing. In the early days, Scripps under Keeling Sr. did calibrate all instruments and calibration gases all over the world, because that is what needs to be done by someone somewhere.
Some years ago, NOAA got the calibration task from the WMO, but (the Japanese and) Scripps still have their own calibration sets. Scripps still measures at Mauna Loa independently of NOAA. If NOAA changed the data, I am pretty sure Scripps would react, as they are still mad that NOAA got their work.
If we use human net production (6) at 5 GtC for 2010, then it falls within the range of the estimate for three natural sources, (1), (2), and (4).
Completely irrelevant: human net production is additional; the natural sources are more than compensated by natural sinks. Nature is a net sink for CO2, not a source.
Thus sorry Dr. Ball, too many misinterpretations and non-factual remarks not based on actual information…

Justthinkin

Anthony… never, ever forget that the IPCC and global whatever was created to advance Agenda 21 of the UN. Let there be no mistake about that. (And cops at the door again.)

steveta_uk

… After that over 90 percent of temperature increase is due to human CO2.
Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations.

The author is misreading this, I think. What it says is that the IPCC is 95% confident (“very likely”) that over 50% (“most of”) the increase is anthropogenic. 90% isn’t mentioned anywhere.

“The claim that a fractional increase in CO2 from human sources, which is naturally only 4 percent of all greenhouse gases, become the dominant factor in just a couple of years is incredulous.”
Incredulous means “skeptical”
The word you are looking for is “incredible”.

mikewaite

The “adjustment” of historic data on CO2 concentration or temperature in order to justify an extrapolated future extreme warming has implications beyond its immediate effect on the increased taxation we are all experiencing. Professionals who have to plan future health requirements, agricultural needs or even financial market trends are all involved.
An example will show what I mean, taken from a short BBC radio news item yesterday. A botanist was explaining that the conversion of CO2 into sugars, and hence the yield of cereals, occurs by two mechanisms: the “C3” route, the original mechanism, and a “C4” route that evolved about 60 million years ago as the climate became warmer and drier. Rice, the staple cereal of about half the world’s population, is a member of the C3 family, whilst maize, with its enormously greater yield, belongs to the C4 group. Apparently attempts are being made to gene-convert rice into a C4-type cereal because of the future hotter global climate that the climate scientists are promising.
Given that changes in rice variety tend to be irreversible (think of the success of “miracle rice”), and that a few people think the future may be not a warmer but a colder climate, the lives of those 1 billion people for whom rice is the major calorie source will be badly affected if plant breeding in general is based on what appears to be, from the essay above, a possibly dubious scientific basis.

Actually, this is quite false.
“Take out just the anthropogenic CO2 and rerun the past 30 years of weather. The exact same weather pattern variations would have occurred.”
As Pamela should know, in order to improve the forecast ability of weather models you actually have to model radiative physics, and yes, that includes CO2.

Dr Ball, thanks for the article but please be careful about citing Goddard as an authority. For every 5 clever insights he has on climate, he’ll toss in 5 equally absurd ones. Your opponents will focus on the weakest links in your claims.

ren

Let’s see how ionization of the stratosphere above the polar circle influences the shape of the polar vortex.
http://www.cpc.ncep.noaa.gov/products/stratosphere/strat_a_f/gif_files/gfs_z10_sh_f00.gif

ren

“Using the oxygen isotope and Sr/Ca thermometers measured in Barbados corals spanning the last deglaciation, we first concluded that tropical sea surface temperatures were as much as 5 degrees cooler during the last glacial period. Although we have since abandoned the Sr/Ca thermometer based on our coral culture experiments, our sea surface temperature estimates still stand based on the strength of the original oxygen isotope data. Several other proxies, including noble gas paleothermometers, tropical ice cores, and some pollen-based reconstructions, confirmed cool tropical temperatures. Prior to our tropical sea surface temperature results, the CLIMAP sea surface temperature reconstructions based on statistical analyses of microfossil abundances in deep sea cores indicated constant tropical sea surface temperatures. The assumption of constant tropical sea surface temperatures and polar regions that varied synchronously had a profound influence on the course of research for over two decades. The climate and paleoceanographic communities looked to atmospheric CO2 and deep ocean currents to transmit climate signals from the north to the south either over (CO2) or under (NADW) the tropics. Recognition that the tropics are not thermostated at present day temperatures but are free to change by more than 5°C shifted the climate community’s focus to the role of the tropics in possibly driving global climate change. This remains one of the most exciting and challenging topics in paleoceanographic research. Our current research is directed toward development and testing of new paleotemperature proxies in corals through culture experiments and application of these new proxy thermometers to our coral sample set. In addition, we are preparing a series of papers that reanalyze the global alkenone, Mg/Ca, δ18O, and foraminifera transfer function sea surface temperature estimates based on our thermocline/flux model.”
http://radiocarbon.ldeo.columbia.edu/research/sst.htm
http://weather.unisys.com/surface/sst_anom.gif

I always enjoy Pamela’s posts. She has a gift for reducing complex assertions into common sense analogies which is sorely lacking in today’s discussions.
I used to grow tomatoes as a hobby. I only grew the heirloom varieties. My plants didn’t care one iota that July of any given year had an average temperature that was 1 deg warmer or 1 deg cooler than the year before. They also had no knowledge of what the “trend” was. They responded to light cycles from the sun. Days were long enough to provide enough sunlight for that particular type of plant to bear fruit, which has happened for hundreds of years. But I’m given to understand that a shift of 2 deg over a period of 20yrs will be catastrophic. It defies all logic and reasoning.
3 winters ago in the Northeast, we had a very warm and early spring. All of the golf courses were open for play in March. Two common statements were heard frequently: “I’ve never seen anything like this in 50 yrs!” and “This is certainly evidence of global warming.” The former was from local people, who were enjoying a wonderful anomaly in typical New England weather. The latter was from all of the local news stations. Last spring almost didn’t happen. It stayed cold for an extended period of time, and it seemed it would never end. I never heard a single “news” station say, “OK… I guess our coverage last year attributing the unusually early spring weather to Global Warming may have been a bit premature.”
Of course, the “true believers” wrote this off as “That’s weather, not climate.”
Given the decline in temps over the past 10 +- years, has anyone noticed trees migrating south?…birds?…have the gardening zones been readjusted to compensate for this cooling? Does Maine not have their traditional Strawberry Festivals in July each year?
In discussions with friends, when all of this logic and reasoning failed, my go to question that I got from a post many years ago here on WUWT is “Assuming that you could actually CONTROL the temperature of the globe, what would you set the thermostat to?”
That always gets met with a blank stare.
Keep on posting, Pamela…your contributions here are much appreciated 🙂
Jim

richardscourtney

Tim Ball:
Thank you for your much-needed essay.
The single most important fact in your essay is this

Creating data is the only option in climate science because, as the 1999 NRC Report found, there is virtually none.

Yes! And people select from what little data exists, revile the remainder, then build mountains of conjecture from their selection before proclaiming their conjectures are facts!
In reality, as you say, there is almost no data on the carbon cycle and the paucity of data enables almost any conjecture – however ridiculous – to be modelled with agreement to the existing data.
(ref. Rorsch A, Courtney RS & Thoenes D, ‘The Interaction of Climate Change and the Carbon Dioxide Cycle’ E&E v16no2 (2005) )
Richard

Mayor of Venus

Will: Linus Pauling was similar to Goddard. Answering a question on how he comes up with so many good ideas, he explained that he just gets lots and lots of ideas, and then discards all the bad ones.

The claim that a fractional increase in CO2 from human sources, which is naturally only 4 percent of all greenhouse gases, become the dominant factor in just a couple of years is incredulous.
==============
You may be incredulous to hear the claim, but the claim itself is more correctly described as “incredible”.

johnmarshall

The “critical” 8.5 W/m2 that would provide the dangerous warming is a grasp at nothing. The radiation equilibrium temperature of 8.5 W/m2 is minus 160°C. A real big help for dangerous warming.
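The equilibrium-temperature figure in this comment follows from the Stefan–Boltzmann law, T = (F/σ)^(1/4), treating 8.5 W/m2 as the only absorbed blackbody flux, as the comment does. A minimal sketch of that arithmetic:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temp_c(flux_wm2):
    """Blackbody equilibrium temperature (deg C) for an absorbed flux."""
    kelvin = (flux_wm2 / SIGMA) ** 0.25
    return kelvin - 273.15

print(round(equilibrium_temp_c(8.5), 1))  # roughly -162.5 deg C
```

This lands close to the quoted minus 160°C; either way, 8.5 W/m2 on its own corresponds to a very cold blackbody.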

Mayor of Venus:
“Will: Linus Pauling was similar to Goddard. Answering a question on how he comes up with so many good ideas, he explained that he just gets lots and lots of ideas, and then discards all the bad ones.”
Except Goddard and Pauling can’t and couldn’t tell good ideas from bad ones. Pauling turned into a crank at the end of his career, publishing pamphlets claiming that mega doses of vitamin C cured the common cold. Apparently giving everyone vitamin C tablets would save billions in health costs and productivity losses… On his blog today Goddard claimed he could detect the rate of change in global sea level rise using a single tide gauge. And nobody was going to set him straight on that. 😉

sleepingbear dunes

Since I cannot sort out the claims and counterclaims, I think it is important for Dr. Ball to address the first 2 comments. This is the beauty of this process.

M Courtney

The situation is worse today. The number of stations used is dramatically reduced and records adjusted to lower historic temperature data, which increases the gradient of the record.

No-one disputes that do they?
So let’s push hard on “records adjusted to lower historic temperature data”.
If it’s justified make them say so and why.
It will expose how unreliable these records that we keep "beating" really are.
If it’s not justified…

A nice wrap up of CO2 and where it comes from and is going. I notice some sniping at the edges. But the central point remains uncontested.

Daniel G.

A nice wrap up of CO2 and where it comes from and is going. I notice some sniping at the edges.
But the central point remains uncontested.

That the IPCC organization controls CO2 data to fit a pre-set hypothesis?
That claim seems to have been contested by two commenters above.

Bruce Cobb

It makes no difference where the increased CO2 comes from, so it’s a red herring. The increased CO2 is nothing but a boon to all of life, and especially to man, by helping plants grow. Whatever warming effect it may have had cannot be sussed from what is natural, and only in the twisted, humanity-hating minds of the Warmistas could a small amount of warming be a detriment to “the planet”.

eyesonu

Will Nitschke says:
August 5, 2014 at 3:11 am
==============
While you denigrate Pauling with regard to the use of vitamin C to combat the common cold, I ask: have you personally tried it?
Well, I have, and I haven't had a cold in 15 to 20 years. I have recommended the same to many in my circles who could not shake a severe cold even with antibiotics, and it proved effective.

Daniel G. says:August 5, 2014 at 4:35 am
“That the IPCC organization controls co2 data to fit a pre-set hypothesis?
That claim seems to have been contested by two commenters above.”

Yes. Emissions data comes from the US DoE at Oak Ridge. And CO2 measurements come from many sites around the world, but the main player is the Scripps Institution of Oceanography.

Mayor of Venus says:
August 5, 2014 at 2:40 am

Will: Linus Pauling was similar to Goddard. Answering a question on how he comes up with so many good ideas, he explained that he just gets lots and lots of ideas, and then discards all the bad ones.

I don’t think I’ve ever seen Goddard discard a bad idea. I’m not sure if he’s ever admitted to having a bad idea more than once or twice a year.

rgbatduke

A typical Beck site was Giessen. Ferdinand E has a plot of the daily cycle here, with modern measurements every half hour. They vary during a day from 350 to 500 ppm. You can analyse as accurately as you like, but the answer will depend on what time you sample. This has nothing to do with global CO2.

This is very disturbing, actually. So much for CO_2 being a well-mixed gas. On the other hand, co-analysis of this data with local temperatures and direct measurements of atmospheric radiative spectra similarly sampled should yield a lot of interesting data, given that the daily peaks appear to be on the order of 500 ppm. I confess that I'm having a real problem even imagining a local uptake/delivery mechanism that could drop levels to a sharp, consistent 350 ppm for half of every day on a planet with a supposed "well-mixed" background of 400 ppm and then spike to over 500 ppm over the other half, though. It would also be interesting to integrate over time to obtain the actual average.
Note well that your objections also apply to the entire temperature record, everywhere, and most of the other major parameters of interest in climate prediction or reconstruction. Daily temperatures vary by as much as 45-50 C (or as little as 0-1 C). As you say “You can analyze as accurately as you like, but the answer will depend on the time you sample. This has nothing to do with global…” land temperature? rainfall? sea surface temperature? humidity? wind field (often have to get down to the second, there)? cloud cover? albedo? aerosol levels? air pressure?
It is extremely constructive to contemplate the probable hourly variations in the total greenhouse effect due to direct variation of atmospheric pressure (not partial pressure of CO_2) compared to the variations expected from increasing already-saturated CO_2. One is signal, the other is noise. Do we even have real-time parameters for the signal (increased pressure directly modulating the absorptivity of all of the GHGs by altering the pressure broadening of the absorptive bands) in the models? We certainly put a lot of weight on the expected behavior of the noise…
The interesting thing is that we somehow imagine that we can go back into the historical record of observations (of almost anything) and "correct" it a century after the fact, with a correction that never seems to come at the cost of estimated precision in the corrected data. I would wax poetic on the (usually unstated) Bayesian priors necessary for this task to proceed, or the posterior probabilities associated with those priors after the fact, but why bother? Unless or until climate scientists are required to actually learn some statistics, and to work well within its axiomatic confines when making statements about "confidence" instead of pulling confidence assertions in summaries for policy makers out of the region of nether cheeks with no possible axiomatic, computable justification, we will continue to have disclaimers quietly tucked away in the statistics sections of the ARs, where nobody will ever read them (or understand them if they happen upon them), that totally contradict the assertions of "high confidence" in e.g. the attributions of cause in the SPM.
rgb
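The time-integration rgb suggests is easy to sketch. The cycle below is a hypothetical clipped sinusoid spanning the 350-500 ppm range quoted for Giessen (the real diurnal shape differs); it shows how far a single fixed sampling time can land from the true daily mean:

```python
import math

# Hypothetical half-hourly CO2 cycle for a Giessen-like site: a flat
# floor near 350 ppm for roughly half the day and a swing up toward
# 500 ppm, loosely matching the 350-500 ppm range quoted above.
# The exact shape is invented for illustration only.
def co2_ppm(hour):
    swing = 150.0 * math.sin(2.0 * math.pi * (hour - 6.0) / 24.0)
    return 350.0 + max(0.0, swing)

samples = [co2_ppm(h / 2.0) for h in range(48)]   # every half hour

# Time-weighted daily mean from equal-width samples.
daily_mean = sum(samples) / len(samples)

print(f"mean over the day: {daily_mean:.1f} ppm")
print(f"sampled at 06:00:  {co2_ppm(6.0):.1f} ppm")
print(f"sampled at 12:00:  {co2_ppm(12.0):.1f} ppm")
```

The daily mean comes out near 400 ppm, while individual sampling times return anything between 350 and 500 ppm, which is exactly why single-time historical samples say little about the background level.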

rgbatduke says: August 5, 2014 at 5:39 am
“So much for CO_2 being a well-mixed gas.”

As with much of Beck's data, you are seeing a daily cycle dominated by plant respiration/photosynthesis. That is close to the ground in Europe. If you get away from that, as at these sites, for example, you'll avoid that daily cycle, and the measures are in close agreement, which indicates good mixing.

Here is a graphic which shows how closely the far-separated stations agree on CO2 ppmv. There is variation in the annual cycle, but the means track well.

Pamela Gray

Steven Mosher, you must be kidding. Take out just the anthropogenic portion of CO2 radiative effects and rerun weather (or if you prefer, climate) models at a 30 year time span (along with the necessary multiple trials). Run them just like the IPCC does. You would not be able to use the difference between the two sets of multiple spaghetti runs to say anything at all about the weather future. And you know that. In those spaghetti graphs, the ups and downs of the scenario results will have such a broad (and broadening) road, you might as well flip a coin to get better results. I stand completely behind my thought experiment and will not give an inch to you. We could have had the same weather, worse weather, or better weather. Anthropogenic CO2 radiative effects do not determine weather, therefore they cannot determine climate.
Look folks, the thing that determines weather, and thus climate, is geography and your location in it, interacting with large and small scale oceanic/atmospheric teleconnected pressure systems. It is the battle of pressure systems, air heated or cooled, laden or not laden with moisture, traveling over your geographic location. Which one of these could anthropogenic CO2 substantially change, let alone create a trend in? It would have to be able to get into that powerful mix and muscle it around. It's like saying the mouse lifts the elephant and hurls him out of the room instead of the elephant leaving under its own power.
So back to you, Mosher. I am not saying that atmospheric gases are not capable of reabsorbing and re-emitting longwave infrared radiation. Of course they are. I am saying that the anthropogenic CO2 molecules (a tiny, tiny fraction of all the LWIR absorbing/re-emitting molecules present) in the atmosphere at any given time are not capable of changing the weather, and thus the climate. It doesn't have the cojones, and the noise of natural forces buries it.

From the original post:
The reality is the net figure better represents human impact. If we use human net production (6) at 5 GtC for 2010, then it falls within the range of the estimate for three natural sources, (1), (2), and (4).
Well, if the net figure is a better representation, then we should use it for the natural sources as well; unfortunately for your thesis, that net is overall negative, i.e. about -3 GtC, which is why you don't use it.

hunter

The point that the increase of CO2 over the past 30 years could be removed with no significant impact on weather seems valid since weather patterns over the past 30 years are basically indistinguishable from even longer historical trend records.
However, the assertion that the IPCC is in control of CO2 records, along with other inflammatory and easily disputed/disproven claims, only distracts from the issue.

metasequoia

How dead is dead? It's farcical: who doesn't agree with the premise that if the forecast does not meet reality, the forecast is wrong, as is the theory underpinning it? The question now becomes how long the post-AGW debate can stagger on.

I was tempted to add to my article a paragraph predicting who would react immediately and what they would say. They didn’t let me down.
Two comments by others expose false IPCC assumptions. First, that CO2 is evenly distributed through the atmosphere and second that somehow properties of CO2 don’t apply in air near the ground – insolation and IR pass through the entire atmospheric column.

Pamela Gray,
You mean something like this?
http://i81.photobucket.com/albums/j237/hausfath/ScreenShot2014-08-05at73255AM_zps8775e38b.png
Generally model runs with and without anthropogenic forcings have pretty distinctly different trajectories over the last 30 years.

rgbatduke says:
August 5, 2014 at 5:39 am
rg, CO2 is not well mixed in 5% of the atmosphere: the first few hundred meters over land near huge sources and sinks. Plants are huge sources at night (respiring up to 60 GtC summed over a year) and huge sinks during daylight (120 GtC intake over a year, but decay from falling leaves etc. add some 60 GtC/year again to the atmosphere).
CO2 is well mixed in 95% of the atmosphere: on mountain tops, in deserts and everywhere over the oceans or coastal with wind from the seaside.
Several tall towers measure CO2 at different heights (to calculate in/out fluxes) over land, which shows the difference in variability. Here for Cabauw (The Netherlands):
http://www.ferdinand-engelbeen.be/klimaat/klim_img/cabauw_day_week.jpg
The problem with many historical data is exactly that they were taken near ground over land: the middle of towns, under inversion, mountain valleys, forests,… mostly unsuitable to give even an idea of the background CO2 levels of that period.
Except if there was a lot of wind, then it is possible to estimate the background levels as wind mixes most differences out. Unfortunately, the longest series don’t have enough datapoints at high wind speed to make the calculation.
The before mentioned station at Giessen (Germany) was one of the cornerstones of Beck’s data. The historical data show a 1-sigma variability of 68 ppmv. In comparison, the modern station halves that (still very high) but Mauna Loa is around 4 ppmv, including the huge seasonal variation.
Integrated modern monthly data from Giessen are not very good either:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/giessen_mlo_monthly.jpg
and show a positive bias against “background” CO2.

Warmist Claptrap

Re comments about “World” CO2 being measured at Mauna Loa.
This has always struck me as asinine, really.
The Mauna Loa system, including the ongoing, thirty-year-old Pu`u `O`o eruption of nearby Kilauea, is the largest, most active volcano on the planet at the moment. The magma reservoir is again refilling faster than Pu`u `O`o can erupt it, so conditions may soon be right for another large eruption from Mauna Loa itself. Meanwhile, fumaroles continue to pump out vast amounts of CO2 all across the Big Island's active zones. Isn't this what we are measuring? Surely it would make more sense to measure CO2 at some neutral point, like Mount Everest or Mount Kilimanjaro, or somewhere that CO2 isn't being emitted from all around the measuring instruments.

Latitude

If man made CO2 is ~4%…..and it’s cumulative and what’s making CO2 levels rise….
Then you’re not going to get the straight line linear increase in CO2 that all the measurements show….
…you would have an exponential increase

Warmist Claptrap says:
August 5, 2014 at 7:57 am
Isn’t this what we are measuring?
If the wind is downstream of the slope of the volcanic vents at Mauna Loa, the measurements over an hour show a lot of variability. If that exceeds 0.25 ppmv (1 sigma) the data are not used for daily, monthly and yearly averages. The same happens with upwind conditions in the afternoon, when slightly depleted CO2 levels are measured from the valleys.
Including or excluding the outliers doesn't change the average or trend by more than 0.1 ppmv at the end of the year. Here are the 2008 raw hourly data plus "cleaned" averages from Mauna Loa and the South Pole:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/co2_mlo_spo_raw_select_2008.jpg
but mind the scale!
But there are lots of other places where CO2 is measured, the South Pole started even before Mauna Loa, but misses a few years of continuous measurements (but still had flask sampling). Therefore Mauna Loa is often used as the reference. See:
http://www.esrl.noaa.gov/gmd/dv/iadv/
For the "global" CO2 average, Mauna Loa is not even used; only sea-level stations are used, spread over different latitudes…
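The screening Engelbeen describes can be illustrated with a toy version. Only the 0.25 ppmv scatter threshold comes from the comment above; every other number is invented, and the simulated local disturbances are deliberately made roughly symmetric around the background, so dropping them barely moves the annual mean:

```python
import random
import statistics

random.seed(0)

THRESHOLD = 0.25  # ppmv (1 sigma) within-hour scatter limit

def hour_of_readings(disturbed):
    """Twelve 5-minute readings for one hour around a 400 ppm background.

    Disturbed hours get both a shifted base level and much larger
    scatter, mimicking local volcanic/vegetation contamination."""
    base = 400.0 + (random.gauss(0.0, 2.0) if disturbed else 0.0)
    sigma = 1.5 if disturbed else 0.1
    return [random.gauss(base, sigma) for _ in range(12)]

# A year of hourly data, ~10% of hours locally contaminated.
hours = [hour_of_readings(random.random() < 0.10) for _ in range(8760)]

all_means = [statistics.mean(h) for h in hours]
kept_means = [statistics.mean(h) for h in hours
              if statistics.stdev(h) < THRESHOLD]

raw_avg = statistics.mean(all_means)
clean_avg = statistics.mean(kept_means)
print(f"kept {len(kept_means)} of {len(hours)} hours")
print(f"raw vs screened annual mean: {raw_avg:.2f} vs {clean_avg:.2f} ppm")
```

About a tenth of the hours are rejected for high scatter, yet the raw and screened annual means agree to well within 0.1 ppmv, the same shape of result quoted for Mauna Loa. The real NOAA selection procedure has further criteria; this only shows the idea.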

Warmist Claptrap,
Hate to break it to you, but its not just Mauna Loa: http://www.esrl.noaa.gov/gmd/ccgg/ggrn.php
There are many areas of real uncertainty in climate science that are interesting to discuss. Beck’s theories, unfortunately, are not one of them.

Alan Robertson

Ferdinand Engelbeen says:
August 5, 2014 at 7:35 am
” Plants are huge sources at night (respiring up to 60 GtC summed over a year) and huge sinks during daylight (120 GtC intake over a year, but decay from falling leaves etc. add some 60 GtC/year again to the atmosphere).”
________________________
Just considering terrestrial plants, such a balance might only be true if one considers leaf mass/other plant material which might completely cycle on an annual basis, but plant CO2 uptake sequesters C in woody mass, as well. Overall, the biosphere sequesters more C each year than it produces, as witness such things as topsoil and tree rings, or measurably, by the known- increasing sink rate. One could say that the natural course of the biosphere as a whole, is to eat itself out of house and home, replenished historically on a geologic time scale, by periodic glaciation events, which more or less, start the whole process over again. Now, here we are with our annual emissions intervening in the slow, but inexorable process of the biosphere bankrupting itself. We can’t say for certain what may result from our inadvertent fertilization of the whole life process, because we’ve never been here before.

paullinsay

rgbatduke, here’s some Japanese measurements that show variations up to 650 ppm http://www.terrapub.co.jp/journals/GJ/pdf/4106/41060429.pdf
If memory serves the level in a corn field can go to zero at midday since the plants use it so aggressively.
You're correct about the missing error bars. They're generally missing in Climate Science(tm) as far as I can tell, and the ones that do get displayed are ridiculously small. Measuring ocean temps to 0.001 K, really?

“While you denigrate Pauling with regards to the use of vitamin C to combat the common cold I ask, have you personally tried it?
Well I have and I haven’t had a cold in over 15 years to 20 years.”
Sorry, but that is hardly a valid argument. I have NEVER taken vitamin C and avoid all citrus
fruits and haven’t had a cold in 30 years. So much for your “evidence.”

Alan Robertson says:
August 5, 2014 at 8:40 am
In a mature forest, as is mostly the case in the tropics, the balance is quite neutral, except for short disturbances like an El Niño. Extra-tropical forests indeed expand and are destroyed by ice ages and interglacials. We may be of some help with our extra CO2…
The extra uptake is more or less known out of the oxygen balance:
http://www.sciencemag.org/content/287/5462/2467.short and
http://www.bowdoin.edu/~mbattle/papers_posters_and_talks/BenderGBC2005.pdf

rgbatduke

Actually, this is quite false.
“Take out just the anthropogenic CO2 and rerun the past 30 years of weather. The exact same weather pattern variations would have occurred.
As Pamela should know in order to improve the forecast ability of weather models you actually have to model radiative physics and yes that includes C02.

Not quite what she said: variation. If you take a particular model, and run it many times, it produces a (usually enormously wide) range of outcomes. What is very interesting is to compare and contrast the distribution of these outcomes, specifically to attempt to resolve:
a) The marginal probability of observing the actual present behavior of the climate, assuming as a null hypothesis that “this is a perfectly correct model”. The p-value is the basis of a hypothesis test — if it is a very low number (less than 0.05 traditionally) we are justified in rejecting the null hypothesis as our model is probably wrong.
b) The marginal shift in the probability distribution and/or p-value with CO_2. This is the basis of a Bayesian analysis that can actually estimate the posterior probability of the CO_2-specific component of the model being correct, for example.
Note well that one cannot legitimately average many (marginally failing) models and expect to get a successful one, in spite of the fact that this is precisely what is done, repeatedly, in climate science and specifically in AR5. Note that it is also a “capital mistake” to assume that it is nature that is in error or doing something “unlikely” rather than the models. Sure, maybe, “p happens” (to quote Marsaglia, a master of the hypothesis test) but in science and physics the second law basically states that “but don’t bet on it”. If a model marginally fails a p-value-based hypothesis test, the best you can say is “Answer cloudy, try again later”. If it decisively fails, it is time to pitch the model.
One model at a time. Not collectively. You cannot make ten Hartree models equal Hartree-Fock, or a hundred Hartree-Fock models give you the correct correlation/exchange energy for an electron in an atom. An incorrect, or approximate, model, cannot generally be corrected by using lots of equally incorrect, approximate models. The circumstances where this is not true — and they can be so corrected — are both very specific and very unlikely, and as a pure matter of fact are not realized in climate models.
So sure, Ms. Gray was speaking hastily, and Dr. Ball might have done better than to quote her, but the point is still the same. The variation of climate models in comparison with the observed climate is evidence for the assertion “CO_2 variation is empirically irrelevant to the climate’s variation”, as the actual climate is following the track indicated for no CO_2 increase while CO_2 is increasing. This is evidence for the assertion “this model is wrong” (one model at a time, for most of the CMIP5 models). It is not evidence for the assertion “this model is correct”, one model at a time, for most of the CMIP5 models.
Note well I say nothing about “proving” or “disproving” the models. Hypothesis testing is not usually that sharp (at least until p-values descend below 0.01 into the range of remote probability). However, “the pause” is certainly not evidence for the correctness of the models, and it is absurd to pretend that its continuation has (or should have) no impact on our confidence that the models are — one at a time — correct.
rgb
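The one-model-at-a-time test rgb describes can be sketched with synthetic numbers (the ensemble distribution and the observed trends below are invented for illustration; no real CMIP5 output is used):

```python
import random
import statistics

random.seed(42)

def empirical_p(ensemble_trends, observed):
    """Two-sided empirical p-value of `observed` under one model's
    ensemble, treated as the null distribution."""
    mu = statistics.mean(ensemble_trends)
    # fraction of runs at least as far from the ensemble mean as the obs
    tail = sum(1 for t in ensemble_trends
               if abs(t - mu) >= abs(observed - mu))
    return tail / len(ensemble_trends)

# Hypothetical model: ensemble of 30-year trends ~ N(0.25, 0.05) C/decade.
ensemble = [random.gauss(0.25, 0.05) for _ in range(1000)]

p_pause = empirical_p(ensemble, 0.05)   # a pause-like observed trend
p_match = empirical_p(ensemble, 0.24)   # an observation near the mean

print(f"p(observed 0.05 C/decade) = {p_pause:.3f}")
print(f"p(observed 0.24 C/decade) = {p_match:.3f}")
```

A pause-like observation far out in this model's tail yields a p-value well below 0.05, justifying rejection of that one model; an observation near the ensemble mean does not. Nothing here licenses averaging many failing models into one "successful" one.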

Latitude says:
August 5, 2014 at 8:09 am
…you would have an exponential increase
It is slightly quadratic:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/temp_emiss_increase.jpg
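Latitude's expectation of an exponential rise and the "slightly quadratic" shape are not actually in conflict: the cumulative total of emissions growing a few percent per year is exponential in principle, but over a 50-year record it is nearly indistinguishable from a quadratic. A sketch with invented numbers (2%/yr growth from 5 GtC/yr):

```python
import math

r, e0, T = 0.02, 5.0, 50   # assumed growth rate, start level, span (all invented)

# Cumulative emissions if annual output grows exponentially.
emissions = [e0 * math.exp(r * t) for t in range(T + 1)]
cumulative = []
total = 0.0
for e in emissions:
    total += e
    cumulative.append(total)

def lagrange3(x, pts):
    """Quadratic through three (x, y) points, Lagrange form."""
    result = 0.0
    for xi, yi in pts:
        term = yi
        for xj, _ in pts:
            if xj != xi:
                term *= (x - xj) / (xi - xj)
        result += term
    return result

# Quadratic pinned at the start, middle and end of the record.
pts = [(0, cumulative[0]), (T // 2, cumulative[T // 2]), (T, cumulative[T])]
gap = max(abs(cumulative[t] - lagrange3(t, pts)) for t in range(T + 1))

print(f"cumulative after {T} yr: {cumulative[-1]:.0f} GtC")
print(f"largest gap from a quadratic: {gap:.1f} GtC "
      f"({gap / cumulative[-1]:.1%} of the total)")
```

The largest gap between the exponential cumulative and a simple three-point quadratic stays around one percent of the total, far too small to see in a plot, so a "slightly quadratic" rise is exactly what slowly growing exponential emissions would look like.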

Nick Stokes said : ” It’s hard to say human emissions had nothing to do with the CO2 rise.”
Apparently quite hard, since nobody said such a thing.

rgbatduke

The problem with many historical data is exactly that they were taken near ground over land: the middle of towns, under inversion, mountain valleys, forests,… mostly unsuitable to give even an idea of the background CO2 levels of that period.

Which was precisely my point. This is also true of all of the other climate measures from the historical past. Where/when did they measure temperature? Near ground over land: the middle of towns, under inversion, surrounded by mountains or in forests or agricultural land and at widely variable times of day — mostly unsuitable to give even an idea of the background global average temperatures of that period. Where they did not regularly or accurately sample temperature until the very recent past includes: 70% of the Earth’s surface in one fell swoop (the oceans), 2/3 or thereabouts of the Earth’s continental surface land area (Antarctica, Siberia, much of China, much of Africa, Asia, and South America and even the US and Canadian West), and where they did sample it was corrupted with the “Human Habitation Effect” — humans alter their living environment from a “state of nature” to something that suits humans better. UHI is just one component of the HHE. HHE is overwhelmingly a source of local warming — local to the human habitations — but that is also precisely where things like temperature and CO_2 level and rainfall and wind speed/direction have historically been sampled. People don’t live so much in the middle of the South Pacific or the middle of Antarctica or in the North Atlantic at a depth of 100 meters or along ridge lines of mountains or in deserts.
The truly laughable thing is that when e.g. GISS corrects for UHI/HHE, it manages to warm the present relative to the past by some truly awe-inspiring legerdemain. HADCRUT4 doesn't even bother; they just present UHI/HHE-corrupted temperature series with the additional urban/human-habitation warming projected onto the entire global average.
In a way I can respect that: HADCRUT4 at least can be viewed as a time-dependent upper bound on the actual global temperature, one that is strictly increasing (relative to the expected "true" average) from the historical past to the present. So when HADCRUT4 indicates (say) 0.4-0.6 C of warming over the last fifty years, we can be certain that the actual number is smaller than this, although we cannot really say by how much. Whenever anybody tries to determine how much, it seems as though at least half of the warming observed is HHE error. That leaves only half of the warming to be explained by both CO_2 and natural variation, which might well leave only 0.1-0.2 C of actual CO_2-driven warming, with a total expected sensitivity well under 1 C for the rest of the century.
That’s one key part of AR5’s repeated SPM assertion that they are ever so certain that at least half of the warming is due to human CO_2. What they really should mean is that half of the warming is due to the HHE and is an error, with the other half attributable in an unresolvable mix to natural and anthropogenic causes. Unresolvable because to resolve it we’d have to be able to solve the Navier-Stokes equation with unknown initial conditions on an absurdly coarse grid a mere five orders of magnitude larger than the Kolmogorov scale for the dynamics, with nothing but highly biased guesses for how to project the microdynamics onto the coarse grained model solvers.
Truly, the miracle is that they get anybody to believe all of this stuff.
rgb

Latitude

It is slightly quadratic:
====
then the hypothesis is not correct
If nature is able to use a little of it……then nature can use it all

DD More

When will we finally truly do the math? The anthropogenic only portion of atmospheric CO2, let alone China’s portion, does not have the cojones necessary to make one single bit of “weather” do a damn thing different.
Remember to multiply all the factors: China's portion of the ~3% anthropogenic share of total CO2, then factor in that water vapor is at least 60% of the GHG effect. Then remember the relative levels of radiative heat transfer; as stated in an engineering manual, at room temperatures radiative heat transfer can generally be ignored. Evaporation/convection are the major drivers.

Gary Pearse

Preindustrial CO2 levels were one of the 'facts' that sceptics had remarkably not challenged in any significant way. Interestingly, on the climate facts piece on Portugal:
http://wattsupwiththat.com/2014/08/05/surprising-facts-about-climate-change-in-portugal-why-the-climate-catastrophe-is-not-happening/
I commented this before reading this current thread:
” Gary Pearse says:
August 5, 2014 at 6:22 am
I'm sceptical that CO2 levels were below 285 over the past couple of thousand years. During the MWP, wine grapes were grown in Scotland, farmsteads flourished in Greenland, etc. Low CO2 doesn't jibe with this kind of situation. That CO2 is higher today than previously during the last 1000 years or so is the next bit of climate sophistry that is going to bite the dust."
I wasn't aware that it had already gotten underway, starting with Pamela's comment and Tim's post (criticism before this had been well managed and stifled by the team, as this is the foundation of their theory; sceptics had largely been arguing temperature aspects, but the 'pause' best showed the divergence between temperature and CO2, so it was a natural progression to look more closely at CO2 'data'). Yes, this is the final major item that needs rooting out. There were all the critiques raised concerning temperature leading CO2, CO2 being higher during some ice ages, etc., but the fact that CO2 was still bubbling away with temperatures flat and declining for a period as long as that of the modern-day global warming 'era' put it in the spotlight. I now look forward to an avalanche of papers on CO2 level proxies, and to the little foraminifera thermometers getting a well-deserved rest.

richardscourtney

rgbatduke:
Many thanks for your superb post at August 5, 2014 at 9:38 am, which is here. It concludes by saying

Truly, the miracle is that they get anybody to believe all of this stuff.

I completely agree, and my agreement is not surprising, because the post of yours I have linked can be considered an exposition of the point I made in my post at August 5, 2014 at 2:36 am, which is here and concludes, with a reference, by saying

there is almost no data on the carbon cycle and the paucity of data enables almost any conjecture – however ridiculous – to be modeled with agreement to the existing data.

Richard