Opinion by Dr. Tim Ball
“I have no data yet. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.” – Arthur Conan Doyle (Sherlock Holmes)
Create The Facts You Want.
In a comment about the WUWT article “The Record of recent Man-made CO2 emissions: 1965-2013”, Pamela Gray graphically but pointedly summarized the situation:
When will we finally truly do the math? The anthropogenic only portion of atmospheric CO2, let alone China’s portion, does not have the cojones necessary to make one single bit of “weather” do a damn thing different. Take out just the anthropogenic CO2 and rerun the past 30 years of weather. The exact same weather pattern variations would have occurred. Or maybe because of the random nature of weather we would have had it worse. Or it could have been much better. Now do something really ridiculous and take out just China’s portion. I know, the post isn’t meant to paint China as the bad guy. But. Really? Really? All this for something so tiny you can’t find it? Not even in a child’s balloon?
My only quibble is that while the amount illustrates the futility of the claims, as Gray notes, the Intergovernmental Panel on Climate Change (IPCC) and the Environmental Protection Agency (EPA) are focused on trends and attribution. The change must have a human cause and be steadily increasing or, as they prefer, getting worse.
Narrowing the Focus
It is necessary to revisit criticisms, made over the last several years, of the CO2 levels created by the IPCC. Nowadays, one measure of the accuracy of those criticisms is the vehemence of the personal attacks designed to divert attention from the science and evidence.
From its inception, the IPCC focused on human production of CO2. It began with the definition of climate change provided by the UNFCCC, which covers only those changes caused by humans. The goal was to prove the hypothesis that an increase in atmospheric CO2 would cause warming. This required evidence that the level had increased from pre-industrial times and would increase each year because of human industrial activity. How long before they start reducing the rate of CO2 increase to make it fit the declining temperatures? They are running out of guesses, 30 at the latest count, to explain the continued lack of temperature increase, now at 17 years and 10 months.
The IPCC makes the bizarre claim that until 1950 human additions of CO2 were a minor driver of global temperature, but that after 1950 over 90 percent of the temperature increase is due to human CO2.
Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations.
The claim that a fractional human increase in CO2, a gas that is naturally only 4 percent of all greenhouse gases, became the dominant factor in just a couple of years is not credible. This claim comes from computer models, which are the only place in the world where a CO2 increase causes a temperature increase. It depends on human production and atmospheric levels increasing, and it assumes temperature continues to increase, as all three IPCC scenario projections imply.
Their frustration is that they control the CO2 data, but after the University of Alabama in Huntsville (UAH) began producing satellite-based global temperature data, their control of the temperature record was curtailed. It didn't stop them completely, as disclosures by McIntyre, Watts, Goddard, and the New Zealand Climate Science Coalition, among others, illustrated. They all showed adjustments designed to enhance and emphasize higher modern temperatures.
Now they’re confronted with T. H. Huxley’s challenge,
The Great Tragedy of Science – the slaying of a beautiful hypothesis by an ugly fact.
This article examines how the modern levels of atmospheric CO2 were determined and controlled to fit the hypothesis. They may fit a political agenda, but they don’t fit nature’s agenda.
New Deductive Method: Create the Facts to Fit the Theory
Farhad Manjoo asked in True Enough: Learning To Live In A Post-fact Society,
“Why has punditry lately overtaken news? Why do lies seem to linger so long in the cultural subconscious even after they’ve been thoroughly discredited? And why, when more people than ever before are documenting the truth with laptops and digital cameras, does fact-free spin and propaganda seem to work so well?”
Manjoo's comments apply to society in general, but they apply with extra force to climate science because of the public's varying ability to judge scientific issues. A large majority is more easily deceived.
Manjoo argues that people create facts themselves or find someone to produce them. Creating data is the only option in climate science because, as the 1999 NRC Report found, there is virtually none. A response to the February 3, 1999 US National Research Council (NRC) report on climate data said,
“Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.”
The situation is worse today. The number of stations used has been dramatically reduced, and records have been adjusted to lower historic temperature data, which steepens the gradient of the record. The lack of data for the oceans was recently identified.
“Two of the world’s premiere ocean scientists from Harvard and MIT have addressed the data limitations that currently prevent the oceanographic community from resolving the differences among various estimates of changing ocean heat content.”
Oceans are critical to CO2 levels because of their large sink or source capacity.
The data necessary to create a viable determination of climate mechanisms, and thereby of climate change, are completely inadequate. This applies especially to the structure of climate models. There are no data for at least 80 percent of the grid cells covering the globe, so the modelers guess; it is called parameterization. The 2007 IPCC Report notes,
Due to the limited resolutions of the models, many of these processes are not resolved adequately by the model grid and must therefore be parameterized. The differences between parameterizations are an important reason why climate model results differ.
Variable results occur because of inadequate data at the most basic level and subjective choices by the people involved.
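A toy illustration of that last point (entirely my construction; the quantities are invented, not from any climate model): when a sub-grid process cannot be resolved or observed, it is replaced by a tunable constant, and different choices of that constant give different model results.

import numpy as np

cells = np.linspace(0.0, 1.0, 10)          # a coarse 1-D "model grid"
resolved = np.sin(2 * np.pi * cells)       # the state the grid can represent

def model_output(tuning):
    # the unresolved sub-grid flux is "parameterized" as tuning * gradient,
    # standing in for a process there is no data to constrain
    flux = tuning * np.gradient(resolved, cells)
    return resolved - 0.1 * flux

for tuning in (0.5, 1.0, 2.0):
    # same resolved physics, three parameter choices, three different answers
    print(tuning, model_output(tuning)[:3].round(3))

The spread across the three runs is the sketch's version of "the differences between parameterizations are an important reason why climate model results differ."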
The IPCC Produces the Human Production Numbers
The 2001 IPCC Report identified 6.5 GtC (gigatons of carbon) from human sources. The figure rose to 7.5 GtC in the 2007 Report, and by 2010 it was 9.5 GtC. Where did they get these numbers? The answer is that the IPCC has them produced and then vets them. In the FAQ section they ask, “How does the IPCC produce its Inventory Guidelines?”
Utilizing IPCC procedures, nominated experts from around the world draft the reports that are then extensively reviewed twice before approval by the IPCC.
The emissions estimates were called the Special Report on Emissions Scenarios (SRES) until the 2013 Report, when they became Representative Concentration Pathways (RCP). In March 2001, John Daly reported Richard Lindzen referring to the SRES and the entire IPCC process as follows,
In a recent interview with James Glassman, Dr. Lindzen said that the latest report of the UN-IPCC (that he helped author), “was very much a children’s exercise of what might possibly happen” prepared by a “peculiar group” with “no technical competence.”
William Kininmonth, author of the insightful book “Climate Change: A Natural Hazard”, was formerly head of Australia's National Climate Centre and their delegate to the WMO Commission for Climatology. He wrote the following in an email on the ClimateSceptics group page.
I was at first confused to see the RCP concept emerge in AR5. I have come to the conclusion that RCP is no more than a sleight of hand to confuse readers and hide absurdities in the previous approach.
You will recall that the previous carbon emission scenarios were supposed to be based on solid economic models. However, this basis was challenged by reputable economists and the IPCC economic modelling was left rather ragged and a huge question mark hanging over it.
I sense the RCP approach is to bypass the fraught economic modelling: prescribed radiation forcing pathways are fed into the climate models to give future temperature rise—if the radiation forcing plateaus at 8.5W/m2 sometime after 2100 then the global temperature rise will be 3C. But what does 8.5 W/m2 mean? Previously it was suggested that a doubling of CO2 would give a radiation forcing of 3.7 W/m2. To reach a radiation forcing of 7.4 W/m2 would thus require a doubling again—4 times CO2 concentration. Thus to follow RCP8.5 it is necessary for the atmospheric CO2 concentration equivalent to exceed 1120ppm after 2100.
We are left questioning the realism of a RCP 8.5 scenario. Is there any likelihood of the atmospheric CO2 reaching about 1120 ppm by 2100? IPCC has raised a straw man scenario to give a ‘dangerous’ global temperature rise of about 3C early in the 22nd century knowing full well that such a concentration has an extremely low probability of being achieved. But, of course, this is not explained to the politicians and policymakers. They are told of the dangerous outcome if the RCP8.5 is followed without being told of the low probability of it occurring.
One absurdity is replaced by another! Or have I missed something fundamental?[1]
No, nothing is missed! However, in reality, it doesn't matter whether the change of approach alters anything; it achieves the goal of increasing CO2 and its supposed impact on global warming. The underpinning of IPCC climate science, and of the economics, depends on accurate data and knowledge of mechanisms, and neither is available.
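Kininmonth's arithmetic can be checked with the commonly used simplified forcing expression ΔF = 5.35 × ln(C/C0) W/m², which yields the 3.7 W/m² per doubling he cites. A minimal sketch, assuming that expression and a 280 ppmv pre-industrial baseline (both assumptions are mine, not from his email):

import math

ALPHA = 5.35   # W/m^2; coefficient of the simplified CO2 forcing expression
C0 = 280.0     # ppmv; assumed pre-industrial concentration

def forcing(c):
    """Radiative forcing (W/m^2) relative to the pre-industrial level."""
    return ALPHA * math.log(c / C0)

def concentration(f):
    """CO2 concentration (ppmv) needed to produce forcing f."""
    return C0 * math.exp(f / ALPHA)

print(round(forcing(2 * C0), 2))    # one doubling:  ~3.71 W/m^2
print(round(forcing(4 * C0), 2))    # two doublings: ~7.42 W/m^2 at 4 x 280 = 1120 ppmv
print(round(concentration(8.5)))    # the full 8.5 W/m^2: ~1371 ppmv CO2-equivalent

The last line makes his point explicit: the full 8.5 W/m² pathway implies a CO2-equivalent concentration even beyond the 1120 ppmv of two doublings.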
We know there was insufficient weather data on which to construct climate models, and the situation deteriorated as they eliminated weather stations, 'adjusted' them, and then cherry-picked data. We know knowledge of mechanisms is inadequate because the IPCC WGI Science Report says so:
Unfortunately, the total surface heat and water fluxes (see Supplementary Material, Figure S8.14) are not well observed.
or
For models to simulate accurately the seasonally varying pattern of precipitation, they must correctly simulate a number of processes (e.g., evapotranspiration, condensation, transport) that are difficult to evaluate at a global scale.
Two critical situations were central to control of atmospheric CO2 levels. We know Guy Stewart Callendar, a British steam engineer, cherry-picked the low readings from 90,000 19th-century atmospheric CO2 measurements. This not only established a low pre-industrial level, but also altered the trend of atmospheric levels. (Figure 1)
Figure 1 (After Jaworowski; Trend lines added)
Callendar's work was influential in the Gore-generated claims of human-induced CO2 increases. However, the most influential paper in the climate community, especially at CRU and the IPCC, was Tom Wigley's 1983 paper “The pre-industrial carbon dioxide level” (Climatic Change, 5, 315-320). I held seminars in my graduate-level climate course on its validity and its selectivity in establishing a pre-industrial baseline.
That selected baseline was later challenged by the late Ernst-Georg Beck. I wrote an obituary on learning of Beck's untimely death.
I was flattered when he asked me to review one of his early papers on the historic pattern of atmospheric CO2 and its relationship to global warming. I was struck by the precision, detail and perceptiveness of his work and urged its publication. I also warned him about the personal attacks and unscientific challenges he could expect. On 6 November 2009 he wrote to me, “In Germany the situation is comparable to the times of medieval inquisition.” Fortunately, he was not deterred. His friend Edgar Gartner explained Ernst's contribution in his obituary. “Due to his immense specialized knowledge and his methodical rigour Ernst very promptly noticed numerous inconsistencies in the statements of the Intergovernmental Panel on Climate Change (IPCC). He considered the warming of the earth's atmosphere as a result of a rise of the carbon dioxide content of the air of approximately 0.03 to 0.04 percent as impossible. And he doubted that the curve of the CO2 increase noted on the Hawaiian volcano Mauna Loa since 1957/58 could be extrapolated linearly back to the 19th century.” (This is a translation from the German.)
Beck was the first to analyze the 19th-century data in detail. It was data collected in scientific attempts to measure precisely the amount of CO2 in the atmosphere. The effort began in 1812, triggered by Priestley's work on atmospheric oxygen, and was part of the scientific program to quantify all atmospheric gases. There was no immediate political motive. Beck did not cherry-pick the results; he examined the method, location, and as much detail as possible for each measurement, in complete contrast to what Callendar and Wigley did.
The IPCC had to show that:
· Increases in atmospheric CO2 caused temperature increase in the historic record.
· Current levels are unusually high relative to the historic record.
· Current levels are much higher than pre-industrial levels.
· The differences between pre-industrial and current atmospheric levels are due to human additions of CO2 to the atmosphere.
Beck’s work showed the fallacy of these claims and in so doing put a big target on his back.
Again from my obituary:
Ernst Georg Beck was a scholar and gentleman in every sense of the term. His friend wrote, “They tried to denounce Ernst Georg Beck on the Internet as a naive amateur and data counterfeiter. Unfortunately, Ernst could hardly defend himself in the last months because of his progressive illness.” His work, determination and ethics were all directed at answering questions in the skeptical method that is true science; the antithesis of the efforts of all those who challenged and tried to block or denigrate him.
The 19th-century CO2 measures are no less accurate than those for temperature; indeed, I would argue that Beck shows they are superior. So why, for example, are his assessments any less valid than those made for the early portions of the Central England Temperatures (CET)? I spoke at length with Hubert Lamb about the early portion of Manley’s CET reconstruction because the instruments, locations, measures, records and knowledge of the observers were comparable to those in the Hudson’s Bay Company record I was dealing with.
Once the pre-industrial level was created, it became necessary to ensure the new post-industrial CO2 trend continued. That was achieved when C. D. Keeling established the Mauna Loa CO2 measuring station. As Beck notes,
Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC.
Keeling's son operates Mauna Loa and, as Beck notes, “owns the global monopoly of calibration of all CO2 measurements.” He is also a co-author of the IPCC reports, which accept Mauna Loa and all other readings as representative of global levels. So the IPCC controls the human production figures and the atmospheric CO2 levels, and both are constantly and consistently increasing.
This diverts attention from the real problem with the measurements and claims. The fundamental IPCC objective is to identify human causes of global warming. You can only determine the human portion and contribution if you know the natural levels and how much they vary, and we have only very crude estimates.
What Values Are Used for Each Component of the Carbon Cycle?
Dr. Dietrich Koelle is one of the few scientists to assess estimates of natural annual CO2 emissions.
Annual Carbon Dioxide Emissions (GtC per annum)

1. Respiration (humans, animals, phytoplankton): 45 to 52
2. Ocean out-gassing (tropical areas): 90 to 100
3. Volcanic and other ground sources: 0.5 to 2
4. Ground bacteria, rotting and decay: 50 to 60
5. Forest cutting, forest fires: 1 to 3
6. Anthropogenic emissions, fossil fuels (2010): 9.5

TOTAL: 196 to 226.5

Source: Dr. Dietrich Koelle
The IPCC estimate of human production (item 6) for 2010 was 9.5 GtC, but that is gross production. One of the early issues in the push to ratify the Kyoto Protocol was the attempt to get US ratification. The US asked for carbon credits, primarily for CO2 removed through reforestation, so that a net figure would apply to their assessment as a developed nation. The request was denied. The reality is that the net figure better represents human impact. If we take human net production for 2010 at 5 GtC, it is smaller than the spread between the low and high estimates for each of three natural sources, items (1), (2), and (4).
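To make that comparison concrete, here is a minimal sketch of the arithmetic using Koelle's figures from the table above (the 5 GtC net human figure is the assumption discussed in the preceding paragraph):

# Koelle's estimated annual CO2 sources, GtC per annum, as (low, high)
sources = {
    "respiration": (45.0, 52.0),
    "ocean out-gassing": (90.0, 100.0),
    "volcanic and ground": (0.5, 2.0),
    "bacteria and decay": (50.0, 60.0),
    "forest cutting, fires": (1.0, 3.0),
}
human_gross = 9.5    # IPCC total human production for 2010, GtC
human_net = 5.0      # assumed net figure after reforestation credits

low = sum(lo for lo, hi in sources.values()) + human_gross
high = sum(hi for lo, hi in sources.values()) + human_gross
print(low, high)                                           # 196.0 226.5, matching the table
print(f"{human_gross/high:.1%} to {human_gross/low:.1%}")  # human share: 4.2% to 4.8%

# the net human figure is smaller than the low-to-high spread of each of the
# three largest natural sources, so it disappears inside their uncertainty
for name in ("respiration", "ocean out-gassing", "bacteria and decay"):
    lo, hi = sources[name]
    print(name, "spread:", hi - lo, "GtC; human net:", human_net)

The point of the final loop is that annual human net emissions are smaller than the uncertainty in the estimates of each major natural source.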
The Truth Will Out.
How much longer will the IPCC continue to produce CO2 data with trends that fit their hypothesis that temperature will continue to rise? How much longer before the public becomes aware of Gray's colorful observation that, “The anthropogenic only portion of atmospheric CO2, let alone China's portion, does not have the cojones necessary to make one single bit of “weather” do a damn thing different.” The almost 18-year leveling and slight reduction in global temperature is essentially impossible under IPCC assumptions. The claim has already been made that the hiatus does not negate their science or projections, instead of acknowledging that the hiatus, along with the failed predictions, completely undermines their fear mongering.
The IPCC and EPA have already shown that being wrong or being caught doesn't matter. The objective is the scary headline, enhanced by the constant claim that it is getting worse at an increasing rate and that time is running out. Aldous Huxley said, “Facts do not cease to exist because they are ignored.” We must make sure they are real and not ignored.
[1] Reproduced with permission of William Kininmonth.
Gary Pearse says: August 5, 2014 at 1:39 pm
“I suspect there is a CO2 hole at both poles (not shown for antarctica in the image) – possibly in part because of its diamagnetic property (like ozone). In any case it doesn’t look as uniformly distributed as advertized”
Look again at the scale on that NASA Earth pic. It runs from 390 to 401 ppmv. Yes, there is variability in that range.
But South Pole (SPO) was one of the stations in this comparison. It matches the others.
Gary Pearse says:
August 5, 2014 at 1:16 pm
The enormous ice pressures have pressed the underlying land surface in Greenland and Antarctica to below sea level. How is it possible that there has been no diffusion of gases in this environment?
Nature did show us that there was no measurable diffusion of CO2 over 800,000 years in the ice cores:
Between glacials and interglacials temperatures go up and down and so does CO2 with a lag. The ratio between CO2 and temperature is about 8 ppmv/K. If there was the slightest migration, the CO2 peaks would fade out for each interglacial back in time, thus the ratio should go down, which is not the case…
Ferdinand
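A toy illustration of the argument in that comment (my construction, with purely invented numbers, not Engelbeen's): if gas migration behaved like diffusion, older interglacial CO2 peaks would be progressively flattened, and the apparent CO2-to-temperature ratio would shrink with age.

import numpy as np

t = np.linspace(0, 800, 8001)        # age, kyr
peak_ages = [100, 300, 500, 700]     # four illustrative interglacials
co2 = np.full_like(t, 190.0)         # glacial baseline, ppmv

for age in peak_ages:
    # diffusion-like broadening grows with age; peak area is conserved,
    # so the amplitude falls as the width grows
    width = 5.0 * np.sqrt(1.0 + age / 100.0)
    co2 += 80.0 * (5.0 / width) * np.exp(-0.5 * ((t - age) / width) ** 2)

for age in peak_ages:
    amplitude = co2[np.argmin(np.abs(t - age))] - 190.0
    print(f"interglacial at {age} kyr: apparent CO2 rise {amplitude:.0f} ppmv")

Each older peak prints flatter. Since the temperature proxy does not fade this way, the ~8 ppmv/K ratio would decline with age if migration occurred; Engelbeen's point is that in the real records it does not.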
By 1942 they had been taking measurements of CO2 for some 110 years. Adherence to stipulated CO2 levels in factories was enshrined in British law in 1890. The guidelines for taking measurements in such settings as cotton factories included taking into account CO2 emissions from the internal gas lighting.
By 1942 a variety of measuring instruments had been patented. Do you really think that a relatively simple thing like a CO2 reading would have been wildly inaccurate in 1942, yet by that time scientists were dabbling with the atomic bomb?
Tonyb
SandyInLimousin says:
August 5, 2014 at 12:00 pm
Oh like measuring global CO2 at a single location by one group of people you mean. Yes that would be very stupid.
That indeed would be stupid; therefore, after the first measurements at the South Pole, the locations were expanded and more people, organizations, and countries became involved: some 70 “background” stations nowadays, and some 400 spread over forests etc. to measure in/out fluxes. Here are several stations with their data:
http://www.esrl.noaa.gov/gmd/dv/iadv/
Tonyb says:August 5, 2014 at 2:10 pm
“By 1942 a variety of measuring instruments had been patented. Do you really think that a relatively simple thing like a co2 reading would have been wildly inaccurate in 1942, yet by that time scientists were dabbling with the atomic bomb?”
The issue isn’t measurement accuracy. Here again is the daily fluctuation at Giessen. Beck is like a man trying to measure sea level, standing in the waves with a very accurate ruler.
Boy, considering climate science (especially the CAGW portion) is all settled science, this thread must be all hooey.
Ok, sarc off.
Looks like our highly educated psychologists and historian “climate scientists” (snicker) have some work to do to defend their choice of CO2 data.
Ok, now the sarc is really off.
Tonyb says:
August 5, 2014 at 2:10 pm
Hello Tony,
Most wet measurements of that time were accurate to +/- 3%, that is +/- 10 ppmv (during calibration with known mixtures), but that is under strictly controlled circumstances. The problem with the wet methods was that the result was quite dependent on the skill of the operator, the freshness of the chemicals and the (lack of) calibrations. For the latter I have read very little about intercalibration of equipment and gas mixtures for the wet methods. That is something Keeling rigorously introduced for all CO2 measurements, still standard for the whole world of CO2 measurements…
Tonyb says:
August 5, 2014 at 2:10 pm
Ferdinand
By 1942 they had been taking measurements of CO2 for some 110 years. Adherence to stipulated CO2 levels in factories was enshrined in British law in 1890. The guidelines for taking measurements in such settings as cotton factories included taking into account CO2 emissions from the internal gas lighting.
By 1942 a variety of measuring instruments had been patented. Do you really think that a relatively simple thing like a CO2 reading would have been wildly inaccurate in 1942, yet by that time scientists were dabbling with the atomic bomb?
Tonyb
__________________
…and they kept each other in the loop via Twitter.
Nick
It wasn't Beck doing the measuring. It was often done by researchers whose job it was. So they didn't know about the daily fluctuations of a gas known since Roman times, a bare three years before the far more onerous and technically difficult job of splitting the atom was achieved?
If the measurements had been inaccurate for 110 years, it just shows that scientists often don't know as much as they think, and that this established and widely practiced science was incorrect. Perhaps that should give us pause for thought about the certainties surrounding many aspects of today's climate science?
Tonyb
Ferdinand Engelbeen:
At August 5, 2014 at 12:55 pm I wrote of the 1942 peak in Beck’s data
And at August 5, 2014 at 1:55 pm you have replied saying in total
I am not ‘buying’ that!
Growing crops consume CO2, so their net effect would be to draw down CO2, not to emit it.
And
The recent data is not the same as the 1940s data. Yes! The reason for that difference is the issue needing to be resolved.
But
As usual, your response to data which fails to agree with what you want to be true is to say the data should be “discarded completely”.
My response is to seek “to determine why the direct measurements of atmospheric CO2 concentration in the early 1940s provide a difference from the proxy indications for that time”.
Richard
Ferdinand
Hi Ferdinand, hope you are keeping well.
Keeling knew nothing of taking CO2 measurements when he started his job, yet within a year he was apparently taking measurements to a far greater degree of accuracy than the highly experienced scientists before him, stretching back some 110 years?
Tonyb
Tonyb says: August 5, 2014 at 2:37 pm
“It wasn’t Beck doing the measuring. It was often done by researchers whose job it was.”
But what was their job? How many of them claimed to be measuring global CO2 levels? They simply reported CO2 in their environment. It was Beck who claimed they were measures of global CO2.
Nick Stokes:
At August 5, 2014 at 2:57 pm you assert
No! Beck collated their data to determine global CO2. I suggest you read his work.
It was Keeling who began the procedure of measuring at one site and claiming he was measuring global CO2.
Richard
Ferdinand, slightly O/T, but do you have a link to the data for the ongoing 14C measurements at the Jungfraujoch? I saw one about a year or so ago but can’t find it.
Also, when I looked, the measured 14C appeared to be approaching the asymptote of pre-bomb-spike levels. The continuing decline was ascribed to dilution by emitted anthropogenic 'cold' CO2. This makes it (i.e., now) a very interesting part of the experiment. If that explanation holds true, it should now continue to fall below the pre-bomb-spike levels. Any comments?
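A toy sketch of the two behaviors at issue (entirely my construction; every parameter is illustrative, not a measured value): simple redistribution of the bomb spike decays toward the pre-bomb level as an asymptote, while continued dilution by 14C-free fossil CO2 would push Δ14C below that level.

import math

TAU = 16.0          # illustrative e-folding time for bomb-14C uptake, years
SPIKE_1965 = 700.0  # illustrative bomb excess in 1965, per mil
DILUTION = 1.0      # illustrative per-mil/yr depression from 14C-free fossil CO2

print("year  redistribution-only  with-dilution")
for year in range(1965, 2031, 5):
    t = year - 1965
    bomb_only = SPIKE_1965 * math.exp(-t / TAU)   # asymptote: pre-bomb level (0)
    diluted = bomb_only - DILUTION * t            # crosses below the pre-bomb level
    print(f"{year}  {bomb_only:19.0f}  {diluted:13.0f}")

Under redistribution alone the value never goes negative; with dilution it eventually does. If that reading is right, the measured Jungfraujoch Δ14C should likewise continue below the pre-bomb level rather than flattening at it.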
Nick
I think you misunderstand Beck's role.
Some 4 years ago I wrote this. Some of the links will be dead ends by now
http://noconsensus.wordpress.com/2010/03/06/historic-variations-in-co2-measurements/
It is extensively referenced and the lively comments are worth reading. My purpose in writing it was to determine if CO2 readings were embedded in everyday life or were merely taken by a few researchers who had a passing interest in the subject. I had no interest at the outset in the actual levels that were recorded.
I had no knowledge of Beck at the outset, although I quickly became aware of his work. I discounted it until I had finished my own research, as I did not want to be influenced by his work. At the time I thought him a bit of a crackpot and somewhat obsessive.
However I got into correspondence with him and he sent me much material and admitted he had released his findings too soon. His later work was much better organised and his second generation website much more professional.
After reading very many hundreds of papers and realising that taking measurements was a well-established everyday occurrence through much of the 19th century, enshrined in law by an act of Parliament in the 1890s, it seemed strange to me that many hundreds of clever scientists were apparently incapable of taking measurements after 110 years of trying.
Ferdinand kindly sent me an article on fractionation in the ice cores, and I must say it made me more, not less, sceptical that they make a good CO2 proxy.
Temperatures have varied greatly over the last thousand years in the manner you would expect from high and low levels of CO2.
Callendar undoubtedly selected low readings for his seminal paper in 1938, which was roundly taken apart by Slocum in 1956. Keeling was influenced by Callendar. What the truth of past levels of CO2 will turn out to be I do not know. It is a shame they cannot be properly audited by competent independent scientists, as that would put the matter to rest.
You will see from the comments that Beck took part in the debate around my paper. I did not know at the time that he was so ill, and I suspect it was his last major public appearance.
Tonyb
Nick Stokes says:
How many of [Beck’s scientific sources] claimed to be measuring global CO2 levels? They simply reported CO2 in their environment. It was Beck who claimed they were measures of global CO2.
Those were eminent researchers who took CO2 measurements. Their reputations were at stake, and there were scientists like Haldane, Bunsen, Krogh, Pettenkofer, Callendar, Warburg, de Saussure, and many others; several were Nobel laureates [when that distinction really meant something]. They truly cared about their reputations. Further, Beck took into account diurnal and seasonal changes, as he indicates here.
Those scientists took more than 90,000 CO2 readings, in locations that were sparsely populated, and on mountain peaks, and on ocean transits on the windward side of ships. That large number of readings and locations would have smoothed out much of the diurnal and seasonal changes.
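The statistical claim here can be made precise, and it is also where the two sides part company: averaging many readings shrinks random error roughly as 1/√N, but it cannot remove a bias shared by the measurement sites. A minimal sketch with invented numbers (nothing below is from Beck's data):

import random

random.seed(42)
TRUE_BACKGROUND = 310.0   # invented "true" background, ppmv

def reading(site_bias, noise_sd=15.0):
    # one wet-chemical reading: background + local site bias + random error
    return TRUE_BACKGROUND + site_bias + random.gauss(0.0, noise_sd)

# 90,000 readings at unbiased sites: random error averages away
unbiased = [reading(0.0) for _ in range(90_000)]
print(round(sum(unbiased) / len(unbiased), 2))   # very close to 310

# the same count at sites with a +40 ppmv local CO2 source: the bias survives
biased = [reading(40.0) for _ in range(90_000)]
print(round(sum(biased) / len(biased), 2))       # close to 350, not 310

Whether the historical sites were effectively unbiased (dbstealey's position) or locally contaminated (Stokes's position) is the actual dispute; the averaging itself settles only the random part.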
As Ferdinand notes, he had discussions with Dr. Beck, who subsequently altered his database to take Ferdinand's concerns into account. Clearly Dr. Beck had good reasons to keep the other data that went into his analysis. Maybe Ferdinand just didn't have enough conversations with Dr. Beck.
Finally, to cut through all the hair-splitting, what are we left with? We are left with the fact that the IPCC was wrong, and also the fact that the rise in CO2 from 0.03% to 0.04% of the atmosphere has caused no global harm or damage. Thus, CO2 is harmless at current concentrations, and there is zero evidence that it is harmful at projected concentrations.
On the other hand, the rise in CO2 has been clearly beneficial to the biosphere. The planet is measurably greening as a direct result, and the ‘carbon’ scare has turned out to be a false alarm. If the alarmist crowd just admitted that much, they would get the respect of skeptics. But they are still crying “Wolf!!” over the rise in harmless, beneficial CO2. So they get no respect.
dbstealey says:
August 5, 2014 at 1:29 pm
You read it correctly.
Ferdinand Engelbeen says:
August 5, 2014 at 1:32 pm
“…the first half of the CO2 data give some 60% “airborne fraction” of human emissions. The second half about 40%.”
This is constructing epicycles. There is no need to kluge together some model that claims the sinks are becoming more powerful. It is much more direct simply to realize that human inputs never had any significant effect. The apparent decrease in the airborne fraction is a misattribution. It is simply the case that temperatures have stagnated.
Alan Robertson says:
August 5, 2014 at 1:55 pm
“To me, the graph of the Mauna Loa data set shows a nearly constant rate of change in all years, regardless of T.”
Note the curvature in the plot you referenced. The curve was accelerating in those years in which global temperatures were increasing. Now, with global temperatures stable, no acceleration. Telling, no?
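A minimal sketch of the relation Bart's comments imply, often written dC/dt = k(T − T0); the parameter values and the toy temperature path below are purely illustrative:

# integrate dC/dt = k * (T - T0): while T rises the CO2 curve accelerates;
# once T plateaus, CO2 still rises but at a constant rate (no acceleration)
k, T0 = 2.0, -0.2     # illustrative: ppmv/yr per K, and a baseline anomaly in K
C = 315.0             # ppmv, illustrative start of record

for year in range(1959, 2015):
    T = min(0.6, -0.2 + 0.02 * (year - 1959))   # toy anomaly: rises, then flat
    C += k * (T - T0)
    if (year - 1959) % 15 == 0:
        print(year, round(C, 1), "rate:", round(k * (T - T0), 2), "ppmv/yr")

Whether the Mauna Loa record actually behaves this way is precisely what Bart and Engelbeen dispute; the sketch only reproduces the qualitative pattern Bart describes.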
dbstealey says: August 5, 2014 at 3:32 pm
“Those were eminent researchewrs who took CO2 measurements. Their reputations were at stake, and there were scientists like Haldane, Bunsen, Krogh, Pettenkofer, Callendar, Warburg, deSaussure, and many others”
Willis Eschenbach at WUWT showed this plot of some of those results. And of it he said (correctly):
So, Nick, how many CO2 measurements is that out of 90,000?
The fact is, you don’t know what CO2 levels were that long ago. No one else does, either. Dr. Beck provides a good approximation, but your typical response is: “Oh, no. Can’t be.”
Also, you never responded to my comment that there is no evidence of global harm due to the rise in CO2. If that is a fact, then all your arguments and protests mean exactly nothing. CO2 is harmless, and it is beneficial to the biosphere. More is better. Put that in your pipe and smoke it.
Gary Pearse says:
August 5, 2014 at 1:54 pm
Phil. says:
August 5, 2014 at 11:35 am
“dbstealey says:
August 5, 2014 at 10:44 am
Gary Pearse,
Count me as one who has always been skeptical of the assertions of past CO2 levels. …
Phil. says:
“And yet the annual increase in CO2 has grown from about 0.5ppm/yr in 1960 to about 2.5ppm/yr in recent years. Over that period the measured CO2 level has increased from 320 to 400ppm, that’s a growth of 25%. That conflicts with your narrative so you have to reject it”
Or so the keepers of the molecule say! Just where on NASA's globe below would you say we should be collecting CO2 data?
Based on that data Mauna Loa seems like an excellent location, just about the average and at the right altitude too.
From ~20 degrees N to 90 S, CO2 seems to be a bit thin.
Learn to read a graph: the total range is less than 10 ppm; also note that the data are taken during the austral winter.
And why would you think that pre-industrial levels of CO2 were ~280 or so when in the MWP – warmer than now – wine grapes were grown in Scotland?
I only referred to modern measurements, not to the pre-industrial levels. However, grapes were not grown outdoors in Scotland at that time; the furthest north was around Lincolnshire in England, and in any case grapes are grown more widely in England now than they were then.
Indeed, why wouldn't it make sense to think that CO2 was even more concentrated than now? It would better fit the CAGW narrative. I know they have been trying to kill off the MWP for a couple of decades now, so they haven't thought the CO2 side of the question through. I'm sure there is a paper in the works somewhere putting the CO2 back into years of yore to account for the MWP, etc. What does it take to instill a tiny bit of doubt in your mind after all the well-publicized shenanigans that have been perpetrated by the climate industrial complex?
A lot more than the fanciful theorizing that I’ve seen here.
Bart says:
August 5, 2014 at 3:44 pm
dbstealey says:
August 5, 2014 at 1:29 pm
You read it correctly.
Ferdinand Engelbeen says:
August 5, 2014 at 1:32 pm
“…the first half of the CO2 data give some 60% “airborne fraction” of human emissions. The second half about 40%.”
This is constructing epicycles.
Nothing wrong with epicycles; they provided an excellent means of predicting the future positions of the planets from the Earth's frame of reference.
Bart says:
August 5, 2014 at 3:44 pm
———-
Alan Robertson says:
August 5, 2014 at 1:55 pm
“To me, the graph of the Mauna Loa data set shows a nearly constant rate of change in all years, regardless of T.”
————–
Note the curvature in the plot you referenced. The curve was accelerating in those years in which global temperatures were increasing. Now, with global temperatures stable, no acceleration. Telling, no?
__________________
Well… there are two ways of looking at the annual rate of change of the MLO data, which may be seen in the following graph:
http://www.woodfortrees.org/plot/esrl-co2/normalise/mean:48/plot/hadcrut4gl/from:1959/mean:48/plot/hadcrut4gl/from:1959/trend
1) The greatest annual change will be shown by the portion of the line with the steepest slope.
The steepest slope appears to be during the last decade, while the T trend has been flat or negative. There appears to be no curve in the line in the last decade; the slope appears constant.
2) The greatest change in the trend of MLO data will be shown by the portion of the line with the greatest change in slope, i.e., the portion of the line which shows the greatest curve.
The decades 1959-1989 appear to be those with the greatest upward curve, showing increasing slope. The decades 1959-1979 do not show an increasing T trend. The decade 1979-1989 not only shows a rising T trend overall, but also has large +/- swings in T trend, while the MLO data continued a slight annual increase in slope, i.e., curvature.
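Both of Robertson's checks are easy to compute directly rather than read off a graph. A minimal sketch, assuming a local copy of NOAA's Mauna Loa monthly-mean file (the filename, column order, and missing-value convention are assumptions about that file's layout):

import numpy as np

# assumed layout: year, month, decimal date, monthly average, ...; -99.99 = missing
data = np.genfromtxt("co2_mm_mlo.txt", comments="#")
date, co2 = data[:, 2], data[:, 3]
date, co2 = date[co2 > 0], co2[co2 > 0]

rate = co2[12:] - co2[:-12]          # check 1: 12-month change, ppmv/yr (slope)
rate_date = date[12:]
accel = rate[12:] - rate[:-12]       # check 2: change in that rate (curvature)
accel_date = rate_date[12:]

for start in range(1960, 2010, 10):
    r = rate[(rate_date >= start) & (rate_date < start + 10)].mean()
    a = accel[(accel_date >= start) & (accel_date < start + 10)].mean()
    print(f"{start}s: mean rate {r:.2f} ppmv/yr, mean acceleration {a:.3f} ppmv/yr^2")

Decadal means of the rate address check 1; decadal means of the acceleration address check 2, putting a number on the "curvature" both commenters are describing.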
Phil. says:
August 5, 2014 at 6:51 pm
But they did not explain what was happening in physical reality.
Bruce Cobb says:
August 5, 2014 at 4:50 am
It makes no difference where the increased CO2 comes from, so it’s a red herring. The increased CO2 is nothing but a boon to all of life, and especially to man, by helping plants grow. Whatever warming effect it may have had cannot be sussed from what is natural, and only in the twisted, humanity-hating minds of the Warmistas could a small amount of warming be a detriment to “the planet”.
Yes, the watermelon greens are willing to put up with any amount of damage to the ecosphere in order to hurt people. Still, I think human well-being is the key. As long as we keep pointing out that CO2 helps PLANTS, we strengthen the idea that it is a waste product for humans, hence bad.
The reality is that human physiology evolved (like all others) under conditions with much higher [CO2] than today's. We need it as a pH buffer in our blood, and goodness knows what else. There are indications that maximum longevity would occur under CO2 concentrations many times higher than today's. It is important for respiration, as lower concentrations cause shallower breathing and less oxygen concentration in our tissues. Asthmatics, COPD patients, and anybody carrying oxygen tanks are probably being harmed because those tanks lack CO2. Indeed, I suspect the fire-hazard oxygen tanks could be dispensed with altogether and higher CO2 substituted for better health outcomes.
I do not know whether I am right about that, but I am pretty sure nobody is studying such questions. NSF will not fund anything that might shake the CAGW hypothesis, because they believe a good scare makes for more science funding. If I AM right, then the lack of interest is murder of people with respiratory problems, and maybe all the rest of us as well.
sturgishooper says:
August 5, 2014 at 6:53 pm
Phil. says:
August 5, 2014 at 6:51 pm
But they did not explain what was happening in physical reality.
That wasn't their purpose; like present-day tide tables, they allowed accurate prediction of future events with respect to the Earth's frame of reference. They amount to a type of Fourier analysis in the complex plane.