CO2 data might fit the IPCC hypothesis, but it doesn't fit reality

Opinion by Dr. Tim Ball

I have no data yet. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. – Arthur Conan Doyle. (Sherlock Holmes)

Create The Facts You Want.

In a comment about the WUWT article “The Record of recent Man-made CO2 emissions: 1965-2013”, Pamela Gray, graphically but pointedly, summarized the situation.

When will we finally truly do the math? The anthropogenic only portion of atmospheric CO2, let alone China’s portion, does not have the cojones necessary to make one single bit of “weather” do a damn thing different. Take out just the anthropogenic CO2 and rerun the past 30 years of weather. The exact same weather pattern variations would have occurred. Or maybe because of the random nature of weather we would have had it worse. Or it could have been much better. Now do something really ridiculous and take out just China’s portion. I know, the post isn’t meant to paint China as the bad guy. But. Really? Really? All this for something so tiny you can’t find it? Not even in a child’s balloon?

The only quibble I have is that while the amount illustrates the futility of the claims, as Gray notes, the Intergovernmental Panel on Climate Change (IPCC) and Environmental Protection Agency (EPA) are focused on trends and attribution. The increase must have a human cause and be steady, or, as they prefer, getting worse.

Narrowing the Focus

It’s necessary to revisit criticisms of the CO2 levels created by the IPCC over the last several years. Nowadays, a measure of the accuracy of those criticisms is the vehemence of the personal attacks designed to divert attention from the science and evidence.

From its inception, the IPCC focused on human production of CO2. It began with the UNFCCC definition of climate change, which covers only changes caused by humans. The goal was to prove the hypothesis that an increase in atmospheric CO2 would cause warming. This required evidence that the level had increased from pre-industrial times and would increase each year because of human industrial activity. How long before they start reducing the rate of CO2 increase to make it fit the declining temperatures? They are running out of guesses, 30 at the latest count, to explain the continued lack of temperature increase, now at 17 years and 10 months.

The IPCC makes the bizarre claim that up until 1950 human addition of CO2 was a minor driver of global temperature, but that after 1950 over 90 percent of the temperature increase is due to human CO2.

Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations.

 

The claim that a fractional increase in CO2 from human sources, when CO2 itself is naturally only 4 percent of all greenhouse gases, became the dominant factor in just a couple of years is incredible. This claim comes from computer models, which are the only place in the world where a CO2 increase causes a temperature increase. It depends on human production and atmospheric levels increasing. It assumes temperature continues to increase, as all three IPCC scenario projections imply.

Their frustration is that they control the CO2 data, but after the University of Alabama in Huntsville (UAH) began producing satellite global temperature data, their control of temperature data was curtailed. It didn’t stop them completely, as disclosures by McIntyre, Watts, Goddard, and the New Zealand Climate Science Coalition, among others, illustrated. They all showed adjustments designed to enhance and emphasize higher modern temperatures.

Now they’re confronted with T. H. Huxley’s challenge,

The Great Tragedy of Science – the slaying of a beautiful hypothesis by an ugly fact.

This article examines how the modern levels of atmospheric CO2 were determined and controlled to fit the hypothesis. They may fit a political agenda, but they don’t fit nature’s agenda.

New Deductive Method: Create the Facts to Fit the Theory

Farhad Manjoo asked in True Enough: Learning To Live In A Post-fact Society,

“Why has punditry lately overtaken news? Why do lies seem to linger so long in the cultural subconscious even after they’ve been thoroughly discredited? And why, when more people than ever before are documenting the truth with laptops and digital cameras, does fact-free spin and propaganda seem to work so well?”

Manjoo’s comments apply to society in general, but they are amplified in climate science because the public’s ability to judge scientific issues varies widely. A large majority is more easily deceived.

Manjoo argues that people create facts themselves or find someone to produce them. Creating data is the only option in climate science because, as the 1999 NRC Report found, there is virtually none. A response to the February 3, 1999 US National Research Council (NRC) Report on Climate Data said,

“Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.”

The situation is worse today. The number of stations used has been dramatically reduced, and records have been adjusted to lower historic temperature data, which steepens the warming gradient of the record. The lack of data for the oceans was recently identified.

“Two of the world’s premiere ocean scientists from Harvard and MIT have addressed the data limitations that currently prevent the oceanographic community from resolving the differences among various estimates of changing ocean heat content.”

Oceans are critical to CO2 levels because of their large sink or source capacity.

The data necessary for a viable determination of climate mechanisms, and thereby climate change, are completely inadequate. This applies especially to the structure of climate models. There is no data for at least 80 percent of the grids covering the globe, so they guess; it’s called parameterization. The 2007 IPCC Report notes,

Due to the limited resolutions of the models, many of these processes are not resolved adequately by the model grid and must therefore be parameterized. The differences between parameterizations are an important reason why climate model results differ.

Variable results occur because of inadequate data at the most basic level and subjective choices by the people involved.

The IPCC Produce The Human Production Numbers

The 2001 IPCC Report identified 6.5 GtC (gigatons of carbon) from human sources. The figure rose to 7.5 GtC in the 2007 Report, and by 2010 it was 9.5 GtC. Where did they get these numbers? The answer is that the IPCC has them produced and then vets them. In the FAQ section they ask, “How does the IPCC produce its Inventory Guidelines?”

Utilizing IPCC procedures, nominated experts from around the world draft the reports that are then extensively reviewed twice before approval by the IPCC.

The emissions estimates were called the Special Report on Emissions Scenarios (SRES) until the 2013 Report, when they became Representative Concentration Pathways (RCP). In March 2001, John Daly reported Richard Lindzen referring to the SRES and the entire IPCC process, including the SRES, as follows,

In a recent interview with James Glassman, Dr. Lindzen said that the latest report of the UN-IPCC (that he helped author), “was very much a children’s exercise of what might possibly happen” prepared by a “peculiar group” with “no technical competence.”

William Kininmonth, author of the insightful book “Climate Change: A Natural Hazard”, was former head of Australia’s National Climate Centre and their delegate to the WMO Commission for Climatology. He wrote the following in an email on the ClimateSceptics group page.

I was at first confused to see the RCP concept emerge in AR5. I have come to the conclusion that RCP is no more than a sleight of hand to confuse readers and hide absurdities in the previous approach.

You will recall that the previous carbon emission scenarios were supposed to be based on solid economic models. However, this basis was challenged by reputable economists and the IPCC economic modelling was left rather ragged and a huge question mark hanging over it.

I sense the RCP approach is to bypass the fraught economic modelling: prescribed radiation forcing pathways are fed into the climate models to give future temperature rise—if the radiation forcing plateaus at 8.5W/m2 sometime after 2100 then the global temperature rise will be 3C. But what does 8.5 W/m2 mean? Previously it was suggested that a doubling of CO2 would give a radiation forcing of 3.7 W/m2. To reach a radiation forcing of 7.4 W/m2 would thus require a doubling again—4 times CO2 concentration. Thus to follow RCP8.5 it is necessary for the atmospheric CO2 concentration equivalent to exceed 1120ppm after 2100.

We are left questioning the realism of a RCP 8.5 scenario. Is there any likelihood of the atmospheric CO2 reaching about 1120 ppm by 2100? IPCC has raised a straw man scenario to give a ‘dangerous’ global temperature rise of about 3C early in the 22nd century knowing full well that such a concentration has an extremely low probability of being achieved. But, of course, this is not explained to the politicians and policymakers. They are told of the dangerous outcome if the RCP8.5 is followed without being told of the low probability of it occurring.

One absurdity is replaced by another! Or have I missed something fundamental?[1]

No, nothing is missed! However, in reality, it doesn’t matter whether it changes anything; it achieves the goal of increasing CO2 and its supposed impact on global warming. The underpinning of IPCC climate science and economics depends on accurate data and knowledge of mechanisms, and that is not available.
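Kininmonth’s arithmetic in the email above can be checked with the widely used simplified forcing expression F = 5.35 ln(C/C0) W/m², the source of the “3.7 W/m² per doubling” figure he cites. This is a sketch added here for illustration, assuming a 280 ppm pre-industrial baseline; it is not from the original article:

```python
import math

F_COEFF = 5.35   # W/m^2, simplified CO2 forcing coefficient (assumed)
C_PRE = 280.0    # ppm, assumed pre-industrial CO2 baseline

def forcing(c_ppm):
    """Radiative forcing relative to the pre-industrial level, in W/m^2."""
    return F_COEFF * math.log(c_ppm / C_PRE)

print(round(forcing(2 * C_PRE), 2))  # one doubling (560 ppm): ~3.71 W/m^2
print(round(forcing(4 * C_PRE), 2))  # two doublings (1120 ppm): ~7.42 W/m^2
```

Two doublings of 280 ppm give 1120 ppm and roughly 7.4 W/m², matching Kininmonth’s point that an RCP8.5 plateau above 7.4 W/m² implies a CO2-equivalent concentration exceeding about 1120 ppm.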

We know there was insufficient weather data on which to construct climate models, and the situation deteriorated as they eliminated weather stations, ‘adjusted’ records, and then cherry-picked data. We know knowledge of mechanisms is inadequate because the IPCC WGI Science Report says so.

Unfortunately, the total surface heat and water fluxes (see Supplementary Material, Figure S8.14) are not well observed.

or

For models to simulate accurately the seasonally varying pattern of precipitation, they must correctly simulate a number of processes (e.g., evapotranspiration, condensation, transport) that are difficult to evaluate at a global scale.

 

Two critical situations were central to control of atmospheric CO2 levels. We know Guy Stewart Callendar, a British steam engineer, cherry-picked the low readings from 90,000 19th-century atmospheric CO2 measurements. This not only established a low pre-industrial level, but also altered the trend of atmospheric levels. (Figure 1)


Figure 1 (After Jaworowski; Trend lines added)

Callendar’s work was influential in the Gore-generated claims of human-induced CO2 increases. However, the most influential paper in the climate community, especially at CRU and the IPCC, was Tom Wigley’s 1983 paper “The pre-industrial carbon dioxide level” (Climatic Change, 5, 315-320). I held seminars in my graduate-level climate course on its validity and its selectivity in establishing a pre-industrial baseline.

I wrote an obituary on learning of Ernst-Georg Beck’s untimely death.

I was flattered when he asked me to review one of his early papers on the historic pattern of atmospheric CO2 and its relationship to global warming. I was struck by the precision, detail and perceptiveness of his work and urged its publication. I also warned him about the personal attacks and unscientific challenges he could expect. On 6 November 2009 he wrote to me, “In Germany the situation is comparable to the times of medieval inquisition.” Fortunately, he was not deterred. His friend Edgar Gartner explained Ernst’s contribution in his obituary: “Due to his immense specialized knowledge and his methodical rigor Ernst very promptly noticed numerous inconsistencies in the statements of the Intergovernmental Panel on Climate Change (IPCC). He considered the warming of the earth’s atmosphere as a result of a rise of the carbon dioxide content of the air of approximately 0.03 to 0.04 percent as impossible. And he doubted that the curve of the CO2 increase noted on the Hawaiian volcano Mauna Loa since 1957/58 could be extrapolated linearly back to the 19th century.” (This is a translation from the German.)

Beck was the first to analyze the 19th-century data in detail. It was data collected in scientific attempts to measure precisely the amount of CO2 in the atmosphere. The effort began in 1812, triggered by Priestley’s work on atmospheric oxygen, and was part of the scientific effort to quantify all atmospheric gases. There was no immediate political motive. Beck did not cherry-pick the results; he examined the method, location, and as much detail as possible for each measurement, in complete contrast to what Callendar and Wigley did.

The IPCC had to show that,

· Increases in atmospheric CO2 caused temperature increase in the historic record.

· Current levels are unusually high relative to the historic record.

· Current levels are much higher than pre-industrial levels.

· The differences between pre-industrial and current atmospheric levels are due to human additions of CO2 to the atmosphere.

Beck’s work showed the fallacy of these claims and in so doing put a big target on his back.

Again, from my obituary:

Ernst-Georg Beck was a scholar and gentleman in every sense of the term. His friend wrote, “They tried to denounce Ernst-Georg Beck on the Internet as a naive amateur and data counterfeiter. Unfortunately, Ernst could hardly defend himself in the last months because of his progressive illness.” His work, determination and ethics were all directed at answering questions in the skeptical method that is true science; the antithesis of the efforts of all those who challenged and tried to block or denigrate him.

The 19th-century CO2 measurements are no less accurate than those for temperature; indeed, I would argue that Beck shows they are superior. So why, for example, are his assessments any less valid than those made for the early portions of the Central England Temperature (CET) record? I spoke at length with Hubert Lamb about the early portion of Manley’s CET reconstruction, because the instruments, locations, measures, records and knowledge of the observers were comparable to those in the Hudson’s Bay Company record I was dealing with.

Once the pre-industrial level was created, it became necessary to ensure the new post-industrial CO2 trend continued. That was achieved when C. D. Keeling established the Mauna Loa CO2 measuring station. As Beck notes,

Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC.

Keeling’s son operates Mauna Loa and, as Beck notes, “owns the global monopoly of calibration of all CO2 measurements.” He is also a co-author of the IPCC reports, which accept Mauna Loa and all other readings as representative of global levels. So the IPCC control the human production figures and the atmospheric CO2 levels, and both are constantly and consistently increasing.

This diverts attention from the real problem with the measurements and claims. The fundamental IPCC objective is to identify human causes of global warming. You can only determine the human portion and contribution if you know the natural levels and how much they vary, and we have only very crude estimates of those.

What Values Are Used for Each Component of the Carbon Cycle?

Dr. Dietrich Koelle is one of the few scientists to assess estimates of natural annual CO2 emissions.

Annual Carbon Dioxide Emissions (GtC per annum)

1. Respiration (humans, animals, phytoplankton): 45 to 52

2. Ocean out-gassing (tropical areas): 90 to 100

3. Volcanic and other ground sources: 0.5 to 2

4. Ground bacteria, rotting and decay: 50 to 60

5. Forest cutting, forest fires: 1 to 3

6. Anthropogenic emissions, fossil fuels (2010): 9.5

TOTAL: 196 to 226.5

Source: Dr. Dietrich Koelle

The IPCC estimate of human production (item 6) for 2010 was 9.5 GtC, but that is gross production. One of the early issues in the push to ratify the Kyoto Protocol was an attempt to get US ratification. The US asked for carbon credits, primarily for CO2 removed through reforestation, so that a net figure would apply to its assessment as a developed nation. The request was denied. In reality, the net figure better represents the human impact. If we use a human net production of 5 GtC for 2010, it falls within the uncertainty ranges of the estimates for three natural sources: (1), (2), and (4).
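The totals in Koelle’s table, and the net-versus-gross comparison, can be reproduced directly. The ranges below are the assumed figures from the table above; the 5 GtC net human value is the illustrative figure used in the text, not an official statistic:

```python
# Assumed annual CO2 source estimates from Koelle's table (GtC per annum)
natural = {
    "respiration": (45.0, 52.0),
    "ocean_outgassing": (90.0, 100.0),
    "volcanic_ground": (0.5, 2.0),
    "soil_decay": (50.0, 60.0),
    "forest_fires": (1.0, 3.0),
}
human_gross = 9.5  # IPCC figure for 2010
human_net = 5.0    # illustrative net figure used in the text

low = sum(lo for lo, hi in natural.values()) + human_gross
high = sum(hi for lo, hi in natural.values()) + human_gross
print(low, high)  # 196.0 226.5, matching the table's TOTAL row

# Natural sources whose uncertainty span alone exceeds net human production:
wide = [name for name, (lo, hi) in natural.items() if hi - lo >= human_net]
print(wide)  # respiration, ocean out-gassing, soil decay
```

The point of the comparison: the net human figure is smaller than the uncertainty in three of the natural sources individually, so it disappears within the error bars of the natural carbon cycle.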

The Truth Will Out.

How much longer will the IPCC continue to produce CO2 data with trends that fit their hypothesis that temperature will continue to rise? How much longer before the public becomes aware of Gray’s colorful observation that “The anthropogenic only portion of atmospheric CO2, let alone China’s portion, does not have the cojones necessary to make one single bit of ‘weather’ do a damn thing different”? The almost 18-year leveling and slight reduction in global temperature is essentially impossible under IPCC assumptions. The claim is already made that the hiatus doesn’t negate their science or projections, instead of acknowledging that the hiatus, along with their failed predictions, completely rejects their fear mongering.

IPCC and EPA have already shown that being wrong or being caught doesn’t matter. The objective is the scary headline, enhanced by the constant claim it is getting worse at an increasing rate, and time is running out. Aldous Huxley said, “Facts do not cease to exist because they are ignored.” We must make sure they are real and not ignored.


[1] Reproduced with permission of William Kininmonth.

Phil.
August 13, 2014 1:38 pm

richardscourtney says:
August 13, 2014 at 2:43 am
Bart:
At August 13, 2014 at 2:24 am you say to the troll who posts as Phil.
You’re way out of your depth on this.
Yes, and at August 11, 2014 at 2:05 pm I attempted to inform the troll how to swim in the deep water it had entered but – as is the way with trolls – Phil. replied showing a desire to splash about instead of learning.

As usual courtney your post contains nothing of substance.

richardscourtney
August 13, 2014 2:17 pm

Troll who operates as Phil.:
As usual your post at August 13, 2014 at 1:38 pm fails to recognise that you are being corrected because you are nothing of substance.
Richard

Bart
August 13, 2014 10:55 pm

Phil. says:
August 13, 2014 at 1:30 pm
“If it weren’t so why would SF6 (MW 146) be a gas of choice to measure dispersion in the BL?”
I expect because its transition from laminar to turbulent flow occurs at a convenient point for making the measurement. Do you even know what a Reynolds number is? Look it up. What physical properties does it depend on? Are you getting a clue?
” This from a guy who…”
… knows how dynamical systems behave. Stop digging, Phil.

Phil.
August 14, 2014 7:04 am

Bart says:
August 13, 2014 at 10:55 pm
Phil. says:
August 13, 2014 at 1:30 pm
“If it weren’t so why would SF6 (MW 146) be a gas of choice to measure dispersion in the BL?”
I expect because its transition from laminar to turbulent flow occurs at a convenient point for making the measurement.

You really do want to keep digging that hole don’t you Bart.
The SF6 is introduced to the atmosphere at 1,000 ppm so it will have no effect on the Reynolds number.
Do you even know what a Reynolds number is? Look it up. What physical properties does it depend on? Are you getting a clue?
Yes I do know what a Reynolds number is since I taught Fluid mechanics for many years, I also know that at the level of 1,000 ppm to ppt SF6 will not influence the viscosity or density of the bulk gas.
Bart claims that he “… knows how dynamical systems behave”, so far he’s failed to demonstrate that here and has just added Fluid mechanics to the list of subjects he doesn’t understand.

Bart
August 14, 2014 11:39 pm

Phil. says:
August 14, 2014 at 7:04 am
“I also know that at the level of 1,000 ppm to ppt SF6 will not influence the viscosity or density of the bulk gas.”
No sh*t? We aren’t talking about the bulk atmospheric properties, Phil. That’s what I’ve been trying to get through to you.
We’re not even talking about small fractional components. We’re talking about smokestack emissions, which are mostly CO2 and water vapor.
If what you imagine to be true were true, we wouldn’t even have a need for smokestacks – the gases would immediately be swept up and well mixed, and it wouldn’t matter if the stacks were 100 feet or 100 inches tall.
This is such a stupid conversation. You focus on these large scale, airy general conditions, and their associated textbook results, and utterly fail to realize that we are interested here in specific, local conditions in the vicinity of a large emissions source.
Smokestacks create lingering, elevated concentrations in their immediate vicinity. There is zero sane argument possible against that. Vehicles burning fossil fuels create large, low altitude concentrations of their combustion products. If that were not the case, cities wouldn’t have smog alerts. The air in LA and Mexico City would be comparable to the Rocky Mountains.
Get your head out of your tuckus, and concentrate on the specific problem. Leave aside the textbooks for a moment and think.

Nick Stokes
August 15, 2014 12:20 am

Bart says: August 13, 2014 at 10:55 pm
“I expect because its transition from laminar to turbulent flow occurs at a convenient point for making the measurement. Do you even know what a Reynolds number is?”

This does not sound like someone who knows about fluid mechanics. Phil. is right. At 1000 ppmv, SF6 makes negligible difference to Re or transition.
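[An aside added here, not part of the original thread: the dilution point is easy to quantify. Assuming ideal-gas mixing, 1000 ppm of SF6 shifts the mean molar mass, and hence the density that enters the Reynolds number, by well under one percent:]

```python
# Effect of 1000 ppm SF6 on bulk gas density via mean molar mass
# (ideal-gas mixing; dynamic viscosity assumed essentially unchanged)
M_AIR = 28.97    # g/mol, mean molar mass of dry air
M_SF6 = 146.06   # g/mol, molar mass of SF6
x_sf6 = 1000e-6  # mole fraction: 1000 ppm

m_mix = (1 - x_sf6) * M_AIR + x_sf6 * M_SF6
rel_change = (m_mix - M_AIR) / M_AIR
print(round(100 * rel_change, 2), "%")  # ~0.4 % change in density, and so in Re
```

Since Re is proportional to density for a fixed velocity, length scale, and viscosity, a 0.4 percent density shift is negligible for any laminar-turbulent transition, which is the substance of the objection above.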

Bart
August 15, 2014 7:33 pm

Nick Stokes says:
August 15, 2014 at 12:20 am
It makes little difference to the bulk properties of a well mixed parcel. It makes a significant difference to the dynamics of the gas itself, especially near the point of injection, before it has had a chance to mix.

Phil.
August 16, 2014 6:32 am

Bart says:
August 15, 2014 at 7:33 pm
Nick Stokes says:
August 15, 2014 at 12:20 am
It makes little difference to the bulk properties of a well mixed parcel. It makes a significant difference to the dynamics of the gas itself, especially near the point of injection, before it has had a chance to mix.

No it doesn’t you idiot, read the damned paper, it’s injected mixed with air at 1,000ppm.
Stop digging, you’ve proved you don’t have clue about what’s being talked about.
You made a stupid remark about CO2 being a ‘heavy gas’ and your thrashing around trying to justify it isn’t helping, you were wrong, face up to it.

Bart
August 16, 2014 12:09 pm

Phil. says:
August 16, 2014 at 6:32 am
“No it doesn’t you idiot, read the damned paper, it’s injected mixed with air at 1,000ppm.”
Then, very unlike the situation we are talking about, you moron. Do you ever stop to think before you write these comments? Ever?

August 17, 2014 8:56 pm

ladylifegrows says:
August 5, 2014 at 7:06 pm
Bruce Cobb says:
August 5, 2014 at 4:50 am
It makes no difference where the increased CO2 comes from, so it’s a red herring. The increased CO2 is nothing but a boon to all of life, and especially to man, by helping plants grow. Whatever warming effect it may have had cannot be sussed from what is natural, and only in the twisted, humanity-hating minds of the Warmistas could a small amount of warming be a detriment to “the planet”.
—————————————————
“Yes, the watermelon greens are willing to put up with any amount of damage to the ecosphere in order to hurt people. Still, I think human well-being is the key. As long as we keep pointing out that CO2 helps PLANTS, we strengthen the idea that it is a waste product for humans, hence bad…”
~ Humans eat plants, or eat other animals that eat plants. Most humans understand that.
“The reality is that human physiology evolved (like all others) under conditions much higher in [CO2] than today’s. We need it for a pH buffer in our blood, and goodness knows what else. There are indications that maximum longevity would occur under CO2 concentrations many times higher than today’s. It is important for respiration as lower concentrations cause shallower breathing and less oxygen concentration in our tissues. Asthmatics, COPD and anybody carrying oxygen tanks is probably being harmed because they lack CO2 in those tanks. Indeed, I suspect the fire-hazard oxygen tanks could be dispensed with altogether and higher CO2 substituted for better health outcomes…”
~Humans create CO2 within their bodies during metabolism, which they exhale (and some does fulfill a function within the body in the switch between aerobic and anaerobic metabolism, and “goodness knows” in some other ways as well). Carbon from food intake and oxygen from inhalation. What are the ‘indications’ that longevity would be maximized with higher CO2 concentrations in the inhalation side? It is well known that CO2 concentrations above a certain threshold result in eventual suffocation and death in the short term. Hemoglobin carries both oxygen and CO2 (along with other chemicals~ nitric oxide and hydrogen sulfide for examples). More CO2 in, given limited hemoglobin bearing blood cells, means less oxygen in. Replacing oxygen in tanks for people who need them with CO2 will very quickly kill them.
“I d not know whether I am right about that, but I am pretty sure nobody is studying such questions. NSF will not fund anything that might shake the CAGW hypothesis, because they believe a good scare makes more science funding. If I AM right, then the lack of interest is murder of people with respiratory problems–and maybe all the rest of us as well…”
~You aren’t right about anything in your post, except that humans may have been around during higher concentrations of CO2 in the air, but that is moot, because humans use mostly aerobic metabolism, and whatever CO2 they require is produced by metabolism.

Phil.
August 18, 2014 6:17 am

Bart says:
August 16, 2014 at 12:09 pm
Phil. says:
August 16, 2014 at 6:32 am
“No it doesn’t you idiot, read the damned paper, it’s injected mixed with air at 1,000ppm.”
Then, very unlike the situation we are talking about, you moron. Do you ever stop to think before you write these comments? Ever?

Yes, and unlike you I know what I’m talking about.
Just when I think you have plumbed the depths of stupidity you come up with something like this:
We aren’t talking about the bulk atmospheric properties, Phil. That’s what I’ve been trying to get through to you.
We’re not even talking about small fractional components. We’re talking about smokestack emissions, which are mostly CO2 and water vapor.

On planet Earth smokestack emissions are mostly Nitrogen (~70%+), just like the air we breathe.
It contains about 10% CO2 depending on the fuel used and the mean molecular weight is between 29 and 30, compare with air at 29, so you were talking nonsense when you were talking about the Reynolds number differences. Bear in mind that the smokestack emissions which you see are water droplets (10s of microns in diameter) which disperse more slowly than the gases. So to summarize we are talking about the bulk gas properties which are very similar to air itself.
Go back to school Bart and come back here when you actually know something about what the grown-ups are talking about.

Bart
August 18, 2014 8:12 pm

Phil. says:
August 18, 2014 at 6:17 am
“Yes, and unlike you I know what I’m talking about.”
Congratulations on being the world’s leading authority on irrelevant information.
“On planet Earth smokestack emissions are mostly Nitrogen…”

The gases emitted through smokestacks largely consist of carbon dioxide and water vapor, though some nitrogen and oxygen are typically present, along with a number of pollutants.

Stop digging.

Phil.
August 19, 2014 6:40 am

Bart says:
August 18, 2014 at 8:12 pm
Phil. says:
August 18, 2014 at 6:17 am
“Yes, and unlike you I know what I’m talking about.”
Congratulations on being the world’s leading authority on irrelevant information.
“On planet Earth smokestack emissions are mostly Nitrogen…”
The gases emitted through smokestacks largely consist of carbon dioxide and water vapor, though some nitrogen and oxygen are typically present, along with a number of pollutants.

No wonder you’re always wrong about everything, you rely on a source called ‘Wisegeek’!
That source apparently thinks you can burn fuel with air which is about 80% Nitrogen and it doesn’t make it to the smokestack. I’m afraid that any vestige of credibility you might have had just went out the window.
For a credible source try:
http://chemengineering.wikispaces.com/Flue+gas
http://www.eoearth.org/view/article/171355/
From the latter: “Since ambient air contains about 79 volume percent gaseous nitrogen (N2), which is essentially non-combustible, the largest part of the flue gases from most fossil fuel combustion is uncombusted nitrogen.”
Here’s a paper from Argonne National Labs giving 73% N2 for gas fired power stations
and 76% N2 for coal fired.
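[An aside added here, not part of the original thread: the cited N2 shares follow from simple combustion stoichiometry. A sketch for stoichiometric methane combustion in air, with no excess air; excess air pushes the N2 share higher, toward the 73-76 percent figures quoted:]

```python
# Stoichiometric combustion of methane in air, mole basis:
#   CH4 + 2 O2 -> CO2 + 2 H2O, with the N2 in air carried through unreacted
o2_needed = 2.0
n2_carried = o2_needed * 79.0 / 21.0   # ~7.52 mol N2 per mol CH4 (air ~79/21 N2:O2)
flue = {"CO2": 1.0, "H2O": 2.0, "N2": n2_carried}
total = sum(flue.values())
for gas, moles in flue.items():
    print(gas, round(100 * moles / total, 1), "%")
# CO2 ~9.5 %, H2O ~19.0 %, N2 ~71.5 % (wet basis)
```

This also matches the earlier figure in the thread of flue gas containing roughly 10 percent CO2 depending on the fuel.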

Bart
August 19, 2014 9:50 pm

Meh. It’s still rather significantly more than 1000 ppm, something near a quarter of the mass. You don’t really know how the gas behaves when ejected, Phil. Nobody does. Estimates of plume dispersion are notoriously inaccurate, and who knows the ultimate dispensation?
As for credibility, you blew yours long ago. On this very thread, you have once again put forward a “mass balance” argument which is frankly kindergarten level. I’m talking so far out to lunch that you would figuratively starve. And, your notion that density has no effect in the PBL, especially the idea that it becomes less important in the turbulent zone where inertial forces specifically come to the fore, is cuckoo.
So, spare me the jibes. The rate of change of atmospheric CO2 is in the doldrums, and the globe is not warming, even as emissions continue relentlessly accelerating. You are wrong about just about every important detail, and you had better start packing your parachute to bail out from the “cause”, because the salad days are over.
