Will Humanity Ever Reach 2XCO2? Possibly Not

Reposted from Dr. Roy Spencer’s Blog

February 1st, 2020 by Roy W. Spencer, Ph. D.

Summary

The Energy Information Administration (EIA) projects growth in energy-based CO2 emissions of +0.6%/yr through 2050. But translating future emissions into atmospheric CO2 concentration requires a global carbon budget model, and we frequently accept the United Nations' reliance on such models to tell us how much CO2 will be in the atmosphere for any given CO2 emissions scenario. Using a simple time-dependent CO2 budget model forced with yearly estimates of anthropogenic CO2 emissions and optimized to match Mauna Loa observations, I show that the EIA emissions projections translate into surprisingly low CO2 concentrations by 2050. In fact, assuming constant CO2 emissions after 2050, the atmospheric CO2 content eventually stabilizes at just under 2XCO2.

Introduction

I have always assumed that we are on track for a doubling of atmospheric CO2 (“2XCO2”), if not 3XCO2 or 4XCO2. After all, humanity’s CO2 emissions continue to increase, and even if they stop increasing, won’t atmospheric CO2 continue to rise?

It turns out, the answer is probably “no”.

The rate at which nature removes CO2 from the atmosphere, and what controls that rate, makes all the difference.

Even if we knew exactly what humanity’s future CO2 emissions were going to be, how much Mother Nature takes out of the atmosphere is seldom discussed or questioned. This is the domain of global carbon cycle models which we seldom hear about. We hear about the improbability of the RCP8.5 concentration scenario (which has gone from “business-as-usual”, to “worst case”, to “impossible”), but not much about how those CO2 concentrations were arrived at from CO2 emissions data.

So, I wanted to address the question, What is the best estimate of atmospheric CO2 concentrations through the end of this century, based upon the latest estimates of future CO2 emissions, and taking into account how much nature has been removing from the atmosphere?

As we produce more and more CO2, the amount of CO2 removed by various biological and geophysical processes also goes up. The history of best estimates of yearly anthropogenic CO2 emissions, combined with the observed rise of atmospheric CO2 at Mauna Loa, Hawaii, tells us a lot about how fast nature adjusts to more CO2.

As we shall see, it is entirely possible that even if we continue producing large quantities of CO2, atmospheric CO2 levels could eventually stabilize.

In their most recent 2019 report, the U.S. Energy Information Administration (EIA) projects that energy-based emissions of CO2 will grow at 0.6% per year until 2050, which is what I will use to project future atmospheric CO2 concentrations. I will show what this emissions scenario translates into using a simple atmospheric CO2 budget model that has been calibrated with the Mauna Loa data. And we will see that the resulting amount of CO2 remaining in the atmosphere is surprisingly low.

A Review of the CO2 Budget Model

I previously presented a simple time-dependent CO2 budget model of global atmospheric CO2 concentration that uses (1) yearly anthropogenic CO2 emissions, along with (2) the central assumption (supported by the Mauna Loa CO2 data) that nature removes CO2 from the atmosphere at a rate in direct proportion to how high atmospheric CO2 is above some natural level the system is trying to ‘relax’ to.

As described in my previous blog post, I also included an empirical El Nino/La Nina term since El Nino is associated with higher CO2 in the atmosphere, and La Nina produces lower concentrations. This captures the small year-to-year fluctuations in CO2 from ENSO activity, but has no impact on the long-term behavior of the model.

The model is initialized in 1750 and forced with the Boden et al. (2017) estimates of yearly anthropogenic emissions, and produces an excellent fit to the Mauna Loa CO2 observations using the assumption of a baseline (background) CO2 level of 295 ppm and a natural removal rate of 2.33% per year of the atmospheric excess above that baseline.
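The model's yearly update can be sketched in a few lines of Python (a minimal sketch: the ~2.13 GtC-per-ppm conversion factor is a standard approximation I have assumed, not a figure from the post, and the emissions value in the example is illustrative):

```python
# One-box CO2 budget model as described above: each year, add
# anthropogenic emissions and remove a fixed fraction (2.33%) of the
# atmospheric excess above a 295 ppm baseline.

BASELINE_PPM = 295.0   # 'relaxation' level the system tends toward
REMOVAL_RATE = 0.0233  # fraction of the excess removed per year
GTC_PER_PPM = 2.13     # assumed conversion: ~2.13 GtC raises CO2 by 1 ppm

def step(co2_ppm: float, emissions_gtc: float) -> float:
    """Advance atmospheric CO2 by one year."""
    added = emissions_gtc / GTC_PER_PPM
    removed = REMOVAL_RATE * (co2_ppm - BASELINE_PPM)
    return co2_ppm + added - removed

# Example: starting at 400 ppm with 10 GtC/yr of emissions, one year
# adds ~4.7 ppm and removes ~2.4 ppm, for a net gain of ~2.3 ppm.
```

Iterating this from 1750 with the Boden et al. emissions series (the ENSO term is omitted here) reproduces the long-term behavior shown below.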

Here is the resulting fit of the model to Mauna Loa data, with generally excellent results. (The post-Pinatubo reduction in atmospheric CO2 is believed to be due to increased photosynthesis due to an increase in diffuse sunlight penetration into forest canopies caused by the volcanic aerosols):

Fig. 1. Calibrated CO2 budget model compared to the Mauna Loa, Hawaii CO2 observations. The model is forced with the Boden et al. (2017) estimates of yearly anthropogenic CO2 emissions, and removes CO2 in proportion to the excess of atmospheric CO2 above a baseline value.

The model even captures the slowly increasing trend in the apparent yearly fractional removal of CO2 emissions.

Fig. 2. Yearly apparent fraction of anthropogenic emissions removed by nature, in the Mauna Loa observations (red) versus the model (blue).

Model Projections of Atmospheric CO2

I forced the CO2 model with the following two future scenario assumptions:

1) EIA assumption of 0.6% per year growth in emissions through 2050
2) Constant emissions from 2050 onward

The resulting CO2 concentration is shown in Fig. 3, along with the CO2 concentration scenarios, RCP2.6, RCP4.5, RCP6.0, and RCP8.5, used in the CMIP5 climate model projections.

Fig. 3. CO2 model projection of atmospheric CO2 assuming EIA estimates of CO2 emissions growth through 2050, followed by constant CO2 emissions afterward.

Interestingly, with these rather reasonable assumptions regarding CO2 emissions, the model never reaches a doubling of atmospheric CO2; it stabilizes at an equilibrium concentration of 541 ppm in 2240.
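The equilibrium value follows directly from the model's removal assumption: CO2 stops rising when nature's removal, 2.33% of the excess over 295 ppm, exactly balances the constant emissions. A short sketch (the post-2050 emissions figure here is back-computed from the quoted 541 ppm, not taken from the post):

```python
# At equilibrium, removal balances constant emissions:
#   E = r * (C_eq - C0)   =>   C_eq = C0 + E / r   (E in ppm/yr)

BASELINE_PPM = 295.0
REMOVAL_RATE = 0.0233
GTC_PER_PPM = 2.13  # assumed conversion: ~2.13 GtC per ppm of CO2

def equilibrium_ppm(emissions_gtc_per_yr: float) -> float:
    """CO2 level at which removal exactly offsets constant emissions."""
    excess_ppm = (emissions_gtc_per_yr / GTC_PER_PPM) / REMOVAL_RATE
    return BASELINE_PPM + excess_ppm

# A constant ~12.2 GtC/yr after 2050 (an assumed figure) gives an
# equilibrium of roughly 541 ppm, consistent with the value quoted above.
```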

Discussion

In my experience, the main complaint about the current model will be that it is “too simple” and therefore probably incorrect. But I would ask the reader to examine how well the simple model assumptions explain 60 years of CO2 observations (Figs. 1 & 2).

Also, I would recall the faulty predictions many years ago by the global carbon cycle modelers that the Earth system could not handle so much atmospheric CO2, and that the fraction which is removed over time would start to decrease. As Fig. 2 (above) shows, that has not happened. Maybe when it comes to photosynthesis, more life begets still more life, leading to a slowly increasing ability of the biosphere to remove excess CO2 from the atmosphere.

Given the large uncertainties in how the global carbon cycle responds to more CO2 in the atmosphere, it is entirely reasonable to hypothesize that the rate at which the ocean and land removes CO2 from the atmosphere is simply proportional to how high the atmospheric concentration gets above some baseline value. This simple hypothesis does not necessarily imply that the processes controlling CO2 sources and sinks are also simple; only that the net global rate of removal of atmospheric CO2 can be parameterized in a very simple form.

The Mauna Loa CO2 data clearly support that hypothesis (Fig. 1 and Fig. 2). And the result is that, given the latest projections of CO2 emissions, future CO2 concentrations will not only be well below the RCP8.5 scenario, but might not even be as high as RCP4.5, with atmospheric CO2 concentrations possibly not even reaching a doubling (560 ppm) of estimated pre-Industrial levels (280 ppm) before leveling off. This result holds even without future reductions in CO2 emissions, which is a possibility as new energy technologies become available.

I think this is at least as important an issue to discuss as the implausibility (impossibility?) of the RCP8.5 scenario. And it raises the question of just how good the carbon cycle models are that the UN IPCC depends upon to translate anthropogenic emissions into atmospheric CO2 concentrations.


119 thoughts on “Will Humanity Ever Reach 2XCO2? Possibly Not”

  1. Good posting, Charles, of an interesting data and thought piece from Dr. Roy. This scenario, wherein photosynthesis utilizes increasing CO2 to enhance vegetation, is good news for those of us who like a good salad with our beef. I’m thinking now we should start a counter-revolution, basically “Keep Your Hands Off MY CO2!”

    • Dr Ed Berry has this important paper in preprint. I am still digesting it – yummy but quite a mouthful.

      Some are quick to dismiss it – far too quick, imo. Berry is probably a lot smarter than they are.

      Best, Allan 🙂

      From the Abstract:

      “Human emissions through 2019 have added only 31 ppm to atmospheric CO2 while nature has added 100 ppm. If human emissions were stopped in 2020, then by 2100 only 8 ppm of human CO2 would remain in the atmosphere.”

      PREPRINT: “THE PHYSICS MODEL CARBON CYCLE FOR HUMAN CO2”
      by Edwin X Berry, Ph.D., Physics
      https://edberry.com/blog/climate/climate-physics/human-co2-has-little-effect-on-the-carbon-cycle/

      ABSTRACT
      The scientific basis for the effect of human carbon dioxide on atmospheric carbon dioxide rests upon correctly calculating the human carbon cycle. This paper uses the United Nations Intergovernmental Panel on Climate Change (IPCC) carbon-cycle data and allows IPCC’s assumption that the CO2 level in 1750 was 280 ppm. It derives a framework to calculate carbon cycles. It makes minor corrections to IPCC’s time constants for the natural carbon cycle to make IPCC’s flows consistent with its levels. It shows IPCC’s human carbon cycle contains significant, obvious errors. It uses IPCC’s time constants for natural carbon to recalculate the human carbon cycle. The human and natural time constants must be the same because nature must treat human and natural carbon the same. The results show human emissions have added a negligible one percent to the carbon in the carbon cycle while nature has added 3 percent, likely due to natural warming since the Little Ice Age. Human emissions through 2019 have added only 31 ppm to atmospheric CO2 while nature has added 100 ppm. If human emissions were stopped in 2020, then by 2100 only 8 ppm of human CO2 would remain in the atmosphere.

      • A fraction of a fraction… and with positive effect.

        Climate (i.e. 30 year period) change may be real given large natural variance. Anthropogenic climate (i.e. persistent) change over local and perhaps regional frames is plausible. Catastrophic [anthropogenic] climate change is improbable.

      • Allan

Willem Nel’s PhD thesis at the University of Johannesburg, which I have pointed to previously, points out that the problem is not peak oil or peak coal or peak fossil fuels, it is peak energy. We simply cannot get our hands on enough fossil fuels to double the atmospheric concentration.

First, as pointed out by a paper which you cite, we are not making as much difference as Nature, mostly warming oceans, which started 200 years ago. Second, we can’t burn everything at once – it takes time. Also, the fuels are harder and harder to reach. The absorption rate increases with time.

Typically when the global temperature rises 2 or 3 C above the current level, it starts raining regularly in the Gobi and Sahara Deserts. They turn into grasslands and the total area absorbing CO2 dramatically increases. It is not all that long ago that the Sahara was wet. The water Libya was scolded for pumping out of the desert and sending north for irrigation is “5000 years old”. Really!? Yup. Reversing desertification requires warmth and shifts in the winds. There was a major shift in the winds in 1868 but I don’t know if it is sufficient to start the roller coaster of moisture from West to East in the Sahara. Maybe another shift is still coming.

        • Good points, thank you Crispin.

          I have not spent much time on the important scientific hypothesis of Salby, Berry and Harde because I do not need it to falsify the very-scary CAGW and the “wilder weather” hypotheses. My recent paper falsifies these scares ~25 times, but as Albert Einstein famously stated “One would be enough”. Climate sensitivity to increasing atmospheric CO2 is far too low (less than ~1C/doubling) to cause dangerous global warming or wilder weather. End of crisis.

          “The Catastrophic Anthropogenic Global Warming (CAGW) and The Humanmade Climate Change Crises Are Proved False”
          By Allan M.R. MacRae, B.A.Sc.(Eng.), M.Eng., January 10, 2020
          https://thsresearch.files.wordpress.com/2020/01/the-catastrophic-anthropogenic-global-warming-cagw-and-the-humanmade-climate-change-crises-are-proved-false.pdf

          Also, your cited reference (Willem Nel’s PhD thesis) states that atmospheric CO2 is unlikely to reach doubling due to fossil fuel combustion.

          My conclusion is there never was a credible CAGW/wilder weather crisis – it was always a false crisis, manufactured by wolves to stampede the sheep.

      • So Spencer is saying one thing, and the paper is saying about the opposite. Spencer seems to be saying our emissions are being sunk. Which is about the opposite of what the paper might be saying. The paper would overturn a lot. A lot would have to be redone. I think if you were to ask, over 3/4s of commenters are going to go with Spencer’s take even after the paper is published.

      • Berry: “Human emissions through 2019 have added only 31 ppm to atmospheric CO2 while nature has added 100 ppm.”

Given that dissolved CO2 levels in the ocean are rising, in other words it’s a net sink, what is the source of this “natural” 100ppm?

Methane levels and nitrous oxide levels have been following the same steeply rising curve since 1750 as CO2, after having been more or less flat for thousands of years. Is this merely a coincidence?

        https://d32ogoqmya1dw8.cloudfront.net/images/integrate/teaching_materials/food_supply/student_materials/graph_depicting_concentration_green_744.png

        Graph from here: https://www.ipcc.ch/site/assets/uploads/2018/02/ar4-wg1-chapter2-1.pdf

        • Yes.
          “…what is the source of this “natural” 100ppm?”

Endorse this argument or not, people. People here have a lot of opinions. Let’s have yours on this. Don’t avert your eyes.

        • “what is the source of this ‘natural’ 100ppm?”

As temperatures rise, more CO2 is sequestered and re-released through short-term sinks vs. long-term sinks.

          The balance between long term and short term CO2 sinks shifts as the climate warms.

However, there is no mention of the 50 to 1 partitioning of CO2 into the waters of the world or the ~5 year half-life of CO2 in the atmosphere. This makes CO2 a very dynamic factor and our input very temporary. It is disingenuous to pretend that our ever-increasing emissions are having any effect on atmospheric CO2 as it continues to increase almost linearly. The oceans are the controller of the CO2 concentration, not our emissions. It is known that there is a serious lag between climate temperature changes and atmospheric CO2 changes. Looking at it year to year or decade to decade is truly myopic.

      All of this is meaningless, as no gas at any concentration can detectably warm the climate, as all gases in the upper tropical troposphere (which is the region that is supposed to be warming Earth’s surface) are colder than the surface.

In particular, IR from CO2, which emits IR at -80ºC, cannot warm the surface anywhere. In addition, during the day, CO2 indeed can absorb light equivalent to 400 and 800ºC, but it serves to absorb and re-emit this energy in all directions, which means it is short-stopping insolation and redirecting it to space, which serves to cool the Earth. CO2 is simply not an issue to our climate and means more plant food for our planet. All of the discussion regarding IPCC models for CO2 is baseless when one starts with what CO2 cannot do.

Residence times of 5 to 7 years are nonsense. The atmosphere is not just a mixing chamber. There is another mixing chamber in the oceans, with huge fluxes between these reservoirs. And then there is the biosphere.

    • The facts speak for themselves.

      Excerpts from published article, plus my critique of said:

      an empirical El Nino/La Nina term since El Nino is associated with higher CO2 in the atmosphere, and La Nina produces lower concentrations.

      Yep, in the equatorial Pacific, …… an El Nino describes unusually warm ocean waters ….. and a La Nina describes unusually cool ocean waters, ……. thus an El Nino causes an outgasing of CO2 into the atmosphere and a La Nina causes an ingassing of CO2 from the atmosphere. Henry’s Law

      As we shall see, it is entirely possible that even if we continued producing large quantities of CO2, it is possible for CO2 levels in the atmosphere to eventually stabilize.

      Given the fact that us humans have been producing (emitting) exponentially increasing quantities of CO2 since the 1940’s ……. without affecting the atmospheric CO2 ppm quantities, …… I seriously doubt that humans will have any effect on said “quantities” during the next 50 years.

      The FACTS speak for themselves, to wit:

      Increases in World Population & Atmospheric CO2 by Decade

      year — world popul. – % incr. — May CO2 ppm – % incr. — avg ppm increase/year
      1940 – 2,300,000,000 est. ___ ____ 300 ppm est.
      1950 – 2,556,000,053 – 11.1% ____ 310 ppm – 3.3% —— 1.0 ppm/year
      [March 03, 1958 …… Mauna Loa — 315.71 ppm]
      1960 – 3,039,451,023 – 18.9% ____ 320.03 ppm – 3.2% —— 1.0 ppm/year
      1970 – 3,706,618,163 – 21.9% ____ 328.07 ppm – 2.5% —— 0.8 ppm/year
      1980 – 4,453,831,714 – 20.1% ____ 341.48 ppm – 4.0% —– 1.3 ppm/year
      1990 – 5,278,639,789 – 18.5% ____ 357.32 ppm – 4.6% —– 1.5 ppm/year
      2000 – 6,082,966,429 – 15.2% ____ 371.58 ppm – 3.9% —– 1.4 ppm/year
      2010 – 6,809,972,000 – 11.9% ____ 393.00 ppm – 5.7% —— 2.1 ppm/year
      2019 – 7,714,576,923 – 11.7% ____ 414.66 ppm – 5.5% —— 2.1 ppm/year
      Source CO2 ppm: ftp://aftp.cmdl.noaa.gov/products/trends/co2/co2_mm_mlo.txt

      Based on the above statistics, to wit:

      Fact #1 – in the past 79 years – world population has increased 235% (5.4 billion people) – atmospheric CO2 has increased 37.3% (112 ppm)

      Fact #2 – human generated CO2 releases have been exponentially increasing every year for the past 79 years (as defined by the “yearly” population increase of 5.4 billion people).

      Fact #3 – the burning of fossil fuels by humans has been exponentially increasing every year for the past 79 years. (as defined by the “yearly” population increase of 5.4 billion people).

      Fact #4 – a biyearly or seasonal cycling of an average 6 ppm of atmospheric CO2 has been steadily and consistently occurring each and every year for the past 61 years (as defined by the Mauna Loa Record and Keeling Curve Graph).

      Fact #5 – atmospheric CO2 has been steadily and consistently increasing at an average yearly rate of 1 to 2 ppm per year for each and every year for the past 61 years (as defined by the Mauna Loa Record and Keeling Curve Graph), ….. regardless of what the world’s yearly population numbers were.

      Conclusions:

      Given the above statistics, it appears to me to be quite obvious that for the past 79 years (or the 61 years of the Mauna Loa Record) there is absolutely no direct association or correlation between:

      #1 – increases in atmospheric CO2 ppm and world population increases:

      #2 – the biyearly or seasonal cycling of an average 6 ppm of atmospheric CO2 and world population increases;

      #3 – the biyearly or seasonal cycling of an average 6 ppm of atmospheric CO2 and the exponential yearly increase in fossil fuel burning;

      #4 – the average yearly increase in atmospheric CO2 of 1 to 2 ppm and the exponential increase in fossil fuel burning;

      #5 – there is absolutely, positively no, per se, “human (anthropogenic) signature” to be found anywhere within the 61 year old Mauna Loa Atmospheric CO2 Record.

      #6 – this composite graph of 1979-2013 uah satellite global lower atmosphere temperatures and yearly May max CO2 accumulations is literal proof that green growing/decomposing NH biomass and/or near surface air temperatures have little to no effect whatsoever on atmospheric CO2 ppm quantities.

More plant food in the air not only enables more vegetation where it already grows, but allows greenery to expand into drier areas previously devoid of plants. When their stomata need to stay open for less time, while still taking in enough CO2 to meet their requirements for sugar synthesis, plants lose less water. The de-desertifying Sahel is a good example.

This is just one reason why CO2 sinks have grown more than some thought possible.

    • The average trend for the past decade was +2.5 ppm/year.
      https://www.esrl.noaa.gov/gmd/ccgg/trends/gl_gr.html
The NH summer drawdown of CO2 runs 5 months, May through end-September. The NH winter sees a CO2 increase over 7 months, October thru April. That rapid NH summer CO2 drawdown in just 5 months demonstrates that not only are the biological carbon sinks far from saturated, they are increasing uptake and keeping pace with the increasing total pCO2.

        • Go to this NOAA web page:
          https://www.esrl.noaa.gov/gmd/ccgg/trends/gl_trend.html

          Observe that the Barrow Alaska CO2 dynamic range is a full 20 ppm every year. While the whole of the tropics (MLO) to the Southern Hemisphere diminishes in annual dynamic range.

          This says that the Arctic regions (land and ocean) will keep absorbing CO2 the warmer it gets.
          Strong negative feedback. And they are just getting started with the end of the LIA.
          The IPCC’s Bern Model does not replicate, nor capture, any of this biological CO2 dynamic behavior.

          • This says that the Arctic regions (land and ocean) will keep absorbing CO2 the warmer it gets.

            The warmer it gets the less the oceans will be able to absorb. That’s why some here keep claiming that the rise in CO2 is caused by warming oceans.

      • Joel O’Bryan – February 2, 2020 at 10:31 am

        The NH summer drawdown of CO2 runs 5 months May through end-September.

        Now that’s what I would call “weazelwording”

        “HA”, …… so it is now an unscientific “NH summer drawdown” of atmospheric CO2, ……. rather than a scientific “NH summer of green-growing biomass ingassing” of atmospheric CO2.

        The BIG problem with the claim of “NH summer drawdown of CO2 between mid-May and end of September” ……. is that said “drawdown” of CO2 by the NH green-growing biomass ACTUALLY begins in the lower latitudes in January (not May) and progresses thru the higher latitudes (as the near-surface temps increase) until mid to late June.

And don’t be fergettin, as those “Spring” temps increase thru the higher latitudes, microbial decomposition of the dead biomass follows suit, …… and CO2 is being outgassed into the atmosphere by microbes, ……. before it is being ingassed by the green growing biomass.

        You don’t have to believe me, ….. but you have to believe your nose, ….. to wit:

        Ooooh, that smell! Odors rise with the temperature

        Your nose doesn’t lie – odors intensify in the warm summer months, be they of rotting garbage on the sidewalk or fragrant flowers blooming in a garden.

        The combination of heat and humidity allows bacteria to grow faster and smells to travel farther, said Victoria Henshaw, who researches urban smells throughout the world.

        Read more here

        I can lead someone to water, ……… but I can’t make them drink. 😊

I imagine some plants are more sensitive to CO2 concentrations than other plants. Essentially, are existing CO2 levels bottlenecking the total biomass of certain plant species that take up more CO2 than other plant species? If so, then CO2 concentration growth will be reduced by such plants as their population relative to other plants increases.

  3. A compelling argument, thank you Dr Spencer. And it didn’t need <$100k of grant money with umpteen co- authors, which seems to be the norm in Climate Science these days.

It would be a good thing to have 600 ppm, or is that 500 or 800, CO2 in the atmosphere. Plants, Canadians and Russians would be very happy.

    • And the Swedes would be happy too.

      (Except for the ones that had their dreams and their childhoods stolen!)

Models that forecast geometric growth (a constant compounded value like 0.6% per year) rarely turn out to be very accurate, and they invariably overestimate. I have formulated reasons for this, but growth of anything is nearly always a logistic-type curve, which is indistinguishable from geometric growth early but soon flattens out.

I had a look back at 1960s and 1970s projections of electrical energy demand up to year 2000, and depending on assumptions, none of which actually pertained to the true history, one could get any answer one wished. The totality of them looked exactly like the collection of curves in the figure above. The amazing thing about these projections was the level of experience and learned formulae involved. No matter. Still wrong.
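The commenter's point about logistic vs. geometric growth can be illustrated with a short sketch (the growth rate and ceiling are arbitrary illustrative values, not taken from the comment): the two curves are nearly identical early on, then the logistic flattens.

```python
import math

def geometric(t: float, g: float = 0.006) -> float:
    """Constant compounded growth, normalized to 1 at t = 0."""
    return math.exp(g * t)

def logistic(t: float, g: float = 0.006, cap: float = 2.0) -> float:
    """Logistic curve with the same initial growth rate g,
    but saturating at 'cap' instead of growing without bound."""
    r = g * cap / (cap - 1.0)  # rate chosen so the slope at t=0 matches g
    return cap / (1.0 + (cap - 1.0) * math.exp(-r * t))

# After 10 years the two curves differ by well under 1%; for large t
# the logistic levels off at 'cap' while the geometric curve diverges.
```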

Just like we were surprised to see how much CO2 nature has absorbed, we will be equally amazed how much more can be absorbed. Mother Nature is amazingly resilient. We are learning new things every day, and I thank Roy for quantifying and graphing current thinking. While not limitless, we now know nature’s ability to “sink” excess CO2 is huge, and to put it in “alarmist rhetoric” … nature’s CO2 sinks are alarmingly efficient.

Many forests could greatly increase their carbon sequestration for the following reason: many forests have been abused, which has lowered their sequestration potential. I can speak with confidence of the forests in the US Northeast- many were “high graded”, where they cut the best and left the rest- with the rest usually being diseased and injured trees along with species which don’t live long or don’t grow fast. With better forest mgt.- that is, the application of silviculture, taught in all forestry schools- we can convert many forests into far more productive condition- not only sequestering more carbon but also producing valuable timber for the owners- which will encourage them to retain the forests as forests rather than selling to developers. Unfortunately, climate alarmist extremists here in Massachusetts think it’s wiser to lock up all the forests.
      Joe Zorzin
      MA Forester License #261
      https://www.youtube.com/user/JoeZorzin/videos (my 3 forestry videos and one of the construction of hideous solar farm)

      • I took 20 minutes to watch two of your videos. They are great. Informative and fun to watch. Thanks.

        In about six weeks I’ll be hiking back into the woods around here, at least below 9500 ft. And your videos made me anxious for then. The higher country around here, up to 12,000 feet, must wait until June.

      • Excellent educational videos. Unfortunately, tree huggers are mostly immune to reality and believe any human involvement in managing a forest to be bad bad bad. I like to point out to them that ancient Indians managed the forest with the use of fire – one result being, out here in the real Northern California area, big beautiful forests with large trees and plentiful spaces between them.

        Keep up the great work.

      • Joe excellent information.
        My uncle used to own a small sawmill in Marysville Victoria.
        They ran a successful business based on 100 selected trees per year.
        Do you or the logger choose the trees or is it a mutual decision?
        Thanks
        Waza

Waza, my role is as a “forestry consultant”. I have a BS in forest mgt. from U. Mass. Been doing this since Nixon was in the White House. I work for the forest owner.

          I prepare long term mgt. plans, which are approved by the state. When it’s time for a timber harvest- I set up the entire process- I mark every tree to be cut at head height and on the stump as a control- the marking is based on the science of silviculture- it all depends on forest type, age, quality, etc. I measure every tree; calculate timber volumes by species and for every 2″ diameter class and produce a spreadsheet with that information; which I then send out to all possible loggers in the area. I then have a “timber showing”- kinda like a real estate showing where I discuss relevant issues such as what to not do, when to start and when to stop. I’ll have a contract with me to show them. I also prepare a state required “forest cutting plan”. The state of Mass rigorously regulates all water resources- that is, stream crossings, stream buffers, swamps, etc. After getting bids from loggers I talk with the owner and recommend which company to sell to- usually the highest since I only send bid prospectus to dependable loggers. I’ll hold a performance bond and oversee the work.

          It’s all tightly controlled- yet, the crazies here want to stop all logging because they say it’ll allow the forests to maximize carbon sequestration- not listening to me when I describe what I said in my original post- that most forests are in poor condition- and they’ll sequester more carbon if managed long term- while producing wealth for the owner and society due to taxes. In Wendell State Forest in the western part of the state- dozens of fanatics chained themselves to trees and logging machines. Many were arrested but they only got a slap on the wrist. The loggers took a loss because it slowed down their work- and now they’re suing the state for not protecting them.

          I only recently learned that it was the state of Mass. that sued the EPA to force it to declare a “finding” that carbon is a pollutant. And, I noticed that the coauthor for Michael Mann for his hockey stick paper was a researcher at U. Mass. in Amherst. Yes, Massachusetts has some of the craziest of the crazies when it comes to climate alarmism and tree cutting.

      • “Unfortunately, climate alarmist extremists here in Massachusetts think it’s wiser to lock up all the forests.”

        Until, of course, someone wants to build a wind farm or solar farm on that forest land. THEN, the Eco-Fascists will allow the trees to be sacrificed, you know, to “save the planet.”

  7. The alarmists insist that the residency of CO2 in the atmosphere is measured in decades. On the other hand, the annual variation in CO2 has a slew rate that indicates otherwise. On an annual basis, CO2 is removed from the atmosphere at a rapid rate. A decades long atmospheric residency of CO2 doesn’t pass the smell test.

If only it were decades. Back around last March, in a BBC news interview, Mickey Mann claimed that CO2 stays in the atmosphere for thousands of years. I’ve been trying to find a clip, but no luck.

  8. I’m sure analyses like this one will be diligently followed up and expanded upon by Alarmists… hoping against hope that it shows some light at the end of the short CO2 Disaster Tunnel.

It’s such wonderful news that GATs are unlikely to gain more than 1.5 °C post-industrialization, and probably not even that is possible…assuming all the current warming since 1940 was from CO2 emissions…and assuming CO2 emission trends hold steady…and assuming empirical ECS values continue through 2100.

    Instead of wasting 10’s to 100’s of $Trillions on the Climate Crisis and driving the world economies into crisis, we can devote a lot more resources at tackling the tough environmental issues, and provide clean water and energy to the third world to overcome poverty and slow then halt population growth without coercion. Win, win, win.

    I’m sure all the Alarmists will be relieved that all the runaway scenarios are basically impossible since the Globe is doing a lot better at CO2 sequestration than early models suggested.

    I can almost hear the celebrations now…… crickets….. crickets….. crickets…. chirp.

  9. “In my experience, the main complaint about the current model will be that it is “too simple” and therefore probably incorrect.”

    In my experience, complicated models of a system with many unknowns and interactions require too many assumptions to be correct, and therefore are probably wrong.

    I’m a fan of simpler is better, where the best-fit simple model is the first-order solution. To achieve better results, incrementally add second-order effects until they have little to no effect. For example, the planet’s average relationship between the surface temperature and emissions at TOA, from pole to pole, is so close to the behavior of a gray body with an emissivity of 0.62 that higher-order corrections aren’t even required.

    Note that this simple model isn’t just an empirical curve fit to the data, but is a law of physics that fits the data and whose single unknown variable (the emissivity) can be both measured from the top down and calculated from the bottom up. It’s a shame that so many on both sides ignore this simple yet demonstrably correct model because they don’t understand the implications of Occam’s Razor.
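    The 0.62 gray-body relationship described above is easy to check numerically. The sketch below is my own illustration, not part of the comment; it multiplies the 288 K surface emission by the quoted emissivity:

```python
# Quick check of the gray-body claim: a 288 K surface emitting through an
# effective emissivity of 0.62 should roughly reproduce the ~240 W/m^2
# observed leaving the top of the atmosphere.
SIGMA = 5.67e-8            # Stefan-Boltzmann constant, W/m^2/K^4

T_surface = 288.0          # mean surface temperature, K
emissivity = 0.62          # effective emissivity quoted in the comment

surface_flux = SIGMA * T_surface ** 4     # ~390 W/m^2
toa_flux = emissivity * surface_flux      # ~242 W/m^2
print(round(surface_flux, 1), round(toa_flux, 1))
```

    The result, roughly 242 W/m^2, is consistent with the ~240 W/m^2 of outgoing longwave radiation measured at TOA.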

    • My carbon cycle model, which I refer to below, contains 26 equations. There is only one empirical parameter, based on the permille value of the atmosphere. Is the model “too simple”? It can calculate the three CO2 fluxes and amounts, namely anthropogenic, natural, and their sum (total CO2). I think that it is the only carbon cycle model that can do so.

    • @Henry
      “but there are some people who did the ‘wrong’ calculation, using Planck.

      “using Planck” what? relation? postulate? law? constant? current? density? energy? length? mass? time? epoch? crater? Those are all terms which, when preceded by “Planck”, denote the many eponymous concepts, accomplishments and honors attributed to this great scientist.

      I think you meant “Planck’s Law,” which Planck developed (empirically) to quantify the spectral radiance of a so-called “black body” at a given absolute temperature.

      Here are the spectral BB distributions for the Sun (~5800K) and the Earth (~250K).
      https://www.acs.org/content/acs/en/climatescience/energybalance.html
      They are both “good” BB’s, in the sense that, at their respective Planck Law temperatures, the predicted radiances are a good fit for the observed radiances.

      So your comment about “radiation at 4.2-4.3 [plain numbers?]” is wrong, because each BB has
      its own characteristic temperature and distribution, according to Planck’s Law.

    • A radiating body as quantified by the Stefan-Boltzmann Law doesn’t require a Planck spectrum. The SB Law relates W/m^2 to temperature, independent of the spectral properties of those W/m^2.

      In a formal sense, the selective attenuation of specific wavelengths from an ideal Planck spectrum turns a black body radiator into a gray body radiator. A common misconception is to restrict a gray body to one where the Planck spectrum of an ideal BB is uniformly attenuated. The attenuation resulting in grayness is also independent of wavelength, given that the SB Law is itself independent of spectral properties.

      • A radiating body as quantified by the Stefan-Boltzmann Law doesn’t require a Planck spectrum.

        A curious statement. It is an observable fact that all radiating bodies have spectra which follow the law of thermal radiation. But you seem to be saying they do not “require” them?

        Law of Thermal Radiation (Kirchhoff, 1859)

        For a body of any arbitrary material emitting and absorbing thermal electromagnetic radiation at every wavelength in thermodynamic equilibrium, the ratio of its emissive power to its dimensionless coefficient of absorption is equal to a universal function only of radiative wavelength and temperature. That universal function describes the perfect black-body emissive power.

        So Kirchhoff predicted, in 1859, the abstract substance of Planck’s Law: EMR energy dependent on wavelength and temperature. (Kirchhoff was one of Planck’s PhD advisors; another was Helmholtz.) Planck acknowledged his debt to Kirchhoff in his 1900 paper.
        The quantum nature of light (as discrete packets with energy proportional to frequency) was proposed in 1905 by Einstein, which led to his Nobel Prize in 1921 for the photoelectric effect.

        S-B Law merely establishes that electromagnetic energy is proportional to 4th power of temperature. It can be derived using Planck’s Law.

        The laws of physics are often based on “ideal” models that do not really exist, e.g. ideal gases and black bodies. So all real-world objects tend to be ‘gray bodies’ because they do not perfectly absorb and emit all EMR. Planck’s Law provides that real-world spectra must be bounded (‘attenuated’ as you say) to prevent the Ultraviolet Catastrophe.
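        As a sanity check on the point that the S-B Law can be derived from Planck’s Law, a crude numerical integration of the Planck spectral emissive power over wavelength should recover σT⁴. This is my own illustrative sketch with textbook constants, not part of the comment:

```python
import math

# Textbook constants (approximate)
h = 6.626e-34      # Planck constant, J*s
c = 2.998e8        # speed of light, m/s
k = 1.381e-23      # Boltzmann constant, J/K
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W/m^2/K^4

def planck_emissive_power(lam, T):
    # hemispherical spectral emissive power of a black body, W/m^2 per metre of wavelength
    return (2.0 * math.pi * h * c ** 2) / (lam ** 5 * math.expm1(h * c / (lam * k * T)))

def total_power(T, n=20000):
    # trapezoidal integration on a log-spaced wavelength grid, 0.1 um .. 1 mm
    lo, hi = 1e-7, 1e-3
    total = 0.0
    prev_lam, prev_val = lo, planck_emissive_power(lo, T)
    for i in range(1, n + 1):
        lam = lo * (hi / lo) ** (i / n)
        val = planck_emissive_power(lam, T)
        total += 0.5 * (val + prev_val) * (lam - prev_lam)
        prev_lam, prev_val = lam, val
    return total

# The integral should match sigma*T^4 (~390 W/m^2 at 288 K) to well under 1%
print(total_power(288.0), SIGMA * 288.0 ** 4)
```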

        • Johanus,

          The Stefan-Boltzmann Law is independent of the spectral characteristics of the emissions. It’s a wavelength-independent relationship between W/m^2 and temperature. For example, the 255K equivalent temperature of the planet corresponds to the average 239 W/m^2 of emissions, while those emissions have many spectral gaps owing to absorption.

          A single wavelength laser beam has an equivalent temperature as well. If you have a laser generating a power density of 390 W/m^2 and put a thermometer in the beam, it will measure a temperature of 288K.

          Your definition of a gray body is incorrect. Reflecting energy away does not make a body gray since reflected energy does not contribute to the temperature. A gray body is one that absorbs (and emits) less energy than the SB law says is required to sustain its temperature. This requires a layer between the ideal BB whose temperature we care about and its environment. This layer absorbs energy emitted by the BB preventing it from being released into the environment and instead, that energy is returned to the emitting black body to offset emissions that are greater than can be supported by the incident energy. Sound familiar?

          • S-B Law is a way (derivable from Planck’s Law) to calculate the total energy flux from a black-body given its peak Wien displacement temperature. You see the energy flux predicted by S-B as the area under the black-body spectrum in the original link in my post above:
            https://www.acs.org/content/acs/en/climatescience/energybalance.html

            Wien himself showed in 1893 the dependence on frequency:
            https://en.wikipedia.org/wiki/Wien%27s_displacement_law#Frequency-dependent_formulation
            https://en.wikipedia.org/wiki/Wien%27s_displacement_law#Discovery

            The law is named for Wilhelm Wien, who derived it in 1893 based on a thermodynamic argument.[4] Wien considered adiabatic expansion of a cavity containing waves of light in thermal equilibrium. He showed that, under slow expansion or contraction, the energy of light reflecting off the walls changes in exactly the same way as the frequency. A general principle of thermodynamics is that a thermal equilibrium state, when expanded very slowly, stays in thermal equilibrium. The adiabatic principle allowed Wien to conclude that for each mode, the adiabatic invariant energy/frequency is only a function of the other adiabatic invariant, the frequency/temperature. A modern variant of Wien’s derivation can be found in the textbook by Wannier.[5]

            Again, S-B Law can be derived from Planck’s Law:
            https://en.wikipedia.org/wiki/Stefan%E2%80%93Boltzmann_law#Derivation%20from%20Planck's%20law

            Of course Stefan and Boltzmann did not use Planck’s Law to derive their formula in 1884. They used statistical mechanics and radiation pressure derived from the Maxwell Relations to link energy flux to radiation pressure (first done in 1876 by Bartoli).
            https://en.wikipedia.org/wiki/Radiation_pressure#Compression_in_a_uniform_radiation_field
            The consequence is that the shape of the black-body radiation function (which was not yet understood) would shift proportionally in frequency (or inversely proportionally in wavelength) with temperature. When Max Planck later formulated the correct black-body radiation function it did not explicitly include Wien’s constant b. Rather, Planck’s constant h was created and introduced into his new formula.

            This can also be shown in the specific case of the pressure exerted on surfaces of a body in thermal equilibrium with its surroundings, at a temperature T: The body will be surrounded by a uniform radiation field described by the Planck black-body radiation law, and will experience a compressive pressure due to that impinging radiation, its reflection, and its own black body emission. From that it can be shown that the resulting pressure is equal to one third of the total radiant energy per unit volume in the surrounding space.

            @you
            “Your definition of a gray body is incorrect. Reflecting energy away does not make a body gray since reflected energy does not contribute to the temperature. ”

            How else would you be able to see gray bodies with your eyes if they did not reflect light? Recall that black-bodies are invisible because they absorb all incoming radiation and emit nothing.
            Yes, the energy reflected by clouds has no action on the temperature of the Earth. But neither does the energy directly reflected by every piece of earth (rock, water, organic mass etc) visible from outer space. It is all part of Earth’s _albedo_. (Otherwise the surface of the planet would be invisible from space)

          • Correction to my comment above: black-bodies are invisible because they absorb all incoming radiation and reflect nothing.

          • Johanus,

            Black bodies are not invisible. LWIR photons are light just like visible photons. If a black body was receiving a powerful visible Planck spectrum across its ENTIRE surface, it would be emitting the same spectrum that it’s receiving. In fact, from a distance, it would be impossible to tell whether it was reflecting everything or absorbing and re-emitting everything unless the incident energy was shut off. Reflection is instant, while re-emissions are subject to a finite time constant.

            Black bodies in the steady state will always emit the same W/m^2 that they are absorbing. Earth’s emissions are invisible because visible light falls on average across only 1/4 of the surface, therefore, the steady state emissions will have an average wavelength 4 times longer than the incident light and which is outside of our visual perception.

            ” If a black body was receiving a powerful visible Planck spectrum across its ENTIRE surface, it would be emitting the same spectrum that it’s receiving. ”

            The term “black-body” is an abstraction of an ideal thermal radiator which absorbs all of the light striking it, reflecting nothing of the original spectrum. Black-bodies can also radiate light in a continuous spectrum centered around a peak specified by its Wien displacement temperature, which is also the temperature used by S-B Law.

            So, assuming Earth is a B-B, light absorbed by the Earth will be converted to heat, raising the temperature above absolute zero. That temperature is determined by the incoming flux, according to S-B Law. So it seems to me the only way that B-B radiation would have the same spectrum as the Sun would require the received flux to be 63 million watts per square meter, same as the Sun, to create a B-B temperature of 5800K. But it doesn’t have that kind of B-B flux. It’s less than 1366 watts per square meter divided by four at TOA, resulting in 250K-288K or so (depending on which warming theory you believe).

            Going back to the real world. Earth is not a perfect B-B, but can reflect (i.e. redirect) some of the original radiation back into space (about 30% of it). That is not the same as B-B radiation using S-B Law.

            Perhaps we’re in agreement here and I’m just not understanding clearly what you are saying.

            So the only way

          • [Sorry, that last bit was garbled].

            So I think you do accept that radiating bodies do possess (and are required to have) a spectrum, which is specified by the Planck Radiation Law, which has both frequency and temperature as inputs (as predicted by Kirchhoff). The output is somewhat bounded exponentially to avoid the so-called “ultraviolet catastrophe”.

            Are we in agreement with that?

          • Yes, an ideal BB will radiate an ideal Planck spectrum according to its temperature, regardless of the spectral composition of its incident radiation. The point I was making was that the relationship between total W/m^2 and temperature is independent of the spectral composition of those W/m^2. For example, an active radiator like a laser can have an arbitrary spectrum and temperature.

            The solar spectrum is uniformly attenuated by 1/r^2, so the temperature of its W/m^2 as they arrive at Earth is about 120C, corresponding to 1366 W/m^2. However, since those W/m^2 fall on 4x the area on which they arrive, the average is only 5.5C, corresponding to 341.5 W/m^2. Since 30% is reflected, the resulting 240 W/m^2 has an equivalent temperature of 255K, or about -18C. Note that none of this corresponds to the planet’s ‘grayness’.

            The grayness is along the output path from the surface to space, where the surface is close to an ideal BB emitting a Planck spectrum at an average temperature of about 288K, corresponding to about 390 W/m^2. At TOA, only 240 W/m^2 leaves, in order to offset the 240 W/m^2 arriving from the Sun. The atmosphere attenuates the 390 W/m^2 by a factor of about 0.62 to about 240 W/m^2, where the 0.62 is functionally equivalent to the emissivity of a gray body. The attenuation is spectrum specific, and it’s this attenuation that contributes to the planet’s apparent grayness, where grayness is defined as the W/m^2 emitted by the planet, relative to the temperature of the source of the radiation that’s ultimately leaving the planet.
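            The chain of equivalent temperatures quoted above (about 120 C at 1366 W/m^2, about 5.5 C at 341.5 W/m^2, and 255 K at 240 W/m^2) follows from inverting the Stefan-Boltzmann law. A small sketch, my own illustration using the solar constant and albedo quoted in the thread:

```python
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/m^2/K^4
S0 = 1366.0          # solar constant at Earth's orbit, W/m^2 (value from the comment)
albedo = 0.30        # fraction reflected, as quoted

def sb_temperature(flux):
    # invert the Stefan-Boltzmann law: T = (F / sigma)^(1/4)
    return (flux / SIGMA) ** 0.25

incoming = S0 / 4.0                   # spread over the whole sphere: ~341.5 W/m^2
absorbed = incoming * (1.0 - albedo)  # ~239 W/m^2 after reflection

# ~394 K (~121 C), ~279 K (~5.5 C), ~255 K (~-18 C)
print(sb_temperature(S0), sb_temperature(incoming), sb_temperature(absorbed))
```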

          • Yes, an ideal BB will radiate an ideal Planck spectrum according to its temperature, regardless of the spectral composition of its incident radiation. The point I was making was that the relationship between total W/m^2 and temperature is independent of the spectral composition of those W/m^2.

            I think you are confusing irradiance (received flux, measured in watts per unit area) with radiance (emitted flux, measured in watts per solid angle).

            Planck’s Law applies only to radiance, as you applied it in your incorrect reply above:

            A radiating body as quantified by the Stefan-Boltzmann Law doesn’t require a Planck spectrum.

            ” For example, an active radiator like a laser can have an arbitrary spectrum and temperature.”
            Planck’s Law only holds for bulk matter, and is violated at nanoscale. Laser emissions are the result of stimulated radiation, where an incoming photon stimulates the creation of an exact clone of the photon, including phase, resulting in single-frequency (monochromatic) coherent light. This is only possible because Pauli Exclusion does not apply to photons, which are bosons, not fermions.
            https://physicsworld.com/a/plancks-law-violated-at-the-nanoscale/
            Bulk matter is made of fermions, required to have a distribution of Maxwell-Boltzmann states, which leads to a distribution of spectral frequencies.

            ” …. where grayness is defined as the W/m^2 emitted by the planet, relative to the temperature of the source of the radiation that’s ultimately leaving the planet.”

            Again, “W/m^2” does not apply to emissions; it is only used to quantify irradiance.

            A grey-body is simply an imperfect black-body that does not absorb all incoming radiation. That is why you are able to see objects in the real world. It is merely reflected incident light. The Earth is a good B-B emitter for long-wave IR. But that’s invisible to the eye. The objects you see with your eyes are illuminated by reflected light from the Sun, fire, lightning or man-made lighting.

            “… it’s this attenuation that contributes to the planets apparent grayness, where grayness is defined as the W/m^2 emitted by the planet, relative to the temperature of the source of the radiation that’s ultimately leaving the planet.”

            So if you hold a pure gold nugget in your hand and illuminate it with a flashlight, explain why you are able to see it in an otherwise dark room. It is not a black-body, which would absorb _all_ of the flashlight’s radiation. It is not a perfect reflector, because of the gold color, so _some_ radiation is absorbed (and converted to heat).

            It is a grey-body. Do you not agree? :-]

          • @me
            “Again, “W/m^2” does not apply to emissions, only used to quantify irradiance.”

            I need to clarify that it can apply to emissions only if you specify the distance from the radiator, whereas ‘watts per steradian’ etc needs no such qualification; it is a constant independent of distance.

          • Johanus.

            Reflection is not what makes a body gray. A gray body is more properly characterized as a 2-body system with an ideal black body absorbing and emitting W/m^2 according to its SB temperature and a graying layer between it and the environment that attenuates the emissions of the BB by a factor equal to the emissivity. The attenuated energy is returned to the black body which provides the offsetting energy required to support a warmer temperature than possible based on the incident energy alone.

          • A gray body is more properly characterized as a 2-body system with an ideal black body absorbing and emitting W/m^2 according to its SB temperature and a graying layer between it and the environment that attenuates the emissions of the BB by a factor equal to the emissivity.

            In standard physics a black body absorbs _all_ radiation hitting it, hence the name. A B-B emits an energy flux whose intensity (watts per solid angle) depends on frequency and absolute temperature, as predicted by Kirchhoff and confirmed by Planck’s Law.

            Grey Body:

            A body that does not absorb all incident radiation (sometimes known as a grey body) emits less total energy than a black body and is characterized by an emissivity < 1

            https://en.wikipedia.org/wiki/Stefan%E2%80%93Boltzmann_law

            But you seem to be claiming ‘grey body’ is defined as a ‘two body’ system. Can you provide a reference for this definition? Or is this your own definition? What are the “two bodies” for my gold-nugget example?

            Real-world objects have three properties that determine their classification: transmission τ, absorption α, and reflection ρ.

            “transparent”: τ=1 α=0 ρ=0
            “opaque”: τ=0 α=0..1 ρ=0..1 (α+ρ=1)
            “white body”: τ=0 α=0 ρ=1
            “black body”: τ=0 α=1 ρ=0
            “grey body”: uniform τ α ρ for _all_ wavelengths [are there any examples in real-world?]

            [“real-world body”: everything else ]
            https://en.wikipedia.org/wiki/Black_body#Transmission,_absorption,_and_reflection

            I will admit that this definition, in Wikipedia, does not agree with my understanding of ‘grey body’, because there are no real-world objects that partially absorb or emit EMR at a uniformly constant rate at all wavelengths (DC to gamma rays and beyond).

          • Johanus,

            The wikipedia definition is wrong. If a body is absorbing less energy than its temperature would suggest, then it must have either an internal source of energy, or attenuated energy is returned to its origin.

    Think about it. How can a body have a temperature higher than the energy it’s absorbing and radiating can support? This is the salient characteristic of a gray body whose emissivity is less than 1. It’s not that it’s absorbing less, but that it’s emitting less.

            A gray body has a higher temperature than its radiant emissions suggest, while the wiki definition presumes that the temperature will be lower than the incident energy can support, since the claim is that grayness arises from energy being reflected away. Reality is exactly opposite to what wikipedia says. The wiki gray body requires an emissivity greater than unity!

  10. Beyond hypothesis is the near-obvious prediction that molten salt nuclear reactors will dominate future energy generation, and, even more likely, the prediction that electric vehicles will dominate the transportation sector and result in huge reductions in carbon emissions. It is also obvious that many are attempting to lower their carbon footprints.

    • Electric vehicles will not “dominate transportation” until they can draw their electric power from the road or some infrastructure built above it. Don’t hold your breath.

  11. This analysis would seem to support the President’s recent announcement at Davos that the U.S. will support the Trillion Trees initiative as being sound and rational environmental policy. I guess the Orange Man Bad collective are going to have another aneurism. Especially when they belatedly realize that Trump is exploiting this to support forestry and redwood decking jobs as a “growth” element of his MAGA agenda : )

    • and some of the trillion trees can be burnt to keep people alive if the greentards manage to ruin the power supply.

    • Another open issue: cause and effect. CO2 originating from the surface, observed in the atmosphere, is likely to be forced by warming.

  12. I expect the natural removal rate of 2.33% of the part of atmospheric CO2 that is in excess of 295 PPMV to hold up only while this excess keeps growing at approximately the exponential rate it has been growing. After that, it will become apparent that the Bern model is correct (although I expect it to need some tweaking). In the Bern model, a pulse of CO2 injected into the atmosphere does not decay exponentially, but with a time constant that slows as time goes on. This happens as CO2 accumulates in the ocean below the upper ocean, causing the upper ocean’s “equilibrium level” that atmospheric CO2 exponentially decays towards to increase.

    While atmospheric CO2 is growing at a constant exponential rate above some baseline, the decay profile of each year’s contribution cannot be discerned (from the growth and natural removal of atmospheric CO2): it can neither be proven to follow the Bern model (or any other non-exponential decay model) nor disproven to follow exponential decay, even if natural removal is in fact happening according to the Bern model or a minor correction thereof.
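    The contrast between Bern-style decay and a single exponential can be made concrete with a short sketch. The coefficients below are a commonly quoted Bern-type impulse-response fit and are my assumption for illustration, not values given in this thread:

```python
import math

# Assumed Bern-style impulse response coefficients (a commonly quoted
# Bern2.5CC-type fit; a0 is the fraction that never decays on these time scales)
BERN = [(0.217, None), (0.259, 172.9), (0.338, 18.51), (0.186, 1.186)]

def bern_fraction(t):
    """Fraction of an emitted CO2 pulse still airborne after t years (Bern-type fit)."""
    return sum(a if tau is None else a * math.exp(-t / tau) for a, tau in BERN)

def single_exp_fraction(t, tau=43.0):
    """Single exponential decay toward the baseline; tau ~ 1/0.0233 per year."""
    return math.exp(-t / tau)

for t in (0, 10, 100, 500):
    print(t, round(bern_fraction(t), 3), round(single_exp_fraction(t), 3))
```

    Note that the Bern-type response never falls below its 0.217 permanent fraction, while the single exponential decays toward zero; as argued above, the two are hard to distinguish while emissions keep growing roughly exponentially.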

    • This has a surface temperature component.

      If the world fails to warm as predicted, the CO2 emitted from the oceans will be reduced, maybe a lot.
      I think the annual CO2 cycle is impacted by the average temperature of the globe. The NH has more land area and warms more quickly than the SH; the oceans continue to absorb CO2 via photosynthesis, probably at the same rate.

      If the solar cycle has a material temperature impact then even the CO2 level may exhibit a different trajectory.

  13. In pharmacokinetics, most drugs are eliminated in proportion to their concentration – known as ‘first order’ kinetics. This means that they behave as CO2 did for Dr Spencer, reaching an asymptote when input dose is constant. It is the assumption that the output of human CO2 becomes constant in 2050 that leads to the levelling off of CO2 concentration. Later or earlier will lead to higher or lower asymptotes. If CO2 does follow first order kinetics, concentration will level off once emissions reach a constant level, when all countries have developed to a decent standard of living.

    The common drug that does not have first order kinetics is alcohol. It is removed from the body at a constant rate, and with continued input at a steady rate continues to rise in a straight line – zero order kinetics.
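    The first-order versus zero-order distinction can be simulated with a one-compartment sketch. This is my own illustration; the 2.33%/yr removal rate echoes the fit in Dr Spencer’s post, and the other numbers are arbitrary:

```python
def simulate(years, dose, first_order_rate=None, zero_order_amount=None):
    """One-compartment yearly model: constant input, first- or zero-order removal."""
    level = 0.0
    for _ in range(years):
        level += dose
        if first_order_rate is not None:
            level -= first_order_rate * level            # removal proportional to level
        else:
            level = max(0.0, level - zero_order_amount)  # constant removal per year
    return level

# First-order: approaches an asymptote of dose*(1-k)/k (~41.9 doses with k=0.0233)
print(round(simulate(1000, 1.0, first_order_rate=0.0233), 1))
# Zero-order (alcohol-like): with input > removal, the level rises without bound
print(round(simulate(100, 1.0, zero_order_amount=0.5), 1))
```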

  14. Like the Democrats’ noxious political and crumbling vendetta against Trump, the Warmers’ case for Global Warming that leads to all of mankind dying off by 2032 smells increasingly like a pile of psychotic bullshit. When do these twisted and angry political charlatans finally surrender?

  15. Whether or not we reach 2XCO2, it’s now clear that humans will have little to do with it. Because details of this model are obscure, it’s difficult to tell what’s actually being done. But from the statement

    “The model assumes that the rate at which CO2 is removed from the atmosphere is proportional to the atmospheric excess above some natural “equilibrium level” of CO2 concentration.”

    it seems that the “equilibrium level” is assumed to be constant, except for the addition of human emissions. That is the assumption of the IPCC, which was a matter of convenience more than physical reasoning. Several treatments that do not impose this unphysical assumption make it clear that human emissions play only a minor role in increasing CO2. Underscoring that role is the fact that even when human emissions leveled off, atmospheric CO2 continued to rise.

    https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2997420

    https://youtu.be/b1cGqL9y548?t=41m52s

    https://edberry.com/blog/climate-physics/agw-hypothesis/human-co2-emissions-have-little-effect-on-atmospheric-co2-discussion/

    http://www.esjournal.org/article/161/10.11648.j.earth.20190803.13

  16. An interesting article: https://wattsupwiththat.com/2015/04/19/the-secret-life-of-half-life/ Willis Eschenbach found the natural removal rate of excess CO2 to be consistent with exponential decay towards a pre-industrial baseline of 283 PPMV with a time constant (tau or e-folding time) of 59 years. This indicates a half life of 41 years and a natural removal rate of 1.695% per year of CO2 in excess of 283 PPMV. I see this as not disprovable, and the Bern model also not disprovable (by atmospheric CO2 data), until the atmospheric CO2 level greatly deviates from exponential growth above some pre-industrial baseline.
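    The conversions among the quoted e-folding time, half life, and percentage removal rate are a one-liner each:

```python
import math

tau = 59.0                     # e-folding time, years (from the comment)
half_life = tau * math.log(2)  # ~40.9 years, i.e. the "half life of 41 years"
removal_rate = 1.0 / tau       # ~0.01695, i.e. ~1.695% per year
print(round(half_life, 1), round(100 * removal_rate, 3))
```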

    • “This indicates a half life of 41 years and natural removal rate of 1.695% per year of CO2 in excess of 283 PPMV”

      Should be reported as 1.7%. You are implying a level of precision that doesn’t exist.

    • Sorry, but just what is “excess CO2?” The “pre-industrial baseline” is poorly established relative to modern atmospheric measurements which began in 1958, and the assumption that this level was “static” is suspect at best.

    • “Willis Eschenbach came up with… a time constant (tau or e-folding time) of 59 years.”

      Disproven by carbon 14, which decayed on a time scale of a decade.

      https://edberry.com/blog/climate-physics/agw-hypothesis/human-co2-emissions-have-little-effect-on-atmospheric-co2-discussion/

      Since carbon 14 is continuously re-emitted into the atmosphere after being removed from it, the actual e-folding time of CO2 can only be shorter than a decade.

      https://youtu.be/rohF6K2avtY?t=58m25s

      As shown by Drs. Berry, Salby, and Harde, the rapid removal makes man’s involvement in increasing CO2 very small.

      • One has to distinguish between the removal rate and the turnover rate. The concentration of C14 diminishes faster but is replaced by C12 due to the faster turnover rate.

        I was first impressed by Salby’s presentations, but not so much anymore. He simplified things a lot, for example that the time derivative of CO2 correlates strongly with the temperature. I still ponder that, but he claims that therefore CO2 is solely determined by the integration of temperature. Yes, mathematically you can do that, but does that reflect what is really going on? I think these temperature correlations are a short-term effect, maybe due to increased activity from soil bacteria when it warms. Or the top level of the oceans. Analysis of ice core records shows a much slower response, of about 800 years. But this correlation effect is still worth investigating.

  17. Atmospheric CO2 will fall when the planet cools, as observations support the assertion that humans are responsible for less than 5% of the recent rise in atmospheric CO2. There is no significant AGW and there is no worry about ocean ph change.

    Salby, and close to a dozen peer reviewed papers using independent observations and analysis techniques, have shown that the recent change in atmospheric CO2 is tracking planetary temperature, not anthropogenic CO2 emissions.

    For this observation to be true, there must be an unaccounted-for, large source of CO2 and an accompanying large sink of CO2, in addition to the volcano-emitted CO2, which is assumed by the IPCC to be the only ‘new’ source of CO2 and water into the atmosphere.

    I included water as there is a recent discovery that three times more water is being dragged down into the mantle by sinking ocean plates than is coming out in volcanoes.

    This is an interesting summary of the monkey business concerning the creation of the IPCC Bern model and past cherry picking of CO2 data to create the CAGW paradigm.

    Carbon cycle modelling and the residence time of natural and anthropogenic atmospheric CO2: on the construction of the “Greenhouse Effect Global Warming” dogma.

    https://www.co2web.info/ESEF3VO2.pdf

    The IPCC’s Bern model of CO2 sources, sinks, and residence times (named after the city where the cult created the equation and its assumptions) assumes that ocean circulation (with hundreds of years delay) is the only method for deep sequestration of CO2 in the ocean.

    Quote from the above summary:

    “The alleged long lifetime of 500 years for carbon diffusing to the deep ocean is of no relevance to the debate on the fate of anthropogenic CO2 and the “Greenhouse Effect”, because POC (particulate organic carbon; carbon pool of about 1000 giga-tonnes; some 130% of the atmospheric carbon pool) can sink to the bottom of the ocean in less than a year (Toggweiler, 1990).”

    https://www.livescience.com/65466-bomb-carbon-deepest-ocean-trenches.html


    ‘Bomb Carbon’ from Cold War Nuclear Tests Found in the Ocean’s Deepest Trenches

    Bottom feeders
    Organic matter in the amphipods’ guts held carbon-14, but the carbon-14 levels in the amphipods’ bodies were much higher. Over time, a diet rich in carbon-14 likely flooded the amphipods’ tissues with bomb carbon, the scientists concluded.

    Ocean circulation alone would take centuries to carry bomb carbon to the deep sea. But thanks to the ocean food chain, bomb carbon arrived at the seafloor far sooner than expected, lead study author Ning Wang, a geochemist at the Chinese Academy of Sciences in Guangzhou, said in a statement.

    • William Astley: You say “as observations support the assertion that humans are responsible for less than 5% of the recent rise in atmospheric CO2”. What do you think caused the other 95% of the increase from ~280 to ~410 PPMV? CO2 did not get anywhere near 410 PPMV in the last few hundred thousand years, including times that were warmer than now due to Earth orbital & rotation mechanics favoring more insolation than now in regions that have high regional positive feedback.

  18. I published the first version of my carbon cycle model in 2015. The third version of this model was published Jan. 20th, 2020.
    The link to the paper: http://www.journalpsij.com/index.php/PSIJ/article/view/30168

    The link to the blog story: https://www.climatexam.com/copy-of-2018-1.

    There are two important results. The amount of anthropogenic CO2 in the atmosphere in 2017 was only 73 GtC, when the total increase of CO2 since 1750 was 270 GtC. This means that the portion of anthropogenic CO2 was 27% of the total increase, and the portion of anthropogenic CO2 of the atmospheric CO2 of 866 GtC was only 8.4%. The percentage of anthropogenic CO2 increase of the total CO2 increase according to Berry was 24% – very close to my result.

    These results deviate totally from Joos et al. (2013), which says that all of the CO2 increase in the atmosphere since 1750 is anthropogenic by nature. The IPCC has the same opinion. In AR5 (p. 467), the IPCC writes: “About half of the emissions remained in the atmosphere (240 PgC ± 10 PgC) since 1750.” The IPCC refers to Joos et al. (2001), which states that “Currently, only about half of the anthropogenic CO2 emission stays airborne.”

    Hardly any climate scientist has ever heard about the permille values of the atmosphere, even though they have been continuously measured at different locations. Simple mathematics shows that if there were 270 GtC of anthropogenic CO2, the atmosphere’s permille value would be about -13‰, but the real measured value is -8.5‰. Probably due to this huge difference, there are no references to the observed permille values or the model-calculated permille values in the study of Joos et al., and for this reason the IPCC is silent about this issue. Since the error is so massive, the correctness of this carbon cycle model can be questioned.

    The residence time of my carbon cycle model 1DAOMB-3 ranges from 12 years to 150 years, meaning that the full adjustment (decay) time would be about 600 years. My simulation results for 1000 GtC of CO2 emissions show a maximum CO2 level of 480 ppm. In this simulation, the emissions stay at a level of 10 GtC/yr until 2040 and then gradually decrease to zero by 2100.

    • Joos is one of two authors on their latest Bern Model (BernSCM) v1.0.
      https://www.geosci-model-dev.net/11/1887/2018/

      While their BernSCM looks sophisticated and full of science and math, it utterly fails to reflect reality. Yet no doubt the IPCC will use it in their AR6 as the basis for more alarming conclusions, conclusions completely devoid of reality.

    • In reply to:

      Hardly any climate scientist has ever heard about the permille values of the atmosphere, even though they have been continuously measured at different locations. Simple mathematics shows that if there were 270 GtC of anthropogenic CO2, its permille value would be about -13‰, but the real measured value is -8.5‰.

      You are looking at one piece of data which is not sufficient to understand the physical problem.

      There is a real paradox in the observations. Paradoxes should not exist.

      What we have done is ignored the paradoxes and pressed ahead with mathematical analysis when we are incorrect at the level of imagination/concepts.

      Climate scientists ignored other data they could not explain concerning the atmospheric C12/C13 ratio change.

      Tom Quirk’s analysis shows the permille value changing with El Ninos, which disproves the physical assumption of the Bern model that volcanic eruptions are the only new source of carbon into the atmosphere.

      https://www.livescience.com/65466-bomb-carbon-deepest-ocean-trenches.html

      Sources and sinks of CO2 Tom Quirk

      http://icecap.us/images/uploads/EE20-1_Quirk_SS.pdf

      The yearly increases of atmospheric CO2 concentrations have been nearly two orders of magnitude greater than the change to seasonal variation which implies that the fossil fuel derived CO2 is almost totally absorbed locally in the year that it is emitted.

      A time comparison of the SIO measurements of CO2 at Mauna Loa with the South Pole shows a lack of time delay for CO2 variations between the hemispheres that suggests a global or equatorial source of increasing CO2. The time comparison of 13C measurements suggest the Southern Hemisphere is the source.

      This does not favour the fossil fuel emissions of the Northern Hemisphere being responsible for their observed increases. All three approaches suggest that the increase of CO2 in the atmosphere may not be from the CO2 derived from fossil fuels. The 13C data is the most striking result and the other two approaches simply support the conclusion of the first approach.

      This is a phase analysis of the yearly changes in atmospheric CO2. The physical observations cannot be explained by the Bern model.

      https://www.researchgate.net/publication/257343053_The_phase_relation_between_atmospheric_carbon_dioxide_and_global_temperature

      The phase relation between atmospheric carbon dioxide and global temperature
      Summing up, our analysis suggests that changes in atmospheric CO2 appear to occur largely independently of changes in anthropogenic emissions.

      A similar conclusion was reached by Bacastow (1976), suggesting a coupling between atmospheric CO2 and the Southern Oscillation.

      However, by this we have not demonstrated that CO2 released by burning fossil fuels is without influence on the amount of atmospheric CO2, but merely that the effect is small compared to the effect of other processes.

      Our previous analyses suggest that such other more important effects are related to temperature, and with ocean surface temperature near or south of the Equator pointing itself out as being of special importance for changes in the global amount of atmospheric CO2.

      • Quote: “You are looking at one piece of data which is not sufficient to understand the physical problem. There is a real paradox in the observations. Paradoxes should not exist. What we have done is ignored the paradoxes and pressed ahead with mathematical analysis when we are incorrect at the level of imagination/concepts. Climate scientists ignored other data they could not explain concerning the atmospheric C12/C13 ratio change.”

        Response: I have shown the whole picture of the permille value problem, and I could explain it. You have not shown any evidence about the permille value problem. The ENSO effects are short-term climate variations, and they cannot explain this problem at all.

    • I have never seen the term “permille values of the atmosphere” used here at WUWT. A Google advanced search gives only this post. Would you define it please?

      • Your comment proves that my assumption about the general lack of knowledge of the permille values seems to be true. I explain the basics in a comment located at the end of the comments.

    • Antero Ollila: CO2 with anthropogenic carbon dissolves in the ocean, and CO2 with non-anthropogenic carbon gases out of the ocean. Individual CO2 molecules have an atmospheric residency half-life of around 10 years, according to the decline of atmospheric 14C-bearing CO2 after the end of the nuclear bomb tests that produced it. Because CO2 molecules entering the ocean cause others to gas out when they otherwise wouldn’t, the atmospheric residency time of a CO2 bump-up from an injection of CO2 is longer, with the best-fit exponential decay approximation being 29.4-29.7 years according to Spencer’s results, or 41 years according to work by Willis Eschenbach in a 2015 WUWT article. So, the anthropogenic boost of atmospheric CO2 consists in part of non-anthropogenic CO2.

  19. I have pondered what the great error in the model of Joos et al. might be. I have concluded that it must be in the CO2 fluxes between the atmosphere and the oceans. The magnitude of this flux is about 85 GtC/yr. In 2017 it flushed 7.1 GtC of anthropogenic CO2 into the mixing layer, and the oceans released 1.1 GtC/yr back into the atmosphere. This means that the anthropogenic CO2 in the mixing layer increases all the time. Since 1750, a total of 233 GtC of anthropogenic CO2 has been absorbed into the deep sea. According to the observations, this amount was about 128 GtC in 1994.

    But there is also a natural CO2 flux between the atmosphere and the oceans. Since 1750 the natural CO2 flux into the mixing layer has been 136 GtC, and the natural CO2 flux from the oceans into the atmosphere has been 333 GtC. This difference of 197 GtC of natural CO2 stayed in the atmosphere. Thus the total CO2 increase in the atmosphere was 73 + 197 = 270 GtC, which is the same as measured. I think that Joos et al. do not take into account this natural CO2 flux from the oceans into the atmosphere.
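
    The bookkeeping in this comment can be checked in a few lines; all figures are the commenter's own (Ollila's model), not independently verified:

```python
# Carbon bookkeeping using the commenter's own figures (since 1750, in GtC).
anthro_airborne = 73          # anthropogenic CO2 still in the atmosphere (2017)
natural_into_ocean = 136      # natural CO2 absorbed into the mixing layer
natural_from_ocean = 333      # natural CO2 outgassed from the oceans

natural_net = natural_from_ocean - natural_into_ocean  # natural CO2 that stayed airborne
total_increase = anthro_airborne + natural_net

print(natural_net, total_increase)  # 197 and 270 GtC, matching the text above
```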

  20. “(The post-Pinatubo reduction in atmospheric CO2 is believed to be due to increased photosynthesis due to an increase in diffuse sunlight penetration into forest canopies caused by the volcanic aerosols):”

    There may also be a fertilizing effect as the fine-grained ash falls onto the ground and into the oceans. Phytoplankton are generally assumed to be restricted by availability of iron, so volcanic ash might have eased that restriction. It is generally acknowledged that a reason volcanoes often have high human population densities is because the ash weathers quickly, producing very fertile soils. So, it is not unreasonable that ash fertilizer would support increased photosynthetic activities over much of the world.

  21. A comment about the residence time of anthropogenic CO2 in the atmosphere. Joos et al. say that the increase of CO2 in the atmosphere is totally anthropogenic by nature. Their residence time function is called the Impulse Response Function (IRF): “The IRF is thus the first-order approximation how excess anthropogenic carbon is removed from the atmosphere by a particular model.” The residence time of the IRF for anthropogenic CO2 is more than 100 000 years, and about 21% would stay in the atmosphere forever.

    Humanity has carried out one full-scale experiment with the climate in the form of the atmospheric nuclear bomb tests, which stopped in 1964. These tests produced radioactive carbon 14C. Its concentration has declined continuously and is approaching zero. A fitted residence time of 16 years gives an almost perfect result. This result means that the adjustment or settling time should be 4 * 16 = 64 years, which will be reached in 2028, when only 2% of the original concentration should remain in the atmosphere. What does this have to do with climate models? Well, this test is a perfect tracer test for the behavior of anthropogenic CO2 in the atmosphere, because anthropogenic CO2 concentration also started from zero in the system. The behavior of the total CO2 is different. The residence time of more than 100 000 years for anthropogenic CO2 is totally incorrect.
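
    The residence-time arithmetic in this comment can be sketched as a simple exponential decay; the 16-year time constant is the commenter's fitted value, not an established figure:

```python
import math

TAU = 16.0  # commenter's fitted e-folding residence time for bomb 14C, in years

def excess_fraction(years_after_1964):
    """Fraction of the original bomb-spike excess 14C still airborne."""
    return math.exp(-years_after_1964 / TAU)

# Four time constants (4 * 16 = 64 years, reached in 2028) leave about 2%:
print(round(excess_fraction(64) * 100, 1))  # ~1.8 %
```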

  22. In addition to 1 trillion trees, why not add a million floating reefs made from pumice?
    They wouldn't float at the surface; they would float at, say, 30 meters below the ocean surface, in water more than 100 meters deep (say 100 to 300 meters).
    They would be places where ocean plants and animals can live.
    The pumice could be made out of recycled glass, ie:
    https://www.youtube.com/watch?v=6umXittJPPM
    But you want it thicker than what's made in the video, something like 1 foot (30 cm) thick, and say 4 meters wide and 12 meters long. That is 48 square meters at about 1/3 meter thick, or 16 cubic meters. 16 cubic meters of water (density 1) is 16 tons, and pumice density averages around 0.8, so 16 times 0.8 gives a total mass of 12.8 tons.
    These can be joined together into a raft and floated to some location. Then you need an anchor and tether to make each one neutrally buoyant at the 30 meter depth. The anchor can also be made to float like a boat, and at the destination you "sink the boat": it sinks and pulls the "island" 30 meters under the water.
    Each "island" can have lots of small cave openings in which fish of various sizes can hide from larger predators.
    Since it's 30 meters under water, sunlight can still reach it for plant growth, and plants could grow up to the surface. And it's deep enough not to be a navigation hazard, so it could be near shipping lanes.
    So one could assemble a raft of hundreds of islands, tow it to some location, and deploy it. To get a million, fewer than 10,000 rafts would be needed, placed all over; for example, there could be many island clusters in different parts of the Gulf of Mexico.
    And at 30 meters depth, hurricanes should not have much effect on them.

    A critical aspect would be the cost of the material. I assume it could or should be less than the cost of concrete. It should also be made somewhat close to where the "islands" would be deployed; using rafts, or deploying in larger numbers, should lower the unit cost of deploying them.

  23. Dr Spencer wrote: “And it raises the question of just how good the carbon cycle models are that the UN IPCC depends upon to translate anthropogenic emissions to atmospheric CO2 observations.”

    Clearly they are not good at all. I have frequently pointed out over the past year here at WUWT the OCO-2 results that came out in 2017; since then the OCO-2 team has gone quiet for 3+ years, despite continuing to acquire and post OCO-2 CO2 Level-2 data to the NASA data portal.
    Those OCO-2 results destroyed the predictions made by the Bern Model used by the IPCC.

    “The Bern Simple Climate Model (BernSCM) v1.0: an extensible and fully documented open-source re-implementation of the Bern reduced-form model for global carbon cycle–climate simulations”
    https://www.geosci-model-dev.net/11/1887/2018/

    Why does the Bern Model fail?
    Well just one of several ways is the lack of biological uptake modeling.
    From their own paper:

    “BernSCM (Fig. 1) is designed to compute decadal- to millennial-scale perturbations in atmospheric CO2, in climate and in fluxes of carbon and heat relative to a reference state, typically preindustrial conditions. The uptake of excess, anthropogenic carbon from the atmosphere is described as a purely physicochemical process (Prentice et al., 2001). As in pioneering modeling approaches with box-type (Oeschger et al., 1975; Revelle and Suess, 1957) and general ocean circulation models (Maier-Reimer and Hasselmann, 1987; Sarmiento et al., 1992), modification of the natural carbon cycle through potential changes in circulation and the marine biological cycle (Heinze et al., 2015) are not explicitly considered.”

    Stated simply, they ignore the biological consequences of increasing fertilization and uptake of CO2. And universally, all the models get the sign of CO2 sensitivity wrong.

    From their earlier work:

    “All models simulated a negative sensitivity for both the land and the ocean carbon cycle to future climate. ”
    Climate–Carbon Cycle Feedback Analysis: Results from the C4MIP Model Intercomparison
    https://journals.ametsoc.org/doi/10.1175/JCLI3800.1

    Yet we can predict with certainty that the IPCC AR6 WG1 authors won’t (aren’t allowed to?) admit that past CO2 modelling efforts are utter failures to reflect the reality seen in the OCO-2 data. And Judith Curry has pointed out that they are devising an even worse scenario for AR6 than the bogus, impossible RCP8.5 from AR5. There is far too much money and power at stake in the climate scam for them (the scientists who might find a conscience) to be allowed to ruin the march to global socialism under the auspices of the UN.

    • The Bern model, or a minor correction of it, has not yet failed. Non-exponential decay of an atmospheric CO2 bump-up by a pulse or injection of CO2 into the atmosphere cannot be disproven by atmospheric CO2 data until anthropogenic CO2 emissions deviate greatly enough from exponential growth long enough to cause atmospheric CO2 level to deviate greatly from exponential growth of its excess over some “pre-industrial” baseline. Until that happens, atmospheric CO2 level will be consistent with both exponential and non-exponential decay (towards some “pre-industrial” baseline) of the atmospheric CO2 gain from each year’s anthropogenic CO2 emissions.

  24. If we magically burned every gram of proven reserves of coal, oil, and natural gas, and none of it dissolved in the oceans, or was taken up in the biosphere, we would reach 735 ppm – max.

    That amounts to 1.39 doublings, so 1.39 x climate sensitivity = warming in °C. That’s it, and no more. The only way to get more CO2 into the air is to burn biotic carbon, or dissolve *lots* of limestone. That’s not even close to enough to stave off the next glaciation. So much for gloom and doom.

    • I figure (below) 868, not 735 ppm, from burning the proven reserves according to BP, which is 1.63 doublings (on a log scale) from 280 ppm. Then there’s the matter that the world has a lot more coal, oil and natural gas than proven reserves.

      As of the end of 2017, the world’s proven reserves of oil was 1696.6 billion barrels according to https://www.bp.com/content/dam/bp/business-sites/en/global/corporate/pdfs/energy-economics/statistical-review/bp-stats-review-2018-full-report.pdf That is 231.45 billion tonnes, of which about 198 billion tonnes is carbon.

      That source says the world’s proven reserves of natural gas at the end of 2017 was 193.5 trillion cubic meters, and I figure the mass of that being 127.13 billion tonnes, of which 95 billion tonnes is carbon.

      That source says the world’s proven reserves of coal at that time were 1035 billion tonnes. I figure that has 650 billion tonnes of carbon.

      All three of these total 943 billion tonnes of carbon at the end of 2017. I’m pretty sure proven reserves are growing and are no less than 943 billion tonnes now. 943 billion tonnes of carbon would get oxidized to 3,457 billion tonnes of CO2. The earth’s atmosphere has a mass of 5,000,000 billion tonnes, and 3,457 is 691.4 parts per million of that by mass. Multiplying 691.4 by the ratio of the molar masses of air and CO2 (about 29/44) gives 455.7 parts per million by volume, which is the unit normally used for atmospheric CO2. We’re currently at about 413 PPMV, and adding 455 to that means 868 PPMV.
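
      The reserve arithmetic above can be reproduced in a few lines. All inputs are the commenter's figures; the final conversion uses the approximate air/CO2 molar-mass ratio 29/44, so the last digit shifts slightly depending on rounding:

```python
import math

# Converting the commenter's proven-reserve carbon estimates into added ppmv.
carbon_gt = 198 + 95 + 650         # Gt carbon: oil + gas + coal (BP, end-2017)
co2_gt = carbon_gt * 44.0 / 12.0   # oxidised to CO2 (molar masses 44 vs 12)

atmosphere_gt = 5.0e6              # approximate mass of the atmosphere, Gt
ppm_by_mass = co2_gt / atmosphere_gt * 1e6
ppm_by_volume = ppm_by_mass * 29.0 / 44.0  # mass fraction -> molar (volume) fraction

final_ppm = 413 + ppm_by_volume            # added to today's ~413 ppmv
doublings = math.log2(final_ppm / 280.0)   # doublings from the 280 ppm baseline
print(round(ppm_by_volume), round(final_ppm), round(doublings, 2))
```

      This reproduces both the ~455 ppmv addition and the ~1.63 doublings quoted in the parent comment.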

  25. Roy, in a number of my comments over half a dozen years on the “Great Greening” that alarmists are loathe to discuss and at great pains to find something alarming about, I thought it self evident that natural sequestration by ‘fattening’ existing trees and shrubs and new growth fringing concentrically into arid barren lands would be exponential and would therefore flatten the rising CO2 curve to some equilibrium plateau.

    I also noted that photosynthesis is endothermic and would therefore have a significant cooling effect given the magnitude of the Greening. The recent study from Boston College discussed the other day here, attributed cooling to release of water vapor. This would actually be in addition to the endothermic reaction I was positing. The amount of endothermic cooling should be equal to the enthalpy of oxidation of the carbon sequestered as a bare minimum (imagine burning this amount of anthracite coal). Ocean life is likely also expanding (phytoplankton, etc). The reduced growth of CO2 in the atmosphere plus the cooling effect further attenuates any reason to be alarmed about fossil fuel use.

  26. I certainly hope we more than double the ppm of CO2 in the atmosphere. That would be a benefit to plants and agriculture, and IF CO2 slightly warms the atmosphere (and Dr. Spencer assumes it does, without knowing how the planet warmed occasionally over the last few thousand years), then it would be a benefit to the Northern Hemisphere as well. I really don’t like the assumption that human beings MUST be changing the climate, when we can only estimate how much CO2 goes into the atmosphere from natural sources (97%?) and how much comes from burning fossil fuels (3%? +/- 3% or more?). So why all the jumping to conclusions? It could easily be 99.999% from the oceans that cover 70% of our planet, along with land-based sources, and 0.001% human activity. We have no way of measuring accurately. And CO2 may not increase the temperature of the atmosphere at all. We don’t know yet. According to ice core studies, the rise in CO2 follows the rise in global temperature. There are so many other variables to be studied: heat from the Earth’s mantle seeping into the oceans, slightly warming them (= more CO2 into the atmosphere, maybe? how much?), the Sun’s cycles, gravitational pulls from other planets, Earth’s elliptical orbit, cosmic radiation, Earth’s spin, and who knows how many other variables we haven’t considered yet.

    • “There are so many other variables to be studied: Heat from the Earth’s mantle seeping into the oceans, slightly warming oceans= more CO2 into the atmosphere maybe? How much? The Sun’s cycles, gravitational pulls from other planets, Earth’s elliptical orbit, cosmic radiation, Earth’s spin, and who knows how many other variables we haven’t considered yet.”

      Yup that in a nutshell is the problem. Even some skeptical of the junk climate science are guilty of it. They all want to look at ONE “factor” which is the climate “driver,” when the reality is that there are MANY “factors,” and we don’t even know what they all are, much less have any data regarding any of them.

      Of course, the Climate Fascists are just working backwards from the preconceived conclusions and looking to justify them, so you can’t expect them to change their stripes, ever. No way they’re going to “kill the golden goose.”

    • holly says:
      I certainly hope we more than double the ppm CO2 in the atmosphere.

      That would be a benefit for the planet. But on the planet’s timescale, fossil fuels will quickly become scarce and a CO2 doubling would go back to a standard level (getting dangerously close to CO2 starvation levels) in the blink of a geologic-eye — unless we purposely “burn” abundant limestone to keep the levels up.

  27. Maybe the best example of the grossness of the IPCC and the climate establishment is the acceptance of the Joos et al. and Bern2.5CC models. They state clearly that the whole increase of atmospheric CO2, which was about 270 GtC in 2017, is totally anthropogenic. The direct permille observations show that this is impossible: the model-calculated value would be -13‰, while the observed value is -8.5‰. Nobody has commented on this issue.

    An explanation might be this: people do not know what the permille is, and therefore they do not know whether I have calculated this value correctly or not. Because people do not know about these observations, the IPCC and the climate establishment can stay silent about this conflict between the models of Joos et al. and Bern2.5CC and the reality.

    The permille values of the present atmosphere can be calculated by a simple equation:
    Permille = ((100 - CO2ant) * (-6.35) + CO2ant * (-28)) / 100 (2)

    CO2ant is the percentage portion of anthropogenic CO2 in the atmosphere; in 2017 it was, according to the IPCC, 100*270/866 = 31.2. The permille value of the natural CO2 before 1750 was -6.35‰, and the permille value of fossil fuels is -28‰ (the same as for plants). Applying these figures, the permille value should have been -13‰ according to the IPCC. The Mauna Loa value in 2017 was -8.6‰, so the error is really massive.
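
    Eq. (2) is a two-component isotope mixing calculation; a minimal sketch, using the end-member values quoted in the comment:

```python
# Abundance-weighted mixing of delta-13C end members (the commenter's eq. 2).
def mixture_permille(ant_percent, natural=-6.35, fossil=-28.0):
    """Permille value of air whose CO2 is ant_percent % anthropogenic."""
    return ((100.0 - ant_percent) * natural + ant_percent * fossil) / 100.0

# IPCC-implied anthropogenic share in 2017: 100 * 270 / 866 ~= 31.2 %
print(round(mixture_permille(31.2), 1))  # -13.1, vs. the measured -8.6
```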

  28. Will Humanity Ever Reach 2XCO2 (800 ppm)? Sadly no. Humanity will be powerless to bring about this highly desirable goal. As for the Plants’ desire to return to 1,200 ppm, they are likely to be disappointed in our puny efforts.
    At least, however, if we can keep the Geo-Engineering Nuts in check, we will not further damage the previously precariously low levels of CO2 for our plant partners.

    • Which is only a “surprise” to deluded pseudo-scientists that think CO2 is a “climate driver,” when there is no empirical evidence of that.

  29. In my experience, the main complaint about the current model will be that it is “too simple” and therefore probably incorrect.

    Bingo.
    And is the “diffuse sunlight under the canopy” theory the correct explanation? Or maybe thermal effects over the Pacific? Or deposited nutrients from ash dispersal increasing Pacific photosynthesis? Or plenty of other things I could suggest?

    Or are they just making it up as they go along? I know which one I believe.

  30. SCIENTISTS DRILL INTO DOOMSDAY GLACIER, STOP IT FURTHER DRILLING NOW. !!!!!!!
    https://raveendrannarayanan.wordpress.com/2020/01/30/scientists-drill-into-doomsday-glacier-stop-it-further-drilling-now/    
    https://youtu.be/f0AWsJ0cmLE

    https://youtu.be/UfY4YtixRA8

    #MELTING_DEICINGareTWOdiffrentPROCESS

    https://youtu.be/XlarSQIYHL4

    Not CO2 and GHG, But DEICERS from Heavy duty Desalination plants. Install ZERO DISCHARGE SYSTEMS (Z.D.S.) in Desalination plant and capture DEICERS and build more ICE MASSES in both POLES. DEICERS brought by Hurricanes, Cyclones and Typhoons helping to DEICES ICE SHELVES, even during WINTER, thereby METHANE escaping to atmosphere. These will STOP SEA LEVEL RISE.

    NOAA/NASA dramatically altered Global Temperatures. http://wp.me/pPrQ9-t2w via @wordpressdotcom https:raveendrannarayanan.wordpress.com/2014/10/14/one-shot-many-birds/

    “NOT CO2 & SUN, THEN WHAT?” http://wp.me/p25H2W-9M

    IPCC needs $122 TRILLION for Climate Correction.
    https://lnkd.in/erZHg3W ” NOMINATE FOR NOBEL PEACE PRIZE 🏆 2020″

    #CO2GangsDistroyingMotherEarth #unfccc #ipcc #parisagreement
    #notco2 #rewrittingearthscience #airconditioningthemotherearth
    #nobelpeaceprizecommitte #earthscieenceconferencecommitty
    #ChallengingGlacialScintists
    #waterprize

  31. There are two stable isotopes of carbon. The most common is 12C, having 6 protons and 6 neutrons, while 13C has one extra neutron. Isotope 12C is by far the most common, making up 98.9% of all carbon; the rest is 13C. There is also a very small concentration of the unstable isotope 14C, which is radioactive. Cosmic radiation produces 14C all the time by transmuting nitrogen atoms in the atmosphere. Its half-life is 5730 years, and it is used in the radiocarbon dating method.

    The measurement unit of the 13C proportion is marked δ13C; it expresses the deviation of a sample’s carbon isotope 13 content in ‰ (written also as per mil, per mill, permil, permill or permille). This unit depends linearly on the ratio 13C/12C per eq. (1):

    Permille = ((13C/12C)sample / (13C/12C)standard - 1) * 1000 (1)

    This measurement unit is odd because all the values for CO2 mixtures are negative. The reason is the standard value of 13C/12C, which comes from a marine fossil called the Pee Dee Belemnite (PDB); its value is quite high, 0.011237. Observations show that the permille value of plants and fossil fuels is -28‰ (13C/12C = 0.010922). The permille value of natural CO2 in the year 1750 was -6.35‰. This value comes from research studies which assume that in 1750 the mixing layer of the ocean and the atmosphere were in balance, having the same 13C/12C ratio. Because of anthropogenic emissions, the permille value of the atmosphere has decreased continuously, being -7.0‰ in 1960, and now it is about -8.6‰.
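
    Eq. (1) can be checked numerically; the PDB reference ratio used here is the commonly quoted 0.0112372:

```python
# delta-13C per eq. (1): deviation of a sample's 13C/12C ratio from the
# PDB standard, expressed in permille.
R_PDB = 0.0112372  # commonly quoted Pee Dee Belemnite reference ratio

def delta13c(sample_ratio):
    return (sample_ratio / R_PDB - 1.0) * 1000.0

# Fossil fuel / plant carbon: ratio ~0.010922 gives about -28 permille.
print(round(delta13c(0.010922)))  # -28
```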

    Here is the link to the observations and trends (rather difficult to find, why??):
    https://scrippsco2.ucsd.edu/graphics_gallery/other_stations/global_stations_co2_concentration_trends.html

  32. One more example of the cheating by the climate establishment. There is a nice animation about the carbon cycle on YouTube made by Robert Rohde, who works at the Berkeley Earth organization. I sent an email asking these questions:
    – On which research papers are the numbers and the model based?
    – What is the nature of the atmospheric carbon dioxide increase of about 276 GtC in 2018? Is it totally anthropogenic (permille value about -28‰) as reported by the IPCC, Joos et al. (2011) and Le Quere et al. (2018)?

    Rohde replied very promptly that the animation is based on AR5 of the IPCC and Global Carbon Project 2018 report. The animation assumes that the entire perturbation is anthropogenic, through a combination of fossil fuel production and land-use changes.

    As you can see, Rohde did not answer my question about the nature of atmospheric CO2 unequivocally. We can understand the reply to mean that all the values – including the total increase of atmospheric CO2 – are anthropogenic by nature.

    Because this was the typical answer of a politician, who never replies directly to an inconvenient question but talks about something else, I sent another email asking again about the nature of the atmospheric CO2. I have not received any answer so far. I think that Rohde noticed or found out that I am aware of this cheating.

  33. People need to realize that the Mauna Loa data aren’t data in the usual sense. They are a model. Mauna Loa is a volcano and emits CO2. Yes, that’s right, our primary CO2 monitoring station sits right on top of a volcano emitting CO2. So the CO2 measurements can vary all over the place depending on variability in CO2 emissions and wind direction. What they do is model how much they think CO2 should increase; measurements that fit the model are included, and ones outside the expected values are discarded.

    • In both the CO2 concentration data and the permille data you can see the measured values at different places on the Earth. There are small differences in levels and fluctuations, but the overall trends are the same in all locations.

      CO2 concentration and permille measurements are not man-made scams. They are real.

      • Antero
        You are assuming that the other locations don’t have similar problems to Mauna Loa. In the case of Pt. Barrow, the village is a little hot spot of CO2 generation. If the wind is blowing towards the recording station, it will generate outliers that need to be discarded. There is some subjectivity in deciding what wind conditions or anomalous readings warrant eliminating data. It is less than an ideal situation for recording CO2! I wouldn’t be surprised to find other stations around the world have had their scientific integrity compromised for the sake of convenience in reading or maintenance.

  34. The concentration of CO2 in the atmosphere more closely resembles a balance being achieved with a warming ocean than a gross concentration of CO2 being established in the atmosphere and slowly absorbed by the ocean. It just looks like they have cause and effect completely backwards, and ancient climate data backs this up (CO2 follows temperature).

    Another problem I see is that scientists seem to assume that the ocean’s removal of CO2 is unaffected by warming. The ocean not only has a huge potential to absorb CO2, but to chemically bind it into an inert form – carbonate rocks. If this chemical process increases with either CO2 concentration or warmth (or both), then vast quantities of additional CO2 are being deposited on the ocean floor over and above what was occurring 50 years ago.

    So I agree, the higher end of the CO2 concentration estimates for the atmosphere is nothing but a huge guess, and likely wildly overestimated. Even if CO2 were to triple, I still fail to see a giant threat. I do not care if sea ice melts. The oceans are already rising and we will already have to adapt to it. More rain in places like Egypt and the Southwestern U.S. is not frightening to me. Canadians baking in 60F degree heat is not too concerning – they have plenty of cold beer up there. Faster growing, more abundant food crops are hard to be frightened of. I could care less if polar bears end up eating berries along shores instead of eating baby seals on ice packs.

    Now, rafts of angered baby penguins… that kind of frightens me. I have seen videos of them slapping each other.

      • Since “we” (humans) are not in control of the CAUSES of any “sea ice melt” anyway, what’s your point? And since sea ice is stubbornly “average,” there’s no “consequences” to BE concerned about.

      • Loydo
        It looks like you left a word out of your last sentence, making the meaning ambiguous. However, if the intended word was “know,” then I think the sentence would apply to you. Melting sea ice will not affect the level of the oceans.

  35. If we assume that all man-made CO2 remained in the atmosphere, a mass balance shows that the CO2 content of the atmosphere would increase by about 1 ppm per 8 gigatonnes (Gt) of CO2 emitted. Worldwide CO2 emissions for 2018 are estimated at about 36.8 Gt, so that if all the CO2 remained in the atmosphere, the CO2 concentration should rise at a rate of 4.6 ppm/year. Since the actual rate of increase is about 2.5 ppm/yr, CO2 is being removed at a NET rate of 2.1 ppm/yr, or about 16.8 Gt/year, which represents about 46% of the human CO2 emission rate.

    If it is assumed that the net removal rate is linearly proportional to the CO2 concentration in the air (first-order reaction), then the net removal rate should catch up to the emission rate when the CO2 concentration is 1/0.46 = 2.2 times the current level, which would be about 900 ppm, and the concentration would remain in equilibrium thereafter.
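    The first-order model described in this comment can be sketched in a few lines of Python. The specific numbers are assumptions for illustration only (a current concentration of 410 ppm, plus the 4.6 ppm/yr emission rate from the mass balance above); stepping the model forward reproduces the roughly 900 ppm equilibrium the commenter derives.

```python
# Minimal sketch of the commenter's first-order removal model.
# Assumed numbers (not from the post): C0 = 410 ppm today,
# constant emissions E = 4.6 ppm/yr, and a removal rate k*C fitted
# so that net removal today is 46% of emissions (2.1 ppm/yr).

C0 = 410.0          # current CO2 concentration, ppm
E = 4.6             # emissions, ppm/yr (~36.8 Gt CO2 at ~8 Gt per ppm)
k = 0.46 * E / C0   # first-order removal constant, 1/yr

C = C0
for year in range(2000):   # step forward until near equilibrium
    C += E - k * C         # dC/dt = emissions - k*C

print(round(C))            # -> 891
```

    The equilibrium is just E/k = 410/0.46, about 891 ppm, matching the "about 900 ppm" figure above.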

    The net natural removal rate is actually the CO2 emission rate of all natural sources (animal respiration, emission from oceans, volcanic emissions, etc.) minus the CO2 absorption rate of all natural sinks (photosynthesis, absorption into oceans, conversion to carbonates in oceans, etc.). Accounting for the natural sources would mean that the total natural absorption rate is actually higher than the net absorption rate of 16.8 Gt per year. It would be expected that the total natural absorption rate would be proportional to the CO2 concentration, but that the natural emission rate would likely remain constant. Depending on the magnitude of the natural emission rate, it would be likely that the CO2 concentration will reach equilibrium at less than 900 ppm, although the numerical value of the plateau would depend on the value of the natural emission rate.
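    Under that extended model – total natural absorption proportional to concentration, natural emissions a constant offset N – the equilibrium can be computed directly for a few assumed values of N. The N values below are purely illustrative; the comment gives no number.

```python
# Sketch of the extended model: total absorption = k*C, natural
# emissions N constant. k is refit for each assumed N so that the
# NET removal today still equals 46% of human emissions.

C0, E = 410.0, 4.6                 # ppm and ppm/yr, as before
for N in (0.0, 2.0, 5.0, 10.0):    # assumed natural emissions, ppm/yr
    k = (0.46 * E + N) / C0        # fit: k*C0 - N = 0.46*E
    C_eq = (E + N) / k             # equilibrium: absorption = all sources
    print(f"N = {N:4.1f} ppm/yr -> equilibrium {C_eq:.0f} ppm")
```

    Any positive N pulls the equilibrium below the ~900 ppm first-order estimate (e.g. about 657 ppm for N = 2 ppm/yr under these assumptions), which is the commenter's point.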

    What natural CO2 emission rate did Dr. Roy Spencer assume in his CO2 budget model?

    • I don’t know why one should assume a linearly proportional rate of removal/emissions of atmospheric CO2, Steve.

      But taking that model as read, observations demonstrate that despite the increased removal rate, it hasn't caught up with the increased emissions rate. Over the long term the atmospheric CO2 accumulation rate is accelerating.

      “It would be expected that the total natural absorption rate would be proportional to the CO2 concentration, but that the natural emission rate would likely remain constant.”

      There is a good basis for this – the ice core records show relatively steady concentrations for the last few thousand years, particularly compared to the last 60 (Mauna Loa) or so. However, the implicit assumption here is that there are no significant feedbacks when total concentration changes significantly.

      Sinks are blind to source, whether anthropogenic or natural. Sinks respond to the sum – the total content of the atmosphere.

      • Barry,

        No, as shown in Fig. 2, observations demonstrate that the absorbed fraction is increasing. That means the removal rate is more than keeping up with the emissions rate.

        Secondly, assuming natural emissions remain constant doesn’t mean they have. At least two sources come to mind. As world population grows, land use changes–especially soil cultivation–would provide an additional source of emissions. Warmer temperatures augment those emissions. These are in addition to the outgassing due to warmer ocean temperatures.

  36. Don’t forget fungi. Lots of beneficial fungi like arbuscular mycorrhizal fungi in pristine grasslands and forests.

    Plowing land kills most fungi as the soil is turned over.

    75% of US soil is no longer plowed as farmers are trying to allow beneficial fungi to grow in their cropland soil. I have no idea how fast the global share of plowed cropland is decreasing, but I'm pretty sure it is decreasing.

    Arbuscular mycorrhizal fungi in particular sequester more carbon in soil than plants.

    Thus, it is likely that much if not all of the earth's increasing ability to draw down CO2 is actually farmer-induced behavior.

  37. In “Climate: The Counter Consensus” by the late Professor Robert M. Carter-
    “The Global Carbon Dioxide budget.
    The effect of human emissions on global levels of atmospheric carbon dioxide is not well understood because no one, including the IPCC, can satisfactorily account for the observed levels in detail; our best estimates of carbon dioxide sources and sinks have large error bars.
    ….One estimate by Canadian climatologist Tim Ball is that the human production of carbon dioxide (7.2 GtC/year, IPCC 2007) is more than 4 times less than the combined error (32 Gt) on the estimated carbon production from all other sources.
    A perspective that follows is that even were human emissions to be reduced to zero, the difference would be lost among other uncertainties in the global carbon budget.”
    What are the error bars in Le Quéré et al. (2018), the Global Carbon Budget paper accepted as central by the IPCC?
    For all sinks they appear to be quite considerable.
    So, the carbon budget is not well constrained.

Comments are closed.