CO2, Soot, Modeling and Climate Sensitivity

Warming Caused by Soot, Not CO2

From the Resilient Earth

Submitted by Doug L. Hoffman on Wed, 07/15/2009 – 13:19

A new paper in Science reports that a careful study of satellite data shows the assumed cooling effect of aerosols in the atmosphere to be significantly less than previously estimated. Unfortunately, the assumed greater cooling has been used in climate models for years. In such models, the global-mean warming is determined by the balance of the radiative forcings—warming by greenhouse gases balanced against cooling by aerosols. Since a greater cooling effect has been used in climate models, the result has been to credit CO2 with a larger warming effect than it really has.

This question is of great importance to climate modelers because they have to be able to simulate the effect of GHG warming in order to accurately predict future climate change. The amount of temperature increase set into a climate model for a doubling of atmospheric CO2 is called the model’s sensitivity. As Dr. David Evans explained in a recent paper: “Yes, every emitted molecule of carbon dioxide (CO2) causes some warming—but the crucial question is how much warming do the CO2 emissions cause? If atmospheric CO2 levels doubled, would the temperature rise by 0.1°, 1.0°, or by 10.0° C?”

Temperature sensitivity scenarios from IPCC AR4.

The absorption frequencies of CO2 are already saturated, meaning that the atmosphere already captures close to 100% of the radiation at those frequencies. Consequently, as the level of CO2 in the atmosphere increases, the rise in temperature for a given increase in CO2 becomes smaller. This severely limits the amount of warming that further increases in CO2 can engender. Because CO2 on its own cannot account for the observed temperature rise in the past century, climate modelers assume that linkages exist between CO2 and other climate influences, mainly water vapor (for a more detailed explanation of what determines the Global Warming Potential of a gas see my comment "It's not that simple").
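To put rough numbers on that diminishing response, here is a minimal sketch using the widely cited simplified forcing expression ΔF = 5.35 ln(C/C0) W/m² (from Myhre et al., 1998); the sensitivity parameter used to convert forcing to temperature is an illustrative assumption, not a value taken from any model:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing (Myhre et al., 1998):
    delta-F = 5.35 * ln(C/C0) in W/m^2. Logarithmic, so each added
    increment of CO2 contributes less forcing than the one before."""
    return 5.35 * math.log(c_ppm / c0_ppm)

LAMBDA = 0.5  # illustrative sensitivity parameter, K per W/m^2 (assumed)

for c in (280, 380, 560, 1120):
    f = co2_forcing(c)
    print(f"{c:5d} ppm: forcing {f:4.2f} W/m^2, implied warming {LAMBDA*f:4.2f} K")
```

Note that going from 560 to 1120 ppm adds no more forcing than going from 280 to 560 ppm did; the logarithm is the whole point.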

To compensate for the missing “forcing,” models are tuned to include a certain amount of extra warming linked to carbon dioxide levels—extra warming that comes from unestablished feedback mechanisms whose existence is simply assumed. Aerosol cooling and climate sensitivity in the models must balance each other in order to match historical conditions. Since the climate warmed slightly last century the amount of warming must have exceeded the amount of cooling. As Dr. Roy Spencer, meteorologist and former NASA scientist, puts it: “They program climate models so that they are sensitive enough to produce the warming in the last 50 years with increasing carbon dioxide concentrations. They then point to this as ‘proof’ that the CO2 caused the warming, but this is simply reasoning in a circle.”

A large aerosol cooling, therefore, implies a correspondingly large climate sensitivity. Conversely, reduced aerosol cooling implies lower GHG warming, which in turn implies lower model sensitivity. The upshot of this is that sensitivity values used in models for the past quarter of a century have been set too high. Using elevated sensitivity settings has significant implications for model predictions of future global temperature increases. The low-end value of model sensitivity used by the IPCC is 2°C. Using this value results, naturally, in the lowest predictions for future temperature increases. In the paper "Consistency Between Satellite-Derived and Modeled Estimates of the Direct Aerosol Effect," published in Science on July 10, 2009, Gunnar Myhre argues that previous values for aerosol cooling are too high—by as much as 40 percent—implying that the IPCC's model sensitivity settings are too high as well. Here is the abstract of the paper:

In the Intergovernmental Panel on Climate Change Fourth Assessment Report, the direct aerosol effect is reported to have a radiative forcing estimate of –0.5 Watt per square meter (W m^-2), offsetting the warming from CO2 by almost one-third. The uncertainty, however, ranges from –0.9 to –0.1 W m^-2, which is largely due to differences between estimates from global aerosol models and observation-based estimates, with the latter tending to have stronger (more negative) radiative forcing. This study demonstrates consistency between a global aerosol model and adjustment to an observation-based method, producing a global and annual mean radiative forcing that is weaker than –0.5 W m^-2, with a best estimate of –0.3 W m^-2. The physical explanation for the earlier discrepancy is that the relative increase in anthropogenic black carbon (absorbing aerosols) is much larger than the overall increase in the anthropogenic abundance of aerosols.
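The abstract's numbers let us sketch the sensitivity argument directly. What follows is a back-of-the-envelope energy-balance illustration of my own, not the paper's method, with the CO2 forcing and observed warming values assumed purely for the sake of the arithmetic:

```python
# Back-of-the-envelope energy balance: warming ~ lambda * net forcing,
# so the inferred sensitivity lambda ~ observed warming / net forcing.
# Both input values below are assumptions for illustration only.

OBSERVED_WARMING_K = 0.7   # assumed 20th-century warming, K
F_CO2 = 1.66               # assumed CO2 forcing, W/m^2

for f_aerosol in (-0.5, -0.3):          # old IPCC estimate vs. Myhre's revision
    net = F_CO2 + f_aerosol
    lam = OBSERVED_WARMING_K / net      # inferred sensitivity, K/(W/m^2)
    # express as warming for doubled CO2 (forcing ~3.7 W/m^2)
    print(f"aerosol {f_aerosol:+.1f} W/m^2 -> 2xCO2 warming ~ {lam*3.7:.1f} K")
```

With the weaker aerosol cooling, the same observed warming is explained by a lower sensitivity: roughly 1.9 K per doubling instead of 2.2 K in this toy calculation.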

The complex influence of atmospheric aerosols on the climate system, and the influence of humans on aerosols, are among the key uncertainties in the understanding of recent climate change. Rated by the IPCC as one of the most significant yet poorly understood forcings, aerosols have been the subject of much recent research activity (see Airborne Bacteria Discredit Climate Modeling Dogma and African Dust Heats Up Atlantic Tropics). Some particles absorb sunlight, contributing to climate warming, while others reflect sunlight, leading to cooling. The main anthropogenic aerosols that cause cooling are sulfate, nitrate, and organic carbon, whereas black carbon absorbs solar radiation. The global mean effect of human-caused aerosols (in other words, pollution) is a cooling, but the relative contributions of the different types of aerosols determine the magnitude of this cooling. Readjusting that balance is what Myhre's paper is all about.

Smoke from a forest fire.

Photo EUMETSAT.

Discrepancies between recent satellite observations and the values needed to make climate models work right have vexed modelers. "A reliable quantification of the aerosol radiative forcing is essential to understand climate change," states Johannes Quaas of the Max Planck Institute for Meteorology in Hamburg, Germany. Writing in the same issue of Science, Dr. Quaas continued, "however, a large part of the discrepancy has remained unexplained." With a systematic set of sensitivity studies, Myhre explains most of the remainder of the discrepancy. His paper shows that with a consistent data set of anthropogenic aerosol distributions and properties, the data-based and model-based approaches converge.

Myhre argues that since preindustrial times, soot particle concentrations have increased much more than other aerosols. Unlike many other aerosols, which scatter sunlight, soot strongly absorbs solar radiation. At the top of the atmosphere, where the Earth’s energy balance is determined, scattering has a cooling effect, whereas absorption has a warming effect. If soot increases more than scattering aerosols, the overall aerosol cooling effect is smaller than it would be otherwise. According to Dr. Myhre’s work, the correct cooling value is some 40% less than that previously accepted by the IPCC.
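The sign convention here can be made concrete with a toy decomposition. The component magnitudes below are invented for illustration, not taken from Myhre's paper; the point is only that a larger black-carbon term pulls the net top-of-atmosphere forcing toward zero:

```python
# Toy decomposition of the direct aerosol effect at the top of the
# atmosphere. Component magnitudes are invented for illustration.
# Scattering aerosols (sulfate, nitrate, organic carbon) force negative
# (cooling); absorbing black carbon forces positive (warming).

def net_direct_effect(scattering_wm2, black_carbon_wm2):
    """Net TOA direct aerosol forcing = scattering term + absorption term."""
    return scattering_wm2 + black_carbon_wm2

old = net_direct_effect(scattering_wm2=-0.8, black_carbon_wm2=+0.3)
# If black carbon has grown more than the scattering aerosols since
# preindustrial times, its positive term is larger and the net cooling
# is weaker:
revised = net_direct_effect(scattering_wm2=-0.8, black_carbon_wm2=+0.5)

print(f"old net: {old:+.1f} W/m^2, revised net: {revised:+.1f} W/m^2")
```

The two outputs, –0.5 and –0.3 W/m^2, happen to match the abstract's old and revised best estimates, but the split between the two components is purely illustrative.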

Not that climate modelers are unaware of the problems with their creations. Numerous papers have been published detailing their failure to correctly predict ice cover, precipitation, and temperature. This is due to inadequate modeling of ENSO, aerosols, and the bane of climate modelers: cloud cover. Apologists for climate modeling will claim that the models are still correct, just not as accurate or as detailed as they might be. Can a model that is only partially correct be trusted? Quoting again from Roy Spencer's recent blog post:

It is also important to understand that even if a climate model handled 95% of the processes in the climate system perfectly, this does not mean the model will be 95% accurate in its predictions. All it takes is one important process to be wrong for the models to be seriously in error.

Can such a seemingly simple mistake in a single model parameter really lead to invalid results? Consider the graph below, a representation of the predictions made by James Hansen to the US Congress in 1988, plotted against how the climate actually behaved. It is pretty much what one would expect if the sensitivity of the model were set too high, yet we are still supposed to believe in the model's results. No wonder even the IPCC doesn't call their model results predictions, preferring the more nebulous term "scenarios."

Now that we know the models used by climate scientists were all tuned incorrectly, what does this imply for the warnings of impending ecological disaster? What impact does this discovery have on the predictions of melting icecaps, rising ocean levels, increased storm activity and soaring global temperatures? Quite simply, they got it wrong, at least in as much as those predictions were based on model results. To again quote from David Evans' paper:

None of the climate models in 2001 predicted that temperatures would not rise from 2001 to 2009—they were all wrong. All of the models wrongly predict a huge dominating tropical hotspot in the atmospheric warming pattern—no such hotspot has been observed, and if it was there we would have easily detected it.

Once again we see the shaky ground that climate models are built on. Once again a new paper in a peer-reviewed journal has brought to light significant flaws in the ways models are configured—forced to match known historical results even when erroneous values are used for fundamental parameters. I have said many times that, with enough tweaking, a model can be made to fit any set of reference data—but such bogus validation does not mean the model will accurately predict the future. When will climate science realize that its reputation has been left in tatters by these false prophets made of computer code?

Be safe, enjoy the interglacial and stay skeptical.

==================================

ADDENDUM BY ANTHONY

I'd like to add this graph showing CO2's temperature response to supplement the one Doug Hoffman cites from IPCC AR4. Here we see that we are indeed pretty close to saturation of the response.

CO2 temperature response curve, showing saturation.

The “blue fuzz” represents measured global CO2 increases in our modern times.
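For readers who want to reproduce the shape of such a curve, here is a minimal sketch. It uses the same simplified logarithmic forcing expression as above, with an assumed illustrative sensitivity parameter; the steadily shrinking increments are the "saturation" the graph depicts:

```python
import math

LAMBDA = 0.5  # illustrative sensitivity parameter, K per W/m^2 (assumed)

def warming_from_co2(c_ppm, c0_ppm=280.0):
    # Temperature response implied by the logarithmic forcing law.
    return LAMBDA * 5.35 * math.log(c_ppm / c0_ppm)

# The marginal warming per additional 120 ppm keeps shrinking as the
# concentration rises, which is why the curve flattens:
prev = warming_from_co2(300)
for c in range(420, 1021, 120):
    t = warming_from_co2(c)
    print(f"{c:4d} ppm: cumulative {t:4.2f} K, last 120 ppm added {t - prev:5.3f} K")
    prev = t
```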

July 16, 2009 2:58 am

Re Nick Stokes (00:44:05)
That "Unreal Climate" thread is dated May 2007, and the first graph looks like it ends c. 2003, so extrapolate it out to 2009 and see where it is now.
Reading it carefully, it also doesn't seem to plot any temperatures; it's just "scenarios" with/without various forcings.
So the second chart is the one to look at and to me it would seem that we’re below even Hansen’s Scenario C.
The models are junk, something(s) important is/are missing from them and until that/those missing factor(s) are identified and included, their predictive value is worthless.

Joachim
July 16, 2009 3:10 am

I was guessing that the RC crowd would simply say the author was wrong, even though it is published in Science. Lo and behold…..
Now that they have hypothesized that global warming prior to 1998 was the normal AGW rate, "the climate system overshot the target", we'll just have to wait another 15-30 years for this new idea to be proven wrong, if it is wrong. I don't think AGW-prophesying scientists will be convinced of anything other than their own work.

MattN
July 16, 2009 3:23 am

Excellent entry and reinforces what I’ve said for years now. Thanks.
If soot is indeed the issue, we need to look no further than China, which spews more unchecked coal emissions than the US ever thought of. Our coal burning practices, by comparison, are as pure as a mountain stream….

Chris Wright
July 16, 2009 3:29 am

This is fascinating stuff. I have little doubt that the IPCC has grossly exaggerated future global warming for the obvious reasons. However, I think the sceptical case is best served by not exaggerating the truth and by not presenting misleading data. We should leave that to the warmists.
I think Nick Stokes has a point. If, as he says, Hansen Scenario A was for exponential CO2 growth, that plainly didn’t happen and, if so, to display Scenario A in that context was misleading at the very least.
I wonder if WUWT could comment on this and also show the equivalent graph for Scenario B. If Hansen got it right twenty years ago then he should have credit for it. But on the other hand I would be surprised if his model correctly predicted the pause in global warming that started ten years ago.
Overall, I think WUWT does a great job – but, as in all walks of life, occasionally things do go wrong!
Chris

Leon Brozyna
July 16, 2009 3:32 am

After reading through this, I was struck by the way so much of modeling is based upon assumptions … modelers assume certain values for cooling by aerosols, then assume certain values for GHG heating, assume for this, assume for that, etc. etc.
From my 20 year sojourn in the intellectual atmosphere of the Big Green Machine, I picked up this succinct descriptor [vulgarity level filtered]:
Assumptions are the mother of all foul ups.

Miles
July 16, 2009 4:13 am

Climate models are a colossal waste of time and money.
Until climatology is a better understood science, research, time and money could be put to better use trying to understand the basic fundamentals of earth's climate instead of trying to predict something we're basically guessing at.
Shouldn’t there be a basic law of climate science anyways, like the laws of thermodynamics ?
1) The system will stay at equilibrium until an external forcing changes its current equilibrium ( or something like that )

SOYLENT GREEN
July 16, 2009 4:20 am

Richard III
You forgot the best line relative to Hansen et al.: "it is a tale told by an idiot, full of sound and fury, signifying nothing."

Ron de Haan
July 16, 2009 4:20 am

If we want real answers, we need to open up to all possibilities and with all, I mean all.
Before this publication we had similar publications about black soot, black carbon, aerosols etc.
I think we should have a much broader perspective and involve many more factors, combining them where possible with real-world observations.
If we look at this season, with continuing cold anomalies in the northern hemisphere,
we have experienced at least 4 volcanic eruptions, starting with Mt. Redoubt and ending with Sarychev Peak, sending a lot of ash and gases, water vapor and SO2 high into the atmosphere, penetrating the stratosphere. (The Sarychev Peak eruption reached an altitude of 21 km.)
http://earthobservatory.nasa.gov/NaturalHazards/category.php?cat_id=12
We also have observed real massive dust storms. http://earthobservatory.nasa.gov/NaturalHazards/category.php?cat_id=7
This month, for example, a gigantic cloud of Sahara dust as big as the entire USA
crossed the Atlantic, with a huge impact on Atlantic storm formation.
Joseph D'Aleo, in his current publication (see http://www.icecap.us) about the sun affecting weather events, tells about the droughts in Africa and Argentina which always seem to happen during a solar minimum. These droughts are a major cause of dust storms, expanding the areas of source material beyond the usual deserts.
At the same time we observe real gigantic forest fires.
http://earthobservatory.nasa.gov/NaturalHazards/category.php?cat_id=8
Especially impressive is the sat image of Central Africa.
Several posters here made simple experiments with their photo cameras and concluded
that there is a significant dimming of the sun. I believe figures like 18% were mentioned.
This in my opinion is very significant.
Some time ago I read a publication about cloud formation.
The usual condensation nuclei consist of ice, but this article described bio materials: plant pollen and even spider eggs were found in the air samples they took during several flights at different altitudes.
We know that fine dust and smoke also function as condensation nuclei.
Then we have Svensmark's theory about galactic cosmic rays crashing into our atmosphere and seeding clouds.
Just to make my point, I think there is a trend where scientists tend to compact a very complex climate system, involving the sun, the oceans, the land and the atmosphere, into single one-liners.
I can perfectly understand this, because most scientists are specialized and perform long years of research on a single phenomenon.
But if we want to understand the huge puzzle that is called climate, we should take good notice of their research but at the same time we must realize that what we are looking at is only a single (small) piece of that puzzle.
We already know that CO2 is not a climate driver, and it is a logical step that people
look for arguments to debunk the settled science that states the opposite.
The fact that the CO2 argument is blown out of proportion and hijacked by politics does not make it easy.
But I think we should try to generate a view of the entire puzzle, making a direct link between real-world observations and the available science as we implement (check, test) it.
We have to fit the pieces in order to finish the puzzle and see what’s missing.

Editor
July 16, 2009 5:23 am

Confusing typo alert, 2nd sentence.
Unfortunately, the assume greater cooling has been used in climate models for years.
It took a couple attempts (e.g. the->they), but perhaps it should have assume->assumed, i.e.
Unfortunately, the assumed greater cooling has been used in climate models for years.
It would’ve helped if I had read the first sentence first. 🙂

Dave in Delaware
July 16, 2009 5:29 am

re DJ (01:51:27) :
Following the Harries, J.E., H.E. Brindley, P.J. Sagoo, and R.J. Bantges link –
The press release for the 2001 paper tells us about the methodology –
“The team examined the infrared spectrum of long-wave radiation data from a region over the Pacific Ocean…”
“The effects of cloud cover were effectively removed by using a cloud-clearing algorithm.”
——
There were a number of related papers by these authors.
In an earlier paper on this same time-snapshot comparison (1970 and 1997 data), H.E. Brindley, P.J. Sagoo, R.J. Bantges and J.E. Harries conclude:
"By comparing spectrally resolved observations from the IRIS and IMG instruments we have identified clear signatures due to long term changes in trace gas amounts. Although these strongly affect the OLR the atmospheric temperature and humidity response cannot be unequivocally determined owing to the snapshot nature of the observations."
————————————————————————-
And in a later paper, adding 2003 to the 1970 and 1997 data, authors Jennifer A. Griggs and John E. Harries tell us about the percentage of clear sky data over that portion of the Pacific.
“The cloud clearing process removes 99% of IRIS spectra, 67% of IMG spectra and 76% of AIRS spectra.”
—————————
My conclusion from this is:
* They saw a composition change in the atmosphere between 1971 and 1997.
* They were not able to draw conclusions about atmospheric temperature and humidity (they wanted to but the data didn’t support that conclusion)
* What they observed represents only 1% to 33% of a typical day over that portion of the Pacific. What was happening the other 67% to 99% of the time was governed in some way by cloud cover.
Again, from the 2001 press release,
“Professor Harries described the next challenges for the team: “The next step is to assess whether these data can provide information about changes in not only the greenhouse gas forcing, but the cloud feedback, which is a response of the cloud field to that forcing. ”
My comment –
We have good news, Professor Harries.
Since water vapor and clouds account for an estimated 90% of the 'greenhouse' effect, and it appears that the Pacific had cloud cover more often than not, and more recent studies suggest that clouds have a moderating effect on climate, then the OLR shift in the other 10% of 'greenhouse gases' may not be a climatic catastrophe.

Murray Carpenter
July 16, 2009 5:33 am

O.T.
I was thinking about sea level rise in the last 100 years and was wondering whether at least some of it could be attributed to construction, in terms of sea defences, docks, piers, pipelines, bridges, land reclamation from the sea, man-made land in places like Dubai; even ships and boats (including wrecks!) displace water. Maybe you think all this would amount to only a few grains of sand in a bucket of water, but perhaps we are up to half a brick by now!! Your thoughts….Thanks

Ron de Haan
July 16, 2009 5:34 am

After reading the comments, I think this subject is closed!
In the mean time:
Chain of cold records continues:
http://www.examiner.com/x-11224-Baltimore-Weather-Examiner~y2009m7d14-Record-low-temperature-tied-this-morning-another-on-the-way
Observations of noctilucent clouds come in from all over the world (dimming?)
see http://www.spaceweather.com
And it has already been five days since the last sunspot was seen.
If the sun was a car, it would need a new engine and a new battery.
"Ramp up" takes on an entirely different meaning.

Pofarmer
July 16, 2009 5:38 am

Thank you for that last graph, I’ve been looking for something like that for some time!
So, does this mean that we could run CO2 to 1000 ppm and simply generate larger crop yields? I wonder what it would replace in the atmosphere?

July 16, 2009 5:38 am

A plug for my own analysis of this issue in CHILL: a reassessment of global warming theory (June, 2009, Clairview Books) – I reviewed a number of papers published in Science in 2005, when it was shown very clearly from an assessment of satellite data on the flux of radiation to the earth's surface that the pollution-aerosol explanation for the trough in global temperatures was wrong – the global dimming was a worldwide phenomenon, but it happened also in unpolluted regions of the earth, and hence was a combination of changing atmospheric transparency (aerosols) and cloud cover. I show convincing evidence that the dimming began to reverse BEFORE the major clean-up of sulphur in western Europe or the collapse of the economies of Russia and eastern Europe. In IPCC AR4 (2007), in amongst the technical reports, they admit this new science BUT do not state the implications for the models – which, as Hoffman rightly points out, had been validated by 'hindcasting', or replicating this trough. As we know, the 30-year trough coincided with the PDO and other ocean cycle phases.
I also pointed out that the models were DOUBLY unreliable because they also relied upon 'warmth-in-the-pipeline' as accumulated heat in the ocean surface waters was transferred to land – as we know in WUWT, that heat estimate was cut by half in 2006/2007 by Willis/Lyman/Gouretski's work, BUT this was not reported in IPCC AR4.
The UK MetOffice are aware of this latter (if not the former) and have been revising their model projections. BUT they have a separate team, as yet unpublicised, working on the medium-range projections (to 2030) – and they expect cooling. The other team – the long range to 2050/2080 – has just received maximum publicity from the government because, of course, the long-term trend will be dominated by the unreconstructed models and predicts – wait for it – 4 degrees C of warming in the UK by 2050 unless we reduce emissions by 80% – which our government has just announced it will do. It is now very busy 'streamlining' the planning system to remove 'bottlenecks' – a euphemism for people who get in the way of the very destructive wind, tidal and biofuel projects.

Ron de Haan
July 16, 2009 5:42 am

Well, we can now tell that this posting about CO2, Soot and Climate Modelling did not stop Boxer:
http://climateprogress.org/2009/07/15/boxer-planning-sept-8-rollout-for-climate-bill/
One Republican is in support of her new bill.
Keep them “warm” with your calls so they can make a “cool” decision.

Pofarmer
July 16, 2009 5:46 am

it’s easy to see that the only band CO2 has that isn’t covered by the 100 times more plentiful water bands, is the one at 4 microns. Going up to the blackbody curves, that band just doesn’t have much to give.
So, in the bands where water vapor is dominant, does the CO2 concentration even matter? Are those bands already at 100% absorption even without CO2?

Pofarmer
July 16, 2009 5:54 am

Holy smoke. I just looked at our forecast. Lows in the 50s for Friday and Saturday night. That's in Missouri in JULY????? I don't ever remember lows in the 50s in July.

MattN
July 16, 2009 5:55 am

Can anyone tell me: if solar activity increases, will the energy available in the band that CO2 absorbs also increase? If so, could *that* be what we're seeing?

Nogw
July 16, 2009 6:18 am

DJ: How do you explain the fact that CO2, being a trace part of the air (3.85 parts per ten thousand) and, worse, the air having a VHC of 0.001297 J cm^-3 K^-1, can keep any heat (compare it with that of water = 4.186 J cm^-3 K^-1)?
And the outstanding physicist Niels Bohr said, regarding the so-called "Green House effect," that
the absorption of specific wavelengths of light didn’t cause gas atoms/molecules to become hotter. Instead, the absorption of specific wavelengths of light caused the electrons in an atom/molecule to move to a higher energy state. After absorption of light of a specific wavelength an atom couldn’t absorb additional radiation of that wavelength without first emitting light of that wavelength. (Philosophical Magazine Series 6, Volume 26 July 1913, p. 1-25)
And that is precisely what Anthony's graph above clearly shows ("saturation", you know..).
That is why the following words are anything but science:
We conclude that global warming of more than ≈1°C, relative to 2000, will constitute “dangerous” climate change as judged from likely effects on sea level and extermination of species.
And, for sure, its author/s will be totally forgotten except for psychiatry books.

Jim
July 16, 2009 6:19 am

Supercritical (01:03:54) : My question also! Water is the chameleon in the workings of climate, and clouds certainly are aerosols. Who knows how the “climatologists” define aerosols. But clouds are a suspension of liquid water particles in air and meet the definition of an aerosol. No one has explained why a cloud aerosol should behave differently than a sulfate-based aerosol, for example. Of course there are many different kinds of clouds. The bigger ones seem to reflect all visible frequencies. It seems some of the thinner, higher ones would behave as sulfate-based aerosols.

Jim
July 16, 2009 6:21 am

DJ (01:51:27): To read the Nature paper, you have to pay for it. This is OT, but it will be nice when sci papers are published online and are truly public.

Ron de Haan
July 16, 2009 6:22 am

h.oldeboom (00:10:58) :
“German (Dr. Heinz Hug) and American scientists already reported about the very little influence of CO2 in the 80’s and 90’s of the past century. Now it becomes true! Why? I think our miserable politicians are fearing the possible coming colder times and are confronted with an ill economy”.
h.oldeboom
Unfortunately these politicians currently represent a minority.
Everything now depends on the US Senate.
If they reject the new climate bill that is prepared right now, we have a slight chance.
If they pass this bill we are put in "Green Shackles"; see Vaclav Klaus's book.

Mike Monce
July 16, 2009 6:23 am

DJ,
I just went through the Nature paper you referenced. Two points: the satellite data the authors use has an uncertainty of 0.45-0.75 degree K, which by normal practice I assume to be 1 sigma. They state they see a significant change in the CO2 absorption over the 27-year period, which from their graph is about 1 degree in brightness temp. However, that must be put in context with the uncertainty of 0.75 degrees (1 sigma). So, at most the change is more like 0.25 degrees, and is perhaps even less if the uncertainty is carried to 2 sigma. After almost 30 years in the science business, I still don't understand how journals like Nature let authors get away with not explicitly putting error bars on their graphs.
Second, the authors state they don’t really know what the aerosol factor really is in the data, and the above post now speaks to that directly.
I will give you that the Hansen projection presented above is just one of the three.
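A quick sketch of that error-bar arithmetic, treating the quoted uncertainty as 1 sigma, as assumed above:

```python
# Error-bar arithmetic using the figures quoted above: an observed change
# of ~1 K in brightness temperature, with a stated measurement uncertainty
# of 0.45-0.75 K treated as 1 sigma.

observed_change_k = 1.0
sigma_k = 0.75  # worst case of the quoted 0.45-0.75 K range

for n_sigma in (1, 2):
    lower_bound = observed_change_k - n_sigma * sigma_k
    print(f"at {n_sigma} sigma the change could be as small as {lower_bound:+.2f} K")

# At 1 sigma the change could be as small as +0.25 K (the figure above);
# at 2 sigma it is -0.50 K, i.e. indistinguishable from no change at all.
```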

Bill Illis
July 16, 2009 6:26 am

There was another recent paper which used actual mass spectroscopy measurements of the atmosphere, which showed that soot (which produces slight warming) and sulfate aerosols (which produce cooling) combine quite rapidly chemically in the atmosphere to produce a net warming effect.
This is more consistent with the temperature experience of the regions which should have been most affected by sulfate aerosols: those areas, which should have seen the greatest cooling effect, have actually increased in temperature at a faster rate than areas not affected.
So generally, I think we need more actual measuring and fewer made-up theoretical forcings in the models.
http://www.pnas.org/content/early/2009/07/02/0900040106.abstract
http://www.sciencedaily.com/releases/2009/06/090629200808.htm

July 16, 2009 6:45 am

@DJ…
The "gagging" might be due to links. When that happens to me, I post a note to the mods asking them to check whether the Spam filter grabbed my post because of the links.
On Hansen's models… All three of his scenarios overshot the actual temperature data… badly overshot in the case of the satellite data, and IIRC even overshot the HadCRUT surface temp series.
On CO2 radiative saturation… The main frequencies are saturated, and most of CO2's bandwidths overlap the water vapor bandwidths. That's why CO2's greenhouse effect is logarithmic. Marginal CO2 warming occurs on the side lobes of its bandwidths.
If CO2 and temperature had a linear relationship, it would be a lot hotter now than it was in the last two interglacials. In fact, the last two interglacials were warmer than today, despite having significantly lower atmospheric CO2 levels.