Even on business as usual, there will be <1° Kelvin warming this century
By Christopher Monckton of Brenchley
Curiouser and curiouser. As one delves into the leaden, multi-thousand-page text of the IPCC’s 2013 Fifth Assessment Report, which reads like a conversation between modelers about the merits of their models rather than a serious climate assessment, it is evident that they have lost the thread of the calculation. There are some revealing inconsistencies. Let us expose a few of them.
The IPCC has slashed its central near-term prediction of global warming from 0.28 K/decade in 1990 via 0.23 K/decade in the first draft of IPCC (2013) to 0.17 K/decade in the published draft. Therefore, the biggest surprise to honest climate researchers reading the report is why the long-term or equilibrium climate sensitivity has not been slashed as well.
In 1990, the IPCC said equilibrium climate sensitivity would be 3 [1.5, 4.5] K. In 2007, its estimate was 3.3 [2.0, 4.5] K. In 2013 it reverted to the 1990 interval [1.5, 4.5] K per CO2 doubling. However, in a curt, one-line footnote, it abandoned any attempt to provide a central estimate of climate sensitivity – the key quantity in the entire debate about the climate. The footnote says the models cannot agree.
Frankly, I was suspicious about what that footnote might be hiding. So, since my feet are not yet fit to walk on, I have spent a quiet weekend doing some research. The results were spectacular.
Climate sensitivity is the product of three quantities:
• The CO2 radiative forcing, generally thought to be in the region of 5.35 times the logarithm of the proportionate concentration change – thus, 3.71 Watts per square meter for a doubling;
• The Planck, or instantaneous, or zero-feedback sensitivity parameter, which is usually taken as 0.31 Kelvin per Watt per square meter; and
• The system gain, or overall feedback multiplier, which allows for the effect of temperature feedbacks. The system gain is 1 where there are no feedbacks or they sum to zero.
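For readers who wish to check the multiplication, here is a quick sketch of my own in Python (not code from the IPCC), using only the values quoted above and the no-feedback gain of unity:

```python
import math

# The three quantities named above, at the IPCC's quoted values
forcing = 5.35 * math.log(2)   # CO2 doubling forcing, ~3.71 W/m^2
planck = 0.31                  # zero-feedback sensitivity, K per W/m^2
system_gain = 1.0              # no-feedback case: gain of unity

# Climate sensitivity is simply the product of the three
sensitivity = forcing * planck * system_gain
print(round(sensitivity, 2))   # ~1.15 K: the direct, feedback-free warming
```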
In the 2007 Fourth Assessment Report, the implicit system gain was 2.81. The direct warming from a CO2 doubling is 3.71 times 0.31, or rather less than 1.2 K. Multiply this zero-feedback warming by the system gain and the harmless 1.2 K direct CO2-driven warming becomes a more thrilling (but still probably harmless) 3.3 K.
That was then. However, on rootling through chapter 9, which is yet another meaningless expatiation on how well the useless models are working, there lies buried an interesting graph that quietly revises the feedback sum sharply downward.
In 2007, the feedback sum implicit in the IPCC’s central estimate of climate sensitivity was 2.06 Watts per square meter per Kelvin, close enough to the implicit sum f = 1.91 W m–2 K–1 (water vapor +1.8, lapse rate –0.84, surface albedo +0.26, cloud +0.69) given in Soden & Held (2006), and shown as a blue dot in the “TOTAL” column in the IPCC’s 2013 feedback graph (fig. 1):
Figure 1. Estimates of the principal positive (above the line) and negative (below it) temperature feedbacks. The total feedback sum, which excludes the Planck “feedback”, has been cut from 2 to 1.5 Watts per square meter per Kelvin since 2007.
Note in passing that the IPCC wrongly characterizes the Planck or zero-feedback climate-sensitivity parameter as itself being a feedback, when it is in truth part of the reference-frame within which the climate lives and moves and has its being. It is thus better and more clearly expressed as 0.31 Kelvin of warming per Watt per square meter of direct forcing than as a negative “feedback” of –3.2 Watts per square meter per Kelvin.
At least the IPCC has had the sense not to attempt to add the Planck “feedback” to the real feedbacks in the graph, which shows the 2013 central estimate of each feedback in red flanked by multi-colored outliers and, alongside it, the 2007 central estimate shown in blue.
Look at the TOTAL column on the right. The IPCC’s old feedback sum was 1.91 Watts per square meter per Kelvin (in practice, the value used in the CMIP3 model ensemble was 2.06). In 2013, however, the value of the feedback sum fell to 1.5 Watts per square meter per Kelvin.
That fall in value has a disproportionately large effect on final climate sensitivity. For the equation by which individual feedbacks are mutually amplified to give the system gain G is as follows:

G = 1 / (1 − g),     (1)

where g, the loop gain, is the product of the Planck sensitivity parameter λ0 = 0.31 Kelvin per Watt per square meter and the feedback sum f in Watts per square meter per Kelvin. The unitless overall system gain G was thus 2.81 in 2007 (with f = 2.06) but is just 1.88 now (with f = 1.5).
And just look what effect that reduction in the temperature feedbacks has on final climate sensitivity. With f = 2.06 and consequently G = 2.81, as in 2007, equilibrium sensitivity after all feedbacks have acted was then thought to be 3.26 K. Now, however, it is just 2.2 K. As reality begins to dawn even in the halls of Marxist academe, the reduction of one-quarter in the feedback sum has dropped equilibrium climate sensitivity by fully one-third.
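The same figures can be verified in a few lines of Python (a sketch of my own; with λ0 rounded to 0.31 the 2007 values come out a shade below the 2.81 and 3.26 quoted above, the small difference being rounding of the Planck parameter):

```python
import math

forcing = 5.35 * math.log(2)   # ~3.71 W/m^2 per CO2 doubling
lam0 = 0.31                    # Planck parameter, K per W/m^2

def system_gain(f):
    """Bode system gain G = 1 / (1 - g), with loop gain g = lam0 * f."""
    return 1.0 / (1.0 - lam0 * f)

G_2007, G_2013 = system_gain(2.06), system_gain(1.5)
sens_2007 = forcing * lam0 * G_2007   # ~3.2 K (cf. the 3.26 K above)
sens_2013 = forcing * lam0 * G_2013   # ~2.15 K: the "2.2 K" figure
print(round(G_2007, 2), round(sens_2007, 2), round(G_2013, 2), round(sens_2013, 2))
```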
Now we can discern why that curious footnote dismissed the notion of determining a central estimate of climate sensitivity. For the new central estimate, if they had dared to admit it, would have been just 2.2 K per CO2 doubling. No ifs, no buts. All the other values that are used to determine climate sensitivity remain unaltered, so there is no wriggle-room for the usual suspects.
One should point out in passing that equation (1), the Bode equation, is of general application to dynamical systems in which, if there is no physical constraint on the loop gain exceeding unity, the system response will become one of attenuation or reversal rather than amplification at loop-gain values g > 1. The climate, however, is obviously not that kind of dynamical system. The loop gain can exceed unity, but there is no physical reality corresponding to the requirement in the equation that feedbacks that had been amplifying the system response would suddenly diminish it as soon as the loop gain exceeded 1. The Bode equation, then, is the wrong equation. For this and other reasons, temperature feedbacks in the climate system are very likely to sum to net-zero.
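The objection can be illustrated numerically (my own two-line demonstration, not anything in the report): the Bode form responds discontinuously as the loop gain crosses unity, a behaviour with no obvious physical analogue in the climate.

```python
def bode_gain(g):
    """System gain in the Bode form, G = 1 / (1 - g)."""
    return 1.0 / (1.0 - g)

# Just below unity the loop strongly amplifies...
g_below = bode_gain(0.9)    # ~10.0
# ...but just past unity the sign flips: amplification abruptly becomes reversal
g_above = bode_gain(1.1)    # ~-10.0
print(g_below, g_above)
```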
The cut the IPCC has now made in the feedback sum is attributable chiefly to Roy Spencer’s dazzling paper of 2011 showing the cloud feedback to be negative, not strongly positive as the IPCC had previously imagined.
But, as they say on the shopping channels, “There’s More!!!” The IPCC, to try to keep the funds flowing, has invented what it calls “Representative Concentration Pathway 8.5” as its business-as-usual case.
On that pathway (one is not allowed to call it a “scenario”, apparently), the prediction is that CO2 concentration will rise from 400 to 936 ppmv; that including projected increases in CH4 and N2O concentration one can make that 1313 ppmv CO2 equivalent; and that the resultant anthropogenic forcing of 7.3 Watts per square meter, combined with an implicit transient climate-sensitivity parameter of 0.5 Kelvin per Watt per square meter, will warm the world 3.7 K by 2100 (at a mean rate equivalent to 0.44 K per decade, or more than twice as fast on average as the maximum supra-decadal rate of 0.2 K/decade in the instrumental record to date) and a swingeing 8 K by 2300 (fig. 2). Can They not see the howling implausibility of these absurdly fanciful predictions?
Let us examine the IPCC’s “funding-as-usual” case in a little more detail.
Figure 2. Projected global warming to 2300 on four “pathways”. The business-as-usual “pathway” is shown in red. Source: IPCC (2013), fig. 12.5.
First, the CO2 forcing. From 400 ppmv today to 936 ppmv in 2100 is frankly implausible even if the world, as it should, abandons all CO2 targets altogether. There has been very little growth in the annual rate of CO2 increase: it is little more than 2 ppmv a year at present. Even if we supposed this would rise linearly to 4 ppmv a year by 2100, there would be only 655 ppmv CO2 in the air by then. So let us generously call it 700 ppmv. That gives us our CO2 radiative forcing by the IPCC’s own method: it is 5.35 ln(700/400) ≈ 3 Watts per square meter.
We also need to allow for the non-CO2 greenhouse gases. For a decade, the IPCC has been trying to pretend that CO2 accounts for as small a fraction of total anthropogenic warming as 70%. However, it admits in its 2013 report that the true current fraction is 83%. One reason for this large discrepancy is that once Gazputin had repaired the methane pipeline from Siberia to Europe the rate of increase in methane concentration slowed dramatically in around the year 2000 (fig. 3). So we shall use 83%, rather than 70%, as the CO2 fraction.
Figure 3. Observed methane concentration (black) compared with projections from the first four IPCC Assessment Reports. This graph, which appeared in the pre-final draft, was removed from the final draft lest it give ammunition to skeptics (as Germany and Hungary put it). Its removal, of course, gave ammunition to skeptics.
Now we can put together a business-as-usual warming case that is a realistic reflection of the IPCC’s own methods and data but without the naughty bits. The business-as-usual warming to be expected by 2100 is as follows:
3.0 Watts per square meter CO2 forcing
x 6/5 (approximately the reciprocal of 83%) to allow for non-CO2 anthropogenic forcings
x 0.31 Kelvin per Watt per square meter for the Planck parameter
x 1.88 for the system gain on the basis of the new, lower feedback sum.
The answer is not 8 K. It is just 2.1 K. That is all.
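The whole chain can be checked in four lines of Python (my own sketch, using only the values set out above):

```python
import math

# Assumed business-as-usual concentration of 700 ppmv by 2100, as argued above
forcing_co2 = 5.35 * math.log(700 / 400)        # ~3.0 W/m^2
warming = forcing_co2 * (6 / 5) * 0.31 * 1.88   # non-CO2 uplift, Planck, system gain
print(round(forcing_co2, 2), round(warming, 1)) # warming comes out at ~2.1 K
```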
Even this is too high to be realistic. Here is my best estimate. There will be 600 ppmv CO2 in the air by 2100, giving a CO2 forcing of 2.2 Watts per square meter. CO2 will represent 90% of all anthropogenic influences. The feedback sum will be zero. So:
2.2 Watts per square meter CO2 forcing from now to 2100
x 10/9 to allow for non-CO2 anthropogenic forcings
x 0.31 for the Planck sensitivity parameter
x 1 for the system gain.
That gives my best estimate of expected anthropogenic global warming from now to 2100: three-quarters of a Celsius degree. The end of the world may be at hand, but if it is it won’t have anything much to do with our paltry influence on the climate.
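For completeness, the best-estimate chain above, again as a sketch in Python using only the stated assumptions (600 ppmv, a 90% CO2 fraction, zero net feedback):

```python
import math

forcing_co2 = 5.35 * math.log(600 / 400)         # ~2.2 W/m^2 for 600 ppmv by 2100
warming = forcing_co2 * (10 / 9) * 0.31 * 1.0    # zero net feedback: gain of 1
print(round(warming, 2))                         # ~0.75 K: three-quarters of a degree
```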
Your mission, gentle reader, should you choose to accept it, is to let me know in comments your own best estimate of global warming by 2100 compared with the present. The Lord Monckton Foundation will archive your predictions. Our descendants 85 years hence will be able to amuse themselves comparing them with what happened in the real world.
Mr Ard is indeed “missing something”: for he sees no purpose in demonstrating the falsity of a hypothesis that is not yet proven. However, nearly all hypotheses in the physical sciences are not susceptible of demonstration. The “global warming is going to be dangerous unless we shut down the capitalist West” hypothesis, though neither demonstrable nor at all likely to be true, is nevertheless the political credo of the Left, and is now one of their most sacred shibboleths. The Left are notoriously unsusceptible to reason. Accordingly, it is necessary for us here to demonstrate, over and over again, the increasingly glaring scientific defects in the scientific case for what is at root not a scientific but a political theory. Since this particular – and particularly poisonous – political theory is rooted in scientific error, this is one political theory that, in the end, will not survive: for it is already evident even to the promoters of the climate scare that they must begin to modify their absurd claims, as the head posting demonstrates.
Thank you, sir. May I have another?
Sensitivity is not hard to determine if you ignore all the multitudinous side issues brought out above. First, we must determine what temperature change, if any, takes place when atmospheric carbon dioxide increases. Theory tells us there must be a monotonic increase of temperature with increase of carbon dioxide in air. If you ignore its seasonal wiggle, the Keeling curve shows an extremely smooth, upward-curving history of atmospheric carbon dioxide. If we extend the Keeling curve back from 1958 by using ice-core measurements, we can go back as far as we have temperature records. It turns out that there is no place on the corresponding global warming curve where the two have run parallel for the last two centuries. This in itself should have raised someone’s suspicions about the role of carbon dioxide in global warming. Unfortunately, all those great experts pulling salaries to study global warming have an idée fixe that greenhouse warming exists; don’t bother them with observations. That is not how science is done; that is pseudo-science.

Coming back to observations, it is not a secret that there has been no warming at all for the last 17 years. We are fortunate in being able to use this as a natural experiment. From the Keeling curve we observe that atmospheric carbon dioxide has continued to increase during this time, and has perhaps even accelerated a bit. The greenhouse theory these salary-men apply is the Arrhenius theory, according to which this increase of carbon dioxide should produce a noticeable temperature rise. But it does not. If your theory predicts warming and nothing happens for 17 years, it is time to get rid of that theory. It belongs in the dustbin of history.

The only theory that accurately explains what happens here is the Miskolczi greenhouse theory (MGT). It differs from the Arrhenius theory in being able to handle the general case where several greenhouse gases simultaneously absorb in the IR, something Arrhenius cannot do.
In such a case the gases present establish a joint optimal absorption window that they control. In the Earth’s atmosphere the gases that must be accounted for are carbon dioxide and water vapor. Their joint optimal absorption window has an IR optical thickness of 1.87, determined by Miskolczi from first principles. If carbon dioxide is now added to the atmosphere, it will start to absorb, just as Arrhenius says. But this will increase the optical thickness; and as soon as that happens, water vapor will start to diminish and rain out, and the original optical thickness is restored. This explains why global temperature today is not increasing despite the continual addition of carbon dioxide to the atmosphere.

Nor is this the first time it has happened. Using the NOAA radiosonde database, Miskolczi observed that IR absorption by the atmosphere stayed the same for 61 years while atmospheric carbon dioxide increased by 21.6 percent. This is the exact equivalent of what is happening today with the warming pause. It is clear from all this that, according to MGT, even if you should double the amount of carbon dioxide in the air the optical thickness will still be the same, there will be no additional absorption, and sensitivity will be zero. What this means for anthropogenic global warming, or AGW, will be left as an exercise for the student.
Mr Arrak’s comment on the Miskolczi findings is interesting, but it is not yet clear that infra-red absorption by the atmosphere has remained the same for several decades. Satellite measurements do show some attenuation in outgoing radiation, particularly at the CO2 absorption wavelengths. I don’t think we’ll be able to stand up his theory unless most of the outgoing-IR datasets confirm the absence of any change in outgoing long-wave radiation.
Rod Leman forcefully inquires: “Where is the extraordinary evidence to support the extraordinary claim of massive, international scientific corruption?”
Did you not see it or are you ignoring it, Rod? Here it is again, a so far unretracted paper that just months ago, not as ancient pre-Climategate history, but the very latest and widely celebrated alarm raising hockey stick in the very top journal Science, and the damn thing has no blade in any of the actual input data plotted directly from the paper’s own data archive:
http://s6.postimg.org/jb6qe15rl/Marcott_2013_Eye_Candy.jpg
This is but the latest of a long pattern. Alarmists act as if these scam studies are real. It’s a fraud, a money-fueled lame conspiracy of complacent gravy train rent seeking.
Another fraud was on the cover of Nature, an equally prominent scientific journal, that smeared hot red Antarctic Peninsula warming over the whole continent, in mathematically illegal fashion. Did any climate “scientist” catch this in peer review?
I’m sorry to sound so forceful, since you are new to the topic. When you say, “As an engineer who has worked with real scientists, I have found competency to be the most valued characteristic of peers and a deep respect for facts, truth, logic,” I hope you have enough open-minded respect for our best engineer of all, Burt Rutan, who is an outspoken “Show me the data!”
skeptic:
http://youtu.be/oxFm1TXshZY
The thing is, with agenda-laden funding a great conspiracy isn’t needed, just complacency as self-interest takes over, especially among third-rate scientists who want tens of millions in new funding so long as they keep quiet about the mere couple of dozen politically connected hockey-stick-team insiders. They merely need to study the effects of perfectly natural warming; there need not actually be “tens of thousands” of scientists producing the core deception that recent warming is outside natural variation, a deception maintained by ignoring the vast majority of proxy studies, which form bowls instead of hockey-stick handles.
Monckton of Brenchley says:
June 9, 2014 at 3:56 pm
Mr Arrak’s comment on the Miskolczi findings is interesting, but it is not yet clear that infra-red absorption by the atmosphere has remained the same for several decades. Satellite measurements do show some attenuation in outgoing radiation, particularly at the CO2 absorption wavelengths. I don’t think we’ll be able to stand up his theory unless most of the outgoing-IR datasets confirm the absence of any change in outgoing long-wave radiation.
_________________________
In that single paragraph, you proved that you apply an honest metric to climate discussions.
I shall comment here on the significance of CO2 in all these calculations. Whilst it is undeniable that CO2 should cause some warming, it is not clear that the atmospheric concentration of CO2 is responsible for the 70-odd to 80-odd percent implied by these calculations (whomsoever it was who did them).
I would suggest that the World’s average temperature (whatever that is supposed to mean) is not a result of rising CO2 concentrations at all, but that the reverse is true. When the World’s oceans heat up, dissolved CO2 effervesces into the atmosphere, and so the CO2 level increases mostly as a result of that process. When the World’s oceans cool down again, more CO2 can be dissolved, and concentrations in the atmosphere reduce as a result. Therefore I do not accept the premiss that CO2 will either double or be very much larger than it is today by the year 2100.
All this glib talk of global average temperatures, world CO2 averages and so on is actually largely meaningless. Temperatures, and indeed CO2, vary significantly at any single spot on this Earth on a monthly, weekly, daily or even hourly basis, by at least an order of magnitude more than the predicted variations discussed here. It’s the Sun and biosphere that do that, in synergy.

CO2’s significant effect on temperatures is a dead horse. Please stop flogging it.
If there were any net-positive feedback in the earth-sea-air climate system, then we should expect that at some point in the last 4 billion years the positive feedbacks would have produced a runaway excursion, i.e. the earth should have turned into Venus. There is no evidence of such an excursion, and therefore the feedbacks are ≤ 0.
I think Mosher and Rod Lehman have framed part of the problem with Climate Modelling exactly. To Rod, there does not have to be ANY corruption at all. All you need is for a large group of scientists to agree to a set of principles such as the exact mandate the IPCC has set for itself. It does not ask “if” climate change is human induced, it has already decided that the goal is to deal with man caused climate change. I can buy into that (deforestation, agriculture, land use, urban heat islands), but I have trouble with CO2 as a driver. Of course I am just an old engineer.
From Mosh: “That discussion would start with a list of assumptions.” Well, we all know the joke about assumptions. And like Mr. Lehman, I am an engineer and I can tell you sometimes it is more than assumptions, it is INSTRUCTIONS. I would hate to say how many times over 40 years I had to go read my code of ethics with respect to my duty to my client versus my duty to the public and my fellow engineers when I was given parameters for a project that I knew could be done by going outside the parameters. But my client (frequently government) wanted limited parameters for their own reasons.
It doesn’t take much for a whole group of people to have to live within artificially imposed design constraints that costs project a ton of money more than it should because the people with the money have a goal that is not necessarily visible to the designer/scientist/researcher. You are required to work within the parameters/assumptions given. You owe a duty of care to inform, but having done that, you soldier on or quit. I have fired clients, and I have worked at things I didn’t like but could accept.
Using Mr. Mosher’s example, I recall reading that in computerized Nato War games, Nato generals were a lot more likely to look at the Nuclear option when they were losing a ground war in Europe than any of the strategists thought. This was something that we were taught in University 50 years ago, to watch for preconditioning. Might have been in my “Philosophy of Science”, class, I don’t recall. (I looked for a reference but didn’t find one) When President Kennedy was assassinated and an announcement was made over the school PA system, many of us thought the announcement was going to be that ICBM’s were in the air as we had been conditioned for that announcement with our “duck and cover” training (useless as it would have been).
Assumptions and preconditioning can be deadly:
http://www.theatlantic.com/international/archive/2013/05/the-ussr-and-us-came-closer-to-nuclear-war-than-we-thought/276290/
The IPCC says the models can’t agree, so it shows no central estimate for climate sensitivity; yet its feedback calculations, as shown in figure 1, use exactly the same methodology as before and do show a central estimate. Hoist with their own petard?
Stephen Mosher: If you dont have a better approach.
Do you think that Christopher Monckton has demonstrated a better approach than the IPCC? Actually, he has the same approach, but different results.
What do you do when you can show that the best available theory is not accurate enough to be the foundation of policy? Pretend that it is?
I, for one, would make sure that Sen. James Inhofe and his staff and the staffs of the House members were aware of Christopher Monckton’s writing, so that when Obama proposes climate-control legislation to the Congress that legislation can be blocked. That way Obama’s pen can be curtailed.
Succinct, clear, and interesting summary of the IPCC shenanigans and the fundamental issues. Thank-you. I hope you make a speedy recovery.
Best wishes,
William
The observational data support the assertion that the planet resists forcing changes (Lindzen and Choi’s paper, for example, and Spencer’s paper on clouds) and that something inhibits the greenhouse forcing mechanism, which explains why the tropical troposphere has not warmed. The lower regions of the atmosphere are saturated owing to the overlap of the H2O and CO2 absorption bands; higher in the troposphere there is less water vapour, so CO2 should theoretically cause the most warming in the higher regions. That is not observed.
http://www-eaps.mit.edu/faculty/lindzen/236-Lindzen-Choi-2011.pdf
On the Observational Determination of Climate Sensitivity and Its Implications
Richard S. Lindzen1 and Yong-Sang Choi2
http://icecap.us/images/uploads/DOUGLASPAPER.pdf
A comparison of tropical temperature trends with model predictions
We examine tropospheric temperature trends of 67 runs from 22 ‘Climate of the 20th Century’ model simulations and try to reconcile them with the best available updated observations (in the tropics during the satellite era). Model results and observed temperature trends are in disagreement in most of the tropical troposphere, being separated by more than twice the uncertainty of the model mean. In layers near 5 km, the modelled trend is 100 to 300% higher than observed, and, above 8 km, modelled and observed trends have opposite signs. These conclusions contrast strongly with those of recent publications based on essentially the same data.
Based on increased sea ice in the Antarctic (in all months of the year) and a recovery of the Arctic sea ice, it appears the high-latitude regions have started to cool, which is a reversal of the warming trend of the last 70 years. The reversal is not surprising, as there are cycles of warming and cooling in the high-latitude regions of both hemispheres (the same regions that warmed the most in the last 70 years) in the paleo record that correlate with solar magnetic cycle changes.
Eliza says:
June 9, 2014 at 5:41 am
Google
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Google split their stock into Class A and C shares to maintain control. Their stock price has slid ever since. I sold for both financial and “ethical” reasons. I only use their search engine to double-check searches. When you get too deep in the mud, sometimes you lose your way for a while. Whatever happened to US anti-trust legislation anyway?
Uh-oh – running on my gen set, as lightning knocked the power out, and I don’t have my Wi-Fi on the gen set with my water pumps and refrigeration. Dang UPSs are beeping all over the house. This comment won’t be going out for who knows how long, and will likely be irrelevant by then (if not now😏)
Off to feed les animaux in the light. Maybe the power will be back in a few hours. Might have to throw a log on the fire tonight if I don’t get enough solar gain before dark. The joy of rural living.😊
[Congratulations on your successful! backup power supplies. .mod]
My estimate for year 2100 = 2.75 x 0.27 x 0.73 = 0.54 C
Details below
CO2 = 641 ppm (at the current growth rate of 0.54% per year, a factor of 1.0054, for 86 years)
CO2 = 712 (641 times 10/9 to allow for non-CO2 anthropogenic forcings)
Forcing = 2.75 W/m^2 (4.77 ln(712 / 400); Hitran 2008 shows 4.77 is closer than 5.35)
Planck parameter = 0.27 (based on Hitran 2008 and other calculations)
Combined Feedback factor = – 0.369 C / C (see below for details)
Feedback (system) gain = 1 / (1 + 0.369) = 0.73
Temperature increase (ignoring heat storage delay)
2.75 x 0.27 x 0.73 = 0.54 C
Feedback details in C / C, the correct way to define them
Water vapor: + 0.29 (Paper in progress. Based on Hitran 2008 with 5.3% / C water vapor increase and increased water vapor solar absorption. IPCC is 1.8 x 0.31 = +0.56).
Clouds: 0 (IPCC’s 0.69 x 0.31, or +0.21, is fudged. Probably negative, but I don’t know how to estimate it)
Lapse Rate: – 0.74 (Based on 6% / C evaporation increase. See my paper http://wattsupwiththat.com/2014/04/15/major-errors-apparent-in-climate-model-evaporation-estimates/
Albedo: +0.26 x 0.31 = +0.081 (use IPCC value)
Total feedbacks = 0.29 + 0 – 0.74 + 0.081 = – 0.369
Feedback (system) gain = 1 / (1 – (– 0.369)) = 1 / 1.369 = 0.73
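[This commenter’s chain of figures can be verified with a short Python sketch of my own, using only the values listed above:]

```python
# The commenter's feedback sum in C/C (dimensionless), per the list above
feedbacks = 0.29 + 0.0 - 0.74 + 0.081   # water vapor, clouds, lapse rate, albedo
gain = 1.0 / (1.0 - feedbacks)          # net-negative sum gives a gain below unity
warming = 2.75 * 0.27 * gain            # forcing x Planck parameter x gain
print(round(gain, 2), round(warming, 2))  # ~0.73 and ~0.54 C, as stated
```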
OK, I’ll bite:
CO2 peaks at 540 ppmv around 2070 AD, with a lot of uranium gone and peak coal arriving about then. Peak uranium will be past by then; thorium will be worked out. Remember, it is all about demographics: the population will be back down to 6 billion by 2101 (the turn of the next century). Thereafter CO2 will slowly drop. After the 2015-2030 cooling, the global temperature will be about 0.7 C higher than in 2001. Sea level will be up 180 mm.
@Crispin in Waterloo
But won’t all of the above you mentioned be largely irrelevant when we have all got our “Higgs Field” personal generators by then, and are able to extract power from “thin air” like Tesla did, back in the day?
It is just as likely as any of the scenarios mentioned so far.
Stephen Richards says:
June 9, 2014 at 11:34 am
Key word for you Sasha “measured”
****
Measuring the CO2 signal is not straightforward.
Atmospheric carbon dioxide levels for the last 500 million years
http://www.pnas.org/content/99/7/4167.full.pdf
The last 500 million years of the strontium-isotope record are shown to correlate significantly with the concurrent record of isotopic fractionation between inorganic and organic carbon after the effects of recycled sediment are removed from the strontium signal. The correlation is shown to result from the common dependence of both signals on weathering and magmatic processes. Because the long-term evolution of carbon dioxide levels depends similarly on weathering and magmatism, the relative fluctuations of CO2 levels are inferred from the shared fluctuations of the isotopic records. The resulting CO2 signal exhibits no systematic correspondence with the geologic record of climatic variations at tectonic time scales.

The CO2 signal derived from this analysis represents fluctuations at time scales greater than about 10 million years (My). Comparison with the geologic record of climatic variations (18) reveals no obvious correspondence.
See also page 3 section: Estimation of the CO2 Signal
****
On this site:
A Brief History of Atmospheric Carbon Dioxide Record-Breaking
http://wattsupwiththat.com/2012/12/07/a-brief-history-of-atmospheric-carbon-dioxide-record-breaking/
More on CO2 signal measurement…
Direct Observation of the Oceanic CO2 Increase Revisited
Peter Brewer, Catherine Goyet, Gernot Friederich
PNAS, 1997
http://www.pnas.org
http://www.nap.edu/openbook.php?record_id=6238&page=36
Here is another PDF (16 pages) of detailed CO2 signal measurements:
Characteristics of the atmospheric CO2 signal as observed over the conterminous United States during INTEX-NA
JOURNAL OF GEOPHYSICAL RESEARCH, VOL. 113, D07301, doi:10.1029/2007JD008899, 2008
http://nldr.library.ucar.edu/repository/assets/osgc/OSGC-000-000-003-124.pdf
…and another PDF (10 pages)
The non-steady state oceanic CO2 signal: its importance, magnitude and a novel way to detect it
B. I. McNeil and R. J. Matear
http://www.biogeosciences.net/10/2219/2013/bg-10-2219-2013.pdf
Even on business as usual, there will be <1° Kelvin warming this century
Kelvin doesn’t take the ° symbol, as it is an absolute temperature scale.
One prediction it is possible to make is that peak uranium will be some time in the third millennium.
Lord Monckton, you require a correction:
Your mention of “CO2” here should be “CH4”, I believe, unless you’re equating non-CO2 gases to CO2. Maybe I missed something?
Keep up the great work.
Mr Koenders has indeed “missed something”: in the passage he quotes, I am discussing CO2-driven forcing as a fraction of total anthropogenic forcing. If methane concentration does not grow as fast as predicted (as it is not growing), the non-CO2 fraction diminishes and the CO2 fraction correspondingly increases. It is now 83% of total anthropogenic forcing, though the IPCC’s estimate, based on wildly exaggerated predictions of increases in methane concentration, had been 70% in the Fourth Assessment Report.
I can report that 29 commenters made identifiable projections of global warming. The mean projection was for warming of one-eighth of a Celsius degree. I suspect that this value will prove to be considerably closer to the mark than the 3.7 K business-as-usual anthropogenic warming predicted by the IPCC. Time to apply for some funding for the WUWT suck-it-and-see model.
And no summary observation (or criticism) from you for those of us brave enough to predict either cooler temperatures or an official “we don’t know”?? 8<)
Lord Monckton;
No one deserves the Royal “we” more than you. Too bad science has become a religion and a tool of control. 100 years ago you would be universally recognized as a great mind.
bad timing
The truth is becoming more and more irrelevant. The scientific method died with the advent of the computer: the great tool has created a bunch of real Tools.