Since there has been a lot of discussion about Monckton here and elsewhere, I’ve offered him the opportunity to present his views here. – Anthony
Guest post by Christopher Monckton of Brenchley
At www.scienceandpublicpolicy.org I publish a widely-circulated and vigorously-debated Monthly CO2 Report, including graphs showing changes in CO2 concentration and in global mean surface temperature since 1980, when the satellites went on weather watch and the NOAA first published its global CO2 concentration series. Since some commenters here at Wattsup have queried some of our findings, I have asked Anthony to allow me to contribute this short discussion.
We were among the first to show that CO2 concentration is not rising at the fast, exponential rate that current anthropogenic emissions would lead the IPCC to expect, and that global temperature has scarcely changed since the turn of the millennium on 1 January 2001.
CO2 concentration: On emissions reduction, the international community has talked the talk, but – not least because China, India, Indonesia, Russia, Brazil, and South Africa are growing so quickly – it has not walked the walk. Accordingly, carbon emissions are at the high end of the IPCC’s projections, close to the A2 (“business as usual”) emissions scenario, which projects that atmospheric CO2 will grow at an exponential rate between now and 2100 in the absence of global cuts in emissions:
Exponential increase in CO2 concentration from 2000-2100 is projected by the IPCC on its A2 emissions scenario, which comes closest to today’s CO2 emissions. On the SPPI CO2-concentration graph, this projection is implemented by way of an exponential function that generates the projection zone. This IPCC graph has been enlarged, its ordinate and abscissa labeled, and its aspect ratio altered to provide a comparison with the landscape format of the SPPI graph.
On the A2 emissions scenario, the IPCC foresees CO2 rising from a measured 368 ppmv in 2000 (NOAA global CO2 dataset) to a projected 836[730, 1020] ppmv by 2100. However, reality is not obliging. The rate of increase in CO2 concentration has been slowing in recent years: an exponential curve cannot behave thus. In fact, the NOAA’s deseasonalized CO2 concentration curve is very close to linear:
CO2 concentration change from 2000-2010 (upper panel) and projected to 2100 (lower panel). The least-squares linear-regression trend on the data shows CO2 concentration rising to just 570 ppmv by 2100, well below the IPCC’s least estimate of 730 ppmv on the A2 emissions scenario.
The IPCC projection zone on the SPPI graphs has its origin at the left-hand end of the linear-regression trend on the NOAA data, and the exponential curves are calculated from that point so that they reach the IPCC’s projected concentrations in 2100.
We present the graph thus to show the crucial point: that the CO2 concentration trend is well below the least IPCC estimate. Some have criticized our approach on the ground that over a short enough distance a linear and an exponential trend may be near-coincident. This objection is more theoretical than real.
First, the fit of the dark-blue deseasonalized NOAA data to the underlying linear-regression trend line (light blue) is very much closer than it is even to the IPCC’s least projection on scenario A2. If CO2 were now in fact rising at a merely linear rate, and if that rate were to continue, concentration would reach only 570 ppmv by 2100.
Secondly, the exponential curve most closely fitting the NOAA data would be barely supra-linear, reaching just 614 ppmv by 2100, rather than the linear 570 ppmv. In practice, the substantial shortfall between prediction and outturn is important, as we now demonstrate. The equation for the IPCC’s central estimate of equilibrium warming from a given rise in CO2 concentration is:
∆T = 4.7 ln(C/C0),
where C/C0 is the proportionate increase in CO2 concentration. Thus, at CO2 doubling, the IPCC would expect 4.7 ln 2 = 3.26 K of warming – around 5.9 F° (IPCC, 2007, ch. 10, p. 798, box 10.2). On the A2 scenario, CO2 is projected to more than double: equilibrium warming would be 3.86 K, and transient warming would be a little under 0.5 K less, at about 3.4 K.
But if we were to take the best-fit exponential trend on the CO2 data over the past decade, equilibrium warming from 2000-2100 would be 4.7 ln(614/368) = 2.41 K, comfortably below the IPCC’s least estimate and a hefty 26% below its central estimate. Combine the IPCC’s apparent overestimate of CO2 concentration growth with the fact that the IPCC’s own methods for determining climate sensitivity, applied to the observed increases in the concentrations of CO2 and five other climate-relevant greenhouse gases over the 55 years 1950-2005, would project a transient warming 2.3 times greater than the observed 0.65 K, and anthropogenic warming over the 21st century could be as little as 1 K (less than 2 F°), which would be harmless and beneficial.
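For readers who want to check the arithmetic, here is a minimal Python sketch reproducing the figures quoted above. The 4.7 coefficient, the 368 ppmv baseline, and the 2100 concentrations (570, 614, 836 ppmv) are taken from the discussion; nothing else is assumed.

```python
import math

# Central-estimate warming formula quoted above: dT = 4.7 ln(C / C0)
def equilibrium_warming(c_ppmv, c0_ppmv=368.0):
    """Equilibrium warming (K) for a rise in CO2 from c0_ppmv to c_ppmv."""
    return 4.7 * math.log(c_ppmv / c0_ppmv)

print(f"CO2 doubling:                {equilibrium_warming(2 * 368.0):.2f} K")  # ~3.26 K
print(f"IPCC A2 central (836 ppmv):  {equilibrium_warming(836.0):.2f} K")      # ~3.86 K
print(f"Best-fit exponential (614):  {equilibrium_warming(614.0):.2f} K")      # ~2.41 K
print(f"Linear trend (570 ppmv):     {equilibrium_warming(570.0):.2f} K")      # ~2.06 K
```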
Temperature: How, then, has observed, real-world global temperature responded?
The UAH satellite temperature record shows warming at a rate equivalent to 1.4 K/century over the past 30 years. However, the least-squares linear-regression trend is well below the lower bound of the IPCC projection zone.
The SPPI’s graph of the University of Alabama at Huntsville’s monthly global-temperature anomalies over the 30 years since 1 January 1980 shows warming at a rate equivalent to 1.4 K/century – almost double the rate for the 20th-century as a whole. However, most of the warming was attributable to a naturally-occurring reduction in cloud cover that allowed some 2.6 Watts per square meter of additional solar radiance to reach the Earth’s surface between 1981 and 2003 (Pinker et al., 2005; Wild et al., 2006; Boston, 2010, personal communication).
Even with this natural warming, the least-squares linear-regression trend on the UAH monthly global mean surface temperature anomalies is below the lower bound of the IPCC projection zone.
Some have said that the IPCC projection zone on our graphs should show exactly the values that the IPCC actually projects for the A2 scenario. However, as will soon become apparent, the IPCC’s “global-warming” projections for the early part of the present century appear to have been, in effect, artificially detuned to conform more closely to observation. In compiling our graphs, therefore, we decided not simply to accept the IPCC’s projections as a true representation of the warming that its own methods for determining climate sensitivity would lead us to expect, but instead to establish just how much warming the use of those methods would predict, and to take that warming as the basis for the IPCC projection zone.
Let us illustrate the problem with a concrete example. On the A2 scenario, the IPCC projects a warming of 0.2 K/decade for 2000-2020. However, given the IPCC’s projection that CO2 concentration will grow exponentially from 368 ppmv in 2000 towards 836 ppmv by 2100, CO2 should have been 368 exp[(10/100) ln(836/368)] = 399.5 ppmv in 2010, and equilibrium warming should thus have been 4.7 ln(399.5/368) = 0.39 K, which we reduce by one-fifth to yield transient warming of 0.31 K, more than half as much again as the IPCC’s 0.2 K. Of course, CO2 concentration in 2010 was only 388 ppmv, and, as the SPPI’s temperature graph shows (this time using the RSS satellite dataset), warming occurred at only 0.3 K/century: about a tenth of the transient warming that use of the IPCC’s methods would lead us to expect.
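The same projection check can be scripted. Here is a short sketch of the calculation in the preceding paragraph, assuming only the exponential interpolation between the stated endpoints and the one-fifth transient reduction described above.

```python
import math

C0, C2100 = 368.0, 836.0   # ppmv in 2000 and projected for 2100 on A2 (from the text)

# CO2 in 2010 on an exponential path from C0 to C2100:
# C(t) = C0 * exp((t / 100) * ln(C2100 / C0))
c_2010 = C0 * math.exp((10 / 100) * math.log(C2100 / C0))   # ~399.5 ppmv

dT_equilibrium = 4.7 * math.log(c_2010 / C0)                # ~0.39 K
dT_transient = 0.8 * dT_equilibrium                         # ~0.31 K (one-fifth less)

print(f"Projected CO2 in 2010:         {c_2010:.1f} ppmv")
print(f"Equilibrium warming 2000-2010: {dT_equilibrium:.2f} K")
print(f"Transient warming 2000-2010:   {dT_transient:.2f} K")
```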
Barely significant warming: The RSS satellite data for the first decade of the 21st century show only a tenth of the warming that use of the IPCC’s methods would lead us to expect.
We make no apology, therefore, for labelling as “IPCC” a projection zone that is calculated on the basis of the methods described by the IPCC itself. Our intention in publishing these graphs is to provide a visual illustration of the extent to which the methods relied upon by the IPCC itself in determining climate sensitivity are reliable.
Some have also criticized us for displaying temperature records for as short a period as a decade. However, every month we also display the full 30-year satellite record, so as to place the current millennium’s temperature record in its proper context. And our detractors were somehow strangely silent when, not long ago, a US agency issued a statement that the past 13 months had been the warmest in the instrumental record, and drew inappropriate conclusions from it about catastrophic “global warming”.
We have made one adjustment to please our critics: the IPCC projection zone in the SPPI temperature graphs now shows transient rather than equilibrium warming.
One should not ignore the elephant in the room. Our CO2 graph shows one elephant: the failure of CO2 concentration over the past decade to follow the high trajectory projected by the IPCC on the basis of global emissions similar to today’s. As far as we can discover, no one but SPPI has pointed out this phenomenon. Our temperature graph shows another elephant: the 30-year warming trend – long enough to matter – is again well below what the IPCC’s methods would project. If either situation changes, followers of our monthly graphs will be among the first to know. As they say at Fox News, “We report: you decide.”
I’m going to show you a few examples of how a climate model is actually constructed and then we will compare the current temperature trends to the projections. This should help bring you out of the spun explanations of 10-year downturns that you have been subjected to by the spinners.
I’m going to pick on GISS partly because they produce world-class climate models and partly because they actually made the data available (hats off to them on that).
First, the two main components of the 2004 version of GISS Model E – the GHG forcing temperature impact and the non-GHG forcing temperature impact (I can show all the individual components separately if someone wants).
http://img183.imageshack.us/img183/6131/modeleghgvsotherbc9.png
There is some small randomness, but it follows the script to the letter: “forcings” drive the climate models, not random butterfly-effect initial-condition fluctuations.
The big negatives offsetting the big positive CO2/GHG impact are volcanoes (which are clear enough) and sulfur aerosols (both their direct and indirect effects).
This is the actual aerosol forcing built into this world-class 2004 climate model. GISS is the source of the aerosol forcing used in many other models. It is a joke: it is -1.8 Watts/m2 and it is just made up.
http://img83.imageshack.us/img83/7408/modeleforcing.gif
Let’s compare the aerosol forcing to the GHG forcing as a percentage. It is also a joke: aerosols caused all the cooling around 1900, for example, and they are really the balancing factor used to make the hindcast come even close to the actual temperatures recorded.
http://img101.imageshack.us/img101/6042/modeleaerosols.gif
So let’s extend the various components of GISS Model E out into the future (because these are easy to decipher). GISS Model E would be at about +0.8C today (only 6 years later) while the El Nino-impacted temperature in July is only +0.55C. Next March, when the current La Nina has its biggest impact and temperatures are down a further 0.2C to 0.3C from today, GISS Model E will be off by 0.5C or so.
http://img175.imageshack.us/img175/2107/modeleextraev0.png
Let’s use the more up-to-date GISS models that were used in the IPCC AR4. Sure, there is randomness, but there are clearly only positive trends over 10 years. And yes, these models are off by 0.2C or so just a few years later and will be off by 0.4C by next March.
http://img189.imageshack.us/img189/7442/gissar4forecasts.png
[Unfortunately, I lost the spreadsheet these charts came from and can’t update the charts themselves, but it is easy enough for one to download GISTemp].
http://data.giss.nasa.gov/gistemp/graphs/Fig.C.lrg.gif
Bill,
I followed you just fine right up until you extended the GHG and non-GHG forcings into the future – purely by extrapolating the curves based on their recent shape? Or did you plug in actual calculated values to make those curves? You didn’t show your work at that point to justify those smooth curves past 2003.
Possibility 1, you have a data source for those curves… easy to fix, just show the link or cite the paper.
Possibility 2, you fit a curve to the most recent data points, and extended it into the future. Not cool, IMO.
Solar cycle 23 has been longer and TSI a bit lower than recent cycles, China has fired up a whole batch of coal-fired power plants with no scrubbers that have pumped out a lot of aerosols, methane stopped growing suddenly at about that time, etc…
There are lots of known variables that have, you know, varied a lot recently. You have assumed that the most recent trends just continued. Your data from GISS ends at the top of solar cycle 23. Extrapolating the non-GHG forcings past 2003 without any data is equivalent, in part, to assuming that cycle 23 never ended and just kept on increasing. It didn’t.
But I like your approach, overall… just find us some post-2003 forcings data.
Looking at the GISS txt file of forcings, it looks like a deeper-than-normal solar minimum should be about a 0.15 to 0.2 reduction from the peak. With your GHG forcings at +1.1 (reading off your graph around 2008), non-GHG forcings should be pushed further negative, to about -0.5 to -0.6, due to the solar minimum, for a sum of about +0.5 to +0.6… right on the actual temp for 2008 and 2009.
Do you agree?
Brad
Barry Bickmore says:
August 19, 2010 at 12:39 pm
Yeah, that’s about the level of analysis I’ve come to expect of Tamino. Smooth the hell out of the data and don’t tell anyone how you dealt with the endpoints. In fact, don’t explain anything. Just present your data massages and demand everyone bow down before them.
I have plotted the data myself, and I have put it through my filters. What I see is a definite deceleration in the last 20 odd years.
‘… a few years of data don’t mean ANYTHING in terms of “climate”.’
That depends on what it is you are trying to discern, and how large the signal is compared to the noise. Tell me, how precisely did you decide that a few decades of increasing temperatures were enough to declare that the Earth had an anthropogenically induced fever? Especially when there are major climate cycles within historical data which are as long or longer than the record of highly accurate observation?
Bill Illis
“First, the feedback is 2 not 3.
Second, the feedback is not happening to date.”
Let’s just take these two for now. IPCC has CO2 doubling leading to 2-4.5 C. Since CO2 doubling with no feedback is about 1 C, this implies a feedback of 2-4.5. So, 3 is the geometric mean, 3.25 (corresponding to Monckton’s value) is the arithmetic mean. Where do you get 2 from? Obviously 836/390 is more than doubling, so the warming would be easily more than the 3 or 3.25 C doubling estimates.
Second, the UAH lower troposphere warming smoothed to decadal scales (and no UHI for satellites, we should agree) is 0.5 C between 1979 and 2009. From the CO2 change alone, we expected 0.2 C with no feedback. The factor looks like 2.5 to me, and that is ignoring aerosol increases (if any) that would tend to reduce this number.
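For what it’s worth, here is a rough back-of-envelope check of that arithmetic in Python; the 1979 and 2009 CO2 values are approximate annual means I am assuming, and the ~1 C-per-doubling no-feedback figure is the one stated above.

```python
import math

# Approximate annual-mean CO2 (ppmv) -- assumed values, not quoted in the comment
co2_1979, co2_2009 = 337.0, 387.0

# No-feedback warming, taking roughly 1 C per CO2 doubling as stated above
no_feedback = 1.0 * math.log(co2_2009 / co2_1979) / math.log(2)   # ~0.2 C

observed = 0.5   # C, smoothed UAH lower-troposphere change 1979-2009 (from the comment)

print(f"No-feedback expectation: {no_feedback:.2f} C")
print(f"Implied feedback factor: {observed / no_feedback:.1f}")    # ~2.5

# Feedback range implied by the IPCC's 2-4.5 C per doubling over ~1 C with no feedback
print(f"Geometric mean of 2 and 4.5:  {math.sqrt(2 * 4.5):.2f}")   # 3.00
print(f"Arithmetic mean of 2 and 4.5: {(2 + 4.5) / 2:.2f}")        # 3.25
```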
I would say the present warming rate is consistent with CO2 and significant feedback, and other evidence like stratospheric cooling corroborates the mechanism, while disproving ocean or solar explanations.
Jim D:
Warming of the troposphere has ceased and cooling may well be imminent. On some measures it has already begun.
The stratosphere stopped cooling 10 years ago and now shows slight warming.
Thus proving ocean and solar explanations.
Do catch up at the back there.
Joel Shore says:
August 19, 2010 at 2:18 pm
“What sort of behavior of CO2 levels over, say, the next 5 years would cause you to rethink your views on the cause of rising CO2 levels…Or, do you think that that is just too short a period to verify either way?”
If I saw a large CO2 event, such as the eruption of a volcano, cause an immediate, discernible, and persistent step change in overall CO2 levels, that would influence my thinking. But, it hasn’t happened before, so why should it in the next 5 years?
If I saw some similar spectral characteristics (doesn’t anyone in climate science know how to do a freaking PSD?) between the estimates of anthropogenically released CO2 and the measured CO2, that would give me pause. But, again, it hasn’t happened before, so why should it in the next 5 years?
And, if someone could come up with a reliable method for estimating atmospheric CO2 for at least several centuries before 1958 which showed conclusively that the current runup is unprecedented in magnitude and speed, that would have an impact. Again, it hasn’t happened before, so why should it in the next 5 years?
There are a few other items, but they’re all “it hasn’t happened before, so why should it in the next 5 years?”
Stephen Wilde,
Did you even notice that solar minimum we just had? These tend to cause pauses, but it should be back on track in the next ramp-up. It all makes sense.
Actually I think it is quite interesting about the solar cycle, that in terms of forcing it only can account for 0.05 C peak to peak, but we see 0.15-0.2 C, almost like a feedback amplification of 3 to 4. Why don’t the negative feedbacks favored here affect that too?
Bart says:
That depends on what it is you are trying to discern, and how large the signal is compared to the noise. Tell me, how precisely did you decide that a few decades of increasing temperatures were enough to declare that the Earth had an anthropogenically induced fever? Especially when there are major climate cycles within historical data which are as long or longer than the record of highly accurate observation?
Actually, what impresses me most is the ability of current climate models to do a decent job reproducing known variation during the glacial-interglacial cycles. They would have to be fairly robust to do that.
Brad Beeson,
On the forcings going out from 2003.
The GHG forcing is following CO2 very closely with a certain formula (4.053 ln(CO2) − 23), which is actually the temperature response the theory says should happen, with a lag of about 20 years for ocean heat uptake, so it makes complete sense to me that it would be a simplified form of the more detailed formulae that GISS was using. I have the actual CO2 numbers going out and the trendlines going out, so that part is solid [I dropped the impact very slightly to take into account methane starting to stabilize earlier than expected].
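Here is a minimal sketch of that simplified formula; the CO2 values below are just illustrative approximate annual means (my assumption), not the actual series from my spreadsheet.

```python
import math

def ghg_temperature_impact(co2_ppmv):
    """Simplified GHG-driven temperature impact (C) from the formula quoted above."""
    return 4.053 * math.log(co2_ppmv) - 23.0

# Illustrative (approximate) annual-mean CO2 concentrations, ppmv
for year, co2 in [(2003, 375.0), (2008, 385.0), (2010, 389.0)]:
    print(f"{year}: CO2 = {co2:.0f} ppmv -> GHG impact ~ {ghg_temperature_impact(co2):+.2f} C")
```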
The one big impact from volcanoes is no longer a forcing. It reached zero by 2000 and there have been no sulfur-rich volcanic eruptions since Pinatubo.
Aerosol forcing has been flat over the last several years (technically, the sulfur levels collected from ice cores show that it is dropping fast right now), but I just left it at the level where GISS had already flatlined it.
Land-use, black carbon and the other small forcings had a certain trend so I just extended that. It is a slight decline overall.
Solar forcing is a question. The quiet Sun says that it might have gone down, but TSI did not actually decline by any more than the usual solar cycle. At one time, I built in a small 0.05C decline, but after the SORCE TSI instrument showed TSI going back up again, close to the usual trend, I put it back in as a small 11-year forcing cycle. It is not a big number +/- over a cycle.
Which was one of my points above in this thread: there is no negative forcing happening right now, so the models should be going up. Even black carbon, which is increasing in Asia, is more of a positive forcing than a negative one.
So, the extension was done the right way in my opinion.
Bart says:
That’s strange… The things you propose as capable of changing your thinking are, as far as I know, not things that any of the scientists who understand that the current run-up in CO2 is due to humans would say are possible. In other words, the only way your thinking will change is if the carbon cycle starts behaving in ways that scientists don’t even expect it to. I guess that is a good way to guarantee that your thinking will remain ossified.
What do you mean by similar spectral characteristics? Over what time (or frequency) scales? Of course, the runup in CO2 has very closely matched the anthropogenic release… but are you saying that you expect variations in CO2 on time scales of, say, months to match that… even though we know that such time scales are dominated by annual cycles in the carbon uptake? Again, you seem to be proposing something that is in contradiction with the science, guaranteeing that your thinking is not going to change.
That’s already been done to most scientists’ satisfaction.
It is interesting because it sounds as if even though CO2 will inevitably keep rising, that is not going to change your thinking and you are going to continue to believe that in some not-too-distant future it will change. Good luck with that!
Bill,
Thanks for your thoughtful reply explaining your thinking. I agree with most of it.
In fact, one point you made two posts ago – that the aerosol and indirect-aerosol forcings seem fudged to make the curve fit – I think is a strong possibility. In the most recent years in the GISS text file of forcings, they have held aerosols, indirect aerosols and black carbon almost constant. Yet we know that China has exponentially grown its combustion of coal and production of cement, which are quite dirty activities. Also, the Soviet Union’s dissolution crashed the consumption of energy in the Eastern Bloc. I would expect these big changes to be reflected in the GISS table, but they are not. They do admit they have no way to measure these forcings directly.
On the solar cycle, I think you are clearly wrong to assume this cycle 23 is different from all previous cycles. Look at the GISS forcings for solar. Each cycle has a regular pulse – a low of about 0.1 to 0.2, followed by a peak at 0.25 to 0.27 – so why do you think this cycle is different? I don’t think you have justified your assumption at all.
Can you show that TSI at this cycle low is different from previous lows? I think it has declined just as before. Can you show that the reduction has had no effect THIS time? I don’t see how you could possibly show that with a physics-based argument.
Brad
Sorry, that should read “a low of 0.1 to 0.12” for the forcings at the solar cycle minimum… for a peak-to-valley difference of about 0.1 to 0.15.
You read too fast, Brad – I left it as a normal cycle.
OK, so shouldn’t the non-GHG curve dip down about .1 to .15 from 2003 to 2009 (from -0.39 to -0.54 ish)? You show no dip at all.
Assuming everything else is constant (which I don’t like, given the huge changes in energy use and mix), a decline in solar forcing of 0.1 to 0.15 should have dropped your non-GHG forcing curve by the same amount, n’est-ce pas? And if you update the temp curve to the present day, then we seem to be above the GISS model prediction. We spend some time above the model and some time below. Given the unmeasurable forcings, that seems pretty good.
I still agree that the model with unmeasurable forcings is not really very useful, especially if they have derived those forcings ex post facto by assuming the model is correct and calculating what those forcings “must have been”.
I would much prefer a model that shows some ranges for the estimated forcings. Imagine a model that shows the non-GHG line as a wide band, reflecting our uncertainty in what those unmeasured aerosol and black carbon forcings really are.
What do you think?
Joel Shore says:
August 20, 2010 at 8:24 am
“You are proposing something that would change your thinking that, as far as I know, are not things that any of the scientists who understand that the current run-up in CO2 is due to humans would say are possible.”
Ah, I see. So, you are a disciple of the “magic anthropogenic CO2” cult, which holds that anthropogenic CO2 builds up, but natural additions don’t, because, well, it’s just different somehow.
“Of course, the runup in CO2 has very closely much the anthropogenic release…but are you saying that you expect variations in CO2 on time scales of, say, months to match that…even though we know that such time scales are dominated by annual cycles in the carbon uptake?”
You seem to be uninformed about Fourier decomposition and systems theory. If you see a 1-year, 11-year, 20-year, or whatever cycle in the input, you MUST see it in the output (as well as possibly integer harmonics of it, depending on the linearity of the system), scaled according to how sensitive the system is to that cycle.
You need to be very specific about what you mean by “dominates”. To the naked eye, you may not see a particular component, but it will come through loud and clear in a spectral decomposition. If you gave me a time series with a sinusoid of amplitude 1,000,000 at 1 Hz and one of amplitude 1 at 5 Hz, I could still very easily pick out that 5 Hz signal in a PSD. And I could create a filter which would isolate it. You do this kind of thing every day when you listen to the radio or talk on your cell phone.
If you had a daily familiarity with such concepts, you would think it very odd that the CO2 system is apparently only sensitive to dc anthropogenic forcing, and almost totally filters out any other cycles.
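If anyone wants to try the 1 Hz / 5 Hz example for themselves, here is a minimal sketch in Python/SciPy; it is my construction of the example, with an assumed 100 Hz sample rate and 200 s of data.

```python
import numpy as np
from scipy.signal import periodogram

fs = 100.0                       # sample rate, Hz (assumed for the example)
t = np.arange(0, 200, 1 / fs)    # 200 s of data

# A huge 1 Hz tone plus a tiny 5 Hz tone, as in the example above
x = 1e6 * np.sin(2 * np.pi * 1.0 * t) + 1.0 * np.sin(2 * np.pi * 5.0 * t)

# Power spectral density (Hann window keeps leakage from the big tone down)
f, pxx = periodogram(x, fs, window="hann")

i5 = np.argmin(np.abs(f - 5.0))
background = np.median(pxx[(f > 3.0) & (f < 4.5)])
print(f"PSD at 5 Hz:       {pxx[i5]:.3g}")
print(f"Background nearby: {background:.3g}")   # the 5 Hz line stands out by orders of magnitude
```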
“That’s already been done to most scientists satisfaction.”
They are not worthy of the appellation, then.
Barry Bickmore says:
August 20, 2010 at 3:39 am
“Actually, what impresses me most is the ability of current climate models to do a decent job reproducing known variation during the glacial-interglacial cycles. They would have to be fairly robust to do that.”
What is essentially a multivariable curve fit, fits the data. Who’da thunk it?
Now I feel really stupid. I just realized that the GISS forcings data is given in W/m^2, while your graph is translating that into temperature effect. Please forgive my confusion, and now I see my questions are totally invalid.
Let me go back and refresh my memory on how to translate W/m^2 into temps before I say anything else stupid…
GISS Model E solar forcing temperature impact is generally about +/-0.025C. I now see I didn’t include that because it would have shown up even though it is exceedingly small. I did at one time; maybe I took it out when I made the last change not including a downturn in TSI. (It is essentially flat anyway, since technically, no one can find the solar cycle signal in the surface temperature series. The solar cycle shows up in the high stratosphere temperatures, but it is not big enough to rise above the noise even in very sensitive tests. There are hints of one every now and again, but there is not enough consistency to be clear. That at least means it is very small.)
Turning W/m2 into actual temperatures, however, is a very, very, very tricky business which is at the heart of global warming theory itself. It is not well known but this is where the biggest potential error factor is in the theory and the climate models.
The Stefan Boltzmann equations say that each extra 1 W/m2 changes surface temperatures by 0.18C.
The same formula says that temperatures in the lower troposphere change by 0.265C for each W/m2 change – the Planck response. If the lapse rate stayed constant (as is assumed in the theory), the surface should warm by the same 0.265C rather than 0.18C. (As an aside, the Stefan-Boltzmann equations seem to indicate the lapse rate will change, and I think we are seeing that, but that is for another day.)
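A quick way to see where those two numbers come from is to differentiate the Stefan-Boltzmann law, dT/dF = T / (4 σ T^4). The ~288 K surface and ~255 K effective emission level used below are the usual textbook values, my assumption here rather than anything from the GISS documentation.

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def planck_sensitivity(temp_k):
    """dT/dF in K per W/m^2 for a black body: T / (4 * sigma * T^4)."""
    return temp_k / (4.0 * SIGMA * temp_k ** 4)

# Illustrative temperatures (assumed, not from the comment):
for label, temp_k in [("surface (~288 K)", 288.0), ("emission level (~255 K)", 255.0)]:
    print(f"{label}: {planck_sensitivity(temp_k):.3f} K per W/m^2")
# -> roughly 0.18 at the surface and 0.27 at the emission level
```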
Inevitably, there are also feedbacks to take into account beyond Stefan Boltzmann.
1 W/m2 of GHG forcing could translate into more water vapor thus increasing the net effect. There could be an immediate hourly feedback impact, a medium term monthly feedback impact and even a long-term 1500 year impact from ocean heat up-take, glacial melt and vegetation/albedo impacts.
Generally, GISS Model E is using 0.32C per W/m2 as a short-term impact (except for volcanic forcing which is only 0.15C per W/m2 for reasons which are not explained sufficiently in my opinion – the impact is, in fact, less than 0.15C per W/m2 for volcanoes so they have just adjusted it down – all the climate model simulations that say they have reproduced Pinatubo for example have fudged their numbers to take this into account).
I don’t know if the 0.32C value increases in GISS’s simulations for the future years beyond 2003, but it must if they are going to reach the +3.0C per doubling. It does not in 2003, at least.
In the long term, global warming theory says the Charney 30-year-lag equilibrium figure is 0.75C per extra W/m2 of additional forcing, including feedbacks. This increases to about 1.0C per extra W/m2 after 140 years and is as high as 1.5C per extra W/m2 in the long-term, 1500-year equilibrium (although this figure has not been officially confirmed as a consensus).
I used the actual reported temperatures by GISS in the charts which are generally 0.32C per 1 W/m2.
Bart says:
No. It is just that the volcanic inputs are much smaller. And year-to-year variability in the CO2 rise is more determined by the sinks than the sources.
But, what sort of human cycle is there that you would be looking for? And, how would it be affected by the fact that you have spatial variation as well as temporal variation? I am not saying it is impossible to do such detection. I am just saying that it is not clear that it is possible…and you would have to provide evidence that such cycles should be detectable if the rise in CO2 is anthropogenic (as almost all of us know it is). Arguing that they haven’t detected something that you haven’t demonstrated should be detectable if the CO2 rise is anthropogenic is just talking through your hat.
But, hey, don’t let me discourage you from arguing far and wide that the rise in CO2 is not anthropogenic and that it will start decreasing soon. I am sure this will make a suitable impression on actual scientists and policymakers. They will certainly know how seriously to take anything else you might say on the subject.
Joel Shore says:
August 21, 2010 at 5:11 pm
“No. It is just that the volcanic inputs are much smaller.”
But, we CAN see their effect in the CO2 record. They are transient, and incompatible with the notion of a long residence time.
“But, what sort of human cycle is there that you would be looking for?”
If you do a PSD of the estimates of anthropogenic output, you will see particular frequency spikes. If you do a PSD of the Mauna Loa or other CO2 measurement data, you will see other spikes. It is simply not compatible with the assumption of anthropogenic dominance in the forcing that they should be different.
“I am sure this will make a suitable impression on actual scientists and policymakers.”
That is the whole problem with the politicization of Science. It becomes more important for practitioners to impress other scientists and policymakers than it does to seek the truth, and we reach a point of rampant confirmation bias and censorship of opposing views and data.
Bart says:
Do you have some sort of cite for this?
You would have to show this, together with modeling evidence that the standard carbon cycle point-of-view would predict that you should be able to see the frequency spikes that you claim exist. Preferably, you would also show that your alternate view of the carbon cycle is more compatible with the data.
Nearly everyone on the losing side of a scientific argument thinks it is because of biases, censorship, or what-have-you (for example: http://en.wikipedia.org/wiki/Expelled:_No_Intelligence_Allowed ). That is because they have a hard time accepting the truth, which is that they’ve lost the argument because the scientific evidence is not on their side.
You are so far out in the weeds here that you even make someone like Monckton look reasonable by comparison…and that’s saying something!
Do you have some sort of cite for this?
Where do you think all those little (but statistically significant) transients in the curve here come from? I don’t think this is really controversial – I’ve seen graphs of CO2 concentration on RealClimate pointing, e.g., to the Mt. Pinatubo transient.
“You would have to show this,…”
I’ve done the analysis. I’d post the plots if I knew how. Here’s an idea: if you have a genuine interest in seeking knowledge, how about you do it, or find someone you trust who knows how to do it for you? That way, we wouldn’t have to argue over whether I cooked the books in some way.
“Nearly everyone on the losing side of a scientific argument think it is because of biases, censorship, or what-have you…”
Sometimes. And, sometimes those on the “losing side” are found to have been correct years later. I don’t know if you realize this is an admission of insecurity on your part – that you have to seek comfort from some alternative means other than pertinent facts to shore up (no pun intended) your position. If you can put me in a box as a crank, you can dismiss my arguments without having to think them through.
Well, have at it. I am supremely confident that my POV will eventually be vindicated. The only question is how much damage will be done by the alarmist herd by that time.
Bart,
I am fairly sure you are confusing aerosols from volcanoes with CO2. Aerosols have a cooling transient effect that does show up. CO2 variation is most correlated with surface temperature or SST, and its increase rate dips when it is cooler, as it seems to have after Pinatubo. That is, volcanoes (at least of the Pinatubo type) indirectly have a negative effect on CO2.
“Do you have some sort of cite for this?”
Here’s a link of some interest.
Now, I read that as saying that, while the average CO2 output of volcanoes over a year is much less than human inputs, individual events can release more than the yearly anthropogenic output. Do you concur? Surely such events should be observable as steps in the CO2 record, if the long-residence-time hypothesis holds, which of course I claim it does not.
I noted that some of the largest transients in the data presented at the NOAA link seem to correlate in time with large eruptions of Mount Nyamuragira, as judged by SO2 output. Maybe this is just happenstance, as Mount Nyamuragira erupts on average every couple of years, but it appears to correlate with the largest eruptions, so maybe there is something there, maybe not.
Also, keep in mind that “4–5 · 10^14 moles CO2 yr^−1” is an estimate of “man’s current CO2 production”. Obviously, in years past, it would have been significantly less.
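For scale, here is a rough conversion of that figure into more familiar units; the ~2.13 GtC-per-ppmv conversion factor is a standard approximation I am assuming, not something from the linked source.

```python
MOLAR_MASS_CO2 = 44.01    # g/mol
MOLAR_MASS_C = 12.01      # g/mol
GTC_PER_PPMV = 2.13       # approx. GtC corresponding to 1 ppmv of atmospheric CO2 (assumed)

for moles_per_year in (4e14, 5e14):
    gt_co2 = moles_per_year * MOLAR_MASS_CO2 / 1e15     # grams -> gigatonnes
    gt_c = gt_co2 * MOLAR_MASS_C / MOLAR_MASS_CO2       # carbon mass only
    ppmv = gt_c / GTC_PER_PPMV                          # if every molecule stayed airborne
    print(f"{moles_per_year:.0e} mol/yr ~ {gt_co2:.0f} Gt CO2/yr "
          f"~ {gt_c:.1f} GtC/yr ~ {ppmv:.1f} ppmv/yr")
```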