Why doesn't the AR5 SOD's climate sensitivity range reflect its new aerosol estimates?

This article is a detailed complement to Matt Ridley’s Op-Ed today in the Wall Street Journal:

Cooling Down the Fears of Climate Change

Evidence points to a further rise of just 1°C by 2100. The net effect on the planet may actually be beneficial.

Guest post by Nic Lewis

There has been much discussion on climate blogs of the leaked IPCC AR5 Working Group 1 Second Order Draft (SOD). Now that the SOD is freely available, I can refer to the contents of the leaked documents without breaching confidentiality restrictions.

I consider the most significant – but largely overlooked – revelation to be the substantial reduction since AR4 in estimates of aerosol forcing and uncertainty therein. This reduction has major implications for equilibrium climate sensitivity (ECS). ECS can be estimated using a heat balance approach – comparing the change in global temperature between two periods with the corresponding change in forcing, net of the change in global radiative imbalance. That imbalance is very largely represented by ocean heat uptake (OHU).
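For readers who like to see the arithmetic spelled out, the heat balance relation just described can be written as a one-line function. This is purely an illustrative Python sketch, not code used in the analysis; the function and argument names are hypothetical, and the 3.71 W/m² default is the forcing from a doubling of CO2 concentration used later in this post.

```python
def ecs_heat_balance(delta_t, delta_f, delta_q, f_2xco2=3.71):
    """Heat-balance estimate of equilibrium climate sensitivity (deg C).

    delta_t  -- change in global mean surface temperature between two periods (deg C)
    delta_f  -- corresponding change in total forcing (W/m^2)
    delta_q  -- corresponding change in the global radiative imbalance,
                very largely ocean heat uptake (W/m^2)
    f_2xco2  -- estimated forcing from a doubling of CO2 concentration (W/m^2)
    """
    return f_2xco2 * delta_t / (delta_f - delta_q)
```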

Since the time of AR4, neither global mean temperature nor OHU has increased, while the IPCC’s own estimate of the post-1750 change in forcing net of OHU has increased by over 60%. In these circumstances, it is extraordinary that the IPCC can leave its central estimate and ‘likely’ range for ECS unchanged.

I focused on this point in my review comments on the SOD. I showed that using the best observational estimates of forcing given in the SOD, and the most recent observational OHU estimates, a heat balance approach estimates ECS to be 1.6–1.7°C – well below the ‘likely’ range of 2–4.5°C that the SOD claims (in Section 10.8.2.5) is supported by the observational evidence, and little more than half the best estimate of circa 3°C it gives.

The fact that ECS, as derived using the new aerosol forcing estimates and a heat balance approach, appears to be far lower than claimed in the SOD is highlighted in an article by Matt Ridley in the Wall Street Journal, which uses my calculations. There was not space in that article to go into the details – including the key point that the derived ECS estimate is very well constrained – so I am doing so here.

How does the IPCC arrive at its estimated range for climate sensitivity?

Methods used to estimate ECS range from:

(i) those based wholly on simulations by complex climate models (GCMs), the characteristics of which are only very loosely constrained by climate observations, through

(ii) those using simpler climate models whose key parameters are intended to be constrained as tightly as possible by observations, to

(iii) those that rely wholly or largely on direct observational data.

The IPCC has placed a huge emphasis on GCM simulations, and the ECS range exhibited by GCMs has played a major role in arriving at the IPCC’s 2–4.5°C ‘likely’ range for ECS. I consider that little credence should be given to estimates for ECS derived from GCM simulations, for various reasons, not least because there can be no assurance that any of the GCMs adequately reflect all key climate processes. Indeed, since in general GCMs significantly overestimate aerosol forcing compared with observations, they need to embody a high climate sensitivity or they would underestimate historical warming and be consigned to the scrapheap. Observations, not highly complex and unverifiable models, should be used to estimate the key properties of the climate system.

Most observationally-constrained studies use instrumental data, for good reason. Reliance cannot be placed on paleoclimate proxy-based estimates of ECS – the AR4 WG1 report concluded (Box 10.2) that uncertainties in Last Glacial Maximum studies are just too great, and the only probability density function (PDF) for ECS it gave from a last millennium proxy-based study contained little information.

Because it has historically been difficult to estimate ECS purely from instrumental observations, a standard estimation method is to compare observations of key observable climate variables, such as zonal temperatures and OHU, with simulations of their evolution by a relatively simple climate model with adjustable parameters that represent, or are calibrated to, ECS and other key climate system properties. A number of such ‘inverse’ studies, of varying quality, have been performed; I refer to several of these later. But here I estimate ECS using a simple heat balance approach, which avoids dependence on models and also has the advantage of not involving choices about niceties such as truncation parameters and Bayesian priors, which can have a major impact on ECS estimation.

Aerosol forcing in the SOD – a composite estimate is used, not the best observational estimate

Before going on to estimating ECS using a heat balance approach, I should explain how the SOD treats forcing estimates, in particular those for aerosol forcing. Previous IPCC reports have just given estimates for radiative forcing (RF). Although in a simple world this could be a good measure of the effective warming (or cooling) influence of every type of forcing, some forcings have different efficacies from others. In AR5, this has been formalised into a measure, adjusted forcing (AF), intended better to reflect the total effect of each type of forcing. It is more appropriate to base ECS estimates on AF than on RF.

The main difference between the AF and RF measures relates to aerosols. In addition, the AF uncertainty for well-mixed greenhouse gases (WMGG) is double that for RF. Table 8.7 of the SOD summarises the AR5 RF and AF best estimates and uncertainty ranges for each forcing agent, along with RF estimates from previous IPCC reports. The terminology has changed, with direct aerosol forcing renamed aerosol-radiation interactions (ari) and the cloud albedo (indirect) effect now known as aerosol-cloud interactions (aci).

Table 8.7 shows that the best estimate for total aerosol RF (RFari+aci) has fallen from −1.2 W/m² to −0.7 W/m² since AR4, largely due to a reduction in RFaci, the uncertainty band for which has also been hugely reduced. It gives a higher figure, −0.9 W/m², for AFari+aci. However, −0.9 W/m² is not what the observations indicate: it is a composite of observational, GCM-simulation/aerosol model derived, and inverse estimates. The inverse estimates – where aerosol forcing is derived from its effects on observables such as surface temperatures and OHU – are a mixed bag, but almost all the good studies give a best estimate for AFari+aci well below −0.9 W/m²: see Appendix 1 for a detailed analysis.

It cannot be right, when providing an observationally-based estimate of ECS, to let it be influenced by including GCM-derived estimates for aerosol forcing – a key variable for which there is now substantial observational evidence. To find the IPCC’s best observational (satellite-based) estimate for AFari+aci, one turns to Section 7.5.3 of the SOD, where it is given as −0.73 W/m² with a standard deviation of 0.30 W/m². That is actually the same as the Table 8.7 estimate for RFari+aci, except for the uncertainty range being higher. Table 8.7 only gives estimated AFs for 2011, but Figure 8.18 gives their evolution from 1750 to 2010, so it is possible to derive historical figures using the recent observational AFari+aci estimate as follows.

The values in Figure 8.18 labelled ‘Aer-Rad Int.’ are actually for RFari, but that equals the purely observational estimate for AFari (−0.4 W/m² in 2011), so they can stand. Only the values labelled ‘Aer-Cld Int.’, which are in fact the excess of AFari+aci over RFari, need adjusting (scaling them down by the factor (0.73−0.4)/(0.9−0.4) ≈ 0.66 in all years) to obtain a forcing dataset based on a purely observational estimate of aerosol AF rather than the IPCC’s composite estimate. It is difficult to digitise the Figure 8.18 values for years affected by volcanic eruptions, so I have also adjusted the widely-used RCP4.5 forcings dataset to reflect the Section 7.5.3 observational estimate of current aerosol forcing, using Figure 8.18 and Table 8.7 data to update the projected RCP4.5 forcings for 2007–2011 where appropriate. The result is shown below.

[Figure: forcing time series based on Figure 8.18 and the RCP4.5 dataset, with aerosol forcing rescaled to the Section 7.5.3 observational estimate]
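As an illustrative sketch only (not the code used for the adjustment described above), the rescaling of the ‘Aer-Cld Int.’ component can be expressed as follows; the variable names and helper function are hypothetical, and the three aerosol figures are those quoted in the text (−0.4, −0.73 and −0.9 W/m²).

```python
AF_ARI = -0.4          # observational AFari (equals RFari) in 2011, W/m^2
AF_TOTAL_OBS = -0.73   # observational AFari+aci, Section 7.5.3, W/m^2
AF_TOTAL_COMP = -0.9   # composite AFari+aci, Table 8.7, W/m^2

# The 'Aer-Cld Int.' series in Figure 8.18 is the excess of AFari+aci over
# RFari, so it is scaled down by this factor in every year:
scale = (AF_TOTAL_OBS - AF_ARI) / (AF_TOTAL_COMP - AF_ARI)   # = 0.66

def observational_aerosol_af(af_ari_year, aci_excess_year):
    """Total aerosol AF for one year, rebased to the observational estimate."""
    return af_ari_year + scale * aci_excess_year
```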

The adjustment I have made merely brings estimated forcing into line with the IPCC’s best observationally-based estimate for AFari+aci. But one expert on the satellite observations, Prof. Graeme Stephens, has stated that AFaci is at most ‑0.1 W/m², not ‑0.33 W/m² as implied by the IPCC’s best observationally-based estimates: see here and slide 7 of the linked GEWEX presentation. If so, ECS estimates should be lowered further.

Reworking the Gregory et al. (2002) estimate of ECS derived from the change in heat balance

The best-known study estimating ECS by comparing the change in global mean temperature with the corresponding change in forcing, net of that in OHU, is Gregory et al. (2002). This was one of the studies for which an estimated PDF for ECS was given in AR4. Unfortunately, ten years ago observational estimates of aerosol forcing were poor, so Gregory used a GCM-derived estimate. In July 2011 I wrote an open letter to Gabi Hegerl, joint coordinating lead author of the AR4 chapter in which Gregory 02 was featured, pointing out that its PDF was not computed on the basis stated in AR4 (a point since conceded by the IPCC), and also querying the GCM-derived aerosol forcing estimate used in Gregory 02. Some readers may recall my blog post at Climate Etc. featuring that letter, here. Using the GISS forcings dataset, and corrected Levitus et al. (2005) OHU data, the 1861–1900 to 1957–1994 increase in QF (total forcing – OHU) changed from 0.20 to 0.68 W/m². Dividing the 0.335°C change in global surface temperature, ΔT, by 0.68 W/m² and multiplying by 3.71 W/m² (the estimated forcing from a doubling of CO2 concentration) gives a central estimate (median) for ECS of 1.83°C.

I can now rework my Gregory 02 calculations using the best observational forcing estimates, as reflected in Figure 8.18 with aerosol forcing rescaled as described above. The change in QF becomes 0.85 W/m². That gives a central estimate for ECS of 1.5°C.
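As a quick arithmetic check of the two figures above (again an illustrative snippet, not the original calculation):

```python
F_2XCO2 = 3.71   # W/m^2, estimated forcing from a doubling of CO2
dT = 0.335       # deg C, 1861-1900 to 1957-1994 change in surface temperature

print(round(F_2XCO2 * dT / 0.68, 2))   # GISS forcings, corrected Levitus OHU: ~1.83 C
print(round(F_2XCO2 * dT / 0.85, 2))   # rescaled observational aerosol forcing: ~1.46 C, i.e. ~1.5 C
```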

An improved ECS estimate from the change in heat balance over 130 years

The 1957–1994 period used in Gregory 02 is now rather dated. Moreover, using long time periods for averaging makes it impossible to avoid major volcanic eruptions, which bring considerable uncertainty both in the size of the forcing excursions involved and in their effects. I will therefore produce an estimate based on decadal mean data, for the decade to 1880 and for the most recent decade, to 2011. Although doing so involves an increased influence of internal climate variability on mean surface temperature, it has several advantages:

a) neither decade was significantly affected by volcanic activity;

b) neither decade encompassed exceptionally large ENSO events, such as the 1997/98 El Nino, and average ENSO conditions were broadly neutral in both decades (arguably with a greater tendency towards warm El Nino conditions in the recent decade); and

c) the two decades are some 130 years apart, and therefore correspond to similar positions in the 60–70 year quasi-periodic AMO cycle (which appears to have a peak-to-peak influence on global mean temperature of the order of 0.1°C).

Since estimates of OHU have become much more accurate during the latest decade, as the Argo network of profiling floats has come into operation, the loss of accuracy from measuring OHU only over the latest decade is modest.

I summarise here my estimates of the changes in decadal mean forcing, heat uptake and global temperature between 1871–1880 and 2002–2011, and related uncertainties. Details of their derivations are given in Appendix 2.

| Change in global decadal mean between 1871–1880 and 2002–2011 | Mean estimate | Standard deviation | Units |
|---|---|---|---|
| Adjusted forcing: CO2 and other well-mixed greenhouse gases | | 0.29 | W/m² |
| Adjusted forcing: all other sources (balancing error standard dev.) | | 0.34 | W/m² |
| Adjusted forcing: total | 2.09 | 0.45 | W/m² |
| Earth’s heat uptake | 0.43 | 0.08 | W/m² |
| Surface temperature | 0.73 | 0.12 | °C |

Now comes the fun bit, putting all the figures together. The best estimate of the change from 1871–1880 to 2002–2011 in decadal mean adjusted forcing, net of the Earth’s heat uptake, is 2.09 − 0.43 = 1.66 W/m². Dividing the estimated temperature change of 0.727°C by that figure and multiplying by 3.71 W/m² gives an estimated climate sensitivity of 1.62°C, close to that from reworking Gregory 02.
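The same arithmetic for the 130-year calculation, using the values in the table above (an illustrative check only):

```python
dF = 2.09    # W/m^2, change in total adjusted forcing
dQ = 0.43    # W/m^2, change in the Earth's heat uptake
dT = 0.727   # deg C, change in decadal mean surface temperature

print(round(3.71 * dT / (dF - dQ), 2))   # ~1.62 C
```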

Based on the estimated uncertainties, I compute a 5–95% confidence interval for ECS of 1.03–2.83°C – see Appendix 3. That implies a >95% probability that ECS is below the IPCC’s central estimate of 3°C.

ECS estimates from recent studies – good ones…

As well as this simple estimate from heat balance implying a best estimate for ECS of approximately 1.6°C, and the reworking of the Gregory 02 results suggesting a slightly lower figure, two good quality recent observationally-constrained studies using relatively simple hemispheric-resolving models also point to climate sensitivity being about 1.6°C:

  • Aldrin et al. (2012), an impressively thorough study, gives a most likely estimate for ECS of 1.6°C and a 5–95% range of 1.2–3.5°C.

  • Ring et al. (2012) also estimates ECS as 1.6°C, using the HadCRUT4 temperature record (1.45°C to 2.01°C using other records).

And the only purely observational study featured in AR4, Forster & Gregory (2006), which used satellite observations of radiation entering and leaving the atmosphere, also gave a best estimate of 1.6°C, with a 95% upper bound of 4.1°C.

and poor ones…

Most of the instrumental-observation constrained studies featured in IPCC reports that give PDFs for ECS peaking at significantly over 2°C have some identifiable deficiency. Two such studies were featured in Figure 9.21 of AR4 WG1: Forest 06 and Knutti 02. Forest 06 has several statistical errors (see here) and other problems. Knutti 02 used a weak statistical procedure and an arbitrary combination of five ocean models, and there is little information content in its probabilistic ECS estimate.

Five of the PDFs for ECS from 20th century studies featured in Figure 10.19 of the AR5 SOD peak significantly above 2°C:

  • one is Knutti 02;
  • three are various cases from Libardoni and Forest (2011), a study that suffers the same deficiencies as Forest 06;
  • one is from Olson et al. (2012); the Olson PDF, like Knutti 02’s, is extremely wide and contains almost no information.

Conclusions

In the light of the current observational evidence, in my view 1.75°C would be a more reasonable central estimate for ECS than 3°C, perhaps with a ‘likely’ range of around 1.25–2.75°C.

Nic Lewis

==============================================================

Appendix 1: Inverse estimates of aerosol forcing – the expert range largely reflects the poor studies

The AR5 WG1 SOD composite AFari+aci estimate of −0.9 W/m² is derived from mean estimates from satellite observations (−0.73 W/m²), GCMs (−1.45 W/m² from AR4+AR5 models including secondary processes, −1.08 W/m² from CMIP5/ACCMIP models) and an “expert” range of −0.68 to −1.52 W/m² from combined inverse estimates. These figures correspond to box-plots in the lower panel of Figure 7.19. One of the inverse studies cited hasn’t yet been published and I haven’t been able to obtain it, but I have examined the other twelve studies.

Because aerosol forcing is strongly asymmetric between the northern and southern hemispheres, estimating it with any accuracy using inverse methods requires a model that, at a minimum, resolves the two hemispheres separately. Only seven of the twelve studies do so. Of the other five:

  • one is just a survey and derives no estimate itself;
  • one (Gregory 02) merely uses an AOGCM-derived estimate of a circa 100-year change in aerosol forcing, without itself deriving any estimate;
  • three are based on global-mean only data (with two of them assuming an ECS of 3°C when estimating aerosol forcing).

One of the seven potentially useful studies is based on GCM simulations, which I consider to be untrustworthy. A second does not estimate aerosol forcing over 90S–28S, and concludes that over 1976–2007 it has been large and negative over 28S–28N and large and positive over 28N–60N, the opposite of what is generally believed. A third study is Andronova and Schlesinger (2001), which it turns out had a serious code error. Its estimate of −0.54 to ‑1.30 W/m² falls to −0.42 to −0.99 W/m² when using the corrected model (Ring et al., 2012). Three of the other studies, all using four latitude zones, estimate aerosol forcing to be even lower: in the ranges −0.14 to −0.74, −0.3 to −0.95 and −0.19 to −0.83 W/m². The final study estimates it to be around or slightly above ‑1 W/m², but certainly below ‑1.5 W/m². One recent inverse estimate that the SOD omits is −0.7 W/m² (mean – no uncertainty range given) from Aldrin et al. (2012).

In conclusion, I wouldn’t hire the IPCC’s experts if I wanted a fair appraisal of the inverse studies. A range of −0.2 to −1.3 W/m² looks more reasonable – and as it happens, is centred almost exactly on the mean of the estimates derived from satellite observations.

===============================================================

Appendix 2: Derivation of the changes in decadal mean global temperature, forcing and heat uptake

Since it extends back before 1880 and includes a correction to sea surface temperatures in the mid-20th century, I use HadCRUT4 global mean temperature data, available as annual data here. The difference between the mean for the decade 2002–2011 and that for 1871–1880 is 0.727°C. The uncertainty in that temperature change is tricky to work out because the various error sources are differently correlated in time. Combining the relevant years’ total uncertainty estimates for the HadCRUT4 21-year smoothed decadal data (estimated 5–95% ranges 0.17°C and 0.126°C), and very generously assuming the variance uncertainty scales inversely with the number of years averaged, gives an error standard deviation for the change in decadal temperature of 0.08°C (all uncertainty errors are assumed to be normally distributed, and independent except where otherwise stated). There is also uncertainty arising from random fluctuations in the internal state of the climate. Surface temperature simulations from a GCM control run suggest that error source could add a further error standard deviation of 0.08°C for both decades. However, the matching of the two decades’ characteristics as set out in the main text, points a) to c), and the fact that some fluctuations will be reflected in OHU, suggest a reduction from the 0.11°C error standard deviation implied by adding two 0.08°C standard deviations in quadrature; I take a value halfway between 0.08°C and 0.11°C, namely 0.095°C. Adding that in quadrature to the observational error standard deviation of 0.08°C gives a total standard deviation of 0.124°C.
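The quadrature combination at the end of that paragraph can be checked in a couple of lines (values as stated above; an illustrative check only):

```python
import math

obs_sd = 0.08        # observational error sd on the decadal temperature change, deg C
internal_sd = 0.095  # allowance for internal climate variability, deg C

print(round(math.hypot(obs_sd, internal_sd), 3))   # sqrt(0.08^2 + 0.095^2) = ~0.124 deg C
```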

The change between 1871‑1880 and 2002–2011 in decadal mean AF, with aerosol forcing scaled to reflect the best recent observational estimates, is 2.09 W/m², taking the average of the Figure 8.18 and RCP4.5 derived estimates (which are both within about 1% of this figure). The total AF uncertainty estimate of ± 0.87 W/m² in Table 8.7 equates to an error standard deviation of 0.44 W/m², which is taken as applying for 2002–2011. Using the observational aerosol forcing error estimate given in Section 7.5.3 instead of the corresponding Table 8.7 uncertainty range gives the same result. Although there would be some uncertainty in the small 1871–1880 mean forcing estimate, the error therein will be strongly correlated with that for 2002–2011. That is because much of the uncertainty relates to the relationships between:

§ concentrations of WMGG and the resulting forcing

§ emissions of aerosols and the resulting forcing,

the respective fractional errors in which are common to both periods. Therefore, the error standard deviation for the change in forcing between 1871–1880 and 2002–2011 could well be smaller than that for the forcing in 2002–2011. However, for simplicity, I assume that it is the same. Finally, I add an error standard deviation of 0.05 W/m² for uncertainty in volcanic forcing in 1871–1880 and a further 0.05 W/m² for uncertainty therein in 2002–2011, small though volcanic forcing was in both decades. Solar forcing uncertainty is included in Table 8.7. Combining these uncertainties in quadrature, the total AF change error standard deviation is 0.45 W/m².

I estimate 2002–2011 OHU from a regression over 2002–2011 of 0–2000 m pentadal ocean heat content estimates per Levitus et al. (2012), inversely weighting observations by their variance. OHU in the 2000–3000 m layer is estimated to be negligible. After conversion from zettajoules per year, the trend equates to 0.433 W/m², averaged over the Earth’s surface. The standard deviation of the OHU estimate as computed from the regression residuals is 0.031 W/m², but because of the autocorrelation implicit in using overlapping pentadal averages the true figure will be much higher. Multiplying the standard deviation by sqrt(5) provides a crude adjustment for the autocorrelation, bringing the standard deviation to 0.068 W/m². There is no alternative to using GCM-derived estimates of OHU for the 1871–1880 period, since there were no measurements then. I adopt the OHU estimate given in Gregory 02 for the bracketing 1861–1900 period of 0.16 W/m², but deduct only 50% of it to compensate for the Levitus et al. (2012) regression trend implying a somewhat lower 2002–2011 OHU than is given in the SOD. Further, to be conservative, I treble Gregory 02’s optimistic-looking standard deviation, to 0.03 W/m². That implies a change in OHU of 0.353 W/m², with a standard deviation of 0.075 W/m², adding the uncertainty variances. Although Gregory 02 ignored non-ocean heat uptake, some allowance should be made for that and also for any increase in ocean heat content below 3000 m. The (slightly garbled) information in Section 3.2.5 of the SOD implies that 0–3000 m ocean warming accounts for 80–85% of the Earth’s total heat uptake, with the error standard deviation for the remainder of the order of 0.03 W/m². Allowing for all components of the Earth’s heat uptake implies an estimated change in total heat uptake of 0.43 W/m² with an error standard deviation of 0.08 W/m². Natural variability in decadal OHU should be the counterpart of natural variability in decadal global surface temperature, so is not accounted for separately.
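The following sketch shows the kind of variance-weighted trend calculation described above for the 2002–2011 ocean heat uptake. The Levitus pentadal values are not reproduced here, so the function simply takes them as inputs; the unit-conversion constants and the sqrt(5) inflation of the trend standard error are assumptions spelled out in the comments, not the exact procedure used in the text.

```python
import numpy as np

SECONDS_PER_YEAR = 3.156e7
EARTH_SURFACE_M2 = 5.1e14          # approximate surface area of the Earth

def ohu_trend(years, ohc_zj, ohc_var):
    """Inverse-variance-weighted linear trend of 0-2000 m ocean heat content
    (in zettajoules), converted to W/m^2 averaged over the Earth's surface.

    The trend standard error is inflated by sqrt(5) as a crude allowance for
    the autocorrelation of overlapping pentadal averages (see text).
    """
    w = 1.0 / np.sqrt(np.asarray(ohc_var))                   # weights ~ 1/sigma
    coeffs, cov = np.polyfit(years, ohc_zj, 1, w=w, cov=True)
    to_w_m2 = 1e21 / (SECONDS_PER_YEAR * EARTH_SURFACE_M2)   # ZJ/yr -> W/m^2
    trend = coeffs[0] * to_w_m2
    trend_sd = np.sqrt(cov[0, 0]) * np.sqrt(5) * to_w_m2
    return trend, trend_sd
```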

================================================================

Appendix 3: Derivation of the 5–95% confidence interval

In the table of changes in the variables between 1871–1880 and 2002–2011, I split the AF error standard deviation between that for CO2 and other greenhouse gases (0.291 W/m²) and that for all other items (0.343 W/m²). The reason for doing so is this. Almost all the SOD’s 10.2% error standard deviation for greenhouse gas AF relates to the AF magnitude that a given change in the greenhouse gas concentration produces, not to uncertainty as to the change in concentration. When estimating ECS, whatever that error is, it will affect equally the 3.71 W/m² estimated forcing from a doubling of equivalent CO2 concentration used to compute the ECS estimate. Most of the uncertainty in the ratio of AF to concentration is probably common to all greenhouse gases. Insofar as it is not, and the mix of changes in greenhouse gases differs in the future from that in the past, the two AF estimation fractional errors will differ. I ignore that here. As most of the past greenhouse gas forcing is due to CO2 and that is expected to remain the case in future, any inaccuracy from doing so should be minor.

So, I estimate a 5–95% confidence interval for ECS as follows. Randomly draw a million realisations from each of the following independent Normal(mean, standard deviation) distributions:

a: AF WMGG uncertainty – before scaling – from N(0,1)

b: Total AF without WMGG uncertainty – from N(2.09,0.343)

c: Earth’s heat uptake – from N(0.43,0.08)

d: Surface temperature – from N(0.727,0.124)

and for each quartet of random numbers compute ECS as: 3.71 * (1 + 0.102*a) * d / (0.291*a + b − c).

One then computes a histogram for the million ECS estimates and finds the points below which 5% and 95% of the total estimates lie. The resulting 5–95% range comes out at 1.03 to 2.83°C.
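A compact sketch of that Monte Carlo procedure, using exactly the numbers listed above; the use of NumPy and the fixed seed are incidental choices, not part of the method itself.

```python
import numpy as np

rng = np.random.default_rng(42)   # fixed seed, for reproducibility only
N = 1_000_000

a = rng.normal(0.0, 1.0, N)       # WMGG AF uncertainty, before scaling
b = rng.normal(2.09, 0.343, N)    # total AF excluding WMGG uncertainty, W/m^2
c = rng.normal(0.43, 0.08, N)     # Earth's heat uptake, W/m^2
d = rng.normal(0.727, 0.124, N)   # surface temperature change, deg C

ecs = 3.71 * (1 + 0.102 * a) * d / (0.291 * a + b - c)

lo, hi = np.percentile(ecs, [5, 95])
print(f"5-95% range: {lo:.2f} to {hi:.2f} C")   # approximately 1.0 to 2.8 C
```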

UPDATE: Dr. Judith Curry provides her take on the issue, and endorses the leak:

http://judithcurry.com/2012/12/19/climate-sensitivity-in-the-ar5-sod/

 

rilfeld
December 19, 2012 7:12 am

If one takes away nothing else — There is now sufficient data to study climate trends without models, either computer or paleo. Therefore, it makes sense to do so. While computer modeling and paleo reconstructions both have their place, and can lead to valid “science”, policy makers would do well to concentrate on real data studies, and their implied consequences, for a decade or two. Hundred-year plans of today continue to be as valid as those of a hundred-plus years ago, which involved much handwringing over uncontrollable mountains of horse-hockey — hmm, often the same topic.

John W. Garrett
December 19, 2012 7:15 am

Wow.
Richard Feynman (R.I.P.) is beginning to crack a smile.

Skiphil
December 19, 2012 7:25 am

Fascinating….. if one takes the “catastrophic” out of CAGW then the public debate should change….. shouldn’t it?
Others will need to assess (‘audit’) the calculations and arguments, but I am glad to see this kind of analysis getting attention. Many warm thanks to Nic Lewis for his important work on this topic, and to the extraordinary blog hosts (Steve Mc, Anthony, Andrew, Judith Curry) who have provided such fertile environments for discussing truly innovative thinking.

Espen
December 19, 2012 7:29 am

Nice. I note that this was based on comparing 1871–1880 to 2002–2011. Considering Willis Eschenbach’s recent article, I think it’s likely that future sensitivity will be lower than it has been for the period compared here. The higher the temperature, the more the convection-driven negative feedback kicks in, and hence the lower the sensitivity.

December 19, 2012 7:39 am

The title is “Why doesn’t the AR5 SOD’s climate sensitivity range reflect its new aerosol estimates?”
The answer is easy. The IPCC is stuck with the baggage of its previous reports. It has been obvious for some time that writing the AR5 was going to be a problem. Either it must omit and ignore a lot of work done since AR4 that strongly indicates previous IPCC conclusions were just plain wrong, or it must admit that its previous estimates were gross exaggerations. Neither of these alternatives is acceptable to the IPCC.
It remains to be seen what the IPCC will actually do.

December 19, 2012 7:48 am

Why would the IPCC want to publish a figure of 1.6C for ECS? The figure of 3.0C is much more acceptable, because it exceeds the 2.0C that has been arbitrarily agreed as the target for CAGW. If the IPCC were to publish 1.6C, then there would be no reason for the IPCC to continue to exist. In effect the scientists and bureaucrats involved would have done themselves out of a rather sweet job.
Who in their right mind at the IPCC is going to do that? In effect, publish something that says there is no problem, time for us to wrap things up and go home. Imagine hiring an efficiency expert to take a look at your company. How likely is it that the efficiency expert will recommend you fire the efficiency expert to save costs? How much more likely is it that the efficiency expert recommends you fire everyone else and keep the efficiency expert to ensure things are efficient? No matter if you have no sales people, no shipping department, no receivables. At least you will be efficient.
Isn’t that really what the IPCC is saying? Shut down your economies and be efficient. No matter if you can heat your houses or feed your children, that isn’t what matters. What matters is that you live a green lifestyle, leaving all natural resources in the ground where they belong, so that future generations can dig them up and consume them.
But wait, if future generations are going to dig them up and consume the natural resources, what will the future, future generations consume? This means that no generation can ever dig up the natural resources of the planet, and we must leave them for eternity. In which case we might as well dig them up and use them ourselves as no one else will be using them.

David L. Hagen
December 19, 2012 7:56 am

Nic
Compliments on your detailed analysis and identifying major errors that attribute too great a climate sensitivity to CO2 amplification.
Dismal Theorem Correction
To complement your Probability Density Function corrections, I recommend incorporating the major economic projection correction by Ross McKitrick (2012):
Ross McKitrick, Cheering Up the Dismal Theorem, DISCUSSION PAPER 2012-05 University of Guelph DEPARTMENT OF ECONOMICS AND FINANCE

The Weitzman Dismal Theorem (DT) suggests agents today should be willing to pay an unbounded amount to insure against fat-tailed risks of catastrophes such as climate change. . . .Use of the exact measure completely changes the pricing model such that the resulting insurance contract is plausibly small, and cannot be unbounded regardless of the distribution of the assumed climate sensitivity.

Declining Clouds
How does cloud sensitivity affect your analysis/conclusions?
See the evidence of declining cloud cover in Eastman & Warren (2012), where the global average cloud cover declined about 1.56% over 39 years (1971 to 2009), or ~0.4%/decade, primarily in middle latitudes at middle and high levels. A one percentage point decrease in albedo (30% to 29%) nominally increases the black-body radiative equilibrium temperature by about 1°C – about equal to a doubling of atmospheric CO2 – e.g., from a 1.5% reduction in clouds, since clouds form up to two-thirds of global albedo (IPCC AR4, Section 1.5.2, p. 114).
Ryan Eastman, Stephen G. Warren, A 39-Year Survey of Cloud Changes from Land Stations Worldwide 1971-2009 Journal of Climate 2012 ; e-View doi: http://dx.doi.org/10.1175/JCLI-D-12-00280.1
Could this indicate that a portion of warming could be from declining cloud cover, suggesting that the magnification of CO2 warming is even more overrated? (Caution: We only have this recent data, not back to 1871–1880, and thus skewed by the warming portion of the 60 year cycle.)
Now how do we distinguish which is the cause and which the consequence, clouds vs CO2?
Total Solar Irradiance uncertainty:
The IPCC uses the PMOD team’s analysis of Total Solar Irradiance (TSI) BUT ignores the ACRIM team’s analysis. Nicola Scafetta (2011) shows that alternative reconstructions of the ACRIM Gap in TSI measurements could result in the solar contribution causing 15%, 50% or 60% of the warming; i.e., the TSI uncertainty appears to be strongly underestimated, through the bias of selecting one team’s analysis and ignoring the competing team’s.
Nicola Scafetta (2011) Total Solar Irradiance Satellite Composites and their Phenomenological Effect on Climate. In Evidence-Based Climate Science edited by Don Easterbrook (Elsevier), chap. 12, 289-316.
Consequence:
Technically: The confidence in CO2 climate amplification is very likely overestimated, with climate sensitivity likely to be much lower than IPCC’s AR4 estimates.
Popularly: Lewis + McKitrick + Scafetta = Why worry?
“If the cost of the premium exceeds the cost of the risk, don’t insure.”
Adapt for small changes as they occur!

Rud Istvan
December 19, 2012 8:07 am

There are additional, quasi independent (all relying on the same observational inputs) ways to reach the same general conclusions about ECS. See the climate change chapter in the new eBook, The Arts of Truth.

John Francis
December 19, 2012 8:08 am

The focus on average temperature seems wrong to me. Even if this analysis is 100% correct, the likely outcome is daytime highs changing insignificantly, and night time lows warming slightly, due to the T^4 factor. If so, I fail to see any problem at all.

Marco
December 19, 2012 8:11 am

Nice quote from the WSJ article:
“…given the organization’s record of replacing evidence-based policy-making with policy-based evidence-making…”

cui bono
December 19, 2012 8:19 am

Far from a triumphant grin, it is difficult to repress anger. The waste of time, money and scientific talent. The opportunity cost of all the useless energy systems pushed by the Greens. The propaganda, hype and alarm generated just to turn toddlers into sloganeering anthrophobes.
Almost enough for a large ‘bah humbug’ directed at the responsible parties. Merry Christmas, IPCC.

richard telford
December 19, 2012 8:19 am

This estimation of equilibrium climate sensitivity assumes that the climate system is in equilibrium but makes no attempt to verify if this assumption is valid. If there is “warming in the pipeline”, this analysis will underestimate ECS.
This is the enormous advantage of GCM – it is possible to run them until they are in equilibrium.

December 19, 2012 8:42 am

nice work nic

lgl
December 19, 2012 8:57 am

Or, in four sentences:
The surface absorbs ~200 W/m2 solar (160 directly and 40 via atmospheric reradiation)
The energy transport from the surface is ~500 W/m2, so the system gain is ~2.5.
The next 3.7 W/m2 will see the same 2.5 amplification and will increase the surface emission by 9.25 W/m2. The surface temperature has to increase 1.7 K to get rid of the extra 9.25 watts.
(note no unreliable temp measurements needed)

John Blake
December 19, 2012 8:57 am

Under Railroad Bill Pachauri, the UN IPCC remains irremediably a New World Order propaganda organ dressed up with pseudo-science a la Rene Blondlot, J.B. Rhine, Immanuel Velikovsky, Trofim Lysenko. Rather than dignify such twits with rational discourse, better to simply cite replicable observations without regard to AGW blowhards’ self-serving Big Government grant monies.

Louis Hooffstetter
December 19, 2012 9:00 am

My psychic friend Miss Cleo points out the Team’s message leading up to AR5 has been “It’s worse than we thought!” Stephan Rahmstorf and his buddy Tamino clearly proved the planet is still warming according to model projections (thermometers just aren’t recording it because of pesky interference from ENSO and volcanoes):
http://iopscience.iop.org/1748-9326/6/4/044022/pdf/1748-9326_6_4_044022.pdf
And once you correct for that, sea levels are rising 60% faster than IPCC projections!:
http://thinkprogress.org/climate/2012/11/28/1249391/study-sea-levels-rising-60-faster-than-projected-planet-keeps-warming-as-expected/
She says Jim Cripwell is right; there is no way they can admit their previous conclusions were just plain wrong or that previous estimates were gross exaggerations. So Miss Cleo predicts the IPCC is going to double down their SWAGs* in AR5!
*SWAG = Scientific Wild Ass Guess

John F. Hultquist
December 19, 2012 9:09 am

I read Matt Ridley’s Op-Ed and the comments that followed on-line. As Nic Lewis (Thanks, Nic) shows there is science being done, and also science being ignored in the 2nd Order Draft (SOD). Many of the folks commenting at the WSJ seem clueless and apparently are determined to stay that way. I found this attitude as interesting as the question about equilibrium climate sensitivity (ECS).

richardscourtney
December 19, 2012 9:22 am

richard telford:
At December 19, 2012 at 8:19 am

This estimation of equilibrium climate sensitivity assumes that the climate system is in equilibrium but makes no attempt to verify if this assumption is valid. If there is “warming in the pipeline”, this analysis will underestimate ECS.
This is the enormous advantage of GCM – it is possible to run them until they are in equilibrium.

Equilibrium is reached very fast: i.e. in less than a year.
This is demonstrated by the absence of the “committed warming” predicted (n.b. predicted, not projected) in the AR4 on the basis of GHGs already in the system.
So, allow me to correct your final sentence.
This is the enormous advantage of GCM – it is possible to TUNE them until they SHOW WHATEVER IS DESIRED.
Richard

Nic Lewis
December 19, 2012 9:23 am

Richard Telford wrote:
“If there is “warming in the pipeline”, this analysis will underestimate ECS.”
Not so. The analysis allows for ocean etc. heat uptake, as even a cursory reading of it shows. The “warming in the pipeline” issue relates to part of the increase in forcing going into heating the ocean rather than being radiated into space as a result of surface (and hence atmospheric) temperatures being higher. I suggest you read the Gregory et al. (2002) paper if you don’t believe my explanation.

John West
December 19, 2012 9:25 am

1) Thank you Nic Lewis!!!!!
2) This is a step in the right direction, not the finale. There’s evidence suggesting even lower ECS (see David L. Hagen comment) perhaps in the 0.3 to 0.7 range.
3) Given that those who seek carbon controls can point to James Hansen’s 2 degrees will be detrimental paper and the range of 1.03 to 2.83 here, they can still insist the precautionary principle demands action, therefore we’ve still got a long way to go.

Theo Goodwin
December 19, 2012 9:30 am

“The IPCC has placed a huge emphasis on GCM simulations, and the ECS range exhibited by GCMs has played a major role in arriving at the IPCC’s 2–4.5°C ‘likely’ range for ECS. I consider that little credence should be given to estimates for ECS derived from GCM simulations, for various reasons, not least because there can be no assurance that any of the GCMs adequately reflect all key climate processes.”
Warms my heart to read this. In my humble opinion, sceptics deserve credit for the fact that reference to “climate processes” is now taken for granted in sophisticated writing about climate science. In the not too old days, so-called climate scientists were quite happy to base their work on strings of temperature readings without reference to the many and varied climate processes from which those readings were taken. Sceptics have “bent the curve” toward the empirical and genuine science.

December 19, 2012 9:41 am

Thank you very much Nic, excellent article!

Camburn
December 19, 2012 9:54 am

Thank you Nic. Very timely and very precise.
Confirmation that the Skeptical Science Syndrome is not spreading, but rather being laid to waste by folks without optical illusion problems.

Editor
December 19, 2012 9:54 am

Thanks for releasing this Nic 🙂
I’ll just add that the estimated sensitivity would be reduced still further by taking into account the solar forcing beyond TSI that the IPCC now admits to be indicated by the paleo evidence (page 7-43, lines 1-5):

Many empirical relationships have been reported between GCR or cosmogenic isotope archives and some aspects of the climate system (e.g., Bond et al., 2001; Dengel et al., 2009; Ram and Stolz, 1999). The forcing from changes in total solar irradiance alone does not seem to account for these observations, implying the existence of an amplifying mechanism such as the hypothesized GCR-cloud link. We focus here on observed relationships between GCR and aerosol and cloud properties.

The more the forcing behind a given temperature rise, the lower the sensitivity, and the sun was at “grand maximum” levels according to Usoskin 2007 from roughly 1920 to 2000 – and no, it makes no qualitative difference whether solar activity was merely “high” across this span rather than “exceptional,” as Raimund Muscheler claims. But if you do think it matters, please note that Usoskin thinks Muscheler’s revision is wrong. (I’d look up that last reference too but I have to run!)

P. Solar
December 19, 2012 9:55 am

Silly SODs at the IPCC seem intent on making a AR5 of themselves.

December 19, 2012 9:57 am

Henry@Nic
this is all rubbish
There is no man made global warming:
http://blogs.24.com/henryp/2011/08/11/the-greenhouse-effect-and-the-principle-of-re-radiation-11-aug-2011/
there never was:
a balance sheet / test result
There is no man made global cooling either
there never was
a balance sheet /test result there either.
All we had is natural warming from 1927
(ignore the “Global” temp. record before that time; they did not even re-calibrate thermometers in those days once they were manufactured)
And now we are on our way to natural cooling since 1998.
note the recent graph of AR5 and see the actual measurements going up from 1992 reaching a maximum in 1998 and now curving down – a real binomial curve clearly visible for anyone who would care to have a good look.
As predicted,
http://blogs.24.com/henryp/2012/10/02/best-sine-wave-fit-for-the-drop-in-global-maximum-temperatures/
I might add.
Each weather station has its own sine wave, depending on the make up of the chemicals lying on top of the atmosphere.
Global warming.
Global cooling.
Everything is natural….
we are now cooling.
http://www.woodfortrees.org/plot/hadcrut4gl/from:2002/to:2013/plot/hadcrut4gl/from:2002/to:2013/trend/plot/hadcrut3vgl/from:2002/to:2013/plot/hadcrut3vgl/from:2002/to:2013/trend/plot/rss/from:2002/to:2013/plot/rss/from:2002/to:2013/trend/plot/gistemp/from:2002/to:2013/plot/gistemp/from:2002/to:2013/trend/plot/hadsst2gl/from:2002/to:2013/plot/hadsst2gl/from:2002/to:2013/trend
and we will continue to cool, until ca. 2040
Live with it. Prepare for it.

Jean Parisot
December 19, 2012 9:58 am

So the simple message is: The observed ECS using IPCC AR5 values is 1.75 degrees. Congratulations are due to the IPCC, the member states, our science community, and concerned citizens of the globe everywhere for having saved the earth through tireless advocacy, study, and innovation. The people of the earth have risen to the challenge and beaten our goal. Champagne will be served in the lobby, thank you.

December 19, 2012 10:12 am

I still say this modeling foolishness and obsessing over a part of a degree C that is firmly in the error band is simply masturbation with millions of taxpayer dollars. That is simply wrong; masturbation should be done by one’s self, not in public. This was a nice write-up, and thanks, Nic.

Bill Treuren
December 19, 2012 10:38 am

Taking the C out of CAGW is very important. As a member of the oil exploration community, it’s clear to me that peak oil is hard to support, but lifting production by 100% is hard to see being possible, and a business-as-usual model tends to imply that this will happen.
A sane application of efficient technologies throughout the world, where pollution controls focus on substances that actually are pollutants, is what the world needs, so we can all enjoy the fruits of economic growth, not just the elite ruling class that fills the UN.

December 19, 2012 11:02 am

I think it might be useful to consider what Union General William Tecumseh Sherman said during the American Civil War. WTTE, “A good general gives his opponent two alternatives, both of which are bad.” I am not sure which of us skeptics has achieved this, but I would suggest that now the IPCC, in writing the AR5, is faced with two alternatives, both of which are bad.

RobW
December 19, 2012 11:03 am
December 19, 2012 11:08 am

Richard,
If there is significant “warming in the pipeline”, the energy needs to be stored somewhere for years. The atmosphere is not able to hide enough energy without it showing up as heat and radiating out. Humidity alters heat capacity, but we watch for that. The deep ocean is often given as the likely storage zone; in principle, heat being locked into the deep ocean now could come back to the surface decades later. It seems far-fetched that the energy could sneak to the deep without being spotted on its way down by Argo.
So the “enormous advantage of GCM models” – it is possible to run them using assumptions without supporting observation. The models kind of worked where the training period was warming and the predicted period was also observed to be warming. But now the CO2 is going up and the temperature seems fairly stable, and this was not predicted by the GCMs in advance. You can always “improve” a model to account for past failures to predict behavior. Remember aerosols. However, it is just a glorified gut feeling until the observations back up the assumptions.

D Böehm
December 19, 2012 11:12 am

Bill Treuren,
I agree with your comment 100%. AGW may exist, but if so, it is only a third order forcing (per Willis Eschenbach’s definition).
There is nothing ‘catastrophic’ about the rise in the (harmless, beneficial) trace gas CO2. Economic growth in the less developed countries is inevitable, therefore CO2 will rise from 0.03% to 0.04%, and the biosphere will be the beneficiary. There is no downside to this minuscule rise in CO2.
Fortunately, the rise in CO2 is truly harmless. Developing countries will continue to generate CO2, regardless. Only the eco-totalitarians oppose economic growth, for their own self-serving, nefarious reasons. Political power and control are their true motivations. Science is nothing but a thin veneer to cover their anti-West agenda.

Manfred
December 19, 2012 11:32 am

There are 2 ways to show that a result is wrong:
1. Prove assumptions are wrong.
2. Suppose the assumptions are right and prove the computation is wrong.
This estimate follows 2. The good thing about this way is that it is hard to ignore.
Assumptions are most likely wrong as well, because following them, we would still be at the bottom of the little ice age, probably the coldest episode of the last 2000+ years, without human interference. Part of this recovery is most likely natural, and sensitivity even lower. Another issue is UHI.

December 19, 2012 11:46 am
Doug Proctor
December 19, 2012 12:14 pm

There are so many pieces of the puzzle with questionable basis, one wonders what those, like Lewandowsky, mean by having “99%” certainty.
The AR5: what temperature is the world “certain” to have at 2100, at the 95% level? I’m asking not for a range, but a NUMBER. Or at 2025?
Statistically, what does a high probability of a large range of events mean?

eyesonu
December 19, 2012 12:17 pm

Matt Ridley’s Op-Ed today in the Wall Street Journal was excellent.
Nic Lewis’ post here makes for an even better day. Well done.
And then ferd berple says: December 19, 2012 at 7:48 am on this same post just makes today an even better one.
And then when we have all survived the feared (by some) Mayan apocalypse the weekend will just be absolutely terrific.

Gail Combs
December 19, 2012 12:21 pm

cui bono says:
December 19, 2012 at 8:19 am
Far from a triumphant grin, it is difficult to repress anger…..
>>>>>>>>>>>>>>>>>>>>>>>>>
I agree and I am old with no children. I can not imagine the anger of a parent when they realize not only have they been fleeced but their children’s future has been mortgaged and trashed just to feed the egos and bank accounts of a bunch of liars and cheats.

Scarface
December 19, 2012 12:21 pm

@Nic Lewis
“Conclusion: In the light of the current observational evidence, in my view 1.75°C would be a more reasonable central estimate for ECS than 3°C, perhaps with a ‘likely’ range
of around 1.25–2.75°C.”
Wouldn’t that have to be 1.25-2.25°C?
Or am I missing the point completely?

R Barker
December 19, 2012 12:28 pm

Nice work Nic. When model forecasts and observations part company, you have to stay with the observations.

Frank K.
December 19, 2012 12:28 pm

HenryP says:
December 19, 2012 at 9:57 am
“we are now cooling…and we will continue to cool, until ca. 2040
Live with it. Prepare for it.”
Related news story…
Down to -50C: Russians freeze to death as strongest-in-decades winter hits
“Russia is enduring its harshest winter in over 70 years, with temperatures plunging as low as -50 degrees Celsius. Dozens of people have already died, and almost 150 have been hospitalized.”
“The country has not witnessed such a long cold spell since 1938, meteorologists said, with temperatures 10 to 15 degrees lower than the seasonal norm all over Russia.”
“Across the country, 45 people have died due to the cold, and 266 have been taken to hospitals. In total, 542 people were injured due to the freezing temperatures, RIA Novosti reported.”

Mitigation, anyone??

pdtillman
December 19, 2012 12:32 pm

Re: Aldrin et al. 2012.
Is this
* Magne Aldrin et al., 2012, “Bayesian estimation of climate sensitivity based on a simple climate model fitted to observations of hemispheric temperatures and global ocean heat content,” Environmetrics, Volume 23, Issue 3, pages 253–271.
Doesn’t appear to be an empirical estimate!
“The [climate sensitivity] mean is 2.0°C… which is lower than the IPCC estimate from the IPCC Fourth Assessment Report (IPCC, 2007), but this estimate increases if an extra forcing component is added, see the following text. The 95% credible interval (CI) ranges from 1.1°C to 4.3°C, whereas the 90% CI ranges from 1.2°C to 3.5°C.”

pdtillman
December 19, 2012 12:34 pm

Nic, do you have a complete citation handy for
* Ring et al. (2012) also estimates ECS as 1.6°C, using the HadCRUT4 temperature record (1.45°C to 2.01°C using other records).

Henry Galt
December 19, 2012 12:37 pm

Mark my, far shorter words. Eventually it will be possible to show that mankind’s contribution to the planetary CO2 budget echoes the Reid Bryson effect: ‘You can go outside and spit and have the same effect as doubling carbon dioxide.’

December 19, 2012 12:53 pm

Don’t get me wrong, I do enjoy the articles and the comments although I must confess some of the science is way above my pay grade. I greatly admire those swimming against the tide of ‘scientific consensus’ when it comes to CAGW, discussing the real science behind the climate.
However…..
I also feel that a few contributors and commenters on this blog have unrealistic expectations. They seem to believe that when they prove that the science behind the CAGW theory is wrong, people will abandon ‘the project’. I believe that it doesn’t matter anymore what the science says. Global temperatures might flatten while CO2 goes up unabated (as is presently the case), and they will still blame anthropogenic CO2 for global warming, even when the data tells them it isn’t warming anymore. Temperatures and CO2 might go up or down; it matters not. For them, Anthropogenic Global Warming is a fact, no matter what the data says.
Hundreds of billions of dollars have been invested by individuals, financial institutions, governments, oil companies, NGOs, energy companies and politicians in ‘Green Technology’, ‘Green Energy’ and the whole ‘Green Ideology’.
They are not going to change just because the facts are proven wrong.
Compare the CAGW debate with the Hayek- Keynes debate. Proponents and opponents have been debating Hayek versus Keynes for decades. While the reality shows, looking at the economies of Greece, Portugal, Italy, Spain etc., that Keynes isn’t working, people still vote for politicians advocating this model.
The IPCC is part of and is controlled by the biggest quasi-political organisation in the world.
An organisation where democracies have been a minority since 1958.
An organisation that is controlled by a bunch of corrupt, dictatorial countries.
Last year……
Libya, China, Russia, Cuba, Saudi Arabia and Cameroon all had seats on the Human Rights Council.
North Korea had the presidency of the UN Conference on Disarmament
Iran had a seat on the UN’s Commission on the Status of Women.
Pakistan served as acting head of a U.N. body called the Counter-Terrorism Implementation Task Force.
This is an organisation (Of non-elected bureaucrats, politicians and hypocritical self-serving kleptomaniacs) trying to impose a global taxation system.
An organisation trying to limit free speech.
An organisation trying to restrict the flow of information by ‘regulating’ the Internet.
Hardly an organisation you can trust on anything.
And still people support this organisation and take this organisation and its ‘reports’ seriously.
So when will CAGW and Keynes be abandoned?
Both theories will be finished when the economy collapses, when there is no more money for their boondoggles and when the sheeple realise they’ve been taken by the hypocritical self-proclaimed elite.

Gail Combs
December 19, 2012 12:53 pm

The bit that stuck out like a sore thumb when I looked at the AR5 table on page 8-39 (thanks, Mr Rawls) was the small amount of radiative forcing attributed to H2O: around 0.1 W/m², while CO2 gets assigned ~1.7 W/m².
We know (or rather are told) that H2O is a much stronger greenhouse gas than CO2, and that water vapor is either the same or decreasing [see graph], so H2O as an ‘amplifier’ of CO2 no longer flies.
So why hasn’t H2O been assigned an amount greater than that of CO2 this time around?
My take home from this whole exercise is everything is assigned a minimum value if positive or a maximum value if negative to amplify the radiative forcing attributed to CO2. This includes leaving out as many other factors as possible.
I can not see how any logical person can believe that AR5 table after even a cursory knowledge of the subject. If that table is bogus then the equilibrium climate sensitivity is also called into question.
I do however appreciate the fact that Nic Lewis has taken the data as presented by the SOD and illustrated where the conclusions drawn are wrong. – Hats off to you.

Gail Combs
December 19, 2012 12:56 pm

Darn the computer hiccuped again.
H2O vs CO2 – http://www.globalwarmingart.com/images/7/7c/Atmospheric_Transmission.png
Global relative Humidity (water vapor) – http://i38.tinypic.com/30bedtg.jpg

Gary Hladik
December 19, 2012 1:06 pm

Alec Rawls says (December 19, 2012 at 9:54 am): “I’ll just add that the estimated sensitivity would be reduced still further by taking into account the solar forcing beyond TSI that the IPCC now admits to be indicated by the paleo evidence (page 7-43, lines 1-5):”
Indeed. If I read the article correctly, if any of the supposed historical temperature increase is caused by factors (solar and/or something else) other than so-called “greenhouse” gasses, then the ECS drops even lower. So the 1.6 degrees per doubling figure is an upper limit.

BillD
December 19, 2012 1:13 pm

Isn’t it great that a retired financier who knows some math can set all of the scientists and their graduate students straight? Not sure why anyone would want to bother with a Ph.D. when the the amateurs seem to be so much better. Why don’t the retired finacial wizzards and retired engineers just take over all of those Ph.D. science jobs. US science is the best around the world.. Think of how good we could be if we just replaced all of the know nothing professionals and professors with amateurs. And while we are at it, let’s replace all of those pesky expensive peer-reviewed journals with the WSJ and blogs.

Richard M
December 19, 2012 1:16 pm

It appears the sensitivity is based on a warming of 0.727°C (love the accuracy). If that warming is too high then the sensitivity will also be too high. Given the adjustments to the temperature record I suspect both are true.

December 19, 2012 1:24 pm

Excellent Nic.
As you are comparing recent with 19th century observations, there is something you should be aware of, which is the sensitivity of minimum temperatures to near ground aerosols and aerosol seeded clouds.
Briefly, deriving average temperature from min/max temperatures over-estimates the average temperature increase (since 1950 and probably since the 19th century) by more than 40%.
http://www.bishop-hill.net/blog/2011/11/4/australian-temperatures.html

Nic Lewis
December 19, 2012 1:32 pm

pdtillman
“Nic, do you have a complete citation handy for * Ring et al. (2012) also estimates ECS as 1.6°C,…”
Sure. And freely available at http://www.scirp.org/journal/acs/
Michael J. Ring, Daniela Lindner, Emily F. Cross, Michael E. Schlesinger:
Causes of the Global Warming Observed since the 19th Century. Atmospheric and Climate Sciences, 2012, 2, 401-415 doi:10.4236/acs.2012.24035 Published Online October 2012
You should look at Table 3, BTW. It reveals the drop of 0.9°C in the ECS estimate using this climate model, from 2.5°C in the original paper to 1.6°C now, is entirely accounted for by a computer code error that they eventually discovered. 

John West
December 19, 2012 1:43 pm

@ Gail Combs
Looks to me like the only change in forcing from H2O in the table is what they label H2O (Strat.), which I assume to be stratospheric H2O. I’m OK with no radiative anomaly from water vapor, since RH doesn’t seem to have changed, in defiance of all their positive-feedback warnings. What I find most objectionable in the table is the lack of any change in radiative forcing from variation in the solar spectrum. They continue to push the idea that TSI variance is all there is to consider. It’s ludicrous. Also objectionable is the lack of any notion in the table that there could be changes in cloud cover, or in albedo other than surface albedo.
In other words, I agree: that table is complete fiction, designed to support an outdated but essential hypothesis, for political leverage.

Nic Lewis
December 19, 2012 1:48 pm

Scarface
“Conclusion: In the light of the current observational evidence, in my view 1.75°C would be a more reasonable central estimate for ECS than 3°C, perhaps with a ‘likely’ range of around 1.25–2.75°C.”
“Wouldn’t that have to be 1.25–2.25°C? Or am I missing the point completely?”
Good question. Let me explain. The probability distribution for ECS is asymmetrical, with a long upper tail. That is essentially because the uncertainty in the change in forcing, net of ocean heat uptake (the denominator in the ECS estimate), is much greater than the uncertainty in the change in global surface temperature (the numerator). So, with normally distributed observational errors, the estimate for the reciprocal of ECS – the climate feedback parameter – will be nearly symmetrical.
Let’s take the lower bound on, and central estimate for, ECS as given: 1.25°C and 1.75°C. Then 1/1.75 = 0.57 and 1/1.25 = 0.80, so to be symmetrical the lower bound for the climate feedback parameter (corresponding to the upper bound for ECS) would be 0.57 – (0.80 – 0.57) = 0.34, implying an ECS of 1/0.34 = 2.9°C. But there is some error in the temperature change, so one needs to move the 2.9°C upper bound for ECS a bit towards the central estimate. Hence my figure of 2.75°C (to the nearest 0.25°C).
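A minimal Python sketch of the reciprocal argument above, using only the quoted 1.25°C lower bound and 1.75°C central estimate; the variable names are illustrative and are not taken from the original calculation.

ecs_lower = 1.25     # quoted lower bound for ECS (deg C per CO2 doubling)
ecs_central = 1.75   # quoted central estimate for ECS (deg C)

# Work with the reciprocal of ECS, which is proportional to the climate feedback
# parameter and is roughly symmetrically distributed when the forcing uncertainty
# dominates the temperature uncertainty.
recip_central = 1.0 / ecs_central                            # ~0.571
recip_upper = 1.0 / ecs_lower                                # ~0.800
recip_lower = recip_central - (recip_upper - recip_central)  # mirror about the centre, ~0.343

ecs_upper = 1.0 / recip_lower   # ~2.9 deg C, before nudging towards 2.75 deg C to allow for temperature error
print(round(recip_lower, 2), round(ecs_upper, 1))            # 0.34 2.9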

Paul Marko
December 19, 2012 1:58 pm

Here’s what Scientific American thinks about it. It’s an uphill battle.
Leaked Report Confirms Human-Induced Climate Change
The world is on track for warming of at least 2 degrees Celsius, according to a leaked draft of the next IPCC report. David Biello reports
A rogue reviewer posted a draft this week of the Intergovernmental Panel on Climate Change’s next report, in a bid to promote climate change denial.
Instead the draft reaffirms humanity’s starring role in global warming, which, along with sea level rise is now “unequivocal.” Also human caused CO2 increases are now “virtually certain” to be responsible for trapping extra heat. And it is “extremely likely that human activities have caused more than half of the observed increase in global average surface temperatures since the 1950s.”
In other words, we’re well on our way towards global warming of 2 degrees Celsius or more by 2050, if not sooner. After all, concentrations of CO2 in the atmosphere are likely to hit 400 parts-per-million this coming spring or next.
The final draft isn’t due until next fall and the leak highlights a need to consider updating the IPCC process. Global warming no longer needs confirmation—instead the world needs solutions to climate change’s challenges. After all, “many aspects of climate change will persist for centuries even if concentrations of greenhouse gases are stabilized.”
—David Biello
http://www.scientificamerican.com/podcast/episode.cfm?id=leaked-report-confirms-human-induce-12-12-16

Nic Lewis
December 19, 2012 1:59 pm

Richard M
“It appears the sensitivity is based on a warming of 0.727°C (love the accuracy).”
This is precision, not accuracy, hence my showing it in the table as 0.73°C :). I don’t want to worry about accumulating rounding errors! I make clear that the error standard deviation is 0.12°C.

RACookPE1978
Editor
December 19, 2012 2:05 pm

BillD says:
December 19, 2012 at 1:13 pm
Isn’t it great that a retired financier who knows some math can set all of the scientists and their graduate students straight? Not sure why anyone would want to bother with a Ph.D. when the the amateurs seem to be so much better. Why don’t the retired finacial wizzards and retired engineers just take over all of those Ph.D. science jobs. US science is the best around the world.. Think of how good we could be if we just replaced all of the know nothing professionals and professors with amateurs. And while we are at it, let’s replace all of those pesky expensive peer-reviewed journals with the WSJ and blogs.

Your sarchasm – that gaping hole between an ivory-tower theist-dogmatist and the real world – is underwhelming.
So, when the CAGW theology forces the pal-review process, the funding process, the promotion process and the grant renewal and hiring process into the religious orthodoxy, and the so-called “scientists” can be proved wrong by their own numbers and lies and exaggerations, who then “polices the custodians”?
The CAGW dogma is proven wrong by a little problem called “inconvenient data”: the past 2000 years prove its concepts wrong through the Roman Warming Period, the Dark Ages, the Medieval Warming Period and the Little Ice Age; the past 250 years prove its concepts wrong as the earth rises naturally from the Little Ice Age; the past 112 years prove its conceits wrong because temperature in the measurement era changes up, down and steady regardless of CO2 levels; the past 16 years prove its conceits wrong as CO2 rises and temperatures remain steady. In only one 23-year period in all of earth’s history (from 1975 to 1998) do both CO2 and temperature rise at the same time.
Your lies are made for control, power, influence, and your self-importance and conceit. Your funding in the education community renders YOUR opinion corrupt and self-funding – self-supporting as YOU fight for YOUR future income.

December 19, 2012 2:11 pm

The black line “total” bears no resemblance to the real world. Just compare it with the CET record.
http://climexp.knmi.nl/data/tcet_1770:2012_mean1a.png
http://wattsupwiththat.files.wordpress.com/2012/12/clip_image002_thumb4.jpg?w=650&h=477
All these “forcings” and “sensitivities” will disappear the moment the cooling AMO cycle adds to the already cold PDO.

Guy Leech
December 19, 2012 2:13 pm

This is an impressive analysis, but how can conclusions be drawn about the influence of CO2 on near-surface atmospheric temperature from temperature data of questionable accuracy, geographical coverage and meaning (the averaging problem; air temperature, not heat content), covering merely the 140 years since 1871? Doesn’t an estimate of the value of one parameter in a theoretical climate or heat-balance model – CO2 sensitivity – require the model to reproduce historic climate change proxies over much longer periods than 140 years: cycles and secular changes within the Holocene, within the 3-million-year ice age, over the past few hundred million years, etc.? There is no equilibrium climate.

Nic Lewis
December 19, 2012 2:22 pm

Gail Combs says:
“The bit that stuck out like a sore thumb when I looked at the AR5 table on page 8-39 (thanks Mr Rawls) was the small amount of radiative forcing attributed to H2O. Around 0.1 W/m^2, while CO2 gets assigned ~1.7 W/m^2.”
Actually, I think there is some logic in that. Unlike long lived, well mixed greenhouse gases like CO2, CH4 and N2O, and even the shorter lived O3, in the troposphere H2O is a transitory gas and its concentration is thought to be strongly dependent on temperature. For that reason, it is treated as a temperature-dependent feedback rather than as an autonomous forcing. The ~0.1 W/m^2 water vapour forcing relates to the stratosphere, where the mechanism determining water vapour level is different.
If it turns out, contrary to what most climate scientists believe (and climate models assume), that tropospheric H2O levels do not in fact respond to surface temperature, but are determined by some temperature-independent mechanism, then that would be a different matter. And, as your graph shows, there is certainly some evidence that relative humidity at least is decreasing in the mid/upper troposphere, which is where water vapour has the greatest effect on outgoing LW radiation. But in my analysis I didn’t want to challenge the accepted science, just to point out that the IPCC wasn’t taking proper account of recent observations in arriving at its climate sensitivity range.

Gary Pearse
December 19, 2012 2:45 pm

Frank K. says:
December 19, 2012 at 12:28 pm
Down to -50C: Russians freeze to death as strongest-in-decades winter hits
“Russia is enduring its harshest winter in over 70 years,
Those who may have noted my many comments on predicting future climate (weather?) by looking back 60–70 years will see that, once again, something is the worst in 70 (usually 60) years. I have made this comment on WUWT topics on droughts in Texas, wildfires in the western US, floods, cold temps in the Pacific Northwest, tornado frequency and strength, etc. etc. etc. I know you all know that there have been 60-year (or so) cycles and that I didn’t discover them. But by golly I’m going to continue to use this skillful model again and again.
One other bugbear I have raised as the CAGW fortress walls crumble: what the dickens can climate scientists mean by a 95% confidence level!!! This confidence level was being given when the ECS was triple what it is now. It has been used for temperature predictions (I don’t buy the semantics of “projections” – they have been saying these things will come to pass) and all the catastrophes that used to await us. Meanwhile, meteorologists are happy to give, say, a 60% chance of rain two days from now. I would like to see this 95% figure reviewed and evaluated, or forevermore we will just laugh at anyone claiming a 95% confidence level.
Finally, I note the trolls are staying away in droves from these recent IPCC Humpty Dumpty posts. Nippier weather has driven some away, but even Joel Shore isn’t making an appearance to defend AR4 against the AR5 SOD!

Nic Lewis
December 19, 2012 2:47 pm

pdtillman says:
“Is this * Magne Aldrin et al., 2012, “Bayesian estimation of climate sensitivity based on a simple climate model fitted to observations of hemispheric temperatures and global ocean heat content,” Environmetrics, Volume 23, Issue 3, pages 253–271.
Doesn’t appear to be an empirical estimate!
“The [climate sensitivity] mean is 2.0°C… which is lower than the IPCC estimate from the IPCC Fourth Assessment Report (IPCC, 2007), but this estimate increases if an extra forcing component is added, see the following text. The 95% credible interval (CI) ranges from 1.1°C to 4.3°C, whereas the 90% CI ranges from 1.2°C to 3.5°C.”
Yes. Note that the paper quotes the mean estimated sensitivity of 2.0°C, but with a strongly asymmetrical distribution the mean is not a good central estimate. I cited, as stated, their most likely estimate (the peak probability density from their main results sensitivity PDF graph) – it is actually more like 1.55°C than 1.6°C. The extra forcing component mentioned is additional aerosol-cloud interaction forcing, which there is very little reason to think is needed.
This type of study provides what constitutes a standard observationally based (strictly, observationally-constrained) estimate in climate science. So far as the climate model involved goes, it is about as simple as you can get while retaining separate hemispheres (vital to constrain the aerosol forcing estimate). And I think Aldrin et al. did a thorough job. But the Bayesian approach is full of pitfalls. In particular, use of uniform (or expert) priors for climate sensitivity and/or effective ocean diffusivity will typically lead to climate sensitivity being overestimated and having far too long an upper tail. I have been trying to persuade the key IPCC lead authors involved of this, and that it is essential to use a computed noninformative prior (with a view to achieving probabilistic results that reflect objective measures of probability, not the standard Bayesian subjective belief). But I don’t think they really understand the issue properly – maybe they don’t want to either.
I have a paper that uses an objective Bayesian method to estimate climate sensitivity undergoing peer review.
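A toy numerical illustration of the prior-choice point made above, using made-up numbers – it is not Aldrin et al.’s model, nor the method in the paper under review. It constrains the feedback parameter with a single Gaussian ‘observation’ and then compares the posterior for ECS under a prior that is uniform in ECS with one that is uniform in the feedback parameter (equivalent to a 1/ECS^2 prior on ECS); the uniform-in-ECS prior produces the fatter upper tail.

import numpy as np

F2x = 3.71                           # assumed forcing for doubled CO2, W/m^2
lam_obs, lam_sd = F2x / 1.6, 0.6     # hypothetical feedback-parameter observation and its std dev

ecs = np.linspace(0.5, 10.0, 2000)   # ECS grid, deg C
like = np.exp(-0.5 * ((F2x / ecs - lam_obs) / lam_sd) ** 2)   # Gaussian likelihood on the grid

def percentile(posterior, q):
    # ECS value at cumulative probability q for a gridded, unnormalised posterior
    cdf = np.cumsum(posterior) / np.sum(posterior)
    return np.interp(q, cdf, ecs)

post_uniform_ecs = like * 1.0        # prior uniform in ECS
post_uniform_lam = like / ecs**2     # prior uniform in the feedback parameter (~1/ECS^2 in ECS space)

for name, post in [("uniform-in-ECS prior", post_uniform_ecs),
                   ("uniform-in-feedback prior", post_uniform_lam)]:
    print("%s: median %.2f C, 95th percentile %.2f C"
          % (name, percentile(post, 0.5), percentile(post, 0.95)))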

Roger Knights
December 19, 2012 2:51 pm

Unsettling!

jackmorrow
December 19, 2012 3:00 pm

Other_Andy Says
You got that right Andy!

Skiphil
December 19, 2012 3:21 pm

BillD,
A sarcastic appeal to authority and credentialism is even more inappropriate than usual when we are in the midst of such an interesting episode of “outsider” insights into the failings of such anointed experts. This doesn’t by any means imply that your favored scientific institutions are *always* wrong, but we know they are not **always right**.
Reasonable scientists should avoid your resentments and emulate the example of Dr. Judith Curry of Georgia Tech:
Judith Curry’s thread on Nic Lewis article

JC summary: The leak of the SOD was a good thing; the IPCC still has the opportunity to do a much better job, and the wider discussion in the blogosphere and even the mainstream media places pressure on the IPCC authors to consider these issues; they can’t sweep them under the rug as in previous reports.

nvw
December 19, 2012 3:35 pm

Very nice work, Nic Lewis, and an appropriate use of the leaked SOD. Inasmuch as the IPCC is stating that the leaked SOD is not the final version, you should formally contact the appropriate chapter editor and notify them of your results. Please keep us informed of your progress.

Kevin Kilty
December 19, 2012 3:57 pm

Ridley describes the IPCC process as having substituted

…evidence-based policy-making with policy-based evidence-making…

Priceless!

mpainter
December 19, 2012 4:09 pm

BillD
When indoctrination passes for science, what happens?

Bill Illis
December 19, 2012 6:11 pm

IPCC scientists can do the same math as Nic Lewis here.
And they have.
So what is wrong with their objectivity? It is clear enough.
Missing energy – it is actually more than 50%.

D Böehm
December 19, 2012 7:59 pm

jackmorrow,
I agree, Other_Andy got it right.

Hilary Ostrov (aka hro001)
December 19, 2012 10:34 pm

Nic Lewis says: December 19, 2012 at 2:22 pm

[…] in my analysis I didn’t want to challenge the accepted science, just to point out that the IPCC wasn’t taking proper account of recent observations in arriving at its climate sensitivity range.

So, the IPCC-niks wring their hands in despair because of their self-declared “failure to communicate”, and defend to the death (or at least to the detriment of their rapidly declining credibility) the so-called “transparency” and “objectivity” of their “gold-standard” process.
Andrew Weaver is an IPCC Lead Author – and modeller – whose “objectivity” is obviously compromised by his nomination as a Green Party candidate for the May 2013 British Columbia provincial election, and promotion to Deputy Leader of BC Greens – and whose candidacy was endorsed by no less a luminary than David Suzuki. If this has caused even a ripple of consternation at IPCC HQ, it is certainly not evident in their most recent “updated” list of authors and Review Editors [h/t Richard Betts via BH]: As of Nov. 12/2012 (almost two full months after he had declared his candidacy), Weaver, yet another who was quite content to assume the laurels of “shar[ing] the Nobel Peace Prize”, is still listed as a WG1 Ch. 12 Lead Author.
They seem to have garnered some highly dubious support from the likes of Lewandowsky and blinkered commentators such as Keith Kloor – not to mention others higher up in the MSM green is great advocacy food-chain.
And in the meantime, Nic Lewis has quietly and professionally put paid to one of their most cherished mantras.
In its responses to the InterAcademy Council (IAC)’s 2010 review of the IPCC’s processes and procedures, the IPCC clearly chose to reject the lifeline that had been handed to it. The IPCC most certainly did not take “proper account” of the IAC’s key recommendations.
Perhaps the IAC would have been kinder had they told the IPCC to, in effect, fold up their tents and go home – before they got any further behind, as they seem to have done in the intervening two years.

John Shade
December 20, 2012 2:11 am

Presumably the IPCC’s days are numbered. It has after all been a bit of a liability for ‘the cause’ for many years now.
The final flourish may well see climate science de-emphasised (far too troublesome these days when so many qualified people find fault in it as illustrated by the primary focus of this post and thread), and the main emphasis of the IPCC shifted more honestly to what it might well have been behind the scenes all along, right from the inception of this odious organisation – the ill-defined ragbag of interventionism called ‘sustainable development’, or perhaps more evocatively, ‘suppressed development’.
Paul Marko (19 Dec, 1:58pm) draws attention above to a remark by an associate editor of Scientific American:
“The final draft isn’t due until next fall and the leak highlights a need to consider updating the IPCC process. Global warming no longer needs confirmation—instead the world needs solutions to climate change’s challenges.”
We may merely be at another phase of a ‘grand plan’.

Leonard
December 20, 2012 5:56 am

Nic,
I do not see the elephant in the room in your analysis. That elephant is the possibility that much, if not most, of the temperature rise is from natural variation (Bob Tisdale’s ENSO variations, solar magnetic field variation, or other causes). If much of the rise was natural, this means that it is likely that the sensitivity range you state is a maximum possible, not the likely range.

Nic Lewis
December 20, 2012 6:45 am

Leonard,
“I do not see the elephant in the room in your analysis. … it is likely that the sensitivity range you state is a maximum possible, not the likely range.”
You may be correct. But the IPCC reports do not accept that a significant, let alone a major, part of the temperature rise was due to natural variation. The point about my analysis is that, even taking the IPCC’s position on that point and using its own best estimates – giving preference to its observationally based estimate for aerosol forcing over its composite estimate – climate sensitivity still comes out at only a little over half the IPCC’s central estimate, and below the bottom of its ‘likely’ range.

Bill Illis
December 20, 2012 6:46 am

Just noting that there was a recent paper that examined the Ocean Heat Content in the North Atlantic below 2,000 metres.
This is important because it is only the second study of the ocean below 2,000 metres, where extra forcing could be accumulating / hiding. (The first was for the Southern Ocean around Antarctica, and it found a very small accumulation.)
This study says the deep ocean in the North Atlantic below 2,000 metres was cooling, as opposed to accumulating energy. So, no need to adjust the formulae from Nic.
http://www.nature.com/ngeo/journal/v5/n12/full/ngeo1639.html
http://www.nature.com/ngeo/journal/v5/n12/images_article/ngeo1639-f1.jpg

Hilary Ostrov (aka hro001)
December 20, 2012 11:49 am

John Shade says: December 20, 2012 at 2:11 am

Presumably the IPCC’s days are numbered. It has after all been a bit of a liability for ‘the cause’ for many years now.
The final flourish may well see climate science de-emphasised (far too troublesome these days when so many qualified people find fault in it as illustrated by the primary focus of this post and thread), and the main emphasis of the IPCC shifted more honestly to what it might well have been behind the scenes all along, right from the inception of this odious organisation – the ill-defined ragbag of interventionism called ‘sustainable development’ (SD), […]

I think we’ve been seeing signs of climate science “de-emphasis” – in favour of “sustainability” – accelerating for the past two years. Even as early as July 2009, Pachauri (whose unscripted pontifications and pronouncements have always struck me as being of the SD advocacy kind rather than anything remotely “scientific”) during the course of articulating his “vision” for AR5 had included the following word salad:

Equity, Fairness, Sustainable Development and Life Style Changes: Problems of collective action, or public good problems that may overlap with various parallel challenges, can only be solved if the solution is considered to be fair and based on adequate equity principles. In general, the equity principle has to be applied to inter- and intra-generational justice as a prerequisite for sustainable development as well as lifestyle changes.

Other signs include (but are not limited to): the birth of the IPCC’s younger sibling, IPBES, which has been waiting in the wings for about two years (and which, in reports of other UN acronymic offsprings’ activities, has gotten more mentions than the IPCC); the Rio+20 “outcome” document, in which the “final score” was climate change 22, sustainable/sustainability 400; not to mention the serial flops of the IPCC’s “main client”, the UNFCCC, during the course of its last four confabs.
So it would seem that these climate scientists’ days in the sun are perhaps numbered. Maybe deep in their heart of hearts – particularly those of the de-carbonize-now-committed persuasion – they know this. But rather than acknowledge that they are about to be knocked off their pedestal, these fading “stars” are lashing out in all directions – blaming everyone (particularly those who do not share their views) but themselves for their impending consignment to virtual irrelevance.

Roger Knights
December 20, 2012 12:34 pm

Bill Illis says:
December 20, 2012 at 6:46 am
Just noting that there was a recent paper that examined the Ocean Heat Content in the North Atlantic below 2,000 metres.
This is important because it is only the second study of the ocean below 2,000 metres, where extra forcing could be accumulating / hiding. (The first was for the Southern Ocean around Antarctica, and it found a very small accumulation.)
This study says the deep ocean in the North Atlantic below 2,000 metres was cooling, as opposed to accumulating energy. So, no need to adjust the formulae from Nic.

Here’s a blast from the past:

John Garrett says: December 31, 2011 at 7:18 pm
Missing: 7,000 quintillion joules of heat energy
If located, call Kevin Trenberth.
Reward offered.

Roger Knights
December 20, 2012 12:40 pm

hro001 says:
December 20, 2012 at 11:49 am

John Shade says: December 20, 2012 at 2:11 am
Presumably the IPCC’s days are numbered. It has after all been a bit of a liability for ‘the cause’ for many years now.
The final flourish may well see climate science de-emphasised (far too troublesome these days when so many qualified people find fault in it as illustrated by the primary focus of this post and thread), and the main emphasis of the IPCC shifted more honestly to what it might well have been behind the scenes all along, right from the inception of this odious organisation – the ill-defined ragbag of interventionism called ‘sustainable development’ (SD), […]

I think we’ve been seeing signs of climate science “de-emphasis” – in favour of “sustainability” – accelerating for the past two years.

But, if sustainability means windmills and solar power, they would just be jumping out of the frying pan and into the soup. The case for those power sources is delaminating even more rapidly than the case for CAGW.

Nic Lewis
December 20, 2012 1:49 pm

Bill Illis wrote:
This study says the deep ocean in the North Atlantic below 2,000 metres was cooling as opposed to accumulating energy.
Thanks for the pointer to that study, Bill.

ba
December 20, 2012 2:06 pm

“…Why don’t the retired finacial wizzards[sic] and retired engineers just take over all of those Ph.D. science jobs. ”
They certainly should be a consideration when ousting persistently biased, venal and vengeful occupants who are patently resistant to scientific processes of verification and have repeatedly been shown to be lacking, or outright corrupt.

Allan MacRae
December 21, 2012 5:28 am

Some history re aerosols – sorry no time now.
http://wattsupwiththat.com/2012/08/23/ar5-climate-forecasts-what-to-believe/#comment-1064671
Wonderful comments Pat- very informative.
Regarding hindcasting of models, you may have seen this exchange with Douglas Hoyt on aerosols. I would be interested in your opinion. I continue to correspond from time to time with Douglas, and would be pleased to invite him onto this thread if you so requested.
http://wattsupwiththat.com/2012/04/28/tisdale-a-closer-look-at-crutem4-since-1975/#comment-970931
markx says: April 29, 2012 at 9:56 am
ferd berple says: April 28, 2012 at 12:09 pm
Climate scientists complain when someone outside of climate science talks about climate science, but ignore the fact that climate science is no qualification to build reliable computer models.
Markx: IMHO, this is one of the most important observations made within these pages.
___________
Allan:
Agree – when you build a mathematical model, you first try to verify it. One method is to determine how well it models the past (“hindcasting”).
The history of climate model hindcasting has been one of blatant fraud. Fabricated aerosol data has been the key “fudge factor”.
Here is another earlier post on this subject, dating from mid-2009.
It is remarkable that this obvious global warming fraud has lasted this long, with supporting aerosol data literally “made up from thin air”.
Using real measured aerosol data that dates to the 1880’s, the phony global warming crisis “disappears in a puff of smoke”.
Regards, Allan
http://wattsupwiththat.com/2009/06/27/new-paper-global-dimming-and-brightening-a-review/#comment-151040
Allan MacRae (03:23:07) 28/06/2009 [excerpt]
FABRICATION OF AEROSOL DATA USED FOR CLIMATE MODELS:
Douglas Hoyt:
The pyrheliometric ratioing technique is very insensitive to any changes in calibration of the instruments and very sensitive to aerosol changes.
Here are three papers using the technique:
Hoyt, D. V. and C. Frohlich, 1983. Atmospheric transmission at Davos, Switzerland, 1909-1979. Climatic Change, 5, 61-72.
Hoyt, D. V., C. P. Turner, and R. D. Evans, 1980. Trends in atmospheric transmission at three locations in the United States from 1940 to 1977. Mon. Wea. Rev., 108, 1430-1439.
Hoyt, D. V., 1979. Pyrheliometric and circumsolar sky radiation measurements by the Smithsonian Astrophysical Observatory from 1923 to 1954. Tellus, 31, 217-229.
In none of these studies were any long-term trends found in aerosols, although volcanic events show up quite clearly. There are other studies from Belgium, Ireland, and Hawaii that reach the same conclusions. It is significant that Davos shows no trend whereas the IPCC models show it in the area where the greatest changes in aerosols were occurring.
There are earlier aerosol studies by Hand and Marvin in Monthly Weather Review going back to the 1880s and these studies also show no trends.
___________________________
Allan:
Repeating: “In none of these studies were any long-term trends found in aerosols, although volcanic events show up quite clearly.”
___________________________
Here is an email just received from Douglas Hoyt [my comments in square brackets]:
It [aerosol numbers used in climate models] comes from the modelling work of Charlson where total aerosol optical depth is modeled as being proportional to industrial activity.
[For example, the 1992 paper in Science by Charlson, Hansen et al]
http://www.sciencemag.org/cgi/content/abstract/255/5043/423
or [the 2000 letter report to James Baker from Hansen and Ramaswamy]
http://74.125.95.132/search?q=cache:DjVCJ3s0PeYJ:www-nacip.ucsd.edu/Ltr-Baker.pdf+%22aerosol+optical+depth%22+time+dependence&cd=4&hl=en&ct=clnk&gl=us
where it says [para 2 of covering letter] “aerosols are not measured with an accuracy that allows determination of even the sign of annual or decadal trends of aerosol climate forcing.”
Let’s turn the question on its head and ask to see the raw measurements of atmospheric transmission that support Charlson.
Hint: There aren’t any, as the statement from the workshop above confirms.
__________________________
IN SUMMARY
There are actual measurements by Hoyt and others that show NO trends in atmospheric aerosols, but volcanic events are clearly evident.
So Charlson, Hansen et al ignored these inconvenient aerosol measurements and “cooked up” (fabricated) aerosol data that forced their climate models to better conform to the global cooling that was observed pre~1975.
Voila! Their models could hindcast (model the past) better using this fabricated aerosol data, and therefore must predict the future with accuracy. (NOT)
That is the evidence of fabrication of the aerosol data used in climate models that (falsely) predict catastrophic humanmade global warming.
And we are going to spend trillions and cripple our Western economies based on this fabrication of false data, this model cooking, this nonsense?
*************************************************

Marc
December 22, 2012 11:13 am

Could someone please explain to me why convection is unable to remove any excess heat from the surface to the TOA in the event there is surface heating?
Why wouldn’t convective processes accelerate with higher surface temperatures and a static temperature in space?

December 23, 2012 4:08 pm

Nic Lewis, Matt Ridley reports you as saying:
“Taking the IPCC scenario that assumes a doubling of CO2, plus the equivalent of another 30 per cent rise from other greenhouse gases by 2100, we are likely to experience a further rise of no more than 1C.”
By my estimation, the nearest match to that description among the commonly shown RCP scenarios is RCP 4.5, which shows a near cessation of CO2 and CO2-equivalent emissions by 2050. I assume you did not intend to argue that we do not need to do anything about global warming because a scenario which shows greater reductions than have currently been committed to shows minimal warming. Of course, by simple calculation, for any forcing greater than 4.6 W/m^2, with an ECS of 1.61, the equilibrium response will be greater than 2 degrees C, so I cannot see how you have used a scenario consistent with BAU.
Would you please clarify by specifying exactly which scenario you intended, and why you think it represents BAU?
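For readers checking the arithmetic in the comment above, here is a minimal sketch. It assumes the conventional figure of about 3.71 W/m^2 of forcing for a doubling of CO2 (a value not stated in the comment itself) and the simple relation dT = ECS x dF / F_2x.

# Equilibrium warming for a given forcing, assuming dT = ECS * dF / F_2x,
# with F_2x ~ 3.71 W/m^2 for doubled CO2 (a conventional value, assumed here).
F_2X = 3.71   # W/m^2

def equilibrium_warming(ecs_c, forcing_wm2):
    # Equilibrium temperature response in deg C
    return ecs_c * forcing_wm2 / F_2X

print(equilibrium_warming(1.61, 4.6))   # ~2.0 deg C: the threshold cited above
print(2.0 * F_2X / 1.61)                # ~4.6 W/m^2: forcing needed for 2 deg C at an ECS of 1.61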