Updated climate sensitivity estimates using aerosol-adjusted forcings and various ocean heat uptake estimates

Guest essay by Nic Lewis

The Otto et al. paper has received a great deal of attention in recent days. While the paper’s estimate of transient climate response was low, the equilibrium/effective climate sensitivity figure was actually slightly higher than that in some other recent studies based on instrumental observations. Here, Nic Lewis notes that this is largely due to the paper’s use of the Domingues et al. upper ocean (0–700 m) dataset, which assesses recent ocean warming to have been faster than do other studies in the field. He examines the effects of updating the Otto et al. results from 2009 to 2012 using different upper ocean (0–700 m) datasets, with surprising results.

Last December I published an article here entitled ‘Why doesn’t the AR5 SOD’s climate sensitivity range reflect its new aerosol estimates?’ (Lewis, 2012). In it I used a heat-balance (energy-budget) approach based on changes in mean global temperature, forcing and Earth system heat uptake (ΔT, ΔF and ΔQ) between 1871–80 and 2002–11. I used the RCP 4.5 radiative forcings dataset (Meinshausen et al, 2011), which is available in .xls format here, conformed it with solar forcing and volcanic observations post 2006 and adjusted its aerosol forcing to reflect purely satellite-observation-based estimates of recent aerosol forcing.

I estimated equilibrium climate sensitivity (ECS) at 1.6°C, with a 5–95% uncertainty range of 1.0–2.8°C. I did not state any estimate for transient climate response (TCR), which is based on the change in temperature over a 70-year period of linearly increasing forcing and takes no account of heat uptake. However, a TCR estimate was implicit in the data I gave, if one makes the assumption that the evolution of forcing over the long period involved approximates a 70-year ramp. This is reasonable since the net forcing has grown substantially faster from the mid-twentieth century on than previously. On that basis, my best estimate for TCR was 1.3°C. Repeating the calculations in Appendix 3 of my original article without the heat uptake term gives a 5–95% range for TCR of 0.9–2.0°C.

The ECS and TCR estimates are based on the formulae:

(1) ECS = F ΔT / (ΔF − ΔQ) and (2) TCR = F ΔT / ΔF

where F is the radiative forcing corresponding to a doubling of atmospheric CO2 concentrations.
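To make the arithmetic concrete, here is a minimal Python sketch of the two formulae. The input values below are illustrative assumptions of mine, chosen to be near the figures discussed in this post, not the exact inputs of either study:

# Energy-budget best estimates per equations (1) and (2).
F_2x = 3.44   # W/m2: forcing for doubled CO2 (CMIP5 multimodel mean)
dT   = 0.76   # K:    temperature change, 1860-79 to the recent decade
dF   = 2.0    # W/m2: forcing change (illustrative assumption)
dQ   = 0.5    # W/m2: change in system heat uptake (illustrative assumption)

ECS = F_2x * dT / (dF - dQ)   # equation (1): ~1.74 C
TCR = F_2x * dT / dF          # equation (2): ~1.31 C
print(ECS, TCR)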

A short while ago I drew attention, here, to an energy-budget climate study, Otto et al. (2013), that has just been published in Nature Geoscience, here. Its author list includes fourteen lead/coordinating lead authors of relevant AR5 WG1 chapters, and myself. That study uses the same equations (1) and (2) as above to estimate ECS and TCR. It uses a CMIP5-RCP4.5 multimodel mean of forcings as estimated by general circulation models (GCMs) (Forster et al, 2013), likewise adjusting the aerosol forcing to reflect recent satellite-observation-based estimates – see Supplementary Information (SI) Section S1. Although the CMIP5 forcing estimates embody a lower figure for F (3.44 W/m2) than do those per the RCP4.5 database (F: 3.71 W/m2), TCR estimates using the two different sets of forcing estimates are almost identical, whilst ECS estimates are marginally higher using the CMIP5 forcing estimates[i].

Although the Otto et al. (2013) Nature Geoscience study illustrates estimates based on changes in global mean temperature, forcing and heat uptake between 1860–79 and various recent periods, it states that the estimates based on changes to the decade 2000–09 are arguably the most reliable, since that decade has the strongest forcing and is little affected by the eruption of Mount Pinatubo. Its TCR best estimate and 5–95% range based on changes to 2000–09 are identical to what is implicit in my December study: 1.3°C (uncertainty range 0.9–2.0°C).

While the Otto et al. (2013) TCR best estimate is identical to that implicit in my December study, its ECS best estimate and 5–95% range based on changes between 1860–79 and 2000–09 is 2.0°C (1.2–3.9°C), somewhat higher than the 1.6°C (1.0–2.9°C) per my study, which was based on changes between 1871–80 and 2002–11. About 0.1°C of the difference is probably accounted for by roundings and the difference in F factors due to the different forcing bases. But, given the identical TCR estimates, differences in the heat-uptake estimates used must account for most of the remaining 0.3°C difference between the two ECS estimates.

Both my study and Otto et al. (2013) used the pentadal estimates of 0–2000-m deep-layer ocean heat content (OHC) updated from Levitus et al. (2012), and made allowances, in line with recent studies, for heat uptake in the deeper ocean and elsewhere. The two studies’ heat uptake estimates differed mainly due to the treatment of the 0–700-m layer of the ocean. I used the estimate included in the Levitus 0–2000-m pentadal data, whereas Otto et al. (2013) subtracted the Levitus 0–700-m pentadal estimates from that data and then added 3-year running mean estimates of 0–700-m OHC updated from Domingues et al. (2008).

Since 2000–09, the most recent decade used in Otto et al. (2013), ended more than three years ago, I will instead investigate the effect of differing heat uptake estimates using data for the decade 2003–12 rather than for 2000–09. Doing so has two advantages. First, forcing was stronger during the 2003–12 decade, so a better constrained estimate should be obtained. Secondly, by basing the 0–700-m OHC change on the difference between the 3-year means for 2003–05 and for 2010–12, the influence of the period of switchover to Argo – with its larger uncertainties – is reduced.

In this study, I will present results using four alternative estimates of total Earth system heat uptake over the most recent decade. Three of the estimates adopt exactly the same approach as in Otto et al. (2013), updating estimates appropriately, and differ only in the source of data used for the 3-year running mean 0–700-m OHC. In one case, I calculate it from the updated Levitus annual data, available from NOAA/NODC here. In the second case I calculate it from updated Lyman et al. (2010) data, available here. In the third case I use the updated Domingues et al. (2008) data archived at the CSIRO Sea Level Rise page in relation to Church et al. (2011), here. Since that data only extends to the mean for 2008–10, I have extended it for two years at a conservative (high) rate of 0.33 W/m2 – which over that period is nearly double the rate of increase per the Levitus dataset, and nearly treble that per the Lyman dataset. The final estimate uses total system heat uptake estimates from Loeb et al. (2012) and Stephens et al. (2012). Those studies melded satellite-based estimates of top-of-atmosphere radiative imbalance with ocean heat content estimates, primarily updated from the Lyman et al. (2010) study. The Loeb (2012) and Stephens (2012) studies estimated average total Earth system heat uptake/radiative imbalance at 0.5 W/m2 over 2000–10 and 0.6 W/m2 over 2005–10 respectively. I take the mean of these two figures as applying throughout the 2003–12 period.
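As an aside for readers replicating these figures: the OHC datasets report heat content in joules, and converting a change over a period into an average heat-uptake rate in W/m2 of the Earth’s surface is a one-line calculation. A sketch, with a purely hypothetical ΔOHC value:

# Convert an OHC change (joules) to an average uptake rate (W/m2,
# per unit area of the whole Earth surface, as used in the text).
SECONDS_PER_YEAR = 3.156e7
EARTH_SURFACE_M2 = 5.10e14

def uptake_rate(delta_ohc_joules, years):
    return delta_ohc_joules / (years * SECONDS_PER_YEAR * EARTH_SURFACE_M2)

print(uptake_rate(8e22, 10))   # hypothetical 8e22 J rise over a decade: ~0.50 W/m2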

I use the same adjusted CMIP5-RCP4.5 forcings dataset as used in the Otto et al. (2013) study, updated from 2000–09 to 2003–12, to achieve consistency with that study (data kindly supplied by Piers Forster). Likewise, the uncertainty estimates I use are derived on the same basis as those in Otto et al. (2013).
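Those uncertainty ranges come from propagating the errors in ΔT, ΔF and ΔQ through equations (1) and (2). A simplified Monte Carlo sketch of that propagation, assuming independent Gaussian uncertainties with standard deviations I have made up for illustration (the actual error treatment in Otto et al. (2013) is more involved):

import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
F_2x = 3.44

# Illustrative means and standard deviations, not the studies' actual inputs.
dT = rng.normal(0.76, 0.10, N)   # K
dF = rng.normal(2.0, 0.30, N)    # W/m2
dQ = rng.normal(0.5, 0.15, N)    # W/m2

ecs = F_2x * dT / (dF - dQ)
tcr = F_2x * dT / dF

for name, x in (("ECS", ecs), ("TCR", tcr)):
    lo, med, hi = np.percentile(x, [5, 50, 95])
    print(f"{name}: {med:.2f} C (5-95%: {lo:.2f}-{hi:.2f} C)")

Note the skew: because ΔF − ΔQ appears in the denominator, the ECS distribution has a fat upper tail, which is why the 95% bounds move around so much between datasets.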

I am also retaining the 1860–79 base reference period used in Otto et al. (2013). That study followed my December study in deducting 50% of the 0.16 W/m2 estimate of ocean heat uptake (OHU) in the second half of the nineteenth century per Gregory et al. (2002), the best-known of the earlier energy budget studies. The 0.16 W/m2 estimate – half natural, half anthropogenic – seemed reasonable to me, given the low volcanic activity between 1820 and 1880. However, I deducted only 50% of it to compensate for my Levitus 2012-derived estimate of 0–2000-m ocean heat uptake being somewhat lower than that per some other estimates. Although the main reason for making the 50% reduction in the Gregory (2002) OHU estimate for 1861–1900 disappears when considering 0–700-m ocean heat uptake datasets with significantly higher trends than per Levitus 2012, in the present calculations I nevertheless apply the 50% reduction in all cases.

Table 1, below, shows comparisons of ECS and TCR estimates using data for the periods 2000–09 (Otto et al., 2013), 2002–11 (Lewis, 2012 – my December study) and 2003–12 (this study) using the relevant forcings and 0–700 m OHC datasets.


Table 1: ECS and TCR estimates based on last decade and 0.08 W/m2 ocean heat uptake in 1860–79.

Whichever period and forcings dataset are used, the best estimate of TCR remains 1.3°C. The 5–95% uncertainty range narrows marginally, to 0.9–1.95°C, when using changes to 2003–12 (which give slightly higher forcing increases) rather than to 2000–09 or 2002–11. The ‘likely’ range (17–83%) is 1.05–1.65°C. (These figures are all rounded to the nearest 0.05°C.) The TCR estimate is unaffected by the choice of OHC dataset.

The ECS estimates using data for 2003–12 reveal the significant effect of using different heat uptake estimates. Lower system heat uptake estimates and the higher forcing estimates resulting from the 3-year roll-forward of the period used both contribute to the ECS estimates being lower than the Otto et al. (2013) ECS estimate, the first factor being the more important.

Although stating that estimates based on 2000–09 are arguably most reliable, Otto et al. (2013) also gives estimates based on changes to 1970–79, 1980–89, 1990–99 and 1970–2009. Forcings during the first two of those periods are too low to provide reasonably well-constrained estimates of ECS or TCR, and estimates based on 1990–99 may be unreliable since this period was affected both by the eruption of Mount Pinatubo and by the exceptionally large 1997–98 El Niño. However, the 1970–2009 period, although having a considerably lower mean forcing than 2000–09 and being more impacted by volcanic activity, should – being much longer – be less affected by internal variability than any single decade. I have therefore repeated the exercise carried out in relation to the final decade, in order to obtain estimates based on the long period 1973–2012.

Table 2, below, shows comparisons of ECS and TCR estimates using data for the periods 1970–2009 (Otto et al., 2013) and 1973–2012 (this study) using the relevant forcings and 0–700-m OHC datasets. The estimates of system heat uptake from two of the sources used for 2003–12 do not cover the longer period. I have replaced them with an estimate based on data, here, updated from Ishii and Kimoto (2009). Using 2003–12 data, the Ishii and Kimoto dataset gives an almost identical ECS best estimate and uncertainty range to the Lyman 2010 dataset, so no separate estimate for it is shown for that period. Accordingly, only three ECS estimates are given for 1973–2012. Again, the TCR estimates are unaffected by the choice of system heat uptake estimate.


Table 2: ECS and TCR estimates based on last four decades and 0.08 W/m2 ocean heat uptake in 1860–79.

The first thing to note is that the TCR best estimate is almost unchanged from that per Otto et al. (2013): just marginally lower, at 1.35°C. That is very close to the TCR best estimate based on data for 2003–12. The 5–95% uncertainty range for TCR is slightly narrower when using data for 1973–2012 rather than 1970–2009, due to the higher mean forcing.

Table 2 shows that ECS estimates over this longer period vary considerably less between the different OHC datasets (two of which do not cover this period) than do estimates using data for 2003–12. As in Table 1, all the 1973–2012 based ECS estimates come in below the Otto et al. (2013) one, both as to best estimate and 95% bound. Giving all three estimates equal weight, a best estimate for ECS of 1.75°C looks reasonable, which compares to 1.9°C per Otto et al. (2013). On a judgemental basis, a 5–95% uncertainty range of 0.9–4.0°C looks sufficiently wide, and represents a reduction of 1.0°C in the 95% bound from that per Otto et al. (2013).

If one applied a similar approach to the four, arguably more reliable, ECS estimates from the 2003–12 data, the overall best estimate would come out at 1.65°C, considerably below the 2.0°C per Otto et al. (2013). The 5–95% uncertainty range calculated from the unweighted average of the PDFs for the four estimates is 1.0–3.1°C, and the 17–83%, ‘likely’, range is 1.3–2.3°C. The corresponding ranges for the Otto et al. (2013) study are 1.2–3.9°C and 1.5–2.8°C. The important 95% bound on ECS is therefore reduced by nearly 1°C.
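Mechanically, the ‘unweighted average of the PDFs’ combination amounts to evaluating each estimate’s PDF on a common ECS grid, averaging, and reading percentiles off the averaged distribution. A sketch with invented stand-in lognormal PDFs (energy-budget ECS distributions are right-skewed), purely to show the mechanics:

import numpy as np
from scipy import stats

grid = np.linspace(0.1, 10.0, 5000)   # common ECS grid, deg C
dx = grid[1] - grid[0]

# Four stand-in PDFs; the shape/scale parameters are invented.
pdfs = [stats.lognorm(s, scale=m).pdf(grid)
        for s, m in [(0.30, 1.60), (0.32, 1.55), (0.35, 1.70), (0.33, 1.75)]]

avg = np.mean(pdfs, axis=0)
avg /= avg.sum() * dx                 # renormalise the averaged PDF
cdf = np.cumsum(avg) * dx             # approximate CDF

for p in (0.05, 0.17, 0.50, 0.83, 0.95):
    print(f"{p:.0%}: {np.interp(p, cdf, grid):.2f} C")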

References

Church, J. A. et al. (2011): Revisiting the Earth’s sea-level and energy budgets from 1961 to 2008. Geophysical Research Letters, 38, L18601, doi:10.1029/2011GL048794.

Domingues, C. M. et al. (2008): Improved estimates of upper-ocean warming and multi-decadal sea-level rise. Nature, 453, 1090–1093, doi:10.1038/nature07080.

Forster, P. M., T. Andrews, P. Good, J. M. Gregory, L. S. Jackson, and M. Zelinka (2013): Evaluating adjusted forcing and model spread for historical and future scenarios in the CMIP5 generation of climate models. J. Geophys. Res. Atmos., 118, doi:10.1002/jgrd.50174.

Gregory, J. M. et al. (2002): An observationally based estimate of the climate sensitivity. Journal of Climate, 15, 3117–3121.

Ishii, M. and M. Kimoto (2009): Reevaluation of historical ocean heat content variations with time-varying XBT and MBT depth bias corrections. J. Oceanogr., 65, 287–299.

Levitus, S. et al. (2012): World ocean heat content and thermosteric sea level change (0–2000 m), 1955–2010. Geophysical Research Letters, 39, L10603, doi:10.1029/2012GL051106.

Loeb, N. G. et al. (2012): Observed changes in top-of-the-atmosphere radiation and upper-ocean heating consistent within uncertainty. Nature Geoscience, 5, 110–113.

Lyman, J. M. et al. (2010): Robust warming of the global upper ocean. Nature, 465, 334–337. http://www.nature.com/nature/journal/v465/n7296/full/nature09043.html

Meinshausen, M., S. Smith et al. (2011): The RCP greenhouse gas concentrations and their extension from 1765 to 2500. Climatic Change, 109, 213–241.

Otto, A. et al. (2013): Energy budget constraints on climate response. Nature Geoscience, doi:10.1038/ngeo1836.

Stephens, G. L. et al. (2012): An update on Earth’s energy balance in light of the latest global observations. Nature Geoscience, 5, 691–696.


[i] Total forcing after adjusting the aerosol forcing to match observational estimates is not far short of total long-lived greenhouse gas (GHG) forcing. Therefore, differing estimates of GHG forcing – assuming that they differ broadly proportionately between the main GHGs – change both the numerator and denominator in Equation (1) by roughly the same proportion. Accordingly, differing GHG forcing estimates do not matter very much when estimating TCR, provided that the corresponding F is used to calculate the ECS and TCR estimates, as was the case for both my December study and Otto et al. (2013). ECS estimates will be more sensitive than TCR estimates to differences in F values, since the unvarying deduction for heat uptake means that the (ΔF − ΔQ) factor in equation (1) will be affected proportionately more than the F factor. All other things being equal, the lower CMIP5 F value will lead to ECS estimates based on CMIP5 multimodel mean forcings being nearly 5% higher than those based on RCP4.5 forcings.
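The approximate size of that last effect is easy to check. If ΔF scales in proportion to F while the heat-uptake term ΔQ stays fixed, the ratio of the two ECS estimates is k(ΔF − ΔQ)/(kΔF − ΔQ), where k = 3.44/3.71 ≈ 0.93. With illustrative values of ΔF and ΔQ (the exact percentage depends on the ratio ΔQ/ΔF):

# ECS sensitivity to the F basis, assuming dF scales in proportion to F_2x.
k = 3.44 / 3.71          # CMIP5 F relative to RCP4.5 F
dF, dQ = 2.0, 0.7        # W/m2, illustrative values only
print(k * (dF - dQ) / (k * dF - dQ))   # ~1.04: ECS a few per cent higher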

Comments
Jim Cripwell
May 24, 2013 6:43 am

I am sorry, but all these estimates of climate sensitivity are like discussing how many angels can dance on the head of a pin. People have been discussing estimates of climate sensitivity for something like 40 years. In that time, so far as I can make out, little, if any, progress has been made. Until we know the magnitudes and time constants of all naturally occurring events that cause a change in global temperatures, so that we might have a hope of actually measuring what the numeric value is, all these studies are just a waste of time and money. All people are actually doing is just taking another guess. My best guess is that the climate sensitivity of CO2 is indistinguishable from zero.

bobl
May 24, 2013 6:47 am

I am struggling with a not so related issue that came to me just yesterday. The theory has it that N2 and O2 lack vibrational modes in the infrared, making them incapable of reradiating heat. To me this implies that all IR radiation to space from the atmosphere must be from a greenhouse gas? So if the concentration of greenhouse gases increases then the number of photons released to space must necessarily increase, given that the non-radiating gases transfer their energy by collisions.
Surely this has to increase losses to space overall.
What am I missing?

Stephen Wilde
May 24, 2013 6:59 am

“To me this implies that all IR radiation to space from the atmosphere must be from a greenhouse gas? So if the concentration of greenhouse gases increases then the number of photons released to space must necessarily increase”
Quite.
GHGs provide an additional radiative window to space that is not provided by a non GHG atmosphere.
Still doesn’t necessarily result in any net thermal effect though once negative system responses are taken into account.
I think that whether the net effect of GHGs is potential warming or potential cooling the air circulation adjusts to negate it.
So an effect of zero or near zero overall but with a miniscule shift in air circulation.

May 24, 2013 7:12 am

Your updated result for ECS is the most honest, because it is based on the latest OHC data (Levitus 2012) and the latest temperature data (1973–2012). The result shows that ECS = 1.7 +- 1.0/0.4 degrees C. The UN goal of limiting climate change to 2 degrees C has apparently been met – congratulations!
Can we now drop the ‘C” from CAGW ?

bobl
May 24, 2013 7:16 am

Jim, I have been making this point for a while. The climate feedbacks are not Scalar, they are complex, they each have a time dimension, a lag, and they are all different, ranging between milliseconds and decades. Feedbacks cannot be added without accounting for the time (phase component). The lack of accounting for time means that transient sensitivity can vary wildly from moment to moment depending on the speed and direction of all the feedback effects on multiple timescales.
Achieving a net gain of 3 in the climate therefore requires a completely implausible loop gain of about 0.95.
In support of your point, sensitivity can only be evaluated by modelling each and every feedback effect, including the lags and amplitudes of each effect. In many cases the feedback amplitudes or phases are dependent on the system itself (consider tropical storm non-linear behaviour)! Sensitivity cannot be a simple number, it is a chaotically varying complex number in both time and space, it is to all intents and purposes unknowable.
Climate science attempts to model this as a simple scalar average, without even knowing if the combination of all the feedbacks represents a stationary function; that is, they don’t even know if the mean of the sensitivity is a constant.

John Peter
May 24, 2013 7:23 am

Jim Cripwell may well be right in stating “I am sorry, but all these estimates of climate sensitivity are like discussing how many angels can dance on the head of a pin.” but such studies done studiously are still important and should be welcomed as the effect may be (as an intermediate stage) to reduce the “consensus” estimate of climate sensitivity from the IPCC median of 3C. A generally accepted ECS of 1.65-1.75C is much to be preferred to 3C and could have enormous consequences for policy decisions. It would mean that a doubling of CO2 would not mean a 2C (or higher) increase in global temperatures and would minimise the concept of the impending “tipping point”. We are moving slowly towards the Lindzen & Co view.

bobl
May 24, 2013 7:23 am

Stephen, no, it must have an effect, however minuscule; some change needs to drive the air current change. You can have a negative feedback, but there must be a net change to drive the effects.
Nevertheless, more photons to space surely implies cooling rather than warming

Patrick
May 24, 2013 7:32 am

Human driven climate change alarmists are just a bunch of aerosols imo.

Greg Goodman
May 24, 2013 7:43 am

All this work on narrowing the range of confidence values is impressive and valuable. That some major IPCC figures seem to be coming along with the process is very encouraging.
However, the whole idea of a simple linear model (and indeed of the much more complex GCMs) seems founded on the idea that the climate system has a linear response to radiative forcing.
The two major volcanic events of the late 20th c. give one of the few discernible features other than the long slow (accelerating) rise.
However, I find it very hard to find evidence in the climate data of the strong negative forcing of these events.
I saw nothing obvious in TLT nor in tropical SST but I was assured it was visible in land records. So I had a detailed look at CRUTEM4 for northern hemisphere.
http://climategrog.wordpress.com/?attachment_id=270
Now maybe I’m just not looking in the right place but there seems to be a problem here. There is no cooling effect to be seen. In fact good indications of a short term warming. There is no indication of the marked, permanent negative offset that a linear response would produce to such a negative forcing.
Now if the response to volcanic forcing is not materialising in the climate record, then the linear model is fundamentally inadequate and hence current GCMs as well.
If I am overlooking something obvious, looking at the wrong dataset or misinterpreting what to expect, hopefully Nic or someone can point out where.
thanks.

Patrick
May 24, 2013 7:55 am

“Greg Goodman says:
May 24, 2013 at 7:43 am”
The simple, and correct answer is, no-one actually knows. Once “we” accept that, we can move on!

Jim Cripwell
May 24, 2013 7:58 am

John Peter, you write “but such studies done studiously are still important”
To a limited extent I agree. My point is that with our current knowledge of the physics of our atmosphere, no-one has the slightest idea of what happens to global temperatures as we add more CO2 to the atmosphere from current levels. Just about the only things we know about the climate sensitivity of CO2 are that it is probably positive, and that it has a maximum value. If these studies were framed in terms of estimating the MAXIMUM value of climate sensitivity, I would not object. But I do object to claims that these estimates are in some sort of way associated with what the real number is.

Phil.
May 24, 2013 8:04 am

bobl says:
May 24, 2013 at 6:47 am
I am struggling with a not so related issue that came to me just yesterday. The theory has it that N2 an O2 lacks vibrational modes in the infrared making it incapable of reradiating heat. To me this implies that all IR radiation to space from the atmosphere must be from a greenhouse gas?

Correct.
So if the concentration of greenhouse gasses increases then the number of photons released to space must necessarilly increase, given that the non radiating gasses transfer their energy by collisions.
No, because the atmosphere is optically thick at the GHG wavelengths, i.e. lower in the atmosphere it absorbs more than it emits. Emission to space only occurs above a certain height and therefore at a certain temperature, as the concentration increases then that height increases and the temperature decreases and hence emission to space goes down.

Bill Illis
May 24, 2013 8:10 am

The NODC has updated the Ocean Heat Content numbers for the first quarter of 2013.
Big jump in the OHC numbers in the first quarter of 2013 (and some restating of the older numbers again).
0-2000 metre uptake equates to 0.49 W/m2 in the Argo era.
http://s13.postimg.org/u6al0f6xj/OHC_700_and_2000_M_Q1_2013.png
http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/basin_data.html
Equivalent to an average temperature increase of 0.073C in the 0-2000 metre ocean since 1977, 0.135C in the 0-700 metre ocean and 0.222C in the 0-100 metre ocean.
http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/basin_avt_data.html

Henry Clark
May 24, 2013 8:13 am

The climate sensitivity estimate in this article is better than highly overstated ones. Still, though:
1) Properly accounting for GCRs+TSI in solar-related change makes such contribute several times more to past warming than solar irradiance change alone, even aside from an ACRIM versus PMOD model matter on solar irradiance history. Almost whenever cosmic rays are not explicitly mentioned, usually one can assume someone is implicitly ignoring them entirely and treating them as zero effect, which is highly inaccurate. As Dr. Shaviv has noted:
“Using historic variations in climate and the cosmic ray flux, one can actually quantify empirically the relation between cosmic ray flux variations and global temperature change, and estimate the solar contribution to the 20th century warming. This contribution comes out to be 0.5 +/- 0.2 C out of the observed 0.6 +/- 0.2 C global warming (Shaviv, 2005).”*
That leaves roughly on the order of 0.1 degrees Celsius over the past century for net warming from anthropogenic effects / independent components of the longest types of ocean cycles (with likely a large portion of the apparent 60-year ocean cycle being rather sun & GCR generated as looking at appropriate plots suggests) / etc.
Especially considering logarithmic scaling and diminishing returns, human emissions over this century are not likely to contribute more than tenths of a degree warming if even that, even aside from how a near-future solar Grand Minimum starting another LIA by the mid 21st century looks likely. (A mixture of both cooling and warming effects, influence on water vapor, and other complexities apply).
* General discussion:
http://www.sciencebits.com/CO2orSolar
Related paper:
http://www.phys.huji.ac.il/~shaviv/articles/sensitivity.pdf
Some illustrations I made a while back:
http://s7.postimg.org/69qd0llcr/%20intermediate.gif
NOAA humidity data for decades past got drastically changed already since I started posting the above several months ago, but still it provides a number of illustrations.
2a) Considering how many problems there have been with activist-reported (Hansen, CRU, etc.) surface temperature measurements, despite such being relatively more readily independently verified than 0–700 m ocean heat content, OHC has uncertainties, to say the least. The latter involves mere hundredths of a degree change anyway (there is quite a reason that ocean temperature change over hundreds of meters of depth tends to be reported in joules rather than degrees Celsius or kelvin).
2b) Then there are questions on aerosol data…

thingodonta
May 24, 2013 8:14 am

One thing we do know is that the human response to climate sensitivity is very high. The positive feedbacks are much stronger than what is going on in the atmosphere.

May 24, 2013 8:38 am

“I am sorry, but all these estimates of climate sensitivity are like discussing how many angels can dance on the head of a pin. People have been discussing estimates of climate sensitivity for something like 40 years. In that time, so far as I can make out, little, if any, progress has been made. ”
This is factually wrong. The first estimates of sensitivity were made over 100 years ago.
Since then the estimate has followed a downward trajectory; from the first report to the fourth the central value has crept downward. Nic’s work adds to that body of knowledge.
Let me put the importance of this metric into perspective: every degree of C in uncertainty is worth about 1 trillion dollars a year if you are planning to mitigate.
Jim. I suggest you read some of the history of climate science and read some actual papers and work with some actual data.

May 24, 2013 8:47 am

Nic,
Nice work. I think it might be instructive for WUWT readers to understand how Anthony’s claims about microsite bias would play into your calculations. For example, if one assumed that the land warming was biased by .1C per decade from 1979-current, what would that do to the sensitivity calculation? The purpose of course is to show people how they can locate and communicate their doubts WITHIN the framework and language of their opponents.
As you have shown it is much more effective to question the science from the inside rather
than attack the character and motivations of people from the outside. You’ve shown that there IS A DEBATE and you’ve shown people how to join that debate. You’ve shown that the consensus is broader and more uncertain than people think, not by questioning the existence of the consensus but by working with others to demonstrate that some of the core beliefs ( how much will it warm) admit many answers.

Rud Istvan
May 24, 2013 8:51 am

Nic Lewis, thanks for this post. WE posted a different way to derive basically the same answer. Good to see so many data sets and methods converging on something just over half of the AR4 number. It will be very interesting, and important, to see where AR5 comes out given the Otto co-authors. Either the C gets removed from CAGW, or the process is plainly shown to be utterly corrupted.

Stephen Wilde
May 24, 2013 8:54 am

“bobl says:
May 24, 2013 at 7:23 am
Stephen, no, must have an effect however miniscule, some change needs to drive the air current change, you can have a negative feedback, but there must be a net change to drive the effects. ”
The effect would be a change in atmospheric heights and the slope of the lapse rate, which is then compensated for by circulation changes.
Assuming there is a net thermal effect from GHGs in the first place. Some say net warming, others say net cooling.
Doesn’t matter either way. The system negates it by altering the thermal structure and circulation of the atmosphere.
I can’t actually prove that with current data so will just have to wait and see but it seems clear to me from current and past real world observations of climate behaviour.

HR
May 24, 2013 9:08 am

Jim Cripwell
In order to understand science you need a healthy dose of caution. The limits of our data and understanding mean we must pepper our conclusions with appropriate caveats and/or uncertainty ranges. You seem to completely misunderstand this and instead favour the idea of perfection or nothing. The unfortunate truth is that most of the time science is about being less wrong rather than about being right. You need to moderate your skepticism appropriately.

Gary Pearse
May 24, 2013 9:40 am

Official ECS estimates since climategate seem to be roughly following a decay function:
N(t) = N(0)·e^(−λt)
N(0) is the initial ECS ~ 3.0 and N(present) ~ 2 after t (to present) = 10 yrs,
which makes λ = ln(3/2)/10 ≈ 0.04.
To get to an official ECS ~ 1 will take ln(1/3) = −0.04t, i.e. t ≈ 27 yrs.
Hmm, if time elapsed since consensus ECS ~ 3 has been just 5 years, then we would have 13 years to wait for consensus ECS ~ 1. This assumes stalwart resistance and represents an outer limit. Lambda is probably not a constant here – I would go for half the 13 years, ~6 years.
My take: ECS finally turns out to be vanishingly small (i.e. there is a governor on climate responses à la Willis Eschenbach), then TCR is larger than ECS and within a few years it declines to the minor ECS figure and natural variability is basically all that is left. How’s that for a model!

bw
May 24, 2013 9:41 am

Mechanisms controlling atmospheric CO2 follow both geological and biological processes. Each pathway operates with different time constants and amplitudes over any time scale. The atmosphere has evolved in composition due to biology. Physicists can’t understand how the atmosphere behaves because they don’t include biology. Except for Argon, the atmosphere is completely biological in origin. Biology also alters surface albedo.
All the evidence points to those supporting the “essentially zero” climate sensitivity on a planetary scale. The satellite data support zero temperature increase since 1980. Quality surface thermometers also show zero warming, e.g. the Antarctic science stations Amundsen-Scott, Vostok, Halley and Davis.
CO2 follows biology, biology follows temperature.

richcar1225
May 24, 2013 9:53 am

I have trouble reconciling the reality of surface radiation measurements with climate sensitivity calculations based on TOA calculations. BSRN measurements indicate that since 1992 short wave radiation has increased by 3 W/m2 per decade, likely due to global brightening (fewer clouds), while long wave radiation (including GHG back radiation) has increased by 2 W/m2 per decade.
Considering that SW (visible light) is much more easily absorbed by the oceans than thermal long wave radiation, it would seem that the 0.4 to 0.6 W/m2 of ocean flux could be attributed mostly to the short wave contribution, or simply to changes in cloud cover. AGW proponents will claim that the global brightening is a positive feedback, of course. How much of the 2 W/m2 per decade increase in long wave surface radiation is due to the ocean releasing heat versus GHG back radiation?

Greg Goodman
May 24, 2013 9:59 am

bw says: All evidence points to those supporting the “essentially zero” climate sensitivity on a planetary scale.
Don’t like terms like “all evidence”, but here is some evidence.
http://climategrog.wordpress.com/?attachment_id=271
http://climategrog.wordpress.com/?attachment_id=270
However, I do agree with Mosh’s last comment; Nic is taking a very wise approach, doing the difficult task of injecting some reason into the thinking in small, digestible pieces. Congratulations on finding the right balance between being honest and being effective 😉

Jim Cripwell
May 24, 2013 10:28 am

HR, you write “you need to moderate your skepticism appropriately.”
I have absolutely no intention whatsoever of moderating my skepticism. There is no empirical data whatsoever to support the hypothesis of CAGW, and until we get such empirical data, I will continue to believe that CAGW is a hoax. The warmists have been conducting pseudo-science for years, trying to pretend that the estimates they have made of climate sensitivity have a meaning in physics. IMHO, as I have noted, I think these estimates are completely worthless.

May 24, 2013 10:52 am

Measurements since before 1900 demonstrate that sensitivity is between zero and insignificant.
Natural Climate change has been hiding in plain sight
http://climatechange90.blogspot.com/2013/05/natural-climate-change-has-been.html

george e. smith
May 24, 2013 10:56 am

“””””…..bobl says:
May 24, 2013 at 6:47 am
I am struggling with a not so related issue that came to me just yesterday. The theory has it that N2 an O2 lacks vibrational modes in the infrared making it incapable of reradiating heat. To me this implies that all IR radiation to space from the atmosphere must be from a greenhouse gas? So if the concentration of greenhouse gasses increases then the number of photons released to space must necessarilly increase, given that the non radiating gasses transfer their energy by collisions.
Surely this has to increase losses to space overall.
What am I missing?………””””””””””””
Bob, what it is that you are missing is an understanding of the fundamental difference between atomic or molecular line/band spectra emission/absorption radiation, which is entirely a consequence of atomic and molecular structure of SPECIFIC materials; and THERMAL RADIATION which is a continuum spectrum of EM radiation, that is NOT material specific, and depends (spectrally) ONLY on the Temperature of the material. Of course, the level of such emission or absorption depends on the density of the material (atoms/molecules per m^3).
Spectroscopists have known since pre-Cambrian times, that the sun emits a broad spectrum of continuum thermal radiation, on top of which, it was discovered by Fraunhofer and others, there is a whole flock of narrow atomic or molecular line spectra at very specific frequencies, that are characteristic of specific elements or charged ions, in the sun.
So-called “Black Body Radiation ” is an example of a thermal continuum spectrum.
I deliberately said “so-called”, because nobody ever observed black body radiation, since the laws of Physics prohibit the existence of any such object.
Well some folks think a black hole might be a black body.
By definition, a black body absorbs 100% of electromagnetic radiation of ANY frequency or wavelength down to, but not including zero; and up to, but not including infinity.
Yet no physical object (sans a black hole) is able to absorb 100% of even ONE single frequency, or wavelength; let alone All frequencies or wavelengths. To do that, the body would have to have a surface refractive index of exactly 1.0, the same as the refractive index of empty space. That would require that the velocity of light in the material be exactly (c).
Now (c) = 1/sqrt(munought x epsilonnought) ; the permeability, and permittivity of free space.
munought = 4pi E-7 Volt seconds per Amp metre.
epsilonnought = 8.85418781762 E-12 Amp seconds per Volt metre.
Both of these, and (c) = 2.99792458 E+8, are exact values, the only such fundamental physical constants that are exact.
So a material with a product of permeability and permittivity = 1 / c^2 would have a velocity of EM radiation also equal to (c). But that is not sufficient.
Free space vacuum, also has a characteristic impedance = sqrt( munought / epsilonnought) which is approximately 120 pi Ohms, or 377 Ohms.
And when a wave travelling in a medium of 377 Ohms, such as free space, encounters a medium of different impedance, there is a partially transmitted wave, and a partially reflected wave; so no total absorption.
So any real physical medium, must have a permeability of munought, and a permittivity of epsilon nought, at all frequencies and wavelengths, in order to qualify as a black body. It would be indistinguishable from the vacuum of free space.
The point of all this, is that real bodies only approximate what a black body might do, and only do so over narrow ranges of frequency or wavelength, depending on their Temperature.
And in the case of gases like atmospheric nitrogen and oxygen, the molecular density is extremely low, so the EM absorption doesn’t come anywhere near 100%, even for huge thicknesses of atmosphere. But the absorption per molecule is not zero, as some people assume, so even non-IR-active, non-GHG gases do absorb and emit a continuum spectrum of thermal radiation based on the gas Temperature.
Experimental practical near black bodies, operate as anechoic cavities, where radiation can enter a small aperture, and then gets bounced around in the cavity and never escapes. Some derivations of the Planck radiation law are based on such cavity radiation.
In the case of a “black body cavity”, the required conditions are that the walls be perfectly reflecting of ALL EM radiation, and also must have zero thermal conductivity so that heat energy cannot leak out through the walls.
Once again, such conditions are a myth, and no ideal black body cavity can exist either.
So we have the weird circumstance, that Blackbody radiation has never been observed by anybody, and simply cannot exist, yet all kinds of effort went into theoretical models of a non-existing non-phenomenon, and gave us one of the crown jewels of modern physics; the Planck radiation formula.

May 24, 2013 12:09 pm

Has anyone looked at/challenged this?
http://www.naturalnews.com/040448_solar_radiation_global_warming_debunked.html
Climate sensitivity may be irrelevant or wrong

John Peter
May 24, 2013 12:46 pm

Steven Mosher says:
May 24, 2013 at 8:47 am
“You’ve shown that the consensus is broader and more uncertain than people think, not by questioning the existence of the consensus but by working with others to demonstrate that some of the core beliefs ( how much will it warm) admit many answers.”
So it is not a consensus after all. Good to see that the 3C consensus is breaking up. We will all benefit from that (other than the rent seekers).
I also applaud the fact that Steven Mosher has transformed into something less cryptic than usual. Long may it continue as he often has something valuable to add when the notion takes him.

Ian H
May 24, 2013 2:03 pm

Clearly estimates of climate sensitivity have had to fall because models based on higher numbers have tracked so poorly they have reached the point of falsification. The greatest pressure is on the TCR value since sufficient time has now passed without significant warming to rule out a high value for this number. The ECS on the other hand makes predictions that cannot be fully falsified for hundreds of years so I expect we’ll see people continuing to defend high numbers here for some time. I expect estimates of TCR and ECS will continue to fall if we see cooling over the next decade. These numbers in any case are still based on a simple forcing model with feedback which I don’t think is at all realistic.
I expect the immediate response of the most alarmed will be to start talking up the ECS and downplaying the TCR. However these ECS values are not really alarming. Over the longer term we are staring down the barrel of the next ice age. I find it reassuring to think that our influence on the planet might allow us to dodge this calamity. In fact I am more concerned that ECS might not be big enough to allow this to happen.
The problem is that ECS is bigger than TCR because of long term feedbacks to warming that depend on slow processes like the melting of ice sheets or warming of the deep oceans. But in the context of a planet that should be heading into an ice age the effect of added CO2 may not be to warm but merely to offset the expected natural cooling. If the greenhouse effect is not actually warming the planet but simply staving off the descent into the next ice age then none of these feedback effects will come into play.

Nic Lewis
May 24, 2013 2:11 pm

Steven Mosher wrote:
“I think it might be instructive for WUWT readers to understand how Anthony’s claims about microsite bias would play into your calculations. For example, if one assumed that the land warming was biased by .1C per decade from 1979-current, what would that do to the sensitivity calculation?”
Good point, Steve. That assumption would reduce the increase in global temperature between the 1860-79 mean and the 2003-12 mean from 0.76 C to about 0.68 C. All the climate sensitivity estimates, and their uncertainty ranges, would then reduce by about 11%. So a sensitivity of 1.7 C would change to just over 1.5 C, for example.
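For anyone wanting to reproduce that scaling: both equations are linear in ΔT, so the estimates shrink by the same factor as the temperature change.

factor = 0.68 / 0.76    # ~0.895, i.e. an ~11% reduction
print(1.7 * factor)     # ~1.52 C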

Rob
May 24, 2013 2:39 pm

Clive Menzies, I too have seen that; then I ran across this, and it makes one wonder if we’re not overcomplicating this: http://www.crh.noaa.gov/dtx/march.php

X Anomaly
May 24, 2013 2:41 pm

What I would like to see is negative (below 1 deg C) ECS/TCR estimates, i.e., at what minimums over the next decades / century would it take for both estimates to get tickets to the Lindzen and Choi ball game (0.7 deg C)?

Anteaus
May 24, 2013 2:58 pm

“One thing we do know is that the human response to climate sensitivity is very high. The positive feedbacks are much stronger than what is going on in the atmosphere.”
That’s because of the renewables subsidy forcing, which will result in runaway inflation-level rise and economies going under if the propaganda levels exceed 400ppm.

Greg Goodman
May 24, 2013 3:22 pm

In the previous article, Willis questioned why the volcanic forcings were being spread back in time by a running mean filter. It was confirmed by Nic that this was the case but he stated that it was immaterial to the findings of Otto et al 2013. This is probably true.
Now that Nic has kindly linked to a source of the forcings used, I have plotted it up against UAH TLT and TLS and marked in the dates of the two major eruptions.
I chose the SH extra-tropical region since this shows no visible impact from El Chichon and allows us to see the background variation in temperatures that was happening at that time. (Note stratospheric temps tend to vary in the opposite sense so I have inverted and scaled to give a ‘second opinion’ on the background variations).
http://climategrog.wordpress.com/?attachment_id=273
Now we see that the effects of the back-spreading of the forcing data produce a totally false correlation with natural variations of temperature that preceded the eruption. This has nothing to do with forcing or the model and is entirely a result of improper processing. The distorted form of the forcing data just happens to correlate with the natural temperature background around the time of the event.
Incidentally, I remain even more convinced now of my initial assessment that this is a five year running mean, not a three year as suggested by Willis and confirmed by Nic. I would ask Nic to check his source of information because it seems pretty incontrovertible from this, that it is affecting two points either side not one, hence it is a 5 pt filter kernel.
So why was this done? There is no valid reason, and it has to be an intentional act; you can’t accidentally run a filter on one of your primary inputs.
Whoever had the idea to “smooth” the volcanic forcings, are they also introducing this practice elsewhere than Otto et al, where it may be falsely improving the ability of the hindcasts to reproduce key features of the temperature record?

May 24, 2013 3:31 pm

What I love about science are the necessary assumptions that are made in order to carry out a calculation, you know the kind of thing I mean….’let’s assume a value for such and such’ or, let’s invent a concept like a ‘Black Body’, which of course cannot exist but is nonetheless useful in carrying out this calculation; well here are a couple observations from ‘real life’ which in my opinion seem to render ‘sensitivity’ calculations almost completely irrelevant….
Let’s assume (see what I did there?) that the increase of CO2 concentration from 350 to 400 ppm does indeed capture sufficient energy to raise the overall temperature of the atmosphere by, say, 1 degree C. Let’s then assume that excess heat is eventually transported by ocean currents towards the polar regions. In the case of the Arctic Ocean in winter, sea ice cover is reduced, thereby allowing ‘larger volumes of warmer’ water to come into contact with the atmosphere at a time when there is no solar input (indeed conditions are ideal for heat loss to space).
Could it not then be argued that a slight heating of the atmosphere would cause, and be balanced by, a subsequent polar cooling effect?
Indeed, could it be further argued that Arctic Ocean heat loss could be a self-amplifying effect (a bit like the Warmist ‘feedbacks’), subsequently causing ‘runaway cooling’?

Richard M
May 24, 2013 4:16 pm

Phil. says:
May 24, 2013 at 8:04 am
No, because the atmosphere is optically thick at the GHG wavelengths, i.e. lower in the atmosphere it absorbs more than it emits. Emission to space only occurs above a certain height and therefore at a certain temperature, as the concentration increases then that height increases and the temperature decreases and hence emission to space goes down.

You are oversimplifying the situation.
First, the GHE is real and works off of radiation from the surface. Bobl wasn’t referring to this process.
Second, thermalization and radiation of atmospheric energy (not surface energy) is basic physics. This works in parallel to the GHE and this is what Bobl was asking about. Since the density of the atmosphere is reduced the higher you go, the average distance the radiation travels until re-absorption (or loss to space) is computable, let’s assume X meters upwards. It looks like any flow through a pipe. Now, if you add more CO2 you increase the probability of these events occurring which increases the flow of energy at all levels of the atmosphere towards space. Essentially you create a wider pipe. If climate models ignore this process it’s not surprising they get the wrong answer.

John Parsons
May 24, 2013 5:22 pm

Nic Lewis’s work (a significant contribution) and its implications need to be put into perspective. His work doesn’t seem to take into account the paleo record, nor should it necessarily do so. But the extremely short sample period needs to be recognized.
Additionally, from my reading of his results (as well as Dr. Otto’s, apparently), at most we may have a reprieve of ten or fifteen years before the same effects are upon us.
Not exactly a ‘Hallelujah’. JP

Janice Moore
May 24, 2013 5:25 pm

1) “… if one makes the assumption that the evolution of forcing over the long period involved approximates a 70-year ramp. This is reasonable [based on another assumption that] the net forcing has grown substantially faster from the mid-twentieth century on than previously.”
***
2) “… estimates based on changes to the decade 2000–09 are arguably the most reliable, since that decade has the strongest forcing… .” [assumes the forcing is of any significance at all]
***
3) “…forcing was stronger during the 2003–12 decade…” [assumes significant forcing causation]
***
4) “… Since that data only extends to the mean for 2008–10, I have extended it for two years at a conservative (high) rate of 0.33 W/m2… ”
***
From statements like those quoted above, this well-executed paper appears to be a careful attempt to both: 1) deprogram genuinely brainwashed AGW cult members by gingerly casting doubt upon their core beliefs; and 2) provide a face-saving way for AGW crooks who know better to back down from their lies.
It is not, nevertheless, robust, open, debate.
When a debate opponent has no evidence to back up their conjectures, when that opponent offers only assumptions and speculation, then, no matter how complicated their math, it adds up to no more than “I simply believe this.” There is nothing to debate. The above is only playing their imaginary game. It may get them to change their behavior slightly, but not significantly. It’s like going along with a person having a psychotic episode just enough to get them out of the middle of the road and onto the shoulder. “Yes, yes, my good fellow, those tiny green men most likely do want you to go with them, but, I know that they want you to walk on the shoulder, not down the centerline. There’s a good lad. Just keep to the right of (or left — in U.K.) of that solid white line there. Good luck!”
While it is shrewd not to try to tell them “TINY GREEN MEN DO NOT EXIST,” the above really isn’t a debate.
Conclusion: While scientific discussion is very important, the main goal is to save our economies, thus we must win over the voters. And that debate needs to be simply and powerfully stated. In terms such as:
“All people are actually doing is just taking another guess.” [Jim Cripwell]
“Climate science attempts to model this as a simple scalar average, without even knowing if the combination of all the feedbacks represents a stationary function. That is, they don’t even know if the mean of the sensitivity is a constant.” [bobl]
“Clearly estimates of climate sensitivity have had to fall because models based on higher numbers have tracked so poorly they have reached the point of falsification. ” [IanH]

GO, you wonderful WUWT SCHOLARS — argue with vigor! TRUTH IS ON YOUR SIDE.

Janice Moore
May 24, 2013 5:27 pm

AAAAAaaaack! Please forgive me. I messed up my first after the “]” in first paragraph. Sigh.

Janice Moore
May 24, 2013 5:28 pm

Oh, brother… “my first [end bold]…”

bobl
May 24, 2013 5:30 pm

Thank you Richard, that’s exactly what I was trying to say. I was thinking about how energy lost from the surface by convection is radiated to space, and whether CO2 partial pressure plays into the efficiency of that process.
1. CO2 molecule takes up energy through collision with a non-radiating gas
2. CO2 molecule emits a photon
It seems to me that increasing the CO2 concentration increases the probability of such an interaction, and therefore must increase the emission to space. Does this component, for example, form part of the increased IR emission in the CO2 emission bands seen in the satellite record?
This isn’t much more than a thought at the moment, but it seems to me that this is just a question of conservation, i.e. energy in vs energy out; anything that increases energy out must result in an overall cooling – granted, it could be stratified, cooling in the upper atmosphere only, but given the convection processes at play… Increasing the efficiency of radiation must increase the temperature difference, increasing the rate of convective and conductive heat transport to match.
This question has rocked my world so to speak. I can’t reconcile this with a warming effect, and to date I have been firmly of the opinion that CO2 warms. That’s still true if one only considers radiation; in that case radiation to space should decrease as GHGs rise, because the radiation never reaches from the surface to height. But likely not if convective heat is radiated to space by GHGs. In that case there is always plenty of energy drawn from the thermal energy of the surrounding N2 and O2 to feed into the pipe…
Thoughts on this are welcome

Janice Moore
May 24, 2013 5:50 pm

left should be right and right left above (I really need a vacation… !!!)

May 24, 2013 6:33 pm

Nic // Why not do a meta-analysis to collapse those wide C.I. values? The consistency between the various results suggests that the C.I. is too large.

Tsk Tsk
May 24, 2013 7:27 pm

bobl says:
May 24, 2013 at 5:30 pm
Thank you Richard, that’s exactly what I was trying to say. I was thinking about how energy lost from the surface by convection is radiated to space, and whether CO2 partial pressure plays into the efficiency of that process.
1. CO2 molecule takes up energy through collision with a non-radiating gas
2. CO2 molecule emits a photon
It seems to me that increasing the CO2 concentration increases the probability of such an interaction, and therefore must increase the emission to space.
———————————————————-
I think you’re also assuming that the radiation always has to be outwards (are you?). The reality is that the CO2 molecule has basically a 50/50 chance of radiating up and out or down and in. The net effect is to increase the transit time of the photon and increase the energy content of the atmosphere and the surface as a result. Of course this is happening at all levels of the atmosphere just to make it more complicated. Finally, it can be directly observed just by measuring the radiation from a dark sky at night.

ruvfsy
May 24, 2013 7:50 pm

So Mosh,
Where is the fine line between denialism and lukewarmerism?
1.2 per doubling of CO2?

ruvfs
May 24, 2013 7:57 pm

Nic:
Could you tell us something about the journey from your first interest, your first calculations, your first paper, to the collaboration towards this paper?
Would be interesting to hear.

Master_Of_Puppets
May 24, 2013 9:59 pm

‘Climate’ and ‘Climate Change’ are interpretations, in part based on the psychological state of the ‘observer’ at any particular time and therefore not physical in any way or form, i.e. fantasies or phantasms.
Fantasies and phantasms have no sensitivity, not even memory, they are only apparitions.

AlecM
May 24, 2013 10:27 pm

bobl 6.47 am: ‘Surely this has to increase losses to space overall.’
The fundamental problem with Climate Alchemy is that it starts from the premise that the ~15 µm CO2 IR band emitting at ~220 K to space controls IR energy flux to space, because if you double CO2, it reduces that band’s emitted flux by ~3 W/m^2.
However, at present CO2 level, that band is ~8% of OLR. 92% of the OLR comes from cloud level, the H2O bands and in the atmospheric window, near the surface temperature.
The premise has to shift to accepting that the Earth self regulates OLR equal to SW energy IN and the variations about the set point are oscillations as long time constant parts of the system adapt.
In other words, CO2-AGW is by definition zero on average.

atarsinc
May 25, 2013 12:32 am

AlecM says:
May 24, 2013 at 10:27 pm
“…the Earth self regulates OLR equal to SW energy IN…”
Care to describe that mechanism? Or are we supposed to just take your word for it?
JP

atarsinc
May 25, 2013 12:38 am

Master_Of_Puppets says:
May 24, 2013 at 9:59 pm
You may have mastered puppets, but you need to work on your physics. Next time your thermometer says it’s 104 degrees out, tell yourself that that sweat pouring down your face is just a “phantasm”. I’m sure you’ll feel a lot cooler. JP

bobl
May 25, 2013 1:15 am

Tsk Tsk, whether 50% is radiated down is irrelevant; 50% of more is still more

AlecM
May 25, 2013 1:26 am

atarsinc: 12.32 am: ‘Care to describe that mechanism? Or are we supposed to just take your word for it?’
Very simple. The OLR bite idea, whilst superficially appealing, can only apply if it is proved that no extra atmospheric heat from increased CO2 leaves directly to space via the atmospheric window.
The system has many ways of doing this, so we see the null point of the heat engine. The real GHE is the rise in surface temperature above the no-GHG state, ~11 K, and is completely separate from the lapse-rate difference between surface temperature and the tropopause.
No CO2-AGW means no feedback via H2O and no positive feedback.
In reality we see extreme negative feedback with the oscillations from long time constant parts of the system, e.g. ENSO.
Ice ages are the absence of phytoplankton biofeedback as Fe trace nutrient levels fall.

William Astley
May 25, 2013 2:14 am

http://www.drroyspencer.com/wp-content/uploads/UAH_LT_1979_thru_Apr_2013_v5.5.png
There appears to be a substantial difference between the UAH Version 5.5 global anomaly and the HadCRUT4 & GISS global anomalies.
It appears HadCRUT4 has been adjusted relative to HadCRUT3 to match GISS; the change looks to be a 0.1 C increase for recent temperatures.
I thought it odd that Wood for Trees does not include the UAH global anomaly data, which would enable an easy comparison of the different data sets.
Has the sensitivity analysis been done with the UAH Version 5.5 global anomaly data?
Comments:
1. There seems to be a concerted effort to manipulate the temperature data sets to reduce past temperatures and increase current ones. James Hansen’s actions (and what he states in his books/papers) and the Climategate emails appear to indicate that there are some people in senior positions who will support the adjustment of data and analysis to push an agenda.
2. It is surreal that those people who are pushing the agenda appear to have absolutely no understanding of the implications of what is required for a true, worldwide reduction in CO2 emissions (say a true reduction of 20% moving to 60%), not just spending money on green scams. The scams do not work. For example, EU carbon emissions have increased when the carbon content of goods manufactured abroad and imported into the EU is included. The scams do not even reduce CO2 emissions locally. For example, the conversion of food to biofuel has resulted in a net increase in CO2 emissions once all the energy inputs required to grow and convert the food are included in the calculations. That scheme is significantly worse than burning fossil fuel, as virgin forest is being cut down to grow food to convert to biofuel. The same frustration over temperature data manipulation applies to the cooked-book scam calculations. Western countries are spending money on green scams which only result in higher energy costs and job losses in Western countries. The scams do not significantly reduce CO2 emissions in Western countries. Regardless, world emissions are increasing and will continue to increase.
3. If there truly were an AGW crisis and world CO2 emissions were required to be significantly reduced, the only viable solution is a massive move to nuclear plus wartime-like restrictions for all countries: for example, draconian restrictions on private and business travel (airlines go bankrupt, no more tourism, forced living in apartments rather than separate housing, and so on), rationing of energy per person, and so on. There has been zero discussion of the implications of a true worldwide reduction in CO2 emissions.
4. The current plan is to spend money on scams, accept increased job losses in Western countries, and hope for a fairy with a magic wand. Facts are facts. Ignoring reality does not change reality.

Greg Goodman
May 25, 2013 3:16 am

What is important here is that Nic has established, AND got accepted, this improved method of extracting these parameters. The results, however, do depend upon the accuracy of OHC data for the difference between TCR and ECS.
The researcher in charge of the OHC record found a notable drop in 2006 and was about to present his results to a conference when he was persuaded, on the basis of a disparity with the TOA energy budget, to “correct” the error in OHC.
This resulted in a significant amount of XBT data being deemed incorrect, and the cooling was removed. He then had to change his conference address to explain that it was all a big mistake.
However, there was a significant change in length of day that suggests his original findings were correct (just not politically correct).
http://climategrog.wordpress.com/?attachment_id=274
Global temperature change causes large shifts in water from oceans to atmosphere, and the length-of-day record thus bears a strong resemblance to the temperature record.
At some point someone will need to re-examine the objectivity of removing inconvenient data and then reassess OHC. That can be the basis of next year’s incremental step towards a more accurate assessment of ECS.

Greg Goodman
May 25, 2013 3:31 am

http://earthobservatory.nasa.gov/Features/OceanCooling/
Perhaps Josh Willis’ original calculations before “correction” were a more accurate assessment of the OHC drop in 2006.

John Day
May 25, 2013 5:17 am

@bobl
> To me this implies that all IR radiation to space from the atmosphere must
> be from a greenhouse gas? …
> … What am I missing?
The ‘elephant in the room’ that you are missing is that most of the IR radiated to space from a planet is from the surface. The GHE does add significantly to that, mostly from water vapor; CO2 plays a very minor role, along with dust and other microscopic suspended matter.
Look at Mars, where the absolute amount of CO2 per unit surface area is about 30 times greater than on Earth, and there is virtually no water vapor. So Mars’ temperature is barely above its black-body temp (~210 K)
http://nssdc.gsfc.nasa.gov/planetary/factsheet/marsfact.html
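
John Day’s “about 30 times” is easy to check from surface pressure and gravity alone. A back-of-envelope sketch; the pressures, mole fractions and molar masses are standard textbook values, not taken from his link:

def co2_column(p_surf, g, x_co2, m_co2, m_air):
    # Column mass of CO2 in kg/m^2: (surface pressure / gravity) * CO2 mass fraction
    return p_surf / g * (x_co2 * m_co2 / m_air)

earth = co2_column(101325.0, 9.81, 400e-6, 44.0, 28.97)  # ~400 ppm CO2
mars  = co2_column(610.0, 3.71, 0.95, 44.0, 43.34)       # ~95% CO2 atmosphere
print(f"Earth: {earth:.1f} kg/m^2, Mars: {mars:.0f} kg/m^2, ratio ~{mars/earth:.0f}x")

This gives about 6 kg of CO2 over each square metre of Earth and about 160 kg/m^2 on Mars, a ratio near 25, consistent with his “about 30 times”.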

Richard M
May 25, 2013 5:37 am

Tsk Tsk says:
May 24, 2013 at 7:27 pm
I think you’re also assuming that the radiation always has to be outwards (are you?). The reality is that the CO2 molecule has basically a 50/50 chance of radiating up and out or down and in. The net effect is to increase the transit time of the photon and increase the energy content of the atmosphere and the surface as a result. Of course this is happening at all levels of the atmosphere just to make it more complicated. Finally, it can be directly observed just by measuring the radiation from a dark sky at night.

While your statement is true, the average of all radiation emitted from thermalized energy is towards space. This is as I said above. Any radiation emitted towards space travels a longer distance before re-absorption than radiation emitted toward the surface. This is due to the density differences. Hence, we can model all these radiation events as a statistical average, with all radiation travelling a small distance outwards. Adding CO2 increases the number of radiation events, thus increasing the flow of energy to space.
Keep in mind we are not discussing surface energy. The surface energy radiated upward is absorbed as well and the more CO2 the more that gets absorbed and radiated back towards the surface. I’ve read that about 90% is immediately re-emitted. Hence, 10% is thermalized and will participate in the above process as will latent energy, conductive energy and energy absorbed in the atmosphere from the sun.
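
Richard M’s “statistical average … outwards” picture can be made concrete with a toy 1-D grey random walk: upward hops are longer than downward ones because there is less air above the emission point. A minimal sketch; the scale height and column optical depth are made-up toy numbers, and this illustrates the geometry only, not the net-flux question the thread is arguing:

import math, random

H = 8.0    # scale height, km (toy value)
T0 = 5.0   # total column optical depth in the band (toy value)

def walk(rng, z0=0.0):
    # Follow one photon; return (hops, escape altitude in km)
    z, hops = z0, 0
    while True:
        hops += 1
        tau = rng.expovariate(1.0)             # optical depth to next absorption
        if rng.random() < 0.5:                 # emitted upward
            if tau >= T0 * math.exp(-z / H):   # less than tau of air above: escape
                return hops, z
            z = -H * math.log(math.exp(-z / H) - tau / T0)
        else:                                  # emitted downward
            if tau >= T0 * (1.0 - math.exp(-z / H)):
                z = 0.0                        # absorbed at the surface, re-emitted
            else:
                z = -H * math.log(math.exp(-z / H) + tau / T0)

rng = random.Random(42)
runs = [walk(rng) for _ in range(5000)]
print("mean hops to escape:", sum(h for h, _ in runs) / len(runs))
print("mean escape altitude (km):", sum(a for _, a in runs) / len(runs))

Raising T0 (more absorber in the band) increases both the hop count and the altitude from which photons finally escape, which is the standard picture both sides of this sub-thread are debating.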

Richard M
May 25, 2013 5:45 am

John Day says:
May 25, 2013 at 5:17 am
@bobl
> To me this implies that all IR radiation to space from the atmosphere must
> be from a greenhouse gas? …
> … What am I missing?
The ‘elephant in the room’ that you are missing is that most of the IR radiated to space from a planet is from the surface.

Is it? From what I’ve read the vast majority on Earth is from the atmosphere (almost 90%). According to the KT energy budget around 30% of the sun’s energy is absorbed directly in the atmosphere just to start with. Add to that latent energy, conductive energy, thermalized energy and radiation absorbed by the atmosphere and you aren’t left with much that passes through. Much of it may have started from the surface, but the final radiation event takes place in the atmosphere.

Richard M
May 25, 2013 7:04 am

The drop in sensitivity from simply factoring in a microsite bias also applies to any other temperature-record adjustments. For example, by removing the OHC adjustments Greg Goodman mentioned, or all the historic adjustments that lower previous temperatures, I would expect to see a much, much smaller sensitivity. IOW, their calculations are only as good as the data itself.
I wonder what the number would be if only raw temperature data was used …

pochas
May 25, 2013 7:49 am

This is probably the best we can do until the unknown unknowns bite us in the ass.

John Day
May 25, 2013 8:14 am

@Richard M
> around 30% of the sun’s energy is absorbed directly in the atmosphere just to start with…
I think you’re confusing absorption with albedo. 30% of the sun’s energy is reflected by the clouds, and 3% is absorbed by the clouds. The atmosphere absorbs only 13% of solar irradiation directly (not counting clouds). Of the non-reflected energy, 84% reaches the surface and oceans, and 4% of that is reflected back. The remainder (~80%, aka the ‘elephant’) is re-radiated as IR heat. Yes, 90% of this is absorbed by the atmosphere, but its source was the surface. And yes, that heat absorbed by the atmosphere, mostly by water vapor, is significant (as I said in my post) as GHE warming. CO2 plays a minor role, again as I said in my post.
You can literally (almost) see the GHE effect of water vapor in NOAA’s 6-micron IR imagery, where the white areas denote sections of highest absorption by atmospheric H2O. Black represents direct IR heat radiating from the surface. (It’s a negative image, somewhat unintuitive at first glance.)
http://www.goes.noaa.gov/GSSLOOPS/ecwv.html
Where are the analogous imagery loops of roiling clouds of CO2 absorbing IR heat? I’ve never seen any, have you?
😐

Steve Case
May 25, 2013 9:06 am

Regarding John Day’s comments at 8:14 above, the IPCC tells us in Chapter 8 of their AR4 report per Dr. James Hansen that “…a doubling of atmospheric CO2…with no feedbacks…the global warming from GCMs would be around 1.2°C…”
http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch8s8-6-2-3.html
Has anyone ever seen a similar climate sensitivity value for a doubling of water vapor, without feedbacks, published anywhere?
We often read about feedbacks and how water vapor, being a strong greenhouse gas, increases in the warm-up and produces a positive feedback, presumably some of this from its greenhouse properties alone. So how much would that be without consideration of its latent heat, cloud albedo and cloud insulation properties?
If the average world temperature is around 60°C and it goes up to around 61°C due to a doubling of CO2, the average concentration of water vapor in the air might go from around 1200 ppm to 1300 ppm, i.e. around 8%, far short of doubling. So how much warming would an 8% increase in water vapor produce?

Steve Case
May 25, 2013 9:11 am

Oops: 12,000 to 13,000 ppm.
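
The “around 8%” step can be put on a physical footing with the Clausius-Clapeyron relation: at constant relative humidity, water vapor scales with saturation vapor pressure. A minimal sketch using the Bolton (1980) approximation; the ~15 C global-mean surface temperature and the constant-RH premise are both assumptions:

import math

def e_sat(t_c):
    # Saturation vapor pressure in hPa, Bolton (1980) approximation
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

T0, T1 = 15.0, 16.0   # assumed global-mean temperature before/after 1 C warming
rise = e_sat(T1) / e_sat(T0) - 1.0
print(f"~{rise:.1%} more water vapor per 1 C of warming at constant RH")

This prints about 6.6%, close to the ~8% Steve Case gets from his ppm figures; it says nothing, of course, about the warming such an increase would produce, which is the question he is asking.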

Gary Pearse
May 25, 2013 9:17 am

I’m opting for B. My comments the other day on Nic’s recent WUWT article: Updated Climate Sensitivity….
“…..My take: ECS finally turns out to be vanishingly small (i.e. there is a governor on climate responses, à la Willis Eschenbach), then TCR is larger than ECS, and within a few years it declines to the minor ECS figure and natural variability is basically all that is left. How’s that for a model!”

phlogiston
May 25, 2013 9:36 am

Bill Illis says:
May 24, 2013 at 8:10 am
The NODC has updated the Ocean Heat Content numbers for the first quarter of 2013.
Big jump in the OHC numbers in the first quarter of 2013 (and some restating of the older numbers again).
0-2000 metre uptake equates to 0.49 W/m2 in the Argo era.
http://s13.postimg.org/u6al0f6xj/OHC_700_and_2000_M_Q1_2013.png

The most reliable aspect of OHC data, from Argo or otherwise, is the relative data on vertical heat movement (as opposed to global heat-budget speculation). Thus this new data, by showing more heat uptake at 2000 m than at 700 m, points to vertical mixing and downward movement of heat. That’s why it’s getting so bloody cold. It’s odd that Niño 3.4 seems to be taking a dive now after a rather anomalous peak in March-April; normally Niño 3.4 peaks either in January (El Niño-like event) or in the summer (La Niña-like event). It’s also noteworthy that total NH sea ice remains close to the winter maximum.
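
The unit conversion behind Bill Illis’s “equates to 0.49 W/m2” is worth making explicit: an ocean heat uptake rate in joules per year, spread over the Earth’s surface. A minimal sketch; the ~8 ZJ/yr input is a round number chosen because it reproduces his figure, and spreading over the full Earth surface (rather than ocean area only) is an assumption:

EARTH_AREA = 5.1e14        # m^2, whole Earth surface
SECONDS_PER_YEAR = 3.156e7

def ohc_trend_to_flux(zj_per_year):
    # Convert an OHC trend in zettajoules/year to a mean flux in W/m^2
    return zj_per_year * 1e21 / SECONDS_PER_YEAR / EARTH_AREA

print(f"{ohc_trend_to_flux(8.0):.2f} W/m^2")   # ~8 ZJ/yr comes out near 0.49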

pochas
May 25, 2013 11:03 am

Steve Case says:
May 25, 2013 at 9:06 am
“Has anyone ever seen a similar climate sensitivity value for a doubling of water vapor, without feedbacks, published anywhere?”
Not possible. If, as I maintain, the surface temperature is set in the radiating zone above 14,000 feet and teleconnected to the surface via the lapse rate, then the water content of the atmosphere is approximately fixed at some value less than 100 percent relative humidity. Exceed this value and it rains.

Eliza
May 25, 2013 12:21 pm

My two cents’ worth: it is highly unlikely that this generation or the next will see any significant change in climate, be it cooling or warming, unless there is massive volcanic activity or a giant meteorite hits us. So saying we are going into an ice age is just as dumb as the warmista proposals currently in vogue, although I would just love to see a massive ice age coming tomorrow, just to win the argument against the warmistas. Although current low solar activity WILL affect climate over 1000s of years, we only live 100s of years (we hope!) at most. LOL

Eliza
May 25, 2013 12:22 pm

BTW, I believe most meteorologists would agree with my previous statement, haha.

atarsinc
May 25, 2013 1:19 pm

Steve Case says:
May 25, 2013 at 9:06 am
The average worldwide temp is nowhere near 60 C. More like 14.5 C. JP

Nic Lewis
May 25, 2013 1:24 pm

Noblesse Oblige says:
Nic // Why not do a meta analysis to collapse those wide C.I. values. The consistency between the various results suggests that the C.I. is too large.
The errors in the various estimates will be highly correlated, unfortunately, so a meta-analysis would not be straightforward and, I suspect, would bring little improvement over the best-constrained single estimate, that for the latest decade (or maybe the last 13 or 14 years).
We need to get a better constraint on aerosol forcing, in particular, to bring down the CI. Interestingly, some of the inverse studies give much more tightly constrained estimates of aerosol forcing than do the direct satellite-observation-based studies: e.g., Forest et al 2006 and my objective Bayesian reworking of it, Lewis 2013; Aldrin et al 2012; and probably Ring et al 2012, if they could be bothered to work out the CI. However, that does ignore model error.
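
Nic’s point about correlated errors can be quantified: the standard error of the mean of n equally precise estimates whose errors share a pairwise correlation rho shrinks not by sqrt(n) but by sqrt((1 + (n-1)·rho)/n). A minimal sketch; n = 5 and the rho values are illustrative assumptions:

import math

def se_of_mean(sigma, n, rho):
    # Std error of the mean of n estimates, each with std error sigma,
    # whose errors share a common pairwise correlation rho
    return sigma * math.sqrt((1 + (n - 1) * rho) / n)

for rho in (0.0, 0.5, 0.9):
    print(f"rho = {rho}: SE shrinks from 1.00 to {se_of_mean(1.0, 5, rho):.2f}")

With rho = 0.9 the five-study mean retains 96% of the single-study standard error, essentially no gain, which is exactly why a meta-analysis “would bring little improvement”.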

Steve Case
May 25, 2013 6:31 pm

atarsinc says:May 25, 2013 at 1:19 pm
The average worldwide temp is nowhere near 60 C. More like 14.5 C. JP
I hate making Fahrenheit/Celsius errors.

Chewage
May 25, 2013 8:39 pm

What else is missing? SURGE events that make an overall short term balance within the planetary heat budget; heat dissipation and cold dissipation at the tropopause (think stratwarm & chill) via the narrow bands of the spectrum. It would behoove all to know what the subsurface temperature balance looks like over time at -1m through -100m, including below the ocean basins, which is a trick and a half and unachievable…
Equilibrium happens, but at what levels/forcings?

May 25, 2013 9:39 pm

Aerosol-adjusted forcing, conveniently leaving out the Geo-engineering aerosol-adjusted forcing element?

richard verney
May 26, 2013 3:12 am

Greg Goodman says:
May 24, 2013 at 3:22 pm
////////////////////////////////////////////
Well done.
I (and many others) have been suggesting for years that volcanoes are used as a fudge factor to give the impression that models and their projections (predictions) are more capable than they truly are. Now we see the role of smoothing in bringing about this impression.
You are right to observe that there ought to have been no smoothing of volcano forcing.
Greg (Greg Goodman says: May 25, 2013 at 3:31 am) also points out the questionable adjustment made to the ARGO data. It is important to bear in mind that this makes the ARGO data set potentially unreliable, at least as far as its tuning to pre-ARGO data sets. The full implications of this questionable adjustment should not be overlooked whenever one discusses OHC or ARGO data.

David
May 26, 2013 3:17 am

Steve Mosher says, “Let me put the importance of this metric into perspective: every degree of C in uncertainty is worth about 1 trillion dollars a year if you are planning to mitigate.”
Are you saying that the “C” in CAGW will cause a trillion dollars in damage per 1 degree of rise, or are you claiming it will cost one trillion dollars to change the temperature by one degree C? Either way I call B.S. on your numbers, so please show me your power. (I actually think a drop of one C would be far more costly, and reduced CO2 means less food, and more land and water required to produce said food.)

David
May 26, 2013 3:44 am

bobl says:
May 24, 2013 at 5:30 pm
Thank you Richard, that’s exactly what I was trying to say. I was thinking about how energy lost from the surface by convection is radiated to space, and whether CO2 partial pressure plays into the efficiency of that process.
1. CO2 molecule takes up energy through collision with non-radiating gas
2. CO2 molecule emits photon
————————————————————————————————–
I see all of this as a function of the residence time of the energy involved. So a GHG decreases the residence time of energy received via collision from a non-GHG, but can, 50% of the time, increase the residence time of outgoing energy received as OLWIR from the surface, by directing said energy back towards the surface. Clearly, if the GHG cools the upper atmosphere relative to a non-GHG molecule, then conduction from below, as well as convection, accelerates upwards.
An interesting thought experiment is what would happen in an atmosphere with zero GHGs. According to radiation theory the atmosphere would be far cooler (some say 30 degrees) than the surface. However, the hotter surface would then continually net-conduct energy to the atmosphere just above it; the atmosphere above the surface would then cool by conducting energy to ever higher elevations within the atmosphere, and the lower atmosphere would continually receive ever more energy via conduction from the surface. Eventually, as energy is never lost, the atmosphere would establish an equilibrium with the surface. The lapse rate would be set by the molecules per square metre, with the temperature established not by different vibrational rates of each molecule, as they would equalise, but by the number of molecules hitting the measuring instrument (the more mass per m2, the higher the heat capacity per m2). Eventually, in this non-GHG world, you would not have back-radiation to the surface, but “back conduction” to the surface, thereby increasing the retained heat above what the S-B equation would suggest.
So this is my assertion, based on David’s law of physics, which reads: “Only two things can affect the energy content of any system in a radiative balance: either a change in the input, or a change in the residence time of some aspect of those energies within the system.”

pat
May 26, 2013 5:28 am

26 May: UK Telegraph: Louise Gray: Hay Festival 2013: global warming is ‘fairly flat’, admits Lord Stern
Lord Stern, who originally warned the Government about climate change, has admitted that global warming has been “fairly flat” for the last decade.
“I note this last decade or so has been fairly flat,” he told the Telegraph Hay Festival audience.
He said the reasons were because of quieter solar activity, aerosol pollution in certain parts of the world blocking sunshine and heat being absorbed by the deep oceans.
Lord Stern pointed out that all these effects run in cycles or are random so warming could accelerate again soon.
“In the next five to ten years it is likely we will see the acceleration because these things go in cycles,” he warned…
He said it was an “illusion” to claim that the short term flat line in global warming means that global warming is no longer a threat.
“It is a dangerous extrapolation of the short term phenomenon into a long term trend when the underlying responses for long term trends in terms of rising greenhouse gases are well understood and clear.”
Lord Stern also said he has written to the Prime Minister urging him to introduce a target to decarbonise electricity by 2030 as part of the Energy Bill, currently going through Parliament.
***He said investors need the policy clarity in order to build the infrastructure Britain needs in future…
http://www.telegraph.co.uk/culture/hay-festival/10081250/Hay-Festival-2013-global-warming-is-fairly-flat-admits-Lord-Stern.html
***yes, it’s all about those “investors” Lord Stern.

Steven Mosher
May 26, 2013 7:59 am

Richard M says:
May 25, 2013 at 7:04 am
The drop in sensitivity from simply factoring in a microsite bias also applies to any other temperature-record adjustments. For example, by removing the OHC adjustments Greg Goodman mentioned, or all the historic adjustments that lower previous temperatures, I would expect to see a much, much smaller sensitivity. IOW, their calculations are only as good as the data itself.
I wonder what the number would be if only raw temperature data was used …
########################
Raw data is garbage. You obviously haven’t looked at it.

Steven Mosher
May 26, 2013 8:08 am

ruvfsy says:
May 24, 2013 at 7:50 pm
So Mosh,
Where is the fine line between denialism and lukewarmerism?
1.2 per doubling of CO2?
##################################
first principles will get you to 1.2C. I think we put the line of demarcation at 1C. But as folks get more evidence that might be pushed down.
Here’s the difference between a lukewarmer and a CAGWer: looking at the same PDF for sensitivity that ranges from 1 to 6, the lukewarmer will note that over half of the PDF falls below 3C; the CAGWer will talk about everything above 3C.
Simple point: the range is uncertain; there is room for many beliefs. But I discount people who say they KNOW the value is low. I discount people who say they FEAR the value is high.
Just look at the best knowledge we have; don’t whine that this knowledge is imperfect. Look at the best understanding and describe it without over-reaching. The PDF runs from about 1 to 6. It’s a good bet that the true value is less than 3C. If you can say that, you are a lukewarmer:
A. Humans add GHGs
B. GHGs warm, they do not cool the planet.
C. How much? Over time periods of 100 years or less, a doubling is more likely to create less than 3C warming than more than 3C warming.
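
Mosher’s “over half of the PDF falls below 3C” can be reproduced with a simple distributional assumption. A minimal sketch reading his “about 1 to 6” as the 5-95% range of a lognormal, which is my assumption, not his stated method:

import math

def lognormal_from_range(p05, p95):
    # Lognormal (mu, sigma) whose 5th/95th percentiles match the given range
    mu = (math.log(p05) + math.log(p95)) / 2
    sigma = (math.log(p95) - math.log(p05)) / (2 * 1.645)
    return mu, sigma

def lognormal_cdf(x, mu, sigma):
    return 0.5 * (1 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2))))

mu, sigma = lognormal_from_range(1.0, 6.0)
print(f"P(sensitivity < 3C) ~ {lognormal_cdf(3.0, mu, sigma):.0%}")

This prints roughly 65%, consistent with “over half … below 3C”; a right-skewed PDF spanning 1-6 puts its median well under the midpoint of the range.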

Reply to Steven Mosher
May 26, 2013 9:14 am

Steven,
“B. GHGs warm, they do not cool the planet.”
According to the data from NASA last year: “Carbon dioxide and nitric oxide are natural thermostats,” explains James Russell of Hampton University, SABER’s principal investigator. “When the upper atmosphere (or ‘thermosphere’) heats up, these molecules try as hard as they can to shed that heat back into space.”
That’s what happened on March 8th when a coronal mass ejection (CME) propelled in our direction by an X5-class solar flare hit Earth’s magnetic field. (On the “Richter Scale of Solar Flares,” X-class flares are the most powerful kind.) Energetic particles rained down on the upper atmosphere, depositing their energy where they hit. The action produced spectacular auroras around the poles and significant upper atmospheric heating all around the globe.
“The thermosphere lit up like a Christmas tree,” says Russell. “It began to glow intensely at infrared wavelengths as the thermostat effect kicked in.”
http://science.nasa.gov/science-news/science-at-nasa/2012/22mar_saber/

Steven Mosher
May 26, 2013 8:12 am

Nic Lewis says:
May 24, 2013 at 2:11 pm
Steven Mosher wrote:
“I think it might be instructive for WUWT readers to understand how Anthony’s claims about microsite bias would play into your calculations. For example, if one assumed that the land warming was biased by .1C per decade from 1979-current, what would that do to the sensitivity calculation?”
Good point, Steve. That assumption would reduce the increase in global temperature betrween the 1860-79 mean and the 2003-12 mean from 0.76 C to about 0.68 C. All the climate sensitivity estimates, and their uncertainty ranges, would then reduce by about 11%. So a sensitivity of 1.7 C would change to just over 1.5 C, for example.
I hope Anthony and others take note of this. What this does is allow people to frame their concerns about temperature accuracy in the LARGER PICTURE of the scientific debate over sensitivity.
Look at the equations for ECS and TCR. People need to relate their skepticism to these equations.
That way they are part of the debate.
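
The arithmetic behind Nic’s “reduce by about 11%” follows directly from the energy-budget approach, in which both ECS and TCR scale linearly with the temperature change. A minimal sketch; the 29% land fraction and the centring of the 2003-12 decade are my assumptions, chosen because they reproduce his 0.68 C:

BIAS_PER_DECADE = 0.10             # Mosher's hypothetical land-only bias, C/decade
DECADES = (2007.5 - 1979.0) / 10   # 1979 to the centre of the 2003-12 mean
LAND_FRACTION = 0.29               # assumed land share of the global mean

dt_raw = 0.76                      # 1860-79 mean to 2003-12 mean, C
dt_adj = dt_raw - BIAS_PER_DECADE * DECADES * LAND_FRACTION
scale = dt_adj / dt_raw            # ECS and TCR are linear in delta-T
print(f"dT: {dt_raw:.2f} -> {dt_adj:.2f} C; sensitivities shrink by {1 - scale:.0%}")
print(f"e.g. ECS 1.7 C -> {1.7 * scale:.2f} C")

This prints 0.76 -> 0.68 C, an 11% shrinkage, and 1.7 C falling to about 1.52 C, matching “just over 1.5”.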

Steven Mosher
May 26, 2013 8:24 am

HR says:
May 24, 2013 at 9:08 am
Jim Cripwell
In order to understand science you need a healthy dose of caution. The limits of our data and understanding mean we must pepper our conclusions with appropriate caveats and/or uncertainty ranges. You seem to completely misunderstand this and instead favour the idea of perfection or nothing. The unfortunate truth is that most of the time science is about being less wrong rather than about being right; you need to moderate your skepticism appropriately.
#####################################
Yes, Many have pointed out to Jim that he needs to be skeptical about his skepticism.
Skepticism in science is a TOOL, it is not a philosophy. By Jim’s definitions we can never know anything, which is fine for philosophical skepticism, but death to science which operates on incomplete, inconclusive, data and induction. There is a debate in science. people can join that debate by following Nic’s path. So there is a seat for everybody at that table. Crying that the table doesnt exist will get you ignored. Complaining that you want to eat at a different table, will get you ignored. There is a debate. Its a scientific debate. Its about climate science and the most important question we can ask: How much warmer? Arm waving will get you ignored.
Attcking fundamental physics will get you ignored. Screaming fraud will get you ignored. There is a debate. There is a question on the table and open seats. Do like Nic and take a seat.

Richard M
May 26, 2013 8:49 am

John Day says:
May 25, 2013 at 8:14 am
@Richard M
> around 30% of the sun’s energy is absorbed directly in the atmosphere just to start with…
I think you’re confusing absorption with albedo. 30% of the sun’s energy is reflected by the clouds, and 3% is absorbed by the clouds. The atmosphere absorbs only 13% of solar irradiation directly (not counting clouds). Of the non-reflected energy, 84% reaches the surface and oceans, and 4% of that is reflected back. The remainder (~80%, aka the ‘elephant’) is re-radiated as IR heat. Yes, 90% of this is absorbed by the atmosphere, but its source was the surface. And yes, that heat absorbed by the atmosphere, mostly by water vapor, is significant (as I said in my post) as GHE warming. CO2 plays a minor role, again as I said in my post.

I am using the KT energy budget diagram as my source. Where did you get your numbers? In addition, my 30% was computed after you subtract out albedo and includes clouds, which may have led to some confusion. If you add the atmospheric sources you have 78+80+17 = 175 W/m2, and that does not include thermalized energy. That’s a lot of energy.
http://rabett.blogspot.com/2012/08/one-kt-diagram.html
Clearly CO2 can interact with any molecule in the atmosphere so I didn’t limit it. I agree the effect is small but so is the greenhouse contribution of CO2 compared to water vapor and clouds. We have two small effects and we don’t know exactly how they work in total. That was my point.

Richard M
May 26, 2013 8:57 am

Steven Mosher says:
May 26, 2013 at 7:59 am
Raw data is garbage. You obviously haven’t looked at it.

I happen to agree. The problem is very biased people think they know how to adjust the data. As has been proven time and again in medical research, researcher bias always affects the results towards the bias. We can be 99+% assured that the data is biased on the warm side. Being mathematically inclined, I’m more of the opinion that the errors are likely to cancel out and it’s probable we don’t even know all of the factors that make it bad. That makes using the raw data as good (and probably better) as anything else.

Luther Wu
May 26, 2013 9:00 am

Steven Mosher says:
May 26, 2013 at 8:08 am
____________________________
Attempts at appearances of reasonableness only go so far.
There isn’t any compelling argument that a doubling of CO2 produces anywhere near as much as 3C.
All this middle-road stuff only puts half your headlights in the opposing lane.

Jim Cripwell
May 26, 2013 10:54 am

Steven Mosher. You have made this personal. Let me reply in the same spirit. You misquote what I write and give a completely false impression of what I believe. I have stated over and over again that CAGW is a perfectly viable and reasonable hypothesis. But that is all it is: just a hypothesis, with no empirical data to back it up and prove it is correct. You accuse me of bringing nothing to the table. That is false. You claim that by bringing hypothetical estimates of a numeric value of climate sensitivity to the table, you are somehow proving that CAGW is correct. You insist that there is no categorical difference between estimates and measurements. All I try to point out is that, until you have actual measurements of climate sensitivity, you cannot make CAGW any more than a hypothesis. I don’t bring nothing to the table: I point out that what you and the warmists bring to the table is not proper physics.
The fundamental issue, which you refuse to discuss, is whether the IPCC statements in the SPMs that there is a 95% or 90% probability of something being correct have any basis in science. I maintain that while CAGW is a viable hypothesis, that is all it is, and the IPCC claims of high probabilities that things are correct are scientific garbage.

Richard M
May 26, 2013 11:35 am

Steven’s claim that you must “sit at the table” (i.e. accept the basic paradigm) to have an impact on a scientific discussion is nonsense. Both ulcers and plate tectonics are examples that prove his assertion is wrong (and there are many others). Sorry Steven, making erroneous assertions to try and force people to your way of thinking only detracts from your credibility.

May 26, 2013 6:55 pm

@Richard M
> …my 30% was computed after you subtract out albedo and includes clouds,
> which may have led to some confusion.
What confusion? I was not confused. Your statement was very clear and was simply wrong:
“…around 30% of the sun’s energy is absorbed directly in the atmosphere just to start with…”
>Where did you get your numbers?
http://education.gsfc.nasa.gov/ess/units/unit2/u2l5a.html (see budget diagram under Resources)

David
May 26, 2013 7:22 pm

Steven Mosher says:
May 26, 2013 at 8:08 am
ruvfsy says:
May 24, 2013 at 7:50 pm
So Mosh,
Where is the fine line between denialism and lukewarmerism?
1.2 per doubling of CO2?
##################################
first principles will get you to 1.2C. I think we put the line of demarcation at 1C. But as folks get more evidence that might be pushed down.
————————————————————————
Let it be recorded here, Steven Mosher considers Richard Lindzen to be a denier.
Also, Mr Mosher, I notice you avoided answering my questions with regard to your trillion-dollars-per-degree-C statement.

george e. smith
May 26, 2013 7:34 pm

I’m a believer in the standard assertion that GHGs like H2O, CO2, and O3 can and do absorb small sections of the LWIR surface-emitted radiation, and thereby raise the Temperature of the atmosphere at whatever altitude the absorption occurs; quite high, in the case of O3.
Now by whatever means the Temperature of the atmosphere is increased, the result of such a higher Temperature has to be an increase in the rate of radiative cooling of that atmosphere to outer space. Higher-Temperature things radiate faster, all else being equal.
Increasing atmospheric Temperature also should enhance convective transport of heat energy to higher altitudes, to be lost to space.
So increasing the atmospheric Temperature beyond what the natural altitude lapse rate would dictate should permit faster cooling to space with a lower surface Temperature, because the rapid radiative transfer of energy from the surface to higher-altitude GHGs bypasses the slower conduction/convection mechanisms.
For the life of me, I can’t even imagine by what mechanism a higher atmospheric Temperature can transport heat energy down to the surface, contrary to what the second law mandates for actual heat energy transport. The net heat flow must be from the warmer surface to the cooler upper atmosphere. True, the smaller Temperature gradient between a lowered-Temperature surface and a Temperature-enhanced upper atmosphere (via LWIR radiation) must diminish the rate of conductive heat energy transport upwards in the atmosphere.
And of course, we know that downward LWIR radiation from the warmed atmosphere is strongly absorbed in a thin layer of mostly ocean surface, leading to accelerated evaporation, and transfer of latent heat energy back into the atmosphere.
So yes I believe (more) GHGs increase atmospheric Temperatures; but no, I don’t believe that increases surface Temperatures; it just increases the upper atmosphere radiative cooling rate.
Cold things don’t cool faster than hot things; there’s that T^4 problem there.
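
George’s “higher Temperature things radiate faster; all else being equal” is just the Stefan-Boltzmann law in words. A minimal sketch with typical textbook temperatures (a blackbody idealization, so illustrative only):

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

for t in (220.0, 255.0, 288.0):   # band emission level, effective, surface (typical values)
    print(f"T = {t:.0f} K: emits {SIGMA * t**4:.0f} W/m^2, "
          f"+{4 * SIGMA * t**3:.1f} W/m^2 per extra K")

The derivative term 4·sigma·T^3 makes his T^4 point quantitative: a 1 K bump at 288 K adds about 5.4 W/m^2 of emission, more than twice the 2.4 W/m^2 it adds at 220 K.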

george e. smith
May 26, 2013 8:27 pm

Readers of WUWT, and all friends of “Blackbody Radiation” (including moi) might be interested in the following narrative, from the June 2013 issue of Optics & Photonics News; a regular publication of The Optical Society of America; one of the foundation bodies of the American Institute of Physics; in their “Optical Feedback” (letters) section, from one Edward Collett of New Jersey.
He comments on an article in the April 2013 OPN issue regarding a less than happy episode between Albert Einstein, and S.N. Bose, of Bose-Einstein Statistics renown.
Bose had published a paper on this statistics, and Einstein followed with a paper he couldn’t have written without seeing the Bose paper. Einstein never thought to include Bose by invite, as a co-author of his paper.
It seems that in 1905, Einstein had recognized the inconsistency of Max Planck’s derivation of the Black Body Radiation formula.
“it was part quantum and part classical”.
Einstein spent the next 20 years trying to formulate Planck’s radiation law in completely Quantum Mechanical terms. He failed and by 1925, Einstein had not developed any part of modern Quantum Mechanics. No wonder Einstein didn’t like the whole idea of quantum theory.
The above freely excerpts from Collett’s letter, and he is relating from the literature on Bose and Einstein.
Bottom line is, neither Einstein himself nor anyone else has derived the BB radiation laws from Quantum Mechanical theory. It is purely classical physics, and quite fictional at that.
Yes Max Planck suggested that EM radiation came in integral chunks; same as pumpkins on a vine.
You can pick two pumpkins, or 17 pumpkins, but not 2.71828 pumpkins; nor photons either.
Planck placed no restrictions on the size (energy) of either photons or pumpkins. He simply said that the photon energy and the associated wave frequency were related by E = h(nu).
That makes h = energy times time (action), so h is the action in each cycle of the associated wave frequency of the photon.
(nu) can range from zero to infinity, sans both ends, without restriction, so the photon energies are in no way quantized; just as the mass and size of pumpkins are not quantized.
Quantum theory deals with the actual physical energy levels of electrons et al in real physical materials. BB radiation theory, incorporates the real physical properties of no material of any kind; and is in every way non-existent; yet such an important step in the evolution of modern physical understanding of our universe.
By all accounts, S.N. Bose was one very smart guy, and apparently a very nice guy too. One of the great Indian physicists, like Raman and Chandrasekhar. Forgive me if I have left out any of the well-known Indian physics “biggies”.

dbstealey
May 26, 2013 8:34 pm

george e. smith says:
May 26, 2013 at 7:34 pm
Good analysis as usual, George.

John Day
May 27, 2013 4:29 am

e. smith
> Bose had published a paper on this statistics, and Einstein followed with a
> paper he couldn’t have written without seeing the Bose paper. Einstein
> never thought to include Bose by invite, as a co-author of his paper.
Are you referring to Bose’s watershed paper “Plancks Gesetz und Lichtquantenhypothese”, Zeitschrift für Physik, 1924? (If you were referring to another Bose paper, please elaborate.)
If so, you’ve got the historical details completely wrong here.
Einstein had indeed seen Bose’s paper, because Einstein wrote it! Bose (the ‘bos’ in boson) had requested Einstein to translate it into German for him. He had unsuccessfully tried to get it published elsewhere, but was rejected because his startling new theory about the distribution of energy in a photon gas didn’t coincide with the ‘consensus’ theory (the classical Maxwell-Boltzmann distribution of ordinary gases).
http://en.wikipedia.org/wiki/Satyendra_Nath_Bose
As for Planck’s Black-Body Radiation Law being “part quantum and part classical” and Einstein’s involvement, you’ve got that twisted too. Planck formulated the law in 1900 using only empirically derived constants, under classical assumptions. It wasn’t until 1914 that he further expressed it as a statistical distribution:
http://en.wikipedia.org/wiki/Planck's_law#CITEREFPlanck1914
So that statement about “in 1905, Einstein had recognized the inconsistency of Max Planck’s derivation of the Black Body Radiation formula” is BS. (What inconsistency?) What Einstein did in 1905 (besides Relativity) was to discover the photoelectric effect (scattering of photons as light). It was for this photoelectric work that Einstein received his only Nobel Prize, in 1921.
Planck rightfully deserves credit as the ‘father of quantum theory” because the Planck Relation (with its famous constant ‘nu’: e=h*nu) and the radiation law are fundamental “planks” (sorry) in Quantum Theory. Planck’s Relation (like Einstein’s e=mc2 and Boltzmann’s S=k*logW) is amazing because it reveals a remarkable relationship between two worlds that previously seemed unrelated.
Einstein gets supporting credit too, for his photoelectric effect, which was one of the earliest portrayals of photons as particles.

John Day
May 27, 2013 9:43 am

@me>…. famous constant nu…..
Oops, I meant Planck’s famous constant ‘h’ of course. Nu is the photon’s frequency.

george e. smith
May 27, 2013 7:22 pm

“””””……John Day says:
May 27, 2013 at 4:29 am
e. smith
> Bose had published a paper on this statistics, and Einstein followed with a
> paper he couldn’t have written without seeing the Bose paper. Einstein
> never thought to include Bose by invite, as a co-author of his paper……”””””
Well John, if you read my post, you would see I merely relayed the essence of a letter published in OPN for June 2013.
I suggest that the best place for your learned rebuttal of that “BS”, is for you to send it to the OPN feedback column, for them to publish; they always seek to learn the truth, so they would be most appreciative of Wiki’s definitive reporting on the matter.
As for Einstein having “written” Bose’s circa 1924 paper, your Wiki source merely says that Einstein simply translated it into German; he did not “write it”.
Yes Einstein belatedly received his Nobel prize in physics for the work on the photo-electric effect.
And for your information, the photo-electric effect has nothing whatsoever to do with “the scattering of photons as light”.
It relates to the emission of electrons from certain metals, when irradiated by EM radiation.
Classical physics had no explanation for the PE effect (and still doesn’t).
The emission of electrons (or not) is quite unrelated to the intensity of the EM radiation, other than in the number of electrons emitted (if any).
What determines the emission (or not) is the frequency, or wavelength, of the radiation. No matter how weak the irradiance, even down to a single photon, if the photon energy [h.(nu)] exceeds a certain threshold, electron emission can occur (and with quite high quantum efficiency).
But even a kilowatt of power from a CO2 laser, at 10.6 microns wavelength, will not release a single photo-electron from a material that will emit an electron with as high as 90% QE from a single 2 eV photon.
So if you rely on wiki for your source of factual scientific information; don’t be surprised if they sometimes feed you “BS”.
And do send your rebuttal to OPN feedback column; I can’t wait to see it in print.
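
George’s laser example rests on nothing more than E = h(nu). A minimal sketch computing single-photon energies; the 620 nm line is an illustrative wavelength that gives roughly the 2 eV he quotes:

H_PLANCK = 6.626e-34   # J s
C_LIGHT = 2.998e8      # m/s
EV = 1.602e-19         # J per eV

def photon_energy_ev(wavelength_m):
    # Single-photon energy E = h*c/lambda, expressed in eV
    return H_PLANCK * C_LIGHT / wavelength_m / EV

print(f"10.6 um CO2 laser photon: {photon_energy_ev(10.6e-6):.2f} eV")
print(f"620 nm red photon:        {photon_energy_ev(620e-9):.2f} eV")

At 0.12 eV per photon, no quantity of 10.6 µm photons can cross a ~2 eV work function one photon at a time, which is the whole point of his kilowatt-laser example.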

John Day
May 28, 2013 6:25 am

e. smith
> … if you read my post, you would see I merely relayed the
> essence of a letter published in OPN for June 2013…
Yet you had no problem in letting dbstealey and the rest of us think it was your analysis too. George, no offense, but you occasionally pontificate a bit too much on topics you haven’t quite mastered. So, as remediation, I suggest that _you_ write the rebuttal to OPN.
> And for your information, the photo-electric effect has nothing whatsoever
> to do with “the scattering of photons as light”.
Oops, typo, I meant to say “as electrons”. Mea culpa, peccavi. [And perhaps I should have put “scattering” in quotes.] But PE _is_ a kind of scattering, in a broad sense (if you squint a little, so you can’t distinguish between photons and electrons): arriving particles (photons) interact with atoms and particles leave in different directions (as electrons). That is ‘scattering’, which is what I was trying to convey (by slightly abusing the term).
In fact, the photo-electric effect is just one of five well-known ‘scattering’ interactions of photons and atomic matter:
1. Coherent Scattering
2. Photo-disintegration
3. Pair Production
4. Compton Effect or scattering
5. Photoelectric Effect
http://whs.wsd.wednet.edu/faculty/busse/mathhomepage/busseclasses/radiationphysics/lecturenotes/chapter12/chapter12.html
(Squint a little when you look at the diagrams and you’ll see that they all result in a kind of ‘scattering’ effect)
>… Einstein simply translated into German; he did not “write it”.
Now you seem to be trying to hide your previous endorsement of this allegation that Einstein had somehow plagiarized Bose’s idea:
” Bose had published a paper on this statistics, and Einstein followed with a paper he couldn’t have written without seeing the Bose paper. ”
The historical facts are: 1) Bose wrote a paper but could _not_ get it published; 2) he sent the paper to Einstein and asked for his help to get it published.
So the above allegation is clearly false and misleading. Of course Einstein had seen the paper! That was the point I was making! So Einstein translated (i.e. ‘re-wrote’) the paper in German. Do you disagree with Wikipedia and other references on the historical accuracy of this? [BTW, Wikipedia is a collection of peer-reviewed documents, not that that guarantees total accuracy of course.]
Einstein was no saint, but it is well known that he did enthusiastically endorse Bose and his work to the international scientific community. Subsequently, Bose was promoted and given a 2-year paid sabbatical to visit Europe to collaborate with his “peers” (even though he did not have a PhD). Dirac named the ‘boson’ in his honor.
But now I am pontificating. So, pen down.
😐

george e. smith
May 29, 2013 12:05 am

Well John, I just lost an hour’s worth of typing when PG&E decided to put a four-way stop sign on my local power grid. My LED reading lamp was only out for sixty seconds, but my internet was out, and the loss was total when I tried to post it in the dark.
So just a short comment. Wiki of course does NOT publish peer-reviewed papers. They publish what someone unknown wrote. Now they certainly list peer-reviewed papers, such as the German Bose paper you mentioned.
But did you look up that paper itself, to be certain that the Wiki author correctly quoted from it, or from any of the other references?
Some of the Wiki authors are English-language impaired, so what they write and what they cite in bibliographies are two quite different things.
As for what I personally write, 95% of it I simply type from memory; so yes, sometimes I disremember it. The other 5%, is typically data directly excerpted from reference handbooks, and other widely available texts. I almost always cite my sources, when I do that.
As for Wiki, I NEVER consult them for information. I do sometimes check them when others, such as you, give specific links to them. Mostly, that is a waste of time.
And Stealey evidently had no trouble discerning the difference between what I excerpted from Collett’s letter and what was subsequently my own personal input. So what was your problem with that?
You should really start thinking for yourself; there is little of this that any WUWT reader can’t understand. So stop citing Wiki references, unless you first read the peer-reviewed papers they list, to ensure they quoted them correctly.

John Day
May 29, 2013 4:34 am

e. smith
>Wiki of course does NOT publish peer reviewed papers.
Wikipedia is not a system for publishing scholarly papers (yet). It is an on-line encyclopedia that may be reviewed, collaborated on, or edited by anyone, including experts and yourself. In theory, such a ‘collaborative public encyclopedia’ cannot possibly work, but in practice it works remarkably well. It’s not perfect, but, like a living thing, Wikipedia is evolving and getting better.
George, I know you want to change the subject (which is already far off-topic from ‘aerosol adjusted forcings’) but you really need to face the music and apologize for retweeting those errors in that OPN letter. Just say you’re sorry for any misinformation that you may have inadvertently said or repeated in regard to the 1924 Bose-Einstein paper. There, I said it for you.
😐

John Day
May 29, 2013 5:22 am

e. smith
>But did you look up that paper itself to be certain that the wiki author correctly quoted from it;
Most of these early Zeitschrift manuscripts are locked up behind Springer paywalls ($40 to unlock). But the Bose paper is an exception:
http://tu-dresden.de/die_tu_dresden/fakultaeten/fakultaet_mathematik_und_naturwissenschaften/fachrichtung_physik/itp/tp/lehre_dir/vorlesungen_dir/ws_2012_2013/folder.2012-12-21.5820555304/bose_1924.pdf
You’ll see that Bose received full credit as the only author of the paper (“Bose, Dacca University India). Einstein’s name doesn’t appear until the end, in the Translator’s Note, where he warmly praises the work as “important progress in my opinion”. Einstein was already famous in 1924, so that tiny endorsement gave Bose the huge opportunity he was seeking.

george e. smith
May 29, 2013 9:50 pm

“””””…..John Day says:
May 29, 2013 at 4:34 am
e. smith
>Wiki of course does NOT publish peer reviewed papers.
Wikipedia is not a system for publishing scholarly papers (yet). It is an on-line encyclopedia that may be reviewed, collaborated or edited by anyone, including experts and yourself. In theory, such a ‘collaborative public encyclopedia’ cannot possibly work, but in practice it works remarkably well. It’s not perfect, but, like a living thing, Wikipedia is evolving and getting better.
George, I know you want to change the subject (which is already far off-topic from ‘aerosol adjusted forcings’) but you really need to face the music and apologize for retweeting those errors in that OPN letter. Just say you’re sorry for any misinformation that you may have inadvertently said or repeated in regard to the 1924 Bose-Einstein paper. There, I said it for you…..”””””
Stop making stuff up, John. I don’t “tweet”, whatever the hell that is. You can’t seem to understand that the issue is NOT that the first paper by Bose WAS published with Bose as author, which he was, and of course in Einstein’s translation. The letter writer’s comment related to a SECOND PAPER, written by Einstein as sole author, but which, according to history, Einstein could not have written but for the fact that he had already seen the earlier Bose paper. The letter writer asserts, based on his understanding of the history, that Einstein should have acknowledged the earlier Bose work and invited him to be co-author of the second paper, which drew heavily on Bose’s work. You have not even acknowledged the existence of the later Einstein paper.
You are the one who is challenging the correctness of the letter writer’s assertions. You owe it to him, to write your objections to the OPN editorial staff, rather than shoot blanks here at WUWT.
I have no cause to alter a syllable of what I posted.
And there was no Bose-Einstein paper in 1924; there was a Bose paper, followed later by an Einstein paper that drew heavily on the former, without attribution.
As for aerosol-adjusted or any other “forcings”, these are childish gobbledegook, trying to simplify an overly complex chaotic system. The actual observationally measured real climate data, to the extent there is any, has not been explained by any of these ramblings that presume to blame a single minor component of atmospheric physics for changes in the climate. We don’t even have any credible climate data predating about 1980, due to the erroneous substitution of ocean water temperatures from uncontrolled depths for actual lower-atmospheric Temperatures. Not only are they quite different, but they also are not even correlated, so the error is uncorrectable.

John Day
May 30, 2013 4:06 am

e. smith
>As for aerosol adjusted or any other “forcings”, these are childish gobbledegook…
Then you should have posted that opinion/comment to Anthony and Nic, instead of posting your other off-topic remarks here in this WUWT article.

Updated climate sensitivity estimates using aerosol-adjusted forcings and various ocean heat uptake estimates
Posted on May 24, 2013 by Anthony Watts
Guest essay by Nic Lewis

> I don’t “tweet” what ever the hell that is.
Of course you don’t. You seem to be a dinosaur and apparently proud of it. I was speaking metaphorically:
http://en.wikipedia.org/wiki/Twitter#Format
> … there was no Bose-Einstein paper in 1924;
Then we could have avoided this whole, apparently useless, discussion if you had properly responded to my initial disclaimer:

Are you referring to Bose’s watershed paper “Plancks Gesetz und Lichtquantenhypothese”, Zeitschrift für Physik, 1924? (If you were referring to another Bose paper, please elaborate.)
If so, you’ve got the historical details completely wrong here.

You could have just said: “No John, the OPN letter was addressing another paper” and that would have been the end of it.
So now you owe _me_ a really big apology for _your_ lack of due diligence.
😐

george e. smith
May 30, 2013 6:55 pm

“””””…..John Day says:
May 30, 2013 at 4:06 am
So now you owe _me_ a really big apology for _your_ lack of due diligence…….””””””
Well, I have often said: “Ignorance is not a disease; we are all born with it; but stupidity has to be taught, and there are plenty willing and able to teach it. ”
If you check the record, you will see it was YOU who transferred the focus from Einstein’s johnny-come-lately paper to the earlier landmark Bose paper which inspired it.
So don’t come wailing due diligence to me. If you actually learned anything in school you wouldn’t have to google wiki to find out how to spell a publication you never ever heard of.
As for dinosaurs, they managed to survive for 140 million years just by being big, and mean, and ugly; whereas human “intelligence” is maybe 10% of the way through its first million years of survivability testing by Mother Gaia, and may not last longer than the next generation of time-wasting juvenile distraction toys lets it run.

John Day
May 31, 2013 6:16 am

e. smith
> If you check the record, you will see it was
> YOU who transferred the focus from
> Einstein’s johny-come-lately paper,
What johnny-come-lately paper? I still don’t know the identity of this so-called “paper”, which Einstein allegedly plagiarized from Bose, even though I very specifically asked you to provide a reference for it before any discussion took place. So how could _you_ “focus” on a paper whose identity has not been cited?
Do Dinosaurs even know what citation means?
http://en.wikipedia.org/wiki/Citation
Actually, I would still like to know which paper you were “focusing” on. Then we can have a nice long discussion on it, and its impact on “aerosol-adjusted forcings”. I’m sure Anthony and Nic would appreciate devoting even more bandwidth to that.
You’ll probably say “Look it up yourself”, but it’s funny that didn’t stop you from pontificating on it, a paper whose existence you still can’t provide a citation for.
😐

John Day
June 1, 2013 3:35 am

e. smith
> If you actually learned anything in school you
> wouldn’t have to google wiki to find out how
> to spell a publication you never ever heard of.
George, let me try to fill in another, very annoying gap in your knowledge. You may be surprised to learn that the word “wiki” was not coined by Jimmy Wales, and it is _not_ a general abbreviation for “Wikipedia”. There are tons of wikis around on the Internet. They were invented by Ward Cunningham in 1994, long before Wales launched Wikipedia in 2001:
http://en.wikipedia.org/wiki/Wiki
So, Wikipedia consists of many, many thousands of wikis (“articles”), which is why you see the term ‘wiki’ in every single Wikipedia URL reference. But there are tons of wikis on the Internet that don’t belong to Wikipedia.
So please stop using ‘wiki’ as an abbreviation for Wikipedia. (Unless you’re just trying to irritate me, then that’s OK)
> Well, I have often said: “Ignorance is not a disease;
> we are all born with it; but stupidity has to be taught,
> and there are plenty willing and able to teach it. ”
So who taught you to never use Wikipedia? That’s like saying you won’t use any textbook if it contains any factual errors or typos. By doing this you are deliberately depriving yourself of a rich source of knowledge. After using Wikipedia for a while, you’ll soon learn to judge the reliability of the articles, use the contained links to learn more, and navigate around the bogus information that you occasionally find in some Wikipedia wikis.
As I have often said: A person who teaches himself stupidity has a fool for a teacher.

John Day
June 2, 2013 4:36 am

e. smith
> If you actually learned anything in school you
> wouldn’t have to google wiki to find out how
> to spell a publication you never ever heard of.
So you believe that schools teach you everything you need to know for the rest of your life?
Most of what I know I taught myself by studying on my own, and asking questions. I think you should do more of that yourself.
As Bertrand Russell (1872-1970), the English philosopher, mathematician and writer famously said:
“Men are born ignorant, not stupid; they are made stupid by education. “

dbstealey
June 2, 2013 5:03 am

John Day says:
“Most of what I know I taught myself by studying on my own…”
And George Smith’s long professional career was in the electro-optics field, which trumps ‘studying on your own’. You really should listen to George, who has probably forgotten more than most folks will ever learn about the subject. He has been helping readers understand the subject for many years here. You showed up only in the past few months, that I know of.
When you need 3 posts to respond to one of George’s comments, you’re taking it way too personally. You could learn something by reading, instead of reacting. George Smith knows what he’s talking about WRT optics.

John Day
June 2, 2013 5:55 am

George hurled a few invectives at me, so I guess I got carried away. Sorry.
@dbstealey>And George Smith’s long professional career was in the electro-optics field…
He wouldn’t happen to be the Nobel Laureate who invented the CCD (or related to him)?
http://en.wikipedia.org/wiki/George_E._Smith

george e. smith
June 2, 2013 1:03 pm

“””””…..John Day says:
June 2, 2013 at 5:55 am
George hurled a few invectives at me, so I guess I got carried away. Sorry.
@dbstealey>And George Smith’s long professional career was in the electro-optics field…
He wouldn’t happen to be the Nobel Laureate who invented the CCD (or related to him)?
http://en.wikipedia.org/wiki/George_E._Smith…..”””””
The 2009 Nobel Physics Prize co-winner was a long-time Bell Labs researcher. For 30 years our careers intertwined, but I never met him. Often, at a technical conference, I would go to the registration desk, only to be told that I was already pre-registered and there were messages for me on the message board. Well, it was the Bell Labs CCD inventor. By some quirk of fate, the director of R&D at Beckman Instruments at that time was also a George E. Smith. Three of us with similar jobs and job functions.
There are folks, who know me, as well as the Nobel Laureate, and have for decades. The inventor of the first practical “visible” LED, knows both of us well. The inventor of the first practical LED ( GaAs infrared) was Bob Baird at Texas Instruments in the early 1960s. He is also still active in the industry, but he would NOT know me.
As for learning in school; the best we can hope for John, is that they teach us HOW to learn. The actual material they teach, is not that important, so long as we learn how to replace it with what we really want to know.
If you don’t already have it, a small, cheap ($11) book I would heartily endorse, for your reading enjoyment as well as reliable information, is George Gamow’s “Thirty Years that Shook Physics”, subtitled “The Story of Quantum Mechanics”. It’s extremely readable and informative.
I am fortunate to be able to chat with a PhD Physicist, who was a student of Gamow. He also is a top medical Doctor; not just any medical Doctor, but a world famous star of television news. Well he was back in the days of Mercury, Gemini, and Apollo, when astronauts needed their vitals monitored. As it so happens, he is currently studying Quantum Mechanics at Stanford University.
There is some, not very accurate bio of mine bobbing around on the web, circa 2000; but no, I am not the Nobellist.

June 2, 2013 3:23 pm

Those must have been interesting times for you, George, with three George E. Smiths working in the same field!
Let’s give our little discussion a rest. Sorry I gave you such a hard time. My only motive is to seek the truth. I like reading about the history of physics, because it gives us some insights into physics today.
Best regards,
John Day

dbstealey
June 2, 2013 3:31 pm

John Day,
It takes a stand-up guy to say “Sorry”. Much appreciated here.

george e. smith
June 3, 2013 11:49 am

“””””…..John Day says:
June 2, 2013 at 3:23 pm
Those must have been interesting times for you, George, with three George E. Smith’s working in the same field!
Let’s give our little discussion a rest. Sorry I gave you such a hard time. My only motive is to seek the truth. I like reading about the history of physics, because it gives us some insights into physics today.
Best regards,
John Day……””””
Never any hard feelings John; at my age my hide is well tanned like a good Texas saddle.
And I really seriously commend to you that George Gamow book; it’s a Dover paperback, available at any Barnes and Noble or Amazon. And although maybe not a very big name in early 20th-century physics, he was actually there, in the midst of all those biggies, while all that earth-shattering stuff was going down. You will get a lot of enjoyment, and good learning too, out of it.
Ciao.
George