By Christopher Monckton of Brenchley
The Christmas pantomime here in Paris is well into its two-week run. The Druids who had hoped that their gibbering incantations might begin to shorten the Pause during the United Necromancers’ pre-solstice prayer-group have been disappointed. Gaia has not heeded them. She continues to show no sign of the “fever” long promised by the Prophet Gore. The robust Pause continues to resist the gathering el Niño. It remains at last month’s record-setting 18 years 9 months (Fig. 1).
Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean surface temperature anomaly dataset continues to show no global warming for 18 years 9 months since March 1997, though one-third of all anthropogenic forcings have occurred during the period of the Pause.
The modelers ought to be surprised by the persistence of the Pause. NOAA, with rare honesty, said in its 2008 State of the Climate report that 15 years or more without warming would demonstrate a discrepancy between prediction and observation. One reason for NOAA’s statement is that there is supposed to be a sharp and significant instantaneous response to a radiative forcing such as adding CO2 to the air.
The steepness of this predicted response can be seen in Fig. 2, which is based on a 2009 paper on temperature feedbacks by Professor Gerard Roe, a former student of Professor Richard Lindzen. The graph of Roe’s model output shows that the initial expected response to a forcing is supposed to be an immediate and rapid warming. But, despite the very substantial forcings in the 18 years 9 months since March 1997, not a flicker of warming has resulted.
Figure 2. Models predict rapid initial warming in response to a forcing. Instead, no warming at all is occurring. Based on Roe (2009).
The current el Niño, as Bob Tisdale’s distinguished series of reports here demonstrates, is at least as big as the Great el Niño of 1998. The RSS temperature record is beginning to reflect its magnitude.
Figure 3. The glaring discrepancy between IPCC’s predicted range of warming from 1990-2015 (orange zone) and the outturn (blue zone).
The sheer length of the Pause has made a mockery of the exaggerated prediction made by IPCC in 1990 to the effect that there should have been 0.72 [0.50, 1.08] degrees’ global warming by now. The observed real-world warming since 1990, on all five leading global datasets, is 0.24-0.44 degrees, or one-third to three-fifths of IPCC’s central prediction and well below its least prediction (Fig. 3).
The Pause will probably shorten dramatically in the coming months and may disappear altogether for a time. However, if there is a following la Niña, as there often is, the Pause may return at some time from the end of next year onward.
The hiatus period of 18 years 9 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. The start date is not cherry-picked: it is calculated. And the graph does not mean there is no such thing as global warming. Going back further shows a small warming rate.
The start-date for the Pause has been inching forward, though just a little more slowly than the end-date, which is why the Pause continues on average to lengthen.
So long a stasis in global temperature is simply inconsistent with the extremist predictions of the computer models. It raises legitimate questions about whether the models overstate the radiative forcing to be expected from a given proportionate change in CO2 concentration.
The UAH dataset shows a Pause almost as long as the RSS dataset. However, the much-altered surface temperature datasets show a small warming rate (Fig. 4).
Figure 4. The least-squares linear-regression trend on the mean of the GISS, HadCRUT4 and NCDC terrestrial monthly global mean surface temperature anomaly datasets shows global warming at a rate equivalent to 1.1 C° per century during the period of the Pause from January 1997 to September 2015.
Bearing in mind that one-third of the 2.4 W/m² radiative forcing from all manmade sources since 1750 has occurred during the period of the Pause, a warming rate equivalent to little more than 1 C°/century is not exactly alarming.
As always, a note of caution. Merely because there has been little or no warming in recent decades, one may not draw the conclusion that warming has ended forever. The trend lines measure what has occurred: they do not predict what will occur.
The Pause – politically useful though it may be to all who wish that the “official” scientific community would remember its duty of skepticism – is far less important than the growing discrepancy between the predictions of the general-circulation models and observed reality.
The divergence between the models’ predictions in 1990 and the observed outturn continues to widen. If the Pause lengthens just a little more, the rate of warming in the quarter-century since the IPCC’s First Assessment Report in 1990, taken as the mean of the RSS and UAH data, will fall below 1 C°/century equivalent (Fig. 5).
Figure 5: The mean of the RSS and UAH satellite data for the 311 months January 1990 to November 2015. The warming rate is equivalent to just 1.04 C° per century.
Roy Spencer, at drroyspencer.com, says 2015 will probably be the third-warmest year in the satellite record since 1979 on his UAH dataset, but thinks it likely that, since the second year of an el Niño is usually warmer than the first, 2016 may prove to be the warmest year in the satellite record, beating 1998 by 0.02-0.03 degrees.
The Technical Note explains the sources of the IPCC’s predictions in 1990 and in 2005, and also demonstrates that, according to the ARGO bathythermograph data, the oceans are warming at a rate equivalent to less than a quarter of a Celsius degree per century. In a rational scientific discourse, those who had advocated extreme measures to prevent global warming would now be withdrawing and calmly rethinking their hypotheses.
Key facts about global temperature
• The RSS satellite dataset shows no global warming at all for 225 months from March 1997 to November 2015 – more than half the 443-month RSS record.
• There has been no warming even though one-third of all anthropogenic forcings since 1750 have occurred since the Pause began in March 1997.
• The entire UAH dataset for the 444 months December 1978 to November 2015 shows global warming at an unalarming rate equivalent to just 1.14 Cº per century.
• Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to below 1.2 Cº per century.
• The global warming trend since 1900 is equivalent to 0.75 Cº per century. This is well within natural variability and may not have much to do with us.
• The fastest warming rate lasting 15 years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.
• Compare the warming on the Central England temperature dataset in the 40 years 1694-1733, well before the Industrial Revolution, equivalent to 4.33 C°/century.
• In 1990, the IPCC’s mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.
• The warming trend since 1990, when the IPCC wrote its first report, is equivalent to 1 Cº per century. The IPCC had predicted close to thrice as much.
• To meet the IPCC’s central prediction of 1 C° warming from 1990-2025, in the next decade a warming of 0.75 C°, equivalent to 7.5 C°/century, would have to occur.
• Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual centennial warming prediction of 4.8 Cº warming to 2100.
• The IPCC’s predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than 15 years that has been measured since 1950.
• The IPCC’s 4.8 Cº-by-2100 prediction is four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.
• The oceans, according to the 3600+ ARGO buoys, are warming at a rate of just 0.02 Cº per decade, equivalent to 0.23 Cº per century, or 1 C° in 430 years.
• Recent extreme-weather events cannot be blamed on global warming, because there has not been any global warming to speak of. It is as simple as that.
Technical note
Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not “cherry-picked” so as to coincide with the temperature spike caused by the 1998 el Niño. Instead, it is calculated so as to find the longest period with a zero trend.
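For readers who wish to verify that the start-date is calculated rather than chosen, the search is only a few lines of code. Here is a minimal Python sketch, assuming the RSS monthly anomalies have been saved to a local text file with year, month and anomaly columns (the filename and column layout are hypothetical, not RSS’s own):

    import numpy as np

    # Hypothetical file: one row per month, columns year, month, anomaly.
    data = np.loadtxt("rss_monthly_anomalies.txt")
    anomaly = data[:, 2]
    years = np.arange(len(anomaly)) / 12.0   # elapsed time in years

    # Walk forward from the start of the record: the first start-month from
    # which the least-squares trend to the present is zero or negative marks
    # the beginning of the longest zero-trend period.
    for start in range(len(anomaly) - 24):
        slope, intercept = np.polyfit(years[start:], anomaly[start:], 1)
        if slope <= 0.0:
            print("Pause begins at month index", start,
                  "| trend:", round(100.0 * slope, 3), "C/century")
            break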
The fact of a long Pause is an indication of the widening discrepancy between prediction and reality in the temperature record.
The satellite datasets are arguably less unreliable than other datasets in that they show the 1998 Great El Niño more clearly than all other datasets. The Great el Niño, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing an independent verification that the satellite datasets are better able than the rest to capture such fluctuations without artificially filtering them out.
Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates below those that are published. The satellite datasets are based on reference measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking, via spaceward mirrors, the known temperature of the cosmic background radiation: 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that NASA’s Wilkinson Microwave Anisotropy Probe determined the age of the Universe: 13.82 billion years.
The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website. A computer algorithm reads them from the text file and plots them automatically, using a routine that adjusts the aspect ratio of the data window on both axes so as to show the data at maximum scale, for clarity.
The latest monthly data point is visually inspected to ensure that it has been correctly positioned. The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line.
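The trend-fitting and plotting steps themselves are elementary. A self-contained sketch under the same hypothetical file layout as above (matplotlib’s automatic axis scaling stands in for the aspect-ratio routine described; a slope in Cº/year multiplied by 100 gives the Cº/century equivalents quoted throughout these reports):

    import numpy as np
    import matplotlib.pyplot as plt

    data = np.loadtxt("rss_monthly_anomalies.txt")   # hypothetical layout as above
    anomaly = data[:, 2]
    years = np.arange(len(anomaly)) / 12.0

    # Least-squares linear regression gives the slope and y-intercept.
    slope, intercept = np.polyfit(years, anomaly, 1)

    plt.plot(years, anomaly, color="darkblue", linewidth=0.8)      # monthly data
    plt.plot(years, intercept + slope * years, color="lightblue")  # trend line
    plt.autoscale(tight=True)   # fit the data window tightly to the data
    plt.xlabel("Years since start of record")
    plt.ylabel("Temperature anomaly (C)")
    plt.title(f"Trend: {100.0 * slope:.2f} C/century equivalent")
    plt.show()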
The IPCC and most other agencies use linear regression to determine global temperature trends. Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails. The method is appropriate because global temperature records exhibit little autocorrelation, since summer temperatures in one hemisphere are largely offset by winter temperatures in the other. Therefore, an AR(n) model would generate results little different from a least-squares trend.
Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat.
RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it at remss.com/blog/recent-slowing-rise-global-temperatures.
Dr Mears’ results are summarized in Fig. T1:
Figure T1. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of El Chichón (1982) and Pinatubo (1991) are shown, as is the spike in warming caused by the great el Niño of 1998.
Dr Mears writes:
“The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation. This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.”
Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph:
“Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades. Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site. Is this really your data?’ While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate. … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.”
In fact, the spike in temperatures caused by the Great el Niño of 1998 is almost entirely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 el Niño, and the sheer length of the Great Pause itself. The headline graph in these monthly reports begins in 1997 because that is as far back as one can go in the data and still obtain a zero trend.
Fig. T1a. Graphs for RSS and GISS temperatures starting both in 1997 and in 2001. For each dataset the trend-lines are near-identical, showing conclusively that the argument that the Pause was caused by the 1998 el Niño is false. (Werner Brozek and Professor Brown worked out this neat demonstration.)
Curiously, Dr Mears prefers the terrestrial datasets to the satellite datasets. The UK Met Office, however, uses the satellite data to calibrate its own terrestrial record.
The length of the Pause, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed.
Sources of the IPCC projections in Figs. 2 and 3
IPCC’s First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century. The executive summary asked, “How much confidence do we have in our predictions?” IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded:
“Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change. … There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results.”
That “substantial confidence” was substantial over-confidence. For the rate of global warming since 1990 – the most important of the “broad-scale features of climate change” that the models were supposed to predict – is now below half what the IPCC had then predicted.
In 1990, the IPCC said this:
“Based on current models we predict:
“under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, a rate of increase of global mean temperature during the next century of about 0.3 Cº per decade (with an uncertainty range of 0.2 Cº to 0.5 Cº per decade), this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1 Cº above the present value by 2025 and 3 Cº before the end of the next century. The rise will not be steady because of the influence of other factors” (p. xii).
Later, the IPCC said:
“The numbers given below are based on high-resolution models, scaled to be consistent with our best estimate of global mean warming of 1.8 Cº by 2030. For values consistent with other estimates of global temperature rise, the numbers below should be reduced by 30% for the low estimate or increased by 50% for the high estimate” (p. xxiv).
The orange region in Fig. 3 represents the IPCC’s medium-term Scenario-A estimate of near-term warming, i.e. 1.0 [0.7, 1.5] K by 2025.
The IPCC’s predicted global warming over the 25 years from 1990 to the present differs little from a straight line (Fig. T2).
Figure T2. Historical warming from 1850-1990, and predicted warming from 1990-2100 on the IPCC’s “business-as-usual” Scenario A (IPCC, 1990, p. xxii).
Because the difference between the straight line and the slight uptick in the warming rate that the IPCC predicted for 1990-2025 is so small, one can look at it another way: to reach the 1 K central estimate of warming from 1990 by 2025, there would have to be twice as much warming in the next ten years as there was in the past 25 years. That is not likely.
But is the Pause perhaps caused by the fact that CO2 emissions have not been rising anything like as fast as the IPCC’s “business-as-usual” Scenario A prediction in 1990? No: CO2 emissions have risen rather above the Scenario-A prediction (Fig. T3).
Figure T3. CO2 emissions from fossil fuels, etc., in 2012, from Le Quéré et al. (2014), plotted against the chart of “man-made carbon dioxide emissions”, in billions of tonnes of carbon per year, from IPCC (1990).
Plainly, therefore, CO2 emissions since 1990 have proven to be closer to Scenario A than to any other case, because for all the talk about CO2 emissions reduction the fact is that the rate of expansion of fossil-fuel burning in China, India, Indonesia, Brazil, etc., far outstrips the paltry reductions we have achieved in the West to date.
True, methane concentration has not risen as predicted in 1990 (Fig. T4), for methane emissions, though largely uncontrolled, are simply not rising as the models had predicted. Here, too, all of the predictions were extravagantly baseless.
The overall picture is clear. Scenario A is the emissions scenario from 1990 that is closest to the observed CO2 emissions outturn.
Figure T4. Methane concentration as predicted in four IPCC Assessment Reports, together with (in black) the observed outturn, which is running along the bottom of the least prediction. This graph appeared in the pre-final draft of IPCC (2013), but had mysteriously been deleted from the final, published version, inferentially because the IPCC did not want to display such a plain comparison between absurdly exaggerated predictions and unexciting reality.
To be precise, a quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS and UAH monthly global mean surface temperature anomalies – is 0.27 Cº, equivalent to little more than 1 Cº/century. The IPCC’s central estimate of 0.72 Cº, equivalent to 2.8 Cº/century, predicted with “substantial confidence” for Scenario A in IPCC (1990), is approaching three times too big. In fact, the outturn is visibly well below even the least estimate.
In 1990, the IPCC’s central prediction of the near-term warming rate was higher by two-thirds than its prediction is today. Then it was equivalent to 2.8 Cº/century. Now it is just 1.7 Cº/century equivalent – and, as Fig. T5 shows, even that is proving to be a substantial exaggeration.
Is the ocean warming?
One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean. Since globally the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the “missing heat” has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date.
Actually, it is not known whether the ocean is warming: each of the 3600 automated ARGO bathythermograph buoys takes just three measurements a month in 200,000 cubic kilometres of ocean – roughly a 100,000-square-kilometre box more than 316 km square and 2 km deep. Plainly, results at a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of taking a single temperature and salinity profile at a single point in Lake Superior less than once a year) are not going to be a lot better than guesswork.
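The arithmetic behind those numbers is easy to check. A quick sketch, taking a round 362 million km² for the ocean surface (an assumption, since the paragraph quotes only the results):

    # Rough check of the ARGO sampling-volume claim.
    ocean_area_km2 = 3.62e8   # ~362 million km^2 of ocean surface (assumed)
    depth_km = 2.0            # ARGO floats profile the top ~2000 m
    buoys = 3600

    volume_per_buoy_km3 = ocean_area_km2 * depth_km / buoys
    print(volume_per_buoy_km3)            # ~201,000 km^3 per buoy

    box_side_km = (volume_per_buoy_km3 / depth_km) ** 0.5
    print(box_side_km)                    # ~317 km: a box ~316 km square, 2 km deep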
Results for the 11 full years of ARGO data are plotted in Fig. T5. The ocean warming, if ARGO is right, is just 0.02 Cº/decade, equivalent to 0.2 Cº/century.
Figure T5. The entire near-global ARGO 2 km ocean temperature dataset from January 2004 to December 2014 (black spline-curve), with the least-squares linear-regression trend calculated from the data by the author (green arrow).
Finally, though the ARGO buoys measure ocean temperature change directly, before publication NOAA craftily converts the temperature change into zettajoules of ocean heat content change, a conversion that makes the change seem a whole lot larger.
The terrifying-sounding heat content change of 260 ZJ from 1970 to 2014 (Fig. T6) is equivalent to just 0.2 K/century of global warming. All those “Hiroshima bombs of heat” of which the climate-extremist websites speak are a barely discernible pinprick. The ocean and its heat capacity are a lot bigger than some may realize.
Figure T6. Ocean heat content change, 1957-2013, in Zettajoules from NOAA’s NODC Ocean Climate Lab: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT, with the heat content values converted back to the ocean temperature changes in Kelvin that were originally measured. NOAA’s conversion of the minuscule warming data to Zettajoules, combined with the exaggerated vertical aspect of the graph, has the effect of making a very small change in ocean temperature seem considerably more significant than it is.
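The conversion back from heat content to temperature is a one-line calculation. A sketch, assuming as round numbers (not taken from NOAA) that the 260 ZJ resides in a 0-2000 m layer of mass about 7 × 10^20 kg with a specific heat of about 4000 J/(kg·K):

    # Convert an ocean heat-content change in zettajoules to a temperature change.
    heat_j = 260.0 * 1e21        # 260 ZJ over 1970-2014 (1 ZJ = 10^21 J)
    layer_mass_kg = 7e20         # ~0-2000 m ocean layer (assumed round number)
    c_seawater = 4000.0          # J/(kg K), approximate specific heat

    delta_t = heat_j / (layer_mass_kg * c_seawater)
    print(delta_t)               # ~0.09 K over the 44 years
    print(100.0 * delta_t / 44)  # ~0.2 K/century equivalent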
Converting the ocean heat content change back to temperature change reveals an interesting discrepancy between NOAA’s data and that of the ARGO system. Over the period of ARGO data, from 2004-2014, the NOAA data imply that the oceans are warming at 0.05 Cº/decade, equivalent to 0.5 Cº/century, or rather more than double the rate shown by ARGO.
ARGO has the better-resolved dataset, but since the resolutions of all ocean datasets are very low one should treat all these results with caution.
What one can say is that, on such evidence as these datasets are capable of providing, the difference between the underlying warming rate of the ocean and that of the atmosphere is not statistically significant, suggesting that if the “missing heat” is hiding in the oceans it has magically found its way into the abyssal strata without managing to warm the upper strata on the way.
On these data, too, there is no evidence of rapid or catastrophic ocean warming.
Furthermore, to date no empirical, theoretical or numerical method, complex or simple, has yet successfully specified mechanistically either how the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere has reached the deep ocean without much altering the heat content of the intervening near-surface strata or how the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions relevant to land-based life on Earth.
Figure T7. Near-global ocean temperatures by stratum, 0-1900 m, providing a visual reality check to show just how little the upper strata are affected by minor changes in global air surface temperature. Source: ARGO marine atlas.
Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean.
Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes. It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere.
If the “deep heat” explanation for the Pause were correct (and it is merely one among dozens that have been offered), the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has.
Why were the models’ predictions exaggerated?
In 1990 the IPCC predicted – on its business-as-usual Scenario A – that from the Industrial Revolution till the present there would have been 4 Watts per square meter of radiative forcing caused by Man (Fig. T8):
Figure T8. Predicted manmade radiative forcings (IPCC, 1990).
However, from 1995 onward the IPCC decided to assume, on rather slender evidence, that anthropogenic particulate aerosols – mostly soot from combustion – were shading the Earth from the Sun to a large enough extent to cause a strong negative forcing. It has also now belatedly realized that its projected increases in methane concentration were wild exaggerations. As a result of these and other changes, it now estimates that the net anthropogenic forcing of the industrial era is just 2.3 Watts per square meter, or little more than half its prediction in 1990 (Fig. T9):
Figure T9: Net anthropogenic forcings, 1750 to 1950, 1980 and 2012 (IPCC, 2013).
Even this, however, may be a considerable exaggeration. For the best estimate of the actual current top-of-atmosphere radiative imbalance (total natural and anthropogenic net forcing) is only 0.6 Watts per square meter (Fig. T10):
Figure T10. Energy budget diagram for the Earth from Stephens et al. (2012)
In short, most of the forcing predicted by the IPCC is either an exaggeration or has already resulted in whatever temperature change it was going to cause. There is little global warming in the pipeline as a result of our past and present sins of emission.
It is also possible that the IPCC and the models have relentlessly exaggerated climate sensitivity. One recent paper on this question is Monckton of Brenchley et al. (2015), which found climate sensitivity to be in the region of 1 Cº per CO2 doubling (go to scibull.com and click “Most Read Articles”). The paper identified errors in the models’ treatment of temperature feedbacks and their amplification, which account for two-thirds of the equilibrium warming predicted by the IPCC.
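The feedback amplification in question takes the standard zero-dimensional form (the notation here follows common usage, not necessarily the paper’s own):

    \Delta T_{eq} = \frac{\lambda_0 \, \Delta F}{1 - \lambda_0 \, f}

where ΔF is the radiative forcing (about 3.7 W/m² per CO2 doubling), λ0 ≈ 0.31 K·W⁻¹·m² is the no-feedback (Planck) sensitivity parameter, and f is the feedback sum in W·m⁻²·K⁻¹. With f ≈ 2 the equation yields roughly 3 K per doubling, near the IPCC’s central estimate; with f at or below zero it yields about 1.1 K or less, the region the paper argues for.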
Professor Ray Bates gave a paper in Moscow in summer 2015 in which he concluded, based on the analysis by Lindzen & Choi (2009, 2011), that temperature feedbacks are net-negative. Accordingly, he supports the conclusion both of Lindzen & Choi (2009) (Fig. T11) and of Spencer & Braswell (2010, 2011) that climate sensitivity is below – and perhaps considerably below – 1 Cº per CO2 doubling.
Figure T11. Reality (center) vs. 11 models. From Lindzen & Choi (2009).
A growing body of reviewed papers finds climate sensitivity considerably below the 3 [1.5, 4.5] Cº per CO2 doubling first put forward in the Charney Report of 1979 for the U.S. National Academy of Sciences. On the evidence to date, therefore, there is no scientific basis for taking any action at all to mitigate CO2 emissions.
It is interesting to see how the warming rate, expressed as degrees per century equivalent, has changed since 1950 (Fig. T12).
Figure T12. Changes in the global warming rate, 1950-2005.
Finally, how long will it be before the Freedom Clock (Fig. T13) reaches 20 years without any global warming? If it does, the climate scare will become unsustainable.
Figure T13. The Freedom Clock edges ever closer to 20 years without global warming
Anyone who understands the absorption pattern of CO2 should have known that was incorrect. The IR absorption of CO2 is logarithmic. The vast majority of the “forcing” occurs at CO2 levels way under the 400 PPM we are at today.
CO2=life
The Y axis on the graph shows net downward/back radiation equal to the entire 240 +/- W/m^2 of outgoing radiation at ToA, and then some. I find that a bit difficult to accept. Or is the net downward/back radiation of CO2 only 235 W/m^2 at 0 CO2, rising to 258 at 380 PPM CO2, a 23 W/m^2 difference? Do you have a link that explains this graph? Thanks,
Yes, that’s counterintuitive, but it turns out to be true.
A toy mental model that’s simple in concept but hard to run completely in your head shows that even at equilibrium the surface can radiate (or the atmosphere can back-radiate) more power than is received from or escapes to space.
What you have to do is follow each of a number of photons as it is emitted from the surface and have it encounter some arbitrary number of layers between the surface and space, at each of which it has some probability of passing through to the next layer and a complementary probability of being captured, in which case it has a 50% chance of being re-emitted downward and a 50% chance of being re-emitted upward. (This is a one-dimensional model, so unlike real life there’s no sideways emission, but the one-dimensional model is adequate for conceptual purposes.)
As I said, it’s hard to keep track of very many particles in your head, but you can see the effect if you write a computer program for that purpose.
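For anyone who wants to try it, here is a minimal Python sketch of such a program (the layer count and capture probability are arbitrary illustrative choices, not physical values):

    import random

    LAYERS = 10         # number of absorbing layers (arbitrary)
    P_CAPTURE = 0.5     # chance each layer captures a passing photon (arbitrary)
    N = 100_000         # photons to follow

    def one_photon():
        """Follow one photon up from the surface; True if it escapes to space."""
        layer, going_up = 1, True
        while 1 <= layer <= LAYERS:
            if random.random() < P_CAPTURE:
                going_up = random.random() < 0.5   # re-emitted up or down, 50/50
            layer += 1 if going_up else -1
        return layer > LAYERS

    escaped = sum(one_photon() for _ in range(N))
    f_back = 1.0 - escaped / N   # fraction of surface emission returned to surface

    # At equilibrium the surface must emit S / (1 - f_back) to push a net flux S
    # out to space, i.e. it radiates more than it receives from the sun alone.
    print("fraction returned:", round(f_back, 3))
    print("surface emission / solar input:", round(1.0 / (1.0 - f_back), 2))

With these illustrative settings the emission-to-input ratio comes out well above 1, which is the counterintuitive point at issue.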
Just go to the U of Chicago Modtran and enter in various levels of CO2. Doubling CO2 from 400 PPM means very little.
http://climatemodels.uchicago.edu/modtran/
Here is the actual calculation. Double CO2 from 50 to 100, 100 to 200, 200 to 400 and 400 to 800.
http://climatemodels.uchicago.edu/modtran/
Change that 2.94 to 3.7, which would require changing the 233.6 to 227.1 to keep net downward unchanged at 380 PPMV CO2. Even Dr. Roy Spencer goes along with 3.7. The IR spectrum of CO2 has details that are finer than the resolution of MODTRAN. This change would show the direct effect of a change of CO2 being 26% more than is shown in the graph above, although still logarithmic.
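For reference, the simplified expression usually quoted for this logarithmic dependence is ΔF = 5.35 ln(C/C0) W/m² (Myhre et al. 1998) – a curve fit to detailed radiative-transfer results, not MODTRAN output itself. A quick sketch:

    import math

    def co2_forcing(c_new, c_old):
        """Simplified CO2 radiative forcing (Myhre et al. 1998), in W/m^2."""
        return 5.35 * math.log(c_new / c_old)

    for c0, c1 in [(50, 100), (100, 200), (200, 400), (400, 800)]:
        print(f"{c0} -> {c1} ppm: {co2_forcing(c1, c0):.2f} W/m^2")
    # Each doubling yields the same ~3.71 W/m^2, so going from 400 to 800 ppm
    # adds no more forcing than going from 50 to 100 ppm did.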
Thanks, Christopher, Lord Monckton.
You show an abundance of evidence.
And according to the MEI, at
http://www.esrl.noaa.gov/psd/enso/mei/ts.gif
this El Niño seems to have peaked last month.
Peaked at a level weaker than the El Niño of 1997-1998 and also the one of 1982-1983. It looks like this El Niño is not contributing as much to global temperature as the 1997-1998 one did.
Andres Valencia will notice from previous el Niños that there is usually – but not always – a double peak, of which the first tends to occur before the winter and the second after it. So we must be cautious. The most likely course is a few more months of above-average ocean surface temperatures in the Niño 3.4 region of the tropical eastern Pacific, so that there will be quite a sharp upturn in global atmospheric temperatures yet to come.
However, Bob Tisdale’s admirable post on the current state of the el Niño shows a cold pool in the tropical western Pacific. If that follows the warm pool across the Pacific, there will be a la Niña behind the el Niño, which could largely cancel the warming caused by the el Niño. But, given that one would expect some global warming as a result of Man’s influence on climate, there will probably be a net increase in global temperature after the el Niño/la Niña cycle. That, at any rate, would be the cautious view.
On any view, though, it is now clear that the rate of global warming is very substantially less than the rates predicted by the IPCC and the models. That is very unlikely to change in the longer term. Expect the divergence between exaggerated prediction and moderate, harmless warming to continue.
Joe Born December 5, 2015 at 1:06 pm
“Yes, that’s counterintuitive, but it turns out to be true,”
and a violation of thermodynamic laws. Energy in and energy out cannot be equal. Energy out must be less than in, depending on the work function, internal losses, knocking electrons out of orbit, oscillations, heating, etc. And 2 W/m^2 (W = power, not energy) aren’t squat in the overall picture.
“…in which case it has a 50% chance of being re-emitted downward and a 50% chance of being re-emitted upward.”
Perhaps, but at a reduced energy level. The LWIR photon that impacts a CO2/GHG molecule cannot re-emit at the same energy level, per Einstein’s Nobel-winning photo-electric effect. The leaving photon might be in the microwave energy band, good for warming water a little and not much else. And evaporating water will suck up that energy w/o any increase in temperature.
As I requested above, a link or few?
Well, I tried to help you out, but if you think that over time energy out doesn’t equal energy in, then explaining this all to you will take much more time than I have.
I’m not trying to be snide here, but it would take a face-to-face to get the idea through to you if your understanding of thermodynamics is no better than what you seem to be indicating; this communications mode is not well suited to that task.
As I pointed out above I had to demonstrate competence in thermo to earn my BSME and have applied that knowledge in real life applications for 35 years. Maybe the reason you can’t explain it is what you lack.
Have any links to help clarify?
BTW, back to my original quest: can you close the balance on Schneider’s power flux graphic? There are many similar graphic images that don’t seem to have a closure problem.
Absorption and emission of radiation by atoms/molecules easily occurs at the same wavelength (google “resonance line” wavelength), although the absorbed quantum of energy may be converted, in whole or in part, into kinetic energy of movement or vibration (in the case of molecules), and re-emission is sometimes lacking, or at a longer wavelength, or in multiple photons at longer wavelengths. Atomic and molecular spectral emission and absorption of radiation have nothing to do with the photoelectric effect. Also, thermal radiation by an atom or molecule can occur at any wavelength in the thermal emission spectrum of the material in question, to an extent limited by the Planck curve for the temperature in question, regardless of the source – even absorption of photons whose wavelengths are the same or longer.
Is it smart to mock one prayer-group while being a member of another prayer group? Not so sure on that one…
I think Paris is doing a great job in fixing climate warming. Here in the Western Australia capital of Perth, yesterday’s maximum of 19.0C was the coldest December day ever recorded at the current BoM site in Mt Lawley for Perth Metro 9225. Looking back through the 2,944 December days with max temp recordings at the Perth Regional Office 9034 station in West Perth/East Perth since 1897, it was Perth’s equal 12th coldest December day in 118 years. Unless Perth warms substantially in the next couple of hours, today looks like being the equal 16th coldest December day recorded at 9225 since opening in 1991.
At the ACORN station of Perth Airport 9021 but with raw temps, yesterday’s 18.9C was the eighth coldest December day since the station opened in 1944. If you prefer to use the adjusted daily ACORN temps, yesterday was the fifth coldest December day in Perth since 1910.
This is despite being near the peak of a Godzilla El Nino. It might be summer in Perth but I’m planning on buying a jumper before Paris winds up because the experts are obviously solving the AGW crisis. Or am I getting confused and was yesterday just another “extreme” caused by CO2?
/sarc … I only bothered writing above because if yesterday was the hottest December day in 54 years for Perth it would get media headlines, but being the coldest in 54 years gets no attention so the fact should at least be broadcast somewhere.
Joe Born et al.
I have another global balance diagram, by Trenberth et al. 2011, that closes. It shows eight values for each of the state points, apparently eight different studies, models or opinions as to the values. So much for consensus. Of course some of the uncertainties are multiples of the 2 W/m^2 RF of CO2. And there is a 333 +/- 8.5 W/m^2 continuous (GHG?) loop between earth and sky, i.e. the lower troposphere, LT.
The first law of thermo says that heat/energy flows from a high energy source to a lower energy sink and can’t flow the other way without help. The second law says that no system can output work equal to the energy input. Any energy system left to its own devices, i.e. w/o some sustaining external source, will eventually grind to halt. Entropy is about heat and energy, not order/disorder no matter what the charlatans say.
Consider a refrigerator. The inside of the refrigerator is the cold sink (LT), the surroundings are the warm source (earth surface). If the plug is pulled heat will flow through the walls of the refrigerator and the inside temperature will eventually equal the surroundings. In order to remove heat from the inside and cool it off the electric compressor has to add energy to the system. The Freon is compressed, heat of compression removed, the Freon evaporates, removing heat from the inside.
So heat flows from the earth at 55 F to the LT at -40 F. That’s how all of the heat moves from surface and atmos to ToA. It’s not coming back without some magical energy source. This 333 +/- 8.5 W/m^2 appears out of nothing at the surface or out of nothing in the LT. This GHG loop is nothing but your basic charlatan’s perpetual motion energy loop and a gross violation of the second law of thermo. Of course removing it from the diagram changes nothing.
So you or one of your buddies need to ‘splain how this works, whether the magical heat source is on the surface or in the LT. I’ll be more than happy to kick their butts, too.
Similar to Freon, water’s latent heat of evaporation and condensation move large amounts of heat at constant temperature. It’s water vapor that runs the greenhouse, not CO2. Water vapor is what makes the earth different from Venus and Mars. BTW none of that is news.
Much of this smacks of “sky dragon slayers”. The laws of thermodynamics only require net flow of heat unassisted by external work to be downhill in potential. The laws don’t prohibit a recirculation loop between the surface and the top of the atmosphere, as long as for the net effect the surface is losing heat directly to outer space and into this loop, and the top-of-atmosphere end of this loop is losing heat to outer space. If the flow is 70% outward and 30% backward, the net outward flow is sufficient to satisfy the laws of thermodynamics. For one thing, photons of longwave thermal infrared don’t carry tags stating the temperature of their source, and cannot be selectively absorbed or rejected on basis of temperature of their source.
What an increase of greenhouse gases does is increase impedance against heat loss from the surface to outer space. It’s somewhat of a longwave IR thermal radiation version of a blanket. A blanket is cooler than your body, but reduces the loss of heat from your body to the air in your bedroom.
Donald L. Klipstein December 6, 2015 at 1:21 pm
“…by external work to be downhill in potential.”
By definition the “back radiation” of GHE/GHG is uphill in potential.
“…but reduces the loss of heat from your body to the air in your bedroom.”
It doesn’t reduce the loss of heat, only the rate of that heat loss; your body can then back down its internal furnace. So you will freeze in 16 hours instead of 8.
“If the flow is 70% outward and 30% backward, the net outward flow is sufficient to satisfy the laws of thermodynamics.”
30% is reflected, 70% absorbed, and 70% exits ToA, i.e. balanced. There is no 30% left to power/backfeed the loop especially from a cold state to a warmer state. Legal satisfaction wasn’t in my engr school thermo book.
“The laws don’t prohibit a recirculation loop between the surface and the top of the atmosphere…”
Yeah they do, unless there is some magical additional energy left over to compensate for the internal losses and these balances have none. The 70% can’t both repower the loop and exit ToA.
The blanket analogy is more simple-minded BS. Like so much else, it assumes everything stays the same, which in real life it doesn’t. Put on a heavy jacket while chopping wood and I’m going to get hot and sweat. There’s that water-cycle thermostat altering the heat flux. If you blanket/insulate your house and don’t cut back the furnace (everything stays the same), the house is going to get hot. The wall-mounted thermostat (water cycle) reduces the furnace heat flow to the set point.
Spent Thanksgiving week in Phoenix visiting our son and his wife. They are expecting their first child, a girl, in May and our first grandchild. Drove by way of Raton pass. We prefer Raton to Wolf Creek especially this time of year. They drive a Prius with regenerative braking (GHE loop). There is a graphic display on the dashboard that shows whether power is going into or coming out of the battery pack (GHG loop) as we drive around town. So let’s put that Prius on a flatbed wrecker, drive it to the top of Raton pass (lower troposphere) and assume the first trip is free.
Now we can coast south to Raton or north to Trinidad. Let’s coast to Trinidad charging the batteries as we go (333 W/m^2 of downwelling back radiation). Now turn around and head back up the pass (333 W/m^2 of upwelling LWIR surface radiation). Pretty obvious that because of internal/external/entropy losses we aren’t going to make it, say maybe 90%. Coast back down and try again. This time we make it 80% of the way. Ten cycles and we’re totally done. The only way we can make it back up the pass is to fire up that auxiliary gasoline engine and replace the lost energy.
That’s what bothers me about these global heat (power flux) balances and the GHE loop. There is no auxiliary engine to re-power the GHE loop and as such this loop is, in the words of Click & Clack, BOOOOGGGGGUUUUS!!!!!!
And furthermore:
“The laws don’t prohibit a recirculation loop between the surface and the top of the atmosphere…”
This statement absolutely violates the 2nd law. If it were true we could mount wind powered generators on the roof of the car and charge the batteries – for free!! No extra gas required.
Detroit hasn’t done this. Guess why.
The 2nd law only restricts net flows. For example, suppose you have two blackbody plates near each other, one held at 1000 K and one held at 950 K. They will be exchanging photons back and forth. More photons will travel from the hotter one to the cooler one than from the cooler one to the hotter one (5.67 vs. 4.62 W/cm^2), but photons will go both ways. The net radiant heat flow will be 1.05 W/cm^2 from the hotter surface to the cooler one.
If the cooler surface is removed, then the hotter surface will lose more heat. The part of the hotter surface that had the cooler one nearby will lose 5.67 W/cm^2 instead of 1.05 W/cm^2. Similarly, an atmosphere with greenhouse gases will slow heat loss from the surface even if the atmosphere is cooler than the surface, because that atmosphere is warmer than the alternative – the ~3 K effective edge of the universe.
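Those plate numbers follow directly from the Stefan-Boltzmann law; a quick check (not part of Dr Klipstein’s comment):

    SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)

    def emission_w_cm2(temp_k):
        """Blackbody emission in W/cm^2 at the given temperature (kelvin)."""
        return SIGMA * temp_k ** 4 / 1e4    # 1 m^2 = 10^4 cm^2

    hot = emission_w_cm2(1000.0)     # ~5.67 W/cm^2
    cold = emission_w_cm2(950.0)     # ~4.62 W/cm^2
    print(hot, cold, hot - cold)     # net exchange ~1.05 W/cm^2, hot plate to cold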
Donald L. Klipstein
“The net radiant heat flow will be 1.05 W/cm^2 from the hotter surface to the cooler one.” Check. OK w/ 1st law, net energy/heat flows only from hot to cold.
“…atmosphere with greenhouse gases will slow heat loss.” Check. The hypothetical blanket slows the rate of heat loss.
340 +/- W/m^2 hits ToA (100%); 102.0 +/- W/m^2 (30%) are reflected; 238 +/- W/m^2 (70%) are absorbed and must upwell radiation through an atmospheric downhill energy potential powered by a delta T back to ToA, aka maintaining the great balance.
Along comes the CO2 GHE blanket and traps 2 +/- W/m^2 (0.6%) in the perpetual GHE loop. Only 236 W/m^2 (69.4%) now leaves ToA. In order to recover the 238 +/- W/m^2 (70%) ToA the upwelling downhill potential must increase and the delta T must increase. If you blanket/insulate your house and maintain the furnace output the inside temperature must increase. Well, duh!
There is another way.
2 W/m^2 is 6.82 Btu/h per m^2. Evaporating water into dry air (check moist-air psychrometric properties) absorbs 1,000 +/- Btu/lb of water at a constant temperature. So an almost negligible/immeasurable/undetectable increase in ocean evaporation, resulting cloud cover and reflecting albedo can absorb the 2 +/- W/m^2 of CO2 RF, reflect it back through ToA and without an increase in temperature.
340 +/- W/m^2 hits ToA; 104.0 +/- W/m^2 (30.6%) are reflected; 236.0 +/- W/m^2 (69.4%) are absorbed and upwell radiation through a smaller atmospheric downhill energy potential powered by an unchanged delta T back to ToA.
Presto, more CO2 and yet no increased delta T, e.g. the pause/stasis/lull/hiatus.
And I successfully ‘splained it without billions of dollars in computer hardware & software and high powered manpower that doesn’t.
This is an encore presentation.
First off a discussion of units.
A watt is a metric unit of power, energy over time, not energy per se. The metric energy unit is the joule, English energy unit is the Btu. A watt is 3.412 Btu per English hour or 3.600 kilojoule per metric hour.
In 24 hours ToA power of 340 W/m^2 will deliver 1.43 E19 Btu to a spherical surface with a radius of 6,386 km. The CO2 RF of 2 W/m^2 will deliver 8.39 E16 Btu, 0.59% of the ToA.
At 950 Btu/lb of energy, evaporating 0.74 inches of the ocean’s surface would absorb the entire ToA, evaporating 0.0044 inches of the ocean’s surface would absorb the evil unbalancing CO2 RF.
More clouds. Big deal.
ToA spherical surface area, m^2 …………… 5.125E+14
W = 3.412 Btu/h ………………………………… 3.412E+00
ToA, 340 W/m^2, Btu/24 h …………………… 1.43E+19
CO2 RF, 2 W/m^2, Btu/24 h ………………… 8.39E+16
Ocean surface, m^2 …………………………… 3.619E+14
m^2 = 10.764 ft^2 ……………………………… 1.076E+01
Ocean surface, ft^2 …………………………… 3.895E+15
Water density, lb/ft^3 ……………………… 62.4
Lb of water in 1 foot of ocean …………… 2.431E+17
Evaporation, Btu/lb …………………………… 950.0
Amount of ocean evaporation:
Feet needed to absorb ToA ………………… 0.062
Inches needed to absorb ToA ……………… 0.74
Feet needed to absorb CO2 RF …………… 0.0004
Inches needed to absorb CO2 RF ………… 0.0044
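The table is easy to reproduce; a sketch using the same round numbers as the comment above:

    import math

    BTU_PER_WH = 3.412                                   # 1 Wh = 3.412 Btu

    toa_area_m2 = 4.0 * math.pi * (6.386e6) ** 2         # sphere, r = 6,386 km
    toa_btu_day = 340.0 * toa_area_m2 * BTU_PER_WH * 24  # ~1.43E19 Btu/day at ToA
    co2_btu_day = 2.0 * toa_area_m2 * BTU_PER_WH * 24    # ~8.39E16 Btu/day CO2 RF

    ocean_ft2 = 3.619e14 * 10.764        # ocean surface in ft^2
    lb_per_foot = ocean_ft2 * 62.4       # lb of water in a 1-foot layer
    latent_heat = 950.0                  # Btu/lb to evaporate

    # Inches of ocean surface whose evaporation would absorb each flux:
    print(12.0 * toa_btu_day / (lb_per_foot * latent_heat))   # ~0.74 in (ToA)
    print(12.0 * co2_btu_day / (lb_per_foot * latent_heat))   # ~0.0044 in (CO2 RF)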