No global warming at all for 18 years 8 months
By Christopher Monckton of Brenchley
The Paris agreement is more dangerous than it appears. Though the secession clause that this column has argued for was inserted into the second draft and remained in the final text, the zombies who have replaced the diplomatic negotiators of almost 200 nations did not – as they should have done in a rational world – insert a sunset clause that would bring the entire costly and pointless process to an end once the observed rate of warming fell far enough below the IPCC’s original predictions in 1990.
It is those first predictions that matter, for they formed the official basis for the climate scam – the biggest transfer of wealth in human history from the poor to the rich, from the little guy to the big guy, from the governed to those who profit by governing them.
Let us hope that the next President of the United States insists on a sunset clause. I propose that if 20 years without global warming occur, the IPCC, the UNFCCC and all their works should be swept into the dustbin of history, and the prosecutors should be brought in. We are already at 18 years 8 months, and counting. The el Niño has shortened the Pause, and will continue to do so for the next few months, but the discrepancy between prediction and reality remains very wide.
Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere temperature anomaly dataset shows no global warming for 18 years 8 months since May 1997, though one-third of all anthropogenic forcings have occurred during the period of the Pause.
It is worth understanding just how surprised the modelers ought to be by the persistence of the Pause. NOAA, in a very rare fit of honesty, admitted in its 2008 State of the Climate report that 15 years or more without global warming would demonstrate a discrepancy between prediction and observation. The reason for NOAA’s statement is that there is supposed to be a sharp and significant instantaneous response to a radiative forcing such as adding CO2 to the air.
The steepness of this predicted response can be seen in Fig. 1a, which is based on a paper on temperature feedbacks by Professor Richard Lindzen’s former student Professor Gerard Roe in 2009. The graph of Roe’s model output shows that the initial expected response to a forcing is supposed to be an immediate and rapid warming. But, despite the very substantial forcings in the 18 years 8 months since May 1997, not a flicker of warming has resulted.
Figure 1a: Models predict rapid initial warming in response to a forcing. Instead, no warming at all is occurring. Based on Roe (2009).
The current el Niño, as Bob Tisdale’s distinguished series of reports here demonstrates, is at least as big as the Great el Niño of 1998. The RSS temperature record is now beginning to reflect its magnitude. If past events of this kind are a guide, there will be several months’ further warming before the downturn in the spike begins.
However, if there is a following la Niña, as there often is, the Pause may return at some time from the end of this year onward.
The hiatus period of 18 years 8 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. The start date is not cherry-picked: it is calculated. And the graph does not mean there is no such thing as global warming. Going back further shows a small warming rate. The rate on the RSS dataset since it began in 1979 is equivalent to 1.2 degrees/century.
And yes, the start-date for the Pause has been inching forward, though just a little more slowly than the end-date, which is why the Pause has continued on average to lengthen.
The UAH satellite dataset shows a Pause almost as long as the RSS dataset. However, the much-altered surface tamperature datasets show a small warming rate (Fig. 1b).
Figure 1b. The least-squares linear-regression trend on the mean of the GISS, HadCRUT4 and NCDC terrestrial monthly global mean surface temperature anomaly datasets shows global warming at a rate equivalent to 1.1 C° per century during the period of the Pause from May 1997 to September 2015.
Bearing in mind that one-third of the 2.4 W/m² radiative forcing from all manmade sources since 1750 has occurred during the period of the Pause, a warming rate equivalent to little more than 1 C°/century (even if it had occurred) would not be cause for concern.
As always, a note of caution. Merely because there has been little or no warming in recent decades, one may not draw the conclusion that warming has ended forever. The trend lines measure what has occurred: they do not predict what will occur.
The Pause – politically useful though it may be to all who wish that the “official” scientific community would remember its duty of skepticism – is far less important than the growing discrepancy between the predictions of the general-circulation models and observed reality.
The divergence between the models’ predictions in 1990 (Fig. 2) and 2005 (Fig. 3), on the one hand, and the observed outturn, on the other, continues to widen. If the Pause lengthens just a little more, the rate of warming in the quarter-century since the IPCC’s First Assessment Report in 1990 will fall below 1 C°/century equivalent.
Figure 2. Near-term projections of warming at a rate equivalent to 2.8 [1.9, 4.2] K/century, made with “substantial confidence” in IPCC (1990), for the 311 months January 1990 to November 2015 (orange region and red trend line), vs. observed anomalies (dark blue) and trend (bright blue) at just 1 K/century equivalent, taken as the mean of the RSS and UAH v.6 satellite monthly mean lower-troposphere temperature anomalies.
Figure 3. Predicted temperature change, January 2005 to September 2015, at a rate equivalent to 1.7 [1.0, 2.3] Cº/century (orange zone with thick red best-estimate trend line), compared with the near-zero observed anomalies (dark blue) and real-world trend (bright blue), taken as the mean of the RSS and UAH v.6 satellite lower-troposphere temperature anomalies.
The Technical Note explains the sources of the IPCC’s predictions in 1990 and in 2005, and also demonstrates that, according to the ARGO bathythermograph data, the oceans are warming at a rate equivalent to less than a quarter of a Celsius degree per century. In a rational scientific discourse, those who had advocated extreme measures to prevent global warming would now be withdrawing and calmly rethinking their hypotheses. However, this is not a rational scientific discourse.
Key facts about global temperature
These facts should be shown to anyone who persists in believing that, in the words of Mr Obama’s Twitteratus, “global warming is real, manmade and dangerous”.
• The RSS satellite dataset shows no global warming at all for 224 months from May 1997 to December 2015 – more than half the 444-month satellite record.
• There has been no warming even though one-third of all anthropogenic forcings since 1750 have occurred since 1997.
• The entire UAH dataset for the 444 months (37 full years) from December 1978 to November 2015 shows global warming at an unalarming rate equivalent to just 1.14 Cº per century.
• Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to below 1.2 Cº per century.
• The global warming trend since 1900 is equivalent to 0.75 Cº per century. This is well within natural variability and may not have much to do with us.
• The fastest warming rate lasting 15 years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.
• Compare the warming on the Central England temperature dataset in the 40 years 1694-1733, well before the Industrial Revolution, equivalent to 4.33 C°/century.
• In 1990, the IPCC’s mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.
• The warming trend since 1990, when the IPCC wrote its first report, is equivalent to little more than 1 Cº per century. The IPCC had predicted close to thrice as much.
• To meet the IPCC’s original central prediction of 1 C° warming from 1990-2025, a warming of 0.75 C°, equivalent to 7.5 C°/century, would have to occur in the next decade.
• Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual centennial warming prediction of 4.8 Cº warming to 2100.
• The IPCC’s predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than 15 years that has been measured since 1950.
• The IPCC’s 4.8 Cº-by-2100 prediction is four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.
• The oceans, according to the 3600+ ARGO buoys, are warming at a rate of just 0.023 Cº per decade, equivalent to 0.23 Cº per century, or 1 C° in about 430 years.
• Recent extreme-weather events cannot be blamed on global warming, because there has not been any global warming to speak of. It is as simple as that.
Technical note
Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not “cherry-picked” so as to coincide with the temperature spike caused by the 1998 el Niño. Instead, it is calculated so as to find the longest period with a zero trend.
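For readers who wish to verify that the start-date falls out of the data rather than being chosen, the calculation can be sketched in a few lines of Python. This is a minimal illustration of the procedure described above, not the column’s actual code; the array name `anoms` and the two-year minimum window are assumptions.

```python
# Sketch: find the earliest month from which the least-squares trend
# to the present is zero or negative. `anoms` is assumed to hold the
# RSS monthly anomalies in chronological order.
import numpy as np

def longest_zero_trend_start(anoms):
    anoms = np.asarray(anoms, dtype=float)
    months = np.arange(len(anoms))
    for start in range(len(anoms) - 24):   # require at least two years
        slope = np.polyfit(months[start:], anoms[start:], 1)[0]
        if slope <= 0:
            return start                   # earliest zero-trend start
    return None                            # no zero-trend period exists
```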
The fact of a long Pause is an indication of the widening discrepancy between prediction and reality in the temperature record.
The satellite datasets are arguably less unreliable than the terrestrial records in that they show the Great el Niño of 1998 more clearly than any of the others. That event, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing independent verification that the satellite datasets are better able than the rest to capture such fluctuations without artificially filtering them out.
Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates below those that are published. The satellite datasets are based on reference measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking, via spaceward mirrors, the known temperature of the cosmic background radiation: 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that NASA’s anisotropy probe determined the age of the Universe as 13.82 billion years.
The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website. A computer algorithm reads them from the text file and plots them automatically, adjusting the aspect ratio of the data window on both axes so as to show the data at maximum scale, for clarity.
The latest monthly data point is visually inspected to ensure that it has been correctly positioned. The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line.
The IPCC and most other agencies use linear regression to determine global temperature trends. Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails. The method is appropriate because global temperature records exhibit little auto-regression, since summer temperatures in one hemisphere are compensated by winter in the other. Therefore, an AR(n) model would generate results little different from a least-squares trend.
Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat.
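For completeness, here is a sketch of the quantities the preceding paragraphs describe – the least-squares slope, the y-intercept and the correlation coefficient – computed from their textbook formulas. The input names are illustrative, and this is not Dr Farish’s verified routine.

```python
import numpy as np

def ols_summary(t, y):
    """Least-squares slope, intercept and correlation coefficient.
    t: time in months; y: temperature anomalies."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    cov = np.mean((t - t.mean()) * (y - y.mean()))
    slope = cov / np.var(t)                  # trend, C° per month
    intercept = y.mean() - slope * t.mean()  # y-intercept
    r = cov / (t.std() * y.std())            # near zero for a flat trend
    return slope, intercept, r
```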
RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it at remss.com/blog/recent-slowing-rise-global-temperatures.
Dr Mears’ results are summarized in Fig. T1:
Figure T1. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of El Chichón (1982) and Pinatubo (1991) are shown, as is the spike in warming caused by the great el Niño of 1998.
Dr Mears writes:
“The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation. This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.”
Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph:
“Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades. Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site. Is this really your data?’ While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate. … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.”
In fact, the spike in temperatures caused by the Great el Niño of 1998 is almost entirely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 el Niño, and the sheer length of the Great Pause itself. The headline graph in these monthly reports begins in 1997 because that is as far back as one can go in the data and still obtain a zero trend.
Fig. T1a. Graphs for RSS and GISS temperatures starting both in 1997 and in 2001. For each dataset the trend-lines are near-identical, showing conclusively that the argument that the Pause was caused by the 1998 el Niño is false (Werner Brozek and Professor Brown worked out this neat demonstration).
Curiously, Dr Mears prefers the terrestrial datasets to the satellite datasets. The UK Met Office, however, uses the satellite data to calibrate its own terrestrial record.
The length of the Pause, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed.
Sources of the IPCC projections in Figs. 2 and 3
IPCC’s First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century. The executive summary asked, “How much confidence do we have in our predictions?” IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded:
“Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change. … There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results.”
That “substantial confidence” was substantial over-confidence. For the rate of global warming since 1990 – the most important of the “broad-scale features of climate change” that the models were supposed to predict – is now below half what the IPCC had then predicted.
In 1990, the IPCC said this:
“Based on current models we predict:
“under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, a rate of increase of global mean temperature during the next century of about 0.3 Cº per decade (with an uncertainty range of 0.2 Cº to 0.5 Cº per decade), this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1 Cº above the present value by 2025 and 3 Cº before the end of the next century. The rise will not be steady because of the influence of other factors” (p. xii).
Later, the IPCC said:
“The numbers given below are based on high-resolution models, scaled to be consistent with our best estimate of global mean warming of 1.8 Cº by 2030. For values consistent with other estimates of global temperature rise, the numbers below should be reduced by 30% for the low estimate or increased by 50% for the high estimate” (p. xxiv).
The orange region in Fig. 2 represents the IPCC’s medium-term Scenario-A estimate of near-term warming, i.e. 1.0 [0.7, 1.5] K by 2025.
The IPCC’s predicted global warming over the 25 years from 1990 to the present differs little from a straight line (Fig. T2).
Figure T2. Historical warming from 1850-1990, and predicted warming from 1990-2100 on the IPCC’s “business-as-usual” Scenario A (IPCC, 1990, p. xxii).
Because this difference between a straight line and the slight uptick in the warming rate the IPCC predicted over the period 1990-2025 is so small, one can look at it another way. To reach the 1 K central estimate of warming since 1990 by 2025, there would have to be almost three times as much warming in the next ten years as there was in the last 25 years. That is not likely.
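The arithmetic behind that paragraph can be checked on the back of an envelope. All inputs are the post’s own round figures (the 0.27 K observed outturn is the value quoted later in this note):

```python
predicted_total = 1.00    # K: IPCC (1990) central estimate, 1990-2025
observed_25yr = 0.27      # K: observed outturn, 1990-2015 (see below)

required = predicted_total - observed_25yr       # K needed by 2025
print(f"required 2015-2025: {required:.2f} K")   # ~0.73 K
print(f"rate: {required * 10:.1f} K/century")    # ~7.3 K/century
print(f"vs last 25 yr: {required / observed_25yr:.1f}x")  # ~2.7x
```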
But is the Pause perhaps caused by the fact that CO2 emissions have not been rising anything like as fast as the IPCC’s “business-as-usual” Scenario A prediction in 1990? No: CO2 emissions have risen rather above the Scenario-A prediction (Fig. T3).
Figure T3. CO2 emissions from fossil fuels, etc., in 2012, from Le Quéré et al. (2014), plotted against the chart of “man-made carbon dioxide emissions”, in billions of tonnes of carbon per year, from IPCC (1990).
Plainly, therefore, CO2 emissions since 1990 have proven to be closer to Scenario A than to any other case, because for all the talk about CO2 emissions reduction the fact is that the rate of expansion of fossil-fuel burning in China, India, Indonesia, Brazil, etc., far outstrips the paltry reductions we have achieved in the West to date.
True, methane concentration has not risen as predicted in 1990 (Fig. T4), for methane emissions, though largely uncontrolled, are simply not rising as the models had predicted. Here, too, all of the predictions were extravagantly baseless.
The overall picture is clear. Scenario A is the emissions scenario from 1990 that is closest to the observed CO2 emissions outturn.
Figure T4. Methane concentration as predicted in four IPCC Assessment Reports, together with (in black) the observed outturn, which is running along the bottom of the least prediction. This graph appeared in the pre-final draft of IPCC (2013), but had mysteriously been deleted from the final, published version, inferentially because the IPCC did not want to display such a plain comparison between absurdly exaggerated predictions and unexciting reality.
To be precise, a quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS and UAH monthly global mean lower-troposphere temperature anomalies – is 0.27 Cº, equivalent to little more than 1 Cº/century. The IPCC’s central estimate of 0.71 Cº, equivalent to 2.8 Cº/century, that was predicted for Scenario A in IPCC (1990) with “substantial confidence”, was approaching three times too big. In fact, the outturn is visibly well below even the least estimate.
In 1990, the IPCC’s central prediction of the near-term warming rate was higher by two-thirds than its prediction is today. Then it was 2.8 Cº/century equivalent. Now it is just 1.7 Cº/century equivalent – and, as Fig. 3 shows, even that is proving to be a substantial exaggeration.
Is the ocean warming?
One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean. Since globally the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the “missing heat” has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date.
Actually, it is not known whether the ocean is warming: each of the 3600 automated ARGO bathythermograph buoys takes just three measurements a month in 200,000 cubic kilometres of ocean – roughly a 100,000 km² box more than 316 km square and 2 km deep. Plainly, results obtained at a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of trying to characterize Lake Superior with a single temperature and salinity profile taken at a single point less than once a year) are not going to be a lot better than guesswork.
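The sampling arithmetic can be reproduced in a few lines. The ocean area is a round figure and the 2 km depth reflects the ARGO profiling range; neither is an official ARGO statistic:

```python
floats = 3600
ocean_area_km2 = 3.6e8            # ~361 million km^2 of ocean surface
depth_km = 2.0                    # ARGO floats profile the top ~2 km

vol_per_float = ocean_area_km2 * depth_km / floats
side_km = (vol_per_float / depth_km) ** 0.5
print(f"{vol_per_float:,.0f} km^3 per float")       # ~200,000 km^3
print(f"box ~{side_km:.0f} km square, 2 km deep")   # ~316 km on a side
```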
Unfortunately ARGO seems not to have updated the ocean dataset since December 2014. However, what we have gives us 11 full years of data. Results are plotted in Fig. T5. The ocean warming, if ARGO is right, is just 0.02 Cº per decade, equivalent to 0.2 Cº per century.
Figure T5. The entire near-global ARGO 2 km ocean temperature dataset from January 2004 to December 2014 (black spline-curve), with the least-squares linear-regression trend calculated from the data by the author (green arrow).
Finally, though the ARGO buoys measure ocean temperature change directly, before publication NOAA craftily converts the temperature change into zettajoules of ocean heat content change, which makes the change seem a whole lot larger.
The terrifying-sounding heat content change of 260 ZJ from 1970 to 2014 (Fig. T6) is equivalent to just 0.2 K/century of global warming. All those “Hiroshima bombs of heat” of which the climate-extremist websites speak are a barely discernible pinprick. The ocean and its heat capacity are a lot bigger than some may realize.
Figure T6. Ocean heat content change, 1957-2013, in Zettajoules from NOAA’s NODC Ocean Climate Lab: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT, with the heat content values converted back to the ocean temperature changes in Kelvin that were originally measured. NOAA’s conversion of the minuscule warming data to Zettajoules, combined with the exaggerated vertical aspect of the graph, has the effect of making a very small change in ocean temperature seem considerably more significant than it is.
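The conversion from zettajoules back to a temperature change is straightforward. Here is a back-of-envelope sketch, assuming the heat is deposited in the 0-2000 m layer and using round figures for its mass and specific heat (both assumptions, not NOAA’s values):

```python
dQ = 260e21               # J: ~260 ZJ heat content change, 1970-2014
mass_kg = 0.67e21         # kg: roughly the top 2000 m of the ocean
c_p = 4000.0              # J/(kg K): approximate for seawater

dT = dQ / (mass_kg * c_p)                  # mean temperature change, K
print(f"dT ~ {dT:.2f} K over 44 years")    # ~0.10 K
print(f"~{dT / 44 * 100:.2f} K/century")   # ~0.2 K/century
```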
Converting the ocean heat content change back to temperature change reveals an interesting discrepancy between NOAA’s data and that of the ARGO system. Over the period of ARGO data, from 2004-2014, the NOAA data imply that the oceans are warming at 0.05 Cº per decade, equivalent to 0.5 Cº per century, or rather more than double the rate shown by ARGO.
ARGO has the better-resolved dataset, but since the resolutions of all ocean datasets are very low one should treat all these results with caution.
What one can say is that, on such evidence as these datasets are capable of providing, the difference between the underlying warming rate of the ocean and that of the atmosphere is not statistically significant, suggesting that if the “missing heat” is hiding in the oceans it has magically found its way into the abyssal strata without managing to warm the upper strata on the way.
On these data, too, there is no evidence of rapid or catastrophic ocean warming.
Furthermore, to date no empirical, theoretical or numerical method, complex or simple, has yet successfully specified mechanistically either how the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere has reached the deep ocean without much altering the heat content of the intervening near-surface strata or how the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions relevant to land-based life on Earth.
Figure T7. Near-global ocean temperatures by stratum, 0-1900 m, providing a visual reality check to show just how little the upper strata are affected by minor changes in global air surface temperature. Source: ARGO marine atlas.
Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean.
Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes. It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere.
If the “deep heat” explanation for the Pause were correct (and it is merely one among dozens that have been offered), the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has.
In early October 2015 Steven Goddard added some very interesting graphs to his website. The graphs show the extent to which sea levels have been tampered with to make it look as though there has been sea-level rise when it is arguable that in fact there has been little or none.
Why were the models’ predictions exaggerated?
In 1990 the IPCC predicted – on its business-as-usual Scenario A – that from the Industrial Revolution till the present there would have been 4 Watts per square meter of radiative forcing caused by Man (Fig. T8):
Figure T8. Predicted manmade radiative forcings (IPCC, 1990).
However, from 1995 onward the IPCC decided to assume, on rather slender evidence, that anthropogenic particulate aerosols – mostly soot from combustion – were shading the Earth from the Sun to a large enough extent to cause a strong negative forcing. It has also now belatedly realized that its projected increases in methane concentration were wild exaggerations. As a result of these and other changes, it now estimates that the net anthropogenic forcing of the industrial era is just 2.3 Watts per square meter, or little more than half its prediction in 1990 (Fig. T9):
Figure T9: Net anthropogenic forcings, 1750 to 1950, 1980 and 2012 (IPCC, 2013).
Even this, however, may be a considerable exaggeration. For the best estimate of the actual current top-of-atmosphere radiative imbalance (total natural and anthropogenic net forcing) is only 0.6 Watts per square meter (Fig. T10):
Figure T10. Energy budget diagram for the Earth from Stephens et al. (2012)
In short, most of the forcing predicted by the IPCC is either an exaggeration or has already resulted in whatever temperature change it was going to cause. There is little global warming in the pipeline as a result of our past and present sins of emission.
It is also possible that the IPCC and the models have relentlessly exaggerated climate sensitivity. One recent paper on this question is Monckton of Brenchley et al. (2015), which found climate sensitivity to be in the region of 1 Cº per CO2 doubling (go to scibull.com and click “Most Read Articles”). The paper identified errors in the models’ treatment of temperature feedbacks and their amplification, which account for two-thirds of the equilibrium warming predicted by the IPCC.
Professor Ray Bates gave a paper in Moscow in summer 2015 in which he concluded, based on the analysis by Lindzen & Choi (2009, 2011) (Fig. T11), that temperature feedbacks are net-negative. Accordingly, he supports the conclusion both of Lindzen & Choi and of Spencer & Braswell (2010, 2011) that climate sensitivity is below – and perhaps considerably below – 1 Cº per CO2 doubling.
Figure T11. Reality (center) vs. 11 models. From Lindzen & Choi (2009).
A growing body of reviewed papers finds climate sensitivity considerably below the 3 [1.5, 4.5] Cº per CO2 doubling that was first put forward in the Charney Report of 1979 for the U.S. National Academy of Sciences, and is still the IPCC’s best estimate today.
On the evidence to date, therefore, there is no scientific basis for taking any action at all to mitigate CO2 emissions.
Finally, how long will it be before the Freedom Clock (Fig. T12) reaches 20 years without any global warming? If it does, the climate scare will become unsustainable.
Figure T12. The Freedom Clock edges ever closer to 20 years without global warming
I see the Pause in the UAH dataset merits only half a sentence.
I’m not surprised, because you can find a negative trend in this dataset if, and only if, you start your analysis in December 1997. Next month it will be gone.
The RSS Pause should hang on until March
Yes, MoB uses a method that is the very definition of how to pick the best cherry … pointing that out, of course, causes vehement denial of having done so – always good for a laugh. As soon as you step beyond the 97/98 E.N. peak the pause goes into hiding. As you say, in a few months he’ll have to search for another non-robust messaging device to emphasize. It’s the way of his type.
This pause is really bugging you blokes, isn’t it? Why have there been ~80-odd peer-reviewed studies released in the past 5 or so years trying to explain this pause? Every excuse from aerosols and volcanic activity to ‘the heat is hiding in the oceans’ has been used by ‘climate science’ authors such as Mann, Schmidt and Trenberth et al.
You guys don’t have a clue, do you!
Just a reminder, this pause has been bugging you lot since at least 2009:
One of the authors who has written a couple of ‘peer-reviewed’ papers trying to explain ‘The Pause’:
As soon as you step beyond the 97/98 E.N. peak the pause goes into hiding.
No, it doesn’t. Skip the 1998 El Nino — and the 1999-2000 La Nina — and you get this:
http://woodfortrees.org/plot/rss/from:2001/plot/rss/from:2001/trend
“The RSS Pause should hang on until March”
Maybe, or earlier. Here is what is really going on with these “Pause” calculations. The graph below shows plots of the trends from the year on the x-axis to now. It is shown for each of the main surface indices, and for UAH V6 and for RSS. The lowest curve is RSS, which is the one favored here. The “Pause” is dated from when the RSS curve crosses the grey trend=0 line, which I’ve marked with a red circle. As you see, none of the surface indices comes close to a pause (zero trend, grey), and UAH only just. And RSS has only a very brief passage below the line, and then another (brief) around 2001.
http://www.moyhu.org.s3.amazonaws.com/2016/1/trend0.png
With warm present months, these curves are all rising. Within a month or two, the 1997 dip will be gone. The 2001 dip might hold for another month. Then it’s all the way back to 2009, and even that won’t last.
Here’s how the plot looked three months ago. The rate of rise is proportional to the excess warmth of current months, so if Jan is warmer than Dec (for RSS a good bet), then the rise of the curve will not only continue, but will accelerate.
http://www.moyhu.org.s3.amazonaws.com/2016/1/trend3.png
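For anyone wanting to reproduce this kind of plot, a sketch follows. It computes the least-squares trend from every possible start month to the present; where the curve crosses zero is where a “Pause” can be said to begin. The array name `anoms` is assumed, and this is not Mr Stokes’ own code (his published viewer does more, e.g. handling multiple datasets).

```python
import numpy as np
import matplotlib.pyplot as plt

def trends_to_present(anoms, min_months=24):
    """Trend (C°/century) from each start month to the final month."""
    anoms = np.asarray(anoms, dtype=float)
    years = np.arange(len(anoms)) / 12.0
    starts = np.arange(len(anoms) - min_months)
    return starts, [np.polyfit(years[s:], anoms[s:], 1)[0] * 100
                    for s in starts]

# starts, tr = trends_to_present(anoms)
# plt.plot(starts, tr); plt.axhline(0, color="grey"); plt.show()
```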
Well presented, Mr Stokes. That has to be a bracing splash of reality to the face of many on this forum. Perhaps it provides the regulars with a truer sense of the fragile nature of MoB’s monthly PR posts on the subject. I’m sure MoB will have other equally robust and “informing” PR devices to unveil in the coming months.
John@EF,
You know, you can always submit an article just like Lord Monckton and lots of others do. Nick Stokes can, too.
But you don’t, and I know why: you would get ripped to shreds because the facts show you’re wrong. Planet Earth is falsifying your beliefs.
But prove me wrong, write that article…
dbs,
“because the facts show you’re wrong”
I’ve just drawn a graph. Where do you think the graph is wrong?
But I’ll make a prediction. By May, at the latest, with the April RSS results included, this “pause” will be no longer than a month or two.
Nick Stokes,

Here’s another graph, from 1880:
What do you see? Do you see “accelerating” global warming? No. You see natural global warming, and if the “pause” stops pausing, it will be as natural as when it paused. Here’s another chart, from the mid-1800’s.
See, global temperatures don’t rise in a smooth, straight line. That wouldn’t be natural. That would only happen if human CO2 emissions warmed the planet like you claim. Instead, global T rises in fits and starts:
http://jonova.s3.amazonaws.com/graphs/hadley/Hadley-global-temps-1850-2010-web.jpg
The warming from CO2 has already happened; the saturation is almost complete, and even another 50% or 100% of CO2 emissions wouldn’t make a measurable difference.
Just extrapolate a doubling of CO2 from the current 400 ppm, and tell us how much global warming would result:
You’re trying to sell a pig in a poke, Nick. You know better. Why are you still trying to claim that CO2 causes dangerous global warming?
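For what it is worth, the standard logarithmic-forcing arithmetic behind that challenge can be sketched as follows. The 5.35 coefficient is the conventional simplified CO2-forcing expression; the sensitivity parameter here is purely illustrative, since its value is the whole point in dispute:

```python
import math

def doubling_warming(c0=400.0, c1=800.0, lam=0.5):
    """lam: sensitivity in K per (W/m^2) -- an illustrative value."""
    forcing = 5.35 * math.log(c1 / c0)   # ~3.7 W/m^2 for a doubling
    return forcing, lam * forcing

f, dt = doubling_warming()
print(f"forcing {f:.2f} W/m^2 -> warming {dt:.2f} K")
```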
I am afraid Nick is correct about the fragility of the pause. If RSS continues in 2016 as it did in 1998, the 18 year pause is history. However a La Nina might come later in 2016 and restore the pause.
Argue about the fact that the El Nino is natural and not due to CO2, or as you see fit. But the RSS pause is on shaky ground right now.
The point that “the discrepancy between prediction and reality remains very wide” will remain valid to bring up, though.
Nick,
this is fascinating stuff. Thanks for those informative graphs. Of course, the temperatures are expected to continue to rise within the next few months, and this could (and most likely will, depending on the temperatures) jeopardize the Pause.
The Viscount Monckton has acknowledged this, (http://wattsupwiththat.com/2015/11/05/the-pause-lengthens-again-just-in-time-for-paris/) saying that he suspects that the Pause may disappear altogether for a time because of this El Nino, but he suspects that it may return after that.
Nick, please explain why the current Holocene (Modern Warm Period) has the lowest temperature, yet the highest CO2 levels in the past 400,000 years?
http://i255.photobucket.com/albums/hh154/crocko05/Temp%20vs%20CO2%20-%20400000%20years_zpskyy0qvra.jpg
Here is what is really going on with RSS.
The mean monthly anomaly for the past 18 years is 0.251 (till 2015.93). For the first 9 years it was 0.266 and for the next it was 0.237. If it’s extended to a 20-year period, it only went up by 0.006 from the first 10 years to the next 10.
The latter corresponds to the temperature difference between your feet and testicles on a dry day.
“Within a month or two, the 1997 dip will be gone.”
Take a closer look at the temperature graph at the top of the article. The descent has already started. There’ll be no wiping out of temperature records.
Mr Stokes has long been anxious to explain away the Pause. Soon – for about a year, at any rate – he will not have to do so, for, as this monthly column has frequently pointed out, the effect of the current strong el Niño will be to extinguish the Pause unless and until a countervailing la Niña occurs (which may or may not happen).
However, the Pause is indicative of a far more serious problem for the true-believers: the growing discrepancy between the exaggerated predictions of the climate models and the far less exciting observed reality.
It is now 15 years since the Third ASSessment Report of the IPCC in 2001. So we now have a long enough stretch of observational data to verify the medium-term predictions made in the First, Second and Third ASSessment Reports. In all three reports, the predicted trends have proven to be far in excess of observation – so far, indeed, that it is becoming very difficult to maintain that manmade global warming will be at all likely to prove significant, let alone damaging in net terms.
Even if the Pause disappears, these monthly columns will continue, but there will be a greater emphasis on the continuing growth in the discrepancy between exaggerated prediction and unexciting reality.
The point is simple. The international totalitarian Left, in order to get the climate scam going, had to make exaggerated predictions. By now, though, enough time has passed to demonstrate compellingly – and perhaps conclusively – that those predictions were indeed wild exaggerations.
Of course, we shall have to wait for the retirement or death of the current generation of politicians before we can succeed in persuading governments to change what passes for their minds on this subject. In the meantime, as people throughout Africa and India and in large parts of South America and even China are denied coal-fired electricity – or any electricity – some 10-20 million a year will die before their time. Their deaths will have been caused, in no small part, by the utter refusal of the totalitarian Left to admit its profitable but evil error.
If the discrepancy between wild prediction and unspectacular reality in the temperature record continues to widen, as I expect it to do, there will come a time when the world will turn on the totalitarian Left and drive it into permanent, unlamented extinction. The totalitarian-left ideology has been responsible for 75 million deaths under Nazism, 150 million and counting under Communism, 50 million and counting from the green Left’s ban on DDT just at the point where malaria might have been driven to extinction, 40 million and counting because the homosexual Left campaigned against the usual public-health measures to contain HIV when it first emerged, and now 10-20 million a year because the environmental Left opposes fossil fuel corporations on no better ground than that they have always been donors to anti-Left political parties worldwide.
In the end, the Left will rue the day they decided to adopt the pseudo-science of allegedly catastrophic global warming. For it is my hope, and that of others who have studied the emergence of this scare, that it will prove to be the Left’s undoing. They have tried to repeal the laws of science itself: and those laws, like it or not, are beyond our power to amend or to repeal.
You’ve only demonstrated that you can’t or refuse to comprehend such a simple thing.
If I were to ask how far back you can go before seeing global average temperatures as high as today’s, you would determine that 18 years and 8 months ago it was as warm as today, according to RSS. This is what the hiatus in global temperatures is. This is why there are nearly 100 papers discussing the hiatus.
And if the pause is “on shaky ground”, it is so because the 20-year temperature trend may be negative by mid 2017. At the rate of heating from the decay of this El Nino, it looks like it might raise temperatures to the same levels as 1998. That would not erase the hiatus; it would shorten it back to 17 years.
If tropospheric temperatures during a major El Nino in 2016 matches tropospheric temperatures from a major El Nino in 1998, then what does that tell you about the trend in global temperatures?
If you wish, the time for no statistically significant warming can also be brought up.
Presently from:
http://moyhu.blogspot.com.au/p/temperature-trend-viewer.html
Temperature Anomaly trend
May 1993 to Dec 2015
Rate: 0.772°C/Century;
CI from -0.030 to 1.574;
t-statistic 1.886;
Temp range 0.125°C to 0.299°C
That is 22 years and 8 months of no statistically significant warming for RSS.
Being longer than NOAA’s 15 years, that is still important.
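A sketch of how figures like those quoted above (rate, confidence interval, t-statistic) are obtained from monthly anomalies follows. Note that the moyhu trend viewer also adjusts the interval for autocorrelation, which this naive version omits, and the input name is assumed:

```python
import numpy as np
from scipy import stats

def trend_with_ci(anoms):
    """OLS trend in C°/century with a naive ~95% confidence interval."""
    years = np.arange(len(anoms)) / 12.0
    fit = stats.linregress(years, anoms)
    rate = fit.slope * 100                       # C°/century
    half = 1.96 * fit.stderr * 100               # ~95% half-width
    return rate, (rate - half, rate + half), fit.slope / fit.stderr

# "No statistically significant warming" means the interval spans zero.
```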
Nick Stokes January 10, 2016 at 6:18 pm
“But I’ll make a prediction. By May, at the latest, with the April RSS results included, this “pause” will be no longer than a month or two.”
Tell you what, I’ll make a prediction too.
Not only will your prediction fail miserably but, in accordance with the current negative phase of the ~60-year cycle that appears to correlate fairly well with the NAO, in about five years – perhaps less – it will become abundantly clear to all and sundry that, despite the best efforts of the Government’s hired climate “scientists” to Mannipulate the temperature databases, the Earth’s temperature has in fact been dropping since ~2000, and will continue to do so until ~2030.
What do you say, Mr. Stokes?
You two idiots fail to understand the method. I’ll try to explain in terms my 11-year-old granddaughter would understand.
There is no cherry picking, unless you call starting from the most recent data and going backwards cherry picking, because that is what Christopher has done. You see, that’s how it HAS to work in order to define the length of the pause.
Right on cue …. hahaha
And what happens to the pause when you go just a couple of months on either side of the monster e.n. peak? And what happens over the course of the next few months, when the pause duration starts getting slashed significantly? Perhaps your granddaughter will whisper to you that it doesn’t mean much other than for PR purposes.
The point here that has been well made many times is the answer to the question:
“how far back can you extend the trend and not find any statistical cooling or warming?”
This is not “cherry-picking”, it is something even worse for CAGW supporters: it is an application of common sense.
Johnwho, that isn’t quite right: if you extend this back a few months, or a year, or some other period, the trend will be positive. So in fact it’s slightly cherry-picked. But, that said, there does exist a considerable time period with no increase. The important aspect is that the overall trend does not indicate a serious problem.
John@EF, why do ALL of the sea-ice charts start in 1979, when satellites have been measuring sea-ice extent since 1972?
http://i255.photobucket.com/albums/hh154/crocko05/SeaiceextentIPCCAR1_zps93416e03.jpg
@Jamie
Yes, if you extend it or shorten it the trend may be positive or negative.
Doesn’t alter the answer to the question:
“how far back can you extend the trend and not find any statistical cooling or warming?”
What you seem to miss is that, from a scientific view, the pause is much longer. You know, considering error bands. Why is it you true believers ignore the scientific viewpoint?
The pause will not go away for many months and then will appear again in spades as the very likely La Nina kicks in. What are you going to be claiming when it goes over 20 years?
Have no fear, they’ll think of something. None of us are supposed to be here anyway …… according to Earth Day 1970!
That’s the problem with short data runs – they invent cherries for picking.
The longest continuous instrument record we have is the Central England Temperature record [CET], which begins in 1659.
Xmetman has produced continuous charts for the 4 seasons + Annual mean –
https://xmetman.wordpress.com/2016/01/09/long-term-seasonal-trends-in-central-england-temperatures/comment-page-1/#comment-2893
The Annual chart shows a 0.95°C warming trend over 356 yrs (0.027°C/decade).
But it also shows some other trends and puts them in context, like the rapid drop from 2002-2012.
Check out the peaks in 1734, 1830, 2002, all have similar shapes.
Climate is a long-term thing, therefore ONLY long-term data tells us anything useful, and it removes cherry picking.
OK this is only Central England but is probably indicative of a wider area.
Xmetman has a lot of interesting stuff on his site worth looking at.
Sorry for the double post
Why do supposedly intelligent people not “get” this? The “cherry-picked” start date is now, today, whenever the analysis is done. Then you work BACKWARDS to the time when there is no pause. This is equivalent to saying Manchester United/Denver Broncos (put any team in) haven’t scored a goal for x minutes – you work back to the last goal – no cherry picking involved.
How hard is that?
SteveT
A rather belated reply to Mr Barraclough, who states, inaccurately, that there is only a Pause in the UAH dataset from December 1997. In fact, the Pause in the UAH data begins in July 1997, only two months shorter than the RSS dataset.
And a rather belated reply from me, as I have been refreshingly out of internet range for a couple of weeks.
Apologies, Christopher Monckton – I was looking at the wrong month. Yes, as of December’s update, there are actually 7 months in which you can start the Pause – July 1997 until January 1998. I was looking at my projections for January 2016, on the assumption that the anomaly will remain about the same. Then you will have to start only in December 1997 to find a Pause.
If next month’s anomaly is 0.48 or above, the UAH Pause will have vanished.
“The Pause” will always be that portion of the data that showed a statistically flat trend. It would only be while we are in “The Pause” that it will be meaningful to show that it extends from the present back.
Last month’s slight rise could be the beginning of the end of “The Pause”, or it could be just a bump, with following months bringing the trend back in line with “The Pause”; or we could even have a downturn, showing a period of cooling. Check what has happened since 1850 and you’ll see we’ve had a couple of “The Pause” periods.
Most importantly, as pointed out, is the separation between the World-ending projections/predictions of the IPCC and reality coupled with the lack of evidence that human CO2 emissions are having an observable effect on the atmosphere.
It may also be of interest to note that, unlike the strident claims in the media recently, 2015 was NOT the hottest ever. 1998 was, by a large margin. 2010 also beat 2015 by a smaller margin.
Given that the graph shows a downward trend at the end of 2015, it is extremely unlikely that 2016 will beat it. But that hasn’t stopped the media from starting to promote it as yet another “hottest ever”.
“2015 was NOT the hottest ever.” It depends where you live. If you live on the surface, it was. If you live in the troposphere, it wasn’t.
Good review posting.
A carbonist will look at you funny and say, “Hey man, it’s a greenhouse gas, it HAS to cause warming”. Such is the state of their high-school physics. You just have to tell them that the reason CO2 is not a significant governor of climate at any scale is that when it takes a big bite out of outgoing radiation, its four front teeth are missing.
I love this site and have followed it for some time. I have seen enough to know both sides of the AGW issue have difficulties with even basic statistics.
When performing a Least Squares Regression on a time series, specific rules should be followed and often this type of regression is not valid.
I am curious, is anyone checking the statistical methods of the anti-AGW publications just like those wonderful Canadian guys (forgot their names) are doing with the pro-AGW scientists?
Statistics are nearly worthless when applied to such rough data compilations with no margin of error specified.
All you need is two eyeballs and a chart.
In the past 20 years the average temperature (assuming that is a meaningful statistic, which I doubt) has increased slightly, remained the same, or declined slightly.
The measurements are not accurate enough for anyone to be sure.
There is certainly nothing to get excited about in the data.
Since 1880 the average temperature trend is unknown, but probably up.
If not up, then it must be down!
Earth’s temperature is always changing.
Since the beginning of this planet, about 4.5 billion years ago, the average temperature TODAY is most likely near the coolest it has ever been.
Pick any two start and end points, and the average temperature between them will be in an uptrend or downtrend — none of the trends have been permanent so far.
Surface measurements are so haphazard, and non-global, that we can guess the change since 1880 ranges somewhere between zero (no change) and +2 degrees C.
I’m assuming a conservative margin of error (+ or – 1 degree C.) which is especially needed for 1800s measurements, when thermometers tended to read low, and measurements were far from global.
I don’t need a weatherman to know which way the wind blows … or to tell me how little Earth’s climate has changed since I was born in the 1950s.
If any aliens landed on this planet and found out many people here were VERY concerned about a slight change in the average temperature since the 1800s, and many people thought CO2 was an evil gas (rather than plant food), I think they would shake their heads and assume there were many “village idiots” living on Earth!
Again, the post is very specific about creating the trend line using LSR which carries with it R squared, SE implications as well.
I believe the method in this case to be erroneous. There are rules of data independence that need to be addressed. The stickiness of the year-to-year data suggests LSR to be improperly applied.
I only mention it because I do cost forecasting for a living and would not use LSR to forecast the future of a time series until I was assured of the independence of each data point.
But that is just me and I am sure that there are much smarter people on this site than me.
Also, in response to the other replier: yes, it is McIntyre and McKitrick whose names I could not remember. They do wonderful work. Frankly, there should be a Nobel Award for Statistics and they deserve it.
You forgot to mention that there is no such thing as “the Earth’s temperature”. It’s not like there’s a thermometer stuck into one of the poles, after all. What climate “scientists” do is add up all the various little thermometer measurements from all over the world, “homogenize” them, weight them (to adjust for the fact that thermometers are close together in civilized parts of the world and far apart in Australia) and finally do a quality check. The quality check consists of making sure the end result is a 0.12 degree per decade warming rate.
In response to David: I did not merely assume that the monthly global temperature data were free of auto-correlation: I checked. In fact, although regional data are of course auto-correlated for seasonal reasons, there is no particular autocorrelative signal in the global data (the summer in one hemisphere coincides with the winter in the other). Besides, like it or not, least-squares regression is the method repeatedly used by the IPCC and recommended (in a Climategate email) by Phil Jones of the University of East Anglia. In short, it is Their method. And if Their method shows little or no warming on any dataset over the past couple of decades, then, Houston, They have a problem.
In response to Monckton…
Of course they (the IPCC) have a problem, and the data clearly have correlation problems; almost all time-series data do. For the most part, the people that you mention are statistical nitwits and you should not follow their course.
This is why, if one is going to perform LSR, one should probably plot temperature as a function of CO2. One could do this for any time window, but I suspect a clear negative correlation during the pause; and further, as the pause could disappear soon, this method would be much more reliable.
Do you mean Steve McIntyre and Ross McKitrick?
The Durbin-Watson statistic on the global anomaly dataset, as I calculate it, is 0.252. Since this is far below the value of 2 that would indicate no autocorrelation, the data are very highly positively autocorrelated.
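For readers wanting to check such a figure, the Durbin-Watson statistic is computed on the residuals of the trend fit. A minimal sketch, with an illustrative input name:

```python
import numpy as np

def durbin_watson(anoms):
    """DW statistic of the residuals about a least-squares trend line.
    ~2 means no autocorrelation; values near 0 mean strong positive
    autocorrelation (e.g. the 0.252 quoted above)."""
    anoms = np.asarray(anoms, dtype=float)
    t = np.arange(len(anoms), dtype=float)
    slope, intercept = np.polyfit(t, anoms, 1)
    resid = anoms - (slope * t + intercept)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
```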
The most disconcerting thing for me is that for over 18 years Mother Nature has falsified the “increasing atmospheric CO2 levels will produce a corresponding rise in global temperatures” hypothesis, yet very few people in the official climate science community appear to have noticed this.
It has been noticed! And Karl actually did something about it.
“yet very few people in the official climate science community appear to have noticed this.”
It has been noticed. Why do you think they are so frantically rewriting history to remove the inconvenient truth?
Isn’t it a warmist argument that there’s a delay in the feedback that would result in further warming “in the pipeline” despite a radiative-imbalance value considerably smaller than the assumed forcing? If so, the following passage may not be as compelling an argument as it seems against the cited forcing estimate:
Also, can anyone speak authoritatively about those imbalance measurements’ reliability? I vaguely recall having heard that they are wildly inaccurate, but I admit that I may be confusing those measurements with something else.
Actually, since the oceans store about a gazillion times more energy than the atmosphere, if we don’t know ocean temperatures we don’t know nothing… And how many bazillion years will it take for the ocean to reach equilibrium from a transient? At least 3 turnovers, maybe 150 years… is there a “lapse rate” for the oceans?
Just a thought: the first figure should have the arrow pointing backwards if it is a regression starting NOW and extending into the past.
As always, Christopher, I enjoyed the post. Thank you for the kind words about my series about the current El Nino.
Cheers.
“The hiatus period of 18 years 8 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend.” Your language implies that you agree that the pause is a hiatus. Do you rule out that this levelling-out represents a high point before the beginning of a decline? Do you think we know whether “the pause” is a hiatus or a high point?
Of course, we do not know. Time will tell. At some stage, the temperatures will either begin to rise, or begin to fall. In a sane world, we would simply wait and see what happens since this ought to advance our understanding of the system. But we are in an insane world where those governing hold the view that we must act, even though no one knows what is going on, or why.
In which case, using their word “hiatus” is to cede our honest point of view to their dishonest hype.
Julian Williams in Wales,
Exactly.
This hiatus/pause terminology is misleading because, as you pointed out, those terms carry premises about past behavior versus future behavior.
It should just be called what it is, a period of no (or not significant) change in global average temperature.
John
Could, should, expect, etc. The only sane thing is to wait and see as Verney says.
Probably a local high point. Whether that becomes the last high point in centuries is still unknown.
If it is a local high point, it is not a hiatus; it is a local high point, the crest of a small, short and insignificant wave. It is only a hiatus if temperature starts rising again at roughly the same rate as it left off about 20 years ago.
Would anyone like to have a stab at:
— projecting forward some likely figures (based on what the current El Nino will produce); and then
— see what happens to The Pause in that scenario?
I don’t have the mathematical/statistical chops for this task, but I’d be interested to know how The Pause would be affected. How short might it get? Would it disappear altogether? If so, when?
Stephen,
The “pause” itself is merely a convenient/effective way of highlighting the lack of scary increase in temps since satellite measurements began. It would not matter all that much if one used the entire satellite record taken as a whole (which, technically speaking, eliminates the “pause”): there simply has not been total warming anything like what the IPCC predicted if CO2 emissions continued unabated.
The pause is a period of no rise at all, but if warming resumed for a century at the rate indicated by the complete satellite record, it would still be no big deal.
If, coincident with this current strong El Niño, there is a long-lasting step change in temperature, as there was such a step change (of about 0.27 °C) coincident with the Super El Niño of 1997/98, the ‘pause’ will be eradicated.
If on the other hand there is no long-lasting step change, but rather a short-term spike much like the one that accompanied the strong 2010 El Niño, then for the first 6 months or so of 2016 the ‘pause’ will shorten (possibly substantially). It is then likely that the following La Niña, in late 2016/early 2017, will bring the temperature anomaly back down, at which time the ‘pause’ will begin to recover, i.e., head back towards the 18 yr 9 month duration. If, following that La Niña, temperatures re-stabilise at around the 2001-2003 anomaly level, the ‘pause’ will lengthen beyond 18 yrs 9 months and, heading into 2019, when AR6 is being written, it will by then be over 21 years in duration.
So the issue is simple. Will there be a long lasting step change in temperature coincident with this current strong El Nino(just like 1997/98), or not? If there is such a long term step change in temperatures, the ‘pause’ will be busted. If no such long lasting step change, it would appear probable that the ‘pause’ will shorten for a short period before beginning to grow and extent over 19 yrs in duration.
There are a number of IFs because the future is not known and has yet to reveal itself. Nature will in time tell us the answer.
Excellent points!
I just want to add that “we” are often criticized for starting the pause right before the super El Nino of 1998. Whatever the merits of this assertion, should the 2016 El Nino eradicate the present pause for good, we can always claim that we did have a pause of at least 15 years that goes right between the 1998 and 2016 El Ninos. And unless RSS is Karlized, this will never go away.
Take it from 2001, after the Nino/Nina swing. RSS is flat-to-cooling.
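For readers who, like Stephen above, would like to experiment with these scenarios, here is a minimal sketch in Python. The anomaly values are assumed stand-ins, for illustration only (actual RSS monthly data can be downloaded from remss.com); the point is simply to show how appended hypothetical months move the least-squares trend.

    import numpy as np

    def trend_c_per_century(anoms):
        # Least-squares slope over monthly anomalies, scaled from degC/month
        # to degC/century (1200 months per century).
        return np.polyfit(np.arange(len(anoms)), anoms, 1)[0] * 1200

    rng = np.random.default_rng(42)
    base = rng.normal(0.25, 0.09, 224)  # stand-in for a trendless May 1997 - Dec 2015 window

    # Assumed scenario 1: a 9-month El Nino spike, then a 9-month La Nina dip.
    spike = np.concatenate([base, np.full(9, 0.65), np.full(9, 0.05)])
    # Assumed scenario 2: a lasting step change of roughly +0.3 degC.
    step = np.concatenate([base, np.full(18, 0.55)])

    for name, series in (("flat", base), ("spike then dip", spike), ("step change", step)):
        print(name, round(trend_c_per_century(series), 2), "degC/century")

Varying the assumed values makes the asymmetry plain: a spike that is largely given back moves the long-term trend far less than a sustained step change does.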
To quote Sir Winston Churchill:
It is a mistake to look too far ahead. Only one link of the chain of destiny can be handled at a time.
and
It is always wise to look ahead, but difficult to look further than you can see.
and
A politician needs the ability to foretell what is going to happen tomorrow, next week, next month, and next year. And to have the ability afterwards to explain why it didn’t happen.
Although certain of the cited papers seem to provide ample reason to believe that climate sensitivity is relatively modest (to the extent that the climate-sensitivity concept means anything at all), I feel compelled again to caution readers against relying on the head post’s following passage as evidence for that proposition:
That paper can be said to have “found” such a sensitivity value only to the extent that “found” means “pulled out of thin air.”
The truth is that the “model” upon which the authors based their “finding” is just this: 0.26 K per W/m^2. That’s it. You multiply a forcing trend by that value to get the resultant temperature trend. That value is the sole basis for the projection in their paper’s signature Fig. 6, the one on which they base their claim of model skill.
It is said that the paper’s “irreducibly simple” model was “developed over eight years.” Yet anyone who has a modest command of math could come up with a “model” at least as good before breakfast by just putting forcing and temperature data on a spreadsheet. All you have to do is divide an observed temperature trend by the corresponding trend in forcing.
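In code, the before-breakfast exercise amounts to a one-liner; the two trend values below are assumed round numbers for illustration, not figures taken from the paper:

    # Hypothetical round numbers, for illustration only.
    temp_trend = 0.10      # K per decade: an assumed observed temperature trend
    forcing_trend = 0.38   # W/m^2 per decade: an assumed forcing trend
    print(temp_trend / forcing_trend)   # ~0.26 K per W/m^2 -- a "model" of the same kind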
Of course, Lord Monckton argues that the authors’ approach was different:
I have no doubt that similarly low sensitivity values can be arrived at by what could indeed be justifiably characterized as “using physics.” In the case of the Monckton et al. paper, though, what in the authors’ view apparently passes for “using physics” is merely their unsupported opinion that “810,000 years of thermostasis suggest” a [-0.5, +0.1] range for the loop gain fg in the closed-loop-gain equation h = g / (1-fg).
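For concreteness, the arithmetic (with the IPCC-suggested g = 1/3.2 K per W/m² and the range’s mean loop gain fg = −0.2, as the next paragraph notes) runs:

    h = \frac{g}{1 - fg} = \frac{1/3.2}{1 - (-0.2)} \approx 0.26\ \mathrm{K\,W^{-1}m^{2}},
    \qquad \Delta T_{2\times} \approx 0.26 \times 3.7 \approx 1.0\ \mathrm{K}.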
Sure, the mean of that range does cause a 1 Cº value to result from an assumed 3.7 W/m^2 doubled-concentration forcing-change value if the IPCC-suggested value of 3.2 W/m^2 per kelvin is used for the reciprocal of open-loop gain g. But what is it about that “thermostasis” that suggests Monckton et al.’s range instead of, say, a range of [-2, -1] and thus a mean sensitivity of 0.5 Cº, or, for that matter, a range of [+0.5, +0.7] and thus a mean sensitivity of 3 Cº? Where are the calculations by which the authors arrived at their range from the observed “thermostasis”?
They have shown none. The sum total of their support for that range is the bald assertion that “Fig. 5 and 810,000 years of thermostasis suggest” it. (Fig. 5 is nothing more than a closed-loop-gain-equation-illustrating hyperbola on which they placed the “process engineers’ design limit” that Lord Monckton has flogged for years without providing any basis.)
In other words, the authors just made the range up.
It’s as though I said that my “finding” of a 50º F. outside temperature is based on physics because I contend that melting icicles suggest a Celsius range of [8º C., 12º C.]. It’s physics, I suppose, that the 50º Fahrenheit value corresponds to the mean of the [8º, 12º] Celsius range, but why would melting icicles suggest [8º C., 12º C.] rather than, say, [1º C., 5º C.]? However true my estimate may ultimately turn out to be, it’s still just conjecture. The same is true of the sensitivity value that the authors “found.”
For evidence that climate sensitivity is low, one is well advised to rely on other papers.
Rely on other papers? Which papers don’t involve making up a range, as you put it? Did God report on this matter recently? ; )
Those who would like to see whether Mr Sour Grapes Born, who was caught out lying when he had first ineptly attempted to criticize Monckton of Brenchley et al. (2013) and has griped ever since whenever the paper is mentioned, will be able to find the paper at scibull.com, the website of the Science Bulletin of the Chinese Academy of Sciences, where the paper is – by a factor of ten – the most downloaded in the entire 60-year archive of that learned journal.
Professor Monckton,
I am unable to locate the expected verb for the clause introduced by the conjunction whether in the above paragraph. Beware of hypotaxis.
: > )
Typical Monckton response. All bluster, no substance.
Note that he is unable to address why the data require his range rather than any other. He has similarly evaded addressing the numerous errors in physics and math with which his paper is replete. This is the sign of a lightweight.
[? What are the specific “errors in physics and math”? .mod]
Don’t whine. Mr Born made several allegations about our paper in Vol 60 no 1 (January 2015) of the Science Bulletin of the Chinese Academy of Sciences, but they were comprehensively answered in earlier postings here. Mr Born lied to the effect that I had refused to supply data to him when he had in fact sent me no request for it. He was roundly criticised for his lie by other commenters, and has been whining ever since. Discredit his nonsense as a proven liar’s sour grapes.
Mr Born lied to the effect that I had refused to supply data to him when he had in fact sent me no request for it.
If anyone is a liar here, it’s Lord Monckton.
The basis for his allegation is a passage in which I referred to a post of mine as a request and to his reply post as turning it down.
The issue was the contents of Monckton et al.’s Table 2. That table’s caption claims that all of its entries, which Monckton et al. refer to as “transience fractions,” were “derived from” a paper by Gerard Roe. Unless “derived from” means “inconsistent with,” however, that caption is a falsehood. Roe’s Fig. 6 shows that at every point in time after t = 0 a higher-feedback system’s response value must exceed every lower-feedback system’s. In contrast, the first-row entries of the Monckton et al. table dictate that in the early years the lowest-feedback system’s response exceed the higher-feedback systems’.
Readers before me had placed those quantities at issue in blog threads in which Lord Monckton participated. Characteristically, however, any answers he gave were at best evasive; even in the face of objections that such values appeared to be non-physical he failed to explain how he could possibly have inferred from Roe et al. that the zero-feedback values would be unity for all time values.
To elicit a clear explanation, therefore, I cranked up the volume: I wrote a post specifically entitled “Reflections on Monckton et al.’s Transience Fraction.” In that post I explicitly stated that the manner in which Monckton et al. inferred that table’s values had not been made entirely clear.
So it was hardly a stretch for a subsequent post of mine to refer to that earlier post as a “request for further information about how the Table 2 ‘transience fraction’ values . . . were obtained from a Gerard Roe paper’s Fig. 6.” Nor was it inappropriate for me to characterize as turning down that request a reply post in which Lord Monckton merely repeated the paper’s contention that “The table was derived from a graph in Gerard Roe’s magisterial paper of 2009 on feedbacks and the climate” without explaining, as I requested, how that could possibly be true.
Nowhere did I say that the authors needed to supply data in order to give the explanation (which could simply have been, No, we didn’t really derive that row’s values from Roe; we just made them up). Nowhere did I state or imply that I had made a request through any channel other than that post. Indeed, by hyperlinking the word “request” to it, I explicitly identified that post as the request. I similarly hyperlinked “turned down” to his reply post. No one who knows how to click on a hyperlink could have had any excuse for not knowing what those terms referred to.
Such is the forthright, above-board, completely transparent behavior that Lord Monckton has chosen to characterize as a lie. That he would thus resort to slander is an indication of how desperate he is to avoid a technical discussion, in which it would be apparent to anyone conversant with feedback theory and electrical circuits that in writing about them Monckton et al. had ventured in way over their heads. His doing so is of a piece with the posts in which he claims to have “comprehensively answered” the technical issues: it continues his practice of distortion, evasion, and misdirection.
[? What are the specific “errors in physics and math”? .mod]
Even in a complete head post it would be hard to cover all of the paper’s problems. But one can get a sense just from considering Monckton et al.’s Equation 1.
To appreciate how ludicrous Equation 1 is, think of a simple system like a bathtub with a slow drain. If you open the faucet, the tub fills until it reaches a level at which the drain flow, which increases with increasing water depth, equals the flow from the faucet. Think of the faucet flow as the forcing (delta F in the equation), and think of the resultant water level as temperature (delta T in the equation); its evolution in time after the faucet is initially opened is represented by curves something like those in the Roe diagram, i.e., in the diagram that the Table 2 values were supposed to represent. (Greater feedback values correspond to a slower drain.)
Now suppose you suddenly close the faucet. Does the accumulated water disappear instantaneously? Of course not. But that’s what Monckton et al.’s Equation 1 says would happen. Look at the equation: delta F = 0 means delta T = 0.
If you accept, despite your everyday experience, that the accumulated water would disappear instantaneously, then you’ll see nothing wrong with Equation 1. Otherwise, you should question it and much else that’s in that paper.
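Here is a minimal numerical sketch of the bathtub point. The parameter values are assumed purely for illustration (this is a generic first-order system with memory, not the paper’s model):

    import numpy as np

    lam, tau, dt = 0.5, 10.0, 0.1   # assumed sensitivity, time constant (yr), time step (yr)
    t = np.arange(0, 40, dt)
    F = np.where(t < 20, 2.0, 0.0)  # faucet open for 20 years, then suddenly closed

    # A system with memory: dT/dt = (lam*F - T)/tau, integrated by Euler steps.
    T = np.zeros_like(t)
    for i in range(1, len(t)):
        T[i] = T[i-1] + dt * (lam * F[i] - T[i-1]) / tau

    T_memoryless = lam * F          # an Equation-1-style response: zero forcing, zero response

    i = np.searchsorted(t, 20.5)    # half a year after the faucet closes
    print(round(T[i], 2), T_memoryless[i])   # ~0.82 versus exactly 0.0

The accumulated “water” drains away over the time constant; it does not vanish the instant the forcing does.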
Also, if you know anything about electrical circuits, you know that the claim that “the voltage transits from the positive to the negative rail” when loop gain exceeds unity makes no sense.
Mr Born now admits to his lie by conceding that, contrary to his statement that he had “requested” me to supply data that I had “refused” to supply, he had not in fact sent me any request for it, nor had I issued any statement refusing it. The data were in fact supplied in the paper he presumed – and presumes – to criticize without actually having read it.
He whines – for that is the usual tone of this particular troll – that we had said that we had derived certain values for our transience fraction from a graph by Dr Gerard Roe when in his opinion we had not done so. Our method, however, was made entirely explicit in the paper, so he is merely quibbling in his usual futile fashion.
Then he tries, again futilely, to attack our equation 1 on the ground that – in effect – it is too simple. Well, the main points of that equation are in the IPCC’s documents, so if he is criticizing our equation he is criticizing the IPCC too. A feature of our approach was to use its methods and to show where they led.
How vexing it must be for him to learn that the paper he loves so little is now, by an order of magnitude, the most-downloaded paper in the entire 60-year archive of the Science Bulletin of the Chinese Academy of Sciences. Let him pick nits all he likes: the paper was very thoroughly peer-reviewed, and it does provide a clear description of some of the reasons why climate sensitivity has been overstated in the models.
I have told him before, and must instruct him again, that the paper made it quite clear that the values we had chosen for the inputs to our simple climate-sensitivity model were our choices, and that anyone was free to choose his own values and run the model for himself. So whining about our choice of values for the initial conditions does not constitute a criticism of our model itself, which is based closely on equations to be found in the documents of the IPCC, which he would do well to read so as to inform himself of how climate sensitivity is determined.
He should also go and consult a competent electronic engineer for information on what happens in an electronic circuit when the loop gain is driven above unity. He may like, in particular, to make himself better informed about what is known as “feedback-induced oscillation”, and about how that oscillation is induced by increasing the loop gain transiently above unity and allowing it to relax back below unity. He is ill informed but pretends to be well informed. Let him do more homework and less whining.
Meanwhile, it is becoming apparent to all – and even to him – that climate sensitivity is, as we have found it to be, likely to be a great deal less than the canonical value. That is the main point, from the policymaker’s point of view.
Lord Monckton clings to his position that the closed-loop-gain equation depicted in his Fig. 5 is descriptive only of circuits, not of a feedback-implementing climate system. He seems to base this view on interpreting the unity-loop-gain transition in his Fig. 5 from positive infinity to negative infinity as meaning that voltage “transits from the positive rail to the negative rail.” That’s not what it means. It means that the equilibrium state is negative above unity loop gain and positive below it, not that increasing loop gain would cause the system to transition to that equilibrium state.
And that equation does indeed apply to climate. The reasoning couldn’t be simpler. If a system’s equilibrium open-loop gain is g, i.e., if its equilibrium open-loop response y to a static stimulus x is given by y = gx, then adding feedback fy simply means replacing x in that equation with x + fy: y = (x + fy)g; that’s the definition of feedback. So, if the climate system implements feedback, it by definition has to obey that equation—as well as Fig. 5’s closed-loop-gain equation y = gx / (1 – fg), which that equation becomes when y is isolated. Contrary to what Monckton et al. contend, therefore, the closed-loop-gain equation is not “the wrong equation” to use in a feedback-implementing climate model.
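Spelled out, the algebra is a single step:

    y = g\,(x + f y) \;\Longrightarrow\; y\,(1 - fg) = g x \;\Longrightarrow\; y = \frac{g x}{1 - fg}.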
Mindlessly invoking “feedback-induced oscillation” doesn’t change that. Look, I was consulting electrical engineers, and discussing feedback as well as oscillation, when Lord Monckton was still a teenager. Take it from me, there is nothing in the closed-loop-gain equation that requires a circuit to “transit from the positive to the negative rail” at unity loop gain.
From his statement that “oscillation is induced by increasing the loop gain transiently above unity and allowing it to relax back below unity,” it appears that Lord Monckton consulted engineers but didn’t understand what they told him. Lord Monckton’s error probably results from hearing about circuits whose equilibrium, “DC” loop gain is less than unity but whose AC loop gain at some frequency is unity. Such a circuit would indeed oscillate, i.e., repeatedly “transit,” as Lord Monckton puts it, but the circuit needs the right reactances to make that happen.
Without such reactances, unity equilibrium loop gain would cause an electronic circuit to peg at one rail, not spontaneously transit to the other. In the first of my above-mentioned posts I demonstrated that fact: without more a circuit whose loop gain starts above unity and is allowed, as Lord Monckton expresses it, to “relax back” to below unity, would not “transit from the positive to the negative rail.” And Lord Monckton has never been able to demonstrate anything wrong with that analysis. Nor will he; circuits such as flip-flops routinely exhibit such behavior.
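A toy iteration makes the latch-up behaviour easy to check for oneself; the input, the gains and the ±1 “rails” are assumed values for illustration:

    # Iterate the feedback loop y <- g*(x + f*y), clipped at assumed rails of +/-1.
    def settle(x, f, g=1.0, y=0.0, rails=1.0, steps=500):
        for _ in range(steps):
            y = max(-rails, min(rails, g * (x + f * y)))
        return y

    x = 0.1
    pegged = settle(x, f=1.2)             # loop gain 1.2 > 1: output pegs at the +1 rail
    relaxed = settle(x, f=0.5, y=pegged)  # gain relaxed below 1: settles at g*x/(1-f*g) = 0.2
    print(pegged, relaxed)                # 1.0, then 0.2 -- no transit to the negative rail

Without the right reactances there is no oscillation: the output pegs, and when the gain relaxes the output simply settles back on the same side.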
If a system implements feedback, it has to obey the feedback equation. Lord Monckton’s going on about oscillators doesn’t change that.
Others may disagree, but I share Judith Curry’s opinion that the Monckton et al. paper doesn’t provide any new insights into why the GCM outputs disagree with observations.
The equation’s parameters may have been obtained from the IPCC’s documents, but I am unaware of the IPCC’s having embraced the equation’s main innovation, which is the preposterous notion that the response of a memory-implementing time-invariant system can reliably be obtained by treating it as a memoryless time-variant system.
Ah, Christopher Monckton, you have to love him; he’s the Humpty-Dumpty of climate science.
What the paper says of the method is that low-feedback-system values “derived from” Roe can “safely be taken as unity.”
That makes his method “entirely explicit” only if (1) from the statement that the first-row values were “derived from” Roe one understands that he adopted values inconsistent with Roe, and (2) from the statement that the low-feedback values could “safely be taken as unity” one understands that if you use those values the equation will erroneously depict low-feedback Roe-type systems as initially producing higher responses than higher-feedback ones.
Apparently, words in Lord Monckton’s head mean what he chooses them to mean.
It is indeed vexing to have so mathematically incoherent a paper seen so widely as exemplifying the quality of skeptic thought.
All through the history of science you will find people who reached the right result by unconventional methods. It was discovered that CO2 ice will cause rain because a scientist threw a whole bunch of things into a humid oven and checked whether each caused rain.
So don’t criticize Monckton for getting the right result by the “wrong” method. Instead criticize those who got the wrong result by the “right” method.
I believe that equilibrium climate sensitivity is low, but I don’t know everything, so I’m still accumulating evidence, and I look to papers to find it. Monckton et al. purported to provide such evidence, but when I investigated I found that they just went through a lot of verbiage to hide that the only basis for that conclusion was that it was their opinion. To me that isn’t evidence. It shouldn’t be for you, either.
Besides, how do you know they got the right result? Their demonstration of skill (Section 9) was bogus.
Since I am unaware of having contested any “initial condition,” I assume that Lord Monckton intended to reprise his refrain that my criticism of the transience fraction is directed mainly to his choice of values. Although I do criticize how falsely he characterized his particular choice, the reason that I focus on transience fraction is that its usage in Equation 1 betrays a profound misunderstanding of the relevant mathematics.
That equation was presented as a way to study climate models on which the IPCC and others rely. If the model being studied is memoryless—and almost none is—then the transience fraction is an unnecessary but harmless complication to his “irreducibly simple” model. But, in the more typical case of models that are not essentially memoryless, using transience fractions as Equation 1 does produces results that can differ wildly from those of the model it’s supposed to approximate.
For example, if one uses the equation to determine what equilibrium climate sensitivity a high transient climate response implies in accordance with the Roe response curves, the value that Monckton et al.’s equation gives is less than a third of the value that those curves actually imply, as I demonstrated by reference to a previous post’s Fig. 7.
Mr Born is well off topic. Besides, the Roe response curves are based on the assumption that the CO2 radiative forcing is 4 Watts per square meter per doubling, whereas even the IPCC only thinks it is 3.7, and if soon-to-be-published research on the Lorentzian and Voigt equations in the models proves correct, that should be only 2.6. The transience ratio can in any event be adjusted to take account of whatever sort of model the user desires to emulate. Mr Born, as always, is futilely picking nits.
Hardly. It was Lord Monckton who brought up the question of whether I was criticizing the model itself. That’s the question I addressed. Answers are not off topic just because Lord Monckton says they are.
Ah, more of Lord Monckton’s cargo-cult logic. He spouts a lot of irrelevance, safe in the knowledge that to those unable to follow the substance it will appear that he has made a relevant reply. He hasn’t.
For one thing, it’s a little late to complain about the Roe curves; those were Lord Monckton’s choices, not mine, and most if not all of his paper’s calculations were based on them.
More to the point, the issue was whether Monckton et al.’s Equation 1 would fairly estimate the responses of models from whose step responses given transience-fraction values were taken. Independently of whether transience-fraction values are taken from Roe or some other family of curves, I demonstrated that Monckton et al.’s Equation 1 can lead to results that are off by as much as a factor of three or more from what the chosen curve would dictate.
That’s not nit-picking. It’s the heart of the matter. It shows that Monckton et al. had no idea of what they were doing; they attempted mathematics that was beyond their competence.
My guess is still that much of this warming is caused by albedo, not CO2. All the interglacial warmings max out at about the same temperature, because that is when albedo cannot reduce any further. And if albedo cannot reduce further, the globe cannot warm more.
But in the modern era, dust from Western and now Chinese industry is leaving deposits on the winter snows. So although northern-hemisphere snow extent has been increasing for three decades, it has been melting quicker in the spring. Quicker melt equals a lot more insolation absorption – as much as 250 W/m² extra, regionally, during those important spring months.
Ralfellis:
Everyone’s eyes see a different thing when looking at graphs. You can represent the same data with different kinds of graphs and get people to see different things. A straight line drawn on a graph implies a trend.
But when I look at your snow graph, I see two distinct periods that I could draw FLAT lines through: the first from 1967 to 1989 (22 years), and a second from 1990 to 2012 (another 22 years). You can pick your points, but what it looks like is that for 22 years we had an average of about 30 million square km, and for the next 22 an average of about 29. Is that unusual? Will it go back up to 30+? Yeah, it will. But it may be 20 years, 50 years, 100 years or a thousand years from now. The declining trend line is really meaningless unless it is in context. How much snow coverage was there from 1930 to 1950? Might it have been lower than 1990 to 2012?
Not being critical. We all draw these lines. But we have found recently that lines in the sand are just that: lines in the sand that will disappear with the next strong wind.
Now I have to go find my remote so I can watch football.
Have a great Sunday afternoon.
I’m pretty sure it’s a high point and we will soon experience global cooling.
http://www2.sunysuffolk.edu/mandias/global_warming/images/temperature_trends_1880-2009.png
You forgot the trend lines for 1910–1940, Barry.
Lord Monckton, thank you as ever for your efforts. There’s something about what you are doing that I don’t quite understand. I raised it here with somebody a little while ago, thought I’d got it from his reply, and then realised that I hadn’t.
I understand the outcome of your calculation. You can go back about 18 years and find that there is no average upward trend over the period from that time to the present. Go back any further and you find there is. I think that’s what you are saying.
However, I’m trying to imagine how, given the data, I would write a program to do this calculation. I would start with the most recent data, for 2014, say, and work my way backwards in time year by year. But then I would find (I believe) that 2013 was cooler than 2014, and my program would have to stop at that point. So my question is: what principle is getting you past that point?
Apologies if I’m being thick about this and have misunderstood it.
Simply tell your program to check at 1 year, then 2, then 3, all the way 20 or 30, then give you the results for each stretch of time. Example: 2014 to 2013, 2014 to 2012, all the way to 2014 to 1990 …
..TO 20 or 30 …
Thanks Marcus. Yes, I figured it must be that about a minute after I pressed the send button. Basically you throw away the periods where the trend was increasing, and retain the longest period for which it was not increasing. I guess that’s what you mean too? Don’t know why I had such trouble with it.
Nothing is ” Thrown Away ” !!
Marcus:
Unless it concerns ‘climate science’ when you are told to wipe your hard-drives!
Marcus, you say “Nothing is ” Thrown Away ” !!”
Isn’t it? What happens to it then? I just assumed that such periods would be of no interest with regard to the target of the calculation. Am I still missing some point here?
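For what it’s worth, a minimal sketch of the whole procedure (illustrative only; not the actual code behind these monthly posts) may settle the point: every stretch ending at the latest month is computed, none is discarded, and the headline figure is simply the longest stretch whose least-squares trend is not positive.

    import numpy as np

    def pause_length_months(anoms):
        # Test every stretch that ends at the latest month, starting from the
        # whole record and working forward; nothing is discarded. The first
        # (i.e. earliest-starting, hence longest) stretch whose least-squares
        # trend is zero or negative gives the length of the "pause".
        for start in range(len(anoms) - 1):
            window = anoms[start:]
            slope = np.polyfit(np.arange(len(window)), window, 1)[0]
            if slope <= 0:
                return len(window)
        return 0

    # Usage with stand-in data (the real RSS anomalies are at remss.com):
    rng = np.random.default_rng(0)
    print(pause_length_months(rng.normal(0.2, 0.1, 240)), "months")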
“Is the ocean warming?” – The mystery of ocean warming, if caused by downwelling IR radiation, is not how the heat got to the lower regions, but how it got into the ocean at all.
The preferred hypothesis of the CAGW brigade is that the downwelling radiation changes the temperature gradient in the thin-film surface layer, which is measured in microns, and reduces heat losses by conduction through that layer. In other words, CO2 does not cause any direct heating of 75% of the earth’s surface, but supposedly reduces the loss of the heat deposited by insolation.
Some experiments with a NZ research ship indicated that downwelling radiation from clouds could affect surface temperature by this mechanism, but I have never seen any reference to a paper that attempts to quantify this effect for CO2. It is simply assumed that CO2 adds heat directly to the ocean, by calculating area and multiplying by the forcing, which has to overstate massively any ocean warming contributed by the gas.
If I recall correctly, that paper is not a peer reviewed paper, and the evidence was questionable.
There was a paper published in 2000 on the RV Tangaroa experiments, presumably peer reviewed, but I have seen nothing since that even attempts to quantify the heat retention effects of CO2 forcing. It has to be far less than simply assuming that it equals the heat gained if the forcing directly warms the Ocean – which it cannot.
This seems to me quite fundamental to the quantification of ocean heat gain, and yet it has not been addressed. This enables alarmists to continue with the ‘it’s in the ocean’ argument.
It is also not reasonable to call the answer to a question a cherry-pick if the question is reasonable. “How long has the earth gone without warming?” is a reasonable question.
If the proponents of CAGW notice, after a very warm El Nino, that the trend is greatly shortened, they will not be calling a reasonable question a cherry pick.
I personally think that the question “what, in the satellite record, is the warmest year the earth has experienced?” is likewise reasonable. By now, over 50 percent of all years should exceed 1998. So far zero do, and that is likely to remain true through 2016.
C.M. demonstrates that, according to the IPCC climate models, and assuming the IPCC’s projected harms are accurate (a large assumption), even if 2016 exceeds 1998 by, say, 0.05 degrees, we will still be warming at nothing more than a beneficial rate of about 0.3 °C per century, comparing the warming trend from super El Nino to super El Nino – still well below all IPCC computer projections.
Can someone explain Fig T10, Top of Atmosphere Imbalance, Stephens et al. 2012
Each one of the imbalances shows a plus/minus error of anywhere between 3 and 17 W/m². Yet at the top, in bold, it states a top-of-atmosphere imbalance of 0.6 ± 0.4 W/m². How can the error bar here be so small when all the others are so much larger?
Duncan,
I believe the TOA error bar is based on measurements actually made at the TOA by satellite, not an amalgam of other measurements/estimates made below. From the abstract:
“At the top of the atmosphere, this balance is monitored globally by satellite sensors that provide measurements of energy flowing to and from Earth. By contrast, observations at the surface are limited mostly to land areas. As a result, the global balance of energy fluxes within the atmosphere or at Earth’s surface cannot be derived directly from measured fluxes, and is therefore uncertain.”
Thank you, John.
Curiously, a pause does not shorten but “the pause” does. If you look back over the historical temperature record, you can see lots of pauses of various lengths. They are still there, and they never change (unless someone changes historical temperatures). “The pause”, however, is the pause that ends today. It changes every month because it is a different entity each month. [“month”, because temperatures are reported monthly].
In years gone by, the land-based thermometer record showed a noticeable cooling from ~1940 to the early 1970s. Now that cooling has all but disappeared: instead of a negative temperature trend, the trend is almost flat. This, of course, is due to the repeated adjustments to the record, which have cooled the past and warmed the present. That is something to be rightly concerned about, since we have so badly bastardised the temperature record that it is no longer fit for scientific purpose and can tell us nothing of significance.
richard verney,
But as we get older and die off, few will remember how the temps really were. And worse, I don’t think you could prove what the record used to say anymore. The “record” is now just a myth that tells the tale the government wants told.
@Forrest Gardener
The part that you seem to be missing is that we only know the exact length of a pause once we know that the pause has definitely ended. The reason “the pause” keeps changing length is that we don’t yet know when its end has been reached, so each month we treat the current month as if it were the end. Only in hindsight, once the pause has ended and become a historical fait accompli, will we know its exact length.
@Mark, Hansen et al. 1981 is still on the NASA website.
I’m sure Hansen had already started fudging the temperature by that time.
Thanks, Christopher, Lord Monckton, for bringing data to the debate.
The temperature trend for RSS MSU lower tropospheric global mean from 2002 to 2014.92 was -0.59°C per century.
See http://www.woodfortrees.org/plot/rss/from:1979/to:2014.93/plot/rss/from:1979/to:2002/trend/plot/rss/from:2002/to:2014.93/trend/plot/rss/from:1979/to:2014.93/mean:13
The RSS temperature trend from 2002 to 2015.92 was just -0.16°C per century, due to the 2015 El Niño.
See http://www.woodfortrees.org/plot/rss/from:1979/to:2015.93/plot/rss/from:1979/to:2002/trend/plot/rss/from:2002/to:2015.93/trend/plot/rss/from:1979/to:2015.93/mean:13
The Earth is now cooler than it has been since the “Little Ice Age” ended, and it keeps on cooling.
I see a lot of argument over graphs, statistics, theory and computer models, all over a tiny interval in Earth’s history. Richard Lindzen has called climate science “a science in its infancy.” The thing to do is go out and gather more data all over the Earth, both present observations and paleoclimate data, not quibble about graphs that aren’t statistically significant anyway, or build models that don’t work. Fixation on the global average temperature, on a yearly scale, in tenths of a degree, is a ridiculous reductio ad absurdum for a hugely complex system that’s been around for billions of years. Only when we have enough data on all the possible influences on climate and how they interact will we have a real science.
Here it is:
Temperature fluctuations over the past 17,000 years:
http://www.oarval.org/Foster_20k.jpg
Temperature fluctuations over the past 17,000 years showing the abrupt cooling during the Younger Dryas.
The late Pleistocene cold glacial climate that built immense ice sheets terminated suddenly about 14,500 years ago [12,500 BC], causing glaciers to melt dramatically.
About 12,800 years ago [10,800 BC], after about 2,000 years of fluctuating climate, temperatures plunged suddenly and remained cool for 1,300 years.
About 11,500 years ago [9,500 BC], the climate again warmed suddenly and the Younger Dryas ended.
Also showing the Holocene Warm Period, the Roman Warm Period and the Medieval Warm Period compared to today’s small rise in average temperature.
See “Geologic Evidence of Recurring Climate Cycles and Their Implications for the Cause of Global Climate Changes”, Don J. Easterbrook (2011), Department of Geology, Western Washington University (PDF).
And “The Intriguing Problem Of The Younger Dryas – What Does It Mean And What Caused It”, at http://wattsupwiththat.com/2012/06/19/the-intriguing-problem-of-the-younger-dryaswhat-does-it-mean-and-what-caused-it/
This graph is incorrect: the date axis is wrongly annotated, and the line labelled ‘Present Temperature’ is misplaced.
One of the most misleading things that can be put on a graph is a trend line. And when only two points are considered, a trend line is totally useless.
Mr. Layman here.
The fundamental “point” is the claim that Man’s CO2 will lead to the “C” in “CAGW”. Hansen, Mann, et al. have made that claim, and the climate models politicians use to justify their policies are based on it.
The rise in CO2 and the lack of a rise in the “projected” temperature point to the simple fact that the claims that Man’s CO2 is leading us to Catastrophe, and that political control of Man’s CO2 is therefore needed, are WRONG.
Good or bad, the climate may yet prove a problem. But climate is beyond our control.
According to Merriam-Webster Dictionary and Thesaurus:
Simple Definition of trend : a general direction of change : a way of behaving, proceeding, etc., that is developing and becoming more common.
Ask says: A trend in mathematics is a pattern in a set of data points. Knowing the trend allows outcomes to be predicted by a mathematical model. Estimating the trend of data requires a technique known as regression or curve fitting.
Can you see the difference?
A simple straight-line trend is all about the past, not a predictor.
Mr Denio seems to be suggesting that the headline graphs in this monthly series are determined from only two data points. Not so. They are determined from all of the monthly mean global lower-troposphere anomalies, using the usual equations for determining the slope and y-intercept of the straight line that minimizes the sum of the squares of the residuals (the differences between each data point and the line).
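In symbols: with months t_i and anomalies y_i, the fitted line y = a + bt minimises the sum of squared residuals, and the usual least-squares estimates are

    b = \frac{\sum_i (t_i - \bar t)(y_i - \bar y)}{\sum_i (t_i - \bar t)^2}, \qquad a = \bar y - b\,\bar t .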
No predictions can be made on the basis of a past trend on a stochastic dataset. However, the fact that there has been a long pause when much warming had been predicted suggests that Monckton of Brenchley et al. (2015a), in the Science Bulletin of the Chinese Academy of Sciences, may have been correct to find climate sensitivity to be of the order of 1 K per CO2 doubling rather than 3 K.
It can be indicative or misleading. Depends on how you do it.
“I propose that if 20 years without global warming occur, the IPCC, the UNFCCC and all their works should be swept into the dustbin of history, and the prosecutors should be brought in …”.
=======================================
Quite so. Twenty years would be, and should be, an important watershed.
There was a temperature peak around the late ‘30s – early ‘40s then a dip; the length and depth of that dip is uncertain because the data has been mutilated by ‘activist’ scientists.
Added to the 1950 — 1980 temperature stasis (HadCRUT4) it would mean that during the so-called ’anthropocene’ the temperature has been unaffected by CO2 emissions over 70% of the time, data fiddling notwithstanding.
Lord Monckton, this dangerous AGW hypothesis is a minefield of non sequiturs. Of course, during the periods of stasis the effect of increasing CO2 emissions could have been countered by other, unknown factors – factors which the IPCC dismisses – but unknown factors could equally have augmented CO2 forcing during periods of rising temperatures.
I think one has to argue against the IPCC Climate Change™ science on its own terms.
The fact remains that the IPCC had predicted global warming that has now been absent for almost two decades. I refuse to do science the intergovernmental way. I start with the data, not with a preconceived notion to which the data must be tortured to conform.
Chris Hanley:
You say
Viscount Monckton has very reasonably maintained his consistency by saying
OK. That being so, I write to provide the relevant argument “against the IPCC Climate Change™ science on its own terms” which you say you want.
Box 9.2 on page 769 of Chapter 9 of the IPCC AR5 Working Group 1 report (i.e. the most recent IPCC so-called science report) is here and says
GMST trend is global mean surface temperature trend.
and
A “hiatus” is a stop.
Firstly, it is important to note that the quoted IPCC Box refers to a “hiatus” and not a “pause”. This is important because the commonly used word “pause” is a misnomer. The word “pause” misleads by suggesting the present cessation of discernible change to linear trend of GMST is an interruption to global warming although this cannot be known: the lack of discernible change to global temperature trend at 95% confidence will end with global warming or global cooling and nobody can know which until it happens. The IPCC rightly calls the “pause” a “hiatus”; i.e. a stop to global warming.
Secondly, the quoted IPCC Box provides two definitions of the misnamed ‘pause’; viz.
(a) The ‘pause’ is “a GMST trend over 1998–2012 that is higher than the entire HadCRUT4 trend ensemble”.
And
(b) The ‘pause’ is “the observed GMST trend hiatus”.
The quoted IPCC Box states that the ‘pause’ exists according to either of its definitions.
Definition (a) is most serious for the ‘consensus science’ because it demonstrates that the climate models provide wrongly high indications of global warming (i.e. GMST trend).
Viscount Monckton is addressing Definition (b); i.e. “the observed GMST trend hiatus”.
This consideration is important because in 2008 the US Government’s National Oceanic and Atmospheric Administration (NOAA) reported
Ref. NOAA, ‘The State of the Climate’, 2008
http://www1.ncdc.noaa.gov/pub/data/cmb/bams-sotc/climate-assessment-2008-lo-rez.pdf
Hence, if “zero trends for intervals of 15 yr or more” exist then that creates “a discrepancy with the expected present-day warming rate” provided by the climate models.
In his above essay, Viscount Monckton reports
Hence, according to IPCC Climate Change™ science, the data provided by Viscount Monckton demonstrates “a discrepancy with the expected present-day warming rate” provided by the climate models. Or, to put that in plain language, Viscount Monckton has provided data that – according to IPCC Climate Change™ science – shows the climate models don’t work and provide indications which are wrong.
Richard
One of the problems with the entire CAGW debacle is that so much circumstantial data (much of it manipulated) is continuously thrown up to obfuscate the issue. From a scientific point of view, the AGW proposition can be stated very succinctly as follows:
Man’s use of fossil fuel is raising atmospheric CO2. CO2 is a greenhouse gas, and the action of greenhouse gases reduces the rate of energy loss to space for a given temperature. That, coupled with constant insolation, creates an energy imbalance which causes the Earth to warm until equilibrium is re-established.
Can we test this hypothesis? Well, there is one very simple, unequivocal test. If it is true, Earth’s energy loss to space (in the form of outgoing long-wave radiation – OLR) should be falling in line with rising CO2. This has been measured by satellites since 1978, and NOAA has published the data (note: NOAA is hardly a sceptical organisation). It shows that OLR has been rising, not falling, since 1978. One clear, unequivocal fact is enough to destroy a hypothesis – this is it.
This data indeed destroyed the original thesis, but the response was to modify the thesis. Now the claim was that the rise in CO2 caused an initial drop in OLR and some warming, but that this raised water-vapour levels in the atmosphere; and because water absorbs near-infrared energy, some of the insolation that was otherwise reflected back out to space was instead absorbed. Thus the warming was perpetuated by more solar energy being absorbed, while OLR in fact rose because the temperature of the Earth was rising.
This new thesis can also be readily tested, on two fronts. Firstly, absorbing energy results in warming: if extra solar energy is being absorbed by water vapour in the atmosphere, that must warm the atmosphere. For the energy to be transferred from the atmosphere to the surface there must be a temperature gradient, meaning that the atmosphere must be warming faster than the surface (second law of thermodynamics). This is the basis for the claimed mid-troposphere hot spot in the tropics, and for the model claim that the atmosphere in this region should be warming about twice as fast as the surface. But both satellite data and thousands of balloon flights show there is no hot spot. The second test is with respect to OLR. If the thesis is true, the change in OLR should be the rise due to the temperature increase of the planet (easily calculated from the Stefan-Boltzmann law) less the fall due to the rising greenhouse-gas concentration. The net could be a rise or a fall, but one thing is certain: if it is a rise, the rate of rise MUST be less than predicted by the SB law alone. Unfortunately for the thesis, the observed rate of rise is greater than predicted by the SB law.
These two tests utterly destroy the new hypothesis. They do, however, raise the question of why Earth is warming if OLR is rising. If OLR is rising faster than predicted by the SB law, it means Earth’s emissivity to space is increasing. Greenhouse gases reduce the effective emissivity for long-wave radiation, and so do clouds. GHG concentrations are clearly not falling, but if cloud cover were reducing, Earth’s emissivity would rise. Reducing cloud cover would also reduce Earth’s albedo, which would mean a greater fraction of incoming solar energy being absorbed. The latter effect dominates the former, so reducing cloud cover will both raise OLR and cause Earth to warm. Svensmark’s theory is that increasing solar magnetic fields reduce cosmic rays, which reduces cloud seeding and thus cloud cover. This is at least consistent with the above observations.
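To put rough numbers on the Stefan-Boltzmann comparison described above, here is a back-of-envelope sketch; the temperature figures are assumed round values, not Mr Hammer’s data:

    # Illustrative round numbers only.
    SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
    T_eff = 255.0     # assumed effective emission temperature of Earth, K
    dT = 0.4          # assumed warming over the satellite era, K

    # Linearised SB law: with emissivity held fixed, warming alone should
    # raise OLR by roughly 4*sigma*T^3*dT.
    dOLR_expected = 4 * SIGMA * T_eff**3 * dT
    print(round(dOLR_expected, 2), "W/m^2")   # ~1.5 W/m^2

    # The test: an observed OLR rise *larger* than this implies the effective
    # emissivity rose (e.g. less cloud), not fell as a strengthening
    # greenhouse effect would require.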
Michael Hammer:
No hypothesis can be falsified when post hoc excuses are used to evade its falsification by confounding data.
Richard
Michael Hammer’s analysis is interesting and worthy of being expanded into a head posting.
Monckton of Brenchley proposes
Seconded!
Richard
Seconded a Second time !!!
The length of the pause for UAH6.0beta4 is 18 years and 6 months from July 1997 to December 2015:
Temperature Anomaly trend
Jul 1997 to Dec 2015
Rate: -0.009°C/Century;
CI from -1.129 to 1.111;
t-statistic -0.016;
Temp range 0.141°C to 0.139°C
This is from:
http://moyhu.blogspot.com.au/p/temperature-trend-viewer.html
Werner, and all, consider that T is independent of past and future; it is what it is, and it was what it was. A cogent question is: what is the warmest year in the satellite record?
The clear and undisputed answer is 1998, and not by a little. Both years, 2016 and 1998, will have had comparable El Niños. Therefore a logical case can be made that the effect of CO2 can be partially gauged by comparing the two. In my view it is extremely likely that 2016 will not exceed 1998 in either satellite dataset.
But, just for drill, let us assume it does, by 0.05 °C. Then, outside of ENSO conditions, the atmosphere will have warmed at a rate of about 0.3 °C per century: 0.05 °C over the roughly 18 years between the two super El Niños works out to about 0.28 °C per century. The catastrophe is where?
Which agrees with the long term CET
https://xmetman.wordpress.com/2016/01/09/long-term-trends-in-seasonal-cet/#comment-2893
I said this on the previous post here at WUWT.
It seems to apply here. The politics is the fuel that keeps CAGW burning.
Politics, greed, envy on national levels … not an honest consideration of the facts. Some of “the logs” thrown on the fire fall into the pride category. (Think tree rings.)