The Pause draws blood – A new record Pause length: no warming for 18 years 7 months

By Christopher Monckton of Brenchley

For 223 months, since January 1997, there has been no global warming at all (Fig. 1). This month’s RSS temperature shows the Pause setting a new record at 18 years 7 months.

It is becoming ever more likely that the temperature increase that usually accompanies an El Niño will begin to shorten the Pause somewhat, just in time for the Paris climate summit, though a subsequent La Niña would be likely to bring about a resumption and perhaps even a lengthening of the Pause.


Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere temperature anomaly dataset shows no global warming for 18 years 7 months since January 1997.

The hiatus period of 18 years 7 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. The start date is not cherry-picked: it is calculated. And the graph does not mean there is no such thing as global warming. Going back further shows a small warming rate.

The Pause has now drawn blood. In the run-up to the climate conference in Paris this December, the failure of the world to warm at all for well over half the satellite record has provoked the climate extremists to resort to desperate measures to try to do away with the Pause.

First there was Tom Karl with his paper attempting to wipe out the Pause by arbitrarily adding a hefty increase to all the ocean temperature measurements made by the 3600 automated ARGO bathythermograph buoys circulating in the oceans. Hey presto! All three of the longest-standing terrestrial temperature datasets – GISS, HadCRUT4 and NCDC – were duly adjusted, yet again, to show more global warming than has really occurred.

However, the measured and recorded facts are these. In the 11 full years April 2004 to March 2015, for which the ARGO system has been providing reasonably-calibrated though inevitably ill-resolved data (each buoy has to represent 200,000 km³ of ocean temperature with only three readings a month), there has been no warming at all in the upper 750 m, and only a little warming below that depth, so that the trend over the period of operation shows a warming equivalent to just 1 C° every 430 years.
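For readers who want to check the arithmetic, here is a minimal sketch; the trend value used is simply the rounded ARGO figure quoted in the Key Facts list further down, not an independent calculation.

```python
# Back-of-envelope check of the "1 C° every 430 years" figure (illustrative only).
trend_per_century = 0.23                      # C°/century, the ARGO trend as rounded in this post
years_per_degree = 100.0 / trend_per_century  # years needed to accumulate 1 C° at that rate
print(f"Roughly 1 C° every {years_per_degree:.0f} years")   # ≈ 435, i.e. about 430 years
```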


Figure 1a. Near-global ocean temperatures by stratum, 0-1900 m. Source: ARGO marine atlas.

And in the lower troposphere, the warming according to RSS occurred at a rate equivalent to 1 C° every 700 years.


Figure 1b. The least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere temperature anomaly dataset over the same period shows warming at a rate equivalent to just 1 C° every 700 years.

Then along came another paper, this time saying that the GISS global temperature record shows global warming during the Pause and that, therefore, GISS shows global warming during the Pause. This instance of argumentum ad petitionem principii, the fallacy of circular argument, passed peer review without difficulty because it came to the politically-correct conclusion that there was no Pause.

The paper reached its conclusion, however, without mentioning the word “satellite”. The UAH data show no warming for 18 years 5 months.


Figure 1c. The least-squares linear-regression trend on the UAH satellite monthly global mean lower-troposphere temperature anomaly dataset shows no global warming for 18 years 5 months since March 1997.

For completeness, though no reliance can now be placed on the terrestrial datasets, here is the “warming” rate they show since January 1997:


Figure 1d. The least-squares linear-regression trend on the mean of the GISS, HadCRUT4 and NCDC terrestrial monthly global mean surface temperature anomaly datasets shows global warming at a rate equivalent to a little over 1 C° per century during the period of the Pause from January 1997 to July 2015.

Bearing in mind that one-third of the 2.4 W m–2 radiative forcing from all manmade sources since 1750 has occurred during the period of the Pause, a warming rate equivalent to little more than 1 C°/century is not exactly alarming. However, the paper that reported the supposed absence of the Pause was extremely careful not to report just how little warming the terrestrial datasets – even after all their many tamperings – actually show.

As always, a note of caution. Merely because there has been little or no warming in recent decades, one may not draw the conclusion that warming has ended forever. The trend lines measure what has occurred: they do not predict what will occur.

Furthermore, the long, slow build-up of the current El Niño, which has now become strongish and – on past form – will not peak till the turn of the year, is already affecting tropical temperatures and, as the thermohaline circulation does its thing, must eventually affect global temperatures.

Though one may expect the El Niño to be followed by a La Niña, canceling the temporary warming, this does not always happen. In short, the Pause may well come to an end and then disappear. However, as this regular column has stressed before, the Pause – politically useful though it may be to all who wish that the “official” scientific community would remember its duty of skepticism – is far less important than the growing divergence between the predictions of the general-circulation models and observed reality.

The divergence between the models’ predictions in 1990 (Fig. 2) and 2005 (Fig. 3), on the one hand, and the observed outturn, on the other, continues to widen. If the Pause lengthens just a little more, the rate of warming in the quarter-century since the IPCC’s First Assessment Report in 1990 will fall below 1 C°/century equivalent.


Figure 2. Near-term projections of warming at a rate equivalent to 2.8 [1.9, 4.2] K/century, made with “substantial confidence” in IPCC (1990), for the 307 months January 1990 to July 2015 (orange region and red trend line), vs. observed anomalies (dark blue) and trend (bright blue) at just 1 K/century equivalent, taken as the mean of the RSS and UAH v. 5.6 satellite monthly mean lower-troposphere temperature anomalies.


Figure 3. Predicted temperature change, January 2005 to July 2015, at a rate equivalent to 1.7 [1.0, 2.3] Cº/century (orange zone with thick red best-estimate trend line), compared with the near-zero observed anomalies (dark blue) and real-world trend (bright blue), taken as the mean of the RSS and UAH v. 5.6 satellite lower-troposphere temperature anomalies.

The page Key Facts about Global Temperature (below) should be shown to anyone who persists in believing that, in the words of Mr Obama’s Twitteratus, “global warming is real, manmade and dangerous”.

The Technical Note explains the sources of the IPCC’s predictions in 1990 and in 2005, and also demonstrates that, according to the ARGO bathythermograph data, the oceans are warming at a rate equivalent to less than a quarter of a Celsius degree per century.

Key facts about global temperature

• The RSS satellite dataset shows no global warming at all for 223 months from January 1997 to July 2015 – more than half the 439-month satellite record.

• There has been no warming even though one-third of all anthropogenic forcings since 1750 have occurred since January 1997, during the pause in global warming.

• The entire RSS dataset from January 1979 to date shows global warming at an unalarming rate equivalent to just 1.2 Cº per century.

• Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to less than 1.2 Cº per century.

• The global warming trend since 1900 is equivalent to 0.75 Cº per century. This is well within natural variability and may not have much to do with us.

• The fastest warming rate lasting 15 years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.

• Compare the warming on the Central England temperature dataset in the 40 years 1694-1733, well before the Industrial Revolution, equivalent to 4.33 C°/century.

• In 1990, the IPCC’s mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.

• The warming trend since 1990, when the IPCC wrote its first report, is equivalent to 1 Cº per century. The IPCC had predicted more than two and a half times as much.

• To meet the IPCC’s central prediction of 1 C° warming from 1990-2025, in the next decade a warming of 0.75 C°, equivalent to 7.5 C°/century, would have to occur.

• Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual centennial warming prediction of 4.8 Cº warming to 2100.

• The IPCC’s predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than 15 years that has been measured since 1950.

• The IPCC’s 4.8 Cº-by-2100 prediction is four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.

• The oceans, according to the 3600+ ARGO buoys, are warming at a rate of just 0.02 Cº per decade, equivalent to 0.23 Cº per century, or 1 C° in 430 years.

• Recent extreme-weather events cannot be blamed on global warming, because there has not been any global warming to speak of. It is as simple as that.


Technical note

Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not “cherry-picked” so as to coincide with the temperature spike caused by the 1998 El Niño. Instead, it is calculated so as to find the longest period with a zero trend.
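For readers who want to reproduce the calculation, a minimal sketch of the search is given below. It is illustrative only, not the author’s code: it assumes the RSS lower-troposphere anomalies have already been saved to a local two-column text file (decimal year, anomaly in C°); the file name and layout are assumptions, and the real file on the RSS website is formatted differently.

```python
import numpy as np

# Illustrative sketch of the "longest zero-trend period" calculation (not the author's code).
# Assumes a hypothetical local two-column file: decimal year, monthly anomaly (C°).
data = np.loadtxt("rss_tlt_monthly.txt")
t, anom = data[:, 0], data[:, 1]

earliest = None
for start in range(len(t) - 2):
    # Least-squares slope (C°/year) from this start month to the latest month
    slope = np.polyfit(t[start:], anom[start:], 1)[0]
    if slope <= 0:
        earliest = start          # earliest start month whose trend to date is not positive
        break

if earliest is not None:
    months = len(t) - earliest
    print(f"Zero or negative trend from {t[earliest]:.2f} to {t[-1]:.2f} ({months} months)")
else:
    print("No start month gives a zero or negative trend to date")
```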

The fact of a long Pause is an indication of the widening discrepancy between prediction and reality in the temperature record.

The satellite datasets are arguably less unreliable than the rest in that they show the 1998 Great El Niño more clearly than any other dataset. The Great El Niño, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing an independent verification that the satellite datasets are better able than the rest to capture such fluctuations without artificially filtering them out.

Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates below those that are published. The satellite datasets are based on reference measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking, via spaceward mirrors, the known temperature of the cosmic microwave background radiation, which is 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic microwave background that NASA’s anisotropy probe determined the age of the Universe as 13.82 billion years.

The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website. A computer algorithm reads them from the text file and plots them automatically, using a routine that adjusts the aspect ratio of the data window on both axes so as to show the data at maximum scale, for clarity.

The latest monthly data point is visually inspected to ensure that it has been correctly positioned. The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line.
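For reference, the slope and intercept referred to here are the standard closed-form least-squares estimates; a minimal implementation, equivalent to what any regression routine computes, is:

```python
# Ordinary least-squares slope and intercept for a set of (x, y) points.
def ols_trend(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
            / sum((xi - xbar) ** 2 for xi in x)
    intercept = ybar - slope * xbar
    return slope, intercept
```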

The IPCC and most other agencies use linear regression to determine global temperature trends. Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails. The method is appropriate because global temperature records exhibit little auto-regression, since summer temperatures in one hemisphere are offset by winter temperatures in the other. Therefore, an AR(n) model would generate results little different from a least-squares trend.
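That claim is easy to check directly: the lag-1 autocorrelation of the monthly anomalies can be computed in a couple of lines (again assuming the hypothetical local data file used in the earlier sketch):

```python
import numpy as np

# Illustrative check of lag-1 autocorrelation in the monthly anomaly series.
anom = np.loadtxt("rss_tlt_monthly.txt")[:, 1]
r1 = np.corrcoef(anom[:-1], anom[1:])[0, 1]
print(f"Lag-1 autocorrelation: {r1:.2f}")
# A small r1 would support the claim above; a large r1 would not change the least-squares
# slope much, but it would widen the uncertainty on the trend.
```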

Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat.

RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it at remss.com/blog/recent-slowing-rise-global-temperatures.

Dr Mears’ results are summarized in Fig. T1:


Figure T1. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of El Chichón (1982) and Pinatubo (1991) are shown, as is the spike in warming caused by the great El Niño of 1998.

Dr Mears writes:

“The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation.  This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.”

Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph:

“Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades.  Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site.  Is this really your data?’  While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate.  … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.”

In fact, the spike in temperatures caused by the Great El Niño of 1998 is almost entirely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 El Niño, and the sheer length of the Great Pause itself.
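That offsetting claim can also be tested directly by masking the months of the 1997-98 spike and recomputing the trend; a minimal, illustrative sketch follows (same hypothetical data file as above, and the mask dates are only approximate):

```python
import numpy as np

# Illustrative sensitivity test: trend since January 1997 with and without the 1997-98 spike.
data = np.loadtxt("rss_tlt_monthly.txt")      # hypothetical local file: decimal year, anomaly
t, anom = data[:, 0], data[:, 1]

since_1997 = t >= 1997.0
with_spike = np.polyfit(t[since_1997], anom[since_1997], 1)[0]

keep = since_1997 & ~((t >= 1997.25) & (t < 1998.75))   # crude mask over the El Niño spike
without_spike = np.polyfit(t[keep], anom[keep], 1)[0]

print(f"With spike: {with_spike * 100:+.2f} C°/century; "
      f"without spike: {without_spike * 100:+.2f} C°/century")
```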

Curiously, Dr Mears prefers the terrestrial datasets to the satellite datasets. The UK Met Office, however, uses the satellite data to calibrate its own terrestrial record.

The length of the Great Pause in global warming, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed. The El Niño may well strengthen throughout this year, reducing the length of the Great Pause. However, the discrepancy between prediction and observation continues to widen.

Sources of the IPCC projections in Figs. 2 and 3

IPCC’s First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century. The executive summary asked, “How much confidence do we have in our predictions?” IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded:

“Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change. … There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results.”

That “substantial confidence” was substantial over-confidence. For the rate of global warming since 1990 – the most important of the “broad-scale features of climate change” that the models were supposed to predict – is now below half what the IPCC had then predicted.

In 1990, the IPCC said this:

“Based on current models we predict:

“under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, a rate of increase of global mean temperature during the next century of about 0.3 Cº per decade (with an uncertainty range of 0.2 Cº to 0.5 Cº per decade), this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1 Cº above the present value by 2025 and 3 Cº before the end of the next century. The rise will not be steady because of the influence of other factors” (p. xii).

Later, the IPCC said:

“The numbers given below are based on high-resolution models, scaled to be consistent with our best estimate of global mean warming of 1.8 Cº by 2030. For values consistent with other estimates of global temperature rise, the numbers below should be reduced by 30% for the low estimate or increased by 50% for the high estimate” (p. xxiv).

The orange region in Fig. 2 represents the IPCC’s medium-term Scenario-A estimate of near-term warming, i.e. 1.0 [0.7, 1.5] K by 2025.

The IPCC’s predicted global warming over the 25 years from 1990 to the present differs little from a straight line (Fig. T2).


Figure T2. Historical warming from 1850-1990, and predicted warming from 1990-2100 on the IPCC’s “business-as-usual” Scenario A (IPCC, 1990, p. xxii).

Because the difference between a straight line and the slight uptick in the warming rate that the IPCC predicted over the period 1990-2025 is so small, one can look at it another way. To reach the 1 K central estimate of warming since 1990 by 2025, there would have to be almost three times as much warming in the next ten years as there was in the last 25 years. That is not likely.
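The arithmetic, using the figures quoted later in this note (0.27 C° observed since January 1990 against the IPCC’s 1 C° central estimate by 2025), is straightforward:

```python
# Simple arithmetic using the figures quoted in this note.
observed_1990_2015 = 0.27       # C°, trend on the RSS/UAH mean since January 1990
ipcc_central_by_2025 = 1.0      # C°, IPCC (1990) central estimate of warming 1990-2025
remaining = ipcc_central_by_2025 - observed_1990_2015     # ≈ 0.73 C° left for ~10 years
print(remaining / observed_1990_2015)                     # ≈ 2.7 times the last 25 years' warming
print(remaining / 10 * 100)                               # ≈ 7.3 C°/century equivalent rate
```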

But is the Pause perhaps caused by the fact that CO2 emissions have not been rising anything like as fast as the IPCC’s “business-as-usual” Scenario A prediction in 1990? No: CO2 emissions have risen rather above the Scenario-A prediction (Fig. T3).


Figure T3. CO2 emissions from fossil fuels, etc., in 2012, from Le Quéré et al. (2014), plotted against the chart of “man-made carbon dioxide emissions”, in billions of tonnes of carbon per year, from IPCC (1990).

Plainly, therefore, CO2 emissions since 1990 have proven to be closer to Scenario A than to any other case, because for all the talk about CO2 emissions reduction the fact is that the rate of expansion of fossil-fuel burning in China, India, Indonesia, Brazil, etc., far outstrips the paltry reductions we have achieved in the West to date.

True, methane concentration has not risen as predicted in 1990 (Fig. T4), for methane emissions, though largely uncontrolled, are simply not rising as the models had predicted. Here, too, all of the predictions were extravagantly baseless.

The overall picture is clear. Scenario A is the emissions scenario from 1990 that is closest to the observed CO2 emissions outturn.


Figure T4. Methane concentration as predicted in four IPCC Assessment Reports, together with (in black) the observed outturn, which is running along the bottom of the lowest prediction. This graph appeared in the pre-final draft of IPCC (2013), but had mysteriously been deleted from the final, published version, inferentially because the IPCC did not want to display such a plain comparison between absurdly exaggerated predictions and unexciting reality.

To be precise, a quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS and UAH monthly global mean lower-troposphere temperature anomalies – is 0.27 Cº, equivalent to little more than 1 Cº/century. The IPCC’s central estimate of 0.71 Cº, equivalent to 2.8 Cº/century, that was predicted for Scenario A in IPCC (1990) with “substantial confidence” was approaching three times too big. In fact, the outturn is visibly well below even the least estimate.
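As a check on the centennial-equivalent rates quoted here (illustrative arithmetic only, using the figures in the paragraph above):

```python
# Converting the quarter-century outturn and prediction into centennial-equivalent rates.
months = 307                        # January 1990 to July 2015
outturn = 0.27                      # C°, observed trend over that period (RSS/UAH mean)
predicted = 0.71                    # C°, IPCC (1990) Scenario-A central estimate over that period
print(outturn * 1200 / months)      # ≈ 1.06 C°/century: "little more than 1 C°/century"
print(predicted * 1200 / months)    # ≈ 2.8 C°/century
print(predicted / outturn)          # ≈ 2.6: "approaching three times too big"
```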

In 1990, the IPCC’s central prediction of the near-term warming rate was higher by two-thirds than its prediction is today. Then it was 2.8 Cº/century equivalent. Now it is just 1.7 Cº/century equivalent – and, as Fig. 3 shows, even that is proving to be a substantial exaggeration.

Is the ocean warming?

One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean. Since, globally, the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the “missing heat” has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date.

Actually, it is not known whether the ocean is warming: each of the 3600 automated ARGO bathythermograph buoys takes just three measurements a month in 200,000 cubic kilometres of ocean – roughly a box 316 km on a side and 2 km deep, with a surface area of about 100,000 km². Plainly, results obtained at a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of taking a single temperature and salinity profile at a single point in Lake Superior less than once a year) are not going to be a lot better than guesswork.
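The geometry behind the figures quoted in that paragraph, assuming a global ocean surface area of roughly 3.6 × 10⁸ km² (a standard round figure, not a number from ARGO):

```python
import math

# Geometry behind the per-buoy sampling volume quoted above (illustrative).
ocean_area_km2 = 3.6e8                        # assumed global ocean surface area, km²
n_buoys = 3600
depth_km = 2.0                                # depth of the layer profiled by each buoy
area_per_buoy = ocean_area_km2 / n_buoys      # ≈ 100,000 km² per buoy
volume_per_buoy = area_per_buoy * depth_km    # ≈ 200,000 km³ per buoy
side_km = math.sqrt(area_per_buoy)            # ≈ 316 km on a side
print(area_per_buoy, volume_per_buoy, side_km)
```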

Unfortunately ARGO seems not to have updated the ocean dataset since December 2014. However, what we have gives us 11 full years of data. Results are plotted in Fig. T5. The ocean warming, if ARGO is right, is just over 0.02 Cº decade–1, equivalent to 0.23 Cº century–1.


Figure T5. The entire near-global ARGO 2 km ocean temperature dataset from January 2004 to December 2014 (black spline-curve), with the least-squares linear-regression trend calculated from the data by the author (green arrow).

Finally, though the ARGO buoys measure ocean temperature change directly, before publication NOAA craftily converts the temperature change into zettajoules of ocean heat content change, which makes the change seem a whole lot larger.

The terrifying-sounding heat content change of 260 ZJ from 1970 to 2014 (Fig. T6) is equivalent to just 0.2 K/century of global warming. All those “Hiroshima bombs of heat” of which the climate-extremist websites speak are a barely discernible pinprick. The ocean and its heat capacity are a lot bigger than some may realize.
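A rough conversion from heat content back to temperature can be done with assumed values for the mass and specific heat of the 0-2000 m ocean layer; both numbers below are illustrative assumptions, not figures taken from NOAA:

```python
# Rough conversion of ocean heat content change to a temperature change (illustrative).
heat_zj = 260.0                   # ZJ, quoted heat content change, 1970-2014
mass_kg = 7.0e20                  # assumed mass of the 0-2000 m ocean layer, kg
c_p = 4.0e3                       # assumed specific heat of seawater, J kg^-1 K^-1
delta_t = heat_zj * 1e21 / (mass_kg * c_p)   # ≈ 0.09 K over the 44-year period
print(delta_t, delta_t / 44 * 100)           # ≈ 0.09 K total, ≈ 0.2 K/century
```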


Figure T6. Ocean heat content change, 1957-2013, in Zettajoules from NOAA’s NODC Ocean Climate Lab: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT, with the heat content values converted back to the ocean temperature changes in Kelvin that were originally measured. NOAA’s conversion of the minuscule warming data to Zettajoules, combined with the exaggerated vertical aspect of the graph, has the effect of making a very small change in ocean temperature seem considerably more significant than it is.

Converting the ocean heat content change back to temperature change reveals an interesting discrepancy between NOAA’s data and that of the ARGO system. Over the period of ARGO data, from 2004-2014, the NOAA data imply that the oceans are warming at 0.05 Cº decade–1, equivalent to 0.5 Cº century–1, or rather more than double the rate shown by ARGO.

ARGO has the better-resolved dataset, but since the resolutions of all ocean datasets are very low one should treat all these results with caution. What one can say is that, on such evidence as these datasets are capable of providing, the difference between the underlying warming rate of the ocean and that of the atmosphere is not statistically significant, suggesting that if the “missing heat” is hiding in the oceans it has magically found its way into the abyssal strata without managing to warm the upper strata on the way. On these data, too, there is no evidence of rapid or catastrophic ocean warming.

Furthermore, to date no empirical, theoretical or numerical method, complex or simple, has successfully specified a mechanism by which the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere could have reached the deep ocean without much altering the heat content of the intervening near-surface strata, or by which the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions relevant to land-based life on Earth.

Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean. Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes. It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere.

If the “deep heat” explanation for the Pause were correct (and it is merely one among dozens that have been offered), the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has.

Why were the models’ predictions exaggerated?

In 1990 the IPCC predicted – on its business-as-usual Scenario A – that from the Industrial Revolution till the present there would have been 4 Watts per square meter of radiative forcing caused by Man (Fig. T7):


Figure T7. Predicted manmade radiative forcings (IPCC, 1990).

However, from 1995 onward the IPCC decided to assume, on rather slender evidence, that anthropogenic particulate aerosols – mostly soot from combustion – were shading the Earth from the Sun to a large enough extent to cause a strong negative forcing. It has also now belatedly realized that its projected increases in methane concentration were wild exaggerations. As a result of these and other changes, it now estimates that the net anthropogenic forcing of the industrial era is just 2.3 Watts per square meter, or little more than half its prediction in 1990:


Figure T8: Net anthropogenic forcings, 1750 to 1950, 1980 and 2012 (IPCC, 2013).

Even this, however, may be a considerable exaggeration. For the best estimate of the actual current top-of-atmosphere radiative imbalance (total natural and anthropogenic net forcing) is only 0.6 Watts per square meter (Fig. T9):


Figure T9. Energy budget diagram for the Earth from Stephens et al. (2012)

In short, most of the forcing predicted by the IPCC is either an exaggeration or has already resulted in whatever temperature change it was going to cause. There is little global warming in the pipeline as a result of our past and present sins of emission.

It is also possible that the IPCC and the models have relentlessly exaggerated climate sensitivity. One recent paper on this question is Monckton of Brenchley et al. (2015), which found climate sensitivity to be in the region of 1 Cº per CO2 doubling (go to scibull.com and click “Most Read Articles”). The paper identified errors in the models’ treatment of temperature feedbacks and their amplification, which account for two-thirds of the equilibrium warming predicted by the IPCC.
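For context, the arithmetic linking a CO2 doubling to a temperature response can be sketched with the standard logarithmic forcing approximation and a zero-feedback (Planck) sensitivity parameter; the parameter value below is an illustrative textbook figure, not one taken from the paper:

```python
import math

# Illustrative link between a CO2 doubling and the zero-feedback temperature response.
forcing_per_doubling = 5.35 * math.log(2.0)   # ≈ 3.7 W m^-2, standard logarithmic approximation
planck_parameter = 0.31                       # K per (W m^-2), assumed zero-feedback value
print(forcing_per_doubling * planck_parameter)   # ≈ 1.1 K per doubling before feedbacks
```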

Professor Ray Bates gave a paper in Moscow in summer 2015 in which he concluded, based on the analysis by Lindzen & Choi (2009, 2011) (Fig. T10), that temperature feedbacks are net-negative. Accordingly, he supports the conclusion both by Lindzen & Choi (2009, 2011) and by Spencer & Braswell (2010, 2011) that climate sensitivity is below – and perhaps considerably below – 1 Cº per CO2 doubling.


Figure T10. Reality (center) vs. 11 models. From Lindzen & Choi (2009).

A growing body of reviewed papers finds climate sensitivity considerably below the 3 [1.5, 4.5] Cº per CO2 doubling that was first put forward in the Charney Report of 1979 for the U.S. National Academy of Sciences, and is still the IPCC’s best estimate today.

On the evidence to date, therefore, there is no scientific basis for taking any action at all to mitigate CO2 emissions.

Finally, how long will it be before the Freedom Clock (Fig. T11) reaches 20 years without any global warming? If it does, the climate scare will become unsustainable.


Figure T11. The Freedom Clock


285 Comments
Andy Phillips
August 7, 2015 5:17 am

“The hiatus period of 18 years 7 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. The start date is not cherry-picked: it is calculated. And the graph does not mean there is no such thing as global warming. Going back further shows a small warming rate.”
I don’t understand this: you have to skip 2014 (0.555 +/- 2.920 C/Decade), 2013 (0.346 +/- 1.572 C/decade), 2012 (0.417 +/-1.301 C/decade), and 2011 (0.416 +/- 0.912 C/decade) before you come to a sub-zero trend. So Monckton can’t have worked backwards looking for the first sub-zero trend. What am I missing?
Also, other temperature series, such as GISTEMP, show a positive trend from 1997. How is this not a cherry-pick?

Werner Brozek
Reply to  Andy Phillips
August 7, 2015 8:12 am

What am I missing?

There are many positive trends all over the place over the last 18 years and 7 months. However, at the present time, ALL trends with a start date of December 1996 or earlier are positive. Not a single one is negative prior to January 1997. But the trend is negative for start dates from January 1997 until about December 1997.

Andy Phillips
Reply to  Werner Brozek
August 7, 2015 4:09 pm

Okay, thanks – I understand that.

John Endicott
Reply to  Andy Phillips
August 7, 2015 9:59 am

Andy Phillips says: “Also, other temperature series, such as GISTEMP, show a positive trend from 1997. How is this not a cherry-pick?”
Lord M addressed the other data sets in his post – “All three of the longest-standing terrestrial temperature datasets – GISS, HadCRUT4 and NCDC – were duly adjusted, yet again, to show more global warming than has really occurred.”

Andy Phillips
Reply to  John Endicott
August 7, 2015 4:17 pm

I find that hard to swallow. Why would they do that? If you were trying to figure out what’s happening with global surface temperature, wouldn’t you do it to the best of your ability? I have no reason to suppose they would cheat. I would not risk my professional reputation on such a venture.

Reply to  John Endicott
August 7, 2015 9:39 pm

Well then, Mr. Phillips, you are completely unqualified and mentally incapable of performing the duties of a “climate scientist”.
Although perhaps you could be a real scientist, or an engineer, or something else which requires honesty and is useful.

MarkW
Reply to  John Endicott
August 8, 2015 8:54 am

Why would they do that? I don’t know. Several possibilities suggest themselves.
Ideological certainty that CO2 is a problem and must be addressed: if the current data isn’t of sufficient quality to prove what you already know to be true, then adjust the data until the errors have been removed.
Or you are just protecting your grant gravy train, hoping to maintain the fiction of global warming until you are safely retired. Then the repercussions will be someone else’s problem.

richardscourtney
Reply to  John Endicott
August 8, 2015 9:14 am

Andy Phillips:
I think you will find it informative to look at this and to read all of this.
I hope this helps.
Richard

Sturgis Hooper
Reply to  Andy Phillips
August 7, 2015 1:11 pm

GISS, HadCRU and NOAA are packs of lies.

Science or Fiction
Reply to  Andy Phillips
August 8, 2015 5:13 am

Yes – it is hard to swallow. I recommend that you harden yourself before having a look at the following:
1. The best correlation I have ever seen within climate science:
https://stevengoddard.wordpress.com/2014/10/02/co2-drives-ncdc-data-tampering/
2. Very good visualisation of the adjustments:
http://realclimatescience.com/alterations-to-climate-data/

MarkW
Reply to  Andy Phillips
August 8, 2015 8:51 am

“is the farthest back one can”
In your world, is not 1997 further back than 2014 and 2013?

Dudley Horscroft
August 7, 2015 6:04 am

About this time last year, or perhaps a month or so earlier, NOAA released the result of the new United States temperature record (then 10 years of consistent records) using properly sited, accurate thermometers, which was intended to be the basis for a consistent temperature record for the United States. Has an update on this been issued? I would think that there would be annual updates. Surely we do not have to wait for another 10 years for this?

jim south london
August 7, 2015 6:43 am


“Eighteen going on Nineteen”

August 7, 2015 7:53 am

Re: No warming for 18 years 7 months
I am a bit sad for all the young ‘freshers’ starting at universities this autumn. Many will be ‘conscripted’ into the AGW echelons, but none have experienced ‘horrors’ of the global warming.
All of the life during the boring GT’s pause,
not exactly a fighting cause.
Young friends, believe me nothing compares to the zeal of the 1968 ‘revolution’ !

Alx
August 7, 2015 8:00 am

We know our human ancestors lived through an ice age, and without human intervention the earth has warmed considerably since. Selecting the period from then to the present shows, fortunately, an overall warming trend. Selecting a pre-ice-age start date would show a cooling trend down into the ice age.
Alarmists like to focus on post-industrial history (minus the last 20 years or so) and ignore all pre-industrial history. When not ignoring pre-industrial history, they suggest that even though we are not sure what caused warming and cooling prior to the industrial age, we are sure humanity has taken charge of warming the planet via CO2. A rather novel approach to analysis: cherry-picking data periods and using lack of knowledge as a basis for building new knowledge.
That there is no agreement on what absolute or calculated periods to consider, and that there are multiple datasets that in diverse ways represent the singular “global temperature”, only demonstrates how immature climate science is.

August 7, 2015 10:59 am

Tangential to this discussion, I thought you folks might be interested in a BBC (British Broadcasting Corporation) programme broadcast last Tuesday. The programme was called ‘What’s the point of the Met Office?’. In it, various parties expressed criticism of the M.O., particularly its medium- and long-term predictions. This has caused outrage from the usual suspects (what’s not to like?). For anyone wishing to listen, and have a laugh, here is the link.
http://www.bbc.co.uk/programmes/b06418l5

Bob S
August 7, 2015 12:44 pm

Ref: Thomas, August 6, 2015 at 7:21 pm “I agree. I was looking for OCO-2 data yesterday but found nothing. Watts up with that?”
I know this info is not directly responsive to your request (I’m trying to get access to the science data) but this may be useful to help understand operational issues with on-orbit satellites.
Goddard’s report on OCO-2 satellite data collection issues over the last year is summarized below. I have a couple of decades of satellite system engineering experience on the industry side and these are my comments (not Goddard’s):
• Optical sensors are subject to contamination from outgassing products on orbit
• Occasionally, the optical path / sensors are warmed to remove these products and improve data collection
• Sometimes optical sensor covers (i.e. doors) are closed to avoid thermal issues / damage from an external source that is outside the sensor design envelope (i.e. Sun, Moon, Earth limb) – depending on the sensor type
• Occasionally, the sensor cover is closed to update sensor calibration against known target properties
• Satellite maneuvers are sometimes required for orbit maintenance, debris avoidance, and/or thermal management of the satellite
• The issues shown by Goddard do not indicate any significant problem with the satellite
Taken from … http://disc.sci.gsfc.nasa.gov/OCO-2/data-holdings/oco-2-summary#KnownDataIssues
Known Data Issues
Dates | Issue | Impact
7/2/14 – 8/4/14 | Door closed/Warm | Science data not created or invalid
8/4/14 – 8/5/14 | Warm | Science data invalid
8/9/14 – 8/15/14 | Door closed | Science data invalid
8/23/14 – 8/24/14 | Maneuvers/Door closed | No Science taken
8/31/14 – 9/3/14 | Warm (decontamination) | Science data invalid
9/14/2014 | Maneuvers/Door closed | No Science taken
10/23/2014 | Door closed | Science data invalid
10/23/14 – 10/26/14 | Door closed/Warm (decontamination) | Science data not created or invalid
10/26/14 – 10/28/14 | Door closed | Science data invalid
12/12/2014 | Maneuvers/Special calibration (Full-orbit dark) | No Science taken
1/1/2015 | Maneuvers/Door closed | No Science taken
1/6/15 – 1/8/15 | Warm (decontamination) | Science data not created or invalid
1/11/2015 | Poor telemetry quality | Could not create L1B product
1/17/15 – 1/23/15 | Warm/decontamination/maneuver | Science data not created or invalid
1/24/2015 | Special calibration (Full-orbit solar) | No Science taken
1/29/15 – 1/30/15 | Ground station problems | Science data missing one band
2/8/2015 | Ground station problems | Science data missing
2/18/2015 | Special calibration (Full-orbit solar) | No Science taken
2/20/2015 | Special calibration (Full-orbit Dark) | No Science taken
3/15/15 – 3/16/15 | Maneuvers/Door closed | No Science taken
3/22/2015 | Maneuvers/Door closed | No Science taken
3/25/2015 | Poor telemetry quality | Could not create L1B product
3/26/2015 | Special calibration (Full-orbit solar) | No Science taken
4/2/2015 | Maneuvers/Door closed | No Science taken
4/16/15 – 4/17/15 | Maneuvers/Door closed | No Science taken
4/19/2015 | Special calibration (Full-orbit solar) | No Science taken
4/20/2015 – 5/10/2015 | Warm (decontamination), instrument safing | Science data not created or invalid

Reply to  Bob S
August 8, 2015 11:35 am

Bob S. Thanks. That’s an interesting post. Dr. Ann Marie Eldering announced at the end of the AGU fall meeting in 2014 that more data would be released in March of 2015. I emailed her and asked when they now expect to release more data. Particularly more data like the video presented at about 11 minutes into the video linked below.

Say What?
August 7, 2015 1:19 pm

Well, I welcome El Nino. It usually gives us milder winters and after the last two cold winters we could use a break.

August 7, 2015 4:20 pm

I know that the methods used will affect the results, but when I check the woodfortrees.org site for the UAH record from 1998 to today, it shows an OLS trend of approximately 0.1 ( I presume C) over the record. Low, but not the zero trend from this article.
What could cause the difference in this case?

Reply to  James Schrumpf
August 7, 2015 5:26 pm

Try using the same start date for UAH pause data that I used.

Werner Brozek
Reply to  James Schrumpf
August 7, 2015 5:47 pm

WFT uses the 5.6 version. The latest is 6.0beta3 and can be found here:
http://vortex.nsstc.uah.edu/data/msu/v6.0beta/tlt/tltglhmam_6.0beta3.txt

August 7, 2015 10:00 pm

The timeline is all scrambled up. Strange.
I wish I knew what he said to me before they were deleted.
My curious bone is all itchy now.

ren
August 8, 2015 1:14 am

Well, that the Great Barrier Reef is not compromised.
http://www.bom.gov.au/difacs/IDX0942.pdf

Science or Fiction
August 8, 2015 2:53 am

These would have been falsifying experiences if the IPCC had not acted totally unscientifically and based its work on inductivism and justificationism.
As Karl Popper (the mastermind behind the modern scientific method – the empirical method) warned:
“it is still impossible, for various reasons, that any theoretical system can ever be conclusively falsified. For it is always possible to find some way of evading falsification, for example by introducing ad hoc an auxiliary hypothesis, or by changing ad hoc a definition. It is even possible without logical inconsistency to adopt the position of simply refusing to acknowledge any falsifying experience whatsoever. Admittedly, scientists do not usually proceed in this way, but logically such procedure is possible”
Have definitions been changed ad hoc?
Oh yes – but it is difficult to pinpoint, since they haven’t provided any! Exactly what is supposed to be warming – and by how much? Is it the troposphere, sea surface temperature, the upper oceans, the deeper oceans, or some combination? Both the theory and the observational temperature products change continuously.
Unclear definitions can be continuously altered.
Have hypotheses been added in an ad hoc manner?
Oh yes:
“Well, I have my own article on where the heck is global warming?…The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t.”
Kevin E. Trenberth
The ad hoc hypothesis that the warming went into the deep oceans was added.
From the IPCC’s point of view, a brilliant ad hoc hypothesis.
Because of the difference in heat capacity of the oceans and the atmosphere – the amount of energy which would heat the atmosphere by 1 K (Kelvin) will only heat the oceans by 0.001 K. Suddenly – any amount of warming of the troposphere can be explained by a minuscule change in ocean temperature! Changes that are so minuscule that they cannot be measured with sufficient accuracy.
(Ref: Contribution from Working Group I, on the scientific basis, to the Fifth Assessment Report by the IPCC:)
“Ocean warming dominates the total energy change inventory, accounting for roughly 93% on average from 1971 to 2010 (high confidence). The upper ocean (0-700 m) accounts for about 64% of the total energy change inventory. Melting ice (including Arctic sea ice, ice sheets and glaciers) accounts for 3% of the total, and warming of the continents 3%. Warming of the atmosphere makes up the remaining 1%.”
“In so far as a scientific statement speaks about reality, it must be falsifiable: and in so far as it is not falsifiable, it does not speak about reality.”
― Karl Popper, The Logic of Scientific Discovery
“A theory that explains everything, explains nothing”
― Karl Popper
“The discovery of instances which confirm a theory means very little if we have not tried, and failed, to discover refutations. For if we are uncritical we shall always find what we want: we shall look for, and find, confirmation, and we shall look away from, and not see, whatever might be dangerous to our pet theories. In this way it is only too easy to obtain what appears to be overwhelming evidence in favour of a theory which, if approached critically, would have been refuted.”
― Karl Popper, The Poverty of Historicism

Bruce Cobb
August 8, 2015 4:12 am

Well well, it appears the exceedingly tiresome, and mind-numbingly dense troll, “Brian” has flamed out. They usually do. They follow a set path of self-destruction, trod by many before, and sadly, probably many more to come (though they do, thankfully, appear to be dwindling in number). Their goal is always a dishonest one; to simply disrupt. They have no interest in educating themselves, nor of being educated. What a miserable existence they must lead.

Khwarizmi
August 8, 2015 5:02 am

Sturgis Hooper, August 6, 2015 at 4:57 pm
And what is causing record breaking lake ice and snow cover?
========
Aran, August 6, 2015 at 7:16 pm
I don’t know about these things. So no idea really. If you can point me to sources of information I’d be very grateful.
=====================
Some sources:
Historical Great Lakes Ice Cover
NCDC/NOAA, March 02, 2014
During the winter of 2013/14, very cold temperatures covered the Great Lakes and surrounding states. Minnesota, Wisconsin, Michigan, Illinois, and Indiana each had winter temperatures that ranked among the ten coldest on record. The persistent cold caused 91 percent of the Great Lakes to be frozen by early March. This was the second largest ice coverage for the lakes, with data dating to 1973, and the largest on record for the date.
* * * * * * * * * * * * *
Great Lakes covered in record-shattering amount of ice this late in spring
WP, April 23, 2014
It’s almost May and a third of the Great Lakes is still covered by ice. This is unprecedented in records dating back more than three decades, and it’s not even close.
Environment Canada’s Great Lakes ice dataset, which extends back to 1980-81, shows the current ice extent at a chart-topping 32.8 percent as of April 22. The year with the next greatest ice extent on this date, 1996, had about half as much ice – or 16.49 percent coverage. The average Great Lakes ice cover right now is 2.2 percent.
There is roughly 16 times more ice than normal right now!
[…] In early March this year, the Great Lakes ice extent reached 94.19%, the second most on record for any month, dating back to 1973 in NOAA’s dataset, and most on record so late in the season.
* * * * * * * * * * * * *
Great Lakes are FINALLY ice free after record breaking seven months frozen
DailyMail, June 10, 2014
It has been a long, cold winter for much of America – but the Great Lakes have really suffered. Forecasters finally revealed today that all of the Great Lakes including Lake Superior are now ice free. It marks the end of a record breaking 7 month stretch where the lakes were covered in at least one ice cube, which is the longest period since satellite records began back in the 70’s.
* * * * * * * * * * * * *
Earliest ice on record appears on Great Lakes
CBC, Nov 24, 2014
According to the National Oceanic and Atmospheric Administration, the ice formation on Lake Superior at this time of year is the earliest ever recorded on any of the Great Lakes since records started being kept more than 40 years ago.
* * * * * * * * * * * * *
Fall snow cover in Northern Hemisphere was most extensive on record, even with temperatures at high mark
WP, December 4, 2014
In 46 years of records, more snow covered the Northern Hemisphere this fall than any other time. It is a very surprising result, especially when you consider temperatures have tracked warmest on record over the same period.
* * * * * * * * * * * * *

Sturgis Hooper
Reply to  Khwarizmi
August 10, 2015 9:25 am

Thanks.
More complete and humorous reply than I could have given.
Aran either doesn’t read this blog regularly or is out solely to disrupt.

Ed Zuiderwijk
August 8, 2015 6:15 am

My money is on the “pause” to end soonish and the temperature to drop by at least 0.5C over the following decade.

skeohane
August 8, 2015 8:02 am

I was responding to ” Thomas August 6, 2015 at 7:11 pm “,
What does this even mean? What is the “underlying data generation model?”

skeohane
August 8, 2015 8:04 am

This thread is posting comments at random places chronologically, not where we are responding….

August 8, 2015 10:00 am

I really wish you’d source the articles you argue against. With “Then along came another paper, this time saying that the GISS global temperature record shows global warming during the Pause and that, therefore, GISS shows global warming during the Pause.” there’s not even a name that I can use to look it up to verify your claim.

Richard Barraclough
August 9, 2015 2:38 pm

Talking of Antarctic ice (which somebody was), has anyone noticed that the sea ice surplus, which has been omnipresent for the last couple of years, has now gone?
Meanwhile Arctic sea ice has plunged to an anomaly of -1.7 million sq km

Ralph Kramden
August 9, 2015 2:46 pm

I just read the July UAH version 6.0 global temperature departure from Dr. Roy Spencer’s web site. It dropped from the June value of 0.33 to 0.18. I was surprised because I thought it would increase because of the El Nino. I’m not sure what is happening here. Any comments?

August 10, 2015 4:34 am

Reblogged this on Climatism and commented:
For the alarmists and hardened global warming believers out there. Some advice:
– Lower tropospheric Satellite temperature data (UAH and RSS) was all the rage in the 1990’s when it ‘was’ warming. Now it is scoffed at.
– RSS and UAH satellites measure the lower troposphere – the precise area of the atmosphere where “Global Warming” theory is measured. Alarmists now point to the oceans as the main component of the “Global Warming” system. They do this precisely because the atmosphere has indeed stopped warming, despite the fact that 35% of all human CO2 emissions, since 1751, have been emitted over the past 18 years, with NO atmospheric ‘Global Warming’, at all. A terrible stat for the global warming cult.
– The name ‘Global Warming’ was changed to ‘Climate Change’ when it became obvious the atmosphere had stopped warming. This suits the agenda greatly, as the name ‘Climate Change’ cannot be falsified. The ‘climate’ always changes. Hence, any metric can be used to prove their theory: hot, cold, wet, dry, drought, flood. Therefore, as a ‘science’, the theory of “climate change” is a null hypothesis, i.e. pseudo-science.

Mervyn
August 11, 2015 5:16 am

This information should be plastered around Paris from now until December 2015.

Brian G Valentine
August 11, 2015 7:16 pm

I thank the moderator for eliminating useless comments from someone for some reason having used my name.
I agree with everything Lord Monckton of Brenchley has to say, I always have.
[Reply: Sorry that an identity thief has stolen your good name. Those fake comments have been deleted, but saved. Another commenter whose name was stolen by the same individual has made it clear that he will take legal action against the identity thief as soon as we gather sufficient proof. We’re working on that. ~mod.]