El Niño has not yet paused the Pause

Global temperature update: no warming for 18 years 5 months

By Christopher Monckton of Brenchley

Since December 1996 there has been no global warming at all (Fig. 1). This month’s RSS temperature – still unaffected by the most persistent el Niño conditions of the current weak cycle – shows a new record length for the Pause: 18 years 5 months.

The result, as always, comes with a warning that the temperature increase that usually accompanies an el Niño may come through after a lag of four or five months. If, on the other hand, la Niña conditions begin to cool the oceans soon enough, the Pause could lengthen just in time for the Paris world-government summit in December 2015.


Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere temperature anomaly dataset shows no global warming for 18 years 5 months since December 1996.

The hiatus period of 18 years 5 months, or 221 months, is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend.

The divergence between the models’ predictions in 1990 (Fig. 2) and 2005 (Fig. 3), on the one hand, and the observed outturn, on the other, also continues to widen.


Figure 2. Near-term projections of warming at a rate equivalent to 2.8 [1.9, 4.2] K/century, made with “substantial confidence” in IPCC (1990), for the 303 months January 1990 to March 2015 (orange region and red trend line), vs. observed anomalies (dark blue) and trend (bright blue) at less than 1.4 K/century equivalent, taken as the mean of the RSS and UAH v. 5.6 satellite monthly mean lower-troposphere temperature anomalies.


Figure 3. Predicted temperature change, January 2005 to March 2015, at a rate equivalent to 1.7 [1.0, 2.3] Cº/century (orange zone with thick red best-estimate trend line), compared with the near-zero observed anomalies (dark blue) and real-world trend (bright blue), taken as the mean of the RSS and UAH v. 5.6 satellite lower-troposphere temperature anomalies.

The Technical Note explains the sources of the IPCC’s predictions in 1990 and in 2005, and also demonstrates that, according to the ARGO bathythermograph data, the oceans are warming at a rate equivalent to less than a quarter of a Celsius degree per century. There are also details of the long-awaited beta-test version 6.0 of the University of Alabama in Huntsville’s satellite lower-troposphere dataset, which now shows a pause very nearly as long as that in the RSS dataset. However, the data are not yet in a form compatible with the earlier version, so v. 6 will not be used here until the beta testing is complete.

Key facts about global temperature

Ø The RSS satellite dataset shows no global warming at all for 221 months from December 1996 to April 2015 – more than half the 436-month satellite record.

Ø The global warming trend since 1900 is equivalent to 0.8 Cº per century. This is well within natural variability and may not have much to do with us.

Ø Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to below 1.2 Cº per century.

Ø The fastest warming rate lasting 15 years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.

Ø In 1990, the IPCC’s mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.

Ø The global warming trend since 1990, when the IPCC wrote its first report, is equivalent to below 1.4 Cº per century – half of what the IPCC had then predicted.

Ø Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual centennial prediction of 4.8 Cº warming to 2100.

Ø The IPCC’s predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than 15 years that has been measured since 1950.

Ø The IPCC’s 4.8 Cº-by-2100 prediction is almost four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.

Ø The oceans, according to the 3600+ ARGO bathythermograph buoys, are warming at a rate equivalent to just 0.02 Cº per decade, or 0.2 Cº per century.

Ø Recent extreme weather cannot be blamed on global warming, because there has not been any global warming to speak of. It is as simple as that.


Technical note

Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not “cherry-picked” so as to coincide with the temperature spike caused by the 1998 el Niño. Instead, it is calculated so as to find the longest period with a zero trend.
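
That search is simple to reproduce. Below is a minimal sketch in Python with NumPy; the synthetic series merely stands in for the RSS monthly anomalies, which are not reproduced here, so treat it as an illustration of the method rather than the author’s actual code:

    import numpy as np

    def pause_length_months(anomalies):
        # anomalies: 1-D array of monthly temperature anomalies, oldest first.
        # Returns the length in months of the longest trailing period whose
        # least-squares linear-regression trend is zero or negative.
        n = len(anomalies)
        t = np.arange(n)
        for start in range(n - 1):
            slope = np.polyfit(t[start:], anomalies[start:], 1)[0]
            if slope <= 0:
                return n - start  # earliest start month with a zero trend
        return 0

    # Illustration with synthetic data: a warming ramp, then flat noise.
    rng = np.random.default_rng(1)
    series = np.concatenate([np.linspace(-0.4, 0.2, 120),
                             0.2 + 0.1 * rng.standard_normal(221)])
    print(pause_length_months(series), "months")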

The satellite datasets are arguably less unreliable than other datasets in that they show the 1998 Great el Niño more clearly than all the others. The Great el Niño, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing independent verification that the satellite datasets are better able than other datasets to capture such fluctuations without artificially filtering them out.

Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates below those that are published. The satellite datasets are based on reference measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking, via spaceward mirrors, the known temperature of the cosmic background radiation, which is 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that NASA’s Wilkinson Microwave Anisotropy Probe (WMAP) helped to determine the age of the Universe: about 13.8 billion years.

The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website. A computer algorithm reads them down from the text file, takes their mean and plots them automatically using an advanced routine that automatically adjusts the aspect ratio of the data window at both axes so as to show the data at maximum scale, for clarity.

The latest monthly data point is visually inspected to ensure that it has been correctly positioned. The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line.

The IPCC and most other agencies use linear regression to determine global temperature trends. Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails. The method is appropriate because global temperature records exhibit little autocorrelation, since summer temperatures in one hemisphere are offset by winter temperatures in the other. An AR(n) autoregressive model therefore generates results little different from a least-squares trend.

Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat.

RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it at remss.com/blog/recent-slowing-rise-global-temperatures.

Dr Mears’ results are summarized in Fig. T1:


Figure T1. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of El Chichón (1982) and Pinatubo (1991) are shown, as is the spike in warming caused by the great el Niño of 1998.

Dr Mears writes:

“The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation.  This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.”

Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph:

“Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades.  Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site.  Is this really your data?’  While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate.  … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.”

In fact, the spike in temperatures caused by the Great el Niño of 1998 is largely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 el Niño, and the sheer length of the Great Pause itself.

Curiously, Dr Mears prefers the much-altered terrestrial datasets to the satellite datasets. However, over the entire length of the RSS and UAH series since 1979, the trends on the mean of the terrestrial datasets and on the mean of the satellite datasets are near-identical. Indeed, the UK Met Office uses the satellite record to calibrate its own terrestrial record.

The length of the Great Pause in global warming, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed. It remains possible that el Niño-like conditions may prevail this year, reducing the length of the Great Pause. However, the discrepancy between prediction and observation continues to widen.

Sources of the IPCC projections in Figs. 2 and 3

IPCC’s First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century. The executive summary asked, “How much confidence do we have in our predictions?” IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded:

“Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change. … There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results.”

That “substantial confidence” was substantial over-confidence. For the rate of global warming since 1990 – the most important of the “broad-scale features of climate change” that the models were supposed to predict – is now below half what the IPCC had then predicted.

In 1990, the IPCC said this:

“Based on current models we predict:

“under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, a rate of increase of global mean temperature during the next century of about 0.3 Cº per decade (with an uncertainty range of 0.2 Cº to 0.5 Cº per decade), this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1 Cº above the present value by 2025 and 3 Cº before the end of the next century. The rise will not be steady because of the influence of other factors” (p. xii).

Later, the IPCC said:

“The numbers given below are based on high-resolution models, scaled to be consistent with our best estimate of global mean warming of 1.8 Cº by 2030. For values consistent with other estimates of global temperature rise, the numbers below should be reduced by 30% for the low estimate or increased by 50% for the high estimate” (p. xxiv).

The orange region in Fig. 2 represents the IPCC’s less extreme medium-term Scenario-A estimate of near-term warming, i.e. 1.0 [0.7, 1.5] K by 2025, rather than its more extreme Scenario-A estimate, i.e. 1.8 [1.3, 3.7] K by 2030.

Some try to say the IPCC did not predict the straight-line global warming rate that is shown in Figs. 2-3. In fact, however, the IPCC’s predicted global warming over so short a term as the 25 years from 1990 to the present is little different from a straight line (Fig. T2).


Figure T2. Historical warming from 1850-1990, and predicted warming from 1990-2100 on the IPCC’s “business-as-usual” Scenario A (IPCC, 1990, p. xxii).

Because the difference between a straight line and the slight uptick in the warming rate that the IPCC predicted over the period 1990-2025 is so small, one can look at the prediction another way. To reach the 1 K central estimate of warming since 1990 by 2025, there would have to be twice as much warming in the next ten years as there was in the last 25 years. That is not likely.

Likewise, to reach 1.8 K by 2030, there would have to be four or five times as much warming in the next 15 years as there was in the last 25 years. That is still less likely.
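
A quick arithmetic check of those two ratios, using the 0.35 Cº of observed warming since 1990 reported in the Technical Note below (a sketch, not the author’s own calculation):

    # Observed warming 1990-2015, per the Technical Note: ~0.35 C in 25 years.
    observed = 0.35

    # IPCC (1990) central estimate: 1.0 C above 1990 by 2025.
    # The next 10 years would have to supply the remaining 0.65 C:
    print((1.0 - observed) / observed)   # ~1.9: about twice the last 25 years

    # Higher estimate: 1.8 C above 1990 by 2030, i.e. 1.45 C in 15 years:
    print((1.8 - observed) / observed)   # ~4.1: four or five times as much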

But is the Pause perhaps caused by the fact that CO2 emissions have not been rising anything like as fast as the IPCC’s “business-as-usual” Scenario A prediction in 1990? No: CO2 emissions have risen rather above the Scenario-A prediction (Fig. T3).


Figure T3. CO2 emissions from fossil fuels, etc., in 2012, from Le Quéré et al. (2014), plotted against the chart of “man-made carbon dioxide emissions”, in billions of tonnes of carbon per year, from IPCC (1990).

Plainly, therefore, CO2 emissions since 1990 have proven to be closer to Scenario A than to any other case, because for all the talk about CO2 emissions reduction the fact is that the rate of expansion of fossil-fuel burning in China, India, Indonesia, Brazil, etc., far outstrips the paltry reductions we have achieved in the West to date.

True, methane concentration has not risen as predicted in 1990 (Fig. T4), for methane emissions, though largely uncontrolled, are simply not rising as the models had predicted, and the predictions were extravagantly baseless.

The overall picture is clear. Scenario A is the emissions scenario from 1990 that is closest to the observed emissions outturn, and yet there has only been a third of a degree of global warming since 1990 – about half of what the IPCC had then predicted with what it called “substantial confidence”.


Figure T4. Methane concentration as predicted in four IPCC Assessment Reports, together with (in black) the observed outturn, which is running along the bottom of the least prediction. This graph appeared in the pre-final draft of IPCC (2013), but had mysteriously been deleted from the final, published version, inferentially because the IPCC did not want to display such a plain comparison between absurdly exaggerated predictions and unexciting reality.

To be precise, a quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS and UAH monthly global mean lower-troposphere temperature anomalies – is 0.35 Cº, equivalent to just 1.4 Cº/century, or a little below half of the central estimate of 0.70 Cº, equivalent to 2.8 Cº/century, that was predicted for Scenario A in IPCC (1990). The outturn is visibly well below even the least estimate.

In 1990, the IPCC’s central prediction of the near-term warming rate was higher by two-thirds than its prediction is today. Then it was equivalent to 2.8 Cº/century. Now it is equivalent to just 1.7 Cº/century – and, as Fig. 3 shows, even that is proving to be a substantial exaggeration.

Is the ocean warming?

One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean. Since globally the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the “missing heat” has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date.

Actually, it is not known whether the ocean is warming: each of the 3600+ automated ARGO bathythermograph buoys somehow has to cover 200,000 cubic kilometres of ocean – a box of some 100,000 square kilometres, more than 316 km on a side, and 2 km deep. Plainly, results at a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of taking a single temperature and salinity profile at a single point in Lake Superior less than once a year) are not going to be much better than guesswork.
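
The coverage arithmetic is easy to verify (a sketch; the global ocean surface area of roughly 360 million km² is a round figure assumed here, not taken from the text):

    # Rough per-buoy ARGO coverage.
    ocean_area_km2 = 3.6e8    # global ocean surface, ~360 million km^2 (assumed)
    buoys = 3600              # ARGO floats, per the text
    depth_km = 2              # ARGO profiling depth

    area_km2 = ocean_area_km2 / buoys     # ~100,000 km^2 per buoy
    volume_km3 = area_km2 * depth_km      # ~200,000 km^3 per buoy
    side_km = area_km2 ** 0.5             # ~316 km on a side
    print(round(area_km2), round(volume_km3), round(side_km))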

Fortunately, a long-standing bug in the ARGO data delivery system has now been fixed, so I am able to get the monthly global mean ocean temperature data – though ARGO seems not to have updated the dataset since December 2014. However, that gives us 11 full years of data. Results are plotted in Fig. T5. The ocean warming, if ARGO is right, is equivalent to just 0.02 Cº per decade, or 0.2 Cº per century.


Figure T5. The entire near-global ARGO 2 km ocean temperature dataset from January 2004 to December 2014 (black spline-curve), with the least-squares linear-regression trend calculated from the data by the author (green arrow).

Finally, though the ARGO buoys measure ocean temperature change directly, before publication NOAA craftily converts the temperature change into zettajoules of ocean heat content change, which makes the change seem a whole lot larger.

The terrifying-sounding heat content change of 260 ZJ from 1970 to 2014 (Fig. T6) is equivalent to just 0.2 K/century of global warming. All those “Hiroshima bombs of heat” are a barely discernible pinprick. The ocean and its heat capacity are a lot bigger than some may realize.
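
NOAA’s conversion can be run in reverse. The sketch below uses round values for the volume, density and specific heat of the 0-2000 m ocean layer – all three are assumptions, not NOAA figures – so it is an order-of-magnitude check only:

    # Convert 260 ZJ of ocean heat content change back to temperature change.
    volume_m3 = 3.6e8 * 1e6 * 2000  # ~360e6 km^2 of ocean x 2000 m depth (assumed)
    density = 1025                  # seawater, kg/m^3 (assumed)
    c_p = 3990                      # seawater specific heat, J/(kg K) (assumed)

    heat_j = 260e21                 # 260 ZJ over 1970-2014, from the text
    dT = heat_j / (volume_m3 * density * c_p)
    print(dT, dT / 44 * 100)        # ~0.09 K in 44 years, i.e. ~0.2 K/century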


Figure T6. Ocean heat content change, 1957-2013, in Zettajoules from NOAA’s NODC Ocean Climate Lab: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT, with the heat content values converted back to the ocean temperature changes in fractions of a Kelvin that were originally measured. NOAA’s conversion of the minuscule temperature change data to Zettajoules, combined with the exaggerated vertical aspect of the graph, has the effect of making a very small change in ocean temperature seem considerably more significant than it is.

Converting the ocean heat content change back to temperature change reveals an interesting discrepancy between NOAA’s data and those of the ARGO system. Over the period of ARGO data, from 2004 to 2014, the NOAA data imply that the oceans are warming at 0.05 Cº per decade, equivalent to 0.5 Cº per century, or rather more than double the rate shown by ARGO.

ARGO has the better-resolved dataset, but since the resolutions of all ocean datasets are very low one should treat all these results with caution. What one can say is that, on such evidence as these datasets are capable of providing, the difference between the underlying warming rate of the ocean and that of the atmosphere is not statistically significant, suggesting that if the “missing heat” is hiding in the oceans it has magically found its way into the abyssal strata without managing to warm the upper strata on the way. On these data, too, there is no evidence of rapid or catastrophic ocean warming.

Furthermore, to date no empirical, theoretical or numerical method, complex or simple, has yet successfully specified mechanistically either how the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere has reached the deep ocean without much altering the heat content of the intervening near-surface strata, or how the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions relevant to land-based life on Earth.

Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean. Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes. It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere.

If the “deep heat” explanation for the hiatus in global warming were correct (and it is merely one among dozens that have been offered), then the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has.

The UAH v. 6.0 dataset

The long-awaited new version of the UAH dataset is here at last. The headline change is that the warming trend has fallen from 0.14 to 0.11 C° per decade since 1979. The UAH and RSS datasets are now very close to one another, and there is a clear difference between the warming rates shown by the satellite and terrestrial datasets.

Roy Spencer’s website, drroyspencer.com, has an interesting explanation of the reasons for the change in the dataset. When I mentioned to him that the usual suspects would challenge the alterations that have been made to the dataset, he replied: “It is what it is.” In that one short sentence, true science is encapsulated.

Below, Fig. T7 shows the two versions of the UAH dataset superimposed on one another. Fig. T8 plots the differences between the two versions.


Fig. T7. The two UAH versions superimposed on one another.


Fig. T8. Difference between UAH v. 6 and v. 5.6.

132 Comments
nickreality65
May 4, 2015 9:10 am

Trying to kill one bird with two stones.
Part the first: How much did anthropogenic CO2 contribute to the increase in atmospheric CO2 between 1750 and 2011?
Atmospheric dry air:……………………………5.14E18 kg
Atmospheric CO2 @ 390.5 ppm:……………….3.05E15 kg
(All ppm based on moles.)
Per IPCC AR5:
1750………………..278.0 ppm…………….2.17E15 kg
2011………………..390.5 ppm…………….3.05E15 kg
Increase……………..112.5 ppm…….………8.78E14 kg
Added anthropogenic CO2………………….5.55E14 kg
Retained in atmosphere, 45%:……………….2.50E14 kg
Added anthropogenic Fossil Fuel CO2….…..3.67E14 kg
Retained in atmosphere, 45%:……………….2.50E14 kg
Contribution of anthro CO2 to increase between 1750 and 2011: 2.50/8.78 = 28.5%.
Contribution of anthro CO2 to increase in atmospheric content: 2.50/30.5 = 8.2%.
Contribution of anthro FF CO2 to increase between 1750 and 2011: 2.50/8.78 = 19.2%.
Contribution of anthro FF CO2 to increase in atmospheric content: 2.50/30.5 = 5.5%.
The popular notion seems to be that anthropogenic CO2 was responsible for all of the added atmospheric CO2 between 1750 and 2011. The numbers don’t seem to bear that out. Over 70% might be due to natural variability, e.g. ocean-floor geothermal heat flux outgassing CO2, volcanic vents producing CO2, or an entire world of unknown unknowns.
Part the second: How much heat was added to the global heat balance by anthropogenic CO2 between 1750 and 2011 and where did it go?
Atmospheric dry air:…………………5.14E18 kg
Atmospheric water vapor……….……1.27E16 kg
Heat capacity of dry air……………….0.24 Btu/(lb – °F)
Latent heat of water:…………………..950 Btu/lb
Atmospheric ToA surface area:…..…..5.14E15 ft^2
Watt:……………………….…………3.41 Btu/h
m^2:…………………………………10.76 ft^2
Per IPCC AR5:
Radiative forcing increase due to anthropogenic GHGs between 1750 and 2011: <3.0 W/m^2.
Heat delivered to atmosphere by increased AGW/RF GHGs over 365 days:…….1.91E18 Btu
Annual temperature rise of dry air to absorb additional heat:…….0.70 °F or 7.0 °F per decade or 3.9 C per decade.
Amount of additional water vapor needed to absorb heat without increasing atmospheric temperature: 1.91E19 Btu / 950 = 2.01E15 lb or 9.14E14 kg or 7.2% of atmospheric water vapor content.
More water vapor means more clouds and more albedo which reflects heat back into space, a powerful negative feedback.

Ayesha
Reply to  nickreality65
May 4, 2015 11:31 am

Since December 1996 there has been no global warming at all
I’m still skeptical about the ‘no global warming’ part. How do they make such absolute statements?
I’ve been told time and again by my teachers never to make meta-narratives, because there is always going to be some glitch that breaks the absolute. I keep seeing these ‘no global warming for 18 years’ articles, but did we somehow change the positioning of our planet so that global warming stopped happening 18 years ago?
http://www.ayeshajamal.com

Reply to  Ayesha
May 4, 2015 12:10 pm

Ayesha should understand that once the measurements have been made and the trend determined, then – subject to the uncertainties in the data and the choice of statistical model (here, least-squares linear regression) – the result is what it is. The RSS and UAH satellite lower-troposphere datasets show no global warming at all for more than 18 years.
The planet continues to be warmed by the Sun and to radiate energy to space. The two processes – warming and cooling – are in approximate equilibrium. However, a perturbation – such as greenhouse-gas forcing – should in theory be disturbing that equilibrium, causing some warming. However, no atmospheric warming is observed for more than 18 years, and the rate of ocean warming is minuscule, suggesting that the disequilibrium is small and harmless.

Reply to  Ayesha
May 4, 2015 12:58 pm

Ayesha says:
I’m still skeptical about ‘no global warming’ part.
As Dr. Spencer said, “It is what it is.”
The ‘positioning’ of the planet is irrelevant. Satellite measurements are the most accurate because they don’t just take land values, or ARGO values; they look at the entire planet (almost all, anyway; 80S to 80N).
The public has been told so often that ‘global warming’ is happening that, when reality is observed, some people cannot accept it at face value. They prefer to believe the narrative rather than the data.
Instead of being skeptical about ‘no global warming’, try instead to be skeptical of the unfounded claims that dangerous man-made global warming is occurring. There is no supporting evidence for that.

Bernie
Reply to  Ayesha
May 4, 2015 1:57 pm

The AGW (and in particular the CAGW) hypothesis is really all about the net positive feedback mechanisms. With a net positive feedback there should be an acceleration in the rate of temperature increase and sea-level increase. While some argue that very old proxy data and old direct-measurement data support acceleration, recent sophisticated measurements do not. The models have all been programmed with (or otherwise arrive at) positive feedback, as can be seen in the literature. Yet these models are diverging from reality. No one should be surprised at this result. If our planet were susceptible to runaway global warming caused by a 300 ppm change in a trace gas, life forms here would never have reached the advanced state you see today.

Willis Eschenbach
Reply to  nickreality65
May 4, 2015 3:31 pm

nickreality65 May 4, 2015 at 9:10 am

Per IPCC AR5:
1750………………..278.0 ppm…………….2.17E15 kg
2011………………..390.5 ppm…………….3.05E15 kg
Increase……………..112.5 ppm…….………8.78E14 kg
Added anthropogenic CO2………………….5.55E14 kg

Thanks, Nick, but you have a huge problem. You’ve given the value of the added anthropogenic CARBON, not the value of the added anthropogenic CO2. As a result, it looks like the increase since 1850 is greater than the anthropogenic emissions.
But that’s not the case. In fact, the emissions of CO2 are 44/12 of the emissions of C, or 2.04 E+15 kg since 1750. In other words, the change in atmospheric CO2 is LESS than the total human emissions.
As this error occurs very early in your calculations, you’ll need to re-do the whole lot … sorry to be the bearer of the bad tidings.
All the best,
w.
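
The conversion Willis describes takes two lines to check (a sketch; molar masses rounded to two decimal places):

    # Convert a mass of emitted carbon to the corresponding mass of CO2.
    carbon_kg = 5.55e14                 # anthropogenic C, 1750-2011, from the thread
    co2_kg = carbon_kg * 44.01 / 12.01  # ratio of molar masses of CO2 and C
    print(co2_kg)                       # ~2.03e15 kg, matching the ~2.04e15 above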

ren
May 4, 2015 9:13 am

The total amount of energy received per second at the top of Earth’s atmosphere (TOA) is measured in watts and is given by the solar constant times the cross-sectional area of the Earth. Because the surface area of a sphere is four times its cross-sectional area, the average TOA flux is one quarter of the solar constant, and so is approximately 340 W/m². Since the absorption varies with location as well as with diurnal, seasonal, and annual variations, numbers quoted are long-term averages, typically averaged from multiple satellite measurements.
Of the ~340 W/m² of solar radiation received by the Earth, an average of ~77 W/m² is reflected back to space by clouds and the atmosphere, and ~23 W/m² is reflected by the surface albedo, leaving about 240 W/m² of solar energy input to the Earth’s energy budget.
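Restating that budget numerically (a sketch; the solar constant of ~1361 W/m² is an assumed round value):

    # Mean top-of-atmosphere energy budget, round figures.
    solar_constant = 1361.0        # W/m^2 (assumed)
    toa = solar_constant / 4       # sphere surface = 4 x cross-section: ~340
    absorbed = toa - (77 + 23)     # less cloud/atmosphere and surface reflection
    print(toa, absorbed)           # ~340 and ~240 W/m^2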
Past and future of daily average insolation at top of the atmosphere on the day of the summer solstice, at 65 N latitude. The green curve is with eccentricity e hypothetically set to 0. The red curve uses the actual (predicted) value of e. Blue dot is current conditions, at 2 ky A.D.
http://upload.wikimedia.org/wikipedia/commons/thumb/9/90/InsolationSummerSolstice65N.png/600px-InsolationSummerSolstice65N.png
http://en.wikipedia.org/wiki/Insolation
Current absorption of solar radiation at the surface is about 237 W/m².
http://www.ospo.noaa.gov/data/atmosphere/radbud/gs19_prd.gif
Another issue is the change in solar energy because of the weak solar cycle.
PDO and AMO cycles show that there are processes of auto-regulation of surface temperature. Currently the AMO is beginning to fall, and you can see it in the growth of ice in the Arctic.
https://sunriseswansong.wordpress.com/category/arctic-sea-ice/
Warm (> +0.5 °C; red stippled line) and cold (< −0.5 °C; blue stippled line) episodes for the Oceanic Niño Index (ONI), defined as the 3-month running mean of ERSST.v3b SST anomalies in the Niño 3.4 region (5°N-5°S, 120°-170°W). Base period: 1971-2000. For historical purposes, cold and warm episodes are defined when the threshold is met for a minimum of 5 consecutive over-lapping seasons. The thin line indicates 3-month average values, and the thick line is the simple running 7-year average of these. Last 3-month running mean shown: January-March 2015. Last diagram update: 14 April 2015.
http://www.climate4you.com/images/NOAA%20CPC%20OceanicNinoIndexMonthly1979%20With37monthRunningAverage.gif

nickreality65
Reply to  ren
May 4, 2015 3:28 pm

A watt is not energy; it is a power unit, energy per unit time: 3.412 Btu/h in English units, 3,600 J/h in metric. Be sure to pair Btu with English units and joules with metric units.

ren
Reply to  nickreality65
May 5, 2015 1:12 am

If this energy is divided by the recording time in hours, it is then a density of power called irradiance, expressed in watts per square metre (W/m2).
Direct insolation is the solar insolation measured at a given location on Earth with a surface element perpendicular to the Sun’s rays, excluding diffuse insolation (the solar radiation that is scattered or reflected by atmospheric components in the sky). Direct insolation is equal to the solar constant minus the atmospheric losses due to absorption and scattering. While the solar constant varies with the Earth-Sun distance and solar cycles, the losses depend on the time of day (length of light’s path through the atmosphere depending on the Solar elevation angle), cloud cover, moisture content, and other impurities.
Direct insolation is measured in (W/m²) or kilowatt-hours per square meter per day (kW·h/(m²·day)).
1 kW·h/(m²·day) = 1,000 W · 1 hour / ( 1 m² · 24 hours) = 41.67 W/m²
In the case of photovoltaics, average direct insolation is commonly measured in terms of peak direct insolation as kWh/(kWp·y) (kilowatt hours per year per kilowatt peak rating).

Bruce Cobb
May 4, 2015 9:15 am

Like a lot of the Climate Liars, Mears knowingly mis-states the start date of the “Pause” in order to make his bogus claim of “cherry-picking”. It starts now and moves back to when the trend stops being zero.

Tim
Reply to  Bruce Cobb
May 4, 2015 9:17 am

He is obviously a real nasty piece of work.

Tim
May 4, 2015 9:15 am

Dr Carl Mears, the senior research scientist at RSS, removes any reason for respecting him when he refers to denialists. (If you think that to disagree with a theory makes you a denialist, what do you do when data doesn’t agree with your theory? You have just stated that the theory can’t be wrong.)
Wow, just thinking of someone like this calling themselves a scientist makes me want to throw up.

aneipris
May 4, 2015 9:24 am

“It remains possible that el Nino-like conditions may prevail this year, reducing the length of the Great Pause.”
It’s not only possible, it’s quite likely we will see a moderate to strong El Nino into the fall, which then settles back into a Modoki-style El Nino condition, which is what we had this past winter.
I recommend checking with Joe Bastardi and Joe D’Aleo of Weatherbell on this topic. It’s also quite likely that any pause-busting El Nino will be immediately followed by a dramatic reduction in temps. So let the alarmists celebrate (of course, if they were normal human beings they’d be depressed that we’re all going to fry…). It will be short-lived.

Monckton of Brenchley
Reply to  aneipris
May 4, 2015 10:23 am

In response to Aneipris, the problem is timing. The el Niño, if it endures to the end of this year, will shorten the Pause. Then the Paris conference will introduce a world government. Then la Niña will continue the Pause, but it will be too late. In the current draft treaty, there is no secession clause.

Reply to  Monckton of Brenchley
May 4, 2015 10:45 am

Points taken. Yes, the timing would be terrible, but don’t forget that to pronounce the pause “ended,” they’re going to have to concede the pause existed in the first place, which leads naturally to a discussion of the poorly performing models, which of course is the more important story.
Granted, attendees are not going to pay any attention to such things, but it’s hard for me to see in these difficult economic times, many governments foolhardy enough to tie themselves to onerous energy restrictions, especially without any escape clause. Obama of course will be more than willing, a few others perhaps. But as depressing as a temporary end to the pause would be, I don’t think it would be disastrous.

Harry Passfield
Reply to  Monckton of Brenchley
May 4, 2015 12:10 pm

Chris:

In the current draft treaty, there is no secession clause.

Sheesh! That’s a hell of an oversight (tho’ not for those who seek World Government). There is so much that Santayana would recognise here. [sigh]

May 4, 2015 9:58 am

April Update for UAH and RSS:
The April anomaly is out for UAH version 6.0 and it is 0.065, a drop of about 0.07 from March. As a result, after 4 months, UAH would rank in 8th place if the new average were to hold for the rest of the year. As well, since the pause was 18 years and 3 months after March, and since the March value was on the zero line, the pause with April is now at least 18 years and 4 months on UAH.
RSS for April is also out. It dropped from 0.255 to 0.174. If the new average of 0.281 holds, RSS would end up in 6th place for 2015.
Up to this point, the lower troposphere does not seem to be aware of the fact that we have had an El Nino since October. See:
http://www.cpc.noaa.gov/products/analysis_monitoring/ensostuff/ensoyears.shtml
However GISS and Hadcrut4.3 are solidly in first place after 3 months by about 0.12 so it would take a huge drop in April to knock either one out of first place in April.
With respect to the period without statistically significant warming, that is obviously going to increase greatly.
With version 5.6, Dr. McKitrick had it at 16 years while Nick Stokes had it at 18 years and 8 months.
For RSS, Dr. McKitrick had it at 26 years while Nick Stokes had it at 22 years and 3 months.
But with the pause being 18 years and 4 months on version 6, I would say Nick’s time would increase by 4 years to over 22 years.
As for Dr. McKitrick, notice that most of the points for RSS since last April when he made his calculation for RSS are below the trend line. So if he were to calculate RSS today, he could get 27 years. I have no clue about UAH6, but I would not be surprised if it would be similar.
http://www.woodfortrees.org/plot/rss/from:1988/plot/rss/from:1988/trend

Donald L. Klipstein
May 4, 2015 10:06 am

I think the beginning of the pause is not just before a century-class spike early in an 18-year 5-month period, but when a rapidly rising trend meets a trend with no rapid increase. So I used the Woodfortrees tool to split the RSS linear trend into two segments, and found the choice of end of one and beginning of the other that resulted in the linear trends meeting each other. As of when the March data was the latest, this trend intersection time is the beginning of November 2003. The linear trend from 1/1/1979 to then was .178 degree/decade, and from that time to the end of April 2015 the trend was cooling at a rate of .014 degree/decade.
http://woodfortrees.org/plot/rss/plot/rss/to:2003.833/trend/plot/rss/from:2003.833/trend
Meanwhile, the radiosonde curve in Figure 7 of http://www.drroyspencer.com/2015/04/version-6-0-of-the-uah-temperature-dataset-released-new-lt-trend-0-11-cdecade indicates that the surface-adjacent troposphere has had a warming rate .02, maybe .03 degree/decade greater than that of the main part of the lower troposphere, and of what is covered by TLT weighting curves. This indicates the corner time of late 2003 was a transition from a warming rate near .2 degree/decade to a negligible warming trend.

Bruce Cobb
Reply to  Donald L. Klipstein
May 4, 2015 10:15 am

Nope. The time period for the Pause starts today, moving back.

michael hart
Reply to  Bruce Cobb
May 4, 2015 11:35 am

Yup, that’s the point. Whether it is a good, bad, or indifferent measure, it is always defined as starting now and going back in time. No cherry-picking possible.

michael hart
Reply to  Bruce Cobb
May 4, 2015 11:38 am

Unless you are Francis Fukuyama, who claimed history ended in 1992.

RoHa
Reply to  Bruce Cobb
May 5, 2015 12:44 am

Fukuyama was writing nonsense. History ended in 1946, when I was born. Everything since then counts as Current Events.

Monckton of Brenchley
Reply to  Donald L. Klipstein
May 4, 2015 10:28 am

Mr Klipstein is incorrect in implying that the Great el Nino of 1997/8 artificially lengthened the Pause. It is more or less exactly offset by the quite strong and prolonged el Nino of 2010.
And his proposed statistical method, while useful for trying to demonstrate that the Pause has only endured for 12 years rather than the 18 years 5 months for which it has endured, is not a standard method.
It is really no longer possible to advance arguments, however ingenious, to try to minimize the Pause. The discrepancy between the predictions on which the scare was based and the reality that is measured in the temperature datasets is now too great.

Aran
Reply to  Monckton of Brenchley
May 4, 2015 2:18 pm

I am happy to see there are such firm believers in this kind of analyses. I have used exactly the same method to determine how long the planet has experienced warming. Looking only at RSS and UAH I have found that – starting from today and moving back – the earth has experienced warming for the entire period of measurement, i.e. over 35 years of warming.

Reply to  Monckton of Brenchley
May 4, 2015 4:40 pm

Aran, if you plot the height of a 65 year old man every year for 65 years, you could prove he was still growing, even if he stopped growing when he was 20. Correct?

Aran
Reply to  Monckton of Brenchley
May 4, 2015 4:51 pm

Yes

Aran
Reply to  Monckton of Brenchley
May 4, 2015 6:19 pm

So, if you are trying to say that the linear regression parameters don’t mean much by themselves, but rather the interpretation of them, then I would agree wholeheartedly

Reply to  Monckton of Brenchley
May 4, 2015 8:35 pm

Aran,
I believe this linear fit method is used by Lord Monckton because that’s the way the warmanistas have done it.
Bruce

Aran
Reply to  Monckton of Brenchley
May 4, 2015 9:31 pm

OK, well I’m sure Mr Monckton could let us know whether this has indeed been the motivation. So does your comment mean you think that the scientific methods used by these warmanistas are valid?

Reply to  Monckton of Brenchley
May 4, 2015 9:36 pm

The 2010 El Nino was a much weaker one than the 1998 El Nino. The baseline level of global temperature around each of these two El Nino events was noticeably lower for the 1998 one than the 2010 one, considering a period of 5 or 7 years or whatever.
As for discrepancy between climate model forecasts and reality – I agree that is real, but I disagree that this is support for the pause being defined as an 18.something time coverage that has a century-class spike shortly after its start point. Divergences of reality from climate models can easily be found to have corner points at various times, and I see this mostly being from mid 2001 (many flat linear trends can be found starting in mid-late 2001) to sometime in 2005 (according to my Fourier attempt on HadCRUT3 – which I like better than GISS, NCDC and all versions of HadCRUT4 so far due to better agreement with RSS and all versions of UAH from 5.4 to 6.0-beta).
My take is that stating the start time of the pause as sometime in 1996 is an overstatement, like overstatements from the warmist side, but to a carefully chosen lesser degree of being overstatements than warmist overstatements that significant warming has continued since 1996, 1998, or even since 2005.

Reply to  Monckton of Brenchley
May 4, 2015 10:16 pm

Aran,
No.

Reply to  Monckton of Brenchley
May 5, 2015 2:01 am

In response to Aran, the IPCC and Dr Jones of UEA use linear regression on temperatures. I use their method because doing so reduces the scope for argument.
Also, there is no need for an AR(n) auto-regression model because the global temperature data show no particular seasonality (the same would not be true of regional data).
Nor is there any need for higher-order polynomial fits. The linear trend is simple and straightforward. And, like it or not, it shows no warming for 18 years 5 months.
The start-date is of course influential. My headline graph each month simply goes back as far as one can go in the recent temperature record to find a zero trend. If I were to go back to the medieval, Roman, Minoan, Old Kingdom, or Holocene climate optima I’d be able to find zero or even negative trends. There is nothing particularly special about today’s warming, except that notwithstanding record increases in CO2 concentration there has been so little of it.

Aran
Reply to  Monckton of Brenchley
May 5, 2015 2:12 pm

@ Monckton of Brenchley
Thank you for your answer. I agree that using more complex fitting procedures would be of any use for this particular case, where we are close to fitting in the noise anyway. I also agree that going back a lot further would be futile, because as Werner’s example illustrates as well, the results would become meaningless.
I am curious about your interpretations of these results. There does seem to be a trend break about 18 years ago. At least if we limit ourselves to the lower troposphere, the oceans and surface level temperatures do not show the same behaviour as you show in your post. What do you think this trend-break in lower troposphere temperatures indicates?
@ Boulder Sceptic
If you don’t think the ‘warmanistas’ methods are valid and Mr Monckton uses the same method, does that not mean that you would have to think Mr. Monckton’s results are also invalid?

Aran
Reply to  Monckton of Brenchley
May 5, 2015 6:28 pm

wrote my last post a bit too quickly.
Second sentence should have been:
I agree that using more complex fitting procedures would not be of any use for this particular case,

Aran
Reply to  Monckton of Brenchley
May 7, 2015 5:43 pm

I sincerely hope to get an answer to my question

Lance Wallace
May 4, 2015 10:07 am

It’s worse than we thought–you may have an error when you state:
“Added anthropogenic Fossil Fuel CO2….…..3.67E14 kg
Retained in atmosphere, 45%:……………….2.50E14 kg”
45% of 3.67 is rather less than 2.50. So the % impact on atmospheric content is more like 3% than 5.5%.

Reply to  Lance Wallace
May 4, 2015 11:23 am

I saw that too. “Don’t copy and paste.”

nickreality65
Reply to  Lance Wallace
May 4, 2015 1:39 pm

Wasn’t a matter of cut/paste or drag/drop, just ran too many different cases and got confused. W/ or w/o cement, w or w/o land use, ppm mass, vol, mole, C as C and C as CO2. At least I ran them.
So allow me to reiterate and correctify:
Per IPCC AR5:
Fossil fuel CO2 1750 to 2011: 367 PgC (E15) (In this case I take C to mean CO2) or 3.67 E14 kg.
45% residual: 1.65E14 kg (per World Bank 4C, IPCC says 43%. Tomato –tomahto)
Atmospheric CO2 at 390.5 ppm (2011) mole basis: 3.05E15kg
1.65E14/3.05E15 is 5.4%.
IMO there are only two questions that matter:
1) Does human fossil fuel use add a significant amount of CO2 to the atmosphere? At 5.4% I’d say not. (If you have a different result, pls show/share your work.)
2) Does CO2 play a significant role in controlling the climate? The pause says not.

Reply to  nickreality65
May 5, 2015 9:40 am

367 PgC is the carbon in 1345 Pg of CO2. The mass of the carbon alone is a common unit of global carbon budget figures. The conversion factor is 44.01/12.01, since these are the molar masses of CO2 and carbon respectively in grams per mole. 45% of 1345 Pg CO2 is 605 Pg of CO2 retained in the atmosphere, or 6.05E14 kg of CO2. A petagram is 1E15 grams, or 1E12 kg. 605 Pg of CO2 is 20% of the 2011 atmospheric total.
This is not counting CO2 added to the atmosphere by cement production and land use changes.

Willis Eschenbach
Reply to  nickreality65
May 5, 2015 10:04 am

Nick, you have a huge error in your numbers, as I discussed above.
w.

Eliza
May 4, 2015 10:07 am

Dr Roy Spencer is reporting an anomaly of 0.17C for April 2015 so this will bring down the trend further. If the cold trend continues to end of the year I think we can safely bury AGW. LOL

Andres Valencia
Reply to  Eliza
May 4, 2015 10:22 am

Sorry, Eliza, but there is no way to safely bury AGW; it’s very toxic waste, has to be incinerated. 😉

MarkW
Reply to  Andres Valencia
May 4, 2015 11:43 am

What kind of carbon footprint would incinerating it have?

Andres Valencia
May 4, 2015 10:19 am

Thanks, Christopher, Lord Monckton.
Your excellent article sheds a lot of light on this pseudo-scientific scam of CO2-induced global warming. Atmospheric CO2 keeps increasing; global temperatures remain unchanged. What gives?
I just see a very low CO2 climate sensitivity and no positive feedbacks, that’s all I see.

Monckton of Brenchley
Reply to  Andres Valencia
May 4, 2015 12:15 pm

Señor Valencia is right on all counts. It is becoming very difficult to maintain with a straight face that climate sensitivity is much above 1 K per CO2 doubling. But how to penetrate the wall of ignorance in the media, and hence among governments, is the real problem.

Reply to  Monckton of Brenchley
May 4, 2015 9:54 pm

I completely agree with climate sensitivity being very likely between 1 and 1.5 degrees C of warming per doubling of atmospheric CO2 in the past century or two, and decreasing as atmospheric CO2 increases past where it has been lately. (The lapse-rate negative feedback appears to me to increase with increased presence of greenhouse gases, including water vapor from the water-vapor positive feedback.)
(This is in opposition to the IPCC, which appears to me to favor around 2.5-3 degrees C per doubling of atmospheric CO2. The zero-feedback number (other than Planckian radiation effects) is close enough to 1.1 degrees C per doubling of atmospheric CO2.)
What I disagree with are statements that place the start of the pause so that a century-class short-term spike falls shortly after the start point, and claims that current/recent climate sensitivity to CO2 increase is much less than 1 degree C per doubling of atmospheric CO2.

Leonard Lane
Reply to  Andres Valencia
May 4, 2015 12:29 pm

Agree, thanks Andres.

Reply to  Max Photon
May 4, 2015 8:41 pm

It seems to me that his poor little eyes are saying, “Your adjusted and homogenized temperature calculations using thermometer readings as much as 1000 miles distant from where I’m standing are bulls#!t. It’s f^ck!ng cold out here. May I please come in?”
Maybe it’s just me…

taz1999
May 4, 2015 10:24 am

Vaguely on topic here. I understand from old comments on WUWT that the global models run 24/7 in sunlight (no night). I have no way to verify this. If true, then how do you support a model missing half of the equation? And secondly, what trickery is going on in the models? If the sun shines 24/7 then the models have to run hot; how do the models get anywhere close to correct? (This Terra is not Tatooine.) (May the fourth be with you.)

Reply to  taz1999
May 4, 2015 7:06 pm

The computers running the models are solar powered, so they simply stop running at night. Problem solved.

Idiot of Village
May 4, 2015 10:40 am

On topic and on message:
But…but…but Sir Christopher. There’s something I don’t understand. I thought you peddled the RSS data set as the most reliable because of its super-sensitivity to el Niño events (re: RSS’s massive, exaggerated spike in the Great 1998 el Niño). Could you please enlighten us Villagers as to why surface temperature measurements, at the moment, are on the up, whereas the heavily adjusted RSS and ‘new improved’ adjusted UAH are showing, well… a flat trend.
Just so that we can counter the snide remarks of those evil Warmistas

Stephen Richards
Reply to  Idiot of Village
May 4, 2015 11:34 am

Go back to picking through the cow dung, where you will be better suited.

MarkW
Reply to  Idiot of Village
May 4, 2015 11:47 am

It really amazes me how the useful idiots get their panties in a wad over the adjusting of the satellite data, but not the adjusting of the earth station data.
The reason for the satellite adjustments was clearly explained and the methodology of the adjustments was out in the open for anyone to criticize. But nobody has.
On the other hand both the reasons and the methodology for adjusting the ground based stations are so secret that not even FOIA requests can shake them free.

RD
Reply to  Idiot of Village
May 4, 2015 12:12 pm

Village Idiot’s kin, no doubt

Idiot of Village
Reply to  RD
May 4, 2015 12:47 pm

Yes, a fair portrayal of our Village 🙂

ren
May 4, 2015 10:44 am

It may be a hot summer in the central and eastern US, due to weak El Niño.
http://weather.gc.ca/data/saisons/images/2015050400_054_G6_global_I_SEASON_tm@lg@sd_000.png

Reply to  ren
May 4, 2015 10:09 pm

A weak El Nino is more of an El Nino than ENSO-neutral is. My experience in and near Philadelphia says that ENSO positivity indicates a mild summer in USA’s Northeast Corridor, and eastern USA in general. There are already 1-month and 2-month scale forecasts indicating that the northeast / mid-Atlantic USA will probably have a cooler-than-average summer. My expectation is that Philly will have a warmer summer than that of 2014 (which I felt was the coolest Philly summer since the cool one of 1976, which unusually did not reach 95 F).

Dr Norman Page
May 4, 2015 11:01 am

Monckton
Why do you persist in following the IPCC models and forecast climate trends ahead linearly when the climate is clearly controlled by natural orbital cycles and cycles in solar activity – most importantly on a time scale of human interest the millennial solar activity cycle?
It is of interest that the trends in the UAH v6 satellite data are now much closer to the RSS satellite data. In particular they confirm the RSS global cooling trend since 2003, when the natural millennial solar-activity-driven temperature cycle peaked.
see
http://www.woodfortrees.org/graph/rss/from:1980.1/plot/rss/from:1980.1/to:2003.6/trend/plot/rss/from:2003.6/trend
It is the satellite data sets which should be used in climate discussions because the land and sea based data sets have been altered and manipulated so much over the years in order to make them conform better with the model based CAGW agenda.
The IPCC climate models are built without regard to the natural 60-year and, more importantly, the 1000-year periodicity so obvious in the temperature record. This approach is simply a scientific disaster and lacks even average commonsense. It is exactly like taking the temperature trend from, say, February to July and projecting it ahead linearly for 20 years or so. The models are back-tuned for less than 100 years when the relevant time scale is millennial. This is scientific malfeasance on a grand scale.
The temperature projections of the IPCC and Met Office models, and all the impact studies that derive from them, have no solid foundation in empirical science, being derived from inherently useless and structurally flawed models. They provide no basis for the discussion of future climate trends and represent an enormous waste of time and money. As a foundation for governmental climate and energy policy their forecasts are already seen to be grossly in error and are therefore worse than useless.
A new forecasting paradigm urgently needs to be adopted and publicized ahead of the Paris meeting.
For forecasts of the timing and extent of the coming cooling based on the natural solar activity cycles – most importantly the millennial cycle – and using the neutron count and 10Be record as the most useful proxy for solar activity check my blog-post at
http://climatesense-norpag.blogspot.com/2014/07/climate-forecasting-methods-and-cooling.html
The most important factor in climate forecasting is where the earth is in regard to this quasi-millennial natural solar activity cycle, which has a period in the 960-1020 year range. For evidence of this cycle see Figs. 5-9. From Fig. 9 it is obvious that the earth is just approaching, just at, or just past a peak in the millennial cycle. I suggest that, more likely than not, the general trends from 1000-2000 seen in Fig. 9 will repeat from 2000-3000, with the depths of the next LIA at about 2650. The best proxy for solar activity is the neutron monitor count and the 10Be data. Based on the Oulu neutron count (Fig. 14), the solar activity millennial maximum peaked in Cycle 22, in about 1991. There is a varying lag between the change in solar activity and the change in the different temperature metrics. There is a 12-year delay between the neutron peak and the probable millennial cyclic temperature peak seen in the RSS data in 2003.
There has been a declining temperature trend since then (usually misinterpreted as a “pause”). There is likely to be a steepening of the cooling trend in 2017-2018, corresponding to the very important Ap index break below all recent base values in 2005-6 (Fig. 13). The polar excursions of the last few winters are harbingers of even more extreme winters to come more frequently in the near future.
It is important that prominent spokesmen for the skeptic cause speak out forcefully prior to the Paris meetings so that decisions can be based more solidly on the real-world data rather than on the virtual, delusory world of the CAGW propagandists’ projections. It would be helpful if you would, or could, in future presentations state that the earth is now past the millennial temperature peak and headed for several hundred years of general cooling, modified from time to time by the shorter multi-decadal and centennial periodicities (the 60-year, Gleissberg, and de Vries cycles).

Salvatore Del Prete
Reply to  Dr Norman Page
May 4, 2015 11:03 am

The reason he does this is that Monckton, as far as I can tell, is not to any great degree in the camp of solar/climate connections.

Reply to  Salvatore Del Prete
May 4, 2015 10:15 pm

As I have posted above, I see the RSS data showing a corner being turned in late 2003, while Monckton argues that the pause started about 1.5 years before the center of a century-class short-term spike.

Rainer Bensch
Reply to  Dr Norman Page
May 4, 2015 11:32 am

Page, why do you persist in pretending Monckton forecasts climate trends ahead linearly? He doesn’t.

Dr Norman Page
Reply to  Rainer Bensch
May 4, 2015 11:51 am

Compare Monckton’s Fig. 1 here with what I believe more accurately describes what has happened and what may happen.
http://www.woodfortrees.org/plot/rss/from:1980.1/plot/rss/from:1980.1/to:2003.6/trend/plot/rss/from:2003.6/trend
Quite a different view of what the past implies and what the future may bring, I think.
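For anyone who wants to check the two trend lines in that plot without woodfortrees, a minimal sketch (assuming a two-column text file of decimal dates and RSS anomalies, such as the export from woodfortrees.org/data; the filename is a placeholder):

    import numpy as np

    data = np.loadtxt("rss.txt")        # col 0: decimal date, col 1: anomaly (K)
    t, y = data[:, 0], data[:, 1]
    split = 2003.6                      # break point used in the plot above

    for lo, hi in [(1980.1, split), (split, t[-1])]:
        m = (t >= lo) & (t <= hi)
        slope = np.polyfit(t[m], y[m], 1)[0]       # K per year
        print(f"{lo:.1f}-{hi:.1f}: {slope * 100:+.2f} K/century")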

Reply to  Rainer Bensch
May 4, 2015 12:23 pm

Herr Bensch is correct. In these monthly updates I merely report the observed temperature changes, calculate the least-squares linear-regression trend for the longest period that shows a zero trend, and also compare the observed trend since 1990 to the interval of predictions made by the IPCC in 1990 and the revised interval of predictions made by the IPCC in 2007.
I am not expert enough in analysis of solar output to determine the extent to which changes in solar activity are currently influencing global temperature, so I make no pronouncement on that. The four significant influences on climate are the Sun, the mantle, internal variability and Man. I cannot say how much each influence has contributed to recent temperature trends – and nor, I suspect, can anyone else, since both observation and theory are insufficiently mature.
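For readers who wish to reproduce the pause-length calculation, a minimal sketch of the "farthest back with a sub-zero trend" search (again assuming a two-column date/anomaly file; the filename is a placeholder):

    import numpy as np

    data = np.loadtxt("rss.txt")        # col 0: decimal date, col 1: anomaly (K)
    t, y = data[:, 0], data[:, 1]

    # Find the earliest start month from which the least-squares trend
    # to the present is zero or negative.
    pause_start = None
    for i in range(len(t) - 24):        # require at least two years of data
        if np.polyfit(t[i:], y[i:], 1)[0] <= 0:
            pause_start = t[i]
            break

    if pause_start is not None:
        months = round((t[-1] - pause_start) * 12) + 1
        print(f"Zero trend since {pause_start:.2f}: about {months} months")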

Reply to  Rainer Bensch
May 4, 2015 3:32 pm

Monckton, it depends on what you are observing and how you use it. I maintain that if you continue to discuss the "pause" you essentially concede the policy argument to the IPCC and UNFCCC. They merely come up with one of their specious ad hoc excuses for the "pause" and say we need to control GHGs because CAGW warming will come roaring back soon.
If you climb a mountain 1000 ft high, you will be able to say, when you are 500 ft down past the peak, that the ascending trend has paused for the last 1000 ft of movement. Is this a useful description of what has been going on? No.
There is good evidence that we have entered the descending, i.e. cooling, phase of the millennial cycle and that phase will last until about 2650. I think your rhetorical skills could make that case very powerfully if you so chose.
Also, as I said above:
"The IPCC climate models are built without regard to the natural 60-year and, more importantly, the 1000-year periodicity so obvious in the temperature record. This approach is simply a scientific disaster and lacks even average common sense. It is exactly like taking the temperature trend from, say, February to July and projecting it ahead linearly for 20 years or so. The models are back-tuned for less than 100 years when the relevant time scale is millennial. This is scientific malfeasance on a grand scale."
This is an argument that even the mainstream media – especially the BBC and Guardian editors – might be able to grasp, and that you might have a lot of fun with.

ren
Reply to  Dr Norman Page
May 4, 2015 12:10 pm

“Also, changes in the solar spectrum, in particular in the UV, could enhance this influence by affecting stratospheric chemistry: most importantly the balance between ozone production and destruction. Although the UV radiation shortward of 400 nm contributes only about 8% to the magnitude of the total solar irradiance, it is responsible for about 60% of the variation of the total irradiance.”
http://www2.mps.mpg.de/projects/sun-climate/data.html

Reply to  ren
May 4, 2015 2:00 pm

Here is what my post says re solar influence.
“NOTE!! The connection between solar “activity” and climate is poorly understood and highly controversial. Solar “activity” encompasses changes in solar magnetic field strength, IMF, CRF, TSI, EUV, solar wind density and velocity, CMEs, proton events etc. The idea of using the neutron count and the 10Be record as the most useful proxy for changing solar activity and temperature forecasting is agnostic as to the physical mechanisms involved.
Having said that, however, it is reasonable to suggest that the three main solar activity related climate drivers are:
a) the changing GCR flux – via the changes in cloud cover and natural aerosols (optical depth)
b) the changing EUV radiation – top down effects via the Ozone layer
c) the changing TSI – especially on millennial and centennial scales.
The effect on climate of the combination of these solar drivers will vary non-linearly depending on the particular phases of the eccentricity, obliquity and precession orbital cycles at any particular time.
Of particular interest is whether the perihelion of the precession falls in the northern or southern summer at times of higher or lower obliquity.”
Note, however, that it is not necessary to be able to calculate or even understand all the processes involved in order to make reasonable forecasts. This is the basic mistake of the IPCC approach: they think they understand the system well enough to calculate outcomes from the bottom up. We don't – and the outcomes couldn't be computed anyway, even if we did. If you know where we are with regard to the natural periodicities, rational forecasts can be made, as at
http://climatesense-norpag.blogspot.com/2014/07/climate-forecasting-methods-and-cooling.html
It is this different approach I am trying to encourage Monckton to embrace.

ren
Reply to  ren
May 5, 2015 1:19 am

Changes in UV and GCR work together in the ozone layer. Therefore, the loss of ozone can be greater than we think.

rd50
Reply to  Dr Norman Page
May 4, 2015 4:27 pm

I hope Monckton stays on track and continues to publish his data every month, up or down results. He is not predicting or projecting; he is observing.
The idea here is not to try to produce a new model or to ponder other elements of climate change to counter the IPCC. If you want someone to counter the IPCC and the next Paris meeting, your post is of absolutely no help.
Monckton is one of the persons to do it.
He is strictly on track, showing that the IPCC models are deviating from the actual measurements – as he wrote, "the ever-growing discrepancy between the temperature trends predicted by the models".
This is what is important. And the longer the actual measurements deviate – and not only deviate, but with the differences increasing each year – the better Monckton's case is. The greater the differences become, the greater the difficulty the IPCC will have calling against fossil-fuel use, and obviously the greater the difficulty for the modelers in adjusting their models to fit the actual measurements.
This now brings me to the second point I hope Monckton will bring against the IPCC. (I know he knows this).
Forget ancient times. Get the IPCC on the CO2-versus-temperature relationship in modern times.
Here he has extremely reliable CO2 atmospheric concentrations since 1958 from Mauna Loa. 1958 is more than good enough; 1958 to now is the relevant period to bring against the IPCC "hypothesis" that CO2 is "the" driver. Simply not quite. Plenty of graphs on the Internet show the relationship between the increase in CO2 and the temperature measurements from 1958. It "did happen": a short period of increasing CO2 (starting in 1958) with flat or slightly decreasing temperature; then a nice correlation between increasing CO2 and temperature; followed by the phase we are now in – CO2 increasing just as before, but with no corresponding increase in temperature.
So Monckton does not have to invent anything. He has the IPCC crawling with two issues.
So please forget "new approaches". The IPCC has none. You have to call the IPCC on their old hypothesis, not try to have them embrace new ones.

Reply to  rd50
May 4, 2015 5:25 pm

RD50, see my earlier reply to Monckton.
http://wattsupwiththat.com/2015/05/04/el-nio-has-not-yet-paused-the-pause/#comment-1925779
The IPCC approach is inherently meaningless – it falls into the category of being "not even wrong" – so trying to refute it on its own terms is equally meaningless. See section 1 at
http://climatesense-norpag.blogspot.com/2014/07/climate-forecasting-methods-and-cooling.html

rd50
Reply to  rd50
May 4, 2015 5:47 pm

To Dr Norman Page at 5:25 PM
I did read your posted links.
There is nothing in them that shows the IPCC is wrong.
You can write "The IPCC approach is inherently meaningless". So easy to write. Provide the argument showing it is meaningless, or else.
You can write "it falls…". Same thing: provide the argument… or else.
Monckton is doing the heavy lifting, month after month. He follows what Mother Nature is presenting – no more, no less, and no speculation.
Please read what the IPCC "hypothesis" is. Monckton did, and he sticks to the facts involved in exploring and testing whether that hypothesis is correct. No BS.

Reply to  rd50
May 4, 2015 6:25 pm

RD50
First, as I said earlier, the most obvious problem is:
"This approach is simply a scientific disaster and lacks even average common sense. It is exactly like taking the temperature trend from, say, February to July and projecting it ahead linearly for 20 years or so. The models are back-tuned for less than 100 years when the relevant time scale is millennial. This is scientific malfeasance on a grand scale."
Second, GCMs such as they use are inherently incomputable. This alone makes their method useless.
I do not believe you read section 1 of my linked post – you certainly didn't watch the Essex presentation which is linked there. I said:
"The modelling approach is also inherently of no value for predicting future temperature with any calculable certainty because of the difficulty of specifying the initial conditions of a sufficiently fine-grained spatio-temporal grid of a large number of variables with sufficient precision prior to multiple iterations. For a complete discussion of this see Essex: https://www.youtube.com/watch?v=hvhipLNeda4"
Third, their models are incorrectly structured – see Fig 1 of the linked post and:
"The only natural forcing in both of the IPCC Figures is TSI, and everything else is classed as anthropogenic. The deficiency of this model structure is immediately obvious. Under natural forcings should come such things as, for example, Milankovitch orbital cycles, lunar-related tidal effects on ocean currents, earth's geomagnetic field strength and, most importantly on millennial and centennial time scales, all the solar activity data time series – e.g., solar magnetic field strength, TSI, SSNs, GCRs (effect on aerosols, clouds and albedo), CHs, CMEs, EUV variations, and associated ozone variations and Forbush events."
Fourth:
The IPCC climate models are further incorrectly structured because they are based on three irrational and false assumptions: first, that CO2 is the main climate driver; second, that in calculating climate sensitivity the GHE due to water vapor should be added to that of CO2 as a positive feedback effect; and third, that the GHE of water vapor is always positive. As to the last point, the feedbacks cannot always be positive, otherwise we wouldn't be here to talk about it. For example, an important negative feedback related to tropical cyclones has recently been investigated by Trenberth; see:
http://www.cpc.ncep.noaa.gov/products/outreach/proceedings/cdw31_proceedings/S6_05_Kevin_Trenberth_NCAR.ppt
Temperature drives CO2 and water vapor concentrations and evaporative and convective cooling independently. The whole CAGW–GHG scare is based on the obvious fallacy of putting the effect before the cause. Unless the range and causes of natural variation, as seen in the natural temperature quasi-periodicities, are known within reasonably narrow limits, it is simply not possible even to begin to estimate the effect of anthropogenic CO2 on climate. In fact, the IPCC recognizes this point.
The key factor in making CO2 emission control policy, and the basis for the WG2 and WG3 sections of AR5, is the climate sensitivity to CO2. By AR5 WG1 the IPCC itself is saying (Section 9.7.3.3):
"The assessed literature suggests that the range of climate sensitivities and transient responses covered by CMIP3/5 cannot be narrowed significantly by constraining the models with observations of the mean climate and variability, consistent with the difficulty of constraining the cloud feedbacks from observations."
In plain English, this means that the IPCC contributors have no idea what the climate sensitivity is. Therefore there is no credible basis for the WG2 and WG3 reports, and government policy makers have no empirical scientific basis for the entire UNFCCC process and their economically destructive climate and energy policies.
There is simply no point in trying to refute the IPCC forecasts, which are essentially gobbledygook.

rd50
Reply to  rd50
May 4, 2015 7:11 pm

To Dr Norman Page at 6:25
You wrote:
“There is simply no point in trying to refute the IPCC forecasts which are essentially gobbledygook.”
You don’t need to try to refute the IPCC forecasts.
Can’t you just look at the surface temperature data or satellite temperature data furnished by Mother Nature?
Mother Nature is refuting the IPCC forecasts. Do I need to send you a graph?
Do I need to send you a graph of CO2 atmospheric concentration from 1958 vs. global surface temperature anomalies?
No need to invent anything. When the IPCC started (1988), yes, CO2 was a possibility – for a while, and interestingly only after a pause or a decline in temperature anomalies while CO2 was increasing! Nobody at that time paid much attention to the decline in temperature anomalies from 1958 to about 1975 or so. The excitement came after that period. But now we are back to no correlation between increasing atmospheric CO2 and temperature anomalies.
So these are the issues to raise with the IPCC – not trying to talk them into something new or different. The IPCC is not a scientific research organization.

May 4, 2015 11:01 am

Three observations: first of all, the El Niño is weak; secondly, it is out of season; third, the blob of warm water has been and still is present in the eastern Pacific.
Is all of this limiting the El Niño's warming effects?
In addition, the Southern Hemisphere has mostly below-average sea surface temperatures, while the Atlantic is cooling.

taxed
May 4, 2015 11:27 am

The jet stream is interesting at the moment, because it is well to the south of its normal track across the Atlantic. Should this last into the summer, it looks like there will be little in the way of warming over the Atlantic and NW Europe this year.

ren
Reply to  taxed
May 4, 2015 11:54 am

Annual Atlantic Multidecadal Oscillation (AMO) index values since 1856. The thin line indicates 3 month average values, and the thick line is the simple running 11 year average. Further explanation in text above. Data source: Earth System Research Laboratory at NOAA. Last year shown: 2014. Last diagram update 20 January 2015.
http://www.climate4you.com/images/AMO%20GlobalAnnualIndexSince1856%20With11yearRunningAverage.gif
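To reproduce the thin and thick curves, a short sketch with pandas (assuming a CSV with "date" and "amo" columns holding the monthly index; both names are placeholders):

    import pandas as pd

    amo = pd.read_csv("amo.csv", parse_dates=["date"], index_col="date")["amo"]

    thin = amo.rolling(window=3, center=True).mean()      # 3-month average
    thick = amo.rolling(window=132, center=True).mean()   # 11-year (132-month) average
    print(thick.dropna().tail())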

Phlogiston
May 4, 2015 11:38 am

But the pause has paused el Nino.

Harry Passfield
May 4, 2015 11:50 am

Chris: I do hope you heard John Humphries on Radio 4's TODAY programme this morning. He was interviewing (giving a plug to!) Stern, and remarked on the fact that Paris will be the 21st COP – "and still the world is warming", he said. I figure you have enough clout to get this post in front of the man and make him see where he's going wrong.
As for Stern, he’s a lost cause – having succumbed to Mammon (courtesy of Grantham).

Reply to  Harry Passfield
May 4, 2015 12:30 pm

The only way to get the BBC to behave is to take its Trust to court for failing to honor its obligation to require the BBC to be impartial. The procedure is known as judicial review of administrative action. We are preparing a substantial report on years of prejudice on the BBC’s part, which will be sent to the Trust with an ultimatum. If it does not respond properly, the court will hear the case and the BBC will have to convince it that its flagrantly prejudiced and sullenly one-sided coverage is permissible. It will not succeed. Punitive damages against the BBC could run into the tens or even hundreds of millions, because the BBC takes $3 billion in poll tax from UK households and is obliged by its contract with the Secretary of State for “Culture” to be impartial, and there is a long history of correspondence from various parties complaining about the BBC’s bias. We shall need some funds to pay for the case, but it will be nothing like as costly as suing in the U.S.

richardscourtney
Reply to  Monckton of Brenchley
May 6, 2015 8:30 am

Monckton of Brenchley
You say

The only way to get the BBC to behave is to take its Trust to court for failing to honor its obligation to require the BBC to be impartial.

Sorry, but there is a problem with that, and I explain it as follows.
I recently complained about the BBC television programme titled 'Climate Change by Numbers' because it was blatantly biased and factually inaccurate political propaganda. The BBC did not reply, but I recorded my complaint on WUWT here.
I complained to the BBC Trust that the BBC had not replied, and asked the Trust to address the subject of my complaint. I recorded that complaint to the BBC Trust on WUWT here and recorded the reply from the BBC Trust on WUWT here.
I subsequently provided this reply by email to the BBC Trust but have had no reply.

Re: BBC Complaint – Climate Change by Numbers
Dear Ms Seehra:
Nearly a month has passed since I received your reply to me that I copy below. It reported that the BBC Audience Services had “advised that they will send you a response to [my] concerns as soon as possible and the complaints manager has asked [you] to pass on his apologies for the delay”. Since then I have heard nothing from them.
Please note that my complaint was and is that the BBC clearly breached its Charter by broadcasting blatantly biased and factually inaccurate political propaganda in the form of the programme titled ‘Climate Change by Numbers’ broadcast on BBC4 at 21:00 to 22:15 hours on Monday 2 March 2015. My complaint consisted of clear and accurate information and argument supported by all necessary references.
It seems from your reply to me that I am in a ‘Catch 22’.
You say;
“I should explain that the BBC complaints process requires that complaints must be dealt with in the first instance by the BBC’s management; the Trust’s role in this process is only at the final stage, hearing complaints on appeal.”
But the BBC’s failure to reply to my complaint means there is no formal reply for me to appeal.
I am now writing to you to assert that
(a) the BBC’s failure to reply to my complaint is a response to my complaint
and
(b) the BBC’s failure to reply to my complaint is a tacit admission that they cannot dispute my complaint.
Hence, I am requesting that
(1) you accept this email as being my appeal against the BBC’s response to my complaint
and
(2) you call upon the BBC to correct its Breach of the BBC Charter by making a public declaration that the programme was blatantly biased and factually inaccurate political propaganda.
I am continuing my practice of making a public record of all my correspondence on this matter at
http://wattsupwiththat.com/2015/03/03/countdown-to-alarm/#comment-1874935
and, therefore, I am copying this email to that record.
Richard S Courtney

The issue of the ‘Catch22’ inhibits possibility of the legal case against the BBC Trust which you propose.
Richard
PS I like to think my complaint has acted to inhibit ‘BBC bollocks’ on climate change during the General Election period.

richardscourtney
Reply to  Monckton of Brenchley
May 6, 2015 8:34 am

Sorry, but two of my links did not work. Please scroll down from the link that does work to find them.
Richard

Keitho
Editor
Reply to  Harry Passfield
May 5, 2015 4:56 am

Stern also lied about "subsidies" to fossil fuels and about the expected temperature increase by century's end, which he said would be 4.5 °C. Humphries doesn't know enough about the topic even to ask the right questions.

jorgekafkazar
May 4, 2015 11:51 am

“Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph:”
Mears' cherry-picking argument is ridiculous. If we were going to cherry-pick a start date, why wasn't March 1998 chosen? Mears has nothing.

Reply to  jorgekafkazar
May 4, 2015 12:33 pm

Jorge Kafkazar is correct. There is no cherry-picking. That is one of the default slogans of the true-believers, along with "settled science" and "consensus". Any data that contradict the party line (er, consensus) are said to have been cherry-picked. Childish, really.

May 4, 2015 12:55 pm

“The four significant influences on climate are the Sun, the mantle, internal variability and Man. I cannot say how much each influence has contributed to recent temperature trends – and nor, I suspect, can anyone else, since both observation and theory are insufficiently mature.” ~ Monckton of Brenchley
I would add that the mass of the atmosphere in conjunction with gravity has a lot to do with climate as does water in all its phases. The fact this is a “water world” is very significant to understanding our weather machine I think.
I do agree that observation and theory, at this point in time, is very immature. Someday, probably long after both you and I have passed on, mankind will take a fresh look at this issue and see that CO2 does squat as far as warming goes. Other factors are vastly more important than the theory James Hansen spun up to scare the masses.

May 4, 2015 1:11 pm

Actually, it is not known whether the ocean is warming: each of the 3600 automated ARGO bathythermograph buoys somehow has to cover 200,000 cubic kilometres of ocean – a 100,000-square-kilometre box more than 316 km on a side and 2 km deep. Plainly, results at a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of a single temperature and salinity profile taken at a single point in Lake Superior less than once a year) are not going to be much better than guesswork.

It would surprise me if Willis approved of this formulation. It is a good example of how statistics can be used to mislead readers.
The fact is that the 3600 automated ARGO bathythermograph buoys give an extremely good picture of ocean temperatures – much better than the often badly placed surface stations give of air temperatures. At present we therefore probably have a better picture of ocean temperatures than of air temperatures.
Each buoy of course has to cover many cubic kilometers; that is so because the ocean is big. But we have 3600 independent measurements, and that gives a high degree of robustness and high accuracy.
To compare this with one temperature reading in a big lake is just silliness.
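For what it is worth, the coverage arithmetic itself checks out; a back-of-envelope sketch (ocean area rounded to 3.6e8 km²):

    # Rough check of the one-float-per-200,000-km3 figure.
    ocean_area_km2 = 3.6e8       # total ocean surface area, approximate
    sampled_depth_km = 2.0       # ARGO floats profile roughly the top 2000 m
    n_floats = 3600

    sampled_volume = ocean_area_km2 * sampled_depth_km   # ~7.2e8 km3
    per_float = sampled_volume / n_floats                # ~200,000 km3
    box_side_km = (per_float / sampled_depth_km) ** 0.5  # ~316 km
    print(f"{per_float:,.0f} km3 per float; box side {box_side_km:.0f} km")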
/Jan

Reply to  Jan Kjetil Andersen
May 4, 2015 1:21 pm

"The fact is that the 3600 automated ARGO bathythermograph buoys give an extremely good picture of ocean temperatures – much better than the often badly placed surface stations give of air temperatures."
I agree that the ARGO buoys give a much better picture of ocean temperatures than the surface stations do of air temperatures. But I also think that even ARGO leaves much room for improvement. In fact, the whole idea of trying to take the planet's temperature to a hundredth of a degree seems preposterous.

Reply to  markstoval
May 4, 2015 4:06 pm

I would think that the Argo measurements should be considered, at best, a low-resolution description of the state of the oceans. You know what would be an interesting test for Argo: run a group of floats in relatively close proximity – perhaps 20 to 30 floats in Monckton's 316 km box – and then see how much variation exists within that box. That would be very informative.

Reply to  markstoval
May 5, 2015 12:26 pm

Goldminor says:

You know what would be an interesting test for Argo would be to run a group of floats in relatively close proximity

Goldminor, you see this with the Argo buoys because they are floating around and sometimes they are close to each other.
/Jan

Reply to  Jan Kjetil Andersen
May 4, 2015 3:28 pm

Mr Andersen is incorrect. Willis Eschenbach indeed made the quite appropriate comparison described in the Technical Note. Only one ARGO bathythermograph per 200,000 cubic kilometers of ocean is equivalent to one bathythermograph per Lake Superior, taking less than one measurement per year.
It would be good to be able to be sure about the ARGO network’s accuracy, because if its record is correct then there is very, very little ocean warming, and there is no climate crisis. Unfortunately, the measurements are manifestly not sufficiently resolved to allow any definite conclusion to be drawn.
The best one can say is that the ARGO network is less ill-resolved than the previous methods of attempting to measure ocean temperatures. And of course the mean ocean temperature cannot be accurately measured to the nearest hundredth of a degree: we are taking nothing like enough measurements for such precision.

Reply to  Monckton of Brenchley
May 4, 2015 4:15 pm

Additionally, even if the floats were properly distributed, the ARGO system doesn't measure the temperature of the whole ocean; it measures only that of the upper 2000 meters.

Arno Arrak
Reply to  Monckton of Brenchley
May 4, 2015 6:37 pm

The Argo network enters prominently into the "missing heat" mystery of Trenberth and Fasullo [Science 328:316 (2010)]. They presented a graph showing that oceanic heat content began to go down in 2004 – so much so that by 2008 eighty percent of it had disappeared. They had no idea what was going on, but justified publishing this absurdity with the statement that "…Since 2004, ~3000 Argo floats have provided regular temperature soundings of the upper 2000 m of the ocean, giving new confidence in the ocean heat content assessment…" If I had been the reviewer I would have sent them back to check those buoys until they found the explanation. As it happens, the authors, the reviewers, and the editors all failed to stop this stupidity from becoming part of the scientific literature.

Reply to  Monckton of Brenchley
May 4, 2015 9:46 pm

Only one ARGO bathythermograph per 200,000 cubic kilometers of ocean is equivalent to one bathythermograph per Lake Superior, taking less than one measurement per year

Uncertainty in measurements is all about making many independent measurements. If you have only one thermometer you cannot say anything about the confidence, because you may have a systematic error in your instrument. If you have a number of independent instruments, and all show the same trend, you can be pretty sure that the trend is correct.
Nobody says that the number of cubic kilometers per instrument is wrong, but to compare one measurement in one large, shallow lake with thousands of measurements over a large, deep ocean is just very misleading.
As Anthony's surface-station project has shown, many of the surface stations measuring air temperature are ill-placed even in North America, which is developed and sparsely populated. We can assume that the situation is worse in most other places. And satellite measurements should definitely be used with care, as the latest large UAH adjustments show.
On the other hand, there are now 3847 independent Argo floats in the oceans. Some areas are over-populated while others have gaps that need to be filled with additional floats, but that does not alter the overall picture: we have much better ocean measurements than air measurements.
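The point about many independent measurements can be made precise: for random, uncorrelated instrument errors, the uncertainty of the mean shrinks as 1/sqrt(N). A minimal simulation (all numbers hypothetical):

    import numpy as np

    rng = np.random.default_rng(1)
    n_floats, sigma = 3600, 0.1      # hypothetical +/-0.1 C random error per float
    true_mean = 0.05

    trials = true_mean + rng.normal(0, sigma, (2000, n_floats))
    estimates = trials.mean(axis=1)
    print(estimates.std())           # ~0.1/sqrt(3600) = 0.0017 C
    # Note: this reduction applies only to independent random errors; an
    # error shared by all floats (a common calibration bias) does not shrink.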
/Jan

Reply to  Monckton of Brenchley
May 5, 2015 1:28 pm

Mr Andersen is incorrect.

No, not incorrect – only surprised and disappointed, because I think you both know better.
Nobody disputes the number of cubic kilometers behind each instrument, but it is misleading to say that making one measurement in a big, shallow lake is comparable in accuracy to making thousands of measurements in a deep ocean.

It would be good to be able to be sure about the ARGO network’s accuracy, because if its record is correct then there is very, very little ocean warming, and there is no climate crisis

You have a point there: the Argo data show smaller warming than expected. The pre-Argo measurements, which had huge uncertainties, showed more warming. Joanne Nova had a good article about this here: http://joannenova.com.au/2013/05/ocean-temperatures-is-that-warming-statistically-significant/
But I do not follow you all the way in saying that Argo shows so little warming that there is no climate crisis. We know that warming of the deep oceans takes a long time, and nobody expected a quick rise in temperatures.
The climate debate is very polarized, and people typically either insist that the hiatus does not exist or insist that the increase has stopped completely. However, the surface measurements show that the rate of increase has slowed from approximately 0.15 C/decade over 1970–2000 to 0.05 C/decade over 2000–2014.
Although the period from 2000 to 2014 is of course too short to draw any conclusions with confidence, I think we can at least take this slower temperature growth as an indication that global warming is still real, but that the climate sensitivity may be lower than was thought two decades ago.
/Jan

KevinK
Reply to  Jan Kjetil Andersen
May 4, 2015 4:45 pm

” But we have 3600 independent measurements, and that give a high degree of robustness and high accuracy. ”
The statement about robustness has no meaning. What exactly is a "degree" of robustness, and how is it defined? Without a definition this portion of your statement is meaningless, sorry.
And NO, a large number of independent measurements DOES NOT give "high accuracy". You cannot average your way to higher accuracy; it simply does not work that way. The accuracy of a data set with 3600 independent measurements is only as good as the least accurate measurement. Accuracy is determined by the calibration technique used to compare each individual sensor against one standard, and by how much and how quickly that calibration drifts away from that standard. In high-tech manufacturing it is common to recalibrate all sensors (thermometers, voltage meters, light meters, etc.) every 12 months.
The ARGO buoys have never been re-calibrated and have likely never been tested for drift. All we know is that they report data with some specified precision (number of real digits); we frankly know nothing about the accuracy of these data other than the fact that they left the factory calibrated.
Number of measurements DOES NOT EQUAL accuracy of measurements, sorry. The ARGO network has been another waste of money demanded by the climate scientists to prove the AGW conjecture. Now that it does not support the conjecture, one explanation is that the "missing heat" somehow managed to sneak around the ARGO sensors (presumably while they were asleep) and is now hiding in the deep ocean.
There is no clear, traceable measurement data on ocean heat content that shows anything other than the null hypothesis: the climate is complex and nobody really knows what is going to happen next, beyond wild arse guessing.
Cheers, KevinK.

Just an engineer
Reply to  KevinK
May 5, 2015 7:33 am

Arse Guessing Wildly you say? 😉

Reply to  KevinK
May 5, 2015 12:11 pm

KevinK, I think you underestimate the Argo project. It is an international cooperation between 33 countries, and hundreds of scientists are involved in it.
You seem so confident – have you ever participated in a project like that?
Well, I have, and I can tell you that there are a lot of smart people out there. The mathematics and statistics in such a project are advanced, and it takes many good scientists to develop, test and verify them.
You say:

The accuracy of a data set with 3600 independent measurements is only as good as the least accurate measurement

This is very naïve, and it is of course wrong. Do you seriously mean that if we have 3599 independent measurements with accuracy +/- 0.1 C and one with accuracy +/- 10 C, the accuracy of the whole set is +/- 10 C?
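The arithmetic says no: in a simple mean of 3600 readings, one sensor wrong by a full 10 C can pull the average by at most 10/3600 of a degree. A two-line check:

    n, rogue_error = 3600, 10.0
    print(rogue_error / n)    # worst-case shift of the mean: ~0.0028 C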
/Jan

richardscourtney
Reply to  KevinK
May 6, 2015 8:52 am

Jan Kjetil Andersen
You say to KevnK

You seem so confident, have you ever participated in a project like that?
Well, I have, and I can tell you that there are a lot of smart people out there.

True, and there are many thick people out there, too.
I have also “participated in a project like that” and I can tell you it is not relevant whether or not

The mathematics and statistics in such a project is advanced, and it takes many good scientists to develop, test and verify the mathematics and statistics.

What matters is whether or not the few members of the project who control the project make two short planks look thin.
Your attempt at ‘appeal to authority’ fails because anybody who thinks the ARGO buoys can measure ocean temperature to an average value of +/-0.1K is a deluded fool.
Richard

Reply to  KevinK
May 6, 2015 3:11 pm

Richard says:

What matters is whether or not the few members of the project who control the project make two short planks look thin.
Your attempt at ‘appeal to authority’ fails because anybody who thinks the ARGO buoys can measure ocean temperature to an average value of +/-0.1K is a deluded fool.

A quote from George Burns comes to mind: “Too bad all the people who know how to run the country are busy driving taxi cabs”
A modern version could be:
“Too bad all the people who know how to run science are busy arguing on blogs”
Excuse the sarcasm, but it was tempting; I hope you can see it in the humorous spirit in which it is meant, Richard.
I do not mean to discourage you from expressing your opinion, Richard; I just think a little more respect for the experts in the field, and for the tremendous work laid down there, would be proper.
If you go to the Argo site, they specify the accuracy as ±0.002 °C, which is fifty times better than +/- 0.1 K. Do you think that they lie, or that they are just deluded fools times fifty?
http://www.argo.ucsd.edu/FAQ.html#accurate
/Jan

Reply to  KevinK
May 6, 2015 3:46 pm

JK Andersen,
ARGO produces measurements. That’s good. But those measurements showed generally declining ocean temperatures:
http://tumetuestumefaisdubien1.sweb.cz/ARGO-sea-temperature-max-max.PNG
So then the data was “adjusted”. That’s bad.
Adjustments are fine, so long as they are documented clearly every step of the way. But in most government science ‘adjustments’, it’s difficult to know what they’re doing.
We know one thing: after adjusting, it almost always turns out that the adjusted data shows more warming, never less. What are the odds, eh?

richardscourtney
Reply to  KevinK
May 6, 2015 11:07 pm

Jan Kjetil Andersen
You say to me

A quote from George Burns comes to mind: “Too bad all the people who know how to run the country are busy driving taxi cabs”.
A modern version could be:
“Too bad all the people who know how to run science are busy arguing on blogs”
Excuse for the sarcasm, but it was tempting; I hope you can see it from the humorous side it is meant Richard.

NO! I do NOT "see it from the humorous side". I can see that your reply to me is defamatory and inflammatory trolling intended to deflect attention from my complete debunking of your untrue assertions.
I suggest you don't bother to attempt to defend your silly assertions concerning the ARGO data if untrue personal abuse is the only defence you can muster.
Richard

Reply to  KevinK
May 7, 2015 12:56 pm

Richard,
I understand that you are a serious guy so I assume you mean it literally when you write

anybody who thinks the ARGO buoys can measure ocean temperature to an average value of +/-0.1K is a deluded fool.

What do you then think of their claim of having a precision fifty times better than that?
Are they lying or are they deluded fools times fifty?
And in figure T5 above you can see the seasonal variations measured by Argo. The values are approximately between 6.43 and 6.48 Celsius.
What do you think of that graph?
/Jan

SkepticGoneWild
Reply to  Jan Kjetil Andersen
May 4, 2015 5:36 pm

All is not rosy with the ARGO system. Here are some of the limitations:
1. Uncorrectable Pressure Sensor Biases in 27% of Floats
“Pressure Sensor Drifts in Argo and Their Impacts”
http://journals.ametsoc.org/doi/pdf/10.1175/2011JTECHO831.1
"In early 2009, 62% of the Argo floats were APEX, and approximately 57% of their profile data could be confidently corrected for pressure biases….. Although positive and negative pressure biases were corrected, the bulk of the corrections were done for positive biases….. About 43% of the APEX profiles are uncorrectable – half of this number due to firmware limitations, where negative pressure drifts are truncated to zero….. The net effect of the positive APEX pressure biases is to overestimate the temperature in the oceans.."

2. Limited coverage for ocean depths greater than 2,000 meters:
"Observational coverage of the ocean deeper than 2000 m is still limited and hampers more robust estimates of changes in global ocean heat content and carbon content. This also limits the quantification of the contribution of deep ocean warming to sea level rise. {3.2, 3.7, 3.8; Box 3.1}" [IPCC AR5 Technical Summary]
3. Incomplete coverage at high latitudes, especially in seasonally ice-covered regions: approximately 10% of the ocean is not sampled.

David A
Reply to  SkepticGoneWild
May 4, 2015 10:17 pm

The floats – well, they float AND drift. Systematic bias is possible with this. Imagine if surface stations drifted.

May 4, 2015 4:33 pm

Average ocean depth: 4,000 meters. Geothermal heat flux on the floors & volcanic vents emitting CO2, CH4, SO2 – anybody’s guess.

Richard M
May 4, 2015 5:46 pm

The biggest component of the ocean surface temperature increase is the blob. When that dissipates, we will likely see a cooling of the surface. If it coincides with the next La Niña, it could be a huge change.
One way to tell whether the start date is affecting the trend is to look at the trend after all the ENSO variation of 1997–2001 is complete – that is, from about January 2001. When this is done, the trend becomes a cooling trend. Hence the start date actually warms the trend; it doesn't cool it.
http://www.woodfortrees.org/plot/rss/from:2001/plot/rss/from:2001/trend

Reply to  Richard M
May 5, 2015 12:19 am

I would say that this is exactly what will happen, Richard.
The blob will dissipate as soon as the next La Niña begins to arrive, later this year sometime.
And then I think there is a chance of a decent drop of temperatures into the negatives, lower than during the last La Niña.

Arno Arrak
May 4, 2015 5:49 pm

Indeed it has not, because El Niño has nothing whatsoever to do with the hiatus/pause we are living through. El Niño is one of two phases of ENSO, the other being La Niña. They are caused by a harmonic oscillation of ocean water from side to side in the equatorial Pacific, which has been active ever since the Panamanian Seaway closed and thereby established the current equatorial current system in the Pacific. The present hiatus, as well as the previous one in the eighties and nineties (presently hidden by false warming), exists because the Arrhenius greenhouse theory used by the IPCC is a fantasy cooked up by pseudo-scientists. The correct greenhouse theory to use today is the Miskolczi greenhouse theory, or MGT. Its prediction is simple: addition of carbon dioxide to air does not warm the air. This is why the hiatus/pause exists now and why another one existed in the twentieth century. MGT came out in 2007 and was quickly blacklisted by the global warming establishment: you could not refer to its existence, and graduate students were kept ignorant of it. It differs from the Arrhenius theory in being able to handle more than one GHG simultaneously; Arrhenius can handle only one, carbon dioxide. Even the IPCC claim that water vapor will triple the amount of greenhouse warming that CO2 alone can produce must be added as an ad hoc amendment that has never been scientifically proved. According to MGT, carbon dioxide and water vapor in the atmosphere jointly form an optimal absorption window in the infrared whose optical thickness is 1.87, determined by Miskolczi from first principles. If you now add carbon dioxide to the atmosphere, it will start to absorb in the IR just as Arrhenius says. But this will increase the optical thickness, and as soon as that happens water vapor will start to diminish and rain out, and the original optical thickness is restored. The added carbon dioxide will of course keep absorbing, but the reduction of water vapor keeps total absorption constant, and no warming is possible. That makes the IPCC's alleged greenhouse warming also impossible. And without greenhouse warming there can be no AGW, the raison d'être of the IPCC. In view of this, our conclusion must be that any attempt to claim that carbon dioxide is causing global warming is either fraudulent or pseudo-scientific, or just plain stupid, or all three.

Editor
May 4, 2015 5:50 pm

Dr Norman Page May 4, 2015 at 3:32 pm

There is good evidence that we have entered the descending, i.e. cooling, phase of the millennial cycle and that phase will last until about 2650.

Given that 2650 is over six hundred years from now, I have to conclude that your definition of "good evidence" and my definition of "good evidence" are diagonally parked in parallel universes …
w.

SkepticGoneWild
Reply to  Willis Eschenbach
May 4, 2015 6:48 pm

No. No. No! Dr. Page is completely wrong regarding the year 2650. The millennial cycle will be one of warming, and it will end in the year………..wait for it…………… 2525! :

Notice the CO2 induced fog swirling around the floor.

SkepticGoneWild
Reply to  SkepticGoneWild
May 4, 2015 7:09 pm

I strongly suggest the IPCC adopt the above as their theme song: a dramatic score with a rising crescendo of bat-crazy alarmism.

Reply to  Willis Eschenbach
May 4, 2015 9:19 pm

Willis, for the evidence that we passed the peak of the millennial solar driver cycle at about 1991, see Figs 14 and 13 at
http://climatesense-norpag.blogspot.com/2014/07/climate-forecasting-methods-and-cooling.html
It is a reasonable working hypothesis that Figs 9 and 5 suggest we are just coming to, just at, or just past a millennial temperature peak, and that Figs 13 and 14 suggest it has just passed –
given about a 12-year lag between the driver peak and the RSS temperature peak. We can see the start of the cooling trend at about 2003, at
http://www.woodfortrees.org/plot/rss/from:1980.1/plot/rss/from:1980.1/to:2003.6/trend/plot/rss/from:2003.6/trend
The 12-year cooling trend is not long enough to prove anything, but it is certainly consistent with having just passed the peak.
As to the 2650 date: the simplest and most conservative hypothesis is that in general the trends during the next millennial cycle will be an approximate repeat of the last one – probably slightly cooler overall, as we will be another thousand years closer to the next big Ice Age by 3000 AD. The peak-to-trough of the last one was about 650 years (1000–1650), with a faster 350-year climb from the Maunder minimum to the current peak (Fig 9). We won't have good evidence for this until about 2120, but we will be able to check the trend along the way to strengthen the hypothesis.

Reply to  Dr Norman Page
May 4, 2015 11:27 pm

Dr Norman Page May 4, 2015 at 9:19 pm

As to the 2650 date: the simplest and most conservative hypothesis is that in general the trends during the next millennial cycle will be an approximate repeat of the last one – probably slightly cooler overall, as we will be another thousand years closer to the next big Ice Age by 3000 AD.

As my brother used to say, "It's easy to predict the future … as long as it looks exactly like the past". All you've done is claim that the next millennium will be a repeat of the last – a most simplistic claim in my book. Look at the centuries as an example: the "simplest and most conservative hypothesis" is NOT that this century will look just like the last century.
In any case, you claimed that you had "good evidence" about 2650 … but now we find out it's just your guess.
Dr. Page, do you see why folks don't pay you much mind? Your exaggerations come back to bite you on the fundamental orifice. You don't have "good evidence" of what will happen in 2650; it's just a boastful fantasy. Nobody knows what the climate will be like in 2650 – that's just your ego hitting warp speed.
If you restrict yourself to believable claims and avoid over-egging your pudding, you’ll get a lot more folks to try to follow your ideas.
w.

Reply to  Dr Norman Page
May 5, 2015 6:24 am

Willis, I agree that my statement
"There is good evidence that we have entered the descending, i.e. cooling, phase of the millennial cycle and that phase will last until about 2650."
is confusing. To clarify: I think the evidence that we have entered the cooling phase of the millennial cycle is good (Figs 5, 9, 13, 14).
That the cooling phase of the millennial cycle will last for 650 years, as it did in the last millennial cycle, is a perfectly reasonable suggestion. I certainly do not expect that the shorter-term centennial and decadal modulations of the trend within those 650 years will track the 1000–1650 trends exactly. In complex natural systems everything only happens once, because other things – i.e. the state of the system as a whole – are never equal.

Aicha Wallaby
May 4, 2015 7:44 pm

Shorter Dr. Mears: “I call these people ‘denialists’ for noticing what I myself have also noticed.”

May 4, 2015 9:17 pm

Lord Monckton,
I’d like to suggest enhancing the following point from your “Key facts” summary…
“Ø The oceans, according to the 3600+ ARGO bathythermograph buoys, are warming at a rate equivalent to just 0.02 Cº per decade, or 0.23 Cº per century.”
It seems that this point badly needs a follow-on statement about the error bars on this number which, if applied, would show that statistically you have almost as much confidence of ocean cooling as of ocean warming. Sorry, but I'm just not impressed or convinced that the 0.02 C per decade result actually shows warming, given the measurement technique, the sensor calibration accuracy, the measurement timing, the total duration of the data set, and the measurements' spatial resolution, among other things.
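The check I have in mind is simple; a sketch with scipy (assuming a two-column file of decimal dates and ARGO-era ocean temperature anomalies; the filename and layout are placeholders):

    import numpy as np
    from scipy.stats import linregress

    data = np.loadtxt("argo_monthly.txt")    # col 0: decimal date, col 1: anomaly (C)
    fit = linregress(data[:, 0], data[:, 1])

    trend = fit.slope * 10                   # C per decade
    ci95 = 1.96 * fit.stderr * 10            # naive 95% interval on the trend
    print(f"trend {trend:+.3f} +/- {ci95:.3f} C/decade")
    # If the interval straddles zero, cooling is statistically about as
    # admissible as warming (autocorrelation would widen the interval further).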
I apologize in advance if I missed a statement about Argo error analysis in the technical notes (I did see the technical details of the system) or if it’s already addressed in this comments thread.
Bruce

Reply to  Boulder Skeptic
May 5, 2015 2:25 am

Boulder Skeptic raises a most interesting point. I suspect that one approach might be to apply Kriging to the ARGO measurements: it is a well-established method of interpolation. But the uncertainty is certainly formidable.
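Kriging is essentially Gaussian-process interpolation, so its flavour can be sketched with scikit-learn. A toy example – the float positions and anomalies below are synthetic, and a real analysis would use a covariance model fitted on a sphere:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(2)
    # Synthetic "float" positions (lon, lat) and temperature anomalies.
    X = np.column_stack([rng.uniform(0, 360, 200), rng.uniform(-60, 60, 200)])
    y = 0.05 + 0.01 * np.sin(np.radians(X[:, 1])) + rng.normal(0, 0.02, 200)

    kernel = 1.0 * RBF(length_scale=20.0) + WhiteKernel(noise_level=1e-4)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

    # Interpolate at an unsampled point; the returned standard deviation is
    # the kriging uncertainty - exactly the quantity at issue in this thread.
    mean, std = gp.predict(np.array([[180.0, 0.0]]), return_std=True)
    print(mean[0], std[0])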

Larry Wirth
May 4, 2015 11:51 pm

Point of order: The currently correct “term of art” is “warmunist.”

richardscourtney
Reply to  Larry Wirth
May 6, 2015 8:57 am

Larry Wirth
As the inventor of the word “warmunist”, I write to say it is only one of several correct “terms of art”. I am sure you can devise a good one, too.
Richard

ren
May 5, 2015 1:58 am
Reply to  ren
May 5, 2015 9:28 am

It looks like the ENSO conditions have peaked in the different regions, especially in regions 3 and 4. Regions 1–2 may hold the higher positive levels for a bit longer, but otherwise it sure looks to me like it is over. If this holds true, then I will have successfully forecast the last 3 peaks and 2 valleys on the MEI. Also, notice the change in conditions in the area of the Blob. That area will look entirely different by the end of July.

ren
May 5, 2015 6:02 am

The temperature at a height of 1500 m shows the most-cooled areas. The blue color indicates the temperature below 0 C.
http://oi57.tinypic.com/iy0oie.jpg

Lars P.
May 5, 2015 8:34 am

El Niño has not paused the pause yet, but some are working hard at it:
pause? what pause?
notrickszone.com/2015/05/02/151-degrees-of-fudging-energy-physicist-unveils-noaas-massive-rewrite-of-maine-climate-history/
Over the last months I have discovered that between 2013 and 2015 some government bureaucrats have rewritten Maine's climate history (and New England's, and the U.S.'s). This statement is not based on my opinion, but on facts drawn from NOAA 2013 climate data vs NOAA 2015 climate data, after they rewrote it.

John Craig
May 5, 2015 2:06 pm

Is it the moon? Metonic cycle?

Brian
May 5, 2015 6:21 pm

Are you sure the satellites use platinum RTDs? I thought they used microwave receivers, which pick up 60 GHz radio waves from excited oxygen molecules.
http://www.remss.com/missions/amsu
“MICROWAVE SOUNDING UNIT (MSU)
The Microwave Sounding Units (MSU) operating on NOAA polar-orbiting satellite platforms were the principal sources of satellite temperature profiles from late 1978 to the early 2000s. The MSUs were cross-track scanners that made measurements of microwave radiance in four channels ranging from 50.3 to 57.95 GHz on the lower shoulder of the oxygen absorption band. These four channels measured the atmospheric temperature in four thick layers spanning the surface through the stratosphere. There were 9 MSUs in total. The last MSU instrument, NOAA-14, ceased reliable operation in 2005.
ADVANCED MICROWAVE SOUNDING UNIT (AMSU)
A series of follow-on instruments, the Advanced Microwave Sounding Units (AMSUs), began operation in 1998. The AMSU instruments are composed of 2 sub-units, AMSU-A and AMSU-B. AMSU-B is a humidity sounder (not discussed further), and AMSU-A is a 15-channel temperature sounder similar to MSU. Of the 15 channels, 11 (channels 4 through 14) are located in the 60 GHz absorption complex and thus are most closely related to atmospheric temperatures at various heights above the surface. The increased number of channels relative to MSU means that AMSU-A samples the temperature of the atmosphere in a larger number of layers. The AMSU measurement footprints are also smaller than those for MSU, leading to higher spatial resolution. 3 AMSU channels (channels 5, 7 and 9) are closely matched to MSU channels 2, 3 and 4. By using these channels, we have extended our climate-quality dataset to the present."

KevinK
May 5, 2015 6:55 pm

Jan wrote;
“This is very naïve and it is of course wrong. Do you seriously mean that if we have 3599 independent measurements with accuracy +/- 0.1C and one with accuracy +/- 10 C, the accuracy of the whole set is +/- 10C?”
YES – if you have no prior knowledge of which sensor is only accurate to +/- 10 C, then the entire data set is only good to +/- 10 C for any traceable analysis purposes. And the Argo system does not know the current accuracy of its individual sensors. Short of hauling them all back on board and recalibrating them, we do not know the current accuracy of the system. It might be as good as when it was designed; it very well might not be. Nobody really knows.
I have been responsible for calibrating the focal-plane sensors to NIST-traceable accuracies for the most modern commercial imaging satellites currently orbiting the Earth. I am quite familiar with the proper ways to account for error propagation through a system. I consider the ARGO system to be a waste of money.
I do not believe that anybody knows the true temperature of the oceans to better than a few degrees C.
It does not matter how many smart people worked on it: there is no calibration traceability after the floats are dropped in the water, so nobody knows what the true temperature is. They really, really want to believe that with lots of measurements the accuracy must be better, but it simply is not.
Sorry, but that is my well-informed opinion. Cheers, KevinK.

KevinK
May 5, 2015 7:57 pm

Just for fun I will expand on my previous comment. Again, Jan wrote:
"This is very naïve and it is of course wrong. Do you seriously mean that if we have 3599 independent measurements with accuracy +/- 0.1 C and one with accuracy +/- 10 C, the accuracy of the whole set is +/- 10 C?"
OK, here is an example of how accuracy works in the engineering world. I have a factory that manufactures engines for automobiles. These engines have many parts, all of which have to be accurate (with respect to their physical dimensions) to plus or minus one thousandth of an inch, or else the engine cannot be assembled or function (for very long).
So one approach would be to buy 5000 micrometers (a precision length/width/diameter measuring instrument). We calibrate them all once against one standard instrument. Then we send them out all over the factory floor – to the folks who set up the machining tools, to the quality-control people who check the parts, to the operators who run the machines. What happens? Well, some of these micrometers get dropped onto a concrete floor occasionally (nobody is perfect), which distorts them. Some get dust or dirt inside, which degrades their performance; some even get "adjusted" by the user because they believe they are "out of spec" or "acting funny". After a while, the accuracy of our original "fleet" of micrometers has drifted away from where it started. Heck, one of them even got run over by a forklift (just that one time, honest). The upshot is that there is one micrometer that has been dropped, driven over and manhandled so much that it is no longer sufficiently accurate.
IF that micrometer is off by ten thousandths of an inch, then none of the engine parts fit together properly anymore, and the whole factory grinds to a halt until somebody finds that micrometer and re-calibrates it.
Of course, nobody knows which micrometer it is, because there is no process to periodically recalibrate (i.e. re-establish the accuracy of) the micrometers.
This is why, in precision manufacturing environments, there is always a strict re-calibration process in place: EVERY measuring instrument gets re-calibrated against a NIST-traceable standard at least once a year.
I have had products ready to ship to a customer, but the quality-control folks would not "sign off" because one tool (out of hundreds) used to manufacture the product was "not in calibration". This is a desired outcome, so that a multimillion-dollar satellite does not get shipped to a customer with any doubts about the provenance of the entire manufacturing process.
The ARGO system exhibits NONE of the attributes of a modern manufacturing environment, and hence produces a substandard product that actual paying customers (forking over their own dollars willingly, without the force of government taxation) would never purchase.
The whole ARGO system was proposed by those who believe they understand the climate and can control it (given unlimited power over all the other folks currently residing on the Earth). They got the funding, and the system basically shows NO DISCERNIBLE CHANGE IN OCEAN TEMPERATURES beyond properly-accounted-for calibration errors. Now they want to dismiss the calibration issues and claim we must trust them when this system allegedly shows hundredths of a degree of warming. Yeah – and I saw a unicorn run across the road in front of me on the way to my office yesterday morning, really I did, honest.
Cheers, KevinK

Reply to  KevinK
May 5, 2015 8:30 pm

KevinK, your analogy to a factory does not hold. We are interested in the average ocean temperature, not the precise temperature in every cubic kilometer.
A scientific project works differently from a factory floor, but that does not mean the science is all wrong. It does not work that way, and it does not help to write your very wrongheaded assumptions in capital letters.
/Jan

KevinK
Reply to  Jan Kjetil Andersen
May 5, 2015 9:00 pm

Jan, sorry, but a factory floor is all about knowledge – knowing exactly what the statistical distribution of part sizes is, so that the parts will work together.
The average temperature of the ocean is a meaningless bit of information. In a proper energy analysis of a system, the important quantity is the total heat content, and this cannot be determined by measuring the average temperature alone. The temperature is the result of the total heat content and the thermal capacity of the material – a product of those two pieces of information. Without carefully measuring the thermal capacity of the oceans (dependent on pressure, temperature, salinity, etc., and never a uniform property over the vast spatial extent of the oceans) and the temperature where each specific thermal capacity applies, it is impossible to discern the heat content of the oceans.
After all the money spent on measuring the temperature of the oceans, we know nothing more than when we started. There may be more total energy, there may be less, or – most likely – the energy content goes up and down due to causes nobody really understands at this point in time.
We, all of us, do not know the current, past or future energy content of the oceans, and probably never will to better than a few percent, if that. Asserting that a scientific project knows the energy content of the oceans with the kind of accuracy associated with temperature differences of 0.1 degree C is pure, unadulterated hubris.
Again, sorry if my opinion of the current state of knowledge about the energy content of the oceans is offensive, but there is no evidence that anybody can defend their claimed knowledge to the accuracy commonly expected in the modern engineering world.
You just don’t know.
Cheers, KevinK

Reply to  KevinK
May 5, 2015 9:10 pm

Manufacturing an automobile engine is different from generating a temperature dataset. As KevinK explained, when parts have to fit and work together, one badly drifted measurement tool can effectively throw a wrench into the works. But for a temperature dataset, one thermometer out of 3600 going wrong by 10 degrees C does not wreck the whole dataset by 10 degrees C – not even if it is not known which of the 3600 thermometers went so badly wrong. A somewhat-global distribution of thermometers doesn't interact the way the parts of an automobile engine do.

Reply to  Donald L. Klipstein
May 6, 2015 3:17 pm

The problem with the terrestrial datasets is the persistent unidirectional tampering with large numbers of station records from all over the world. The satellite data are less subjective. It is excellent news that Terry Kealey of the U. of Buckingham is leading an investigation into the corruption of the terrestrial datasets.

Reply to  Donald L. Klipstein
May 6, 2015 3:20 pm

KevinK is of course correct about the uncertainty surrounding our ill-resolved attempts to measure ocean temperature, with each bathythermograph buoy covering some 200,000 cu. km of ocean. Even with some ingenious kriging to interpolate, it is not particularly likely that we are measuring the change in ocean temperature correctly. Nevertheless, the ARGO dataset is the least ill-resolved we have, and it shows warming of the ocean at a rate equivalent to only 0.2 K/century, which is scarcely alarming.