No significant warming for 17 years 4 months

By Christopher Monckton of Brenchley

As Anthony and others have pointed out, even the New York Times has at last been constrained to admit what Dr. Pachauri of the IPCC conceded some months ago: there has been no global warming statistically distinguishable from zero for getting on for two decades.

The NYT says the absence of warming arises because skeptics cherry-pick 1998, the year of the Great El Niño, as their starting point. However, as Anthony explained yesterday, the stasis goes back farther than that. He says we shall soon be approaching Dr. Ben Santer’s 17-year test: if there is no warming for 17 years, the models are wrong.

Usefully, the latest version of the Hadley Centre/Climatic Research Unit monthly global mean surface temperature anomaly series provides not only the anomalies themselves but also the 2σ uncertainties.

Superimposing the temperature curve and its least-squares linear-regression trend on the statistical insignificance region bounded by the means of the trends on these published uncertainties since January 1996 demonstrates that there has been no statistically-significant warming in 17 years 4 months:

[Figure: HadCRUT4 monthly global mean anomalies since January 1996, with the least-squares linear-regression trend superimposed on the 2σ statistical-insignificance region.]
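For readers who wish to reproduce the test, here is a minimal sketch in Python (the file name and column names are illustrative assumptions, not the official HadCRUT4 file layout):

```python
# Minimal sketch: OLS trend on HadCRUT4 monthly anomalies from January 1996,
# compared against a region bounded by the mean of the published 2-sigma
# uncertainties, as described in the text above.
import numpy as np
import pandas as pd

df = pd.read_csv("hadcrut4_monthly.csv")     # assumed columns: date, anomaly, lower2s, upper2s
df = df[df["date"] >= "1996-01"].reset_index(drop=True)

t = np.arange(len(df)) / 120.0               # time in decades
slope, intercept = np.polyfit(t, df["anomaly"], 1)
print(f"trend: {slope:+.3f} C/decade")

# Insignificance region: mean anomaly +/- mean half-width of the 2-sigma bounds.
half_width = (df["upper2s"] - df["lower2s"]).mean() / 2.0
centre = df["anomaly"].mean()
endpoint = intercept + slope * t[-1]         # rightmost point of the trend line
print("endpoint inside insignificance region:", abs(endpoint - centre) <= half_width)
```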

On Dr. Santer’s 17-year test, then, the models may have failed. A rethink is needed.

The fact that an apparent warming rate equivalent to almost 0.9 Cº/century is statistically insignificant may seem surprising at first sight, but there are two reasons for it. First, the published uncertainties are substantial: approximately 0.15 Cº either side of the central estimate.

Secondly, one weakness of linear regression is that it is unduly influenced by outliers. Visibly, the Great El Niño of 1998 is one such outlier.
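A toy illustration of that sensitivity (synthetic data, not the HadCRUT4 series): a single early spike tilts an ordinary least-squares trend even when the underlying series is flat.

```python
# Toy demonstration: one large positive outlier near the start of an
# otherwise flat, noisy series pulls the fitted OLS slope downward.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(0.0, 0.1, 200)            # flat series, 200 "months" of noise
t = np.arange(len(y)) / 120.0            # time in decades

y[25] += 0.6                             # an El Nino-like spike near the start
print("slope with outlier:    %+.4f C/decade" % np.polyfit(t, y, 1)[0])

y[25] -= 0.6                             # remove the spike
print("slope without outlier: %+.4f C/decade" % np.polyfit(t, y, 1)[0])
```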

If 1998 were the only outlier, and particularly if it were the largest, going back to 1996 would be much the same as cherry-picking 1998 itself as the start date.

However, the magnitude of the 1998 positive outlier is countervailed by that of the 1996/7 La Niña. Also, there is a still more substantial positive outlier in the shape of the 2007 El Niño, against which the La Niña of 2008 countervails.

In passing, note that the cooling from January 2007 to January 2008 is the fastest January-to-January cooling in the HadCRUT4 record going back to 1850.

Bearing these considerations in mind, going back to January 1996 is a fair test for statistical significance. And, as the graph shows, there has been no warming that we can statistically distinguish from zero throughout that period, for even the rightmost endpoint of the regression trend-line falls (albeit barely) within the region of statistical insignificance.

Be that as it may, one should beware of focusing the debate solely on how many years and months have passed without significant global warming. Another strong El Niño could – at least temporarily – bring the long period without warming to an end. If so, the cry-babies will screech that catastrophic global warming has resumed, the models were right all along, etc., etc.

It is better to focus on the ever-widening discrepancy between predicted and observed warming rates. The IPCC’s forthcoming Fifth Assessment Report backcasts the interval of 34 models’ global warming projections to 2005, since when the world should have been warming at a rate equivalent to 2.33 Cº/century. Instead, it has been cooling at a rate equivalent to a statistically-insignificant 0.87 Cº/century:

[Figure: 34 models’ projected warming from January 2005 (equivalent to 2.33 Cº/century) compared with the observed HadCRUT4 trend to April 2013 (equivalent to −0.87 Cº/century).]

The variance between prediction and observation over the 100 months from January 2005 to April 2013 is thus equivalent to 3.2 Cº/century.
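Spelled out, the arithmetic is simply the difference of the two century-equivalent rates:

```python
# Divergence between modelled and observed trends, January 2005 - April 2013.
predicted = 2.33    # models' warming rate, C/century equivalent
observed = -0.87    # observed (statistically insignificant) rate, C/century
print(f"divergence: {predicted - observed:.2f} C/century")   # -> 3.20
```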

The correlation coefficient is low, the period of record is short, and I have not yet obtained the monthly projected-anomaly data from the modelers to allow a proper p-value comparison.

Yet it is becoming difficult to suggest with a straight face that the models’ projections are healthily on track.

From now on, I propose to publish a monthly index of the variance between the IPCC’s predicted global warming and the thermometers’ measurements. That variance may well inexorably widen over time.

In any event, the index will limit the scope for false claims that the world continues to warm at an unprecedented and dangerous rate.

UPDATE: Lucia’s Blackboard has a detailed essay analyzing the recent trend, written by SteveF, using an improved index accounting for ENSO, volcanic aerosols, and solar cycles. He concludes the best-estimate rate of warming from 1997 to 2012 is less than 1/3 the rate of warming from 1979 to 1996. Also, the original version of this story incorrectly referred to the Washington Post, when it was actually the New York Times article by Justin Gillis. That reference has been corrected. – Anthony



Gail Combs
June 16, 2013 6:20 pm

Greg Mansion says: June 14, 2013 at 6:10 pm
….Besides, and this is something you must know very well, warmists do not say that “global warming” is something steady. They have always said that it is about an overall trend….
>>>>>>>>>>>>>>>>>>>>>
And as has been shown many times, the overall temperature trend of the latter half of the Holocene is COOLING!
GRAPH: GISP2 (Greenland) vs CO2
GRAPH: 10,000 yrs Vostok (present on left)
And just in case that ice core data doesn’t sink in you can ask other glaciers:

…The study went after a variety of sediments in the lake bed to determine the sediment that was depositing in the lake. By determining the different compositions in the sediment they could find how much glacial activity was taking place over the past 8,000 years.
Here is the official chart from the study itself….
Astute readers will notice the brief periods from 1,000 and 2,000 years ago that are commonly referred to as the Medieval and Roman Warming periods. Both are simply interludes of the expanding glacial activity that has steadily been taking place for the past 4,000 years….
This study is not an anomaly either. Any study of the Northern Hemisphere shows this exact overall behavior. The NH was warmer several thousand years ago, even though the CO2 level was lower. There has been a general cooling trend throughout the NH over the past 4,000 years. It is not steady by any means over a period of a few hundred years, but over the course of thousands of years it is very steady. This is simply one more study that shows the same thing.
The authors of the study simply state their findings in their abstract.

A new approach for reconstructing glacier variability based on lake sediments recording input from more than one glacier, Quaternary Research, Volume 77, Issue 1, January 2012, Pages 192–204 (http://www.sciencedirect.com/science/article/pii/S0033589411001256)
ABSTRACT:
We explore the possibility of building a continuous glacier reconstruction by analyzing the integrated sedimentary response of a large (440 km2) glacierized catchment in western Norway, as recorded in the downstream lake Nerfloen (N61°56′, E6°52′). A multi-proxy numerical analysis demonstrates that it is possible to distinguish a glacier component in the ~8000-yr-long record, based on distinct changes in grain size, geochemistry, and magnetic composition. Principal Component Analysis (PCA) reveals a strong common signal in the 15 investigated sedimentary parameters, with the first principal component explaining 77% of the total variability. This signal is interpreted to reflect glacier activity in the upstream catchment, an interpretation that is independently tested through a mineral magnetic provenance analysis of catchment samples. Minimum glacier input is indicated between 6700-5700 cal yr BP, probably reflecting a situation when most glaciers in the catchment had melted away, whereas the highest glacier activity is observed around 600 and 200 cal yr BP. During the local Neoglacial interval (~4200 cal yr BP until present), five individual periods of significantly reduced glacier extent are identified at ~3400, 3000-2700, 2100-2000, 1700-1500, and ~900 cal yr BP.

link

June 17, 2013 5:15 am

Gail. No doubt all readers who think that CO2 drives temperature will acknowledge that the GISP graph you linked “proves” that CO2 cools the earth and is an Ice House gas, not a Greenhouse gas.

Gail Combs
June 17, 2013 5:54 am

jai mitchell says: June 15, 2013 at 8:15 am
…… you are entitled to your own opinions, but not your own facts.
We have a very significant and credible record based on thousands of ice cores (recent 2,000 years) and hundreds of ice cores (earlier Holocene).
as well as plant stomata and tree ring growth as well as other ancillary indicators that
CO2 has not been anywhere near current atmospheric levels for almost 52 million years…..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
Another untruth.
The actual data from chemical testing for CO2: GRAPH 1. Note the tendency to select low values for the CO2 concentration in the 19th-century atmosphere despite values as high as 550 ppm and above.
A closer look at the cherry-picked results used by warmists from the above graph: GRAPH 2.
Again note the cherry-picking of values, as outlined by Mauna Loa Observatory:

4. In keeping with the requirement that CO2 in background air should be steady, we apply a general “outlier rejection” step, in which we fit a curve to the preliminary daily means for each day calculated from the hours surviving step 1 and 2, and not including times with upslope winds. All hourly averages that are further than two standard deviations, calculated for every day, away from the fitted curve (“outliers”) are rejected. This step is iterated until no more rejections occur.
How we measure background CO2 levels on Mauna Loa.
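For what it’s worth, the iterated rejection step described in that quotation can be sketched as follows (illustrative only; this is not the observatory’s actual code, and a simple polynomial stands in for their fitted curve):

```python
# Sketch of the quoted procedure: fit a curve to the surviving hourly values,
# reject anything more than two standard deviations from the fit, and iterate
# until no further rejections occur.
import numpy as np

def reject_outliers(hours, values, degree=2):
    keep = np.ones(len(values), dtype=bool)
    while True:
        fit = np.poly1d(np.polyfit(hours[keep], values[keep], degree))
        resid = values - fit(hours)
        sigma = resid[keep].std()
        new_keep = keep & (np.abs(resid) <= 2.0 * sigma)
        if np.array_equal(new_keep, keep):   # converged: no more rejections
            return keep
        keep = new_keep
```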



CO2 continuous hourly data, all values. GRAPH
CO2 continuous hourly data after selection process. GRAPH


EARLY STUDIES of CO2 in snow and ice started with a small glacier in Norway (Coachman et al 1956, 1958a, b), and were continued in Greenland and Antarctica (Table 1). In the first Antarctic study of Matsuo and Miyake (1966) an elegant method of 13C isotopic dilution was used for CO2 determinations. The precision of these determinations, with an analytical error of +/- 0.002%, was never matched in later studies, which reported errors usually ranging between +/- 0.2 and 3%.
TABLE 1

THE PERIOD OF HIGH CO2 READINGS
After 1980 most of the studies of CO2 in glaciers were carried out on Greenland and Antarctic ice by Swiss and French research groups; one core was studied in an Australian laboratory. A striking feature of the data published until about 1985 is the high concentrations of CO2 in air extracted from both pre-industrial and ancient ice, often much higher than in the contemporary atmosphere (Table 1).
Fig. 2. Concentration of CO2 in a 90-cm long section of a Camp Century (Greenland) ice core. The lower curve represents 15 min. “wet” extraction from melted ice and “dry” extraction; the upper curve 7 hours “wet” extraction. Redrawn after Stauffer et al (1981)
For example, in 11 samples of about 185-year-old ice from Dye 3 (Greenland) an average CO2 concentration of 660 ppm was measured in the air bubbles (using the “dry” extraction method), with a range of 290 – 2450 ppm (Stauffer et al 1985). In a deep ice core from Camp Century (Greenland), covering the last 40,000 years, Neftel et al (1982) found CO2 concentrations in the air bubbles ranging between 273 and 436 ppm (average 327 ppm). They also found that in an ice core of similar age from Byrd Station (Antarctica) these concentrations ranged between 257 and 417 ppm. Both these deep cores were heavily fractured and contaminated with drilling fluid. Neftel et al (1982) arbitrarily assumed that “the lowest CO2 values best represent the CO2 concentrations of the originally trapped air”.
Using the same dry extraction method, in the same segment of an ice core from a depth of 1616.21m in Dye 3 (Greenland), Neftel et al (1983) found a CO2 concentration of 773 ppm in the air bubbles. Two years later, Stauffer et al (1985) reported only about half of this concentration (410 ppm).
It appears from Table 1 that the change from high to low CO2 values reported for polar ice occurred in the middle of 1985….
THE PERIOD OF LOW CO2 READINGS
Since 1985, low concentrations, near a value of 290 ppm or below, started to dominate the records. They were interpreted as indicating “the CO2 increase during the last 150 years” and “overlapping or adjacent to results from direct measurements on Mauna Loa started in 1958” (Stauffer and Oeschger 1985)….
[See SOURCE for a lot more information including a rebuttal of Ferdinand Engelbeen’s criticism of Jaworowski.]

Statement of Prof. Zbigniew Jaworowski
Do glaciers tell a true atmospheric CO2 story? Z. Jaworowski, T. V. Segalstad & N. Ono, Science of the Total Environment, 1992, pp. 227-284
Dr. Zbigniew Jaworowski denied funding and fired
THE ACQUITTAL OF CARBON DIOXIDE by Jeffrey A. Glassman, PhD
ON WHY CO2 IS KNOWN NOT TO HAVE ACCUMULATED IN THE ATMOSPHERE & WHAT IS HAPPENING WITH CO2 IN THE MODERN ERA by Jeffrey A. Glassman, PhD
The Trouble With C12 C13 Ratios
Bombshell from Bristol: Is the airborne fraction of anthropogenic CO2 emissions increasing? – study says “no” University of Bristol Press release issued 9 November 2009
The whole hoax is based on the ASSumption that CO2 is uniformly mixed in the atmosphere and then cherry picking the desired results from the shotgun scatter of real life data.
The Japanese satellite (JAXA) shows CO2 is not ‘well-mixed’. map 1 and map 2

Gary Hladik
June 17, 2013 1:42 pm

Gail Combs says (June 17, 2013 at 5:54 am): “Again note the cherry picking of values as outlined by Mauna Loa Obs.”
Cherry picking or good observational science? Measuring the background level of something so easily affected by local sources is no easy task. CO2 is also measured at the South Pole (and elsewhere). Even in such a “pristine” location, precautions must be taken. On the linked page, watch the air sampling video, then compare the two graphs of air samples taken downwind and upwind of the station. I’ve read that background CO2 measurements from around the world are quite comparable, so either there’s a lot of cherry-picking going on, or CO2 measurement is one of the more reliable aspects of climate science.
“The whole hoax is based on the ASSumption that CO2 is uniformly mixed in the atmosphere…”
1) The assumption is actually that CO2 is “well-mixed”, which isn’t the same as “uniformly mixed”.
2) I don’t believe the concept of CAGW or even AGW requires that CO2 be “well-mixed”, though in that case the GCMs might need to take into account persistent geographic variations in so-called “greenhouse gases”.
3) If I understand the AGW concept correctly, the critical area is the upper atmosphere, i.e. the “effective radiating level” or ERL. Anybody know if CO2 is “weller-mixed” in the upper atmosphere than at ground level?
“The Japanese satellite (JAXA) shows CO2 is not ‘well-mixed’. map 1 and map 2”
That depends on what the meaning of “well-mixed” is. 🙂 Even on these maps the CO2 levels vary less than 20 ppm (by eyeball) over the (limited) coverage areas, or roughly 5%. Considering the non-uniform CO2 sources and sinks, that’s “well-mixed” to me. Check out a larger set of maps. Note the variation of CO2 on time scales as short as a month. Again, it looks like the CO2 is getting stirred around pretty well.
BTW, two observations on these maps:
1) I understand the satellite measures the entire column of atmospheric CO2, so even if CO2 is more uniform in the upper atmosphere, the reading would be skewed by the levels closer to the surface; and vice-versa, although less so because the absolute amount of CO2 decreases with altitude.
2) Comparing the same month year over year, the maps get redder. Near Hawaii in April 2010 I see an orange square, consistent with the 392.52 ppm CO2 reading from Mauna Loa. In April 2013 there’s a reddish square over Hawaii, consistent with a Mauna Loa measurement of 398.40 ppm. The scale has to be exaggerated to make such a small change noticeable, but as a result a 1.5% change seems to set the maps on fire. 🙂
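That 1.5% figure is easy to verify:

```python
# Percentage change between the two April Mauna Loa readings quoted above.
april_2010, april_2013 = 392.52, 398.40
print(f"{100 * (april_2013 - april_2010) / april_2010:.1f}% change")   # -> 1.5%
```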
“The Trouble With C12 C13 Ratios”
I visited that link before and found Chiefio’s musings quite intriguing. Likewise the carbon isotope section of Murry Salby’s talk covered at WUWT. I’m not sure if the satellite data support or contradict Salby and Chiefio.

Bart
June 17, 2013 3:54 pm

rgbatduke says:
June 13, 2013 at 7:20 am
“You might as well use a ouija board as the basis of claims about the future climate history as the ensemble average of different computational physical models that do not differ by truly random variations and are subject to all sorts of omitted variable, selected variable, implementation, and initialization bias.”
That is extremely appropriate on a particular level. The Ouija board is, of course, a bunch of nonsense. The smart kids realize at some point that the way to make it work is to slowly guide it to the answer they want, all the while protesting that they didn’t do anything and that the answer was provided by the spirits (or, in the case of the climate, “the science”).

rgbatduke
June 18, 2013 7:51 am

1) I understand the satellite measures the entire column of atmospheric CO2, so even if CO2 is more uniform in the upper atmosphere, the reading would be skewed by the levels closer to the surface; and vice-versa, although less so because the absolute amount of CO2 decreases with altitude.
Did you notice the other oddity? There is no possible way that the Japanese satellite measurement OF the entire air column supports global CO_2 on the edge of 400 ppm. Eyeballing the graphs the means should be what, 380 ppm, and as you note, it should be skewed high compared to Mauna Loa, not low.
I’m thus troubled by what appears to be a 5 or 6% error in normalization. That’s actually rather a lot. Mauna Loa is what it is; whether or not you approve of its data-reduction methodology, at least it has been consistently applied for a long time, so its own value is apples-to-apples. But it has been used as a sort of gold standard of global atmospheric CO_2, and I’d rather bet that its readings are used as primary input into the GCMs.
The satellite data suggests many things. First, that the Mauna Loa readings are a lousy proxy for global CO_2 levels. Second, that they are far too high — as you note, if CO_2 concentration on average increases with depth (which makes moderate sense and is consistent with it not being perfectly homogeneous over the globe as it is) then the mountaintop reading should be lower by some percentage than the mean of the entire air column, not higher. Given that surface readings are frequently in the 400 to 500 ppm range (depending on where you look), Mauna Loa could be off by 10% or 20% on the high side compared to the true top-of-the-troposphere average CO_2 concentration. Since that is where the atmosphere nominally becomes transparent to LWIR emitted from the CO_2, because the atmosphere itself has thinned to the point where it is no longer optically opaque, this suggests that the emission temperature being used in the models is derived from a point too high in the DALR (and hence too cold) — thereby exaggerating warming.
If it is “only” a 5% effect I’d worry, but perhaps it isn’t “important” (bearing in mind that the entire post-LIA warming in degrees Kelvin is of order 0.5%, so small numbers cannot safely be ignored when trying to explain the 0.1-0.2% that might be attributable to anthropogenic causes). If Mauna Loa is off by 10% or more for any reason whatsoever, that cannot possibly be ignorable. For example, if top-of-the-troposphere global mean CO_2 were really 370 ppm and not 400 ppm, that’s a huge difference.
One wonders if there are any reliable controls. The other possibility is of course that the satellite isn’t correctly normalized. One wonders if contemporaneous soundings support one or the other. That actually might be a large enough error to kick the GCM spaghetti back down into agreement with nature all by itself, although systematically correcting them back in time is going to be very difficult. One also wonders why Mauna Loa produces so very different a number (just as one wonders why LTT is diverging from land surface temperature records, or was until they stopped adjusting it because they had to after one final push of all the older temperatures down).
rgb

beng
June 18, 2013 8:30 am

I agree, rgb. Look at an IR spectrum emitted from the earth, and the CO2 region is at ~-55C — around 40,000 – 50,000 ft (?). So that’s the relevant level/concentration. But I’d assume someone’s been monitoring that. (?)

beng
June 18, 2013 8:44 am

And rgb, spectra of IR from the poles show CO2 emitting instead of absorbing. This goes w/Chiefio’s idea that the TOA (defined as no convection) at the poles during an inversion is actually at the surface, and CO2 acting as a coolant instead of an insulator. So the standard theory of ~1C warming from CO2 doubling might not be so solid, IMO.

Phil.
June 18, 2013 10:41 am

rgbatduke says:
June 18, 2013 at 7:51 am
1) I understand the satellite measures the entire column of atmospheric CO2, so even if CO2 is more uniform in the upper atmosphere, the reading would be skewed by the levels closer to the surface; and vice-versa, although less so because the absolute amount of CO2 decreases with altitude.
Did you notice the other oddity? There is no possible way that the Japanese satellite measurement OF the entire air column supports global CO_2 on the edge of 400 ppm. Eyeballing the graphs the means should be what, 380 ppm, and as you note, it should be skewed high compared to Mauna Loa, not low.

The data I’m looking at for April is consistent with 400ppm:
https://data.gosat.nies.go.jp/GosatBrowseImage/browseImage/fts_l2_swir_co2_gallery_en_image.html?image=46

Gary Hladik
June 18, 2013 11:58 am

rgbatduke says (June 18, 2013 at 7:51 am): “Did you notice the other oddity? There is no possible way that the Japanese satellite measurement OF the entire air column supports global CO_2 on the edge of 400 ppm. Eyeballing the graphs the means should be what, 380 ppm, and as you note, it should be skewed high compared to Mauna Loa, not low.”
Well, I’m not so sure. In the 2013/04 map, for example, Hawaii doesn’t seem out of line with the rest of the northern hemisphere, and in fact several places are even redder. The maps seem to cover most CO2 sources, but a lot of space is blank. Plus, as you mention later in your comment, calibration/sensitivity/reliability of this new satellite tool is unknown. And this says CO2 at the South Pole is within 6 ppm or less of Mauna Loa. Most or even all of the difference is explainable as a gradient from the major CO2 sources to the north.
BTW, the GOSAT site also has a neat animation of global CO2 distribution. It’s a simulation, so take with a grain of salt, but it smooths out the monthly maps nicely.
Last night I re-watched the “Seasonal Forests” episode of the BBC’s “Planet Earth”. It mentioned that the vast northern forests are a major source of atmospheric oxygen. In the GOSAT simulation, Canada and Siberia turn dark blue (low CO2) in late summer, indicating massive photosynthesis and oxygen production. Watching the simulation unfold, I felt the same thrill up my leg that Chris Matthews gets listening to Obama. 🙂

Lars P.
June 18, 2013 12:42 pm

Phil. says:
June 18, 2013 at 10:41 am
The data I’m looking at for April is consistent with 400ppm:
Phil, if you look at the picture you posted you see between 390 and 400 in the Northern Hemisphere, with maybe a few darker spots south of Japan, and around 390 and some lower in the Southern Hemisphere.
To my eyes, the average of that is in no case 400.

Phil.
June 18, 2013 2:52 pm

Lars P. says:
June 18, 2013 at 12:42 pm
Phil. says:
June 18, 2013 at 10:41 am
“The data I’m looking at for April is consistent with 400ppm:”
Phil, if you look at the picture you posted you see between 390 and 400 in the Northern Hemisphere, with maybe a few darker spots south of Japan, and around 390 and some lower in the Southern Hemisphere.
To my eyes, the average of that is in no case 400.

Well Lars I opened the map in Photoshop and examined the data point next to Hawaii using the colorsync utility and it came out at the same rgb value as 400ppm on the scale bar!
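That kind of pixel check can also be scripted rather than done by hand in Photoshop; a sketch (the file name and pixel coordinates below are made up for illustration):

```python
# Sketch: sample a map pixel and report the scale-bar entry whose colour is
# closest to it. Coordinates and file name are illustrative placeholders.
from PIL import Image
import numpy as np

img = np.asarray(Image.open("gosat_2013_04.png").convert("RGB"), dtype=float)

sample = img[412, 105]          # (row, col) of the data point next to Hawaii
scale = {                       # pixels sampled along the scale bar, keyed by ppm
    390: img[560, 30],
    395: img[560, 60],
    400: img[560, 90],
}
nearest = min(scale, key=lambda ppm: np.linalg.norm(sample - scale[ppm]))
print(f"nearest scale-bar value: {nearest} ppm")
```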

Eliza
June 18, 2013 6:22 pm

From the above and other postings by Nick Stokes and Mr. T.O.O there is no doubt in my mind that they are paid trolls who receive a salary to come to the skeptical sites. The TEAM is very concerned about the influence of climate-skeptic sites and the effect it is having on public opinion. My two cents’ worth.

Phil.
June 18, 2013 9:37 pm

Superimposing the temperature curve and its least-squares linear-regression trend on the statistical insignificance region bounded by the means of the trends on these published uncertainties since January 1996 demonstrates that there has been no statistically-significant warming in 17 years 4 months:
Some Moncktonesque statistical treatment here, what exactly does “bounded by the means of the trends on these published uncertainties” mean in relation to statistical significance? Using the same data I obtained a trend of 0.089±0.118 ºC/decade which indicates statistically significant warming at the 85% level. To say that there has been “no statistically-significant warming in 17 years 4 months” as Monckton does is meaningless.
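For reference, the kind of calculation Phil describes can be sketched as below (illustrative file name; note that a naive OLS of this sort ignores autocorrelation in the residuals and therefore overstates significance, so a properly corrected interval, such as the ±0.118 quoted, is wider):

```python
# Sketch: OLS trend on monthly anomalies with its standard error, and the
# two-sided confidence level at which a zero trend would be rejected.
import numpy as np
from scipy import stats

anoms = np.loadtxt("hadcrut4_1996_2013.txt")   # monthly anomalies, Jan 1996 - Apr 2013
t = np.arange(len(anoms)) / 120.0              # time in decades

res = stats.linregress(t, anoms)
print(f"trend: {res.slope:+.3f} +/- {1.96 * res.stderr:.3f} C/decade (95% CI)")

# Confidence level at which the zero-trend null hypothesis is rejected.
p = 2 * stats.t.sf(abs(res.slope / res.stderr), df=len(anoms) - 2)
print(f"zero trend rejected at the {100 * (1 - p):.0f}% level")
```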

rgbatduke
June 19, 2013 7:01 am

The data I’m looking at for April is consistent with 400ppm:
https://data.gosat.nies.go.jp/GosatBrowseImage/browseImage/fts_l2_swir_co2_gallery_en_image.html?image=46

Fair enough, but one wonders why August of 2012 was then so very different from April 2013. Did they renormalize their detectors to get agreement?
As for the comment about polar cooling and TOA descending to ground level, this is something that has occurred to me as well. The DALR from ground level to the tropopause is determined, among other things, by the height where GHGs cease to be optically opaque and lose their heat content to space. Above the tropopause the stratosphere begins, which is hotter than the top of the troposphere, in part because the ordinary atmosphere no longer has any radiative cooling channel. In polar inversions with very dry air (and hence very little GHE), one essentially drops the stratosphere/tropopause down to the ground — I recall there is a spectrograph pair in Perry that illustrates this case.
This makes me less sure about just what the temperature profile would be in a mythical planet covered in just oxygen-nitrogen-argon, no H2O or CO2 or ozone or GHGs per se, or better yet, a planet with a Helium atmosphere. The ground would radiatively heat and cool unopposed, like the moon, so from one point of view one would expect it to get very hot in the day, very cold at night, not unlike the low-humidity desert does now.
But not exactly like the desert, because the DALR doesn’t break down over the desert and in this mythical planet the tropopause would basically be at ground level, and temperatures would ascend from the ground up, just as the stratosphere warms to the thermosphere only far overhead. Then my imagination breaks down. Such an atmosphere would have basically no cooling mechanism overhead until it gets hot enough to activate SWIR bands in the non-GHG atmosphere, say thousands of degrees. It might well be that it cools by contact with the ground at nighttime because the ground is an efficient radiator where the atmosphere is not. The atmosphere would, however, be densest and coolest at the bottom, and cooling there would not generate surface convection.
However there would be differential heating and cooling between the equator and the poles, and differential heating and cooling from daytime to nighttime, so there would be some general circulation — cold air from the poles being pushed south, uplifting as it heats, displacing hotter upper air poleward, and forcing it down to contact the colder surface there, which might well still make such an atmosphere net heating. IIRC, one of the moons — Triton? — has such an atmosphere and a moderately uniform surface and might serve as a sort of laboratory for this kind of model, although its density and mean temperature are of course way off for applicability to Earth.
One of our many problems is that we just don’t have enough, simple enough, planets to study to get a good feel for planetary climatology. It is so easy to make some simplistic assertion that captures one “linearized” cause-effect relationship and then fail to realize that in the real world four other nonlinear relationships are coupled in such a way that the hypothesized linearized relationship is not correct, it is at best valid in the neighborhood of “now” in a single projective direction of a complex, curved, surface, the projection of the first term in a Taylor series expansion of a solution to a multivariate nonlinear partial differential equation. So stating that CO_2 is “cooling at the poles” might better be stated as “sometimes, when conditions are just right, CO_2 can contribute to net cooling at the poles”. But this is probably a smaller effect globally than the statement that “sometimes, when the conditions are just right, water vapor is strongly net cooling; other times when the conditions are just right, water vapor is net warming; the conditions that regulate which it is might well depend on aerosol levels, soot/particulate levels, solar magnetic state, geomagnetic state, time of year, macroscopic state (e.g. water content) of the stratosphere, the phase of the decadal oscillations, the phase of the moon, and who won the superbowl”. And since water vapor is by far the dominant GHG, getting the global climate answer approximately correct depends on getting it right first, and only then worrying about what happens with CO_2.
rgb

rgbatduke
June 19, 2013 8:07 am

Some Moncktonesque statistical treatment here, what exactly does “bounded by the means of the trends on these published uncertainties” mean in relation to statistical significance? Using the same data I obtained a trend of 0.089±0.118 ºC/decade which indicates statistically significant warming at the 85% level. To say that there has been “no statistically-significant warming in 17 years 4 months” as Monckton does is meaningless.
Not wanting to speak for Mr. Monckton, of course, but I suspect he is referring to R^2 for the linear fit, usually interpreted as a measure of the significance of the fit compared to the null hypothesis of no relationship. I also think he is using “accepted” values for the conclusion, which a lot of people have grudgingly been coming to accept in the climate community.
At the same time, I completely agree with you. There is nothing special or magical about 17 years, or 16 years, or 13 years. Or rather, there might well be, but we don’t know what it is because we do not actually know the relevant timescales for variation of the climate as opposed to the weather. The climate as reflected in some sort of average global temperature derived from either the thermometric record or proxies has never been constant, never been linear, never had anything like a single time constant or frequency that could be associated with anything but a Taylor series/power series fit to some sufficiently small chord or Fourier transform ditto. Well, I take that back — there is a pretty clear Fourier signal associated with Milankovitch processes over the last 3.5 million years, only the period changes without warning three times over that interval (most recently to roughly 100 ky) and we don’t know why.
The much better way to assert precisely the point Monckton makes above is by asserting no point at all — simply presenting the actual e.g. LTT and SST and LST records over the last 34 years where LTT, at least, is consistently and accurately measured, SST is increasingly precisely measured, and sadly, LST estimates are comparatively dubious. Over that entire interval, LTT (as arguably the best measure of actual warming for a variety of reasons) suggests a non-catastrophic warming rate on the order of 0.1 C/decade but with (obviously!) large error bars. SSTs lead to similar conclusions. Measurements of SLR (any mix of tide gauge data and satellite) lead to a similar conclusion.
Deconstructing the causes of the warming, decomposing it into (say) a flat fit pre-1997 and a second flat fit post-1999 (which reveals that most of the warming occurred in a single discrete event associated with the super-El Niño/La Niña pair in between as far as we can tell from the data) or a linear fit, or an exponential fit, or throwing new Fourier, linear, or otherwise functional components in coincidence with decadal oscillations, the solar cycle, the level of solar activity, CO_2 concentrations, stratospheric water vapor, stratospheric ozone, or the density of kitchen sinks (per household) in Zimbabwe is in some sense all complex numerology, climatological astrology. The temperature record is what it is, not what we would fit it to be, not the stories we make up to explain it because we damn sure cannot compute it!
That’s the real mistake Monckton made — he presented an aggregate view of the GCMs, because that is what the IPCC did in its notorious AR4 summary for policy makers, which contains an absolutely horrendous abuse of statistics by using the averages and standard deviations of model results over many completely different GCMs to make quantitative assertions of the probabilities of various warming scenarios as if “structure of a particular climate GCM” is an independent, identically distributed variable and reality is somehow bound to the mean behavior averaged over this variable by the central limit theorem which is sheer madness. This isn’t Monckton’s error; it is a standard error made by the IPCC (an error where incompetence in statistical analysis is beautifully married to political purpose to the detriment of scientific reasoning), but he perhaps should have avoided perpetuating it (as Nick Stokes rather overvehemently insisted above) and just presented the spaghetti snarl of actual GCM model results themselves, as they say precisely the same thing, only better, without the irrelevant and incorrectly computed probabilities.
Madness and incorrect computation that is, of course, perpetuated and reflected in your citing 0.089±0.118 ºC/decade which indicates statistically significant warming at the 85% level. Let’s restate this in Bayesian language. If there is no bias in the process associated with generating the points being fit (an assumption we can frame as a prior probability of there being occult bias), if the underlying climate is a linear function (an assumption we can frame as the prior probability of the climate behaving linearly over some interval of time, an assumption we can actually make at least semi-quantitative if one can believe the proxy record over the Holocene and is wary of the fact that much of that proxy is intrinsically coarse-grain averaged over intervals longer than 33 years), if the error bar you obtain from a linear fit (presumably from the distribution of Pearson’s χ²) over a remarkably short interval where we do not know the timescales of the relevant noise compared to the linear trend is relevant (again, one can try to guesstimate the probable timescales of noise compared to the fit interval, but the curve itself strongly suggests that the two are comparable as it decomposes into two distinct and quite good fits joined at the ENSO in the middle), and if there is nothing else going on that we, in our ignorance, should be correcting for, then your linear fit yields warming with a considerably wider variance than you are allowing for — none of the Bayesian probabilities above are optimal for the linear fit to be precisely meaningful, and the uncertainties they introduce all broaden the uncertainty of the result or worse, reflect the probability that the linear fit itself is nonsense and cannot be extrapolated with any real confidence at all.
I strongly suggest that you read the Koutsoyiannis paper on hydrology that has as its first graphic a function plus noise at a succession of timescales (in fact, Anthony, this graph should be a front-page feature on WUWT somewhere, IMO, as a permanent criticism of the plague of linearizing a function we know is nontrivially nonlinear). On a short enough timescale it appears linear. Then it appears exponential. Then it appears sinusoidal. But is that really its behavior? Or is the sinusoidal merely systematic noise on a longer-term linear behavior? Note that no possible statistical analysis on the original fit interval can reveal the truer longer-time behavior, and at no time can one separate out the unknown longer-time behavior from the fit.
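A quick numerical illustration of the same point (synthetic data): the fitted “trend” depends entirely on the window, and no window reveals the generating law.

```python
# Synthetic series: a slow 60-year oscillation plus noise. Linear fits over
# different windows give different "trends", none of which is the true law.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 100.0, 1200)                    # 100 "years", monthly steps
y = 0.3 * np.sin(2 * np.pi * t / 60.0) + rng.normal(0.0, 0.1, t.size)

for years in (10, 30, 100):
    n = 12 * years                                   # points in the window
    slope = np.polyfit(t[-n:], y[-n:], 1)[0]
    print(f"last {years:3d} years: fitted trend {slope:+.4f} per year")
```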
In the meantime, here is a statement that perhaps everybody — even Nick Stokes and in other venues Joel Shore — can agree with. The last 15 to 17 years of the climate record — depending on which side of the discrete “event” of the 1997/1998 ENSO you wish to start on — are not strong evidence supporting the hypothesis of catastrophic anthropogenic whatever, global warming, climate change. So much so that the warmist community stopped using the phrase global warming over this interval, substituting the non-falsifiable term “climate change” instead because any weather event can safely be attributed to human activity and who can prove you wrong other than by asserting statistical arguments so abstruse that any member of the lay population will fall asleep long before you finish them, where the catastrophe itself is always immediate and exciting and real to them. Every experience is a peak experience if you are demented and forget past experiences, after all.
Note that I carefully avoid stating that the data “falsifies” the assertion of CAGW, AGW, or any particular “scenario” put forth in AR4’s summary for policy makers. To make such an assertion I would have to have prior knowledge that not only I lack, but everybody lacks. The issue of e.g. UHI bias and cherrypicking in the contemporary land surface temperature record is an open question at this point, with evidence supporting both sides. Who really knows if there is a bias, what the sign of the bias is, what the magnitude of the bias is? Even given the best possible intentions and totally honest scientists constructing it, totally honest scientists have biases and often cannot help incorporating them into their database — an assertion I make with considerable empirical evidence derived from meta-studies in e.g. medical science.
There is considerable unacknowledged uncertainty in climate science — the egregious treatment of various Bayesian priors as equal to unity or zero (in a way that reflects one’s prejudices on the issue) in order to avoid diluting the purity of one’s conclusions with the inconvenient truth of uncertainty. But quite independent of Bayes, it is entirely safe to say that an interval of essentially flat temperatures does not support the assertion of aggressive, catastrophic, global warming. Indeed, 13 years (starting MOST generously in the year 2000) is 1/8th of a century. If 1/8th of the twenty first century has been climate neutral in spite of a significant increase in CO_2 in that time and over an interval in which the GCMs unanimously call for aggressive warming, one would have to lack simple common sense to assert that this is evidence for the correctness of the GCMs and the likelihood of a catastrophic warming increasingly confined to the 7/8 of the century remaining.
rgb

Gary Hladik
June 19, 2013 8:22 am

rgbatduke says (June 19, 2013 at 7:01 am): “This makes me less sure about just what the temperature profile would be in a mythical planet covered in just oyxgen-nitrogen-argon, no H2O or CO2 or ozone or GHGs per se, or better yet, a planet with a Helium atmosphere.”
In case you haven’t seen it, Dr. Spencer discusses a no-GHG Earth here.
If WUWT commenter “Konrad” is reading this thread he may add a somewhat different view.

Gary Hladik
June 19, 2013 8:53 am

rgbatduke says (June 19, 2013 at 8:07 am): “Let’s restate this in Bayesian language.”
I gather from what follows the above that the “Bayesian” language must be spoken in very long and very complex sentences. 🙂
No worries, though. I ran it through Google Translate and after about 15 minutes of processing it spit out “It’s not statistically significant, Phil.”. 🙂
One final thought: We must at all costs keep RGB away from the cryptic Steve Mosher, lest their mutual annihilation destroy the entire planet. 🙂

Phil.
June 19, 2013 9:11 am

rgbatduke says:
June 19, 2013 at 7:01 am
The data I’m looking at for April is consistent with 400ppm:
https://data.gosat.nies.go.jp/GosatBrowseImage/browseImage/fts_l2_swir_co2_gallery_en_image.html?image=46
Fair enough, but one wonders why August of 2012 was then so very different from April 2013. Did they renormalize their detectors to get agreement?

I didn’t look at last year but I’d expect it to have been ~6ppm lower based on ML data.
As for the comment about polar cooling and TOA descending to ground level, this is something that has occurred to me as well. The DALR from ground level to the tropopause is determined, among other things, by the height where GHGs cease to be optically opaque and lose their heat content to space. Above the tropopause the stratosphere begins, which is hotter than the top of the troposphere, in part because the ordinary atmosphere no longer has any radiative cooling channel. In polar inversions with very dry air (and hence very little GHE), one essentially drops the stratosphere/tropopause down to the ground — I recall there is a spectrograph pair in Perry that illustrates this case.
I don’t think this is correct, see the following for example:
http://tinyurl.com/l33n3cv

Phil.
June 19, 2013 9:46 am

rgbatduke says:
June 19, 2013 at 8:07 am
“Some Moncktonesque statistical treatment here, what exactly does “bounded by the means of the trends on these published uncertainties” mean in relation to statistical significance? Using the same data I obtained a trend of 0.089±0.118 ºC/decade which indicates statistically significant warming at the 85% level. To say that there has been “no statistically-significant warming in 17 years 4 months” as Monckton does is meaningless.”
Not wanting to speak for Mr. Monckton, of course, but I suspect he is referring to R^2 for the linear fit, usually interpreted as a measure of the significance of the fit compared to the null hypothesis of no relationship. I also think he is using “accepted” values for the conclusion, which a lot of people have grudgingly been coming to accept in the climate community.

You might think that but the values shown on his graph don’t correspond with that, hence my question.
The much better way to assert precisely the point Monckton makes above is by asserting no point at all — simply presenting the actual e.g. LTT and SST and LST records over the last 34 years……
Certainly, but he doesn’t do that!
That’s the real mistake Monckton made — he presented an aggregate view of the GCMs, because that is what the IPCC did in its notorious AR4 summary for policy makers, which contains an absolutely horrendous abuse of statistics by using the averages and standard deviations of model results over many completely different GCMs to make quantitative assertions of the probabilities of various warming scenarios as if “structure of a particular climate GCM” is an independent, identically distributed variable and reality is somehow bound to the mean behavior averaged over this variable by the central limit theorem which is sheer madness.
Agreed.
Madness and incorrect computation that is, of course, perpetuated and reflected in your citing 0.089±0.118 ºC/decade which indicates statistically significant warming at the 85% level…..
Which has nothing to do with the GCMs, it’s the corrected version of Monckton’s statistical analysis. Your argument re Bayesian statistics shows that Monckton’s attempt to show that there is no significant trend in the data is invalid.
So much so that the warmist community stopped using the phrase global warming over this interval, substituting the non-falsifiable term “climate change” instead because any weather event can safely be attributed to human activity and who can prove you wrong other than by asserting statistical arguments so abstruse that any member of the lay population will fall asleep long before you finish them,
An often repeated canard, since the term ‘climate change’ was already in use when the IPCC was founded in 1988!

Phil.
June 19, 2013 9:49 am

Gary Hladik says:
June 19, 2013 at 8:53 am
rgbatduke says (June 19, 2013 at 8:07 am): “Let’s restate this in Bayesian language.”
I gather from what follows the above that the “Bayesian” language must be spoken in very long and very complex sentences. 🙂
No worries, though. I ran it through Google Translate and after about 15 minutes of processing it spit out “It’s not statistically significant, Phil.”. 🙂

Check out your translator, it should have said “RGB thinks that the method used by Monckton isn’t capable of determining the significance of the trend”.

Tim Clark
June 19, 2013 12:40 pm

{ rgbatduke says:
June 19, 2013 at 8:07 am }
I’ll bet you’re a hoot at a party!
;<)

Lars P.
June 19, 2013 12:44 pm

Phil. says:
June 18, 2013 at 2:52 pm
Well Lars I opened the map in Photoshop and examined the data point next to Hawaii using the colorsync utility and it came out at the same rgb value as 400ppm on the scale bar!
Phil, wonderful, you spotted one spot that looks like 400. To have 400 on average one should see roughly as many readings at 410 as at 390.
Also remember that you are looking at the whole column of CO2, which makes me wonder, as the records vary more:
http://m4gw.com/the_photosynthesis_effect/

Phil.
June 19, 2013 1:43 pm

Lars P. says:
June 19, 2013 at 12:44 pm
Phil. says:
June 18, 2013 at 2:52 pm
“Well Lars I opened the map in Photoshop and examined the data point next to Hawaii using the colorsync utility and it came out at the same rgb value as 400ppm on the scale bar!”
Phil, wonderful, you spotted one spot that looks like 400. To have 400 on average one should see roughly as many readings at 410 as at 390.

RGB was talking about ML CO2 readings so I picked the closest, which was 400; there were many others with the same value and plenty which were greater!
Also remember that you are looking at the whole column of CO2, which makes me wonder, as the records vary more:
http://m4gw.com/the_photosynthesis_effect/

There will be variation near the surface due to the presence of sources and sinks, but higher in the atmosphere it will be fairly constant up to the tropopause. Near growing crops there will be strong diurnal variation from about 300 to 450 ppm; bear in mind that the GOSAT data are a monthly average.

Gary Hladik
June 19, 2013 1:55 pm

Phil. says (June 19, 2013 at 9:11 am): “I didn’t look at last year but I’d expect it to have been ~6ppm lower based on ML data.”
April 2012: 396.18 ppm
April 2013: 398.4 ppm
http://co2now.org/images/stories/data/co2-mlo-monthly-noaa-esrl.pdf
