Note: Dr. Judith Curry also has an essay on this important paper. She writes:
My mind has been blown by a new paper just published in Nature.
Just when I least expected it. After a busy day, I took a few minutes to respond to a query from a journalist about a new paper just published in Nature [link to abstract]:
This has important implications for IPCC’s upcoming AR5 report, where they will attempt to give attribution to the warming, which now looks more and more like a natural cycle. See updates below. – Anthony
================================================================
Guest essay by Bob Tisdale
The recently published climate model-based paper Recent global-warming hiatus tied to equatorial Pacific surface cooling [Paywalled] by Yu Kosaka and Shang-Ping Xie has gained a lot of attention around the blogosphere. Like Meehl et al (2012) and Meehl et al (2013), Kosaka and Xie blame the warming stoppage on the recent domination of La Niña events. The last two sentences of Kosaka and Xie (2013) read:
Our results show that the current hiatus is part of natural climate variability, tied specifically to a La-Niña-like decadal cooling. Although similar decadal hiatus events may occur in the future, the multi-decadal warming trend is very likely to continue with greenhouse gas increase.
Anyone with a little common sense who’s reading the abstract, the hype around the blogosphere, and the Meehl et al papers will logically now be asking: if La Niña events can stop global warming, then how much do El Niño events contribute? 50%? The climate science community is actually hurting itself when it fails to answer the obvious questions.
And what about the Atlantic Multidecadal Oscillation (AMO)? What happens to global surface temperatures when the AMO also peaks and no longer contributes to the warming?
The climate science community skirts the common-sense questions, so no one takes them seriously.
UPDATE
Another two comments:
Kosaka and Xie (2013) appear to believe the correlation between their model and observed temperatures adds to the credibility of their findings. They write in the abstract:
Although the surface temperature prescription is limited to only 8.2% of the global surface, our model reproduces the annual-mean global temperature remarkably well with correlation coefficient r = 0.97 for 1970–2012 (which includes the current hiatus and a period of accelerated global warming).
Kosaka and Xie (2013) used the observed sea surface temperatures of the central and eastern equatorial Pacific as an input to their climate model. By doing so they captured the actual El Niño-Southern Oscillation (ENSO) signal. ENSO is the dominant mode of natural variability on the planet. In layman terms, El Niño and La Niña events are responsible for the year-to-year wiggles. It’s therefore not surprising that when they added the source of the wiggles, the models included the wiggles, which raised the correlation coefficient.
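To see why prescribing the observed wiggles inflates r, consider a toy example. All series below are synthetic and illustrative; none of these numbers come from Kosaka and Xie:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(43)                  # 43 annual values, standing in for 1970-2012
trend = 0.015 * t                  # smooth warming signal (C)
wiggles = 0.1 * np.sin(0.9 * t) + 0.05 * rng.standard_normal(43)  # ENSO-like noise

observed = trend + wiggles          # "observations" = trend plus wiggles
model_forced = trend                # model with forcings only
model_prescribed = trend + wiggles  # model with the observed wiggles prescribed

r_forced = np.corrcoef(observed, model_forced)[0, 1]
r_prescribed = np.corrcoef(observed, model_prescribed)[0, 1]
print(f"r (forcings only) = {r_forced:.2f}, r (wiggles prescribed) = {r_prescribed:.2f}")
```

Feeding the observed wiggles back into the model drives r toward 1 by construction, which is Tisdale’s point: the high correlation partly reflects the prescribed input, not independent skill.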
Table 1 from Kosaka and Xie (2013) is also revealing. The “HIST” experiment is for the climate model forced by manmade greenhouse gases and other forcings, and the “POGA-H” adds the tropical Pacific sea surface temperature data to the “HIST” forcings. For the modeled period of 1971-1997, adding the ENSO signal increased the linear trend by 34%. Maybe that’s why modeling groups exclude the multidecadal variability of ENSO by skewing ENSO to zero. That way El Niños and La Niñas don’t contribute to or detract from the warming. Unfortunately, by doing so, the models have limited use as tools to project future climate.
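The 34% trend increase can likewise be sketched with invented numbers chosen only to reproduce the arithmetic; these are not the actual HIST/POGA-H values, just an illustration of the comparison:

```python
import numpy as np

years = np.arange(1971, 1998)          # the 1971-1997 period discussed above

# Hypothetical forced-only series: 0.10 C/decade linear trend.
hist = 0.010 * (years - years[0])
# Hypothetical ENSO contribution whose multidecadal component adds trend.
enso = 0.0034 * (years - years[0])
poga_h = hist + enso                   # forced series plus prescribed ENSO

def decadal_trend(t, y):
    """Least-squares linear trend in degrees per decade."""
    return np.polyfit(t, y, 1)[0] * 10.0

t_hist = decadal_trend(years, hist)
t_poga = decadal_trend(years, poga_h)
increase = 100.0 * (t_poga - t_hist) / t_hist
print(f"HIST: {t_hist:.3f} C/decade, POGA-H: {t_poga:.3f} C/decade, +{increase:.0f}%")
```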
UPDATE2 (Anthony): From Dr. Judith Curry’s essay – she writes at her blog:
The results in terms of global-average surface temperature are shown in Fig 1 below:
In Fig 1a, you can see how well the POGA H global average surface temperature matches the observations, particularly since about 1965 (note that central Pacific Ocean temperatures have significant and increasing uncertainty prior to 1980).
What is mind blowing is Figure 1b, which gives the POGA C simulations (natural internal variability only). The main ‘fingerprint’ of AGW has been the detection of a separation between climate model runs with natural plus anthropogenic forcing, versus natural variability only. The detection of AGW emerged sometime in the late 1970s or early 1980s.
Compare the temperature increase between 1975-1998 (main warming period in the latter part of the 20th century) for both POGA H and POGA C:
- POGA H: 0.68C (natural plus anthropogenic)
- POGA C: 0.4C (natural internal variability only)
I’m not sure how good my eyeball estimates are, and you can pick other start/end dates. But no matter what, I am coming up with natural internal variability accounting for significantly MORE than half of the observed warming.
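Curry’s back-of-the-envelope split is easy to check; the two temperature rises below are her eyeball estimates from the figure, not values computed from the paper:

```python
# Curry's eyeball estimates of the 1975-1998 temperature rise (C):
poga_h = 0.68   # natural plus anthropogenic (POGA H)
poga_c = 0.40   # natural internal variability only (POGA C)

natural_share = poga_c / poga_h     # fraction attributable to natural variability
anthro_share = 1.0 - natural_share  # remainder attributable to forcing
print(f"natural ~ {natural_share:.0%}, anthropogenic ~ {anthro_share:.0%}")
```

On these numbers natural variability accounts for roughly 59% of the rise, which is the “MORE than half” Curry refers to.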
The paper abstract:
Recent global-warming hiatus tied to equatorial Pacific surface cooling
Yu Kosaka & Shang-Ping Xie Nature (2013) doi:10.1038/nature12534
Despite the continued increase in atmospheric greenhouse gas concentrations, the annual-mean global temperature has not risen in the twenty-first century [1, 2], challenging the prevailing view that anthropogenic forcing causes climate warming. Various mechanisms have been proposed for this hiatus in global warming [3–6], but their relative importance has not been quantified, hampering observational estimates of climate sensitivity. Here we show that accounting for recent cooling in the eastern equatorial Pacific reconciles climate simulations and observations. We present a novel method of uncovering mechanisms for global temperature change by prescribing, in addition to radiative forcing, the observed history of sea surface temperature over the central to eastern tropical Pacific in a climate model. Although the surface temperature prescription is limited to only 8.2% of the global surface, our model reproduces the annual-mean global temperature remarkably well with correlation coefficient r = 0.97 for 1970–2012 (which includes the current hiatus and a period of accelerated global warming). Moreover, our simulation captures major seasonal and regional characteristics of the hiatus, including the intensified Walker circulation, the winter cooling in northwestern North America and the prolonged drought in the southern USA. Our results show that the current hiatus is part of natural climate variability, tied specifically to a La-Niña-like decadal cooling. Although similar decadal hiatus events may occur in the future, the multi-decadal warming trend is very likely to continue with greenhouse gas increase.

Steve Garcia writes:
“To be honest, this is the very first paper (or blog post even) I’ve seen that has attempted to determine the proportions of natural vs anthropogenic. Pardon the emphasis, but: This sort of determination should have been done way back in the late 1980s.
It is utterly pathetic – for their side – that this has never been done before. Almost everybody on the skeptics’ side has understood this from the first pitch of the top of the first inning:
YES, humans have contributed – but you dolts over there imagining that natural variation wasn’t part of it – what were you THINKING?”
It seems to me that Steve’s point is incredibly important. The climate modeling community has some serious explaining to do. I hope that Willis Eschenbach or some regular will do a major post on this topic.
PeterB in Indianapolis says….
>>>>>>>>>>>>>>>>>
Don’t forget the effects of the moon’s tidal pull. It is not just East-West.
http://ansatte.hials.no/hy/climate/defaultEng.htm
http://joannenova.com.au/2013/06/can-the-moon-change-our-climate-can-tides-in-the-atmosphere-solve-the-mystery-of-enso/
https://chiefio.wordpress.com/2011/11/03/lunar-resonance-and-taurid-storms/
Henry Clark says: August 28, 2013 at 9:17 pm
http://s24.postimg.org/rbbws9o85/overview.gif
On the “correlation does not equal causation, but it’s one heck of a good place to start” question, I side with you. The correlation is obvious to me, at least. I’ve read both sides of the debate on this site and find it very interesting (and faithful to the scientific method).
There may be some test that disproves the galactic cosmic ray model but so far it is in the lead (IMHO) as to explanations of climate.
Theo Goodwin says:
August 29, 2013 at 12:40 pm
Of course you’re correct that back in the 1980s, “climate scientists”, in order to behave like real scientists, should have falsified the null hypothesis that climate change remained primarily natural, as had been the case for the previous 4.55 billion years of earth history.
Then as now, I was unable to find any actual scientific evidence supporting a large anthropogenic fingerprint, let alone the 90% imagined by IPCC fantasists.
Here is cosmoclimatologist Dr. Nir J. Shaviv’s 2006 attempt to separate man-made from natural warming over the past century:
http://www.sciencebits.com/CO2orSolar
IMO there is little or no reason to assume that the human component will dominate over the next century, as he did. But observations, if they can be made objectively without book-cooking “adjustments”, will show in coming decades whose predictions are valid.
But in any case, should global warming or cooling resume later this century (from the flatness of the recent past), its degree won’t be catastrophic, whether the man-made contribution be ten percent or ninety. Assuming such presumptive warming or cooling can even be measured within the margin of error.
Thank you, Richard, for your excellent comments.
Steve Garcia says:
August 29, 2013 at 10:21 am
YES, humans have contributed – but you dolts over there imagining that natural variation wasn’t part of it – what were you THINKING?
————————————————————
ka-ching ka-ching
Theo Goodwin says:
August 29, 2013 at 12:40 pm
YES, humans have contributed – but you dolts over there imagining that natural variation wasn’t part of it – what were you THINKING?”
=====
Theo, don’t berate them for finally discovering the PDO….
..they might dig their heels in and never discover the AMO
milodonharlani says:
August 29, 2013 at 12:54 pm
Very well said. My simple opinion is that the Alarmists thought they could persuade the public of CAGW without doing the science. As inevitably happens, they are being hoisted on their own inconsistencies. Their investigation of natural regularities has been sorely lacking from the very beginning and remains so. In my humble opinion, there has never been evidence for dangerous global warming.
Latitude says:
August 29, 2013 at 1:19 pm
Cute. Try this one: after some reflection and consultation with the like minded, they throw Kosaka and Xie under the nearest bus.
…and no fair giving them clues either!….LOL
richardscourtney
Re your comment at
August 29, 2013 at 10:21 am
About needing a supercomputer, yep we do. But what if we do it the same way SETI@home has, via a distributed computing effort? Perhaps via BOINC?
Theo Goodwin says:
August 29, 2013 at 1:21 pm
CACA was also a conclusion convenient for statists, jumped to without even today’s still highly limited understanding of the earth’s air, sea & land climate systems. The question was declared “settled” long before even the PDO & AMO were discovered (by real scientists, not consensus GIGO-model climate scammers).
Salvatore Del Prete says:
They are saying a natural event such as ENSO has been responsible for almost all of the temperature variations, which proves (sorry AGW theory) AGW theory is invalid.
In fact, they aren’t saying that at all.
HarveyS:
Thank you for your post addressed to me at August 29, 2013 at 1:34 pm
It says in total
OK. I will give my answer and I ask you to recognise my answer is an honest statement of my ignorance: it is not an attempt to avoid your suggestion.
Replicating what has been done but adopting different understandings and assumptions is clearly possible. Many people (including me) could do it if provided with the needed supercomputer and the funds to employ relevant and competent programmers. System development and testing could be modular and progressive.
However, establishing a distributed computing operation is a very different enterprise. It requires much specialist computer science (that, e.g., I don’t have), and little testing would be possible prior to initiation of the system, but the entire system would need to work first time. That is very ambitious and, for example, the BBC failed in the attempt when it used its university and Met Office contacts to do something similar.
Hence, I lack knowledge to give serious consideration of your suggestion, but the little I do know tells me your suggestion would not be easy.
I would welcome comment from people knowledgeable in computer science because I think your suggestion may have merit if it could be done. Also, the existence of the WUWT ‘community’ affords the possibility of such a distributed activity.
Richard
Ian W says:
August 29, 2013 at 10:31 am
“Janice, I would refer you (and Pamela and Leif et al) to…”
You probably missed my point, which concerns correlations with the solar signal at the scale of weather. At this scale we can get very convincing evidence of solar forcing, with huge volumes of hindcasts available, and can also show solar forcing of teleconnections that are assumed to be internal variations.
to richardscourtney
August 29, 2013 at 1:57 pm.
Thank you for your response. I think it was more of a thought than a suggestion, as much as anything to remove the need for the supercomputer.
For a living I worked as a DB admin/designer and wrote the front ends for those DBs, so I do have some experience 🙂
What we would need, as far as I understand it:
“The main requirement of the application is that it be divisible into a large number (thousands or millions) of jobs that can be done independently.
The BOINC server software is extremely efficient, so that a single mid-range server can dispatch and handle millions of jobs per day. The server architecture is also highly scalable, making it easy to increase server capacity or availability by adding more machines.
BOINC supports applications that produce or consume large amounts of data, or that use large amounts of memory. Data distribution and collection can be spread across many servers, and participant hosts transfer large data unobtrusively. Users can specify limits on disk usage and network bandwidth. Work is dispatched only to hosts able to handle it. ”
http://boinc.berkeley.edu/trac/wiki/BoincIntro
So the basic question would be: could we (not me, I’m not that good, I don’t think) write a model so that the processing is split into chunks?
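For what it’s worth, the kind of decomposition BOINC needs can be sketched in miniature; here `multiprocessing` stands in for the distributed hosts, and the “job” is a trivial sum rather than a climate model step (everything below is illustrative, not a design):

```python
from multiprocessing import Pool

def split(n, n_chunks):
    """Divide the range [0, n) into contiguous, independent chunks."""
    step = n // n_chunks
    return [(i * step, (i + 1) * step if i < n_chunks - 1 else n)
            for i in range(n_chunks)]

def run_chunk(chunk):
    """Stand-in for one independent job: sum of squares over the chunk."""
    lo, hi = chunk
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    chunks = split(1_000_000, 8)
    with Pool(4) as pool:                 # each chunk runs with no shared state
        partials = pool.map(run_chunk, chunks)
    total = sum(partials)                 # cheap final reduction step
    print(total)
```

The catch, as noted elsewhere in the thread, is that a climate model’s grid cells and timesteps are not independent the way these chunks are: each cell needs its neighbours’ results, which is exactly what makes the problem hard to distribute.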
HarveyS:
Thank you for your post at August 29, 2013 at 3:08 pm.
You conclude
Well, I certainly could not, and it is up to other members of the WUWT community to offer if they could and are willing.
However, I and others (rgb@Duke comes to mind) could define the requirements of the model.
Richard
CO2 doesn’t seem to play much of a role here; even the trolls tread lightly in that regard. It may eventually be concluded that its radiative properties toward the earth’s surface are offset by the same amount to space and are therefore a wash.
Indeed – in this context it may be significant that while AGW radiative models typically employ a Narnia-Diskworld flat earth disk, taking account of the earth’s sphericity leads to more outward than inward radiation from any atmospheric radiation sources.
The fact that it radiates the same amount to space and back to earth doesn’t make it a wash; that IS the greenhouse effect. It’s the “back to earth” bit that doesn’t happen without the absorptive gas layer. On the moon, for example, radiation leaving the moon’s surface simply is gone, next stop infinity. Radiation leaving the earth’s surface in the LWIR bands associated with the earth’s surface temperature has a significant probability of being returned to the surface, sufficient that any IR radiometer, pointed up near the earth’s surface, will register hundreds of watts per square meter coming back down at any hour of the day or night (in addition to direct sunlight during the day). On the moon the same IR radiometer, pointed up, would read zero (apart from direct sunlight during the day).
The direct sunlight during the day would be more intense on the moon, so it would be, and is, hotter in the middle latitudes. However, the AVERAGE temperature of the moon is much colder. In some part this is because of the T^4 variation Richard ably described up above; the moon also lacks active lateral transport of heat, and its temperature is a lot more non-uniform than the earth’s, which favors more efficient cooling. Thermal mixing by heat transport in the ocean and atmosphere contributes to warming, because the rate at which a planet cools can vary tremendously at a fixed average temperature. The rate of cooling is related to the fourth moment of the temperature, where the mean is the first-order average.
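rgb’s T^4 point can be checked numerically. A minimal sketch, with temperatures chosen purely for illustration: two surfaces with the same 250 K average, one well-mixed and one with extreme day/night contrast.

```python
# Stefan-Boltzmann: flux = sigma * T^4. Because T^4 is convex, a surface
# with non-uniform temperature radiates more than a uniform surface at
# the same average temperature (Jensen's inequality).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def mean_flux(temps_k):
    """Average emitted flux over equal-area patches at the given temperatures."""
    return SIGMA * sum(t ** 4 for t in temps_k) / len(temps_k)

uniform = [250.0, 250.0]      # well-mixed planet: both halves at 250 K
contrasty = [100.0, 400.0]    # lunar-style extremes, same 250 K mean

f_uniform = mean_flux(uniform)
f_contrasty = mean_flux(contrasty)
print(f"uniform: {f_uniform:.0f} W/m^2, non-uniform: {f_contrasty:.0f} W/m^2")
```

The non-uniform case radiates more than three times as much at the same mean temperature, which is why thermal mixing by oceans and atmosphere warms the average: it makes the planet a less efficient radiator.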
There is almost no contribution from spherical geometry in a 9 km layer on top of a sphere of 6400 km radius, and what little there might be is more than offset by the DALR, which is what SUPPRESSES net radiation from the CO_2 band outward. That is, the gas at the top of the troposphere, where it thins enough that LWIR photons have a good chance of escaping, is much colder than the gas at the bottom, so its radiation rate is strongly suppressed (T^4 again).
Don’t go picking on the GHE itself, now, just because there is data suggesting that the partial derivative of the greenhouse trapping of heat with respect to CO_2 concentration may be zero to very weakly positive instead of comparatively strongly positive. There really is little doubt that the GHE is real, but it is also complex, and feedbacks from increased CO_2 can be all or mostly negative as easily as positive. Because the problem is not separable (as this paper demonstrates if nothing else), it is very, very difficult to apportion a split between: natural variations that we do not know how to calculate a priori; direct effects from increased CO_2; feedback effects from the mix of natural effects and increased CO_2; other effects from aerosols and particulates, natural and otherwise; internal feedback effects that have effectively stabilized the system within a range of perhaps plus or minus 10C over hundred-million-year timescales even while the strength of the sun itself has varied; geographical effects; effects from the slowly varying orbital parameters of the earth; and slowly varying effects from the sun itself (which may or may not have a significant impact on some subset of the above), all in a nonlinear, chaotic system with highly variable timescales in non-Markovian terms that contribute to its local time evolution (that is, heat that is stored and re-released on timescales ranging from seconds to centuries acting within the system).
This one paper does not mean that we suddenly understand the climate system, in other words. There are probably dozens of models one can build that fit any sufficiently small fraction of the temperature timeseries within its (honestly computed) uncertainty. It’s the future that is the problem:
“Prediction is very difficult, especially if it is about the future.” — Niels Bohr
rgb
This is why the Propaganda Outlets News Media had to finally own up to ‘The Pause’. If they did not, they would have lost all credibility, especially given the internet can tell you that the cold at home is not a one-off.
Impressive collection, Gail, thanks!
Not that it is a surprise. The interesting thing is that it really looks like it is a global phenomenon. One does indeed have to wonder if they haven’t finally added too many tenths of a degree C to the real temperature “anomaly” to make the absolute temperature believable.
rgb
Gail Combs,
I have never considered glaciers to be particularly good evidence for either side of the argument, since they can be affected as much by precipitation as by temperature.
My general point is that there is some AGW effect. It might be less than many thought but it might not be completely negligible. If someone thinks it is negligible then, indeed, they need to explain when we will actually begin to cool or have a good explanation for why we are not. A pause in warming or some cold weather in various parts of the world is not evidence of cooling.
So all I was doing was to challenge anyone who thinks AGW is completely negligible to give a prediction or an explanation. I haven’t seen anyone give a solid prediction of when we will be back to close to 1910 temperature levels and some who think AGW is negligible actually seem to think warming will continue because of some unknown warming factor in a rebound from the LIA.
James Cross says:
August 29, 2013 at 4:33 pm
You wrote:
“My general point is that there is some AGW effect. It might be less than many thought but it might not be completely negligible.”
If you could demonstrate this phenomenon with reproducible results, you would be on to something.
I think that rgb’s comment at 3:41 pm neatly sums up why we, or anyone, can’t model the earth’s climate system:
there is so much we don’t know, and much we think we know and have wrong.
Will we ever be able to model it even closely? I am not sure we will; the Met Office can’t even get a 3-day forecast correct.
philjourdan says:
August 29, 2013 at 1:34 pm
“@Theo Goodwin – I can show you some awesome mushrooms! ;-)”
Ours are the size of footballs and they pop up overnight. Mold is everywhere. Everything has a pale green patina.
richardscourtney (to HarveyS)
So the basic question would be: could we (not me, I’m not that good, I don’t think) write a model so that the processing is split into chunks?
Well, I certainly could not, and it is up to other members of the WUWT community to offer if they could and are willing.
>>>>>>>>>>>>>>>>
Pretty much any compute problem can be broken up into “chunks” and solved through parallel processing. The techniques are well documented and they work. rgb has some background in this; he was one of the pioneers of the “Beowulf cluster”, and his online commentary during the formative years was so prolific that many in the HPC community theorized that he was actually someone’s artificial intelligence experiment and that no such person actually existed.
That said, crowd-sourcing a supercomputer for this task is likely not practical. Massive compute tasks, ones of this type in particular, are bound not so much by processor speed as by the speed at which data can be accessed from a central storage repository by the processors. When we design large compute clusters, we’re aiming for latency on the order of microseconds. A compute load spread across the internet would have latency in the dozens of milliseconds at best, thousands of times as high as what would be required for the processors to work efficiently. The same goes for MPI (Message Passing Interface), which coordinates the workload between parallel processes: it has to be very fast, or the processors wind up twiddling their thumbs 99% of the time.
Of course, I could be overestimating the complexity of the model that Richard proposes to build. But anything similar in complexity to the existing models, regardless of the underlying physics, would be problematic to run across the internet.
Anthony, maybe you should do a [poll] on who people/readers feel is the biggest climate alarmist with the most political gain.
Just like the one you did on the new name for climate change.
I am calling the climate alarmists a new name in my book, since they call anyone with a valid point, or data on current/past conditions, a denier. The new name for the name-callers is “Anti-climate evolutionists”. The earth has been going through cycles for billions of years and there is nothing they can do about it. Species and plants have all evolved over millions of years with the changes.