On Guemas et al (2013) “Retrospective prediction of the global warming slowdown in the past decade”

I received a number of emails about the newly published Guemas et al (2013) paper titled “Retrospective prediction of the global warming slowdown in the past decade”. It’s paywalled. The abstract is here. It reads:

Despite a sustained production of anthropogenic greenhouse gases, the Earth’s mean near-surface temperature paused its rise during the 2000–2010 period. To explain such a pause, an increase in ocean heat uptake below the superficial ocean layer has been proposed to overcompensate for the Earth’s heat storage. Contributions have also been suggested from the deep prolonged solar minimum, the stratospheric water vapour, the stratospheric and tropospheric aerosols. However, a robust attribution of this warming slowdown has not been achievable up to now. Here we show successful retrospective predictions of this warming slowdown up to 5 years ahead, the analysis of which allows us to attribute the onset of this slowdown to an increase in ocean heat uptake. Sensitivity experiments accounting only for the external radiative forcings do not reproduce the slowdown. The top-of-atmosphere net energy input remained in the [0.5–1] W m⁻² interval during the past decade, which is successfully captured by our predictions. Most of this excess energy was absorbed in the top 700 m of the ocean at the onset of the warming pause, 65% of it in the tropical Pacific and Atlantic oceans. Our results hence point at the key role of the ocean heat uptake in the recent warming slowdown. The ability to predict retrospectively this slowdown not only strengthens our confidence in the robustness of our climate models, but also enhances the socio-economic relevance of operational decadal climate predictions.
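
As a rough sense of scale (this is back-of-the-envelope arithmetic, not a calculation from the paper), a sustained top-of-atmosphere imbalance in that 0.5–1 W m⁻² range, accumulated over a decade, amounts to something on the order of 10^23 joules:

```python
# Back-of-the-envelope scale check (illustrative only; not a calculation from
# Guemas et al. 2013): energy accumulated by a sustained top-of-atmosphere
# imbalance of 0.5-1 W/m^2 over one decade.

EARTH_SURFACE_AREA_M2 = 5.1e14            # ~5.1 x 10^14 m^2
SECONDS_PER_DECADE = 10 * 365.25 * 86400  # ~3.16 x 10^8 s

for imbalance_w_per_m2 in (0.5, 1.0):
    energy_joules = imbalance_w_per_m2 * EARTH_SURFACE_AREA_M2 * SECONDS_PER_DECADE
    print(f"{imbalance_w_per_m2:.1f} W/m^2 over a decade ~ {energy_joules:.1e} J")

# Prints roughly 8.1e22 J and 1.6e23 J, i.e. on the order of 10^23 joules.
```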

Not too surprisingly, ClimateProgress has a post about the paper: “New Study: When You Account For The Oceans, Global Warming Continues Apace”.

The abstract suggests that the tropical Pacific and Atlantic Oceans are responsible for 65% of warming of global ocean heat content for the depths of 0-700 meters since 2000. However, the much-adjusted NODC ocean heat content data for the tropical Pacific (Figure 1) shows a decline in ocean heat content since 2000, and the ocean heat content for the Atlantic (Figure 2) has been flat since 2005.

Figure 1: NODC ocean heat content (0-700 meters), tropical Pacific

###########

Figure 2: NODC ocean heat content (0-700 meters), Atlantic
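
For readers who want to check the behavior shown in the figures above against the NODC 0-700 meter series themselves, here is a minimal sketch of a least-squares trend since 2000. The file name and the two-column layout (decimal year, heat content anomaly) are assumptions for illustration, not the format NODC distributes.

```python
# Minimal sketch: linear (least-squares) trend of an ocean heat content series
# since 2000. Assumes the NODC 0-700 m series has been saved to a two-column
# CSV of (decimal year, OHC anomaly); the file name is hypothetical.
import numpy as np

data = np.loadtxt("nodc_tropical_pacific_ohc_0-700m.csv", delimiter=",")
year, ohc = data[:, 0], data[:, 1]

since_2000 = year >= 2000.0
slope, intercept = np.polyfit(year[since_2000], ohc[since_2000], 1)
print(f"Trend since 2000: {slope:+.4f} anomaly units per year")
```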

The abstract also mentions a new-found ability to predict slowdowns in warming. But the warming of tropical Pacific ocean heat content depends on the 3-year La Niña events of 1954-57, 1973-76 and 1998-01 and on the freakish 1995/96 La Niña (Figure 3), and the warming of sea surface temperatures for the Atlantic, Indian and West Pacific oceans (Figure 4) depends on strong El Niño events.

Figure 3: Tropical Pacific ocean heat content, showing the 3-year and 1995/96 La Niña events

###########

Figure 4: Sea surface temperatures for the Atlantic, Indian and West Pacific oceans

CLOSING

Can Guemas et al (2013) predict 3-year La Niñas and freakish La Niñas like the one in 1995/96? Can they predict strong El Niño events, like those in 1986/87/88, 1997/98 and 2009/10? Both are unlikely: the specialized ENSO forecast models have difficulty projecting beyond the springtime predictability barrier every year.

FURTHER READING

For further information about the problems with ocean heat content data, refer to the post Is Ocean Heat Content Data All It’s Stacked Up to Be?

And for further information about the natural warming of the global oceans, see “The Manmade Global Warming Challenge.”

207 Comments
richardscourtney
April 9, 2013 11:20 am

To the left of centre:
Your post at April 9, 2013 at 11:04 am asserts:

I see no reason to apologise for my comment. It was a perfectly reasonable comment that I would be happy to debate/discuss with anyone who was willing to refrain from referring to what I say as pseudo-scientific nonsense.

I call BS on that!
Following your evasion of providing a proper response to my comment in rebuttal of your nonsense, I have repeatedly goaded you in an attempt to obtain a proper response. And you have persistently refused to provide one.
I remind you that the comment you are evading was as follows.
Richard
================
richardscourtney says:
April 9, 2013 at 4:31 am
To the left of centre:
It says in total
Your post at April 9, 2013 at 3:43 am is ridiculous

There are a number of comments on this post that are mockingly dismissive of “retrospective predictions”. It is, however, an entirely reasonable thing to do. If one has a model that one wishes to use to predict the future it is sensible to ask the question “what would my model have predicted in the past”. Consider only data up to some previous point in time, use that data in your model and compare what your model predicts with what actually happened. As far as I can tell, there is nothing unreasonable about doing such a thing. To mock such a process either indicates a level of ignorance or a fundamental bias against any kind of climate science with which you disagree.

No!
Science says that a prediction of a model is compared to reality. Any difference between observed reality and the prediction is an indication of a flaw in the model. This is because the model is a representation of an understanding of reality.
Therefore, the difference between the model prediction and observed reality is a demonstration of a flaw in the modelled understanding of reality.
The flaw may be in
(a) the understanding
or
(b) how the model is constructed to represent that understanding
or
(c) both (a) and (b).
There are no other possibilities. And it is pure pseudoscience to imagine a not-measured possibility and to feed that into the model to determine if it can be used to adjust the model prediction to agree with observed reality.
Such a practice is pseudoscience because there are an infinite number of not-measured possibilities which can be imagined.
The model failed its empirical test. Using “retrospective-prediction” of the kind reported by Guemas et al. is pure pseudoscience and cannot be thought “an entirely reasonable thing to do” except by pseudoscientists.
Richard

TomRude
April 9, 2013 11:25 am

Virginie Guemas is a model student of Codron, Cassou, Meteo France, who also signed the petition that Valerie Masson Delmotte started against Claude Allegre and Vincent Courtillot back in 2010. The WWU call against Easterbrook stems from the same idea.
Her presentations, in which Azores anticyclones and other niceties of statistical climatology feature prominently, suggest she would be hard pressed to figure out what the weather is doing.

richardscourtney
April 9, 2013 11:32 am

To the left of centre:
I apologise that I made two errors in copying my post that you are evading. Sorry.
The corrected copy is as follows.
Richard
============
richardscourtney says:
April 9, 2013 at 4:31 am
To the left of centre:
Your post at April 9, 2013 at 3:43 am is ridiculous.
It says in total

There are a number of comments on this post that are mockingly dismissive of “retrospective predictions”. It is, however, an entirely reasonable thing to do. If one has a model that one wishes to use to predict the future it is sensible to ask the question “what would my model have predicted in the past”. Consider only data up to some previous point in time, use that data in your model and compare what your model predicts with what actually happened. As far as I can tell, there is nothing unreasonable about doing such a thing. To mock such a process either indicates a level of ignorance or a fundamental bias against any kind of climate science with which you disagree.

No!
Science says that a prediction of a model is compared to reality. Any difference between observed reality and the prediction is an indication of a flaw in the model. This is because the model is a representation of an understanding of reality.
Therefore, the difference between the model prediction and observed reality is a demonstration of a flaw in the modelled understanding of reality.
The flaw may be in
(a) the understanding
or
(b) how the model is constructed to represent that understanding
or
(c) both (a) and (b).
There are no other possibilities. And it is pure pseudoscience to imagine a not-measured possibility and to feed that into the model to determine if it can be used to adjust the model prediction to agree with observed reality.
Such a practice is pseudoscience because there are an infinite number of not-measured effects which can be imagined, but none of them add to understanding of what was modeled. Addition of such an effect pretends the modelers have an understanding which they do not have.
The model failed its empirical test. Using “retrospective-prediction” of the kind reported by Guemas et al. is pure pseudoscience and cannot be thought “an entirely reasonable thing to do” except by pseudoscientists.
Richard

izen
April 9, 2013 11:36 am

@- mwhite
“So to validate this paper they should be able to predict what hasn’t happened yet, i.e. what does their model say will happen to the earth’s temperature over the next five to ten years?”
The models predict a continuing warming trend a little faster than the trend from 1980 till now, say around 0.15 degC/decade, with a wide margin of uncertainty.
None predict cooling.
Most include a prediction range that covers a trend the same as that seen over the last 20 years.

noaaprogrammer
April 9, 2013 11:55 am

In computer science we have:
exponential time algorithms e.g. O(2^n)
polynomial time algorithms e.g. O(n^3)
linear time algorithms e.g. O(n)
constant time algorithms e.g. O(1)
And evidently Guemas et al. have now discovered an O(-1) algorithm which knows the answer before it is run.

Bill Illis
April 9, 2013 11:56 am

From Church and White 2011, here is your Ocean Heat Uptake in the Red and Dark Blue areas in comparison to all the components we should be talking about.
http://s23.postimg.org/hmhsd470r/grl28456_fig_0003.png
Big farking deal. Why so much focus on these tiny little areas?
Because it can be shown this way instead, as in Guemas 2013 (just ignoring the other 90% of the Earth’s Heat Budget):
http://www.nature.com/nclimate/journal/vaop/ncurrent/images_article/nclimate1863-f3.jpg
Or SkepticalScience’s version (ignoring 90% of the relevant components)
http://www.skepticalscience.com/graphics/Total_Heat_Content_2011.jpg
Or another version (ignoring 90% of the relevant components).
http://www.carbonbrief.org/media/158828/balmaseda_et_al._ocean_heat_content_600x415.jpg
These are just lines going up. GHG radiative forcing is 6 times bigger.
Have a read of Church and White 2011 and just ignore the sea level budget discussion and focus on the Earth Heat Budget part.
http://onlinelibrary.wiley.com/doi/10.1029/2011GL048794/full

Richard M
April 9, 2013 12:07 pm

Izen claims the reason the oceans are now absorbing excess heat is … “The amount of downwelling longwave radiation from rising CO2.”
Let’s assume he is correct and we hit a “tipping point” where the oceans start absorbing ALL the energy from now on. The current models predicted an atmosphere rise of 3C/doubling. That means by the time we get to 800 ppm in about 200 years the average global temperature would rise from 15C to 18C.
However, due to this tipping point all the excess energy will now enter the oceans. Since the heat capacity of the oceans is 1000x the atmosphere we should see a 3/1000 increase in average ocean temperature. Since the oceans now average 3.9C that means it would go up to 3.903C. Oh wait, I bet that 3.9C is already rounded. In other words, the warming would not be detectable.
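
As a back-of-the-envelope check of the heat-capacity ratio used in the comment above (the masses and specific heats below are standard round numbers, not values taken from the comment or from the paper):

```python
# Rough check of the ocean-to-atmosphere heat capacity ratio and of the implied
# mean ocean warming if the energy corresponding to a 3 degC atmospheric rise
# (the commenter's assumption) went entirely into the ocean instead.

ATMOSPHERE_MASS_KG = 5.1e18   # total mass of the atmosphere
OCEAN_MASS_KG = 1.4e21        # total mass of the ocean
CP_AIR = 1004.0               # J/(kg K), dry air at constant pressure
CP_SEAWATER = 3990.0          # J/(kg K), typical seawater value

ratio = (OCEAN_MASS_KG * CP_SEAWATER) / (ATMOSPHERE_MASS_KG * CP_AIR)
delta_t_ocean = 3.0 / ratio

print(f"Ocean/atmosphere heat capacity ratio: ~{ratio:,.0f}")    # ~1,100
print(f"Implied mean ocean warming: ~{delta_t_ocean:.4f} degC")  # ~0.003 degC
```
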
Thank you, Izen, for pointing out that there is no longer any looming catastrophe. Can we now quit pouring trillions of dollars into re-inventing our energy infrastructure and killing the poor people of the world?

Owen in GA
April 9, 2013 12:09 pm

Let me see, the ocean is the largest repository of CO2 out there, yet its ability to hold CO2 is dependent on temperature. My question: if the oceans were to warm, wouldn’t the CO2 levels have spiked at a much higher rate as the CO2 boiled out of the oceans?
Another smell test not quite right.

T Control
April 9, 2013 12:28 pm

I just skimmed the paper; can someone fill me in on how reliable ORAS4 should be?

Lars P.
April 9, 2013 12:51 pm

I sprayed my monitor with coffee reading this, so thought maybe others would also appreciate it:
“The ocean ate my global warming”
http://junkscience.com/2013/04/07/more-the-ocean-ate-my-global-warming/

Wayne2
April 9, 2013 12:57 pm

Bob, this is off-topic, but I think you (or Willis) might find this paper interesting:
http://www.tau.ac.il/~colin/courses/AtmosElec/ChenISUAL08.pdf
I had been wondering this morning about how much power might be transported electrically within the atmosphere, through thunderstorms and higher. Turns out that regular lightning tops out around 10 km, but blue jets reach 2-4 times that height, and Sprites occur at 8-10 times that height.
Note the paper’s connection between oceanic thunderstorms and Sprites. They’re much more likely over water and with higher SSTs. ENSO reaches into space?
http://en.wikipedia.org/wiki/Upper-atmospheric_lightning

JimF
April 9, 2013 1:23 pm

JJ says:
April 9, 2013 at 7:40 am: “…this ad hoc gnostic bullshit ain’t it….” Heh. Funny and well said, but it fazes them not at all.
Now that we know their model is “robust” how about putting out a real prediction – you know, of the future? I can’t wait to see how the next few years shape up. I want to know whether or not to invest in pinot noir production from Greenland – the next “Burgundy” if things go warmer.

JJ
April 9, 2013 1:24 pm

The problem with model validation by hindcasting or “retrospective prediction” or whatever the term of the day happens to be, is that it is imagined to work like this:
1. Build a model.
2. Calibrate the model with data from 1900-1950.
3. Test the model by comparing model predictions for 1950-2000 to data from 1950-2000.
4. Accept or reject the model based on Step 3, and publish the results whatever they may be.
But in practice, it really works like this:
1. Build a model.
2. Calibrate the model with data from 1900-1950.
3. Test the model by comparing model predictions for 1950-2000 to data from 1950-2000.
4. If Step 3 indicates that you should accept the model, publish the results.
5. If Step 3 indicates that you should reject the model, then make changes to the model, and return to Step 2. Rinse, repeat.
The first method uses the first subset of the data for calibration, and the second subset of the data for validation. The second method uses all of the data in a multi-step calibration, without any validation. Often, that is not how the results of the second method are reported, or even understood by the people doing the work.
In practice, the only way to reliably avoid that pitfall is to insist on validation by data that cannot have been part of the calibration, as it cannot have been known to anyone involved in the process. Future data.
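
The distinction drawn above can be made concrete with a small sketch. Nothing in it comes from Guemas et al.; the “model” is a deliberately trivial straight-line fit to synthetic data, and the only point is the data-handling discipline: the validation period plays no part in fitting and is looked at exactly once.

```python
# Toy illustration of the honest version of hindcast validation described above:
# calibrate on 1900-1949 only, then score the prediction against 1950-2000 data
# that played no part in building or tuning the model.
import numpy as np

rng = np.random.default_rng(seed=0)
years = np.arange(1900, 2001)
# Synthetic "observations": a weak trend plus noise (a stand-in for a real series).
obs = 0.005 * (years - 1900) + rng.normal(0.0, 0.1, size=years.size)

calibration = years < 1950
validation = ~calibration

# "Model": a straight-line fit to the calibration period only.
slope, intercept = np.polyfit(years[calibration], obs[calibration], 1)
prediction = slope * years[validation] + intercept

rmse = float(np.sqrt(np.mean((prediction - obs[validation]) ** 2)))
print(f"Out-of-sample RMSE, 1950-2000: {rmse:.3f}")

# Accept or reject on this score and report it either way; revising the model
# after seeing it would quietly turn the validation data into calibration data.
```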

To the left of centre
April 9, 2013 1:42 pm

@richardscourtney Somehow you think I’m obliged to respond to your comment. Why would that possibly be a reasonable expectation? As I mentioned, I am more than happy to engage with people who are willing to remain civil and who refrain from accusing me of being a troll and from referring to what I say as pseudo-scientific nonsense. Why would I possibly want to engage with someone who appears to have already made up their mind about the relevance of my views? If you already know the answer (as you seem to think you do) you don’t need me to be involved at all.

izen
April 9, 2013 1:45 pm

@- Richard M
“Let’s assume izen is correct and we hit a “tipping point” where the oceans start absorbing ALL the energy from now on. The current models predicted an atmosphere rise of 3C/doubling. That means by the time we get to 800 ppm in about 200 years the average global temperature would rise from 15C to 18C.”
That is with the oceans absorbing 90% of the extra energy as at present.
@-“However, due to this tipping point all the excess energy will now enter the oceans. Since the heat capacity of the oceans is 1000x the atmosphere we should see a 3/1000 increase in average ocean temperature. Since the oceans now average 3.9C that means it would go up to 3.903C. Oh wait, I bet that 3.9C is already rounded. In other words, the warming would not be detectable.”
You are quite right that the rise in the AVERAGE temperature might not be detectable (except by expansion) if this thermodynamic impossibility occurred.
But as we can see at present, with 90% of the energy going into the oceans, the ocean surface warms only a little slower than the land. About 0.7 degC/century?
http://www.woodfortrees.org/plot/hadsst2gl/plot/hadsst2gl/from:1913/trend
There is no reason to suppose that if 100% of the energy went into the oceans it could all stay there. The percentage partitioning of the energy is an emergent property of the physics. Unless you invent ice-nine to prevent any energy exchange between the oceans and the atmosphere, it is difficult to envisage how the phase changes of water would not distribute that energy in climate-altering ways.
@-“Thank you Izen for pointing out the there is no longer any looming catastrophe. ”
I never claimed there was one.
But if you think that an increase in global temperatures from 15C to 18C is potentially catastrophic, then I fear your concept of an ocean with a single fixed temperature from its surface to its depth from pole to pole is an unconvincing alternative.

izen
April 9, 2013 2:26 pm

@- Bob Tisdale
“Unfortunately, the data disagrees with the hypothesis, because the data indicate the oceans warmed naturally.”
There is nothing unnatural about DWLR. The oceans would quickly freeze over without it.
I would suggest that they warmed ‘naturally’ because increasing CO2 and thereby increasing DWLR slows the net rate at which the ocean surface layer can shed energy.
It gains more during the energy absorbing La Niña part of the ENSO fluctuations.

Alan D McIntire
April 9, 2013 2:47 pm

JJ, you got it right on the button. In practice, they’re going to continue adjusting the model until it accurately “predicts” 1950 to 2000 using 1900 to 1950 data. I’ve made that error when trying to “predict” horse races.

jim2
April 9, 2013 2:56 pm

izen April 9, 2013 at 11:12 am said:
“The amount of downwelling longwave radiation from rising CO2.”
Izen: I can see how shortwave radiation from the Sun gets absorbed into the ocean, but downwelling IR does not penetrate the skin of the ocean. This means that more CO2 can’t be the cause. More surface mixing could be the cause, but that hasn’t been demonstrated to be the case.

Berényi Péter
April 9, 2013 3:11 pm

“Retrospective prediction” is a contradictio in adjecto (a contradiction in terms), a technical term borrowed from astrology. Similis simili gaudet (like delights in like).

richardscourtney
April 9, 2013 3:28 pm

To the left of centre:
Thank you for your post at April 9, 2013 at 1:42 pm, which I and any other rational person can only understand to be saying:
1. You made a post.
2. I explained why and how your post was pseudoscientific nonsense.
3. You cannot refute or even dispute my explanation.
That seems pretty clear and I thank you for it.
Richard

JJ
April 9, 2013 3:50 pm

izen says:
There is no reason to suppose that if 100% of the energy went into the oceans it could all stay there.

But that is precisely the assumption upon which the “global warming” scare was built. We have been told for twenty five years that the ONLY way to explain any rise in atmospheric surface temp was to invoke the wrath of Almighty CO2. A fine fairy tale that supports the desired conclusion of high climate sensitivity via positive feedbacks (and the leftist political “solutions” to that “problem”), but one that rather inconveniently ran into the Truth: Surface atmospheric temps stopped rising, while CO2 decidedly did not.
NOW we are told that the oceans are actually a part of the climate system, and that heat that was modeled to go to the atmosphere may inexplicably go into the ocean depths instead. A nice ad hoc adjustment to the catechism, but it has an undesirable side effect. No one gives a rat’s ass about an infinitesimal “problem” buried under hundreds of meters of water. So the mythology must allow for the soggy fire and brimstone to be able to dry itself off, and at some point in the future come roaring back into the atmosphere where it may harm us unrepentant sinners.
But if it is a legitimate threat that “missing heat” will come back in the future, then it is a legitimate supposition that it has done so in the past. Like 1980-2000, for example … This is a big gaping hole in the “global warming” narrative, that we skeptics have been talking about since the beginning. Welcome to the club.