Oh this is hilarious. In a “Back To The Future” sort of moment, this press release from the National Center for Atmospheric Research claims they could have forecast “the pause”, if only they had the right tools back then.
Yes, having tools of the future would have made a big difference in these inconvenient moments of history:
“We could have forecast the Challenger Explosion if only we knew O-rings became brittle and shrank in the cold, and we had Richard Feynman working for us to warn us.”
“We could have learned the Japanese were going to bomb Pearl Harbor if only we had the electronic wiretapping intelligence gathering capability the NSA has today.”
“We could have predicted the Tacoma Narrows Bridge would collapse back then if only we had the sophisticated computer models of today to model wind loading.”
Yes, saying that having the tools of the future back then would have fixed the problem is always a big help when you want to do a post-facto CYA for stuff you didn’t actually do back then.
UPDATE: WUWT commenter Louis delivers one of those “I wish I’d said that” moments:
Even if they could have forecast the pause, they wouldn’t have. That would have undercut their dire message that we had to act now because global warming was accelerating and would soon reach a point where it would become irreversible.
Here’s the CYA from NCAR:
Progress on decadal climate prediction
Today’s tools would have foreseen warming slowdown
If today’s tools for multiyear climate forecasting had been available in the 1990s, they would have revealed that a slowdown in global warming was likely on the way, according to new research.
The analysis, led by NCAR’s Gerald Meehl, appears in the journal Nature Climate Change. It highlights the progress being made in decadal climate prediction, in which global models use the observed state of the world’s oceans and their influence on the atmosphere to predict how global climate will evolve over the next few years.
Such decadal forecasts, while still subject to large uncertainties, have emerged as a new area of climate science. This has been facilitated by the rapid growth in computing power available to climate scientists, along with the increased sophistication of global models and the availability of higher-quality observations of the climate system, particularly the ocean.

Although global temperatures remain close to record highs, they have shown little warming trend over the last 15 years, a phenomenon sometimes referred to as the “early-2000s hiatus”. Almost all of the heat trapped by additional greenhouse gases during this period has been shown to be going into the deeper layers of the world’s oceans.
The hiatus was not predicted by the average conditions simulated by earlier climate models because they were not configured to predict decade-by-decade variations.
However, to challenge the assumption that no climate model could have foreseen the hiatus, Meehl posed this question: “If we could be transported back to the 1990s with this new decadal prediction capability, a set of current models, and a modern-day supercomputer, could we simulate the hiatus?”
Looking at yesterday’s future with today’s tools
To answer this question, Meehl and colleagues applied contemporary models in a “hindcast” experiment using the new methods for decadal climate prediction. The models were started, or “initialized,” with particular past observed conditions in the climate system. The models then simulated the climate over previous time periods where the outcome is known.
The researchers drew on 16 models from research centers around the world that were assessed in the most recent report by the Intergovernmental Panel on Climate Change (IPCC). For each year from 1960 through 2005, these models simulated the state of the climate system over the subsequent 3-to-7-year period, including whether the global temperature would be warmer or cooler than it was in the preceding 15-year period.
Starting in the late 1990s, the 3-to-7-year forecasts (averaged across each year’s set of models) consistently simulated the leveling of global temperature that was observed after the year 2000. (See image at bottom.) The models also produced the observed pattern of stronger trade winds and cooler-than-normal sea surface temperatures over the tropical Pacific. A previous study by Meehl and colleagues related the observed hiatus of globally averaged surface air temperature to this pattern, which is associated with enhanced heat storage in the subsurface Pacific and other parts of the deeper global oceans.
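The verification step described above can be sketched numerically. The snippet below is a minimal illustration using synthetic temperature anomalies, not actual observations; the function name, numbers, and plateau shape are invented for the example. It checks whether the mean of a forecast’s 3-to-7-year window is warmer or cooler than the mean of the preceding 15 years:

```python
import numpy as np

def hindcast_verification(anomalies, start_idx):
    """Compare a forecast's 3-to-7-year window mean against the mean of
    the 15 years preceding the initialization year (start_idx)."""
    baseline = np.mean(anomalies[start_idx - 15:start_idx])
    window = np.mean(anomalies[start_idx + 2:start_idx + 7])  # years 3..7
    return "warmer" if window > baseline else "cooler"

# Synthetic annual anomalies: steady warming through 1999, then a plateau
rng = np.random.default_rng(0)
years = np.arange(1960, 2014)
trend = np.where(years < 2000, 0.015 * (years - 1960), 0.6)
anomalies = trend + rng.normal(0.0, 0.05, len(years))

# "Initialize" in 1998 (index 38): the 3-to-7-year window covers 2000-2004
print(hindcast_verification(anomalies, 38))
```

Note that even in a plateau, the forecast window can come out “warmer” than its 15-year baseline, which is consistent with the press release’s description of temperatures leveling off near record highs.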
Letting natural variability play out

Although scientists are continuing to analyze all the factors that might be driving the hiatus, the new study suggests that natural decade-to-decade climate variability is largely responsible.
As part of the same study, Meehl and colleagues analyzed a total of 262 model simulations, each starting in the 1800s and continuing to 2100, that were also assessed in the recent IPCC report. Unlike the short-term predictions that were regularly initialized with observations, these long-term “free-running” simulations did not begin with any particular observed climate conditions.
Such free-running simulations are typically averaged together to remove the influence of internal variability that occurs randomly in the models and in the observations. What remains is the climate system’s response to changing conditions such as increasing carbon dioxide.
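The effect of that averaging can be sketched with a toy calculation (synthetic series, not actual model output): give each of 262 free-running simulations the same forced response plus its own random internal variability, and the ensemble mean suppresses the internal variability by roughly a factor of sqrt(262), leaving approximately the forced response alone.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs, n_years = 262, 100

# Common forced response shared by every run (e.g., response to rising CO2)
forced = np.linspace(0.0, 1.0, n_years)

# Each free-running simulation adds its own random internal variability
internal = rng.normal(0.0, 0.2, size=(n_runs, n_years))
simulations = forced + internal

# Averaging the ensemble shrinks internal variability by ~1/sqrt(n_runs)
ensemble_mean = simulations.mean(axis=0)
residual = ensemble_mean - forced
print(round(float(np.abs(residual).max()), 3))
```

The residual left in the ensemble mean is far smaller than the 0.2 run-level noise, which is why the multi-model average shows the forced trend but cannot show any one realization’s hiatus.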
However, the naturally occurring variability in 10 of those simulations happened, by chance, to line up with the internal variability that actually occurred in the observations. These 10 simulations each showed a hiatus much like what was observed from 2000 to 2013, even down to the details of the unusual state of the Pacific Ocean.
Meehl pointed out that there is no short-term predictive value in these simulations, since one could not have anticipated beforehand which of the simulations’ internal variability would match the observations.
“If we don’t incorporate current conditions, the models can’t tell us how natural variability will evolve over the next few years. However, when we do take into account the observed state of the ocean and atmosphere at the start of a model run, we can get a better idea of what to expect. This is why the new decadal climate predictions show promise,” said Meehl.
Decadal climate prediction could thus be applied to estimate when the hiatus in atmospheric warming may end. For example, the UK Met Office now issues a global forecast at the start of each year that extends out for a decade.
“There are indications from some of the most recent model simulations that the hiatus could end in the next few years,” Meehl added, “though we need to better quantify the reliability of the forecasts produced with this new technique.”
The paper:
Meehl, Gerald A., Haiyan Teng, and Julie M. Arblaster, “Climate model simulations of the observed early-2000s hiatus of global warming,” Nature Climate Change (2014), doi:10.1038/nclimate2357
There is some indication of continued cool conditions in the Southern Ocean, and of a developing cooling in the north Atlantic sub-polar gyre. The latter is potentially important for climate impacts over Europe, America and Africa.
How about that then. AMO and PDO are “potentially important”. WOW!!
Anthony, for some reason the link to the paper is a mailto: link. Just need to correct that.
“We could have learned the Japanese were going to bomb Pearl Harbor if only we had the electronic wiretapping intelligence gathering capability the NSA has today.”
Well some say the USA did crack the code before Pearl Harbour, and that was the reason the carrier left harbour a day before, leaving only obsolete battleships for destruction and delivering a convenient reason to sway public opinion for entry into the war.
The USA does admit that they cracked the Japanese code a while later, enabling them to intercept and shoot down a plane in which General Yamamoto was travelling, killing him and paralysing the Japanese navy for months, which led to the delay of their super U-boat project, of which Yamamoto was the driving force.
Almost correct: it was the Brits who cracked the code, as it was a variant of the four-wheel German army Enigma code (not the five-wheel navy one). They informed only trusted members of the US admin.
Incorrect. The Japanese JN25 naval code had nothing to do with Enigma; it was based on code books that were replaced from time to time. The last issue prior to Pearl Harbor was Dec 4 1944. Even had the USN been able to read the codes, it would have been of no use. Admiral Yamamoto had ordered that no operational information was to be transferred by radio, and the fleet maintained radio silence until the attack was under way.
@Keith. – Don’t you mean Dec 4, 1941?
Actually, wasn’t it the Poles who sent the original code-work (and machines) to the Brits – lest they fall into the hands of the encroaching Nazis?
Oops. That would be the German code stuff. My bad.
So what if Yamamoto ordered radio silence? The commander of the fleet did NOT maintain silence. Following the storm, the type commanders began a long series of radio messages to round up the scattered ships. The message level was enough to wake up anyone, even if it was in code. Except that the specific stations listening only reported to DC (even though one was located on Oahu); nothing was to be sent to Pearl. Then we had the interesting fact that the headquarters of the Red Cross (DC) sent a vast stockpile of disaster supplies to Pearl but did not reveal the secret to the local chapter until after the attack. We even managed to redirect civilian shipping (to the Soviet Union) so that they would not come across the IJN.
Remember that only a year earlier the Commander of the Pacific Fleet resigned in protest over the Roosevelt order to move the Pacific Fleet from San Diego to the poorly prepared and provocative position at Pearl.
As to negligence, remember that a review board (still lacking the real data on the lead-up) exonerated Kimmel and Short. To put it bluntly, they were patsies. Then we have the message that MacArthur sent that the Japanese Fleet was spotted passing the Philippines on the way to Indochina. (Why did Roosevelt give the MoH to Mac? It could hardly be the heroic command to leave their prepared defenses and attack the Japanese in the field that led to the Bataan Death March. Or his heroic hiding in the caves of Corregidor. Or his graft and corruption looting the defense spending of the country.) That message was enough for anyone to let down their guard.
The honest evaluation of the data is that Roosevelt did everything he could to force Japan to attack and then hid the evidence of the pending attack so as to create the national outrage. The Army and Navy came to just such a conclusion during the war. (and the post war revelations of the secret actions put the icing on the cake)
We have always gone to war as a result of “erroneous data”. Recent examples are the Tonkin Gulf and Saddam’s nukes, but the practice goes back to before the Mexican War. It is just a matter of convincing the masses that we are innocent parties and have been attacked through no fault of our own.
Sorry, but you are all wrong.
The code was broken by Michael Mann. He cracked it with his Hockey stick and was later awarded a Nobel Prize for his efforts.
And I can develop a model which infallibly generates the winning lotto numbers for the past 10 weeks. It’s just getting it to generate next week’s numbers that’s proving a challenge.
Why? All you have to do is wait another week.
Also over the last 20 years. That’s a great “sometimes referred” name when they’re sitting around spitballing a better name than “the stopping” or “the halting”. I suggest “Warming Sensation Vacation”.
Note the strange careful wording, “…heat trapped by additional greenhouse gases…”
ftp://aftp.cmdl.noaa.gov/products/trends/co2/co2_annmean_mlo.txt
Over 15 years, from 1998 to 2013, the increase was 30 ppm. Given the evidence that the logarithmic greenhouse effect of CO2 is pretty much saturated, I don’t see how the extra heat trapped by the additional 30 ppm could have been accurately measured, being such a tiny amount, let alone tracked and verified as going into the deeper ocean layers.
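For rough context on the size of that increment: the commonly used simplified expression for CO2 radiative forcing, dF = 5.35 × ln(C/C0) W/m², the Myhre et al. approximation, which already builds in the logarithmic dependence the commenter mentions, gives well under half a watt per square meter for a rise of about 30 ppm. The concentration values below are approximate Mauna Loa annual means, and the function name is invented for this sketch:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Simplified CO2 radiative forcing, dF = 5.35 * ln(C/C0) in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Approximate Mauna Loa annual means (ppm): a ~30 ppm rise over 15 years
c_1998, c_2013 = 367.0, 397.0
delta_f = co2_forcing(c_2013, c_1998)
print(f"{delta_f:.2f} W/m^2")  # about 0.42 W/m^2
```

That works out to roughly 0.42 W/m², a small increment that would be hard to track directly, which is the commenter’s measurement-difficulty point.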
Wait, let me guess. It was shown as happening by the models therefore it was shown as happening in reality, right?
Well yes, the LWIR back radiation (losing none of that energy to evaporation) somehow bypassed the first 700 meters of ocean and, at less than 1/2 of the models’ predicted rate, added (poorly measured, with large error bars) heat to the deep oceans, where it will hide for some short time, keeping separate from the rest of the deep oceans, and soon it will, or maybe could, come screaming out of the oceans and cause catastrophic disaster worldwide.
(Climate science 2014 in a nutshell)
Anthony,
congratulations to you and the WUWT team on the milestone it looks will be achieved today!
DirkH, I don’t know what your knowledge of climate science is like, but you are flat wrong regarding the run-up to Pearl Harbor.
The commanders at Pearl Harbor had been sent a flash message on November 27th, 1941, that began with the words:
“*THIS DESPATCH IS TO BE CONSIDERED A WAR WARNING*. NEGOTIATIONS WITH
JAPAN LOOKING TOWARD STABILIZATION OF CONDITIONS IN THE PACIFIC *HAVE
CEASED* AND AN AGGRESSIVE MOVE BY JAPAN IS EXPECTED WITHIN THE NEXT FEW
DAYS. ”
In the face of this stark warning, Kimmel and Short had their own meeting at Pearl and decided that, despite being specifically warned that war was coming AND the Japanese Fleet was at sea AND they knew the IJN had the capability to mount an attack, there was no real risk, and so they took no action at all. The base remained on a peacetime footing, the emergency room was only manned 9 to 5, and no air patrols were flown. The army anti-aircraft guns which were supposed to protect the fleet failed to fire a single shot, as no ammunition had been issued to them.
Yamamoto was of course an Admiral; the IJN was not paralyzed by his death, and the I-400 program had been initiated well before his death by Yamamoto himself in January 1942.
Initiated yes.
The warning you cite is not very specific.
The commanders of Pearl Harbor were negligent. Leaders are supposed to think in terms of how apparently vague threats can be prudently prepared for. “War Warning” would be prudently and reasonably interpreted as “prepare for war”. The warning from reality that the models have failed is a prudent person’s warning to reconsider the claims of those who promoted the models.
A message that an attack is to be expected is about as explicit as any military leader can hope for. Did you expect a complete breakdown by aircraft and target?
Like climate science, with WWII, we can’t even agree on the past.
What tools do they have now that they didn’t in the ’90s?
Hindsight
Twenty years of climate records?
Better turd polishing apparatus.
http://www.guffsturdpolish.com/images/products/turdpolish.png
http://www.guffsturdpolish.com/
They better first get the data right on which the models are based.
HadCRUT4, a joint production of the UK Met Office Hadley Centre and the Climatic Research Unit of the University of East Anglia, is the world’s “official” global surface temperature time series.
…
[HadCRUT4 is a combination of] CRUTEM4 and HadSST3, [both of which] have been “corrected”.
It’s not possible to say exactly what corrections have been applied to CRUTEM4 because no one publishes a land surface air temperature time series that uses only raw records.
http://euanmearns.com/hadcrut4-strikes-out/
If we’d known what the right answer was, we could have arranged for our models to produce it. LOL
Basically, yes. They have so many free parameters they can tweak the models to fit just about anything. But if they tune even the current models to only fit 1960–1990, they will still produce spurious warming.
The bad news is NCAR has determined empirically that extending the trend line based on 20 years of data does not produce accurate results. The good news is extending the trend line from only 5 years of data works much better. The conclusion is NCAR’s reputation should only be dependent on their last two predictions.
NCAR: “The hiatus was not predicted by the average conditions simulated by earlier climate models because they were not configured to predict decade-by-decade variations.”
They were configured to match decade-by-decade variations from 1970 to 1995, but not decade-by-decade variations from 1915 to 1935 or from 1945 to 1965.
That is why they did not predict decade-by-decade variations from 1997 onwards.
Even the current models do not correctly reproduce the early 20th-century warming or variability.
They start from a false assumption that 1960-1990 variability was not “decade-by-decade” variability but long term climate change. This is nothing more than an assumption and is still without physical proof from observational data.
They wilfully ignore the fact that the models do not fit the majority of the climate record, even after it has been “corrected” to better fit the hypothesis.
So they start with an assumption that did not fit the evidence even in 1997, before the “pause”. It was purely dogma.
It is disingenuous to pretend this is just a question of decade-by-decade variations when the models were tuned to fit a restricted period of the data on exactly that scale.
What makes you mistrust supposedly independent scientific organisations like this is their continual attempt to put some ‘spin’ on each statement. For example, the caption on the graph says “After rising rapidly in the 1980s and 1990s, global surface air temperature has plateaued at high levels…”
High levels? What level should the Earth’s temperature be? This after all is a cool period in the Earth’s history. It is technically an ice age (literally).
When we compare current temperatures to pre-industrial levels, remember that period was the Little Ice Age. Does anyone prefer to go back to those conditions? Was the Little Ice Age the optimum temperature? Of course not. Perhaps people in Florida or California may say yes but, personally, I would welcome the world being a couple of degrees warmer.
I’ve often wondered how many Climate Alarmists own ocean front property ;).
Well, at least they’re saying “plateaued” instead of “paused.” That’s progress.
Words fail me, almost as much as their models do…
…Ignore the chap behind the curtain tweaking the dials; move along, move along!
What they are admitting is that these multi-billion dollar models that climate policy, treaties, laws and taxes are based on are worthless.
We deserve better. Fire the climate hypesters.
If I’d known what future temperatures were going to be, I could have jockeyed the climate model inputs so they would show that. Eureka! I’ve just invented “hind-casting”. But wait! That’s what I’ve been doing all along (except for the Medieval Warm Period – just couldn’t get the models to do that, so I made the MWP disappear).
Yogi Berra is still alive. The climate folks should award him, before he takes his last strike, a PhD (not an honorary one) in Climate Global Warming. After all, he discovered “Forecasting is hard, especially about the future.”
At least a Nobel? Maybe he could share it with the hockey stick – or broken baseball bat modeled on the hockey stick?
Proper scientists must be ashamed of these clowns.
It is enough to read how they write to see that the whole thing was “managed” to have a “pause”.
Of course, what they say about the future is nothing but “could”. We would have better luck in a casino.
I knew in advance that this would happen.
This looks like a very interesting paper. If decadal GCM forecasting is getting as good as they claim, you’ll be hearing a lot more about it. The thing is, it does what people wrongly thought GCM’s were doing before – actually forecasting decadal weather (eg pause). GCM’s had previously been generating random weather subject to forcing, and could only be expected to generate long term climate averages. Now it seems they can synchronize with Earth weather, to some extent.
Of course, when such capability has come to exist, it will be first applied to hindcasting recent decades. What would be a better check?
Unfortunately, the paper seems effectively paywalled. However, there is another more discursive 2014 paper here by Meehl. Well worth reading.
“GCM’s had previously been generating random weather subject to forcing, and could only be expected to generate long term climate averages.”
If weather is random, how can you base models on it? How do you quantify ‘weather’ if it is random? Anyway, GCM’s are about temperature, not weather.
“Now it seems they can synchronize with Earth weather, to some extent”
Synchronize with random events? After the fact (hindcast) or before the fact (forecast)?
10 out of 262 is not the sort of track record I would trust. Yes, it is a big improvement from 0 out of 262, but nothing to write home about. Unless, of course, you are paid by the government.
A pair of dice predicts there is an 87.3-out-of-262 chance temps will plateau, an 87.3-out-of-262 chance they will increase, and an 87.3-out-of-262 chance they will decrease. And this forecast was available 20 years ago!
So, on that basis the pair of dice is outperforming the billion dollar climate models, and outperforming them by a very wide margin.
So now you know why your local weather forecast calls for a 1/3 chance of sun, 1/3 chance of rain, and 1/3 chance of mixed. All the money invested, and the most reliable forecast is still “today’s weather will be much like yesterday”.
Yup, that one forecast, “today’s weather will be much like yesterday”, is likely true in your location, at least as correct as your local weather forecast.
So there you have it. I’m predicting the weather for the entire planet. The first global weather forecast. Odds are, I will outperform the computers.
Your method works for Hawaii.
“Almost all of the heat trapped by additional greenhouse gases during this period has been shown to be going into the deeper layers of the world’s oceans.” Shown to be going? Shown? Could somebody, anybody please advise me where there is definitive evidence of this alleged effect, and a plausible hypothesis of method?
So when and where was this climate function predicted?
And how was it ‘shown’?
“How was it shown”
By telling a lie big enough and often enough, it becomes the truth; that’s how. The model re-run showed that if they had known to start telling the “all the heat is now going into the ocean” lie before the hiatus, many years ago, the lie would have been accepted and everything would be hunky-dory today.
If it wasn’t so serious, it would be hilarious. If they spent less time falsifying climate data, to show warming, they might have noticed what was happening in the climate.
In order for the tools to work correctly, you have to have accurate data. Instead, they still rely on Hokey sticks and undocumented and unwarranted adjustments. GIGO was around back then. And it still works the same way today.
It’s silly & disingenuous to expect future predictions to be 100% perfect. Climate scientists understand temperature prediction: they always predict warmer weather in summer than in winter, and they are always right. They also understand the climate well enough to know that rainy days will be wet and non-rainy days will be drier. They know there’s ice at the poles (just not exactly how much and where). And every year they accurately predict that there will be tornadoes in tornado alley during the tornado season, proving global warming is getting worse due to CO2. So they’re getting a lot of very important stuff right.
Such knowledge isn’t gained overnight; climate scientists are amongst the most intelligent people on the planet, requiring vast intellect way beyond a scientist who deals with one basic science, like a physicist or chemist does. Often their intellect is doubted because they’re too engaged in the thought process to be able to understand trivialities like how to calculate a trend in Excel or which way up to use the data.
The problem is, future predictions can be taken out of context, and that can get in the way of “the science”. If any future predictions are given that turn out not to be right, it’s usually because the scientist had to withhold some information because of confidentiality agreements.
If only climate scientists could travel to the future – what a future it would be.
To be accurate, not all climate scientists were able to correctly predict that rain is wet.
So the translation is: we were wrong before because we did not know what we were talking about... but trust us, because we do now.
Utter bullshit.