Joe Bastardi writes on the Patriot Post:
There is a huge event being forecast this year by the CFSv2, and I don’t know if anyone else is mentioning this. For the first time in over a decade, the Arctic sea ice anomaly in the summer is forecast to be near or above normal for a time! While it has approached the normals at the end of the winter season a couple of times because of new ice growth, this signals something completely different – that multiyear ice growth means business – and it shows the theory on the Atlantic Multidecadal Oscillation (AMO) is likely to be on target.
Once it flips, this red herring of climate panic will be gone. The global and Southern Hemisphere sea ice anomalies have already become unmentionable, since the former is well above normal and the latter is routinely busting daily records.

Since the AMO flipped, the biggest negative anomalies have come in the summer, and the only peaks have come very close to the height of winter once the melting was underway.
Now look at what the CFSv2 forecast for 2012.

The brief positive anomaly hit early, but for the summer it’s well below normal. In 2013, it’s the same, though not as far below.

But this year it’s forecast to be around normal in August!

And this is with the AMO backing off for only a year. I don’t think this is the real deal of the flip yet. But it makes the point that one can correlate the ice in the Arctic with the Atlantic cycle.
…
It should be obvious who is the boss here, and with the warm AMO in its waning years, the Arctic sea ice hysteria will wind up where so many agenda-driven items do – on the ash heap of history.
This, if correct, is going to be a huge story. It would be the first summer in over a decade in which Arctic sea ice returned to near normal, indicative of the increase in multiyear ice and of what a turn to the colder AMO will mean in the future! Let’s see if anyone else picks up on it.
Read his full story here: http://patriotpost.us/opinion/25340
==========================================================
More on CFSv2 here: http://origin.cpc.ncep.noaa.gov/products/people/wwang/cfsv2fcst/
Janne says:
May 6, 2014 at 11:00 am
Like it or not, the passage is becoming a viable option for shipping.
——————-
Nothing new.
The Russians offered the route up for shipping in 1967.
Steven Mosher says:
May 6, 2014 at 8:34 am
In general the GCM is better than any other tool in predicting long range changes.
======
LOL……oh well
Janne says:
May 6, 2014 at 11:00 am
“The SS Manhattan did it in 1969.”
——————————
“The most important satellite we use is equipped with an active radar instrument that sends out a signal obliquely to the Earth’s surface and measures the signal reflected back to the satellite. If the signal meets a calm sea surface, little is reflected, but when the satellite passes over ice, the surface is normally more uneven, and the reflected signal is stronger. In this way we obtain valuable knowledge of where and when it is safe for ships in the area to travel,” says Dinessen.
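For intuition, here is a minimal sketch of the classification idea Dinessen describes: threshold the strength of the reflected signal to separate ice from calm open water. Everything below (the dB values, the cutoff) is made up for illustration and is not the operational algorithm.

import numpy as np

# Hypothetical backscatter values (dB) along a satellite track.
# A calm sea surface scatters the oblique radar pulse away from the
# sensor (weak return); rougher sea ice scatters more of it back.
backscatter_db = np.array([-22.0, -21.5, -12.3, -9.8, -20.7, -8.1])

ICE_THRESHOLD_DB = -15.0  # illustrative cutoff, not an operational value

for sigma0 in backscatter_db:
    label = "ice" if sigma0 > ICE_THRESHOLD_DB else "open water"
    print(f"sigma0 = {sigma0:6.1f} dB -> {label}")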
Professing ignorance of many of the Arctic ice issues, I compared the periodicity of Earth’s magnetic field to numerous temperature records and indices.
Only the Arctic temperature came up with its two major components nearly equal, in both period and intensity, to the EMF’s major two.
http://www.vukcevic.talktalk.net/ATvsGMF.htm
Two ‘contributing’ factors: the magnetic pole, of course, and the crack in the Earth’s crust, peppered with submarine vents and volcanoes, all the way from Iceland to the far side of the Beaufort Sea.
Coincidence is likely, but I need to know of another climate-data periodogram matching the above before I write it off. Anyone?
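For anyone who wants to try the comparison on another record, here is a minimal periodogram sketch in Python. The series below are synthetic stand-ins, not vukcevic’s data or method, and the two shared periods (~64 and ~21 years) are assumptions chosen purely for illustration.

import numpy as np

def dominant_periods(series, dt=1.0, n_peaks=2):
    """Return the n_peaks strongest (period, power) pairs of a detrended
    series, using a plain FFT periodogram."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    order = np.argsort(power[1:])[::-1] + 1  # skip the zero-frequency bin
    return [(1.0 / freqs[i], power[i]) for i in order[:n_peaks]]

# Synthetic stand-ins sharing ~64- and ~21-year components
# (length 1344 = 64 * 21, so both periods fall on exact FFT bins).
t = np.arange(1344)
arctic_temp = np.sin(2 * np.pi * t / 64) + 0.5 * np.sin(2 * np.pi * t / 21)
geomag = 0.8 * np.sin(2 * np.pi * t / 64) + 0.4 * np.sin(2 * np.pi * t / 21)

print(dominant_periods(arctic_temp))  # expect ~64 yr and ~21 yr on top
print(dominant_periods(geomag))       # same two periods, same ranking

If comparable peaks show up in the same places in two records, that is the kind of match being described; whether it is physics or coincidence is another matter.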
You are spot on with your assessment of ECIMs/GCMs. Unfortunately, those who believe in their ability to predict future climate really don’t want to talk about the differential equations, numerical methods or initial/boundary conditions which comprise these codes. That’s where the real problems are…
Well, let’s be careful how you state this. Those who believe in their ability to predict future climate but who aren’t in the business don’t want to talk about any of this, and those in the business who aren’t expert in predictive modeling and statistics in general would in many cases prefer not to have a detailed discussion of the difficulty of properly validating a predictive model — a process which basically never ends as new data come in.
However, most of the GCMs and ECIMs are well, and reasonably publicly, documented. It’s just that unless you have a Ph.D. in (say) physics, plus a knowledge of general mathematics, statistics, computer science and numerical computing that would suffice to earn you at least a master’s degree in each of those subjects if acquired in the context of an academic program, plus substantial subspecialization knowledge in the general fields of computational fluid dynamics and climate science, you don’t know enough to comment intelligently on the code itself. You can only comment on it as a black box, or comment on one tiny fragment of the code, or the physics, or the initialization, or the methods, or the ODE solvers, or the dynamical engines, or the averaging, or the spatiotemporal resolution, or…
Look, I actually have a Ph.D. in theoretical physics. I’ve completed something like six graduate-level math classes (mostly as an undergraduate, but a couple as a physics grad student). I’ve taught (and written a textbook on) graduate-level electrodynamics, which is basically a thinly disguised course in elliptic and hyperbolic PDEs. I’ve written a book on large-scale cluster computing that people still use when setting up compute clusters, I have several gigabytes worth of code in my personal subversion tree, and I cannot keep count of how many languages I either know well or have written at least one program in, dating back to code written on paper tape. I’ve co-founded two companies on advanced predictive modelling, on the basis of code I’ve written and a process for doing indirect Bayesian inference across privacy or other data boundaries that was for a long time patent pending, before trying to defend a method patent grew too expensive and cumbersome to continue; the second company is still extant and making substantial progress towards perhaps one day making me rich. I did advanced importance-sampling Monte Carlo simulation as my primary research for around 15 years before quitting that as well. I’ve learned a fair bit of climate science. Of the list above, I lack detailed knowledge and experience only of computational fluid dynamics (and I understand the concepts there pretty well, but that isn’t the same thing as direct experience), and I still have a hard time working through e.g. the CAM 3.1 documentation, and an even harder time working through the open-source code, partly because the code is terribly organized and poorly internally documented, to the point where just getting it to build correctly requires dedication and a week or two of effort.
Oh, and did I mention that I’m also an experienced systems/network programmer and administrator? So I actually understand the underlying tools REQUIRED for it to build pretty well…
If I have a hard time getting to where I can — for example — simply build an openly published code base and run it on a personal multicore system to watch the whole thing actually run through to a conclusion, let alone start to reorganize the code, replace underlying components such as its absurd lat/long gridding of the surface of a sphere with rescalable symmetric tessellations to make the code adaptive, isolate the various contributing physics subsystems so that they can be easily modified or replaced without affecting other parts of the computation, and so on, you can bet that there aren’t but a handful of people worldwide who are going to be able to do this and willing to do it without a paycheck and substantial support. How does one get the paycheck, the support, the access to supercomputing-scale resources to enable the process? By writing grants (and having enough time to do the work, in an environment capable of providing the required support in exchange for indirect-cost money at fixed rates, with the implicit support of the department you work for) and getting grant money to do so.
And who controls who, of the tiny handful of people broadly enough competent in the list above to have a good chance of being able to manage the whole project on the basis of their own directly implemented knowledge and skills AND who has the time and indirect support etc, gets funded? Who reviews the grants?
Why, the very people you would be competing with, who all have a number of vested interests in there being an emergency, because without an emergency the US government might fund two or even three distinct efforts to write a functioning climate model, but they’d never fund forty or fifty such efforts. It is in nobody’s best interests in this group to admit outsiders — all of those groups have grad students they need to place, jobs they need to have materialize for the ones that won’t continue in research, and themselves depend on not antagonizing their friends and colleagues. As AR5 directly remarks — of the 36 or so named components of CMIP5, there aren’t anything LIKE 36 independent models — the models, data, methods, code are all variants of a mere handful of “memetic” code lines, split off on precisely the basis of grad student X starting his or her own version of the code they used in school as part of newly funded program at a new school or institution.
IMO, solving the problem the GCMs are trying to solve is a grand challenge problem in computer science. It isn’t at all surprising that the solutions so far don’t work very well. It would rather be surprising if they did. We don’t even have the data needed to intelligently initialize the models we have got, and those models almost certainly have a completely inadequate spatiotemporal resolution on an insanely stupid, non-rescalable gridding of a sphere. So the programs literally cannot be made to run at a finer resolution without basically rewriting the whole thing, and any such rewrite would only make the problem at the poles worse — quadrature on a spherical surface using a rectilinear lat/long grid is long known to be enormously difficult and to give rise to artifacts and nearly uncontrollable error estimates.
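To make the pole problem concrete, here is a minimal sketch (my own illustration, not code from any GCM) of how uniform lat/long cells collapse in area toward the poles of a unit sphere:

import numpy as np

# Cell areas on a uniform 1-degree lat/long grid of the unit sphere.
# A cell's area in a latitude band is dlon * (sin(top) - sin(bottom)),
# so cells shrink like cos(latitude) and degenerate to slivers at the poles.
dlon = np.deg2rad(1.0)
lat_edges = np.deg2rad(np.arange(-90.0, 91.0, 1.0))
cell_area = dlon * np.diff(np.sin(lat_edges))  # one cell per latitude band

print(f"equatorial cell: {cell_area[90]:.3e}")
print(f"polar cell:      {cell_area[0]:.3e}")
print(f"ratio:           {cell_area[90] / cell_area[0]:.0f}x")

An equatorial cell comes out roughly 115 times the area of a polar one, which is exactly why explicit schemes on such grids need either punishing timestep restrictions or ad hoc polar filters.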
But until the people doing “statistics” on the output of the GCMs come to their senses and stop treating each GCM as if it is an independent and identically distributed sample drawn from a distribution of perfectly written GCM codes plus unknown but unbiased internal errors — which is precisely what AR5 does, as is explicitly acknowledged in section 9.2 in precisely two paragraphs hidden neatly in the middle that more or less add up to “all of the `confidence’ given the estimates listed at the beginning of chapter 9 is basically human opinion bullshit, not something that can be backed up by any sort of axiomatically correct statistical analysis” — the public will be safely protected from any “dangerous” knowledge of the ongoing failure of the GCMs to actually predict or hindcast anything at all particularly accurately outside of the reference interval.
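To see what that iid assumption buys, here is a minimal sketch with made-up numbers: average N models, quote the standard error that shrinks like 1/sqrt(N), and watch a bias shared across the model lineages survive untouched.

import numpy as np

rng = np.random.default_rng(0)

truth = 0.0        # the quantity the models are all trying to predict
shared_bias = 0.6  # error common to every model (shared code ancestry)
n_models = 36

# Each "independent" model = truth + shared bias + its own noise.
models = truth + shared_bias + rng.normal(0.0, 0.3, n_models)

mean = models.mean()
stderr = models.std(ddof=1) / np.sqrt(n_models)  # the iid standard error

print(f"ensemble mean : {mean:+.2f}   (truth is {truth:+.2f})")
print(f"iid std error : {stderr:.2f}  <- tiny, yet the mean is way off,")
print("because the shared bias never averages away no matter how many")
print("sibling models are added to the ensemble.")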
rgb
And for those who want to see a typical government-funded GCM, here you go…
http://www.giss.nasa.gov/tools/modelE/
I urge everyone who has experience with scientific computing to download the source code and check it out. Let me know if anyone can determine what equations it’s solving – they aren’t listed anywhere…
Sure, but then there is:
http://www.cesm.ucar.edu/models/atm-cam/docs/description/
NASA is actually required by law to distribute their code and data, but they aren’t AFAIK required to document it. They should, though. And in fact they do:
http://www.giss.nasa.gov/tools/modelE/modelE.html
Beyond that I’m sure that one does have to go down to the level of individual routines and look (although there are a number of papers cited that claim to give an overview). That actually isn’t surprising — as I pointed out in my last reply, this is a grand challenge project, and no single human is likely to be ABLE to write, and integrate, all of the code. FWIW, the code looks like it is a bit better organized than CAM 3, although CAM 3 is definitely a lot better documented. Too bad we can’t have both (at least, not among these two specific examples).
I haven’t tried to build modelE, though — perhaps I’ll download it and give it a try. I’d very much like to have a functioning GCM of my very own without having to solve multiple problems just to get it running on my own systems. I simply don’t have the time to screw around with the latter stuff.
rgb
Jimbo says:
May 6, 2014 at 2:46 am
“Here is the acclaimed Arctic climate scientist Professor Peter Wadhams of Cambridge University…..
…Guardian – 17 September 2012
Arctic expert predicts final collapse of sea ice within four years
“This collapse, I predicted would occur in 2015-16 at which time the summer Arctic (August to September) would become ice-free. The final collapse towards that state is now happening and will probably be complete by those dates”.
Since when does ice ‘collapse’? My simple schooling taught me that ice can melt. I thought that the Arctic Ice sheet can break apart and move about, and it does that continuously; during the refreeze as well as the melt season.
Re the graph in my comment above, I forgot to say: disappointingly, no AMO and no SSN match; as for the noise, not a clone, but a sort of look-alike.
Kenny (May 6, 4:55am) – “Y’all know that all eyes are going to be on the ice at the North Pole this summer. This has always been their ‘go to’ data.” I’m afraid you have misunderstood the process, which is to highlight whichever feature suits best at the time. Arctic ice was good for a while, but now … who knows what’s next.
Robert Brown (rgbatduke) – “absurd lat/long gridding on the surface of a sphere “. Isn’t this the core issue – that this method can’t work? In other words, it’s hopelessly inaccurate weather modelling masquerading as climate modelling and incapable of predicting anything reliably over any significant period of time. To my mind, a climate model would have to operate very differently, and AFAIK no such model has yet been developed, partly because almost none of the major climate factors are understood well enough.
RGB says:
“I personally am curious as to whether or not even these SHORT term predictions for things like ENSO, the AMO, and the arctic sea ice cycle are borne out. I do think that the decadal oscillations are a major factor in climate evolution, but since the decadal oscillations themselves are chaotic and at best quasi-periodic and of highly variable strength and duration, one can see their large effect on climate after the fact easily enough, but predicting that large effect beforehand requires being able to predict them and we can’t.”
I daresay that continuing weak solar activity will make for a generally more negative AO and NAO, and a continuation of the warm AMO phase, as in the weaker solar cycles of the 1880s/90s.
Thank yewww! shout-outs to Mike Jonas , RACookPE1978 and everyone else who set me straight on my “anomaly” question.
Anthony
I think RGB’s explanation above of the failure of computer models should be put up as a separate post, as it is so compelling, along with Mosher’s oft-quoted line that a model which is wrong is still more useful than no model at all.
If I happened to be a squawking Chicken Little I would say: you guys are making a big deal about “normal” amounts of Arctic ice? Like normal is a “significant story.” Like it signifies plummeting temperatures and the onset of a new ice age? Seriously, normal? We’ve had very low levels of Arctic ice, and once this little bit of normal blows over we will go back to being ice free again, like we predicted for 2013. And if not, so what, we look elsewhere, like to Antarctica, or well then to U.S. temperatures, or to skyrocketing global temperatures over the last 20 years, or er to our climate models, or er um, uhm, to whatever it is, the media will report our story, and the little people will believe. Like you guys think that the people are going to be all freaked out about “normal” levels of ice. Not! That won’t scare anyone! (/s)
rgbatduke says:
May 6, 2014 at 1:24 pm
Have you read the “documentation” for Model E? I have – it’s nowhere near adequate, and again does NOT describe the underlying physics, the specific equations, or the basic numerical algorithms. I defy ANYONE to find the actual equations they are solving written down anywhere.
BTW, my background is a Ph.D. in Mechanical Engineering, with a specialty in Computational Fluid Dynamics (CFD), and over 20 years of professional experience with CFD and numerical analysis.
“NASA is actually required by law to distribute their code and data, but they aren’t AFAIK required to document it.”
This is entirely unacceptable, especially given the purposes to which this “code” is being applied. It really speaks to the lack of any sense of responsibility these people at GISS have for their work. Yet NASA lets them get away with it.
The NCAR documentation is of course much, much better, and I have lauded their efforts in the past.
“I haven’t tried to build modelE, though — perhaps I’ll download it and give it a try.”
Let us know if you’re successful. I’d like to try it myself too, although it’ll probably be troublesome to get the makefiles to work correctly. Then I can go into the code and start seeing how sensitive the numerics are to instability – I suspect it’ll be easy to get it to diverge and give nonsense results.
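For a flavor of the divergence in question, here is a minimal toy sketch (a 1-D upwind advection scheme in Python, nothing to do with modelE’s actual numerics) of an explicit method going unstable the moment the timestep violates the CFL condition:

import numpy as np

def advect(courant, n_steps=200, n_cells=100):
    """Upwind advection of a narrow bump on a periodic 1-D grid.
    The explicit scheme is stable only for courant <= 1 (CFL condition)."""
    x = np.arange(n_cells)
    u = np.exp(-0.5 * ((x - 50) / 2.0) ** 2)  # narrow initial bump, max = 1
    for _ in range(n_steps):
        u = u - courant * (u - np.roll(u, 1))  # explicit upwind update
    return np.max(np.abs(u))

print(f"Courant = 0.9 -> max|u| = {advect(0.9):.2e}  (bounded, just diffuses)")
print(f"Courant = 1.1 -> max|u| = {advect(1.1):.2e}  (exponential blow-up)")

The unstable run explodes by orders of magnitude within a couple hundred steps; a full GCM has thousands of coupled fields in which this kind of thing can go wrong.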
Stephen Skinner says:
May 6, 2014 at 1:50 pm
“Since when does ice ‘collapse’? My simple schooling taught me that ice can melt.”
Here’s a nice ice collapse for you.
http://www.livescience.com/31989-major-glacier-calving-captured-in-time-lapse-video-video.html
SAMURAI
Water temps count for much more in the Arctic than air temps. Recent record sea ice summer lows have been in spite of low air temps. And this summer’s big recovery will be in spite of the warmer winter air due to the polar vortex. It is, as Joe says, the AMO and water temperatures that boss the Arctic ice.
phlogiston says:
May 7, 2014 at 12:32 pm
SAMURAI
Water temps count for much more in the Arctic than air temps. Recent record sea ice summer lows have been in spite of low air temps. And this summer’s big recovery will be in spite of the warmer winter air due to the polar vortex. It is, as Joe says, the AMO and water temperatures that boss the Arctic ice.
We shall see.
I don’t understand. Can someone simplify this for me, please? This is our current ice level in the Arctic, matching that of this time in 2011.
http://www.ijis.iarc.uaf.edu/seaice/extent/Sea_Ice_Extent_v2_L.png
So is Joe saying that come August, the ice is predicted to be above the blue line, or one or more of the dotted lines?
Or is he suggesting that the ice will be much thicker, despite volume?
*Despite area.
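For what it’s worth, “anomaly” on these charts just means the observed (or forecast) extent minus the long-term average for that calendar date. A minimal sketch with made-up numbers (the real charts use their own multi-year baselines):

# Sea ice extent "anomaly" = extent minus the climatological average for
# that calendar day. The numbers below are invented for illustration.
climatology_aug = 7.0  # long-term mean August extent, million km^2
forecast_aug = 7.2     # a hypothetical CFSv2-style August forecast

anomaly = forecast_aug - climatology_aug
print(f"anomaly: {anomaly:+.1f} million km^2 "
      f"({'above' if anomaly > 0 else 'below'} normal)")

So “near or above normal in August” means the extent line sitting at or above the baseline average, i.e. an anomaly near or above zero; it says nothing directly about thickness or volume.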