Guest Post by Willis Eschenbach
Inspired by my previous posts, Boy Child Girl Child and Sea Levels in the Nino-Nina Cycle, I decided to take a look at what is happening below the sea surface along the Equator. A commenter pointed me to an Australian archive of past sub-surface analyses, with an unfortunate URL that includes “oceantemp/pastanal/” … but I digress.
These analyses show longitude on the horizontal axis and depth on the vertical axis. Here’s their view of the peak of the El Nino in November 1997. The black area on the right is South America, and the black area on the left is Africa. The shallow zone between 90°E and 120°E is Indonesia and Borneo.

Figure 1. Equatorial Pacific Ocean vertical section temperature analysis for the peak El Nino, November 1997. Date is shown in the center. Top panel shows the “climatology”, a term of art meaning the long-term average for the climate variable for the month. In this case the variable is the temperature distribution. Middle panel shows the actual temperatures for the month. And the final panel shows the “anomaly”, which is the actual temperature less the climatology. This is how much hotter or cooler than average the ocean is in that location that month.
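In code terms, the bottom panel is just an element-wise subtraction of the top panel from the middle one. A toy sketch in R, with made-up numbers standing in for a tiny longitude-by-depth section:

```r
# toy example: anomaly = observed temperature minus climatology (values invented)
climatology <- matrix(c(26, 24, 20, 18), nrow = 2)  # long-term average for the month
observed    <- matrix(c(28, 25, 14, 19), nrow = 2)  # this month's actual analysis
anomaly     <- observed - climatology               # positive = warmer than average
anomaly
```
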
In Figure 1 you can see how the anomalous heat in the eastern Pacific just off of South America extends from the surface down deep, a couple hundred meters. The “H” note shows it peaked at over eight degrees warmer than average. There’s also a corresponding subsurface cool spot to the west of that, which the “L” note says is some six degrees cooler than average. Big swings.
Compare that with the situation one year later, November 1998 at the peak of the corresponding La Nina.

Figure 2. Equatorial Pacific Ocean vertical section temperature analysis for the peak La Nina, November 1998.
Here we can see that the warm water has been moved out and has been replaced by the colder subsurface waters, which have now come to the surface.
And to give myself a larger understanding of the undersea world, here’s a movie I made of the month-by-month situation from 1995 to 2008, when they changed the format. See the Endnote for details on how I made the movie.
Figure 3. Monthly changes in the vertical section temperature analysis.
Always more to learn … please quote whatever you are commenting on; it avoids misunderstandings.
Half-moon tonight, crisp and cold. The deer came by again today. Hummingbirds were drinking from the flowers. Life in the forest doesn’t care about politics, which seems like a brilliant plan to me these days. The silence of the woods is my balm and my refuge.
My best regards to all,
w.
Endnote: To make the movie, first I had to download the .GIF files. I do most of my work in the computer program R. If you program, you should learn R. It’s free, cross-platform, and interpreted, with a killer free user interface called RStudio. But I digress. I first needed to create the year/month number combinations used by the Aussies in the URLs of the graphics. Here’s that code. Everything on a line after a hashmark “#” is a comment.
allyears=paste0(rep(1995:2020,each=12),twodigit(rep(1:12,26))) # repeats each year from 1995 to 2020 twelve times, and pastes the two-digit month onto each one
head(allyears) # shows the first few data points of the variable
[1] "199501" "199502" "199503" "199504" "199505" "199506"
(“twodigit” is a function I wrote because I couldn’t remember the actual code to format a number with two digits.)
twodigit=function(x) {
  if(is.numeric(x)){
    sprintf("%02d",x) # formats numbers
  } else {
    format(x,digits = 2) # formats text
  }
}
Then I created a folder (directory) to store the .GIF images, and put the path to the folder into a variable.
dir.create("Aussie Underwater All")
pathname="Aussie Underwater All/"
Then I wrote a loop to read each .GIF and write it to the folder. Comments after the “#” in each line.
library(magick) # provides image_read() and image_write()
for (i in 1:length(allyears)){ # cycle through all the year/month codes
  print(allyears[i]) # prints progress
  aurl=paste0("http://www.bom.gov.au/archive/oceanography/ocean_anals/IDYOC002/IDYOC002.",allyears[i],".gif") # paste together the URL
  mygif=image_read(aurl) # read the gif image file at that URL
  image_write(mygif,paste0(pathname,basename(aurl))) # write the image file to the folder created above
}
Once I had the images, the rest I did online. Upload the images to the Animated GIF Maker, and make an animated GIF. Upload the animated GIF to CloudConvert, and convert it to MP4. Whole thing took me a couple hours.
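For what it’s worth, the same magick package used for the downloads can also do the animation step locally, skipping the online tools. A sketch, assuming the frames have already been saved into the folder created above (the output filename here is my own invention):

```r
# rebuild the local frame filenames from the same year/month codes used in the download loop
allyears <- paste0(rep(1995:2008, each = 12), sprintf("%02d", rep(1:12, 14)))
frames   <- paste0("Aussie Underwater All/IDYOC002.", allyears, ".gif")

# assumes the magick package is installed; stitch the frames into one animated GIF
if (requireNamespace("magick", quietly = TRUE) && all(file.exists(frames))) {
  movie <- magick::image_animate(magick::image_join(magick::image_read(frames)), fps = 4)
  magick::image_write(movie, "equatorial_sections.gif")
}
```

CloudConvert (or R’s av package) can still be used afterwards if an MP4 is wanted rather than a GIF.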
w.
Always better to see images than numbers. That’s a lot of energy moving in and out (or maybe just moving through).
Thanks Willis.
Brilliant series of data driven think-throughs, Willis!
You have greatly expanded our surface-driven view of ENSO events, and both raise and answer quantitative comparison issues that concern those of us who believe that nature still dominates climate change.
May I be among the first to suggest you have all the visuals to walk through your findings at a Heartland Institute ICCC event?
I’m hoping the next one scheduled for Las Vegas in Spring is delayed several months. Your fans and science geeks alike hope to see you there!
Amazing, there is a lot more to ENSO than what you (literally) can see at the surface!
It seems to reveal why the ENSO meter tends to swing more shallowly toward the La Nina side than toward the El Nino side … the cold anomaly water tends to stay more submerged during La Nina than the hot anomaly water does during El Nino, so surface temperature sampling doesn’t fully capture the La Nina effect. Looking at the whole depth profile, it appears to be a more balanced effect.
Also interesting to see that there is a lot of submerged anomalously colder water even during the El Nino pumping cycle. I suppose this is due to the colder water rising up to shallower depths as the wind drives the warm surface water off to the east Pacific.
Orson, I keep hoping that the Heartland folks will invite me again. Here’s the next one coming up, April 16-17 in Las Vegas. I guess my invitation must have gotten lost in the mail …
w.
Amazing. Thanks Willis. I have a friend who was having a problem understanding “all this slopping east to west stuff”, and here’s the movie 🙂
Glad to be of assistance.
w.
Yes, and especially thanks for the code. I’m glad you’re getting back to providing that like you did a few years back.
What strikes me when looking at these zero lat sections is that when the east pacific is cool this presumably leads to clearer skies, less thunder storms “emerging” and a period of accumulation of thermal energy in the ocean.
Following ’97 we see the warm water spread across right to S. America and then disperse. It does not get blown back to the west AFAICS, it just disperses.
Since the ocean gyres there tend towards the equator, not polewards as in the west, it seems that the energy moves into the atmosphere. That is also a poleward route, via the Hadley circulation to start with.
You were doing some back of napkin figures last time. How does this atmospheric route compare with heat moved by ocean currents in the west Pacific?
Also it is interesting to look at +/- 10 lat. and how heat moves N/S. The isometric view plots you showed last time show that it is quite marked.
I often think that taking a zero-degree section at the equator is likely to lead to misinterpretations of where some of the “anomalous” heat is coming from. When the warm “anomaly” starts building in the mid-Pacific in ’97, then spreads east, there does not seem to be a connection to the surface. This suggests that it is not just warm water ‘pushing down’ which produces the anomaly. That seems to leave only N/S movement at depths around 100m.
My tentative interpretation is that this is a slow gravitational tidal wave acting on the density difference around the top of the thermocline. Such movements would be on a timescale of 1-2 years.
https://wattsupwiththat.com/2020/11/23/sea-levels-in-the-nino-nina-cycle/#comment-3132434
I am not a masochist, so I don’t use R.
What do you use? That looks like it barely qualifies for the First Circle of Programmer Hell. (There is some disagreement over the other levels, but most people I know agree that Lisp is the Eighth Circle, and APL the Ninth.)
Personally, Prolog was much harder for me to learn than Lisp or APL.
Suanno, not sure why you think R is masochistic.
I wrote my first program in 1963, in an extinct language called Algol, using punchcards. So I’ve been programming for more than a half-century. Since then I’ve learned Fortran, COBOL, Lisp, Logo, Mathematica (3 languages), Pascal, Basic, Datacom, C/C++, 68000 Assembly Language, Visual Basic, Hypertalk, Vectorscript, and most recently R.
I learned R at the urging of Steve McIntyre of climateaudit.org. I’m so happy I did. It is far and away the easiest of all of the languages I’ve used.
For example: suppose you have a block of data called “mydata” that’s 37 rows by 18 columns, and you want to add one to every data point. In pseudocode for all those languages but R, you have to write code to loop through each individual data point and add one to each, viz:
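The loop example appears to have been lost in posting; written out in R itself for concreteness (R will run the explicit version too, it is just unnecessary), it would look something like:

```r
mydata <- matrix(0, nrow = 37, ncol = 18)  # the 37 x 18 block of data

# explicit element-by-element loop, the style most other languages force on you
for (i in 1:nrow(mydata)) {
  for (j in 1:ncol(mydata)) {
    mydata[i, j] <- mydata[i, j] + 1
  }
}
```
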
Complex and easy to miswrite. In R, on the other hand, you just say
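The one-liner also seems to have been lost in posting; it is presumably the vectorized form (compare the Fortran 90 remark in a later comment):

```r
mydata <- matrix(0, nrow = 37, ncol = 18)
mydata <- mydata + 1  # R adds 1 to every element at once, whatever the shape
```
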
Easy money.
Plus it has the best user interface I’ve ever seen, RStudio. Autocompletion of all names, both functions and variables. Control-click on a function name and it takes you to the definition. And because it is interpreted, you can select anything from a single word to lines of code, hit command-enter, and it runs just that bit of code.
Because you can run it bit by bit, I’ve only once or twice ever had to set a breakpoint to debug it. No need.
Plus, R is cross platform and free. And RStudio is free. Plus there are literally hundreds and hundreds of free packages out there for download, with functions to do literally anything a computer can do.
Anyhow, that’s what this old programmer thinks about R.
w.
Thanks Willis for providing the R code.
Many of your articles provide invaluable insights and a rather down to earth analysis built on quality data.
I think you should consider ALWAYS providing the code and the (R) source data with your articles. This is now the praxis in the science community, and it will make it easier for other researchers to follow up on your writings.
kiasom November 25, 2020 at 9:20 pm
Thanks, Kiasom. Most of my code is so idiosyncratic and disorganized that I’d be ashamed to post it up. Oh, it works just fine … but only if you run the section at the end before the other sections. And skip lines 116-121, they crash it every time. And lines 146 and 152 have to be run before the section just above it, to initialize the variables …
And then there’s my dozens and dozens of functions that I’ve written to “simplify” things, which they do very well but only if someone knows how to use them. And they are not all in one place either.
Finally, the datasets are often quite large. Just got done downloading one that is 8.86 GB … not going to be a lot of help to most folks to know where it is. With my gigabit connection it took me about ten minutes to download … so someone with an average ~20 Mbps download speed would take fifty times as long, which is over eight hours.
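The arithmetic there, for anyone checking: a gigabit link is about fifty times faster than a 20 Mbps one, so a ten-minute download scales to roughly eight hours:

```r
gigabit_minutes <- 10          # observed download time for the 8.86 GB file on ~1 Gbps
speed_ratio     <- 1000 / 20   # 1 Gbps vs a typical ~20 Mbps connection
hours_at_20mbps <- gigabit_minutes * speed_ratio / 60
hours_at_20mbps                # a bit over eight hours
```
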
And without the datasets and the functions, my crazy-quilt code won’t work.
As a result, I always provide the data, usually including a link to it on the graphics themselves. Uncited graphics drive me nuts. And I usually don’t post up the code unless someone asks. Not trying to hide it, but when I post it I find out just how many subsidiary documents it depends on … which is usually “too many”.
Having said that, if you want an explanation or code for some particular thing I’ve done, ask and I’ll see just what I can do.
Regards,
w.
Fortran 90 also allows array operations like mydata=mydata+1.
My first was Algol a couple of years later, followed by Fortran. In my case Algol was on paper tape and Fortran on cards. Still have a box of cards from one of my programs as a souvenir.
I use a pirated version of R. It’s called “Rrrrrrrrrrrrr!”
It is amazing to me how little animations can clarify things so quickly. After years of trying to understand how ENSO affects my part of the Big Pond way down bottom left corner in NZ, your animation showed that during the El Nino phase the warmer pool spreading out off the coast of equatorial America had an opposite cool pool hovering down around our way giving cooler seas.
When the Girl Child returned the eastern equatorial Pacific was much cooler than normal. Down at 7, 8 & 9 o’clock the water was above average. Since records began La Nina events have given NZ our hottest summers. El Nino summers are often bummers.
A paper by NZ scientists a few years ago explained that the reason for the rapid quarter-century advance of our two most famous glaciers on the West Coast of the South Island (Franz Josef & Fox) was that NZ experienced a prolonged cool period under the strong influence of El Nino events. This has changed in the last 12 years and the glaciers have retreated back to where they were 40 years ago. It wasn’t just a case of increased La Nina events, because that is not what happened, but in my opinion the prolonged ‘neutral’ periods also had a deleterious effect on feeding the névés at the head of the glaciers as well.
Your animation quickly showed what takes several years to occur and reinforced the feeling I had about what actual impact ENSO has on us down in our neck of the woods.
The media claim that the resulting hot summers are a sign of CAGW. The weak La Nina event in the austral summer of 2017-18 resulted in our hottest recorded summer. That summer beat out the previous record holder of 1934-35 by 0.1C! I asked my dad, who was 2.5 years old back then, if he could tell the difference! He just laughed. Truth is most people can’t tell a difference of 1°C, let alone 0.1°C!
While nature does its eternal ‘to and fro’ sloshing of warm/cool waters across the Pacific, La Presidente Jacinda Ardern announced today she will declare, next week, a *climate emergency* in New Zealand’s parliament. These crooks just can’t get their timing right: MetService is calling for snow this weekend along the South Island’s Southern Alps, and possibly MORE snow next week in December, which will mean it has snowed EVERY SINGLE MONTH during the year 2020. The only ’emergency’ is that their crock prophecies have, as we all know too well, failed completely yet again.
I, for one, would love an extra degree or two this coming summer – boardshorts & jandals sound much better than thermal layers and snow shoes…
Greg
Jacinda is guided by international acceptance, and wants to be seen to do the right thing in the international community. The Greens under James Shaw want us to lead the world in this area. Summer starts officially at the start of December, a perfect time for the snowy Climate Emergency.
A heat wave in New Zealand according to NIWA is when we have five days per month above the monthly average.
And in Australia, according to their BoM, it’s three days above average: thank goodness for ‘settled science’ – or meteorological standards (cough!).
Aware that snowfalls are merely ‘weather’, I still have a chuckle when NIWA announces a marine heatwave yet fails to mention sub-zero snow-bearing fronts roaring up out of the Great Southern Ocean.
As Bing would sing: I’m dreaming of a white Christmas.
“The media claim that the resulting hot summers are a sign of CAGW. The weak La Nina event in the austral summer of 2017-18 resulted in our hottest recorded summer. That summer beat out the previous record holder of 1934-35 by 0.1C!”
So it looks like it was just as warm in the 1930’s as it is today, in your neck of the woods.
That’s what every unmodified regional temperature history shows: that it was just as warm in the recent past as it is today.
Which means that CO2 is a minor player in the Earth’s atmosphere: there is a lot more CO2 in the atmosphere today than there was in the 1930s, yet it is no warmer today than then, so CO2 has had little or no warming effect since the 1930s.
There is no need to spend Trillions of dollars on CO2 mitigation. It’s not causing any problems.
Willis,
How does the hot anomaly blob, moving east underwater, gain heat on this section before surfacing off Chile?
Seems like lateral, N-S, movement might need to be visualised in 3D.
I still do not know why the water warms unevenly over time, but I favour your cloud-emergence mechanism modulating a somewhat steady solar input. But this movie of yours makes it look more of a puzzle. It is nice work, no criticism there, but the mechanisms are not simple.
Geoff Sherrington November 25, 2020 at 6:44 pm
Geoff, if I understand you, I’d suggest it’s not gaining heat. Remember, it’s an anomaly, a difference (∆T) between observation and climatology.
I’d say what you’re observing is a blob of water at a (relatively) constant temperature moving into an area that is normally cooler than the area it was in before. As a result the ∆T between observed temperature and the climatology increases.
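A tiny worked example of that point, with made-up climatology values:

```r
blob_temp <- 28   # the blob stays at roughly this temperature throughout (invented)
clim_west <- 26   # hypothetical climatology where the blob started (warm pool)
clim_east <- 22   # hypothetical climatology where it ends up (normally cooler)
blob_temp - clim_west  # anomaly at the start: +2
blob_temp - clim_east  # anomaly after moving east: +6, with no heat gained
```

Same water, same temperature; only the baseline it is compared against has changed.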
Best regards, always good to hear from you,
w.
Geoff has spotted the same thing I commented on above. I fully get that we have to be careful how we interpret “anomalies”, but it does mean more heat energy in that part of the ocean.
As you have shown in the past, the surface water seems to get clamped at a max of about 30°C, so as we see the warmer waters pushing down and creating the anomaly, the surface stays at about the same temperature and does not show much of an anomaly.
That means that the increasing volume of 30 degree water in the west which is pushing down to about 100m must be coming from somewhere.
Would it be possible to calculate the “area” of each isotherm colour block as it progresses through time? From watching the vid, I do not get the impression that there is a constant volume of 30°C water in this cross-section which is simply spreading out eastwards, as the comic-book “sloshing back” description would have us think.
If there is sufficient data at depth on either side of the lat=0 section, I think we would see N/S movement of heat. If I were as skilled as Willis with R, I would probably have done this years ago. It’s something which has always bugged me about the equatorial sections.
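One way to sketch that isotherm-“area” count in R, assuming each monthly section has been read in as a longitude-by-depth temperature matrix (the matrices below are tiny made-up stand-ins):

```r
# count grid cells at or above 28 C in each monthly longitude/depth section,
# as a crude proxy for the cross-sectional "area" of the warm isotherm block
sections <- list(                                   # toy stand-ins for monthly sections
  m199701 = matrix(c(29, 27, 25, 23), nrow = 2),
  m199711 = matrix(c(30, 29, 26, 24), nrow = 2)
)
warm_area <- sapply(sections, function(m) sum(m >= 28, na.rm = TRUE))
warm_area  # warm cells per month; a constant volume would give a roughly flat series
```

Multiplying each cell count by the grid spacing in longitude and depth would turn the count into a physical area.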
Here is a snippet of R code to look at ARGO potential temperature data I have in my OHC archive. I have no recollection of what state it is in from 2012, but it would be good to have another look now that we have the 2016 El Nino in the data.
require(ncdf)
require(abind)
require(fields)
require(animation) # needed to produce gif, requires ImageMagick software installation
nc=open.ncdf("argo_2005-2012_grd.nc")
#
#### PARAMETERS ####
#
use_anom=1
ub=0 # Upper Level Depth
lb=500 # Lower Level Depth - ARGO data goes to 2000M
#wb=160 # western boundary
wb=60 # western boundary
eb=290 # eastern boundary
#nb= +50.5
#sb= -50.5
nb= 15.5
sb= -5.5
mths=96
mthv=1:mths
nyrs=8;
#
# Function: getncsample
#
# This function returns a four dimensional array of netCDF potential temperature data.
#
# In addition to the nc file and source description, the function also
# receives the bounds for the sample:
# wb: west bound
# eb: east bound
# sb: south bound
# nb: north bound
# ub: upper bound (ocean depth - usually zero)
# lb: lower bound
# sm: start month
# cm: count (number of months to sample)
#
getncsample=function(nc, ncsource, wb, eb, sb, nb, ub, lb, sm, cm)
{
if (wb < 0) wb=wb+360
longx=which(long>=wb & long<=eb)
latx=which(lat>=sb & lat<=nb)
depthx=which(depth>=ub & depth<=lb)
startlong=min(longx)
countlong=length(longx)
startlat=min(latx)
countlat=length(latx)
startdepth=min(depthx)
countdepth=length(depthx)
start=c(startlong,startlat,startdepth,sm)
count=c(countlong,countlat,countdepth,cm)
ptsamp=get.var.ncdf(nc,tempvar,start,count) - k
dim(ptsamp)=count
attributes(ptsamp)$dimnames = list(long[longx],lat[latx],depth[depthx],1:cm-0.5)
return(ptsamp)
} # selectncsamp()
#
#### MAIN PROGRAM ####
#
if (use_anom) { anom = get_anom(nc, ncsource, wb, eb, sb, nb, ub, lb, 1, mths) } # get_anom() not included in this snippet
### function imageplt()
imageplt=function(mth){
ncsource="argo"
if (use_anom) {
### samp = which ( anom, ..... mth, 1)
} else {
ptnc=getncsample(nc, ncsource, wb, eb, sb, nb, ub, lb, mth, 1)
# samp = apply(ptnc,c(2,3),function(x)mean(x,na.rm=T)) # lat and depth
samp = apply(ptnc,c(1,3),function(x)mean(x,na.rm=T)) # long and depth
} # endif use_anom
x=as.numeric(dimnames(samp)[[1]])
y=as.numeric(dimnames(samp)[[2]])
# x=as.numeric(dimnames(anom)[[1]])
# y=as.numeric(dimnames(anom)[[2]])
fnum = formatC(mth,width=3,format="d",flag="0")
yr = 2005 + floor((mth-1)/12)
mo = mth %% 12
mo = ifelse(mo==0,12,mo)
mo = formatC(mo,width=2,format="d",flag="0")
# plotmain1=paste("ARGO PTemp - ",wb,"E to ",eb,"E",sep="")
plotmain1=paste("ARGO PTemp - ",sb,"N to ",nb,"N",sep="")
plotmain2=paste(yr,".",mo,sep="")
plotmain =c(plotmain1,plotmain2)
image.plot(x, y, samp, main=plotmain, zlim=c(3,32),
ylab="Depth (M)", xlab="Longitude", ylim=rev(range(y)))
# ylab="Depth (M)", xlab="Latitude", ylim=rev(range(y)))
} # end imageplt()
##################
# create animation
##################
# create animation gif
saveGIF({
ani.options(nmax = mths)
sapply(mthv, imageplt) # for m in mntv{ imageplt(m) }
}, interval = 0.2, movie.name = "argo-animation.gif", ani.width = 600, ani.height = 400, outdir=getwd())
close.ncdf(nc)
Hope the formatting is OK.
This is what I got from a little reading a time or two, some time ago. I was never there, and I never studied the data to get a deeper understanding, but what I read, or believed I read, is that the warm waters of an El Nino, which end up near South America, come from the Western Pacific Warm Pool. It slides along the top of the thermocline, west to east. This happens because the Trade Winds are in abeyance due to a major atmospheric pressure shift. There are also El Nino related counter-currents going the other way, either deeper or further afield. I forget the details.
During the 5 to 7 years (average) between El Nino events, the generally reliable Trade Winds blow towards the west. This leads to clear skies and, being around the equator, plenty of sunshine heating the water to perhaps 100 meters depth, or however far sunlight penetrates.
At the western end of the Pacific is Indonesia and other islands, and much underwater bumpiness (shallower water), which makes a sort of dam preventing easy flow into the Indian Ocean. This leads to the extra-warm water literally piling up, eventually to about 22 inches above average sea level. When the driving force ceases (the Trade Winds quiet), that huge blob of warm water starts flowing back downhill. If the atmospheric pressure difference lasts long enough, it becomes an El Nino event, and the water eventually crashes into South America.
Your work is too straightforward, I can actually understand it.
I think this needs to be infilled with warmer temperature measurements from inside passing ships’ boilers, then have 3 passes at smoothing the data using 3 different incompatible methods, and then be gridded, before it can meet the standards set by NOAA.
A key point to note is that the SST in open waters never exceeds 32C because it cannot. The atmosphere is in hyperdrive before that temperature is reached with so much TOA insolation being reflected that the surface cools even during the midday sun.
https://1drv.ms/b/s!Aq1iAj8Yo7jNg3qPDHvnq-L6w5-5
Ice forming in the atmosphere during cloudburst creates dense, highly reflective clouds.
The rest is noise.
And what climate models predict is simply not possible in Earth’s atmosphere. They are junk.
Some noise comes from the cylinders known as Pacific, Atlantic and Indian driving the crank shaft known as the southern ocean circulation so blobs of heat get redistributed, sometimes adding, sometimes subtracting so it is never in equilibrium.
“SST in open waters never exceeds 32C because it cannot…”
SST might have a cap and some areas might be reaching it but that won’t stop the average from rising:



… and the area of 32C increasing. You’d expect the sub-surface layers to continue warming through mixing too.
The temperature cannot alter from where it is in the next few thousand years because orbital eccentricity is not altering that fast. And there is no likelihood of major changes in currents due to land movements in that time frame.
Climate models have the sea surface in the Nino34 region exceeding 32C by 2040:
http://climexp.knmi.nl/data/itas_cmip3_ave_mean_sresa1b_-120–170E_5–5N_n_5sea_max.png
Cannot happen. And the forecast is wrong already:
https://1drv.ms/b/s!Aq1iAj8Yo7jNg3j-MHBpf4wRGuhf
Equatorial SST simply cannot exceed 32C in open ocean. It defies the physics of the atmosphere.
If you see any long term temperature trend that is anything but zero, take a close look at the measurement system. The Earth is not gaining or losing energy. There are powerful feedback mechanisms that prevent that. Ice on the ocean surface regulates heat loss and ice in the atmosphere regulates heat input.
And the buoys show a very modest rate of warming, not worth destroying our economies and restricting the developing countries from alleviating the suffering of their people. Your cure is worse than the assumed disease, even if it were as bad as you claim.
Lloydo
SST might have a cap and some areas might be reaching it but that won’t stop the average from rising
And yet, the Pacific surface has been cooling for at least two decades.
https://www.nature.com/articles/s41467-020-19338-z
A robust eastern Pacific surface temperature cooling trend was evident between ~1990–2013 that was considered as a pronounced contributor to the global surface warming slowdown. The majority of current climate models failed to reproduce this Pacific cooling trend, which is at least partly due to the underrepresentation of trans-basin teleconnections.
As always in climate, everyone a winner!
Rick says zero, you say “modest”, modest is arguable but *is* some amount, so I guess at least we both agree to disagree with him on that.
You can’t just cherry-pick the bits that cooled: “Tropical central-to-eastern Pacific…”
I couldn’t see where they defined the size of that region, but as a proportion of the whole Pacific I’ll guess at 10%.
How has the rest of the “Pacific” trended?
…and the global SST *has* risen… so I guess there must be even more warming elsewhere… and the rate even looks like it’s increasing: now 10 years above the trend line. Not much cooling going on there.



Mad OHC dog, chaotically wagging its atmospheric tail, galloping up a steepening gorge. Good luck with modest.
Mad OHC dog, chaotically wagging its atmospheric tail, galloping up a steepening gorge. Good luck with modest.
You got a real psychological problem there……
Several centuries ago at Queen’s we started with versions of Fortran, using punch cards on an IBM360. Early versions were ForGo and Watfor (Waterloo Fortran). Even earlier versions were BeForGo and OnceUponatran. Occasionally some guy would spill his box of punch cards and have a full melt-down in front of everyone. Hard to watch – decent people would look away.
These early languages were cumbersome beasts that did not allow zero iterations in a “Do Loop” – you had to program around the loop if you wanted to skip it. I recall Algol changed that – a huge improvement. Then there were simulation languages like GPSS – miserable beasts to program, but they could be made to work.
Later I did a lot of financial modelling – the high-level 2D matrix programs used for finance were crap, imo. They did not allow for sufficient detail to model the intricacies of the real world that we dealt with. I rejected them and started to use Lotus123, which did the job – and later moved to Excel, which was even better.
I hardly ever program anymore, but I guess I should learn C. [That’ll be an afternoon I never get back.]
Here we can see that the warm water has been moved out and has been replaced by the colder subsurface waters, which have now come to the surface.
This is called the Bjerknes feedback.
http://iridl.ldeo.columbia.edu/maproom/ENSO/New/bjerknes.html
https://ptolemy2.wordpress.com/2020/07/23/enso-and-the-anchovy/
Thanks, Phil. As an erstwhile anchovy fisherman myself, fishing off of Cannery Row made famous by Steinbeck, I greatly enjoyed the article on ENSO and the anchovy.
w.
Your endorsement made my day! How big is the North Pacific anchovy fishery? The Peruvian anchovy seems to routinely outperform most ENSO pundits in predicting ENSO status – strong juvenile recruitment in particular seems to indicate upwelling and impending cooling / La Nina conditions. (The authorities sometimes stop the fishery if it is dominated by young fish – but you know a big year class is on the way.) I usually check at Undercurrent news:
https://www.undercurrentnews.com/
Good stuff. I’m of the opinion that at least a great deal of the “oscillation” is driven by energy released by undersea volcanism being carried east by the Cromwell Current. If I find the time (and talent) to produce something useful I’ll send it in.