The coarse-graining technique enables them to quantify, for the first time, the energy of ocean currents larger than 1,000 kilometers. They find that the most energetic is the Antarctic Circumpolar Current, some 9,000 kilometers in size.
UNIVERSITY OF ROCHESTER
For the first time University of Rochester researchers have quantified the energy of ocean currents larger than 1,000 kilometers. In the process, they and their collaborators have discovered that the most energetic is the Antarctic Circumpolar Current, some 9,000 kilometers in diameter.
The team, led by Hussein Aluie, associate professor of mechanical engineering, used the same coarse-graining technique his lab previously developed to document energy transfer at the other end of the scale, during the “eddy-killing” that occurs when wind interacts with temporary, circular currents of water less than 260 kilometers in size.
These new results, reported in Nature Communications, show how the coarse-graining technique can provide a new window for understanding oceanic circulation in all its multiscale complexity, says lead author Benjamin Storer, a research associate in Aluie’s Turbulence and Complex Flow Group. This gives researchers an opportunity to better understand how ocean currents function as a key moderator of the Earth’s climate system.
The team also includes researchers from the University of Rome Tor Vergata, University of Liverpool, and Princeton University.
Traditionally, researchers interested in climate and oceanography have picked boxes in the ocean 500 to 1,000 square km in size. These box regions, which were assumed to represent the global ocean, were then analyzed using a technique called Fourier analysis, Aluie says.
“The problem is, when you pick a box, you are already limiting yourself to analyzing what’s in that box,” Aluie says. “You miss everything at a larger scale.
“What we are saying is, we don’t need a box; we can think outside the box.”
When the researchers use the coarse-graining technique to “blur” satellite images of global circulation patterns, for example, they find that “we gain more by settling for less,” Aluie says. “It allows us to disentangle different-sized structures of ocean currents in a systematic way.”
He draws an analogy to removing your eyeglasses and then looking at a very crisp, detailed image: it will appear blurred. But as you look through a succession of progressively stronger eyeglasses, you will often be able to detect various patterns at each step that would otherwise be hidden in the details.
In essence, that is what coarse graining allows the researchers to do: quantify various structures in ocean currents and their energy “from the smallest, finest scales to the largest,” Aluie says.
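In outline, the idea can be sketched in a few lines of Python. This is a toy illustration, not the team’s published code: low-pass filter a velocity field at a chosen length scale, then compute the kinetic energy of the filtered field. Because the velocities are filtered before squaring, only the energy of motions larger than the filter scale survives, and sweeping the scale disentangles the energy of different-sized structures.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def coarse_grained_ke(u, v, scale_px):
    """Kinetic energy per unit mass of a 2D velocity field after
    low-pass (Gaussian) filtering at a given scale in grid points.
    Filtering first, then squaring, keeps only the energy of
    motions larger than the filter scale."""
    u_l = gaussian_filter(u, sigma=scale_px)
    v_l = gaussian_filter(v, sigma=scale_px)
    return 0.5 * np.mean(u_l**2 + v_l**2)

# Synthetic field: one large-scale current plus small-scale "eddy" noise
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 256)
X, Y = np.meshgrid(x, x)
u = np.sin(X) + 0.5 * rng.standard_normal((256, 256))
v = np.cos(Y) + 0.5 * rng.standard_normal((256, 256))

# Energy retained at successively coarser scales: it decreases as the
# filter removes first the eddies, then part of the large-scale flow.
for s in (1, 4, 16):
    print(s, coarse_grained_ke(u, v, s))
```

The difference in energy between two successive filter scales is the energy held by structures in that scale band, which is the quantity the paper maps across the global ocean.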
Aluie credits Storer for further developing and refining the code; it has been published so other researchers can use it.
Other collaborators include Michele Buzzicotti, a research scientist at the University of Rome Tor Vergata; Hemant Khatri, a research associate at the University of Liverpool; and Stephen Griffies, a senior scientist at Princeton.
Support for the project included funding from the National Science Foundation, the National Aeronautics and Space Administration, and the Department of Energy.
METHOD OF RESEARCH
SUBJECT OF RESEARCH
Global energy spectrum of the general oceanic circulation
ARTICLE PUBLICATION DATE
The authors declare no competing interests.
What, no climate doom? How did it not get rejected?
In researching the comment below, I stumbled across your answer. Several climate model papers in the last 5 years claim ocean currents are ‘accelerating’ (becoming more energetic) faster than climate models predicted. So climate change is WORSE than the models predict.
And now we have a new way of modelling how much worse than the models.
“METHOD OF RESEARCH”
While I caught the “no climate doom”, I missed this important info below the main body of the article. Thanks for the warning!
I got suspicious from the ‘category of science’ for the paper posting: computer simulations, modelling. So they announce a new way to analyze computer models of ocean currents.
Then I noticed the caption under the sidebar image showing the Antarctic circumpolar to be the most energetic as shown by the new method. ‘Currents from satellite observation’. I wondered, how can satellites ‘observe’ ocean currents?
I found the answer at NCEI.NOAA.gov. They do not. Currents are inferred in a computer model from three remotely sensed elements: satellite altimetry of differential ‘pixel’ sea height, inferred surface wind vector (from wave orientation and imputed reflective wave height, itself a model needed to pretend to know pixel sea level), and surface temperature. Only surface temperature is remotely sensed somewhat reliably. As my previous posts have shown, satellite altimetry is unfit for purpose for sea level rise, and so even more unfit for pixel-level current detection.
So this new paper fancifully touts a new method for garbage in>>garbage out. Lipstick on a pig. Still a pig.
Eh, they need to test a few of those currents with floats, Rud. Then we can pronounce it as garbage. For now, who knows?
Will it be tested? Ummm… I wouldn’t bet on it. Cheaper to fund models than all the kit required to validate them. And until then it will be hard to prove they are wrong.
[edit – stray word that got in there]
Floats would not be sufficient as there are many currents in many layers at varying depths that move differently than the surface waters (there is no wind at 1,000 fathoms).
I was thinking floats set for the depth of the currents they mapped. I was seeing currents on that graphic that I’d never seen before and just figured something was going on shallower or deeper than had ever been mapped.
I take Rud’s comment regarding all models to heart. But I figure we should at least let them sink or swim on validations.
They need to build a bunch of giant windmills and mount them upside down on floating barges in the Southern Ocean. That would provide both the data and generate a lot of electricity. It would also slow down the rapidly accelerating currents that are such a deadly threat to the planet’s very existence in the universe.
The ARGO floats might be a place to start for that initial cross check against models. The floats descend and re-ascend, so are moved by currents at multiple layers. Someone must be keeping track of their GPS locations when they arrive at the surface.
Good point, DJ. It leads one to ask, why the models? They should have that Argo data to work with.
Wow. I had assumed that the data used to create the maps of currents was actual data. Instead it sounds more like this ‘research’ was accomplished using indirect measurements, and modeling of models or some such wizardry? Higher learning has really sunk to a low. Sad.
Yup. Except my descriptive adjective is much worse and less polite than ‘sad’. FUBAR.
Rud, are you suggesting that we would be better off not trying to learn about sea currents unless we can do it without using “models”?
There is no such thing as a “model-free” estimate (if you rule out finger counting and forearm (cubit) measurements). Thermometers and yard sticks have to be very carefully constructed and calibrated, but still are very limited by their built-in assumptions of scale and linearity. (e.g. you wouldn’t use a micrometer to measure the distance between New York and London, or a meat thermometer inside a blast furnace).
From what I have read on this (and I’m no expert), satellite altimetry, using radar and lasers, detects Doppler shifts in wavelengths, which can be used to measure return timings and compute relative motions of the sea surface wrt the satellite sensor. From these, surface motion and relative positions could be estimated, after subtracting out “noise” of course. Of course these estimates won’t be completely error free (no measurement is free of errors). But could they not be useful for analyzing ocean behavior, using techniques such as data assimilation, to predict (“forecast”) behavior in unobserved places and times?
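For what it’s worth, the standard inference from altimetry is geostrophic: gradients in sea-surface height are balanced by the Coriolis force, so the height field alone gives a surface current estimate. A hedged sketch of that textbook calculation (idealized grid and bump, not any agency’s actual processing chain):

```python
import numpy as np

G = 9.81           # gravitational acceleration, m/s^2
OMEGA = 7.2921e-5  # Earth's rotation rate, rad/s

def geostrophic_velocity(ssh, lat_deg, dx, dy):
    """Surface geostrophic currents from a gridded sea-surface-height
    field (meters): u = -(g/f) * d(eta)/dy, v = (g/f) * d(eta)/dx,
    where f is the Coriolis parameter at the given latitude."""
    f = 2 * OMEGA * np.sin(np.radians(lat_deg))
    deta_dy, deta_dx = np.gradient(ssh, dy, dx)  # axis 0 = y, axis 1 = x
    u = -(G / f) * deta_dy
    v = (G / f) * deta_dx
    return u, v

# Idealized example: a 0.1 m Gaussian sea-surface bump at 45° S,
# on a 10 km grid -- roughly the scale of a mesoscale eddy.
x = np.arange(0, 200e3, 10e3)
X, Y = np.meshgrid(x, x)
eta = 0.1 * np.exp(-((X - 100e3)**2 + (Y - 100e3)**2) / (2 * (40e3)**2))
u, v = geostrophic_velocity(eta, lat_deg=-45.0, dx=10e3, dy=10e3)
print(np.abs(u).max())  # on the order of 0.1 m/s, a plausible current speed
```

A decimeter of height difference over tens of kilometers implies currents of tens of centimeters per second, which is why centimeter-level altimetry errors matter so much for this inference.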
This approach seems to work for weather and oceans (sort of). Obviously there are still many wrinkles to be ironed out. I think data assimilation is a very useful kind of Bayesian inference, properly applied.
I really do not see any nefarious “agenda” in this paper (link), except perhaps Prof. Aluie is trying to promote the careers of his students (and himself), by getting papers published.
In fact I find the paper interesting because it makes the claim to be the first to determine that the peak energy of ocean currents lies in the Southern Ocean, at a scale (~10^4 km) that has never been documented before, “out of the box” as they call it.
So I quickly fired up windy.com and nullschool to investigate this claim:
My uncalibrated eyeball would have guessed that the equatorial currents are the most energetic, because they are more intensely displayed. But that may be the point that Aluie is claiming, they are measuring flow-based kinetic energy in a too-restricted scale.
And it makes sense (in a metaphysical sense) that the Southern Ocean would have a large-scale amplitude at the planetary scale, because it is the only ocean that circles the world without interruption. Strange that no one has apparently investigated this feature.
But don’t misjudge my motives in writing this, I also believe that a large segment of the scientific literature in the climate area has been totally corrupted by activism. I don’t see it in this paper. But I only skimmed it quickly.
Nice work mapping out the currents.
Now it’s all up to the Climate Science™ modelers to show how the currents will cause nothing but doom.
See above. They already did, at least in their own minds.
I missed the doom part, Rud. I just figured it was coming later.
I saw your comment about all models after I posted. I do take your investigations to heart.
I dunno. I think a few validations (which will likely never happen cuz models are cheaper and kit is dear) will allow them to h@ng themselves. If they fight some sort of experimental validation, then they are already swinging.
Idle thought. The Polynesians had stick maps of the currents in the Pacific. They worked. I wonder if these modern models agree with what knowledge of currents was gained and mapped through experience? That would be a good test.
Just been discussing this paper with Javier over at Judith Curry’s site. He pointed out figure 5. In the NH hardly any kinetic energy north of 45 degrees (relatively speaking) while in the SH massive kinetic energy in the “Roaring forties-fifties” southern circumpolar current. Most of the world’s oceanic kinetic energy is right there, emphasising the southern hemisphere’s domination of oceanic heat transport.
Nice observation. There is a reason the sailing whalers named those SH latitudes ‘roaring’. Tremendous circumferential surface winds almost unimpeded by land masses. Unlike anywhere else on Earth.
Don’t tell me about it, it is one of the reasons we in New Zealand have some of the best wind resource in the world.
Although having now read it the results don’t seem to imply this at all.
Interesting, I was just replying on WE’s last post and made a general observation about the atmosphere’s wind power density. It too is much higher in the SH than in the NH.
How did this get published? How could there be tipping points when oceans “moderate” the climate system?
Why bring oceans into a discussion on climate? Manabe did not need that to get his Nobel prize. He just used his atmospheric physics.
How this got published:
Nature Communications is an open access … Prompt dissemination of accepted papers to a wide readership and beyond is achieved through a programme of continuous online publication.
Another aspect of this that troubles me is that it is a bunch of mechanical engineers doing work on ocean currents. Obviously following the money.
A relevant anecdote. Before deregulation the State of New York required the utilities to spend a certain fraction of their income on R&D. It was so much money that there was a lot of work done and the local universities knew there was money available. I was the only meteorologist in the company when I started and was asked to evaluate many projects. Some academic engineers had convinced the company to do a feasibility study for a pumped storage system using wind turbines to pump water up to an already existing reservoir at the eastern end of Lake Ontario. The thought was they would augment the already existing hydro plant on the river so it could run even when precipitation levels were low. They put up a meteorological tower in the Salmon River valley midway near an existing hydro plant.
At some point I was asked to review the meteorological tower and the data. The engineers claimed that the data was favorable because they found a drainage flow caused by the terrain differences that they thought would enhance the wind resource. They found very few times of calm winds. My main criticism was that they didn’t measure where the turbines would be placed, and eventually everyone agreed that there wasn’t enough of a wind resource for pumped hydro there. I was skeptical of the terrain enhancement argument and said so, but they dismissed the kid from the power company.
A few years later I designed a meteorological network to look at mesoscale regimes around Lake Ontario for the nearby nuclear plant and found the real reason for the drainage flow. Lake Ontario is the primary driver of the local winds. It turns out that within ten or so miles of the Lake the only time you get calm winds is when the lake and the land are at the same temperature. That is very rare. What they thought was a drainage flow was actually a land breeze caused by the lake. It turns out that was one of the most common mesoscale regimes.
My point. Somewhere there is an expert on ocean currents thinking what the heck are these clowns trying to do with this approach that will add value to our understanding. I would bet a lot that expert can explain everything they are finding using local knowledge of the systems.
I strongly suspect the same — especially, given that the above “research” was not peer reviewed. It was merely quickly passed by the publication’s “editors” and swiftly published (see my above comment).
Yes, “follow the money”.
It reminded me of a medical analogy from the USA in the mid-1980’s.
The AIDS epidemic started ~1981 and gay activist groups lobbied Congress [rightly in my view] to do more. So Congress mandated [IIRC] ~$5 billion to be spent EVERY YEAR on R&D.
As you can imagine, with that much money sloshing around, every biomedical researcher included some tenuous link to AIDS in their grant proposals. A lot of lousy research got funded.
My favorite was a salamander study in which they included a blurb about how its immune system might relate to humans.
Fast forward to now: we get “model mania” with the latest iteration, CMIP6, even further divorced from reality [aka observations].
Even worse, the results are treated as facts by the “Mainstream media” cheerleaders.
Good anecdote. But I think you are being too harsh on the authors of this paper. They do not seem to be trying to do the thing you expect.
This paper isn’t about explaining the ocean currents. It is about understanding the limits of the modelled ocean currents that are introduced by the size of the grid cells.
Now we all know that the models are being used to try and explain the behaviour of the ocean currents. So indirectly your point makes sense.
Yet the primary purpose of this paper (and the published code) is to help the modellers be aware of their models’ limitations.
The primary purpose of the paper, I believe, is to assert this:
Coarse graining technique enables them to quantify…
As Rud Istvan points out above, they have no meaningful data to quantify.
It is widely recognised among painters that a good idea is to narrow your eyes and squint at the scene to get rid of unnecessary detail. There is some merit in this approach. I found the overview of the circumpolar current marvellously effective in showing the bottleneck effect of the Drake Passage between South America and the Antarctic peninsula. It might cast light on two known geotechnical problems. One question is, could the visible turbulence on the eastern downstream side have an influence on the weird large geomagnetic anomaly in the same area? The other question is, could there be a connection between undersea volcanoes in the Drake Passage itself and the La Nina, El Nino switching so far unexplained?
I believe, though I can’t find the reference, that Buckminster Fuller attributed his patterns of thought to his early childhood years with extreme myopia that led him to see the big picture rather than all the small details.
It’s a novel approach; it will need to be augmented over time, hopefully with some empirical data, though this would be a massive project involving many floats at many depths over massive areas.
I just don’t see them ever getting funding for that.
More likely, this will be open to getting results you want, given the extra play in the rope here.