Study provides new approach to forecast hurricane intensity

From the University of Miami Rosenstiel School of Marine & Atmospheric Science

UM Rosenstiel School scientists offer new information to help improve tropical storm forecasting

MIAMI – New research from the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science suggests that physical conditions at the air-sea interface, where the ocean and atmosphere meet, are a key component in improving forecast models. The study offers a new method to aid in predicting the intensity of hurricanes.

“The general assumption has been that the large density difference between the ocean and atmosphere makes that interface too stable to affect storm intensity,” said Brian Haus, UM Rosenstiel School professor of ocean sciences and co-author of the study. “In this study we show that a type of instability may help explain rapid intensification of some tropical storms.”

Experiments conducted at the UM Rosenstiel School Air-Sea Interaction Salt Water Tank (ASIST) simulated the wind speed and ocean surface conditions of a tropical storm. The researchers used a technique called “shadow imaging,” where a guided laser is sent through the two fluids – air and water – to measure the physical properties of the ocean’s surface during extreme winds, equivalent to a category-3 hurricane.

Using data from the laboratory experiments, conducted with the support of the Gulf of Mexico Research Initiative (GOMRI) through the CARTHE Consortium, the researchers then developed numerical simulations showing that changes in the physical stress at the ocean surface at hurricane-force wind speeds may explain the rapid intensification of some tropical storms. Their simulations point to a particular mechanism, known as the Kelvin-Helmholtz instability, as the likely driver of this intensification.
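For a feel of the mechanism, the classical two-fluid Kelvin-Helmholtz criterion (a textbook simplification, not the paper’s numerical model) says the interface becomes unstable once the velocity jump across it exceeds a threshold set by gravity and surface tension. A minimal sketch in Python, with assumed standard values for air over seawater:

```python
# Classical two-fluid Kelvin-Helmholtz criterion for air over seawater,
# with gravity and surface tension restoring the interface. A minimal
# textbook sketch, NOT the numerical model used in the paper.
import numpy as np

g = 9.81          # gravity, m/s^2
rho_air = 1.2     # air density, kg/m^3 (assumed)
rho_sea = 1025.0  # seawater density, kg/m^3 (assumed)
sigma = 0.073     # air-seawater surface tension, N/m (assumed)

# Linear theory: a wave of wavenumber k is unstable when
#   rho_air*rho_sea/(rho_air + rho_sea) * dU**2
#     > g*(rho_sea - rho_air)/k + sigma*k
# The right-hand side is minimized at k* = sqrt(g*(rho_sea - rho_air)/sigma),
# which fixes the critical velocity jump dU_crit across the interface.
k_star = np.sqrt(g * (rho_sea - rho_air) / sigma)
rhs_min = 2.0 * np.sqrt(g * (rho_sea - rho_air) * sigma)
dU_crit = np.sqrt((rho_air + rho_sea) / (rho_air * rho_sea) * rhs_min)

print(f"most unstable wavelength: {2 * np.pi / k_star * 100:.1f} cm")
print(f"critical velocity jump:   {dU_crit:.1f} m/s")
# About 1.7 cm and 6.7 m/s: far below category-3 winds, so the shear
# across the interface is well past this threshold in any hurricane.
```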

Haus and colleagues will conduct further studies on hurricane intensity prediction in the new, one-of-a-kind Alfred C. Glassell, Jr., SUSTAIN research facility located at the UM Rosenstiel School. The SUrge-STructure-Atmosphere INteraction laboratory is the only facility capable of creating category-5 level hurricanes in a controlled, seawater laboratory. The nearly 65-foot long tank allows scientists to simulate major hurricanes using a 3-D wave field to expand research on the physics of hurricanes and the associated impacts of severe wind-driven and wave-induced storm surges on coastal structures.

The SUSTAIN research facility is the centerpiece of the new $45 million Marine Technology and Life Sciences Seawater Complex at the UM Rosenstiel School where scientists from around the world have access to state-of-the-art seawater laboratories to conduct an array of marine-related research.

The study, titled “The air-sea interface and surface stress under tropical cyclones,” was published in the June 16 issue of the journal Scientific Reports. The paper’s lead author was Alex Soloviev of the UM Rosenstiel School and the Nova Southeastern University Oceanographic Center; co-authors include Mark A. Donelan of the UM Rosenstiel School, Roger Lukas of the University of Hawaii, and Isaac Ginis of the University of Rhode Island.

###
July 11, 2014 12:51 am

It’s good to see scientists at the University of Miami Rosenstiel School of Marine & Atmospheric Science doing research that might be of practical use to people. Compare that to the endless and mindless activity on futile and inaccurate models undertaken by the Gang of CAGW.

Bloke down the pub
July 11, 2014 2:49 am

That would be one hell of a ride to be inside when it was in use.

steveta_uk
July 11, 2014 3:34 am

“Experiments conducted at the UM Rosenstiel School Air-Sea Interaction Salt Water Tank …”
What a ridiculous waste of money! Don’t they realise that using a computer model written in Excel you can model the entire climate on a cheap PC?

Stargazer
July 11, 2014 4:30 am

This is the way science should be done. I wish them well.

ren
July 11, 2014 4:51 am

It is sufficient to observe the temperature of the surface of the Atlantic off the coast of Africa and the circulation in the upper troposphere.
http://weather.unisys.com/surface/sst_anom.gif
http://earth.nullschool.net/#current/wind/isobaric/500hPa/orthographic=-54.23,21.84,636
By the way, what about the Gulf Stream?
http://earth.nullschool.net/#current/ocean/surface/currents/overlay=sea_surface_temp_anomaly/orthographic=-54.23,21.84,636

P@ Dolan
July 11, 2014 5:41 am

@ Bloke down the pub
I’m with you! Wahoo, what a blast, eh?

Tom O
July 11, 2014 6:00 am

” steveta_uk says:
July 11, 2014 at 3:34 am
“Experiments conducted at the UM Rosenstiel School Air-Sea Interaction Salt Water Tank …”
What a ridiculous waste of money! Don’t they realise that using a computer model written in Excel you can model the entire climate on a cheap PC? ”
So you don’t believe the notch theory. Tough. It appears to do a better job than the crap run on Crays. Science doesn’t have to be expensive and it doesn’t have to be BIG, but it does need an open mind and a willingness to learn. Too bad you have neither.

kenw
July 11, 2014 6:07 am

Tom O says:
July 11, 2014 at 6:00 am
Tom, I believe that was sarcasm…..

Jim Clarke
July 11, 2014 7:42 am

…changes in the physical stress at the ocean surface at hurricane force wind speeds may explain the rapid intensification of some tropical storms. The research team’s experimental simulations show that the type of instability, known as Kelvin-Helmholtz instability, could explain this intensification.
Okay, but wouldn’t this physical stress be nearly identical for any given wind speed over open ocean? How does this explain the wide variety of possible outcomes? And how would forecasters gather data on this Kelvin-Helmholtz instability to make it useful?
Interesting work, but I still think the key to forecasting hurricane intensity is in the atmosphere, especially in the upper atmosphere. In the heart of the tropics (in season), the near-surface conditions are near perfect most of the time.

P@ Dolan
Reply to  Jim Clarke
July 11, 2014 3:24 pm

@Jim Clarke
Actually, no, I wouldn’t expect the stress to be identical. It should rise with wind speed until a certain rate is reached, because the winds will create waves. It seems to me that the period and amplitude of the waves will produce differing friction at differing wind speeds, and of course the amplitude and period of the waves depend upon the wind speed, duration of exposure to it, direction, fetch, etc.—a dynamic set of mechanics. As for friction, I think the effect at the surface is probably somewhat like driving on a washboard road: at low speed, your shocks can’t adjust and you feel every bump heavily. Increase the speed to a certain rate, and you start to skip over the bumps and catch only the crests, and the ride smooths out. At what speed does the ride smooth? It depends on the frequency of the crests and, to a certain degree, the amplitude and waveform (shape).
The longer the water surface is exposed to winds, the greater the effect of wave creation; but wave amplitude and frequency are also subject to influences besides wind: salinity, depth, etc. The mechanics are dynamic, and I’m over-simplifying, but I think you can see where my thoughts are on this—I have no idea of the specifics of their experiments, so please don’t let me give the impression I’m speaking about what they are doing—I’m just speaking from my own limited knowledge, though my experience on the seas is pretty extensive (retired Navy, lifelong waterman—stinkpot and rag—proud owner of a ’71 C&C Redwing 35).
I don’t know if the problem can be broken down by some calculus that permits prediction, but if they can do it, it would be very useful.
More power to them! It is refreshing to hear about climate studies that involve something more than a GIGO computer program… Even if they fail, they’ll learn something, which can’t really be said of GCMs, from which we can only learn the prejudices of the programmer… What the poor deluded fools don’t seem to understand about models: models are not reality; they are an abstraction. In the case of GCMs they are an articulation of a hypothesis that allows us to play with that hypothesis somewhat, tossing numbers at it. And if it matches reality, hey! Great, and we can tentatively draw some conclusions and move to more concrete experiments. These guys seem to have forgotten that their models are NOT actually experiments: they’re simply articulations of hypotheses.
Regarding David Evans’ project, that’s a different use of a computer than modeling, if I may get a bit off-topic for a sec: he’s got a mathematical examination, an analysis tool, that he’s used on a few data sets, some reconstructions, some observations, and he states that he may have found a previously undetected signal which correlates with gross changes in climate. With this as evidence, it should be possible to create sensors to look for this signal in the raw output of the Sun, though how long it will take—years, decades, centuries—to say definitively that it exists may make the task daunting. Which should not be discouraging: Einstein’s Special Theory was nearly a mathematical certainty, yet it was decades before technology could actually prove it. Yet for most of that time the Special Theory of Relativity was accepted as a given by most. So if it takes time, it takes time.
But David Evans’ work is a practical demonstration of how scientists, even amateur scientists, actually use computers: as tools, not as Ouija boards. I applaud his efforts. I don’t know enough to state an INFORMED opinion about whether he is really on to something or not, but I believe it’s worth looking. Scientific pursuit should be merciless in leaving no stone unturned, impartial in the search for data. His approach and his observation appear unique to me. Let him have his chance to be proven correct or incorrect impartially, I say.
That’s the very essence of the Scientific Method.
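The exchange above maps onto how bulk air-sea flux schemes treat surface stress: tau = rho_air * Cd * U10^2, with published tank and field estimates suggesting the drag coefficient Cd stops growing near hurricane force. A rough sketch with assumed, illustrative coefficients (a Large and Pond style fit with an arbitrary cap), not anything taken from the paper:

```python
# Bulk stress tau = rho_air * Cd(U10) * U10**2 with a drag coefficient
# that saturates at hurricane wind speeds, as tank and field estimates
# (e.g. Donelan et al. 2004) suggest. Coefficients below are assumed,
# illustrative values, not results from the paper discussed above.
import numpy as np

rho_air = 1.2  # air density, kg/m^3

def drag_coefficient(u10):
    """Cd rising with 10-m wind speed, capped near 33 m/s."""
    cd = (0.49 + 0.065 * u10) * 1e-3   # Large & Pond (1981)-style fit
    return np.minimum(cd, 2.6e-3)      # assumed saturation level

for u10 in (10.0, 20.0, 33.0, 50.0, 70.0):
    tau = rho_air * drag_coefficient(u10) * u10**2
    print(f"U10 = {u10:4.0f} m/s   Cd = {drag_coefficient(u10):.2e}   "
          f"tau = {tau:6.2f} N/m^2")
# Once Cd flattens, stress grows as U10^2 rather than roughly U10^3:
# the 'washboard' stops getting proportionally rougher, so the surface
# exerts less drag on the storm than a straight extrapolation predicts.
```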

July 12, 2014 1:04 am

They still miss an important point. Storm energy based on wind velocity squared fails to take into account the track and speed of the storm. A northward-tracking storm on the Atlantic coast has additive winds due to the track speed, but only in the eastern quadrants. Hence ACE is not really an accurate measure when storm translation speed is a significant component of the wind. It’s like a rolling tire on the ground: it doesn’t slip where it contacts the road, but the high point of the tire has significant velocity with respect to the road. The energy is in the rolling vehicle but isn’t imparted as energy to the road, because the velocity difference between road and tire at the contact point is zero. Our ACE index would see the offshore energy as enormous while the actual energy is much lower.
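To put rough numbers on that point: ACE sums 1e-4 times the square of the maximum sustained wind (in knots) over each advisory, so an index keyed to the single maximum samples only the enhanced quadrant of a moving storm. A toy calculation, with made-up values:

```python
# Toy version of the point above: for a storm translating north, the
# translation speed adds to the rotational wind on the right (eastern)
# side and subtracts on the left, while ACE records only the single
# maximum (ACE term = 1e-4 * Vmax**2, winds in knots). All numbers
# below are made up for illustration.
v_rot = 90.0  # rotational wind at the radius of maximum wind, knots
v_trk = 20.0  # northward translation speed, knots

v_right = v_rot + v_trk  # eastern quadrants of a north-moving storm
v_left = v_rot - v_trk   # western quadrants

ace_term_max = 1e-4 * v_right**2                      # what ACE records
ace_term_avg = 1e-4 * 0.5 * (v_right**2 + v_left**2)  # two-sided average

print(f"right side: {v_right:.0f} kt, left side: {v_left:.0f} kt")
print(f"ACE term from Vmax alone:      {ace_term_max:.2f}")
print(f"ACE term averaging both sides: {ace_term_avg:.2f}")
# 1.21 vs 0.85 here: the single-Vmax term overstates the two-sided
# mean by roughly 40% when translation is a fifth of the rotational wind.
```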

P@ Dolan
July 12, 2014 6:49 am

@timothybeatty says:
July 12, 2014 at 1:04 am
True about the additive vectors in the northeast quadrant of a northbound storm. But at the same time, the winds are reduced by the difference with the track vector in the southwest quadrant.
Measuring the energy in a storm is a complex issue. Pressure, temperature, max sustained winds, size, track speed… But its destructiveness isn’t so much a factor of any of these, as we can see from Sandy, Katrina and Rita—it’s more a factor of how ready the people are where the storm makes landfall. I’ve read hearsay (but seen no actual reports of total costs) to the effect that places which are hit more frequently by tropical storms may have higher storm costs over long periods of time, but lower costs from individual storms, because of the relative preparedness level.
I live in Maryland, which was hit pretty hard by Isabel in 2003, and again by Ernesto in 2009. In the case of Ernesto, it was reduced to the status of a tropical storm, went feet-dry right atop us, stalled here for over 16 hours, and in that time built back up to a Cat 2—the strength of Sandy. It did a great deal of damage, but nowhere near what Sandy did. There weren’t any calls to declare a disaster—although the Admiral at PAX River was hopping mad, because NOAA’s final message on Ernesto was that Tropical Storm Ernesto had gone feet dry and this would be the final report, and they were as good as their word—so all day there were no further reports on what to expect. The estimates right up to the final were, “Yeah, you’ll get some wind and a lot of rain, but it’ll be over quickly and nothing to write home about,” and meanwhile we had very bad storm surge, gale winds, and apparently inadequate preparation throughout the area—if not at PAX itself—because people had been told it wasn’t a big deal.
But even without the warnings to brace for a hurricane, the damage was moderate compared to Sandy.
I’m not claiming people hereabouts are more prepared by nature or smarts—it was simply very soon after Isabel (only 6 years), and the caution learned from that one hadn’t gotten very stale yet.
If someone has a tool that allows prediction of storm intensity, that would only be a good thing. Because next month it will be 9 years since Katrina, and the longer we go without a big storm, the easier it is for people to become complacent and forget about preparations. As bad as Sandy’s impact was, it was only a Category 2 becoming a Category 1 when it did all that damage; it was not a super storm. Woe betide the folks who get nailed by a Cat 5—which is the estimate for the hurricane of Isaac’s Storm (Galveston, 1900), far and away the most destructive storm, in terms of loss of life, to have hit the US.
Yeah: a tool to more accurately predict or even simply detect a storm’s intensity such that it can be tracked would be very useful.

bh2
July 12, 2014 8:20 pm

One may wonder how forecasts produced on super-computers by either method compare to forecasts produced by simple coin-tossing.