
Titanic Twisters
University of Oklahoma researchers use TACC’s Ranger supercomputer to simulate unpredictable tornadoes
On May 3, 1999, violent thunderstorms swept through Oklahoma and Kansas, spawning 66 tornadoes and claiming 48 lives. The most severe among them was an F5-intensity tornado that tore through Moore, Oklahoma, with wind speeds of more than 300 miles per hour, the most powerful winds ever recorded on Earth. That tornado alone claimed 36 lives and caused nearly $2 billion in property damage.
Tornadoes are violent, rotating columns of air generated by intense thunderstorms. Hundreds to thousands strike the United States each year, causing billions of dollars in damage, dozens of deaths, and thousands of injuries.
[Image: One of several tornadoes observed by the VORTEX-99 team on May 3, 1999, in central Oklahoma. Note the tube-like condensation funnel, attached to the rotating cloud base, surrounded by a translucent dust cloud. Courtesy of NSSL]
Yet, despite their power and prevalence, many mysteries remain about how tornadoes form. “Various theories exist that try to explain the cause of rotation in tornadoes,” according to Ming Xue, professor of meteorology at the University of Oklahoma and director of the Center for Analysis and Prediction of Storms (CAPS). “But the true causes are still not well understood.”
One important principle behind the tornado’s rotation is the conservation of angular momentum, the same principle that makes ice-skaters spin faster when they pull in their outstretched arms.
“Tornadoes form inside intense thunderstorms when
there is strong vertical motion that causes a concentration of air towards a convergence center,” Xue said.
“When you reduce the radius of rotation, you increase
the rotation rate.” And as the funnel narrows to a
few dozen feet, the vortex grows into one of the most
powerful natural forces on Earth.
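To make the ice-skater analogy concrete, the back-of-the-envelope sketch below (in Python, using invented numbers rather than anything from Xue’s simulations) assumes an idealized air parcel that conserves its angular momentum as it is drawn inward, so its spin rate grows with the inverse square of its radius.

```python
# Idealized illustration of the angular momentum conservation Xue describes.
# All numbers are hypothetical, chosen only to show the scaling; they are not
# taken from the simulations discussed in the article.

def contracted_spin_rate(omega_initial, r_initial, r_final):
    """Angular velocity after contraction, assuming omega * r**2 stays constant
    for an idealized rotating air parcel (no friction or pressure effects)."""
    return omega_initial * (r_initial / r_final) ** 2

omega0 = 0.002          # initial rotation rate in radians per second (hypothetical)
r0, r1 = 1000.0, 25.0   # radius shrinks from 1 km to 25 m (hypothetical)

omega1 = contracted_spin_rate(omega0, r0, r1)
print(f"rotation rate: {omega0} -> {omega1:.1f} rad/s")
print(f"tangential wind at the funnel edge: {omega1 * r1:.0f} m/s")
```

In the real atmosphere, friction and pressure forces keep angular momentum from being perfectly conserved, which is part of why the true cause of tornado rotation remains an open question.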
Xue is at the forefront of tornado simulation and prediction research. Using the Ranger supercomputer at the Texas Advanced Computing Center (TACC), Xue, working with graduate students Daniel Dawson and Ming Hu, has been able to simulate the May 3, 1999 tornado, as well as a May 8, 2003 F4 tornado that also passed through Moore, with unprecedented fidelity, obtaining simulations that match the observed storms well in their characteristics, paths and timing. Xue’s detailed analysis of the model output is providing insights into how and why tornadoes form and how microphysical processes within the cloud affect their formation.
Unlike hurricanes, which last for several days and can
be anticipated far out at sea, tornadoes are localized,
short-lived and almost unpredictable, making them
notoriously difficult to observe and foresee.
“It’s very hard to predict any weather system beyond
its life cycle, and to predict tornadoes, we have to first
forecast their parent storms,” Xue said. “We’re still
trying to predict these thunderstorms with significant lead time.”
The difficulty in simulating tornadoes lies primarily in the difference in size between the tornadoes themselves and the much larger thunderstorms that spawn them as they travel over land. This difference in scale, from tens of meters (the width of the funnel) to hundreds of kilometers (the path of the storm as it travels and generates tornadoes), makes for an extremely challenging multi-scale computational problem in which each scale has to be accurately handled by the numerical model. And that does not even include the still larger environmental flows that shape the development and evolution of the thunderstorms themselves. Add the complex microphysics within the cloud (the processes that generate rain, wind and hail) and the large volumes of Doppler radar observations that have to be assimilated to allow forward prediction, and you have a problem that can only be tackled by the world’s most powerful supercomputers.
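To get a feel for why this range of scales is so demanding, the rough estimate below (in Python; the domain size and grid spacings are assumptions chosen purely for illustration, not the configuration of Xue’s runs) counts how many grid points it takes to cover a single storm-scale box as the grid spacing shrinks from storm-resolving to funnel-resolving values.

```python
# Rough, illustrative grid-point count for a single simulation box.
# The 20 km x 20 km x 10 km domain and the grid spacings are assumptions made
# for this estimate; they are not the actual domains used in Xue's runs.

def grid_points(horiz_km, depth_km, spacing_m):
    """Cells needed to cover a square horizontal domain of side horiz_km and
    depth depth_km at a uniform grid spacing of spacing_m meters."""
    nh = int(horiz_km * 1000 / spacing_m)
    nz = int(depth_km * 1000 / spacing_m)
    return nh * nh * nz

for spacing in (250, 50, 12.5):  # storm-resolving down to funnel-resolving spacing
    n = grid_points(horiz_km=20, depth_km=10, spacing_m=spacing)
    print(f"{spacing:>6} m spacing -> {n:>14,} grid points")
```

Cutting the spacing in half multiplies the cell count by roughly eight, and the model’s time step typically has to shrink along with it, so the cost climbs even faster than the point count alone suggests.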
The use of Doppler radar data is important for the prediction of all weather phenomena that involve precipitation. These forecasts “require sophisticated methods and software to effectively incorporate the data into numerical prediction models, a process called data assimilation. Only a few groups in the world do this kind of work,” Xue said, “and our center, CAPS, pioneered this area of research.”
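CAPS’s assimilation systems involve far more machinery than can be shown here, but the toy sketch below (Python, with invented numbers) illustrates the core idea: blend the model’s forecast of a quantity with a noisy observation of it, weighting each by how much it is trusted.

```python
# Toy illustration of the idea behind data assimilation: blend a model forecast
# with an observation, weighting each by its expected error. This scalar example
# is an invented teaching sketch, not the assimilation scheme used by CAPS.

def assimilate(forecast, obs, forecast_var, obs_var):
    """Optimally combine a forecast and an observation of the same quantity,
    assuming both errors are unbiased with the given variances."""
    gain = forecast_var / (forecast_var + obs_var)  # trust placed in the observation
    analysis = forecast + gain * (obs - forecast)
    analysis_var = (1.0 - gain) * forecast_var
    return analysis, analysis_var

# Hypothetical numbers: the model predicts a 35 m/s low-level wind, the Doppler
# radar measures 42 m/s, and the radar is trusted more than the model.
analysis, var = assimilate(forecast=35.0, obs=42.0, forecast_var=9.0, obs_var=4.0)
print(f"analysis wind: {analysis:.1f} m/s (error variance {var:.1f})")
```

Real assimilation systems apply this kind of update to millions of model variables at once, using the spatial structure of the storm to spread information from each radar measurement, which is a large part of what makes the problem so computationally demanding.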
The ultimate goal of this process is to combine advanced sensors and numerical models to predict hazardous weather, so that the National Weather Service can provide warnings well before that weather forms, a concept called ‘Warn on Forecast.’
But before we can predict future cases, it’s necessary
to simulate past cases and produce a more complete
picture of the atmosphere than raw observations can
provide. Said Xue: “When we have good simulations, we can analyze the data and obtain a much better understanding of the processes involved in the weather phenomena.”

[Image: A still from an animation showing the predicted surface reflectivity field on the 50-m grid valid at 2213:45 UTC (a), and observed reflectivity at the 1.45° elevation of the OKC radar at 2216 UTC (b).]
On Ranger, Xue and Dawson have produced 3-D simulations of the 1999 storm at a higher resolution than any previous investigation: 12.5 meters throughout the entire thunderstorm, comprising 500 million grid points. The simulations, which used over 1 million computing hours in 2008, showed the
formation of a condensation funnel that looks very much like an actual tornado, and produced tornado paths only a few kilometers from those observed.
[A paper describing an earlier set of simulations at 100-meter resolution was recently published by Xue and his graduate student Nathan Snook in Geophysical Research Letters (December 2008).]
Based on his research and most recent simulations,
Xue has formed a new hypothesis about why tornadoes form. “Our theory says that the cloud microphysics affect a thunderstorm’s cold pool, and the
cold pool affects how the mid-level updraft and rotation are positioned relative to the low-level rotation,”
Xue explained. “We believe this relative position is a
key factor affecting whether a thunderstorm can produce a tornado or not. It explains why some thunderstorms produce tornadoes, and others, though they
may look very similar, don’t.”
Xue’s group at CAPS has been able to simulate tornadoes with unprecedented realism, especially using
their new methods for assimilating high-resolution
Doppler radar data and for synthesizing this information into 3-D visualizations of evolving storms.
However, turning these methods into a predictive tool that can say in real time where, when and how a twister will hit is still a distant dream.
“The full simulations can take up to a week to complete, but the actual tornado forecast has to be done in
a matter of minutes,” Xue clarified. “To actually use
the tools for real-time prediction, computers need to
be much faster.”
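A quick arithmetic check (Python; the week-long runtime comes from Xue’s statement, while the thirty-minute target is an assumed warning window, not a figure from the article) shows the size of the speed-up such real-time use would demand.

```python
# Back-of-the-envelope speed-up needed to turn a week-long simulation into a
# forecast available within an assumed 30-minute warning window.
SECONDS_PER_HOUR = 3600

simulation_time = 7 * 24 * SECONDS_PER_HOUR   # about a week of wall-clock time
target_time = 30 * 60                         # assumed half-hour target (hypothetical)

print(f"required speed-up: roughly {simulation_time / target_time:.0f}x")
```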
Furthermore, assessing the reliability of a prediction requires ‘ensemble’ methods, in which a simulation is run dozens of times with slightly different starting conditions to quantify the probability that a tornado will form and the track it will most likely take. Ensemble methods increase the computational workload by another order of magnitude.
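As a toy illustration of the ensemble idea, the sketch below (Python; the surrogate “forecast” and every number in it are invented for the example, not part of CAPS’s system) reruns a stand-in calculation many times with randomly perturbed inputs and reports how often it crosses an arbitrary tornado-likely threshold.

```python
# Toy ensemble: rerun a simplified, made-up "forecast" many times with perturbed
# inputs and report how often it exceeds a threshold. The surrogate model and all
# numbers are invented; a real ensemble reruns the full storm-scale model itself.
import random

def surrogate_forecast(low_level_shear, instability):
    """Stand-in for a full simulation: returns a crude rotation-strength score."""
    return 0.6 * low_level_shear + 0.4 * instability

random.seed(42)
members = 50        # number of ensemble members
threshold = 30.0    # score treated as "tornado likely" (arbitrary)

hits = 0
for _ in range(members):
    # Perturb the uncertain inputs around hypothetical best estimates.
    shear = random.gauss(35.0, 5.0)
    instability = random.gauss(28.0, 6.0)
    if surrogate_forecast(shear, instability) >= threshold:
        hits += 1

print(f"estimated tornado probability: {100 * hits / members:.0f}% ({hits} of {members} members)")
```

A real ensemble replaces the one-line surrogate with the full storm-scale model, which is exactly why the workload grows by an order of magnitude.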
Real-time, high-fidelity predictions will need to wait for the next generation of HPC systems that can compute at the multi-petascale level and beyond (several times faster than today’s fastest supercomputers).
But the field is developing quickly. Five years ago,
a high-fidelity simulation capturing both the parent
thunderstorm and an embedded tornado was impossible to produce. Five years hence, multiple simulations will be performed relatively quickly using tens
of thousands of computer processors in tandem, so
that many factors affecting the formation of tornadoes
can be studied together. Xue believes that, with next-generation supercomputers, it will be possible to resolve supercell thunderstorms in real time at a resolution of about 250 meters: not fine enough to resolve the tornado itself, but enough to give a good indication of where, when and how a tornado may occur.
“High performance computers are essential for
continued advances in weather prediction,” said Jack
Kain, a research meteorologist at the National Oceanic
and Atmospheric Administration’s National Severe
Storms Laboratory (NSSL). “This is especially true for
high impact, small-scale phenomena like tornadoes.
Xue’s pioneering tornado simulations help us to better
understand how tornadoes form and they provide
fundamental insights about the computational resources and modeling framework that will be necessary if we are to achieve our eventual goal of real-time
numerical prediction of tornadoes.”
In the meantime, Xue is working on related research
projects that will enhance real-time predictions
for tornadoes and other weather systems in different ways. This year, a National Science Foundation
(NSF)-supported field experiment (VORTEX-II) will
deploy mobile radar and other observing systems to
gather more precise data on tornadoes and tornadic thunderstorms, data valuable for simulation studies; Xue will provide real-time numerical weather prediction
in support of that field experiment. Xue also leads
the Analysis and Prediction Thrust of the NSF Engineering Research Center for Collaborative Adaptive
Sensing of the Atmosphere (CASA), a group that
develops and tests adaptive sensors to better observe the atmosphere; his computer models will be used to
tell the radars when and where to probe the thunderstorms.
Driving all this research are the powerful HPC systems at TACC. “Having access to large supercomputers like Ranger allows us to do simulations that were
not possible before and to analyze the huge volume of
data much faster,” Xue said. “They make such advanced research possible.”
###
The Ranger supercomputer is funded through the National
Science Foundation (NSF) Office of Cyberinfrastructure
“Path to Petascale” program. The system is a collaboration
among the Texas Advanced Computing Center, The University of Texas at Austin’s Institute for Computational
Engineering and Sciences, Sun Microsystems, Advanced
Micro Devices, Arizona State University, and Cornell
University.
Ranger is a key resource of the NSF TeraGrid (www.teragrid.org), a nationwide network of people, resources and services, also sponsored by the NSF Office of Cyberinfrastructure, which enables discovery in U.S. science and engineering. The TeraGrid provides scientists and researchers expertise in, and access to, large-scale computing power, networking, data analysis, and visualization systems.
Aaron Dubrow
Texas Advanced Computing Center
Science and Technology Writer
February 23, 2009
For more information, please contact:
Faith Singer-Villalobos, Public Relations, [email protected], 512.232.5771