
CASE STUDY
Titan Simulates Earthquake Physics Necessary for Safer Building Design
Researchers conduct an unprecedented GPU study of high-frequency shaking

Organization
U.S. Department of Energy
Oak Ridge Leadership Computing Facility
Oak Ridge National Laboratory
Oak Ridge, TN
www.olcf.ornl.gov
When the last massive earthquake shook the San Andreas Fault in 1906 — causing
fires that burned down most of San Francisco and leaving half the city’s population
homeless — no one would hear about “plate tectonics” for another 50 years. Needless to
say, by today’s standards, only primitive data survive to help engineers prepare southern
California for an earthquake of similar magnitude.
Cray® XK7™ ‘Titan’ supercomputer
“We haven’t had a really big rupture since the city of Los Angeles existed,” says Thomas
Jordan, Southern California Earthquake Center (SCEC) director.
The Oak Ridge Leadership Computing Facility
(OLCF) is home to Titan, the nation’s most
powerful supercomputer for open science.
A hybrid-architecture Cray XK7 system, Titan
contains advanced AMD Opteron™ central
processing units (CPUs) and NVIDIA® Tesla®
K20X graphics processing units (GPUs). With
this combination, Titan achieves 10 times
the speed and 5 times the energy efficiency
of its predecessor, the Cray XT5™ Jaguar
supercomputer. Using Titan, researchers
are getting unparalleled accuracy in
their simulations and achieving research
breakthroughs more rapidly than ever before.
System Overview
• Speed: 27 petaflops
• Cabinets: 200
• Processors: 18,688 16-core AMD Opteron
• Total cores: 299,008
• Memory/node: 32 GB (CPU) + 6 GB (GPU)
• Memory/core: 2 GB
• Total memory: 710 TB
• GPUs: 18,688 NVIDIA® Tesla® K20X
• Interconnect: Gemini
“High-frequency ground motion modeling is a
complex problem that requires a much larger
scale of computation. With the capabilities
that we have on Titan, we can approach those
higher frequencies.”
—Thomas Jordan
Director, Southern California Earthquake Center
Scientists predict this is just the quiet before the storm for cities like San Francisco and
Los Angeles, among other regions lining the San Andreas. “We think the San Andreas
Fault is locked and loaded, and we could face an earthquake of 7.5-magnitude or
bigger in the future,” Jordan says. “But the data accumulated from smaller earthquakes
in southern California over the course of the last century is insufficient to predict the
shaking associated with such large events.”
To prepare California for the next “big one,” SCEC joint researchers — including
computational scientist Yifeng Cui of the University of California, San Diego and
geophysicist Kim Olsen of San Diego State University — are using Titan, the world’s
most powerful supercomputer for open science research, to simulate earthquakes at high
frequencies, producing the more detailed predictions structural engineers need.
Titan, which is managed by the Oak Ridge Leadership Computing Facility (OLCF) located
at Oak Ridge National Laboratory (ORNL), is a 27-petaflop Cray XK7 machine with a
hybrid CPU/GPU architecture. GPUs, or graphics processing units, are accelerators that
can rapidly perform calculation-intensive work while CPUs carry out more complex
commands. The computational power of Titan enables users to produce simulations —
comprising millions of interacting molecules, atoms, galaxies, or other systems difficult to
manipulate in the lab — that are often the largest and most complex of their kind.
The SCEC’s high-frequency earthquake simulations are no exception. “It’s a pioneering study,” Olsen
says, “because nobody has really managed to get to these higher frequencies using fully
physics-based models.”
Many earthquake studies hinge largely on historical and observational data, an approach
that assumes future earthquakes will behave as they did in the past (even if the rupture
site, the geological features, or the built environment is different).
“For example, there have been lots of earthquakes in Japan, so we have all this data
from Japan, but analyzing this data is a difficult task because scientists and engineers
preparing for earthquakes in California have to ask ‘Is Japan the same as California?’ The
answer is in some ways yes, and in some ways no,” Jordan says.
The physics-based model calculates wave propagations and ground motions radiating
from the San Andreas Fault through a 3-D model approximating the Earth’s crust.
Essentially, the simulations unleash the laws of physics on the region’s specific geological
features to improve predictive accuracy.
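As a rough illustration of what “unleashing the laws of physics” means computationally, the sketch below steps a scalar wave equation forward in time on a small 3-D grid with a simple two-layer velocity model. It is a toy Python example, not the SCEC’s actual AWP-ODC code; the grid size, wave speeds, and source are invented for illustration, and the production simulations solve the full anelastic elastic-wave equations on hundreds of billions of points.

```python
# Toy sketch only: explicit finite-difference time-stepping of a scalar wave
# equation on a small 3-D grid. The real simulations solve the full anelastic
# elastic-wave equations on ~443 billion grid points; this just illustrates
# the idea of propagating waves numerically through a layered crustal model.
import numpy as np

nx = ny = nz = 64                        # toy grid dimensions (illustrative)
dx = 20.0                                # grid spacing in meters, as in the article
vel = np.full((nx, ny, nz), 3000.0)      # assumed bedrock wave speed, m/s
vel[:, :, :8] = 1500.0                   # slower "sedimentary" layer (assumed values)
dt = 0.4 * dx / vel.max()                # time step small enough for numerical stability

u_prev = np.zeros((nx, ny, nz))          # wavefield at t - dt
u_curr = np.zeros((nx, ny, nz))          # wavefield at t
u_curr[nx // 2, ny // 2, nz // 2] = 1.0  # point source standing in for the rupture

def laplacian(u):
    """Second-order finite-difference Laplacian on interior grid points."""
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1, 1:-1] = (
        u[2:, 1:-1, 1:-1] + u[:-2, 1:-1, 1:-1] +
        u[1:-1, 2:, 1:-1] + u[1:-1, :-2, 1:-1] +
        u[1:-1, 1:-1, 2:] + u[1:-1, 1:-1, :-2] -
        6.0 * u[1:-1, 1:-1, 1:-1]
    ) / dx ** 2
    return lap

for step in range(200):
    # standard explicit update: u(t+dt) = 2u(t) - u(t-dt) + (v*dt)^2 * Laplacian(u)
    u_next = 2.0 * u_curr - u_prev + (vel * dt) ** 2 * laplacian(u_curr)
    u_prev, u_curr = u_curr, u_next

print("peak surface amplitude (arbitrary units):", np.abs(u_curr[:, :, 0]).max())
```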
Seismic wave frequency, which is measured in hertz (cycles per second), is important to
engineers who are designing buildings, bridges, and other infrastructure to withstand
earthquake damage. Low-frequency waves, which cycle less than once per second (1
hertz), are easier to model, and engineers have largely been able to build in preparation
for the damage caused by this kind of shaking.
“Building structures are sensitive to different frequencies,” Olsen says. “It’s mostly the big structures like highway overpasses and high-rises
that are sensitive to low-frequency shaking, but smaller structures like single-family homes are sensitive to higher frequencies.”
But high-frequency waves (in the 2-10 hertz range) are more difficult to simulate than low-frequency waves, and there has been little
information to give engineers on shaking up to 10 hertz. “The engineers have hit a wall as they try to reduce their uncertainty about how to
prevent structural damage,” Jordan says. “There are more concerns than just building damage there, too. If you have a lot of high-frequency
shaking it can rip apart the pipes, electrical systems, and other infrastructure in hospitals, for example. Also, very rigid structures like nuclear
power plants can be sensitive to higher frequencies.”
High-frequency waves are computationally more daunting because their shorter wavelengths must be resolved on a much finer grid. And in the case of the
SCEC’s simulations on Titan, the ground is extremely detailed: representing a chunk of terrain one-fifth the size of California (including a
depth of 41 kilometers) at a spatial resolution of 20 meters. The ground models include detailed 3-D structural variations — both larger
features such as sedimentary basins as well as small-scale variations on the order of tens of meters — through which seismic waves must
travel. Along the San Andreas, the Earth’s surface is a mix of hard bedrock and pockets of clay and silt sands.
“The Los Angeles region, for example, sits on a big sedimentary
basin that was formed over millions of years as rock eroded
out of mountains and rivers, giving rise to a complex layered
structure,” Jordan says.
Soft ground like Los Angeles’s sedimentary basin amplifies
incoming waves, causing these areas to shake more over
a longer period of time than rocky ground, which means
some areas further away from the rupture site could actually
experience more infrastructure damage.
The entire simulation totaled 443 billion grid points. At every
point, 28 variables — including different wave velocities, stress,
and anelastic wave attenuation (how waves lose energy to
heat as they move through the crust) — were calculated.
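To get a feel for that scale, here is a back-of-envelope calculation based only on the figures quoted above, with single-precision (4-byte) storage assumed purely for illustration:

```python
# Back-of-envelope only: how much state 443 billion grid points with 28
# variables each represent, assuming 4-byte single-precision values.
grid_points = 443e9           # from the simulation described above
vars_per_point = 28           # wave velocities, stress, attenuation, etc.
bytes_per_value = 4           # assumption: single precision

total_values = grid_points * vars_per_point
total_bytes = total_values * bytes_per_value

print(f"values tracked:  {total_values:.1e}")            # ~1.2e13 values
print(f"memory required: {total_bytes / 1e12:.0f} TB")   # ~50 TB of Titan's 710 TB
```

Even under that conservative assumption, a single snapshot of the wavefield occupies tens of terabytes, which is one reason a machine with Titan’s aggregate memory is needed.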
“High-frequency ground motion modeling is a complex problem
that requires a much larger scale of computation,” Jordan says.
“With the capabilities that we have on Titan, we can approach
those higher frequencies.”
Back in 2010, the SCEC team used the OLCF’s 1.75-petaflop Cray
XT5™ Jaguar supercomputer to simulate an 8-magnitude earthquake along the San Andreas Fault. Those simulations peaked at 2 hertz. At
the time the Jaguar simulations were conducted, doubling wave frequency would have required a 16-fold increase in computational power.
But on Titan in 2013, the team was able to run simulations of a 7.2-magnitude earthquake up to their goal of 10 hertz, which can better
inform performance-based building design. By adapting their code, originally designed for CPUs, to run on GPUs — the Anelastic Wave Propagation
code by Olsen, Steven Day, and Cui, known as AWP-ODC — they achieved a significant speedup. The simulations ran 5.2 times faster than
they would have on a comparable CPU machine without GPU accelerators.
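The 16-fold figure mentioned above can be understood with a simple scaling argument, sketched below under the usual assumption that doubling the maximum resolved frequency halves the required grid spacing in each spatial dimension and, for a stable explicit scheme, halves the time step as well:

```python
# Sketch of the scaling argument (an illustration, not SCEC's analysis):
# doubling the resolved frequency halves the wavelength, so the grid spacing
# is halved in x, y, and z, and the time step shrinks by the same factor.
freq_factor = 2                      # e.g. pushing from 2 Hz toward 4 Hz
spatial_cost = freq_factor ** 3      # 2x more grid points in each of three dimensions
temporal_cost = freq_factor          # 2x more time steps for the same simulated duration

print("compute increase:", spatial_cost * temporal_cost, "x")   # prints: compute increase: 16 x
```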
“We redesigned the code to exploit high performance and throughput,” Cui says. “We made some changes in the communications schema
and reduced the communication required between the GPUs and CPUs, and that helped speed up the code.”
The SCEC team anticipates simulations on Titan will help improve its CyberShake platform, which is an ongoing sweep of millions of
earthquake simulations that model many rupture sites across California.
“Our plan is to develop the GPU codes so the codes can be migrated to the CyberShake platform,” Jordan says. “Overcoming the computational
barriers associated with high frequencies is one way Titan is preparing for this progression.”
Utilizing hybrid CPU/GPU machines in the future promises to substantially reduce the computational time required for each simulation,
which would enable faster analyses and hazard assessments. And it is not only processor-hours that matter but real time as well. The 2010
San Andreas Fault simulations took 24 hours to run on Jaguar, but the higher-frequency, higher-resolution simulations took only five and a
half hours on Titan.
And considering the “big one” could shake California anytime in the next few years to the next few decades, accelerating our understanding
of the potential damage is crucial to SCEC researchers.
“We don’t really know what happens in California during these massive events, since we haven’t had one for more than 100 years,” Jordan
says. “And simulation is the best technique we have for learning and preparing.”
Content courtesy of Oak Ridge National Laboratory and the Oak Ridge Leadership Computing Facility