Slide - SIAM Student Chapter at VT

Research Computing at Virginia Tech
Advanced Research Computing
Outline
• ARC Overview
• ARC Resources
• Training & Education
• Getting Started
ARC OVERVIEW
Terascale Computing Facility
 2,200-processor Apple G5 cluster
 10.28 teraflops; #3 on the November 2003 Top500 list
Advanced Research Computing (ARC)
• Unit within the Office of the Vice President for
Information Technology
– Office of Vice President for Research
• Provide centralized resources for:
– Research computing
– Visualization
• Staff to assist users
• Website: http://www.arc.vt.edu/
Goals
• Advance the use of computing and
visualization in VT research
• Centralize resource acquisition, maintenance,
and support for research community
– HPC Investment Committee
• Provide support to facilitate usage of
resources and minimize barriers to entry
• Enable and participate in research
collaborations between departments
Personnel
• Terry Herdman, Associate VP for Research
Computing
• BD Kim, Deputy Director, HPC
• Nicholas Polys, Director, Visualization
• Computational Scientists
– Justin Krometis
– James McClure
– Gabriel Mateescu
• User Support GRAs
ARC RESOURCES
Computational Resources
• BlueRidge – large-scale Linux cluster
• HokieSpeed – CPU/GPU cluster
• HokieOne – SGI UV SMP machine
• Athena – data analysis and visualization cluster
• Ithaca – IBM iDataPlex
• Dante – Dell R810
• Other resources for individual research groups
BlueRidge – Large-Scale Cluster
• Resources for running jobs
– 318 dual-socket nodes with 16 cores/node
– Each socket is an eight-core Intel Sandy Bridge-EP Xeon
– 4 GB/core, 64 GB/node
– Total: 5,088 cores, 20 TB memory
• Two login nodes and two admin nodes
– 128 GB/node
• Interconnect: quad-data-rate (QDR) InfiniBand
• Top500 #402 (November 2012)
• Requires an allocation to run (the only ARC system that does)
• Released to users on March 20, 2013
Allocation System
• Like a bank account for system units
– Jobs run are deducted from allocation account
• Project PIs (i.e., faculty) request allocation for
research project
– Based on research output of project (papers, grants)
and type of computing/software used
– Once approved, add other users (faculty, researchers,
students)
• Only applies to BlueRidge (no allocation required
to run on other ARC systems)
HokieSpeed – CPU/GPU Cluster
• 206 nodes, each with:
– Two 6-core 2.40 GHz Intel Xeon E5645 CPUs and
24 GB of RAM
– Two NVIDIA M2050 Fermi GPUs (448 cores each)
• Total: 2,472 CPU cores, 412 GPUs, 5 TB of RAM
• Top500 #221, Green500 #43 (November 2012)
• 14-foot by 4-foot 3D visualization wall
• Intended use: large-scale GPU computing
• Available to NSF grant co-PIs
HokieOne - SGI UV SMP System
• 492 Intel Xeon 7542 (2.66GHz) cores
– Two six-core sockets per blade (12 cores/blade)
– 41 blades for apps; one blade for system + login
• 2.6TB of Shared Memory (NUMA)
– 64 GB/blade, blades connected with NUMAlink
• SUSE Linux 11.1
• Recommended Uses:
– Memory-heavy applications
– Shared-memory (e.g. OpenMP) applications
Athena – Data Analytics Cluster
• 42 quad-socket, octa-core 2.3 GHz AMD Magny-Cours nodes (total: 1,344 cores, 12.4 TFLOP peak)
• 32 NVIDIA Tesla S2050 GPUs
– 6 GB GPU memory
• Memory: 2 GB/core (64 GB/node, 2.7 TB Total)
• Quad-data-rate (QDR) InfiniBand
• Recommended uses:
– GPU Computations
– Visualization
– Data intensive applications
Ithaca – IBM iDataPlex
• 84 dual-socket quad-core Nehalem 2.26 GHz
nodes (672 cores in all)
– 66 nodes available for general use
• Memory (2 TB Total):
– 56 nodes have 24 GB (3 GB/core)
– 10 nodes have 48 GB (6 GB/core)
• Quad-data-rate (QDR) InfiniBand
• Recommended uses:
– Parallel Matlab
– ISV apps needing x86/Linux environment
Dante (Dell R810)
• 4 octa-socket, octa-core nodes (256 cores in all)
• 64 GB RAM
• Intel x86 64-bit, Red Hat Enterprise Linux 5.6
• No queuing system
• Recommended uses:
– Testing, debugging
– Specialty software
Visualization Resources
• VisCube: 3D immersion environment with
three 10′ by 10′ walls and a floor of
1920×1920 stereo projection screens
• DeepSix: Six tiled monitors with combined
resolution of 7680×3200
• Athena GPUs: Accelerated rendering
• ROVR Stereo Wall
• AISB Stereo Wall
EDUCATION & TRAINING
Spring 2013 (Faculty Track)
1. Intro to HPC (13 Feb)
2. Research Computing at VT (20 Feb)
3. Shared-Memory Programming in OpenMP (27 Feb)
4. Distributed-Memory Programming using MPI (6 Mar)
5. Two-session courses:
   1. Visual Computing (25 Feb, 25 Mar)
   2. Scientific Programming with Python (1 Apr, 8 Apr)
   3. GPU Programming (10 Apr, 17 Apr)
   4. Parallel MATLAB (15 Apr, 22 Apr)
Workshops
• Offered last: January 2013, August 2012
• Two days, covering:
– High-performance computing concepts
– Introduction to ARC’s resources
– Programming in OpenMP and MPI
– Third-party libraries
– Optimization
– Visualization
• Next offered: Summer 2013?
Other Courses Offered
• Parallel Programming with Intel Cilk Plus (Fall 2012)
• MATLAB Optimization Toolbox (ICAM)
Others being considered/in development:
• Parallel R
Graduate Certificate (Proposed)
• Certificate Requirements (10 credits)
– Two core courses, developed and taught by ARC computational scientists:
• Introduction to Scientific Computing & Visualization (3 credits)
• Applied Parallel Computing for Scientists & Engineers (3 credits)
– A selection of existing coursework (3 credits; list provided in proposal draft)
– HPC&V seminar (1 credit)
– Interdisciplinary coursework (3 credits, optional)
• Administration
– Steering/Admissions Committee
– Core faculty: develop the courseware and seminar, serve as PhD committee members
– Affiliate faculty: instruct existing courses, give guest lectures, etc.
Proposed Core Courses & Content
• Introduction to Scientific Computing & Visualization
– Programming environment in HPC
– Numerical analysis
– Basic parallel programming with OpenMP and MPI
– Visualization tools
• Applied Parallel Computing for Scientists & Engineers
– Advanced parallelism
– Hybrid programming with MPI/OpenMP
– CUDA/MIC programming
– Optimization and scalability of large-scale HPC applications
– Parallel & remote visualization and data analysis
GETTING STARTED ON ARC’S SYSTEMS
Getting Started Steps
1. Apply for an account (all users)
2. Apply for an allocation (PIs only for projects
wishing to use BlueRidge)
3. Log in to the system via SSH
4. System examples
a. Compile
b. Submit to scheduler
5. Compile and submit your own programs
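As a sketch of steps 4–5, a batch job on a PBS/TORQUE-style scheduler (the kind ARC's clusters used at the time) looks like the script below. The queue name, module name, and allocation string are placeholders, not ARC's actual values; check the New Users Guide linked below for the real ones.

```shell
#!/bin/bash
# Sample batch job script. Lines beginning with #PBS are scheduler
# directives, read by qsub, not by the shell.
#PBS -l nodes=2:ppn=16        # 2 nodes, 16 cores each (BlueRidge node size)
#PBS -l walltime=01:00:00     # wall-clock limit of 1 hour
#PBS -q normal_q              # placeholder queue name
#PBS -A yourAllocation        # allocation account (BlueRidge only)

cd $PBS_O_WORKDIR             # run from the directory qsub was called in
module load mpi               # placeholder module name; site-specific
mpirun -np 32 ./my_program    # launch 32 MPI ranks (2 nodes x 16 cores)
```

Submit with `qsub myjob.sh`; `qstat` shows the job's state, and the scheduler writes the job's stdout/stderr files into the submission directory when it completes.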
Resources
• ARC Website: http://www.arc.vt.edu
• ARC Compute Resources & Documentation:
http://www.arc.vt.edu/resources/hpc/
• Allocation System:
http://www.arc.vt.edu/userinfo/allocations.php
• New Users Guide:
http://www.arc.vt.edu/userinfo/newusers.php
• Training:
http://www.arc.vt.edu/userinfo/training.php
Research Projects at the VT Interdisciplinary
Center for Applied Mathematics
Terry L. Herdman
Associate Vice President for Research Computing
Director, Interdisciplinary Center for Applied Mathematics
Professor of Mathematics
Virginia Tech
ICAM History
Founded in 1987 to promote and facilitate interdisciplinary
research and education in applied and computational
mathematics at Virginia Tech. Currently, ICAM has 45 members
from 10 departments, 2 colleges, VBI and ARC.
o Named SCHEV Commonwealth Center of Excellence in 1990.
o Named DOD Center of Research Excellence & Transition in 1996.
o Received more than $25 Million in external funding from
federal sources and numerous industrial partners.
o Received several MURI and other large center grants.
o Leader of the VT effort on the Energy Efficient Building HUB (EEB)
AGILITY - INGENUITY - INTEGRITY
DON’T OVER PROMISE
KEEP SCIENTIFIC CREDIBILITY & REPUTATION
BUILD EXCELLENT WORKING RELATIONSHIPS WITH INDUSTRY AND
NATIONAL LABORATORIES
MATHEMATICAL MODELS FOR MANY DIFFERENT PROBLEMS
Sources of ICAM’s Funding
Department of Defense
o AIR FORCE OFFICE OF SCIENTIFIC RESEARCH – AFOSR
o DEFENSE ADVANCED RESEARCH PROJECTS AGENCY – DARPA
o ARMY RESEARCH OFFICE – ARO
o OFFICE OF NAVAL RESEARCH – ONR
o ENVIRONMENTAL TECHNOLOGY DEMONSTRATION & VALIDATION PROGRAM – ESTCP
o VARIOUS AIR FORCE RESEARCH LABS – AFRL
 Flight Dynamics Lab - Weapons Lab - Munitions Lab
Other Agencies
o NATIONAL SCIENCE FOUNDATION – NSF
o NATIONAL AERONAUTICS AND SPACE ADMINISTRATION – NASA
o FEDERAL BUREAU OF INVESTIGATION – FBI
o DEPARTMENT OF HOMELAND SECURITY – DHS
o DEPARTMENT OF ENERGY – DOE (EERE, ORNL)
o NATIONAL INSTITUTES OF HEALTH – NIH (IDIQ CONTRACT PROPOSAL)
Industry Funding Sources
AEROSOFT, INC. - BABCOCK & WILCOX - BOEING AEROSPACE - CAMBRIDGE HYDRODYNAMICS - COMMONWEALTH SCIENTIFIC CORP. - HONEYWELL - HARRIS CORP. - LOCKHEED - SAIC - TEKTRONIX - UNITED TECHNOLOGIES - SOTERA DEFENSE SOLUTIONS…
Industry-National Lab Partners
Boeing (Seattle) - Honeywell (Minneapolis) - United Technologies (Hartford) - Tektronix (Beaverton) - LBNL, DOE Lab (Berkeley) - LLNL, DOE Lab (Livermore) - NASA (Ames) - Lockheed (Los Angeles) - NREL, DOE Lab (Golden) - Sandia (Albuquerque) - Air Force AFRL (Albuquerque) - Air Force Flight Dynamics (Dayton) - Babcock & Wilcox (Lynchburg) - ORNL (Oak Ridge) - AeroSoft (Blacksburg) - Air Force AEDC (Tullahoma) - Air Force Munitions Lab (Eglin) - Nestlé (Ludwigsburg, Germany) - SAIC (McLean) - NASA (Langley) - Harris Corp. (Melbourne) - Deutsche Bank (Frankfurt, Germany)
International Collaborations
ICAM Team
o 10 Academic Departments
o 2 Colleges
o VBI
o ARC - IT
2010 - 2011 CORE MEMBERS
(Faculty - Department - College/Institute)
Ball, Joseph A. - Mathematics - Science
Baumann, William T. - Electrical Engineering - Engineering
Beattie, Christopher - Mathematics - Science
Borggaard, Jeff - Mathematics - Science
Broadwater, Robert - Electrical Engineering - Engineering
Burns, John A. - Mathematics - Science
Ball, Ken - Mechanical Engineering - Engineering
Cliff, Eugene M. - Aerospace Engineering - Engineering
Day, Martin V. - Mathematics - Science
De Vita, Raffaella - Engr. Science & Mechanics - Engineering
Diplas, Panayiotis - Civil Engineering - Engineering
Gugercin, Serkan - Mathematics - Science
Hagedorn, George A. - Mathematics - Science
Herdman, Terry L. - Mathematics - Science
Iliescu, Traian - Mathematics - Science
Inman, Daniel J. - Mechanical Engineering - Engineering
Kapania, Rakesh K. - Aerospace Engineering - Engineering
Kim, Jong U. - Mathematics - Science
Kohler, Werner E. - Mathematics - Science
Laubenbacher, Reinhard - Bioinformatics Institute - VBI
Lin, Tao - Mathematics - Science
Lindner, Douglas K. - Electrical Engineering - Engineering
Marathe, Madhav - Bioinformatics Institute - VBI
Neu, Wayne L. - Aerospace Engineering - Engineering
Pierson, Mark - Mechanical Engineering - Engineering
Polys, Nicholas - Research Computing - Information Technology
Prather, Carl L. - Mathematics - Science
Puri, Ishwar - Engr. Science and Mechanics - Engineering
Renardy, Michael - Mathematics - Science
Renardy, Yuriko - Mathematics - Science
Ribbens, Calvin - Computer Science - Engineering
Rogers, Robert C. - Mathematics - Science
Russell, David - Mathematics - Science
Sachs, Ekkehard - Mathematics - Science
Santos, Eunice - Computer Science - Engineering
Shinpaugh, Kevin - Research Computing - Information Technology
Spanos, Aris - Economics - Science
Sun, Shu-Ming - Mathematics - Science
Tyson, John J. - Biology - Science
Vick, Brian - Mechanical Engineering - Engineering
Watson, Layne T. - Computer Science - Engineering
Wheeler, Robert L. - Mathematics - Science
Williams, Michael - Mathematics - Science
1 staff person: Misty Bland

CURRENT ASSOCIATE MEMBERS *
J. T. Borggaard - Mathematics - Science
J. A. Burns - Mathematics - Science
E. M. Cliff - Aerospace & Ocean Engr. - Engineering
T. L. Herdman - Mathematics - Science
S. Gugercin - Mathematics - Science
T. Iliescu - Mathematics - Science
D. J. Inman - Mechanical Engineering - Engineering
Reinhard Laubenbacher - Discrete Modeling - VBI
Madhav Marathe - Simulation - VBI
Henning Mortveit - Simulation - VBI
Nicholas Polys - Visualization - ARC-IT
Kevin Shinpaugh - HPC - ARC-IT
L. Zietsman - Mathematics - Science
* DEPENDS ON CURRENT PROJECTS & FUNDING
ICAM History of Interdisciplinary Projects
Advanced Control - Nano Technology - Homeland Security - HPC/CS&E - Space Platforms - Life Sciences (H1N1, immune system, cancer, HIV) - Design of Jets - Energy Efficient Buildings
Good News / Bad News
 Good News
 Every IBG Science Problem has a Mathematics Component
 Bad News
 No IBG Science Problem has only a Mathematics Component
W.R. Pulleyblank
Director, Deep Computing Institute
Director, Exploratory Server Systems
IBM Research
Two Applications to Aerospace
 Past Application: Airfoil Flutter
 New Application: Next Generation Large Space Systems
FEEDBACK CONTROL OF
FLUID/STRUCTURE INTERACTIONS
ICAM - Interdisciplinary Center For Applied Mathematics
Stealth
• Began as an unclassified project at DARPA in the early
1970s
• Proved that physically large objects could still have a
minuscule RCS (radar cross section)
• The challenge was to make it fly!
ICAM History of Interdisciplinary Projects
• 1987 - 1991, DARPA, $1.4 M: An Integrated Research Program for the Modeling, Analysis and Control of Aerospace Systems (University Research Initiative Center Grant; VT-ICAM team with NASA and USAF; X-29)
• 1993 - 1997, USAF, $2.76 M: Optimal Design and Control of Nonlinear Distributed Parameter Systems (MURI; team: VT-ICAM, NC State, Boeing, Lockheed, USAF; F-117A)
• DARPA also provided funds for the renovation of Wright House, ICAM's home since 1989
• 09/14/97: F-117A crash caused by flutter
• MURI topic: Control of Air Flows
Mathematical Research
• Motivated by problems of interest to industry, business, and
government organizations as well as the science and
engineering communities
• Mathematical framework: both theoretical and computational
• Projects require expertise in several disciplines
• Projects require HPC
• Projects require computational science: modeling, analysis,
algorithm development, optimization, visualization
University Research Team
• John Burns, Graciela Cerezo, Dennis Brewer, Elena Fernandez,
Herman Brunner, Brian Fulton, Gene Cliff, Z. Liu, Yanzhao Cao,
Hoan Nguyen, Harlan Stech, Diana Rubio, Janos Turi,
Ricardo Sanchez Pena, Dan Inman, Kazufumi Ito
• 8 undergraduate students, 10 graduate students
Research Support and Partners
 AFOSR
 DARPA ACM and SPO
 NASA- LaRC
 NIA
 Flight Dynamics Lab, WPAFB
 Lockheed Martin
Build Math Model
• Start simple
• Use and keep the physics (science)
• Use and keep engineering principles
• Do not try to be an expert in all associated
disciplines – use an interdisciplinary team
• Learn enough so that you can communicate
• Know the literature
• Computational/experimental validation
Spring Mass System
h(t): plunge
α(t): pitch angle
β(t): flap angle
Pitching, plunging, and flap motions of the airfoil:
\[ M\ddot z(t) + B\dot z(t) + K z(t) = \tfrac{1}{m}\,F(t) \]
\[ z(t) = [h(t), \alpha(t), \beta(t)]^T, \qquad F(t) = [L(t), M_\alpha(t), M_\beta(t)]^T \]
Force (lift): L(t) combines quasi-steady terms in the airspeed U and the airfoil motion \dot z(t) with an integral of the circulation over all past times.
Note: lift depends on past history
Evolution Equation for Airfoil Circulation:
\[ \Gamma(t) = \int_{-1}^{1} \gamma_a(t,x)\,dx, \qquad \gamma_a(t,x) \text{ a known function} \]
The circulation satisfies a delay equation over its entire past history, with the weakly singular kernel
\[ k(s) = \left[\frac{Us - 2}{Us}\right]^{1/2} \]
Initial data: \Gamma(0) = \Gamma_0, with past history \Gamma(s) = \varphi(s) for s \in (-\infty, 0)
Mathematical Model
• Change the 2nd-order ODE to a 1st-order system
• Couple the ODE with the evolution equation
• The past history of the circulation function provides part of the initial
conditions
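The first step above is the standard order reduction; a sketch in the slides' notation, treating the force F(t) as given:

```latex
% Let w(t) = (z(t), \dot z(t)); then M\ddot z + B\dot z + Kz = \tfrac{1}{m}F(t)
% becomes the first-order system
\frac{d}{dt}
\begin{pmatrix} z(t) \\ \dot z(t) \end{pmatrix}
=
\begin{pmatrix} 0 & I \\ -M^{-1}K & -M^{-1}B \end{pmatrix}
\begin{pmatrix} z(t) \\ \dot z(t) \end{pmatrix}
+
\begin{pmatrix} 0 \\ \tfrac{1}{m} M^{-1} F(t) \end{pmatrix}
```

Since L(t) inside F(t) depends on the past circulation, coupling this system with the evolution equation for \Gamma(t) is what produces the delayed terms in the complete model.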
Complete Mathematical Model
\[ \frac{d}{dt}\left[ A\,x(t) + \int_{-\infty}^{0} A(s)\,x(t+s)\,ds \right] = B\,x(t) + \int_{-\infty}^{0} B(s)\,x(t+s)\,ds \]
\[ x(t) = [h(t), \alpha(t), \beta(t), \dot h(t), \dot\alpha(t), \dot\beta(t), \Gamma(t), \dot\Gamma(t)]^T \]
• A is a singular 8-by-8 matrix: last row zeros
• A(s): A_{8i}(s) = 0, i = 1, 2, …, 7
• A_{88}(s) = [(Us - 2)/Us]^{1/2}, U constant
• B is a constant matrix; B(s) is smooth
• This is a non-atomic neutral functional differential equation (NFDE)
Non-Atomic NFDE
• Need a theory of non-atomic NFDEs:
• Well-posedness results
• Approximation techniques
• Parameter identification
• Validation of the model
Abstract Cauchy Problem
NFDE: \frac{d}{dt} D x_t = L x_t, \quad x_0 = \varphi
ACP: \dot z(t) = A z(t), \quad z(0) = z_0
A is the infinitesimal generator of a C_0 semigroup T(t), so z(t) = T(t) z_0
Capture x (or x_t) from z
ISAT – Innovative Space Based Radar Antenna Technology
• Canister, inflatable booms, antenna surface
• 300 m long truss structure, 1000 m² antenna
• Fly in 2009
• Launched in a container the size of a small SUV
Next Generation Space Systems
• Develop and deploy large space antennas
• Take advantage of new materials
• Take advantage of inflatable technology
• Joint effort: DARPA, NASA LaRC, NIA, and Virginia Tech
ICAM, with Boeing, Lockheed Martin, JPL, Harris Corp.,
AFRL, and others
• ICAM – build physics-based mathematical models for
simulation and control (after deployment)
• NASA/AFRL – experiments, testing, development,
packaging, and deployment
Build Math Model
• Dr. Joe Guerci, SPO, DARPA
• Remember: obey all physics (science) laws
• Need experts in all associated disciplines – an
interdisciplinary team
• Must communicate across disciplines and
organizations
• Know the literature
• Computational/experimental validation
New Mathematical Models
Including Thermal Effects Changes Everything
Structural equation (beam with viscoelastic memory and thermal coupling):
\[ \frac{\partial^2 y(t,x)}{\partial t^2} + \frac{\partial^2}{\partial x^2}\left[ EI\,\frac{\partial^2 y(t,x)}{\partial x^2} + \int_0^\infty \alpha_0(s)\,\frac{\partial^2 y(t-s,x)}{\partial x^2}\,ds \right] = b(x)\,u(t) + \alpha_2\,\frac{\partial^2 \theta(t,x)}{\partial x^2} \]
ADD THERMAL EQUATIONS:
\[ \frac{\partial \theta(t,x)}{\partial t} = \alpha_2\,\frac{\partial^2 \theta(t,x)}{\partial x^2} + \alpha_3\,\frac{\partial^3 y(t,x)}{\partial x^2\,\partial t} + f(t,x) \]
MORE ACCURATE – MORE COMPLEX – MORE DIFFICULT
Necessary Components for Success
• research expertise in many areas – an
interdisciplinary team
• experience (knowledge of what may work)
• MATHEMATICS
• external support
• state-of-the-art computing facilities
• GRAs and young research faculty (new ideas)