High Performance Computing (HPC) and
Activities of Computer Centre, IIT Kanpur
A presentation to the Board of Governors, IIT Kanpur
May 28, 2010
Amalendu Chandra
Head, Computer Centre
IIT Kanpur
Acknowledgments
• Board of Governors, IIT Kanpur
• CC Engineers, HPC Group
• IWD Engineers
Computer Centre
CC@IITK has a glorious history
The Centre was established in 1964 in the Western Labs under the
Department of Electrical Engineering.
It moved to its present building in 1969, when it was recognized as
an independent department of the Institute.
IBM-7044
IBM-1620 was the first computer acquired by IIT Kanpur, followed by
the IBM-7044 in 1966 and then an IBM-1401.
Several specialized computers such as the IBM-1800 and PDP-1 were
added in subsequent years.
The next major upgrade was the addition of the DEC-1090 mainframe in
1979, the Institute's first time-sharing computer and the first with
terminals.
In 1989, CC purchased superminicomputers of the HP 9000 series.
PDP-1
History of Computer Centre
In 1987, the first PC lab was set up, providing a DOS environment.
The Convex 220 was set up in 1990, giving CC its first mini-supercomputer.
The IBM SP2, CC's first parallel computer, was set up in 1999.
HP Servers
Email service ran under the ERNET project in the early 1990s and was
moved to the Computer Centre around 1994.
In 1995 the Campus Network was upgraded to 100
Mbps Fiber Backbone and 10 Mbps UTP Access
Network.
A 64 Kbps Internet link was set up in 1998; today the bandwidth has
increased to more than 1 Gbps.
Linux Cluster (SUN & HP) set up in 2004-05
Convex-220
Current Staff of Computer Centre
Principal Computer Engineer  : 2 + 1 (on deputation)
Senior Computer Engineer     : 3 + 3 (OA)
Computer Engineer            : 1 + 1 (retiring soon)
Jr. Technical Superintendent : 3
Jr. Technician               : 2
Facilities Provided by CC
The Computer Centre provides state-of-the-art computing, email,
Internet and other facilities 24 hours a day, 365 days a year, for
more than 7500 users.
Major facilities provided by CC are:
Computing hardware
Application software
Campus Network
Email and Internet
Linux and Windows Labs
File Storage and Backup
Services
Hosting of IITK website
Office Automation (under DD)
Technical support via phone and email
Maintenance of PCs and peripherals
Activities
• 4-5 hours of classes for UG and PG per day at CC
• Students work on computing and CAD assignments using CC Lab facilities
• User training / education / familiarization / help with compilation, coding, etc.
• Workshops and seminars
• Troubleshooting
• Research and related activity
• Maintenance of hardware and software
• Software installation and upgradation
• Development and installation of regularly used software (in-house)
• Mass user registration / authentication and login id distribution for support
• Support for
  – Alumni Office, DRPG lists, Office Automation, OARS, regular courses mailing,
    conferences, short term courses, mass internal mailing and announcements,
    No Dues, supply and repair of PCs for administration and other sections,
    web hosting
• Security
• Mail, Domain Name Service, networking, software download sites,
  mirror site maintenance
Computing Facility
Computer Centre has 3 4-CPU master nodes and 146 dual-CPU compute
nodes in the cluster, amounting to 292 cores of computing power
connected over a 1 Gbps LAN.
Applications - Gaussian, Linda, CHARMM, FEM and differential-equation
solvers, Molpro, parallel libraries, etc.
Environment - parallel as well as sequential, on an open-source OS
A new facility of about 3000
cores of very fast compute
nodes with 40 Gbps Infiniband
interconnect is in the pipeline.
Storage and Backup
Storage
One 33 TB HP StorageWorks EVA8000 Enterprise Virtual Array and
one 6 TB SUN StorEdge 6120
File Service
One HP StorageWorks Clustered File System (PolyServe Symmetric
Cluster File System) and one virtual file service (PolyServe Matrix
Server)
Backup Service
One backup server for users' home directories and users' mail with
HP MSL6000 DP
Backup Policy: daily incremental backup, saved for one week; weekly
incremental backup, saved for one month; and monthly full backup,
saved for one year.
HP EVA 8000 Enterprise VA
New Storage:
100 TB for HPC
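The three-tier retention policy above can be sketched as a small check. This is an illustrative sketch only: the tier names and the 30-day and 365-day approximations are our assumptions, not the configuration of CC's actual backup software.

```python
from datetime import date, timedelta

# Assumed retention windows for the three tiers described in the policy.
RETENTION = {
    "daily-incremental": timedelta(weeks=1),    # saved for one week
    "weekly-incremental": timedelta(days=30),   # "one month" taken as 30 days
    "monthly-full": timedelta(days=365),        # saved for one year
}

def expired(kind: str, taken_on: date, today: date) -> bool:
    """True once a backup of the given tier is past its retention window."""
    return today - taken_on > RETENTION[kind]
```

For example, a daily incremental taken on May 1, 2010 would already be expired by the date of this presentation, while a monthly full from the same day would be kept until May 2011.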
Linux and Windows Environment
Linux Working Environment
• Three labs equipped with 143 latest-configuration PCs running
  Ubuntu and Fedora.
• 9 Computational Servers.
• Software Installed for Simulations and Modelling, Optimization etc.
Windows Working Environment
• Domain Controller and License Server, SAMBA Server and Deployment
Server
• Two labs equipped with 75 latest-configuration PCs
• Software installed across a wide range of categories
Email Setup
• Approximately 7500 mail boxes
• Quota ranging from 500MB to 1500 MB
• Around 250000 mails are sent and received every day, of which
  about 90% of incoming mail is spam, filtered by a Barracuda SPAM
  firewall
• Both Linux Postfix and Microsoft Exchange
platforms provided
• The setup is a state-of-the-art, enterprise-class deployment
  providing high availability and fault tolerance.
Email Architecture
(Diagram: mail flow for iitk.ac.in and other served domains
(iitkalumni.org, security.iitk.ac.in, antaragni.iitk.ac.in,
techkriti.org, cse.iitk.ac.in, etc.). Inbound mail from the Internet
passes through an SMTP server and a spam-filter pair to a mail-hub
pair and the local mail store; users reach their mail through the
webmail server (HTTP), IMAP/POP, and MS Exchange protocols; a local
lists server and authenticated/unauthenticated SMTP handle outbound
mail from SMTP/POP clients.)
Current Network Setup
Institute Gigabit LAN (Local Area Network) with more than 15000
nodes covering Academic Area and Student Hostels.
2 Core Switches, ~50 Distribution
Switches, ~800 Access Switches
Gigabit Fiber Optic Backbone Network
with more than 21 Kms of Fiber laid in
the Campus
Fully Managed Network
1 Gbps Internet Bandwidth from
Airtel
Backup Bandwidth from Reliance and
NKN
Overlay Wi-Fi Network in the Academic Area with ~500 Access Points
Internet Application Servers for
providing Web, Mail, Proxy, DNS
and other Internet services to the
users.
Network Architecture
(Diagram: the Internet cloud connects through the Airtel and BSNL
MUXes and a Cisco 7200 router to a Fortigate 3600A UTM pair in HA
mode; an L2 switch fronts the Cisco 6500 core switches, which feed
Cisco 3750 distribution switches, Cisco 2960G access switches and
Cisco 1131 access points.)
Network: Immediate and Long Term Plans
Immediate Plans:
Expand the network to accommodate new buildings/
facilities and increasing student strength
Provide 1 Gbps LAN in the residential area
Long Term Plans:
Build an All-IP Network to provide an integrated Voice-Video-Data network
Provide IP-based voice/video phones and desktop conferencing facility
Cyber Security
In view of the requirement to strengthen cyber security, the
following steps have been taken in the recent past:
All the switches have been replaced with managed switches.
This allows binding an IP address with every network port, which
helps in tracing the machine/individual involved in any hacking
incident.
A CCTV IP camera has been installed at the entrance of CC
to record the entry and exit of users.
A Fortigate UTM (Unified Threat Management) device has
been installed at the Internet Gateway to monitor and control
the Internet Traffic and allow tighter control on the traffic to
Internet Application Servers for better security.
Measures to Improve Cyber Security
Block all the unused TCP/UDP ports for Internet Application
Servers on the Internet Gateway Firewall.
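The default-deny idea behind this measure can be sketched as a tiny port filter. This is a minimal illustration, not the actual gateway firewall configuration; the set of allowed (protocol, port) pairs is an assumption chosen for the example.

```python
# Hypothetical default-deny filter for Internet Application Servers:
# traffic is dropped unless its (protocol, port) pair is explicitly allowed.
ALLOWED = {
    ("tcp", 25),   # SMTP (mail)
    ("tcp", 80),   # HTTP (web)
    ("tcp", 443),  # HTTPS (web, secure)
    ("udp", 53),   # DNS
}

def verdict(proto: str, port: int) -> str:
    """Return 'ACCEPT' for whitelisted services, 'DROP' for everything else."""
    return "ACCEPT" if (proto, port) in ALLOWED else "DROP"
```

The design point is that unused ports need no individual blocking rules: anything not listed is dropped by default.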
Implement electronic access control based on identity in all CC
labs. In addition, install CCTV cameras in all the labs.
Implement a DHCP-based IP address allocation policy. A machine
should be able to use the network only if it has been allocated an
IP address through DHCP.
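The policy can be sketched as a registration-gated lease check: only machines whose MAC addresses are pre-registered receive an address. This is an illustrative sketch; the MAC table and the address pool below are invented for the example and are not CC's actual DHCP configuration.

```python
from typing import Optional

# Hypothetical table of registered MAC addresses and their reserved IPs.
REGISTERED = {
    "00:1a:2b:3c:4d:5e": "172.31.1.10",
    "00:1a:2b:3c:4d:5f": "172.31.1.11",
}

def allocate(mac: str) -> Optional[str]:
    """Return the reserved IP for a registered MAC, or None (no network access)."""
    return REGISTERED.get(mac.lower())
```

An unregistered machine gets no lease (None), and hence no usable address on the network, which is exactly the enforcement the policy describes.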
Implement Network authentication for both wired and
wireless network. Make provision for issuing temporary ids
for visitors.
Implement Wireless Intrusion Detection and Prevention
system.
Implement more secure authentication than the currently followed
scheme.
Implement better security for system logs on the individual
servers. This will help trace the attack.
Activities Underway
Windows and Linux Labs for more than 200 users in
New Core Building
Gigabit LAN connectivity in Residences
New Mail Storage (25 TB)
New UPS (900 KVA)
GPU servers and high end workstations
New HPC facility
Why HPC ?
To solve complex problems in science and engineering
Higher resolution simulations for longer time
Sometimes experiments cannot be done !!
Computational experiments can be used to
simulate extreme conditions
Vast expertise in Numerical Methods
(Diagram: HPC at the centre, linked to Applications, Basic Science,
Visualization, Hardware, Support Systems and Numerical Algorithms.)
More than 100 faculty members across various disciplines are
involved in computing.
(Bar chart, CHM+PHY+AE+BSBE+CE+CHE+EE+ME+MME: 255 faculty overall,
125 involved in computing.)
HPC Research @IITK
Multiscale, Adaptive Finite Element Methods using Domain Decomposition
Flow Past Bodies with Complex Geometry and Corners
Pseudo-spectral Turbulence Simulations
Enhanced Oil Recovery
Analysis of Aircraft Structures
Stress Analysis and Composite Materials
Virtual Reality
Vibration and Control
Computational Chemistry
Semiconductor Physics, Feynman Integrals
Nanoblock Self Assembly
Molecular Simulation (Molecular Dynamics & Monte Carlo Methods)
Thermal and Hydraulic Turbomachinery
Numerical Weather Prediction
Turbulence Modelling through RANS
Statistical Thermodynamics
Optimization
Vortex Dominated Flows and Heat Transfer
Geo-seismic Prospecting
Flow Induced Vibrations
Large Eddy Simulation of Turbulence
Electronic Structure Calculations of Large Systems
Aggregation and Etching
Quantum Simulations
Thin Film Dynamics
Optical / EM Field Calculations
Parallel Spectral Element Methods
Neural Networks
Impurities in Anti-Ferro Magnets
Raman Scattering
Spin Fluctuation in Quantum Magnets
Robotics
Multi-Body Dynamics
Computer Aided Tomography
Nuclear Magnetic Resonance
Awards and Honours
• S.S. Bhatnagar Prize
• Fellowship of Academies: FNA; FASc; FNAE
• Research Fellowships: J.C. Bose; Swarnajayanti; Raja Ramanna
Status of Central HPC Facilities at Academic Institutions
• IISc Bangalore: 8192-processor IBM Blue Gene (17 TF)
• IIT Bombay: 380 nodes, Xeon dual core (partly on Infiniband)
• IIT Madras: 256 nodes, Xeon dual core (partly on Infiniband)
• JNCASR: 128 nodes, Xeon dual core (Infiniband)
• Univ. of Hyderabad: P690 SMP server (32 processors x 4)
• IISER Pune: 64 nodes, Xeon quad core (Infiniband)
• IIT Hyderabad: 64 nodes, Xeon quad core (Infiniband) (6 TF)
• IIT Kanpur: 144 nodes, AMD Opteron single core;
  5-year-old hardware (< 1 TF)
Our goal is to …
• be the best in the country and one of the best in the world in HPC
• carry out cutting edge research on computational science and
  engineering
• develop large parallel software for research applications
• collaborate within IITK and with neighbouring academic institutions
• have training programs for students and scientists.
The New HPC Setup
The Main Cluster: 260 nodes; dual proc, Nehalem Quad core
Smaller Test Clusters: dual proc, Nehalem Quad core
HPC Servers: Nehalem Quadcore/GPU
Disk: 100 TB storage
Visualization Lab: high end graphics W/S
Infiniband Network (40 Gbps)
System Integration
(Diagram: the 260-node Linux cluster, with a master and management
nodes plus compute nodes, and the smaller test clusters connect
through GB switches to the IITK network; an Infiniband switch layer
links the compute nodes, multi-node servers and the 100 TB storage.)
New HPC Facility at IITK
The integrated facility will have a total of 372 nodes
and a projected delivered performance of ~ 30 TF
Should be the best HPC facility among all academic Institutes in the
country.
Second best among Government organizations (C-DAC has a 38 TF
cluster)
We might break into top 500 globally!
Layout of the area for HPC facility at ground floor of CC
(Floor plan)
Proposed HPC Data Centre
(Floor plan)
List of work associated with the HPC set-up
1. Layout of the proposed area for HPC facility
2. PAC requirements for the facility
3. AC (Non-PAC) requirements
4. UPS requirements
5. Total power requirements
6. UPS/battery/control panel rooms
7. How to provide the required power from main/DG set
8. Civil work in the proposed HPC area
9. Civil work in the UPS/Battery rooms
10. Electrical work/laying of lines and panels
11. Fire safety issues
12. Building Management System (BMS)
CC and IWD components of HPC-related work
1. HPC Systems                              CC
2. UPS (700 KW)                             CC
3. PAC (630 kW)                             CC
4. Fire safety and BMS                      CC
5. AC (non-PAC)                             IWD
6. Substation (1.5 MW)                      IWD
7. DG set (1.5 MW)                          IWD
8. Electrical equipment/distribution        IWD
9. Civil work*/laying of electrical lines   IWD
*Flooring work in main HPC area will be done by PAC vendor
A HUB for Collaborative Research
(Diagram: HPC Centre IITK as the hub, connected to HCRI Alld, Alld
U, Delhi U, SGPIMS LKW, BHU, AMU Aligarh, CDRI LKW, JNU, NIIT Alld,
Kanpur Univ + HBTI, MNNIT Allahabad and Lucknow Univ.)
Training and Workshops
• Visitors program
• Summer schools/workshops
• International/national conferences
• HPC users meeting
• Academic Program at IIT Kanpur
  – Masters in Computational Science and Engineering
  – Doctoral Program
Proposal submitted to MHRD
Inspire Young Minds ….
Computer Centre as both Service and Academic Centre
Immediate need for HPC
Four Project Scientists (DST, advertisement made)
System Administrator, Secretary (Institute)
For General CC jobs: Engineers, Technical staff
Number of users has gone up, number of CC personnel has gone
down
Immediate need for more office space for HPC and CC staff. Also, a
seminar room, visitors room and test labs should be in place for
this transformation to occur.
Need more space, more manpower
A proposal with more details has been sent to the Space
Committee
CC as the Nucleus of IITK Activities
(Diagram: CC at the nucleus of IITK activities, including Smart Card
services.)
Thank you