
DISTRIBUTED INTRUSION DETECTION
SYSTEM FOR COMPUTATIONAL GRIDS
Mohammad F. Tolba
Mohammad S. Abdel-Wahab
Ismail A. Taha
Ahmad M. Al-Shishtawy
Scientific Computing Department
Faculty of Computer and Information Sciences
Ain Shams University
Cairo, Egypt
Agenda
● Introduction.
● The Grid Intrusion Detection Architecture (GIDA).
● GIDA implementation.
● Testing and Results.
● Conclusions and Future Work.
Introduction
● The Grid is a new approach to computing.
● Still under research and development.
● Couples multiple sites that are administered locally and independently.
● Security is important for the success of this field.
Introduction
● Current Grid security covers only the basic requirements.
● Concentrates on authentication, access control, single sign-on, ...
● No intrusion detection.
● Intrusion detection is needed as a second line of defense:
  – Bugs.
  – Protection against insiders.
Agenda
✔ Introduction.
● The Grid Intrusion Detection Architecture.
● GIDA implementation.
● Testing and Results.
● Conclusions and Future Work.
Grid Intrusion Detection
Architecture
● Intrusion Detection Agent (IDA)
  – Data Gathering Module
● Intrusion Detection Server (IDS)
  – Analyzing Module
  – Cooperation Module
(See the component sketch below.)
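To make the component roles concrete, here is a hypothetical sketch of how the IDA and IDS could be composed in code. The slides define only the modules and their responsibilities, not an API, so all class and method names below are illustrative assumptions.

```python
# Hypothetical sketch of the GIDA components; names are illustrative only.

class DataGatheringModule:
    """Runs inside an IDA on a Grid resource and collects local log events."""
    def collect(self):
        return []  # in a real system: host logs, audit records, ...


class IntrusionDetectionAgent:
    """IDA: one per resource; forwards gathered data to its registered IDSs."""
    def __init__(self, servers):
        self.gatherer = DataGatheringModule()
        self.servers = servers            # IDSs this agent registered with

    def report(self):
        events = self.gatherer.collect()
        for ids in self.servers:
            ids.receive(events)


class IntrusionDetectionServer:
    """IDS: analyzes received data and cooperates with peer IDSs."""
    def __init__(self, analyzing_module, cooperation_module):
        self.analyzing = analyzing_module      # e.g. the LVQ-based analyzer
        self.cooperation = cooperation_module  # P2P- or GIS-based sharing

    def receive(self, events):
        verdicts = self.analyzing.analyze(events)
        self.cooperation.share(verdicts)
```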
Grid Intrusion Detection
Architecture
[Diagram: Grid sites running IDSs and Grid Information Services (GIS), interconnected over heterogeneous protocols: SSL, SSH, Kerberos, TLS, and plain text.]
Grid Characteristics and Requirements
● Heterogeneity → IDA
● Scalability → Distributed components
● Dynamicity or adaptability → Registration with multiple IDSs
● No centralized control → Cooperation between IDSs
● Standard protocols → Built on top of Grid protocols
● Nontrivial QoS → Different ID algorithms and trust relationships
Agenda
✔ Introduction.
✔ The Grid Intrusion Detection Architecture.
● GIDA implementation.
● Testing and Results.
● Conclusions and Future Work.
GIDA Implementation
● Simulated Grid environment.
● Simulated IDA.
● Homogeneous IDSs with an LVQ neural network.
● Simple cooperation by sharing results.
● No trust relationships.
Why Simulation?
● No real Grid available for testing (expensive).
● Best approach for testing and evaluating new architectures.
● Allows controlled experiments in a dynamic environment.
Grid Simulators
● Many Grid simulation tools exist (GridSim, SimGrid, MicroGrid).
● Unfortunately, they concentrate on resource management problems.
● We therefore developed our own simulator for security and intrusion detection.
The Simulated Grid
[Diagram: users and intruders issue requests to the simulated resources (IDAs); each resource generates log files that are fed to the Intrusion Detection Servers (IDSs).]
Intrusion Detection Classifications

                 Misuse   Anomaly
Network Based      ✗         ✗
Host Based         ✗         ✓

GIDA adopts host-based anomaly detection.
Why LVQ?
● Similar to SOM and used for classification.
● Does not require anomalous records in the training data.
● Classes and their labels (user names) are known.
(See the LVQ sketch below.)
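To make the choice concrete, here is a minimal LVQ1 sketch in Python (using NumPy). It is an illustration of the technique, not the configuration used in this work: prototypes labeled with user names are attracted to feature vectors of their own class and repelled from others, and classification assigns the label of the nearest prototype, so no anomalous records are needed during training.

```python
import numpy as np

def train_lvq1(X, y, prototypes, proto_labels, lr=0.05, epochs=20):
    """LVQ1 training: move the best-matching prototype toward samples of
    its own class and away from samples of other classes."""
    P = prototypes.astype(float).copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            i = int(np.argmin(np.linalg.norm(P - x, axis=1)))  # nearest prototype
            if proto_labels[i] == label:
                P[i] += lr * (x - P[i])   # attract
            else:
                P[i] -= lr * (x - P[i])   # repel
        lr *= 0.95                        # decay the learning rate
    return P

def classify(x, prototypes, proto_labels):
    """Return the label (user name) of the nearest prototype."""
    i = int(np.argmin(np.linalg.norm(prototypes - x, axis=1)))
    return proto_labels[i]
```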
IDS Analyzing Module
[Diagram: several IDSs, each fed by log files, connected through a peer-to-peer network or the GIS.]
IDS Analyzing Module
Analyzing and detection pipeline:
Log → Preprocessing → Trained LVQ → Decision Module → Cooperation Module → Response
(See the sketch below.)
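The slides do not spell out the decision rule, so the sketch below assumes a masquerade-style check (the user predicted by the LVQ must match the user who owns the session). The preprocessing, encoding, and classification steps are injected as callables and are placeholders.

```python
def analyze(log_events, preprocess, encode, classify_user, cooperation):
    """Sketch of the analyzing-and-detection flow shown above.
    preprocess    : splits raw log events into windows (see window sketch)
    encode        : turns a window into (feature_vector, claimed_user)
    classify_user : trained LVQ classifier returning a predicted user name
    cooperation   : cooperation module used to share results / raise alerts
    The mismatch-based decision rule is an assumption, not taken from the slides."""
    for window in preprocess(log_events):
        features, claimed_user = encode(window)
        predicted_user = classify_user(features)
        if predicted_user != claimed_user:
            cooperation.report_intrusion(claimed_user, window)   # response
        else:
            cooperation.share_result(claimed_user, predicted_user)
```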
IDS Cooperation Module
● Shares results among IDSs.
● Uses P2P or the GIS.
● Each IDS queries the others for analysis results of users in its scope.
● Informs the other IDSs when an intrusion is detected.
(See the sketch below.)
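A rough sketch of the result-sharing cooperation just described. Peer discovery over P2P or the GIS, the message format, and the method names are all placeholder assumptions.

```python
class CooperationModule:
    """Shares analysis results with peer IDSs and broadcasts intrusion alerts."""
    def __init__(self, peers):
        self.peers = peers          # peer cooperation modules (reached via P2P/GIS)
        self.results = {}           # user name -> latest local analysis result

    def share_result(self, user, result):
        self.results[user] = result

    def query(self, user):
        """Ask the peers for their analysis results for a user in our scope."""
        answers = (p.results.get(user) for p in self.peers)
        return [a for a in answers if a is not None]

    def report_intrusion(self, user, evidence):
        """Inform all other IDSs that an intrusion was detected."""
        for p in self.peers:
            p.on_alert(user, evidence)

    def on_alert(self, user, evidence):
        self.results[user] = ("intrusion", evidence)
```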
Agenda
✔ Introduction.
✔ The Grid Intrusion Detection Architecture (GIDA).
✔ GIDA implementation.
● Testing and Results.
● Conclusions and Future Work.
Measured Parameters
● False Positive.
● False Negative.
● Recognition.
● Training Time.
● Detection Duration.
(A sketch of how the three rates can be computed follows.)
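The slides do not give formal definitions for the first three parameters; the sketch below uses the conventional ones (false positive and false negative as rates over legitimate and intruder sessions, recognition as the fraction of legitimate sessions whose user is correctly identified). Training time and detection duration are simply measured wall-clock times, so they are not computed here. The record format is an assumption.

```python
def evaluate(records):
    """records: iterable of (is_intruder, flagged, true_user, predicted_user).
    Returns the three measured rates as percentages."""
    records = list(records)
    normal = [r for r in records if not r[0]]
    intruders = [r for r in records if r[0]]
    false_positive = 100.0 * sum(r[1] for r in normal) / max(len(normal), 1)
    false_negative = 100.0 * sum(not r[1] for r in intruders) / max(len(intruders), 1)
    recognition = 100.0 * sum(r[2] == r[3] for r in normal) / max(len(normal), 1)
    return {"false_positive": false_positive,
            "false_negative": false_negative,
            "recognition": recognition}
```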
Tested Issues
● Controllable (internal):
  – Data preprocessing
  – Number of IDSs
● Uncontrollable (external):
  – Number of users
  – Number of resources
  – Number of intruders
Different Types of Windows
(Preprocessing)
Type 1: Fixed number of events.
Type 2: Fixed time period window.
Type 3: Fixed number of events with a time limit.
Type 4: Fixed number of events with a time limit, ignoring incomplete windows.
Type 5: Fixed number of events with a time limit, fixing incomplete windows.
(A sketch of the first three types follows.)
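For concreteness, here is a short sketch of how the first three window types could be realized over a stream of (timestamp, event) pairs. The event format and boundary handling are assumptions; Types 4 and 5 differ from Type 3 only in how incomplete windows are treated.

```python
def fixed_count_windows(events, size):
    """Type 1: windows holding a fixed number of events."""
    return [events[i:i + size] for i in range(0, len(events), size)]

def fixed_time_windows(events, period):
    """Type 2: windows covering a fixed time period (seconds)."""
    windows, current, start = [], [], None
    for t, data in events:
        if start is None:
            start = t
        if t - start >= period:            # close the current window
            windows.append(current)
            current, start = [], t
        current.append((t, data))
    if current:
        windows.append(current)
    return windows

def hybrid_windows(events, size, period):
    """Type 3: a window closes after `size` events or `period` seconds,
    whichever comes first."""
    windows, current, start = [], [], None
    for t, data in events:
        if start is None:
            start = t
        current.append((t, data))
        if len(current) == size or t - start >= period:
            windows.append(current)
            current, start = [], None
    if current:
        windows.append(current)
    return windows
```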
Window Size
[Charts: Recognition, False Negative, and False Positive rates (%), plus Training Time and Detection Duration (minutes), versus window size (0–40 events), for 1 IDS and 4 IDSs.]
Window Period
[Charts: Recognition, False Negative, and False Positive rates (%), plus Training Time and Detection Duration (minutes), versus window length (0–2500 sec.), for 1 IDS and 4 IDSs.]
Hybrid Window at size 10
[Charts: Recognition, False Negative, and False Positive rates (%), plus Training Time and Detection Duration (minutes), versus window length at size 10 (0–2500 sec.), for 1 IDS and 4 IDSs.]
Hybrid Window at size 20
[Charts: Recognition, False Negative, and False Positive rates (%), plus Training Time and Detection Duration (minutes), versus window length at size 20 (0–2500 sec.), for 1 IDS and 4 IDSs.]
Hybrid Window at size 30
[Charts: Recognition, False Negative, and False Positive rates (%), plus Training Time and Detection Duration (minutes), versus window length at size 30 (0–2500 sec.), for 1 IDS and 4 IDSs.]
Number of IDSs
[Charts: Recognition, False Negative, and False Positive rates (%), plus Training Time and Detection Duration (minutes), versus number of IDSs (0–25), for 50, 200, and 350 users.]
Number of Users
[Charts: Recognition, False Negative, and False Positive rates (%), plus Training Time and Detection Duration (minutes), versus number of users (0–400), for 1, 4, and 8 IDSs.]
Number of Resources
[Charts: Recognition, False Negative, and False Positive rates (%), plus Training Time and Detection Duration (minutes), versus number of resources (0–160), for 1, 4, and 8 IDSs.]
Number of Intruders
[Charts: Recognition, False Negative, and False Positive rates (%), plus Training Time and Detection Duration (minutes), versus number of intruders (0–40), for 1, 4, and 8 IDSs.]
Agenda
✔ Introduction.
✔ The Grid Intrusion Detection Architecture (GIDA).
✔ GIDA implementation.
✔ Testing and Results.
● Conclusions and Future Work.
Conclusions
● Intrusion detection is needed in real Grids as a second line of defense.
● GIDA is suitable for Grid environments.
● Simulation demonstrated its applicability.
● LVQ produced good results.
● Better than a centralized system.
● The results help in building real systems.
● Better understanding of the problem of intrusion detection in Grid environments.
Future Work
● Trust relationships in the Grid environment.
● Heterogeneous analyzing modules.
● More sophisticated cooperation algorithms.
● Misuse detection.
● Testing on real Grid testbeds.
The End
Thank you for listening.