
Automatic Evaluation of
Intrusion Detection Systems
F. Massicotte, F. Gagnon, Y. Labich, L. Briand,
Computer Security Applications Conference,
ACSAC ’06, pp 361-370, 2006.
Presented by: Lei WEI
10/1/2007
Summary
1. Proposed a strategy to evaluate Intrusion Detection Systems (IDS) automatically and systematically.
2. Evaluated two well-known IDS programs, Snort 2.3.2 and Bro 0.9a9, using this newly proposed strategy.
3. Proposed a 15-class taxonomy for test results.
Appreciative Comments:
Automation
This is an automatic IDS evaluation system. Thanks to automation, it can efficiently and systematically create a large amount of sample data.
“We use 124 VEP (covering a total of 92 vulnerabilities) and 108 different target system configurations” (Automatic Evaluation of Intrusion Detection Systems)
“38 different attacks were launched against victim UNIX hosts in seven weeks of training data and two weeks of test data.” (Evaluating Intrusion Detection Systems: The 1998 DARPA Off-line Intrusion Detection Evaluation)
Critical Comment:
1. Complicated classification
Each collected traffic trace belongs to one of four types: TP, TN, FP, and FN.
Based on the types of all traces collected from the IDS evaluation tests, the authors propose a 15-class taxonomy for IDSes, with classes such as alarmist, quiet, quiet and complete detection, complete evasion, etc.
This makes the evaluation complicated and confusing:
 It is hard to remember all the class names.
 Is “quiet and complete detection” a subclass of “quiet”? No!
I would prefer a statistical approach based on two ratios, TP / (TP + FN) and FP / (FP + TN), which give the percentage of attacks being detected and the percentage of false alarms, respectively.
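The two ratios can be computed directly from the four outcome counts; a minimal sketch (function names are mine, not from the paper):

```python
def detection_rate(tp: int, fn: int) -> float:
    """Fraction of actual attacks the IDS detected: TP / (TP + FN)."""
    return tp / (tp + fn)

def false_alarm_rate(fp: int, tn: int) -> float:
    """Fraction of benign traces that raised an alarm: FP / (FP + TN)."""
    return fp / (fp + tn)

# Example: 90 attacks detected and 10 missed; 5 false alarms among 100 benign traces.
print(detection_rate(90, 10))    # 0.9
print(false_alarm_rate(5, 95))   # 0.05
```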
Critical Comment:
2. Confusing diagrams
In this paper, the two diagrams (Figure 5 and Figure 1) and the accompanying description of the system's overall working process are not clear enough.
(a). A title should be “… an effective guide for scientists rapidly scanning lists of titles for information relevant to their interests.” (Scientific Writing for Graduate Students: A Manual on the Teaching of Scientific Writing, edited by F. Peter Woodford. New York: Rockefeller University Press, 1968.)
However, neither the title nor the content clearly explains the meaning of the numbers in Figure 5.
Critical Comment:
2. Confusing diagrams (continued)
(b). Although the article describes the steps listed in Figure 1, the diagram makes it harder, not easier, to understand the structure and working process of the system. Its title is “Virtual network infrastructure”, but the figure covers more than that: it shows not only the virtual network infrastructure but also the working process of the subsystem.
Working process of Automatic IDS
Evaluation system
This system can be divided into two subsystems:
 The attack simulation and data collection system
 The IDS Evaluation Framework
1. Attack simulation and data collection system

Script Generation
 1. Choose a Vulnerability Exploitation Program (VEP)
 2. Choose the configuration of the target system (e.g. the IDS)

Set up Virtual Network

Set up Attack Script
 Provide the virtual attacking machine with the proper attack configuration (e.g. whether to apply IDS evasion techniques)

Execute Attack
 1. Capture the attack traffic traces
 2. Document the traffic traces

Data Set
 Save the traffic traces and IDS alarms on the shared hard drive

Restore
 Restore the virtual attacker and target machines to their initial state
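The attack-generation loop above could be sketched roughly as follows; every name here (classes, functions, trace strings) is hypothetical, for illustration only, not the authors' actual tooling:

```python
from dataclasses import dataclass, field
from itertools import product

@dataclass
class TestCase:
    vep: str                  # Vulnerability Exploitation Program used
    target_config: str        # target system configuration
    evasion: bool             # whether IDS evasion techniques were applied
    traces: list = field(default_factory=list)

def run_campaign(veps, target_configs):
    """Enumerate every VEP / target-configuration pair, 'execute' the attack,
    record its traffic trace, and (conceptually) restore the virtual machines."""
    data_set = []
    for vep, cfg in product(veps, target_configs):
        case = TestCase(vep=vep, target_config=cfg, evasion=False)
        case.traces.append(f"pcap for {vep} vs {cfg}")  # stand-in for a real capture
        data_set.append(case)
        # restore_virtual_machines() would run here in a real setup
    return data_set

data = run_campaign(["vep-1", "vep-2"], ["cfg-A", "cfg-B"])
print(len(data))  # 4 test cases: every VEP against every configuration
```

Systematically crossing every VEP with every target configuration is what lets the approach scale to the 124 VEP x 108 configurations mentioned earlier.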
2. IDS Evaluation Framework

Data Set → IDS Evaluator → tested IDS → IDS Result Analyzer → Report

 The IDS Evaluator takes documented traffic traces from the Data Set.
 The IDS Evaluator provides the traffic traces to each tested IDS.
 The collected IDS alarms are fetched by the IDS Result Analyzer.
 The analyzer compares the two groups of data sets and determines whether the IDS detection succeeded.
 Finally, it generates the evaluation report.
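The analyzer's comparison step can be illustrated with a toy classification of each trace into TP/TN/FP/FN (a sketch under my own assumptions, not the paper's implementation):

```python
def classify(trace_is_attack: bool, ids_raised_alarm: bool) -> str:
    """Compare ground truth (from the documented trace) with the IDS output."""
    if trace_is_attack and ids_raised_alarm:
        return "TP"   # attack correctly detected
    if trace_is_attack and not ids_raised_alarm:
        return "FN"   # attack missed
    if not trace_is_attack and ids_raised_alarm:
        return "FP"   # false alarm on benign traffic
    return "TN"       # benign traffic correctly ignored

# ground truth vs. alarms collected from the tested IDS
results = [classify(attack, alarm) for attack, alarm in
           [(True, True), (True, False), (False, True), (False, False)]]
print(results)  # ['TP', 'FN', 'FP', 'TN']
```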
Question
This paper evaluated two open-source IDSes using the new strategy. However, many IDSes are protected by patents or copyright, and their creators would never reveal the weak points of their products.
Is it ethical, or even legal, to publish evaluations of IDS programs so that others can know the truth?
The End
The 15-class taxonomy (Supplement)
Document traffic traces (Supplement)
Each traffic trace is documented by four characteristics:
1. Target system configuration
2. VEP configuration
3. Whether or not the VEP exploited the vulnerability of the target system
4. Whether or not the attack is successful
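A documented trace could be represented as a simple record with those four fields (field names are mine, for illustration only):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DocumentedTrace:
    target_config: str       # 1. target system configuration
    vep_config: str          # 2. VEP configuration
    vuln_exploited: bool     # 3. did the VEP exploit the target's vulnerability?
    attack_successful: bool  # 4. was the attack successful?

trace = DocumentedTrace("Linux 2.4 + Apache 1.3", "vep-042 --no-evasion",
                        vuln_exploited=True, attack_successful=True)
print(trace.attack_successful)  # True
```

Keeping the ground truth alongside each trace is what later lets the IDS Result Analyzer decide whether an alarm was a true or false positive.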