Security Attribute Evaluation Method:
A Cost Benefit Analysis
Shawn A. Butler
Computer Science Department
Carnegie Mellon University
9 November 2001
(Opening cartoon: a security engineer, S, pitches his manager, M)

S: "Hey Boss, we need more security. I think we should get the new Acme 2000 Hacker Abolisher."
M: "We always seem to need more security! Don't we have enough? What are my alternatives?"
S: "Trust me, we will be more secure!"
M: "What is it going to cost? What is the added value?"
S: "Alternatives? Value?"
Problem
• Security managers lack structured cost-benefit methods to evaluate and compare alternative security solutions.
Security Architecture Development Process

(Process diagram) Threats and Outcomes feed the Risk Assessment, which produces Prioritized Risks. Prioritized Risks, Available Countermeasures, Policies, and the System Design feed the Select Countermeasures step, which yields Security Components and Requirements. These drive the Develop Security Architecture step, producing the Security Architecture.
The Multi-Attribute Risk Assessment

1. Determine threats and outcomes
2. Assess outcome attribute values
3. Assess weights
4. Compute threat indices
5. Perform sensitivity analysis

(Diagram highlight: Threats and Outcomes feed the Risk Assessment, which produces Prioritized Risks.)
Determine Threats and Outcomes

Threats (29 in total):
• Scanning
• Procedural Violation
• Browsing
• Distributed Denial of Service
• Password Nabbing
• Personal Abuse
• Signal Interception
• ...

Outcome Attributes:
• Lost Productivity
• Lost Revenue
• Regulatory Penalties
• Reputation
• Lives Lost
• Lawsuits
• ...

Each outcome is a vector of attribute values, e.g.
O_i = (Lost Productivity, Lost Revenue, Regulatory Penalties, Reputation)
Assess Outcome Attribute Values

Attack (frequency)        Level     Lost Prod (hrs)  Lost Rev ($)  Reg Penalties (0-6)  Reputation (0-6)
Scanning                  Low       .3               0             0                    1
10,220/yr (3-4/hr)        Expected  .5               2             0                    1
                          High      1                1,000         0                    4
Procedural Violation      Low       0                0             0                    0
4,380/yr (1-2/hr)         Expected  2                2             0                    1
                          High      40               12,000        3                    4
Prioritize and Assess Weights (Swing Weight Method)

Attribute          Worst     Best   Rank  Swing  Weight (w_i)
Lost Productivity  240 hrs   0 hrs  1     100    .42
Lost Revenue       $12,000   $0     4     20     .08
Reg Penalties      3         0      3     40     .17
Reputation         4         0      2     80     .33
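The swing-weight arithmetic above is simple enough to sketch. This minimal Python illustration uses the raw swing scores from the table (the 100/20/40/80 scores and ranks are the slide's; the code itself is only an illustration, not part of SAEM):

```python
# Swing-weight normalization: each attribute's swing from its worst to its
# best level is scored relative to the most important swing (score 100),
# then the scores are normalized so the weights sum to 1.
# Raw scores come from the slide's example.
swings = {
    "Lost Productivity": 100,   # rank 1: the 240-hr -> 0-hr swing matters most
    "Lost Revenue": 20,         # rank 4
    "Regulatory Penalties": 40, # rank 3
    "Reputation": 80,           # rank 2
}

total = sum(swings.values())
weights = {attr: score / total for attr, score in swings.items()}

for attr, w in weights.items():
    print(f"{attr}: {w:.2f}")
```

Rounding to two decimals reproduces the slide's .42 / .08 / .17 / .33 column.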
Compute Threat Indices

Hours + $$ + Reputation + Regulatory Penalties = ? Nonsense!

The attributes are in different units, so first determine value functions V_j(x_j) that map each attribute onto a common 0-1 scale (0 for no impact, 1 for the worst credible impact):

• P: Lost Productivity rises from 0 at 0 hrs to 1 at 240 hrs
• L: Lost Revenue rises from 0 at $0 to 1 at $12,000
• R: Reputation rises from 0 at 0 to 1 at 4
• G: Regulatory Penalties rises from 0 at 0 to 1 at 3

The weighted sum of value functions is then meaningful:

  w_L·L(x_1) + w_P·P(x_2) + w_R·R(x_3) + w_G·G(x_4) = TI contribution (per outcome, before weighting by probability and frequency)
Computing the Threat Index

Each outcome occurs at one of three levels (low, expected, high) with probabilities p_low, p_expected, and p_high. The expected threat per attack is

  p_low · Σ_j (w_j · V_j(x_j_low)) + p_expected · Σ_j (w_j · V_j(x_j_expected)) + p_high · Σ_j (w_j · V_j(x_j_high))

where j ranges over the outcome attributes. Multiplying by the attack frequency gives the threat index:

  TI_a = Freq_a · [ p_low · Σ_j (w_j · V_j(x_j_low))
                  + p_expected · Σ_j (w_j · V_j(x_j_expected))
                  + p_high · Σ_j (w_j · V_j(x_j_high)) ]
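As a rough illustration of the formula, the sketch below recomputes Scanning's threat index. The deck does not give the value-function shapes or the level probabilities, so linear value functions and illustrative probabilities (.10 / .89 / .01) are assumed here; the result lands near, but not exactly on, the slide's 886.57.

```python
# Threat-index sketch:
#   TI_a = Freq_a * sum over levels of p_level * sum_j w_j * V_j(x_j_level)
# Assumes linear value functions V_j (0 at no impact, 1 at the worst
# credible impact) and illustrative level probabilities -- both are
# assumptions; the deck does not state them explicitly.

weights = {"prod": 0.42, "rev": 0.08, "reg": 0.17, "rep": 0.33}
worst = {"prod": 240.0, "rev": 12_000.0, "reg": 3.0, "rep": 4.0}

def value(attr, x):
    """Linear value function: scale attribute level x onto [0, 1]."""
    return min(x / worst[attr], 1.0)

def threat_index(freq, levels):
    """levels: {level_name: (probability, {attr: outcome value})}"""
    return freq * sum(
        p * sum(weights[a] * value(a, x) for a, x in outcome.items())
        for p, outcome in levels.values()
    )

# Scanning, 10,220 attacks/yr (outcome values from the slide;
# level probabilities are assumed for illustration)
scanning = {
    "low":      (0.10, {"prod": 0.3, "rev": 0,     "reg": 0, "rep": 1}),
    "expected": (0.89, {"prod": 0.5, "rev": 2,     "reg": 0, "rep": 1}),
    "high":     (0.01, {"prod": 1.0, "rev": 1_000, "reg": 0, "rep": 4}),
}
print(round(threat_index(10_220, scanning)))  # near the slide's 886.57
```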
Scanning in More Detail

Scanning, 10,220 attacks/yr:

Level     Lost Prod (hrs)  Lost Rev ($)  Reg Penalties (0-6)  Reputation (0-6)
Low       .3               0             0                    1
Expected  .5               2             0                    1
High      1                1,000         0                    4

.01 = p_low · Σ_j (w_j · V_j(x_j_low))
.07 = p_expected · Σ_j (w_j · V_j(x_j_expected))
.00 = p_high · Σ_j (w_j · V_j(x_j_high))

TI = 10,220 · (.01 + .07 + .00) ≈ 886.57
Risk Assessment Results

Threat                   Frequency  Low    Expected  High   Total
Scanning                 10,220     .0084  .0750     .0034  886.57
Procedural Violation     4,380      .0000  .0773     .0065  367.03
Browsing                 2,920      .0000  .0742     .0035  226.71
Dist Denial of Service   156        .0085  .1530     .0060  26.12
Password Nabbing         365        .0001  .0008     .0009  .62
Personal Abuse           110        .0000  .0003     .0009  .13
TOTAL                                                       1,507.18
But what about the numbers?
Risk Assessment Sensitivity Analysis
• Attack Frequencies
• Outcome Attribute Values
• Attribute Weights
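These sensitivities can be probed with plain Monte Carlo resampling, in the spirit of the @RISK-style analysis shown next. The sketch below is a deliberate simplification: it perturbs only the Scanning frequency and its expected Reputation outcome, using distribution shapes borrowed from the slides (a plain triangular stands in for Trigen), with every other input held fixed.

```python
import random
import statistics

# Monte Carlo sensitivity sketch: resample uncertain inputs and observe
# the spread of the resulting (simplified) threat index. Distribution
# choices follow the deck's examples; the value function, weights, and
# probability are simplifying assumptions for illustration only.

random.seed(0)

W_REP, REP_WORST = 0.33, 4.0   # Reputation weight and worst-case score

def scanning_ti(freq, rep_expected):
    """Simplified threat index driven only by frequency and the expected
    Reputation outcome (all other attributes held fixed)."""
    per_attack = 0.89 * W_REP * (rep_expected / REP_WORST)
    return freq * per_attack

samples = []
for _ in range(10_000):
    freq = random.gauss(10_220, 1)          # Normal(10220, 1), as on the slide
    rep = random.triangular(1.0, 4.0, 1.0)  # low=1, high=4, mode=1 (Trigen stand-in)
    samples.append(scanning_ti(freq, rep))

print(f"mean TI ~ {statistics.mean(samples):.0f}, "
      f"stdev ~ {statistics.stdev(samples):.0f}")
```

The spread of the samples shows how much the index moves when the estimates do, which is exactly the question the bullets above pose.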
Probability Distributions

(Figures: example input distributions)
• Scanning frequency: Normal(10,220, 1), truncated to [0, 30,660]; 90% of the mass lies between 10,218 and 10,222.
• Scanning Reputation outcome: Trigen(1.0, 1.0, 4.0; 5, 95); 90% of the mass lies between 1.0 and 4.0.
Regression Sensitivity for Threat Index

(Tornado chart of standardized b coefficients for the total threat index, Sum/R60.) The total is most sensitive to the Reputation weight (.56) and to the low-outcome estimates for Scanning (.31), Procedural Violation (.27), Signal Interception (.20), and Browsing (.17); the Lost Productivity weight has a small negative coefficient (-.07). The remaining inputs (other low-outcome estimates, rankings, and the like, down through Compromising Emanations) contribute .06 or less.
Change in TI Rankings

(Figure: for each of the 29 threats, among them Signal Interception, Scanning, Procedural Violation, Browsing, Virus, Alteration, Cryptographic Compromise, DDoS, Trojan Horse, Message Stream Modification, Password Nabbing, Fraud, Personal Abuse, Trap Door, IP Spoofing, Denial of Service, Vandalism, Password Guessing, Web Page Contamination, Logic Bomb, Compromising Emanations, Electronic Graffiti, and Data Entry Error, the mean simulated rank with ±1 SD and 5th/95th-percentile bars, ranks 0 to 30.)
Cryptographic Compromise Distribution

(Figure: probability density of the Cryptographic Compromise threat's simulated rank, 0 to 30; mean rank 11.0, with 90% of simulations between ranks 6 and 25.)
Regression Sensitivity: Reputation Outcome

(Tornado chart of standardized b coefficients.) The largest drivers are the Reputation outcome itself (-.64), the Lost Productivity weight (-.21), and the Reputation weight (.19); countermeasure inputs such as Denial of Service / Anti-S... (.06), Scanning / URL Block (.05), and Logic Bomb / Auditing (.05) follow, with the rest (e.g., Trojan Horse / Low, Password Nabbing / Line En..., Personal Abuse / Low, Trap Door / Auditing) contributing less still.
Sensitivity Analysis

• How sensitive are the answers to estimation errors?
• Does it matter if the estimates are not accurate?
• How accurate do they have to be before the decision changes?
• When is it important to gather additional information?
Selecting Countermeasures

(Process diagram, with the Select Countermeasures stage highlighted: Threats and Outcomes feed the Risk Assessment, which produces Prioritized Risks; Prioritized Risks, Available Countermeasures, Policies, and the System Design feed Select Countermeasures, which yields the Security Components and Requirements used to Develop the Security Architecture.)
Security Attribute Evaluation Method (SAEM)

What is SAEM?
A structured cost-benefit analysis technique for evaluating and selecting alternative security designs.

Why SAEM?
• Security managers make their assumptions explicit
• Decision rationale is captured
• Sensitivity analysis shows how assumptions affect design decisions
• Design decisions are re-evaluated consistently when assumptions change
• Stakeholders see whether their investment is consistent with risk expectations
SAEM Process

Evaluation method:
1. Assess security technology benefits
2. Evaluate security technology benefits
3. Assess coverage
4. Analyze costs

(Diagram highlight: Available Countermeasures, Prioritized Risks, Policies, and System Design feed Select Countermeasures, which yields Security Components and Requirements.)
Assess Security Technology Benefits

(Table of effectiveness percentages: for each threat, including Scanning, Procedural Violation, Browsing, Personal Abuse, Dist Denial of Service, and Password Nabbing, the security manager estimates how effective each applicable countermeasure would be: PF Firewall, Proxy Firewall, Host IDS, Net IDS, Hardened OS, Vulnerability Assessment, Auditing, Net Monitors, Virtual Private Network, and Auth Policy Server. Estimates in the example range from 25% to 75%.)
(Table of threat-index reductions: pairing each threat's index, Scanning (886), Procedural Violation (367), Browsing (226), Dist Denial of Service (26.12), Password Nabbing (.62), and Personal Abuse (.13), with the effectiveness estimates gives the threat-index benefit attributable to each countermeasure; example values range from 158 to 594.)
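One plausible way to turn effectiveness percentages into benefit numbers is benefit = threat index × effectiveness, summed over the threats a countermeasure covers. The deck does not spell out its aggregation, so the sketch below, with hypothetical Host-IDS effectiveness figures, is only illustrative.

```python
# Threat-index benefit sketch: a countermeasure judged e% effective
# against a threat removes e% of that threat's index; summing over the
# threats it covers gives the countermeasure's total benefit.
# TI values come from the risk-assessment results slide; the
# effectiveness figures below are illustrative assumptions.

threat_index = {
    "Scanning": 886.57,
    "Procedural Violation": 367.03,
    "Browsing": 226.71,
    "Dist Denial of Service": 26.12,
    "Password Nabbing": 0.62,
    "Personal Abuse": 0.13,
}

def benefit(effectiveness):
    """effectiveness: {threat: fraction of that threat's TI removed}"""
    return sum(threat_index[t] * e for t, e in effectiveness.items())

# Hypothetical Host-IDS effectiveness estimates:
host_ids = {"Scanning": 0.50, "Procedural Violation": 0.33, "Browsing": 0.40}
print(f"Host-IDS benefit: {benefit(host_ids):.0f} threat-index points")
```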
Evaluate Security Technology Benefits

Prioritized technologies (threat-index value, and overall rank among all technologies evaluated):

Technology          Threat Index Value  Overall Rank
PKI/Cert            .24                 28
Auditing            241                 11
Auth Policy Server  161                 15
Host-IDS            589                 2
Net-IDS             293                 10
Smart Cards         103                 16
One Time Password   340                 7
Single Sign-on      0                   35
Assess Coverage

(Figures: Host Intrusion Detection coverage and Auditing coverage.)
Analyze Costs

(Scatter plot: threat-index benefit, 0 to 589, vs. purchase cost, $0 to $20,000, for Host IDS, Net IDS, Auditing, Auth Policy Server, Smart Cards, Single Sign-on, and PKI Cert.)
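A natural reading of the cost/benefit scatter is benefit per purchase dollar. The sketch below ranks technologies that way; the benefit values come from the prioritization table, while the costs are hypothetical points on the slide's $0-$20,000 axis, not figures from the deck.

```python
# Cost-benefit sketch: rank technologies by threat-index benefit per
# purchase dollar. Benefit values are from the prioritization slide;
# the costs are hypothetical placeholders on the slide's $0-$20,000 axis.

technologies = {
    #                    (TI benefit, purchase cost $)
    "Host-IDS":           (589, 12_000),
    "One Time Password":  (340, 8_000),
    "Net-IDS":            (293, 15_000),
    "Auditing":           (241, 5_000),
    "Auth Policy Server": (161, 10_000),
    "Smart Cards":        (103, 9_000),
}

ranked = sorted(
    technologies.items(),
    key=lambda item: item[1][0] / item[1][1],  # benefit per dollar
    reverse=True,
)
for name, (ti_value, cost) in ranked:
    print(f"{name}: {ti_value / cost:.3f} TI points per $")
```

Note how the ordering can differ from the raw-benefit ranking: with these assumed costs, a cheap technology like Auditing climbs past more effective but pricier options.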
SAEM Sensitivity Analysis

"The Vulnerability Assessment tool is 66% effective." What does that really mean?

Security technology effects on the risk assessment. Benefit estimates:
• Reduce frequency
• Change outcomes
(Figure: Vulnerability Assessment Scanner benefit distribution, Normal(0.66, 0.1) truncated to [0, 1]; 90% of the mass lies between 0.4955 and 0.8242.)
Top 25 Countermeasure Rankings (Reduced Frequency)

(Figure: mean simulated rank, 0 to 35, with ±1 SD and 5th/95th-percentile bars, for the top 25 countermeasures, among them H-IDS, N-IDS, Hardened OS, Vulnerability Assessment, VPN, PF Firewall, Proxy Firewall, one-time passwords, biometrics, smart cards, auditing, URL blocking, log analysis, e-Sign, Auth Policy, DB encryption, line encryption, forensics, secure e-mail, virus and mobile-code controls, web access controls, and Secure OS.)
Countermeasure Rank Overlaps

(Figure: simulated rank ranges, 0 to 35, for PKI/Cert, Auditing, Auth Policy Servers, H-IDS, N-IDS, One Time Password, and Smart Cards; the ranges overlap substantially.)
Outcome Changes

Example: the Reputation outcome for Procedural Violations.
• Before: Trigen(0, 1, 4; 5, 95)
• After: Trigen(0, 2.5, 4; 5, 95)

(Figure: the two distributions side by side, each with 90% of the mass between roughly 0 and 4, showing how a countermeasure's benefit can be modeled as a shift in an outcome distribution rather than as a frequency reduction.)
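The before/after shift can be simulated with the standard library's triangular distribution as a rough stand-in for @RISK's Trigen (an approximation: Trigen anchors the 5th/95th percentiles, while triangular anchors the endpoints):

```python
import random
import statistics

# Outcome-change sketch: compare the expected Reputation outcome for
# Procedural Violations under the deck's two distributions,
# Trigen(0, 1, 4) and Trigen(0, 2.5, 4), modeled here with plain
# triangular distributions (an approximation, as noted above).

random.seed(1)

def expected_outcome(low, mode, high, n=50_000):
    """Monte Carlo estimate of the mean of a triangular distribution."""
    return statistics.mean(random.triangular(low, high, mode) for _ in range(n))

before = expected_outcome(0, 1.0, 4)  # most likely value 1.0
after = expected_outcome(0, 2.5, 4)   # most likely value 2.5
print(f"expected Reputation score: before ~ {before:.2f}, after ~ {after:.2f}")
```

Shifting only the mode moves the expected score, which is how a change-outcomes benefit estimate feeds back into the threat indices.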
Preliminary Results

• Risk-assessment threat indices reflect security managers' concerns
  – based on interviews and feedback
• Security managers are able to estimate technology benefits
  – based on experience, organizational skill levels, and threat expectations
• Sensitivity analysis is key to the method
  – based on the uncertainty of the assumptions