Security Policy Models
comp620
4/9/2010

Bell-LaPadula Model
• The Bell-LaPadula Model is a state machine model used for enforcing access control in government and military applications
• Developed by David Elliott Bell and Leonard J. La Padula in 1973
• Basis for many confidentiality policies
Bell-LaPadula Model, Step 1
• Security levels arranged in a linear ordering
  – Top Secret: highest
  – Secret
  – Confidential
  – Unclassified: lowest
• Both people and objects have a security level
  – People (subjects) have a clearance level, L(s)
  – Objects have a security classification, L(o)
Computer Security: Art and Science, ©2002-2004 Matt Bishop
Example

L  Security level  Subject  Object
4  Top Secret      Tanya    Personnel Files
3  Secret          Sam      E-Mail Files
2  Confidential    Claire   Activity Logs
1  Unclassified    Umoja    Telephone Lists
• Tanya can read all files
• Claire cannot read Personnel or E-Mail Files
• Umoja can only read Telephone Lists
Reading and Writing Information
• Information flows up, not down
  – "Reads up" disallowed, "reads down" allowed
  – "Writes up" allowed, "writes down" disallowed
• Simple Security Condition (Step 1)
  – Subject s can read object o iff L(o) ≤ L(s) and s has permission to read o
  – Sometimes called the "no reads up" rule
• *-Property (Step 1)
  – Subject s can write object o iff L(o) ≥ L(s) and s has permission to write o
  – Sometimes called the "no writes down" rule
• Note: both rules combine mandatory control (the relationship of security levels) and discretionary control (the required permission)
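The step-1 rules translate directly into code. A minimal sketch in Python; the numeric level ranks and function names here are illustrative assumptions, not part of the model's definition:

```python
# Illustrative numeric ranks for the linear ordering of step 1.
LEVELS = {"Unclassified": 1, "Confidential": 2, "Secret": 3, "Top Secret": 4}

def can_read(subject_level, object_level, has_read_permission):
    """Simple Security Condition, step 1: L(o) <= L(s), plus discretionary permission."""
    return LEVELS[object_level] <= LEVELS[subject_level] and has_read_permission

def can_write(subject_level, object_level, has_write_permission):
    """*-Property, step 1: L(o) >= L(s), plus discretionary permission."""
    return LEVELS[object_level] >= LEVELS[subject_level] and has_write_permission

# Tanya (Top Secret) may read down to the Confidential activity logs...
print(can_read("Top Secret", "Confidential", True))   # True
# ...but may not write down to them.
print(can_write("Top Secret", "Confidential", True))  # False
```

Note how the mandatory check (level comparison) and the discretionary check (the permission flag) are both required, matching the "and s has permission" clause in each rule.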
Basic Security Theorem, Step 1
• If a system is initially in a secure state, and every transition of the system satisfies the simple security condition, step 1, and the *-property, step 1, then every state of the system is secure
  – Proof: induct on the number of transitions

Bell-LaPadula Model, Step 2
• Expand the notion of security level to include categories
• A security level is (clearance, category set)
• Examples
  – (Top Secret, {NUC, EUR, ASI})
  – (Confidential, {EUR, ASI})
  – (Secret, {NUC, ASI})
Dominating
• The dom ("dominates") relation holds when the first level is greater than or equal to the second, and the first category set includes every item in the dominated category set
• (L(a), C) dom (L(b), C′) iff L(a) ≥ L(b) and C′ ⊆ C
• dom is neither symmetric nor asymmetric
• Examples
  – (Top Secret, {NUC, ASI}) dom (Secret, {NUC})
  – (Secret, {NUC, EUR}) dom (Confidential, {NUC, EUR})
  – (Top Secret, {NUC}) ¬dom (Confidential, {EUR})
  – (Confidential, {EUR}) ¬dom (Top Secret, {NUC})
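The dom relation maps naturally onto Python's subset test for sets. A sketch, assuming illustrative numeric ranks for the four clearance levels:

```python
# Illustrative numeric ranks; any order-preserving numbering works.
RANK = {"Unclassified": 1, "Confidential": 2, "Secret": 3, "Top Secret": 4}

def dom(a, b):
    """(L(a), C) dom (L(b), C') iff L(a) >= L(b) and C' is a subset of C."""
    level_a, cats_a = a
    level_b, cats_b = b
    return RANK[level_a] >= RANK[level_b] and cats_b <= cats_a  # <= is subset for sets

print(dom(("Top Secret", {"NUC", "ASI"}), ("Secret", {"NUC"})))  # True
print(dom(("Top Secret", {"NUC"}), ("Confidential", {"EUR"})))   # False: {EUR} not a subset
```

The last example shows why dom is only a partial order: neither pair dominates the other, because neither category set contains the other.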
Levels and Ordering
• Security levels are partially ordered
  – Any pair of security levels may (or may not) be related by dom
• "dominates" serves the role of "greater than" in step 1
  – "greater than" is a total ordering, though
Reading and Writing Information, Step 2
• Information flows up, not down
  – "Reads up" disallowed, "reads down" allowed
  – "Writes up" allowed, "writes down" disallowed
• Simple Security Condition (Step 2)
  – Subject s can read object o iff L(s) dom L(o) and s has permission to read o
  – Sometimes called the "no reads up" rule
• *-Property (Step 2)
  – Subject s can write object o iff L(o) dom L(s) and s has permission to write o
  – Sometimes called the "no writes down" rule
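The step-2 checks replace the numeric comparison of step 1 with dom. A sketch; the rank numbering and helper names are illustrative assumptions:

```python
RANK = {"Unclassified": 1, "Confidential": 2, "Secret": 3, "Top Secret": 4}

def dom(a, b):
    """(L(a), C) dom (L(b), C') iff L(a) >= L(b) and C' is a subset of C."""
    return RANK[a[0]] >= RANK[b[0]] and b[1] <= a[1]

def can_read(subj, obj, has_permission=True):
    """Simple Security Condition, step 2: L(s) dom L(o)."""
    return dom(subj, obj) and has_permission

def can_write(subj, obj, has_permission=True):
    """*-Property, step 2: L(o) dom L(s)."""
    return dom(obj, subj) and has_permission

# Reading down within the same category set is allowed...
print(can_read(("Secret", {"NUC", "EUR"}), ("Confidential", {"NUC", "EUR"})))  # True
# ...but reading an object with a category the subject lacks is not.
print(can_read(("Secret", {"NUC"}), ("Confidential", {"EUR"})))                # False
```

Note the symmetry: read requires the subject to dominate the object, while write requires the object to dominate the subject.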
What type of access is allowed? (choices: read, write, both, none)
• Paul, cleared for (TOP SECRET, {A,C}), wants to access a document classified (SECRET, {B,C})
• Fred, cleared for (SECRET, {C}), wants to access a document classified (CONFIDENTIAL, {C})
• Mary, cleared for (CONFIDENTIAL, {C}), wants to access a document classified (CONFIDENTIAL, {B})

Basic Security Theorem, Step 2
• If a system is initially in a secure state, and every transition of the system satisfies the simple security condition, step 2, and the *-property, step 2, then every state of the system is secure
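One way to work questions like these is to encode the step-2 rules and map the (read, write) outcome onto the four clicker choices. A sketch; ranks and helper names are illustrative:

```python
RANK = {"UNCLASSIFIED": 1, "CONFIDENTIAL": 2, "SECRET": 3, "TOP SECRET": 4}

def dom(a, b):
    return RANK[a[0]] >= RANK[b[0]] and b[1] <= a[1]

def access(subj, obj):
    """Return which clicker choice the step-2 rules permit."""
    r = dom(subj, obj)  # simple security condition ("no reads up")
    w = dom(obj, subj)  # *-property ("no writes down")
    return {(True, True): "both", (True, False): "read",
            (False, True): "write", (False, False): "none"}[(r, w)]

# Paul: neither level dominates the other (the document has category B, Paul lacks it).
print(access(("TOP SECRET", {"A", "C"}), ("SECRET", {"B", "C"})))  # none
# Fred: he dominates the document, so he may read down.
print(access(("SECRET", {"C"}), ("CONFIDENTIAL", {"C"})))          # read
# Mary: same clearance level, but incomparable category sets.
print(access(("CONFIDENTIAL", {"C"}), ("CONFIDENTIAL", {"B"})))    # none
```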
What type of access is allowed? (choices: read, write, both, none)
• Joe has no clearance (UNCLASSIFIED), wants to access a document classified (CONFIDENTIAL, {B})
• Susan, cleared for (TOP SECRET, {A,C}), wants to access a document classified (CONFIDENTIAL, {A})

Problem
• Colonel has (Secret, {NUC, EUR}) clearance
• Major has (Secret, {EUR}) clearance
  – Major can talk to colonel ("write up" or "read down")
  – Colonel cannot talk to major ("read up" or "write down")
  – Clearly absurd!

Solution
• Define maximum and current levels for subjects
  – maxlevel(s) dom curlevel(s)
• Example
  – Treat Major as an object (Colonel is writing to him/her)
  – Colonel has maxlevel (Secret, {NUC, EUR})
  – Colonel sets curlevel to (Secret, {EUR})
  – Now L(Major) dom curlevel(Colonel)
• Colonel can write to Major without violating "no writes down"
  – Does L(s) mean curlevel(s) or maxlevel(s)?
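The Colonel/Major example can be checked mechanically. A sketch assuming an illustrative rank numbering:

```python
RANK = {"Unclassified": 1, "Confidential": 2, "Secret": 3, "Top Secret": 4}

def dom(a, b):
    return RANK[a[0]] >= RANK[b[0]] and b[1] <= a[1]

maxlevel_colonel = ("Secret", {"NUC", "EUR"})
curlevel_colonel = ("Secret", {"EUR"})  # Colonel drops the NUC category
level_major = ("Secret", {"EUR"})

# Against the Colonel's maxlevel, writing to the Major would be a "write down":
print(dom(level_major, maxlevel_colonel))  # False
# The invariant maxlevel(s) dom curlevel(s) holds:
print(dom(maxlevel_colonel, curlevel_colonel))  # True
# Against curlevel, the Major dominates, so the write is permitted:
print(dom(level_major, curlevel_colonel))  # True
```

This is exactly the ambiguity the slide flags: the *-property's verdict flips depending on whether L(s) means maxlevel(s) or curlevel(s).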
Principle of Tranquility
• Raising an object's security level
  – Information once available to some subjects is no longer available
  – Usually assume the information has already been accessed, so this does nothing
• Lowering an object's security level
  – The declassification problem
  – Essentially a "write down" violating the *-property
  – Solution: define a set of trusted subjects that sanitize or remove sensitive information before the security level is lowered

Types of Tranquility
• Strong Tranquility
  – The clearances of subjects, and the classifications of objects, do not change during the lifetime of the system
• Weak Tranquility
  – The clearances of subjects, and the classifications of objects, do not change in a way that violates the simple security condition or the *-property during the lifetime of the system
Biba Integrity Model
• A formal state transition system that describes a set of access control rules designed to ensure data integrity
• Developed by Kenneth J. Biba in 1977
• The model is designed so that subjects may not corrupt data in a level ranked higher than the subject, or be corrupted by data from a lower level than the subject

Up and Down
• This security model is directed toward data integrity (rather than confidentiality) and is characterized by the phrase "no read down, no write up"
• This is in contrast to the Bell-LaPadula model, which is characterized by the phrase "no write down, no read up"
Reading and Writing
• Users can only create content at or below their own integrity level (a monk may write a prayer book that can be read by commoners, but not one to be read by a high priest)
• Users can only view content at or above their own integrity level (a monk may read a book written by the high priest, but may not read a pamphlet written by a lowly commoner)

Information Transfer Path
• An information transfer path is a sequence of objects o_1, ..., o_{n+1} and a corresponding sequence of subjects s_1, ..., s_n such that s_i r o_i and s_i w o_{i+1} for all i, 1 ≤ i ≤ n
• Idea: information can flow from o_1 to o_{n+1} along this path by successive reads and writes
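The monk example can be sketched as Biba's two checks; the integrity ranks below are illustrative assumptions:

```python
# Illustrative integrity ranks for the monk example (higher = more trustworthy).
INTEGRITY = {"commoner": 1, "monk": 2, "high priest": 3}

def can_read(subject, obj_author):
    """"No read down": the object's integrity must be at or above the subject's."""
    return INTEGRITY[obj_author] >= INTEGRITY[subject]

def can_write(subject, obj_audience):
    """"No write up": the object's integrity must be at or below the subject's."""
    return INTEGRITY[obj_audience] <= INTEGRITY[subject]

print(can_read("monk", "high priest"))   # True: may read the high priest's book
print(can_read("monk", "commoner"))      # False: no read down
print(can_write("monk", "commoner"))     # True: may write for commoners
print(can_write("monk", "high priest"))  # False: no write up
```

The comparisons are exactly Bell-LaPadula's with the directions reversed, which is why the two models' slogans are mirror images of each other.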
Triad Programming Contest
• Saturday morning, April 10
• Teams of 3 students writing programs as fast as possible
• Prizes
  – 1st Place: $300.00 ($100 per member)
  – 2nd Place: $225.00 ($75 per member)
  – 3rd Place: $150.00 ($50 per member)
• Contact Prof. Carr ([email protected]) if you are interested
Teaching Evaluation
• You should have received an email at your ncat account explaining how to do the teaching evaluation
• Teaching evaluations are important
• Answer honestly