Assessing security threats of looping constructs

Pasquale Malacaria
Department of Computer Science
Queen Mary, University of London
Computational systems:
components which interact by passing information.
Given a probability distribution on the inputs of a system, each
information-passing event e has:
1- a probability of happening, p(e), and
2- a consequence if e happens (a weight w(e));
in security terms, w(e) = the possible damage to security arising from e happening.
Distributed interaction (diagram): a client (the attacker) sends a message m to a server; the account of Mr X is opened with probability μ(m = pw).
Sequential interaction (diagram): if x = 0 then y = 0, with probability μ(x = 0); otherwise y = 1, with probability μ(x != 0).
Aim:
Compute the expected security damage in a computational system.
Use probability theory to define “expected”:
Interference = Expected damage = E[w(e)] = ∑_e p(e) w(e)
Example:
compute the interference between high and the events access, deny in
if (low == high) access else deny
low and high are 2-bit (2 = log(4)) variables, i.e. with values in the range 0..3,
with uniform distribution (all values equally likely);
low = public, high = secret.
if (low == high) access else deny
Damage of access = disclose the whole secret, i.e. 2 (= log(4)) bits.
Damage of deny = eliminate one possibility (out of 4):
1. observe access:
probability = 1/4,
damage = 2 = log(4) − log(1) = log(4/1)
2. observe deny:
probability = 3/4,
damage = log(4) − log(3) = log(4/3)
Combining: 1/4 log(4/1) + 3/4 log(4/3), an instance of ∑ p_i log(1/p_i),
Shannon's entropy formula.
Notice 1/4 log(4/1) + 3/4 log(4/3) = 0.8113 > 0: the program is not secure.
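A quick numeric check of this calculation (a minimal Python sketch added here, not part of the original slides):

import math

# Leakage of: if (low == high) access else deny
# high is uniform over 4 values; low is a fixed public value.
p_access = 1 / 4                    # probability the public value matches the secret
p_deny = 3 / 4                      # probability it does not

# damage (in bits) of each observation: log2(possibilities before / possibilities after)
damage_access = math.log2(4 / 1)    # the whole 2-bit secret is disclosed
damage_deny = math.log2(4 / 3)      # one of the 4 possibilities is eliminated

leakage = p_access * damage_access + p_deny * damage_deny
print(round(leakage, 4))            # 0.8113 bits = H(1/4, 3/4)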
Theorem: No (password-protected) program is secure!
Overview

Computational system + input distribution
→ (information theory) → numbers measuring in-security (interference)
→ (Massey's results) → average length of a dictionary attack on the system

This setting has been explored in several papers by Clark, Hunt, Malacaria (2000-2006)
and relates to previous works by
• Denning (82) • Millen (87) • McLean (90) • Gray (92)
The present work:
How to measure interference precisely for loops?
Main problem: circular control flow.

if (low == high) access else deny

while(high < low) {low = low − 1}

Other issues: non-termination, unbounded iterations, probabilistic operators...
The theory: an information-theoretical semantics of loops

while e M

Interference = Leakage = lim_{n→∞} W(e, M)_n
Idea: W(e, M)_n = leakage up to the n-th iteration of the loop with guard e and body M.
To define the formula W(e, M)_n we need two ingredients:
1- from the guard e we define e^<i> as the event "e is true up to the i-th iteration"
("e is true after 0 iterations and ... and e is true after i iterations, but e is false
after i + 1 iterations").
2- to the command (state transformer) M we associate an equivalence
relation, i.e. a random variable.
Leakage = lim_{n→∞} W(e, M)_n

W(e, M)_n = H(μ(e^<0>), ..., μ(e^<n>), 1 − Σ_{0≤i≤n} μ(e^<i>)) + Σ_{1≤i≤n} μ(e^<i>) H(M^i | e^<i>)

H(μ(e^<0>), ..., μ(e^<n>), 1 − Σ_{0≤i≤n} μ(e^<i>)) = entropy of the events e^<i>
= leakage of the guard

Σ_{1≤i≤n} μ(e^<i>) H(M^i | e^<i>) = leakage of the body

μ(e^<i>) H(M^i | e^<i>) = (probability of e true up to iteration i) × (leakage of M^i knowing e^<i>)

M^i = the random variable M;...;M (i times).
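To make the guard term concrete, here is a minimal Python sketch (added for illustration; the helper names are ours) computing H(μ(e^<0>), ..., μ(e^<n>), 1 − Σ_{0≤i≤n} μ(e^<i>)) from the list of probabilities μ(e^<i>):

import math

def shannon_entropy(probs):
    # H(p_1, ..., p_m) in bits; zero-probability entries contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

def guard_leakage(mu_e):
    # guard term of W(e, M)_n: entropy of the events e^<0>, ..., e^<n>
    # together with the residual probability 1 - sum_i mu(e^<i>)
    residual = 1 - sum(mu_e)
    return shannon_entropy(list(mu_e) + [residual])

# Example: mu(e^<0>) = mu(e^<1>) = mu(e^<2>) = 1/4, residual 1/4
print(guard_leakage([0.25, 0.25, 0.25]))   # H(1/4, 1/4, 1/4, 1/4) = 2.0 bits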
Using this semantics we can define:

Rate of leakage = lim_{n→∞, μ(e^<n>) ≠ 0} W(e, M)_n / n
(leak per number of iterations)

Channel capacity = max_μ lim_{n→∞} W(e, M)_n
(maximum over the input distributions)
Example of case studies

low = 20;
while(high < low)
{low = low − 1}

Questions: leakage, channel capacity, rate?
Use the above to decide: is this program secure?
Leakage (h under the uniform distribution) in
l = 20;
while(h < l)
{ l = l − 1}
Here Σ_{1≤i≤n} μ(e^<i>) H(M^i | e^<i>) = 0 (no leakage in the body),
so we only need the guard term H(μ(e^<0>), ..., μ(e^<n>), 1 − Σ_{0≤i≤n} μ(e^<i>)).
[Plot: guard leakage for the uniform distribution.]
Notice that the channel capacity is much higher: taking the distribution shown in the accompanying plot, the leakage is 4.3219.
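As an illustration (a sketch added here, not from the slides), the guard leakage can be computed directly from the partition the loop induces on h. The bit-size k = 16 and the choice of a distribution uniform on 0..19 (one distribution that yields the quoted 4.3219 = log2(20); the slides do not show which distribution was used) are assumptions made for concreteness:

import math

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def loop_leakage(mu_h, l0=20):
    # guard leakage of  l = l0; while(h < l) { l = l - 1 }
    # the loop reveals h exactly when h < l0 (the final l equals h) and only
    # the fact "h >= l0" otherwise, so it partitions h into {0}, ..., {l0-1}
    # and {h >= l0}
    exact = [mu_h.get(v, 0.0) for v in range(l0)]   # each h in 0..l0-1 is a singleton block
    rest = 1 - sum(exact)                           # all h >= l0 collapse into one block
    return shannon_entropy(exact + [rest])

k = 16                                              # assumed bit-size of h (not fixed in the slides)
uniform = {v: 1 / 2**k for v in range(2**k)}
print(round(loop_leakage(uniform), 4))              # about 0.0053 bits: uniform h leaks very little

low_range = {v: 1 / 20 for v in range(20)}          # distribution concentrated on 0..19
print(round(loop_leakage(low_range), 4))            # log2(20) = 4.3219 bits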
SUMMARY
l = 20;
while(h < l)
{ l = l − 1}
Bound=4.3219
Channel capacity=4.3219
Channel rate constant
Security >= k − 4.3219
A may-terminating loop

l=0;
flag=tt;
while (flag or l<h)
{
  if (h <= C) flag=ff;
  l=l+1;
}

This loop terminates if h <= C, and in that case l = h.
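A sketch of the corresponding leakage computation (added here; it assumes h ranges over positive values so that exactly C values satisfy h <= C, and it treats non-termination as a single observable outcome, as in the loop semantics above):

import math

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def may_terminate_leakage(mu_h, C):
    # for h <= C the loop terminates with l = h, revealing h exactly;
    # for h > C it diverges, which counts here as one observable outcome
    revealed = [p for v, p in mu_h.items() if v <= C]   # each terminating h is its own block
    diverge = 1 - sum(revealed)                         # all diverging h collapse together
    return shannon_entropy(revealed + [diverge])

C = 64
on_terminating = {v: 1 / C for v in range(1, C + 1)}    # all mass on values that terminate
print(round(may_terminate_leakage(on_terminating, C), 4))  # log2(64) = 6.0, i.e. the Log(C) bound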
SUMMARY
Bound=Log(C)
Channel capacity=Log(C)
Rate decreasing
Security >= k - Log(C)
Parity check loop

high=BigFile;
i=0;
low=0;
while (i<FileSize)
{
  low = Xor(high[i], low);
  i=i+1;
}

This loop computes the Xor of all the secret bits.
Analysis:
Leakage of the guard = 0, for all µ.
Body: H(M^i | e^<i>) <= 1 for all i (with equality for the uniform distribution µ_U),
and μ(e^<i>) = 1 if i = FileSize, 0 otherwise,
so Σ_{1≤i≤n} μ(e^<i>) H(M^i | e^<i>) = 0·1 + ... + 0·1 + 1·1 = 1,
attained with µ_U as the channel (capacity-achieving) distribution.
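For concreteness, a small sketch (added here, not from the slides) checking that the parity observable leaks exactly one bit of a uniformly distributed secret, whatever its size:

import math
from collections import Counter
from itertools import product

def parity_leakage(k):
    # leakage of the parity-check loop on a uniformly distributed k-bit secret:
    # the observer sees only low = XOR of all bits, so the leakage is H(low)
    outputs = Counter()
    for bits in product([0, 1], repeat=k):    # enumerate all 2^k secrets
        parity = 0
        for b in bits:
            parity ^= b                       # what the loop accumulates into low
        outputs[parity] += 1 / 2**k
    return -sum(p * math.log2(p) for p in outputs.values() if p > 0)

print(parity_leakage(8))                      # 1.0 bit, independently of the file size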
SUMMARY
Bound = 1
Channel capacity=1
Channel rate constant
Security >= k-1
Loop with probabilistic operators:

int i=0; low = 0;
while (i< size(high)) {
  if (Coin[i]==0)
    low[i] = high[i];
  i=i+1;
}
System.out.println(low);

Intuition: at each bit of high, toss a coin and, if the coin shows tails (Coin[i] == 0), copy that bit into low.
Guard leak = 0. Body: see the coins as a stream of 0s and 1s and compute the probabilities of the streams (binomial distribution).
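A sketch of that binomial computation (added here; it reads each copied bit as one bit leaked, which is the reading under which the expected leakage works out to k·µ(Coin = 0); the parameter values are illustrative):

from math import comb

def expected_copied_bits(k, p):
    # expected number of secret bits copied into low: weight each coin stream
    # by its binomial probability (j zeros out of k tosses) and count the copies
    return sum(comb(k, j) * p**j * (1 - p)**(k - j) * j for j in range(k + 1))

k, p = 32, 0.5                                # illustrative: 32-bit secret, fair coin
print(round(expected_copied_bits(k, p), 4))   # 16.0 = k * p, the k*mu(Coin = 0) figure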
SUMMARY
Unbounded
Channel capacity = k·µ(Coin = 0)
Channel rate constant
Security >= k − k·µ(Coin = 0)
What next?
Analysis of hashing protocols
Multithreading, distributed systems
Program Analysis