A Probabilistic Analysis of Onion Routing in a Black-box Model
10/29/2007
Workshop on Privacy in the Electronic Society
Aaron Johnson (Yale)
with
Joan Feigenbaum (Yale)
Paul Syverson (NRL)
Contributions
1. Use a black-box abstraction to create a probabilistic model of onion routing
2. Analyze unlinkability
   a. Provide worst-case bounds
   b. Examine a typical case
Related Work
• A Model of Onion Routing with Provable Anonymity. J. Feigenbaum, A. Johnson, and P. Syverson. FC 2007.
• Towards an Analysis of Onion Routing Security. P. Syverson, G. Tsudik, M. Reed, and C. Landwehr. PET 2000.
• An Analysis of the Degradation of Anonymous Protocols. M. Wright, M. Adler, B. Levine, and C. Shields. NDSS 2002.
Anonymous Communication
• Sender anonymity: the adversary can't determine the sender of a given message
• Receiver anonymity: the adversary can't determine the receiver of a given message
• Unlinkability: the adversary can't determine who talks to whom
How Onion Routing Works
[Figure: user u running an onion-routing client, routers 1-5 running onion-routing servers, and an Internet destination d]
1. u creates a 3-hop circuit through the routers (here u → 1 → 4 → 3).
2. u opens a stream in the circuit to d.
3. Data is exchanged: u sends {{{m}3}4}1, and routers 1, 4, and 3 each remove one layer of encryption ({{m}3}4, then {m}3, then m) before the message reaches d. A reply m' from d is wrapped in layers in the reverse direction ({m'}3, {{m'}3}4, {{{m'}3}4}1) and unwrapped by u (see the sketch below).
4. The stream is closed.
5. The circuit is changed every few minutes.
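To make the layered encryption in step 3 concrete, here is a minimal sketch (not from the talk): plain string wrapping stands in for real per-hop encryption, and the circuit [1, 4, 3] and the helper names wrap/peel are illustrative assumptions.

```python
# Minimal sketch of the layering in step 3 (illustrative only).
# Real onion routing uses per-hop symmetric keys negotiated during circuit
# setup; here "encryption" is just a labeled wrapper so the peeling is visible.

def wrap(message, circuit):
    """Wrap a message for the circuit: {{{m}3}4}1 for circuit [1, 4, 3]."""
    onion = message
    for router in reversed(circuit):      # innermost layer belongs to the last router
        onion = "{" + onion + "}" + str(router)
    return onion

def peel(onion, router):
    """Each router removes exactly its own layer."""
    suffix = "}" + str(router)
    assert onion.startswith("{") and onion.endswith(suffix)
    return onion[1:-len(suffix)]

circuit = [1, 4, 3]                        # u -> 1 -> 4 -> 3 -> d
onion = wrap("m", circuit)
print(onion)                               # {{{m}3}4}1, as on the slide
for router in circuit:                     # each hop peels one layer
    onion = peel(onion, router)
    print(router, "->", onion)             # ends with the plain message m
```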
Adversary
[Figure: user u, routers 1-5, and destination d, with some routers compromised by the adversary]
• Active & local
Anonymity
[Figure: users u, v, w connecting through routers 1-5 to destinations d, e, f]
1. First router compromised
2. Last router compromised
3. First and last compromised
4. Neither first nor last compromised
Black-box Abstraction
[Figure: users u, v, w enter a black box; destinations d, e, f exit it]
1. Users choose a destination
2. Some inputs are observed
3. Some outputs are observed
Black-box Anonymity
[Figure: the black box with some inputs and some outputs observed]
• The adversary can link observed inputs and outputs of the same user.
• Any configuration consistent with these observations is indistinguishable to the adversary.
Probabilistic Black-box
[Figure: the black box, with each user's destination drawn from a distribution such as p_u]
• Each user v selects a destination from distribution p_v.
• Inputs and outputs are observed independently with probability b.
Probabilistic Anonymity
[Figure: four indistinguishable configurations of users u, v, w and destinations d, e, f; u's destination is d in every one of them]
• Conditional distribution: Pr[u→d] = 1
Black Box Model
Let U be the set of users.
Let Δ be the set of destinations.
A configuration C consists of:
• User destinations C_D : U → Δ
• Observed inputs C_I : U → {0,1}
• Observed outputs C_O : U → {0,1}
Let X be a random configuration such that:
Pr[X=C] = ∏_u p_u^{C_D(u)} · b^{C_I(u)}(1−b)^{1−C_I(u)} · b^{C_O(u)}(1−b)^{1−C_O(u)}
Probabilistic Anonymity
The metric Y for the unlinkability of u and d in C is:
Y(C) = Pr[X_D(u)=d | X ≈ C],
where ≈ denotes indistinguishability to the adversary.
Note: there are several other candidates for a probabilistic anonymity metric, e.g. entropy.
Exact Bayesian inference:
• Adversary after a long-term intersection attack
• Worst-case adversary
Unlinkability given that u visits d: E[Y | X_D(u)=d]
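The metric can be computed by brute-force Bayesian inference over all configurations that yield the same observations. The sketch below assumes one plausible reading of the adversary's view (which inputs are observed, which outputs are observed, the linked user-destination pairs when both ends of a user are observed, and the multiset of destinations of unlinked observed outputs); this reading, and all the numbers, are illustrative assumptions rather than the paper's formal definitions.

```python
# Exact computation of Y(C) = Pr[X_D(u)=d | X ~ C] by enumerating destination
# assignments that produce the same adversary view as C (C_I and C_O are
# observed directly, so only C_D varies among indistinguishable configurations).
from itertools import product

users = ["u", "v", "w"]
dests = ["d", "e", "f"]
b = 0.25
p = {v: {d: 1 / len(dests) for d in dests} for v in users}    # example priors p_v

def prior(C_D, C_I, C_O):
    """Pr[X=C] from the black-box model."""
    pr = 1.0
    for v in users:
        pr *= p[v][C_D[v]]
        pr *= b if C_I[v] else 1 - b
        pr *= b if C_O[v] else 1 - b
    return pr

def view(C_D, C_I, C_O):
    """What the adversary sees (one assumed reading of the observations)."""
    linked = tuple(sorted((v, C_D[v]) for v in users if C_I[v] and C_O[v]))
    unlinked = tuple(sorted(C_D[v] for v in users if C_O[v] and not C_I[v]))
    return tuple(C_I.items()), tuple(C_O.items()), linked, unlinked

def Y(C, target_user="u", target_dest="d"):
    """Posterior probability that target_user talks to target_dest, given C's view."""
    C_D0, C_I, C_O = C
    target_view = view(C_D0, C_I, C_O)
    num = den = 0.0
    for choice in product(dests, repeat=len(users)):
        C_D = dict(zip(users, choice))
        if view(C_D, C_I, C_O) != target_view:     # keep indistinguishable configs only
            continue
        pr = prior(C_D, C_I, C_O)
        den += pr
        if C_D[target_user] == target_dest:
            num += pr
    return num / den

# Example: u's input is observed, v's output (to e) is observed, nothing else.
C_D = {"u": "d", "v": "e", "w": "f"}
C_I = {"u": 1, "v": 0, "w": 0}
C_O = {"u": 0, "v": 1, "w": 0}
print(Y((C_D, C_I, C_O)))      # here the posterior equals u's prior p_u^d = 1/3
```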
Worst-case Anonymity
Relabel the destinations other than d so that p_u^1 ≤ p_u^2 ≤ … ≤ p_u^{d−1} ≤ p_u^{d+1} ≤ …
Theorem 1: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when
1. p_v^1 = 1 for all v≠u, OR
2. p_v^d = 1 for all v≠u
Proof outline:
• Show the maximum occurs when, for all v≠u, p_v^{e_v} = 1 for some e_v.
• Show the maximum occurs when, for all v≠u, e_v = d or e_v = 1.
• Show the maximum occurs when e_v = d for all v≠u, or e_v = 1 for all v≠u.
Worst-case Estimates
Let n be the number of users.
Theorem 2: When p_v^1 = 1 for all v≠u:
E[Y | X_D(u)=d] = b + b(1−b)·p_u^d + (1−b)²·p_u^d·[(1−b)/(1−(1−p_u^1)b) + O(log n / n)]
              ≈ b + (1−b)·p_u^d
Theorem 3: When p_v^d = 1 for all v≠u:
E[Y | X_D(u)=d] = b² + b(1−b)·p_u^d + (1−b)·[p_u^d/(1−(1−p_u^d)b) + O(log n / n)]
Compare: E[Y | X_D(u)=d] ≥ b² + (1−b²)·p_u^d.
The worst case increases the chance of total compromise from b² to b.
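To get a feel for the gap, a quick evaluation of the two simplified expressions above, with arbitrary example values of b and p_u^d (the numbers are not from the talk):

```python
# Worst-case form b + (1-b)p vs. the b^2 + (1-b^2)p form, for example values.
b, p_ud = 0.1, 0.05
worst = b + (1 - b) * p_ud          # a single compromised end suffices in the worst case
base = b**2 + (1 - b**2) * p_ud     # otherwise both ends must be compromised
print(worst)                        # 0.145
print(base)                         # 0.0595
```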
Typical Case
Let each user select from the Zipfian distribution: p^{d_i} ∝ 1/i^s.
Theorem 4: E[Y | X_D(u)=d] = b² + (1−b²)·p_u^d + O(1/n)
           ≈ b² + (1−b²)·p_u^d
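A short sketch of the typical-case estimate under an example Zipfian distribution; the values of num_dests, s, and b are arbitrary assumptions, not from the talk.

```python
# Zipfian destination probabilities p_{d_i} proportional to 1/i^s, and the
# Theorem 4 approximation for a user whose destination is the most popular one.
num_dests, s, b = 1000, 1.0, 0.1
weights = [1 / i**s for i in range(1, num_dests + 1)]
Z = sum(weights)                        # normalizing constant
p = [w / Z for w in weights]            # p_{d_i} = (1/i^s) / Z
p_ud = p[0]                             # example: u visits d = d_1
print(b**2 + (1 - b**2) * p_ud)         # approx E[Y | X_D(u)=d]
```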
Contributions
1. Use a black-box abstraction to create a probabilistic model of onion routing
2. Analyze unlinkability
a. Provide worst-case bounds
b. Examine a typical case
Future Work
1. Extend the analysis to other types of anonymity and to other systems.
2. Examine how quickly users' distributions are learned.
3. Analyze timing attacks.