
Point-Based Trust: Define
How Much Privacy is Worth
Danfeng Yao
Brown University
Keith B. Frikken
Miami University
Mikhail J. Atallah
Purdue University
Roberto Tamassia
Brown University
Funded by NSF IIS-0325345, IIS-0219560, IIS-0312357, and IIS-0242421, ONR N0001402-1-0364, CERIAS, and Purdue Discovery Park
ICICS December, 2006, Raleigh NC
Outline of the talk
1. Introduction to privacy protection in authorization
2. Point-based authorization and optimal credential selection
   2.1 New York State Division of Motor Vehicles 6-Point Authentication System
   2.2 Knapsack problem
3. Secure 2-party protocol for the knapsack problem
4. Applications
Protecting private information
Example: Alice requests a discount from a provider.
- The provider's policy: the discount requires Alice's UID credential (student ID).
- Alice's policy: releasing her UID requires the provider's BBB credential (Better Business Bureau).
- The exchange: Alice requests the discount; the provider requests her UID; Alice requests the provider's BBB credential; the provider releases BBB; Alice releases UID; the discount is granted.
Trust negotiation protocols [Winsborough Seamons Jones 00, Yu Ma Winslett 00, Winsborough Li 02, Li Du Boneh 03]
Our goals
- Prevent premature information leaking by both parties
  - Credentials should be exchanged only if the service can be established
- Support some kind of cumulative privacy quantitatively
  - Disclosing more credentials should incur higher privacy loss
- Support a flexible service model
  - Allow customized (or personalized) access policies
  - Adjustable services based on qualifications
Our ultimate goal is to encourage users to participate in e-commerce.
What can we learn from New York State DMV?
6-point proof-of-identity for getting a NY driver's license

Credential            Points
Passport              5
Utility bill          1
Birth certificate     4
Social security card  3
Another motivation: adjustable services

Membership / Credential   Discount
Mastercard                2%
Airline frequent flier    1%
AAA                       0.5%
Veteran                   0.5%

Adjustable services based on the private information revealed
Point-based authorization model
- Credential types C1, C2, ..., Cn
- The service provider defines:
  - Point values p1, p2, ..., pn of the credentials (private)
  - Threshold T for accessing a resource (private)
- The user defines sensitivity scores a1, a2, ..., an of the credentials (private)
- Credential selection problem: the user (or client) wants to satisfy threshold T with the minimum disclosure of privacy

  Minimize    Σ_{i=1}^{n} a_i x_i
  Subject to  Σ_{i=1}^{n} p_i x_i ≥ T

  where x_i = 1 if C_i is disclosed and x_i = 0 otherwise.

This can be converted to a knapsack problem.
Example
Threshold for accessing a resource: 10

Credential        Point value   Sensitivity score
College ID        3             10
Driver's license  6             30
Credit card       8             50
SSN               10            100

Alice's options           Point values   Sensitivity score
SSN                       10             100
College ID, Credit card   11             60
License, Credit card      14             80
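Alice's options above can be checked by exhaustively enumerating credential subsets. The following is a small illustrative sketch (credential names, point values, and scores are taken from the tables on this slide; it is not part of the protocol, which must run without either party seeing the other's private values):

```python
from itertools import combinations

# Point values and sensitivity scores from the example tables
creds = {"College ID": (3, 10), "Driver's license": (6, 30),
         "Credit card": (8, 50), "SSN": (10, 100)}

def best_selection(creds, T):
    """Return (score, subset) meeting threshold T with minimum total sensitivity."""
    names = list(creds)
    best = None
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            points = sum(creds[c][0] for c in subset)
            score = sum(creds[c][1] for c in subset)
            if points >= T and (best is None or score < best[0]):
                best = (score, sorted(subset))
    return best

# Optimal choice: College ID + Credit card (11 points, sensitivity 60)
print(best_selection(creds, 10))
```

This brute force is exponential in n; the point of the knapsack reduction on the following slides is to get the same answer in O(nT') time, and to do so privately.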
Where do points come from?
- Reputation systems [Beth Borcherding Klein 94, Tran Hitchens Varadharajan Watters 05, Zouridaki Mark Hejmo Thomas 05]
- This is future work, but here is an idea

[Diagram: members of a reputation system evaluate one another]
Converting CSP into a knapsack problem
- Define binary variables y1, y2, ..., yn, where yi = 1 - xi
- {ai}: private to the user
- {pi}: private to the provider

  Maximize    Σ_{i=1}^{n} a_i y_i
  Subject to  Σ_{i=1}^{n} p_i y_i ≤ T',  where T' = Σ_{i=1}^{n} p_i - T

[Diagram: a knapsack (bag) of size T' and n = 6 items. What to pick and steal?]
Dynamic programming of the knapsack problem
- Dynamic programming for the 0/1 knapsack problem
- Construct an n-by-T' table M, where T' = Σ_{i=1}^{n} p_i - T and

  M[i, j] = M[i-1, j]                              if j < p_i
  M[i, j] = max{ M[i-1, j], M[i-1, j-p_i] + a_i }  if j ≥ p_i

[Diagram: entry M[i, j] is computed from M[i-1, j] and M[i-1, j-p_i] in the previous row]

{ai}: private to the user
{pi}: private to the provider
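As a plaintext reference for the recurrence and the traceback (no encryption, so this does not preserve privacy; names are illustrative):

```python
def knapsack_select(points, scores, T):
    """Plaintext DP for credential selection: maximize the sensitivity of
    the *hidden* credentials subject to their points summing to <= T',
    then trace back; the complement is the optimal set to disclose."""
    n = len(points)
    Tp = sum(points) - T  # marginal threshold T'
    # M[i][j]: max hidden sensitivity using the first i credentials, points <= j
    M = [[0] * (Tp + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        p, a = points[i - 1], scores[i - 1]
        for j in range(Tp + 1):
            M[i][j] = M[i - 1][j]
            if j >= p:
                M[i][j] = max(M[i][j], M[i - 1][j - p] + a)
    # Traceback to recover which credentials stay hidden
    hidden, j = set(), Tp
    for i in range(n, 0, -1):
        if M[i][j] != M[i - 1][j]:
            hidden.add(i - 1)
            j -= points[i - 1]
    return [i for i in range(n) if i not in hidden]  # indices to disclose

# Example from the earlier slide: disclose College ID (0) and Credit card (2)
print(knapsack_select([3, 6, 8, 10], [10, 30, 50, 100], 10))
```

The protocol on the next slides computes exactly this table, but with the entries encrypted and the two private inputs ({pi}, T at the provider; {ai} at the client) never revealed to the other party.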
Overview of privacy-preserving knapsack computation
- Uses the 2-party maximization protocol [Frikken Atallah 04]
- Uses a homomorphic encryption scheme
  - E(x)E(y) = E(x + y)
  - E(x)^c = E(xc)
- Preserves privacy for both parties
- Two phases: table-filling and traceback

  M[i, j] = max{ M[i-1, j], -∞ + a_i }             if j < p_i
  M[i, j] = max{ M[i-1, j], M[i-1, j-p_i] + a_i }  if j ≥ p_i

- Adding the maximization and the addition of a_i to the j < p_i case makes the two computation procedures indistinguishable
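The slides do not name the homomorphic scheme, but Paillier encryption has exactly the two properties listed. A toy sketch of those properties (insecure key size, for illustration only) might look like:

```python
import math
import random

def paillier_keygen(p, q):
    """Toy Paillier key generation from two primes (real keys are 1024+ bits)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    # With g = n + 1, L(g^lam mod n^2) = lam mod n, so mu = lam^{-1} mod n
    mu = pow(lam, -1, n)
    return (n, n + 1), (lam, mu)

def paillier_encrypt(pk, m):
    n, g = pk
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def paillier_decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    u = pow(c, lam, n * n)
    return (((u - 1) // n) * mu) % n

pk, sk = paillier_keygen(47, 59)
n2 = pk[0] ** 2
cx, cy = paillier_encrypt(pk, 12), paillier_encrypt(pk, 30)
assert paillier_decrypt(pk, sk, (cx * cy) % n2) == 42  # E(x)E(y) = E(x+y)
assert paillier_decrypt(pk, sk, pow(cx, 5, n2)) == 60  # E(x)^c = E(xc)
```

The second property is what lets a party multiply an encrypted table entry by a known constant, and the first is what lets it add a_i under encryption without decrypting.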
Preliminary: 2-party maximization protocol in a split format

Player   Input               Output                  Privacy
Alice    Alice1, Alice2      Alice's share of max*   Does not learn which is the max
Amazon   Amazon1, Amazon2    Amazon's share of max*  Does not learn which is the max

* Alice's share + Amazon's share = max(Alice1 + Amazon1, Alice2 + Amazon2)

[Diagram: additive shares (Alice1, Amazon1) and (Alice2, Amazon2) are combined into fresh shares of the max]

Comparison can be done similarly [Frikken Atallah 04]
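The protocol itself is from [Frikken Atallah 04] and is not detailed in these slides; the sketch below only illustrates its interface (the ideal functionality), with a trusted computation standing in for the cryptographic steps. The modulus is an assumption of this sketch:

```python
import random

MOD = 2 ** 32  # additive shares live in Z_MOD (an assumption of this sketch)

def split_max(alice_shares, amazon_shares):
    """Ideal functionality of split-format maximization: inputs are additive
    shares of the candidates; outputs are *fresh* additive shares of the
    maximum, so neither party learns which candidate won."""
    m = max((a + b) % MOD for a, b in zip(alice_shares, amazon_shares))
    alice_out = random.randrange(MOD)           # Alice's share is uniformly random
    amazon_out = (m - alice_out) % MOD          # shares recombine to the max
    return alice_out, amazon_out

a_out, b_out = split_max([10, 7], [5, 20])
assert (a_out + b_out) % MOD == 27  # max(10 + 5, 7 + 20)
```

Because each party's output share is individually uniform, seeing one's own share reveals nothing about which candidate was the maximum, which is exactly the privacy column of the table above.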
Our protocol for dynamic programming of the 0/1 knapsack problem
1. Computed entries are encrypted and stored by the provider
2. The provider splits the two candidates of M[i, j]
3. The client and provider engage in a 2-party private maximization protocol to compute the maximum
4. The client encrypts her share of the maximum and sends it to the provider
5. The provider computes and stores the encrypted M[i, j]

  M[i, j] = max{ M[i-1, j], -∞ + a_i }             if j < p_i
  M[i, j] = max{ M[i-1, j], M[i-1, j-p_i] + a_i }  if j ≥ p_i

[Diagram: Alice (holding a_i) and Amazon (holding E(M[i-1, j]) and E(M[i-1, j-p_i])) run the split maximization and combine their shares into the encrypted M[i, j]]
Our protocol for knapsack (cont'd)
- At the end of the 2-party dynamic programming, the provider has an n-by-T' table of encrypted entries, where T' = Σ_{i=1}^{n} p_i - T
- Example with number of credentials n = 4 and T' = 5:

  E(M1,1)  E(M1,2)  E(M1,3)  E(M1,4)  E(M1,5)
  E(M2,1)  E(M2,2)  E(M2,3)  E(M2,4)  E(M2,5)
  E(M3,1)  E(M3,2)  E(M3,3)  E(M3,4)  E(M3,5)
  E(M4,1)  E(M4,2)  E(M4,3)  E(M4,4)  E(M4,5)

How does the client find out the optimal selection of credentials?
Traceback protocol: get the optimal credential selection

Item 0   0
Item 1   E(M1,1),E(F1,1)  E(M1,2),E(F1,2)  E(M1,3),E(F1,3)  E(M1,4),E(F1,4)  E(M1,5),E(F1,5)
Item 2   E(M2,1),E(F2,1)  E(M2,2),E(F2,2)  E(M2,3),E(F2,3)  E(M2,4),E(F2,4)  E(M2,5),E(F2,5)
Item 3   E(M3,1),E(F3,1)  E(M3,2),E(F3,2)  E(M3,3),E(F3,3)  E(M3,4),E(F3,4)  E(M3,5),E(F3,5)
Item 4   E(M4,1),E(F4,1)  E(M4,2),E(F4,2)  E(M4,3),E(F4,3)  E(M4,4),E(F4,4)  E(M4,5),E(F4,5)

- Each entry additionally stores E(F_{i,j}), where F_{i,j} = 0 or 1
- Security in a semi-honest (honest-but-curious) model
Security and efficiency of our privacy-preserving knapsack computation
- Informally, security means that private information is not leaked
- Security definitions
  - Semi-honest adversarial model
  - A protocol securely implements function f if the views of the participants are simulatable with an ideal implementation of the protocol
- Theorem: The basic protocol of the private two-party dynamic programming computation in the point-based trust management model is secure in the semi-honest adversarial model.
- Theorem: The communication complexity between the provider and the client of our basic secure dynamic programming protocol is O(nT'), where n is the total number of credentials and T' is the marginal threshold.
Fingerprint protocol: an improved traceback protocol
- We want to exclude the provider from the traceback
  - To prevent tampering and reduce costs
1. Filling the knapsack table
2. (Encrypted) last entry
3. Decrypt and identify the optimal credential selection

The fingerprint protocol is a general solution for traceback in dynamic programming.
Fingerprint protocol (cont'd)

Item No.  Privacy score (decimal)  Privacy score (binary)  Transformed score
1         2                        010                     010 0001
2         3                        011                     011 0010
3         5                        101                     101 0100
4         8                        1000                    1000 1000

Knapsack result (decimal)  Knapsack result (binary)  Item numbers in the knapsack
3                          ... 0010                  2
20                         ... 1111                  1, 2, 3, 4
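The transformed scores in the table append a one-hot indicator to each privacy score, so the low-order bits of any subset sum identify exactly which items contributed to it. A minimal sketch of this encoding (function names are illustrative, not from the paper):

```python
def fingerprint_transform(scores):
    """Shift each score left by n bits and append a one-hot indicator, so any
    subset sum carries the subset membership in its low n bits.  The one-hot
    bits are all distinct, so summing them can never produce a carry."""
    n = len(scores)
    return [(a << n) | (1 << i) for i, a in enumerate(scores)]

def decode_items(value, n):
    """Recover the 1-indexed item numbers from the low n bits of a result."""
    return [i + 1 for i in range(n) if (value >> i) & 1]

scores = [2, 3, 5, 8]  # from the table above
t = fingerprint_transform(scores)
assert decode_items(t[1], 4) == [2]             # result "... 0010" -> item 2
assert decode_items(sum(t), 4) == [1, 2, 3, 4]  # result "... 1111" -> all items
```

Running the dynamic program on the transformed scores lets the client decrypt only the last table entry and read the optimal credential selection off its low bits, without any help from the provider.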
Application of point-based authorization: fuzzy location query in presence systems

[Diagram: Alice's mom, Alice's boss, and Alice's ex each ask "Where is Alice?" and receive answers at different granularities depending on their points]
Related work
- Hidden credentials [Bradshaw Holt Seamons 04, Frikken Li Atallah 06]
- Private policy negotiation [Kursawe Neven Tuyls 06]
- Optimizing trust negotiation [Chen Clarke Kurose Towsley 05]
- Trust negotiation protocols/frameworks [Winsborough Seamons Jones 00, Yu Ma Winslett 00, Winsborough Li 02, Li Du Boneh 03, Li Li Winsborough 05]
- Anonymous credential approaches [Chaum 85, Camenisch Lysyanskaya 01]
- Secure multiparty computation [Atallah Li 04, Atallah Du 01]
- OCBE [Li Li 06]
- MANET [Zouridaki Mark Hejmo Thomas 05]
- Platform for Privacy Preferences (P3P) [W3C]
Conclusions and future work
- Our point-based model allows a client to choose the optimal selection of credentials
- We presented a private 2-party protocol for the knapsack problem
- Our fingerprint protocol is a general solution for traceback
- Future work
  - Add typing to credentials
  - Reputation systems and points