
Trust and Privacy in Authorization
Bharat Bhargava
Yuhui Zhong
Leszek Lilien
CERIAS Security Center
CWSA Wireless Center
Department of CS and ECE
Purdue University
Supported by NSF grants IIS-0209059 and IIS-0242840
Applications/Broad Impacts
• Guidelines for the design and deployment of
security-sensitive applications in next-generation networks
– Data sharing for medical research and treatment
– Collaboration among government agencies for
homeland security
– Transportation systems (travel security checks,
hazardous material disposal)
– Collaboration among government officials, law
enforcement, security personnel, and health care
facilities during bio-terrorism and other emergencies
Trust-based Authorization
Authorization based on:
• Role-Based Access Control (RBAC) model
• Uncertain evidence
• Dynamic trust
Authorization process considering:
• Tradeoff between privacy and trust
A. Trust-based Authorization
• Problem
– Dynamically establish and maintain trust among
entities in an open environment
• Research directions
– Handling uncertain evidence
– Modeling dynamic trust
• Challenges
– Uncertain information complicates inference
– Subjectivity leads to varying interpretations of the
same information
– Trust is multi-faceted and context-dependent – hence
trust modeling requires tradeoffs:
• representational comprehensiveness vs. computational simplicity
Uncertain Evidence
• Evaluating the uncertainty of a role-assignment
policy given a set of uncertain evidence
• Probability-based approach
– Atomic formulas:
Bayesian network + causal inference +
conditional-probability interpretation of opinions
– AND/OR expressions: combination rules [Jøsang'01]
– Subjectivity handled by the discounting operator
[Shafer'76] (see the sketch below)
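The sketch below illustrates these operators in the style of subjective logic, under simplifying assumptions (base rates omitted, statements treated as independent); the Opinion class and function names are ours, not the prototype's API.

```python
# A minimal sketch of subjective-logic opinion operators
# (simplified: base rates omitted, statements independent).
from dataclasses import dataclass

@dataclass
class Opinion:
    belief: float       # b: evidence supporting the statement
    disbelief: float    # d: evidence against the statement
    uncertainty: float  # u: lack of evidence; b + d + u = 1

def conjunction(x: Opinion, y: Opinion) -> Opinion:
    """Opinion about 'x AND y' for independent statements."""
    return Opinion(
        belief=x.belief * y.belief,
        disbelief=x.disbelief + y.disbelief - x.disbelief * y.disbelief,
        uncertainty=(x.belief * y.uncertainty + x.uncertainty * y.belief
                     + x.uncertainty * y.uncertainty))

def disjunction(x: Opinion, y: Opinion) -> Opinion:
    """Opinion about 'x OR y' for independent statements."""
    return Opinion(
        belief=x.belief + y.belief - x.belief * y.belief,
        disbelief=x.disbelief * y.disbelief,
        uncertainty=(x.disbelief * y.uncertainty + x.uncertainty * y.disbelief
                     + x.uncertainty * y.uncertainty))

def discount(trust_in_source: Opinion, statement: Opinion) -> Opinion:
    """Discount a source's opinion by our trust in that source,
    modeling the subjectivity of second-hand evidence."""
    return Opinion(
        belief=trust_in_source.belief * statement.belief,
        disbelief=trust_in_source.belief * statement.disbelief,
        uncertainty=(trust_in_source.disbelief + trust_in_source.uncertainty
                     + trust_in_source.belief * statement.uncertainty))

alice_trusts_bob = Opinion(0.8, 0.1, 0.1)
bob_on_claim = Opinion(0.6, 0.2, 0.2)
print(discount(alice_trusts_bob, bob_on_claim))  # Alice's discounted view
```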
Dynamic Trust
• Trust established based on direct interaction
– Identify behavior patterns and their characteristic
features
– Determine which pattern is the best match for the
current interaction sequence
– Develop algorithms for establishing trust
• Unique feature: we consider behavior patterns
• Reputation evaluation
– Choose reputation information providers
– Scale reputation ratings across raters (see the sketch below)
• e.g., Bob’s rating of 0.7 may mean 0.5 to Alice but 0.8 to Carol
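One way to realize rating scaling, sketched below under assumed semantics: map a rating to its percentile in the source rater's rating history, then to the value at that percentile in the target rater's history (quantile matching). All names are illustrative, not the prototype's.

```python
# A hypothetical sketch of cross-rater rating rescaling via
# quantile matching between two raters' rating histories.
import bisect

def rescale(rating: float, source_history: list[float],
            target_history: list[float]) -> float:
    src = sorted(source_history)
    tgt = sorted(target_history)
    # Percentile of the rating within the source rater's past ratings.
    pct = bisect.bisect_left(src, rating) / len(src)
    # Value at the same percentile in the target rater's past ratings.
    idx = min(int(pct * len(tgt)), len(tgt) - 1)
    return tgt[idx]

bob = [0.6, 0.7, 0.7, 0.8, 0.9]    # Bob rates generously
alice = [0.2, 0.3, 0.5, 0.6, 0.7]  # Alice rates strictly
print(rescale(0.7, bob, alice))    # Bob's 0.7 lands lower on Alice's scale
```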
TERA Architecture
[Architecture diagram: users (e.g., Alice and Bob) send role requests to TERM servers inside TERA; the TERM servers exchange reputation information with a reputation server, compute trust based on the users' behaviors observed during interactions, and pass assigned roles to RBAC-enhanced application servers.]
Trust-Enhanced Role Assignment (TERA) Prototype
• Trust-enhanced role mapping (TERM) server
assigns roles to users based on
– Uncertain and subjective evidence
– Dynamic trust
• Reputation server
– Repository of dynamic trust information
– Evaluates reputation from trust information, using
algorithms specified by the TERM server
Prototype and demo are available at
http://www.cs.purdue.edu/homes/bb/NSFtrust/
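A minimal sketch of the resulting role-assignment rule, assuming each role carries a trust threshold; the Role class and threshold values are illustrative, not the TERM server's actual interface.

```python
# A hypothetical sketch of trust-based role assignment: a role is
# granted only if the requester's current trust value meets the
# role's threshold.
from dataclasses import dataclass

@dataclass
class Role:
    name: str
    min_trust: float  # trust level required to hold this role

def assign_roles(trust: float, requested: list[Role]) -> list[Role]:
    """Return the subset of requested roles the user qualifies for."""
    return [role for role in requested if trust >= role.min_trust]

roles = [Role("patient-record-reader", 0.6), Role("auditor", 0.9)]
print([r.name for r in assign_roles(0.7, roles)])  # only the first role
```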
B. Trading Privacy for Trust
• Problems
– Minimize the loss of privacy necessary to gain the required level of trust
– Control dissemination of “traded” private data
• Research directions
– Measuring privacy
– Modeling the privacy-trust tradeoff
– Controlling private data dissemination
• Challenges
– Specify policies through metadata and establish guards as procedures
– Efficient implementation
• self-descriptiveness, apoptosis, evaporation
– Define context-dependent privacy disclosure policies
• depending on who will get the information, its possible uses, information
disclosed in the past, etc.
– Propose more universal privacy metrics
• existing metrics are usually ad hoc and customized
Details at: http://www.cs.purdue.edu/homes/bb/priv_trust_cerias.ppt
Privacy Metrics
• Determine the degree of data privacy
– Size-of-anonymity-set metrics
– Entropy-based metrics
• Privacy metrics should account for:
– Dynamics of legitimate users
– Dynamics of violators
– Associated costs
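To make the two metric families concrete, the sketch below computes the anonymity-set size and an entropy-based metric over an attacker's probability distribution on candidate subjects; 2^H can be read as an effective anonymity-set size. Function names are ours.

```python
# A minimal sketch of the two privacy-metric families above.
# probs[i] = attacker's probability that subject i is the data owner.
import math

def anonymity_set_size(probs: list[float]) -> int:
    """Count subjects the attacker cannot rule out (p > 0)."""
    return sum(1 for p in probs if p > 0)

def entropy_bits(probs: list[float]) -> float:
    """Shannon entropy of the attacker's distribution, in bits;
    2**entropy_bits(...) acts as an effective anonymity-set size."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25] * 4            # attacker learned nothing
skewed = [0.7, 0.1, 0.1, 0.1]   # attacker strongly suspects one subject
print(anonymity_set_size(uniform), entropy_bits(uniform))  # 4, 2.0 bits
print(anonymity_set_size(skewed), entropy_bits(skewed))    # 4, ~1.36 bits
```

Both distributions have the same anonymity-set size, but the entropy metric captures the privacy lost to the skewed distribution, which is why the set size alone is too coarse.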
Privacy-Trust Tradeoff
• Gain the required level of trust with minimal privacy
loss
• Build trust based on users’ digital credentials
that contain private information
• Formulate the privacy-trust tradeoff problem
• Estimate privacy loss due to disclosing a set of
credentials
• Estimate trust gain due to disclosing credentials
• Develop algorithms that minimize privacy loss
for a required trust gain (see the sketch below)
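A sketch of the minimization problem in the last bullet, under the simplifying assumption that per-credential privacy losses and trust gains are known and combine additively; in the project these estimates would come from the privacy metrics and trust models above. All names are illustrative.

```python
# A hypothetical sketch of minimal-privacy-loss credential selection:
# exhaustively search credential subsets for one that meets the
# required trust gain at the least privacy cost.
from dataclasses import dataclass
from itertools import combinations

@dataclass
class Credential:
    name: str
    privacy_loss: float  # estimated privacy loss if disclosed
    trust_gain: float    # estimated trust gain if disclosed

def cheapest_disclosure(creds: list[Credential],
                        required_gain: float) -> list[Credential] | None:
    best, best_loss = None, float("inf")
    for k in range(len(creds) + 1):
        for subset in combinations(creds, k):
            gain = sum(c.trust_gain for c in subset)
            loss = sum(c.privacy_loss for c in subset)
            if gain >= required_gain and loss < best_loss:
                best, best_loss = list(subset), loss
    return best  # None if no subset reaches the required gain

creds = [Credential("employee-ID", 0.2, 0.4),
         Credential("credit-card", 0.8, 0.5),
         Credential("driver-license", 0.5, 0.6)]
chosen = cheapest_disclosure(creds, required_gain=0.9)
print([c.name for c in chosen])  # employee-ID + driver-license
```

The exhaustive search is exponential in the number of credentials; it is meant only to state the objective precisely, not to suggest an efficient algorithm.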
Controlling Private Data Dissemination
• Design self-descriptive private objects
• Construct a mechanism for apoptosis of
private objects
– apoptosis = clean self-destruction
• Develop proximity-based evaporation of
private objects
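A minimal sketch of all three mechanisms under assumed semantics: the object carries its policy metadata (self-descriptiveness), can destroy its payload (apoptosis), and returns coarser values at greater distance from its original guardian (evaporation). The class and its fields are illustrative.

```python
# A minimal sketch of a self-descriptive private object supporting
# apoptosis (clean self-destruction) and proximity-based evaporation
# (precision decays with distance from the original guardian).
from dataclasses import dataclass, field

@dataclass
class PrivateObject:
    value: float | None          # the private datum (e.g., a salary)
    policy: dict = field(default_factory=dict)  # self-descriptive metadata

    def apoptosis(self) -> None:
        """Cleanly self-destruct when misuse is suspected."""
        self.value = None
        self.policy.clear()

    def evaporate(self, distance: int) -> float | None:
        """Return a distorted value; distortion grows with distance."""
        if self.value is None:
            return None
        granularity = 10 ** distance        # coarser farther away
        return round(self.value / granularity) * granularity

salary = PrivateObject(82_500, {"owner": "Alice", "use": "loan approval"})
print(salary.evaporate(0))  # exact at the original guardian: 82500
print(salary.evaporate(3))  # coarse at distance 3: 82000
salary.apoptosis()          # payload destroyed
```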
Examples of Proximity Metrics
• Examples of one-dimensional distance metrics
– Distance ~ business type
[Diagram: Bank I is the original guardian; Banks II and III are at distance 1, Insurance Companies A, B, and C at distance 2, and Used Car Dealers 1, 2, and 3 at distance 5.]
If a bank is the original guardian, then: any other bank is “closer” than any insurance company, and any insurance company is “closer” than any used car dealer.
– Distance ~ distrust level: more trusted entities are “closer”
• Multi-dimensional distance metrics
– Security/reliability as one of the dimensions (see the sketch below)
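A sketch of the one-dimensional business-type distance from the example above; the distance values mirror the diagram, and everything else (function name, the fallback for unknown types) is an assumption.

```python
# A sketch of the business-type distance from the example above
# (original guardian = a bank); distances mirror the diagram.
BUSINESS_TYPE_DISTANCE = {
    "bank": 1,
    "insurance company": 2,
    "used car dealer": 5,
}

def proximity_distance(requester_type: str) -> int:
    """Distance used to pick the evaporation level for a requester."""
    return BUSINESS_TYPE_DISTANCE.get(requester_type, 10)  # unknown: far

# A bank is "closer" than an insurance company, which is "closer"
# than a used car dealer:
assert proximity_distance("bank") < proximity_distance("insurance company")
assert proximity_distance("insurance company") < proximity_distance("used car dealer")
```

Paired with the evaporation sketch above, a larger distance yields a coarser view of the private data.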
Private and Trusted System (PRETTY) Prototype
[Architecture diagram: numbered interactions (1)-(4), with sub-steps [2a]-[2d], among the user application, server application, TERA server, privacy negotiators, trust gain and privacy loss evaluator, and data disseminator; the numbering matches the information flow described below.]
TERA = Trust-Enhanced Role Assignment
Information Flow in PRETTY
1) The user application sends a query to the server application.
2) The server application sends user information to the TERA server for trust evaluation
and role assignment.
a) If a higher trust level is required for the query, the TERA server sends a request for more
of the user’s credentials to the privacy negotiator.
b) Based on the server’s privacy policies and the credential requirements, the server’s privacy
negotiator interacts with the user’s privacy negotiator to build a higher level of trust.
c) The trust gain and privacy loss evaluator selects credentials that will increase trust to the
required level with the least privacy loss. The calculation considers credential
requirements and credentials disclosed in previous interactions.
d) According to privacy policies and the calculated privacy loss, the user’s privacy negotiator
decides whether or not to supply the credentials to the server.
3) Once the trust level meets the minimum requirements, appropriate roles are
assigned to the user for execution of the query.
4) Based on the query results, the user’s trust level, and privacy policies, the data disseminator
determines: (i) whether to distort the data and, if so, to what degree, and (ii) what
privacy-enforcement metadata should be associated with it.
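The sketch below strings steps 2a-2d and 3-4 into one flow, reusing the illustrative Credential and cheapest_disclosure helpers from the tradeoff sketch earlier; it is a simplified walkthrough under the same additive-gain assumptions, not PRETTY's actual control logic.

```python
# A hypothetical end-to-end sketch of the PRETTY information flow,
# reusing Credential and cheapest_disclosure from the earlier sketch.

def handle_query(user_trust: float, required_trust: float,
                 user_credentials: list["Credential"],
                 max_privacy_loss: float) -> str:
    # (2a) More trust needed? Request credentials via the negotiators.
    if user_trust < required_trust:
        # (2c) Pick credentials with the least privacy loss.
        chosen = cheapest_disclosure(user_credentials,
                                     required_trust - user_trust)
        # (2d) The user side refuses if the privacy price is too high.
        if chosen is None or sum(c.privacy_loss for c in chosen) > max_privacy_loss:
            return "denied: trust requirement not met"
        user_trust += sum(c.trust_gain for c in chosen)
    # (3) Roles assigned; query executed.
    # (4) The data disseminator may still distort the results
    #     (cf. the evaporation sketch) before returning them.
    return "query executed under assigned roles"
```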