
Better Data Protection by Design
through Multicriteria Decision Making:
On False Tradeoffs between Privacy and Utility
Bettina Berendt
Dept. of Computer Science, KU Leuven
https://people.cs.kuleuven.be/~bettina.berendt/
Annual Privacy Forum 2017
What’s the utility of personal data – or: when Data Protection by Design is not (only) about privacy

• Data science / computer science: privacy risk – utility tradeoffs
• Law: purposes, data minimisation
• Behavioural sciences & HCI: knowledge, social influences, intentions, use of PETs, assessment of technology
• Economics: utilities and dis-utilities of different stakeholders
→ DPbD: multi-disciplinary and multi-criteria
The case study
Practical Questions and Research Questions:
(PQ1) Did employees know what personal data were collected, and did they care?
(PQ2) Would they use an anonymisation PET if it were available?
(RQ1) To what extent does the intention to use PETs depend on prior knowledge of the underlying data-collection technology?
(RQ2) Is privacy-related decision making individual, or subject to social influences?
(RQ3) Co-design: Would employees be able to generate more design options for access control, and what would these be like?
Method: 5 questions + an open-ended comment, via SurveyMonkey
(PQ1, PQ2, RQ1): People care, often get the basics wrong, & intention to use PETs depends on prior knowledge

101 responses (42%) within 48 hrs
Q1: Which data do you think are collected and stored by the card readers?
[Chart of answer frequencies; an annotation marks the option that is in fact logged.]
Q3: If there were a button “anonymised version of coffee-getting authentication” on the card reader, would you use it?
[Responses plotted vs. Q1]
(RQ3): An efficient co-design technique

Q2: Do you think the purpose of barring unauthorized coffee-getting could also be attained with other data, or other means?

Examples:
• without data: lock the room, warning sign, security guard, no plastic cups and no cups in the cafeteria
• without personal data (collection and/or storage): anonymous tokens, typing a code, only checking authorization
• with less, or less fine-grained, personal data: restrict access to the cafeteria with card readers at all hours, or restrict access to the coffee machines with card readers but only outside office hours
(RQ2): Social influences on decision making

Q4: Do you think your use of the coffee machines may influence how your colleagues use them?
[Chart of responses: no / maybe / yes / no answer]

Q5: Why?
• Peer pressure / social influence – awareness/reflection: 7, imitation: 9
• Surveillance / chilling effects: 2
• ~ k-anonymity: 1
Exploratory analysis:
“Anonymity is not the point here.”
• “I do think the card readers are weird. I do not think they are a big problem and I am not worried about them. I am more worried about the social conflict.”
• “The underlying thought process makes me uncomfortable. ‘Tracking everything will solve our problems.’”
• “The department always have felt like a place where everybody tries to be as flexible as possible. It's sad that the department now seems to be willing to question this flexibility over the price of a few coffees.”
• “They will buy own coffee machines. This will reduce the number of informal meetings in the cafeteria.”
• “The critical reception of the card readers on our coffee machines is actually a good sign. One can't expect (junior) scientists to be good and uncritical at the same time.”
“Social comments” topics (n=33):
• Communal space: actions, perceptions
• Salience of decision making
• Accountability (charging)
• Usability
[Diagram: topics arranged between “Social” and “Privacy”]
Synopsis of findings: Technology (incl. PETs) acceptance and intention to use
Computer science modelling: data utility vs. data privacy / risk–utility tradeoffs

Data utility: “the value of a given data release as an analytical resource – the key issue being whether the data represent whatever it is they are supposed to represent.”

[Figure: risk–utility space with points for Full data, Transformed data, and No data]
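The tradeoff can be made concrete with a toy computation. A minimal Python sketch, with invented records and deliberately crude proxy metrics (not the study's data): re-identification risk as the mean inverse equivalence-class size, utility as the share of distinct quasi-identifier values retained.

from collections import Counter

# Toy quasi-identifier records (age, postcode), invented for illustration.
records = [(25, "3001"), (25, "3001"), (34, "3000"),
           (41, "3012"), (47, "3012"), (63, "3000")]

def generalise(age, postcode):
    # One possible transformation: bucket age into decades, truncate postcode.
    return (age // 10 * 10, postcode[:2])

def risk(release):
    # Mean chance of singling a record out within its equivalence class.
    sizes = Counter(release)
    return sum(1 / sizes[r] for r in release) / len(release)

def utility(release, original):
    # Crude proxy: distinct values retained relative to the full release.
    return len(set(release)) / len(set(original))

full = records
transformed = [generalise(a, p) for a, p in records]
for name, release in [("full data", full), ("transformed data", transformed)]:
    print(f"{name}: risk={risk(release):.2f}, utility={utility(release, full):.2f}")
# "No data" sits at risk 0 and utility 0; transformations trace points in between.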
A legal view: purpose specification, data minimisation

Can the goal be reached, the purpose fulfilled, with these data? With less data?
→ Data have some utility relative to a purpose.
Example purposes:
• Authentication: only personnel members should be able to use the resource.
• Accountability (1): resource consumers are accountable for their consumption.
• Accountability (2): the resource supplier is accountable for invoiced amounts.
• Accountability (3): individuals or specified anonymity sets are accountable in cases of abuse (e.g. theft).

[Figure: the risk–utility space again; the data-minimal solution is marked among Full data, Transformed data, and No data.]
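Purpose-relative utility can be read as a mapping from declared purposes to the smallest data set that fulfils them. A hypothetical Python sketch; the purposes and attribute requirements below are invented assumptions, not the talk's analysis:

MINIMAL_DATA = {
    # Invented attribute requirements (assumptions, not the talk's analysis).
    "authentication": {"valid_token"},                # no identity needed
    "accountability_consumer": {"user_id", "timestamp"},
    "accountability_supplier": {"count_per_period"},  # an aggregate suffices
    "accountability_abuse": {"anonymity_set", "timestamp"},
}

def minimal_attributes(purposes):
    # Data minimisation: the union of the minimal sets for the declared purposes.
    return set().union(*(MINIMAL_DATA[p] for p in purposes))

# If the only purpose is barring unauthorized coffee-getting, no personal
# data need to be stored at all:
print(minimal_attributes(["authentication"]))  # {'valid_token'}
print(minimal_attributes(["authentication", "accountability_consumer"]))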
An economic view: Utilities and dis-utilities for stakeholders

• Utility = a measure of preferences over some set of goods (incl. services); it represents the satisfaction experienced by the consumer of the good.
• (Dis-)utilities arise from multiple criteria.
• Aggregation over different stakeholders!

[Figure: the risk–utility space once more, with Full data, Transformed data, and No data.]
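One way to operationalise such an aggregation, sketched below with invented stakeholders, criteria, scores, and weights (a simple weighted sum, not a model from the talk):

# Hypothetical multi-criteria, multi-stakeholder aggregation for two
# design options; all names and numbers are invented for illustration.
options = {
    "full logging": {
        "employee": {"privacy": -0.8, "convenience": 0.6},
        "employer": {"cost_control": 0.9, "social_climate": -0.4},
    },
    "anonymous tokens": {
        "employee": {"privacy": 0.7, "convenience": 0.4},
        "employer": {"cost_control": 0.7, "social_climate": 0.2},
    },
}
weights = {"employee": 0.5, "employer": 0.5}  # equal stakeholder weights

def aggregate(option):
    # Weighted sum over stakeholders of each stakeholder's summed criteria.
    return sum(weights[s] * sum(criteria.values())
               for s, criteria in option.items())

for name, option in options.items():
    print(f"{name}: {aggregate(option):+.2f}")
# full logging: 0.5*(-0.2) + 0.5*(0.5) = +0.15
# anonymous tokens: 0.5*(1.1) + 0.5*(0.9) = +1.00

On these invented numbers, the option that looks worse on raw data utility dominates once all stakeholders' (dis-)utilities are counted, which is the sense in which the privacy–utility tradeoff can be a false one.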
A multi-disciplinary, multi-criteria view

[Figure: the same risk–utility space, now read through the computer-science, legal, and economic lenses at once.]
Summary: utility …
• Data science / computer science
• Law
• Behavioural sciences & HCI
• Economics
→ DPbD: multi-disciplinary and multi-criteria
Conclusion: More questions!
• How to model realistic risk–utility spaces?
• How does this relate to the GDPR’s “protection of individuals’ fundamental rights and freedoms”, and to proportionality?
• Whether/how to design systems to influence knowledge and attitudes, and to leverage social influences?
• DPbD often involves organisational change. Is this wanted? By whom? What if it is not wanted?
→ DPbD: multi-disciplinary and multi-criteria