
Privacy and Rationality:
Theory and Evidence
Alessandro Acquisti
Heinz School, Carnegie Mellon University
[email protected]
Who should protect your privacy?
It is true that there are potential costs of using Gmail for
email storage […] The question is whether consumers
should have the right to make that choice and balance
the tradeoffs, or whether it will be preemptively denied to
them by privacy fundamentalists out to deny consumers that
choice. -- Declan McCullagh (2004)
Privacy attitudes vs. behavior…
• Attitudes: usage
– Privacy is the top reason for not going online (Harris [2001]); 78% would
increase Internet usage given more privacy (Harris [2001])
• Attitudes: shopping
– $18 billion in lost e-tail sales (Jupiter [2001]); privacy is the reason
61% of Internet users avoid e-commerce (P&AB [2001]);
73% would shop more online with a privacy guarantee
(Harris [2001])
• Behavior
– Anecdotal evidence: DNA for a Big Mac
– Experiments: Spiekermann, Grossklags, and Berendt
(2001): privacy “advocates” & cameras
– Everyday examples: Dot com deathbed
Explanations
• Syverson (2003)
– “Rational, after all” explanation
• Shostack (2003)
– “When it matters” explanation
• Vila, Greenstadt, and Molnar (2003)
– “Lemon market” explanation
• Are there other explanations?
– Acquisti and Grossklags (2003): privacy and
rationality
Personal information
is a very peculiar economic good…
• Subjective
• Ex-post
• Context-dependent
• Asymmetric
• Both private and public good aspects
– Lump sum vs. negative annuity
• Buy value vs. sell value vs. expected loss
• … privacy issues actually originate from two different
markets
– Market for personal information
– Market for privacy
Privacy and rationality
• Traditional economic view: forward-looking
agent, utility maximizer, Bayesian updater,
perfectly informed
– Both in theoretical works on privacy
– And in empirical studies
• Exceptions: PEW Survey 2000, Annenberg Survey 2003
Yet: privacy trade-offs
• Protect:
– Immediate costs or loss of immediate benefits
– Future (uncertain) benefits
• Do not protect:
– Immediate benefits
– Future (uncertain) costs
(sometimes, the reverse may be true)
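To make the comparison concrete, here is one possible formalization (an illustrative sketch, not the slides' formal model), where c_0 collects the immediate costs of protecting (including forgone immediate benefits), p_t is the probability of a privacy intrusion in future period t, L_t its magnitude, and \delta a per-period discount factor:

\text{protect} \iff c_0 \;\le\; \sum_{t \ge 1} \delta^{t}\, p_t\, L_t

Whether this inequality holds depends on how future losses are estimated and discounted, which is where the problems discussed next come in.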
Why is this problematic?
• Incomplete information
• Bounded rationality
• Psychological/behavioral distortions
Theory: Acquisti and Grossklags (WEIS 04); Acquisti (ACM EC 04)
Empirical approach: Acquisti and Grossklags (WEIS 04)
1. Incomplete information
• What information has the individual access to when
she takes privacy sensitive decisions?
– For instance, is she aware of privacy invasions and
associated probability and magnitude of risks?
– Is she aware of benefits she may miss by protecting her
personal data?
– What is her knowledge of the existence and characteristics
of protective technologies?
• Privacy:
– Asymmetric information
• Exacerbating factors: e.g., RFID, GPS
– Material and immaterial costs and benefits
– Uncertainty vs. risk, ex post evaluations
2. Bounded rationality
• Is the individual able to consider all the
parameters relevant to her choice?
– Or is she limited by bounded rationality?
– Herbert Simon’s “mental models” (or shortcuts)
• Privacy:
– Decisions must be based on several stochastic
assessments and intricate “anonymity sets”
– Inability to process all the stochastic information
related to risks and probabilities of events leading to
privacy costs and benefits
– E.g., HIPAA
– E.g., pervasive computing
3. Psychological/behavioral
distortions
• Privacy and deviations from rationality
– Optimism bias
– Complacency towards large risks
– Inability to deal with prolonged accumulation of small
risks
– Coherent arbitrariness
– “Hot/cold” theory
– Attitude (generic) vs. behavior (specific)
• Fishbein and Ajzen, Belief, Attitude, Intention, and Behavior: An
Introduction to Theory and Research, 1975
– Hyperbolic discounting, immediate gratification
Hyperbolic discounting and Privacy
• Hyperbolic discounting (Laibson 94; O’Donoghue
and Rabin 01)
• Procrastination, immediate gratification
• We do not discount future events in a time-consistent way
• Compare:
  – Privacy attitudes in surveys versus actual behavior
  – Privacy behavior when risks are more or less temporally close
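A standard way to formalize this, in line with the Laibson and O'Donoghue–Rabin references (a sketch, not necessarily the exact model in the papers cited), is quasi-hyperbolic ("beta-delta") discounting:

D(0) = 1, \qquad D(k) = \beta\,\delta^{k} \ \text{ for } k \ge 1, \qquad 0 < \beta < 1

With \beta < 1, every future period carries the extra penalty \beta relative to the present. At survey time both the gratification and the privacy loss lie in the future and are penalized equally, so protection looks worthwhile; at decision time the gratification is immediate and escapes the \beta penalty, so the same individual may disclose anyway. This is one way to read the attitude/behavior gap above.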
Hyperbolic discounting
[Figure: discounting at survey time vs. decision time; time consistency vs. time inconsistency]
The big picture
[Figure: costs vs. privacy protection; curves for the marginal costs of privacy protection, the expected costs of privacy intrusions, and the sum of costs]
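Read as an optimization (an interpretive sketch of the figure, not the authors' formal model), the individual picks a protection level x to minimize the sum of the two curves:

\min_{x} \; c(x) + E[I(x)], \qquad \text{interior optimum: } c'(x^{*}) = -\,\frac{d}{dx}\,E[I(x^{*})]

That is, protect up to the point where the marginal cost of additional protection equals the marginal reduction in expected intrusion costs; incomplete information and the distortions above shift where individuals perceive that point to be.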
Survey and experiment
• Phase One: pilot
• Phase Two: ~100 questions, 119 subjects from CMU
list.
– Paid, online survey (CMU Berkman Fund)
– Goals: check for
  • Privacy attitudes / behavior
  • Information, bounded rationality, and psychological distortions
• Phase Three: experiment
Clusters
• Multivariate clustering techniques (k-means; see the sketch below)
• Privacy attitudes
  – Privacy fundamentalists 26.1%; two medium groups (concerned about online 23.5% or
    offline 20.2% identity); low concerns 27.7%
• Self-reported behavior of privacy relevance
  – High degree of information revelation and risk exposure 64.7%; low revelation and
    exposure 35.3%
• Knowledge of privacy risks
  – Average knowledge of privacy threats 46.2%; high unawareness 31.9%; “aware” 21.9%
• Knowledge of privacy protection and security
  – Small group very knowledgeable 31.7%; larger group showing a blatant lack of
    awareness 68.3%
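A minimal sketch of the kind of k-means clustering referenced above, assuming responses have already been coded numerically; the random data, the 10-item layout, and the choice of k = 4 are hypothetical placeholders, not the study's actual variables:

# Illustrative k-means clustering of survey responses (hypothetical data).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Placeholder: 119 respondents x 10 Likert-scale attitude items (coded 1-5).
responses = rng.integers(1, 6, size=(119, 10)).astype(float)

scaled = StandardScaler().fit_transform(responses)        # put items on a comparable scale
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)  # k = 4, as in the four attitude groups
labels = kmeans.fit_predict(scaled)

# Share of respondents assigned to each cluster.
shares = np.bincount(labels) / len(labels)
print({f"cluster {i}": round(float(s), 3) for i, s in enumerate(shares)})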
Economic factors
(excerpts)
• Preliminary evidence of hyperbolic discounting,
bounded rationality, risk aversion
Indirect example of bounded
rationality (excerpts)
You have completed a credit card
purchase with an online merchant.
Besides you and the merchant
website, who has data about parts of
your transaction?
Nobody: 36.4%
Credit card company: 18.7%
Hackers: 15%
One respondent: “Nobody, assuming an SSL transaction, without
which I would not commit an online transaction
using my credit card”
Behavior
(excerpts)
• 74% adopted some strategy or technology or
otherwise took some particular action to protect
their privacy
– Encryption, PGP
– Do-not-call list
– Interrupt purchase
– Provide fake information
– […]
• However, when you look at details, percentages go
down…
– 8% encrypt emails regularly
– Similar results for shredders, do-not-call lists, caller-IDs, etc.
Conclusions
• Theory
  – Time inconsistencies may lead to under-protection and over-release of
    personal information. Genuinely privacy-concerned individuals may
    end up not protecting their privacy
• Evidence
  – Sophisticated attitudes, and (somewhat) sophisticated behavior
  – But also: evidence of overconfidence, incorrect assessment of own
    behavior, incomplete information about risks and protection, buy/sell
    dichotomy
• Implications
  – The full-rationality model is not appropriate to describe individual privacy
    behavior
  – Self-regulation alone, or reliance on technology and user responsibility
    alone, may not work