How to (not) Analyze Cryptographic Protocols using Game Theory

Jesper Buus Nielsen
Main Points
• Idealizing crypto: replace real-life crypto tools by formal objects like term algebras or oracles to make analysis of a protocol easier
• Common in cryptography
  – Known to be sound in the usual crazy-versus-stupid models
• Researchers have been idealizing crypto tools for the sake of game-theoretic analysis too
  – That is typically not sound
Terminology: Computational Solution Concept
• Takes computational feasibility into account
  – Examples: only allow polynomial-time computable strategies, price computation via the utility function, discounting, …
• Allows the use of (imperfect) cryptography
  – Example: when your opponent uses encryption, the deviation which makes one guess at his secret key, and uses the key to break the protocol if the guess is correct, gives you a small advantage; so go for ε-NE for negligibly small ε to allow stability (see the sketch below)
  – Example: utility of key-guessing smaller than the price of the computation, or discounted away
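As a back-of-the-envelope sketch (the key length and utility numbers below are made up for illustration, not taken from the slides), the expected gain of the key-guessing deviation is the guessing probability times the extra payoff, which is negligible in the key length:

```python
# Expected advantage of the "guess the secret key" deviation, assuming a
# k-bit key guessed uniformly at random. The utilities are illustrative only.
k = 128            # key length (illustrative)
u_honest = 1.0     # utility of following the protocol (made-up number)
u_break = 100.0    # utility of breaking the protocol on a correct guess (made-up)

advantage = (u_break - u_honest) / 2**k
print(advantage)   # ~2.9e-37: far below any reasonable epsilon
```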
Terminology: Game-Theoretic Solution Concept
• A solution concept which allows arbitrary strategies
Idealizing Crypto
• (Very simple) idealized signatures, sketched in code below:
  – The world has a global signing oracle O which all parties have access to
  – Sign: a party Pi can send sign(m) to O, which stores (i, (i, m)) [read: Pi has a signature on m from Pi]
  – Transfer: if Pk inputs trans((i,m),n) to O and (k, (i, m)) is stored in O, then O stores (n, (i, m))
  – Verify: if Pk inputs verify(i,m) to O and (k, (i, m)) is stored in O, then O outputs accept, otherwise reject
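A minimal Python sketch of the signing oracle O described above; the class and method names are illustrative, not from the slides:

```python
# Idealized signature oracle O: a global store of entries (holder, (signer, m)),
# where (k, (i, m)) is read as "Pk holds a signature on m from Pi".
class SigningOracle:
    def __init__(self):
        self.store = set()

    def sign(self, i, m):
        # Sign: Pi sends sign(m); O stores (i, (i, m)).
        self.store.add((i, (i, m)))

    def transfer(self, k, i, m, n):
        # Transfer: if Pk holds a signature on m from Pi, then Pn holds it too.
        if (k, (i, m)) in self.store:
            self.store.add((n, (i, m)))

    def verify(self, k, i, m):
        # Verify: accept iff Pk holds a signature on m from Pi.
        return "accept" if (k, (i, m)) in self.store else "reject"

# Example: P1 signs m, transfers it to P3, and P3 verifies.
O = SigningOracle()
O.sign(1, "m")
O.transfer(1, 1, "m", 3)
print(O.verify(3, 1, "m"))   # -> accept
```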
• Possible to show that any cryptographic protocol which is secure when using these idealized signatures is equally secure when they are replaced by real signatures
  – Up to negligible ε
  – PKI + unforgeable signatures + UC framework
Why? (1/3)
• A possible solution heuristic:
  – Idealize the crypto tools in a protocol and then apply your favorite GT solution concept to the idealized protocol
  – Since the idealized protocol does not rely on computational crypto tools, it is free of the deviations with negligibly small advantage which disturb most known GT solution concepts
• Implicit assumption: this guarantees that there are no problems besides key-guessing-like deviations
Why? (2/3)
• Might guide the development of computational solution concepts:
  – Given a GT solution concept X, try to develop a computational version CX
  – Then check if CX produces solutions similar to the solutions X produces for the idealized protocol
• Assumption: the computational version should behave like the pure GT notion
Why? (3/3)
• Modular analysis of complex protocols
• Given a protocol using both signatures and encryption:
  – First idealize both primitives and give a (hopefully simple) analysis of the idealized protocol
  – Show that plugging in real signatures preserves solutions
  – Show that plugging in real encryption preserves solutions
  – Conclude that the real protocol has the same solutions as the ideal protocol
Hope!
• Often a cryptographic analysis (honest parties versus corrupted parties) of an idealized protocol can be proven to give sound conclusions about the real-life protocol
  – Signatures
  – Encryption
  – Zero-knowledge proof of knowledge
  – Zero-knowledge proof of correctness
Claims
• The solution heuristic is likely to give wrong conclusions
• Comparison to idealization is not a good sanity check for computational solution concepts
• Computational solution concepts must be developed cautiously and have their own computational epistemologies
• After developing good computational solution concepts, idealization is possible as a tool for modular analysis
“Proof by Example”
• Will try to argue my point by “solving” a small game in three different settings
• Will see that we get dramatically different solutions depending on whether we idealize crypto or not
• And the solution called by the idealized analysis is arguably the wrong one
Overview
[Diagram: 1: P1 sends a signal; 2: P2 picks the good choice g and the bad choice b; 3: communication; 4: P1 plays g1 and P3 plays (g3,b3).]
A Few Pennies
• Good & bad: P2 plays g ∈ {1,2,3} and b ∈ {1,2,3}\{g}
• Guess: P1 plays g1 ∈ {1,2,3}
• Guess: P3 plays g3 ∈ {1,2,3,a} and b3 ∈ {1,2,3}
• Abstain: if P3 plays a, all parties get utility 0
• Avoid bad: if g1=b or g3=b, then P1 and P3 die and P2 wins the world
• Know bad: same if P3 does not abstain and b3≠b
• Coordinate: if g1,g3 ∈ {1,2,3}\{b} and b3=b, then P1 and P3 get a positive utility from g1=g3 but P2 prefers g1≠g3
  – P1 has negative utility on g1≠g3 but P3 does not, though he prefers g1=g3
  – And P1 prefers to match on g
Played in a Network
• Before P2 specifies (g,b):
  – P1 can send a signal to P3
    • Also seen by P2
• Then P1 learns (g,b) but P3 does not
• After P2 specifies (g,b):
  – P1 can send a message to P2
    • Not seen by P3
  – P2 and P3 can communicate with each other
    • Not seen by P1
Recap
[Diagram: 1: P1 sends a signal; 2: P2 picks (g,b); 3: communication; 4: P1 plays g1 and P3 plays (g3,b3).]
• Abstain (g3=a): u1=u2=u3=0
• Avoid (g1=b or g3=b): u1=u3=−∞, u2=+∞
• Know (g3≠a, b3≠b): u1=u3=−∞, u2=+∞
• Otherwise (see the sketch below):
  – g1≠g3: (u1,u2,u3) = (−2, 3, 0)
  – g1=g3=g: (u1,u2,u3) = (1, 1, 1)
  – g1=g3≠g: (u1,u2,u3) = (0, 2, 1)
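The case split above can be written as a small utility function. A minimal Python sketch; the constant BIG stands in for the unquantified “die” / “wins the world” payoffs:

```python
# Utility profile (u1, u2, u3) for the "A Few Pennies" game, following the
# case split on the Recap slide. BIG stands in for the "dies" / "wins the
# world" payoffs, which the slides do not quantify. The slides also leave the
# precedence between Abstain and Avoid implicit; Abstain is checked first here.
BIG = 10**6

def utilities(g, b, g1, g3, b3):
    if g3 == 'a':                     # Abstain: everybody gets 0
        return (0, 0, 0)
    if g1 == b or g3 == b:            # Avoid bad: P1 and P3 die, P2 wins
        return (-BIG, BIG, -BIG)
    if b3 != b:                       # Know bad: same punishment
        return (-BIG, BIG, -BIG)
    if g1 != g3:                      # Mis-coordination
        return (-2, 3, 0)
    if g1 == g3 == g:                 # Coordination on the good choice
        return (1, 1, 1)
    return (0, 2, 1)                  # Coordination, but not on g

# Example: P2 plays (g,b) = (1,2); P1 and P3 both guess g and P3 names b.
print(utilities(1, 2, 1, 1, 2))       # -> (1, 1, 1)
```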
• Will draw conclusions from this game by informally solving it using “common knowledge of rationality” in the following settings:
  1. Arbitrary strategies
  2. Idealized signatures
  3. Poly-time strategies
• Common knowledge of rationality + arbitrary deviations ⇒ always abstain
• If g3≠a in some NE (with positive probability) given some (signal, b), then P2 gains by shifting to the strategy where it picks b=g when it sees the signal and then shows P3 communication with the distribution it would have seen if P2 had played according to the NE
• Common knowledge of rationality + idealized signatures ⇒ never abstain
“Rationalizable” (sketched below):
• P1: signal = verification key vk of P1
• P2: pick (g,b) uniformly at random
• P1: send s = sig_sk(g,b) to P2
• P2: send (g,b) and s to P3 if received, otherwise nothing
• P3: if ver_vk((g,b),s) = accept, play g3=g and b3=b, otherwise g3=a
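A minimal Python sketch of this profile under the idealized oracle (the set-based store and party indices are illustrative); the point is that the oracle only stores and transfers whole signed messages, so P3 always learns g together with b and never abstains:

```python
# The "rationalizable" profile with idealized signatures, as a straight-line
# simulation: P1 signs (g,b) via the oracle, P2 forwards, P3 verifies.
import random

store = set()                                  # the idealized oracle's store

g = random.choice([1, 2, 3])                   # P2: pick (g,b) uniformly
b = random.choice([x for x in (1, 2, 3) if x != g])

store.add((1, (1, (g, b))))                    # P1: sign (g,b)  -> P1 holds it
store.add((2, (1, (g, b))))                    # P1: transfer the signature to P2
store.add((3, (1, (g, b))))                    # P2: transfer (g,b) and s to P3

# Note: only the whole signed message (g,b) can be transferred, never "just b".
if (3, (1, (g, b))) in store:                  # P3: verify -> accept
    g3, b3 = g, b                              # play g3=g, b3=b
else:
    g3, b3 = 'a', None                         # otherwise abstain

print(g3 == g and b3 == b)                     # -> True: P3 never abstains
```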
• Common knowledge of rationality + real signatures ⇒ always abstain
The “rationalizable” profile again, now with real signatures:
• P1: signal = verification key vk of P1
• P2: pick (g,b) uniformly at random
• P1: send s = sig_sk(g,b) to P2
• P2: send (g,b) and s to P3 if received, otherwise nothing
• P3: if ver_vk((g,b),s) = accept, play g3=g and b3=b, otherwise g3=a
But now:
• P2 can use s to prove to P3 that P1 signed a value of the form (·,b), using, e.g., a zero-knowledge proof
• Hence it is rational for P2 to give any verifiable information on b away
• When P3 knows b but not g, it should play “matching pennies” with P1, using a random g3, which gives P2 a higher payoff but gives P1 a negative payoff
• Hence P3 will abstain
What went Wrong?
• Idealization of signatures has been proven sound in cryptography, so what went wrong?
• P2 can prove to P3 that P1 sent b while hiding g, and thus renegotiate P3 into a strategy which is an advantage for P2
• Cryptography has a centralized adversary who controls and coordinates all corrupted parties, hence the use of cryptography “internal to the deviation” does not give extra power to the adversary compared to the idealized case
Conclusion 1
• The heuristic solution concept can easily give “very” wrong solutions
  – A three-party setting with simultaneous mutual conflict and mutual advantage of cooperation, like the one used, can arise in many settings and might even be subtly hidden
• Seems hard to judge whether a protocol can be soundly analyzed using the heuristic, so better just abstain from doing it
Conclusion 2
• It does not seem to be a way out to make more involved idealizations which, e.g., allow “splitting” of signatures as we did in the example
  – The idealization would probably end up being more complicated than the real-life tool
  – The idealization would have to be head on: allow all possible uses and misuses and nothing else, to hope for soundness
Conclusion 3
• Comparison to how GT solution concepts behave on idealized protocols is not a good sanity check for proposed computational solution concepts
  – In our case the computational notion should give exactly the other solution
Conclusion 4
• There does not seem to be a way around cautiously developing computational solution concepts and trying to give epistemic models based on bounded rationality
The Good News
• Modular analysis via idealization is possible for Computational Nash Equilibrium (CNE)
  – Only reasons via single-agent deviations
  – Hence crypto cannot be used to facilitate deviations
• In [Peter Bro Miltersen, Jesper Buus Nielsen, Nikos Triandopoulos: Privacy-Enhancing Auctions Using Rational Cryptography. CRYPTO 2009] we show a cryptographic auction protocol to be a CNE via a sound idealization of the crypto and a game-theoretic analysis of the idealized protocol
Setting
• The goal in [MNT09] was to give a game-theoretic analysis of a protocol which n parties can run among themselves on the Internet to emulate a trusted mediator
  – They should end up having signed contracts from all other parties on their outcomes, to avoid disputes after the game is over
  – The parties are allowed to have privacy concerns, e.g., to prefer keeping their type secret over leaking it
Analytic Technique
• We use a notion of protocol game, which allows us to model both a trusted mediator and the Internet in a unified manner
• We then relate the properties of the real-life protocol to the mediated case and conclude that the real-life protocol is as stable as the mediated case and gives the same utility profile
  – This implies that it leaks no more information, as the utility associated with information loss/collection is captured in the utility functions
Protocol Games
[Diagram: parties 1,…,n with types t1,…,tn interact via a communication device C; party i obtains outcome oi and local information Li.]
• Fiscal utility: fi(t,o)
• Information utility: Ii(t,L)
• Utility: ui(t,o,L) = fi(t,o) + Ii(t,L)
Mediation
[Diagram: party i with type ti sends a bid bi to a mediator M; (o1,…,on) = M(b1,…,bn); party i obtains outcome oi and local information Li.]
• Fiscal utility: fi(t,o)
• Information utility: Ii(t,L)
• Utility: ui(t,o,L) = fi(t,o) + Ii(t,L)
Internet Contract Games
[Diagram: parties 1,…,n with types t1,…,tn interact via a device which plays the CA, setting up a PKI; it allows communication between the parties and calls outcome oi if Pi returns a signature on oi from all parties; party i obtains outcome oi and local information Li.]
• Fiscal utility: fi(t,o)
• Information utility: Ii(t,L)
• Utility: ui(t,o,L) = fi(t,o) + Ii(t,L)
Important Design Choices
• The same type profile T makes sense in all settings
• The outcome is called by the device as a last round of outputs, so it is well-defined in all settings
• Local information is output by the parties, so it is well-defined in all settings
• So, the same u = f + I makes sense in all settings
• We can keep types and utilities fixed and relate different strategies in different settings
  – We can talk about whether it is better to play some given strategy in the real-life setting than it is to play some other strategy in the ideal setting
Nash Implementation
• Fix T and f = (f1,…,fn)
• We say that (C,σ) is a t-resilient privacy-enhanced Nash implementation of (D,τ), written (C,σ) ⊒_{t,T,r} (D,τ), if for all admissible I and u = f + I it holds that (restated compactly below):
• No less utility: for all Pi: ui(T,C,σ) ≥ ui(T,D,τ) − ε
• No more incentive to deviate: for all C ⊆ {1,…,n} with |C| ≤ t and all σ*_C there exists τ*_C such that ui(T,D,(τ*_C, τ_−C)) ≥ ui(T,C,(σ*_C, σ_−C)) − ε for all i ∈ C
The Result in the Paper
• We construct, for each mechanism M, a contract game for the Internet which is an (n−1)-resilient privacy-enhanced Nash implementation of the ideally mediated setting for M, provided all parties have ex interim strict rationality
Property 1 of Nash Implementation
• If (D,τ) is an ε-NE (tolerating collusions of size t) and (C,σ) ⊒_{t,T,r} (D,τ), then (C,σ) is an ε-NE (tolerating collusions of size t)
  – This allows us to lift analysis from an ideal setting to a real-life setting
• So, any ε-NE for the mediated setting (with ex interim strict rationality) is also an ε-NE in the Internet contract game
Property 2 of Nash Implementation
• If (C,σ) ⊒_{t,T,r} (D,τ) and (D,τ) ⊒_{t,T,r} (E,ρ), then (C,σ) ⊒_{t,T,r} (E,ρ)
• This allows a modular analysis going from the mediated setting to the Internet setting via gradually more refined settings (introducing, e.g., one crypto primitive at a time)
…
• The notion of Nash implementation is a trivial adaptation of the notion of NE from intra-game analysis to inter-game analysis
• Yet it allows us to do modular analysis with much the same flavor as modular analysis in crypto via idealization
• There is justified hope that other good computational solution concepts will allow similar lifting to inter-game analysis and hence allow modular analysis
• We just need some good computational solution concepts…