Reverse-Engineering Hidden Assumptions in
Differential-Linear Attacks
Kaisa Nyberg
Aalto University School of Science, Finland
14 January 2015
ESC Clervaux
Outline
Introduction
Part I: Assumptions in Differential-Linear Attacks
The Setting
Computing the DL Probability
Previous Models
Reverse Engineering
Part II: Link Between Boomerang and Differential-Linear Attack
Hidden Assumptions in DLA
2/20
Motivation

- Differential-linear (DL) cryptanalysis has been quite successful in attacking block ciphers, in particular Serpent.
- At FSE 2014 we (with Céline, Gregor) provided a thorough analysis of its foundations.
- We know by now that the DL attack is a special case of the truncated differential attack.
- We gave a complete expression of the probability of the DL relation in terms of the
    - differential probabilities over the first part of the cipher and
    - squared correlations over the latter part of the cipher.
- How do the previous ad hoc models relate to the theory?
Differential-Linear Cryptanalysis: The Setting

[Figure: the cipher split into E0 on top of E1; input differences U^⊥ = {δ, 0}, intermediate masks v ∈ V with half-spaces sp(v)^⊥, output masks W = {0, w}]

- E = E1 ∘ E0
- V is a subspace of the intermediate layer
- Strong truncated differential (δ, V^⊥) over E0 with probability
    p = Pr[δ →_{E0} V^⊥]
- Strong linear approximations (v, w) over E1, where v ∈ V, with correlations
    c_{v,w} = 2 Pr[v·y + w·E1(y) = 0] − 1
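As a concrete illustration (not part of the original slides), both quantities can be computed exhaustively for a toy 4-bit permutation. The PRESENT S-box plays the role of E0/E1 here, and the specific δ, v, w are my own illustrative choices.

```python
# Exhaustive computation of the two quantities defined on this slide,
# with a toy 4-bit permutation (the PRESENT S-box) standing in for E0/E1.
SBOX = [0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2]
N = 16

def parity(x):
    return bin(x).count("1") & 1

def trunc_dp(delta, v):
    """Pr[delta -> sp(v)^perp] over the S-box: fraction of inputs x with
    v . (S(x + delta) + S(x)) = 0 (all '+' are XOR)."""
    hits = sum(1 for x in range(N)
               if parity(v & (SBOX[x ^ delta] ^ SBOX[x])) == 0)
    return hits / N

def correlation(v, w):
    """c_{v,w} = 2 Pr[v.y + w.S(y) = 0] - 1."""
    hits = sum(1 for y in range(N)
               if parity(v & y) == parity(w & SBOX[y]))
    return 2 * hits / N - 1

delta, v, w = 0x1, 0x4, 0x8   # illustrative choices, not from the slides
p = trunc_dp(delta, v)
c = correlation(v, w)
print(p, c)
```

For a real cipher these quantities cannot be computed exhaustively; the point of the sketch is only to pin down the definitions.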
Differential and Linear Cryptanalysis: The Link
[Chabaud Vaudenay 94] [Blondeau Nyberg 13, 14]

[Figure: F mapping the input space U^⊥ (masks in U) to the output space W^⊥ (masks in W)]

- Multidimensional linear: capacity C_{U,W}
- Truncated differential:
    Pr[U^⊥ →_F W^⊥] = (1/|W|) (C_{U,W} + 1)
- Differential-linear:
    Pr[U^⊥ →_F W^⊥] = (1/|U^⊥|) Σ_{δ∈U^⊥} Pr[w·(F(x + δ) + F(x)) = 0 for all w ∈ W]
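The single-mask special case of the link (U = sp(u), W = sp(w)) is an exact identity for any function F, and can be checked exhaustively on a toy example. The sketch below (my own, using the PRESENT S-box as F) verifies, for all δ and w, that the DL correlation equals the signed sum of squared correlations:

```python
# Single-mask special case of the link [Chabaud Vaudenay 94]:
# for any F and all (delta, w),
#   2 Pr[w.(F(x+delta)+F(x)) = 0] - 1  =  sum_u (-1)^{u.delta} c_{u,w}^2.
SBOX = [0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2]
N = 16

def parity(x):
    return bin(x).count("1") & 1

def corr(u, w):
    # c_{u,w} as the mean of (-1)^(u.y + w.F(y))
    return sum((-1) ** (parity(u & y) ^ parity(w & SBOX[y]))
               for y in range(N)) / N

def dl_corr(delta, w):
    # 2 Pr[w.(F(x+delta)+F(x)) = 0] - 1
    return sum((-1) ** parity(w & (SBOX[x ^ delta] ^ SBOX[x]))
               for x in range(N)) / N

ok = all(
    abs(dl_corr(d, w)
        - sum((-1) ** parity(u & d) * corr(u, w) ** 2 for u in range(N))) < 1e-9
    for d in range(N) for w in range(N)
)
print(ok)
```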
DL Probability

Assumption: E0 and E1 are independent

Result: For all δ ∈ F_2^n \ {0} and w ∈ F_2^n \ {0}
    Pr[δ →_E sp(w)^⊥] = Σ_{v∈F_2^n} Pr[δ →_{E0} sp(v)^⊥] c_{v,w}^2 .

The ideas of the proof:
- Round-independence
- Splitting
    Pr[δ →_E sp(w)^⊥] = Σ_{Δ∈F_2^n} Pr[δ →_{E0} Δ] Pr[Δ →_{E1} sp(w)^⊥]
- Link between linear and differential cryptanalysis
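On a toy scale, the round-independence assumption can be modelled by inserting a uniform key XOR between E0 and E1 and averaging over the key; under that model the Result holds exactly. The sketch below (my own construction, with two toy 4-bit permutations) checks the equality to floating-point precision:

```python
# Check of
#   Pr[delta ->_E sp(w)^perp] = sum_v Pr[delta ->_E0 sp(v)^perp] c_{v,w}^2
# for E = E1 o XOR_k o E0, averaged over a uniform key k (the averaging
# plays the role of the round-independence assumption).
E0 = [0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2]
E1 = [E0[E0[x]] for x in range(16)]   # another toy permutation
N = 16

def parity(x):
    return bin(x).count("1") & 1

def corr(f, v, w):
    return sum((-1) ** (parity(v & y) ^ parity(w & f[y]))
               for y in range(N)) / N

delta, w = 0x3, 0x9   # illustrative choices
# Left-hand side: DL probability over E, averaged over the middle key k.
lhs = sum(
    1
    for k in range(N) for x in range(N)
    if parity(w & (E1[E0[x ^ delta] ^ k] ^ E1[E0[x] ^ k])) == 0
) / N ** 2
# Right-hand side: the Result on this slide.
rhs = sum(
    (sum(1 for x in range(N) if parity(v & (E0[x ^ delta] ^ E0[x])) == 0) / N)
    * corr(E1, v, w) ** 2
    for v in range(N)
)
print(abs(lhs - rhs) < 1e-9)
```

For a fixed cipher (no averaging) the equality holds only approximately, which is exactly why the independence assumption is needed.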
DL Bias

    E_{δ,w} = Pr[w·(E(x + δ) + E(x)) = 0] − 1/2
    ε_{δ,v} = Pr[v·(E0(x + δ) + E0(x)) = 0] − 1/2 = Pr[δ →_{E0} sp(v)^⊥] − 1/2

Result: For all δ ∈ F_2^n \ {0} and w ∈ F_2^n \ {0}
    E_{δ,w} = Σ_{v∈F_2^n} ε_{δ,v} c_{v,w}^2 .
The Classical Setting

[Figure: δ enters E0, producing the fixed difference Δ with probability p; the mask v on sp(v)^⊥ then propagates through E1 to the output mask w with correlation c_{v,w}]

- E = E1 ∘ E0
- One strong differential (δ, Δ) over E0:
    p = Pr[δ →_{E0} Δ]
- One strong linear approximation (v, w) over E1:
    c_{v,w} = 2 Pr[v·y + w·E1(y) = 0] − 1
Previous Models

[Langford and Hellman 1994] Assuming p = 1 and the Piling-up lemma:
    E_{δ,w} = ε_{δ,v} c_{v,w}^2

[Biham et al 2002] Problem: p < 1 is known, but ε_{δ,v} is not. They get
    ε_{δ,v} ≈ p/2, hence E_{δ,w} ≈ (p/2) c_{v,w}^2,
under Assumption 3 (Lu): for Ω = E0(x + δ) + E0(x) ≠ Δ the parities v·Ω are balanced.

[Lu 2012] Can avoid this assumption by considering the truncated differential over E0 with all Δ ∈ sp(v)^⊥ and determining an estimate for its probability P, to get
    ε_{δ,v} ≈ P − 1/2
Do these models make sense?

- Can they be derived from our result under some assumptions other than the Piling-up lemma?
- If yes, what are the assumptions?
- These considerations have relevance in practice: cryptanalysts should know which properties of the cipher remain to be validated in simulations.
Case Lu: One strong linear approximation

Assumption A: One strong linear approximation (v, w) over E1. All other linear approximations equally weak.

Denote ρ = c_{v,w}. Then
    c_{ν,w}^2 = (1 − ρ^2)/(2^n − 2),  ν ≠ v.

Consider all differences Δ ∈ sp(v)^⊥ and denote P = Pr[δ →_{E0} sp(v)^⊥]. Then

    Pr[δ →_E sp(w)^⊥] = Σ_{v∈F_2^n} Pr[δ →_{E0} sp(v)^⊥] c_{v,w}^2
                      = P ρ^2 + Σ_{ν≠v} Pr[δ →_{E0} sp(ν)^⊥] (1 − ρ^2)/(2^n − 2)
                      = (P − 1/2) ρ^2 + 1/2,

where we have estimated
    (1/(2^n − 2)) Σ_{ν≠v} Pr[δ →_{E0} sp(ν)^⊥] = 1/2.
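Two ingredients of this computation can be checked mechanically: Parseval's relation, which forces the average value (1 − ρ^2)/(2^n − 2) used for the remaining squared correlations, and the final algebraic simplification. A small sketch (my own; the PRESENT S-box stands in for E1 and the mask w is an arbitrary choice):

```python
# Two mechanical checks behind the Case Lu computation.
# 1) Parseval: sum_{nu != 0} c_{nu,w}^2 = 1 for a permutation and w != 0,
#    so the c_{nu,w}^2 with nu != v average to (1 - rho^2)/(2^n - 2).
# 2) Algebra: P*rho^2 + (1 - rho^2)/2 = (P - 1/2)*rho^2 + 1/2.
SBOX = [0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2]
N = 16

def parity(x):
    return bin(x).count("1") & 1

def corr(v, w):
    return sum((-1) ** (parity(v & y) ^ parity(w & SBOX[y]))
               for y in range(N)) / N

w = 0x8                                # arbitrary nonzero output mask
parseval = sum(corr(nu, w) ** 2 for nu in range(1, N))
algebra = all(
    abs(P * r ** 2 + (1 - r ** 2) / 2 - ((P - 0.5) * r ** 2 + 0.5)) < 1e-12
    for P in (0.0, 0.3, 0.5, 1.0) for r in (-1.0, -0.25, 0.0, 0.5)
)
print(abs(parseval - 1.0) < 1e-9, algebra)
```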
Case Biham et al.: One strong linear approximation and one differential

Assumption A: One strong linear approximation (v, w) over E1. All other linear approximations equally weak.

Denote ρ = c_{v,w}. Then
    c_{ν,w}^2 = (1 − ρ^2)/(2^n − 2),  ν ≠ v.

Assumption B: One strong differential δ → Δ over E0. Differential probabilities for all other output differences equally small.

Denote p = Pr[δ →_{E0} Δ]. Then Pr[δ →_{E0} Ω] = (1 − p)/(2^n − 2) for Ω ≠ Δ, and

    P = Σ_{Ω∈sp(v)^⊥} Pr[δ →_{E0} Ω] = p + ((1 − p)/(2^n − 2)) (2^{n−1} − 2) ≈ p/2 + 1/2.

By Case Lu we obtain the estimate (p/2) ρ^2 of the DL bias by Biham et al.
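The approximation in the last step is tight already for moderate n, which can be checked numerically. The block below uses illustrative values of n, p and ρ (my own choices, not from the slides):

```python
# Numeric check of the approximation
#   P = p + (1 - p)/(2^n - 2) * (2^(n-1) - 2)  ≈  p/2 + 1/2,
# and of the resulting DL bias estimate (P - 1/2)*rho^2 ≈ (p/2)*rho^2.
n = 32                       # illustrative block size
p = 2.0 ** -10               # illustrative differential probability
rho = 2.0 ** -5              # illustrative correlation
P = p + (1 - p) / (2 ** n - 2) * (2 ** (n - 1) - 2)
bias_case_lu = (P - 0.5) * rho ** 2    # Case Lu formula
bias_biham = (p / 2) * rho ** 2        # Biham et al. estimate
print(P, bias_case_lu, bias_biham)
```

The exact gap between P and p/2 + 1/2 is (1 − p)/(2(2^{n−1} − 1)), i.e. roughly 2^{−n}, which is negligible next to p for any useful differential.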
Summary

- Assumption 2 (Lu) obsolete
- Assumption 3 (Lu) obsolete

Lu claimed to need only independence of the cipher parts, while he needs more. The following will be sufficient:
- Assumption A: one strong linear approximation; all other linear approximations are equally weak.

For Biham et al. the following are sufficient:
- Assumption A: as above
- Assumption B: one strong differential; other differentials with output differences Ω, v·Ω = 0, have equally small probabilities.
A Link Between Boomerang and Differential-Linear Attacks
DL Statistical Models

Truncated Differential Model
    Evaluate Pr[δ →_E sp(w)^⊥]

Difference Distribution Model
    Evaluate the capacity (aka SEI, aka energy) of the probability distribution of the output differences Γ:
    C_diff(δ) = 2^n Σ_Γ Pr[δ →_E Γ]^2 − 1

By the well-known lemma (see Willi's slides)
    2^n Σ_Γ Pr[δ →_E Γ]^2 = Σ_w (2 Pr[δ →_E sp(w)^⊥] − 1)^2 = Σ_w (2 E_{δ,w})^2
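The lemma is an exact Parseval-type identity and can be verified exhaustively on a toy example. The sketch below (my own; PRESENT S-box as E, arbitrary δ) checks both sides:

```python
# Check of the well-known lemma on this slide for a toy 4-bit map:
#   2^n * sum_Gamma Pr[delta -> Gamma]^2 = sum_w (2 E_{delta,w})^2,
# where 2 E_{delta,w} = 2 Pr[w.(E(x+delta)+E(x)) = 0] - 1.
from collections import Counter

SBOX = [0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2]
N = 16

def parity(x):
    return bin(x).count("1") & 1

delta = 0x5                      # illustrative input difference
# Distribution of the output differences Gamma
diffs = Counter(SBOX[x ^ delta] ^ SBOX[x] for x in range(N))
lhs = N * sum((cnt / N) ** 2 for cnt in diffs.values())
rhs = sum(
    (2 * (sum(1 for x in range(N)
              if parity(w & (SBOX[x ^ delta] ^ SBOX[x])) == 0) / N) - 1) ** 2
    for w in range(N)
)
print(abs(lhs - rhs) < 1e-9)
```

The w = 0 term contributes exactly 1, so the capacity C_diff(δ) equals the sum over the nonzero masks w only.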
Boomerang Attack

Δ and ∇ fixed. For x1, denote
    x2 = x1 + Δ
    x1′ = E^{−1}(E(x1) + ∇)
    x2′ = E^{−1}(E(x2) + ∇).

This setting describes a boomerang with ∇ over the cipher E.

Probability of return of the boomerang:
    Pr_{x1}[x1′ + x2′ = Δ].

It holds:
    Σ_Δ Pr_{x1}[x1′ + x2′ = Δ] = 2^n Σ_Δ Pr_y[∇ →_{E^{−1}} Δ]^2 .
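This identity is exact for any permutation E and any ∇, and can be checked by brute force on a toy cipher. A sketch (my own; PRESENT S-box as E, arbitrary ∇, with the sum taken over all Δ including Δ = 0):

```python
# Check of the identity at the bottom of this slide on a toy 4-bit cipher:
#   sum_Delta Pr_{x1}[x1' + x2' = Delta]
#     = 2^n * sum_Delta Pr[nabla ->_{E^-1} Delta]^2.
E = [0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2]
N = 16
Einv = [0] * N
for x in range(N):
    Einv[E[x]] = x

nabla = 0x6                      # illustrative choice

def return_prob(delta):
    """Pr over x1 that the boomerang with (delta, nabla) returns."""
    hits = 0
    for x1 in range(N):
        x2 = x1 ^ delta
        x1p = Einv[E[x1] ^ nabla]
        x2p = Einv[E[x2] ^ nabla]
        hits += (x1p ^ x2p == delta)
    return hits / N

lhs = sum(return_prob(d) for d in range(N))
rhs = N * sum(
    (sum(1 for y in range(N) if Einv[y ^ nabla] ^ Einv[y] == d) / N) ** 2
    for d in range(N)
)
print(abs(lhs - rhs) < 1e-9)
```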
The Link

The sum of the probabilities of return of the boomerang over E through ∇, taken over the initiating differences Δ

= 1 + the capacity C_diff(∇) of the difference distribution over E^{−1} with input difference ∇

= the sum, taken over all w, of the squared correlations (2 E_{∇,w})^2 of the DL approximations over E^{−1} with input difference ∇