
Linearizable Implementations Do
Not Suffice For Randomized
Distributed Computation
Wojciech Golab, Lisa Higham, Philipp Woelfel
Erez Druk
Seminar in Distributed Algorithms, 10/6/13
Overview

 Motivation
 Preliminaries
 Main Theorem
 Good to Know
 Conclusion
 Questions
Motivation

Linearizability
 Simplicity
 Correctness
 Intuitive
 Local & Composable
 Useful
Motivation

Randomization
 Simplicity
 Achieves the impossible
 Natural
 Cool
The Big Question

Can we combine the two?
NO!
An Example
ℛ is a register initialized to 1
𝑤: ℛ.𝑊𝑟𝑖𝑡𝑒(2); 𝑐 = 𝑢𝑛𝑖𝑓𝑜𝑟𝑚{0,2}; ℛ.𝑊𝑟𝑖𝑡𝑒(𝑐)
𝑝: ℛ.𝑅𝑒𝑎𝑑()

An adversary tries to minimize the value that 𝑝 reads
An Example

When ℛ is atomic

[Figure: the register's value over time is 1, then 2 after ℛ.𝑊𝑟𝑖𝑡𝑒(2), then 0 or 2 after ℛ.𝑊𝑟𝑖𝑡𝑒(𝑐); the adversary chooses where 𝑝's 𝑅𝑒𝑎𝑑 falls.]

The expected value of 𝑝's 𝑅𝑒𝑎𝑑 is at least 1
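As a sanity check on this claim, here is a small Python sketch (not from the paper; the position indices and helper names are illustrative) that enumerates the adaptive adversary's options in the atomic game: commit to a read position before the coin flip, or wait for the coin and then place the 𝑅𝑒𝑎𝑑 before or after ℛ.𝑊𝑟𝑖𝑡𝑒(𝑐).

```python
from fractions import Fraction
from itertools import product

COINS = (0, 2)  # c = uniform{0, 2}

def value_at(position, c):
    # Register contents as w executes: 1 --Write(2)--> 2 --Write(c)--> c
    return [1, 2, c][position]

candidates = []
# Early strategies: commit to reading at position 0 or 1, before the coin flip.
for pos in (0, 1):
    candidates.append(Fraction(sum(value_at(pos, c) for c in COINS), len(COINS)))
# Late strategies: after seeing c, read at position 1 (before Write(c)) or 2 (after).
for choice in product((1, 2), repeat=len(COINS)):
    candidates.append(Fraction(sum(value_at(choice[i], c)
                                   for i, c in enumerate(COINS)), len(COINS)))

print(min(candidates))  # 1 -- no adaptive schedule pushes the expectation below 1
```

The minimum is attained by waiting for the coin and reading after ℛ.𝑊𝑟𝑖𝑡𝑒(𝑐) when 𝑐 = 0, which still gives an expected value of 1.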
An Example

When ℛ is implemented via atomic bits [Vid88]

[Figure: ℛ is built from a row of atomic bits; 𝑊𝑟𝑖𝑡𝑒(𝑣) and 𝑅𝑒𝑎𝑑() access the bits one at a time.]
An Example

When ℛ is implemented via atomic bits [Vid88]

[Figure: the bits change underneath 𝑝's 𝑅𝑒𝑎𝑑: 𝑝 starts reading, 𝑤's writes and the coin flip are interleaved with the bit reads, and by the time 𝑝 finishes reading it returns 0 or 1.]

The expected value of 𝑝's 𝑅𝑒𝑎𝑑 is 1/2
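One reading of the schedule behind this claim (an interpretation of the figure, not spelled out on the slide): the adversary lets 𝑝 read past the bit that would encode the value 2 while that bit is still 0, then runs 𝑤's ℛ.𝑊𝑟𝑖𝑡𝑒(2) and the coin flip; if 𝑐 = 0 it lets ℛ.𝑊𝑟𝑖𝑡𝑒(0) complete before 𝑝's remaining bit reads, so 𝑝 returns 0, and if 𝑐 = 2 the new value lands only in a bit 𝑝 has already passed, so 𝑝 returns the stale 1. Under that schedule,

```latex
\mathbb{E}[\mathit{Read}] \;=\; \tfrac{1}{2}\cdot 0 \;+\; \tfrac{1}{2}\cdot 1 \;=\; \tfrac{1}{2}.
```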
Another Example
𝑝, 𝑟, 𝑞 have access to a shared snapshot object
𝑝: 𝑆𝑐𝑎𝑛𝑝()
𝑟: 𝑈𝑝𝑑𝑎𝑡𝑒𝑟(2); 𝑈𝑝𝑑𝑎𝑡𝑒𝑟(0)
𝑞: 𝑈𝑝𝑑𝑎𝑡𝑒𝑞(6); 𝑐 = 𝑢𝑛𝑖𝑓𝑜𝑟𝑚{−1,1}; 𝑈𝑝𝑑𝑎𝑡𝑒𝑞(8𝑐)

An adversary tries to minimize the sum of the values that 𝑝 observes
Another Example

Atomic

[Figure: the snapshot's contents (𝑟's value, 𝑞's value) over time: (0,0), then (2,0), (0,0), (0,6), and finally (0,−8) or (0,8); the adversary chooses where 𝑝's 𝑆𝑐𝑎𝑛() falls.]

The expected value of 𝑝's 𝑆𝑐𝑎𝑛 is −1
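One schedule consistent with the figure's sequence of states (my reading): run 𝑟 to completion, run 𝑈𝑝𝑑𝑎𝑡𝑒𝑞(6), and only then observe 𝑐; if 𝑐 = −1, place the 𝑆𝑐𝑎𝑛 after 𝑈𝑝𝑑𝑎𝑡𝑒𝑞(−8) (sum −8), otherwise place it immediately (sum 6). This gives

```latex
\mathbb{E}[\mathit{Scan}] \;=\; \tfrac{1}{2}(-8) \;+\; \tfrac{1}{2}(6) \;=\; -1,
```

and placing the 𝑆𝑐𝑎𝑛 before the coin flip can only yield a non-negative expectation, so with an atomic snapshot the adversary appears unable to do better.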
Another Example

Linearizable implementation [AADGMS93]

𝑈𝑝𝑑𝑎𝑡𝑒(𝑣): take a 𝑆𝑐𝑎𝑛, then write 𝑣 to your component 𝑥𝑖 together with the whole snapshot
𝑆𝑐𝑎𝑛(): repeatedly collect 𝑥1, 𝑥2, …, 𝑥𝑛; wait for two identical collects, or borrow the snapshot from a process which updated twice
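A rough, sequential Python sketch of the construction described above (hedged: mem, collect, and the tuple layout are illustrative names, not the paper's pseudocode, and real concurrency would require each slot to be an atomic single-writer register):

```python
N = 3                                     # processes p, r, q in the example
# Each slot holds (value, sequence number, embedded snapshot).
mem = [(0, 0, [0] * N) for _ in range(N)]

def collect():
    return list(mem)                      # read all N registers, one at a time

def scan():
    changes = [0] * N                     # how often each slot was seen to change
    old = collect()
    while True:
        new = collect()
        if new == old:                    # two identical collects: a clean snapshot
            return [value for (value, _seq, _view) in new]
        for i in range(N):
            if new[i][1] != old[i][1]:
                changes[i] += 1
                if changes[i] >= 2:          # i updated twice during our scan:
                    return list(new[i][2])   # borrow its embedded snapshot
        old = new

def update(i, v):
    view = scan()                         # embedded scan ...
    _value, seq, _old_view = mem[i]
    mem[i] = (v, seq + 1, view)           # ... written together with the new value
```

In this single-threaded toy the borrow branch is never exercised; it only matters when, as in the execution on the next slide, one process's collects interleave with two of another process's updates.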
Another Example

Linearizable implementation [AADGMS93]

[Figure: an execution in which 𝑝's 𝑆𝑐𝑎𝑛() performs several 𝐶𝑜𝑙𝑙𝑒𝑐𝑡s while 𝑟 runs 𝑈𝑝𝑑𝑎𝑡𝑒(2) and 𝑈𝑝𝑑𝑎𝑡𝑒(0) (each an embedded 𝑆𝑐𝑎𝑛() followed by a 𝑊𝑟𝑖𝑡𝑒) and 𝑞 runs 𝑈𝑝𝑑𝑎𝑡𝑒(6), the coin flip, and 𝑈𝑝𝑑𝑎𝑡𝑒(8𝑐).]

The expected value of 𝑝's 𝑆𝑐𝑎𝑛 is −2
Some definitions

 A coin flip vector 𝑐 is an infinite binary vector
 An adversary 𝒜 is a function from past coin flips to the set of processes: it chooses which process takes the next step
 A history 𝐻 is a sequence of steps of processes
 A history 𝐻 is sequential if its operations do not overlap
 A history 𝐻 is linearizable if there is a sequential history that agrees with it
 𝑐𝑙𝑜𝑠𝑒(ℋ) denotes the prefix-closure of a set of histories ℋ
 𝐻ℳ,𝒜,𝑐 is the history formed by running algorithm ℳ with adversary 𝒜 and coin flip vector 𝑐
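The last definition can be pictured with a toy scheduler (illustrative only: run, Step, and the "flip" convention are my own names, not the paper's formalism):

```python
from typing import Callable, Iterator, List, Tuple

Step = Tuple[int, str]                    # (process id, description of the step)

def run(processes: List[Iterator[str]],
        adversary: Callable[[List[int]], int],
        coins: Iterator[int],
        steps: int) -> List[Step]:
    """Build the history H_{M,A,c}: at each point the adversary, seeing only the
    coin flips revealed so far, names the process that takes the next step."""
    history: List[Step] = []
    flips: List[int] = []
    for _ in range(steps):
        p = adversary(flips)              # adversary: past coin flips -> a process
        op = next(processes[p], None)     # that process performs its next step
        if op is None:
            continue                      # this process has already finished
        if op == "flip":                  # a randomized step consumes one coin
            flips.append(next(coins))
            op = f"flip={flips[-1]}"
        history.append((p, op))
    return history
```

For instance, passing iter(["Write(2)", "flip", "Write(c)"]) and iter(["Read()"]) as the two processes, together with any coin iterator, replays the register game from the earlier example under whatever adversary is supplied.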
Some definitions

 (ℳ, 𝒜) and (ℳ′, 𝒜′) are equivalent if for any 𝑐 there exists a sequential history that is a linearization of both 𝐻ℳ,𝒜,𝑐 and 𝐻ℳ′,𝒜′,𝑐
 For convenience we denote
  𝐻𝑐 = 𝐻ℳ,𝒜,𝑐
  𝐻′𝑐 = 𝐻ℳ′,𝒜′,𝑐
Goal

Fix the following theorem:

 Let ℳ be an algorithm that uses atomic objects and ℳ′ be obtained from ℳ by replacing some objects with linearizable implementations. Then for any 𝒜′ there exists 𝒜 such that (ℳ, 𝒜) and (ℳ′, 𝒜′) are equivalent
Strong Linearizability

 A set of histories ℋ is strongly linearizable if there exists a function 𝑓 mapping histories in 𝑐𝑙𝑜𝑠𝑒(ℋ) to sequential histories such that
  𝑓(𝐻) is a linearization of 𝐻
  𝑓 is prefix preserving
 A shared object is strongly linearizable if its set of histories is strongly linearizable
 From here on, ℳ′ is obtained from ℳ by replacing some objects with their strongly linearizable implementations
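For reference, the prefix-preservation requirement above is the standard one (my phrasing, not copied from the slide):

```latex
\forall G, H \in \mathit{close}(\mathcal{H}):\quad
G \text{ is a prefix of } H \;\Longrightarrow\; f(G) \text{ is a prefix of } f(H).
```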
Main Theorem

For every adversary 𝒜′ there exists an adversary 𝒜 such that (ℳ, 𝒜) and (ℳ′, 𝒜′) are equivalent.

Main Theorem - Proof

 Need to find 𝒜 such that 𝐻′𝑐 and 𝐻𝑐 are simultaneously linearizable
 𝐻𝑐 is already sequential
 How can we guarantee that 𝐻𝑐 is a linearization of 𝐻′𝑐? (hint: use the assumptions!)
 Let 𝐻𝑐 = 𝑓(𝐻′𝑐)
 Need 𝒜 such that 𝐻𝑐 = 𝑓(𝐻′𝑐)
 Define 𝒜(𝑐) to be the schedule of 𝑓(𝐻′𝑐)

Main Theorem - Proof

[Figure: 𝐻′𝑐 is mapped by 𝑓 to the sequential history 𝑓(𝐻′𝑐), and 𝒜(𝑐) is read off as the schedule of 𝑓(𝐻′𝑐).]

Main Theorem - Proof

 Done?
 What is 𝑓?
 Locality – a set of strongly linearizable objects is strongly linearizable
Locality Lemma

 Highly technical
 Induction on 𝑘
 ℋ𝑘 are the histories of length at most 𝑘
 𝑓𝑘 is good for ℋ𝑘
 Define 𝑓𝑘(𝐻) = 𝑓𝑘−1(𝐺) ∘ 𝜆 for a suitable shorter prefix 𝐺 of 𝐻 and a single appended operation 𝜆

[Figure: a history 𝐻 over two objects, 𝑂1 linearized with 𝑓1 and 𝑂2 with 𝑓2, illustrating how 𝑓𝑘 is built from 𝑓𝑘−1.]

Main Theorem - Proof

 Done?
 𝒜 is not allowed to cheat: it may only use coin flips it has already seen

Main Theorem - Proof

[Figure: the histories 𝐻′0 and 𝐻′1 with their linearizations 𝑓(𝐻′0) and 𝑓(𝐻′1): 𝑓 may place the coin flip at different points in the two linearizations, which 𝒜 cannot anticipate.]

Main Theorem - Proof

 Fix 𝑓: move each coin flip to the earliest point possible, obtaining 𝑓∗

[Figure: 𝑓(𝐻′0) transformed into 𝑓∗(𝐻′0), and 𝑓(𝐻′1) into 𝑓∗(𝐻′1), with the coin flips pulled forward.]

Main Theorem - Proof

 𝒜 is good if 𝑓∗ is used instead of 𝑓
 Done? Yes!
Strong Linearizability is Necessary

 If (ℳ, 𝒜) and (ℳ′, 𝒜′) are equivalent, then ℋ′ = {𝐻′𝑐 : 𝑐 a coin flip vector} is strongly linearizable
Necessity

 𝑓 should linearize every 𝐻′𝑐 ∈ ℋ′
 𝐻𝑐 and 𝐻′𝑐 have the same linearization
 𝐻𝑐 is already linearized
 Define 𝑓(𝐻′𝑐) = 𝐻𝑐
Necessity

 Done?
 𝑓 must map all of ℋ∗ = 𝑐𝑙𝑜𝑠𝑒(ℋ′)

[Figure: a history 𝐻′ and a prefix 𝐺′ of it, together with their images 𝑓(𝐻′) and 𝑓(𝐺′).]
Necessity

 Done?
 𝑓(𝐺′) depends on the choice of 𝐻′
 Lemma: no, it doesn't
 The theorem follows
Good to Know

 Composability
 Weak Adversaries
 Feasibility
 Proof techniques
Conclusion

 Linearizability & Randomization
  Examples
 Strong Linearizability
  Sufficient
  Necessary
  Local & Composable
  Feasible
(Easy) Questions?
Thanks For Listening!