Generalized ABRACADABRA Martingale Problem
Matt Rosenzweig
Contents

1 Doob's Theorem
  1.1 Martingales
  1.2 Stopping Times; Stopped Processes
  1.3 Doob's Stopping Time Theorem
2 Waiting for a Fixed Pattern
  2.1 Martingale Approach
  2.2 ABRACADABRA Problem
1 Doob's Theorem

Notation: Z+ = {n ∈ Z : n ≥ 0}.

1.1 Martingales
Definition 1.1.1. A filtered space (Ω, F, Fn , P) consists of a probability triple (Ω, F, P) together with a
collection of nested σ-algebras
F0 ⊂ F1 ⊂ · · · ⊂ Fn ⊂ · · · ⊂ F
called a filtration.
Definition 1.1.2. A countable collection of random variables (X_n)_{n=0}^∞ is called a discrete-time random process. A random process is said to be adapted to the filtration (F_n)_{n=0}^∞ if X_n is F_n-measurable for each n. A random process is said to form a martingale relative to the filtration (F_n) if it is adapted and if

E(|X_n|) < ∞ ∀n ∈ Z+ and E(X_n | F_{n−1}) = X_{n−1} ∀n ≥ 1
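For concreteness, here is a small simulation sketch (my own illustration, not part of the original notes) of the defining property. A gambler who repeatedly bets his entire fortune at fair odds on i.i.d. events of probability p has fortune Y_n = Y_{n−1}·(1/p)·1_{U_n = A}, and E(Y_n | F_{n−1}) = Y_{n−1}·(1/p)·p = Y_{n−1}, so (Y_n) is a martingale; this is exactly the mechanism used in Section 2.

```python
import numpy as np

# Sketch (my own illustration): a gambler bets his entire fortune at fair
# odds that each i.i.d. keystroke is 'A', where P(keystroke = 'A') = p.
# His fortune Y_n = Y_{n-1} * (1/p) * 1{U_n = 'A'} satisfies
# E(Y_n | F_{n-1}) = Y_{n-1} * (1/p) * p = Y_{n-1}: a martingale.
rng = np.random.default_rng(0)
p, steps, trials = 0.5, 8, 400_000
hits = rng.random((trials, steps)) < p              # the events {U_n = 'A'}
fortunes = np.cumprod(np.where(hits, 1 / p, 0.0), axis=1)
print(fortunes.mean(axis=0))                        # every column ~ 1 = E(Y_0)
```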
1.2 Stopping Times; Stopped Processes
Definition 1.2.1. Suppose we have a filtered space (Ω, F, F_n, P). A map τ : Ω → Z+ ∪ {∞} is called a stopping time if it satisfies

1. τ^{−1}({n}) ∈ F_n ∀n ∈ Z+
2. P(τ < ∞) = 1

We say that τ is a random time if we only require that τ satisfy the first condition.
Lemma 1.2.2. Let τ be a random time with respect to a filtration (F_n)_{n=0}^∞. If for some fixed N ∈ N and some ε > 0,

E(1_{τ≤n+N} | F_n) ≥ ε > 0 a.e. ∀n ≥ 0,

then E(τ) < ∞.
Proof. From our hypothesis that E(1_{τ≤n+N} | F_n) ≥ ε a.e., we obtain

P({τ ≤ n+N}) = E(E(1_{τ≤n+N} | F_n)) ≥ ε·P(Ω) = ε ⟹ P({τ > n+N}) ≤ 1 − ε ∀n ≥ 0

In particular, P({τ > N}) ≤ 1 − ε. {τ > (k−1)N} ∈ F_{(k−1)N} as an elementary consequence of the definition of a random time, and

E(1_{τ≤kN} | F_{(k−1)N}) ≥ ε ⟹ E(1_{τ>kN} | F_{(k−1)N}) = 1 − E(1_{τ≤kN} | F_{(k−1)N}) ≤ 1 − ε

by hypothesis. Hence,

P({τ > kN}) = ∫_{{τ>(k−1)N}} 1_{τ>kN} dP = ∫_{{τ>(k−1)N}} E(1_{τ>kN} | F_{(k−1)N}) dP ≤ (1 − ε)·P({τ > (k−1)N})

So it follows by induction that

P({τ > kN}) ≤ (1 − ε)^k ∀k ∈ N

Since ({τ > kN})_{k∈N} is a decreasing sequence of measurable sets, we have that

P({τ = ∞}) = lim_{k→∞} P({τ > kN}) = lim_{k→∞} (1 − ε)^k = 0
where we use that 0 < 1 − ε < 1. Furthermore,

Σ_{n=0}^∞ n·P({τ = n}) ≤ Σ_{k=0}^∞ (k+1)N·P({kN < τ ≤ (k+1)N})
  = N·Σ_{k=0}^∞ (k+1)·[P({τ > kN}) − P({τ > (k+1)N})]
  ≤ N·Σ_{k=0}^∞ (k+1)·P({τ > kN})
  ≤ N·(Σ_{k=0}^∞ (1 − ε)^k + Σ_{k=0}^∞ k(1 − ε)^k)

Clearly, the first term is bounded. We can write 1/(1 − (1 − x)) = Σ_{k=0}^∞ (1 − x)^k, where the convergence is uniform for |1 − x| ≤ 1 − ε < 1. Differentiating termwise, we see that Σ_{k=1}^∞ k(1 − ε)^{k−1} = 1/(1 − (1 − ε))^2 = 1/ε^2 < ∞. It follows immediately from the monotone convergence theorem applied to the partial sums Σ_{n=0}^m n·1_{τ=n} that E(τ) < ∞.
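The hypothesis of Lemma 1.2.2 is easy to meet in practice. A hedged numeric illustration (mine, not from the text): for τ = the index of the first head in i.i.d. coin flips with P(head) = p, we may take N = 1 and ε = p, and the proof's geometric tail bound holds with equality.

```python
import numpy as np

# Illustration (my own): tau = first-success time of i.i.d. flips with
# P(success) = p. Then E(1{tau <= n+1} | F_n) >= p on {tau > n}, so the
# hypothesis of Lemma 1.2.2 holds with N = 1, eps = p.
rng = np.random.default_rng(1)
p, trials = 0.3, 200_000
tau = rng.geometric(p, size=trials)        # support {1, 2, 3, ...}

# Tail bound from the proof: P(tau > k) <= (1 - eps)^k (equality here).
for k in (1, 5, 10):
    print(k, (tau > k).mean(), (1 - p) ** k)

# E(tau) < infinity; it also equals sum_k P(tau > k), as used in Section 1.3.
print(tau.mean(), sum((tau > k).mean() for k in range(tau.max())), 1 / p)
```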
If we are given a stopping time τ, then we can consider the function X_τ : Ω → R defined by

X_τ(ω) = X_{τ(ω)}(ω) ∀ω ∈ Ω
Lemma 1.2.3. X_τ : Ω → R is a random variable (i.e., a measurable function with respect to (Ω, F, P)).

Proof. Let α ∈ R. Then

ω ∈ X_τ^{−1}({t < α}) ⟺ ω ∈ ⋃_{n=0}^∞ ({ω : τ(ω) = n} ∩ {ω : X_n(ω) < α})

By definition, {ω : τ(ω) = n} ∈ F_n ⊂ F. Since F is closed under countable unions, we conclude that X_τ^{−1}({t < α}) ∈ F.
Definition 1.2.4. Let τ be a random time. If (X_n)_{n=0}^∞ is a martingale, we define a stopped process (X_{τ∧n})_{n=0}^∞ by

X_{τ∧n}(ω) = X_n(ω) if n ≤ τ(ω), and X_{τ∧n}(ω) = X_{τ(ω)}(ω) if n > τ(ω)
Lemma 1.2.5. If (X_n)_{n=0}^∞ is a martingale, then the stopped process (X_{τ∧n})_{n=0}^∞ is also a martingale and E(X_{τ∧n}) = E(X_0).

Proof. For each n ∈ Z+, define an indicator random variable I_n by

I_n(ω) = 1 if τ(ω) ≥ n, and I_n(ω) = 0 if τ(ω) < n

I claim that

X_{τ∧n} = X_{τ∧(n−1)} + I_n·(X_n − X_{n−1})

Indeed, if (τ∧n)(ω) = n, then τ(ω) ≥ n > n−1, so

X_{τ∧(n−1)}(ω) + I_n(ω)·(X_n(ω) − X_{n−1}(ω)) = X_{n−1}(ω) + (X_n(ω) − X_{n−1}(ω)) = X_n(ω) = X_{τ∧n}(ω)

If (τ∧n)(ω) = τ(ω) < n, then I_n(ω) = 0, so the assertion is obvious. Taking conditional expectations and applying linearity,

E(X_{τ∧n} | X_1, …, X_{n−1}) = E(X_{τ∧(n−1)} | X_1, …, X_{n−1}) + E(I_n·(X_n − X_{n−1}) | X_1, …, X_{n−1})

The reader can verify that

X_{τ∧(n−1)}^{−1}({t < α}) = ({τ ≥ n−1} ∩ X_{n−1}^{−1}({t < α})) ∪ ⋃_{k=0}^{n−2} ({τ = k} ∩ {X_k < α})

By definition of a random time, {τ = k} ∈ σ(X_0, …, X_k) ⊂ σ(X_0, …, X_{n−1}) for 0 ≤ k ≤ n−1. So we conclude that X_{τ∧(n−1)} is σ(X_1, …, X_{n−1})-measurable and therefore E(X_{τ∧(n−1)} | X_1, …, X_{n−1}) = X_{τ∧(n−1)}. An analogous argument shows that I_n is σ(X_1, …, X_{n−1})-measurable, and therefore

E(I_n·(X_n − X_{n−1}) | X_1, …, X_{n−1}) = I_n·E(X_n − X_{n−1} | X_1, …, X_{n−1}) = 0

where we use E(X_n | X_1, …, X_{n−1}) = X_{n−1} by definition of a martingale. We conclude that E(X_{τ∧n} | X_1, …, X_{n−1}) = X_{τ∧(n−1)}. Since σ(X_{τ∧1}, …, X_{τ∧(n−1)}) ⊂ σ(X_1, …, X_{n−1}) and X_{τ∧(n−1)} is obviously σ(X_{τ∧1}, …, X_{τ∧(n−1)})-measurable, it follows that

E(X_{τ∧n} | X_{τ∧1}, …, X_{τ∧(n−1)}) = X_{τ∧(n−1)} ∀n ≥ 1

which shows that (X_{τ∧n})_{n=0}^∞ is a martingale. Since X_{τ∧0} = X_0, we also obtain

E(X_{τ∧n}) = E(X_0) ∀n ∈ Z+
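The conclusion of Lemma 1.2.5 can be watched numerically. A simulation sketch (my own, not part of the original notes): stop a simple symmetric random walk at the first hitting time of {−a, b} and check E(X_{τ∧n}) = E(X_0) = 0 for several n.

```python
import numpy as np

# Sketch (my own illustration) of Lemma 1.2.5: freeze a simple symmetric
# random walk at tau = inf{n : X_n <= -a or X_n >= b} and check that the
# stopped process has constant mean E(X_0) = 0.
rng = np.random.default_rng(2)
a, b, horizon, trials = 3, 5, 60, 100_000
walk = np.cumsum(rng.choice([-1, 1], size=(trials, horizon)), axis=1)

hit = (walk <= -a) | (walk >= b)
first = np.where(hit.any(axis=1), hit.argmax(axis=1), horizon - 1)
idx = np.minimum(np.arange(horizon), first[:, None])   # time index capped at tau
stopped = walk[np.arange(trials)[:, None], idx]        # X_{tau ^ n}, path by path

for n in (5, 15, 30, 60):
    print(f"E(X_(tau^{n})) ~ {stopped[:, n - 1].mean():+.4f}")   # all ~ 0
```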
1.3 Doob's Stopping Time Theorem
Theorem 1.3.1. Let (X_n)_{n=0}^∞ be a martingale, let τ be a random time, and suppose one of the following conditions holds:

1. τ is bounded: there exists N ∈ N such that τ(ω) ≤ N ∀ω.
2. There exists K ∈ R such that |X_n(ω)| ≤ K ∀ω, ∀n ∈ N, and P({τ < ∞}) = 1.
3. E(τ) < ∞ and there exists M ∈ R such that E(|X_{n+1} − X_n| | X_0, …, X_n) < M.

Then E(X_τ) = E(X_0).
Proof. If τ is bounded by N ∈ N, then X_{(τ∧N)(ω)}(ω) = X_{τ(ω)}(ω) ∀ω ∈ Ω. Using our result that (X_{τ∧n})_{n=0}^∞ is a martingale with X_{τ∧0} = X_0, we conclude

E(X_0) = E(X_{τ∧N}) = E(X_τ)

If there exists K ∈ R such that |X_{τ∧n}(ω)| ≤ K ∀ω ∈ Ω, ∀n ∈ N, then since P(Ω) = 1 < ∞, the function K·1_Ω ∈ L^1(Ω, F, P). Since {τ = ∞} has measure zero, lim_{n→∞} X_{(τ∧n)(ω)}(ω) = X_{τ(ω)}(ω) a.e. The dominated convergence theorem then tells us that

lim_{n→∞} E(X_{τ∧n}) = lim_{n→∞} ∫_Ω X_{τ∧n} dP = ∫_Ω lim_{n→∞} X_{τ∧n} dP = ∫_Ω X_τ dP = E(X_τ)

Since (X_{τ∧n})_{n=0}^∞ is a martingale with X_{τ∧0} = X_0, so that E(X_{τ∧n}) = E(X_0) ∀n ∈ N, we conclude that E(X_0) = E(X_τ).
Suppose E(τ) < ∞ and there exists M ∈ R such that

E(|X_{n+1} − X_n| | X_0, …, X_n) < M

First, I claim that

X_τ = X_0 + Σ_{n=1}^∞ (X_n − X_{n−1})·1_{τ>n−1} a.e.

Indeed, since E(τ) < ∞, it follows that P({τ < ∞}) = 1. So if τ(ω) = k, then

X_0(ω) + Σ_{n=1}^∞ (X_n(ω) − X_{n−1}(ω))·1_{τ>n−1}(ω) = X_0(ω) + Σ_{n=1}^k (X_n(ω) − X_{n−1}(ω)) = X_k(ω) = X_{τ(ω)}(ω)
Therefore,

|X_τ| ≤ |X_0| + Σ_{n=1}^∞ |X_n − X_{n−1}|·1_{τ>n−1}

I also claim that the RHS dominates |X_{τ∧m}| a.e. ∀m ∈ N. Indeed, if τ(ω) = k ≤ m, then (τ∧m)(ω) = τ(ω), and our preceding work gives us the inequality. If τ(ω) = k > m, then (τ∧m)(ω) = m, 1_{τ>n−1}(ω) = 1 for n ≤ m, and

|X_0(ω)| + Σ_{n=1}^∞ |X_n(ω) − X_{n−1}(ω)|·1_{τ>n−1}(ω) = |X_0(ω)| + Σ_{n=1}^m |X_n(ω) − X_{n−1}(ω)| + Σ_{n>m} |X_n(ω) − X_{n−1}(ω)|·1_{τ>n−1}(ω)
  ≥ |X_0(ω)| + Σ_{n=1}^m |X_n(ω) − X_{n−1}(ω)|
  ≥ |X_m(ω)| = |X_{(τ∧m)(ω)}(ω)|

by the triangle inequality.
We want to show that the RHS is integrable. Observe that ∀n ≥ 1,

E(|X_n − X_{n−1}|·1_{τ>n−1}) = E(E(|X_n − X_{n−1}|·1_{τ>n−1} | X_0, …, X_{n−1}))

Since 1_{τ>n−1} is bounded and σ(X_0, …, X_{n−1})-measurable (this follows from the fact that {τ = k} ∈ σ(X_0, …, X_k) ⟹ {τ ≤ n−1} ∈ σ(X_0, …, X_{n−1}) by definition of a random time), we have

E(E(|X_n − X_{n−1}|·1_{τ>n−1} | X_0, …, X_{n−1})) = E(1_{τ>n−1}·E(|X_n − X_{n−1}| | X_0, …, X_{n−1})) ≤ M·P({τ > n−1})

Thus,

E(|X_0|) + Σ_{n=1}^∞ E(|X_n − X_{n−1}|·1_{τ>n−1}) ≤ E(|X_0|) + M·Σ_{n=1}^∞ P({τ > n−1}) = E(|X_0|) + M·E(τ) < ∞
That Σ_{n=1}^∞ P({τ > n−1}) = E(τ) follows from the general result that ∫_X f dμ = ∫_0^∞ μ({f > t}) dt, where dt is Lebesgue measure on R, for any nonnegative measurable function f : X → R on a σ-finite measure space (X, A, μ) (see DiBenedetto, Real Analysis, Proposition 15.1, p. 149): since τ is Z+-valued, Σ_{k=0}^∞ P({τ > k}) = ∫_0^∞ P({τ > t}) dt = E(τ). It follows from the monotone convergence theorem that |X_0| + Σ_{n=1}^∞ |X_n − X_{n−1}|·1_{τ>n−1} is integrable, and in particular E(|X_τ|) < ∞. Since τ < ∞ a.e., X_{τ∧m} → X_τ a.e., and

|X_{τ∧m} − X_τ| ≤ |X_{τ∧m}| + |X_τ| ≤ 2·(|X_0| + Σ_{n=1}^∞ |X_n − X_{n−1}|·1_{τ>n−1})

where the RHS is integrable. By the dominated convergence theorem,

E(|X_{τ∧m} − X_τ|) → 0, m → ∞ ⟹ E(X_0) = lim_{m→∞} E(X_{τ∧m}) = E(X_τ)
Alternatively, we can show that (|X_{τ∧m}|)_{m∈N} is a uniformly integrable family and derive the desired result from this. Set B_τ := Σ_{n=1}^∞ |X_n − X_{n−1}|·1_{τ>n−1}. For m ∈ N, write

|X_{τ∧m}| = |X_τ|·1_{τ≤m} + |X_m|·1_{τ>m}

I claim that E(|X_m|·1_{τ>m}) → 0 as m → ∞. Indeed, our above work shows that

E(|X_m|·1_{τ>m}) ≤ E((|X_0| + B_τ)·1_{τ>m})

Since P({τ < ∞}) = 1, 1_{τ>m} → 0 a.e. as m → ∞. Hence (|X_0| + B_τ)·1_{τ>m} → 0 a.e. as m → ∞. Since this sequence of functions is dominated by |X_0| + B_τ, which is integrable, it follows from the dominated convergence theorem that

E(|X_m|·1_{τ>m}) ≤ E((|X_0| + B_τ)·1_{τ>m}) → 0, m → ∞

Since |X_τ|·1_{τ≤m} → |X_τ| a.e. and is dominated by |X_τ|, another application of the dominated convergence theorem gives

lim_{m→∞} E(|X_{τ∧m}|) = lim_{m→∞} (E(|X_τ|·1_{τ≤m}) + E(|X_m|·1_{τ>m})) = E(|X_τ|)

This shows that the collection (X_{τ∧m}) is uniformly integrable. For any given ε > 0,

{|X_{τ∧m} − X_τ| > ε} ⊂ {τ > m} ⟹ lim_{m→∞} P({|X_{τ∧m} − X_τ| > ε}) ≤ lim_{m→∞} P({τ > m}) = 0

So X_{τ∧m} → X_τ in probability as m → ∞. The Vitali convergence theorem tells us that E(|X_{τ∧m} − X_τ|) → 0 as m → ∞, from which we obtain E(X_{τ∧m}) → E(X_τ).
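Doob's theorem in action (an illustration of mine, not in the original text): gambler's ruin. For the simple symmetric random walk and τ = the first hitting time of {−a, b}, the stopped process is bounded by max(a, b) and P(τ < ∞) = 1, so condition 2 applies and E(X_τ) = E(X_0) = 0; writing 0 = b·P(X_τ = b) − a·(1 − P(X_τ = b)) gives P(X_τ = b) = a/(a + b).

```python
import numpy as np

# Hedged illustration (mine): optional stopping forces the ruin probability.
# 0 = E(X_tau) = b * P(X_tau = b) - a * (1 - P(X_tau = b))
#   => P(X_tau = b) = a / (a + b).
rng = np.random.default_rng(3)
a, b, horizon, trials = 4, 6, 400, 20_000
walk = np.cumsum(rng.choice([-1, 1], size=(trials, horizon)), axis=1)

hit = (walk <= -a) | (walk >= b)
first = hit.argmax(axis=1)          # horizon is long enough that all paths hit
x_tau = walk[np.arange(trials), first]
print((x_tau == b).mean(), a / (a + b))    # both ~ 0.4
```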
2 Waiting for a Fixed Pattern

2.1 Martingale Approach
Suppose we are interested in the expected number of trials (or amount of time) until a given finite string of letters appears. Order the letters A, …, Z and let p_i be the probability of the ith letter appearing (p_1 = p_A is the probability of an A, and p_26 = p_Z is the probability of a Z). We assume the trials are independent and identically distributed. Restricting to a proper subset of the alphabet if necessary, we may assume that p_i > 0 ∀1 ≤ i ≤ 26. Fix a pattern

ζ_1 ⋯ ζ_m, where m ∈ N and ζ_i ∈ {A, …, Z} ∀1 ≤ i ≤ m

We will solve this problem by constructing a martingale and applying Doob's stopping time theorem. Let (Ω, F, P) be a probability space and (U_n)_{n=1}^∞ a sequence of i.i.d. random variables with distribution given by

P(U_n = the ith letter) = p_i

U_n corresponds to the nth keystroke. Set F_0 = {∅, Ω} and

F_n = σ(U_1, …, U_n), n ≥ 1

Each F_n is a finite σ-algebra. Replacing F by σ(⋃_{n=0}^∞ F_n) if necessary, we may assume that (F_n)_{n=0}^∞ is a filtration of (Ω, F, P).
Suppose there is a sequence of gamblers who observe the typing and a casino which accepts the gamblers' wagers on the keystrokes. Each gambler arrives with a fortune of $1 and bets as follows: at minute 1, the first gambler bets $1 on the letter ζ_1; he receives $(1/p_{ζ_1} − 1) plus his bet back if U_1 = ζ_1, and loses his $1 and quits otherwise. At minute 2, if the first gambler is still there, he bets his entire fortune on the letter ζ_2 and ends up with a fortune of $∏_{i=1}^2 (1/p_{ζ_i}) if U_2 = ζ_2; he quits otherwise. Also at minute 2, a second gambler wagers $1 on ζ_1 and quits if he loses. We continue this process for all gamblers analogously for the letters ζ_3, …, ζ_m. The casino continues to accept wagers indefinitely (i.e., gamblers continue to arrive and wager, as described above), but once a gambler wins consecutive wagers spelling out ζ_1 ⋯ ζ_m, he stops wagering.

We define a process (M_n)_{n=0}^∞ by letting M_0 = 0 and M_n be the random variable giving the combined fortune of all the gamblers at minute n. Since M_n depends only on the values of U_1, …, U_n, it is clear that M_n is F_n-measurable. Hence, the process is adapted to (F_n)_{n=0}^∞.

Define a process (X_n)_{n=0}^∞ by X_n = M_n − n.
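To make the bookkeeping concrete, here is a minimal sketch (my own; the function name combined_fortune and its interface are hypothetical, not from the text) of how the gamblers' total fortune M_n evolves for a given keystroke sequence:

```python
# Sketch (my own) of the gamblers' bookkeeping. 'pattern' is zeta_1...zeta_m
# and p[c] is the probability of letter c; fair odds pay 1/p[c] per dollar.
def combined_fortune(keystrokes, pattern, p):
    m = len(pattern)
    active = []                    # (fortune, index of the next letter to bet on)
    frozen = 0.0                   # fortunes of gamblers who spelled the pattern
    M = []
    for u in keystrokes:
        active.append((1.0, 0))    # a new gambler arrives and bets $1 on zeta_1
        survivors = []
        for fortune, k in active:
            if u == pattern[k]:
                fortune /= p[u]    # a winning $f bet at fair odds returns f/p
                if k + 1 == m:
                    frozen += fortune   # full pattern spelled: stop wagering
                else:
                    survivors.append((fortune, k + 1))
            # a loser forfeits his entire fortune and quits
        active = survivors
        M.append(frozen + sum(f for f, _ in active))
    return M
```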
Lemma 2.1.1. (X_n)_{n=0}^∞ is a martingale.

Proof. Let (α_n)_{n=1}^∞ be the infinite sequence of letters obtained by repeatedly spelling ζ_1 ζ_2 ⋯ ζ_m; i.e., α_1 = ζ_1, …, α_m = ζ_m, α_{m+1} = ζ_1, and so on periodically. Clearly,

E(M_1 | F_0) = (1/p_{ζ_1})·p_{ζ_1} = 1 ⟹ E(X_1 | F_0) = 0 = X_0
For 1 ≤ j ≤ n, let Y_{n,j} denote the fortune at minute n of the gambler who made his first bet at minute j, so that

M_n = Σ_{j=1}^n Y_{n,j}

Clearly, each Y_{n,j} is F_n-measurable, depending only on the values of U_1, …, U_n. Fix n ∈ N. If 1 ≤ n ≤ m, then

Y_{n,j} = ∏_{i=1}^{n+1−j} 1/p_{ζ_i} with probability ∏_{i=1}^{n+1−j} p_{ζ_i}, and Y_{n,j} = 0 with probability 1 − ∏_{i=1}^{n+1−j} p_{ζ_i}

and we see that E(Y_{n,j}) = 1 for 1 ≤ j ≤ n. Now suppose n > m. If n − j ≥ m, then since the gambler stops wagering once he has successfully bet ζ_1 ⋯ ζ_m, Y_{n,j} = Y_{n−1,j}. We again obtain E(Y_{n,j}) = 1 for 1 ≤ j ≤ n.
Since at minute n+1 the (n+1)th gambler comes along and bets $1 that the (n+1)th keystroke is ζ_1, we have

M_{n+1} = (1/p_{ζ_1})·1_{U_{n+1}=ζ_1} + Σ_{j : n−j<m−1} Y_{n,j}·(1/p_{α_{n+2−j}})·1_{U_{n+1}=α_{n+2−j}} + Σ_{j : n−j≥m−1} Y_{n,j}

So by linearity of conditional expectation, we have

E(M_{n+1} | F_n) = (1/p_{ζ_1})·E(1_{U_{n+1}=ζ_1} | F_n) + Σ_{j : n−j<m−1} (1/p_{α_{n+2−j}})·E(Y_{n,j}·1_{U_{n+1}=α_{n+2−j}} | F_n) + Σ_{j : n−j≥m−1} E(Y_{n,j} | F_n)

Since Y_{n,j} is bounded by ∏_{i=1}^m 1/p_{ζ_i} and is F_n-measurable for 1 ≤ j ≤ n, it follows from elementary properties of conditional expectation and the fact that U_{n+1} is independent of U_1, …, U_n that

E(1_{U_{n+1}=ζ_1} | F_n) = E(1_{U_{n+1}=ζ_1}) = p_{ζ_1}

E(Y_{n,j} | F_n) = Y_{n,j} if n − j ≥ m − 1

and

E(Y_{n,j}·1_{U_{n+1}=α_{n+2−j}} | F_n) = Y_{n,j}·E(1_{U_{n+1}=α_{n+2−j}} | F_n) = p_{α_{n+2−j}}·Y_{n,j} if n − j < m − 1

We conclude that

E(M_{n+1} | F_n) = (1/p_{ζ_1})·p_{ζ_1} + Σ_{j : n−j<m−1} (1/p_{α_{n+2−j}})·p_{α_{n+2−j}}·Y_{n,j} + Σ_{j : n−j≥m−1} Y_{n,j} = 1 + Σ_{j=1}^n Y_{n,j} = 1 + M_n

⟹ E(X_{n+1} | F_n) = E(M_{n+1} | F_n) − (n+1) = M_n − n = X_n

Since M_n is bounded, E(|X_n|) < ∞ ∀n, and we conclude that (X_n)_{n=0}^∞ is indeed a martingale.
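A quick empirical check (mine) of the lemma using the combined_fortune sketch above: since X_n = M_n − n is a martingale with X_0 = 0, we should see E(M_n) = n.

```python
import random

# Check (my own) that E(M_n) = n, i.e., E(X_n) = E(X_0) = 0, for the
# pattern "AB" over a fair two-letter alphabet.
random.seed(4)
pattern, p = "AB", {"A": 0.5, "B": 0.5}
n, trials = 6, 20_000
totals = [0.0] * n
for _ in range(trials):
    keystrokes = random.choices("AB", k=n)
    for i, M in enumerate(combined_fortune(keystrokes, pattern, p)):
        totals[i] += M
print([round(t / trials, 2) for t in totals])   # ~ [1.0, 2.0, ..., 6.0]
```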
Lemma 2.1.2. There exists M > 0 such that |X_{n+1} − X_n| ≤ M ∀n ∈ N.

Proof. Since a gambler stops betting after he loses or successfully wagers ζ_1, …, ζ_m in that order, only the gamblers who made their first bets within the last m minutes can change their fortunes between minutes n and n+1; the frozen fortunes Y_{n,j} with n − j ≥ m − 1 cancel in the difference. Using the same notation as in the proof of Lemma 2.1.1,

|X_{n+1} − X_n| = |M_{n+1} − M_n − 1| = |(1/p_{ζ_1})·1_{U_{n+1}=ζ_1} + Σ_{j : n−j<m−1} Y_{n,j}·((1/p_{α_{n+2−j}})·1_{U_{n+1}=α_{n+2−j}} − 1) − 1|

Then since Y_{n,j} ≥ 0 for all 1 ≤ j ≤ n and all n, since each active Y_{n,j} is at most max_{ζ_i}(1/p_{ζ_i})^{m−1}, and since fewer than m indices j satisfy n − j < m − 1,

|X_{n+1} − X_n| ≤ 1/p_{ζ_1} + 1 + Σ_{j : n−j<m−1} Y_{n,j}·max_{ζ_i}(1/p_{ζ_i}) ≤ 1/p_{ζ_1} + 1 + m·max_{ζ_i}(1/p_{ζ_i})^m =: M

I claim that E(|X_{n+1} − X_n| | F_n) ≤ M a.e. Suppose not. Then the set A := {E(|X_{n+1} − X_n| | F_n) > M + 1/k} has strictly positive measure for some k ∈ N. Since A ∈ F_n,

M·P(A) ≥ ∫_A |X_{n+1} − X_n| dP = ∫_A E(|X_{n+1} − X_n| | F_n) dP ≥ (M + 1/k)·P(A) > M·P(A)

which is a contradiction.
Let τ be the hitting time of the pattern. More precisely, let τ : Ω → Z+ ∪ {∞} be the random variable given by

τ(ω) = inf{n ∈ Z+ : Y_{n,j}(ω) = ∏_{i=1}^m 1/p_{ζ_i} for some j, 1 ≤ j ≤ n} ∀ω ∈ Ω
Lemma 2.1.3. τ is a random time.

Proof. Clearly, {τ = n} = ∅ ∈ F_n for n < m, and {τ = m} ∈ F_m. For n > m,

ω ∈ {τ = n} ⟺ ω ∈ {U_n = ζ_m, …, U_{n−m+1} = ζ_1} ∩ ⋂_{j=1}^{n−m} (Ω \ {U_{n−j} = ζ_m, …, U_{n−j−m+1} = ζ_1})

from which we conclude that {τ = n} ∈ F_n, and therefore τ is a random time.
Lemma 2.1.4. E(τ) < ∞.

Proof. Recall that m ∈ N is the length of the string of letters we are interested in. I claim that

E(1_{τ≤m+n} | F_n) ≥ p_{ζ_1} ⋯ p_{ζ_m} a.e. ∀n ≥ 0

Indeed,

τ(ω) ≤ m + n ∀ω ∈ {U_{n+1} = ζ_1, U_{n+2} = ζ_2, …, U_{n+m} = ζ_m}

and by the independence of the U_i,

P({U_{n+1} = ζ_1, U_{n+2} = ζ_2, …, U_{n+m} = ζ_m}) = p_{ζ_1} ⋯ p_{ζ_m}

So the function

1_{τ≤n+m} − 1_{U_{n+1}=ζ_1, U_{n+2}=ζ_2, …, U_{n+m}=ζ_m}

is nonnegative, and therefore

E(1_{τ≤n+m} | F_n) ≥ E(1_{U_{n+1}=ζ_1, U_{n+2}=ζ_2, …, U_{n+m}=ζ_m} | F_n) = p_{ζ_1} ⋯ p_{ζ_m} a.e.

where the last equality follows from the fact that 1_{U_{n+1}=ζ_1, …, U_{n+m}=ζ_m} is independent of F_n. That E(τ) < ∞ now follows from Lemma 1.2.2 applied with N = m and ε = p_{ζ_1} ⋯ p_{ζ_m}.

The preceding lemmata show that the martingale (X_n)_{n=0}^∞ and the random time τ satisfy the third condition of Doob's stopping time theorem. We conclude that

0 = E(X_0) = E(X_τ) = E(M_τ) − E(τ) ⟺ E(τ) = E(M_τ)
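The right-hand side E(M_τ) can be evaluated in closed form: at minute τ, the only gamblers holding money are those whose winning streaks form a suffix of ζ_1 ⋯ ζ_m that is also a prefix, and a streak of length k is worth ∏_{i=1}^k 1/p_{ζ_i}. (This is exactly the accounting carried out for ABRACADABRA in the next subsection.) A sketch of that computation (my own packaging; the function name is hypothetical):

```python
from fractions import Fraction

# Sketch (my own): E(tau) = E(M_tau) = sum over "borders" of the pattern
# (suffixes that are also prefixes, including the full pattern) of
# prod_{i=1}^{k} 1/p_{zeta_i}, where k is the border length.
def expected_waiting_time(pattern, p):
    m, total = len(pattern), Fraction(0)
    for k in range(1, m + 1):
        if pattern[m - k:] == pattern[:k]:     # length-k suffix == prefix?
            value = Fraction(1)
            for c in pattern[:k]:
                value /= p[c]
            total += value
    return total

p = {c: Fraction(1, 26) for c in "ABCDEFGHIJKLMNOPQRSTUVWXYZ"}
print(expected_waiting_time("ABRACADABRA", p) == 26**11 + 26**4 + 26)  # True
```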
2.2 ABRACADABRA Problem
As an application of the results of the preceding section, we consider the following well-known problem: each minute, a monkey sitting at a keyboard types one of the 26 capital letters independently and with uniform probability (i.e., each letter has probability 1/26). What is the expected number of minutes until the monkey spells out ABRACADABRA?

Let (U_n)_{n=1}^∞, (F_n)_{n=0}^∞, (M_n)_{n=0}^∞, (X_n)_{n=0}^∞ be as above. Here,

ζ_1 ⋯ ζ_m = ABRACADABRA and p_i = 1/26 ∀1 ≤ i ≤ 26

Let τ be the hitting time for ABRACADABRA. The gamblers who made their first bets at minutes j ≤ τ − 11 (where no gambler made a bet at minute 0) have fortunes of $0 at minute τ. The gambler who made his first bet at minute τ − 10 will have a fortune of $26^11 at minute τ, having won all eleven of his bets. The gamblers who made their first bets at minutes τ − j for 3 < j < 10 and 1 ≤ j < 3 will have fortunes of $0 at minute τ. The gambler who made his first bet at minute τ − 3 will have a fortune of $26^4 at minute τ (his four winning bets spell ABRA, a suffix of ABRACADABRA that is also a prefix), and the gambler who made his first bet at minute τ will have a fortune of $26 (his single winning bet is the letter A). So,

M_τ = 26^11 + 26^4 + 26 ⟺ X_τ = 26^11 + 26^4 + 26 − τ

Therefore,

E(M_τ) − E(τ) = E(X_τ) = E(X_0) = 0 ⟹ E(τ) = 26^11 + 26^4 + 26

In other words, it takes on average 26^11 + 26^4 + 26 minutes for the monkey to type ABRACADABRA.
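Simulating ABRACADABRA directly is hopeless (about 3.67 × 10^15 expected minutes), but the formula can be spot-checked by Monte Carlo on a small analogue. A sketch (my own): the pattern ABAB over the fair two-letter alphabet {A, B} has borders AB and ABAB, so the same argument gives E(τ) = 2^2 + 2^4 = 20.

```python
import random

# Monte Carlo spot check (my own) of E(tau) = 2^2 + 2^4 = 20 for the
# pattern ABAB over a fair alphabet {A, B}.
random.seed(5)
trials, total = 20_000, 0
for _ in range(trials):
    window, t = "", 0
    while window[-4:] != "ABAB":
        window = window[-3:] + random.choice("AB")
        t += 1
    total += t
print(total / trials)   # ~ 20
```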