MATH 7550-01
INTRODUCTION TO PROBABILITY
FALL 2011
Problems.
1 Can you produce an example of a probability space and a sequence of events A_i in it, such that lim_{n→∞} P(A_n) = 0, ∑_{i=1}^∞ P(A_i) = ∞, P{infinitely many of the A_i occur} = 1?
2 Can you produce an example of a probability space and a sequence of events A_i in it, such that lim_{n→∞} P(A_n) = 0, ∑_{i=1}^∞ P(A_i) = ∞, P{infinitely many of the A_i occur} ∈ (0, 1)?
3 Can you produce an example of a probability space and a sequence of events A_i in it, such that lim_{n→∞} P(A_n) = 0, ∑_{i=1}^∞ P(A_i) = ∞, P{infinitely many of the A_i occur} = 0?
4 Let ℐ² be the class of all rectangles {(x, y) : a < x ≤ b, c < y ≤ d}, finite or infinite, in the plane ℝ² (if, say, b = ∞, the inequality x ≤ ∞ means the same as x < ∞: no real number x is equal to ∞); let 𝒪 be the class of all open sets in ℝ².
Prove that σ(ℐ²) = σ(𝒪).
5 Let X be a Borel set in ℝⁿ. Prove that the class of all Borel subsets of X is a σ-algebra in X.
Prove that this σ-algebra is the same as the one generated by all subsets of X that are open in X.
6 Prove that if ξ and η are two random variables (on the same sample space, and taking values in ℝ¹), then ξ + η is also a random variable.
The deadline for Problems 1–6 is September 16.
7 Produce an example of measures m and n on a space (X, 𝒳) such that

    n(C) = ∫_C f_1(x) m(dx) = ∫_C f_2(x) m(dx),

but m{x : f_1(x) ≠ f_2(x)} ≠ 0. Show at which point the “proof” of Theorem 7.1 fails.
The deadline for Problem 7 is September 19.
8 Prove or disprove: Let F(t), −∞ < t < ∞, be a right-continuous nondecreasing function, F(−∞) = 0, F(+∞) = 1. If F has a finite number of jumps at the points x_m < x_{m+1} < ... < x_n, and is constant on the intervals (−∞, x_m), (x_m, x_{m+1}), ..., (x_{n−1}, x_n), (x_n, ∞), then this function is the distribution function of a discrete random variable.
9 Prove or disprove the statement of the previous problem with the finite sequence of jump points replaced by a countable one that is infinite on one side: x_m < x_{m+1} < ... < x_k < x_{k+1} < ..., or ... < x_k < x_{k+1} < ... < x_n; or on both: ... < x_{−1} < x_0 < x_1 < x_2 < ..., assuming that lim_{m→−∞} x_m = −∞ if the sequence is infinite to the left, and that lim_{n→∞} x_n = ∞ if it is infinite to the right.
10 Prove or disprove: Let F(x), −∞ < x < ∞, be a right-continuous nondecreasing function, F(−∞) = 0, F(+∞) = 1. If

    ∑_x [F(x) − F(x−)] = 1,

then the function F is a distribution function of a discrete random variable.
(The sum has, in appearance, an uncountable number of summands, but in fact all of them except a countable number are equal to 0, because a nondecreasing function can have only countably many discontinuities.)
11 Prove or disprove: Let F(t), −∞ < t < ∞, be a right-continuous nondecreasing function, F(−∞) = 0, F(+∞) = 1. If F has jumps at points that form a dense set in ℝ¹, then the function F is a distribution function of a discrete random variable.
12 Prove or disprove: Let F(x), −∞ < x < ∞, be a nondecreasing function, F(−∞) = 0, F(+∞) = 1. If F is continuous on ℝ¹, it is a distribution function corresponding to a density.
13 Prove or disprove: Let F(x), −∞ < x < ∞, be a nondecreasing function, F(−∞) = 0, F(+∞) = 1. If F is continuous on ℝ¹ and piecewise continuously differentiable (that is, ℝ¹ is divided into pieces by points (... <) x_m < x_{m+1} < ... < x_n (< ...) so that F is continuously differentiable on the open intervals between these points, to the left of the smallest x_m if it exists, and to the right of the greatest of them), it is a distribution function corresponding to a density.
14 Prove or disprove: Let F(x), −∞ < x < ∞, be a right-continuous nondecreasing function, F(−∞) = 0, F(+∞) = 1. If

    ∫_{−∞}^∞ F′(x) dx = 1,

then the function F is a distribution function corresponding to a density.
15 Suppose we take a random permutation of the numbers 1, 2, ..., n; that is, the sample space Ω consists of all n! sequences ω = (x_1, x_2, ..., x_n) such that {x_1, x_2, ..., x_n} = {1, 2, ..., n} (of course, ℱ = 𝒫(Ω)); and the probabilities are taken so that all different orders of the natural numbers from 1 to n are equally probable:

    P{(x_1, x_2, ..., x_n)} = 1/n!.

The random variable ξ is equal to the number of numbers i, 1 ≤ i ≤ n, standing in their own place: for ω = (x_1, x_2, ..., x_n),

    ξ(ω) = ξ(x_1, x_2, ..., x_n) = #{i : x_i = i}.

It was proved in the lecture that E ξ = 1, E ξ² = 2.
Is E ξ³ = 3?
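For small n the question can be explored by exact enumeration in Python (a brute-force check under the uniform model above, not a proof, and not a substitute for the computation asked for):

```python
import math
from fractions import Fraction
from itertools import permutations

def moment_of_fixed_points(n, k):
    """Exact k-th moment E xi^k of the number of fixed points
    of a uniformly random permutation of 1..n, by enumeration."""
    total = sum(
        sum(1 for i, x in enumerate(p, start=1) if x == i) ** k
        for p in permutations(range(1, n + 1))
    )
    return Fraction(total, math.factorial(n))
```

For n = 6 this reproduces E ξ = 1 and E ξ² = 2 from the lecture, and comparing moment_of_fixed_points(6, 3) with 3 suggests the answer; the proof is still required.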
The deadline for Problems 8–15 is Sep. 30.
16 Let ξ_1, ξ_2 be two random variables taking values in measurable spaces (X_i, 𝒳_i), i = 1, 2. Prove that ξ = (ξ_1, ξ_2) is a random vector with values in the product space (X_1 × X_2, 𝒳_1 × 𝒳_2) (that is, that the function ξ : ω ↦ (ξ_1(ω), ξ_2(ω)) is an (ℱ, 𝒳_1 × 𝒳_2)-measurable function from Ω to X_1 × X_2).
17 Prove that ℬ¹ × ℬ¹ = ℬ².
18 Prove or disprove that for every sequence A_1, A_2, ..., A_n, ... of independent events

    P(∩_{i=1}^∞ A_i) = ∏_{i=1}^∞ P(A_i).
19 Let f(t, ω) be a (ℬ[0, 1] × ℱ)-measurable function on [0, 1] × Ω, taking values 0, 1. Is the following true for all such functions:

    E ∫_{[0,1]} f(t, ∙) #(dt) = ∫_{[0,1]} E f(t, ∙) #(dt),

i.e., E ∑_{t∈[0,1]} f(t, ∙) = ∑_{t∈[0,1]} E f(t, ∙)? (The counting measure # on [0, 1] is not σ-finite, so the positive answer does not follow from Fubini’s Theorem – but still it could be true.)
20 Prove that if real-valued random variables ξ_1, ξ_2 have an (absolutely) continuous joint distribution with density p_{ξ_1,ξ_2}(x_1, x_2), then each of them separately has a continuous one-dimensional distribution, with densities

    p_{ξ_1}(x_1) = ∫_{−∞}^∞ p_{ξ_1,ξ_2}(x_1, x_2) dx_2,    p_{ξ_2}(x_2) = ∫_{−∞}^∞ p_{ξ_1,ξ_2}(x_1, x_2) dx_1.
21 Let ξ_1, ξ_2, ..., ξ_n, ... be an infinite sequence of real-valued random variables. Show that the event

    {lim sup_{n→∞} ξ_n ≤ x}

belongs to the tail σ-algebra ℱ_{≥∞} for every real x (lim sup denotes the upper limit, also written as lim with a bar over it; and we know that an upper limit, finite or infinite, always exists).
22 Using the fact that a finite lim_{n→∞} x_n exists if and only if lim_{n,m→∞} (x_n − x_m) = 0, prove that the event

    {there exists a finite limit lim_{n→∞} ξ_n} = {ω : there exists a finite limit lim_{n→∞} ξ_n(ω)}

belongs to the tail σ-algebra.
23 Prove or disprove: the event

    {lim sup_{n→∞} ξ_n = −∞} ∈ ℱ_{≥∞}.
24 Prove that the random variable lim sup_{n→∞} ξ_n (taking values in the extended real line [−∞, ∞]) is measurable with respect to the tail σ-algebra ℱ_{≥∞}.
25 Prove or disprove: the event

    {lim_{n→∞} (ξ_1 + ... + ξ_n)/n = 0}

belongs to the tail σ-algebra.
26 Prove: the event

    {the series ∑_{i=1}^∞ ξ_i converges}

belongs to ℱ_{≥∞}.
27 Prove or disprove, for nonnegative ξ_i: the random variable ∑_{i=1}^∞ ξ_i is measurable with respect to the tail σ-algebra.
The deadline for Problems 16–27 is October 7.
28 Suppose ξ_t, 0 ≤ t < ∞, are independent random variables with the same probability density p(x). Prove that for no t_0 ∈ [0, ∞) does ξ_t converge in probability to ξ_{t_0} as t → t_0.
29 Let ξ and η be two random variables taking nonnegative integer values. Prove or disprove: If their generating functions P_ξ(s), P_η(s) coincide for s ∈ [−1, 1], then their distributions μ_ξ and μ_η coincide.
30 Give an example of a random variable whose moment generating function is equal to ∞ at all points except 0.
31 Let a random variable ξ have the normal distribution with parameters (0, b). Prove that all odd-numbered moments m_k = E ξ^k about zero are equal to 0, and for k = 2m we have:

    m_{2m} = E ξ^{2m} = (2m − 1)!! · b^m,

where (2m − 1)!! denotes the product of all odd numbers from 1 to 2m − 1.
The deadline for Problems 28–31 is October 17.
32 Prove that the normal distribution with parameters (a, B), where B is a nonsingular matrix, has the density

    p(x) = const · exp{−(1/2) ∑_{k,l} q_{kl}(x_k − a_k)(x_l − a_l)},

where the matrix Q = (q_{kl}) = B^{−1}.
What is the constant in this formula equal to?
33 Let the two-dimensional random vector ξ = (ξ_1, ξ_2) have the normal distribution with zero expectation (E ξ = 0) and the covariance matrix

    B_1 = ⎛ 1  1 ⎞
          ⎝ 1  1 ⎠.

Prove that the distribution of this random vector is concentrated on some line in the plane (i.e., that almost surely c_1 ξ_1 + c_2 ξ_2 = const for some (c_1, c_2) ≠ (0, 0)).
Deduce from this that the normal distribution with parameters (0, B_1) has no density with respect to the two-dimensional Lebesgue measure λ².
34 Is the result of the previous problem true if we replace the matrix B_1 with

    B_2 = ⎛ 1  2 ⎞
          ⎝ 2  4 ⎠ ?
35* (* means a non-obligatory problem). Prove or disprove: if the characteristic function f_ξ(t) is differentiable at 0, then E|ξ| < ∞.
HINT: A random variable having the standard Cauchy distribution with density p(x) = π^{−1}/(1 + x²) has no expectation; its characteristic function f(t) = e^{−|t|} is not differentiable at 0, but this is, so to speak, touch and go: a little more, and it would be: at least the one-sided derivatives are not infinite. Couldn’t we try to consider a density p̃(x) that goes to 0 at ±∞ a little slower than p(x), but for which the expectation still does not exist? It could be that for this density the characteristic function has zero derivative at t = 0.
Better take your density (or densities) symmetric with respect to 0 (even functions): the corresponding characteristic functions will be real-valued.
36 Let a four-dimensional random vector (ξ_1, ..., ξ_4) have the normal distribution with parameters (0, B = (b_{jk})). Find the fourth mixed moment m_{1111} = E(ξ_1 ξ_2 ξ_3 ξ_4).
The deadline for Problems 32–36 is October 21.
37 Let μ_1, μ_2, ..., μ_n, ..., ν be distributions concentrated on the nonnegative integers: for all natural n

    ∑_{k=0}^∞ μ_n{k} = ∑_{k=0}^∞ ν{k} = 1.

Prove (or disprove) that if

    lim_{n→∞} μ_n{k} = ν{k}

for every k = 0, 1, 2, 3, ..., then μ_n →_w ν (n → ∞): convergence of the values of the probability mass function implies weak convergence of distributions.
38 Let μ_n be the normal distribution with parameters (a_n, b_n), b_n > 0; suppose a_n → a, b_n → 0 as n → ∞.
Prove that μ_n →_w δ_a (n → ∞), where δ_a is the distribution concentrated at the point a: δ_a(C) = 1 if C ∋ a, and = 0 if C ∌ a.
39 Prove or disprove that if ξ_n →_P η, then μ_{ξ_n} →_w μ_η.
40 Prove or disprove: if μ_{ξ_n} →_w μ_η, then ξ_n →_P η.
41 For every n, let the random variables ξ_n, η_n be independent; and let ξ_n →_{a.s.} ξ, η_n →_{a.s.} η. Prove that then ξ and η are independent.
The deadline for Problems 37–41 is October 28.
42 Check that

    f(t) = { 1 − |t|,  |t| ≤ 1,
           { 0,        |t| > 1,

is a characteristic function of a continuous distribution on the real line. Find its density.
43 For a random variable ξ having the density obtained in the previous problem, prove that E|ξ| = ∞.
44 Let ξ_1, ξ_2, ..., ξ_n, ... be a sequence of independent random variables having the distribution with the density found in Problem 42 (such a sequence exists by Theorem 13.5). Let ζ_n = (ξ_1 + ... + ξ_n)/n. Does the sequence ζ_n converge in probability to a constant as n → ∞? If yes, what is this constant equal to?
45 For the sequence of random variables of the previous problem, does the weak limit of the distribution of ζ_n,

    (w) lim_{n→∞} μ_{ζ_n},

exist? If yes, is the limiting distribution discrete or continuous, and what is its probability mass function or its density?
We call a Markov chain time-homogeneous (or just homogeneous) if its transition matrix is the same at all steps: P_1 = P_2 = ... = P_k = ... = P. For homogeneous Markov chains the transition matrix P^{nm} = P^{m−n} (the (m − n)-th power of the one-step transition matrix P) depends only on the difference m − n = k, and its entries p^{nm}_{xy} also depend on this difference only: p^{nm}_{xy} = p^{(k)}_{xy}, where p^{(k)}_{xy} are the entries of the power P^k (the p^{(k)}_{xy} are not the powers of p_{xy}, the (x, y)-th entry of the matrix P; this is why we use (k) with parentheses in the superscript).

A substantial part of the theory of Markov chains deals with the behavior of the k-step transition probabilities p^{(k)}_{xy} = P{ξ_{n+k} = y ∣ ξ_n = x} as the number of steps k → ∞. It turns out that sometimes (under some conditions) these conditional probabilities have a limit as k → ∞ that does not depend on the number x of the matrix row. This can be interpreted as follows: under these conditions, the events {ξ_{n+k} = y} and {ξ_n = x} (or: the random variables ξ_{n+k} and ξ_n) become independent in the limit as k → ∞. Or: under these conditions the events or random variables separated by a growing number of steps (by a growing time interval) become independent in the limit.

I want you to solve several problems about this phenomenon (in more intuitive words, the disappearance of dependence with the lapse of time; in more prosaic and more precise words, the existence, for each pair x, y ∈ X, of a limit lim_{k→∞} p^{(k)}_{xy} not depending on x – or the existence of the limit lim_{k→∞} P^k, a matrix all of whose rows are identical to one another).
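The relation p^{(k)}_{xy} = (P^k)_{xy} is easy to experiment with numerically; a small sketch (assuming NumPy is available; the matrix below is an arbitrary illustration, not one from the problems):

```python
import numpy as np

# An arbitrary 2-state one-step transition matrix (rows sum to 1).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# k-step transition probabilities are entries of the k-th matrix power,
# NOT elementwise powers of the entries of P.
k = 3
Pk = np.linalg.matrix_power(P, k)
elementwise = P ** k
print(Pk[0, 1], elementwise[0, 1])  # these differ
```

Each row of Pk is again a probability distribution, which is a quick consistency check on any such computation.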
46 Let

    P = ⎛  0    1  ⎞
        ⎝ 1/2  1/2 ⎠.

Find the limit lim_{k→∞} P^k.
HINT: Represent the matrix P in the form P = A · D · A^{−1}, where A is a non-singular matrix, and D a diagonal one.
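The hint can be carried out numerically as a sanity check (again assuming NumPy; the eigendecomposition plays the role of A · D · A^{−1}, and high powers of P approximate the limit):

```python
import numpy as np

P = np.array([[0.0, 1.0],
              [0.5, 0.5]])

# Eigendecomposition P = A D A^{-1}; then P^k = A D^k A^{-1}.
eigvals, A = np.linalg.eig(P)
D = np.diag(eigvals)
assert np.allclose(A @ D @ np.linalg.inv(A), P)

# As k grows, eigenvalues of modulus < 1 die out in D^k.
Pk = np.linalg.matrix_power(P, 50)
```

Comparing the rows of Pk against each other shows whether the limiting matrix has identical rows; the exact limit should still be derived by hand from A, D and A^{−1}.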
47 Let

    P = ⎛  0    0    1    0  ⎞
        ⎜  0    0    0    1  ⎟
        ⎜ 1/2   0   1/2   0  ⎟
        ⎝  0   1/2   0   1/2 ⎠.

Does the limit lim_{k→∞} P^k exist? Are the rows of this limiting matrix identical?
48 The same question for the matrix

    P = ⎛  0   1/3   0   2/3 ⎞
        ⎜ 2/3   0   1/3   0  ⎟
        ⎜  0   2/3   0   1/3 ⎟
        ⎝ 1/3   0   2/3   0  ⎠

(which is the same as that in the example with formula (25.5), only with 5 replaced with 4 – which makes all the difference).
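Whether lim P^k exists can be explored numerically before proving anything: if the powers of P oscillate, high even and odd powers stay different instead of approaching a common limit. A sketch for the matrix of Problem 48 (assuming NumPy; this only suggests the answer, it proves nothing):

```python
import numpy as np

P = np.array([[0,   1/3, 0,   2/3],
              [2/3, 0,   1/3, 0  ],
              [0,   2/3, 0,   1/3],
              [1/3, 0,   2/3, 0  ]])

P_even = np.linalg.matrix_power(P, 100)
P_odd  = np.linalg.matrix_power(P, 101)
# If these two matrices agree, a limit is plausible; if they stay
# apart, the powers oscillate and lim P^k cannot exist.
print(np.abs(P_even - P_odd).max())
```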
49* The same question for the matrix P given by formula (25.5).
The deadline for Problems 42–48 is November 4.
50* For a Markov chain with the transition matrix of Problem 46, prove that α*(ℱ_{≤n}, ℱ_{≥m}) → 0 as m → ∞.
At what rate does it go to 0 (that is, e.g.: as some power of m – or of m − n; or exponentially fast; or faster than any exponential function)?
51 Let ξ_0, ξ_1, ξ_2, ..., ξ_n, ... be independent random variables having the normal distribution with parameters (0, 1) (the standard normal distribution). Let η_1 = (ξ_1 − ξ_0)², η_2 = (ξ_2 − ξ_1)², ..., η_n = (ξ_n − ξ_{n−1})², ... . Prove that there exists

    (P) lim_{n→∞} (η_1 + η_2 + ... + η_n)/n.

What is this limit?
We cannot use the laws of large numbers that we had for independent random variables: the η_i are definitely dependent. And we cannot use Theorem 26.4 – even if we check that α*(ℱ_{η_i, i≤n}, ℱ_{η_i, i≥m}) ≤ β(m − n), β(k) → 0 (k → ∞), because the random variables η_i are unbounded (just as normal random variables are).
52* For the random variables of the previous problem prove that the strong law of large numbers takes place: the sequence (η_1 + η_2 + ... + η_n)/n converges almost surely.
HINT: Try to copy the proof of Theorems 16.3, 16.4: in the sum

    ∑_{i,j,k,l=1}^n E[(η_i − E η_i)(η_j − E η_j)(η_k − E η_k)(η_l − E η_l)]

not all summands with i, j, k, l different are equal to 0, as for independent random variables, but the number of nonzero summands can be counted and estimated.
53 Let ξ be a continuous random variable with probability density

    p_ξ(x) = { (4 − x)/8,  x ∈ [0, 4],
             { 0,          x ∉ [0, 4];

and let η = (ξ − 1)². Find the conditional expectation E{ξ ∣ η = y}.
54 Let random variables ξ, η be independent and have normal distributions with parameters, respectively, (0, 1) and (0, 2); and let ζ = ξ + η. Find the conditional distribution of ξ under the condition ζ = z. Find the conditional expectation E(ξ ∥ ζ) and the conditional variance E((ξ − E(ξ ∥ ζ))² ∥ ζ).
55 Let ξ_1, ξ_2, ..., ξ_n, ... be independent random variables, each having the same normal distribution with parameters (0, 1).
Find (a version of) E(ξ_1² ∥ ℱ_{≥∞}).
56 Let ξ and η be independent random variables with densities p_ξ(x), p_η(y); ζ = ξ + η. Find the conditional density p_{ζ∣ξ=x}(z).
The deadline for Problems 51–56 is Nov. 11.
57 Prove that if the random variable ξ is 𝒜-measurable, E|η| < ∞, and E(|ξ| · E(|η| ∥ 𝒜)) < ∞, then E|ξη| < ∞.
58 Let ξ_1, ξ_2, ..., ξ_n, ... be independent random variables with E ξ_i = 0, E ξ_i² = σ_i²; η_n = ξ_1 + ... + ξ_n.
Find E(η_{n+1}² ∥ ξ_1, ..., ξ_n).
59 Let X = ℤ_+ = {0, 1, 2, 3, ...};

    p(t, x, s, y) = [λ(s − t)]^{y−x} e^{−λ(s−t)} / (y − x)!

if x, y ∈ ℤ_+, y ≥ x, and 0 for y < x, where λ ≥ 0 (you see that this is, in fact, up to a shift, the Poisson distribution with parameter λ·(s − t)). Check that the conditions ★1, ★3, ★4 are satisfied.
The Markov process with transition probabilities (30.5) and with right-continuous trajectories ξ_∙(ω) is called the Poisson process with parameter (the rate) λ.
60 Prove: If ξ_t, t ≥ 0, is a Markov process on X = ℤ_+ with transition probabilities (30.5) (supposing that such a process exists), then for s ≥ t the random variable ξ_s − ξ_t has the Poisson distribution with parameter λ·(s − t).
61 For the same process prove that for 0 ≤ t_0 ≤ t_1 ≤ ... ≤ t_n the random variables ξ_{t_1} − ξ_{t_0}, ξ_{t_2} − ξ_{t_1}, ..., ξ_{t_n} − ξ_{t_{n−1}} are independent.
62 Let T = [0, ∞), X = ℝ¹ (and, of course, 𝒳 = ℬ¹). Check that the density

    p(t, x, s, y) = (1/√(2π(s − t))) · e^{−(y−x)²/2(s−t)}

(as a function of its density-argument y this is the normal density with parameters (x, s − t)) satisfies the Chapman–Kolmogorov equation (30.7).
63 For a Markov process ξ_t, t ∈ [0, ∞), with the transition function given by (30.6) with the density (30.8), and P(t, x, t, C) = δ_x(C) (assuming such a process exists), prove that the difference ξ_s − ξ_t for s ≥ t has the normal distribution with parameters (0, s − t), and that for 0 ≤ t_0 ≤ t_1 ≤ ... ≤ t_n the differences ξ_{t_1} − ξ_{t_0}, ξ_{t_2} − ξ_{t_1}, ..., ξ_{t_n} − ξ_{t_{n−1}} are independent.
64 Let the initial distribution ν at time t_0 = 0 be concentrated at the point x_0: ν(C) = δ_{x_0}(C), and let the finite-dimensional distributions be given by formula (31.1) with P(t, x, s, C) of the previous problem. Check that the joint distribution of the ξ_{t_i}’s (if the stochastic process ξ_t, t ≥ 0, exists) is a normal one. With what parameters?
The deadline for Problems 57–64 is Nov. 18.
65 Find λ_x for the Poisson process (see Problem 59).
66 Find the matrix A = (a_{xy}) for the Poisson process. Find the distribution of the position of the process after the first jump if it starts from the point x = 5 at time 0.
67 Let λ_x > 0. Prove that the time τ_1 of the first jump and the position ξ_{τ_1} of the process at this time are independent with respect to the probability measure P_x.
HINT: Prove that

    P_x{τ_1 ∈ C, ξ_{τ_1} = y} = ∫_C λ_x e^{−λ_x t} dt · (a_{xy}/λ_x)

for every Borel set C ⊆ [0, ∞) and y ≠ x. To do this, start with C = [a, b), 0 ≤ a ≤ b < ∞: such sets form a semi-algebra, and if two finite measures P_x{τ_1 ∈ C, ξ_{τ_1} = y} and ∫_C e^{−λ_x t} dt · a_{xy} coincide on a semi-algebra, they coincide on the σ-algebra ℬ_{[0, ∞)} generated by it. Then use the equality P_x{τ_1 ∈ [a, b), ξ_{τ_1} = y} = lim_{h→0+} P_x{τ_1^h ∈ [a, b), ξ_{τ_1^h} = y}, where the probability depending on h is expressed as a sum of some products: a sum of a finite geometric progression.
68 Let a function g(t), t ≥ 0, be continuous, and let it have, for every t ∈ [0, ∞), a right-hand derivative h(t) = (d⁺/dt) g(t), which is also continuous on [0, ∞). Prove that then the two-sided derivative (d/dt) g(t) exists for all t > 0.
HINT: Take g̃(t) = g(t) − ∫_0^t h(s) ds. The function g̃(t) is continuous and has a zero right-hand derivative at every point; we have to prove that it is a constant. Now take any proof of the corresponding fact for the two-sided derivative, and try to adapt it to the one-sided one.
69 Check that both kinds of Kolmogorov equations are satisfied for the transition probabilities of the Poisson process (and this is a way these probabilities can be found starting from the matrix A).
70 Let X = {0, 1},

    A = ⎛ −1   1 ⎞
        ⎝  2  −2 ⎠.

Find the solution P^t = (p(t, x, y))_{x,y∈X} of the matrix equation (d/dt) P^t = A P^t or (d/dt) P^t = P^t A with the initial condition P^0 = I. Find the limits lim_{t→∞} p(t, x, y).
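The solution of such a matrix equation with P^0 = I is the matrix exponential P^t = e^{tA}, which can be checked numerically (a sketch assuming NumPy; here e^{tA} is computed through the eigendecomposition of A, which works because this particular A is diagonalizable; the closed form and the limits should still be found by hand):

```python
import numpy as np

A = np.array([[-1.0, 1.0],
              [2.0, -2.0]])

# e^{tA} via eigendecomposition A = V diag(w) V^{-1}.
w, V = np.linalg.eig(A)
Vinv = np.linalg.inv(V)

def P(t):
    # P^t = exp(tA) solves dP^t/dt = A P^t with P^0 = I.
    return (V * np.exp(t * w)) @ Vinv

# Rows of P(t) are probability distributions for every t >= 0.
print(P(1.0))
```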
71 Let X = {0, 1, 2},

    A = ⎛ −2   2    0 ⎞
        ⎜  1  −3    2 ⎟
        ⎝  0   1   −1 ⎠.

Do the limits lim_{t→∞} p(t, x, y) exist? If they do, find them by solving the system p·A = 0, p·1 = 1 (p and 0 are row vectors, and 1 is the column vector with all components equal to 1).
72 Let X = {0, 1, 2, ..., n, ...}, a_{xx} = −λ, a_{x,x+1} = λ, where λ is a positive constant, and all other a_{xy} = 0. Find all solutions of the equation pA = 0. Does the discrete distribution on X with the probability mass function p(t, x, ∙) converge as t → ∞ to some probability distribution on X?
73 Let U, V be independent random variables with exponential distributions with parameters λ and μ. Find the distribution of the random variable T(ω) = min(U(ω), V(ω)).
74 Let X = {0, 1, 2, ..., n, ...}, a_{00} = −λ, a_{01} = λ, a_{n,n−1} = μ, a_{nn} = −λ − μ, a_{n,n+1} = λ for n > 0, all other a_{xy} = 0, where λ and μ are positive constants (negative numbers on the main diagonal of the infinite matrix A, positive numbers on the diagonal one row above the main one and on the one below it, and the rest zeros; the Markov process describes a system with one server and an unrestricted queue).
Find all solutions of the equation p·A = 0. For λ = μ = 2, does the limiting distribution for p(t, x, ∙) as t → ∞ exist; and if it does, what is this limiting distribution?
The deadline for Problems 65–74 is Nov. 30.
75 Do the tail σ-algebras associated with the Markov chains of Problems 47, 48 consist only of events of probability 0 and 1?
76 Assuming that η_t = ξ_t/√t, 0 < t < ∞, where ξ_t is the Markov process of Problem 62, is a Markov process, find its transition density p_η(t, x, s, y) (it seems obvious that it ought to have one).
HINT: p_η(t, x, s, y) should be the conditional density of the random variable η_s = ξ_s/√s under the condition η_t = x, that is, ξ_t = x·√t. We know the conditional distribution of ξ_s under the condition ξ_t = x·√t: it’s a normal distribution with such and such parameters; we should be able to find the conditional distribution of η_s, being a linear function of ξ_s, too.
77 Check that p_η(t, x, s, y) satisfies the Chapman–Kolmogorov equation (all other necessary properties are quite obvious).
78 Check that η_t, 0 < t < ∞, is a Markov process with the transition density found in Problem 76.
79 Let p(x) be a probability density on the real line. Does there exist a family of random variables ξ_t, t ≥ 0, that are independent and have the distribution with the density p(x)?
80 Let the joint distribution of random variables ξ, η, ζ be normal with parameters (0, B), where

    B = ⎛  2  −1  −1 ⎞
        ⎜ −1   2  −1 ⎟
        ⎝ −1  −1   2 ⎠.

Is this distribution continuous? That is: does this distribution have a density p_{ξηζ} (it is not important whether this density is a continuous function or not)?
81 For the random variables of Problem 80, find the conditional distribution of the random variable ξ under the condition that η is fixed (with respect to η). That is, answer the following questions:
Is the conditional distribution a discrete one, i.e.,

    P{ξ ∈ C ∥ η} = ∑_{i: x_i∈C} p_i(η) ?

If yes, find p_i(η) = P{ξ = x_i ∥ η}.
Is the conditional distribution a continuous one, i.e.,

    P{ξ ∈ C ∥ η} = ∫_C p_{ξ∥η}(x) dx ?

If yes, find the conditional density p_{ξ∥η}(x) (depending on η = η(ω)).
Is the conditional distribution neither discrete nor continuous? If so, find the conditional distribution function F_{ξ∥η}(t) = P{ξ ≤ t ∥ η}.
Note that, in all these cases, we can try to find these things (conditional probabilities, or conditional densities) not under the condition that η is known (fixed) (which is denoted with ∥η), but rather under the condition that η = y, for an arbitrary real y.
82 Let (ξ, η) have the uniform distribution on the square [0, 1]². Does the random variable ζ = ξ² + η² have a continuous distribution?
83 Let the random variables ξ and η be independent, ξ having a discrete distribution with P{ξ = 0} = P{ξ = 2} = P{ξ = 4} = 1/3, and η having the uniform distribution on the interval [0, 3].
Find the distribution of the random variable ζ = ξ + η.
Is this distribution discrete? continuous? neither discrete nor continuous? Depending on which of these cases takes place, find the probability mass function of ζ, or its density, or its distribution function F_ζ(t).
The deadline for Problems 75–83 is Dec. 9.
84 Let Ξ_a for every a > 0 be a Poissonian random variable with parameter a. Prove that Ξ_a converges in probability to +∞ as a → ∞ (notation: Ξ_a →_P ∞); that is, that for every C we have lim_{a→∞} P{Ξ_a > C} = 1.
85 Let ξ_1, ξ_2, ..., ξ_n, ... be independent random variables, ξ_i having the Poisson distribution with parameter 1/(i(i + 1)).
Does the event {ξ_1 + ξ_2 + ... + ξ_n + ... < ∞} have probability equal to 1? equal to 0? strictly between 0 and 1?
In the first case, what is the distribution of the random variable η = ξ_1 + ξ_2 + ... + ξ_n + ...? To be concrete: find the probability P{η = 1} with accuracy up to 0.01.
In the second and in the third case: does the sum ξ_1 + ξ_2 + ... + ξ_n →_P +∞ (convergence in probability to infinity, see the previous problem)?
86 Let ξ have a uniform distribution on the interval (0, 2); η = ln ξ. Find E η. Is E|η|^k finite for all natural k?
87 Let ξ_1, ξ_2, ..., ξ_n, ... be independent random variables with uniform distribution on the interval (0, 2).
Find E ∏_{i=1}^n ξ_i.
Which of the following events have probability 1, and which 0:

    {lim_{n→∞} ∏_{i=1}^n ξ_i = 0};    {lim_{n→∞} ∏_{i=1}^n ξ_i = +∞};

    {there exists a finite positive limit lim_{n→∞} ∏_{i=1}^n ξ_i};

    {lim inf_{n→∞} ∏_{i=1}^n ξ_i = 0, lim sup_{n→∞} ∏_{i=1}^n ξ_i = e^{3/2}};

    {lim_{n→∞} ∏_{i=1}^n ξ_i does not exist, lim sup_{n→∞} ∏_{i=1}^n ξ_i = +∞}?

HINT: Take ln ∏_{i=1}^n ξ_i; use the previous problem and the Strong Law of Large Numbers (a concrete theorem of this group).
88 Prove or disprove: If a random variable ξ has a finite fifth moment E ξ^5, it also has a finite second moment E ξ².
89 Suppose the random variable ξ has a finite expectation; η_1, η_2 are two arbitrary random variables. We know that E(E(ξ ∥ η_1, η_2) ∥ η_1) = E(ξ ∥ η_1) (almost surely, of course; or: one of the versions of one of these conditional expectations is the other conditional expectation).
Is the iterated conditional expectation

    E(E(ξ ∥ η_1) ∥ η_1, η_2)

necessarily equal to E(ξ ∥ η_1) (a.s.)? Is it necessarily equal to E(ξ ∥ η_1, η_2)? Or may it be equal (a.s.) to neither of these two conditional expectations, but to some quite different random variable?
90 Let ξ and η be independent and have Poisson distributions with parameters, correspondingly, a and b. Find the conditional distribution of the random variable ξ if ξ + η is known; i.e., find

    P{ξ = k ∥ ξ + η}.
91 Prove or disprove: Let ξ_n, η be random variables with finite expectations. Then if the distribution of ξ_n converges weakly to that of η as n → ∞, then also the expectations converge: E ξ_n → E η.
The deadline for Problems 84–91 is Dec. 16, noon.
92* Let W_t, t ≥ 0, be a Wiener process starting from a non-random point x_0 ≥ 0 at time 0: W_0 ≡ x_0. Let Ŵ_t = |W_t|.
For t > 0, find the probability density of the random variable Ŵ_t.
93* Find the finite-dimensional densities of the stochastic process Ŵ_t.
94* Is Ŵ_t a Markov process?