Probability Models
The list of questions below is provided to help you prepare for the test and exam. It reflects only the theoretical part of the course.
You should expect the questions in the test/exam to range over the entire set of notes
and exercises for the course. If a proof or any other part of the material we have discussed is
not examinable then this is clearly stated in the notes.
Questions 1 - 9 are included here mainly to help you revise the Intro to Probability material which is used in this course. Answers to some of these questions should be looked up in your Intro to Probability notes (IPN for short).
Introduction
1. What is a Probability Space?
2. What is the definition of a random variable?
3. What is the definition of a discrete random variable? List the main properties of the probabilities P{X = x_i} ≡ p_i. Examples of discrete probability mass functions: Bernoulli, binomial, Poisson, geometric, negative binomial. (See your IPN.)
4. What is the definition of the expectation of a discrete random variable?
5. How and under what conditions can we compute E(g(X)), where g : R → R is a function?
6. What is the definition of a conditional probability P(A|B)?
7. Define independence of two events; of any finite number of events.
8. State and prove the Theorem of Total Probability (the total probability formula).
Be able to use it as in examples.
The Voting Problem - This section in the Notes is for those who want to know more
about applications of the Theorem of Total Probability. It is not examinable.
9. Explain what the voting problem is. Prove the main result concerned with this problem and be able to use it.
Random Walks
10. Give the definition of a random walk on a line.
11. What is the gambler’s ruin problem? Give the re-formulation of it in terms of the behaviour
of a random walk on a finite interval. You are supposed to be able to solve problems stated in
terms of the gambler’s ruin by reducing them to problems about random walks. Examples
include finding the probability that one of the players will win the game (and the other
will be ruined) or that, when playing in a casino, the gambler wins an infinite amount of
money.
12. Let X_t be a simple random walk on [M, N]. Let X_0 = n and r_n be the probability that the random walk starting from n, M ≤ n ≤ N, will reach N before reaching M. Prove that

r_n = p r_{n+1} + q r_{n−1} if M < n < N,
r_M = 0, r_N = 1.
13. Know the statements of results about the Second Order Difference Equations. Proofs are
not examinable.
14. Solve the equations stated in question 12 and prove that

r_n = (λ^n − λ^M)/(λ^N − λ^M) if p ≠ q,
r_n = (n − M)/(N − M) if p = q = 0.5,     (1)

where λ = q/p. You are supposed to remember this formula.
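A quick way to sanity-check formula (1) is a short simulation. The sketch below is a minimal Python example; the parameter values p = 0.6, M = 0, N = 10, n = 3 are illustrative choices, not from the notes.

```python
import random

def hit_N_before_M(n, M, N, p, trials=100_000):
    """Monte Carlo estimate of r_n: the probability that a simple
    random walk started at n hits N before hitting M."""
    wins = 0
    for _ in range(trials):
        x = n
        while M < x < N:
            x += 1 if random.random() < p else -1  # up with prob p
        wins += (x == N)
    return wins / trials

p, M, N, n = 0.6, 0, 10, 3            # illustrative values
lam = (1 - p) / p                      # lambda = q/p
exact = (lam**n - lam**M) / (lam**N - lam**M)
print(hit_N_before_M(n, M, N, p), "vs exact", exact)
```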
15. Suppose the random walk starts from n and M ≤ n. What is the probability that it reaches +∞ before visiting M? Know the derivation of the corresponding formula.
16. Suppose that X_0 = 0. Prove that the probability for a random walk to return to 0 is 2p if p < q and 2q if q ≤ p.
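One standard route to this result (a sketch only; it uses the standard fact that a walk started at 1 ever hits 0 with probability min(1, q/p), and the mirror statement for a walk started at −1) is to condition on the first step:

```latex
P(\text{return to } 0)
 = p\,P(\text{hit } 0 \mid X_1 = 1) + q\,P(\text{hit } 0 \mid X_1 = -1)
 = p \min\!\left(1, \tfrac{q}{p}\right) + q \min\!\left(1, \tfrac{p}{q}\right),
```

which equals p + q(p/q) = 2p when p < q, and p(q/p) + q = 2q when q ≤ p.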
17. Be able to state and prove the Theorem of Total Probability for Expectations.
18. Suppose that a random walk starts from n, M ≤ n ≤ N. The walk stops once it reaches M or N. Let E_n be the expected duration of the walk. Prove that

E_M = E_N = 0,     (2)
E_n = pE_{n+1} + qE_{n−1} + 1 for M < n < N.     (3)
19. Know how to solve equations (2)-(3) and remember the result in the case p = q = 1/2.
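For reference when revising question 19: with the boundary conditions of question 18 on [M, N], the symmetric-case solution is the product of the distances to the two endpoints,

```latex
E_n = (n - M)(N - n), \qquad M \le n \le N \quad (p = q = 1/2),
```

which reduces to E_n = n(N − n) when M = 0.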
20. Prove the following statement: Suppose that p = q = 1/2 and a random walk starts from
position 1. Then the expected time until it reaches zero is infinite!
Conditional Expectations as Random Variables
21. Define E(X|Y ), Var (X | Y ).
22. Prove the Tower law for expectations.
23. Prove that Var (X) = E(Var (X | Y )) + Var (E(X | Y )).
24. Define what a random sum of random variables is. Prove the following theorem.

Theorem 1. Suppose X_1, X_2, X_3, ... are independent identically distributed random variables with mean µ and variance σ², and that N is an independent non-negative integer-valued random variable. Let Y = Σ_{i=1}^{N} X_i. Then

E(Y) = E(N)µ,
Var(Y) = σ²E(N) + µ²Var(N).
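Theorem 1 is easy to test numerically. A minimal Python sketch; the choices N ~ Uniform{0, ..., 10} and X_i ~ Exp(1) are illustrative assumptions, not from the notes.

```python
import random
import statistics

def sample_Y():
    # N uniform on {0, ..., 10}: E(N) = 5, Var(N) = 10
    N = random.randint(0, 10)
    # X_i ~ Exp(1): mu = 1, sigma^2 = 1
    return sum(random.expovariate(1.0) for _ in range(N))

ys = [sample_Y() for _ in range(200_000)]
mu, sigma2, EN, VarN = 1.0, 1.0, 5.0, 10.0
print("E(Y):  ", statistics.fmean(ys), "vs", mu * EN)                        # = 5
print("Var(Y):", statistics.variance(ys), "vs", sigma2 * EN + mu**2 * VarN)  # = 15
```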
25. Generating Functions. Know all definitions and theorems about probability generating functions.
Branching Processes
26. What is the definition of a branching process?
27. Prove the following theorem.

Theorem. Suppose that X is a random variable with mean µ and Y_0, Y_1, Y_2, ... is the branching process generated by X. Then E(Y_n) = µ^n, n ≥ 1.
28. Prove the following statement: Suppose Y_0, Y_1, Y_2, ... is a branching process generated by a random variable X with mean µ < 1. Then lim_{n→∞} P(Y_n = 0) = 1.
29. Let G_n(t) = E(t^{Y_n}) be the probability generating function of Y_n. Prove the following theorem.

Theorem. Suppose Y_0, Y_1, Y_2, ... is a branching process generated by a random variable X with probability generating function G. Then

G_n(t) = G_{n−1}(G(t)).     (4)

30. Prove that equation (4) implies

G_{n+1}(t) = G(G_n(t)).     (5)
31. Denote by θ_n = P(Y_n = 0) the probability of extinction of the branching process by time n. Prove that θ_n = G(θ_{n−1}) with θ_0 = 0.
32. How do you find the probability of ultimate extinction of a branching process? State and
prove the related theorem.
Example. Suppose that P(X = 0) = 0.3, P(X = 2) = 0.7. Find θ_1, θ_2, θ_3. Find the probability of ultimate extinction in this case.
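Worked arithmetic for this example, for self-checking: here G(t) = E(t^X) = 0.3 + 0.7t², so

```latex
\theta_1 = G(0) = 0.3, \qquad
\theta_2 = G(0.3) = 0.3 + 0.7(0.3)^2 = 0.363, \qquad
\theta_3 = G(0.363) \approx 0.3922.
```

The probability of ultimate extinction is the smallest root in [0, 1] of θ = G(θ), i.e. of 0.7θ² − θ + 0.3 = 0; the roots are 3/7 and 1, and since µ = E(X) = 1.4 > 1 the answer is θ = 3/7 ≈ 0.4286.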
33. State, in terms of the mean value of the generating random variable, the necessary and
sufficient condition for the probability of ultimate extinction of a branching process to be
equal to 1.
Continuous Random Variables
34. What is the definition of a continuous random variable?
35. What is the definition of the probability density function?
36. State the main properties of probability density functions.
37. Know the following examples of probability density functions: uniform, exponential, gamma, normal.
38. What is the definition of the cumulative distribution function (c.d.f.) of a random variable X?
How do you find the c.d.f. of X if its p.d.f. f_X(x) is given? And how do you find the p.d.f. of X if the c.d.f. F_X is given?
How do you find E(g(X)) in terms of f_X(x)?
39. Let X ∼ N(µ, σ²) be a normal random variable. Prove that E(X) = µ and Var(X) = σ².
40. Suppose that X and Y are random variables. Define what it means to say that X and Y are jointly continuous.
41. Define the joint probability density function of two random variables.
42. What are the main properties of the joint probability density function of two random
variables?
43. Define the joint distribution function F_{X,Y} of two random variables X, Y.
Express F_{X,Y}(x, y) in terms of the joint p.d.f. f_{X,Y} of X, Y.

44. Prove that F_X(x) = F_{X,Y}(x, ∞) and F_Y(y) = F_{X,Y}(∞, y).

45. Prove that f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x∂y.

46. Prove that the marginal densities can be found as follows:
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy and f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx.
47. Give the definition of independence of two random variables.
48. What is the necessary and sufficient condition for two continuous r.v.s to be independent, expressed in terms of the probability density functions f_{X,Y}(x, y), f_X(x), f_Y(y)?
49. Prove that two continuous random variables X and Y are independent if and only if there are functions g and h such that f_{X,Y}(x, y) = g(x)h(y) for all x, y.
Conditional Distributions (Continuous Case)
50. Let X and Y be jointly continuous random variables with joint density function f_{X,Y}. What is the conditional density function of X given Y = y, f_{X|Y=y}(x)? What is f_{Y|X=x}(y)?
51. What is the definition of E(X|Y = y) and of E(X|Y )?
52. Prove that E(X) = E(E(X|Y )).
53. Let g(x, y) be a function of two real variables x, y. How do you find E(g(X, Y)) in terms of f_{X,Y}(x, y)?
Express E((X − µ_1)^k (Y − µ_2)^m) in terms of f_{X,Y}(x, y).
54. Exercise. Prove that if random variables X, Y are independent then E(X^k Y^m) = E(X^k) E(Y^m).
55. What is the definition of covariance and correlation of two random variables?
56. The bivariate normal distribution.
You are not asked to remember the formula for the joint p.d.f. of X, Y . Rather, you will
be told what fX,Y (x, y) is.
But, given fX,Y (x, y), you are supposed to be able to prove all statements concerning the
bivariate normal distribution.
Exercise. Prove that two normal random variables are independent if and only if the parameter ρ of the bivariate normal distribution is zero.
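For revision purposes, the standard form of the bivariate normal density (which, per the note above, you would be given rather than asked to recall) is

```latex
f_{X,Y}(x,y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}
\exp\!\left( -\frac{1}{2(1-\rho^2)} \left[
\frac{(x-\mu_1)^2}{\sigma_1^2}
- \frac{2\rho(x-\mu_1)(y-\mu_2)}{\sigma_1\sigma_2}
+ \frac{(y-\mu_2)^2}{\sigma_2^2} \right] \right).
```

Setting ρ = 0 removes the cross term and the density factorizes as f_X(x)f_Y(y), giving independence; conversely, independence forces Cov(X, Y) = ρσ_1σ_2 to vanish, so ρ = 0.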
Poisson Processes
57. Give the definition of the Poisson process N(t) with rate λ > 0.
58. What is the joint distribution of the values of the Poisson process at times t_1, t_2, ..., t_n, where t_1 < t_2 < ... < t_n?
59. What is the definition of the arrival time T_n of a Poisson process?
Prove that f_{T_1}(x) = λe^{−λx} if x > 0, and 0 if x ≤ 0.
Prove that f_{T_2}(x) = λ²xe^{−λx} if x > 0, and 0 if x ≤ 0.
Be able to state f_{T_n}(x).

60. Prove that f_{T_1,T_2}(x, y) = λ²e^{−λy} if y ≥ x ≥ 0, and 0 otherwise.
61. Prove that the random variable T_1 | T_2 = y is uniformly distributed on [0, y].
62. What is the definition of the inter-arrival time W_n, n ≥ 1?
Know the statement of the theorem describing the joint distribution of the n inter-arrival times W_1, W_2, ..., W_n.
63. Prove that W_1, W_2 are independent random variables, each having the exponential distribution with parameter λ.
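Questions 61 - 63 can be checked with one small simulation built from the inter-arrival representation. A minimal Python sketch (the rate λ = 2 is an arbitrary illustrative choice): since T_1 | T_2 = y is uniform on [0, y], the ratio T_1/T_2 should be uniform on [0, 1].

```python
import random
import statistics

lam = 2.0  # illustrative rate

def first_two_arrivals():
    # Build the process from i.i.d. Exp(lam) inter-arrival times W1, W2
    w1 = random.expovariate(lam)
    w2 = random.expovariate(lam)
    return w1, w1 + w2  # (T1, T2)

ratios = [t1 / t2 for t1, t2 in (first_two_arrivals() for _ in range(100_000))]
# If T1 | T2 = y is Uniform[0, y], then T1/T2 is Uniform[0, 1]
print("mean of T1/T2:", statistics.fmean(ratios), "(Uniform[0,1] gives 0.5)")
print("var of T1/T2: ", statistics.variance(ratios), "(Uniform[0,1] gives 1/12)")
```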
Inequalities, Law of Large Numbers, Central Limit Theorem
64. State and prove Markov’s inequality.
65. State and prove Chebyshev’s inequality.
66. State and prove the Law of Large Numbers.
67. State and prove the Bernoulli Law of Large Numbers.
68. State the Central Limit Theorem (CLT).
Be able to apply the Central Limit Theorem. Be able to give answers in terms of an integral or the Φ function (examples in the Notes and exercise sheet 11).
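A typical CLT application expressed through Φ, as the question asks. The setting below (the sum of 100 fair-die rolls) is an illustrative assumption, not a specific example from the notes; Φ is computed via the error function.

```python
import math

def Phi(z):
    """Standard normal c.d.f., written via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# S = sum of n = 100 fair-die rolls: mu = 3.5, sigma^2 = 35/12,
# so by the CLT, S is approximately N(350, 100 * 35/12).
n, mu, var = 100, 3.5, 35.0 / 12.0
z = (370 - n * mu) / math.sqrt(n * var)
print("P(S <= 370) is approximately Phi(%.3f) = %.4f" % (z, Phi(z)))
```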