Assignment 6

February 17, 2017
Math 303 Assignment 6: Due Friday, March 3 at start of class
I. Problems to be handed in:
1. Prove that the pivot algorithm, as defined in class, is reversible, with stationary distribution
uniform on the state space SN of N -step self-avoiding walks.
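If you wish to experiment, here is a minimal Python sketch of one pivot step on the square lattice, assuming the standard formulation (pick a pivot site, pick a random lattice symmetry, apply it to the tail of the walk, and accept the proposal only if it is self-avoiding); the function names are illustrative, and the exact conventions may differ from the version defined in class.

```python
import random

# The 7 non-identity lattice symmetries of Z^2 fixing the origin
# (rotations and reflections), as maps on (x, y).
SYMMETRIES = [
    lambda x, y: (-y, x),   # rotate 90 degrees
    lambda x, y: (-x, -y),  # rotate 180 degrees
    lambda x, y: (y, -x),   # rotate 270 degrees
    lambda x, y: (-x, y),   # reflect in the y-axis
    lambda x, y: (x, -y),   # reflect in the x-axis
    lambda x, y: (y, x),    # reflect in the diagonal
    lambda x, y: (-y, -x),  # reflect in the anti-diagonal
]

def pivot_step(walk):
    """One step of the pivot chain on N-step self-avoiding walks.

    walk: list of N+1 lattice points starting at (0, 0).
    Returns the proposal if it is self-avoiding, else the old walk.
    """
    k = random.randrange(1, len(walk) - 1)   # pivot site
    g = random.choice(SYMMETRIES)
    px, py = walk[k]
    head = walk[:k + 1]
    # Apply the symmetry, centred at the pivot site, to the tail.
    tail = [(px + gx, py + gy)
            for gx, gy in (g(x - px, y - py) for x, y in walk[k + 1:])]
    proposal = head + tail
    # Accept only if the proposal is still self-avoiding.
    if len(set(proposal)) == len(proposal):
        return proposal
    return walk
```

Note that rejected proposals leave the chain in place; those "lazy" steps are exactly what the reversibility argument has to account for.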
2. A graph G consists of a set V of elements called vertices (or nodes) and a set E of pairs of
elements of V called edges (or arcs). The graph is connected if, given any pair of vertices,
there is a path consisting of edges in the graph, which starts at one of the vertices and ends
at the other one. There is some discussion of this in Ross in Section 3.6.2 and Google will
take you to web pages on this topic. Suppose that V is a finite set. A spanning tree of
G is a connected subgraph of G that contains every vertex in V and no cycles (a cycle is
a connected subgraph C of G such that every vertex in C is the endpoint of exactly two
edges).
This problem concerns the grid graph in the figure, which has 16 vertices and 24 edges. The
figure also shows an example of a spanning tree (blue) in the grid graph, from Wikipedia.
It can be shown that (i) every spanning tree of the grid graph contains 16 − 1 = 15 edges,
(ii) every subgraph of the grid graph with 15 edges and no cycles is a spanning tree, and
(iii) adding an edge to a spanning tree always creates exactly one cycle.
The following is a Markov Chain Monte Carlo approach to generating a spanning tree of
a graph uniformly at random. Let S denote the set of all spanning trees of the above grid
graph. We now define a Markov chain X0 , X1 , X2 , . . ., whose state space is S. Given the
current spanning tree Xn , we generate Xn+1 as follows:
(i) There are 24 − 15 = 9 edges that are not in Xn . Choose one of these uniformly at
random (probability 1/9 each) and call it e.
(ii) Let C denote the unique cycle of Xn ∪ {e}.
(iii) Choose an edge e′ uniformly at random from C.
(iv) Let Xn+1 be the result of removing e′ from Xn ∪ {e}.
It is possible to show that the Markov chain defined above is irreducible (proving this is
not part of the problem, but you may wish to try).
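The four steps above can be sketched in Python (an illustrative implementation, not required for the problem): tree edges are stored as frozensets of vertex pairs, and the unique cycle of Xn ∪ {e} is found by taking the tree path between the endpoints of e and adding e itself.

```python
import random

def grid_edges(n=4):
    """Edges of the n-by-n grid graph; vertices are (row, col) pairs."""
    E = []
    for r in range(n):
        for c in range(n):
            if c + 1 < n:
                E.append(frozenset({(r, c), (r, c + 1)}))
            if r + 1 < n:
                E.append(frozenset({(r, c), (r + 1, c)}))
    return E

def tree_path(tree, start, goal):
    """The unique path from start to goal in the tree, as a list of edges."""
    adj = {}
    for e in tree:
        u, v = tuple(e)
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    stack, parent = [start], {start: None}
    while stack:                       # depth-first search
        u = stack.pop()
        if u == goal:
            break
        for v in adj.get(u, []):
            if v not in parent:
                parent[v] = u
                stack.append(v)
    path, u = [], goal                 # walk back up the parent pointers
    while parent[u] is not None:
        path.append(frozenset({u, parent[u]}))
        u = parent[u]
    return path

def mcmc_step(tree, all_edges):
    """One step of the chain: add a random non-tree edge e, then delete
    a uniformly chosen edge of the unique cycle of tree ∪ {e}."""
    e = random.choice([f for f in all_edges if f not in tree])
    u, v = tuple(e)
    cycle = tree_path(tree, u, v) + [e]
    e_prime = random.choice(cycle)
    return (tree | {e}) - {e_prime}
```

Note that the deleted edge may be e itself, in which case the chain stays put; keeping track of that possibility is useful when you write out Pi,j for part (b).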
Now comes the problem:
(a) Show that the Markov chain has a stationary distribution.
(b) Show that its transition matrix is symmetric, i.e., Pi,j = Pj,i for all i, j ∈ S.
(c) Conclude that the stationary distribution is uniform.
3. Let X, Y be independent exponential random variables with respective rates λ, µ. Determine the conditional distribution of X given that X < Y .
Hint: start with P (X > x | X < Y ).
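If you want to sanity-check the formula you derive, the quantity in the hint is easy to estimate by simulation; the sketch below (function name and defaults are illustrative) estimates P(X > x | X < Y) directly by discarding samples with X ≥ Y.

```python
import random

def estimate_tail(x, lam, mu, n=200_000, seed=0):
    """Monte Carlo estimate of P(X > x | X < Y) for independent
    X ~ Exp(lam), Y ~ Exp(mu)."""
    rng = random.Random(seed)
    hits = total = 0
    for _ in range(n):
        X = rng.expovariate(lam)
        Y = rng.expovariate(mu)
        if X < Y:                 # condition on the event {X < Y}
            total += 1
            if X > x:
                hits += 1
    return hits / total
```

Comparing the estimate against your formula for a few values of x, λ, and µ is a quick consistency check on the derivation.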
4. Smith is waiting for his two friends Lee and Yang to visit his house. The time until Lee
arrives is Exp(λ1 ) and the time until Yang arrives is Exp(λ2 ). After arrival, Lee stays an
amount of time that is Exp(µ1 ), whereas Yang stays an amount of time that is Exp(µ2 ).
All four random variables are independent.
(a) What is the probability that Lee arrives before and departs after Yang?
(b) What is the expected time of the first departure?
Hint: Let X be the time of the first departure, and write X = F + A where F is the
time of the first arrival and A is the additional amount of time until the first departure.
Compute EA by conditioning on who arrived first.
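A simulation sketch (illustrative, not part of the problem) lets you check your answer to (b) numerically: generate the two arrival times and the two stay lengths, and average the time of the first departure.

```python
import random

def first_departure_time(lam1, lam2, mu1, mu2, n=200_000, seed=0):
    """Monte Carlo estimate of E[X], the expected time of the
    first departure, with all four clocks independent."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        dep_lee = rng.expovariate(lam1) + rng.expovariate(mu1)   # arrival + stay
        dep_yang = rng.expovariate(lam2) + rng.expovariate(mu2)  # arrival + stay
        total += min(dep_lee, dep_yang)
    return total / n
```

Trying a symmetric case such as λ1 = λ2 = µ1 = µ2 = 1 is a good first check, since the decomposition X = F + A from the hint is easy to evaluate by hand there.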
5. Smith and Yang run a race. Smith’s time is Exp(λ) and Yang’s is Exp(µ), and these two
random variables are independent. The one who finishes first is the winner and wins Ae^{−αt}
if the winning time is t. The loser receives zero. What are Smith's expected winnings?
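As with the previous problems, the answer can be checked by a short simulation (an illustrative sketch; the function name is not from the text): Smith collects Ae^{−αt} only on runs where his time t beats Yang's.

```python
import math
import random

def smith_expected_winnings(lam, mu, A, alpha, n=200_000, seed=0):
    """Monte Carlo estimate of Smith's expected winnings when
    Smith ~ Exp(lam) races Yang ~ Exp(mu)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        s = rng.expovariate(lam)   # Smith's finishing time
        y = rng.expovariate(mu)    # Yang's finishing time
        if s < y:                  # Smith wins and collects A * exp(-alpha * s)
            total += A * math.exp(-alpha * s)
    return total / n
```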
II. Recommended problems: These provide additional practice but are not to be handed
in. Starred problems have solutions in the text.
Chapter 5: #3, 7*, 10*, 18*, 23*.
Quote of the week: Bennett’s classmates hated word problems. Indeed, they hated math altogether,
but they’d rather have a tooth filled than be forced to sit down and contemplate word problems.
Bennett, on the other hand, placed word problems on a level with Florida’s pecan pie. Word
problems were delicious. He devoured them. He convinced the flabbergasted Mrs. Dixon to give
him additional word problems, beyond the assignments, and when she ran out of problems he
created them himself. After school, when the other boys played basketball or loitered behind the
Rexall drugstore to smoke and discuss girls, Bennett went home and up to his room to do word
problems.
Alan Lightman, in Good Benito.
[Figure captions: A random 60,000-step simple random walk on the square lattice. A random
1,000,000-step self-avoiding walk on the square lattice. (Figures courtesy of Tom Kennedy,
University of Arizona.)]