ECE 6605
HW#3, Assigned Sept 15, Due Sept 24.
1. (10 points) Consider a stationary 3-state Markov chain {X_i} where X_i is an
element of X = {1, 2, 3}, with probability transition matrix

    P = [  0    1-p    p  ]
        [  1     0     0  ]
        [  p    1-p    0  ]

a) Draw the state transition diagram and find the stationary distribution μ.
b) Find the entropy rate H(X) as a function of p and evaluate it for p = 1/2.
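Since the chain is small, a numerical sanity check is easy to set up. The sketch
below (NumPy assumed) takes the row-stochastic reading of P above, finds the
stationary distribution as the left eigenvector of P for eigenvalue 1, and plugs
into the standard formula H(X) = -Σ_i μ_i Σ_j P_ij log2 P_ij at p = 1/2; use it
only to check your hand derivation.

    import numpy as np

    # Sanity check for Problem 1 (assumes the row-stochastic reading of P above).
    p = 0.5
    P = np.array([[0.0, 1 - p, p],
                  [1.0, 0.0,   0.0],
                  [p,   1 - p, 0.0]])

    # Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
    evals, evecs = np.linalg.eig(P.T)
    mu = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    mu /= mu.sum()

    # Entropy rate of a stationary Markov chain, in bits per symbol.
    logP = np.zeros_like(P)
    logP[P > 0] = np.log2(P[P > 0])
    H = -np.sum(mu[:, None] * P * logP)
    print("mu =", mu, " entropy rate =", H)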
2. For a stationary random process X_1, X_2, ..., X_n, show that for all n:

a)  H(X_1, X_2, ..., X_n) / n  ≤  H(X_1, X_2, ..., X_{n-1}) / (n - 1)

b)  H(X_1, X_2, ..., X_n) / n  ≥  H(X_n | X_{n-1}, ..., X_1)
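Both inequalities can be spot-checked numerically on any small stationary
source. The sketch below (a hypothetical check, reusing the chain from Problem 1
as a convenient example process) computes H(X_1, ..., X_n) by brute-force
summation over all length-n sequences and prints the three quantities side by side.

    import itertools
    import numpy as np

    # Spot check of Problem 2 on the stationary chain from Problem 1 (p = 1/2).
    p = 0.5
    P = np.array([[0.0, 1 - p, p], [1.0, 0.0, 0.0], [p, 1 - p, 0.0]])
    evals, evecs = np.linalg.eig(P.T)
    mu = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    mu /= mu.sum()

    def joint_entropy(n):
        """H(X_1, ..., X_n) in bits, summed over all 3^n sequences."""
        H = 0.0
        for seq in itertools.product(range(3), repeat=n):
            prob = mu[seq[0]]
            for a, b in zip(seq, seq[1:]):
                prob *= P[a, b]
            if prob > 0:
                H -= prob * np.log2(prob)
        return H

    for n in range(2, 7):
        lhs = joint_entropy(n) / n                       # H(X_1..X_n)/n
        a_rhs = joint_entropy(n - 1) / (n - 1)           # part a) bound
        b_rhs = joint_entropy(n) - joint_entropy(n - 1)  # H(X_n | X_1..X_{n-1})
        print(f"n={n}: {lhs:.4f} <= {a_rhs:.4f} and >= {b_rhs:.4f}")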
3) Let X_{-n}, ..., X_{-1}, X_0, X_1, ..., X_n be a stationary (not necessarily
Markov) stochastic process. Which of the following statements are true? Prove
each, or provide a counterexample.
(a) H(X_n | X_0) = H(X_{-n} | X_0)
(b) H(X_n | X_0) ≥ H(X_{n-1} | X_0)
(c) H(X_n | X_1, ..., X_{n-1}, X_{n+1}) is nonincreasing in n.
(d) H(X_{n+1} | X_1, ..., X_n, X_{n+2}, ..., X_{2n+1}) is nonincreasing in n.
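A Markov example cannot settle these general stationary claims, but it is
useful for intuition before attempting proofs. The sketch below (an assumed
2-state chain, chosen arbitrarily) uses the fact that for a stationary Markov
chain H(X_n | X_0) = Σ_i μ_i H(i-th row of P^n), and prints how that quantity
evolves with n, which bears directly on parts (a) and (b).

    import numpy as np

    # Intuition probe for Problem 3 on an arbitrary (assumed) 2-state chain.
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    evals, evecs = np.linalg.eig(P.T)
    mu = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    mu /= mu.sum()

    def cond_entropy(n):
        """H(X_n | X_0) in bits for the chain above."""
        Pn = np.linalg.matrix_power(P, n)
        h = 0.0
        for i in range(len(mu)):
            row = Pn[i][Pn[i] > 0]
            h -= mu[i] * np.sum(row * np.log2(row))
        return h

    for n in range(1, 6):
        print(f"H(X_{n} | X_0) = {cond_entropy(n):.4f} bits")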
4) This problem deals with mutual information and testing independence. Let
Z_i = (X_i, Y_i) be an IID sequence (the vectors Z_i are IID, but X_i and Y_i are
not necessarily independent) with density p(x, y). Form the (log-likelihood
ratio) function

    (1/n) log [ p(X_1, X_2, ..., X_n) p(Y_1, Y_2, ..., Y_n) / p(X_1, X_2, ..., X_n, Y_1, Y_2, ..., Y_n) ]

What is its limit (that is, evaluate it as n tends to ∞)? Provide an interpretation.
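One way to build intuition for the limit is Monte Carlo under an assumed model.
In the sketch below, (X_i, Y_i) are jointly Gaussian with correlation rho; since
the joint density factors over i, the per-sample average of
log[p(X_i)p(Y_i)/p(X_i, Y_i)] is exactly the quantity in the problem, and
watching it stabilize as n grows suggests both the limit and its sign.

    import numpy as np

    # Monte Carlo probe for Problem 4 under an assumed bivariate Gaussian model.
    rng = np.random.default_rng(0)
    rho, n = 0.8, 200_000
    x = rng.standard_normal(n)
    y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

    # Log densities: standard normal marginals, bivariate normal joint.
    log_px = -0.5 * (x**2 + np.log(2 * np.pi))
    log_py = -0.5 * (y**2 + np.log(2 * np.pi))
    log_pxy = (-(x**2 - 2 * rho * x * y + y**2) / (2 * (1 - rho**2))
               - np.log(2 * np.pi * np.sqrt(1 - rho**2)))

    llr = np.mean(log_px + log_py - log_pxy)  # nats per sample
    print(f"n = {n}: sample average = {llr:.4f} nats; compare with your answer")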
5) a) In class we saw that conditioning reduces entropy, i.e., H(X) ≥ H(X|Y). Use
this property to try to prove that I(X;Y|Z) ≤ I(X;Y), then indicate why this
cannot be true in general; that is, conditioning does not always reduce mutual
information.
b) Let X → Y → Z be a Markov chain in that order. Prove that conditioning does
reduce the mutual information here, namely, prove I(X;Y|Z) ≤ I(X;Y).
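For experimenting with part a), a generic helper that evaluates I(X;Y) and
I(X;Y|Z) from any joint pmf is handy; the sketch below (with an assumed, purely
illustrative Markov pmf p(x)p(y|x)p(z|y)) also lets you confirm the part b)
inequality numerically before proving it. Try non-Markov joint pmfs of your own
and watch which way the comparison goes.

    import itertools
    import numpy as np

    def mutual_info(pxy):
        """I(X;Y) in bits from a joint pmf array pxy[x, y]."""
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        return sum(pxy[x, y] * np.log2(pxy[x, y] / (px[x] * py[y]))
                   for x, y in itertools.product(*map(range, pxy.shape))
                   if pxy[x, y] > 0)

    def cond_mutual_info(pxyz):
        """I(X;Y|Z) in bits from a joint pmf array pxyz[x, y, z]."""
        total = 0.0
        for z in range(pxyz.shape[2]):
            pz = pxyz[:, :, z].sum()
            if pz > 0:
                total += pz * mutual_info(pxyz[:, :, z] / pz)
        return total

    # Illustrative Markov construction X -> Y -> Z (all values assumed).
    px = np.array([0.5, 0.5])
    py_x = np.array([[0.9, 0.1], [0.3, 0.7]])
    pz_y = np.array([[0.8, 0.2], [0.4, 0.6]])
    p = np.einsum("x,xy,yz->xyz", px, py_x, pz_y)

    print("I(X;Y)   =", mutual_info(p.sum(axis=2)))
    print("I(X;Y|Z) =", cond_mutual_info(p))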
6) Entropy rate of a dog looking for a bone. A dog walks from one integer to the
next, possibly reversing his direction at each step with probability p = 0.1. Let
X_i denote his position at time i. Let X_0 = 0, and let his first step be equally
likely to be positive or negative. A typical walk might be:

    (X_0, X_1, X_2, X_3, ...) = (0, 1, 2, 1, 0, -1, -2, -3, -2, ...)

Find H(X_1, ..., X_n) and the entropy rate H(X).
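A short simulator (parameters straight from the problem statement) helps with
intuition here; note that the position sequence is a deterministic function of
the step sequence, which is worth keeping in mind when computing H(X_1, ..., X_n).

    import numpy as np

    # Simulate the dog's walk from Problem 6.
    rng = np.random.default_rng(1)

    def dog_walk(n, p_reverse=0.1):
        x = [0]
        step = rng.choice([-1, 1])        # first step equally likely +/-1
        for _ in range(n):
            x.append(x[-1] + step)
            if rng.random() < p_reverse:  # possibly reverse before the next step
                step = -step
        return x

    print(dog_walk(12))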