Statistical inference - V16
Homework 3
February 08, Monday, 12:15-14:00
Realfagbygget, group room RFB: 4F18A
Version-C-160112 (2016-01-13-13-04-14)
Teaching assistant: Håkon Otneim

Problem 3.1 [CB, Exercise 4.1, page 192]
A random point is distributed uniformly on the square with vertices (1, 1), (1, -1), (-1, 1), and (-1, -1). That is, the joint pdf is f(x, y) = 1/4 on the square; more precisely, f(x, y) = 4^{-1} 1((x, y) ∈ square), where the last factor is an indicator function. Determine the probabilities of the following events:
i) X^2 + Y^2 < 1.
ii) 2X - Y > 0.
iii) |X + Y| < 2.

Problem 3.2 [CB, Exercise 4.4, page 192]
A pdf is defined by
f(x, y) = C(x + 2y) 1(0 < y < 1) 1(0 < x < 2),
where the two factors on the right-hand side are indicators, e.g. 1(0 < y < 1) = 1_(0,1)(y).
i) Find the value of C.
ii) Find the marginal distribution of X.
iii) Find the joint cdf of X and Y.
iv) Find the pdf of the random variable Z = 9(X + 1)^{-2}.

Problem 3.3 [CB, Exercise 4.5a, page 192]
i) Find P(X > √Y) if X and Y are jointly distributed with pdf
f(x, y) = (x + y) 1(0 ≤ x ≤ 1) 1(0 ≤ y ≤ 1).

Problem 3.4 [CB, Exercise 4.10, page 193]
The random pair (X, Y) has the distribution:

         Y = 2   Y = 3   Y = 4
X = 1     1/12    1/6      0
X = 2     1/6      0      1/3
X = 3     1/12    1/6      0

i) Show that X and Y are dependent.
ii) Give a probability table for random variables U and V that have the same marginals as X and Y but are independent.

Problem 3.5 [CB, Exercise 4.11, page 193]
Let U be the number of trials needed to get the first head and V be the number of trials needed to get two heads in repeated tosses of a fair coin. Are U and V independent random variables?

Problem 3.6 [CB, Exercise 4.22, page 195]
Let (X, Y) be a bivariate random vector with joint pdf f(x, y); an alternative way of writing this joint density is f_{X,Y}. Let U = aX + b and V = cY + d, where a, b, c, and d are fixed constants with a > 0 and c > 0. Show that the joint pdf of (U, V) is

f_{U,V}(u, v) = (ac)^{-1} f_{X,Y}(a^{-1}(u - b), c^{-1}(v - d)).

Hint: Use the transformation formula.

Problem 3.7 [CB, Exercise 4.26, page 195]
Let X and Y be independent random variables with X ~ exponential(λ) and Y ~ exponential(µ), parameterized so that E X = λ. It is impossible to obtain direct observations of X and Y; this is a situation that arises, in particular, in medical experiments, where the variables X and Y are censored. Instead, we observe the random variables Z and W, where Z = X ∧ Y = min(X, Y) and W = 1(Z = X) (cf. Stat220).
i) Find the joint distribution of Z and W.
ii) Prove that Z and W are independent.
Hint: Show that P(Z ≤ z | W = i) = P(Z ≤ z) for i = 0, 1.

Problem 3.8 [CB, Exercise 4.27, page 195]
Let X ~ N(µ, σ^2) and Y ~ N(γ, σ^2), and suppose that X and Y are independent normal random variables. Define U = X + Y and V = X - Y. Show that U and V are independent normal random variables, and find the distribution of each of them.

Problem 3.9 [CB, Exercise 4.32, page 196]
i) For the hierarchical model
Y | Λ ~ Poisson(Λ),   Λ ~ gamma(α, β),
find the marginal distribution, mean, and variance of Y. Show that the marginal distribution of Y is negative binomial if α is an integer.
ii) Show that the three-stage model
Y | N ~ binomial(N, p),   N | Λ ~ Poisson(Λ),   Λ ~ gamma(α, β),
leads to the same marginal distribution of Y.

Textbook
George Casella and Roger L. Berger. Statistical Inference. The Wadsworth & Brooks/Cole Statistics/Probability Series. Wadsworth & Brooks/Cole Advanced Books & Software, Pacific Grove, CA, second edition, 2002. ISBN 0534243126.
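
Optional numerical checks (not part of the assignment)

The sketches below are rough Monte Carlo checks that hand calculations can be compared against; everything in them (variable names, sample sizes, seeds, parameter values) is an illustrative choice and is not taken from the exercises. For Problem 3.1, the constant density 1/4 on the square factorizes, so the random point can be simulated as two independent Uniform(-1, 1) coordinates:

import numpy as np

rng = np.random.default_rng(0)          # fixed seed, arbitrary choice
n = 10**6                               # number of simulated points
x = rng.uniform(-1.0, 1.0, size=n)      # X coordinate, Uniform(-1, 1)
y = rng.uniform(-1.0, 1.0, size=n)      # Y coordinate, Uniform(-1, 1), independent of X

# Empirical frequencies of the three events in Problem 3.1;
# compare these with the exact probabilities you compute.
print("P(X^2 + Y^2 < 1) ~", np.mean(x**2 + y**2 < 1))
print("P(2X - Y > 0)    ~", np.mean(2 * x - y > 0))
print("P(|X + Y| < 2)   ~", np.mean(np.abs(x + y) < 2))

With 10^6 draws the estimates are usually accurate to two or three decimal places, which is enough to confirm (or refute) an exact answer.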
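In the same spirit, a small sketch for Problem 3.9(i): simulate the hierarchical model and look at the empirical mean and variance of Y, which can be compared with the analytical values derived in the exercise. The hyperparameter values below are arbitrary, and the code assumes CB's convention that gamma(α, β) has scale parameter β (so E Λ = αβ):

import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 3, 2.0                    # illustrative hyperparameters, not from the sheet
n = 10**6

lam = rng.gamma(shape=alpha, scale=beta, size=n)   # Lambda ~ gamma(alpha, beta)
y = rng.poisson(lam)                               # Y | Lambda ~ Poisson(Lambda)

print("simulated marginal mean of Y:    ", y.mean())
print("simulated marginal variance of Y:", y.var())

A histogram of y can likewise be compared with the pmf of the negative binomial distribution obtained in the exercise.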