Practice Question No. 1, Econ 852, Answer Key
Instructor: Susumu Imai

Question 1

Let $X = (X_1, X_2)$.

1. Show that $P_X y = P_{X_1} y + P_{X_2} y$ does not hold in general.

$$P_X y = X_1 \hat\beta_1 + X_2 \hat\beta_2$$
$$P_{X_1} y = P_{X_1}\left(X_1 \hat\beta_1 + X_2 \hat\beta_2 + \hat u\right) = X_1 \hat\beta_1 + P_{X_1} X_2 \hat\beta_2$$
$$P_{X_2} y = P_{X_2}\left(X_1 \hat\beta_1 + X_2 \hat\beta_2 + \hat u\right) = P_{X_2} X_1 \hat\beta_1 + X_2 \hat\beta_2$$

(the terms $P_{X_1}\hat u$ and $P_{X_2}\hat u$ vanish because the OLS residual $\hat u$ is orthogonal to every column of $X$). Hence,

$$P_{X_1} y + P_{X_2} y = X_1 \hat\beta_1 + X_2 \hat\beta_2 + P_{X_1} X_2 \hat\beta_2 + P_{X_2} X_1 \hat\beta_1 \neq X_1 \hat\beta_1 + X_2 \hat\beta_2 = P_X y.$$

2. Report the assumptions under which the equality holds.

Equality holds if $X_1^T X_2 = 0$ or if $\hat\beta_1 = \hat\beta_2 = 0$.

Question 2

Consider the following linear model
$$y_t = X_t \beta + Z_t \gamma + u_t, \qquad t = 1, \ldots, n.$$

1. Suppose $\sigma^2 = \mathrm{Var}(u_t)$ is known. Describe the bias and variance of the OLS estimator of $\beta$ when $Z$ is included and when it is excluded, first for $\gamma = 0$ and then for $\gamma \neq 0$.

OLS estimator including $Z$ (by the FWL theorem):
$$\hat\beta = \left(X^T M_Z X\right)^{-1} X^T M_Z y$$

Bias:
$$\hat\beta = \left(X^T M_Z X\right)^{-1} X^T M_Z (X\beta + Z\gamma + u) = \beta + 0 + \left(X^T M_Z X\right)^{-1} X^T M_Z u,$$
since $M_Z Z = 0$. Hence,
$$E\left[\hat\beta \mid X, Z\right] = \beta + E\left[\left(X^T M_Z X\right)^{-1} X^T M_Z u \,\Big|\, X, Z\right] = \beta + \left(X^T M_Z X\right)^{-1} X^T M_Z E(u \mid X, Z) = \beta.$$
Therefore, the estimator is unbiased, whether $\gamma = 0$ or $\gamma \neq 0$.

OLS estimator excluding $Z$:
$$\hat\beta = \left(X^T X\right)^{-1} X^T y$$

Bias:
$$\hat\beta = \left(X^T X\right)^{-1} X^T (X\beta + Z\gamma + u) = \beta + \left(X^T X\right)^{-1} X^T Z\gamma + \left(X^T X\right)^{-1} X^T u.$$
Hence,
$$E\left[\hat\beta \mid X, Z\right] = \beta + \left(X^T X\right)^{-1} X^T Z\gamma.$$
Therefore, the estimator is biased if $\gamma \neq 0$, and unbiased if $\gamma = 0$.

Variance if $Z$ is included:
$$\mathrm{Var}\left[\hat\beta \mid X, Z\right] = \sigma^2 \left(X^T M_Z X\right)^{-1}$$

Variance if $Z$ is not included:
$$\mathrm{Var}\left[\hat\beta \mid X, Z\right] = \sigma^2 \left(X^T X\right)^{-1}$$

Since $X^T X - X^T M_Z X = X^T P_Z X$ is positive semidefinite, including $Z$ (weakly) increases the variance of $\hat\beta$.

2. Do the same exercise when the variance of the error term is not known and needs to be estimated.

Estimated variance matrix if $Z$ is included:
$$\widehat{\mathrm{Var}}\left[\hat\beta \mid X, Z\right] = s^2 \left(X^T M_Z X\right)^{-1}, \qquad s^2 = \frac{\hat u^T \hat u}{n - k_X - k_Z}, \quad \hat u = M_{[X,Z]}\, y,$$
where $k_X$ is the number of variables in $X$ and $k_Z$ is the number of variables in $Z$.

Estimated variance if $Z$ is not included:
$$\widehat{\mathrm{Var}}\left[\hat\beta \mid X, Z\right] = s^2 \left(X^T X\right)^{-1}, \qquad s^2 = \frac{\hat u^T \hat u}{n - k_X}, \quad \hat u = M_X y.$$

Question 3

Do Exercise 3.19.

The restricted model parameter estimate is:
$$\beta_r = \left(X^T X\right)^{-1} X^T y = \left(X^T M_Z X\right)^{-1}\left(X^T M_Z X\right)\left(X^T X\right)^{-1} X^T y = \left(X^T M_Z X\right)^{-1} X^T M_Z P_X y.$$

The unrestricted model parameter estimate is:
$$\beta_u = \left(X^T M_Z X\right)^{-1} X^T M_Z y.$$

Then, the difference is
$$\beta_u - \beta_r = \left(X^T M_Z X\right)^{-1} X^T M_Z \left(I - P_X\right) y = \left(X^T M_Z X\right)^{-1} X^T M_Z M_X y.$$

Question 4

Suppose $y$ and $X$ are $n \times 1$ vectors, and let $\hat\beta$ be the OLS coefficient, i.e. $y = X\hat\beta + \hat u$.

1. By using the Cauchy-Schwarz inequality, show that
$$y^T y \geq (P_X y)^T (P_X y).$$
Also, use the Cauchy-Schwarz inequality to show when the equality holds.

From the Cauchy-Schwarz inequality,
$$|X^T y| \leq \|X\| \, \|y\|.$$
Then,
$$\|y\|^2 \geq |X^T y|^2 \, \|X\|^{-2}.$$
Therefore,
$$y^T y \geq y^T X \left(X^T X\right)^{-1} X^T y = (P_X y)^T (P_X y).$$
We also know that equality holds if $X = \alpha y$ for some constant $\alpha$.

2. Now, we want to use the above Cauchy-Schwarz inequality to prove the Gauss-Markov theorem. Let $Ay$ be an unbiased linear estimator. Use the above relationship and substitute $A^T$ for $y$, i.e.
$$A A^T \geq \left(P_X A^T\right)^T \left(P_X A^T\right).$$
This should be enough to prove the Gauss-Markov theorem.

Because of unbiasedness, $AX = I$. Hence,
$$P_X A^T = X\left(X^T X\right)^{-1} X^T A^T = X\left(X^T X\right)^{-1} = A_{OLS}^T.$$
Hence, by using the above inequality,
$$A A^T \geq \left(P_X A^T\right)^T \left(P_X A^T\right) = A_{OLS} A_{OLS}^T,$$
so that $\mathrm{Var}(Ay) = \sigma^2 A A^T \geq \sigma^2 A_{OLS} A_{OLS}^T = \mathrm{Var}\left(\hat\beta_{OLS}\right)$. Equality holds if $A = \alpha A_{OLS}$; from unbiasedness, $\alpha = 1$.
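Supplementary numerical check (not part of the original answer key). The algebra in Questions 1 and 2 lends itself to a quick simulation. Below is a minimal sketch in Python with NumPy; the sample size, parameter values, and variable names are all illustrative assumptions, not values given in the question. The first part verifies that $P_X y \neq P_{X_1} y + P_{X_2} y$ for generic regressors but that equality holds once $X_1^T X_2 = 0$; the second compares the simulated bias of the short regression with the analytic expression $(X^T X)^{-1} X^T Z \gamma$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # illustrative sample size


def proj(M):
    """Orthogonal projection matrix onto the column space of M."""
    return M @ np.linalg.solve(M.T @ M, M.T)


# --- Question 1: P_X y vs. P_{X1} y + P_{X2} y ---
X1 = rng.standard_normal((n, 2))
X2 = rng.standard_normal((n, 3))
y = rng.standard_normal(n)

X = np.hstack([X1, X2])
lhs = proj(X) @ y
rhs = proj(X1) @ y + proj(X2) @ y
# Noticeably nonzero in general:
print("general case, max |P_X y - (P_X1 y + P_X2 y)|:", np.max(np.abs(lhs - rhs)))

# Orthogonalize X2 against X1 so that X1'X2 = 0; the decomposition then holds.
X2_orth = X2 - proj(X1) @ X2
X_orth = np.hstack([X1, X2_orth])
lhs = proj(X_orth) @ y
rhs = proj(X1) @ y + proj(X2_orth) @ y
print("orthogonal case, max difference:", np.max(np.abs(lhs - rhs)))  # ~ machine zero

# --- Question 2: omitted-variable bias, E[beta_hat] = beta + (X'X)^{-1} X'Z gamma ---
beta, gamma, sigma = np.array([1.0, -2.0]), np.array([0.5]), 1.0  # illustrative values
Xq = rng.standard_normal((n, 2))
Z = 0.6 * Xq[:, [0]] + rng.standard_normal((n, 1))  # Z correlated with X

reps = 5000
est_short = np.zeros((reps, 2))
for r in range(reps):
    u = sigma * rng.standard_normal(n)
    yq = Xq @ beta + Z @ gamma + u
    est_short[r] = np.linalg.solve(Xq.T @ Xq, Xq.T @ yq)  # regression omitting Z

analytic_bias = np.linalg.solve(Xq.T @ Xq, Xq.T @ Z) @ gamma
print("simulated bias:", est_short.mean(axis=0) - beta)
print("analytic  bias:", analytic_bias)
```

With $X$ and $Z$ held fixed across replications, the average of the short-regression estimates minus $\beta$ should match the analytic bias term up to simulation noise, and it shrinks to zero when $\gamma$ is set to zero.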