General MANOVA Tests
Statistics 784 Multivariate Analysis, NC State University

- Recall: $N$ responses $x_1, x_2, \ldots, x_N$ are assumed to be independent, and the $\alpha$th response is related to $q$ covariate values, represented by the $(q \times 1)$ vector $z_\alpha$, through
\[
x_\alpha \sim N_p(\beta z_\alpha, \Sigma), \qquad \alpha = 1, 2, \ldots, N,
\]
where $\beta$ is a $(p \times q)$ matrix of regression coefficients.
- Partition $\beta$ into $(p \times q_1)$ and $(p \times q_2)$ blocks: $\beta = (\beta_1 \;\; \beta_2)$.
- We want to test the null hypothesis $H_0: \beta_1 = \beta_1^*$ for some given matrix $\beta_1^*$.

Sufficiency

- $\hat\Sigma_\Omega$, $\hat\beta_{1,\Omega}$, and $\hat\beta_{2,\omega}$ are sufficient statistics.
- Without loss of generality, assume that $\beta_1^* = 0$.
- Write
\[
\beta z_\alpha = \beta_1 z_\alpha^{(1)} + \beta_2 z_\alpha^{(2)}
= \beta_1 \left( z_\alpha^{(1)} - A_{1,2} A_{2,2}^{-1} z_\alpha^{(2)} \right)
+ \left( \beta_2 + \beta_1 A_{1,2} A_{2,2}^{-1} \right) z_\alpha^{(2)}
= \beta_1 z_\alpha^{*(1)} + \beta_2^* z_\alpha^{(2)}.
\]
- Then $\hat\beta_1 = \hat\beta_{1,\Omega}$ and $\hat\beta_2^* = \hat\beta_{2,\omega}$.

Invariance

- $H_0$ is invariant under the transformation $x_\alpha^* = x_\alpha + \Gamma z_\alpha^{(2)}$, since
\[
E(x_\alpha^*) = \beta_1 z_\alpha^{*(1)} + (\beta_2^* + \Gamma) z_\alpha^{(2)}.
\]
- Under this transformation, $\hat\Sigma_\Omega$ and $\hat\beta_1$ are invariant.
- However, for any $\hat\beta_2^*$ we can find $\Gamma$ such that $\hat\beta_2^* + \Gamma = 0$, so $\hat\beta_2^*$ is not invariant.
- So the only invariant functions of the sufficient statistics $\hat\Sigma_\Omega$, $\hat\beta_1$, and $\hat\beta_2^*$ are $\hat\Sigma_\Omega$ and $\hat\beta_1$.

- Also, $H_0$ is invariant under the transformation $z_\alpha^{**(1)} = C z_\alpha^{*(1)}$ for nonsingular $(q_1 \times q_1)$ $C$, under which $\beta_1$ becomes $\beta_1 C^{-1}$.
- Under this transformation, $\hat\Sigma_\Omega$ and $\hat\beta_1 A_{1,1\cdot 2} \hat\beta_1'$ are invariant.
- If a function $f(\hat\beta_1, A_{1,1\cdot 2})$ is invariant, then
\[
f(\hat\beta_1, A_{1,1\cdot 2}) = f\!\left(\hat\beta_1 A_{1,1\cdot 2}^{1/2},\, I_{q_1}\right) = f\!\left(\hat\beta_1 A_{1,1\cdot 2}^{1/2} Q,\, I_{q_1}\right)
\]
for any orthogonal $Q$, so it depends only on $\hat\beta_1 A_{1,1\cdot 2} \hat\beta_1'$.
- So $\hat\Sigma_\Omega$ and $\hat\beta_1 A_{1,1\cdot 2} \hat\beta_1'$ are the only invariant functions of $\hat\Sigma_\Omega$ and $\hat\beta_1$.
- As before, write $E = N \hat\Sigma_\Omega$ and $H = \hat\beta_1 A_{1,1\cdot 2} \hat\beta_1'$.

- Finally, $H_0$ is invariant when $x_\alpha$ is replaced by $K x_\alpha$ for nonsingular $(p \times p)$ $K$.
- This transforms $E$ to $K E K'$ and $H$ to $K H K'$.
- We can find $K$ such that $K E K' = I_p$ and $K H K' = L$, where
\[
L = \begin{pmatrix}
l_1 & 0 & \cdots & 0 \\
0 & l_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & l_p
\end{pmatrix}.
\]

- Here $l_1 \ge l_2 \ge \cdots \ge l_p \ge 0$ are the roots of the determinantal equation $\det(H - l E) = 0$, which are also the solutions to the two-matrix eigenvalue problem $H \xi = l E \xi$, and are the eigenvalues of $E^{-1} H$.
- These $p$ roots are invariant, and are the only invariants.
- So any test statistic that is invariant under all of these transformations must be a function of the roots $l_1, l_2, \ldots, l_p$.
- Note that in general $\min(p, q_1)$ of the roots are non-zero.

The Common Tests

- Since $E$ has a central $W_p(\Sigma, n)$ distribution under both the null and alternative hypotheses, while $H$ has a $W_p(\Sigma, m)$ distribution that is central only under the null hypothesis, we want to reject $H_0$ when the roots are large by an appropriate measure.
- The likelihood ratio $\lambda$ satisfies
\[
\lambda^{2/N} = U = \frac{\det(E)}{\det(E + H)} = \frac{\det(I_p)}{\det(I_p + L)} = \prod_{i=1}^{p} \frac{1}{1 + l_i}.
\]
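The roots $l_i$ and the likelihood-ratio statistic $U$ can be computed directly from $E$ and $H$ by solving the generalized symmetric eigenproblem $H\xi = l E \xi$. The sketch below is not part of the original notes; the matrices `E` and `H` are made-up stand-ins for $N\hat\Sigma_\Omega$ and $\hat\beta_1 A_{1,1\cdot 2}\hat\beta_1'$.

    import numpy as np
    from scipy.linalg import eigvalsh

    def manova_roots(E, H):
        """Roots l_1 >= ... >= l_p of det(H - l E) = 0, i.e. the generalized
        eigenvalues of the symmetric pair (H, E) with E positive definite."""
        return np.sort(eigvalsh(H, E))[::-1]          # solves H xi = l E xi

    def wilks_U(E, H):
        """Likelihood-ratio statistic U = det(E) / det(E + H) = prod 1 / (1 + l_i)."""
        l = manova_roots(E, H)
        return np.prod(1.0 / (1.0 + l))

    # Made-up illustration: E behaves like a W_p(I, n) error matrix,
    # H like a rank-q1 hypothesis matrix, so p - q1 of the roots are ~ 0.
    rng = np.random.default_rng(0)
    p, n, q1 = 3, 30, 2
    Z = rng.standard_normal((n, p)); E = Z.T @ Z
    Y = rng.standard_normal((q1, p)); H = Y.T @ Y
    print("roots:", manova_roots(E, H))
    print("Wilks U:", wilks_U(E, H))

Working with the pair $(H, E)$ rather than forming $E^{-1}H$ explicitly keeps the computation symmetric, which is the usual numerically stable choice.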
Lawley-Hotelling Trace

- Lawley-Hotelling trace:
\[
\sum_{i=1}^{p} l_i = \operatorname{trace}(L) = \operatorname{trace}\left(E^{-1} H\right).
\]
- For large $N$, approximately,
\[
N \operatorname{trace}\left(E^{-1} H\right) \sim \chi^2_{pq_1}.
\]
- Pillai's $F$-approximation:
\[
\frac{\nu_2}{\nu_1} \times \frac{\operatorname{trace}\left(E^{-1} H\right)}{\min(p, q_1)} \sim F_{\nu_1, \nu_2},
\]
where $\nu_1 = p q_1$ and $\nu_2 = (n - p - 1) \min(p, q_1)$.

Bartlett-Nanda-Pillai Trace

- Pillai's trace:
\[
V = \sum_{i=1}^{p} \frac{l_i}{1 + l_i} = \operatorname{trace}\left[ H (E + H)^{-1} \right].
\]
- $V$ can also be written as $\sum f_i$, where $f_1 \ge f_2 \ge \cdots \ge f_p \ge 0$ are the roots of the determinantal equation $\det[H - f(E + H)] = 0$.
- The joint density of these roots is
\[
C \left\{ \prod_{i=1}^{p} f_i^{\frac{1}{2}(m - p - 1)} (1 - f_i)^{\frac{1}{2}(n - p - 1)} \right\} \prod_{i < j} (f_i - f_j).
\]
- The density of $V$ can be found by integration, but is difficult to use.
- Pillai approximated the density by a $\beta$ density, which allows the statistic to be transformed into an approximately $F$-distributed statistic.

Roy's Greatest Root

- Recall:
\[
E(x_\alpha) = \beta_1 z_\alpha^{(1)} + \beta_2 z_\alpha^{(2)},
\]
whence
\[
E(c' x_\alpha) = c' \beta_1 z_\alpha^{(1)} + c' \beta_2 z_\alpha^{(2)}
\]
for every $c$, and conversely.
- So we can test the multivariate hypothesis $H_0: \beta_1 = 0$ by testing all the univariate hypotheses $H_0(c): c' \beta_1 = 0$.
- The conventional test of the univariate hypothesis is based on the $F$-statistic, say $F(c)$:
\[
F(c) = \frac{\frac{1}{m} c' H c}{\frac{1}{n} c' E c}.
\]
- We reject $H_0$ if we reject any $H_0(c)$; that is, we use the statistic
\[
\max_c F(c) = \frac{n}{m}\, l_1.
\]
- The Roy maximum root criterion is $l_1$.
- The exact distribution of $l_1 = f_1 / (1 - f_1)$ can be computed from the joint distribution of $f_1, f_2, \ldots, f_p$ given above.
- The usual approximation is based on an $F$-distributed upper bound to a transformed $l_1$, which provides only an approximate lower bound to the $P$-value.

Large Samples

- Note that for large $N$, $E$ is $O_p(N)$ but, under $H_0$, $H$ is $O_p(1)$, so the roots are $O_p(N^{-1})$.
- All test statistics except Roy's are therefore approximately equivalent to $\sum l_i$, and may be expected to have similar behavior at or close to $H_0$.

Power

- Under the alternative hypothesis $\beta_1 \ne \beta_1^*$, $H$ has the noncentral Wishart distribution with noncentrality parameter (matrix)
\[
(\beta_1 - \beta_1^*) A_{1,1\cdot 2} (\beta_1 - \beta_1^*)'.
\]
- The power of any invariant test is a function of the roots $\nu_1^{(N)} \ge \nu_2^{(N)} \ge \cdots \ge \nu_p^{(N)} \ge 0$ of the determinantal equation
\[
\det\left[ (\beta_1 - \beta_1^*) A_{1,1\cdot 2} (\beta_1 - \beta_1^*)' - \nu \Sigma \right] = 0.
\]
- If these roots are substantially unequal, the Lawley-Hotelling test is more powerful than the likelihood ratio test, which is in turn more powerful than Pillai's trace.
- The reverse is true if the roots are close to each other.
- So the likelihood ratio test is, in this sense, maximin.
- Roy's test is most powerful if only $\nu_1^{(N)} > 0$, or more generally if $\nu_1^{(N)} \gg \nu_2^{(N)}$, but is otherwise less powerful than the other three tests.
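The noncentrality roots $\nu_1^{(N)} \ge \cdots \ge \nu_p^{(N)}$ that govern power solve a determinantal equation of the same form as before, so they can be computed the same way. A small sketch, again not from the original notes, assuming $\beta_1 - \beta_1^*$, $A_{1,1\cdot 2}$, and $\Sigma$ are available; the particular numbers below are hypothetical.

    import numpy as np
    from scipy.linalg import eigvalsh

    def noncentrality_roots(beta_diff, A112, Sigma):
        """Roots nu_1 >= ... >= nu_p of
        det[(beta_1 - beta_1*) A_{1,1.2} (beta_1 - beta_1*)' - nu Sigma] = 0."""
        Delta = beta_diff @ A112 @ beta_diff.T        # (p x p) noncentrality matrix
        return np.sort(eigvalsh(Delta, Sigma))[::-1]

    # Hypothetical inputs: beta_diff plays the role of beta_1 - beta_1^*.
    p, q1 = 3, 2
    beta_diff = np.array([[0.5, 0.0],
                          [0.0, 0.3],
                          [0.2, 0.1]])
    A112 = 10.0 * np.eye(q1)                          # hypothetical A_{1,1.2}
    Sigma = np.eye(p)
    print(noncentrality_roots(beta_diff, A112, Sigma))

How spread out these roots are is what the comparison above turns on: strongly unequal roots favor the Lawley-Hotelling and Roy criteria, while nearly equal roots favor Pillai's trace and the likelihood ratio test.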
Admissibility

- Recall:
  - one test is better than another if it has the same size (or lesser) and uniformly as much power, with strict inequality somewhere, and
  - a test is admissible if there is no better test.
- All four tests are admissible.
- But other tests based on the same roots may not be. For instance, the test based on the smallest root is inadmissible (presumably for $\min(m, p) > 1$).

Unbiasedness and Monotonicity

- A test is unbiased if its power is minimized at the null hypothesis.
- Its power function is monotone if power increases with some measure of distance from the null hypothesis.
- If the power function is monotone, the test is unbiased.
- The likelihood ratio test, the Lawley-Hotelling test, and Roy's greatest root test all have power functions that are monotone functions of each $\nu_i^{(N)}$.
- The Pillai trace test also has a monotone power function, provided the critical value is less than 1, which we expect to be true for typical sizes and large $n$.
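Since every invariant criterion is a function of the roots $l_1, \ldots, l_p$, all four statistics discussed in this section come out of a single eigenvalue computation. A wrap-up sketch building on the earlier ones; the example matrices are again made up.

    import numpy as np
    from scipy.linalg import eigvalsh

    def manova_criteria(E, H):
        """The four invariant criteria as functions of the roots of det(H - l E) = 0."""
        l = np.sort(eigvalsh(H, E))[::-1]
        return {
            "Wilks U":                float(np.prod(1.0 / (1.0 + l))),
            "Lawley-Hotelling trace": float(l.sum()),               # trace(E^{-1} H)
            "Pillai trace V":         float(np.sum(l / (1.0 + l))), # trace[H (E + H)^{-1}]
            "Roy greatest root":      float(l[0]),
        }

    # Made-up example matrices, as in the earlier sketches.
    rng = np.random.default_rng(1)
    p, n, q1 = 3, 40, 2
    Z = rng.standard_normal((n, p)); E = Z.T @ Z
    Y = rng.standard_normal((q1, p)); H = Y.T @ Y
    print(manova_criteria(E, H))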