Lecture 4: Difference between fixed and random effects

LMM Lecture 4
Outline
• Random effects are “shrinkage estimates” and are good for ranking
• A simulated example
• Theory
• Augmented regression version
• Interpretation of fixed and random effects
How can estimated random effects be used?
1. Smoothing of time series
2. Spatial data
   1. Smoothing
   2. Predictions for areas with no observations
3. Ranking
   1. “Small Area Estimation” for spatial data
   2. Health care example: hospital performance for different sized hospitals
   3. Genetics:
      1. performance of dairy cows having different numbers of relatives
      2. possible to rank bulls even though they have no observed values on milking performance
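The ranking idea can be sketched with a toy simulation (hypothetical hospital-style data; variance components assumed known). For a one-way random-effects model, the BLUP of group i shrinks its raw mean deviation toward zero by the factor n_i σ_a²/(n_i σ_a² + σ²), so small groups are pulled harder toward the overall average before ranking:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_a, sigma_e = 1.0, 2.0            # assumed known variance components
n_per = np.array([2, 2, 50, 50])       # groups (e.g. hospitals) of different sizes
true_a = rng.normal(0, sigma_a, len(n_per))
groups = [true_a[i] + rng.normal(0, sigma_e, n) for i, n in enumerate(n_per)]

mu = np.concatenate(groups).mean()               # overall mean
raw_dev = np.array([g.mean() for g in groups]) - mu
# one-way random-effects BLUP: shrink each group-mean deviation toward zero
shrink = n_per * sigma_a**2 / (n_per * sigma_a**2 + sigma_e**2)
blup = shrink * raw_dev

print(shrink)  # small groups get weights closer to 0, i.e. stronger shrinkage
```

Ranking groups on `blup` rather than on the raw means protects against small groups topping the list purely by chance.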
A simulated example
[Figure: subject averages, mean(y), plotted against the number of observations per subject, n.]
Theory
Let a be a random effect in y = Xβ + Za + e with a_i ∼ N(0, σ_a²). The estimates solve
\[
\begin{pmatrix} X'X & X'Z \\ Z'X & Z'Z + I\,\sigma^2/\sigma_a^2 \end{pmatrix}
\begin{pmatrix} \hat{\beta} \\ \hat{a} \end{pmatrix}
=
\begin{pmatrix} X'y \\ Z'y \end{pmatrix}
\]
Let a be a fixed effect in y = Xβ + Za + e. Then the estimates of a are the ordinary least squares solutions
\[
\begin{pmatrix} X'X & X'Z \\ Z'X & Z'Z \end{pmatrix}
\begin{pmatrix} \hat{\beta} \\ \hat{a} \end{pmatrix}
=
\begin{pmatrix} X'y \\ Z'y \end{pmatrix}
\]
The only difference is the term +I σ²/σ_a², which imposes a shrinkage on the estimates when the effect a is random.
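The shrinkage can be checked numerically. The sketch below (toy data, made-up variance components) solves both linear systems with NumPy; the only change between them is the term σ²/σ_a² · I added to the Z′Z block:

```python
import numpy as np

rng = np.random.default_rng(1)
n, q = 12, 3
X = rng.normal(size=(n, 1))                    # one continuous covariate
Z = np.kron(np.eye(q), np.ones((n // q, 1)))   # 3 groups, 4 observations each
a_true = rng.normal(0, 1.0, q)
y = X @ np.array([0.5]) + Z @ a_true + rng.normal(0, 1.0, n)

def solve_effects(ridge):
    """Solve the normal equations; ridge = sigma^2/sigma_a^2 (0 gives OLS)."""
    lhs = np.block([[X.T @ X, X.T @ Z],
                    [Z.T @ X, Z.T @ Z + ridge * np.eye(q)]])
    rhs = np.concatenate([X.T @ y, Z.T @ y])
    sol = np.linalg.solve(lhs, rhs)
    return sol[:1], sol[1:]

sigma2, sigma2_a = 1.0, 1.0
_, a_fixed = solve_effects(0.0)                  # ordinary least squares
_, a_random = solve_effects(sigma2 / sigma2_a)   # mixed model equations
print(np.linalg.norm(a_random), np.linalg.norm(a_fixed))
```

As σ²/σ_a² grows, the random-effect estimates are shrunk ever closer to zero; setting the ridge term to 0 recovers the fixed-effect (OLS) solution.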
Linear mixed model as an augmented regression model
The linear mixed model y = Xβ + Za + e with a_i ∼ N(0, σ_a²) can be written as a regression model
\[
\begin{pmatrix} y \\ 0 \end{pmatrix}
=
\begin{pmatrix} X & Z \\ 0 & I \end{pmatrix}
\begin{pmatrix} \beta \\ a \end{pmatrix}
+
\begin{pmatrix} e \\ -a \end{pmatrix}
\]
where 0 is a vector of zeros (same length as a), 0 is a matrix of zeros, and I is the identity matrix.
The weighted least squares solutions are
\[
\begin{pmatrix} X & Z \\ 0 & I \end{pmatrix}' W
\begin{pmatrix} X & Z \\ 0 & I \end{pmatrix}
\begin{pmatrix} \hat{\beta} \\ \hat{a} \end{pmatrix}
=
\begin{pmatrix} X & Z \\ 0 & I \end{pmatrix}' W
\begin{pmatrix} y \\ 0 \end{pmatrix},
\quad\text{where } W =
\begin{pmatrix} I\,\frac{1}{\sigma^2} & 0 \\ 0 & I\,\frac{1}{\sigma_a^2} \end{pmatrix}.
\]
Same as Henderson’s mixed model equations!
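This equivalence is easy to verify numerically. The sketch below (toy data, made-up variance components) builds the augmented design, solves the weighted least squares normal equations, and compares the result with a direct solve of Henderson’s mixed model equations:

```python
import numpy as np

rng = np.random.default_rng(2)
n, q = 12, 3
X = rng.normal(size=(n, 1))
Z = np.kron(np.eye(q), np.ones((n // q, 1)))
y = X @ np.array([0.5]) + Z @ rng.normal(0, 1.0, q) + rng.normal(0, 1.0, n)
sigma2, sigma2_a = 1.0, 0.5

# augmented regression: append q pseudo-observations 0 = a + (-a)
T = np.block([[X, Z],
              [np.zeros((q, 1)), np.eye(q)]])
y_aug = np.concatenate([y, np.zeros(q)])
W = np.diag(np.concatenate([np.full(n, 1 / sigma2), np.full(q, 1 / sigma2_a)]))
sol_aug = np.linalg.solve(T.T @ W @ T, T.T @ W @ y_aug)

# Henderson's mixed model equations directly
lhs = np.block([[X.T @ X, X.T @ Z],
                [Z.T @ X, Z.T @ Z + (sigma2 / sigma2_a) * np.eye(q)]])
rhs = np.concatenate([X.T @ y, Z.T @ y])
sol_mme = np.linalg.solve(lhs, rhs)

print(np.allclose(sol_aug, sol_mme))  # True
```

Multiplying the augmented normal equations through by σ² reproduces Henderson’s equations term by term, which is why the two solutions agree.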
Classical interpretation of fixed and random effects
• A fixed effect is constant: it does not change if the experiment is repeated.
• A random effect is sampled from a distribution of effects. The effects can change between experiments, but the distribution is fixed.