5 Operations on Multiple Random Variables

'
$
tmentarOperations
Chapter
Dep5:
J on Multiple Random Variables
om
Si
s-Electrical Eng
i
s
ly
ine
na
sis-Electr
EE 360- Ra nd
EE360 Random Signal analysis
gn
gi
n
Si
g
En
ri n
EE360-Random
ysis-Electr ical
ee
EE360 - E
le
nal
ical
naly
En
lA
gi
a
n
g
i
n
n
e
E
e
l
r
a
i
n
c
i
g
ctr
al
A
pt.-JUST.
g De
rin
e
ne
tment.
par
De
1
US
T
.
ing
er
andom S
i
g
0-R
n
al
36
A
EE
gD
e p t .- J U S T.
Expected value of a function of r.v.’s
Two r.v.'s:

\bar{g} = E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{X,Y}(x,y)\, dx\, dy

N r.v.'s:

\bar{g} = E[g(X_1, X_2, \cdots, X_N)] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} g(x_1, \cdots, x_N)\, f_{X_1,\cdots,X_N}(x_1, \cdots, x_N)\, dx_1 \cdots dx_N
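The double integral can be sanity-checked numerically. For independent standard gaussian X and Y with g(x, y) = x² + y², the expected value is E[X²] + E[Y²] = 2. A minimal Monte Carlo sketch (not from the slides; the sample size and choice of g are illustrative):

```python
import numpy as np

# Monte Carlo estimate of E[g(X, Y)] for independent standard gaussian X, Y
# and g(x, y) = x^2 + y^2; the true value is E[X^2] + E[Y^2] = 1 + 1 = 2.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200_000)
y = rng.normal(0.0, 1.0, 200_000)
g_bar = np.mean(x**2 + y**2)
print(g_bar)  # close to 2
```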
Jordan University of Science and Technology - Electrical Engineering
Abdel-Rahman Jaradat
Ex. 5.1-1
g(X_1, X_2, \cdots, X_N) = \sum_{i=1}^{N} \alpha_i X_i = weighted sum of r.v.'s

E[g(X_1, X_2, \cdots, X_N)] = E\left[\sum_{i=1}^{N} \alpha_i X_i\right] = \sum_{i=1}^{N} \alpha_i E[X_i]
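Linearity of expectation holds exactly for sample averages as well, so the identity is easy to check numerically. A short sketch (weights and distributions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = np.array([0.5, -2.0, 3.0])        # weights alpha_i
X = rng.normal(size=(3, 10_000))          # rows are samples of X_1, X_2, X_3
lhs = (alpha @ X).mean()                  # sample mean of the weighted sum
rhs = alpha @ X.mean(axis=1)              # weighted sum of the sample means
print(lhs, rhs)                           # identical up to round-off
```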
Joint Moments about the Origin
Joint moment m_{nk}:

m_{nk} = E[X^n Y^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^n y^k f_{X,Y}(x,y)\, dx\, dy

m_{10} = E[X], \quad m_{01} = E[Y]

Second-order moment = correlation of X and Y = m_{11}:

R_{XY} = m_{11} = E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_{X,Y}(x,y)\, dx\, dy
• If R_{XY} = E[XY] = E[X]E[Y], then X, Y are uncorrelated.
• If X, Y are independent, then X, Y are uncorrelated, but the converse is not true (except for gaussian r.v.'s).
• If R_{XY} = 0, then X, Y are orthogonal.
Ex. 5.1-2
Y = -6X + 22, \quad \bar{X} = 3, \quad \sigma_X^2 = 2

m_{20} = E[X^2] = \sigma_X^2 + \bar{X}^2 = 2 + 9 = 11

\bar{Y} = E[-6X + 22] = -6\bar{X} + 22 = -18 + 22 = 4

R_{XY} = E[XY] = E[-6X^2 + 22X] = -6(11) + 22(3) = 0 \Rightarrow X, Y are orthogonal.

R_{XY} \neq E[X]E[Y] \Rightarrow X, Y are correlated.
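The example can be reproduced by simulation: draw X with mean 3 and variance 2 (a gaussian is used here purely for convenience), set Y = -6X + 22, and estimate R_XY and E[X]E[Y]. A sketch, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(3.0, np.sqrt(2.0), 500_000)   # mean 3, variance 2
y = -6.0 * x + 22.0
r_xy = np.mean(x * y)            # ≈ 0: X and Y are orthogonal
exey = x.mean() * y.mean()       # ≈ 3·4 = 12 ≠ R_XY: X and Y are correlated
```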
Example
If Y = aX + b, then X, Y are always correlated if a \neq 0:

R_{XY} = E[XY] = E[aX^2 + bX] = aE[X^2] + bE[X]

If we want X, Y to be orthogonal, i.e. R_{XY} = 0 = E[aX^2 + bX], then b = -aE[X^2]/E[X].
Joint Moments about the Origin: N-dimensional case

m_{n_1, n_2, \cdots, n_N} = E[X_1^{n_1} X_2^{n_2} \cdots X_N^{n_N}] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} x_1^{n_1} \cdots x_N^{n_N}\, f_{X_1,\cdots,X_N}(x_1, \cdots, x_N)\, dx_1 \cdots dx_N

with n_i = 0, 1, \cdots for all i = 1, 2, \cdots, N.
Joint Central Moments
\mu_{nk} = E[(X - \bar{X})^n (Y - \bar{Y})^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \bar{X})^n (y - \bar{Y})^k f_{X,Y}(x,y)\, dx\, dy

\mu_{20} = E[(X - \bar{X})^2] = \sigma_X^2 \quad and \quad \mu_{02} = E[(Y - \bar{Y})^2] = \sigma_Y^2

Covariance of X, Y:

C_{XY} = \mu_{11} = E[(X - \bar{X})(Y - \bar{Y})] = E[XY - \bar{X}Y - \bar{Y}X + \bar{X}\bar{Y}] = R_{XY} - \bar{X}\bar{Y} - \bar{Y}\bar{X} + \bar{X}\bar{Y}

C_{XY} = R_{XY} - \bar{X}\bar{Y}
Comments on the covariance of X, Y: C_{XY} = R_{XY} - \bar{X}\bar{Y}
• X, Y uncorrelated, i.e. R_{XY} = \bar{X}\bar{Y} \Rightarrow C_{XY} = 0
• X, Y orthogonal, i.e. R_{XY} = 0 \Rightarrow C_{XY} = -\bar{X}\bar{Y}
• X, Y orthogonal and (\bar{X} = 0 or \bar{Y} = 0) \Rightarrow C_{XY} = 0
Correlation Coefficient ρ
\rho = \frac{C_{XY}}{\sigma_X \sigma_Y} = \frac{\mu_{11}}{\sqrt{\mu_{20}\,\mu_{02}}} = E\left[\left(\frac{X - \bar{X}}{\sigma_X}\right)\left(\frac{Y - \bar{Y}}{\sigma_Y}\right)\right]

Using the Cauchy-Schwarz inequality E[U^2]E[V^2] \geq (E[UV])^2, show that -1 \leq \rho \leq 1.

Let U = \frac{X - \bar{X}}{\sigma_X} and V = \frac{Y - \bar{Y}}{\sigma_Y}; we get

E\left[\left(\frac{X - \bar{X}}{\sigma_X}\right)^2\right] E\left[\left(\frac{Y - \bar{Y}}{\sigma_Y}\right)^2\right] \geq \left(E\left[\frac{X - \bar{X}}{\sigma_X} \cdot \frac{Y - \bar{Y}}{\sigma_Y}\right]\right)^2

\Rightarrow \frac{\sigma_X^2}{\sigma_X^2} \cdot \frac{\sigma_Y^2}{\sigma_Y^2} \geq \rho^2 \Rightarrow 1 \geq \rho^2 \Rightarrow -1 \leq \rho \leq 1
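The bound can be observed on sample estimates of ρ. Below, Y is constructed so that its true correlation coefficient with X is 0.8 (the construction and sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100_000)
y = 0.8 * x + 0.6 * rng.normal(size=100_000)   # Var(Y) = 0.64 + 0.36 = 1, ρ = 0.8
c_xy = np.mean((x - x.mean()) * (y - y.mean()))  # sample covariance C_XY
rho = c_xy / (x.std() * y.std())                 # sample ρ = C_XY / (σ_X σ_Y)
print(rho)  # close to 0.8, and always within [-1, 1]
```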
N r.v.'s: \mu_{n_1 n_2 \cdots n_N}

For N r.v.'s X_1, X_2, \cdots, X_N the (n_1 + n_2 + \cdots + n_N)-order joint central moment is defined by

\mu_{n_1 n_2 \cdots n_N} = E\left[(X_1 - \bar{X}_1)^{n_1} (X_2 - \bar{X}_2)^{n_2} \cdots (X_N - \bar{X}_N)^{n_N}\right]
Ex. 5.1-3
Let Y = \sum_{i=1}^{N} \alpha_i X_i, with \alpha_i real weights. Find \sigma_Y^2.

\bar{Y} = E[Y] = \sum_{i=1}^{N} \alpha_i E[X_i] = \sum_{i=1}^{N} \alpha_i \bar{X}_i

Y - \bar{Y} = \sum_{i=1}^{N} \alpha_i (X_i - \bar{X}_i), and

\sigma_Y^2 = E[(Y - \bar{Y})^2] = E\left[\sum_{i=1}^{N} \alpha_i (X_i - \bar{X}_i) \sum_{j=1}^{N} \alpha_j (X_j - \bar{X}_j)\right] = \sum_{i=1}^{N}\sum_{j=1}^{N} \alpha_i \alpha_j E[(X_i - \bar{X}_i)(X_j - \bar{X}_j)]

\sigma_Y^2 = \sum_{i=1}^{N}\sum_{j=1}^{N} \alpha_i \alpha_j C_{X_i X_j}

For X_i uncorrelated, i.e. C_{X_i X_j} = \sigma_{X_i}^2 \delta(i - j), we get \sigma_Y^2 = \sum_{i=1}^{N} \alpha_i^2 \sigma_{X_i}^2.

The variance of a weighted sum of uncorrelated random variables (weights \alpha_i) equals the weighted sum of the variances of the random variables (weights \alpha_i^2).
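The double-sum formula and its uncorrelated special case can be checked exactly with a diagonal covariance matrix (the weights and variances below are arbitrary):

```python
import numpy as np

alpha = np.array([1.0, -2.0, 0.5])   # weights alpha_i
var_x = np.array([4.0, 9.0, 1.0])    # variances of uncorrelated X_i
C = np.diag(var_x)                   # C_{X_i X_j} = σ_{X_i}² δ(i − j)
var_y = alpha @ C @ alpha            # general formula Σ_i Σ_j α_i α_j C_{X_i X_j}
print(var_y)                         # equals Σ_i α_i² σ_{X_i}² = 40.25
```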
Joint Characteristic Function - 2D Fourier Transform
\Phi_{X,Y}(\omega_1, \omega_2) = E[e^{j\omega_1 X + j\omega_2 Y}], with \omega_1, \omega_2 real numbers.

\Phi_{X,Y}(\omega_1, \omega_2) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y)\, e^{j\omega_1 x + j\omega_2 y}\, dx\, dy

f_{X,Y}(x,y) = \frac{1}{(2\pi)^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \Phi_{X,Y}(\omega_1, \omega_2)\, e^{-j\omega_1 x - j\omega_2 y}\, d\omega_1\, d\omega_2

Marginal characteristic functions:

\Phi_X(\omega_1) = \Phi_{X,Y}(\omega_1, 0), \quad \Phi_Y(\omega_2) = \Phi_{X,Y}(0, \omega_2)

Joint moment m_{nk}:

m_{nk} = (-j)^{n+k} \left.\frac{\partial^{n+k} \Phi_{X,Y}(\omega_1, \omega_2)}{\partial \omega_1^n\, \partial \omega_2^k}\right|_{\omega_1 = \omega_2 = 0}
Ex. 5.2-1 on using m_{nk} = (-j)^{n+k} \left.\frac{\partial^{n+k} \Phi_{X,Y}(\omega_1,\omega_2)}{\partial \omega_1^n\, \partial \omega_2^k}\right|_{\omega_1 = \omega_2 = 0}

Given \Phi_{X,Y}(\omega_1, \omega_2) = e^{-2\omega_1^2 - 8\omega_2^2}, find \bar{X}, \bar{Y}, R_{XY}, C_{XY}.

\bar{X} = m_{10} = -j \left.\frac{\partial \Phi_{X,Y}(\omega_1,\omega_2)}{\partial \omega_1}\right|_{\omega_1=\omega_2=0} = -j(-4\omega_1)\, e^{-2\omega_1^2 - 8\omega_2^2} \Big|_{\omega_1=\omega_2=0} = 0

\bar{Y} = m_{01} = -j \left.\frac{\partial \Phi_{X,Y}(\omega_1,\omega_2)}{\partial \omega_2}\right|_{\omega_1=\omega_2=0} = -j(-16\omega_2)\, e^{-2\omega_1^2 - 8\omega_2^2} \Big|_{\omega_1=\omega_2=0} = 0

R_{XY} = m_{11} = (-j)^2 \left.\frac{\partial^2}{\partial \omega_1 \partial \omega_2} e^{-(2\omega_1^2 + 8\omega_2^2)}\right|_{\omega_1=\omega_2=0} = 0

C_{XY} = R_{XY} - \bar{X}\bar{Y} = 0 \Rightarrow X, Y are uncorrelated.
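The derivatives in this example can be carried out symbolically, assuming sympy is available (the symbol names are illustrative):

```python
import sympy as sp

w1, w2 = sp.symbols('omega1 omega2', real=True)
Phi = sp.exp(-2*w1**2 - 8*w2**2)      # the given joint characteristic function

at0 = {w1: 0, w2: 0}
m10 = (-sp.I) * sp.diff(Phi, w1).subs(at0)          # X̄
m01 = (-sp.I) * sp.diff(Phi, w2).subs(at0)          # Ȳ
m11 = (-sp.I)**2 * sp.diff(Phi, w1, w2).subs(at0)   # R_XY
print(m10, m01, m11)  # all zero, so C_XY = R_XY - X̄Ȳ = 0 as well
```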
Joint Characteristic function for N r.v.s
For r.v.s X_1, X_2, \cdots, X_N:

\Phi_{X_1,\cdots,X_N}(\omega_1, \cdots, \omega_N) = E[e^{j\omega_1 X_1 + \cdots + j\omega_N X_N}]

and the joint moments are obtained from

m_{n_1 n_2 \cdots n_N} = (-j)^{n_1 + \cdots + n_N} \left.\frac{\partial^{n_1 + \cdots + n_N} \Phi_{X_1,\cdots,X_N}(\omega_1, \omega_2, \cdots, \omega_N)}{\partial \omega_1^{n_1}\, \partial \omega_2^{n_2} \cdots \partial \omega_N^{n_N}}\right|_{all\ \omega_k = 0}
Example 5.2-2
Y = X_1 + X_2 + \cdots + X_N, where X_i, i = 1, 2, \cdots, N are statistically independent r.v.s with densities f_{X_i}(x_i) and characteristic functions \Phi_{X_i}(\omega_i). Then

\Phi_{X_1,\cdots,X_N}(\omega_1, \cdots, \omega_N) = \prod_{i=1}^{N} \Phi_{X_i}(\omega_i)
Jointly Gaussian Two R.V.s
f_{X,Y}(x,y) = \frac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho^2}} \exp\left\{ -\frac{ \left(\frac{x - \bar{X}}{\sigma_X}\right)^2 - 2\rho \left(\frac{x - \bar{X}}{\sigma_X}\right)\left(\frac{y - \bar{Y}}{\sigma_Y}\right) + \left(\frac{y - \bar{Y}}{\sigma_Y}\right)^2 }{2(1 - \rho^2)} \right\}

with \rho = \frac{C_{XY}}{\sigma_X \sigma_Y}.

The maximum occurs at x = \bar{X}, y = \bar{Y}, i.e.

f_{X,Y}(x,y) \leq f_{X,Y}(\bar{X}, \bar{Y}) = \frac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho^2}}
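The density and its peak value can be coded directly from the formula above; the parameter values below are arbitrary:

```python
import math

def bivariate_normal_pdf(x, y, mx, my, sx, sy, rho):
    """Jointly gaussian density f_{X,Y}(x, y) with means mx, my,
    standard deviations sx, sy, and correlation coefficient rho."""
    q = ((x - mx)**2 / sx**2
         - 2.0*rho*(x - mx)*(y - my)/(sx*sy)
         + (y - my)**2 / sy**2)
    return math.exp(-q / (2.0*(1.0 - rho**2))) / (2.0*math.pi*sx*sy*math.sqrt(1.0 - rho**2))

# maximum is at (mx, my) and equals 1 / (2π σ_X σ_Y sqrt(1 - ρ²))
peak = bivariate_normal_pdf(3.0, -1.0, 3.0, -1.0, 2.0, 1.5, 0.6)
```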
Jointly Gaussian Uncorrelated R.V.s are Independent
Case of uncorrelated X, Y, i.e. \rho = 0:

f_{X,Y}(x,y) = \frac{1}{2\pi \sigma_X \sigma_Y} \exp\left\{ -\frac{1}{2}\left[ \frac{(x - \bar{X})^2}{\sigma_X^2} + \frac{(y - \bar{Y})^2}{\sigma_Y^2} \right] \right\} = f_X(x)\, f_Y(y)

Uncorrelated gaussian X, Y \Rightarrow X, Y independent.
Can we remove correlation between 2 r.v.'s by a proper rotation \theta?

Ex. 5.3-1 For any r.v.s X_1, X_2, we can form two new r.v.s Y_1, Y_2 by rotating the axes by an angle \theta to make Y_1, Y_2 uncorrelated.

[Figure: the (X_1, X_2) axes rotated by angle \theta into new axes (Y_1, Y_2); a point has coordinates (x_1, x_2) = (y_1, y_2) in the two frames.]

Y_1 = X_1 \cos\theta + X_2 \sin\theta
Y_2 = -X_1 \sin\theta + X_2 \cos\theta

We want the \theta that makes C_{Y_1 Y_2} = 0.
C_{Y_1 Y_2} = \mu_{11} = E[(Y_1 - \bar{Y}_1)(Y_2 - \bar{Y}_2)]
= E[\{(X_1 - \bar{X}_1)\cos\theta + (X_2 - \bar{X}_2)\sin\theta\} \times \{-(X_1 - \bar{X}_1)\sin\theta + (X_2 - \bar{X}_2)\cos\theta\}]
= E[-(X_1 - \bar{X}_1)^2 \cos\theta \sin\theta + (X_2 - \bar{X}_2)^2 \sin\theta \cos\theta + (X_1 - \bar{X}_1)(X_2 - \bar{X}_2)(\cos^2\theta - \sin^2\theta)]
= -\sigma_{X_1}^2 \sin\theta \cos\theta + \sigma_{X_2}^2 \sin\theta \cos\theta + C_{X_1 X_2}\cos^2\theta - C_{X_1 X_2}\sin^2\theta
= -\tfrac{1}{2}(\sigma_{X_1}^2 - \sigma_{X_2}^2)\sin(2\theta) + C_{X_1 X_2}\cos(2\theta)

Setting C_{Y_1 Y_2} = 0 we get

\tfrac{1}{2}(\sigma_{X_1}^2 - \sigma_{X_2}^2)\sin(2\theta) = C_{X_1 X_2}\cos(2\theta) = \rho \sigma_{X_1}\sigma_{X_2}\cos(2\theta)

\tan(2\theta) = \frac{2\rho \sigma_{X_1}\sigma_{X_2}}{\sigma_{X_1}^2 - \sigma_{X_2}^2} \Rightarrow \theta = \frac{1}{2}\tan^{-1}\left(\frac{2\rho \sigma_{X_1}\sigma_{X_2}}{\sigma_{X_1}^2 - \sigma_{X_2}^2}\right)
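The angle can be computed and checked numerically. A sketch with illustrative parameters; atan2 is used so that the case σ_{X1} = σ_{X2}, where tan(2θ) blows up, is also handled:

```python
import numpy as np

s1, s2, rho = 2.0, 1.0, 0.7
C = np.array([[s1**2,     rho*s1*s2],
              [rho*s1*s2, s2**2    ]])            # covariance of (X1, X2)
theta = 0.5 * np.arctan2(2.0*rho*s1*s2, s1**2 - s2**2)
T = np.array([[ np.cos(theta), np.sin(theta)],    # the rotation above
              [-np.sin(theta), np.cos(theta)]])
C_rot = T @ C @ T.T                               # covariance of (Y1, Y2)
print(C_rot[0, 1])  # ≈ 0: the rotated variables are uncorrelated
```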
Jointly Gaussian N r.v.s X_1, X_2, \cdots, X_N

f_{X_1,\cdots,X_N}(x_1, \cdots, x_N) = \frac{|[C_X]^{-1}|^{1/2}}{(2\pi)^{N/2}} \exp\left\{ -\frac{1}{2}(x - \bar{X})^t [C_X]^{-1} (x - \bar{X}) \right\}

where

(x - \bar{X}) = \begin{bmatrix} x_1 - \bar{X}_1 \\ x_2 - \bar{X}_2 \\ \vdots \\ x_N - \bar{X}_N \end{bmatrix}, \quad C_X = \begin{bmatrix} C_{11} & C_{12} & \cdots & C_{1N} \\ C_{21} & C_{22} & \cdots & C_{2N} \\ \vdots & \vdots & \cdots & \vdots \\ C_{N1} & C_{N2} & \cdots & C_{NN} \end{bmatrix}

C_{ij} = E[(X_i - \bar{X}_i)(X_j - \bar{X}_j)] = \begin{cases} \sigma_{X_i}^2 & i = j \\ C_{X_i X_j} & i \neq j \end{cases}
Case of N = 2:

[C_X] = \begin{bmatrix} \sigma_{X_1}^2 & \rho\sigma_{X_1}\sigma_{X_2} \\ \rho\sigma_{X_1}\sigma_{X_2} & \sigma_{X_2}^2 \end{bmatrix}, \quad |[C_X]| = (1 - \rho^2)\sigma_{X_1}^2\sigma_{X_2}^2

[C_X]^{-1} = \frac{1}{1 - \rho^2} \begin{bmatrix} \frac{1}{\sigma_{X_1}^2} & \frac{-\rho}{\sigma_{X_1}\sigma_{X_2}} \\ \frac{-\rho}{\sigma_{X_1}\sigma_{X_2}} & \frac{1}{\sigma_{X_2}^2} \end{bmatrix}

f_{X_1,X_2}(x_1, x_2) = \frac{|[C_X]^{-1}|^{1/2}}{(2\pi)^{2/2}} \exp\left\{ -\frac{1}{2}(x - \bar{X})^t [C_X]^{-1} (x - \bar{X}) \right\}

One can verify that this reduces to

f_{X_1,X_2}(x_1, x_2) = \frac{1}{2\pi\sigma_{X_1}\sigma_{X_2}\sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{1 - \rho^2}\left[ \frac{(x_1 - \bar{X}_1)^2}{2\sigma_{X_1}^2} + \frac{(x_2 - \bar{X}_2)^2}{2\sigma_{X_2}^2} - \frac{2\rho(x_1 - \bar{X}_1)(x_2 - \bar{X}_2)}{2\sigma_{X_1}\sigma_{X_2}} \right] \right\}
Notes on Gaussian r.v.s
1. Only mean, variance, and covariance are needed to completely characterize
gaussian r.v.s.
2. Uncorrelated \Rightarrow statistically independent.
3. If X_i, i = 1, 2, \cdots, n are gaussian, then \sum_{i=1}^{n} a_i X_i is gaussian.
4. Any k-dimensional marginal density is also gaussian.
5. The conditional density is also gaussian, i.e., f_{X_1,\cdots,X_k}(x_1, \cdots, x_k \mid \{X_{k+1} = x_{k+1}, \cdots, X_N = x_N\}) is gaussian.
Linear Transformation of Multiple r.v.s
Y = TX

where Y is an N \times 1 vector, T is an N \times N matrix, and X is an N \times 1 vector.

E[Y] = T E[X]

E[YY^t] = E[TXX^t T^t] \Rightarrow R_Y = T R_X T^t

Also E[(Y - \bar{Y})(Y - \bar{Y})^t] = E[T(X - \bar{X})(X - \bar{X})^t T^t], from which we get

C_Y = T C_X T^t
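These moment relations can be confirmed by sampling a transformed gaussian vector (the particular T, mean, and covariance are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
T = np.array([[2.0,  1.0],
              [1.0, -1.0]])
mean_x = np.array([1.0, 5.0])
C_x = np.array([[1.0, 0.5],
                [0.5, 4.0]])
X = rng.multivariate_normal(mean_x, C_x, size=200_000).T   # columns are samples
Y = T @ X
mean_y = Y.mean(axis=1)   # ≈ T @ mean_x
C_y = np.cov(Y)           # ≈ T @ C_x @ T.T
```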
Ex. 5.5-1 Transformation of Multiple r.v.s

Gaussian X_1 \sim N(0, 4), X_2 \sim N(0, 9), C_{X_1 X_2} = 3. Let Y_1 = X_1 - 2X_2, Y_2 = 3X_1 + 4X_2. Find the means, variances, and covariance of Y_1 and Y_2.

E[Y_1] = E[X_1] - 2E[X_2] = 0, \quad E[Y_2] = 0

E[Y_1^2] = E[X_1^2] - 4C_{X_1 X_2} + 4E[X_2^2] = 4 - 4 \cdot 3 + 4 \cdot 9 = 28

E[Y_2^2] = 9E[X_1^2] + 24C_{X_1 X_2} + 16E[X_2^2] = 9 \cdot 4 + 24 \cdot 3 + 16 \cdot 9 = 252

E[Y_1 Y_2] = E[3X_1^2 - 2X_1 X_2 - 8X_2^2] = 3 \cdot 4 - 2 \cdot 3 - 8 \cdot 9 = -66
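Since X_1, X_2 are zero-mean here, the requested second moments are just the entries of C_Y = T C_X T^t, which reproduces the numbers above:

```python
import numpy as np

C_x = np.array([[4.0, 3.0],     # Var(X1) = 4, C_{X1X2} = 3
                [3.0, 9.0]])    # Var(X2) = 9
T = np.array([[1.0, -2.0],      # Y1 = X1 - 2 X2
              [3.0,  4.0]])     # Y2 = 3 X1 + 4 X2
C_y = T @ C_x @ T.T             # zero-mean, so second moments = covariances
print(C_y)  # [[28, -66], [-66, 252]]
```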
Example
Gaussian random vector X \sim N(\mu, C_X) with

\mu = \begin{bmatrix} 1 \\ 5 \\ 2 \end{bmatrix}, \quad C_X = \begin{bmatrix} 1 & 1 & 0 \\ 1 & 4 & 0 \\ 0 & 0 & 9 \end{bmatrix}

Find:

1. pdf of X_1: any marginal of jointly gaussian is gaussian, so X_1 \sim N(1, 1).

2. pdf of X_2 + X_3: here C_{23} = C_{32} = 0 \Rightarrow X_2, X_3 uncorrelated; since gaussian \Rightarrow independent. The sum of two jointly gaussian r.v.s is also gaussian; the means add and the variances add, so X_2 + X_3 \sim N(7, 13).

3. pdf of 2X_1 + X_2 + X_3: a linear combination of gaussian r.v.s, i.e.
2X_1 + X_2 + X_3 = [2, 1, 1] \begin{bmatrix} X_1 \\ X_2 \\ X_3 \end{bmatrix}

mean = 2\bar{X}_1 + \bar{X}_2 + \bar{X}_3 = 2(1) + 1(5) + 1(2) = 9

\sigma^2 = [2, 1, 1] \begin{bmatrix} 1 & 1 & 0 \\ 1 & 4 & 0 \\ 0 & 0 & 9 \end{bmatrix} \begin{bmatrix} 2 \\ 1 \\ 1 \end{bmatrix} = [3, 6, 9] \begin{bmatrix} 2 \\ 1 \\ 1 \end{bmatrix} = 21

2X_1 + X_2 + X_3 \sim N(9, 21)

4. pdf of X_3 \mid (X_1, X_2) = f(x_3 \mid x_1, x_2) = ?
C_{23} = C_{13} = 0 \Rightarrow X_3, X_2 statistically independent and X_3, X_1 statistically independent, \Rightarrow f(x_3 \mid x_1, x_2) = f(x_3) \Rightarrow X_3 \mid (X_1, X_2) \sim N(2, 9)

5. P\{2X_1 + X_2 + X_3 < 0\} = ?
Y = 2X_1 + X_2 + X_3 \sim N(9, 21), as in the previous part.
P\{Y < 0\} = \Phi\left(\frac{0 - \bar{Y}}{\sigma_Y}\right) = \Phi\left(\frac{-9}{\sqrt{21}}\right) = \Phi(-1.96) = 1 - \Phi(1.96) = 0.0248

6. Y = TX, with T = \begin{bmatrix} 2 & 1 & 1 \\ 1 & -1 & 1 \end{bmatrix}

\bar{Y} = T\bar{X} = \begin{bmatrix} 2 & 1 & 1 \\ 1 & -1 & 1 \end{bmatrix} \begin{bmatrix} 1 \\ 5 \\ 2 \end{bmatrix} = \begin{bmatrix} 9 \\ -2 \end{bmatrix}

C_Y = T C_X T^t = \begin{bmatrix} 2 & 1 & 1 \\ 1 & -1 & 1 \end{bmatrix} \begin{bmatrix} 1 & 1 & 0 \\ 1 & 4 & 0 \\ 0 & 0 & 9 \end{bmatrix} \begin{bmatrix} 2 & 1 \\ 1 & -1 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 21 & 6 \\ 6 & 12 \end{bmatrix}

hence Y \sim N(\bar{Y}, C_Y).
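The probability in part 5 can be evaluated with the standard normal CDF written in terms of the error function (no lookup table needed):

```python
import math

def norm_cdf(z):
    """Standard normal CDF Φ(z) via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p = norm_cdf((0.0 - 9.0) / math.sqrt(21.0))   # P{Y < 0} for Y ~ N(9, 21)
print(p)  # ≈ 0.0248
```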
Sampling and Some Limit Theorems
• Sampling and Estimation

• Estimation of Mean, Power, and Variance: Given N samples x_n representing values of independent (at least pair-wise), identically distributed r.v.s X_n, n = 1, 2, \cdots, N. Define the r.v. \hat{\bar{X}}_N = \frac{1}{N}\sum_{n=1}^{N} x_n. Its mean is

E[\hat{\bar{X}}_N] = \frac{1}{N}\sum_{n=1}^{N} E[X_n] = \bar{X}, \quad for any N

This is an unbiased estimator (the mean of the estimate equals the mean of the r.v.), and its variance is

\sigma_{\hat{\bar{X}}_N}^2 = E[(\hat{\bar{X}}_N - \bar{X})^2] = E[\hat{\bar{X}}_N^2 - 2\bar{X}\hat{\bar{X}}_N + \bar{X}^2]
\sigma_{\hat{\bar{X}}_N}^2 = -\bar{X}^2 + E\left[\frac{1}{N}\sum_{n=1}^{N} X_n \cdot \frac{1}{N}\sum_{m=1}^{N} X_m\right] = -\bar{X}^2 + \frac{1}{N^2}\sum_{n=1}^{N}\sum_{m=1}^{N} E[X_n X_m]

Since X_n and X_m are pairwise independent and identically distributed,

E[X_n X_m] = \begin{cases} \bar{X}^2 & for\ m \neq n \\ E[X^2] & for\ m = n \end{cases}

\Rightarrow \sum_{n=1}^{N}\sum_{m=1}^{N} E[X_n X_m] = N E[X^2] + (N^2 - N)\bar{X}^2
Hence

\sigma_{\hat{\bar{X}}_N}^2 = -\bar{X}^2 + \frac{1}{N^2}\left[N E[X^2] + (N^2 - N)\bar{X}^2\right] = \frac{1}{N}\left[E[X^2] - \bar{X}^2\right] = \frac{\sigma_X^2}{N}

hence \sigma_{\hat{\bar{X}}_N}^2 \to 0 as N \to \infty.

Using Chebyshev's inequality,

P\{|\hat{\bar{X}}_N - \bar{X}| < \epsilon\} \geq 1 - \frac{\sigma_{\hat{\bar{X}}_N}^2}{\epsilon^2} = 1 - \frac{\sigma_X^2}{N\epsilon^2} \to 1 \quad as\ N \to \infty

Consistent estimator: \hat{\bar{X}}_N \to \bar{X} with probability 1 as N \to \infty.

• Weak Law of Large Numbers:

\lim_{N \to \infty} P\{|\hat{\bar{X}}_N - \bar{X}| < \epsilon\} = 1, \quad for any \epsilon > 0

• Strong Law of Large Numbers:

P\left\{\lim_{N \to \infty} \hat{\bar{X}}_N = \bar{X}\right\} = 1
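The 1/N decay of the estimator variance is easy to see empirically (the distribution, seed, and sample counts are illustrative):

```python
import numpy as np

rng = np.random.default_rng(11)
sigma2 = 4.0   # true variance σ_X² of each sample

def var_of_sample_mean(N, trials=5_000):
    """Empirical variance of the estimator X̄̂_N over many independent trials."""
    means = rng.normal(0.0, np.sqrt(sigma2), size=(trials, N)).mean(axis=1)
    return means.var()

v10, v1000 = var_of_sample_mean(10), var_of_sample_mean(1000)
print(v10, v1000)   # ≈ σ²/10 = 0.4 and ≈ σ²/1000 = 0.004
```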
Complex Random Variables
Z = X + jY, with X, Y real r.v.s.

E[g(Z)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(z)\, f_{X,Y}(x,y)\, dx\, dy

\bar{Z} = \bar{X} + j\bar{Y}

\sigma_Z^2 = E[|Z - E[Z]|^2]

For two complex r.v.s Z_m, Z_n, the joint pdf is f_{X_m,Y_m,X_n,Y_n}(x_m, y_m, x_n, y_n).
If f_{X_m,Y_m,X_n,Y_n}(x_m, y_m, x_n, y_n) = f_{X_m,Y_m}(x_m, y_m)\, f_{X_n,Y_n}(x_n, y_n), then Z_m, Z_n are statistically independent.
This extends to N r.v.s.
Complex vars.: Correlation, Covariance, Independence, Orthogonal
• Correlation: R_{Z_m Z_n} = E[Z_m^* Z_n], n \neq m

• Covariance: C_{Z_m Z_n} = E[(Z_m - \bar{Z}_m)^* (Z_n - \bar{Z}_n)], n \neq m

• Uncorrelated complex r.v.s: R_{Z_m Z_n} = E[Z_m^*] E[Z_n], n \neq m, \Rightarrow C_{Z_m Z_n} = 0

• Independence \Rightarrow uncorrelated

• Orthogonal: R_{Z_m Z_n} = E[Z_m^* Z_n] = 0
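A numerical sketch of these definitions (the construction of Z1 and Z2, which share a common real part so that the covariance is ≈ 1, is illustrative):

```python
import numpy as np

rng = np.random.default_rng(13)
n = 200_000
x = rng.normal(size=n)                      # common real part
z1 = x + 1j * rng.normal(size=n)
z2 = x - 1j * rng.normal(size=n)
r12 = np.mean(np.conj(z1) * z2)             # R_{Z1 Z2} = E[Z1* Z2]
c12 = r12 - np.conj(z1.mean()) * z2.mean()  # C_{Z1 Z2} = R_{Z1 Z2} - Z̄1* Z̄2
print(c12)  # ≈ E[X²] = 1: the shared real part is what correlates Z1 and Z2
```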
Summary
This chapter extends Chapter 3 to multiple random variables. The topics extended were:

Expected values of functions of multiple random variables were developed, including joint moments about the origin and joint central moments, as well as joint characteristic functions, which are useful for finding moments. New moments of special interest were correlation and covariance.

Multiple jointly gaussian random variables were defined.

Transformation results were used to show that linear transformation of jointly gaussian random variables is especially important, as it produces random variables that are also jointly gaussian.

Some new material on the basics of sampling and estimation of mean, power, and variance was given.

Complex random variables and their characteristics were defined.