Properties of the Distribution Function
1. $F_X(-\infty) = 0$
2. $F_X(\infty) = 1$
3. $0 \le F_X(x) \le 1$
4. $F_X(x_1) \le F_X(x_2)$ if $x_1 \le x_2$
5. $P\{x_1 < X \le x_2\} = F_X(x_2) - F_X(x_1)$
6. $F_X(x^+) = F_X(x)$
For a discrete random variable the distribution function is a sum of step functions, e.g.
$$F_X(x) = P(X=0)\,u(x) + P(X=1)\,u(x-1) + P(X=2)\,u(x-2) + P(X=3)\,u(x-3)$$
where $u(\cdot)$ denotes the unit step function.
Density Function
We define the derivative of the distribution function $F_X(x)$ as the probability density function $f_X(x)$:
$$f_X(x) = \frac{dF_X(x)}{dx}$$
For a discrete random variable,
$$f_X(x) = \frac{d}{dx}\sum_{i=1}^{N} P(X = x_i)\,u(x - x_i) = \sum_{i=1}^{N} P(x_i)\,\delta(x - x_i)$$
Properties of Density Function
1. $f_X(x) \ge 0$ for all $x$
2. $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$
3. $F_X(x) = \int_{-\infty}^{x} f_X(\xi)\,d\xi$
4. $P\{x_1 < X \le x_2\} = \int_{x_1}^{x_2} f_X(x)\,dx$
Binomial
Let $0 < p < 1$ and $N = 1, 2, \ldots$; then the function
$$f_X(x) = \sum_{k=0}^{N} \binom{N}{k} p^k (1-p)^{N-k}\,\delta(x-k)$$
is called the binomial density function, and $p$ is the parameter of the distribution. (The accompanying figure plots the case $N = 6$, $p = 0.25$.)
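The weights above are simply the binomial probabilities, so they can be sanity-checked numerically. A minimal sketch, assuming Python with scipy is available; $N = 6$ and $p = 0.25$ are the figure's values:

```python
# Check the binomial weights (N choose k) p^k (1-p)^(N-k) against
# scipy's pmf for the plotted parameters N = 6, p = 0.25.
from scipy.stats import binom
from scipy.special import comb

N, p = 6, 0.25
for k in range(N + 1):
    manual = comb(N, k) * p**k * (1 - p)**(N - k)
    assert abs(manual - binom.pmf(k, N, p)) < 1e-12
print(sum(binom.pmf(k, N, p) for k in range(N + 1)))  # weights sum to 1.0
```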
Poisson
We say $X$ follows a Poisson distribution with parameter $b$ (the average rate; our book uses $b$ for this parameter):
$$f_X(x) = e^{-b} \sum_{k=0}^{\infty} \frac{b^k}{k!}\,\delta(x-k)$$
$$F_X(x) = e^{-b} \sum_{k=0}^{\infty} \frac{b^k}{k!}\,u(x-k)$$
Because the sum runs over all $k \ge 0$, there is in principle an infinite number of probabilities to calculate.

Example: births at a hospital occur at an average rate of 1.8 births/hour. What is the probability of observing no births ($X = 0$) in a given hour at the hospital?
$$P(X = 0) = e^{-1.8}\,\frac{1.8^0}{0!} = e^{-1.8} \approx 0.165$$
If $Y$ counts the births over a two-hour interval at the same rate of 1.8 births/hour, then $Y$ is Poisson with parameter $2 \times 1.8 = 3.6$.
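A quick numerical check of the hospital example, assuming plain Python (standard library only):

```python
# P(X = 0) for b = 1.8 births/hour, and the two-hour count Y,
# which is Poisson with parameter 2 * 1.8 = 3.6.
import math

b = 1.8
p0 = math.exp(-b) * b**0 / math.factorial(0)
print(f"P(X = 0) = {p0:.3f}")        # -> 0.165, matching the notes

b2 = 3.6                             # rate for a two-hour window
print(f"P(Y = 0) = {math.exp(-b2):.3f}")
```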
The Gaussian Random Variable
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma_X^2}}\, e^{-(x - a_X)^2 / 2\sigma_X^2}, \qquad X \sim N(a_X, \sigma_X^2)$$
Its distribution function is expressed through the standard normal distribution $F(\cdot)$, which is tabulated:
$$F_X(x) = F\!\left(\frac{x - a_X}{\sigma_X}\right)$$
Uniform distribution
$$f_X(x) = \begin{cases} \dfrac{1}{b-a}, & a \le x \le b \\ 0, & \text{elsewhere} \end{cases}$$
Exponential distribution
$$f_X(x) = \begin{cases} \dfrac{1}{b}\, e^{-(x-a)/b}, & x \ge a \\ 0, & x < a \end{cases}$$
The Rayleigh distribution
For real constants $a$ and $b > 0$:
$$f_X(x) = \begin{cases} \dfrac{2}{b}(x-a)\, e^{-(x-a)^2/b}, & x \ge a \\ 0, & x < a \end{cases}$$
$$F_X(x) = \begin{cases} 1 - e^{-(x-a)^2/b}, & x \ge a \\ 0, & x < a \end{cases}$$
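Each of the densities above must integrate to one (property 2 of a density). A minimal numerical sketch, assuming scipy; the constants $a = 1$ and $b = 2$ are illustrative choices, not values from the notes:

```python
# Numerically verify unit area for the uniform, exponential, and
# Rayleigh densities defined above (a = 1, b = 2 chosen for illustration).
import numpy as np
from scipy.integrate import quad

a, b = 1.0, 2.0
uniform     = lambda x: 1.0 / (b - a) if a <= x <= b else 0.0
exponential = lambda x: np.exp(-(x - a) / b) / b if x >= a else 0.0
rayleigh    = lambda x: (2.0 / b) * (x - a) * np.exp(-(x - a)**2 / b) if x >= a else 0.0

for name, f in [("uniform", uniform), ("exponential", exponential), ("rayleigh", rayleigh)]:
    # densities vanish below a; the tail beyond 60 is negligible here
    area, _ = quad(f, a, 60.0, points=[b])
    print(name, round(area, 6))   # each -> 1.0
```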
Conditional Distribution and Density Functions
Recall that
$$P(A|B) = \frac{P(A \cap B)}{P(B)}$$
Conditional Distribution
Let $X$ be a random variable and define the event $A = \{X \le x\}$. We define the conditional distribution function $F_X(x|B)$ as
$$F_X(x|B) = P\{X \le x\,|\,B\} = \frac{P\{X \le x \cap B\}}{P(B)}$$
Properties of Conditional Distribution
(1) $F_X(-\infty|B) = 0$
(2) $F_X(\infty|B) = 1$
(3) $0 \le F_X(x|B) \le 1$
(4) $F_X(x_1|B) \le F_X(x_2|B)$ if $x_1 < x_2$
(5) $P\{x_1 < X \le x_2\,|\,B\} = F_X(x_2|B) - F_X(x_1|B)$
(6) $F_X(x^+|B) = F_X(x|B)$
Conditional Density Functions
$$f_X(x|B) = \frac{dF_X(x|B)}{dx}$$
Properties of Conditional Density
(1) $f_X(x|B) \ge 0$
(2) $\int_{-\infty}^{\infty} f_X(x|B)\,dx = 1$
(3) $F_X(x|B) = \int_{-\infty}^{x} f_X(\xi|B)\,d\xi$
(4) $P\{x_1 < X \le x_2\,|\,B\} = \int_{x_1}^{x_2} f_X(x|B)\,dx$
For the event $B = \{X \le b\}$ the conditional distribution is
$$F_X(x|X \le b) = \begin{cases} \dfrac{F_X(x)}{F_X(b)}, & x < b \\[4pt] 1, & x \ge b \end{cases}$$
so that $F_X(x|X \le b) \ge F_X(x)$. The conditional density function derives from the derivative
$$f_X(x|X \le b) = \frac{dF_X(x|X \le b)}{dx}$$
Similarly for the conditional density function, $f_X(x|X \le b) \ge f_X(x)$, with
$$f_X(x|X \le b) = \begin{cases} \dfrac{f_X(x)}{F_X(b)} = \dfrac{f_X(x)}{\int_{-\infty}^{b} f_X(x)\,dx}, & x < b \\[4pt] 0, & x \ge b \end{cases}$$
Example 8: Let X be a random variable with an exponential probability density function given as
$$f_X(x) = \begin{cases} e^{-x}, & x \ge 0 \\ 0, & x < 0 \end{cases}$$
Find the probability $P(X < 1 \,|\, X \le 2)$.
Since
$$f_X(x|X \le 2) = \begin{cases} \dfrac{f_X(x)}{\int_0^2 f_X(x)\,dx} = \dfrac{e^{-x}}{1 - e^{-2}}, & x \le 2 \\[4pt] 0, & x > 2 \end{cases}$$
we have
$$P(X < 1 \,|\, X \le 2) = \int_0^1 f_X(x|X \le 2)\,dx = \frac{\int_0^1 e^{-x}\,dx}{1 - e^{-2}} = \frac{1 - e^{-1}}{1 - e^{-2}} \approx 0.7310$$
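The result can be checked both in closed form and by simulation; a sketch assuming numpy is available:

```python
# Example 8 check: P(X < 1 | X <= 2) = (1 - e^-1) / (1 - e^-2) ~ 0.7310
import math
import numpy as np

print((1 - math.exp(-1)) / (1 - math.exp(-2)))   # closed form -> 0.7310

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)   # f_X(x) = e^{-x}, x >= 0
kept = x[x <= 2]                                 # condition on the event {X <= 2}
print((kept < 1).mean())                         # relative frequency -> ~0.731
```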
Ch3: Operations on One Random Variable - Expectation
Expected value of a random variable
$$E[X] = \bar{X} = \sum_{x_i \in S_X} x_i\, P(x_i) \qquad \text{if } X \text{ is discrete}$$
$$E[X] = \bar{X} = \int_{-\infty}^{\infty} x\, f_X(x)\,dx \qquad \text{if } X \text{ is continuous with density } f_X(x)$$
Expected value of a Function of a random variable
$$E[g(X)] = \sum_{i=1}^{N} g(x_i)\, P(x_i) \qquad \text{(discrete)}$$
$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\,dx \qquad \text{(continuous)}$$
Conditional Expectation
We defined the conditional density function for the event $B = \{X \le b\}$ as
$$f_X(x|X \le b) = \begin{cases} \dfrac{f_X(x)}{\int_{-\infty}^{b} f_X(x)\,dx}, & x < b \\[4pt] 0, & x \ge b \end{cases}$$
We now define the conditional expectation in a similar manner:
$$E[X|B] = \int_{-\infty}^{\infty} x\, f_X(x|B)\,dx = \int_{-\infty}^{b} x\, f_X(x|B)\,dx + \int_{b}^{\infty} x\, f_X(x|B)\,dx$$
$$= \int_{-\infty}^{b} x\,\frac{f_X(x)}{\int_{-\infty}^{b} f_X(x)\,dx}\,dx + \int_{b}^{\infty} x \cdot 0\,dx = \frac{\int_{-\infty}^{b} x\, f_X(x)\,dx}{\int_{-\infty}^{b} f_X(x)\,dx}$$
where the denominator $\int_{-\infty}^{b} f_X(x)\,dx = F_X(b)$ is a constant.
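As a concrete instance of the last formula, the two integrals can be evaluated numerically for the exponential density of Example 8. A sketch assuming scipy; $b = 2$ is an illustrative cutoff:

```python
# E[X | X <= b] for f_X(x) = e^{-x}, x >= 0, as the ratio of
# int_0^b x f_X(x) dx to int_0^b f_X(x) dx  (here b = 2).
import numpy as np
from scipy.integrate import quad

b = 2.0
num, _ = quad(lambda x: x * np.exp(-x), 0, b)   # numerator integral
den, _ = quad(lambda x: np.exp(-x), 0, b)       # denominator, F_X(b)
print(num / den)                                # -> about 0.687, below E[X] = 1
```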
Moments
The expected value was defined previously as $E[X] = \bar{X} = \int_{-\infty}^{\infty} x\, f_X(x)\,dx$.
We can define the $n$th moment (about the origin) $m_n$ as
$$m_n = E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$$
$$m_1 = E[X^1] = \int_{-\infty}^{\infty} x\, f_X(x)\,dx = \bar{X} \qquad \text{(the expected value of } X\text{)}$$
We define the $n$th moment about the mean (central moment) as
$$\mu_n = E[(X - \bar{X})^n] = \int_{-\infty}^{\infty} (x - \bar{X})^n f_X(x)\,dx$$
$$\mu_0 = E[(X - \bar{X})^0] = E[1] = \int_{-\infty}^{\infty} f_X(x)\,dx = 1 \qquad \text{(the area of } f_X(x)\text{)}$$
$$\mu_1 = E[(X - \bar{X})^1] = E[X] - E[\bar{X}] = \bar{X} - \bar{X} = 0$$
where we have used the fact that $E[a] = a$ for a constant $a$.
For a discrete random variable the moments about the origin are
$$m_n = E[X^n] = \sum_{i=1}^{N} x_i^n\, P(X = x_i), \qquad m_1 = E[X] = \bar{X}$$
and the moments about the mean (called central moments) are
$$\mu_n = E[(X - \bar{X})^n] = \sum_{i=1}^{N} (x_i - \bar{X})^n\, P(X = x_i)$$
with the continuous counterparts $m_n = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$ and $\mu_n = \int_{-\infty}^{\infty} (x - \bar{X})^n f_X(x)\,dx$.
Thus the variance is given by
$$\sigma_X^2 = \mu_2 = E[(X - \bar{X})^2] = \int_{-\infty}^{\infty} (x - \bar{X})^2 f_X(x)\,dx = E[X^2] - \bar{X}^2 = m_2 - m_1^2$$
Properties of the variance
(1) $\sigma_c^2 = 0$, where $c$ is a constant
(2) $\sigma_{X+c}^2 = \sigma_X^2$ (the variance does not change by shifting)
(3) $\sigma_{cX}^2 = c^2 \sigma_X^2$
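These three properties are easy to see on sample data; a sketch assuming numpy (the equalities hold exactly only in the limit of many samples):

```python
# Illustrate the variance properties with 10^6 samples; c = 5 is arbitrary.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1_000_000)
c = 5.0

print(np.var(np.full(100, c)))      # (1) a constant has zero variance
print(np.var(x), np.var(x + c))     # (2) shifting by c changes nothing
print(np.var(c * x) / np.var(x))    # (3) scaling by c multiplies variance by c^2 -> 25
```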
3.3 Functions that Give Moments
The characteristic function $\Phi_X(\omega)$ is the Fourier transform of $f_X(x)$ with the sign of $\omega$ reversed:
$$\Phi_X(\omega) = E[e^{j\omega X}] = \int_{-\infty}^{\infty} f_X(x)\, e^{j\omega x}\,dx$$
and $f_X(x)$ is recovered by the inverse Fourier transform
$$f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} \Phi_X(\omega)\, e^{-j\omega x}\,d\omega$$
The moments follow from the derivatives at the origin:
$$m_n = (-j)^n \left. \frac{d^n \Phi_X(\omega)}{d\omega^n} \right|_{\omega = 0}$$
Example: Let X be a random variable with an exponential probability density function given as
$$f_X(x) = \begin{cases} e^{-x}, & x \ge 0 \\ 0, & x < 0 \end{cases}$$
Directly,
$$m_1 = E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\,dx = \int_0^{\infty} x\, e^{-x}\,dx = 1$$
Now let us find the 1st moment (expected value) using the characteristic function:
$$\Phi_X(\omega) = \int_0^{\infty} e^{j\omega x}\, e^{-x}\,dx = \text{Fourier transform of } e^{-x} \text{ (sign reversed)} = \frac{1}{1 - j\omega}$$
$$m_1 = (-j) \left. \frac{d\Phi_X(\omega)}{d\omega} \right|_{\omega = 0}, \qquad \frac{d}{d\omega}\left(\frac{1}{1 - j\omega}\right) = \frac{j}{(1 - j\omega)^2}$$
$$m_1 = (-j)\,\frac{j}{(1 - j(0))^2} = \frac{-j^2}{1} = 1$$
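The same differentiation can be carried out symbolically; a sketch assuming sympy:

```python
# Recover m1 from Phi_X(omega) = 1 / (1 - j*omega) via
# m1 = (-j) dPhi/domega evaluated at omega = 0.
import sympy as sp

w = sp.symbols('omega', real=True)
Phi = 1 / (1 - sp.I * w)
m1 = (-sp.I * sp.diff(Phi, w)).subs(w, 0)
print(sp.simplify(m1))   # -> 1, matching the direct integration
```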
3.4 Transformations of a Random Variable
For a monotonic transformation $Y = T(X)$,
$$f_Y(y) = f_X(x) \left| \frac{dx}{dy} \right|_{x = T^{-1}(y)} \quad\text{or equivalently}\quad f_Y(y) = f_X\!\left(T^{-1}(y)\right) \left| \frac{dT^{-1}(y)}{dy} \right|$$
Nonmonotonic Transformations of a Continuous Random Variable
When $T$ is not monotonic, sum over all roots $x_n$ of $y = T(x)$:
$$f_Y(y) = \sum_n \frac{f_X(x_n)}{\left| \dfrac{dT(x)}{dx} \right|_{x = x_n}}$$
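As a worked instance of the nonmonotonic formula, take $Y = X^2$ with $X$ standard Gaussian: the roots are $x = \pm\sqrt{y}$ and $|dT/dx| = 2\sqrt{y}$. A sketch assuming numpy, comparing the formula against a histogram of simulated $Y$:

```python
# Y = X^2, X ~ N(0,1): f_Y(y) = [f_X(sqrt(y)) + f_X(-sqrt(y))] / (2 sqrt(y))
import numpy as np

def f_X(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def f_Y(y):                        # sum over the two roots x_n = +/- sqrt(y)
    r = np.sqrt(y)
    return (f_X(r) + f_X(-r)) / (2 * r)

rng = np.random.default_rng(2)
y = rng.normal(size=1_000_000) ** 2
hist, edges = np.histogram(y, bins=200)
dens = hist / (y.size * np.diff(edges))   # empirical density over full support

for yv in (0.5, 1.0, 2.0):
    i = np.searchsorted(edges, yv) - 1
    print(yv, round(dens[i], 3), round(f_Y(yv), 3))   # empirical vs formula
```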
Ch4: Multiple Random Variables
Joint Distribution and its Properties
For two discrete random variables taking six values each:
$$F_{X,Y}(x,y) = P(X \le x,\, Y \le y) = \sum_{n=1}^{6} \sum_{m=1}^{6} P(x_n, y_m)\, u(x - x_n)\, u(y - y_m)$$
$$f_{X,Y}(x,y) = \sum_{n=1}^{6} \sum_{m=1}^{6} P(x_n, y_m)\, \delta(x - x_n)\, \delta(y - y_m)$$
In general,
$$F_{X,Y}(x,y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{X,Y}(\xi_1, \xi_2)\,d\xi_1\,d\xi_2, \qquad f_{X,Y}(x,y) = \frac{\partial^2 F_{X,Y}(x,y)}{\partial x\,\partial y}$$
Properties of the joint distribution
(1) $F_{X,Y}(-\infty, -\infty) = 0$, $\;F_{X,Y}(-\infty, y) = 0$, $\;F_{X,Y}(x, -\infty) = 0$
(2) $F_{X,Y}(\infty, \infty) = 1$
(3) $0 \le F_{X,Y}(x,y) \le 1$
(4) $F_{X,Y}(x,y)$ is a nondecreasing function of both $x$ and $y$
(5) $P\{x_1 < X \le x_2,\, y_1 < Y \le y_2\} = F_{X,Y}(x_2,y_2) - F_{X,Y}(x_1,y_2) - F_{X,Y}(x_2,y_1) + F_{X,Y}(x_1,y_1) \ge 0$
(6) Marginal distribution functions: $F_{X,Y}(x, \infty) = F_X(x)$ and $F_{X,Y}(\infty, y) = F_Y(y)$
Joint Density and its Properties
$$f_{X,Y}(x,y) = \frac{\partial^2 F_{X,Y}(x,y)}{\partial x\,\partial y}, \qquad F_{X,Y}(x,y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{X,Y}(\xi_1, \xi_2)\,d\xi_1\,d\xi_2$$
Properties of the Joint Density
(1) $f_{X,Y}(x,y) \ge 0$
(2) $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx\,dy = 1$
(3) $F_{X,Y}(x,y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{X,Y}(\xi_1, \xi_2)\,d\xi_1\,d\xi_2$
(4) Marginal distributions: $F_X(x) = \int_{-\infty}^{x} \int_{-\infty}^{\infty} f_{X,Y}(\xi_1, \xi_2)\,d\xi_2\,d\xi_1$ and $F_Y(y) = \int_{-\infty}^{y} \int_{-\infty}^{\infty} f_{X,Y}(\xi_1, \xi_2)\,d\xi_1\,d\xi_2$
(5) $P\{x_1 < X \le x_2,\, y_1 < Y \le y_2\} = \int_{y_1}^{y_2} \int_{x_1}^{x_2} f_{X,Y}(x,y)\,dx\,dy$
(6) Marginal densities: $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy$ and $f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx$
Properties (1) and (2) may be used as a sufficient test to determine whether some function can be a valid density function.
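Properties 2 and 6 are easy to exercise numerically. A sketch assuming scipy; the joint density $f(x,y) = x + y$ on the unit square is an illustrative choice (it is nonnegative and has unit volume):

```python
# Check property 2 (unit volume) and property 6 (marginal by integrating
# out y) for the illustrative joint density f(x,y) = x + y on [0,1]^2.
from scipy.integrate import quad, dblquad

total, _ = dblquad(lambda y, x: x + y, 0, 1, 0, 1)   # dblquad integrand takes (y, x)
print(round(total, 6))                               # -> 1.0

x0 = 0.3
fX, _ = quad(lambda y: x0 + y, 0, 1)                 # f_X(x0) = int f(x0, y) dy
print(fX, x0 + 0.5)                                  # both -> 0.8, i.e. f_X(x) = x + 1/2
```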
Conditional Distribution and Density
The conditional distribution function of a random variable X given some event B was defined as
$$F_X(x|B) = P\{X \le x\,|\,B\} = \frac{P\{X \le x \cap B\}}{P(B)}, \qquad \text{where } P(B) \ne 0$$
The corresponding conditional density function was defined through the derivative
$$f_X(x|B) = \frac{dF_X(x|B)}{dx}$$
(1) X and Y are Discrete
$$F_X(x|Y = y_k) = \sum_{i=1}^{N} \frac{P(x_i, y_k)}{P(y_k)}\, u(x - x_i)$$
$$f_X(x|Y = y_k) = \frac{dF_X(x|Y = y_k)}{dx} = \sum_{i=1}^{N} \frac{P(x_i, y_k)}{P(y_k)}\, \delta(x - x_i)$$
(2) X and Y are Continuous
$$F_X(x|Y = y) = \frac{\int_{-\infty}^{x} f_{X,Y}(\xi_1, y)\,d\xi_1}{f_Y(y)}$$
$$f_X(x|Y = y) = \frac{dF_X(x|Y = y)}{dx} = \frac{f_{X,Y}(x,y)}{f_Y(y)}$$
for every $y$ such that $f_Y(y) \ne 0$.
STATISTICAL INDEPENDENCE
Random variables are statistically independent when their joint distributions and densities factor:
$$F_{X_i X_j}(x_i, x_j) = F_{X_i}(x_i)\, F_{X_j}(x_j)$$
$$F_{X_i X_j X_k}(x_i, x_j, x_k) = F_{X_i}(x_i)\, F_{X_j}(x_j)\, F_{X_k}(x_k)$$
$$F_{X_1 X_2 \cdots X_N}(x_1, x_2, \ldots, x_N) = F_{X_1}(x_1)\, F_{X_2}(x_2) \cdots F_{X_N}(x_N)$$
$$f_{X_i X_j}(x_i, x_j) = f_{X_i}(x_i)\, f_{X_j}(x_j)$$
$$f_{X_i X_j X_k}(x_i, x_j, x_k) = f_{X_i}(x_i)\, f_{X_j}(x_j)\, f_{X_k}(x_k)$$
$$f_{X_1 X_2 \cdots X_N}(x_1, x_2, \ldots, x_N) = f_{X_1}(x_1)\, f_{X_2}(x_2) \cdots f_{X_N}(x_N)$$
We seek the distribution or density of $W = X + Y$:
$$F_W(w) = P\{W \le w\} = P\{X + Y \le w\} = \int_{-\infty}^{\infty} \int_{-\infty}^{w-y} f_{X,Y}(x,y)\,dx\,dy, \qquad f_W(w) = \frac{dF_W(w)}{dw}$$
If X and Y are independent, $f_{X,Y}(x,y) = f_X(x)\, f_Y(y)$, so
$$F_W(w) = \int_{-\infty}^{\infty} f_Y(y) \int_{-\infty}^{w-y} f_X(x)\,dx\,dy$$
Using Leibniz's rule we get the convolution integral:
$$f_W(w) = \frac{dF_W(w)}{dw} = \int_{-\infty}^{\infty} f_Y(y)\, f_X(w - y)\,dy = f_X(w) * f_Y(w)$$
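For independent unit-rate exponential densities the convolution can be done in closed form, $f_W(w) = w\,e^{-w}$, which makes a convenient simulation check. A sketch assuming numpy:

```python
# W = X + Y for independent X, Y ~ exponential(1); compare a histogram
# of W against the convolution result f_W(w) = w e^{-w}.
import numpy as np

rng = np.random.default_rng(3)
w = rng.exponential(size=1_000_000) + rng.exponential(size=1_000_000)

hist, edges = np.histogram(w, bins=60, range=(0.0, 8.0), density=True)
centers = (edges[:-1] + edges[1:]) / 2
print(np.max(np.abs(hist - centers * np.exp(-centers))))  # ~0.01 -> close match
```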
Operations on Multiple Random Variables
$$\bar{g} = E[g(X,Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x,y)\, f_{X,Y}(x,y)\,dx\,dy \qquad \text{(continuous)}$$
$$\bar{g} = E[g(X,Y)] = \sum_i \sum_k g(x_i, y_k)\, P_{X,Y}(x_i, y_k) \qquad \text{(discrete)}$$
Joint Moment about the Origin
$$m_{nk} = E[X^n Y^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^n y^k\, f_{X,Y}(x,y)\,dx\,dy$$
$m_{n0} = E[X^n]$ is the $n$th moment $m_n$ of the one random variable X, and $m_{0k} = E[Y^k]$ is the $k$th moment $m_k$ of the one random variable Y.
The correlation is
$$R_{XY} = m_{11} = E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y\, f_{X,Y}(x,y)\,dx\,dy \;\text{(continuous)} = \sum_n \sum_m x_n y_m\, P(x_n, y_m) \;\text{(discrete)}$$
$$R_{XY} = \begin{cases} E[X]\,E[Y] & \Rightarrow X, Y \text{ uncorrelated} \\ 0 & \Rightarrow X, Y \text{ orthogonal} \end{cases}$$
The variances of X and Y are
$$\sigma_X^2 = \mu_{20} = E[(X - \bar{X})^2] = E[X^2] - \bar{X}^2$$
$$\sigma_Y^2 = \mu_{02} = E[(Y - \bar{Y})^2] = E[Y^2] - \bar{Y}^2$$
The covariance is
$$C_{XY} = \mu_{11} = E[(X - \bar{X})(Y - \bar{Y})] = R_{XY} - E[X]\,E[Y]$$
$$C_{XY} = \begin{cases} 0 & \text{if } X \text{ and } Y \text{ are uncorrelated} \\ -E[X]\,E[Y] & \text{if } X \text{ and } Y \text{ are orthogonal} \end{cases}$$
Independence $\Rightarrow$ uncorrelated; the converse is not true in general, except for Gaussian random variables.
The correlation coefficient is
$$\rho = \frac{C_{XY}}{\sigma_X \sigma_Y}, \qquad -1 \le \rho \le 1$$
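A sample-based check of $C_{XY}$ and $\rho$; a sketch assuming numpy, where $Y = X + \text{(independent noise)}$ is an illustrative construction whose true correlation coefficient is $1/\sqrt{2}$:

```python
# Estimate covariance and correlation coefficient from samples.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=1_000_000)
y = x + rng.normal(size=1_000_000)          # Var(Y) = 2, C_XY = 1

C = np.mean((x - x.mean()) * (y - y.mean()))
rho = C / (x.std() * y.std())
print(rho, 1 / np.sqrt(2))                  # both about 0.7071
```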
The joint moments can be found from the joint characteristic function:
$$m_{nk} = (-j)^{n+k} \left. \frac{\partial^{n+k} \Phi_{X,Y}(\omega_1, \omega_2)}{\partial \omega_1^n\, \partial \omega_2^k} \right|_{\omega_1 = 0,\; \omega_2 = 0}$$
where $\Phi_{X,Y}(\omega_1, \omega_2)$ is the 2-D Fourier transform of $f_{X,Y}(x,y)$ with a reversal of sign.
Transformations of Multiple Random Variables
$$Y_i = T_i(X_1, X_2, \ldots, X_N), \quad i = 1, 2, \ldots, N, \qquad X_j = T_j^{-1}(Y_1, Y_2, \ldots, Y_N), \quad j = 1, 2, \ldots, N$$
$$f_{Y_1, \ldots, Y_N}(y_1, \ldots, y_N) = f_{X_1, \ldots, X_N}\!\left(x_1 = T_1^{-1}, \ldots, x_N = T_N^{-1}\right) |J|$$
where $J$ is the Jacobian determinant
$$J = \begin{vmatrix} \dfrac{\partial T_1^{-1}}{\partial y_1} & \cdots & \dfrac{\partial T_1^{-1}}{\partial y_N} \\ \vdots & & \vdots \\ \dfrac{\partial T_N^{-1}}{\partial y_1} & \cdots & \dfrac{\partial T_N^{-1}}{\partial y_N} \end{vmatrix}$$
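For a linear map the Jacobian rule takes a particularly simple form. A sketch assuming numpy; the matrix A and the standard 2-D Gaussian input are illustrative choices:

```python
# Y = A X with invertible A: f_Y(y) = f_X(A^{-1} y) * |det(A^{-1})|.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
A_inv = np.linalg.inv(A)
J = abs(np.linalg.det(A_inv))      # |J| = 1 / |det A|

def f_X(x):                        # standard 2-D Gaussian density
    return np.exp(-0.5 * x @ x) / (2 * np.pi)

def f_Y(y):
    return f_X(A_inv @ y) * J

print(f_Y(np.array([1.0, 1.0])))   # density of Y at the point (1, 1)
```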
Random Processes and their Applications to Linear Systems
Distribution and Density of Random Processes
For a particular time $t_1$, $X(t_1)$ is a random variable with distribution $F_{X(t_1)}(x)$ and density $f_{X(t_1)}(x)$:
$$F_{X(t_1)}(x_1) = P\{X(t_1) \le x_1\}, \qquad f_{X(t_1)}(x_1) = \frac{dF_{X(t_1)}(x_1)}{dx_1}$$
Wide-Sense Stationary (WSS)
A process that satisfies the following is wide-sense stationary:
$$E[X(t)] = \bar{X} = \text{constant}, \qquad E[X(t)\,X(t + \tau)] = R_{XX}(\tau)$$
The time average of a quantity $[\cdot]$ is defined as
$$A[\cdot] = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} [\cdot]\,dt$$
Autocorrelation Function and Its Properties
$$R_{XX}(t, t + \tau) = E[X(t)\,X(t + \tau)]$$
When a random function $X(t)$ with autocorrelation $R_{XX}(\tau)$ passes through a linear system with nonrandom, deterministic impulse response $h(t)$, the output $Y(t) = X(t) * h(t)$ is a random function with autocorrelation $R_{YY}(\tau)$:
$$\bar{Y} = \bar{X}\, H(0)$$
$$R_{YY}(\tau) = R_{XX}(\tau) * h(-\tau) * h(\tau)$$
$$S_{YY}(f) = S_{XX}(f)\, H^*(f)\, H(f) = S_{XX}(f)\, |H(f)|^2$$
Total power of the input:
$$E[X^2(t)] = R_{XX}(0) = \int_{-\infty}^{\infty} S_{XX}(f)\,df$$
Total power of the output:
$$E[Y^2(t)] = R_{YY}(0) = \int_{-\infty}^{\infty} S_{YY}(f)\,df = \int_{-\infty}^{\infty} S_{XX}(f)\, |H(f)|^2\,df$$
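The output-power relation has a convenient discrete-time analogue: white noise with power $\sigma^2$ has a flat spectrum, so the integral of $S_{XX}(f)\,|H(f)|^2$ collapses (by Parseval's theorem) to $\sigma^2 \sum_n h[n]^2$. A sketch assuming numpy; the filter taps are illustrative:

```python
# White noise through an FIR filter: output power = sigma^2 * sum(h^2),
# the discrete-time counterpart of int S_XX(f) |H(f)|^2 df.
import numpy as np

rng = np.random.default_rng(5)
sigma2 = 2.0
x = rng.normal(scale=np.sqrt(sigma2), size=2_000_000)   # white input, power sigma2
h = np.array([0.5, 0.3, 0.2])                           # illustrative FIR taps

y = np.convolve(x, h, mode='valid')
print(y.var(), sigma2 * np.sum(h**2))                   # both about 0.76
```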