CEN 343 Chapter 7: Random Processes

1. Introduction
• In Chapter 2, we defined a random variable 𝑋 as a mapping of the elements of the sample space S into points on the real axis.
• For random processes, the sample space maps into a family of time functions: every outcome s of S is associated with a function of time. This leads to the concept of a random process.
• Definition of a random process: A random process 𝑋(𝑑) is a mapping of the elements of the sample space into functions of time. Each element of the sample space is associated with a time function, as shown below.
[Figure: each outcome s1, s2, …, sk of the sample space S is mapped to a sample function x1(t), x2(t), …, xk(t); two observation instants t1 and t2 are marked on each time axis.]
• Associating a time function with each element of the sample space results in a family of time functions called the ensemble.
Example 1
Consider a random process 𝑋(𝑑) = 𝐴 cos(πœ”π‘‘ + πœƒ), where πœƒ is a random variable uniformly distributed between 0 and 2πœ‹.
- Show some sample functions of this random process.
- What can we say about the function above if, for example, we fix the phase πœƒ = πœ‹/4?
Solution:
- Sample functions of this r.p. are cosine functions with angular frequency πœ” and different phase shifts; when t is fixed, 𝑋(𝑑) is a random variable determined by πœƒ.
- Fixing πœƒ and varying t results in a deterministic time function.
2. Classification of random processes
2.1 Continuous random process
• In this case, both 𝑋(𝑑) and t have continuous values.
2.2 Discrete random process
• 𝑋(𝑑) assumes a discrete set of values while time t is continuous.
2.3 Continuous random sequence
• 𝑋(𝑑) assumes a continuous set of values while time t is discrete.
2.4 Discrete random sequence
• Both 𝑋(𝑑) and t assume a discrete set of values.
• Fixing the time t, the random process 𝑋(𝑑) becomes a random variable, and the techniques we use with random variables apply.
• We can characterize a random process by its first-order distribution function:
𝐹𝑋 (π‘₯; 𝑑) = 𝑃[𝑋(𝑑0 ) ≀ π‘₯]
• The first-order density function, for all possible values of t, is given by:
𝑓𝑋(π‘₯; 𝑑) = 𝑑𝐹𝑋(π‘₯; 𝑑)/𝑑π‘₯
• The second-order distribution function is the joint distribution of the two random variables 𝑋(𝑑1) and 𝑋(𝑑2) for each 𝑑1 and 𝑑2. It is given by:
𝐹𝑋1𝑋2(π‘₯1, π‘₯2; 𝑑1, 𝑑2) = 𝑃[𝑋(𝑑1) ≀ π‘₯1 and 𝑋(𝑑2) ≀ π‘₯2]
• The second-order density function is given by:
𝑓𝑋1𝑋2(π‘₯1, π‘₯2; 𝑑1, 𝑑2) = πœ•Β²πΉπ‘‹1𝑋2(π‘₯1, π‘₯2; 𝑑1, 𝑑2)/πœ•π‘₯1πœ•π‘₯2
Deterministic and Nondeterministic Processes:
• A r.p. (random process) is called deterministic if future values of any sample function can be predicted from past values. For example:
𝑋(𝑑) = 𝐴 cos(πœ”0𝑑 + πœƒ)
Here, 𝐴, πœƒ, πœ”0 (or all of them) may be random variables.
• A r.p. is called nondeterministic if future values of any sample function cannot be predicted exactly from past values.
3. Independence and stationarity
• In many problems of interest, the first- and second-order statistics may be all that is needed to characterize a random process.
• The mean value of a random process 𝑋(𝑑) is given by:
𝐸[𝑋(𝑑)] = ∫_{−∞}^{+∞} π‘₯ 𝑓𝑋(π‘₯; 𝑑) 𝑑π‘₯
• The autocorrelation function is defined by:
𝑅𝑋𝑋(𝑑1, 𝑑2) = 𝐸[𝑋(𝑑1)𝑋(𝑑2)] = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} π‘₯1π‘₯2 𝑓𝑋1𝑋2(π‘₯1, π‘₯2; 𝑑1, 𝑑2) 𝑑π‘₯1 𝑑π‘₯2
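Both defining averages can be estimated by averaging over many realizations. The sketch below uses the cosine process of Example 1 as a test case; 𝐴, πœ”, and the time points are assumed values, and the exact results for that process (0 and (𝐴²/2) cos(πœ”(𝑑2 βˆ’ 𝑑1)), which follow from the uniform phase) serve only as a check.

```python
import math
import random

random.seed(1)
A, w, N = 1.0, 1.0, 200_000          # assumed amplitude, frequency, sample count
thetas = [random.uniform(0, 2 * math.pi) for _ in range(N)]

def X(t, theta):
    # One realization of the Example 1 process, evaluated at time t
    return A * math.cos(w * t + theta)

t1, t2 = 0.5, 1.7
# Ensemble averages approximate the defining integrals E[X(t1)] and R_XX(t1, t2)
mean_t1 = sum(X(t1, th) for th in thetas) / N
R_t1_t2 = sum(X(t1, th) * X(t2, th) for th in thetas) / N
```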
3.1 Statistical Independence:
Two random processes 𝑋(𝑑) and π‘Œ(𝑑) are statistically independent if the group of random variables 𝑋(𝑑1), …, 𝑋(𝑑𝑁) is independent of the group π‘Œ(𝑑′1), …, π‘Œ(𝑑′𝑀) for any choice of 𝑑1, …, 𝑑𝑁, 𝑑′1, …, 𝑑′𝑀:
𝑓𝑋,π‘Œ(π‘₯1, …, π‘₯𝑁, 𝑦1, …, 𝑦𝑀; 𝑑1, …, 𝑑𝑁, 𝑑′1, …, 𝑑′𝑀) = 𝑓𝑋(π‘₯1, …, π‘₯𝑁; 𝑑1, …, 𝑑𝑁) Β· π‘“π‘Œ(𝑦1, …, 𝑦𝑀; 𝑑′1, …, 𝑑′𝑀)
3.2 First-order stationary processes:
• A random process is called first-order stationary in the strict sense if the first-order density function does not change with a shift in time:
𝑓𝑋(π‘₯1; 𝑑1) = 𝑓𝑋(π‘₯1; 𝑑1 + Ξ”), where Ξ” is any real number.
• This means that 𝑓𝑋(π‘₯1; 𝑑1) is independent of 𝑑1, so the process mean value 𝐸[𝑋(𝑑)] is constant:
𝐸[𝑋(𝑑)] = ∫_{−∞}^{+∞} π‘₯ 𝑓𝑋(π‘₯; 𝑑) 𝑑π‘₯ = constant
3.3 Second-order stationary process
• A random process is second-order stationary in the strict sense if the first- and second-order density functions satisfy:
𝑓𝑋(π‘₯1; 𝑑1) = 𝑓𝑋(π‘₯1; 𝑑1 + Ξ”)   (*)
𝑓𝑋(π‘₯1, π‘₯2; 𝑑1, 𝑑2) = 𝑓𝑋(π‘₯1, π‘₯2; 𝑑1 + Ξ”, 𝑑2 + Ξ”)   (**)
• This means that 𝑓𝑋 is a function of the time difference 𝑑2 βˆ’ 𝑑1 and not of absolute time.
• A second-order stationary process is also first-order stationary.
• For a second-order stationary random process, the autocorrelation function depends on the time difference 𝜏 = 𝑑2 βˆ’ 𝑑1 and not on absolute time:
𝑅𝑋𝑋(𝑑, 𝑑 + 𝜏) = 𝐸[𝑋(𝑑)𝑋(𝑑 + 𝜏)] = 𝑅𝑋𝑋(𝜏)
3.4 Wide-sense stationary random process
• As the conditions for strict-sense stationarity (equations (*) and (**)) are usually difficult to verify, we often resort to a weaker definition of stationarity known as wide-sense stationarity.
• Thus, when the autocorrelation function 𝑅𝑋𝑋(𝑑1, 𝑑2) of the random process 𝑋(𝑑) varies only with the time difference |𝑑1 βˆ’ 𝑑2| and the mean 𝐸[𝑋(𝑑)] is constant, then 𝑋(𝑑) is said to be wide-sense stationary. That is:
(1) 𝐸[𝑋(𝑑)] = constant
(2) 𝑅𝑋𝑋(𝑑 + 𝜏, 𝑑) = 𝑅𝑋𝑋(𝜏)
• A strict-sense stationary process is wide-sense stationary, but the converse is not true.
Example 2
Consider the random process 𝑋(𝑑) = 𝐴 cos(πœ”0𝑑 + πœƒ), where πœƒ is a random variable uniformly distributed between 0 and 2πœ‹. Is the process stationary in the wide sense?
Solution:
The phase density is π‘“πœƒ(πœƒ) = 1/(2πœ‹) on [0, 2πœ‹], so:
𝐸[𝑋(𝑑)] = ∫_0^{2πœ‹} 𝐴 cos(πœ”0𝑑 + πœƒ) Β· (1/(2πœ‹)) π‘‘πœƒ = 0
𝑅𝑋𝑋(𝑑, 𝑑 + 𝜏) = 𝐸[𝑋(𝑑)𝑋(𝑑 + 𝜏)] = 𝐸[𝐴 cos(πœ”0𝑑 + πœƒ) Β· 𝐴 cos(πœ”0𝑑 + πœ”0𝜏 + πœƒ)]
= (𝐴²/2) 𝐸[cos(πœ”0𝜏) + cos(2πœ”0𝑑 + πœ”0𝜏 + 2πœƒ)]
= (𝐴²/2) cos(πœ”0𝜏) + (𝐴²/2) 𝐸[cos(2πœ”0𝑑 + πœ”0𝜏 + 2πœƒ)]
= (𝐴²/2) cos(πœ”0𝜏) = 𝑅𝑋𝑋(𝜏)
The mean is constant and 𝑅𝑋𝑋 depends only on 𝜏, so 𝑋(𝑑) is a wide-sense stationary r.p.
3.5 Jointly wide-sense stationary:
• Two random processes 𝑋(𝑑) and π‘Œ(𝑑) are jointly wide-sense stationary if each process is wide-sense stationary and:
π‘…π‘‹π‘Œ(𝑑, 𝑑 + 𝜏) = 𝐸[𝑋(𝑑)π‘Œ(𝑑 + 𝜏)] = π‘…π‘‹π‘Œ(𝜏) β‡’ a function of the time difference only.
Here π‘…π‘‹π‘Œ(𝑑1, 𝑑2) = 𝐸[𝑋(𝑑1)π‘Œ(𝑑2)] represents the cross-correlation function of 𝑋(𝑑) and π‘Œ(𝑑).
• We also define the covariance function by:
𝐢𝑋𝑋(𝑑1, 𝑑2) = 𝐸[{𝑋(𝑑1) βˆ’ π‘šπ‘‹(𝑑1)}{𝑋(𝑑2) βˆ’ π‘šπ‘‹(𝑑2)}]
• The cross-covariance function is defined by:
πΆπ‘‹π‘Œ(𝑑1, 𝑑2) = 𝐸[{𝑋(𝑑1) βˆ’ π‘šπ‘‹(𝑑1)}{π‘Œ(𝑑2) βˆ’ π‘šπ‘Œ(𝑑2)}]
3.6 N-order and strict-sense stationary:
• A random process is strict-sense stationary to order N if the N-th order density function is invariant to a shift of the time origin:
𝑓𝑋(π‘₯1, …, π‘₯𝑁; 𝑑1, …, 𝑑𝑁) = 𝑓𝑋(π‘₯1, …, π‘₯𝑁; 𝑑1 + βˆ†, …, 𝑑𝑁 + βˆ†)
for all 𝑑1, …, 𝑑𝑁 and βˆ†.
3.7 Time averages and ergodicity
• The time average of a quantity [Β·] is defined as:
𝐴[Β·] = lim_{π‘‡β†’βˆž} (1/(2𝑇)) ∫_{βˆ’π‘‡}^{𝑇} [Β·] 𝑑𝑑
where 𝐴 denotes the time average, just as 𝐸 denotes the statistical average.
• Here, we are interested in:
✓ The mean value of a sample function:
π‘₯Μ… = 𝐴[π‘₯(𝑑)] = lim_{π‘‡β†’βˆž} (1/(2𝑇)) ∫_{βˆ’π‘‡}^{𝑇} π‘₯(𝑑) 𝑑𝑑
✓ The time autocorrelation function:
ℛ𝑋𝑋(𝜏) = 𝐴[π‘₯(𝑑)π‘₯(𝑑 + 𝜏)] = lim_{π‘‡β†’βˆž} (1/(2𝑇)) ∫_{βˆ’π‘‡}^{𝑇} π‘₯(𝑑)π‘₯(𝑑 + 𝜏) 𝑑𝑑
• Ergodic process: A random process 𝑋(𝑑) is said to be ergodic if the time averages π‘₯Μ… and ℛ𝑋𝑋(𝜏) equal the statistical averages 𝑋̅ and 𝑅𝑋𝑋(𝜏). In other words, time averages equal the corresponding statistical averages.
✓ Ergodicity in the mean: A random process that satisfies π‘₯Μ… = 𝑋̅, i.e., the time average equals the ensemble average, is called mean-ergodic.
✓ Ergodicity in the autocorrelation: A random process that satisfies ℛ𝑋𝑋(𝜏) = 𝑅𝑋𝑋(𝜏) is called autocorrelation-ergodic.
[Figure: nested sets — within the collection of all random processes, the wide-sense stationary processes contain the strict-sense stationary processes, which in turn contain the ergodic processes.]
Example 3:
Let two random processes be given by:
𝐼(𝑑) = 𝑋 cos πœ”π‘‘ + π‘Œ sin πœ”π‘‘
𝑄(𝑑) = π‘Œ cos πœ”π‘‘ βˆ’ 𝑋 cos πœ”π‘‘
where 𝑋 and π‘Œ are two uncorrelated random variables with zero mean and variance 𝜎². Find the cross-correlation function between 𝐼(𝑑) and 𝑄(𝑑).
Solution:
We have 𝐸[𝑋] = 𝐸[π‘Œ] = 0.
𝑋 and π‘Œ are uncorrelated β‡’ 𝐸[π‘‹π‘Œ] = 0.
Variance: 𝐸[𝑋²] = 𝐸[π‘ŒΒ²] = 𝜎².
𝑅𝐼𝑄(𝑑1, 𝑑2) = 𝐸[𝐼(𝑑1)𝑄(𝑑2)]
= 𝐸[(𝑋 cos πœ”π‘‘1 + π‘Œ sin πœ”π‘‘1)(π‘Œ cos πœ”π‘‘2 βˆ’ 𝑋 cos πœ”π‘‘2)]
= 𝐸[π‘‹π‘Œ](cos πœ”π‘‘1 cos πœ”π‘‘2 βˆ’ sin πœ”π‘‘1 cos πœ”π‘‘2) + 𝐸[π‘ŒΒ²] sin πœ”π‘‘1 cos πœ”π‘‘2 βˆ’ 𝐸[𝑋²] cos πœ”π‘‘1 cos πœ”π‘‘2
= 𝜎² cos πœ”π‘‘2 (sin πœ”π‘‘1 βˆ’ cos πœ”π‘‘1)
Let 𝜏 = 𝑑2 βˆ’ 𝑑1 β‡’ 𝑑2 = 𝑑1 + 𝜏:
𝑅𝐼𝑄(𝑑, 𝑑 + 𝜏) = 𝜎² cos πœ”(𝑑 + 𝜏) [sin πœ”π‘‘ βˆ’ cos πœ”π‘‘]
This depends on 𝑑 as well as 𝜏, so 𝐼(𝑑) and 𝑄(𝑑) are not jointly wide-sense stationary.
Example 4:
Let 𝑋(𝑑) be a random process with 𝑋(𝑑) = 𝐴 cos(πœ”0𝑑 + πœƒ), where 𝐴 and πœ”0 are constants and πœƒ is a random variable uniformly distributed in the interval [0, 2πœ‹].
Is 𝑋(𝑑) ergodic in the mean and in the autocorrelation function?
Solution:
From Example 2, 𝐸[𝑋(𝑑)] = 0 = 𝑋̅ and
𝑅𝑋𝑋(𝜏) = (𝐴²/2) cos(πœ”0𝜏)
Now, for one sample function (πœƒ fixed):
π‘₯Μ… = 𝐴[π‘₯(𝑑)] = lim_{π‘‡β†’βˆž} (1/(2𝑇)) ∫_{βˆ’π‘‡}^{𝑇} 𝐴 cos(πœ”0𝑑 + πœƒ) 𝑑𝑑
= (𝐴/πœ”0) lim_{π‘‡β†’βˆž} (1/(2𝑇)) [sin(πœ”0𝑑 + πœƒ)]_{βˆ’π‘‡}^{𝑇} = 0
𝑋̅ = π‘₯Μ… β‡’ 𝑋(𝑑) is ergodic in the mean.
ℛ𝑋𝑋(𝑑, 𝑑 + 𝜏) = lim_{π‘‡β†’βˆž} (1/(2𝑇)) ∫_{βˆ’π‘‡}^{𝑇} 𝐴 cos(πœ”0𝑑 + πœƒ) Β· 𝐴 cos(πœ”0(𝑑 + 𝜏) + πœƒ) 𝑑𝑑
= (𝐴²/2) lim_{π‘‡β†’βˆž} (1/(2𝑇)) ∫_{βˆ’π‘‡}^{𝑇} [cos(πœ”0𝜏) + cos(2πœ”0𝑑 + πœ”0𝜏 + 2πœƒ)] 𝑑𝑑
= (𝐴²/2) cos(πœ”0𝜏) lim_{π‘‡β†’βˆž} (1/(2𝑇)) ∫_{βˆ’π‘‡}^{𝑇} 𝑑𝑑 = (𝐴²/2) cos(πœ”0𝜏) = 𝑅𝑋𝑋(𝜏)
ℛ𝑋𝑋(𝜏) = 𝑅𝑋𝑋(𝜏) β‡’ 𝑋(𝑑) is ergodic in the autocorrelation.
4. Spectral characteristics
• In the preceding sections we introduced random processes and presented their temporal characteristics.
• We now present another aspect: representing random processes in the frequency domain.
• From here on, we assume that the random processes are wide-sense stationary.
5. Power spectral density
5.1 Deterministic signals
• We know that the Fourier transform of a deterministic signal 𝑠(𝑑) is given by:
𝑆(πœ”) = ∫_{−∞}^{+∞} 𝑠(𝑑)𝑒^{βˆ’π‘—πœ”π‘‘} 𝑑𝑑
• 𝑆(πœ”) is sometimes called the spectrum of 𝑠(𝑑).
• In going from the time-domain description 𝑠(𝑑) to the frequency domain 𝑆(πœ”), no information about the signal is lost, which means that 𝑆(πœ”) forms a complete description of 𝑠(𝑑) and vice versa.
• The signal 𝑠(𝑑) can be recovered using the inverse Fourier transform:
𝑠(𝑑) = (1/(2πœ‹)) ∫_{−∞}^{+∞} 𝑆(πœ”)𝑒^{π‘—πœ”π‘‘} π‘‘πœ”
or
𝑠(𝑑) = ∫_{−∞}^{+∞} 𝑆(𝑓)𝑒^{𝑗2πœ‹π‘“π‘‘} 𝑑𝑓
5.2 Random process
• If the random process 𝑋(𝑑) is stationary in the wide sense, then the power spectral density 𝑆𝑋𝑋(πœ”) can be expressed as the Fourier transform of the autocorrelation function 𝑅𝑋𝑋(𝜏):
𝑆𝑋𝑋(πœ”) = ∫_{−∞}^{+∞} 𝑅𝑋𝑋(𝜏)𝑒^{βˆ’π‘—πœ”πœ} π‘‘πœ
or
𝑆𝑋𝑋(𝑓) = ∫_{−∞}^{+∞} 𝑅𝑋𝑋(𝜏)𝑒^{βˆ’π‘—2πœ‹π‘“πœ} π‘‘πœ
• As for deterministic signals, the autocorrelation function 𝑅𝑋𝑋(𝜏) can be obtained from the power spectral density using the inverse Fourier transform:
𝑅𝑋𝑋(𝜏) = (1/(2πœ‹)) ∫_{−∞}^{+∞} 𝑆𝑋𝑋(πœ”)𝑒^{π‘—πœ”πœ} π‘‘πœ”
or
𝑅𝑋𝑋(𝜏) = ∫_{−∞}^{+∞} 𝑆𝑋𝑋(𝑓)𝑒^{𝑗2πœ‹π‘“πœ} 𝑑𝑓
• The transform pair 𝑅𝑋𝑋(𝜏) ↔ 𝑆𝑋𝑋(πœ”) is sometimes called the Wiener-Khinchin relations.
• The average power is given by:
𝑃𝑋𝑋 = ∫_{−∞}^{+∞} 𝑆𝑋𝑋(𝑓) 𝑑𝑓 = (1/(2πœ‹)) ∫_{−∞}^{+∞} 𝑆𝑋𝑋(πœ”) π‘‘πœ”
• Note also that 𝑃𝑋𝑋 is given by:
𝑃𝑋𝑋 = lim_{π‘‡β†’βˆž} (1/(2𝑇)) ∫_{βˆ’π‘‡}^{𝑇} 𝐸[𝑋²(𝑑)] 𝑑𝑑
5.3 Properties of the power spectral density
• The power spectral density has several properties:
✓ 𝑆𝑋𝑋(πœ”) β‰₯ 0
✓ 𝑆𝑋𝑋(βˆ’πœ”) = 𝑆𝑋𝑋(πœ”) for 𝑋(𝑑) real
✓ 𝑆𝑋𝑋(πœ”) is real
✓ 𝑃𝑋𝑋 = (1/(2πœ‹)) ∫_{−∞}^{+∞} 𝑆𝑋𝑋(πœ”) π‘‘πœ” = 〈𝐸[𝑋²(𝑑)]βŒͺ
Example 5
Let 𝑋(𝑑) be a wide-sense stationary process with autocorrelation function
𝑅𝑋𝑋(𝜏) = 𝐴(1 βˆ’ |𝜏|/𝑇) for βˆ’π‘‡ ≀ 𝜏 ≀ 𝑇, and 0 otherwise,
where 𝑇 > 0 and 𝐴 are constants.
Determine the power spectral density.
Solution:
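The worked solution does not appear in this copy of the notes. As a hedged check, the sketch below compares a direct numerical transform of the triangular 𝑅𝑋𝑋 with the standard closed form 𝐴𝑇(sin(πœ”π‘‡/2)/(πœ”π‘‡/2))², taking 𝐴 = 2 and 𝑇 = 1 as assumed test values.

```python
import math

A, T = 2.0, 1.0   # assumed test values for the constants of the example

def S_numeric(w, n=200_000):
    """Midpoint-rule integral of R_XX(tau)*cos(w*tau); R_XX vanishes
    outside [-T, T], so the integration range is finite and exact."""
    d = 2 * T / n
    total = 0.0
    for k in range(n):
        tau = -T + (k + 0.5) * d
        total += A * (1 - abs(tau) / T) * math.cos(w * tau)
    return total * d

def S_closed(w):
    # Standard transform of the triangle: A*T*(sin(wT/2)/(wT/2))**2
    x = w * T / 2
    return A * T * (math.sin(x) / x) ** 2 if x else A * T
```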
Fourier transforms of some common functions:
[Table of standard Fourier-transform pairs, not recovered in this copy.]