CEN 343 Chapter 7: Random Processes

Chapter 10: Random Processes

1. Introduction
• In Chapter 2, we defined a random variable X as a mapping of the elements of the sample space S into points on the real axis.
• For a random process, the sample space maps into a family of time functions: every outcome s in S is associated with a function of time. This leads to the concept of a random process.
• Definition of a random process: A random process X(t) is a mapping of the elements of the sample space into functions of time. Each element of the sample space is associated with a time function, as shown below.

[Figure: outcomes s1, s2, ..., sk of the sample space S mapped to sample functions x1(t), x2(t), ..., xk(t), each shown at times t1 and t2.]

• Associating a time function with each element of the sample space results in a family of time functions called the ensemble.

Example 1
Consider a random process X(t) = A cos(ωt + Θ), where Θ is a random variable uniformly distributed between 0 and 2π. Show some sample functions of this random process. What can we say about the process if we fix the phase, for example Θ = π/4?

Solution:
- The sample functions of this r.p. are cosine functions of frequency ω with different phase shifts, one for each value taken by Θ.
- Fixing Θ and varying t results in a deterministic time function.

2. Classification of random processes

2.1 Continuous random process
• Both X(t) and t take continuous values.

2.2 Discrete random process
• X(t) assumes a discrete set of values while time t is continuous.

2.3 Continuous random sequence
• X(t) assumes a continuous set of values while time t is discrete.

2.4 Discrete random sequence
• Both X(t) and t assume discrete sets of values.

• If we fix the time t, the random process X(t) becomes a random variable. In this case, the techniques we use with random variables apply.
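As a quick illustration of the ensemble idea (a sketch, not part of the original notes), the following Python snippet draws a few sample functions of the process in Example 1. The amplitude, angular frequency, time grid, and ensemble size are arbitrary illustrative choices.

```python
import numpy as np

# Simulate an ensemble of sample functions of X(t) = A*cos(w*t + Theta),
# with Theta ~ Uniform(0, 2*pi). A, w, the time grid, and the number of
# ensemble members are assumed (illustrative) values.
rng = np.random.default_rng(0)

A, w = 1.0, 2 * np.pi            # amplitude and angular frequency (assumed)
t = np.linspace(0.0, 2.0, 201)   # time axis
n_samples = 5                    # number of ensemble members to draw

theta = rng.uniform(0.0, 2 * np.pi, size=n_samples)      # one Theta per outcome s_k
ensemble = A * np.cos(w * t[None, :] + theta[:, None])   # row k = sample function x_k(t)

# Fixing the phase (e.g. Theta = pi/4) gives a single deterministic time function:
x_fixed = A * np.cos(w * t + np.pi / 4)
```

Each row of `ensemble` is one member of the family of time functions; fixing `theta` collapses the family to one deterministic signal, exactly as in the solution of Example 1.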
• We can characterize a random process by its first-order distribution function:
F_X(x; t) = P[X(t) ≤ x]
• The first-order density function, for all possible values of t, is given by:
f_X(x; t) = ∂F_X(x; t) / ∂x
• The second-order distribution function is the joint distribution of the two random variables X(t1) and X(t2), for each pair t1 and t2. It is given by:
F_X(x1, x2; t1, t2) = P[X(t1) ≤ x1 and X(t2) ≤ x2]
• The second-order density function is given by:
f_X(x1, x2; t1, t2) = ∂²F_X(x1, x2; t1, t2) / ∂x1 ∂x2

Deterministic and nondeterministic processes:
• A r.p. (random process) is called deterministic if future values of any sample function can be predicted from past values. An example is a process of the form X(t) = A cos(ω0 t + Θ), where A, Θ, ω0 (or all of them) may be random variables.
• A r.p. is called nondeterministic if future values of any sample function cannot be predicted exactly from past values.

3. Independence and stationarity
• In many problems of interest, the first- and second-order statistics are all that is needed to characterize a random process.
• The mean value of a random process X(t) is given by:
E[X(t)] = ∫_{-∞}^{+∞} x f_X(x; t) dx
• The autocorrelation function is defined by:
R_XX(t1, t2) = E[X(t1) X(t2)] = ∫_{-∞}^{+∞} ∫_{-∞}^{+∞} x1 x2 f_X(x1, x2; t1, t2) dx1 dx2

3.1 Statistical independence
Two random processes X(t) and Y(t) are statistically independent if the group of random variables X(t1), ..., X(tN) is independent of the group Y(t1'), ..., Y(tM') for any choice of t1, ..., tN, t1', ..., tM':
f_{X,Y}(x1, ..., xN, y1, ..., yM; t1, ..., tN, t1', ..., tM') = f_X(x1, ..., xN; t1, ..., tN) · f_Y(y1, ..., yM; t1', ..., tM')
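The ensemble mean and autocorrelation defined above can be estimated numerically by averaging over many realizations. The sketch below (illustrative, not from the notes) does this for the phase-randomized cosine of Example 1; the frequency, time grid, and ensemble size are assumed values.

```python
import numpy as np

# Monte Carlo estimates of E[X(t)] and R_XX(t_i, t_j) = E[X(t_i) X(t_j)]
# for X(t) = cos(w*t + Theta), Theta ~ Uniform(0, 2*pi).
# Analytically E[X(t)] = 0 and R_XX depends only on t_j - t_i: 0.5*cos(w*(t_j - t_i)).
rng = np.random.default_rng(1)

w = 2 * np.pi
t = np.linspace(0.0, 1.0, 51)
theta = rng.uniform(0.0, 2 * np.pi, size=100_000)
X = np.cos(w * t[None, :] + theta[:, None])   # realizations (rows) x time samples (cols)

mean_est = X.mean(axis=0)                     # estimate of E[X(t)] for each t
R_est = (X.T @ X) / X.shape[0]                # R_est[i, j] ~ E[X(t_i) X(t_j)]
```

With 100,000 realizations, `mean_est` is close to the constant 0 and `R_est[i, j]` is close to 0.5·cos(w·(t_j − t_i)), foreshadowing the wide-sense stationarity result derived in Example 2 below.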
3.2 First-order stationary processes
• A random process is first-order stationary in the strict sense if the first-order density function does not change with a shift of the time origin:
f_X(x1; t1) = f_X(x1; t1 + Δ), where Δ is any real number.
• This means that f_X(x1; t1) is independent of t1, and the process mean value E[X(t)] is constant:
E[X(t)] = ∫_{-∞}^{+∞} x f_X(x; t) dx = constant

3.3 Second-order stationary processes
• A random process is second-order stationary in the strict sense if the first- and second-order density functions satisfy:
f_X(x1; t1) = f_X(x1; t1 + Δ)   (*)
f_X(x1, x2; t1, t2) = f_X(x1, x2; t1 + Δ, t2 + Δ)   (**)
• This means that f_X is a function of the time difference t2 − t1 and not of absolute time.
• A second-order stationary process is also first-order stationary.
• For a second-order stationary random process, the autocorrelation function is a function of the time difference τ = t2 − t1 and not of absolute time:
R_XX(t, t + τ) = E[X(t) X(t + τ)] = R_XX(τ)

3.4 Wide-sense stationary random processes
• The conditions (*) and (**) for strict-sense stationarity are usually difficult to verify.
• We therefore often resort to a weaker definition of stationarity, known as wide-sense stationarity.
• When the autocorrelation function R_XX(t1, t2) of the random process X(t) varies only with the time difference |t1 − t2| and the mean E[X(t)] is constant, X(t) is said to be wide-sense stationary (WSS). That is:
(1) E[X(t)] = constant
(2) R_XX(t + τ, t) = R_XX(τ)
• A strict-sense stationary process is wide-sense stationary, but the converse is not true.

Example 2
Consider a random process X(t) = A cos(ω0 t + Θ), where Θ is a random variable uniformly distributed between 0 and 2π. Is the process stationary in the wide sense?

Solution:
Since f_Θ(θ) = 1/2π on [0, 2π],
E[X(t)] = ∫_0^{2π} A cos(ω0 t + θ) (1/2π) dθ = 0

R_XX(t, t + τ) = E[X(t) X(t + τ)] = E[A cos(ω0 t + Θ) · A cos(ω0 t + ω0 τ + Θ)]

Using cos a · cos b = (1/2)[cos(a − b) + cos(a + b)]:
= (A²/2) E[cos(ω0 τ) + cos(2ω0 t + ω0 τ + 2Θ)]
= (A²/2) cos(ω0 τ) + (A²/2) E[cos(2ω0 t + ω0 τ + 2Θ)]
= (A²/2) cos(ω0 τ) = R_XX(τ)

since the expectation of cos(2ω0 t + ω0 τ + 2Θ) over Θ uniform on [0, 2π] is zero.

The mean is constant and the autocorrelation depends only on τ, so X(t) is a wide-sense stationary r.p.

3.5 Jointly wide-sense stationary processes
• Two random processes X(t) and Y(t) are jointly wide-sense stationary if each process is wide-sense stationary and
R_XY(t, t + τ) = E[X(t) Y(t + τ)] = R_XY(τ), a function of the time difference only.
Here R_XY(t1, t2) = E[X(t1) Y(t2)] is the cross-correlation function of X(t) and Y(t).
• We also define the covariance function by:
C_XX(t1, t2) = E[{X(t1) − m_X(t1)}{X(t2) − m_X(t2)}]
• The cross-covariance function is defined by:
C_XY(t1, t2) = E[{X(t1) − m_X(t1)}{Y(t2) − m_Y(t2)}]

3.6 N-order and strict-sense stationarity
• A random process is strictly stationary to order N if the N-th order density function is invariant to a shift of the time origin:
f_X(x1, ..., xN; t1, ..., tN) = f_X(x1, ..., xN; t1 + Δ, ..., tN + Δ) for all t1, ..., tN and Δ.

3.7 Time averages and ergodicity
• The time average of a quantity [·] is defined as:
A[·] = lim_{T→∞} (1/2T) ∫_{−T}^{T} [·] dt
where A denotes the time average, in the same way that E denotes the statistical (ensemble) average.
• Here, we are interested in:
✓ the time mean value of a sample function:
x̄ = A[x(t)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) dt
✓ the time autocorrelation function:
ℛ_XX(τ) = A[x(t) x(t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) x(t + τ) dt
• Ergodic process: A random process X(t) is said to be ergodic if the time averages x̄ and ℛ_XX(τ) equal the statistical averages X̄ and R_XX(τ). In other words, time averages equal the corresponding statistical averages.
✓ Ergodicity in the mean: A random process that satisfies x̄ = X̄, i.e. whose time average equals the ensemble average, is called a mean-ergodic random process.
✓ Ergodicity in the autocorrelation: A random process that satisfies ℛ_XX(τ) = R_XX(τ) is called autocorrelation-ergodic.

[Figure: nested classes of random processes — ergodic processes ⊂ strict-sense stationary ⊂ wide-sense stationary ⊂ the collection of all random processes.]

Example 3
Let two random processes be given by:
I(t) = X cos ωt + Y sin ωt
Q(t) = X cos ωt − Y cos ωt
where X and Y are two uncorrelated zero-mean random variables, each with variance σ². Find the cross-correlation function between I(t) and Q(t).

Solution:
We have E[X] = E[Y] = 0.
X and Y uncorrelated (and zero-mean) ⇒ E[XY] = 0.
Variance: E[X²] = E[Y²] = σ².

R_IQ(t1, t2) = E[I(t1) Q(t2)]
= E[(X cos ωt1 + Y sin ωt1)(X cos ωt2 − Y cos ωt2)]
= E[X²] cos ωt1 cos ωt2 + E[XY](sin ωt1 cos ωt2 − cos ωt1 cos ωt2) − E[Y²] sin ωt1 cos ωt2
= σ² cos ωt2 (cos ωt1 − sin ωt1)

Let τ = t2 − t1, i.e. t2 = t1 + τ:
R_IQ(t, t + τ) = σ² cos ω(t + τ) [cos ωt − sin ωt]

Since R_IQ depends on absolute time t and not only on τ, I(t) and Q(t) are not jointly wide-sense stationary.

Example 4
Let X(t) be a random process with:
X(t) = A cos(ω0 t + Θ)
where A and ω0 are constants and Θ is a random variable uniformly distributed in the interval [0, 2π]. Is X(t) ergodic in the mean and in the autocorrelation?

Solution:
We found (Example 2) that the statistical averages are:
E[X(t)] = 0 = X̄
R_XX(τ) = (A²/2) cos(ω0 τ)

For the time averages, θ is a constant for a given sample function:
x̄ = A[x(t)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} A cos(ω0 t + θ) dt
= lim_{T→∞} (A/ω0) (1/2T) [sin(ω0 t + θ)]_{−T}^{T} = 0
since the bracketed term is bounded while 1/2T → 0.

X̄ = x̄ ⇒ X(t) is ergodic in the mean.

ℛ_XX(τ) = lim_{T→∞} (1/2T) ∫_{−T}^{T} A cos(ω0 t + θ) · A cos(ω0 (t + τ) + θ) dt
= (A²/2) lim_{T→∞} (1/2T) ∫_{−T}^{T} [cos(ω0 τ) + cos(2ω0 t + ω0 τ + 2θ)] dt
= (A²/2) cos(ω0 τ) lim_{T→∞} (1/2T) ∫_{−T}^{T} dt = (A²/2) cos(ω0 τ) = R_XX(τ)

ℛ_XX(τ) = R_XX(τ) ⇒ X(t) is ergodic in the autocorrelation.
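Ergodicity can also be checked numerically. The sketch below (illustrative parameters, not from the notes) approximates the time averages of a single sample function of X(t) = A cos(ω0 t + θ) over a long finite window and compares them with the ensemble averages X̄ = 0 and R_XX(τ) = (A²/2) cos(ω0 τ) found analytically.

```python
import numpy as np

# Approximate the time averages of ONE sample function of
# X(t) = A*cos(w0*t + theta) over a long finite window [-T, T].
# A, w0, theta, T, and dt are assumed (illustrative) values; theta is fixed
# because we work with a single realization.
A, w0, theta = 2.0, 2 * np.pi, 0.7
T, dt = 500.0, 0.01
t = np.arange(-T, T, dt)
x = A * np.cos(w0 * t + theta)

# Time-average mean: should approximate the ensemble mean X_bar = 0.
x_bar = np.mean(x)

def time_autocorr(tau):
    """Finite-window estimate of the time autocorrelation A[x(t) x(t + tau)]."""
    x_shift = A * np.cos(w0 * (t + tau) + theta)
    return np.mean(x * x_shift)
```

For this process, `time_autocorr(tau)` reproduces (A²/2)·cos(ω0·τ) regardless of the particular θ drawn, which is exactly the ergodicity-in-the-autocorrelation property shown in Example 4.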
4. Spectral characteristics
• In the previous chapter we introduced random processes and presented their temporal characteristics.
• In this chapter we present another aspect: the representation of random processes in the frequency domain.
• Throughout this chapter, we assume that the random processes are wide-sense stationary.

5. Power spectral density

5.1 Deterministic signals
• We know that the Fourier transform of a deterministic signal s(t) is given by:
S(ω) = ∫_{−∞}^{+∞} s(t) e^{−jωt} dt
• S(ω) is sometimes called the spectrum of s(t).
• In going from the time-domain description s(t) to the frequency domain S(ω), no information about the signal is lost.
• This means that S(ω) forms a complete description of s(t), and vice versa.
• The signal s(t) can be recovered using the inverse Fourier transform:
s(t) = (1/2π) ∫_{−∞}^{+∞} S(ω) e^{jωt} dω
or, in terms of frequency f:
s(t) = ∫_{−∞}^{+∞} S(f) e^{j2πft} df

5.2 Random processes
• If the random process X(t) is stationary in the wide sense, the power spectral density S_XX(ω) can be expressed as the Fourier transform of the autocorrelation function R_XX(τ):
S_XX(ω) = ∫_{−∞}^{+∞} R_XX(τ) e^{−jωτ} dτ
or
S_XX(f) = ∫_{−∞}^{+∞} R_XX(τ) e^{−j2πfτ} dτ
• As for deterministic signals, the autocorrelation function R_XX(τ) can be obtained from the power spectral density S_XX(ω) using the inverse Fourier transform:
R_XX(τ) = (1/2π) ∫_{−∞}^{+∞} S_XX(ω) e^{jωτ} dω
or
R_XX(τ) = ∫_{−∞}^{+∞} S_XX(f) e^{j2πfτ} df
• The transform pair R_XX(τ) ↔ S_XX(ω) is sometimes called the Wiener–Khinchin relations.
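The Wiener–Khinchin relation can be verified numerically. The sketch below (an illustration with assumed values, not from the notes) evaluates S_XX(ω) = ∫ R_XX(τ) e^{−jωτ} dτ by direct numerical integration for the exponential autocorrelation R_XX(τ) = e^{−a|τ|}, whose Fourier transform is known in closed form to be 2a / (a² + ω²).

```python
import numpy as np

# Numerically verify S_XX(w) = integral of R_XX(tau) * exp(-j*w*tau) d tau
# for R_XX(tau) = exp(-a*|tau|), closed-form PSD: 2a / (a^2 + w^2).
# The decay rate a and the integration grid are illustrative choices; the grid
# is wide enough that R_XX is negligible at the edges.
a = 1.5
tau = np.linspace(-40.0, 40.0, 80001)
dtau = tau[1] - tau[0]
R = np.exp(-a * np.abs(tau))

def psd(w):
    """S_XX(w) by Riemann-sum integration of R_XX(tau) * e^{-j*w*tau}.

    The imaginary part vanishes because R_XX is real and even, so only the
    real part is returned.
    """
    return (np.sum(R * np.exp(-1j * w * tau)) * dtau).real

S0 = psd(0.0)   # should approximate 2/a
S2 = psd(2.0)   # should approximate 2a / (a^2 + 4)
```

The same idea works for any autocorrelation with sufficient decay; in practice one would use an FFT over a sampled R_XX(τ) rather than a loop over frequencies.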
• The average power is given by:
P_XX = ∫_{−∞}^{+∞} S_XX(f) df = (1/2π) ∫_{−∞}^{+∞} S_XX(ω) dω
• Note also that P_XX is given by:
P_XX = lim_{T→∞} (1/2T) ∫_{−T}^{T} E[X²(t)] dt

5.3 Properties of the power spectral density
• The power spectral density has several properties:
✓ S_XX(ω) ≥ 0
✓ S_XX(−ω) = S_XX(ω) for X(t) real
✓ S_XX(ω) is real
✓ P_XX = (1/2π) ∫_{−∞}^{+∞} S_XX(ω) dω = ⟨E[X²(t)]⟩

Example 4
Let X(t) be a wide-sense stationary process with autocorrelation function
R_XX(τ) = A(1 − |τ|/T) for −T ≤ τ ≤ T, and 0 otherwise,
where T > 0 and A are constants. Determine the power spectral density.

Solution:
R_XX(τ) is a triangular pulse of height A and half-width T. Using the Fourier transform of the triangular pulse (from the table of common transforms):
S_XX(ω) = ∫_{−T}^{T} A(1 − |τ|/T) e^{−jωτ} dτ = A T [sin(ωT/2) / (ωT/2)]²

[Table: Fourier transforms of some common functions.]
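The triangular-autocorrelation example can be sanity-checked numerically. The sketch below (illustrative values of A and T, not from the notes) integrates the transform directly and compares it with the standard closed form A·T·[sin(ωT/2)/(ωT/2)]² for a triangular pulse.

```python
import numpy as np

# Numerically check the PSD of the triangular autocorrelation
# R_XX(tau) = A*(1 - |tau|/T) on [-T, T], 0 elsewhere, against the standard
# closed form A*T*(sin(w*T/2)/(w*T/2))**2. A and T are illustrative values.
A, T = 3.0, 2.0
tau = np.linspace(-T, T, 40001)
dtau = tau[1] - tau[0]
R = A * (1.0 - np.abs(tau) / T)

def psd_numeric(w):
    """S_XX(w) by direct numerical integration of the transform integral."""
    return (np.sum(R * np.exp(-1j * w * tau)) * dtau).real

def psd_closed(w):
    """A*T*[sin(x)/x]^2 with x = w*T/2 (and the limit 1 at x = 0).

    np.sinc uses the normalized convention sin(pi*x)/(pi*x), hence the
    division by pi in the argument.
    """
    x = w * T / 2.0
    s = np.sinc(x / np.pi)
    return A * T * s**2
```

At ω = 0 both forms give the area under the triangle, A·T, and they agree at every other frequency up to the integration-grid error.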