
Stochastic Signals Overview
• Definitions
• Second order statistics
• Stationarity and ergodicity
• Random signal variability
• Power spectral density
• Linear systems with stationary inputs
• Random signal memory
• Correlation matrices

Introduction
• Discrete-time stochastic processes provide a mathematical framework for working with non-deterministic signals
• Signals that have an exact functional relationship are often called predictable or deterministic, though some stochastic processes are predictable
• I’m going to use the term deterministic to refer to signals that are not affected by the outcome of a random experiment
• I will use the terms stochastic process and random process interchangeably
J. McNames
Portland State University
ECE 538/638
Stochastic Signals
Ver. 1.10
1
Probability Space
• Conceptually we should imagine a sample space with some number (possibly infinite) of outcomes: Ω = {ζ1, ζ2, . . . }
• Each outcome ζk has a probability Pr{ζk}
• By some rule, each outcome generates a sequence x(n, ζk)
• We can think of x(n, ζk) as a vector of (possibly) infinite duration
• Note that the entire sequence is generated from a single outcome of the underlying experiment
• x(n, ζ) is called a discrete-time stochastic process or a random sequence

Definitions and Interpretations
• Realization: a sample sequence
• Ensemble: the set of all possible sequences, {x(n, ζ)}
• Interpretations
  – Number: x(n, ζ) with both ζ = ζk and n = n0 fixed
  – Sample sequence: x(n, ζ) with ζ = ζk fixed and n treated as an independent (non-random) variable
  – Random variable: x(n, ζ) with n = n0 fixed and ζ treated as a variable
  – Stochastic process: x(n, ζ) with both ζ and n treated as variables
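The four interpretations can be sketched with a finite toy ensemble (the Gaussian experiment and the sizes are illustrative choices, not from the slides):

```python
import numpy as np

# A toy "random experiment": each outcome zeta_k generates one sequence
# x(n, zeta_k), n = 0..N-1.  Rows index outcomes, columns index time.
rng = np.random.default_rng(0)
K, N = 5, 8                          # number of outcomes and time samples
ensemble = rng.normal(size=(K, N))   # x(n, zeta_k) for all k and n

sample_sequence = ensemble[2, :]     # zeta fixed, n varies: one realization
random_variable = ensemble[:, 3]     # n fixed, zeta varies: a random variable
number = ensemble[2, 3]              # both fixed: a single number
```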
Probability Functions
In order to fully characterize a stochastic process, we must consider the cdf or pdf

Fx(x1, . . . , xk ; n1, . . . , nk) ≜ Pr{x(n1) ≤ x1, . . . , x(nk) ≤ xk}
fx(x1, . . . , xk ; n1, . . . , nk) ≜ ∂^k Fx(x1, . . . , xk ; n1, . . . , nk) / (∂x1 · · · ∂xk)

for every k ≥ 1 and any set of sample times {n1, n2, . . . , nk}.
• Without additional sweeping assumptions, estimation of fx(·) from a realization is impossible
• Many stochastic processes can be characterized accurately or, at least, usefully by much less information
• To simplify notation, from here on I will mostly use x(n) to denote both random processes and single realizations
• In most cases I will assume x(n) is complex valued

Second Order Statistics
At any time n, we can specify the mean and variance of x(n)

μx(n) ≜ E[x(n)]
σx²(n) ≜ E[|x(n) − μx(n)|²]

• μx(n) and σx²(n) are both deterministic sequences
• The expectation is taken over the ensemble
• In general, the second-order statistics at two different times are given by the autocorrelation or autocovariance sequences
• Autocorrelation sequence:
  rxx(n1, n2) = E[x(n1) x*(n2)]
• Autocovariance sequence:
  γxx(n1, n2) = E[(x(n1) − μx(n1))(x(n2) − μx(n2))*] = rxx(n1, n2) − μx(n1) μx*(n2)
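A numerical sketch of these ensemble statistics (the process, a deterministic mean sequence plus unit-variance noise, is my hypothetical choice):

```python
import numpy as np

# Estimate mu_x(n), sigma_x^2(n), and one autocorrelation value by averaging
# over the ensemble (axis 0), not over time.
rng = np.random.default_rng(1)
K, N = 20000, 6                        # many outcomes, few time samples
mu_true = np.linspace(0.0, 1.0, N)     # time-varying mean mu_x(n)
x = mu_true + rng.normal(size=(K, N))  # unit-variance fluctuations

mu_hat = x.mean(axis=0)                # approximates E[x(n)]
var_hat = x.var(axis=0)                # approximates sigma_x^2(n)

# r_xx(n1, n2) = E[x(n1) x*(n2)] estimated at n1 = 1, n2 = 4; here the
# fluctuations at different times are independent, so it equals mu(1) mu(4)
r_14 = np.mean(x[:, 1] * np.conj(x[:, 4]))
```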
Cross-Correlation and Cross-Covariance
Cross-Correlation:

rxy(n1, n2) ≜ E[x(n1) y*(n2)]

Cross-Covariance:

γxy(n1, n2) ≜ E[(x(n1) − μx(n1))(y(n2) − μy(n2))*] = rxy(n1, n2) − μx(n1) μy*(n2)

Normalized Cross-Correlation:

ρxy(n1, n2) ≜ γxy(n1, n2) / (σx(n1) σy(n2))

More Definitions
• Independent: iff
  fx(x1, . . . , xk ; n1, . . . , nk) = ∏_{ℓ=1}^{k} fx(xℓ ; nℓ)   ∀k
• Uncorrelated: if
  γx(n1, n2) = σx²(n1) for n1 = n2, and γx(n1, n2) = 0 for n1 ≠ n2
• Orthogonal: if
  rx(n1, n2) = σx²(n1) + |μx(n1)|² for n1 = n2, and rx(n1, n2) = 0 for n1 ≠ n2
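A sketch of these definitions at one fixed pair of times (the joint distribution of x(n1) and y(n2) below is an assumed example):

```python
import numpy as np

# Estimate r_xy, gamma_xy, and rho_xy from many outcomes of the experiment.
rng = np.random.default_rng(2)
K = 50000
x = rng.normal(loc=1.0, size=K)               # x(n1): mean 1, variance 1
y = 0.5 * x + rng.normal(scale=0.5, size=K)   # y(n2) correlated with x(n1)

r_xy = np.mean(x * np.conj(y))                       # E[x(n1) y*(n2)]
gamma_xy = r_xy - x.mean() * np.conj(y.mean())       # r_xy - mu_x mu_y*
rho_xy = gamma_xy / (x.std() * y.std())              # normalized, in [-1, 1]
```

For this construction the true values are γxy = 0.5 and ρxy = 1/√2 ≈ 0.707.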
Still More Definitions
• Wide-sense Periodic: if
  μx(n) = μx(n + N)   ∀n
  rx(n1, n2) = rx(n1 + N, n2) = rx(n1, n2 + N) = rx(n1 + N, n2 + N)
• Statistically Independent: iff for every n1 and n2
  fxy(x, y; n1, n2) = fx(x; n1) fy(y; n2)
• Uncorrelated: if for every n1 and n2, γxy(n1, n2) = 0
• Orthogonal: if for every n1 and n2, rxy(n1, n2) = 0

Stationarity
• Stationary of Order N: a stochastic process x(n) such that
  fx(x1, . . . , xN ; n1, . . . , nN) = fx(x1, . . . , xN ; n1 + k, . . . , nN + k)
  for any set of sample times and any shift k
• Any stochastic process that is stationary of order N is also stationary of order M for all M ≤ N
• Strict-Sense Stationary (SSS): a stochastic process that is stationary of all orders N
Wide Sense Stationary
• Stationary of order 2:
  fx(x1, x2; n1, n2) = fx(x1, x2; n1 + k, n2 + k)
• Wide-Sense Stationary (WSS): a stochastic process with a constant mean and an autocorrelation that depends only on the lag between the two sample times
• WSS properties:
  E[x(n)] = μx
  rx(n1, n2) = rx(ℓ) = rx(n1 − n2) = E[x(n + ℓ) x*(n)]
  γx(ℓ) = rx(ℓ) − |μx|²
• This implies the variance is also constant, var[x(n)] = σx²
• All processes that are stationary of order 2 are WSS
• Not all WSS processes are stationary of order 2
• Note this is slightly different from the text

Example 1: Stationarity
Describe a random process that is stationary. Describe a second random process that is not stationary.
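One possible pair of answers, sketched numerically (the choice of i.i.d. Gaussian noise and a random walk is mine, not the text's):

```python
import numpy as np

# i.i.d. Gaussian noise is stationary (its joint pdf is shift-invariant);
# a random walk is not, since var[x(n)] grows with n.
rng = np.random.default_rng(3)
K, N = 20000, 100
w = rng.normal(size=(K, N))      # stationary: statistics do not depend on n
walk = np.cumsum(w, axis=1)      # nonstationary: var[walk(n)] = n + 1

var_w = w.var(axis=0)            # ~1 at every n
var_walk = walk.var(axis=0)      # grows roughly linearly with n
```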
Stationarity Notes
• SSS implies WSS
• If the process is Gaussian (the samples are jointly Gaussian for all sets of times), then WSS implies SSS
• The book states that most WSS processes are SSS. True?
• Jointly Wide-Sense Stationary: two random signals x(n) and y(n) are jointly WSS if they are both WSS and
  rxy(ℓ) = rxy(n1 − n2) = E[x(n) y*(n − ℓ)]
  γxy(ℓ) = γxy(n1 − n2) = rxy(ℓ) − μx μy*
• WSS is a very useful property because it enables us to consider a spectral description
• In practice, we only need the signal to be WSS long enough to estimate the autocorrelation or cross-correlation

Autocorrelation Sequence Properties
rx(0) = σx² + |μx|²
rx(0) ≥ |rx(ℓ)|
rx(ℓ) = rx*(−ℓ)
∑_{k=1}^{M} ∑_{m=1}^{M} αk rx(k − m) αm* ≥ 0   ∀M and all sequences α
• Average DC Power: |μx|²
• Average AC Power: σx²
• Nonnegative Definite: a sequence is said to be nonnegative definite if it satisfies this last property
• Positive Definite: any sequence that satisfies the last inequality strictly for any nonzero α
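These properties can be checked numerically on an estimated autocorrelation (the MA(2) process below is an assumed example):

```python
import numpy as np

# Estimate r_x(l) for white noise filtered by h = [1, 0.6, 0.3], then check
# r_x(0) >= |r_x(l)| and nonnegative definiteness of the Toeplitz matrix.
rng = np.random.default_rng(4)
n = rng.normal(size=2**16)
x = np.convolve(n, [1.0, 0.6, 0.3], mode="same")   # MA(2) process

L = 8
r = np.array([np.mean(x[l:] * np.conj(x[:len(x) - l])) for l in range(L)])

# alpha^H R alpha >= 0 for all alpha iff the eigenvalues of R are >= 0
R = np.array([[r[abs(i - j)] for j in range(L)] for i in range(L)])
eigs = np.linalg.eigvalsh(R)
```

Here the true values are rx(0) = 1.45 and rx(1) = 0.78, so the maximum is indeed at lag zero.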
Comments on Stationarity
• Many real processes are nonstationary
• Best case: can determine from domain knowledge of the process
• Else: must rely on statistical methods
• Many nonstationary processes are approximately locally stationary (stationary over short periods of time)
• Much of time-frequency analysis is dedicated to this type of signal
• There is no general mathematical framework for analyzing nonstationary signals
• However, many nonstationary stochastic processes can be understood through linear estimation (e.g., Kalman filters)
• Note that nonstationary is a negative definition: not stationary

Introduction to Ergodicity
• In most practical situations we can only observe one or a few realizations
• If the process is ergodic, we can obtain all statistical information from a single realization
• Ensemble Averages: repeat the experiment many times
• Time Averages:
  ⟨·⟩ ≜ lim_{N→∞} [1/(2N + 1)] ∑_{n=−N}^{N} (·)
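A sketch contrasting the two kinds of averages (the i.i.d. process with mean 2 and the constant-amplitude counterexample are assumed examples):

```python
import numpy as np

# For an ergodic process, the time average over one long realization matches
# the ensemble average over many outcomes.
rng = np.random.default_rng(5)
x_single = 2.0 + rng.normal(size=200000)   # one realization, long record
time_avg = x_single.mean()                 # <x(n)> over n

outcomes = 2.0 + rng.normal(size=100000)   # x(n0, zeta_k) over many outcomes
ensemble_avg = outcomes.mean()             # estimate of E[x(n0)]

# Non-ergodic counterexample: x(n) = A for all n with A ~ N(0, 1); the time
# average converges to A (a random variable), not to the ensemble mean 0.
A = rng.normal()
x_const = np.full(1000, A)
```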
Time Averages of Interest
Mean value:         ⟨x(n)⟩
Mean square:        ⟨|x(n)|²⟩
Variance:           ⟨|x(n) − ⟨x(n)⟩|²⟩
Autocorrelation:    ⟨x(n) x*(n − ℓ)⟩
Autocovariance:     ⟨[x(n) − ⟨x(n)⟩][x(n − ℓ) − ⟨x(n)⟩]*⟩
Cross-correlation:  ⟨x(n) y*(n − ℓ)⟩
Cross-covariance:   ⟨[x(n) − ⟨x(n)⟩][y(n − ℓ) − ⟨y(n)⟩]*⟩
• Similar to correlation sequences for deterministic power signals
• Both quantities have the same properties
• Difference
  – Time averages are random variables (functions of the experiment outcome)
  – In the deterministic case the quantities are fixed numbers

Ergodic Random Processes
• Ergodic Random Process: a random signal for which the ensemble averages equal the corresponding time averages
• Like stationarity, there are various degrees
• Ergodic in the Mean: a random process such that ⟨x(n)⟩ = E[x(n)] = μx
• Ergodic in Correlation: a random process such that ⟨x(n) x*(n − ℓ)⟩ = E[x(n) x*(n − ℓ)] = rx(ℓ)
• If a process is ergodic in both mean and correlation, it is also WSS
• Only stationary signals can be ergodic
• WSS does not imply any type of ergodicity
• Text: “Almost all stationary processes are also ergodic.” True?
• Our usage: ergodic = ergodic in both the mean and correlation
More on Ergodicity
• Joint Ergodicity: two random signals are jointly ergodic iff they are individually ergodic and
  ⟨x(n) y*(n − ℓ)⟩ = E[x(n) y*(n − ℓ)]
• Stationarity ensures time invariance of the statistics
• Ergodicity implies the statistics can be obtained from a single realization with time averaging
• In words: one realization (a single ζk) is sufficient to estimate any statistic of the underlying random process

Problems with Ergodicity
• Problem: we never know x(n) for n = −∞ to +∞
• In all real situations, we only have finite records
• The most common estimator is then
  ⟨·⟩N ≜ [1/(2N + 1)] ∑_{n=−N}^{N} (·)
• Note that it is a random variable
• How good is it?
  – Bias
  – Variance
  – Consistency
  – Confidence intervals
  – Distribution
• This is one of the key topics of this class
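The finite-record estimator being a random variable can be sketched by repeating it over many independent records (an idealization we cannot do with a single realization; the i.i.d. process is an assumed example):

```python
import numpy as np

# For i.i.d. unit-variance samples, <x(n)>_N is unbiased and its variance
# is 1/(2N+1), so it is consistent as N grows.
rng = np.random.default_rng(6)
K, N = 5000, 50
x = 1.0 + rng.normal(size=(K, 2 * N + 1))   # K records of length 2N+1
est = x.mean(axis=1)                        # one <x(n)>_N per record

bias = est.mean() - 1.0                     # ~0 here
variance = est.var()                        # ~1/(2N+1)
```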
Ergodic Processes vs. Deterministic Signals

rx(ℓ) = lim_{N→∞} [1/(2N + 1)] ∑_{n=−N}^{N} x(n) x*(n − ℓ)

• The autocorrelation of a deterministic power signal and an ergodic process can be calculated with the same infinite summation
• What’s the difference then?
  – With deterministic signals there is only one signal
  – With stochastic signals, we assume it was generated from an underlying random experiment ζk
  – This enables us to consider the ensemble of possible signals: rx(ℓ) = E[x(n) x*(n − ℓ)]
  – We can therefore draw inferences and make predictions about the population of possible outcomes, not merely this one signal
• Stationary random processes have deterministic correlation sequences
• They have a single index (independent variable)
• Note again that the power spectral density can be calculated with the same equation for deterministic and ergodic signals
• Whether you define a given signal as deterministic or as a single realization of a random process depends largely on the application

Random Processes in the Frequency Domain
Power Spectral Density (PSD)

Rx(e^{jω}) ≜ F{rx(ℓ)} = ∑_{ℓ=−∞}^{∞} rx(ℓ) e^{−jωℓ}
rx(ℓ) = F⁻¹{Rx(e^{jω})} = (1/2π) ∫_{−π}^{π} Rx(e^{jω}) e^{jωℓ} dω
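A sketch of the transform pair (the autocorrelation rx(ℓ) = a^|ℓ| is an assumed AR(1)-style example):

```python
import numpy as np

# Evaluate R_x(e^{jw}) as the DTFT of r_x(l) = a^|l|, then check the inverse
# relation r_x(0) = (1/2pi) * integral of R_x over [-pi, pi].
a = 0.7
lags = np.arange(-200, 201)            # truncation; a^200 is negligible
r = a ** np.abs(lags)

omega = np.linspace(-np.pi, np.pi, 4001)
R = (r[:, None] * np.exp(-1j * np.outer(lags, omega))).sum(axis=0).real

d = omega[1] - omega[0]                # trapezoidal rule for the integral
r0 = (R.sum() - 0.5 * (R[0] + R[-1])) * d / (2 * np.pi)
```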
Power Spectral Density Properties
• Rx(e^{jω}) is real-valued
• Rx(e^{jω}) is periodic with period 2π
• Rx(e^{jω}) ≥ 0 (nonnegative)
• Rx(e^{jω}) has nonnegative area and
  (1/2π) ∫_{−π}^{π} Rx(e^{jω}) dω = rx(0) = E[|x(n)|²]
• If x(n) is real-valued
  – rx(ℓ) is real and even
  – Rx(e^{jω}) is an even function of ω
• What if x(n) is complex-valued?

Periodic and Non-Periodic Processes

Rx(e^{jω}) ≜ F{rx(ℓ)} = ∑_{ℓ=−∞}^{∞} rx(ℓ) e^{−jωℓ}

• If rx(ℓ) is periodic, the DTFS is most appropriate
• Line Spectrum: if we allow impulses in the PSD, then the PSD of a periodic rx(ℓ) consists of an impulse train
• If the process x(n) is non-zero mean (i.e., nonzero average DC power), the PSD will contain an impulse at ω = 0
• More generally, a random process can be composed of both deterministic components and non-periodic components
White Noise
White Noise Process: a random WSS sequence w(n) such that

E[w(n)] = μw
rw(ℓ) = σw² δ(ℓ) + |μw|²

• Specifically, this is a second-order white process
• Notation: w(n) ∼ WN(μw, σw²)
• Not a complete characterization of w(n): the marginal pdf could be anything
• If w(n) is Gaussian, then a white Gaussian process is denoted by w(n) ∼ WGN(μw, σw²)
• The term white comes from properties of white light

Harmonic Process
Harmonic Process: any process defined by

x(n) = ∑_{k=1}^{M} ak cos(ωk n + φk),   ωk ≠ 0

where M, {ak}_{1}^{M}, and {ωk}_{1}^{M} are constant. The random variables {φk}_{1}^{M} are pairwise independent and uniformly distributed in the interval [−π, π].
• x(n) is stationary and ergodic with zero mean and autocorrelation

rx(ℓ) = (1/2) ∑_{k=1}^{M} ak² cos(ωk ℓ)

Harmonic Process Comments
• Note the cosines in the autocorrelation are in-phase
• It is only stationary if all of the random phases are equally likely (uniformly distributed over all possible angles)
• This is an unusual circumstance where the signal is stationary but is parameterized by one or more random variables that are constant over all n
• In general, x(n) is non-Gaussian
• It is a predictable random sequence! (also highly unusual)

Harmonic Process PSD

x(n) = ∑_{k=1}^{M} ak cos(ωk n + φk),   ωk ≠ 0

• The PSD consists of pairs of impulses (line spectrum) of area πak²/2 located at frequencies ±ωk:

Rx(e^{jω}) = ∑_{k=1}^{M} (πak²/2) [δ(ω − ωk) + δ(ω + ωk)],   −π ≤ ω ≤ π

• If all ωk/(2π) are rational numbers, x(n) is periodic and the impulses are equally spaced apart
• This never happens, unless there is a single periodic (perhaps non-sinusoidal) component
• Otherwise they are almost periodic (always happens)
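A sketch verifying the stated autocorrelation by simulating the ensemble (the amplitudes and frequencies are arbitrary choices):

```python
import numpy as np

# Simulate a two-component harmonic process with independent uniform phases
# and compare the ensemble autocorrelation with (1/2) sum_k a_k^2 cos(w_k l).
rng = np.random.default_rng(7)
a = np.array([1.0, 0.5])
w = np.array([0.8, 2.0])
K, N = 20000, 32
phi = rng.uniform(-np.pi, np.pi, size=(K, len(a)))   # one phase set per outcome
n = np.arange(N)

# x[k, n] = sum_i a_i cos(w_i n + phi_{k,i})
x = np.einsum('i,kin->kn', a,
              np.cos(w[None, :, None] * n[None, None, :] + phi[:, :, None]))

lag = 3
r_hat = np.mean(x[:, lag] * x[:, 0])                 # ensemble estimate of r_x(3)
r_theory = 0.5 * np.sum(a**2 * np.cos(w * lag))
```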
Cross-Power Spectral Density
Cross-Power Spectral Density: if x(n) and y(n) are jointly stationary stochastic processes,

Rxy(e^{jω}) ≜ F{rxy(ℓ)} = ∑_{ℓ=−∞}^{∞} rxy(ℓ) e^{−jωℓ}
rxy(ℓ) = (1/2π) ∫_{−π}^{π} Rxy(e^{jω}) e^{jωℓ} dω

• Also known as the cross-spectrum
• Note that unlike the PSD, it is not real-valued, in general
• Rxy(e^{jω}) = Ryx*(e^{jω})

Coherence
Normalized Cross-Spectrum:

Gxy(e^{jω}) ≜ Rxy(e^{jω}) / √(Rx(e^{jω}) Ry(e^{jω}))

Also known as the coherency spectrum or simply coherency. Similar to the correlation coefficient in frequency.
Coherence Function:

|Gxy(e^{jω})|² ≜ |Rxy(e^{jω})|² / (Rx(e^{jω}) Ry(e^{jω}))

Also known as the coherence and magnitude squared coherence.
• If y(n) = h(n) ∗ x(n), then |Gxy(e^{jω})|² = 1 ∀ω
• If rxy(ℓ) = 0, then |Gxy(e^{jω})|² = 0 ∀ω
• 0 ≤ |Gxy(e^{jω})|² ≤ 1
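The two extreme cases can be sketched with Welch-based estimates (the FIR filter is an assumed example; `scipy.signal.coherence` averages over segments):

```python
import numpy as np
from scipy import signal

# For y = h * x the estimated magnitude-squared coherence is near 1 at all
# frequencies; for independent signals it is near 0.
rng = np.random.default_rng(8)
x = rng.normal(size=2**16)
y_filtered = signal.lfilter([1.0, -0.9], 1.0, x)   # y(n) = h(n) * x(n)
y_indep = rng.normal(size=2**16)                    # r_xy(l) = 0

f, C_filt = signal.coherence(x, y_filtered, nperseg=256)
f, C_indep = signal.coherence(x, y_indep, nperseg=256)
```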
Linear Transforms and Coherence
x(n) → H(z) → y(n)
• Linear transforms have no effect on coherence
• Similar to the case of random variables: if y = mx + b, then x and y are perfectly correlated, ρ = ±1

Linear Transforms and Coherence
Now suppose y(n) is produced by filtering x(n) with H(z), adding independent noise w(n) filtered by G(z), and passing the sum through a final filter F(z). Then

|Gxy(e^{jω})|² = Rx(e^{jω})|H(e^{jω})|² / (Rx(e^{jω})|H(e^{jω})|² + Rw(e^{jω})|G(e^{jω})|²)

• Noise w(n) decreases coherence
• The final linear transform F(z) has no effect!
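A sketch of this cascade (all filter choices are hypothetical):

```python
import numpy as np
from scipy import signal

# y is filtered x plus independent filtered noise, then a final filter F(z):
# the noise lowers the coherence below 1, and F(z) leaves it unchanged.
rng = np.random.default_rng(11)
N = 2**16
x = rng.normal(size=N)
w = rng.normal(size=N)
s = signal.lfilter([1.0, 0.5], 1.0, x) + signal.lfilter([0.7], 1.0, w)
y = signal.lfilter([1.0, -0.4], 1.0, s)            # final filter F(z)

f, C_noisy = signal.coherence(x, s, nperseg=256)   # before F(z)
f, C_final = signal.coherence(x, y, nperseg=256)   # after F(z): ~unchanged
```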
Complex Spectral Density Functions
Complex Spectral Density:

Rx(z) ≜ ∑_{ℓ=−∞}^{∞} rx(ℓ) z^{−ℓ}
Ry(z) ≜ ∑_{ℓ=−∞}^{∞} ry(ℓ) z^{−ℓ}

Complex Cross-Spectral Density:

Rxy(z) ≜ ∑_{ℓ=−∞}^{∞} rxy(ℓ) z^{−ℓ}

Random Processes and Linear Systems
If the input to an LTI system is a random process, so is the output:

y(n, ζ) = ∑_{k=−∞}^{∞} h(k) x(n − k, ζ)

• If the system is BIBO stable and the input process is stationary with E[|x(n, ζ)|] < ∞, then the output converges absolutely with probability one
• In English, the output is stationary
• If E[|x(n, ζ)|²] < ∞, then E[|y(n, ζ)|²] < ∞
• If h(n) has finite energy, the output converges in the mean square sense

Linear System Statistics
x(n) → h(n), H(z) → y(n)
Let x(n) be a random process that is the input to an LTI system with an output y(n).

μy = ∑_{k=−∞}^{∞} h(k) E[x(n − k)] = μx ∑_{k=−∞}^{∞} h(k) = μx H(e^{j0})

rxy(ℓ) = ∑_{k=−∞}^{∞} h*(k) rx(ℓ + k) = ∑_{m=−∞}^{∞} h*(−m) rx(ℓ − m)

rxy(ℓ) = h*(−ℓ) ∗ rx(ℓ)
ryx(ℓ) = h(ℓ) ∗ rx(ℓ)
ry(ℓ) = h(ℓ) ∗ rxy(ℓ) = h(ℓ) ∗ h*(−ℓ) ∗ rx(ℓ) = rh(ℓ) ∗ rx(ℓ)

Output Power
Let x(n) be a random process that is the input to an LTI system with an output y(n).

Py ≜ ry(0) = [rh(ℓ) ∗ rx(ℓ)]_{ℓ=0} = ∑_{k=−∞}^{∞} rh(k) rx(−k) = ∑_{k=−∞}^{∞} rh(k) rx*(k)

If the system is FIR, then
Py = h^H Rx h
If μx = 0, then μy = 0 and σy² = Py
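A numerical sketch of these statistics (the FIR filter and the white input with nonzero mean are illustrative choices):

```python
import numpy as np

# Check mu_y = mu_x H(e^{j0}) and P_y = h^H R_x h for a short FIR system
# driven by white noise with mean 2 and unit variance.
rng = np.random.default_rng(9)
h = np.array([0.5, 1.0, 0.25])
mu_x = 2.0
x = mu_x + rng.normal(size=2**18)          # r_x(l) = delta(l) + mu_x^2
y = np.convolve(x, h, mode="valid")

mu_y = y.mean()                            # should be mu_x * sum(h) = mu_x H(e^{j0})
r_h = np.correlate(h, h, mode="full")      # deterministic correlation of h

M = len(h)
Rx = np.eye(M) + mu_x**2 * np.ones((M, M)) # Toeplitz R_x for this input
Py = h @ Rx @ h                            # h^H R_x h (h is real here)
```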
z Domain Analysis
x(n) → h(n), H(z) → y(n)
Note that if h(n) is real, then h*(−n) = h(−n) and h(−n) ↔ H(z⁻¹). In general,

Z{h*(−n)} = H*(z^{−*})

Rxy(z) = Z{h*(−ℓ) ∗ rx(ℓ)} = H*(z^{−*}) Rx(z)
Ryx(z) = Z{h(ℓ) ∗ rx(ℓ)} = H(z) Rx(z)
Ry(z) = H(z) H*(z^{−*}) Rx(z)

Output Distribution
• In general, it is very difficult to solve for the output PDF (even when y(n) is WSS)
• If x(n) is a Gaussian process, the output is a Gaussian process
• If x(n) is IID,
  – The output is a weighted sum of IID random variables
  – If the distribution of x(n) is stable, then y(n) has the same distribution (even if the mean and variance differ)
  – If many of the largest weights are approximately equal so that many elements of the input signal have an equal effect on the output, then the CLT applies (approximately) and the output will be approximately Gaussian

Frequency Domain Analysis
If the system is stable, z = e^{jω} lies in the ROC and the following relations hold

Rxy(e^{jω}) = H*(e^{jω}) Rx(e^{jω})
Ryx(e^{jω}) = H(e^{jω}) Rx(e^{jω})
Ry(e^{jω}) = |H(e^{jω})|² Rx(e^{jω})

• Knowing ry(ℓ) and rx(ℓ), or the input and output PSDs, is sufficient to determine |H(e^{jω})|
• We can’t estimate H(e^{jω}) from this information (the second-order statistics)
• Only rxy(ℓ) or Rxy(e^{jω}) can provide phase information

Random Signal & System Memory
• Zero-memory: a process for which rx(ℓ) = σx² δ(ℓ)
• Examples: white noise, IID process
• We can create a signal with memory (dependence) by passing a zero-memory process through an LTI system
• Extent and degree of imposed dependence depend on h(n)
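The magnitude relation can be sketched with Welch PSD estimates (the FIR system is an assumed example; the PSDs determine |H| but not the phase of H):

```python
import numpy as np
from scipy import signal

# The output PSD should track |H(e^{jw})|^2 times the input PSD.
rng = np.random.default_rng(10)
x = rng.normal(size=2**17)
b = np.array([1.0, 0.5, -0.3])
y = signal.lfilter(b, 1.0, x)

f, Rx = signal.welch(x, nperseg=512)
f, Ry = signal.welch(y, nperseg=512)
w = 2 * np.pi * f
H2 = np.abs(b[0] + b[1] * np.exp(-1j * w) + b[2] * np.exp(-2j * w)) ** 2
ratio = Ry[5:-5] / (H2[5:-5] * Rx[5:-5])   # ~1 away from the band edges
```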
Correlation Length
• Correlation Length: given a WSS process,

  Lc ≜ ∑_{ℓ=0}^{∞} ρx(ℓ)   where   ρx(ℓ) ≜ rx(ℓ)/rx(0)

• Equal to the “area” of the normalized autocorrelation curve
• Undesirable properties
  – Why is it one sided?
  – Lengths should not be negative, in general. Could this be negative? (e.g., r(ℓ) = [1.0000, −0.3214, −0.7538] for ℓ = 1, 2, 3)
  – “Zero-memory” processes have a non-zero correlation length (Lc = 1)

Short Memory Processes
• Short Memory: a WSS process x(n) such that

  ∑_{ℓ=−∞}^{∞} ρx(ℓ) < ∞

• For example, the autocorrelation decays exponentially: ρx(ℓ) ≈ a^{|ℓ|} for large ℓ
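A sketch of the correlation length for an assumed exponentially decaying normalized autocorrelation, where the geometric series gives Lc = 1/(1 − a):

```python
import numpy as np

# rho_x(l) = a^l for l >= 0 gives L_c = 1/(1 - a); a delta-correlated
# ("zero-memory") process gives L_c = 1, not 0.
a = 0.8
l = np.arange(400)                        # truncation; a^400 is negligible
Lc = np.sum(a ** l)                       # ~1/(1 - a) = 5

rho_white = np.where(l == 0, 1.0, 0.0)    # rho_x(l) = delta(l)
Lc_white = rho_white.sum()                # = 1
```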
Long Memory Processes
• Long memory: for a WSS signal x(n) with finite variance, if there exists 0 < α < 1 and Cr > 0 such that

  lim_{ℓ→∞} rx(ℓ) / (Cr σx² ℓ^{−α}) = 1

• Equivalently, there exists 0 ≤ β < 1 and Cr > 0 such that

  lim_{ω→0} Rx(e^{jω}) |ω|^{β} / (Cr σx²) = 1

• Implies
  – The autocorrelation has heavy tails
  – The autocorrelation decays as a power law: ρx(ℓ) ≈ Cr |ℓ|^{−α} as ℓ → ∞
  – ∑_{ℓ=−∞}^{∞} ρx(ℓ) = ∞
  – The process has infinite autocorrelation length

Correlation Matrices
Let the random vector x(n) be related to the (possibly nonstationary) random process x(n) as follows

x(n) ≜ [x(n) x(n − 1) · · · x(n − M + 1)]^T
E[x(n)] = [μx(n) μx(n − 1) · · · μx(n − M + 1)]^T

Rx(n) ≜ E[x(n) x(n)^H]
      = [ rx(n, n)           · · ·  rx(n, n − M + 1)
          ...                 ...    ...
          rx(n − M + 1, n)   · · ·  rx(n − M + 1, n − M + 1) ]

Note that Rx(n) is nonnegative definite and Hermitian since rx(n − i, n − j) = rx*(n − j, n − i).
Correlation Matrices
If x(n) is a stationary process, the correlation matrix becomes

Rx = [ rx(0)        rx(1)        rx(2)        · · ·  rx(M − 1)
       rx*(1)       rx(0)        rx(1)        · · ·  rx(M − 2)
       rx*(2)       rx*(1)       rx(0)        · · ·  rx(M − 3)
       ...          ...          ...          ...    ...
       rx*(M − 1)   rx*(M − 2)   rx*(M − 3)   · · ·  rx(0)     ]

In this case Rx is Hermitian (Rx = Rx^H), Toeplitz (the elements along each diagonal are equal), and nonnegative definite.

Conditioning of Correlation Matrix
• Condition Number: the condition number of a positive definite matrix Rx is

  χ(Rx) ≜ λmax / λmin

  where λmax and λmin are the largest and smallest eigenvalues of the autocorrelation matrix, respectively
• If x(n) is a WSS random process, then the eigenvalues of the autocorrelation matrix are bounded by the dynamic range of the PSD:

  min_ω Rx(e^{jω}) ≤ λi ≤ max_ω Rx(e^{jω})   ∀λi

• See text for proof
• Interpretation: a large spread in eigenvalues implies
  – The PSD is more variable (less flat)
  – The process is less like white noise (more predictable)
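The eigenvalue bound can be sketched numerically (the AR(1)-style autocorrelation rx(ℓ) = a^|ℓ| is an assumed example):

```python
import numpy as np

# Build the Toeplitz autocorrelation matrix of a WSS process and check that
# its eigenvalues lie between the min and max of the PSD.
a = 0.7
M = 16
r = a ** np.arange(M)                                  # r_x(l) = a^|l|
Rx = np.array([[r[abs(i - j)] for j in range(M)] for i in range(M)])
eigs = np.linalg.eigvalsh(Rx)
chi = eigs.max() / eigs.min()                          # condition number

omega = np.linspace(-np.pi, np.pi, 2001)
psd = (1 - a**2) / (1 - 2 * a * np.cos(omega) + a**2)  # DTFT of a^|l|
```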