System Identification
Lecture 2
Ali Karimpour, Assistant Professor
Ferdowsi University of Mashhad
Reference: Lennart Ljung, "System Identification: Theory for the User", Prentice Hall, 1999.
Introduction

Topics to be covered include:

Impulse responses and transfer functions.
Frequency-domain expressions.
Stochastic processes.
Signal spectra.
Disturbances.
Ergodicity.

Ali Karimpour Sep 2010
Impulse responses

It is well known that a linear, time-invariant, causal system can be described as

$$y(t) = \int_{-\infty}^{t} g(t-\tau)\,u(\tau)\,d\tau = \int_{0}^{\infty} g(\tau)\,u(t-\tau)\,d\tau$$

Sampling at the instants t = kT gives

$$y(kT) = \int_{0}^{\infty} g(\tau)\,u(kT-\tau)\,d\tau$$

Most often, the input signal u(t) is kept constant between the sampling instants:

$$u(t) = u_k, \qquad kT \le t < (k+1)T$$

So

$$y(kT) = \int_{0}^{\infty} g(\tau)\,u(kT-\tau)\,d\tau = \sum_{l=1}^{\infty} \int_{(l-1)T}^{lT} g(\tau)\,u(kT-\tau)\,d\tau = \sum_{l=1}^{\infty} \left[\int_{(l-1)T}^{lT} g(\tau)\,d\tau\right] u_{k-l} = \sum_{l=1}^{\infty} g_T(l)\, u_{k-l}$$
Impulse responses

$$y(kT) = \sum_{l=1}^{\infty} g_T(l)\, u_{k-l}, \qquad \text{where } g_T(l) = \int_{(l-1)T}^{lT} g(\tau)\,d\tau$$

For ease of notation, assume that T is one time unit and use t to enumerate the sampling instants:

$$y(t) = \sum_{k=1}^{\infty} g(k)\, u(t-k), \qquad t = 0, 1, 2, 3, \ldots$$
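As a quick illustration, the sum above can be sketched in Python. This is a minimal sketch, assuming a hypothetical impulse response g(k) = 0.5^k and a unit-step input; the infinite sum is truncated.

```python
# A minimal sketch of the sampled-data convolution y(t) = sum_{k>=1} g(k) u(t-k).
# The impulse response g(k) = 0.5**k is a hypothetical example, truncated at 49 terms.

def simulate(g, u, t):
    """Output at time t for impulse response g (g[k-1] = g(k)) and input u(s), s >= 0."""
    return sum(gk * u(t - k) for k, gk in enumerate(g, start=1) if t - k >= 0)

g = [0.5 ** k for k in range(1, 50)]   # truncated impulse response
u = lambda s: 1.0                      # unit step input applied from s = 0

# For a step input, y(10) = sum_{k=1}^{10} 0.5**k = 1 - 0.5**10 (geometric series).
y10 = simulate(g, u, 10)
print(round(y10, 4))
```

For a step input the output climbs toward the static gain, here the geometric series sum of g(k).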
Transfer functions

Define the forward and backward shift operators q and q⁻¹ as

$$q\,u(t) = u(t+1), \qquad q^{-1}u(t) = u(t-1)$$

Now we can write the output as

$$y(t) = \sum_{k=1}^{\infty} g(k)\,u(t-k) = \sum_{k=1}^{\infty} g(k)\,q^{-k}u(t) = \left[\sum_{k=1}^{\infty} g(k)\,q^{-k}\right]u(t) = G(q)\,u(t)$$

G(q) is the transfer operator or transfer function:

$$G(q) = \sum_{k=1}^{\infty} g(k)\,q^{-k}$$

Similarly, for the disturbance we have

$$v(t) = H(q)\,e(t)$$

So the basic description of a linear system with additive disturbance is

$$y(t) = G(q)\,u(t) + H(q)\,e(t)$$
Transfer functions

Some terminology:

G(q) is the transfer operator or transfer function

$$G(q) = \sum_{k=1}^{\infty} g(k)\,q^{-k} \qquad \text{or} \qquad G(z) = \sum_{k=1}^{\infty} g(k)\,z^{-k}$$

We shall say that the transfer function G(q) is stable if

$$\sum_{k=1}^{\infty} |g(k)| < \infty$$

This means that G(z) is analytic on and outside the unit circle.

We shall say the filter H(q) is monic if h(0) = 1:

$$H(q) = \sum_{k=0}^{\infty} h(k)\,q^{-k}$$
Frequency-domain expressions

Let

$$u(t) = \cos \omega t$$

It will be convenient to write

$$u(t) = \mathrm{Re}\; e^{i\omega t}$$

Now we can write the output as

$$y(t) = \sum_{k=1}^{\infty} g(k)\,\mathrm{Re}\; e^{i\omega (t-k)} = \mathrm{Re} \sum_{k=1}^{\infty} g(k)\, e^{i\omega (t-k)} = \mathrm{Re}\!\left[ e^{i\omega t} \sum_{k=1}^{\infty} g(k)\, e^{-i\omega k} \right] = \mathrm{Re}\!\left[ e^{i\omega t}\, G(e^{i\omega}) \right]$$

So we have

$$y(t) = \left| G(e^{i\omega}) \right| \cos\!\left(\omega t + \arg G(e^{i\omega})\right)$$
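The relation above can be checked numerically. This is a sketch assuming a hypothetical stable impulse response g(k) = 0.5^k (truncated); it compares direct convolution of a cosine input with the frequency-response formula.

```python
# Numerical check of y(t) = |G(e^{iw})| cos(wt + arg G(e^{iw})) for a stable filter.
# g(k) = 0.5**k, k = 1..K, is a hypothetical truncated impulse response.
import cmath, math

K = 60
g = {k: 0.5 ** k for k in range(1, K + 1)}
w = 0.7  # frequency in rad/sample

# G(e^{iw}) = sum_k g(k) e^{-iwk}
G = sum(gk * cmath.exp(-1j * w * k) for k, gk in g.items())

t = 200
y_conv = sum(g[k] * math.cos(w * (t - k)) for k in g)   # direct convolution of cos(wt)
y_freq = abs(G) * math.cos(w * t + cmath.phase(G))      # frequency-response formula
print(abs(y_conv - y_freq) < 1e-9)
```

The two computations agree to machine precision, since both are exact rearrangements of the same finite sum.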
Frequency-domain expressions

$$u(t) = \cos \omega t = \mathrm{Re}\; e^{i\omega t} \quad\Rightarrow\quad y(t) = \left| G(e^{i\omega}) \right| \cos\!\left(\omega t + \arg G(e^{i\omega})\right)$$

Now suppose the input is switched on at t = 0:

$$u(t) = \begin{cases} \cos \omega t = \mathrm{Re}\; e^{i\omega t}, & t \ge 0 \\ 0, & t < 0 \end{cases}$$

Then

$$y(t) = \sum_{k=1}^{t} g(k)\,\mathrm{Re}\; e^{i\omega (t-k)} = \mathrm{Re}\!\left[ e^{i\omega t} \sum_{k=1}^{t} g(k)\, e^{-i\omega k} \right] = \mathrm{Re}\!\left[ e^{i\omega t} \sum_{k=1}^{\infty} g(k)\, e^{-i\omega k} \right] - \mathrm{Re}\!\left[ e^{i\omega t} \sum_{k=t+1}^{\infty} g(k)\, e^{-i\omega k} \right]$$

For a stable system, the last term tends to zero as t → ∞, so

$$y(t) = \left| G(e^{i\omega}) \right| \cos\!\left(\omega t + \arg G(e^{i\omega})\right) - \mathrm{Re}\!\left[ e^{i\omega t} \sum_{k=t+1}^{\infty} g(k)\, e^{-i\omega k} \right]$$

and the output approaches the steady-state sinusoidal response.
Periodograms of signals over finite intervals

Fourier transform (FT):

$$g(\omega) = \frac{1}{2\pi}\int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\,dt, \qquad f(t) = \int_{-\infty}^{\infty} g(\omega)\, e^{i\omega t}\,d\omega$$

Discrete Fourier transform (DFT): given the finite sequence u(1), u(2), u(3), u(4), ..., u(N), define

$$U_N(\omega) = \frac{1}{\sqrt{N}} \sum_{t=1}^{N} u(t)\, e^{-i\omega t}$$

evaluated at the frequencies

$$\omega = \frac{2\pi k}{N}, \qquad k = 1, 2, \ldots, N$$

The sequence can be recovered from these values:

$$u(t) = \frac{1}{\sqrt{N}} \sum_{k=1}^{N} U_N(2\pi k/N)\, e^{i 2\pi k t/N}$$

Exercise 1: Show that u(t) can be recovered by substituting U_N(ω) into the expression above.
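Exercise 1 can also be checked numerically. A minimal sketch of the normalized DFT and its inversion, using arbitrary example data:

```python
# Sketch of the normalized DFT U_N(w) = (1/sqrt(N)) sum_t u(t) e^{-iwt}
# and the inversion u(t) = (1/sqrt(N)) sum_k U_N(2*pi*k/N) e^{i*2*pi*k*t/N}.
import cmath, math

def U_N(u, w):
    N = len(u)
    return sum(u[t - 1] * cmath.exp(-1j * w * t) for t in range(1, N + 1)) / math.sqrt(N)

u = [1.0, -2.0, 3.0, 0.5, -1.5, 2.5]   # arbitrary example data, u(1)..u(N)
N = len(u)
Uk = [U_N(u, 2 * math.pi * k / N) for k in range(1, N + 1)]

# Recover u(t) from the N DFT values (Exercise 1):
u_rec = [sum(Uk[k - 1] * cmath.exp(1j * 2 * math.pi * k * t / N)
             for k in range(1, N + 1)).real / math.sqrt(N)
         for t in range(1, N + 1)]
print(all(abs(a - b) < 1e-9 for a, b in zip(u, u_rec)))
```

The recovery works because the sum over k of e^{i2πk(t−s)/N} equals N when t = s (mod N) and 0 otherwise.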
Periodograms of signals over finite intervals

Some properties of U_N(ω):

$$U_N(-\omega) = \overline{U_N(\omega)}, \qquad U_N(\omega + 2\pi) = U_N(\omega)$$

The function U_N(ω) is therefore uniquely defined by its values over the interval [0, 2π]. It is, however, customary to consider U_N(ω) over the interval [−π, π], so u(t) can be written as

$$u(t) = \frac{1}{\sqrt{N}} \sum_{k=-N/2+1}^{N/2} U_N(2\pi k/N)\, e^{i 2\pi k t/N}$$

The number U_N(ω) tells us the weight that the frequency ω carries in the decomposition, so

$$\left| U_N(\omega) \right|^2$$

is known as the periodogram of the signal u(t), t = 1, 2, 3, ...

Parseval's relationship:

$$\sum_{k=1}^{N} \left| U_N(2\pi k/N) \right|^2 = \sum_{t=1}^{N} u^2(t)$$
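Parseval's relationship can be verified numerically with the same normalized DFT; the data below are an arbitrary example.

```python
# Check of Parseval's relationship: sum_k |U_N(2*pi*k/N)|^2 = sum_t u(t)^2.
import cmath, math

def U_N(u, w):
    N = len(u)
    return sum(u[t - 1] * cmath.exp(-1j * w * t) for t in range(1, N + 1)) / math.sqrt(N)

u = [0.3, 1.7, -0.4, 2.2, -1.1, 0.8, 0.0, -2.0]   # arbitrary example data
N = len(u)
energy_freq = sum(abs(U_N(u, 2 * math.pi * k / N)) ** 2 for k in range(1, N + 1))
energy_time = sum(x * x for x in u)
print(abs(energy_freq - energy_time) < 1e-9)
```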
Periodograms of signals over finite intervals

Example (periodogram of a sinusoid):

$$u(t) = A \cos \omega_0 t$$

Suppose u(t) is periodic, so ω₀ = 2π/N₀ for some integer N₀ > 1. Let t = 1, 2, 3, ..., N, where N is a multiple of N₀ (N = sN₀). Then

$$U_N(\omega) = \frac{1}{\sqrt{N}} \sum_{t=1}^{N} A \cos(\omega_0 t)\, e^{-i\omega t} = \frac{A}{2\sqrt{N}} \sum_{t=1}^{N} \left[ e^{i(\omega_0 - \omega)t} + e^{-i(\omega_0 + \omega)t} \right]$$

so that

$$\left| U_N(\omega) \right|^2 = \begin{cases} \dfrac{A^2 N}{4}, & \text{if } \omega = \omega_0 = \dfrac{2\pi s}{N} \\[4pt] 0, & \text{if } \omega = \dfrac{2\pi k}{N},\ k \ne s,\ k \ne N-s \end{cases}$$

(The mirror grid frequency 2π(N−s)/N ≡ −ω₀ carries the same peak value A²N/4.)
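A numerical check of this example, with hypothetical values A = 2, N₀ = 8, s = 4:

```python
# Numerical check of the sinusoid periodogram: |U_N(w)|^2 = A^2*N/4 at w = w0 = 2*pi*s/N,
# and 0 at the other DFT grid frequencies (except the mirror frequency 2*pi*(N-s)/N).
import cmath, math

A, N0, s = 2.0, 8, 4            # example values: w0 = 2*pi/N0, N = s*N0
N = s * N0
w0 = 2 * math.pi / N0
u = [A * math.cos(w0 * t) for t in range(1, N + 1)]

def periodogram(u, w):
    N = len(u)
    U = sum(u[t - 1] * cmath.exp(-1j * w * t) for t in range(1, N + 1)) / math.sqrt(N)
    return abs(U) ** 2

print(abs(periodogram(u, w0) - A ** 2 * N / 4) < 1e-9)   # peak A^2*N/4 at w0
print(periodogram(u, 2 * math.pi * 1 / N) < 1e-9)        # zero at other grid points
```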
Periodograms of signals over finite intervals

The periodogram defines, in a sense, the frequency content of a signal over a finite time interval. But we seek a definition of a similar concept for signals over the interval [1, ∞). The natural candidate

$$\lim_{N \to \infty} U_N(\omega) = \lim_{N \to \infty} \frac{1}{\sqrt{N}} \sum_{t=1}^{N} u(t)\, e^{-i\omega t}$$

fails to exist for many signals of practical interest.
Transformation of Periodograms

As a signal is filtered through a linear system, its periodogram changes.

Let s(t) = G(q)w(t), where G(q) is a stable transfer function, and define the DFTs S_N(ω) and W_N(ω) of s and w as before.

Claim (Ljung, Theorem 2.1):

$$S_N(\omega) = G(e^{i\omega})\, W_N(\omega) + R_N(\omega)$$

where the remainder is uniformly small for large N:

$$\left| R_N(\omega) \right| \le \frac{2\, C_G\, C_w}{\sqrt{N}}, \qquad C_G = \sum_{k=1}^{\infty} k\,|g(k)|, \quad C_w = \max_t |w(t)|$$

so that the periodograms are related by |S_N(ω)|² ≈ |G(e^{iω})|² |W_N(ω)|².

Proof sketch: substitute s(t) = Σ_k g(k)w(t−k) into the definition of S_N(ω), change the summation variable to t−k, and note that each term then differs from G(e^{iω})W_N(ω) only in at most 2k boundary samples, each bounded by C_w/√N.
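The claim can be checked numerically. This sketch assumes a hypothetical truncated impulse response g(k) = 0.6^k and an arbitrary bounded input; the printed remainder magnitude stays below the theoretical bound 2·C_G·max|w|/√N.

```python
# Sketch checking S_N(w) = G(e^{iw}) W_N(w) + R_N(w) for s(t) = G(q)w(t).
# g(k) = 0.6**k (truncated) and the input w(t) are hypothetical examples.
import cmath, math

K, N = 40, 256
g = [0.6 ** k for k in range(1, K + 1)]
w_sig = [math.cos(0.3 * t) + 0.5 * math.sin(1.1 * t) for t in range(-K, N + 1)]

def w_at(t):   # w(t) for t in -K..N, shifted into list indices
    return w_sig[t + K]

# Filtered signal s(t) = sum_k g(k) w(t-k), t = 1..N
s = [sum(g[k - 1] * w_at(t - k) for k in range(1, K + 1)) for t in range(1, N + 1)]

def dft(x, om):   # (1/sqrt(N)) sum_{t=1}^{N} x(t) e^{-i om t}
    return sum(x[t - 1] * cmath.exp(-1j * om * t) for t in range(1, N + 1)) / math.sqrt(N)

om = 2 * math.pi * 20 / N
G = sum(gk * cmath.exp(-1j * om * k) for k, gk in enumerate(g, start=1))
W = dft([w_at(t) for t in range(1, N + 1)], om)
S = dft(s, om)
print(abs(S - G * W))   # remainder term, O(1/sqrt(N))
```

Here C_G ≈ 3.75 and max|w| ≤ 1.5, so the remainder is guaranteed below about 0.71 for N = 256, and shrinks as N grows.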
Stochastic Processes

A random variable (RV) is a rule (or function) that assigns a real number to every outcome of a random experiment.

[Figure: the closing price of the Iranian power market observed from Apr. 15 to Sep. 22, 2009.]

For a scalar RV e (and similarly for a vector RV), the probability density function (PDF) f_e satisfies

$$P(a \le e \le b) = \int_{a}^{b} f_e(x)\,dx$$

If e may assume a certain value with nonzero probability, then f_e contains a δ function.

Two random variables e₁ and e₂ are independent if

$$P(e_1 \le x_1 \ \wedge\ e_2 \le x_2) = P(e_1 \le x_1)\cdot P(e_2 \le x_2)$$

Definition: The expectation E[e] of a random variable e is

$$E[e] = \int_{-\infty}^{\infty} x\, f_e(x)\,dx$$

Definition: The variance (covariance), Cov[e], of a random variable e is

$$\mathrm{Cov}[e] = E\big[(e - E[e])(e - E[e])^{T}\big]$$
Stochastic Processes

A stochastic process is a rule (or function) that assigns a time function to every outcome of a random experiment.

• Consider the random experiment of tossing a die at t = 0 and observing the number on the top face.
• The sample space of this experiment consists of the outcomes {1, 2, 3, ..., 6}.
• For each outcome of the experiment, let us arbitrarily assign a function of time t.
• The set of functions {x₁(t), x₂(t), ..., x₆(t)} represents a stochastic process.
Stochastic Processes

The mean of a random process X(t) is

$$m_X(t) = E[X(t)]$$

In general, m_X(t) is a function of time.

The correlation R_X(t₁, t₂) of a random process X(t) is

$$R_X(t_1, t_2) = E[X(t_1)X(t_2)]$$

Note that R_X(t₁, t₂) is a function of t₁ and t₂.

The autocovariance C_X(t₁, t₂) of a random process X(t) is defined as the covariance of X(t₁) and X(t₂):

$$C_X(t_1, t_2) = E\big[(X(t_1) - m_X(t_1))(X(t_2) - m_X(t_2))\big] = R_X(t_1, t_2) - m_X(t_1)\,m_X(t_2)$$

In particular, when t₁ = t₂ = t, we have C_X(t, t) = Var[X(t)].
Stochastic Processes

Example (sinusoid with random amplitude): X(t) = A cos ω₀t, where A is a random variable. Then

$$m_X(t) = E[A]\cos\omega_0 t, \qquad R_X(t_1, t_2) = E[A^2]\cos\omega_0 t_1 \cos\omega_0 t_2$$

Both depend on the absolute times, not only on the lag t₁ − t₂.
Stochastic Processes

Example (sinusoid with random phase): X(t) = cos(ω₀t + Θ), where Θ is uniform on [−π, π]. Then

$$m_X(t) = 0, \qquad R_X(t_1, t_2) = \tfrac{1}{2}\cos\omega_0 (t_1 - t_2)$$

which depends only on the lag t₁ − t₂.
Stochastic Processes

x(t) is (wide-sense) stationary if

$$m_X(t) = m \ (\text{constant}) \qquad \text{and} \qquad R_X(t_1, t_2) = R_X(t_1 - t_2)$$

Example (sinusoid with random phase): clearly x(t) is stationary (WSS).

Example (sinusoid with random amplitude): clearly x(t) is not stationary.

This may, however, be a limiting (too restrictive) definition.
Signal Spectra

A common framework for deterministic and stochastic signals:

$$y(t) = G(q)\,u(t) + H(q)\,e(t) \qquad\Rightarrow\qquad E\,y(t) = G(q)\,u(t)$$

The mean depends on t, so y(t) is not a stationary process; stationarity would be too limiting a requirement here. To deal with this problem, we introduce the following definition: quasi-stationary signals.
Signal Spectra

Quasi-stationary signals: A signal {s(t)} is said to be quasi-stationary if it is subject to

$$E\,s(t) = m_s(t), \qquad |m_s(t)| \le C \ \ \forall t$$

and

$$E\,s(t)s(r) = R_s(t, r), \qquad |R_s(t, r)| \le C, \qquad \lim_{N\to\infty} \frac{1}{N}\sum_{t=1}^{N} R_s(t, t-\tau) = R_s(\tau) \ \text{exists} \ \forall \tau$$

If {s(t)} is a deterministic sequence, quasi-stationarity means that {s(t)} is a bounded sequence and that

$$R_s(\tau) = \lim_{N \to \infty} \frac{1}{N}\sum_{t=1}^{N} s(t)\,s(t-\tau) \quad \text{exists.}$$

If {s(t)} is a stationary stochastic process, it is quasi-stationary, since E s(t)s(t−τ) = R_s(τ) does not depend on t.
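For a concrete deterministic example, the limit defining R_s(τ) can be approximated by a finite time average; for s(t) = A cos ω₀t it approaches (A²/2) cos ω₀τ. The values of A, ω₀, and τ below are arbitrary.

```python
# Sketch: for the deterministic signal s(t) = A cos(w0*t), the time average
# (1/N) sum_t s(t) s(t - tau) converges to R_s(tau) = (A**2 / 2) cos(w0 * tau).
import math

A, w0, tau = 1.5, 0.9, 3
N = 100000
acc = sum(A * math.cos(w0 * t) * A * math.cos(w0 * (t - tau))
          for t in range(1, N + 1)) / N
print(abs(acc - (A ** 2 / 2) * math.cos(w0 * tau)) < 1e-3)
```

The cross term cos(ω₀(2t − τ)) averages out at rate O(1/N), leaving only the lag-dependent part.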
Signal Spectra

Notation:

$$\bar{E}\, f(t) = \lim_{N \to \infty} \frac{1}{N}\sum_{t=1}^{N} E\, f(t)$$

The notation implies that the limit exists. With it, {s(t)} is quasi-stationary if |m_s(t)| ≤ C, |R_s(t, r)| ≤ C, and

$$R_s(\tau) = \bar{E}\, s(t)\,s(t-\tau) \quad \text{exists.}$$

Sometimes, with some abuse of notation, we call R_s(τ) the covariance function of s.

Exercise 2: Show that it is sometimes exactly the covariance function of s.
Signal Spectra

Two signals {s(t)} and {w(t)} are jointly quasi-stationary if:
1- they both are quasi-stationary, and
2- the cross-covariance function

$$R_{sw}(\tau) = \bar{E}\, s(t)\,w(t-\tau)$$

exists. If R_sw(τ) ≡ 0, the signals are said to be uncorrelated.
Signal Spectra

The periodogram defines the frequency content of a signal over a finite time interval, but the limit of U_N(ω) as N → ∞ fails to exist for many signals of practical interest. So we shall develop a framework for describing signals and their spectra that is applicable to deterministic as well as stochastic signals.
Signal Spectra

Use the Fourier transform of the covariance function (the spectrum, or spectral density). We define the (power) spectrum of {s(t)} as

$$\Phi_s(\omega) = \sum_{\tau=-\infty}^{\infty} R_s(\tau)\, e^{-i\tau\omega}$$

and the cross spectrum between {s(t)} and {w(t)} as

$$\Phi_{sw}(\omega) = \sum_{\tau=-\infty}^{\infty} R_{sw}(\tau)\, e^{-i\tau\omega}$$

whenever these limits exist.

Exercise 3: Show that the spectrum is always a real function, but the cross spectrum is in general a complex-valued function.
Signal Spectra

Exercise 4 (spectrum of a periodic signal): Consider a deterministic, periodic signal with period M, i.e., s(t) = s(t + M). Show that

$$\Phi_s(\omega) = \Phi_s^{p}(\omega)\, F(\omega, M)$$

where

$$\Phi_s^{p}(\omega) = \sum_{\tau=0}^{M-1} R_s(\tau)\, e^{-i\tau\omega} \qquad \text{and} \qquad F(\omega, M) = \sum_{l=-\infty}^{\infty} e^{-ilM\omega}$$

And finally show that

$$\Phi_s(\omega) = \frac{2\pi}{M} \sum_{k=0}^{M-1} \Phi_s^{p}(2\pi k/M)\, \delta(\omega - 2\pi k/M), \qquad 0 \le \omega < 2\pi$$
Signal Spectra

Exercise 5 (spectrum of a sinusoid): Consider

$$u(t) = A \cos \omega_0 t$$

on the interval [1, ∞). Show that

$$\Phi_u(\omega) = \frac{A^2}{4}\,\big[\delta(\omega - \omega_0) + \delta(\omega + \omega_0)\big]\cdot 2\pi$$
Signal Spectra

Example (stationary stochastic processes): Consider v(t) = H(q)e(t) as a stationary stochastic process. We will assume that e(t) has zero mean and variance λ. It is clear that

$$R_v(\tau) = \bar{E}\, v(t)v(t-\tau) = E\, v(t)v(t-\tau) = \lambda \sum_{k=\max(0,\tau)}^{\infty} h(k)\,h(k-\tau) \qquad (I)$$

The spectrum is

$$\Phi_v(\omega) = \sum_{\tau=-\infty}^{\infty} R_v(\tau)\, e^{-i\tau\omega} = \lambda \sum_{\tau=-\infty}^{\infty} \sum_{k=\max(0,\tau)}^{\infty} h(k)\,h(k-\tau)\, e^{-i\tau\omega} = \cdots = \lambda \left| H(e^{i\omega}) \right|^2$$

where

$$H(e^{i\omega}) = \sum_{s=0}^{\infty} h(s)\, e^{-is\omega}$$

Exercise 6: Show (I).
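Formula (I) and the resulting spectrum can be cross-checked for a finite (hence trivially stable) impulse response; the filter h below is a hypothetical monic example.

```python
# Check of Phi_v(w) = lambda * |H(e^{iw})|^2 for a finite FIR filter.
# h is a hypothetical monic impulse response; R_v(tau) follows formula (I).
import cmath

lam = 0.5
h = [1.0, 0.8, -0.3, 0.1]   # h(0)..h(3), h(k) = 0 otherwise

def R_v(tau):
    tau = abs(tau)           # R_v is even in tau
    return lam * sum(h[k] * h[k - tau] for k in range(tau, len(h)))

w = 1.2
phi_from_R = sum(R_v(tau) * cmath.exp(-1j * tau * w) for tau in range(-3, 4)).real
H = sum(hk * cmath.exp(-1j * w * k) for k, hk in enumerate(h))
print(abs(phi_from_R - lam * abs(H) ** 2) < 1e-9)
```

For an FIR filter the covariance sum is finite, so the identity holds exactly rather than only in the limit.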
Signal Spectra

Spectrum of stationary stochastic processes: The stochastic process described by v(t) = H(q)e(t), where {e(t)} is a sequence of independent random variables with zero mean values and covariance λ, has the spectrum

$$\Phi_v(\omega) = \lambda \left| H(e^{i\omega}) \right|^2$$
Signal Spectra

Spectrum of a mixed deterministic and stochastic signal: let s(t) = u(t) + v(t), where u(t) is deterministic and v(t) is stationary with zero mean. Then

$$R_s(\tau) = R_u(\tau) + R_v(\tau) \qquad\Rightarrow\qquad \Phi_s(\omega) = \Phi_u(\omega) + \Phi_v(\omega)$$

Exercise 7: Prove it.
Transformation of Spectra by Linear Systems

Theorem: Let {w(t)} be quasi-stationary with spectrum Φ_w(ω), and let G(q) be a stable transfer function. Let s(t) = G(q)w(t). Then {s(t)} is also quasi-stationary and

$$\Phi_s(\omega) = \left| G(e^{i\omega}) \right|^2 \Phi_w(\omega), \qquad \Phi_{sw}(\omega) = G(e^{i\omega})\, \Phi_w(\omega)$$
Disturbances

There are always signals beyond our control that also affect the system. We assume that such effects can be lumped into an additive term v(t) at the output:

[Block diagram: u(t) → system → + v(t) → y(t)]

So

$$y(t) = \sum_{k=1}^{\infty} g(k)\, u(t-k) + v(t)$$

There are many sources and causes for such a disturbance term:
• Measurement noise.
• Uncontrollable inputs (e.g., a person in a room produces about 100 W of heat).
Disturbances

Characterization of disturbances:
• Their values are not known beforehand.
• Making qualified guesses about future values is possible.
• It is natural to employ a probabilistic framework to describe future disturbances.

We put ourselves at time t and would like to know the disturbance at t + k, k ≥ 1, so we use the following approach:

$$v(t) = \sum_{k=0}^{\infty} h(k)\, e(t-k)$$

where e(t) is white noise. This description does not allow a completely general characterization of all possible probabilistic disturbances, but it is versatile enough.
Ali Karimpour Sep 2010
lecture 2
Disturbances
Consider for example, the following PDF for e(t):
Small values of μ are suitable to describe classical disturbance patterns, steps,
pulses, sinuses and ramps.
A realization of v(t) for propose e(t)
Exercise8: Derive above figure for μ=0.1 and μ=0.9 and a suitable h(k).
37
Ali Karimpour Sep 2010
Disturbances

On the other hand, consider the following PDF for e(t):

[Figure: a realization of v(t) for the proposed e(t).]

Often we only specify the second-order properties of the sequence {e(t)}, that is, the mean and variances.

Exercise 9: What is white noise?

Exercise 10: Derive the figure above for δ = 0.1 and δ = 0.9 and a suitable h(k).
Disturbances

We will assume that e(t) has zero mean and variance λ. Now we want to know the characteristics of v(t).

Mean:

$$E\,v(t) = \sum_{k=0}^{\infty} h(k)\, E\,e(t-k) = 0$$

Covariance:

$$E\,v(t)v(t-\tau) = \sum_{k=0}^{\infty}\sum_{s=0}^{\infty} h(k)\,h(s)\,\lambda\,\delta(k - \tau - s) = \lambda \sum_{k=0}^{\infty} h(k)\,h(k-\tau) = R_v(\tau)$$

We know that h(r) = 0 for r < 0 (because of causality).
Disturbances

So the mean and covariance of v(t) are

$$E\,v(t) = 0, \qquad R_v(\tau) = E\,v(t)v(t-\tau) = \lambda \sum_{k=0}^{\infty} h(k)\,h(k-\tau)$$

Since the mean and covariance do not depend on t, the process is said to be stationary.
Ergodicity

Suppose you are concerned with determining the most visited parks in a city.
• One idea is to take a momentary snapshot: see how many people are at this moment in park A, how many are in park B, and so on.
• Another idea is to look at one individual (or a few of them) and follow them for a certain period of time, e.g., a year.

The first may not be representative of a longer period of time, while the second may not be representative of all the people. The idea is that an ensemble is ergodic if the two types of statistics give the same result. Many ensembles, like human populations, are not ergodic.
Ergodicity

Let x(t) be a stochastic process. Most of our computations will depend on a given realization of a quasi-stationary process. Ergodicity will allow us to make statements about repeated experiments from a single realization: for an ergodic process, time averages over one realization converge to the corresponding ensemble averages.

Ali Karimpour Sep 2010
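The park analogy can be mirrored numerically with the random-phase sinusoid, which is ergodic in the mean: the time average along one realization and the ensemble average across realizations both approximate the mean value 0. The sample sizes and frequency below are arbitrary.

```python
# Sketch of ergodicity for the random-phase sinusoid x(t) = cos(w0*t + theta),
# theta uniform on [-pi, pi]: time average (one realization) vs ensemble average.
import math, random

random.seed(0)
w0 = 0.7

# Time average along a single realization:
theta = random.uniform(-math.pi, math.pi)
time_avg = sum(math.cos(w0 * t + theta) for t in range(1, 100001)) / 100000

# Ensemble average at the fixed time t = 5 over many independent realizations:
ens_avg = sum(math.cos(w0 * 5 + random.uniform(-math.pi, math.pi))
              for _ in range(100000)) / 100000

print(abs(time_avg) < 0.01, abs(ens_avg) < 0.02)
```

Both averages are close to the true mean m_X = 0, which is exactly the "two statistics agree" property that defines ergodicity.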