Estimation of states of some recursively defined systems.
Estimator S6
V. L. GIRKO
Department of Probability and Statistics, Michigan State University
East Lansing, Michigan 48824
Consider the problem of estimating the states of discrete-time systems described by the equations
$$
\vec{x}_{i+1} = \vec{x}_i + A_i \vec{x}_i + \vec{\xi}_i; \qquad \vec{y}_i = C_i \vec{x}_i + \vec{\eta}_i; \qquad i = 1, 2, \ldots
\tag{11.1}
$$
where $\vec{x}_i$ are the $n$-dimensional state vectors, $\vec{y}_i$ are $p$-dimensional vectors of observed variables, $A_i$ are square $n \times n$ matrices, $C_i$ are matrices of dimension $p \times n$, and $\vec{\xi}_i$ and $\vec{\eta}_i$ are error vectors of dimensions $n$ and $p$, respectively, which satisfy the inequality
$$
\vec{\xi}_i, \vec{\eta}_i \in G, \qquad G = \left\{ \vec{\xi}_i, \vec{\eta}_i : \sum_{i=1}^{k} \left( \bigl\|\vec{\xi}_i\bigr\|^2 + \|\vec{\eta}_i\|^2 \right) \le 1 \right\}.
$$
In this section we generalize the estimator S3 to such equations. Consider the following problem of estimating the state $\vec{x}_{k+1}$: find matrices $\hat{K}_i$ of dimension $n \times p$ and a vector $\vec{l}$ of dimension $n$ which minimize the expression
$$
\max_{\vec{\xi}_i, \vec{\eta}_i \in G} \Bigl\| \vec{x}_{k+1} - \sum_{i=1}^{k} K_i \vec{y}_i - \vec{l}\, \Bigr\|^2.
\tag{11.2}
$$
Let $L_{n\times p}$ be the set of real matrices of dimension $n \times p$ and let $L_n$ be the set of real vectors $\vec{l}$ of dimension $n$.
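As a concrete illustration of the model, the following sketch simulates system (11.1) with errors drawn from $G$ and forms a linear estimate $\sum_i K_i \vec{y}_i + \vec{l}$ of $\vec{x}_{k+1}$. All dimensions, matrices and the placeholder $K_i$, $\vec{l}$ are illustrative assumptions, not data from the paper; the optimal choice of $K_i$ and $\vec{l}$ is the subject of Theorem 11.1 below.

```python
# A minimal numerical sketch (not from the paper) of system (11.1) and of a
# linear estimate of x_{k+1}.  The dimensions, the matrices A_i, C_i, the error
# sequence and the placeholder K_i, l are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 3, 2, 5                       # state dim, observation dim, horizon

A = [0.1 * rng.standard_normal((n, n)) for _ in range(k)]   # A_1, ..., A_k
C = [rng.standard_normal((p, n)) for _ in range(k)]         # C_1, ..., C_k

# Draw errors and rescale so that sum_i (||xi_i||^2 + ||eta_i||^2) <= 1,
# i.e. the pair of error sequences lies in the set G.
xi = [rng.standard_normal(n) for _ in range(k)]
eta = [rng.standard_normal(p) for _ in range(k)]
total = np.sqrt(sum(v @ v for v in xi) + sum(v @ v for v in eta))
xi = [v / total for v in xi]
eta = [v / total for v in eta]

# Simulate x_{i+1} = x_i + A_i x_i + xi_i and y_i = C_i x_i + eta_i.
x = [rng.standard_normal(n)]            # known initial state x_1
y = []
for i in range(k):
    y.append(C[i] @ x[i] + eta[i])
    x.append(x[i] + A[i] @ x[i] + xi[i])

# A linear estimate of x_{k+1} of the form sum_i K_i y_i + l, with arbitrary
# placeholder K_i and l.
K = [rng.standard_normal((n, p)) for _ in range(k)]
l = rng.standard_normal(n)
x_hat = sum(K[i] @ y[i] for i in range(k)) + l
print("estimation error:", np.linalg.norm(x[k] - x_hat))
```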
Theorem 11.1. Under the assumptions formulated above,
$$
\min_{K_i \in L_{n\times p};\ \vec{l}\in L_n}\ \max_{\vec{\xi}_i, \vec{\eta}_i \in G} \Bigl\| \vec{x}_{k+1} - \sum_{i=1}^{k} K_i \vec{y}_i - \vec{l}\, \Bigr\|^2 = \lambda_1 \left\{ \sum_{i=1}^{k} \left( Z_{i+1} Z_{i+1}^T + K_i^* K_i^{*T} \right) \right\},
\tag{11.3}
$$
where the matrices $Z_i$ satisfy the recursive equations
$$
Z_{i+1}(I + A_i) = Z_i + K_i^* C_i; \quad i = 1, \ldots, k; \qquad Z_{k+1} = I; \qquad \vec{l}^{\,*} = Z_1 \vec{x}_1;
$$
the matrices $K_i^*$, $i = 1, \ldots, k$, satisfy the system of equations S6
$$
\sum_{l=1}^{s} p_l \left[ K_p^{*T} + C_p S_p \right] \vec{\varphi}_l\, \vec{\varphi}_l^{\,T} = 0; \qquad p = 1, \ldots, k,
\tag{11.4}
$$
where
$$
p_l > 0, \quad l = 1, \ldots, s; \qquad \sum_{l=1}^{s} p_l = 1,
$$
$\vec{\varphi}_l$, $l = 1, \ldots, s$, are orthonormal eigenvectors which correspond to the maximal $s$-multiple eigenvalue $\lambda_1$ of the matrix
$$
\sum_{i=1}^{k} \left[ Z_{i+1} Z_{i+1}^T + K_i^* K_i^{*T} \right],
$$
and the matrices $S_p$ satisfy the system of equations
$$
S_{p+1} = S_p + A_p S_p - Z_{p+1}^T; \qquad p = 1, \ldots, k; \qquad S_1 = 0,
\tag{11.5}
$$
one of the solutions of equation (11.4) is
$$
K_p^{*T} = -C_p S_p, \qquad p = 1, \ldots, k.
$$
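The recursions of Theorem 11.1 are coupled: $Z_i$ runs backward from $Z_{k+1} = I$, $S_p$ runs forward from $S_1 = 0$, and $K_p^* = -(C_p S_p)^T$ links them. The sketch below solves this two-point problem by exploiting its linearity in the unknown $Z_1$; this solution strategy, and the reuse of the data from the previous sketch, are assumptions of the sketch rather than part of the theorem.

```python
# A hedged numerical sketch of the estimator of Theorem 11.1, reusing n, p, k,
# A, C, x from the sketch above.  The coupled recursions are rewritten as a
# single forward recursion that is linear in the unknown Z_1, and Z_1 is then
# solved for from the terminal condition Z_{k+1} = I.
import numpy as np

def forward_pass(Z1):
    """Propagate S_1 = 0 and Z_1 = Z1 through the coupled recursions.
    In the returned lists, S[i] and Z[i] are the paper's S_{i+1} and Z_{i+1}."""
    S, Z = [np.zeros((n, n))], [Z1]
    for i in range(k):
        # Z_{p+1}(I + A_p) = Z_p + K_p^* C_p with K_p^* = -(C_p S_p)^T,
        # followed by (11.5): S_{p+1} = (I + A_p) S_p - Z_{p+1}^T
        Znext = (Z[i] - S[i].T @ C[i].T @ C[i]) @ np.linalg.inv(np.eye(n) + A[i])
        S.append((np.eye(n) + A[i]) @ S[i] - Znext.T)
        Z.append(Znext)
    return S, Z

# Z_{k+1} depends linearly on Z_1; assemble that linear map and invert it.
Lmat = np.zeros((n * n, n * n))
for j in range(n * n):
    E = np.zeros((n, n)); E.flat[j] = 1.0
    Lmat[:, j] = forward_pass(E)[1][k].reshape(-1)
Z1 = np.linalg.solve(Lmat, np.eye(n).reshape(-1)).reshape(n, n)

S, Z = forward_pass(Z1)
K = [-(C[i] @ S[i]).T for i in range(k)]    # K_i^{*T} = -C_i S_i
l_star = Z[0] @ x[0]                        # l^* = Z_1 x_1
M = sum(Z[i + 1] @ Z[i + 1].T + K[i] @ K[i].T for i in range(k))
print("minimax value lambda_1:", np.linalg.eigvalsh(M).max())
```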
Proof. It is obvious that
$$
\Bigl\| \vec{x}_{k+1} - \sum_{i=1}^{k} K_i \vec{y}_i - \vec{l}\, \Bigr\|^2 = \Bigl\| \vec{x}_{k+1} - \sum_{i=1}^{k} K_i C_i \vec{x}_i - \sum_{i=1}^{k} K_i \vec{\eta}_i - \vec{l}\, \Bigr\|^2.
$$
Consider the system of recursive equations
$$
Z_{p+1} = Z_p - Z_{p+1} A_p + K_p C_p; \qquad p = 1, \ldots, k,
$$
with the initial condition $Z_{k+1} = I$. Then, using (11.1), after obvious transformations we have
$$
\begin{aligned}
\vec{x}_{k+1} - \sum_{i=1}^{k} K_i C_i \vec{x}_i
&= \vec{x}_{k+1} - \sum_{i=1}^{k} (Z_{i+1} - Z_i)\vec{x}_i - \sum_{i=1}^{k} Z_{i+1} A_i \vec{x}_i \\
&= \vec{x}_{k+1} - \sum_{i=1}^{k} Z_{i+1}\bigl(\vec{x}_{i+1} - A_i\vec{x}_i - \vec{\xi}_i\bigr) - \sum_{i=1}^{k} Z_{i+1} A_i \vec{x}_i + \sum_{i=1}^{k} Z_i \vec{x}_i \\
&= Z_1 \vec{x}_1 + \sum_{i=1}^{k} Z_{i+1} \vec{\xi}_i.
\end{aligned}
$$
Therefore
$$
\Bigl\| \vec{x}_{k+1} - \sum_{i=1}^{k} K_i \vec{y}_i - \vec{l}\, \Bigr\|^2 = \Bigl\| Z_1\vec{x}_1 + \sum_{i=1}^{k} Z_{i+1}\vec{\xi}_i + \sum_{i=1}^{k} K_i\vec{\eta}_i - \vec{l}\, \Bigr\|^2.
$$
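Since the simulated states from the first sketch satisfy (11.1) exactly, the identity just derived can be checked numerically for arbitrary matrices $K_i$; the placeholder matrices `K_arb` below are an illustrative assumption.

```python
# A small numerical check of the identity
#     x_{k+1} - sum_i K_i C_i x_i = Z_1 x_1 + sum_i Z_{i+1} xi_i,
# where Z_{k+1} = I and Z_p = Z_{p+1}(I + A_p) - K_p C_p.  Reuses n, p, k,
# A, C, x, xi from the first sketch; the identity holds for any choice of K_i.
import numpy as np

rng = np.random.default_rng(1)
K_arb = [rng.standard_normal((n, p)) for _ in range(k)]     # arbitrary K_1..K_k

Zc = [None] * (k + 2)                                       # 1-based: Zc[1..k+1]
Zc[k + 1] = np.eye(n)
for i in range(k, 0, -1):
    Zc[i] = Zc[i + 1] @ (np.eye(n) + A[i - 1]) - K_arb[i - 1] @ C[i - 1]

lhs = x[k] - sum(K_arb[i] @ C[i] @ x[i] for i in range(k))
rhs = Zc[1] @ x[0] + sum(Zc[i + 1] @ xi[i - 1] for i in range(1, k + 1))
print("identity gap:", np.linalg.norm(lhs - rhs))           # ~0 up to round-off
```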
Using the proof of Theorem 5.1 we get
$$
\min_{K_i \in L_{n\times p};\ \vec{l}\in L_n}\ \max_{\vec{\xi}_i,\vec{\eta}_i\in G} \Bigl\| \vec{x}_{k+1} - \sum_{i=1}^{k} K_i\vec{y}_i - \vec{l}\, \Bigr\|^2
= \min_{K_i\in L_{n\times p}} \lambda_1\left\{ \sum_{i=1}^{k} \bigl( Z_{i+1}Z_{i+1}^T + K_i K_i^T \bigr) \right\},
\qquad \vec{l}^{\,*} = Z_1\vec{x}_1,
$$
and the unknown matrices $K_i^*$ satisfy the equation
$$
\sum_{q=1}^{s} \vec{\varphi}_q^{\,T} p_q \sum_{i=1}^{k} \left( \tilde{Z}_{i+1} Z_{i+1}^T + \Theta_i K_i^{*T} \right) \vec{\varphi}_q = 0,
\tag{11.6}
$$
where $\vec{\varphi}_q$, $q = 1, \ldots, s$, are orthonormal eigenvectors which correspond to the maximal $s$-multiple eigenvalue $\lambda_1$ of the matrix
$$
\sum_{i=1}^{k} \left[ Z_{i+1}Z_{i+1}^T + K_i^* K_i^{*T} \right],
$$
$\Theta_i$ are arbitrary matrices which have the same dimension as the matrices $K_i$, and the matrices $\tilde{Z}_{i+1}$ satisfy the equations
$$
\tilde{Z}_{i+1} = \tilde{Z}_i - \tilde{Z}_{i+1} A_i + \Theta_i C_i; \qquad \tilde{Z}_{k+1} = 0; \qquad i = 1, \ldots, k.
$$
Obviously
$$
\begin{aligned}
\sum_{i=1}^{k} \tilde{Z}_{i+1} Z_{i+1}^T
&= \sum_{i=1}^{k} \Bigl( \tilde{Z}_{i+1} Z_{i+1}^T + \tilde{Z}_{i+1} S_{i+1} - \tilde{Z}_i S_i \Bigr) \\
&= \sum_{i=1}^{k} \Bigl( \tilde{Z}_{i+1} Z_{i+1}^T + \tilde{Z}_{i+1}\bigl( S_i + A_i S_i - Z_{i+1}^T \bigr) - \tilde{Z}_i S_i \Bigr) \\
&= \sum_{i=1}^{k} \Bigl[ \bigl( \tilde{Z}_i - \tilde{Z}_{i+1} A_i + \Theta_i C_i \bigr) S_i + \tilde{Z}_{i+1} A_i S_i - \tilde{Z}_i S_i \Bigr] \\
&= \sum_{i=1}^{k} \Theta_i C_i S_i.
\end{aligned}
$$
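This telescoping identity can also be verified numerically: for arbitrary matrices $Z_{i+1}$ and $\Theta_i$ (illustrative placeholders below), with $S_i$ built from (11.5) and $\tilde{Z}_i$ from its recursion, the two sums agree.

```python
# A numerical check of the telescoping identity
#     sum_i Ztilde_{i+1} Z_{i+1}^T = sum_i Theta_i C_i S_i,
# where S satisfies (11.5) with S_1 = 0 and Ztilde satisfies its recursion
# with Ztilde_{k+1} = 0.  Reuses n, p, k, A, C from the first sketch; the
# matrices Zr (~ Z_i) and Th (~ Theta_i) are arbitrary placeholders.
import numpy as np

rng = np.random.default_rng(2)
Zr = [None] + [rng.standard_normal((n, n)) for _ in range(k + 1)]  # Zr[1..k+1]
Th = [None] + [rng.standard_normal((n, p)) for _ in range(k)]      # Th[1..k]

Saux = [None] + [np.zeros((n, n)) for _ in range(k + 1)]           # S_1 = 0
for i in range(1, k + 1):                                          # (11.5)
    Saux[i + 1] = Saux[i] + A[i - 1] @ Saux[i] - Zr[i + 1].T

Zt = [None] * (k + 2)                                              # Ztilde_i
Zt[k + 1] = np.zeros((n, n))
for i in range(k, 0, -1):                                          # Ztilde recursion
    Zt[i] = Zt[i + 1] @ (np.eye(n) + A[i - 1]) - Th[i] @ C[i - 1]

lhs = sum(Zt[i + 1] @ Zr[i + 1].T for i in range(1, k + 1))
rhs = sum(Th[i] @ C[i - 1] @ Saux[i] for i in range(1, k + 1))
print("telescoping gap:", np.linalg.norm(lhs - rhs))               # ~0
```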
Using this equality and the auxiliary system of equations (11.5), we obtain that (11.6) reduces to
$$
\sum_{q=1}^{s} \vec{\varphi}_q^{\,T} p_q \sum_{i=1}^{k} \Theta_i \bigl( K_i^{*T} + C_i S_i \bigr) \vec{\varphi}_q = 0.
$$
From this equation we obtain all assertions of Theorem 11.1.
Consider the case when $\vec{\xi}_i$ and $\vec{\eta}_i$ are error vectors of dimensions $m$ and $n$, respectively, which satisfy the inequality
$$
\bigl(\vec{\xi}, \vec{\eta}\bigr) \in G, \qquad G = \left\{ \bigl(\vec{\xi}, \vec{\eta}\bigr) : \sum_{i=1}^{k} \left( \bigl\|\vec{\xi}_i\bigr\|^2 + \|\vec{\eta}_i\|^2 \right) \le 1 \right\}
$$
for some fixed $k$ (where we have put $\vec{\xi} = (\xi_1, \cdots, \xi_k)^T$, and similarly $\vec{\eta}$). Consider the following problem of estimating the state $\vec{x}_{k+1}$: find matrices $\hat{K}_i$ of dimension $m \times n$ and a vector $\vec{l}$ of dimension $m$ which minimize the expression
$$
\varphi\bigl( K_1, \cdots, K_k; \vec{l}\, \bigr) = \max_{(\vec{\xi},\vec{\eta})\in G} \Bigl\| \vec{x}_{k+1} - \sum_{i=1}^{k} K_i \vec{y}_i - \vec{l}\, \Bigr\|^2.
$$
The vector
$$
\hat{\vec{x}}_{k+1} = \sum_{i=1}^{k} \hat{K}_i \vec{y}_i + \hat{\vec{l}}
$$
is called the linear minimax estimator of $\vec{x}_{k+1}$.
Repeating the proof of Theorem 11.1, we get
Theorem 11.2. Under the assumptions formulated above,
$$
\min_{K_i\in R^{m\times n};\ \vec{l}\in R^m}\ \max_{(\vec{\xi},\vec{\eta})\in G} \Bigl\| \vec{x}_{k+1} - \sum_{i=1}^{k} K_i\vec{y}_i - \vec{l}\, \Bigr\|^2
= \lambda_{\max}\left\{ \sum_{i=1}^{k} \bigl( Z_{i+1}Z_{i+1}^T + \hat{K}_i \hat{K}_i^T \bigr) \right\},
$$
where the matrices $Z_i$ satisfy the recursive equations
$$
Z_{i+1}(I + A_i) = Z_i + \hat{K}_i C_i; \qquad i = 1, \ldots, k; \qquad Z_{k+1} = I.
$$
Moreover, $\hat{\vec{l}} = Z_1 \vec{x}_1$ and the matrices $\hat{K}_i$, $i = 1, \ldots, k$, satisfy the system of equations
$$
\sum_{l=1}^{s} p_l \bigl[ \hat{K}_i^T + C_i S_i \bigr] \vec{\varphi}_l\, \vec{\varphi}_l^{\,T} = 0; \qquad i = 1, \ldots, k,
\tag{11.7}
$$
where
$$
p_l > 0, \quad l = 1, \ldots, s; \qquad \sum_{l=1}^{s} p_l = 1,
$$
$\vec{\varphi}_l$, $l = 1, \ldots, s$, are orthonormal eigenvectors which correspond to the maximal $s$-multiple eigenvalue $\lambda_{\max}$ of the matrix
$$
\sum_{i=1}^{k} \bigl[ Z_{i+1}Z_{i+1}^T + \hat{K}_i \hat{K}_i^T \bigr],
$$
and the matrices $S_i$ satisfy the system of equations
$$
S_{i+1} = S_i + A_i S_i - Z_{i+1}^T; \qquad i = 1, \ldots, k; \qquad S_1 = 0,
$$
one of the solutions of equation (11.7) is
$$
\hat{K}_i^T = -C_i S_i, \qquad i = 1, \ldots, k.
$$
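As a rough numerical illustration of the minimax bound (under the assumption that the sketch after Theorem 11.1 produced matrices satisfying its recursions, and reusing $n$, $p$, $k$, $A$, $C$, $x$ from the first sketch), the worst squared estimation error over sampled error sequences in $G$ should stay below the $\lambda_{\max}$ value.

```python
# A Monte Carlo sanity check (illustrative) using K, Z, l_star from the sketch
# after Theorem 11.1: for error sequences in G, the squared estimation error
# should not exceed lambda_max of sum_i (Z_{i+1} Z_{i+1}^T + K_i K_i^T).
import numpy as np

rng = np.random.default_rng(3)
lam = np.linalg.eigvalsh(
    sum(Z[i + 1] @ Z[i + 1].T + K[i] @ K[i].T for i in range(k))).max()

worst = 0.0
for _ in range(2000):
    # draw (xi, eta) on the boundary of G (total squared norm equal to 1)
    xs = [rng.standard_normal(n) for _ in range(k)]
    es = [rng.standard_normal(p) for _ in range(k)]
    nrm = np.sqrt(sum(v @ v for v in xs) + sum(v @ v for v in es))
    xs = [v / nrm for v in xs]
    es = [v / nrm for v in es]
    # simulate (11.1) from the known x_1 and form the linear estimate
    xt, yt = [x[0]], []
    for i in range(k):
        yt.append(C[i] @ xt[i] + es[i])
        xt.append(xt[i] + A[i] @ xt[i] + xs[i])
    est = sum(K[i] @ yt[i] for i in range(k)) + l_star
    worst = max(worst, float(np.linalg.norm(xt[k] - est) ** 2))

print("sampled worst case:", worst, "<= lambda_max:", lam)
```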