Science in the 20th century

Materials and devices
• Relativity
• Quantum mechanics
• Chemical bond
• Molecular basis of life

Systems
• Robustness (Bode, Zames, …)
• Computational complexity (Turing, Gödel, …)
• Information (Shannon, Kolmogorov)
• Chaos and dynamical systems (Poincaré, Lorenz, …)
• Optimal control (Pontryagin, Bellman, …)
Current dominant challenges

Materials and devices
• Unified field theory
• Dynamics of chemical reactions
• Dynamics of biological macromolecules

Systems
• Robustness of complex interconnected dynamical systems and networks
• “Unified field theory” of control, communications, computing
Current dominant challenges
• Robustness of complex interconnected dynamical systems and networks
Role of control theory
• Robustness
• Interconnection
• Rigor
We need an expanded view of all of these.
Robust
Humans have exceptionally robust systems for vision and speech.
Yet fragile
…but we’re not so good at surviving, say, large meteor impacts.
Robustness and uncertainty

[Figure, built up over several slides: error/sensitivity plotted against types of uncertainty. Humans are robust for speech/vision yet fragile to, e.g., meteor impact; archaebacteria are robust where humans are fragile. “Complex” systems generally are robust, yet fragile.]
“Robust, yet fragile”
• Robust to uncertainties
  – that are common,
  – the system was designed for, or
  – has evolved to handle,
• …yet fragile otherwise
• This is the most important feature of complex systems (the essence of HOT).
Example: Auto airbags
• Reduces risk in high-speed collisions
• Increases risk otherwise
• Increases risk to small occupants
• Mitigated by new designs with greater complexity
• Could just get a heavier vehicle
  – Reduces risk without the increase!
  – But shifts it elsewhere: occupants of other vehicles, pollution
Biology (and engineering)
• Grow, persist, reproduce, and function despite large uncertainties in environments and components.
• Yet tiny perturbations can be fatal
  – a single species or gene
  – minute quantities of toxins
• Complex, highly evolved organisms and ecosystems have high throughput,
• but are the most vulnerable in large extinctions.
• Complex engineering systems have similar characteristics.
[Figure: automobile air bags — error/sensitivity vs. types of uncertainty. Air bags lower sensitivity for high-speed head-on collisions but raise it for children and low-speed collisions.]
[Diagram: Robustness/Uncertainty and Information/Computation, labeled “constrained”.]
Is robustness a “conserved quantity”?
[Diagram: for materials, the familiar conserved quantities are energy and entropy; the systems analogue involves uncertainty and robustness.]

[Diagram: Complexity — Interconnection/Feedback, Dynamics, Hierarchical/Multiscale, Heterogeneous, Nonlinearity — organized around Uncertainty and Robustness.]
Prediction: the most basic scientific question.

[Block diagram: an uncertain sequence x(k) enters a predictor C; its output u(k) passes through a one-step delay, and the delayed prediction u(k-1) is subtracted from x(k) to form the error e(k).]

x(k): uncertain sequence
u(k): prediction of x(k+1)
e(k): error
e(k) = x(k) - u(k-1)

[Plots: sample traces of x(k), u(k), and e(k) versus k.]
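As a concrete (and purely illustrative) companion to this block diagram, here is a minimal Python sketch of the loop e(k) = x(k) - u(k-1); the particular predictor u(k) = x(k) (“tomorrow looks like today”) and the random-walk input are assumptions, not anything specified in the slides.

import numpy as np

# Minimal sketch of the prediction loop e(k) = x(k) - u(k-1).
# The predictor used here, u(k) = x(k), is only an illustrative assumption.
rng = np.random.default_rng(0)
N = 64
x = np.cumsum(rng.standard_normal(N))    # an uncertain (random-walk) sequence

u = np.zeros(N)
e = np.zeros(N)
for k in range(N):
    u_prev = u[k - 1] if k > 0 else 0.0  # delayed prediction u(k-1)
    e[k] = x[k] - u_prev                 # prediction error
    u[k] = x[k]                          # causal predictor: uses only x(0..k)

print("||e||_2 / ||x||_2 =", np.linalg.norm(e) / np.linalg.norm(x))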
Prediction is a special case of feedforward.
For a known stable plant, these are the same:

[Block diagrams: a feedforward configuration and a feedback configuration, each with Plant and Control blocks, input x, control signal c, and error e.]
For simplicity, assume x, u, and e are finite sequences, with transform

    X(z) = Σ_{k=0}^{N} x(k) z^{-k}.

Then the discrete Fourier transforms X, U, and E are polynomials in the transform variable z^{-1}.
If we set z = e^{iω}, ω ∈ [0, π], then |X(ω)| measures the frequency content of x at frequency ω.
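A small sketch (not from the slides) of evaluating this transform on a frequency grid; the helper name finite_transform and the sample sequence are illustrative assumptions.

import numpy as np

def finite_transform(x, omegas):
    # X(omega) = sum_k x(k) * exp(-i*k*omega): the polynomial in z^{-1}
    # above, evaluated at z = exp(i*omega).
    k = np.arange(len(x))
    return np.array([np.sum(x * np.exp(-1j * k * w)) for w in omegas])

omegas = np.linspace(0.0, np.pi, 512)
x = np.array([1.0, -0.5, 0.25, 0.0, 0.1])
X = finite_transform(x, omegas)
print(np.abs(X)[:3])    # |X(omega)|: frequency content of x at low frequencies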
[Block diagram repeated: x(k) → predictor C → u(k) → delay → u(k-1); e(k) = x(k) - u(k-1).]

How do we measure the performance of our predictor C in terms of x, e, X, and E?
Typically we want ratios of norms,

    ||e||_t / ||x||_t   or   ||E||_f / ||X||_f ,

to be small, where ||·||_t and ||·||_f are suitable time-domain and frequency-domain (semi-)norms, usually weighted.

Good performance (prediction) means

    ||e||_t / ||x||_t ≪ 1   or   ||E||_f / ||X||_f ≪ 1,

or, equivalently, ||e||_t ≪ ||x||_t.
For example, take the 2-norms

    ||x||_2^2 = Σ_{k≥0} |x(k)|^2   and   ||X||_2^2 = (1/π) ∫_0^π |X(ω)|^2 dω.

Plancherel Theorem:  ||x||_2 = ||X||_2.
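A quick numerical check of this identity, as a sketch; it assumes a real-valued sequence so that integrating over [0, π] suffices.

import numpy as np

# Check: sum_k |x(k)|^2  ==  (1/pi) * integral_0^pi |X(omega)|^2 d omega
rng = np.random.default_rng(1)
x = rng.standard_normal(8)

omegas = np.linspace(0.0, np.pi, 20001)
k = np.arange(len(x))
X = np.array([np.sum(x * np.exp(-1j * k * w)) for w in omegas])

lhs = np.sum(np.abs(x) ** 2)
rhs = np.trapz(np.abs(X) ** 2, omegas) / np.pi
print(lhs, rhs)    # the two agree to integration accuracy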
Interesting alternative:

    ||X||_b = ∫_0^π log|X(ω)| dω.

Or, to make it closer to existing norms:

    ||X||_b = exp( ∫_0^π log|X(ω)| dω ).

Not a norm, but a very useful measure of signal size, as we’ll see. (The b in ||·||_b is in honor of Bode.)
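A sketch of this measure in code (the function name bode_measure is an assumption; the grid-based integral is only approximate):

import numpy as np

def bode_measure(x, n_grid=20001):
    # ||X||_b = exp( integral_0^pi log|X(omega)| d omega ), as defined above.
    omegas = np.linspace(0.0, np.pi, n_grid)
    k = np.arange(len(x))
    X = np.array([np.sum(x * np.exp(-1j * k * w)) for w in omegas])
    return np.exp(np.trapz(np.log(np.abs(X)), omegas))

print(bode_measure(np.array([1.0, 0.5, 0.25])))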
A useful measure of performance is in terms of the sensitivity function S(z), defined by Bode as

    S(z) = E(z)/X(z),   where   E(z) = X(z) - z^{-1} U(z),

so that

    S(z) = 1 - z^{-1} U(z)/X(z).

If we set z = e^{iω}, ω ∈ [0, π], then |S(ω)| measures how well C does at each frequency. (If C is linear then S is independent of x, but in general S depends on x.)
It is convenient to study log|S(ω)|. Then:

    u ≡ 0 (u(k) = 0 for all k)  ⇒  S ≡ 1 and log|S| ≡ 0
    log|S(ω)| < 0  ⇒  C attenuates x at frequency ω
    log|S(ω)| > 0  ⇒  C amplifies x at frequency ω
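A sketch that computes log|S(ω)| for one sequence and one simple causal predictor; the choice u(k) = x(k) is an illustrative assumption, for which low frequencies come out attenuated and high frequencies amplified.

import numpy as np

# Sketch: log|S(omega)| = log|E(omega)/X(omega)| for one sequence x and the
# illustrative causal predictor u(k) = x(k).
rng = np.random.default_rng(2)
x = np.cumsum(rng.standard_normal(32))            # a slowly varying sequence
u = x.copy()
e = np.append(x, 0.0) - np.append(0.0, u)         # e(k) = x(k) - u(k-1)

omegas = np.linspace(1e-3, np.pi, 2000)
tf = lambda s: np.array([np.sum(s * np.exp(-1j * np.arange(len(s)) * w)) for w in omegas])
logS = np.log(np.abs(tf(e) / tf(x)))

# Low frequencies are attenuated (log|S| < 0), high frequencies amplified.
print(logS[0], logS[-1])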
Assume u is a causal function of x.

Note: as long as we assume that for any possible sequence {x(k)} it is equally likely that {-x(k)} will occur, guessing ahead can never help.

[Plot: x(k) and the delayed prediction u(k-1) versus k.]

Then the first nonzero element of u is delayed at least one step behind the first nonzero element of x.
This implies (taking z → ∞) that

    S(∞) = 1 - [z^{-1} U(z)/X(z)]_{z→∞} = 1.

This will be used later.
[Block diagram repeated: x(k) → predictor C → u(k) → delay → u(k-1); e(k) = x(k) - u(k-1).]
For any C, an unconstrained “worst-case” x(k) is x(k) = -u(k-1), which gives

    e(k) = x(k) - u(k-1) = -2u(k-1) = 2x(k).

Thus, if nothing is known about x(k), the “safest” choice is u ≡ 0. Any other choice of u does worse in any norm.
If x is white noise, then u ≡ 0 is also the best choice for optimizing average behavior in almost any norm.
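A small check of the white-noise claim, as a sketch; the one-parameter family of predictors u(k) = a·x(k) is a hypothetical illustration.

import numpy as np

# For white noise x, the choice u = 0 (a = 0) minimizes the average error norm
# among predictors of the form u(k) = a*x(k), where e(k) = x(k) - u(k-1).
rng = np.random.default_rng(3)
trials, N = 500, 256
for a in [0.0, 0.3, 0.7, 1.0]:
    errs = []
    for _ in range(trials):
        x = rng.standard_normal(N)
        e = x - a * np.append(0.0, x[:-1])
        errs.append(np.linalg.norm(e))
    print(f"a = {a:.1f}   mean ||e||_2 = {np.mean(errs):.2f}")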
Summary so far:
• Some assumptions about x must be valid for x to be at all predictable.
• Intuitively, there appear to be fundamental limitations on how well x can be predicted.
• Can we give a precise mathematical description of these limitations that depends only on causality and requires no further assumptions?
[Block diagram repeated: x(k) → predictor C → u(k) → delay → u(k-1); e(k) = x(k) - u(k-1).]
• Recall that S(z) = E(z)/X(z) and S(∞) = 1.
• Denote by {γ_k} and {β_k} the complex zeros in |z| > 1 of E(z) and X(z), respectively. Then

    ∫_0^π log|S(ω)| dω = π ( Σ_k log|γ_k| - Σ_k log|β_k| ).

Proof: follows directly from Jensen’s formula, a standard result in complex analysis (advanced undergraduate level).

If x is chosen so that X(z) has no zeros in |z| > 1 (this is an open set), then

    ∫_0^π log|S(ω)| dω = π Σ_k log|γ_k| ≥ 0.
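A numerical sanity check of the formula as reconstructed above; this is a sketch that assumes the one-sided transform convention X(z) = Σ_k x(k) z^{-k}, and the coefficient sequences are arbitrary illustrative choices.

import numpy as np

# Check: integral_0^pi log|S(omega)| d omega
#        == pi * ( sum_k log|gamma_k| - sum_k log|beta_k| ),
# where gamma_k, beta_k are the zeros of E(z), X(z) with |z| > 1.
rng = np.random.default_rng(4)
x = rng.standard_normal(6)                 # x(0..5), with x(0) != 0
u = rng.standard_normal(5)                 # an arbitrary causal prediction u(0..4)
e = x - np.append(0.0, u)                  # e(k) = x(k) - u(k-1), so e(0) = x(0)

def log_zero_sum(c):
    # Zeros of sum_k c(k) z^{-k} are the roots of c(0) z^n + ... + c(n).
    roots = np.roots(c)
    return np.sum(np.log(np.abs(roots[np.abs(roots) > 1.0])))

omegas = np.linspace(0.0, np.pi, 40001)
k = np.arange(len(x))
tf = lambda s: np.array([np.sum(s * np.exp(-1j * k * w)) for w in omegas])
lhs = np.trapz(np.log(np.abs(tf(e) / tf(x))), omegas)
rhs = np.pi * (log_zero_sum(e) - log_zero_sum(x))
print(lhs, rhs)                            # agree up to integration error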
• Recall that S(z) = E(z)/X(z) and S(∞) = 1.
• Denote by {γ_k} and {β_k} the complex zeros in |z| > 1 of E(z) and X(z), respectively:

    ∫_0^π log|S(ω)| dω = π ( Σ_k log|γ_k| - Σ_k log|β_k| ).

If the predictor is linear and time-invariant, then

    ∫_0^π log|S(ω)| dω ≥ 0.

Under some circumstances, a time-varying predictor can exploit signal “precursors” that create known {β_k}, making

    ∫_0^π log|S(ω)| dω < 0

achievable.
[Plot: log|S(ω)| versus ω. Where log|S| > 0 the signal is amplified (“…yet fragile”); where log|S| < 0 it is attenuated (“Robust”).]

The amplification must at least balance the attenuation.
    ∫_0^π log|S(ω)| dω ≥ 0

• Originally due to Bode (1945).
• Well known in control theory as a property of linear systems.
• But it is a property of causality, not linearity.
• Many generalizations in the control literature, particularly in the last decade or so.
• Because it only depends on causality, it is in some sense the most fundamental known conservation principle.
• This “conservation of robustness” and related concepts are as important to complex systems as more familiar notions of matter, energy, entropy, and information.
Recall:

    ||X||_b = exp( ∫_0^π log|X(ω)| dω ).

Then

    ∫_0^π log|S(ω)| dω ≥ 0

is equivalent to

    ||E||_b ≥ ||X||_b.
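A sketch illustrating the conservation: for the simple LTI predictor u(k) = x(k) (an illustrative assumption), S(z) = 1 - z^{-1}, and ||E||_b comes out equal to ||X||_b no matter what x is — the low-frequency attenuation is exactly repaid at high frequencies.

import numpy as np

# For u(k) = x(k): ||E||_b == ||X||_b for any input sequence x.
def bode_measure(s, n_grid=40001):
    omegas = np.linspace(1e-4, np.pi, n_grid)   # start just above 0 to avoid log(0)
    k = np.arange(len(s))
    S = np.array([np.sum(s * np.exp(-1j * k * w)) for w in omegas])
    return np.exp(np.trapz(np.log(np.abs(S)), omegas))

rng = np.random.default_rng(5)
x = np.cumsum(rng.standard_normal(16))
e = np.append(x, 0.0) - np.append(0.0, x)       # e(k) = x(k) - x(k-1)
print(bode_measure(e), bode_measure(x))         # equal to integration accuracy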
[Diagram repeated: Complexity — Interconnection/Feedback, Dynamics, Hierarchical/Multiscale, Heterogeneous, Nonlinearity — organized around Uncertainty and Robustness.]
What about feedback?

[Block diagrams repeated: the feedforward and feedback configurations, each with Plant and Control blocks, input x, control signal c, and error e.]
Simple case of feedback.

[Block diagram: disturbance d and control c sum to give e, which is fed back through F: c = F(e).]

    e = d + c = d + F(e)
    (1 - F) e = d

If e, d, c, and F are just numbers:

    S = e/d = 1/(1 - F)

e = error, d = disturbance, c = control, S = sensitivity function (measures the disturbance rejection).
It’s convenient to study ln(S), with S = e/d = 1/(1 - F):

    Positive feedback (F > 0)  ⇒  ln(S) > 0  ⇒  disturbance amplified
    Negative feedback (F < 0)  ⇒  ln(S) < 0  ⇒  disturbance attenuated

[Plot: ln(S) versus F, showing amplification for F > 0 and attenuation for F < 0.]
[Plot: ln(S) versus F for S = 1/(1 - F). As F → 1, ln(S) → +∞: extreme sensitivity. As F → -∞, ln(S) → -∞: extreme robustness.]
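A tiny numeric sketch of the static case S = 1/(1 - F):

import numpy as np

# S = 1/(1 - F): ln(S) > 0 for 0 < F < 1 (amplification, extreme sensitivity as
# F -> 1), ln(S) < 0 for F < 0 (attenuation, extreme robustness as F -> -inf).
for F in [-100.0, -10.0, -1.0, 0.5, 0.9, 0.99]:
    S = 1.0 / (1.0 - F)
    print(f"F = {F:8.2f}   ln(S) = {np.log(S):8.3f}")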
[Diagram repeated: Complexity — Interconnection/Feedback, Dynamics, Hierarchical/Multiscale, Heterogeneous, Nonlinearity — organized around Uncertainty and Robustness.]
If these model physical processes, then d and e are signals and F is an operator. We can still define

    S(ω) = |E(ω)/D(ω)|,

where E and D are the Fourier transforms of e and d. (If F is linear, then S is independent of D.)

[Block diagram repeated: d and c sum to give e, which is fed back through F.]
Under assumptions that are consistent with F and d modeling physical systems (in particular, causality), it is possible to prove that

    ∫ log|S(ω)| dω ≥ 0.

[Plot: log|S(ω)| versus ω; regions with log|S| > 0 are amplified, regions with log|S| < 0 are attenuated.]

The amplification must at least balance the attenuation.
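A discrete sketch of this claim (an illustration under stated assumptions, not the slides’ proof): take the loop e(k) = d(k) + a·e(k-1), i.e. a strictly causal F consisting of a gain a and a one-step delay, with |a| < 1 so the loop is stable; the integral of log|S| then comes out approximately zero.

import numpy as np

# e(k) = d(k) + a*e(k-1): strictly causal feedback with gain a, |a| < 1 assumed.
# With negative feedback (a < 0), disturbances are attenuated at low frequencies
# and amplified at high ones; the integral of log|S(omega)| is about zero.
a = -0.8
N, H = 32, 400                                    # disturbance length, sim horizon
rng = np.random.default_rng(6)
d = np.zeros(H)
d[:N] = rng.standard_normal(N)

e = np.zeros(H)
for k in range(H):
    e[k] = d[k] + a * (e[k - 1] if k > 0 else 0.0)

omegas = np.linspace(1e-3, np.pi, 8001)
tf = lambda s: np.array([np.sum(s * np.exp(-1j * np.arange(len(s)) * w)) for w in omegas])
logS = np.log(np.abs(tf(e) / tf(d)))
print("integral of log|S| d omega ≈", np.trapz(logS, omegas))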
    ∫ log|S(ω)| dω ≥ 0

Positive and negative feedback are balanced.

[Plot: ln|S| versus F, with the amplified region marked “…yet fragile” and the attenuated region marked “Robust”.]
Feedback is very powerful, but there are limitations.
It gives us remarkable robustness, as well as recursion and looping.
But it can also lead to instability, chaos, and undecidability.
Formula 1: the ultimate high-technology sport

[Figure: a production car annotated with its control systems — air bags, temperature control, EGR control, active suspension, electronic fuel injection, electronic ignition, electric power steering (PAS), anti-lock brakes, electronic transmission, cruise control; in development: drive-by-wire, steering/traction control, collision avoidance.]
Formula 1 allows:
• Electronic fuel injection
• Computers
• Sensors
• Telemetry/Communications
• Power steering
No active control allowed.

[Diagram: driver, sensors, actuators, computers, and telemetry on the Formula 1 car.]
[Diagram: “Theory of Complex systems?” at the intersection of Control Theory, Computational Complexity, Information Theory, Dynamical Systems, and Statistical Physics.]
[Block diagram repeated: d and c sum to give e, which is fed back through F.]

This is a natural departure point for the introduction of chaos and undecidability.
[Block diagram repeated: an uncertain sequence enters the predictor; the delayed prediction is subtracted to give the error e.]
• Kolmogorov complexity
• Undecidability
• Chaos
• Probability, entropy
• Information
• Bifurcations, phase transitions