L11 Optimal Design: Lagrange Multipliers

• Homework
• Review
• Convex sets, functions
• Convex programming problem
• Summary
Constrained Optimization
Lagrange Multiplier Method

Remember:
1. Standard form
2. Max problems: minimize f(x) = -F(x)

MINIMIZE:
    f(x)
Subject to:
    h_i(x) = 0,                  i = 1 ... p
    g_j(x) <= 0,                 j = 1 ... m
    x_k^(L) <= x_k <= x_k^(U),   k = 1 ... n
KKT Necessary Conditions for a Minimum

L(x, v, u, s) = f(x) + Σ v_i h_i(x) + Σ u_j ( g_j(x) + s_j^2 ),   i = 1..p, j = 1..m

∂f/∂x_k + Σ v_i ∂h_i/∂x_k + Σ u_j ∂g_j/∂x_k = 0,   for k = 1 to n
h_i(x*) = 0,              for i = 1 to p
g_j(x*) + s_j^2 = 0,      for j = 1 to m
s_j^2 >= 0,               for j = 1 to m
u_j* s_j = 0,             for j = 1 to m
u_j* >= 0,                for j = 1 to m

Regularity check: the gradients of the active inequality constraints are linearly independent.
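As a sketch, the conditions above can be checked numerically for a candidate point. The helper below is my own illustration (not from the slides); the example data are the problem worked on the next slides, Prob 4.120.

```python
import numpy as np

def kkt_residuals(grad_f, grad_h, grad_g, h, g, x, v, u):
    """Stationarity residual plus feasibility / complementarity / sign checks
    for: min f(x) s.t. h_i(x) = 0, g_j(x) <= 0 (slack form s_j^2 = -g_j)."""
    r = grad_f(x).astype(float)
    for vi, gh in zip(v, grad_h):           # + sum v_i * grad h_i
        r += vi * gh(x)
    for uj, gg in zip(u, grad_g):           # + sum u_j * grad g_j
        r += uj * gg(x)
    feasible = all(abs(hi(x)) < 1e-9 for hi in h) and all(gj(x) <= 1e-9 for gj in g)
    complementary = all(abs(uj * gj(x)) < 1e-9 for uj, gj in zip(u, g))
    nonneg = all(uj >= 0 for uj in u)
    return r, feasible and complementary and nonneg

# Prob 4.120: min (x1-1)^2 + (x2-1)^2, g1 = x1+x2-4 <= 0, g2 = -x1+2 <= 0
grad_f = lambda x: np.array([2*(x[0] - 1), 2*(x[1] - 1)])
g  = [lambda x: x[0] + x[1] - 4, lambda x: -x[0] + 2]
gg = [lambda x: np.array([1.0, 1.0]), lambda x: np.array([-1.0, 0.0])]

r, ok = kkt_residuals(grad_f, [], gg, [], g, np.array([2.0, 1.0]), [], [0.0, 2.0])
print(r, ok)   # stationarity residual is [0, 0] and all checks pass
```

This automates only the algebraic conditions; the regularity check (linear independence of active-constraint gradients) still has to be done separately.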
Prob 4.120

Min f(x1, x2) = (x1 - 1)^2 + (x2 - 1)^2
subject to  g1 = x1 + x2 - 4 <= 0
            g2 = -x1 + 2 <= 0

L = f(x1, x2) + u1 (g1 + s1^2) + u2 (g2 + s2^2)
L = (x1 - 1)^2 + (x2 - 1)^2
    + u1 [x1 + x2 - 4 + s1^2] + u2 [-1x1 + 0x2 + 2 + s2^2]
KKT Necessary Conditions

∂L/∂x1 = 2(x1 - 1) + u1 - u2 = 0
∂L/∂x2 = 2(x2 - 1) + u1 + 0 = 0
g1 = 1x1 + 1x2 - 4 + s1^2 = 0
g2 = -1x1 + 0x2 + 2 + s2^2 = 0
s1^2 >= 0,  s2^2 >= 0
u1 s1 = 0,  u2 s2 = 0
u1 >= 0,  u2 >= 0
Regular point?
Case 1

The switching conditions u_j s_j = 0 give four cases (set one of u1 or s1,
and one of u2 or s2, to zero):

            u2 = 0     s2 = 0
  u1 = 0    u1, u2     u1, s2
  s1 = 0    s1, u2     s1, s2

Case 1: u1 = 0, u2 = 0
Case 2: u1 = 0, s2 = 0
Case 3: s1 = 0, u2 = 0
Case 4: s1 = 0, s2 = 0

Case 1: u1 = 0, u2 = 0
2(x1 - 1) + 0 - 0 = 0
2(x2 - 1) + 0 + 0 = 0
x1 = 1, x2 = 1
g1 = 1x1 + 1x2 - 4 + s1^2 = 0
g2 = -1x1 + 0x2 + 2 + s2^2 = 0
1(1) + 1(1) - 4 + s1^2 = 0 ... s1^2 = 2 >= 0, OK
-1(1) + 0(1) + 2 + s2^2 = 0 ... s2^2 = -1 < 0, BAD!
Case 2

Case 2: u1 = 0, s2 = 0
2(x1 - 1) + 0 - u2 = 0
2(x2 - 1) + 0 + 0 = 0  →  x2 = 1
g2 active: -1x1 + 0x2 + 2 = 0  →  x1 = 2
2(2 - 1) - u2 = 0  →  u2 = 2
Check:
1(2) + 1(1) - 4 + s1^2 = 0  →  s1^2 = 1 >= 0
s2^2 = 0, u1 = 0 >= 0, u2 = 2 >= 0
KKT point: x* = (2, 1), u1 = 0, u2 = 2, f* = 1
Regularity:
1. the point is feasible
2. only one active constraint, so its gradient is trivially linearly independent
Point is a KKT point, OK!
Case 3

Case 3: s1 = 0, u2 = 0
2(x1 - 1) + u1 + 0 = 0
2(x2 - 1) + u1 + 0 = 0
With g1 active, this gives the linear system:
2x1 + 0x2 + 1u1 = 2
0x1 + 2x2 + 1u1 = 2
1x1 + 1x2 + 0u1 = 4
Solving: x1 = x2 = 2, 2u1 = -4
u1 = -2 < 0, BAD!
Case 4

Case 4: s1 = 0, s2 = 0
2(x1 - 1) + u1 - u2 = 0
2(x2 - 1) + u1 + 0 = 0
1x1 + 1x2 - 4 + 0 = 0
-1x1 + 0x2 + 2 + 0 = 0
x1 = 2, x2 = 2
u1 = -2 < 0, u2 = 0, BAD!
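The four cases above can also be enumerated programmatically: for each choice of active constraints, solve the stationarity equations together with the active-constraint equations, then test feasibility and multiplier signs. A minimal sketch for Prob 4.120 (the function and variable names are my own):

```python
import numpy as np
from itertools import product

# Prob 4.120: min (x1-1)^2 + (x2-1)^2, g1 = x1+x2-4 <= 0, g2 = -x1+2 <= 0
G = np.array([[1.0, 1.0], [-1.0, 0.0]])   # rows: grad g1, grad g2
c = np.array([4.0, -2.0])                 # g_j active  <=>  G[j] @ x = c[j]

def solve_case(active):
    """Stationarity 2(x - [1,1]) + sum_j u_j grad g_j = 0, with u_j = 0
    for inactive constraints; returns (x, multipliers, passes_KKT)."""
    idx = [j for j in range(2) if active[j]]
    n = 2 + len(idx)
    A = np.zeros((n, n)); b = np.zeros(n)
    A[:2, :2] = 2 * np.eye(2); b[:2] = 2.0        # 2x + G^T u = 2*[1,1]
    for k, j in enumerate(idx):
        A[:2, 2 + k] = G[j]                        # u_j * grad g_j term
        A[2 + k, :2] = G[j]; b[2 + k] = c[j]       # g_j active
    sol = np.linalg.solve(A, b)
    x, u = sol[:2], dict(zip(idx, sol[2:]))
    feas = all(G[j] @ x - c[j] <= 1e-9 for j in range(2))
    return x, u, feas and all(uj >= -1e-9 for uj in u.values())

for active in product([False, True], repeat=2):
    x, u, ok = solve_case(active)
    if ok:
        print("KKT point:", x, "multipliers:", u)  # only x = (2, 1), u2 = 2 survives
```

Enumerating 2^m cases is exactly what the hand calculation does; it is fine for small m but grows exponentially, which is why numerical optimizers use active-set or interior-point strategies instead.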
Sensitivity for Case 2

Case 2 KKT point:
x* = (2, 1)
u1 = 0, u2 = 2
f* = 1

Constraint Variation Sensitivity
f(b_i, e_j) = f*(0, 0) - Σ v_i b_i - Σ u_j e_j
f(e1, e2) = f*(0, 0) - u1 e1 - u2 e2
10%:  f(0, 0.2) = 1 - 2(0.2) = 0.6
      actual f(0, 0.2) = 0.64
50%:  f(0, 1.0) = 1 - 2(1) = -1
      actual f(0, 1) = 0

From the convexity theorems:
1. the constraints are linear
2. Hf is PD
Therefore the KKT point is the global minimum!
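The estimate-vs-actual comparison above can be reproduced directly. A sketch, assuming the perturbation convention g2 <= e2 (so the bound moves to x1 >= 2 - e2); the closed-form "actual" optimum is specific to this problem:

```python
# Sensitivity check for Prob 4.120 at x* = (2, 1), u2 = 2, f* = 1.
# First-order estimate:  f(e2) ~ f* - u2 * e2.
def estimate(e2, f_star=1.0, u2=2.0):
    return f_star - u2 * e2

def actual(e2):
    # Relaxed bound x1 >= 2 - e2; the bound stays active until it
    # passes the unconstrained minimizer x1 = 1 (x2 = 1 throughout).
    x1 = max(1.0, 2.0 - e2)
    return (x1 - 1.0) ** 2

for e2 in (0.2, 1.0):
    print(e2, estimate(e2), actual(e2))
# e2 = 0.2: estimate 0.6,  actual 0.64  (estimate is only first order)
# e2 = 1.0: estimate -1.0, actual 0.0   (constraint has gone inactive)
```

The large-perturbation error illustrates why the multiplier sensitivity is trustworthy only for small constraint variations while the active set stays the same.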
Graphical Solution

[Figure: objective contours with constraints g1 and g2; the optimum lies where an f contour touches the active constraint g2.]

Check ∇f + u1 ∇g1 + u2 ∇g2 = 0 at x* = (2, 1):
x1: 2(x1 - 1) + u1(1) + u2(-1) = 2(2 - 1) + 0 - 2 = 2 + 0 - 2 = 0, OK
x2: 2(x2 - 1) + u1(1) + u2(0)  = 2(1 - 1) + 0 + 0 = 0 + 0 + 0 = 0, OK
Lagrange Multiplier Method

• May produce a KKT point
• A KKT point is a CANDIDATE minimum
  It may not be a local minimum
• If a point fails the KKT conditions, we cannot guarantee anything...
  The point may still be a minimum (for example, where the regularity assumption fails).
• We need a SUFFICIENT condition
Recall Unconstrained MVO

For x* to be a local minimum:  Δf = f(x) - f(x*) >= 0

Δf = ∇f(x*)^T d + (1/2) d^T H d + ...

∇f(x*) = 0       1st-order Necessary Condition
d^T H d > 0      2nd-order Sufficient Condition
i.e. H(x*) must be positive definite
Considerations for a Constrained Min?

Objective function
  Differentiable, continuous, i.e. smooth?
  Hf(x) positive definite
  (i.e. convexity of f(x))
Weierstrass theorem hints:
  x closed and bounded?
  x contiguous, or separated pockets of points?
Constraints h(x) & g(x)
  Define the constraint set, i.e. the feasible region
  x contiguous, or separated pockets of points?
Convex: sets, functions, constraint set, and programming problem
Punchline (Theorem 4.10, pg 165)

The first-order KKT conditions are Necessary and Sufficient for a GLOBAL minimum... if:
1. f(x) is convex
   Hf(x) positive definite (PSD suffices for convexity)
2. x is defined on a convex feasible set
   Equality constraints must be linear
   Inequality constraints must be convex
HINT: linear functions are convex!
Convex Sets

Convex set:
All points on a straight line between any two points of the feasible region are also in the region.

Non-convex set:
Some points on such a line are not in the feasible region.
Single Variable

x = α x2 + (1 - α) x1,      0 <= α <= 1
x = x1 + α (x2 - x1),       0 <= α <= 1

No "gaps" in the feasible "region"
Multiple Variables (Fig 4.21)

x = α x^(2) + (1 - α) x^(1),       0 <= α <= 1
x = x^(1) + α (x^(2) - x^(1)),     0 <= α <= 1

Example set: x1^2 + x2^2 - 1 <= 0

What if it were an equality constraint?
(Then only the boundary circle is feasible, which is not a convex set.)
(Note: misprint in the figure.)
Figure 4.22 Convex function f(x) = x^2

A bowl that holds water.

f(x) <= α f(x2) + (1 - α) f(x1),                    0 <= α <= 1
f(x) <= f(x1) + α ( f(x2) - f(x1) ),                0 <= α <= 1
f(α x2 + (1 - α) x1) - f(x1) <= α [ f(x2) - f(x1) ],  0 <= α <= 1
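The inequality above can be spot-checked numerically. A minimal sketch (my own helper, sampling interpolation weights on a grid):

```python
# Spot-check the convexity inequality
#   f(a*x2 + (1-a)*x1) <= a*f(x2) + (1-a)*f(x1),  0 <= a <= 1
# on sampled weights a. Passing samples do not prove convexity,
# but a single failing sample disproves it.
def convex_on_samples(f, x1, x2, steps=101):
    for k in range(steps):
        a = k / (steps - 1)
        x = a * x2 + (1 - a) * x1
        if f(x) > a * f(x2) + (1 - a) * f(x1) + 1e-12:
            return False
    return True

print(convex_on_samples(lambda x: x**2, -3.0, 5.0))    # True: x^2 is convex
print(convex_on_samples(lambda x: -x**2, -3.0, 5.0))   # False: -x^2 is concave
```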
Fig 4.23 Characterization of a convex function

f(x) <= α f(x2) + (1 - α) f(x1),                    0 <= α <= 1
f(x) <= f(x1) + α ( f(x2) - f(x1) ),                0 <= α <= 1
f(α x2 + (1 - α) x1) - f(x1) <= α [ f(x2) - f(x1) ],  0 <= α <= 1
Test for a Convex Function

f(x) <= α f(x^(2)) + (1 - α) f(x^(1)),            0 <= α <= 1
f(x) <= f(x^(1)) + α ( f(x^(2)) - f(x^(1)) ),     0 <= α <= 1

The definition above is difficult to use directly!
However, Thm 4.8, pg 163:
If the Hessian matrix of the function is PD or PSD at all points in the set S, then the function is convex.
PD... "strictly" convex; otherwise
PSD... "convex"
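Thm 4.8's test is easy to automate at a given point by inspecting the Hessian's eigenvalues. A sketch using numpy (the helper name and example Hessians are my own):

```python
import numpy as np

def convexity_from_hessian(H, tol=1e-10):
    """Classify a symmetric Hessian at one point:
    all eigenvalues > 0 (PD)  -> strictly convex there,
    all eigenvalues >= 0 (PSD) -> convex there."""
    eig = np.linalg.eigvalsh(H)           # eigenvalues of a symmetric matrix
    if np.all(eig > tol):
        return "strictly convex (PD)"
    if np.all(eig >= -tol):
        return "convex (PSD)"
    return "not convex here"

# f = (x1-1)^2 + (x2-1)^2  ->  Hf = 2I everywhere: strictly convex
print(convexity_from_hessian(np.array([[2.0, 0.0], [0.0, 2.0]])))
# f = x1^2 - x2^2  ->  indefinite Hessian: not convex
print(convexity_from_hessian(np.array([[2.0, 0.0], [0.0, -2.0]])))
```

For a non-quadratic f the Hessian varies with x, so the PD/PSD condition must hold at all points of S; a check at one point can disprove convexity but not prove it.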
Theorem 4.9

Given the constraint set
S = { x | h_i(x) = 0, i = 1 to p;  g_j(x) <= 0, j = 1 to m }

S is convex if:
1. the h_i are linear
2. the g_j are convex, i.e. Hg is PD or PSD

When f(x) and S are convex, we have a "convex programming problem".
"Sufficient" Theorem 4.10, pg 165

The first-order KKT conditions are Necessary and Sufficient for a GLOBAL minimum... if:
1. f(x) is convex
   Hf(x) positive definite (PSD suffices for convexity)
2. x is defined on a convex feasible set S
   Equality constraints must be linear
   Inequality constraints must be convex
HINT: linear functions are convex!
Summary

• Lagrange multipliers are the instantaneous rate of change in f(x)
  w.r.t. relaxing a constraint.
• A KKT point is a CANDIDATE minimum!
  (need sufficient conditions for proof)
• Convex sets assure the feasible region is contiguous, with no
  separated pockets of points.
• A KKT point of a convex programming problem is a GLOBAL MINIMUM!