MATH2070 Optimisation
Nonlinear optimisation with constraints
Semester 2, 2012
Lecturer: I.W. Guo
Lecture slides courtesy of J.R. Wishart
Introduction
Lagrange
Inequality Constraints and Kuhn-Tucker
Second order conditions
Review
The full nonlinear optimisation problem with equality constraints
Method of Lagrange multipliers
Dealing with Inequality Constraints and the Kuhn-Tucker conditions
Second order conditions with constraints
Non-linear optimisation
Interested in optimising a function of several variables with
constraints.
Multivariate framework
Variables
x = (x1 , x2 , . . . , xn ) ∈ D ⊂ Rn
Objective Function
Z = f (x1 , x2 , . . . , xn )
Constraint equations,
    g1 (x1 , x2 , . . . , xn ) = 0, . . . , gm (x1 , x2 , . . . , xn ) = 0,
    h1 (x1 , x2 , . . . , xn ) ≤ 0, . . . , hp (x1 , x2 , . . . , xn ) ≤ 0.
Vector notation
Multivariate framework
Minimise
Z = f (x)
such that
g(x) = 0,
h(x) ≤ 0,
where g : Rn −→ Rm and h : Rn −→ Rp .
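As a concrete illustration of this general form, the problem can be handed to a numerical solver. The sketch below is not part of the course material; it assumes SciPy's SLSQP method and uses a toy instance (f(x) = x1² + x2² with one equality and one inequality constraint) chosen purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Toy instance of: minimise f(x) such that g(x) = 0, h(x) <= 0,
# with f(x) = x1^2 + x2^2, g(x) = x1 - x2, h(x) = 1 - x1 - x2.
def f(x):
    return x[0]**2 + x[1]**2

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] - x[1]},      # g(x) = 0
    # SciPy expects inequalities as c(x) >= 0, so h(x) <= 0
    # is passed as -h(x) = x1 + x2 - 1 >= 0:
    {"type": "ineq", "fun": lambda x: x[0] + x[1] - 1},
]

res = minimize(f, x0=np.zeros(2), method="SLSQP", constraints=constraints)
print(res.x)  # minimiser, approximately (0.5, 0.5)
```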
Objectives?
- Find extrema of the above problem.
- Consider equality constraints first.
- Extension: Allow inequality constraints.
Initial Example
Minimise the heat loss through the surface area of an open lid
rectangular container.
Formulated as a problem
Minimise,
    S = 2xy + 2yz + xz
such that,
    V = xyz ≡ constant and x, y, z > 0.
- Parameters: (x, y, z)
- Objective function: f = S
- Constraint: g = xyz − V
Graphical interpretation of problem
[Figure: level contours of f (x, y) tangent to the constraint curve g(x, y) = c at a point P]

- Candidate optima occur where a level contour of f (x, y) meets g(x, y) = c tangentially.
- There, the gradient of f is parallel to the gradient of g.
- Recall from linear algebra and vectors:
      ∇f + λ∇g = 0,
  for some constant, λ.
Lagrangian
Definition
Define the Lagrangian for our problem,
L(x, y, z, λ) = S(x, y, z) + λg(x, y, z)
= 2xy + 2yz + xz + λ (xyz − V )
- Necessary conditions,
      ∂L/∂x = ∂L/∂y = ∂L/∂z = ∂L/∂λ = 0.
- This will be a system of four equations in four unknowns (x, y, z, λ) that needs to be solved.
Solving the system
- Calculate derivatives,
      Lx = 2y + z − λyz,
      Ly = 2x + 2z − λxz,
      Lz = 2y + x − λxy,
      Lλ = xyz − V.
- Ensure constraint,
      Lλ = 0 ⇒ z = V/(xy)                      (1)
- Eliminate λ, solve for (x, y, z):
      Lx = 0 ⇒ 2y + z = λyz ⇒ λ = 2/z + 1/y    (2)
      Ly = 0 ⇒ 2x + 2z = λxz ⇒ λ = 2/z + 2/x   (3)
      Lz = 0 ⇒ 2y + x = λxy ⇒ λ = 2/x + 1/y    (4)
Solving system (continued)
- From (2) and (3) it follows x = 2y.
- From (2) and (4) it follows x = z.
- Use (1) with x = z = 2y, then simplify:
      x∗ = (x, y, z) = ( (2V)^(1/3), (V/4)^(1/3), (2V)^(1/3) ).
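The stationary point can be sanity-checked numerically. A minimal sketch, assuming SciPy's SLSQP and an illustrative volume V = 4 (so the analytic answer is x = z = 2, y = 1):

```python
import numpy as np
from scipy.optimize import minimize

V = 4.0  # illustrative volume: (2V)^(1/3) = 2 and (V/4)^(1/3) = 1

def surface(v):
    x, y, z = v
    return 2*x*y + 2*y*z + x*z  # open-lid box surface area

volume_constraint = {"type": "eq", "fun": lambda v: v[0]*v[1]*v[2] - V}

res = minimize(surface, x0=[1.5, 1.0, 1.5], method="SLSQP",
               constraints=[volume_constraint])
print(res.x)  # approximately (2, 1, 2)
```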
Constrained optimisation
Wish to generalise the previous argument to a multivariate framework,
Minimise
Z = f (x), x ∈ D ⊂ Rn
such that
g(x) = 0,
h(x) ≤ 0,
where g : Rn −→ Rm and h : Rn −→ Rp .
- Geometrically, the constraints define a subset Ω of Rn, the feasible set, of dimension at most n − m, in which the minimum lies.
Constrained Optimisation
- A value x ∈ D is called feasible if both g(x) = 0 and h(x) ≤ 0.
- An inequality constraint hi (x) ≤ 0 is
  - Active at a feasible x ∈ D if hi (x) = 0,
  - Inactive otherwise.
Equality Constraints
A point x∗ which satisfies the constraint g(x∗ ) = 0 is a regular point if
    {∇g1 (x∗ ), ∇g2 (x∗ ), . . . , ∇gm (x∗ )}
is linearly independent; equivalently, the m × n Jacobian has full rank,
    Rank ( ∂gi /∂xj )  (i = 1, . . . , m;  j = 1, . . . , n)  = m.
Tangent Subspace
Definition
Define the constrained tangent subspace
    T = {d ∈ Rn : ∇g(x∗ )d = 0},
where ∇g(x∗ ) is the m × n matrix whose rows are the constraint gradients,
    ∇g(x∗ ) := [ ∇g1 (x∗ )T ; . . . ; ∇gm (x∗ )T ],
and ∇gi T := ( ∂gi /∂x1 , . . . , ∂gi /∂xn ).
Tangent Subspace (Example)
Generalised Lagrangian
Definition
Define the Lagrangian for the objective function f (x) with constraint equations g(x) = 0,
    L(x, λ) := f + Σ_{i=1}^m λi gi = f + λT g.

- Introduce a Lagrange multiplier for each equality constraint.
First order necessary conditions
Theorem
Assume x∗ is a regular point of g(x) = 0 and let x∗ be a local extremum of f subject to g(x) = 0, with f, g ∈ C¹. Then there exists λ ∈ Rm such that,
    ∇L(x∗ , λ) = ∇f (x∗ ) + λT ∇g(x∗ ) = 0.
In sigma notation:
    ∂L(x∗ )/∂xj = ∂f (x∗ )/∂xj + Σ_{i=1}^m λi ∂gi (x∗ )/∂xj = 0,    j = 1, . . . , n;
and
    ∂L(x∗ )/∂λi = gi (x∗ ) = 0,    i = 1, . . . , m.
Another example
Example
Find the extremum for f (x, y) = ½(x − 1)² + ½(y − 2)² + 1 subject to the single constraint x + y = 1.
Solution: Construct the Lagrangian:
    L(x, y, λ) = ½(x − 1)² + ½(y − 2)² + 1 + λ(x + y − 1).
Then the necessary conditions are:
    ∂L/∂x = x − 1 + λ = 0
    ∂L/∂y = y − 2 + λ = 0
    ∂L/∂λ = 0 ⇒ x + y = 1.
Solve system
Can solve this by converting the problem into matrix form, i.e. solve for a in M a = b:

    ⎛1 0 1⎞⎛x⎞   ⎛1⎞
    ⎜0 1 1⎟⎜y⎟ = ⎜2⎟
    ⎝1 1 0⎠⎝λ⎠   ⎝1⎠

Inverting the matrix M and solving gives,

    ⎛x⎞   ⎛1 0 1⎞⁻¹⎛1⎞   ⎛0⎞
    ⎜y⎟ = ⎜0 1 1⎟  ⎜2⎟ = ⎜1⎟
    ⎝λ⎠   ⎝1 1 0⎠  ⎝1⎠   ⎝1⎠

so (x, y, λ) = (0, 1, 1).
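This linear solve is easy to reproduce numerically, e.g. with NumPy (an illustration, not part of the slides):

```python
import numpy as np

# Solve M a = b for a = (x, y, lambda)
M = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])
b = np.array([1.0, 2.0, 1.0])

x, y, lam = np.linalg.solve(M, b)
print(x, y, lam)  # 0.0 1.0 1.0
```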
Comparison with unconstrained problem
Example
Minimise f (x, y) = ½(x − 1)² + ½(y − 2)² + 1 subject to the single constraint x + y = 1.

- In the unconstrained problem, the minimum is attained at x = (1, 2), with f ∗ = f (1, 2) = 1.
- The constraint modifies the solution: the constrained minimum is f ∗ = f (0, 1) = 2.
- Note: The constrained minimum is bounded from below by the unconstrained minimum.
Use of convex functions : Example
Example
Minimise:
    f (x, y) = x² + y² + xy − 5y
such that:
    x + 2y = 5.
The constraint is linear (therefore convex) and the Hessian matrix of f is positive definite, so f is convex as well.
The Lagrangian is
    L(x, y, λ) = x² + y² + xy − 5y + λ(x + 2y − 5).
Solve
Finding first order necessary conditions,
    ∂L/∂x = 2x + y + λ = 0         (5)
    ∂L/∂y = 2y + x − 5 + 2λ = 0    (6)
    ∂L/∂λ = x + 2y − 5 = 0.        (7)

In matrix form the system is,

    ⎛2 1 1⎞⎛x⎞   ⎛0⎞
    ⎜1 2 2⎟⎜y⎟ = ⎜5⎟
    ⎝1 2 0⎠⎝λ⎠   ⎝5⎠

Inverting and solving yields,

    ⎛x⎞       ⎛−5⎞
    ⎜y⎟ = ⅓   ⎜10⎟
    ⎝λ⎠       ⎝ 0⎠

i.e. (x, y, λ) = (−5/3, 10/3, 0).
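Again this is a small linear solve; a NumPy check (illustrative only) confirms the multiplier is zero here:

```python
import numpy as np

# First-order system for f = x^2 + y^2 + xy - 5y subject to x + 2y = 5
M = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 2.0],
              [1.0, 2.0, 0.0]])
b = np.array([0.0, 5.0, 5.0])

x, y, lam = np.linalg.solve(M, b)
print(x, y, lam)  # -5/3, 10/3, 0
```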
Inequality constraints
We now consider constrained optimisation problems with p inequality constraints,

Minimise:     f (x1 , x2 , . . . , xn )
subject to:   h1 (x1 , x2 , . . . , xn ) ≤ 0
              h2 (x1 , x2 , . . . , xn ) ≤ 0
              ...
              hp (x1 , x2 , . . . , xn ) ≤ 0.

In vector form the problem is

Minimise:     f (x)
subject to:   h(x) ≤ 0.
The Kuhn-Tucker Conditions
- To deal with the inequality constraints, introduce the so-called Kuhn-Tucker conditions:

      ∂f /∂xj + Σ_{i=1}^p λi ∂hi /∂xj = 0,    j = 1, . . . , n;
      λi hi = 0,    i = 1, . . . , p;
      λi ≥ 0,    i = 1, . . . , p.

- The extra conditions for λ apply in this sense:
      hi (x) = 0 if λi > 0,    or    hi (x) < 0 if λi = 0,    i = 1, . . . , p.
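The conditions can be verified at a candidate point by direct substitution. A minimal sketch on a toy problem not from the slides: minimise f (x, y) = x² + y² subject to h(x, y) = 1 − x − y ≤ 0, with candidate (x, y) = (1/2, 1/2) and λ = 1:

```python
import numpy as np

# Candidate point and multiplier for min x^2 + y^2 s.t. 1 - x - y <= 0
x = np.array([0.5, 0.5])
lam = 1.0

grad_f = 2 * x                     # gradient of f = (2x, 2y)
grad_h = np.array([-1.0, -1.0])    # gradient of h = 1 - x - y
h = 1 - x[0] - x[1]                # active: h = 0

stationarity = grad_f + lam * grad_h   # Kuhn-Tucker stationarity residual
print(stationarity, lam * h, lam >= 0) # zero vector, 0.0, True
```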
Graphical Interpretation
[Figure: level contours of f (x, y); the feasible region h(x, y) < 0 (λ = 0), its boundary h(x, y) = 0 (λ > 0) containing a point P, and the infeasible region h(x, y) > 0]

- If λ > 0 then find the solution on the line h(x, y) = 0.
- If λ = 0 then find the solution in the region h(x, y) < 0.
Graphical Interpretation
[Figure: level contours of f (x, y) tangent to the boundary h(x, y) = 0 (λ > 0) at (x∗ , y ∗ )]

- Consider h(x, y) = 0, then λ > 0. Optimum at x∗ = (x∗ , y ∗ ).
Graphical Interpretation
[Figure: level contours of f (x, y); the unconstrained minimum (x∗ , y ∗ ) lies inside the region h(x, y) < 0 (λ = 0)]

- Consider h(x, y) < 0, then λ = 0. Optimum at x∗ = (x∗ , y ∗ ).
Kuhn-Tucker
- Kuhn-Tucker conditions are always necessary.
- Kuhn-Tucker conditions are sufficient if the objective function and the constraint functions are convex.
- Need a procedure to deal with these conditions.
Introduce slack variables
Slack Variables
To deal with the inequality constraints, introduce a slack variable si ² for each constraint.
New constrained optimisation problem with p equality constraints,

Minimise:     f (x1 , x2 , . . . , xn )
subject to:   h1 (x1 , x2 , . . . , xn ) + s1 ² = 0
              h2 (x1 , x2 , . . . , xn ) + s2 ² = 0
              ...
              hp (x1 , x2 , . . . , xn ) + sp ² = 0.
New Lagrangian
The Lagrangian for the constrained problem is,

    L(x, λ, s) = f (x) + Σ_{i=1}^p λi ( hi (x) + si ² ) = f (x) + λT h(x) + λT s,

where s is the vector in Rp with i-th component si ².

- L depends on the n + 2p independent variables x1 , x2 , . . . , xn , λ1 , λ2 , . . . , λp and s1 , s2 , . . . , sp .
- To find the constrained minimum we must differentiate L with respect to each of these variables and set the result to zero in each case.
Solving the system
Need to solve for the n + 2p variables given by the n + 2p equations generated by,

    ∂L/∂x1 = ∂L/∂x2 = · · · = ∂L/∂xn = 0,
    ∂L/∂λ1 = ∂L/∂λ2 = · · · = ∂L/∂λp = ∂L/∂s1 = ∂L/∂s2 = · · · = ∂L/∂sp = 0.

- Having λi ≥ 0 ensures that
      L = f (x) + Σ_{i=1}^p λi ( hi (x) + si ² )
  has a well defined minimum.
- If λi < 0, we can always make L smaller due to the λi si ² term.
New first order equations
- The differentiation ∂L/∂λi = 0 ensures the constraints hold:
      Lλi = 0 ⇒ hi + si ² = 0,    i = 1, . . . , p.
- The differentiation ∂L/∂si = 0 controls the observed region of minimising:
      Lsi = 0 ⇒ 2λi si = 0,    i = 1, . . . , p.
- If λi > 0 we are on the boundary, since si = 0.
- If λi = 0 we are inside the feasible region, since si ≠ 0.
- If λi < 0 we are on the boundary again, but the point is not a local minimum. So only focus on λi ≥ 0.
Simple application
Example
Consider the general problem

Minimise:     f (x)
subject to:   a ≤ x ≤ b.

- Re-express the problem to allow the Kuhn-Tucker conditions:

Minimise:     f (x)
subject to:   −x ≤ −a,
              x ≤ b.

- The Lagrangian L is given by
      L = f + λ1 (−x + s1 ² + a) + λ2 (x + s2 ² − b).
Solving system
- Find first order equations:
      ∂L/∂x = f ′(x) − λ1 + λ2 = 0,
      ∂L/∂λ1 = −x + a + s1 ² = 0,      ∂L/∂λ2 = x − b + s2 ² = 0,
      ∂L/∂s1 = 2λ1 s1 = 0,             ∂L/∂s2 = 2λ2 s2 = 0.
- Consider the scenarios for when the constraints are at the boundary or inside the feasible region.
Possible scenarios
[Figure: three panels of f over [a, b] — DOWN: f ′(x) > 0 at x = a; IN: f ′(x) = 0 inside (a, b); UP: f ′(x) < 0 at x = b]

- Down variable scenario (lower boundary).
  s1 = 0 ⇒ λ1 > 0 and s2 ² > 0 ⇒ λ2 = 0.
  Implies that x = a, and f ′(x) = f ′(a) = λ1 > 0, so that the minimum lies at x = a.
- In variable scenario (inside feasible region).
  s1 ² > 0 ⇒ λ1 = 0 and s2 ² > 0 ⇒ λ2 = 0.
  Implies f ′(x) = 0 for some x ∈ [a, b] determined by the specific f (x), and thus there is a turning point somewhere in the interval [a, b].
- Up variable scenario (upper boundary).
  s1 ² > 0 ⇒ λ1 = 0 and s2 = 0 ⇒ λ2 > 0.
  Implies x = b, so that f ′(x) = f ′(b) = −λ2 < 0, and we have the minimum at x = b.
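The "up" scenario can be reproduced numerically. A sketch on a toy instance not from the slides, f (x) = (x − 3)² on [a, b] = [0, 2], assuming SciPy's bounded scalar minimiser: the unconstrained minimiser x = 3 lies above b, so the constrained minimum sits at x = b, with f ′(b) = −2 giving λ2 = 2 > 0.

```python
from scipy.optimize import minimize_scalar

# f(x) = (x - 3)^2 restricted to [0, 2]: the "up" boundary scenario
res = minimize_scalar(lambda x: (x - 3)**2, bounds=(0, 2), method="bounded")
print(res.x)  # approximately 2, the upper boundary b
```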
Numerical example
Example
Minimise:     f (x1 , x2 , x3 ) = x1 (x1 − 10) + x2 (x2 − 50) − 2x3
subject to:   x1 + x2 ≤ 10,
              x3 ≤ 10.

Write down the Lagrangian,
    L(x, λ, s) = f (x) + λT ( h(x) + s ),
where,
    h(x) = ( x1 + x2 − 10, x3 − 10 )T,    s = ( s1 ², s2 ² )T.
Solve the system of first order equations
    ∂L/∂x1 = 2x1 − 10 + λ1 = 0    (8)
    ∂L/∂x2 = 2x2 − 50 + λ1 = 0    (9)
    ∂L/∂x3 = −2 + λ2 = 0          (10)

The Kuhn-Tucker conditions are λ1 , λ2 ≥ 0 and

    λ1 (x1 + x2 − 10) = 0    (11)
    λ2 (x3 − 10) = 0         (12)

Immediately from (10) and (12), λ2 = 2 and x3 = 10.
Two remaining cases to consider
- Inside the feasible set, λ1 = 0.
  From equations (8) and (9) find
      x = (x1 , x2 , x3 ) = (5, 25, 10).
  However, the constraint x1 + x2 ≤ 10 is violated. Reject this solution!
- At the boundary, λ1 > 0.
  Using constraint equation (11), x1 + x2 = 10. Find the system:
      −2x1 − 30 + λ1 = 0
      2x1 − 10 + λ1 = 0.
  Solving gives λ1 = 20 and x1 = −5, x2 = 15.
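The boundary solution can be cross-checked with a numerical solver. A sketch assuming SciPy's SLSQP (note SciPy expects inequalities as c(x) ≥ 0, so each hi (x) ≤ 0 is passed as −hi (x) ≥ 0):

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    return x[0]*(x[0] - 10) + x[1]*(x[1] - 50) - 2*x[2]

constraints = [
    {"type": "ineq", "fun": lambda x: 10 - x[0] - x[1]},  # x1 + x2 <= 10
    {"type": "ineq", "fun": lambda x: 10 - x[2]},         # x3 <= 10
]

res = minimize(f, x0=np.zeros(3), method="SLSQP", constraints=constraints)
print(res.x, res.fun)  # approximately (-5, 15, 10), with f = -470
```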
Motivation
- Determine sufficient conditions for minima (maxima) given a set of constraints.
- Define the operator ∇² to denote the Hessian matrices, ∇²f := Hf .
- What are the necessary/sufficient conditions on ∇²f and {∇²gj }_{j=1}^m that ensure a constrained minimum?
- Consider equality constraints first.
- Extend to inequality constraints.
Notation
The Hessian of the constraint equations in the upcoming results needs care. Define as follows,

- Constraint Hessian involving Lagrange multipliers,
      λT ∇²g(x) := Σ_{i=1}^m λi ∇²gi (x) = Σ_{i=1}^m λi Hgi (x).
Necessary second order condition for a constrained
minimum with equality constraints
Theorem
Suppose that x∗ is a regular point of g(x) = 0, with f, g ∈ C², and is a local minimum of f subject to g(x) = 0. Then there exists λ ∈ Rm such that,
    ∇f (x∗ ) + λT ∇g(x∗ ) = 0.
Also, on the tangent plane T = {y : ∇g(x∗ )y = 0}, the Lagrangian Hessian at x∗ ,
    ∇²L(x∗ ) := ∇²f (x∗ ) + λT ∇²g(x∗ ),
is positive semi-definite, i.e. y T ∇²L(x∗ )y ≥ 0 for all y ∈ T .
Sufficient second order condition for a constrained
minimum with equality constraints
Theorem
Suppose that x∗ and λ satisfy,
    ∇f (x∗ ) + λT ∇g(x∗ ) = 0,
and that the Lagrangian Hessian at x∗ ,
    ∇²L(x∗ ) := ∇²f (x∗ ) + λT ∇²g(x∗ ),
satisfies y T ∇²L(x∗ )y > 0 for all y ∈ T . Then f (x∗ ) is a local minimum that satisfies g(x∗ ) = 0.
Determine if positive/negative definite
Options for determining definiteness of ∇²L(x∗ ) on the subspace T :
- Complete the square of y T ∇²L(x∗ )y.
- Compute the eigenvalues of ∇²L(x∗ ) restricted to T .
Example
Max:          Z = x1 x2 + x1 x3 + x2 x3
subject to:   x1 + x2 + x3 = 3.
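The first order conditions for this example are linear, so the stationary point can be found with one linear solve. A sketch, assuming the Lagrangian L = Z + λ(x1 + x2 + x3 − 3):

```python
import numpy as np

# First order conditions for L = Z + lambda*(x1 + x2 + x3 - 3):
#   x2 + x3 + lambda = 0
#   x1 + x3 + lambda = 0
#   x1 + x2 + lambda = 0
#   x1 + x2 + x3     = 3
A = np.array([[0.0, 1.0, 1.0, 1.0],
              [1.0, 0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0, 0.0]])
b = np.array([0.0, 0.0, 0.0, 3.0])

x1, x2, x3, lam = np.linalg.solve(A, b)
print(x1, x2, x3, lam)  # 1, 1, 1 and lambda = -2, giving Z = 3
```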
Extension: Inequality constraints
Problem:
Min:          f (x)
subject to:   h(x) ≤ 0,
              g(x) = 0,
where f, g, h ∈ C¹.

- Extend arguments to include both equality and inequality constraints.
- Notation: Define the index set J :

Definition
Let J (x∗ ) be the index set of active constraints:
    hj (x∗ ) = 0,  j ∈ J ;    hj (x∗ ) < 0,  j ∉ J .
First order necessary conditions
The point x∗ is called a regular point of the constraints, g(x) = 0 and h(x) ≤ 0, if
    {∇gi }_{i=1}^m and {∇hj }_{j∈J}
are linearly independent.

Theorem
Let x∗ be a local minimum of f (x) subject to the constraints g(x) = 0 and h(x) ≤ 0. Suppose further that x∗ is a regular point of the constraints. Then there exists λ ∈ Rm and µ ∈ Rp with µ ≥ 0 such that,
    ∇f (x∗ ) + λT ∇g(x∗ ) + µT ∇h(x∗ ) = 0,
    µT h(x∗ ) = 0.    (13)
Second order necessary conditions
Theorem
Let x∗ be a regular point of g(x) = 0, h(x) ≤ 0 with f, g, h ∈ C². If x∗ is a local minimum point of the fully constrained problem, then there exists λ ∈ Rm and µ ∈ Rp such that (13) holds and
    ∇²f (x∗ ) + Σ_{i=1}^m λi ∇²gi (x∗ ) + Σ_{j=1}^p µj ∇²hj (x∗ )
is positive semi-definite on the tangent space of the active constraints, i.e. on the space
    T = { d ∈ Rn : ∇g(x∗ )d = 0;  ∇hj (x∗ )d = 0,  j ∈ J }.
Second order sufficient conditions
Theorem
Let x∗ satisfy g(x∗ ) = 0, h(x∗ ) ≤ 0. Suppose there exist λ ∈ Rm and µ ∈ Rp such that µ ≥ 0 and (13) holds, and
    ∇²f (x∗ ) + Σ_{i=1}^m λi ∇²gi (x∗ ) + Σ_{j=1}^p µj ∇²hj (x∗ )
is positive definite on the tangent space of the active constraints, i.e. on the space
    T ′ = { d ∈ Rn : ∇g(x∗ )d = 0;  ∇hj (x∗ )d = 0,  j ∈ J },
where J = { j : hj (x∗ ) = 0, µj > 0 }. Then x∗ is a strict local minimum of f subject to the full constraints.
Ways to compute the sufficient conditions
The definiteness of the Hessian of the Lagrangian function, subject to its auxiliary conditions, can be determined via:
1. Eigenvalues on the subspace T .
2. Finding the nature of the Bordered Hessian.
Eigenvalues on subspaces
Let {ei }_{i=1}^{n−m} be an orthonormal basis of T . Then any d ∈ T can be written,
    d = Ez,
where z ∈ R^{n−m} and,
    E = [ e1  e2  . . .  e_{n−m} ].
Then use,
    dT ∇²L d = z T ( E T ∇²L E ) z,
and examine the definiteness of the (n − m) × (n − m) matrix E T ∇²L E.
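Applied to the earlier example (max Z = x1 x2 + x1 x3 + x2 x3 subject to x1 + x2 + x3 = 3), this method can be carried out numerically; a sketch assuming SciPy's `null_space` for the orthonormal basis E:

```python
import numpy as np
from scipy.linalg import null_space

# The constraint is linear, so the Lagrangian Hessian equals the Hessian of Z
H = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
grad_g = np.array([[1.0, 1.0, 1.0]])  # gradient of g = x1 + x2 + x3 - 3

E = null_space(grad_g)        # orthonormal basis of T = {d : grad_g d = 0}
restricted = E.T @ H @ E      # (n - m) x (n - m) restricted Hessian
eigs = np.linalg.eigvalsh(restricted)
print(eigs)  # both eigenvalues are -1: negative definite on T, a constrained maximum
```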
Bordered Hessian
An alternative method for determining the behaviour of the Lagrangian Hessian.

Definition
Given an objective function f with constraints g : Rn −→ Rm , define the Bordered Hessian matrix,

    B(x, λ) := ⎛ ∇²L(x, λ)   ∇g(x)T ⎞
               ⎝ ∇g(x)       0      ⎠

Find the nature of the Bordered Hessian.
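As an illustration, the Bordered Hessian can be built for the earlier convex example (f = x² + y² + xy − 5y with g = x + 2y − 5, so n = 2, m = 1). The sign rule used in the comment is the common textbook convention, not stated on these slides:

```python
import numpy as np

H_L = np.array([[2.0, 1.0],
                [1.0, 2.0]])       # Hessian of L (the constraint is linear)
grad_g = np.array([[1.0, 2.0]])    # gradient of g = x + 2y - 5

B = np.block([[H_L,    grad_g.T],
              [grad_g, np.zeros((1, 1))]])

det_B = np.linalg.det(B)
# det(B) = -6; under the usual convention, sign (-1)^m = -1 (here m = 1)
# indicates a constrained minimum, consistent with the convexity argument.
print(det_B)
```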