Lecture Notes

Optimization Methods: Optimization using Calculus – Kuhn-Tucker Conditions
Module - 2 Lecture Notes – 5
Kuhn-Tucker Conditions
Introduction
The previous lecture dealt with the optimization of functions of multiple variables subject to equality constraints, using the method of constrained variation and the method of Lagrange multipliers. In this lecture the Kuhn-Tucker conditions for a point to be a local optimum of a function subject to inequality constraints are discussed, with examples.
Kuhn-Tucker Conditions
It was previously established that for both an unconstrained optimization problem and an
optimization problem with an equality constraint the first-order conditions are sufficient for a
global optimum when the objective and constraint functions satisfy appropriate
concavity/convexity conditions. The same is true for an optimization problem with inequality
constraints.
The Kuhn-Tucker conditions are both necessary and sufficient if the objective function is concave and each constraint is linear, or each constraint function is concave (for a minimization problem, with convex in place of concave), i.e. the problem belongs to the class called convex programming problems.
Consider the following optimization problem:
Minimize f(X) subject to gj(X) ≤ 0 for j = 1, 2, …, m; where X = [x1 x2 . . . xn]

Then the Kuhn-Tucker conditions for X* = [x1* x2* . . . xn*] to be a local minimum are

∂f/∂xi + Σ_{j=1}^{m} λj ∂gj/∂xi = 0,   i = 1, 2, …, n
λj gj = 0,   j = 1, 2, …, m
gj ≤ 0,   j = 1, 2, …, m                                   (1)
λj ≥ 0,   j = 1, 2, …, m

where all functions and derivatives are evaluated at X*.
In the case of minimization problems, if the constraints are of the form gj(X) ≥ 0, then the λj have to be nonpositive in (1). On the other hand, if the problem is one of maximization with the constraints in the form gj(X) ≥ 0, then the λj have to be nonnegative.
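The four conditions in (1) can be checked mechanically at a candidate point. The following sketch is not from the lecture: the function names and the finite-difference gradient are illustrative choices. It tests stationarity, complementary slackness, feasibility, and the sign condition for a minimization problem with constraints gj(X) ≤ 0, on the toy problem "minimize x² subject to 1 − x ≤ 0", whose minimizer x* = 1 carries the multiplier λ = 2 (from 2x − λ = 0).

```python
# Hypothetical sketch: numerically checking the Kuhn-Tucker conditions (1)
# for a minimization problem with constraints g_j(X) <= 0.

def grad(f, x, h=1e-6):
    """Central-difference gradient of f at x (an approximation)."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def kkt_holds(f, gs, x, lam, tol=1e-6):
    """True if (x, lam) satisfies all four conditions in (1) within tol."""
    gf = grad(f, x)
    ggs = [grad(g, x) for g in gs]
    # Stationarity: df/dxi + sum_j lam_j * dgj/dxi = 0 for each i
    stat = all(abs(gf[i] + sum(l * gg[i] for l, gg in zip(lam, ggs))) < tol
               for i in range(len(x)))
    comp = all(abs(l * g(x)) < tol for l, g in zip(lam, gs))  # lam_j g_j = 0
    feas = all(g(x) <= tol for g in gs)                       # g_j <= 0
    sign = all(l >= -tol for l in lam)                        # lam_j >= 0
    return stat and comp and feas and sign

# Toy check: minimize x^2 subject to 1 - x <= 0 (i.e. x >= 1).
print(kkt_holds(lambda x: x[0]**2, [lambda x: 1 - x[0]], [1.0], [2.0]))
# prints True
```

Swapping in an infeasible point such as x = [0.0] makes the check fail, since g(0) = 1 > 0 violates feasibility.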
Example (1)
Minimize f = x1² + 2x2² + 3x3² subject to the constraints
g1 = x1 + x2 + 2x3 - 12 ≤ 0
g2 = x1 + 2x2 + 3x3 - 8 ≤ 0
using Kuhn-Tucker conditions.
Solution:
The Kuhn-Tucker conditions are given by

a) ∂f/∂xi + λ1 ∂g1/∂xi + λ2 ∂g2/∂xi = 0,   i = 1, 2, 3

i.e.
2x1 + λ1 + λ2 = 0   (2)
4x2 + λ1 + 2λ2 = 0   (3)
6x3 + 2λ1 + 3λ2 = 0   (4)

b) λj gj = 0

i.e.
λ1 (x1 + x2 + 2x3 - 12) = 0   (5)
λ2 (x1 + 2x2 + 3x3 - 8) = 0   (6)

c) gj ≤ 0

i.e.,
x1 + x2 + 2x3 - 12 ≤ 0   (7)
x1 + 2x2 + 3x3 - 8 ≤ 0   (8)

d) λj ≥ 0

i.e.,
λ1 ≥ 0   (9)
λ2 ≥ 0   (10)
From (5), either λ1 = 0 or x1 + x2 + 2x3 - 12 = 0.

Case 1: λ1 = 0
From (2), (3) and (4) we have x1 = x2 = x3 = -λ2/2.
Using these in (6) we get 3λ2² + 8λ2 = 0, so λ2 = 0 or -8/3.
From (10), λ2 ≥ 0; therefore λ2 = 0 and X* = [0, 0, 0]. This solution set satisfies all of (6) to (9).
Case 2: x1 + x2 + 2x3 - 12 = 0
Using (2), (3) and (4), we have

(λ1 + λ2)/2 + (λ1 + 2λ2)/4 + (2λ1 + 3λ2)/3 + 12 = 0

or 17λ1 + 24λ2 = -144. But conditions (9) and (10) give λ1 ≥ 0 and λ2 ≥ 0 simultaneously, which is impossible with 17λ1 + 24λ2 = -144.

Hence the solution set for this optimization problem is X* = [0 0 0].
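As a sanity check (illustrative, not part of the original notes), the candidate X* = [0, 0, 0] with λ1 = λ2 = 0 can be substituted directly into conditions (2)-(10), and a coarse integer grid search over the feasible region confirms that no feasible grid point attains a value below f(X*) = 0:

```python
# Verify X* = [0, 0, 0], lambda1 = lambda2 = 0 against conditions (2)-(10)
# of Example (1), then brute-force a feasible grid around the origin.
import itertools

f  = lambda x1, x2, x3: x1**2 + 2*x2**2 + 3*x3**2
g1 = lambda x1, x2, x3: x1 + x2 + 2*x3 - 12   # constraint g1 <= 0
g2 = lambda x1, x2, x3: x1 + 2*x2 + 3*x3 - 8  # constraint g2 <= 0

x1 = x2 = x3 = 0.0
lam1 = lam2 = 0.0

assert 2*x1 + lam1 + lam2 == 0                        # (2)
assert 4*x2 + lam1 + 2*lam2 == 0                      # (3)
assert 6*x3 + 2*lam1 + 3*lam2 == 0                    # (4)
assert lam1 * g1(x1, x2, x3) == 0                     # (5)
assert lam2 * g2(x1, x2, x3) == 0                     # (6)
assert g1(x1, x2, x3) <= 0 and g2(x1, x2, x3) <= 0    # (7), (8)
assert lam1 >= 0 and lam2 >= 0                        # (9), (10)

# Smallest objective value over a coarse feasible grid around the origin.
best = min(f(a, b, c)
           for a, b, c in itertools.product(range(-4, 5), repeat=3)
           if g1(a, b, c) <= 0 and g2(a, b, c) <= 0)
print(best)  # prints 0, attained at the origin
```

Since f is a sum of squares, f ≥ 0 everywhere, so the grid search is just a spot check that the KKT point is indeed the constrained minimum here.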
Example (2)
Minimize f = x1² + x2² + 60x1 subject to the constraints
g1 = x1 - 80 ≥ 0
g2 = x1 + x2 - 120 ≥ 0
using Kuhn-Tucker conditions.
Solution
The Kuhn-Tucker conditions are given by

a) ∂f/∂xi - λ1 ∂g1/∂xi - λ2 ∂g2/∂xi = 0,   i = 1, 2

(the multipliers are written with a negative sign because this is a minimization problem with constraints of the form gj ≥ 0, so that the λj themselves remain nonnegative)

i.e.
2x1 + 60 - λ1 - λ2 = 0   (11)
2x2 - λ2 = 0   (12)

b) λj gj = 0

i.e.
λ1 (x1 - 80) = 0   (13)
λ2 (x1 + x2 - 120) = 0   (14)

c) gj ≥ 0

i.e.,
x1 - 80 ≥ 0   (15)
x1 + x2 - 120 ≥ 0   (16)

d) λj ≥ 0

i.e.,
λ1 ≥ 0   (17)
λ2 ≥ 0   (18)
From (13), either λ1 = 0 or x1 - 80 = 0.

Case 1: λ1 = 0
From (11) and (12) we have x1 = λ2/2 - 30 and x2 = λ2/2.
Using these in (14) we get λ2 (λ2 - 150) = 0; so λ2 = 0 or 150.
Considering λ2 = 0, X* = [-30, 0]. But this solution set violates (15) and (16).
For λ2 = 150, X* = [45, 75]. But this solution set violates (15).
Case 2: x1 - 80 = 0
Using x1 = 80 in (11) and (12), we have
λ2 = 2x2
λ1 = 220 - 2x2   (19)

Substituting (19) in (14), we have
2x2 (x2 - 40) = 0.
For this to be true, either x2 = 0 or x2 - 40 = 0.
For x2 = 0, λ1 = 220 and λ2 = 0, giving X = [80, 0]. This solution set violates (16).
For x2 - 40 = 0, λ1 = 140 and λ2 = 80. This solution set satisfies all of (15) to (19) and is hence the desired one. Therefore, the solution set for this optimization problem is X* = [80 40].
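The same kind of sanity check can be run here (again illustrative, not from the notes): substituting X* = [80, 40] with λ1 = 140, λ2 = 80 into (11)-(18), then comparing f(X*) = 12800 against an integer grid over the feasible region, where both constraints are active at X*:

```python
# Verify X* = [80, 40], lambda1 = 140, lambda2 = 80 against conditions
# (11)-(18) of Example (2), then grid-search the feasible region.
f = lambda x1, x2: x1**2 + x2**2 + 60*x1

x1, x2 = 80.0, 40.0
lam1, lam2 = 140.0, 80.0

assert 2*x1 + 60 - lam1 - lam2 == 0     # (11)
assert 2*x2 - lam2 == 0                 # (12)
assert lam1 * (x1 - 80) == 0            # (13), g1 active
assert lam2 * (x1 + x2 - 120) == 0      # (14), g2 active
assert x1 - 80 >= 0                     # (15)
assert x1 + x2 - 120 >= 0               # (16)
assert lam1 >= 0 and lam2 >= 0          # (17), (18)

# Smallest (value, x1, x2) over an integer grid of the feasible region.
best = min((f(a, b), a, b)
           for a in range(80, 201)
           for b in range(-200, 201)
           if a - 80 >= 0 and a + b - 120 >= 0)
print(best)  # prints (12800, 80, 40): the KKT point attains the minimum
```

The grid bounds (200 in each direction) are an arbitrary choice for illustration; since f grows in x1 for x1 > -30, the true minimum over the full feasible region is also at [80, 40].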
D Nagesh Kumar, IISc, Bangalore