
Optimal Control
Optimization
• It is a search process that seeks to optimize (maximize or minimize)
a mathematical function of several variables subject to constraints
(equality or inequality)
• It is of two types
1. Static Optimization
2. Dynamic Optimization
Static Optimization
• It is concerned with design variables (involved in the objective
function) that do not change with time.
• There are 3 techniques to solve such problems
1. Ordinary Calculus
2. Lagrange Multiplier
3. Linear & Non-Linear Programming
Dynamic Optimization:
• It is concerned with design variables (involved in the objective
function) that change with time; thus time is involved in the problem
statement.
• There are 3 techniques to solve such problems
1. Calculus of variations
2. Dynamic programming
3. Convex optimization
How to get a mathematical model for an optimal
design problem
• Example: Design a can, subject to the following constraints, so that the cost is minimum:
• it can contain a maximum of 200 ml of liquid
• Radius (cm): 3 ≤ r ≤ 5
• Height (cm): 5 ≤ h ≤ 22
• Ratio: h ≥ 3r
(the radius, height, and ratio limits are for comfortable handling, gripping, etc.)
Our objective is to minimize the fabrication cost; assume the cost of the
material used to fabricate the can is Rs. c per cm².
Example 1…
• Let x1 = r, x2 = h (x1 & x2 are known as design variables)
• Minimize the fabrication cost, so the fabrication cost is the objective function.
• $f(x_1, x_2) = (2\pi r h + 2\pi r^2)c$
• $f(x_1, x_2) = (2\pi x_1 x_2 + 2\pi x_1^2)c$ ---(1) (non-linear)
• Note: the objective function is a scalar function.
• Subject to
• $\pi r^2 h = 200$, i.e. $\pi x_1^2 x_2 = 200$ ---(2) (equality constraint) (non-linear)
• $h \ge 3r$, i.e. $3x_1 - x_2 \le 0$ ---(3) (inequality constraint) (linear)
Example 1…
• $3 \le x_1 \le 5$ and $5 \le x_2 \le 22$ (side constraints)
• Side constraints are a necessary part of the solution technique.
• x1 & x2 do not change with time => static optimization problem.
• Equations (1) & (2) are non-linear while equation (3) is linear, so this is a
non-linear static optimization problem.
• Hence an optimization problem may be
• a linear optimization problem: objective function & constraints are all linear, or
• a non-linear optimization problem.
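As a quick sanity check of the model above, the sketch below encodes the objective (1) and constraints (2)-(3) as plain Python functions and evaluates them at a trial design point. NumPy, the cost coefficient c = 1, and the trial point (r, h) = (3, 9) are illustrative assumptions, not part of the problem statement.

```python
import numpy as np

C = 1.0  # assumed material cost (Rs. per cm^2), illustrative only

def objective(x):
    """Fabrication cost (1): surface area of the can times cost per cm^2."""
    x1, x2 = x  # x1 = radius r (cm), x2 = height h (cm)
    return (2 * np.pi * x1 * x2 + 2 * np.pi * x1**2) * C

def h_eq(x):
    """Equality constraint (2): pi*x1^2*x2 - 200 = 0 (volume = 200 ml)."""
    x1, x2 = x
    return np.pi * x1**2 * x2 - 200

def g_ineq(x):
    """Inequality constraint (3): 3*x1 - x2 <= 0 (i.e. h >= 3r)."""
    x1, x2 = x
    return 3 * x1 - x2

# Trial design point (illustrative): r = 3 cm, h = 9 cm
x_trial = np.array([3.0, 9.0])
print("cost f(x)       =", objective(x_trial))
print("h(x) (want 0)   =", h_eq(x_trial))
print("g(x) (want <=0) =", g_ineq(x_trial))
```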
Example 1…
• Graphical representation: [figure omitted] the line $3x_1 - x_2 = 0$ plotted in the plane with $x_1 = r$ on the horizontal axis (0 to 5) and $x_2 = h$ on the vertical axis (0 to 25); the region above the line, where $3x_1 - x_2 \le 0$, together with the side constraints, is the search region in which the design variables lie.
General Optimization problem statement
(Single objective Optimization problem)
• If we have n design variables (x1, x2, x3, ….., xn), then optimize
(maximize or minimize) an objective function (cost function) subject to some
constraints and side constraints.
• General objective function $f(x_1, x_2, x_3, \ldots, x_n)$ ---(1)
• Subject to
• $h_i(x_1, x_2, \ldots, x_n) = 0$, $i = 1, 2, 3, \ldots, p$ ---(2) equality constraints
• $g_j(x_1, x_2, \ldots, x_n) \le 0$, $j = 1, 2, 3, \ldots, m$ ---(3) inequality constraints
• $x_i^L \le x_i \le x_i^U$, $i = 1, 2, 3, \ldots, n$ (L = lower value, U = upper value) ---(4) side constraints
General Optimization problem statement
(Single objective Optimization problem)
• In more compact form, let a vector
$x = x_{n \times 1} = [x_1, x_2, x_3, \ldots, x_n]^T$
• General objective function $f(x)$ ---(1)
• Subject to
• $h_i(x) = 0$, $i = 1, 2, 3, \ldots, p$ ---(2) equality constraints
• $g_j(x) \le 0$, $j = 1, 2, 3, \ldots, m$ ---(3) inequality constraints
• $x_i^L \le x_i \le x_i^U$, $i = 1, 2, 3, \ldots, n$ (L = lower value, U = upper value) ---(4) side constraints
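The compact statement (1)-(4) maps directly onto standard solver interfaces. Below is a minimal sketch using scipy.optimize.minimize with the SLSQP method, which accepts equality constraints, inequality constraints, and bounds; the two-variable problem instance is invented purely for illustration. Note that SciPy's convention for inequality constraints is fun(x) >= 0, so the g(x) <= 0 form above must be negated.

```python
from scipy.optimize import minimize

# Illustrative 2-variable instance of the general statement (1)-(4):
# f(x) = (x1-1)^2 + (x2-2)^2, h1(x) = x1 + x2 - 2 = 0,
# g1(x) = x1 - x2 - 1 <= 0, side constraints 0 <= xi <= 3.
f = lambda x: (x[0] - 1)**2 + (x[1] - 2)**2

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 2},     # h1(x) = 0
    {"type": "ineq", "fun": lambda x: -(x[0] - x[1] - 1)},  # SciPy wants g >= 0
]
bounds = [(0, 3), (0, 3)]  # side constraints (4)

res = minimize(f, x0=[0.0, 0.0], method="SLSQP",
               bounds=bounds, constraints=constraints)
print(res.x)  # expected near [0.5, 1.5]
```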
Multi objective Optimization problem
• As seen on the previous slide, the objective function was
f(x1, x2, x3, ….., xn) or f(x) => one objective function.
• We may have multiple objective functions, e.g.
f1(x1, x2, x3, ….., xn), f2(x1, x2, x3, ….., xn), etc.; perhaps one function is to be
maximized and another minimized.
This is known as a Multi-Objective Optimization Problem.
Optimality conditions
Theorem: Necessary & sufficient conditions
for optimality
• The necessary & sufficient conditions for optimality (min or max) in n-variable form:
• Necessary condition:
$\nabla f(x) = \dfrac{\partial f(x)}{\partial x} = 0$, where $x = [x_1, x_2, \ldots, x_n]^T$
• Sufficient condition:
• $H = \nabla^2 f(x)\big|_{x=x^*} > 0$ => the function f(x) has a minimum value at $x = x^*$
• $H = \nabla^2 f(x)\big|_{x=x^*} < 0$ => the function f(x) has a maximum value at $x = x^*$
The matrix H is known as the Hessian matrix.
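Both conditions can be checked mechanically with a computer-algebra system. The sketch below (SymPy is an assumed tool, and the test function is invented for illustration) solves $\nabla f(x) = 0$ for the stationary point and then tests the Hessian for positive definiteness.

```python
import sympy as sp

# Illustrative function (assumed for this sketch, not from the slides)
x1, x2 = sp.symbols("x1 x2")
f = x1**2 + 2*x2**2 - 2*x1 + 4*x2

grad = [sp.diff(f, v) for v in (x1, x2)]     # necessary condition: grad f = 0
x_star = sp.solve(grad, (x1, x2), dict=True)[0]

H = sp.hessian(f, (x1, x2))                  # sufficient condition: test H
print(x_star)                                # {x1: 1, x2: -1}
print(H.is_positive_definite)                # True -> minimum at x*
```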
Example
• Example 1: Find the optimum value of the function
$f(x) = 2x_1^2 + 4x_1 x_2 + 4x_2^2 - 4x_1 + 2x_2 + 16$
• Solution:
• Necessary condition:
$\nabla f(x) = \dfrac{\partial f(x)}{\partial x} = \begin{bmatrix} \partial f(x)/\partial x_1 \\ \partial f(x)/\partial x_2 \end{bmatrix} = \begin{bmatrix} 4x_1 + 4x_2 - 4 \\ 4x_1 + 8x_2 + 2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$
• Solving the above equations: $x_1 = x_1^* = 2.5$ and $x_2 = x_2^* = -1.5$
Example 1…
• Sufficient condition:
$H = \nabla^2 f(x) = \begin{bmatrix} \dfrac{\partial^2 f}{\partial x_1^2} & \dfrac{\partial^2 f}{\partial x_1 \partial x_2} \\ \dfrac{\partial^2 f}{\partial x_1 \partial x_2} & \dfrac{\partial^2 f}{\partial x_2^2} \end{bmatrix}_{x_1 = x_1^* = 2.5,\; x_2 = x_2^* = -1.5} = \begin{bmatrix} 4 & 4 \\ 4 & 8 \end{bmatrix}$
• Now check whether H is positive definite, positive semi-definite, negative definite, or negative semi-definite.
• H is positive definite (H > 0) because
• all diagonal elements are > 0,
• the leading principal minor of order one = 4 > 0, and
• the leading principal minor of order two = 32 − 16 = 16 > 0.
• Hence the function f(x) has its minimum value at $x_1 = x_1^* = 2.5$ and $x_2 = x_2^* = -1.5$.
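The arithmetic above can also be verified numerically. In this example $\nabla f(x) = 0$ is a linear system, so a short NumPy check (an assumed tool) recovers $x^*$ and the two leading principal minors of H:

```python
import numpy as np

A = np.array([[4.0, 4.0],   # grad f(x) = A @ x - b = 0 for Example 1
              [4.0, 8.0]])
b = np.array([4.0, -2.0])

x_star = np.linalg.solve(A, b)
print(x_star)                         # [ 2.5 -1.5]

H = A                                 # Hessian of a quadratic is constant
minor1 = H[0, 0]                      # leading principal minor, order 1
minor2 = np.linalg.det(H)             # leading principal minor, order 2
print(minor1, minor2)                 # 4.0, 16.0 -> both > 0, so H > 0
```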
Unconstrained Optimization Problem: Numerical Techniques
Unconstrained Optimization problem
• An unconstrained optimization problem can be solved by two methods:
1. Numerical methods (iterative processes)
2. Analytical methods
Unconstrained Optimization problem…
1. Numerical methods:
I. Steepest Descent Method: convergence order is 1
II. Gradient Method: Steepest Descent Method with a predetermined step size
III. Conjugate Gradient Method: convergence order is between 1 & 2
IV. Newton's Method: convergence order is 2
Steepest Descent Method Algorithm
• Let $f(x)$, $x_{n \times 1}$, be the function to be minimized.
• Step 1: Choose a starting point $x^{(0)}$ and let k = 0; ε1, ε2, ε3 are the
stopping criteria for the algorithm (ε1, ε2 & ε3 are predetermined very
small positive values).
• Step 2: At the k-th iteration, determine the gradient of the objective function f(x),
i.e. $\nabla f(x^{(k)})$.
• Step 3: Find the search direction $d^{(k)} = -\nabla f(x^{(k)})$.
• Step 4: Find the optimum step size
$\lambda_k = \lambda_k^* = \dfrac{-\left[\nabla f(x^{(k)})\right]^T d^{(k)}}{(d^{(k)})^T \nabla^2 f(x^{(k)})\, d^{(k)}}$
Steepest Descent Method Algorithm …
• Step 5: Find the next iterate $x^{(k+1)} = x^{(k)} + \lambda_k d^{(k)}$.
• Step 6: Find $\Delta f = f(x^{(k+1)}) - f(x^{(k)})$ and $\Delta x = x^{(k+1)} - x^{(k)}$.
• If $|\Delta f| < \epsilon_1$ then STOP
• (=> the function value is not changing; ε1 is a very small predetermined positive value)
• If $\|\Delta x\|_2^2 = \Delta x^T \Delta x < \epsilon_2$ then STOP
• (=> the design-variable values are not changing; ε2 is a very small predetermined positive value)
• There may be functions for which $\Delta f$ is high but $\|\Delta x\|_2$ is very low, or vice versa; in
that situation use both of the above conditions to stop the execution.
• Note: $\Delta f$ is a scalar, so its modulus $|\Delta f|$ is taken; $\Delta x$ is a vector, so the norm $\|\Delta x\|_2$
is taken.
• If $\left[\nabla f(x^{(k+1)})\right]^T \nabla f(x^{(k+1)}) < \epsilon_3$ then STOP
• (=> the function has converged; ε3 is a very small predetermined positive value)
• Set k = k + 1 and go to Step 2. (A compact implementation is sketched below.)
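Putting Steps 1-6 together, below is a minimal Python/NumPy implementation of the algorithm (an illustrative sketch, not the lecture's code). It is demonstrated on the quadratic from Example 1, for which the exact step size of Step 4 is available in closed form; the tolerances and starting point are assumptions.

```python
import numpy as np

def steepest_descent(grad, hess, x0, eps1=1e-8, eps2=1e-8, eps3=1e-8,
                     f=None, max_iter=500):
    """Steepest descent per Steps 1-6; eps1..eps3 are assumed tolerances."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)                          # Step 2: gradient at x^(k)
        if g @ g < eps3:                     # third stopping test
            break
        d = -g                               # Step 3: search direction
        lam = -(g @ d) / (d @ hess(x) @ d)   # Step 4: optimal step size
        x_new = x + lam * d                  # Step 5: next iterate
        dx = x_new - x                       # Step 6: stopping tests
        df = 0.0 if f is None else abs(f(x_new) - f(x))
        x = x_new
        if (f is not None and df < eps1) or dx @ dx < eps2:
            break
    return x

# Illustrative quadratic: f(x) = 2x1^2 + 4x1x2 + 4x2^2 - 4x1 + 2x2 + 16
Q = np.array([[4.0, 4.0], [4.0, 8.0]])
b = np.array([4.0, -2.0])
f    = lambda x: 0.5 * x @ Q @ x - b @ x + 16
grad = lambda x: Q @ x - b
hess = lambda x: Q
print(steepest_descent(grad, hess, x0=[0.0, 0.0], f=f))  # -> ~[2.5, -1.5]
```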
Conjugate Gradient Method
• Algorithm: Let $f(x)$, $x_{n \times 1}$, be the function to be minimized.
• Step 1: same
• Step 2: same
• Step 3: Compute the new conjugate search direction
$d^{(k)} = -\nabla f(x^{(k)}) + \beta_k d^{(k-1)}$
• $d^{(k-1)}$: the search direction from the previous iteration
• $\beta_k > 0$: a constant
• $\beta_k d^{(k-1)}$: the scaled search direction of the previous iteration
• $\beta_k = \dfrac{\left[\nabla f(x^{(k)})\right]^T \nabla f(x^{(k)})}{\left[\nabla f(x^{(k-1)})\right]^T \nabla f(x^{(k-1)})} = \dfrac{\|\nabla f(x^{(k)})\|^2}{\|\nabla f(x^{(k-1)})\|^2}$
Conjugate Gradient Method …
• Step 4: same
• Step 5: same
• Step 6: same
• Note:
$\left[\nabla f(x^{(k)})\right]^T d^{(k)} = -\left[\nabla f(x^{(k)})\right]^T \nabla f(x^{(k)}) + \beta_k \left[\nabla f(x^{(k)})\right]^T d^{(k-1)} < 0$
=> we are moving in the right direction in this method too, i.e. $f(x^{(k+1)}) < f(x^{(k)})$
($d^{(k)}$ is a direction of descent).
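A minimal sketch of the method follows (again an illustrative Python/NumPy implementation, not the lecture's code). It uses the $\beta_k$ ratio shown above, with the first iteration taking the plain steepest-descent direction since no $d^{(k-1)}$ exists yet; for a quadratic in n variables, the method reaches the minimizer in at most n iterations.

```python
import numpy as np

def conjugate_gradient(grad, hess, x0, eps=1e-10, max_iter=200):
    """Conjugate gradient with the beta_k ratio from the slides
    (the Fletcher-Reeves form) and the exact step size of Step 4."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if g @ g < eps:
            break
        lam = -(g @ d) / (d @ hess(x) @ d)   # Step 4: optimal step size
        x = x + lam * d                      # Step 5: next iterate
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)     # beta_k = ||grad_k||^2 / ||grad_{k-1}||^2
        d = -g_new + beta * d                # Step 3: conjugate direction
        g = g_new
    return x

# Same illustrative quadratic as before
Q = np.array([[4.0, 4.0], [4.0, 8.0]])
b = np.array([4.0, -2.0])
print(conjugate_gradient(lambda x: Q @ x - b, lambda x: Q, [0.0, 0.0]))
# -> [2.5, -1.5] (exact in at most n = 2 iterations for a quadratic)
```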
Newton’s Method Algorithm
• Let $f(x)$, $x_{n \times 1}$, be the function to be minimized.
• Step 1: same as in the Steepest Descent Method algorithm
• Step 2: same
• Step 3: Compute the Hessian matrix $P = \nabla^2 f(x^{(k)})$.
• If $P > 0$ then $d^{(k)} = -P^{-1} \nabla f(x^{(k)})$
• Else $d^{(k)} = -(M_k)^{-1} \nabla f(x^{(k)})$
• Step 4: Find the optimum step size $\lambda_k$:
• If $P > 0$ then $\lambda_k = \lambda_k^* = \dfrac{-\left[\nabla f(x^{(k)})\right]^T d^{(k)}}{(d^{(k)})^T \nabla^2 f(x^{(k)})\, d^{(k)}}$
• Else $\lambda_k = \lambda_k^* = \dfrac{-\left[\nabla f(x^{(k)})\right]^T d^{(k)}}{(d^{(k)})^T M_k d^{(k)}}$
Newton’s Method Algorithm …
• Step 5: same
• Step 6: same
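A minimal sketch of the algorithm in Python/NumPy follows. The slides do not specify the fallback matrix $M_k$ used when P is not positive definite, so as a labelled assumption this sketch uses the common Levenberg-style shift $M_k = P + \mu I$.

```python
import numpy as np

def newton_method(grad, hess, x0, eps=1e-10, max_iter=100, mu=1.0):
    """Newton's method per the slides. If the Hessian P is not positive
    definite, fall back to a modified matrix M_k; the slides leave M_k
    unspecified, so P + mu*I is used here as one common choice (a larger
    mu may be needed in general)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if g @ g < eps:
            break
        P = hess(x)                               # Step 3: Hessian
        if np.all(np.linalg.eigvalsh(P) > 0):     # is P > 0 ?
            M = P
        else:
            M = P + mu * np.eye(len(x))           # assumed fallback M_k
        d = -np.linalg.solve(M, g)                # d_k = -M^{-1} grad f
        lam = -(g @ d) / (d @ M @ d)              # Step 4: optimal step size
        x = x + lam * d                           # Step 5: next iterate
    return x

Q = np.array([[4.0, 4.0], [4.0, 8.0]])
b = np.array([4.0, -2.0])
print(newton_method(lambda x: Q @ x - b, lambda x: Q, [0.0, 0.0]))
# -> [2.5, -1.5] in one Newton step for a quadratic
```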
Constrained Optimization Problem
Constrained Optimization Problem
• Generalized constrained optimization problem:
• Minimize $f(x_{n \times 1}) = f(x_1, x_2, \ldots, x_n)$ ---(1)
• Subject to
• $h_i(x_1, x_2, \ldots, x_n) = 0$, $i = 1, 2, \ldots, p$ ---(2) (p equality constraints)
• $g_j(x_1, x_2, \ldots, x_n) \le 0$, $j = 1, 2, \ldots, m$ ---(3) (m inequality constraints)
• It may or may not have some side constraints.
• Any inequality-constrained optimization problem can be converted
into an equality-constrained optimization problem, and this equality-constrained
optimization problem can in turn be converted into an
unconstrained optimization problem (by the Lagrange multiplier
approach), which can be solved by the previous methods (numerical
or analytical).
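As a small illustration of the Lagrange multiplier conversion mentioned above, the SymPy sketch below (the tool and the problem instance are assumptions for illustration) forms the Lagrangian $L(x, \lambda) = f(x) + \lambda h(x)$, which is unconstrained in $(x, \lambda)$, and solves its stationarity conditions.

```python
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lam")

# Illustrative problem: minimize f = x1^2 + x2^2 s.t. h = x1 + x2 - 2 = 0
f = x1**2 + x2**2
h = x1 + x2 - 2

L = f + lam * h                                 # Lagrangian: unconstrained in (x, lam)
stationarity = [sp.diff(L, v) for v in (x1, x2, lam)]
print(sp.solve(stationarity, (x1, x2, lam)))    # {x1: 1, x2: 1, lam: -2}
```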