A Secant Method for Nonsmooth Optimization
Asef Nazari
CSIRO Melbourne
CARMA Workshop on Optimization, Nonlinear Analysis, Randomness and Risk
Newcastle, Australia
12 July, 2014
Outline

1. Background
   - The Problem
   - Optimization Algorithms and Components
   - Subgradient and Subdifferential

2. Secant Method
   - Definitions
   - Optimality Condition and Descent Direction
   - The Secant Algorithm
   - Numerical Results
The Problem

Unconstrained optimization problem:

    min_{x ∈ R^n} f(x)

where f : R^n → R is locally Lipschitz.

Why just unconstrained?
Transforming Constrained into Unconstrained

Constrained optimization problem:

    min_{x ∈ Y} f(x),  where Y ⊂ R^n.

Distance function:

    dist(x, Y) = min_{y ∈ Y} ‖y − x‖

Theory of penalty functions (under some conditions):

    min_{x ∈ R^n} f(x) + σ dist(x, Y)

f(x) + σ dist(x, Y) is a nonsmooth function.
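To make the penalty idea concrete, here is a minimal Python sketch, assuming Y is the Euclidean unit ball so that dist(x, Y) has a closed form; the objective f and the weight sigma are illustrative choices, not from the talk.

import numpy as np

def f(x):                          # a smooth illustrative objective
    return (x[0] - 2.0)**2 + (x[1] - 1.0)**2

def dist_to_unit_ball(x):          # dist(x, Y) for Y = {y : ||y|| <= 1}
    return max(0.0, np.linalg.norm(x) - 1.0)

def penalized(x, sigma=10.0):      # f(x) + sigma * dist(x, Y)
    return f(x) + sigma * dist_to_unit_ball(x)

The max(0, ·) term creates a kink on the boundary of Y, which is exactly why the penalized problem is nonsmooth even when f is smooth.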
Sources of Nonsmooth Problems

Minimax problem:

    min_{x ∈ R^n} f(x),  f(x) = max_{1 ≤ i ≤ m} f_i(x)

System of nonlinear equations:

    f_i(x) = 0,  i = 1, …, m,

which we often solve via

    min_{x ∈ R^n} ‖f̄(x)‖,  where f̄(x) = (f_1(x), …, f_m(x)).
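As a quick illustration of the minimax source, the sketch below (with illustrative pieces f_i) evaluates f(x) = max_i f_i(x) and returns the gradient of an active piece, which is one subgradient of the max-function at x; this is a standard fact about max-functions, not a construction from the talk.

import numpy as np

def fs(x):                                   # three smooth pieces f_1, f_2, f_3
    return np.array([x[0]**2 + x[1]**2,
                     (x[0] - 1.0)**2 + x[1],
                     -x[0] + 2.0 * x[1]])

def grads(x):                                # their gradients, one row per piece
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [2.0 * (x[0] - 1.0), 1.0],
                     [-1.0, 2.0]])

def f_and_subgrad(x):
    vals = fs(x)
    j = int(np.argmax(vals))                 # an index where the max is attained
    return vals[j], grads(x)[j]              # f(x) and one subgradient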
Structure of Optimization Algorithms

Step 1. Initial step: choose x_0 ∈ R^n.
Step 2. Check the termination criteria.
Step 3. Find a descent direction d_k at x_k.
Step 4. Find a step size α_k with f(x_k + α_k d_k) < f(x_k).
Step 5. Loop: set x_{k+1} = x_k + α_k d_k and go to Step 2.
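A minimal Python sketch of this generic loop, assuming the caller supplies direction and step-size oracles; all names and defaults here are illustrative.

import numpy as np

def descent_loop(f, direction, step_size, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)          # Step 1: initial point
    for _ in range(max_iter):
        d = direction(x)                     # Step 3: descent direction
        if np.linalg.norm(d) <= tol:         # Step 2: termination criterion
            break
        alpha = step_size(f, x, d)           # Step 4: step size
        x = x + alpha * d                    # Step 5: loop
    return x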
Classification of Algorithms Based on d_k and α_k

Directions:
    d_k = −∇f(x_k)                Steepest descent method
    d_k = −H^{-1}(x_k) ∇f(x_k)    Newton method
    d_k = −B^{-1} ∇f(x_k)         Quasi-Newton method

Step sizes: with h(α) = f(x_k + α d_k),
    1. exactly solve h′(α) = 0: exact line search;
    2. loosely solve it: inexact line search;
    or use trust-region methods.
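As a sketch of item 2, a backtracking (Armijo) rule is one common inexact line search; the constants c and beta below are conventional choices, not from the talk, and x, d are assumed to be NumPy arrays with grad returning the gradient of f.

def backtracking(f, grad, x, d, c=1e-4, beta=0.5, alpha0=1.0):
    fx, slope = f(x), grad(x).dot(d)      # slope = h'(0) = <grad f(x), d> < 0
    alpha = alpha0
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= beta                     # shrink until sufficient decrease
    return alpha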
When the objective function is smooth

d is a descent direction if f′(x_k, d) = ⟨∇f(x_k), d⟩ < 0.
−∇f(x_k) is the steepest descent direction.
‖∇f(x_k)‖ ≤ ε is a good stopping criterion (first-order necessary condition).
Subgradient

Basic inequality for a convex differentiable function:

    f(x) ≥ f(y) + ∇f(y)^T (x − y)  ∀ x ∈ Dom(f)

g ∈ R^n is a subgradient of a convex function f at y if

    f(x) ≥ f(y) + g^T (x − y)  ∀ x ∈ Dom(f)

For example, for f(x) = |x|:
    if x < 0, g = −1;
    if x > 0, g = 1;
    if x = 0, g ∈ [−1, 1].
Subdifferential

The set of all subgradients of f at x, denoted ∂f(x), is called the
subdifferential.

For f(x) = |x|:
    the subdifferential at x = 0 is ∂f(0) = [−1, 1];
    for x > 0, ∂f(x) = {1};
    for x < 0, ∂f(x) = {−1}.

f is differentiable at x ⟺ ∂f(x) = {∇f(x)}.

For locally Lipschitz functions (Clarke):

    ∂f(x_0) = conv { lim ∇f(x_i) : x_i → x_0, ∇f(x_i) exists }
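A minimal numeric sketch of the Clarke formula: sample gradients at nearby points of differentiability and take the convex hull of their limits. Shown for f(x) = |x| at x_0 = 0, where the sampled gradients are ±1; the sampling scheme here is an illustrative choice.

import numpy as np

def sampled_gradients(grad, x0, eps=1e-4, n=100, seed=0):
    rng = np.random.default_rng(seed)
    pts = x0 + eps * rng.uniform(-1.0, 1.0, size=n)   # nearby points, a.s. != 0
    return np.array([grad(p) for p in pts])           # generators of the hull

gs = sampled_gradients(np.sign, 0.0)
print(gs.min(), gs.max())   # -1.0 1.0, so conv{...} recovers ∂f(0) = [-1, 1]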
Optimality Condition

For a smooth convex function:

    f(x*) = inf_{x ∈ Dom(f)} f(x)  ⟺  0 = ∇f(x*)

For a nonsmooth convex function:

    f(x*) = inf_{x ∈ Dom(f)} f(x)  ⟺  0 ∈ ∂f(x*)
Directional derivative

In the smooth case:

    f′(x_k, d) = lim_{λ→0+} [f(x_k + λd) − f(x_k)] / λ = ⟨∇f(x_k), d⟩

In the nonsmooth case:

    f′(x_k, d) = lim_{λ→0+} [f(x_k + λd) − f(x_k)] / λ = sup_{g ∈ ∂f(x_k)} ⟨g, d⟩
Descent Direction

In the smooth case: d is a descent direction if f′(x_k, d) = ⟨∇f(x_k), d⟩ < 0
(−∇f(x_k) is the steepest descent direction).

In the nonsmooth case: d is a descent direction if f′(x_k, d) < 0. It is
proved that the steepest descent direction is

    d = −argmin {‖g‖ : g ∈ ∂f(x_k)}.
In Summary

Optimality condition:

    f(x*) = inf_{x ∈ Dom(f)} f(x)  ⟺  0 ∈ ∂f(x*)

Directional derivative:

    f′(x_k, d) = lim_{λ→0+} [f(x_k + λd) − f(x_k)] / λ = sup_{g ∈ ∂f(x_k)} ⟨g, d⟩

Steepest descent direction:

    d = −argmin {‖g‖ : g ∈ ∂f(x_k)}
Methods for nonsmooth problems

The way we treat ∂f(x) leads to different types of algorithms in nonsmooth
optimization:

    Subgradient methods (one subgradient per iteration)
    Bundle methods (a bundle of subgradients per iteration)
    Gradient sampling methods
    Smoothing techniques
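To illustrate the first item, here is a minimal subgradient-method sketch: one subgradient per iteration with a diminishing step size 1/k, shown on f(x) = |x| (the step rule and iteration count are illustrative choices).

def subgradient_method(subgrad, x0, iters=200):
    x = float(x0)
    for k in range(1, iters + 1):
        g = subgrad(x)               # one subgradient at the current point
        x -= (1.0 / k) * g           # diminishing step size
    return x

# a subgradient of |x|: any value in [-1, 1] is valid at 0; we pick 1.0
print(subgradient_method(lambda x: 1.0 if x >= 0 else -1.0, 3.0))  # near 0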
Definition of r-secant

Let S_1 = {g ∈ R^n : ‖g‖ = 1} be the unit sphere in R^n.

For g ∈ S_1 we define

    g^max = max {|g_i| : i = 1, …, n}.

Given g ∈ S_1 with g_j = g^max and v ∈ ∂f(x + rg), the vector
s = s(x, g, r) ∈ R^n with

    s = (s_1, …, s_n),  s_i = v_i for i = 1, …, n, i ≠ j,

and

    s_j = [ f(x + rg) − f(x) − r Σ_{i≠j} s_i g_i ] / (r g_j)

is called an r-secant of the function f at a point x in the direction g.
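A minimal sketch of the r-secant, assuming a user-supplied oracle subgrad(y) that returns one element of ∂f(y) and taking j as an index attaining g^max; the names are illustrative. By construction, r⟨s, g⟩ = f(x + rg) − f(x), which is the mean value theorem stated on the next slide.

import numpy as np

def r_secant(f, subgrad, x, g, r):
    j = int(np.argmax(np.abs(g)))        # index with |g_j| = g^max (so g_j != 0)
    v = subgrad(x + r * g)               # some v in ∂f(x + rg)
    s = np.array(v, dtype=float)         # s_i = v_i for i != j
    rest = s.dot(g) - s[j] * g[j]        # sum over i != j of s_i g_i
    s[j] = (f(x + r * g) - f(x) - r * rest) / (r * g[j])
    return s                             # r * <s, g> = f(x+rg) - f(x) by design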
Facts about r-secants

If n = 1,

    s = [f(x + rg) − f(x)] / (rg).

Mean value theorem for r-secants:

    f(x + rg) − f(x) = r ⟨s(x, g, r), g⟩.

The set of all possible r-secants of the function f at the point x,

    S_r f(x) = {s ∈ R^n : ∃ g ∈ S_1 : s = s(x, g, r)},

satisfies:
    1. S_r f(x) is compact for r > 0;
    2. the map (x, r) ↦ S_r f(x), r > 0, is closed and upper semicontinuous.
A new set

    S_0^c f(x) = conv {v ∈ R^n : ∃ (g ∈ S_1, r_k → +0, k → +∞) :
                       v = lim_{k→+∞} s(x, g, r_k)}

For a function f that is regular and semismooth at a point x ∈ R^n:

    ∂f(x) = S_0^c f(x).
Optimality condition

Let x ∈ R^n be a local minimizer of the function f, and let f be
directionally differentiable at x. Then

    0 ∈ S_0^c f(x).

x ∈ R^n is an r-stationary point of f on R^n if 0 ∈ S_r^c f(x).

x ∈ R^n is an (r, δ)-stationary point of f on R^n if 0 ∈ S_r^c f(x) + B_δ,
where B_δ = {v ∈ R^n : ‖v‖ ≤ δ}.
Descent Direction

If x ∈ R^n is not an r-stationary point of f on R^n, then 0 ∉ S_r^c f(x),
and we can compute a descent direction using the set S_r^c f(x).

Let x ∈ R^n and, for a given r > 0,

    min {‖v‖ : v ∈ S_r^c f(x)} = ‖v^0‖ > 0.

Then for g^0 = −‖v^0‖^{-1} v^0,

    f(x + rg^0) − f(x) ≤ −r ‖v^0‖.

So the direction-finding problem is

    minimize    ‖v‖^2
    subject to  v ∈ S_r^c f(x).
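In practice this minimization is carried out over the convex hull of finitely many collected r-secants, as Alg 1 below does. A minimal sketch of that finite inner problem, using SciPy's SLSQP over the simplex of hull weights; the restriction to a finite set of secants (rows of S) is this sketch's assumption.

import numpy as np
from scipy.optimize import minimize

def min_norm_in_hull(S):
    """Min-norm point of conv{rows of S}: min ||lam @ S||^2 over the simplex."""
    m = S.shape[0]
    obj = lambda lam: float(np.dot(lam @ S, lam @ S))
    cons = ({'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},)
    res = minimize(obj, np.full(m, 1.0 / m), method='SLSQP',
                   bounds=[(0.0, 1.0)] * m, constraints=cons)
    return res.x @ S                      # the min-norm point v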
An algorithm for the descent direction (Alg 1)

Step 1. Compute an r-secant s^1 = s(x, g^1, r). Set W^1(x) = {s^1} and k = 1.
Step 2. Compute ‖w^k‖^2 = min {‖w‖^2 : w ∈ conv W^k(x)}. If ‖w^k‖ ≤ δ,
        then stop. Otherwise go to Step 3.
Step 3. Set g^{k+1} = −‖w^k‖^{-1} w^k.
Step 4. If f(x + rg^{k+1}) − f(x) ≤ −cr ‖w^k‖, then stop. Otherwise go to
        Step 5.
Step 5. Compute the r-secant s^{k+1} = s(x, g^{k+1}, r) with respect to the
        direction g^{k+1}, construct the set
        W^{k+1}(x) = conv ( W^k(x) ∪ {s^{k+1}} ), set k = k + 1 and go to
        Step 2.
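A minimal sketch of Alg 1, reusing the r_secant and min_norm_in_hull helpers from the earlier sketches; the initial direction g^1, the parameters c and δ, and the iteration cap are illustrative choices.

import numpy as np

def descent_direction(f, subgrad, x, r, delta=1e-6, c=0.2, kmax=100):
    g = np.zeros_like(x, dtype=float); g[0] = 1.0     # an initial g^1 in S_1
    W = [r_secant(f, subgrad, x, g, r)]               # Step 1
    for _ in range(kmax):
        w = min_norm_in_hull(np.array(W))             # Step 2
        nw = np.linalg.norm(w)
        if nw <= delta:
            return None, w                            # stop: ||w^k|| <= delta
        g = -w / nw                                   # Step 3
        if f(x + r * g) - f(x) <= -c * r * nw:        # Step 4: enough decrease
            return g, w                               # descent direction found
        W.append(r_secant(f, subgrad, x, g, r))       # Step 5: enlarge the bundle
    return None, w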
The secant method for an (r, δ)-stationary point (Alg 2)

Step 1. Choose x^0 ∈ R^n and set k = 0.
Step 2. Apply Alg 1 at x = x^k. It yields either ‖v^k‖ ≤ δ or the search
        direction g^k = −‖v^k‖^{-1} v^k.
Step 3. If ‖v^k‖ ≤ δ, then stop. Otherwise go to Step 4.
Step 4. Set x^{k+1} = x^k + σ_k g^k, where σ_k is defined as

            σ_k = argmax {σ ≥ 0 : f(x^k + σg^k) − f(x^k) ≤ −c_2 σ ‖v^k‖}.

        Set k = k + 1 and go to Step 2.
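A minimal sketch of Alg 2, assuming descent_direction from the previous sketch; the doubling search below is a crude stand-in for the argmax step-size rule in Step 4, and c_2 is an illustrative constant (chosen below c so the first trial step σ = r is always accepted).

import numpy as np

def secant_inner(f, subgrad, x0, r, delta, c2=0.1):
    x = np.asarray(x0, dtype=float)
    while True:
        g, w = descent_direction(f, subgrad, x, r, delta)   # Step 2
        if g is None:                                       # Step 3
            return x                     # an (r, delta)-stationary point
        sigma, nw = r, np.linalg.norm(w)
        while f(x + 2.0 * sigma * g) - f(x) <= -c2 * 2.0 * sigma * nw:
            sigma *= 2.0                 # Step 4: grow sigma while decrease holds
        x = x + sigma * g                # take the accepted step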
The secant method (Alg 3)

Step 1. Choose x^0 ∈ R^n and set k = 0.
Step 2. Apply Alg 2 to x^k with r = r_k and δ = δ_k. This algorithm
        terminates after a finite number of iterations p > 0 and finds an
        (r_k, δ_k)-stationary point x^{k+1}.
Step 3. Set k = k + 1 and go to Step 2.
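Finally, a minimal sketch of the outer loop Alg 3, assuming secant_inner from the previous sketch; the initial values and shrink factors for r_k and δ_k are illustrative assumptions.

import numpy as np

def secant_method(f, subgrad, x0, r0=1.0, delta0=1e-2, shrink=0.5, K=20):
    x, r, delta = np.asarray(x0, dtype=float), r0, delta0
    for _ in range(K):                    # Steps 2-3: repeat with smaller r, delta
        x = secant_inner(f, subgrad, x, r, delta)
        r *= shrink
        delta *= shrink
    return x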
Convergence Theorem

Theorem. Assume that the function f is locally Lipschitz and the level set
L(x^0) is bounded for the starting point x^0 ∈ R^n. Then every accumulation
point of the sequence {x^k} belongs to the set

    X^0 = {x ∈ R^n : 0 ∈ ∂f(x)}.
Results

[Numerical results appeared here as figures in the original slides.]