COMP SCI / SFWR ENG 4TE3 / 6TE3: McMaster University Final Exam - Day Class
April 21, 2015 (Instructor: Antoine Deza)
SOLUTIONS
Problem 1
Consider the following minimization problem {min f(x) = (x1 − 2x2)² + (e^{x2}/e^{x1})² : x ∈ IR²}.
(a) Determine ∇f(x), the gradient of f(x), and give the value of ∇f(x) at x = (0, 0)^T.
Solution:
∇f(x1, x2) = (2(x1 − 2x2) − 2(e^{−x1+x2})², −4(x1 − 2x2) + 2(e^{−x1+x2})²)^T

∇f(0, 0) = (−2, 2)^T
(b) Determine the steepest descent search direction at the point x = (0, 0)^T.
Solution:
The steepest descent search direction at the point x = (0, 0)^T is −∇f(0, 0) = (2, −2)^T.
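As a numerical sanity check, the following minimal Python sketch (assuming the objective f(x) = (x1 − 2x2)² + (e^{x2}/e^{x1})² as reconstructed above) compares the analytic gradient at (0, 0) with a central finite difference:

    import math

    def f(x1, x2):
        # f(x) = (x1 - 2*x2)^2 + (exp(x2)/exp(x1))^2
        return (x1 - 2*x2)**2 + (math.exp(x2) / math.exp(x1))**2

    def grad(x1, x2):
        # analytic gradient from the solution above
        t = math.exp(-x1 + x2)**2
        return (2*(x1 - 2*x2) - 2*t, -4*(x1 - 2*x2) + 2*t)

    h = 1e-6
    fd = ((f(h, 0) - f(-h, 0)) / (2*h), (f(0, h) - f(0, -h)) / (2*h))
    print(grad(0, 0))  # (-2.0, 2.0)
    print(fd)          # approximately (-2, 2), so -grad(0, 0) = (2, -2)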
Problem 2
Consider the following minimization problem {min f(x) = (x1 − 2x2)² + (e^{x2}/e^{x1})² : x ∈ IR²}.
(a) Starting at the point x = (0, 0)^T, compute the first BFGS iterate x+ assuming that B0 is the identity matrix. Note: If a line search is needed, you are allowed to perform a full step instead.
Solution:
B0 = I and a full step is performed; thus x+ = x − ∇f(0, 0) = (0, 0)^T + (2, −2)^T = (2, −2)^T.
(b) Determine ∆B0 and the BFGS search direction at x+ .
Solution:
∆B0 = (y0 y0^T)/(σ0^T y0) − (B0 σ0 σ0^T B0)/(σ0^T B0 σ0)

with B0 = I, y0 = (14 − 2e^{−8}, −26 + 2e^{−8})^T, and σ0 = (2, −2)^T.

Thus, ∆B0 = [1.9498 −4.0498; −4.0498 7.9498].

The BFGS search direction at x+ is −B1^{−1} g1 = −(B0 + ∆B0)^{−1} g1 = (−1.0200, 2.2200)^T since g1 = (12 − 2e^{−8}, −24 + 2e^{−8})^T.
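The update can be reproduced with a few lines of NumPy (a sketch of the computation above):

    import numpy as np

    def grad(x1, x2):
        t = np.exp(-x1 + x2)**2
        return np.array([2*(x1 - 2*x2) - 2*t, -4*(x1 - 2*x2) + 2*t])

    B0 = np.eye(2)
    g0 = grad(0.0, 0.0)        # (-2, 2)
    sigma0 = -g0               # full step, so sigma0 = x+ - x = (2, -2)
    g1 = grad(2.0, -2.0)       # (12 - 2e^-8, -24 + 2e^-8)
    y0 = g1 - g0               # (14 - 2e^-8, -26 + 2e^-8)

    # BFGS update: dB0 = y y^T/(s^T y) - B s s^T B/(s^T B s)
    dB0 = (np.outer(y0, y0) / (sigma0 @ y0)
           - B0 @ np.outer(sigma0, sigma0) @ B0 / (sigma0 @ B0 @ sigma0))
    print(dB0)                              # [[ 1.9498 -4.0498] [-4.0498  7.9498]]
    print(-np.linalg.solve(B0 + dB0, g1))   # [-1.02  2.22]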
Problem 3
Consider the following minimization problem {min f(x) = x1/2 + x2/3 − ln(1 + x1²) − ln(1 + x2²) : x ∈ IR²}.
(a) Determine ∇f(x), the gradient of f(x), and ∇²f(x), the Hessian of f(x), and give the values of ∇f(x) and ∇²f(x) at x = (0, 0)^T.
Solution:
∇f(x1, x2) = (1/2 − 2x1/(1 + x1²), 1/3 − 2x2/(1 + x2²))^T

∇f(0, 0) = (1/2, 1/3)^T

∂²f/∂x1² = −(2(1 + x1²) − 2x1(2x1))/(1 + x1²)² = −2(1 − x1²)/(1 + x1²)²

∂²f/∂x2² = −(2(1 + x2²) − 2x2(2x2))/(1 + x2²)² = −2(1 − x2²)/(1 + x2²)²

Since f is separable in x1 and x2, the mixed partial derivatives vanish, so

∇²f(0, 0) = [−2 0; 0 −2]
(b) Determine the Newton method search direction at the point x = (0, 0)^T.
Solution:
"
The Newton search direction is
−∇f (0, 0)(∇2 f (0, 0))−1
=
−( 21 , 13 )
×
−2
0
0
−2
#−1
= ( 14 , 16 )
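A minimal NumPy sketch checking the Newton step (the Hessian is diagonal because f is separable):

    import numpy as np

    def grad(x):
        x1, x2 = x
        return np.array([0.5 - 2*x1/(1 + x1**2), 1/3 - 2*x2/(1 + x2**2)])

    def hess(x):
        x1, x2 = x
        # mixed partials vanish since f is separable in x1 and x2
        return np.diag([-2*(1 - x1**2)/(1 + x1**2)**2,
                        -2*(1 - x2**2)/(1 + x2**2)**2])

    x0 = np.zeros(2)
    print(-np.linalg.solve(hess(x0), grad(x0)))  # [0.25  0.1667], i.e. (1/4, 1/6)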
Problem 4
Consider the following minimization problem {min f(x) = x1/2 + x2/3 − ln(1 + x1²) − ln(1 + x2²) : x ∈ IR²}.
(a) Determine the Trust-Region search direction at the point x = (0, 0)^T with the initial value α = 4.
Solution:
H(x) = ∇²f(x) + αI

H(0, 0) = [−2 + 4 0; 0 −2 + 4] = [2 0; 0 2]

s0 = −H(0, 0)^{−1} ∇f(0, 0) = −(1/4, 1/6)^T

x1 = x0 + s0 = −(1/4, 1/6)^T
(b) Let µ = 0.2, η = 0.9, γ1 = 0.5, γ2 = 2.5. Would you accept this step, or should α be changed? If it needs to be changed, should α increase or decrease?
Solution:
To determine if the step is good enough, we need to check the ratio

ρ0 = (f(x0) − f(x1))/(f(x0) − q(x1))

actual decrease = f(x0) − f(x1) = f(0, 0) − f(−1/4, −1/6) = 0.2686

predicted decrease = f(x0) − q(x1) = −∇f(x0)^T (x1 − x0) − (1/2)(x1 − x0)^T ∇²f(x0)(x1 − x0) = 0.2708

Since ρ0 = 0.9917 > η = 0.9, the step is very successful: we accept it and decrease α to α1 = γ1 α0 = 0.5 × 4 = 2. (Note: solutions using log base 10 or base 2 instead of the natural logarithm are accepted too.)
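The ratio can be checked numerically with a short sketch (reusing the gradient and Hessian of f at (0, 0)):

    import numpy as np

    def f(x):
        x1, x2 = x
        return x1/2 + x2/3 - np.log(1 + x1**2) - np.log(1 + x2**2)

    g0 = np.array([0.5, 1/3])     # grad f(0, 0)
    H0 = np.diag([-2.0, -2.0])    # Hessian of f at (0, 0)
    alpha = 4.0

    s = -np.linalg.solve(H0 + alpha*np.eye(2), g0)   # (-1/4, -1/6)
    x0 = np.zeros(2)
    x1 = x0 + s
    actual = f(x0) - f(x1)                           # 0.2686
    predicted = -(g0 @ s) - 0.5 * (s @ H0 @ s)       # 0.2708
    print(actual, predicted, actual / predicted)     # rho0 ~ 0.9917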
Problem 5
Consider the following minimization problem {min Q(x) = (1/2)(x^T A x) − b^T x : x ∈ IR^n} where A is a positive definite n × n matrix. Let x* denote the unique minimizer of Q(x) over IR^n. Let f(x) = Q(x) − Q(x*).
(a) Determine ∇f(x), the gradient of f(x), and ∇²f(x), the Hessian of f(x).
Solution:
∇Q(x*) = 0 since x* is the minimizer of Q(x); thus:

∇f(x) = ∇Q(x) = Ax − b,

∇²f(x) = ∇²Q(x) = A
(b) Prove that f(x) = (1/2)(x − x*)^T A(x − x*).
Solution:
f(x) and (1/2)(x − x*)^T A(x − x*) are two quadratic functions whose value, gradient, and Hessian coincide at the point x*: both vanish and have zero gradient at x* (since ∇Q(x*) = 0), and both have Hessian A. Thus f(x) = (1/2)(x − x*)^T A(x − x*) (same Taylor expansion, which is exact for a quadratic function).
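A quick numerical illustration of the identity, for a randomly generated positive definite A (an illustrative sketch):

    import numpy as np

    rng = np.random.default_rng(0)
    M = rng.standard_normal((3, 3))
    A = M @ M.T + 3*np.eye(3)       # positive definite by construction
    b = rng.standard_normal(3)

    Q = lambda x: 0.5 * (x @ A @ x) - b @ x
    x_star = np.linalg.solve(A, b)  # unique minimizer: A x* = b

    x = rng.standard_normal(3)
    print(Q(x) - Q(x_star))                        # f(x)
    print(0.5 * (x - x_star) @ A @ (x - x_star))   # same value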
Problem 6
Consider the n × n matrix

H = [ n−1  −1  ···  −1 ]
    [ −1  n−1  ···  −1 ]
    [  ⋮    ⋮   ⋱    ⋮ ]
    [ −1   −1  ···  n−1 ],

and let I denote the identity matrix.
(a) Give the determinant of (H + αI).
Solution:
Adding rows 2, . . . , n to the first one creates a first row made of α's without changing the determinant. Adding this first row divided by α to each of the other rows then creates an upper triangular matrix with diagonal entries α, n + α, . . . , n + α. Thus det(H + αI) = α(n + α)^{n−1}.
(b) Give the range of α such that (H + αI) is positive definite.
Solution:
All leading principal minors should be positive; the k × k leading principal minor of H + αI equals (n + α)^{k−1}(n + α − k), and all of these are positive if and only if α > 0.
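Both parts can be illustrated numerically (a sketch, writing H = nI − J with J the all-ones matrix):

    import numpy as np

    n, alpha = 5, 2.0
    H = n*np.eye(n) - np.ones((n, n))   # diagonal n-1, off-diagonal -1
    print(np.linalg.det(H + alpha*np.eye(n)))   # 4802.0
    print(alpha * (n + alpha)**(n - 1))         # 2 * 7^4 = 4802.0
    print(np.linalg.eigvalsh(H))                # 0 (once) and n (n-1 times),
                                                # so H + alpha*I > 0 iff alpha > 0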
Problem 7
Consider the following minimization problem {min f(x) = (1/2)(x1² + 2x1x2 + 4x2²) − x1 − x2 : x ∈ IR²}.
(a) Compute the first 2 iterates x+ and x++ of the Fletcher-Reeves conjugate gradient algorithm starting from the point x = (0, 0)^T. Note: If a line search is needed, you are allowed to perform a full step instead.
Solution:
f(x) is a quadratic function and

∇f(x) = [1 1; 1 4](x1, x2)^T − (1, 1)^T = (x1 + x2 − 1, x1 + 4x2 − 1)^T
Step 1:

Initial point: x = (0, 0)^T, ∇f(x) = (−1, −1)^T, and s1 = −∇f(x) = (1, 1)^T.

We perform a full step and thus x+ = x + s1 = (0, 0)^T + (1, 1)^T = (1, 1)^T.
Step 2:

∇f(x+) = (1, 4)^T and

s2 = −∇f(x+) + (||∇f(x+)||²/||∇f(x)||²) s1 = (−1, −4)^T + (17/2)(1, 1)^T = (7.5, 4.5)^T.

(Note: s1 and s2 are not conjugate, i.e., (s1)^T A s2 ≠ 0, since a full step was performed.)
We perform a full step and thus:

x++ = x+ + s2 = (1, 1)^T + (7.5, 4.5)^T = (8.5, 5.5)^T.

(Note: x++ is not the optimal solution since full steps were performed.)
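The two full-step iterates can be reproduced with a short sketch:

    import numpy as np

    A = np.array([[1.0, 1.0], [1.0, 4.0]])
    b = np.ones(2)
    grad = lambda x: A @ x - b

    x = np.zeros(2)
    g = grad(x)
    s = -g                              # s1 = (1, 1)
    x = x + s                           # full step: x+ = (1, 1)
    g_new = grad(x)                     # (1, 4)
    beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves beta = 17/2
    s = -g_new + beta * s               # s2 = (7.5, 4.5)
    x = x + s                           # full step: x++ = (8.5, 5.5)
    print(x)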
(b) If the first 2 iterates of the Fletcher-Reeves conjugate gradient algorithm are computed using an exact line search, determine the co-ordinates of the second iterate x++.
Solution:
With exact line search, the Fletcher-Reeves conjugate gradient algorithm applied to a quadratic in IR^n terminates at the optimal point in at most n steps; here n = 2, so x++ is the minimizer. Thus x++ = (1, 0)^T since x++ satisfies:

∇f(x++) = (x1 + x2 − 1, x1 + 4x2 − 1)^T = (0, 0)^T
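Replacing the full steps by the exact step length t = −(g^T s)/(s^T A s) for a quadratic gives the minimizer in two iterations (sketch):

    import numpy as np

    A = np.array([[1.0, 1.0], [1.0, 4.0]])
    b = np.ones(2)

    x = np.zeros(2)
    g = A @ x - b
    s = -g
    for _ in range(2):
        t = -(g @ s) / (s @ A @ s)      # exact line search for a quadratic
        x = x + t * s
        g_new = A @ x - b
        s = -g_new + ((g_new @ g_new) / (g @ g)) * s   # Fletcher-Reeves
        g = g_new
    print(x)   # [1. 0.]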
Problem 8
Prove or disprove each of the following claims:
(a) Consider the following minimization problem {min f(x) = (x − 1)⁴ − (x − 1)² : x ∈ IR}. The Trust Region algorithm, starting from the point x = 0, converges to a (possibly local) minimum of f(x).
Solution:
True : the Trust Region algorithm is globally convergent.
(b) The following function is convex: (3x1 + 2x2 + 4x3)² + e^{4x4} + x3 log(x3)
Solution:
True: a sum of convex functions (a squared affine function, the exponential of an affine function, and x3 log(x3), which is convex on its domain x3 > 0).
(c) The following function is convex: −log(|3x1 + x2|/(|x3 + 3x2| + 1)) + (1/2)√(|x1 + x2²| + x1² + x3²) + 1/cos(x1 + x2)
Solution:
False: set x3 = 0 and consider the neighbourhood of x1 + x2 = π/2, where cos(x1 + x2) = 0 and the term 1/cos(x1 + x2) (and hence the function) is not even defined.
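A quick numerical look at the offending term (with x3 = 0 and x1 + x2 = π/2 ± ε):

    import math

    for eps in (1e-1, 1e-3, 1e-6):
        print(1/math.cos(math.pi/2 - eps), 1/math.cos(math.pi/2 + eps))
    # blows up to +infinity on one side of x1 + x2 = pi/2 and -infinity on the other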
Problem 9
Consider the minimization problem {min x^T A x : x ∈ IR^n} where A is a positive definite matrix. Let sD
denote the steepest descent search direction and sN denote the Newton method search direction. Prove or
disprove each of the following claims:
(a) There is a positive definite A such that sD = sN .
Solution:
True : A = I
(b) There is a positive definite A such that sD ≠ sN.
Solution:
True : A = 2I
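A short sketch contrasting the two directions; it assumes the normalization of Problem 5, i.e. gradients taken of (1/2) x^T A x, so that sD = −Ax and sN = −A^{−1}(Ax) = −x:

    import numpy as np

    x = np.array([1.0, 2.0])
    for A in (np.eye(2), 2*np.eye(2)):
        g = A @ x                       # gradient of (1/2) x^T A x
        sD = -g                         # steepest descent direction
        sN = -np.linalg.solve(A, g)     # Newton direction, always -x
        print(sD, sN, np.allclose(sD, sN))
    # A = I : sD = sN = (-1, -2)         -> supports claim (a)
    # A = 2I: sD = (-2, -4) != (-1, -2)  -> supports claim (b)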
*** END OF EXAM ***