Some additional material to rehearse

Recall that at the time these problems were given, the contents of the course were
somewhat different.
1. Solve by the Simplex method
    min          −4x_1 − x_2
    subject to   2x_1 + 3x_2 ≤ 12
                 4x_1 + x_2 ≤ 16
                 x_1 + x_2 ≤ 5
                 x ≥ 0.
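For a rough numerical cross-check of the hand Simplex computation, a minimal sketch assuming scipy is available (this is not the tableau work the problem asks for):

    from scipy.optimize import linprog

    c = [-4, -1]                                  # objective: -4*x1 - x2
    A_ub = [[2, 3], [4, 1], [1, 1]]               # left-hand sides of the inequalities
    b_ub = [12, 16, 5]                            # right-hand sides
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, res.fun)                         # minimizer and optimal value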
2. Solve by using the Simplex algorithm
    min          x_2 − x_4 − 2x_5
    subject to   x_1 − x_4 + 2x_5 = 2
                 x_2 + 6x_4 + x_5 = 1
                 x_3 + x_4 − 3x_5 = 1
                 x ≥ 0.
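A similar cross-check sketch for this equality-constrained form, again assuming scipy is available:

    from scipy.optimize import linprog

    c = [0, 1, 0, -1, -2]                         # objective: x2 - x4 - 2*x5
    A_eq = [[1, 0, 0, -1,  2],
            [0, 1, 0,  6,  1],
            [0, 0, 1,  1, -3]]
    b_eq = [2, 1, 1]
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 5)
    print(res.x, res.fun)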
3. Which of the following statements are true?
a) The function f(x) = x_1 + x_2^2 can be a cost function for an LP-problem.
b) If a point satisfies the KKT-conditions, it is an optimal solution.
c) There exists an LP-problem which has an infinite number of solutions.
d) The set
    C = {(x_1, x_2) ∈ R^2 : x_2 ≤ x_1^2}
is convex.
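For d), a small numerical counterexample test, assuming numpy is available: two points of C whose midpoint leaves C would disprove convexity.

    import numpy as np

    in_C = lambda p: p[1] <= p[0] ** 2            # membership test for C
    a, b = np.array([-1.0, 1.0]), np.array([1.0, 1.0])
    mid = 0.5 * (a + b)                           # midpoint of the two points
    print(in_C(a), in_C(b), in_C(mid))            # True, True, False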
4. Use the gradient method to approximate the minimum of
    x_1^4 + x_2^4 + 2x_1^2 x_2^2 − 4x_1 + 3.
Take x^{(0)} = [0 0]^T to be the initial point.
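A minimal gradient-method sketch for checking the iterates, assuming numpy is available; the fixed step length 0.1 is an assumption, since the problem does not prescribe a step rule.

    import numpy as np

    f = lambda x: x[0]**4 + x[1]**4 + 2*x[0]**2*x[1]**2 - 4*x[0] + 3
    grad = lambda x: np.array([4*x[0]**3 + 4*x[0]*x[1]**2 - 4,
                               4*x[1]**3 + 4*x[0]**2*x[1]])
    x = np.zeros(2)                               # x^(0) = [0 0]^T
    for _ in range(20):
        x = x - 0.1 * grad(x)                     # fixed step length (an assumption)
    print(x, f(x))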
5. Assume that a company (Woodjoy Inc.) has one hundred 5 meter long planks and one hundred
and fifty 3 meter long planks in its stock. The company needs 200 2.6 meter long planks and
250 2 meter long planks. Each additional 2.6 meter plank bought costs 2 euros and each
additional 2 meter plank costs 1.50 euros. The cost of one cut is 0.2 euros.
One 5 m long plank can be cut in two different ways. A type 1 cut yields one 2.6 m plank and
one 2 m plank. A type 2 cut produces two 2 meter long planks. A 3 meter long plank can also be
cut in two ways. A type 3 cut produces one 2.6 meter plank and a type 4 cut delivers one 2 meter
plank. Determine a minimal cost cutting plan.
a) What are the variables (1 point)?
b) What are the constraints (1 point)?
c) What is the cost function (1 point)?
d) Formulate and solve the linear optimization problem (3 points).
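One possible way to set up d) for a numerical cross-check, assuming scipy is available; the variable naming, the per-type cut counts, and the relaxation of integrality are assumptions, not given in the problem text.

    from scipy.optimize import linprog

    # x1, x2: number of 5 m planks given a type 1 / type 2 cut
    # x3, x4: number of 3 m planks given a type 3 / type 4 cut
    # y1, y2: extra 2.6 m / 2 m planks bought
    c = [0.4, 0.4, 0.2, 0.2, 2.0, 1.5]            # cut costs (2 cuts assumed for types 1-2, 1 for types 3-4) plus purchase prices
    A_ub = [[ 1,  1,  0,  0,  0,  0],             # at most one hundred 5 m planks
            [ 0,  0,  1,  1,  0,  0],             # at most one hundred and fifty 3 m planks
            [-1,  0, -1,  0, -1,  0],             # at least 200 planks of 2.6 m
            [-1, -2,  0, -1,  0, -1]]             # at least 250 planks of 2 m
    b_ub = [100, 150, -200, -250]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 6)
    print(res.x, res.fun)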
6. Using the Newton method, find the approximation of the minimum for the unconstrained
optimization problem
    min_{x∈R^2}  e^{−x_1 − x_2} + x_1^2 + x_2^2.
Use x^{(0)} = [0 0]^T as the starting point for your iterations. Compute two iterations.
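A minimal Newton-method sketch for checking the two hand iterations, assuming numpy is available:

    import numpy as np

    def grad(x):
        e = np.exp(-x[0] - x[1])
        return np.array([-e + 2*x[0], -e + 2*x[1]])

    def hess(x):
        e = np.exp(-x[0] - x[1])
        return np.array([[e + 2.0, e], [e, e + 2.0]])

    x = np.zeros(2)                               # x^(0) = [0 0]^T
    for k in range(2):                            # two Newton steps
        x = x - np.linalg.solve(hess(x), grad(x))
        print(k + 1, x)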
7. Solve the quadratic optimization problem
    min_{x∈R^2}  2x_1^2 − 2x_1 x_2 + 4x_2^2 − x_1 − 2x_2
using the gradient method with the initial guess x^{(0)} = [0 0]^T. Compute two iterations. What is
the global minimum?
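A sketch of the gradient method for this quadratic, assuming numpy is available; the exact line-search step rule below is an assumption, a fixed step length may be intended instead.

    import numpy as np

    Q = np.array([[4.0, -2.0], [-2.0, 8.0]])      # f(x) = 1/2 x^T Q x - b^T x
    b = np.array([1.0, 2.0])
    x = np.zeros(2)                               # x^(0) = [0 0]^T
    for k in range(2):
        g = Q @ x - b                             # gradient at the current iterate
        alpha = (g @ g) / (g @ Q @ g)             # exact minimizing step for a quadratic
        x = x - alpha * g
        print(k + 1, x)
    print("stationary point:", np.linalg.solve(Q, b))   # global minimizer, since Q is positive definite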
8. Find all the points that satisfy the Karush-Kuhn-Tucker conditions for the constrained
optimization problem
    min          x_1^2 + x_1 x_2 + x_2^2 − 4x_1
    subject to   2x_1 − x_2 − 2 ≤ 0,
                 x_1 ≤ 2.
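A numerical cross-check of the KKT analysis, assuming scipy is available (the solver expects inequality constraints written as g(x) ≥ 0):

    from scipy.optimize import minimize

    f = lambda x: x[0]**2 + x[0]*x[1] + x[1]**2 - 4*x[0]
    cons = [{'type': 'ineq', 'fun': lambda x: 2.0 - 2*x[0] + x[1]},   # 2*x1 - x2 - 2 <= 0
            {'type': 'ineq', 'fun': lambda x: 2.0 - x[0]}]            # x1 <= 2
    res = minimize(f, [0.0, 0.0], constraints=cons)
    print(res.x, res.fun)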
9. Find all the points that satisfy the Karush-Kuhn-Tucker conditions for the constrained
optimization problem
    min          e^{x_1 − x_2}
    subject to   e^{x_1} + e^{x_2} − 20 ≤ 0,
                 −x_1 ≤ 0.
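The same kind of numerical cross-check for this problem, assuming scipy and numpy are available:

    import numpy as np
    from scipy.optimize import minimize

    f = lambda x: np.exp(x[0] - x[1])
    cons = [{'type': 'ineq', 'fun': lambda x: 20.0 - np.exp(x[0]) - np.exp(x[1])},  # e^x1 + e^x2 - 20 <= 0
            {'type': 'ineq', 'fun': lambda x: x[0]}]                                # -x1 <= 0
    res = minimize(f, [0.0, 1.0], constraints=cons)
    print(res.x, res.fun)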
10. Starting from the initial guess x^{(0)}, use the gradient method to find the approximation x^{(1)} for
    min_{x∈R^2}  x_1^4 − 4x_1^3 + 6(x_1^2 + x_2^2) − 4(x_1 + x_2).
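A sketch of a single gradient step, assuming numpy is available; since x^{(0)} is left open in the problem, the starting point and step length below are placeholders.

    import numpy as np

    grad = lambda x: np.array([4*x[0]**3 - 12*x[0]**2 + 12*x[0] - 4,
                               12*x[1] - 4])
    x0 = np.zeros(2)                              # placeholder for x^(0)
    alpha = 0.1                                   # placeholder fixed step length
    x1 = x0 - alpha * grad(x0)
    print(x1)                                     # the approximation x^(1)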
11. Determine the vector a ∈ R^3 such that the point x̃ = (0, 0, 1) is a local minimum of the
constrained optimization problem
    min          e^{x_1} + x_1 x_2 + x_2^2 − 2x_2 x_3 + x_3^2
    subject to   x_1^2 + x_2^2 + x_3^2 − 5 ≤ 0,
                 a^T x + 2 = 0.
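A small first-order check at x̃ for a candidate vector a, assuming numpy is available; it only tests feasibility and stationarity (the inequality constraint is inactive at x̃), so second-order conditions still have to be verified by hand, and the candidate shown is an arbitrary placeholder.

    import numpy as np

    x_t = np.array([0.0, 0.0, 1.0])               # the point x~ = (0, 0, 1)
    grad_f = np.array([np.exp(x_t[0]) + x_t[1],
                       x_t[0] + 2*x_t[1] - 2*x_t[2],
                       -2*x_t[1] + 2*x_t[2]])

    def first_order_check(a):
        feasible = np.isclose(a @ x_t + 2.0, 0.0)                       # equality constraint holds at x~
        # inequality inactive at x~, so stationarity needs grad_f + lam * a = 0,
        # i.e. grad_f parallel to a
        parallel = np.isclose(np.linalg.norm(np.cross(grad_f, a)), 0.0)
        return bool(feasible and parallel)

    print(first_order_check(np.array([1.0, 0.0, -2.0])))   # arbitrary placeholder candidate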
12. Consider the inequality constrained optimization problem
    min_{3 − x_1 − x_2 ≤ 0}  (1/2) x^T H x − b^T x,
where
    H = [ 2  −1 ;  −1  2 ],    b = [ 1 ;  1 ].
Find all the points that satisfy the KKT-conditions.
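A sketch that checks the unconstrained stationary point and then solves the KKT system under the assumption that the constraint is active, assuming numpy is available (a valid KKT point also needs mu ≥ 0):

    import numpy as np

    H = np.array([[2.0, -1.0], [-1.0, 2.0]])
    b = np.array([1.0, 1.0])
    # Unconstrained stationary point H x = b:
    x_free = np.linalg.solve(H, b)
    print("unconstrained minimizer:", x_free, "feasible:", 3 - x_free.sum() <= 0)
    # Assuming the constraint is active: stationarity H x - b + mu * [-1, -1] = 0 and x1 + x2 = 3.
    K = np.array([[ 2.0, -1.0, -1.0],
                  [-1.0,  2.0, -1.0],
                  [-1.0, -1.0,  0.0]])
    rhs = np.array([1.0, 1.0, -3.0])
    x1, x2, mu = np.linalg.solve(K, rhs)
    print("active-constraint KKT candidate:", (x1, x2), "mu =", mu)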