Exercises on Approximation Algorithms

Exercise 6.5 [LN]: Proof by contradiction. Suppose there is an instance of
Knapsack for which this is not true. Suppose that for this instance, after
ordering on non-increasing p_j/w_j-ratio, the items 1, 2, . . . , k − 1 fit entirely in
the knapsack and item k fits only partially (where partially may even be a
0-fraction); i.e., the solution is represented by the vector x with x_i = 1 for
1 ≤ i ≤ k − 1, x_k = (B − Σ_{i=1}^{k−1} w_i)/w_k, and x_i = 0 otherwise.
If this is not the optimal solution, then there exists a better solution. From all
of these take the one x' that diverges least from x in L1-distance (Σ_{i=1}^n |x'_i − x_i|).
In this solution there must be some j, 1 ≤ j ≤ k, with x'_j < x_j. Then it must
be that for some i > j and i ≥ k, x'_i > x_i, since in an optimal LP-solution the
capacity of the knapsack will always be fully utilized. Now we will take a bit ε
from x'_i and reallocate the space becoming available to raise x'_j, and find
out that this improves the solution. So, we set x'_i to x'_i − ε, by which we sacrifice
εp_i in objective value. The space becoming available is εw_i. Thus we use this to
raise x'_j by εw_i/w_j, gaining (εw_i/w_j) p_j in objective value. The net change in
objective value is

(εw_i/w_j) p_j − εp_i = εw_i (p_j/w_j − p_i/w_i) ≥ 0,

given that i > j and the items have been ordered on non-increasing p_j/w_j-ratio.
Thus, either we have a solution with better value, which is a contradiction, or
we have a solution with equal value that is closer to x in L1-distance, which is
also a contradiction. □
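The greedy rule analysed above can be sketched as follows (a minimal sketch; the function name and interface are my own, not from the notes): order items by non-increasing p_j/w_j and fill the knapsack, giving the first item that no longer fits the largest fraction that does.

```python
def fractional_greedy(profits, weights, B):
    """Greedy LP solution: items in non-increasing profit/weight order,
    the first item that no longer fits entirely gets a fraction."""
    n = len(profits)
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * n
    room = B
    for i in order:
        take = min(1.0, room / weights[i])   # item k gets a fraction, later items 0
        x[i] = take
        room -= take * weights[i]
        if room <= 0:
            break
    value = sum(p * xi for p, xi in zip(profits, x))
    return x, value

# Example instance (my own numbers): ratios are 6, 5, 4, so the third item
# is taken for the fraction 20/30 that still fits.
x, value = fractional_greedy([60, 100, 120], [10, 20, 30], 50)
```

The exchange argument in the proof shows exactly why no other feasible LP vector can beat this output.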
Exercise 6.6. [LN]: An example with which the worst-case bound of 1/2 can
be approximated arbitrarily closely is the following one: Take n = 3 with
p_1 = w_1 = 1 + (1/2)B, p_2 = w_2 = p_3 = w_3 = (1/2)B, and B the capacity. The
performance of Approximation algorithm 3 amounts to ((1/2)B + 1)/B, which goes
to 1/2 as B goes to infinity.
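A quick numeric check of this instance (my own sketch; I assume here that "Approximation algorithm 3" greedily adds items in non-increasing p_j/w_j order, skipping items that no longer fit, which is the behaviour the instance exploits):

```python
def greedy_knapsack(profits, weights, B):
    """Integral greedy: take items in non-increasing ratio order when they fit."""
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    value, room = 0, B
    for i in order:
        if weights[i] <= room:
            value += profits[i]
            room -= weights[i]
    return value

B = 1000
# All ratios equal 1; greedy takes only the big item (value B/2 + 1),
# while the optimum packs the two half-size items (value B).
profits = weights = [B // 2 + 1, B // 2, B // 2]
print(greedy_knapsack(profits, weights, B) / B)
```

Increasing B drives the printed ratio down towards 1/2.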
Exercise 6.7. [LN]: See the proof of Thm. 17.12 in [PS].
Exercise 6.9. [LN]: Take the instance with 2m − 1 jobs on m machines, with the jobs given in the order:
• Jobs j = 1, . . . , m − 1 all have p_j = m − 1;
• Jobs j = m, . . . , 2(m − 1) all have p_j = 1;
• Job 2m − 1 has p_{2m−1} = m.
List Scheduling will create makespan 2m − 1, whereas the optimum clearly is m.
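The instance above can be simulated directly (a small sketch of my own; List Scheduling assigns each job in list order to the currently least-loaded machine):

```python
import heapq

def list_schedule(jobs, m):
    """Makespan of List Scheduling: each job goes to the least-loaded machine."""
    loads = [0] * m
    heapq.heapify(loads)
    for p in jobs:
        least = heapq.heappop(loads)
        heapq.heappush(loads, least + p)
    return max(loads)

m = 5
# m-1 jobs of size m-1, then m-1 unit jobs (all heaped onto one machine),
# then the size-m job on top of uniformly loaded machines.
jobs = [m - 1] * (m - 1) + [1] * (m - 1) + [m]
print(list_schedule(jobs, m))  # makespan 2m - 1, while the optimum is m
```

For m = 5 this prints 9 = 2m − 1, against the optimal makespan 5.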
Exercise 6.11. [LN]: Notice that the flow time of job j is F_j = C_j − r_j. Hence,
Σ F_j = Σ C_j − Σ r_j, and since Σ r_j is fixed, minimising Σ F_j is equivalent to
minimising Σ C_j.
Exercise 6.12. [LN]: The single machine version is solved by the First-In-First-Out
rule, always processing the job with the earliest release date. The 2-machine
case is NP-hard by a reduction from Makespan. Give all jobs in a
Makespan instance release time 0 and keep the constant K. With all jobs having
release time 0, max F_j is simply equal to max C_j, the makespan.
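The First-In-First-Out claim for a single machine can be sanity-checked by brute force (the instance generation and names below are my own):

```python
import itertools
import random

def max_flow_time(order, release, proc):
    """Max flow time max_j (C_j - r_j) when jobs are started as early as
    possible in the given order on one machine."""
    t, worst = 0, 0
    for j in order:
        t = max(t, release[j]) + proc[j]   # wait for release, then process
        worst = max(worst, t - release[j])
    return worst

random.seed(1)
for _ in range(50):
    n = 5
    release = [random.randint(0, 10) for _ in range(n)]
    proc = [random.randint(1, 5) for _ in range(n)]
    fifo = sorted(range(n), key=lambda j: release[j])
    best = min(max_flow_time(o, release, proc)
               for o in itertools.permutations(range(n)))
    assert max_flow_time(fifo, release, proc) == best
```

On every random instance, processing in order of release date achieves the minimum over all n! orders.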
Exercise 17.1 [PS]: Consider the graph that consists of a single odd circuit
with 2n + 1 vertices. In this case the vertex cover approximation algorithm
will select all vertices, whereas n + 1 would have sufficed. Still, (2n + 1)/(n + 1) ≤ 2.
However, under the transformation we used between Vertex Cover and
Independent Set, this vertex cover would produce an empty independent
set, which is certainly an independent set, but the optimal one has n vertices.
Argue similarly for Clique.
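The numbers used above can be verified by brute force on small odd circuits (a sketch of my own; it does not implement the approximation algorithm itself, only checks that C_{2n+1} has minimum vertex cover n + 1, hence maximum independent set n):

```python
from itertools import combinations

def cycle_edges(n_vertices):
    """Edges of the circuit on vertices 0, ..., n_vertices - 1."""
    return [(i, (i + 1) % n_vertices) for i in range(n_vertices)]

def min_vertex_cover_size(n_vertices):
    """Smallest vertex cover of the circuit, by exhaustive search."""
    edges = cycle_edges(n_vertices)
    for size in range(n_vertices + 1):
        for cover in combinations(range(n_vertices), size):
            s = set(cover)
            if all(u in s or v in s for u, v in edges):
                return size

n = 3  # the circuit C_7
print(min_vertex_cover_size(2 * n + 1))  # n + 1 = 4
```

The complement of a minimum cover is a maximum independent set of size (2n + 1) − (n + 1) = n, while complementing the all-vertices cover gives the empty set.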
Exercise 17.3 [PS]: a) Easy reduction from Hamilton Circuit.
b) Again Christofides' Tree+Matching algorithm works, where now no shortcuts are made, simply because they do not exist. Prove the ratio as in the TSP.
Exercise 17.4.a [PS]: I consider this problem as a 2-machine scheduling
problem minimising makespan. I call the algorithm using k here A(k). We
distinguish two cases. First, if one of the k largest jobs completes last, then we
have the optimal solution. Otherwise, one of the n − k smaller jobs completes last,
say job ℓ with processing time p_ℓ. Clearly, p_ℓ ≤ p_k, the processing time of the
k-th largest job. Hence, using the same upper bound and lower bound as used
for the analysis of List Scheduling we have that
C_ℓ = S_ℓ + p_ℓ ≤ (1/2) Σ_{j=1}^n p_j + (1/2) p_ℓ.
Hence,

Z^{A(k)} / Z^{OPT} ≤ 1 + ((1/2) p_ℓ) / ((1/2) Σ_{j=1}^n p_j) = 1 + p_ℓ / Σ_{j=1}^n p_j ≤ 1 + p_ℓ / (Σ_{j=1}^k p_j + p_ℓ) ≤ 1 + p_ℓ / (k p_ℓ + p_ℓ) = 1 + 1/(k + 1).

Hence, A(k) is a (1 + 1/(k + 1))-approximation algorithm.
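A(k) can be sketched as follows (my own code; the notes do not spell the algorithm out, so I assume it schedules the k largest jobs optimally by trying all 2^k machine assignments and then adds the remaining jobs by List Scheduling):

```python
from itertools import product

def a_k(jobs, k):
    """Makespan of A(k) on 2 machines: optimal schedule of the k largest
    jobs, then List Scheduling for the rest."""
    jobs = sorted(jobs, reverse=True)
    big, small = jobs[:k], jobs[k:]
    # Optimal schedule of the k largest jobs: try all 2^k assignments.
    best_loads, best_span = [0, 0], float('inf')
    for assign in product((0, 1), repeat=len(big)):
        loads = [0, 0]
        for p, machine in zip(big, assign):
            loads[machine] += p
        if max(loads) < best_span:
            best_span, best_loads = max(loads), loads[:]
    loads = best_loads
    for p in small:  # list-schedule the remaining jobs
        loads[loads.index(min(loads))] += p
    return max(loads)
```

The enumeration costs O(2^k) and the list-scheduling phase O(n), matching the running time used in part b).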
b) To obtain a ratio of 1 + ε we need to choose k such that 1/(k + 1) ≤ ε,
i.e. k + 1 ≥ 1/ε, or k ≥ 1/ε − 1. For ε we call the smallest such value k(ε):
k(ε) = ⌈1/ε − 1⌉. Then the algorithm A(k(ε)) yields approximation ratio 1 + ε
and its running time is O(2^{1/ε} + n). Therefore, for all possible values of ε
the algorithms A(k(ε)) form a PTAS.