Travelling Salesman Problem

Algorithms and Networks 2015/2016
Hans L. Bodlaender
Johan M. M. van Rooij
Contents
 TSP and its applications
 Heuristics and approximation algorithms
 Construction heuristics, among others: Christofides, insertion heuristics
 Improvement heuristics, among others: 2-opt, 3-opt, Lin-Kernighan
 Exact algorithms
Travelling Salesman Problem – Algorithms and Networks
PROBLEM DEFINITION
APPLICATIONS
Problem
 Instance: n vertices (cities), a distance between every pair of vertices.
 Question: find a shortest (simple) cycle that visits every city.
[Figure: an example instance with pairwise distances, and two example tours, of length 13 and 11.]
Applications
 Pickup and delivery problems
 Robotics
 Board drilling / chip manufacturing
 Lift scheduling
Assumptions / Problem Variants
 Complete graph vs underlying graph structure.
 Complete graph: a given distance between all pairs of cities.
 Undirected edges vs. directed arcs.
 Undirected: symmetric: w(u,v) = w(v,u).
 Directed: not symmetric.
 Asymmetric examples: painting/chip machine application.
 Lengths:
 Lengths are non-negative (or positive).
 Triangle inequality: for all x, y, z: w(x,y) + w(y,z) ≥ w(x,z).
 Different assumptions lead to different problems.
NP-complete
 Instance: cities, distances, k.
 Question: is there a TSP-tour of length at most k?
 This decision problem is NP-complete.
 Closely related to the Hamiltonian Circuit problem.
If triangle inequality does not hold
 Theorem: If P ≠ NP, then there is no polynomial time algorithm for TSP without the triangle inequality that approximates within a ratio c, for any constant c.
 Proof: Suppose there is such an algorithm A. We build a
polynomial time algorithm for Hamiltonian Circuit (giving a
contradiction):
 Take instance G=(V,E) of Hamiltonian Circuit.
 Build instance of TSP:
• A city for each v ∈ V.
• If (v,w) ∈ E, then d(v,w) = 1, otherwise d(v,w) = nc+1.
 A finds a tour of length at most nc if and only if G has a Hamiltonian circuit: if G has one, the optimum is n ≤ nc, so A returns a tour of length at most nc; otherwise every tour uses a non-edge and has length more than nc.
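A minimal Python sketch of this reduction, assuming the Hamiltonian Circuit instance is given as a boolean adjacency matrix (the function name and representation are illustrative, not part of the lecture):

def hc_to_tsp_instance(adj, c):
    """Build a TSP distance matrix from a Hamiltonian Circuit instance.
    adj[i][j] is True iff {i,j} is an edge of G; c is the assumed
    approximation ratio of algorithm A. Edges get distance 1, non-edges
    get distance n*c + 1, so a tour of length at most n*c exists
    exactly when G has a Hamiltonian circuit."""
    n = len(adj)
    return [[1 if adj[i][j] else n * c + 1 for j in range(n)]
            for i in range(n)]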
Travelling Salesman Problem – Algorithms and Networks
CONSTRUCTION HEURISTICS
Heuristics and approximations
 Construction heuristics:
 A tour is built from nothing.
 Improvement heuristics:
 Start with some tour, and keep changing it into a better one for as long as possible.
1st Construction Heuristic:
Nearest neighbor
 Start at some vertex s; v=s;
 While not all vertices visited
 Select closest unvisited neighbor w of v
 Go from v to w; v=w
 Go from v to s.
 Without triangle inequality:
 Approximation ratio arbitrarily bad.
 With triangle inequality:
 Approximation ratio O(log n).
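A minimal sketch of the nearest neighbor heuristic in Python, assuming the instance is given as a symmetric distance matrix (names are illustrative):

def nearest_neighbor_tour(dist, start=0):
    """Greedy construction: always go to the closest unvisited city,
    and finally return to the start. dist is an n x n distance matrix;
    the tour is returned as a list of vertex indices."""
    n = len(dist)
    unvisited = set(range(n)) - {start}
    tour, v = [start], start
    while unvisited:
        w = min(unvisited, key=lambda u: dist[v][u])   # closest unvisited city
        tour.append(w)
        unvisited.remove(w)
        v = w
    tour.append(start)                                 # go from v back to s
    return tour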
2nd Construction Heuristic:
Heuristic with ratio 2
 Find a minimum spanning tree
 Report vertices of tree in preorder
 Preorder: visit a node before its children
 Approximation ratio 2 (with triangle inequality):
 OPT ≥ MST: deleting one edge from an optimal tour gives a spanning tree.
 Result ≤ 2 MST: a walk around the tree uses every edge twice, and shortcutting does not increase the length.
 Result / OPT ≤ 2 MST / MST = 2.
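A sketch of this 2-approximation in Python, assuming a distance matrix and using Prim's algorithm for the minimum spanning tree (illustrative, not the lecture's own code):

def mst_preorder_tour(dist, root=0):
    """Build an MST with Prim's algorithm, then report its vertices in
    preorder and return to the root. With the triangle inequality the
    resulting tour has length at most twice the optimum."""
    n = len(dist)
    in_tree = [False] * n
    parent = [-1] * n
    best = [float('inf')] * n        # cheapest known connection to the tree
    best[root] = 0
    children = [[] for _ in range(n)]
    for _ in range(n):
        v = min((u for u in range(n) if not in_tree[u]), key=lambda u: best[u])
        in_tree[v] = True
        if parent[v] != -1:
            children[parent[v]].append(v)
        for u in range(n):
            if not in_tree[u] and dist[v][u] < best[u]:
                best[u], parent[u] = dist[v][u], v
    tour, stack = [], [root]         # iterative preorder traversal
    while stack:
        v = stack.pop()
        tour.append(v)
        stack.extend(reversed(children[v]))
    tour.append(root)                # close the cycle
    return tour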
3rd Construction Heuristic:
Christofides
1. Make a Minimum Spanning Tree T.
2. Set W = {v | v has odd degree in tree T}.
3. Compute a minimum weight perfect matching M in the graph G[W].
4. Look at the graph T+M.
 Note that T+M is Eulerian!
5. Compute an Euler tour C’ in T+M.
6. Add shortcuts to C’ to get a TSP-tour.
Ratio 1.5
 Total length of the edges in T: at most OPT.
 Total length of the edges in the matching M: at most OPT/2 (the optimal tour, shortcut to the vertices of W, splits into two perfect matchings on W; the cheaper one has length at most OPT/2).
 So T+M has length at most 3/2 OPT.
 Use the Δ-inequality (triangle inequality) for the shortcuts.
4th Construction Heuristic:
Closest insertion heuristic
 Build tour by starting with one vertex, and inserting
vertices one by one.
 Always insert vertex that is closest to a vertex already in
tour.
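A sketch of the closest insertion heuristic in Python, assuming a distance matrix and assuming (as on the "Many variants" slide below) that the chosen vertex is inserted at the position that increases the tour length least:

def closest_insertion_tour(dist, start=0):
    """Repeatedly pick the unvisited vertex closest to the current tour
    and insert it at the cheapest position."""
    n = len(dist)
    tour = [start, start]                    # degenerate tour on one vertex
    remaining = set(range(n)) - {start}
    while remaining:
        # vertex closest to some vertex already on the tour
        v = min(remaining, key=lambda u: min(dist[u][t] for t in tour))
        # insertion position with minimum increase of the tour length
        i = min(range(len(tour) - 1),
                key=lambda i: dist[tour[i]][v] + dist[v][tour[i + 1]]
                              - dist[tour[i]][tour[i + 1]])
        tour.insert(i + 1, v)
        remaining.remove(v)
    return tour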
Closest insertion heuristic has
performance ratio 2
 Build tree T: if v is added to tour, add to T edge from v to
closest vertex on tour.
 T is a Minimum Spanning Tree (Prim’s algorithm).
 Total length of T ≤ OPT.
 Length of tour ≤ 2 * length of T.
Many variants
 Closest insertion: insert vertex closest to vertex in the
tour.
 Farthest insertion: insert vertex whose minimum
distance to a node on the cycle is maximum.
 Cheapest insertion: insert the node that can be inserted
with minimum increase in cost.
 Also gives ratio 2.
 Computationally expensive.
 Random insertion: randomly select a vertex.
 Each time: insert vertex at position that gives minimum
increase of tour length.
5th Construction Heuristic:
Cycle merging heuristic
 Start with n cycles of length 1.
 Repeat:
 Find two cycles with minimum distance.
 Merge them into one cycle.
 Until 1 cycle with n vertices.
 This has ratio 2: compare with algorithm of Kruskal for
MST.
6th Construction Heuristic:
Savings heuristic
 Cycle merging heuristic where we merge the two tours that provide the largest “savings”:
 The saving of a merge: the decrease in total length compared with keeping the tours separate; merge the pair with the largest saving, i.e., the smallest additional cost.
Some test results
 In an overview paper, Jünger et al. report on tests on a set of instances (105 – 2392 vertices; city-generated TSP benchmarks):
 Nearest neighbor: 24% away from optimal on average.
 Closest insertion: 20%.
 Farthest insertion: 10%.
 Cheapest insertion: 17%.
 Random insertion: 11%.
 Preorder of minimum spanning tree: 38%.
 Christofides: 19%, with improvement 11% / 10%.
 Savings method: 10% (and fast).
Travelling Salesman Problem – Algorithms and Networks
IMPROVEMENT HEURISTICS
Improvement heuristics
 Start with a tour (e.g., from a construction heuristic) and improve it stepwise.
 Iterative improvement / local search:
 2-Opt
 3-Opt
 k-Opt
 Lin-Kernighan
 Iterated LK
 Simulated annealing, …
Scheme
 A rule modifies a solution into a different solution.
 While there is a Rule(sol, sol’) with sol’ a better solution than sol:
 Take sol’ instead of sol.
 The cost decreases in every step.
 May get stuck in a ‘local minimum’.
 Can take exponential time in theory…
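A minimal sketch of this local-search scheme in Python; neighbors and cost are illustrative placeholders for the rule and the objective:

def local_search(sol, neighbors, cost):
    """Keep replacing the current solution by a better neighbouring
    solution until none exists, i.e. until a local minimum is reached."""
    improved = True
    while improved:
        improved = False
        for cand in neighbors(sol):          # all solutions the rule can produce
            if cost(cand) < cost(sol):
                sol, improved = cand, True   # take sol' instead of sol
                break
    return sol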
Very simple
 Node insertion:
 Take a vertex v and put it in a different spot in the tour.
 Edge insertion:
 Take two successive vertices v, w and put these as an edge somewhere else in the tour.
2-opt
 Take two edges (v,w) and (x,y) and replace them by (v,x) and (w,y) if this improves the tour.
 Costly: part of the tour has to be turned around.
 In R² (Euclidean distances): gets rid of crossings in the tour.
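A straightforward 2-opt improvement loop in Python (a sketch, assuming a distance matrix and a tour list with tour[0] == tour[-1]):

def two_opt(tour, dist):
    """While some pair of tour edges (v,w), (x,y) can be replaced by
    (v,x), (w,y) with a gain, do so by reversing the part in between."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 2):
            for j in range(i + 1, len(tour) - 1):
                a, b = tour[i - 1], tour[i]     # edge (v,w)
                c, d = tour[j], tour[j + 1]     # edge (x,y)
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                    tour[i:j + 1] = reversed(tour[i:j + 1])  # turn part around
                    improved = True
    return tour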
2-Opt improvements
 Reverse the shorter part of the tour.
 We need a direction on the tour to decide which reconnection option leads to subtours.
 Postpone correcting the tour.
 Clever search for improving moves.
 Restrict distance function evaluations.
 Prove that you do not need to consider some potential 2-opt moves, because these were not improving in earlier steps.
 Look only at a subset of candidate improvements.
 Combine with node insertion.
3-opt
 Choose three edges from tour
 Remove them, and reconnect the three resulting parts into a tour in the cheapest way.
3-opt
 Costly to find 3-opt improvements: O(n³) candidates.
 k-opt: generalizes 3-opt.
Lin-Kernighan
 Idea: modifications that look bad on their own can enable something good later.
 Tour modification:
 Collection of simple changes
 Some increase length
 Total set of changes decreases length
Lin-Kernighan
 One LK step:
 Make sets of edges X = {x1, …, xr}, Y = {y1,…,yr}
• If we replace X by Y in tour then we have another tour
 Sets are built stepwise
 Repeated until …
 Variants on scheme possible
One Lin-Kernighan step
 Choose vertex t1, and edge x1 = (t1,t2) from tour.
 i=1.
 Choose edge y1=(t2, t3) not in tour with g1 = w(x1)–w(y1) > 0
(or, as large as possible).
 Repeat a number of times, or until …
 i++;
 Choose edge xi = (t2i-1,t2i) from tour, such that:
• xi not one of the edges yj.
• oldtour – X + (t2i,t1) +Y is also a tour.
 if oldtour – X + (t2i,t1) +Y has shorter length than oldtour, then
take this tour: done.
 Choose edge yi = (t2i, t2i+1) such that:
• gi = w(xi) – w(yi) > 0.
• yi is not one of the edges xj .
• yi not in the tour.
Iterated Lin-Kernighan
 Construct a start tour.
 Repeat the following r times:
 Improve the tour with Lin-Kernighan until this is no longer possible.
 Do a random 4-opt move that does not increase the length by more than 10 percent.
 Report the best tour seen.
 Costs much time, but gives excellent results!
Other methods
 Simulated annealing and similar methods.
 Problem specific approaches, special cases.
 Iterated LK combined with treewidth/branchwidth
approach:
 Run ILK a few times (e.g., 5).
 Take the graph formed by the union of these 5 tours.
 Find a minimum length Hamiltonian circuit in this graph with a clever dynamic programming algorithm.
Travelling Salesman Problem – Algorithms and Networks
DYNAMIC PROGRAMMING
Held-Karp algorithm for TSP
 O(n² 2ⁿ) algorithm for TSP.
 Uses dynamic programming.
 Take some starting vertex s.
 For a set of vertices R (s ∈ R) and a vertex w ∈ R, let B(R,w) = minimum length of a path that
 Starts in s.
 Visits all vertices in R (and no other vertices).
 Ends in w.
TSP: Recursive formulation
 B({s},s) = 0
 B({s},v) = ∞ if v ≠ s
 If |R| > 1, then
 B(R,x) = min_{v ∈ R\{x}} ( B(R\{x}, v) + w(v,x) )
 If we have all B(V,v), then the length of an optimal TSP tour is min_{v ≠ s} ( B(V,v) + w(v,s) ).
 Gives the requested algorithm using DP techniques.
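A compact bitmask implementation of this recurrence (a sketch; vertex 0 plays the role of s, and sets R are encoded as bit masks):

def held_karp(dist):
    """Held-Karp dynamic programming: O(n^2 * 2^n) time, O(n * 2^n) space.
    B[R][w] = length of a shortest path that starts in 0, visits exactly
    the vertices in R, and ends in w. Returns the optimal tour length."""
    n = len(dist)
    INF = float('inf')
    B = [[INF] * n for _ in range(1 << n)]
    B[1][0] = 0                                   # B({s}, s) = 0
    for R in range(1, 1 << n):
        if not R & 1:                             # every set must contain s
            continue
        for x in range(n):
            if not R & (1 << x) or B[R][x] == INF:
                continue
            for y in range(n):
                if R & (1 << y):
                    continue
                new_cost = B[R][x] + dist[x][y]
                if new_cost < B[R | (1 << y)][y]:
                    B[R | (1 << y)][y] = new_cost
    full = (1 << n) - 1
    return min(B[full][v] + dist[v][0] for v in range(1, n))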
Space Improvement for
Hamiltonian Cycle
 Dynamic programming algorithm uses:
 O(n² 2ⁿ) time.
 O(n 2ⁿ) space.
 In practice, space becomes a problem before time does.
 Next, we give an algorithm for Hamiltonian Cycle that uses:
 O(n³ 2ⁿ) time.
 O(n) space.
 This algorithm counts solutions using ‘Inclusion/Exclusion’.
Counting (Non-)Hamiltonian Cycles
 Computing/counting tours is much easier if we do not care about which cities are visited.
 Keeping track of the set of visited cities is what makes the DP algorithm use exponential space.
 Define: Walks[ vi, k ] = the number of ways to travel from v1 to vi traversing k edges.
 We do not care which nodes/edges are visited (possibly more than once).
 Using dynamic programming:
 Walks[ vi, 0 ] = 1 if i = 1
 Walks[ vi, 0 ] = 0 otherwise
 Walks[ vi, k ] = ∑_{vj ∈ N(vi)} Walks[ vj, k – 1 ]
 Walks[ v1, n ] counts all walks of length n that return to v1.
 This requires O(n³) time and O(n) space.
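A sketch of this dynamic program in Python; it stores only O(n) numbers at a time (here adj is a 0/1 adjacency matrix, symmetric for undirected graphs; the names are illustrative):

def count_closed_walks(adj, length, start=0):
    """walks[v] after k rounds equals the number of walks of length k
    from `start` to v; only two length-n arrays are ever kept."""
    n = len(adj)
    walks = [1 if v == start else 0 for v in range(n)]   # Walks[., 0]
    for _ in range(length):
        walks = [sum(walks[u] for u in range(n) if adj[u][v])
                 for v in range(n)]
    return walks[start]                                  # Walks[start, length]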
How Does Counting ‘Arbitrary’ Cycles Help?
A useful formula:
 Number of cycles through vj =
 (total number of cycles: some go through vj, some do not)
 − (number of cycles that do not go through vj).
 Hamiltonian Cycle: a cycle of length n that visits every city.
 Keeping track of whether every vertex is visited is hard.
 Counting cycles that do not go through some vertex is easy.
 Just leave the vertex out of the graph.
 Counting cycles that may go through some vertex is easy.
 We no longer care whether it is visited (once, twice, or not at all).
Using the formula on every city
A useful formula:
 Number of cycles through vj =
 (total number of cycles: some go through vj, some do not)
 − (number of cycles that do not go through vj).
 What if we apply the above formula to every city (except v1)?
 We get 2ⁿ⁻¹ subproblems, where we count walks from v1 to v1 in which each city other than v1 either may not be visited at all, or may be visited freely (possibly several times, or not at all).
 After combining the computed numbers from all 2ⁿ⁻¹ subproblems using the above formula, we get:
 The number of cycles of length n that visit every vertex.
 I.e., the number of Hamiltonian cycles.
Apply the formula to every city:
the Inclusion/Exclusion formula
A useful formula:
 Number of cycles through vj =
 (total number of cycles: some go through vj, some do not)
 − (number of cycles that do not go through vj).
Let CycleWalks( V’ ) be the number of walks of length n from v1 to v1 that use only vertices in V’. Applying the formula to every city except v1 gives:
 Number of Hamiltonian cycles = ∑_{V’ ⊆ V, v1 ∈ V’} (−1)^{|V| − |V’|} · CycleWalks( V’ ).
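A sketch of the whole inclusion/exclusion count in Python (illustrative; note that for an undirected graph every Hamiltonian cycle is counted once per traversal direction):

from itertools import combinations

def count_hamiltonian_cycles(adj):
    """Sum over all subsets V' containing v1 of (-1)^(n - |V'|) times the
    number of closed walks of length n from v1 inside V'.
    O(n^3 * 2^n) time, O(n) space per subproblem."""
    n = len(adj)
    total = 0
    for k in range(n):                            # choose k other cities
        for chosen in combinations(range(1, n), k):
            allowed = {0, *chosen}                # V' always contains v1 = 0
            sign = (-1) ** (n - len(allowed))
            total += sign * closed_walks_within(adj, allowed, n)
    return total

def closed_walks_within(adj, allowed, length):
    """CycleWalks(V'): closed walks of the given length from vertex 0
    that use only vertices in `allowed`."""
    n = len(adj)
    walks = [1 if v == 0 else 0 for v in range(n)]
    for _ in range(length):
        walks = [sum(walks[u] for u in allowed if adj[u][v])
                 if v in allowed else 0 for v in range(n)]
    return walks[0]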
Travelling Salesman Problem – Algorithms and Networks
CONCLUSIONS
Conclusions
 TSP has many applications.
 Also many applications for variants of TSP.
 Heuristics.
 Construction heuristics
 Improvement heuristics
 Dynamic programming in O(n² 2ⁿ) time and O(n 2ⁿ) space.
 Inclusion/Exclusion for Hamiltonian Cycle in O(n³ 2ⁿ) time and O(n) space.
 Further reading:
 M. Jünger, G. Reinelt, G. Rinaldi, The Traveling Salesman Problem, in: Handbooks in Operations Research and Management Science, Volume 7: Network Models, North-Holland / Elsevier, 1995.