Smoothed Analysis of the Successive Shortest Path Algorithm

Tobias Brunsch¹, Kamiel Cornelissen², Bodo Manthey², Heiko Röglin¹, Clemens Rösner¹
¹ Department of Computer Science, University of Bonn, Germany
² Department of Applied Mathematics, University of Twente, The Netherlands
Minimum-Cost Flow Network
flow network:   G = (V, E)
balance values: b : V → Z
costs:          c : E → R≥0
capacities:     u : E → N

[Figure: example network; each arc is labeled cost/capacity, each node with its balance value.]
Minimum-Cost Flow Problem
flow:                 f : E → R≥0
capacity constraints: ∀e ∈ E : f(e) ≤ u(e)
Kirchhoff's law:      ∀v ∈ V : b(v) = out(v) − in(v)

Goal: min over all feasible flows f of Σ_{e∈E} f(e) · c(e)

[Figure: the example network with a feasible flow, built up over several overlays; arc labels give flow, cost and capacity, node labels give balance values.]
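To make these definitions concrete, here is a minimal Python sketch (not from the talk) that checks the capacity constraints and Kirchhoff's law and evaluates the objective; the arc representation and function names are illustrative assumptions.

```python
# Minimal sketch of the min-cost flow definitions above (illustrative only).
# Assumed representation: arcs are (tail, head) pairs; capacity u, cost c,
# flow f are dicts keyed by arc; balance b is a dict keyed by node.

def is_feasible(nodes, arcs, u, b, f):
    """Capacity constraints and Kirchhoff's law b(v) = out(v) - in(v)."""
    if any(f[e] < 0 or f[e] > u[e] for e in arcs):   # 0 <= f(e) <= u(e)
        return False
    for v in nodes:
        out_v = sum(f[e] for e in arcs if e[0] == v)
        in_v = sum(f[e] for e in arcs if e[1] == v)
        if out_v - in_v != b[v]:
            return False
    return True

def flow_cost(arcs, c, f):
    """Objective value: sum over all arcs e of f(e) * c(e)."""
    return sum(f[e] * c[e] for e in arcs)
```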
Short History
Pseudo-polynomial algorithms:
  Out-of-Kilter algorithm
  Cycle Canceling algorithm
  Successive Shortest Path algorithm
  [Minty 60, Fulkerson 61]

Polynomial-time algorithms:
  Capacity Scaling algorithm
  Cost Scaling algorithm
  [Edmonds and Karp 72]

Strongly polynomial algorithms:
  Tardos' algorithm [Tardos 85]
  Minimum-Mean Cycle Canceling algorithm
  Network Simplex algorithm
  Enhanced Capacity Scaling algorithm [Orlin 93]
Theory vs. Practice
Theory:
  Fastest algorithm: Enhanced Capacity Scaling
  Successive Shortest Path: exponential in the worst case
  Minimum-Mean Cycle Canceling: strongly polynomial

Practice:
  Fastest algorithm: Network Simplex
  Successive Shortest Path is much faster than Minimum-Mean Cycle Canceling
Reason for Gap between Theory and Practice
Worst-case complexity is too pessimistic! There are artificial worst-case inputs, but these inputs do not occur in practice.

This phenomenon also occurs for many other problems and algorithms.

[Cartoon: an adversary announcing "I will trick your algorithm!"]

Goal
Find a more realistic performance measure that is not just based on the worst case.
Smoothed Analysis
Observation: In worst-case analysis, the adversary is too powerful.
Idea: Let's weaken him!

Input model:
  Adversarial choice of the flow network
  Adversarial real arc capacities u_e and node balance values b_v
  Adversarial densities f_e : [0, 1] → [0, φ]
  Arc costs c_e drawn independently according to f_e

The randomness models, e.g., measurement errors, numerical imprecision, rounding, ...
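A minimal sketch of how such a smoothed instance could be generated, assuming one concrete family of admissible densities (uniform on a subinterval of [0, 1] of length 1/φ, which is bounded by φ); the function and parameter names are illustrative, not part of the model.

```python
import random

def smoothed_arc_costs(arcs, preferred, phi, rng=random):
    """Draw an independent cost c_e in [0, 1] for every arc e (assumes phi >= 1).

    preferred[e] is the adversary's intended cost for arc e; the realized cost
    is uniform on an interval of length 1/phi around it, clipped to [0, 1],
    so the density of every c_e is bounded by phi, as the model requires.
    """
    width = 1.0 / phi
    costs = {}
    for e in arcs:
        lo = min(max(preferred[e] - width / 2, 0.0), 1.0 - width)
        costs[e] = lo + rng.random() * width
    return costs
```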
Smoothed Analysis
Worst-case analysis:  max over costs c of T(c)
Smoothed analysis:    max over densities (f_e) of E[T(c)], with the costs c_e drawn according to f_e

[Figure: a density f(x) on [0, 1], bounded from above by φ.]

φ = 1: average-case analysis (the only admissible density is the uniform one)
φ → ∞: worst-case analysis (the density may concentrate arbitrarily around a single value x*)
Initial Transformation
Successive Shortest Path algorithm
[Figure: the example network before and after the initial transformation — a single source with balance 3 and a single sink with balance −3 are added and connected to the original supply and demand nodes by zero-cost arcs; all original balance values become 0.]
Augmenting Steps
The algorithm repeatedly augments flow along a shortest (minimum-cost) path from the source to the sink in the residual network.

[Figure: first augmentation — path length 3, augmenting flow value 2 — followed by an update of the residual network.]
[Figure: second augmentation — path length 5, augmenting flow value 1 — followed by another update of the residual network.]
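A compact Python sketch of the Successive Shortest Path algorithm as used above: after the initial transformation, repeatedly find a minimum-cost source–sink path in the residual network and push as much flow as possible along it. For simplicity the sketch uses Bellman-Ford on the residual network rather than Dijkstra with node potentials (which the running-time bound later in the talk assumes); the graph representation is an illustrative assumption.

```python
# Sketch of the Successive Shortest Path algorithm (illustrative; Bellman-Ford
# is used instead of Dijkstra with node potentials).  Residual arcs are mutable
# records [head, residual capacity, cost, index of the reverse arc].

def add_arc(residual, v, w, capacity, cost):
    residual[v].append([w, capacity, cost, len(residual[w])])
    residual[w].append([v, 0, -cost, len(residual[v]) - 1])

def shortest_path(residual, s, n):
    """Bellman-Ford from s; returns (dist, pred) with pred[v] = (tail, arc)."""
    INF = float("inf")
    dist, pred = [INF] * n, [None] * n
    dist[s] = 0.0
    for _ in range(n - 1):
        for v in range(n):
            if dist[v] == INF:
                continue
            for arc in residual[v]:
                w, cap, cost, _ = arc
                if cap > 0 and dist[v] + cost < dist[w]:
                    dist[w], pred[w] = dist[v] + cost, (v, arc)
    return dist, pred

def successive_shortest_paths(n, arcs, s, t, demand):
    """arcs: list of (tail, head, capacity, cost); ships `demand` units from s to t."""
    residual = [[] for _ in range(n)]
    for v, w, cap, cost in arcs:
        add_arc(residual, v, w, cap, cost)
    flow, total_cost = 0, 0.0
    while flow < demand:
        dist, pred = shortest_path(residual, s, n)
        if pred[t] is None:
            break  # no augmenting path left; the remaining demand cannot be met
        # Bottleneck residual capacity along the shortest s-t path.
        delta, v = demand - flow, t
        while v != s:
            tail, arc = pred[v]
            delta = min(delta, arc[1])
            v = tail
        # Augment delta units and update the residual network.
        v = t
        while v != s:
            tail, arc = pred[v]
            arc[1] -= delta
            residual[v][arc[3]][1] += delta
            total_cost += delta * arc[2]
            v = tail
        flow += delta
    return flow, total_cost
```

On the transformed instance of the previous slides one would call this with the added master source and sink and with demand equal to the total supply.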
Resulting Flow
[Figure: the resulting minimum-cost flow, shown in the transformed network and mapped back to the original network; flow cost: 11, flow value: 3.]
Results
Theorem (Upper Bound)
In expectation, the SSP algorithm requires O(mnφ) iterations and has a running time of O(mnφ(m + n log n)).

Theorem (Lower Bound)
There are smoothed instances on which the SSP algorithm requires Ω(m · min{n, φ} · φ) iterations in expectation.

The upper bound is tight for φ = Ω(n).
Useful Properties
Lemma
The distances from the source to any node increase monotonically.

[Figure: cost of the current flow plotted against its value, built up over the iterations starting from the empty flow; the result is a piecewise-linear convex curve whose slopes are the augmenting path lengths. After 5 iterations: #iterations = #distinct slopes.]

Lemma
Every intermediate flow is optimal for its flow value.
Counting the Number of Slopes
slope = augmenting path length ∈ (0, n] = ⋃_{ℓ=1}^{k} I_ℓ , where |I_ℓ| = n/k

⟹  E[#slopes] = Σ_{ℓ=1}^{k} E[#slopes ∈ I_ℓ] ≈ Σ_{ℓ=1}^{k} Pr[∃ slope ∈ I_ℓ] = O(mnφ)

Main Lemma
∀d ≥ 0 ∀ε ≥ 0 : Pr[∃ slope ∈ (d, d + ε]] = O(mφε)
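Spelled out, with each interval I_ℓ of length ε = n/k so that the Main Lemma applies directly to it, the chain on this slide reads (a sketch of the slide's reasoning, keeping its heuristic ≈ step):

```latex
\mathbb{E}[\#\text{slopes}]
  = \sum_{\ell=1}^{k} \mathbb{E}\bigl[\#\{\text{slopes in } I_\ell\}\bigr]
  \approx \sum_{\ell=1}^{k} \Pr[\exists\, \text{slope} \in I_\ell]
  \le \sum_{\ell=1}^{k} O\!\Bigl(m \phi \cdot \tfrac{n}{k}\Bigr)
  = O(m n \phi).
```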
Flow Reconstruction
Main Lemma
∀d ≥ 0 ∀ε ≥ 0 : Pr[∃ slope ∈ (d, d + ε]] = O(mφε)

d  – slope threshold
F* – flow at the breakpoint where the slope first exceeds d
P  – next augmenting path, so c(P) > d
e  – an empty arc of P in the residual network G_{F*}

[Figure: the cost–value curve; segments before F* have slope ≤ d, segments after F* have slope > d.]

∃ slope ∈ (d, d + ε]  ⟺  c(P) ∈ (d, d + ε]

Goal: Reconstruct F* and P without knowing c_e.
Principle of Deferred Decisions
Main Lemma
∀d ≥ 0 ∀ε ≥ 0 : Pr[∃ slope ∈ (d, d + ε]] = O(mφε)

Phase 1: Reveal all costs c_{e′} for e′ ≠ e.
Assume this suffices to uniquely identify F* and P.

Phase 2:
Pr[c(P) ∈ (d, d + ε]] = Pr[c(e) ∈ (z, z + ε]] ≤ φε,
where z is fixed once the costs c_{e′} for e′ ≠ e are fixed.

[Figure: the cost–value curve with breakpoint at F* and the next augmenting path P with c(P) > d.]
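The bound in Phase 2 is simply the density bound from the input model: conditioned on all other costs, c(e) is still distributed with density f_e ≤ φ, so for the fixed threshold z (a one-line sketch of the step):

```latex
\Pr\bigl[c(e) \in (z, z + \varepsilon]\bigr]
  = \int_{z}^{z+\varepsilon} f_e(x)\,\mathrm{d}x
  \le \varepsilon \cdot \sup_x f_e(x)
  \le \phi \varepsilon .
```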
Flow Reconstruction
Case 1: e is a forward arc.
Set c′(e) = 1 and c′(e′) = c(e′) for all e′ ≠ e.
Run the SSP algorithm with the modified costs c′.

[Figure: the cost–value curves under c and under the modified costs c′.]

F* is the same for c and c′.