OR 215
Network Flows
Spring 1999
M. Hartmann
MULTI-COMMODITY FLOWS II
Subgradient Optimization
Resource-directive Decomposition
Price-directive Decomposition
Lagrangian Relaxation
SUBGRADIENTS
A function f(x) from R^n to R is convex if for all λ ∈ [0,1]
and for all x and y in R^n the following is true:

    f(λx + (1−λ)y) ≤ λf(x) + (1−λ)f(y).

Lemma: A function f(x) from R^n to R is convex if and only
if for any y in R^n there exists s in R^n such that

    f(x) ≥ f(y) + s(x−y)

for all x in R^n. Here s is said to be a subgradient of f(x) at
the point y (see below).
[Figure: subgradients of a convex function — affine functions f(y) + s(x−y) lying below the graph of f(x) and touching it at y.]
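The defining inequality of a subgradient can be checked numerically. A minimal sketch in Python, using an invented piecewise-linear convex function f(x) = max(−x, 2x − 3) (not from the notes):

```python
# Numerical check of the subgradient inequality f(x) >= f(y) + s(x - y)
# for a piecewise-linear convex function.  The function
# f(x) = max(-x, 2x - 3) is an invented example, not from the notes.

def f(x):
    return max(-x, 2 * x - 3)

def subgradient(y):
    # The slope of any affine piece that attains the max at y
    # is a subgradient of f at y.
    return -1.0 if -y >= 2 * y - 3 else 2.0

y = 2.0
s = subgradient(y)          # the piece 2x - 3 is active at y = 2, so s = 2
for x in [-5.0, 0.0, 1.0, 2.0, 10.0]:
    assert f(x) >= f(y) + s * (x - y) - 1e-12
```

The inequality holds with equality exactly where the supporting line coincides with the active piece.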
MINIMIZING A CONVEX FUNCTION
If f(x) is differentiable at y, then the only subgradient at y
is s = ∇f(y), the gradient of f(x) at y.

The steepest descent method for minimizing f(x) sets

    x_{q+1} = x_q − θ_q ∇f(x_q)     (θ_q > 0)

for q = 1,2,… since −∇f(x_q) is the direction of maximum
decrease in f(x) at x_q.
If we generalize this method to non-differentiable convex
functions, via

    x_{q+1} = x_q − θ_q s_q     (θ_q > 0)

where s_q is a subgradient of f(x) at x_q, then it may be
the case that

    f(x_{q+1}) = f(x_q − θ_q s_q) > f(x_q).

So there is no guarantee that f(x_{q+1}) < f(x_q). But if
f(x*) < f(x_q), since s_q is a subgradient of f(x) at x_q,

    s_q(x* − x_q) ≤ f(x*) − f(x_q) < 0.

So for θ_q > 0 sufficiently small, x_{q+1} is closer to x* than x_q
(see below).
[Figure: the step x_{q+1} = x_q − θ_q s_q moves from x_q toward x*; the hyperplane s_q·x = s_q·x_q separates x_q from x*.]
Theorem: If θ_q → 0 in such a way that the series Σ_q θ_q
diverges (e.g., θ_q = 1/q), then x_q converges to a minimum
x* of f(x).
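The step-size rule in the theorem can be tried directly. A sketch, assuming the toy objective f(x) = |x| (not from the notes) and the steps θ_q = 1/q:

```python
# The subgradient method x_{q+1} = x_q - theta_q * s_q with theta_q = 1/q,
# applied to the nondifferentiable convex function f(x) = |x|
# (a toy objective chosen for illustration).

def f(x):
    return abs(x)

def subgrad(x):
    # sign(x) is a subgradient of |x|; at x = 0 any s in [-1, 1] works.
    return 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

x = 2.0
best = f(x)
for q in range(1, 201):
    x = x - (1.0 / q) * subgrad(x)   # theta_q = 1/q: theta_q -> 0, sum diverges
    best = min(best, f(x))           # f(x_q) need not decrease monotonically

# best approaches the minimum value f(0) = 0, even though individual
# steps may increase f.
```

The iterates overshoot 0 and oscillate, but with ever-smaller amplitude, which is exactly the behavior the theorem's conditions on θ_q are designed to handle.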
RESOURCE-DIRECTIVE METHOD
Let z(r) be the optimal value of the linear programming
problem

    minimize    Σ_{k=1}^{K} c^k x^k

    subject to  N x^k = b^k             k = 1,2,…,K

                0 ≤ x^k_ij ≤ r^k_ij     (i,j) ∈ A,  k = 1,2,…,K
Note that z(r) can be computed by solving K min-cost flow
problems, and the multi-commodity flow problem can be
expressed as
    minimize    z(r)

    subject to  Σ_{k=1}^{K} r^k_ij ≤ u_ij    (i,j) ∈ A

                0 ≤ r^k_ij ≤ u^k_ij          (i,j) ∈ A,  k = 1,2,…,K
Lemma: Let R denote the set of allocations r for which the
linear programming problem is feasible. Then z(r) is a
piece-wise linear convex function of the allocations r in R.
Lemma: Let z^k(r^k) denote the objective value of the
min-cost flow problem

    minimize    c^k x^k

    subject to  N x^k = b^k

                0 ≤ x^k_ij ≤ r^k_ij     (i,j) ∈ A

then s^k_ij = [c^k_ij − π^k(i) + π^k(j)]⁻ is a subgradient of
z^k(·) at r^k, where π^k are any optimal node prices and
[a]⁻ = min{a, 0} denotes the negative part.
Corollary: The s^k defined above constitute a subgradient
of z(·) at r.
SUBGRADIENT-BASED METHOD
If for q = 1,2,… we set

    (r^k)_{q+1} = (r^k)_q − θ_q (s^k)_q     (θ_q ≥ 0)

then notice that, since each (s^k)_q ≤ 0,

    (r^k)_{q+1} ≥ (r^k)_q ≥ … ≥ (r^k)_1

and so eventually (r^k)_q will not be feasible to the
resource allocation problem.

The convergence properties of the subgradient
method can be maintained if (r^k)_{q+1} is instead chosen
to be the feasible allocation r^k which minimizes

    Σ_{(i,j)} { (r^k_ij)_q − θ_q (s^k_ij)_q − r^k_ij }²

that is, the Euclidean projection of the subgradient step
onto the feasible allocations.
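Since the allocation constraints involve each arc separately, this minimization decouples over arcs. A sketch of the per-arc projection (the function name, the bisection scheme, and the iteration count are this illustration's assumptions, not part of the notes):

```python
# For one arc, project the target point y (the subgradient step) onto
# {r : 0 <= r_k <= cap_k, sum_k r_k <= u}.  By the KKT conditions the
# projection is r_k = clip(y_k - tau, 0, cap_k) for the smallest tau >= 0
# making the bundle constraint hold; tau can be found by bisection.

def project_arc(y, caps, u, iters=60):
    def r(tau):
        return [min(max(yk - tau, 0.0), ck) for yk, ck in zip(y, caps)]
    if sum(r(0.0)) <= u:             # clipping alone is already feasible
        return r(0.0)
    lo, hi = 0.0, max(y)             # at tau = max(y), sum r(tau) = 0 <= u
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if sum(r(mid)) > u:
            lo = mid
        else:
            hi = mid
    return r(hi)

# Two commodities ask for 3 units each on an arc whose bundle capacity is 4:
print(project_arc([3.0, 3.0], caps=[5.0, 5.0], u=4.0))   # approx [2.0, 2.0]
```

In the example the shortfall is split evenly because both commodities overshoot the bundle capacity by the same amount.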
PRICE-DIRECTIVE METHOD
If we relax the bundle constraints and "charge" a per-unit
shipping fee of w_ij on each arc (i,j), the multi-commodity
flow problem would become

    minimize    Σ_{k=1}^{K} (c^k + w) x^k

    subject to  N x^k = b^k             k = 1,2,…,K

                0 ≤ x^k_ij ≤ u^k_ij     (i,j) ∈ A,  k = 1,2,…,K

How can we relate the optimal value z(w) of this linear
programming problem to the minimum cost of a multi-commodity
flow?
LAGRANGIAN FUNCTION
Define the Lagrangian function L(w) by

    L(w) = z(w) − Σ_{(i,j)} w_ij u_ij
Lemma: The Lagrangian function L(w) is a piecewise
linear concave function of w.
[Figure: the Lagrangian function L(w), piecewise linear and concave in w.]
Theorem: The Lagrangian function L(w) is a lower bound
on the minimum cost of a multi-commodity flow for any
w ≥ 0, and for some w* ≥ 0, L(w*) is equal to this value.
Proof: To see that L(w) is a lower bound, consider
any optimal multi-commodity flow x^k_ij. Since every
multi-commodity flow is feasible to the Lagrangian
relaxation, we have

    z(w) ≤ Σ_{k=1}^{K} (c^k + w) x^k
         = Σ_{k=1}^{K} c^k x^k + Σ_{(i,j)} w_ij Σ_{k=1}^{K} x^k_ij
         ≤ Σ_{k=1}^{K} c^k x^k + Σ_{(i,j)} w_ij u_ij

so that L(w) ≤ Σ_{k=1}^{K} c^k x^k, the minimum cost.
The second assertion holds if w* are the optimal arc prices
(dual variables from the multi-commodity flow problem).
THE LAGRANGIAN DUAL
One method to find the maximum of L(w) over w ≥ 0 is to
use subgradient optimization to minimize the piecewise
linear convex function −L(w) over w ≥ 0.
Lemma: If x^k is an optimal solution to

    minimize    Σ_{k=1}^{K} (c^k + w) x^k

    subject to  N x^k = b^k             k = 1,2,…,K

                0 ≤ x^k_ij ≤ u^k_ij     (i,j) ∈ A,  k = 1,2,…,K

then s_ij = u_ij − Σ_{k=1}^{K} x^k_ij is a subgradient of −L(·) at w.
Note: We need to modify the subgradient method to

    w_{q+1} = [w_q − θ_q s_q]⁺

so that the arc prices remain nonnegative.
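The projected subgradient step can be run end to end on a small instance. The two-commodity instance below is invented for illustration: each commodity ships 10 units and chooses between a shared arc (cost $1, bundle capacity u = 10) and its own uncapacitated arc (cost $5), so the optimal cost is $60 (10 units at $1 plus 10 units at $5) and the dual attains it at w* = 4:

```python
# Projected subgradient ascent on the Lagrangian dual L(w) for the toy
# instance described above.  With a fee w on the shared arc the relaxation
# decomposes: each commodity routes all of its demand on its cheaper arc,
# so L(w) = 20*min(1 + w, 5) - 10*w, maximized at w* = 4 with L(w*) = 60.

def oracle(w):
    """Solve the relaxed problem for fee w.
    Returns (z(w), total flow on the shared arc)."""
    z, flow = 0.0, 0.0
    for demand in (10.0, 10.0):
        if 1.0 + w <= 5.0:           # shared arc is (weakly) cheaper
            z += (1.0 + w) * demand
            flow += demand
        else:                        # private uncapacitated arc
            z += 5.0 * demand
    return z, flow

u = 10.0
w = 0.0
best_L = float("-inf")
for q in range(1, 501):
    z, flow = oracle(w)
    best_L = max(best_L, z - w * u)       # L(w) = z(w) - sum_ij w_ij u_ij
    s = u - flow                          # subgradient of -L at w
    w = max(0.0, w - (1.0 / q) * s)       # w_{q+1} = [w_q - theta_q s_q]^+

# best_L approaches the optimal multi-commodity cost of 60
```

As in the primal method, L(w_q) oscillates rather than improving monotonically, so the best bound found so far is the quantity worth tracking.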
NUMERICAL EXAMPLE
[Figure: a two-commodity example network with sources s1, s2, sinks t1, t2, and intermediate nodes 1 and 2. Supplies are b1(s1) = 10, b1(t1) = −10 and b2(s2) = 20, b2(t2) = −20. Arc labels are costs c_ij (here $1 or $5) and capacities u_ij.]
[Figure: a flow in the example network, with the flow on each arc labeled (values such as 5 and 15).]