
Low Complexity Algebraic Multicast Network Codes
Sidharth “Sid” Jaggi
Philip Chou
Kamal Jain
The Multicast Problem Model
[Figure: source s connected through a network to sinks t1, t2, ..., t|T|, each receiving at rate R.]
Network of nodes and directed edges.
Single source, rate R.
|T| sinks, each desiring the same information at rate R.
Examples –
1. Online news broadcasts.
2. Online gaming.
History-I
Assumptions
Acyclic graph.
Each link has unit capacity.
Links have zero delay.
Arithmetic operations allowed at all nodes.
[Figure: source s connected through the network to sinks t1, t2, ..., t|T|; the min-cut from s to ti has value Ci.]
Upper bound for the multicast capacity C:
$\min_{\mathrm{cut}(s \to t_i)} (\text{cutsize}) = \max_{\mathrm{flow}(s \to t_i)} (\text{flow}) = C_i, \quad i = 1, 2, \ldots, |T|$
C ≤ min_i {Ci}
Multicast capacity C achievable!
(Random coding argument, [ACLY 2000])
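As a concrete illustration of the bound above, the sketch below computes each Ci as a max-flow from s to ti and takes their minimum. It assumes the networkx package and an illustrative unit-capacity butterfly-like graph; neither is from the talk.

```python
# Sketch: the multicast capacity bound C = min_i C_i, where C_i is the
# max-flow (= min-cut) from the source s to sink t_i.  Graph is illustrative.
import networkx as nx

# Butterfly-like network: every edge has unit capacity.
edges = [("s", "u"), ("s", "v"), ("u", "t1"), ("v", "t2"),
         ("u", "w"), ("v", "w"), ("w", "x"), ("x", "t1"), ("x", "t2")]
G = nx.DiGraph()
G.add_edges_from(edges, capacity=1)

sinks = ["t1", "t2"]
C_i = {t: nx.maximum_flow_value(G, "s", t) for t in sinks}   # max-flow = min-cut
C = min(C_i.values())                                        # multicast capacity bound
print(C_i, C)   # {'t1': 2, 't2': 2} 2
```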
Example
[Figure: butterfly network. Source s sends bits b1 and b2; the single shared middle edge carries b1+b2; sink t1 receives b1 and b1+b2, sink t2 receives b2 and b1+b2, and each recovers (b1, b2).]
Example due to [ACLY2000].
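A minimal check of the butterfly example, assuming (as in the figure) that t1 receives (b1, b1+b2) and t2 receives (b2, b1+b2); plain XOR stands in for GF(2) addition.

```python
# Sketch: each sink in the [ACLY2000] butterfly example recovers (b1, b2)
# from its two received bits.  Illustrative only.
for b1 in (0, 1):
    for b2 in (0, 1):
        coded = b1 ^ b2                      # bit sent on the shared middle edge
        # t1 receives (b1, b1+b2); t2 receives (b2, b1+b2)
        t1_decoded = (b1, b1 ^ coded)        # b1, and b2 = b1 xor (b1+b2)
        t2_decoded = (coded ^ b2, b2)        # b1 = (b1+b2) xor b2, and b2
        assert t1_decoded == (b1, b2) and t2_decoded == (b1, b2)
print("both sinks recover (b1, b2) for every input")
```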
History-II
F(2^m)-linear network [KM2001]
Source: group together m bits, $(b_1 b_2 \ldots b_m) \in \{0,1\}^m \leftrightarrow F(2^m)$.
Any node: perform linear combinations over the finite field F(2^m).
[Figure: a node with incoming edges 1, 2, ..., k and coefficients β1, β2, ..., βk; its outgoing edge carries β1·(symbol on edge 1) + β2·(symbol on edge 2) + ... + βk·(symbol on edge k).]
Local Coding Vector: [β1 β2 ... βk]
An F(2^m)-linear network can achieve the multicast capacity C! (for 2^m > |T|C)
Computational complexity high.
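The sketch below illustrates the [KM2001]-style node operation over F(2^m). The choice m = 4, the modulus x^4 + x + 1, and the helper names are illustrative assumptions, not details from the talk.

```python
# Sketch: a node combines incoming F(2^m) symbols using its local coding vector,
# computing beta_1*x_1 + ... + beta_k*x_k.  GF(2^4) with modulus x^4 + x + 1 is
# an illustrative choice; the talk only assumes some field F(2^m) with 2^m > |T|C.
M = 4
IRRED = 0b10011          # x^4 + x + 1, irreducible over GF(2)

def gf_mul(a, b):
    """Multiply two elements of GF(2^4) represented as 4-bit integers."""
    result = 0
    while b:
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a & (1 << M):     # reduce modulo the irreducible polynomial
            a ^= IRRED
    return result

def local_encode(local_coding_vector, incoming_symbols):
    """Output symbol of a node: sum_i beta_i * x_i over GF(2^m)."""
    out = 0
    for beta, x in zip(local_coding_vector, incoming_symbols):
        out ^= gf_mul(beta, x)               # addition in GF(2^m) is XOR
    return out

# A node with local coding vector [1, 3, 7] acting on incoming symbols [5, 6, 2].
print(local_encode([1, 3, 7], [5, 6, 2]))
```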
Results (Ours and Others’)
Main Result (also [SET2003]): For 2^m ≥ |T|, there exists a capacity-achieving F(2^m)-linear network code which can be designed in polynomial time.
Importance
- Linear encoding/decoding.
- Without coding, the capacity gap can be arbitrarily large [SET2003].
- Lower bound on field size / alphabet size ≈ √|T| [LL(preprint)]; finding the smallest alphabet is NP-hard.
- Multicast is the only "easy and interesting" case.
- Can be implemented as binary block-linear codes.
- Randomized construction (also [SET2003], [HMKKE2003]):
  - Faster design,
  - More robust (single code, zero error, small rate loss, arbitrary failure pattern).
Example: Local Coding Vectors
Length = number of incoming edges (variable).
[Figure: in the butterfly example, the edge out of s carrying b1 has local coding vector [1]; the node with incoming b1 and b2 has local coding vector [1 1] and sends b1+b2.]
Idea: Global Coding Vectors
Length = capacity C (fixed).
[Figure: in the butterfly example, the edge carrying b1 has global coding vector [1 0], the edge carrying b2 has [0 1], and the edge carrying b1+b2 has [1 1].]
Information carried by an edge ↔ (g.c.v. of the edge) · [s1 s2 ... sC]^T.
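A small sketch of the two facts on this slide, over GF(2): an edge's global coding vector is the local-coefficient combination of its parents' global coding vectors, and the symbol on the edge is the dot product of its g.c.v. with the source symbols. The helper names are illustrative, not from the talk.

```python
# Sketch of the global-coding-vector idea on the butterfly example (GF(2)).
def combine_gcv(local_cv, parent_gcvs):
    """g.c.v. of an outgoing edge from the parents' g.c.v.s, over GF(2)."""
    C = len(parent_gcvs[0])
    out = [0] * C
    for beta, gcv in zip(local_cv, parent_gcvs):
        if beta:
            out = [a ^ b for a, b in zip(out, gcv)]
    return out

def edge_symbol(gcv, source_symbols):
    """Symbol carried by an edge = g.c.v. dot (s1, ..., sC) over GF(2)."""
    return sum(g & s for g, s in zip(gcv, source_symbols)) % 2

g_b1, g_b2 = [1, 0], [0, 1]                        # edges carrying b1 and b2
g_mid = combine_gcv([1, 1], [g_b1, g_b2])          # local coding vector [1 1]
print(g_mid)                                       # [1, 1]

b1, b2 = 1, 0
print(edge_symbol(g_mid, (b1, b2)))                # b1 + b2 = 1
```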
Idea: Linear Independence
TASK: Find local coding vectors so that each receiver can decode.
METHOD: Find local coding vectors sequentially, so that the global coding vectors on every cut-set to every receiver are linearly independent.
PREPROCESSING: Find a set of C edge-disjoint paths from s to each ti.
OUTPUT: Local coding vectors, final global coding vectors.
[Figure: butterfly network. Receiver t1 sees edges with global coding vectors [1 0] and [1 1], giving T1 = [[1 0]; [1 1]]; receiver t2 sees [1 1] and [0 1], giving T2 = [[1 1]; [0 1]].]
Decoding: $[x_1 \ldots x_C]^T = [T_i]^{-1} [y_1 \ldots y_C]^T$
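A sketch of this decoding step for the C = 2 example above, assuming GF(2) arithmetic. The Gaussian-elimination helper is illustrative scaffolding, not the paper's implementation.

```python
# Sketch: receiver-side decoding [x1 ... xC]^T = Ti^{-1} [y1 ... yC]^T over GF(2).
def solve_gf2(T, y):
    """Solve T x = y over GF(2) by Gaussian elimination (T assumed invertible)."""
    n = len(T)
    A = [row[:] + [y[i]] for i, row in enumerate(T)]         # augmented matrix
    for col in range(n):
        pivot = next(r for r in range(col, n) if A[r][col])  # find a pivot row
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(n):
            if r != col and A[r][col]:
                A[r] = [a ^ b for a, b in zip(A[r], A[col])]
    return [A[i][n] for i in range(n)]

T1 = [[1, 0], [1, 1]]          # global coding vectors on t1's incoming edges
x = [1, 0]                     # source symbols (b1, b2)
y = [sum(t & s for t, s in zip(row, x)) % 2 for row in T1]   # what t1 receives
print(solve_gf2(T1, y))        # recovers [1, 0] = (b1, b2)
```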
One edge at a time…
[Figure: small example network with edges e1, ..., e5; e1 carries b1 (g.c.v. [1 0]), e2 carries b2 (g.c.v. [0 1]); edge e5 is the next edge to be designed.]
M1({1,4}) = [[1 0]; [0 1]]
M2({3,2}) = [[1 0]; [0 1]]
Design the g.c.v. for edge 5: choose [1 1], so edge 5 carries b1+b2.
M1({1,5}) = [[1 0]; [1 1]]
M2({2,5}) = [[0 1]; [1 1]]
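The sketch below replays this slide's update: fixing edge 5's g.c.v. to [1 1] replaces one row in each receiver's frontier matrix, and the invariant that every Mi stays full rank is checked over GF(2). The rank helper and the dictionaries are illustrative scaffolding.

```python
# Sketch of the edge-by-edge invariant on the e1..e5 example (GF(2), C = 2).
def rank_gf2(rows):
    """Rank of a list of 0/1 row vectors over GF(2)."""
    pivots = {}                                   # pivot column -> stored row
    for r in rows:
        v = list(r)
        for col, b in pivots.items():
            if v[col]:
                v = [x ^ y for x, y in zip(v, b)]
        if any(v):
            pivots[v.index(1)] = v
    return len(pivots)

gcv = {1: [1, 0], 2: [0, 1], 3: [1, 0], 4: [0, 1]}   # g.c.v.s already designed
M = {"t1": [1, 4], "t2": [3, 2]}                      # frontier edge sets (cut-sets)

gcv[5] = [1, 1]                                       # design edge 5: b1 + b2
M["t1"] = [1, 5]                                      # edge 4's row replaced by edge 5
M["t2"] = [2, 5]                                      # edge 3's row replaced by edge 5

for t, cut in M.items():
    assert rank_gf2([gcv[e] for e in cut]) == 2       # invariant: Mi still has rank C
print("both frontier matrices remain invertible")
```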
Choosing local coding vectors appropriately
Let v1, ..., vk be the global coding vectors feeding into edge e.
Let Mi(n) be the matrix of global coding vectors on the nth cut-set to receiver ti.
Inductive hypothesis: each Mi(n) has rank C.
Let L = span{v1, ..., vk}.
[Figure: edge e with incoming global coding vectors v1, v2, ..., vk spanning the subspace L of the ambient space V, together with the subspaces S1, ..., Sk.]
$M_j(n) = \begin{bmatrix} v_{11} & v_{12} & \cdots & v_{1C} \\ \vdots & & & \vdots \\ v_{j1} & v_{j2} & \cdots & v_{jC} \\ \vdots & & & \vdots \\ v_{C1} & v_{C2} & \cdots & v_{CC} \end{bmatrix}$
Sj = span{rows(Mj(n)) minus the row being replaced by the g.c.v. of e}.
Then we wish to find v in L such that v is not in Sj for any Sj (and therefore the rank of each Mi remains C).
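A sketch of this selection step over GF(2), on the edge-5 example (incoming g.c.v.s v1 = [1 0], v2 = [0 1]; S1 = span{[1 0]}, S2 = span{[0 1]}). The exhaustive search over local coefficients is illustrative; the lemma on the next slide guarantees a valid v exists whenever the field size q exceeds |T|.

```python
# Sketch: pick v = sum_i beta_i v_i in L that lies outside every S_j (GF(2)).
from itertools import product

def in_span_gf2(basis, v):
    """Is v in the GF(2) span of the given rows?"""
    pivots = {}
    for r in basis:                               # build a row basis
        w = list(r)
        for col, b in pivots.items():
            if w[col]:
                w = [x ^ y for x, y in zip(w, b)]
        if any(w):
            pivots[w.index(1)] = w
    v = list(v)
    for col, b in pivots.items():                 # reduce v against the basis
        if v[col]:
            v = [x ^ y for x, y in zip(v, b)]
    return not any(v)

incoming = [[1, 0], [0, 1]]              # v1, ..., vk: g.c.v.s feeding into edge e
forbidden = [[[1, 0]], [[0, 1]]]         # S_j bases: the rows of M_j(n) that are kept

for betas in product([0, 1], repeat=len(incoming)):          # candidate local c.v.s
    v = [0, 0]
    for beta, gcv in zip(betas, incoming):
        if beta:
            v = [a ^ b for a, b in zip(v, gcv)]
    if any(v) and all(not in_span_gf2(S, v) for S in forbidden):
        print("local coding vector:", betas, "-> global coding vector:", v)
        break
```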
Cool Lemma
Lemma: If $|T| < q$, then $L \setminus \bigl(\bigcup_{j=1}^{k} S_j\bigr) \neq \emptyset$.
Proof: Consider $L \setminus \bigl(\bigcup_{j=1}^{k} (L \cap S_j)\bigr)$.
[Figure: the subspace L of V, together with its intersections L ∩ S1, ..., L ∩ Sk.]
$|L| = q^{\mathrm{rank}(L)}$,
$|L \cap S_j| \le q^{\mathrm{rank}(L)-1}$ (each $L \cap S_j$ is a proper subspace of L, because the row of $M_j(n)$ being replaced lies in L but not in $S_j$),
$|T| < q \;\Rightarrow\; \Bigl|\bigcup_{j=1}^{k} (L \cap S_j)\Bigr| \le |T|\bigl(q^{\mathrm{rank}(L)-1} - 1\bigr) + 1 < q^{\mathrm{rank}(L)}$
(the k ≤ |T| subspaces overlap at least in the zero vector), so
$\Bigl|L \setminus \bigl(\bigcup_{j=1}^{k} (L \cap S_j)\bigr)\Bigr| > 0$.
Hence proved.
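A worked instance of the counting bound, with illustrative numbers (q = 4, |T| = 3, rank(L) = 2) not taken from the talk:
\[
|L| = q^{\mathrm{rank}(L)} = 16, \qquad |L \cap S_j| \le q^{\mathrm{rank}(L)-1} = 4,
\]
\[
\Bigl|\bigcup_{j=1}^{k}(L \cap S_j)\Bigr| \le |T|\bigl(q^{\mathrm{rank}(L)-1}-1\bigr)+1 = 3 \cdot 3 + 1 = 10 < 16 = q^{\mathrm{rank}(L)},
\]
so at least six vectors of L lie outside every $S_j$.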
(Quick deterministic algorithm; faster randomized algorithms.)
Practical Optimal Network Codes.
- Multicast is the only "easy and interesting" case for network coding problems.
- Capacity-achieving codes.
- Linear encoding/decoding.
- Small field sizes:
  - At most quadratic gap.
  - Smallest field-size determination hard.
- Polynomial design complexity. (Random code design)
- Robustness. (Random code design)
Joint paper being prepared for submission to IT Trans.:
Jaggi, Sanders, Chou, Effros, Egner, Jain, Tolhuizen