
Introduction to Coding Theory
Sven Puchinger
Institute of Communications Engineering, Ulm University
COSIP Winter Retreat, 7.12.2016
Coding Theory

code → Encoder → Channel → Decoder

(Figure: the word "code" is encoded and transmitted; the channel corrupts it to "node", which the decoder must map back to "code".)
Coding Theory

Code C = set of codewords, dH = Hamming metric

C = {code, node, core}
dH(code, node) = 1, dH(code, core) = 1, dH(node, core) = 2

C = {simplifications, overgeneralized}
dH(simplifications, overgeneralized) = 13

Encoding (rate R = k/n): information word of length k ↦ codeword of length n
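The Hamming metric on words of equal length can be computed in a few lines; a quick sketch in Python (the helper name is mine, not from the talk):

```python
def hamming_distance(x, y):
    """Number of positions in which two equal-length words differ."""
    assert len(x) == len(y)
    return sum(a != b for a, b in zip(x, y))

print(hamming_distance("code", "node"))  # 1
print(hamming_distance("code", "core"))  # 1
print(hamming_distance("node", "core"))  # 2
print(hamming_distance("simplifications", "overgeneralized"))  # 13
```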
Decoding

Received word: ovmrleniralxzed
dH(simplifications, ovmrleniralxzed) = 10
dH(overgeneralized, ovmrleniralxzed) = 4
⇒ Decode to the closest codeword: overgeneralized
Coding Theory

Claude E. Shannon, A Mathematical Theory of Communication, 1948:
If R < C, then Perr → 0 (n → ∞)

Richard Hamming, Error Detecting and Error Correcting Codes, 1950:
First practical codes
Jim Massey’s History of Channel Coding
Experts’ opinions
In the 50s and 60s
Coding is dead! All interesting problems are already solved.
In the 70s
Coding is dead as a doornail, except on the deep-space channel.
In the 80s
Coding is quite dead, except on wideband channels such as the
deep-space channel and narrowband channels such as the telephone
channel.
In the 90s
Coding is truly dead, except on single sender channels.
Applications

CD/DVD/Blu-Ray, Mobile Communications, Compressed Sensing, Biology, Distributed Storage, Network Coding, IBAN check digits (e.g. DE78 0192 2932 3627 6328 10), Cryptography (McEliece: Alice publishes S·G·P, Bob sends m·S·G·P + e), PUFs (helper data generation and storage, key reproduction from r′ = c + e + e′, then hashing to a key)

(Figure: block diagrams of the McEliece cryptosystem and of PUF-based key generation.)
Applications
Experts’ opinion
Coding is surely dead, except on the deep-space channel, on narrowband
one-sender channels such as the telephone channel, on many sender
channels, storage, networks, ...
Outline
1 Code Constructions
2 Channels & Metrics
3 Decoding
4 Connection to Compressed Sensing
Code

K field (usually finite, e.g. K = F2)
Code: C ⊆ K^n
Goal: codewords c1, c2 ∈ C "far apart" w.r.t. a metric d : K^n × K^n → R≥0
Example: Hamming metric dH(x, y) = wtH(x − y), where wtH(x) = |supp(x)|
Linear Block Codes

C(n, k, d) ⊆ K^n:
Length n
K-subspace of dimension k
Minimum distance d = min_{c1, c2 ∈ C, c1 ≠ c2} d(c1, c2) = min_{c ∈ C \ {0}} wtH(c)

Singleton bound: d ≤ n − k + 1
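For small codes, the minimum distance can be checked by brute force over all nonzero codewords. A sketch assuming the [7,4] binary Hamming code in systematic form (my illustrative choice, not a code from the talk):

```python
from itertools import product

import numpy as np

# Systematic generator matrix of the [7,4] binary Hamming code (illustrative choice).
G = np.array([[1, 0, 0, 0, 0, 1, 1],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 1, 1, 1]])

def minimum_distance(G):
    """d = min weight over all nonzero codewords i*G (mod 2); feasible only for small k."""
    k = G.shape[0]
    return min(int(np.mod(np.array(i) @ G, 2).sum())
               for i in product([0, 1], repeat=k) if any(i))

print(minimum_distance(G))  # 3  (and indeed 3 <= n - k + 1 = 4, the Singleton bound)
```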
Generator & Parity Check Matrix

Generator matrix = a basis of C:
G ∈ K^{k×n} with rows c1, ..., ck
⇒ C = { i · G : i ∈ K^k }

Dual code C⊥ = { c⊥ ∈ K^n : ⟨c⊥, c⟩ = 0 ∀ c ∈ C } (orthogonal complement)

Parity check matrix = a basis of C⊥:
H ∈ K^{(n−k)×n} with rows c⊥1, ..., c⊥(n−k)
⇒ C = { c ∈ K^n : H c^T = 0 }
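The two descriptions C = {i·G} and C = {c : Hc^T = 0} can be checked against each other numerically. A minimal sketch over F2 with an illustrative [4,2] code in systematic form (G = [I | A], H = [A^T | I]; the matrices are my choice, not from the talk):

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])
G = np.hstack([np.eye(2, dtype=int), A])    # generator matrix, 2 x 4
H = np.hstack([A.T, np.eye(2, dtype=int)])  # parity check matrix, 2 x 4

# Every row of G lies in the kernel of H, so G * H^T = 0 over F2.
print(np.mod(G @ H.T, 2))

# Encode an information word i and verify the parity check H c^T = 0.
i = np.array([1, 0])
c = np.mod(i @ G, 2)
print(c)                  # [1 0 1 1]
print(np.mod(H @ c, 2))   # [0 0]
```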
Example 1: Repetition Code

K = F2
Code: C(n, 1, n) = {(00 . . . 0), (11 . . . 1)}
Generator matrix: G = (1 1 . . . 1) ∈ F2^{1×n}
Parity check matrix: H = (1 | I_{n−1}) ∈ F2^{(n−1)×n}, i.e. rows (1, 1, 0, ..., 0), (1, 0, 1, 0, ..., 0), ..., (1, 0, ..., 0, 1)
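Encoding and decoding the repetition code is immediate; a sketch with majority-vote decoding (function names are mine):

```python
def rep_encode(bit, n):
    """C(n, 1, n): repeat one information bit n times."""
    return [bit] * n

def rep_decode(r):
    """Majority vote: corrects up to floor((n-1)/2) bit flips."""
    return int(sum(r) > len(r) / 2)

c = rep_encode(1, 5)   # [1, 1, 1, 1, 1]
r = [1, 0, 1, 0, 1]    # channel flipped two bits
print(rep_decode(r))   # 1
```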
Example 2: Reed–Solomon Codes

α1, ..., αn ∈ K distinct
C(n, k) = { (f(α1), ..., f(αn)) : f(x) ∈ K[x], deg f(x) < k }

Theorem: d = n − k + 1

Proof: The Singleton bound gives d ≤ n − k + 1. Conversely, let c ≠ 0:
⇒ f(x) ≠ 0 and deg f(x) ≤ k − 1
⇒ at most k − 1 of the αi give f(αi) = 0
⇒ at least n − (k − 1) of the αi give f(αi) ≠ 0
⇒ wtH(c) ≥ n − k + 1
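Encoding is polynomial evaluation. A sketch over the prime field F13 (field size, evaluation points, and information polynomial are illustrative choices, not from the talk):

```python
p = 13                          # field F_13
alphas = [1, 2, 3, 4, 5, 6, 7]  # n = 7 distinct evaluation points
k = 3                           # => d = n - k + 1 = 5

def rs_encode(f, alphas, p):
    """Evaluate the information polynomial f (coefficients low-to-high, deg f < k)."""
    return [sum(cf * pow(a, j, p) for j, cf in enumerate(f)) % p for a in alphas]

f = [5, 2, 7]                   # f(x) = 5 + 2x + 7x^2
print(rs_encode(f, alphas, p))  # [1, 11, 9, 8, 8, 9, 11]
```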
Code Classes (Selection)

Block codes:
Reed–Solomon: (+) decoding, (−) n ≤ |K|
LDPC: (+) achieve capacity C (n → ∞), (−) few analytical tools
BCH, AG, Reed–Muller, ...

Convolutional codes:
(P)UM, ...

Code concatenation combines block and convolutional codes.

There is no (known) universally superior code!
Channels

General: c → P(r | c) → r

Binary Symmetric Channel (0 < ε < 0.5):
0 → 0 and 1 → 1 with probability 1 − ε; 0 → 1 and 1 → 0 with probability ε
⇐⇒ ri = ci ⊕ ei with ei ∼ Ber(ε)
⇒ Hamming metric
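The BSC is easy to simulate; a minimal sketch (helper name mine):

```python
import random

def bsc(c, eps, rng=random):
    """Flip each bit of c independently with probability eps."""
    return [bit ^ (rng.random() < eps) for bit in c]

random.seed(42)
print(bsc([0] * 20, 0.1))  # each bit flipped with probability 0.1
```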
Channels

AWGN channel: ri = ci + ei with ei ∼ N(0, σ²)
⇒ Euclidean metric

Rank metric:
dR : K^{m×n} × K^{m×n} → N0, (X, Y) ↦ rank(X − Y)
Corresponding channels: random linear network coding, MIMO transmission systems
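The rank distance can be sketched with numpy (over the reals for illustration; over a finite field one would row-reduce modulo the field characteristic instead):

```python
import numpy as np

def rank_distance(X, Y):
    """Rank metric: d_R(X, Y) = rank(X - Y)."""
    return int(np.linalg.matrix_rank(X - Y))

X = np.array([[1, 0], [0, 1]])
Y = np.array([[1, 1], [0, 1]])
print(rank_distance(X, Y))  # 1
```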
Decoding

c → P(r | c) → r

Maximum-Likelihood (ML) Decoding:
ĉ = argmax_{c ∈ C} P(c | r) = argmax_{c ∈ C} P(r | c)P(c)/P(r)

If the metric d fits the channel:
ĉ = argmin_{c ∈ C} d(r, c)

In practice:
Convolutional codes: ML-decodable (Viterbi algorithm)
Block codes: often not ML-decodable, e.g. for a C(400, 272, d) code over F2: |C| ≈ 10^82
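The complexity obstruction can be seen directly: minimum-distance decoding by exhaustive search enumerates all |K|^k codewords. A sketch for a toy code (here the [5,1] repetition code, an illustrative choice):

```python
from itertools import product

import numpy as np

def nearest_codeword(r, G):
    """Brute-force minimum-distance decoding over F2: try all 2^k codewords."""
    r = np.array(r)
    best = None
    for i in product([0, 1], repeat=G.shape[0]):
        c = np.mod(np.array(i) @ G, 2)
        dist = int(np.sum(c != r))
        if best is None or dist < best[0]:
            best = (dist, c)
    return best[1]

G = np.ones((1, 5), dtype=int)               # [5,1] repetition code
print(nearest_codeword([1, 0, 1, 1, 0], G))  # [1 1 1 1 1]
```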
Bounded Minimum Distance Decoding

(Figure: codewords c1, ..., c4 at pairwise distance ≥ d, each surrounded by a ball of radius d/2; a received word r falling into the ball around a codeword is decoded to that codeword.)
Example: Reed–Solomon Codes

r = (f(α1), ..., f(αn)) + e

Welch–Berlekamp:
Find a non-zero Q(x, y) = Q0(x) + y·Q1(x) ∈ K[x, y] s.t.
i) Q(αi, ri) = 0 ∀i
ii) deg Q0(x) ≤ n − 1 − d/2
iii) deg Q1(x) ≤ n − k − d/2

If dH(r, c) < d/2, then
Q(x, y) exists
f(x) = −Q0(x)/Q1(x)

Complexity:
Naive O(n³)
Practical O(n²)
Optimal O~(n)

Beyond d/2?
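The Welch–Berlekamp interpolation step is a linear system, so the whole decoder fits in a short script. A minimal sketch over F13 with n = 7, k = 3 (parameters and helper names are my illustrative choices; this is the naive O(n³) variant):

```python
p, alphas, k = 13, [1, 2, 3, 4, 5, 6, 7], 3   # illustrative RS code, d = 5
n = len(alphas)
tau = (n - k) // 2                            # decode up to floor((d-1)/2) = 2 errors

def kernel_vector(A, p):
    """One nonzero vector in the kernel of A over F_p (Gaussian elimination)."""
    A = [row[:] for row in A]
    ncols, pivots, rank = len(A[0]), {}, 0
    for c in range(ncols):
        piv = next((i for i in range(rank, len(A)) if A[i][c] % p), None)
        if piv is None:
            continue
        A[rank], A[piv] = A[piv], A[rank]
        inv = pow(A[rank][c], p - 2, p)
        A[rank] = [x * inv % p for x in A[rank]]
        for i in range(len(A)):
            if i != rank and A[i][c] % p:
                fac = A[i][c]
                A[i] = [(x - fac * y) % p for x, y in zip(A[i], A[rank])]
        pivots[c] = rank
        rank += 1
    free = next(c for c in range(ncols) if c not in pivots)
    v = [0] * ncols
    v[free] = 1
    for c, row in pivots.items():
        v[c] = -A[row][free] % p
    return v

def poly_div(num, den, p):
    """Quotient num/den over F_p, coefficients low-to-high (division is exact here)."""
    num, den = [x % p for x in num], [x % p for x in den]
    while den[-1] == 0:
        den.pop()
    q, inv = [0] * (len(num) - len(den) + 1), pow(den[-1], p - 2, p)
    for i in range(len(q) - 1, -1, -1):
        q[i] = num[i + len(den) - 1] * inv % p
        for j, dc in enumerate(den):
            num[i + j] = (num[i + j] - q[i] * dc) % p
    return q

def wb_decode(r):
    """Recover f (deg f < k) from r = (f(a1), ..., f(an)) + e with wt(e) <= tau."""
    d0, d1 = n - 1 - tau, n - k - tau         # degree bounds on Q0, Q1
    # Interpolation conditions Q0(ai) + ri * Q1(ai) = 0, one row per position.
    A = [[pow(a, j, p) for j in range(d0 + 1)]
         + [ri * pow(a, j, p) % p for j in range(d1 + 1)]
         for a, ri in zip(alphas, r)]
    Q = kernel_vector(A, p)
    q0, q1 = Q[:d0 + 1], Q[d0 + 1:]
    return poly_div([-x % p for x in q0], q1, p)[:k]   # f = -Q0 / Q1

f = [5, 2, 7]                                  # f(x) = 5 + 2x + 7x^2
c = [sum(cf * pow(a, j, p) for j, cf in enumerate(f)) % p for a in alphas]
r = c[:]
r[1], r[4] = (r[1] + 3) % p, (r[4] + 9) % p    # two errors
print(wb_decode(r))  # [5, 2, 7]
```

Any nonzero solution Q of the interpolation system works: Q0(x) + f(x)Q1(x) has degree at most n − 1 − τ but vanishes at the n − τ error-free positions, so it is identically zero and the division is exact.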
List Decoding

(Figure: enlarging the decoding radius beyond d/2 makes the balls around the codewords overlap; the decoder then returns the list of all codewords within the radius of the received word r.)
Connection to Compressed Sensing

Compressed Sensing (m ≪ n):
Given b ∈ C^m, A ∈ C^{m×n}, find the sparsest x ∈ C^n s.t. b = Ax

Decoding Problem (n − k ≪ n):
Given s ∈ K^{n−k}, H ∈ K^{(n−k)×n}, find e ∈ K^n of minimal wtH(e) s.t. s = He

Solution:
Find some solution r = c + e of s = He = H(c + e) = Hr.
Decode r → obtain c, e with minimal dH(r, c) = wtH(e)
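The decoding problem above is exactly a sparse-recovery problem. A toy brute-force sketch over F2 (the parity check matrix is an illustrative choice, not from the talk):

```python
from itertools import product

import numpy as np

H = np.array([[1, 1, 0, 1, 0],
              [0, 1, 1, 1, 1]])   # illustrative H with n = 5, n - k = 2

def sparsest_error(s, H):
    """Find e of minimal Hamming weight with H e = s (mod 2) -- the analogue of
    finding the sparsest x with A x = b in compressed sensing."""
    best = None
    for e in product([0, 1], repeat=H.shape[1]):
        if (np.mod(H @ np.array(e), 2) == s).all():
            if best is None or sum(e) < sum(best):
                best = e
    return np.array(best)

s = np.array([1, 0])
print(sparsest_error(s, H))  # [1 0 0 0 0]
```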