Convolutional Codes

Tomashevich Victor

Introduction
• Convolutional codes map information bits to code bits sequentially by convolving a sequence of information bits with "generator" sequences
• A convolutional encoder encodes K information bits into N > K code bits at each time step
• Convolutional codes can be regarded as block codes whose encoder has a structure that lets the encoding operation be expressed as a convolution
Properties of convolutional codes

Consider a convolutional encoder. Its input is an information bit sequence u, partitioned into blocks of length K:

u = (u_0, u_1, ...),   u_i = (u_i^(1), u_i^(2), ..., u_i^(K))

The encoder outputs the code bit sequence x, partitioned into blocks of length N:

x = (x_0, x_1, ...),   x_i = (x_i^(1), x_i^(2), ..., x_i^(N))

The code rate is R = K/N.
Example: Consider a rate 1/2 convolutional code with K=1 and N=2 defined by the circuit:

[Encoder circuit: input u_i, one delay register, outputs x_i^(1) and x_i^(2)]

The sequences (x_0^(1), x_1^(1), ...) and (x_0^(2), x_1^(2), ...) are generated as follows:

x_i^(1) = u_i
x_i^(2) = u_i + u_{i-1}

Multiplexing between x_i^(1) and x_i^(2) gives the code bit sequence

x = ((x_0^(1) x_0^(2)), (x_1^(1) x_1^(2)), ...) = (x_0, x_1, ...)
• The convolutional code is linear
• The encoding mapping is bijective
• Code bits generated at time step i are affected by information bits up to M time steps back in time (i - 1, i - 2, ..., i - M). M is the maximal delay of information bits in the encoder
• The code memory is the (minimal) number of registers needed to construct an encoding circuit for the code
• The constraint length is the overall number of information bits affecting the code bits generated at time step i: constraint length = code memory + K = MK + K = (M + 1)K
• A convolutional code is systematic if the N code bits generated at time step i contain the K information bits
Example: The rate 1/2 code defined by the circuit

[Encoder circuit: input u_i, one delay register, outputs x_i^(1), x_i^(2)]

has delay M=1, memory 1, constraint length 2, and is systematic.

Example: The rate 2/3 code defined by the circuit

[Encoder circuit: inputs u_i^(1), u_i^(2), two delay registers, outputs x_i^(1), x_i^(2), x_i^(3)]

has delay M=1, memory 2, constraint length 4, and is not systematic.
Tree

[Code tree of the rate 1/2 code over the information bits u_0, u_1, u_2: from the root node A, each bit selects a branch (u_i = 0 or u_i = 1); branches are labeled with the code bits 00, 11, 01, 10, and nodes with the states A, B.]
Trellis

The tree graph can be contracted to a directed graph called the trellis of the convolutional code, having at most S nodes at distance i = 0, 1, ... from the root.

The contents of the (at most) M·K encoder registers are assigned the variables s_i^(j) ∈ GF(2), j = 0, 1, ..., M·K - 1.

The vector s_i = (s_i^(0), s_i^(1), ..., s_i^(M·K-1)) combining all register contents at time step i is called the state of the encoder at time step i.

The code bit block x_i is clearly a function of s_i and u_i only.
Example: The encoder of the rate 1/2 convolutional code has S = 2^1 = 2 different states. The state is given by s_i = s_i^(0).

The code bit block x_i at time step i is computed from s_i and u_i by

x_i^(1) = u_i   and   x_i^(2) = u_i + s_i

[Trellis diagram with states A and B; branches labeled with the code bits 00, 11 (leaving A) and 01, 10 (leaving B), drawn separately for u_i = 0 and u_i = 1.]
Example: Constructing a trellis section

x_i^(1) = u_i   and   x_i^(2) = u_i + s_i

Two equations are required:

(1) How does s_i depend on u_{i-m} and possibly s_{i-m}, m > 0:   s_i = u_{i-1}
(2) How does x_i depend on s_i and u_i:   x_i^(1) = u_i   and   x_i^(2) = u_i + s_i

The branches are labeled with u_i | x_i. A branch leading from a state s_i to a new state s_{i+1} is called a state transition.
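The two equations above directly yield the trellis section; a minimal Python sketch (illustrative names, bits in GF(2)):

```python
def trellis_section():
    """Branches (s_i, u_i) -> (s_{i+1}, (x_i^(1), x_i^(2))) for the code with
    s_i = u_{i-1}, x_i^(1) = u_i, x_i^(2) = u_i XOR s_i."""
    branches = {}
    for s in (0, 1):
        for u in (0, 1):
            # next state s_{i+1} = u_i; output block (u, u XOR s)
            branches[(s, u)] = (u, (u, u ^ s))
    return branches

# e.g. from state 1 with input 0 we get branch 0|01 leading back to state 0
print(trellis_section()[(1, 0)])  # (0, (0, 1))
```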
Trellis section:

[Trellis over time steps 0, 1, 2, ..., L-2, L-1 for the states s_i ∈ {0, 1}: from state 0, input 0 gives branch 0|00 back to state 0 and input 1 gives branch 1|11 to state 1; from state 1, input 0 gives branch 0|01 to state 0 and input 1 gives branch 1|10 back to state 1.]
State diagram

Example: Trellis of the rate 1/2 convolutional code with x_i^(1) = u_i and x_i^(2) = u_i + s_i:

[Trellis section from s_i to s_{i+1}: branches 0|00 and 1|11 leave state 0, branches 0|01 and 1|10 leave state 1.]

State diagram:

[Two states 0 and 1: self-loop at state 0 with output 00 (u_i = 0), transition 0 → 1 with output 11 (u_i = 1), transition 1 → 0 with output 01 (u_i = 0), self-loop at state 1 with output 10 (u_i = 1).]
Description with submatrices

Definition: A convolutional code is a set C of code bit sequences

x = (x_0, x_1, ..., x_i, ...),   x_i = (x_i^(1), x_i^(2), ..., x_i^(N)),   x_i^(j) ∈ GF(2)

There exist many encoders mapping information bit sequences

u = (u_0, u_1, ...),   u_i = (u_i^(1), u_i^(2), ..., u_i^(K)),   u_i^(j) ∈ GF(2)

(partitioned into length K < N blocks) to code bit sequences x for the same code.
Example: The following two encoding circuits generate the same set of code word sequences:

[Two encoder circuits with inputs u_i^(1), u_i^(2) and outputs x_i^(1), x_i^(2), x_i^(3)]
Generator matrix

x = u · G,   where

G = | G_0 G_1 G_2 ... G_M                 |
    |     G_0 G_1 G_2 ... G_M             |
    |         G_0 G_1 G_2 ... G_M         |
    |             ...                     |

is a semi-infinite matrix built from the K × N submatrices G_0, ..., G_M, shifted by one block position per block row, so that

x_i = Σ_{m=0}^{M} u_{i-m} G_m,   ∀ i

The generated convolutional code has rate R = K/N, memory K·M, and constraint length K·(M + 1).
Example: The rate 1/2 code is given by

x_i^(1) = u_i   and   x_i^(2) = u_i + s_i,   x_i = (x_i^(1) x_i^(2))

G_0 governs how u_i affects x_i:      G_0 = (1 1)
G_1 governs how u_{i-1} affects x_i:  G_1 = (0 1)

((x_0^(1) x_0^(2)), (x_1^(1) x_1^(2)), (x_2^(1) x_2^(2))) = (u_0, u_1, u_2) · G,   where

G = | 11 01       |
    |    11 01    |
    |       11    |
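The submatrix encoding rule x_i = Σ_{m=0}^{M} u_{i-m} G_m can be sketched in Python (an illustrative implementation; submatrices are given as tuples of rows, and information bits before time 0 are taken as zero):

```python
def encode_submatrices(u_blocks, G, K, N):
    """x_i = sum over m of u_{i-m} . G_m, computed over GF(2).
    G is a list of K x N submatrices G_0..G_M; u_blocks are length-K blocks."""
    M = len(G) - 1
    x = []
    for i in range(len(u_blocks)):
        xi = [0] * N
        for m in range(M + 1):
            if i - m < 0:
                continue  # bits before time 0 are zero
            for k in range(K):
                for n in range(N):
                    xi[n] ^= u_blocks[i - m][k] & G[m][k][n]
        x.append(tuple(xi))
    return x

# Rate 1/2 example from the slide: G_0 = (1 1), G_1 = (0 1)
G = [[(1, 1)], [(0, 1)]]
print(encode_submatrices([(1,), (1,), (0,)], G, K=1, N=2))  # [(1, 1), (1, 0), (0, 1)]
```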
Description with polynomials

G(D) = | g_1^(1)(D)  g_1^(2)(D)  ...  g_1^(N)(D) |
       | g_2^(1)(D)  g_2^(2)(D)  ...  g_2^(N)(D) |
       | ...                                     |
       | g_K^(1)(D)  g_K^(2)(D)  ...  g_K^(N)(D) |

g_i^(j)(D) = g_{i,0}^(j) + g_{i,1}^(j) D + g_{i,2}^(j) D^2 + ... + g_{i,M}^(j) D^M,   g_{i,m}^(j) ∈ GF(2)

u(D) = (u^(1)(D), u^(2)(D), ..., u^(K)(D))   where   u^(j)(D) = u_0^(j) + u_1^(j) D + ... + u_i^(j) D^i + ...,   j = 1, 2, ..., K

x(D) = (x^(1)(D), x^(2)(D), ..., x^(N)(D))   where   x^(j)(D) = x_0^(j) + x_1^(j) D + ... + x_i^(j) D^i + ...,   j = 1, 2, ..., N

x(D) = u(D) G(D)

The submatrix and polynomial descriptions are related by g_{i,m}^(j) = G_m(i, j), i = 1, ..., K, j = 1, ..., N, m = 0, ..., M, and M = max_{i,j} deg(g_i^(j)(D)).
Example: The rate 1/2 code is given by

x_i^(1) = u_i   and   x_i^(2) = u_i + s_i,   G(D) = (g_1^(1)(D)  g_1^(2)(D))

The polynomial g_1^(1)(D) governs how u_{l-m}, m = 0, 1, affects x_l^(1):   g_1^(1)(D) = 1 + 0·D = 1

The polynomial g_1^(2)(D) governs how u_{l-m}, m = 0, 1, affects x_l^(2):   g_1^(2)(D) = 1 + 1·D = 1 + D

From M = 1 it follows that deg(g_1^(j)(D)) ≤ 1.

u = (1, 1, 0, ...)  →  u(D) = 1 + D

x = u · G yields x = (11, 10, 01, 00, ...)

x(D) = u(D) G(D) yields x(D) = (1 + D,  1 + D^2)
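The polynomial description amounts to polynomial multiplication over GF(2); a short illustrative sketch (coefficient lists, lowest degree first):

```python
def gf2_poly_mul(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists (lowest degree first)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj  # addition over GF(2) is XOR
    return out

# u(D) = 1 + D,  g_1^(1)(D) = 1,  g_1^(2)(D) = 1 + D
u = [1, 1]
g1, g2 = [1], [1, 1]
print(gf2_poly_mul(u, g1))  # [1, 1]    -> x^(1)(D) = 1 + D
print(gf2_poly_mul(u, g2))  # [1, 0, 1] -> x^(2)(D) = 1 + D^2
```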
Punctured convolutional codes

A sequence of code bits is punctured by deleting some of the bits in the sequence according to a fixed rule.

In general, the puncturing of a rate K/N convolutional code is defined using N puncturing tables, one for each code bit x_i^(j), j = 1, ..., N, in a block x_i.

Each table contains p bits, where p is the puncturing period. If a bit is 1, the corresponding code bit is part of the punctured code; if the bit is 0, the corresponding code bit is not part of the punctured code.

For a sequence of code bit blocks x_i, i = 0, 1, ..., the puncturing tables are applied periodically. The N puncturing tables are combined in an N × p puncturing matrix P.
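Applying a puncturing matrix periodically can be sketched as follows (illustrative; deleted positions are simply dropped from each block):

```python
def puncture(x_blocks, P):
    """Delete code bits according to an N x p puncturing matrix P (rows of 0/1).
    Row j of P is applied periodically to the code bits x_i^(j)."""
    p = len(P[0])  # puncturing period
    out = []
    for i, block in enumerate(x_blocks):
        out.append(tuple(bit for j, bit in enumerate(block) if P[j][i % p]))
    return out

# Example sequence x_NP = (00, 00, 11, 01, 11) punctured with P_1 = [1110; 1001]
xNP = [(0, 0), (0, 0), (1, 1), (0, 1), (1, 1)]
P1 = [[1, 1, 1, 0], [1, 0, 0, 1]]
print(puncture(xNP, P1))  # [(0, 0), (0,), (1,), (1,), (1, 1)]
```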
Example: Consider the rate 1/2 convolutional code given by

G(D) = (1 + D + D^2   1 + D^2)

The input u = (0, 0, 1, 0, 0) is encoded to x_NP = (00, 00, 11, 01, 11).

The sequence x_NP is punctured using two different puncturing matrices:

P_1 = | 1 1 1 0 |      P_2 = | 1 1 1 0 |
      | 1 0 0 1 |            | 1 1 0 1 |

The puncturing period is p = 4. Using P_1, 3 out of 4 code bits x_i^(1) and 2 out of 4 code bits x_i^(2) of the mother code are used, the others are discarded:

R = 1/2 · (4 + 4)/(3 + 2) = 4/5   and   u is encoded to x = (00, 0X, 1X, X1, 11) = (00, 0, 1, 1, 11)

Using P_2, the rate of the punctured code is

R = 1/2 · (4 + 4)/(3 + 3) = 2/3   and   u is encoded to x = (00, 00, 1X, X1, 11) = (00, 00, 1, 1, 11)
[Encoder of the rate 1/2 code with the puncturing tables applied to x_i^(1) and x_i^(2): the top tables (1110, 1001) puncture to a rate 4/5 code, the bottom tables (1110, 1101) to a rate 2/3 code; puncturing period p = 4.]
The rate R of a punctured code obtained from a rate R_0 = K/N mother code using the puncturing matrix P is given as

R = R_0 · (N·p)/(# of 1s in P) = (K·p)/(# of 1s in P)

With puncturing we can easily construct convolutional codes with arbitrary rational rate. However, punctured codes of rate R = K/N obtained from an optimized "good" mother code of memory m usually perform worse than unpunctured rate K/N, memory m optimized codes given by a K × N generator matrix G(D).

This performance gap increases with the number of punctured bits. The advantage of puncturing is that the decoding complexity is not altered, since the original trellis of the mother code can be used.
Example: Consider a rate 1/3, memory 4 mother code given by the submatrices G_0 = (111), G_1 = (010), G_2 = (011), G_3 = (101), and G_4 = (111).

[Table of punctured codes derived from this mother code, listing for each code its rate (8/24 = 1/3, 4/11, 4/10, 4/9, 1/2, 4/7, 2/3, 4/5, 8/9), its puncturing tables (period p = 8), and its free distance d_f with multiplicity c_{d_f}; d_f decreases from 9 at rate 1/3 down to 2 at rate 8/9 as more bits are punctured.]
Decoding of convolutional codes

The Viterbi algorithm

[Encoder circuit with two registers holding s_{1j} and s_{2j}, input u_j, outputs x_{1j} and x_{2j}; all values are bipolar (±1).]

u = (1, 1, 1, 1, 1, 1, 1),   initial state (s_{10}, s_{20}) = (1, 1)

x = (+1+1, +1+1, +1+1, +1+1, +1+1, +1+1, +1+1)
[Trellis section from states (s_{1j}, s_{2j}) to (s_{1,j+1}, s_{2,j+1}) with bipolar states +1+1, +1-1, -1+1, -1-1; branches are labeled u_j / x_{1j} x_{2j}, e.g. +1/+1+1 and -1/-1-1 from state +1+1, +1/+1-1 and -1/-1+1 from state +1-1.]
Hard decisions

y = (+1+1, -1+1, +1+1, +1+1, +1+1, +1+1, +1+1)

Branch metric: λ_j^(m) = x_{1j}^(m) · y_{1j} + x_{2j}^(m) · y_{2j}

[Trellis for j = 0, ..., 7: the branch metrics (+2, 0, -2) are accumulated along each path and only the best (survivor) path into each state is kept; the final survivor has metric +12.]

û_j = +1 +1 +1 +1 +1 +1 +1   →   no error
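As an illustration of the algorithm (using the simpler 2-state rate 1/2 code x_i^(1) = u_i, x_i^(2) = u_i + s_i from the earlier slides rather than the 4-state encoder above, since its generators are fully specified there), a hard-decision Viterbi decoder maximizing the accumulated correlation metric might look like:

```python
def viterbi_hard(y):
    """Hard-decision Viterbi decoder for the 2-state rate 1/2 code
    x1 = u, x2 = u XOR s (state s = previous u), bipolar mapping
    bit 0 -> +1, bit 1 -> -1, correlation metric x1*y1 + x2*y2."""
    bip = lambda b: 1 - 2 * b            # bit -> +1 / -1
    NEG = float("-inf")
    metric = {0: 0.0, 1: NEG}            # encoder starts in state 0
    paths = {0: [], 1: []}
    for y1, y2 in y:
        new_metric = {0: NEG, 1: NEG}
        new_paths = {}
        for s in (0, 1):
            for u in (0, 1):             # branch s --u--> next state u
                x1, x2 = u, u ^ s
                m = metric[s] + bip(x1) * y1 + bip(x2) * y2
                if m > new_metric[u]:    # keep only the survivor into each state
                    new_metric[u] = m
                    new_paths[u] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    best = max((0, 1), key=lambda st: metric[st])
    return paths[best]

# Noiseless example: u = (1,0,1,1,0) encodes to x = (11,01,11,10,01),
# i.e. bipolar y = ((-1,-1),(+1,-1),(-1,-1),(-1,+1),(+1,-1))
print(viterbi_hard([(-1, -1), (1, -1), (-1, -1), (-1, 1), (1, -1)]))  # [1, 0, 1, 1, 0]
```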
Hard decisions

y = (+1+1, -1+1, +1-1, +1+1, +1+1, +1+1, +1+1)

λ_j^(m) = x_{1j}^(m) · y_{1j} + x_{2j}^(m) · y_{2j}

[Trellis for j = 0, ..., 7: despite the single channel error, the survivor path (final metric +10) is still the all +1 path.]

û_j = +1 +1 +1 +1 +1 +1 +1   →   no error
Hard decisions

y = (+1+1, -1-1, -1+1, +1+1, +1+1, +1+1, +1+1)

λ_j^(m) = x_{1j}^(m) · y_{1j} + x_{2j}^(m) · y_{2j}

[Trellis for j = 0, ..., 7: the channel errors in adjacent blocks drive the survivor path (final metric +12) away from the transmitted path.]

û_j = +1 -1 -1 +1 +1 +1 +1   →   2 decoding errors
Soft decisions

With channel state information (CSI), each received value is weighted by a reliability factor:

l_ij = 2     if the channel was GOOD for y_ij
l_ij = 1/2   if the channel was BAD for y_ij

λ_j^(m) = x_{1j}^(m) · l_{1j} · y_{1j} + x_{2j}^(m) · l_{2j} · y_{2j}

CSI values: ((G,B), (B,B), (G,G), (G,B), (B,B), (G,G), (G,G))

y = (+1+1, -1-1, -1+1, +1+1, +1+1, +1+1, +1+1)
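The CSI-weighted branch metric can be sketched as follows (illustrative; the weights 2 and 1/2 are the slide's values):

```python
def csi_weight(state):
    """Reliability weight from channel state: GOOD -> 2, BAD -> 1/2."""
    return 2.0 if state == "G" else 0.5

def soft_branch_metric(x, y, csi):
    """lambda = x1*l1*y1 + x2*l2*y2 with bipolar code bits x, received values y,
    and per-bit weights l derived from the CSI tuple."""
    return sum(xb * csi_weight(c) * yb for xb, c, yb in zip(x, csi, y))

# First block of the example: x = (+1,+1), y = (+1,+1), CSI = (G,B)
print(soft_branch_metric((1, 1), (1, 1), ("G", "B")))  # 2.5
```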
[Trellis for j = 0, ..., 7 with weighted branch metrics (values such as ±2.5, ±1.5, ±0.5): the errors received over BAD channel states are down-weighted and no longer mislead the decoder; the final survivor path has metric +18.]

û_j = +1 +1 +1 +1 +1 +1 +1   →   no error