COMP2610/6261 - Information Theory
Lecture 22: Hamming Codes

Mark Reid and Aditya Menon
Research School of Computer Science
The Australian National University

October 15th, 2014
Reminder: Repetition Codes

The repetition code R3:

  s:   0    0    1    0    1    1    0
  t:  000  000  111  000  111  111  000
  η:  000  001  000  000  101  000  000
  r:  000  001  111  000  010  111  000

For a BSC with bit flip probability f = 0.1, R3 drives the per-bit error rate down to ≈ 3%.
For general f, the error probability is f²(3 − 2f).
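The ≈ 3% figure can be checked directly (a small Python sketch, not part of the lecture; the function name `r3_error_prob` is ours):

```python
def r3_error_prob(f):
    # Majority vote over 3 copies fails when 2 or 3 of them are flipped:
    # 3 f^2 (1 - f) + f^3 = f^2 (3 - 2f)
    return 3 * f**2 * (1 - f) + f**3

print(round(r3_error_prob(0.1), 4))  # 0.028, i.e. roughly 3% per source bit
```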
This time

Introduction to block codes
  - An extension of basic repetition codes
The (7,4) Hamming code
Redundancy in (linear) block codes through parity-check bits
Syndrome decoding
1. Motivation
2. The (7,4) Hamming code
   - Coding
   - Decoding
   - Syndrome Decoding
   - Error Probabilities
3. Wrapping up
Motivation

Goal: communication with a small probability of error and a high rate.
  - Repetition codes introduce redundancy on a per-bit basis.
  - Can we improve on repetition codes?

Idea: introduce redundancy into blocks of data instead.

Block Code
A block code is a rule for encoding a length-K sequence of source bits s
into a length-N sequence of transmitted bits t.

  - Introduce redundancy: N > K
  - We focus on linear codes.

We will introduce a simple type of block code called the (7,4) Hamming code.
An Example: The (7,4) Hamming Code

Consider K = 4 and a source message s = 1 0 0 0.

The repetition code R2 produces
  t = 1 1 0 0 0 0 0 0
The (7,4) Hamming code produces
  t = 1 0 0 0 1 0 1

Redundancy, but not repetition. How are these magic bits computed?
The (7,4) Hamming code: Coding

Consider K = 4, N = 7 and s = 1 0 0 0.
It will help to think of the code in terms of overlapping circles.
The (7,4) Hamming code: Coding

Copy the source bits into the first 4 target bits:
The (7,4) Hamming code: Coding

Set the parity-check bits so that the number of ones within each circle is even:

So we have s = 1 0 0 0 →(encoder)→ t = 1 0 0 0 1 0 1
The (7,4) Hamming code: Coding
It is clear that we have set:
ti = si for i = 1, . . . , 4
t5 = s1 ⊕ s2 ⊕ s3
t6 = s2 ⊕ s3 ⊕ s4
t7 = s1 ⊕ s3 ⊕ s4
where we use modulo-2 arithmetic
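The three parity-check equations translate directly into code (our sketch, not from the slides; the name `hamming74_encode` is a hypothetical choice):

```python
def hamming74_encode(s):
    """Encode 4 source bits into 7 bits of the (7,4) Hamming code.
    The parity bits make the number of ones in each circle even."""
    s1, s2, s3, s4 = s
    t5 = s1 ^ s2 ^ s3   # ^ is XOR, i.e. modulo-2 addition
    t6 = s2 ^ s3 ^ s4
    t7 = s1 ^ s3 ^ s4
    return [s1, s2, s3, s4, t5, t6, t7]

print(hamming74_encode([1, 0, 0, 0]))  # [1, 0, 0, 0, 1, 0, 1]
```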
The (7,4) Hamming code: Coding

In matrix form: t = Gᵀ s, where Gᵀ stacks I4 on top of P:

  Gᵀ = [ 1 0 0 0 ]
       [ 0 1 0 0 ]
       [ 0 0 1 0 ]
       [ 0 0 0 1 ]
       [ 1 1 1 0 ]
       [ 0 1 1 1 ]
       [ 1 0 1 1 ]

and s = (s1, s2, s3, s4)ᵀ.
G is called the Generator matrix of the code.
The Hamming code is linear!
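The matrix form can be checked numerically (our sketch; it builds Gᵀ row by row and multiplies modulo 2):

```python
# G^T stacks I4 on top of P (rows as on the slide); arithmetic is modulo 2.
GT = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [1, 0, 1, 1],
]

def encode(s):
    # t = G^T s (mod 2)
    return [sum(g * b for g, b in zip(row, s)) % 2 for row in GT]

print(encode([1, 0, 0, 0]))  # [1, 0, 0, 0, 1, 0, 1]
```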
The (7,4) Hamming code: Codewords

Each (unique) sequence that can be transmitted is called a codeword.

Codeword examples:
     s      Codeword (t)
   0010     0010111
   0110     0110001
   1010     1010010
   1110     ?

For the (7,4) Hamming code we have a total of 16 codewords.
The other 2⁷ − 2⁴ bit strings immediately imply corruption.
Any two codewords differ in at least three bits:
  - each original bit belongs to at least two circles.
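These codeword properties can be verified exhaustively (our sketch; `encode` implements the parity rules from the coding slides):

```python
from itertools import product

def encode(s):
    # Parity rules from the coding slides: t5, t6, t7 make each circle even
    s1, s2, s3, s4 = s
    return (s1, s2, s3, s4, s1 ^ s2 ^ s3, s2 ^ s3 ^ s4, s1 ^ s3 ^ s4)

codewords = [encode(s) for s in product([0, 1], repeat=4)]
print(len(set(codewords)))  # 16 distinct codewords

# Minimum Hamming distance between any two distinct codewords
dmin = min(sum(a != b for a, b in zip(u, v))
           for u in codewords for v in codewords if u != v)
print(dmin)  # 3
```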
The (7,4) Hamming code: Coding

Write
  Gᵀ = [ G1· G2· G3· G4· ]
where each Gi· is a 7-dimensional bit vector.

Then the transmitted message is
  t = Gᵀ s
    = [ G1· G2· G3· G4· ] s
    = s1 G1· + … + s4 G4·

All codewords can be obtained as linear combinations of the rows of G:

  Codewords = { Σ_{i=1}^{4} αi Gi· },

where αi ∈ {0, 1} and Gi· is the ith row of G.
The (7,4) Hamming code: Decoding

We can encode a length-4 sequence s into a length-7 sequence t using 3 parity-check bits.

t can be corrupted by noise that can flip any of the 7 bits (including the parity-check bits):

  s = 1000
  t = 1000101
  η = 0100000
  r = 1100101

How should we decode r?
  - We could do this exhaustively using the 16 codewords.
  - Assuming a BSC and uniform p(s), get the most probable explanation:
    find the s such that ‖t(s) ⊕ r‖₁ is minimum.

We can get the most probable source vector in a more efficient way.
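The exhaustive decoder can be sketched as follows (our code, not the lecture's; it scans all 16 source vectors and keeps the one whose codeword is nearest to r):

```python
from itertools import product

def encode(s):
    s1, s2, s3, s4 = s
    return (s1, s2, s3, s4, s1 ^ s2 ^ s3, s2 ^ s3 ^ s4, s1 ^ s3 ^ s4)

def decode_exhaustive(r):
    # Return the source s whose codeword t(s) is closest to r in Hamming distance
    return min(product([0, 1], repeat=4),
               key=lambda s: sum(a != b for a, b in zip(encode(s), r)))

# The slide's example: t = 1000101 with the second bit flipped
print(decode_exhaustive((1, 1, 0, 0, 1, 0, 1)))  # (1, 0, 0, 0)
```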
The (7,4) Hamming code: Decoding Example 1

We have s = 1 0 0 0 →(encoder)→ t = 1 0 0 0 1 0 1 →(noise)→ r = 1 1 0 0 1 0 1:

(1) Detect circles with wrong (odd) parity.
  - What bit is responsible for this?
The (7,4) Hamming code: Decoding Example 1

We have s = 1 0 0 0 →(encoder)→ t = 1 0 0 0 1 0 1 →(noise)→ r = 1 1 0 0 1 0 1:

(2) Detect the culprit bit and flip it.

The decoded sequence is ŝ = 1 0 0 0.
The (7,4) Hamming code: Decoding Example 2

We have s = 1 0 0 0 →(encoder)→ t = 1 0 0 0 1 0 1 →(noise)→ r = 1 0 0 0 0 0 1:

(1) Detect circles with wrong (odd) parity.
  - What bit is responsible for this?
The (7,4) Hamming code: Decoding Example 2

We have s = 1 0 0 0 →(encoder)→ t = 1 0 0 0 1 0 1 →(noise)→ r = 1 0 0 0 0 0 1:

(2) Detect the culprit bit and flip it.

The decoded sequence is ŝ = 1 0 0 0.
The (7,4) Hamming code: Decoding Example 3

We have s = 1 0 0 0 →(encoder)→ t = 1 0 0 0 1 0 1 →(noise)→ r = 1 0 1 0 1 0 1:

(1) Detect circles with wrong (odd) parity.
  - What bit is responsible for this?
The (7,4) Hamming code: Decoding Example 3

We have s = 1 0 0 0 →(encoder)→ t = 1 0 0 0 1 0 1 →(noise)→ r = 1 0 1 0 1 0 1:

(2) Detect the culprit bit and flip it.

The decoded sequence is ŝ = 1 0 0 0.
The (7,4) Hamming code: Optimal Decoding Algorithm (Syndrome Decoding)

Given r = r1, …, r7, assume a BSC with small noise level f:

1. Define the syndrome as the length-3 vector z that describes the pattern of
   violations of the parity bits r5, r6, r7.
   - zi = 1 when the ith parity bit does not match the parity of r.
   - Flipping a single bit leads to a different syndrome.
2. Check parity bits r5, r6, r7 and identify the syndrome.
3. Unflip the single bit responsible for this pattern of violation.
   - This syndrome could also have been caused by other noise patterns.

  z:        000   001   010   011   100   101   110   111
  Flip bit: none  r7    r6    r4    r5    r1    r2    r3

The optimal decoding algorithm unflips at most one bit.
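The three-step procedure and its lookup table can be sketched in Python (our code; bit indices are 0-based, so r1 is r[0]):

```python
def decode_syndrome(r):
    r = list(r)
    # Syndrome: which of the three parity checks (circles) fail
    z = (r[0] ^ r[1] ^ r[2] ^ r[4],
         r[1] ^ r[2] ^ r[3] ^ r[5],
         r[0] ^ r[2] ^ r[3] ^ r[6])
    # Lookup table from the slide: syndrome -> bit to unflip (None = no flip)
    flip = {(0, 0, 0): None, (0, 0, 1): 6, (0, 1, 0): 5, (0, 1, 1): 3,
            (1, 0, 0): 4, (1, 0, 1): 0, (1, 1, 0): 1, (1, 1, 1): 2}[z]
    if flip is not None:
        r[flip] ^= 1
    return r[:4]  # decoded source bits

print(decode_syndrome([1, 1, 0, 0, 1, 0, 1]))  # [1, 0, 0, 0]
```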
The (7,4) Hamming code: Optimal Decoding Algorithm (Syndrome Decoding)

When the noise level f on the BSC is small, we may reasonably expect at most a single bit flip in a block of 7 transmitted bits.
The syndrome decoding method exactly recovers the source message in this case.
(cf. noise flipping one bit in the repetition code R3.)
But what happens if the noise flips more than one bit?
The (7,4) Hamming code: Decoding Example 4: Flipping 2 Bits

We have s = 1 0 0 0 →(encoder)→ t = 1 0 0 0 1 0 1 →(noise)→ r = 1 0 1 0 1 0 0:

(1) Detect circles with wrong (odd) parity.
  - What bit is responsible for this?
The (7,4) Hamming code: Decoding Example 4: Flipping 2 Bits

We have s = 1 0 0 0 →(encoder)→ t = 1 0 0 0 1 0 1 →(noise)→ r = 1 0 1 0 1 0 0:

(2) Detect the culprit bit and flip it.

The decoded sequence is ŝ = 1 1 1 0.
  - We have made 3 errors, but only 2 involve the actual message.
The (7,4) Hamming code: Syndrome Decoding in Matrix Form

Recall that we just need to compare the expected parity bits with the actual ones we received:

  z1 = (r1 ⊕ r2 ⊕ r3) − r5
  z2 = (r2 ⊕ r3 ⊕ r4) − r6
  z3 = (r1 ⊕ r3 ⊕ r4) − r7,

but in modulo-2 arithmetic −1 ≡ 1, so we can replace − with ⊕, and we have:

  z = H r,  with  H = [ P I3 ] = [ 1 1 1 0 1 0 0 ]
                                 [ 0 1 1 1 0 1 0 ]
                                 [ 1 0 1 1 0 0 1 ]

What is the syndrome for a codeword?
The (7,4) Hamming code: Syndrome Decoding in Matrix Form

Recall that we obtain a codeword via t = Gᵀs.
Assume we receive r = t + η with η = 0. The syndrome is

  z = H r = H t = H Gᵀ s = 0.

This is because H Gᵀ = P + P = 0 (modulo 2).
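That HGᵀ = 0 (modulo 2) can be confirmed by direct multiplication (our sketch):

```python
GT = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1],
      [1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 1, 1]]
H = [[1, 1, 1, 0, 1, 0, 0],
     [0, 1, 1, 1, 0, 1, 0],
     [1, 0, 1, 1, 0, 0, 1]]

# Every entry of H G^T is even, so the syndrome of any codeword is 0
HGt = [[sum(H[i][k] * GT[k][j] for k in range(7)) % 2 for j in range(4)]
       for i in range(3)]
print(HGt)  # [[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
```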
The (7,4) Hamming code: Syndrome Decoding in Matrix Form

For the noisy case we have r = Gᵀs + η, so

  z = H r = H Gᵀ s + H η = H η.

Therefore, syndrome decoding boils down to finding the most probable η satisfying Hη = z.
This gives the maximum likelihood decoder.
The (7,4) Hamming code: Error Probabilities

Decoding error: occurs if at least one of the decoded bits ŝi does not match the corresponding source bit si, for i = 1, …, 4.

  p(Block Error):  pB = p(ŝ ≠ s)

  p(Bit Error):    pb = (1/K) Σ_{k=1}^{K} p(ŝk ≠ sk)

  Rate:  R = K/N = 4/7

What is the probability of block error for the (7,4) Hamming code with f = 0.1?
The (7,4) Hamming code: Leading-Term Error Probabilities

Block error: this occurs when 2 or more bits in the block of 7 are flipped.
We can approximate pB by its leading term:

  pB = Σ_{m=2}^{7} C(7, m) f^m (1 − f)^(7−m)
     ≈ C(7, 2) f² = 21 f².
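Evaluating both the exact sum and the leading term at f = 0.1 (our sketch) shows the approximation is rough at this noise level:

```python
from math import comb

f = 0.1
# Exact: a block error occurs whenever the noise flips 2 or more of the 7 bits
pB_exact = sum(comb(7, m) * f**m * (1 - f)**(7 - m) for m in range(2, 8))
pB_approx = comb(7, 2) * f**2  # leading term, 21 f^2

print(round(pB_exact, 3), round(pB_approx, 2))  # 0.15 0.21
```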
The (7,4) Hamming code: Leading-Term Error Probabilities

Bit error: given that a block error occurs, the noise must have corrupted 2 or more bits.
The most probable case is that the noise corrupts 2 bits, which induces 3 errors in the decoded vector:

  p(ŝi ≠ si) ≈ (3/7) pB  for i = 1, …, 7

All bits are equally likely to be corrupted (due to symmetry), so

  pb ≈ (3/7) pB ≈ 9f²
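The approximation pb ≈ 9f² can be checked exactly by averaging over all 2⁷ noise patterns (our sketch; by linearity, decoding the all-zeros message suffices):

```python
from itertools import product

def encode(s):
    s1, s2, s3, s4 = s
    return [s1, s2, s3, s4, s1 ^ s2 ^ s3, s2 ^ s3 ^ s4, s1 ^ s3 ^ s4]

def decode(r):
    # Syndrome decoding as on the earlier slides (0-based bit indices)
    z = (r[0] ^ r[1] ^ r[2] ^ r[4],
         r[1] ^ r[2] ^ r[3] ^ r[5],
         r[0] ^ r[2] ^ r[3] ^ r[6])
    flip = {(0, 0, 0): None, (0, 0, 1): 6, (0, 1, 0): 5, (0, 1, 1): 3,
            (1, 0, 0): 4, (1, 0, 1): 0, (1, 1, 0): 1, (1, 1, 1): 2}[z]
    r = list(r)
    if flip is not None:
        r[flip] ^= 1
    return r[:4]

f = 0.01
s = [0, 0, 0, 0]  # by linearity, the error rate is the same for every message
t = encode(s)
pb = 0.0
for eta in product([0, 1], repeat=7):  # average over all noise patterns
    p = 1.0
    for e in eta:
        p *= f if e else (1 - f)
    s_hat = decode([a ^ b for a, b in zip(t, eta)])
    pb += p * sum(a != b for a, b in zip(s_hat, s)) / 4

print(round(pb, 6), round(9 * f**2, 6))  # pb is close to its estimate 9 f^2
```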
What Can Be Achieved with Hamming Codes?

[Figure: pb versus rate R, on linear and log scales, for the repetition codes R1, R3, R5, the Hamming code H(7,4), and the BCH codes BCH(15,7), BCH(31,16), BCH(511,76) and BCH(1023,101); the BCH codes sit in the "more useful codes" region of higher rate and lower pb.]
H(7,4) improves pb at a moderate rate R = 4/7
BCH codes are a generalization of Hamming codes.
BCH codes do better than RN, but the results are still pretty depressing.
Can we do better? What is achievable / nonachievable?
Summary

The (7,4) Hamming code
Redundancy in (linear) block codes through parity-check bits
Syndrome decoding via identification of single-bit noise patterns
Block error, bit error, rate

Reading: MacKay, §1.2–§1.5