EE410 Information Theory

Module Name: Information Theory
Module Code: EE410
Version: 2012–2013b
Last Reviewed: 18th Jan. 2013
Module Co-ordinator: Refer to Excel document Module_Co-ordinators
Department: Electronic Engineering
Module Level: 4
Credit rating: 5 ECTS Credits
Pre-requisites: EE304 Probability and Statistics; EE306 Modulation and Coding Techniques
Co-requisites: None
Aims
To give an introduction to source and channel coding.
To develop the basic information theory to underpin these topics.
Learning Outcomes
At the end of this module, the student will be able to:
1. Perform source coding on a stationary data source with known statistics using Huffman coding, Shannon coding, and arithmetic coding.
2. Calculate the entropy (self, mutual, conditional) of a random variable and interpret the result in terms of information content and theoretical limits to source coding performance.
3. Perform data compression of a stationary data source with unknown statistics.
4. Calculate the channel capacity of simple noisy channels, including the binary symmetric case, and interpret the result in terms of theoretical limits to channel coding performance (outcomes 1, 2 and 4 are illustrated in the sketch after this list).
5. Design linear block codes (including Hamming and cyclic codes) and determine the error correcting capability of such codes.
6. Implement syndrome decoding for block codes.
7. Implement simple convolutional codes using state transition and trellis diagrams.
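As an informal illustration of the style of calculation targeted by outcomes 1, 2 and 4, the minimal Python sketch below computes the entropy of a small discrete source, builds a Huffman code for it, and evaluates the capacity of a binary symmetric channel. It is not part of the official module material; the source alphabet, the symbol probabilities and the crossover probability p = 0.1 are assumed values chosen purely for demonstration.

# Illustrative sketch only (not part of the module descriptor): source entropy,
# Huffman coding and binary symmetric channel (BSC) capacity.
import heapq
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p log2 p, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_code(probs):
    """Return a prefix code {symbol: bitstring} for a dict of probabilities."""
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - H2(p) bits per use."""
    return 1.0 - entropy([p, 1.0 - p])

if __name__ == "__main__":
    source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # assumed example
    code = huffman_code(source)
    avg_len = sum(source[s] * len(w) for s, w in code.items())
    print("H(X)           =", entropy(source.values()), "bits/symbol")
    print("Huffman code   =", code)
    print("Average length =", avg_len, "bits/symbol")
    print("BSC capacity   =", bsc_capacity(0.1), "bits/use (p = 0.1 assumed)")

For the assumed dyadic source the average Huffman codeword length equals the entropy of 1.75 bits/symbol, which is the theoretical lower bound referred to in outcome 2.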
Time Allowance for Constituent Elements
Lectures: 24 hours
Tutorials: 10 hours
Assignments (2 x 5hr): 10 hours
Quiz: 2 hours
Independent study: 77 hours
Semester Examination: 2 hours
Indicative Syllabus
Introduction – The meaning of Information Theory, Information, Uncertainty and Random Variables, Sources of Information, Information Transfer, Mathematical Description of Information
Information Sources – The discrete source, discrete source types, modelling the continuous source, capacity of source types
Source Coding – The Kraft Inequality, McMillan Inequality, Shannon-Fano-Elias coding, Huffman coding, Arithmetic coding, Lempel-Ziv compression
Entropy – Self Information, Mutual Information, Entropy rate, Markov processes
Introduction to Channel Coding – Noisy channels, mutual information and channel capacity, error probability, information rate of a code
Shannon’s Channel Coding Theorem
Introduction to Block Codes – Hamming distance, minimum distance decoding, the Packing bound, Gilbert-Varshamov bound
Linear Block Codes – generator matrix, parity check matrix, constructing 1-error correcting codes, syndrome decoding, perfect codes and Hamming codes (see the sketch after this list)
Cyclic codes – generator and parity check polynomial, generator and parity matrices in systematic form, multiple error-correction and BCH codes
Introduction to convolutional codes – state transition diagrams and trellis diagrams
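To make the block coding entries above concrete, the following minimal Python sketch (again illustrative rather than part of the official syllabus) encodes a 4-bit message with a systematic (7,4) Hamming code and corrects a single flipped bit by syndrome decoding. The particular parity submatrix, example message and error position are assumed choices; any equivalent systematic form would behave the same way.

# Illustrative sketch only (not part of the module descriptor): (7,4) Hamming
# encoding and single-error correction by syndrome decoding over GF(2).
import numpy as np

# Systematic generator matrix G = [I4 | P] and parity-check matrix H = [P^T | I3].
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])
H = np.hstack([P.T, np.eye(3, dtype=int)])

def encode(message):
    """Map a 4-bit message to a 7-bit codeword (arithmetic modulo 2)."""
    return np.mod(np.array(message) @ G, 2)

def decode(received):
    """Correct at most one bit error using the syndrome s = r H^T."""
    r = np.array(received).copy()
    syndrome = np.mod(r @ H.T, 2)
    if syndrome.any():
        # A single-bit error gives a syndrome equal to the corresponding column of H.
        for pos in range(H.shape[1]):
            if np.array_equal(H[:, pos], syndrome):
                r[pos] ^= 1
                break
    return r[:4]   # systematic code: the message is the first four bits

if __name__ == "__main__":
    msg = [1, 0, 1, 1]                 # assumed example message
    cw = encode(msg)
    corrupted = cw.copy()
    corrupted[5] ^= 1                  # flip one bit to simulate a channel error
    print("codeword :", cw)
    print("received :", corrupted)
    print("decoded  :", decode(corrupted))

Because the (7,4) Hamming code has minimum distance 3, it corrects any single bit error, which is the error-correcting capability referred to in outcomes 5 and 6.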
Assessment Criteria
Semester Examination: 70%
Assignments (2): 20%
Quiz: 10%
Penalties: Late submission of assignments will be subject to a penalty of 10% of the assessment grade
for each day (or part thereof) overdue.
Pass Standard and any Special Requirements for Passing Modules: Pass mark 40%. Students are not required to pass the written and continuous assessment components separately; an overall pass mark of 40% is acceptable.
Requirements for Autumn Supplemental Examination: The continuous assessment mark is carried
forward to the Autumn examinations as there is no facility available for repeating the continuous
assessment components of the course.
Continuous Assessment Results: Assignments will be corrected within two weeks, provided this does not extend past the end of the semester. Results and corrected assignments will be available for viewing upon request.
Assessment Philosophy
Information Theory is a fundamental subject in communications and computer engineering. It requires a
strong understanding of probabilistic and algebraic concepts and their applications in the setting of data
compression and transmission.
Weekly tutorials/labs are designed to reinforce material presented in lectures and will provide an informal
assessment of the student’s grasp of all learning outcomes as the module progresses.
There are two formal assignments given throughout the semester; these require the student to read around
the core material of the lectures and will explicitly test learning outcomes 1-2 (Assignment 1) and
outcomes 3-5 (Assignment 2).
The class quiz assesses all learning outcomes with particular emphasis on outcomes 5-7. The final
examination assesses all outcomes.
Course Text
N. L. Biggs, Codes: An Introduction to Information Communication and Cryptography, Springer, 2008.
References
Simon Haykin, Communication Systems, 4th Edition, Wiley, 2000.
Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd Edition, Wiley, 2006.
R. J. McEliece, The Theory of Information and Coding, 2nd Edition, Cambridge University Press, 2002.
Programmes currently utilising module
BE in Electronic Engineering with Computers
BE in Electronic Engineering with Communications