Lecture 1: Introduction
Eirik Rosnes
Department of Informatics, University of Bergen, N-5020 Bergen, Norway
January 12, 2015

Introduction

Book: Informationsteori - grundvalen för (tele-)kommunikation (Information Theory - the Foundation of (Tele-)communication) by Rolf Johannesson. Most of the material in the book will be covered:
▶ Chapter 1: Introduction (all).
▶ Chapter 2: Fundamentals of information theory (all).
▶ Chapter 3: Typical sequences, Shannon's source and channel coding results (all).
▶ Chapter 4: Source coding (all).
▶ Chapter 5: Channel coding (all, except 5.4).
▶ Chapter 6: Multi-user information theory (all, except the Aloha protocol).
▶ Chapter 7: The Gaussian channel (all).
▶ Chapter 8: Cryptography (all, except 8.4).

Alternative reading: Elements of Information Theory by Cover and Thomas.

Course homepage: http://folk.uib.no/st03333/INF144_2015/

There will be one mandatory programming assignment. Exercises are on Fridays from 10:15 to 12:00 in G2103. The group leader is Asieh Mofrad.

Block diagram of a communication system

[Figure: block diagram of a communication system.]

Separation of source and channel coding is (almost always) optimal.

Shannon's results

Shannon's source coding result: every source can be characterized by a parameter H, called the source entropy, such that the source can be represented exactly by R bits per time unit if R > H. Otherwise, if R < H, no such representation exists.

Shannon's channel coding result: every communication channel can be characterized by a parameter C, the channel capacity, such that R bits per time unit can be transmitted over the channel with arbitrarily low error probability if R < C. Otherwise, if R > C, the error probability is bounded away from zero.

▶ What matters is not the signal-to-noise ratio (as long as it is large enough that R < C), but the way the information is encoded.
▶ You should not send information bit by bit, but instead encode long sequences in such a way that each information bit influences several transmitted bits.
▶ This was a completely new idea, introduced by Shannon in 1948.
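To make the parameters H and C concrete, here is a minimal Python sketch (added for illustration; not part of the original slides). It evaluates the standard formulas H = -Σ p_i log2(p_i) for a discrete memoryless source and C = 1 - h(p) for a binary symmetric channel (BSC) with crossover probability p, where h is the binary entropy function. The example distribution and crossover probability are arbitrary choices.

```python
import math

def entropy(probs):
    """Entropy H = -sum p_i * log2(p_i) of a discrete memoryless source, in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity C = 1 - h(p) of a binary symmetric channel with crossover probability p."""
    return 1.0 - entropy([p, 1.0 - p])

# A source with distribution (1/2, 1/4, 1/4) has H = 1.5 bits/symbol, so by
# Shannon's source coding result it can be represented at any rate R > 1.5.
print(entropy([0.5, 0.25, 0.25]))  # 1.5

# A BSC with crossover probability 0.11 has C ~ 0.5 bits per channel use, so
# reliable transmission is possible at any rate R < 0.5 and at no rate above it.
print(bsc_capacity(0.11))          # ~ 0.500
```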
Coding theory

Shannon's channel coding result gave birth to the field of coding theory.
In the 1950s and 1960s, BCH and Reed-Solomon codes were invented.
Convolutional codes came in the 1960s and 1970s.
Trellis-coded modulation (TCM) came in the 1980s.
Turbo and low-density parity-check (LDPC) codes came in the 1990s.

Multi-user information theory

Multi-user information theory has applications to networks.
▶ Multiple-access channel (MAC): completely solved, Ahlswede and Liao (1972).
▶ Broadcast channel (BC): only solved in special cases, Cover (1972).
▶ Relay channel: not solved in general, believed to be very hard.
▶ The two-way channel, introduced by Shannon (1961): not solved in the general case.
A general multi-user information theory is still missing and remains a topic of active research.

Cryptography

▶ Secrecy.
▶ Key distribution: solved by the Diffie-Hellman protocol.
▶ Authentication.
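Since the slides name the Diffie-Hellman protocol without detail, here is a minimal sketch of the exchange (an illustration, not from the lecture). The parameters p = 23 and g = 5 are toy textbook values; real deployments use standardized groups of 2048+ bits (e.g., the RFC 3526 groups) or elliptic-curve variants.

```python
import secrets

# Toy Diffie-Hellman key exchange. The public parameters are a prime modulus p
# and a generator g; these textbook values are for illustration only, NOT secure.
p, g = 23, 5

# Each party picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1   # Alice's secret, in {1, ..., p-2}
b = secrets.randbelow(p - 2) + 1   # Bob's secret
A = pow(g, a, p)                   # Alice sends A to Bob
B = pow(g, b, p)                   # Bob sends B to Alice

# Both sides derive the same shared secret g^(a*b) mod p, while an
# eavesdropper observes only p, g, A, and B.
assert pow(B, a, p) == pow(A, b, p)
print("shared secret:", pow(B, a, p))
```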