EE 452 Digital Coding of Waveforms
Catalog Data: Information content; conditional, joint, and mutual entropy. Binary symmetric channels; channels with and without memory. Source coding algorithms and rate-distortion bounds. Channel capacity and Shannon's law. Block codes, cyclic codes, convolutional codes.
Textbook: Jan C. A. van der Lubbe, Information Theory, Cambridge University Press, 1997.
References:
1. Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd ed., John Wiley and Sons, Inc., 2006.
2. Raymond W. Yeung, Information Theory and Network Coding, Springer, 2008.
3. Robert Ash, Information Theory, Dover Publications.
4. Darrel Hankerson, Greg A. Harris and Peter D. Johnson, Jr., Introduction to Information Theory and Data Compression, 2nd ed., Chapman & Hall/CRC, 2003.
5. Fazlollah M. Reza, An Introduction to Information Theory, Dover Publications, 1994.
6. R. J. McEliece, The Theory of Information and Coding, 2nd ed., Cambridge University Press, 2004.
7. R. Gallager, Information Theory and Reliable Communication, John Wiley and Sons.
8. N. Abramson, Information Theory and Coding.
Instructor: Dr. Mustafa A. Altinkaya, Associate Professor
Class Hours: To be determined
Prerequisites: EE 333 Probability and Random Processes
Topics:
1. Discrete information
The origin of information theory
The concept of probability
Shannon’s information measure
Conditional, joint and mutual information
Axiomatic foundations
The communication model
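The core quantities of this first topic can be illustrated with a minimal sketch (not part of the textbook; the function names and example distributions are illustrative):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a probability distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) in bits, from a joint distribution given as a 2-D list."""
    px = [sum(row) for row in joint]              # marginal of X
    py = [sum(col) for col in zip(*joint)]        # marginal of Y
    return sum(pxy * math.log2(pxy / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, pxy in enumerate(row) if pxy > 0)

# A fair coin carries exactly one bit of information.
print(entropy([0.5, 0.5]))                                # 1.0
# Independent X and Y share no information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))   # 0.0
```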
2. The discrete memoryless information source
The discrete information source
Source coding
Coding strategies
Most probable messages
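One coding strategy covered here is Huffman coding; the following is a minimal sketch (the heap-of-partial-codebooks representation is one of several standard implementations, not the textbook's):

```python
import heapq

def huffman_code(probs):
    """Binary Huffman code for {symbol: probability}; returns {symbol: codeword}."""
    # Heap entries: (probability, tiebreak counter, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # two least probable groups
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
codes = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in codes.items())
print(avg_len)  # 1.75 -- equals the source entropy, since the probabilities are dyadic
```

For dyadic probabilities the average codeword length meets the entropy bound exactly, which makes this a convenient sanity check.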
3. The discrete information source with memory
Markov processes
The information of a discrete source with memory
Coding aspects
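The entropy rate of a stationary Markov source can be computed directly from its transition matrix and stationary distribution; a minimal sketch with a hypothetical two-state chain:

```python
import math

def entropy_rate(P, pi):
    """Entropy rate (bits/symbol) of a stationary Markov source with
    transition matrix P (rows sum to 1) and stationary distribution pi."""
    return -sum(pi[i] * p * math.log2(p)
                for i, row in enumerate(P)
                for p in row if p > 0)

# Hypothetical binary source that stays in its current state w.p. 0.9;
# by symmetry its stationary distribution is uniform.
P = [[0.9, 0.1], [0.1, 0.9]]
pi = [0.5, 0.5]
print(entropy_rate(P, pi))  # ~0.469 bits/symbol, well below the 1 bit of a memoryless fair source
```

Memory reduces the per-symbol information, which is what the coding aspects of this topic exploit.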
4. The discrete communication channel
Capacity of noiseless channels
Capacity of noisy channels
Error probability and equivocation
Coding theorem for discrete memoryless channels
Cascading of channels
Channels with memory
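For the binary symmetric channel of the catalog description, the capacity has the closed form C = 1 - H(eps); a minimal sketch:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps:
    C = 1 - H(eps) bits per channel use."""
    return 1.0 - h2(eps)

print(bsc_capacity(0.0))   # 1.0 -- noiseless channel
print(bsc_capacity(0.11))  # ~0.5 -- about half a bit per use survives
print(bsc_capacity(0.5))   # 0.0 -- output independent of input
```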
5. The continuous information source
Probability density functions
Stochastic signals
The continuous information measure
Information measures and sources with memory
Information power
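The information power (entropy power) of this topic can be sketched numerically; the Gaussian case, where the entropy power equals the variance, makes a convenient check (function names are illustrative):

```python
import math

def gaussian_diff_entropy(var):
    """Differential entropy h(X) = 0.5*log2(2*pi*e*var) bits of N(0, var)."""
    return 0.5 * math.log2(2 * math.pi * math.e * var)

def entropy_power(h):
    """Entropy power of a source with differential entropy h bits:
    N = 2**(2h) / (2*pi*e)."""
    return 2 ** (2 * h) / (2 * math.pi * math.e)

# For a Gaussian source the entropy power equals the variance.
print(entropy_power(gaussian_diff_entropy(4.0)))  # ~4.0
```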
6. The continuous communication channel
The capacity of continuous communication channels
The capacity in the case of additive Gaussian white noise
Capacity bounds in the case of non-Gaussian white noise
Channel coding theorem
The capacity of a Gaussian channel with memory
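The Shannon capacity formulas of this topic are easy to evaluate; a minimal sketch (the example bandwidth and SNR values are illustrative):

```python
import math

def awgn_capacity(snr):
    """C = 0.5*log2(1 + SNR) bits per (real) channel use of the
    discrete-time additive white Gaussian noise channel."""
    return 0.5 * math.log2(1 + snr)

def band_limited_capacity(bandwidth_hz, snr):
    """C = W*log2(1 + SNR) bits/second for a band-limited AWGN channel."""
    return bandwidth_hz * math.log2(1 + snr)

print(awgn_capacity(15))                  # 2.0 bits per channel use
print(band_limited_capacity(3000, 1023))  # 30000.0 bits/s for 3 kHz at SNR 1023
```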
7. Rate distortion theory
The discrete rate distortion function
Properties of the R(D) function
The binary case
Source coding and information transmission theorems
The continuous rate distortion function
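The binary case of this topic has the closed form R(D) = H(p) - H(D) under Hamming distortion; a minimal sketch:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def binary_rd(p, d):
    """Rate-distortion function of a Bernoulli(p) source, p <= 0.5, under
    Hamming distortion: R(D) = H(p) - H(D) for 0 <= D < p, else 0."""
    return h2(p) - h2(d) if d < p else 0.0

print(binary_rd(0.5, 0.0))   # 1.0 -- lossless coding of a fair bit
print(binary_rd(0.5, 0.11))  # ~0.5 -- half a bit suffices at 11% bit-error distortion
print(binary_rd(0.5, 0.5))   # 0.0 -- random guessing already achieves D = p
```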
8. Error-correcting codes
Introduction
Linear block codes
Syndrome coding
Hamming codes
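Syndrome decoding of the (7,4) Hamming code can be sketched in a few lines; this ordering of the parity-check matrix (column j is the binary representation of j) is one standard convention, chosen here so the syndrome directly names the error position:

```python
H = [  # parity-check matrix: column j reads j in binary, top bit first
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]

def syndrome(word):
    """s = H * r^T over GF(2); a nonzero s points at the error bit."""
    return [sum(h * r for h, r in zip(row, word)) % 2 for row in H]

def correct(word):
    """Correct a single bit error in a received 7-bit word (list of 0/1)."""
    s = syndrome(word)
    pos = s[0] * 4 + s[1] * 2 + s[2]  # syndrome read as a binary number
    if pos:
        word = word[:]
        word[pos - 1] ^= 1            # flip the indicated (1-indexed) bit
    return word

codeword = [1, 0, 1, 0, 1, 0, 1]      # a valid codeword: its syndrome is zero
received = codeword[:]
received[2] ^= 1                      # inject one bit error at position 3
print(correct(received) == codeword)  # True
```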
9. Network information theory (TENTATIVE)
Introduction
Multi-access communication channel
Broadcast channels
Two-way channels
Homework: There will be 4-6 homework assignments (which will not be collected).
Grading: 1 or 2 Midterm Exams (28% each), Term Paper (28%, tentative), Final Exam (44%)