EE 452 Digital Coding of Waveforms


Catalog Data: Information content; conditional, joint, and mutual entropy. Binary symmetric channels; channels with and without memory. Source coding algorithms and rate-distortion bounds. Channel capacity and Shannon's law. Block codes, cyclic codes, convolutional codes.
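
As a quick, informal illustration (not part of the official course material) of two quantities named in the catalog description, the short Python sketch below computes the binary entropy H(p) of a memoryless binary source and the capacity C = 1 - H(eps) of a binary symmetric channel with crossover probability eps.

# Illustrative sketch only: binary entropy and binary symmetric channel capacity.
from math import log2

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits per symbol."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(eps: float) -> float:
    """Capacity of a binary symmetric channel, C = 1 - H(eps), in bits per channel use."""
    return 1.0 - binary_entropy(eps)

if __name__ == "__main__":
    print(binary_entropy(0.5))  # 1.0 bit: a fair coin flip is maximally uncertain
    print(bsc_capacity(0.1))    # about 0.531 bits per channel use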

Textbook: Jan C. A. van der Lubbe, Information Theory, Cambridge University Press, 1997.

References:

1. Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd ed., John Wiley & Sons, 2006.

2. Raymond W. Yeung, Information Theory and Network Coding, Springer, 2008.

3. Robert Ash, Information Theory, Dover Publications.

4. Darrel Hankerson, Greg A. Harris, and Peter D. Johnson, Jr., Introduction to Information Theory and Data Compression, 2nd ed., Chapman & Hall/CRC, 2003.

5. Fazlollah M. Reza, An Introduction to Information Theory, Dover Publications, 1994.

6. R. J. McEliece, The Theory of Information and Coding, 2nd ed., Cambridge University Press, 2004.

7. R. G. Gallager, Information Theory and Reliable Communication, John Wiley & Sons, 1968.

8. Norman Abramson, Information Theory and Coding, McGraw-Hill, 1963.

Instructor: Dr. Mustafa A. Altinkaya, Associate Professor
Class Hours: To be determined
Prerequisites: EE 333 Probability and Random Processes

Topics:

1. Discrete information

2. The discrete memoryless information source

3. The discrete information source with memory

4. The discrete communication channel

5. The continuous information source

6. The continuous communication channel

7. Rate distortion theory

8. Error-correcting codes

9. Network information theory (TENTATIVE)



Homework: There will be 4-6 homework assignments, which will not be collected.

Grading: 1 or 2 midterm exams (28% each), a term paper (28%, tentative), and a final exam (44%).


Mustafa A. ALTINKAYA 2018