Section 1: C. E. Shannon: His life, work, and influence on modern communications.
Section 2: Information measures and basic properties: Entropy, mutual information, Kullback-Leibler divergence, convexity (see the entropy sketch after the section list).
Section 3: Typicality and the asymptotic equipartition property.
Section 4: Stationary (ergodic) sources and entropy rate.
Section 5: Lossless source compression, prefix codes, fundamental compression limits based on entropy rate, Shannon and Huffman codes (see the Huffman sketch below).
Section 6: Channel capacity examples (binary symmetric channel, erasure channel) and properties, statement and proof of the channel coding theorem for discrete memoryless channels, achievability, joint typicality.
Section 7: Fano's inequality and the converse theorem, feedback capacity.
Section 8: Continuous-time sources and channels, differential entropy, mutual information and properties, entropy of a Gaussian random vector.
Section 9: Additive Gaussian channel and its capacity, typicality, coding theorem, bandlimited channel capacity.
Section 10: Parallel Gaussian channels, the colored noise channel, power allocation for rate maximization, water-filling method (see the water-filling sketch below).
Section 11: Source coding with stream codes, arithmetic coding, and Lempel-Ziv coding (see the LZ78 sketch below).
Section 12: Introduction to rate-distortion theory and lossy compression.
Section 13: Linear codes, description and encoding, Hamming codes (see the Hamming-code sketch below).
Section 14: Introduction to statistical signal processing and the Wald test.
Section 15: Fundamental information-theoretic inequalities and applications to communication networks.
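
The sketches below are illustrative only and not part of the course materials; the distributions, parameters, and function names are invented for the examples. This first one, for Section 2, computes entropy, Kullback-Leibler divergence, and mutual information for finite pmfs. The joint pmf used is a binary symmetric channel with crossover probability 0.1 driven by a uniform input, so the reported mutual information, 1 - H_2(0.1) (about 0.531 bits), is also the BSC capacity discussed in Section 6.

    import numpy as np

    def entropy(p):
        """Shannon entropy H(p) in bits; zero-probability terms contribute 0."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def kl_divergence(p, q):
        """D(p || q) in bits; assumes q > 0 wherever p > 0."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0
        return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

    def mutual_information(p_xy):
        """I(X;Y) = D(p(x,y) || p(x)p(y)) for a joint pmf given as a 2-D array."""
        p_xy = np.asarray(p_xy, dtype=float)
        p_x = p_xy.sum(axis=1, keepdims=True)
        p_y = p_xy.sum(axis=0, keepdims=True)
        return kl_divergence(p_xy.ravel(), (p_x * p_y).ravel())

    # Uniform input over a binary symmetric channel with crossover 0.1.
    eps = 0.1
    p_xy = 0.5 * np.array([[1 - eps, eps], [eps, 1 - eps]])
    print(entropy([0.5, 0.5]))        # 1.0 bit
    print(mutual_information(p_xy))   # about 0.531 bits = 1 - H_2(0.1)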
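
For Section 5, a minimal Huffman-coding sketch (again hypothetical, not course code): it builds a binary Huffman code with a min-heap and compares the expected codeword length with the source entropy; for the dyadic pmf used here both equal 1.75 bits/symbol.

    import heapq
    import math

    def huffman_code(pmf):
        """Return a prefix-free bit-string codeword for each symbol of pmf (dict: symbol -> probability)."""
        heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(pmf.items())]
        heapq.heapify(heap)
        counter = len(heap)                      # tie-breaker so dicts are never compared
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)      # two least probable subtrees
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, counter, merged))
            counter += 1
        return heap[0][2]

    pmf = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman_code(pmf)
    avg_len = sum(pmf[s] * len(w) for s, w in code.items())
    H = -sum(p * math.log2(p) for p in pmf.values())
    print(code)               # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (tie-breaking may differ)
    print(avg_len, H)         # 1.75 1.75 -- dyadic pmf, so the entropy bound is met exactly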
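
For Section 10, a hypothetical water-filling sketch (the noise levels and power budget are made up): channel k receives power P_k = max(mu - N_k, 0), with the water level mu chosen so the powers sum to the budget; each active channel then contributes (1/2) log2(1 + P_k / N_k) to the rate, the Gaussian-channel capacity formula of Section 9.

    import numpy as np

    def water_filling(noise, total_power):
        """Return (water level, power allocation), with powers listed in order of increasing noise."""
        noise = np.sort(np.asarray(noise, dtype=float))
        for k in range(len(noise), 0, -1):       # try the k least noisy channels, largest k first
            mu = (total_power + noise[:k].sum()) / k
            if mu > noise[k - 1]:                # all k active powers are strictly positive
                return mu, np.maximum(mu - noise, 0.0)
        raise ValueError("no feasible allocation (is total_power positive?)")

    noise = [1.0, 4.0, 9.0]
    mu, powers = water_filling(noise, total_power=6.0)
    rates = 0.5 * np.log2(1 + powers / np.sort(noise))
    print(mu, powers)          # 5.5, [4.5, 1.5, 0.0] -- the noisiest channel gets no power
    print(rates.sum())         # achievable sum rate in bits per channel use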
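
For Section 11, a hypothetical LZ78 sketch: the input string is parsed into (dictionary index, next symbol) pairs, the dictionary-growing idea behind Lempel-Ziv coding.

    def lz78_parse(s):
        """Parse s into (index of longest previously seen phrase, next symbol) pairs."""
        dictionary = {"": 0}            # phrase -> index; index 0 is the empty phrase
        output, phrase = [], ""
        for ch in s:
            if phrase + ch in dictionary:
                phrase += ch            # keep extending the current phrase
            else:
                output.append((dictionary[phrase], ch))
                dictionary[phrase + ch] = len(dictionary)
                phrase = ""
        if phrase:                      # flush a trailing phrase that is already in the dictionary
            output.append((dictionary[phrase[:-1]], phrase[-1]))
        return output

    print(lz78_parse("abababbba"))
    # [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'b'), (2, 'a')]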
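
For Section 13, a hypothetical sketch of the (7,4) Hamming code in systematic form: encoding by a generator matrix G and single-error correction by syndrome decoding with the parity-check matrix H.

    import numpy as np

    G = np.array([[1, 0, 0, 0, 1, 1, 0],     # systematic generator: codeword = [message | parity]
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],     # parity-check matrix; H @ G.T = 0 (mod 2)
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def encode(msg):
        return (np.array(msg) @ G) % 2

    def decode(word):
        """Correct at most one bit flip by matching the syndrome to a column of H."""
        word = np.array(word)
        syndrome = (H @ word) % 2
        if syndrome.any():
            err_pos = next(j for j in range(7) if np.array_equal(H[:, j], syndrome))
            word[err_pos] ^= 1
        return word[:4]                       # systematic code: first 4 bits are the message

    msg = [1, 0, 1, 1]
    cw = encode(msg)
    cw[5] ^= 1                                # flip one bit in transmission
    print(decode(cw))                         # recovers [1 0 1 1]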
- Suggested bibliography:
T. M. Cover, J. A. Thomas, Elements of Information Theory, Wiley-Interscience, 2nd edition, 2006. (A free hardcopy is available.)
Y. Polyanskiy, Y. Wu, Information Theory: From Coding to Learning, Cambridge University Press, 2025.
- Related academic journals:
IEEE Transactions on Information Theory
IEEE Journal on Selected Areas in Information Theory
IEEE BITS the Information Theory Magazine
IEEE Transactions on Information Forensics and Security
Annals of Statistics