Information Theory and Coding

General: Information Theory and Coding

Course schedule

Type       Day      Hours          Room
Lecture    Tuesday  13h15 – 15h00  CO 1
Lecture    Friday   13h15 – 15h00  CM 3
Exercises  Tuesday  15h15 – 16h00  CO 1
Exercises  Friday   15h15 – 16h00  CM 3

Course Mechanics

Weekly Assignments
Midterm Exam: 50% (tentative date: Friday, November 9)
Final Exam: 50%

More Information:

course information sheet

Homework Assignments: Information Theory and Coding

Histogram for the Midterm Exam

0 – 9: + + + + +
10 – 19: + + + + + + + + +
20 – 29: + + + + + + + + + + + + + + + + +
30 – 39: + + + + + + + + + + + +
40 – 49: + + + + + + + + + + + + + + + + + + +
50 – 59: + + + + + + + + + + + +
60 – 69: + + + + + + + +
70 – 79: + + + + + + + + +
80 – 89: + + + + + +
90 – 99: + + + +
100 – 109: + +

Coverage and Lecture notes: Information Theory and Coding

The relevant Cover & Thomas chapters and sections are listed for each lecture; they do not cover everything taught in class.


Friday, Sep 21 Source Coding: singular, uniquely decodable, prefix-free codes. Kraft’s inequality and its converse for prefix-free codes, entropy and average codeword length.
Cover & Thomas: Ch. 5, Sections 5.1, 5.2, 5.3, 5.4, 5.5
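
For reference, the two results named here (standard statements; see the listed sections): a binary prefix-free code with codeword lengths \ell_1, \dots, \ell_m exists if and only if

  \[ \sum_{i=1}^{m} 2^{-\ell_i} \le 1 \quad \text{(Kraft's inequality)}, \]

and the optimal expected codeword length L^* satisfies H(X) \le L^* < H(X) + 1.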

Tuesday, Sep 25 Source Coding: Huffman's procedure for optimal coding, properties of optimal codes, proof of optimality of Huffman's procedure, Kraft's inequality for uniquely decodable codes.
Cover & Thomas: Ch. 5, Sections 5.5, 5.6, 5.8
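
A minimal Python sketch of Huffman's procedure for a binary alphabet (illustrative only; the function name and tie-breaking rule are my own choices, not the course's):

  import heapq

  def huffman_code(probs):
      """Build a binary Huffman code for probs: symbol -> probability."""
      # Heap entries are (probability, counter, tree); the counter breaks ties.
      heap = [(p, i, s) for i, (s, p) in enumerate(probs.items())]
      heapq.heapify(heap)
      count = len(heap)
      while len(heap) > 1:
          p1, _, t1 = heapq.heappop(heap)      # merge the two least
          p2, _, t2 = heapq.heappop(heap)      # probable subtrees
          heapq.heappush(heap, (p1 + p2, count, (t1, t2)))
          count += 1
      code = {}
      def walk(tree, prefix):
          if isinstance(tree, tuple):          # internal node
              walk(tree[0], prefix + "0")
              walk(tree[1], prefix + "1")
          else:                                # leaf: record the codeword
              code[tree] = prefix or "0"
      walk(heap[0][2], "")
      return code

  print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
  # {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (up to relabeling)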

Friday, Sep 28 Source Coding: entropy of a source, properties of entropy, chain rule of entropy, mutual information.
Cover & Thomas: Ch. 2, Sections 2.1, 2.2, 2.3, 2.4, 2.5
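
For reference, the standard definitions from this lecture:

  \[ H(X) = -\sum_x p(x)\log p(x), \qquad H(X,Y) = H(X) + H(Y \mid X), \]
  \[ I(X;Y) = H(X) - H(X \mid Y) = \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)}. \]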

Tuesday, Oct 2 Source Coding: binary entropy, upper bound on entropy, properties of mutual information, distance between distributions and its relation to mutual information, convex sets and functions.
Cover & Thomas: Ch. 2, Section 2.6
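
The key quantities here, stated for reference:

  \[ h(p) = -p\log p - (1-p)\log(1-p), \qquad D(p\,\|\,q) = \sum_x p(x)\log\frac{p(x)}{q(x)}, \]

with H(X) \le \log|\mathcal{X}| (equality iff X is uniform) and I(X;Y) = D(p(x,y)\,\|\,p(x)p(y)) \ge 0.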

Friday, Oct 5 Source Coding: concavity of entropy, chain rule of mutual information, conditional entropy and mutual information, chain rule for conditional entropy and mutual information.
Cover & Thomas: Ch. 2, same sections as above

Tuesday, Oct 9 Source Coding: conditional mutual information, chain rule for conditional mutual information, stationary processes, entropy rate for stationary processes, Markov chains, entropy rate for Markov chains.
Cover & Thomas: Ch. 4, Sections 4.1, 4.2
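
For a stationary Markov chain with transition matrix P and stationary distribution \mu, the entropy rate reduces to (standard result, for reference):

  \[ H(\mathcal{X}) = \lim_{n\to\infty}\tfrac{1}{n}H(X_1,\dots,X_n) = H(X_2 \mid X_1) = -\sum_i \mu_i \sum_j P_{ij}\log P_{ij}. \]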

Friday, Oct 12 Source Coding: weak and strong laws of large numbers, asymptotic equipartition property (AEP), typical sequences, properties of typical sequences, source coding via AEP, definition of a dictionary.
Cover & Thomas: Ch. 3, Sections 3.1, 3.2
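
For reference, the typical set and the properties that drive source coding via the AEP:

  \[ A_\epsilon^{(n)} = \Big\{ x^n : \big| -\tfrac{1}{n}\log p(x^n) - H(X) \big| \le \epsilon \Big\}, \]

with \Pr\{X^n \in A_\epsilon^{(n)}\} \to 1 and |A_\epsilon^{(n)}| \le 2^{n(H(X)+\epsilon)}, so indexing the typical sequences gives a code of rate roughly H(X) + \epsilon.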

Tuesday, Oct 16 Review of Probability Theory

Friday, Oct 19
Tuesday, Oct 23 Source Coding: Tunstall coding (notes)
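
A minimal Python sketch of the Tunstall dictionary construction for variable-to-fixed-length coding (illustrative only; the names are my own, not from the course notes):

  import heapq

  def tunstall_dictionary(probs, max_words):
      """Grow a complete prefix-free dictionary by repeatedly expanding
      the most probable word into its single-letter extensions."""
      heap = [(-p, s) for s, p in probs.items()]   # max-heap via negation
      heapq.heapify(heap)
      # Each expansion removes 1 word and adds len(probs) words.
      while len(heap) + len(probs) - 1 <= max_words:
          p, word = heapq.heappop(heap)            # most probable word
          for s, ps in probs.items():
              heapq.heappush(heap, (p * ps, word + s))
      return sorted(word for _, word in heap)

  print(tunstall_dictionary({"a": 0.7, "b": 0.3}, 4))
  # ['aaa', 'aab', 'ab', 'b']: 4 source words, each mapped to 2 bits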

Friday, Oct 26
Tuesday, Oct 30 Source Coding: Lempel-Ziv
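
The entry does not name a variant; here is a minimal Python sketch of LZ78-style parsing, assuming that is the flavor intended (identifiers are illustrative):

  def lz78_parse(s):
      """Split s into phrases, each equal to a previously seen phrase
      plus one new symbol; emit (phrase index, new symbol) pairs."""
      dictionary = {"": 0}                  # phrase -> index
      out, phrase = [], ""
      for c in s:
          if phrase + c in dictionary:      # keep extending the match
              phrase += c
          else:
              out.append((dictionary[phrase], c))
              dictionary[phrase + c] = len(dictionary)
              phrase = ""
      if phrase:                            # flush a leftover match
          out.append((dictionary[phrase[:-1]], phrase[-1]))
      return out

  print(lz78_parse("abababab"))
  # [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a'), (0, 'b')]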

Friday, Nov 2 Source Coding: performance of the Lempel-Ziv algorithm. Channel Coding: efficient/reliable communication, definition of a channel, discrete memoryless channels without feedback.
Cover & Thomas: Ch. 8, Sections 8.1, 8.5

Tuesday, Nov 6 Channel Coding: capacity of binary symmetric channel, capacity of binary erasure channel, Fano’s inequality.
Cover & Thomas: Ch. 8, Sections 8.1, 8.2, 8.3, 8.9
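
For reference, the capacities computed in this lecture and Fano's inequality (standard statements):

  \[ C_{\mathrm{BSC}(p)} = 1 - h(p), \qquad C_{\mathrm{BEC}(\epsilon)} = 1 - \epsilon, \]
  \[ H(X \mid \hat{X}) \le h(P_e) + P_e \log\big(|\mathcal{X}| - 1\big), \quad P_e = \Pr\{\hat{X} \ne X\}. \]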

Tuesday, Nov 13 Channel Coding: data processing theorem, Shannon's channel coding theorem – converse using Fano's inequality, block codes, rate of a code.
Cover & Thomas: Ch. 8, Sections 8.4, 8.5, 8.7 (for proof of achievability please refer to Professor’s notes below), 8.9

Friday, Nov 16 Channel Coding: typicality, Shannon's channel coding theorem – achievability using random codes.
Cover & Thomas: Ch. 8, Section 8.6
Notes on Capacity with Constraints

Tuesday, Nov 20 Channel Coding: continuation of Shannon’s channel coding theorem

Friday, Nov 23 Channel Coding: maximization of mutual information by Kuhn-Tucker conditions
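
For reference, the Kuhn-Tucker conditions for a capacity-achieving input distribution p^* of a discrete memoryless channel with capacity C:

  \[ \sum_y p(y \mid x)\log\frac{p(y \mid x)}{p^*(y)} = C \ \text{ for all } x \text{ with } p^*(x) > 0, \qquad \le C \ \text{ otherwise.} \]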

Tuesday, Nov 27 Channel Coding: continuous random variables – differential entropy, mutual information, Gaussian random variables, properties of differential entropy.
Cover & Thomas: Ch. 9, Sections 9.1, 9.2, 9.3, 9.4, 9.5, 9.6
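
For reference, differential entropy and its Gaussian special case:

  \[ h(X) = -\int f(x)\log f(x)\,dx, \qquad h\big(\mathcal{N}(\mu,\sigma^2)\big) = \tfrac{1}{2}\log\big(2\pi e\sigma^2\big), \]

and among densities with variance \sigma^2 the Gaussian maximizes differential entropy.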

Friday, Nov 30 Channel Coding: capacity of continuous alphabet channel with input constraints, Shannon's channel coding theorem, additive white Gaussian noise channel and its capacity.
Cover & Thomas: Ch. 10, Sections 10.1, 10.2
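
For reference, the capacity of the additive white Gaussian noise channel Y = X + Z with Z \sim \mathcal{N}(0, N) under the power constraint \mathbb{E}[X^2] \le P:

  \[ C = \max_{\mathbb{E}[X^2]\le P} I(X;Y) = \tfrac{1}{2}\log\Big(1 + \frac{P}{N}\Big) \ \text{bits per channel use}, \]

achieved by a Gaussian input X \sim \mathcal{N}(0, P).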

Tuesday, Dec 4
Friday, Dec 7 Channel Coding: differential versus discrete entropy, parallel Gaussian channels, water-filling solution, wide-sense stationary processes, auto-correlation function and power spectral density, capacity of band-limited channels.
Cover & Thomas: Ch. 10, Sections 10.3, 10.4
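
A minimal Python sketch of the water-filling allocation for parallel Gaussian channels, using bisection on the water level (illustrative only):

  def waterfill(noise, P, iters=60):
      """Allocate total power P across channels with the given noise
      levels: fill up to a common water level nu, P_i = max(0, nu - N_i)."""
      lo, hi = min(noise), max(noise) + P       # nu lies in this range
      for _ in range(iters):                    # bisect on nu
          nu = (lo + hi) / 2
          used = sum(max(0.0, nu - n) for n in noise)
          lo, hi = (nu, hi) if used < P else (lo, nu)
      return [max(0.0, nu - n) for n in noise]

  print(waterfill([1.0, 2.0, 4.0], 3.0))
  # roughly [2.0, 1.0, 0.0]: the noisiest channel gets no power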


Tuesday, Dec 11 Channel Coding: binary block codes, Hamming distance, minimum distance of a code, dimension of a code, sphere-packing bound on the number of codewords, Gilbert-Varshamov bound on the number of codewords.
Notes on Coding
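
For reference, with A(n, d) the maximum size of a binary code of length n and minimum distance d, and t = \lfloor (d-1)/2 \rfloor:

  \[ \frac{2^n}{\sum_{i=0}^{d-1}\binom{n}{i}} \;\le\; A(n,d) \;\le\; \frac{2^n}{\sum_{i=0}^{t}\binom{n}{i}}, \]

the lower bound being Gilbert-Varshamov and the upper the sphere-packing bound.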

Friday, Dec 14 Channel Coding: binary linear block codes, Hamming distance, weight of a codeword, minimum distance of a linear code, the (7,4,3) Hamming code and general Hamming codes.
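
A minimal Python sketch of the (7,4,3) Hamming code (one standard systematic choice of G and H, not necessarily the lecture's):

  import numpy as np

  # Systematic generator and parity-check matrices over GF(2); H @ G.T = 0 mod 2.
  G = np.array([[1,0,0,0,0,1,1],
                [0,1,0,0,1,0,1],
                [0,0,1,0,1,1,0],
                [0,0,0,1,1,1,1]])
  H = np.array([[0,1,1,1,1,0,0],
                [1,0,1,1,0,1,0],
                [1,1,0,1,0,0,1]])

  def encode(msg):                  # 4 information bits -> 7-bit codeword
      return (np.array(msg) @ G) % 2

  def correct(word):                # syndrome decoding: fixes 1 bit error
      s = (H @ word) % 2
      if s.any():                   # nonzero syndrome equals a column of H
          err = next(j for j in range(7) if np.array_equal(H[:, j], s))
          word = word.copy()
          word[err] ^= 1
      return word

  c = encode([1, 0, 1, 1])
  c[2] ^= 1                         # introduce a single bit error
  print(correct(c))                 # [1 0 1 1 0 1 0], the codeword restored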

Tuesday, Dec 18 Channel Coding: error correction and detection using Hamming codes, Reed-Solomon (RS) codes, construction of RS codes, dimension and minimum distance of RS codes.
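
For reference, the evaluation view of an (n, k) Reed-Solomon code over GF(q) with distinct evaluation points \alpha_1, \dots, \alpha_n:

  \[ \mathcal{C} = \big\{ \big(f(\alpha_1), \dots, f(\alpha_n)\big) \;:\; f \in \mathrm{GF}(q)[x],\ \deg f < k \big\}, \]

with minimum distance d = n - k + 1 (a nonzero f of degree less than k has at most k - 1 roots), meeting the Singleton bound.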

Friday, Dec 21 Quantization/Lossy Compression: rate distortion theory, Shannon's theorem on rate distortion, rate distortion function of Gaussian sources.
Cover & Thomas: Ch. 13, Sections 13.1, 13.2, 13.3, 13.4
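
For reference, the rate distortion function of an i.i.d. \mathcal{N}(0, \sigma^2) source under squared-error distortion:

  \[ R(D) = \tfrac{1}{2}\log\frac{\sigma^2}{D} \ \text{ for } 0 \le D \le \sigma^2, \qquad R(D) = 0 \ \text{ for } D > \sigma^2. \]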

Textbook: T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 2006.

Reference Material:

  1. R. G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.
  2. C. E. Shannon, The Mathematical Theory of Communication, U. of Illinois Press, 1963.
  3. J. M. Wozencraft and I. M. Jacobs, Principles of Communication Engineering, Wiley 1965 (also, Waveland, 1990).

Additional Material: Information Theory and Coding