Information Theory and Coding


Instructor Emre Telatar
Office INR 117
Phone +41 21 69 37693
Email [email protected]
Office Hours By appointment
   
   
Teaching Assistant Rajai Nasser
Phone +41 21 69 37554
Office INR 036
Email [email protected]
Office Hours
   
   
Teaching Assistant Serj Haddad
Phone +41 21 69 31357
Office INR 038
Email [email protected]
Office Hours


Lectures Monday 13:00 – 15:00 (Room: ELA01)
  Tuesday 13:00 – 15:00 (Room: ELA02)
Exercises Tuesday 15:00 – 17:00 (Room: ELA02)


Language:   English
Credits:   7 ECTS


See the course information.

Special Announcements

Final Exam Information:

  • The final exam is scheduled for Thursday, January 22nd, from 8:15 to 11:15. It will take place in INM200 and INM202.
  • The exam covers the whole course.
  • You may bring 4 sheets of paper (8 sides) of your own notes. No electronic devices are allowed.

Detailed Schedule

Date   | Topics Covered | Assignment | Solutions | Remarks
Sep 15 | Intro to source coding; non-singular, uniquely decodable, and prefix-free codes; Kraft's inequality for prefix-free codes. | | |
Sep 16 | Kraft's inequality for uniquely decodable codes; reverse Kraft; entropy as a lower bound on optimal codes; entropy + 1 as an upper bound on optimal codes. | Homework 1 | Solutions 1 |
Sep 22 | | | | Holiday :-)
Sep 23 | Entropy is asymptotically achievable for memoryless sources; properties of optimal codes; intro to Huffman codes (a small sketch follows this table). | Homework 2 | Solutions 2 |
Sep 29 | Optimality of Huffman codes; entropy; joint entropy; conditional entropy. | | |
Sep 30 | Mutual information; chain rules; positivity of mutual information; conditioning reduces entropy; data-processing inequality. | Homework 3 | Solutions 3 |
Oct 06 | Entropy rate; stationary processes; typicality. | | |
Oct 07 | Typicality; size of typical sets; source coding via typicality; variable-to-fixed coding; valid prefix-free dictionaries. | Homework 4 | Solutions 4 |
Oct 13 | H(W) = H(U) · E[length(W)]; number of bits per source symbol used by a variable-to-fixed code (see the note after this table). | | |
Oct 14 | Tunstall procedure and its optimality; analysis of Tunstall codes; description of Lempel-Ziv codes. | Homework 5 | Solutions 5 |
Oct 20 | FSM compressors. | | |
Oct 21 | Compressibility of FSM compressors; compressibility of Lempel-Ziv codes; LZ compressibility is less than FSM compressibility for every sequence. | Homework 6 | Solutions 6 |
Oct 27 | Communication problem; discrete memoryless channels without feedback; code rates; probability of error; converse to the channel coding theorem. | | |
Oct 28 | Midterm | | Solutions |
Nov 03 | Converse to the channel coding theorem (cont.); capacity examples; how to maximize concave functions. | | |
Nov 04 | Convexity; convex and concave functions; convexity properties of entropy and mutual information. | Homework 7 | Solutions 7 |
Nov 10 | Kuhn-Tucker conditions; intro to random coding. | | |
Nov 11 | Proof of the coding theorem. | Homework 8 | Solutions 8 |
Nov 17 | Differential entropy and mutual information; chain rules for them. | | |
Nov 18 | Capacity with cost; additive Gaussian noise channel; converse to the coding theorem with cost; relationship between discrete and differential entropy; differential mutual information as a limit of discrete mutual information. | Homework 9 | Solutions 9 |
Nov 24 | Coding theorem with cost; waterfilling solution to parallel Gaussian channels with an overall power constraint. | | |
Nov 25 | Introduction to practical codes; linearity; Hamming distance; Hamming codes. | Homework 10 | Solutions 10 |
Dec 01 | Sphere-packing, Singleton, and Gilbert-Varshamov bounds. | | |
Dec 02 | Reed-Solomon codes. | Homework 11 | Solutions 11 |
Dec 08 | Polar codes. | | |
Dec 09 | Polar codes (cont.). | Homework 12 | Solutions 12 |
Dec 15 | Lossy data compression (rate-distortion theory). | | |
Dec 16 | What information theory says about other communication problems not covered in class. | | |
Jan 22 | Final exam | | Solutions |
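
The sketch referenced in the Sep 23 row: a minimal, self-contained Python illustration of building a prefix-free (Huffman) code and checking Kraft's inequality on the result. This is the generic textbook construction, not code from the course handouts; the function name huffman_code and the example string "abracadabra" are made up for illustration.

    import heapq
    from collections import Counter

    def huffman_code(freqs):
        """Build a binary Huffman code for a dict {symbol: weight}.

        Returns {symbol: codeword string}. Illustrative sketch only.
        """
        # Heap entries are (total weight, tie-breaker, {symbol: partial codeword});
        # the tie-breaker keeps tuple comparison away from the dicts.
        heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
        heapq.heapify(heap)
        if len(heap) == 1:
            # Degenerate single-symbol source: give it a one-bit codeword.
            return {sym: "0" for sym in heap[0][2]}
        counter = len(heap)
        while len(heap) > 1:
            w0, _, c0 = heapq.heappop(heap)   # two least-probable subtrees
            w1, _, c1 = heapq.heappop(heap)
            # Merging two subtrees prepends one distinguishing bit to every codeword.
            merged = {s: "0" + cw for s, cw in c0.items()}
            merged.update({s: "1" + cw for s, cw in c1.items()})
            heapq.heappush(heap, (w0 + w1, counter, merged))
            counter += 1
        return heap[0][2]

    if __name__ == "__main__":
        code = huffman_code(Counter("abracadabra"))
        # For a complete prefix-free code, Kraft's sum equals 1.
        kraft_sum = sum(2 ** -len(cw) for cw in code.values())
        print(code, kraft_sum)

Running it prints a prefix-free code whose Kraft sum is exactly 1, the equality case of Kraft's inequality from the Sep 15 and Sep 16 lectures.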
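
For the identity in the Oct 13 row, the note below records the standard way it is used (a sketch in generic notation, not a transcript of the lecture): U is a memoryless source, W is the word parsed by a valid prefix-free dictionary with M entries, and each parsed word is described with a fixed block of bits.

    \[
      H(W) \;=\; H(U)\,\mathbb{E}[\operatorname{length}(W)]
      \qquad\Longrightarrow\qquad
      \frac{\lceil \log_2 M \rceil}{\mathbb{E}[\operatorname{length}(W)]}
      \;\ge\; \frac{H(W)}{\mathbb{E}[\operatorname{length}(W)]} \;=\; H(U),
    \]

since W takes at most M values, so H(W) <= log2 M. Hence a variable-to-fixed code spends at least H(U) bits per source symbol, up to the rounding of log2 M to the next integer.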

Textbook

Elements of Information Theory, Thomas M. Cover and Joy A. Thomas, 2006. ISBN: 0-471-24195-4.

Additional Reading Material

Claude Shannon's A Mathematical Theory of Communication, published in the Bell System Technical Journal, 1948 (Part 1 in July, Part 2 in October).

Notes on Tunstall codes: Tunstall
Notes on the Lempel-Ziv algorithm: Lempel-Ziv
Notes on coding theory: Coding
Notes on polar codes: Polar. These notes were taken by Elie Najm during the polar coding lectures in the Fall 2013 version of the course and are posted with his permission. This year's lectures follow the same logic, with minor differences; the notes also include a proof of "Mrs. Gerber's" lemma for completeness.