Information Theory and Coding

Instructor Emre Telatar
Office INR 117
Phone +41 21 69 37693
Email [email protected]
Office Hours By appointment
   
Teaching Assistant Yunus Inan
Office INR 033
Phone +41 21 69 36604
Email [email protected]
Office Hours By appointment
   
Teaching Assistant Reka Inovan
Office INR 033
Phone +41 21 69 36604
Email [email protected]
Office Hours By appointment
 
Lectures Monday 11:00 – 13:00 (Room: BC01)
  Tuesday 13:00 – 15:00 (Room: ELA 2)
Exercises Tuesday 15:00 – 17:00 (Room: ELA 2)
Language:   English
Credits:   7 ECTS
 

Course Information See the course information.

 

Special Announcements

  • First lecture: Monday, Sep. 14, 2020, 11:15–13:00, in BC01
  • Students who are not on campus should join the Zoom meeting at the lecture times (link on Moodle)
  • Course Moodle page : https://moodle.epfl.ch/course/view.php?id=14593
  • Exercise sessions take place every Tuesday, in class and on Zoom.

Grading

  • Take-Home Midterm (30%)
  • Graded Homework (20%)
  • Final Exam (50%)

Detailed Schedule

Date Topics Covered Assignment Solutions Remarks
Sep 14

– Introduction to Source Coding

– Non-singular codes

– Uniquely decodable codes

– Prefix-free codes

– Kraft’s inequality for prefix-free codes

– Kraft’s sum for products of codes

hw01 sol01  
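Kraft's inequality states that a binary prefix-free code with codeword lengths l_1, …, l_m exists if and only if the sum of 2^(-l_i) is at most 1. As a small illustration (my example, not course material), this sketch evaluates the Kraft sum for two length profiles:

```python
# Hypothetical example: evaluating the Kraft sum 2^(-l_1) + ... + 2^(-l_m)
# for a list of codeword lengths of a binary (D = 2) code.

def kraft_sum(lengths, D=2):
    """Kraft sum of a D-ary code with the given codeword lengths."""
    return sum(D ** (-l) for l in lengths)

# The prefix-free code {0, 10, 110, 111} has lengths (1, 2, 3, 3):
prefix_free = [1, 2, 3, 3]
assert kraft_sum(prefix_free) <= 1   # 1/2 + 1/4 + 1/8 + 1/8 = 1.0

# Lengths (1, 1, 2) violate the inequality, so no binary
# prefix-free code with these lengths can exist.
impossible = [1, 1, 2]
print(kraft_sum(prefix_free), kraft_sum(impossible))  # 1.0 1.25
```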
Sep 15

– Kraft’s inequality for non-singular codes

– Kraft’s inequality for uniquely decodable codes

– Reverse Kraft’s inequality

– Expected codeword length

Sep 21  Public Holiday      
Sep 22

– Entropy as a lower-bound to the expected codeword length

– Existence of prefix-free codes with expected length at most entropy + 1

– Properties of optimal codes

– Huffman procedure

– Equivalence of prefix-free codes and strategies for guessing via binary questions

hw2 sol2  
Sep 28

– Entropy of multiple random variables

– Source coding by describing words of n letters

– Conditional entropy and mutual information

– Chain Rules for entropy and mutual information

– Short reminder for Markov Chains

– Minimal and maximal values of H(U)

– Data Processing inequality
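A quick numeric check of the chain rule H(X,Y) = H(X) + H(Y|X) on a small joint distribution (the pmf values are my own hypothetical example, not from the lectures):

```python
# Verify the chain rule for entropy on a 2x2 joint pmf.
from math import log2

# Hypothetical joint pmf p(x, y) on {0,1} x {0,1}.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(dist):
    """Entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# Marginal of X, and conditional entropy H(Y|X) = sum_x p(x) H(Y|X=x).
px = {0: p[(0, 0)] + p[(0, 1)], 1: p[(1, 0)] + p[(1, 1)]}
HYgX = sum(px[x] * H({y: p[(x, y)] / px[x] for y in (0, 1)}) for x in (0, 1))

assert abs(H(p) - (H(px) + HYgX)) < 1e-9   # chain rule holds numerically
print(H(p), H(px) + HYgX)
```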

Sep 29

– Entropy Rate

– Entropy rate of Stationary Processes

– Coding Theorem for Stationary Sources

– Fixed-to-Fixed Length Source Codes

– Typicality

– Properties of Typical Sets

– Asymptotic Equipartition Property

hw3 sol3  
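The Asymptotic Equipartition Property says that for an i.i.d. source the normalized log-probability -(1/n) log2 p(X^n) concentrates around the entropy. A small simulation sketch (parameters and seed are my own choices, not course material):

```python
# AEP demo: for X^n i.i.d. Bernoulli(p), -(1/n) log2 p(X^n) ~ h2(p).
import random
from math import log2

def h2(p):
    """Binary entropy function in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

random.seed(0)
p, n = 0.3, 100_000
x = [1 if random.random() < p else 0 for _ in range(n)]
ones = sum(x)

# For an i.i.d. sequence the log-probability depends only on the counts:
norm_logprob = -(ones * log2(p) + (n - ones) * log2(1 - p)) / n
print(norm_logprob, h2(p))   # the two numbers are close for large n
```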
Oct 5

– Source Coding via Typicality

    Notes on FSM Compressor and Lempel-Ziv
Oct 6

– Finite state machine (FSM) compressors

– Invertible machines, information-lossless (IL) machines

hw4 sol4.pdf  
Oct 12

– Lower bound to the output length for IL FSMs

– Lempel-Ziv Algorithm
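
The core of the Lempel-Ziv algorithm is incremental parsing: split the input into the shortest phrases not seen before. A sketch of this parsing step (my illustration of the idea, not necessarily the exact variant treated in class):

```python
# LZ78-style incremental parsing: each new phrase extends a previously
# seen phrase by one symbol.

def lz78_parse(s):
    """Return the phrases of the incremental parsing of the string s."""
    phrases, seen, cur = [], set(), ""
    for c in s:
        cur += c
        if cur not in seen:        # shortest phrase not seen before
            seen.add(cur)
            phrases.append(cur)
            cur = ""
    if cur:                        # possibly repeated tail phrase
        phrases.append(cur)
    return phrases

print(lz78_parse("1011010100010"))
# ['1', '0', '11', '01', '010', '00', '10']
```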

Oct 13

– Properties of entropy and mutual information

– Review of Source Coding

hw5 sol5.pdf  
Oct 19

– Communication channels

– Memoryless/stationary channels, communication without feedback

– Multi-letter Fano’s Inequality

Oct 20

– KKT conditions for maximization of channel capacity

hw6.pdf sol6.pdf  
Oct 26

– KKT conditions for maximization of channel capacity

– Properties of the optimal solution

– Channel capacity computation
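One standard numerical method for computing the capacity of a discrete memoryless channel is the Blahut-Arimoto iteration; this sketch is an illustrative alternative to the KKT-based computation discussed in class, not necessarily the course's method:

```python
# Blahut-Arimoto iteration: alternate between the output distribution
# induced by the current input distribution and a multiplicative update
# p(x) <- p(x) exp(D(W(.|x) || q)) / Z.
from math import exp, log, log2

def blahut_arimoto(W, iters=200):
    """Capacity in bits of a DMC with transition matrix W[x][y] = P(y|x)."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                      # start from the uniform input
    for _ in range(iters):
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        r = [p[x] * exp(sum(W[x][y] * log(W[x][y] / q[y])
                            for y in range(ny) if W[x][y] > 0))
             for x in range(nx)]
        Z = sum(r)
        p = [v / Z for v in r]
    # Capacity = I(p; W) at the converged input distribution.
    q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    return sum(p[x] * W[x][y] * log2(W[x][y] / q[y])
               for x in range(nx) for y in range(ny) if W[x][y] > 0)

# BSC with crossover 0.1: capacity is 1 - h2(0.1), about 0.531 bits.
C = blahut_arimoto([[0.9, 0.1], [0.1, 0.9]])
print(round(C, 3))  # 0.531
```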

Oct 27

– Probabilistic Method

– Channel Coding Theorem

ITC Midterm 2020    
Nov 2

– Channel Coding Theorem with Cost Constraint

– Threshold Decoding

– Separation Principle

Nov 3

– Differential Entropy

hw7.pdf sol7.pdf  
Nov 11

– Entropy of Gaussian random variables

– Capacity of the Gaussian Channel
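A worked check (my numbers) of the capacity formula for the discrete-time Gaussian channel, C = (1/2) log2(1 + P/N) bits per channel use:

```python
# AWGN capacity: half the log of one plus the signal-to-noise ratio.
from math import log2

def awgn_capacity(P, N):
    """Capacity of the discrete-time Gaussian channel with power P, noise variance N."""
    return 0.5 * log2(1 + P / N)

print(awgn_capacity(15, 1))   # SNR 15 -> 0.5 * log2(16) = 2.0 bits per use
```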

Nov 12

– Bandlimited AWGN Channels

hw8.pdf sol8.pdf  
Nov 16

– Parallel Gaussian Channels, water-filling

-Hamming Distance
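Water-filling allocates power P_i = max(0, mu - N_i) across parallel Gaussian channels, with the water level mu chosen so that the total budget is met. A bisection sketch with my own example parameters:

```python
# Water-filling over parallel Gaussian channels: bisect on the water
# level mu until the allocated power matches the budget P.

def water_filling(noise, P, tol=1e-12):
    """Power allocation over channels with noise variances `noise`, budget P."""
    lo, hi = 0.0, max(noise) + P          # bracket for the water level mu
    while hi - lo > tol:
        mu = (lo + hi) / 2
        used = sum(max(0.0, mu - n) for n in noise)
        if used > P:
            hi = mu
        else:
            lo = mu
    return [max(0.0, lo - n) for n in noise]

alloc = water_filling([1.0, 2.0, 4.0], P=3.0)
print([round(a, 3) for a in alloc])   # [2.0, 1.0, 0.0]: worst channel stays dry
```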

Nov 17

– Sphere-packing bound

– Gilbert bound

– Singleton bound

hw9.pdf sol9.pdf  
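An illustrative computation (my example) of two of these bounds on the size of a binary code with blocklength n and minimum distance d:

```python
# Sphere-packing (Hamming) and Singleton bounds for binary codes.
from math import comb

def hamming_bound(n, d):
    """Max |C| allowed by sphere-packing: 2^n / V(n, t), t = floor((d-1)/2)."""
    t = (d - 1) // 2
    ball = sum(comb(n, i) for i in range(t + 1))  # Hamming-ball volume
    return 2 ** n // ball

def singleton_bound(n, d):
    """Max |C| allowed by the Singleton bound: 2^(n - d + 1)."""
    return 2 ** (n - d + 1)

# n = 7, d = 3: the Hamming bound gives 16, achieved with equality by the
# (7,4) Hamming code; the Singleton bound gives the weaker 32.
print(hamming_bound(7, 3), singleton_bound(7, 3))  # 16 32
```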
Nov 23

– Linear codes

– Generator and parity check matrices

    Notes on coding
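A sketch of a linear code in action, using the standard (7,4) Hamming code (my construction of a systematic generator and parity-check matrix, all arithmetic mod 2):

```python
# Linear code example: codeword = m G, syndrome = H c^T over GF(2).

G = [[1, 0, 0, 0, 1, 1, 0],   # systematic generator matrix [I_4 | P]
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
H = [[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix [P^T | I_3]
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def encode(msg):
    """Codeword of a 4-bit message under G (arithmetic mod 2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def syndrome(word):
    """Syndrome H w^T mod 2; all-zero iff word is a codeword."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

c = encode([1, 0, 1, 1])
assert syndrome(c) == [0, 0, 0]        # valid codewords have zero syndrome
c[2] ^= 1                              # flip one bit
print(syndrome(c))                     # nonzero syndrome flags the error
```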
Nov 24


hw10.pdf    

Textbook

Additional Reading Material

To Review Basic Probability Theory