Learning Theory CS-526

Instructor: Nicolas Macris
Office: INR 134
Email: [email protected]
Office Hours: By appointment

Instructor: Ruediger Urbanke
Office: INR 116
Email: [email protected]
Office Hours: By appointment

Teaching Assistant: Chan Chun Lam
Email: [email protected]
Office: INR 032

Teaching Assistant: Kirill Ivanov
Email: [email protected]
Office: INR 030

Teaching Assistant: Clement Luneau
Email: [email protected]
Office: INR 141
Lectures: Monday 08:15 – 10:00, Room INM 202
Exercises: Tuesday 17:15 – 19:00, Room INR 219
Language: English
Credits: 4 ECTS

Prerequisites:

  • Analysis I, II, III
  • Linear Algebra
  • Machine Learning
  • Probability
  • Algorithms (CS-250)

Here is a link to the official coursebook information.

Homework:
Some homework assignments will be graded.

Grading:
If you do not hand in your final exam, your overall grade will be NA. Otherwise, your grade is determined by the following weighted average: 10% for the homework and 90% for the final exam. For the graded homework, you may discuss the problems with other people, but you must write up your own solution and list on the first page the people you discussed it with.
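In other words, the weighting amounts to the following (a minimal illustration, assuming the homework and exam grades are expressed on the same scale, which the syllabus does not state explicitly):

\[
\text{overall grade} = 0.1 \times \text{homework grade} + 0.9 \times \text{final exam grade}
\]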

Special Announcements

Topics

Detailed Schedule

(tentative, subject to change)

18/2
Lecture: Chapters 3 and 4 (in UML)
Exercises: 3.1; 3.3; 3.7; 3.8; 4.1; 4.2

25/2
Lecture: Chapter 5 (in UML)
Exercises: previous set continued, plus Exercise 2

4/3
Lecture: Chapter 6 (in UML)
Exercises: Graded: 5.1; 6.2; 6.5; 6.8; 6.9; 7.3

11/3
Lecture: Chapter 7 (in UML)
Exercises: graded set continued

18/3
Lecture: Remainder of Chapter 7 and start of Chapter 14 (in UML)
Exercises: Exercise 4. Deadline for handing in the graded homework: 19/3, during the exercise session.

25/3
Lecture: Remainder of Chapter 14 (in UML)
Exercises: 2nd graded homework: Exercise 5. Deadline: 16 April.

1/4
Lecture: "Lecture notes on two-layer neural networks" by A. Montanari
Exercises: Hand-out of the 1st graded homework (lecture and exercise session).

8/4
Lecture: Introduction to probabilistic graphical models. PGM-Lect-1.pdf (Chapters 3 and 4 in D. Barber; Chapter 8 in C. Bishop)
Exercises: 2nd graded homework continued (Exercise 5). Deadline: 16 April.

15/4
Lecture: Factor graphs, marginalization. PGM-Lect-2.pdf. Notes on message passing for marginalization (sum-product algorithm). (Chapters 4 and 5 in Barber; Chapter 8 in Bishop)
Exercises: Exercise 6

22/4
Vacation

29/4
Lecture: Sampling: ancestral sampling for belief networks and MCMC. PGM-Lect-3.pdf. Learning graphical models (Barber, Sections 9.3 and 9.6, mostly 9.6.1).
Exercises: Exercise 6 continued. Use the notes on message passing for problems 8, 9, and 10.

6/5
Lecture: Variational Bayes EM and standard EM. PGM-Lect-4.pdf. Learning graphical models (Barber, Section 11.2, mostly 11.2.1 and 11.2.2, and Section 11.5).
Exercises: 3rd graded homework: Exercise 7. New deadline: May 28.

13/5
Lecture: Tensor methods (the next three lectures are based on the Review). Tens-Lect-1.pdf. Tensor product, rank, Jennrich's theorem.
Exercises: 4th graded homework: Exercise 8. New deadline: June 4, in the mailbox in the IPG corridor (INR) or with the assistants.

20/5
Lecture: Tens-Lect-2.pdf. ALS, multilinear rank, Tucker decomposition (HOSVD).

27/5
Lecture: Tens-Lect-3.pdf. Applications: GMMs, topic models, multiview models. If time permits: power method, whitening.

Textbooks and notes: