Mathematics of Machine Learning (Fall 2009)

Lecturer

Assistant

News & Log

  • The last assignment series can be collected from Nicolai Hähnle's box in the mail room on the second floor of MA
  • Lecture of 17.12.: Weighted Majority, Review of semester
  • Lecture of 10.12.: Margin Perceptron algorithm, Weighted Majority algorithm (a sketch of Weighted Majority follows this log)
  • Lecture of 3.12.: Strong learning, Perceptron algorithm, Margin Perceptron
  • Lecture of 26.11.: Boosting the accuracy of weak learning algorithms
  • Lecture of 19.11.: Transversal number of sets with bounded VC dimension, Chernoff bounds, Boosting success probability of learning algorithms
  • Lecture of 12.11.: VC dimension of polynomial inequalities, learning a halfspace using linear programming, transversals and fractional transversals of sets
  • Lecture of 5.11.: Radon's lemma, VC dimension of halfspaces, VC dimension of unions, intersections and complements
  • Lecture of 29.10.: Using epsilon nets in PAC learning, the epsilon net theorem
  • Lecture of 22.10.: VC dimension, Shatter Function lemma, epsilon nets
  • Lecture of 15.10.: Occam's Razor continued, using Set Cover to improve PAC learning of sparse conjunctions
  • Lecture of 08.10.: Final definition of the PAC model, learning 3-term DNF using 3-CNF, Occam's Razor, improved PAC learning of conjunctions
  • Exercise 7 from assignment 1 has been postponed to assignment 2 because the necessary definitions have not yet been given in the lecture
  • Lecture of 01.10.: NP-complete problems, RP, hardness of learning 3-term DNF formulae
  • 24.09.: Updated assignment 1
  • Lecture of 24.09.: Representation classes, updated definition of the PAC model, learning conjunctions of boolean literals, P vs. NP
  • Lecture of 17.09.: Rectangle learning game, preliminary definition of the PAC model
  • There will be no exercise session in the first week
  • Announcements and course topics will be posted here
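
The Weighted Majority algorithm from the lectures of 10.12. and 17.12. is simple enough to state in a few lines. Below is a minimal sketch, assuming binary (±1) expert predictions and the usual halving penalty β = 1/2; the function name and the toy data are illustrative and not part of the course material:

    # Weighted Majority (Littlestone-Warmuth): predict by a weighted vote
    # over n experts, then halve the weight of every expert that erred.
    def weighted_majority(expert_advice, outcomes, beta=0.5):
        n = len(expert_advice[0])            # number of experts
        w = [1.0] * n                        # initial weight 1 per expert
        mistakes = 0
        for advice, y in zip(expert_advice, outcomes):
            # predict the sign of the weighted vote
            vote = sum(wi * a for wi, a in zip(w, advice))
            prediction = 1 if vote >= 0 else -1
            if prediction != y:
                mistakes += 1
            # penalize exactly the experts that predicted incorrectly
            w = [wi * beta if a != y else wi for wi, a in zip(w, advice)]
        return mistakes

    # Toy run: three experts, the first one is always right.
    advice   = [(1, -1, 1), (-1, -1, 1), (1, 1, -1)]
    outcomes = [1, -1, 1]
    print(weighted_majority(advice, outcomes))   # prints 0

The standard analysis bounds the number of mistakes by (ln n + m ln(1/β)) / ln(2/(1+β)), where m is the number of mistakes of the best expert; for β = 1/2 this is roughly 2.41 (m + log2 n).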

Description

The course covers topics from computational learning theory. Machine learning deals with the classification of objects based on training data. We address questions such as: What accuracy can we guarantee for a learning algorithm? How hard is it to learn a concept? In particular, the topics covered include (an illustrative bound is sketched after the list):

  • PAC model
  • Occam's Razor
  • VC dimension and ε-nets
  • Weak and strong learning
  • Learning in the presence of noise
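
To give a flavor of the guarantees studied in the course, the VC dimension controls how much data PAC learning requires. A standard form of the bound (as in Kearns and Vazirani; c is an absolute constant that we do not optimize) is

    m \;\ge\; \frac{c}{\varepsilon}\left( d \log\frac{1}{\varepsilon} + \log\frac{1}{\delta} \right)

With this many random examples, any hypothesis from a class of VC dimension d that is consistent with the sample has error at most ε, with probability at least 1 − δ; this is the role the ε-net theorem (lecture of 29.10.) plays in PAC learning.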

Grading

Your grade will be determined by a written final exam. You can collect bonus points by handing in solutions to selected exercises from the assignment sheets. If you solve 50% or more of the exercises, the grade of your final exam will be improved by a half grade. If you solve 90% or more of the exercises, the grade of your final exam will be improved by a full grade.

Schedule

Lecture: Thursday 08:15 - 10:00 (MA A331); Lectures start on September 17th
Exercises: Thursday 10:15 - 12:00 (MA A331); first session on September 24th
Final exam: Tuesday, 26.1.2010, 8:15 in CM4

Assignments

We will publish an assignment sheet with exercises and problems on this website every two weeks. You can work on the exercises, ask questions, and discuss problems during the exercise sessions.

You will have the opportunity to hand in written solutions to selected exercises to obtain bonus points. Solutions will not be published on the website, but we will discuss them during the exercise sessions.

Literature

Michael J. Kearns and Umesh V. Vazirani, An Introduction to Computational Learning Theory, MIT Press, 1994.
Bernhard Schölkopf and Alexander J. Smola, Learning with Kernels, MIT Press, 2002.