Machine Learning: Algorithms and Theory
(Summer term 2016)

Who, when, where

Lectures (by Prof. U. von Luxburg): Wednesday, 8:15 - 10:00, and Thursday, 10:15 - 12:00. Sand 6/7 - Hörsaal 2. First lecture is on April 13.
Tutorials (by M. Kleindessner): Tuesday afternoon, Sand 1 - A 301. 1st group: 14:20 - 15:50. 2nd group: 16:10 - 17:40.

Contents of the lecture

The focus of the lecture is on both algorithmic and theoretical aspects of machine learning. We will cover many of the standard algorithms and learn about the general principles and theoretical results behind building good machine learning algorithms.
  • Bayesian decision theory, no free lunch theorem.
  • Supervised learning problems (regression, classification): Linear methods; regularization; non-linear kernel methods and the theory behind them
  • Unsupervised learning problems: Dimension reduction ((kernel) PCA, multi-dimensional scaling, manifold methods); spectral clustering and spectral graph theory
  • How to model machine learning problems
  • Learning theory: empirical risk minimization (ERM), consistency, generalization bounds
  • Online learning and its theory
  • Low rank matrix completion, compressed sensing
The following topics are NOT going to be covered: decision trees, neural networks, deep networks, graphical models, Bayesian approaches to machine learning. You can learn about some of them in other courses in the department.
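To give a concrete taste of one topic on the list above, here is a minimal sketch of dimension reduction with PCA, written in Python for illustration (note that the course assignments themselves use R). The data, variable names, and toy setup are my own for this example: we project a centered point cloud onto the top eigenvector of its sample covariance matrix.

```python
import numpy as np

# Toy illustration of PCA: reduce 2-D points to their top principal component.
rng = np.random.default_rng(0)
# An anisotropic point cloud: large variance along x, small along y.
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

Xc = X - X.mean(axis=0)                  # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns eigenvalues in ascending order
top = eigvecs[:, -1]                     # direction of maximal variance
Z = Xc @ top                             # 1-D projection ("scores")

print(Z.shape)  # (100,)
```

The kernelized variant covered in the lecture follows the same idea, but performs the eigendecomposition on a kernel matrix instead of the covariance matrix.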

Course material

All course material (slides, handouts, assignments, ...) will be made available on an ILIAS page (only accessible for members of the University of Tübingen).

Online feedback form

We want to know what you like / do not like about the lecture! You can tell us anonymously with the following feedback form. The more concrete and constructive the feedback, the higher the likelihood that we can adapt to it.


Prerequisites

You need to know the basics of probability theory and linear algebra, as taught in the mathematics lectures of the computer science bachelor program. If you cannot remember this material well, I strongly urge you to review it. A video of a recap lecture by myself and some summary notes can already be downloaded from ILIAS.

Assessment criteria and exams

Weekly assignments: theoretical exercises and programming exercises in R (an introduction to R will be given in the first tutorial). You need to achieve at least 50% of all assignment points to be admitted to the final exam.

Final exam: written exam (Klausur). First exam: July 26, 10:00 - 12:00, Hörsaalzentrum Morgenstelle - Hörsaal N02. Second exam: October 13, 10:00 - 12:00, Hörsaalzentrum Morgenstelle - Hörsaal N01.


Literature

There is no single textbook that covers machine learning the way I want. We are going to use individual chapters of various books. Here is a list of relevant books:
  • Shalev-Shwartz, Ben-David: Understanding Machine Learning. 2014.
  • Bishop: Pattern Recognition and Machine Learning. 2006.
  • Hastie, Tibshirani, Friedman: The Elements of Statistical Learning. (the statistics point of view on machine learning, written by statisticians rather than machine learning researchers)
  • Mohri, Rostamizadeh, Talwalkar: Foundations of Machine Learning. 2012.
  • Schölkopf, Smola: Learning with Kernels. (mainly support vector machines and kernel algorithms)
  • Hastie, Tibshirani, Wainwright: Statistical Learning with Sparsity. 2015. (Lasso and low-rank matrix methods)
  • Devroye, Györfi, Lugosi: A Probabilistic Theory of Pattern Recognition. (covers statistical learning theory; we do not use much of it in this lecture)