Machine Learning: Algorithms and Theory
(Summer term 2018)

You can find the allocation to the tutorials here.

Please note the following:
  • You have to register for the final exam. For the second exam you have to fill in a form (it is not possible to register online in Campus). You can get the form from your tutor, pick it up at room B107 (Sand 14), or just send an email to Moritz Haas.
  • The registration deadline for the second exam is September 20.
  • If you want to take both exams, you have to register twice.

The lecture is intended for Master students in computer science or mathematics, but might also be interesting for students from other disciplines such as physics, linguistics, etc. If in doubt, please simply attend the first lecture and talk to me at the end.

Who, when, where

Lectures (by Prof. U. von Luxburg): Tuesday 12:15-14:00 and Thursday 10:15-12:00, Room F119 in Sand 6 (Computer Science Campus, Tuebingen). First lecture is on April 17.

Tutorials (by Diego Fioravanti, Tobias Frangen, Moritz Haas, Siavash Haghiri): You can choose between Tue 16-18 (2 groups, A302 and C118) and Wed 16-18 (2 groups, A302 and C118). The regular tutorials start in the third week. In the first two weeks there will be a math recap on April 17 (in room F119) and an introduction to Python on Wednesday, April 25, 16:15-18:00 in room F119.

Please join the ILIAS room so that we can send you relevant information.

Course material

You can find the most important information on our information sheet.

Recap material:
Python Tutorial Slides:
Weekly assignments:
Your exam questions
Here you can find some of the exam questions that you handed in for assignment 4 and assignment 10. Please note: we had a look at them, but there is no guarantee that the questions or answers are correct. They are only meant to help you review the material from the lecture and are not representative of what you will find in the exam.


There is no textbook that covers machine learning the way I want. We are going to use individual chapters of various books. Here is a list of relevant books:
  • Shalev-Shwartz, Ben-David: Understanding Machine Learning. 2014.
  • Chris Bishop: Pattern Recognition and Machine Learning. 2006.
  • Hastie, Tibshirani, Friedman: The Elements of Statistical Learning. 2001. (the statistics point of view on machine learning, written by statisticians)
  • Kevin Murphy: Machine Learning: A Probabilistic Perspective. 2012. (for the probabilistic point of view)
  • Schölkopf, Smola: Learning with Kernels. 2002. (for support vector machines and kernel algorithms)
  • Hastie, Tibshirani, Wainwright: Statistical Learning with Sparsity. 2015. (Lasso, low rank matrix methods)
  • Duda, Hart, Stork: Pattern Classification. 2012. (a somewhat outdated classic, but some sections are still good to read)
  • Mohri, Rostamizadeh, Talwalkar: Foundations of Machine Learning. 2012. (rather theoretical point of view)
  • Devroye, Györfi, Lugosi: A Probabilistic Theory of Pattern Recognition. 2013. (covers statistical learning theory; we do not use much of it in this lecture)

Online feedback form

We want to know what you like and do not like about the lecture! You can tell us anonymously with the following feedback form. The more concrete and constructive the feedback, the higher the likelihood that we can adapt to it.

Contents of the lecture

The focus of the lecture is on both algorithmic and theoretical aspects of machine learning. We will cover many of the standard algorithms and learn about the general principles and theoretical results for building good machine learning algorithms. Topics range from well-established results to very recent ones.
  • Bayesian decision theory, no free lunch theorem.
  • Supervised learning problems (regression, classification): linear methods; regularization; non-linear kernel methods and the theory behind them
  • Unsupervised learning problems: dimension reduction ((kernel) PCA, multi-dimensional scaling, manifold methods); spectral clustering and spectral graph theory
  • How to model machine learning problems
  • Learning Theory: ERM, consistency, generalization bounds
  • Low rank matrix completion, compressed sensing
  • Ranking
  • Online learning and its theory
The following topics are NOT going to be covered: decision trees, neural networks, deep networks, graphical models, Bayesian approaches to machine learning. You can learn about some of them in other courses in the department.


You need to know the basics of probability theory and linear algebra, as taught in the mathematics for computer science lectures in your bachelor degree. If you cannot remember them well, I strongly urge you to recap the material (see below for links).

Assessment criteria and exams

Weekly assignments (theoretical assignments and programming assignments in Python; an introduction to Python will be given in one of the first tutorials, and it is not possible to use any other programming language). You need to achieve at least 50% of all assignment points to be admitted to the final exam.

The final exams are written (Klausur): the first one on July 24 (Kupferbau - Hörsaal 21), 10:00-12:00, the second one on October 4, 10:00-12:00 (Sand 6 - F119).

  • You can choose which exams to take, but be aware that this year there will be no third exam (and no oral exams). So if you skip the first exam and fail the second one, you have to wait a year for the next exam.
  • You are not allowed to bring any material (books, slides, etc.) except for what we call the controlled cheat sheet: one A4 sheet (one side only) of handwritten (!) notes, made by yourself.
  • To get an idea of what the exam will look like, here is an old one. Above you can also find the collected exam questions.
  • Please be there at least 15 minutes early so that we can start on time.
  • Please bring your passport/ID and student ID with you.