Machine Learning: Algorithms and Theory
(Summer term 2017)
The lecture is intended for Master students in computer science or mathematics, but might also be interesting for students from other disciplines such as physics, linguistics, etc. If in doubt, simply attend the first lecture and talk to me at the end.
Who, when, where
Lectures (by Prof. U. von Luxburg): Monday, 10:15 - 12:00 (Hörsaal 1, Room F119, Sand 6) and Thursday, 10:15 - 12:00 (Room A 301, Sand 14). The first lecture is on April 20.
Tutorials (by Sascha Meyen and Leena Chennuru Vankadara): Monday, 12:15 - 13:45 (Room F122, Sand 6) and Friday, 12:15 - 13:45 (Room F122, Sand 6).
Please join the ILIAS room so that we can pass relevant information on to you.
Exams
The results of the second exam are out. You can come and see your exam this Friday (13.10.), 13:00 - 15:00 in room F116, Sand 6.
The final exams are written (Klausur): first exam: July 27, 08:30 - 10:30 (Hörsaal N6, Morgenstelle); second exam: October 9, 08:30 - 10:30 (Hörsaal N3, Morgenstelle).
Exam modalities:
- Please be there at least 15 minutes early so that we can start on time.
- Please bring your ID and student ID with you.
- You are not allowed to bring any material (books, slides, etc.) except the permitted cheat sheet: one page of handwritten (!) notes, made by yourself, size A4, one side only (back side empty).
- You can choose which exams to take, but be aware that this year there will be neither a third exam nor oral exams. So if you skip the first exam and fail the second one, you will have to wait a year for the next exam.
Make sure you are officially registered. You have to go to Frau Hallmayer's office to register. The campus system currently does not work, so we can no longer register you directly this week.
You can prepare with this exam from 2013 and with the students' exam questions from the tutorial assignments: exam from 2013 (Probeklausur), students' exam questions 1, and students' exam questions 2.
Course material
You can find the most important information on our information sheet.
Slides:
- Latest version (updated 2018-04-17)
Weekly assignments:
- Assignment 1, assignment01.m and assignment01.csv
- Assignment 2, assignment02.m and mixGaussian2d.m, usps_train.mat, usps_test.mat
- Assignment 3, assignment03.m and assignment03.mat, latex template for exam questions
- Assignment 4, assignment04.m, assignment04.mat, generateDataLDA.m and ridgeRegression.m
- Assignment 5 and assignment05.m
- Assignment 6 and assignment06.m
- Assignment 7, assignment07.m, plotSVM2d.m and cancer_data.mat
- Assignment 8, assignment08.m and assignment08data.m
- Assignment 9, assignment09.m, eurodist.mat, USPS_digits_iso.mat and USPS_labels_iso.mat
- Assignment 10, assignment10.m, assignment10data.mat and my_exam_questions.tex
- Assignment 11
- Assignment 12 (Bonus), usps_train.mat and usps_test.mat
Recap material:
- linear algebra
- probability theory
- official interactive MATLAB tutorial (you have to create a MathWorks account)
- unofficial interactive MATLAB tutorial
Online feedback form
We want to know what you like / do not like about the lecture! You can tell us anonymously with the following feedback form. The more concrete and constructive the feedback, the higher the likelihood that we can adapt to it.
Contents of the lecture
The focus of the lecture is on both the algorithmic and the theoretical aspects of machine learning. We will cover many of the standard algorithms and learn about the general principles and theoretical results behind good machine learning algorithms. Topics range from well-established results to very recent ones.
- Bayesian decision theory, no free lunch theorem
- Supervised learning problems (regression, classification): linear methods; regularization; non-linear kernel methods and the theory behind them
- Unsupervised learning problems: Dimension reduction ((kernel) PCA, multi-dimensional scaling, manifold methods); spectral clustering and spectral graph theory
- How to model machine learning problems
- Learning Theory: ERM, consistency, generalization bounds
- Low rank matrix completion, compressed sensing
- Ranking
- Online learning and its theory
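To give a small taste of the "linear methods; regularization" topic above, here is a minimal ridge-regression sketch. The course assignments use MATLAB (see ridgeRegression.m in assignment 4), but this illustration is written in Python with NumPy; the function name and toy data are purely illustrative, not course material:

```python
import numpy as np

def ridge_regression(X, y, lam):
    """Closed-form ridge regression: w = (X'X + lam*I)^(-1) X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy data: y = 2*x plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 2 * X[:, 0] + 0.1 * rng.normal(size=100)

# With a small regularization parameter, w should be close to 2.
w = ridge_regression(X, y, lam=0.1)
```

The parameter lam trades off fitting the data against shrinking the weights towards zero; with lam = 0 this reduces to ordinary least squares.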
Prerequisites
You need to know the basics of probability theory and linear algebra, as taught in the mathematics for computer science lectures of your bachelor degree. If you cannot remember them well, I strongly urge you to recap the material (see below for links).
Assessment criteria and exams
Weekly assignments (theoretical assignments and programming assignments in Matlab --- an introduction to Matlab will be given on April 21, see above; it will not be possible to use any other programming language). You need to achieve at least 50% of all assignment points to be admitted to the final exam.
The final exams are written (Klausur): first exam: July 27, 08:00 - 11:00 (Hörsaal N6, Morgenstelle); second exam: October 9, 08:00 - 11:00 (Hörsaal N3, Morgenstelle).
Exam modalities: tba
Literature
There is no textbook that covers machine learning the way I want. We are going to use individual chapters of various books. Here is a list of relevant books:
- Shalev-Shwartz, Ben-David: Understanding Machine Learning. 2014.
- Chris Bishop: Pattern Recognition and Machine Learning. 2006.
- Hastie, Tibshirani, Friedman: The Elements of Statistical Learning. 2001. (the statistics point of view on machine learning, written by statisticians, not machine learners)
- Mohri, Rostamizadeh, Talwalkar: Foundations of Machine Learning. 2012.
- Schölkopf, Smola: Learning with Kernels. 2002. (mainly support vector machines and kernel algorithms)
- Hastie, Tibshirani, Wainwright: Statistical Learning with Sparsity. 2015. (Lasso, low rank matrix methods)
- Duda, Hart, Stork: Pattern Classification. 2012. (a classic, more than 20 years old, some sections still good to read)
- Devroye, Györfi, Lugosi: A Probabilistic Theory of Pattern Recognition. 2013. (Covers statistical learning theory, we do not use much of it in this lecture.)
- Dasgupta, Papadimitriou, Vazirani: Algorithms. 2006.