Mathematics for Machine Learning (Winter term 2024/25)

Quick links

News

  • On Dec 3rd, the lecture needs to be moved to a different lecture hall:
    lecture hall N11 in "Auf der Morgenstelle 3 (Botanik)"; this is a different building than where we usually are! Please note it in your calendars ...

Assignments

Presence sheets

Lecture notes

Current lecture notes, hand-written, will be updated each week:
  • Linear algebra: pdf (last updated Nov 7)
  • Calculus: pdf (updated 2024-11-15)

My Lecture notes of 2020, hand-written:
  • Linear algebra (A): pdf
  • Calculus (C): pdf
  • Probability theory (P): pdf
  • Statistics (S): pdf
  • Mixed materials (H): pdf

Exams

To be admitted to the exam, you need to have passed the assignments in the course (as discussed at the beginning of term): you need to have received at least 50% of the assignment points of the whole term. If you were admitted last year, you can also take part in the exam (but please send us an email when you register for the exam, so we can check whether you are really eligible). If you were admitted before last year, you need to pass the assignments again this year. The exam is written. There will be two exams; the tentative dates are:
  • Feb 7, 15:30 - 17:30, lecture hall N7, Morgenstelle
  • April 2nd, 9:30 - 11:30, room not yet assigned
You can choose which exam to take; both will have the same difficulty. But note that there will be neither a third exam nor oral exams: if you skip the first exam and fail the second one, you will need to wait until next winter term to take the exam again.
To see what the exam might look like, you can have a look at the following Mock exam.

Background information

This course is intended for students of the master's program in machine learning. Depending on your background, some of the material might be a recap - or not. The contents of the course are Linear algebra, Multivariate analysis, Probability theory, Statistics, and Optimization. Note that the course requires a solid basis in mathematics, similar to what students acquire in our Bachelor program in computer science. The course is not recommended for students without this background.

Registration

You need to register for this course in Ilias (link see above). The registration deadline is Oct 15.

Lecture

What: Mathematics for Machine Learning, 9 CP
Lecturer: Prof. Ulrike von Luxburg
When and where: Tuesdays 8:15 - 9:45, lecture hall N4, Morgenstelle; Thursdays 8:15 - 9:45, lecture hall HS 23, Kupferbau. First lecture is on Oct 15.

Tutorials

We will have weekly tutorial sessions in small groups of about 30 students, where you can ask questions and interact with other students. Based on the preferences you stated when entering the Ilias course, we assigned you to the different tutorial slots.
  • Assignment to tutorials: pdf

The teaching assistants are:
Eric Guenther (organizes the assignments)
Karolin Frohnapfel (organizes the assignments)
Swagatam Haldar
Balazs Szabados
Devank Tyagi

Literature:

More specific:
  • For linear algebra, I recommend: Sheldon Axler: Linear Algebra Done Right. Third edition, 2015. There are also online videos by the author if you want longer explanations than the ones I will provide.
  • Calculus (Integration, Measures, Metric spaces and their topology): Sheldon Axler: Measure, Integration & Real Analysis. 2019
  • Calculus (Differential calculus in R^n): Here I haven't found my one favorite textbook for this course yet (some are too recipe-like, others a bit too abstract):
    • My current favorite textbook, mathematically rigorous (but not easy to read):
      Terence Tao, Analysis 1 and 2.
    • Books with many figures, but partly informal or recipe-like (might be good as a start if you need to get the intuition before diving deeper):
      Stanley Miklavcic: An Illustrative Guide to Multivariable and Vector Calculus.
      Charles Pugh: Real Mathematical Analysis
    • A classic: Rudin: Principles of Mathematical Analysis.
    • If you are looking for a German book, I like: Walter: Analysis 1 and Analysis 2. The second one covers everything that we have been discussing.
  • Probability theory: Jacod, Protter: Probability Essentials. Short and to the point, tries to avoid measure theory wherever possible, yet is rigorous. A good compromise.
  • Statistics:
    • For a very short overview of all the topics we cover: Wasserman: All of Statistics: A Concise Course in Statistical Inference.
    • A bit more detail: Casella/Berger: Statistical Inference.
    • Testing, rigorously: Lehmann/Romano: Testing statistical hypotheses.
  • Optimization
    • For convex optimization: Boyd/Vandenberghe: Convex Optimization
    • Non-convex optimization basics: covered in the new book by Francis Bach, Learning Theory from First Principles, pdf
  • For high-dimensional probability and statistics there are several good books, but they go much deeper than our lecture:
    • Wainwright: High-Dimensional Statistics
    • Vershynin: High-dimensional probability
    • Bühlmann, van de Geer: Statistics for High-Dimensional Data (this is from the more traditional statistics point of view)

Online feedback form

We want to know what you like / do not like about the lecture! You can tell us anonymously via the following feedback form. The more concrete and constructive the feedback, the higher the likelihood that we can adapt to it.