Course syllabus
Course-PM
SSY316 Advanced probabilistic machine learning lp2 HT24 (7.5 hp)
This course is offered by the Department of Electrical Engineering.
Prerequisites
Working knowledge of probability theory, statistics, and linear algebra is required. Moreover, it is advisable to have completed an introductory course in machine learning.
Course purpose
This course delves into the connections between machine learning and probability theory and statistics, unveiling a profound probabilistic perspective of machine learning problems.
The very essence of probability theory is the study of uncertainty. Within machine learning, uncertainty manifests in multifaceted ways, ranging from the intrinsic noise within collected data to uncertainty about the optimal prediction based on historical data or about the most fitting model to explain the data.
The fundamental tenet underscoring the probabilistic paradigm of machine learning is that learning can be thought of as the art of inferring plausible probabilistic models that describe the data emanating from a given system. These probabilistic models not only enable us to make predictions and statements about observable data but also possess the capability to articulate the inherent uncertainty of those predictions.
This course will describe an extensive array of probabilistic models, suitable for a diverse spectrum of data types and tasks. It will also elucidate a wide variety of algorithms tailored for inference and learning facilitated by such models. The overarching goal of the course is to present a unified vision of machine learning, casting it in the illuminating light of probabilistic modeling and inference.
In this course, students will learn models and methods that are useful not only in probabilistic machine learning but also in other areas.
Learning outcomes
After completion of this course, the student should be able to
- Explain the philosophy behind Bayesian inference
- Develop an inference algorithm using the principles of Bayesian decision theory and a given cost function
- Understand the connections between probability theory and machine learning
- Explain similarities and differences between probabilistic and classical machine learning methods
- Interpret and explain results from probabilistic machine learning
- Derive, analyze, and implement the probabilistic methods introduced in the course
- Understand how to apply several probabilistic models to data and determine the most suitable one for a given task
- Discuss and determine whether an engineering-relevant problem can be formulated as a supervised or unsupervised machine learning problem
Content
- Review of probability theory: frequentist vs Bayesian approach
- Bayesian inference, probabilistic modeling of data
- Supervised learning: Bayesian linear regression
- Linear methods for classification
- Bayesian graphical models: Bayesian networks, Markov random fields, factor graphs
- Belief propagation
- Approximate inference and learning: Monte Carlo inference (importance sampling, Gibbs sampling, Markov chain Monte Carlo)
- Approximate inference: variational inference and expectation propagation
- Unsupervised learning: K-means clustering, the Gaussian mixture model, expectation maximization, principal component analysis
Course staff
- Examiner and Lecturer: Alexandre Graell i Amat (alexandre.graell@chalmers.se). Office: Room 6409, Department of Electrical Engineering
- Teaching assistants: Mehdi Sattari (mehdi.sattari@chalmers.se, office: 6323), Javad Aliakbari (javada@chalmers.se, office: 6329), and Delio Jaramillo-Vélez (delio@chalmers.se, office: 6329).
Course literature
This course closely follows the book:
- C. M. Bishop, Pattern Recognition and Machine Learning. Berlin, Heidelberg: Springer-Verlag, 2006.
The book is available for free download at: https://www.microsoft.com/en-us/research/uploads/prod/2006/01/Bishop-Pattern-Recognition-and-Machine-Learning-2006.pdf
An errata list is available at: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/05/prml-errata-1st-20110921.pdf
Additional course material
Slides for the lectures and exercises for the tutorial sessions will be made available on Canvas.
Schedule
Lectures are typically on Tuesdays, Thursdays, and Fridays 13:15-15:00. However, there may be some exceptions. Please see the course memo for the preliminary schedule for the lectures. Please check Canvas for updates.
Tutorials are typically on Thursdays 15:15-17:00. Please check Canvas for updates.
Lectures
The objective of the lectures is to highlight the most important parts of the course. However, there is neither time nor need to cover all relevant parts in full detail. Part of the learning takes place outside the lecture hall, and the number of lectures has therefore been reduced to free up more time for group and individual work.
Tutorial sessions
The tutorial sessions offer a practical avenue to apply the theoretical concepts covered in class.
During each tutorial session, the TA will give a concise 30-minute mini-lecture, which is tailored to complement the regular lectures.
Homework assignments
The course comprises a total of 6 homework assignments, consisting of theoretical (pen-and-paper) exercises and Python exercises. The latter serve as a tool to consolidate the understanding of the theoretical topics discussed during the lectures and to apply them to real-world scenarios.
These assignments are designed to be solved in pairs and will be graded. You will receive each assignment one week in advance of the respective tutorial session, and you should submit your solutions before the tutorial session. Any late submissions will regrettably not be assessed, resulting in a loss of points.
Consultation hours
Mehdi, Javad, or Delio will be available in their offices (rooms 6323, 6329, and 6329, respectively, in the EDIT building, floor 6Ö; take the north staircase) to answer questions about the homework on Tuesdays 15:30-17:00. For questions outside office hours, feel free to email, message on Canvas, or ask during the break in class.
Take-home exam
The course does not have a final written exam in class. Instead, there will be a take-home final assignment. The final assignment, to be solved in pairs, will be given two weeks before Christmas and must be handed in by January 12. The take-home final assignment gives a maximum of 40 points.
To pass the course, you need a minimum of 10 points in the final take-home assignment.
Final grades
The final score (maximum score 100) will be decided based on the points collected via the homework assignments (maximum score 60) and the final take-home assignment (maximum score 40). Three points will be given to all students who provide feedback via the course evaluation questionnaire. Please upload a screenshot showing that you completed the course evaluation in the relevant assignment on Canvas. Note that the three points will not be awarded if their addition would cause a grade to move from fail to 3.
The final grade is then calculated as follows:
- Total score 0-39 -> Grade: fail
- Total score 40-59 -> Grade: 3
- Total score 60-79 -> Grade: 4
- Total score 80-100 -> Grade: 5
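The following is a small illustrative Python sketch of how these rules combine. It is only an example for clarity, not an official grading script; the function name and exact handling of edge cases are our own.

```python
# Illustrative sketch only (not an official grading script), based on the
# rules above: homework gives at most 60 points, the take-home assignment
# at most 40 points, and 3 bonus points are given for completing the course
# evaluation (unless they would lift a failing total to a passing grade).

def final_grade(homework_points: float, take_home_points: float,
                completed_evaluation: bool) -> str:
    """Map the collected points to a final grade (fail, 3, 4, or 5)."""
    # At least 10 points on the take-home assignment are required to pass.
    if take_home_points < 10:
        return "fail"

    total = homework_points + take_home_points

    # Bonus points are only added if the total already reaches the passing
    # threshold, so they never move a grade from fail to 3.
    if completed_evaluation and total >= 40:
        total += 3

    total = min(total, 100)  # the final score is capped at 100

    if total < 40:
        return "fail"
    if total < 60:
        return "3"
    if total < 80:
        return "4"
    return "5"

# Example: 45 homework points, 28 on the take-home, evaluation completed
# -> total 73 + 3 bonus = 76 -> grade 4.
print(final_grade(45, 28, True))
```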