Course syllabus

Course-PM

SSY316 Advanced probabilistic machine learning lp2 HT21 (7.5 hp)

The course is offered by the Department of Electrical Engineering.

 

Course purpose

This course delves into the connections between machine learning, probability theory, and statistics. In particular, it gives a probabilistic viewpoint on machine learning problems. Probability theory can be applied to any problem involving uncertainty, and in machine learning uncertainty comes in many forms: the noise in the collected data, uncertainty about the best prediction given some past data, or uncertainty about which model is best suited to explain the data. The key idea behind the probabilistic framework for machine learning is that learning can be thought of as inferring plausible (probabilistic) models to describe the data one could observe from a system. Probabilistic models can make predictions and statements about observable data, and can also express the uncertainty of those predictions.
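As a minimal illustration of this idea (using θ for the model parameters and D for the observed data; this notation is chosen here for illustration and is not fixed by this memo), Bayesian learning turns a prior belief p(θ) into a posterior belief via Bayes' rule:

\[
p(\theta \mid \mathcal{D}) \;=\; \frac{p(\mathcal{D} \mid \theta)\, p(\theta)}{p(\mathcal{D})},
\]

where p(D | θ) is the likelihood and p(D) the evidence. Predictions are then obtained by averaging over the posterior, which is how the uncertainty of the predictions is expressed.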

The course describes a wide variety of probabilistic models, suitable for many different kinds of data and tasks, as well as algorithms for inference and learning with such models. The goal is to present a unified view of machine learning through the lens of probabilistic modeling and inference.

Students will also learn general-purpose models and methods that are useful in probabilistic machine learning as well as in other areas.

 

Learning outcomes

After completion of this course, the student should be able to

  • Explain the philosophy behind Bayesian inference
  • Develop an inference algorithm using the principles of Bayesian decision theory and a given cost function
  • Understand the connections between probability theory and machine learning
  • Explain similarities and differences between probabilistic and classical machine learning methods
  • Interpret and explain results from probabilistic machine learning
  • Derive, analyze, and implement the probabilistic methods introduced in the course
  • Understand how to apply several probabilistic models to data and determine the most suitable one for a given task
  • Discuss and determine whether an engineering-relevant problem can be formulated as a supervised or unsupervised machine learning problem

 

Content

  • Review of probability theory: frequentist vs Bayesian approach
  • Bayesian inference, probabilistic modeling of data
  • Supervised learning: Bayesian linear regression
  • Linear methods for classification
  • Bayesian graphical models: Bayesian networks, Markov random fields, factor graphs
  • Belief propagation
  • Approximate inference and learning: Monte Carlo inference (importance sampling, Gibbs sampling, Markov chain Monte Carlo)
  • Approximate inference: Variational inference and Expectation propagation
  • Unsupervised learning: K-means clustering, the Gaussian mixture model, expectation maximization, principal component analysis
  • Inference for sequential data: Hidden Markov models

 

Course staff

  • Examiner and Lecturer: Alexandre Graell i Amat (alexandre.graell@chalmers.se). Office: Room 6409, Department of Electrical Engineering
  • Teaching assistants: Azadeh Tabeshnezhad (Azadeh.Tabeshnezhad@chalmers.se, office: 6333) and Mehdi Sattari (mehdi.sattari@chalmers.se, office: 6323) 

 

Course literature

This course will closely follow the book:

  • C. M. Bishop, Pattern Recognition and Machine Learning. Berlin, Heidelberg: Springer-Verlag, 2006.

The book is available for free download at the following link: https://www.microsoft.com/en-us/research/uploads/prod/2006/01/Bishop-Pattern-Recognition-and-Machine-Learning-2006.pdf

An errata document is available at the following link: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/05/prml-errata-1st-20110921.pdf

 

Additional course material

Slides for the lectures and exercises for the tutorial sessions will be made available on Canvas.

 

Schedule

Lectures are typically held on Tuesdays, Thursdays, and Fridays, 13:15-15:00, with a few exceptions. Please see the course memo for the preliminary lecture schedule and check Canvas for updates.

Tutorials are typically held on Tuesdays and Thursdays, 15:15-17:00. Please see the course memo for the preliminary tutorial schedule and check Canvas for updates.

 

Grading

The final grade for the course will be based on (tentatively) 6 homework assignments, 6 Python assignments, and a final exam. The final exam comprises a written exam and a "final Python assignment" to be completed at home.
