Course-PM

SSY210 Information theory, advanced level, study period 4 (lp4), spring term 2021 (VT21), 7.5 credits (hp)

The course is offered by the Department of Electrical Engineering.

Contact details

Teacher: Giuseppe Durisi

Course purpose

This course offers an introduction to information theory and its applications to digital communication, statistics, and machine learning.
One important feature of the information-theoretic approach is its ability to provide fundamental results, i.e., results that establish the optimality of certain procedures.
Results of this flavor are useful for many reasons: for example, they let us assess whether a target error probability in the transmission of information is achievable, determine how many data samples must be collected to distinguish between two or more statistical hypotheses, and estimate how many examples are needed to train a machine-learning algorithm.

Schedule

TimeEdit (see "Modules" tab for a detailed description of what will be taught when)

Course literature

The course is partly based on reference texts that are available online.

Lecture notes and slides prepared by the teacher will also be made available.

Course content 

  • Shannon’s information measures and their properties: entropy, relative entropy (a.k.a. Kullback–Leibler divergence), and mutual information (see the definitions sketched after this list)
  • Asymptotic equipartition property and typicality (see the simulation sketch after this list)
  • Data compression and the source coding theorem
  • Data transmission and the channel coding theorem
  • Binary hypothesis testing, the Neyman–Pearson lemma, and Stein’s lemma
  • Generalization error in statistical learning theory and probably-approximately-correct (PAC) Bayesian bounds
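
For quick reference, the three measures in the first bullet admit compact definitions. The following is a minimal sketch in standard notation, assuming discrete alphabets, base-2 logarithms, and the convention 0 log 0 = 0:

```latex
H(X)    = -\sum_{x \in \mathcal{X}} p(x) \log_2 p(x)
D(p\|q) = \sum_{x \in \mathcal{X}} p(x) \log_2 \frac{p(x)}{q(x)}
I(X;Y)  = D(p_{XY} \,\|\, p_X p_Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}
```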
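To make the asymptotic equipartition property tangible before the lectures, here is a minimal simulation sketch (illustrative only, not course material; it assumes numpy). For an i.i.d. Bernoulli(p) source, the normalized log-probability -(1/n) log2 p(X_1, ..., X_n) concentrates around the entropy H(p) as n grows:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.2  # Bernoulli parameter of the i.i.d. source

# Binary entropy H(p) in bits
H = -p * np.log2(p) - (1 - p) * np.log2(1 - p)

for n in [100, 1_000, 10_000, 100_000]:
    x = rng.random(n) < p  # i.i.d. Bernoulli(p) sample of length n
    k = x.sum()            # number of ones in the sequence
    # log2 of the probability of the observed sequence
    log_p = k * np.log2(p) + (n - k) * np.log2(1 - p)
    print(f"n={n:>6}: -1/n log2 p(x^n) = {-log_p / n:.4f}  (H = {H:.4f})")
```

As n increases, the printed values approach H(p), which is the essence of the AEP: long i.i.d. sequences are overwhelmingly likely to be "typical," with probability close to 2^{-nH}.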

Organization 

The course comprises 16 lectures and 6 homework sessions. Each lecture is linked to a reading assignment, which will be reviewed in depth and augmented with examples and short exercises. It is important that participants work through the reading assignment before each lecture. In the homework sessions, we will discuss the homework assignments.

Homework assignments

Homework assignments are handed out each Wednesday. Students are encouraged to form groups of up to 3 people and solve the assignments together. The homework assignments will be corrected in class, typically in the Wednesday afternoon slot, and must be handed in just before this slot. Each week, one group will be responsible for posting the solutions.

Handing in solutions to all homework assignments (one submission per group) is required to take the oral exam.

Learning objectives and syllabus

Learning objectives:

  • Define entropy, relative entropy, and mutual information, and explain their operational meaning
  • State and prove Shannon’s source coding and channel coding theorems
  • Compute the capacity of discrete communication channels (a numerical sketch follows this list)
  • Describe the fundamental performance metrics in binary hypothesis testing, their trade-off, their asymptotic behavior, and the structure of the optimal test
  • Explain how relative entropy can help characterize the generalization error in statistical learning
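
As a taste of the third objective, the capacity C = max_{p(x)} I(X;Y) of a discrete memoryless channel can be computed numerically with the Blahut–Arimoto algorithm. The sketch below is illustrative only (numpy assumed; `blahut_arimoto` is a hypothetical helper, not course code); it recovers the known closed form 1 - H_2(eps) for a binary symmetric channel with crossover probability eps:

```python
import numpy as np

def blahut_arimoto(W, iters=500):
    """Capacity (in bits) of a DMC with transition matrix W[x, y] = P(y | x)."""
    p = np.full(W.shape[0], 1.0 / W.shape[0])  # input distribution, start uniform

    def divergences(p):
        # d[x] = D( W(.|x) || q ) in bits, with the convention 0 log 0 = 0
        q = p @ W  # output distribution induced by p
        logs = np.log2(W / q, where=W > 0, out=np.zeros_like(W, dtype=float))
        return np.sum(W * logs, axis=1)

    for _ in range(iters):
        d = divergences(p)
        p = p * np.exp2(d)  # multiplicative Blahut-Arimoto update
        p /= p.sum()
    return float(p @ divergences(p))  # I(X;Y) at the optimized input

# Binary symmetric channel with crossover probability 0.1:
eps = 0.1
W_bsc = np.array([[1 - eps, eps],
                  [eps, 1 - eps]])
print(f"BSC capacity: {blahut_arimoto(W_bsc):.4f} bits")
```

For eps = 0.1, the algorithm converges to approximately 0.5310 bits per channel use, matching the closed form 1 - H_2(0.1).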

Link to the syllabus on Studieportalen.

Examination form

The assessment is based on an oral examination, during which participants will be asked to solve one of the problems given as homework and to discuss a theoretical topic. Grades are pass/fail only. Submitting solutions to all 6 homework assignments (as part of a group) is required for admission to the oral exam.
