Course syllabus
Course-PM
SSY210 Information theory, advanced level, study period 4 (lp4), spring term 2021 (VT21), 7.5 credits
The course is offered by the Department of Electrical Engineering.
Contact details
Teacher: Giuseppe Durisi
Teaching assistant: Khac-Hoang Ngo
Course purpose
Schedule
This year, the course will be offered remotely via Zoom.
Course literature
The course is partly based on the following references (available online):
- S. M. Moser, Information Theory (Lecture Notes), 6th ed., ETH Zurich, Switzerland and National Chiao Tung University, Taiwan, Oct. 2019. [Online]. Available: https://moser-isi.ethz.ch/cgi-bin/request_script.cgi?script=it
- Y. Polyanskiy and Y. Wu, Lecture notes on information theory, May 2019. [Online]. Available: http://people.lids.mit.edu/yp/homepage/papers.html
- J. Duchi, Information theory and statistics (lecture notes). Stanford, CA: Stanford University, 2019. [Online]. Available: http://web.stanford.edu/class/stats311/
- A. El Gamal and Y.-H. Kim, Network information theory. Cambridge, U.K.: Cambridge Univ. Press, 2011. Available online via Chalmers library.
Course content
- Shannon’s information metrics: entropy, relative entropy (a.k.a. Kullback-Leibler divergence), and mutual information (see the definitions sketched after this list)
- Asymptotic equipartition property and typicality
- Data compression and the source coding theorem
- Data transmission and the channel coding theorem
- Binary hypothesis testing, Neyman-Pearson lemma, Stein’s lemma
- Generalization error in statistical learning theory and probably-approximately correct (PAC) Bayesian bounds
- Minimax bounds in statistical estimation and Fano’s method
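For orientation, here is a minimal sketch of the standard definitions behind the first bullet, stated for discrete random variables; the notation follows common usage (e.g., in Moser's notes) and is illustrative rather than part of the official course plan:

```latex
% Entropy of a discrete random variable X with pmf p(x):
H(X) = -\sum_{x} p(x)\log p(x)

% Relative entropy (Kullback-Leibler divergence) between two pmfs p and q
% defined on the same alphabet:
D(p\,\|\,q) = \sum_{x} p(x)\log\frac{p(x)}{q(x)}

% Mutual information between X and Y with joint pmf p(x,y):
I(X;Y) = D\bigl(p(x,y)\,\|\,p(x)\,p(y)\bigr) = H(X) - H(X\mid Y)
```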
Organization
The course comprises 18 lectures/practice sessions and 6 homework sessions. Each lecture/practice session is linked to a reading assignment, which will be reviewed in depth and augmented with examples and exercises. It is important that the participants work through the reading assignment before each lecture/practice session. In the homework sessions, we will discuss the homework assignments.
Homework assignments
Homework assignments are given every Wednesday. Students are encouraged to form groups of up to 3 people and solve the homework assignments together. The homework assignments will be corrected in class. Each week, one group will be responsible for posting the solutions.
Learning objectives and syllabus
Learning objectives:
- Define entropy, relative entropy, and mutual information and explain their operational meaning
- Describe and demonstrate Shannon’s source coding and channel coding theorems
- Compute the capacity of discrete communication channels (a worked example is sketched after this list)
- Describe the fundamental performance metrics in binary hypothesis testing, their trade-off, their asymptotic behavior, and the structure of the optimal test
- Explain how relative entropy can help characterize the generalization error in statistical learning
- Apply Fano’s inequality to demonstrate impossibility results in group testing, graphical model selection, and sparse linear regression
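To illustrate the capacity computation mentioned above, a standard textbook example (not part of the official syllabus text) is the binary symmetric channel:

```latex
% Capacity of the binary symmetric channel with crossover probability p,
% obtained by maximizing I(X;Y) over the input distribution
% (the uniform input is optimal):
C(p) = 1 - H_b(p), \qquad H_b(p) = -p\log_2 p - (1-p)\log_2(1-p)

% Example: p = 0.11 gives H_b(p) \approx 0.5, so C \approx 0.5 bits per channel use.
```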
Link to the syllabus on Studieportalen.
Examination form
The assessment is based on an oral examination. At the oral exam, the course participants will be asked to solve one of the problems given as homework assignments and to discuss a theoretical topic. Grades are pass/fail only.