# Course syllabus

## Course PM

SSY125 Digital communications lp2 HT23 (7.5 hp)

This course is offered by the Department of Electrical Engineering.

### Course purpose

This course introduces the basics of information and coding theory. We will be concerned with the fundamental communication problem of sending information from a transmitter (source) to a receiver over a physical channel efficiently and reliably.

Several questions immediately come to mind when reading the above paragraph. What is meant by information? How can we represent a source efficiently (i.e., compress it)? How is the transmission cost calculated? How is reliability defined and measured? How much information can be transmitted reliably over the channel? What design tradeoffs can be made? The aim of this course is to answer these questions.

### Learning outcomes

After completion of this course, the student should be able to

• Define entropy and mutual information and explain their operational meaning.
• Describe Shannon's source coding and channel coding theorems.
• Apply Huffman codes to compress discrete memoryless sources losslessly.
• Compute the capacity of discrete memoryless point-to-point channels.
• Describe Shannon's capacity formula for the additive white Gaussian noise (AWGN) channel and elaborate on the fundamental tradeoff between power and bandwidth.
• Compute and estimate the symbol and bit error probability of simple modulations (PAM, PSK, QAM) for transmission over the AWGN channel.
• Estimate the performance of communication links (i.e., modulation formats, channel codes, and decoders) over the AWGN channel by computer simulation. This includes determining simulation parameters to reach the desired accuracy as well as programming the simulation in MATLAB.
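The last outcome above can be illustrated with a minimal Monte Carlo sketch. The course project uses MATLAB; the Python fragment below is only an illustrative stand-in, with function names and parameters of my own choosing. It estimates the bit error rate of BPSK over an AWGN channel and compares it with the theoretical value Q(√(2·Eb/N0)).

```python
import math
import random

def q_func(x):
    """Gaussian tail probability Q(x), via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber_bpsk_monte_carlo(ebn0_db, num_bits=100_000, seed=1):
    """Estimate the bit error rate of BPSK over an AWGN channel.

    Unit-energy symbols (Eb = 1); noise variance is N0/2 = 1/(2*Eb/N0).
    """
    rng = random.Random(seed)
    ebn0 = 10 ** (ebn0_db / 10)          # Eb/N0 in linear scale
    sigma = math.sqrt(1 / (2 * ebn0))    # noise standard deviation
    errors = 0
    for _ in range(num_bits):
        bit = rng.randint(0, 1)
        symbol = 1.0 if bit == 0 else -1.0           # BPSK mapping
        received = symbol + rng.gauss(0, sigma)      # AWGN channel
        decided = 0 if received > 0 else 1           # ML decision (equiprobable bits)
        errors += (decided != bit)
    return errors / num_bits

ebn0_db = 4.0
est = ber_bpsk_monte_carlo(ebn0_db)
theory = q_func(math.sqrt(2 * 10 ** (ebn0_db / 10)))
```

The number of simulated bits controls the accuracy of the estimate: the standard error of the estimated BER scales as √(p(1−p)/n), which is exactly the kind of consideration the outcome asks you to make when choosing simulation parameters.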

### Content

• Entropy, data compression, prefix-free codes, Kraft inequality, Huffman codes, and the source coding theorem.
• Mutual information, channel capacity, the channel coding theorem.
• Detection theory: maximum likelihood (ML) and maximum a posteriori detection.
• Methods for computing and bounding the symbol and bit error probabilities: decision regions, Q-function, union bound techniques.
• Analysis of linear modulation formats (PAM, PSK, QAM), power, and spectral efficiency.
• Channel coding, Hamming distance, hard- and soft-decision decoding.
• Linear block codes: generator and parity-check matrices, syndrome decoding, error correction, and error detection capability.
• Convolutional codes: trellis diagram, ML decoding, Viterbi algorithm, union bound on the error probability for ML soft- and hard-decision decoding.
• Modern codes: turbo codes and low-density parity-check codes, iterative message-passing decoding.
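To give a flavor of the first topic in the list, the following Python sketch (illustrative only; all names are my own) computes the entropy of a discrete memoryless source and the average codeword length of a binary Huffman code for it. The source coding theorem guarantees H(X) ≤ L̄ < H(X) + 1, with equality on the left for a dyadic pmf such as the one used here.

```python
import heapq
import math

def entropy(probs):
    """Entropy H(X) in bits of a discrete memoryless source with pmf `probs`."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given pmf."""
    lengths = [0] * len(probs)
    # Heap entries: (probability, unique tiebreaker, symbols in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1  # merged symbols move one level deeper in the tree
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

pmf = [0.5, 0.25, 0.125, 0.125]          # dyadic pmf
H = entropy(pmf)                          # 1.75 bits
lengths = huffman_lengths(pmf)            # lengths 1, 2, 3, 3
avg = sum(p * l for p, l in zip(pmf, lengths))  # average codeword length
```

For this dyadic source the Huffman code is optimal with average length exactly equal to the entropy, 1.75 bits per symbol.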

### Course staff

• Examiner: Alexandre Graell i Amat (alexandre.graell@chalmers.se; Office: Room 6409, Department of Electrical Engineering)
• Lecturers: Alexandre Graell i Amat and Christian Häger (christian.haeger@chalmers.se; Office: Room 6439, Department of Electrical Engineering)
• Teaching assistant: Mohammad Farsi (Office: Room 6340)

### Course literature

This course does not have any mandatory literature. It is based on a complete set of lecture notes that will be made available on Canvas, and therefore does not follow a particular textbook.

For students who wish to delve deeper, the following books may be of interest:

• Stefan M. Moser and Po-Ning Chen, A Student's Guide to Coding and Information Theory, Cambridge University Press, 2012.
• W. E. Ryan and S. Lin, Channel Codes: Classical and Modern, Cambridge University Press, 2009.
• John Proakis, Digital Communications, McGraw-Hill, 2000.
• Sergio Benedetto and Ezio Biglieri, Principles of Digital Transmission: With Wireless Applications, Kluwer Academic Publishers, 1999.

Check Modules for lecture slides, previous exams, and project components.
Check Assignments for the quizzes and deliverables.