20877 - INFORMATION THEORY
Department of Computing Sciences
Course taught in English
EMMANUELA ORSINI
Suggested background knowledge
Mission & Content Summary
MISSION
CONTENT SUMMARY
The course develops the core mathematics of information theory and its algorithmic consequences for compression, communication, error correction, and security. After a brief refresher of prerequisites, it introduces entropy, mutual information, KL divergence, typicality (the AEP), and the data-processing viewpoint, then presents Shannon’s source and channel coding theorems and their operational meanings (compression rate, capacity). Applications include lossless and lossy source coding (Huffman, arithmetic, and Lempel–Ziv coding; rate–distortion theory), reliable communication over noisy channels (discrete memoryless channels, with a glance at the Gaussian channel), and error-control methods from both the Hamming (combinatorial) and Shannon (probabilistic) perspectives, covering Reed–Solomon codes, list decoding, message passing, and polar codes. Time permitting, the course also covers information-theoretic cryptography and connections between mutual-information and rate–distortion ideas and modern machine learning.
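For a concrete taste of two of these quantities, the minimal Python sketch below (illustrative only; the function names are ours, not course code) computes the entropy of a discrete source and the capacity C = 1 − H_b(p) of a binary symmetric channel:

```python
# Illustrative sketch only: entropy of a discrete source and the capacity
# of a binary symmetric channel (BSC), two quantities central to the course.
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a BSC with crossover probability p: C = 1 - H_b(p)."""
    return 1.0 - entropy([p, 1.0 - p])

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is incompressible
print(entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin compresses well
print(bsc_capacity(0.11))    # ~0.50 bits per channel use
```

The source coding theorem says the first number is the best achievable compression rate per symbol; the channel coding theorem says the last is the best achievable rate of reliable transmission over that channel.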
Intended Learning Outcomes (ILO)
KNOWLEDGE AND UNDERSTANDING
What can you hope to learn?
In this course, you will develop a deep understanding of the principles and applications of information theory. You will learn the importance of precisely defining fundamental concepts such as entropy, mutual information, channel capacity, and error correction. The course will provide examples of both theoretical and practical solutions to key information-theoretic problems, including data compression, reliable communication, and error correction codes.
You will also explore the mathematical foundations underpinning information theory, including probability theory, combinatorics, and linear algebra, which form the basis for modern advancements in the field. While the primary focus will be on theoretical frameworks, the course will connect these ideas to real-world applications in areas such as secure communication, cryptography, and machine learning. If time allows, the course will delve into higher-level applications, demonstrating how the principles of information theory are used in secure communication protocols, distributed systems, financial modeling, and advanced machine learning techniques.
APPLYING KNOWLEDGE AND UNDERSTANDING
At the end of the course, students will be able to:
- Model practical compression and communication problems using information-theoretic formalisms and compute relevant metrics (rates, capacities, error probabilities, distortions);
- Analyze basic lossless and lossy coding schemes for discrete memoryless sources and channels, assessing their efficiency relative to theoretical limits (a worked Huffman sketch follows this list);
- Select and justify appropriate error-correcting codes and decoding strategies (e.g. algebraic decoding, list decoding, message passing) for simple noisy-channel scenarios;
- Apply information-theoretic tools such as mutual information, typicality, and rate–distortion ideas to reason about the behavior of learning algorithms and secure communication protocols at an abstract level.
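As a minimal illustration of the second outcome above, the following sketch (illustrative, not an assignment template) builds a Huffman code for a small source and compares its expected codeword length L to the entropy lower bound L ≥ H(X) from the source coding theorem:

```python
# Illustrative sketch: Huffman codeword lengths for a small source,
# compared against the entropy lower bound from the source coding theorem.
import heapq
import math

def huffman_lengths(probs):
    """Return codeword lengths of an optimal prefix code for `probs`."""
    # Heap entries: (probability, tiebreak counter, list of symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:          # each merge adds one bit to these symbols
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)
L = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * math.log2(p) for p in probs)
print(lengths, L, H)   # [1, 2, 3, 3], L = H = 1.75 bits/symbol
```

For the dyadic source in the example the bound is met with equality; in general, an optimal prefix code satisfies H(X) ≤ L < H(X) + 1.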
Teaching methods
- Lectures
- Guest speakers' talks (in class or remote)
- Practical Exercises
- Individual works / Assignments
DETAILS
Teaching and learning activities for this course are structured around lectures, in-class exercises, guest talks, and individual assignments.
Lectures introduce the main theoretical concepts of information theory and derive key results with full mathematical rigor. Examples and short derivations are used to connect formal definitions to operational interpretations in compression, communication, and coding.
Guest speakers' talks (in class or remote) are delivered by researchers or practitioners working in information theory, communications, cryptography, or machine learning. Their role is to illustrate how the concepts developed in the course arise in current research and real-world systems, and to give students the opportunity to interact with experts in the field.
In-class exercises are dedicated to solving problems on the material covered in the lectures, such as computing information measures, analyzing simple communication models, and working through examples of coding schemes.
Individual works / assignments consist of problem sets in which students apply the course concepts to concrete tasks: analyzing compression schemes, studying simple noisy channels, or exploring the performance of specific coding algorithms. These assignments support continuous learning and provide feedback on the achievement of the intended learning outcomes.
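For instance, a task of the kind described above might be explored with a short simulation such as the following sketch (hypothetical, not an actual assignment): estimating the bit error rate of a 3-repetition code with majority decoding on a binary symmetric channel:

```python
# Hypothetical assignment-style sketch: bit error rate of a 3-repetition
# code with majority decoding on a binary symmetric channel (BSC).
import random

def bsc(bit, p, rng):
    """Send one bit through a BSC: flip it with probability p."""
    return bit ^ (rng.random() < p)

def repetition_error_rate(p, n=3, trials=100_000, seed=0):
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        bit = rng.randint(0, 1)
        received = [bsc(bit, p, rng) for _ in range(n)]
        decoded = int(sum(received) > n // 2)   # majority vote
        errors += decoded != bit
    return errors / trials

p = 0.1
print(repetition_error_rate(p))  # ~0.028, vs. the uncoded error rate p = 0.1
```

The simulated value matches the analytic error rate 3p²(1 − p) + p³ ≈ 0.028, bought at the price of rate 1/3; the course's coding theorems characterize how much better one can do.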
Assessment methods
|  | Continuous assessment | Partial exams | General exam |
|---|---|---|---|
| Written individual exam |  | x | x |
| Individual assignment | x |  |  |
ATTENDING AND NOT ATTENDING STUDENTS
For all students, assessment is based on written examinations, with the final grade determined entirely by the exam result (either via partial exams or via a single general exam). Individual assignments are used only for formative purposes and do not contribute numerically to the final grade.
Teaching materials
ATTENDING AND NOT ATTENDING STUDENTS
- Lecture notes will be posted on Blackboard.
- "Elements of Information Theory" Authors: Thomas M. Cover and Joy A. Thomas
- "Information Theory, Inference, and Learning Algorithms" Author: David J.C. MacKay