Course a.y. 2025-2026

20877 - INFORMATION THEORY

Department of Computing Sciences



Course taught in English
AI (8 credits - II sem. - OB  |  INF/01)
Course Director:
EMMANUELA ORSINI

Classes: 29 (II sem.)
Instructors:
Class 29: EMMANUELA ORSINI


Suggested background knowledge

Prior background in linear algebra, mathematical reasoning, and probability theory is highly desirable. The course assumes the following core skills:

  • Mathematical reasoning: ability to read and write formal definitions and proofs.
  • Linear algebra: familiarity with vector spaces, matrices, eigenvalues, and eigenvectors.
  • Probability theory: understanding of independence, conditional probability, and expectation.

While prior exposure to all of these topics is not strictly required, students with no exposure to any of them may find it challenging to keep up with the course material.

Mission & Content Summary

MISSION

Information theory is the science of quantifying, transmitting, and protecting information in both secure and unreliable environments. It provides a mathematical framework for understanding the fundamental limits of data compression, communication, and storage, as well as the reliability of communication in the presence of noise. The digital age has greatly expanded the relevance of information theory, influencing fields as diverse as telecommunications, machine learning, cryptography, and data science. Over the past decades, information theory has evolved into a rigorous discipline grounded in solid mathematical principles. This course will focus on modern approaches to information theory, emphasizing both its foundational concepts and their practical applications. We will develop precise mathematical definitions for key problems in the field and construct algorithms and proofs that demonstrate optimal or near-optimal solutions to these problems. Topics will include entropy, mutual information, coding theorems, and the limits of reliable communication, all within a rigorous theoretical framework.

CONTENT SUMMARY

The course develops the core mathematics of information theory and its algorithmic consequences for compression, communication, error correction, and security. After a brief refresher of prerequisites, it introduces entropy, mutual information, KL divergence, typicality (AEP), and the data-processing viewpoint, then presents Shannon's source and channel coding theorems and their operational meanings (compression rate, capacity). Applications include lossless and lossy source coding (Huffman, arithmetic, Lempel–Ziv; rate–distortion), reliable communication over noisy channels (DMCs, with a glance at the Gaussian channel), and error-control methods from both the Hamming (combinatorial) and Shannon (probabilistic) perspectives, covering Reed–Solomon codes, list decoding, message passing, and polar codes. Optionally, we will cover information-theoretic cryptography and connections of mutual-information and rate–distortion ideas to modern machine learning.
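
As a brief orientation, the central quantities above can be written, in the standard notation of Cover and Thomas, as

H(X) = -\sum_x p(x) \log_2 p(x), \qquad
I(X;Y) = H(X) - H(X \mid Y), \qquad
C = \max_{p(x)} I(X;Y),

where the entropy H(X) is the average uncertainty of a source in bits, the mutual information I(X;Y) is the reduction in uncertainty about X gained by observing Y, and the capacity C is the largest rate at which reliable communication over the channel is possible (Shannon's channel coding theorem).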


Intended Learning Outcomes (ILO)

KNOWLEDGE AND UNDERSTANDING

At the end of the course, students will be able to...

In this course, you will develop a deep understanding of the principles and applications of information theory. You will learn the importance of precisely defining fundamental concepts such as entropy, mutual information, channel capacity, and error correction. The course will provide examples of both theoretical and practical solutions to key information-theoretic problems, including data compression, reliable communication, and error-correcting codes.
You will also explore the mathematical foundations underpinning information theory, including probability theory, combinatorics, and linear algebra, which form the basis for modern advancements in the field. While the primary focus will be on theoretical frameworks, the course will connect these ideas to real-world applications in areas such as secure communication, cryptography, and machine learning. If time allows, the course will delve into higher-level applications, demonstrating how the principles of information theory are used in secure communication protocols, distributed systems, financial modeling, and advanced machine learning techniques.

APPLYING KNOWLEDGE AND UNDERSTANDING

At the end of the course, students will be able to:

  • Model practical compression and communication problems using information-theoretic formalisms and compute the relevant metrics (rates, capacities, error probabilities, distortions), as in the sketch after this list;

  • Analyze basic lossless and lossy coding schemes for discrete memoryless sources and channels, assessing their efficiency relative to theoretical limits;

  • Select and justify appropriate error-correcting codes and decoding strategies (e.g., algebraic decoding, list decoding, message passing) for simple noisy-channel scenarios;

  • Apply information-theoretic tools such as mutual information, typicality, and rate–distortion ideas to reason about the behavior of learning algorithms and secure communication protocols at an abstract level.
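
As a minimal illustration of the first outcome above, the following Python sketch (illustrative only; the function names are ours, and the course does not prescribe a programming language) computes the entropy of a discrete source and the capacity of a binary symmetric channel:

import math

def entropy(p):
    """Shannon entropy H(X), in bits, of a probability vector p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps:
    C = 1 - H_b(eps), where H_b is the binary entropy function."""
    return 1.0 - entropy([eps, 1.0 - eps])

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))   # ~0.469 bits: a biased coin is more predictable
print(bsc_capacity(0.11))    # ~0.5 bits per channel use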


Teaching methods

  • Lectures
  • Guest speakers' talks (in class or at a distance)
  • Practical Exercises
  • Individual works / Assignments

DETAILS

Teaching and learning activities for this course are structured around lectures, in-class exercises, guest talks, and individual assignments.

 

Lectures introduce the main theoretical concepts of information theory and derive key results with full mathematical rigor. Examples and short derivations are used to connect formal definitions to operational interpretations in compression, communication, and coding.

 

Guest speaker talks (in class or at a distance) are delivered by researchers or practitioners working in information theory, communications, cryptography, or machine learning. Their role is to illustrate how the concepts developed in the course arise in current research and real-world systems, and to give students the opportunity to interact with experts in the field.

 

In-class exercises are dedicated to solving problems on the material covered in the lectures, such as computing information measures, analyzing simple communication models, and working through examples of coding schemes.

 

Individual works / assignments consist of problem sets in which students apply the course concepts to concrete tasks: analyzing compression schemes, studying simple noisy channels, or exploring the performance of specific coding algorithms. These assignments support continuous learning and provide feedback on the achievement of the intended learning outcomes.


Assessment methods

  • Written individual exam (traditional/online): partial exams and general exam
  • Individual works / assignments (report, exercise, presentation, project work, etc.): continuous assessment only

ATTENDING AND NOT ATTENDING STUDENTS

For all students, assessment is based on written examinations, with the final grade determined entirely by the exam results (either through the partial exams or through a single general exam). Individual assignments are used only for formative purposes and do not contribute numerically to the final grade.


Teaching materials


ATTENDING AND NOT ATTENDING STUDENTS

- Lecture notes will be posted on Blackboard.
- Thomas M. Cover and Joy A. Thomas, "Elements of Information Theory".
- David J.C. MacKay, "Information Theory, Inference, and Learning Algorithms".

Last change 08/11/2025 10:03